Homework-10/7/2017: Backpropagation

In this project, I use backpropagation to solve the XOR problem. The network has two inputs, two hidden units, and one output neuron. The activation function is $f(x)=1/(1+\exp(-x))$, and the learning rate is 0.6 in each layer. The task is to train the network with backpropagation until the error is less than $0.008$, record the number of iteration steps, analyze the results, and write a report.

Basic method

This part uses the basic setup: the quadratic cost function, the sigmoid activation function, and weights initialized from a Gaussian distribution. Later on, I will try other methods for comparison and analysis.
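For reference, the standard formulas this setup relies on are the quadratic cost for a single example, the sigmoid derivative, and the resulting output-layer error term (matching the sign convention `output - desired` used in the code below):

$$C = \frac{1}{2}(y - a)^2, \qquad \sigma'(z) = \sigma(z)\,(1 - \sigma(z)), \qquad \delta^{L} = (a^{L} - y)\,\sigma'(z^{L})$$

where $a = \sigma(z)$ is the network output and $y$ the desired output. The parameter gradients then follow as $\partial C/\partial b = \delta$ and $\partial C/\partial w = \delta\, a_{\text{prev}}^{T}$.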

Import modules


In [1]:
import numpy as np
import matplotlib.pyplot as plt

Prepare the data


In [2]:
# List all possibilities of XOR problem.
training_data = np.array([[0,0], [0,1], [1,0], [1,1]])
desired_output = np.array([0, 1, 1, 0])

Build the network


In [22]:
class Network(object):
    
    def __init__(self, sizes):
        """The list "sizes" contains the number of neurons in respective layers.""" 
        self.sizes = sizes
        self.number_layers = len(sizes)
        self.biases = [np.random.randn(i, 1) for i in sizes[1:]]
        self.weights = [np.random.randn(i, j) for i,j in zip(sizes[1:],sizes[:-1])]   
        
    def feedforward(self, input_):
        """Return the output of the neural network."""
        activation = input_
        for bias, weight in zip(self.biases, self.weights):
            output = np.dot(weight, activation) + bias
            activation = sigmoid(output)
        return activation
    
    def set_values(self, weights, biases):
        self.weights = weights
        self.biases = biases
    
    def backprop(self, training_data, desired_output, lr_rate):
        """Update parameters"""
        # Store gradients of weights and biases for update
        # (np.zeros_like on a list of differently-shaped arrays is not valid;
        # build a zero array per layer instead.)
        grad_weights = [np.zeros(w.shape) for w in self.weights]
        grad_biases = [np.zeros(b.shape) for b in self.biases]
        
        # Store outputs and activations for backprop
        outputs = []
        activation = training_data
        activations = [activation] 
        for b, w in zip(self.biases, self.weights):
            output = np.dot(w, activation) + b
            outputs.append(output)
            activation = sigmoid(output)
            activations.append(activation)
        
        # Compute the gradients of the last layer
        error = activations[-1] - desired_output  # reuse the forward pass stored above
        delta = error * sigmoid_deriv(outputs[-1])
        grad_biases[-1] = delta
        grad_weights[-1] = np.dot(delta, activations[-2].transpose())
        
        # Compute gradients of remaining layers
        for layer in range(2, self.number_layers):
            delta = np.dot(self.weights[-layer+1].transpose(), delta) * sigmoid_deriv(outputs[-layer])
            grad_biases[-layer] = delta
            grad_weights[-layer] = np.dot(delta, activations[-layer-1].transpose())
        
        # Update weights and biases
        self.weights = [w-lr_rate*grad_w for w, grad_w in zip(self.weights, grad_weights)]
        self.biases = [b-lr_rate*grad_b for b, grad_b in zip(self.biases, grad_biases)]

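As a sanity check on the backprop formulas above, the analytic gradients can be compared against central finite differences. Below is a standalone sketch (independent of the `Network` class, with hypothetical fixed parameters) for a tiny 2-2-1 network:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Hypothetical fixed parameters for a 2-2-1 network (illustration only).
rng = np.random.RandomState(0)
W1, b1 = rng.randn(2, 2), rng.randn(2, 1)
W2, b2 = rng.randn(1, 2), rng.randn(1, 1)
x = np.array([[1.0], [0.0]])
y = 1.0

def cost(W1):
    """Quadratic cost as a function of the first-layer weights."""
    a1 = sigmoid(W1 @ x + b1)
    a2 = sigmoid(W2 @ a1 + b2)
    return 0.5 * ((a2 - y) ** 2).item()

# Analytic gradient via the same backprop formulas as Network.backprop.
z1 = W1 @ x + b1; a1 = sigmoid(z1)
z2 = W2 @ a1 + b2; a2 = sigmoid(z2)
delta2 = (a2 - y) * a2 * (1 - a2)          # output-layer error
delta1 = (W2.T @ delta2) * a1 * (1 - a1)   # hidden-layer error
grad_W1 = delta1 @ x.T

# Numerical gradient via central differences.
eps = 1e-6
num = np.zeros_like(W1)
for i in range(2):
    for j in range(2):
        Wp, Wm = W1.copy(), W1.copy()
        Wp[i, j] += eps
        Wm[i, j] -= eps
        num[i, j] = (cost(Wp) - cost(Wm)) / (2 * eps)

print(np.max(np.abs(grad_W1 - num)))  # prints a very small number
```

If the two gradients disagree by more than roughly $10^{-6}$, the backprop implementation has a bug.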
In [3]:
# Activation Function
def sigmoid(a):
    """Sigmoid function"""
    return 1 / (1 + np.exp(-a))

def sigmoid_deriv(a):
    """Derivative of the sigmoid function"""
    return sigmoid(a) * (1 - sigmoid(a))
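A quick check of these functions: $\sigma(0)=0.5$, the derivative peaks there at $0.25$, and it saturates toward zero for large $|a|$, which is why sigmoid units paired with the quadratic cost learn slowly once they saturate:

```python
import numpy as np

def sigmoid(a):
    return 1 / (1 + np.exp(-a))

def sigmoid_deriv(a):
    return sigmoid(a) * (1 - sigmoid(a))

print(sigmoid(0.0))                # → 0.5
print(sigmoid_deriv(0.0))          # → 0.25 (the maximum)
print(sigmoid_deriv(10.0) < 1e-4)  # → True: near-zero gradient when saturated
```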

Set hyperparameters


In [28]:
# Set training epochs and learning rate
epochs = 50000
lr_rate = 0.6

Train the network


In [81]:
# Train the network 
def train(training_data, desired_output, epochs, lr_rate):
    losses = [] # Store losses for plotting
    epoch_n = [] # Store epochs for plotting
    network = Network([2,2,1])
    for epoch in range(epochs):
        for x, y in zip(training_data, desired_output):
            x = x[:, None] # Convert row vector to column vector
            network.backprop(x, y, lr_rate)
            # Record and print the training progress every 500 epochs
            train_loss = abs(y - network.feedforward(x))
            if epoch % 500 == 0:
                losses.append(train_loss[0][0])
                epoch_n.append(epoch)
                print("Epoch: {}, Training Loss: {:.5f}".format(epoch, train_loss[0][0]))
    return network.weights, network.biases, losses, epoch_n
    
weights, biases, train_losses, epoch_n = train(training_data, desired_output, epochs, lr_rate)


Epoch: 0, Training Loss: 0.64046
Epoch: 0, Training Loss: 0.24261
Epoch: 0, Training Loss: 0.40488
Epoch: 0, Training Loss: 0.65505
... (output truncated)
Epoch: 214, Training Loss: 0.64705
Epoch: 214, Training Loss: 0.37885
Epoch: 215, Training Loss: 0.36692
Epoch: 215, Training Loss: 0.20171
Epoch: 215, Training Loss: 0.64722
Epoch: 215, Training Loss: 0.37858
Epoch: 216, Training Loss: 0.36648
Epoch: 216, Training Loss: 0.20100
Epoch: 216, Training Loss: 0.64738
Epoch: 216, Training Loss: 0.37832
Epoch: 217, Training Loss: 0.36604
Epoch: 217, Training Loss: 0.20029
Epoch: 217, Training Loss: 0.64753
Epoch: 217, Training Loss: 0.37806
Epoch: 218, Training Loss: 0.36560
Epoch: 218, Training Loss: 0.19959
Epoch: 218, Training Loss: 0.64769
Epoch: 218, Training Loss: 0.37780
Epoch: 219, Training Loss: 0.36517
Epoch: 219, Training Loss: 0.19889
Epoch: 219, Training Loss: 0.64784
Epoch: 219, Training Loss: 0.37755
Epoch: 220, Training Loss: 0.36474
Epoch: 220, Training Loss: 0.19820
Epoch: 220, Training Loss: 0.64799
Epoch: 220, Training Loss: 0.37730
Epoch: 221, Training Loss: 0.36431
Epoch: 221, Training Loss: 0.19751
Epoch: 221, Training Loss: 0.64813
Epoch: 221, Training Loss: 0.37706
Epoch: 222, Training Loss: 0.36389
Epoch: 222, Training Loss: 0.19683
Epoch: 222, Training Loss: 0.64827
Epoch: 222, Training Loss: 0.37682
Epoch: 223, Training Loss: 0.36346
Epoch: 223, Training Loss: 0.19615
Epoch: 223, Training Loss: 0.64840
Epoch: 223, Training Loss: 0.37658
Epoch: 224, Training Loss: 0.36304
Epoch: 224, Training Loss: 0.19548
Epoch: 224, Training Loss: 0.64854
Epoch: 224, Training Loss: 0.37635
Epoch: 225, Training Loss: 0.36262
Epoch: 225, Training Loss: 0.19481
Epoch: 225, Training Loss: 0.64867
Epoch: 225, Training Loss: 0.37612
Epoch: 226, Training Loss: 0.36220
Epoch: 226, Training Loss: 0.19414
Epoch: 226, Training Loss: 0.64879
Epoch: 226, Training Loss: 0.37590
Epoch: 227, Training Loss: 0.36179
Epoch: 227, Training Loss: 0.19348
Epoch: 227, Training Loss: 0.64891
Epoch: 227, Training Loss: 0.37567
Epoch: 228, Training Loss: 0.36137
Epoch: 228, Training Loss: 0.19283
Epoch: 228, Training Loss: 0.64903
Epoch: 228, Training Loss: 0.37546
Epoch: 229, Training Loss: 0.36096
Epoch: 229, Training Loss: 0.19218
Epoch: 229, Training Loss: 0.64915
Epoch: 229, Training Loss: 0.37524
Epoch: 230, Training Loss: 0.36055
Epoch: 230, Training Loss: 0.19153
Epoch: 230, Training Loss: 0.64926
Epoch: 230, Training Loss: 0.37504
Epoch: 231, Training Loss: 0.36014
Epoch: 231, Training Loss: 0.19089
Epoch: 231, Training Loss: 0.64937
Epoch: 231, Training Loss: 0.37483
Epoch: 232, Training Loss: 0.35974
Epoch: 232, Training Loss: 0.19026
Epoch: 232, Training Loss: 0.64947
Epoch: 232, Training Loss: 0.37463
Epoch: 233, Training Loss: 0.35934
Epoch: 233, Training Loss: 0.18962
Epoch: 233, Training Loss: 0.64958
Epoch: 233, Training Loss: 0.37443
Epoch: 234, Training Loss: 0.35893
Epoch: 234, Training Loss: 0.18900
Epoch: 234, Training Loss: 0.64968
Epoch: 234, Training Loss: 0.37424
Epoch: 235, Training Loss: 0.35853
Epoch: 235, Training Loss: 0.18837
Epoch: 235, Training Loss: 0.64977
Epoch: 235, Training Loss: 0.37405
Epoch: 236, Training Loss: 0.35814
Epoch: 236, Training Loss: 0.18775
Epoch: 236, Training Loss: 0.64987
Epoch: 236, Training Loss: 0.37386
Epoch: 237, Training Loss: 0.35774
Epoch: 237, Training Loss: 0.18714
Epoch: 237, Training Loss: 0.64995
Epoch: 237, Training Loss: 0.37368
Epoch: 238, Training Loss: 0.35735
Epoch: 238, Training Loss: 0.18653
Epoch: 238, Training Loss: 0.65004
Epoch: 238, Training Loss: 0.37350
Epoch: 239, Training Loss: 0.35695
Epoch: 239, Training Loss: 0.18592
Epoch: 239, Training Loss: 0.65012
Epoch: 239, Training Loss: 0.37332
Epoch: 240, Training Loss: 0.35656
Epoch: 240, Training Loss: 0.18532
Epoch: 240, Training Loss: 0.65021
Epoch: 240, Training Loss: 0.37315
Epoch: 241, Training Loss: 0.35617
Epoch: 241, Training Loss: 0.18472
Epoch: 241, Training Loss: 0.65028
Epoch: 241, Training Loss: 0.37298
Epoch: 242, Training Loss: 0.35578
Epoch: 242, Training Loss: 0.18413
Epoch: 242, Training Loss: 0.65036
Epoch: 242, Training Loss: 0.37281
Epoch: 243, Training Loss: 0.35540
Epoch: 243, Training Loss: 0.18354
Epoch: 243, Training Loss: 0.65043
Epoch: 243, Training Loss: 0.37265
Epoch: 244, Training Loss: 0.35501
Epoch: 244, Training Loss: 0.18296
Epoch: 244, Training Loss: 0.65050
Epoch: 244, Training Loss: 0.37249
Epoch: 245, Training Loss: 0.35463
Epoch: 245, Training Loss: 0.18238
Epoch: 245, Training Loss: 0.65056
Epoch: 245, Training Loss: 0.37234
Epoch: 246, Training Loss: 0.35425
Epoch: 246, Training Loss: 0.18180
Epoch: 246, Training Loss: 0.65063
Epoch: 246, Training Loss: 0.37218
Epoch: 247, Training Loss: 0.35387
Epoch: 247, Training Loss: 0.18123
Epoch: 247, Training Loss: 0.65069
Epoch: 247, Training Loss: 0.37204
Epoch: 248, Training Loss: 0.35349
Epoch: 248, Training Loss: 0.18066
Epoch: 248, Training Loss: 0.65074
Epoch: 248, Training Loss: 0.37189
Epoch: 249, Training Loss: 0.35311
Epoch: 249, Training Loss: 0.18009
Epoch: 249, Training Loss: 0.65080
Epoch: 249, Training Loss: 0.37175
Epoch: 250, Training Loss: 0.35274
Epoch: 250, Training Loss: 0.17953
Epoch: 250, Training Loss: 0.65085
Epoch: 250, Training Loss: 0.37161
Epoch: 251, Training Loss: 0.35236
Epoch: 251, Training Loss: 0.17897
Epoch: 251, Training Loss: 0.65090
Epoch: 251, Training Loss: 0.37148
Epoch: 252, Training Loss: 0.35199
Epoch: 252, Training Loss: 0.17842
Epoch: 252, Training Loss: 0.65094
Epoch: 252, Training Loss: 0.37134
Epoch: 253, Training Loss: 0.35162
Epoch: 253, Training Loss: 0.17787
Epoch: 253, Training Loss: 0.65099
Epoch: 253, Training Loss: 0.37122
Epoch: 254, Training Loss: 0.35125
Epoch: 254, Training Loss: 0.17733
Epoch: 254, Training Loss: 0.65103
Epoch: 254, Training Loss: 0.37109
Epoch: 255, Training Loss: 0.35088
Epoch: 255, Training Loss: 0.17679
Epoch: 255, Training Loss: 0.65106
Epoch: 255, Training Loss: 0.37097
Epoch: 256, Training Loss: 0.35051
Epoch: 256, Training Loss: 0.17625
Epoch: 256, Training Loss: 0.65110
Epoch: 256, Training Loss: 0.37085
Epoch: 257, Training Loss: 0.35014
Epoch: 257, Training Loss: 0.17571
Epoch: 257, Training Loss: 0.65113
Epoch: 257, Training Loss: 0.37073
Epoch: 258, Training Loss: 0.34978
Epoch: 258, Training Loss: 0.17518
Epoch: 258, Training Loss: 0.65116
Epoch: 258, Training Loss: 0.37062
Epoch: 259, Training Loss: 0.34941
Epoch: 259, Training Loss: 0.17466
Epoch: 259, Training Loss: 0.65119
Epoch: 259, Training Loss: 0.37051
Epoch: 260, Training Loss: 0.34905
Epoch: 260, Training Loss: 0.17413
Epoch: 260, Training Loss: 0.65121
Epoch: 260, Training Loss: 0.37041
Epoch: 261, Training Loss: 0.34868
Epoch: 261, Training Loss: 0.17361
Epoch: 261, Training Loss: 0.65123
Epoch: 261, Training Loss: 0.37030
Epoch: 262, Training Loss: 0.34832
Epoch: 262, Training Loss: 0.17310
Epoch: 262, Training Loss: 0.65125
Epoch: 262, Training Loss: 0.37020
Epoch: 263, Training Loss: 0.34796
Epoch: 263, Training Loss: 0.17259
Epoch: 263, Training Loss: 0.65127
Epoch: 263, Training Loss: 0.37011
Epoch: 264, Training Loss: 0.34760
Epoch: 264, Training Loss: 0.17208
Epoch: 264, Training Loss: 0.65128
Epoch: 264, Training Loss: 0.37001
Epoch: 265, Training Loss: 0.34724
Epoch: 265, Training Loss: 0.17157
Epoch: 265, Training Loss: 0.65130
Epoch: 265, Training Loss: 0.36992
Epoch: 266, Training Loss: 0.34689
Epoch: 266, Training Loss: 0.17107
Epoch: 266, Training Loss: 0.65130
Epoch: 266, Training Loss: 0.36983
Epoch: 267, Training Loss: 0.34653
Epoch: 267, Training Loss: 0.17057
Epoch: 267, Training Loss: 0.65131
Epoch: 267, Training Loss: 0.36975
Epoch: 268, Training Loss: 0.34617
Epoch: 268, Training Loss: 0.17008
Epoch: 268, Training Loss: 0.65132
Epoch: 268, Training Loss: 0.36967
Epoch: 269, Training Loss: 0.34582
Epoch: 269, Training Loss: 0.16959
Epoch: 269, Training Loss: 0.65132
Epoch: 269, Training Loss: 0.36959
Epoch: 270, Training Loss: 0.34546
Epoch: 270, Training Loss: 0.16910
Epoch: 270, Training Loss: 0.65132
Epoch: 270, Training Loss: 0.36951
Epoch: 271, Training Loss: 0.34511
Epoch: 271, Training Loss: 0.16861
Epoch: 271, Training Loss: 0.65131
Epoch: 271, Training Loss: 0.36944
Epoch: 272, Training Loss: 0.34475
Epoch: 272, Training Loss: 0.16813
Epoch: 272, Training Loss: 0.65131
Epoch: 272, Training Loss: 0.36937
Epoch: 273, Training Loss: 0.34440
Epoch: 273, Training Loss: 0.16765
Epoch: 273, Training Loss: 0.65130
Epoch: 273, Training Loss: 0.36930
Epoch: 274, Training Loss: 0.34405
Epoch: 274, Training Loss: 0.16718
Epoch: 274, Training Loss: 0.65129
Epoch: 274, Training Loss: 0.36924
Epoch: 275, Training Loss: 0.34370
Epoch: 275, Training Loss: 0.16671
Epoch: 275, Training Loss: 0.65128
Epoch: 275, Training Loss: 0.36918
Epoch: 276, Training Loss: 0.34335
Epoch: 276, Training Loss: 0.16624
Epoch: 276, Training Loss: 0.65126
Epoch: 276, Training Loss: 0.36912
Epoch: 277, Training Loss: 0.34300
Epoch: 277, Training Loss: 0.16577
Epoch: 277, Training Loss: 0.65124
Epoch: 277, Training Loss: 0.36907
Epoch: 278, Training Loss: 0.34265
Epoch: 278, Training Loss: 0.16531
Epoch: 278, Training Loss: 0.65122
Epoch: 278, Training Loss: 0.36901
Epoch: 279, Training Loss: 0.34230
Epoch: 279, Training Loss: 0.16485
Epoch: 279, Training Loss: 0.65120
Epoch: 279, Training Loss: 0.36896
Epoch: 280, Training Loss: 0.34195
Epoch: 280, Training Loss: 0.16440
Epoch: 280, Training Loss: 0.65118
Epoch: 280, Training Loss: 0.36892
Epoch: 281, Training Loss: 0.34160
Epoch: 281, Training Loss: 0.16395
Epoch: 281, Training Loss: 0.65115
Epoch: 281, Training Loss: 0.36887
Epoch: 282, Training Loss: 0.34125
Epoch: 282, Training Loss: 0.16350
Epoch: 282, Training Loss: 0.65112
Epoch: 282, Training Loss: 0.36883
Epoch: 283, Training Loss: 0.34091
Epoch: 283, Training Loss: 0.16305
Epoch: 283, Training Loss: 0.65109
Epoch: 283, Training Loss: 0.36879
Epoch: 284, Training Loss: 0.34056
Epoch: 284, Training Loss: 0.16261
Epoch: 284, Training Loss: 0.65106
Epoch: 284, Training Loss: 0.36876
Epoch: 285, Training Loss: 0.34021
Epoch: 285, Training Loss: 0.16217
Epoch: 285, Training Loss: 0.65102
Epoch: 285, Training Loss: 0.36873
Epoch: 286, Training Loss: 0.33987
Epoch: 286, Training Loss: 0.16173
Epoch: 286, Training Loss: 0.65098
Epoch: 286, Training Loss: 0.36870
Epoch: 287, Training Loss: 0.33952
Epoch: 287, Training Loss: 0.16130
Epoch: 287, Training Loss: 0.65094
Epoch: 287, Training Loss: 0.36867
Epoch: 288, Training Loss: 0.33917
Epoch: 288, Training Loss: 0.16086
Epoch: 288, Training Loss: 0.65090
Epoch: 288, Training Loss: 0.36865
Epoch: 289, Training Loss: 0.33883
Epoch: 289, Training Loss: 0.16044
Epoch: 289, Training Loss: 0.65085
Epoch: 289, Training Loss: 0.36862
Epoch: 290, Training Loss: 0.33848
Epoch: 290, Training Loss: 0.16001
Epoch: 290, Training Loss: 0.65081
Epoch: 290, Training Loss: 0.36861
Epoch: 291, Training Loss: 0.33814
Epoch: 291, Training Loss: 0.15959
Epoch: 291, Training Loss: 0.65076
Epoch: 291, Training Loss: 0.36859
Epoch: 292, Training Loss: 0.33779
Epoch: 292, Training Loss: 0.15917
Epoch: 292, Training Loss: 0.65070
Epoch: 292, Training Loss: 0.36858
Epoch: 293, Training Loss: 0.33745
Epoch: 293, Training Loss: 0.15875
Epoch: 293, Training Loss: 0.65065
Epoch: 293, Training Loss: 0.36857
Epoch: 294, Training Loss: 0.33710
Epoch: 294, Training Loss: 0.15834
Epoch: 294, Training Loss: 0.65059
Epoch: 294, Training Loss: 0.36856
Epoch: 295, Training Loss: 0.33675
Epoch: 295, Training Loss: 0.15793
Epoch: 295, Training Loss: 0.65053
Epoch: 295, Training Loss: 0.36855
Epoch: 296, Training Loss: 0.33641
Epoch: 296, Training Loss: 0.15752
Epoch: 296, Training Loss: 0.65047
Epoch: 296, Training Loss: 0.36855
Epoch: 297, Training Loss: 0.33606
Epoch: 297, Training Loss: 0.15711
Epoch: 297, Training Loss: 0.65041
Epoch: 297, Training Loss: 0.36855
Epoch: 298, Training Loss: 0.33572
Epoch: 298, Training Loss: 0.15671
Epoch: 298, Training Loss: 0.65035
Epoch: 298, Training Loss: 0.36855
Epoch: 299, Training Loss: 0.33537
Epoch: 299, Training Loss: 0.15631
Epoch: 299, Training Loss: 0.65028
Epoch: 299, Training Loss: 0.36856
Epoch: 300, Training Loss: 0.33503
Epoch: 300, Training Loss: 0.15591
Epoch: 300, Training Loss: 0.65021
Epoch: 300, Training Loss: 0.36857
Epoch: 301, Training Loss: 0.33468
Epoch: 301, Training Loss: 0.15552
Epoch: 301, Training Loss: 0.65014
Epoch: 301, Training Loss: 0.36858
Epoch: 302, Training Loss: 0.33433
Epoch: 302, Training Loss: 0.15512
Epoch: 302, Training Loss: 0.65006
Epoch: 302, Training Loss: 0.36859
Epoch: 303, Training Loss: 0.33399
Epoch: 303, Training Loss: 0.15473
Epoch: 303, Training Loss: 0.64999
Epoch: 303, Training Loss: 0.36861
Epoch: 304, Training Loss: 0.33364
Epoch: 304, Training Loss: 0.15435
Epoch: 304, Training Loss: 0.64991
Epoch: 304, Training Loss: 0.36863
Epoch: 305, Training Loss: 0.33329
Epoch: 305, Training Loss: 0.15396
Epoch: 305, Training Loss: 0.64983
Epoch: 305, Training Loss: 0.36865
Epoch: 306, Training Loss: 0.33295
Epoch: 306, Training Loss: 0.15358
Epoch: 306, Training Loss: 0.64974
Epoch: 306, Training Loss: 0.36868
Epoch: 307, Training Loss: 0.33260
Epoch: 307, Training Loss: 0.15320
Epoch: 307, Training Loss: 0.64966
Epoch: 307, Training Loss: 0.36870
Epoch: 308, Training Loss: 0.33225
Epoch: 308, Training Loss: 0.15282
Epoch: 308, Training Loss: 0.64957
Epoch: 308, Training Loss: 0.36873
Epoch: 309, Training Loss: 0.33190
Epoch: 309, Training Loss: 0.15245
Epoch: 309, Training Loss: 0.64948
Epoch: 309, Training Loss: 0.36877
Epoch: 310, Training Loss: 0.33155
Epoch: 310, Training Loss: 0.15208
Epoch: 310, Training Loss: 0.64939
Epoch: 310, Training Loss: 0.36880
Epoch: 311, Training Loss: 0.33120
Epoch: 311, Training Loss: 0.15171
Epoch: 311, Training Loss: 0.64930
Epoch: 311, Training Loss: 0.36884
Epoch: 312, Training Loss: 0.33085
Epoch: 312, Training Loss: 0.15134
Epoch: 312, Training Loss: 0.64920
Epoch: 312, Training Loss: 0.36888
Epoch: 313, Training Loss: 0.33050
Epoch: 313, Training Loss: 0.15098
Epoch: 313, Training Loss: 0.64910
Epoch: 313, Training Loss: 0.36892
Epoch: 314, Training Loss: 0.33015
Epoch: 314, Training Loss: 0.15062
Epoch: 314, Training Loss: 0.64900
Epoch: 314, Training Loss: 0.36897
Epoch: 315, Training Loss: 0.32980
Epoch: 315, Training Loss: 0.15026
Epoch: 315, Training Loss: 0.64890
Epoch: 315, Training Loss: 0.36902
Epoch: 316, Training Loss: 0.32945
Epoch: 316, Training Loss: 0.14990
Epoch: 316, Training Loss: 0.64879
Epoch: 316, Training Loss: 0.36907
Epoch: 317, Training Loss: 0.32909
Epoch: 317, Training Loss: 0.14955
Epoch: 317, Training Loss: 0.64868
Epoch: 317, Training Loss: 0.36912
Epoch: 318, Training Loss: 0.32874
Epoch: 318, Training Loss: 0.14919
Epoch: 318, Training Loss: 0.64857
Epoch: 318, Training Loss: 0.36918
Epoch: 319, Training Loss: 0.32839
Epoch: 319, Training Loss: 0.14885
Epoch: 319, Training Loss: 0.64846
Epoch: 319, Training Loss: 0.36924
Epoch: 320, Training Loss: 0.32803
Epoch: 320, Training Loss: 0.14850
Epoch: 320, Training Loss: 0.64835
Epoch: 320, Training Loss: 0.36930
Epoch: 321, Training Loss: 0.32767
Epoch: 321, Training Loss: 0.14815
Epoch: 321, Training Loss: 0.64823
Epoch: 321, Training Loss: 0.36936
Epoch: 322, Training Loss: 0.32732
Epoch: 322, Training Loss: 0.14781
Epoch: 322, Training Loss: 0.64811
Epoch: 322, Training Loss: 0.36943
Epoch: 323, Training Loss: 0.32696
Epoch: 323, Training Loss: 0.14747
Epoch: 323, Training Loss: 0.64799
Epoch: 323, Training Loss: 0.36950
Epoch: 324, Training Loss: 0.32660
Epoch: 324, Training Loss: 0.14713
Epoch: 324, Training Loss: 0.64787
Epoch: 324, Training Loss: 0.36957
Epoch: 325, Training Loss: 0.32624
Epoch: 325, Training Loss: 0.14680
Epoch: 325, Training Loss: 0.64774
Epoch: 325, Training Loss: 0.36965
Epoch: 326, Training Loss: 0.32588
Epoch: 326, Training Loss: 0.14646
Epoch: 326, Training Loss: 0.64762
Epoch: 326, Training Loss: 0.36973
Epoch: 327, Training Loss: 0.32552
Epoch: 327, Training Loss: 0.14613
Epoch: 327, Training Loss: 0.64749
Epoch: 327, Training Loss: 0.36981
Epoch: 328, Training Loss: 0.32516
Epoch: 328, Training Loss: 0.14580
Epoch: 328, Training Loss: 0.64735
Epoch: 328, Training Loss: 0.36989
Epoch: 329, Training Loss: 0.32479
Epoch: 329, Training Loss: 0.14548
Epoch: 329, Training Loss: 0.64722
Epoch: 329, Training Loss: 0.36998
Epoch: 330, Training Loss: 0.32443
Epoch: 330, Training Loss: 0.14515
Epoch: 330, Training Loss: 0.64708
Epoch: 330, Training Loss: 0.37007
Epoch: 331, Training Loss: 0.32407
Epoch: 331, Training Loss: 0.14483
Epoch: 331, Training Loss: 0.64694
Epoch: 331, Training Loss: 0.37016
Epoch: 332, Training Loss: 0.32370
Epoch: 332, Training Loss: 0.14451
Epoch: 332, Training Loss: 0.64680
Epoch: 332, Training Loss: 0.37025
Epoch: 333, Training Loss: 0.32333
Epoch: 333, Training Loss: 0.14419
Epoch: 333, Training Loss: 0.64666
Epoch: 333, Training Loss: 0.37035
Epoch: 334, Training Loss: 0.32296
Epoch: 334, Training Loss: 0.14388
Epoch: 334, Training Loss: 0.64651
Epoch: 334, Training Loss: 0.37045
Epoch: 335, Training Loss: 0.32259
Epoch: 335, Training Loss: 0.14356
Epoch: 335, Training Loss: 0.64636
Epoch: 335, Training Loss: 0.37055
Epoch: 336, Training Loss: 0.32222
Epoch: 336, Training Loss: 0.14325
Epoch: 336, Training Loss: 0.64621
Epoch: 336, Training Loss: 0.37066
Epoch: 337, Training Loss: 0.32185
Epoch: 337, Training Loss: 0.14294
Epoch: 337, Training Loss: 0.64606
Epoch: 337, Training Loss: 0.37076
Epoch: 338, Training Loss: 0.32148
Epoch: 338, Training Loss: 0.14263
Epoch: 338, Training Loss: 0.64591
Epoch: 338, Training Loss: 0.37087
Epoch: 339, Training Loss: 0.32110
Epoch: 339, Training Loss: 0.14233
Epoch: 339, Training Loss: 0.64575
Epoch: 339, Training Loss: 0.37099
Epoch: 340, Training Loss: 0.32072
Epoch: 340, Training Loss: 0.14203
Epoch: 340, Training Loss: 0.64559
Epoch: 340, Training Loss: 0.37110
Epoch: 341, Training Loss: 0.32035
Epoch: 341, Training Loss: 0.14173
Epoch: 341, Training Loss: 0.64542
Epoch: 341, Training Loss: 0.37122
Epoch: 342, Training Loss: 0.31997
Epoch: 342, Training Loss: 0.14143
Epoch: 342, Training Loss: 0.64526
Epoch: 342, Training Loss: 0.37134
Epoch: 343, Training Loss: 0.31959
Epoch: 343, Training Loss: 0.14113
Epoch: 343, Training Loss: 0.64509
Epoch: 343, Training Loss: 0.37147
Epoch: 344, Training Loss: 0.31921
Epoch: 344, Training Loss: 0.14084
Epoch: 344, Training Loss: 0.64492
Epoch: 344, Training Loss: 0.37159
Epoch: 345, Training Loss: 0.31882
Epoch: 345, Training Loss: 0.14054
Epoch: 345, Training Loss: 0.64475
Epoch: 345, Training Loss: 0.37172
Epoch: 346, Training Loss: 0.31844
Epoch: 346, Training Loss: 0.14025
Epoch: 346, Training Loss: 0.64458
Epoch: 346, Training Loss: 0.37186
Epoch: 347, Training Loss: 0.31805
Epoch: 347, Training Loss: 0.13997
Epoch: 347, Training Loss: 0.64440
Epoch: 347, Training Loss: 0.37199
Epoch: 348, Training Loss: 0.31766
Epoch: 348, Training Loss: 0.13968
Epoch: 348, Training Loss: 0.64422
Epoch: 348, Training Loss: 0.37213
Epoch: 349, Training Loss: 0.31727
Epoch: 349, Training Loss: 0.13940
Epoch: 349, Training Loss: 0.64404
Epoch: 349, Training Loss: 0.37227
Epoch: 350, Training Loss: 0.31688
Epoch: 350, Training Loss: 0.13911
Epoch: 350, Training Loss: 0.64385
Epoch: 350, Training Loss: 0.37242
Epoch: 351, Training Loss: 0.31649
Epoch: 351, Training Loss: 0.13883
Epoch: 351, Training Loss: 0.64367
Epoch: 351, Training Loss: 0.37256
Epoch: 352, Training Loss: 0.31609
Epoch: 352, Training Loss: 0.13855
Epoch: 352, Training Loss: 0.64348
Epoch: 352, Training Loss: 0.37271
Epoch: 353, Training Loss: 0.31570
Epoch: 353, Training Loss: 0.13828
Epoch: 353, Training Loss: 0.64328
Epoch: 353, Training Loss: 0.37286
Epoch: 354, Training Loss: 0.31530
Epoch: 354, Training Loss: 0.13800
Epoch: 354, Training Loss: 0.64309
Epoch: 354, Training Loss: 0.37302
Epoch: 355, Training Loss: 0.31490
Epoch: 355, Training Loss: 0.13773
Epoch: 355, Training Loss: 0.64289
Epoch: 355, Training Loss: 0.37318
Epoch: 356, Training Loss: 0.31450
Epoch: 356, Training Loss: 0.13746
Epoch: 356, Training Loss: 0.64269
Epoch: 356, Training Loss: 0.37334
Epoch: 357, Training Loss: 0.31410
Epoch: 357, Training Loss: 0.13719
Epoch: 357, Training Loss: 0.64249
Epoch: 357, Training Loss: 0.37350
Epoch: 358, Training Loss: 0.31369
Epoch: 358, Training Loss: 0.13693
Epoch: 358, Training Loss: 0.64229
Epoch: 358, Training Loss: 0.37367
Epoch: 359, Training Loss: 0.31329
Epoch: 359, Training Loss: 0.13666
Epoch: 359, Training Loss: 0.64208
Epoch: 359, Training Loss: 0.37384
Epoch: 360, Training Loss: 0.31288
Epoch: 360, Training Loss: 0.13640
Epoch: 360, Training Loss: 0.64187
Epoch: 360, Training Loss: 0.37401
Epoch: 361, Training Loss: 0.31247
Epoch: 361, Training Loss: 0.13614
Epoch: 361, Training Loss: 0.64166
Epoch: 361, Training Loss: 0.37419
Epoch: 362, Training Loss: 0.31205
Epoch: 362, Training Loss: 0.13588
Epoch: 362, Training Loss: 0.64144
Epoch: 362, Training Loss: 0.37436
Epoch: 363, Training Loss: 0.31164
Epoch: 363, Training Loss: 0.13562
Epoch: 363, Training Loss: 0.64122
Epoch: 363, Training Loss: 0.37454
Epoch: 364, Training Loss: 0.31122
Epoch: 364, Training Loss: 0.13537
Epoch: 364, Training Loss: 0.64100
Epoch: 364, Training Loss: 0.37473
Epoch: 365, Training Loss: 0.31081
Epoch: 365, Training Loss: 0.13512
Epoch: 365, Training Loss: 0.64078
Epoch: 365, Training Loss: 0.37492
Epoch: 366, Training Loss: 0.31039
Epoch: 366, Training Loss: 0.13486
Epoch: 366, Training Loss: 0.64055
Epoch: 366, Training Loss: 0.37511
Epoch: 367, Training Loss: 0.30996
Epoch: 367, Training Loss: 0.13462
Epoch: 367, Training Loss: 0.64033
Epoch: 367, Training Loss: 0.37530
Epoch: 368, Training Loss: 0.30954
Epoch: 368, Training Loss: 0.13437
Epoch: 368, Training Loss: 0.64009
Epoch: 368, Training Loss: 0.37549
Epoch: 369, Training Loss: 0.30911
Epoch: 369, Training Loss: 0.13412
Epoch: 369, Training Loss: 0.63986
Epoch: 369, Training Loss: 0.37569
Epoch: 370, Training Loss: 0.30868
Epoch: 370, Training Loss: 0.13388
Epoch: 370, Training Loss: 0.63962
Epoch: 370, Training Loss: 0.37589
Epoch: 371, Training Loss: 0.30825
Epoch: 371, Training Loss: 0.13364
Epoch: 371, Training Loss: 0.63938
Epoch: 371, Training Loss: 0.37610
Epoch: 372, Training Loss: 0.30782
Epoch: 372, Training Loss: 0.13340
Epoch: 372, Training Loss: 0.63914
Epoch: 372, Training Loss: 0.37631
Epoch: 373, Training Loss: 0.30739
Epoch: 373, Training Loss: 0.13316
Epoch: 373, Training Loss: 0.63890
Epoch: 373, Training Loss: 0.37652
Epoch: 374, Training Loss: 0.30695
Epoch: 374, Training Loss: 0.13292
Epoch: 374, Training Loss: 0.63865
Epoch: 374, Training Loss: 0.37673
Epoch: 375, Training Loss: 0.30651
Epoch: 375, Training Loss: 0.13269
Epoch: 375, Training Loss: 0.63840
Epoch: 375, Training Loss: 0.37695
Epoch: 376, Training Loss: 0.30607
Epoch: 376, Training Loss: 0.13246
Epoch: 376, Training Loss: 0.63815
Epoch: 376, Training Loss: 0.37717
Epoch: 377, Training Loss: 0.30563
Epoch: 377, Training Loss: 0.13223
Epoch: 377, Training Loss: 0.63789
Epoch: 377, Training Loss: 0.37739
Epoch: 378, Training Loss: 0.30518
Epoch: 378, Training Loss: 0.13200
Epoch: 378, Training Loss: 0.63763
Epoch: 378, Training Loss: 0.37761
Epoch: 379, Training Loss: 0.30473
Epoch: 379, Training Loss: 0.13177
Epoch: 379, Training Loss: 0.63737
Epoch: 379, Training Loss: 0.37784
Epoch: 380, Training Loss: 0.30428
Epoch: 380, Training Loss: 0.13155
Epoch: 380, Training Loss: 0.63711
Epoch: 380, Training Loss: 0.37807
Epoch: 381, Training Loss: 0.30383
Epoch: 381, Training Loss: 0.13132
Epoch: 381, Training Loss: 0.63684
Epoch: 381, Training Loss: 0.37831
Epoch: 382, Training Loss: 0.30337
Epoch: 382, Training Loss: 0.13110
Epoch: 382, Training Loss: 0.63657
Epoch: 382, Training Loss: 0.37855
Epoch: 383, Training Loss: 0.30292
Epoch: 383, Training Loss: 0.13088
Epoch: 383, Training Loss: 0.63630
Epoch: 383, Training Loss: 0.37879
Epoch: 384, Training Loss: 0.30246
Epoch: 384, Training Loss: 0.13066
Epoch: 384, Training Loss: 0.63602
Epoch: 384, Training Loss: 0.37903
Epoch: 385, Training Loss: 0.30200
Epoch: 385, Training Loss: 0.13045
Epoch: 385, Training Loss: 0.63574
Epoch: 385, Training Loss: 0.37928
Epoch: 386, Training Loss: 0.30153
Epoch: 386, Training Loss: 0.13023
Epoch: 386, Training Loss: 0.63546
Epoch: 386, Training Loss: 0.37953
Epoch: 387, Training Loss: 0.30107
Epoch: 387, Training Loss: 0.13002
Epoch: 387, Training Loss: 0.63518
Epoch: 387, Training Loss: 0.37978
Epoch: 388, Training Loss: 0.30060
Epoch: 388, Training Loss: 0.12981
Epoch: 388, Training Loss: 0.63489
Epoch: 388, Training Loss: 0.38004
Epoch: 389, Training Loss: 0.30013
Epoch: 389, Training Loss: 0.12960
Epoch: 389, Training Loss: 0.63460
Epoch: 389, Training Loss: 0.38029
Epoch: 390, Training Loss: 0.29965
Epoch: 390, Training Loss: 0.12940
Epoch: 390, Training Loss: 0.63431
Epoch: 390, Training Loss: 0.38056
Epoch: 391, Training Loss: 0.29918
Epoch: 391, Training Loss: 0.12919
Epoch: 391, Training Loss: 0.63401
Epoch: 391, Training Loss: 0.38082
Epoch: 392, Training Loss: 0.29870
Epoch: 392, Training Loss: 0.12899
Epoch: 392, Training Loss: 0.63371
Epoch: 392, Training Loss: 0.38109
Epoch: 393, Training Loss: 0.29822
Epoch: 393, Training Loss: 0.12879
Epoch: 393, Training Loss: 0.63341
Epoch: 393, Training Loss: 0.38136
Epoch: 394, Training Loss: 0.29773
Epoch: 394, Training Loss: 0.12859
Epoch: 394, Training Loss: 0.63310
Epoch: 394, Training Loss: 0.38164
Epoch: 395, Training Loss: 0.29725
Epoch: 395, Training Loss: 0.12839
Epoch: 395, Training Loss: 0.63279
Epoch: 395, Training Loss: 0.38191
Epoch: 396, Training Loss: 0.29676
Epoch: 396, Training Loss: 0.12819
Epoch: 396, Training Loss: 0.63248
Epoch: 396, Training Loss: 0.38219
Epoch: 397, Training Loss: 0.29627
Epoch: 397, Training Loss: 0.12800
Epoch: 397, Training Loss: 0.63217
Epoch: 397, Training Loss: 0.38248
Epoch: 398, Training Loss: 0.29578
Epoch: 398, Training Loss: 0.12780
Epoch: 398, Training Loss: 0.63185
Epoch: 398, Training Loss: 0.38276
Epoch: 399, Training Loss: 0.29528
Epoch: 399, Training Loss: 0.12761
Epoch: 399, Training Loss: 0.63153
Epoch: 399, Training Loss: 0.38305
Epoch: 400, Training Loss: 0.29478
Epoch: 400, Training Loss: 0.12742
Epoch: 400, Training Loss: 0.63121
Epoch: 400, Training Loss: 0.38334
Epoch: 401, Training Loss: 0.29428
Epoch: 401, Training Loss: 0.12724
Epoch: 401, Training Loss: 0.63088
Epoch: 401, Training Loss: 0.38364
Epoch: 402, Training Loss: 0.29378
Epoch: 402, Training Loss: 0.12705
Epoch: 402, Training Loss: 0.63056
Epoch: 402, Training Loss: 0.38394
Epoch: 403, Training Loss: 0.29327
Epoch: 403, Training Loss: 0.12687
Epoch: 403, Training Loss: 0.63022
Epoch: 403, Training Loss: 0.38424
Epoch: 404, Training Loss: 0.29277
Epoch: 404, Training Loss: 0.12668
Epoch: 404, Training Loss: 0.62989
Epoch: 404, Training Loss: 0.38455
Epoch: 405, Training Loss: 0.29226
Epoch: 405, Training Loss: 0.12650
Epoch: 405, Training Loss: 0.62955
Epoch: 405, Training Loss: 0.38485
Epoch: 406, Training Loss: 0.29174
Epoch: 406, Training Loss: 0.12632
Epoch: 406, Training Loss: 0.62921
Epoch: 406, Training Loss: 0.38516
Epoch: 407, Training Loss: 0.29123
Epoch: 407, Training Loss: 0.12615
Epoch: 407, Training Loss: 0.62887
Epoch: 407, Training Loss: 0.38548
Epoch: 408, Training Loss: 0.29071
Epoch: 408, Training Loss: 0.12597
Epoch: 408, Training Loss: 0.62852
Epoch: 408, Training Loss: 0.38579
Epoch: 409, Training Loss: 0.29019
Epoch: 409, Training Loss: 0.12580
Epoch: 409, Training Loss: 0.62817
Epoch: 409, Training Loss: 0.38611
Epoch: 410, Training Loss: 0.28967
Epoch: 410, Training Loss: 0.12562
Epoch: 410, Training Loss: 0.62782
Epoch: 410, Training Loss: 0.38644
Epoch: 411, Training Loss: 0.28914
Epoch: 411, Training Loss: 0.12545
Epoch: 411, Training Loss: 0.62746
Epoch: 411, Training Loss: 0.38676
Epoch: 412, Training Loss: 0.28862
Epoch: 412, Training Loss: 0.12529
Epoch: 412, Training Loss: 0.62710
Epoch: 412, Training Loss: 0.38709
Epoch: 413, Training Loss: 0.28809
Epoch: 413, Training Loss: 0.12512
Epoch: 413, Training Loss: 0.62674
Epoch: 413, Training Loss: 0.38742
Epoch: 414, Training Loss: 0.28756
Epoch: 414, Training Loss: 0.12495
Epoch: 414, Training Loss: 0.62638
Epoch: 414, Training Loss: 0.38776
Epoch: 415, Training Loss: 0.28702
Epoch: 415, Training Loss: 0.12479
Epoch: 415, Training Loss: 0.62601
Epoch: 415, Training Loss: 0.38809
Epoch: 416, Training Loss: 0.28648
Epoch: 416, Training Loss: 0.12463
Epoch: 416, Training Loss: 0.62564
Epoch: 416, Training Loss: 0.38843
Epoch: 417, Training Loss: 0.28595
Epoch: 417, Training Loss: 0.12447
Epoch: 417, Training Loss: 0.62526
Epoch: 417, Training Loss: 0.38878
Epoch: 418, Training Loss: 0.28540
Epoch: 418, Training Loss: 0.12431
Epoch: 418, Training Loss: 0.62489
Epoch: 418, Training Loss: 0.38912
Epoch: 419, Training Loss: 0.28486
Epoch: 419, Training Loss: 0.12415
Epoch: 419, Training Loss: 0.62451
Epoch: 419, Training Loss: 0.38947
Epoch: 420, Training Loss: 0.28431
Epoch: 420, Training Loss: 0.12400
Epoch: 420, Training Loss: 0.62413
Epoch: 420, Training Loss: 0.38982
Epoch: 421, Training Loss: 0.28377
Epoch: 421, Training Loss: 0.12384
Epoch: 421, Training Loss: 0.62374
Epoch: 421, Training Loss: 0.39018
Epoch: 422, Training Loss: 0.28322
Epoch: 422, Training Loss: 0.12369
Epoch: 422, Training Loss: 0.62335
Epoch: 422, Training Loss: 0.39054
Epoch: 423, Training Loss: 0.28266
Epoch: 423, Training Loss: 0.12354
Epoch: 423, Training Loss: 0.62296
Epoch: 423, Training Loss: 0.39090
Epoch: 424, Training Loss: 0.28211
Epoch: 424, Training Loss: 0.12339
Epoch: 424, Training Loss: 0.62257
Epoch: 424, Training Loss: 0.39126
Epoch: 425, Training Loss: 0.28155
Epoch: 425, Training Loss: 0.12325
Epoch: 425, Training Loss: 0.62217
Epoch: 425, Training Loss: 0.39162
Epoch: 426, Training Loss: 0.28099
Epoch: 426, Training Loss: 0.12310
Epoch: 426, Training Loss: 0.62177
Epoch: 426, Training Loss: 0.39199
Epoch: 427, Training Loss: 0.28043
Epoch: 427, Training Loss: 0.12296
Epoch: 427, Training Loss: 0.62137
Epoch: 427, Training Loss: 0.39236
Epoch: 428, Training Loss: 0.27987
Epoch: 428, Training Loss: 0.12282
Epoch: 428, Training Loss: 0.62096
Epoch: 428, Training Loss: 0.39274
Epoch: 429, Training Loss: 0.27930
Epoch: 429, Training Loss: 0.12268
Epoch: 429, Training Loss: 0.62055
Epoch: 429, Training Loss: 0.39311
Epoch: 430, Training Loss: 0.27873
Epoch: 430, Training Loss: 0.12254
Epoch: 430, Training Loss: 0.62014
Epoch: 430, Training Loss: 0.39349
Epoch: 431, Training Loss: 0.27816
Epoch: 431, Training Loss: 0.12240
Epoch: 431, Training Loss: 0.61973
Epoch: 431, Training Loss: 0.39387
Epoch: 432, Training Loss: 0.27759
Epoch: 432, Training Loss: 0.12227
Epoch: 432, Training Loss: 0.61931
Epoch: 432, Training Loss: 0.39426
Epoch: 433, Training Loss: 0.27702
Epoch: 433, Training Loss: 0.12213
Epoch: 433, Training Loss: 0.61889
Epoch: 433, Training Loss: 0.39464
Epoch: 434, Training Loss: 0.27644
Epoch: 434, Training Loss: 0.12200
Epoch: 434, Training Loss: 0.61847
Epoch: 434, Training Loss: 0.39503
Epoch: 435, Training Loss: 0.27587
Epoch: 435, Training Loss: 0.12187
Epoch: 435, Training Loss: 0.61804
Epoch: 435, Training Loss: 0.39542
Epoch: 436, Training Loss: 0.27529
Epoch: 436, Training Loss: 0.12174
Epoch: 436, Training Loss: 0.61762
Epoch: 436, Training Loss: 0.39582
Epoch: 437, Training Loss: 0.27470
Epoch: 437, Training Loss: 0.12161
Epoch: 437, Training Loss: 0.61719
Epoch: 437, Training Loss: 0.39621
Epoch: 438, Training Loss: 0.27412
Epoch: 438, Training Loss: 0.12149
Epoch: 438, Training Loss: 0.61675
Epoch: 438, Training Loss: 0.39661
Epoch: 439, Training Loss: 0.27354
Epoch: 439, Training Loss: 0.12136
Epoch: 439, Training Loss: 0.61632
Epoch: 439, Training Loss: 0.39701
Epoch: 440, Training Loss: 0.27295
Epoch: 440, Training Loss: 0.12124
Epoch: 440, Training Loss: 0.61588
Epoch: 440, Training Loss: 0.39742
Epoch: 441, Training Loss: 0.27236
Epoch: 441, Training Loss: 0.12112
Epoch: 441, Training Loss: 0.61544
Epoch: 441, Training Loss: 0.39782
Epoch: 442, Training Loss: 0.27177
Epoch: 442, Training Loss: 0.12100
Epoch: 442, Training Loss: 0.61499
Epoch: 442, Training Loss: 0.39823
Epoch: 443, Training Loss: 0.27118
Epoch: 443, Training Loss: 0.12088
Epoch: 443, Training Loss: 0.61455
Epoch: 443, Training Loss: 0.39864
Epoch: 444, Training Loss: 0.27058
Epoch: 444, Training Loss: 0.12077
Epoch: 444, Training Loss: 0.61410
Epoch: 444, Training Loss: 0.39905
Epoch: 445, Training Loss: 0.26999
Epoch: 445, Training Loss: 0.12065
Epoch: 445, Training Loss: 0.61365
Epoch: 445, Training Loss: 0.39947
Epoch: 446, Training Loss: 0.26939
Epoch: 446, Training Loss: 0.12054
Epoch: 446, Training Loss: 0.61320
Epoch: 446, Training Loss: 0.39988
Epoch: 447, Training Loss: 0.26880
Epoch: 447, Training Loss: 0.12043
Epoch: 447, Training Loss: 0.61274
Epoch: 447, Training Loss: 0.40030
Epoch: 448, Training Loss: 0.26820
Epoch: 448, Training Loss: 0.12032
Epoch: 448, Training Loss: 0.61228
Epoch: 448, Training Loss: 0.40072
Epoch: 449, Training Loss: 0.26759
Epoch: 449, Training Loss: 0.12021
Epoch: 449, Training Loss: 0.61182
Epoch: 449, Training Loss: 0.40114
Epoch: 450, Training Loss: 0.26699
Epoch: 450, Training Loss: 0.12010
Epoch: 450, Training Loss: 0.61136
Epoch: 450, Training Loss: 0.40157
Epoch: 451, Training Loss: 0.26639
Epoch: 451, Training Loss: 0.12000
Epoch: 451, Training Loss: 0.61090
Epoch: 451, Training Loss: 0.40199
Epoch: 452, Training Loss: 0.26578
Epoch: 452, Training Loss: 0.11989
Epoch: 452, Training Loss: 0.61043
Epoch: 452, Training Loss: 0.40242
Epoch: 453, Training Loss: 0.26518
Epoch: 453, Training Loss: 0.11979
Epoch: 453, Training Loss: 0.60996
Epoch: 453, Training Loss: 0.40285
Epoch: 454, Training Loss: 0.26457
Epoch: 454, Training Loss: 0.11969
Epoch: 454, Training Loss: 0.60949
Epoch: 454, Training Loss: 0.40328
Epoch: 455, Training Loss: 0.26396
Epoch: 455, Training Loss: 0.11959
Epoch: 455, Training Loss: 0.60902
Epoch: 455, Training Loss: 0.40371
Epoch: 456, Training Loss: 0.26335
Epoch: 456, Training Loss: 0.11949
Epoch: 456, Training Loss: 0.60854
Epoch: 456, Training Loss: 0.40415
Epoch: 457, Training Loss: 0.26274
Epoch: 457, Training Loss: 0.11939
Epoch: 457, Training Loss: 0.60806
Epoch: 457, Training Loss: 0.40458
Epoch: 458, Training Loss: 0.26213
Epoch: 458, Training Loss: 0.11930
Epoch: 458, Training Loss: 0.60759
Epoch: 458, Training Loss: 0.40502
Epoch: 459, Training Loss: 0.26152
Epoch: 459, Training Loss: 0.11920
Epoch: 459, Training Loss: 0.60710
Epoch: 459, Training Loss: 0.40546
Epoch: 460, Training Loss: 0.26090
Epoch: 460, Training Loss: 0.11911
Epoch: 460, Training Loss: 0.60662
Epoch: 460, Training Loss: 0.40590
Epoch: 461, Training Loss: 0.26029
Epoch: 461, Training Loss: 0.11902
Epoch: 461, Training Loss: 0.60614
Epoch: 461, Training Loss: 0.40634
Epoch: 462, Training Loss: 0.25967
Epoch: 462, Training Loss: 0.11893
Epoch: 462, Training Loss: 0.60565
Epoch: 462, Training Loss: 0.40678
Epoch: 463, Training Loss: 0.25906
Epoch: 463, Training Loss: 0.11884
Epoch: 463, Training Loss: 0.60516
Epoch: 463, Training Loss: 0.40723
Epoch: 464, Training Loss: 0.25844
Epoch: 464, Training Loss: 0.11875
Epoch: 464, Training Loss: 0.60467
Epoch: 464, Training Loss: 0.40767
Epoch: 465, Training Loss: 0.25782
Epoch: 465, Training Loss: 0.11867
Epoch: 465, Training Loss: 0.60418
Epoch: 465, Training Loss: 0.40812
Epoch: 466, Training Loss: 0.25720
Epoch: 466, Training Loss: 0.11858
Epoch: 466, Training Loss: 0.60369
Epoch: 466, Training Loss: 0.40857
Epoch: 467, Training Loss: 0.25658
Epoch: 467, Training Loss: 0.11850
Epoch: 467, Training Loss: 0.60319
Epoch: 467, Training Loss: 0.40901
Epoch: 468, Training Loss: 0.25596
Epoch: 468, Training Loss: 0.11842
Epoch: 468, Training Loss: 0.60270
Epoch: 468, Training Loss: 0.40946
Epoch: 469, Training Loss: 0.25534
Epoch: 469, Training Loss: 0.11834
Epoch: 469, Training Loss: 0.60220
Epoch: 469, Training Loss: 0.40991
Epoch: 470, Training Loss: 0.25472
Epoch: 470, Training Loss: 0.11826
Epoch: 470, Training Loss: 0.60170
Epoch: 470, Training Loss: 0.41036
Epoch: 471, Training Loss: 0.25410
Epoch: 471, Training Loss: 0.11818
Epoch: 471, Training Loss: 0.60120
Epoch: 471, Training Loss: 0.41082
Epoch: 472, Training Loss: 0.25348
Epoch: 472, Training Loss: 0.11810
Epoch: 472, Training Loss: 0.60070
Epoch: 472, Training Loss: 0.41127
Epoch: 473, Training Loss: 0.25286
Epoch: 473, Training Loss: 0.11803
Epoch: 473, Training Loss: 0.60019
Epoch: 473, Training Loss: 0.41172
Epoch: 474, Training Loss: 0.25224
Epoch: 474, Training Loss: 0.11795
Epoch: 474, Training Loss: 0.59969
Epoch: 474, Training Loss: 0.41218
Epoch: 475, Training Loss: 0.25162
Epoch: 475, Training Loss: 0.11788
Epoch: 475, Training Loss: 0.59919
Epoch: 475, Training Loss: 0.41263
Epoch: 476, Training Loss: 0.25099
Epoch: 476, Training Loss: 0.11781
Epoch: 476, Training Loss: 0.59868
Epoch: 476, Training Loss: 0.41309
Epoch: 477, Training Loss: 0.25037
Epoch: 477, Training Loss: 0.11774
Epoch: 477, Training Loss: 0.59817
Epoch: 477, Training Loss: 0.41354
Epoch: 478, Training Loss: 0.24975
Epoch: 478, Training Loss: 0.11767
Epoch: 478, Training Loss: 0.59766
Epoch: 478, Training Loss: 0.41400
Epoch: 479, Training Loss: 0.24913
Epoch: 479, Training Loss: 0.11760
Epoch: 479, Training Loss: 0.59715
Epoch: 479, Training Loss: 0.41445
Epoch: 480, Training Loss: 0.24851
Epoch: 480, Training Loss: 0.11753
Epoch: 480, Training Loss: 0.59664
Epoch: 480, Training Loss: 0.41491
Epoch: 481, Training Loss: 0.24788
Epoch: 481, Training Loss: 0.11747
Epoch: 481, Training Loss: 0.59613
Epoch: 481, Training Loss: 0.41537
Epoch: 482, Training Loss: 0.24726
Epoch: 482, Training Loss: 0.11740
Epoch: 482, Training Loss: 0.59562
Epoch: 482, Training Loss: 0.41582
Epoch: 483, Training Loss: 0.24664
Epoch: 483, Training Loss: 0.11734
Epoch: 483, Training Loss: 0.59511
Epoch: 483, Training Loss: 0.41628
Epoch: 484, Training Loss: 0.24602
Epoch: 484, Training Loss: 0.11728
Epoch: 484, Training Loss: 0.59459
Epoch: 484, Training Loss: 0.41674
Epoch: 485, Training Loss: 0.24540
Epoch: 485, Training Loss: 0.11722
Epoch: 485, Training Loss: 0.59408
Epoch: 485, Training Loss: 0.41719
Epoch: 486, Training Loss: 0.24478
Epoch: 486, Training Loss: 0.11716
Epoch: 486, Training Loss: 0.59357
Epoch: 486, Training Loss: 0.41765
Epoch: 487, Training Loss: 0.24415
Epoch: 487, Training Loss: 0.11710
Epoch: 487, Training Loss: 0.59305
Epoch: 487, Training Loss: 0.41811
Epoch: 488, Training Loss: 0.24353
Epoch: 488, Training Loss: 0.11704
Epoch: 488, Training Loss: 0.59254
Epoch: 488, Training Loss: 0.41856
Epoch: 489, Training Loss: 0.24291
Epoch: 489, Training Loss: 0.11699
Epoch: 489, Training Loss: 0.59202
Epoch: 489, Training Loss: 0.41902
Epoch: 490, Training Loss: 0.24230
Epoch: 490, Training Loss: 0.11693
Epoch: 490, Training Loss: 0.59150
Epoch: 490, Training Loss: 0.41948
Epoch: 491, Training Loss: 0.24168
Epoch: 491, Training Loss: 0.11688
Epoch: 491, Training Loss: 0.59099
Epoch: 491, Training Loss: 0.41993
Epoch: 492, Training Loss: 0.24106
Epoch: 492, Training Loss: 0.11682
Epoch: 492, Training Loss: 0.59047
Epoch: 492, Training Loss: 0.42039
Epoch: 493, Training Loss: 0.24044
Epoch: 493, Training Loss: 0.11677
Epoch: 493, Training Loss: 0.58995
Epoch: 493, Training Loss: 0.42084
Epoch: 494, Training Loss: 0.23982
Epoch: 494, Training Loss: 0.11672
Epoch: 494, Training Loss: 0.58944
Epoch: 494, Training Loss: 0.42130
Epoch: 495, Training Loss: 0.23921
Epoch: 495, Training Loss: 0.11667
Epoch: 495, Training Loss: 0.58892
Epoch: 495, Training Loss: 0.42175
Epoch: 496, Training Loss: 0.23859
Epoch: 496, Training Loss: 0.11662
Epoch: 496, Training Loss: 0.58840
Epoch: 496, Training Loss: 0.42220
Epoch: 497, Training Loss: 0.23798
Epoch: 497, Training Loss: 0.11657
Epoch: 497, Training Loss: 0.58788
Epoch: 497, Training Loss: 0.42266
Epoch: 498, Training Loss: 0.23736
Epoch: 498, Training Loss: 0.11652
Epoch: 498, Training Loss: 0.58737
Epoch: 498, Training Loss: 0.42311
Epoch: 499, Training Loss: 0.23675
Epoch: 499, Training Loss: 0.11648
Epoch: 499, Training Loss: 0.58685
Epoch: 499, Training Loss: 0.42356
Epoch: 500, Training Loss: 0.23614
Epoch: 500, Training Loss: 0.11643
Epoch: 500, Training Loss: 0.58633
Epoch: 500, Training Loss: 0.42401
Epoch: 501, Training Loss: 0.23553
Epoch: 501, Training Loss: 0.11639
Epoch: 501, Training Loss: 0.58582
Epoch: 501, Training Loss: 0.42446
Epoch: 502, Training Loss: 0.23492
Epoch: 502, Training Loss: 0.11634
Epoch: 502, Training Loss: 0.58530
Epoch: 502, Training Loss: 0.42491
Epoch: 503, Training Loss: 0.23431
Epoch: 503, Training Loss: 0.11630
Epoch: 503, Training Loss: 0.58478
Epoch: 503, Training Loss: 0.42535
Epoch: 504, Training Loss: 0.23370
Epoch: 504, Training Loss: 0.11626
Epoch: 504, Training Loss: 0.58427
Epoch: 504, Training Loss: 0.42580
Epoch: 505, Training Loss: 0.23309
Epoch: 505, Training Loss: 0.11622
Epoch: 505, Training Loss: 0.58375
Epoch: 505, Training Loss: 0.42624
Epoch: 506, Training Loss: 0.23249
Epoch: 506, Training Loss: 0.11618
Epoch: 506, Training Loss: 0.58324
Epoch: 506, Training Loss: 0.42669
Epoch: 507, Training Loss: 0.23188
Epoch: 507, Training Loss: 0.11614
Epoch: 507, Training Loss: 0.58272
Epoch: 507, Training Loss: 0.42713
Epoch: 508, Training Loss: 0.23128
Epoch: 508, Training Loss: 0.11610
Epoch: 508, Training Loss: 0.58221
Epoch: 508, Training Loss: 0.42757
Epoch: 509, Training Loss: 0.23068
Epoch: 509, Training Loss: 0.11607
Epoch: 509, Training Loss: 0.58169
Epoch: 509, Training Loss: 0.42801
Epoch: 510, Training Loss: 0.23008
Epoch: 510, Training Loss: 0.11603
Epoch: 510, Training Loss: 0.58118
Epoch: 510, Training Loss: 0.42845
Epoch: 511, Training Loss: 0.22948
Epoch: 511, Training Loss: 0.11599
Epoch: 511, Training Loss: 0.58067
Epoch: 511, Training Loss: 0.42889
Epoch: 512, Training Loss: 0.22888
Epoch: 512, Training Loss: 0.11596
Epoch: 512, Training Loss: 0.58016
Epoch: 512, Training Loss: 0.42932
Epoch: 513, Training Loss: 0.22829
Epoch: 513, Training Loss: 0.11593
Epoch: 513, Training Loss: 0.57965
Epoch: 513, Training Loss: 0.42976
Epoch: 514, Training Loss: 0.22769
Epoch: 514, Training Loss: 0.11589
Epoch: 514, Training Loss: 0.57914
Epoch: 514, Training Loss: 0.43019
Epoch: 515, Training Loss: 0.22710
Epoch: 515, Training Loss: 0.11586
Epoch: 515, Training Loss: 0.57863
Epoch: 515, Training Loss: 0.43062
Epoch: 516, Training Loss: 0.22650
Epoch: 516, Training Loss: 0.11583
Epoch: 516, Training Loss: 0.57812
Epoch: 516, Training Loss: 0.43105
Epoch: 517, Training Loss: 0.22591
Epoch: 517, Training Loss: 0.11580
Epoch: 517, Training Loss: 0.57761
Epoch: 517, Training Loss: 0.43148
Epoch: 518, Training Loss: 0.22532
Epoch: 518, Training Loss: 0.11577
Epoch: 518, Training Loss: 0.57710
Epoch: 518, Training Loss: 0.43190
Epoch: 519, Training Loss: 0.22474
Epoch: 519, Training Loss: 0.11574
Epoch: 519, Training Loss: 0.57660
Epoch: 519, Training Loss: 0.43233
Epoch: 520, Training Loss: 0.22415
Epoch: 520, Training Loss: 0.11572
Epoch: 520, Training Loss: 0.57609
Epoch: 520, Training Loss: 0.43275
Epoch: 521, Training Loss: 0.22357
Epoch: 521, Training Loss: 0.11569
Epoch: 521, Training Loss: 0.57559
Epoch: 521, Training Loss: 0.43317
Epoch: 522, Training Loss: 0.22298
Epoch: 522, Training Loss: 0.11566
Epoch: 522, Training Loss: 0.57509
Epoch: 522, Training Loss: 0.43359
Epoch: 523, Training Loss: 0.22240
Epoch: 523, Training Loss: 0.11564
Epoch: 523, Training Loss: 0.57459
Epoch: 523, Training Loss: 0.43401
Epoch: 524, Training Loss: 0.22182
Epoch: 524, Training Loss: 0.11561
Epoch: 524, Training Loss: 0.57409
Epoch: 524, Training Loss: 0.43442
Epoch: 525, Training Loss: 0.22125
Epoch: 525, Training Loss: 0.11559
Epoch: 525, Training Loss: 0.57359
Epoch: 525, Training Loss: 0.43484
Epoch: 526, Training Loss: 0.22067
Epoch: 526, Training Loss: 0.11557
Epoch: 526, Training Loss: 0.57309
Epoch: 526, Training Loss: 0.43525
Epoch: 527, Training Loss: 0.22010
Epoch: 527, Training Loss: 0.11555
Epoch: 527, Training Loss: 0.57259
Epoch: 527, Training Loss: 0.43565
Epoch: 528, Training Loss: 0.21952
Epoch: 528, Training Loss: 0.11552
Epoch: 528, Training Loss: 0.57210
Epoch: 528, Training Loss: 0.43606
Epoch: 529, Training Loss: 0.21895
Epoch: 529, Training Loss: 0.11550
Epoch: 529, Training Loss: 0.57160
Epoch: 529, Training Loss: 0.43647
Epoch: 530, Training Loss: 0.21839
Epoch: 530, Training Loss: 0.11548
Epoch: 530, Training Loss: 0.57111
Epoch: 530, Training Loss: 0.43687
Epoch: 531, Training Loss: 0.21782
Epoch: 531, Training Loss: 0.11547
Epoch: 531, Training Loss: 0.57062
Epoch: 531, Training Loss: 0.43727
Epoch: 532, Training Loss: 0.21725
Epoch: 532, Training Loss: 0.11545
Epoch: 532, Training Loss: 0.57013
Epoch: 532, Training Loss: 0.43767
Epoch: 533, Training Loss: 0.21669
Epoch: 533, Training Loss: 0.11543
Epoch: 533, Training Loss: 0.56964
Epoch: 533, Training Loss: 0.43806
Epoch: 534, Training Loss: 0.21613
Epoch: 534, Training Loss: 0.11541
Epoch: 534, Training Loss: 0.56916
Epoch: 534, Training Loss: 0.43846
Epoch: 535, Training Loss: 0.21557
Epoch: 535, Training Loss: 0.11540
Epoch: 535, Training Loss: 0.56867
Epoch: 535, Training Loss: 0.43885
Epoch: 536, Training Loss: 0.21502
Epoch: 536, Training Loss: 0.11538
Epoch: 536, Training Loss: 0.56819
Epoch: 536, Training Loss: 0.43923
Epoch: 537, Training Loss: 0.21446
Epoch: 537, Training Loss: 0.11537
Epoch: 537, Training Loss: 0.56770
Epoch: 537, Training Loss: 0.43962
Epoch: 538, Training Loss: 0.21391
Epoch: 538, Training Loss: 0.11536
Epoch: 538, Training Loss: 0.56722
Epoch: 538, Training Loss: 0.44000
Epoch: 539, Training Loss: 0.21336
Epoch: 539, Training Loss: 0.11534
Epoch: 539, Training Loss: 0.56674
Epoch: 539, Training Loss: 0.44039
Epoch: 540, Training Loss: 0.21281
Epoch: 540, Training Loss: 0.11533
Epoch: 540, Training Loss: 0.56626
Epoch: 540, Training Loss: 0.44076
Epoch: 541, Training Loss: 0.21226
Epoch: 541, Training Loss: 0.11532
Epoch: 541, Training Loss: 0.56579
Epoch: 541, Training Loss: 0.44114
Epoch: 542, Training Loss: 0.21172
Epoch: 542, Training Loss: 0.11531
Epoch: 542, Training Loss: 0.56531
Epoch: 542, Training Loss: 0.44151
Epoch: 543, Training Loss: 0.21117
Epoch: 543, Training Loss: 0.11530
Epoch: 543, Training Loss: 0.56484
Epoch: 543, Training Loss: 0.44189
Epoch: 544, Training Loss: 0.21063
Epoch: 544, Training Loss: 0.11529
Epoch: 544, Training Loss: 0.56437
Epoch: 544, Training Loss: 0.44225
Epoch: 545, Training Loss: 0.21009
Epoch: 545, Training Loss: 0.11529
Epoch: 545, Training Loss: 0.56390
Epoch: 545, Training Loss: 0.44262
Epoch: 546, Training Loss: 0.20956
Epoch: 546, Training Loss: 0.11528
Epoch: 546, Training Loss: 0.56343
Epoch: 546, Training Loss: 0.44298
Epoch: 547, Training Loss: 0.20902
Epoch: 547, Training Loss: 0.11527
Epoch: 547, Training Loss: 0.56296
Epoch: 547, Training Loss: 0.44334
Epoch: 548, Training Loss: 0.20849
Epoch: 548, Training Loss: 0.11527
Epoch: 548, Training Loss: 0.56250
Epoch: 548, Training Loss: 0.44370
Epoch: 549, Training Loss: 0.20796
Epoch: 549, Training Loss: 0.11526
Epoch: 549, Training Loss: 0.56203
Epoch: 549, Training Loss: 0.44406
Epoch: 550, Training Loss: 0.20743
Epoch: 550, Training Loss: 0.11526
Epoch: 550, Training Loss: 0.56157
Epoch: 550, Training Loss: 0.44441
Epoch: 551, Training Loss: 0.20691
Epoch: 551, Training Loss: 0.11526
Epoch: 551, Training Loss: 0.56111
Epoch: 551, Training Loss: 0.44476
Epoch: 552, Training Loss: 0.20638
Epoch: 552, Training Loss: 0.11526
Epoch: 552, Training Loss: 0.56065
Epoch: 552, Training Loss: 0.44511
Epoch: 553, Training Loss: 0.20586
Epoch: 553, Training Loss: 0.11525
Epoch: 553, Training Loss: 0.56020
Epoch: 553, Training Loss: 0.44545
Epoch: 554, Training Loss: 0.20534
Epoch: 554, Training Loss: 0.11525
Epoch: 554, Training Loss: 0.55974
Epoch: 554, Training Loss: 0.44579
Epoch: 555, Training Loss: 0.20482
Epoch: 555, Training Loss: 0.11526
Epoch: 555, Training Loss: 0.55929
Epoch: 555, Training Loss: 0.44613
Epoch: 556, Training Loss: 0.20431
Epoch: 556, Training Loss: 0.11526
Epoch: 556, Training Loss: 0.55884
Epoch: 556, Training Loss: 0.44647
Epoch: 557, Training Loss: 0.20380
Epoch: 557, Training Loss: 0.11526
Epoch: 557, Training Loss: 0.55839
Epoch: 557, Training Loss: 0.44680
Epoch: 558, Training Loss: 0.20328
Epoch: 558, Training Loss: 0.11526
Epoch: 558, Training Loss: 0.55794
Epoch: 558, Training Loss: 0.44713
Epoch: 559, Training Loss: 0.20278
Epoch: 559, Training Loss: 0.11527
Epoch: 559, Training Loss: 0.55749
Epoch: 559, Training Loss: 0.44746
Epoch: 560, Training Loss: 0.20227
Epoch: 560, Training Loss: 0.11527
Epoch: 560, Training Loss: 0.55705
Epoch: 560, Training Loss: 0.44778
Epoch: 561, Training Loss: 0.20177
Epoch: 561, Training Loss: 0.11528
Epoch: 561, Training Loss: 0.55661
Epoch: 561, Training Loss: 0.44810
Epoch: 562, Training Loss: 0.20126
Epoch: 562, Training Loss: 0.11529
Epoch: 562, Training Loss: 0.55616
Epoch: 562, Training Loss: 0.44842
Epoch: 563, Training Loss: 0.20076
Epoch: 563, Training Loss: 0.11530
Epoch: 563, Training Loss: 0.55573
Epoch: 563, Training Loss: 0.44873
Epoch: 564, Training Loss: 0.20027
Epoch: 564, Training Loss: 0.11531
Epoch: 564, Training Loss: 0.55529
Epoch: 564, Training Loss: 0.44904
Epoch: 565, Training Loss: 0.19977
Epoch: 565, Training Loss: 0.11532
Epoch: 565, Training Loss: 0.55485
Epoch: 565, Training Loss: 0.44935
Epoch: 566, Training Loss: 0.19928
Epoch: 566, Training Loss: 0.11533
Epoch: 566, Training Loss: 0.55442
Epoch: 566, Training Loss: 0.44966
Epoch: 567, Training Loss: 0.19879
Epoch: 567, Training Loss: 0.11534
Epoch: 567, Training Loss: 0.55398
Epoch: 567, Training Loss: 0.44996
Epoch: 568, Training Loss: 0.19830
Epoch: 568, Training Loss: 0.11535
Epoch: 568, Training Loss: 0.55355
Epoch: 568, Training Loss: 0.45026
Epoch: 569, Training Loss: 0.19781
Epoch: 569, Training Loss: 0.11537
Epoch: 569, Training Loss: 0.55312
Epoch: 569, Training Loss: 0.45056
Epoch: 570, Training Loss: 0.19733
Epoch: 570, Training Loss: 0.11538
Epoch: 570, Training Loss: 0.55269
Epoch: 570, Training Loss: 0.45085
Epoch: 571, Training Loss: 0.19685
Epoch: 571, Training Loss: 0.11540
Epoch: 571, Training Loss: 0.55227
Epoch: 571, Training Loss: 0.45114
Epoch: 572, Training Loss: 0.19637
Epoch: 572, Training Loss: 0.11542
Epoch: 572, Training Loss: 0.55184
Epoch: 572, Training Loss: 0.45143
Epoch: 573, Training Loss: 0.19589
Epoch: 573, Training Loss: 0.11544
Epoch: 573, Training Loss: 0.55142
Epoch: 573, Training Loss: 0.45171
Epoch: 574, Training Loss: 0.19541
Epoch: 574, Training Loss: 0.11546
Epoch: 574, Training Loss: 0.55100
Epoch: 574, Training Loss: 0.45199
Epoch: 575, Training Loss: 0.19494
Epoch: 575, Training Loss: 0.11548
Epoch: 575, Training Loss: 0.55058
Epoch: 575, Training Loss: 0.45227
Epoch: 576, Training Loss: 0.19447
Epoch: 576, Training Loss: 0.11550
Epoch: 576, Training Loss: 0.55016
Epoch: 576, Training Loss: 0.45254
Epoch: 577, Training Loss: 0.19400
Epoch: 577, Training Loss: 0.11553
Epoch: 577, Training Loss: 0.54974
Epoch: 577, Training Loss: 0.45281
Epoch: 578, Training Loss: 0.19353
Epoch: 578, Training Loss: 0.11555
Epoch: 578, Training Loss: 0.54933
Epoch: 578, Training Loss: 0.45308
Epoch: 579, Training Loss: 0.19307
Epoch: 579, Training Loss: 0.11558
Epoch: 579, Training Loss: 0.54891
Epoch: 579, Training Loss: 0.45334
Epoch: 580, Training Loss: 0.19261
Epoch: 580, Training Loss: 0.11561
Epoch: 580, Training Loss: 0.54850
Epoch: 580, Training Loss: 0.45360
Epoch: 581, Training Loss: 0.19215
Epoch: 581, Training Loss: 0.11564
Epoch: 581, Training Loss: 0.54809
Epoch: 581, Training Loss: 0.45386
Epoch: 582, Training Loss: 0.19169
Epoch: 582, Training Loss: 0.11567
Epoch: 582, Training Loss: 0.54768
Epoch: 582, Training Loss: 0.45411
Epoch: 583, Training Loss: 0.19123
Epoch: 583, Training Loss: 0.11570
Epoch: 583, Training Loss: 0.54727
Epoch: 583, Training Loss: 0.45436
Epoch: 584, Training Loss: 0.19078
Epoch: 584, Training Loss: 0.11574
Epoch: 584, Training Loss: 0.54686
Epoch: 584, Training Loss: 0.45461
Epoch: 585, Training Loss: 0.19033
Epoch: 585, Training Loss: 0.11577
Epoch: 585, Training Loss: 0.54646
Epoch: 585, Training Loss: 0.45485
Epoch: 586, Training Loss: 0.18988
Epoch: 586, Training Loss: 0.11581
Epoch: 586, Training Loss: 0.54605
Epoch: 586, Training Loss: 0.45509
Epoch: 587, Training Loss: 0.18943
Epoch: 587, Training Loss: 0.11585
Epoch: 587, Training Loss: 0.54565
Epoch: 587, Training Loss: 0.45532
Epoch: 588, Training Loss: 0.18899
Epoch: 588, Training Loss: 0.11589
Epoch: 588, Training Loss: 0.54525
Epoch: 588, Training Loss: 0.45555
Epoch: 589, Training Loss: 0.18854
Epoch: 589, Training Loss: 0.11593
Epoch: 589, Training Loss: 0.54485
Epoch: 589, Training Loss: 0.45578
Epoch: 590, Training Loss: 0.18810
Epoch: 590, Training Loss: 0.11598
Epoch: 590, Training Loss: 0.54445
Epoch: 590, Training Loss: 0.45600
Epoch: 591, Training Loss: 0.18766
Epoch: 591, Training Loss: 0.11602
Epoch: 591, Training Loss: 0.54405
Epoch: 591, Training Loss: 0.45622
Epoch: 592, Training Loss: 0.18723
Epoch: 592, Training Loss: 0.11607
Epoch: 592, Training Loss: 0.54365
Epoch: 592, Training Loss: 0.45644
Epoch: 593, Training Loss: 0.18679
Epoch: 593, Training Loss: 0.11612
Epoch: 593, Training Loss: 0.54326
Epoch: 593, Training Loss: 0.45665
Epoch: 594, Training Loss: 0.18636
Epoch: 594, Training Loss: 0.11617
Epoch: 594, Training Loss: 0.54286
Epoch: 594, Training Loss: 0.45685
Epoch: 595, Training Loss: 0.18593
Epoch: 595, Training Loss: 0.11622
Epoch: 595, Training Loss: 0.54247
Epoch: 595, Training Loss: 0.45706
Epoch: 596, Training Loss: 0.18550
Epoch: 596, Training Loss: 0.11628
Epoch: 596, Training Loss: 0.54207
Epoch: 596, Training Loss: 0.45726
Epoch: 597, Training Loss: 0.18507
Epoch: 597, Training Loss: 0.11633
Epoch: 597, Training Loss: 0.54168
Epoch: 597, Training Loss: 0.45745
Epoch: 598, Training Loss: 0.18465
Epoch: 598, Training Loss: 0.11639
Epoch: 598, Training Loss: 0.54129
Epoch: 598, Training Loss: 0.45764
Epoch: 599, Training Loss: 0.18423
Epoch: 599, Training Loss: 0.11645
Epoch: 599, Training Loss: 0.54090
Epoch: 599, Training Loss: 0.45783
Epoch: 600, Training Loss: 0.18381
Epoch: 600, Training Loss: 0.11652
Epoch: 600, Training Loss: 0.54051
Epoch: 600, Training Loss: 0.45801
Epoch: 601, Training Loss: 0.18339
Epoch: 601, Training Loss: 0.11658
Epoch: 601, Training Loss: 0.54012
Epoch: 601, Training Loss: 0.45819
Epoch: 602, Training Loss: 0.18297
Epoch: 602, Training Loss: 0.11665
Epoch: 602, Training Loss: 0.53973
Epoch: 602, Training Loss: 0.45836
Epoch: 603, Training Loss: 0.18256
Epoch: 603, Training Loss: 0.11672
Epoch: 603, Training Loss: 0.53934
Epoch: 603, Training Loss: 0.45853
Epoch: 604, Training Loss: 0.18214
Epoch: 604, Training Loss: 0.11679
Epoch: 604, Training Loss: 0.53896
Epoch: 604, Training Loss: 0.45869
Epoch: 605, Training Loss: 0.18173
Epoch: 605, Training Loss: 0.11687
Epoch: 605, Training Loss: 0.53857
Epoch: 605, Training Loss: 0.45884
Epoch: 606, Training Loss: 0.18132
Epoch: 606, Training Loss: 0.11694
Epoch: 606, Training Loss: 0.53818
Epoch: 606, Training Loss: 0.45900
Epoch: 607, Training Loss: 0.18092
Epoch: 607, Training Loss: 0.11702
Epoch: 607, Training Loss: 0.53780
Epoch: 607, Training Loss: 0.45914
Epoch: 608, Training Loss: 0.18051
Epoch: 608, Training Loss: 0.11711
Epoch: 608, Training Loss: 0.53741
Epoch: 608, Training Loss: 0.45929
Epoch: 609, Training Loss: 0.18011
Epoch: 609, Training Loss: 0.11719
Epoch: 609, Training Loss: 0.53703
Epoch: 609, Training Loss: 0.45942
Epoch: 610, Training Loss: 0.17971
Epoch: 610, Training Loss: 0.11728
Epoch: 610, Training Loss: 0.53664
Epoch: 610, Training Loss: 0.45955
Epoch: 611, Training Loss: 0.17931
Epoch: 611, Training Loss: 0.11737
Epoch: 611, Training Loss: 0.53626
Epoch: 611, Training Loss: 0.45968
Epoch: 612, Training Loss: 0.17891
Epoch: 612, Training Loss: 0.11746
Epoch: 612, Training Loss: 0.53587
Epoch: 612, Training Loss: 0.45979
Epoch: 613, Training Loss: 0.17851
Epoch: 613, Training Loss: 0.11756
Epoch: 613, Training Loss: 0.53549
Epoch: 613, Training Loss: 0.45991
Epoch: 614, Training Loss: 0.17812
Epoch: 614, Training Loss: 0.11766
Epoch: 614, Training Loss: 0.53510
Epoch: 614, Training Loss: 0.46001
Epoch: 615, Training Loss: 0.17772
Epoch: 615, Training Loss: 0.11776
Epoch: 615, Training Loss: 0.53472
Epoch: 615, Training Loss: 0.46011
Epoch: 616, Training Loss: 0.17733
Epoch: 616, Training Loss: 0.11787
Epoch: 616, Training Loss: 0.53433
Epoch: 616, Training Loss: 0.46021
Epoch: 617, Training Loss: 0.17694
Epoch: 617, Training Loss: 0.11798
Epoch: 617, Training Loss: 0.53394
Epoch: 617, Training Loss: 0.46029
Epoch: 618, Training Loss: 0.17656
Epoch: 618, Training Loss: 0.11809
Epoch: 618, Training Loss: 0.53356
Epoch: 618, Training Loss: 0.46037
Epoch: 619, Training Loss: 0.17617
Epoch: 619, Training Loss: 0.11821
Epoch: 619, Training Loss: 0.53317
Epoch: 619, Training Loss: 0.46044
Epoch: 620, Training Loss: 0.17579
Epoch: 620, Training Loss: 0.11833
Epoch: 620, Training Loss: 0.53278
Epoch: 620, Training Loss: 0.46050
Epoch: 621, Training Loss: 0.17540
Epoch: 621, Training Loss: 0.11845
Epoch: 621, Training Loss: 0.53239
Epoch: 621, Training Loss: 0.46056
Epoch: 622, Training Loss: 0.17502
Epoch: 622, Training Loss: 0.11858
Epoch: 622, Training Loss: 0.53201
Epoch: 622, Training Loss: 0.46060
Epoch: 623, Training Loss: 0.17464
Epoch: 623, Training Loss: 0.11871
Epoch: 623, Training Loss: 0.53162
Epoch: 623, Training Loss: 0.46064
Epoch: 624, Training Loss: 0.17426
Epoch: 624, Training Loss: 0.11884
Epoch: 624, Training Loss: 0.53123
Epoch: 624, Training Loss: 0.46067
Epoch: 625, Training Loss: 0.17388
Epoch: 625, Training Loss: 0.11898
Epoch: 625, Training Loss: 0.53083
Epoch: 625, Training Loss: 0.46069
Epoch: 626, Training Loss: 0.17351
Epoch: 626, Training Loss: 0.11912
Epoch: 626, Training Loss: 0.53044
Epoch: 626, Training Loss: 0.46070
Epoch: 627, Training Loss: 0.17313
Epoch: 627, Training Loss: 0.11927
Epoch: 627, Training Loss: 0.53005
Epoch: 627, Training Loss: 0.46070
Epoch: 628, Training Loss: 0.17276
Epoch: 628, Training Loss: 0.11942
Epoch: 628, Training Loss: 0.52965
Epoch: 628, Training Loss: 0.46069
Epoch: 629, Training Loss: 0.17239
Epoch: 629, Training Loss: 0.11958
Epoch: 629, Training Loss: 0.52925
Epoch: 629, Training Loss: 0.46066
Epoch: 630, Training Loss: 0.17202
Epoch: 630, Training Loss: 0.11974
Epoch: 630, Training Loss: 0.52885
Epoch: 630, Training Loss: 0.46063
Epoch: 631, Training Loss: 0.17165
Epoch: 631, Training Loss: 0.11991
Epoch: 631, Training Loss: 0.52845
Epoch: 631, Training Loss: 0.46058
Epoch: 632, Training Loss: 0.17128
Epoch: 632, Training Loss: 0.12008
Epoch: 632, Training Loss: 0.52805
Epoch: 632, Training Loss: 0.46052
Epoch: 633, Training Loss: 0.17091
Epoch: 633, Training Loss: 0.12025
Epoch: 633, Training Loss: 0.52764
Epoch: 633, Training Loss: 0.46045
Epoch: 634, Training Loss: 0.17055
Epoch: 634, Training Loss: 0.12043
Epoch: 634, Training Loss: 0.52724
Epoch: 634, Training Loss: 0.46036
Epoch: 635, Training Loss: 0.17018
Epoch: 635, Training Loss: 0.12062
Epoch: 635, Training Loss: 0.52683
Epoch: 635, Training Loss: 0.46025
Epoch: 636, Training Loss: 0.16982
Epoch: 636, Training Loss: 0.12081
Epoch: 636, Training Loss: 0.52641
Epoch: 636, Training Loss: 0.46013
Epoch: 637, Training Loss: 0.16946
Epoch: 637, Training Loss: 0.12101
Epoch: 637, Training Loss: 0.52600
Epoch: 637, Training Loss: 0.46000
Epoch: 638, Training Loss: 0.16910
Epoch: 638, Training Loss: 0.12121
Epoch: 638, Training Loss: 0.52558
Epoch: 638, Training Loss: 0.45984
Epoch: 639, Training Loss: 0.16874
Epoch: 639, Training Loss: 0.12142
Epoch: 639, Training Loss: 0.52516
Epoch: 639, Training Loss: 0.45967
Epoch: 640, Training Loss: 0.16838
Epoch: 640, Training Loss: 0.12163
Epoch: 640, Training Loss: 0.52473
Epoch: 640, Training Loss: 0.45948
Epoch: 641, Training Loss: 0.16802
Epoch: 641, Training Loss: 0.12185
Epoch: 641, Training Loss: 0.52430
Epoch: 641, Training Loss: 0.45926
Epoch: 642, Training Loss: 0.16766
Epoch: 642, Training Loss: 0.12208
Epoch: 642, Training Loss: 0.52387
Epoch: 642, Training Loss: 0.45902
Epoch: 643, Training Loss: 0.16730
Epoch: 643, Training Loss: 0.12231
Epoch: 643, Training Loss: 0.52343
Epoch: 643, Training Loss: 0.45876
... (per-pattern training losses for epochs 644-899 omitted for brevity; all four losses fluctuate over this interval but trend downward overall) ...
Epoch: 900, Training Loss: 0.11715
Epoch: 900, Training Loss: 0.08408
Epoch: 900, Training Loss: 0.15536
Epoch: 900, Training Loss: 0.10171
Epoch: 901, Training Loss: 0.11688
Epoch: 901, Training Loss: 0.08396
Epoch: 901, Training Loss: 0.15492
Epoch: 901, Training Loss: 0.10145
Epoch: 902, Training Loss: 0.11661
Epoch: 902, Training Loss: 0.08383
Epoch: 902, Training Loss: 0.15447
Epoch: 902, Training Loss: 0.10118
Epoch: 903, Training Loss: 0.11635
Epoch: 903, Training Loss: 0.08371
Epoch: 903, Training Loss: 0.15404
Epoch: 903, Training Loss: 0.10092
Epoch: 904, Training Loss: 0.11609
Epoch: 904, Training Loss: 0.08358
Epoch: 904, Training Loss: 0.15360
Epoch: 904, Training Loss: 0.10066
Epoch: 905, Training Loss: 0.11582
Epoch: 905, Training Loss: 0.08346
Epoch: 905, Training Loss: 0.15317
Epoch: 905, Training Loss: 0.10040
Epoch: 906, Training Loss: 0.11556
Epoch: 906, Training Loss: 0.08334
Epoch: 906, Training Loss: 0.15274
Epoch: 906, Training Loss: 0.10015
Epoch: 907, Training Loss: 0.11531
Epoch: 907, Training Loss: 0.08321
Epoch: 907, Training Loss: 0.15231
Epoch: 907, Training Loss: 0.09989
Epoch: 908, Training Loss: 0.11505
Epoch: 908, Training Loss: 0.08309
Epoch: 908, Training Loss: 0.15189
Epoch: 908, Training Loss: 0.09964
Epoch: 909, Training Loss: 0.11479
Epoch: 909, Training Loss: 0.08297
Epoch: 909, Training Loss: 0.15147
Epoch: 909, Training Loss: 0.09939
Epoch: 910, Training Loss: 0.11454
Epoch: 910, Training Loss: 0.08285
Epoch: 910, Training Loss: 0.15105
Epoch: 910, Training Loss: 0.09914
Epoch: 911, Training Loss: 0.11429
Epoch: 911, Training Loss: 0.08273
Epoch: 911, Training Loss: 0.15064
Epoch: 911, Training Loss: 0.09889
Epoch: 912, Training Loss: 0.11403
Epoch: 912, Training Loss: 0.08261
Epoch: 912, Training Loss: 0.15023
Epoch: 912, Training Loss: 0.09864
Epoch: 913, Training Loss: 0.11378
Epoch: 913, Training Loss: 0.08249
Epoch: 913, Training Loss: 0.14982
Epoch: 913, Training Loss: 0.09840
Epoch: 914, Training Loss: 0.11354
Epoch: 914, Training Loss: 0.08237
Epoch: 914, Training Loss: 0.14941
Epoch: 914, Training Loss: 0.09815
Epoch: 915, Training Loss: 0.11329
Epoch: 915, Training Loss: 0.08225
Epoch: 915, Training Loss: 0.14901
Epoch: 915, Training Loss: 0.09791
Epoch: 916, Training Loss: 0.11304
Epoch: 916, Training Loss: 0.08213
Epoch: 916, Training Loss: 0.14861
Epoch: 916, Training Loss: 0.09767
Epoch: 917, Training Loss: 0.11280
Epoch: 917, Training Loss: 0.08201
Epoch: 917, Training Loss: 0.14821
Epoch: 917, Training Loss: 0.09744
Epoch: 918, Training Loss: 0.11256
Epoch: 918, Training Loss: 0.08189
Epoch: 918, Training Loss: 0.14782
Epoch: 918, Training Loss: 0.09720
Epoch: 919, Training Loss: 0.11231
Epoch: 919, Training Loss: 0.08177
Epoch: 919, Training Loss: 0.14742
Epoch: 919, Training Loss: 0.09696
Epoch: 920, Training Loss: 0.11207
Epoch: 920, Training Loss: 0.08165
Epoch: 920, Training Loss: 0.14704
Epoch: 920, Training Loss: 0.09673
Epoch: 921, Training Loss: 0.11183
Epoch: 921, Training Loss: 0.08153
Epoch: 921, Training Loss: 0.14665
Epoch: 921, Training Loss: 0.09650
Epoch: 922, Training Loss: 0.11160
Epoch: 922, Training Loss: 0.08142
Epoch: 922, Training Loss: 0.14626
Epoch: 922, Training Loss: 0.09627
Epoch: 923, Training Loss: 0.11136
Epoch: 923, Training Loss: 0.08130
Epoch: 923, Training Loss: 0.14588
Epoch: 923, Training Loss: 0.09604
Epoch: 924, Training Loss: 0.11112
Epoch: 924, Training Loss: 0.08118
Epoch: 924, Training Loss: 0.14550
Epoch: 924, Training Loss: 0.09581
Epoch: 925, Training Loss: 0.11089
Epoch: 925, Training Loss: 0.08107
Epoch: 925, Training Loss: 0.14513
Epoch: 925, Training Loss: 0.09559
Epoch: 926, Training Loss: 0.11066
Epoch: 926, Training Loss: 0.08095
Epoch: 926, Training Loss: 0.14475
Epoch: 926, Training Loss: 0.09536
Epoch: 927, Training Loss: 0.11043
Epoch: 927, Training Loss: 0.08084
Epoch: 927, Training Loss: 0.14438
Epoch: 927, Training Loss: 0.09514
Epoch: 928, Training Loss: 0.11020
Epoch: 928, Training Loss: 0.08072
Epoch: 928, Training Loss: 0.14401
Epoch: 928, Training Loss: 0.09492
Epoch: 929, Training Loss: 0.10997
Epoch: 929, Training Loss: 0.08061
Epoch: 929, Training Loss: 0.14364
Epoch: 929, Training Loss: 0.09470
Epoch: 930, Training Loss: 0.10974
Epoch: 930, Training Loss: 0.08049
Epoch: 930, Training Loss: 0.14328
Epoch: 930, Training Loss: 0.09448
Epoch: 931, Training Loss: 0.10951
Epoch: 931, Training Loss: 0.08038
Epoch: 931, Training Loss: 0.14292
Epoch: 931, Training Loss: 0.09427
Epoch: 932, Training Loss: 0.10929
Epoch: 932, Training Loss: 0.08027
Epoch: 932, Training Loss: 0.14256
Epoch: 932, Training Loss: 0.09405
Epoch: 933, Training Loss: 0.10906
Epoch: 933, Training Loss: 0.08015
Epoch: 933, Training Loss: 0.14220
Epoch: 933, Training Loss: 0.09384
Epoch: 934, Training Loss: 0.10884
Epoch: 934, Training Loss: 0.08004
Epoch: 934, Training Loss: 0.14185
Epoch: 934, Training Loss: 0.09362
Epoch: 935, Training Loss: 0.10862
Epoch: 935, Training Loss: 0.07993
Epoch: 935, Training Loss: 0.14149
Epoch: 935, Training Loss: 0.09341
Epoch: 936, Training Loss: 0.10840
Epoch: 936, Training Loss: 0.07982
Epoch: 936, Training Loss: 0.14114
Epoch: 936, Training Loss: 0.09320
Epoch: 937, Training Loss: 0.10818
Epoch: 937, Training Loss: 0.07970
Epoch: 937, Training Loss: 0.14079
Epoch: 937, Training Loss: 0.09299
Epoch: 938, Training Loss: 0.10796
Epoch: 938, Training Loss: 0.07959
Epoch: 938, Training Loss: 0.14045
Epoch: 938, Training Loss: 0.09279
Epoch: 939, Training Loss: 0.10774
Epoch: 939, Training Loss: 0.07948
Epoch: 939, Training Loss: 0.14010
Epoch: 939, Training Loss: 0.09258
Epoch: 940, Training Loss: 0.10752
Epoch: 940, Training Loss: 0.07937
Epoch: 940, Training Loss: 0.13976
Epoch: 940, Training Loss: 0.09238
Epoch: 941, Training Loss: 0.10731
Epoch: 941, Training Loss: 0.07926
Epoch: 941, Training Loss: 0.13942
Epoch: 941, Training Loss: 0.09217
Epoch: 942, Training Loss: 0.10709
Epoch: 942, Training Loss: 0.07915
Epoch: 942, Training Loss: 0.13908
Epoch: 942, Training Loss: 0.09197
Epoch: 943, Training Loss: 0.10688
Epoch: 943, Training Loss: 0.07904
Epoch: 943, Training Loss: 0.13875
Epoch: 943, Training Loss: 0.09177
Epoch: 944, Training Loss: 0.10667
Epoch: 944, Training Loss: 0.07893
Epoch: 944, Training Loss: 0.13841
Epoch: 944, Training Loss: 0.09157
Epoch: 945, Training Loss: 0.10646
Epoch: 945, Training Loss: 0.07882
Epoch: 945, Training Loss: 0.13808
Epoch: 945, Training Loss: 0.09137
Epoch: 946, Training Loss: 0.10625
Epoch: 946, Training Loss: 0.07872
Epoch: 946, Training Loss: 0.13775
Epoch: 946, Training Loss: 0.09117
Epoch: 947, Training Loss: 0.10604
Epoch: 947, Training Loss: 0.07861
Epoch: 947, Training Loss: 0.13743
Epoch: 947, Training Loss: 0.09098
Epoch: 948, Training Loss: 0.10583
Epoch: 948, Training Loss: 0.07850
Epoch: 948, Training Loss: 0.13710
Epoch: 948, Training Loss: 0.09078
Epoch: 949, Training Loss: 0.10563
Epoch: 949, Training Loss: 0.07839
Epoch: 949, Training Loss: 0.13678
Epoch: 949, Training Loss: 0.09059
Epoch: 950, Training Loss: 0.10542
Epoch: 950, Training Loss: 0.07829
Epoch: 950, Training Loss: 0.13646
Epoch: 950, Training Loss: 0.09040
Epoch: 951, Training Loss: 0.10522
Epoch: 951, Training Loss: 0.07818
Epoch: 951, Training Loss: 0.13614
Epoch: 951, Training Loss: 0.09020
Epoch: 952, Training Loss: 0.10501
Epoch: 952, Training Loss: 0.07807
Epoch: 952, Training Loss: 0.13582
Epoch: 952, Training Loss: 0.09001
Epoch: 953, Training Loss: 0.10481
Epoch: 953, Training Loss: 0.07797
Epoch: 953, Training Loss: 0.13550
Epoch: 953, Training Loss: 0.08982
Epoch: 954, Training Loss: 0.10461
Epoch: 954, Training Loss: 0.07786
Epoch: 954, Training Loss: 0.13519
Epoch: 954, Training Loss: 0.08964
Epoch: 955, Training Loss: 0.10441
Epoch: 955, Training Loss: 0.07776
Epoch: 955, Training Loss: 0.13488
Epoch: 955, Training Loss: 0.08945
Epoch: 956, Training Loss: 0.10421
Epoch: 956, Training Loss: 0.07765
Epoch: 956, Training Loss: 0.13457
Epoch: 956, Training Loss: 0.08926
Epoch: 957, Training Loss: 0.10401
Epoch: 957, Training Loss: 0.07755
Epoch: 957, Training Loss: 0.13426
Epoch: 957, Training Loss: 0.08908
Epoch: 958, Training Loss: 0.10381
Epoch: 958, Training Loss: 0.07744
Epoch: 958, Training Loss: 0.13395
Epoch: 958, Training Loss: 0.08889
Epoch: 959, Training Loss: 0.10362
Epoch: 959, Training Loss: 0.07734
Epoch: 959, Training Loss: 0.13365
Epoch: 959, Training Loss: 0.08871
Epoch: 960, Training Loss: 0.10342
Epoch: 960, Training Loss: 0.07724
Epoch: 960, Training Loss: 0.13335
Epoch: 960, Training Loss: 0.08853
Epoch: 961, Training Loss: 0.10323
Epoch: 961, Training Loss: 0.07713
Epoch: 961, Training Loss: 0.13305
Epoch: 961, Training Loss: 0.08835
Epoch: 962, Training Loss: 0.10304
Epoch: 962, Training Loss: 0.07703
Epoch: 962, Training Loss: 0.13275
Epoch: 962, Training Loss: 0.08817
Epoch: 963, Training Loss: 0.10284
Epoch: 963, Training Loss: 0.07693
Epoch: 963, Training Loss: 0.13245
Epoch: 963, Training Loss: 0.08799
Epoch: 964, Training Loss: 0.10265
Epoch: 964, Training Loss: 0.07682
Epoch: 964, Training Loss: 0.13215
Epoch: 964, Training Loss: 0.08781
Epoch: 965, Training Loss: 0.10246
Epoch: 965, Training Loss: 0.07672
Epoch: 965, Training Loss: 0.13186
Epoch: 965, Training Loss: 0.08764
Epoch: 966, Training Loss: 0.10227
Epoch: 966, Training Loss: 0.07662
Epoch: 966, Training Loss: 0.13157
Epoch: 966, Training Loss: 0.08746
Epoch: 967, Training Loss: 0.10208
Epoch: 967, Training Loss: 0.07652
Epoch: 967, Training Loss: 0.13128
Epoch: 967, Training Loss: 0.08729
Epoch: 968, Training Loss: 0.10189
Epoch: 968, Training Loss: 0.07642
Epoch: 968, Training Loss: 0.13099
Epoch: 968, Training Loss: 0.08711
Epoch: 969, Training Loss: 0.10171
Epoch: 969, Training Loss: 0.07632
Epoch: 969, Training Loss: 0.13070
Epoch: 969, Training Loss: 0.08694
Epoch: 970, Training Loss: 0.10152
Epoch: 970, Training Loss: 0.07622
Epoch: 970, Training Loss: 0.13041
Epoch: 970, Training Loss: 0.08677
Epoch: 971, Training Loss: 0.10134
Epoch: 971, Training Loss: 0.07612
Epoch: 971, Training Loss: 0.13013
Epoch: 971, Training Loss: 0.08660
Epoch: 972, Training Loss: 0.10115
Epoch: 972, Training Loss: 0.07602
Epoch: 972, Training Loss: 0.12985
Epoch: 972, Training Loss: 0.08643
Epoch: 973, Training Loss: 0.10097
Epoch: 973, Training Loss: 0.07592
Epoch: 973, Training Loss: 0.12957
Epoch: 973, Training Loss: 0.08626
Epoch: 974, Training Loss: 0.10079
Epoch: 974, Training Loss: 0.07582
Epoch: 974, Training Loss: 0.12929
Epoch: 974, Training Loss: 0.08609
Epoch: 975, Training Loss: 0.10060
Epoch: 975, Training Loss: 0.07572
Epoch: 975, Training Loss: 0.12901
Epoch: 975, Training Loss: 0.08593
Epoch: 976, Training Loss: 0.10042
Epoch: 976, Training Loss: 0.07562
Epoch: 976, Training Loss: 0.12873
Epoch: 976, Training Loss: 0.08576
Epoch: 977, Training Loss: 0.10024
Epoch: 977, Training Loss: 0.07553
Epoch: 977, Training Loss: 0.12846
Epoch: 977, Training Loss: 0.08560
Epoch: 978, Training Loss: 0.10006
Epoch: 978, Training Loss: 0.07543
Epoch: 978, Training Loss: 0.12819
Epoch: 978, Training Loss: 0.08543
Epoch: 979, Training Loss: 0.09989
Epoch: 979, Training Loss: 0.07533
Epoch: 979, Training Loss: 0.12791
Epoch: 979, Training Loss: 0.08527
Epoch: 980, Training Loss: 0.09971
Epoch: 980, Training Loss: 0.07523
Epoch: 980, Training Loss: 0.12764
Epoch: 980, Training Loss: 0.08511
Epoch: 981, Training Loss: 0.09953
Epoch: 981, Training Loss: 0.07514
Epoch: 981, Training Loss: 0.12738
Epoch: 981, Training Loss: 0.08495
Epoch: 982, Training Loss: 0.09936
Epoch: 982, Training Loss: 0.07504
Epoch: 982, Training Loss: 0.12711
Epoch: 982, Training Loss: 0.08479
Epoch: 983, Training Loss: 0.09918
Epoch: 983, Training Loss: 0.07495
Epoch: 983, Training Loss: 0.12684
Epoch: 983, Training Loss: 0.08463
Epoch: 984, Training Loss: 0.09901
Epoch: 984, Training Loss: 0.07485
Epoch: 984, Training Loss: 0.12658
Epoch: 984, Training Loss: 0.08447
Epoch: 985, Training Loss: 0.09883
Epoch: 985, Training Loss: 0.07475
Epoch: 985, Training Loss: 0.12632
Epoch: 985, Training Loss: 0.08431
Epoch: 986, Training Loss: 0.09866
Epoch: 986, Training Loss: 0.07466
Epoch: 986, Training Loss: 0.12605
Epoch: 986, Training Loss: 0.08415
Epoch: 987, Training Loss: 0.09849
Epoch: 987, Training Loss: 0.07457
Epoch: 987, Training Loss: 0.12579
Epoch: 987, Training Loss: 0.08400
Epoch: 988, Training Loss: 0.09832
Epoch: 988, Training Loss: 0.07447
Epoch: 988, Training Loss: 0.12554
Epoch: 988, Training Loss: 0.08384
Epoch: 989, Training Loss: 0.09815
Epoch: 989, Training Loss: 0.07438
Epoch: 989, Training Loss: 0.12528
Epoch: 989, Training Loss: 0.08368
Epoch: 990, Training Loss: 0.09798
Epoch: 990, Training Loss: 0.07428
Epoch: 990, Training Loss: 0.12502
Epoch: 990, Training Loss: 0.08353
Epoch: 991, Training Loss: 0.09781
Epoch: 991, Training Loss: 0.07419
Epoch: 991, Training Loss: 0.12477
Epoch: 991, Training Loss: 0.08338
Epoch: 992, Training Loss: 0.09764
Epoch: 992, Training Loss: 0.07410
Epoch: 992, Training Loss: 0.12451
Epoch: 992, Training Loss: 0.08323
Epoch: 993, Training Loss: 0.09747
Epoch: 993, Training Loss: 0.07400
Epoch: 993, Training Loss: 0.12426
Epoch: 993, Training Loss: 0.08307
Epoch: 994, Training Loss: 0.09731
Epoch: 994, Training Loss: 0.07391
Epoch: 994, Training Loss: 0.12401
Epoch: 994, Training Loss: 0.08292
Epoch: 995, Training Loss: 0.09714
Epoch: 995, Training Loss: 0.07382
Epoch: 995, Training Loss: 0.12376
Epoch: 995, Training Loss: 0.08277
Epoch: 996, Training Loss: 0.09698
Epoch: 996, Training Loss: 0.07373
Epoch: 996, Training Loss: 0.12352
Epoch: 996, Training Loss: 0.08262
Epoch: 997, Training Loss: 0.09681
Epoch: 997, Training Loss: 0.07363
Epoch: 997, Training Loss: 0.12327
Epoch: 997, Training Loss: 0.08248
Epoch: 998, Training Loss: 0.09665
Epoch: 998, Training Loss: 0.07354
Epoch: 998, Training Loss: 0.12302
Epoch: 998, Training Loss: 0.08233
Epoch: 999, Training Loss: 0.09649
Epoch: 999, Training Loss: 0.07345
Epoch: 999, Training Loss: 0.12278
Epoch: 999, Training Loss: 0.08218
Epoch: 1000, Training Loss: 0.09633
Epoch: 1000, Training Loss: 0.07336
Epoch: 1000, Training Loss: 0.12254
Epoch: 1000, Training Loss: 0.08204
Epoch: 1001, Training Loss: 0.09616
Epoch: 1001, Training Loss: 0.07327
Epoch: 1001, Training Loss: 0.12230
Epoch: 1001, Training Loss: 0.08189
Epoch: 1002, Training Loss: 0.09600
Epoch: 1002, Training Loss: 0.07318
Epoch: 1002, Training Loss: 0.12206
Epoch: 1002, Training Loss: 0.08175
Epoch: 1003, Training Loss: 0.09584
Epoch: 1003, Training Loss: 0.07309
Epoch: 1003, Training Loss: 0.12182
Epoch: 1003, Training Loss: 0.08160
Epoch: 1004, Training Loss: 0.09569
Epoch: 1004, Training Loss: 0.07300
Epoch: 1004, Training Loss: 0.12158
Epoch: 1004, Training Loss: 0.08146
Epoch: 1005, Training Loss: 0.09553
Epoch: 1005, Training Loss: 0.07291
Epoch: 1005, Training Loss: 0.12134
Epoch: 1005, Training Loss: 0.08132
Epoch: 1006, Training Loss: 0.09537
Epoch: 1006, Training Loss: 0.07282
Epoch: 1006, Training Loss: 0.12111
Epoch: 1006, Training Loss: 0.08117
Epoch: 1007, Training Loss: 0.09521
Epoch: 1007, Training Loss: 0.07273
Epoch: 1007, Training Loss: 0.12087
Epoch: 1007, Training Loss: 0.08103
Epoch: 1008, Training Loss: 0.09506
Epoch: 1008, Training Loss: 0.07264
Epoch: 1008, Training Loss: 0.12064
Epoch: 1008, Training Loss: 0.08089
Epoch: 1009, Training Loss: 0.09490
Epoch: 1009, Training Loss: 0.07256
Epoch: 1009, Training Loss: 0.12041
Epoch: 1009, Training Loss: 0.08075
Epoch: 1010, Training Loss: 0.09475
Epoch: 1010, Training Loss: 0.07247
Epoch: 1010, Training Loss: 0.12018
Epoch: 1010, Training Loss: 0.08061
Epoch: 1011, Training Loss: 0.09459
Epoch: 1011, Training Loss: 0.07238
Epoch: 1011, Training Loss: 0.11995
Epoch: 1011, Training Loss: 0.08048
Epoch: 1012, Training Loss: 0.09444
Epoch: 1012, Training Loss: 0.07229
Epoch: 1012, Training Loss: 0.11972
Epoch: 1012, Training Loss: 0.08034
Epoch: 1013, Training Loss: 0.09429
Epoch: 1013, Training Loss: 0.07221
Epoch: 1013, Training Loss: 0.11949
Epoch: 1013, Training Loss: 0.08020
Epoch: 1014, Training Loss: 0.09413
Epoch: 1014, Training Loss: 0.07212
Epoch: 1014, Training Loss: 0.11927
Epoch: 1014, Training Loss: 0.08006
Epoch: 1015, Training Loss: 0.09398
Epoch: 1015, Training Loss: 0.07203
Epoch: 1015, Training Loss: 0.11904
Epoch: 1015, Training Loss: 0.07993
Epoch: 1016, Training Loss: 0.09383
Epoch: 1016, Training Loss: 0.07195
Epoch: 1016, Training Loss: 0.11882
Epoch: 1016, Training Loss: 0.07979
Epoch: 1017, Training Loss: 0.09368
Epoch: 1017, Training Loss: 0.07186
Epoch: 1017, Training Loss: 0.11860
Epoch: 1017, Training Loss: 0.07966
Epoch: 1018, Training Loss: 0.09353
Epoch: 1018, Training Loss: 0.07177
Epoch: 1018, Training Loss: 0.11837
Epoch: 1018, Training Loss: 0.07953
Epoch: 1019, Training Loss: 0.09338
Epoch: 1019, Training Loss: 0.07169
Epoch: 1019, Training Loss: 0.11815
Epoch: 1019, Training Loss: 0.07939
Epoch: 1020, Training Loss: 0.09323
Epoch: 1020, Training Loss: 0.07160
Epoch: 1020, Training Loss: 0.11793
Epoch: 1020, Training Loss: 0.07926
Epoch: 1021, Training Loss: 0.09309
Epoch: 1021, Training Loss: 0.07152
Epoch: 1021, Training Loss: 0.11772
Epoch: 1021, Training Loss: 0.07913
Epoch: 1022, Training Loss: 0.09294
Epoch: 1022, Training Loss: 0.07143
Epoch: 1022, Training Loss: 0.11750
Epoch: 1022, Training Loss: 0.07900
Epoch: 1023, Training Loss: 0.09279
Epoch: 1023, Training Loss: 0.07135
Epoch: 1023, Training Loss: 0.11728
Epoch: 1023, Training Loss: 0.07887
Epoch: 1024, Training Loss: 0.09265
Epoch: 1024, Training Loss: 0.07126
Epoch: 1024, Training Loss: 0.11707
Epoch: 1024, Training Loss: 0.07874
Epoch: 1025, Training Loss: 0.09250
Epoch: 1025, Training Loss: 0.07118
Epoch: 1025, Training Loss: 0.11685
Epoch: 1025, Training Loss: 0.07861
Epoch: 1026, Training Loss: 0.09236
Epoch: 1026, Training Loss: 0.07110
Epoch: 1026, Training Loss: 0.11664
Epoch: 1026, Training Loss: 0.07848
Epoch: 1027, Training Loss: 0.09221
Epoch: 1027, Training Loss: 0.07101
Epoch: 1027, Training Loss: 0.11643
Epoch: 1027, Training Loss: 0.07835
Epoch: 1028, Training Loss: 0.09207
Epoch: 1028, Training Loss: 0.07093
Epoch: 1028, Training Loss: 0.11622
Epoch: 1028, Training Loss: 0.07822
Epoch: 1029, Training Loss: 0.09193
Epoch: 1029, Training Loss: 0.07085
Epoch: 1029, Training Loss: 0.11601
Epoch: 1029, Training Loss: 0.07810
Epoch: 1030, Training Loss: 0.09178
Epoch: 1030, Training Loss: 0.07076
Epoch: 1030, Training Loss: 0.11580
Epoch: 1030, Training Loss: 0.07797
Epoch: 1031, Training Loss: 0.09164
Epoch: 1031, Training Loss: 0.07068
Epoch: 1031, Training Loss: 0.11559
Epoch: 1031, Training Loss: 0.07784
Epoch: 1032, Training Loss: 0.09150
Epoch: 1032, Training Loss: 0.07060
Epoch: 1032, Training Loss: 0.11538
Epoch: 1032, Training Loss: 0.07772
Epoch: 1033, Training Loss: 0.09136
Epoch: 1033, Training Loss: 0.07052
Epoch: 1033, Training Loss: 0.11518
Epoch: 1033, Training Loss: 0.07759
Epoch: 1034, Training Loss: 0.09122
Epoch: 1034, Training Loss: 0.07044
Epoch: 1034, Training Loss: 0.11497
Epoch: 1034, Training Loss: 0.07747
Epoch: 1035, Training Loss: 0.09108
Epoch: 1035, Training Loss: 0.07036
Epoch: 1035, Training Loss: 0.11477
Epoch: 1035, Training Loss: 0.07735
Epoch: 1036, Training Loss: 0.09094
Epoch: 1036, Training Loss: 0.07027
Epoch: 1036, Training Loss: 0.11456
Epoch: 1036, Training Loss: 0.07722
Epoch: 1037, Training Loss: 0.09080
Epoch: 1037, Training Loss: 0.07019
Epoch: 1037, Training Loss: 0.11436
Epoch: 1037, Training Loss: 0.07710
Epoch: 1038, Training Loss: 0.09067
Epoch: 1038, Training Loss: 0.07011
Epoch: 1038, Training Loss: 0.11416
Epoch: 1038, Training Loss: 0.07698
Epoch: 1039, Training Loss: 0.09053
Epoch: 1039, Training Loss: 0.07003
Epoch: 1039, Training Loss: 0.11396
Epoch: 1039, Training Loss: 0.07686
Epoch: 1040, Training Loss: 0.09039
Epoch: 1040, Training Loss: 0.06995
Epoch: 1040, Training Loss: 0.11376
Epoch: 1040, Training Loss: 0.07674
Epoch: 1041, Training Loss: 0.09026
Epoch: 1041, Training Loss: 0.06987
Epoch: 1041, Training Loss: 0.11356
Epoch: 1041, Training Loss: 0.07662
Epoch: 1042, Training Loss: 0.09012
Epoch: 1042, Training Loss: 0.06979
Epoch: 1042, Training Loss: 0.11336
Epoch: 1042, Training Loss: 0.07650
Epoch: 1043, Training Loss: 0.08999
Epoch: 1043, Training Loss: 0.06971
Epoch: 1043, Training Loss: 0.11317
Epoch: 1043, Training Loss: 0.07638
Epoch: 1044, Training Loss: 0.08985
Epoch: 1044, Training Loss: 0.06963
Epoch: 1044, Training Loss: 0.11297
Epoch: 1044, Training Loss: 0.07626
Epoch: 1045, Training Loss: 0.08972
Epoch: 1045, Training Loss: 0.06956
Epoch: 1045, Training Loss: 0.11278
Epoch: 1045, Training Loss: 0.07614
Epoch: 1046, Training Loss: 0.08959
Epoch: 1046, Training Loss: 0.06948
Epoch: 1046, Training Loss: 0.11258
Epoch: 1046, Training Loss: 0.07602
Epoch: 1047, Training Loss: 0.08945
Epoch: 1047, Training Loss: 0.06940
Epoch: 1047, Training Loss: 0.11239
Epoch: 1047, Training Loss: 0.07591
Epoch: 1048, Training Loss: 0.08932
Epoch: 1048, Training Loss: 0.06932
Epoch: 1048, Training Loss: 0.11220
Epoch: 1048, Training Loss: 0.07579
Epoch: 1049, Training Loss: 0.08919
Epoch: 1049, Training Loss: 0.06924
Epoch: 1049, Training Loss: 0.11201
Epoch: 1049, Training Loss: 0.07568
Epoch: 1050, Training Loss: 0.08906
Epoch: 1050, Training Loss: 0.06916
Epoch: 1050, Training Loss: 0.11182
Epoch: 1050, Training Loss: 0.07556
Epoch: 1051, Training Loss: 0.08893
Epoch: 1051, Training Loss: 0.06909
Epoch: 1051, Training Loss: 0.11163
Epoch: 1051, Training Loss: 0.07544
Epoch: 1052, Training Loss: 0.08880
Epoch: 1052, Training Loss: 0.06901
Epoch: 1052, Training Loss: 0.11144
Epoch: 1052, Training Loss: 0.07533
Epoch: 1053, Training Loss: 0.08867
Epoch: 1053, Training Loss: 0.06893
Epoch: 1053, Training Loss: 0.11125
Epoch: 1053, Training Loss: 0.07522
Epoch: 1054, Training Loss: 0.08854
Epoch: 1054, Training Loss: 0.06886
Epoch: 1054, Training Loss: 0.11106
Epoch: 1054, Training Loss: 0.07510
Epoch: 1055, Training Loss: 0.08841
Epoch: 1055, Training Loss: 0.06878
Epoch: 1055, Training Loss: 0.11088
Epoch: 1055, Training Loss: 0.07499
Epoch: 1056, Training Loss: 0.08828
Epoch: 1056, Training Loss: 0.06870
Epoch: 1056, Training Loss: 0.11069
Epoch: 1056, Training Loss: 0.07488
Epoch: 1057, Training Loss: 0.08816
Epoch: 1057, Training Loss: 0.06863
Epoch: 1057, Training Loss: 0.11051
Epoch: 1057, Training Loss: 0.07477
Epoch: 1058, Training Loss: 0.08803
Epoch: 1058, Training Loss: 0.06855
Epoch: 1058, Training Loss: 0.11032
Epoch: 1058, Training Loss: 0.07465
Epoch: 1059, Training Loss: 0.08790
Epoch: 1059, Training Loss: 0.06848
Epoch: 1059, Training Loss: 0.11014
Epoch: 1059, Training Loss: 0.07454
Epoch: 1060, Training Loss: 0.08778
Epoch: 1060, Training Loss: 0.06840
Epoch: 1060, Training Loss: 0.10996
Epoch: 1060, Training Loss: 0.07443
Epoch: 1061, Training Loss: 0.08765
Epoch: 1061, Training Loss: 0.06833
Epoch: 1061, Training Loss: 0.10978
Epoch: 1061, Training Loss: 0.07432
Epoch: 1062, Training Loss: 0.08753
Epoch: 1062, Training Loss: 0.06825
Epoch: 1062, Training Loss: 0.10960
Epoch: 1062, Training Loss: 0.07421
Epoch: 1063, Training Loss: 0.08740
Epoch: 1063, Training Loss: 0.06818
Epoch: 1063, Training Loss: 0.10942
Epoch: 1063, Training Loss: 0.07410
Epoch: 1064, Training Loss: 0.08728
Epoch: 1064, Training Loss: 0.06810
Epoch: 1064, Training Loss: 0.10924
Epoch: 1064, Training Loss: 0.07399
Epoch: 1065, Training Loss: 0.08715
Epoch: 1065, Training Loss: 0.06803
Epoch: 1065, Training Loss: 0.10906
Epoch: 1065, Training Loss: 0.07389
Epoch: 1066, Training Loss: 0.08703
Epoch: 1066, Training Loss: 0.06795
Epoch: 1066, Training Loss: 0.10888
Epoch: 1066, Training Loss: 0.07378
Epoch: 1067, Training Loss: 0.08691
Epoch: 1067, Training Loss: 0.06788
Epoch: 1067, Training Loss: 0.10870
Epoch: 1067, Training Loss: 0.07367
Epoch: 1068, Training Loss: 0.08678
Epoch: 1068, Training Loss: 0.06781
Epoch: 1068, Training Loss: 0.10853
Epoch: 1068, Training Loss: 0.07356
Epoch: 1069, Training Loss: 0.08666
Epoch: 1069, Training Loss: 0.06773
Epoch: 1069, Training Loss: 0.10835
Epoch: 1069, Training Loss: 0.07346
Epoch: 1070, Training Loss: 0.08654
Epoch: 1070, Training Loss: 0.06766
Epoch: 1070, Training Loss: 0.10818
Epoch: 1070, Training Loss: 0.07335
Epoch: 1071, Training Loss: 0.08642
Epoch: 1071, Training Loss: 0.06759
Epoch: 1071, Training Loss: 0.10800
Epoch: 1071, Training Loss: 0.07325
Epoch: 1072, Training Loss: 0.08630
Epoch: 1072, Training Loss: 0.06751
Epoch: 1072, Training Loss: 0.10783
Epoch: 1072, Training Loss: 0.07314
Epoch: 1073, Training Loss: 0.08618
Epoch: 1073, Training Loss: 0.06744
Epoch: 1073, Training Loss: 0.10766
Epoch: 1073, Training Loss: 0.07304
Epoch: 1074, Training Loss: 0.08606
Epoch: 1074, Training Loss: 0.06737
Epoch: 1074, Training Loss: 0.10749
Epoch: 1074, Training Loss: 0.07293
Epoch: 1075, Training Loss: 0.08594
Epoch: 1075, Training Loss: 0.06730
Epoch: 1075, Training Loss: 0.10732
Epoch: 1075, Training Loss: 0.07283
Epoch: 1076, Training Loss: 0.08582
Epoch: 1076, Training Loss: 0.06722
Epoch: 1076, Training Loss: 0.10715
Epoch: 1076, Training Loss: 0.07272
Epoch: 1077, Training Loss: 0.08571
Epoch: 1077, Training Loss: 0.06715
Epoch: 1077, Training Loss: 0.10698
Epoch: 1077, Training Loss: 0.07262
Epoch: 1078, Training Loss: 0.08559
Epoch: 1078, Training Loss: 0.06708
Epoch: 1078, Training Loss: 0.10681
Epoch: 1078, Training Loss: 0.07252
Epoch: 1079, Training Loss: 0.08547
Epoch: 1079, Training Loss: 0.06701
Epoch: 1079, Training Loss: 0.10664
Epoch: 1079, Training Loss: 0.07242
Epoch: 1080, Training Loss: 0.08535
Epoch: 1080, Training Loss: 0.06694
Epoch: 1080, Training Loss: 0.10647
Epoch: 1080, Training Loss: 0.07232
Epoch: 1081, Training Loss: 0.08524
Epoch: 1081, Training Loss: 0.06687
Epoch: 1081, Training Loss: 0.10631
Epoch: 1081, Training Loss: 0.07221
Epoch: 1082, Training Loss: 0.08512
Epoch: 1082, Training Loss: 0.06680
Epoch: 1082, Training Loss: 0.10614
Epoch: 1082, Training Loss: 0.07211
Epoch: 1083, Training Loss: 0.08501
Epoch: 1083, Training Loss: 0.06673
Epoch: 1083, Training Loss: 0.10598
Epoch: 1083, Training Loss: 0.07201
Epoch: 1084, Training Loss: 0.08489
Epoch: 1084, Training Loss: 0.06666
Epoch: 1084, Training Loss: 0.10581
Epoch: 1084, Training Loss: 0.07191
Epoch: 1085, Training Loss: 0.08478
Epoch: 1085, Training Loss: 0.06659
Epoch: 1085, Training Loss: 0.10565
Epoch: 1085, Training Loss: 0.07181
Epoch: 1086, Training Loss: 0.08466
Epoch: 1086, Training Loss: 0.06652
Epoch: 1086, Training Loss: 0.10549
Epoch: 1086, Training Loss: 0.07171
Epoch: 1087, Training Loss: 0.08455
Epoch: 1087, Training Loss: 0.06645
Epoch: 1087, Training Loss: 0.10532
Epoch: 1087, Training Loss: 0.07161
Epoch: 1088, Training Loss: 0.08443
Epoch: 1088, Training Loss: 0.06638
Epoch: 1088, Training Loss: 0.10516
Epoch: 1088, Training Loss: 0.07152
Epoch: 1089, Training Loss: 0.08432
Epoch: 1089, Training Loss: 0.06631
Epoch: 1089, Training Loss: 0.10500
Epoch: 1089, Training Loss: 0.07142
Epoch: 1090, Training Loss: 0.08421
Epoch: 1090, Training Loss: 0.06624
Epoch: 1090, Training Loss: 0.10484
Epoch: 1090, Training Loss: 0.07132
Epoch: 1091, Training Loss: 0.08410
Epoch: 1091, Training Loss: 0.06617
Epoch: 1091, Training Loss: 0.10468
Epoch: 1091, Training Loss: 0.07122
Epoch: 1092, Training Loss: 0.08399
Epoch: 1092, Training Loss: 0.06610
Epoch: 1092, Training Loss: 0.10452
Epoch: 1092, Training Loss: 0.07113
Epoch: 1093, Training Loss: 0.08387
Epoch: 1093, Training Loss: 0.06603
Epoch: 1093, Training Loss: 0.10436
Epoch: 1093, Training Loss: 0.07103
Epoch: 1094, Training Loss: 0.08376
Epoch: 1094, Training Loss: 0.06596
Epoch: 1094, Training Loss: 0.10421
Epoch: 1094, Training Loss: 0.07093
Epoch: 1095, Training Loss: 0.08365
Epoch: 1095, Training Loss: 0.06590
Epoch: 1095, Training Loss: 0.10405
Epoch: 1095, Training Loss: 0.07084
Epoch: 1096, Training Loss: 0.08354
Epoch: 1096, Training Loss: 0.06583
Epoch: 1096, Training Loss: 0.10389
Epoch: 1096, Training Loss: 0.07074
Epoch: 1097, Training Loss: 0.08343
Epoch: 1097, Training Loss: 0.06576
Epoch: 1097, Training Loss: 0.10374
Epoch: 1097, Training Loss: 0.07065
Epoch: 1098, Training Loss: 0.08332
Epoch: 1098, Training Loss: 0.06569
Epoch: 1098, Training Loss: 0.10358
...
Epoch: 1347, Training Loss: 0.06438
Epoch: 1347, Training Loss: 0.05324
Epoch: 1347, Training Loss: 0.07764
Epoch: 1347, Training Loss: 0.05446
(output truncated: four per-sample losses are printed each epoch; over epochs 1097–1347 they decrease steadily from roughly 0.10 to 0.05)
Epoch: 1348, Training Loss: 0.06433
Epoch: 1348, Training Loss: 0.05321
Epoch: 1348, Training Loss: 0.07757
Epoch: 1348, Training Loss: 0.05442
Epoch: 1349, Training Loss: 0.06428
Epoch: 1349, Training Loss: 0.05317
Epoch: 1349, Training Loss: 0.07750
Epoch: 1349, Training Loss: 0.05437
Epoch: 1350, Training Loss: 0.06422
Epoch: 1350, Training Loss: 0.05313
Epoch: 1350, Training Loss: 0.07743
Epoch: 1350, Training Loss: 0.05433
Epoch: 1351, Training Loss: 0.06417
Epoch: 1351, Training Loss: 0.05309
Epoch: 1351, Training Loss: 0.07736
Epoch: 1351, Training Loss: 0.05428
Epoch: 1352, Training Loss: 0.06412
Epoch: 1352, Training Loss: 0.05306
Epoch: 1352, Training Loss: 0.07729
Epoch: 1352, Training Loss: 0.05424
Epoch: 1353, Training Loss: 0.06406
Epoch: 1353, Training Loss: 0.05302
Epoch: 1353, Training Loss: 0.07722
Epoch: 1353, Training Loss: 0.05420
Epoch: 1354, Training Loss: 0.06401
Epoch: 1354, Training Loss: 0.05298
Epoch: 1354, Training Loss: 0.07715
Epoch: 1354, Training Loss: 0.05415
Epoch: 1355, Training Loss: 0.06396
Epoch: 1355, Training Loss: 0.05295
Epoch: 1355, Training Loss: 0.07708
Epoch: 1355, Training Loss: 0.05411
Epoch: 1356, Training Loss: 0.06391
Epoch: 1356, Training Loss: 0.05291
Epoch: 1356, Training Loss: 0.07701
Epoch: 1356, Training Loss: 0.05406
Epoch: 1357, Training Loss: 0.06385
Epoch: 1357, Training Loss: 0.05287
Epoch: 1357, Training Loss: 0.07694
Epoch: 1357, Training Loss: 0.05402
Epoch: 1358, Training Loss: 0.06380
Epoch: 1358, Training Loss: 0.05284
Epoch: 1358, Training Loss: 0.07687
Epoch: 1358, Training Loss: 0.05398
Epoch: 1359, Training Loss: 0.06375
Epoch: 1359, Training Loss: 0.05280
Epoch: 1359, Training Loss: 0.07680
Epoch: 1359, Training Loss: 0.05393
Epoch: 1360, Training Loss: 0.06370
Epoch: 1360, Training Loss: 0.05276
Epoch: 1360, Training Loss: 0.07674
Epoch: 1360, Training Loss: 0.05389
Epoch: 1361, Training Loss: 0.06365
Epoch: 1361, Training Loss: 0.05273
Epoch: 1361, Training Loss: 0.07667
Epoch: 1361, Training Loss: 0.05385
Epoch: 1362, Training Loss: 0.06359
Epoch: 1362, Training Loss: 0.05269
Epoch: 1362, Training Loss: 0.07660
Epoch: 1362, Training Loss: 0.05380
Epoch: 1363, Training Loss: 0.06354
Epoch: 1363, Training Loss: 0.05266
Epoch: 1363, Training Loss: 0.07653
Epoch: 1363, Training Loss: 0.05376
Epoch: 1364, Training Loss: 0.06349
Epoch: 1364, Training Loss: 0.05262
Epoch: 1364, Training Loss: 0.07646
Epoch: 1364, Training Loss: 0.05372
Epoch: 1365, Training Loss: 0.06344
Epoch: 1365, Training Loss: 0.05258
Epoch: 1365, Training Loss: 0.07640
Epoch: 1365, Training Loss: 0.05367
Epoch: 1366, Training Loss: 0.06339
Epoch: 1366, Training Loss: 0.05255
Epoch: 1366, Training Loss: 0.07633
Epoch: 1366, Training Loss: 0.05363
Epoch: 1367, Training Loss: 0.06334
Epoch: 1367, Training Loss: 0.05251
Epoch: 1367, Training Loss: 0.07626
Epoch: 1367, Training Loss: 0.05359
Epoch: 1368, Training Loss: 0.06329
Epoch: 1368, Training Loss: 0.05248
Epoch: 1368, Training Loss: 0.07619
Epoch: 1368, Training Loss: 0.05355
Epoch: 1369, Training Loss: 0.06324
Epoch: 1369, Training Loss: 0.05244
Epoch: 1369, Training Loss: 0.07613
Epoch: 1369, Training Loss: 0.05350
Epoch: 1370, Training Loss: 0.06319
Epoch: 1370, Training Loss: 0.05241
Epoch: 1370, Training Loss: 0.07606
Epoch: 1370, Training Loss: 0.05346
Epoch: 1371, Training Loss: 0.06313
Epoch: 1371, Training Loss: 0.05237
Epoch: 1371, Training Loss: 0.07599
Epoch: 1371, Training Loss: 0.05342
Epoch: 1372, Training Loss: 0.06308
Epoch: 1372, Training Loss: 0.05233
Epoch: 1372, Training Loss: 0.07593
Epoch: 1372, Training Loss: 0.05338
Epoch: 1373, Training Loss: 0.06303
Epoch: 1373, Training Loss: 0.05230
Epoch: 1373, Training Loss: 0.07586
Epoch: 1373, Training Loss: 0.05333
Epoch: 1374, Training Loss: 0.06298
Epoch: 1374, Training Loss: 0.05226
Epoch: 1374, Training Loss: 0.07579
Epoch: 1374, Training Loss: 0.05329
Epoch: 1375, Training Loss: 0.06293
Epoch: 1375, Training Loss: 0.05223
Epoch: 1375, Training Loss: 0.07573
Epoch: 1375, Training Loss: 0.05325
Epoch: 1376, Training Loss: 0.06288
Epoch: 1376, Training Loss: 0.05219
Epoch: 1376, Training Loss: 0.07566
Epoch: 1376, Training Loss: 0.05321
Epoch: 1377, Training Loss: 0.06283
Epoch: 1377, Training Loss: 0.05216
Epoch: 1377, Training Loss: 0.07560
Epoch: 1377, Training Loss: 0.05317
Epoch: 1378, Training Loss: 0.06278
Epoch: 1378, Training Loss: 0.05212
Epoch: 1378, Training Loss: 0.07553
Epoch: 1378, Training Loss: 0.05312
Epoch: 1379, Training Loss: 0.06273
Epoch: 1379, Training Loss: 0.05209
Epoch: 1379, Training Loss: 0.07546
Epoch: 1379, Training Loss: 0.05308
Epoch: 1380, Training Loss: 0.06268
Epoch: 1380, Training Loss: 0.05205
Epoch: 1380, Training Loss: 0.07540
Epoch: 1380, Training Loss: 0.05304
Epoch: 1381, Training Loss: 0.06263
Epoch: 1381, Training Loss: 0.05202
Epoch: 1381, Training Loss: 0.07533
Epoch: 1381, Training Loss: 0.05300
Epoch: 1382, Training Loss: 0.06258
Epoch: 1382, Training Loss: 0.05198
Epoch: 1382, Training Loss: 0.07527
Epoch: 1382, Training Loss: 0.05296
Epoch: 1383, Training Loss: 0.06254
Epoch: 1383, Training Loss: 0.05195
Epoch: 1383, Training Loss: 0.07520
Epoch: 1383, Training Loss: 0.05292
Epoch: 1384, Training Loss: 0.06249
Epoch: 1384, Training Loss: 0.05191
Epoch: 1384, Training Loss: 0.07514
Epoch: 1384, Training Loss: 0.05288
Epoch: 1385, Training Loss: 0.06244
Epoch: 1385, Training Loss: 0.05188
Epoch: 1385, Training Loss: 0.07507
Epoch: 1385, Training Loss: 0.05284
Epoch: 1386, Training Loss: 0.06239
Epoch: 1386, Training Loss: 0.05184
Epoch: 1386, Training Loss: 0.07501
Epoch: 1386, Training Loss: 0.05279
Epoch: 1387, Training Loss: 0.06234
Epoch: 1387, Training Loss: 0.05181
Epoch: 1387, Training Loss: 0.07495
Epoch: 1387, Training Loss: 0.05275
Epoch: 1388, Training Loss: 0.06229
Epoch: 1388, Training Loss: 0.05177
Epoch: 1388, Training Loss: 0.07488
Epoch: 1388, Training Loss: 0.05271
Epoch: 1389, Training Loss: 0.06224
Epoch: 1389, Training Loss: 0.05174
Epoch: 1389, Training Loss: 0.07482
Epoch: 1389, Training Loss: 0.05267
Epoch: 1390, Training Loss: 0.06219
Epoch: 1390, Training Loss: 0.05171
Epoch: 1390, Training Loss: 0.07475
Epoch: 1390, Training Loss: 0.05263
Epoch: 1391, Training Loss: 0.06214
Epoch: 1391, Training Loss: 0.05167
Epoch: 1391, Training Loss: 0.07469
Epoch: 1391, Training Loss: 0.05259
Epoch: 1392, Training Loss: 0.06210
Epoch: 1392, Training Loss: 0.05164
Epoch: 1392, Training Loss: 0.07463
Epoch: 1392, Training Loss: 0.05255
Epoch: 1393, Training Loss: 0.06205
Epoch: 1393, Training Loss: 0.05160
Epoch: 1393, Training Loss: 0.07456
Epoch: 1393, Training Loss: 0.05251
Epoch: 1394, Training Loss: 0.06200
Epoch: 1394, Training Loss: 0.05157
Epoch: 1394, Training Loss: 0.07450
Epoch: 1394, Training Loss: 0.05247
Epoch: 1395, Training Loss: 0.06195
Epoch: 1395, Training Loss: 0.05154
Epoch: 1395, Training Loss: 0.07444
Epoch: 1395, Training Loss: 0.05243
Epoch: 1396, Training Loss: 0.06190
Epoch: 1396, Training Loss: 0.05150
Epoch: 1396, Training Loss: 0.07437
Epoch: 1396, Training Loss: 0.05239
Epoch: 1397, Training Loss: 0.06186
Epoch: 1397, Training Loss: 0.05147
Epoch: 1397, Training Loss: 0.07431
Epoch: 1397, Training Loss: 0.05235
Epoch: 1398, Training Loss: 0.06181
Epoch: 1398, Training Loss: 0.05143
Epoch: 1398, Training Loss: 0.07425
Epoch: 1398, Training Loss: 0.05231
Epoch: 1399, Training Loss: 0.06176
Epoch: 1399, Training Loss: 0.05140
Epoch: 1399, Training Loss: 0.07419
Epoch: 1399, Training Loss: 0.05227
Epoch: 1400, Training Loss: 0.06171
Epoch: 1400, Training Loss: 0.05137
Epoch: 1400, Training Loss: 0.07412
Epoch: 1400, Training Loss: 0.05223
Epoch: 1401, Training Loss: 0.06167
Epoch: 1401, Training Loss: 0.05133
Epoch: 1401, Training Loss: 0.07406
Epoch: 1401, Training Loss: 0.05219
Epoch: 1402, Training Loss: 0.06162
Epoch: 1402, Training Loss: 0.05130
Epoch: 1402, Training Loss: 0.07400
Epoch: 1402, Training Loss: 0.05215
Epoch: 1403, Training Loss: 0.06157
Epoch: 1403, Training Loss: 0.05127
Epoch: 1403, Training Loss: 0.07394
Epoch: 1403, Training Loss: 0.05211
Epoch: 1404, Training Loss: 0.06152
Epoch: 1404, Training Loss: 0.05123
Epoch: 1404, Training Loss: 0.07388
Epoch: 1404, Training Loss: 0.05207
Epoch: 1405, Training Loss: 0.06148
Epoch: 1405, Training Loss: 0.05120
Epoch: 1405, Training Loss: 0.07381
Epoch: 1405, Training Loss: 0.05203
Epoch: 1406, Training Loss: 0.06143
Epoch: 1406, Training Loss: 0.05117
Epoch: 1406, Training Loss: 0.07375
Epoch: 1406, Training Loss: 0.05199
Epoch: 1407, Training Loss: 0.06138
Epoch: 1407, Training Loss: 0.05113
Epoch: 1407, Training Loss: 0.07369
Epoch: 1407, Training Loss: 0.05196
Epoch: 1408, Training Loss: 0.06134
Epoch: 1408, Training Loss: 0.05110
Epoch: 1408, Training Loss: 0.07363
Epoch: 1408, Training Loss: 0.05192
Epoch: 1409, Training Loss: 0.06129
Epoch: 1409, Training Loss: 0.05107
Epoch: 1409, Training Loss: 0.07357
Epoch: 1409, Training Loss: 0.05188
Epoch: 1410, Training Loss: 0.06124
Epoch: 1410, Training Loss: 0.05103
Epoch: 1410, Training Loss: 0.07351
Epoch: 1410, Training Loss: 0.05184
Epoch: 1411, Training Loss: 0.06120
Epoch: 1411, Training Loss: 0.05100
Epoch: 1411, Training Loss: 0.07345
Epoch: 1411, Training Loss: 0.05180
Epoch: 1412, Training Loss: 0.06115
Epoch: 1412, Training Loss: 0.05097
Epoch: 1412, Training Loss: 0.07339
Epoch: 1412, Training Loss: 0.05176
Epoch: 1413, Training Loss: 0.06110
Epoch: 1413, Training Loss: 0.05093
Epoch: 1413, Training Loss: 0.07333
Epoch: 1413, Training Loss: 0.05172
Epoch: 1414, Training Loss: 0.06106
Epoch: 1414, Training Loss: 0.05090
Epoch: 1414, Training Loss: 0.07327
Epoch: 1414, Training Loss: 0.05168
Epoch: 1415, Training Loss: 0.06101
Epoch: 1415, Training Loss: 0.05087
Epoch: 1415, Training Loss: 0.07321
Epoch: 1415, Training Loss: 0.05165
Epoch: 1416, Training Loss: 0.06097
Epoch: 1416, Training Loss: 0.05084
Epoch: 1416, Training Loss: 0.07315
Epoch: 1416, Training Loss: 0.05161
Epoch: 1417, Training Loss: 0.06092
Epoch: 1417, Training Loss: 0.05080
Epoch: 1417, Training Loss: 0.07309
Epoch: 1417, Training Loss: 0.05157
Epoch: 1418, Training Loss: 0.06087
Epoch: 1418, Training Loss: 0.05077
Epoch: 1418, Training Loss: 0.07303
Epoch: 1418, Training Loss: 0.05153
Epoch: 1419, Training Loss: 0.06083
Epoch: 1419, Training Loss: 0.05074
Epoch: 1419, Training Loss: 0.07297
Epoch: 1419, Training Loss: 0.05149
Epoch: 1420, Training Loss: 0.06078
Epoch: 1420, Training Loss: 0.05071
Epoch: 1420, Training Loss: 0.07291
Epoch: 1420, Training Loss: 0.05146
Epoch: 1421, Training Loss: 0.06074
Epoch: 1421, Training Loss: 0.05067
Epoch: 1421, Training Loss: 0.07285
Epoch: 1421, Training Loss: 0.05142
Epoch: 1422, Training Loss: 0.06069
Epoch: 1422, Training Loss: 0.05064
Epoch: 1422, Training Loss: 0.07279
Epoch: 1422, Training Loss: 0.05138
Epoch: 1423, Training Loss: 0.06065
Epoch: 1423, Training Loss: 0.05061
Epoch: 1423, Training Loss: 0.07273
Epoch: 1423, Training Loss: 0.05134
Epoch: 1424, Training Loss: 0.06060
Epoch: 1424, Training Loss: 0.05058
Epoch: 1424, Training Loss: 0.07267
Epoch: 1424, Training Loss: 0.05130
Epoch: 1425, Training Loss: 0.06056
Epoch: 1425, Training Loss: 0.05055
Epoch: 1425, Training Loss: 0.07261
Epoch: 1425, Training Loss: 0.05127
Epoch: 1426, Training Loss: 0.06051
Epoch: 1426, Training Loss: 0.05051
Epoch: 1426, Training Loss: 0.07255
Epoch: 1426, Training Loss: 0.05123
Epoch: 1427, Training Loss: 0.06047
Epoch: 1427, Training Loss: 0.05048
Epoch: 1427, Training Loss: 0.07249
Epoch: 1427, Training Loss: 0.05119
Epoch: 1428, Training Loss: 0.06042
Epoch: 1428, Training Loss: 0.05045
Epoch: 1428, Training Loss: 0.07244
Epoch: 1428, Training Loss: 0.05115
Epoch: 1429, Training Loss: 0.06038
Epoch: 1429, Training Loss: 0.05042
Epoch: 1429, Training Loss: 0.07238
Epoch: 1429, Training Loss: 0.05112
Epoch: 1430, Training Loss: 0.06033
Epoch: 1430, Training Loss: 0.05039
Epoch: 1430, Training Loss: 0.07232
Epoch: 1430, Training Loss: 0.05108
Epoch: 1431, Training Loss: 0.06029
Epoch: 1431, Training Loss: 0.05035
Epoch: 1431, Training Loss: 0.07226
Epoch: 1431, Training Loss: 0.05104
Epoch: 1432, Training Loss: 0.06024
Epoch: 1432, Training Loss: 0.05032
Epoch: 1432, Training Loss: 0.07220
Epoch: 1432, Training Loss: 0.05101
Epoch: 1433, Training Loss: 0.06020
Epoch: 1433, Training Loss: 0.05029
Epoch: 1433, Training Loss: 0.07215
Epoch: 1433, Training Loss: 0.05097
Epoch: 1434, Training Loss: 0.06015
Epoch: 1434, Training Loss: 0.05026
Epoch: 1434, Training Loss: 0.07209
Epoch: 1434, Training Loss: 0.05093
Epoch: 1435, Training Loss: 0.06011
Epoch: 1435, Training Loss: 0.05023
Epoch: 1435, Training Loss: 0.07203
Epoch: 1435, Training Loss: 0.05090
Epoch: 1436, Training Loss: 0.06007
Epoch: 1436, Training Loss: 0.05020
Epoch: 1436, Training Loss: 0.07197
Epoch: 1436, Training Loss: 0.05086
Epoch: 1437, Training Loss: 0.06002
Epoch: 1437, Training Loss: 0.05016
Epoch: 1437, Training Loss: 0.07192
Epoch: 1437, Training Loss: 0.05082
Epoch: 1438, Training Loss: 0.05998
Epoch: 1438, Training Loss: 0.05013
Epoch: 1438, Training Loss: 0.07186
Epoch: 1438, Training Loss: 0.05079
Epoch: 1439, Training Loss: 0.05993
Epoch: 1439, Training Loss: 0.05010
Epoch: 1439, Training Loss: 0.07180
Epoch: 1439, Training Loss: 0.05075
Epoch: 1440, Training Loss: 0.05989
Epoch: 1440, Training Loss: 0.05007
Epoch: 1440, Training Loss: 0.07174
Epoch: 1440, Training Loss: 0.05071
Epoch: 1441, Training Loss: 0.05985
Epoch: 1441, Training Loss: 0.05004
Epoch: 1441, Training Loss: 0.07169
Epoch: 1441, Training Loss: 0.05068
Epoch: 1442, Training Loss: 0.05980
Epoch: 1442, Training Loss: 0.05001
Epoch: 1442, Training Loss: 0.07163
Epoch: 1442, Training Loss: 0.05064
Epoch: 1443, Training Loss: 0.05976
Epoch: 1443, Training Loss: 0.04998
Epoch: 1443, Training Loss: 0.07157
Epoch: 1443, Training Loss: 0.05060
Epoch: 1444, Training Loss: 0.05972
Epoch: 1444, Training Loss: 0.04995
Epoch: 1444, Training Loss: 0.07152
Epoch: 1444, Training Loss: 0.05057
Epoch: 1445, Training Loss: 0.05967
Epoch: 1445, Training Loss: 0.04991
Epoch: 1445, Training Loss: 0.07146
Epoch: 1445, Training Loss: 0.05053
Epoch: 1446, Training Loss: 0.05963
Epoch: 1446, Training Loss: 0.04988
Epoch: 1446, Training Loss: 0.07141
Epoch: 1446, Training Loss: 0.05050
Epoch: 1447, Training Loss: 0.05959
Epoch: 1447, Training Loss: 0.04985
Epoch: 1447, Training Loss: 0.07135
Epoch: 1447, Training Loss: 0.05046
Epoch: 1448, Training Loss: 0.05954
Epoch: 1448, Training Loss: 0.04982
Epoch: 1448, Training Loss: 0.07129
Epoch: 1448, Training Loss: 0.05042
Epoch: 1449, Training Loss: 0.05950
Epoch: 1449, Training Loss: 0.04979
Epoch: 1449, Training Loss: 0.07124
Epoch: 1449, Training Loss: 0.05039
Epoch: 1450, Training Loss: 0.05946
Epoch: 1450, Training Loss: 0.04976
Epoch: 1450, Training Loss: 0.07118
Epoch: 1450, Training Loss: 0.05035
Epoch: 1451, Training Loss: 0.05942
Epoch: 1451, Training Loss: 0.04973
Epoch: 1451, Training Loss: 0.07113
Epoch: 1451, Training Loss: 0.05032
Epoch: 1452, Training Loss: 0.05937
Epoch: 1452, Training Loss: 0.04970
Epoch: 1452, Training Loss: 0.07107
Epoch: 1452, Training Loss: 0.05028
Epoch: 1453, Training Loss: 0.05933
Epoch: 1453, Training Loss: 0.04967
Epoch: 1453, Training Loss: 0.07102
Epoch: 1453, Training Loss: 0.05025
Epoch: 1454, Training Loss: 0.05929
Epoch: 1454, Training Loss: 0.04964
Epoch: 1454, Training Loss: 0.07096
Epoch: 1454, Training Loss: 0.05021
Epoch: 1455, Training Loss: 0.05925
Epoch: 1455, Training Loss: 0.04961
Epoch: 1455, Training Loss: 0.07091
Epoch: 1455, Training Loss: 0.05018
Epoch: 1456, Training Loss: 0.05920
Epoch: 1456, Training Loss: 0.04958
Epoch: 1456, Training Loss: 0.07085
Epoch: 1456, Training Loss: 0.05014
Epoch: 1457, Training Loss: 0.05916
Epoch: 1457, Training Loss: 0.04955
Epoch: 1457, Training Loss: 0.07080
Epoch: 1457, Training Loss: 0.05010
Epoch: 1458, Training Loss: 0.05912
Epoch: 1458, Training Loss: 0.04952
Epoch: 1458, Training Loss: 0.07074
Epoch: 1458, Training Loss: 0.05007
Epoch: 1459, Training Loss: 0.05908
Epoch: 1459, Training Loss: 0.04949
Epoch: 1459, Training Loss: 0.07069
Epoch: 1459, Training Loss: 0.05003
Epoch: 1460, Training Loss: 0.05904
Epoch: 1460, Training Loss: 0.04946
Epoch: 1460, Training Loss: 0.07063
Epoch: 1460, Training Loss: 0.05000
Epoch: 1461, Training Loss: 0.05899
Epoch: 1461, Training Loss: 0.04943
Epoch: 1461, Training Loss: 0.07058
Epoch: 1461, Training Loss: 0.04996
Epoch: 1462, Training Loss: 0.05895
Epoch: 1462, Training Loss: 0.04940
Epoch: 1462, Training Loss: 0.07052
Epoch: 1462, Training Loss: 0.04993
Epoch: 1463, Training Loss: 0.05891
Epoch: 1463, Training Loss: 0.04937
Epoch: 1463, Training Loss: 0.07047
Epoch: 1463, Training Loss: 0.04990
Epoch: 1464, Training Loss: 0.05887
Epoch: 1464, Training Loss: 0.04934
Epoch: 1464, Training Loss: 0.07041
Epoch: 1464, Training Loss: 0.04986
Epoch: 1465, Training Loss: 0.05883
Epoch: 1465, Training Loss: 0.04931
Epoch: 1465, Training Loss: 0.07036
Epoch: 1465, Training Loss: 0.04983
Epoch: 1466, Training Loss: 0.05879
Epoch: 1466, Training Loss: 0.04928
Epoch: 1466, Training Loss: 0.07031
Epoch: 1466, Training Loss: 0.04979
Epoch: 1467, Training Loss: 0.05874
Epoch: 1467, Training Loss: 0.04925
Epoch: 1467, Training Loss: 0.07025
Epoch: 1467, Training Loss: 0.04976
Epoch: 1468, Training Loss: 0.05870
Epoch: 1468, Training Loss: 0.04922
Epoch: 1468, Training Loss: 0.07020
Epoch: 1468, Training Loss: 0.04972
Epoch: 1469, Training Loss: 0.05866
Epoch: 1469, Training Loss: 0.04919
Epoch: 1469, Training Loss: 0.07015
Epoch: 1469, Training Loss: 0.04969
Epoch: 1470, Training Loss: 0.05862
Epoch: 1470, Training Loss: 0.04916
Epoch: 1470, Training Loss: 0.07009
Epoch: 1470, Training Loss: 0.04965
Epoch: 1471, Training Loss: 0.05858
Epoch: 1471, Training Loss: 0.04913
Epoch: 1471, Training Loss: 0.07004
Epoch: 1471, Training Loss: 0.04962
Epoch: 1472, Training Loss: 0.05854
Epoch: 1472, Training Loss: 0.04910
Epoch: 1472, Training Loss: 0.06999
Epoch: 1472, Training Loss: 0.04959
Epoch: 1473, Training Loss: 0.05850
Epoch: 1473, Training Loss: 0.04907
Epoch: 1473, Training Loss: 0.06993
Epoch: 1473, Training Loss: 0.04955
Epoch: 1474, Training Loss: 0.05846
Epoch: 1474, Training Loss: 0.04904
Epoch: 1474, Training Loss: 0.06988
Epoch: 1474, Training Loss: 0.04952
Epoch: 1475, Training Loss: 0.05842
Epoch: 1475, Training Loss: 0.04901
Epoch: 1475, Training Loss: 0.06983
Epoch: 1475, Training Loss: 0.04948
Epoch: 1476, Training Loss: 0.05838
Epoch: 1476, Training Loss: 0.04898
Epoch: 1476, Training Loss: 0.06977
Epoch: 1476, Training Loss: 0.04945
Epoch: 1477, Training Loss: 0.05833
Epoch: 1477, Training Loss: 0.04895
Epoch: 1477, Training Loss: 0.06972
Epoch: 1477, Training Loss: 0.04942
Epoch: 1478, Training Loss: 0.05829
Epoch: 1478, Training Loss: 0.04892
Epoch: 1478, Training Loss: 0.06967
Epoch: 1478, Training Loss: 0.04938
Epoch: 1479, Training Loss: 0.05825
Epoch: 1479, Training Loss: 0.04889
Epoch: 1479, Training Loss: 0.06962
Epoch: 1479, Training Loss: 0.04935
Epoch: 1480, Training Loss: 0.05821
Epoch: 1480, Training Loss: 0.04886
Epoch: 1480, Training Loss: 0.06957
Epoch: 1480, Training Loss: 0.04932
Epoch: 1481, Training Loss: 0.05817
Epoch: 1481, Training Loss: 0.04883
Epoch: 1481, Training Loss: 0.06951
Epoch: 1481, Training Loss: 0.04928
Epoch: 1482, Training Loss: 0.05813
Epoch: 1482, Training Loss: 0.04880
Epoch: 1482, Training Loss: 0.06946
Epoch: 1482, Training Loss: 0.04925
Epoch: 1483, Training Loss: 0.05809
Epoch: 1483, Training Loss: 0.04878
Epoch: 1483, Training Loss: 0.06941
Epoch: 1483, Training Loss: 0.04921
Epoch: 1484, Training Loss: 0.05805
Epoch: 1484, Training Loss: 0.04875
Epoch: 1484, Training Loss: 0.06936
Epoch: 1484, Training Loss: 0.04918
Epoch: 1485, Training Loss: 0.05801
Epoch: 1485, Training Loss: 0.04872
Epoch: 1485, Training Loss: 0.06931
Epoch: 1485, Training Loss: 0.04915
Epoch: 1486, Training Loss: 0.05797
Epoch: 1486, Training Loss: 0.04869
Epoch: 1486, Training Loss: 0.06925
Epoch: 1486, Training Loss: 0.04912
Epoch: 1487, Training Loss: 0.05793
Epoch: 1487, Training Loss: 0.04866
Epoch: 1487, Training Loss: 0.06920
Epoch: 1487, Training Loss: 0.04908
Epoch: 1488, Training Loss: 0.05789
Epoch: 1488, Training Loss: 0.04863
Epoch: 1488, Training Loss: 0.06915
Epoch: 1488, Training Loss: 0.04905
Epoch: 1489, Training Loss: 0.05785
Epoch: 1489, Training Loss: 0.04860
Epoch: 1489, Training Loss: 0.06910
Epoch: 1489, Training Loss: 0.04902
Epoch: 1490, Training Loss: 0.05781
Epoch: 1490, Training Loss: 0.04857
Epoch: 1490, Training Loss: 0.06905
Epoch: 1490, Training Loss: 0.04898
Epoch: 1491, Training Loss: 0.05777
Epoch: 1491, Training Loss: 0.04855
Epoch: 1491, Training Loss: 0.06900
Epoch: 1491, Training Loss: 0.04895
Epoch: 1492, Training Loss: 0.05774
Epoch: 1492, Training Loss: 0.04852
Epoch: 1492, Training Loss: 0.06895
Epoch: 1492, Training Loss: 0.04892
Epoch: 1493, Training Loss: 0.05770
Epoch: 1493, Training Loss: 0.04849
Epoch: 1493, Training Loss: 0.06890
Epoch: 1493, Training Loss: 0.04888
Epoch: 1494, Training Loss: 0.05766
Epoch: 1494, Training Loss: 0.04846
Epoch: 1494, Training Loss: 0.06884
Epoch: 1494, Training Loss: 0.04885
Epoch: 1495, Training Loss: 0.05762
Epoch: 1495, Training Loss: 0.04843
Epoch: 1495, Training Loss: 0.06879
Epoch: 1495, Training Loss: 0.04882
Epoch: 1496, Training Loss: 0.05758
Epoch: 1496, Training Loss: 0.04840
Epoch: 1496, Training Loss: 0.06874
Epoch: 1496, Training Loss: 0.04879
Epoch: 1497, Training Loss: 0.05754
Epoch: 1497, Training Loss: 0.04837
Epoch: 1497, Training Loss: 0.06869
Epoch: 1497, Training Loss: 0.04875
Epoch: 1498, Training Loss: 0.05750
Epoch: 1498, Training Loss: 0.04835
Epoch: 1498, Training Loss: 0.06864
Epoch: 1498, Training Loss: 0.04872
Epoch: 1499, Training Loss: 0.05746
Epoch: 1499, Training Loss: 0.04832
Epoch: 1499, Training Loss: 0.06859
Epoch: 1499, Training Loss: 0.04869
Epoch: 1500, Training Loss: 0.05742
Epoch: 1500, Training Loss: 0.04829
Epoch: 1500, Training Loss: 0.06854
Epoch: 1500, Training Loss: 0.04866
Epoch: 1501, Training Loss: 0.05738
Epoch: 1501, Training Loss: 0.04826
Epoch: 1501, Training Loss: 0.06849
Epoch: 1501, Training Loss: 0.04863
Epoch: 1502, Training Loss: 0.05734
Epoch: 1502, Training Loss: 0.04823
Epoch: 1502, Training Loss: 0.06844
Epoch: 1502, Training Loss: 0.04859
Epoch: 1503, Training Loss: 0.05731
Epoch: 1503, Training Loss: 0.04821
Epoch: 1503, Training Loss: 0.06839
Epoch: 1503, Training Loss: 0.04856
Epoch: 1504, Training Loss: 0.05727
Epoch: 1504, Training Loss: 0.04818
Epoch: 1504, Training Loss: 0.06834
Epoch: 1504, Training Loss: 0.04853
Epoch: 1505, Training Loss: 0.05723
Epoch: 1505, Training Loss: 0.04815
Epoch: 1505, Training Loss: 0.06829
Epoch: 1505, Training Loss: 0.04850
Epoch: 1506, Training Loss: 0.05719
Epoch: 1506, Training Loss: 0.04812
Epoch: 1506, Training Loss: 0.06824
Epoch: 1506, Training Loss: 0.04846
Epoch: 1507, Training Loss: 0.05715
Epoch: 1507, Training Loss: 0.04809
Epoch: 1507, Training Loss: 0.06819
Epoch: 1507, Training Loss: 0.04843
Epoch: 1508, Training Loss: 0.05711
Epoch: 1508, Training Loss: 0.04807
Epoch: 1508, Training Loss: 0.06814
Epoch: 1508, Training Loss: 0.04840
Epoch: 1509, Training Loss: 0.05708
Epoch: 1509, Training Loss: 0.04804
Epoch: 1509, Training Loss: 0.06810
Epoch: 1509, Training Loss: 0.04837
Epoch: 1510, Training Loss: 0.05704
Epoch: 1510, Training Loss: 0.04801
Epoch: 1510, Training Loss: 0.06805
Epoch: 1510, Training Loss: 0.04834
Epoch: 1511, Training Loss: 0.05700
Epoch: 1511, Training Loss: 0.04798
Epoch: 1511, Training Loss: 0.06800
Epoch: 1511, Training Loss: 0.04831
Epoch: 1512, Training Loss: 0.05696
Epoch: 1512, Training Loss: 0.04796
Epoch: 1512, Training Loss: 0.06795
Epoch: 1512, Training Loss: 0.04827
Epoch: 1513, Training Loss: 0.05692
Epoch: 1513, Training Loss: 0.04793
Epoch: 1513, Training Loss: 0.06790
Epoch: 1513, Training Loss: 0.04824
Epoch: 1514, Training Loss: 0.05689
Epoch: 1514, Training Loss: 0.04790
Epoch: 1514, Training Loss: 0.06785
Epoch: 1514, Training Loss: 0.04821
Epoch: 1515, Training Loss: 0.05685
Epoch: 1515, Training Loss: 0.04787
Epoch: 1515, Training Loss: 0.06780
Epoch: 1515, Training Loss: 0.04818
Epoch: 1516, Training Loss: 0.05681
Epoch: 1516, Training Loss: 0.04785
Epoch: 1516, Training Loss: 0.06775
Epoch: 1516, Training Loss: 0.04815
Epoch: 1517, Training Loss: 0.05677
Epoch: 1517, Training Loss: 0.04782
Epoch: 1517, Training Loss: 0.06771
Epoch: 1517, Training Loss: 0.04812
Epoch: 1518, Training Loss: 0.05674
Epoch: 1518, Training Loss: 0.04779
Epoch: 1518, Training Loss: 0.06766
Epoch: 1518, Training Loss: 0.04809
Epoch: 1519, Training Loss: 0.05670
Epoch: 1519, Training Loss: 0.04776
Epoch: 1519, Training Loss: 0.06761
Epoch: 1519, Training Loss: 0.04806
Epoch: 1520, Training Loss: 0.05666
Epoch: 1520, Training Loss: 0.04774
Epoch: 1520, Training Loss: 0.06756
Epoch: 1520, Training Loss: 0.04802
Epoch: 1521, Training Loss: 0.05662
Epoch: 1521, Training Loss: 0.04771
Epoch: 1521, Training Loss: 0.06751
Epoch: 1521, Training Loss: 0.04799
Epoch: 1522, Training Loss: 0.05659
Epoch: 1522, Training Loss: 0.04768
Epoch: 1522, Training Loss: 0.06746
Epoch: 1522, Training Loss: 0.04796
Epoch: 1523, Training Loss: 0.05655
Epoch: 1523, Training Loss: 0.04765
Epoch: 1523, Training Loss: 0.06742
Epoch: 1523, Training Loss: 0.04793
Epoch: 1524, Training Loss: 0.05651
Epoch: 1524, Training Loss: 0.04763
Epoch: 1524, Training Loss: 0.06737
Epoch: 1524, Training Loss: 0.04790
Epoch: 1525, Training Loss: 0.05648
Epoch: 1525, Training Loss: 0.04760
Epoch: 1525, Training Loss: 0.06732
Epoch: 1525, Training Loss: 0.04787
Epoch: 1526, Training Loss: 0.05644
Epoch: 1526, Training Loss: 0.04757
Epoch: 1526, Training Loss: 0.06727
Epoch: 1526, Training Loss: 0.04784
Epoch: 1527, Training Loss: 0.05640
Epoch: 1527, Training Loss: 0.04755
Epoch: 1527, Training Loss: 0.06723
Epoch: 1527, Training Loss: 0.04781
Epoch: 1528, Training Loss: 0.05636
Epoch: 1528, Training Loss: 0.04752
Epoch: 1528, Training Loss: 0.06718
Epoch: 1528, Training Loss: 0.04778
Epoch: 1529, Training Loss: 0.05633
Epoch: 1529, Training Loss: 0.04749
Epoch: 1529, Training Loss: 0.06713
Epoch: 1529, Training Loss: 0.04775
Epoch: 1530, Training Loss: 0.05629
Epoch: 1530, Training Loss: 0.04747
Epoch: 1530, Training Loss: 0.06708
Epoch: 1530, Training Loss: 0.04772
Epoch: 1531, Training Loss: 0.05625
Epoch: 1531, Training Loss: 0.04744
Epoch: 1531, Training Loss: 0.06704
Epoch: 1531, Training Loss: 0.04769
Epoch: 1532, Training Loss: 0.05622
Epoch: 1532, Training Loss: 0.04741
Epoch: 1532, Training Loss: 0.06699
Epoch: 1532, Training Loss: 0.04766
Epoch: 1533, Training Loss: 0.05618
Epoch: 1533, Training Loss: 0.04739
Epoch: 1533, Training Loss: 0.06694
Epoch: 1533, Training Loss: 0.04763
Epoch: 1534, Training Loss: 0.05614
Epoch: 1534, Training Loss: 0.04736
Epoch: 1534, Training Loss: 0.06690
Epoch: 1534, Training Loss: 0.04760
Epoch: 1535, Training Loss: 0.05611
Epoch: 1535, Training Loss: 0.04733
Epoch: 1535, Training Loss: 0.06685
Epoch: 1535, Training Loss: 0.04756
Epoch: 1536, Training Loss: 0.05607
Epoch: 1536, Training Loss: 0.04731
Epoch: 1536, Training Loss: 0.06680
Epoch: 1536, Training Loss: 0.04753
Epoch: 1537, Training Loss: 0.05604
Epoch: 1537, Training Loss: 0.04728
Epoch: 1537, Training Loss: 0.06676
Epoch: 1537, Training Loss: 0.04750
Epoch: 1538, Training Loss: 0.05600
Epoch: 1538, Training Loss: 0.04725
Epoch: 1538, Training Loss: 0.06671
Epoch: 1538, Training Loss: 0.04747
Epoch: 1539, Training Loss: 0.05596
Epoch: 1539, Training Loss: 0.04723
Epoch: 1539, Training Loss: 0.06666
Epoch: 1539, Training Loss: 0.04744
Epoch: 1540, Training Loss: 0.05593
Epoch: 1540, Training Loss: 0.04720
Epoch: 1540, Training Loss: 0.06662
Epoch: 1540, Training Loss: 0.04741
Epoch: 1541, Training Loss: 0.05589
Epoch: 1541, Training Loss: 0.04717
Epoch: 1541, Training Loss: 0.06657
Epoch: 1541, Training Loss: 0.04738
Epoch: 1542, Training Loss: 0.05586
Epoch: 1542, Training Loss: 0.04715
    ⋮
Epoch: 1792, Training Loss: 0.04166
[Verbose output truncated: the network prints one loss line per training sample, i.e. four lines per epoch. Over epochs 1542–1792 the per-sample losses decline slowly and monotonically, from roughly 0.047–0.067 down to roughly 0.042–0.058, still well above the 0.008 target.]
Epoch: 1792, Training Loss: 0.05721
Epoch: 1792, Training Loss: 0.04126
Epoch: 1793, Training Loss: 0.04849
Epoch: 1793, Training Loss: 0.04164
Epoch: 1793, Training Loss: 0.05718
Epoch: 1793, Training Loss: 0.04124
Epoch: 1794, Training Loss: 0.04847
Epoch: 1794, Training Loss: 0.04162
Epoch: 1794, Training Loss: 0.05715
Epoch: 1794, Training Loss: 0.04122
Epoch: 1795, Training Loss: 0.04845
Epoch: 1795, Training Loss: 0.04161
Epoch: 1795, Training Loss: 0.05712
Epoch: 1795, Training Loss: 0.04120
Epoch: 1796, Training Loss: 0.04842
Epoch: 1796, Training Loss: 0.04159
Epoch: 1796, Training Loss: 0.05709
Epoch: 1796, Training Loss: 0.04118
Epoch: 1797, Training Loss: 0.04840
Epoch: 1797, Training Loss: 0.04157
Epoch: 1797, Training Loss: 0.05706
Epoch: 1797, Training Loss: 0.04117
Epoch: 1798, Training Loss: 0.04837
Epoch: 1798, Training Loss: 0.04155
Epoch: 1798, Training Loss: 0.05703
Epoch: 1798, Training Loss: 0.04115
Epoch: 1799, Training Loss: 0.04835
Epoch: 1799, Training Loss: 0.04153
Epoch: 1799, Training Loss: 0.05700
Epoch: 1799, Training Loss: 0.04113
Epoch: 1800, Training Loss: 0.04833
Epoch: 1800, Training Loss: 0.04151
Epoch: 1800, Training Loss: 0.05697
Epoch: 1800, Training Loss: 0.04111
Epoch: 1801, Training Loss: 0.04830
Epoch: 1801, Training Loss: 0.04150
Epoch: 1801, Training Loss: 0.05694
Epoch: 1801, Training Loss: 0.04109
Epoch: 1802, Training Loss: 0.04828
Epoch: 1802, Training Loss: 0.04148
Epoch: 1802, Training Loss: 0.05691
Epoch: 1802, Training Loss: 0.04107
Epoch: 1803, Training Loss: 0.04826
Epoch: 1803, Training Loss: 0.04146
Epoch: 1803, Training Loss: 0.05688
Epoch: 1803, Training Loss: 0.04105
Epoch: 1804, Training Loss: 0.04823
Epoch: 1804, Training Loss: 0.04144
Epoch: 1804, Training Loss: 0.05685
Epoch: 1804, Training Loss: 0.04103
Epoch: 1805, Training Loss: 0.04821
Epoch: 1805, Training Loss: 0.04142
Epoch: 1805, Training Loss: 0.05682
Epoch: 1805, Training Loss: 0.04101
Epoch: 1806, Training Loss: 0.04818
Epoch: 1806, Training Loss: 0.04141
Epoch: 1806, Training Loss: 0.05679
Epoch: 1806, Training Loss: 0.04099
Epoch: 1807, Training Loss: 0.04816
Epoch: 1807, Training Loss: 0.04139
Epoch: 1807, Training Loss: 0.05676
Epoch: 1807, Training Loss: 0.04097
Epoch: 1808, Training Loss: 0.04814
Epoch: 1808, Training Loss: 0.04137
Epoch: 1808, Training Loss: 0.05673
Epoch: 1808, Training Loss: 0.04095
Epoch: 1809, Training Loss: 0.04811
Epoch: 1809, Training Loss: 0.04135
Epoch: 1809, Training Loss: 0.05670
Epoch: 1809, Training Loss: 0.04093
Epoch: 1810, Training Loss: 0.04809
Epoch: 1810, Training Loss: 0.04133
Epoch: 1810, Training Loss: 0.05667
Epoch: 1810, Training Loss: 0.04091
Epoch: 1811, Training Loss: 0.04807
Epoch: 1811, Training Loss: 0.04132
Epoch: 1811, Training Loss: 0.05664
Epoch: 1811, Training Loss: 0.04089
Epoch: 1812, Training Loss: 0.04804
Epoch: 1812, Training Loss: 0.04130
Epoch: 1812, Training Loss: 0.05661
Epoch: 1812, Training Loss: 0.04087
Epoch: 1813, Training Loss: 0.04802
Epoch: 1813, Training Loss: 0.04128
Epoch: 1813, Training Loss: 0.05659
Epoch: 1813, Training Loss: 0.04085
Epoch: 1814, Training Loss: 0.04800
Epoch: 1814, Training Loss: 0.04126
Epoch: 1814, Training Loss: 0.05656
Epoch: 1814, Training Loss: 0.04083
Epoch: 1815, Training Loss: 0.04797
Epoch: 1815, Training Loss: 0.04124
Epoch: 1815, Training Loss: 0.05653
Epoch: 1815, Training Loss: 0.04081
Epoch: 1816, Training Loss: 0.04795
Epoch: 1816, Training Loss: 0.04123
Epoch: 1816, Training Loss: 0.05650
Epoch: 1816, Training Loss: 0.04079
Epoch: 1817, Training Loss: 0.04793
Epoch: 1817, Training Loss: 0.04121
Epoch: 1817, Training Loss: 0.05647
Epoch: 1817, Training Loss: 0.04077
Epoch: 1818, Training Loss: 0.04790
Epoch: 1818, Training Loss: 0.04119
Epoch: 1818, Training Loss: 0.05644
Epoch: 1818, Training Loss: 0.04075
Epoch: 1819, Training Loss: 0.04788
Epoch: 1819, Training Loss: 0.04117
Epoch: 1819, Training Loss: 0.05641
Epoch: 1819, Training Loss: 0.04074
Epoch: 1820, Training Loss: 0.04786
Epoch: 1820, Training Loss: 0.04116
Epoch: 1820, Training Loss: 0.05638
Epoch: 1820, Training Loss: 0.04072
Epoch: 1821, Training Loss: 0.04783
Epoch: 1821, Training Loss: 0.04114
Epoch: 1821, Training Loss: 0.05635
Epoch: 1821, Training Loss: 0.04070
Epoch: 1822, Training Loss: 0.04781
Epoch: 1822, Training Loss: 0.04112
Epoch: 1822, Training Loss: 0.05632
Epoch: 1822, Training Loss: 0.04068
Epoch: 1823, Training Loss: 0.04779
Epoch: 1823, Training Loss: 0.04110
Epoch: 1823, Training Loss: 0.05629
Epoch: 1823, Training Loss: 0.04066
Epoch: 1824, Training Loss: 0.04776
Epoch: 1824, Training Loss: 0.04108
Epoch: 1824, Training Loss: 0.05627
Epoch: 1824, Training Loss: 0.04064
Epoch: 1825, Training Loss: 0.04774
Epoch: 1825, Training Loss: 0.04107
Epoch: 1825, Training Loss: 0.05624
Epoch: 1825, Training Loss: 0.04062
Epoch: 1826, Training Loss: 0.04772
Epoch: 1826, Training Loss: 0.04105
Epoch: 1826, Training Loss: 0.05621
Epoch: 1826, Training Loss: 0.04060
Epoch: 1827, Training Loss: 0.04770
Epoch: 1827, Training Loss: 0.04103
Epoch: 1827, Training Loss: 0.05618
Epoch: 1827, Training Loss: 0.04058
Epoch: 1828, Training Loss: 0.04767
Epoch: 1828, Training Loss: 0.04101
Epoch: 1828, Training Loss: 0.05615
Epoch: 1828, Training Loss: 0.04056
Epoch: 1829, Training Loss: 0.04765
Epoch: 1829, Training Loss: 0.04100
Epoch: 1829, Training Loss: 0.05612
Epoch: 1829, Training Loss: 0.04054
Epoch: 1830, Training Loss: 0.04763
Epoch: 1830, Training Loss: 0.04098
Epoch: 1830, Training Loss: 0.05609
Epoch: 1830, Training Loss: 0.04053
Epoch: 1831, Training Loss: 0.04760
Epoch: 1831, Training Loss: 0.04096
Epoch: 1831, Training Loss: 0.05606
Epoch: 1831, Training Loss: 0.04051
Epoch: 1832, Training Loss: 0.04758
Epoch: 1832, Training Loss: 0.04094
Epoch: 1832, Training Loss: 0.05604
Epoch: 1832, Training Loss: 0.04049
Epoch: 1833, Training Loss: 0.04756
Epoch: 1833, Training Loss: 0.04093
Epoch: 1833, Training Loss: 0.05601
Epoch: 1833, Training Loss: 0.04047
Epoch: 1834, Training Loss: 0.04754
Epoch: 1834, Training Loss: 0.04091
Epoch: 1834, Training Loss: 0.05598
Epoch: 1834, Training Loss: 0.04045
Epoch: 1835, Training Loss: 0.04751
Epoch: 1835, Training Loss: 0.04089
Epoch: 1835, Training Loss: 0.05595
Epoch: 1835, Training Loss: 0.04043
Epoch: 1836, Training Loss: 0.04749
Epoch: 1836, Training Loss: 0.04087
Epoch: 1836, Training Loss: 0.05592
Epoch: 1836, Training Loss: 0.04041
Epoch: 1837, Training Loss: 0.04747
Epoch: 1837, Training Loss: 0.04086
Epoch: 1837, Training Loss: 0.05589
Epoch: 1837, Training Loss: 0.04039
Epoch: 1838, Training Loss: 0.04744
Epoch: 1838, Training Loss: 0.04084
Epoch: 1838, Training Loss: 0.05587
Epoch: 1838, Training Loss: 0.04037
Epoch: 1839, Training Loss: 0.04742
Epoch: 1839, Training Loss: 0.04082
Epoch: 1839, Training Loss: 0.05584
Epoch: 1839, Training Loss: 0.04036
Epoch: 1840, Training Loss: 0.04740
Epoch: 1840, Training Loss: 0.04080
Epoch: 1840, Training Loss: 0.05581
Epoch: 1840, Training Loss: 0.04034
Epoch: 1841, Training Loss: 0.04738
Epoch: 1841, Training Loss: 0.04079
Epoch: 1841, Training Loss: 0.05578
Epoch: 1841, Training Loss: 0.04032
Epoch: 1842, Training Loss: 0.04735
Epoch: 1842, Training Loss: 0.04077
Epoch: 1842, Training Loss: 0.05575
Epoch: 1842, Training Loss: 0.04030
Epoch: 1843, Training Loss: 0.04733
Epoch: 1843, Training Loss: 0.04075
Epoch: 1843, Training Loss: 0.05572
Epoch: 1843, Training Loss: 0.04028
Epoch: 1844, Training Loss: 0.04731
Epoch: 1844, Training Loss: 0.04074
Epoch: 1844, Training Loss: 0.05570
Epoch: 1844, Training Loss: 0.04026
Epoch: 1845, Training Loss: 0.04729
Epoch: 1845, Training Loss: 0.04072
Epoch: 1845, Training Loss: 0.05567
Epoch: 1845, Training Loss: 0.04024
Epoch: 1846, Training Loss: 0.04726
Epoch: 1846, Training Loss: 0.04070
Epoch: 1846, Training Loss: 0.05564
Epoch: 1846, Training Loss: 0.04022
Epoch: 1847, Training Loss: 0.04724
Epoch: 1847, Training Loss: 0.04068
Epoch: 1847, Training Loss: 0.05561
Epoch: 1847, Training Loss: 0.04021
Epoch: 1848, Training Loss: 0.04722
Epoch: 1848, Training Loss: 0.04067
Epoch: 1848, Training Loss: 0.05559
Epoch: 1848, Training Loss: 0.04019
Epoch: 1849, Training Loss: 0.04720
Epoch: 1849, Training Loss: 0.04065
Epoch: 1849, Training Loss: 0.05556
Epoch: 1849, Training Loss: 0.04017
Epoch: 1850, Training Loss: 0.04718
Epoch: 1850, Training Loss: 0.04063
Epoch: 1850, Training Loss: 0.05553
Epoch: 1850, Training Loss: 0.04015
Epoch: 1851, Training Loss: 0.04715
Epoch: 1851, Training Loss: 0.04062
Epoch: 1851, Training Loss: 0.05550
Epoch: 1851, Training Loss: 0.04013
Epoch: 1852, Training Loss: 0.04713
Epoch: 1852, Training Loss: 0.04060
Epoch: 1852, Training Loss: 0.05547
Epoch: 1852, Training Loss: 0.04011
Epoch: 1853, Training Loss: 0.04711
Epoch: 1853, Training Loss: 0.04058
Epoch: 1853, Training Loss: 0.05545
Epoch: 1853, Training Loss: 0.04010
Epoch: 1854, Training Loss: 0.04709
Epoch: 1854, Training Loss: 0.04056
Epoch: 1854, Training Loss: 0.05542
Epoch: 1854, Training Loss: 0.04008
Epoch: 1855, Training Loss: 0.04706
Epoch: 1855, Training Loss: 0.04055
Epoch: 1855, Training Loss: 0.05539
Epoch: 1855, Training Loss: 0.04006
Epoch: 1856, Training Loss: 0.04704
Epoch: 1856, Training Loss: 0.04053
Epoch: 1856, Training Loss: 0.05536
Epoch: 1856, Training Loss: 0.04004
Epoch: 1857, Training Loss: 0.04702
Epoch: 1857, Training Loss: 0.04051
Epoch: 1857, Training Loss: 0.05534
Epoch: 1857, Training Loss: 0.04002
Epoch: 1858, Training Loss: 0.04700
Epoch: 1858, Training Loss: 0.04050
Epoch: 1858, Training Loss: 0.05531
Epoch: 1858, Training Loss: 0.04000
Epoch: 1859, Training Loss: 0.04698
Epoch: 1859, Training Loss: 0.04048
Epoch: 1859, Training Loss: 0.05528
Epoch: 1859, Training Loss: 0.03999
Epoch: 1860, Training Loss: 0.04695
Epoch: 1860, Training Loss: 0.04046
Epoch: 1860, Training Loss: 0.05525
Epoch: 1860, Training Loss: 0.03997
Epoch: 1861, Training Loss: 0.04693
Epoch: 1861, Training Loss: 0.04045
Epoch: 1861, Training Loss: 0.05523
Epoch: 1861, Training Loss: 0.03995
Epoch: 1862, Training Loss: 0.04691
Epoch: 1862, Training Loss: 0.04043
Epoch: 1862, Training Loss: 0.05520
Epoch: 1862, Training Loss: 0.03993
Epoch: 1863, Training Loss: 0.04689
Epoch: 1863, Training Loss: 0.04041
Epoch: 1863, Training Loss: 0.05517
Epoch: 1863, Training Loss: 0.03991
Epoch: 1864, Training Loss: 0.04687
Epoch: 1864, Training Loss: 0.04040
Epoch: 1864, Training Loss: 0.05514
Epoch: 1864, Training Loss: 0.03989
Epoch: 1865, Training Loss: 0.04684
Epoch: 1865, Training Loss: 0.04038
Epoch: 1865, Training Loss: 0.05512
Epoch: 1865, Training Loss: 0.03988
Epoch: 1866, Training Loss: 0.04682
Epoch: 1866, Training Loss: 0.04036
Epoch: 1866, Training Loss: 0.05509
Epoch: 1866, Training Loss: 0.03986
Epoch: 1867, Training Loss: 0.04680
Epoch: 1867, Training Loss: 0.04035
Epoch: 1867, Training Loss: 0.05506
Epoch: 1867, Training Loss: 0.03984
Epoch: 1868, Training Loss: 0.04678
Epoch: 1868, Training Loss: 0.04033
Epoch: 1868, Training Loss: 0.05504
Epoch: 1868, Training Loss: 0.03982
Epoch: 1869, Training Loss: 0.04676
Epoch: 1869, Training Loss: 0.04031
Epoch: 1869, Training Loss: 0.05501
Epoch: 1869, Training Loss: 0.03980
Epoch: 1870, Training Loss: 0.04674
Epoch: 1870, Training Loss: 0.04029
Epoch: 1870, Training Loss: 0.05498
Epoch: 1870, Training Loss: 0.03979
Epoch: 1871, Training Loss: 0.04671
Epoch: 1871, Training Loss: 0.04028
Epoch: 1871, Training Loss: 0.05495
Epoch: 1871, Training Loss: 0.03977
Epoch: 1872, Training Loss: 0.04669
Epoch: 1872, Training Loss: 0.04026
Epoch: 1872, Training Loss: 0.05493
Epoch: 1872, Training Loss: 0.03975
Epoch: 1873, Training Loss: 0.04667
Epoch: 1873, Training Loss: 0.04024
Epoch: 1873, Training Loss: 0.05490
Epoch: 1873, Training Loss: 0.03973
Epoch: 1874, Training Loss: 0.04665
Epoch: 1874, Training Loss: 0.04023
Epoch: 1874, Training Loss: 0.05487
Epoch: 1874, Training Loss: 0.03972
Epoch: 1875, Training Loss: 0.04663
Epoch: 1875, Training Loss: 0.04021
Epoch: 1875, Training Loss: 0.05485
Epoch: 1875, Training Loss: 0.03970
Epoch: 1876, Training Loss: 0.04661
Epoch: 1876, Training Loss: 0.04020
Epoch: 1876, Training Loss: 0.05482
Epoch: 1876, Training Loss: 0.03968
Epoch: 1877, Training Loss: 0.04659
Epoch: 1877, Training Loss: 0.04018
Epoch: 1877, Training Loss: 0.05479
Epoch: 1877, Training Loss: 0.03966
Epoch: 1878, Training Loss: 0.04656
Epoch: 1878, Training Loss: 0.04016
Epoch: 1878, Training Loss: 0.05477
Epoch: 1878, Training Loss: 0.03964
Epoch: 1879, Training Loss: 0.04654
Epoch: 1879, Training Loss: 0.04015
Epoch: 1879, Training Loss: 0.05474
Epoch: 1879, Training Loss: 0.03963
Epoch: 1880, Training Loss: 0.04652
Epoch: 1880, Training Loss: 0.04013
Epoch: 1880, Training Loss: 0.05471
Epoch: 1880, Training Loss: 0.03961
Epoch: 1881, Training Loss: 0.04650
Epoch: 1881, Training Loss: 0.04011
Epoch: 1881, Training Loss: 0.05469
Epoch: 1881, Training Loss: 0.03959
Epoch: 1882, Training Loss: 0.04648
Epoch: 1882, Training Loss: 0.04010
Epoch: 1882, Training Loss: 0.05466
Epoch: 1882, Training Loss: 0.03957
Epoch: 1883, Training Loss: 0.04646
Epoch: 1883, Training Loss: 0.04008
Epoch: 1883, Training Loss: 0.05463
Epoch: 1883, Training Loss: 0.03956
Epoch: 1884, Training Loss: 0.04644
Epoch: 1884, Training Loss: 0.04006
Epoch: 1884, Training Loss: 0.05461
Epoch: 1884, Training Loss: 0.03954
Epoch: 1885, Training Loss: 0.04641
Epoch: 1885, Training Loss: 0.04005
Epoch: 1885, Training Loss: 0.05458
Epoch: 1885, Training Loss: 0.03952
Epoch: 1886, Training Loss: 0.04639
Epoch: 1886, Training Loss: 0.04003
Epoch: 1886, Training Loss: 0.05455
Epoch: 1886, Training Loss: 0.03950
Epoch: 1887, Training Loss: 0.04637
Epoch: 1887, Training Loss: 0.04001
Epoch: 1887, Training Loss: 0.05453
Epoch: 1887, Training Loss: 0.03948
Epoch: 1888, Training Loss: 0.04635
Epoch: 1888, Training Loss: 0.04000
Epoch: 1888, Training Loss: 0.05450
Epoch: 1888, Training Loss: 0.03947
Epoch: 1889, Training Loss: 0.04633
Epoch: 1889, Training Loss: 0.03998
Epoch: 1889, Training Loss: 0.05447
Epoch: 1889, Training Loss: 0.03945
Epoch: 1890, Training Loss: 0.04631
Epoch: 1890, Training Loss: 0.03997
Epoch: 1890, Training Loss: 0.05445
Epoch: 1890, Training Loss: 0.03943
Epoch: 1891, Training Loss: 0.04629
Epoch: 1891, Training Loss: 0.03995
Epoch: 1891, Training Loss: 0.05442
Epoch: 1891, Training Loss: 0.03941
Epoch: 1892, Training Loss: 0.04627
Epoch: 1892, Training Loss: 0.03993
Epoch: 1892, Training Loss: 0.05440
Epoch: 1892, Training Loss: 0.03940
Epoch: 1893, Training Loss: 0.04625
Epoch: 1893, Training Loss: 0.03992
Epoch: 1893, Training Loss: 0.05437
Epoch: 1893, Training Loss: 0.03938
Epoch: 1894, Training Loss: 0.04622
Epoch: 1894, Training Loss: 0.03990
Epoch: 1894, Training Loss: 0.05434
Epoch: 1894, Training Loss: 0.03936
Epoch: 1895, Training Loss: 0.04620
Epoch: 1895, Training Loss: 0.03988
Epoch: 1895, Training Loss: 0.05432
Epoch: 1895, Training Loss: 0.03934
Epoch: 1896, Training Loss: 0.04618
Epoch: 1896, Training Loss: 0.03987
Epoch: 1896, Training Loss: 0.05429
Epoch: 1896, Training Loss: 0.03933
Epoch: 1897, Training Loss: 0.04616
Epoch: 1897, Training Loss: 0.03985
Epoch: 1897, Training Loss: 0.05427
Epoch: 1897, Training Loss: 0.03931
Epoch: 1898, Training Loss: 0.04614
Epoch: 1898, Training Loss: 0.03984
Epoch: 1898, Training Loss: 0.05424
Epoch: 1898, Training Loss: 0.03929
Epoch: 1899, Training Loss: 0.04612
Epoch: 1899, Training Loss: 0.03982
Epoch: 1899, Training Loss: 0.05421
Epoch: 1899, Training Loss: 0.03928
Epoch: 1900, Training Loss: 0.04610
Epoch: 1900, Training Loss: 0.03980
Epoch: 1900, Training Loss: 0.05419
Epoch: 1900, Training Loss: 0.03926
Epoch: 1901, Training Loss: 0.04608
Epoch: 1901, Training Loss: 0.03979
Epoch: 1901, Training Loss: 0.05416
Epoch: 1901, Training Loss: 0.03924
Epoch: 1902, Training Loss: 0.04606
Epoch: 1902, Training Loss: 0.03977
Epoch: 1902, Training Loss: 0.05414
Epoch: 1902, Training Loss: 0.03922
Epoch: 1903, Training Loss: 0.04604
Epoch: 1903, Training Loss: 0.03976
Epoch: 1903, Training Loss: 0.05411
Epoch: 1903, Training Loss: 0.03921
Epoch: 1904, Training Loss: 0.04602
Epoch: 1904, Training Loss: 0.03974
Epoch: 1904, Training Loss: 0.05408
Epoch: 1904, Training Loss: 0.03919
Epoch: 1905, Training Loss: 0.04600
Epoch: 1905, Training Loss: 0.03972
Epoch: 1905, Training Loss: 0.05406
Epoch: 1905, Training Loss: 0.03917
Epoch: 1906, Training Loss: 0.04597
Epoch: 1906, Training Loss: 0.03971
Epoch: 1906, Training Loss: 0.05403
Epoch: 1906, Training Loss: 0.03915
Epoch: 1907, Training Loss: 0.04595
Epoch: 1907, Training Loss: 0.03969
Epoch: 1907, Training Loss: 0.05401
Epoch: 1907, Training Loss: 0.03914
Epoch: 1908, Training Loss: 0.04593
Epoch: 1908, Training Loss: 0.03968
Epoch: 1908, Training Loss: 0.05398
Epoch: 1908, Training Loss: 0.03912
Epoch: 1909, Training Loss: 0.04591
Epoch: 1909, Training Loss: 0.03966
Epoch: 1909, Training Loss: 0.05396
Epoch: 1909, Training Loss: 0.03910
Epoch: 1910, Training Loss: 0.04589
Epoch: 1910, Training Loss: 0.03964
Epoch: 1910, Training Loss: 0.05393
Epoch: 1910, Training Loss: 0.03909
Epoch: 1911, Training Loss: 0.04587
Epoch: 1911, Training Loss: 0.03963
Epoch: 1911, Training Loss: 0.05390
Epoch: 1911, Training Loss: 0.03907
Epoch: 1912, Training Loss: 0.04585
Epoch: 1912, Training Loss: 0.03961
Epoch: 1912, Training Loss: 0.05388
Epoch: 1912, Training Loss: 0.03905
Epoch: 1913, Training Loss: 0.04583
Epoch: 1913, Training Loss: 0.03960
Epoch: 1913, Training Loss: 0.05385
Epoch: 1913, Training Loss: 0.03904
Epoch: 1914, Training Loss: 0.04581
Epoch: 1914, Training Loss: 0.03958
Epoch: 1914, Training Loss: 0.05383
Epoch: 1914, Training Loss: 0.03902
Epoch: 1915, Training Loss: 0.04579
Epoch: 1915, Training Loss: 0.03956
Epoch: 1915, Training Loss: 0.05380
Epoch: 1915, Training Loss: 0.03900
Epoch: 1916, Training Loss: 0.04577
Epoch: 1916, Training Loss: 0.03955
Epoch: 1916, Training Loss: 0.05378
Epoch: 1916, Training Loss: 0.03898
Epoch: 1917, Training Loss: 0.04575
Epoch: 1917, Training Loss: 0.03953
Epoch: 1917, Training Loss: 0.05375
Epoch: 1917, Training Loss: 0.03897
Epoch: 1918, Training Loss: 0.04573
Epoch: 1918, Training Loss: 0.03952
Epoch: 1918, Training Loss: 0.05373
Epoch: 1918, Training Loss: 0.03895
Epoch: 1919, Training Loss: 0.04571
Epoch: 1919, Training Loss: 0.03950
Epoch: 1919, Training Loss: 0.05370
Epoch: 1919, Training Loss: 0.03893
Epoch: 1920, Training Loss: 0.04569
Epoch: 1920, Training Loss: 0.03948
Epoch: 1920, Training Loss: 0.05368
Epoch: 1920, Training Loss: 0.03892
Epoch: 1921, Training Loss: 0.04567
Epoch: 1921, Training Loss: 0.03947
Epoch: 1921, Training Loss: 0.05365
Epoch: 1921, Training Loss: 0.03890
Epoch: 1922, Training Loss: 0.04565
Epoch: 1922, Training Loss: 0.03945
Epoch: 1922, Training Loss: 0.05363
Epoch: 1922, Training Loss: 0.03888
Epoch: 1923, Training Loss: 0.04563
Epoch: 1923, Training Loss: 0.03944
Epoch: 1923, Training Loss: 0.05360
Epoch: 1923, Training Loss: 0.03887
Epoch: 1924, Training Loss: 0.04561
Epoch: 1924, Training Loss: 0.03942
Epoch: 1924, Training Loss: 0.05357
Epoch: 1924, Training Loss: 0.03885
Epoch: 1925, Training Loss: 0.04559
Epoch: 1925, Training Loss: 0.03941
Epoch: 1925, Training Loss: 0.05355
Epoch: 1925, Training Loss: 0.03883
Epoch: 1926, Training Loss: 0.04557
Epoch: 1926, Training Loss: 0.03939
Epoch: 1926, Training Loss: 0.05352
Epoch: 1926, Training Loss: 0.03882
Epoch: 1927, Training Loss: 0.04555
Epoch: 1927, Training Loss: 0.03938
Epoch: 1927, Training Loss: 0.05350
Epoch: 1927, Training Loss: 0.03880
Epoch: 1928, Training Loss: 0.04553
Epoch: 1928, Training Loss: 0.03936
Epoch: 1928, Training Loss: 0.05347
Epoch: 1928, Training Loss: 0.03878
Epoch: 1929, Training Loss: 0.04551
Epoch: 1929, Training Loss: 0.03934
Epoch: 1929, Training Loss: 0.05345
Epoch: 1929, Training Loss: 0.03877
Epoch: 1930, Training Loss: 0.04549
Epoch: 1930, Training Loss: 0.03933
Epoch: 1930, Training Loss: 0.05342
Epoch: 1930, Training Loss: 0.03875
Epoch: 1931, Training Loss: 0.04547
Epoch: 1931, Training Loss: 0.03931
Epoch: 1931, Training Loss: 0.05340
Epoch: 1931, Training Loss: 0.03873
Epoch: 1932, Training Loss: 0.04545
Epoch: 1932, Training Loss: 0.03930
Epoch: 1932, Training Loss: 0.05338
Epoch: 1932, Training Loss: 0.03872
Epoch: 1933, Training Loss: 0.04543
Epoch: 1933, Training Loss: 0.03928
Epoch: 1933, Training Loss: 0.05335
Epoch: 1933, Training Loss: 0.03870
Epoch: 1934, Training Loss: 0.04541
Epoch: 1934, Training Loss: 0.03927
Epoch: 1934, Training Loss: 0.05333
Epoch: 1934, Training Loss: 0.03868
Epoch: 1935, Training Loss: 0.04539
Epoch: 1935, Training Loss: 0.03925
Epoch: 1935, Training Loss: 0.05330
Epoch: 1935, Training Loss: 0.03867
Epoch: 1936, Training Loss: 0.04537
Epoch: 1936, Training Loss: 0.03924
Epoch: 1936, Training Loss: 0.05328
Epoch: 1936, Training Loss: 0.03865
Epoch: 1937, Training Loss: 0.04535
Epoch: 1937, Training Loss: 0.03922
Epoch: 1937, Training Loss: 0.05325
Epoch: 1937, Training Loss: 0.03863
Epoch: 1938, Training Loss: 0.04533
Epoch: 1938, Training Loss: 0.03920
Epoch: 1938, Training Loss: 0.05323
Epoch: 1938, Training Loss: 0.03862
Epoch: 1939, Training Loss: 0.04531
Epoch: 1939, Training Loss: 0.03919
Epoch: 1939, Training Loss: 0.05320
Epoch: 1939, Training Loss: 0.03860
Epoch: 1940, Training Loss: 0.04529
Epoch: 1940, Training Loss: 0.03917
Epoch: 1940, Training Loss: 0.05318
Epoch: 1940, Training Loss: 0.03858
Epoch: 1941, Training Loss: 0.04527
Epoch: 1941, Training Loss: 0.03916
Epoch: 1941, Training Loss: 0.05315
Epoch: 1941, Training Loss: 0.03857
Epoch: 1942, Training Loss: 0.04525
Epoch: 1942, Training Loss: 0.03914
Epoch: 1942, Training Loss: 0.05313
Epoch: 1942, Training Loss: 0.03855
Epoch: 1943, Training Loss: 0.04523
Epoch: 1943, Training Loss: 0.03913
Epoch: 1943, Training Loss: 0.05310
Epoch: 1943, Training Loss: 0.03853
Epoch: 1944, Training Loss: 0.04521
Epoch: 1944, Training Loss: 0.03911
Epoch: 1944, Training Loss: 0.05308
Epoch: 1944, Training Loss: 0.03852
Epoch: 1945, Training Loss: 0.04519
Epoch: 1945, Training Loss: 0.03910
Epoch: 1945, Training Loss: 0.05305
Epoch: 1945, Training Loss: 0.03850
Epoch: 1946, Training Loss: 0.04517
Epoch: 1946, Training Loss: 0.03908
Epoch: 1946, Training Loss: 0.05303
Epoch: 1946, Training Loss: 0.03849
Epoch: 1947, Training Loss: 0.04515
Epoch: 1947, Training Loss: 0.03907
Epoch: 1947, Training Loss: 0.05301
Epoch: 1947, Training Loss: 0.03847
Epoch: 1948, Training Loss: 0.04513
Epoch: 1948, Training Loss: 0.03905
Epoch: 1948, Training Loss: 0.05298
Epoch: 1948, Training Loss: 0.03845
Epoch: 1949, Training Loss: 0.04511
Epoch: 1949, Training Loss: 0.03904
Epoch: 1949, Training Loss: 0.05296
Epoch: 1949, Training Loss: 0.03844
Epoch: 1950, Training Loss: 0.04509
Epoch: 1950, Training Loss: 0.03902
Epoch: 1950, Training Loss: 0.05293
Epoch: 1950, Training Loss: 0.03842
Epoch: 1951, Training Loss: 0.04507
Epoch: 1951, Training Loss: 0.03901
Epoch: 1951, Training Loss: 0.05291
Epoch: 1951, Training Loss: 0.03840
Epoch: 1952, Training Loss: 0.04505
Epoch: 1952, Training Loss: 0.03899
Epoch: 1952, Training Loss: 0.05288
Epoch: 1952, Training Loss: 0.03839
Epoch: 1953, Training Loss: 0.04503
Epoch: 1953, Training Loss: 0.03898
Epoch: 1953, Training Loss: 0.05286
Epoch: 1953, Training Loss: 0.03837
Epoch: 1954, Training Loss: 0.04501
Epoch: 1954, Training Loss: 0.03896
Epoch: 1954, Training Loss: 0.05284
Epoch: 1954, Training Loss: 0.03836
Epoch: 1955, Training Loss: 0.04499
Epoch: 1955, Training Loss: 0.03895
Epoch: 1955, Training Loss: 0.05281
Epoch: 1955, Training Loss: 0.03834
Epoch: 1956, Training Loss: 0.04497
Epoch: 1956, Training Loss: 0.03893
Epoch: 1956, Training Loss: 0.05279
Epoch: 1956, Training Loss: 0.03832
Epoch: 1957, Training Loss: 0.04495
Epoch: 1957, Training Loss: 0.03892
Epoch: 1957, Training Loss: 0.05276
Epoch: 1957, Training Loss: 0.03831
Epoch: 1958, Training Loss: 0.04493
Epoch: 1958, Training Loss: 0.03890
Epoch: 1958, Training Loss: 0.05274
Epoch: 1958, Training Loss: 0.03829
Epoch: 1959, Training Loss: 0.04491
Epoch: 1959, Training Loss: 0.03888
Epoch: 1959, Training Loss: 0.05272
Epoch: 1959, Training Loss: 0.03828
Epoch: 1960, Training Loss: 0.04490
Epoch: 1960, Training Loss: 0.03887
Epoch: 1960, Training Loss: 0.05269
Epoch: 1960, Training Loss: 0.03826
Epoch: 1961, Training Loss: 0.04488
Epoch: 1961, Training Loss: 0.03885
Epoch: 1961, Training Loss: 0.05267
Epoch: 1961, Training Loss: 0.03824
Epoch: 1962, Training Loss: 0.04486
Epoch: 1962, Training Loss: 0.03884
Epoch: 1962, Training Loss: 0.05264
Epoch: 1962, Training Loss: 0.03823
Epoch: 1963, Training Loss: 0.04484
Epoch: 1963, Training Loss: 0.03882
Epoch: 1963, Training Loss: 0.05262
Epoch: 1963, Training Loss: 0.03821
Epoch: 1964, Training Loss: 0.04482
Epoch: 1964, Training Loss: 0.03881
Epoch: 1964, Training Loss: 0.05260
Epoch: 1964, Training Loss: 0.03820
Epoch: 1965, Training Loss: 0.04480
Epoch: 1965, Training Loss: 0.03879
Epoch: 1965, Training Loss: 0.05257
Epoch: 1965, Training Loss: 0.03818
Epoch: 1966, Training Loss: 0.04478
Epoch: 1966, Training Loss: 0.03878
Epoch: 1966, Training Loss: 0.05255
Epoch: 1966, Training Loss: 0.03816
Epoch: 1967, Training Loss: 0.04476
Epoch: 1967, Training Loss: 0.03877
Epoch: 1967, Training Loss: 0.05252
Epoch: 1967, Training Loss: 0.03815
Epoch: 1968, Training Loss: 0.04474
Epoch: 1968, Training Loss: 0.03875
Epoch: 1968, Training Loss: 0.05250
Epoch: 1968, Training Loss: 0.03813
Epoch: 1969, Training Loss: 0.04472
Epoch: 1969, Training Loss: 0.03874
Epoch: 1969, Training Loss: 0.05248
Epoch: 1969, Training Loss: 0.03812
Epoch: 1970, Training Loss: 0.04470
Epoch: 1970, Training Loss: 0.03872
Epoch: 1970, Training Loss: 0.05245
Epoch: 1970, Training Loss: 0.03810
Epoch: 1971, Training Loss: 0.04468
Epoch: 1971, Training Loss: 0.03871
Epoch: 1971, Training Loss: 0.05243
Epoch: 1971, Training Loss: 0.03808
Epoch: 1972, Training Loss: 0.04466
Epoch: 1972, Training Loss: 0.03869
Epoch: 1972, Training Loss: 0.05241
Epoch: 1972, Training Loss: 0.03807
Epoch: 1973, Training Loss: 0.04465
Epoch: 1973, Training Loss: 0.03868
Epoch: 1973, Training Loss: 0.05238
Epoch: 1973, Training Loss: 0.03805
Epoch: 1974, Training Loss: 0.04463
Epoch: 1974, Training Loss: 0.03866
Epoch: 1974, Training Loss: 0.05236
Epoch: 1974, Training Loss: 0.03804
Epoch: 1975, Training Loss: 0.04461
Epoch: 1975, Training Loss: 0.03865
Epoch: 1975, Training Loss: 0.05234
Epoch: 1975, Training Loss: 0.03802
Epoch: 1976, Training Loss: 0.04459
Epoch: 1976, Training Loss: 0.03863
Epoch: 1976, Training Loss: 0.05231
Epoch: 1976, Training Loss: 0.03801
Epoch: 1977, Training Loss: 0.04457
Epoch: 1977, Training Loss: 0.03862
Epoch: 1977, Training Loss: 0.05229
Epoch: 1977, Training Loss: 0.03799
Epoch: 1978, Training Loss: 0.04455
Epoch: 1978, Training Loss: 0.03860
Epoch: 1978, Training Loss: 0.05227
Epoch: 1978, Training Loss: 0.03797
Epoch: 1979, Training Loss: 0.04453
Epoch: 1979, Training Loss: 0.03859
Epoch: 1979, Training Loss: 0.05224
Epoch: 1979, Training Loss: 0.03796
Epoch: 1980, Training Loss: 0.04451
Epoch: 1980, Training Loss: 0.03857
Epoch: 1980, Training Loss: 0.05222
Epoch: 1980, Training Loss: 0.03794
Epoch: 1981, Training Loss: 0.04449
Epoch: 1981, Training Loss: 0.03856
Epoch: 1981, Training Loss: 0.05220
Epoch: 1981, Training Loss: 0.03793
Epoch: 1982, Training Loss: 0.04448
Epoch: 1982, Training Loss: 0.03854
Epoch: 1982, Training Loss: 0.05217
Epoch: 1982, Training Loss: 0.03791
Epoch: 1983, Training Loss: 0.04446
Epoch: 1983, Training Loss: 0.03853
Epoch: 1983, Training Loss: 0.05215
Epoch: 1983, Training Loss: 0.03790
Epoch: 1984, Training Loss: 0.04444
Epoch: 1984, Training Loss: 0.03851
Epoch: 1984, Training Loss: 0.05213
Epoch: 1984, Training Loss: 0.03788
Epoch: 1985, Training Loss: 0.04442
Epoch: 1985, Training Loss: 0.03850
Epoch: 1985, Training Loss: 0.05210
Epoch: 1985, Training Loss: 0.03786
Epoch: 1986, Training Loss: 0.04440
Epoch: 1986, Training Loss: 0.03848
Epoch: 1986, Training Loss: 0.05208
Epoch: 1986, Training Loss: 0.03785
Epoch: 1987, Training Loss: 0.04438
Epoch: 1987, Training Loss: 0.03847
Epoch: 1987, Training Loss: 0.05206
Epoch: 1987, Training Loss: 0.03783
... (per-sample loss output for epochs 1988–2235 omitted; the four per-sample losses decrease slowly and monotonically throughout) ...
Epoch: 2236, Training Loss: 0.04032
Epoch: 2236, Training Loss: 0.03527
Epoch: 2236, Training Loss: 0.04706
Epoch: 2236, Training Loss: 0.03446
Epoch: 2237, Training Loss: 0.04031
Epoch: 2237, Training Loss: 0.03526
Epoch: 2237, Training Loss: 0.04705
Epoch: 2237, Training Loss: 0.03445
Epoch: 2238, Training Loss: 0.04030
Epoch: 2238, Training Loss: 0.03525
Epoch: 2238, Training Loss: 0.04703
Epoch: 2238, Training Loss: 0.03444
Epoch: 2239, Training Loss: 0.04028
Epoch: 2239, Training Loss: 0.03524
Epoch: 2239, Training Loss: 0.04701
Epoch: 2239, Training Loss: 0.03443
Epoch: 2240, Training Loss: 0.04027
Epoch: 2240, Training Loss: 0.03523
Epoch: 2240, Training Loss: 0.04699
Epoch: 2240, Training Loss: 0.03442
Epoch: 2241, Training Loss: 0.04025
Epoch: 2241, Training Loss: 0.03522
Epoch: 2241, Training Loss: 0.04698
Epoch: 2241, Training Loss: 0.03440
Epoch: 2242, Training Loss: 0.04024
Epoch: 2242, Training Loss: 0.03520
Epoch: 2242, Training Loss: 0.04696
Epoch: 2242, Training Loss: 0.03439
Epoch: 2243, Training Loss: 0.04022
Epoch: 2243, Training Loss: 0.03519
Epoch: 2243, Training Loss: 0.04694
Epoch: 2243, Training Loss: 0.03438
Epoch: 2244, Training Loss: 0.04021
Epoch: 2244, Training Loss: 0.03518
Epoch: 2244, Training Loss: 0.04692
Epoch: 2244, Training Loss: 0.03437
Epoch: 2245, Training Loss: 0.04020
Epoch: 2245, Training Loss: 0.03517
Epoch: 2245, Training Loss: 0.04691
Epoch: 2245, Training Loss: 0.03436
Epoch: 2246, Training Loss: 0.04018
Epoch: 2246, Training Loss: 0.03516
Epoch: 2246, Training Loss: 0.04689
Epoch: 2246, Training Loss: 0.03435
Epoch: 2247, Training Loss: 0.04017
Epoch: 2247, Training Loss: 0.03515
Epoch: 2247, Training Loss: 0.04687
Epoch: 2247, Training Loss: 0.03433
Epoch: 2248, Training Loss: 0.04015
Epoch: 2248, Training Loss: 0.03514
Epoch: 2248, Training Loss: 0.04686
Epoch: 2248, Training Loss: 0.03432
Epoch: 2249, Training Loss: 0.04014
Epoch: 2249, Training Loss: 0.03513
Epoch: 2249, Training Loss: 0.04684
Epoch: 2249, Training Loss: 0.03431
Epoch: 2250, Training Loss: 0.04013
Epoch: 2250, Training Loss: 0.03511
Epoch: 2250, Training Loss: 0.04682
Epoch: 2250, Training Loss: 0.03430
Epoch: 2251, Training Loss: 0.04011
Epoch: 2251, Training Loss: 0.03510
Epoch: 2251, Training Loss: 0.04680
Epoch: 2251, Training Loss: 0.03429
Epoch: 2252, Training Loss: 0.04010
Epoch: 2252, Training Loss: 0.03509
Epoch: 2252, Training Loss: 0.04679
Epoch: 2252, Training Loss: 0.03428
Epoch: 2253, Training Loss: 0.04008
Epoch: 2253, Training Loss: 0.03508
Epoch: 2253, Training Loss: 0.04677
Epoch: 2253, Training Loss: 0.03426
Epoch: 2254, Training Loss: 0.04007
Epoch: 2254, Training Loss: 0.03507
Epoch: 2254, Training Loss: 0.04675
Epoch: 2254, Training Loss: 0.03425
Epoch: 2255, Training Loss: 0.04006
Epoch: 2255, Training Loss: 0.03506
Epoch: 2255, Training Loss: 0.04674
Epoch: 2255, Training Loss: 0.03424
Epoch: 2256, Training Loss: 0.04004
Epoch: 2256, Training Loss: 0.03505
Epoch: 2256, Training Loss: 0.04672
Epoch: 2256, Training Loss: 0.03423
Epoch: 2257, Training Loss: 0.04003
Epoch: 2257, Training Loss: 0.03504
Epoch: 2257, Training Loss: 0.04670
Epoch: 2257, Training Loss: 0.03422
Epoch: 2258, Training Loss: 0.04001
Epoch: 2258, Training Loss: 0.03502
Epoch: 2258, Training Loss: 0.04668
Epoch: 2258, Training Loss: 0.03421
Epoch: 2259, Training Loss: 0.04000
Epoch: 2259, Training Loss: 0.03501
Epoch: 2259, Training Loss: 0.04667
Epoch: 2259, Training Loss: 0.03419
Epoch: 2260, Training Loss: 0.03999
Epoch: 2260, Training Loss: 0.03500
Epoch: 2260, Training Loss: 0.04665
Epoch: 2260, Training Loss: 0.03418
Epoch: 2261, Training Loss: 0.03997
Epoch: 2261, Training Loss: 0.03499
Epoch: 2261, Training Loss: 0.04663
Epoch: 2261, Training Loss: 0.03417
Epoch: 2262, Training Loss: 0.03996
Epoch: 2262, Training Loss: 0.03498
Epoch: 2262, Training Loss: 0.04662
Epoch: 2262, Training Loss: 0.03416
Epoch: 2263, Training Loss: 0.03994
Epoch: 2263, Training Loss: 0.03497
Epoch: 2263, Training Loss: 0.04660
Epoch: 2263, Training Loss: 0.03415
Epoch: 2264, Training Loss: 0.03993
Epoch: 2264, Training Loss: 0.03496
Epoch: 2264, Training Loss: 0.04658
Epoch: 2264, Training Loss: 0.03414
Epoch: 2265, Training Loss: 0.03992
Epoch: 2265, Training Loss: 0.03495
Epoch: 2265, Training Loss: 0.04657
Epoch: 2265, Training Loss: 0.03412
Epoch: 2266, Training Loss: 0.03990
Epoch: 2266, Training Loss: 0.03494
Epoch: 2266, Training Loss: 0.04655
Epoch: 2266, Training Loss: 0.03411
Epoch: 2267, Training Loss: 0.03989
Epoch: 2267, Training Loss: 0.03493
Epoch: 2267, Training Loss: 0.04653
Epoch: 2267, Training Loss: 0.03410
Epoch: 2268, Training Loss: 0.03988
Epoch: 2268, Training Loss: 0.03491
Epoch: 2268, Training Loss: 0.04652
Epoch: 2268, Training Loss: 0.03409
Epoch: 2269, Training Loss: 0.03986
Epoch: 2269, Training Loss: 0.03490
Epoch: 2269, Training Loss: 0.04650
Epoch: 2269, Training Loss: 0.03408
Epoch: 2270, Training Loss: 0.03985
Epoch: 2270, Training Loss: 0.03489
Epoch: 2270, Training Loss: 0.04648
Epoch: 2270, Training Loss: 0.03407
Epoch: 2271, Training Loss: 0.03983
Epoch: 2271, Training Loss: 0.03488
Epoch: 2271, Training Loss: 0.04647
Epoch: 2271, Training Loss: 0.03406
Epoch: 2272, Training Loss: 0.03982
Epoch: 2272, Training Loss: 0.03487
Epoch: 2272, Training Loss: 0.04645
Epoch: 2272, Training Loss: 0.03404
Epoch: 2273, Training Loss: 0.03981
Epoch: 2273, Training Loss: 0.03486
Epoch: 2273, Training Loss: 0.04643
Epoch: 2273, Training Loss: 0.03403
Epoch: 2274, Training Loss: 0.03979
Epoch: 2274, Training Loss: 0.03485
Epoch: 2274, Training Loss: 0.04642
Epoch: 2274, Training Loss: 0.03402
Epoch: 2275, Training Loss: 0.03978
Epoch: 2275, Training Loss: 0.03484
Epoch: 2275, Training Loss: 0.04640
Epoch: 2275, Training Loss: 0.03401
Epoch: 2276, Training Loss: 0.03977
Epoch: 2276, Training Loss: 0.03483
Epoch: 2276, Training Loss: 0.04638
Epoch: 2276, Training Loss: 0.03400
Epoch: 2277, Training Loss: 0.03975
Epoch: 2277, Training Loss: 0.03482
Epoch: 2277, Training Loss: 0.04636
Epoch: 2277, Training Loss: 0.03399
Epoch: 2278, Training Loss: 0.03974
Epoch: 2278, Training Loss: 0.03480
Epoch: 2278, Training Loss: 0.04635
Epoch: 2278, Training Loss: 0.03398
Epoch: 2279, Training Loss: 0.03973
Epoch: 2279, Training Loss: 0.03479
Epoch: 2279, Training Loss: 0.04633
Epoch: 2279, Training Loss: 0.03397
Epoch: 2280, Training Loss: 0.03971
Epoch: 2280, Training Loss: 0.03478
Epoch: 2280, Training Loss: 0.04631
Epoch: 2280, Training Loss: 0.03395
Epoch: 2281, Training Loss: 0.03970
Epoch: 2281, Training Loss: 0.03477
Epoch: 2281, Training Loss: 0.04630
Epoch: 2281, Training Loss: 0.03394
Epoch: 2282, Training Loss: 0.03968
Epoch: 2282, Training Loss: 0.03476
Epoch: 2282, Training Loss: 0.04628
Epoch: 2282, Training Loss: 0.03393
Epoch: 2283, Training Loss: 0.03967
Epoch: 2283, Training Loss: 0.03475
Epoch: 2283, Training Loss: 0.04627
Epoch: 2283, Training Loss: 0.03392
Epoch: 2284, Training Loss: 0.03966
Epoch: 2284, Training Loss: 0.03474
Epoch: 2284, Training Loss: 0.04625
Epoch: 2284, Training Loss: 0.03391
Epoch: 2285, Training Loss: 0.03964
Epoch: 2285, Training Loss: 0.03473
Epoch: 2285, Training Loss: 0.04623
Epoch: 2285, Training Loss: 0.03390
Epoch: 2286, Training Loss: 0.03963
Epoch: 2286, Training Loss: 0.03472
Epoch: 2286, Training Loss: 0.04622
Epoch: 2286, Training Loss: 0.03389
Epoch: 2287, Training Loss: 0.03962
Epoch: 2287, Training Loss: 0.03471
Epoch: 2287, Training Loss: 0.04620
Epoch: 2287, Training Loss: 0.03387
Epoch: 2288, Training Loss: 0.03960
Epoch: 2288, Training Loss: 0.03470
Epoch: 2288, Training Loss: 0.04618
Epoch: 2288, Training Loss: 0.03386
Epoch: 2289, Training Loss: 0.03959
Epoch: 2289, Training Loss: 0.03469
Epoch: 2289, Training Loss: 0.04617
Epoch: 2289, Training Loss: 0.03385
Epoch: 2290, Training Loss: 0.03958
Epoch: 2290, Training Loss: 0.03467
Epoch: 2290, Training Loss: 0.04615
Epoch: 2290, Training Loss: 0.03384
Epoch: 2291, Training Loss: 0.03956
Epoch: 2291, Training Loss: 0.03466
Epoch: 2291, Training Loss: 0.04613
Epoch: 2291, Training Loss: 0.03383
Epoch: 2292, Training Loss: 0.03955
Epoch: 2292, Training Loss: 0.03465
Epoch: 2292, Training Loss: 0.04612
Epoch: 2292, Training Loss: 0.03382
Epoch: 2293, Training Loss: 0.03954
Epoch: 2293, Training Loss: 0.03464
Epoch: 2293, Training Loss: 0.04610
Epoch: 2293, Training Loss: 0.03381
Epoch: 2294, Training Loss: 0.03952
Epoch: 2294, Training Loss: 0.03463
Epoch: 2294, Training Loss: 0.04608
Epoch: 2294, Training Loss: 0.03380
Epoch: 2295, Training Loss: 0.03951
Epoch: 2295, Training Loss: 0.03462
Epoch: 2295, Training Loss: 0.04607
Epoch: 2295, Training Loss: 0.03379
Epoch: 2296, Training Loss: 0.03950
Epoch: 2296, Training Loss: 0.03461
Epoch: 2296, Training Loss: 0.04605
Epoch: 2296, Training Loss: 0.03377
Epoch: 2297, Training Loss: 0.03948
Epoch: 2297, Training Loss: 0.03460
Epoch: 2297, Training Loss: 0.04603
Epoch: 2297, Training Loss: 0.03376
Epoch: 2298, Training Loss: 0.03947
Epoch: 2298, Training Loss: 0.03459
Epoch: 2298, Training Loss: 0.04602
Epoch: 2298, Training Loss: 0.03375
Epoch: 2299, Training Loss: 0.03946
Epoch: 2299, Training Loss: 0.03458
Epoch: 2299, Training Loss: 0.04600
Epoch: 2299, Training Loss: 0.03374
Epoch: 2300, Training Loss: 0.03944
Epoch: 2300, Training Loss: 0.03457
Epoch: 2300, Training Loss: 0.04599
Epoch: 2300, Training Loss: 0.03373
Epoch: 2301, Training Loss: 0.03943
Epoch: 2301, Training Loss: 0.03456
Epoch: 2301, Training Loss: 0.04597
Epoch: 2301, Training Loss: 0.03372
Epoch: 2302, Training Loss: 0.03942
Epoch: 2302, Training Loss: 0.03455
Epoch: 2302, Training Loss: 0.04595
Epoch: 2302, Training Loss: 0.03371
Epoch: 2303, Training Loss: 0.03940
Epoch: 2303, Training Loss: 0.03454
Epoch: 2303, Training Loss: 0.04594
Epoch: 2303, Training Loss: 0.03370
Epoch: 2304, Training Loss: 0.03939
Epoch: 2304, Training Loss: 0.03452
Epoch: 2304, Training Loss: 0.04592
Epoch: 2304, Training Loss: 0.03369
Epoch: 2305, Training Loss: 0.03938
Epoch: 2305, Training Loss: 0.03451
Epoch: 2305, Training Loss: 0.04590
Epoch: 2305, Training Loss: 0.03367
Epoch: 2306, Training Loss: 0.03936
Epoch: 2306, Training Loss: 0.03450
Epoch: 2306, Training Loss: 0.04589
Epoch: 2306, Training Loss: 0.03366
Epoch: 2307, Training Loss: 0.03935
Epoch: 2307, Training Loss: 0.03449
Epoch: 2307, Training Loss: 0.04587
Epoch: 2307, Training Loss: 0.03365
Epoch: 2308, Training Loss: 0.03934
Epoch: 2308, Training Loss: 0.03448
Epoch: 2308, Training Loss: 0.04586
Epoch: 2308, Training Loss: 0.03364
Epoch: 2309, Training Loss: 0.03932
Epoch: 2309, Training Loss: 0.03447
Epoch: 2309, Training Loss: 0.04584
Epoch: 2309, Training Loss: 0.03363
Epoch: 2310, Training Loss: 0.03931
Epoch: 2310, Training Loss: 0.03446
Epoch: 2310, Training Loss: 0.04582
Epoch: 2310, Training Loss: 0.03362
Epoch: 2311, Training Loss: 0.03930
Epoch: 2311, Training Loss: 0.03445
Epoch: 2311, Training Loss: 0.04581
Epoch: 2311, Training Loss: 0.03361
Epoch: 2312, Training Loss: 0.03928
Epoch: 2312, Training Loss: 0.03444
Epoch: 2312, Training Loss: 0.04579
Epoch: 2312, Training Loss: 0.03360
Epoch: 2313, Training Loss: 0.03927
Epoch: 2313, Training Loss: 0.03443
Epoch: 2313, Training Loss: 0.04578
Epoch: 2313, Training Loss: 0.03359
Epoch: 2314, Training Loss: 0.03926
Epoch: 2314, Training Loss: 0.03442
Epoch: 2314, Training Loss: 0.04576
Epoch: 2314, Training Loss: 0.03358
Epoch: 2315, Training Loss: 0.03924
Epoch: 2315, Training Loss: 0.03441
Epoch: 2315, Training Loss: 0.04574
Epoch: 2315, Training Loss: 0.03356
Epoch: 2316, Training Loss: 0.03923
Epoch: 2316, Training Loss: 0.03440
Epoch: 2316, Training Loss: 0.04573
Epoch: 2316, Training Loss: 0.03355
Epoch: 2317, Training Loss: 0.03922
Epoch: 2317, Training Loss: 0.03439
Epoch: 2317, Training Loss: 0.04571
Epoch: 2317, Training Loss: 0.03354
Epoch: 2318, Training Loss: 0.03920
Epoch: 2318, Training Loss: 0.03438
Epoch: 2318, Training Loss: 0.04570
Epoch: 2318, Training Loss: 0.03353
Epoch: 2319, Training Loss: 0.03919
Epoch: 2319, Training Loss: 0.03437
Epoch: 2319, Training Loss: 0.04568
Epoch: 2319, Training Loss: 0.03352
Epoch: 2320, Training Loss: 0.03918
Epoch: 2320, Training Loss: 0.03436
Epoch: 2320, Training Loss: 0.04566
Epoch: 2320, Training Loss: 0.03351
Epoch: 2321, Training Loss: 0.03916
Epoch: 2321, Training Loss: 0.03435
Epoch: 2321, Training Loss: 0.04565
Epoch: 2321, Training Loss: 0.03350
Epoch: 2322, Training Loss: 0.03915
Epoch: 2322, Training Loss: 0.03433
Epoch: 2322, Training Loss: 0.04563
Epoch: 2322, Training Loss: 0.03349
Epoch: 2323, Training Loss: 0.03914
Epoch: 2323, Training Loss: 0.03432
Epoch: 2323, Training Loss: 0.04562
Epoch: 2323, Training Loss: 0.03348
Epoch: 2324, Training Loss: 0.03912
Epoch: 2324, Training Loss: 0.03431
Epoch: 2324, Training Loss: 0.04560
Epoch: 2324, Training Loss: 0.03347
Epoch: 2325, Training Loss: 0.03911
Epoch: 2325, Training Loss: 0.03430
Epoch: 2325, Training Loss: 0.04558
Epoch: 2325, Training Loss: 0.03346
Epoch: 2326, Training Loss: 0.03910
Epoch: 2326, Training Loss: 0.03429
Epoch: 2326, Training Loss: 0.04557
Epoch: 2326, Training Loss: 0.03344
Epoch: 2327, Training Loss: 0.03909
Epoch: 2327, Training Loss: 0.03428
Epoch: 2327, Training Loss: 0.04555
Epoch: 2327, Training Loss: 0.03343
Epoch: 2328, Training Loss: 0.03907
Epoch: 2328, Training Loss: 0.03427
Epoch: 2328, Training Loss: 0.04554
Epoch: 2328, Training Loss: 0.03342
Epoch: 2329, Training Loss: 0.03906
Epoch: 2329, Training Loss: 0.03426
Epoch: 2329, Training Loss: 0.04552
Epoch: 2329, Training Loss: 0.03341
Epoch: 2330, Training Loss: 0.03905
Epoch: 2330, Training Loss: 0.03425
Epoch: 2330, Training Loss: 0.04550
Epoch: 2330, Training Loss: 0.03340
Epoch: 2331, Training Loss: 0.03903
Epoch: 2331, Training Loss: 0.03424
Epoch: 2331, Training Loss: 0.04549
Epoch: 2331, Training Loss: 0.03339
Epoch: 2332, Training Loss: 0.03902
Epoch: 2332, Training Loss: 0.03423
Epoch: 2332, Training Loss: 0.04547
Epoch: 2332, Training Loss: 0.03338
Epoch: 2333, Training Loss: 0.03901
Epoch: 2333, Training Loss: 0.03422
Epoch: 2333, Training Loss: 0.04546
Epoch: 2333, Training Loss: 0.03337
Epoch: 2334, Training Loss: 0.03900
Epoch: 2334, Training Loss: 0.03421
Epoch: 2334, Training Loss: 0.04544
Epoch: 2334, Training Loss: 0.03336
Epoch: 2335, Training Loss: 0.03898
Epoch: 2335, Training Loss: 0.03420
Epoch: 2335, Training Loss: 0.04543
Epoch: 2335, Training Loss: 0.03335
Epoch: 2336, Training Loss: 0.03897
Epoch: 2336, Training Loss: 0.03419
Epoch: 2336, Training Loss: 0.04541
Epoch: 2336, Training Loss: 0.03334
Epoch: 2337, Training Loss: 0.03896
Epoch: 2337, Training Loss: 0.03418
Epoch: 2337, Training Loss: 0.04539
Epoch: 2337, Training Loss: 0.03333
Epoch: 2338, Training Loss: 0.03894
Epoch: 2338, Training Loss: 0.03417
Epoch: 2338, Training Loss: 0.04538
Epoch: 2338, Training Loss: 0.03332
Epoch: 2339, Training Loss: 0.03893
Epoch: 2339, Training Loss: 0.03416
Epoch: 2339, Training Loss: 0.04536
Epoch: 2339, Training Loss: 0.03330
Epoch: 2340, Training Loss: 0.03892
Epoch: 2340, Training Loss: 0.03415
Epoch: 2340, Training Loss: 0.04535
Epoch: 2340, Training Loss: 0.03329
Epoch: 2341, Training Loss: 0.03890
Epoch: 2341, Training Loss: 0.03414
Epoch: 2341, Training Loss: 0.04533
Epoch: 2341, Training Loss: 0.03328
Epoch: 2342, Training Loss: 0.03889
Epoch: 2342, Training Loss: 0.03413
Epoch: 2342, Training Loss: 0.04532
Epoch: 2342, Training Loss: 0.03327
Epoch: 2343, Training Loss: 0.03888
Epoch: 2343, Training Loss: 0.03412
Epoch: 2343, Training Loss: 0.04530
Epoch: 2343, Training Loss: 0.03326
Epoch: 2344, Training Loss: 0.03887
Epoch: 2344, Training Loss: 0.03411
Epoch: 2344, Training Loss: 0.04528
Epoch: 2344, Training Loss: 0.03325
Epoch: 2345, Training Loss: 0.03885
Epoch: 2345, Training Loss: 0.03410
Epoch: 2345, Training Loss: 0.04527
Epoch: 2345, Training Loss: 0.03324
Epoch: 2346, Training Loss: 0.03884
Epoch: 2346, Training Loss: 0.03409
Epoch: 2346, Training Loss: 0.04525
Epoch: 2346, Training Loss: 0.03323
Epoch: 2347, Training Loss: 0.03883
Epoch: 2347, Training Loss: 0.03408
Epoch: 2347, Training Loss: 0.04524
Epoch: 2347, Training Loss: 0.03322
Epoch: 2348, Training Loss: 0.03882
Epoch: 2348, Training Loss: 0.03407
Epoch: 2348, Training Loss: 0.04522
Epoch: 2348, Training Loss: 0.03321
Epoch: 2349, Training Loss: 0.03880
Epoch: 2349, Training Loss: 0.03406
Epoch: 2349, Training Loss: 0.04521
Epoch: 2349, Training Loss: 0.03320
Epoch: 2350, Training Loss: 0.03879
Epoch: 2350, Training Loss: 0.03404
Epoch: 2350, Training Loss: 0.04519
Epoch: 2350, Training Loss: 0.03319
Epoch: 2351, Training Loss: 0.03878
Epoch: 2351, Training Loss: 0.03403
Epoch: 2351, Training Loss: 0.04518
Epoch: 2351, Training Loss: 0.03318
Epoch: 2352, Training Loss: 0.03876
Epoch: 2352, Training Loss: 0.03402
Epoch: 2352, Training Loss: 0.04516
Epoch: 2352, Training Loss: 0.03317
Epoch: 2353, Training Loss: 0.03875
Epoch: 2353, Training Loss: 0.03401
Epoch: 2353, Training Loss: 0.04515
Epoch: 2353, Training Loss: 0.03316
Epoch: 2354, Training Loss: 0.03874
Epoch: 2354, Training Loss: 0.03400
Epoch: 2354, Training Loss: 0.04513
Epoch: 2354, Training Loss: 0.03314
Epoch: 2355, Training Loss: 0.03873
Epoch: 2355, Training Loss: 0.03399
Epoch: 2355, Training Loss: 0.04511
Epoch: 2355, Training Loss: 0.03313
Epoch: 2356, Training Loss: 0.03871
Epoch: 2356, Training Loss: 0.03398
Epoch: 2356, Training Loss: 0.04510
Epoch: 2356, Training Loss: 0.03312
Epoch: 2357, Training Loss: 0.03870
Epoch: 2357, Training Loss: 0.03397
Epoch: 2357, Training Loss: 0.04508
Epoch: 2357, Training Loss: 0.03311
Epoch: 2358, Training Loss: 0.03869
Epoch: 2358, Training Loss: 0.03396
Epoch: 2358, Training Loss: 0.04507
Epoch: 2358, Training Loss: 0.03310
Epoch: 2359, Training Loss: 0.03868
Epoch: 2359, Training Loss: 0.03395
Epoch: 2359, Training Loss: 0.04505
Epoch: 2359, Training Loss: 0.03309
Epoch: 2360, Training Loss: 0.03866
Epoch: 2360, Training Loss: 0.03394
Epoch: 2360, Training Loss: 0.04504
Epoch: 2360, Training Loss: 0.03308
Epoch: 2361, Training Loss: 0.03865
Epoch: 2361, Training Loss: 0.03393
Epoch: 2361, Training Loss: 0.04502
Epoch: 2361, Training Loss: 0.03307
Epoch: 2362, Training Loss: 0.03864
Epoch: 2362, Training Loss: 0.03392
Epoch: 2362, Training Loss: 0.04501
Epoch: 2362, Training Loss: 0.03306
Epoch: 2363, Training Loss: 0.03863
Epoch: 2363, Training Loss: 0.03391
Epoch: 2363, Training Loss: 0.04499
Epoch: 2363, Training Loss: 0.03305
Epoch: 2364, Training Loss: 0.03861
Epoch: 2364, Training Loss: 0.03390
Epoch: 2364, Training Loss: 0.04498
Epoch: 2364, Training Loss: 0.03304
Epoch: 2365, Training Loss: 0.03860
Epoch: 2365, Training Loss: 0.03389
Epoch: 2365, Training Loss: 0.04496
Epoch: 2365, Training Loss: 0.03303
Epoch: 2366, Training Loss: 0.03859
Epoch: 2366, Training Loss: 0.03388
Epoch: 2366, Training Loss: 0.04495
Epoch: 2366, Training Loss: 0.03302
Epoch: 2367, Training Loss: 0.03858
Epoch: 2367, Training Loss: 0.03387
Epoch: 2367, Training Loss: 0.04493
Epoch: 2367, Training Loss: 0.03301
Epoch: 2368, Training Loss: 0.03856
Epoch: 2368, Training Loss: 0.03386
Epoch: 2368, Training Loss: 0.04492
Epoch: 2368, Training Loss: 0.03300
Epoch: 2369, Training Loss: 0.03855
Epoch: 2369, Training Loss: 0.03385
Epoch: 2369, Training Loss: 0.04490
Epoch: 2369, Training Loss: 0.03299
Epoch: 2370, Training Loss: 0.03854
Epoch: 2370, Training Loss: 0.03384
Epoch: 2370, Training Loss: 0.04488
Epoch: 2370, Training Loss: 0.03298
Epoch: 2371, Training Loss: 0.03853
Epoch: 2371, Training Loss: 0.03383
Epoch: 2371, Training Loss: 0.04487
Epoch: 2371, Training Loss: 0.03297
Epoch: 2372, Training Loss: 0.03851
Epoch: 2372, Training Loss: 0.03382
Epoch: 2372, Training Loss: 0.04485
Epoch: 2372, Training Loss: 0.03296
Epoch: 2373, Training Loss: 0.03850
Epoch: 2373, Training Loss: 0.03381
Epoch: 2373, Training Loss: 0.04484
Epoch: 2373, Training Loss: 0.03295
Epoch: 2374, Training Loss: 0.03849
Epoch: 2374, Training Loss: 0.03380
Epoch: 2374, Training Loss: 0.04482
Epoch: 2374, Training Loss: 0.03294
Epoch: 2375, Training Loss: 0.03848
Epoch: 2375, Training Loss: 0.03379
Epoch: 2375, Training Loss: 0.04481
Epoch: 2375, Training Loss: 0.03293
Epoch: 2376, Training Loss: 0.03846
Epoch: 2376, Training Loss: 0.03378
Epoch: 2376, Training Loss: 0.04479
Epoch: 2376, Training Loss: 0.03292
Epoch: 2377, Training Loss: 0.03845
Epoch: 2377, Training Loss: 0.03377
Epoch: 2377, Training Loss: 0.04478
Epoch: 2377, Training Loss: 0.03290
Epoch: 2378, Training Loss: 0.03844
Epoch: 2378, Training Loss: 0.03376
Epoch: 2378, Training Loss: 0.04476
Epoch: 2378, Training Loss: 0.03289
Epoch: 2379, Training Loss: 0.03843
Epoch: 2379, Training Loss: 0.03375
Epoch: 2379, Training Loss: 0.04475
Epoch: 2379, Training Loss: 0.03288
Epoch: 2380, Training Loss: 0.03841
Epoch: 2380, Training Loss: 0.03374
Epoch: 2380, Training Loss: 0.04473
Epoch: 2380, Training Loss: 0.03287
Epoch: 2381, Training Loss: 0.03840
Epoch: 2381, Training Loss: 0.03373
Epoch: 2381, Training Loss: 0.04472
Epoch: 2381, Training Loss: 0.03286
Epoch: 2382, Training Loss: 0.03839
Epoch: 2382, Training Loss: 0.03372
Epoch: 2382, Training Loss: 0.04470
Epoch: 2382, Training Loss: 0.03285
Epoch: 2383, Training Loss: 0.03838
Epoch: 2383, Training Loss: 0.03371
Epoch: 2383, Training Loss: 0.04469
Epoch: 2383, Training Loss: 0.03284
Epoch: 2384, Training Loss: 0.03836
Epoch: 2384, Training Loss: 0.03370
Epoch: 2384, Training Loss: 0.04467
Epoch: 2384, Training Loss: 0.03283
Epoch: 2385, Training Loss: 0.03835
Epoch: 2385, Training Loss: 0.03369
Epoch: 2385, Training Loss: 0.04466
Epoch: 2385, Training Loss: 0.03282
Epoch: 2386, Training Loss: 0.03834
Epoch: 2386, Training Loss: 0.03368
Epoch: 2386, Training Loss: 0.04464
Epoch: 2386, Training Loss: 0.03281
Epoch: 2387, Training Loss: 0.03833
Epoch: 2387, Training Loss: 0.03367
Epoch: 2387, Training Loss: 0.04463
Epoch: 2387, Training Loss: 0.03280
Epoch: 2388, Training Loss: 0.03831
Epoch: 2388, Training Loss: 0.03366
Epoch: 2388, Training Loss: 0.04461
Epoch: 2388, Training Loss: 0.03279
Epoch: 2389, Training Loss: 0.03830
Epoch: 2389, Training Loss: 0.03365
Epoch: 2389, Training Loss: 0.04460
Epoch: 2389, Training Loss: 0.03278
Epoch: 2390, Training Loss: 0.03829
Epoch: 2390, Training Loss: 0.03364
Epoch: 2390, Training Loss: 0.04458
Epoch: 2390, Training Loss: 0.03277
Epoch: 2391, Training Loss: 0.03828
Epoch: 2391, Training Loss: 0.03363
Epoch: 2391, Training Loss: 0.04457
Epoch: 2391, Training Loss: 0.03276
Epoch: 2392, Training Loss: 0.03827
Epoch: 2392, Training Loss: 0.03362
Epoch: 2392, Training Loss: 0.04455
Epoch: 2392, Training Loss: 0.03275
Epoch: 2393, Training Loss: 0.03825
Epoch: 2393, Training Loss: 0.03361
Epoch: 2393, Training Loss: 0.04454
Epoch: 2393, Training Loss: 0.03274
Epoch: 2394, Training Loss: 0.03824
Epoch: 2394, Training Loss: 0.03360
Epoch: 2394, Training Loss: 0.04452
Epoch: 2394, Training Loss: 0.03273
Epoch: 2395, Training Loss: 0.03823
Epoch: 2395, Training Loss: 0.03359
Epoch: 2395, Training Loss: 0.04451
Epoch: 2395, Training Loss: 0.03272
Epoch: 2396, Training Loss: 0.03822
Epoch: 2396, Training Loss: 0.03358
Epoch: 2396, Training Loss: 0.04449
Epoch: 2396, Training Loss: 0.03271
Epoch: 2397, Training Loss: 0.03820
Epoch: 2397, Training Loss: 0.03357
Epoch: 2397, Training Loss: 0.04448
Epoch: 2397, Training Loss: 0.03270
Epoch: 2398, Training Loss: 0.03819
Epoch: 2398, Training Loss: 0.03356
Epoch: 2398, Training Loss: 0.04446
Epoch: 2398, Training Loss: 0.03269
Epoch: 2399, Training Loss: 0.03818
Epoch: 2399, Training Loss: 0.03355
Epoch: 2399, Training Loss: 0.04445
Epoch: 2399, Training Loss: 0.03268
Epoch: 2400, Training Loss: 0.03817
Epoch: 2400, Training Loss: 0.03354
Epoch: 2400, Training Loss: 0.04444
Epoch: 2400, Training Loss: 0.03267
Epoch: 2401, Training Loss: 0.03816
Epoch: 2401, Training Loss: 0.03354
Epoch: 2401, Training Loss: 0.04442
Epoch: 2401, Training Loss: 0.03266
Epoch: 2402, Training Loss: 0.03814
Epoch: 2402, Training Loss: 0.03353
Epoch: 2402, Training Loss: 0.04441
Epoch: 2402, Training Loss: 0.03265
Epoch: 2403, Training Loss: 0.03813
Epoch: 2403, Training Loss: 0.03352
Epoch: 2403, Training Loss: 0.04439
Epoch: 2403, Training Loss: 0.03264
Epoch: 2404, Training Loss: 0.03812
Epoch: 2404, Training Loss: 0.03351
Epoch: 2404, Training Loss: 0.04438
Epoch: 2404, Training Loss: 0.03263
Epoch: 2405, Training Loss: 0.03811
Epoch: 2405, Training Loss: 0.03350
Epoch: 2405, Training Loss: 0.04436
Epoch: 2405, Training Loss: 0.03262
Epoch: 2406, Training Loss: 0.03809
Epoch: 2406, Training Loss: 0.03349
Epoch: 2406, Training Loss: 0.04435
Epoch: 2406, Training Loss: 0.03261
Epoch: 2407, Training Loss: 0.03808
Epoch: 2407, Training Loss: 0.03348
Epoch: 2407, Training Loss: 0.04433
Epoch: 2407, Training Loss: 0.03260
Epoch: 2408, Training Loss: 0.03807
Epoch: 2408, Training Loss: 0.03347
Epoch: 2408, Training Loss: 0.04432
Epoch: 2408, Training Loss: 0.03259
Epoch: 2409, Training Loss: 0.03806
Epoch: 2409, Training Loss: 0.03346
Epoch: 2409, Training Loss: 0.04430
Epoch: 2409, Training Loss: 0.03258
Epoch: 2410, Training Loss: 0.03805
Epoch: 2410, Training Loss: 0.03345
Epoch: 2410, Training Loss: 0.04429
Epoch: 2410, Training Loss: 0.03257
Epoch: 2411, Training Loss: 0.03803
Epoch: 2411, Training Loss: 0.03344
Epoch: 2411, Training Loss: 0.04427
Epoch: 2411, Training Loss: 0.03256
Epoch: 2412, Training Loss: 0.03802
Epoch: 2412, Training Loss: 0.03343
Epoch: 2412, Training Loss: 0.04426
Epoch: 2412, Training Loss: 0.03255
Epoch: 2413, Training Loss: 0.03801
Epoch: 2413, Training Loss: 0.03342
Epoch: 2413, Training Loss: 0.04424
Epoch: 2413, Training Loss: 0.03254
Epoch: 2414, Training Loss: 0.03800
Epoch: 2414, Training Loss: 0.03341
Epoch: 2414, Training Loss: 0.04423
Epoch: 2414, Training Loss: 0.03253
Epoch: 2415, Training Loss: 0.03799
Epoch: 2415, Training Loss: 0.03340
Epoch: 2415, Training Loss: 0.04422
Epoch: 2415, Training Loss: 0.03252
Epoch: 2416, Training Loss: 0.03797
Epoch: 2416, Training Loss: 0.03339
Epoch: 2416, Training Loss: 0.04420
Epoch: 2416, Training Loss: 0.03251
Epoch: 2417, Training Loss: 0.03796
Epoch: 2417, Training Loss: 0.03338
Epoch: 2417, Training Loss: 0.04419
Epoch: 2417, Training Loss: 0.03250
Epoch: 2418, Training Loss: 0.03795
Epoch: 2418, Training Loss: 0.03337
Epoch: 2418, Training Loss: 0.04417
Epoch: 2418, Training Loss: 0.03249
Epoch: 2419, Training Loss: 0.03794
Epoch: 2419, Training Loss: 0.03336
Epoch: 2419, Training Loss: 0.04416
Epoch: 2419, Training Loss: 0.03248
Epoch: 2420, Training Loss: 0.03793
Epoch: 2420, Training Loss: 0.03335
Epoch: 2420, Training Loss: 0.04414
Epoch: 2420, Training Loss: 0.03247
Epoch: 2421, Training Loss: 0.03791
Epoch: 2421, Training Loss: 0.03334
Epoch: 2421, Training Loss: 0.04413
Epoch: 2421, Training Loss: 0.03246
Epoch: 2422, Training Loss: 0.03790
Epoch: 2422, Training Loss: 0.03333
Epoch: 2422, Training Loss: 0.04411
Epoch: 2422, Training Loss: 0.03245
Epoch: 2423, Training Loss: 0.03789
Epoch: 2423, Training Loss: 0.03332
Epoch: 2423, Training Loss: 0.04410
Epoch: 2423, Training Loss: 0.03244
Epoch: 2424, Training Loss: 0.03788
Epoch: 2424, Training Loss: 0.03331
Epoch: 2424, Training Loss: 0.04408
Epoch: 2424, Training Loss: 0.03243
Epoch: 2425, Training Loss: 0.03787
Epoch: 2425, Training Loss: 0.03330
Epoch: 2425, Training Loss: 0.04407
Epoch: 2425, Training Loss: 0.03242
Epoch: 2426, Training Loss: 0.03786
Epoch: 2426, Training Loss: 0.03329
Epoch: 2426, Training Loss: 0.04406
Epoch: 2426, Training Loss: 0.03241
Epoch: 2427, Training Loss: 0.03784
Epoch: 2427, Training Loss: 0.03328
Epoch: 2427, Training Loss: 0.04404
Epoch: 2427, Training Loss: 0.03240
Epoch: 2428, Training Loss: 0.03783
Epoch: 2428, Training Loss: 0.03327
Epoch: 2428, Training Loss: 0.04403
Epoch: 2428, Training Loss: 0.03239
Epoch: 2429, Training Loss: 0.03782
Epoch: 2429, Training Loss: 0.03326
Epoch: 2429, Training Loss: 0.04401
Epoch: 2429, Training Loss: 0.03238
Epoch: 2430, Training Loss: 0.03781
Epoch: 2430, Training Loss: 0.03325
Epoch: 2430, Training Loss: 0.04400
Epoch: 2430, Training Loss: 0.03237
Epoch: 2431, Training Loss: 0.03780
... (epochs 2431–2681 omitted; four per-sample losses are printed each epoch, declining steadily from ≈0.032–0.044 to ≈0.031–0.041) ...
Epoch: 2681, Training Loss: 0.04077
Epoch: 2681, Training Loss: 0.03014
Epoch: 2682, Training Loss: 0.03513
Epoch: 2682, Training Loss: 0.03108
Epoch: 2682, Training Loss: 0.04076
Epoch: 2682, Training Loss: 0.03014
Epoch: 2683, Training Loss: 0.03512
Epoch: 2683, Training Loss: 0.03107
Epoch: 2683, Training Loss: 0.04074
Epoch: 2683, Training Loss: 0.03013
Epoch: 2684, Training Loss: 0.03511
Epoch: 2684, Training Loss: 0.03106
Epoch: 2684, Training Loss: 0.04073
Epoch: 2684, Training Loss: 0.03012
Epoch: 2685, Training Loss: 0.03510
Epoch: 2685, Training Loss: 0.03105
Epoch: 2685, Training Loss: 0.04072
Epoch: 2685, Training Loss: 0.03011
Epoch: 2686, Training Loss: 0.03509
Epoch: 2686, Training Loss: 0.03105
Epoch: 2686, Training Loss: 0.04071
Epoch: 2686, Training Loss: 0.03010
Epoch: 2687, Training Loss: 0.03508
Epoch: 2687, Training Loss: 0.03104
Epoch: 2687, Training Loss: 0.04070
Epoch: 2687, Training Loss: 0.03010
Epoch: 2688, Training Loss: 0.03507
Epoch: 2688, Training Loss: 0.03103
Epoch: 2688, Training Loss: 0.04069
Epoch: 2688, Training Loss: 0.03009
Epoch: 2689, Training Loss: 0.03506
Epoch: 2689, Training Loss: 0.03102
Epoch: 2689, Training Loss: 0.04068
Epoch: 2689, Training Loss: 0.03008
Epoch: 2690, Training Loss: 0.03505
Epoch: 2690, Training Loss: 0.03101
Epoch: 2690, Training Loss: 0.04066
Epoch: 2690, Training Loss: 0.03007
Epoch: 2691, Training Loss: 0.03504
Epoch: 2691, Training Loss: 0.03101
Epoch: 2691, Training Loss: 0.04065
Epoch: 2691, Training Loss: 0.03006
Epoch: 2692, Training Loss: 0.03503
Epoch: 2692, Training Loss: 0.03100
Epoch: 2692, Training Loss: 0.04064
Epoch: 2692, Training Loss: 0.03006
Epoch: 2693, Training Loss: 0.03502
Epoch: 2693, Training Loss: 0.03099
Epoch: 2693, Training Loss: 0.04063
Epoch: 2693, Training Loss: 0.03005
Epoch: 2694, Training Loss: 0.03501
Epoch: 2694, Training Loss: 0.03098
Epoch: 2694, Training Loss: 0.04062
Epoch: 2694, Training Loss: 0.03004
Epoch: 2695, Training Loss: 0.03500
Epoch: 2695, Training Loss: 0.03098
Epoch: 2695, Training Loss: 0.04061
Epoch: 2695, Training Loss: 0.03003
Epoch: 2696, Training Loss: 0.03499
Epoch: 2696, Training Loss: 0.03097
Epoch: 2696, Training Loss: 0.04060
Epoch: 2696, Training Loss: 0.03002
Epoch: 2697, Training Loss: 0.03498
Epoch: 2697, Training Loss: 0.03096
Epoch: 2697, Training Loss: 0.04058
Epoch: 2697, Training Loss: 0.03002
Epoch: 2698, Training Loss: 0.03497
Epoch: 2698, Training Loss: 0.03095
Epoch: 2698, Training Loss: 0.04057
Epoch: 2698, Training Loss: 0.03001
Epoch: 2699, Training Loss: 0.03496
Epoch: 2699, Training Loss: 0.03094
Epoch: 2699, Training Loss: 0.04056
Epoch: 2699, Training Loss: 0.03000
Epoch: 2700, Training Loss: 0.03495
Epoch: 2700, Training Loss: 0.03094
Epoch: 2700, Training Loss: 0.04055
Epoch: 2700, Training Loss: 0.02999
Epoch: 2701, Training Loss: 0.03495
Epoch: 2701, Training Loss: 0.03093
Epoch: 2701, Training Loss: 0.04054
Epoch: 2701, Training Loss: 0.02998
Epoch: 2702, Training Loss: 0.03494
Epoch: 2702, Training Loss: 0.03092
Epoch: 2702, Training Loss: 0.04053
Epoch: 2702, Training Loss: 0.02998
Epoch: 2703, Training Loss: 0.03493
Epoch: 2703, Training Loss: 0.03091
Epoch: 2703, Training Loss: 0.04052
Epoch: 2703, Training Loss: 0.02997
Epoch: 2704, Training Loss: 0.03492
Epoch: 2704, Training Loss: 0.03091
Epoch: 2704, Training Loss: 0.04050
Epoch: 2704, Training Loss: 0.02996
Epoch: 2705, Training Loss: 0.03491
Epoch: 2705, Training Loss: 0.03090
Epoch: 2705, Training Loss: 0.04049
Epoch: 2705, Training Loss: 0.02995
Epoch: 2706, Training Loss: 0.03490
Epoch: 2706, Training Loss: 0.03089
Epoch: 2706, Training Loss: 0.04048
Epoch: 2706, Training Loss: 0.02995
Epoch: 2707, Training Loss: 0.03489
Epoch: 2707, Training Loss: 0.03088
Epoch: 2707, Training Loss: 0.04047
Epoch: 2707, Training Loss: 0.02994
Epoch: 2708, Training Loss: 0.03488
Epoch: 2708, Training Loss: 0.03088
Epoch: 2708, Training Loss: 0.04046
Epoch: 2708, Training Loss: 0.02993
Epoch: 2709, Training Loss: 0.03487
Epoch: 2709, Training Loss: 0.03087
Epoch: 2709, Training Loss: 0.04045
Epoch: 2709, Training Loss: 0.02992
Epoch: 2710, Training Loss: 0.03486
Epoch: 2710, Training Loss: 0.03086
Epoch: 2710, Training Loss: 0.04044
Epoch: 2710, Training Loss: 0.02991
Epoch: 2711, Training Loss: 0.03485
Epoch: 2711, Training Loss: 0.03085
Epoch: 2711, Training Loss: 0.04043
Epoch: 2711, Training Loss: 0.02991
Epoch: 2712, Training Loss: 0.03484
Epoch: 2712, Training Loss: 0.03084
Epoch: 2712, Training Loss: 0.04041
Epoch: 2712, Training Loss: 0.02990
Epoch: 2713, Training Loss: 0.03483
Epoch: 2713, Training Loss: 0.03084
Epoch: 2713, Training Loss: 0.04040
Epoch: 2713, Training Loss: 0.02989
Epoch: 2714, Training Loss: 0.03482
Epoch: 2714, Training Loss: 0.03083
Epoch: 2714, Training Loss: 0.04039
Epoch: 2714, Training Loss: 0.02988
Epoch: 2715, Training Loss: 0.03481
Epoch: 2715, Training Loss: 0.03082
Epoch: 2715, Training Loss: 0.04038
Epoch: 2715, Training Loss: 0.02987
Epoch: 2716, Training Loss: 0.03480
Epoch: 2716, Training Loss: 0.03081
Epoch: 2716, Training Loss: 0.04037
Epoch: 2716, Training Loss: 0.02987
Epoch: 2717, Training Loss: 0.03480
Epoch: 2717, Training Loss: 0.03081
Epoch: 2717, Training Loss: 0.04036
Epoch: 2717, Training Loss: 0.02986
Epoch: 2718, Training Loss: 0.03479
Epoch: 2718, Training Loss: 0.03080
Epoch: 2718, Training Loss: 0.04035
Epoch: 2718, Training Loss: 0.02985
Epoch: 2719, Training Loss: 0.03478
Epoch: 2719, Training Loss: 0.03079
Epoch: 2719, Training Loss: 0.04034
Epoch: 2719, Training Loss: 0.02984
Epoch: 2720, Training Loss: 0.03477
Epoch: 2720, Training Loss: 0.03078
Epoch: 2720, Training Loss: 0.04032
Epoch: 2720, Training Loss: 0.02984
Epoch: 2721, Training Loss: 0.03476
Epoch: 2721, Training Loss: 0.03078
Epoch: 2721, Training Loss: 0.04031
Epoch: 2721, Training Loss: 0.02983
Epoch: 2722, Training Loss: 0.03475
Epoch: 2722, Training Loss: 0.03077
Epoch: 2722, Training Loss: 0.04030
Epoch: 2722, Training Loss: 0.02982
Epoch: 2723, Training Loss: 0.03474
Epoch: 2723, Training Loss: 0.03076
Epoch: 2723, Training Loss: 0.04029
Epoch: 2723, Training Loss: 0.02981
Epoch: 2724, Training Loss: 0.03473
Epoch: 2724, Training Loss: 0.03075
Epoch: 2724, Training Loss: 0.04028
Epoch: 2724, Training Loss: 0.02980
Epoch: 2725, Training Loss: 0.03472
Epoch: 2725, Training Loss: 0.03075
Epoch: 2725, Training Loss: 0.04027
Epoch: 2725, Training Loss: 0.02980
Epoch: 2726, Training Loss: 0.03471
Epoch: 2726, Training Loss: 0.03074
Epoch: 2726, Training Loss: 0.04026
Epoch: 2726, Training Loss: 0.02979
Epoch: 2727, Training Loss: 0.03470
Epoch: 2727, Training Loss: 0.03073
Epoch: 2727, Training Loss: 0.04025
Epoch: 2727, Training Loss: 0.02978
Epoch: 2728, Training Loss: 0.03469
Epoch: 2728, Training Loss: 0.03072
Epoch: 2728, Training Loss: 0.04023
Epoch: 2728, Training Loss: 0.02977
Epoch: 2729, Training Loss: 0.03468
Epoch: 2729, Training Loss: 0.03072
Epoch: 2729, Training Loss: 0.04022
Epoch: 2729, Training Loss: 0.02977
Epoch: 2730, Training Loss: 0.03467
Epoch: 2730, Training Loss: 0.03071
Epoch: 2730, Training Loss: 0.04021
Epoch: 2730, Training Loss: 0.02976
Epoch: 2731, Training Loss: 0.03467
Epoch: 2731, Training Loss: 0.03070
Epoch: 2731, Training Loss: 0.04020
Epoch: 2731, Training Loss: 0.02975
Epoch: 2732, Training Loss: 0.03466
Epoch: 2732, Training Loss: 0.03069
Epoch: 2732, Training Loss: 0.04019
Epoch: 2732, Training Loss: 0.02974
Epoch: 2733, Training Loss: 0.03465
Epoch: 2733, Training Loss: 0.03068
Epoch: 2733, Training Loss: 0.04018
Epoch: 2733, Training Loss: 0.02974
Epoch: 2734, Training Loss: 0.03464
Epoch: 2734, Training Loss: 0.03068
Epoch: 2734, Training Loss: 0.04017
Epoch: 2734, Training Loss: 0.02973
Epoch: 2735, Training Loss: 0.03463
Epoch: 2735, Training Loss: 0.03067
Epoch: 2735, Training Loss: 0.04016
Epoch: 2735, Training Loss: 0.02972
Epoch: 2736, Training Loss: 0.03462
Epoch: 2736, Training Loss: 0.03066
Epoch: 2736, Training Loss: 0.04015
Epoch: 2736, Training Loss: 0.02971
Epoch: 2737, Training Loss: 0.03461
Epoch: 2737, Training Loss: 0.03065
Epoch: 2737, Training Loss: 0.04014
Epoch: 2737, Training Loss: 0.02970
Epoch: 2738, Training Loss: 0.03460
Epoch: 2738, Training Loss: 0.03065
Epoch: 2738, Training Loss: 0.04012
Epoch: 2738, Training Loss: 0.02970
Epoch: 2739, Training Loss: 0.03459
Epoch: 2739, Training Loss: 0.03064
Epoch: 2739, Training Loss: 0.04011
Epoch: 2739, Training Loss: 0.02969
Epoch: 2740, Training Loss: 0.03458
Epoch: 2740, Training Loss: 0.03063
Epoch: 2740, Training Loss: 0.04010
Epoch: 2740, Training Loss: 0.02968
Epoch: 2741, Training Loss: 0.03457
Epoch: 2741, Training Loss: 0.03062
Epoch: 2741, Training Loss: 0.04009
Epoch: 2741, Training Loss: 0.02967
Epoch: 2742, Training Loss: 0.03456
Epoch: 2742, Training Loss: 0.03062
Epoch: 2742, Training Loss: 0.04008
Epoch: 2742, Training Loss: 0.02967
Epoch: 2743, Training Loss: 0.03456
Epoch: 2743, Training Loss: 0.03061
Epoch: 2743, Training Loss: 0.04007
Epoch: 2743, Training Loss: 0.02966
Epoch: 2744, Training Loss: 0.03455
Epoch: 2744, Training Loss: 0.03060
Epoch: 2744, Training Loss: 0.04006
Epoch: 2744, Training Loss: 0.02965
Epoch: 2745, Training Loss: 0.03454
Epoch: 2745, Training Loss: 0.03059
Epoch: 2745, Training Loss: 0.04005
Epoch: 2745, Training Loss: 0.02964
Epoch: 2746, Training Loss: 0.03453
Epoch: 2746, Training Loss: 0.03059
Epoch: 2746, Training Loss: 0.04004
Epoch: 2746, Training Loss: 0.02964
Epoch: 2747, Training Loss: 0.03452
Epoch: 2747, Training Loss: 0.03058
Epoch: 2747, Training Loss: 0.04003
Epoch: 2747, Training Loss: 0.02963
Epoch: 2748, Training Loss: 0.03451
Epoch: 2748, Training Loss: 0.03057
Epoch: 2748, Training Loss: 0.04001
Epoch: 2748, Training Loss: 0.02962
Epoch: 2749, Training Loss: 0.03450
Epoch: 2749, Training Loss: 0.03056
Epoch: 2749, Training Loss: 0.04000
Epoch: 2749, Training Loss: 0.02961
Epoch: 2750, Training Loss: 0.03449
Epoch: 2750, Training Loss: 0.03056
Epoch: 2750, Training Loss: 0.03999
Epoch: 2750, Training Loss: 0.02961
Epoch: 2751, Training Loss: 0.03448
Epoch: 2751, Training Loss: 0.03055
Epoch: 2751, Training Loss: 0.03998
Epoch: 2751, Training Loss: 0.02960
Epoch: 2752, Training Loss: 0.03447
Epoch: 2752, Training Loss: 0.03054
Epoch: 2752, Training Loss: 0.03997
Epoch: 2752, Training Loss: 0.02959
Epoch: 2753, Training Loss: 0.03446
Epoch: 2753, Training Loss: 0.03053
Epoch: 2753, Training Loss: 0.03996
Epoch: 2753, Training Loss: 0.02958
Epoch: 2754, Training Loss: 0.03445
Epoch: 2754, Training Loss: 0.03053
Epoch: 2754, Training Loss: 0.03995
Epoch: 2754, Training Loss: 0.02958
Epoch: 2755, Training Loss: 0.03445
Epoch: 2755, Training Loss: 0.03052
Epoch: 2755, Training Loss: 0.03994
Epoch: 2755, Training Loss: 0.02957
Epoch: 2756, Training Loss: 0.03444
Epoch: 2756, Training Loss: 0.03051
Epoch: 2756, Training Loss: 0.03993
Epoch: 2756, Training Loss: 0.02956
Epoch: 2757, Training Loss: 0.03443
Epoch: 2757, Training Loss: 0.03051
Epoch: 2757, Training Loss: 0.03992
Epoch: 2757, Training Loss: 0.02955
Epoch: 2758, Training Loss: 0.03442
Epoch: 2758, Training Loss: 0.03050
Epoch: 2758, Training Loss: 0.03991
Epoch: 2758, Training Loss: 0.02954
Epoch: 2759, Training Loss: 0.03441
Epoch: 2759, Training Loss: 0.03049
Epoch: 2759, Training Loss: 0.03989
Epoch: 2759, Training Loss: 0.02954
Epoch: 2760, Training Loss: 0.03440
Epoch: 2760, Training Loss: 0.03048
Epoch: 2760, Training Loss: 0.03988
Epoch: 2760, Training Loss: 0.02953
Epoch: 2761, Training Loss: 0.03439
Epoch: 2761, Training Loss: 0.03048
Epoch: 2761, Training Loss: 0.03987
Epoch: 2761, Training Loss: 0.02952
Epoch: 2762, Training Loss: 0.03438
Epoch: 2762, Training Loss: 0.03047
Epoch: 2762, Training Loss: 0.03986
Epoch: 2762, Training Loss: 0.02951
Epoch: 2763, Training Loss: 0.03437
Epoch: 2763, Training Loss: 0.03046
Epoch: 2763, Training Loss: 0.03985
Epoch: 2763, Training Loss: 0.02951
Epoch: 2764, Training Loss: 0.03436
Epoch: 2764, Training Loss: 0.03045
Epoch: 2764, Training Loss: 0.03984
Epoch: 2764, Training Loss: 0.02950
Epoch: 2765, Training Loss: 0.03436
Epoch: 2765, Training Loss: 0.03045
Epoch: 2765, Training Loss: 0.03983
Epoch: 2765, Training Loss: 0.02949
Epoch: 2766, Training Loss: 0.03435
Epoch: 2766, Training Loss: 0.03044
Epoch: 2766, Training Loss: 0.03982
Epoch: 2766, Training Loss: 0.02948
Epoch: 2767, Training Loss: 0.03434
Epoch: 2767, Training Loss: 0.03043
Epoch: 2767, Training Loss: 0.03981
Epoch: 2767, Training Loss: 0.02948
Epoch: 2768, Training Loss: 0.03433
Epoch: 2768, Training Loss: 0.03042
Epoch: 2768, Training Loss: 0.03980
Epoch: 2768, Training Loss: 0.02947
Epoch: 2769, Training Loss: 0.03432
Epoch: 2769, Training Loss: 0.03042
Epoch: 2769, Training Loss: 0.03979
Epoch: 2769, Training Loss: 0.02946
Epoch: 2770, Training Loss: 0.03431
Epoch: 2770, Training Loss: 0.03041
Epoch: 2770, Training Loss: 0.03978
Epoch: 2770, Training Loss: 0.02945
Epoch: 2771, Training Loss: 0.03430
Epoch: 2771, Training Loss: 0.03040
Epoch: 2771, Training Loss: 0.03976
Epoch: 2771, Training Loss: 0.02945
Epoch: 2772, Training Loss: 0.03429
Epoch: 2772, Training Loss: 0.03039
Epoch: 2772, Training Loss: 0.03975
Epoch: 2772, Training Loss: 0.02944
Epoch: 2773, Training Loss: 0.03428
Epoch: 2773, Training Loss: 0.03039
Epoch: 2773, Training Loss: 0.03974
Epoch: 2773, Training Loss: 0.02943
Epoch: 2774, Training Loss: 0.03427
Epoch: 2774, Training Loss: 0.03038
Epoch: 2774, Training Loss: 0.03973
Epoch: 2774, Training Loss: 0.02942
Epoch: 2775, Training Loss: 0.03427
Epoch: 2775, Training Loss: 0.03037
Epoch: 2775, Training Loss: 0.03972
Epoch: 2775, Training Loss: 0.02942
Epoch: 2776, Training Loss: 0.03426
Epoch: 2776, Training Loss: 0.03037
Epoch: 2776, Training Loss: 0.03971
Epoch: 2776, Training Loss: 0.02941
Epoch: 2777, Training Loss: 0.03425
Epoch: 2777, Training Loss: 0.03036
Epoch: 2777, Training Loss: 0.03970
Epoch: 2777, Training Loss: 0.02940
Epoch: 2778, Training Loss: 0.03424
Epoch: 2778, Training Loss: 0.03035
Epoch: 2778, Training Loss: 0.03969
Epoch: 2778, Training Loss: 0.02940
Epoch: 2779, Training Loss: 0.03423
Epoch: 2779, Training Loss: 0.03034
Epoch: 2779, Training Loss: 0.03968
Epoch: 2779, Training Loss: 0.02939
Epoch: 2780, Training Loss: 0.03422
Epoch: 2780, Training Loss: 0.03034
Epoch: 2780, Training Loss: 0.03967
Epoch: 2780, Training Loss: 0.02938
Epoch: 2781, Training Loss: 0.03421
Epoch: 2781, Training Loss: 0.03033
Epoch: 2781, Training Loss: 0.03966
Epoch: 2781, Training Loss: 0.02937
Epoch: 2782, Training Loss: 0.03420
Epoch: 2782, Training Loss: 0.03032
Epoch: 2782, Training Loss: 0.03965
Epoch: 2782, Training Loss: 0.02937
Epoch: 2783, Training Loss: 0.03419
Epoch: 2783, Training Loss: 0.03031
Epoch: 2783, Training Loss: 0.03964
Epoch: 2783, Training Loss: 0.02936
Epoch: 2784, Training Loss: 0.03419
Epoch: 2784, Training Loss: 0.03031
Epoch: 2784, Training Loss: 0.03963
Epoch: 2784, Training Loss: 0.02935
Epoch: 2785, Training Loss: 0.03418
Epoch: 2785, Training Loss: 0.03030
Epoch: 2785, Training Loss: 0.03961
Epoch: 2785, Training Loss: 0.02934
Epoch: 2786, Training Loss: 0.03417
Epoch: 2786, Training Loss: 0.03029
Epoch: 2786, Training Loss: 0.03960
Epoch: 2786, Training Loss: 0.02934
Epoch: 2787, Training Loss: 0.03416
Epoch: 2787, Training Loss: 0.03028
Epoch: 2787, Training Loss: 0.03959
Epoch: 2787, Training Loss: 0.02933
Epoch: 2788, Training Loss: 0.03415
Epoch: 2788, Training Loss: 0.03028
Epoch: 2788, Training Loss: 0.03958
Epoch: 2788, Training Loss: 0.02932
Epoch: 2789, Training Loss: 0.03414
Epoch: 2789, Training Loss: 0.03027
Epoch: 2789, Training Loss: 0.03957
Epoch: 2789, Training Loss: 0.02931
Epoch: 2790, Training Loss: 0.03413
Epoch: 2790, Training Loss: 0.03026
Epoch: 2790, Training Loss: 0.03956
Epoch: 2790, Training Loss: 0.02931
Epoch: 2791, Training Loss: 0.03412
Epoch: 2791, Training Loss: 0.03026
Epoch: 2791, Training Loss: 0.03955
Epoch: 2791, Training Loss: 0.02930
Epoch: 2792, Training Loss: 0.03412
Epoch: 2792, Training Loss: 0.03025
Epoch: 2792, Training Loss: 0.03954
Epoch: 2792, Training Loss: 0.02929
Epoch: 2793, Training Loss: 0.03411
Epoch: 2793, Training Loss: 0.03024
Epoch: 2793, Training Loss: 0.03953
Epoch: 2793, Training Loss: 0.02928
Epoch: 2794, Training Loss: 0.03410
Epoch: 2794, Training Loss: 0.03023
Epoch: 2794, Training Loss: 0.03952
Epoch: 2794, Training Loss: 0.02928
Epoch: 2795, Training Loss: 0.03409
Epoch: 2795, Training Loss: 0.03023
Epoch: 2795, Training Loss: 0.03951
Epoch: 2795, Training Loss: 0.02927
Epoch: 2796, Training Loss: 0.03408
Epoch: 2796, Training Loss: 0.03022
Epoch: 2796, Training Loss: 0.03950
Epoch: 2796, Training Loss: 0.02926
Epoch: 2797, Training Loss: 0.03407
Epoch: 2797, Training Loss: 0.03021
Epoch: 2797, Training Loss: 0.03949
Epoch: 2797, Training Loss: 0.02925
Epoch: 2798, Training Loss: 0.03406
Epoch: 2798, Training Loss: 0.03021
Epoch: 2798, Training Loss: 0.03948
Epoch: 2798, Training Loss: 0.02925
Epoch: 2799, Training Loss: 0.03405
Epoch: 2799, Training Loss: 0.03020
Epoch: 2799, Training Loss: 0.03947
Epoch: 2799, Training Loss: 0.02924
Epoch: 2800, Training Loss: 0.03405
Epoch: 2800, Training Loss: 0.03019
Epoch: 2800, Training Loss: 0.03946
Epoch: 2800, Training Loss: 0.02923
Epoch: 2801, Training Loss: 0.03404
Epoch: 2801, Training Loss: 0.03018
Epoch: 2801, Training Loss: 0.03945
Epoch: 2801, Training Loss: 0.02923
Epoch: 2802, Training Loss: 0.03403
Epoch: 2802, Training Loss: 0.03018
Epoch: 2802, Training Loss: 0.03944
Epoch: 2802, Training Loss: 0.02922
Epoch: 2803, Training Loss: 0.03402
Epoch: 2803, Training Loss: 0.03017
Epoch: 2803, Training Loss: 0.03942
Epoch: 2803, Training Loss: 0.02921
Epoch: 2804, Training Loss: 0.03401
Epoch: 2804, Training Loss: 0.03016
Epoch: 2804, Training Loss: 0.03941
Epoch: 2804, Training Loss: 0.02920
Epoch: 2805, Training Loss: 0.03400
Epoch: 2805, Training Loss: 0.03016
Epoch: 2805, Training Loss: 0.03940
Epoch: 2805, Training Loss: 0.02920
Epoch: 2806, Training Loss: 0.03399
Epoch: 2806, Training Loss: 0.03015
Epoch: 2806, Training Loss: 0.03939
Epoch: 2806, Training Loss: 0.02919
Epoch: 2807, Training Loss: 0.03398
Epoch: 2807, Training Loss: 0.03014
Epoch: 2807, Training Loss: 0.03938
Epoch: 2807, Training Loss: 0.02918
Epoch: 2808, Training Loss: 0.03398
Epoch: 2808, Training Loss: 0.03013
Epoch: 2808, Training Loss: 0.03937
Epoch: 2808, Training Loss: 0.02917
Epoch: 2809, Training Loss: 0.03397
Epoch: 2809, Training Loss: 0.03013
Epoch: 2809, Training Loss: 0.03936
Epoch: 2809, Training Loss: 0.02917
Epoch: 2810, Training Loss: 0.03396
Epoch: 2810, Training Loss: 0.03012
Epoch: 2810, Training Loss: 0.03935
Epoch: 2810, Training Loss: 0.02916
Epoch: 2811, Training Loss: 0.03395
Epoch: 2811, Training Loss: 0.03011
Epoch: 2811, Training Loss: 0.03934
Epoch: 2811, Training Loss: 0.02915
Epoch: 2812, Training Loss: 0.03394
Epoch: 2812, Training Loss: 0.03010
Epoch: 2812, Training Loss: 0.03933
Epoch: 2812, Training Loss: 0.02915
Epoch: 2813, Training Loss: 0.03393
Epoch: 2813, Training Loss: 0.03010
Epoch: 2813, Training Loss: 0.03932
Epoch: 2813, Training Loss: 0.02914
Epoch: 2814, Training Loss: 0.03392
Epoch: 2814, Training Loss: 0.03009
Epoch: 2814, Training Loss: 0.03931
Epoch: 2814, Training Loss: 0.02913
Epoch: 2815, Training Loss: 0.03391
Epoch: 2815, Training Loss: 0.03008
Epoch: 2815, Training Loss: 0.03930
Epoch: 2815, Training Loss: 0.02912
Epoch: 2816, Training Loss: 0.03391
Epoch: 2816, Training Loss: 0.03008
Epoch: 2816, Training Loss: 0.03929
Epoch: 2816, Training Loss: 0.02912
Epoch: 2817, Training Loss: 0.03390
Epoch: 2817, Training Loss: 0.03007
Epoch: 2817, Training Loss: 0.03928
Epoch: 2817, Training Loss: 0.02911
Epoch: 2818, Training Loss: 0.03389
Epoch: 2818, Training Loss: 0.03006
Epoch: 2818, Training Loss: 0.03927
Epoch: 2818, Training Loss: 0.02910
Epoch: 2819, Training Loss: 0.03388
Epoch: 2819, Training Loss: 0.03006
Epoch: 2819, Training Loss: 0.03926
Epoch: 2819, Training Loss: 0.02909
Epoch: 2820, Training Loss: 0.03387
Epoch: 2820, Training Loss: 0.03005
Epoch: 2820, Training Loss: 0.03925
Epoch: 2820, Training Loss: 0.02909
Epoch: 2821, Training Loss: 0.03386
Epoch: 2821, Training Loss: 0.03004
Epoch: 2821, Training Loss: 0.03924
Epoch: 2821, Training Loss: 0.02908
Epoch: 2822, Training Loss: 0.03385
Epoch: 2822, Training Loss: 0.03003
Epoch: 2822, Training Loss: 0.03923
Epoch: 2822, Training Loss: 0.02907
Epoch: 2823, Training Loss: 0.03385
Epoch: 2823, Training Loss: 0.03003
Epoch: 2823, Training Loss: 0.03922
Epoch: 2823, Training Loss: 0.02907
Epoch: 2824, Training Loss: 0.03384
Epoch: 2824, Training Loss: 0.03002
Epoch: 2824, Training Loss: 0.03921
Epoch: 2824, Training Loss: 0.02906
Epoch: 2825, Training Loss: 0.03383
Epoch: 2825, Training Loss: 0.03001
Epoch: 2825, Training Loss: 0.03920
Epoch: 2825, Training Loss: 0.02905
Epoch: 2826, Training Loss: 0.03382
Epoch: 2826, Training Loss: 0.03001
Epoch: 2826, Training Loss: 0.03919
Epoch: 2826, Training Loss: 0.02904
Epoch: 2827, Training Loss: 0.03381
Epoch: 2827, Training Loss: 0.03000
Epoch: 2827, Training Loss: 0.03918
Epoch: 2827, Training Loss: 0.02904
Epoch: 2828, Training Loss: 0.03380
Epoch: 2828, Training Loss: 0.02999
Epoch: 2828, Training Loss: 0.03916
Epoch: 2828, Training Loss: 0.02903
Epoch: 2829, Training Loss: 0.03379
Epoch: 2829, Training Loss: 0.02998
Epoch: 2829, Training Loss: 0.03915
Epoch: 2829, Training Loss: 0.02902
Epoch: 2830, Training Loss: 0.03379
Epoch: 2830, Training Loss: 0.02998
Epoch: 2830, Training Loss: 0.03914
Epoch: 2830, Training Loss: 0.02902
Epoch: 2831, Training Loss: 0.03378
Epoch: 2831, Training Loss: 0.02997
Epoch: 2831, Training Loss: 0.03913
Epoch: 2831, Training Loss: 0.02901
Epoch: 2832, Training Loss: 0.03377
Epoch: 2832, Training Loss: 0.02996
Epoch: 2832, Training Loss: 0.03912
Epoch: 2832, Training Loss: 0.02900
Epoch: 2833, Training Loss: 0.03376
Epoch: 2833, Training Loss: 0.02996
Epoch: 2833, Training Loss: 0.03911
Epoch: 2833, Training Loss: 0.02899
Epoch: 2834, Training Loss: 0.03375
Epoch: 2834, Training Loss: 0.02995
Epoch: 2834, Training Loss: 0.03910
Epoch: 2834, Training Loss: 0.02899
Epoch: 2835, Training Loss: 0.03374
Epoch: 2835, Training Loss: 0.02994
Epoch: 2835, Training Loss: 0.03909
Epoch: 2835, Training Loss: 0.02898
Epoch: 2836, Training Loss: 0.03373
Epoch: 2836, Training Loss: 0.02993
Epoch: 2836, Training Loss: 0.03908
Epoch: 2836, Training Loss: 0.02897
Epoch: 2837, Training Loss: 0.03373
Epoch: 2837, Training Loss: 0.02993
Epoch: 2837, Training Loss: 0.03907
Epoch: 2837, Training Loss: 0.02897
Epoch: 2838, Training Loss: 0.03372
Epoch: 2838, Training Loss: 0.02992
Epoch: 2838, Training Loss: 0.03906
Epoch: 2838, Training Loss: 0.02896
Epoch: 2839, Training Loss: 0.03371
Epoch: 2839, Training Loss: 0.02991
Epoch: 2839, Training Loss: 0.03905
Epoch: 2839, Training Loss: 0.02895
Epoch: 2840, Training Loss: 0.03370
Epoch: 2840, Training Loss: 0.02991
Epoch: 2840, Training Loss: 0.03904
Epoch: 2840, Training Loss: 0.02894
Epoch: 2841, Training Loss: 0.03369
Epoch: 2841, Training Loss: 0.02990
Epoch: 2841, Training Loss: 0.03903
Epoch: 2841, Training Loss: 0.02894
Epoch: 2842, Training Loss: 0.03368
Epoch: 2842, Training Loss: 0.02989
Epoch: 2842, Training Loss: 0.03902
Epoch: 2842, Training Loss: 0.02893
Epoch: 2843, Training Loss: 0.03367
Epoch: 2843, Training Loss: 0.02989
Epoch: 2843, Training Loss: 0.03901
Epoch: 2843, Training Loss: 0.02892
Epoch: 2844, Training Loss: 0.03367
Epoch: 2844, Training Loss: 0.02988
Epoch: 2844, Training Loss: 0.03900
Epoch: 2844, Training Loss: 0.02892
Epoch: 2845, Training Loss: 0.03366
Epoch: 2845, Training Loss: 0.02987
Epoch: 2845, Training Loss: 0.03899
Epoch: 2845, Training Loss: 0.02891
Epoch: 2846, Training Loss: 0.03365
Epoch: 2846, Training Loss: 0.02987
Epoch: 2846, Training Loss: 0.03898
Epoch: 2846, Training Loss: 0.02890
Epoch: 2847, Training Loss: 0.03364
Epoch: 2847, Training Loss: 0.02986
Epoch: 2847, Training Loss: 0.03897
Epoch: 2847, Training Loss: 0.02889
Epoch: 2848, Training Loss: 0.03363
Epoch: 2848, Training Loss: 0.02985
Epoch: 2848, Training Loss: 0.03896
Epoch: 2848, Training Loss: 0.02889
Epoch: 2849, Training Loss: 0.03362
Epoch: 2849, Training Loss: 0.02984
Epoch: 2849, Training Loss: 0.03895
Epoch: 2849, Training Loss: 0.02888
Epoch: 2850, Training Loss: 0.03361
Epoch: 2850, Training Loss: 0.02984
Epoch: 2850, Training Loss: 0.03894
Epoch: 2850, Training Loss: 0.02887
Epoch: 2851, Training Loss: 0.03361
Epoch: 2851, Training Loss: 0.02983
Epoch: 2851, Training Loss: 0.03893
Epoch: 2851, Training Loss: 0.02887
Epoch: 2852, Training Loss: 0.03360
Epoch: 2852, Training Loss: 0.02982
Epoch: 2852, Training Loss: 0.03892
Epoch: 2852, Training Loss: 0.02886
Epoch: 2853, Training Loss: 0.03359
Epoch: 2853, Training Loss: 0.02982
Epoch: 2853, Training Loss: 0.03891
Epoch: 2853, Training Loss: 0.02885
Epoch: 2854, Training Loss: 0.03358
Epoch: 2854, Training Loss: 0.02981
Epoch: 2854, Training Loss: 0.03890
Epoch: 2854, Training Loss: 0.02885
Epoch: 2855, Training Loss: 0.03357
Epoch: 2855, Training Loss: 0.02980
Epoch: 2855, Training Loss: 0.03889
Epoch: 2855, Training Loss: 0.02884
Epoch: 2856, Training Loss: 0.03356
Epoch: 2856, Training Loss: 0.02980
Epoch: 2856, Training Loss: 0.03888
Epoch: 2856, Training Loss: 0.02883
Epoch: 2857, Training Loss: 0.03356
Epoch: 2857, Training Loss: 0.02979
Epoch: 2857, Training Loss: 0.03887
Epoch: 2857, Training Loss: 0.02882
Epoch: 2858, Training Loss: 0.03355
Epoch: 2858, Training Loss: 0.02978
Epoch: 2858, Training Loss: 0.03886
Epoch: 2858, Training Loss: 0.02882
Epoch: 2859, Training Loss: 0.03354
Epoch: 2859, Training Loss: 0.02977
Epoch: 2859, Training Loss: 0.03885
Epoch: 2859, Training Loss: 0.02881
Epoch: 2860, Training Loss: 0.03353
Epoch: 2860, Training Loss: 0.02977
Epoch: 2860, Training Loss: 0.03884
Epoch: 2860, Training Loss: 0.02880
Epoch: 2861, Training Loss: 0.03352
Epoch: 2861, Training Loss: 0.02976
Epoch: 2861, Training Loss: 0.03883
Epoch: 2861, Training Loss: 0.02880
Epoch: 2862, Training Loss: 0.03351
Epoch: 2862, Training Loss: 0.02975
Epoch: 2862, Training Loss: 0.03882
Epoch: 2862, Training Loss: 0.02879
Epoch: 2863, Training Loss: 0.03351
Epoch: 2863, Training Loss: 0.02975
Epoch: 2863, Training Loss: 0.03881
Epoch: 2863, Training Loss: 0.02878
Epoch: 2864, Training Loss: 0.03350
Epoch: 2864, Training Loss: 0.02974
Epoch: 2864, Training Loss: 0.03880
Epoch: 2864, Training Loss: 0.02878
Epoch: 2865, Training Loss: 0.03349
Epoch: 2865, Training Loss: 0.02973
Epoch: 2865, Training Loss: 0.03879
Epoch: 2865, Training Loss: 0.02877
Epoch: 2866, Training Loss: 0.03348
Epoch: 2866, Training Loss: 0.02973
Epoch: 2866, Training Loss: 0.03878
Epoch: 2866, Training Loss: 0.02876
Epoch: 2867, Training Loss: 0.03347
Epoch: 2867, Training Loss: 0.02972
Epoch: 2867, Training Loss: 0.03877
Epoch: 2867, Training Loss: 0.02875
Epoch: 2868, Training Loss: 0.03346
Epoch: 2868, Training Loss: 0.02971
Epoch: 2868, Training Loss: 0.03876
Epoch: 2868, Training Loss: 0.02875
Epoch: 2869, Training Loss: 0.03346
Epoch: 2869, Training Loss: 0.02971
Epoch: 2869, Training Loss: 0.03875
Epoch: 2869, Training Loss: 0.02874
Epoch: 2870, Training Loss: 0.03345
Epoch: 2870, Training Loss: 0.02970
Epoch: 2870, Training Loss: 0.03874
Epoch: 2870, Training Loss: 0.02873
Epoch: 2871, Training Loss: 0.03344
Epoch: 2871, Training Loss: 0.02969
Epoch: 2871, Training Loss: 0.03873
Epoch: 2871, Training Loss: 0.02873
Epoch: 2872, Training Loss: 0.03343
Epoch: 2872, Training Loss: 0.02969
Epoch: 2872, Training Loss: 0.03872
Epoch: 2872, Training Loss: 0.02872
Epoch: 2873, Training Loss: 0.03342
Epoch: 2873, Training Loss: 0.02968
Epoch: 2873, Training Loss: 0.03871
Epoch: 2873, Training Loss: 0.02871
Epoch: 2874, Training Loss: 0.03341
Epoch: 2874, Training Loss: 0.02967
Epoch: 2874, Training Loss: 0.03870
Epoch: 2874, Training Loss: 0.02871
Epoch: 2875, Training Loss: 0.03341
Epoch: 2875, Training Loss: 0.02966
Epoch: 2875, Training Loss: 0.03869
Epoch: 2875, Training Loss: 0.02870
[... epochs 2876–3124 omitted: four loss values per epoch (one per XOR pattern), all decreasing slowly toward the 0.008 target ...]
Epoch: 3125, Training Loss: 0.03150
Epoch: 3125, Training Loss: 0.02809
Epoch: 3125, Training Loss: 0.03641
Epoch: 3125, Training Loss: 0.02711
Epoch: 3126, Training Loss: 0.03149
Epoch: 3126, Training Loss: 0.02808
Epoch: 3126, Training Loss: 0.03640
Epoch: 3126, Training Loss: 0.02710
Epoch: 3127, Training Loss: 0.03149
Epoch: 3127, Training Loss: 0.02808
Epoch: 3127, Training Loss: 0.03640
Epoch: 3127, Training Loss: 0.02709
Epoch: 3128, Training Loss: 0.03148
Epoch: 3128, Training Loss: 0.02807
Epoch: 3128, Training Loss: 0.03639
Epoch: 3128, Training Loss: 0.02709
Epoch: 3129, Training Loss: 0.03147
Epoch: 3129, Training Loss: 0.02806
Epoch: 3129, Training Loss: 0.03638
Epoch: 3129, Training Loss: 0.02708
Epoch: 3130, Training Loss: 0.03147
Epoch: 3130, Training Loss: 0.02806
Epoch: 3130, Training Loss: 0.03637
Epoch: 3130, Training Loss: 0.02708
Epoch: 3131, Training Loss: 0.03146
Epoch: 3131, Training Loss: 0.02805
Epoch: 3131, Training Loss: 0.03636
Epoch: 3131, Training Loss: 0.02707
Epoch: 3132, Training Loss: 0.03145
Epoch: 3132, Training Loss: 0.02805
Epoch: 3132, Training Loss: 0.03635
Epoch: 3132, Training Loss: 0.02706
Epoch: 3133, Training Loss: 0.03144
Epoch: 3133, Training Loss: 0.02804
Epoch: 3133, Training Loss: 0.03635
Epoch: 3133, Training Loss: 0.02706
Epoch: 3134, Training Loss: 0.03144
Epoch: 3134, Training Loss: 0.02803
Epoch: 3134, Training Loss: 0.03634
Epoch: 3134, Training Loss: 0.02705
Epoch: 3135, Training Loss: 0.03143
Epoch: 3135, Training Loss: 0.02803
Epoch: 3135, Training Loss: 0.03633
Epoch: 3135, Training Loss: 0.02705
Epoch: 3136, Training Loss: 0.03142
Epoch: 3136, Training Loss: 0.02802
Epoch: 3136, Training Loss: 0.03632
Epoch: 3136, Training Loss: 0.02704
Epoch: 3137, Training Loss: 0.03142
Epoch: 3137, Training Loss: 0.02802
Epoch: 3137, Training Loss: 0.03631
Epoch: 3137, Training Loss: 0.02704
Epoch: 3138, Training Loss: 0.03141
Epoch: 3138, Training Loss: 0.02801
Epoch: 3138, Training Loss: 0.03630
Epoch: 3138, Training Loss: 0.02703
Epoch: 3139, Training Loss: 0.03140
Epoch: 3139, Training Loss: 0.02801
Epoch: 3139, Training Loss: 0.03630
Epoch: 3139, Training Loss: 0.02702
Epoch: 3140, Training Loss: 0.03140
Epoch: 3140, Training Loss: 0.02800
Epoch: 3140, Training Loss: 0.03629
Epoch: 3140, Training Loss: 0.02702
Epoch: 3141, Training Loss: 0.03139
Epoch: 3141, Training Loss: 0.02799
Epoch: 3141, Training Loss: 0.03628
Epoch: 3141, Training Loss: 0.02701
Epoch: 3142, Training Loss: 0.03138
Epoch: 3142, Training Loss: 0.02799
Epoch: 3142, Training Loss: 0.03627
Epoch: 3142, Training Loss: 0.02701
Epoch: 3143, Training Loss: 0.03138
Epoch: 3143, Training Loss: 0.02798
Epoch: 3143, Training Loss: 0.03626
Epoch: 3143, Training Loss: 0.02700
Epoch: 3144, Training Loss: 0.03137
Epoch: 3144, Training Loss: 0.02798
Epoch: 3144, Training Loss: 0.03625
Epoch: 3144, Training Loss: 0.02699
Epoch: 3145, Training Loss: 0.03136
Epoch: 3145, Training Loss: 0.02797
Epoch: 3145, Training Loss: 0.03625
Epoch: 3145, Training Loss: 0.02699
Epoch: 3146, Training Loss: 0.03135
Epoch: 3146, Training Loss: 0.02797
Epoch: 3146, Training Loss: 0.03624
Epoch: 3146, Training Loss: 0.02698
Epoch: 3147, Training Loss: 0.03135
Epoch: 3147, Training Loss: 0.02796
Epoch: 3147, Training Loss: 0.03623
Epoch: 3147, Training Loss: 0.02698
Epoch: 3148, Training Loss: 0.03134
Epoch: 3148, Training Loss: 0.02795
Epoch: 3148, Training Loss: 0.03622
Epoch: 3148, Training Loss: 0.02697
Epoch: 3149, Training Loss: 0.03133
Epoch: 3149, Training Loss: 0.02795
Epoch: 3149, Training Loss: 0.03621
Epoch: 3149, Training Loss: 0.02697
Epoch: 3150, Training Loss: 0.03133
Epoch: 3150, Training Loss: 0.02794
Epoch: 3150, Training Loss: 0.03621
Epoch: 3150, Training Loss: 0.02696
Epoch: 3151, Training Loss: 0.03132
Epoch: 3151, Training Loss: 0.02794
Epoch: 3151, Training Loss: 0.03620
Epoch: 3151, Training Loss: 0.02695
Epoch: 3152, Training Loss: 0.03131
Epoch: 3152, Training Loss: 0.02793
Epoch: 3152, Training Loss: 0.03619
Epoch: 3152, Training Loss: 0.02695
Epoch: 3153, Training Loss: 0.03131
Epoch: 3153, Training Loss: 0.02793
Epoch: 3153, Training Loss: 0.03618
Epoch: 3153, Training Loss: 0.02694
Epoch: 3154, Training Loss: 0.03130
Epoch: 3154, Training Loss: 0.02792
Epoch: 3154, Training Loss: 0.03617
Epoch: 3154, Training Loss: 0.02694
Epoch: 3155, Training Loss: 0.03129
Epoch: 3155, Training Loss: 0.02791
Epoch: 3155, Training Loss: 0.03616
Epoch: 3155, Training Loss: 0.02693
Epoch: 3156, Training Loss: 0.03129
Epoch: 3156, Training Loss: 0.02791
Epoch: 3156, Training Loss: 0.03616
Epoch: 3156, Training Loss: 0.02693
Epoch: 3157, Training Loss: 0.03128
Epoch: 3157, Training Loss: 0.02790
Epoch: 3157, Training Loss: 0.03615
Epoch: 3157, Training Loss: 0.02692
Epoch: 3158, Training Loss: 0.03127
Epoch: 3158, Training Loss: 0.02790
Epoch: 3158, Training Loss: 0.03614
Epoch: 3158, Training Loss: 0.02691
Epoch: 3159, Training Loss: 0.03126
Epoch: 3159, Training Loss: 0.02789
Epoch: 3159, Training Loss: 0.03613
Epoch: 3159, Training Loss: 0.02691
Epoch: 3160, Training Loss: 0.03126
Epoch: 3160, Training Loss: 0.02789
Epoch: 3160, Training Loss: 0.03612
Epoch: 3160, Training Loss: 0.02690
Epoch: 3161, Training Loss: 0.03125
Epoch: 3161, Training Loss: 0.02788
Epoch: 3161, Training Loss: 0.03612
Epoch: 3161, Training Loss: 0.02690
Epoch: 3162, Training Loss: 0.03124
Epoch: 3162, Training Loss: 0.02787
Epoch: 3162, Training Loss: 0.03611
Epoch: 3162, Training Loss: 0.02689
Epoch: 3163, Training Loss: 0.03124
Epoch: 3163, Training Loss: 0.02787
Epoch: 3163, Training Loss: 0.03610
Epoch: 3163, Training Loss: 0.02689
Epoch: 3164, Training Loss: 0.03123
Epoch: 3164, Training Loss: 0.02786
Epoch: 3164, Training Loss: 0.03609
Epoch: 3164, Training Loss: 0.02688
Epoch: 3165, Training Loss: 0.03122
Epoch: 3165, Training Loss: 0.02786
Epoch: 3165, Training Loss: 0.03608
Epoch: 3165, Training Loss: 0.02687
Epoch: 3166, Training Loss: 0.03122
Epoch: 3166, Training Loss: 0.02785
Epoch: 3166, Training Loss: 0.03607
Epoch: 3166, Training Loss: 0.02687
Epoch: 3167, Training Loss: 0.03121
Epoch: 3167, Training Loss: 0.02785
Epoch: 3167, Training Loss: 0.03607
Epoch: 3167, Training Loss: 0.02686
Epoch: 3168, Training Loss: 0.03120
Epoch: 3168, Training Loss: 0.02784
Epoch: 3168, Training Loss: 0.03606
Epoch: 3168, Training Loss: 0.02686
Epoch: 3169, Training Loss: 0.03120
Epoch: 3169, Training Loss: 0.02783
Epoch: 3169, Training Loss: 0.03605
Epoch: 3169, Training Loss: 0.02685
Epoch: 3170, Training Loss: 0.03119
Epoch: 3170, Training Loss: 0.02783
Epoch: 3170, Training Loss: 0.03604
Epoch: 3170, Training Loss: 0.02685
Epoch: 3171, Training Loss: 0.03118
Epoch: 3171, Training Loss: 0.02782
Epoch: 3171, Training Loss: 0.03603
Epoch: 3171, Training Loss: 0.02684
Epoch: 3172, Training Loss: 0.03118
Epoch: 3172, Training Loss: 0.02782
Epoch: 3172, Training Loss: 0.03603
Epoch: 3172, Training Loss: 0.02683
Epoch: 3173, Training Loss: 0.03117
Epoch: 3173, Training Loss: 0.02781
Epoch: 3173, Training Loss: 0.03602
Epoch: 3173, Training Loss: 0.02683
Epoch: 3174, Training Loss: 0.03116
Epoch: 3174, Training Loss: 0.02781
Epoch: 3174, Training Loss: 0.03601
Epoch: 3174, Training Loss: 0.02682
Epoch: 3175, Training Loss: 0.03116
Epoch: 3175, Training Loss: 0.02780
Epoch: 3175, Training Loss: 0.03600
Epoch: 3175, Training Loss: 0.02682
Epoch: 3176, Training Loss: 0.03115
Epoch: 3176, Training Loss: 0.02779
Epoch: 3176, Training Loss: 0.03599
Epoch: 3176, Training Loss: 0.02681
Epoch: 3177, Training Loss: 0.03114
Epoch: 3177, Training Loss: 0.02779
Epoch: 3177, Training Loss: 0.03599
Epoch: 3177, Training Loss: 0.02681
Epoch: 3178, Training Loss: 0.03114
Epoch: 3178, Training Loss: 0.02778
Epoch: 3178, Training Loss: 0.03598
Epoch: 3178, Training Loss: 0.02680
Epoch: 3179, Training Loss: 0.03113
Epoch: 3179, Training Loss: 0.02778
Epoch: 3179, Training Loss: 0.03597
Epoch: 3179, Training Loss: 0.02679
Epoch: 3180, Training Loss: 0.03112
Epoch: 3180, Training Loss: 0.02777
Epoch: 3180, Training Loss: 0.03596
Epoch: 3180, Training Loss: 0.02679
Epoch: 3181, Training Loss: 0.03112
Epoch: 3181, Training Loss: 0.02777
Epoch: 3181, Training Loss: 0.03595
Epoch: 3181, Training Loss: 0.02678
Epoch: 3182, Training Loss: 0.03111
Epoch: 3182, Training Loss: 0.02776
Epoch: 3182, Training Loss: 0.03595
Epoch: 3182, Training Loss: 0.02678
Epoch: 3183, Training Loss: 0.03110
Epoch: 3183, Training Loss: 0.02775
Epoch: 3183, Training Loss: 0.03594
Epoch: 3183, Training Loss: 0.02677
Epoch: 3184, Training Loss: 0.03109
Epoch: 3184, Training Loss: 0.02775
Epoch: 3184, Training Loss: 0.03593
Epoch: 3184, Training Loss: 0.02677
Epoch: 3185, Training Loss: 0.03109
Epoch: 3185, Training Loss: 0.02774
Epoch: 3185, Training Loss: 0.03592
Epoch: 3185, Training Loss: 0.02676
Epoch: 3186, Training Loss: 0.03108
Epoch: 3186, Training Loss: 0.02774
Epoch: 3186, Training Loss: 0.03591
Epoch: 3186, Training Loss: 0.02675
Epoch: 3187, Training Loss: 0.03107
Epoch: 3187, Training Loss: 0.02773
Epoch: 3187, Training Loss: 0.03590
Epoch: 3187, Training Loss: 0.02675
Epoch: 3188, Training Loss: 0.03107
Epoch: 3188, Training Loss: 0.02773
Epoch: 3188, Training Loss: 0.03590
Epoch: 3188, Training Loss: 0.02674
Epoch: 3189, Training Loss: 0.03106
Epoch: 3189, Training Loss: 0.02772
Epoch: 3189, Training Loss: 0.03589
Epoch: 3189, Training Loss: 0.02674
Epoch: 3190, Training Loss: 0.03105
Epoch: 3190, Training Loss: 0.02772
Epoch: 3190, Training Loss: 0.03588
Epoch: 3190, Training Loss: 0.02673
Epoch: 3191, Training Loss: 0.03105
Epoch: 3191, Training Loss: 0.02771
Epoch: 3191, Training Loss: 0.03587
Epoch: 3191, Training Loss: 0.02673
Epoch: 3192, Training Loss: 0.03104
Epoch: 3192, Training Loss: 0.02770
Epoch: 3192, Training Loss: 0.03586
Epoch: 3192, Training Loss: 0.02672
Epoch: 3193, Training Loss: 0.03103
Epoch: 3193, Training Loss: 0.02770
Epoch: 3193, Training Loss: 0.03586
Epoch: 3193, Training Loss: 0.02672
Epoch: 3194, Training Loss: 0.03103
Epoch: 3194, Training Loss: 0.02769
Epoch: 3194, Training Loss: 0.03585
Epoch: 3194, Training Loss: 0.02671
Epoch: 3195, Training Loss: 0.03102
Epoch: 3195, Training Loss: 0.02769
Epoch: 3195, Training Loss: 0.03584
Epoch: 3195, Training Loss: 0.02670
Epoch: 3196, Training Loss: 0.03101
Epoch: 3196, Training Loss: 0.02768
Epoch: 3196, Training Loss: 0.03583
Epoch: 3196, Training Loss: 0.02670
Epoch: 3197, Training Loss: 0.03101
Epoch: 3197, Training Loss: 0.02768
Epoch: 3197, Training Loss: 0.03582
Epoch: 3197, Training Loss: 0.02669
Epoch: 3198, Training Loss: 0.03100
Epoch: 3198, Training Loss: 0.02767
Epoch: 3198, Training Loss: 0.03582
Epoch: 3198, Training Loss: 0.02669
Epoch: 3199, Training Loss: 0.03099
Epoch: 3199, Training Loss: 0.02767
Epoch: 3199, Training Loss: 0.03581
Epoch: 3199, Training Loss: 0.02668
Epoch: 3200, Training Loss: 0.03099
Epoch: 3200, Training Loss: 0.02766
Epoch: 3200, Training Loss: 0.03580
Epoch: 3200, Training Loss: 0.02668
Epoch: 3201, Training Loss: 0.03098
Epoch: 3201, Training Loss: 0.02765
Epoch: 3201, Training Loss: 0.03579
Epoch: 3201, Training Loss: 0.02667
Epoch: 3202, Training Loss: 0.03097
Epoch: 3202, Training Loss: 0.02765
Epoch: 3202, Training Loss: 0.03579
Epoch: 3202, Training Loss: 0.02666
Epoch: 3203, Training Loss: 0.03097
Epoch: 3203, Training Loss: 0.02764
Epoch: 3203, Training Loss: 0.03578
Epoch: 3203, Training Loss: 0.02666
Epoch: 3204, Training Loss: 0.03096
Epoch: 3204, Training Loss: 0.02764
Epoch: 3204, Training Loss: 0.03577
Epoch: 3204, Training Loss: 0.02665
Epoch: 3205, Training Loss: 0.03095
Epoch: 3205, Training Loss: 0.02763
Epoch: 3205, Training Loss: 0.03576
Epoch: 3205, Training Loss: 0.02665
Epoch: 3206, Training Loss: 0.03095
Epoch: 3206, Training Loss: 0.02763
Epoch: 3206, Training Loss: 0.03575
Epoch: 3206, Training Loss: 0.02664
Epoch: 3207, Training Loss: 0.03094
Epoch: 3207, Training Loss: 0.02762
Epoch: 3207, Training Loss: 0.03575
Epoch: 3207, Training Loss: 0.02664
Epoch: 3208, Training Loss: 0.03093
Epoch: 3208, Training Loss: 0.02761
Epoch: 3208, Training Loss: 0.03574
Epoch: 3208, Training Loss: 0.02663
Epoch: 3209, Training Loss: 0.03093
Epoch: 3209, Training Loss: 0.02761
Epoch: 3209, Training Loss: 0.03573
Epoch: 3209, Training Loss: 0.02663
Epoch: 3210, Training Loss: 0.03092
Epoch: 3210, Training Loss: 0.02760
Epoch: 3210, Training Loss: 0.03572
Epoch: 3210, Training Loss: 0.02662
Epoch: 3211, Training Loss: 0.03091
Epoch: 3211, Training Loss: 0.02760
Epoch: 3211, Training Loss: 0.03571
Epoch: 3211, Training Loss: 0.02661
Epoch: 3212, Training Loss: 0.03091
Epoch: 3212, Training Loss: 0.02759
Epoch: 3212, Training Loss: 0.03571
Epoch: 3212, Training Loss: 0.02661
Epoch: 3213, Training Loss: 0.03090
Epoch: 3213, Training Loss: 0.02759
Epoch: 3213, Training Loss: 0.03570
Epoch: 3213, Training Loss: 0.02660
Epoch: 3214, Training Loss: 0.03089
Epoch: 3214, Training Loss: 0.02758
Epoch: 3214, Training Loss: 0.03569
Epoch: 3214, Training Loss: 0.02660
Epoch: 3215, Training Loss: 0.03089
Epoch: 3215, Training Loss: 0.02758
Epoch: 3215, Training Loss: 0.03568
Epoch: 3215, Training Loss: 0.02659
Epoch: 3216, Training Loss: 0.03088
Epoch: 3216, Training Loss: 0.02757
Epoch: 3216, Training Loss: 0.03567
Epoch: 3216, Training Loss: 0.02659
Epoch: 3217, Training Loss: 0.03087
Epoch: 3217, Training Loss: 0.02757
Epoch: 3217, Training Loss: 0.03567
Epoch: 3217, Training Loss: 0.02658
Epoch: 3218, Training Loss: 0.03087
Epoch: 3218, Training Loss: 0.02756
Epoch: 3218, Training Loss: 0.03566
Epoch: 3218, Training Loss: 0.02658
Epoch: 3219, Training Loss: 0.03086
Epoch: 3219, Training Loss: 0.02755
Epoch: 3219, Training Loss: 0.03565
Epoch: 3219, Training Loss: 0.02657
Epoch: 3220, Training Loss: 0.03085
Epoch: 3220, Training Loss: 0.02755
Epoch: 3220, Training Loss: 0.03564
Epoch: 3220, Training Loss: 0.02656
Epoch: 3221, Training Loss: 0.03085
Epoch: 3221, Training Loss: 0.02754
Epoch: 3221, Training Loss: 0.03564
Epoch: 3221, Training Loss: 0.02656
Epoch: 3222, Training Loss: 0.03084
Epoch: 3222, Training Loss: 0.02754
Epoch: 3222, Training Loss: 0.03563
Epoch: 3222, Training Loss: 0.02655
Epoch: 3223, Training Loss: 0.03083
Epoch: 3223, Training Loss: 0.02753
Epoch: 3223, Training Loss: 0.03562
Epoch: 3223, Training Loss: 0.02655
Epoch: 3224, Training Loss: 0.03083
Epoch: 3224, Training Loss: 0.02753
Epoch: 3224, Training Loss: 0.03561
Epoch: 3224, Training Loss: 0.02654
Epoch: 3225, Training Loss: 0.03082
Epoch: 3225, Training Loss: 0.02752
Epoch: 3225, Training Loss: 0.03560
Epoch: 3225, Training Loss: 0.02654
Epoch: 3226, Training Loss: 0.03082
Epoch: 3226, Training Loss: 0.02752
Epoch: 3226, Training Loss: 0.03560
Epoch: 3226, Training Loss: 0.02653
Epoch: 3227, Training Loss: 0.03081
Epoch: 3227, Training Loss: 0.02751
Epoch: 3227, Training Loss: 0.03559
Epoch: 3227, Training Loss: 0.02653
Epoch: 3228, Training Loss: 0.03080
Epoch: 3228, Training Loss: 0.02750
Epoch: 3228, Training Loss: 0.03558
Epoch: 3228, Training Loss: 0.02652
Epoch: 3229, Training Loss: 0.03080
Epoch: 3229, Training Loss: 0.02750
Epoch: 3229, Training Loss: 0.03557
Epoch: 3229, Training Loss: 0.02652
Epoch: 3230, Training Loss: 0.03079
Epoch: 3230, Training Loss: 0.02749
Epoch: 3230, Training Loss: 0.03556
Epoch: 3230, Training Loss: 0.02651
Epoch: 3231, Training Loss: 0.03078
Epoch: 3231, Training Loss: 0.02749
Epoch: 3231, Training Loss: 0.03556
Epoch: 3231, Training Loss: 0.02650
Epoch: 3232, Training Loss: 0.03078
Epoch: 3232, Training Loss: 0.02748
Epoch: 3232, Training Loss: 0.03555
Epoch: 3232, Training Loss: 0.02650
Epoch: 3233, Training Loss: 0.03077
Epoch: 3233, Training Loss: 0.02748
Epoch: 3233, Training Loss: 0.03554
Epoch: 3233, Training Loss: 0.02649
Epoch: 3234, Training Loss: 0.03076
Epoch: 3234, Training Loss: 0.02747
Epoch: 3234, Training Loss: 0.03553
Epoch: 3234, Training Loss: 0.02649
Epoch: 3235, Training Loss: 0.03076
Epoch: 3235, Training Loss: 0.02747
Epoch: 3235, Training Loss: 0.03553
Epoch: 3235, Training Loss: 0.02648
Epoch: 3236, Training Loss: 0.03075
Epoch: 3236, Training Loss: 0.02746
Epoch: 3236, Training Loss: 0.03552
Epoch: 3236, Training Loss: 0.02648
Epoch: 3237, Training Loss: 0.03074
Epoch: 3237, Training Loss: 0.02746
Epoch: 3237, Training Loss: 0.03551
Epoch: 3237, Training Loss: 0.02647
Epoch: 3238, Training Loss: 0.03074
Epoch: 3238, Training Loss: 0.02745
Epoch: 3238, Training Loss: 0.03550
Epoch: 3238, Training Loss: 0.02647
Epoch: 3239, Training Loss: 0.03073
Epoch: 3239, Training Loss: 0.02744
Epoch: 3239, Training Loss: 0.03549
Epoch: 3239, Training Loss: 0.02646
Epoch: 3240, Training Loss: 0.03072
Epoch: 3240, Training Loss: 0.02744
Epoch: 3240, Training Loss: 0.03549
Epoch: 3240, Training Loss: 0.02645
Epoch: 3241, Training Loss: 0.03072
Epoch: 3241, Training Loss: 0.02743
Epoch: 3241, Training Loss: 0.03548
Epoch: 3241, Training Loss: 0.02645
Epoch: 3242, Training Loss: 0.03071
Epoch: 3242, Training Loss: 0.02743
Epoch: 3242, Training Loss: 0.03547
Epoch: 3242, Training Loss: 0.02644
Epoch: 3243, Training Loss: 0.03070
Epoch: 3243, Training Loss: 0.02742
Epoch: 3243, Training Loss: 0.03546
Epoch: 3243, Training Loss: 0.02644
Epoch: 3244, Training Loss: 0.03070
Epoch: 3244, Training Loss: 0.02742
Epoch: 3244, Training Loss: 0.03546
Epoch: 3244, Training Loss: 0.02643
Epoch: 3245, Training Loss: 0.03069
Epoch: 3245, Training Loss: 0.02741
Epoch: 3245, Training Loss: 0.03545
Epoch: 3245, Training Loss: 0.02643
Epoch: 3246, Training Loss: 0.03068
Epoch: 3246, Training Loss: 0.02741
Epoch: 3246, Training Loss: 0.03544
Epoch: 3246, Training Loss: 0.02642
Epoch: 3247, Training Loss: 0.03068
Epoch: 3247, Training Loss: 0.02740
Epoch: 3247, Training Loss: 0.03543
Epoch: 3247, Training Loss: 0.02642
Epoch: 3248, Training Loss: 0.03067
Epoch: 3248, Training Loss: 0.02740
Epoch: 3248, Training Loss: 0.03542
Epoch: 3248, Training Loss: 0.02641
Epoch: 3249, Training Loss: 0.03066
Epoch: 3249, Training Loss: 0.02739
Epoch: 3249, Training Loss: 0.03542
Epoch: 3249, Training Loss: 0.02641
Epoch: 3250, Training Loss: 0.03066
Epoch: 3250, Training Loss: 0.02738
Epoch: 3250, Training Loss: 0.03541
Epoch: 3250, Training Loss: 0.02640
Epoch: 3251, Training Loss: 0.03065
Epoch: 3251, Training Loss: 0.02738
Epoch: 3251, Training Loss: 0.03540
Epoch: 3251, Training Loss: 0.02639
Epoch: 3252, Training Loss: 0.03065
Epoch: 3252, Training Loss: 0.02737
Epoch: 3252, Training Loss: 0.03539
Epoch: 3252, Training Loss: 0.02639
Epoch: 3253, Training Loss: 0.03064
Epoch: 3253, Training Loss: 0.02737
Epoch: 3253, Training Loss: 0.03539
Epoch: 3253, Training Loss: 0.02638
Epoch: 3254, Training Loss: 0.03063
Epoch: 3254, Training Loss: 0.02736
Epoch: 3254, Training Loss: 0.03538
Epoch: 3254, Training Loss: 0.02638
Epoch: 3255, Training Loss: 0.03063
Epoch: 3255, Training Loss: 0.02736
Epoch: 3255, Training Loss: 0.03537
Epoch: 3255, Training Loss: 0.02637
Epoch: 3256, Training Loss: 0.03062
Epoch: 3256, Training Loss: 0.02735
Epoch: 3256, Training Loss: 0.03536
Epoch: 3256, Training Loss: 0.02637
Epoch: 3257, Training Loss: 0.03061
Epoch: 3257, Training Loss: 0.02735
Epoch: 3257, Training Loss: 0.03536
Epoch: 3257, Training Loss: 0.02636
Epoch: 3258, Training Loss: 0.03061
Epoch: 3258, Training Loss: 0.02734
Epoch: 3258, Training Loss: 0.03535
Epoch: 3258, Training Loss: 0.02636
Epoch: 3259, Training Loss: 0.03060
Epoch: 3259, Training Loss: 0.02734
Epoch: 3259, Training Loss: 0.03534
Epoch: 3259, Training Loss: 0.02635
Epoch: 3260, Training Loss: 0.03059
Epoch: 3260, Training Loss: 0.02733
Epoch: 3260, Training Loss: 0.03533
Epoch: 3260, Training Loss: 0.02635
Epoch: 3261, Training Loss: 0.03059
Epoch: 3261, Training Loss: 0.02733
Epoch: 3261, Training Loss: 0.03533
Epoch: 3261, Training Loss: 0.02634
Epoch: 3262, Training Loss: 0.03058
Epoch: 3262, Training Loss: 0.02732
Epoch: 3262, Training Loss: 0.03532
Epoch: 3262, Training Loss: 0.02634
Epoch: 3263, Training Loss: 0.03057
Epoch: 3263, Training Loss: 0.02731
Epoch: 3263, Training Loss: 0.03531
Epoch: 3263, Training Loss: 0.02633
Epoch: 3264, Training Loss: 0.03057
Epoch: 3264, Training Loss: 0.02731
Epoch: 3264, Training Loss: 0.03530
Epoch: 3264, Training Loss: 0.02632
Epoch: 3265, Training Loss: 0.03056
Epoch: 3265, Training Loss: 0.02730
Epoch: 3265, Training Loss: 0.03529
Epoch: 3265, Training Loss: 0.02632
Epoch: 3266, Training Loss: 0.03056
Epoch: 3266, Training Loss: 0.02730
Epoch: 3266, Training Loss: 0.03529
Epoch: 3266, Training Loss: 0.02631
Epoch: 3267, Training Loss: 0.03055
Epoch: 3267, Training Loss: 0.02729
Epoch: 3267, Training Loss: 0.03528
Epoch: 3267, Training Loss: 0.02631
Epoch: 3268, Training Loss: 0.03054
Epoch: 3268, Training Loss: 0.02729
Epoch: 3268, Training Loss: 0.03527
Epoch: 3268, Training Loss: 0.02630
Epoch: 3269, Training Loss: 0.03054
Epoch: 3269, Training Loss: 0.02728
Epoch: 3269, Training Loss: 0.03526
Epoch: 3269, Training Loss: 0.02630
Epoch: 3270, Training Loss: 0.03053
Epoch: 3270, Training Loss: 0.02728
Epoch: 3270, Training Loss: 0.03526
Epoch: 3270, Training Loss: 0.02629
Epoch: 3271, Training Loss: 0.03052
Epoch: 3271, Training Loss: 0.02727
Epoch: 3271, Training Loss: 0.03525
Epoch: 3271, Training Loss: 0.02629
Epoch: 3272, Training Loss: 0.03052
Epoch: 3272, Training Loss: 0.02727
Epoch: 3272, Training Loss: 0.03524
Epoch: 3272, Training Loss: 0.02628
Epoch: 3273, Training Loss: 0.03051
Epoch: 3273, Training Loss: 0.02726
Epoch: 3273, Training Loss: 0.03523
Epoch: 3273, Training Loss: 0.02628
Epoch: 3274, Training Loss: 0.03050
Epoch: 3274, Training Loss: 0.02726
Epoch: 3274, Training Loss: 0.03523
Epoch: 3274, Training Loss: 0.02627
Epoch: 3275, Training Loss: 0.03050
Epoch: 3275, Training Loss: 0.02725
Epoch: 3275, Training Loss: 0.03522
Epoch: 3275, Training Loss: 0.02627
Epoch: 3276, Training Loss: 0.03049
Epoch: 3276, Training Loss: 0.02725
Epoch: 3276, Training Loss: 0.03521
Epoch: 3276, Training Loss: 0.02626
Epoch: 3277, Training Loss: 0.03048
Epoch: 3277, Training Loss: 0.02724
Epoch: 3277, Training Loss: 0.03520
Epoch: 3277, Training Loss: 0.02625
Epoch: 3278, Training Loss: 0.03048
Epoch: 3278, Training Loss: 0.02723
Epoch: 3278, Training Loss: 0.03520
Epoch: 3278, Training Loss: 0.02625
Epoch: 3279, Training Loss: 0.03047
Epoch: 3279, Training Loss: 0.02723
Epoch: 3279, Training Loss: 0.03519
Epoch: 3279, Training Loss: 0.02624
Epoch: 3280, Training Loss: 0.03047
Epoch: 3280, Training Loss: 0.02722
Epoch: 3280, Training Loss: 0.03518
Epoch: 3280, Training Loss: 0.02624
Epoch: 3281, Training Loss: 0.03046
Epoch: 3281, Training Loss: 0.02722
Epoch: 3281, Training Loss: 0.03517
Epoch: 3281, Training Loss: 0.02623
Epoch: 3282, Training Loss: 0.03045
Epoch: 3282, Training Loss: 0.02721
Epoch: 3282, Training Loss: 0.03517
Epoch: 3282, Training Loss: 0.02623
Epoch: 3283, Training Loss: 0.03045
Epoch: 3283, Training Loss: 0.02721
Epoch: 3283, Training Loss: 0.03516
Epoch: 3283, Training Loss: 0.02622
Epoch: 3284, Training Loss: 0.03044
Epoch: 3284, Training Loss: 0.02720
Epoch: 3284, Training Loss: 0.03515
Epoch: 3284, Training Loss: 0.02622
Epoch: 3285, Training Loss: 0.03043
Epoch: 3285, Training Loss: 0.02720
Epoch: 3285, Training Loss: 0.03514
Epoch: 3285, Training Loss: 0.02621
Epoch: 3286, Training Loss: 0.03043
Epoch: 3286, Training Loss: 0.02719
Epoch: 3286, Training Loss: 0.03514
Epoch: 3286, Training Loss: 0.02621
Epoch: 3287, Training Loss: 0.03042
Epoch: 3287, Training Loss: 0.02719
Epoch: 3287, Training Loss: 0.03513
Epoch: 3287, Training Loss: 0.02620
Epoch: 3288, Training Loss: 0.03041
Epoch: 3288, Training Loss: 0.02718
Epoch: 3288, Training Loss: 0.03512
Epoch: 3288, Training Loss: 0.02620
Epoch: 3289, Training Loss: 0.03041
Epoch: 3289, Training Loss: 0.02718
Epoch: 3289, Training Loss: 0.03511
Epoch: 3289, Training Loss: 0.02619
Epoch: 3290, Training Loss: 0.03040
Epoch: 3290, Training Loss: 0.02717
Epoch: 3290, Training Loss: 0.03511
Epoch: 3290, Training Loss: 0.02619
Epoch: 3291, Training Loss: 0.03040
Epoch: 3291, Training Loss: 0.02717
Epoch: 3291, Training Loss: 0.03510
Epoch: 3291, Training Loss: 0.02618
Epoch: 3292, Training Loss: 0.03039
Epoch: 3292, Training Loss: 0.02716
Epoch: 3292, Training Loss: 0.03509
Epoch: 3292, Training Loss: 0.02618
Epoch: 3293, Training Loss: 0.03038
Epoch: 3293, Training Loss: 0.02715
Epoch: 3293, Training Loss: 0.03508
Epoch: 3293, Training Loss: 0.02617
Epoch: 3294, Training Loss: 0.03038
Epoch: 3294, Training Loss: 0.02715
Epoch: 3294, Training Loss: 0.03508
Epoch: 3294, Training Loss: 0.02616
Epoch: 3295, Training Loss: 0.03037
Epoch: 3295, Training Loss: 0.02714
Epoch: 3295, Training Loss: 0.03507
Epoch: 3295, Training Loss: 0.02616
Epoch: 3296, Training Loss: 0.03036
Epoch: 3296, Training Loss: 0.02714
Epoch: 3296, Training Loss: 0.03506
Epoch: 3296, Training Loss: 0.02615
Epoch: 3297, Training Loss: 0.03036
Epoch: 3297, Training Loss: 0.02713
Epoch: 3297, Training Loss: 0.03505
Epoch: 3297, Training Loss: 0.02615
Epoch: 3298, Training Loss: 0.03035
Epoch: 3298, Training Loss: 0.02713
Epoch: 3298, Training Loss: 0.03505
Epoch: 3298, Training Loss: 0.02614
Epoch: 3299, Training Loss: 0.03035
Epoch: 3299, Training Loss: 0.02712
Epoch: 3299, Training Loss: 0.03504
Epoch: 3299, Training Loss: 0.02614
Epoch: 3300, Training Loss: 0.03034
Epoch: 3300, Training Loss: 0.02712
Epoch: 3300, Training Loss: 0.03503
Epoch: 3300, Training Loss: 0.02613
Epoch: 3301, Training Loss: 0.03033
Epoch: 3301, Training Loss: 0.02711
Epoch: 3301, Training Loss: 0.03502
Epoch: 3301, Training Loss: 0.02613
Epoch: 3302, Training Loss: 0.03033
Epoch: 3302, Training Loss: 0.02711
Epoch: 3302, Training Loss: 0.03502
Epoch: 3302, Training Loss: 0.02612
Epoch: 3303, Training Loss: 0.03032
Epoch: 3303, Training Loss: 0.02710
Epoch: 3303, Training Loss: 0.03501
Epoch: 3303, Training Loss: 0.02612
Epoch: 3304, Training Loss: 0.03031
Epoch: 3304, Training Loss: 0.02710
Epoch: 3304, Training Loss: 0.03500
Epoch: 3304, Training Loss: 0.02611
Epoch: 3305, Training Loss: 0.03031
Epoch: 3305, Training Loss: 0.02709
Epoch: 3305, Training Loss: 0.03499
Epoch: 3305, Training Loss: 0.02611
Epoch: 3306, Training Loss: 0.03030
Epoch: 3306, Training Loss: 0.02709
Epoch: 3306, Training Loss: 0.03499
Epoch: 3306, Training Loss: 0.02610
Epoch: 3307, Training Loss: 0.03030
Epoch: 3307, Training Loss: 0.02708
Epoch: 3307, Training Loss: 0.03498
Epoch: 3307, Training Loss: 0.02610
Epoch: 3308, Training Loss: 0.03029
Epoch: 3308, Training Loss: 0.02708
Epoch: 3308, Training Loss: 0.03497
Epoch: 3308, Training Loss: 0.02609
Epoch: 3309, Training Loss: 0.03028
Epoch: 3309, Training Loss: 0.02707
Epoch: 3309, Training Loss: 0.03496
Epoch: 3309, Training Loss: 0.02609
Epoch: 3310, Training Loss: 0.03028
Epoch: 3310, Training Loss: 0.02707
Epoch: 3310, Training Loss: 0.03496
Epoch: 3310, Training Loss: 0.02608
Epoch: 3311, Training Loss: 0.03027
Epoch: 3311, Training Loss: 0.02706
Epoch: 3311, Training Loss: 0.03495
Epoch: 3311, Training Loss: 0.02608
Epoch: 3312, Training Loss: 0.03026
Epoch: 3312, Training Loss: 0.02706
Epoch: 3312, Training Loss: 0.03494
Epoch: 3312, Training Loss: 0.02607
Epoch: 3313, Training Loss: 0.03026
Epoch: 3313, Training Loss: 0.02705
Epoch: 3313, Training Loss: 0.03493
Epoch: 3313, Training Loss: 0.02606
Epoch: 3314, Training Loss: 0.03025
Epoch: 3314, Training Loss: 0.02704
Epoch: 3314, Training Loss: 0.03493
Epoch: 3314, Training Loss: 0.02606
Epoch: 3315, Training Loss: 0.03025
Epoch: 3315, Training Loss: 0.02704
Epoch: 3315, Training Loss: 0.03492
Epoch: 3315, Training Loss: 0.02605
Epoch: 3316, Training Loss: 0.03024
Epoch: 3316, Training Loss: 0.02703
Epoch: 3316, Training Loss: 0.03491
Epoch: 3316, Training Loss: 0.02605
Epoch: 3317, Training Loss: 0.03023
Epoch: 3317, Training Loss: 0.02703
Epoch: 3317, Training Loss: 0.03490
Epoch: 3317, Training Loss: 0.02604
Epoch: 3318, Training Loss: 0.03023
Epoch: 3318, Training Loss: 0.02702
Epoch: 3318, Training Loss: 0.03490
Epoch: 3318, Training Loss: 0.02604
Epoch: 3319, Training Loss: 0.03022
Epoch: 3319, Training Loss: 0.02702
Epoch: 3319, Training Loss: 0.03489
Epoch: 3319, Training Loss: 0.02603
Epoch: 3320, Training Loss: 0.03021
Epoch: 3320, Training Loss: 0.02701
Epoch: 3320, Training Loss: 0.03488
Epoch: 3320, Training Loss: 0.02603
Epoch: 3321, Training Loss: 0.03021
Epoch: 3321, Training Loss: 0.02701
Epoch: 3321, Training Loss: 0.03487
Epoch: 3321, Training Loss: 0.02602
[... output abridged: the loss for each of the four XOR patterns is printed every epoch; epochs 3322–3568 are omitted as the losses decrease only gradually, from 0.03020 / 0.02700 / 0.03487 / 0.02602 at epoch 3322 to 0.02878 / 0.02581 / 0.03318 / 0.02482 at epoch 3568 ...]
Epoch: 3569, Training Loss: 0.02877
Epoch: 3569, Training Loss: 0.02580
Epoch: 3569, Training Loss: 0.03317
Epoch: 3569, Training Loss: 0.02482
Epoch: 3570, Training Loss: 0.02877
Epoch: 3570, Training Loss: 0.02580
Epoch: 3570, Training Loss: 0.03317
Epoch: 3570, Training Loss: 0.02481
Epoch: 3571, Training Loss: 0.02876
Epoch: 3571, Training Loss: 0.02579
Epoch: 3571, Training Loss: 0.03316
Epoch: 3571, Training Loss: 0.02481
Epoch: 3572, Training Loss: 0.02876
Epoch: 3572, Training Loss: 0.02579
Epoch: 3572, Training Loss: 0.03315
Epoch: 3572, Training Loss: 0.02481
Epoch: 3573, Training Loss: 0.02875
Epoch: 3573, Training Loss: 0.02578
Epoch: 3573, Training Loss: 0.03315
Epoch: 3573, Training Loss: 0.02480
Epoch: 3574, Training Loss: 0.02875
Epoch: 3574, Training Loss: 0.02578
Epoch: 3574, Training Loss: 0.03314
Epoch: 3574, Training Loss: 0.02480
Epoch: 3575, Training Loss: 0.02874
Epoch: 3575, Training Loss: 0.02578
Epoch: 3575, Training Loss: 0.03313
Epoch: 3575, Training Loss: 0.02479
Epoch: 3576, Training Loss: 0.02874
Epoch: 3576, Training Loss: 0.02577
Epoch: 3576, Training Loss: 0.03313
Epoch: 3576, Training Loss: 0.02479
Epoch: 3577, Training Loss: 0.02873
Epoch: 3577, Training Loss: 0.02577
Epoch: 3577, Training Loss: 0.03312
Epoch: 3577, Training Loss: 0.02478
Epoch: 3578, Training Loss: 0.02872
Epoch: 3578, Training Loss: 0.02576
Epoch: 3578, Training Loss: 0.03312
Epoch: 3578, Training Loss: 0.02478
Epoch: 3579, Training Loss: 0.02872
Epoch: 3579, Training Loss: 0.02576
Epoch: 3579, Training Loss: 0.03311
Epoch: 3579, Training Loss: 0.02477
Epoch: 3580, Training Loss: 0.02871
Epoch: 3580, Training Loss: 0.02575
Epoch: 3580, Training Loss: 0.03310
Epoch: 3580, Training Loss: 0.02477
Epoch: 3581, Training Loss: 0.02871
Epoch: 3581, Training Loss: 0.02575
Epoch: 3581, Training Loss: 0.03310
Epoch: 3581, Training Loss: 0.02476
Epoch: 3582, Training Loss: 0.02870
Epoch: 3582, Training Loss: 0.02574
Epoch: 3582, Training Loss: 0.03309
Epoch: 3582, Training Loss: 0.02476
Epoch: 3583, Training Loss: 0.02870
Epoch: 3583, Training Loss: 0.02574
Epoch: 3583, Training Loss: 0.03308
Epoch: 3583, Training Loss: 0.02476
Epoch: 3584, Training Loss: 0.02869
Epoch: 3584, Training Loss: 0.02573
Epoch: 3584, Training Loss: 0.03308
Epoch: 3584, Training Loss: 0.02475
Epoch: 3585, Training Loss: 0.02869
Epoch: 3585, Training Loss: 0.02573
Epoch: 3585, Training Loss: 0.03307
Epoch: 3585, Training Loss: 0.02475
Epoch: 3586, Training Loss: 0.02868
Epoch: 3586, Training Loss: 0.02573
Epoch: 3586, Training Loss: 0.03306
Epoch: 3586, Training Loss: 0.02474
Epoch: 3587, Training Loss: 0.02868
Epoch: 3587, Training Loss: 0.02572
Epoch: 3587, Training Loss: 0.03306
Epoch: 3587, Training Loss: 0.02474
Epoch: 3588, Training Loss: 0.02867
Epoch: 3588, Training Loss: 0.02572
Epoch: 3588, Training Loss: 0.03305
Epoch: 3588, Training Loss: 0.02473
Epoch: 3589, Training Loss: 0.02867
Epoch: 3589, Training Loss: 0.02571
Epoch: 3589, Training Loss: 0.03305
Epoch: 3589, Training Loss: 0.02473
Epoch: 3590, Training Loss: 0.02866
Epoch: 3590, Training Loss: 0.02571
Epoch: 3590, Training Loss: 0.03304
Epoch: 3590, Training Loss: 0.02472
Epoch: 3591, Training Loss: 0.02865
Epoch: 3591, Training Loss: 0.02570
Epoch: 3591, Training Loss: 0.03303
Epoch: 3591, Training Loss: 0.02472
Epoch: 3592, Training Loss: 0.02865
Epoch: 3592, Training Loss: 0.02570
Epoch: 3592, Training Loss: 0.03303
Epoch: 3592, Training Loss: 0.02472
Epoch: 3593, Training Loss: 0.02864
Epoch: 3593, Training Loss: 0.02569
Epoch: 3593, Training Loss: 0.03302
Epoch: 3593, Training Loss: 0.02471
Epoch: 3594, Training Loss: 0.02864
Epoch: 3594, Training Loss: 0.02569
Epoch: 3594, Training Loss: 0.03301
Epoch: 3594, Training Loss: 0.02471
Epoch: 3595, Training Loss: 0.02863
Epoch: 3595, Training Loss: 0.02569
Epoch: 3595, Training Loss: 0.03301
Epoch: 3595, Training Loss: 0.02470
Epoch: 3596, Training Loss: 0.02863
Epoch: 3596, Training Loss: 0.02568
Epoch: 3596, Training Loss: 0.03300
Epoch: 3596, Training Loss: 0.02470
Epoch: 3597, Training Loss: 0.02862
Epoch: 3597, Training Loss: 0.02568
Epoch: 3597, Training Loss: 0.03300
Epoch: 3597, Training Loss: 0.02469
Epoch: 3598, Training Loss: 0.02862
Epoch: 3598, Training Loss: 0.02567
Epoch: 3598, Training Loss: 0.03299
Epoch: 3598, Training Loss: 0.02469
Epoch: 3599, Training Loss: 0.02861
Epoch: 3599, Training Loss: 0.02567
Epoch: 3599, Training Loss: 0.03298
Epoch: 3599, Training Loss: 0.02468
Epoch: 3600, Training Loss: 0.02861
Epoch: 3600, Training Loss: 0.02566
Epoch: 3600, Training Loss: 0.03298
Epoch: 3600, Training Loss: 0.02468
Epoch: 3601, Training Loss: 0.02860
Epoch: 3601, Training Loss: 0.02566
Epoch: 3601, Training Loss: 0.03297
Epoch: 3601, Training Loss: 0.02468
Epoch: 3602, Training Loss: 0.02860
Epoch: 3602, Training Loss: 0.02565
Epoch: 3602, Training Loss: 0.03296
Epoch: 3602, Training Loss: 0.02467
Epoch: 3603, Training Loss: 0.02859
Epoch: 3603, Training Loss: 0.02565
Epoch: 3603, Training Loss: 0.03296
Epoch: 3603, Training Loss: 0.02467
Epoch: 3604, Training Loss: 0.02859
Epoch: 3604, Training Loss: 0.02565
Epoch: 3604, Training Loss: 0.03295
Epoch: 3604, Training Loss: 0.02466
Epoch: 3605, Training Loss: 0.02858
Epoch: 3605, Training Loss: 0.02564
Epoch: 3605, Training Loss: 0.03294
Epoch: 3605, Training Loss: 0.02466
Epoch: 3606, Training Loss: 0.02858
Epoch: 3606, Training Loss: 0.02564
Epoch: 3606, Training Loss: 0.03294
Epoch: 3606, Training Loss: 0.02465
Epoch: 3607, Training Loss: 0.02857
Epoch: 3607, Training Loss: 0.02563
Epoch: 3607, Training Loss: 0.03293
Epoch: 3607, Training Loss: 0.02465
Epoch: 3608, Training Loss: 0.02856
Epoch: 3608, Training Loss: 0.02563
Epoch: 3608, Training Loss: 0.03293
Epoch: 3608, Training Loss: 0.02464
Epoch: 3609, Training Loss: 0.02856
Epoch: 3609, Training Loss: 0.02562
Epoch: 3609, Training Loss: 0.03292
Epoch: 3609, Training Loss: 0.02464
Epoch: 3610, Training Loss: 0.02855
Epoch: 3610, Training Loss: 0.02562
Epoch: 3610, Training Loss: 0.03291
Epoch: 3610, Training Loss: 0.02464
Epoch: 3611, Training Loss: 0.02855
Epoch: 3611, Training Loss: 0.02561
Epoch: 3611, Training Loss: 0.03291
Epoch: 3611, Training Loss: 0.02463
Epoch: 3612, Training Loss: 0.02854
Epoch: 3612, Training Loss: 0.02561
Epoch: 3612, Training Loss: 0.03290
Epoch: 3612, Training Loss: 0.02463
Epoch: 3613, Training Loss: 0.02854
Epoch: 3613, Training Loss: 0.02560
Epoch: 3613, Training Loss: 0.03289
Epoch: 3613, Training Loss: 0.02462
Epoch: 3614, Training Loss: 0.02853
Epoch: 3614, Training Loss: 0.02560
Epoch: 3614, Training Loss: 0.03289
Epoch: 3614, Training Loss: 0.02462
Epoch: 3615, Training Loss: 0.02853
Epoch: 3615, Training Loss: 0.02560
Epoch: 3615, Training Loss: 0.03288
Epoch: 3615, Training Loss: 0.02461
Epoch: 3616, Training Loss: 0.02852
Epoch: 3616, Training Loss: 0.02559
Epoch: 3616, Training Loss: 0.03288
Epoch: 3616, Training Loss: 0.02461
Epoch: 3617, Training Loss: 0.02852
Epoch: 3617, Training Loss: 0.02559
Epoch: 3617, Training Loss: 0.03287
Epoch: 3617, Training Loss: 0.02460
Epoch: 3618, Training Loss: 0.02851
Epoch: 3618, Training Loss: 0.02558
Epoch: 3618, Training Loss: 0.03286
Epoch: 3618, Training Loss: 0.02460
Epoch: 3619, Training Loss: 0.02851
Epoch: 3619, Training Loss: 0.02558
Epoch: 3619, Training Loss: 0.03286
Epoch: 3619, Training Loss: 0.02460
Epoch: 3620, Training Loss: 0.02850
Epoch: 3620, Training Loss: 0.02557
Epoch: 3620, Training Loss: 0.03285
Epoch: 3620, Training Loss: 0.02459
Epoch: 3621, Training Loss: 0.02850
Epoch: 3621, Training Loss: 0.02557
Epoch: 3621, Training Loss: 0.03285
Epoch: 3621, Training Loss: 0.02459
Epoch: 3622, Training Loss: 0.02849
Epoch: 3622, Training Loss: 0.02557
Epoch: 3622, Training Loss: 0.03284
Epoch: 3622, Training Loss: 0.02458
Epoch: 3623, Training Loss: 0.02849
Epoch: 3623, Training Loss: 0.02556
Epoch: 3623, Training Loss: 0.03283
Epoch: 3623, Training Loss: 0.02458
Epoch: 3624, Training Loss: 0.02848
Epoch: 3624, Training Loss: 0.02556
Epoch: 3624, Training Loss: 0.03283
Epoch: 3624, Training Loss: 0.02457
Epoch: 3625, Training Loss: 0.02848
Epoch: 3625, Training Loss: 0.02555
Epoch: 3625, Training Loss: 0.03282
Epoch: 3625, Training Loss: 0.02457
Epoch: 3626, Training Loss: 0.02847
Epoch: 3626, Training Loss: 0.02555
Epoch: 3626, Training Loss: 0.03281
Epoch: 3626, Training Loss: 0.02456
Epoch: 3627, Training Loss: 0.02846
Epoch: 3627, Training Loss: 0.02554
Epoch: 3627, Training Loss: 0.03281
Epoch: 3627, Training Loss: 0.02456
Epoch: 3628, Training Loss: 0.02846
Epoch: 3628, Training Loss: 0.02554
Epoch: 3628, Training Loss: 0.03280
Epoch: 3628, Training Loss: 0.02456
Epoch: 3629, Training Loss: 0.02845
Epoch: 3629, Training Loss: 0.02553
Epoch: 3629, Training Loss: 0.03280
Epoch: 3629, Training Loss: 0.02455
Epoch: 3630, Training Loss: 0.02845
Epoch: 3630, Training Loss: 0.02553
Epoch: 3630, Training Loss: 0.03279
Epoch: 3630, Training Loss: 0.02455
Epoch: 3631, Training Loss: 0.02844
Epoch: 3631, Training Loss: 0.02553
Epoch: 3631, Training Loss: 0.03278
Epoch: 3631, Training Loss: 0.02454
Epoch: 3632, Training Loss: 0.02844
Epoch: 3632, Training Loss: 0.02552
Epoch: 3632, Training Loss: 0.03278
Epoch: 3632, Training Loss: 0.02454
Epoch: 3633, Training Loss: 0.02843
Epoch: 3633, Training Loss: 0.02552
Epoch: 3633, Training Loss: 0.03277
Epoch: 3633, Training Loss: 0.02453
Epoch: 3634, Training Loss: 0.02843
Epoch: 3634, Training Loss: 0.02551
Epoch: 3634, Training Loss: 0.03276
Epoch: 3634, Training Loss: 0.02453
Epoch: 3635, Training Loss: 0.02842
Epoch: 3635, Training Loss: 0.02551
Epoch: 3635, Training Loss: 0.03276
Epoch: 3635, Training Loss: 0.02453
Epoch: 3636, Training Loss: 0.02842
Epoch: 3636, Training Loss: 0.02550
Epoch: 3636, Training Loss: 0.03275
Epoch: 3636, Training Loss: 0.02452
Epoch: 3637, Training Loss: 0.02841
Epoch: 3637, Training Loss: 0.02550
Epoch: 3637, Training Loss: 0.03275
Epoch: 3637, Training Loss: 0.02452
Epoch: 3638, Training Loss: 0.02841
Epoch: 3638, Training Loss: 0.02549
Epoch: 3638, Training Loss: 0.03274
Epoch: 3638, Training Loss: 0.02451
Epoch: 3639, Training Loss: 0.02840
Epoch: 3639, Training Loss: 0.02549
Epoch: 3639, Training Loss: 0.03273
Epoch: 3639, Training Loss: 0.02451
Epoch: 3640, Training Loss: 0.02840
Epoch: 3640, Training Loss: 0.02549
Epoch: 3640, Training Loss: 0.03273
Epoch: 3640, Training Loss: 0.02450
Epoch: 3641, Training Loss: 0.02839
Epoch: 3641, Training Loss: 0.02548
Epoch: 3641, Training Loss: 0.03272
Epoch: 3641, Training Loss: 0.02450
Epoch: 3642, Training Loss: 0.02839
Epoch: 3642, Training Loss: 0.02548
Epoch: 3642, Training Loss: 0.03272
Epoch: 3642, Training Loss: 0.02449
Epoch: 3643, Training Loss: 0.02838
Epoch: 3643, Training Loss: 0.02547
Epoch: 3643, Training Loss: 0.03271
Epoch: 3643, Training Loss: 0.02449
Epoch: 3644, Training Loss: 0.02838
Epoch: 3644, Training Loss: 0.02547
Epoch: 3644, Training Loss: 0.03270
Epoch: 3644, Training Loss: 0.02449
Epoch: 3645, Training Loss: 0.02837
Epoch: 3645, Training Loss: 0.02546
Epoch: 3645, Training Loss: 0.03270
Epoch: 3645, Training Loss: 0.02448
Epoch: 3646, Training Loss: 0.02837
Epoch: 3646, Training Loss: 0.02546
Epoch: 3646, Training Loss: 0.03269
Epoch: 3646, Training Loss: 0.02448
Epoch: 3647, Training Loss: 0.02836
Epoch: 3647, Training Loss: 0.02546
Epoch: 3647, Training Loss: 0.03269
Epoch: 3647, Training Loss: 0.02447
Epoch: 3648, Training Loss: 0.02836
Epoch: 3648, Training Loss: 0.02545
Epoch: 3648, Training Loss: 0.03268
Epoch: 3648, Training Loss: 0.02447
Epoch: 3649, Training Loss: 0.02835
Epoch: 3649, Training Loss: 0.02545
Epoch: 3649, Training Loss: 0.03267
Epoch: 3649, Training Loss: 0.02446
Epoch: 3650, Training Loss: 0.02835
Epoch: 3650, Training Loss: 0.02544
Epoch: 3650, Training Loss: 0.03267
Epoch: 3650, Training Loss: 0.02446
Epoch: 3651, Training Loss: 0.02834
Epoch: 3651, Training Loss: 0.02544
Epoch: 3651, Training Loss: 0.03266
Epoch: 3651, Training Loss: 0.02446
Epoch: 3652, Training Loss: 0.02833
Epoch: 3652, Training Loss: 0.02543
Epoch: 3652, Training Loss: 0.03265
Epoch: 3652, Training Loss: 0.02445
Epoch: 3653, Training Loss: 0.02833
Epoch: 3653, Training Loss: 0.02543
Epoch: 3653, Training Loss: 0.03265
Epoch: 3653, Training Loss: 0.02445
Epoch: 3654, Training Loss: 0.02832
Epoch: 3654, Training Loss: 0.02542
Epoch: 3654, Training Loss: 0.03264
Epoch: 3654, Training Loss: 0.02444
Epoch: 3655, Training Loss: 0.02832
Epoch: 3655, Training Loss: 0.02542
Epoch: 3655, Training Loss: 0.03264
Epoch: 3655, Training Loss: 0.02444
Epoch: 3656, Training Loss: 0.02831
Epoch: 3656, Training Loss: 0.02542
Epoch: 3656, Training Loss: 0.03263
Epoch: 3656, Training Loss: 0.02443
Epoch: 3657, Training Loss: 0.02831
Epoch: 3657, Training Loss: 0.02541
Epoch: 3657, Training Loss: 0.03262
Epoch: 3657, Training Loss: 0.02443
Epoch: 3658, Training Loss: 0.02830
Epoch: 3658, Training Loss: 0.02541
Epoch: 3658, Training Loss: 0.03262
Epoch: 3658, Training Loss: 0.02442
Epoch: 3659, Training Loss: 0.02830
Epoch: 3659, Training Loss: 0.02540
Epoch: 3659, Training Loss: 0.03261
Epoch: 3659, Training Loss: 0.02442
Epoch: 3660, Training Loss: 0.02829
Epoch: 3660, Training Loss: 0.02540
Epoch: 3660, Training Loss: 0.03261
Epoch: 3660, Training Loss: 0.02442
Epoch: 3661, Training Loss: 0.02829
Epoch: 3661, Training Loss: 0.02539
Epoch: 3661, Training Loss: 0.03260
Epoch: 3661, Training Loss: 0.02441
Epoch: 3662, Training Loss: 0.02828
Epoch: 3662, Training Loss: 0.02539
Epoch: 3662, Training Loss: 0.03259
Epoch: 3662, Training Loss: 0.02441
Epoch: 3663, Training Loss: 0.02828
Epoch: 3663, Training Loss: 0.02539
Epoch: 3663, Training Loss: 0.03259
Epoch: 3663, Training Loss: 0.02440
Epoch: 3664, Training Loss: 0.02827
Epoch: 3664, Training Loss: 0.02538
Epoch: 3664, Training Loss: 0.03258
Epoch: 3664, Training Loss: 0.02440
Epoch: 3665, Training Loss: 0.02827
Epoch: 3665, Training Loss: 0.02538
Epoch: 3665, Training Loss: 0.03258
Epoch: 3665, Training Loss: 0.02439
Epoch: 3666, Training Loss: 0.02826
Epoch: 3666, Training Loss: 0.02537
Epoch: 3666, Training Loss: 0.03257
Epoch: 3666, Training Loss: 0.02439
Epoch: 3667, Training Loss: 0.02826
Epoch: 3667, Training Loss: 0.02537
Epoch: 3667, Training Loss: 0.03256
Epoch: 3667, Training Loss: 0.02439
Epoch: 3668, Training Loss: 0.02825
Epoch: 3668, Training Loss: 0.02536
Epoch: 3668, Training Loss: 0.03256
Epoch: 3668, Training Loss: 0.02438
Epoch: 3669, Training Loss: 0.02825
Epoch: 3669, Training Loss: 0.02536
Epoch: 3669, Training Loss: 0.03255
Epoch: 3669, Training Loss: 0.02438
Epoch: 3670, Training Loss: 0.02824
Epoch: 3670, Training Loss: 0.02536
Epoch: 3670, Training Loss: 0.03255
Epoch: 3670, Training Loss: 0.02437
Epoch: 3671, Training Loss: 0.02824
Epoch: 3671, Training Loss: 0.02535
Epoch: 3671, Training Loss: 0.03254
Epoch: 3671, Training Loss: 0.02437
Epoch: 3672, Training Loss: 0.02823
Epoch: 3672, Training Loss: 0.02535
Epoch: 3672, Training Loss: 0.03253
Epoch: 3672, Training Loss: 0.02436
Epoch: 3673, Training Loss: 0.02823
Epoch: 3673, Training Loss: 0.02534
Epoch: 3673, Training Loss: 0.03253
Epoch: 3673, Training Loss: 0.02436
Epoch: 3674, Training Loss: 0.02822
Epoch: 3674, Training Loss: 0.02534
Epoch: 3674, Training Loss: 0.03252
Epoch: 3674, Training Loss: 0.02436
Epoch: 3675, Training Loss: 0.02822
Epoch: 3675, Training Loss: 0.02533
Epoch: 3675, Training Loss: 0.03251
Epoch: 3675, Training Loss: 0.02435
Epoch: 3676, Training Loss: 0.02821
Epoch: 3676, Training Loss: 0.02533
Epoch: 3676, Training Loss: 0.03251
Epoch: 3676, Training Loss: 0.02435
Epoch: 3677, Training Loss: 0.02821
Epoch: 3677, Training Loss: 0.02532
Epoch: 3677, Training Loss: 0.03250
Epoch: 3677, Training Loss: 0.02434
Epoch: 3678, Training Loss: 0.02820
Epoch: 3678, Training Loss: 0.02532
Epoch: 3678, Training Loss: 0.03250
Epoch: 3678, Training Loss: 0.02434
Epoch: 3679, Training Loss: 0.02820
Epoch: 3679, Training Loss: 0.02532
Epoch: 3679, Training Loss: 0.03249
Epoch: 3679, Training Loss: 0.02433
Epoch: 3680, Training Loss: 0.02819
Epoch: 3680, Training Loss: 0.02531
Epoch: 3680, Training Loss: 0.03248
Epoch: 3680, Training Loss: 0.02433
Epoch: 3681, Training Loss: 0.02819
Epoch: 3681, Training Loss: 0.02531
Epoch: 3681, Training Loss: 0.03248
Epoch: 3681, Training Loss: 0.02433
Epoch: 3682, Training Loss: 0.02818
Epoch: 3682, Training Loss: 0.02530
Epoch: 3682, Training Loss: 0.03247
Epoch: 3682, Training Loss: 0.02432
Epoch: 3683, Training Loss: 0.02818
Epoch: 3683, Training Loss: 0.02530
Epoch: 3683, Training Loss: 0.03247
Epoch: 3683, Training Loss: 0.02432
Epoch: 3684, Training Loss: 0.02817
Epoch: 3684, Training Loss: 0.02529
Epoch: 3684, Training Loss: 0.03246
Epoch: 3684, Training Loss: 0.02431
Epoch: 3685, Training Loss: 0.02817
Epoch: 3685, Training Loss: 0.02529
Epoch: 3685, Training Loss: 0.03245
Epoch: 3685, Training Loss: 0.02431
Epoch: 3686, Training Loss: 0.02816
Epoch: 3686, Training Loss: 0.02529
Epoch: 3686, Training Loss: 0.03245
Epoch: 3686, Training Loss: 0.02430
Epoch: 3687, Training Loss: 0.02816
Epoch: 3687, Training Loss: 0.02528
Epoch: 3687, Training Loss: 0.03244
Epoch: 3687, Training Loss: 0.02430
Epoch: 3688, Training Loss: 0.02815
Epoch: 3688, Training Loss: 0.02528
Epoch: 3688, Training Loss: 0.03244
Epoch: 3688, Training Loss: 0.02430
Epoch: 3689, Training Loss: 0.02815
Epoch: 3689, Training Loss: 0.02527
Epoch: 3689, Training Loss: 0.03243
Epoch: 3689, Training Loss: 0.02429
Epoch: 3690, Training Loss: 0.02814
Epoch: 3690, Training Loss: 0.02527
Epoch: 3690, Training Loss: 0.03242
Epoch: 3690, Training Loss: 0.02429
Epoch: 3691, Training Loss: 0.02814
Epoch: 3691, Training Loss: 0.02526
Epoch: 3691, Training Loss: 0.03242
Epoch: 3691, Training Loss: 0.02428
Epoch: 3692, Training Loss: 0.02813
Epoch: 3692, Training Loss: 0.02526
Epoch: 3692, Training Loss: 0.03241
Epoch: 3692, Training Loss: 0.02428
Epoch: 3693, Training Loss: 0.02813
Epoch: 3693, Training Loss: 0.02526
Epoch: 3693, Training Loss: 0.03241
Epoch: 3693, Training Loss: 0.02428
Epoch: 3694, Training Loss: 0.02812
Epoch: 3694, Training Loss: 0.02525
Epoch: 3694, Training Loss: 0.03240
Epoch: 3694, Training Loss: 0.02427
Epoch: 3695, Training Loss: 0.02812
Epoch: 3695, Training Loss: 0.02525
Epoch: 3695, Training Loss: 0.03240
Epoch: 3695, Training Loss: 0.02427
Epoch: 3696, Training Loss: 0.02811
Epoch: 3696, Training Loss: 0.02524
Epoch: 3696, Training Loss: 0.03239
Epoch: 3696, Training Loss: 0.02426
Epoch: 3697, Training Loss: 0.02811
Epoch: 3697, Training Loss: 0.02524
Epoch: 3697, Training Loss: 0.03238
Epoch: 3697, Training Loss: 0.02426
Epoch: 3698, Training Loss: 0.02810
Epoch: 3698, Training Loss: 0.02524
Epoch: 3698, Training Loss: 0.03238
Epoch: 3698, Training Loss: 0.02425
Epoch: 3699, Training Loss: 0.02810
Epoch: 3699, Training Loss: 0.02523
Epoch: 3699, Training Loss: 0.03237
Epoch: 3699, Training Loss: 0.02425
Epoch: 3700, Training Loss: 0.02809
Epoch: 3700, Training Loss: 0.02523
Epoch: 3700, Training Loss: 0.03237
Epoch: 3700, Training Loss: 0.02425
Epoch: 3701, Training Loss: 0.02809
Epoch: 3701, Training Loss: 0.02522
Epoch: 3701, Training Loss: 0.03236
Epoch: 3701, Training Loss: 0.02424
Epoch: 3702, Training Loss: 0.02808
Epoch: 3702, Training Loss: 0.02522
Epoch: 3702, Training Loss: 0.03235
Epoch: 3702, Training Loss: 0.02424
Epoch: 3703, Training Loss: 0.02808
Epoch: 3703, Training Loss: 0.02521
Epoch: 3703, Training Loss: 0.03235
Epoch: 3703, Training Loss: 0.02423
Epoch: 3704, Training Loss: 0.02807
Epoch: 3704, Training Loss: 0.02521
Epoch: 3704, Training Loss: 0.03234
Epoch: 3704, Training Loss: 0.02423
Epoch: 3705, Training Loss: 0.02806
Epoch: 3705, Training Loss: 0.02521
Epoch: 3705, Training Loss: 0.03234
Epoch: 3705, Training Loss: 0.02422
Epoch: 3706, Training Loss: 0.02806
Epoch: 3706, Training Loss: 0.02520
Epoch: 3706, Training Loss: 0.03233
Epoch: 3706, Training Loss: 0.02422
Epoch: 3707, Training Loss: 0.02805
Epoch: 3707, Training Loss: 0.02520
Epoch: 3707, Training Loss: 0.03232
Epoch: 3707, Training Loss: 0.02422
Epoch: 3708, Training Loss: 0.02805
Epoch: 3708, Training Loss: 0.02519
Epoch: 3708, Training Loss: 0.03232
Epoch: 3708, Training Loss: 0.02421
Epoch: 3709, Training Loss: 0.02804
Epoch: 3709, Training Loss: 0.02519
Epoch: 3709, Training Loss: 0.03231
Epoch: 3709, Training Loss: 0.02421
Epoch: 3710, Training Loss: 0.02804
Epoch: 3710, Training Loss: 0.02518
Epoch: 3710, Training Loss: 0.03231
Epoch: 3710, Training Loss: 0.02420
Epoch: 3711, Training Loss: 0.02803
Epoch: 3711, Training Loss: 0.02518
Epoch: 3711, Training Loss: 0.03230
Epoch: 3711, Training Loss: 0.02420
Epoch: 3712, Training Loss: 0.02803
Epoch: 3712, Training Loss: 0.02518
Epoch: 3712, Training Loss: 0.03229
Epoch: 3712, Training Loss: 0.02419
Epoch: 3713, Training Loss: 0.02802
Epoch: 3713, Training Loss: 0.02517
Epoch: 3713, Training Loss: 0.03229
Epoch: 3713, Training Loss: 0.02419
Epoch: 3714, Training Loss: 0.02802
Epoch: 3714, Training Loss: 0.02517
Epoch: 3714, Training Loss: 0.03228
Epoch: 3714, Training Loss: 0.02419
Epoch: 3715, Training Loss: 0.02801
Epoch: 3715, Training Loss: 0.02516
Epoch: 3715, Training Loss: 0.03228
Epoch: 3715, Training Loss: 0.02418
Epoch: 3716, Training Loss: 0.02801
Epoch: 3716, Training Loss: 0.02516
Epoch: 3716, Training Loss: 0.03227
Epoch: 3716, Training Loss: 0.02418
Epoch: 3717, Training Loss: 0.02800
Epoch: 3717, Training Loss: 0.02515
Epoch: 3717, Training Loss: 0.03226
Epoch: 3717, Training Loss: 0.02417
Epoch: 3718, Training Loss: 0.02800
Epoch: 3718, Training Loss: 0.02515
Epoch: 3718, Training Loss: 0.03226
Epoch: 3718, Training Loss: 0.02417
Epoch: 3719, Training Loss: 0.02799
Epoch: 3719, Training Loss: 0.02515
Epoch: 3719, Training Loss: 0.03225
Epoch: 3719, Training Loss: 0.02417
Epoch: 3720, Training Loss: 0.02799
Epoch: 3720, Training Loss: 0.02514
Epoch: 3720, Training Loss: 0.03225
Epoch: 3720, Training Loss: 0.02416
Epoch: 3721, Training Loss: 0.02798
Epoch: 3721, Training Loss: 0.02514
Epoch: 3721, Training Loss: 0.03224
Epoch: 3721, Training Loss: 0.02416
Epoch: 3722, Training Loss: 0.02798
Epoch: 3722, Training Loss: 0.02513
Epoch: 3722, Training Loss: 0.03224
Epoch: 3722, Training Loss: 0.02415
Epoch: 3723, Training Loss: 0.02798
Epoch: 3723, Training Loss: 0.02513
Epoch: 3723, Training Loss: 0.03223
Epoch: 3723, Training Loss: 0.02415
Epoch: 3724, Training Loss: 0.02797
Epoch: 3724, Training Loss: 0.02512
Epoch: 3724, Training Loss: 0.03222
Epoch: 3724, Training Loss: 0.02414
Epoch: 3725, Training Loss: 0.02797
Epoch: 3725, Training Loss: 0.02512
Epoch: 3725, Training Loss: 0.03222
Epoch: 3725, Training Loss: 0.02414
Epoch: 3726, Training Loss: 0.02796
Epoch: 3726, Training Loss: 0.02512
Epoch: 3726, Training Loss: 0.03221
Epoch: 3726, Training Loss: 0.02414
Epoch: 3727, Training Loss: 0.02796
Epoch: 3727, Training Loss: 0.02511
Epoch: 3727, Training Loss: 0.03221
Epoch: 3727, Training Loss: 0.02413
Epoch: 3728, Training Loss: 0.02795
Epoch: 3728, Training Loss: 0.02511
Epoch: 3728, Training Loss: 0.03220
Epoch: 3728, Training Loss: 0.02413
Epoch: 3729, Training Loss: 0.02795
Epoch: 3729, Training Loss: 0.02510
Epoch: 3729, Training Loss: 0.03219
Epoch: 3729, Training Loss: 0.02412
Epoch: 3730, Training Loss: 0.02794
Epoch: 3730, Training Loss: 0.02510
Epoch: 3730, Training Loss: 0.03219
Epoch: 3730, Training Loss: 0.02412
Epoch: 3731, Training Loss: 0.02794
Epoch: 3731, Training Loss: 0.02510
Epoch: 3731, Training Loss: 0.03218
Epoch: 3731, Training Loss: 0.02412
Epoch: 3732, Training Loss: 0.02793
Epoch: 3732, Training Loss: 0.02509
Epoch: 3732, Training Loss: 0.03218
Epoch: 3732, Training Loss: 0.02411
Epoch: 3733, Training Loss: 0.02793
Epoch: 3733, Training Loss: 0.02509
Epoch: 3733, Training Loss: 0.03217
Epoch: 3733, Training Loss: 0.02411
Epoch: 3734, Training Loss: 0.02792
Epoch: 3734, Training Loss: 0.02508
Epoch: 3734, Training Loss: 0.03216
Epoch: 3734, Training Loss: 0.02410
Epoch: 3735, Training Loss: 0.02792
Epoch: 3735, Training Loss: 0.02508
Epoch: 3735, Training Loss: 0.03216
Epoch: 3735, Training Loss: 0.02410
Epoch: 3736, Training Loss: 0.02791
Epoch: 3736, Training Loss: 0.02507
Epoch: 3736, Training Loss: 0.03215
Epoch: 3736, Training Loss: 0.02409
Epoch: 3737, Training Loss: 0.02791
Epoch: 3737, Training Loss: 0.02507
Epoch: 3737, Training Loss: 0.03215
Epoch: 3737, Training Loss: 0.02409
Epoch: 3738, Training Loss: 0.02790
Epoch: 3738, Training Loss: 0.02507
Epoch: 3738, Training Loss: 0.03214
Epoch: 3738, Training Loss: 0.02409
Epoch: 3739, Training Loss: 0.02790
Epoch: 3739, Training Loss: 0.02506
Epoch: 3739, Training Loss: 0.03214
Epoch: 3739, Training Loss: 0.02408
Epoch: 3740, Training Loss: 0.02789
Epoch: 3740, Training Loss: 0.02506
Epoch: 3740, Training Loss: 0.03213
Epoch: 3740, Training Loss: 0.02408
Epoch: 3741, Training Loss: 0.02789
Epoch: 3741, Training Loss: 0.02505
Epoch: 3741, Training Loss: 0.03212
Epoch: 3741, Training Loss: 0.02407
Epoch: 3742, Training Loss: 0.02788
Epoch: 3742, Training Loss: 0.02505
Epoch: 3742, Training Loss: 0.03212
Epoch: 3742, Training Loss: 0.02407
Epoch: 3743, Training Loss: 0.02788
Epoch: 3743, Training Loss: 0.02505
Epoch: 3743, Training Loss: 0.03211
Epoch: 3743, Training Loss: 0.02407
Epoch: 3744, Training Loss: 0.02787
Epoch: 3744, Training Loss: 0.02504
Epoch: 3744, Training Loss: 0.03211
Epoch: 3744, Training Loss: 0.02406
Epoch: 3745, Training Loss: 0.02787
Epoch: 3745, Training Loss: 0.02504
Epoch: 3745, Training Loss: 0.03210
Epoch: 3745, Training Loss: 0.02406
Epoch: 3746, Training Loss: 0.02786
Epoch: 3746, Training Loss: 0.02503
Epoch: 3746, Training Loss: 0.03210
Epoch: 3746, Training Loss: 0.02405
Epoch: 3747, Training Loss: 0.02786
Epoch: 3747, Training Loss: 0.02503
Epoch: 3747, Training Loss: 0.03209
Epoch: 3747, Training Loss: 0.02405
Epoch: 3748, Training Loss: 0.02785
Epoch: 3748, Training Loss: 0.02502
Epoch: 3748, Training Loss: 0.03208
Epoch: 3748, Training Loss: 0.02404
Epoch: 3749, Training Loss: 0.02785
Epoch: 3749, Training Loss: 0.02502
Epoch: 3749, Training Loss: 0.03208
Epoch: 3749, Training Loss: 0.02404
Epoch: 3750, Training Loss: 0.02784
Epoch: 3750, Training Loss: 0.02502
Epoch: 3750, Training Loss: 0.03207
Epoch: 3750, Training Loss: 0.02404
Epoch: 3751, Training Loss: 0.02784
Epoch: 3751, Training Loss: 0.02501
Epoch: 3751, Training Loss: 0.03207
Epoch: 3751, Training Loss: 0.02403
Epoch: 3752, Training Loss: 0.02783
Epoch: 3752, Training Loss: 0.02501
Epoch: 3752, Training Loss: 0.03206
Epoch: 3752, Training Loss: 0.02403
Epoch: 3753, Training Loss: 0.02783
Epoch: 3753, Training Loss: 0.02500
Epoch: 3753, Training Loss: 0.03205
Epoch: 3753, Training Loss: 0.02402
Epoch: 3754, Training Loss: 0.02782
Epoch: 3754, Training Loss: 0.02500
Epoch: 3754, Training Loss: 0.03205
Epoch: 3754, Training Loss: 0.02402
Epoch: 3755, Training Loss: 0.02782
Epoch: 3755, Training Loss: 0.02500
Epoch: 3755, Training Loss: 0.03204
Epoch: 3755, Training Loss: 0.02402
Epoch: 3756, Training Loss: 0.02781
Epoch: 3756, Training Loss: 0.02499
Epoch: 3756, Training Loss: 0.03204
Epoch: 3756, Training Loss: 0.02401
Epoch: 3757, Training Loss: 0.02781
Epoch: 3757, Training Loss: 0.02499
Epoch: 3757, Training Loss: 0.03203
Epoch: 3757, Training Loss: 0.02401
Epoch: 3758, Training Loss: 0.02780
Epoch: 3758, Training Loss: 0.02498
Epoch: 3758, Training Loss: 0.03203
Epoch: 3758, Training Loss: 0.02400
Epoch: 3759, Training Loss: 0.02780
Epoch: 3759, Training Loss: 0.02498
Epoch: 3759, Training Loss: 0.03202
Epoch: 3759, Training Loss: 0.02400
Epoch: 3760, Training Loss: 0.02779
Epoch: 3760, Training Loss: 0.02497
Epoch: 3760, Training Loss: 0.03201
Epoch: 3760, Training Loss: 0.02400
Epoch: 3761, Training Loss: 0.02779
Epoch: 3761, Training Loss: 0.02497
Epoch: 3761, Training Loss: 0.03201
Epoch: 3761, Training Loss: 0.02399
Epoch: 3762, Training Loss: 0.02778
Epoch: 3762, Training Loss: 0.02497
Epoch: 3762, Training Loss: 0.03200
Epoch: 3762, Training Loss: 0.02399
Epoch: 3763, Training Loss: 0.02778
Epoch: 3763, Training Loss: 0.02496
Epoch: 3763, Training Loss: 0.03200
Epoch: 3763, Training Loss: 0.02398
Epoch: 3764, Training Loss: 0.02777
Epoch: 3764, Training Loss: 0.02496
Epoch: 3764, Training Loss: 0.03199
Epoch: 3764, Training Loss: 0.02398
[... per-sample training loss printed four times per epoch (once for each XOR pattern); over epochs 3764–4014 the losses decrease slowly, from roughly 0.0240–0.0320 down to roughly 0.0230–0.0306 ...]
Epoch: 4014, Training Loss: 0.02301
Epoch: 4015, Training Loss: 0.02662
Epoch: 4015, Training Loss: 0.02398
Epoch: 4015, Training Loss: 0.03064
Epoch: 4015, Training Loss: 0.02301
Epoch: 4016, Training Loss: 0.02662
Epoch: 4016, Training Loss: 0.02398
Epoch: 4016, Training Loss: 0.03063
Epoch: 4016, Training Loss: 0.02301
Epoch: 4017, Training Loss: 0.02661
Epoch: 4017, Training Loss: 0.02397
Epoch: 4017, Training Loss: 0.03063
Epoch: 4017, Training Loss: 0.02300
Epoch: 4018, Training Loss: 0.02661
Epoch: 4018, Training Loss: 0.02397
Epoch: 4018, Training Loss: 0.03062
Epoch: 4018, Training Loss: 0.02300
Epoch: 4019, Training Loss: 0.02661
Epoch: 4019, Training Loss: 0.02397
Epoch: 4019, Training Loss: 0.03062
Epoch: 4019, Training Loss: 0.02300
Epoch: 4020, Training Loss: 0.02660
Epoch: 4020, Training Loss: 0.02396
Epoch: 4020, Training Loss: 0.03061
Epoch: 4020, Training Loss: 0.02299
Epoch: 4021, Training Loss: 0.02660
Epoch: 4021, Training Loss: 0.02396
Epoch: 4021, Training Loss: 0.03060
Epoch: 4021, Training Loss: 0.02299
Epoch: 4022, Training Loss: 0.02659
Epoch: 4022, Training Loss: 0.02396
Epoch: 4022, Training Loss: 0.03060
Epoch: 4022, Training Loss: 0.02299
Epoch: 4023, Training Loss: 0.02659
Epoch: 4023, Training Loss: 0.02395
Epoch: 4023, Training Loss: 0.03059
Epoch: 4023, Training Loss: 0.02298
Epoch: 4024, Training Loss: 0.02658
Epoch: 4024, Training Loss: 0.02395
Epoch: 4024, Training Loss: 0.03059
Epoch: 4024, Training Loss: 0.02298
Epoch: 4025, Training Loss: 0.02658
Epoch: 4025, Training Loss: 0.02395
Epoch: 4025, Training Loss: 0.03058
Epoch: 4025, Training Loss: 0.02298
Epoch: 4026, Training Loss: 0.02658
Epoch: 4026, Training Loss: 0.02394
Epoch: 4026, Training Loss: 0.03058
Epoch: 4026, Training Loss: 0.02297
Epoch: 4027, Training Loss: 0.02657
Epoch: 4027, Training Loss: 0.02394
Epoch: 4027, Training Loss: 0.03057
Epoch: 4027, Training Loss: 0.02297
Epoch: 4028, Training Loss: 0.02657
Epoch: 4028, Training Loss: 0.02393
Epoch: 4028, Training Loss: 0.03057
Epoch: 4028, Training Loss: 0.02296
Epoch: 4029, Training Loss: 0.02656
Epoch: 4029, Training Loss: 0.02393
Epoch: 4029, Training Loss: 0.03056
Epoch: 4029, Training Loss: 0.02296
Epoch: 4030, Training Loss: 0.02656
Epoch: 4030, Training Loss: 0.02393
Epoch: 4030, Training Loss: 0.03056
Epoch: 4030, Training Loss: 0.02296
Epoch: 4031, Training Loss: 0.02655
Epoch: 4031, Training Loss: 0.02392
Epoch: 4031, Training Loss: 0.03055
Epoch: 4031, Training Loss: 0.02295
Epoch: 4032, Training Loss: 0.02655
Epoch: 4032, Training Loss: 0.02392
Epoch: 4032, Training Loss: 0.03055
Epoch: 4032, Training Loss: 0.02295
Epoch: 4033, Training Loss: 0.02655
Epoch: 4033, Training Loss: 0.02392
Epoch: 4033, Training Loss: 0.03054
Epoch: 4033, Training Loss: 0.02295
Epoch: 4034, Training Loss: 0.02654
Epoch: 4034, Training Loss: 0.02391
Epoch: 4034, Training Loss: 0.03054
Epoch: 4034, Training Loss: 0.02294
Epoch: 4035, Training Loss: 0.02654
Epoch: 4035, Training Loss: 0.02391
Epoch: 4035, Training Loss: 0.03053
Epoch: 4035, Training Loss: 0.02294
Epoch: 4036, Training Loss: 0.02653
Epoch: 4036, Training Loss: 0.02391
Epoch: 4036, Training Loss: 0.03053
Epoch: 4036, Training Loss: 0.02294
Epoch: 4037, Training Loss: 0.02653
Epoch: 4037, Training Loss: 0.02390
Epoch: 4037, Training Loss: 0.03052
Epoch: 4037, Training Loss: 0.02293
Epoch: 4038, Training Loss: 0.02652
Epoch: 4038, Training Loss: 0.02390
Epoch: 4038, Training Loss: 0.03052
Epoch: 4038, Training Loss: 0.02293
Epoch: 4039, Training Loss: 0.02652
Epoch: 4039, Training Loss: 0.02389
Epoch: 4039, Training Loss: 0.03051
Epoch: 4039, Training Loss: 0.02292
Epoch: 4040, Training Loss: 0.02652
Epoch: 4040, Training Loss: 0.02389
Epoch: 4040, Training Loss: 0.03051
Epoch: 4040, Training Loss: 0.02292
Epoch: 4041, Training Loss: 0.02651
Epoch: 4041, Training Loss: 0.02389
Epoch: 4041, Training Loss: 0.03050
Epoch: 4041, Training Loss: 0.02292
Epoch: 4042, Training Loss: 0.02651
Epoch: 4042, Training Loss: 0.02388
Epoch: 4042, Training Loss: 0.03050
Epoch: 4042, Training Loss: 0.02291
Epoch: 4043, Training Loss: 0.02650
Epoch: 4043, Training Loss: 0.02388
Epoch: 4043, Training Loss: 0.03049
Epoch: 4043, Training Loss: 0.02291
Epoch: 4044, Training Loss: 0.02650
Epoch: 4044, Training Loss: 0.02388
Epoch: 4044, Training Loss: 0.03049
Epoch: 4044, Training Loss: 0.02291
Epoch: 4045, Training Loss: 0.02649
Epoch: 4045, Training Loss: 0.02387
Epoch: 4045, Training Loss: 0.03048
Epoch: 4045, Training Loss: 0.02290
Epoch: 4046, Training Loss: 0.02649
Epoch: 4046, Training Loss: 0.02387
Epoch: 4046, Training Loss: 0.03048
Epoch: 4046, Training Loss: 0.02290
Epoch: 4047, Training Loss: 0.02649
Epoch: 4047, Training Loss: 0.02387
Epoch: 4047, Training Loss: 0.03047
Epoch: 4047, Training Loss: 0.02290
Epoch: 4048, Training Loss: 0.02648
Epoch: 4048, Training Loss: 0.02386
Epoch: 4048, Training Loss: 0.03047
Epoch: 4048, Training Loss: 0.02289
Epoch: 4049, Training Loss: 0.02648
Epoch: 4049, Training Loss: 0.02386
Epoch: 4049, Training Loss: 0.03046
Epoch: 4049, Training Loss: 0.02289
Epoch: 4050, Training Loss: 0.02647
Epoch: 4050, Training Loss: 0.02385
Epoch: 4050, Training Loss: 0.03046
Epoch: 4050, Training Loss: 0.02289
Epoch: 4051, Training Loss: 0.02647
Epoch: 4051, Training Loss: 0.02385
Epoch: 4051, Training Loss: 0.03045
Epoch: 4051, Training Loss: 0.02288
Epoch: 4052, Training Loss: 0.02646
Epoch: 4052, Training Loss: 0.02385
Epoch: 4052, Training Loss: 0.03045
Epoch: 4052, Training Loss: 0.02288
Epoch: 4053, Training Loss: 0.02646
Epoch: 4053, Training Loss: 0.02384
Epoch: 4053, Training Loss: 0.03044
Epoch: 4053, Training Loss: 0.02287
Epoch: 4054, Training Loss: 0.02646
Epoch: 4054, Training Loss: 0.02384
Epoch: 4054, Training Loss: 0.03044
Epoch: 4054, Training Loss: 0.02287
Epoch: 4055, Training Loss: 0.02645
Epoch: 4055, Training Loss: 0.02384
Epoch: 4055, Training Loss: 0.03043
Epoch: 4055, Training Loss: 0.02287
Epoch: 4056, Training Loss: 0.02645
Epoch: 4056, Training Loss: 0.02383
Epoch: 4056, Training Loss: 0.03043
Epoch: 4056, Training Loss: 0.02286
Epoch: 4057, Training Loss: 0.02644
Epoch: 4057, Training Loss: 0.02383
Epoch: 4057, Training Loss: 0.03042
Epoch: 4057, Training Loss: 0.02286
Epoch: 4058, Training Loss: 0.02644
Epoch: 4058, Training Loss: 0.02383
Epoch: 4058, Training Loss: 0.03042
Epoch: 4058, Training Loss: 0.02286
Epoch: 4059, Training Loss: 0.02644
Epoch: 4059, Training Loss: 0.02382
Epoch: 4059, Training Loss: 0.03041
Epoch: 4059, Training Loss: 0.02285
Epoch: 4060, Training Loss: 0.02643
Epoch: 4060, Training Loss: 0.02382
Epoch: 4060, Training Loss: 0.03041
Epoch: 4060, Training Loss: 0.02285
Epoch: 4061, Training Loss: 0.02643
Epoch: 4061, Training Loss: 0.02381
Epoch: 4061, Training Loss: 0.03040
Epoch: 4061, Training Loss: 0.02285
Epoch: 4062, Training Loss: 0.02642
Epoch: 4062, Training Loss: 0.02381
Epoch: 4062, Training Loss: 0.03040
Epoch: 4062, Training Loss: 0.02284
Epoch: 4063, Training Loss: 0.02642
Epoch: 4063, Training Loss: 0.02381
Epoch: 4063, Training Loss: 0.03039
Epoch: 4063, Training Loss: 0.02284
Epoch: 4064, Training Loss: 0.02641
Epoch: 4064, Training Loss: 0.02380
Epoch: 4064, Training Loss: 0.03039
Epoch: 4064, Training Loss: 0.02284
Epoch: 4065, Training Loss: 0.02641
Epoch: 4065, Training Loss: 0.02380
Epoch: 4065, Training Loss: 0.03038
Epoch: 4065, Training Loss: 0.02283
Epoch: 4066, Training Loss: 0.02641
Epoch: 4066, Training Loss: 0.02380
Epoch: 4066, Training Loss: 0.03038
Epoch: 4066, Training Loss: 0.02283
Epoch: 4067, Training Loss: 0.02640
Epoch: 4067, Training Loss: 0.02379
Epoch: 4067, Training Loss: 0.03037
Epoch: 4067, Training Loss: 0.02282
Epoch: 4068, Training Loss: 0.02640
Epoch: 4068, Training Loss: 0.02379
Epoch: 4068, Training Loss: 0.03037
Epoch: 4068, Training Loss: 0.02282
Epoch: 4069, Training Loss: 0.02639
Epoch: 4069, Training Loss: 0.02379
Epoch: 4069, Training Loss: 0.03036
Epoch: 4069, Training Loss: 0.02282
Epoch: 4070, Training Loss: 0.02639
Epoch: 4070, Training Loss: 0.02378
Epoch: 4070, Training Loss: 0.03036
Epoch: 4070, Training Loss: 0.02281
Epoch: 4071, Training Loss: 0.02638
Epoch: 4071, Training Loss: 0.02378
Epoch: 4071, Training Loss: 0.03035
Epoch: 4071, Training Loss: 0.02281
Epoch: 4072, Training Loss: 0.02638
Epoch: 4072, Training Loss: 0.02378
Epoch: 4072, Training Loss: 0.03035
Epoch: 4072, Training Loss: 0.02281
Epoch: 4073, Training Loss: 0.02638
Epoch: 4073, Training Loss: 0.02377
Epoch: 4073, Training Loss: 0.03035
Epoch: 4073, Training Loss: 0.02280
Epoch: 4074, Training Loss: 0.02637
Epoch: 4074, Training Loss: 0.02377
Epoch: 4074, Training Loss: 0.03034
Epoch: 4074, Training Loss: 0.02280
Epoch: 4075, Training Loss: 0.02637
Epoch: 4075, Training Loss: 0.02376
Epoch: 4075, Training Loss: 0.03034
Epoch: 4075, Training Loss: 0.02280
Epoch: 4076, Training Loss: 0.02636
Epoch: 4076, Training Loss: 0.02376
Epoch: 4076, Training Loss: 0.03033
Epoch: 4076, Training Loss: 0.02279
Epoch: 4077, Training Loss: 0.02636
Epoch: 4077, Training Loss: 0.02376
Epoch: 4077, Training Loss: 0.03033
Epoch: 4077, Training Loss: 0.02279
Epoch: 4078, Training Loss: 0.02636
Epoch: 4078, Training Loss: 0.02375
Epoch: 4078, Training Loss: 0.03032
Epoch: 4078, Training Loss: 0.02279
Epoch: 4079, Training Loss: 0.02635
Epoch: 4079, Training Loss: 0.02375
Epoch: 4079, Training Loss: 0.03032
Epoch: 4079, Training Loss: 0.02278
Epoch: 4080, Training Loss: 0.02635
Epoch: 4080, Training Loss: 0.02375
Epoch: 4080, Training Loss: 0.03031
Epoch: 4080, Training Loss: 0.02278
Epoch: 4081, Training Loss: 0.02634
Epoch: 4081, Training Loss: 0.02374
Epoch: 4081, Training Loss: 0.03031
Epoch: 4081, Training Loss: 0.02278
Epoch: 4082, Training Loss: 0.02634
Epoch: 4082, Training Loss: 0.02374
Epoch: 4082, Training Loss: 0.03030
Epoch: 4082, Training Loss: 0.02277
Epoch: 4083, Training Loss: 0.02633
Epoch: 4083, Training Loss: 0.02374
Epoch: 4083, Training Loss: 0.03030
Epoch: 4083, Training Loss: 0.02277
Epoch: 4084, Training Loss: 0.02633
Epoch: 4084, Training Loss: 0.02373
Epoch: 4084, Training Loss: 0.03029
Epoch: 4084, Training Loss: 0.02276
Epoch: 4085, Training Loss: 0.02633
Epoch: 4085, Training Loss: 0.02373
Epoch: 4085, Training Loss: 0.03029
Epoch: 4085, Training Loss: 0.02276
Epoch: 4086, Training Loss: 0.02632
Epoch: 4086, Training Loss: 0.02373
Epoch: 4086, Training Loss: 0.03028
Epoch: 4086, Training Loss: 0.02276
Epoch: 4087, Training Loss: 0.02632
Epoch: 4087, Training Loss: 0.02372
Epoch: 4087, Training Loss: 0.03028
Epoch: 4087, Training Loss: 0.02275
Epoch: 4088, Training Loss: 0.02631
Epoch: 4088, Training Loss: 0.02372
Epoch: 4088, Training Loss: 0.03027
Epoch: 4088, Training Loss: 0.02275
Epoch: 4089, Training Loss: 0.02631
Epoch: 4089, Training Loss: 0.02371
Epoch: 4089, Training Loss: 0.03027
Epoch: 4089, Training Loss: 0.02275
Epoch: 4090, Training Loss: 0.02631
Epoch: 4090, Training Loss: 0.02371
Epoch: 4090, Training Loss: 0.03026
Epoch: 4090, Training Loss: 0.02274
Epoch: 4091, Training Loss: 0.02630
Epoch: 4091, Training Loss: 0.02371
Epoch: 4091, Training Loss: 0.03026
Epoch: 4091, Training Loss: 0.02274
Epoch: 4092, Training Loss: 0.02630
Epoch: 4092, Training Loss: 0.02370
Epoch: 4092, Training Loss: 0.03025
Epoch: 4092, Training Loss: 0.02274
Epoch: 4093, Training Loss: 0.02629
Epoch: 4093, Training Loss: 0.02370
Epoch: 4093, Training Loss: 0.03025
Epoch: 4093, Training Loss: 0.02273
Epoch: 4094, Training Loss: 0.02629
Epoch: 4094, Training Loss: 0.02370
Epoch: 4094, Training Loss: 0.03024
Epoch: 4094, Training Loss: 0.02273
Epoch: 4095, Training Loss: 0.02628
Epoch: 4095, Training Loss: 0.02369
Epoch: 4095, Training Loss: 0.03024
Epoch: 4095, Training Loss: 0.02273
Epoch: 4096, Training Loss: 0.02628
Epoch: 4096, Training Loss: 0.02369
Epoch: 4096, Training Loss: 0.03023
Epoch: 4096, Training Loss: 0.02272
Epoch: 4097, Training Loss: 0.02628
Epoch: 4097, Training Loss: 0.02369
Epoch: 4097, Training Loss: 0.03023
Epoch: 4097, Training Loss: 0.02272
Epoch: 4098, Training Loss: 0.02627
Epoch: 4098, Training Loss: 0.02368
Epoch: 4098, Training Loss: 0.03022
Epoch: 4098, Training Loss: 0.02272
Epoch: 4099, Training Loss: 0.02627
Epoch: 4099, Training Loss: 0.02368
Epoch: 4099, Training Loss: 0.03022
Epoch: 4099, Training Loss: 0.02271
Epoch: 4100, Training Loss: 0.02626
Epoch: 4100, Training Loss: 0.02368
Epoch: 4100, Training Loss: 0.03021
Epoch: 4100, Training Loss: 0.02271
Epoch: 4101, Training Loss: 0.02626
Epoch: 4101, Training Loss: 0.02367
Epoch: 4101, Training Loss: 0.03021
Epoch: 4101, Training Loss: 0.02271
Epoch: 4102, Training Loss: 0.02626
Epoch: 4102, Training Loss: 0.02367
Epoch: 4102, Training Loss: 0.03020
Epoch: 4102, Training Loss: 0.02270
Epoch: 4103, Training Loss: 0.02625
Epoch: 4103, Training Loss: 0.02367
Epoch: 4103, Training Loss: 0.03020
Epoch: 4103, Training Loss: 0.02270
Epoch: 4104, Training Loss: 0.02625
Epoch: 4104, Training Loss: 0.02366
Epoch: 4104, Training Loss: 0.03019
Epoch: 4104, Training Loss: 0.02269
Epoch: 4105, Training Loss: 0.02624
Epoch: 4105, Training Loss: 0.02366
Epoch: 4105, Training Loss: 0.03019
Epoch: 4105, Training Loss: 0.02269
Epoch: 4106, Training Loss: 0.02624
Epoch: 4106, Training Loss: 0.02365
Epoch: 4106, Training Loss: 0.03018
Epoch: 4106, Training Loss: 0.02269
Epoch: 4107, Training Loss: 0.02623
Epoch: 4107, Training Loss: 0.02365
Epoch: 4107, Training Loss: 0.03018
Epoch: 4107, Training Loss: 0.02268
Epoch: 4108, Training Loss: 0.02623
Epoch: 4108, Training Loss: 0.02365
Epoch: 4108, Training Loss: 0.03017
Epoch: 4108, Training Loss: 0.02268
Epoch: 4109, Training Loss: 0.02623
Epoch: 4109, Training Loss: 0.02364
Epoch: 4109, Training Loss: 0.03017
Epoch: 4109, Training Loss: 0.02268
Epoch: 4110, Training Loss: 0.02622
Epoch: 4110, Training Loss: 0.02364
Epoch: 4110, Training Loss: 0.03016
Epoch: 4110, Training Loss: 0.02267
Epoch: 4111, Training Loss: 0.02622
Epoch: 4111, Training Loss: 0.02364
Epoch: 4111, Training Loss: 0.03016
Epoch: 4111, Training Loss: 0.02267
Epoch: 4112, Training Loss: 0.02621
Epoch: 4112, Training Loss: 0.02363
Epoch: 4112, Training Loss: 0.03015
Epoch: 4112, Training Loss: 0.02267
Epoch: 4113, Training Loss: 0.02621
Epoch: 4113, Training Loss: 0.02363
Epoch: 4113, Training Loss: 0.03015
Epoch: 4113, Training Loss: 0.02266
Epoch: 4114, Training Loss: 0.02621
Epoch: 4114, Training Loss: 0.02363
Epoch: 4114, Training Loss: 0.03014
Epoch: 4114, Training Loss: 0.02266
Epoch: 4115, Training Loss: 0.02620
Epoch: 4115, Training Loss: 0.02362
Epoch: 4115, Training Loss: 0.03014
Epoch: 4115, Training Loss: 0.02266
Epoch: 4116, Training Loss: 0.02620
Epoch: 4116, Training Loss: 0.02362
Epoch: 4116, Training Loss: 0.03013
Epoch: 4116, Training Loss: 0.02265
Epoch: 4117, Training Loss: 0.02619
Epoch: 4117, Training Loss: 0.02362
Epoch: 4117, Training Loss: 0.03013
Epoch: 4117, Training Loss: 0.02265
Epoch: 4118, Training Loss: 0.02619
Epoch: 4118, Training Loss: 0.02361
Epoch: 4118, Training Loss: 0.03013
Epoch: 4118, Training Loss: 0.02265
Epoch: 4119, Training Loss: 0.02619
Epoch: 4119, Training Loss: 0.02361
Epoch: 4119, Training Loss: 0.03012
Epoch: 4119, Training Loss: 0.02264
Epoch: 4120, Training Loss: 0.02618
Epoch: 4120, Training Loss: 0.02361
Epoch: 4120, Training Loss: 0.03012
Epoch: 4120, Training Loss: 0.02264
Epoch: 4121, Training Loss: 0.02618
Epoch: 4121, Training Loss: 0.02360
Epoch: 4121, Training Loss: 0.03011
Epoch: 4121, Training Loss: 0.02264
Epoch: 4122, Training Loss: 0.02617
Epoch: 4122, Training Loss: 0.02360
Epoch: 4122, Training Loss: 0.03011
Epoch: 4122, Training Loss: 0.02263
Epoch: 4123, Training Loss: 0.02617
Epoch: 4123, Training Loss: 0.02359
Epoch: 4123, Training Loss: 0.03010
Epoch: 4123, Training Loss: 0.02263
Epoch: 4124, Training Loss: 0.02616
Epoch: 4124, Training Loss: 0.02359
Epoch: 4124, Training Loss: 0.03010
Epoch: 4124, Training Loss: 0.02263
Epoch: 4125, Training Loss: 0.02616
Epoch: 4125, Training Loss: 0.02359
Epoch: 4125, Training Loss: 0.03009
Epoch: 4125, Training Loss: 0.02262
Epoch: 4126, Training Loss: 0.02616
Epoch: 4126, Training Loss: 0.02358
Epoch: 4126, Training Loss: 0.03009
Epoch: 4126, Training Loss: 0.02262
Epoch: 4127, Training Loss: 0.02615
Epoch: 4127, Training Loss: 0.02358
Epoch: 4127, Training Loss: 0.03008
Epoch: 4127, Training Loss: 0.02261
Epoch: 4128, Training Loss: 0.02615
Epoch: 4128, Training Loss: 0.02358
Epoch: 4128, Training Loss: 0.03008
Epoch: 4128, Training Loss: 0.02261
Epoch: 4129, Training Loss: 0.02614
Epoch: 4129, Training Loss: 0.02357
Epoch: 4129, Training Loss: 0.03007
Epoch: 4129, Training Loss: 0.02261
Epoch: 4130, Training Loss: 0.02614
Epoch: 4130, Training Loss: 0.02357
Epoch: 4130, Training Loss: 0.03007
Epoch: 4130, Training Loss: 0.02260
Epoch: 4131, Training Loss: 0.02614
Epoch: 4131, Training Loss: 0.02357
Epoch: 4131, Training Loss: 0.03006
Epoch: 4131, Training Loss: 0.02260
Epoch: 4132, Training Loss: 0.02613
Epoch: 4132, Training Loss: 0.02356
Epoch: 4132, Training Loss: 0.03006
Epoch: 4132, Training Loss: 0.02260
Epoch: 4133, Training Loss: 0.02613
Epoch: 4133, Training Loss: 0.02356
Epoch: 4133, Training Loss: 0.03005
Epoch: 4133, Training Loss: 0.02259
Epoch: 4134, Training Loss: 0.02612
Epoch: 4134, Training Loss: 0.02356
Epoch: 4134, Training Loss: 0.03005
Epoch: 4134, Training Loss: 0.02259
Epoch: 4135, Training Loss: 0.02612
Epoch: 4135, Training Loss: 0.02355
Epoch: 4135, Training Loss: 0.03004
Epoch: 4135, Training Loss: 0.02259
Epoch: 4136, Training Loss: 0.02612
Epoch: 4136, Training Loss: 0.02355
Epoch: 4136, Training Loss: 0.03004
Epoch: 4136, Training Loss: 0.02258
Epoch: 4137, Training Loss: 0.02611
Epoch: 4137, Training Loss: 0.02355
Epoch: 4137, Training Loss: 0.03003
Epoch: 4137, Training Loss: 0.02258
Epoch: 4138, Training Loss: 0.02611
Epoch: 4138, Training Loss: 0.02354
Epoch: 4138, Training Loss: 0.03003
Epoch: 4138, Training Loss: 0.02258
Epoch: 4139, Training Loss: 0.02610
Epoch: 4139, Training Loss: 0.02354
Epoch: 4139, Training Loss: 0.03002
Epoch: 4139, Training Loss: 0.02257
Epoch: 4140, Training Loss: 0.02610
Epoch: 4140, Training Loss: 0.02354
Epoch: 4140, Training Loss: 0.03002
Epoch: 4140, Training Loss: 0.02257
Epoch: 4141, Training Loss: 0.02610
Epoch: 4141, Training Loss: 0.02353
Epoch: 4141, Training Loss: 0.03001
Epoch: 4141, Training Loss: 0.02257
Epoch: 4142, Training Loss: 0.02609
Epoch: 4142, Training Loss: 0.02353
Epoch: 4142, Training Loss: 0.03001
Epoch: 4142, Training Loss: 0.02256
Epoch: 4143, Training Loss: 0.02609
Epoch: 4143, Training Loss: 0.02353
Epoch: 4143, Training Loss: 0.03001
Epoch: 4143, Training Loss: 0.02256
Epoch: 4144, Training Loss: 0.02608
Epoch: 4144, Training Loss: 0.02352
Epoch: 4144, Training Loss: 0.03000
Epoch: 4144, Training Loss: 0.02256
Epoch: 4145, Training Loss: 0.02608
Epoch: 4145, Training Loss: 0.02352
Epoch: 4145, Training Loss: 0.03000
Epoch: 4145, Training Loss: 0.02255
Epoch: 4146, Training Loss: 0.02607
Epoch: 4146, Training Loss: 0.02351
Epoch: 4146, Training Loss: 0.02999
Epoch: 4146, Training Loss: 0.02255
Epoch: 4147, Training Loss: 0.02607
Epoch: 4147, Training Loss: 0.02351
Epoch: 4147, Training Loss: 0.02999
Epoch: 4147, Training Loss: 0.02255
Epoch: 4148, Training Loss: 0.02607
Epoch: 4148, Training Loss: 0.02351
Epoch: 4148, Training Loss: 0.02998
Epoch: 4148, Training Loss: 0.02254
Epoch: 4149, Training Loss: 0.02606
Epoch: 4149, Training Loss: 0.02350
Epoch: 4149, Training Loss: 0.02998
Epoch: 4149, Training Loss: 0.02254
Epoch: 4150, Training Loss: 0.02606
Epoch: 4150, Training Loss: 0.02350
Epoch: 4150, Training Loss: 0.02997
Epoch: 4150, Training Loss: 0.02254
Epoch: 4151, Training Loss: 0.02605
Epoch: 4151, Training Loss: 0.02350
Epoch: 4151, Training Loss: 0.02997
Epoch: 4151, Training Loss: 0.02253
Epoch: 4152, Training Loss: 0.02605
Epoch: 4152, Training Loss: 0.02349
Epoch: 4152, Training Loss: 0.02996
Epoch: 4152, Training Loss: 0.02253
Epoch: 4153, Training Loss: 0.02605
Epoch: 4153, Training Loss: 0.02349
Epoch: 4153, Training Loss: 0.02996
Epoch: 4153, Training Loss: 0.02253
Epoch: 4154, Training Loss: 0.02604
Epoch: 4154, Training Loss: 0.02349
Epoch: 4154, Training Loss: 0.02995
Epoch: 4154, Training Loss: 0.02252
Epoch: 4155, Training Loss: 0.02604
Epoch: 4155, Training Loss: 0.02348
Epoch: 4155, Training Loss: 0.02995
Epoch: 4155, Training Loss: 0.02252
Epoch: 4156, Training Loss: 0.02603
Epoch: 4156, Training Loss: 0.02348
Epoch: 4156, Training Loss: 0.02994
Epoch: 4156, Training Loss: 0.02252
Epoch: 4157, Training Loss: 0.02603
Epoch: 4157, Training Loss: 0.02348
Epoch: 4157, Training Loss: 0.02994
Epoch: 4157, Training Loss: 0.02251
Epoch: 4158, Training Loss: 0.02603
Epoch: 4158, Training Loss: 0.02347
Epoch: 4158, Training Loss: 0.02993
Epoch: 4158, Training Loss: 0.02251
Epoch: 4159, Training Loss: 0.02602
Epoch: 4159, Training Loss: 0.02347
Epoch: 4159, Training Loss: 0.02993
Epoch: 4159, Training Loss: 0.02251
Epoch: 4160, Training Loss: 0.02602
Epoch: 4160, Training Loss: 0.02347
Epoch: 4160, Training Loss: 0.02992
Epoch: 4160, Training Loss: 0.02250
Epoch: 4161, Training Loss: 0.02601
Epoch: 4161, Training Loss: 0.02346
Epoch: 4161, Training Loss: 0.02992
Epoch: 4161, Training Loss: 0.02250
Epoch: 4162, Training Loss: 0.02601
Epoch: 4162, Training Loss: 0.02346
Epoch: 4162, Training Loss: 0.02991
Epoch: 4162, Training Loss: 0.02250
Epoch: 4163, Training Loss: 0.02601
Epoch: 4163, Training Loss: 0.02346
Epoch: 4163, Training Loss: 0.02991
Epoch: 4163, Training Loss: 0.02249
Epoch: 4164, Training Loss: 0.02600
Epoch: 4164, Training Loss: 0.02345
Epoch: 4164, Training Loss: 0.02991
Epoch: 4164, Training Loss: 0.02249
Epoch: 4165, Training Loss: 0.02600
Epoch: 4165, Training Loss: 0.02345
Epoch: 4165, Training Loss: 0.02990
Epoch: 4165, Training Loss: 0.02248
Epoch: 4166, Training Loss: 0.02599
Epoch: 4166, Training Loss: 0.02345
Epoch: 4166, Training Loss: 0.02990
Epoch: 4166, Training Loss: 0.02248
Epoch: 4167, Training Loss: 0.02599
Epoch: 4167, Training Loss: 0.02344
Epoch: 4167, Training Loss: 0.02989
Epoch: 4167, Training Loss: 0.02248
Epoch: 4168, Training Loss: 0.02599
Epoch: 4168, Training Loss: 0.02344
Epoch: 4168, Training Loss: 0.02989
Epoch: 4168, Training Loss: 0.02247
Epoch: 4169, Training Loss: 0.02598
Epoch: 4169, Training Loss: 0.02344
Epoch: 4169, Training Loss: 0.02988
Epoch: 4169, Training Loss: 0.02247
Epoch: 4170, Training Loss: 0.02598
Epoch: 4170, Training Loss: 0.02343
Epoch: 4170, Training Loss: 0.02988
Epoch: 4170, Training Loss: 0.02247
Epoch: 4171, Training Loss: 0.02597
Epoch: 4171, Training Loss: 0.02343
Epoch: 4171, Training Loss: 0.02987
Epoch: 4171, Training Loss: 0.02246
Epoch: 4172, Training Loss: 0.02597
Epoch: 4172, Training Loss: 0.02343
Epoch: 4172, Training Loss: 0.02987
Epoch: 4172, Training Loss: 0.02246
Epoch: 4173, Training Loss: 0.02597
Epoch: 4173, Training Loss: 0.02342
Epoch: 4173, Training Loss: 0.02986
Epoch: 4173, Training Loss: 0.02246
Epoch: 4174, Training Loss: 0.02596
Epoch: 4174, Training Loss: 0.02342
Epoch: 4174, Training Loss: 0.02986
Epoch: 4174, Training Loss: 0.02245
Epoch: 4175, Training Loss: 0.02596
Epoch: 4175, Training Loss: 0.02342
Epoch: 4175, Training Loss: 0.02985
Epoch: 4175, Training Loss: 0.02245
Epoch: 4176, Training Loss: 0.02595
Epoch: 4176, Training Loss: 0.02341
Epoch: 4176, Training Loss: 0.02985
Epoch: 4176, Training Loss: 0.02245
Epoch: 4177, Training Loss: 0.02595
Epoch: 4177, Training Loss: 0.02341
Epoch: 4177, Training Loss: 0.02984
Epoch: 4177, Training Loss: 0.02244
Epoch: 4178, Training Loss: 0.02595
Epoch: 4178, Training Loss: 0.02340
Epoch: 4178, Training Loss: 0.02984
Epoch: 4178, Training Loss: 0.02244
Epoch: 4179, Training Loss: 0.02594
Epoch: 4179, Training Loss: 0.02340
Epoch: 4179, Training Loss: 0.02983
Epoch: 4179, Training Loss: 0.02244
Epoch: 4180, Training Loss: 0.02594
Epoch: 4180, Training Loss: 0.02340
Epoch: 4180, Training Loss: 0.02983
Epoch: 4180, Training Loss: 0.02243
Epoch: 4181, Training Loss: 0.02593
Epoch: 4181, Training Loss: 0.02339
Epoch: 4181, Training Loss: 0.02983
Epoch: 4181, Training Loss: 0.02243
Epoch: 4182, Training Loss: 0.02593
Epoch: 4182, Training Loss: 0.02339
Epoch: 4182, Training Loss: 0.02982
Epoch: 4182, Training Loss: 0.02243
Epoch: 4183, Training Loss: 0.02593
Epoch: 4183, Training Loss: 0.02339
Epoch: 4183, Training Loss: 0.02982
Epoch: 4183, Training Loss: 0.02242
Epoch: 4184, Training Loss: 0.02592
Epoch: 4184, Training Loss: 0.02338
Epoch: 4184, Training Loss: 0.02981
Epoch: 4184, Training Loss: 0.02242
Epoch: 4185, Training Loss: 0.02592
Epoch: 4185, Training Loss: 0.02338
Epoch: 4185, Training Loss: 0.02981
Epoch: 4185, Training Loss: 0.02242
Epoch: 4186, Training Loss: 0.02591
Epoch: 4186, Training Loss: 0.02338
Epoch: 4186, Training Loss: 0.02980
Epoch: 4186, Training Loss: 0.02241
Epoch: 4187, Training Loss: 0.02591
Epoch: 4187, Training Loss: 0.02337
Epoch: 4187, Training Loss: 0.02980
Epoch: 4187, Training Loss: 0.02241
Epoch: 4188, Training Loss: 0.02591
Epoch: 4188, Training Loss: 0.02337
Epoch: 4188, Training Loss: 0.02979
Epoch: 4188, Training Loss: 0.02241
Epoch: 4189, Training Loss: 0.02590
Epoch: 4189, Training Loss: 0.02337
Epoch: 4189, Training Loss: 0.02979
Epoch: 4189, Training Loss: 0.02240
Epoch: 4190, Training Loss: 0.02590
Epoch: 4190, Training Loss: 0.02336
Epoch: 4190, Training Loss: 0.02978
Epoch: 4190, Training Loss: 0.02240
Epoch: 4191, Training Loss: 0.02589
Epoch: 4191, Training Loss: 0.02336
Epoch: 4191, Training Loss: 0.02978
Epoch: 4191, Training Loss: 0.02240
Epoch: 4192, Training Loss: 0.02589
Epoch: 4192, Training Loss: 0.02336
Epoch: 4192, Training Loss: 0.02977
Epoch: 4192, Training Loss: 0.02239
Epoch: 4193, Training Loss: 0.02589
Epoch: 4193, Training Loss: 0.02335
Epoch: 4193, Training Loss: 0.02977
Epoch: 4193, Training Loss: 0.02239
Epoch: 4194, Training Loss: 0.02588
Epoch: 4194, Training Loss: 0.02335
Epoch: 4194, Training Loss: 0.02976
Epoch: 4194, Training Loss: 0.02239
Epoch: 4195, Training Loss: 0.02588
Epoch: 4195, Training Loss: 0.02335
Epoch: 4195, Training Loss: 0.02976
Epoch: 4195, Training Loss: 0.02238
Epoch: 4196, Training Loss: 0.02587
Epoch: 4196, Training Loss: 0.02334
Epoch: 4196, Training Loss: 0.02975
Epoch: 4196, Training Loss: 0.02238
Epoch: 4197, Training Loss: 0.02587
Epoch: 4197, Training Loss: 0.02334
Epoch: 4197, Training Loss: 0.02975
Epoch: 4197, Training Loss: 0.02238
Epoch: 4198, Training Loss: 0.02587
Epoch: 4198, Training Loss: 0.02334
Epoch: 4198, Training Loss: 0.02975
Epoch: 4198, Training Loss: 0.02237
Epoch: 4199, Training Loss: 0.02586
Epoch: 4199, Training Loss: 0.02333
Epoch: 4199, Training Loss: 0.02974
Epoch: 4199, Training Loss: 0.02237
Epoch: 4200, Training Loss: 0.02586
Epoch: 4200, Training Loss: 0.02333
Epoch: 4200, Training Loss: 0.02974
Epoch: 4200, Training Loss: 0.02237
Epoch: 4201, Training Loss: 0.02585
Epoch: 4201, Training Loss: 0.02333
Epoch: 4201, Training Loss: 0.02973
Epoch: 4201, Training Loss: 0.02236
Epoch: 4202, Training Loss: 0.02585
Epoch: 4202, Training Loss: 0.02332
Epoch: 4202, Training Loss: 0.02973
Epoch: 4202, Training Loss: 0.02236
Epoch: 4203, Training Loss: 0.02585
Epoch: 4203, Training Loss: 0.02332
Epoch: 4203, Training Loss: 0.02972
Epoch: 4203, Training Loss: 0.02236
Epoch: 4204, Training Loss: 0.02584
Epoch: 4204, Training Loss: 0.02332
Epoch: 4204, Training Loss: 0.02972
Epoch: 4204, Training Loss: 0.02235
Epoch: 4205, Training Loss: 0.02584
Epoch: 4205, Training Loss: 0.02331
Epoch: 4205, Training Loss: 0.02971
Epoch: 4205, Training Loss: 0.02235
Epoch: 4206, Training Loss: 0.02583
Epoch: 4206, Training Loss: 0.02331
Epoch: 4206, Training Loss: 0.02971
Epoch: 4206, Training Loss: 0.02235
Epoch: 4207, Training Loss: 0.02583
Epoch: 4207, Training Loss: 0.02331
Epoch: 4207, Training Loss: 0.02970
Epoch: 4207, Training Loss: 0.02234
Epoch: 4208, Training Loss: 0.02583
Epoch: 4208, Training Loss: 0.02330
Epoch: 4208, Training Loss: 0.02970
Epoch: 4208, Training Loss: 0.02234
Epoch: 4209, Training Loss: 0.02582
Epoch: 4209, Training Loss: 0.02330
Epoch: 4209, Training Loss: 0.02969
Epoch: 4209, Training Loss: 0.02234
  ⋮
Epoch: 4459, Training Loss: 0.02489
Epoch: 4459, Training Loss: 0.02250
Epoch: 4459, Training Loss: 0.02860
Epoch: 4459, Training Loss: 0.02155
Epoch: 4460, Training Loss: 0.02488
Epoch: 4460, Training Loss: 0.02250
Epoch: 4460, Training Loss: 0.02859
Epoch: 4460, Training Loss: 0.02155
Epoch: 4461, Training Loss: 0.02488
Epoch: 4461, Training Loss: 0.02249
Epoch: 4461, Training Loss: 0.02859
Epoch: 4461, Training Loss: 0.02154
Epoch: 4462, Training Loss: 0.02488
Epoch: 4462, Training Loss: 0.02249
Epoch: 4462, Training Loss: 0.02858
Epoch: 4462, Training Loss: 0.02154
Epoch: 4463, Training Loss: 0.02487
Epoch: 4463, Training Loss: 0.02249
Epoch: 4463, Training Loss: 0.02858
Epoch: 4463, Training Loss: 0.02154
Epoch: 4464, Training Loss: 0.02487
Epoch: 4464, Training Loss: 0.02248
Epoch: 4464, Training Loss: 0.02858
Epoch: 4464, Training Loss: 0.02153
Epoch: 4465, Training Loss: 0.02487
Epoch: 4465, Training Loss: 0.02248
Epoch: 4465, Training Loss: 0.02857
Epoch: 4465, Training Loss: 0.02153
Epoch: 4466, Training Loss: 0.02486
Epoch: 4466, Training Loss: 0.02248
Epoch: 4466, Training Loss: 0.02857
Epoch: 4466, Training Loss: 0.02153
Epoch: 4467, Training Loss: 0.02486
Epoch: 4467, Training Loss: 0.02247
Epoch: 4467, Training Loss: 0.02856
Epoch: 4467, Training Loss: 0.02152
Epoch: 4468, Training Loss: 0.02486
Epoch: 4468, Training Loss: 0.02247
Epoch: 4468, Training Loss: 0.02856
Epoch: 4468, Training Loss: 0.02152
Epoch: 4469, Training Loss: 0.02485
Epoch: 4469, Training Loss: 0.02247
Epoch: 4469, Training Loss: 0.02856
Epoch: 4469, Training Loss: 0.02152
Epoch: 4470, Training Loss: 0.02485
Epoch: 4470, Training Loss: 0.02247
Epoch: 4470, Training Loss: 0.02855
Epoch: 4470, Training Loss: 0.02152
Epoch: 4471, Training Loss: 0.02485
Epoch: 4471, Training Loss: 0.02246
Epoch: 4471, Training Loss: 0.02855
Epoch: 4471, Training Loss: 0.02151
Epoch: 4472, Training Loss: 0.02484
Epoch: 4472, Training Loss: 0.02246
Epoch: 4472, Training Loss: 0.02854
Epoch: 4472, Training Loss: 0.02151
Epoch: 4473, Training Loss: 0.02484
Epoch: 4473, Training Loss: 0.02246
Epoch: 4473, Training Loss: 0.02854
Epoch: 4473, Training Loss: 0.02151
Epoch: 4474, Training Loss: 0.02483
Epoch: 4474, Training Loss: 0.02245
Epoch: 4474, Training Loss: 0.02854
Epoch: 4474, Training Loss: 0.02150
Epoch: 4475, Training Loss: 0.02483
Epoch: 4475, Training Loss: 0.02245
Epoch: 4475, Training Loss: 0.02853
Epoch: 4475, Training Loss: 0.02150
Epoch: 4476, Training Loss: 0.02483
Epoch: 4476, Training Loss: 0.02245
Epoch: 4476, Training Loss: 0.02853
Epoch: 4476, Training Loss: 0.02150
Epoch: 4477, Training Loss: 0.02482
Epoch: 4477, Training Loss: 0.02244
Epoch: 4477, Training Loss: 0.02852
Epoch: 4477, Training Loss: 0.02149
Epoch: 4478, Training Loss: 0.02482
Epoch: 4478, Training Loss: 0.02244
Epoch: 4478, Training Loss: 0.02852
Epoch: 4478, Training Loss: 0.02149
Epoch: 4479, Training Loss: 0.02482
Epoch: 4479, Training Loss: 0.02244
Epoch: 4479, Training Loss: 0.02851
Epoch: 4479, Training Loss: 0.02149
Epoch: 4480, Training Loss: 0.02481
Epoch: 4480, Training Loss: 0.02244
Epoch: 4480, Training Loss: 0.02851
Epoch: 4480, Training Loss: 0.02149
Epoch: 4481, Training Loss: 0.02481
Epoch: 4481, Training Loss: 0.02243
Epoch: 4481, Training Loss: 0.02851
Epoch: 4481, Training Loss: 0.02148
Epoch: 4482, Training Loss: 0.02481
Epoch: 4482, Training Loss: 0.02243
Epoch: 4482, Training Loss: 0.02850
Epoch: 4482, Training Loss: 0.02148
Epoch: 4483, Training Loss: 0.02480
Epoch: 4483, Training Loss: 0.02243
Epoch: 4483, Training Loss: 0.02850
Epoch: 4483, Training Loss: 0.02148
Epoch: 4484, Training Loss: 0.02480
Epoch: 4484, Training Loss: 0.02242
Epoch: 4484, Training Loss: 0.02849
Epoch: 4484, Training Loss: 0.02147
Epoch: 4485, Training Loss: 0.02480
Epoch: 4485, Training Loss: 0.02242
Epoch: 4485, Training Loss: 0.02849
Epoch: 4485, Training Loss: 0.02147
Epoch: 4486, Training Loss: 0.02479
Epoch: 4486, Training Loss: 0.02242
Epoch: 4486, Training Loss: 0.02849
Epoch: 4486, Training Loss: 0.02147
Epoch: 4487, Training Loss: 0.02479
Epoch: 4487, Training Loss: 0.02241
Epoch: 4487, Training Loss: 0.02848
Epoch: 4487, Training Loss: 0.02146
Epoch: 4488, Training Loss: 0.02479
Epoch: 4488, Training Loss: 0.02241
Epoch: 4488, Training Loss: 0.02848
Epoch: 4488, Training Loss: 0.02146
Epoch: 4489, Training Loss: 0.02478
Epoch: 4489, Training Loss: 0.02241
Epoch: 4489, Training Loss: 0.02847
Epoch: 4489, Training Loss: 0.02146
Epoch: 4490, Training Loss: 0.02478
Epoch: 4490, Training Loss: 0.02241
Epoch: 4490, Training Loss: 0.02847
Epoch: 4490, Training Loss: 0.02146
Epoch: 4491, Training Loss: 0.02477
Epoch: 4491, Training Loss: 0.02240
Epoch: 4491, Training Loss: 0.02847
Epoch: 4491, Training Loss: 0.02145
Epoch: 4492, Training Loss: 0.02477
Epoch: 4492, Training Loss: 0.02240
Epoch: 4492, Training Loss: 0.02846
Epoch: 4492, Training Loss: 0.02145
Epoch: 4493, Training Loss: 0.02477
Epoch: 4493, Training Loss: 0.02240
Epoch: 4493, Training Loss: 0.02846
Epoch: 4493, Training Loss: 0.02145
Epoch: 4494, Training Loss: 0.02476
Epoch: 4494, Training Loss: 0.02239
Epoch: 4494, Training Loss: 0.02845
Epoch: 4494, Training Loss: 0.02144
Epoch: 4495, Training Loss: 0.02476
Epoch: 4495, Training Loss: 0.02239
Epoch: 4495, Training Loss: 0.02845
Epoch: 4495, Training Loss: 0.02144
Epoch: 4496, Training Loss: 0.02476
Epoch: 4496, Training Loss: 0.02239
Epoch: 4496, Training Loss: 0.02844
Epoch: 4496, Training Loss: 0.02144
Epoch: 4497, Training Loss: 0.02475
Epoch: 4497, Training Loss: 0.02238
Epoch: 4497, Training Loss: 0.02844
Epoch: 4497, Training Loss: 0.02144
Epoch: 4498, Training Loss: 0.02475
Epoch: 4498, Training Loss: 0.02238
Epoch: 4498, Training Loss: 0.02844
Epoch: 4498, Training Loss: 0.02143
Epoch: 4499, Training Loss: 0.02475
Epoch: 4499, Training Loss: 0.02238
Epoch: 4499, Training Loss: 0.02843
Epoch: 4499, Training Loss: 0.02143
Epoch: 4500, Training Loss: 0.02474
Epoch: 4500, Training Loss: 0.02238
Epoch: 4500, Training Loss: 0.02843
Epoch: 4500, Training Loss: 0.02143
Epoch: 4501, Training Loss: 0.02474
Epoch: 4501, Training Loss: 0.02237
Epoch: 4501, Training Loss: 0.02842
Epoch: 4501, Training Loss: 0.02142
Epoch: 4502, Training Loss: 0.02474
Epoch: 4502, Training Loss: 0.02237
Epoch: 4502, Training Loss: 0.02842
Epoch: 4502, Training Loss: 0.02142
Epoch: 4503, Training Loss: 0.02473
Epoch: 4503, Training Loss: 0.02237
Epoch: 4503, Training Loss: 0.02842
Epoch: 4503, Training Loss: 0.02142
Epoch: 4504, Training Loss: 0.02473
Epoch: 4504, Training Loss: 0.02236
Epoch: 4504, Training Loss: 0.02841
Epoch: 4504, Training Loss: 0.02141
Epoch: 4505, Training Loss: 0.02473
Epoch: 4505, Training Loss: 0.02236
Epoch: 4505, Training Loss: 0.02841
Epoch: 4505, Training Loss: 0.02141
Epoch: 4506, Training Loss: 0.02472
Epoch: 4506, Training Loss: 0.02236
Epoch: 4506, Training Loss: 0.02840
Epoch: 4506, Training Loss: 0.02141
Epoch: 4507, Training Loss: 0.02472
Epoch: 4507, Training Loss: 0.02235
Epoch: 4507, Training Loss: 0.02840
Epoch: 4507, Training Loss: 0.02141
Epoch: 4508, Training Loss: 0.02472
Epoch: 4508, Training Loss: 0.02235
Epoch: 4508, Training Loss: 0.02840
Epoch: 4508, Training Loss: 0.02140
Epoch: 4509, Training Loss: 0.02471
Epoch: 4509, Training Loss: 0.02235
Epoch: 4509, Training Loss: 0.02839
Epoch: 4509, Training Loss: 0.02140
Epoch: 4510, Training Loss: 0.02471
Epoch: 4510, Training Loss: 0.02235
Epoch: 4510, Training Loss: 0.02839
Epoch: 4510, Training Loss: 0.02140
Epoch: 4511, Training Loss: 0.02471
Epoch: 4511, Training Loss: 0.02234
Epoch: 4511, Training Loss: 0.02838
Epoch: 4511, Training Loss: 0.02139
Epoch: 4512, Training Loss: 0.02470
Epoch: 4512, Training Loss: 0.02234
Epoch: 4512, Training Loss: 0.02838
Epoch: 4512, Training Loss: 0.02139
Epoch: 4513, Training Loss: 0.02470
Epoch: 4513, Training Loss: 0.02234
Epoch: 4513, Training Loss: 0.02838
Epoch: 4513, Training Loss: 0.02139
Epoch: 4514, Training Loss: 0.02469
Epoch: 4514, Training Loss: 0.02233
Epoch: 4514, Training Loss: 0.02837
Epoch: 4514, Training Loss: 0.02139
Epoch: 4515, Training Loss: 0.02469
Epoch: 4515, Training Loss: 0.02233
Epoch: 4515, Training Loss: 0.02837
Epoch: 4515, Training Loss: 0.02138
Epoch: 4516, Training Loss: 0.02469
Epoch: 4516, Training Loss: 0.02233
Epoch: 4516, Training Loss: 0.02836
Epoch: 4516, Training Loss: 0.02138
Epoch: 4517, Training Loss: 0.02468
Epoch: 4517, Training Loss: 0.02232
Epoch: 4517, Training Loss: 0.02836
Epoch: 4517, Training Loss: 0.02138
Epoch: 4518, Training Loss: 0.02468
Epoch: 4518, Training Loss: 0.02232
Epoch: 4518, Training Loss: 0.02835
Epoch: 4518, Training Loss: 0.02137
Epoch: 4519, Training Loss: 0.02468
Epoch: 4519, Training Loss: 0.02232
Epoch: 4519, Training Loss: 0.02835
Epoch: 4519, Training Loss: 0.02137
Epoch: 4520, Training Loss: 0.02467
Epoch: 4520, Training Loss: 0.02232
Epoch: 4520, Training Loss: 0.02835
Epoch: 4520, Training Loss: 0.02137
Epoch: 4521, Training Loss: 0.02467
Epoch: 4521, Training Loss: 0.02231
Epoch: 4521, Training Loss: 0.02834
Epoch: 4521, Training Loss: 0.02136
Epoch: 4522, Training Loss: 0.02467
Epoch: 4522, Training Loss: 0.02231
Epoch: 4522, Training Loss: 0.02834
Epoch: 4522, Training Loss: 0.02136
Epoch: 4523, Training Loss: 0.02466
Epoch: 4523, Training Loss: 0.02231
Epoch: 4523, Training Loss: 0.02833
Epoch: 4523, Training Loss: 0.02136
Epoch: 4524, Training Loss: 0.02466
Epoch: 4524, Training Loss: 0.02230
Epoch: 4524, Training Loss: 0.02833
Epoch: 4524, Training Loss: 0.02136
Epoch: 4525, Training Loss: 0.02466
Epoch: 4525, Training Loss: 0.02230
Epoch: 4525, Training Loss: 0.02833
Epoch: 4525, Training Loss: 0.02135
Epoch: 4526, Training Loss: 0.02465
Epoch: 4526, Training Loss: 0.02230
Epoch: 4526, Training Loss: 0.02832
Epoch: 4526, Training Loss: 0.02135
Epoch: 4527, Training Loss: 0.02465
Epoch: 4527, Training Loss: 0.02229
Epoch: 4527, Training Loss: 0.02832
Epoch: 4527, Training Loss: 0.02135
Epoch: 4528, Training Loss: 0.02465
Epoch: 4528, Training Loss: 0.02229
Epoch: 4528, Training Loss: 0.02831
Epoch: 4528, Training Loss: 0.02134
Epoch: 4529, Training Loss: 0.02464
Epoch: 4529, Training Loss: 0.02229
Epoch: 4529, Training Loss: 0.02831
Epoch: 4529, Training Loss: 0.02134
Epoch: 4530, Training Loss: 0.02464
Epoch: 4530, Training Loss: 0.02229
Epoch: 4530, Training Loss: 0.02831
Epoch: 4530, Training Loss: 0.02134
Epoch: 4531, Training Loss: 0.02464
Epoch: 4531, Training Loss: 0.02228
Epoch: 4531, Training Loss: 0.02830
Epoch: 4531, Training Loss: 0.02134
Epoch: 4532, Training Loss: 0.02463
Epoch: 4532, Training Loss: 0.02228
Epoch: 4532, Training Loss: 0.02830
Epoch: 4532, Training Loss: 0.02133
Epoch: 4533, Training Loss: 0.02463
Epoch: 4533, Training Loss: 0.02228
Epoch: 4533, Training Loss: 0.02829
Epoch: 4533, Training Loss: 0.02133
Epoch: 4534, Training Loss: 0.02463
Epoch: 4534, Training Loss: 0.02227
Epoch: 4534, Training Loss: 0.02829
Epoch: 4534, Training Loss: 0.02133
Epoch: 4535, Training Loss: 0.02462
Epoch: 4535, Training Loss: 0.02227
Epoch: 4535, Training Loss: 0.02829
Epoch: 4535, Training Loss: 0.02132
Epoch: 4536, Training Loss: 0.02462
Epoch: 4536, Training Loss: 0.02227
Epoch: 4536, Training Loss: 0.02828
Epoch: 4536, Training Loss: 0.02132
Epoch: 4537, Training Loss: 0.02462
Epoch: 4537, Training Loss: 0.02227
Epoch: 4537, Training Loss: 0.02828
Epoch: 4537, Training Loss: 0.02132
Epoch: 4538, Training Loss: 0.02461
Epoch: 4538, Training Loss: 0.02226
Epoch: 4538, Training Loss: 0.02827
Epoch: 4538, Training Loss: 0.02132
Epoch: 4539, Training Loss: 0.02461
Epoch: 4539, Training Loss: 0.02226
Epoch: 4539, Training Loss: 0.02827
Epoch: 4539, Training Loss: 0.02131
Epoch: 4540, Training Loss: 0.02460
Epoch: 4540, Training Loss: 0.02226
Epoch: 4540, Training Loss: 0.02827
Epoch: 4540, Training Loss: 0.02131
Epoch: 4541, Training Loss: 0.02460
Epoch: 4541, Training Loss: 0.02225
Epoch: 4541, Training Loss: 0.02826
Epoch: 4541, Training Loss: 0.02131
Epoch: 4542, Training Loss: 0.02460
Epoch: 4542, Training Loss: 0.02225
Epoch: 4542, Training Loss: 0.02826
Epoch: 4542, Training Loss: 0.02130
Epoch: 4543, Training Loss: 0.02459
Epoch: 4543, Training Loss: 0.02225
Epoch: 4543, Training Loss: 0.02825
Epoch: 4543, Training Loss: 0.02130
Epoch: 4544, Training Loss: 0.02459
Epoch: 4544, Training Loss: 0.02224
Epoch: 4544, Training Loss: 0.02825
Epoch: 4544, Training Loss: 0.02130
Epoch: 4545, Training Loss: 0.02459
Epoch: 4545, Training Loss: 0.02224
Epoch: 4545, Training Loss: 0.02825
Epoch: 4545, Training Loss: 0.02130
Epoch: 4546, Training Loss: 0.02458
Epoch: 4546, Training Loss: 0.02224
Epoch: 4546, Training Loss: 0.02824
Epoch: 4546, Training Loss: 0.02129
Epoch: 4547, Training Loss: 0.02458
Epoch: 4547, Training Loss: 0.02224
Epoch: 4547, Training Loss: 0.02824
Epoch: 4547, Training Loss: 0.02129
Epoch: 4548, Training Loss: 0.02458
Epoch: 4548, Training Loss: 0.02223
Epoch: 4548, Training Loss: 0.02823
Epoch: 4548, Training Loss: 0.02129
Epoch: 4549, Training Loss: 0.02457
Epoch: 4549, Training Loss: 0.02223
Epoch: 4549, Training Loss: 0.02823
Epoch: 4549, Training Loss: 0.02128
Epoch: 4550, Training Loss: 0.02457
Epoch: 4550, Training Loss: 0.02223
Epoch: 4550, Training Loss: 0.02823
Epoch: 4550, Training Loss: 0.02128
Epoch: 4551, Training Loss: 0.02457
Epoch: 4551, Training Loss: 0.02222
Epoch: 4551, Training Loss: 0.02822
Epoch: 4551, Training Loss: 0.02128
Epoch: 4552, Training Loss: 0.02456
Epoch: 4552, Training Loss: 0.02222
Epoch: 4552, Training Loss: 0.02822
Epoch: 4552, Training Loss: 0.02127
Epoch: 4553, Training Loss: 0.02456
Epoch: 4553, Training Loss: 0.02222
Epoch: 4553, Training Loss: 0.02821
Epoch: 4553, Training Loss: 0.02127
Epoch: 4554, Training Loss: 0.02456
Epoch: 4554, Training Loss: 0.02222
Epoch: 4554, Training Loss: 0.02821
Epoch: 4554, Training Loss: 0.02127
Epoch: 4555, Training Loss: 0.02455
Epoch: 4555, Training Loss: 0.02221
Epoch: 4555, Training Loss: 0.02821
Epoch: 4555, Training Loss: 0.02127
Epoch: 4556, Training Loss: 0.02455
Epoch: 4556, Training Loss: 0.02221
Epoch: 4556, Training Loss: 0.02820
Epoch: 4556, Training Loss: 0.02126
Epoch: 4557, Training Loss: 0.02455
Epoch: 4557, Training Loss: 0.02221
Epoch: 4557, Training Loss: 0.02820
Epoch: 4557, Training Loss: 0.02126
Epoch: 4558, Training Loss: 0.02454
Epoch: 4558, Training Loss: 0.02220
Epoch: 4558, Training Loss: 0.02819
Epoch: 4558, Training Loss: 0.02126
Epoch: 4559, Training Loss: 0.02454
Epoch: 4559, Training Loss: 0.02220
Epoch: 4559, Training Loss: 0.02819
Epoch: 4559, Training Loss: 0.02125
Epoch: 4560, Training Loss: 0.02454
Epoch: 4560, Training Loss: 0.02220
Epoch: 4560, Training Loss: 0.02819
Epoch: 4560, Training Loss: 0.02125
Epoch: 4561, Training Loss: 0.02453
Epoch: 4561, Training Loss: 0.02219
Epoch: 4561, Training Loss: 0.02818
Epoch: 4561, Training Loss: 0.02125
Epoch: 4562, Training Loss: 0.02453
Epoch: 4562, Training Loss: 0.02219
Epoch: 4562, Training Loss: 0.02818
Epoch: 4562, Training Loss: 0.02125
Epoch: 4563, Training Loss: 0.02453
Epoch: 4563, Training Loss: 0.02219
Epoch: 4563, Training Loss: 0.02817
Epoch: 4563, Training Loss: 0.02124
Epoch: 4564, Training Loss: 0.02452
Epoch: 4564, Training Loss: 0.02219
Epoch: 4564, Training Loss: 0.02817
Epoch: 4564, Training Loss: 0.02124
Epoch: 4565, Training Loss: 0.02452
Epoch: 4565, Training Loss: 0.02218
Epoch: 4565, Training Loss: 0.02817
Epoch: 4565, Training Loss: 0.02124
Epoch: 4566, Training Loss: 0.02452
Epoch: 4566, Training Loss: 0.02218
Epoch: 4566, Training Loss: 0.02816
Epoch: 4566, Training Loss: 0.02123
Epoch: 4567, Training Loss: 0.02451
Epoch: 4567, Training Loss: 0.02218
Epoch: 4567, Training Loss: 0.02816
Epoch: 4567, Training Loss: 0.02123
Epoch: 4568, Training Loss: 0.02451
Epoch: 4568, Training Loss: 0.02217
Epoch: 4568, Training Loss: 0.02815
Epoch: 4568, Training Loss: 0.02123
Epoch: 4569, Training Loss: 0.02451
Epoch: 4569, Training Loss: 0.02217
Epoch: 4569, Training Loss: 0.02815
Epoch: 4569, Training Loss: 0.02123
Epoch: 4570, Training Loss: 0.02450
Epoch: 4570, Training Loss: 0.02217
Epoch: 4570, Training Loss: 0.02815
Epoch: 4570, Training Loss: 0.02122
Epoch: 4571, Training Loss: 0.02450
Epoch: 4571, Training Loss: 0.02217
Epoch: 4571, Training Loss: 0.02814
Epoch: 4571, Training Loss: 0.02122
Epoch: 4572, Training Loss: 0.02450
Epoch: 4572, Training Loss: 0.02216
Epoch: 4572, Training Loss: 0.02814
Epoch: 4572, Training Loss: 0.02122
Epoch: 4573, Training Loss: 0.02449
Epoch: 4573, Training Loss: 0.02216
Epoch: 4573, Training Loss: 0.02813
Epoch: 4573, Training Loss: 0.02121
Epoch: 4574, Training Loss: 0.02449
Epoch: 4574, Training Loss: 0.02216
Epoch: 4574, Training Loss: 0.02813
Epoch: 4574, Training Loss: 0.02121
Epoch: 4575, Training Loss: 0.02449
Epoch: 4575, Training Loss: 0.02215
Epoch: 4575, Training Loss: 0.02813
Epoch: 4575, Training Loss: 0.02121
Epoch: 4576, Training Loss: 0.02448
Epoch: 4576, Training Loss: 0.02215
Epoch: 4576, Training Loss: 0.02812
Epoch: 4576, Training Loss: 0.02121
Epoch: 4577, Training Loss: 0.02448
Epoch: 4577, Training Loss: 0.02215
Epoch: 4577, Training Loss: 0.02812
Epoch: 4577, Training Loss: 0.02120
Epoch: 4578, Training Loss: 0.02448
Epoch: 4578, Training Loss: 0.02215
Epoch: 4578, Training Loss: 0.02811
Epoch: 4578, Training Loss: 0.02120
Epoch: 4579, Training Loss: 0.02447
Epoch: 4579, Training Loss: 0.02214
Epoch: 4579, Training Loss: 0.02811
Epoch: 4579, Training Loss: 0.02120
Epoch: 4580, Training Loss: 0.02447
Epoch: 4580, Training Loss: 0.02214
Epoch: 4580, Training Loss: 0.02811
Epoch: 4580, Training Loss: 0.02119
Epoch: 4581, Training Loss: 0.02447
Epoch: 4581, Training Loss: 0.02214
Epoch: 4581, Training Loss: 0.02810
Epoch: 4581, Training Loss: 0.02119
Epoch: 4582, Training Loss: 0.02446
Epoch: 4582, Training Loss: 0.02213
Epoch: 4582, Training Loss: 0.02810
Epoch: 4582, Training Loss: 0.02119
Epoch: 4583, Training Loss: 0.02446
Epoch: 4583, Training Loss: 0.02213
Epoch: 4583, Training Loss: 0.02809
Epoch: 4583, Training Loss: 0.02119
Epoch: 4584, Training Loss: 0.02446
Epoch: 4584, Training Loss: 0.02213
Epoch: 4584, Training Loss: 0.02809
Epoch: 4584, Training Loss: 0.02118
Epoch: 4585, Training Loss: 0.02445
Epoch: 4585, Training Loss: 0.02212
Epoch: 4585, Training Loss: 0.02809
Epoch: 4585, Training Loss: 0.02118
Epoch: 4586, Training Loss: 0.02445
Epoch: 4586, Training Loss: 0.02212
Epoch: 4586, Training Loss: 0.02808
Epoch: 4586, Training Loss: 0.02118
Epoch: 4587, Training Loss: 0.02445
Epoch: 4587, Training Loss: 0.02212
Epoch: 4587, Training Loss: 0.02808
Epoch: 4587, Training Loss: 0.02117
Epoch: 4588, Training Loss: 0.02444
Epoch: 4588, Training Loss: 0.02212
Epoch: 4588, Training Loss: 0.02808
Epoch: 4588, Training Loss: 0.02117
Epoch: 4589, Training Loss: 0.02444
Epoch: 4589, Training Loss: 0.02211
Epoch: 4589, Training Loss: 0.02807
Epoch: 4589, Training Loss: 0.02117
Epoch: 4590, Training Loss: 0.02444
Epoch: 4590, Training Loss: 0.02211
Epoch: 4590, Training Loss: 0.02807
Epoch: 4590, Training Loss: 0.02117
Epoch: 4591, Training Loss: 0.02443
Epoch: 4591, Training Loss: 0.02211
Epoch: 4591, Training Loss: 0.02806
Epoch: 4591, Training Loss: 0.02116
Epoch: 4592, Training Loss: 0.02443
Epoch: 4592, Training Loss: 0.02210
Epoch: 4592, Training Loss: 0.02806
Epoch: 4592, Training Loss: 0.02116
Epoch: 4593, Training Loss: 0.02442
Epoch: 4593, Training Loss: 0.02210
Epoch: 4593, Training Loss: 0.02806
Epoch: 4593, Training Loss: 0.02116
Epoch: 4594, Training Loss: 0.02442
Epoch: 4594, Training Loss: 0.02210
Epoch: 4594, Training Loss: 0.02805
Epoch: 4594, Training Loss: 0.02115
Epoch: 4595, Training Loss: 0.02442
Epoch: 4595, Training Loss: 0.02210
Epoch: 4595, Training Loss: 0.02805
Epoch: 4595, Training Loss: 0.02115
Epoch: 4596, Training Loss: 0.02441
Epoch: 4596, Training Loss: 0.02209
Epoch: 4596, Training Loss: 0.02804
Epoch: 4596, Training Loss: 0.02115
Epoch: 4597, Training Loss: 0.02441
Epoch: 4597, Training Loss: 0.02209
Epoch: 4597, Training Loss: 0.02804
Epoch: 4597, Training Loss: 0.02115
Epoch: 4598, Training Loss: 0.02441
Epoch: 4598, Training Loss: 0.02209
Epoch: 4598, Training Loss: 0.02804
Epoch: 4598, Training Loss: 0.02114
Epoch: 4599, Training Loss: 0.02440
Epoch: 4599, Training Loss: 0.02208
Epoch: 4599, Training Loss: 0.02803
Epoch: 4599, Training Loss: 0.02114
Epoch: 4600, Training Loss: 0.02440
Epoch: 4600, Training Loss: 0.02208
Epoch: 4600, Training Loss: 0.02803
Epoch: 4600, Training Loss: 0.02114
Epoch: 4601, Training Loss: 0.02440
Epoch: 4601, Training Loss: 0.02208
Epoch: 4601, Training Loss: 0.02802
Epoch: 4601, Training Loss: 0.02114
Epoch: 4602, Training Loss: 0.02439
Epoch: 4602, Training Loss: 0.02208
Epoch: 4602, Training Loss: 0.02802
Epoch: 4602, Training Loss: 0.02113
Epoch: 4603, Training Loss: 0.02439
Epoch: 4603, Training Loss: 0.02207
Epoch: 4603, Training Loss: 0.02802
Epoch: 4603, Training Loss: 0.02113
Epoch: 4604, Training Loss: 0.02439
Epoch: 4604, Training Loss: 0.02207
Epoch: 4604, Training Loss: 0.02801
Epoch: 4604, Training Loss: 0.02113
Epoch: 4605, Training Loss: 0.02438
Epoch: 4605, Training Loss: 0.02207
Epoch: 4605, Training Loss: 0.02801
Epoch: 4605, Training Loss: 0.02112
Epoch: 4606, Training Loss: 0.02438
Epoch: 4606, Training Loss: 0.02206
Epoch: 4606, Training Loss: 0.02800
Epoch: 4606, Training Loss: 0.02112
Epoch: 4607, Training Loss: 0.02438
Epoch: 4607, Training Loss: 0.02206
Epoch: 4607, Training Loss: 0.02800
Epoch: 4607, Training Loss: 0.02112
Epoch: 4608, Training Loss: 0.02437
Epoch: 4608, Training Loss: 0.02206
Epoch: 4608, Training Loss: 0.02800
Epoch: 4608, Training Loss: 0.02112
Epoch: 4609, Training Loss: 0.02437
Epoch: 4609, Training Loss: 0.02206
Epoch: 4609, Training Loss: 0.02799
Epoch: 4609, Training Loss: 0.02111
Epoch: 4610, Training Loss: 0.02437
Epoch: 4610, Training Loss: 0.02205
Epoch: 4610, Training Loss: 0.02799
Epoch: 4610, Training Loss: 0.02111
Epoch: 4611, Training Loss: 0.02436
Epoch: 4611, Training Loss: 0.02205
Epoch: 4611, Training Loss: 0.02798
Epoch: 4611, Training Loss: 0.02111
Epoch: 4612, Training Loss: 0.02436
Epoch: 4612, Training Loss: 0.02205
Epoch: 4612, Training Loss: 0.02798
Epoch: 4612, Training Loss: 0.02110
Epoch: 4613, Training Loss: 0.02436
Epoch: 4613, Training Loss: 0.02204
Epoch: 4613, Training Loss: 0.02798
Epoch: 4613, Training Loss: 0.02110
Epoch: 4614, Training Loss: 0.02435
Epoch: 4614, Training Loss: 0.02204
Epoch: 4614, Training Loss: 0.02797
Epoch: 4614, Training Loss: 0.02110
Epoch: 4615, Training Loss: 0.02435
Epoch: 4615, Training Loss: 0.02204
Epoch: 4615, Training Loss: 0.02797
Epoch: 4615, Training Loss: 0.02110
Epoch: 4616, Training Loss: 0.02435
Epoch: 4616, Training Loss: 0.02204
Epoch: 4616, Training Loss: 0.02797
Epoch: 4616, Training Loss: 0.02109
Epoch: 4617, Training Loss: 0.02434
Epoch: 4617, Training Loss: 0.02203
Epoch: 4617, Training Loss: 0.02796
Epoch: 4617, Training Loss: 0.02109
Epoch: 4618, Training Loss: 0.02434
Epoch: 4618, Training Loss: 0.02203
Epoch: 4618, Training Loss: 0.02796
Epoch: 4618, Training Loss: 0.02109
Epoch: 4619, Training Loss: 0.02434
Epoch: 4619, Training Loss: 0.02203
Epoch: 4619, Training Loss: 0.02795
Epoch: 4619, Training Loss: 0.02108
Epoch: 4620, Training Loss: 0.02433
Epoch: 4620, Training Loss: 0.02202
Epoch: 4620, Training Loss: 0.02795
Epoch: 4620, Training Loss: 0.02108
Epoch: 4621, Training Loss: 0.02433
Epoch: 4621, Training Loss: 0.02202
Epoch: 4621, Training Loss: 0.02795
Epoch: 4621, Training Loss: 0.02108
Epoch: 4622, Training Loss: 0.02433
Epoch: 4622, Training Loss: 0.02202
Epoch: 4622, Training Loss: 0.02794
Epoch: 4622, Training Loss: 0.02108
Epoch: 4623, Training Loss: 0.02432
Epoch: 4623, Training Loss: 0.02202
Epoch: 4623, Training Loss: 0.02794
Epoch: 4623, Training Loss: 0.02107
Epoch: 4624, Training Loss: 0.02432
Epoch: 4624, Training Loss: 0.02201
Epoch: 4624, Training Loss: 0.02793
Epoch: 4624, Training Loss: 0.02107
Epoch: 4625, Training Loss: 0.02432
Epoch: 4625, Training Loss: 0.02201
Epoch: 4625, Training Loss: 0.02793
Epoch: 4625, Training Loss: 0.02107
Epoch: 4626, Training Loss: 0.02431
Epoch: 4626, Training Loss: 0.02201
Epoch: 4626, Training Loss: 0.02793
Epoch: 4626, Training Loss: 0.02106
Epoch: 4627, Training Loss: 0.02431
Epoch: 4627, Training Loss: 0.02200
Epoch: 4627, Training Loss: 0.02792
Epoch: 4627, Training Loss: 0.02106
Epoch: 4628, Training Loss: 0.02431
Epoch: 4628, Training Loss: 0.02200
Epoch: 4628, Training Loss: 0.02792
Epoch: 4628, Training Loss: 0.02106
Epoch: 4629, Training Loss: 0.02430
Epoch: 4629, Training Loss: 0.02200
Epoch: 4629, Training Loss: 0.02791
Epoch: 4629, Training Loss: 0.02106
Epoch: 4630, Training Loss: 0.02430
Epoch: 4630, Training Loss: 0.02200
Epoch: 4630, Training Loss: 0.02791
Epoch: 4630, Training Loss: 0.02105
Epoch: 4631, Training Loss: 0.02430
Epoch: 4631, Training Loss: 0.02199
Epoch: 4631, Training Loss: 0.02791
Epoch: 4631, Training Loss: 0.02105
Epoch: 4632, Training Loss: 0.02429
Epoch: 4632, Training Loss: 0.02199
Epoch: 4632, Training Loss: 0.02790
Epoch: 4632, Training Loss: 0.02105
Epoch: 4633, Training Loss: 0.02429
Epoch: 4633, Training Loss: 0.02199
Epoch: 4633, Training Loss: 0.02790
Epoch: 4633, Training Loss: 0.02105
Epoch: 4634, Training Loss: 0.02429
Epoch: 4634, Training Loss: 0.02198
Epoch: 4634, Training Loss: 0.02790
Epoch: 4634, Training Loss: 0.02104
Epoch: 4635, Training Loss: 0.02429
Epoch: 4635, Training Loss: 0.02198
Epoch: 4635, Training Loss: 0.02789
Epoch: 4635, Training Loss: 0.02104
Epoch: 4636, Training Loss: 0.02428
Epoch: 4636, Training Loss: 0.02198
Epoch: 4636, Training Loss: 0.02789
Epoch: 4636, Training Loss: 0.02104
Epoch: 4637, Training Loss: 0.02428
Epoch: 4637, Training Loss: 0.02198
Epoch: 4637, Training Loss: 0.02788
Epoch: 4637, Training Loss: 0.02103
Epoch: 4638, Training Loss: 0.02428
Epoch: 4638, Training Loss: 0.02197
Epoch: 4638, Training Loss: 0.02788
Epoch: 4638, Training Loss: 0.02103
Epoch: 4639, Training Loss: 0.02427
Epoch: 4639, Training Loss: 0.02197
Epoch: 4639, Training Loss: 0.02788
Epoch: 4639, Training Loss: 0.02103
Epoch: 4640, Training Loss: 0.02427
Epoch: 4640, Training Loss: 0.02197
Epoch: 4640, Training Loss: 0.02787
Epoch: 4640, Training Loss: 0.02103
Epoch: 4641, Training Loss: 0.02427
Epoch: 4641, Training Loss: 0.02196
Epoch: 4641, Training Loss: 0.02787
Epoch: 4641, Training Loss: 0.02102
Epoch: 4642, Training Loss: 0.02426
Epoch: 4642, Training Loss: 0.02196
Epoch: 4642, Training Loss: 0.02786
Epoch: 4642, Training Loss: 0.02102
Epoch: 4643, Training Loss: 0.02426
Epoch: 4643, Training Loss: 0.02196
Epoch: 4643, Training Loss: 0.02786
Epoch: 4643, Training Loss: 0.02102
Epoch: 4644, Training Loss: 0.02426
Epoch: 4644, Training Loss: 0.02196
Epoch: 4644, Training Loss: 0.02786
Epoch: 4644, Training Loss: 0.02101
Epoch: 4645, Training Loss: 0.02425
Epoch: 4645, Training Loss: 0.02195
Epoch: 4645, Training Loss: 0.02785
Epoch: 4645, Training Loss: 0.02101
Epoch: 4646, Training Loss: 0.02425
Epoch: 4646, Training Loss: 0.02195
Epoch: 4646, Training Loss: 0.02785
Epoch: 4646, Training Loss: 0.02101
Epoch: 4647, Training Loss: 0.02425
Epoch: 4647, Training Loss: 0.02195
Epoch: 4647, Training Loss: 0.02785
Epoch: 4647, Training Loss: 0.02101
Epoch: 4648, Training Loss: 0.02424
Epoch: 4648, Training Loss: 0.02194
Epoch: 4648, Training Loss: 0.02784
Epoch: 4648, Training Loss: 0.02100
Epoch: 4649, Training Loss: 0.02424
Epoch: 4649, Training Loss: 0.02194
Epoch: 4649, Training Loss: 0.02784
Epoch: 4649, Training Loss: 0.02100
Epoch: 4650, Training Loss: 0.02424
Epoch: 4650, Training Loss: 0.02194
Epoch: 4650, Training Loss: 0.02783
Epoch: 4650, Training Loss: 0.02100
Epoch: 4651, Training Loss: 0.02423
Epoch: 4651, Training Loss: 0.02194
Epoch: 4651, Training Loss: 0.02783
Epoch: 4651, Training Loss: 0.02099
Epoch: 4652, Training Loss: 0.02423
Epoch: 4652, Training Loss: 0.02193
Epoch: 4652, Training Loss: 0.02783
Epoch: 4652, Training Loss: 0.02099
Epoch: 4653, Training Loss: 0.02423
Epoch: 4653, Training Loss: 0.02193
Epoch: 4653, Training Loss: 0.02782
Epoch: 4653, Training Loss: 0.02099
Epoch: 4654, Training Loss: 0.02422
Epoch: 4654, Training Loss: 0.02193
Epoch: 4654, Training Loss: 0.02782
Epoch: 4654, Training Loss: 0.02099
    ... (four per-sample loss lines are printed every epoch, one for each XOR pattern; over epochs 4654-4903 the losses fall only slightly, from about 0.024/0.022/0.028/0.021 to the values below) ...
Epoch: 4903, Training Loss: 0.02344
Epoch: 4903, Training Loss: 0.02126
Epoch: 4903, Training Loss: 0.02691
Epoch: 4903, Training Loss: 0.02033
Epoch: 4904, Training Loss: 0.02344
Epoch: 4904, Training Loss: 0.02125
Epoch: 4904, Training Loss: 0.02690
Epoch: 4904, Training Loss: 0.02033
Epoch: 4905, Training Loss: 0.02344
Epoch: 4905, Training Loss: 0.02125
Epoch: 4905, Training Loss: 0.02690
Epoch: 4905, Training Loss: 0.02032
Epoch: 4906, Training Loss: 0.02343
Epoch: 4906, Training Loss: 0.02125
Epoch: 4906, Training Loss: 0.02690
Epoch: 4906, Training Loss: 0.02032
Epoch: 4907, Training Loss: 0.02343
Epoch: 4907, Training Loss: 0.02125
Epoch: 4907, Training Loss: 0.02689
Epoch: 4907, Training Loss: 0.02032
Epoch: 4908, Training Loss: 0.02343
Epoch: 4908, Training Loss: 0.02124
Epoch: 4908, Training Loss: 0.02689
Epoch: 4908, Training Loss: 0.02032
Epoch: 4909, Training Loss: 0.02343
Epoch: 4909, Training Loss: 0.02124
Epoch: 4909, Training Loss: 0.02689
Epoch: 4909, Training Loss: 0.02031
Epoch: 4910, Training Loss: 0.02342
Epoch: 4910, Training Loss: 0.02124
Epoch: 4910, Training Loss: 0.02688
Epoch: 4910, Training Loss: 0.02031
Epoch: 4911, Training Loss: 0.02342
Epoch: 4911, Training Loss: 0.02124
Epoch: 4911, Training Loss: 0.02688
Epoch: 4911, Training Loss: 0.02031
Epoch: 4912, Training Loss: 0.02342
Epoch: 4912, Training Loss: 0.02123
Epoch: 4912, Training Loss: 0.02688
Epoch: 4912, Training Loss: 0.02031
Epoch: 4913, Training Loss: 0.02341
Epoch: 4913, Training Loss: 0.02123
Epoch: 4913, Training Loss: 0.02687
Epoch: 4913, Training Loss: 0.02030
Epoch: 4914, Training Loss: 0.02341
Epoch: 4914, Training Loss: 0.02123
Epoch: 4914, Training Loss: 0.02687
Epoch: 4914, Training Loss: 0.02030
Epoch: 4915, Training Loss: 0.02341
Epoch: 4915, Training Loss: 0.02123
Epoch: 4915, Training Loss: 0.02687
Epoch: 4915, Training Loss: 0.02030
Epoch: 4916, Training Loss: 0.02341
Epoch: 4916, Training Loss: 0.02122
Epoch: 4916, Training Loss: 0.02686
Epoch: 4916, Training Loss: 0.02030
Epoch: 4917, Training Loss: 0.02340
Epoch: 4917, Training Loss: 0.02122
Epoch: 4917, Training Loss: 0.02686
Epoch: 4917, Training Loss: 0.02029
Epoch: 4918, Training Loss: 0.02340
Epoch: 4918, Training Loss: 0.02122
Epoch: 4918, Training Loss: 0.02686
Epoch: 4918, Training Loss: 0.02029
Epoch: 4919, Training Loss: 0.02340
Epoch: 4919, Training Loss: 0.02122
Epoch: 4919, Training Loss: 0.02685
Epoch: 4919, Training Loss: 0.02029
Epoch: 4920, Training Loss: 0.02339
Epoch: 4920, Training Loss: 0.02121
Epoch: 4920, Training Loss: 0.02685
Epoch: 4920, Training Loss: 0.02029
Epoch: 4921, Training Loss: 0.02339
Epoch: 4921, Training Loss: 0.02121
Epoch: 4921, Training Loss: 0.02685
Epoch: 4921, Training Loss: 0.02028
Epoch: 4922, Training Loss: 0.02339
Epoch: 4922, Training Loss: 0.02121
Epoch: 4922, Training Loss: 0.02684
Epoch: 4922, Training Loss: 0.02028
Epoch: 4923, Training Loss: 0.02338
Epoch: 4923, Training Loss: 0.02120
Epoch: 4923, Training Loss: 0.02684
Epoch: 4923, Training Loss: 0.02028
Epoch: 4924, Training Loss: 0.02338
Epoch: 4924, Training Loss: 0.02120
Epoch: 4924, Training Loss: 0.02684
Epoch: 4924, Training Loss: 0.02028
Epoch: 4925, Training Loss: 0.02338
Epoch: 4925, Training Loss: 0.02120
Epoch: 4925, Training Loss: 0.02683
Epoch: 4925, Training Loss: 0.02027
Epoch: 4926, Training Loss: 0.02338
Epoch: 4926, Training Loss: 0.02120
Epoch: 4926, Training Loss: 0.02683
Epoch: 4926, Training Loss: 0.02027
Epoch: 4927, Training Loss: 0.02337
Epoch: 4927, Training Loss: 0.02119
Epoch: 4927, Training Loss: 0.02683
Epoch: 4927, Training Loss: 0.02027
Epoch: 4928, Training Loss: 0.02337
Epoch: 4928, Training Loss: 0.02119
Epoch: 4928, Training Loss: 0.02682
Epoch: 4928, Training Loss: 0.02027
Epoch: 4929, Training Loss: 0.02337
Epoch: 4929, Training Loss: 0.02119
Epoch: 4929, Training Loss: 0.02682
Epoch: 4929, Training Loss: 0.02026
Epoch: 4930, Training Loss: 0.02336
Epoch: 4930, Training Loss: 0.02119
Epoch: 4930, Training Loss: 0.02681
Epoch: 4930, Training Loss: 0.02026
Epoch: 4931, Training Loss: 0.02336
Epoch: 4931, Training Loss: 0.02118
Epoch: 4931, Training Loss: 0.02681
Epoch: 4931, Training Loss: 0.02026
Epoch: 4932, Training Loss: 0.02336
Epoch: 4932, Training Loss: 0.02118
Epoch: 4932, Training Loss: 0.02681
Epoch: 4932, Training Loss: 0.02026
Epoch: 4933, Training Loss: 0.02335
Epoch: 4933, Training Loss: 0.02118
Epoch: 4933, Training Loss: 0.02680
Epoch: 4933, Training Loss: 0.02025
Epoch: 4934, Training Loss: 0.02335
Epoch: 4934, Training Loss: 0.02118
Epoch: 4934, Training Loss: 0.02680
Epoch: 4934, Training Loss: 0.02025
Epoch: 4935, Training Loss: 0.02335
Epoch: 4935, Training Loss: 0.02117
Epoch: 4935, Training Loss: 0.02680
Epoch: 4935, Training Loss: 0.02025
Epoch: 4936, Training Loss: 0.02335
Epoch: 4936, Training Loss: 0.02117
Epoch: 4936, Training Loss: 0.02679
Epoch: 4936, Training Loss: 0.02025
Epoch: 4937, Training Loss: 0.02334
Epoch: 4937, Training Loss: 0.02117
Epoch: 4937, Training Loss: 0.02679
Epoch: 4937, Training Loss: 0.02024
Epoch: 4938, Training Loss: 0.02334
Epoch: 4938, Training Loss: 0.02117
Epoch: 4938, Training Loss: 0.02679
Epoch: 4938, Training Loss: 0.02024
Epoch: 4939, Training Loss: 0.02334
Epoch: 4939, Training Loss: 0.02116
Epoch: 4939, Training Loss: 0.02678
Epoch: 4939, Training Loss: 0.02024
Epoch: 4940, Training Loss: 0.02333
Epoch: 4940, Training Loss: 0.02116
Epoch: 4940, Training Loss: 0.02678
Epoch: 4940, Training Loss: 0.02024
Epoch: 4941, Training Loss: 0.02333
Epoch: 4941, Training Loss: 0.02116
Epoch: 4941, Training Loss: 0.02678
Epoch: 4941, Training Loss: 0.02023
Epoch: 4942, Training Loss: 0.02333
Epoch: 4942, Training Loss: 0.02116
Epoch: 4942, Training Loss: 0.02677
Epoch: 4942, Training Loss: 0.02023
Epoch: 4943, Training Loss: 0.02333
Epoch: 4943, Training Loss: 0.02115
Epoch: 4943, Training Loss: 0.02677
Epoch: 4943, Training Loss: 0.02023
Epoch: 4944, Training Loss: 0.02332
Epoch: 4944, Training Loss: 0.02115
Epoch: 4944, Training Loss: 0.02677
Epoch: 4944, Training Loss: 0.02023
Epoch: 4945, Training Loss: 0.02332
Epoch: 4945, Training Loss: 0.02115
Epoch: 4945, Training Loss: 0.02676
Epoch: 4945, Training Loss: 0.02022
Epoch: 4946, Training Loss: 0.02332
Epoch: 4946, Training Loss: 0.02115
Epoch: 4946, Training Loss: 0.02676
Epoch: 4946, Training Loss: 0.02022
Epoch: 4947, Training Loss: 0.02331
Epoch: 4947, Training Loss: 0.02114
Epoch: 4947, Training Loss: 0.02676
Epoch: 4947, Training Loss: 0.02022
Epoch: 4948, Training Loss: 0.02331
Epoch: 4948, Training Loss: 0.02114
Epoch: 4948, Training Loss: 0.02675
Epoch: 4948, Training Loss: 0.02022
Epoch: 4949, Training Loss: 0.02331
Epoch: 4949, Training Loss: 0.02114
Epoch: 4949, Training Loss: 0.02675
Epoch: 4949, Training Loss: 0.02021
Epoch: 4950, Training Loss: 0.02330
Epoch: 4950, Training Loss: 0.02114
Epoch: 4950, Training Loss: 0.02675
Epoch: 4950, Training Loss: 0.02021
Epoch: 4951, Training Loss: 0.02330
Epoch: 4951, Training Loss: 0.02113
Epoch: 4951, Training Loss: 0.02674
Epoch: 4951, Training Loss: 0.02021
Epoch: 4952, Training Loss: 0.02330
Epoch: 4952, Training Loss: 0.02113
Epoch: 4952, Training Loss: 0.02674
Epoch: 4952, Training Loss: 0.02021
Epoch: 4953, Training Loss: 0.02330
Epoch: 4953, Training Loss: 0.02113
Epoch: 4953, Training Loss: 0.02674
Epoch: 4953, Training Loss: 0.02020
Epoch: 4954, Training Loss: 0.02329
Epoch: 4954, Training Loss: 0.02113
Epoch: 4954, Training Loss: 0.02673
Epoch: 4954, Training Loss: 0.02020
Epoch: 4955, Training Loss: 0.02329
Epoch: 4955, Training Loss: 0.02112
Epoch: 4955, Training Loss: 0.02673
Epoch: 4955, Training Loss: 0.02020
Epoch: 4956, Training Loss: 0.02329
Epoch: 4956, Training Loss: 0.02112
Epoch: 4956, Training Loss: 0.02673
Epoch: 4956, Training Loss: 0.02020
Epoch: 4957, Training Loss: 0.02328
Epoch: 4957, Training Loss: 0.02112
Epoch: 4957, Training Loss: 0.02672
Epoch: 4957, Training Loss: 0.02019
Epoch: 4958, Training Loss: 0.02328
Epoch: 4958, Training Loss: 0.02112
Epoch: 4958, Training Loss: 0.02672
Epoch: 4958, Training Loss: 0.02019
Epoch: 4959, Training Loss: 0.02328
Epoch: 4959, Training Loss: 0.02111
Epoch: 4959, Training Loss: 0.02672
Epoch: 4959, Training Loss: 0.02019
Epoch: 4960, Training Loss: 0.02328
Epoch: 4960, Training Loss: 0.02111
Epoch: 4960, Training Loss: 0.02671
Epoch: 4960, Training Loss: 0.02019
Epoch: 4961, Training Loss: 0.02327
Epoch: 4961, Training Loss: 0.02111
Epoch: 4961, Training Loss: 0.02671
Epoch: 4961, Training Loss: 0.02018
Epoch: 4962, Training Loss: 0.02327
Epoch: 4962, Training Loss: 0.02111
Epoch: 4962, Training Loss: 0.02671
Epoch: 4962, Training Loss: 0.02018
Epoch: 4963, Training Loss: 0.02327
Epoch: 4963, Training Loss: 0.02110
Epoch: 4963, Training Loss: 0.02670
Epoch: 4963, Training Loss: 0.02018
Epoch: 4964, Training Loss: 0.02326
Epoch: 4964, Training Loss: 0.02110
Epoch: 4964, Training Loss: 0.02670
Epoch: 4964, Training Loss: 0.02018
Epoch: 4965, Training Loss: 0.02326
Epoch: 4965, Training Loss: 0.02110
Epoch: 4965, Training Loss: 0.02669
Epoch: 4965, Training Loss: 0.02017
Epoch: 4966, Training Loss: 0.02326
Epoch: 4966, Training Loss: 0.02110
Epoch: 4966, Training Loss: 0.02669
Epoch: 4966, Training Loss: 0.02017
Epoch: 4967, Training Loss: 0.02326
Epoch: 4967, Training Loss: 0.02109
Epoch: 4967, Training Loss: 0.02669
Epoch: 4967, Training Loss: 0.02017
Epoch: 4968, Training Loss: 0.02325
Epoch: 4968, Training Loss: 0.02109
Epoch: 4968, Training Loss: 0.02668
Epoch: 4968, Training Loss: 0.02017
Epoch: 4969, Training Loss: 0.02325
Epoch: 4969, Training Loss: 0.02109
Epoch: 4969, Training Loss: 0.02668
Epoch: 4969, Training Loss: 0.02016
Epoch: 4970, Training Loss: 0.02325
Epoch: 4970, Training Loss: 0.02109
Epoch: 4970, Training Loss: 0.02668
Epoch: 4970, Training Loss: 0.02016
Epoch: 4971, Training Loss: 0.02324
Epoch: 4971, Training Loss: 0.02108
Epoch: 4971, Training Loss: 0.02667
Epoch: 4971, Training Loss: 0.02016
Epoch: 4972, Training Loss: 0.02324
Epoch: 4972, Training Loss: 0.02108
Epoch: 4972, Training Loss: 0.02667
Epoch: 4972, Training Loss: 0.02016
Epoch: 4973, Training Loss: 0.02324
Epoch: 4973, Training Loss: 0.02108
Epoch: 4973, Training Loss: 0.02667
Epoch: 4973, Training Loss: 0.02015
Epoch: 4974, Training Loss: 0.02323
Epoch: 4974, Training Loss: 0.02108
Epoch: 4974, Training Loss: 0.02666
Epoch: 4974, Training Loss: 0.02015
Epoch: 4975, Training Loss: 0.02323
Epoch: 4975, Training Loss: 0.02107
Epoch: 4975, Training Loss: 0.02666
Epoch: 4975, Training Loss: 0.02015
Epoch: 4976, Training Loss: 0.02323
Epoch: 4976, Training Loss: 0.02107
Epoch: 4976, Training Loss: 0.02666
Epoch: 4976, Training Loss: 0.02015
Epoch: 4977, Training Loss: 0.02323
Epoch: 4977, Training Loss: 0.02107
Epoch: 4977, Training Loss: 0.02665
Epoch: 4977, Training Loss: 0.02014
Epoch: 4978, Training Loss: 0.02322
Epoch: 4978, Training Loss: 0.02107
Epoch: 4978, Training Loss: 0.02665
Epoch: 4978, Training Loss: 0.02014
Epoch: 4979, Training Loss: 0.02322
Epoch: 4979, Training Loss: 0.02106
Epoch: 4979, Training Loss: 0.02665
Epoch: 4979, Training Loss: 0.02014
Epoch: 4980, Training Loss: 0.02322
Epoch: 4980, Training Loss: 0.02106
Epoch: 4980, Training Loss: 0.02664
Epoch: 4980, Training Loss: 0.02014
Epoch: 4981, Training Loss: 0.02321
Epoch: 4981, Training Loss: 0.02106
Epoch: 4981, Training Loss: 0.02664
Epoch: 4981, Training Loss: 0.02013
Epoch: 4982, Training Loss: 0.02321
Epoch: 4982, Training Loss: 0.02106
Epoch: 4982, Training Loss: 0.02664
Epoch: 4982, Training Loss: 0.02013
Epoch: 4983, Training Loss: 0.02321
Epoch: 4983, Training Loss: 0.02105
Epoch: 4983, Training Loss: 0.02663
Epoch: 4983, Training Loss: 0.02013
Epoch: 4984, Training Loss: 0.02321
Epoch: 4984, Training Loss: 0.02105
Epoch: 4984, Training Loss: 0.02663
Epoch: 4984, Training Loss: 0.02013
Epoch: 4985, Training Loss: 0.02320
Epoch: 4985, Training Loss: 0.02105
Epoch: 4985, Training Loss: 0.02663
Epoch: 4985, Training Loss: 0.02012
Epoch: 4986, Training Loss: 0.02320
Epoch: 4986, Training Loss: 0.02105
Epoch: 4986, Training Loss: 0.02662
Epoch: 4986, Training Loss: 0.02012
Epoch: 4987, Training Loss: 0.02320
Epoch: 4987, Training Loss: 0.02104
Epoch: 4987, Training Loss: 0.02662
Epoch: 4987, Training Loss: 0.02012
Epoch: 4988, Training Loss: 0.02319
Epoch: 4988, Training Loss: 0.02104
Epoch: 4988, Training Loss: 0.02662
Epoch: 4988, Training Loss: 0.02012
Epoch: 4989, Training Loss: 0.02319
Epoch: 4989, Training Loss: 0.02104
Epoch: 4989, Training Loss: 0.02661
Epoch: 4989, Training Loss: 0.02011
Epoch: 4990, Training Loss: 0.02319
Epoch: 4990, Training Loss: 0.02104
Epoch: 4990, Training Loss: 0.02661
Epoch: 4990, Training Loss: 0.02011
Epoch: 4991, Training Loss: 0.02319
Epoch: 4991, Training Loss: 0.02103
Epoch: 4991, Training Loss: 0.02661
Epoch: 4991, Training Loss: 0.02011
Epoch: 4992, Training Loss: 0.02318
Epoch: 4992, Training Loss: 0.02103
Epoch: 4992, Training Loss: 0.02660
Epoch: 4992, Training Loss: 0.02011
Epoch: 4993, Training Loss: 0.02318
Epoch: 4993, Training Loss: 0.02103
Epoch: 4993, Training Loss: 0.02660
Epoch: 4993, Training Loss: 0.02011
Epoch: 4994, Training Loss: 0.02318
Epoch: 4994, Training Loss: 0.02103
Epoch: 4994, Training Loss: 0.02660
Epoch: 4994, Training Loss: 0.02010
Epoch: 4995, Training Loss: 0.02317
Epoch: 4995, Training Loss: 0.02102
Epoch: 4995, Training Loss: 0.02659
Epoch: 4995, Training Loss: 0.02010
Epoch: 4996, Training Loss: 0.02317
Epoch: 4996, Training Loss: 0.02102
Epoch: 4996, Training Loss: 0.02659
Epoch: 4996, Training Loss: 0.02010
Epoch: 4997, Training Loss: 0.02317
Epoch: 4997, Training Loss: 0.02102
Epoch: 4997, Training Loss: 0.02659
Epoch: 4997, Training Loss: 0.02010
Epoch: 4998, Training Loss: 0.02317
Epoch: 4998, Training Loss: 0.02102
Epoch: 4998, Training Loss: 0.02658
Epoch: 4998, Training Loss: 0.02009
Epoch: 4999, Training Loss: 0.02316
Epoch: 4999, Training Loss: 0.02101
Epoch: 4999, Training Loss: 0.02658
Epoch: 4999, Training Loss: 0.02009
Epoch: 5000, Training Loss: 0.02316
Epoch: 5000, Training Loss: 0.02101
Epoch: 5000, Training Loss: 0.02658
Epoch: 5000, Training Loss: 0.02009
Epoch: 5001, Training Loss: 0.02316
Epoch: 5001, Training Loss: 0.02101
Epoch: 5001, Training Loss: 0.02657
Epoch: 5001, Training Loss: 0.02009
Epoch: 5002, Training Loss: 0.02315
Epoch: 5002, Training Loss: 0.02101
Epoch: 5002, Training Loss: 0.02657
Epoch: 5002, Training Loss: 0.02008
Epoch: 5003, Training Loss: 0.02315
Epoch: 5003, Training Loss: 0.02100
Epoch: 5003, Training Loss: 0.02657
Epoch: 5003, Training Loss: 0.02008
Epoch: 5004, Training Loss: 0.02315
Epoch: 5004, Training Loss: 0.02100
Epoch: 5004, Training Loss: 0.02656
Epoch: 5004, Training Loss: 0.02008
Epoch: 5005, Training Loss: 0.02315
Epoch: 5005, Training Loss: 0.02100
Epoch: 5005, Training Loss: 0.02656
Epoch: 5005, Training Loss: 0.02008
Epoch: 5006, Training Loss: 0.02314
Epoch: 5006, Training Loss: 0.02100
Epoch: 5006, Training Loss: 0.02656
Epoch: 5006, Training Loss: 0.02007
Epoch: 5007, Training Loss: 0.02314
Epoch: 5007, Training Loss: 0.02099
Epoch: 5007, Training Loss: 0.02655
Epoch: 5007, Training Loss: 0.02007
Epoch: 5008, Training Loss: 0.02314
Epoch: 5008, Training Loss: 0.02099
Epoch: 5008, Training Loss: 0.02655
Epoch: 5008, Training Loss: 0.02007
Epoch: 5009, Training Loss: 0.02313
Epoch: 5009, Training Loss: 0.02099
Epoch: 5009, Training Loss: 0.02655
Epoch: 5009, Training Loss: 0.02007
Epoch: 5010, Training Loss: 0.02313
Epoch: 5010, Training Loss: 0.02099
Epoch: 5010, Training Loss: 0.02654
Epoch: 5010, Training Loss: 0.02006
Epoch: 5011, Training Loss: 0.02313
Epoch: 5011, Training Loss: 0.02098
Epoch: 5011, Training Loss: 0.02654
Epoch: 5011, Training Loss: 0.02006
Epoch: 5012, Training Loss: 0.02313
Epoch: 5012, Training Loss: 0.02098
Epoch: 5012, Training Loss: 0.02654
Epoch: 5012, Training Loss: 0.02006
Epoch: 5013, Training Loss: 0.02312
Epoch: 5013, Training Loss: 0.02098
Epoch: 5013, Training Loss: 0.02653
Epoch: 5013, Training Loss: 0.02006
Epoch: 5014, Training Loss: 0.02312
Epoch: 5014, Training Loss: 0.02098
Epoch: 5014, Training Loss: 0.02653
Epoch: 5014, Training Loss: 0.02005
Epoch: 5015, Training Loss: 0.02312
Epoch: 5015, Training Loss: 0.02097
Epoch: 5015, Training Loss: 0.02653
Epoch: 5015, Training Loss: 0.02005
Epoch: 5016, Training Loss: 0.02311
Epoch: 5016, Training Loss: 0.02097
Epoch: 5016, Training Loss: 0.02652
Epoch: 5016, Training Loss: 0.02005
Epoch: 5017, Training Loss: 0.02311
Epoch: 5017, Training Loss: 0.02097
Epoch: 5017, Training Loss: 0.02652
Epoch: 5017, Training Loss: 0.02005
Epoch: 5018, Training Loss: 0.02311
Epoch: 5018, Training Loss: 0.02097
Epoch: 5018, Training Loss: 0.02652
Epoch: 5018, Training Loss: 0.02004
Epoch: 5019, Training Loss: 0.02311
Epoch: 5019, Training Loss: 0.02096
Epoch: 5019, Training Loss: 0.02651
Epoch: 5019, Training Loss: 0.02004
Epoch: 5020, Training Loss: 0.02310
Epoch: 5020, Training Loss: 0.02096
Epoch: 5020, Training Loss: 0.02651
Epoch: 5020, Training Loss: 0.02004
Epoch: 5021, Training Loss: 0.02310
Epoch: 5021, Training Loss: 0.02096
Epoch: 5021, Training Loss: 0.02651
Epoch: 5021, Training Loss: 0.02004
Epoch: 5022, Training Loss: 0.02310
Epoch: 5022, Training Loss: 0.02096
Epoch: 5022, Training Loss: 0.02650
Epoch: 5022, Training Loss: 0.02003
Epoch: 5023, Training Loss: 0.02309
Epoch: 5023, Training Loss: 0.02095
Epoch: 5023, Training Loss: 0.02650
Epoch: 5023, Training Loss: 0.02003
Epoch: 5024, Training Loss: 0.02309
Epoch: 5024, Training Loss: 0.02095
Epoch: 5024, Training Loss: 0.02650
Epoch: 5024, Training Loss: 0.02003
Epoch: 5025, Training Loss: 0.02309
Epoch: 5025, Training Loss: 0.02095
Epoch: 5025, Training Loss: 0.02649
Epoch: 5025, Training Loss: 0.02003
Epoch: 5026, Training Loss: 0.02309
Epoch: 5026, Training Loss: 0.02095
Epoch: 5026, Training Loss: 0.02649
Epoch: 5026, Training Loss: 0.02002
Epoch: 5027, Training Loss: 0.02308
Epoch: 5027, Training Loss: 0.02094
Epoch: 5027, Training Loss: 0.02649
Epoch: 5027, Training Loss: 0.02002
Epoch: 5028, Training Loss: 0.02308
Epoch: 5028, Training Loss: 0.02094
Epoch: 5028, Training Loss: 0.02648
Epoch: 5028, Training Loss: 0.02002
Epoch: 5029, Training Loss: 0.02308
Epoch: 5029, Training Loss: 0.02094
Epoch: 5029, Training Loss: 0.02648
Epoch: 5029, Training Loss: 0.02002
Epoch: 5030, Training Loss: 0.02307
Epoch: 5030, Training Loss: 0.02094
Epoch: 5030, Training Loss: 0.02648
Epoch: 5030, Training Loss: 0.02002
Epoch: 5031, Training Loss: 0.02307
Epoch: 5031, Training Loss: 0.02093
Epoch: 5031, Training Loss: 0.02647
Epoch: 5031, Training Loss: 0.02001
Epoch: 5032, Training Loss: 0.02307
Epoch: 5032, Training Loss: 0.02093
Epoch: 5032, Training Loss: 0.02647
Epoch: 5032, Training Loss: 0.02001
Epoch: 5033, Training Loss: 0.02307
Epoch: 5033, Training Loss: 0.02093
Epoch: 5033, Training Loss: 0.02647
Epoch: 5033, Training Loss: 0.02001
Epoch: 5034, Training Loss: 0.02306
Epoch: 5034, Training Loss: 0.02093
Epoch: 5034, Training Loss: 0.02646
Epoch: 5034, Training Loss: 0.02001
Epoch: 5035, Training Loss: 0.02306
Epoch: 5035, Training Loss: 0.02092
Epoch: 5035, Training Loss: 0.02646
Epoch: 5035, Training Loss: 0.02000
Epoch: 5036, Training Loss: 0.02306
Epoch: 5036, Training Loss: 0.02092
Epoch: 5036, Training Loss: 0.02646
Epoch: 5036, Training Loss: 0.02000
Epoch: 5037, Training Loss: 0.02305
Epoch: 5037, Training Loss: 0.02092
Epoch: 5037, Training Loss: 0.02645
Epoch: 5037, Training Loss: 0.02000
Epoch: 5038, Training Loss: 0.02305
Epoch: 5038, Training Loss: 0.02092
Epoch: 5038, Training Loss: 0.02645
Epoch: 5038, Training Loss: 0.02000
Epoch: 5039, Training Loss: 0.02305
Epoch: 5039, Training Loss: 0.02091
Epoch: 5039, Training Loss: 0.02645
Epoch: 5039, Training Loss: 0.01999
Epoch: 5040, Training Loss: 0.02305
Epoch: 5040, Training Loss: 0.02091
Epoch: 5040, Training Loss: 0.02644
Epoch: 5040, Training Loss: 0.01999
Epoch: 5041, Training Loss: 0.02304
Epoch: 5041, Training Loss: 0.02091
Epoch: 5041, Training Loss: 0.02644
Epoch: 5041, Training Loss: 0.01999
Epoch: 5042, Training Loss: 0.02304
Epoch: 5042, Training Loss: 0.02091
Epoch: 5042, Training Loss: 0.02644
Epoch: 5042, Training Loss: 0.01999
Epoch: 5043, Training Loss: 0.02304
Epoch: 5043, Training Loss: 0.02090
Epoch: 5043, Training Loss: 0.02643
Epoch: 5043, Training Loss: 0.01998
Epoch: 5044, Training Loss: 0.02303
Epoch: 5044, Training Loss: 0.02090
Epoch: 5044, Training Loss: 0.02643
Epoch: 5044, Training Loss: 0.01998
Epoch: 5045, Training Loss: 0.02303
Epoch: 5045, Training Loss: 0.02090
Epoch: 5045, Training Loss: 0.02643
Epoch: 5045, Training Loss: 0.01998
Epoch: 5046, Training Loss: 0.02303
Epoch: 5046, Training Loss: 0.02090
Epoch: 5046, Training Loss: 0.02642
Epoch: 5046, Training Loss: 0.01998
Epoch: 5047, Training Loss: 0.02303
Epoch: 5047, Training Loss: 0.02089
Epoch: 5047, Training Loss: 0.02642
Epoch: 5047, Training Loss: 0.01997
Epoch: 5048, Training Loss: 0.02302
Epoch: 5048, Training Loss: 0.02089
Epoch: 5048, Training Loss: 0.02642
Epoch: 5048, Training Loss: 0.01997
Epoch: 5049, Training Loss: 0.02302
Epoch: 5049, Training Loss: 0.02089
Epoch: 5049, Training Loss: 0.02641
Epoch: 5049, Training Loss: 0.01997
Epoch: 5050, Training Loss: 0.02302
Epoch: 5050, Training Loss: 0.02089
Epoch: 5050, Training Loss: 0.02641
Epoch: 5050, Training Loss: 0.01997
Epoch: 5051, Training Loss: 0.02301
Epoch: 5051, Training Loss: 0.02088
Epoch: 5051, Training Loss: 0.02641
Epoch: 5051, Training Loss: 0.01996
Epoch: 5052, Training Loss: 0.02301
Epoch: 5052, Training Loss: 0.02088
Epoch: 5052, Training Loss: 0.02640
Epoch: 5052, Training Loss: 0.01996
Epoch: 5053, Training Loss: 0.02301
Epoch: 5053, Training Loss: 0.02088
Epoch: 5053, Training Loss: 0.02640
Epoch: 5053, Training Loss: 0.01996
Epoch: 5054, Training Loss: 0.02301
Epoch: 5054, Training Loss: 0.02088
Epoch: 5054, Training Loss: 0.02640
Epoch: 5054, Training Loss: 0.01996
Epoch: 5055, Training Loss: 0.02300
Epoch: 5055, Training Loss: 0.02088
Epoch: 5055, Training Loss: 0.02639
Epoch: 5055, Training Loss: 0.01996
Epoch: 5056, Training Loss: 0.02300
Epoch: 5056, Training Loss: 0.02087
Epoch: 5056, Training Loss: 0.02639
Epoch: 5056, Training Loss: 0.01995
Epoch: 5057, Training Loss: 0.02300
Epoch: 5057, Training Loss: 0.02087
Epoch: 5057, Training Loss: 0.02639
Epoch: 5057, Training Loss: 0.01995
Epoch: 5058, Training Loss: 0.02299
Epoch: 5058, Training Loss: 0.02087
Epoch: 5058, Training Loss: 0.02638
Epoch: 5058, Training Loss: 0.01995
Epoch: 5059, Training Loss: 0.02299
Epoch: 5059, Training Loss: 0.02087
Epoch: 5059, Training Loss: 0.02638
Epoch: 5059, Training Loss: 0.01995
Epoch: 5060, Training Loss: 0.02299
Epoch: 5060, Training Loss: 0.02086
Epoch: 5060, Training Loss: 0.02638
Epoch: 5060, Training Loss: 0.01994
Epoch: 5061, Training Loss: 0.02299
Epoch: 5061, Training Loss: 0.02086
Epoch: 5061, Training Loss: 0.02637
Epoch: 5061, Training Loss: 0.01994
Epoch: 5062, Training Loss: 0.02298
Epoch: 5062, Training Loss: 0.02086
Epoch: 5062, Training Loss: 0.02637
Epoch: 5062, Training Loss: 0.01994
Epoch: 5063, Training Loss: 0.02298
Epoch: 5063, Training Loss: 0.02086
Epoch: 5063, Training Loss: 0.02637
Epoch: 5063, Training Loss: 0.01994
Epoch: 5064, Training Loss: 0.02298
Epoch: 5064, Training Loss: 0.02085
Epoch: 5064, Training Loss: 0.02636
Epoch: 5064, Training Loss: 0.01993
Epoch: 5065, Training Loss: 0.02297
Epoch: 5065, Training Loss: 0.02085
Epoch: 5065, Training Loss: 0.02636
Epoch: 5065, Training Loss: 0.01993
Epoch: 5066, Training Loss: 0.02297
Epoch: 5066, Training Loss: 0.02085
Epoch: 5066, Training Loss: 0.02636
Epoch: 5066, Training Loss: 0.01993
Epoch: 5067, Training Loss: 0.02297
Epoch: 5067, Training Loss: 0.02085
Epoch: 5067, Training Loss: 0.02635
Epoch: 5067, Training Loss: 0.01993
Epoch: 5068, Training Loss: 0.02297
Epoch: 5068, Training Loss: 0.02084
Epoch: 5068, Training Loss: 0.02635
Epoch: 5068, Training Loss: 0.01992
Epoch: 5069, Training Loss: 0.02296
Epoch: 5069, Training Loss: 0.02084
Epoch: 5069, Training Loss: 0.02635
Epoch: 5069, Training Loss: 0.01992
Epoch: 5070, Training Loss: 0.02296
Epoch: 5070, Training Loss: 0.02084
Epoch: 5070, Training Loss: 0.02634
Epoch: 5070, Training Loss: 0.01992
Epoch: 5071, Training Loss: 0.02296
Epoch: 5071, Training Loss: 0.02084
Epoch: 5071, Training Loss: 0.02634
Epoch: 5071, Training Loss: 0.01992
Epoch: 5072, Training Loss: 0.02295
Epoch: 5072, Training Loss: 0.02083
Epoch: 5072, Training Loss: 0.02634
Epoch: 5072, Training Loss: 0.01991
Epoch: 5073, Training Loss: 0.02295
Epoch: 5073, Training Loss: 0.02083
Epoch: 5073, Training Loss: 0.02633
Epoch: 5073, Training Loss: 0.01991
Epoch: 5074, Training Loss: 0.02295
Epoch: 5074, Training Loss: 0.02083
Epoch: 5074, Training Loss: 0.02633
Epoch: 5074, Training Loss: 0.01991
Epoch: 5075, Training Loss: 0.02295
Epoch: 5075, Training Loss: 0.02083
Epoch: 5075, Training Loss: 0.02633
Epoch: 5075, Training Loss: 0.01991
Epoch: 5076, Training Loss: 0.02294
Epoch: 5076, Training Loss: 0.02082
Epoch: 5076, Training Loss: 0.02632
Epoch: 5076, Training Loss: 0.01991
Epoch: 5077, Training Loss: 0.02294
Epoch: 5077, Training Loss: 0.02082
Epoch: 5077, Training Loss: 0.02632
Epoch: 5077, Training Loss: 0.01990
Epoch: 5078, Training Loss: 0.02294
Epoch: 5078, Training Loss: 0.02082
Epoch: 5078, Training Loss: 0.02632
Epoch: 5078, Training Loss: 0.01990
Epoch: 5079, Training Loss: 0.02294
Epoch: 5079, Training Loss: 0.02082
Epoch: 5079, Training Loss: 0.02631
Epoch: 5079, Training Loss: 0.01990
Epoch: 5080, Training Loss: 0.02293
Epoch: 5080, Training Loss: 0.02081
Epoch: 5080, Training Loss: 0.02631
Epoch: 5080, Training Loss: 0.01990
Epoch: 5081, Training Loss: 0.02293
Epoch: 5081, Training Loss: 0.02081
Epoch: 5081, Training Loss: 0.02631
Epoch: 5081, Training Loss: 0.01989
Epoch: 5082, Training Loss: 0.02293
Epoch: 5082, Training Loss: 0.02081
Epoch: 5082, Training Loss: 0.02631
Epoch: 5082, Training Loss: 0.01989
Epoch: 5083, Training Loss: 0.02292
Epoch: 5083, Training Loss: 0.02081
Epoch: 5083, Training Loss: 0.02630
Epoch: 5083, Training Loss: 0.01989
Epoch: 5084, Training Loss: 0.02292
Epoch: 5084, Training Loss: 0.02080
Epoch: 5084, Training Loss: 0.02630
Epoch: 5084, Training Loss: 0.01989
Epoch: 5085, Training Loss: 0.02292
Epoch: 5085, Training Loss: 0.02080
Epoch: 5085, Training Loss: 0.02630
Epoch: 5085, Training Loss: 0.01988
Epoch: 5086, Training Loss: 0.02292
Epoch: 5086, Training Loss: 0.02080
Epoch: 5086, Training Loss: 0.02629
Epoch: 5086, Training Loss: 0.01988
Epoch: 5087, Training Loss: 0.02291
Epoch: 5087, Training Loss: 0.02080
Epoch: 5087, Training Loss: 0.02629
Epoch: 5087, Training Loss: 0.01988
Epoch: 5088, Training Loss: 0.02291
Epoch: 5088, Training Loss: 0.02079
Epoch: 5088, Training Loss: 0.02629
Epoch: 5088, Training Loss: 0.01988
Epoch: 5089, Training Loss: 0.02291
Epoch: 5089, Training Loss: 0.02079
Epoch: 5089, Training Loss: 0.02628
Epoch: 5089, Training Loss: 0.01987
Epoch: 5090, Training Loss: 0.02290
Epoch: 5090, Training Loss: 0.02079
Epoch: 5090, Training Loss: 0.02628
Epoch: 5090, Training Loss: 0.01987
Epoch: 5091, Training Loss: 0.02290
Epoch: 5091, Training Loss: 0.02079
Epoch: 5091, Training Loss: 0.02628
Epoch: 5091, Training Loss: 0.01987
Epoch: 5092, Training Loss: 0.02290
Epoch: 5092, Training Loss: 0.02079
Epoch: 5092, Training Loss: 0.02627
Epoch: 5092, Training Loss: 0.01987
Epoch: 5093, Training Loss: 0.02290
Epoch: 5093, Training Loss: 0.02078
Epoch: 5093, Training Loss: 0.02627
Epoch: 5093, Training Loss: 0.01987
Epoch: 5094, Training Loss: 0.02289
Epoch: 5094, Training Loss: 0.02078
Epoch: 5094, Training Loss: 0.02627
Epoch: 5094, Training Loss: 0.01986
Epoch: 5095, Training Loss: 0.02289
Epoch: 5095, Training Loss: 0.02078
Epoch: 5095, Training Loss: 0.02626
Epoch: 5095, Training Loss: 0.01986
Epoch: 5096, Training Loss: 0.02289
Epoch: 5096, Training Loss: 0.02078
Epoch: 5096, Training Loss: 0.02626
Epoch: 5096, Training Loss: 0.01986
Epoch: 5097, Training Loss: 0.02289
Epoch: 5097, Training Loss: 0.02077
Epoch: 5097, Training Loss: 0.02626
Epoch: 5097, Training Loss: 0.01986
Epoch: 5098, Training Loss: 0.02288
Epoch: 5098, Training Loss: 0.02077
Epoch: 5098, Training Loss: 0.02625
Epoch: 5098, Training Loss: 0.01985
... (per-sample losses for epochs 5099-5347 omitted: each epoch prints four losses, one per XOR pattern, declining slowly from 0.02288 / 0.02077 / 0.02625 / 0.01985 at epoch 5099 to 0.02222 / 0.02020 / 0.02548 / 0.01929 at epoch 5347) ...
Epoch: 5348, Training Loss: 0.02222
Epoch: 5348, Training Loss: 0.02019
Epoch: 5348, Training Loss: 0.02548
Epoch: 5348, Training Loss: 0.01929
Epoch: 5349, Training Loss: 0.02221
Epoch: 5349, Training Loss: 0.02019
Epoch: 5349, Training Loss: 0.02547
Epoch: 5349, Training Loss: 0.01929
Epoch: 5350, Training Loss: 0.02221
Epoch: 5350, Training Loss: 0.02019
Epoch: 5350, Training Loss: 0.02547
Epoch: 5350, Training Loss: 0.01929
Epoch: 5351, Training Loss: 0.02221
Epoch: 5351, Training Loss: 0.02019
Epoch: 5351, Training Loss: 0.02547
Epoch: 5351, Training Loss: 0.01928
Epoch: 5352, Training Loss: 0.02221
Epoch: 5352, Training Loss: 0.02018
Epoch: 5352, Training Loss: 0.02547
Epoch: 5352, Training Loss: 0.01928
Epoch: 5353, Training Loss: 0.02220
Epoch: 5353, Training Loss: 0.02018
Epoch: 5353, Training Loss: 0.02546
Epoch: 5353, Training Loss: 0.01928
Epoch: 5354, Training Loss: 0.02220
Epoch: 5354, Training Loss: 0.02018
Epoch: 5354, Training Loss: 0.02546
Epoch: 5354, Training Loss: 0.01928
Epoch: 5355, Training Loss: 0.02220
Epoch: 5355, Training Loss: 0.02018
Epoch: 5355, Training Loss: 0.02546
Epoch: 5355, Training Loss: 0.01927
Epoch: 5356, Training Loss: 0.02220
Epoch: 5356, Training Loss: 0.02018
Epoch: 5356, Training Loss: 0.02545
Epoch: 5356, Training Loss: 0.01927
Epoch: 5357, Training Loss: 0.02219
Epoch: 5357, Training Loss: 0.02017
Epoch: 5357, Training Loss: 0.02545
Epoch: 5357, Training Loss: 0.01927
Epoch: 5358, Training Loss: 0.02219
Epoch: 5358, Training Loss: 0.02017
Epoch: 5358, Training Loss: 0.02545
Epoch: 5358, Training Loss: 0.01927
Epoch: 5359, Training Loss: 0.02219
Epoch: 5359, Training Loss: 0.02017
Epoch: 5359, Training Loss: 0.02544
Epoch: 5359, Training Loss: 0.01927
Epoch: 5360, Training Loss: 0.02219
Epoch: 5360, Training Loss: 0.02017
Epoch: 5360, Training Loss: 0.02544
Epoch: 5360, Training Loss: 0.01926
Epoch: 5361, Training Loss: 0.02218
Epoch: 5361, Training Loss: 0.02016
Epoch: 5361, Training Loss: 0.02544
Epoch: 5361, Training Loss: 0.01926
Epoch: 5362, Training Loss: 0.02218
Epoch: 5362, Training Loss: 0.02016
Epoch: 5362, Training Loss: 0.02544
Epoch: 5362, Training Loss: 0.01926
Epoch: 5363, Training Loss: 0.02218
Epoch: 5363, Training Loss: 0.02016
Epoch: 5363, Training Loss: 0.02543
Epoch: 5363, Training Loss: 0.01926
Epoch: 5364, Training Loss: 0.02218
Epoch: 5364, Training Loss: 0.02016
Epoch: 5364, Training Loss: 0.02543
Epoch: 5364, Training Loss: 0.01925
Epoch: 5365, Training Loss: 0.02217
Epoch: 5365, Training Loss: 0.02016
Epoch: 5365, Training Loss: 0.02543
Epoch: 5365, Training Loss: 0.01925
Epoch: 5366, Training Loss: 0.02217
Epoch: 5366, Training Loss: 0.02015
Epoch: 5366, Training Loss: 0.02542
Epoch: 5366, Training Loss: 0.01925
Epoch: 5367, Training Loss: 0.02217
Epoch: 5367, Training Loss: 0.02015
Epoch: 5367, Training Loss: 0.02542
Epoch: 5367, Training Loss: 0.01925
Epoch: 5368, Training Loss: 0.02217
Epoch: 5368, Training Loss: 0.02015
Epoch: 5368, Training Loss: 0.02542
Epoch: 5368, Training Loss: 0.01925
Epoch: 5369, Training Loss: 0.02216
Epoch: 5369, Training Loss: 0.02015
Epoch: 5369, Training Loss: 0.02541
Epoch: 5369, Training Loss: 0.01924
Epoch: 5370, Training Loss: 0.02216
Epoch: 5370, Training Loss: 0.02015
Epoch: 5370, Training Loss: 0.02541
Epoch: 5370, Training Loss: 0.01924
Epoch: 5371, Training Loss: 0.02216
Epoch: 5371, Training Loss: 0.02014
Epoch: 5371, Training Loss: 0.02541
Epoch: 5371, Training Loss: 0.01924
Epoch: 5372, Training Loss: 0.02216
Epoch: 5372, Training Loss: 0.02014
Epoch: 5372, Training Loss: 0.02541
Epoch: 5372, Training Loss: 0.01924
Epoch: 5373, Training Loss: 0.02215
Epoch: 5373, Training Loss: 0.02014
Epoch: 5373, Training Loss: 0.02540
Epoch: 5373, Training Loss: 0.01924
Epoch: 5374, Training Loss: 0.02215
Epoch: 5374, Training Loss: 0.02014
Epoch: 5374, Training Loss: 0.02540
Epoch: 5374, Training Loss: 0.01923
Epoch: 5375, Training Loss: 0.02215
Epoch: 5375, Training Loss: 0.02013
Epoch: 5375, Training Loss: 0.02540
Epoch: 5375, Training Loss: 0.01923
Epoch: 5376, Training Loss: 0.02215
Epoch: 5376, Training Loss: 0.02013
Epoch: 5376, Training Loss: 0.02539
Epoch: 5376, Training Loss: 0.01923
Epoch: 5377, Training Loss: 0.02214
Epoch: 5377, Training Loss: 0.02013
Epoch: 5377, Training Loss: 0.02539
Epoch: 5377, Training Loss: 0.01923
Epoch: 5378, Training Loss: 0.02214
Epoch: 5378, Training Loss: 0.02013
Epoch: 5378, Training Loss: 0.02539
Epoch: 5378, Training Loss: 0.01922
Epoch: 5379, Training Loss: 0.02214
Epoch: 5379, Training Loss: 0.02013
Epoch: 5379, Training Loss: 0.02539
Epoch: 5379, Training Loss: 0.01922
Epoch: 5380, Training Loss: 0.02214
Epoch: 5380, Training Loss: 0.02012
Epoch: 5380, Training Loss: 0.02538
Epoch: 5380, Training Loss: 0.01922
Epoch: 5381, Training Loss: 0.02213
Epoch: 5381, Training Loss: 0.02012
Epoch: 5381, Training Loss: 0.02538
Epoch: 5381, Training Loss: 0.01922
Epoch: 5382, Training Loss: 0.02213
Epoch: 5382, Training Loss: 0.02012
Epoch: 5382, Training Loss: 0.02538
Epoch: 5382, Training Loss: 0.01922
Epoch: 5383, Training Loss: 0.02213
Epoch: 5383, Training Loss: 0.02012
Epoch: 5383, Training Loss: 0.02537
Epoch: 5383, Training Loss: 0.01921
Epoch: 5384, Training Loss: 0.02213
Epoch: 5384, Training Loss: 0.02011
Epoch: 5384, Training Loss: 0.02537
Epoch: 5384, Training Loss: 0.01921
Epoch: 5385, Training Loss: 0.02212
Epoch: 5385, Training Loss: 0.02011
Epoch: 5385, Training Loss: 0.02537
Epoch: 5385, Training Loss: 0.01921
Epoch: 5386, Training Loss: 0.02212
Epoch: 5386, Training Loss: 0.02011
Epoch: 5386, Training Loss: 0.02536
Epoch: 5386, Training Loss: 0.01921
Epoch: 5387, Training Loss: 0.02212
Epoch: 5387, Training Loss: 0.02011
Epoch: 5387, Training Loss: 0.02536
Epoch: 5387, Training Loss: 0.01921
Epoch: 5388, Training Loss: 0.02212
Epoch: 5388, Training Loss: 0.02011
Epoch: 5388, Training Loss: 0.02536
Epoch: 5388, Training Loss: 0.01920
Epoch: 5389, Training Loss: 0.02211
Epoch: 5389, Training Loss: 0.02010
Epoch: 5389, Training Loss: 0.02536
Epoch: 5389, Training Loss: 0.01920
Epoch: 5390, Training Loss: 0.02211
Epoch: 5390, Training Loss: 0.02010
Epoch: 5390, Training Loss: 0.02535
Epoch: 5390, Training Loss: 0.01920
Epoch: 5391, Training Loss: 0.02211
Epoch: 5391, Training Loss: 0.02010
Epoch: 5391, Training Loss: 0.02535
Epoch: 5391, Training Loss: 0.01920
Epoch: 5392, Training Loss: 0.02211
Epoch: 5392, Training Loss: 0.02010
Epoch: 5392, Training Loss: 0.02535
Epoch: 5392, Training Loss: 0.01919
Epoch: 5393, Training Loss: 0.02210
Epoch: 5393, Training Loss: 0.02009
Epoch: 5393, Training Loss: 0.02534
Epoch: 5393, Training Loss: 0.01919
Epoch: 5394, Training Loss: 0.02210
Epoch: 5394, Training Loss: 0.02009
Epoch: 5394, Training Loss: 0.02534
Epoch: 5394, Training Loss: 0.01919
Epoch: 5395, Training Loss: 0.02210
Epoch: 5395, Training Loss: 0.02009
Epoch: 5395, Training Loss: 0.02534
Epoch: 5395, Training Loss: 0.01919
Epoch: 5396, Training Loss: 0.02209
Epoch: 5396, Training Loss: 0.02009
Epoch: 5396, Training Loss: 0.02534
Epoch: 5396, Training Loss: 0.01919
Epoch: 5397, Training Loss: 0.02209
Epoch: 5397, Training Loss: 0.02009
Epoch: 5397, Training Loss: 0.02533
Epoch: 5397, Training Loss: 0.01918
Epoch: 5398, Training Loss: 0.02209
Epoch: 5398, Training Loss: 0.02008
Epoch: 5398, Training Loss: 0.02533
Epoch: 5398, Training Loss: 0.01918
Epoch: 5399, Training Loss: 0.02209
Epoch: 5399, Training Loss: 0.02008
Epoch: 5399, Training Loss: 0.02533
Epoch: 5399, Training Loss: 0.01918
Epoch: 5400, Training Loss: 0.02208
Epoch: 5400, Training Loss: 0.02008
Epoch: 5400, Training Loss: 0.02532
Epoch: 5400, Training Loss: 0.01918
Epoch: 5401, Training Loss: 0.02208
Epoch: 5401, Training Loss: 0.02008
Epoch: 5401, Training Loss: 0.02532
Epoch: 5401, Training Loss: 0.01918
Epoch: 5402, Training Loss: 0.02208
Epoch: 5402, Training Loss: 0.02008
Epoch: 5402, Training Loss: 0.02532
Epoch: 5402, Training Loss: 0.01917
Epoch: 5403, Training Loss: 0.02208
Epoch: 5403, Training Loss: 0.02007
Epoch: 5403, Training Loss: 0.02532
Epoch: 5403, Training Loss: 0.01917
Epoch: 5404, Training Loss: 0.02207
Epoch: 5404, Training Loss: 0.02007
Epoch: 5404, Training Loss: 0.02531
Epoch: 5404, Training Loss: 0.01917
Epoch: 5405, Training Loss: 0.02207
Epoch: 5405, Training Loss: 0.02007
Epoch: 5405, Training Loss: 0.02531
Epoch: 5405, Training Loss: 0.01917
Epoch: 5406, Training Loss: 0.02207
Epoch: 5406, Training Loss: 0.02007
Epoch: 5406, Training Loss: 0.02531
Epoch: 5406, Training Loss: 0.01917
Epoch: 5407, Training Loss: 0.02207
Epoch: 5407, Training Loss: 0.02006
Epoch: 5407, Training Loss: 0.02530
Epoch: 5407, Training Loss: 0.01916
Epoch: 5408, Training Loss: 0.02206
Epoch: 5408, Training Loss: 0.02006
Epoch: 5408, Training Loss: 0.02530
Epoch: 5408, Training Loss: 0.01916
Epoch: 5409, Training Loss: 0.02206
Epoch: 5409, Training Loss: 0.02006
Epoch: 5409, Training Loss: 0.02530
Epoch: 5409, Training Loss: 0.01916
Epoch: 5410, Training Loss: 0.02206
Epoch: 5410, Training Loss: 0.02006
Epoch: 5410, Training Loss: 0.02529
Epoch: 5410, Training Loss: 0.01916
Epoch: 5411, Training Loss: 0.02206
Epoch: 5411, Training Loss: 0.02006
Epoch: 5411, Training Loss: 0.02529
Epoch: 5411, Training Loss: 0.01915
Epoch: 5412, Training Loss: 0.02205
Epoch: 5412, Training Loss: 0.02005
Epoch: 5412, Training Loss: 0.02529
Epoch: 5412, Training Loss: 0.01915
Epoch: 5413, Training Loss: 0.02205
Epoch: 5413, Training Loss: 0.02005
Epoch: 5413, Training Loss: 0.02529
Epoch: 5413, Training Loss: 0.01915
Epoch: 5414, Training Loss: 0.02205
Epoch: 5414, Training Loss: 0.02005
Epoch: 5414, Training Loss: 0.02528
Epoch: 5414, Training Loss: 0.01915
Epoch: 5415, Training Loss: 0.02205
Epoch: 5415, Training Loss: 0.02005
Epoch: 5415, Training Loss: 0.02528
Epoch: 5415, Training Loss: 0.01915
Epoch: 5416, Training Loss: 0.02204
Epoch: 5416, Training Loss: 0.02004
Epoch: 5416, Training Loss: 0.02528
Epoch: 5416, Training Loss: 0.01914
Epoch: 5417, Training Loss: 0.02204
Epoch: 5417, Training Loss: 0.02004
Epoch: 5417, Training Loss: 0.02527
Epoch: 5417, Training Loss: 0.01914
Epoch: 5418, Training Loss: 0.02204
Epoch: 5418, Training Loss: 0.02004
Epoch: 5418, Training Loss: 0.02527
Epoch: 5418, Training Loss: 0.01914
Epoch: 5419, Training Loss: 0.02204
Epoch: 5419, Training Loss: 0.02004
Epoch: 5419, Training Loss: 0.02527
Epoch: 5419, Training Loss: 0.01914
Epoch: 5420, Training Loss: 0.02203
Epoch: 5420, Training Loss: 0.02004
Epoch: 5420, Training Loss: 0.02527
Epoch: 5420, Training Loss: 0.01914
Epoch: 5421, Training Loss: 0.02203
Epoch: 5421, Training Loss: 0.02003
Epoch: 5421, Training Loss: 0.02526
Epoch: 5421, Training Loss: 0.01913
Epoch: 5422, Training Loss: 0.02203
Epoch: 5422, Training Loss: 0.02003
Epoch: 5422, Training Loss: 0.02526
Epoch: 5422, Training Loss: 0.01913
Epoch: 5423, Training Loss: 0.02203
Epoch: 5423, Training Loss: 0.02003
Epoch: 5423, Training Loss: 0.02526
Epoch: 5423, Training Loss: 0.01913
Epoch: 5424, Training Loss: 0.02202
Epoch: 5424, Training Loss: 0.02003
Epoch: 5424, Training Loss: 0.02525
Epoch: 5424, Training Loss: 0.01913
Epoch: 5425, Training Loss: 0.02202
Epoch: 5425, Training Loss: 0.02003
Epoch: 5425, Training Loss: 0.02525
Epoch: 5425, Training Loss: 0.01912
Epoch: 5426, Training Loss: 0.02202
Epoch: 5426, Training Loss: 0.02002
Epoch: 5426, Training Loss: 0.02525
Epoch: 5426, Training Loss: 0.01912
Epoch: 5427, Training Loss: 0.02202
Epoch: 5427, Training Loss: 0.02002
Epoch: 5427, Training Loss: 0.02525
Epoch: 5427, Training Loss: 0.01912
Epoch: 5428, Training Loss: 0.02202
Epoch: 5428, Training Loss: 0.02002
Epoch: 5428, Training Loss: 0.02524
Epoch: 5428, Training Loss: 0.01912
Epoch: 5429, Training Loss: 0.02201
Epoch: 5429, Training Loss: 0.02002
Epoch: 5429, Training Loss: 0.02524
Epoch: 5429, Training Loss: 0.01912
Epoch: 5430, Training Loss: 0.02201
Epoch: 5430, Training Loss: 0.02001
Epoch: 5430, Training Loss: 0.02524
Epoch: 5430, Training Loss: 0.01911
Epoch: 5431, Training Loss: 0.02201
Epoch: 5431, Training Loss: 0.02001
Epoch: 5431, Training Loss: 0.02523
Epoch: 5431, Training Loss: 0.01911
Epoch: 5432, Training Loss: 0.02201
Epoch: 5432, Training Loss: 0.02001
Epoch: 5432, Training Loss: 0.02523
Epoch: 5432, Training Loss: 0.01911
Epoch: 5433, Training Loss: 0.02200
Epoch: 5433, Training Loss: 0.02001
Epoch: 5433, Training Loss: 0.02523
Epoch: 5433, Training Loss: 0.01911
Epoch: 5434, Training Loss: 0.02200
Epoch: 5434, Training Loss: 0.02001
Epoch: 5434, Training Loss: 0.02523
Epoch: 5434, Training Loss: 0.01911
Epoch: 5435, Training Loss: 0.02200
Epoch: 5435, Training Loss: 0.02000
Epoch: 5435, Training Loss: 0.02522
Epoch: 5435, Training Loss: 0.01910
Epoch: 5436, Training Loss: 0.02200
Epoch: 5436, Training Loss: 0.02000
Epoch: 5436, Training Loss: 0.02522
Epoch: 5436, Training Loss: 0.01910
Epoch: 5437, Training Loss: 0.02199
Epoch: 5437, Training Loss: 0.02000
Epoch: 5437, Training Loss: 0.02522
Epoch: 5437, Training Loss: 0.01910
Epoch: 5438, Training Loss: 0.02199
Epoch: 5438, Training Loss: 0.02000
Epoch: 5438, Training Loss: 0.02521
Epoch: 5438, Training Loss: 0.01910
Epoch: 5439, Training Loss: 0.02199
Epoch: 5439, Training Loss: 0.01999
Epoch: 5439, Training Loss: 0.02521
Epoch: 5439, Training Loss: 0.01910
Epoch: 5440, Training Loss: 0.02199
Epoch: 5440, Training Loss: 0.01999
Epoch: 5440, Training Loss: 0.02521
Epoch: 5440, Training Loss: 0.01909
Epoch: 5441, Training Loss: 0.02198
Epoch: 5441, Training Loss: 0.01999
Epoch: 5441, Training Loss: 0.02520
Epoch: 5441, Training Loss: 0.01909
Epoch: 5442, Training Loss: 0.02198
Epoch: 5442, Training Loss: 0.01999
Epoch: 5442, Training Loss: 0.02520
Epoch: 5442, Training Loss: 0.01909
Epoch: 5443, Training Loss: 0.02198
Epoch: 5443, Training Loss: 0.01999
Epoch: 5443, Training Loss: 0.02520
Epoch: 5443, Training Loss: 0.01909
Epoch: 5444, Training Loss: 0.02198
Epoch: 5444, Training Loss: 0.01998
Epoch: 5444, Training Loss: 0.02520
Epoch: 5444, Training Loss: 0.01908
Epoch: 5445, Training Loss: 0.02197
Epoch: 5445, Training Loss: 0.01998
Epoch: 5445, Training Loss: 0.02519
Epoch: 5445, Training Loss: 0.01908
Epoch: 5446, Training Loss: 0.02197
Epoch: 5446, Training Loss: 0.01998
Epoch: 5446, Training Loss: 0.02519
Epoch: 5446, Training Loss: 0.01908
Epoch: 5447, Training Loss: 0.02197
Epoch: 5447, Training Loss: 0.01998
Epoch: 5447, Training Loss: 0.02519
Epoch: 5447, Training Loss: 0.01908
Epoch: 5448, Training Loss: 0.02197
Epoch: 5448, Training Loss: 0.01998
Epoch: 5448, Training Loss: 0.02518
Epoch: 5448, Training Loss: 0.01908
Epoch: 5449, Training Loss: 0.02196
Epoch: 5449, Training Loss: 0.01997
Epoch: 5449, Training Loss: 0.02518
Epoch: 5449, Training Loss: 0.01907
Epoch: 5450, Training Loss: 0.02196
Epoch: 5450, Training Loss: 0.01997
Epoch: 5450, Training Loss: 0.02518
Epoch: 5450, Training Loss: 0.01907
Epoch: 5451, Training Loss: 0.02196
Epoch: 5451, Training Loss: 0.01997
Epoch: 5451, Training Loss: 0.02518
Epoch: 5451, Training Loss: 0.01907
Epoch: 5452, Training Loss: 0.02196
Epoch: 5452, Training Loss: 0.01997
Epoch: 5452, Training Loss: 0.02517
Epoch: 5452, Training Loss: 0.01907
Epoch: 5453, Training Loss: 0.02195
Epoch: 5453, Training Loss: 0.01996
Epoch: 5453, Training Loss: 0.02517
Epoch: 5453, Training Loss: 0.01907
Epoch: 5454, Training Loss: 0.02195
Epoch: 5454, Training Loss: 0.01996
Epoch: 5454, Training Loss: 0.02517
Epoch: 5454, Training Loss: 0.01906
Epoch: 5455, Training Loss: 0.02195
Epoch: 5455, Training Loss: 0.01996
Epoch: 5455, Training Loss: 0.02516
Epoch: 5455, Training Loss: 0.01906
Epoch: 5456, Training Loss: 0.02195
Epoch: 5456, Training Loss: 0.01996
Epoch: 5456, Training Loss: 0.02516
Epoch: 5456, Training Loss: 0.01906
Epoch: 5457, Training Loss: 0.02194
Epoch: 5457, Training Loss: 0.01996
Epoch: 5457, Training Loss: 0.02516
Epoch: 5457, Training Loss: 0.01906
Epoch: 5458, Training Loss: 0.02194
Epoch: 5458, Training Loss: 0.01995
Epoch: 5458, Training Loss: 0.02516
Epoch: 5458, Training Loss: 0.01906
Epoch: 5459, Training Loss: 0.02194
Epoch: 5459, Training Loss: 0.01995
Epoch: 5459, Training Loss: 0.02515
Epoch: 5459, Training Loss: 0.01905
Epoch: 5460, Training Loss: 0.02194
Epoch: 5460, Training Loss: 0.01995
Epoch: 5460, Training Loss: 0.02515
Epoch: 5460, Training Loss: 0.01905
Epoch: 5461, Training Loss: 0.02193
Epoch: 5461, Training Loss: 0.01995
Epoch: 5461, Training Loss: 0.02515
Epoch: 5461, Training Loss: 0.01905
Epoch: 5462, Training Loss: 0.02193
Epoch: 5462, Training Loss: 0.01995
Epoch: 5462, Training Loss: 0.02514
Epoch: 5462, Training Loss: 0.01905
Epoch: 5463, Training Loss: 0.02193
Epoch: 5463, Training Loss: 0.01994
Epoch: 5463, Training Loss: 0.02514
Epoch: 5463, Training Loss: 0.01905
Epoch: 5464, Training Loss: 0.02193
Epoch: 5464, Training Loss: 0.01994
Epoch: 5464, Training Loss: 0.02514
Epoch: 5464, Training Loss: 0.01904
Epoch: 5465, Training Loss: 0.02192
Epoch: 5465, Training Loss: 0.01994
Epoch: 5465, Training Loss: 0.02514
Epoch: 5465, Training Loss: 0.01904
Epoch: 5466, Training Loss: 0.02192
Epoch: 5466, Training Loss: 0.01994
Epoch: 5466, Training Loss: 0.02513
Epoch: 5466, Training Loss: 0.01904
Epoch: 5467, Training Loss: 0.02192
Epoch: 5467, Training Loss: 0.01993
Epoch: 5467, Training Loss: 0.02513
Epoch: 5467, Training Loss: 0.01904
Epoch: 5468, Training Loss: 0.02192
Epoch: 5468, Training Loss: 0.01993
Epoch: 5468, Training Loss: 0.02513
Epoch: 5468, Training Loss: 0.01903
Epoch: 5469, Training Loss: 0.02191
Epoch: 5469, Training Loss: 0.01993
Epoch: 5469, Training Loss: 0.02512
Epoch: 5469, Training Loss: 0.01903
Epoch: 5470, Training Loss: 0.02191
Epoch: 5470, Training Loss: 0.01993
Epoch: 5470, Training Loss: 0.02512
Epoch: 5470, Training Loss: 0.01903
Epoch: 5471, Training Loss: 0.02191
Epoch: 5471, Training Loss: 0.01993
Epoch: 5471, Training Loss: 0.02512
Epoch: 5471, Training Loss: 0.01903
Epoch: 5472, Training Loss: 0.02191
Epoch: 5472, Training Loss: 0.01992
Epoch: 5472, Training Loss: 0.02512
Epoch: 5472, Training Loss: 0.01903
Epoch: 5473, Training Loss: 0.02190
Epoch: 5473, Training Loss: 0.01992
Epoch: 5473, Training Loss: 0.02511
Epoch: 5473, Training Loss: 0.01902
Epoch: 5474, Training Loss: 0.02190
Epoch: 5474, Training Loss: 0.01992
Epoch: 5474, Training Loss: 0.02511
Epoch: 5474, Training Loss: 0.01902
Epoch: 5475, Training Loss: 0.02190
Epoch: 5475, Training Loss: 0.01992
Epoch: 5475, Training Loss: 0.02511
Epoch: 5475, Training Loss: 0.01902
Epoch: 5476, Training Loss: 0.02190
Epoch: 5476, Training Loss: 0.01992
Epoch: 5476, Training Loss: 0.02510
Epoch: 5476, Training Loss: 0.01902
Epoch: 5477, Training Loss: 0.02189
Epoch: 5477, Training Loss: 0.01991
Epoch: 5477, Training Loss: 0.02510
Epoch: 5477, Training Loss: 0.01902
Epoch: 5478, Training Loss: 0.02189
Epoch: 5478, Training Loss: 0.01991
Epoch: 5478, Training Loss: 0.02510
Epoch: 5478, Training Loss: 0.01901
Epoch: 5479, Training Loss: 0.02189
Epoch: 5479, Training Loss: 0.01991
Epoch: 5479, Training Loss: 0.02510
Epoch: 5479, Training Loss: 0.01901
Epoch: 5480, Training Loss: 0.02189
Epoch: 5480, Training Loss: 0.01991
Epoch: 5480, Training Loss: 0.02509
Epoch: 5480, Training Loss: 0.01901
Epoch: 5481, Training Loss: 0.02188
Epoch: 5481, Training Loss: 0.01991
Epoch: 5481, Training Loss: 0.02509
Epoch: 5481, Training Loss: 0.01901
Epoch: 5482, Training Loss: 0.02188
Epoch: 5482, Training Loss: 0.01990
Epoch: 5482, Training Loss: 0.02509
Epoch: 5482, Training Loss: 0.01901
Epoch: 5483, Training Loss: 0.02188
Epoch: 5483, Training Loss: 0.01990
Epoch: 5483, Training Loss: 0.02508
Epoch: 5483, Training Loss: 0.01900
Epoch: 5484, Training Loss: 0.02188
Epoch: 5484, Training Loss: 0.01990
Epoch: 5484, Training Loss: 0.02508
Epoch: 5484, Training Loss: 0.01900
Epoch: 5485, Training Loss: 0.02187
Epoch: 5485, Training Loss: 0.01990
Epoch: 5485, Training Loss: 0.02508
Epoch: 5485, Training Loss: 0.01900
Epoch: 5486, Training Loss: 0.02187
Epoch: 5486, Training Loss: 0.01989
Epoch: 5486, Training Loss: 0.02508
Epoch: 5486, Training Loss: 0.01900
Epoch: 5487, Training Loss: 0.02187
Epoch: 5487, Training Loss: 0.01989
Epoch: 5487, Training Loss: 0.02507
Epoch: 5487, Training Loss: 0.01900
Epoch: 5488, Training Loss: 0.02187
Epoch: 5488, Training Loss: 0.01989
Epoch: 5488, Training Loss: 0.02507
Epoch: 5488, Training Loss: 0.01899
Epoch: 5489, Training Loss: 0.02186
Epoch: 5489, Training Loss: 0.01989
Epoch: 5489, Training Loss: 0.02507
Epoch: 5489, Training Loss: 0.01899
Epoch: 5490, Training Loss: 0.02186
Epoch: 5490, Training Loss: 0.01989
Epoch: 5490, Training Loss: 0.02506
Epoch: 5490, Training Loss: 0.01899
Epoch: 5491, Training Loss: 0.02186
Epoch: 5491, Training Loss: 0.01988
Epoch: 5491, Training Loss: 0.02506
Epoch: 5491, Training Loss: 0.01899
Epoch: 5492, Training Loss: 0.02186
Epoch: 5492, Training Loss: 0.01988
Epoch: 5492, Training Loss: 0.02506
Epoch: 5492, Training Loss: 0.01899
Epoch: 5493, Training Loss: 0.02186
Epoch: 5493, Training Loss: 0.01988
Epoch: 5493, Training Loss: 0.02506
Epoch: 5493, Training Loss: 0.01898
Epoch: 5494, Training Loss: 0.02185
Epoch: 5494, Training Loss: 0.01988
Epoch: 5494, Training Loss: 0.02505
Epoch: 5494, Training Loss: 0.01898
Epoch: 5495, Training Loss: 0.02185
Epoch: 5495, Training Loss: 0.01988
Epoch: 5495, Training Loss: 0.02505
Epoch: 5495, Training Loss: 0.01898
Epoch: 5496, Training Loss: 0.02185
Epoch: 5496, Training Loss: 0.01987
Epoch: 5496, Training Loss: 0.02505
Epoch: 5496, Training Loss: 0.01898
Epoch: 5497, Training Loss: 0.02185
Epoch: 5497, Training Loss: 0.01987
Epoch: 5497, Training Loss: 0.02505
Epoch: 5497, Training Loss: 0.01897
Epoch: 5498, Training Loss: 0.02184
Epoch: 5498, Training Loss: 0.01987
Epoch: 5498, Training Loss: 0.02504
Epoch: 5498, Training Loss: 0.01897
Epoch: 5499, Training Loss: 0.02184
Epoch: 5499, Training Loss: 0.01987
Epoch: 5499, Training Loss: 0.02504
Epoch: 5499, Training Loss: 0.01897
Epoch: 5500, Training Loss: 0.02184
Epoch: 5500, Training Loss: 0.01986
Epoch: 5500, Training Loss: 0.02504
Epoch: 5500, Training Loss: 0.01897
Epoch: 5501, Training Loss: 0.02184
Epoch: 5501, Training Loss: 0.01986
Epoch: 5501, Training Loss: 0.02503
Epoch: 5501, Training Loss: 0.01897
Epoch: 5502, Training Loss: 0.02183
Epoch: 5502, Training Loss: 0.01986
Epoch: 5502, Training Loss: 0.02503
Epoch: 5502, Training Loss: 0.01896
Epoch: 5503, Training Loss: 0.02183
Epoch: 5503, Training Loss: 0.01986
Epoch: 5503, Training Loss: 0.02503
Epoch: 5503, Training Loss: 0.01896
Epoch: 5504, Training Loss: 0.02183
Epoch: 5504, Training Loss: 0.01986
Epoch: 5504, Training Loss: 0.02503
Epoch: 5504, Training Loss: 0.01896
Epoch: 5505, Training Loss: 0.02183
Epoch: 5505, Training Loss: 0.01985
Epoch: 5505, Training Loss: 0.02502
Epoch: 5505, Training Loss: 0.01896
Epoch: 5506, Training Loss: 0.02182
Epoch: 5506, Training Loss: 0.01985
Epoch: 5506, Training Loss: 0.02502
Epoch: 5506, Training Loss: 0.01896
Epoch: 5507, Training Loss: 0.02182
Epoch: 5507, Training Loss: 0.01985
Epoch: 5507, Training Loss: 0.02502
Epoch: 5507, Training Loss: 0.01895
Epoch: 5508, Training Loss: 0.02182
Epoch: 5508, Training Loss: 0.01985
Epoch: 5508, Training Loss: 0.02501
Epoch: 5508, Training Loss: 0.01895
Epoch: 5509, Training Loss: 0.02182
Epoch: 5509, Training Loss: 0.01985
Epoch: 5509, Training Loss: 0.02501
Epoch: 5509, Training Loss: 0.01895
Epoch: 5510, Training Loss: 0.02181
Epoch: 5510, Training Loss: 0.01984
Epoch: 5510, Training Loss: 0.02501
Epoch: 5510, Training Loss: 0.01895
Epoch: 5511, Training Loss: 0.02181
Epoch: 5511, Training Loss: 0.01984
Epoch: 5511, Training Loss: 0.02501
Epoch: 5511, Training Loss: 0.01895
Epoch: 5512, Training Loss: 0.02181
Epoch: 5512, Training Loss: 0.01984
Epoch: 5512, Training Loss: 0.02500
Epoch: 5512, Training Loss: 0.01894
Epoch: 5513, Training Loss: 0.02181
Epoch: 5513, Training Loss: 0.01984
Epoch: 5513, Training Loss: 0.02500
Epoch: 5513, Training Loss: 0.01894
Epoch: 5514, Training Loss: 0.02180
Epoch: 5514, Training Loss: 0.01984
Epoch: 5514, Training Loss: 0.02500
Epoch: 5514, Training Loss: 0.01894
Epoch: 5515, Training Loss: 0.02180
Epoch: 5515, Training Loss: 0.01983
Epoch: 5515, Training Loss: 0.02499
Epoch: 5515, Training Loss: 0.01894
Epoch: 5516, Training Loss: 0.02180
Epoch: 5516, Training Loss: 0.01983
Epoch: 5516, Training Loss: 0.02499
Epoch: 5516, Training Loss: 0.01894
Epoch: 5517, Training Loss: 0.02180
Epoch: 5517, Training Loss: 0.01983
Epoch: 5517, Training Loss: 0.02499
Epoch: 5517, Training Loss: 0.01893
Epoch: 5518, Training Loss: 0.02179
Epoch: 5518, Training Loss: 0.01983
Epoch: 5518, Training Loss: 0.02499
Epoch: 5518, Training Loss: 0.01893
Epoch: 5519, Training Loss: 0.02179
Epoch: 5519, Training Loss: 0.01982
Epoch: 5519, Training Loss: 0.02498
Epoch: 5519, Training Loss: 0.01893
Epoch: 5520, Training Loss: 0.02179
Epoch: 5520, Training Loss: 0.01982
Epoch: 5520, Training Loss: 0.02498
Epoch: 5520, Training Loss: 0.01893
Epoch: 5521, Training Loss: 0.02179
Epoch: 5521, Training Loss: 0.01982
Epoch: 5521, Training Loss: 0.02498
Epoch: 5521, Training Loss: 0.01893
Epoch: 5522, Training Loss: 0.02178
Epoch: 5522, Training Loss: 0.01982
Epoch: 5522, Training Loss: 0.02497
Epoch: 5522, Training Loss: 0.01892
Epoch: 5523, Training Loss: 0.02178
Epoch: 5523, Training Loss: 0.01982
Epoch: 5523, Training Loss: 0.02497
Epoch: 5523, Training Loss: 0.01892
Epoch: 5524, Training Loss: 0.02178
Epoch: 5524, Training Loss: 0.01981
Epoch: 5524, Training Loss: 0.02497
Epoch: 5524, Training Loss: 0.01892
Epoch: 5525, Training Loss: 0.02178
Epoch: 5525, Training Loss: 0.01981
Epoch: 5525, Training Loss: 0.02497
Epoch: 5525, Training Loss: 0.01892
Epoch: 5526, Training Loss: 0.02178
Epoch: 5526, Training Loss: 0.01981
Epoch: 5526, Training Loss: 0.02496
Epoch: 5526, Training Loss: 0.01892
Epoch: 5527, Training Loss: 0.02177
Epoch: 5527, Training Loss: 0.01981
Epoch: 5527, Training Loss: 0.02496
Epoch: 5527, Training Loss: 0.01891
Epoch: 5528, Training Loss: 0.02177
Epoch: 5528, Training Loss: 0.01981
Epoch: 5528, Training Loss: 0.02496
Epoch: 5528, Training Loss: 0.01891
Epoch: 5529, Training Loss: 0.02177
Epoch: 5529, Training Loss: 0.01980
Epoch: 5529, Training Loss: 0.02496
Epoch: 5529, Training Loss: 0.01891
Epoch: 5530, Training Loss: 0.02177
Epoch: 5530, Training Loss: 0.01980
Epoch: 5530, Training Loss: 0.02495
Epoch: 5530, Training Loss: 0.01891
Epoch: 5531, Training Loss: 0.02176
Epoch: 5531, Training Loss: 0.01980
Epoch: 5531, Training Loss: 0.02495
Epoch: 5531, Training Loss: 0.01891
Epoch: 5532, Training Loss: 0.02176
Epoch: 5532, Training Loss: 0.01980
Epoch: 5532, Training Loss: 0.02495
Epoch: 5532, Training Loss: 0.01890
Epoch: 5533, Training Loss: 0.02176
Epoch: 5533, Training Loss: 0.01980
Epoch: 5533, Training Loss: 0.02494
Epoch: 5533, Training Loss: 0.01890
Epoch: 5534, Training Loss: 0.02176
Epoch: 5534, Training Loss: 0.01979
Epoch: 5534, Training Loss: 0.02494
Epoch: 5534, Training Loss: 0.01890
Epoch: 5535, Training Loss: 0.02175
Epoch: 5535, Training Loss: 0.01979
Epoch: 5535, Training Loss: 0.02494
Epoch: 5535, Training Loss: 0.01890
Epoch: 5536, Training Loss: 0.02175
Epoch: 5536, Training Loss: 0.01979
Epoch: 5536, Training Loss: 0.02494
Epoch: 5536, Training Loss: 0.01889
Epoch: 5537, Training Loss: 0.02175
Epoch: 5537, Training Loss: 0.01979
Epoch: 5537, Training Loss: 0.02493
Epoch: 5537, Training Loss: 0.01889
Epoch: 5538, Training Loss: 0.02175
Epoch: 5538, Training Loss: 0.01979
Epoch: 5538, Training Loss: 0.02493
Epoch: 5538, Training Loss: 0.01889
Epoch: 5539, Training Loss: 0.02174
Epoch: 5539, Training Loss: 0.01978
Epoch: 5539, Training Loss: 0.02493
Epoch: 5539, Training Loss: 0.01889
Epoch: 5540, Training Loss: 0.02174
Epoch: 5540, Training Loss: 0.01978
Epoch: 5540, Training Loss: 0.02492
Epoch: 5540, Training Loss: 0.01889
Epoch: 5541, Training Loss: 0.02174
Epoch: 5541, Training Loss: 0.01978
Epoch: 5541, Training Loss: 0.02492
Epoch: 5541, Training Loss: 0.01888
Epoch: 5542, Training Loss: 0.02174
Epoch: 5542, Training Loss: 0.01978
Epoch: 5542, Training Loss: 0.02492
Epoch: 5542, Training Loss: 0.01888
[... per-sample training log elided for epochs 5543–5791: the four losses printed each epoch decline slowly from (0.02173, 0.01977, 0.02492, 0.01888) toward (0.02116, 0.01928, 0.02425, 0.01839) ...]
Epoch: 5792, Training Loss: 0.01839
Epoch: 5793, Training Loss: 0.02116
Epoch: 5793, Training Loss: 0.01927
Epoch: 5793, Training Loss: 0.02425
Epoch: 5793, Training Loss: 0.01839
Epoch: 5794, Training Loss: 0.02116
Epoch: 5794, Training Loss: 0.01927
Epoch: 5794, Training Loss: 0.02424
Epoch: 5794, Training Loss: 0.01839
Epoch: 5795, Training Loss: 0.02115
Epoch: 5795, Training Loss: 0.01927
Epoch: 5795, Training Loss: 0.02424
Epoch: 5795, Training Loss: 0.01839
Epoch: 5796, Training Loss: 0.02115
Epoch: 5796, Training Loss: 0.01927
Epoch: 5796, Training Loss: 0.02424
Epoch: 5796, Training Loss: 0.01839
Epoch: 5797, Training Loss: 0.02115
Epoch: 5797, Training Loss: 0.01927
Epoch: 5797, Training Loss: 0.02424
Epoch: 5797, Training Loss: 0.01838
Epoch: 5798, Training Loss: 0.02115
Epoch: 5798, Training Loss: 0.01926
Epoch: 5798, Training Loss: 0.02423
Epoch: 5798, Training Loss: 0.01838
Epoch: 5799, Training Loss: 0.02115
Epoch: 5799, Training Loss: 0.01926
Epoch: 5799, Training Loss: 0.02423
Epoch: 5799, Training Loss: 0.01838
Epoch: 5800, Training Loss: 0.02114
Epoch: 5800, Training Loss: 0.01926
Epoch: 5800, Training Loss: 0.02423
Epoch: 5800, Training Loss: 0.01838
Epoch: 5801, Training Loss: 0.02114
Epoch: 5801, Training Loss: 0.01926
Epoch: 5801, Training Loss: 0.02423
Epoch: 5801, Training Loss: 0.01838
Epoch: 5802, Training Loss: 0.02114
Epoch: 5802, Training Loss: 0.01926
Epoch: 5802, Training Loss: 0.02422
Epoch: 5802, Training Loss: 0.01838
Epoch: 5803, Training Loss: 0.02114
Epoch: 5803, Training Loss: 0.01925
Epoch: 5803, Training Loss: 0.02422
Epoch: 5803, Training Loss: 0.01837
Epoch: 5804, Training Loss: 0.02113
Epoch: 5804, Training Loss: 0.01925
Epoch: 5804, Training Loss: 0.02422
Epoch: 5804, Training Loss: 0.01837
Epoch: 5805, Training Loss: 0.02113
Epoch: 5805, Training Loss: 0.01925
Epoch: 5805, Training Loss: 0.02422
Epoch: 5805, Training Loss: 0.01837
Epoch: 5806, Training Loss: 0.02113
Epoch: 5806, Training Loss: 0.01925
Epoch: 5806, Training Loss: 0.02421
Epoch: 5806, Training Loss: 0.01837
Epoch: 5807, Training Loss: 0.02113
Epoch: 5807, Training Loss: 0.01925
Epoch: 5807, Training Loss: 0.02421
Epoch: 5807, Training Loss: 0.01837
Epoch: 5808, Training Loss: 0.02113
Epoch: 5808, Training Loss: 0.01924
Epoch: 5808, Training Loss: 0.02421
Epoch: 5808, Training Loss: 0.01836
Epoch: 5809, Training Loss: 0.02112
Epoch: 5809, Training Loss: 0.01924
Epoch: 5809, Training Loss: 0.02421
Epoch: 5809, Training Loss: 0.01836
Epoch: 5810, Training Loss: 0.02112
Epoch: 5810, Training Loss: 0.01924
Epoch: 5810, Training Loss: 0.02420
Epoch: 5810, Training Loss: 0.01836
Epoch: 5811, Training Loss: 0.02112
Epoch: 5811, Training Loss: 0.01924
Epoch: 5811, Training Loss: 0.02420
Epoch: 5811, Training Loss: 0.01836
Epoch: 5812, Training Loss: 0.02112
Epoch: 5812, Training Loss: 0.01924
Epoch: 5812, Training Loss: 0.02420
Epoch: 5812, Training Loss: 0.01836
Epoch: 5813, Training Loss: 0.02111
Epoch: 5813, Training Loss: 0.01923
Epoch: 5813, Training Loss: 0.02420
Epoch: 5813, Training Loss: 0.01835
Epoch: 5814, Training Loss: 0.02111
Epoch: 5814, Training Loss: 0.01923
Epoch: 5814, Training Loss: 0.02419
Epoch: 5814, Training Loss: 0.01835
Epoch: 5815, Training Loss: 0.02111
Epoch: 5815, Training Loss: 0.01923
Epoch: 5815, Training Loss: 0.02419
Epoch: 5815, Training Loss: 0.01835
Epoch: 5816, Training Loss: 0.02111
Epoch: 5816, Training Loss: 0.01923
Epoch: 5816, Training Loss: 0.02419
Epoch: 5816, Training Loss: 0.01835
Epoch: 5817, Training Loss: 0.02111
Epoch: 5817, Training Loss: 0.01923
Epoch: 5817, Training Loss: 0.02419
Epoch: 5817, Training Loss: 0.01835
Epoch: 5818, Training Loss: 0.02110
Epoch: 5818, Training Loss: 0.01923
Epoch: 5818, Training Loss: 0.02418
Epoch: 5818, Training Loss: 0.01835
Epoch: 5819, Training Loss: 0.02110
Epoch: 5819, Training Loss: 0.01922
Epoch: 5819, Training Loss: 0.02418
Epoch: 5819, Training Loss: 0.01834
Epoch: 5820, Training Loss: 0.02110
Epoch: 5820, Training Loss: 0.01922
Epoch: 5820, Training Loss: 0.02418
Epoch: 5820, Training Loss: 0.01834
Epoch: 5821, Training Loss: 0.02110
Epoch: 5821, Training Loss: 0.01922
Epoch: 5821, Training Loss: 0.02418
Epoch: 5821, Training Loss: 0.01834
Epoch: 5822, Training Loss: 0.02110
Epoch: 5822, Training Loss: 0.01922
Epoch: 5822, Training Loss: 0.02417
Epoch: 5822, Training Loss: 0.01834
Epoch: 5823, Training Loss: 0.02109
Epoch: 5823, Training Loss: 0.01922
Epoch: 5823, Training Loss: 0.02417
Epoch: 5823, Training Loss: 0.01834
Epoch: 5824, Training Loss: 0.02109
Epoch: 5824, Training Loss: 0.01921
Epoch: 5824, Training Loss: 0.02417
Epoch: 5824, Training Loss: 0.01833
Epoch: 5825, Training Loss: 0.02109
Epoch: 5825, Training Loss: 0.01921
Epoch: 5825, Training Loss: 0.02416
Epoch: 5825, Training Loss: 0.01833
Epoch: 5826, Training Loss: 0.02109
Epoch: 5826, Training Loss: 0.01921
Epoch: 5826, Training Loss: 0.02416
Epoch: 5826, Training Loss: 0.01833
Epoch: 5827, Training Loss: 0.02108
Epoch: 5827, Training Loss: 0.01921
Epoch: 5827, Training Loss: 0.02416
Epoch: 5827, Training Loss: 0.01833
Epoch: 5828, Training Loss: 0.02108
Epoch: 5828, Training Loss: 0.01921
Epoch: 5828, Training Loss: 0.02416
Epoch: 5828, Training Loss: 0.01833
Epoch: 5829, Training Loss: 0.02108
Epoch: 5829, Training Loss: 0.01920
Epoch: 5829, Training Loss: 0.02415
Epoch: 5829, Training Loss: 0.01833
Epoch: 5830, Training Loss: 0.02108
Epoch: 5830, Training Loss: 0.01920
Epoch: 5830, Training Loss: 0.02415
Epoch: 5830, Training Loss: 0.01832
Epoch: 5831, Training Loss: 0.02108
Epoch: 5831, Training Loss: 0.01920
Epoch: 5831, Training Loss: 0.02415
Epoch: 5831, Training Loss: 0.01832
Epoch: 5832, Training Loss: 0.02107
Epoch: 5832, Training Loss: 0.01920
Epoch: 5832, Training Loss: 0.02415
Epoch: 5832, Training Loss: 0.01832
Epoch: 5833, Training Loss: 0.02107
Epoch: 5833, Training Loss: 0.01920
Epoch: 5833, Training Loss: 0.02414
Epoch: 5833, Training Loss: 0.01832
Epoch: 5834, Training Loss: 0.02107
Epoch: 5834, Training Loss: 0.01919
Epoch: 5834, Training Loss: 0.02414
Epoch: 5834, Training Loss: 0.01832
Epoch: 5835, Training Loss: 0.02107
Epoch: 5835, Training Loss: 0.01919
Epoch: 5835, Training Loss: 0.02414
Epoch: 5835, Training Loss: 0.01831
Epoch: 5836, Training Loss: 0.02106
Epoch: 5836, Training Loss: 0.01919
Epoch: 5836, Training Loss: 0.02414
Epoch: 5836, Training Loss: 0.01831
Epoch: 5837, Training Loss: 0.02106
Epoch: 5837, Training Loss: 0.01919
Epoch: 5837, Training Loss: 0.02413
Epoch: 5837, Training Loss: 0.01831
Epoch: 5838, Training Loss: 0.02106
Epoch: 5838, Training Loss: 0.01919
Epoch: 5838, Training Loss: 0.02413
Epoch: 5838, Training Loss: 0.01831
Epoch: 5839, Training Loss: 0.02106
Epoch: 5839, Training Loss: 0.01919
Epoch: 5839, Training Loss: 0.02413
Epoch: 5839, Training Loss: 0.01831
Epoch: 5840, Training Loss: 0.02106
Epoch: 5840, Training Loss: 0.01918
Epoch: 5840, Training Loss: 0.02413
Epoch: 5840, Training Loss: 0.01830
Epoch: 5841, Training Loss: 0.02105
Epoch: 5841, Training Loss: 0.01918
Epoch: 5841, Training Loss: 0.02412
Epoch: 5841, Training Loss: 0.01830
Epoch: 5842, Training Loss: 0.02105
Epoch: 5842, Training Loss: 0.01918
Epoch: 5842, Training Loss: 0.02412
Epoch: 5842, Training Loss: 0.01830
Epoch: 5843, Training Loss: 0.02105
Epoch: 5843, Training Loss: 0.01918
Epoch: 5843, Training Loss: 0.02412
Epoch: 5843, Training Loss: 0.01830
Epoch: 5844, Training Loss: 0.02105
Epoch: 5844, Training Loss: 0.01918
Epoch: 5844, Training Loss: 0.02412
Epoch: 5844, Training Loss: 0.01830
Epoch: 5845, Training Loss: 0.02104
Epoch: 5845, Training Loss: 0.01917
Epoch: 5845, Training Loss: 0.02411
Epoch: 5845, Training Loss: 0.01830
Epoch: 5846, Training Loss: 0.02104
Epoch: 5846, Training Loss: 0.01917
Epoch: 5846, Training Loss: 0.02411
Epoch: 5846, Training Loss: 0.01829
Epoch: 5847, Training Loss: 0.02104
Epoch: 5847, Training Loss: 0.01917
Epoch: 5847, Training Loss: 0.02411
Epoch: 5847, Training Loss: 0.01829
Epoch: 5848, Training Loss: 0.02104
Epoch: 5848, Training Loss: 0.01917
Epoch: 5848, Training Loss: 0.02411
Epoch: 5848, Training Loss: 0.01829
Epoch: 5849, Training Loss: 0.02104
Epoch: 5849, Training Loss: 0.01917
Epoch: 5849, Training Loss: 0.02410
Epoch: 5849, Training Loss: 0.01829
Epoch: 5850, Training Loss: 0.02103
Epoch: 5850, Training Loss: 0.01916
Epoch: 5850, Training Loss: 0.02410
Epoch: 5850, Training Loss: 0.01829
Epoch: 5851, Training Loss: 0.02103
Epoch: 5851, Training Loss: 0.01916
Epoch: 5851, Training Loss: 0.02410
Epoch: 5851, Training Loss: 0.01828
Epoch: 5852, Training Loss: 0.02103
Epoch: 5852, Training Loss: 0.01916
Epoch: 5852, Training Loss: 0.02410
Epoch: 5852, Training Loss: 0.01828
Epoch: 5853, Training Loss: 0.02103
Epoch: 5853, Training Loss: 0.01916
Epoch: 5853, Training Loss: 0.02409
Epoch: 5853, Training Loss: 0.01828
Epoch: 5854, Training Loss: 0.02103
Epoch: 5854, Training Loss: 0.01916
Epoch: 5854, Training Loss: 0.02409
Epoch: 5854, Training Loss: 0.01828
Epoch: 5855, Training Loss: 0.02102
Epoch: 5855, Training Loss: 0.01915
Epoch: 5855, Training Loss: 0.02409
Epoch: 5855, Training Loss: 0.01828
Epoch: 5856, Training Loss: 0.02102
Epoch: 5856, Training Loss: 0.01915
Epoch: 5856, Training Loss: 0.02409
Epoch: 5856, Training Loss: 0.01828
Epoch: 5857, Training Loss: 0.02102
Epoch: 5857, Training Loss: 0.01915
Epoch: 5857, Training Loss: 0.02408
Epoch: 5857, Training Loss: 0.01827
Epoch: 5858, Training Loss: 0.02102
Epoch: 5858, Training Loss: 0.01915
Epoch: 5858, Training Loss: 0.02408
Epoch: 5858, Training Loss: 0.01827
Epoch: 5859, Training Loss: 0.02101
Epoch: 5859, Training Loss: 0.01915
Epoch: 5859, Training Loss: 0.02408
Epoch: 5859, Training Loss: 0.01827
Epoch: 5860, Training Loss: 0.02101
Epoch: 5860, Training Loss: 0.01915
Epoch: 5860, Training Loss: 0.02408
Epoch: 5860, Training Loss: 0.01827
Epoch: 5861, Training Loss: 0.02101
Epoch: 5861, Training Loss: 0.01914
Epoch: 5861, Training Loss: 0.02407
Epoch: 5861, Training Loss: 0.01827
Epoch: 5862, Training Loss: 0.02101
Epoch: 5862, Training Loss: 0.01914
Epoch: 5862, Training Loss: 0.02407
Epoch: 5862, Training Loss: 0.01826
Epoch: 5863, Training Loss: 0.02101
Epoch: 5863, Training Loss: 0.01914
Epoch: 5863, Training Loss: 0.02407
Epoch: 5863, Training Loss: 0.01826
Epoch: 5864, Training Loss: 0.02100
Epoch: 5864, Training Loss: 0.01914
Epoch: 5864, Training Loss: 0.02407
Epoch: 5864, Training Loss: 0.01826
Epoch: 5865, Training Loss: 0.02100
Epoch: 5865, Training Loss: 0.01914
Epoch: 5865, Training Loss: 0.02406
Epoch: 5865, Training Loss: 0.01826
Epoch: 5866, Training Loss: 0.02100
Epoch: 5866, Training Loss: 0.01913
Epoch: 5866, Training Loss: 0.02406
Epoch: 5866, Training Loss: 0.01826
Epoch: 5867, Training Loss: 0.02100
Epoch: 5867, Training Loss: 0.01913
Epoch: 5867, Training Loss: 0.02406
Epoch: 5867, Training Loss: 0.01825
Epoch: 5868, Training Loss: 0.02099
Epoch: 5868, Training Loss: 0.01913
Epoch: 5868, Training Loss: 0.02406
Epoch: 5868, Training Loss: 0.01825
Epoch: 5869, Training Loss: 0.02099
Epoch: 5869, Training Loss: 0.01913
Epoch: 5869, Training Loss: 0.02405
Epoch: 5869, Training Loss: 0.01825
Epoch: 5870, Training Loss: 0.02099
Epoch: 5870, Training Loss: 0.01913
Epoch: 5870, Training Loss: 0.02405
Epoch: 5870, Training Loss: 0.01825
Epoch: 5871, Training Loss: 0.02099
Epoch: 5871, Training Loss: 0.01912
Epoch: 5871, Training Loss: 0.02405
Epoch: 5871, Training Loss: 0.01825
Epoch: 5872, Training Loss: 0.02099
Epoch: 5872, Training Loss: 0.01912
Epoch: 5872, Training Loss: 0.02405
Epoch: 5872, Training Loss: 0.01825
Epoch: 5873, Training Loss: 0.02098
Epoch: 5873, Training Loss: 0.01912
Epoch: 5873, Training Loss: 0.02404
Epoch: 5873, Training Loss: 0.01824
Epoch: 5874, Training Loss: 0.02098
Epoch: 5874, Training Loss: 0.01912
Epoch: 5874, Training Loss: 0.02404
Epoch: 5874, Training Loss: 0.01824
Epoch: 5875, Training Loss: 0.02098
Epoch: 5875, Training Loss: 0.01912
Epoch: 5875, Training Loss: 0.02404
Epoch: 5875, Training Loss: 0.01824
Epoch: 5876, Training Loss: 0.02098
Epoch: 5876, Training Loss: 0.01911
Epoch: 5876, Training Loss: 0.02404
Epoch: 5876, Training Loss: 0.01824
Epoch: 5877, Training Loss: 0.02098
Epoch: 5877, Training Loss: 0.01911
Epoch: 5877, Training Loss: 0.02403
Epoch: 5877, Training Loss: 0.01824
Epoch: 5878, Training Loss: 0.02097
Epoch: 5878, Training Loss: 0.01911
Epoch: 5878, Training Loss: 0.02403
Epoch: 5878, Training Loss: 0.01823
Epoch: 5879, Training Loss: 0.02097
Epoch: 5879, Training Loss: 0.01911
Epoch: 5879, Training Loss: 0.02403
Epoch: 5879, Training Loss: 0.01823
Epoch: 5880, Training Loss: 0.02097
Epoch: 5880, Training Loss: 0.01911
Epoch: 5880, Training Loss: 0.02403
Epoch: 5880, Training Loss: 0.01823
Epoch: 5881, Training Loss: 0.02097
Epoch: 5881, Training Loss: 0.01911
Epoch: 5881, Training Loss: 0.02402
Epoch: 5881, Training Loss: 0.01823
Epoch: 5882, Training Loss: 0.02096
Epoch: 5882, Training Loss: 0.01910
Epoch: 5882, Training Loss: 0.02402
Epoch: 5882, Training Loss: 0.01823
Epoch: 5883, Training Loss: 0.02096
Epoch: 5883, Training Loss: 0.01910
Epoch: 5883, Training Loss: 0.02402
Epoch: 5883, Training Loss: 0.01823
Epoch: 5884, Training Loss: 0.02096
Epoch: 5884, Training Loss: 0.01910
Epoch: 5884, Training Loss: 0.02402
Epoch: 5884, Training Loss: 0.01822
Epoch: 5885, Training Loss: 0.02096
Epoch: 5885, Training Loss: 0.01910
Epoch: 5885, Training Loss: 0.02401
Epoch: 5885, Training Loss: 0.01822
Epoch: 5886, Training Loss: 0.02096
Epoch: 5886, Training Loss: 0.01910
Epoch: 5886, Training Loss: 0.02401
Epoch: 5886, Training Loss: 0.01822
Epoch: 5887, Training Loss: 0.02095
Epoch: 5887, Training Loss: 0.01909
Epoch: 5887, Training Loss: 0.02401
Epoch: 5887, Training Loss: 0.01822
Epoch: 5888, Training Loss: 0.02095
Epoch: 5888, Training Loss: 0.01909
Epoch: 5888, Training Loss: 0.02401
Epoch: 5888, Training Loss: 0.01822
Epoch: 5889, Training Loss: 0.02095
Epoch: 5889, Training Loss: 0.01909
Epoch: 5889, Training Loss: 0.02400
Epoch: 5889, Training Loss: 0.01821
Epoch: 5890, Training Loss: 0.02095
Epoch: 5890, Training Loss: 0.01909
Epoch: 5890, Training Loss: 0.02400
Epoch: 5890, Training Loss: 0.01821
Epoch: 5891, Training Loss: 0.02095
Epoch: 5891, Training Loss: 0.01909
Epoch: 5891, Training Loss: 0.02400
Epoch: 5891, Training Loss: 0.01821
Epoch: 5892, Training Loss: 0.02094
Epoch: 5892, Training Loss: 0.01908
Epoch: 5892, Training Loss: 0.02400
Epoch: 5892, Training Loss: 0.01821
Epoch: 5893, Training Loss: 0.02094
Epoch: 5893, Training Loss: 0.01908
Epoch: 5893, Training Loss: 0.02399
Epoch: 5893, Training Loss: 0.01821
Epoch: 5894, Training Loss: 0.02094
Epoch: 5894, Training Loss: 0.01908
Epoch: 5894, Training Loss: 0.02399
Epoch: 5894, Training Loss: 0.01821
Epoch: 5895, Training Loss: 0.02094
Epoch: 5895, Training Loss: 0.01908
Epoch: 5895, Training Loss: 0.02399
Epoch: 5895, Training Loss: 0.01820
Epoch: 5896, Training Loss: 0.02093
Epoch: 5896, Training Loss: 0.01908
Epoch: 5896, Training Loss: 0.02399
Epoch: 5896, Training Loss: 0.01820
Epoch: 5897, Training Loss: 0.02093
Epoch: 5897, Training Loss: 0.01908
Epoch: 5897, Training Loss: 0.02398
Epoch: 5897, Training Loss: 0.01820
Epoch: 5898, Training Loss: 0.02093
Epoch: 5898, Training Loss: 0.01907
Epoch: 5898, Training Loss: 0.02398
Epoch: 5898, Training Loss: 0.01820
Epoch: 5899, Training Loss: 0.02093
Epoch: 5899, Training Loss: 0.01907
Epoch: 5899, Training Loss: 0.02398
Epoch: 5899, Training Loss: 0.01820
Epoch: 5900, Training Loss: 0.02093
Epoch: 5900, Training Loss: 0.01907
Epoch: 5900, Training Loss: 0.02398
Epoch: 5900, Training Loss: 0.01819
Epoch: 5901, Training Loss: 0.02092
Epoch: 5901, Training Loss: 0.01907
Epoch: 5901, Training Loss: 0.02397
Epoch: 5901, Training Loss: 0.01819
Epoch: 5902, Training Loss: 0.02092
Epoch: 5902, Training Loss: 0.01907
Epoch: 5902, Training Loss: 0.02397
Epoch: 5902, Training Loss: 0.01819
Epoch: 5903, Training Loss: 0.02092
Epoch: 5903, Training Loss: 0.01906
Epoch: 5903, Training Loss: 0.02397
Epoch: 5903, Training Loss: 0.01819
Epoch: 5904, Training Loss: 0.02092
Epoch: 5904, Training Loss: 0.01906
Epoch: 5904, Training Loss: 0.02397
Epoch: 5904, Training Loss: 0.01819
Epoch: 5905, Training Loss: 0.02092
Epoch: 5905, Training Loss: 0.01906
Epoch: 5905, Training Loss: 0.02396
Epoch: 5905, Training Loss: 0.01819
Epoch: 5906, Training Loss: 0.02091
Epoch: 5906, Training Loss: 0.01906
Epoch: 5906, Training Loss: 0.02396
Epoch: 5906, Training Loss: 0.01818
Epoch: 5907, Training Loss: 0.02091
Epoch: 5907, Training Loss: 0.01906
Epoch: 5907, Training Loss: 0.02396
Epoch: 5907, Training Loss: 0.01818
Epoch: 5908, Training Loss: 0.02091
Epoch: 5908, Training Loss: 0.01905
Epoch: 5908, Training Loss: 0.02396
Epoch: 5908, Training Loss: 0.01818
Epoch: 5909, Training Loss: 0.02091
Epoch: 5909, Training Loss: 0.01905
Epoch: 5909, Training Loss: 0.02395
Epoch: 5909, Training Loss: 0.01818
Epoch: 5910, Training Loss: 0.02090
Epoch: 5910, Training Loss: 0.01905
Epoch: 5910, Training Loss: 0.02395
Epoch: 5910, Training Loss: 0.01818
Epoch: 5911, Training Loss: 0.02090
Epoch: 5911, Training Loss: 0.01905
Epoch: 5911, Training Loss: 0.02395
Epoch: 5911, Training Loss: 0.01817
Epoch: 5912, Training Loss: 0.02090
Epoch: 5912, Training Loss: 0.01905
Epoch: 5912, Training Loss: 0.02395
Epoch: 5912, Training Loss: 0.01817
Epoch: 5913, Training Loss: 0.02090
Epoch: 5913, Training Loss: 0.01905
Epoch: 5913, Training Loss: 0.02394
Epoch: 5913, Training Loss: 0.01817
Epoch: 5914, Training Loss: 0.02090
Epoch: 5914, Training Loss: 0.01904
Epoch: 5914, Training Loss: 0.02394
Epoch: 5914, Training Loss: 0.01817
Epoch: 5915, Training Loss: 0.02089
Epoch: 5915, Training Loss: 0.01904
Epoch: 5915, Training Loss: 0.02394
Epoch: 5915, Training Loss: 0.01817
Epoch: 5916, Training Loss: 0.02089
Epoch: 5916, Training Loss: 0.01904
Epoch: 5916, Training Loss: 0.02394
Epoch: 5916, Training Loss: 0.01817
Epoch: 5917, Training Loss: 0.02089
Epoch: 5917, Training Loss: 0.01904
Epoch: 5917, Training Loss: 0.02393
Epoch: 5917, Training Loss: 0.01816
Epoch: 5918, Training Loss: 0.02089
Epoch: 5918, Training Loss: 0.01904
Epoch: 5918, Training Loss: 0.02393
Epoch: 5918, Training Loss: 0.01816
Epoch: 5919, Training Loss: 0.02089
Epoch: 5919, Training Loss: 0.01903
Epoch: 5919, Training Loss: 0.02393
Epoch: 5919, Training Loss: 0.01816
Epoch: 5920, Training Loss: 0.02088
Epoch: 5920, Training Loss: 0.01903
Epoch: 5920, Training Loss: 0.02393
Epoch: 5920, Training Loss: 0.01816
Epoch: 5921, Training Loss: 0.02088
Epoch: 5921, Training Loss: 0.01903
Epoch: 5921, Training Loss: 0.02392
Epoch: 5921, Training Loss: 0.01816
Epoch: 5922, Training Loss: 0.02088
Epoch: 5922, Training Loss: 0.01903
Epoch: 5922, Training Loss: 0.02392
Epoch: 5922, Training Loss: 0.01815
Epoch: 5923, Training Loss: 0.02088
Epoch: 5923, Training Loss: 0.01903
Epoch: 5923, Training Loss: 0.02392
Epoch: 5923, Training Loss: 0.01815
Epoch: 5924, Training Loss: 0.02087
Epoch: 5924, Training Loss: 0.01903
Epoch: 5924, Training Loss: 0.02392
Epoch: 5924, Training Loss: 0.01815
Epoch: 5925, Training Loss: 0.02087
Epoch: 5925, Training Loss: 0.01902
Epoch: 5925, Training Loss: 0.02391
Epoch: 5925, Training Loss: 0.01815
Epoch: 5926, Training Loss: 0.02087
Epoch: 5926, Training Loss: 0.01902
Epoch: 5926, Training Loss: 0.02391
Epoch: 5926, Training Loss: 0.01815
Epoch: 5927, Training Loss: 0.02087
Epoch: 5927, Training Loss: 0.01902
Epoch: 5927, Training Loss: 0.02391
Epoch: 5927, Training Loss: 0.01815
Epoch: 5928, Training Loss: 0.02087
Epoch: 5928, Training Loss: 0.01902
Epoch: 5928, Training Loss: 0.02391
Epoch: 5928, Training Loss: 0.01814
Epoch: 5929, Training Loss: 0.02086
Epoch: 5929, Training Loss: 0.01902
Epoch: 5929, Training Loss: 0.02390
Epoch: 5929, Training Loss: 0.01814
Epoch: 5930, Training Loss: 0.02086
Epoch: 5930, Training Loss: 0.01901
Epoch: 5930, Training Loss: 0.02390
Epoch: 5930, Training Loss: 0.01814
Epoch: 5931, Training Loss: 0.02086
Epoch: 5931, Training Loss: 0.01901
Epoch: 5931, Training Loss: 0.02390
Epoch: 5931, Training Loss: 0.01814
Epoch: 5932, Training Loss: 0.02086
Epoch: 5932, Training Loss: 0.01901
Epoch: 5932, Training Loss: 0.02390
Epoch: 5932, Training Loss: 0.01814
Epoch: 5933, Training Loss: 0.02086
Epoch: 5933, Training Loss: 0.01901
Epoch: 5933, Training Loss: 0.02389
Epoch: 5933, Training Loss: 0.01813
Epoch: 5934, Training Loss: 0.02085
Epoch: 5934, Training Loss: 0.01901
Epoch: 5934, Training Loss: 0.02389
Epoch: 5934, Training Loss: 0.01813
Epoch: 5935, Training Loss: 0.02085
Epoch: 5935, Training Loss: 0.01900
Epoch: 5935, Training Loss: 0.02389
Epoch: 5935, Training Loss: 0.01813
Epoch: 5936, Training Loss: 0.02085
Epoch: 5936, Training Loss: 0.01900
Epoch: 5936, Training Loss: 0.02389
Epoch: 5936, Training Loss: 0.01813
Epoch: 5937, Training Loss: 0.02085
Epoch: 5937, Training Loss: 0.01900
Epoch: 5937, Training Loss: 0.02388
Epoch: 5937, Training Loss: 0.01813
Epoch: 5938, Training Loss: 0.02084
Epoch: 5938, Training Loss: 0.01900
Epoch: 5938, Training Loss: 0.02388
Epoch: 5938, Training Loss: 0.01813
Epoch: 5939, Training Loss: 0.02084
Epoch: 5939, Training Loss: 0.01900
Epoch: 5939, Training Loss: 0.02388
Epoch: 5939, Training Loss: 0.01812
Epoch: 5940, Training Loss: 0.02084
Epoch: 5940, Training Loss: 0.01900
Epoch: 5940, Training Loss: 0.02388
Epoch: 5940, Training Loss: 0.01812
Epoch: 5941, Training Loss: 0.02084
Epoch: 5941, Training Loss: 0.01899
Epoch: 5941, Training Loss: 0.02387
Epoch: 5941, Training Loss: 0.01812
Epoch: 5942, Training Loss: 0.02084
Epoch: 5942, Training Loss: 0.01899
Epoch: 5942, Training Loss: 0.02387
Epoch: 5942, Training Loss: 0.01812
Epoch: 5943, Training Loss: 0.02083
Epoch: 5943, Training Loss: 0.01899
Epoch: 5943, Training Loss: 0.02387
Epoch: 5943, Training Loss: 0.01812
Epoch: 5944, Training Loss: 0.02083
Epoch: 5944, Training Loss: 0.01899
Epoch: 5944, Training Loss: 0.02387
Epoch: 5944, Training Loss: 0.01811
Epoch: 5945, Training Loss: 0.02083
Epoch: 5945, Training Loss: 0.01899
Epoch: 5945, Training Loss: 0.02386
Epoch: 5945, Training Loss: 0.01811
Epoch: 5946, Training Loss: 0.02083
Epoch: 5946, Training Loss: 0.01898
Epoch: 5946, Training Loss: 0.02386
Epoch: 5946, Training Loss: 0.01811
Epoch: 5947, Training Loss: 0.02083
Epoch: 5947, Training Loss: 0.01898
Epoch: 5947, Training Loss: 0.02386
Epoch: 5947, Training Loss: 0.01811
Epoch: 5948, Training Loss: 0.02082
Epoch: 5948, Training Loss: 0.01898
Epoch: 5948, Training Loss: 0.02386
Epoch: 5948, Training Loss: 0.01811
Epoch: 5949, Training Loss: 0.02082
Epoch: 5949, Training Loss: 0.01898
Epoch: 5949, Training Loss: 0.02385
Epoch: 5949, Training Loss: 0.01811
Epoch: 5950, Training Loss: 0.02082
Epoch: 5950, Training Loss: 0.01898
Epoch: 5950, Training Loss: 0.02385
Epoch: 5950, Training Loss: 0.01810
Epoch: 5951, Training Loss: 0.02082
Epoch: 5951, Training Loss: 0.01898
Epoch: 5951, Training Loss: 0.02385
Epoch: 5951, Training Loss: 0.01810
Epoch: 5952, Training Loss: 0.02082
Epoch: 5952, Training Loss: 0.01897
Epoch: 5952, Training Loss: 0.02385
Epoch: 5952, Training Loss: 0.01810
Epoch: 5953, Training Loss: 0.02081
Epoch: 5953, Training Loss: 0.01897
Epoch: 5953, Training Loss: 0.02385
Epoch: 5953, Training Loss: 0.01810
Epoch: 5954, Training Loss: 0.02081
Epoch: 5954, Training Loss: 0.01897
Epoch: 5954, Training Loss: 0.02384
Epoch: 5954, Training Loss: 0.01810
Epoch: 5955, Training Loss: 0.02081
Epoch: 5955, Training Loss: 0.01897
Epoch: 5955, Training Loss: 0.02384
Epoch: 5955, Training Loss: 0.01810
Epoch: 5956, Training Loss: 0.02081
Epoch: 5956, Training Loss: 0.01897
Epoch: 5956, Training Loss: 0.02384
Epoch: 5956, Training Loss: 0.01809
Epoch: 5957, Training Loss: 0.02080
Epoch: 5957, Training Loss: 0.01896
Epoch: 5957, Training Loss: 0.02384
Epoch: 5957, Training Loss: 0.01809
Epoch: 5958, Training Loss: 0.02080
Epoch: 5958, Training Loss: 0.01896
Epoch: 5958, Training Loss: 0.02383
Epoch: 5958, Training Loss: 0.01809
Epoch: 5959, Training Loss: 0.02080
Epoch: 5959, Training Loss: 0.01896
Epoch: 5959, Training Loss: 0.02383
Epoch: 5959, Training Loss: 0.01809
Epoch: 5960, Training Loss: 0.02080
Epoch: 5960, Training Loss: 0.01896
Epoch: 5960, Training Loss: 0.02383
Epoch: 5960, Training Loss: 0.01809
Epoch: 5961, Training Loss: 0.02080
Epoch: 5961, Training Loss: 0.01896
Epoch: 5961, Training Loss: 0.02383
Epoch: 5961, Training Loss: 0.01808
Epoch: 5962, Training Loss: 0.02079
Epoch: 5962, Training Loss: 0.01895
Epoch: 5962, Training Loss: 0.02382
Epoch: 5962, Training Loss: 0.01808
Epoch: 5963, Training Loss: 0.02079
Epoch: 5963, Training Loss: 0.01895
Epoch: 5963, Training Loss: 0.02382
Epoch: 5963, Training Loss: 0.01808
Epoch: 5964, Training Loss: 0.02079
Epoch: 5964, Training Loss: 0.01895
Epoch: 5964, Training Loss: 0.02382
Epoch: 5964, Training Loss: 0.01808
Epoch: 5965, Training Loss: 0.02079
Epoch: 5965, Training Loss: 0.01895
Epoch: 5965, Training Loss: 0.02382
Epoch: 5965, Training Loss: 0.01808
Epoch: 5966, Training Loss: 0.02079
Epoch: 5966, Training Loss: 0.01895
Epoch: 5966, Training Loss: 0.02381
Epoch: 5966, Training Loss: 0.01808
Epoch: 5967, Training Loss: 0.02078
Epoch: 5967, Training Loss: 0.01895
Epoch: 5967, Training Loss: 0.02381
Epoch: 5967, Training Loss: 0.01807
Epoch: 5968, Training Loss: 0.02078
Epoch: 5968, Training Loss: 0.01894
Epoch: 5968, Training Loss: 0.02381
Epoch: 5968, Training Loss: 0.01807
Epoch: 5969, Training Loss: 0.02078
Epoch: 5969, Training Loss: 0.01894
Epoch: 5969, Training Loss: 0.02381
Epoch: 5969, Training Loss: 0.01807
Epoch: 5970, Training Loss: 0.02078
Epoch: 5970, Training Loss: 0.01894
Epoch: 5970, Training Loss: 0.02380
Epoch: 5970, Training Loss: 0.01807
Epoch: 5971, Training Loss: 0.02078
Epoch: 5971, Training Loss: 0.01894
Epoch: 5971, Training Loss: 0.02380
Epoch: 5971, Training Loss: 0.01807
Epoch: 5972, Training Loss: 0.02077
Epoch: 5972, Training Loss: 0.01894
Epoch: 5972, Training Loss: 0.02380
Epoch: 5972, Training Loss: 0.01806
Epoch: 5973, Training Loss: 0.02077
Epoch: 5973, Training Loss: 0.01893
Epoch: 5973, Training Loss: 0.02380
Epoch: 5973, Training Loss: 0.01806
Epoch: 5974, Training Loss: 0.02077
Epoch: 5974, Training Loss: 0.01893
Epoch: 5974, Training Loss: 0.02379
Epoch: 5974, Training Loss: 0.01806
Epoch: 5975, Training Loss: 0.02077
Epoch: 5975, Training Loss: 0.01893
Epoch: 5975, Training Loss: 0.02379
Epoch: 5975, Training Loss: 0.01806
Epoch: 5976, Training Loss: 0.02076
Epoch: 5976, Training Loss: 0.01893
Epoch: 5976, Training Loss: 0.02379
Epoch: 5976, Training Loss: 0.01806
Epoch: 5977, Training Loss: 0.02076
Epoch: 5977, Training Loss: 0.01893
Epoch: 5977, Training Loss: 0.02379
Epoch: 5977, Training Loss: 0.01806
Epoch: 5978, Training Loss: 0.02076
Epoch: 5978, Training Loss: 0.01893
Epoch: 5978, Training Loss: 0.02378
Epoch: 5978, Training Loss: 0.01805
Epoch: 5979, Training Loss: 0.02076
Epoch: 5979, Training Loss: 0.01892
Epoch: 5979, Training Loss: 0.02378
Epoch: 5979, Training Loss: 0.01805
Epoch: 5980, Training Loss: 0.02076
Epoch: 5980, Training Loss: 0.01892
Epoch: 5980, Training Loss: 0.02378
Epoch: 5980, Training Loss: 0.01805
Epoch: 5981, Training Loss: 0.02075
Epoch: 5981, Training Loss: 0.01892
Epoch: 5981, Training Loss: 0.02378
Epoch: 5981, Training Loss: 0.01805
Epoch: 5982, Training Loss: 0.02075
Epoch: 5982, Training Loss: 0.01892
Epoch: 5982, Training Loss: 0.02377
Epoch: 5982, Training Loss: 0.01805
Epoch: 5983, Training Loss: 0.02075
Epoch: 5983, Training Loss: 0.01892
Epoch: 5983, Training Loss: 0.02377
Epoch: 5983, Training Loss: 0.01805
Epoch: 5984, Training Loss: 0.02075
Epoch: 5984, Training Loss: 0.01891
Epoch: 5984, Training Loss: 0.02377
Epoch: 5984, Training Loss: 0.01804
Epoch: 5985, Training Loss: 0.02075
Epoch: 5985, Training Loss: 0.01891
Epoch: 5985, Training Loss: 0.02377
Epoch: 5985, Training Loss: 0.01804
Epoch: 5986, Training Loss: 0.02074
Epoch: 5986, Training Loss: 0.01891
Epoch: 5986, Training Loss: 0.02376
Epoch: 5986, Training Loss: 0.01804
Epoch: 5987, Training Loss: 0.02074
Epoch: 5987, Training Loss: 0.01891
Epoch: 5987, Training Loss: 0.02376
Epoch: 5987, Training Loss: 0.01804
... (per-sample loss lines for epochs 5988-6236 omitted; each of the four losses decreases slowly and monotonically, e.g. from 0.02074/0.01891/0.02376/0.01804 at epoch 5988 to 0.02025/0.01847/0.02318/0.01761 at epoch 6236) ...
Epoch: 6237, Training Loss: 0.02024
Epoch: 6237, Training Loss: 0.01847
Epoch: 6237, Training Loss: 0.02318
Epoch: 6237, Training Loss: 0.01761
Epoch: 6238, Training Loss: 0.02024
Epoch: 6238, Training Loss: 0.01847
Epoch: 6238, Training Loss: 0.02318
Epoch: 6238, Training Loss: 0.01761
Epoch: 6239, Training Loss: 0.02023
Epoch: 6239, Training Loss: 0.01846
Epoch: 6239, Training Loss: 0.02317
Epoch: 6239, Training Loss: 0.01761
Epoch: 6240, Training Loss: 0.02023
Epoch: 6240, Training Loss: 0.01846
Epoch: 6240, Training Loss: 0.02317
Epoch: 6240, Training Loss: 0.01760
Epoch: 6241, Training Loss: 0.02023
Epoch: 6241, Training Loss: 0.01846
Epoch: 6241, Training Loss: 0.02317
Epoch: 6241, Training Loss: 0.01760
Epoch: 6242, Training Loss: 0.02023
Epoch: 6242, Training Loss: 0.01846
Epoch: 6242, Training Loss: 0.02317
Epoch: 6242, Training Loss: 0.01760
Epoch: 6243, Training Loss: 0.02023
Epoch: 6243, Training Loss: 0.01846
Epoch: 6243, Training Loss: 0.02316
Epoch: 6243, Training Loss: 0.01760
Epoch: 6244, Training Loss: 0.02022
Epoch: 6244, Training Loss: 0.01846
Epoch: 6244, Training Loss: 0.02316
Epoch: 6244, Training Loss: 0.01760
Epoch: 6245, Training Loss: 0.02022
Epoch: 6245, Training Loss: 0.01845
Epoch: 6245, Training Loss: 0.02316
Epoch: 6245, Training Loss: 0.01760
Epoch: 6246, Training Loss: 0.02022
Epoch: 6246, Training Loss: 0.01845
Epoch: 6246, Training Loss: 0.02316
Epoch: 6246, Training Loss: 0.01759
Epoch: 6247, Training Loss: 0.02022
Epoch: 6247, Training Loss: 0.01845
Epoch: 6247, Training Loss: 0.02315
Epoch: 6247, Training Loss: 0.01759
Epoch: 6248, Training Loss: 0.02022
Epoch: 6248, Training Loss: 0.01845
Epoch: 6248, Training Loss: 0.02315
Epoch: 6248, Training Loss: 0.01759
Epoch: 6249, Training Loss: 0.02021
Epoch: 6249, Training Loss: 0.01845
Epoch: 6249, Training Loss: 0.02315
Epoch: 6249, Training Loss: 0.01759
Epoch: 6250, Training Loss: 0.02021
Epoch: 6250, Training Loss: 0.01845
Epoch: 6250, Training Loss: 0.02315
Epoch: 6250, Training Loss: 0.01759
Epoch: 6251, Training Loss: 0.02021
Epoch: 6251, Training Loss: 0.01844
Epoch: 6251, Training Loss: 0.02315
Epoch: 6251, Training Loss: 0.01759
Epoch: 6252, Training Loss: 0.02021
Epoch: 6252, Training Loss: 0.01844
Epoch: 6252, Training Loss: 0.02314
Epoch: 6252, Training Loss: 0.01758
Epoch: 6253, Training Loss: 0.02021
Epoch: 6253, Training Loss: 0.01844
Epoch: 6253, Training Loss: 0.02314
Epoch: 6253, Training Loss: 0.01758
Epoch: 6254, Training Loss: 0.02020
Epoch: 6254, Training Loss: 0.01844
Epoch: 6254, Training Loss: 0.02314
Epoch: 6254, Training Loss: 0.01758
Epoch: 6255, Training Loss: 0.02020
Epoch: 6255, Training Loss: 0.01844
Epoch: 6255, Training Loss: 0.02314
Epoch: 6255, Training Loss: 0.01758
Epoch: 6256, Training Loss: 0.02020
Epoch: 6256, Training Loss: 0.01844
Epoch: 6256, Training Loss: 0.02313
Epoch: 6256, Training Loss: 0.01758
Epoch: 6257, Training Loss: 0.02020
Epoch: 6257, Training Loss: 0.01843
Epoch: 6257, Training Loss: 0.02313
Epoch: 6257, Training Loss: 0.01758
Epoch: 6258, Training Loss: 0.02020
Epoch: 6258, Training Loss: 0.01843
Epoch: 6258, Training Loss: 0.02313
Epoch: 6258, Training Loss: 0.01757
Epoch: 6259, Training Loss: 0.02019
Epoch: 6259, Training Loss: 0.01843
Epoch: 6259, Training Loss: 0.02313
Epoch: 6259, Training Loss: 0.01757
Epoch: 6260, Training Loss: 0.02019
Epoch: 6260, Training Loss: 0.01843
Epoch: 6260, Training Loss: 0.02313
Epoch: 6260, Training Loss: 0.01757
Epoch: 6261, Training Loss: 0.02019
Epoch: 6261, Training Loss: 0.01843
Epoch: 6261, Training Loss: 0.02312
Epoch: 6261, Training Loss: 0.01757
Epoch: 6262, Training Loss: 0.02019
Epoch: 6262, Training Loss: 0.01843
Epoch: 6262, Training Loss: 0.02312
Epoch: 6262, Training Loss: 0.01757
Epoch: 6263, Training Loss: 0.02019
Epoch: 6263, Training Loss: 0.01842
Epoch: 6263, Training Loss: 0.02312
Epoch: 6263, Training Loss: 0.01757
Epoch: 6264, Training Loss: 0.02019
Epoch: 6264, Training Loss: 0.01842
Epoch: 6264, Training Loss: 0.02312
Epoch: 6264, Training Loss: 0.01757
Epoch: 6265, Training Loss: 0.02018
Epoch: 6265, Training Loss: 0.01842
Epoch: 6265, Training Loss: 0.02311
Epoch: 6265, Training Loss: 0.01756
Epoch: 6266, Training Loss: 0.02018
Epoch: 6266, Training Loss: 0.01842
Epoch: 6266, Training Loss: 0.02311
Epoch: 6266, Training Loss: 0.01756
Epoch: 6267, Training Loss: 0.02018
Epoch: 6267, Training Loss: 0.01842
Epoch: 6267, Training Loss: 0.02311
Epoch: 6267, Training Loss: 0.01756
Epoch: 6268, Training Loss: 0.02018
Epoch: 6268, Training Loss: 0.01842
Epoch: 6268, Training Loss: 0.02311
Epoch: 6268, Training Loss: 0.01756
Epoch: 6269, Training Loss: 0.02018
Epoch: 6269, Training Loss: 0.01841
Epoch: 6269, Training Loss: 0.02311
Epoch: 6269, Training Loss: 0.01756
Epoch: 6270, Training Loss: 0.02017
Epoch: 6270, Training Loss: 0.01841
Epoch: 6270, Training Loss: 0.02310
Epoch: 6270, Training Loss: 0.01756
Epoch: 6271, Training Loss: 0.02017
Epoch: 6271, Training Loss: 0.01841
Epoch: 6271, Training Loss: 0.02310
Epoch: 6271, Training Loss: 0.01755
Epoch: 6272, Training Loss: 0.02017
Epoch: 6272, Training Loss: 0.01841
Epoch: 6272, Training Loss: 0.02310
Epoch: 6272, Training Loss: 0.01755
Epoch: 6273, Training Loss: 0.02017
Epoch: 6273, Training Loss: 0.01841
Epoch: 6273, Training Loss: 0.02310
Epoch: 6273, Training Loss: 0.01755
Epoch: 6274, Training Loss: 0.02017
Epoch: 6274, Training Loss: 0.01841
Epoch: 6274, Training Loss: 0.02309
Epoch: 6274, Training Loss: 0.01755
Epoch: 6275, Training Loss: 0.02016
Epoch: 6275, Training Loss: 0.01840
Epoch: 6275, Training Loss: 0.02309
Epoch: 6275, Training Loss: 0.01755
Epoch: 6276, Training Loss: 0.02016
Epoch: 6276, Training Loss: 0.01840
Epoch: 6276, Training Loss: 0.02309
Epoch: 6276, Training Loss: 0.01755
Epoch: 6277, Training Loss: 0.02016
Epoch: 6277, Training Loss: 0.01840
Epoch: 6277, Training Loss: 0.02309
Epoch: 6277, Training Loss: 0.01754
Epoch: 6278, Training Loss: 0.02016
Epoch: 6278, Training Loss: 0.01840
Epoch: 6278, Training Loss: 0.02309
Epoch: 6278, Training Loss: 0.01754
Epoch: 6279, Training Loss: 0.02016
Epoch: 6279, Training Loss: 0.01840
Epoch: 6279, Training Loss: 0.02308
Epoch: 6279, Training Loss: 0.01754
Epoch: 6280, Training Loss: 0.02015
Epoch: 6280, Training Loss: 0.01840
Epoch: 6280, Training Loss: 0.02308
Epoch: 6280, Training Loss: 0.01754
Epoch: 6281, Training Loss: 0.02015
Epoch: 6281, Training Loss: 0.01839
Epoch: 6281, Training Loss: 0.02308
Epoch: 6281, Training Loss: 0.01754
Epoch: 6282, Training Loss: 0.02015
Epoch: 6282, Training Loss: 0.01839
Epoch: 6282, Training Loss: 0.02308
Epoch: 6282, Training Loss: 0.01754
Epoch: 6283, Training Loss: 0.02015
Epoch: 6283, Training Loss: 0.01839
Epoch: 6283, Training Loss: 0.02307
Epoch: 6283, Training Loss: 0.01753
Epoch: 6284, Training Loss: 0.02015
Epoch: 6284, Training Loss: 0.01839
Epoch: 6284, Training Loss: 0.02307
Epoch: 6284, Training Loss: 0.01753
Epoch: 6285, Training Loss: 0.02014
Epoch: 6285, Training Loss: 0.01839
Epoch: 6285, Training Loss: 0.02307
Epoch: 6285, Training Loss: 0.01753
Epoch: 6286, Training Loss: 0.02014
Epoch: 6286, Training Loss: 0.01839
Epoch: 6286, Training Loss: 0.02307
Epoch: 6286, Training Loss: 0.01753
Epoch: 6287, Training Loss: 0.02014
Epoch: 6287, Training Loss: 0.01838
Epoch: 6287, Training Loss: 0.02307
Epoch: 6287, Training Loss: 0.01753
Epoch: 6288, Training Loss: 0.02014
Epoch: 6288, Training Loss: 0.01838
Epoch: 6288, Training Loss: 0.02306
Epoch: 6288, Training Loss: 0.01753
Epoch: 6289, Training Loss: 0.02014
Epoch: 6289, Training Loss: 0.01838
Epoch: 6289, Training Loss: 0.02306
Epoch: 6289, Training Loss: 0.01752
Epoch: 6290, Training Loss: 0.02014
Epoch: 6290, Training Loss: 0.01838
Epoch: 6290, Training Loss: 0.02306
Epoch: 6290, Training Loss: 0.01752
Epoch: 6291, Training Loss: 0.02013
Epoch: 6291, Training Loss: 0.01838
Epoch: 6291, Training Loss: 0.02306
Epoch: 6291, Training Loss: 0.01752
Epoch: 6292, Training Loss: 0.02013
Epoch: 6292, Training Loss: 0.01838
Epoch: 6292, Training Loss: 0.02305
Epoch: 6292, Training Loss: 0.01752
Epoch: 6293, Training Loss: 0.02013
Epoch: 6293, Training Loss: 0.01837
Epoch: 6293, Training Loss: 0.02305
Epoch: 6293, Training Loss: 0.01752
Epoch: 6294, Training Loss: 0.02013
Epoch: 6294, Training Loss: 0.01837
Epoch: 6294, Training Loss: 0.02305
Epoch: 6294, Training Loss: 0.01752
Epoch: 6295, Training Loss: 0.02013
Epoch: 6295, Training Loss: 0.01837
Epoch: 6295, Training Loss: 0.02305
Epoch: 6295, Training Loss: 0.01751
Epoch: 6296, Training Loss: 0.02012
Epoch: 6296, Training Loss: 0.01837
Epoch: 6296, Training Loss: 0.02305
Epoch: 6296, Training Loss: 0.01751
Epoch: 6297, Training Loss: 0.02012
Epoch: 6297, Training Loss: 0.01837
Epoch: 6297, Training Loss: 0.02304
Epoch: 6297, Training Loss: 0.01751
Epoch: 6298, Training Loss: 0.02012
Epoch: 6298, Training Loss: 0.01837
Epoch: 6298, Training Loss: 0.02304
Epoch: 6298, Training Loss: 0.01751
Epoch: 6299, Training Loss: 0.02012
Epoch: 6299, Training Loss: 0.01836
Epoch: 6299, Training Loss: 0.02304
Epoch: 6299, Training Loss: 0.01751
Epoch: 6300, Training Loss: 0.02012
Epoch: 6300, Training Loss: 0.01836
Epoch: 6300, Training Loss: 0.02304
Epoch: 6300, Training Loss: 0.01751
Epoch: 6301, Training Loss: 0.02011
Epoch: 6301, Training Loss: 0.01836
Epoch: 6301, Training Loss: 0.02303
Epoch: 6301, Training Loss: 0.01750
Epoch: 6302, Training Loss: 0.02011
Epoch: 6302, Training Loss: 0.01836
Epoch: 6302, Training Loss: 0.02303
Epoch: 6302, Training Loss: 0.01750
Epoch: 6303, Training Loss: 0.02011
Epoch: 6303, Training Loss: 0.01836
Epoch: 6303, Training Loss: 0.02303
Epoch: 6303, Training Loss: 0.01750
Epoch: 6304, Training Loss: 0.02011
Epoch: 6304, Training Loss: 0.01835
Epoch: 6304, Training Loss: 0.02303
Epoch: 6304, Training Loss: 0.01750
Epoch: 6305, Training Loss: 0.02011
Epoch: 6305, Training Loss: 0.01835
Epoch: 6305, Training Loss: 0.02303
Epoch: 6305, Training Loss: 0.01750
Epoch: 6306, Training Loss: 0.02010
Epoch: 6306, Training Loss: 0.01835
Epoch: 6306, Training Loss: 0.02302
Epoch: 6306, Training Loss: 0.01750
Epoch: 6307, Training Loss: 0.02010
Epoch: 6307, Training Loss: 0.01835
Epoch: 6307, Training Loss: 0.02302
Epoch: 6307, Training Loss: 0.01749
Epoch: 6308, Training Loss: 0.02010
Epoch: 6308, Training Loss: 0.01835
Epoch: 6308, Training Loss: 0.02302
Epoch: 6308, Training Loss: 0.01749
Epoch: 6309, Training Loss: 0.02010
Epoch: 6309, Training Loss: 0.01835
Epoch: 6309, Training Loss: 0.02302
Epoch: 6309, Training Loss: 0.01749
Epoch: 6310, Training Loss: 0.02010
Epoch: 6310, Training Loss: 0.01834
Epoch: 6310, Training Loss: 0.02301
Epoch: 6310, Training Loss: 0.01749
Epoch: 6311, Training Loss: 0.02010
Epoch: 6311, Training Loss: 0.01834
Epoch: 6311, Training Loss: 0.02301
Epoch: 6311, Training Loss: 0.01749
Epoch: 6312, Training Loss: 0.02009
Epoch: 6312, Training Loss: 0.01834
Epoch: 6312, Training Loss: 0.02301
Epoch: 6312, Training Loss: 0.01749
Epoch: 6313, Training Loss: 0.02009
Epoch: 6313, Training Loss: 0.01834
Epoch: 6313, Training Loss: 0.02301
Epoch: 6313, Training Loss: 0.01749
Epoch: 6314, Training Loss: 0.02009
Epoch: 6314, Training Loss: 0.01834
Epoch: 6314, Training Loss: 0.02301
Epoch: 6314, Training Loss: 0.01748
Epoch: 6315, Training Loss: 0.02009
Epoch: 6315, Training Loss: 0.01834
Epoch: 6315, Training Loss: 0.02300
Epoch: 6315, Training Loss: 0.01748
Epoch: 6316, Training Loss: 0.02009
Epoch: 6316, Training Loss: 0.01833
Epoch: 6316, Training Loss: 0.02300
Epoch: 6316, Training Loss: 0.01748
Epoch: 6317, Training Loss: 0.02008
Epoch: 6317, Training Loss: 0.01833
Epoch: 6317, Training Loss: 0.02300
Epoch: 6317, Training Loss: 0.01748
Epoch: 6318, Training Loss: 0.02008
Epoch: 6318, Training Loss: 0.01833
Epoch: 6318, Training Loss: 0.02300
Epoch: 6318, Training Loss: 0.01748
Epoch: 6319, Training Loss: 0.02008
Epoch: 6319, Training Loss: 0.01833
Epoch: 6319, Training Loss: 0.02299
Epoch: 6319, Training Loss: 0.01748
Epoch: 6320, Training Loss: 0.02008
Epoch: 6320, Training Loss: 0.01833
Epoch: 6320, Training Loss: 0.02299
Epoch: 6320, Training Loss: 0.01747
Epoch: 6321, Training Loss: 0.02008
Epoch: 6321, Training Loss: 0.01833
Epoch: 6321, Training Loss: 0.02299
Epoch: 6321, Training Loss: 0.01747
Epoch: 6322, Training Loss: 0.02007
Epoch: 6322, Training Loss: 0.01832
Epoch: 6322, Training Loss: 0.02299
Epoch: 6322, Training Loss: 0.01747
Epoch: 6323, Training Loss: 0.02007
Epoch: 6323, Training Loss: 0.01832
Epoch: 6323, Training Loss: 0.02299
Epoch: 6323, Training Loss: 0.01747
Epoch: 6324, Training Loss: 0.02007
Epoch: 6324, Training Loss: 0.01832
Epoch: 6324, Training Loss: 0.02298
Epoch: 6324, Training Loss: 0.01747
Epoch: 6325, Training Loss: 0.02007
Epoch: 6325, Training Loss: 0.01832
Epoch: 6325, Training Loss: 0.02298
Epoch: 6325, Training Loss: 0.01747
Epoch: 6326, Training Loss: 0.02007
Epoch: 6326, Training Loss: 0.01832
Epoch: 6326, Training Loss: 0.02298
Epoch: 6326, Training Loss: 0.01746
Epoch: 6327, Training Loss: 0.02006
Epoch: 6327, Training Loss: 0.01832
Epoch: 6327, Training Loss: 0.02298
Epoch: 6327, Training Loss: 0.01746
Epoch: 6328, Training Loss: 0.02006
Epoch: 6328, Training Loss: 0.01831
Epoch: 6328, Training Loss: 0.02297
Epoch: 6328, Training Loss: 0.01746
Epoch: 6329, Training Loss: 0.02006
Epoch: 6329, Training Loss: 0.01831
Epoch: 6329, Training Loss: 0.02297
Epoch: 6329, Training Loss: 0.01746
Epoch: 6330, Training Loss: 0.02006
Epoch: 6330, Training Loss: 0.01831
Epoch: 6330, Training Loss: 0.02297
Epoch: 6330, Training Loss: 0.01746
Epoch: 6331, Training Loss: 0.02006
Epoch: 6331, Training Loss: 0.01831
Epoch: 6331, Training Loss: 0.02297
Epoch: 6331, Training Loss: 0.01746
Epoch: 6332, Training Loss: 0.02006
Epoch: 6332, Training Loss: 0.01831
Epoch: 6332, Training Loss: 0.02297
Epoch: 6332, Training Loss: 0.01745
Epoch: 6333, Training Loss: 0.02005
Epoch: 6333, Training Loss: 0.01831
Epoch: 6333, Training Loss: 0.02296
Epoch: 6333, Training Loss: 0.01745
Epoch: 6334, Training Loss: 0.02005
Epoch: 6334, Training Loss: 0.01830
Epoch: 6334, Training Loss: 0.02296
Epoch: 6334, Training Loss: 0.01745
Epoch: 6335, Training Loss: 0.02005
Epoch: 6335, Training Loss: 0.01830
Epoch: 6335, Training Loss: 0.02296
Epoch: 6335, Training Loss: 0.01745
Epoch: 6336, Training Loss: 0.02005
Epoch: 6336, Training Loss: 0.01830
Epoch: 6336, Training Loss: 0.02296
Epoch: 6336, Training Loss: 0.01745
Epoch: 6337, Training Loss: 0.02005
Epoch: 6337, Training Loss: 0.01830
Epoch: 6337, Training Loss: 0.02296
Epoch: 6337, Training Loss: 0.01745
Epoch: 6338, Training Loss: 0.02004
Epoch: 6338, Training Loss: 0.01830
Epoch: 6338, Training Loss: 0.02295
Epoch: 6338, Training Loss: 0.01744
Epoch: 6339, Training Loss: 0.02004
Epoch: 6339, Training Loss: 0.01830
Epoch: 6339, Training Loss: 0.02295
Epoch: 6339, Training Loss: 0.01744
Epoch: 6340, Training Loss: 0.02004
Epoch: 6340, Training Loss: 0.01830
Epoch: 6340, Training Loss: 0.02295
Epoch: 6340, Training Loss: 0.01744
Epoch: 6341, Training Loss: 0.02004
Epoch: 6341, Training Loss: 0.01829
Epoch: 6341, Training Loss: 0.02295
Epoch: 6341, Training Loss: 0.01744
Epoch: 6342, Training Loss: 0.02004
Epoch: 6342, Training Loss: 0.01829
Epoch: 6342, Training Loss: 0.02294
Epoch: 6342, Training Loss: 0.01744
Epoch: 6343, Training Loss: 0.02003
Epoch: 6343, Training Loss: 0.01829
Epoch: 6343, Training Loss: 0.02294
Epoch: 6343, Training Loss: 0.01744
Epoch: 6344, Training Loss: 0.02003
Epoch: 6344, Training Loss: 0.01829
Epoch: 6344, Training Loss: 0.02294
Epoch: 6344, Training Loss: 0.01744
Epoch: 6345, Training Loss: 0.02003
Epoch: 6345, Training Loss: 0.01829
Epoch: 6345, Training Loss: 0.02294
Epoch: 6345, Training Loss: 0.01743
Epoch: 6346, Training Loss: 0.02003
Epoch: 6346, Training Loss: 0.01829
Epoch: 6346, Training Loss: 0.02294
Epoch: 6346, Training Loss: 0.01743
Epoch: 6347, Training Loss: 0.02003
Epoch: 6347, Training Loss: 0.01828
Epoch: 6347, Training Loss: 0.02293
Epoch: 6347, Training Loss: 0.01743
Epoch: 6348, Training Loss: 0.02002
Epoch: 6348, Training Loss: 0.01828
Epoch: 6348, Training Loss: 0.02293
Epoch: 6348, Training Loss: 0.01743
Epoch: 6349, Training Loss: 0.02002
Epoch: 6349, Training Loss: 0.01828
Epoch: 6349, Training Loss: 0.02293
Epoch: 6349, Training Loss: 0.01743
Epoch: 6350, Training Loss: 0.02002
Epoch: 6350, Training Loss: 0.01828
Epoch: 6350, Training Loss: 0.02293
Epoch: 6350, Training Loss: 0.01743
Epoch: 6351, Training Loss: 0.02002
Epoch: 6351, Training Loss: 0.01828
Epoch: 6351, Training Loss: 0.02292
Epoch: 6351, Training Loss: 0.01742
Epoch: 6352, Training Loss: 0.02002
Epoch: 6352, Training Loss: 0.01828
Epoch: 6352, Training Loss: 0.02292
Epoch: 6352, Training Loss: 0.01742
Epoch: 6353, Training Loss: 0.02002
Epoch: 6353, Training Loss: 0.01827
Epoch: 6353, Training Loss: 0.02292
Epoch: 6353, Training Loss: 0.01742
Epoch: 6354, Training Loss: 0.02001
Epoch: 6354, Training Loss: 0.01827
Epoch: 6354, Training Loss: 0.02292
Epoch: 6354, Training Loss: 0.01742
Epoch: 6355, Training Loss: 0.02001
Epoch: 6355, Training Loss: 0.01827
Epoch: 6355, Training Loss: 0.02292
Epoch: 6355, Training Loss: 0.01742
Epoch: 6356, Training Loss: 0.02001
Epoch: 6356, Training Loss: 0.01827
Epoch: 6356, Training Loss: 0.02291
Epoch: 6356, Training Loss: 0.01742
Epoch: 6357, Training Loss: 0.02001
Epoch: 6357, Training Loss: 0.01827
Epoch: 6357, Training Loss: 0.02291
Epoch: 6357, Training Loss: 0.01741
Epoch: 6358, Training Loss: 0.02001
Epoch: 6358, Training Loss: 0.01827
Epoch: 6358, Training Loss: 0.02291
Epoch: 6358, Training Loss: 0.01741
Epoch: 6359, Training Loss: 0.02000
Epoch: 6359, Training Loss: 0.01826
Epoch: 6359, Training Loss: 0.02291
Epoch: 6359, Training Loss: 0.01741
Epoch: 6360, Training Loss: 0.02000
Epoch: 6360, Training Loss: 0.01826
Epoch: 6360, Training Loss: 0.02290
Epoch: 6360, Training Loss: 0.01741
Epoch: 6361, Training Loss: 0.02000
Epoch: 6361, Training Loss: 0.01826
Epoch: 6361, Training Loss: 0.02290
Epoch: 6361, Training Loss: 0.01741
Epoch: 6362, Training Loss: 0.02000
Epoch: 6362, Training Loss: 0.01826
Epoch: 6362, Training Loss: 0.02290
Epoch: 6362, Training Loss: 0.01741
Epoch: 6363, Training Loss: 0.02000
Epoch: 6363, Training Loss: 0.01826
Epoch: 6363, Training Loss: 0.02290
Epoch: 6363, Training Loss: 0.01740
Epoch: 6364, Training Loss: 0.01999
Epoch: 6364, Training Loss: 0.01826
Epoch: 6364, Training Loss: 0.02290
Epoch: 6364, Training Loss: 0.01740
Epoch: 6365, Training Loss: 0.01999
Epoch: 6365, Training Loss: 0.01825
Epoch: 6365, Training Loss: 0.02289
Epoch: 6365, Training Loss: 0.01740
Epoch: 6366, Training Loss: 0.01999
Epoch: 6366, Training Loss: 0.01825
Epoch: 6366, Training Loss: 0.02289
Epoch: 6366, Training Loss: 0.01740
Epoch: 6367, Training Loss: 0.01999
Epoch: 6367, Training Loss: 0.01825
Epoch: 6367, Training Loss: 0.02289
Epoch: 6367, Training Loss: 0.01740
Epoch: 6368, Training Loss: 0.01999
Epoch: 6368, Training Loss: 0.01825
Epoch: 6368, Training Loss: 0.02289
Epoch: 6368, Training Loss: 0.01740
Epoch: 6369, Training Loss: 0.01999
Epoch: 6369, Training Loss: 0.01825
Epoch: 6369, Training Loss: 0.02289
Epoch: 6369, Training Loss: 0.01740
Epoch: 6370, Training Loss: 0.01998
Epoch: 6370, Training Loss: 0.01825
Epoch: 6370, Training Loss: 0.02288
Epoch: 6370, Training Loss: 0.01739
Epoch: 6371, Training Loss: 0.01998
Epoch: 6371, Training Loss: 0.01824
Epoch: 6371, Training Loss: 0.02288
Epoch: 6371, Training Loss: 0.01739
Epoch: 6372, Training Loss: 0.01998
Epoch: 6372, Training Loss: 0.01824
Epoch: 6372, Training Loss: 0.02288
Epoch: 6372, Training Loss: 0.01739
Epoch: 6373, Training Loss: 0.01998
Epoch: 6373, Training Loss: 0.01824
Epoch: 6373, Training Loss: 0.02288
Epoch: 6373, Training Loss: 0.01739
Epoch: 6374, Training Loss: 0.01998
Epoch: 6374, Training Loss: 0.01824
Epoch: 6374, Training Loss: 0.02287
Epoch: 6374, Training Loss: 0.01739
Epoch: 6375, Training Loss: 0.01997
Epoch: 6375, Training Loss: 0.01824
Epoch: 6375, Training Loss: 0.02287
Epoch: 6375, Training Loss: 0.01739
Epoch: 6376, Training Loss: 0.01997
Epoch: 6376, Training Loss: 0.01824
Epoch: 6376, Training Loss: 0.02287
Epoch: 6376, Training Loss: 0.01738
Epoch: 6377, Training Loss: 0.01997
Epoch: 6377, Training Loss: 0.01823
Epoch: 6377, Training Loss: 0.02287
Epoch: 6377, Training Loss: 0.01738
Epoch: 6378, Training Loss: 0.01997
Epoch: 6378, Training Loss: 0.01823
Epoch: 6378, Training Loss: 0.02287
Epoch: 6378, Training Loss: 0.01738
Epoch: 6379, Training Loss: 0.01997
Epoch: 6379, Training Loss: 0.01823
Epoch: 6379, Training Loss: 0.02286
Epoch: 6379, Training Loss: 0.01738
Epoch: 6380, Training Loss: 0.01996
Epoch: 6380, Training Loss: 0.01823
Epoch: 6380, Training Loss: 0.02286
Epoch: 6380, Training Loss: 0.01738
Epoch: 6381, Training Loss: 0.01996
Epoch: 6381, Training Loss: 0.01823
Epoch: 6381, Training Loss: 0.02286
Epoch: 6381, Training Loss: 0.01738
Epoch: 6382, Training Loss: 0.01996
Epoch: 6382, Training Loss: 0.01823
Epoch: 6382, Training Loss: 0.02286
Epoch: 6382, Training Loss: 0.01737
Epoch: 6383, Training Loss: 0.01996
Epoch: 6383, Training Loss: 0.01822
Epoch: 6383, Training Loss: 0.02285
Epoch: 6383, Training Loss: 0.01737
Epoch: 6384, Training Loss: 0.01996
Epoch: 6384, Training Loss: 0.01822
Epoch: 6384, Training Loss: 0.02285
Epoch: 6384, Training Loss: 0.01737
Epoch: 6385, Training Loss: 0.01996
Epoch: 6385, Training Loss: 0.01822
Epoch: 6385, Training Loss: 0.02285
Epoch: 6385, Training Loss: 0.01737
Epoch: 6386, Training Loss: 0.01995
Epoch: 6386, Training Loss: 0.01822
Epoch: 6386, Training Loss: 0.02285
Epoch: 6386, Training Loss: 0.01737
Epoch: 6387, Training Loss: 0.01995
Epoch: 6387, Training Loss: 0.01822
Epoch: 6387, Training Loss: 0.02285
Epoch: 6387, Training Loss: 0.01737
Epoch: 6388, Training Loss: 0.01995
Epoch: 6388, Training Loss: 0.01822
Epoch: 6388, Training Loss: 0.02284
Epoch: 6388, Training Loss: 0.01736
Epoch: 6389, Training Loss: 0.01995
Epoch: 6389, Training Loss: 0.01821
Epoch: 6389, Training Loss: 0.02284
Epoch: 6389, Training Loss: 0.01736
Epoch: 6390, Training Loss: 0.01995
Epoch: 6390, Training Loss: 0.01821
Epoch: 6390, Training Loss: 0.02284
Epoch: 6390, Training Loss: 0.01736
Epoch: 6391, Training Loss: 0.01994
Epoch: 6391, Training Loss: 0.01821
Epoch: 6391, Training Loss: 0.02284
Epoch: 6391, Training Loss: 0.01736
Epoch: 6392, Training Loss: 0.01994
Epoch: 6392, Training Loss: 0.01821
Epoch: 6392, Training Loss: 0.02284
Epoch: 6392, Training Loss: 0.01736
Epoch: 6393, Training Loss: 0.01994
Epoch: 6393, Training Loss: 0.01821
Epoch: 6393, Training Loss: 0.02283
Epoch: 6393, Training Loss: 0.01736
Epoch: 6394, Training Loss: 0.01994
Epoch: 6394, Training Loss: 0.01821
Epoch: 6394, Training Loss: 0.02283
Epoch: 6394, Training Loss: 0.01736
Epoch: 6395, Training Loss: 0.01994
Epoch: 6395, Training Loss: 0.01820
Epoch: 6395, Training Loss: 0.02283
Epoch: 6395, Training Loss: 0.01735
Epoch: 6396, Training Loss: 0.01993
Epoch: 6396, Training Loss: 0.01820
Epoch: 6396, Training Loss: 0.02283
Epoch: 6396, Training Loss: 0.01735
Epoch: 6397, Training Loss: 0.01993
Epoch: 6397, Training Loss: 0.01820
Epoch: 6397, Training Loss: 0.02282
Epoch: 6397, Training Loss: 0.01735
Epoch: 6398, Training Loss: 0.01993
Epoch: 6398, Training Loss: 0.01820
Epoch: 6398, Training Loss: 0.02282
Epoch: 6398, Training Loss: 0.01735
Epoch: 6399, Training Loss: 0.01993
Epoch: 6399, Training Loss: 0.01820
Epoch: 6399, Training Loss: 0.02282
Epoch: 6399, Training Loss: 0.01735
Epoch: 6400, Training Loss: 0.01993
Epoch: 6400, Training Loss: 0.01820
Epoch: 6400, Training Loss: 0.02282
Epoch: 6400, Training Loss: 0.01735
Epoch: 6401, Training Loss: 0.01993
Epoch: 6401, Training Loss: 0.01819
Epoch: 6401, Training Loss: 0.02282
Epoch: 6401, Training Loss: 0.01734
Epoch: 6402, Training Loss: 0.01992
Epoch: 6402, Training Loss: 0.01819
Epoch: 6402, Training Loss: 0.02281
Epoch: 6402, Training Loss: 0.01734
Epoch: 6403, Training Loss: 0.01992
Epoch: 6403, Training Loss: 0.01819
Epoch: 6403, Training Loss: 0.02281
Epoch: 6403, Training Loss: 0.01734
Epoch: 6404, Training Loss: 0.01992
Epoch: 6404, Training Loss: 0.01819
Epoch: 6404, Training Loss: 0.02281
Epoch: 6404, Training Loss: 0.01734
Epoch: 6405, Training Loss: 0.01992
Epoch: 6405, Training Loss: 0.01819
Epoch: 6405, Training Loss: 0.02281
Epoch: 6405, Training Loss: 0.01734
Epoch: 6406, Training Loss: 0.01992
Epoch: 6406, Training Loss: 0.01819
Epoch: 6406, Training Loss: 0.02281
Epoch: 6406, Training Loss: 0.01734
Epoch: 6407, Training Loss: 0.01991
Epoch: 6407, Training Loss: 0.01818
Epoch: 6407, Training Loss: 0.02280
Epoch: 6407, Training Loss: 0.01733
Epoch: 6408, Training Loss: 0.01991
Epoch: 6408, Training Loss: 0.01818
Epoch: 6408, Training Loss: 0.02280
Epoch: 6408, Training Loss: 0.01733
Epoch: 6409, Training Loss: 0.01991
Epoch: 6409, Training Loss: 0.01818
Epoch: 6409, Training Loss: 0.02280
Epoch: 6409, Training Loss: 0.01733
Epoch: 6410, Training Loss: 0.01991
Epoch: 6410, Training Loss: 0.01818
Epoch: 6410, Training Loss: 0.02280
Epoch: 6410, Training Loss: 0.01733
Epoch: 6411, Training Loss: 0.01991
Epoch: 6411, Training Loss: 0.01818
Epoch: 6411, Training Loss: 0.02279
Epoch: 6411, Training Loss: 0.01733
Epoch: 6412, Training Loss: 0.01991
Epoch: 6412, Training Loss: 0.01818
Epoch: 6412, Training Loss: 0.02279
Epoch: 6412, Training Loss: 0.01733
Epoch: 6413, Training Loss: 0.01990
Epoch: 6413, Training Loss: 0.01818
Epoch: 6413, Training Loss: 0.02279
Epoch: 6413, Training Loss: 0.01733
Epoch: 6414, Training Loss: 0.01990
Epoch: 6414, Training Loss: 0.01817
Epoch: 6414, Training Loss: 0.02279
Epoch: 6414, Training Loss: 0.01732
Epoch: 6415, Training Loss: 0.01990
Epoch: 6415, Training Loss: 0.01817
Epoch: 6415, Training Loss: 0.02279
Epoch: 6415, Training Loss: 0.01732
Epoch: 6416, Training Loss: 0.01990
Epoch: 6416, Training Loss: 0.01817
Epoch: 6416, Training Loss: 0.02278
Epoch: 6416, Training Loss: 0.01732
Epoch: 6417, Training Loss: 0.01990
Epoch: 6417, Training Loss: 0.01817
Epoch: 6417, Training Loss: 0.02278
Epoch: 6417, Training Loss: 0.01732
Epoch: 6418, Training Loss: 0.01989
Epoch: 6418, Training Loss: 0.01817
Epoch: 6418, Training Loss: 0.02278
Epoch: 6418, Training Loss: 0.01732
Epoch: 6419, Training Loss: 0.01989
Epoch: 6419, Training Loss: 0.01817
Epoch: 6419, Training Loss: 0.02278
Epoch: 6419, Training Loss: 0.01732
Epoch: 6420, Training Loss: 0.01989
Epoch: 6420, Training Loss: 0.01816
Epoch: 6420, Training Loss: 0.02278
Epoch: 6420, Training Loss: 0.01731
Epoch: 6421, Training Loss: 0.01989
Epoch: 6421, Training Loss: 0.01816
Epoch: 6421, Training Loss: 0.02277
Epoch: 6421, Training Loss: 0.01731
Epoch: 6422, Training Loss: 0.01989
Epoch: 6422, Training Loss: 0.01816
Epoch: 6422, Training Loss: 0.02277
Epoch: 6422, Training Loss: 0.01731
Epoch: 6423, Training Loss: 0.01988
Epoch: 6423, Training Loss: 0.01816
Epoch: 6423, Training Loss: 0.02277
Epoch: 6423, Training Loss: 0.01731
Epoch: 6424, Training Loss: 0.01988
Epoch: 6424, Training Loss: 0.01816
Epoch: 6424, Training Loss: 0.02277
Epoch: 6424, Training Loss: 0.01731
Epoch: 6425, Training Loss: 0.01988
Epoch: 6425, Training Loss: 0.01816
Epoch: 6425, Training Loss: 0.02276
Epoch: 6425, Training Loss: 0.01731
Epoch: 6426, Training Loss: 0.01988
Epoch: 6426, Training Loss: 0.01815
Epoch: 6426, Training Loss: 0.02276
Epoch: 6426, Training Loss: 0.01730
Epoch: 6427, Training Loss: 0.01988
Epoch: 6427, Training Loss: 0.01815
Epoch: 6427, Training Loss: 0.02276
Epoch: 6427, Training Loss: 0.01730
Epoch: 6428, Training Loss: 0.01988
Epoch: 6428, Training Loss: 0.01815
Epoch: 6428, Training Loss: 0.02276
Epoch: 6428, Training Loss: 0.01730
Epoch: 6429, Training Loss: 0.01987
Epoch: 6429, Training Loss: 0.01815
Epoch: 6429, Training Loss: 0.02276
Epoch: 6429, Training Loss: 0.01730
Epoch: 6430, Training Loss: 0.01987
Epoch: 6430, Training Loss: 0.01815
Epoch: 6430, Training Loss: 0.02275
Epoch: 6430, Training Loss: 0.01730
Epoch: 6431, Training Loss: 0.01987
Epoch: 6431, Training Loss: 0.01815
Epoch: 6431, Training Loss: 0.02275
Epoch: 6431, Training Loss: 0.01730
... (per-pattern training losses for epochs 6432-6680 elided; the four losses decrease only slightly over this range, from about 0.01987/0.01814/0.02275/0.01730 to about 0.01943/0.01776/0.02224/0.01693) ...
Epoch: 6681, Training Loss: 0.01942
Epoch: 6681, Training Loss: 0.01775
Epoch: 6681, Training Loss: 0.02223
Epoch: 6681, Training Loss: 0.01692
Epoch: 6682, Training Loss: 0.01942
Epoch: 6682, Training Loss: 0.01775
Epoch: 6682, Training Loss: 0.02223
Epoch: 6682, Training Loss: 0.01692
Epoch: 6683, Training Loss: 0.01942
Epoch: 6683, Training Loss: 0.01775
Epoch: 6683, Training Loss: 0.02223
Epoch: 6683, Training Loss: 0.01691
Epoch: 6684, Training Loss: 0.01942
Epoch: 6684, Training Loss: 0.01775
Epoch: 6684, Training Loss: 0.02223
Epoch: 6684, Training Loss: 0.01691
Epoch: 6685, Training Loss: 0.01942
Epoch: 6685, Training Loss: 0.01775
Epoch: 6685, Training Loss: 0.02223
Epoch: 6685, Training Loss: 0.01691
Epoch: 6686, Training Loss: 0.01942
Epoch: 6686, Training Loss: 0.01775
Epoch: 6686, Training Loss: 0.02222
Epoch: 6686, Training Loss: 0.01691
Epoch: 6687, Training Loss: 0.01941
Epoch: 6687, Training Loss: 0.01775
Epoch: 6687, Training Loss: 0.02222
Epoch: 6687, Training Loss: 0.01691
Epoch: 6688, Training Loss: 0.01941
Epoch: 6688, Training Loss: 0.01774
Epoch: 6688, Training Loss: 0.02222
Epoch: 6688, Training Loss: 0.01691
Epoch: 6689, Training Loss: 0.01941
Epoch: 6689, Training Loss: 0.01774
Epoch: 6689, Training Loss: 0.02222
Epoch: 6689, Training Loss: 0.01691
Epoch: 6690, Training Loss: 0.01941
Epoch: 6690, Training Loss: 0.01774
Epoch: 6690, Training Loss: 0.02222
Epoch: 6690, Training Loss: 0.01690
Epoch: 6691, Training Loss: 0.01941
Epoch: 6691, Training Loss: 0.01774
Epoch: 6691, Training Loss: 0.02221
Epoch: 6691, Training Loss: 0.01690
Epoch: 6692, Training Loss: 0.01941
Epoch: 6692, Training Loss: 0.01774
Epoch: 6692, Training Loss: 0.02221
Epoch: 6692, Training Loss: 0.01690
Epoch: 6693, Training Loss: 0.01940
Epoch: 6693, Training Loss: 0.01774
Epoch: 6693, Training Loss: 0.02221
Epoch: 6693, Training Loss: 0.01690
Epoch: 6694, Training Loss: 0.01940
Epoch: 6694, Training Loss: 0.01773
Epoch: 6694, Training Loss: 0.02221
Epoch: 6694, Training Loss: 0.01690
Epoch: 6695, Training Loss: 0.01940
Epoch: 6695, Training Loss: 0.01773
Epoch: 6695, Training Loss: 0.02221
Epoch: 6695, Training Loss: 0.01690
Epoch: 6696, Training Loss: 0.01940
Epoch: 6696, Training Loss: 0.01773
Epoch: 6696, Training Loss: 0.02221
Epoch: 6696, Training Loss: 0.01690
Epoch: 6697, Training Loss: 0.01940
Epoch: 6697, Training Loss: 0.01773
Epoch: 6697, Training Loss: 0.02220
Epoch: 6697, Training Loss: 0.01689
Epoch: 6698, Training Loss: 0.01939
Epoch: 6698, Training Loss: 0.01773
Epoch: 6698, Training Loss: 0.02220
Epoch: 6698, Training Loss: 0.01689
Epoch: 6699, Training Loss: 0.01939
Epoch: 6699, Training Loss: 0.01773
Epoch: 6699, Training Loss: 0.02220
Epoch: 6699, Training Loss: 0.01689
Epoch: 6700, Training Loss: 0.01939
Epoch: 6700, Training Loss: 0.01773
Epoch: 6700, Training Loss: 0.02220
Epoch: 6700, Training Loss: 0.01689
Epoch: 6701, Training Loss: 0.01939
Epoch: 6701, Training Loss: 0.01772
Epoch: 6701, Training Loss: 0.02220
Epoch: 6701, Training Loss: 0.01689
Epoch: 6702, Training Loss: 0.01939
Epoch: 6702, Training Loss: 0.01772
Epoch: 6702, Training Loss: 0.02219
Epoch: 6702, Training Loss: 0.01689
Epoch: 6703, Training Loss: 0.01939
Epoch: 6703, Training Loss: 0.01772
Epoch: 6703, Training Loss: 0.02219
Epoch: 6703, Training Loss: 0.01689
Epoch: 6704, Training Loss: 0.01938
Epoch: 6704, Training Loss: 0.01772
Epoch: 6704, Training Loss: 0.02219
Epoch: 6704, Training Loss: 0.01688
Epoch: 6705, Training Loss: 0.01938
Epoch: 6705, Training Loss: 0.01772
Epoch: 6705, Training Loss: 0.02219
Epoch: 6705, Training Loss: 0.01688
Epoch: 6706, Training Loss: 0.01938
Epoch: 6706, Training Loss: 0.01772
Epoch: 6706, Training Loss: 0.02219
Epoch: 6706, Training Loss: 0.01688
Epoch: 6707, Training Loss: 0.01938
Epoch: 6707, Training Loss: 0.01772
Epoch: 6707, Training Loss: 0.02218
Epoch: 6707, Training Loss: 0.01688
Epoch: 6708, Training Loss: 0.01938
Epoch: 6708, Training Loss: 0.01771
Epoch: 6708, Training Loss: 0.02218
Epoch: 6708, Training Loss: 0.01688
Epoch: 6709, Training Loss: 0.01938
Epoch: 6709, Training Loss: 0.01771
Epoch: 6709, Training Loss: 0.02218
Epoch: 6709, Training Loss: 0.01688
Epoch: 6710, Training Loss: 0.01937
Epoch: 6710, Training Loss: 0.01771
Epoch: 6710, Training Loss: 0.02218
Epoch: 6710, Training Loss: 0.01687
Epoch: 6711, Training Loss: 0.01937
Epoch: 6711, Training Loss: 0.01771
Epoch: 6711, Training Loss: 0.02218
Epoch: 6711, Training Loss: 0.01687
Epoch: 6712, Training Loss: 0.01937
Epoch: 6712, Training Loss: 0.01771
Epoch: 6712, Training Loss: 0.02217
Epoch: 6712, Training Loss: 0.01687
Epoch: 6713, Training Loss: 0.01937
Epoch: 6713, Training Loss: 0.01771
Epoch: 6713, Training Loss: 0.02217
Epoch: 6713, Training Loss: 0.01687
Epoch: 6714, Training Loss: 0.01937
Epoch: 6714, Training Loss: 0.01770
Epoch: 6714, Training Loss: 0.02217
Epoch: 6714, Training Loss: 0.01687
Epoch: 6715, Training Loss: 0.01937
Epoch: 6715, Training Loss: 0.01770
Epoch: 6715, Training Loss: 0.02217
Epoch: 6715, Training Loss: 0.01687
Epoch: 6716, Training Loss: 0.01936
Epoch: 6716, Training Loss: 0.01770
Epoch: 6716, Training Loss: 0.02217
Epoch: 6716, Training Loss: 0.01687
Epoch: 6717, Training Loss: 0.01936
Epoch: 6717, Training Loss: 0.01770
Epoch: 6717, Training Loss: 0.02216
Epoch: 6717, Training Loss: 0.01686
Epoch: 6718, Training Loss: 0.01936
Epoch: 6718, Training Loss: 0.01770
Epoch: 6718, Training Loss: 0.02216
Epoch: 6718, Training Loss: 0.01686
Epoch: 6719, Training Loss: 0.01936
Epoch: 6719, Training Loss: 0.01770
Epoch: 6719, Training Loss: 0.02216
Epoch: 6719, Training Loss: 0.01686
Epoch: 6720, Training Loss: 0.01936
Epoch: 6720, Training Loss: 0.01770
Epoch: 6720, Training Loss: 0.02216
Epoch: 6720, Training Loss: 0.01686
Epoch: 6721, Training Loss: 0.01936
Epoch: 6721, Training Loss: 0.01769
Epoch: 6721, Training Loss: 0.02216
Epoch: 6721, Training Loss: 0.01686
Epoch: 6722, Training Loss: 0.01935
Epoch: 6722, Training Loss: 0.01769
Epoch: 6722, Training Loss: 0.02215
Epoch: 6722, Training Loss: 0.01686
Epoch: 6723, Training Loss: 0.01935
Epoch: 6723, Training Loss: 0.01769
Epoch: 6723, Training Loss: 0.02215
Epoch: 6723, Training Loss: 0.01686
Epoch: 6724, Training Loss: 0.01935
Epoch: 6724, Training Loss: 0.01769
Epoch: 6724, Training Loss: 0.02215
Epoch: 6724, Training Loss: 0.01685
Epoch: 6725, Training Loss: 0.01935
Epoch: 6725, Training Loss: 0.01769
Epoch: 6725, Training Loss: 0.02215
Epoch: 6725, Training Loss: 0.01685
Epoch: 6726, Training Loss: 0.01935
Epoch: 6726, Training Loss: 0.01769
Epoch: 6726, Training Loss: 0.02215
Epoch: 6726, Training Loss: 0.01685
Epoch: 6727, Training Loss: 0.01935
Epoch: 6727, Training Loss: 0.01769
Epoch: 6727, Training Loss: 0.02214
Epoch: 6727, Training Loss: 0.01685
Epoch: 6728, Training Loss: 0.01934
Epoch: 6728, Training Loss: 0.01768
Epoch: 6728, Training Loss: 0.02214
Epoch: 6728, Training Loss: 0.01685
Epoch: 6729, Training Loss: 0.01934
Epoch: 6729, Training Loss: 0.01768
Epoch: 6729, Training Loss: 0.02214
Epoch: 6729, Training Loss: 0.01685
Epoch: 6730, Training Loss: 0.01934
Epoch: 6730, Training Loss: 0.01768
Epoch: 6730, Training Loss: 0.02214
Epoch: 6730, Training Loss: 0.01685
Epoch: 6731, Training Loss: 0.01934
Epoch: 6731, Training Loss: 0.01768
Epoch: 6731, Training Loss: 0.02214
Epoch: 6731, Training Loss: 0.01684
Epoch: 6732, Training Loss: 0.01934
Epoch: 6732, Training Loss: 0.01768
Epoch: 6732, Training Loss: 0.02213
Epoch: 6732, Training Loss: 0.01684
Epoch: 6733, Training Loss: 0.01933
Epoch: 6733, Training Loss: 0.01768
Epoch: 6733, Training Loss: 0.02213
Epoch: 6733, Training Loss: 0.01684
Epoch: 6734, Training Loss: 0.01933
Epoch: 6734, Training Loss: 0.01767
Epoch: 6734, Training Loss: 0.02213
Epoch: 6734, Training Loss: 0.01684
Epoch: 6735, Training Loss: 0.01933
Epoch: 6735, Training Loss: 0.01767
Epoch: 6735, Training Loss: 0.02213
Epoch: 6735, Training Loss: 0.01684
Epoch: 6736, Training Loss: 0.01933
Epoch: 6736, Training Loss: 0.01767
Epoch: 6736, Training Loss: 0.02213
Epoch: 6736, Training Loss: 0.01684
Epoch: 6737, Training Loss: 0.01933
Epoch: 6737, Training Loss: 0.01767
Epoch: 6737, Training Loss: 0.02212
Epoch: 6737, Training Loss: 0.01684
Epoch: 6738, Training Loss: 0.01933
Epoch: 6738, Training Loss: 0.01767
Epoch: 6738, Training Loss: 0.02212
Epoch: 6738, Training Loss: 0.01683
Epoch: 6739, Training Loss: 0.01932
Epoch: 6739, Training Loss: 0.01767
Epoch: 6739, Training Loss: 0.02212
Epoch: 6739, Training Loss: 0.01683
Epoch: 6740, Training Loss: 0.01932
Epoch: 6740, Training Loss: 0.01767
Epoch: 6740, Training Loss: 0.02212
Epoch: 6740, Training Loss: 0.01683
Epoch: 6741, Training Loss: 0.01932
Epoch: 6741, Training Loss: 0.01766
Epoch: 6741, Training Loss: 0.02212
Epoch: 6741, Training Loss: 0.01683
Epoch: 6742, Training Loss: 0.01932
Epoch: 6742, Training Loss: 0.01766
Epoch: 6742, Training Loss: 0.02211
Epoch: 6742, Training Loss: 0.01683
Epoch: 6743, Training Loss: 0.01932
Epoch: 6743, Training Loss: 0.01766
Epoch: 6743, Training Loss: 0.02211
Epoch: 6743, Training Loss: 0.01683
Epoch: 6744, Training Loss: 0.01932
Epoch: 6744, Training Loss: 0.01766
Epoch: 6744, Training Loss: 0.02211
Epoch: 6744, Training Loss: 0.01683
Epoch: 6745, Training Loss: 0.01931
Epoch: 6745, Training Loss: 0.01766
Epoch: 6745, Training Loss: 0.02211
Epoch: 6745, Training Loss: 0.01682
Epoch: 6746, Training Loss: 0.01931
Epoch: 6746, Training Loss: 0.01766
Epoch: 6746, Training Loss: 0.02211
Epoch: 6746, Training Loss: 0.01682
Epoch: 6747, Training Loss: 0.01931
Epoch: 6747, Training Loss: 0.01766
Epoch: 6747, Training Loss: 0.02210
Epoch: 6747, Training Loss: 0.01682
Epoch: 6748, Training Loss: 0.01931
Epoch: 6748, Training Loss: 0.01765
Epoch: 6748, Training Loss: 0.02210
Epoch: 6748, Training Loss: 0.01682
Epoch: 6749, Training Loss: 0.01931
Epoch: 6749, Training Loss: 0.01765
Epoch: 6749, Training Loss: 0.02210
Epoch: 6749, Training Loss: 0.01682
Epoch: 6750, Training Loss: 0.01931
Epoch: 6750, Training Loss: 0.01765
Epoch: 6750, Training Loss: 0.02210
Epoch: 6750, Training Loss: 0.01682
Epoch: 6751, Training Loss: 0.01930
Epoch: 6751, Training Loss: 0.01765
Epoch: 6751, Training Loss: 0.02210
Epoch: 6751, Training Loss: 0.01682
Epoch: 6752, Training Loss: 0.01930
Epoch: 6752, Training Loss: 0.01765
Epoch: 6752, Training Loss: 0.02209
Epoch: 6752, Training Loss: 0.01681
Epoch: 6753, Training Loss: 0.01930
Epoch: 6753, Training Loss: 0.01765
Epoch: 6753, Training Loss: 0.02209
Epoch: 6753, Training Loss: 0.01681
Epoch: 6754, Training Loss: 0.01930
Epoch: 6754, Training Loss: 0.01764
Epoch: 6754, Training Loss: 0.02209
Epoch: 6754, Training Loss: 0.01681
Epoch: 6755, Training Loss: 0.01930
Epoch: 6755, Training Loss: 0.01764
Epoch: 6755, Training Loss: 0.02209
Epoch: 6755, Training Loss: 0.01681
Epoch: 6756, Training Loss: 0.01930
Epoch: 6756, Training Loss: 0.01764
Epoch: 6756, Training Loss: 0.02209
Epoch: 6756, Training Loss: 0.01681
Epoch: 6757, Training Loss: 0.01929
Epoch: 6757, Training Loss: 0.01764
Epoch: 6757, Training Loss: 0.02208
Epoch: 6757, Training Loss: 0.01681
Epoch: 6758, Training Loss: 0.01929
Epoch: 6758, Training Loss: 0.01764
Epoch: 6758, Training Loss: 0.02208
Epoch: 6758, Training Loss: 0.01681
Epoch: 6759, Training Loss: 0.01929
Epoch: 6759, Training Loss: 0.01764
Epoch: 6759, Training Loss: 0.02208
Epoch: 6759, Training Loss: 0.01680
Epoch: 6760, Training Loss: 0.01929
Epoch: 6760, Training Loss: 0.01764
Epoch: 6760, Training Loss: 0.02208
Epoch: 6760, Training Loss: 0.01680
Epoch: 6761, Training Loss: 0.01929
Epoch: 6761, Training Loss: 0.01763
Epoch: 6761, Training Loss: 0.02208
Epoch: 6761, Training Loss: 0.01680
Epoch: 6762, Training Loss: 0.01929
Epoch: 6762, Training Loss: 0.01763
Epoch: 6762, Training Loss: 0.02207
Epoch: 6762, Training Loss: 0.01680
Epoch: 6763, Training Loss: 0.01928
Epoch: 6763, Training Loss: 0.01763
Epoch: 6763, Training Loss: 0.02207
Epoch: 6763, Training Loss: 0.01680
Epoch: 6764, Training Loss: 0.01928
Epoch: 6764, Training Loss: 0.01763
Epoch: 6764, Training Loss: 0.02207
Epoch: 6764, Training Loss: 0.01680
Epoch: 6765, Training Loss: 0.01928
Epoch: 6765, Training Loss: 0.01763
Epoch: 6765, Training Loss: 0.02207
Epoch: 6765, Training Loss: 0.01680
Epoch: 6766, Training Loss: 0.01928
Epoch: 6766, Training Loss: 0.01763
Epoch: 6766, Training Loss: 0.02207
Epoch: 6766, Training Loss: 0.01679
Epoch: 6767, Training Loss: 0.01928
Epoch: 6767, Training Loss: 0.01763
Epoch: 6767, Training Loss: 0.02206
Epoch: 6767, Training Loss: 0.01679
Epoch: 6768, Training Loss: 0.01928
Epoch: 6768, Training Loss: 0.01762
Epoch: 6768, Training Loss: 0.02206
Epoch: 6768, Training Loss: 0.01679
Epoch: 6769, Training Loss: 0.01927
Epoch: 6769, Training Loss: 0.01762
Epoch: 6769, Training Loss: 0.02206
Epoch: 6769, Training Loss: 0.01679
Epoch: 6770, Training Loss: 0.01927
Epoch: 6770, Training Loss: 0.01762
Epoch: 6770, Training Loss: 0.02206
Epoch: 6770, Training Loss: 0.01679
Epoch: 6771, Training Loss: 0.01927
Epoch: 6771, Training Loss: 0.01762
Epoch: 6771, Training Loss: 0.02206
Epoch: 6771, Training Loss: 0.01679
Epoch: 6772, Training Loss: 0.01927
Epoch: 6772, Training Loss: 0.01762
Epoch: 6772, Training Loss: 0.02206
Epoch: 6772, Training Loss: 0.01679
Epoch: 6773, Training Loss: 0.01927
Epoch: 6773, Training Loss: 0.01762
Epoch: 6773, Training Loss: 0.02205
Epoch: 6773, Training Loss: 0.01678
Epoch: 6774, Training Loss: 0.01927
Epoch: 6774, Training Loss: 0.01761
Epoch: 6774, Training Loss: 0.02205
Epoch: 6774, Training Loss: 0.01678
Epoch: 6775, Training Loss: 0.01926
Epoch: 6775, Training Loss: 0.01761
Epoch: 6775, Training Loss: 0.02205
Epoch: 6775, Training Loss: 0.01678
Epoch: 6776, Training Loss: 0.01926
Epoch: 6776, Training Loss: 0.01761
Epoch: 6776, Training Loss: 0.02205
Epoch: 6776, Training Loss: 0.01678
Epoch: 6777, Training Loss: 0.01926
Epoch: 6777, Training Loss: 0.01761
Epoch: 6777, Training Loss: 0.02205
Epoch: 6777, Training Loss: 0.01678
Epoch: 6778, Training Loss: 0.01926
Epoch: 6778, Training Loss: 0.01761
Epoch: 6778, Training Loss: 0.02204
Epoch: 6778, Training Loss: 0.01678
Epoch: 6779, Training Loss: 0.01926
Epoch: 6779, Training Loss: 0.01761
Epoch: 6779, Training Loss: 0.02204
Epoch: 6779, Training Loss: 0.01678
Epoch: 6780, Training Loss: 0.01926
Epoch: 6780, Training Loss: 0.01761
Epoch: 6780, Training Loss: 0.02204
Epoch: 6780, Training Loss: 0.01677
Epoch: 6781, Training Loss: 0.01925
Epoch: 6781, Training Loss: 0.01760
Epoch: 6781, Training Loss: 0.02204
Epoch: 6781, Training Loss: 0.01677
Epoch: 6782, Training Loss: 0.01925
Epoch: 6782, Training Loss: 0.01760
Epoch: 6782, Training Loss: 0.02204
Epoch: 6782, Training Loss: 0.01677
Epoch: 6783, Training Loss: 0.01925
Epoch: 6783, Training Loss: 0.01760
Epoch: 6783, Training Loss: 0.02203
Epoch: 6783, Training Loss: 0.01677
Epoch: 6784, Training Loss: 0.01925
Epoch: 6784, Training Loss: 0.01760
Epoch: 6784, Training Loss: 0.02203
Epoch: 6784, Training Loss: 0.01677
Epoch: 6785, Training Loss: 0.01925
Epoch: 6785, Training Loss: 0.01760
Epoch: 6785, Training Loss: 0.02203
Epoch: 6785, Training Loss: 0.01677
Epoch: 6786, Training Loss: 0.01925
Epoch: 6786, Training Loss: 0.01760
Epoch: 6786, Training Loss: 0.02203
Epoch: 6786, Training Loss: 0.01677
Epoch: 6787, Training Loss: 0.01924
Epoch: 6787, Training Loss: 0.01760
Epoch: 6787, Training Loss: 0.02203
Epoch: 6787, Training Loss: 0.01676
Epoch: 6788, Training Loss: 0.01924
Epoch: 6788, Training Loss: 0.01759
Epoch: 6788, Training Loss: 0.02202
Epoch: 6788, Training Loss: 0.01676
Epoch: 6789, Training Loss: 0.01924
Epoch: 6789, Training Loss: 0.01759
Epoch: 6789, Training Loss: 0.02202
Epoch: 6789, Training Loss: 0.01676
Epoch: 6790, Training Loss: 0.01924
Epoch: 6790, Training Loss: 0.01759
Epoch: 6790, Training Loss: 0.02202
Epoch: 6790, Training Loss: 0.01676
Epoch: 6791, Training Loss: 0.01924
Epoch: 6791, Training Loss: 0.01759
Epoch: 6791, Training Loss: 0.02202
Epoch: 6791, Training Loss: 0.01676
Epoch: 6792, Training Loss: 0.01924
Epoch: 6792, Training Loss: 0.01759
Epoch: 6792, Training Loss: 0.02202
Epoch: 6792, Training Loss: 0.01676
Epoch: 6793, Training Loss: 0.01923
Epoch: 6793, Training Loss: 0.01759
Epoch: 6793, Training Loss: 0.02201
Epoch: 6793, Training Loss: 0.01676
Epoch: 6794, Training Loss: 0.01923
Epoch: 6794, Training Loss: 0.01759
Epoch: 6794, Training Loss: 0.02201
Epoch: 6794, Training Loss: 0.01675
Epoch: 6795, Training Loss: 0.01923
Epoch: 6795, Training Loss: 0.01758
Epoch: 6795, Training Loss: 0.02201
Epoch: 6795, Training Loss: 0.01675
Epoch: 6796, Training Loss: 0.01923
Epoch: 6796, Training Loss: 0.01758
Epoch: 6796, Training Loss: 0.02201
Epoch: 6796, Training Loss: 0.01675
Epoch: 6797, Training Loss: 0.01923
Epoch: 6797, Training Loss: 0.01758
Epoch: 6797, Training Loss: 0.02201
Epoch: 6797, Training Loss: 0.01675
Epoch: 6798, Training Loss: 0.01923
Epoch: 6798, Training Loss: 0.01758
Epoch: 6798, Training Loss: 0.02200
Epoch: 6798, Training Loss: 0.01675
Epoch: 6799, Training Loss: 0.01922
Epoch: 6799, Training Loss: 0.01758
Epoch: 6799, Training Loss: 0.02200
Epoch: 6799, Training Loss: 0.01675
Epoch: 6800, Training Loss: 0.01922
Epoch: 6800, Training Loss: 0.01758
Epoch: 6800, Training Loss: 0.02200
Epoch: 6800, Training Loss: 0.01675
Epoch: 6801, Training Loss: 0.01922
Epoch: 6801, Training Loss: 0.01758
Epoch: 6801, Training Loss: 0.02200
Epoch: 6801, Training Loss: 0.01674
Epoch: 6802, Training Loss: 0.01922
Epoch: 6802, Training Loss: 0.01757
Epoch: 6802, Training Loss: 0.02200
Epoch: 6802, Training Loss: 0.01674
Epoch: 6803, Training Loss: 0.01922
Epoch: 6803, Training Loss: 0.01757
Epoch: 6803, Training Loss: 0.02199
Epoch: 6803, Training Loss: 0.01674
Epoch: 6804, Training Loss: 0.01921
Epoch: 6804, Training Loss: 0.01757
Epoch: 6804, Training Loss: 0.02199
Epoch: 6804, Training Loss: 0.01674
Epoch: 6805, Training Loss: 0.01921
Epoch: 6805, Training Loss: 0.01757
Epoch: 6805, Training Loss: 0.02199
Epoch: 6805, Training Loss: 0.01674
Epoch: 6806, Training Loss: 0.01921
Epoch: 6806, Training Loss: 0.01757
Epoch: 6806, Training Loss: 0.02199
Epoch: 6806, Training Loss: 0.01674
Epoch: 6807, Training Loss: 0.01921
Epoch: 6807, Training Loss: 0.01757
Epoch: 6807, Training Loss: 0.02199
Epoch: 6807, Training Loss: 0.01674
Epoch: 6808, Training Loss: 0.01921
Epoch: 6808, Training Loss: 0.01756
Epoch: 6808, Training Loss: 0.02199
Epoch: 6808, Training Loss: 0.01673
Epoch: 6809, Training Loss: 0.01921
Epoch: 6809, Training Loss: 0.01756
Epoch: 6809, Training Loss: 0.02198
Epoch: 6809, Training Loss: 0.01673
Epoch: 6810, Training Loss: 0.01920
Epoch: 6810, Training Loss: 0.01756
Epoch: 6810, Training Loss: 0.02198
Epoch: 6810, Training Loss: 0.01673
Epoch: 6811, Training Loss: 0.01920
Epoch: 6811, Training Loss: 0.01756
Epoch: 6811, Training Loss: 0.02198
Epoch: 6811, Training Loss: 0.01673
Epoch: 6812, Training Loss: 0.01920
Epoch: 6812, Training Loss: 0.01756
Epoch: 6812, Training Loss: 0.02198
Epoch: 6812, Training Loss: 0.01673
Epoch: 6813, Training Loss: 0.01920
Epoch: 6813, Training Loss: 0.01756
Epoch: 6813, Training Loss: 0.02198
Epoch: 6813, Training Loss: 0.01673
Epoch: 6814, Training Loss: 0.01920
Epoch: 6814, Training Loss: 0.01756
Epoch: 6814, Training Loss: 0.02197
Epoch: 6814, Training Loss: 0.01673
Epoch: 6815, Training Loss: 0.01920
Epoch: 6815, Training Loss: 0.01755
Epoch: 6815, Training Loss: 0.02197
Epoch: 6815, Training Loss: 0.01672
Epoch: 6816, Training Loss: 0.01919
Epoch: 6816, Training Loss: 0.01755
Epoch: 6816, Training Loss: 0.02197
Epoch: 6816, Training Loss: 0.01672
Epoch: 6817, Training Loss: 0.01919
Epoch: 6817, Training Loss: 0.01755
Epoch: 6817, Training Loss: 0.02197
Epoch: 6817, Training Loss: 0.01672
Epoch: 6818, Training Loss: 0.01919
Epoch: 6818, Training Loss: 0.01755
Epoch: 6818, Training Loss: 0.02197
Epoch: 6818, Training Loss: 0.01672
Epoch: 6819, Training Loss: 0.01919
Epoch: 6819, Training Loss: 0.01755
Epoch: 6819, Training Loss: 0.02196
Epoch: 6819, Training Loss: 0.01672
Epoch: 6820, Training Loss: 0.01919
Epoch: 6820, Training Loss: 0.01755
Epoch: 6820, Training Loss: 0.02196
Epoch: 6820, Training Loss: 0.01672
Epoch: 6821, Training Loss: 0.01919
Epoch: 6821, Training Loss: 0.01755
Epoch: 6821, Training Loss: 0.02196
Epoch: 6821, Training Loss: 0.01672
Epoch: 6822, Training Loss: 0.01918
Epoch: 6822, Training Loss: 0.01754
Epoch: 6822, Training Loss: 0.02196
Epoch: 6822, Training Loss: 0.01671
Epoch: 6823, Training Loss: 0.01918
Epoch: 6823, Training Loss: 0.01754
Epoch: 6823, Training Loss: 0.02196
Epoch: 6823, Training Loss: 0.01671
Epoch: 6824, Training Loss: 0.01918
Epoch: 6824, Training Loss: 0.01754
Epoch: 6824, Training Loss: 0.02195
Epoch: 6824, Training Loss: 0.01671
Epoch: 6825, Training Loss: 0.01918
Epoch: 6825, Training Loss: 0.01754
Epoch: 6825, Training Loss: 0.02195
Epoch: 6825, Training Loss: 0.01671
Epoch: 6826, Training Loss: 0.01918
Epoch: 6826, Training Loss: 0.01754
Epoch: 6826, Training Loss: 0.02195
Epoch: 6826, Training Loss: 0.01671
Epoch: 6827, Training Loss: 0.01918
Epoch: 6827, Training Loss: 0.01754
Epoch: 6827, Training Loss: 0.02195
Epoch: 6827, Training Loss: 0.01671
Epoch: 6828, Training Loss: 0.01917
Epoch: 6828, Training Loss: 0.01754
Epoch: 6828, Training Loss: 0.02195
Epoch: 6828, Training Loss: 0.01671
Epoch: 6829, Training Loss: 0.01917
Epoch: 6829, Training Loss: 0.01753
Epoch: 6829, Training Loss: 0.02194
Epoch: 6829, Training Loss: 0.01670
Epoch: 6830, Training Loss: 0.01917
Epoch: 6830, Training Loss: 0.01753
Epoch: 6830, Training Loss: 0.02194
Epoch: 6830, Training Loss: 0.01670
Epoch: 6831, Training Loss: 0.01917
Epoch: 6831, Training Loss: 0.01753
Epoch: 6831, Training Loss: 0.02194
Epoch: 6831, Training Loss: 0.01670
Epoch: 6832, Training Loss: 0.01917
Epoch: 6832, Training Loss: 0.01753
Epoch: 6832, Training Loss: 0.02194
Epoch: 6832, Training Loss: 0.01670
Epoch: 6833, Training Loss: 0.01917
Epoch: 6833, Training Loss: 0.01753
Epoch: 6833, Training Loss: 0.02194
Epoch: 6833, Training Loss: 0.01670
Epoch: 6834, Training Loss: 0.01917
Epoch: 6834, Training Loss: 0.01753
Epoch: 6834, Training Loss: 0.02194
Epoch: 6834, Training Loss: 0.01670
Epoch: 6835, Training Loss: 0.01916
Epoch: 6835, Training Loss: 0.01753
Epoch: 6835, Training Loss: 0.02193
Epoch: 6835, Training Loss: 0.01670
Epoch: 6836, Training Loss: 0.01916
Epoch: 6836, Training Loss: 0.01752
Epoch: 6836, Training Loss: 0.02193
Epoch: 6836, Training Loss: 0.01669
Epoch: 6837, Training Loss: 0.01916
Epoch: 6837, Training Loss: 0.01752
Epoch: 6837, Training Loss: 0.02193
Epoch: 6837, Training Loss: 0.01669
Epoch: 6838, Training Loss: 0.01916
Epoch: 6838, Training Loss: 0.01752
Epoch: 6838, Training Loss: 0.02193
Epoch: 6838, Training Loss: 0.01669
Epoch: 6839, Training Loss: 0.01916
Epoch: 6839, Training Loss: 0.01752
Epoch: 6839, Training Loss: 0.02193
Epoch: 6839, Training Loss: 0.01669
Epoch: 6840, Training Loss: 0.01916
Epoch: 6840, Training Loss: 0.01752
Epoch: 6840, Training Loss: 0.02192
Epoch: 6840, Training Loss: 0.01669
Epoch: 6841, Training Loss: 0.01915
Epoch: 6841, Training Loss: 0.01752
Epoch: 6841, Training Loss: 0.02192
Epoch: 6841, Training Loss: 0.01669
Epoch: 6842, Training Loss: 0.01915
Epoch: 6842, Training Loss: 0.01751
Epoch: 6842, Training Loss: 0.02192
Epoch: 6842, Training Loss: 0.01669
Epoch: 6843, Training Loss: 0.01915
Epoch: 6843, Training Loss: 0.01751
Epoch: 6843, Training Loss: 0.02192
Epoch: 6843, Training Loss: 0.01668
Epoch: 6844, Training Loss: 0.01915
Epoch: 6844, Training Loss: 0.01751
Epoch: 6844, Training Loss: 0.02192
Epoch: 6844, Training Loss: 0.01668
Epoch: 6845, Training Loss: 0.01915
Epoch: 6845, Training Loss: 0.01751
Epoch: 6845, Training Loss: 0.02191
Epoch: 6845, Training Loss: 0.01668
Epoch: 6846, Training Loss: 0.01915
Epoch: 6846, Training Loss: 0.01751
Epoch: 6846, Training Loss: 0.02191
Epoch: 6846, Training Loss: 0.01668
Epoch: 6847, Training Loss: 0.01914
Epoch: 6847, Training Loss: 0.01751
Epoch: 6847, Training Loss: 0.02191
Epoch: 6847, Training Loss: 0.01668
Epoch: 6848, Training Loss: 0.01914
Epoch: 6848, Training Loss: 0.01751
Epoch: 6848, Training Loss: 0.02191
Epoch: 6848, Training Loss: 0.01668
Epoch: 6849, Training Loss: 0.01914
Epoch: 6849, Training Loss: 0.01750
Epoch: 6849, Training Loss: 0.02191
Epoch: 6849, Training Loss: 0.01668
Epoch: 6850, Training Loss: 0.01914
Epoch: 6850, Training Loss: 0.01750
Epoch: 6850, Training Loss: 0.02190
Epoch: 6850, Training Loss: 0.01667
Epoch: 6851, Training Loss: 0.01914
Epoch: 6851, Training Loss: 0.01750
Epoch: 6851, Training Loss: 0.02190
Epoch: 6851, Training Loss: 0.01667
Epoch: 6852, Training Loss: 0.01914
Epoch: 6852, Training Loss: 0.01750
Epoch: 6852, Training Loss: 0.02190
Epoch: 6852, Training Loss: 0.01667
Epoch: 6853, Training Loss: 0.01913
Epoch: 6853, Training Loss: 0.01750
Epoch: 6853, Training Loss: 0.02190
Epoch: 6853, Training Loss: 0.01667
Epoch: 6854, Training Loss: 0.01913
Epoch: 6854, Training Loss: 0.01750
Epoch: 6854, Training Loss: 0.02190
Epoch: 6854, Training Loss: 0.01667
Epoch: 6855, Training Loss: 0.01913
Epoch: 6855, Training Loss: 0.01750
Epoch: 6855, Training Loss: 0.02189
Epoch: 6855, Training Loss: 0.01667
Epoch: 6856, Training Loss: 0.01913
Epoch: 6856, Training Loss: 0.01749
Epoch: 6856, Training Loss: 0.02189
Epoch: 6856, Training Loss: 0.01667
Epoch: 6857, Training Loss: 0.01913
Epoch: 6857, Training Loss: 0.01749
Epoch: 6857, Training Loss: 0.02189
Epoch: 6857, Training Loss: 0.01666
Epoch: 6858, Training Loss: 0.01913
Epoch: 6858, Training Loss: 0.01749
Epoch: 6858, Training Loss: 0.02189
Epoch: 6858, Training Loss: 0.01666
Epoch: 6859, Training Loss: 0.01912
Epoch: 6859, Training Loss: 0.01749
Epoch: 6859, Training Loss: 0.02189
Epoch: 6859, Training Loss: 0.01666
Epoch: 6860, Training Loss: 0.01912
Epoch: 6860, Training Loss: 0.01749
Epoch: 6860, Training Loss: 0.02189
Epoch: 6860, Training Loss: 0.01666
Epoch: 6861, Training Loss: 0.01912
Epoch: 6861, Training Loss: 0.01749
Epoch: 6861, Training Loss: 0.02188
Epoch: 6861, Training Loss: 0.01666
Epoch: 6862, Training Loss: 0.01912
Epoch: 6862, Training Loss: 0.01749
Epoch: 6862, Training Loss: 0.02188
Epoch: 6862, Training Loss: 0.01666
Epoch: 6863, Training Loss: 0.01912
Epoch: 6863, Training Loss: 0.01748
Epoch: 6863, Training Loss: 0.02188
Epoch: 6863, Training Loss: 0.01666
Epoch: 6864, Training Loss: 0.01912
Epoch: 6864, Training Loss: 0.01748
Epoch: 6864, Training Loss: 0.02188
Epoch: 6864, Training Loss: 0.01665
Epoch: 6865, Training Loss: 0.01911
Epoch: 6865, Training Loss: 0.01748
Epoch: 6865, Training Loss: 0.02188
Epoch: 6865, Training Loss: 0.01665
Epoch: 6866, Training Loss: 0.01911
Epoch: 6866, Training Loss: 0.01748
Epoch: 6866, Training Loss: 0.02187
Epoch: 6866, Training Loss: 0.01665
Epoch: 6867, Training Loss: 0.01911
Epoch: 6867, Training Loss: 0.01748
Epoch: 6867, Training Loss: 0.02187
Epoch: 6867, Training Loss: 0.01665
Epoch: 6868, Training Loss: 0.01911
Epoch: 6868, Training Loss: 0.01748
Epoch: 6868, Training Loss: 0.02187
Epoch: 6868, Training Loss: 0.01665
Epoch: 6869, Training Loss: 0.01911
Epoch: 6869, Training Loss: 0.01748
Epoch: 6869, Training Loss: 0.02187
Epoch: 6869, Training Loss: 0.01665
Epoch: 6870, Training Loss: 0.01911
Epoch: 6870, Training Loss: 0.01747
Epoch: 6870, Training Loss: 0.02187
Epoch: 6870, Training Loss: 0.01665
Epoch: 6871, Training Loss: 0.01910
Epoch: 6871, Training Loss: 0.01747
Epoch: 6871, Training Loss: 0.02186
Epoch: 6871, Training Loss: 0.01664
Epoch: 6872, Training Loss: 0.01910
Epoch: 6872, Training Loss: 0.01747
Epoch: 6872, Training Loss: 0.02186
Epoch: 6872, Training Loss: 0.01664
Epoch: 6873, Training Loss: 0.01910
Epoch: 6873, Training Loss: 0.01747
Epoch: 6873, Training Loss: 0.02186
Epoch: 6873, Training Loss: 0.01664
Epoch: 6874, Training Loss: 0.01910
Epoch: 6874, Training Loss: 0.01747
Epoch: 6874, Training Loss: 0.02186
Epoch: 6874, Training Loss: 0.01664
Epoch: 6875, Training Loss: 0.01910
Epoch: 6875, Training Loss: 0.01747
Epoch: 6875, Training Loss: 0.02186
Epoch: 6875, Training Loss: 0.01664
Epoch: 6876, Training Loss: 0.01910
Epoch: 6876, Training Loss: 0.01747
Epoch: 6876, Training Loss: 0.02185
Epoch: 6876, Training Loss: 0.01664
Epoch: 6877, Training Loss: 0.01909
Epoch: 6877, Training Loss: 0.01746
Epoch: 6877, Training Loss: 0.02185
Epoch: 6877, Training Loss: 0.01664
[... per-pattern training-loss log truncated: over epochs 6877-7125 the four per-pattern losses decline slowly from 0.01909 / 0.01746 / 0.02185 / 0.01664 to 0.01870 / 0.01712 / 0.02140 / 0.01630 ...]
Epoch: 7125, Training Loss: 0.01870
Epoch: 7125, Training Loss: 0.01712
Epoch: 7125, Training Loss: 0.02140
Epoch: 7125, Training Loss: 0.01630
Epoch: 7126, Training Loss: 0.01870
Epoch: 7126, Training Loss: 0.01711
Epoch: 7126, Training Loss: 0.02139
Epoch: 7126, Training Loss: 0.01630
Epoch: 7127, Training Loss: 0.01870
Epoch: 7127, Training Loss: 0.01711
Epoch: 7127, Training Loss: 0.02139
Epoch: 7127, Training Loss: 0.01630
Epoch: 7128, Training Loss: 0.01869
Epoch: 7128, Training Loss: 0.01711
Epoch: 7128, Training Loss: 0.02139
Epoch: 7128, Training Loss: 0.01630
Epoch: 7129, Training Loss: 0.01869
Epoch: 7129, Training Loss: 0.01711
Epoch: 7129, Training Loss: 0.02139
Epoch: 7129, Training Loss: 0.01629
Epoch: 7130, Training Loss: 0.01869
Epoch: 7130, Training Loss: 0.01711
Epoch: 7130, Training Loss: 0.02139
Epoch: 7130, Training Loss: 0.01629
Epoch: 7131, Training Loss: 0.01869
Epoch: 7131, Training Loss: 0.01711
Epoch: 7131, Training Loss: 0.02139
Epoch: 7131, Training Loss: 0.01629
Epoch: 7132, Training Loss: 0.01869
Epoch: 7132, Training Loss: 0.01711
Epoch: 7132, Training Loss: 0.02138
Epoch: 7132, Training Loss: 0.01629
Epoch: 7133, Training Loss: 0.01869
Epoch: 7133, Training Loss: 0.01711
Epoch: 7133, Training Loss: 0.02138
Epoch: 7133, Training Loss: 0.01629
Epoch: 7134, Training Loss: 0.01869
Epoch: 7134, Training Loss: 0.01710
Epoch: 7134, Training Loss: 0.02138
Epoch: 7134, Training Loss: 0.01629
Epoch: 7135, Training Loss: 0.01868
Epoch: 7135, Training Loss: 0.01710
Epoch: 7135, Training Loss: 0.02138
Epoch: 7135, Training Loss: 0.01629
Epoch: 7136, Training Loss: 0.01868
Epoch: 7136, Training Loss: 0.01710
Epoch: 7136, Training Loss: 0.02138
Epoch: 7136, Training Loss: 0.01629
Epoch: 7137, Training Loss: 0.01868
Epoch: 7137, Training Loss: 0.01710
Epoch: 7137, Training Loss: 0.02137
Epoch: 7137, Training Loss: 0.01628
Epoch: 7138, Training Loss: 0.01868
Epoch: 7138, Training Loss: 0.01710
Epoch: 7138, Training Loss: 0.02137
Epoch: 7138, Training Loss: 0.01628
Epoch: 7139, Training Loss: 0.01868
Epoch: 7139, Training Loss: 0.01710
Epoch: 7139, Training Loss: 0.02137
Epoch: 7139, Training Loss: 0.01628
Epoch: 7140, Training Loss: 0.01868
Epoch: 7140, Training Loss: 0.01710
Epoch: 7140, Training Loss: 0.02137
Epoch: 7140, Training Loss: 0.01628
Epoch: 7141, Training Loss: 0.01867
Epoch: 7141, Training Loss: 0.01709
Epoch: 7141, Training Loss: 0.02137
Epoch: 7141, Training Loss: 0.01628
Epoch: 7142, Training Loss: 0.01867
Epoch: 7142, Training Loss: 0.01709
Epoch: 7142, Training Loss: 0.02137
Epoch: 7142, Training Loss: 0.01628
Epoch: 7143, Training Loss: 0.01867
Epoch: 7143, Training Loss: 0.01709
Epoch: 7143, Training Loss: 0.02136
Epoch: 7143, Training Loss: 0.01628
Epoch: 7144, Training Loss: 0.01867
Epoch: 7144, Training Loss: 0.01709
Epoch: 7144, Training Loss: 0.02136
Epoch: 7144, Training Loss: 0.01627
Epoch: 7145, Training Loss: 0.01867
Epoch: 7145, Training Loss: 0.01709
Epoch: 7145, Training Loss: 0.02136
Epoch: 7145, Training Loss: 0.01627
Epoch: 7146, Training Loss: 0.01867
Epoch: 7146, Training Loss: 0.01709
Epoch: 7146, Training Loss: 0.02136
Epoch: 7146, Training Loss: 0.01627
Epoch: 7147, Training Loss: 0.01867
Epoch: 7147, Training Loss: 0.01709
Epoch: 7147, Training Loss: 0.02136
Epoch: 7147, Training Loss: 0.01627
Epoch: 7148, Training Loss: 0.01866
Epoch: 7148, Training Loss: 0.01709
Epoch: 7148, Training Loss: 0.02136
Epoch: 7148, Training Loss: 0.01627
Epoch: 7149, Training Loss: 0.01866
Epoch: 7149, Training Loss: 0.01708
Epoch: 7149, Training Loss: 0.02135
Epoch: 7149, Training Loss: 0.01627
Epoch: 7150, Training Loss: 0.01866
Epoch: 7150, Training Loss: 0.01708
Epoch: 7150, Training Loss: 0.02135
Epoch: 7150, Training Loss: 0.01627
Epoch: 7151, Training Loss: 0.01866
Epoch: 7151, Training Loss: 0.01708
Epoch: 7151, Training Loss: 0.02135
Epoch: 7151, Training Loss: 0.01627
Epoch: 7152, Training Loss: 0.01866
Epoch: 7152, Training Loss: 0.01708
Epoch: 7152, Training Loss: 0.02135
Epoch: 7152, Training Loss: 0.01626
Epoch: 7153, Training Loss: 0.01866
Epoch: 7153, Training Loss: 0.01708
Epoch: 7153, Training Loss: 0.02135
Epoch: 7153, Training Loss: 0.01626
Epoch: 7154, Training Loss: 0.01865
Epoch: 7154, Training Loss: 0.01708
Epoch: 7154, Training Loss: 0.02134
Epoch: 7154, Training Loss: 0.01626
Epoch: 7155, Training Loss: 0.01865
Epoch: 7155, Training Loss: 0.01708
Epoch: 7155, Training Loss: 0.02134
Epoch: 7155, Training Loss: 0.01626
Epoch: 7156, Training Loss: 0.01865
Epoch: 7156, Training Loss: 0.01707
Epoch: 7156, Training Loss: 0.02134
Epoch: 7156, Training Loss: 0.01626
Epoch: 7157, Training Loss: 0.01865
Epoch: 7157, Training Loss: 0.01707
Epoch: 7157, Training Loss: 0.02134
Epoch: 7157, Training Loss: 0.01626
Epoch: 7158, Training Loss: 0.01865
Epoch: 7158, Training Loss: 0.01707
Epoch: 7158, Training Loss: 0.02134
Epoch: 7158, Training Loss: 0.01626
Epoch: 7159, Training Loss: 0.01865
Epoch: 7159, Training Loss: 0.01707
Epoch: 7159, Training Loss: 0.02134
Epoch: 7159, Training Loss: 0.01626
Epoch: 7160, Training Loss: 0.01865
Epoch: 7160, Training Loss: 0.01707
Epoch: 7160, Training Loss: 0.02133
Epoch: 7160, Training Loss: 0.01625
Epoch: 7161, Training Loss: 0.01864
Epoch: 7161, Training Loss: 0.01707
Epoch: 7161, Training Loss: 0.02133
Epoch: 7161, Training Loss: 0.01625
Epoch: 7162, Training Loss: 0.01864
Epoch: 7162, Training Loss: 0.01707
Epoch: 7162, Training Loss: 0.02133
Epoch: 7162, Training Loss: 0.01625
Epoch: 7163, Training Loss: 0.01864
Epoch: 7163, Training Loss: 0.01706
Epoch: 7163, Training Loss: 0.02133
Epoch: 7163, Training Loss: 0.01625
Epoch: 7164, Training Loss: 0.01864
Epoch: 7164, Training Loss: 0.01706
Epoch: 7164, Training Loss: 0.02133
Epoch: 7164, Training Loss: 0.01625
Epoch: 7165, Training Loss: 0.01864
Epoch: 7165, Training Loss: 0.01706
Epoch: 7165, Training Loss: 0.02133
Epoch: 7165, Training Loss: 0.01625
Epoch: 7166, Training Loss: 0.01864
Epoch: 7166, Training Loss: 0.01706
Epoch: 7166, Training Loss: 0.02132
Epoch: 7166, Training Loss: 0.01625
Epoch: 7167, Training Loss: 0.01863
Epoch: 7167, Training Loss: 0.01706
Epoch: 7167, Training Loss: 0.02132
Epoch: 7167, Training Loss: 0.01624
Epoch: 7168, Training Loss: 0.01863
Epoch: 7168, Training Loss: 0.01706
Epoch: 7168, Training Loss: 0.02132
Epoch: 7168, Training Loss: 0.01624
Epoch: 7169, Training Loss: 0.01863
Epoch: 7169, Training Loss: 0.01706
Epoch: 7169, Training Loss: 0.02132
Epoch: 7169, Training Loss: 0.01624
Epoch: 7170, Training Loss: 0.01863
Epoch: 7170, Training Loss: 0.01706
Epoch: 7170, Training Loss: 0.02132
Epoch: 7170, Training Loss: 0.01624
Epoch: 7171, Training Loss: 0.01863
Epoch: 7171, Training Loss: 0.01705
Epoch: 7171, Training Loss: 0.02131
Epoch: 7171, Training Loss: 0.01624
Epoch: 7172, Training Loss: 0.01863
Epoch: 7172, Training Loss: 0.01705
Epoch: 7172, Training Loss: 0.02131
Epoch: 7172, Training Loss: 0.01624
Epoch: 7173, Training Loss: 0.01863
Epoch: 7173, Training Loss: 0.01705
Epoch: 7173, Training Loss: 0.02131
Epoch: 7173, Training Loss: 0.01624
Epoch: 7174, Training Loss: 0.01862
Epoch: 7174, Training Loss: 0.01705
Epoch: 7174, Training Loss: 0.02131
Epoch: 7174, Training Loss: 0.01624
Epoch: 7175, Training Loss: 0.01862
Epoch: 7175, Training Loss: 0.01705
Epoch: 7175, Training Loss: 0.02131
Epoch: 7175, Training Loss: 0.01623
Epoch: 7176, Training Loss: 0.01862
Epoch: 7176, Training Loss: 0.01705
Epoch: 7176, Training Loss: 0.02131
Epoch: 7176, Training Loss: 0.01623
Epoch: 7177, Training Loss: 0.01862
Epoch: 7177, Training Loss: 0.01705
Epoch: 7177, Training Loss: 0.02130
Epoch: 7177, Training Loss: 0.01623
Epoch: 7178, Training Loss: 0.01862
Epoch: 7178, Training Loss: 0.01704
Epoch: 7178, Training Loss: 0.02130
Epoch: 7178, Training Loss: 0.01623
Epoch: 7179, Training Loss: 0.01862
Epoch: 7179, Training Loss: 0.01704
Epoch: 7179, Training Loss: 0.02130
Epoch: 7179, Training Loss: 0.01623
Epoch: 7180, Training Loss: 0.01861
Epoch: 7180, Training Loss: 0.01704
Epoch: 7180, Training Loss: 0.02130
Epoch: 7180, Training Loss: 0.01623
Epoch: 7181, Training Loss: 0.01861
Epoch: 7181, Training Loss: 0.01704
Epoch: 7181, Training Loss: 0.02130
Epoch: 7181, Training Loss: 0.01623
Epoch: 7182, Training Loss: 0.01861
Epoch: 7182, Training Loss: 0.01704
Epoch: 7182, Training Loss: 0.02130
Epoch: 7182, Training Loss: 0.01623
Epoch: 7183, Training Loss: 0.01861
Epoch: 7183, Training Loss: 0.01704
Epoch: 7183, Training Loss: 0.02129
Epoch: 7183, Training Loss: 0.01622
Epoch: 7184, Training Loss: 0.01861
Epoch: 7184, Training Loss: 0.01704
Epoch: 7184, Training Loss: 0.02129
Epoch: 7184, Training Loss: 0.01622
Epoch: 7185, Training Loss: 0.01861
Epoch: 7185, Training Loss: 0.01704
Epoch: 7185, Training Loss: 0.02129
Epoch: 7185, Training Loss: 0.01622
Epoch: 7186, Training Loss: 0.01861
Epoch: 7186, Training Loss: 0.01703
Epoch: 7186, Training Loss: 0.02129
Epoch: 7186, Training Loss: 0.01622
Epoch: 7187, Training Loss: 0.01860
Epoch: 7187, Training Loss: 0.01703
Epoch: 7187, Training Loss: 0.02129
Epoch: 7187, Training Loss: 0.01622
Epoch: 7188, Training Loss: 0.01860
Epoch: 7188, Training Loss: 0.01703
Epoch: 7188, Training Loss: 0.02128
Epoch: 7188, Training Loss: 0.01622
Epoch: 7189, Training Loss: 0.01860
Epoch: 7189, Training Loss: 0.01703
Epoch: 7189, Training Loss: 0.02128
Epoch: 7189, Training Loss: 0.01622
Epoch: 7190, Training Loss: 0.01860
Epoch: 7190, Training Loss: 0.01703
Epoch: 7190, Training Loss: 0.02128
Epoch: 7190, Training Loss: 0.01621
Epoch: 7191, Training Loss: 0.01860
Epoch: 7191, Training Loss: 0.01703
Epoch: 7191, Training Loss: 0.02128
Epoch: 7191, Training Loss: 0.01621
Epoch: 7192, Training Loss: 0.01860
Epoch: 7192, Training Loss: 0.01703
Epoch: 7192, Training Loss: 0.02128
Epoch: 7192, Training Loss: 0.01621
Epoch: 7193, Training Loss: 0.01859
Epoch: 7193, Training Loss: 0.01702
Epoch: 7193, Training Loss: 0.02128
Epoch: 7193, Training Loss: 0.01621
Epoch: 7194, Training Loss: 0.01859
Epoch: 7194, Training Loss: 0.01702
Epoch: 7194, Training Loss: 0.02127
Epoch: 7194, Training Loss: 0.01621
Epoch: 7195, Training Loss: 0.01859
Epoch: 7195, Training Loss: 0.01702
Epoch: 7195, Training Loss: 0.02127
Epoch: 7195, Training Loss: 0.01621
Epoch: 7196, Training Loss: 0.01859
Epoch: 7196, Training Loss: 0.01702
Epoch: 7196, Training Loss: 0.02127
Epoch: 7196, Training Loss: 0.01621
Epoch: 7197, Training Loss: 0.01859
Epoch: 7197, Training Loss: 0.01702
Epoch: 7197, Training Loss: 0.02127
Epoch: 7197, Training Loss: 0.01621
Epoch: 7198, Training Loss: 0.01859
Epoch: 7198, Training Loss: 0.01702
Epoch: 7198, Training Loss: 0.02127
Epoch: 7198, Training Loss: 0.01620
Epoch: 7199, Training Loss: 0.01859
Epoch: 7199, Training Loss: 0.01702
Epoch: 7199, Training Loss: 0.02127
Epoch: 7199, Training Loss: 0.01620
Epoch: 7200, Training Loss: 0.01858
Epoch: 7200, Training Loss: 0.01701
Epoch: 7200, Training Loss: 0.02126
Epoch: 7200, Training Loss: 0.01620
Epoch: 7201, Training Loss: 0.01858
Epoch: 7201, Training Loss: 0.01701
Epoch: 7201, Training Loss: 0.02126
Epoch: 7201, Training Loss: 0.01620
Epoch: 7202, Training Loss: 0.01858
Epoch: 7202, Training Loss: 0.01701
Epoch: 7202, Training Loss: 0.02126
Epoch: 7202, Training Loss: 0.01620
Epoch: 7203, Training Loss: 0.01858
Epoch: 7203, Training Loss: 0.01701
Epoch: 7203, Training Loss: 0.02126
Epoch: 7203, Training Loss: 0.01620
Epoch: 7204, Training Loss: 0.01858
Epoch: 7204, Training Loss: 0.01701
Epoch: 7204, Training Loss: 0.02126
Epoch: 7204, Training Loss: 0.01620
Epoch: 7205, Training Loss: 0.01858
Epoch: 7205, Training Loss: 0.01701
Epoch: 7205, Training Loss: 0.02125
Epoch: 7205, Training Loss: 0.01620
Epoch: 7206, Training Loss: 0.01858
Epoch: 7206, Training Loss: 0.01701
Epoch: 7206, Training Loss: 0.02125
Epoch: 7206, Training Loss: 0.01619
Epoch: 7207, Training Loss: 0.01857
Epoch: 7207, Training Loss: 0.01701
Epoch: 7207, Training Loss: 0.02125
Epoch: 7207, Training Loss: 0.01619
Epoch: 7208, Training Loss: 0.01857
Epoch: 7208, Training Loss: 0.01700
Epoch: 7208, Training Loss: 0.02125
Epoch: 7208, Training Loss: 0.01619
Epoch: 7209, Training Loss: 0.01857
Epoch: 7209, Training Loss: 0.01700
Epoch: 7209, Training Loss: 0.02125
Epoch: 7209, Training Loss: 0.01619
Epoch: 7210, Training Loss: 0.01857
Epoch: 7210, Training Loss: 0.01700
Epoch: 7210, Training Loss: 0.02125
Epoch: 7210, Training Loss: 0.01619
Epoch: 7211, Training Loss: 0.01857
Epoch: 7211, Training Loss: 0.01700
Epoch: 7211, Training Loss: 0.02124
Epoch: 7211, Training Loss: 0.01619
Epoch: 7212, Training Loss: 0.01857
Epoch: 7212, Training Loss: 0.01700
Epoch: 7212, Training Loss: 0.02124
Epoch: 7212, Training Loss: 0.01619
Epoch: 7213, Training Loss: 0.01856
Epoch: 7213, Training Loss: 0.01700
Epoch: 7213, Training Loss: 0.02124
Epoch: 7213, Training Loss: 0.01619
Epoch: 7214, Training Loss: 0.01856
Epoch: 7214, Training Loss: 0.01700
Epoch: 7214, Training Loss: 0.02124
Epoch: 7214, Training Loss: 0.01618
Epoch: 7215, Training Loss: 0.01856
Epoch: 7215, Training Loss: 0.01699
Epoch: 7215, Training Loss: 0.02124
Epoch: 7215, Training Loss: 0.01618
Epoch: 7216, Training Loss: 0.01856
Epoch: 7216, Training Loss: 0.01699
Epoch: 7216, Training Loss: 0.02124
Epoch: 7216, Training Loss: 0.01618
Epoch: 7217, Training Loss: 0.01856
Epoch: 7217, Training Loss: 0.01699
Epoch: 7217, Training Loss: 0.02123
Epoch: 7217, Training Loss: 0.01618
Epoch: 7218, Training Loss: 0.01856
Epoch: 7218, Training Loss: 0.01699
Epoch: 7218, Training Loss: 0.02123
Epoch: 7218, Training Loss: 0.01618
Epoch: 7219, Training Loss: 0.01856
Epoch: 7219, Training Loss: 0.01699
Epoch: 7219, Training Loss: 0.02123
Epoch: 7219, Training Loss: 0.01618
Epoch: 7220, Training Loss: 0.01855
Epoch: 7220, Training Loss: 0.01699
Epoch: 7220, Training Loss: 0.02123
Epoch: 7220, Training Loss: 0.01618
Epoch: 7221, Training Loss: 0.01855
Epoch: 7221, Training Loss: 0.01699
Epoch: 7221, Training Loss: 0.02123
Epoch: 7221, Training Loss: 0.01617
Epoch: 7222, Training Loss: 0.01855
Epoch: 7222, Training Loss: 0.01699
Epoch: 7222, Training Loss: 0.02123
Epoch: 7222, Training Loss: 0.01617
Epoch: 7223, Training Loss: 0.01855
Epoch: 7223, Training Loss: 0.01698
Epoch: 7223, Training Loss: 0.02122
Epoch: 7223, Training Loss: 0.01617
Epoch: 7224, Training Loss: 0.01855
Epoch: 7224, Training Loss: 0.01698
Epoch: 7224, Training Loss: 0.02122
Epoch: 7224, Training Loss: 0.01617
Epoch: 7225, Training Loss: 0.01855
Epoch: 7225, Training Loss: 0.01698
Epoch: 7225, Training Loss: 0.02122
Epoch: 7225, Training Loss: 0.01617
Epoch: 7226, Training Loss: 0.01854
Epoch: 7226, Training Loss: 0.01698
Epoch: 7226, Training Loss: 0.02122
Epoch: 7226, Training Loss: 0.01617
Epoch: 7227, Training Loss: 0.01854
Epoch: 7227, Training Loss: 0.01698
Epoch: 7227, Training Loss: 0.02122
Epoch: 7227, Training Loss: 0.01617
Epoch: 7228, Training Loss: 0.01854
Epoch: 7228, Training Loss: 0.01698
Epoch: 7228, Training Loss: 0.02121
Epoch: 7228, Training Loss: 0.01617
Epoch: 7229, Training Loss: 0.01854
Epoch: 7229, Training Loss: 0.01698
Epoch: 7229, Training Loss: 0.02121
Epoch: 7229, Training Loss: 0.01616
Epoch: 7230, Training Loss: 0.01854
Epoch: 7230, Training Loss: 0.01697
Epoch: 7230, Training Loss: 0.02121
Epoch: 7230, Training Loss: 0.01616
Epoch: 7231, Training Loss: 0.01854
Epoch: 7231, Training Loss: 0.01697
Epoch: 7231, Training Loss: 0.02121
Epoch: 7231, Training Loss: 0.01616
Epoch: 7232, Training Loss: 0.01854
Epoch: 7232, Training Loss: 0.01697
Epoch: 7232, Training Loss: 0.02121
Epoch: 7232, Training Loss: 0.01616
Epoch: 7233, Training Loss: 0.01853
Epoch: 7233, Training Loss: 0.01697
Epoch: 7233, Training Loss: 0.02121
Epoch: 7233, Training Loss: 0.01616
Epoch: 7234, Training Loss: 0.01853
Epoch: 7234, Training Loss: 0.01697
Epoch: 7234, Training Loss: 0.02120
Epoch: 7234, Training Loss: 0.01616
Epoch: 7235, Training Loss: 0.01853
Epoch: 7235, Training Loss: 0.01697
Epoch: 7235, Training Loss: 0.02120
Epoch: 7235, Training Loss: 0.01616
Epoch: 7236, Training Loss: 0.01853
Epoch: 7236, Training Loss: 0.01697
Epoch: 7236, Training Loss: 0.02120
Epoch: 7236, Training Loss: 0.01616
Epoch: 7237, Training Loss: 0.01853
Epoch: 7237, Training Loss: 0.01697
Epoch: 7237, Training Loss: 0.02120
Epoch: 7237, Training Loss: 0.01615
Epoch: 7238, Training Loss: 0.01853
Epoch: 7238, Training Loss: 0.01696
Epoch: 7238, Training Loss: 0.02120
Epoch: 7238, Training Loss: 0.01615
Epoch: 7239, Training Loss: 0.01853
Epoch: 7239, Training Loss: 0.01696
Epoch: 7239, Training Loss: 0.02120
Epoch: 7239, Training Loss: 0.01615
Epoch: 7240, Training Loss: 0.01852
Epoch: 7240, Training Loss: 0.01696
Epoch: 7240, Training Loss: 0.02119
Epoch: 7240, Training Loss: 0.01615
Epoch: 7241, Training Loss: 0.01852
Epoch: 7241, Training Loss: 0.01696
Epoch: 7241, Training Loss: 0.02119
Epoch: 7241, Training Loss: 0.01615
Epoch: 7242, Training Loss: 0.01852
Epoch: 7242, Training Loss: 0.01696
Epoch: 7242, Training Loss: 0.02119
Epoch: 7242, Training Loss: 0.01615
Epoch: 7243, Training Loss: 0.01852
Epoch: 7243, Training Loss: 0.01696
Epoch: 7243, Training Loss: 0.02119
Epoch: 7243, Training Loss: 0.01615
Epoch: 7244, Training Loss: 0.01852
Epoch: 7244, Training Loss: 0.01696
Epoch: 7244, Training Loss: 0.02119
Epoch: 7244, Training Loss: 0.01615
Epoch: 7245, Training Loss: 0.01852
Epoch: 7245, Training Loss: 0.01696
Epoch: 7245, Training Loss: 0.02119
Epoch: 7245, Training Loss: 0.01614
Epoch: 7246, Training Loss: 0.01851
Epoch: 7246, Training Loss: 0.01695
Epoch: 7246, Training Loss: 0.02118
Epoch: 7246, Training Loss: 0.01614
Epoch: 7247, Training Loss: 0.01851
Epoch: 7247, Training Loss: 0.01695
Epoch: 7247, Training Loss: 0.02118
Epoch: 7247, Training Loss: 0.01614
Epoch: 7248, Training Loss: 0.01851
Epoch: 7248, Training Loss: 0.01695
Epoch: 7248, Training Loss: 0.02118
Epoch: 7248, Training Loss: 0.01614
Epoch: 7249, Training Loss: 0.01851
Epoch: 7249, Training Loss: 0.01695
Epoch: 7249, Training Loss: 0.02118
Epoch: 7249, Training Loss: 0.01614
Epoch: 7250, Training Loss: 0.01851
Epoch: 7250, Training Loss: 0.01695
Epoch: 7250, Training Loss: 0.02118
Epoch: 7250, Training Loss: 0.01614
Epoch: 7251, Training Loss: 0.01851
Epoch: 7251, Training Loss: 0.01695
Epoch: 7251, Training Loss: 0.02117
Epoch: 7251, Training Loss: 0.01614
Epoch: 7252, Training Loss: 0.01851
Epoch: 7252, Training Loss: 0.01695
Epoch: 7252, Training Loss: 0.02117
Epoch: 7252, Training Loss: 0.01613
Epoch: 7253, Training Loss: 0.01850
Epoch: 7253, Training Loss: 0.01694
Epoch: 7253, Training Loss: 0.02117
Epoch: 7253, Training Loss: 0.01613
Epoch: 7254, Training Loss: 0.01850
Epoch: 7254, Training Loss: 0.01694
Epoch: 7254, Training Loss: 0.02117
Epoch: 7254, Training Loss: 0.01613
Epoch: 7255, Training Loss: 0.01850
Epoch: 7255, Training Loss: 0.01694
Epoch: 7255, Training Loss: 0.02117
Epoch: 7255, Training Loss: 0.01613
Epoch: 7256, Training Loss: 0.01850
Epoch: 7256, Training Loss: 0.01694
Epoch: 7256, Training Loss: 0.02117
Epoch: 7256, Training Loss: 0.01613
Epoch: 7257, Training Loss: 0.01850
Epoch: 7257, Training Loss: 0.01694
Epoch: 7257, Training Loss: 0.02116
Epoch: 7257, Training Loss: 0.01613
Epoch: 7258, Training Loss: 0.01850
Epoch: 7258, Training Loss: 0.01694
Epoch: 7258, Training Loss: 0.02116
Epoch: 7258, Training Loss: 0.01613
Epoch: 7259, Training Loss: 0.01850
Epoch: 7259, Training Loss: 0.01694
Epoch: 7259, Training Loss: 0.02116
Epoch: 7259, Training Loss: 0.01613
Epoch: 7260, Training Loss: 0.01849
Epoch: 7260, Training Loss: 0.01694
Epoch: 7260, Training Loss: 0.02116
Epoch: 7260, Training Loss: 0.01612
Epoch: 7261, Training Loss: 0.01849
Epoch: 7261, Training Loss: 0.01693
Epoch: 7261, Training Loss: 0.02116
Epoch: 7261, Training Loss: 0.01612
Epoch: 7262, Training Loss: 0.01849
Epoch: 7262, Training Loss: 0.01693
Epoch: 7262, Training Loss: 0.02116
Epoch: 7262, Training Loss: 0.01612
Epoch: 7263, Training Loss: 0.01849
Epoch: 7263, Training Loss: 0.01693
Epoch: 7263, Training Loss: 0.02115
Epoch: 7263, Training Loss: 0.01612
Epoch: 7264, Training Loss: 0.01849
Epoch: 7264, Training Loss: 0.01693
Epoch: 7264, Training Loss: 0.02115
Epoch: 7264, Training Loss: 0.01612
Epoch: 7265, Training Loss: 0.01849
Epoch: 7265, Training Loss: 0.01693
Epoch: 7265, Training Loss: 0.02115
Epoch: 7265, Training Loss: 0.01612
Epoch: 7266, Training Loss: 0.01848
Epoch: 7266, Training Loss: 0.01693
Epoch: 7266, Training Loss: 0.02115
Epoch: 7266, Training Loss: 0.01612
Epoch: 7267, Training Loss: 0.01848
Epoch: 7267, Training Loss: 0.01693
Epoch: 7267, Training Loss: 0.02115
Epoch: 7267, Training Loss: 0.01612
Epoch: 7268, Training Loss: 0.01848
Epoch: 7268, Training Loss: 0.01692
Epoch: 7268, Training Loss: 0.02115
Epoch: 7268, Training Loss: 0.01611
Epoch: 7269, Training Loss: 0.01848
Epoch: 7269, Training Loss: 0.01692
Epoch: 7269, Training Loss: 0.02114
Epoch: 7269, Training Loss: 0.01611
Epoch: 7270, Training Loss: 0.01848
Epoch: 7270, Training Loss: 0.01692
Epoch: 7270, Training Loss: 0.02114
Epoch: 7270, Training Loss: 0.01611
Epoch: 7271, Training Loss: 0.01848
Epoch: 7271, Training Loss: 0.01692
Epoch: 7271, Training Loss: 0.02114
Epoch: 7271, Training Loss: 0.01611
Epoch: 7272, Training Loss: 0.01848
Epoch: 7272, Training Loss: 0.01692
Epoch: 7272, Training Loss: 0.02114
Epoch: 7272, Training Loss: 0.01611
Epoch: 7273, Training Loss: 0.01847
Epoch: 7273, Training Loss: 0.01692
Epoch: 7273, Training Loss: 0.02114
Epoch: 7273, Training Loss: 0.01611
Epoch: 7274, Training Loss: 0.01847
Epoch: 7274, Training Loss: 0.01692
Epoch: 7274, Training Loss: 0.02113
Epoch: 7274, Training Loss: 0.01611
Epoch: 7275, Training Loss: 0.01847
Epoch: 7275, Training Loss: 0.01692
Epoch: 7275, Training Loss: 0.02113
Epoch: 7275, Training Loss: 0.01611
Epoch: 7276, Training Loss: 0.01847
Epoch: 7276, Training Loss: 0.01691
Epoch: 7276, Training Loss: 0.02113
Epoch: 7276, Training Loss: 0.01610
Epoch: 7277, Training Loss: 0.01847
Epoch: 7277, Training Loss: 0.01691
Epoch: 7277, Training Loss: 0.02113
Epoch: 7277, Training Loss: 0.01610
Epoch: 7278, Training Loss: 0.01847
Epoch: 7278, Training Loss: 0.01691
Epoch: 7278, Training Loss: 0.02113
Epoch: 7278, Training Loss: 0.01610
Epoch: 7279, Training Loss: 0.01847
Epoch: 7279, Training Loss: 0.01691
Epoch: 7279, Training Loss: 0.02113
Epoch: 7279, Training Loss: 0.01610
Epoch: 7280, Training Loss: 0.01846
Epoch: 7280, Training Loss: 0.01691
Epoch: 7280, Training Loss: 0.02112
Epoch: 7280, Training Loss: 0.01610
Epoch: 7281, Training Loss: 0.01846
Epoch: 7281, Training Loss: 0.01691
Epoch: 7281, Training Loss: 0.02112
Epoch: 7281, Training Loss: 0.01610
Epoch: 7282, Training Loss: 0.01846
Epoch: 7282, Training Loss: 0.01691
Epoch: 7282, Training Loss: 0.02112
Epoch: 7282, Training Loss: 0.01610
Epoch: 7283, Training Loss: 0.01846
Epoch: 7283, Training Loss: 0.01690
Epoch: 7283, Training Loss: 0.02112
Epoch: 7283, Training Loss: 0.01610
Epoch: 7284, Training Loss: 0.01846
Epoch: 7284, Training Loss: 0.01690
Epoch: 7284, Training Loss: 0.02112
Epoch: 7284, Training Loss: 0.01609
Epoch: 7285, Training Loss: 0.01846
Epoch: 7285, Training Loss: 0.01690
Epoch: 7285, Training Loss: 0.02112
Epoch: 7285, Training Loss: 0.01609
Epoch: 7286, Training Loss: 0.01845
Epoch: 7286, Training Loss: 0.01690
Epoch: 7286, Training Loss: 0.02111
Epoch: 7286, Training Loss: 0.01609
Epoch: 7287, Training Loss: 0.01845
Epoch: 7287, Training Loss: 0.01690
Epoch: 7287, Training Loss: 0.02111
Epoch: 7287, Training Loss: 0.01609
Epoch: 7288, Training Loss: 0.01845
Epoch: 7288, Training Loss: 0.01690
Epoch: 7288, Training Loss: 0.02111
Epoch: 7288, Training Loss: 0.01609
Epoch: 7289, Training Loss: 0.01845
Epoch: 7289, Training Loss: 0.01690
Epoch: 7289, Training Loss: 0.02111
Epoch: 7289, Training Loss: 0.01609
Epoch: 7290, Training Loss: 0.01845
Epoch: 7290, Training Loss: 0.01690
Epoch: 7290, Training Loss: 0.02111
Epoch: 7290, Training Loss: 0.01609
Epoch: 7291, Training Loss: 0.01845
Epoch: 7291, Training Loss: 0.01689
Epoch: 7291, Training Loss: 0.02111
Epoch: 7291, Training Loss: 0.01609
Epoch: 7292, Training Loss: 0.01845
Epoch: 7292, Training Loss: 0.01689
Epoch: 7292, Training Loss: 0.02110
Epoch: 7292, Training Loss: 0.01608
Epoch: 7293, Training Loss: 0.01844
Epoch: 7293, Training Loss: 0.01689
Epoch: 7293, Training Loss: 0.02110
Epoch: 7293, Training Loss: 0.01608
Epoch: 7294, Training Loss: 0.01844
Epoch: 7294, Training Loss: 0.01689
Epoch: 7294, Training Loss: 0.02110
Epoch: 7294, Training Loss: 0.01608
Epoch: 7295, Training Loss: 0.01844
Epoch: 7295, Training Loss: 0.01689
Epoch: 7295, Training Loss: 0.02110
Epoch: 7295, Training Loss: 0.01608
Epoch: 7296, Training Loss: 0.01844
Epoch: 7296, Training Loss: 0.01689
Epoch: 7296, Training Loss: 0.02110
Epoch: 7296, Training Loss: 0.01608
Epoch: 7297, Training Loss: 0.01844
Epoch: 7297, Training Loss: 0.01689
Epoch: 7297, Training Loss: 0.02110
Epoch: 7297, Training Loss: 0.01608
Epoch: 7298, Training Loss: 0.01844
Epoch: 7298, Training Loss: 0.01689
Epoch: 7298, Training Loss: 0.02109
Epoch: 7298, Training Loss: 0.01608
Epoch: 7299, Training Loss: 0.01844
Epoch: 7299, Training Loss: 0.01688
Epoch: 7299, Training Loss: 0.02109
Epoch: 7299, Training Loss: 0.01608
Epoch: 7300, Training Loss: 0.01843
Epoch: 7300, Training Loss: 0.01688
Epoch: 7300, Training Loss: 0.02109
Epoch: 7300, Training Loss: 0.01607
Epoch: 7301, Training Loss: 0.01843
Epoch: 7301, Training Loss: 0.01688
Epoch: 7301, Training Loss: 0.02109
Epoch: 7301, Training Loss: 0.01607
Epoch: 7302, Training Loss: 0.01843
Epoch: 7302, Training Loss: 0.01688
Epoch: 7302, Training Loss: 0.02109
Epoch: 7302, Training Loss: 0.01607
Epoch: 7303, Training Loss: 0.01843
Epoch: 7303, Training Loss: 0.01688
Epoch: 7303, Training Loss: 0.02109
Epoch: 7303, Training Loss: 0.01607
Epoch: 7304, Training Loss: 0.01843
Epoch: 7304, Training Loss: 0.01688
Epoch: 7304, Training Loss: 0.02108
Epoch: 7304, Training Loss: 0.01607
Epoch: 7305, Training Loss: 0.01843
Epoch: 7305, Training Loss: 0.01688
Epoch: 7305, Training Loss: 0.02108
Epoch: 7305, Training Loss: 0.01607
Epoch: 7306, Training Loss: 0.01843
Epoch: 7306, Training Loss: 0.01687
Epoch: 7306, Training Loss: 0.02108
Epoch: 7306, Training Loss: 0.01607
Epoch: 7307, Training Loss: 0.01842
Epoch: 7307, Training Loss: 0.01687
Epoch: 7307, Training Loss: 0.02108
Epoch: 7307, Training Loss: 0.01607
Epoch: 7308, Training Loss: 0.01842
Epoch: 7308, Training Loss: 0.01687
Epoch: 7308, Training Loss: 0.02108
Epoch: 7308, Training Loss: 0.01606
Epoch: 7309, Training Loss: 0.01842
Epoch: 7309, Training Loss: 0.01687
Epoch: 7309, Training Loss: 0.02107
Epoch: 7309, Training Loss: 0.01606
Epoch: 7310, Training Loss: 0.01842
Epoch: 7310, Training Loss: 0.01687
Epoch: 7310, Training Loss: 0.02107
Epoch: 7310, Training Loss: 0.01606
Epoch: 7311, Training Loss: 0.01842
Epoch: 7311, Training Loss: 0.01687
Epoch: 7311, Training Loss: 0.02107
Epoch: 7311, Training Loss: 0.01606
Epoch: 7312, Training Loss: 0.01842
Epoch: 7312, Training Loss: 0.01687
Epoch: 7312, Training Loss: 0.02107
Epoch: 7312, Training Loss: 0.01606
Epoch: 7313, Training Loss: 0.01841
Epoch: 7313, Training Loss: 0.01687
Epoch: 7313, Training Loss: 0.02107
Epoch: 7313, Training Loss: 0.01606
Epoch: 7314, Training Loss: 0.01841
Epoch: 7314, Training Loss: 0.01686
Epoch: 7314, Training Loss: 0.02107
Epoch: 7314, Training Loss: 0.01606
Epoch: 7315, Training Loss: 0.01841
Epoch: 7315, Training Loss: 0.01686
Epoch: 7315, Training Loss: 0.02106
Epoch: 7315, Training Loss: 0.01605
Epoch: 7316, Training Loss: 0.01841
Epoch: 7316, Training Loss: 0.01686
Epoch: 7316, Training Loss: 0.02106
Epoch: 7316, Training Loss: 0.01605
Epoch: 7317, Training Loss: 0.01841
Epoch: 7317, Training Loss: 0.01686
Epoch: 7317, Training Loss: 0.02106
Epoch: 7317, Training Loss: 0.01605
Epoch: 7318, Training Loss: 0.01841
Epoch: 7318, Training Loss: 0.01686
Epoch: 7318, Training Loss: 0.02106
Epoch: 7318, Training Loss: 0.01605
Epoch: 7319, Training Loss: 0.01841
Epoch: 7319, Training Loss: 0.01686
Epoch: 7319, Training Loss: 0.02106
Epoch: 7319, Training Loss: 0.01605
Epoch: 7320, Training Loss: 0.01840
Epoch: 7320, Training Loss: 0.01686
Epoch: 7320, Training Loss: 0.02106
Epoch: 7320, Training Loss: 0.01605
Epoch: 7321, Training Loss: 0.01840
Epoch: 7321, Training Loss: 0.01686
Epoch: 7321, Training Loss: 0.02105
Epoch: 7321, Training Loss: 0.01605
... (one loss line per XOR pattern per epoch; output truncated, losses decrease slowly toward the 0.008 target) ...
Epoch: 7570, Training Loss: 0.01805
Epoch: 7570, Training Loss: 0.01654
Epoch: 7570, Training Loss: 0.02064
Epoch: 7570, Training Loss: 0.01574
Epoch: 7571, Training Loss: 0.01804
Epoch: 7571, Training Loss: 0.01654
Epoch: 7571, Training Loss: 0.02064
Epoch: 7571, Training Loss: 0.01574
Epoch: 7572, Training Loss: 0.01804
Epoch: 7572, Training Loss: 0.01654
Epoch: 7572, Training Loss: 0.02064
Epoch: 7572, Training Loss: 0.01574
Epoch: 7573, Training Loss: 0.01804
Epoch: 7573, Training Loss: 0.01654
Epoch: 7573, Training Loss: 0.02064
Epoch: 7573, Training Loss: 0.01574
Epoch: 7574, Training Loss: 0.01804
Epoch: 7574, Training Loss: 0.01654
Epoch: 7574, Training Loss: 0.02064
Epoch: 7574, Training Loss: 0.01574
Epoch: 7575, Training Loss: 0.01804
Epoch: 7575, Training Loss: 0.01653
Epoch: 7575, Training Loss: 0.02063
Epoch: 7575, Training Loss: 0.01574
Epoch: 7576, Training Loss: 0.01804
Epoch: 7576, Training Loss: 0.01653
Epoch: 7576, Training Loss: 0.02063
Epoch: 7576, Training Loss: 0.01574
Epoch: 7577, Training Loss: 0.01804
Epoch: 7577, Training Loss: 0.01653
Epoch: 7577, Training Loss: 0.02063
Epoch: 7577, Training Loss: 0.01573
Epoch: 7578, Training Loss: 0.01804
Epoch: 7578, Training Loss: 0.01653
Epoch: 7578, Training Loss: 0.02063
Epoch: 7578, Training Loss: 0.01573
Epoch: 7579, Training Loss: 0.01803
Epoch: 7579, Training Loss: 0.01653
Epoch: 7579, Training Loss: 0.02063
Epoch: 7579, Training Loss: 0.01573
Epoch: 7580, Training Loss: 0.01803
Epoch: 7580, Training Loss: 0.01653
Epoch: 7580, Training Loss: 0.02063
Epoch: 7580, Training Loss: 0.01573
Epoch: 7581, Training Loss: 0.01803
Epoch: 7581, Training Loss: 0.01653
Epoch: 7581, Training Loss: 0.02062
Epoch: 7581, Training Loss: 0.01573
Epoch: 7582, Training Loss: 0.01803
Epoch: 7582, Training Loss: 0.01653
Epoch: 7582, Training Loss: 0.02062
Epoch: 7582, Training Loss: 0.01573
Epoch: 7583, Training Loss: 0.01803
Epoch: 7583, Training Loss: 0.01652
Epoch: 7583, Training Loss: 0.02062
Epoch: 7583, Training Loss: 0.01573
Epoch: 7584, Training Loss: 0.01803
Epoch: 7584, Training Loss: 0.01652
Epoch: 7584, Training Loss: 0.02062
Epoch: 7584, Training Loss: 0.01573
Epoch: 7585, Training Loss: 0.01803
Epoch: 7585, Training Loss: 0.01652
Epoch: 7585, Training Loss: 0.02062
Epoch: 7585, Training Loss: 0.01573
Epoch: 7586, Training Loss: 0.01802
Epoch: 7586, Training Loss: 0.01652
Epoch: 7586, Training Loss: 0.02062
Epoch: 7586, Training Loss: 0.01572
Epoch: 7587, Training Loss: 0.01802
Epoch: 7587, Training Loss: 0.01652
Epoch: 7587, Training Loss: 0.02061
Epoch: 7587, Training Loss: 0.01572
Epoch: 7588, Training Loss: 0.01802
Epoch: 7588, Training Loss: 0.01652
Epoch: 7588, Training Loss: 0.02061
Epoch: 7588, Training Loss: 0.01572
Epoch: 7589, Training Loss: 0.01802
Epoch: 7589, Training Loss: 0.01652
Epoch: 7589, Training Loss: 0.02061
Epoch: 7589, Training Loss: 0.01572
Epoch: 7590, Training Loss: 0.01802
Epoch: 7590, Training Loss: 0.01652
Epoch: 7590, Training Loss: 0.02061
Epoch: 7590, Training Loss: 0.01572
Epoch: 7591, Training Loss: 0.01802
Epoch: 7591, Training Loss: 0.01651
Epoch: 7591, Training Loss: 0.02061
Epoch: 7591, Training Loss: 0.01572
Epoch: 7592, Training Loss: 0.01802
Epoch: 7592, Training Loss: 0.01651
Epoch: 7592, Training Loss: 0.02061
Epoch: 7592, Training Loss: 0.01572
Epoch: 7593, Training Loss: 0.01801
Epoch: 7593, Training Loss: 0.01651
Epoch: 7593, Training Loss: 0.02061
Epoch: 7593, Training Loss: 0.01572
Epoch: 7594, Training Loss: 0.01801
Epoch: 7594, Training Loss: 0.01651
Epoch: 7594, Training Loss: 0.02060
Epoch: 7594, Training Loss: 0.01571
Epoch: 7595, Training Loss: 0.01801
Epoch: 7595, Training Loss: 0.01651
Epoch: 7595, Training Loss: 0.02060
Epoch: 7595, Training Loss: 0.01571
Epoch: 7596, Training Loss: 0.01801
Epoch: 7596, Training Loss: 0.01651
Epoch: 7596, Training Loss: 0.02060
Epoch: 7596, Training Loss: 0.01571
Epoch: 7597, Training Loss: 0.01801
Epoch: 7597, Training Loss: 0.01651
Epoch: 7597, Training Loss: 0.02060
Epoch: 7597, Training Loss: 0.01571
Epoch: 7598, Training Loss: 0.01801
Epoch: 7598, Training Loss: 0.01651
Epoch: 7598, Training Loss: 0.02060
Epoch: 7598, Training Loss: 0.01571
Epoch: 7599, Training Loss: 0.01801
Epoch: 7599, Training Loss: 0.01650
Epoch: 7599, Training Loss: 0.02060
Epoch: 7599, Training Loss: 0.01571
Epoch: 7600, Training Loss: 0.01800
Epoch: 7600, Training Loss: 0.01650
Epoch: 7600, Training Loss: 0.02059
Epoch: 7600, Training Loss: 0.01571
Epoch: 7601, Training Loss: 0.01800
Epoch: 7601, Training Loss: 0.01650
Epoch: 7601, Training Loss: 0.02059
Epoch: 7601, Training Loss: 0.01571
Epoch: 7602, Training Loss: 0.01800
Epoch: 7602, Training Loss: 0.01650
Epoch: 7602, Training Loss: 0.02059
Epoch: 7602, Training Loss: 0.01570
Epoch: 7603, Training Loss: 0.01800
Epoch: 7603, Training Loss: 0.01650
Epoch: 7603, Training Loss: 0.02059
Epoch: 7603, Training Loss: 0.01570
Epoch: 7604, Training Loss: 0.01800
Epoch: 7604, Training Loss: 0.01650
Epoch: 7604, Training Loss: 0.02059
Epoch: 7604, Training Loss: 0.01570
Epoch: 7605, Training Loss: 0.01800
Epoch: 7605, Training Loss: 0.01650
Epoch: 7605, Training Loss: 0.02059
Epoch: 7605, Training Loss: 0.01570
Epoch: 7606, Training Loss: 0.01800
Epoch: 7606, Training Loss: 0.01650
Epoch: 7606, Training Loss: 0.02058
Epoch: 7606, Training Loss: 0.01570
Epoch: 7607, Training Loss: 0.01799
Epoch: 7607, Training Loss: 0.01649
Epoch: 7607, Training Loss: 0.02058
Epoch: 7607, Training Loss: 0.01570
Epoch: 7608, Training Loss: 0.01799
Epoch: 7608, Training Loss: 0.01649
Epoch: 7608, Training Loss: 0.02058
Epoch: 7608, Training Loss: 0.01570
Epoch: 7609, Training Loss: 0.01799
Epoch: 7609, Training Loss: 0.01649
Epoch: 7609, Training Loss: 0.02058
Epoch: 7609, Training Loss: 0.01570
Epoch: 7610, Training Loss: 0.01799
Epoch: 7610, Training Loss: 0.01649
Epoch: 7610, Training Loss: 0.02058
Epoch: 7610, Training Loss: 0.01570
Epoch: 7611, Training Loss: 0.01799
Epoch: 7611, Training Loss: 0.01649
Epoch: 7611, Training Loss: 0.02058
Epoch: 7611, Training Loss: 0.01569
Epoch: 7612, Training Loss: 0.01799
Epoch: 7612, Training Loss: 0.01649
Epoch: 7612, Training Loss: 0.02058
Epoch: 7612, Training Loss: 0.01569
Epoch: 7613, Training Loss: 0.01799
Epoch: 7613, Training Loss: 0.01649
Epoch: 7613, Training Loss: 0.02057
Epoch: 7613, Training Loss: 0.01569
Epoch: 7614, Training Loss: 0.01799
Epoch: 7614, Training Loss: 0.01649
Epoch: 7614, Training Loss: 0.02057
Epoch: 7614, Training Loss: 0.01569
Epoch: 7615, Training Loss: 0.01798
Epoch: 7615, Training Loss: 0.01648
Epoch: 7615, Training Loss: 0.02057
Epoch: 7615, Training Loss: 0.01569
Epoch: 7616, Training Loss: 0.01798
Epoch: 7616, Training Loss: 0.01648
Epoch: 7616, Training Loss: 0.02057
Epoch: 7616, Training Loss: 0.01569
Epoch: 7617, Training Loss: 0.01798
Epoch: 7617, Training Loss: 0.01648
Epoch: 7617, Training Loss: 0.02057
Epoch: 7617, Training Loss: 0.01569
Epoch: 7618, Training Loss: 0.01798
Epoch: 7618, Training Loss: 0.01648
Epoch: 7618, Training Loss: 0.02057
Epoch: 7618, Training Loss: 0.01569
Epoch: 7619, Training Loss: 0.01798
Epoch: 7619, Training Loss: 0.01648
Epoch: 7619, Training Loss: 0.02056
Epoch: 7619, Training Loss: 0.01568
Epoch: 7620, Training Loss: 0.01798
Epoch: 7620, Training Loss: 0.01648
Epoch: 7620, Training Loss: 0.02056
Epoch: 7620, Training Loss: 0.01568
Epoch: 7621, Training Loss: 0.01798
Epoch: 7621, Training Loss: 0.01648
Epoch: 7621, Training Loss: 0.02056
Epoch: 7621, Training Loss: 0.01568
Epoch: 7622, Training Loss: 0.01797
Epoch: 7622, Training Loss: 0.01648
Epoch: 7622, Training Loss: 0.02056
Epoch: 7622, Training Loss: 0.01568
Epoch: 7623, Training Loss: 0.01797
Epoch: 7623, Training Loss: 0.01648
Epoch: 7623, Training Loss: 0.02056
Epoch: 7623, Training Loss: 0.01568
Epoch: 7624, Training Loss: 0.01797
Epoch: 7624, Training Loss: 0.01647
Epoch: 7624, Training Loss: 0.02056
Epoch: 7624, Training Loss: 0.01568
Epoch: 7625, Training Loss: 0.01797
Epoch: 7625, Training Loss: 0.01647
Epoch: 7625, Training Loss: 0.02055
Epoch: 7625, Training Loss: 0.01568
Epoch: 7626, Training Loss: 0.01797
Epoch: 7626, Training Loss: 0.01647
Epoch: 7626, Training Loss: 0.02055
Epoch: 7626, Training Loss: 0.01568
Epoch: 7627, Training Loss: 0.01797
Epoch: 7627, Training Loss: 0.01647
Epoch: 7627, Training Loss: 0.02055
Epoch: 7627, Training Loss: 0.01568
Epoch: 7628, Training Loss: 0.01797
Epoch: 7628, Training Loss: 0.01647
Epoch: 7628, Training Loss: 0.02055
Epoch: 7628, Training Loss: 0.01567
Epoch: 7629, Training Loss: 0.01796
Epoch: 7629, Training Loss: 0.01647
Epoch: 7629, Training Loss: 0.02055
Epoch: 7629, Training Loss: 0.01567
Epoch: 7630, Training Loss: 0.01796
Epoch: 7630, Training Loss: 0.01647
Epoch: 7630, Training Loss: 0.02055
Epoch: 7630, Training Loss: 0.01567
Epoch: 7631, Training Loss: 0.01796
Epoch: 7631, Training Loss: 0.01647
Epoch: 7631, Training Loss: 0.02054
Epoch: 7631, Training Loss: 0.01567
Epoch: 7632, Training Loss: 0.01796
Epoch: 7632, Training Loss: 0.01646
Epoch: 7632, Training Loss: 0.02054
Epoch: 7632, Training Loss: 0.01567
Epoch: 7633, Training Loss: 0.01796
Epoch: 7633, Training Loss: 0.01646
Epoch: 7633, Training Loss: 0.02054
Epoch: 7633, Training Loss: 0.01567
Epoch: 7634, Training Loss: 0.01796
Epoch: 7634, Training Loss: 0.01646
Epoch: 7634, Training Loss: 0.02054
Epoch: 7634, Training Loss: 0.01567
Epoch: 7635, Training Loss: 0.01796
Epoch: 7635, Training Loss: 0.01646
Epoch: 7635, Training Loss: 0.02054
Epoch: 7635, Training Loss: 0.01567
Epoch: 7636, Training Loss: 0.01795
Epoch: 7636, Training Loss: 0.01646
Epoch: 7636, Training Loss: 0.02054
Epoch: 7636, Training Loss: 0.01566
Epoch: 7637, Training Loss: 0.01795
Epoch: 7637, Training Loss: 0.01646
Epoch: 7637, Training Loss: 0.02054
Epoch: 7637, Training Loss: 0.01566
Epoch: 7638, Training Loss: 0.01795
Epoch: 7638, Training Loss: 0.01646
Epoch: 7638, Training Loss: 0.02053
Epoch: 7638, Training Loss: 0.01566
Epoch: 7639, Training Loss: 0.01795
Epoch: 7639, Training Loss: 0.01646
Epoch: 7639, Training Loss: 0.02053
Epoch: 7639, Training Loss: 0.01566
Epoch: 7640, Training Loss: 0.01795
Epoch: 7640, Training Loss: 0.01645
Epoch: 7640, Training Loss: 0.02053
Epoch: 7640, Training Loss: 0.01566
Epoch: 7641, Training Loss: 0.01795
Epoch: 7641, Training Loss: 0.01645
Epoch: 7641, Training Loss: 0.02053
Epoch: 7641, Training Loss: 0.01566
Epoch: 7642, Training Loss: 0.01795
Epoch: 7642, Training Loss: 0.01645
Epoch: 7642, Training Loss: 0.02053
Epoch: 7642, Training Loss: 0.01566
Epoch: 7643, Training Loss: 0.01795
Epoch: 7643, Training Loss: 0.01645
Epoch: 7643, Training Loss: 0.02053
Epoch: 7643, Training Loss: 0.01566
Epoch: 7644, Training Loss: 0.01794
Epoch: 7644, Training Loss: 0.01645
Epoch: 7644, Training Loss: 0.02052
Epoch: 7644, Training Loss: 0.01566
Epoch: 7645, Training Loss: 0.01794
Epoch: 7645, Training Loss: 0.01645
Epoch: 7645, Training Loss: 0.02052
Epoch: 7645, Training Loss: 0.01565
Epoch: 7646, Training Loss: 0.01794
Epoch: 7646, Training Loss: 0.01645
Epoch: 7646, Training Loss: 0.02052
Epoch: 7646, Training Loss: 0.01565
Epoch: 7647, Training Loss: 0.01794
Epoch: 7647, Training Loss: 0.01645
Epoch: 7647, Training Loss: 0.02052
Epoch: 7647, Training Loss: 0.01565
Epoch: 7648, Training Loss: 0.01794
Epoch: 7648, Training Loss: 0.01644
Epoch: 7648, Training Loss: 0.02052
Epoch: 7648, Training Loss: 0.01565
Epoch: 7649, Training Loss: 0.01794
Epoch: 7649, Training Loss: 0.01644
Epoch: 7649, Training Loss: 0.02052
Epoch: 7649, Training Loss: 0.01565
Epoch: 7650, Training Loss: 0.01794
Epoch: 7650, Training Loss: 0.01644
Epoch: 7650, Training Loss: 0.02051
Epoch: 7650, Training Loss: 0.01565
Epoch: 7651, Training Loss: 0.01793
Epoch: 7651, Training Loss: 0.01644
Epoch: 7651, Training Loss: 0.02051
Epoch: 7651, Training Loss: 0.01565
Epoch: 7652, Training Loss: 0.01793
Epoch: 7652, Training Loss: 0.01644
Epoch: 7652, Training Loss: 0.02051
Epoch: 7652, Training Loss: 0.01565
Epoch: 7653, Training Loss: 0.01793
Epoch: 7653, Training Loss: 0.01644
Epoch: 7653, Training Loss: 0.02051
Epoch: 7653, Training Loss: 0.01565
Epoch: 7654, Training Loss: 0.01793
Epoch: 7654, Training Loss: 0.01644
Epoch: 7654, Training Loss: 0.02051
Epoch: 7654, Training Loss: 0.01564
Epoch: 7655, Training Loss: 0.01793
Epoch: 7655, Training Loss: 0.01644
Epoch: 7655, Training Loss: 0.02051
Epoch: 7655, Training Loss: 0.01564
Epoch: 7656, Training Loss: 0.01793
Epoch: 7656, Training Loss: 0.01644
Epoch: 7656, Training Loss: 0.02051
Epoch: 7656, Training Loss: 0.01564
Epoch: 7657, Training Loss: 0.01793
Epoch: 7657, Training Loss: 0.01643
Epoch: 7657, Training Loss: 0.02050
Epoch: 7657, Training Loss: 0.01564
Epoch: 7658, Training Loss: 0.01792
Epoch: 7658, Training Loss: 0.01643
Epoch: 7658, Training Loss: 0.02050
Epoch: 7658, Training Loss: 0.01564
Epoch: 7659, Training Loss: 0.01792
Epoch: 7659, Training Loss: 0.01643
Epoch: 7659, Training Loss: 0.02050
Epoch: 7659, Training Loss: 0.01564
Epoch: 7660, Training Loss: 0.01792
Epoch: 7660, Training Loss: 0.01643
Epoch: 7660, Training Loss: 0.02050
Epoch: 7660, Training Loss: 0.01564
Epoch: 7661, Training Loss: 0.01792
Epoch: 7661, Training Loss: 0.01643
Epoch: 7661, Training Loss: 0.02050
Epoch: 7661, Training Loss: 0.01564
Epoch: 7662, Training Loss: 0.01792
Epoch: 7662, Training Loss: 0.01643
Epoch: 7662, Training Loss: 0.02050
Epoch: 7662, Training Loss: 0.01563
Epoch: 7663, Training Loss: 0.01792
Epoch: 7663, Training Loss: 0.01643
Epoch: 7663, Training Loss: 0.02049
Epoch: 7663, Training Loss: 0.01563
Epoch: 7664, Training Loss: 0.01792
Epoch: 7664, Training Loss: 0.01643
Epoch: 7664, Training Loss: 0.02049
Epoch: 7664, Training Loss: 0.01563
Epoch: 7665, Training Loss: 0.01792
Epoch: 7665, Training Loss: 0.01642
Epoch: 7665, Training Loss: 0.02049
Epoch: 7665, Training Loss: 0.01563
Epoch: 7666, Training Loss: 0.01791
Epoch: 7666, Training Loss: 0.01642
Epoch: 7666, Training Loss: 0.02049
Epoch: 7666, Training Loss: 0.01563
Epoch: 7667, Training Loss: 0.01791
Epoch: 7667, Training Loss: 0.01642
Epoch: 7667, Training Loss: 0.02049
Epoch: 7667, Training Loss: 0.01563
Epoch: 7668, Training Loss: 0.01791
Epoch: 7668, Training Loss: 0.01642
Epoch: 7668, Training Loss: 0.02049
Epoch: 7668, Training Loss: 0.01563
Epoch: 7669, Training Loss: 0.01791
Epoch: 7669, Training Loss: 0.01642
Epoch: 7669, Training Loss: 0.02048
Epoch: 7669, Training Loss: 0.01563
Epoch: 7670, Training Loss: 0.01791
Epoch: 7670, Training Loss: 0.01642
Epoch: 7670, Training Loss: 0.02048
Epoch: 7670, Training Loss: 0.01563
Epoch: 7671, Training Loss: 0.01791
Epoch: 7671, Training Loss: 0.01642
Epoch: 7671, Training Loss: 0.02048
Epoch: 7671, Training Loss: 0.01562
Epoch: 7672, Training Loss: 0.01791
Epoch: 7672, Training Loss: 0.01642
Epoch: 7672, Training Loss: 0.02048
Epoch: 7672, Training Loss: 0.01562
Epoch: 7673, Training Loss: 0.01790
Epoch: 7673, Training Loss: 0.01641
Epoch: 7673, Training Loss: 0.02048
Epoch: 7673, Training Loss: 0.01562
Epoch: 7674, Training Loss: 0.01790
Epoch: 7674, Training Loss: 0.01641
Epoch: 7674, Training Loss: 0.02048
Epoch: 7674, Training Loss: 0.01562
Epoch: 7675, Training Loss: 0.01790
Epoch: 7675, Training Loss: 0.01641
Epoch: 7675, Training Loss: 0.02048
Epoch: 7675, Training Loss: 0.01562
Epoch: 7676, Training Loss: 0.01790
Epoch: 7676, Training Loss: 0.01641
Epoch: 7676, Training Loss: 0.02047
Epoch: 7676, Training Loss: 0.01562
Epoch: 7677, Training Loss: 0.01790
Epoch: 7677, Training Loss: 0.01641
Epoch: 7677, Training Loss: 0.02047
Epoch: 7677, Training Loss: 0.01562
Epoch: 7678, Training Loss: 0.01790
Epoch: 7678, Training Loss: 0.01641
Epoch: 7678, Training Loss: 0.02047
Epoch: 7678, Training Loss: 0.01562
Epoch: 7679, Training Loss: 0.01790
Epoch: 7679, Training Loss: 0.01641
Epoch: 7679, Training Loss: 0.02047
Epoch: 7679, Training Loss: 0.01561
Epoch: 7680, Training Loss: 0.01789
Epoch: 7680, Training Loss: 0.01641
Epoch: 7680, Training Loss: 0.02047
Epoch: 7680, Training Loss: 0.01561
Epoch: 7681, Training Loss: 0.01789
Epoch: 7681, Training Loss: 0.01640
Epoch: 7681, Training Loss: 0.02047
Epoch: 7681, Training Loss: 0.01561
Epoch: 7682, Training Loss: 0.01789
Epoch: 7682, Training Loss: 0.01640
Epoch: 7682, Training Loss: 0.02046
Epoch: 7682, Training Loss: 0.01561
Epoch: 7683, Training Loss: 0.01789
Epoch: 7683, Training Loss: 0.01640
Epoch: 7683, Training Loss: 0.02046
Epoch: 7683, Training Loss: 0.01561
Epoch: 7684, Training Loss: 0.01789
Epoch: 7684, Training Loss: 0.01640
Epoch: 7684, Training Loss: 0.02046
Epoch: 7684, Training Loss: 0.01561
Epoch: 7685, Training Loss: 0.01789
Epoch: 7685, Training Loss: 0.01640
Epoch: 7685, Training Loss: 0.02046
Epoch: 7685, Training Loss: 0.01561
Epoch: 7686, Training Loss: 0.01789
Epoch: 7686, Training Loss: 0.01640
Epoch: 7686, Training Loss: 0.02046
Epoch: 7686, Training Loss: 0.01561
Epoch: 7687, Training Loss: 0.01789
Epoch: 7687, Training Loss: 0.01640
Epoch: 7687, Training Loss: 0.02046
Epoch: 7687, Training Loss: 0.01561
Epoch: 7688, Training Loss: 0.01788
Epoch: 7688, Training Loss: 0.01640
Epoch: 7688, Training Loss: 0.02045
Epoch: 7688, Training Loss: 0.01560
Epoch: 7689, Training Loss: 0.01788
Epoch: 7689, Training Loss: 0.01640
Epoch: 7689, Training Loss: 0.02045
Epoch: 7689, Training Loss: 0.01560
Epoch: 7690, Training Loss: 0.01788
Epoch: 7690, Training Loss: 0.01639
Epoch: 7690, Training Loss: 0.02045
Epoch: 7690, Training Loss: 0.01560
Epoch: 7691, Training Loss: 0.01788
Epoch: 7691, Training Loss: 0.01639
Epoch: 7691, Training Loss: 0.02045
Epoch: 7691, Training Loss: 0.01560
Epoch: 7692, Training Loss: 0.01788
Epoch: 7692, Training Loss: 0.01639
Epoch: 7692, Training Loss: 0.02045
Epoch: 7692, Training Loss: 0.01560
Epoch: 7693, Training Loss: 0.01788
Epoch: 7693, Training Loss: 0.01639
Epoch: 7693, Training Loss: 0.02045
Epoch: 7693, Training Loss: 0.01560
Epoch: 7694, Training Loss: 0.01788
Epoch: 7694, Training Loss: 0.01639
Epoch: 7694, Training Loss: 0.02045
Epoch: 7694, Training Loss: 0.01560
Epoch: 7695, Training Loss: 0.01787
Epoch: 7695, Training Loss: 0.01639
Epoch: 7695, Training Loss: 0.02044
Epoch: 7695, Training Loss: 0.01560
Epoch: 7696, Training Loss: 0.01787
Epoch: 7696, Training Loss: 0.01639
Epoch: 7696, Training Loss: 0.02044
Epoch: 7696, Training Loss: 0.01560
Epoch: 7697, Training Loss: 0.01787
Epoch: 7697, Training Loss: 0.01639
Epoch: 7697, Training Loss: 0.02044
Epoch: 7697, Training Loss: 0.01559
Epoch: 7698, Training Loss: 0.01787
Epoch: 7698, Training Loss: 0.01638
Epoch: 7698, Training Loss: 0.02044
Epoch: 7698, Training Loss: 0.01559
Epoch: 7699, Training Loss: 0.01787
Epoch: 7699, Training Loss: 0.01638
Epoch: 7699, Training Loss: 0.02044
Epoch: 7699, Training Loss: 0.01559
Epoch: 7700, Training Loss: 0.01787
Epoch: 7700, Training Loss: 0.01638
Epoch: 7700, Training Loss: 0.02044
Epoch: 7700, Training Loss: 0.01559
Epoch: 7701, Training Loss: 0.01787
Epoch: 7701, Training Loss: 0.01638
Epoch: 7701, Training Loss: 0.02043
Epoch: 7701, Training Loss: 0.01559
Epoch: 7702, Training Loss: 0.01786
Epoch: 7702, Training Loss: 0.01638
Epoch: 7702, Training Loss: 0.02043
Epoch: 7702, Training Loss: 0.01559
Epoch: 7703, Training Loss: 0.01786
Epoch: 7703, Training Loss: 0.01638
Epoch: 7703, Training Loss: 0.02043
Epoch: 7703, Training Loss: 0.01559
Epoch: 7704, Training Loss: 0.01786
Epoch: 7704, Training Loss: 0.01638
Epoch: 7704, Training Loss: 0.02043
Epoch: 7704, Training Loss: 0.01559
Epoch: 7705, Training Loss: 0.01786
Epoch: 7705, Training Loss: 0.01638
Epoch: 7705, Training Loss: 0.02043
Epoch: 7705, Training Loss: 0.01558
Epoch: 7706, Training Loss: 0.01786
Epoch: 7706, Training Loss: 0.01637
Epoch: 7706, Training Loss: 0.02043
Epoch: 7706, Training Loss: 0.01558
Epoch: 7707, Training Loss: 0.01786
Epoch: 7707, Training Loss: 0.01637
Epoch: 7707, Training Loss: 0.02043
Epoch: 7707, Training Loss: 0.01558
Epoch: 7708, Training Loss: 0.01786
Epoch: 7708, Training Loss: 0.01637
Epoch: 7708, Training Loss: 0.02042
Epoch: 7708, Training Loss: 0.01558
Epoch: 7709, Training Loss: 0.01786
Epoch: 7709, Training Loss: 0.01637
Epoch: 7709, Training Loss: 0.02042
Epoch: 7709, Training Loss: 0.01558
Epoch: 7710, Training Loss: 0.01785
Epoch: 7710, Training Loss: 0.01637
Epoch: 7710, Training Loss: 0.02042
Epoch: 7710, Training Loss: 0.01558
Epoch: 7711, Training Loss: 0.01785
Epoch: 7711, Training Loss: 0.01637
Epoch: 7711, Training Loss: 0.02042
Epoch: 7711, Training Loss: 0.01558
Epoch: 7712, Training Loss: 0.01785
Epoch: 7712, Training Loss: 0.01637
Epoch: 7712, Training Loss: 0.02042
Epoch: 7712, Training Loss: 0.01558
Epoch: 7713, Training Loss: 0.01785
Epoch: 7713, Training Loss: 0.01637
Epoch: 7713, Training Loss: 0.02042
Epoch: 7713, Training Loss: 0.01558
Epoch: 7714, Training Loss: 0.01785
Epoch: 7714, Training Loss: 0.01637
Epoch: 7714, Training Loss: 0.02041
Epoch: 7714, Training Loss: 0.01557
Epoch: 7715, Training Loss: 0.01785
Epoch: 7715, Training Loss: 0.01636
Epoch: 7715, Training Loss: 0.02041
Epoch: 7715, Training Loss: 0.01557
Epoch: 7716, Training Loss: 0.01785
Epoch: 7716, Training Loss: 0.01636
Epoch: 7716, Training Loss: 0.02041
Epoch: 7716, Training Loss: 0.01557
Epoch: 7717, Training Loss: 0.01784
Epoch: 7717, Training Loss: 0.01636
Epoch: 7717, Training Loss: 0.02041
Epoch: 7717, Training Loss: 0.01557
Epoch: 7718, Training Loss: 0.01784
Epoch: 7718, Training Loss: 0.01636
Epoch: 7718, Training Loss: 0.02041
Epoch: 7718, Training Loss: 0.01557
Epoch: 7719, Training Loss: 0.01784
Epoch: 7719, Training Loss: 0.01636
Epoch: 7719, Training Loss: 0.02041
Epoch: 7719, Training Loss: 0.01557
Epoch: 7720, Training Loss: 0.01784
Epoch: 7720, Training Loss: 0.01636
Epoch: 7720, Training Loss: 0.02041
Epoch: 7720, Training Loss: 0.01557
Epoch: 7721, Training Loss: 0.01784
Epoch: 7721, Training Loss: 0.01636
Epoch: 7721, Training Loss: 0.02040
Epoch: 7721, Training Loss: 0.01557
Epoch: 7722, Training Loss: 0.01784
Epoch: 7722, Training Loss: 0.01636
Epoch: 7722, Training Loss: 0.02040
Epoch: 7722, Training Loss: 0.01557
Epoch: 7723, Training Loss: 0.01784
Epoch: 7723, Training Loss: 0.01635
Epoch: 7723, Training Loss: 0.02040
Epoch: 7723, Training Loss: 0.01556
Epoch: 7724, Training Loss: 0.01784
Epoch: 7724, Training Loss: 0.01635
Epoch: 7724, Training Loss: 0.02040
Epoch: 7724, Training Loss: 0.01556
Epoch: 7725, Training Loss: 0.01783
Epoch: 7725, Training Loss: 0.01635
Epoch: 7725, Training Loss: 0.02040
Epoch: 7725, Training Loss: 0.01556
Epoch: 7726, Training Loss: 0.01783
Epoch: 7726, Training Loss: 0.01635
Epoch: 7726, Training Loss: 0.02040
Epoch: 7726, Training Loss: 0.01556
Epoch: 7727, Training Loss: 0.01783
Epoch: 7727, Training Loss: 0.01635
Epoch: 7727, Training Loss: 0.02039
Epoch: 7727, Training Loss: 0.01556
Epoch: 7728, Training Loss: 0.01783
Epoch: 7728, Training Loss: 0.01635
Epoch: 7728, Training Loss: 0.02039
Epoch: 7728, Training Loss: 0.01556
Epoch: 7729, Training Loss: 0.01783
Epoch: 7729, Training Loss: 0.01635
Epoch: 7729, Training Loss: 0.02039
Epoch: 7729, Training Loss: 0.01556
Epoch: 7730, Training Loss: 0.01783
Epoch: 7730, Training Loss: 0.01635
Epoch: 7730, Training Loss: 0.02039
Epoch: 7730, Training Loss: 0.01556
Epoch: 7731, Training Loss: 0.01783
Epoch: 7731, Training Loss: 0.01635
Epoch: 7731, Training Loss: 0.02039
Epoch: 7731, Training Loss: 0.01555
Epoch: 7732, Training Loss: 0.01782
Epoch: 7732, Training Loss: 0.01634
Epoch: 7732, Training Loss: 0.02039
Epoch: 7732, Training Loss: 0.01555
Epoch: 7733, Training Loss: 0.01782
Epoch: 7733, Training Loss: 0.01634
Epoch: 7733, Training Loss: 0.02038
Epoch: 7733, Training Loss: 0.01555
Epoch: 7734, Training Loss: 0.01782
Epoch: 7734, Training Loss: 0.01634
Epoch: 7734, Training Loss: 0.02038
Epoch: 7734, Training Loss: 0.01555
Epoch: 7735, Training Loss: 0.01782
Epoch: 7735, Training Loss: 0.01634
Epoch: 7735, Training Loss: 0.02038
Epoch: 7735, Training Loss: 0.01555
Epoch: 7736, Training Loss: 0.01782
Epoch: 7736, Training Loss: 0.01634
Epoch: 7736, Training Loss: 0.02038
Epoch: 7736, Training Loss: 0.01555
Epoch: 7737, Training Loss: 0.01782
Epoch: 7737, Training Loss: 0.01634
Epoch: 7737, Training Loss: 0.02038
Epoch: 7737, Training Loss: 0.01555
Epoch: 7738, Training Loss: 0.01782
Epoch: 7738, Training Loss: 0.01634
Epoch: 7738, Training Loss: 0.02038
Epoch: 7738, Training Loss: 0.01555
Epoch: 7739, Training Loss: 0.01782
Epoch: 7739, Training Loss: 0.01634
Epoch: 7739, Training Loss: 0.02038
Epoch: 7739, Training Loss: 0.01555
Epoch: 7740, Training Loss: 0.01781
Epoch: 7740, Training Loss: 0.01633
Epoch: 7740, Training Loss: 0.02037
Epoch: 7740, Training Loss: 0.01554
Epoch: 7741, Training Loss: 0.01781
Epoch: 7741, Training Loss: 0.01633
Epoch: 7741, Training Loss: 0.02037
Epoch: 7741, Training Loss: 0.01554
Epoch: 7742, Training Loss: 0.01781
Epoch: 7742, Training Loss: 0.01633
Epoch: 7742, Training Loss: 0.02037
Epoch: 7742, Training Loss: 0.01554
Epoch: 7743, Training Loss: 0.01781
Epoch: 7743, Training Loss: 0.01633
Epoch: 7743, Training Loss: 0.02037
Epoch: 7743, Training Loss: 0.01554
Epoch: 7744, Training Loss: 0.01781
Epoch: 7744, Training Loss: 0.01633
Epoch: 7744, Training Loss: 0.02037
Epoch: 7744, Training Loss: 0.01554
Epoch: 7745, Training Loss: 0.01781
Epoch: 7745, Training Loss: 0.01633
Epoch: 7745, Training Loss: 0.02037
Epoch: 7745, Training Loss: 0.01554
Epoch: 7746, Training Loss: 0.01781
Epoch: 7746, Training Loss: 0.01633
Epoch: 7746, Training Loss: 0.02036
Epoch: 7746, Training Loss: 0.01554
Epoch: 7747, Training Loss: 0.01780
Epoch: 7747, Training Loss: 0.01633
Epoch: 7747, Training Loss: 0.02036
Epoch: 7747, Training Loss: 0.01554
Epoch: 7748, Training Loss: 0.01780
Epoch: 7748, Training Loss: 0.01633
Epoch: 7748, Training Loss: 0.02036
Epoch: 7748, Training Loss: 0.01554
Epoch: 7749, Training Loss: 0.01780
Epoch: 7749, Training Loss: 0.01632
Epoch: 7749, Training Loss: 0.02036
Epoch: 7749, Training Loss: 0.01553
Epoch: 7750, Training Loss: 0.01780
Epoch: 7750, Training Loss: 0.01632
Epoch: 7750, Training Loss: 0.02036
Epoch: 7750, Training Loss: 0.01553
Epoch: 7751, Training Loss: 0.01780
Epoch: 7751, Training Loss: 0.01632
Epoch: 7751, Training Loss: 0.02036
Epoch: 7751, Training Loss: 0.01553
Epoch: 7752, Training Loss: 0.01780
Epoch: 7752, Training Loss: 0.01632
Epoch: 7752, Training Loss: 0.02036
Epoch: 7752, Training Loss: 0.01553
Epoch: 7753, Training Loss: 0.01780
Epoch: 7753, Training Loss: 0.01632
Epoch: 7753, Training Loss: 0.02035
Epoch: 7753, Training Loss: 0.01553
Epoch: 7754, Training Loss: 0.01780
Epoch: 7754, Training Loss: 0.01632
Epoch: 7754, Training Loss: 0.02035
Epoch: 7754, Training Loss: 0.01553
Epoch: 7755, Training Loss: 0.01779
Epoch: 7755, Training Loss: 0.01632
Epoch: 7755, Training Loss: 0.02035
Epoch: 7755, Training Loss: 0.01553
Epoch: 7756, Training Loss: 0.01779
Epoch: 7756, Training Loss: 0.01632
Epoch: 7756, Training Loss: 0.02035
Epoch: 7756, Training Loss: 0.01553
Epoch: 7757, Training Loss: 0.01779
Epoch: 7757, Training Loss: 0.01631
Epoch: 7757, Training Loss: 0.02035
Epoch: 7757, Training Loss: 0.01553
Epoch: 7758, Training Loss: 0.01779
Epoch: 7758, Training Loss: 0.01631
Epoch: 7758, Training Loss: 0.02035
Epoch: 7758, Training Loss: 0.01552
Epoch: 7759, Training Loss: 0.01779
Epoch: 7759, Training Loss: 0.01631
Epoch: 7759, Training Loss: 0.02034
Epoch: 7759, Training Loss: 0.01552
Epoch: 7760, Training Loss: 0.01779
Epoch: 7760, Training Loss: 0.01631
Epoch: 7760, Training Loss: 0.02034
Epoch: 7760, Training Loss: 0.01552
Epoch: 7761, Training Loss: 0.01779
Epoch: 7761, Training Loss: 0.01631
Epoch: 7761, Training Loss: 0.02034
Epoch: 7761, Training Loss: 0.01552
Epoch: 7762, Training Loss: 0.01778
Epoch: 7762, Training Loss: 0.01631
Epoch: 7762, Training Loss: 0.02034
Epoch: 7762, Training Loss: 0.01552
Epoch: 7763, Training Loss: 0.01778
Epoch: 7763, Training Loss: 0.01631
Epoch: 7763, Training Loss: 0.02034
Epoch: 7763, Training Loss: 0.01552
Epoch: 7764, Training Loss: 0.01778
Epoch: 7764, Training Loss: 0.01631
Epoch: 7764, Training Loss: 0.02034
Epoch: 7764, Training Loss: 0.01552
Epoch: 7765, Training Loss: 0.01778
Epoch: 7765, Training Loss: 0.01630
Epoch: 7765, Training Loss: 0.02034
Epoch: 7765, Training Loss: 0.01552
... (per-sample loss output for epochs 7766-8014 elided: the four XOR-pattern losses decline only slowly, from roughly 0.01778/0.01630/0.02034/0.01552 to roughly 0.01746/0.01602/0.01996/0.01524, still well above the 0.008 target) ...
Epoch: 8015, Training Loss: 0.01746
Epoch: 8015, Training Loss: 0.01602
Epoch: 8015, Training Loss: 0.01996
Epoch: 8015, Training Loss: 0.01524
Epoch: 8016, Training Loss: 0.01745
Epoch: 8016, Training Loss: 0.01602
Epoch: 8016, Training Loss: 0.01996
Epoch: 8016, Training Loss: 0.01524
Epoch: 8017, Training Loss: 0.01745
Epoch: 8017, Training Loss: 0.01602
Epoch: 8017, Training Loss: 0.01996
Epoch: 8017, Training Loss: 0.01524
Epoch: 8018, Training Loss: 0.01745
Epoch: 8018, Training Loss: 0.01601
Epoch: 8018, Training Loss: 0.01996
Epoch: 8018, Training Loss: 0.01524
Epoch: 8019, Training Loss: 0.01745
Epoch: 8019, Training Loss: 0.01601
Epoch: 8019, Training Loss: 0.01996
Epoch: 8019, Training Loss: 0.01523
Epoch: 8020, Training Loss: 0.01745
Epoch: 8020, Training Loss: 0.01601
Epoch: 8020, Training Loss: 0.01995
Epoch: 8020, Training Loss: 0.01523
Epoch: 8021, Training Loss: 0.01745
Epoch: 8021, Training Loss: 0.01601
Epoch: 8021, Training Loss: 0.01995
Epoch: 8021, Training Loss: 0.01523
Epoch: 8022, Training Loss: 0.01745
Epoch: 8022, Training Loss: 0.01601
Epoch: 8022, Training Loss: 0.01995
Epoch: 8022, Training Loss: 0.01523
Epoch: 8023, Training Loss: 0.01745
Epoch: 8023, Training Loss: 0.01601
Epoch: 8023, Training Loss: 0.01995
Epoch: 8023, Training Loss: 0.01523
Epoch: 8024, Training Loss: 0.01744
Epoch: 8024, Training Loss: 0.01601
Epoch: 8024, Training Loss: 0.01995
Epoch: 8024, Training Loss: 0.01523
Epoch: 8025, Training Loss: 0.01744
Epoch: 8025, Training Loss: 0.01601
Epoch: 8025, Training Loss: 0.01995
Epoch: 8025, Training Loss: 0.01523
Epoch: 8026, Training Loss: 0.01744
Epoch: 8026, Training Loss: 0.01601
Epoch: 8026, Training Loss: 0.01994
Epoch: 8026, Training Loss: 0.01523
Epoch: 8027, Training Loss: 0.01744
Epoch: 8027, Training Loss: 0.01600
Epoch: 8027, Training Loss: 0.01994
Epoch: 8027, Training Loss: 0.01523
Epoch: 8028, Training Loss: 0.01744
Epoch: 8028, Training Loss: 0.01600
Epoch: 8028, Training Loss: 0.01994
Epoch: 8028, Training Loss: 0.01522
Epoch: 8029, Training Loss: 0.01744
Epoch: 8029, Training Loss: 0.01600
Epoch: 8029, Training Loss: 0.01994
Epoch: 8029, Training Loss: 0.01522
Epoch: 8030, Training Loss: 0.01744
Epoch: 8030, Training Loss: 0.01600
Epoch: 8030, Training Loss: 0.01994
Epoch: 8030, Training Loss: 0.01522
Epoch: 8031, Training Loss: 0.01744
Epoch: 8031, Training Loss: 0.01600
Epoch: 8031, Training Loss: 0.01994
Epoch: 8031, Training Loss: 0.01522
Epoch: 8032, Training Loss: 0.01743
Epoch: 8032, Training Loss: 0.01600
Epoch: 8032, Training Loss: 0.01994
Epoch: 8032, Training Loss: 0.01522
Epoch: 8033, Training Loss: 0.01743
Epoch: 8033, Training Loss: 0.01600
Epoch: 8033, Training Loss: 0.01993
Epoch: 8033, Training Loss: 0.01522
Epoch: 8034, Training Loss: 0.01743
Epoch: 8034, Training Loss: 0.01600
Epoch: 8034, Training Loss: 0.01993
Epoch: 8034, Training Loss: 0.01522
Epoch: 8035, Training Loss: 0.01743
Epoch: 8035, Training Loss: 0.01600
Epoch: 8035, Training Loss: 0.01993
Epoch: 8035, Training Loss: 0.01522
Epoch: 8036, Training Loss: 0.01743
Epoch: 8036, Training Loss: 0.01599
Epoch: 8036, Training Loss: 0.01993
Epoch: 8036, Training Loss: 0.01522
Epoch: 8037, Training Loss: 0.01743
Epoch: 8037, Training Loss: 0.01599
Epoch: 8037, Training Loss: 0.01993
Epoch: 8037, Training Loss: 0.01521
Epoch: 8038, Training Loss: 0.01743
Epoch: 8038, Training Loss: 0.01599
Epoch: 8038, Training Loss: 0.01993
Epoch: 8038, Training Loss: 0.01521
Epoch: 8039, Training Loss: 0.01743
Epoch: 8039, Training Loss: 0.01599
Epoch: 8039, Training Loss: 0.01993
Epoch: 8039, Training Loss: 0.01521
Epoch: 8040, Training Loss: 0.01742
Epoch: 8040, Training Loss: 0.01599
Epoch: 8040, Training Loss: 0.01992
Epoch: 8040, Training Loss: 0.01521
Epoch: 8041, Training Loss: 0.01742
Epoch: 8041, Training Loss: 0.01599
Epoch: 8041, Training Loss: 0.01992
Epoch: 8041, Training Loss: 0.01521
Epoch: 8042, Training Loss: 0.01742
Epoch: 8042, Training Loss: 0.01599
Epoch: 8042, Training Loss: 0.01992
Epoch: 8042, Training Loss: 0.01521
Epoch: 8043, Training Loss: 0.01742
Epoch: 8043, Training Loss: 0.01599
Epoch: 8043, Training Loss: 0.01992
Epoch: 8043, Training Loss: 0.01521
Epoch: 8044, Training Loss: 0.01742
Epoch: 8044, Training Loss: 0.01599
Epoch: 8044, Training Loss: 0.01992
Epoch: 8044, Training Loss: 0.01521
Epoch: 8045, Training Loss: 0.01742
Epoch: 8045, Training Loss: 0.01598
Epoch: 8045, Training Loss: 0.01992
Epoch: 8045, Training Loss: 0.01521
Epoch: 8046, Training Loss: 0.01742
Epoch: 8046, Training Loss: 0.01598
Epoch: 8046, Training Loss: 0.01992
Epoch: 8046, Training Loss: 0.01521
Epoch: 8047, Training Loss: 0.01742
Epoch: 8047, Training Loss: 0.01598
Epoch: 8047, Training Loss: 0.01991
Epoch: 8047, Training Loss: 0.01520
Epoch: 8048, Training Loss: 0.01741
Epoch: 8048, Training Loss: 0.01598
Epoch: 8048, Training Loss: 0.01991
Epoch: 8048, Training Loss: 0.01520
Epoch: 8049, Training Loss: 0.01741
Epoch: 8049, Training Loss: 0.01598
Epoch: 8049, Training Loss: 0.01991
Epoch: 8049, Training Loss: 0.01520
Epoch: 8050, Training Loss: 0.01741
Epoch: 8050, Training Loss: 0.01598
Epoch: 8050, Training Loss: 0.01991
Epoch: 8050, Training Loss: 0.01520
Epoch: 8051, Training Loss: 0.01741
Epoch: 8051, Training Loss: 0.01598
Epoch: 8051, Training Loss: 0.01991
Epoch: 8051, Training Loss: 0.01520
Epoch: 8052, Training Loss: 0.01741
Epoch: 8052, Training Loss: 0.01598
Epoch: 8052, Training Loss: 0.01991
Epoch: 8052, Training Loss: 0.01520
Epoch: 8053, Training Loss: 0.01741
Epoch: 8053, Training Loss: 0.01598
Epoch: 8053, Training Loss: 0.01991
Epoch: 8053, Training Loss: 0.01520
Epoch: 8054, Training Loss: 0.01741
Epoch: 8054, Training Loss: 0.01597
Epoch: 8054, Training Loss: 0.01990
Epoch: 8054, Training Loss: 0.01520
Epoch: 8055, Training Loss: 0.01741
Epoch: 8055, Training Loss: 0.01597
Epoch: 8055, Training Loss: 0.01990
Epoch: 8055, Training Loss: 0.01520
Epoch: 8056, Training Loss: 0.01740
Epoch: 8056, Training Loss: 0.01597
Epoch: 8056, Training Loss: 0.01990
Epoch: 8056, Training Loss: 0.01519
Epoch: 8057, Training Loss: 0.01740
Epoch: 8057, Training Loss: 0.01597
Epoch: 8057, Training Loss: 0.01990
Epoch: 8057, Training Loss: 0.01519
Epoch: 8058, Training Loss: 0.01740
Epoch: 8058, Training Loss: 0.01597
Epoch: 8058, Training Loss: 0.01990
Epoch: 8058, Training Loss: 0.01519
Epoch: 8059, Training Loss: 0.01740
Epoch: 8059, Training Loss: 0.01597
Epoch: 8059, Training Loss: 0.01990
Epoch: 8059, Training Loss: 0.01519
Epoch: 8060, Training Loss: 0.01740
Epoch: 8060, Training Loss: 0.01597
Epoch: 8060, Training Loss: 0.01990
Epoch: 8060, Training Loss: 0.01519
Epoch: 8061, Training Loss: 0.01740
Epoch: 8061, Training Loss: 0.01597
Epoch: 8061, Training Loss: 0.01989
Epoch: 8061, Training Loss: 0.01519
Epoch: 8062, Training Loss: 0.01740
Epoch: 8062, Training Loss: 0.01597
Epoch: 8062, Training Loss: 0.01989
Epoch: 8062, Training Loss: 0.01519
Epoch: 8063, Training Loss: 0.01740
Epoch: 8063, Training Loss: 0.01596
Epoch: 8063, Training Loss: 0.01989
Epoch: 8063, Training Loss: 0.01519
Epoch: 8064, Training Loss: 0.01739
Epoch: 8064, Training Loss: 0.01596
Epoch: 8064, Training Loss: 0.01989
Epoch: 8064, Training Loss: 0.01519
Epoch: 8065, Training Loss: 0.01739
Epoch: 8065, Training Loss: 0.01596
Epoch: 8065, Training Loss: 0.01989
Epoch: 8065, Training Loss: 0.01518
Epoch: 8066, Training Loss: 0.01739
Epoch: 8066, Training Loss: 0.01596
Epoch: 8066, Training Loss: 0.01989
Epoch: 8066, Training Loss: 0.01518
Epoch: 8067, Training Loss: 0.01739
Epoch: 8067, Training Loss: 0.01596
Epoch: 8067, Training Loss: 0.01989
Epoch: 8067, Training Loss: 0.01518
Epoch: 8068, Training Loss: 0.01739
Epoch: 8068, Training Loss: 0.01596
Epoch: 8068, Training Loss: 0.01988
Epoch: 8068, Training Loss: 0.01518
Epoch: 8069, Training Loss: 0.01739
Epoch: 8069, Training Loss: 0.01596
Epoch: 8069, Training Loss: 0.01988
Epoch: 8069, Training Loss: 0.01518
Epoch: 8070, Training Loss: 0.01739
Epoch: 8070, Training Loss: 0.01596
Epoch: 8070, Training Loss: 0.01988
Epoch: 8070, Training Loss: 0.01518
Epoch: 8071, Training Loss: 0.01739
Epoch: 8071, Training Loss: 0.01596
Epoch: 8071, Training Loss: 0.01988
Epoch: 8071, Training Loss: 0.01518
Epoch: 8072, Training Loss: 0.01738
Epoch: 8072, Training Loss: 0.01595
Epoch: 8072, Training Loss: 0.01988
Epoch: 8072, Training Loss: 0.01518
Epoch: 8073, Training Loss: 0.01738
Epoch: 8073, Training Loss: 0.01595
Epoch: 8073, Training Loss: 0.01988
Epoch: 8073, Training Loss: 0.01518
Epoch: 8074, Training Loss: 0.01738
Epoch: 8074, Training Loss: 0.01595
Epoch: 8074, Training Loss: 0.01988
Epoch: 8074, Training Loss: 0.01518
Epoch: 8075, Training Loss: 0.01738
Epoch: 8075, Training Loss: 0.01595
Epoch: 8075, Training Loss: 0.01987
Epoch: 8075, Training Loss: 0.01517
Epoch: 8076, Training Loss: 0.01738
Epoch: 8076, Training Loss: 0.01595
Epoch: 8076, Training Loss: 0.01987
Epoch: 8076, Training Loss: 0.01517
Epoch: 8077, Training Loss: 0.01738
Epoch: 8077, Training Loss: 0.01595
Epoch: 8077, Training Loss: 0.01987
Epoch: 8077, Training Loss: 0.01517
Epoch: 8078, Training Loss: 0.01738
Epoch: 8078, Training Loss: 0.01595
Epoch: 8078, Training Loss: 0.01987
Epoch: 8078, Training Loss: 0.01517
Epoch: 8079, Training Loss: 0.01738
Epoch: 8079, Training Loss: 0.01595
Epoch: 8079, Training Loss: 0.01987
Epoch: 8079, Training Loss: 0.01517
Epoch: 8080, Training Loss: 0.01737
Epoch: 8080, Training Loss: 0.01595
Epoch: 8080, Training Loss: 0.01987
Epoch: 8080, Training Loss: 0.01517
Epoch: 8081, Training Loss: 0.01737
Epoch: 8081, Training Loss: 0.01594
Epoch: 8081, Training Loss: 0.01987
Epoch: 8081, Training Loss: 0.01517
Epoch: 8082, Training Loss: 0.01737
Epoch: 8082, Training Loss: 0.01594
Epoch: 8082, Training Loss: 0.01986
Epoch: 8082, Training Loss: 0.01517
Epoch: 8083, Training Loss: 0.01737
Epoch: 8083, Training Loss: 0.01594
Epoch: 8083, Training Loss: 0.01986
Epoch: 8083, Training Loss: 0.01517
Epoch: 8084, Training Loss: 0.01737
Epoch: 8084, Training Loss: 0.01594
Epoch: 8084, Training Loss: 0.01986
Epoch: 8084, Training Loss: 0.01516
Epoch: 8085, Training Loss: 0.01737
Epoch: 8085, Training Loss: 0.01594
Epoch: 8085, Training Loss: 0.01986
Epoch: 8085, Training Loss: 0.01516
Epoch: 8086, Training Loss: 0.01737
Epoch: 8086, Training Loss: 0.01594
Epoch: 8086, Training Loss: 0.01986
Epoch: 8086, Training Loss: 0.01516
Epoch: 8087, Training Loss: 0.01737
Epoch: 8087, Training Loss: 0.01594
Epoch: 8087, Training Loss: 0.01986
Epoch: 8087, Training Loss: 0.01516
Epoch: 8088, Training Loss: 0.01736
Epoch: 8088, Training Loss: 0.01594
Epoch: 8088, Training Loss: 0.01986
Epoch: 8088, Training Loss: 0.01516
Epoch: 8089, Training Loss: 0.01736
Epoch: 8089, Training Loss: 0.01594
Epoch: 8089, Training Loss: 0.01985
Epoch: 8089, Training Loss: 0.01516
Epoch: 8090, Training Loss: 0.01736
Epoch: 8090, Training Loss: 0.01593
Epoch: 8090, Training Loss: 0.01985
Epoch: 8090, Training Loss: 0.01516
Epoch: 8091, Training Loss: 0.01736
Epoch: 8091, Training Loss: 0.01593
Epoch: 8091, Training Loss: 0.01985
Epoch: 8091, Training Loss: 0.01516
Epoch: 8092, Training Loss: 0.01736
Epoch: 8092, Training Loss: 0.01593
Epoch: 8092, Training Loss: 0.01985
Epoch: 8092, Training Loss: 0.01516
Epoch: 8093, Training Loss: 0.01736
Epoch: 8093, Training Loss: 0.01593
Epoch: 8093, Training Loss: 0.01985
Epoch: 8093, Training Loss: 0.01516
Epoch: 8094, Training Loss: 0.01736
Epoch: 8094, Training Loss: 0.01593
Epoch: 8094, Training Loss: 0.01985
Epoch: 8094, Training Loss: 0.01515
Epoch: 8095, Training Loss: 0.01736
Epoch: 8095, Training Loss: 0.01593
Epoch: 8095, Training Loss: 0.01985
Epoch: 8095, Training Loss: 0.01515
Epoch: 8096, Training Loss: 0.01735
Epoch: 8096, Training Loss: 0.01593
Epoch: 8096, Training Loss: 0.01984
Epoch: 8096, Training Loss: 0.01515
Epoch: 8097, Training Loss: 0.01735
Epoch: 8097, Training Loss: 0.01593
Epoch: 8097, Training Loss: 0.01984
Epoch: 8097, Training Loss: 0.01515
Epoch: 8098, Training Loss: 0.01735
Epoch: 8098, Training Loss: 0.01593
Epoch: 8098, Training Loss: 0.01984
Epoch: 8098, Training Loss: 0.01515
Epoch: 8099, Training Loss: 0.01735
Epoch: 8099, Training Loss: 0.01592
Epoch: 8099, Training Loss: 0.01984
Epoch: 8099, Training Loss: 0.01515
Epoch: 8100, Training Loss: 0.01735
Epoch: 8100, Training Loss: 0.01592
Epoch: 8100, Training Loss: 0.01984
Epoch: 8100, Training Loss: 0.01515
Epoch: 8101, Training Loss: 0.01735
Epoch: 8101, Training Loss: 0.01592
Epoch: 8101, Training Loss: 0.01984
Epoch: 8101, Training Loss: 0.01515
Epoch: 8102, Training Loss: 0.01735
Epoch: 8102, Training Loss: 0.01592
Epoch: 8102, Training Loss: 0.01984
Epoch: 8102, Training Loss: 0.01515
Epoch: 8103, Training Loss: 0.01735
Epoch: 8103, Training Loss: 0.01592
Epoch: 8103, Training Loss: 0.01983
Epoch: 8103, Training Loss: 0.01514
Epoch: 8104, Training Loss: 0.01734
Epoch: 8104, Training Loss: 0.01592
Epoch: 8104, Training Loss: 0.01983
Epoch: 8104, Training Loss: 0.01514
Epoch: 8105, Training Loss: 0.01734
Epoch: 8105, Training Loss: 0.01592
Epoch: 8105, Training Loss: 0.01983
Epoch: 8105, Training Loss: 0.01514
Epoch: 8106, Training Loss: 0.01734
Epoch: 8106, Training Loss: 0.01592
Epoch: 8106, Training Loss: 0.01983
Epoch: 8106, Training Loss: 0.01514
Epoch: 8107, Training Loss: 0.01734
Epoch: 8107, Training Loss: 0.01592
Epoch: 8107, Training Loss: 0.01983
Epoch: 8107, Training Loss: 0.01514
Epoch: 8108, Training Loss: 0.01734
Epoch: 8108, Training Loss: 0.01591
Epoch: 8108, Training Loss: 0.01983
Epoch: 8108, Training Loss: 0.01514
Epoch: 8109, Training Loss: 0.01734
Epoch: 8109, Training Loss: 0.01591
Epoch: 8109, Training Loss: 0.01983
Epoch: 8109, Training Loss: 0.01514
Epoch: 8110, Training Loss: 0.01734
Epoch: 8110, Training Loss: 0.01591
Epoch: 8110, Training Loss: 0.01982
Epoch: 8110, Training Loss: 0.01514
Epoch: 8111, Training Loss: 0.01734
Epoch: 8111, Training Loss: 0.01591
Epoch: 8111, Training Loss: 0.01982
Epoch: 8111, Training Loss: 0.01514
Epoch: 8112, Training Loss: 0.01733
Epoch: 8112, Training Loss: 0.01591
Epoch: 8112, Training Loss: 0.01982
Epoch: 8112, Training Loss: 0.01513
Epoch: 8113, Training Loss: 0.01733
Epoch: 8113, Training Loss: 0.01591
Epoch: 8113, Training Loss: 0.01982
Epoch: 8113, Training Loss: 0.01513
Epoch: 8114, Training Loss: 0.01733
Epoch: 8114, Training Loss: 0.01591
Epoch: 8114, Training Loss: 0.01982
Epoch: 8114, Training Loss: 0.01513
Epoch: 8115, Training Loss: 0.01733
Epoch: 8115, Training Loss: 0.01591
Epoch: 8115, Training Loss: 0.01982
Epoch: 8115, Training Loss: 0.01513
Epoch: 8116, Training Loss: 0.01733
Epoch: 8116, Training Loss: 0.01591
Epoch: 8116, Training Loss: 0.01982
Epoch: 8116, Training Loss: 0.01513
Epoch: 8117, Training Loss: 0.01733
Epoch: 8117, Training Loss: 0.01590
Epoch: 8117, Training Loss: 0.01981
Epoch: 8117, Training Loss: 0.01513
Epoch: 8118, Training Loss: 0.01733
Epoch: 8118, Training Loss: 0.01590
Epoch: 8118, Training Loss: 0.01981
Epoch: 8118, Training Loss: 0.01513
Epoch: 8119, Training Loss: 0.01733
Epoch: 8119, Training Loss: 0.01590
Epoch: 8119, Training Loss: 0.01981
Epoch: 8119, Training Loss: 0.01513
Epoch: 8120, Training Loss: 0.01732
Epoch: 8120, Training Loss: 0.01590
Epoch: 8120, Training Loss: 0.01981
Epoch: 8120, Training Loss: 0.01513
Epoch: 8121, Training Loss: 0.01732
Epoch: 8121, Training Loss: 0.01590
Epoch: 8121, Training Loss: 0.01981
Epoch: 8121, Training Loss: 0.01513
Epoch: 8122, Training Loss: 0.01732
Epoch: 8122, Training Loss: 0.01590
Epoch: 8122, Training Loss: 0.01981
Epoch: 8122, Training Loss: 0.01512
Epoch: 8123, Training Loss: 0.01732
Epoch: 8123, Training Loss: 0.01590
Epoch: 8123, Training Loss: 0.01981
Epoch: 8123, Training Loss: 0.01512
Epoch: 8124, Training Loss: 0.01732
Epoch: 8124, Training Loss: 0.01590
Epoch: 8124, Training Loss: 0.01980
Epoch: 8124, Training Loss: 0.01512
Epoch: 8125, Training Loss: 0.01732
Epoch: 8125, Training Loss: 0.01590
Epoch: 8125, Training Loss: 0.01980
Epoch: 8125, Training Loss: 0.01512
Epoch: 8126, Training Loss: 0.01732
Epoch: 8126, Training Loss: 0.01589
Epoch: 8126, Training Loss: 0.01980
Epoch: 8126, Training Loss: 0.01512
Epoch: 8127, Training Loss: 0.01732
Epoch: 8127, Training Loss: 0.01589
Epoch: 8127, Training Loss: 0.01980
Epoch: 8127, Training Loss: 0.01512
Epoch: 8128, Training Loss: 0.01731
Epoch: 8128, Training Loss: 0.01589
Epoch: 8128, Training Loss: 0.01980
Epoch: 8128, Training Loss: 0.01512
Epoch: 8129, Training Loss: 0.01731
Epoch: 8129, Training Loss: 0.01589
Epoch: 8129, Training Loss: 0.01980
Epoch: 8129, Training Loss: 0.01512
Epoch: 8130, Training Loss: 0.01731
Epoch: 8130, Training Loss: 0.01589
Epoch: 8130, Training Loss: 0.01980
Epoch: 8130, Training Loss: 0.01512
Epoch: 8131, Training Loss: 0.01731
Epoch: 8131, Training Loss: 0.01589
Epoch: 8131, Training Loss: 0.01979
Epoch: 8131, Training Loss: 0.01511
Epoch: 8132, Training Loss: 0.01731
Epoch: 8132, Training Loss: 0.01589
Epoch: 8132, Training Loss: 0.01979
Epoch: 8132, Training Loss: 0.01511
Epoch: 8133, Training Loss: 0.01731
Epoch: 8133, Training Loss: 0.01589
Epoch: 8133, Training Loss: 0.01979
Epoch: 8133, Training Loss: 0.01511
Epoch: 8134, Training Loss: 0.01731
Epoch: 8134, Training Loss: 0.01589
Epoch: 8134, Training Loss: 0.01979
Epoch: 8134, Training Loss: 0.01511
Epoch: 8135, Training Loss: 0.01731
Epoch: 8135, Training Loss: 0.01588
Epoch: 8135, Training Loss: 0.01979
Epoch: 8135, Training Loss: 0.01511
Epoch: 8136, Training Loss: 0.01730
Epoch: 8136, Training Loss: 0.01588
Epoch: 8136, Training Loss: 0.01979
Epoch: 8136, Training Loss: 0.01511
Epoch: 8137, Training Loss: 0.01730
Epoch: 8137, Training Loss: 0.01588
Epoch: 8137, Training Loss: 0.01979
Epoch: 8137, Training Loss: 0.01511
Epoch: 8138, Training Loss: 0.01730
Epoch: 8138, Training Loss: 0.01588
Epoch: 8138, Training Loss: 0.01978
Epoch: 8138, Training Loss: 0.01511
Epoch: 8139, Training Loss: 0.01730
Epoch: 8139, Training Loss: 0.01588
Epoch: 8139, Training Loss: 0.01978
Epoch: 8139, Training Loss: 0.01511
Epoch: 8140, Training Loss: 0.01730
Epoch: 8140, Training Loss: 0.01588
Epoch: 8140, Training Loss: 0.01978
Epoch: 8140, Training Loss: 0.01511
Epoch: 8141, Training Loss: 0.01730
Epoch: 8141, Training Loss: 0.01588
Epoch: 8141, Training Loss: 0.01978
Epoch: 8141, Training Loss: 0.01510
Epoch: 8142, Training Loss: 0.01730
Epoch: 8142, Training Loss: 0.01588
Epoch: 8142, Training Loss: 0.01978
Epoch: 8142, Training Loss: 0.01510
Epoch: 8143, Training Loss: 0.01730
Epoch: 8143, Training Loss: 0.01588
Epoch: 8143, Training Loss: 0.01978
Epoch: 8143, Training Loss: 0.01510
Epoch: 8144, Training Loss: 0.01729
Epoch: 8144, Training Loss: 0.01587
Epoch: 8144, Training Loss: 0.01978
Epoch: 8144, Training Loss: 0.01510
Epoch: 8145, Training Loss: 0.01729
Epoch: 8145, Training Loss: 0.01587
Epoch: 8145, Training Loss: 0.01977
Epoch: 8145, Training Loss: 0.01510
Epoch: 8146, Training Loss: 0.01729
Epoch: 8146, Training Loss: 0.01587
Epoch: 8146, Training Loss: 0.01977
Epoch: 8146, Training Loss: 0.01510
Epoch: 8147, Training Loss: 0.01729
Epoch: 8147, Training Loss: 0.01587
Epoch: 8147, Training Loss: 0.01977
Epoch: 8147, Training Loss: 0.01510
Epoch: 8148, Training Loss: 0.01729
Epoch: 8148, Training Loss: 0.01587
Epoch: 8148, Training Loss: 0.01977
Epoch: 8148, Training Loss: 0.01510
Epoch: 8149, Training Loss: 0.01729
Epoch: 8149, Training Loss: 0.01587
Epoch: 8149, Training Loss: 0.01977
Epoch: 8149, Training Loss: 0.01510
Epoch: 8150, Training Loss: 0.01729
Epoch: 8150, Training Loss: 0.01587
Epoch: 8150, Training Loss: 0.01977
Epoch: 8150, Training Loss: 0.01509
Epoch: 8151, Training Loss: 0.01729
Epoch: 8151, Training Loss: 0.01587
Epoch: 8151, Training Loss: 0.01977
Epoch: 8151, Training Loss: 0.01509
Epoch: 8152, Training Loss: 0.01728
Epoch: 8152, Training Loss: 0.01587
Epoch: 8152, Training Loss: 0.01976
Epoch: 8152, Training Loss: 0.01509
Epoch: 8153, Training Loss: 0.01728
Epoch: 8153, Training Loss: 0.01586
Epoch: 8153, Training Loss: 0.01976
Epoch: 8153, Training Loss: 0.01509
Epoch: 8154, Training Loss: 0.01728
Epoch: 8154, Training Loss: 0.01586
Epoch: 8154, Training Loss: 0.01976
Epoch: 8154, Training Loss: 0.01509
Epoch: 8155, Training Loss: 0.01728
Epoch: 8155, Training Loss: 0.01586
Epoch: 8155, Training Loss: 0.01976
Epoch: 8155, Training Loss: 0.01509
Epoch: 8156, Training Loss: 0.01728
Epoch: 8156, Training Loss: 0.01586
Epoch: 8156, Training Loss: 0.01976
Epoch: 8156, Training Loss: 0.01509
Epoch: 8157, Training Loss: 0.01728
Epoch: 8157, Training Loss: 0.01586
Epoch: 8157, Training Loss: 0.01976
Epoch: 8157, Training Loss: 0.01509
Epoch: 8158, Training Loss: 0.01728
Epoch: 8158, Training Loss: 0.01586
Epoch: 8158, Training Loss: 0.01976
Epoch: 8158, Training Loss: 0.01509
Epoch: 8159, Training Loss: 0.01728
Epoch: 8159, Training Loss: 0.01586
Epoch: 8159, Training Loss: 0.01975
Epoch: 8159, Training Loss: 0.01509
Epoch: 8160, Training Loss: 0.01728
Epoch: 8160, Training Loss: 0.01586
Epoch: 8160, Training Loss: 0.01975
Epoch: 8160, Training Loss: 0.01508
Epoch: 8161, Training Loss: 0.01727
Epoch: 8161, Training Loss: 0.01586
Epoch: 8161, Training Loss: 0.01975
Epoch: 8161, Training Loss: 0.01508
Epoch: 8162, Training Loss: 0.01727
Epoch: 8162, Training Loss: 0.01586
Epoch: 8162, Training Loss: 0.01975
Epoch: 8162, Training Loss: 0.01508
Epoch: 8163, Training Loss: 0.01727
Epoch: 8163, Training Loss: 0.01585
Epoch: 8163, Training Loss: 0.01975
Epoch: 8163, Training Loss: 0.01508
Epoch: 8164, Training Loss: 0.01727
Epoch: 8164, Training Loss: 0.01585
Epoch: 8164, Training Loss: 0.01975
Epoch: 8164, Training Loss: 0.01508
Epoch: 8165, Training Loss: 0.01727
Epoch: 8165, Training Loss: 0.01585
Epoch: 8165, Training Loss: 0.01975
Epoch: 8165, Training Loss: 0.01508
Epoch: 8166, Training Loss: 0.01727
Epoch: 8166, Training Loss: 0.01585
Epoch: 8166, Training Loss: 0.01974
Epoch: 8166, Training Loss: 0.01508
Epoch: 8167, Training Loss: 0.01727
Epoch: 8167, Training Loss: 0.01585
Epoch: 8167, Training Loss: 0.01974
Epoch: 8167, Training Loss: 0.01508
Epoch: 8168, Training Loss: 0.01727
Epoch: 8168, Training Loss: 0.01585
Epoch: 8168, Training Loss: 0.01974
Epoch: 8168, Training Loss: 0.01508
Epoch: 8169, Training Loss: 0.01726
Epoch: 8169, Training Loss: 0.01585
Epoch: 8169, Training Loss: 0.01974
Epoch: 8169, Training Loss: 0.01507
Epoch: 8170, Training Loss: 0.01726
Epoch: 8170, Training Loss: 0.01585
Epoch: 8170, Training Loss: 0.01974
Epoch: 8170, Training Loss: 0.01507
Epoch: 8171, Training Loss: 0.01726
Epoch: 8171, Training Loss: 0.01585
Epoch: 8171, Training Loss: 0.01974
Epoch: 8171, Training Loss: 0.01507
Epoch: 8172, Training Loss: 0.01726
Epoch: 8172, Training Loss: 0.01584
Epoch: 8172, Training Loss: 0.01974
Epoch: 8172, Training Loss: 0.01507
Epoch: 8173, Training Loss: 0.01726
Epoch: 8173, Training Loss: 0.01584
Epoch: 8173, Training Loss: 0.01973
Epoch: 8173, Training Loss: 0.01507
Epoch: 8174, Training Loss: 0.01726
Epoch: 8174, Training Loss: 0.01584
Epoch: 8174, Training Loss: 0.01973
Epoch: 8174, Training Loss: 0.01507
Epoch: 8175, Training Loss: 0.01726
Epoch: 8175, Training Loss: 0.01584
Epoch: 8175, Training Loss: 0.01973
Epoch: 8175, Training Loss: 0.01507
Epoch: 8176, Training Loss: 0.01726
Epoch: 8176, Training Loss: 0.01584
Epoch: 8176, Training Loss: 0.01973
Epoch: 8176, Training Loss: 0.01507
Epoch: 8177, Training Loss: 0.01725
Epoch: 8177, Training Loss: 0.01584
Epoch: 8177, Training Loss: 0.01973
Epoch: 8177, Training Loss: 0.01507
Epoch: 8178, Training Loss: 0.01725
Epoch: 8178, Training Loss: 0.01584
Epoch: 8178, Training Loss: 0.01973
Epoch: 8178, Training Loss: 0.01507
Epoch: 8179, Training Loss: 0.01725
Epoch: 8179, Training Loss: 0.01584
Epoch: 8179, Training Loss: 0.01973
Epoch: 8179, Training Loss: 0.01506
Epoch: 8180, Training Loss: 0.01725
Epoch: 8180, Training Loss: 0.01584
Epoch: 8180, Training Loss: 0.01972
Epoch: 8180, Training Loss: 0.01506
Epoch: 8181, Training Loss: 0.01725
Epoch: 8181, Training Loss: 0.01583
Epoch: 8181, Training Loss: 0.01972
Epoch: 8181, Training Loss: 0.01506
Epoch: 8182, Training Loss: 0.01725
Epoch: 8182, Training Loss: 0.01583
Epoch: 8182, Training Loss: 0.01972
Epoch: 8182, Training Loss: 0.01506
Epoch: 8183, Training Loss: 0.01725
Epoch: 8183, Training Loss: 0.01583
Epoch: 8183, Training Loss: 0.01972
Epoch: 8183, Training Loss: 0.01506
Epoch: 8184, Training Loss: 0.01725
Epoch: 8184, Training Loss: 0.01583
Epoch: 8184, Training Loss: 0.01972
Epoch: 8184, Training Loss: 0.01506
Epoch: 8185, Training Loss: 0.01724
Epoch: 8185, Training Loss: 0.01583
Epoch: 8185, Training Loss: 0.01972
Epoch: 8185, Training Loss: 0.01506
Epoch: 8186, Training Loss: 0.01724
Epoch: 8186, Training Loss: 0.01583
Epoch: 8186, Training Loss: 0.01972
Epoch: 8186, Training Loss: 0.01506
Epoch: 8187, Training Loss: 0.01724
Epoch: 8187, Training Loss: 0.01583
Epoch: 8187, Training Loss: 0.01971
Epoch: 8187, Training Loss: 0.01506
Epoch: 8188, Training Loss: 0.01724
Epoch: 8188, Training Loss: 0.01583
Epoch: 8188, Training Loss: 0.01971
Epoch: 8188, Training Loss: 0.01506
Epoch: 8189, Training Loss: 0.01724
Epoch: 8189, Training Loss: 0.01583
Epoch: 8189, Training Loss: 0.01971
Epoch: 8189, Training Loss: 0.01505
Epoch: 8190, Training Loss: 0.01724
Epoch: 8190, Training Loss: 0.01582
Epoch: 8190, Training Loss: 0.01971
Epoch: 8190, Training Loss: 0.01505
Epoch: 8191, Training Loss: 0.01724
Epoch: 8191, Training Loss: 0.01582
Epoch: 8191, Training Loss: 0.01971
Epoch: 8191, Training Loss: 0.01505
Epoch: 8192, Training Loss: 0.01724
Epoch: 8192, Training Loss: 0.01582
Epoch: 8192, Training Loss: 0.01971
Epoch: 8192, Training Loss: 0.01505
Epoch: 8193, Training Loss: 0.01723
Epoch: 8193, Training Loss: 0.01582
Epoch: 8193, Training Loss: 0.01971
Epoch: 8193, Training Loss: 0.01505
Epoch: 8194, Training Loss: 0.01723
Epoch: 8194, Training Loss: 0.01582
Epoch: 8194, Training Loss: 0.01970
Epoch: 8194, Training Loss: 0.01505
Epoch: 8195, Training Loss: 0.01723
Epoch: 8195, Training Loss: 0.01582
Epoch: 8195, Training Loss: 0.01970
Epoch: 8195, Training Loss: 0.01505
Epoch: 8196, Training Loss: 0.01723
Epoch: 8196, Training Loss: 0.01582
Epoch: 8196, Training Loss: 0.01970
Epoch: 8196, Training Loss: 0.01505
Epoch: 8197, Training Loss: 0.01723
Epoch: 8197, Training Loss: 0.01582
Epoch: 8197, Training Loss: 0.01970
Epoch: 8197, Training Loss: 0.01505
Epoch: 8198, Training Loss: 0.01723
Epoch: 8198, Training Loss: 0.01582
Epoch: 8198, Training Loss: 0.01970
Epoch: 8198, Training Loss: 0.01504
Epoch: 8199, Training Loss: 0.01723
Epoch: 8199, Training Loss: 0.01582
Epoch: 8199, Training Loss: 0.01970
Epoch: 8199, Training Loss: 0.01504
Epoch: 8200, Training Loss: 0.01723
Epoch: 8200, Training Loss: 0.01581
Epoch: 8200, Training Loss: 0.01970
Epoch: 8200, Training Loss: 0.01504
Epoch: 8201, Training Loss: 0.01723
Epoch: 8201, Training Loss: 0.01581
Epoch: 8201, Training Loss: 0.01970
Epoch: 8201, Training Loss: 0.01504
Epoch: 8202, Training Loss: 0.01722
Epoch: 8202, Training Loss: 0.01581
Epoch: 8202, Training Loss: 0.01969
Epoch: 8202, Training Loss: 0.01504
Epoch: 8203, Training Loss: 0.01722
Epoch: 8203, Training Loss: 0.01581
Epoch: 8203, Training Loss: 0.01969
Epoch: 8203, Training Loss: 0.01504
Epoch: 8204, Training Loss: 0.01722
Epoch: 8204, Training Loss: 0.01581
Epoch: 8204, Training Loss: 0.01969
Epoch: 8204, Training Loss: 0.01504
Epoch: 8205, Training Loss: 0.01722
Epoch: 8205, Training Loss: 0.01581
Epoch: 8205, Training Loss: 0.01969
Epoch: 8205, Training Loss: 0.01504
Epoch: 8206, Training Loss: 0.01722
Epoch: 8206, Training Loss: 0.01581
Epoch: 8206, Training Loss: 0.01969
Epoch: 8206, Training Loss: 0.01504
Epoch: 8207, Training Loss: 0.01722
Epoch: 8207, Training Loss: 0.01581
Epoch: 8207, Training Loss: 0.01969
Epoch: 8207, Training Loss: 0.01504
Epoch: 8208, Training Loss: 0.01722
Epoch: 8208, Training Loss: 0.01581
Epoch: 8208, Training Loss: 0.01969
Epoch: 8208, Training Loss: 0.01503
Epoch: 8209, Training Loss: 0.01722
Epoch: 8209, Training Loss: 0.01580
Epoch: 8209, Training Loss: 0.01968
Epoch: 8209, Training Loss: 0.01503
Epoch: 8210, Training Loss: 0.01721
Epoch: 8210, Training Loss: 0.01580
Epoch: 8210, Training Loss: 0.01968
Epoch: 8210, Training Loss: 0.01503
... (output truncated: four per-sample losses are printed each epoch; between epochs 8210 and 8459 they decrease slowly, from roughly 0.01721 / 0.01580 / 0.01968 / 0.01503 to the values below) ...
Epoch: 8459, Training Loss: 0.01694
Epoch: 8459, Training Loss: 0.01554
Epoch: 8459, Training Loss: 0.01934
Epoch: 8459, Training Loss: 0.01478
Epoch: 8460, Training Loss: 0.01692
Epoch: 8460, Training Loss: 0.01554
Epoch: 8460, Training Loss: 0.01934
Epoch: 8460, Training Loss: 0.01478
Epoch: 8461, Training Loss: 0.01692
Epoch: 8461, Training Loss: 0.01554
Epoch: 8461, Training Loss: 0.01934
Epoch: 8461, Training Loss: 0.01478
Epoch: 8462, Training Loss: 0.01692
Epoch: 8462, Training Loss: 0.01554
Epoch: 8462, Training Loss: 0.01934
Epoch: 8462, Training Loss: 0.01478
Epoch: 8463, Training Loss: 0.01691
Epoch: 8463, Training Loss: 0.01554
Epoch: 8463, Training Loss: 0.01934
Epoch: 8463, Training Loss: 0.01478
Epoch: 8464, Training Loss: 0.01691
Epoch: 8464, Training Loss: 0.01554
Epoch: 8464, Training Loss: 0.01934
Epoch: 8464, Training Loss: 0.01478
Epoch: 8465, Training Loss: 0.01691
Epoch: 8465, Training Loss: 0.01554
Epoch: 8465, Training Loss: 0.01934
Epoch: 8465, Training Loss: 0.01477
Epoch: 8466, Training Loss: 0.01691
Epoch: 8466, Training Loss: 0.01553
Epoch: 8466, Training Loss: 0.01933
Epoch: 8466, Training Loss: 0.01477
Epoch: 8467, Training Loss: 0.01691
Epoch: 8467, Training Loss: 0.01553
Epoch: 8467, Training Loss: 0.01933
Epoch: 8467, Training Loss: 0.01477
Epoch: 8468, Training Loss: 0.01691
Epoch: 8468, Training Loss: 0.01553
Epoch: 8468, Training Loss: 0.01933
Epoch: 8468, Training Loss: 0.01477
Epoch: 8469, Training Loss: 0.01691
Epoch: 8469, Training Loss: 0.01553
Epoch: 8469, Training Loss: 0.01933
Epoch: 8469, Training Loss: 0.01477
Epoch: 8470, Training Loss: 0.01691
Epoch: 8470, Training Loss: 0.01553
Epoch: 8470, Training Loss: 0.01933
Epoch: 8470, Training Loss: 0.01477
Epoch: 8471, Training Loss: 0.01691
Epoch: 8471, Training Loss: 0.01553
Epoch: 8471, Training Loss: 0.01933
Epoch: 8471, Training Loss: 0.01477
Epoch: 8472, Training Loss: 0.01690
Epoch: 8472, Training Loss: 0.01553
Epoch: 8472, Training Loss: 0.01933
Epoch: 8472, Training Loss: 0.01477
Epoch: 8473, Training Loss: 0.01690
Epoch: 8473, Training Loss: 0.01553
Epoch: 8473, Training Loss: 0.01932
Epoch: 8473, Training Loss: 0.01477
Epoch: 8474, Training Loss: 0.01690
Epoch: 8474, Training Loss: 0.01553
Epoch: 8474, Training Loss: 0.01932
Epoch: 8474, Training Loss: 0.01477
Epoch: 8475, Training Loss: 0.01690
Epoch: 8475, Training Loss: 0.01553
Epoch: 8475, Training Loss: 0.01932
Epoch: 8475, Training Loss: 0.01476
Epoch: 8476, Training Loss: 0.01690
Epoch: 8476, Training Loss: 0.01552
Epoch: 8476, Training Loss: 0.01932
Epoch: 8476, Training Loss: 0.01476
Epoch: 8477, Training Loss: 0.01690
Epoch: 8477, Training Loss: 0.01552
Epoch: 8477, Training Loss: 0.01932
Epoch: 8477, Training Loss: 0.01476
Epoch: 8478, Training Loss: 0.01690
Epoch: 8478, Training Loss: 0.01552
Epoch: 8478, Training Loss: 0.01932
Epoch: 8478, Training Loss: 0.01476
Epoch: 8479, Training Loss: 0.01690
Epoch: 8479, Training Loss: 0.01552
Epoch: 8479, Training Loss: 0.01932
Epoch: 8479, Training Loss: 0.01476
Epoch: 8480, Training Loss: 0.01690
Epoch: 8480, Training Loss: 0.01552
Epoch: 8480, Training Loss: 0.01932
Epoch: 8480, Training Loss: 0.01476
Epoch: 8481, Training Loss: 0.01689
Epoch: 8481, Training Loss: 0.01552
Epoch: 8481, Training Loss: 0.01931
Epoch: 8481, Training Loss: 0.01476
Epoch: 8482, Training Loss: 0.01689
Epoch: 8482, Training Loss: 0.01552
Epoch: 8482, Training Loss: 0.01931
Epoch: 8482, Training Loss: 0.01476
Epoch: 8483, Training Loss: 0.01689
Epoch: 8483, Training Loss: 0.01552
Epoch: 8483, Training Loss: 0.01931
Epoch: 8483, Training Loss: 0.01476
Epoch: 8484, Training Loss: 0.01689
Epoch: 8484, Training Loss: 0.01552
Epoch: 8484, Training Loss: 0.01931
Epoch: 8484, Training Loss: 0.01476
Epoch: 8485, Training Loss: 0.01689
Epoch: 8485, Training Loss: 0.01552
Epoch: 8485, Training Loss: 0.01931
Epoch: 8485, Training Loss: 0.01475
Epoch: 8486, Training Loss: 0.01689
Epoch: 8486, Training Loss: 0.01551
Epoch: 8486, Training Loss: 0.01931
Epoch: 8486, Training Loss: 0.01475
Epoch: 8487, Training Loss: 0.01689
Epoch: 8487, Training Loss: 0.01551
Epoch: 8487, Training Loss: 0.01931
Epoch: 8487, Training Loss: 0.01475
Epoch: 8488, Training Loss: 0.01689
Epoch: 8488, Training Loss: 0.01551
Epoch: 8488, Training Loss: 0.01930
Epoch: 8488, Training Loss: 0.01475
Epoch: 8489, Training Loss: 0.01688
Epoch: 8489, Training Loss: 0.01551
Epoch: 8489, Training Loss: 0.01930
Epoch: 8489, Training Loss: 0.01475
Epoch: 8490, Training Loss: 0.01688
Epoch: 8490, Training Loss: 0.01551
Epoch: 8490, Training Loss: 0.01930
Epoch: 8490, Training Loss: 0.01475
Epoch: 8491, Training Loss: 0.01688
Epoch: 8491, Training Loss: 0.01551
Epoch: 8491, Training Loss: 0.01930
Epoch: 8491, Training Loss: 0.01475
Epoch: 8492, Training Loss: 0.01688
Epoch: 8492, Training Loss: 0.01551
Epoch: 8492, Training Loss: 0.01930
Epoch: 8492, Training Loss: 0.01475
Epoch: 8493, Training Loss: 0.01688
Epoch: 8493, Training Loss: 0.01551
Epoch: 8493, Training Loss: 0.01930
Epoch: 8493, Training Loss: 0.01475
Epoch: 8494, Training Loss: 0.01688
Epoch: 8494, Training Loss: 0.01551
Epoch: 8494, Training Loss: 0.01930
Epoch: 8494, Training Loss: 0.01475
Epoch: 8495, Training Loss: 0.01688
Epoch: 8495, Training Loss: 0.01550
Epoch: 8495, Training Loss: 0.01930
Epoch: 8495, Training Loss: 0.01474
Epoch: 8496, Training Loss: 0.01688
Epoch: 8496, Training Loss: 0.01550
Epoch: 8496, Training Loss: 0.01929
Epoch: 8496, Training Loss: 0.01474
Epoch: 8497, Training Loss: 0.01688
Epoch: 8497, Training Loss: 0.01550
Epoch: 8497, Training Loss: 0.01929
Epoch: 8497, Training Loss: 0.01474
Epoch: 8498, Training Loss: 0.01687
Epoch: 8498, Training Loss: 0.01550
Epoch: 8498, Training Loss: 0.01929
Epoch: 8498, Training Loss: 0.01474
Epoch: 8499, Training Loss: 0.01687
Epoch: 8499, Training Loss: 0.01550
Epoch: 8499, Training Loss: 0.01929
Epoch: 8499, Training Loss: 0.01474
Epoch: 8500, Training Loss: 0.01687
Epoch: 8500, Training Loss: 0.01550
Epoch: 8500, Training Loss: 0.01929
Epoch: 8500, Training Loss: 0.01474
Epoch: 8501, Training Loss: 0.01687
Epoch: 8501, Training Loss: 0.01550
Epoch: 8501, Training Loss: 0.01929
Epoch: 8501, Training Loss: 0.01474
Epoch: 8502, Training Loss: 0.01687
Epoch: 8502, Training Loss: 0.01550
Epoch: 8502, Training Loss: 0.01929
Epoch: 8502, Training Loss: 0.01474
Epoch: 8503, Training Loss: 0.01687
Epoch: 8503, Training Loss: 0.01550
Epoch: 8503, Training Loss: 0.01928
Epoch: 8503, Training Loss: 0.01474
Epoch: 8504, Training Loss: 0.01687
Epoch: 8504, Training Loss: 0.01550
Epoch: 8504, Training Loss: 0.01928
Epoch: 8504, Training Loss: 0.01474
Epoch: 8505, Training Loss: 0.01687
Epoch: 8505, Training Loss: 0.01549
Epoch: 8505, Training Loss: 0.01928
Epoch: 8505, Training Loss: 0.01474
Epoch: 8506, Training Loss: 0.01687
Epoch: 8506, Training Loss: 0.01549
Epoch: 8506, Training Loss: 0.01928
Epoch: 8506, Training Loss: 0.01473
Epoch: 8507, Training Loss: 0.01686
Epoch: 8507, Training Loss: 0.01549
Epoch: 8507, Training Loss: 0.01928
Epoch: 8507, Training Loss: 0.01473
Epoch: 8508, Training Loss: 0.01686
Epoch: 8508, Training Loss: 0.01549
Epoch: 8508, Training Loss: 0.01928
Epoch: 8508, Training Loss: 0.01473
Epoch: 8509, Training Loss: 0.01686
Epoch: 8509, Training Loss: 0.01549
Epoch: 8509, Training Loss: 0.01928
Epoch: 8509, Training Loss: 0.01473
Epoch: 8510, Training Loss: 0.01686
Epoch: 8510, Training Loss: 0.01549
Epoch: 8510, Training Loss: 0.01928
Epoch: 8510, Training Loss: 0.01473
Epoch: 8511, Training Loss: 0.01686
Epoch: 8511, Training Loss: 0.01549
Epoch: 8511, Training Loss: 0.01927
Epoch: 8511, Training Loss: 0.01473
Epoch: 8512, Training Loss: 0.01686
Epoch: 8512, Training Loss: 0.01549
Epoch: 8512, Training Loss: 0.01927
Epoch: 8512, Training Loss: 0.01473
Epoch: 8513, Training Loss: 0.01686
Epoch: 8513, Training Loss: 0.01549
Epoch: 8513, Training Loss: 0.01927
Epoch: 8513, Training Loss: 0.01473
Epoch: 8514, Training Loss: 0.01686
Epoch: 8514, Training Loss: 0.01549
Epoch: 8514, Training Loss: 0.01927
Epoch: 8514, Training Loss: 0.01473
Epoch: 8515, Training Loss: 0.01686
Epoch: 8515, Training Loss: 0.01548
Epoch: 8515, Training Loss: 0.01927
Epoch: 8515, Training Loss: 0.01473
Epoch: 8516, Training Loss: 0.01685
Epoch: 8516, Training Loss: 0.01548
Epoch: 8516, Training Loss: 0.01927
Epoch: 8516, Training Loss: 0.01472
Epoch: 8517, Training Loss: 0.01685
Epoch: 8517, Training Loss: 0.01548
Epoch: 8517, Training Loss: 0.01927
Epoch: 8517, Training Loss: 0.01472
Epoch: 8518, Training Loss: 0.01685
Epoch: 8518, Training Loss: 0.01548
Epoch: 8518, Training Loss: 0.01927
Epoch: 8518, Training Loss: 0.01472
Epoch: 8519, Training Loss: 0.01685
Epoch: 8519, Training Loss: 0.01548
Epoch: 8519, Training Loss: 0.01926
Epoch: 8519, Training Loss: 0.01472
Epoch: 8520, Training Loss: 0.01685
Epoch: 8520, Training Loss: 0.01548
Epoch: 8520, Training Loss: 0.01926
Epoch: 8520, Training Loss: 0.01472
Epoch: 8521, Training Loss: 0.01685
Epoch: 8521, Training Loss: 0.01548
Epoch: 8521, Training Loss: 0.01926
Epoch: 8521, Training Loss: 0.01472
Epoch: 8522, Training Loss: 0.01685
Epoch: 8522, Training Loss: 0.01548
Epoch: 8522, Training Loss: 0.01926
Epoch: 8522, Training Loss: 0.01472
Epoch: 8523, Training Loss: 0.01685
Epoch: 8523, Training Loss: 0.01548
Epoch: 8523, Training Loss: 0.01926
Epoch: 8523, Training Loss: 0.01472
Epoch: 8524, Training Loss: 0.01684
Epoch: 8524, Training Loss: 0.01548
Epoch: 8524, Training Loss: 0.01926
Epoch: 8524, Training Loss: 0.01472
Epoch: 8525, Training Loss: 0.01684
Epoch: 8525, Training Loss: 0.01547
Epoch: 8525, Training Loss: 0.01926
Epoch: 8525, Training Loss: 0.01472
Epoch: 8526, Training Loss: 0.01684
Epoch: 8526, Training Loss: 0.01547
Epoch: 8526, Training Loss: 0.01925
Epoch: 8526, Training Loss: 0.01471
Epoch: 8527, Training Loss: 0.01684
Epoch: 8527, Training Loss: 0.01547
Epoch: 8527, Training Loss: 0.01925
Epoch: 8527, Training Loss: 0.01471
Epoch: 8528, Training Loss: 0.01684
Epoch: 8528, Training Loss: 0.01547
Epoch: 8528, Training Loss: 0.01925
Epoch: 8528, Training Loss: 0.01471
Epoch: 8529, Training Loss: 0.01684
Epoch: 8529, Training Loss: 0.01547
Epoch: 8529, Training Loss: 0.01925
Epoch: 8529, Training Loss: 0.01471
Epoch: 8530, Training Loss: 0.01684
Epoch: 8530, Training Loss: 0.01547
Epoch: 8530, Training Loss: 0.01925
Epoch: 8530, Training Loss: 0.01471
Epoch: 8531, Training Loss: 0.01684
Epoch: 8531, Training Loss: 0.01547
Epoch: 8531, Training Loss: 0.01925
Epoch: 8531, Training Loss: 0.01471
Epoch: 8532, Training Loss: 0.01684
Epoch: 8532, Training Loss: 0.01547
Epoch: 8532, Training Loss: 0.01925
Epoch: 8532, Training Loss: 0.01471
Epoch: 8533, Training Loss: 0.01683
Epoch: 8533, Training Loss: 0.01547
Epoch: 8533, Training Loss: 0.01925
Epoch: 8533, Training Loss: 0.01471
Epoch: 8534, Training Loss: 0.01683
Epoch: 8534, Training Loss: 0.01547
Epoch: 8534, Training Loss: 0.01924
Epoch: 8534, Training Loss: 0.01471
Epoch: 8535, Training Loss: 0.01683
Epoch: 8535, Training Loss: 0.01546
Epoch: 8535, Training Loss: 0.01924
Epoch: 8535, Training Loss: 0.01471
Epoch: 8536, Training Loss: 0.01683
Epoch: 8536, Training Loss: 0.01546
Epoch: 8536, Training Loss: 0.01924
Epoch: 8536, Training Loss: 0.01470
Epoch: 8537, Training Loss: 0.01683
Epoch: 8537, Training Loss: 0.01546
Epoch: 8537, Training Loss: 0.01924
Epoch: 8537, Training Loss: 0.01470
Epoch: 8538, Training Loss: 0.01683
Epoch: 8538, Training Loss: 0.01546
Epoch: 8538, Training Loss: 0.01924
Epoch: 8538, Training Loss: 0.01470
Epoch: 8539, Training Loss: 0.01683
Epoch: 8539, Training Loss: 0.01546
Epoch: 8539, Training Loss: 0.01924
Epoch: 8539, Training Loss: 0.01470
Epoch: 8540, Training Loss: 0.01683
Epoch: 8540, Training Loss: 0.01546
Epoch: 8540, Training Loss: 0.01924
Epoch: 8540, Training Loss: 0.01470
Epoch: 8541, Training Loss: 0.01683
Epoch: 8541, Training Loss: 0.01546
Epoch: 8541, Training Loss: 0.01923
Epoch: 8541, Training Loss: 0.01470
Epoch: 8542, Training Loss: 0.01682
Epoch: 8542, Training Loss: 0.01546
Epoch: 8542, Training Loss: 0.01923
Epoch: 8542, Training Loss: 0.01470
Epoch: 8543, Training Loss: 0.01682
Epoch: 8543, Training Loss: 0.01546
Epoch: 8543, Training Loss: 0.01923
Epoch: 8543, Training Loss: 0.01470
Epoch: 8544, Training Loss: 0.01682
Epoch: 8544, Training Loss: 0.01546
Epoch: 8544, Training Loss: 0.01923
Epoch: 8544, Training Loss: 0.01470
Epoch: 8545, Training Loss: 0.01682
Epoch: 8545, Training Loss: 0.01545
Epoch: 8545, Training Loss: 0.01923
Epoch: 8545, Training Loss: 0.01470
Epoch: 8546, Training Loss: 0.01682
Epoch: 8546, Training Loss: 0.01545
Epoch: 8546, Training Loss: 0.01923
Epoch: 8546, Training Loss: 0.01470
Epoch: 8547, Training Loss: 0.01682
Epoch: 8547, Training Loss: 0.01545
Epoch: 8547, Training Loss: 0.01923
Epoch: 8547, Training Loss: 0.01469
Epoch: 8548, Training Loss: 0.01682
Epoch: 8548, Training Loss: 0.01545
Epoch: 8548, Training Loss: 0.01923
Epoch: 8548, Training Loss: 0.01469
Epoch: 8549, Training Loss: 0.01682
Epoch: 8549, Training Loss: 0.01545
Epoch: 8549, Training Loss: 0.01922
Epoch: 8549, Training Loss: 0.01469
Epoch: 8550, Training Loss: 0.01682
Epoch: 8550, Training Loss: 0.01545
Epoch: 8550, Training Loss: 0.01922
Epoch: 8550, Training Loss: 0.01469
Epoch: 8551, Training Loss: 0.01681
Epoch: 8551, Training Loss: 0.01545
Epoch: 8551, Training Loss: 0.01922
Epoch: 8551, Training Loss: 0.01469
Epoch: 8552, Training Loss: 0.01681
Epoch: 8552, Training Loss: 0.01545
Epoch: 8552, Training Loss: 0.01922
Epoch: 8552, Training Loss: 0.01469
Epoch: 8553, Training Loss: 0.01681
Epoch: 8553, Training Loss: 0.01545
Epoch: 8553, Training Loss: 0.01922
Epoch: 8553, Training Loss: 0.01469
Epoch: 8554, Training Loss: 0.01681
Epoch: 8554, Training Loss: 0.01545
Epoch: 8554, Training Loss: 0.01922
Epoch: 8554, Training Loss: 0.01469
Epoch: 8555, Training Loss: 0.01681
Epoch: 8555, Training Loss: 0.01544
Epoch: 8555, Training Loss: 0.01922
Epoch: 8555, Training Loss: 0.01469
Epoch: 8556, Training Loss: 0.01681
Epoch: 8556, Training Loss: 0.01544
Epoch: 8556, Training Loss: 0.01922
Epoch: 8556, Training Loss: 0.01469
Epoch: 8557, Training Loss: 0.01681
Epoch: 8557, Training Loss: 0.01544
Epoch: 8557, Training Loss: 0.01921
Epoch: 8557, Training Loss: 0.01468
Epoch: 8558, Training Loss: 0.01681
Epoch: 8558, Training Loss: 0.01544
Epoch: 8558, Training Loss: 0.01921
Epoch: 8558, Training Loss: 0.01468
Epoch: 8559, Training Loss: 0.01681
Epoch: 8559, Training Loss: 0.01544
Epoch: 8559, Training Loss: 0.01921
Epoch: 8559, Training Loss: 0.01468
Epoch: 8560, Training Loss: 0.01680
Epoch: 8560, Training Loss: 0.01544
Epoch: 8560, Training Loss: 0.01921
Epoch: 8560, Training Loss: 0.01468
Epoch: 8561, Training Loss: 0.01680
Epoch: 8561, Training Loss: 0.01544
Epoch: 8561, Training Loss: 0.01921
Epoch: 8561, Training Loss: 0.01468
Epoch: 8562, Training Loss: 0.01680
Epoch: 8562, Training Loss: 0.01544
Epoch: 8562, Training Loss: 0.01921
Epoch: 8562, Training Loss: 0.01468
Epoch: 8563, Training Loss: 0.01680
Epoch: 8563, Training Loss: 0.01544
Epoch: 8563, Training Loss: 0.01921
Epoch: 8563, Training Loss: 0.01468
Epoch: 8564, Training Loss: 0.01680
Epoch: 8564, Training Loss: 0.01544
Epoch: 8564, Training Loss: 0.01920
Epoch: 8564, Training Loss: 0.01468
Epoch: 8565, Training Loss: 0.01680
Epoch: 8565, Training Loss: 0.01543
Epoch: 8565, Training Loss: 0.01920
Epoch: 8565, Training Loss: 0.01468
Epoch: 8566, Training Loss: 0.01680
Epoch: 8566, Training Loss: 0.01543
Epoch: 8566, Training Loss: 0.01920
Epoch: 8566, Training Loss: 0.01468
Epoch: 8567, Training Loss: 0.01680
Epoch: 8567, Training Loss: 0.01543
Epoch: 8567, Training Loss: 0.01920
Epoch: 8567, Training Loss: 0.01467
Epoch: 8568, Training Loss: 0.01679
Epoch: 8568, Training Loss: 0.01543
Epoch: 8568, Training Loss: 0.01920
Epoch: 8568, Training Loss: 0.01467
Epoch: 8569, Training Loss: 0.01679
Epoch: 8569, Training Loss: 0.01543
Epoch: 8569, Training Loss: 0.01920
Epoch: 8569, Training Loss: 0.01467
Epoch: 8570, Training Loss: 0.01679
Epoch: 8570, Training Loss: 0.01543
Epoch: 8570, Training Loss: 0.01920
Epoch: 8570, Training Loss: 0.01467
Epoch: 8571, Training Loss: 0.01679
Epoch: 8571, Training Loss: 0.01543
Epoch: 8571, Training Loss: 0.01920
Epoch: 8571, Training Loss: 0.01467
Epoch: 8572, Training Loss: 0.01679
Epoch: 8572, Training Loss: 0.01543
Epoch: 8572, Training Loss: 0.01919
Epoch: 8572, Training Loss: 0.01467
Epoch: 8573, Training Loss: 0.01679
Epoch: 8573, Training Loss: 0.01543
Epoch: 8573, Training Loss: 0.01919
Epoch: 8573, Training Loss: 0.01467
Epoch: 8574, Training Loss: 0.01679
Epoch: 8574, Training Loss: 0.01543
Epoch: 8574, Training Loss: 0.01919
Epoch: 8574, Training Loss: 0.01467
Epoch: 8575, Training Loss: 0.01679
Epoch: 8575, Training Loss: 0.01542
Epoch: 8575, Training Loss: 0.01919
Epoch: 8575, Training Loss: 0.01467
Epoch: 8576, Training Loss: 0.01679
Epoch: 8576, Training Loss: 0.01542
Epoch: 8576, Training Loss: 0.01919
Epoch: 8576, Training Loss: 0.01467
Epoch: 8577, Training Loss: 0.01678
Epoch: 8577, Training Loss: 0.01542
Epoch: 8577, Training Loss: 0.01919
Epoch: 8577, Training Loss: 0.01467
Epoch: 8578, Training Loss: 0.01678
Epoch: 8578, Training Loss: 0.01542
Epoch: 8578, Training Loss: 0.01919
Epoch: 8578, Training Loss: 0.01466
Epoch: 8579, Training Loss: 0.01678
Epoch: 8579, Training Loss: 0.01542
Epoch: 8579, Training Loss: 0.01919
Epoch: 8579, Training Loss: 0.01466
Epoch: 8580, Training Loss: 0.01678
Epoch: 8580, Training Loss: 0.01542
Epoch: 8580, Training Loss: 0.01918
Epoch: 8580, Training Loss: 0.01466
Epoch: 8581, Training Loss: 0.01678
Epoch: 8581, Training Loss: 0.01542
Epoch: 8581, Training Loss: 0.01918
Epoch: 8581, Training Loss: 0.01466
Epoch: 8582, Training Loss: 0.01678
Epoch: 8582, Training Loss: 0.01542
Epoch: 8582, Training Loss: 0.01918
Epoch: 8582, Training Loss: 0.01466
Epoch: 8583, Training Loss: 0.01678
Epoch: 8583, Training Loss: 0.01542
Epoch: 8583, Training Loss: 0.01918
Epoch: 8583, Training Loss: 0.01466
Epoch: 8584, Training Loss: 0.01678
Epoch: 8584, Training Loss: 0.01542
Epoch: 8584, Training Loss: 0.01918
Epoch: 8584, Training Loss: 0.01466
Epoch: 8585, Training Loss: 0.01678
Epoch: 8585, Training Loss: 0.01541
Epoch: 8585, Training Loss: 0.01918
Epoch: 8585, Training Loss: 0.01466
Epoch: 8586, Training Loss: 0.01677
Epoch: 8586, Training Loss: 0.01541
Epoch: 8586, Training Loss: 0.01918
Epoch: 8586, Training Loss: 0.01466
Epoch: 8587, Training Loss: 0.01677
Epoch: 8587, Training Loss: 0.01541
Epoch: 8587, Training Loss: 0.01918
Epoch: 8587, Training Loss: 0.01466
Epoch: 8588, Training Loss: 0.01677
Epoch: 8588, Training Loss: 0.01541
Epoch: 8588, Training Loss: 0.01917
Epoch: 8588, Training Loss: 0.01465
Epoch: 8589, Training Loss: 0.01677
Epoch: 8589, Training Loss: 0.01541
Epoch: 8589, Training Loss: 0.01917
Epoch: 8589, Training Loss: 0.01465
Epoch: 8590, Training Loss: 0.01677
Epoch: 8590, Training Loss: 0.01541
Epoch: 8590, Training Loss: 0.01917
Epoch: 8590, Training Loss: 0.01465
Epoch: 8591, Training Loss: 0.01677
Epoch: 8591, Training Loss: 0.01541
Epoch: 8591, Training Loss: 0.01917
Epoch: 8591, Training Loss: 0.01465
Epoch: 8592, Training Loss: 0.01677
Epoch: 8592, Training Loss: 0.01541
Epoch: 8592, Training Loss: 0.01917
Epoch: 8592, Training Loss: 0.01465
Epoch: 8593, Training Loss: 0.01677
Epoch: 8593, Training Loss: 0.01541
Epoch: 8593, Training Loss: 0.01917
Epoch: 8593, Training Loss: 0.01465
Epoch: 8594, Training Loss: 0.01677
Epoch: 8594, Training Loss: 0.01541
Epoch: 8594, Training Loss: 0.01917
Epoch: 8594, Training Loss: 0.01465
Epoch: 8595, Training Loss: 0.01676
Epoch: 8595, Training Loss: 0.01540
Epoch: 8595, Training Loss: 0.01916
Epoch: 8595, Training Loss: 0.01465
Epoch: 8596, Training Loss: 0.01676
Epoch: 8596, Training Loss: 0.01540
Epoch: 8596, Training Loss: 0.01916
Epoch: 8596, Training Loss: 0.01465
Epoch: 8597, Training Loss: 0.01676
Epoch: 8597, Training Loss: 0.01540
Epoch: 8597, Training Loss: 0.01916
Epoch: 8597, Training Loss: 0.01465
Epoch: 8598, Training Loss: 0.01676
Epoch: 8598, Training Loss: 0.01540
Epoch: 8598, Training Loss: 0.01916
Epoch: 8598, Training Loss: 0.01464
Epoch: 8599, Training Loss: 0.01676
Epoch: 8599, Training Loss: 0.01540
Epoch: 8599, Training Loss: 0.01916
Epoch: 8599, Training Loss: 0.01464
Epoch: 8600, Training Loss: 0.01676
Epoch: 8600, Training Loss: 0.01540
Epoch: 8600, Training Loss: 0.01916
Epoch: 8600, Training Loss: 0.01464
Epoch: 8601, Training Loss: 0.01676
Epoch: 8601, Training Loss: 0.01540
Epoch: 8601, Training Loss: 0.01916
Epoch: 8601, Training Loss: 0.01464
Epoch: 8602, Training Loss: 0.01676
Epoch: 8602, Training Loss: 0.01540
Epoch: 8602, Training Loss: 0.01916
Epoch: 8602, Training Loss: 0.01464
Epoch: 8603, Training Loss: 0.01676
Epoch: 8603, Training Loss: 0.01540
Epoch: 8603, Training Loss: 0.01915
Epoch: 8603, Training Loss: 0.01464
Epoch: 8604, Training Loss: 0.01675
Epoch: 8604, Training Loss: 0.01540
Epoch: 8604, Training Loss: 0.01915
Epoch: 8604, Training Loss: 0.01464
Epoch: 8605, Training Loss: 0.01675
Epoch: 8605, Training Loss: 0.01539
Epoch: 8605, Training Loss: 0.01915
Epoch: 8605, Training Loss: 0.01464
Epoch: 8606, Training Loss: 0.01675
Epoch: 8606, Training Loss: 0.01539
Epoch: 8606, Training Loss: 0.01915
Epoch: 8606, Training Loss: 0.01464
Epoch: 8607, Training Loss: 0.01675
Epoch: 8607, Training Loss: 0.01539
Epoch: 8607, Training Loss: 0.01915
Epoch: 8607, Training Loss: 0.01464
Epoch: 8608, Training Loss: 0.01675
Epoch: 8608, Training Loss: 0.01539
Epoch: 8608, Training Loss: 0.01915
Epoch: 8608, Training Loss: 0.01464
Epoch: 8609, Training Loss: 0.01675
Epoch: 8609, Training Loss: 0.01539
Epoch: 8609, Training Loss: 0.01915
Epoch: 8609, Training Loss: 0.01463
Epoch: 8610, Training Loss: 0.01675
Epoch: 8610, Training Loss: 0.01539
Epoch: 8610, Training Loss: 0.01915
Epoch: 8610, Training Loss: 0.01463
Epoch: 8611, Training Loss: 0.01675
Epoch: 8611, Training Loss: 0.01539
Epoch: 8611, Training Loss: 0.01914
Epoch: 8611, Training Loss: 0.01463
Epoch: 8612, Training Loss: 0.01675
Epoch: 8612, Training Loss: 0.01539
Epoch: 8612, Training Loss: 0.01914
Epoch: 8612, Training Loss: 0.01463
Epoch: 8613, Training Loss: 0.01674
Epoch: 8613, Training Loss: 0.01539
Epoch: 8613, Training Loss: 0.01914
Epoch: 8613, Training Loss: 0.01463
Epoch: 8614, Training Loss: 0.01674
Epoch: 8614, Training Loss: 0.01539
Epoch: 8614, Training Loss: 0.01914
Epoch: 8614, Training Loss: 0.01463
Epoch: 8615, Training Loss: 0.01674
Epoch: 8615, Training Loss: 0.01538
Epoch: 8615, Training Loss: 0.01914
Epoch: 8615, Training Loss: 0.01463
Epoch: 8616, Training Loss: 0.01674
Epoch: 8616, Training Loss: 0.01538
Epoch: 8616, Training Loss: 0.01914
Epoch: 8616, Training Loss: 0.01463
Epoch: 8617, Training Loss: 0.01674
Epoch: 8617, Training Loss: 0.01538
Epoch: 8617, Training Loss: 0.01914
Epoch: 8617, Training Loss: 0.01463
Epoch: 8618, Training Loss: 0.01674
Epoch: 8618, Training Loss: 0.01538
Epoch: 8618, Training Loss: 0.01913
Epoch: 8618, Training Loss: 0.01463
Epoch: 8619, Training Loss: 0.01674
Epoch: 8619, Training Loss: 0.01538
Epoch: 8619, Training Loss: 0.01913
Epoch: 8619, Training Loss: 0.01462
Epoch: 8620, Training Loss: 0.01674
Epoch: 8620, Training Loss: 0.01538
Epoch: 8620, Training Loss: 0.01913
Epoch: 8620, Training Loss: 0.01462
Epoch: 8621, Training Loss: 0.01674
Epoch: 8621, Training Loss: 0.01538
Epoch: 8621, Training Loss: 0.01913
Epoch: 8621, Training Loss: 0.01462
Epoch: 8622, Training Loss: 0.01673
Epoch: 8622, Training Loss: 0.01538
Epoch: 8622, Training Loss: 0.01913
Epoch: 8622, Training Loss: 0.01462
Epoch: 8623, Training Loss: 0.01673
Epoch: 8623, Training Loss: 0.01538
Epoch: 8623, Training Loss: 0.01913
Epoch: 8623, Training Loss: 0.01462
Epoch: 8624, Training Loss: 0.01673
Epoch: 8624, Training Loss: 0.01538
Epoch: 8624, Training Loss: 0.01913
Epoch: 8624, Training Loss: 0.01462
Epoch: 8625, Training Loss: 0.01673
Epoch: 8625, Training Loss: 0.01537
Epoch: 8625, Training Loss: 0.01913
Epoch: 8625, Training Loss: 0.01462
Epoch: 8626, Training Loss: 0.01673
Epoch: 8626, Training Loss: 0.01537
Epoch: 8626, Training Loss: 0.01912
Epoch: 8626, Training Loss: 0.01462
Epoch: 8627, Training Loss: 0.01673
Epoch: 8627, Training Loss: 0.01537
Epoch: 8627, Training Loss: 0.01912
Epoch: 8627, Training Loss: 0.01462
Epoch: 8628, Training Loss: 0.01673
Epoch: 8628, Training Loss: 0.01537
Epoch: 8628, Training Loss: 0.01912
Epoch: 8628, Training Loss: 0.01462
Epoch: 8629, Training Loss: 0.01673
Epoch: 8629, Training Loss: 0.01537
Epoch: 8629, Training Loss: 0.01912
Epoch: 8629, Training Loss: 0.01462
Epoch: 8630, Training Loss: 0.01673
Epoch: 8630, Training Loss: 0.01537
Epoch: 8630, Training Loss: 0.01912
Epoch: 8630, Training Loss: 0.01461
Epoch: 8631, Training Loss: 0.01672
Epoch: 8631, Training Loss: 0.01537
Epoch: 8631, Training Loss: 0.01912
Epoch: 8631, Training Loss: 0.01461
Epoch: 8632, Training Loss: 0.01672
Epoch: 8632, Training Loss: 0.01537
Epoch: 8632, Training Loss: 0.01912
Epoch: 8632, Training Loss: 0.01461
Epoch: 8633, Training Loss: 0.01672
Epoch: 8633, Training Loss: 0.01537
Epoch: 8633, Training Loss: 0.01912
Epoch: 8633, Training Loss: 0.01461
Epoch: 8634, Training Loss: 0.01672
Epoch: 8634, Training Loss: 0.01537
Epoch: 8634, Training Loss: 0.01911
Epoch: 8634, Training Loss: 0.01461
Epoch: 8635, Training Loss: 0.01672
Epoch: 8635, Training Loss: 0.01536
Epoch: 8635, Training Loss: 0.01911
Epoch: 8635, Training Loss: 0.01461
Epoch: 8636, Training Loss: 0.01672
Epoch: 8636, Training Loss: 0.01536
Epoch: 8636, Training Loss: 0.01911
Epoch: 8636, Training Loss: 0.01461
Epoch: 8637, Training Loss: 0.01672
Epoch: 8637, Training Loss: 0.01536
Epoch: 8637, Training Loss: 0.01911
Epoch: 8637, Training Loss: 0.01461
Epoch: 8638, Training Loss: 0.01672
Epoch: 8638, Training Loss: 0.01536
Epoch: 8638, Training Loss: 0.01911
Epoch: 8638, Training Loss: 0.01461
Epoch: 8639, Training Loss: 0.01672
Epoch: 8639, Training Loss: 0.01536
Epoch: 8639, Training Loss: 0.01911
Epoch: 8639, Training Loss: 0.01461
Epoch: 8640, Training Loss: 0.01671
Epoch: 8640, Training Loss: 0.01536
Epoch: 8640, Training Loss: 0.01911
Epoch: 8640, Training Loss: 0.01460
Epoch: 8641, Training Loss: 0.01671
Epoch: 8641, Training Loss: 0.01536
Epoch: 8641, Training Loss: 0.01911
Epoch: 8641, Training Loss: 0.01460
Epoch: 8642, Training Loss: 0.01671
Epoch: 8642, Training Loss: 0.01536
Epoch: 8642, Training Loss: 0.01910
Epoch: 8642, Training Loss: 0.01460
Epoch: 8643, Training Loss: 0.01671
Epoch: 8643, Training Loss: 0.01536
Epoch: 8643, Training Loss: 0.01910
Epoch: 8643, Training Loss: 0.01460
Epoch: 8644, Training Loss: 0.01671
Epoch: 8644, Training Loss: 0.01536
Epoch: 8644, Training Loss: 0.01910
Epoch: 8644, Training Loss: 0.01460
Epoch: 8645, Training Loss: 0.01671
Epoch: 8645, Training Loss: 0.01535
Epoch: 8645, Training Loss: 0.01910
Epoch: 8645, Training Loss: 0.01460
Epoch: 8646, Training Loss: 0.01671
Epoch: 8646, Training Loss: 0.01535
Epoch: 8646, Training Loss: 0.01910
Epoch: 8646, Training Loss: 0.01460
Epoch: 8647, Training Loss: 0.01671
Epoch: 8647, Training Loss: 0.01535
Epoch: 8647, Training Loss: 0.01910
Epoch: 8647, Training Loss: 0.01460
Epoch: 8648, Training Loss: 0.01671
Epoch: 8648, Training Loss: 0.01535
Epoch: 8648, Training Loss: 0.01910
Epoch: 8648, Training Loss: 0.01460
Epoch: 8649, Training Loss: 0.01670
Epoch: 8649, Training Loss: 0.01535
Epoch: 8649, Training Loss: 0.01910
Epoch: 8649, Training Loss: 0.01460
Epoch: 8650, Training Loss: 0.01670
Epoch: 8650, Training Loss: 0.01535
Epoch: 8650, Training Loss: 0.01909
Epoch: 8650, Training Loss: 0.01460
Epoch: 8651, Training Loss: 0.01670
Epoch: 8651, Training Loss: 0.01535
Epoch: 8651, Training Loss: 0.01909
Epoch: 8651, Training Loss: 0.01459
Epoch: 8652, Training Loss: 0.01670
Epoch: 8652, Training Loss: 0.01535
Epoch: 8652, Training Loss: 0.01909
Epoch: 8652, Training Loss: 0.01459
Epoch: 8653, Training Loss: 0.01670
Epoch: 8653, Training Loss: 0.01535
Epoch: 8653, Training Loss: 0.01909
Epoch: 8653, Training Loss: 0.01459
Epoch: 8654, Training Loss: 0.01670
Epoch: 8654, Training Loss: 0.01535
Epoch: 8654, Training Loss: 0.01909
Epoch: 8654, Training Loss: 0.01459
Epoch: 8655, Training Loss: 0.01670
Epoch: 8655, Training Loss: 0.01534
Epoch: 8655, Training Loss: 0.01909
Epoch: 8655, Training Loss: 0.01459
... [output truncated: one loss line is printed per training pattern per epoch; the four per-pattern losses decrease very slowly from roughly (0.0167, 0.0153, 0.0191, 0.0146) at epoch 8654, still well above the 0.008 target] ...
Epoch: 8903, Training Loss: 0.01643
Epoch: 8903, Training Loss: 0.01511
Epoch: 8903, Training Loss: 0.01878
Epoch: 8903, Training Loss: 0.01436
Epoch: 8904, Training Loss: 0.01643
Epoch: 8904, Training Loss: 0.01510
Epoch: 8904, Training Loss: 0.01878
Epoch: 8904, Training Loss: 0.01436
Epoch: 8905, Training Loss: 0.01643
Epoch: 8905, Training Loss: 0.01510
Epoch: 8905, Training Loss: 0.01878
Epoch: 8905, Training Loss: 0.01436
Epoch: 8906, Training Loss: 0.01643
Epoch: 8906, Training Loss: 0.01510
Epoch: 8906, Training Loss: 0.01877
Epoch: 8906, Training Loss: 0.01436
Epoch: 8907, Training Loss: 0.01642
Epoch: 8907, Training Loss: 0.01510
Epoch: 8907, Training Loss: 0.01877
Epoch: 8907, Training Loss: 0.01436
Epoch: 8908, Training Loss: 0.01642
Epoch: 8908, Training Loss: 0.01510
Epoch: 8908, Training Loss: 0.01877
Epoch: 8908, Training Loss: 0.01436
Epoch: 8909, Training Loss: 0.01642
Epoch: 8909, Training Loss: 0.01510
Epoch: 8909, Training Loss: 0.01877
Epoch: 8909, Training Loss: 0.01435
Epoch: 8910, Training Loss: 0.01642
Epoch: 8910, Training Loss: 0.01510
Epoch: 8910, Training Loss: 0.01877
Epoch: 8910, Training Loss: 0.01435
Epoch: 8911, Training Loss: 0.01642
Epoch: 8911, Training Loss: 0.01510
Epoch: 8911, Training Loss: 0.01877
Epoch: 8911, Training Loss: 0.01435
Epoch: 8912, Training Loss: 0.01642
Epoch: 8912, Training Loss: 0.01510
Epoch: 8912, Training Loss: 0.01877
Epoch: 8912, Training Loss: 0.01435
Epoch: 8913, Training Loss: 0.01642
Epoch: 8913, Training Loss: 0.01510
Epoch: 8913, Training Loss: 0.01877
Epoch: 8913, Training Loss: 0.01435
Epoch: 8914, Training Loss: 0.01642
Epoch: 8914, Training Loss: 0.01510
Epoch: 8914, Training Loss: 0.01876
Epoch: 8914, Training Loss: 0.01435
Epoch: 8915, Training Loss: 0.01642
Epoch: 8915, Training Loss: 0.01509
Epoch: 8915, Training Loss: 0.01876
Epoch: 8915, Training Loss: 0.01435
Epoch: 8916, Training Loss: 0.01641
Epoch: 8916, Training Loss: 0.01509
Epoch: 8916, Training Loss: 0.01876
Epoch: 8916, Training Loss: 0.01435
Epoch: 8917, Training Loss: 0.01641
Epoch: 8917, Training Loss: 0.01509
Epoch: 8917, Training Loss: 0.01876
Epoch: 8917, Training Loss: 0.01435
Epoch: 8918, Training Loss: 0.01641
Epoch: 8918, Training Loss: 0.01509
Epoch: 8918, Training Loss: 0.01876
Epoch: 8918, Training Loss: 0.01435
Epoch: 8919, Training Loss: 0.01641
Epoch: 8919, Training Loss: 0.01509
Epoch: 8919, Training Loss: 0.01876
Epoch: 8919, Training Loss: 0.01435
Epoch: 8920, Training Loss: 0.01641
Epoch: 8920, Training Loss: 0.01509
Epoch: 8920, Training Loss: 0.01876
Epoch: 8920, Training Loss: 0.01434
Epoch: 8921, Training Loss: 0.01641
Epoch: 8921, Training Loss: 0.01509
Epoch: 8921, Training Loss: 0.01876
Epoch: 8921, Training Loss: 0.01434
Epoch: 8922, Training Loss: 0.01641
Epoch: 8922, Training Loss: 0.01509
Epoch: 8922, Training Loss: 0.01875
Epoch: 8922, Training Loss: 0.01434
Epoch: 8923, Training Loss: 0.01641
Epoch: 8923, Training Loss: 0.01509
Epoch: 8923, Training Loss: 0.01875
Epoch: 8923, Training Loss: 0.01434
Epoch: 8924, Training Loss: 0.01641
Epoch: 8924, Training Loss: 0.01509
Epoch: 8924, Training Loss: 0.01875
Epoch: 8924, Training Loss: 0.01434
Epoch: 8925, Training Loss: 0.01641
Epoch: 8925, Training Loss: 0.01508
Epoch: 8925, Training Loss: 0.01875
Epoch: 8925, Training Loss: 0.01434
Epoch: 8926, Training Loss: 0.01640
Epoch: 8926, Training Loss: 0.01508
Epoch: 8926, Training Loss: 0.01875
Epoch: 8926, Training Loss: 0.01434
Epoch: 8927, Training Loss: 0.01640
Epoch: 8927, Training Loss: 0.01508
Epoch: 8927, Training Loss: 0.01875
Epoch: 8927, Training Loss: 0.01434
Epoch: 8928, Training Loss: 0.01640
Epoch: 8928, Training Loss: 0.01508
Epoch: 8928, Training Loss: 0.01875
Epoch: 8928, Training Loss: 0.01434
Epoch: 8929, Training Loss: 0.01640
Epoch: 8929, Training Loss: 0.01508
Epoch: 8929, Training Loss: 0.01875
Epoch: 8929, Training Loss: 0.01434
Epoch: 8930, Training Loss: 0.01640
Epoch: 8930, Training Loss: 0.01508
Epoch: 8930, Training Loss: 0.01875
Epoch: 8930, Training Loss: 0.01434
Epoch: 8931, Training Loss: 0.01640
Epoch: 8931, Training Loss: 0.01508
Epoch: 8931, Training Loss: 0.01874
Epoch: 8931, Training Loss: 0.01433
Epoch: 8932, Training Loss: 0.01640
Epoch: 8932, Training Loss: 0.01508
Epoch: 8932, Training Loss: 0.01874
Epoch: 8932, Training Loss: 0.01433
Epoch: 8933, Training Loss: 0.01640
Epoch: 8933, Training Loss: 0.01508
Epoch: 8933, Training Loss: 0.01874
Epoch: 8933, Training Loss: 0.01433
Epoch: 8934, Training Loss: 0.01640
Epoch: 8934, Training Loss: 0.01508
Epoch: 8934, Training Loss: 0.01874
Epoch: 8934, Training Loss: 0.01433
Epoch: 8935, Training Loss: 0.01639
Epoch: 8935, Training Loss: 0.01508
Epoch: 8935, Training Loss: 0.01874
Epoch: 8935, Training Loss: 0.01433
Epoch: 8936, Training Loss: 0.01639
Epoch: 8936, Training Loss: 0.01507
Epoch: 8936, Training Loss: 0.01874
Epoch: 8936, Training Loss: 0.01433
Epoch: 8937, Training Loss: 0.01639
Epoch: 8937, Training Loss: 0.01507
Epoch: 8937, Training Loss: 0.01874
Epoch: 8937, Training Loss: 0.01433
Epoch: 8938, Training Loss: 0.01639
Epoch: 8938, Training Loss: 0.01507
Epoch: 8938, Training Loss: 0.01874
Epoch: 8938, Training Loss: 0.01433
Epoch: 8939, Training Loss: 0.01639
Epoch: 8939, Training Loss: 0.01507
Epoch: 8939, Training Loss: 0.01873
Epoch: 8939, Training Loss: 0.01433
Epoch: 8940, Training Loss: 0.01639
Epoch: 8940, Training Loss: 0.01507
Epoch: 8940, Training Loss: 0.01873
Epoch: 8940, Training Loss: 0.01433
Epoch: 8941, Training Loss: 0.01639
Epoch: 8941, Training Loss: 0.01507
Epoch: 8941, Training Loss: 0.01873
Epoch: 8941, Training Loss: 0.01433
Epoch: 8942, Training Loss: 0.01639
Epoch: 8942, Training Loss: 0.01507
Epoch: 8942, Training Loss: 0.01873
Epoch: 8942, Training Loss: 0.01433
Epoch: 8943, Training Loss: 0.01639
Epoch: 8943, Training Loss: 0.01507
Epoch: 8943, Training Loss: 0.01873
Epoch: 8943, Training Loss: 0.01432
Epoch: 8944, Training Loss: 0.01639
Epoch: 8944, Training Loss: 0.01507
Epoch: 8944, Training Loss: 0.01873
Epoch: 8944, Training Loss: 0.01432
Epoch: 8945, Training Loss: 0.01638
Epoch: 8945, Training Loss: 0.01507
Epoch: 8945, Training Loss: 0.01873
Epoch: 8945, Training Loss: 0.01432
Epoch: 8946, Training Loss: 0.01638
Epoch: 8946, Training Loss: 0.01506
Epoch: 8946, Training Loss: 0.01873
Epoch: 8946, Training Loss: 0.01432
Epoch: 8947, Training Loss: 0.01638
Epoch: 8947, Training Loss: 0.01506
Epoch: 8947, Training Loss: 0.01872
Epoch: 8947, Training Loss: 0.01432
Epoch: 8948, Training Loss: 0.01638
Epoch: 8948, Training Loss: 0.01506
Epoch: 8948, Training Loss: 0.01872
Epoch: 8948, Training Loss: 0.01432
Epoch: 8949, Training Loss: 0.01638
Epoch: 8949, Training Loss: 0.01506
Epoch: 8949, Training Loss: 0.01872
Epoch: 8949, Training Loss: 0.01432
Epoch: 8950, Training Loss: 0.01638
Epoch: 8950, Training Loss: 0.01506
Epoch: 8950, Training Loss: 0.01872
Epoch: 8950, Training Loss: 0.01432
Epoch: 8951, Training Loss: 0.01638
Epoch: 8951, Training Loss: 0.01506
Epoch: 8951, Training Loss: 0.01872
Epoch: 8951, Training Loss: 0.01432
Epoch: 8952, Training Loss: 0.01638
Epoch: 8952, Training Loss: 0.01506
Epoch: 8952, Training Loss: 0.01872
Epoch: 8952, Training Loss: 0.01432
Epoch: 8953, Training Loss: 0.01638
Epoch: 8953, Training Loss: 0.01506
Epoch: 8953, Training Loss: 0.01872
Epoch: 8953, Training Loss: 0.01432
Epoch: 8954, Training Loss: 0.01637
Epoch: 8954, Training Loss: 0.01506
Epoch: 8954, Training Loss: 0.01872
Epoch: 8954, Training Loss: 0.01431
Epoch: 8955, Training Loss: 0.01637
Epoch: 8955, Training Loss: 0.01506
Epoch: 8955, Training Loss: 0.01871
Epoch: 8955, Training Loss: 0.01431
Epoch: 8956, Training Loss: 0.01637
Epoch: 8956, Training Loss: 0.01506
Epoch: 8956, Training Loss: 0.01871
Epoch: 8956, Training Loss: 0.01431
Epoch: 8957, Training Loss: 0.01637
Epoch: 8957, Training Loss: 0.01505
Epoch: 8957, Training Loss: 0.01871
Epoch: 8957, Training Loss: 0.01431
Epoch: 8958, Training Loss: 0.01637
Epoch: 8958, Training Loss: 0.01505
Epoch: 8958, Training Loss: 0.01871
Epoch: 8958, Training Loss: 0.01431
Epoch: 8959, Training Loss: 0.01637
Epoch: 8959, Training Loss: 0.01505
Epoch: 8959, Training Loss: 0.01871
Epoch: 8959, Training Loss: 0.01431
Epoch: 8960, Training Loss: 0.01637
Epoch: 8960, Training Loss: 0.01505
Epoch: 8960, Training Loss: 0.01871
Epoch: 8960, Training Loss: 0.01431
Epoch: 8961, Training Loss: 0.01637
Epoch: 8961, Training Loss: 0.01505
Epoch: 8961, Training Loss: 0.01871
Epoch: 8961, Training Loss: 0.01431
Epoch: 8962, Training Loss: 0.01637
Epoch: 8962, Training Loss: 0.01505
Epoch: 8962, Training Loss: 0.01871
Epoch: 8962, Training Loss: 0.01431
Epoch: 8963, Training Loss: 0.01637
Epoch: 8963, Training Loss: 0.01505
Epoch: 8963, Training Loss: 0.01871
Epoch: 8963, Training Loss: 0.01431
Epoch: 8964, Training Loss: 0.01636
Epoch: 8964, Training Loss: 0.01505
Epoch: 8964, Training Loss: 0.01870
Epoch: 8964, Training Loss: 0.01431
Epoch: 8965, Training Loss: 0.01636
Epoch: 8965, Training Loss: 0.01505
Epoch: 8965, Training Loss: 0.01870
Epoch: 8965, Training Loss: 0.01430
Epoch: 8966, Training Loss: 0.01636
Epoch: 8966, Training Loss: 0.01505
Epoch: 8966, Training Loss: 0.01870
Epoch: 8966, Training Loss: 0.01430
Epoch: 8967, Training Loss: 0.01636
Epoch: 8967, Training Loss: 0.01505
Epoch: 8967, Training Loss: 0.01870
Epoch: 8967, Training Loss: 0.01430
Epoch: 8968, Training Loss: 0.01636
Epoch: 8968, Training Loss: 0.01504
Epoch: 8968, Training Loss: 0.01870
Epoch: 8968, Training Loss: 0.01430
Epoch: 8969, Training Loss: 0.01636
Epoch: 8969, Training Loss: 0.01504
Epoch: 8969, Training Loss: 0.01870
Epoch: 8969, Training Loss: 0.01430
Epoch: 8970, Training Loss: 0.01636
Epoch: 8970, Training Loss: 0.01504
Epoch: 8970, Training Loss: 0.01870
Epoch: 8970, Training Loss: 0.01430
Epoch: 8971, Training Loss: 0.01636
Epoch: 8971, Training Loss: 0.01504
Epoch: 8971, Training Loss: 0.01870
Epoch: 8971, Training Loss: 0.01430
Epoch: 8972, Training Loss: 0.01636
Epoch: 8972, Training Loss: 0.01504
Epoch: 8972, Training Loss: 0.01869
Epoch: 8972, Training Loss: 0.01430
Epoch: 8973, Training Loss: 0.01635
Epoch: 8973, Training Loss: 0.01504
Epoch: 8973, Training Loss: 0.01869
Epoch: 8973, Training Loss: 0.01430
Epoch: 8974, Training Loss: 0.01635
Epoch: 8974, Training Loss: 0.01504
Epoch: 8974, Training Loss: 0.01869
Epoch: 8974, Training Loss: 0.01430
Epoch: 8975, Training Loss: 0.01635
Epoch: 8975, Training Loss: 0.01504
Epoch: 8975, Training Loss: 0.01869
Epoch: 8975, Training Loss: 0.01430
Epoch: 8976, Training Loss: 0.01635
Epoch: 8976, Training Loss: 0.01504
Epoch: 8976, Training Loss: 0.01869
Epoch: 8976, Training Loss: 0.01429
Epoch: 8977, Training Loss: 0.01635
Epoch: 8977, Training Loss: 0.01504
Epoch: 8977, Training Loss: 0.01869
Epoch: 8977, Training Loss: 0.01429
Epoch: 8978, Training Loss: 0.01635
Epoch: 8978, Training Loss: 0.01504
Epoch: 8978, Training Loss: 0.01869
Epoch: 8978, Training Loss: 0.01429
Epoch: 8979, Training Loss: 0.01635
Epoch: 8979, Training Loss: 0.01503
Epoch: 8979, Training Loss: 0.01869
Epoch: 8979, Training Loss: 0.01429
Epoch: 8980, Training Loss: 0.01635
Epoch: 8980, Training Loss: 0.01503
Epoch: 8980, Training Loss: 0.01868
Epoch: 8980, Training Loss: 0.01429
Epoch: 8981, Training Loss: 0.01635
Epoch: 8981, Training Loss: 0.01503
Epoch: 8981, Training Loss: 0.01868
Epoch: 8981, Training Loss: 0.01429
Epoch: 8982, Training Loss: 0.01635
Epoch: 8982, Training Loss: 0.01503
Epoch: 8982, Training Loss: 0.01868
Epoch: 8982, Training Loss: 0.01429
Epoch: 8983, Training Loss: 0.01634
Epoch: 8983, Training Loss: 0.01503
Epoch: 8983, Training Loss: 0.01868
Epoch: 8983, Training Loss: 0.01429
Epoch: 8984, Training Loss: 0.01634
Epoch: 8984, Training Loss: 0.01503
Epoch: 8984, Training Loss: 0.01868
Epoch: 8984, Training Loss: 0.01429
Epoch: 8985, Training Loss: 0.01634
Epoch: 8985, Training Loss: 0.01503
Epoch: 8985, Training Loss: 0.01868
Epoch: 8985, Training Loss: 0.01429
Epoch: 8986, Training Loss: 0.01634
Epoch: 8986, Training Loss: 0.01503
Epoch: 8986, Training Loss: 0.01868
Epoch: 8986, Training Loss: 0.01429
Epoch: 8987, Training Loss: 0.01634
Epoch: 8987, Training Loss: 0.01503
Epoch: 8987, Training Loss: 0.01868
Epoch: 8987, Training Loss: 0.01428
Epoch: 8988, Training Loss: 0.01634
Epoch: 8988, Training Loss: 0.01503
Epoch: 8988, Training Loss: 0.01868
Epoch: 8988, Training Loss: 0.01428
Epoch: 8989, Training Loss: 0.01634
Epoch: 8989, Training Loss: 0.01502
Epoch: 8989, Training Loss: 0.01867
Epoch: 8989, Training Loss: 0.01428
Epoch: 8990, Training Loss: 0.01634
Epoch: 8990, Training Loss: 0.01502
Epoch: 8990, Training Loss: 0.01867
Epoch: 8990, Training Loss: 0.01428
Epoch: 8991, Training Loss: 0.01634
Epoch: 8991, Training Loss: 0.01502
Epoch: 8991, Training Loss: 0.01867
Epoch: 8991, Training Loss: 0.01428
Epoch: 8992, Training Loss: 0.01634
Epoch: 8992, Training Loss: 0.01502
Epoch: 8992, Training Loss: 0.01867
Epoch: 8992, Training Loss: 0.01428
Epoch: 8993, Training Loss: 0.01633
Epoch: 8993, Training Loss: 0.01502
Epoch: 8993, Training Loss: 0.01867
Epoch: 8993, Training Loss: 0.01428
Epoch: 8994, Training Loss: 0.01633
Epoch: 8994, Training Loss: 0.01502
Epoch: 8994, Training Loss: 0.01867
Epoch: 8994, Training Loss: 0.01428
Epoch: 8995, Training Loss: 0.01633
Epoch: 8995, Training Loss: 0.01502
Epoch: 8995, Training Loss: 0.01867
Epoch: 8995, Training Loss: 0.01428
Epoch: 8996, Training Loss: 0.01633
Epoch: 8996, Training Loss: 0.01502
Epoch: 8996, Training Loss: 0.01867
Epoch: 8996, Training Loss: 0.01428
Epoch: 8997, Training Loss: 0.01633
Epoch: 8997, Training Loss: 0.01502
Epoch: 8997, Training Loss: 0.01866
Epoch: 8997, Training Loss: 0.01428
Epoch: 8998, Training Loss: 0.01633
Epoch: 8998, Training Loss: 0.01502
Epoch: 8998, Training Loss: 0.01866
Epoch: 8998, Training Loss: 0.01427
Epoch: 8999, Training Loss: 0.01633
Epoch: 8999, Training Loss: 0.01502
Epoch: 8999, Training Loss: 0.01866
Epoch: 8999, Training Loss: 0.01427
Epoch: 9000, Training Loss: 0.01633
Epoch: 9000, Training Loss: 0.01501
Epoch: 9000, Training Loss: 0.01866
Epoch: 9000, Training Loss: 0.01427
Epoch: 9001, Training Loss: 0.01633
Epoch: 9001, Training Loss: 0.01501
Epoch: 9001, Training Loss: 0.01866
Epoch: 9001, Training Loss: 0.01427
Epoch: 9002, Training Loss: 0.01632
Epoch: 9002, Training Loss: 0.01501
Epoch: 9002, Training Loss: 0.01866
Epoch: 9002, Training Loss: 0.01427
Epoch: 9003, Training Loss: 0.01632
Epoch: 9003, Training Loss: 0.01501
Epoch: 9003, Training Loss: 0.01866
Epoch: 9003, Training Loss: 0.01427
Epoch: 9004, Training Loss: 0.01632
Epoch: 9004, Training Loss: 0.01501
Epoch: 9004, Training Loss: 0.01866
Epoch: 9004, Training Loss: 0.01427
Epoch: 9005, Training Loss: 0.01632
Epoch: 9005, Training Loss: 0.01501
Epoch: 9005, Training Loss: 0.01865
Epoch: 9005, Training Loss: 0.01427
Epoch: 9006, Training Loss: 0.01632
Epoch: 9006, Training Loss: 0.01501
Epoch: 9006, Training Loss: 0.01865
Epoch: 9006, Training Loss: 0.01427
Epoch: 9007, Training Loss: 0.01632
Epoch: 9007, Training Loss: 0.01501
Epoch: 9007, Training Loss: 0.01865
Epoch: 9007, Training Loss: 0.01427
Epoch: 9008, Training Loss: 0.01632
Epoch: 9008, Training Loss: 0.01501
Epoch: 9008, Training Loss: 0.01865
Epoch: 9008, Training Loss: 0.01427
Epoch: 9009, Training Loss: 0.01632
Epoch: 9009, Training Loss: 0.01501
Epoch: 9009, Training Loss: 0.01865
Epoch: 9009, Training Loss: 0.01427
Epoch: 9010, Training Loss: 0.01632
Epoch: 9010, Training Loss: 0.01501
Epoch: 9010, Training Loss: 0.01865
Epoch: 9010, Training Loss: 0.01426
Epoch: 9011, Training Loss: 0.01632
Epoch: 9011, Training Loss: 0.01500
Epoch: 9011, Training Loss: 0.01865
Epoch: 9011, Training Loss: 0.01426
Epoch: 9012, Training Loss: 0.01631
Epoch: 9012, Training Loss: 0.01500
Epoch: 9012, Training Loss: 0.01865
Epoch: 9012, Training Loss: 0.01426
Epoch: 9013, Training Loss: 0.01631
Epoch: 9013, Training Loss: 0.01500
Epoch: 9013, Training Loss: 0.01865
Epoch: 9013, Training Loss: 0.01426
Epoch: 9014, Training Loss: 0.01631
Epoch: 9014, Training Loss: 0.01500
Epoch: 9014, Training Loss: 0.01864
Epoch: 9014, Training Loss: 0.01426
Epoch: 9015, Training Loss: 0.01631
Epoch: 9015, Training Loss: 0.01500
Epoch: 9015, Training Loss: 0.01864
Epoch: 9015, Training Loss: 0.01426
Epoch: 9016, Training Loss: 0.01631
Epoch: 9016, Training Loss: 0.01500
Epoch: 9016, Training Loss: 0.01864
Epoch: 9016, Training Loss: 0.01426
Epoch: 9017, Training Loss: 0.01631
Epoch: 9017, Training Loss: 0.01500
Epoch: 9017, Training Loss: 0.01864
Epoch: 9017, Training Loss: 0.01426
Epoch: 9018, Training Loss: 0.01631
Epoch: 9018, Training Loss: 0.01500
Epoch: 9018, Training Loss: 0.01864
Epoch: 9018, Training Loss: 0.01426
Epoch: 9019, Training Loss: 0.01631
Epoch: 9019, Training Loss: 0.01500
Epoch: 9019, Training Loss: 0.01864
Epoch: 9019, Training Loss: 0.01426
Epoch: 9020, Training Loss: 0.01631
Epoch: 9020, Training Loss: 0.01500
Epoch: 9020, Training Loss: 0.01864
Epoch: 9020, Training Loss: 0.01426
Epoch: 9021, Training Loss: 0.01631
Epoch: 9021, Training Loss: 0.01500
Epoch: 9021, Training Loss: 0.01864
Epoch: 9021, Training Loss: 0.01425
Epoch: 9022, Training Loss: 0.01630
Epoch: 9022, Training Loss: 0.01499
Epoch: 9022, Training Loss: 0.01863
Epoch: 9022, Training Loss: 0.01425
Epoch: 9023, Training Loss: 0.01630
Epoch: 9023, Training Loss: 0.01499
Epoch: 9023, Training Loss: 0.01863
Epoch: 9023, Training Loss: 0.01425
Epoch: 9024, Training Loss: 0.01630
Epoch: 9024, Training Loss: 0.01499
Epoch: 9024, Training Loss: 0.01863
Epoch: 9024, Training Loss: 0.01425
Epoch: 9025, Training Loss: 0.01630
Epoch: 9025, Training Loss: 0.01499
Epoch: 9025, Training Loss: 0.01863
Epoch: 9025, Training Loss: 0.01425
Epoch: 9026, Training Loss: 0.01630
Epoch: 9026, Training Loss: 0.01499
Epoch: 9026, Training Loss: 0.01863
Epoch: 9026, Training Loss: 0.01425
Epoch: 9027, Training Loss: 0.01630
Epoch: 9027, Training Loss: 0.01499
Epoch: 9027, Training Loss: 0.01863
Epoch: 9027, Training Loss: 0.01425
Epoch: 9028, Training Loss: 0.01630
Epoch: 9028, Training Loss: 0.01499
Epoch: 9028, Training Loss: 0.01863
Epoch: 9028, Training Loss: 0.01425
Epoch: 9029, Training Loss: 0.01630
Epoch: 9029, Training Loss: 0.01499
Epoch: 9029, Training Loss: 0.01863
Epoch: 9029, Training Loss: 0.01425
Epoch: 9030, Training Loss: 0.01630
Epoch: 9030, Training Loss: 0.01499
Epoch: 9030, Training Loss: 0.01862
Epoch: 9030, Training Loss: 0.01425
Epoch: 9031, Training Loss: 0.01629
Epoch: 9031, Training Loss: 0.01499
Epoch: 9031, Training Loss: 0.01862
Epoch: 9031, Training Loss: 0.01425
Epoch: 9032, Training Loss: 0.01629
Epoch: 9032, Training Loss: 0.01499
Epoch: 9032, Training Loss: 0.01862
Epoch: 9032, Training Loss: 0.01424
Epoch: 9033, Training Loss: 0.01629
Epoch: 9033, Training Loss: 0.01498
Epoch: 9033, Training Loss: 0.01862
Epoch: 9033, Training Loss: 0.01424
Epoch: 9034, Training Loss: 0.01629
Epoch: 9034, Training Loss: 0.01498
Epoch: 9034, Training Loss: 0.01862
Epoch: 9034, Training Loss: 0.01424
Epoch: 9035, Training Loss: 0.01629
Epoch: 9035, Training Loss: 0.01498
Epoch: 9035, Training Loss: 0.01862
Epoch: 9035, Training Loss: 0.01424
Epoch: 9036, Training Loss: 0.01629
Epoch: 9036, Training Loss: 0.01498
Epoch: 9036, Training Loss: 0.01862
Epoch: 9036, Training Loss: 0.01424
Epoch: 9037, Training Loss: 0.01629
Epoch: 9037, Training Loss: 0.01498
Epoch: 9037, Training Loss: 0.01862
Epoch: 9037, Training Loss: 0.01424
Epoch: 9038, Training Loss: 0.01629
Epoch: 9038, Training Loss: 0.01498
Epoch: 9038, Training Loss: 0.01862
Epoch: 9038, Training Loss: 0.01424
Epoch: 9039, Training Loss: 0.01629
Epoch: 9039, Training Loss: 0.01498
Epoch: 9039, Training Loss: 0.01861
Epoch: 9039, Training Loss: 0.01424
Epoch: 9040, Training Loss: 0.01629
Epoch: 9040, Training Loss: 0.01498
Epoch: 9040, Training Loss: 0.01861
Epoch: 9040, Training Loss: 0.01424
Epoch: 9041, Training Loss: 0.01628
Epoch: 9041, Training Loss: 0.01498
Epoch: 9041, Training Loss: 0.01861
Epoch: 9041, Training Loss: 0.01424
Epoch: 9042, Training Loss: 0.01628
Epoch: 9042, Training Loss: 0.01498
Epoch: 9042, Training Loss: 0.01861
Epoch: 9042, Training Loss: 0.01424
Epoch: 9043, Training Loss: 0.01628
Epoch: 9043, Training Loss: 0.01498
Epoch: 9043, Training Loss: 0.01861
Epoch: 9043, Training Loss: 0.01423
Epoch: 9044, Training Loss: 0.01628
Epoch: 9044, Training Loss: 0.01497
Epoch: 9044, Training Loss: 0.01861
Epoch: 9044, Training Loss: 0.01423
Epoch: 9045, Training Loss: 0.01628
Epoch: 9045, Training Loss: 0.01497
Epoch: 9045, Training Loss: 0.01861
Epoch: 9045, Training Loss: 0.01423
Epoch: 9046, Training Loss: 0.01628
Epoch: 9046, Training Loss: 0.01497
Epoch: 9046, Training Loss: 0.01861
Epoch: 9046, Training Loss: 0.01423
Epoch: 9047, Training Loss: 0.01628
Epoch: 9047, Training Loss: 0.01497
Epoch: 9047, Training Loss: 0.01860
Epoch: 9047, Training Loss: 0.01423
Epoch: 9048, Training Loss: 0.01628
Epoch: 9048, Training Loss: 0.01497
Epoch: 9048, Training Loss: 0.01860
Epoch: 9048, Training Loss: 0.01423
Epoch: 9049, Training Loss: 0.01628
Epoch: 9049, Training Loss: 0.01497
Epoch: 9049, Training Loss: 0.01860
Epoch: 9049, Training Loss: 0.01423
Epoch: 9050, Training Loss: 0.01628
Epoch: 9050, Training Loss: 0.01497
Epoch: 9050, Training Loss: 0.01860
Epoch: 9050, Training Loss: 0.01423
Epoch: 9051, Training Loss: 0.01627
Epoch: 9051, Training Loss: 0.01497
Epoch: 9051, Training Loss: 0.01860
Epoch: 9051, Training Loss: 0.01423
Epoch: 9052, Training Loss: 0.01627
Epoch: 9052, Training Loss: 0.01497
Epoch: 9052, Training Loss: 0.01860
Epoch: 9052, Training Loss: 0.01423
Epoch: 9053, Training Loss: 0.01627
Epoch: 9053, Training Loss: 0.01497
Epoch: 9053, Training Loss: 0.01860
Epoch: 9053, Training Loss: 0.01423
Epoch: 9054, Training Loss: 0.01627
Epoch: 9054, Training Loss: 0.01496
Epoch: 9054, Training Loss: 0.01860
Epoch: 9054, Training Loss: 0.01423
Epoch: 9055, Training Loss: 0.01627
Epoch: 9055, Training Loss: 0.01496
Epoch: 9055, Training Loss: 0.01860
Epoch: 9055, Training Loss: 0.01422
Epoch: 9056, Training Loss: 0.01627
Epoch: 9056, Training Loss: 0.01496
Epoch: 9056, Training Loss: 0.01859
Epoch: 9056, Training Loss: 0.01422
Epoch: 9057, Training Loss: 0.01627
Epoch: 9057, Training Loss: 0.01496
Epoch: 9057, Training Loss: 0.01859
Epoch: 9057, Training Loss: 0.01422
Epoch: 9058, Training Loss: 0.01627
Epoch: 9058, Training Loss: 0.01496
Epoch: 9058, Training Loss: 0.01859
Epoch: 9058, Training Loss: 0.01422
Epoch: 9059, Training Loss: 0.01627
Epoch: 9059, Training Loss: 0.01496
Epoch: 9059, Training Loss: 0.01859
Epoch: 9059, Training Loss: 0.01422
Epoch: 9060, Training Loss: 0.01626
Epoch: 9060, Training Loss: 0.01496
Epoch: 9060, Training Loss: 0.01859
Epoch: 9060, Training Loss: 0.01422
Epoch: 9061, Training Loss: 0.01626
Epoch: 9061, Training Loss: 0.01496
Epoch: 9061, Training Loss: 0.01859
Epoch: 9061, Training Loss: 0.01422
Epoch: 9062, Training Loss: 0.01626
Epoch: 9062, Training Loss: 0.01496
Epoch: 9062, Training Loss: 0.01859
Epoch: 9062, Training Loss: 0.01422
Epoch: 9063, Training Loss: 0.01626
Epoch: 9063, Training Loss: 0.01496
Epoch: 9063, Training Loss: 0.01859
Epoch: 9063, Training Loss: 0.01422
Epoch: 9064, Training Loss: 0.01626
Epoch: 9064, Training Loss: 0.01496
Epoch: 9064, Training Loss: 0.01858
Epoch: 9064, Training Loss: 0.01422
Epoch: 9065, Training Loss: 0.01626
Epoch: 9065, Training Loss: 0.01495
Epoch: 9065, Training Loss: 0.01858
Epoch: 9065, Training Loss: 0.01422
Epoch: 9066, Training Loss: 0.01626
Epoch: 9066, Training Loss: 0.01495
Epoch: 9066, Training Loss: 0.01858
Epoch: 9066, Training Loss: 0.01421
Epoch: 9067, Training Loss: 0.01626
Epoch: 9067, Training Loss: 0.01495
Epoch: 9067, Training Loss: 0.01858
Epoch: 9067, Training Loss: 0.01421
Epoch: 9068, Training Loss: 0.01626
Epoch: 9068, Training Loss: 0.01495
Epoch: 9068, Training Loss: 0.01858
Epoch: 9068, Training Loss: 0.01421
Epoch: 9069, Training Loss: 0.01626
Epoch: 9069, Training Loss: 0.01495
Epoch: 9069, Training Loss: 0.01858
Epoch: 9069, Training Loss: 0.01421
Epoch: 9070, Training Loss: 0.01625
Epoch: 9070, Training Loss: 0.01495
Epoch: 9070, Training Loss: 0.01858
Epoch: 9070, Training Loss: 0.01421
Epoch: 9071, Training Loss: 0.01625
Epoch: 9071, Training Loss: 0.01495
Epoch: 9071, Training Loss: 0.01858
Epoch: 9071, Training Loss: 0.01421
Epoch: 9072, Training Loss: 0.01625
Epoch: 9072, Training Loss: 0.01495
Epoch: 9072, Training Loss: 0.01858
Epoch: 9072, Training Loss: 0.01421
Epoch: 9073, Training Loss: 0.01625
Epoch: 9073, Training Loss: 0.01495
Epoch: 9073, Training Loss: 0.01857
Epoch: 9073, Training Loss: 0.01421
Epoch: 9074, Training Loss: 0.01625
Epoch: 9074, Training Loss: 0.01495
Epoch: 9074, Training Loss: 0.01857
Epoch: 9074, Training Loss: 0.01421
Epoch: 9075, Training Loss: 0.01625
Epoch: 9075, Training Loss: 0.01495
Epoch: 9075, Training Loss: 0.01857
Epoch: 9075, Training Loss: 0.01421
Epoch: 9076, Training Loss: 0.01625
Epoch: 9076, Training Loss: 0.01494
Epoch: 9076, Training Loss: 0.01857
Epoch: 9076, Training Loss: 0.01421
Epoch: 9077, Training Loss: 0.01625
Epoch: 9077, Training Loss: 0.01494
Epoch: 9077, Training Loss: 0.01857
Epoch: 9077, Training Loss: 0.01421
Epoch: 9078, Training Loss: 0.01625
Epoch: 9078, Training Loss: 0.01494
Epoch: 9078, Training Loss: 0.01857
Epoch: 9078, Training Loss: 0.01420
Epoch: 9079, Training Loss: 0.01625
Epoch: 9079, Training Loss: 0.01494
Epoch: 9079, Training Loss: 0.01857
Epoch: 9079, Training Loss: 0.01420
Epoch: 9080, Training Loss: 0.01624
Epoch: 9080, Training Loss: 0.01494
Epoch: 9080, Training Loss: 0.01857
Epoch: 9080, Training Loss: 0.01420
Epoch: 9081, Training Loss: 0.01624
Epoch: 9081, Training Loss: 0.01494
Epoch: 9081, Training Loss: 0.01856
Epoch: 9081, Training Loss: 0.01420
Epoch: 9082, Training Loss: 0.01624
Epoch: 9082, Training Loss: 0.01494
Epoch: 9082, Training Loss: 0.01856
Epoch: 9082, Training Loss: 0.01420
Epoch: 9083, Training Loss: 0.01624
Epoch: 9083, Training Loss: 0.01494
Epoch: 9083, Training Loss: 0.01856
Epoch: 9083, Training Loss: 0.01420
Epoch: 9084, Training Loss: 0.01624
Epoch: 9084, Training Loss: 0.01494
Epoch: 9084, Training Loss: 0.01856
Epoch: 9084, Training Loss: 0.01420
Epoch: 9085, Training Loss: 0.01624
Epoch: 9085, Training Loss: 0.01494
Epoch: 9085, Training Loss: 0.01856
Epoch: 9085, Training Loss: 0.01420
Epoch: 9086, Training Loss: 0.01624
Epoch: 9086, Training Loss: 0.01494
Epoch: 9086, Training Loss: 0.01856
Epoch: 9086, Training Loss: 0.01420
Epoch: 9087, Training Loss: 0.01624
Epoch: 9087, Training Loss: 0.01493
Epoch: 9087, Training Loss: 0.01856
Epoch: 9087, Training Loss: 0.01420
Epoch: 9088, Training Loss: 0.01624
Epoch: 9088, Training Loss: 0.01493
Epoch: 9088, Training Loss: 0.01856
Epoch: 9088, Training Loss: 0.01420
Epoch: 9089, Training Loss: 0.01624
Epoch: 9089, Training Loss: 0.01493
Epoch: 9089, Training Loss: 0.01856
Epoch: 9089, Training Loss: 0.01419
Epoch: 9090, Training Loss: 0.01623
Epoch: 9090, Training Loss: 0.01493
Epoch: 9090, Training Loss: 0.01855
Epoch: 9090, Training Loss: 0.01419
Epoch: 9091, Training Loss: 0.01623
Epoch: 9091, Training Loss: 0.01493
Epoch: 9091, Training Loss: 0.01855
Epoch: 9091, Training Loss: 0.01419
Epoch: 9092, Training Loss: 0.01623
Epoch: 9092, Training Loss: 0.01493
Epoch: 9092, Training Loss: 0.01855
Epoch: 9092, Training Loss: 0.01419
Epoch: 9093, Training Loss: 0.01623
Epoch: 9093, Training Loss: 0.01493
Epoch: 9093, Training Loss: 0.01855
Epoch: 9093, Training Loss: 0.01419
Epoch: 9094, Training Loss: 0.01623
Epoch: 9094, Training Loss: 0.01493
Epoch: 9094, Training Loss: 0.01855
Epoch: 9094, Training Loss: 0.01419
Epoch: 9095, Training Loss: 0.01623
Epoch: 9095, Training Loss: 0.01493
Epoch: 9095, Training Loss: 0.01855
Epoch: 9095, Training Loss: 0.01419
Epoch: 9096, Training Loss: 0.01623
Epoch: 9096, Training Loss: 0.01493
Epoch: 9096, Training Loss: 0.01855
Epoch: 9096, Training Loss: 0.01419
Epoch: 9097, Training Loss: 0.01623
Epoch: 9097, Training Loss: 0.01493
Epoch: 9097, Training Loss: 0.01855
Epoch: 9097, Training Loss: 0.01419
Epoch: 9098, Training Loss: 0.01623
Epoch: 9098, Training Loss: 0.01492
Epoch: 9098, Training Loss: 0.01854
Epoch: 9098, Training Loss: 0.01419
... [per-sample losses for epochs 9099-9347 omitted: the four per-sample losses decline very slowly, from (0.01622, 0.01492, 0.01854, 0.01419) at epoch 9099 to (0.01598, 0.01470, 0.01826, 0.01397) by epoch 9348] ...
Epoch: 9348, Training Loss: 0.01598
Epoch: 9348, Training Loss: 0.01470
Epoch: 9348, Training Loss: 0.01826
Epoch: 9348, Training Loss: 0.01397
Epoch: 9349, Training Loss: 0.01598
Epoch: 9349, Training Loss: 0.01470
Epoch: 9349, Training Loss: 0.01826
Epoch: 9349, Training Loss: 0.01397
Epoch: 9350, Training Loss: 0.01597
Epoch: 9350, Training Loss: 0.01470
Epoch: 9350, Training Loss: 0.01826
Epoch: 9350, Training Loss: 0.01397
Epoch: 9351, Training Loss: 0.01597
Epoch: 9351, Training Loss: 0.01470
Epoch: 9351, Training Loss: 0.01825
Epoch: 9351, Training Loss: 0.01397
Epoch: 9352, Training Loss: 0.01597
Epoch: 9352, Training Loss: 0.01470
Epoch: 9352, Training Loss: 0.01825
Epoch: 9352, Training Loss: 0.01397
Epoch: 9353, Training Loss: 0.01597
Epoch: 9353, Training Loss: 0.01470
Epoch: 9353, Training Loss: 0.01825
Epoch: 9353, Training Loss: 0.01397
Epoch: 9354, Training Loss: 0.01597
Epoch: 9354, Training Loss: 0.01470
Epoch: 9354, Training Loss: 0.01825
Epoch: 9354, Training Loss: 0.01397
Epoch: 9355, Training Loss: 0.01597
Epoch: 9355, Training Loss: 0.01470
Epoch: 9355, Training Loss: 0.01825
Epoch: 9355, Training Loss: 0.01397
Epoch: 9356, Training Loss: 0.01597
Epoch: 9356, Training Loss: 0.01470
Epoch: 9356, Training Loss: 0.01825
Epoch: 9356, Training Loss: 0.01397
Epoch: 9357, Training Loss: 0.01597
Epoch: 9357, Training Loss: 0.01469
Epoch: 9357, Training Loss: 0.01825
Epoch: 9357, Training Loss: 0.01397
Epoch: 9358, Training Loss: 0.01597
Epoch: 9358, Training Loss: 0.01469
Epoch: 9358, Training Loss: 0.01825
Epoch: 9358, Training Loss: 0.01396
Epoch: 9359, Training Loss: 0.01597
Epoch: 9359, Training Loss: 0.01469
Epoch: 9359, Training Loss: 0.01825
Epoch: 9359, Training Loss: 0.01396
Epoch: 9360, Training Loss: 0.01596
Epoch: 9360, Training Loss: 0.01469
Epoch: 9360, Training Loss: 0.01824
Epoch: 9360, Training Loss: 0.01396
Epoch: 9361, Training Loss: 0.01596
Epoch: 9361, Training Loss: 0.01469
Epoch: 9361, Training Loss: 0.01824
Epoch: 9361, Training Loss: 0.01396
Epoch: 9362, Training Loss: 0.01596
Epoch: 9362, Training Loss: 0.01469
Epoch: 9362, Training Loss: 0.01824
Epoch: 9362, Training Loss: 0.01396
Epoch: 9363, Training Loss: 0.01596
Epoch: 9363, Training Loss: 0.01469
Epoch: 9363, Training Loss: 0.01824
Epoch: 9363, Training Loss: 0.01396
Epoch: 9364, Training Loss: 0.01596
Epoch: 9364, Training Loss: 0.01469
Epoch: 9364, Training Loss: 0.01824
Epoch: 9364, Training Loss: 0.01396
Epoch: 9365, Training Loss: 0.01596
Epoch: 9365, Training Loss: 0.01469
Epoch: 9365, Training Loss: 0.01824
Epoch: 9365, Training Loss: 0.01396
Epoch: 9366, Training Loss: 0.01596
Epoch: 9366, Training Loss: 0.01469
Epoch: 9366, Training Loss: 0.01824
Epoch: 9366, Training Loss: 0.01396
Epoch: 9367, Training Loss: 0.01596
Epoch: 9367, Training Loss: 0.01469
Epoch: 9367, Training Loss: 0.01824
Epoch: 9367, Training Loss: 0.01396
Epoch: 9368, Training Loss: 0.01596
Epoch: 9368, Training Loss: 0.01469
Epoch: 9368, Training Loss: 0.01824
Epoch: 9368, Training Loss: 0.01396
Epoch: 9369, Training Loss: 0.01596
Epoch: 9369, Training Loss: 0.01468
Epoch: 9369, Training Loss: 0.01823
Epoch: 9369, Training Loss: 0.01396
Epoch: 9370, Training Loss: 0.01595
Epoch: 9370, Training Loss: 0.01468
Epoch: 9370, Training Loss: 0.01823
Epoch: 9370, Training Loss: 0.01395
Epoch: 9371, Training Loss: 0.01595
Epoch: 9371, Training Loss: 0.01468
Epoch: 9371, Training Loss: 0.01823
Epoch: 9371, Training Loss: 0.01395
Epoch: 9372, Training Loss: 0.01595
Epoch: 9372, Training Loss: 0.01468
Epoch: 9372, Training Loss: 0.01823
Epoch: 9372, Training Loss: 0.01395
Epoch: 9373, Training Loss: 0.01595
Epoch: 9373, Training Loss: 0.01468
Epoch: 9373, Training Loss: 0.01823
Epoch: 9373, Training Loss: 0.01395
Epoch: 9374, Training Loss: 0.01595
Epoch: 9374, Training Loss: 0.01468
Epoch: 9374, Training Loss: 0.01823
Epoch: 9374, Training Loss: 0.01395
Epoch: 9375, Training Loss: 0.01595
Epoch: 9375, Training Loss: 0.01468
Epoch: 9375, Training Loss: 0.01823
Epoch: 9375, Training Loss: 0.01395
Epoch: 9376, Training Loss: 0.01595
Epoch: 9376, Training Loss: 0.01468
Epoch: 9376, Training Loss: 0.01823
Epoch: 9376, Training Loss: 0.01395
Epoch: 9377, Training Loss: 0.01595
Epoch: 9377, Training Loss: 0.01468
Epoch: 9377, Training Loss: 0.01823
Epoch: 9377, Training Loss: 0.01395
Epoch: 9378, Training Loss: 0.01595
Epoch: 9378, Training Loss: 0.01468
Epoch: 9378, Training Loss: 0.01822
Epoch: 9378, Training Loss: 0.01395
Epoch: 9379, Training Loss: 0.01595
Epoch: 9379, Training Loss: 0.01468
Epoch: 9379, Training Loss: 0.01822
Epoch: 9379, Training Loss: 0.01395
Epoch: 9380, Training Loss: 0.01595
Epoch: 9380, Training Loss: 0.01467
Epoch: 9380, Training Loss: 0.01822
Epoch: 9380, Training Loss: 0.01395
Epoch: 9381, Training Loss: 0.01594
Epoch: 9381, Training Loss: 0.01467
Epoch: 9381, Training Loss: 0.01822
Epoch: 9381, Training Loss: 0.01395
Epoch: 9382, Training Loss: 0.01594
Epoch: 9382, Training Loss: 0.01467
Epoch: 9382, Training Loss: 0.01822
Epoch: 9382, Training Loss: 0.01394
Epoch: 9383, Training Loss: 0.01594
Epoch: 9383, Training Loss: 0.01467
Epoch: 9383, Training Loss: 0.01822
Epoch: 9383, Training Loss: 0.01394
Epoch: 9384, Training Loss: 0.01594
Epoch: 9384, Training Loss: 0.01467
Epoch: 9384, Training Loss: 0.01822
Epoch: 9384, Training Loss: 0.01394
Epoch: 9385, Training Loss: 0.01594
Epoch: 9385, Training Loss: 0.01467
Epoch: 9385, Training Loss: 0.01822
Epoch: 9385, Training Loss: 0.01394
Epoch: 9386, Training Loss: 0.01594
Epoch: 9386, Training Loss: 0.01467
Epoch: 9386, Training Loss: 0.01822
Epoch: 9386, Training Loss: 0.01394
Epoch: 9387, Training Loss: 0.01594
Epoch: 9387, Training Loss: 0.01467
Epoch: 9387, Training Loss: 0.01821
Epoch: 9387, Training Loss: 0.01394
Epoch: 9388, Training Loss: 0.01594
Epoch: 9388, Training Loss: 0.01467
Epoch: 9388, Training Loss: 0.01821
Epoch: 9388, Training Loss: 0.01394
Epoch: 9389, Training Loss: 0.01594
Epoch: 9389, Training Loss: 0.01467
Epoch: 9389, Training Loss: 0.01821
Epoch: 9389, Training Loss: 0.01394
Epoch: 9390, Training Loss: 0.01594
Epoch: 9390, Training Loss: 0.01467
Epoch: 9390, Training Loss: 0.01821
Epoch: 9390, Training Loss: 0.01394
Epoch: 9391, Training Loss: 0.01593
Epoch: 9391, Training Loss: 0.01467
Epoch: 9391, Training Loss: 0.01821
Epoch: 9391, Training Loss: 0.01394
Epoch: 9392, Training Loss: 0.01593
Epoch: 9392, Training Loss: 0.01466
Epoch: 9392, Training Loss: 0.01821
Epoch: 9392, Training Loss: 0.01394
Epoch: 9393, Training Loss: 0.01593
Epoch: 9393, Training Loss: 0.01466
Epoch: 9393, Training Loss: 0.01821
Epoch: 9393, Training Loss: 0.01394
Epoch: 9394, Training Loss: 0.01593
Epoch: 9394, Training Loss: 0.01466
Epoch: 9394, Training Loss: 0.01821
Epoch: 9394, Training Loss: 0.01393
Epoch: 9395, Training Loss: 0.01593
Epoch: 9395, Training Loss: 0.01466
Epoch: 9395, Training Loss: 0.01821
Epoch: 9395, Training Loss: 0.01393
Epoch: 9396, Training Loss: 0.01593
Epoch: 9396, Training Loss: 0.01466
Epoch: 9396, Training Loss: 0.01820
Epoch: 9396, Training Loss: 0.01393
Epoch: 9397, Training Loss: 0.01593
Epoch: 9397, Training Loss: 0.01466
Epoch: 9397, Training Loss: 0.01820
Epoch: 9397, Training Loss: 0.01393
Epoch: 9398, Training Loss: 0.01593
Epoch: 9398, Training Loss: 0.01466
Epoch: 9398, Training Loss: 0.01820
Epoch: 9398, Training Loss: 0.01393
Epoch: 9399, Training Loss: 0.01593
Epoch: 9399, Training Loss: 0.01466
Epoch: 9399, Training Loss: 0.01820
Epoch: 9399, Training Loss: 0.01393
Epoch: 9400, Training Loss: 0.01593
Epoch: 9400, Training Loss: 0.01466
Epoch: 9400, Training Loss: 0.01820
Epoch: 9400, Training Loss: 0.01393
Epoch: 9401, Training Loss: 0.01592
Epoch: 9401, Training Loss: 0.01466
Epoch: 9401, Training Loss: 0.01820
Epoch: 9401, Training Loss: 0.01393
Epoch: 9402, Training Loss: 0.01592
Epoch: 9402, Training Loss: 0.01466
Epoch: 9402, Training Loss: 0.01820
Epoch: 9402, Training Loss: 0.01393
Epoch: 9403, Training Loss: 0.01592
Epoch: 9403, Training Loss: 0.01465
Epoch: 9403, Training Loss: 0.01820
Epoch: 9403, Training Loss: 0.01393
Epoch: 9404, Training Loss: 0.01592
Epoch: 9404, Training Loss: 0.01465
Epoch: 9404, Training Loss: 0.01819
Epoch: 9404, Training Loss: 0.01393
Epoch: 9405, Training Loss: 0.01592
Epoch: 9405, Training Loss: 0.01465
Epoch: 9405, Training Loss: 0.01819
Epoch: 9405, Training Loss: 0.01393
Epoch: 9406, Training Loss: 0.01592
Epoch: 9406, Training Loss: 0.01465
Epoch: 9406, Training Loss: 0.01819
Epoch: 9406, Training Loss: 0.01392
Epoch: 9407, Training Loss: 0.01592
Epoch: 9407, Training Loss: 0.01465
Epoch: 9407, Training Loss: 0.01819
Epoch: 9407, Training Loss: 0.01392
Epoch: 9408, Training Loss: 0.01592
Epoch: 9408, Training Loss: 0.01465
Epoch: 9408, Training Loss: 0.01819
Epoch: 9408, Training Loss: 0.01392
Epoch: 9409, Training Loss: 0.01592
Epoch: 9409, Training Loss: 0.01465
Epoch: 9409, Training Loss: 0.01819
Epoch: 9409, Training Loss: 0.01392
Epoch: 9410, Training Loss: 0.01592
Epoch: 9410, Training Loss: 0.01465
Epoch: 9410, Training Loss: 0.01819
Epoch: 9410, Training Loss: 0.01392
Epoch: 9411, Training Loss: 0.01592
Epoch: 9411, Training Loss: 0.01465
Epoch: 9411, Training Loss: 0.01819
Epoch: 9411, Training Loss: 0.01392
Epoch: 9412, Training Loss: 0.01591
Epoch: 9412, Training Loss: 0.01465
Epoch: 9412, Training Loss: 0.01819
Epoch: 9412, Training Loss: 0.01392
Epoch: 9413, Training Loss: 0.01591
Epoch: 9413, Training Loss: 0.01465
Epoch: 9413, Training Loss: 0.01819
Epoch: 9413, Training Loss: 0.01392
Epoch: 9414, Training Loss: 0.01591
Epoch: 9414, Training Loss: 0.01465
Epoch: 9414, Training Loss: 0.01818
Epoch: 9414, Training Loss: 0.01392
Epoch: 9415, Training Loss: 0.01591
Epoch: 9415, Training Loss: 0.01464
Epoch: 9415, Training Loss: 0.01818
Epoch: 9415, Training Loss: 0.01392
Epoch: 9416, Training Loss: 0.01591
Epoch: 9416, Training Loss: 0.01464
Epoch: 9416, Training Loss: 0.01818
Epoch: 9416, Training Loss: 0.01392
Epoch: 9417, Training Loss: 0.01591
Epoch: 9417, Training Loss: 0.01464
Epoch: 9417, Training Loss: 0.01818
Epoch: 9417, Training Loss: 0.01392
Epoch: 9418, Training Loss: 0.01591
Epoch: 9418, Training Loss: 0.01464
Epoch: 9418, Training Loss: 0.01818
Epoch: 9418, Training Loss: 0.01391
Epoch: 9419, Training Loss: 0.01591
Epoch: 9419, Training Loss: 0.01464
Epoch: 9419, Training Loss: 0.01818
Epoch: 9419, Training Loss: 0.01391
Epoch: 9420, Training Loss: 0.01591
Epoch: 9420, Training Loss: 0.01464
Epoch: 9420, Training Loss: 0.01818
Epoch: 9420, Training Loss: 0.01391
Epoch: 9421, Training Loss: 0.01591
Epoch: 9421, Training Loss: 0.01464
Epoch: 9421, Training Loss: 0.01818
Epoch: 9421, Training Loss: 0.01391
Epoch: 9422, Training Loss: 0.01590
Epoch: 9422, Training Loss: 0.01464
Epoch: 9422, Training Loss: 0.01818
Epoch: 9422, Training Loss: 0.01391
Epoch: 9423, Training Loss: 0.01590
Epoch: 9423, Training Loss: 0.01464
Epoch: 9423, Training Loss: 0.01817
Epoch: 9423, Training Loss: 0.01391
Epoch: 9424, Training Loss: 0.01590
Epoch: 9424, Training Loss: 0.01464
Epoch: 9424, Training Loss: 0.01817
Epoch: 9424, Training Loss: 0.01391
Epoch: 9425, Training Loss: 0.01590
Epoch: 9425, Training Loss: 0.01464
Epoch: 9425, Training Loss: 0.01817
Epoch: 9425, Training Loss: 0.01391
Epoch: 9426, Training Loss: 0.01590
Epoch: 9426, Training Loss: 0.01463
Epoch: 9426, Training Loss: 0.01817
Epoch: 9426, Training Loss: 0.01391
Epoch: 9427, Training Loss: 0.01590
Epoch: 9427, Training Loss: 0.01463
Epoch: 9427, Training Loss: 0.01817
Epoch: 9427, Training Loss: 0.01391
Epoch: 9428, Training Loss: 0.01590
Epoch: 9428, Training Loss: 0.01463
Epoch: 9428, Training Loss: 0.01817
Epoch: 9428, Training Loss: 0.01391
Epoch: 9429, Training Loss: 0.01590
Epoch: 9429, Training Loss: 0.01463
Epoch: 9429, Training Loss: 0.01817
Epoch: 9429, Training Loss: 0.01391
Epoch: 9430, Training Loss: 0.01590
Epoch: 9430, Training Loss: 0.01463
Epoch: 9430, Training Loss: 0.01817
Epoch: 9430, Training Loss: 0.01390
Epoch: 9431, Training Loss: 0.01590
Epoch: 9431, Training Loss: 0.01463
Epoch: 9431, Training Loss: 0.01817
Epoch: 9431, Training Loss: 0.01390
Epoch: 9432, Training Loss: 0.01589
Epoch: 9432, Training Loss: 0.01463
Epoch: 9432, Training Loss: 0.01816
Epoch: 9432, Training Loss: 0.01390
Epoch: 9433, Training Loss: 0.01589
Epoch: 9433, Training Loss: 0.01463
Epoch: 9433, Training Loss: 0.01816
Epoch: 9433, Training Loss: 0.01390
Epoch: 9434, Training Loss: 0.01589
Epoch: 9434, Training Loss: 0.01463
Epoch: 9434, Training Loss: 0.01816
Epoch: 9434, Training Loss: 0.01390
Epoch: 9435, Training Loss: 0.01589
Epoch: 9435, Training Loss: 0.01463
Epoch: 9435, Training Loss: 0.01816
Epoch: 9435, Training Loss: 0.01390
Epoch: 9436, Training Loss: 0.01589
Epoch: 9436, Training Loss: 0.01463
Epoch: 9436, Training Loss: 0.01816
Epoch: 9436, Training Loss: 0.01390
Epoch: 9437, Training Loss: 0.01589
Epoch: 9437, Training Loss: 0.01463
Epoch: 9437, Training Loss: 0.01816
Epoch: 9437, Training Loss: 0.01390
Epoch: 9438, Training Loss: 0.01589
Epoch: 9438, Training Loss: 0.01462
Epoch: 9438, Training Loss: 0.01816
Epoch: 9438, Training Loss: 0.01390
Epoch: 9439, Training Loss: 0.01589
Epoch: 9439, Training Loss: 0.01462
Epoch: 9439, Training Loss: 0.01816
Epoch: 9439, Training Loss: 0.01390
Epoch: 9440, Training Loss: 0.01589
Epoch: 9440, Training Loss: 0.01462
Epoch: 9440, Training Loss: 0.01816
Epoch: 9440, Training Loss: 0.01390
Epoch: 9441, Training Loss: 0.01589
Epoch: 9441, Training Loss: 0.01462
Epoch: 9441, Training Loss: 0.01815
Epoch: 9441, Training Loss: 0.01390
Epoch: 9442, Training Loss: 0.01589
Epoch: 9442, Training Loss: 0.01462
Epoch: 9442, Training Loss: 0.01815
Epoch: 9442, Training Loss: 0.01389
Epoch: 9443, Training Loss: 0.01588
Epoch: 9443, Training Loss: 0.01462
Epoch: 9443, Training Loss: 0.01815
Epoch: 9443, Training Loss: 0.01389
Epoch: 9444, Training Loss: 0.01588
Epoch: 9444, Training Loss: 0.01462
Epoch: 9444, Training Loss: 0.01815
Epoch: 9444, Training Loss: 0.01389
Epoch: 9445, Training Loss: 0.01588
Epoch: 9445, Training Loss: 0.01462
Epoch: 9445, Training Loss: 0.01815
Epoch: 9445, Training Loss: 0.01389
Epoch: 9446, Training Loss: 0.01588
Epoch: 9446, Training Loss: 0.01462
Epoch: 9446, Training Loss: 0.01815
Epoch: 9446, Training Loss: 0.01389
Epoch: 9447, Training Loss: 0.01588
Epoch: 9447, Training Loss: 0.01462
Epoch: 9447, Training Loss: 0.01815
Epoch: 9447, Training Loss: 0.01389
Epoch: 9448, Training Loss: 0.01588
Epoch: 9448, Training Loss: 0.01462
Epoch: 9448, Training Loss: 0.01815
Epoch: 9448, Training Loss: 0.01389
Epoch: 9449, Training Loss: 0.01588
Epoch: 9449, Training Loss: 0.01462
Epoch: 9449, Training Loss: 0.01815
Epoch: 9449, Training Loss: 0.01389
Epoch: 9450, Training Loss: 0.01588
Epoch: 9450, Training Loss: 0.01461
Epoch: 9450, Training Loss: 0.01814
Epoch: 9450, Training Loss: 0.01389
Epoch: 9451, Training Loss: 0.01588
Epoch: 9451, Training Loss: 0.01461
Epoch: 9451, Training Loss: 0.01814
Epoch: 9451, Training Loss: 0.01389
Epoch: 9452, Training Loss: 0.01588
Epoch: 9452, Training Loss: 0.01461
Epoch: 9452, Training Loss: 0.01814
Epoch: 9452, Training Loss: 0.01389
Epoch: 9453, Training Loss: 0.01587
Epoch: 9453, Training Loss: 0.01461
Epoch: 9453, Training Loss: 0.01814
Epoch: 9453, Training Loss: 0.01389
Epoch: 9454, Training Loss: 0.01587
Epoch: 9454, Training Loss: 0.01461
Epoch: 9454, Training Loss: 0.01814
Epoch: 9454, Training Loss: 0.01388
Epoch: 9455, Training Loss: 0.01587
Epoch: 9455, Training Loss: 0.01461
Epoch: 9455, Training Loss: 0.01814
Epoch: 9455, Training Loss: 0.01388
Epoch: 9456, Training Loss: 0.01587
Epoch: 9456, Training Loss: 0.01461
Epoch: 9456, Training Loss: 0.01814
Epoch: 9456, Training Loss: 0.01388
Epoch: 9457, Training Loss: 0.01587
Epoch: 9457, Training Loss: 0.01461
Epoch: 9457, Training Loss: 0.01814
Epoch: 9457, Training Loss: 0.01388
Epoch: 9458, Training Loss: 0.01587
Epoch: 9458, Training Loss: 0.01461
Epoch: 9458, Training Loss: 0.01814
Epoch: 9458, Training Loss: 0.01388
Epoch: 9459, Training Loss: 0.01587
Epoch: 9459, Training Loss: 0.01461
Epoch: 9459, Training Loss: 0.01813
Epoch: 9459, Training Loss: 0.01388
Epoch: 9460, Training Loss: 0.01587
Epoch: 9460, Training Loss: 0.01461
Epoch: 9460, Training Loss: 0.01813
Epoch: 9460, Training Loss: 0.01388
Epoch: 9461, Training Loss: 0.01587
Epoch: 9461, Training Loss: 0.01461
Epoch: 9461, Training Loss: 0.01813
Epoch: 9461, Training Loss: 0.01388
Epoch: 9462, Training Loss: 0.01587
Epoch: 9462, Training Loss: 0.01460
Epoch: 9462, Training Loss: 0.01813
Epoch: 9462, Training Loss: 0.01388
Epoch: 9463, Training Loss: 0.01587
Epoch: 9463, Training Loss: 0.01460
Epoch: 9463, Training Loss: 0.01813
Epoch: 9463, Training Loss: 0.01388
Epoch: 9464, Training Loss: 0.01586
Epoch: 9464, Training Loss: 0.01460
Epoch: 9464, Training Loss: 0.01813
Epoch: 9464, Training Loss: 0.01388
Epoch: 9465, Training Loss: 0.01586
Epoch: 9465, Training Loss: 0.01460
Epoch: 9465, Training Loss: 0.01813
Epoch: 9465, Training Loss: 0.01388
Epoch: 9466, Training Loss: 0.01586
Epoch: 9466, Training Loss: 0.01460
Epoch: 9466, Training Loss: 0.01813
Epoch: 9466, Training Loss: 0.01388
Epoch: 9467, Training Loss: 0.01586
Epoch: 9467, Training Loss: 0.01460
Epoch: 9467, Training Loss: 0.01813
Epoch: 9467, Training Loss: 0.01387
Epoch: 9468, Training Loss: 0.01586
Epoch: 9468, Training Loss: 0.01460
Epoch: 9468, Training Loss: 0.01812
Epoch: 9468, Training Loss: 0.01387
Epoch: 9469, Training Loss: 0.01586
Epoch: 9469, Training Loss: 0.01460
Epoch: 9469, Training Loss: 0.01812
Epoch: 9469, Training Loss: 0.01387
Epoch: 9470, Training Loss: 0.01586
Epoch: 9470, Training Loss: 0.01460
Epoch: 9470, Training Loss: 0.01812
Epoch: 9470, Training Loss: 0.01387
Epoch: 9471, Training Loss: 0.01586
Epoch: 9471, Training Loss: 0.01460
Epoch: 9471, Training Loss: 0.01812
Epoch: 9471, Training Loss: 0.01387
Epoch: 9472, Training Loss: 0.01586
Epoch: 9472, Training Loss: 0.01460
Epoch: 9472, Training Loss: 0.01812
Epoch: 9472, Training Loss: 0.01387
Epoch: 9473, Training Loss: 0.01586
Epoch: 9473, Training Loss: 0.01459
Epoch: 9473, Training Loss: 0.01812
Epoch: 9473, Training Loss: 0.01387
Epoch: 9474, Training Loss: 0.01585
Epoch: 9474, Training Loss: 0.01459
Epoch: 9474, Training Loss: 0.01812
Epoch: 9474, Training Loss: 0.01387
Epoch: 9475, Training Loss: 0.01585
Epoch: 9475, Training Loss: 0.01459
Epoch: 9475, Training Loss: 0.01812
Epoch: 9475, Training Loss: 0.01387
Epoch: 9476, Training Loss: 0.01585
Epoch: 9476, Training Loss: 0.01459
Epoch: 9476, Training Loss: 0.01812
Epoch: 9476, Training Loss: 0.01387
Epoch: 9477, Training Loss: 0.01585
Epoch: 9477, Training Loss: 0.01459
Epoch: 9477, Training Loss: 0.01811
Epoch: 9477, Training Loss: 0.01387
Epoch: 9478, Training Loss: 0.01585
Epoch: 9478, Training Loss: 0.01459
Epoch: 9478, Training Loss: 0.01811
Epoch: 9478, Training Loss: 0.01387
Epoch: 9479, Training Loss: 0.01585
Epoch: 9479, Training Loss: 0.01459
Epoch: 9479, Training Loss: 0.01811
Epoch: 9479, Training Loss: 0.01386
Epoch: 9480, Training Loss: 0.01585
Epoch: 9480, Training Loss: 0.01459
Epoch: 9480, Training Loss: 0.01811
Epoch: 9480, Training Loss: 0.01386
Epoch: 9481, Training Loss: 0.01585
Epoch: 9481, Training Loss: 0.01459
Epoch: 9481, Training Loss: 0.01811
Epoch: 9481, Training Loss: 0.01386
Epoch: 9482, Training Loss: 0.01585
Epoch: 9482, Training Loss: 0.01459
Epoch: 9482, Training Loss: 0.01811
Epoch: 9482, Training Loss: 0.01386
Epoch: 9483, Training Loss: 0.01585
Epoch: 9483, Training Loss: 0.01459
Epoch: 9483, Training Loss: 0.01811
Epoch: 9483, Training Loss: 0.01386
Epoch: 9484, Training Loss: 0.01585
Epoch: 9484, Training Loss: 0.01459
Epoch: 9484, Training Loss: 0.01811
Epoch: 9484, Training Loss: 0.01386
Epoch: 9485, Training Loss: 0.01584
Epoch: 9485, Training Loss: 0.01458
Epoch: 9485, Training Loss: 0.01811
Epoch: 9485, Training Loss: 0.01386
Epoch: 9486, Training Loss: 0.01584
Epoch: 9486, Training Loss: 0.01458
Epoch: 9486, Training Loss: 0.01810
Epoch: 9486, Training Loss: 0.01386
Epoch: 9487, Training Loss: 0.01584
Epoch: 9487, Training Loss: 0.01458
Epoch: 9487, Training Loss: 0.01810
Epoch: 9487, Training Loss: 0.01386
Epoch: 9488, Training Loss: 0.01584
Epoch: 9488, Training Loss: 0.01458
Epoch: 9488, Training Loss: 0.01810
Epoch: 9488, Training Loss: 0.01386
Epoch: 9489, Training Loss: 0.01584
Epoch: 9489, Training Loss: 0.01458
Epoch: 9489, Training Loss: 0.01810
Epoch: 9489, Training Loss: 0.01386
Epoch: 9490, Training Loss: 0.01584
Epoch: 9490, Training Loss: 0.01458
Epoch: 9490, Training Loss: 0.01810
Epoch: 9490, Training Loss: 0.01386
Epoch: 9491, Training Loss: 0.01584
Epoch: 9491, Training Loss: 0.01458
Epoch: 9491, Training Loss: 0.01810
Epoch: 9491, Training Loss: 0.01385
Epoch: 9492, Training Loss: 0.01584
Epoch: 9492, Training Loss: 0.01458
Epoch: 9492, Training Loss: 0.01810
Epoch: 9492, Training Loss: 0.01385
Epoch: 9493, Training Loss: 0.01584
Epoch: 9493, Training Loss: 0.01458
Epoch: 9493, Training Loss: 0.01810
Epoch: 9493, Training Loss: 0.01385
Epoch: 9494, Training Loss: 0.01584
Epoch: 9494, Training Loss: 0.01458
Epoch: 9494, Training Loss: 0.01810
Epoch: 9494, Training Loss: 0.01385
Epoch: 9495, Training Loss: 0.01583
Epoch: 9495, Training Loss: 0.01458
Epoch: 9495, Training Loss: 0.01809
Epoch: 9495, Training Loss: 0.01385
Epoch: 9496, Training Loss: 0.01583
Epoch: 9496, Training Loss: 0.01458
Epoch: 9496, Training Loss: 0.01809
Epoch: 9496, Training Loss: 0.01385
Epoch: 9497, Training Loss: 0.01583
Epoch: 9497, Training Loss: 0.01457
Epoch: 9497, Training Loss: 0.01809
Epoch: 9497, Training Loss: 0.01385
Epoch: 9498, Training Loss: 0.01583
Epoch: 9498, Training Loss: 0.01457
Epoch: 9498, Training Loss: 0.01809
Epoch: 9498, Training Loss: 0.01385
Epoch: 9499, Training Loss: 0.01583
Epoch: 9499, Training Loss: 0.01457
Epoch: 9499, Training Loss: 0.01809
Epoch: 9499, Training Loss: 0.01385
Epoch: 9500, Training Loss: 0.01583
Epoch: 9500, Training Loss: 0.01457
Epoch: 9500, Training Loss: 0.01809
Epoch: 9500, Training Loss: 0.01385
Epoch: 9501, Training Loss: 0.01583
Epoch: 9501, Training Loss: 0.01457
Epoch: 9501, Training Loss: 0.01809
Epoch: 9501, Training Loss: 0.01385
Epoch: 9502, Training Loss: 0.01583
Epoch: 9502, Training Loss: 0.01457
Epoch: 9502, Training Loss: 0.01809
Epoch: 9502, Training Loss: 0.01385
Epoch: 9503, Training Loss: 0.01583
Epoch: 9503, Training Loss: 0.01457
Epoch: 9503, Training Loss: 0.01809
Epoch: 9503, Training Loss: 0.01384
Epoch: 9504, Training Loss: 0.01583
Epoch: 9504, Training Loss: 0.01457
Epoch: 9504, Training Loss: 0.01808
Epoch: 9504, Training Loss: 0.01384
Epoch: 9505, Training Loss: 0.01583
Epoch: 9505, Training Loss: 0.01457
Epoch: 9505, Training Loss: 0.01808
Epoch: 9505, Training Loss: 0.01384
Epoch: 9506, Training Loss: 0.01582
Epoch: 9506, Training Loss: 0.01457
Epoch: 9506, Training Loss: 0.01808
Epoch: 9506, Training Loss: 0.01384
Epoch: 9507, Training Loss: 0.01582
Epoch: 9507, Training Loss: 0.01457
Epoch: 9507, Training Loss: 0.01808
Epoch: 9507, Training Loss: 0.01384
Epoch: 9508, Training Loss: 0.01582
Epoch: 9508, Training Loss: 0.01457
Epoch: 9508, Training Loss: 0.01808
Epoch: 9508, Training Loss: 0.01384
Epoch: 9509, Training Loss: 0.01582
Epoch: 9509, Training Loss: 0.01456
Epoch: 9509, Training Loss: 0.01808
Epoch: 9509, Training Loss: 0.01384
Epoch: 9510, Training Loss: 0.01582
Epoch: 9510, Training Loss: 0.01456
Epoch: 9510, Training Loss: 0.01808
Epoch: 9510, Training Loss: 0.01384
Epoch: 9511, Training Loss: 0.01582
Epoch: 9511, Training Loss: 0.01456
Epoch: 9511, Training Loss: 0.01808
Epoch: 9511, Training Loss: 0.01384
Epoch: 9512, Training Loss: 0.01582
Epoch: 9512, Training Loss: 0.01456
Epoch: 9512, Training Loss: 0.01808
Epoch: 9512, Training Loss: 0.01384
Epoch: 9513, Training Loss: 0.01582
Epoch: 9513, Training Loss: 0.01456
Epoch: 9513, Training Loss: 0.01808
Epoch: 9513, Training Loss: 0.01384
Epoch: 9514, Training Loss: 0.01582
Epoch: 9514, Training Loss: 0.01456
Epoch: 9514, Training Loss: 0.01807
Epoch: 9514, Training Loss: 0.01384
Epoch: 9515, Training Loss: 0.01582
Epoch: 9515, Training Loss: 0.01456
Epoch: 9515, Training Loss: 0.01807
Epoch: 9515, Training Loss: 0.01384
Epoch: 9516, Training Loss: 0.01581
Epoch: 9516, Training Loss: 0.01456
Epoch: 9516, Training Loss: 0.01807
Epoch: 9516, Training Loss: 0.01383
Epoch: 9517, Training Loss: 0.01581
Epoch: 9517, Training Loss: 0.01456
Epoch: 9517, Training Loss: 0.01807
Epoch: 9517, Training Loss: 0.01383
Epoch: 9518, Training Loss: 0.01581
Epoch: 9518, Training Loss: 0.01456
Epoch: 9518, Training Loss: 0.01807
Epoch: 9518, Training Loss: 0.01383
Epoch: 9519, Training Loss: 0.01581
Epoch: 9519, Training Loss: 0.01456
Epoch: 9519, Training Loss: 0.01807
Epoch: 9519, Training Loss: 0.01383
Epoch: 9520, Training Loss: 0.01581
Epoch: 9520, Training Loss: 0.01455
Epoch: 9520, Training Loss: 0.01807
Epoch: 9520, Training Loss: 0.01383
Epoch: 9521, Training Loss: 0.01581
Epoch: 9521, Training Loss: 0.01455
Epoch: 9521, Training Loss: 0.01807
Epoch: 9521, Training Loss: 0.01383
Epoch: 9522, Training Loss: 0.01581
Epoch: 9522, Training Loss: 0.01455
Epoch: 9522, Training Loss: 0.01807
Epoch: 9522, Training Loss: 0.01383
Epoch: 9523, Training Loss: 0.01581
Epoch: 9523, Training Loss: 0.01455
Epoch: 9523, Training Loss: 0.01806
Epoch: 9523, Training Loss: 0.01383
Epoch: 9524, Training Loss: 0.01581
Epoch: 9524, Training Loss: 0.01455
Epoch: 9524, Training Loss: 0.01806
Epoch: 9524, Training Loss: 0.01383
Epoch: 9525, Training Loss: 0.01581
Epoch: 9525, Training Loss: 0.01455
Epoch: 9525, Training Loss: 0.01806
Epoch: 9525, Training Loss: 0.01383
Epoch: 9526, Training Loss: 0.01581
Epoch: 9526, Training Loss: 0.01455
Epoch: 9526, Training Loss: 0.01806
Epoch: 9526, Training Loss: 0.01383
Epoch: 9527, Training Loss: 0.01580
Epoch: 9527, Training Loss: 0.01455
Epoch: 9527, Training Loss: 0.01806
Epoch: 9527, Training Loss: 0.01383
Epoch: 9528, Training Loss: 0.01580
Epoch: 9528, Training Loss: 0.01455
Epoch: 9528, Training Loss: 0.01806
Epoch: 9528, Training Loss: 0.01382
Epoch: 9529, Training Loss: 0.01580
Epoch: 9529, Training Loss: 0.01455
Epoch: 9529, Training Loss: 0.01806
Epoch: 9529, Training Loss: 0.01382
Epoch: 9530, Training Loss: 0.01580
Epoch: 9530, Training Loss: 0.01455
Epoch: 9530, Training Loss: 0.01806
Epoch: 9530, Training Loss: 0.01382
Epoch: 9531, Training Loss: 0.01580
Epoch: 9531, Training Loss: 0.01455
Epoch: 9531, Training Loss: 0.01806
Epoch: 9531, Training Loss: 0.01382
Epoch: 9532, Training Loss: 0.01580
Epoch: 9532, Training Loss: 0.01454
Epoch: 9532, Training Loss: 0.01805
Epoch: 9532, Training Loss: 0.01382
Epoch: 9533, Training Loss: 0.01580
Epoch: 9533, Training Loss: 0.01454
Epoch: 9533, Training Loss: 0.01805
Epoch: 9533, Training Loss: 0.01382
Epoch: 9534, Training Loss: 0.01580
Epoch: 9534, Training Loss: 0.01454
Epoch: 9534, Training Loss: 0.01805
Epoch: 9534, Training Loss: 0.01382
Epoch: 9535, Training Loss: 0.01580
Epoch: 9535, Training Loss: 0.01454
Epoch: 9535, Training Loss: 0.01805
Epoch: 9535, Training Loss: 0.01382
Epoch: 9536, Training Loss: 0.01580
Epoch: 9536, Training Loss: 0.01454
Epoch: 9536, Training Loss: 0.01805
Epoch: 9536, Training Loss: 0.01382
Epoch: 9537, Training Loss: 0.01579
Epoch: 9537, Training Loss: 0.01454
Epoch: 9537, Training Loss: 0.01805
Epoch: 9537, Training Loss: 0.01382
Epoch: 9538, Training Loss: 0.01579
Epoch: 9538, Training Loss: 0.01454
Epoch: 9538, Training Loss: 0.01805
Epoch: 9538, Training Loss: 0.01382
Epoch: 9539, Training Loss: 0.01579
Epoch: 9539, Training Loss: 0.01454
Epoch: 9539, Training Loss: 0.01805
Epoch: 9539, Training Loss: 0.01382
Epoch: 9540, Training Loss: 0.01579
Epoch: 9540, Training Loss: 0.01454
Epoch: 9540, Training Loss: 0.01805
Epoch: 9540, Training Loss: 0.01381
Epoch: 9541, Training Loss: 0.01579
Epoch: 9541, Training Loss: 0.01454
Epoch: 9541, Training Loss: 0.01804
Epoch: 9541, Training Loss: 0.01381
Epoch: 9542, Training Loss: 0.01579
Epoch: 9542, Training Loss: 0.01454
Epoch: 9542, Training Loss: 0.01804
Epoch: 9542, Training Loss: 0.01381
Epoch: 9543, Training Loss: 0.01579
Epoch: 9543, Training Loss: 0.01454
Epoch: 9543, Training Loss: 0.01804
Epoch: 9543, Training Loss: 0.01381
...
Epoch: 9793, Training Loss: 0.01556
Epoch: 9793, Training Loss: 0.01433
[repetitive log output truncated: four per-pattern losses are printed each epoch, falling only slightly over this range, from roughly 0.01579 / 0.01454 / 0.01804 / 0.01381 at epoch 9543 to roughly 0.01556 / 0.01433 / 0.01779 / 0.01362 at epoch 9793]
Epoch: 9793, Training Loss: 0.01778
Epoch: 9793, Training Loss: 0.01361
Epoch: 9794, Training Loss: 0.01556
Epoch: 9794, Training Loss: 0.01433
Epoch: 9794, Training Loss: 0.01778
Epoch: 9794, Training Loss: 0.01361
Epoch: 9795, Training Loss: 0.01556
Epoch: 9795, Training Loss: 0.01433
Epoch: 9795, Training Loss: 0.01778
Epoch: 9795, Training Loss: 0.01361
Epoch: 9796, Training Loss: 0.01556
Epoch: 9796, Training Loss: 0.01433
Epoch: 9796, Training Loss: 0.01777
Epoch: 9796, Training Loss: 0.01361
Epoch: 9797, Training Loss: 0.01555
Epoch: 9797, Training Loss: 0.01433
Epoch: 9797, Training Loss: 0.01777
Epoch: 9797, Training Loss: 0.01361
Epoch: 9798, Training Loss: 0.01555
Epoch: 9798, Training Loss: 0.01433
Epoch: 9798, Training Loss: 0.01777
Epoch: 9798, Training Loss: 0.01361
Epoch: 9799, Training Loss: 0.01555
Epoch: 9799, Training Loss: 0.01432
Epoch: 9799, Training Loss: 0.01777
Epoch: 9799, Training Loss: 0.01361
Epoch: 9800, Training Loss: 0.01555
Epoch: 9800, Training Loss: 0.01432
Epoch: 9800, Training Loss: 0.01777
Epoch: 9800, Training Loss: 0.01361
Epoch: 9801, Training Loss: 0.01555
Epoch: 9801, Training Loss: 0.01432
Epoch: 9801, Training Loss: 0.01777
Epoch: 9801, Training Loss: 0.01361
Epoch: 9802, Training Loss: 0.01555
Epoch: 9802, Training Loss: 0.01432
Epoch: 9802, Training Loss: 0.01777
Epoch: 9802, Training Loss: 0.01361
Epoch: 9803, Training Loss: 0.01555
Epoch: 9803, Training Loss: 0.01432
Epoch: 9803, Training Loss: 0.01777
Epoch: 9803, Training Loss: 0.01361
Epoch: 9804, Training Loss: 0.01555
Epoch: 9804, Training Loss: 0.01432
Epoch: 9804, Training Loss: 0.01777
Epoch: 9804, Training Loss: 0.01361
Epoch: 9805, Training Loss: 0.01555
Epoch: 9805, Training Loss: 0.01432
Epoch: 9805, Training Loss: 0.01776
Epoch: 9805, Training Loss: 0.01361
Epoch: 9806, Training Loss: 0.01555
Epoch: 9806, Training Loss: 0.01432
Epoch: 9806, Training Loss: 0.01776
Epoch: 9806, Training Loss: 0.01360
Epoch: 9807, Training Loss: 0.01555
Epoch: 9807, Training Loss: 0.01432
Epoch: 9807, Training Loss: 0.01776
Epoch: 9807, Training Loss: 0.01360
Epoch: 9808, Training Loss: 0.01554
Epoch: 9808, Training Loss: 0.01432
Epoch: 9808, Training Loss: 0.01776
Epoch: 9808, Training Loss: 0.01360
Epoch: 9809, Training Loss: 0.01554
Epoch: 9809, Training Loss: 0.01432
Epoch: 9809, Training Loss: 0.01776
Epoch: 9809, Training Loss: 0.01360
Epoch: 9810, Training Loss: 0.01554
Epoch: 9810, Training Loss: 0.01432
Epoch: 9810, Training Loss: 0.01776
Epoch: 9810, Training Loss: 0.01360
Epoch: 9811, Training Loss: 0.01554
Epoch: 9811, Training Loss: 0.01432
Epoch: 9811, Training Loss: 0.01776
Epoch: 9811, Training Loss: 0.01360
Epoch: 9812, Training Loss: 0.01554
Epoch: 9812, Training Loss: 0.01431
Epoch: 9812, Training Loss: 0.01776
Epoch: 9812, Training Loss: 0.01360
Epoch: 9813, Training Loss: 0.01554
Epoch: 9813, Training Loss: 0.01431
Epoch: 9813, Training Loss: 0.01776
Epoch: 9813, Training Loss: 0.01360
Epoch: 9814, Training Loss: 0.01554
Epoch: 9814, Training Loss: 0.01431
Epoch: 9814, Training Loss: 0.01776
Epoch: 9814, Training Loss: 0.01360
Epoch: 9815, Training Loss: 0.01554
Epoch: 9815, Training Loss: 0.01431
Epoch: 9815, Training Loss: 0.01775
Epoch: 9815, Training Loss: 0.01360
Epoch: 9816, Training Loss: 0.01554
Epoch: 9816, Training Loss: 0.01431
Epoch: 9816, Training Loss: 0.01775
Epoch: 9816, Training Loss: 0.01360
Epoch: 9817, Training Loss: 0.01554
Epoch: 9817, Training Loss: 0.01431
Epoch: 9817, Training Loss: 0.01775
Epoch: 9817, Training Loss: 0.01360
Epoch: 9818, Training Loss: 0.01554
Epoch: 9818, Training Loss: 0.01431
Epoch: 9818, Training Loss: 0.01775
Epoch: 9818, Training Loss: 0.01360
Epoch: 9819, Training Loss: 0.01554
Epoch: 9819, Training Loss: 0.01431
Epoch: 9819, Training Loss: 0.01775
Epoch: 9819, Training Loss: 0.01359
Epoch: 9820, Training Loss: 0.01553
Epoch: 9820, Training Loss: 0.01431
Epoch: 9820, Training Loss: 0.01775
Epoch: 9820, Training Loss: 0.01359
Epoch: 9821, Training Loss: 0.01553
Epoch: 9821, Training Loss: 0.01431
Epoch: 9821, Training Loss: 0.01775
Epoch: 9821, Training Loss: 0.01359
Epoch: 9822, Training Loss: 0.01553
Epoch: 9822, Training Loss: 0.01431
Epoch: 9822, Training Loss: 0.01775
Epoch: 9822, Training Loss: 0.01359
Epoch: 9823, Training Loss: 0.01553
Epoch: 9823, Training Loss: 0.01431
Epoch: 9823, Training Loss: 0.01775
Epoch: 9823, Training Loss: 0.01359
Epoch: 9824, Training Loss: 0.01553
Epoch: 9824, Training Loss: 0.01430
Epoch: 9824, Training Loss: 0.01775
Epoch: 9824, Training Loss: 0.01359
Epoch: 9825, Training Loss: 0.01553
Epoch: 9825, Training Loss: 0.01430
Epoch: 9825, Training Loss: 0.01774
Epoch: 9825, Training Loss: 0.01359
Epoch: 9826, Training Loss: 0.01553
Epoch: 9826, Training Loss: 0.01430
Epoch: 9826, Training Loss: 0.01774
Epoch: 9826, Training Loss: 0.01359
Epoch: 9827, Training Loss: 0.01553
Epoch: 9827, Training Loss: 0.01430
Epoch: 9827, Training Loss: 0.01774
Epoch: 9827, Training Loss: 0.01359
Epoch: 9828, Training Loss: 0.01553
Epoch: 9828, Training Loss: 0.01430
Epoch: 9828, Training Loss: 0.01774
Epoch: 9828, Training Loss: 0.01359
Epoch: 9829, Training Loss: 0.01553
Epoch: 9829, Training Loss: 0.01430
Epoch: 9829, Training Loss: 0.01774
Epoch: 9829, Training Loss: 0.01359
Epoch: 9830, Training Loss: 0.01553
Epoch: 9830, Training Loss: 0.01430
Epoch: 9830, Training Loss: 0.01774
Epoch: 9830, Training Loss: 0.01359
Epoch: 9831, Training Loss: 0.01552
Epoch: 9831, Training Loss: 0.01430
Epoch: 9831, Training Loss: 0.01774
Epoch: 9831, Training Loss: 0.01359
Epoch: 9832, Training Loss: 0.01552
Epoch: 9832, Training Loss: 0.01430
Epoch: 9832, Training Loss: 0.01774
Epoch: 9832, Training Loss: 0.01358
Epoch: 9833, Training Loss: 0.01552
Epoch: 9833, Training Loss: 0.01430
Epoch: 9833, Training Loss: 0.01774
Epoch: 9833, Training Loss: 0.01358
Epoch: 9834, Training Loss: 0.01552
Epoch: 9834, Training Loss: 0.01430
Epoch: 9834, Training Loss: 0.01773
Epoch: 9834, Training Loss: 0.01358
Epoch: 9835, Training Loss: 0.01552
Epoch: 9835, Training Loss: 0.01430
Epoch: 9835, Training Loss: 0.01773
Epoch: 9835, Training Loss: 0.01358
Epoch: 9836, Training Loss: 0.01552
Epoch: 9836, Training Loss: 0.01430
Epoch: 9836, Training Loss: 0.01773
Epoch: 9836, Training Loss: 0.01358
Epoch: 9837, Training Loss: 0.01552
Epoch: 9837, Training Loss: 0.01429
Epoch: 9837, Training Loss: 0.01773
Epoch: 9837, Training Loss: 0.01358
Epoch: 9838, Training Loss: 0.01552
Epoch: 9838, Training Loss: 0.01429
Epoch: 9838, Training Loss: 0.01773
Epoch: 9838, Training Loss: 0.01358
Epoch: 9839, Training Loss: 0.01552
Epoch: 9839, Training Loss: 0.01429
Epoch: 9839, Training Loss: 0.01773
Epoch: 9839, Training Loss: 0.01358
Epoch: 9840, Training Loss: 0.01552
Epoch: 9840, Training Loss: 0.01429
Epoch: 9840, Training Loss: 0.01773
Epoch: 9840, Training Loss: 0.01358
Epoch: 9841, Training Loss: 0.01552
Epoch: 9841, Training Loss: 0.01429
Epoch: 9841, Training Loss: 0.01773
Epoch: 9841, Training Loss: 0.01358
Epoch: 9842, Training Loss: 0.01551
Epoch: 9842, Training Loss: 0.01429
Epoch: 9842, Training Loss: 0.01773
Epoch: 9842, Training Loss: 0.01358
Epoch: 9843, Training Loss: 0.01551
Epoch: 9843, Training Loss: 0.01429
Epoch: 9843, Training Loss: 0.01773
Epoch: 9843, Training Loss: 0.01358
Epoch: 9844, Training Loss: 0.01551
Epoch: 9844, Training Loss: 0.01429
Epoch: 9844, Training Loss: 0.01772
Epoch: 9844, Training Loss: 0.01358
Epoch: 9845, Training Loss: 0.01551
Epoch: 9845, Training Loss: 0.01429
Epoch: 9845, Training Loss: 0.01772
Epoch: 9845, Training Loss: 0.01357
Epoch: 9846, Training Loss: 0.01551
Epoch: 9846, Training Loss: 0.01429
Epoch: 9846, Training Loss: 0.01772
Epoch: 9846, Training Loss: 0.01357
Epoch: 9847, Training Loss: 0.01551
Epoch: 9847, Training Loss: 0.01429
Epoch: 9847, Training Loss: 0.01772
Epoch: 9847, Training Loss: 0.01357
Epoch: 9848, Training Loss: 0.01551
Epoch: 9848, Training Loss: 0.01429
Epoch: 9848, Training Loss: 0.01772
Epoch: 9848, Training Loss: 0.01357
Epoch: 9849, Training Loss: 0.01551
Epoch: 9849, Training Loss: 0.01428
Epoch: 9849, Training Loss: 0.01772
Epoch: 9849, Training Loss: 0.01357
Epoch: 9850, Training Loss: 0.01551
Epoch: 9850, Training Loss: 0.01428
Epoch: 9850, Training Loss: 0.01772
Epoch: 9850, Training Loss: 0.01357
Epoch: 9851, Training Loss: 0.01551
Epoch: 9851, Training Loss: 0.01428
Epoch: 9851, Training Loss: 0.01772
Epoch: 9851, Training Loss: 0.01357
Epoch: 9852, Training Loss: 0.01551
Epoch: 9852, Training Loss: 0.01428
Epoch: 9852, Training Loss: 0.01772
Epoch: 9852, Training Loss: 0.01357
Epoch: 9853, Training Loss: 0.01550
Epoch: 9853, Training Loss: 0.01428
Epoch: 9853, Training Loss: 0.01772
Epoch: 9853, Training Loss: 0.01357
Epoch: 9854, Training Loss: 0.01550
Epoch: 9854, Training Loss: 0.01428
Epoch: 9854, Training Loss: 0.01771
Epoch: 9854, Training Loss: 0.01357
Epoch: 9855, Training Loss: 0.01550
Epoch: 9855, Training Loss: 0.01428
Epoch: 9855, Training Loss: 0.01771
Epoch: 9855, Training Loss: 0.01357
Epoch: 9856, Training Loss: 0.01550
Epoch: 9856, Training Loss: 0.01428
Epoch: 9856, Training Loss: 0.01771
Epoch: 9856, Training Loss: 0.01357
Epoch: 9857, Training Loss: 0.01550
Epoch: 9857, Training Loss: 0.01428
Epoch: 9857, Training Loss: 0.01771
Epoch: 9857, Training Loss: 0.01357
Epoch: 9858, Training Loss: 0.01550
Epoch: 9858, Training Loss: 0.01428
Epoch: 9858, Training Loss: 0.01771
Epoch: 9858, Training Loss: 0.01356
Epoch: 9859, Training Loss: 0.01550
Epoch: 9859, Training Loss: 0.01428
Epoch: 9859, Training Loss: 0.01771
Epoch: 9859, Training Loss: 0.01356
Epoch: 9860, Training Loss: 0.01550
Epoch: 9860, Training Loss: 0.01428
Epoch: 9860, Training Loss: 0.01771
Epoch: 9860, Training Loss: 0.01356
Epoch: 9861, Training Loss: 0.01550
Epoch: 9861, Training Loss: 0.01428
Epoch: 9861, Training Loss: 0.01771
Epoch: 9861, Training Loss: 0.01356
Epoch: 9862, Training Loss: 0.01550
Epoch: 9862, Training Loss: 0.01427
Epoch: 9862, Training Loss: 0.01771
Epoch: 9862, Training Loss: 0.01356
Epoch: 9863, Training Loss: 0.01550
Epoch: 9863, Training Loss: 0.01427
Epoch: 9863, Training Loss: 0.01771
Epoch: 9863, Training Loss: 0.01356
Epoch: 9864, Training Loss: 0.01549
Epoch: 9864, Training Loss: 0.01427
Epoch: 9864, Training Loss: 0.01770
Epoch: 9864, Training Loss: 0.01356
Epoch: 9865, Training Loss: 0.01549
Epoch: 9865, Training Loss: 0.01427
Epoch: 9865, Training Loss: 0.01770
Epoch: 9865, Training Loss: 0.01356
Epoch: 9866, Training Loss: 0.01549
Epoch: 9866, Training Loss: 0.01427
Epoch: 9866, Training Loss: 0.01770
Epoch: 9866, Training Loss: 0.01356
Epoch: 9867, Training Loss: 0.01549
Epoch: 9867, Training Loss: 0.01427
Epoch: 9867, Training Loss: 0.01770
Epoch: 9867, Training Loss: 0.01356
Epoch: 9868, Training Loss: 0.01549
Epoch: 9868, Training Loss: 0.01427
Epoch: 9868, Training Loss: 0.01770
Epoch: 9868, Training Loss: 0.01356
Epoch: 9869, Training Loss: 0.01549
Epoch: 9869, Training Loss: 0.01427
Epoch: 9869, Training Loss: 0.01770
Epoch: 9869, Training Loss: 0.01356
Epoch: 9870, Training Loss: 0.01549
Epoch: 9870, Training Loss: 0.01427
Epoch: 9870, Training Loss: 0.01770
Epoch: 9870, Training Loss: 0.01356
Epoch: 9871, Training Loss: 0.01549
Epoch: 9871, Training Loss: 0.01427
Epoch: 9871, Training Loss: 0.01770
Epoch: 9871, Training Loss: 0.01355
Epoch: 9872, Training Loss: 0.01549
Epoch: 9872, Training Loss: 0.01427
Epoch: 9872, Training Loss: 0.01770
Epoch: 9872, Training Loss: 0.01355
Epoch: 9873, Training Loss: 0.01549
Epoch: 9873, Training Loss: 0.01427
Epoch: 9873, Training Loss: 0.01769
Epoch: 9873, Training Loss: 0.01355
Epoch: 9874, Training Loss: 0.01549
Epoch: 9874, Training Loss: 0.01426
Epoch: 9874, Training Loss: 0.01769
Epoch: 9874, Training Loss: 0.01355
Epoch: 9875, Training Loss: 0.01548
Epoch: 9875, Training Loss: 0.01426
Epoch: 9875, Training Loss: 0.01769
Epoch: 9875, Training Loss: 0.01355
Epoch: 9876, Training Loss: 0.01548
Epoch: 9876, Training Loss: 0.01426
Epoch: 9876, Training Loss: 0.01769
Epoch: 9876, Training Loss: 0.01355
Epoch: 9877, Training Loss: 0.01548
Epoch: 9877, Training Loss: 0.01426
Epoch: 9877, Training Loss: 0.01769
Epoch: 9877, Training Loss: 0.01355
Epoch: 9878, Training Loss: 0.01548
Epoch: 9878, Training Loss: 0.01426
Epoch: 9878, Training Loss: 0.01769
Epoch: 9878, Training Loss: 0.01355
Epoch: 9879, Training Loss: 0.01548
Epoch: 9879, Training Loss: 0.01426
Epoch: 9879, Training Loss: 0.01769
Epoch: 9879, Training Loss: 0.01355
Epoch: 9880, Training Loss: 0.01548
Epoch: 9880, Training Loss: 0.01426
Epoch: 9880, Training Loss: 0.01769
Epoch: 9880, Training Loss: 0.01355
Epoch: 9881, Training Loss: 0.01548
Epoch: 9881, Training Loss: 0.01426
Epoch: 9881, Training Loss: 0.01769
Epoch: 9881, Training Loss: 0.01355
Epoch: 9882, Training Loss: 0.01548
Epoch: 9882, Training Loss: 0.01426
Epoch: 9882, Training Loss: 0.01769
Epoch: 9882, Training Loss: 0.01355
Epoch: 9883, Training Loss: 0.01548
Epoch: 9883, Training Loss: 0.01426
Epoch: 9883, Training Loss: 0.01768
Epoch: 9883, Training Loss: 0.01355
Epoch: 9884, Training Loss: 0.01548
Epoch: 9884, Training Loss: 0.01426
Epoch: 9884, Training Loss: 0.01768
Epoch: 9884, Training Loss: 0.01354
Epoch: 9885, Training Loss: 0.01548
Epoch: 9885, Training Loss: 0.01426
Epoch: 9885, Training Loss: 0.01768
Epoch: 9885, Training Loss: 0.01354
Epoch: 9886, Training Loss: 0.01548
Epoch: 9886, Training Loss: 0.01426
Epoch: 9886, Training Loss: 0.01768
Epoch: 9886, Training Loss: 0.01354
Epoch: 9887, Training Loss: 0.01547
Epoch: 9887, Training Loss: 0.01425
Epoch: 9887, Training Loss: 0.01768
Epoch: 9887, Training Loss: 0.01354
Epoch: 9888, Training Loss: 0.01547
Epoch: 9888, Training Loss: 0.01425
Epoch: 9888, Training Loss: 0.01768
Epoch: 9888, Training Loss: 0.01354
Epoch: 9889, Training Loss: 0.01547
Epoch: 9889, Training Loss: 0.01425
Epoch: 9889, Training Loss: 0.01768
Epoch: 9889, Training Loss: 0.01354
Epoch: 9890, Training Loss: 0.01547
Epoch: 9890, Training Loss: 0.01425
Epoch: 9890, Training Loss: 0.01768
Epoch: 9890, Training Loss: 0.01354
Epoch: 9891, Training Loss: 0.01547
Epoch: 9891, Training Loss: 0.01425
Epoch: 9891, Training Loss: 0.01768
Epoch: 9891, Training Loss: 0.01354
Epoch: 9892, Training Loss: 0.01547
Epoch: 9892, Training Loss: 0.01425
Epoch: 9892, Training Loss: 0.01768
Epoch: 9892, Training Loss: 0.01354
Epoch: 9893, Training Loss: 0.01547
Epoch: 9893, Training Loss: 0.01425
Epoch: 9893, Training Loss: 0.01767
Epoch: 9893, Training Loss: 0.01354
Epoch: 9894, Training Loss: 0.01547
Epoch: 9894, Training Loss: 0.01425
Epoch: 9894, Training Loss: 0.01767
Epoch: 9894, Training Loss: 0.01354
Epoch: 9895, Training Loss: 0.01547
Epoch: 9895, Training Loss: 0.01425
Epoch: 9895, Training Loss: 0.01767
Epoch: 9895, Training Loss: 0.01354
Epoch: 9896, Training Loss: 0.01547
Epoch: 9896, Training Loss: 0.01425
Epoch: 9896, Training Loss: 0.01767
Epoch: 9896, Training Loss: 0.01354
Epoch: 9897, Training Loss: 0.01547
Epoch: 9897, Training Loss: 0.01425
Epoch: 9897, Training Loss: 0.01767
Epoch: 9897, Training Loss: 0.01353
Epoch: 9898, Training Loss: 0.01546
Epoch: 9898, Training Loss: 0.01425
Epoch: 9898, Training Loss: 0.01767
Epoch: 9898, Training Loss: 0.01353
Epoch: 9899, Training Loss: 0.01546
Epoch: 9899, Training Loss: 0.01424
Epoch: 9899, Training Loss: 0.01767
Epoch: 9899, Training Loss: 0.01353
Epoch: 9900, Training Loss: 0.01546
Epoch: 9900, Training Loss: 0.01424
Epoch: 9900, Training Loss: 0.01767
Epoch: 9900, Training Loss: 0.01353
Epoch: 9901, Training Loss: 0.01546
Epoch: 9901, Training Loss: 0.01424
Epoch: 9901, Training Loss: 0.01767
Epoch: 9901, Training Loss: 0.01353
Epoch: 9902, Training Loss: 0.01546
Epoch: 9902, Training Loss: 0.01424
Epoch: 9902, Training Loss: 0.01767
Epoch: 9902, Training Loss: 0.01353
Epoch: 9903, Training Loss: 0.01546
Epoch: 9903, Training Loss: 0.01424
Epoch: 9903, Training Loss: 0.01766
Epoch: 9903, Training Loss: 0.01353
Epoch: 9904, Training Loss: 0.01546
Epoch: 9904, Training Loss: 0.01424
Epoch: 9904, Training Loss: 0.01766
Epoch: 9904, Training Loss: 0.01353
Epoch: 9905, Training Loss: 0.01546
Epoch: 9905, Training Loss: 0.01424
Epoch: 9905, Training Loss: 0.01766
Epoch: 9905, Training Loss: 0.01353
Epoch: 9906, Training Loss: 0.01546
Epoch: 9906, Training Loss: 0.01424
Epoch: 9906, Training Loss: 0.01766
Epoch: 9906, Training Loss: 0.01353
Epoch: 9907, Training Loss: 0.01546
Epoch: 9907, Training Loss: 0.01424
Epoch: 9907, Training Loss: 0.01766
Epoch: 9907, Training Loss: 0.01353
Epoch: 9908, Training Loss: 0.01546
Epoch: 9908, Training Loss: 0.01424
Epoch: 9908, Training Loss: 0.01766
Epoch: 9908, Training Loss: 0.01353
Epoch: 9909, Training Loss: 0.01545
Epoch: 9909, Training Loss: 0.01424
Epoch: 9909, Training Loss: 0.01766
Epoch: 9909, Training Loss: 0.01353
Epoch: 9910, Training Loss: 0.01545
Epoch: 9910, Training Loss: 0.01424
Epoch: 9910, Training Loss: 0.01766
Epoch: 9910, Training Loss: 0.01352
Epoch: 9911, Training Loss: 0.01545
Epoch: 9911, Training Loss: 0.01424
Epoch: 9911, Training Loss: 0.01766
Epoch: 9911, Training Loss: 0.01352
Epoch: 9912, Training Loss: 0.01545
Epoch: 9912, Training Loss: 0.01423
Epoch: 9912, Training Loss: 0.01766
Epoch: 9912, Training Loss: 0.01352
Epoch: 9913, Training Loss: 0.01545
Epoch: 9913, Training Loss: 0.01423
Epoch: 9913, Training Loss: 0.01765
Epoch: 9913, Training Loss: 0.01352
Epoch: 9914, Training Loss: 0.01545
Epoch: 9914, Training Loss: 0.01423
Epoch: 9914, Training Loss: 0.01765
Epoch: 9914, Training Loss: 0.01352
Epoch: 9915, Training Loss: 0.01545
Epoch: 9915, Training Loss: 0.01423
Epoch: 9915, Training Loss: 0.01765
Epoch: 9915, Training Loss: 0.01352
Epoch: 9916, Training Loss: 0.01545
Epoch: 9916, Training Loss: 0.01423
Epoch: 9916, Training Loss: 0.01765
Epoch: 9916, Training Loss: 0.01352
Epoch: 9917, Training Loss: 0.01545
Epoch: 9917, Training Loss: 0.01423
Epoch: 9917, Training Loss: 0.01765
Epoch: 9917, Training Loss: 0.01352
Epoch: 9918, Training Loss: 0.01545
Epoch: 9918, Training Loss: 0.01423
Epoch: 9918, Training Loss: 0.01765
Epoch: 9918, Training Loss: 0.01352
Epoch: 9919, Training Loss: 0.01545
Epoch: 9919, Training Loss: 0.01423
Epoch: 9919, Training Loss: 0.01765
Epoch: 9919, Training Loss: 0.01352
Epoch: 9920, Training Loss: 0.01544
Epoch: 9920, Training Loss: 0.01423
Epoch: 9920, Training Loss: 0.01765
Epoch: 9920, Training Loss: 0.01352
Epoch: 9921, Training Loss: 0.01544
Epoch: 9921, Training Loss: 0.01423
Epoch: 9921, Training Loss: 0.01765
Epoch: 9921, Training Loss: 0.01352
Epoch: 9922, Training Loss: 0.01544
Epoch: 9922, Training Loss: 0.01423
Epoch: 9922, Training Loss: 0.01764
Epoch: 9922, Training Loss: 0.01352
Epoch: 9923, Training Loss: 0.01544
Epoch: 9923, Training Loss: 0.01423
Epoch: 9923, Training Loss: 0.01764
Epoch: 9923, Training Loss: 0.01352
Epoch: 9924, Training Loss: 0.01544
Epoch: 9924, Training Loss: 0.01423
Epoch: 9924, Training Loss: 0.01764
Epoch: 9924, Training Loss: 0.01351
Epoch: 9925, Training Loss: 0.01544
Epoch: 9925, Training Loss: 0.01422
Epoch: 9925, Training Loss: 0.01764
Epoch: 9925, Training Loss: 0.01351
Epoch: 9926, Training Loss: 0.01544
Epoch: 9926, Training Loss: 0.01422
Epoch: 9926, Training Loss: 0.01764
Epoch: 9926, Training Loss: 0.01351
Epoch: 9927, Training Loss: 0.01544
Epoch: 9927, Training Loss: 0.01422
Epoch: 9927, Training Loss: 0.01764
Epoch: 9927, Training Loss: 0.01351
Epoch: 9928, Training Loss: 0.01544
Epoch: 9928, Training Loss: 0.01422
Epoch: 9928, Training Loss: 0.01764
Epoch: 9928, Training Loss: 0.01351
Epoch: 9929, Training Loss: 0.01544
Epoch: 9929, Training Loss: 0.01422
Epoch: 9929, Training Loss: 0.01764
Epoch: 9929, Training Loss: 0.01351
Epoch: 9930, Training Loss: 0.01544
Epoch: 9930, Training Loss: 0.01422
Epoch: 9930, Training Loss: 0.01764
Epoch: 9930, Training Loss: 0.01351
Epoch: 9931, Training Loss: 0.01544
Epoch: 9931, Training Loss: 0.01422
Epoch: 9931, Training Loss: 0.01764
Epoch: 9931, Training Loss: 0.01351
Epoch: 9932, Training Loss: 0.01543
Epoch: 9932, Training Loss: 0.01422
Epoch: 9932, Training Loss: 0.01763
Epoch: 9932, Training Loss: 0.01351
Epoch: 9933, Training Loss: 0.01543
Epoch: 9933, Training Loss: 0.01422
Epoch: 9933, Training Loss: 0.01763
Epoch: 9933, Training Loss: 0.01351
Epoch: 9934, Training Loss: 0.01543
Epoch: 9934, Training Loss: 0.01422
Epoch: 9934, Training Loss: 0.01763
Epoch: 9934, Training Loss: 0.01351
Epoch: 9935, Training Loss: 0.01543
Epoch: 9935, Training Loss: 0.01422
Epoch: 9935, Training Loss: 0.01763
Epoch: 9935, Training Loss: 0.01351
Epoch: 9936, Training Loss: 0.01543
Epoch: 9936, Training Loss: 0.01422
Epoch: 9936, Training Loss: 0.01763
Epoch: 9936, Training Loss: 0.01351
Epoch: 9937, Training Loss: 0.01543
Epoch: 9937, Training Loss: 0.01421
Epoch: 9937, Training Loss: 0.01763
Epoch: 9937, Training Loss: 0.01350
Epoch: 9938, Training Loss: 0.01543
Epoch: 9938, Training Loss: 0.01421
Epoch: 9938, Training Loss: 0.01763
Epoch: 9938, Training Loss: 0.01350
Epoch: 9939, Training Loss: 0.01543
Epoch: 9939, Training Loss: 0.01421
Epoch: 9939, Training Loss: 0.01763
Epoch: 9939, Training Loss: 0.01350
Epoch: 9940, Training Loss: 0.01543
Epoch: 9940, Training Loss: 0.01421
Epoch: 9940, Training Loss: 0.01763
Epoch: 9940, Training Loss: 0.01350
Epoch: 9941, Training Loss: 0.01543
Epoch: 9941, Training Loss: 0.01421
Epoch: 9941, Training Loss: 0.01763
Epoch: 9941, Training Loss: 0.01350
Epoch: 9942, Training Loss: 0.01543
Epoch: 9942, Training Loss: 0.01421
Epoch: 9942, Training Loss: 0.01762
Epoch: 9942, Training Loss: 0.01350
Epoch: 9943, Training Loss: 0.01542
Epoch: 9943, Training Loss: 0.01421
Epoch: 9943, Training Loss: 0.01762
Epoch: 9943, Training Loss: 0.01350
Epoch: 9944, Training Loss: 0.01542
Epoch: 9944, Training Loss: 0.01421
Epoch: 9944, Training Loss: 0.01762
Epoch: 9944, Training Loss: 0.01350
Epoch: 9945, Training Loss: 0.01542
Epoch: 9945, Training Loss: 0.01421
Epoch: 9945, Training Loss: 0.01762
Epoch: 9945, Training Loss: 0.01350
Epoch: 9946, Training Loss: 0.01542
Epoch: 9946, Training Loss: 0.01421
Epoch: 9946, Training Loss: 0.01762
Epoch: 9946, Training Loss: 0.01350
Epoch: 9947, Training Loss: 0.01542
Epoch: 9947, Training Loss: 0.01421
Epoch: 9947, Training Loss: 0.01762
Epoch: 9947, Training Loss: 0.01350
Epoch: 9948, Training Loss: 0.01542
Epoch: 9948, Training Loss: 0.01421
Epoch: 9948, Training Loss: 0.01762
Epoch: 9948, Training Loss: 0.01350
Epoch: 9949, Training Loss: 0.01542
Epoch: 9949, Training Loss: 0.01421
Epoch: 9949, Training Loss: 0.01762
Epoch: 9949, Training Loss: 0.01350
Epoch: 9950, Training Loss: 0.01542
Epoch: 9950, Training Loss: 0.01420
Epoch: 9950, Training Loss: 0.01762
Epoch: 9950, Training Loss: 0.01349
Epoch: 9951, Training Loss: 0.01542
Epoch: 9951, Training Loss: 0.01420
Epoch: 9951, Training Loss: 0.01762
Epoch: 9951, Training Loss: 0.01349
Epoch: 9952, Training Loss: 0.01542
Epoch: 9952, Training Loss: 0.01420
Epoch: 9952, Training Loss: 0.01761
Epoch: 9952, Training Loss: 0.01349
Epoch: 9953, Training Loss: 0.01542
Epoch: 9953, Training Loss: 0.01420
Epoch: 9953, Training Loss: 0.01761
Epoch: 9953, Training Loss: 0.01349
Epoch: 9954, Training Loss: 0.01541
Epoch: 9954, Training Loss: 0.01420
Epoch: 9954, Training Loss: 0.01761
Epoch: 9954, Training Loss: 0.01349
Epoch: 9955, Training Loss: 0.01541
Epoch: 9955, Training Loss: 0.01420
Epoch: 9955, Training Loss: 0.01761
Epoch: 9955, Training Loss: 0.01349
Epoch: 9956, Training Loss: 0.01541
Epoch: 9956, Training Loss: 0.01420
Epoch: 9956, Training Loss: 0.01761
Epoch: 9956, Training Loss: 0.01349
Epoch: 9957, Training Loss: 0.01541
Epoch: 9957, Training Loss: 0.01420
Epoch: 9957, Training Loss: 0.01761
Epoch: 9957, Training Loss: 0.01349
Epoch: 9958, Training Loss: 0.01541
Epoch: 9958, Training Loss: 0.01420
Epoch: 9958, Training Loss: 0.01761
Epoch: 9958, Training Loss: 0.01349
Epoch: 9959, Training Loss: 0.01541
Epoch: 9959, Training Loss: 0.01420
Epoch: 9959, Training Loss: 0.01761
Epoch: 9959, Training Loss: 0.01349
Epoch: 9960, Training Loss: 0.01541
Epoch: 9960, Training Loss: 0.01420
Epoch: 9960, Training Loss: 0.01761
Epoch: 9960, Training Loss: 0.01349
Epoch: 9961, Training Loss: 0.01541
Epoch: 9961, Training Loss: 0.01420
Epoch: 9961, Training Loss: 0.01761
Epoch: 9961, Training Loss: 0.01349
Epoch: 9962, Training Loss: 0.01541
Epoch: 9962, Training Loss: 0.01420
Epoch: 9962, Training Loss: 0.01760
Epoch: 9962, Training Loss: 0.01349
Epoch: 9963, Training Loss: 0.01541
Epoch: 9963, Training Loss: 0.01419
Epoch: 9963, Training Loss: 0.01760
Epoch: 9963, Training Loss: 0.01348
Epoch: 9964, Training Loss: 0.01541
Epoch: 9964, Training Loss: 0.01419
Epoch: 9964, Training Loss: 0.01760
Epoch: 9964, Training Loss: 0.01348
Epoch: 9965, Training Loss: 0.01541
Epoch: 9965, Training Loss: 0.01419
Epoch: 9965, Training Loss: 0.01760
Epoch: 9965, Training Loss: 0.01348
Epoch: 9966, Training Loss: 0.01540
Epoch: 9966, Training Loss: 0.01419
Epoch: 9966, Training Loss: 0.01760
Epoch: 9966, Training Loss: 0.01348
Epoch: 9967, Training Loss: 0.01540
Epoch: 9967, Training Loss: 0.01419
Epoch: 9967, Training Loss: 0.01760
Epoch: 9967, Training Loss: 0.01348
Epoch: 9968, Training Loss: 0.01540
Epoch: 9968, Training Loss: 0.01419
Epoch: 9968, Training Loss: 0.01760
Epoch: 9968, Training Loss: 0.01348
Epoch: 9969, Training Loss: 0.01540
Epoch: 9969, Training Loss: 0.01419
Epoch: 9969, Training Loss: 0.01760
Epoch: 9969, Training Loss: 0.01348
Epoch: 9970, Training Loss: 0.01540
Epoch: 9970, Training Loss: 0.01419
Epoch: 9970, Training Loss: 0.01760
Epoch: 9970, Training Loss: 0.01348
Epoch: 9971, Training Loss: 0.01540
Epoch: 9971, Training Loss: 0.01419
Epoch: 9971, Training Loss: 0.01760
Epoch: 9971, Training Loss: 0.01348
Epoch: 9972, Training Loss: 0.01540
Epoch: 9972, Training Loss: 0.01419
Epoch: 9972, Training Loss: 0.01759
Epoch: 9972, Training Loss: 0.01348
Epoch: 9973, Training Loss: 0.01540
Epoch: 9973, Training Loss: 0.01419
Epoch: 9973, Training Loss: 0.01759
Epoch: 9973, Training Loss: 0.01348
Epoch: 9974, Training Loss: 0.01540
Epoch: 9974, Training Loss: 0.01419
Epoch: 9974, Training Loss: 0.01759
Epoch: 9974, Training Loss: 0.01348
Epoch: 9975, Training Loss: 0.01540
Epoch: 9975, Training Loss: 0.01418
Epoch: 9975, Training Loss: 0.01759
Epoch: 9975, Training Loss: 0.01348
Epoch: 9976, Training Loss: 0.01540
Epoch: 9976, Training Loss: 0.01418
Epoch: 9976, Training Loss: 0.01759
Epoch: 9976, Training Loss: 0.01348
Epoch: 9977, Training Loss: 0.01539
Epoch: 9977, Training Loss: 0.01418
Epoch: 9977, Training Loss: 0.01759
Epoch: 9977, Training Loss: 0.01347
Epoch: 9978, Training Loss: 0.01539
Epoch: 9978, Training Loss: 0.01418
Epoch: 9978, Training Loss: 0.01759
Epoch: 9978, Training Loss: 0.01347
Epoch: 9979, Training Loss: 0.01539
Epoch: 9979, Training Loss: 0.01418
Epoch: 9979, Training Loss: 0.01759
Epoch: 9979, Training Loss: 0.01347
Epoch: 9980, Training Loss: 0.01539
Epoch: 9980, Training Loss: 0.01418
Epoch: 9980, Training Loss: 0.01759
Epoch: 9980, Training Loss: 0.01347
Epoch: 9981, Training Loss: 0.01539
Epoch: 9981, Training Loss: 0.01418
Epoch: 9981, Training Loss: 0.01759
Epoch: 9981, Training Loss: 0.01347
Epoch: 9982, Training Loss: 0.01539
Epoch: 9982, Training Loss: 0.01418
Epoch: 9982, Training Loss: 0.01758
Epoch: 9982, Training Loss: 0.01347
Epoch: 9983, Training Loss: 0.01539
Epoch: 9983, Training Loss: 0.01418
Epoch: 9983, Training Loss: 0.01758
Epoch: 9983, Training Loss: 0.01347
Epoch: 9984, Training Loss: 0.01539
Epoch: 9984, Training Loss: 0.01418
Epoch: 9984, Training Loss: 0.01758
Epoch: 9984, Training Loss: 0.01347
Epoch: 9985, Training Loss: 0.01539
Epoch: 9985, Training Loss: 0.01418
Epoch: 9985, Training Loss: 0.01758
Epoch: 9985, Training Loss: 0.01347
Epoch: 9986, Training Loss: 0.01539
Epoch: 9986, Training Loss: 0.01418
Epoch: 9986, Training Loss: 0.01758
Epoch: 9986, Training Loss: 0.01347
Epoch: 9987, Training Loss: 0.01539
Epoch: 9987, Training Loss: 0.01418
Epoch: 9987, Training Loss: 0.01758
[... several thousand lines of per-epoch output elided. Between epoch 9987 and epoch 10231 the four per-pattern losses decrease only marginally — roughly 0.0154 → 0.0152, 0.0142 → 0.0140, 0.0176 → 0.0173, and 0.0135 → 0.0133 — all still above the 0.008 target, showing that training with this configuration converges very slowly at this stage. ...]
Epoch: 10231, Training Loss: 0.01329
Epoch: 10232, Training Loss: 0.01518
Epoch: 10232, Training Loss: 0.01399
Epoch: 10232, Training Loss: 0.01734
Epoch: 10232, Training Loss: 0.01329
Epoch: 10233, Training Loss: 0.01518
Epoch: 10233, Training Loss: 0.01399
Epoch: 10233, Training Loss: 0.01734
Epoch: 10233, Training Loss: 0.01329
Epoch: 10234, Training Loss: 0.01517
Epoch: 10234, Training Loss: 0.01399
Epoch: 10234, Training Loss: 0.01734
Epoch: 10234, Training Loss: 0.01329
Epoch: 10235, Training Loss: 0.01517
Epoch: 10235, Training Loss: 0.01399
Epoch: 10235, Training Loss: 0.01734
Epoch: 10235, Training Loss: 0.01328
Epoch: 10236, Training Loss: 0.01517
Epoch: 10236, Training Loss: 0.01398
Epoch: 10236, Training Loss: 0.01733
Epoch: 10236, Training Loss: 0.01328
Epoch: 10237, Training Loss: 0.01517
Epoch: 10237, Training Loss: 0.01398
Epoch: 10237, Training Loss: 0.01733
Epoch: 10237, Training Loss: 0.01328
Epoch: 10238, Training Loss: 0.01517
Epoch: 10238, Training Loss: 0.01398
Epoch: 10238, Training Loss: 0.01733
Epoch: 10238, Training Loss: 0.01328
Epoch: 10239, Training Loss: 0.01517
Epoch: 10239, Training Loss: 0.01398
Epoch: 10239, Training Loss: 0.01733
Epoch: 10239, Training Loss: 0.01328
Epoch: 10240, Training Loss: 0.01517
Epoch: 10240, Training Loss: 0.01398
Epoch: 10240, Training Loss: 0.01733
Epoch: 10240, Training Loss: 0.01328
Epoch: 10241, Training Loss: 0.01517
Epoch: 10241, Training Loss: 0.01398
Epoch: 10241, Training Loss: 0.01733
Epoch: 10241, Training Loss: 0.01328
Epoch: 10242, Training Loss: 0.01517
Epoch: 10242, Training Loss: 0.01398
Epoch: 10242, Training Loss: 0.01733
Epoch: 10242, Training Loss: 0.01328
Epoch: 10243, Training Loss: 0.01517
Epoch: 10243, Training Loss: 0.01398
Epoch: 10243, Training Loss: 0.01733
Epoch: 10243, Training Loss: 0.01328
Epoch: 10244, Training Loss: 0.01517
Epoch: 10244, Training Loss: 0.01398
Epoch: 10244, Training Loss: 0.01733
Epoch: 10244, Training Loss: 0.01328
Epoch: 10245, Training Loss: 0.01517
Epoch: 10245, Training Loss: 0.01398
Epoch: 10245, Training Loss: 0.01733
Epoch: 10245, Training Loss: 0.01328
Epoch: 10246, Training Loss: 0.01516
Epoch: 10246, Training Loss: 0.01398
Epoch: 10246, Training Loss: 0.01732
Epoch: 10246, Training Loss: 0.01328
Epoch: 10247, Training Loss: 0.01516
Epoch: 10247, Training Loss: 0.01398
Epoch: 10247, Training Loss: 0.01732
Epoch: 10247, Training Loss: 0.01328
Epoch: 10248, Training Loss: 0.01516
Epoch: 10248, Training Loss: 0.01398
Epoch: 10248, Training Loss: 0.01732
Epoch: 10248, Training Loss: 0.01328
Epoch: 10249, Training Loss: 0.01516
Epoch: 10249, Training Loss: 0.01398
Epoch: 10249, Training Loss: 0.01732
Epoch: 10249, Training Loss: 0.01327
Epoch: 10250, Training Loss: 0.01516
Epoch: 10250, Training Loss: 0.01397
Epoch: 10250, Training Loss: 0.01732
Epoch: 10250, Training Loss: 0.01327
Epoch: 10251, Training Loss: 0.01516
Epoch: 10251, Training Loss: 0.01397
Epoch: 10251, Training Loss: 0.01732
Epoch: 10251, Training Loss: 0.01327
Epoch: 10252, Training Loss: 0.01516
Epoch: 10252, Training Loss: 0.01397
Epoch: 10252, Training Loss: 0.01732
Epoch: 10252, Training Loss: 0.01327
Epoch: 10253, Training Loss: 0.01516
Epoch: 10253, Training Loss: 0.01397
Epoch: 10253, Training Loss: 0.01732
Epoch: 10253, Training Loss: 0.01327
Epoch: 10254, Training Loss: 0.01516
Epoch: 10254, Training Loss: 0.01397
Epoch: 10254, Training Loss: 0.01732
Epoch: 10254, Training Loss: 0.01327
Epoch: 10255, Training Loss: 0.01516
Epoch: 10255, Training Loss: 0.01397
Epoch: 10255, Training Loss: 0.01732
Epoch: 10255, Training Loss: 0.01327
Epoch: 10256, Training Loss: 0.01516
Epoch: 10256, Training Loss: 0.01397
Epoch: 10256, Training Loss: 0.01732
Epoch: 10256, Training Loss: 0.01327
Epoch: 10257, Training Loss: 0.01516
Epoch: 10257, Training Loss: 0.01397
Epoch: 10257, Training Loss: 0.01731
Epoch: 10257, Training Loss: 0.01327
Epoch: 10258, Training Loss: 0.01515
Epoch: 10258, Training Loss: 0.01397
Epoch: 10258, Training Loss: 0.01731
Epoch: 10258, Training Loss: 0.01327
Epoch: 10259, Training Loss: 0.01515
Epoch: 10259, Training Loss: 0.01397
Epoch: 10259, Training Loss: 0.01731
Epoch: 10259, Training Loss: 0.01327
Epoch: 10260, Training Loss: 0.01515
Epoch: 10260, Training Loss: 0.01397
Epoch: 10260, Training Loss: 0.01731
Epoch: 10260, Training Loss: 0.01327
Epoch: 10261, Training Loss: 0.01515
Epoch: 10261, Training Loss: 0.01397
Epoch: 10261, Training Loss: 0.01731
Epoch: 10261, Training Loss: 0.01327
Epoch: 10262, Training Loss: 0.01515
Epoch: 10262, Training Loss: 0.01397
Epoch: 10262, Training Loss: 0.01731
Epoch: 10262, Training Loss: 0.01327
Epoch: 10263, Training Loss: 0.01515
Epoch: 10263, Training Loss: 0.01396
Epoch: 10263, Training Loss: 0.01731
Epoch: 10263, Training Loss: 0.01326
Epoch: 10264, Training Loss: 0.01515
Epoch: 10264, Training Loss: 0.01396
Epoch: 10264, Training Loss: 0.01731
Epoch: 10264, Training Loss: 0.01326
Epoch: 10265, Training Loss: 0.01515
Epoch: 10265, Training Loss: 0.01396
Epoch: 10265, Training Loss: 0.01731
Epoch: 10265, Training Loss: 0.01326
Epoch: 10266, Training Loss: 0.01515
Epoch: 10266, Training Loss: 0.01396
Epoch: 10266, Training Loss: 0.01731
Epoch: 10266, Training Loss: 0.01326
Epoch: 10267, Training Loss: 0.01515
Epoch: 10267, Training Loss: 0.01396
Epoch: 10267, Training Loss: 0.01730
Epoch: 10267, Training Loss: 0.01326
Epoch: 10268, Training Loss: 0.01515
Epoch: 10268, Training Loss: 0.01396
Epoch: 10268, Training Loss: 0.01730
Epoch: 10268, Training Loss: 0.01326
Epoch: 10269, Training Loss: 0.01515
Epoch: 10269, Training Loss: 0.01396
Epoch: 10269, Training Loss: 0.01730
Epoch: 10269, Training Loss: 0.01326
Epoch: 10270, Training Loss: 0.01514
Epoch: 10270, Training Loss: 0.01396
Epoch: 10270, Training Loss: 0.01730
Epoch: 10270, Training Loss: 0.01326
Epoch: 10271, Training Loss: 0.01514
Epoch: 10271, Training Loss: 0.01396
Epoch: 10271, Training Loss: 0.01730
Epoch: 10271, Training Loss: 0.01326
Epoch: 10272, Training Loss: 0.01514
Epoch: 10272, Training Loss: 0.01396
Epoch: 10272, Training Loss: 0.01730
Epoch: 10272, Training Loss: 0.01326
Epoch: 10273, Training Loss: 0.01514
Epoch: 10273, Training Loss: 0.01396
Epoch: 10273, Training Loss: 0.01730
Epoch: 10273, Training Loss: 0.01326
Epoch: 10274, Training Loss: 0.01514
Epoch: 10274, Training Loss: 0.01396
Epoch: 10274, Training Loss: 0.01730
Epoch: 10274, Training Loss: 0.01326
Epoch: 10275, Training Loss: 0.01514
Epoch: 10275, Training Loss: 0.01396
Epoch: 10275, Training Loss: 0.01730
Epoch: 10275, Training Loss: 0.01326
Epoch: 10276, Training Loss: 0.01514
Epoch: 10276, Training Loss: 0.01395
Epoch: 10276, Training Loss: 0.01730
Epoch: 10276, Training Loss: 0.01326
Epoch: 10277, Training Loss: 0.01514
Epoch: 10277, Training Loss: 0.01395
Epoch: 10277, Training Loss: 0.01729
Epoch: 10277, Training Loss: 0.01325
Epoch: 10278, Training Loss: 0.01514
Epoch: 10278, Training Loss: 0.01395
Epoch: 10278, Training Loss: 0.01729
Epoch: 10278, Training Loss: 0.01325
Epoch: 10279, Training Loss: 0.01514
Epoch: 10279, Training Loss: 0.01395
Epoch: 10279, Training Loss: 0.01729
Epoch: 10279, Training Loss: 0.01325
Epoch: 10280, Training Loss: 0.01514
Epoch: 10280, Training Loss: 0.01395
Epoch: 10280, Training Loss: 0.01729
Epoch: 10280, Training Loss: 0.01325
Epoch: 10281, Training Loss: 0.01514
Epoch: 10281, Training Loss: 0.01395
Epoch: 10281, Training Loss: 0.01729
Epoch: 10281, Training Loss: 0.01325
Epoch: 10282, Training Loss: 0.01513
Epoch: 10282, Training Loss: 0.01395
Epoch: 10282, Training Loss: 0.01729
Epoch: 10282, Training Loss: 0.01325
Epoch: 10283, Training Loss: 0.01513
Epoch: 10283, Training Loss: 0.01395
Epoch: 10283, Training Loss: 0.01729
Epoch: 10283, Training Loss: 0.01325
Epoch: 10284, Training Loss: 0.01513
Epoch: 10284, Training Loss: 0.01395
Epoch: 10284, Training Loss: 0.01729
Epoch: 10284, Training Loss: 0.01325
Epoch: 10285, Training Loss: 0.01513
Epoch: 10285, Training Loss: 0.01395
Epoch: 10285, Training Loss: 0.01729
Epoch: 10285, Training Loss: 0.01325
Epoch: 10286, Training Loss: 0.01513
Epoch: 10286, Training Loss: 0.01395
Epoch: 10286, Training Loss: 0.01729
Epoch: 10286, Training Loss: 0.01325
Epoch: 10287, Training Loss: 0.01513
Epoch: 10287, Training Loss: 0.01395
Epoch: 10287, Training Loss: 0.01729
Epoch: 10287, Training Loss: 0.01325
Epoch: 10288, Training Loss: 0.01513
Epoch: 10288, Training Loss: 0.01395
Epoch: 10288, Training Loss: 0.01728
Epoch: 10288, Training Loss: 0.01325
Epoch: 10289, Training Loss: 0.01513
Epoch: 10289, Training Loss: 0.01395
Epoch: 10289, Training Loss: 0.01728
Epoch: 10289, Training Loss: 0.01325
Epoch: 10290, Training Loss: 0.01513
Epoch: 10290, Training Loss: 0.01394
Epoch: 10290, Training Loss: 0.01728
Epoch: 10290, Training Loss: 0.01325
Epoch: 10291, Training Loss: 0.01513
Epoch: 10291, Training Loss: 0.01394
Epoch: 10291, Training Loss: 0.01728
Epoch: 10291, Training Loss: 0.01324
Epoch: 10292, Training Loss: 0.01513
Epoch: 10292, Training Loss: 0.01394
Epoch: 10292, Training Loss: 0.01728
Epoch: 10292, Training Loss: 0.01324
Epoch: 10293, Training Loss: 0.01513
Epoch: 10293, Training Loss: 0.01394
Epoch: 10293, Training Loss: 0.01728
Epoch: 10293, Training Loss: 0.01324
Epoch: 10294, Training Loss: 0.01512
Epoch: 10294, Training Loss: 0.01394
Epoch: 10294, Training Loss: 0.01728
Epoch: 10294, Training Loss: 0.01324
Epoch: 10295, Training Loss: 0.01512
Epoch: 10295, Training Loss: 0.01394
Epoch: 10295, Training Loss: 0.01728
Epoch: 10295, Training Loss: 0.01324
Epoch: 10296, Training Loss: 0.01512
Epoch: 10296, Training Loss: 0.01394
Epoch: 10296, Training Loss: 0.01728
Epoch: 10296, Training Loss: 0.01324
Epoch: 10297, Training Loss: 0.01512
Epoch: 10297, Training Loss: 0.01394
Epoch: 10297, Training Loss: 0.01728
Epoch: 10297, Training Loss: 0.01324
Epoch: 10298, Training Loss: 0.01512
Epoch: 10298, Training Loss: 0.01394
Epoch: 10298, Training Loss: 0.01727
Epoch: 10298, Training Loss: 0.01324
Epoch: 10299, Training Loss: 0.01512
Epoch: 10299, Training Loss: 0.01394
Epoch: 10299, Training Loss: 0.01727
Epoch: 10299, Training Loss: 0.01324
Epoch: 10300, Training Loss: 0.01512
Epoch: 10300, Training Loss: 0.01394
Epoch: 10300, Training Loss: 0.01727
Epoch: 10300, Training Loss: 0.01324
Epoch: 10301, Training Loss: 0.01512
Epoch: 10301, Training Loss: 0.01394
Epoch: 10301, Training Loss: 0.01727
Epoch: 10301, Training Loss: 0.01324
Epoch: 10302, Training Loss: 0.01512
Epoch: 10302, Training Loss: 0.01394
Epoch: 10302, Training Loss: 0.01727
Epoch: 10302, Training Loss: 0.01324
Epoch: 10303, Training Loss: 0.01512
Epoch: 10303, Training Loss: 0.01393
Epoch: 10303, Training Loss: 0.01727
Epoch: 10303, Training Loss: 0.01324
Epoch: 10304, Training Loss: 0.01512
Epoch: 10304, Training Loss: 0.01393
Epoch: 10304, Training Loss: 0.01727
Epoch: 10304, Training Loss: 0.01324
Epoch: 10305, Training Loss: 0.01512
Epoch: 10305, Training Loss: 0.01393
Epoch: 10305, Training Loss: 0.01727
Epoch: 10305, Training Loss: 0.01323
Epoch: 10306, Training Loss: 0.01511
Epoch: 10306, Training Loss: 0.01393
Epoch: 10306, Training Loss: 0.01727
Epoch: 10306, Training Loss: 0.01323
Epoch: 10307, Training Loss: 0.01511
Epoch: 10307, Training Loss: 0.01393
Epoch: 10307, Training Loss: 0.01727
Epoch: 10307, Training Loss: 0.01323
Epoch: 10308, Training Loss: 0.01511
Epoch: 10308, Training Loss: 0.01393
Epoch: 10308, Training Loss: 0.01727
Epoch: 10308, Training Loss: 0.01323
Epoch: 10309, Training Loss: 0.01511
Epoch: 10309, Training Loss: 0.01393
Epoch: 10309, Training Loss: 0.01726
Epoch: 10309, Training Loss: 0.01323
Epoch: 10310, Training Loss: 0.01511
Epoch: 10310, Training Loss: 0.01393
Epoch: 10310, Training Loss: 0.01726
Epoch: 10310, Training Loss: 0.01323
Epoch: 10311, Training Loss: 0.01511
Epoch: 10311, Training Loss: 0.01393
Epoch: 10311, Training Loss: 0.01726
Epoch: 10311, Training Loss: 0.01323
Epoch: 10312, Training Loss: 0.01511
Epoch: 10312, Training Loss: 0.01393
Epoch: 10312, Training Loss: 0.01726
Epoch: 10312, Training Loss: 0.01323
Epoch: 10313, Training Loss: 0.01511
Epoch: 10313, Training Loss: 0.01393
Epoch: 10313, Training Loss: 0.01726
Epoch: 10313, Training Loss: 0.01323
Epoch: 10314, Training Loss: 0.01511
Epoch: 10314, Training Loss: 0.01393
Epoch: 10314, Training Loss: 0.01726
Epoch: 10314, Training Loss: 0.01323
Epoch: 10315, Training Loss: 0.01511
Epoch: 10315, Training Loss: 0.01393
Epoch: 10315, Training Loss: 0.01726
Epoch: 10315, Training Loss: 0.01323
Epoch: 10316, Training Loss: 0.01511
Epoch: 10316, Training Loss: 0.01393
Epoch: 10316, Training Loss: 0.01726
Epoch: 10316, Training Loss: 0.01323
Epoch: 10317, Training Loss: 0.01511
Epoch: 10317, Training Loss: 0.01392
Epoch: 10317, Training Loss: 0.01726
Epoch: 10317, Training Loss: 0.01323
Epoch: 10318, Training Loss: 0.01510
Epoch: 10318, Training Loss: 0.01392
Epoch: 10318, Training Loss: 0.01726
Epoch: 10318, Training Loss: 0.01323
Epoch: 10319, Training Loss: 0.01510
Epoch: 10319, Training Loss: 0.01392
Epoch: 10319, Training Loss: 0.01725
Epoch: 10319, Training Loss: 0.01322
Epoch: 10320, Training Loss: 0.01510
Epoch: 10320, Training Loss: 0.01392
Epoch: 10320, Training Loss: 0.01725
Epoch: 10320, Training Loss: 0.01322
Epoch: 10321, Training Loss: 0.01510
Epoch: 10321, Training Loss: 0.01392
Epoch: 10321, Training Loss: 0.01725
Epoch: 10321, Training Loss: 0.01322
Epoch: 10322, Training Loss: 0.01510
Epoch: 10322, Training Loss: 0.01392
Epoch: 10322, Training Loss: 0.01725
Epoch: 10322, Training Loss: 0.01322
Epoch: 10323, Training Loss: 0.01510
Epoch: 10323, Training Loss: 0.01392
Epoch: 10323, Training Loss: 0.01725
Epoch: 10323, Training Loss: 0.01322
Epoch: 10324, Training Loss: 0.01510
Epoch: 10324, Training Loss: 0.01392
Epoch: 10324, Training Loss: 0.01725
Epoch: 10324, Training Loss: 0.01322
Epoch: 10325, Training Loss: 0.01510
Epoch: 10325, Training Loss: 0.01392
Epoch: 10325, Training Loss: 0.01725
Epoch: 10325, Training Loss: 0.01322
Epoch: 10326, Training Loss: 0.01510
Epoch: 10326, Training Loss: 0.01392
Epoch: 10326, Training Loss: 0.01725
Epoch: 10326, Training Loss: 0.01322
Epoch: 10327, Training Loss: 0.01510
Epoch: 10327, Training Loss: 0.01392
Epoch: 10327, Training Loss: 0.01725
Epoch: 10327, Training Loss: 0.01322
Epoch: 10328, Training Loss: 0.01510
Epoch: 10328, Training Loss: 0.01392
Epoch: 10328, Training Loss: 0.01725
Epoch: 10328, Training Loss: 0.01322
Epoch: 10329, Training Loss: 0.01510
Epoch: 10329, Training Loss: 0.01392
Epoch: 10329, Training Loss: 0.01725
Epoch: 10329, Training Loss: 0.01322
Epoch: 10330, Training Loss: 0.01509
Epoch: 10330, Training Loss: 0.01391
Epoch: 10330, Training Loss: 0.01724
Epoch: 10330, Training Loss: 0.01322
Epoch: 10331, Training Loss: 0.01509
Epoch: 10331, Training Loss: 0.01391
Epoch: 10331, Training Loss: 0.01724
Epoch: 10331, Training Loss: 0.01322
Epoch: 10332, Training Loss: 0.01509
Epoch: 10332, Training Loss: 0.01391
Epoch: 10332, Training Loss: 0.01724
Epoch: 10332, Training Loss: 0.01322
Epoch: 10333, Training Loss: 0.01509
Epoch: 10333, Training Loss: 0.01391
Epoch: 10333, Training Loss: 0.01724
Epoch: 10333, Training Loss: 0.01321
Epoch: 10334, Training Loss: 0.01509
Epoch: 10334, Training Loss: 0.01391
Epoch: 10334, Training Loss: 0.01724
Epoch: 10334, Training Loss: 0.01321
Epoch: 10335, Training Loss: 0.01509
Epoch: 10335, Training Loss: 0.01391
Epoch: 10335, Training Loss: 0.01724
Epoch: 10335, Training Loss: 0.01321
Epoch: 10336, Training Loss: 0.01509
Epoch: 10336, Training Loss: 0.01391
Epoch: 10336, Training Loss: 0.01724
Epoch: 10336, Training Loss: 0.01321
Epoch: 10337, Training Loss: 0.01509
Epoch: 10337, Training Loss: 0.01391
Epoch: 10337, Training Loss: 0.01724
Epoch: 10337, Training Loss: 0.01321
Epoch: 10338, Training Loss: 0.01509
Epoch: 10338, Training Loss: 0.01391
Epoch: 10338, Training Loss: 0.01724
Epoch: 10338, Training Loss: 0.01321
Epoch: 10339, Training Loss: 0.01509
Epoch: 10339, Training Loss: 0.01391
Epoch: 10339, Training Loss: 0.01724
Epoch: 10339, Training Loss: 0.01321
Epoch: 10340, Training Loss: 0.01509
Epoch: 10340, Training Loss: 0.01391
Epoch: 10340, Training Loss: 0.01723
Epoch: 10340, Training Loss: 0.01321
Epoch: 10341, Training Loss: 0.01509
Epoch: 10341, Training Loss: 0.01391
Epoch: 10341, Training Loss: 0.01723
Epoch: 10341, Training Loss: 0.01321
Epoch: 10342, Training Loss: 0.01508
Epoch: 10342, Training Loss: 0.01391
Epoch: 10342, Training Loss: 0.01723
Epoch: 10342, Training Loss: 0.01321
Epoch: 10343, Training Loss: 0.01508
Epoch: 10343, Training Loss: 0.01391
Epoch: 10343, Training Loss: 0.01723
Epoch: 10343, Training Loss: 0.01321
Epoch: 10344, Training Loss: 0.01508
Epoch: 10344, Training Loss: 0.01390
Epoch: 10344, Training Loss: 0.01723
Epoch: 10344, Training Loss: 0.01321
Epoch: 10345, Training Loss: 0.01508
Epoch: 10345, Training Loss: 0.01390
Epoch: 10345, Training Loss: 0.01723
Epoch: 10345, Training Loss: 0.01321
Epoch: 10346, Training Loss: 0.01508
Epoch: 10346, Training Loss: 0.01390
Epoch: 10346, Training Loss: 0.01723
Epoch: 10346, Training Loss: 0.01321
Epoch: 10347, Training Loss: 0.01508
Epoch: 10347, Training Loss: 0.01390
Epoch: 10347, Training Loss: 0.01723
Epoch: 10347, Training Loss: 0.01320
Epoch: 10348, Training Loss: 0.01508
Epoch: 10348, Training Loss: 0.01390
Epoch: 10348, Training Loss: 0.01723
Epoch: 10348, Training Loss: 0.01320
Epoch: 10349, Training Loss: 0.01508
Epoch: 10349, Training Loss: 0.01390
Epoch: 10349, Training Loss: 0.01723
Epoch: 10349, Training Loss: 0.01320
Epoch: 10350, Training Loss: 0.01508
Epoch: 10350, Training Loss: 0.01390
Epoch: 10350, Training Loss: 0.01723
Epoch: 10350, Training Loss: 0.01320
Epoch: 10351, Training Loss: 0.01508
Epoch: 10351, Training Loss: 0.01390
Epoch: 10351, Training Loss: 0.01722
Epoch: 10351, Training Loss: 0.01320
Epoch: 10352, Training Loss: 0.01508
Epoch: 10352, Training Loss: 0.01390
Epoch: 10352, Training Loss: 0.01722
Epoch: 10352, Training Loss: 0.01320
Epoch: 10353, Training Loss: 0.01508
Epoch: 10353, Training Loss: 0.01390
Epoch: 10353, Training Loss: 0.01722
Epoch: 10353, Training Loss: 0.01320
Epoch: 10354, Training Loss: 0.01507
Epoch: 10354, Training Loss: 0.01390
Epoch: 10354, Training Loss: 0.01722
Epoch: 10354, Training Loss: 0.01320
Epoch: 10355, Training Loss: 0.01507
Epoch: 10355, Training Loss: 0.01390
Epoch: 10355, Training Loss: 0.01722
Epoch: 10355, Training Loss: 0.01320
Epoch: 10356, Training Loss: 0.01507
Epoch: 10356, Training Loss: 0.01390
Epoch: 10356, Training Loss: 0.01722
Epoch: 10356, Training Loss: 0.01320
Epoch: 10357, Training Loss: 0.01507
Epoch: 10357, Training Loss: 0.01389
Epoch: 10357, Training Loss: 0.01722
Epoch: 10357, Training Loss: 0.01320
Epoch: 10358, Training Loss: 0.01507
Epoch: 10358, Training Loss: 0.01389
Epoch: 10358, Training Loss: 0.01722
Epoch: 10358, Training Loss: 0.01320
Epoch: 10359, Training Loss: 0.01507
Epoch: 10359, Training Loss: 0.01389
Epoch: 10359, Training Loss: 0.01722
Epoch: 10359, Training Loss: 0.01320
Epoch: 10360, Training Loss: 0.01507
Epoch: 10360, Training Loss: 0.01389
Epoch: 10360, Training Loss: 0.01722
Epoch: 10360, Training Loss: 0.01320
Epoch: 10361, Training Loss: 0.01507
Epoch: 10361, Training Loss: 0.01389
Epoch: 10361, Training Loss: 0.01722
Epoch: 10361, Training Loss: 0.01319
Epoch: 10362, Training Loss: 0.01507
Epoch: 10362, Training Loss: 0.01389
Epoch: 10362, Training Loss: 0.01721
Epoch: 10362, Training Loss: 0.01319
Epoch: 10363, Training Loss: 0.01507
Epoch: 10363, Training Loss: 0.01389
Epoch: 10363, Training Loss: 0.01721
Epoch: 10363, Training Loss: 0.01319
Epoch: 10364, Training Loss: 0.01507
Epoch: 10364, Training Loss: 0.01389
Epoch: 10364, Training Loss: 0.01721
Epoch: 10364, Training Loss: 0.01319
Epoch: 10365, Training Loss: 0.01507
Epoch: 10365, Training Loss: 0.01389
Epoch: 10365, Training Loss: 0.01721
Epoch: 10365, Training Loss: 0.01319
Epoch: 10366, Training Loss: 0.01506
Epoch: 10366, Training Loss: 0.01389
Epoch: 10366, Training Loss: 0.01721
Epoch: 10366, Training Loss: 0.01319
Epoch: 10367, Training Loss: 0.01506
Epoch: 10367, Training Loss: 0.01389
Epoch: 10367, Training Loss: 0.01721
Epoch: 10367, Training Loss: 0.01319
Epoch: 10368, Training Loss: 0.01506
Epoch: 10368, Training Loss: 0.01389
Epoch: 10368, Training Loss: 0.01721
Epoch: 10368, Training Loss: 0.01319
Epoch: 10369, Training Loss: 0.01506
Epoch: 10369, Training Loss: 0.01389
Epoch: 10369, Training Loss: 0.01721
Epoch: 10369, Training Loss: 0.01319
Epoch: 10370, Training Loss: 0.01506
Epoch: 10370, Training Loss: 0.01389
Epoch: 10370, Training Loss: 0.01721
Epoch: 10370, Training Loss: 0.01319
Epoch: 10371, Training Loss: 0.01506
Epoch: 10371, Training Loss: 0.01388
Epoch: 10371, Training Loss: 0.01721
Epoch: 10371, Training Loss: 0.01319
Epoch: 10372, Training Loss: 0.01506
Epoch: 10372, Training Loss: 0.01388
Epoch: 10372, Training Loss: 0.01720
Epoch: 10372, Training Loss: 0.01319
Epoch: 10373, Training Loss: 0.01506
Epoch: 10373, Training Loss: 0.01388
Epoch: 10373, Training Loss: 0.01720
Epoch: 10373, Training Loss: 0.01319
Epoch: 10374, Training Loss: 0.01506
Epoch: 10374, Training Loss: 0.01388
Epoch: 10374, Training Loss: 0.01720
Epoch: 10374, Training Loss: 0.01319
Epoch: 10375, Training Loss: 0.01506
Epoch: 10375, Training Loss: 0.01388
Epoch: 10375, Training Loss: 0.01720
Epoch: 10375, Training Loss: 0.01318
Epoch: 10376, Training Loss: 0.01506
Epoch: 10376, Training Loss: 0.01388
Epoch: 10376, Training Loss: 0.01720
Epoch: 10376, Training Loss: 0.01318
Epoch: 10377, Training Loss: 0.01506
Epoch: 10377, Training Loss: 0.01388
Epoch: 10377, Training Loss: 0.01720
Epoch: 10377, Training Loss: 0.01318
Epoch: 10378, Training Loss: 0.01505
Epoch: 10378, Training Loss: 0.01388
Epoch: 10378, Training Loss: 0.01720
Epoch: 10378, Training Loss: 0.01318
Epoch: 10379, Training Loss: 0.01505
Epoch: 10379, Training Loss: 0.01388
Epoch: 10379, Training Loss: 0.01720
Epoch: 10379, Training Loss: 0.01318
Epoch: 10380, Training Loss: 0.01505
Epoch: 10380, Training Loss: 0.01388
Epoch: 10380, Training Loss: 0.01720
Epoch: 10380, Training Loss: 0.01318
Epoch: 10381, Training Loss: 0.01505
Epoch: 10381, Training Loss: 0.01388
Epoch: 10381, Training Loss: 0.01720
Epoch: 10381, Training Loss: 0.01318
Epoch: 10382, Training Loss: 0.01505
Epoch: 10382, Training Loss: 0.01388
Epoch: 10382, Training Loss: 0.01720
Epoch: 10382, Training Loss: 0.01318
Epoch: 10383, Training Loss: 0.01505
Epoch: 10383, Training Loss: 0.01388
Epoch: 10383, Training Loss: 0.01719
Epoch: 10383, Training Loss: 0.01318
Epoch: 10384, Training Loss: 0.01505
Epoch: 10384, Training Loss: 0.01388
Epoch: 10384, Training Loss: 0.01719
Epoch: 10384, Training Loss: 0.01318
Epoch: 10385, Training Loss: 0.01505
Epoch: 10385, Training Loss: 0.01387
Epoch: 10385, Training Loss: 0.01719
Epoch: 10385, Training Loss: 0.01318
Epoch: 10386, Training Loss: 0.01505
Epoch: 10386, Training Loss: 0.01387
Epoch: 10386, Training Loss: 0.01719
Epoch: 10386, Training Loss: 0.01318
Epoch: 10387, Training Loss: 0.01505
Epoch: 10387, Training Loss: 0.01387
Epoch: 10387, Training Loss: 0.01719
Epoch: 10387, Training Loss: 0.01318
Epoch: 10388, Training Loss: 0.01505
Epoch: 10388, Training Loss: 0.01387
Epoch: 10388, Training Loss: 0.01719
Epoch: 10388, Training Loss: 0.01318
Epoch: 10389, Training Loss: 0.01505
Epoch: 10389, Training Loss: 0.01387
Epoch: 10389, Training Loss: 0.01719
Epoch: 10389, Training Loss: 0.01317
Epoch: 10390, Training Loss: 0.01505
Epoch: 10390, Training Loss: 0.01387
Epoch: 10390, Training Loss: 0.01719
Epoch: 10390, Training Loss: 0.01317
Epoch: 10391, Training Loss: 0.01504
Epoch: 10391, Training Loss: 0.01387
Epoch: 10391, Training Loss: 0.01719
Epoch: 10391, Training Loss: 0.01317
Epoch: 10392, Training Loss: 0.01504
Epoch: 10392, Training Loss: 0.01387
Epoch: 10392, Training Loss: 0.01719
Epoch: 10392, Training Loss: 0.01317
Epoch: 10393, Training Loss: 0.01504
Epoch: 10393, Training Loss: 0.01387
Epoch: 10393, Training Loss: 0.01718
Epoch: 10393, Training Loss: 0.01317
Epoch: 10394, Training Loss: 0.01504
Epoch: 10394, Training Loss: 0.01387
Epoch: 10394, Training Loss: 0.01718
Epoch: 10394, Training Loss: 0.01317
Epoch: 10395, Training Loss: 0.01504
Epoch: 10395, Training Loss: 0.01387
Epoch: 10395, Training Loss: 0.01718
Epoch: 10395, Training Loss: 0.01317
Epoch: 10396, Training Loss: 0.01504
Epoch: 10396, Training Loss: 0.01387
Epoch: 10396, Training Loss: 0.01718
Epoch: 10396, Training Loss: 0.01317
Epoch: 10397, Training Loss: 0.01504
Epoch: 10397, Training Loss: 0.01387
Epoch: 10397, Training Loss: 0.01718
Epoch: 10397, Training Loss: 0.01317
Epoch: 10398, Training Loss: 0.01504
Epoch: 10398, Training Loss: 0.01386
Epoch: 10398, Training Loss: 0.01718
Epoch: 10398, Training Loss: 0.01317
Epoch: 10399, Training Loss: 0.01504
Epoch: 10399, Training Loss: 0.01386
Epoch: 10399, Training Loss: 0.01718
Epoch: 10399, Training Loss: 0.01317
Epoch: 10400, Training Loss: 0.01504
Epoch: 10400, Training Loss: 0.01386
Epoch: 10400, Training Loss: 0.01718
Epoch: 10400, Training Loss: 0.01317
Epoch: 10401, Training Loss: 0.01504
Epoch: 10401, Training Loss: 0.01386
Epoch: 10401, Training Loss: 0.01718
Epoch: 10401, Training Loss: 0.01317
Epoch: 10402, Training Loss: 0.01504
Epoch: 10402, Training Loss: 0.01386
Epoch: 10402, Training Loss: 0.01718
Epoch: 10402, Training Loss: 0.01317
Epoch: 10403, Training Loss: 0.01503
Epoch: 10403, Training Loss: 0.01386
Epoch: 10403, Training Loss: 0.01718
Epoch: 10403, Training Loss: 0.01317
Epoch: 10404, Training Loss: 0.01503
Epoch: 10404, Training Loss: 0.01386
Epoch: 10404, Training Loss: 0.01717
Epoch: 10404, Training Loss: 0.01316
Epoch: 10405, Training Loss: 0.01503
Epoch: 10405, Training Loss: 0.01386
Epoch: 10405, Training Loss: 0.01717
Epoch: 10405, Training Loss: 0.01316
Epoch: 10406, Training Loss: 0.01503
Epoch: 10406, Training Loss: 0.01386
Epoch: 10406, Training Loss: 0.01717
Epoch: 10406, Training Loss: 0.01316
Epoch: 10407, Training Loss: 0.01503
Epoch: 10407, Training Loss: 0.01386
Epoch: 10407, Training Loss: 0.01717
Epoch: 10407, Training Loss: 0.01316
Epoch: 10408, Training Loss: 0.01503
Epoch: 10408, Training Loss: 0.01386
Epoch: 10408, Training Loss: 0.01717
Epoch: 10408, Training Loss: 0.01316
Epoch: 10409, Training Loss: 0.01503
Epoch: 10409, Training Loss: 0.01386
Epoch: 10409, Training Loss: 0.01717
Epoch: 10409, Training Loss: 0.01316
Epoch: 10410, Training Loss: 0.01503
Epoch: 10410, Training Loss: 0.01386
Epoch: 10410, Training Loss: 0.01717
Epoch: 10410, Training Loss: 0.01316
Epoch: 10411, Training Loss: 0.01503
Epoch: 10411, Training Loss: 0.01386
Epoch: 10411, Training Loss: 0.01717
Epoch: 10411, Training Loss: 0.01316
Epoch: 10412, Training Loss: 0.01503
Epoch: 10412, Training Loss: 0.01385
Epoch: 10412, Training Loss: 0.01717
Epoch: 10412, Training Loss: 0.01316
Epoch: 10413, Training Loss: 0.01503
Epoch: 10413, Training Loss: 0.01385
Epoch: 10413, Training Loss: 0.01717
Epoch: 10413, Training Loss: 0.01316
Epoch: 10414, Training Loss: 0.01503
Epoch: 10414, Training Loss: 0.01385
Epoch: 10414, Training Loss: 0.01717
Epoch: 10414, Training Loss: 0.01316
Epoch: 10415, Training Loss: 0.01502
Epoch: 10415, Training Loss: 0.01385
Epoch: 10415, Training Loss: 0.01716
Epoch: 10415, Training Loss: 0.01316
Epoch: 10416, Training Loss: 0.01502
Epoch: 10416, Training Loss: 0.01385
Epoch: 10416, Training Loss: 0.01716
Epoch: 10416, Training Loss: 0.01316
Epoch: 10417, Training Loss: 0.01502
Epoch: 10417, Training Loss: 0.01385
Epoch: 10417, Training Loss: 0.01716
Epoch: 10417, Training Loss: 0.01316
Epoch: 10418, Training Loss: 0.01502
Epoch: 10418, Training Loss: 0.01385
Epoch: 10418, Training Loss: 0.01716
Epoch: 10418, Training Loss: 0.01315
Epoch: 10419, Training Loss: 0.01502
Epoch: 10419, Training Loss: 0.01385
Epoch: 10419, Training Loss: 0.01716
Epoch: 10419, Training Loss: 0.01315
Epoch: 10420, Training Loss: 0.01502
Epoch: 10420, Training Loss: 0.01385
Epoch: 10420, Training Loss: 0.01502
Epoch: 10420, Training Loss: 0.01385
Epoch: 10420, Training Loss: 0.01716
Epoch: 10420, Training Loss: 0.01315
... (output truncated: the four per-pattern losses decrease only gradually over the next ~240 epochs, from about 0.01502 / 0.01385 / 0.01716 / 0.01315 at epoch 10420 to about 0.01483 / 0.01368 / 0.01694 / 0.01299 at epoch 10663) ...
Epoch: 10664, Training Loss: 0.01483
Epoch: 10664, Training Loss: 0.01367
Epoch: 10664, Training Loss: 0.01694
Epoch: 10664, Training Loss: 0.01299
Epoch: 10665, Training Loss: 0.01482
Epoch: 10665, Training Loss: 0.01367
Epoch: 10665, Training Loss: 0.01693
Epoch: 10665, Training Loss: 0.01299
Epoch: 10666, Training Loss: 0.01482
Epoch: 10666, Training Loss: 0.01367
Epoch: 10666, Training Loss: 0.01693
Epoch: 10666, Training Loss: 0.01298
Epoch: 10667, Training Loss: 0.01482
Epoch: 10667, Training Loss: 0.01367
Epoch: 10667, Training Loss: 0.01693
Epoch: 10667, Training Loss: 0.01298
Epoch: 10668, Training Loss: 0.01482
Epoch: 10668, Training Loss: 0.01367
Epoch: 10668, Training Loss: 0.01693
Epoch: 10668, Training Loss: 0.01298
Epoch: 10669, Training Loss: 0.01482
Epoch: 10669, Training Loss: 0.01367
Epoch: 10669, Training Loss: 0.01693
Epoch: 10669, Training Loss: 0.01298
Epoch: 10670, Training Loss: 0.01482
Epoch: 10670, Training Loss: 0.01367
Epoch: 10670, Training Loss: 0.01693
Epoch: 10670, Training Loss: 0.01298
Epoch: 10671, Training Loss: 0.01482
Epoch: 10671, Training Loss: 0.01367
Epoch: 10671, Training Loss: 0.01693
Epoch: 10671, Training Loss: 0.01298
Epoch: 10672, Training Loss: 0.01482
Epoch: 10672, Training Loss: 0.01367
Epoch: 10672, Training Loss: 0.01693
Epoch: 10672, Training Loss: 0.01298
Epoch: 10673, Training Loss: 0.01482
Epoch: 10673, Training Loss: 0.01367
Epoch: 10673, Training Loss: 0.01693
Epoch: 10673, Training Loss: 0.01298
Epoch: 10674, Training Loss: 0.01482
Epoch: 10674, Training Loss: 0.01367
Epoch: 10674, Training Loss: 0.01693
Epoch: 10674, Training Loss: 0.01298
Epoch: 10675, Training Loss: 0.01482
Epoch: 10675, Training Loss: 0.01367
Epoch: 10675, Training Loss: 0.01693
Epoch: 10675, Training Loss: 0.01298
Epoch: 10676, Training Loss: 0.01482
Epoch: 10676, Training Loss: 0.01367
Epoch: 10676, Training Loss: 0.01692
Epoch: 10676, Training Loss: 0.01298
Epoch: 10677, Training Loss: 0.01482
Epoch: 10677, Training Loss: 0.01366
Epoch: 10677, Training Loss: 0.01692
Epoch: 10677, Training Loss: 0.01298
Epoch: 10678, Training Loss: 0.01481
Epoch: 10678, Training Loss: 0.01366
Epoch: 10678, Training Loss: 0.01692
Epoch: 10678, Training Loss: 0.01298
Epoch: 10679, Training Loss: 0.01481
Epoch: 10679, Training Loss: 0.01366
Epoch: 10679, Training Loss: 0.01692
Epoch: 10679, Training Loss: 0.01298
Epoch: 10680, Training Loss: 0.01481
Epoch: 10680, Training Loss: 0.01366
Epoch: 10680, Training Loss: 0.01692
Epoch: 10680, Training Loss: 0.01297
Epoch: 10681, Training Loss: 0.01481
Epoch: 10681, Training Loss: 0.01366
Epoch: 10681, Training Loss: 0.01692
Epoch: 10681, Training Loss: 0.01297
Epoch: 10682, Training Loss: 0.01481
Epoch: 10682, Training Loss: 0.01366
Epoch: 10682, Training Loss: 0.01692
Epoch: 10682, Training Loss: 0.01297
Epoch: 10683, Training Loss: 0.01481
Epoch: 10683, Training Loss: 0.01366
Epoch: 10683, Training Loss: 0.01692
Epoch: 10683, Training Loss: 0.01297
Epoch: 10684, Training Loss: 0.01481
Epoch: 10684, Training Loss: 0.01366
Epoch: 10684, Training Loss: 0.01692
Epoch: 10684, Training Loss: 0.01297
Epoch: 10685, Training Loss: 0.01481
Epoch: 10685, Training Loss: 0.01366
Epoch: 10685, Training Loss: 0.01692
Epoch: 10685, Training Loss: 0.01297
Epoch: 10686, Training Loss: 0.01481
Epoch: 10686, Training Loss: 0.01366
Epoch: 10686, Training Loss: 0.01692
Epoch: 10686, Training Loss: 0.01297
Epoch: 10687, Training Loss: 0.01481
Epoch: 10687, Training Loss: 0.01366
Epoch: 10687, Training Loss: 0.01692
Epoch: 10687, Training Loss: 0.01297
Epoch: 10688, Training Loss: 0.01481
Epoch: 10688, Training Loss: 0.01366
Epoch: 10688, Training Loss: 0.01691
Epoch: 10688, Training Loss: 0.01297
Epoch: 10689, Training Loss: 0.01481
Epoch: 10689, Training Loss: 0.01366
Epoch: 10689, Training Loss: 0.01691
Epoch: 10689, Training Loss: 0.01297
Epoch: 10690, Training Loss: 0.01481
Epoch: 10690, Training Loss: 0.01366
Epoch: 10690, Training Loss: 0.01691
Epoch: 10690, Training Loss: 0.01297
Epoch: 10691, Training Loss: 0.01480
Epoch: 10691, Training Loss: 0.01366
Epoch: 10691, Training Loss: 0.01691
Epoch: 10691, Training Loss: 0.01297
Epoch: 10692, Training Loss: 0.01480
Epoch: 10692, Training Loss: 0.01365
Epoch: 10692, Training Loss: 0.01691
Epoch: 10692, Training Loss: 0.01297
Epoch: 10693, Training Loss: 0.01480
Epoch: 10693, Training Loss: 0.01365
Epoch: 10693, Training Loss: 0.01691
Epoch: 10693, Training Loss: 0.01297
Epoch: 10694, Training Loss: 0.01480
Epoch: 10694, Training Loss: 0.01365
Epoch: 10694, Training Loss: 0.01691
Epoch: 10694, Training Loss: 0.01297
Epoch: 10695, Training Loss: 0.01480
Epoch: 10695, Training Loss: 0.01365
Epoch: 10695, Training Loss: 0.01691
Epoch: 10695, Training Loss: 0.01296
Epoch: 10696, Training Loss: 0.01480
Epoch: 10696, Training Loss: 0.01365
Epoch: 10696, Training Loss: 0.01691
Epoch: 10696, Training Loss: 0.01296
Epoch: 10697, Training Loss: 0.01480
Epoch: 10697, Training Loss: 0.01365
Epoch: 10697, Training Loss: 0.01691
Epoch: 10697, Training Loss: 0.01296
Epoch: 10698, Training Loss: 0.01480
Epoch: 10698, Training Loss: 0.01365
Epoch: 10698, Training Loss: 0.01691
Epoch: 10698, Training Loss: 0.01296
Epoch: 10699, Training Loss: 0.01480
Epoch: 10699, Training Loss: 0.01365
Epoch: 10699, Training Loss: 0.01690
Epoch: 10699, Training Loss: 0.01296
Epoch: 10700, Training Loss: 0.01480
Epoch: 10700, Training Loss: 0.01365
Epoch: 10700, Training Loss: 0.01690
Epoch: 10700, Training Loss: 0.01296
Epoch: 10701, Training Loss: 0.01480
Epoch: 10701, Training Loss: 0.01365
Epoch: 10701, Training Loss: 0.01690
Epoch: 10701, Training Loss: 0.01296
Epoch: 10702, Training Loss: 0.01480
Epoch: 10702, Training Loss: 0.01365
Epoch: 10702, Training Loss: 0.01690
Epoch: 10702, Training Loss: 0.01296
Epoch: 10703, Training Loss: 0.01480
Epoch: 10703, Training Loss: 0.01365
Epoch: 10703, Training Loss: 0.01690
Epoch: 10703, Training Loss: 0.01296
Epoch: 10704, Training Loss: 0.01479
Epoch: 10704, Training Loss: 0.01365
Epoch: 10704, Training Loss: 0.01690
Epoch: 10704, Training Loss: 0.01296
Epoch: 10705, Training Loss: 0.01479
Epoch: 10705, Training Loss: 0.01365
Epoch: 10705, Training Loss: 0.01690
Epoch: 10705, Training Loss: 0.01296
Epoch: 10706, Training Loss: 0.01479
Epoch: 10706, Training Loss: 0.01364
Epoch: 10706, Training Loss: 0.01690
Epoch: 10706, Training Loss: 0.01296
Epoch: 10707, Training Loss: 0.01479
Epoch: 10707, Training Loss: 0.01364
Epoch: 10707, Training Loss: 0.01690
Epoch: 10707, Training Loss: 0.01296
Epoch: 10708, Training Loss: 0.01479
Epoch: 10708, Training Loss: 0.01364
Epoch: 10708, Training Loss: 0.01690
Epoch: 10708, Training Loss: 0.01296
Epoch: 10709, Training Loss: 0.01479
Epoch: 10709, Training Loss: 0.01364
Epoch: 10709, Training Loss: 0.01690
Epoch: 10709, Training Loss: 0.01296
Epoch: 10710, Training Loss: 0.01479
Epoch: 10710, Training Loss: 0.01364
Epoch: 10710, Training Loss: 0.01689
Epoch: 10710, Training Loss: 0.01295
Epoch: 10711, Training Loss: 0.01479
Epoch: 10711, Training Loss: 0.01364
Epoch: 10711, Training Loss: 0.01689
Epoch: 10711, Training Loss: 0.01295
Epoch: 10712, Training Loss: 0.01479
Epoch: 10712, Training Loss: 0.01364
Epoch: 10712, Training Loss: 0.01689
Epoch: 10712, Training Loss: 0.01295
Epoch: 10713, Training Loss: 0.01479
Epoch: 10713, Training Loss: 0.01364
Epoch: 10713, Training Loss: 0.01689
Epoch: 10713, Training Loss: 0.01295
Epoch: 10714, Training Loss: 0.01479
Epoch: 10714, Training Loss: 0.01364
Epoch: 10714, Training Loss: 0.01689
Epoch: 10714, Training Loss: 0.01295
Epoch: 10715, Training Loss: 0.01479
Epoch: 10715, Training Loss: 0.01364
Epoch: 10715, Training Loss: 0.01689
Epoch: 10715, Training Loss: 0.01295
Epoch: 10716, Training Loss: 0.01478
Epoch: 10716, Training Loss: 0.01364
Epoch: 10716, Training Loss: 0.01689
Epoch: 10716, Training Loss: 0.01295
Epoch: 10717, Training Loss: 0.01478
Epoch: 10717, Training Loss: 0.01364
Epoch: 10717, Training Loss: 0.01689
Epoch: 10717, Training Loss: 0.01295
Epoch: 10718, Training Loss: 0.01478
Epoch: 10718, Training Loss: 0.01364
Epoch: 10718, Training Loss: 0.01689
Epoch: 10718, Training Loss: 0.01295
Epoch: 10719, Training Loss: 0.01478
Epoch: 10719, Training Loss: 0.01364
Epoch: 10719, Training Loss: 0.01689
Epoch: 10719, Training Loss: 0.01295
Epoch: 10720, Training Loss: 0.01478
Epoch: 10720, Training Loss: 0.01363
Epoch: 10720, Training Loss: 0.01689
Epoch: 10720, Training Loss: 0.01295
Epoch: 10721, Training Loss: 0.01478
Epoch: 10721, Training Loss: 0.01363
Epoch: 10721, Training Loss: 0.01688
Epoch: 10721, Training Loss: 0.01295
Epoch: 10722, Training Loss: 0.01478
Epoch: 10722, Training Loss: 0.01363
Epoch: 10722, Training Loss: 0.01688
Epoch: 10722, Training Loss: 0.01295
Epoch: 10723, Training Loss: 0.01478
Epoch: 10723, Training Loss: 0.01363
Epoch: 10723, Training Loss: 0.01688
Epoch: 10723, Training Loss: 0.01295
Epoch: 10724, Training Loss: 0.01478
Epoch: 10724, Training Loss: 0.01363
Epoch: 10724, Training Loss: 0.01688
Epoch: 10724, Training Loss: 0.01295
Epoch: 10725, Training Loss: 0.01478
Epoch: 10725, Training Loss: 0.01363
Epoch: 10725, Training Loss: 0.01688
Epoch: 10725, Training Loss: 0.01294
Epoch: 10726, Training Loss: 0.01478
Epoch: 10726, Training Loss: 0.01363
Epoch: 10726, Training Loss: 0.01688
Epoch: 10726, Training Loss: 0.01294
Epoch: 10727, Training Loss: 0.01478
Epoch: 10727, Training Loss: 0.01363
Epoch: 10727, Training Loss: 0.01688
Epoch: 10727, Training Loss: 0.01294
Epoch: 10728, Training Loss: 0.01478
Epoch: 10728, Training Loss: 0.01363
Epoch: 10728, Training Loss: 0.01688
Epoch: 10728, Training Loss: 0.01294
Epoch: 10729, Training Loss: 0.01477
Epoch: 10729, Training Loss: 0.01363
Epoch: 10729, Training Loss: 0.01688
Epoch: 10729, Training Loss: 0.01294
Epoch: 10730, Training Loss: 0.01477
Epoch: 10730, Training Loss: 0.01363
Epoch: 10730, Training Loss: 0.01688
Epoch: 10730, Training Loss: 0.01294
Epoch: 10731, Training Loss: 0.01477
Epoch: 10731, Training Loss: 0.01363
Epoch: 10731, Training Loss: 0.01688
Epoch: 10731, Training Loss: 0.01294
Epoch: 10732, Training Loss: 0.01477
Epoch: 10732, Training Loss: 0.01363
Epoch: 10732, Training Loss: 0.01687
Epoch: 10732, Training Loss: 0.01294
Epoch: 10733, Training Loss: 0.01477
Epoch: 10733, Training Loss: 0.01363
Epoch: 10733, Training Loss: 0.01687
Epoch: 10733, Training Loss: 0.01294
Epoch: 10734, Training Loss: 0.01477
Epoch: 10734, Training Loss: 0.01363
Epoch: 10734, Training Loss: 0.01687
Epoch: 10734, Training Loss: 0.01294
Epoch: 10735, Training Loss: 0.01477
Epoch: 10735, Training Loss: 0.01362
Epoch: 10735, Training Loss: 0.01687
Epoch: 10735, Training Loss: 0.01294
Epoch: 10736, Training Loss: 0.01477
Epoch: 10736, Training Loss: 0.01362
Epoch: 10736, Training Loss: 0.01687
Epoch: 10736, Training Loss: 0.01294
Epoch: 10737, Training Loss: 0.01477
Epoch: 10737, Training Loss: 0.01362
Epoch: 10737, Training Loss: 0.01687
Epoch: 10737, Training Loss: 0.01294
Epoch: 10738, Training Loss: 0.01477
Epoch: 10738, Training Loss: 0.01362
Epoch: 10738, Training Loss: 0.01687
Epoch: 10738, Training Loss: 0.01294
Epoch: 10739, Training Loss: 0.01477
Epoch: 10739, Training Loss: 0.01362
Epoch: 10739, Training Loss: 0.01687
Epoch: 10739, Training Loss: 0.01294
Epoch: 10740, Training Loss: 0.01477
Epoch: 10740, Training Loss: 0.01362
Epoch: 10740, Training Loss: 0.01687
Epoch: 10740, Training Loss: 0.01293
Epoch: 10741, Training Loss: 0.01477
Epoch: 10741, Training Loss: 0.01362
Epoch: 10741, Training Loss: 0.01687
Epoch: 10741, Training Loss: 0.01293
Epoch: 10742, Training Loss: 0.01476
Epoch: 10742, Training Loss: 0.01362
Epoch: 10742, Training Loss: 0.01687
Epoch: 10742, Training Loss: 0.01293
Epoch: 10743, Training Loss: 0.01476
Epoch: 10743, Training Loss: 0.01362
Epoch: 10743, Training Loss: 0.01687
Epoch: 10743, Training Loss: 0.01293
Epoch: 10744, Training Loss: 0.01476
Epoch: 10744, Training Loss: 0.01362
Epoch: 10744, Training Loss: 0.01686
Epoch: 10744, Training Loss: 0.01293
Epoch: 10745, Training Loss: 0.01476
Epoch: 10745, Training Loss: 0.01362
Epoch: 10745, Training Loss: 0.01686
Epoch: 10745, Training Loss: 0.01293
Epoch: 10746, Training Loss: 0.01476
Epoch: 10746, Training Loss: 0.01362
Epoch: 10746, Training Loss: 0.01686
Epoch: 10746, Training Loss: 0.01293
Epoch: 10747, Training Loss: 0.01476
Epoch: 10747, Training Loss: 0.01362
Epoch: 10747, Training Loss: 0.01686
Epoch: 10747, Training Loss: 0.01293
Epoch: 10748, Training Loss: 0.01476
Epoch: 10748, Training Loss: 0.01362
Epoch: 10748, Training Loss: 0.01686
Epoch: 10748, Training Loss: 0.01293
Epoch: 10749, Training Loss: 0.01476
Epoch: 10749, Training Loss: 0.01361
Epoch: 10749, Training Loss: 0.01686
Epoch: 10749, Training Loss: 0.01293
Epoch: 10750, Training Loss: 0.01476
Epoch: 10750, Training Loss: 0.01361
Epoch: 10750, Training Loss: 0.01686
Epoch: 10750, Training Loss: 0.01293
Epoch: 10751, Training Loss: 0.01476
Epoch: 10751, Training Loss: 0.01361
Epoch: 10751, Training Loss: 0.01686
Epoch: 10751, Training Loss: 0.01293
Epoch: 10752, Training Loss: 0.01476
Epoch: 10752, Training Loss: 0.01361
Epoch: 10752, Training Loss: 0.01686
Epoch: 10752, Training Loss: 0.01293
Epoch: 10753, Training Loss: 0.01476
Epoch: 10753, Training Loss: 0.01361
Epoch: 10753, Training Loss: 0.01686
Epoch: 10753, Training Loss: 0.01293
Epoch: 10754, Training Loss: 0.01476
Epoch: 10754, Training Loss: 0.01361
Epoch: 10754, Training Loss: 0.01686
Epoch: 10754, Training Loss: 0.01293
Epoch: 10755, Training Loss: 0.01475
Epoch: 10755, Training Loss: 0.01361
Epoch: 10755, Training Loss: 0.01685
Epoch: 10755, Training Loss: 0.01292
Epoch: 10756, Training Loss: 0.01475
Epoch: 10756, Training Loss: 0.01361
Epoch: 10756, Training Loss: 0.01685
Epoch: 10756, Training Loss: 0.01292
Epoch: 10757, Training Loss: 0.01475
Epoch: 10757, Training Loss: 0.01361
Epoch: 10757, Training Loss: 0.01685
Epoch: 10757, Training Loss: 0.01292
Epoch: 10758, Training Loss: 0.01475
Epoch: 10758, Training Loss: 0.01361
Epoch: 10758, Training Loss: 0.01685
Epoch: 10758, Training Loss: 0.01292
Epoch: 10759, Training Loss: 0.01475
Epoch: 10759, Training Loss: 0.01361
Epoch: 10759, Training Loss: 0.01685
Epoch: 10759, Training Loss: 0.01292
Epoch: 10760, Training Loss: 0.01475
Epoch: 10760, Training Loss: 0.01361
Epoch: 10760, Training Loss: 0.01685
Epoch: 10760, Training Loss: 0.01292
Epoch: 10761, Training Loss: 0.01475
Epoch: 10761, Training Loss: 0.01361
Epoch: 10761, Training Loss: 0.01685
Epoch: 10761, Training Loss: 0.01292
Epoch: 10762, Training Loss: 0.01475
Epoch: 10762, Training Loss: 0.01361
Epoch: 10762, Training Loss: 0.01685
Epoch: 10762, Training Loss: 0.01292
Epoch: 10763, Training Loss: 0.01475
Epoch: 10763, Training Loss: 0.01361
Epoch: 10763, Training Loss: 0.01685
Epoch: 10763, Training Loss: 0.01292
Epoch: 10764, Training Loss: 0.01475
Epoch: 10764, Training Loss: 0.01360
Epoch: 10764, Training Loss: 0.01685
Epoch: 10764, Training Loss: 0.01292
Epoch: 10765, Training Loss: 0.01475
Epoch: 10765, Training Loss: 0.01360
Epoch: 10765, Training Loss: 0.01685
Epoch: 10765, Training Loss: 0.01292
Epoch: 10766, Training Loss: 0.01475
Epoch: 10766, Training Loss: 0.01360
Epoch: 10766, Training Loss: 0.01684
Epoch: 10766, Training Loss: 0.01292
Epoch: 10767, Training Loss: 0.01475
Epoch: 10767, Training Loss: 0.01360
Epoch: 10767, Training Loss: 0.01684
Epoch: 10767, Training Loss: 0.01292
Epoch: 10768, Training Loss: 0.01474
Epoch: 10768, Training Loss: 0.01360
Epoch: 10768, Training Loss: 0.01684
Epoch: 10768, Training Loss: 0.01292
Epoch: 10769, Training Loss: 0.01474
Epoch: 10769, Training Loss: 0.01360
Epoch: 10769, Training Loss: 0.01684
Epoch: 10769, Training Loss: 0.01292
Epoch: 10770, Training Loss: 0.01474
Epoch: 10770, Training Loss: 0.01360
Epoch: 10770, Training Loss: 0.01684
Epoch: 10770, Training Loss: 0.01291
Epoch: 10771, Training Loss: 0.01474
Epoch: 10771, Training Loss: 0.01360
Epoch: 10771, Training Loss: 0.01684
Epoch: 10771, Training Loss: 0.01291
Epoch: 10772, Training Loss: 0.01474
Epoch: 10772, Training Loss: 0.01360
Epoch: 10772, Training Loss: 0.01684
Epoch: 10772, Training Loss: 0.01291
Epoch: 10773, Training Loss: 0.01474
Epoch: 10773, Training Loss: 0.01360
Epoch: 10773, Training Loss: 0.01684
Epoch: 10773, Training Loss: 0.01291
Epoch: 10774, Training Loss: 0.01474
Epoch: 10774, Training Loss: 0.01360
Epoch: 10774, Training Loss: 0.01684
Epoch: 10774, Training Loss: 0.01291
Epoch: 10775, Training Loss: 0.01474
Epoch: 10775, Training Loss: 0.01360
Epoch: 10775, Training Loss: 0.01684
Epoch: 10775, Training Loss: 0.01291
Epoch: 10776, Training Loss: 0.01474
Epoch: 10776, Training Loss: 0.01360
Epoch: 10776, Training Loss: 0.01684
Epoch: 10776, Training Loss: 0.01291
Epoch: 10777, Training Loss: 0.01474
Epoch: 10777, Training Loss: 0.01360
Epoch: 10777, Training Loss: 0.01683
Epoch: 10777, Training Loss: 0.01291
Epoch: 10778, Training Loss: 0.01474
Epoch: 10778, Training Loss: 0.01359
Epoch: 10778, Training Loss: 0.01683
Epoch: 10778, Training Loss: 0.01291
Epoch: 10779, Training Loss: 0.01474
Epoch: 10779, Training Loss: 0.01359
Epoch: 10779, Training Loss: 0.01683
Epoch: 10779, Training Loss: 0.01291
Epoch: 10780, Training Loss: 0.01474
Epoch: 10780, Training Loss: 0.01359
Epoch: 10780, Training Loss: 0.01683
Epoch: 10780, Training Loss: 0.01291
Epoch: 10781, Training Loss: 0.01473
Epoch: 10781, Training Loss: 0.01359
Epoch: 10781, Training Loss: 0.01683
Epoch: 10781, Training Loss: 0.01291
Epoch: 10782, Training Loss: 0.01473
Epoch: 10782, Training Loss: 0.01359
Epoch: 10782, Training Loss: 0.01683
Epoch: 10782, Training Loss: 0.01291
Epoch: 10783, Training Loss: 0.01473
Epoch: 10783, Training Loss: 0.01359
Epoch: 10783, Training Loss: 0.01683
Epoch: 10783, Training Loss: 0.01291
Epoch: 10784, Training Loss: 0.01473
Epoch: 10784, Training Loss: 0.01359
Epoch: 10784, Training Loss: 0.01683
Epoch: 10784, Training Loss: 0.01291
Epoch: 10785, Training Loss: 0.01473
Epoch: 10785, Training Loss: 0.01359
Epoch: 10785, Training Loss: 0.01683
Epoch: 10785, Training Loss: 0.01290
Epoch: 10786, Training Loss: 0.01473
Epoch: 10786, Training Loss: 0.01359
Epoch: 10786, Training Loss: 0.01683
Epoch: 10786, Training Loss: 0.01290
Epoch: 10787, Training Loss: 0.01473
Epoch: 10787, Training Loss: 0.01359
Epoch: 10787, Training Loss: 0.01683
Epoch: 10787, Training Loss: 0.01290
Epoch: 10788, Training Loss: 0.01473
Epoch: 10788, Training Loss: 0.01359
Epoch: 10788, Training Loss: 0.01683
Epoch: 10788, Training Loss: 0.01290
Epoch: 10789, Training Loss: 0.01473
Epoch: 10789, Training Loss: 0.01359
Epoch: 10789, Training Loss: 0.01682
Epoch: 10789, Training Loss: 0.01290
Epoch: 10790, Training Loss: 0.01473
Epoch: 10790, Training Loss: 0.01359
Epoch: 10790, Training Loss: 0.01682
Epoch: 10790, Training Loss: 0.01290
Epoch: 10791, Training Loss: 0.01473
Epoch: 10791, Training Loss: 0.01359
Epoch: 10791, Training Loss: 0.01682
Epoch: 10791, Training Loss: 0.01290
Epoch: 10792, Training Loss: 0.01473
Epoch: 10792, Training Loss: 0.01358
Epoch: 10792, Training Loss: 0.01682
Epoch: 10792, Training Loss: 0.01290
Epoch: 10793, Training Loss: 0.01473
Epoch: 10793, Training Loss: 0.01358
Epoch: 10793, Training Loss: 0.01682
Epoch: 10793, Training Loss: 0.01290
Epoch: 10794, Training Loss: 0.01472
Epoch: 10794, Training Loss: 0.01358
Epoch: 10794, Training Loss: 0.01682
Epoch: 10794, Training Loss: 0.01290
Epoch: 10795, Training Loss: 0.01472
Epoch: 10795, Training Loss: 0.01358
Epoch: 10795, Training Loss: 0.01682
Epoch: 10795, Training Loss: 0.01290
Epoch: 10796, Training Loss: 0.01472
Epoch: 10796, Training Loss: 0.01358
Epoch: 10796, Training Loss: 0.01682
Epoch: 10796, Training Loss: 0.01290
Epoch: 10797, Training Loss: 0.01472
Epoch: 10797, Training Loss: 0.01358
Epoch: 10797, Training Loss: 0.01682
Epoch: 10797, Training Loss: 0.01290
Epoch: 10798, Training Loss: 0.01472
Epoch: 10798, Training Loss: 0.01358
Epoch: 10798, Training Loss: 0.01682
Epoch: 10798, Training Loss: 0.01290
Epoch: 10799, Training Loss: 0.01472
Epoch: 10799, Training Loss: 0.01358
Epoch: 10799, Training Loss: 0.01682
Epoch: 10799, Training Loss: 0.01290
Epoch: 10800, Training Loss: 0.01472
Epoch: 10800, Training Loss: 0.01358
Epoch: 10800, Training Loss: 0.01681
Epoch: 10800, Training Loss: 0.01290
Epoch: 10801, Training Loss: 0.01472
Epoch: 10801, Training Loss: 0.01358
Epoch: 10801, Training Loss: 0.01681
Epoch: 10801, Training Loss: 0.01289
Epoch: 10802, Training Loss: 0.01472
Epoch: 10802, Training Loss: 0.01358
Epoch: 10802, Training Loss: 0.01681
Epoch: 10802, Training Loss: 0.01289
Epoch: 10803, Training Loss: 0.01472
Epoch: 10803, Training Loss: 0.01358
Epoch: 10803, Training Loss: 0.01681
Epoch: 10803, Training Loss: 0.01289
Epoch: 10804, Training Loss: 0.01472
Epoch: 10804, Training Loss: 0.01358
Epoch: 10804, Training Loss: 0.01681
Epoch: 10804, Training Loss: 0.01289
Epoch: 10805, Training Loss: 0.01472
Epoch: 10805, Training Loss: 0.01358
Epoch: 10805, Training Loss: 0.01681
Epoch: 10805, Training Loss: 0.01289
Epoch: 10806, Training Loss: 0.01472
Epoch: 10806, Training Loss: 0.01358
Epoch: 10806, Training Loss: 0.01681
Epoch: 10806, Training Loss: 0.01289
Epoch: 10807, Training Loss: 0.01471
Epoch: 10807, Training Loss: 0.01357
Epoch: 10807, Training Loss: 0.01681
Epoch: 10807, Training Loss: 0.01289
Epoch: 10808, Training Loss: 0.01471
Epoch: 10808, Training Loss: 0.01357
Epoch: 10808, Training Loss: 0.01681
Epoch: 10808, Training Loss: 0.01289
Epoch: 10809, Training Loss: 0.01471
Epoch: 10809, Training Loss: 0.01357
Epoch: 10809, Training Loss: 0.01681
Epoch: 10809, Training Loss: 0.01289
Epoch: 10810, Training Loss: 0.01471
Epoch: 10810, Training Loss: 0.01357
Epoch: 10810, Training Loss: 0.01681
Epoch: 10810, Training Loss: 0.01289
Epoch: 10811, Training Loss: 0.01471
Epoch: 10811, Training Loss: 0.01357
Epoch: 10811, Training Loss: 0.01680
Epoch: 10811, Training Loss: 0.01289
Epoch: 10812, Training Loss: 0.01471
Epoch: 10812, Training Loss: 0.01357
Epoch: 10812, Training Loss: 0.01680
Epoch: 10812, Training Loss: 0.01289
Epoch: 10813, Training Loss: 0.01471
Epoch: 10813, Training Loss: 0.01357
Epoch: 10813, Training Loss: 0.01680
Epoch: 10813, Training Loss: 0.01289
Epoch: 10814, Training Loss: 0.01471
Epoch: 10814, Training Loss: 0.01357
Epoch: 10814, Training Loss: 0.01680
Epoch: 10814, Training Loss: 0.01289
Epoch: 10815, Training Loss: 0.01471
Epoch: 10815, Training Loss: 0.01357
Epoch: 10815, Training Loss: 0.01680
Epoch: 10815, Training Loss: 0.01289
Epoch: 10816, Training Loss: 0.01471
Epoch: 10816, Training Loss: 0.01357
Epoch: 10816, Training Loss: 0.01680
Epoch: 10816, Training Loss: 0.01288
Epoch: 10817, Training Loss: 0.01471
Epoch: 10817, Training Loss: 0.01357
Epoch: 10817, Training Loss: 0.01680
Epoch: 10817, Training Loss: 0.01288
Epoch: 10818, Training Loss: 0.01471
Epoch: 10818, Training Loss: 0.01357
Epoch: 10818, Training Loss: 0.01680
Epoch: 10818, Training Loss: 0.01288
Epoch: 10819, Training Loss: 0.01471
Epoch: 10819, Training Loss: 0.01357
Epoch: 10819, Training Loss: 0.01680
Epoch: 10819, Training Loss: 0.01288
Epoch: 10820, Training Loss: 0.01470
Epoch: 10820, Training Loss: 0.01357
Epoch: 10820, Training Loss: 0.01680
Epoch: 10820, Training Loss: 0.01288
Epoch: 10821, Training Loss: 0.01470
Epoch: 10821, Training Loss: 0.01357
Epoch: 10821, Training Loss: 0.01680
Epoch: 10821, Training Loss: 0.01288
Epoch: 10822, Training Loss: 0.01470
Epoch: 10822, Training Loss: 0.01356
Epoch: 10822, Training Loss: 0.01680
Epoch: 10822, Training Loss: 0.01288
Epoch: 10823, Training Loss: 0.01470
Epoch: 10823, Training Loss: 0.01356
Epoch: 10823, Training Loss: 0.01679
Epoch: 10823, Training Loss: 0.01288
Epoch: 10824, Training Loss: 0.01470
Epoch: 10824, Training Loss: 0.01356
Epoch: 10824, Training Loss: 0.01679
Epoch: 10824, Training Loss: 0.01288
Epoch: 10825, Training Loss: 0.01470
Epoch: 10825, Training Loss: 0.01356
Epoch: 10825, Training Loss: 0.01679
Epoch: 10825, Training Loss: 0.01288
Epoch: 10826, Training Loss: 0.01470
Epoch: 10826, Training Loss: 0.01356
Epoch: 10826, Training Loss: 0.01679
Epoch: 10826, Training Loss: 0.01288
Epoch: 10827, Training Loss: 0.01470
Epoch: 10827, Training Loss: 0.01356
Epoch: 10827, Training Loss: 0.01679
Epoch: 10827, Training Loss: 0.01288
Epoch: 10828, Training Loss: 0.01470
Epoch: 10828, Training Loss: 0.01356
Epoch: 10828, Training Loss: 0.01679
Epoch: 10828, Training Loss: 0.01288
Epoch: 10829, Training Loss: 0.01470
Epoch: 10829, Training Loss: 0.01356
Epoch: 10829, Training Loss: 0.01679
Epoch: 10829, Training Loss: 0.01288
Epoch: 10830, Training Loss: 0.01470
Epoch: 10830, Training Loss: 0.01356
Epoch: 10830, Training Loss: 0.01679
Epoch: 10830, Training Loss: 0.01288
Epoch: 10831, Training Loss: 0.01470
Epoch: 10831, Training Loss: 0.01356
Epoch: 10831, Training Loss: 0.01679
Epoch: 10831, Training Loss: 0.01287
Epoch: 10832, Training Loss: 0.01470
Epoch: 10832, Training Loss: 0.01356
Epoch: 10832, Training Loss: 0.01679
Epoch: 10832, Training Loss: 0.01287
Epoch: 10833, Training Loss: 0.01469
Epoch: 10833, Training Loss: 0.01356
Epoch: 10833, Training Loss: 0.01679
Epoch: 10833, Training Loss: 0.01287
Epoch: 10834, Training Loss: 0.01469
Epoch: 10834, Training Loss: 0.01356
Epoch: 10834, Training Loss: 0.01678
Epoch: 10834, Training Loss: 0.01287
Epoch: 10835, Training Loss: 0.01469
Epoch: 10835, Training Loss: 0.01356
Epoch: 10835, Training Loss: 0.01678
Epoch: 10835, Training Loss: 0.01287
Epoch: 10836, Training Loss: 0.01469
Epoch: 10836, Training Loss: 0.01355
Epoch: 10836, Training Loss: 0.01678
Epoch: 10836, Training Loss: 0.01287
Epoch: 10837, Training Loss: 0.01469
Epoch: 10837, Training Loss: 0.01355
Epoch: 10837, Training Loss: 0.01678
Epoch: 10837, Training Loss: 0.01287
Epoch: 10838, Training Loss: 0.01469
Epoch: 10838, Training Loss: 0.01355
Epoch: 10838, Training Loss: 0.01678
Epoch: 10838, Training Loss: 0.01287
Epoch: 10839, Training Loss: 0.01469
Epoch: 10839, Training Loss: 0.01355
Epoch: 10839, Training Loss: 0.01678
Epoch: 10839, Training Loss: 0.01287
Epoch: 10840, Training Loss: 0.01469
Epoch: 10840, Training Loss: 0.01355
Epoch: 10840, Training Loss: 0.01678
Epoch: 10840, Training Loss: 0.01287
Epoch: 10841, Training Loss: 0.01469
Epoch: 10841, Training Loss: 0.01355
Epoch: 10841, Training Loss: 0.01678
Epoch: 10841, Training Loss: 0.01287
Epoch: 10842, Training Loss: 0.01469
Epoch: 10842, Training Loss: 0.01355
Epoch: 10842, Training Loss: 0.01678
Epoch: 10842, Training Loss: 0.01287
Epoch: 10843, Training Loss: 0.01469
Epoch: 10843, Training Loss: 0.01355
Epoch: 10843, Training Loss: 0.01678
Epoch: 10843, Training Loss: 0.01287
Epoch: 10844, Training Loss: 0.01469
Epoch: 10844, Training Loss: 0.01355
Epoch: 10844, Training Loss: 0.01678
Epoch: 10844, Training Loss: 0.01287
Epoch: 10845, Training Loss: 0.01469
Epoch: 10845, Training Loss: 0.01355
Epoch: 10845, Training Loss: 0.01678
Epoch: 10845, Training Loss: 0.01287
Epoch: 10846, Training Loss: 0.01468
Epoch: 10846, Training Loss: 0.01355
Epoch: 10846, Training Loss: 0.01677
Epoch: 10846, Training Loss: 0.01286
Epoch: 10847, Training Loss: 0.01468
Epoch: 10847, Training Loss: 0.01355
Epoch: 10847, Training Loss: 0.01677
Epoch: 10847, Training Loss: 0.01286
Epoch: 10848, Training Loss: 0.01468
Epoch: 10848, Training Loss: 0.01355
Epoch: 10848, Training Loss: 0.01677
Epoch: 10848, Training Loss: 0.01286
Epoch: 10849, Training Loss: 0.01468
Epoch: 10849, Training Loss: 0.01355
Epoch: 10849, Training Loss: 0.01677
Epoch: 10849, Training Loss: 0.01286
Epoch: 10850, Training Loss: 0.01468
Epoch: 10850, Training Loss: 0.01355
Epoch: 10850, Training Loss: 0.01677
Epoch: 10850, Training Loss: 0.01286
Epoch: 10851, Training Loss: 0.01468
Epoch: 10851, Training Loss: 0.01354
Epoch: 10851, Training Loss: 0.01677
Epoch: 10851, Training Loss: 0.01286
Epoch: 10852, Training Loss: 0.01468
Epoch: 10852, Training Loss: 0.01354
Epoch: 10852, Training Loss: 0.01677
Epoch: 10852, Training Loss: 0.01286
Epoch: 10853, Training Loss: 0.01468
Epoch: 10853, Training Loss: 0.01354
Epoch: 10853, Training Loss: 0.01677
Epoch: 10853, Training Loss: 0.01286
...
[output truncated: four per-sample losses are printed each epoch; over epochs 10853-11096 they decrease only slightly, from (0.01468, 0.01354, 0.01677, 0.01286) to roughly (0.01450, 0.01338, 0.01656, 0.01270)]
...
Epoch: 11096, Training Loss: 0.01450
Epoch: 11096, Training Loss: 0.01338
Epoch: 11096, Training Loss: 0.01656
Epoch: 11096, Training Loss: 0.01270
Epoch: 11097, Training Loss: 0.01450
Epoch: 11097, Training Loss: 0.01338
Epoch: 11097, Training Loss: 0.01656
Epoch: 11097, Training Loss: 0.01270
Epoch: 11098, Training Loss: 0.01450
Epoch: 11098, Training Loss: 0.01338
Epoch: 11098, Training Loss: 0.01656
Epoch: 11098, Training Loss: 0.01270
Epoch: 11099, Training Loss: 0.01450
Epoch: 11099, Training Loss: 0.01338
Epoch: 11099, Training Loss: 0.01656
Epoch: 11099, Training Loss: 0.01270
Epoch: 11100, Training Loss: 0.01449
Epoch: 11100, Training Loss: 0.01338
Epoch: 11100, Training Loss: 0.01656
Epoch: 11100, Training Loss: 0.01270
Epoch: 11101, Training Loss: 0.01449
Epoch: 11101, Training Loss: 0.01338
Epoch: 11101, Training Loss: 0.01656
Epoch: 11101, Training Loss: 0.01270
Epoch: 11102, Training Loss: 0.01449
Epoch: 11102, Training Loss: 0.01338
Epoch: 11102, Training Loss: 0.01655
Epoch: 11102, Training Loss: 0.01270
Epoch: 11103, Training Loss: 0.01449
Epoch: 11103, Training Loss: 0.01338
Epoch: 11103, Training Loss: 0.01655
Epoch: 11103, Training Loss: 0.01270
Epoch: 11104, Training Loss: 0.01449
Epoch: 11104, Training Loss: 0.01337
Epoch: 11104, Training Loss: 0.01655
Epoch: 11104, Training Loss: 0.01270
Epoch: 11105, Training Loss: 0.01449
Epoch: 11105, Training Loss: 0.01337
Epoch: 11105, Training Loss: 0.01655
Epoch: 11105, Training Loss: 0.01270
Epoch: 11106, Training Loss: 0.01449
Epoch: 11106, Training Loss: 0.01337
Epoch: 11106, Training Loss: 0.01655
Epoch: 11106, Training Loss: 0.01270
Epoch: 11107, Training Loss: 0.01449
Epoch: 11107, Training Loss: 0.01337
Epoch: 11107, Training Loss: 0.01655
Epoch: 11107, Training Loss: 0.01270
Epoch: 11108, Training Loss: 0.01449
Epoch: 11108, Training Loss: 0.01337
Epoch: 11108, Training Loss: 0.01655
Epoch: 11108, Training Loss: 0.01270
Epoch: 11109, Training Loss: 0.01449
Epoch: 11109, Training Loss: 0.01337
Epoch: 11109, Training Loss: 0.01655
Epoch: 11109, Training Loss: 0.01270
Epoch: 11110, Training Loss: 0.01449
Epoch: 11110, Training Loss: 0.01337
Epoch: 11110, Training Loss: 0.01655
Epoch: 11110, Training Loss: 0.01270
Epoch: 11111, Training Loss: 0.01449
Epoch: 11111, Training Loss: 0.01337
Epoch: 11111, Training Loss: 0.01655
Epoch: 11111, Training Loss: 0.01269
Epoch: 11112, Training Loss: 0.01449
Epoch: 11112, Training Loss: 0.01337
Epoch: 11112, Training Loss: 0.01655
Epoch: 11112, Training Loss: 0.01269
Epoch: 11113, Training Loss: 0.01449
Epoch: 11113, Training Loss: 0.01337
Epoch: 11113, Training Loss: 0.01655
Epoch: 11113, Training Loss: 0.01269
Epoch: 11114, Training Loss: 0.01448
Epoch: 11114, Training Loss: 0.01337
Epoch: 11114, Training Loss: 0.01654
Epoch: 11114, Training Loss: 0.01269
Epoch: 11115, Training Loss: 0.01448
Epoch: 11115, Training Loss: 0.01337
Epoch: 11115, Training Loss: 0.01654
Epoch: 11115, Training Loss: 0.01269
Epoch: 11116, Training Loss: 0.01448
Epoch: 11116, Training Loss: 0.01337
Epoch: 11116, Training Loss: 0.01654
Epoch: 11116, Training Loss: 0.01269
Epoch: 11117, Training Loss: 0.01448
Epoch: 11117, Training Loss: 0.01337
Epoch: 11117, Training Loss: 0.01654
Epoch: 11117, Training Loss: 0.01269
Epoch: 11118, Training Loss: 0.01448
Epoch: 11118, Training Loss: 0.01337
Epoch: 11118, Training Loss: 0.01654
Epoch: 11118, Training Loss: 0.01269
Epoch: 11119, Training Loss: 0.01448
Epoch: 11119, Training Loss: 0.01337
Epoch: 11119, Training Loss: 0.01654
Epoch: 11119, Training Loss: 0.01269
Epoch: 11120, Training Loss: 0.01448
Epoch: 11120, Training Loss: 0.01336
Epoch: 11120, Training Loss: 0.01654
Epoch: 11120, Training Loss: 0.01269
Epoch: 11121, Training Loss: 0.01448
Epoch: 11121, Training Loss: 0.01336
Epoch: 11121, Training Loss: 0.01654
Epoch: 11121, Training Loss: 0.01269
Epoch: 11122, Training Loss: 0.01448
Epoch: 11122, Training Loss: 0.01336
Epoch: 11122, Training Loss: 0.01654
Epoch: 11122, Training Loss: 0.01269
Epoch: 11123, Training Loss: 0.01448
Epoch: 11123, Training Loss: 0.01336
Epoch: 11123, Training Loss: 0.01654
Epoch: 11123, Training Loss: 0.01269
Epoch: 11124, Training Loss: 0.01448
Epoch: 11124, Training Loss: 0.01336
Epoch: 11124, Training Loss: 0.01654
Epoch: 11124, Training Loss: 0.01269
Epoch: 11125, Training Loss: 0.01448
Epoch: 11125, Training Loss: 0.01336
Epoch: 11125, Training Loss: 0.01654
Epoch: 11125, Training Loss: 0.01269
Epoch: 11126, Training Loss: 0.01448
Epoch: 11126, Training Loss: 0.01336
Epoch: 11126, Training Loss: 0.01653
Epoch: 11126, Training Loss: 0.01269
Epoch: 11127, Training Loss: 0.01447
Epoch: 11127, Training Loss: 0.01336
Epoch: 11127, Training Loss: 0.01653
Epoch: 11127, Training Loss: 0.01268
Epoch: 11128, Training Loss: 0.01447
Epoch: 11128, Training Loss: 0.01336
Epoch: 11128, Training Loss: 0.01653
Epoch: 11128, Training Loss: 0.01268
Epoch: 11129, Training Loss: 0.01447
Epoch: 11129, Training Loss: 0.01336
Epoch: 11129, Training Loss: 0.01653
Epoch: 11129, Training Loss: 0.01268
Epoch: 11130, Training Loss: 0.01447
Epoch: 11130, Training Loss: 0.01336
Epoch: 11130, Training Loss: 0.01653
Epoch: 11130, Training Loss: 0.01268
Epoch: 11131, Training Loss: 0.01447
Epoch: 11131, Training Loss: 0.01336
Epoch: 11131, Training Loss: 0.01653
Epoch: 11131, Training Loss: 0.01268
Epoch: 11132, Training Loss: 0.01447
Epoch: 11132, Training Loss: 0.01336
Epoch: 11132, Training Loss: 0.01653
Epoch: 11132, Training Loss: 0.01268
Epoch: 11133, Training Loss: 0.01447
Epoch: 11133, Training Loss: 0.01336
Epoch: 11133, Training Loss: 0.01653
Epoch: 11133, Training Loss: 0.01268
Epoch: 11134, Training Loss: 0.01447
Epoch: 11134, Training Loss: 0.01336
Epoch: 11134, Training Loss: 0.01653
Epoch: 11134, Training Loss: 0.01268
Epoch: 11135, Training Loss: 0.01447
Epoch: 11135, Training Loss: 0.01335
Epoch: 11135, Training Loss: 0.01653
Epoch: 11135, Training Loss: 0.01268
Epoch: 11136, Training Loss: 0.01447
Epoch: 11136, Training Loss: 0.01335
Epoch: 11136, Training Loss: 0.01653
Epoch: 11136, Training Loss: 0.01268
Epoch: 11137, Training Loss: 0.01447
Epoch: 11137, Training Loss: 0.01335
Epoch: 11137, Training Loss: 0.01653
Epoch: 11137, Training Loss: 0.01268
Epoch: 11138, Training Loss: 0.01447
Epoch: 11138, Training Loss: 0.01335
Epoch: 11138, Training Loss: 0.01652
Epoch: 11138, Training Loss: 0.01268
Epoch: 11139, Training Loss: 0.01447
Epoch: 11139, Training Loss: 0.01335
Epoch: 11139, Training Loss: 0.01652
Epoch: 11139, Training Loss: 0.01268
Epoch: 11140, Training Loss: 0.01447
Epoch: 11140, Training Loss: 0.01335
Epoch: 11140, Training Loss: 0.01652
Epoch: 11140, Training Loss: 0.01268
Epoch: 11141, Training Loss: 0.01446
Epoch: 11141, Training Loss: 0.01335
Epoch: 11141, Training Loss: 0.01652
Epoch: 11141, Training Loss: 0.01268
Epoch: 11142, Training Loss: 0.01446
Epoch: 11142, Training Loss: 0.01335
Epoch: 11142, Training Loss: 0.01652
Epoch: 11142, Training Loss: 0.01268
Epoch: 11143, Training Loss: 0.01446
Epoch: 11143, Training Loss: 0.01335
Epoch: 11143, Training Loss: 0.01652
Epoch: 11143, Training Loss: 0.01267
Epoch: 11144, Training Loss: 0.01446
Epoch: 11144, Training Loss: 0.01335
Epoch: 11144, Training Loss: 0.01652
Epoch: 11144, Training Loss: 0.01267
Epoch: 11145, Training Loss: 0.01446
Epoch: 11145, Training Loss: 0.01335
Epoch: 11145, Training Loss: 0.01652
Epoch: 11145, Training Loss: 0.01267
Epoch: 11146, Training Loss: 0.01446
Epoch: 11146, Training Loss: 0.01335
Epoch: 11146, Training Loss: 0.01652
Epoch: 11146, Training Loss: 0.01267
Epoch: 11147, Training Loss: 0.01446
Epoch: 11147, Training Loss: 0.01335
Epoch: 11147, Training Loss: 0.01652
Epoch: 11147, Training Loss: 0.01267
Epoch: 11148, Training Loss: 0.01446
Epoch: 11148, Training Loss: 0.01335
Epoch: 11148, Training Loss: 0.01652
Epoch: 11148, Training Loss: 0.01267
Epoch: 11149, Training Loss: 0.01446
Epoch: 11149, Training Loss: 0.01335
Epoch: 11149, Training Loss: 0.01652
Epoch: 11149, Training Loss: 0.01267
Epoch: 11150, Training Loss: 0.01446
Epoch: 11150, Training Loss: 0.01334
Epoch: 11150, Training Loss: 0.01651
Epoch: 11150, Training Loss: 0.01267
Epoch: 11151, Training Loss: 0.01446
Epoch: 11151, Training Loss: 0.01334
Epoch: 11151, Training Loss: 0.01651
Epoch: 11151, Training Loss: 0.01267
Epoch: 11152, Training Loss: 0.01446
Epoch: 11152, Training Loss: 0.01334
Epoch: 11152, Training Loss: 0.01651
Epoch: 11152, Training Loss: 0.01267
Epoch: 11153, Training Loss: 0.01446
Epoch: 11153, Training Loss: 0.01334
Epoch: 11153, Training Loss: 0.01651
Epoch: 11153, Training Loss: 0.01267
Epoch: 11154, Training Loss: 0.01446
Epoch: 11154, Training Loss: 0.01334
Epoch: 11154, Training Loss: 0.01651
Epoch: 11154, Training Loss: 0.01267
Epoch: 11155, Training Loss: 0.01445
Epoch: 11155, Training Loss: 0.01334
Epoch: 11155, Training Loss: 0.01651
Epoch: 11155, Training Loss: 0.01267
Epoch: 11156, Training Loss: 0.01445
Epoch: 11156, Training Loss: 0.01334
Epoch: 11156, Training Loss: 0.01651
Epoch: 11156, Training Loss: 0.01267
Epoch: 11157, Training Loss: 0.01445
Epoch: 11157, Training Loss: 0.01334
Epoch: 11157, Training Loss: 0.01651
Epoch: 11157, Training Loss: 0.01267
Epoch: 11158, Training Loss: 0.01445
Epoch: 11158, Training Loss: 0.01334
Epoch: 11158, Training Loss: 0.01651
Epoch: 11158, Training Loss: 0.01267
Epoch: 11159, Training Loss: 0.01445
Epoch: 11159, Training Loss: 0.01334
Epoch: 11159, Training Loss: 0.01651
Epoch: 11159, Training Loss: 0.01266
Epoch: 11160, Training Loss: 0.01445
Epoch: 11160, Training Loss: 0.01334
Epoch: 11160, Training Loss: 0.01651
Epoch: 11160, Training Loss: 0.01266
Epoch: 11161, Training Loss: 0.01445
Epoch: 11161, Training Loss: 0.01334
Epoch: 11161, Training Loss: 0.01650
Epoch: 11161, Training Loss: 0.01266
Epoch: 11162, Training Loss: 0.01445
Epoch: 11162, Training Loss: 0.01334
Epoch: 11162, Training Loss: 0.01650
Epoch: 11162, Training Loss: 0.01266
Epoch: 11163, Training Loss: 0.01445
Epoch: 11163, Training Loss: 0.01334
Epoch: 11163, Training Loss: 0.01650
Epoch: 11163, Training Loss: 0.01266
Epoch: 11164, Training Loss: 0.01445
Epoch: 11164, Training Loss: 0.01334
Epoch: 11164, Training Loss: 0.01650
Epoch: 11164, Training Loss: 0.01266
Epoch: 11165, Training Loss: 0.01445
Epoch: 11165, Training Loss: 0.01333
Epoch: 11165, Training Loss: 0.01650
Epoch: 11165, Training Loss: 0.01266
Epoch: 11166, Training Loss: 0.01445
Epoch: 11166, Training Loss: 0.01333
Epoch: 11166, Training Loss: 0.01650
Epoch: 11166, Training Loss: 0.01266
Epoch: 11167, Training Loss: 0.01445
Epoch: 11167, Training Loss: 0.01333
Epoch: 11167, Training Loss: 0.01650
Epoch: 11167, Training Loss: 0.01266
Epoch: 11168, Training Loss: 0.01445
Epoch: 11168, Training Loss: 0.01333
Epoch: 11168, Training Loss: 0.01650
Epoch: 11168, Training Loss: 0.01266
Epoch: 11169, Training Loss: 0.01444
Epoch: 11169, Training Loss: 0.01333
Epoch: 11169, Training Loss: 0.01650
Epoch: 11169, Training Loss: 0.01266
Epoch: 11170, Training Loss: 0.01444
Epoch: 11170, Training Loss: 0.01333
Epoch: 11170, Training Loss: 0.01650
Epoch: 11170, Training Loss: 0.01266
Epoch: 11171, Training Loss: 0.01444
Epoch: 11171, Training Loss: 0.01333
Epoch: 11171, Training Loss: 0.01650
Epoch: 11171, Training Loss: 0.01266
Epoch: 11172, Training Loss: 0.01444
Epoch: 11172, Training Loss: 0.01333
Epoch: 11172, Training Loss: 0.01650
Epoch: 11172, Training Loss: 0.01266
Epoch: 11173, Training Loss: 0.01444
Epoch: 11173, Training Loss: 0.01333
Epoch: 11173, Training Loss: 0.01649
Epoch: 11173, Training Loss: 0.01266
Epoch: 11174, Training Loss: 0.01444
Epoch: 11174, Training Loss: 0.01333
Epoch: 11174, Training Loss: 0.01649
Epoch: 11174, Training Loss: 0.01266
Epoch: 11175, Training Loss: 0.01444
Epoch: 11175, Training Loss: 0.01333
Epoch: 11175, Training Loss: 0.01649
Epoch: 11175, Training Loss: 0.01265
Epoch: 11176, Training Loss: 0.01444
Epoch: 11176, Training Loss: 0.01333
Epoch: 11176, Training Loss: 0.01649
Epoch: 11176, Training Loss: 0.01265
Epoch: 11177, Training Loss: 0.01444
Epoch: 11177, Training Loss: 0.01333
Epoch: 11177, Training Loss: 0.01649
Epoch: 11177, Training Loss: 0.01265
Epoch: 11178, Training Loss: 0.01444
Epoch: 11178, Training Loss: 0.01333
Epoch: 11178, Training Loss: 0.01649
Epoch: 11178, Training Loss: 0.01265
Epoch: 11179, Training Loss: 0.01444
Epoch: 11179, Training Loss: 0.01333
Epoch: 11179, Training Loss: 0.01649
Epoch: 11179, Training Loss: 0.01265
Epoch: 11180, Training Loss: 0.01444
Epoch: 11180, Training Loss: 0.01333
Epoch: 11180, Training Loss: 0.01649
Epoch: 11180, Training Loss: 0.01265
Epoch: 11181, Training Loss: 0.01444
Epoch: 11181, Training Loss: 0.01332
Epoch: 11181, Training Loss: 0.01649
Epoch: 11181, Training Loss: 0.01265
Epoch: 11182, Training Loss: 0.01443
Epoch: 11182, Training Loss: 0.01332
Epoch: 11182, Training Loss: 0.01649
Epoch: 11182, Training Loss: 0.01265
Epoch: 11183, Training Loss: 0.01443
Epoch: 11183, Training Loss: 0.01332
Epoch: 11183, Training Loss: 0.01649
Epoch: 11183, Training Loss: 0.01265
Epoch: 11184, Training Loss: 0.01443
Epoch: 11184, Training Loss: 0.01332
Epoch: 11184, Training Loss: 0.01649
Epoch: 11184, Training Loss: 0.01265
Epoch: 11185, Training Loss: 0.01443
Epoch: 11185, Training Loss: 0.01332
Epoch: 11185, Training Loss: 0.01648
Epoch: 11185, Training Loss: 0.01265
Epoch: 11186, Training Loss: 0.01443
Epoch: 11186, Training Loss: 0.01332
Epoch: 11186, Training Loss: 0.01648
Epoch: 11186, Training Loss: 0.01265
Epoch: 11187, Training Loss: 0.01443
Epoch: 11187, Training Loss: 0.01332
Epoch: 11187, Training Loss: 0.01648
Epoch: 11187, Training Loss: 0.01265
Epoch: 11188, Training Loss: 0.01443
Epoch: 11188, Training Loss: 0.01332
Epoch: 11188, Training Loss: 0.01648
Epoch: 11188, Training Loss: 0.01265
Epoch: 11189, Training Loss: 0.01443
Epoch: 11189, Training Loss: 0.01332
Epoch: 11189, Training Loss: 0.01648
Epoch: 11189, Training Loss: 0.01265
Epoch: 11190, Training Loss: 0.01443
Epoch: 11190, Training Loss: 0.01332
Epoch: 11190, Training Loss: 0.01648
Epoch: 11190, Training Loss: 0.01265
Epoch: 11191, Training Loss: 0.01443
Epoch: 11191, Training Loss: 0.01332
Epoch: 11191, Training Loss: 0.01648
Epoch: 11191, Training Loss: 0.01264
Epoch: 11192, Training Loss: 0.01443
Epoch: 11192, Training Loss: 0.01332
Epoch: 11192, Training Loss: 0.01648
Epoch: 11192, Training Loss: 0.01264
Epoch: 11193, Training Loss: 0.01443
Epoch: 11193, Training Loss: 0.01332
Epoch: 11193, Training Loss: 0.01648
Epoch: 11193, Training Loss: 0.01264
Epoch: 11194, Training Loss: 0.01443
Epoch: 11194, Training Loss: 0.01332
Epoch: 11194, Training Loss: 0.01648
Epoch: 11194, Training Loss: 0.01264
Epoch: 11195, Training Loss: 0.01443
Epoch: 11195, Training Loss: 0.01332
Epoch: 11195, Training Loss: 0.01648
Epoch: 11195, Training Loss: 0.01264
Epoch: 11196, Training Loss: 0.01442
Epoch: 11196, Training Loss: 0.01331
Epoch: 11196, Training Loss: 0.01648
Epoch: 11196, Training Loss: 0.01264
Epoch: 11197, Training Loss: 0.01442
Epoch: 11197, Training Loss: 0.01331
Epoch: 11197, Training Loss: 0.01648
Epoch: 11197, Training Loss: 0.01264
Epoch: 11198, Training Loss: 0.01442
Epoch: 11198, Training Loss: 0.01331
Epoch: 11198, Training Loss: 0.01647
Epoch: 11198, Training Loss: 0.01264
Epoch: 11199, Training Loss: 0.01442
Epoch: 11199, Training Loss: 0.01331
Epoch: 11199, Training Loss: 0.01647
Epoch: 11199, Training Loss: 0.01264
Epoch: 11200, Training Loss: 0.01442
Epoch: 11200, Training Loss: 0.01331
Epoch: 11200, Training Loss: 0.01647
Epoch: 11200, Training Loss: 0.01264
Epoch: 11201, Training Loss: 0.01442
Epoch: 11201, Training Loss: 0.01331
Epoch: 11201, Training Loss: 0.01647
Epoch: 11201, Training Loss: 0.01264
Epoch: 11202, Training Loss: 0.01442
Epoch: 11202, Training Loss: 0.01331
Epoch: 11202, Training Loss: 0.01647
Epoch: 11202, Training Loss: 0.01264
Epoch: 11203, Training Loss: 0.01442
Epoch: 11203, Training Loss: 0.01331
Epoch: 11203, Training Loss: 0.01647
Epoch: 11203, Training Loss: 0.01264
Epoch: 11204, Training Loss: 0.01442
Epoch: 11204, Training Loss: 0.01331
Epoch: 11204, Training Loss: 0.01647
Epoch: 11204, Training Loss: 0.01264
Epoch: 11205, Training Loss: 0.01442
Epoch: 11205, Training Loss: 0.01331
Epoch: 11205, Training Loss: 0.01647
Epoch: 11205, Training Loss: 0.01264
Epoch: 11206, Training Loss: 0.01442
Epoch: 11206, Training Loss: 0.01331
Epoch: 11206, Training Loss: 0.01647
Epoch: 11206, Training Loss: 0.01264
Epoch: 11207, Training Loss: 0.01442
Epoch: 11207, Training Loss: 0.01331
Epoch: 11207, Training Loss: 0.01647
Epoch: 11207, Training Loss: 0.01263
Epoch: 11208, Training Loss: 0.01442
Epoch: 11208, Training Loss: 0.01331
Epoch: 11208, Training Loss: 0.01647
Epoch: 11208, Training Loss: 0.01263
Epoch: 11209, Training Loss: 0.01442
Epoch: 11209, Training Loss: 0.01331
Epoch: 11209, Training Loss: 0.01647
Epoch: 11209, Training Loss: 0.01263
Epoch: 11210, Training Loss: 0.01441
Epoch: 11210, Training Loss: 0.01331
Epoch: 11210, Training Loss: 0.01646
Epoch: 11210, Training Loss: 0.01263
Epoch: 11211, Training Loss: 0.01441
Epoch: 11211, Training Loss: 0.01331
Epoch: 11211, Training Loss: 0.01646
Epoch: 11211, Training Loss: 0.01263
Epoch: 11212, Training Loss: 0.01441
Epoch: 11212, Training Loss: 0.01330
Epoch: 11212, Training Loss: 0.01646
Epoch: 11212, Training Loss: 0.01263
Epoch: 11213, Training Loss: 0.01441
Epoch: 11213, Training Loss: 0.01330
Epoch: 11213, Training Loss: 0.01646
Epoch: 11213, Training Loss: 0.01263
Epoch: 11214, Training Loss: 0.01441
Epoch: 11214, Training Loss: 0.01330
Epoch: 11214, Training Loss: 0.01646
Epoch: 11214, Training Loss: 0.01263
Epoch: 11215, Training Loss: 0.01441
Epoch: 11215, Training Loss: 0.01330
Epoch: 11215, Training Loss: 0.01646
Epoch: 11215, Training Loss: 0.01263
Epoch: 11216, Training Loss: 0.01441
Epoch: 11216, Training Loss: 0.01330
Epoch: 11216, Training Loss: 0.01646
Epoch: 11216, Training Loss: 0.01263
Epoch: 11217, Training Loss: 0.01441
Epoch: 11217, Training Loss: 0.01330
Epoch: 11217, Training Loss: 0.01646
Epoch: 11217, Training Loss: 0.01263
Epoch: 11218, Training Loss: 0.01441
Epoch: 11218, Training Loss: 0.01330
Epoch: 11218, Training Loss: 0.01646
Epoch: 11218, Training Loss: 0.01263
Epoch: 11219, Training Loss: 0.01441
Epoch: 11219, Training Loss: 0.01330
Epoch: 11219, Training Loss: 0.01646
Epoch: 11219, Training Loss: 0.01263
Epoch: 11220, Training Loss: 0.01441
Epoch: 11220, Training Loss: 0.01330
Epoch: 11220, Training Loss: 0.01646
Epoch: 11220, Training Loss: 0.01263
Epoch: 11221, Training Loss: 0.01441
Epoch: 11221, Training Loss: 0.01330
Epoch: 11221, Training Loss: 0.01646
Epoch: 11221, Training Loss: 0.01263
Epoch: 11222, Training Loss: 0.01441
Epoch: 11222, Training Loss: 0.01330
Epoch: 11222, Training Loss: 0.01645
Epoch: 11222, Training Loss: 0.01263
Epoch: 11223, Training Loss: 0.01441
Epoch: 11223, Training Loss: 0.01330
Epoch: 11223, Training Loss: 0.01645
Epoch: 11223, Training Loss: 0.01262
Epoch: 11224, Training Loss: 0.01440
Epoch: 11224, Training Loss: 0.01330
Epoch: 11224, Training Loss: 0.01645
Epoch: 11224, Training Loss: 0.01262
Epoch: 11225, Training Loss: 0.01440
Epoch: 11225, Training Loss: 0.01330
Epoch: 11225, Training Loss: 0.01645
Epoch: 11225, Training Loss: 0.01262
Epoch: 11226, Training Loss: 0.01440
Epoch: 11226, Training Loss: 0.01330
Epoch: 11226, Training Loss: 0.01645
Epoch: 11226, Training Loss: 0.01262
Epoch: 11227, Training Loss: 0.01440
Epoch: 11227, Training Loss: 0.01329
Epoch: 11227, Training Loss: 0.01645
Epoch: 11227, Training Loss: 0.01262
Epoch: 11228, Training Loss: 0.01440
Epoch: 11228, Training Loss: 0.01329
Epoch: 11228, Training Loss: 0.01645
Epoch: 11228, Training Loss: 0.01262
Epoch: 11229, Training Loss: 0.01440
Epoch: 11229, Training Loss: 0.01329
Epoch: 11229, Training Loss: 0.01645
Epoch: 11229, Training Loss: 0.01262
Epoch: 11230, Training Loss: 0.01440
Epoch: 11230, Training Loss: 0.01329
Epoch: 11230, Training Loss: 0.01645
Epoch: 11230, Training Loss: 0.01262
Epoch: 11231, Training Loss: 0.01440
Epoch: 11231, Training Loss: 0.01329
Epoch: 11231, Training Loss: 0.01645
Epoch: 11231, Training Loss: 0.01262
Epoch: 11232, Training Loss: 0.01440
Epoch: 11232, Training Loss: 0.01329
Epoch: 11232, Training Loss: 0.01645
Epoch: 11232, Training Loss: 0.01262
Epoch: 11233, Training Loss: 0.01440
Epoch: 11233, Training Loss: 0.01329
Epoch: 11233, Training Loss: 0.01645
Epoch: 11233, Training Loss: 0.01262
Epoch: 11234, Training Loss: 0.01440
Epoch: 11234, Training Loss: 0.01329
Epoch: 11234, Training Loss: 0.01644
Epoch: 11234, Training Loss: 0.01262
Epoch: 11235, Training Loss: 0.01440
Epoch: 11235, Training Loss: 0.01329
Epoch: 11235, Training Loss: 0.01644
Epoch: 11235, Training Loss: 0.01262
Epoch: 11236, Training Loss: 0.01440
Epoch: 11236, Training Loss: 0.01329
Epoch: 11236, Training Loss: 0.01644
Epoch: 11236, Training Loss: 0.01262
Epoch: 11237, Training Loss: 0.01440
Epoch: 11237, Training Loss: 0.01329
Epoch: 11237, Training Loss: 0.01644
Epoch: 11237, Training Loss: 0.01262
Epoch: 11238, Training Loss: 0.01439
Epoch: 11238, Training Loss: 0.01329
Epoch: 11238, Training Loss: 0.01644
Epoch: 11238, Training Loss: 0.01262
Epoch: 11239, Training Loss: 0.01439
Epoch: 11239, Training Loss: 0.01329
Epoch: 11239, Training Loss: 0.01644
Epoch: 11239, Training Loss: 0.01261
Epoch: 11240, Training Loss: 0.01439
Epoch: 11240, Training Loss: 0.01329
Epoch: 11240, Training Loss: 0.01644
Epoch: 11240, Training Loss: 0.01261
Epoch: 11241, Training Loss: 0.01439
Epoch: 11241, Training Loss: 0.01329
Epoch: 11241, Training Loss: 0.01644
Epoch: 11241, Training Loss: 0.01261
Epoch: 11242, Training Loss: 0.01439
Epoch: 11242, Training Loss: 0.01328
Epoch: 11242, Training Loss: 0.01644
Epoch: 11242, Training Loss: 0.01261
Epoch: 11243, Training Loss: 0.01439
Epoch: 11243, Training Loss: 0.01328
Epoch: 11243, Training Loss: 0.01644
Epoch: 11243, Training Loss: 0.01261
Epoch: 11244, Training Loss: 0.01439
Epoch: 11244, Training Loss: 0.01328
Epoch: 11244, Training Loss: 0.01644
Epoch: 11244, Training Loss: 0.01261
Epoch: 11245, Training Loss: 0.01439
Epoch: 11245, Training Loss: 0.01328
Epoch: 11245, Training Loss: 0.01644
Epoch: 11245, Training Loss: 0.01261
Epoch: 11246, Training Loss: 0.01439
Epoch: 11246, Training Loss: 0.01328
Epoch: 11246, Training Loss: 0.01643
Epoch: 11246, Training Loss: 0.01261
Epoch: 11247, Training Loss: 0.01439
Epoch: 11247, Training Loss: 0.01328
Epoch: 11247, Training Loss: 0.01643
Epoch: 11247, Training Loss: 0.01261
Epoch: 11248, Training Loss: 0.01439
Epoch: 11248, Training Loss: 0.01328
Epoch: 11248, Training Loss: 0.01643
Epoch: 11248, Training Loss: 0.01261
Epoch: 11249, Training Loss: 0.01439
Epoch: 11249, Training Loss: 0.01328
Epoch: 11249, Training Loss: 0.01643
Epoch: 11249, Training Loss: 0.01261
Epoch: 11250, Training Loss: 0.01439
Epoch: 11250, Training Loss: 0.01328
Epoch: 11250, Training Loss: 0.01643
Epoch: 11250, Training Loss: 0.01261
Epoch: 11251, Training Loss: 0.01439
Epoch: 11251, Training Loss: 0.01328
Epoch: 11251, Training Loss: 0.01643
Epoch: 11251, Training Loss: 0.01261
Epoch: 11252, Training Loss: 0.01438
Epoch: 11252, Training Loss: 0.01328
Epoch: 11252, Training Loss: 0.01643
Epoch: 11252, Training Loss: 0.01261
Epoch: 11253, Training Loss: 0.01438
Epoch: 11253, Training Loss: 0.01328
Epoch: 11253, Training Loss: 0.01643
Epoch: 11253, Training Loss: 0.01261
Epoch: 11254, Training Loss: 0.01438
Epoch: 11254, Training Loss: 0.01328
Epoch: 11254, Training Loss: 0.01643
Epoch: 11254, Training Loss: 0.01261
Epoch: 11255, Training Loss: 0.01438
Epoch: 11255, Training Loss: 0.01328
Epoch: 11255, Training Loss: 0.01643
Epoch: 11255, Training Loss: 0.01260
Epoch: 11256, Training Loss: 0.01438
Epoch: 11256, Training Loss: 0.01328
Epoch: 11256, Training Loss: 0.01643
Epoch: 11256, Training Loss: 0.01260
Epoch: 11257, Training Loss: 0.01438
Epoch: 11257, Training Loss: 0.01328
Epoch: 11257, Training Loss: 0.01643
Epoch: 11257, Training Loss: 0.01260
Epoch: 11258, Training Loss: 0.01438
Epoch: 11258, Training Loss: 0.01327
Epoch: 11258, Training Loss: 0.01642
Epoch: 11258, Training Loss: 0.01260
Epoch: 11259, Training Loss: 0.01438
Epoch: 11259, Training Loss: 0.01327
Epoch: 11259, Training Loss: 0.01642
Epoch: 11259, Training Loss: 0.01260
Epoch: 11260, Training Loss: 0.01438
Epoch: 11260, Training Loss: 0.01327
Epoch: 11260, Training Loss: 0.01642
Epoch: 11260, Training Loss: 0.01260
Epoch: 11261, Training Loss: 0.01438
Epoch: 11261, Training Loss: 0.01327
Epoch: 11261, Training Loss: 0.01642
Epoch: 11261, Training Loss: 0.01260
Epoch: 11262, Training Loss: 0.01438
Epoch: 11262, Training Loss: 0.01327
Epoch: 11262, Training Loss: 0.01642
Epoch: 11262, Training Loss: 0.01260
Epoch: 11263, Training Loss: 0.01438
Epoch: 11263, Training Loss: 0.01327
Epoch: 11263, Training Loss: 0.01642
Epoch: 11263, Training Loss: 0.01260
Epoch: 11264, Training Loss: 0.01438
Epoch: 11264, Training Loss: 0.01327
Epoch: 11264, Training Loss: 0.01642
Epoch: 11264, Training Loss: 0.01260
Epoch: 11265, Training Loss: 0.01438
Epoch: 11265, Training Loss: 0.01327
Epoch: 11265, Training Loss: 0.01642
Epoch: 11265, Training Loss: 0.01260
Epoch: 11266, Training Loss: 0.01437
Epoch: 11266, Training Loss: 0.01327
Epoch: 11266, Training Loss: 0.01642
Epoch: 11266, Training Loss: 0.01260
Epoch: 11267, Training Loss: 0.01437
Epoch: 11267, Training Loss: 0.01327
Epoch: 11267, Training Loss: 0.01642
Epoch: 11267, Training Loss: 0.01260
Epoch: 11268, Training Loss: 0.01437
Epoch: 11268, Training Loss: 0.01327
Epoch: 11268, Training Loss: 0.01642
Epoch: 11268, Training Loss: 0.01260
Epoch: 11269, Training Loss: 0.01437
Epoch: 11269, Training Loss: 0.01327
Epoch: 11269, Training Loss: 0.01642
Epoch: 11269, Training Loss: 0.01260
Epoch: 11270, Training Loss: 0.01437
Epoch: 11270, Training Loss: 0.01327
Epoch: 11270, Training Loss: 0.01641
Epoch: 11270, Training Loss: 0.01260
Epoch: 11271, Training Loss: 0.01437
Epoch: 11271, Training Loss: 0.01327
Epoch: 11271, Training Loss: 0.01641
Epoch: 11271, Training Loss: 0.01259
Epoch: 11272, Training Loss: 0.01437
Epoch: 11272, Training Loss: 0.01327
Epoch: 11272, Training Loss: 0.01641
Epoch: 11272, Training Loss: 0.01259
Epoch: 11273, Training Loss: 0.01437
Epoch: 11273, Training Loss: 0.01327
Epoch: 11273, Training Loss: 0.01641
Epoch: 11273, Training Loss: 0.01259
Epoch: 11274, Training Loss: 0.01437
Epoch: 11274, Training Loss: 0.01326
Epoch: 11274, Training Loss: 0.01641
Epoch: 11274, Training Loss: 0.01259
Epoch: 11275, Training Loss: 0.01437
Epoch: 11275, Training Loss: 0.01326
Epoch: 11275, Training Loss: 0.01641
Epoch: 11275, Training Loss: 0.01259
Epoch: 11276, Training Loss: 0.01437
Epoch: 11276, Training Loss: 0.01326
Epoch: 11276, Training Loss: 0.01641
Epoch: 11276, Training Loss: 0.01259
Epoch: 11277, Training Loss: 0.01437
Epoch: 11277, Training Loss: 0.01326
Epoch: 11277, Training Loss: 0.01641
Epoch: 11277, Training Loss: 0.01259
Epoch: 11278, Training Loss: 0.01437
Epoch: 11278, Training Loss: 0.01326
Epoch: 11278, Training Loss: 0.01641
Epoch: 11278, Training Loss: 0.01259
Epoch: 11279, Training Loss: 0.01437
Epoch: 11279, Training Loss: 0.01326
Epoch: 11279, Training Loss: 0.01641
Epoch: 11279, Training Loss: 0.01259
Epoch: 11280, Training Loss: 0.01436
Epoch: 11280, Training Loss: 0.01326
Epoch: 11280, Training Loss: 0.01641
Epoch: 11280, Training Loss: 0.01259
Epoch: 11281, Training Loss: 0.01436
Epoch: 11281, Training Loss: 0.01326
Epoch: 11281, Training Loss: 0.01641
Epoch: 11281, Training Loss: 0.01259
Epoch: 11282, Training Loss: 0.01436
Epoch: 11282, Training Loss: 0.01326
Epoch: 11282, Training Loss: 0.01640
Epoch: 11282, Training Loss: 0.01259
Epoch: 11283, Training Loss: 0.01436
Epoch: 11283, Training Loss: 0.01326
Epoch: 11283, Training Loss: 0.01640
Epoch: 11283, Training Loss: 0.01259
Epoch: 11284, Training Loss: 0.01436
Epoch: 11284, Training Loss: 0.01326
Epoch: 11284, Training Loss: 0.01640
Epoch: 11284, Training Loss: 0.01259
Epoch: 11285, Training Loss: 0.01436
Epoch: 11285, Training Loss: 0.01326
Epoch: 11285, Training Loss: 0.01640
Epoch: 11285, Training Loss: 0.01259
...
Epoch: 11529, Training Loss: 0.01419
[Output truncated: four per-pattern losses are printed each epoch; over epochs 11285–11529 they decrease very slowly (roughly 0.0164 → 0.0124), all still above the 0.008 error target, so training continues.]
Epoch: 11529, Training Loss: 0.01310
Epoch: 11529, Training Loss: 0.01621
Epoch: 11529, Training Loss: 0.01244
Epoch: 11530, Training Loss: 0.01419
Epoch: 11530, Training Loss: 0.01310
Epoch: 11530, Training Loss: 0.01621
Epoch: 11530, Training Loss: 0.01244
Epoch: 11531, Training Loss: 0.01419
Epoch: 11531, Training Loss: 0.01310
Epoch: 11531, Training Loss: 0.01620
Epoch: 11531, Training Loss: 0.01244
Epoch: 11532, Training Loss: 0.01419
Epoch: 11532, Training Loss: 0.01310
Epoch: 11532, Training Loss: 0.01620
Epoch: 11532, Training Loss: 0.01244
Epoch: 11533, Training Loss: 0.01419
Epoch: 11533, Training Loss: 0.01310
Epoch: 11533, Training Loss: 0.01620
Epoch: 11533, Training Loss: 0.01244
Epoch: 11534, Training Loss: 0.01419
Epoch: 11534, Training Loss: 0.01310
Epoch: 11534, Training Loss: 0.01620
Epoch: 11534, Training Loss: 0.01244
Epoch: 11535, Training Loss: 0.01419
Epoch: 11535, Training Loss: 0.01310
Epoch: 11535, Training Loss: 0.01620
Epoch: 11535, Training Loss: 0.01244
Epoch: 11536, Training Loss: 0.01418
Epoch: 11536, Training Loss: 0.01310
Epoch: 11536, Training Loss: 0.01620
Epoch: 11536, Training Loss: 0.01243
Epoch: 11537, Training Loss: 0.01418
Epoch: 11537, Training Loss: 0.01310
Epoch: 11537, Training Loss: 0.01620
Epoch: 11537, Training Loss: 0.01243
Epoch: 11538, Training Loss: 0.01418
Epoch: 11538, Training Loss: 0.01310
Epoch: 11538, Training Loss: 0.01620
Epoch: 11538, Training Loss: 0.01243
Epoch: 11539, Training Loss: 0.01418
Epoch: 11539, Training Loss: 0.01310
Epoch: 11539, Training Loss: 0.01620
Epoch: 11539, Training Loss: 0.01243
Epoch: 11540, Training Loss: 0.01418
Epoch: 11540, Training Loss: 0.01310
Epoch: 11540, Training Loss: 0.01620
Epoch: 11540, Training Loss: 0.01243
Epoch: 11541, Training Loss: 0.01418
Epoch: 11541, Training Loss: 0.01310
Epoch: 11541, Training Loss: 0.01620
Epoch: 11541, Training Loss: 0.01243
Epoch: 11542, Training Loss: 0.01418
Epoch: 11542, Training Loss: 0.01310
Epoch: 11542, Training Loss: 0.01620
Epoch: 11542, Training Loss: 0.01243
Epoch: 11543, Training Loss: 0.01418
Epoch: 11543, Training Loss: 0.01309
Epoch: 11543, Training Loss: 0.01619
Epoch: 11543, Training Loss: 0.01243
Epoch: 11544, Training Loss: 0.01418
Epoch: 11544, Training Loss: 0.01309
Epoch: 11544, Training Loss: 0.01619
Epoch: 11544, Training Loss: 0.01243
Epoch: 11545, Training Loss: 0.01418
Epoch: 11545, Training Loss: 0.01309
Epoch: 11545, Training Loss: 0.01619
Epoch: 11545, Training Loss: 0.01243
Epoch: 11546, Training Loss: 0.01418
Epoch: 11546, Training Loss: 0.01309
Epoch: 11546, Training Loss: 0.01619
Epoch: 11546, Training Loss: 0.01243
Epoch: 11547, Training Loss: 0.01418
Epoch: 11547, Training Loss: 0.01309
Epoch: 11547, Training Loss: 0.01619
Epoch: 11547, Training Loss: 0.01243
Epoch: 11548, Training Loss: 0.01418
Epoch: 11548, Training Loss: 0.01309
Epoch: 11548, Training Loss: 0.01619
Epoch: 11548, Training Loss: 0.01243
Epoch: 11549, Training Loss: 0.01418
Epoch: 11549, Training Loss: 0.01309
Epoch: 11549, Training Loss: 0.01619
Epoch: 11549, Training Loss: 0.01243
Epoch: 11550, Training Loss: 0.01418
Epoch: 11550, Training Loss: 0.01309
Epoch: 11550, Training Loss: 0.01619
Epoch: 11550, Training Loss: 0.01243
Epoch: 11551, Training Loss: 0.01417
Epoch: 11551, Training Loss: 0.01309
Epoch: 11551, Training Loss: 0.01619
Epoch: 11551, Training Loss: 0.01243
Epoch: 11552, Training Loss: 0.01417
Epoch: 11552, Training Loss: 0.01309
Epoch: 11552, Training Loss: 0.01619
Epoch: 11552, Training Loss: 0.01243
Epoch: 11553, Training Loss: 0.01417
Epoch: 11553, Training Loss: 0.01309
Epoch: 11553, Training Loss: 0.01619
Epoch: 11553, Training Loss: 0.01242
Epoch: 11554, Training Loss: 0.01417
Epoch: 11554, Training Loss: 0.01309
Epoch: 11554, Training Loss: 0.01619
Epoch: 11554, Training Loss: 0.01242
Epoch: 11555, Training Loss: 0.01417
Epoch: 11555, Training Loss: 0.01309
Epoch: 11555, Training Loss: 0.01619
Epoch: 11555, Training Loss: 0.01242
Epoch: 11556, Training Loss: 0.01417
Epoch: 11556, Training Loss: 0.01309
Epoch: 11556, Training Loss: 0.01618
Epoch: 11556, Training Loss: 0.01242
Epoch: 11557, Training Loss: 0.01417
Epoch: 11557, Training Loss: 0.01309
Epoch: 11557, Training Loss: 0.01618
Epoch: 11557, Training Loss: 0.01242
Epoch: 11558, Training Loss: 0.01417
Epoch: 11558, Training Loss: 0.01309
Epoch: 11558, Training Loss: 0.01618
Epoch: 11558, Training Loss: 0.01242
Epoch: 11559, Training Loss: 0.01417
Epoch: 11559, Training Loss: 0.01308
Epoch: 11559, Training Loss: 0.01618
Epoch: 11559, Training Loss: 0.01242
Epoch: 11560, Training Loss: 0.01417
Epoch: 11560, Training Loss: 0.01308
Epoch: 11560, Training Loss: 0.01618
Epoch: 11560, Training Loss: 0.01242
Epoch: 11561, Training Loss: 0.01417
Epoch: 11561, Training Loss: 0.01308
Epoch: 11561, Training Loss: 0.01618
Epoch: 11561, Training Loss: 0.01242
Epoch: 11562, Training Loss: 0.01417
Epoch: 11562, Training Loss: 0.01308
Epoch: 11562, Training Loss: 0.01618
Epoch: 11562, Training Loss: 0.01242
Epoch: 11563, Training Loss: 0.01417
Epoch: 11563, Training Loss: 0.01308
Epoch: 11563, Training Loss: 0.01618
Epoch: 11563, Training Loss: 0.01242
Epoch: 11564, Training Loss: 0.01417
Epoch: 11564, Training Loss: 0.01308
Epoch: 11564, Training Loss: 0.01618
Epoch: 11564, Training Loss: 0.01242
Epoch: 11565, Training Loss: 0.01416
Epoch: 11565, Training Loss: 0.01308
Epoch: 11565, Training Loss: 0.01618
Epoch: 11565, Training Loss: 0.01242
Epoch: 11566, Training Loss: 0.01416
Epoch: 11566, Training Loss: 0.01308
Epoch: 11566, Training Loss: 0.01618
Epoch: 11566, Training Loss: 0.01242
Epoch: 11567, Training Loss: 0.01416
Epoch: 11567, Training Loss: 0.01308
Epoch: 11567, Training Loss: 0.01618
Epoch: 11567, Training Loss: 0.01242
Epoch: 11568, Training Loss: 0.01416
Epoch: 11568, Training Loss: 0.01308
Epoch: 11568, Training Loss: 0.01618
Epoch: 11568, Training Loss: 0.01242
Epoch: 11569, Training Loss: 0.01416
Epoch: 11569, Training Loss: 0.01308
Epoch: 11569, Training Loss: 0.01617
Epoch: 11569, Training Loss: 0.01242
Epoch: 11570, Training Loss: 0.01416
Epoch: 11570, Training Loss: 0.01308
Epoch: 11570, Training Loss: 0.01617
Epoch: 11570, Training Loss: 0.01241
Epoch: 11571, Training Loss: 0.01416
Epoch: 11571, Training Loss: 0.01308
Epoch: 11571, Training Loss: 0.01617
Epoch: 11571, Training Loss: 0.01241
Epoch: 11572, Training Loss: 0.01416
Epoch: 11572, Training Loss: 0.01308
Epoch: 11572, Training Loss: 0.01617
Epoch: 11572, Training Loss: 0.01241
Epoch: 11573, Training Loss: 0.01416
Epoch: 11573, Training Loss: 0.01308
Epoch: 11573, Training Loss: 0.01617
Epoch: 11573, Training Loss: 0.01241
Epoch: 11574, Training Loss: 0.01416
Epoch: 11574, Training Loss: 0.01308
Epoch: 11574, Training Loss: 0.01617
Epoch: 11574, Training Loss: 0.01241
Epoch: 11575, Training Loss: 0.01416
Epoch: 11575, Training Loss: 0.01308
Epoch: 11575, Training Loss: 0.01617
Epoch: 11575, Training Loss: 0.01241
Epoch: 11576, Training Loss: 0.01416
Epoch: 11576, Training Loss: 0.01307
Epoch: 11576, Training Loss: 0.01617
Epoch: 11576, Training Loss: 0.01241
Epoch: 11577, Training Loss: 0.01416
Epoch: 11577, Training Loss: 0.01307
Epoch: 11577, Training Loss: 0.01617
Epoch: 11577, Training Loss: 0.01241
Epoch: 11578, Training Loss: 0.01416
Epoch: 11578, Training Loss: 0.01307
Epoch: 11578, Training Loss: 0.01617
Epoch: 11578, Training Loss: 0.01241
Epoch: 11579, Training Loss: 0.01416
Epoch: 11579, Training Loss: 0.01307
Epoch: 11579, Training Loss: 0.01617
Epoch: 11579, Training Loss: 0.01241
Epoch: 11580, Training Loss: 0.01415
Epoch: 11580, Training Loss: 0.01307
Epoch: 11580, Training Loss: 0.01617
Epoch: 11580, Training Loss: 0.01241
Epoch: 11581, Training Loss: 0.01415
Epoch: 11581, Training Loss: 0.01307
Epoch: 11581, Training Loss: 0.01617
Epoch: 11581, Training Loss: 0.01241
Epoch: 11582, Training Loss: 0.01415
Epoch: 11582, Training Loss: 0.01307
Epoch: 11582, Training Loss: 0.01616
Epoch: 11582, Training Loss: 0.01241
Epoch: 11583, Training Loss: 0.01415
Epoch: 11583, Training Loss: 0.01307
Epoch: 11583, Training Loss: 0.01616
Epoch: 11583, Training Loss: 0.01241
Epoch: 11584, Training Loss: 0.01415
Epoch: 11584, Training Loss: 0.01307
Epoch: 11584, Training Loss: 0.01616
Epoch: 11584, Training Loss: 0.01241
Epoch: 11585, Training Loss: 0.01415
Epoch: 11585, Training Loss: 0.01307
Epoch: 11585, Training Loss: 0.01616
Epoch: 11585, Training Loss: 0.01241
Epoch: 11586, Training Loss: 0.01415
Epoch: 11586, Training Loss: 0.01307
Epoch: 11586, Training Loss: 0.01616
Epoch: 11586, Training Loss: 0.01241
Epoch: 11587, Training Loss: 0.01415
Epoch: 11587, Training Loss: 0.01307
Epoch: 11587, Training Loss: 0.01616
Epoch: 11587, Training Loss: 0.01240
Epoch: 11588, Training Loss: 0.01415
Epoch: 11588, Training Loss: 0.01307
Epoch: 11588, Training Loss: 0.01616
Epoch: 11588, Training Loss: 0.01240
Epoch: 11589, Training Loss: 0.01415
Epoch: 11589, Training Loss: 0.01307
Epoch: 11589, Training Loss: 0.01616
Epoch: 11589, Training Loss: 0.01240
Epoch: 11590, Training Loss: 0.01415
Epoch: 11590, Training Loss: 0.01307
Epoch: 11590, Training Loss: 0.01616
Epoch: 11590, Training Loss: 0.01240
Epoch: 11591, Training Loss: 0.01415
Epoch: 11591, Training Loss: 0.01307
Epoch: 11591, Training Loss: 0.01616
Epoch: 11591, Training Loss: 0.01240
Epoch: 11592, Training Loss: 0.01415
Epoch: 11592, Training Loss: 0.01306
Epoch: 11592, Training Loss: 0.01616
Epoch: 11592, Training Loss: 0.01240
Epoch: 11593, Training Loss: 0.01415
Epoch: 11593, Training Loss: 0.01306
Epoch: 11593, Training Loss: 0.01616
Epoch: 11593, Training Loss: 0.01240
Epoch: 11594, Training Loss: 0.01414
Epoch: 11594, Training Loss: 0.01306
Epoch: 11594, Training Loss: 0.01615
Epoch: 11594, Training Loss: 0.01240
Epoch: 11595, Training Loss: 0.01414
Epoch: 11595, Training Loss: 0.01306
Epoch: 11595, Training Loss: 0.01615
Epoch: 11595, Training Loss: 0.01240
Epoch: 11596, Training Loss: 0.01414
Epoch: 11596, Training Loss: 0.01306
Epoch: 11596, Training Loss: 0.01615
Epoch: 11596, Training Loss: 0.01240
Epoch: 11597, Training Loss: 0.01414
Epoch: 11597, Training Loss: 0.01306
Epoch: 11597, Training Loss: 0.01615
Epoch: 11597, Training Loss: 0.01240
Epoch: 11598, Training Loss: 0.01414
Epoch: 11598, Training Loss: 0.01306
Epoch: 11598, Training Loss: 0.01615
Epoch: 11598, Training Loss: 0.01240
Epoch: 11599, Training Loss: 0.01414
Epoch: 11599, Training Loss: 0.01306
Epoch: 11599, Training Loss: 0.01615
Epoch: 11599, Training Loss: 0.01240
Epoch: 11600, Training Loss: 0.01414
Epoch: 11600, Training Loss: 0.01306
Epoch: 11600, Training Loss: 0.01615
Epoch: 11600, Training Loss: 0.01240
Epoch: 11601, Training Loss: 0.01414
Epoch: 11601, Training Loss: 0.01306
Epoch: 11601, Training Loss: 0.01615
Epoch: 11601, Training Loss: 0.01240
Epoch: 11602, Training Loss: 0.01414
Epoch: 11602, Training Loss: 0.01306
Epoch: 11602, Training Loss: 0.01615
Epoch: 11602, Training Loss: 0.01240
Epoch: 11603, Training Loss: 0.01414
Epoch: 11603, Training Loss: 0.01306
Epoch: 11603, Training Loss: 0.01615
Epoch: 11603, Training Loss: 0.01240
Epoch: 11604, Training Loss: 0.01414
Epoch: 11604, Training Loss: 0.01306
Epoch: 11604, Training Loss: 0.01615
Epoch: 11604, Training Loss: 0.01239
Epoch: 11605, Training Loss: 0.01414
Epoch: 11605, Training Loss: 0.01306
Epoch: 11605, Training Loss: 0.01615
Epoch: 11605, Training Loss: 0.01239
Epoch: 11606, Training Loss: 0.01414
Epoch: 11606, Training Loss: 0.01306
Epoch: 11606, Training Loss: 0.01615
Epoch: 11606, Training Loss: 0.01239
Epoch: 11607, Training Loss: 0.01414
Epoch: 11607, Training Loss: 0.01306
Epoch: 11607, Training Loss: 0.01614
Epoch: 11607, Training Loss: 0.01239
Epoch: 11608, Training Loss: 0.01414
Epoch: 11608, Training Loss: 0.01305
Epoch: 11608, Training Loss: 0.01614
Epoch: 11608, Training Loss: 0.01239
Epoch: 11609, Training Loss: 0.01413
Epoch: 11609, Training Loss: 0.01305
Epoch: 11609, Training Loss: 0.01614
Epoch: 11609, Training Loss: 0.01239
Epoch: 11610, Training Loss: 0.01413
Epoch: 11610, Training Loss: 0.01305
Epoch: 11610, Training Loss: 0.01614
Epoch: 11610, Training Loss: 0.01239
Epoch: 11611, Training Loss: 0.01413
Epoch: 11611, Training Loss: 0.01305
Epoch: 11611, Training Loss: 0.01614
Epoch: 11611, Training Loss: 0.01239
Epoch: 11612, Training Loss: 0.01413
Epoch: 11612, Training Loss: 0.01305
Epoch: 11612, Training Loss: 0.01614
Epoch: 11612, Training Loss: 0.01239
Epoch: 11613, Training Loss: 0.01413
Epoch: 11613, Training Loss: 0.01305
Epoch: 11613, Training Loss: 0.01614
Epoch: 11613, Training Loss: 0.01239
Epoch: 11614, Training Loss: 0.01413
Epoch: 11614, Training Loss: 0.01305
Epoch: 11614, Training Loss: 0.01614
Epoch: 11614, Training Loss: 0.01239
Epoch: 11615, Training Loss: 0.01413
Epoch: 11615, Training Loss: 0.01305
Epoch: 11615, Training Loss: 0.01614
Epoch: 11615, Training Loss: 0.01239
Epoch: 11616, Training Loss: 0.01413
Epoch: 11616, Training Loss: 0.01305
Epoch: 11616, Training Loss: 0.01614
Epoch: 11616, Training Loss: 0.01239
Epoch: 11617, Training Loss: 0.01413
Epoch: 11617, Training Loss: 0.01305
Epoch: 11617, Training Loss: 0.01614
Epoch: 11617, Training Loss: 0.01239
Epoch: 11618, Training Loss: 0.01413
Epoch: 11618, Training Loss: 0.01305
Epoch: 11618, Training Loss: 0.01614
Epoch: 11618, Training Loss: 0.01239
Epoch: 11619, Training Loss: 0.01413
Epoch: 11619, Training Loss: 0.01305
Epoch: 11619, Training Loss: 0.01614
Epoch: 11619, Training Loss: 0.01239
Epoch: 11620, Training Loss: 0.01413
Epoch: 11620, Training Loss: 0.01305
Epoch: 11620, Training Loss: 0.01613
Epoch: 11620, Training Loss: 0.01239
Epoch: 11621, Training Loss: 0.01413
Epoch: 11621, Training Loss: 0.01305
Epoch: 11621, Training Loss: 0.01613
Epoch: 11621, Training Loss: 0.01238
Epoch: 11622, Training Loss: 0.01413
Epoch: 11622, Training Loss: 0.01305
Epoch: 11622, Training Loss: 0.01613
Epoch: 11622, Training Loss: 0.01238
Epoch: 11623, Training Loss: 0.01413
Epoch: 11623, Training Loss: 0.01305
Epoch: 11623, Training Loss: 0.01613
Epoch: 11623, Training Loss: 0.01238
Epoch: 11624, Training Loss: 0.01412
Epoch: 11624, Training Loss: 0.01305
Epoch: 11624, Training Loss: 0.01613
Epoch: 11624, Training Loss: 0.01238
Epoch: 11625, Training Loss: 0.01412
Epoch: 11625, Training Loss: 0.01304
Epoch: 11625, Training Loss: 0.01613
Epoch: 11625, Training Loss: 0.01238
Epoch: 11626, Training Loss: 0.01412
Epoch: 11626, Training Loss: 0.01304
Epoch: 11626, Training Loss: 0.01613
Epoch: 11626, Training Loss: 0.01238
Epoch: 11627, Training Loss: 0.01412
Epoch: 11627, Training Loss: 0.01304
Epoch: 11627, Training Loss: 0.01613
Epoch: 11627, Training Loss: 0.01238
Epoch: 11628, Training Loss: 0.01412
Epoch: 11628, Training Loss: 0.01304
Epoch: 11628, Training Loss: 0.01613
Epoch: 11628, Training Loss: 0.01238
Epoch: 11629, Training Loss: 0.01412
Epoch: 11629, Training Loss: 0.01304
Epoch: 11629, Training Loss: 0.01613
Epoch: 11629, Training Loss: 0.01238
Epoch: 11630, Training Loss: 0.01412
Epoch: 11630, Training Loss: 0.01304
Epoch: 11630, Training Loss: 0.01613
Epoch: 11630, Training Loss: 0.01238
Epoch: 11631, Training Loss: 0.01412
Epoch: 11631, Training Loss: 0.01304
Epoch: 11631, Training Loss: 0.01613
Epoch: 11631, Training Loss: 0.01238
Epoch: 11632, Training Loss: 0.01412
Epoch: 11632, Training Loss: 0.01304
Epoch: 11632, Training Loss: 0.01613
Epoch: 11632, Training Loss: 0.01238
Epoch: 11633, Training Loss: 0.01412
Epoch: 11633, Training Loss: 0.01304
Epoch: 11633, Training Loss: 0.01612
Epoch: 11633, Training Loss: 0.01238
Epoch: 11634, Training Loss: 0.01412
Epoch: 11634, Training Loss: 0.01304
Epoch: 11634, Training Loss: 0.01612
Epoch: 11634, Training Loss: 0.01238
Epoch: 11635, Training Loss: 0.01412
Epoch: 11635, Training Loss: 0.01304
Epoch: 11635, Training Loss: 0.01612
Epoch: 11635, Training Loss: 0.01238
Epoch: 11636, Training Loss: 0.01412
Epoch: 11636, Training Loss: 0.01304
Epoch: 11636, Training Loss: 0.01612
Epoch: 11636, Training Loss: 0.01238
Epoch: 11637, Training Loss: 0.01412
Epoch: 11637, Training Loss: 0.01304
Epoch: 11637, Training Loss: 0.01612
Epoch: 11637, Training Loss: 0.01238
Epoch: 11638, Training Loss: 0.01411
Epoch: 11638, Training Loss: 0.01304
Epoch: 11638, Training Loss: 0.01612
Epoch: 11638, Training Loss: 0.01237
Epoch: 11639, Training Loss: 0.01411
Epoch: 11639, Training Loss: 0.01304
Epoch: 11639, Training Loss: 0.01612
Epoch: 11639, Training Loss: 0.01237
Epoch: 11640, Training Loss: 0.01411
Epoch: 11640, Training Loss: 0.01304
Epoch: 11640, Training Loss: 0.01612
Epoch: 11640, Training Loss: 0.01237
Epoch: 11641, Training Loss: 0.01411
Epoch: 11641, Training Loss: 0.01303
Epoch: 11641, Training Loss: 0.01612
Epoch: 11641, Training Loss: 0.01237
Epoch: 11642, Training Loss: 0.01411
Epoch: 11642, Training Loss: 0.01303
Epoch: 11642, Training Loss: 0.01612
Epoch: 11642, Training Loss: 0.01237
Epoch: 11643, Training Loss: 0.01411
Epoch: 11643, Training Loss: 0.01303
Epoch: 11643, Training Loss: 0.01612
Epoch: 11643, Training Loss: 0.01237
Epoch: 11644, Training Loss: 0.01411
Epoch: 11644, Training Loss: 0.01303
Epoch: 11644, Training Loss: 0.01612
Epoch: 11644, Training Loss: 0.01237
Epoch: 11645, Training Loss: 0.01411
Epoch: 11645, Training Loss: 0.01303
Epoch: 11645, Training Loss: 0.01612
Epoch: 11645, Training Loss: 0.01237
Epoch: 11646, Training Loss: 0.01411
Epoch: 11646, Training Loss: 0.01303
Epoch: 11646, Training Loss: 0.01611
Epoch: 11646, Training Loss: 0.01237
Epoch: 11647, Training Loss: 0.01411
Epoch: 11647, Training Loss: 0.01303
Epoch: 11647, Training Loss: 0.01611
Epoch: 11647, Training Loss: 0.01237
Epoch: 11648, Training Loss: 0.01411
Epoch: 11648, Training Loss: 0.01303
Epoch: 11648, Training Loss: 0.01611
Epoch: 11648, Training Loss: 0.01237
Epoch: 11649, Training Loss: 0.01411
Epoch: 11649, Training Loss: 0.01303
Epoch: 11649, Training Loss: 0.01611
Epoch: 11649, Training Loss: 0.01237
Epoch: 11650, Training Loss: 0.01411
Epoch: 11650, Training Loss: 0.01303
Epoch: 11650, Training Loss: 0.01611
Epoch: 11650, Training Loss: 0.01237
Epoch: 11651, Training Loss: 0.01411
Epoch: 11651, Training Loss: 0.01303
Epoch: 11651, Training Loss: 0.01611
Epoch: 11651, Training Loss: 0.01237
Epoch: 11652, Training Loss: 0.01411
Epoch: 11652, Training Loss: 0.01303
Epoch: 11652, Training Loss: 0.01611
Epoch: 11652, Training Loss: 0.01237
Epoch: 11653, Training Loss: 0.01410
Epoch: 11653, Training Loss: 0.01303
Epoch: 11653, Training Loss: 0.01611
Epoch: 11653, Training Loss: 0.01237
Epoch: 11654, Training Loss: 0.01410
Epoch: 11654, Training Loss: 0.01303
Epoch: 11654, Training Loss: 0.01611
Epoch: 11654, Training Loss: 0.01237
Epoch: 11655, Training Loss: 0.01410
Epoch: 11655, Training Loss: 0.01303
Epoch: 11655, Training Loss: 0.01611
Epoch: 11655, Training Loss: 0.01236
Epoch: 11656, Training Loss: 0.01410
Epoch: 11656, Training Loss: 0.01303
Epoch: 11656, Training Loss: 0.01611
Epoch: 11656, Training Loss: 0.01236
Epoch: 11657, Training Loss: 0.01410
Epoch: 11657, Training Loss: 0.01302
Epoch: 11657, Training Loss: 0.01611
Epoch: 11657, Training Loss: 0.01236
Epoch: 11658, Training Loss: 0.01410
Epoch: 11658, Training Loss: 0.01302
Epoch: 11658, Training Loss: 0.01610
Epoch: 11658, Training Loss: 0.01236
Epoch: 11659, Training Loss: 0.01410
Epoch: 11659, Training Loss: 0.01302
Epoch: 11659, Training Loss: 0.01610
Epoch: 11659, Training Loss: 0.01236
Epoch: 11660, Training Loss: 0.01410
Epoch: 11660, Training Loss: 0.01302
Epoch: 11660, Training Loss: 0.01610
Epoch: 11660, Training Loss: 0.01236
Epoch: 11661, Training Loss: 0.01410
Epoch: 11661, Training Loss: 0.01302
Epoch: 11661, Training Loss: 0.01610
Epoch: 11661, Training Loss: 0.01236
Epoch: 11662, Training Loss: 0.01410
Epoch: 11662, Training Loss: 0.01302
Epoch: 11662, Training Loss: 0.01610
Epoch: 11662, Training Loss: 0.01236
Epoch: 11663, Training Loss: 0.01410
Epoch: 11663, Training Loss: 0.01302
Epoch: 11663, Training Loss: 0.01610
Epoch: 11663, Training Loss: 0.01236
Epoch: 11664, Training Loss: 0.01410
Epoch: 11664, Training Loss: 0.01302
Epoch: 11664, Training Loss: 0.01610
Epoch: 11664, Training Loss: 0.01236
Epoch: 11665, Training Loss: 0.01410
Epoch: 11665, Training Loss: 0.01302
Epoch: 11665, Training Loss: 0.01610
Epoch: 11665, Training Loss: 0.01236
Epoch: 11666, Training Loss: 0.01410
Epoch: 11666, Training Loss: 0.01302
Epoch: 11666, Training Loss: 0.01610
Epoch: 11666, Training Loss: 0.01236
Epoch: 11667, Training Loss: 0.01410
Epoch: 11667, Training Loss: 0.01302
Epoch: 11667, Training Loss: 0.01610
Epoch: 11667, Training Loss: 0.01236
Epoch: 11668, Training Loss: 0.01409
Epoch: 11668, Training Loss: 0.01302
Epoch: 11668, Training Loss: 0.01610
Epoch: 11668, Training Loss: 0.01236
Epoch: 11669, Training Loss: 0.01409
Epoch: 11669, Training Loss: 0.01302
Epoch: 11669, Training Loss: 0.01610
Epoch: 11669, Training Loss: 0.01236
Epoch: 11670, Training Loss: 0.01409
Epoch: 11670, Training Loss: 0.01302
Epoch: 11670, Training Loss: 0.01610
Epoch: 11670, Training Loss: 0.01236
Epoch: 11671, Training Loss: 0.01409
Epoch: 11671, Training Loss: 0.01302
Epoch: 11671, Training Loss: 0.01609
Epoch: 11671, Training Loss: 0.01236
Epoch: 11672, Training Loss: 0.01409
Epoch: 11672, Training Loss: 0.01302
Epoch: 11672, Training Loss: 0.01609
Epoch: 11672, Training Loss: 0.01236
Epoch: 11673, Training Loss: 0.01409
Epoch: 11673, Training Loss: 0.01302
Epoch: 11673, Training Loss: 0.01609
Epoch: 11673, Training Loss: 0.01235
Epoch: 11674, Training Loss: 0.01409
Epoch: 11674, Training Loss: 0.01301
Epoch: 11674, Training Loss: 0.01609
Epoch: 11674, Training Loss: 0.01235
Epoch: 11675, Training Loss: 0.01409
Epoch: 11675, Training Loss: 0.01301
Epoch: 11675, Training Loss: 0.01609
Epoch: 11675, Training Loss: 0.01235
Epoch: 11676, Training Loss: 0.01409
Epoch: 11676, Training Loss: 0.01301
Epoch: 11676, Training Loss: 0.01609
Epoch: 11676, Training Loss: 0.01235
Epoch: 11677, Training Loss: 0.01409
Epoch: 11677, Training Loss: 0.01301
Epoch: 11677, Training Loss: 0.01609
Epoch: 11677, Training Loss: 0.01235
Epoch: 11678, Training Loss: 0.01409
Epoch: 11678, Training Loss: 0.01301
Epoch: 11678, Training Loss: 0.01609
Epoch: 11678, Training Loss: 0.01235
Epoch: 11679, Training Loss: 0.01409
Epoch: 11679, Training Loss: 0.01301
Epoch: 11679, Training Loss: 0.01609
Epoch: 11679, Training Loss: 0.01235
Epoch: 11680, Training Loss: 0.01409
Epoch: 11680, Training Loss: 0.01301
Epoch: 11680, Training Loss: 0.01609
Epoch: 11680, Training Loss: 0.01235
Epoch: 11681, Training Loss: 0.01409
Epoch: 11681, Training Loss: 0.01301
Epoch: 11681, Training Loss: 0.01609
Epoch: 11681, Training Loss: 0.01235
Epoch: 11682, Training Loss: 0.01409
Epoch: 11682, Training Loss: 0.01301
Epoch: 11682, Training Loss: 0.01609
Epoch: 11682, Training Loss: 0.01235
Epoch: 11683, Training Loss: 0.01408
Epoch: 11683, Training Loss: 0.01301
Epoch: 11683, Training Loss: 0.01609
Epoch: 11683, Training Loss: 0.01235
Epoch: 11684, Training Loss: 0.01408
Epoch: 11684, Training Loss: 0.01301
Epoch: 11684, Training Loss: 0.01608
Epoch: 11684, Training Loss: 0.01235
Epoch: 11685, Training Loss: 0.01408
Epoch: 11685, Training Loss: 0.01301
Epoch: 11685, Training Loss: 0.01608
Epoch: 11685, Training Loss: 0.01235
Epoch: 11686, Training Loss: 0.01408
Epoch: 11686, Training Loss: 0.01301
Epoch: 11686, Training Loss: 0.01608
Epoch: 11686, Training Loss: 0.01235
Epoch: 11687, Training Loss: 0.01408
Epoch: 11687, Training Loss: 0.01301
Epoch: 11687, Training Loss: 0.01608
Epoch: 11687, Training Loss: 0.01235
Epoch: 11688, Training Loss: 0.01408
Epoch: 11688, Training Loss: 0.01301
Epoch: 11688, Training Loss: 0.01608
Epoch: 11688, Training Loss: 0.01235
Epoch: 11689, Training Loss: 0.01408
Epoch: 11689, Training Loss: 0.01301
Epoch: 11689, Training Loss: 0.01608
Epoch: 11689, Training Loss: 0.01235
Epoch: 11690, Training Loss: 0.01408
Epoch: 11690, Training Loss: 0.01300
Epoch: 11690, Training Loss: 0.01608
Epoch: 11690, Training Loss: 0.01234
Epoch: 11691, Training Loss: 0.01408
Epoch: 11691, Training Loss: 0.01300
Epoch: 11691, Training Loss: 0.01608
Epoch: 11691, Training Loss: 0.01234
Epoch: 11692, Training Loss: 0.01408
Epoch: 11692, Training Loss: 0.01300
Epoch: 11692, Training Loss: 0.01608
Epoch: 11692, Training Loss: 0.01234
Epoch: 11693, Training Loss: 0.01408
Epoch: 11693, Training Loss: 0.01300
Epoch: 11693, Training Loss: 0.01608
Epoch: 11693, Training Loss: 0.01234
Epoch: 11694, Training Loss: 0.01408
Epoch: 11694, Training Loss: 0.01300
Epoch: 11694, Training Loss: 0.01608
Epoch: 11694, Training Loss: 0.01234
Epoch: 11695, Training Loss: 0.01408
Epoch: 11695, Training Loss: 0.01300
Epoch: 11695, Training Loss: 0.01608
Epoch: 11695, Training Loss: 0.01234
Epoch: 11696, Training Loss: 0.01408
Epoch: 11696, Training Loss: 0.01300
Epoch: 11696, Training Loss: 0.01608
Epoch: 11696, Training Loss: 0.01234
Epoch: 11697, Training Loss: 0.01407
Epoch: 11697, Training Loss: 0.01300
Epoch: 11697, Training Loss: 0.01607
Epoch: 11697, Training Loss: 0.01234
Epoch: 11698, Training Loss: 0.01407
Epoch: 11698, Training Loss: 0.01300
Epoch: 11698, Training Loss: 0.01607
Epoch: 11698, Training Loss: 0.01234
Epoch: 11699, Training Loss: 0.01407
Epoch: 11699, Training Loss: 0.01300
Epoch: 11699, Training Loss: 0.01607
Epoch: 11699, Training Loss: 0.01234
Epoch: 11700, Training Loss: 0.01407
Epoch: 11700, Training Loss: 0.01300
Epoch: 11700, Training Loss: 0.01607
Epoch: 11700, Training Loss: 0.01234
Epoch: 11701, Training Loss: 0.01407
Epoch: 11701, Training Loss: 0.01300
Epoch: 11701, Training Loss: 0.01607
Epoch: 11701, Training Loss: 0.01234
Epoch: 11702, Training Loss: 0.01407
Epoch: 11702, Training Loss: 0.01300
Epoch: 11702, Training Loss: 0.01607
Epoch: 11702, Training Loss: 0.01234
Epoch: 11703, Training Loss: 0.01407
Epoch: 11703, Training Loss: 0.01300
Epoch: 11703, Training Loss: 0.01607
Epoch: 11703, Training Loss: 0.01234
Epoch: 11704, Training Loss: 0.01407
Epoch: 11704, Training Loss: 0.01300
Epoch: 11704, Training Loss: 0.01607
Epoch: 11704, Training Loss: 0.01234
Epoch: 11705, Training Loss: 0.01407
Epoch: 11705, Training Loss: 0.01300
Epoch: 11705, Training Loss: 0.01607
Epoch: 11705, Training Loss: 0.01234
Epoch: 11706, Training Loss: 0.01407
Epoch: 11706, Training Loss: 0.01300
Epoch: 11706, Training Loss: 0.01607
Epoch: 11706, Training Loss: 0.01234
Epoch: 11707, Training Loss: 0.01407
Epoch: 11707, Training Loss: 0.01299
Epoch: 11707, Training Loss: 0.01607
Epoch: 11707, Training Loss: 0.01233
Epoch: 11708, Training Loss: 0.01407
Epoch: 11708, Training Loss: 0.01299
Epoch: 11708, Training Loss: 0.01607
Epoch: 11708, Training Loss: 0.01233
Epoch: 11709, Training Loss: 0.01407
Epoch: 11709, Training Loss: 0.01299
Epoch: 11709, Training Loss: 0.01607
Epoch: 11709, Training Loss: 0.01233
Epoch: 11710, Training Loss: 0.01407
Epoch: 11710, Training Loss: 0.01299
Epoch: 11710, Training Loss: 0.01606
Epoch: 11710, Training Loss: 0.01233
Epoch: 11711, Training Loss: 0.01407
Epoch: 11711, Training Loss: 0.01299
Epoch: 11711, Training Loss: 0.01606
Epoch: 11711, Training Loss: 0.01233
Epoch: 11712, Training Loss: 0.01406
Epoch: 11712, Training Loss: 0.01299
Epoch: 11712, Training Loss: 0.01606
Epoch: 11712, Training Loss: 0.01233
Epoch: 11713, Training Loss: 0.01406
Epoch: 11713, Training Loss: 0.01299
Epoch: 11713, Training Loss: 0.01606
Epoch: 11713, Training Loss: 0.01233
Epoch: 11714, Training Loss: 0.01406
Epoch: 11714, Training Loss: 0.01299
Epoch: 11714, Training Loss: 0.01606
Epoch: 11714, Training Loss: 0.01233
Epoch: 11715, Training Loss: 0.01406
Epoch: 11715, Training Loss: 0.01299
Epoch: 11715, Training Loss: 0.01606
Epoch: 11715, Training Loss: 0.01233
Epoch: 11716, Training Loss: 0.01406
Epoch: 11716, Training Loss: 0.01299
Epoch: 11716, Training Loss: 0.01606
Epoch: 11716, Training Loss: 0.01233
Epoch: 11717, Training Loss: 0.01406
Epoch: 11717, Training Loss: 0.01299
Epoch: 11717, Training Loss: 0.01606
Epoch: 11717, Training Loss: 0.01233
Epoch: 11718, Training Loss: 0.01406
Epoch: 11718, Training Loss: 0.01299
Epoch: 11718, Training Loss: 0.01606
Epoch: 11718, Training Loss: 0.01233
... [output truncated: the four per-sample losses are printed every epoch and decrease very slowly (roughly 0.00001 per epoch) toward the 0.008 target] ...
Epoch: 11960, Training Loss: 0.01390
Epoch: 11960, Training Loss: 0.01284
Epoch: 11960, Training Loss: 0.01588
Epoch: 11960, Training Loss: 0.01219
Epoch: 11961, Training Loss: 0.01390
Epoch: 11961, Training Loss: 0.01284
Epoch: 11961, Training Loss: 0.01587
Epoch: 11961, Training Loss: 0.01219
Epoch: 11962, Training Loss: 0.01390
Epoch: 11962, Training Loss: 0.01284
Epoch: 11962, Training Loss: 0.01587
Epoch: 11962, Training Loss: 0.01219
Epoch: 11963, Training Loss: 0.01390
Epoch: 11963, Training Loss: 0.01284
Epoch: 11963, Training Loss: 0.01587
Epoch: 11963, Training Loss: 0.01219
Epoch: 11964, Training Loss: 0.01390
Epoch: 11964, Training Loss: 0.01284
Epoch: 11964, Training Loss: 0.01587
Epoch: 11964, Training Loss: 0.01219
Epoch: 11965, Training Loss: 0.01390
Epoch: 11965, Training Loss: 0.01284
Epoch: 11965, Training Loss: 0.01587
Epoch: 11965, Training Loss: 0.01219
Epoch: 11966, Training Loss: 0.01390
Epoch: 11966, Training Loss: 0.01284
Epoch: 11966, Training Loss: 0.01587
Epoch: 11966, Training Loss: 0.01219
Epoch: 11967, Training Loss: 0.01390
Epoch: 11967, Training Loss: 0.01284
Epoch: 11967, Training Loss: 0.01587
Epoch: 11967, Training Loss: 0.01219
Epoch: 11968, Training Loss: 0.01390
Epoch: 11968, Training Loss: 0.01284
Epoch: 11968, Training Loss: 0.01587
Epoch: 11968, Training Loss: 0.01219
Epoch: 11969, Training Loss: 0.01390
Epoch: 11969, Training Loss: 0.01284
Epoch: 11969, Training Loss: 0.01587
Epoch: 11969, Training Loss: 0.01219
Epoch: 11970, Training Loss: 0.01389
Epoch: 11970, Training Loss: 0.01284
Epoch: 11970, Training Loss: 0.01587
Epoch: 11970, Training Loss: 0.01219
Epoch: 11971, Training Loss: 0.01389
Epoch: 11971, Training Loss: 0.01284
Epoch: 11971, Training Loss: 0.01587
Epoch: 11971, Training Loss: 0.01218
Epoch: 11972, Training Loss: 0.01389
Epoch: 11972, Training Loss: 0.01284
Epoch: 11972, Training Loss: 0.01587
Epoch: 11972, Training Loss: 0.01218
Epoch: 11973, Training Loss: 0.01389
Epoch: 11973, Training Loss: 0.01284
Epoch: 11973, Training Loss: 0.01587
Epoch: 11973, Training Loss: 0.01218
Epoch: 11974, Training Loss: 0.01389
Epoch: 11974, Training Loss: 0.01284
Epoch: 11974, Training Loss: 0.01586
Epoch: 11974, Training Loss: 0.01218
Epoch: 11975, Training Loss: 0.01389
Epoch: 11975, Training Loss: 0.01284
Epoch: 11975, Training Loss: 0.01586
Epoch: 11975, Training Loss: 0.01218
Epoch: 11976, Training Loss: 0.01389
Epoch: 11976, Training Loss: 0.01283
Epoch: 11976, Training Loss: 0.01586
Epoch: 11976, Training Loss: 0.01218
Epoch: 11977, Training Loss: 0.01389
Epoch: 11977, Training Loss: 0.01283
Epoch: 11977, Training Loss: 0.01586
Epoch: 11977, Training Loss: 0.01218
Epoch: 11978, Training Loss: 0.01389
Epoch: 11978, Training Loss: 0.01283
Epoch: 11978, Training Loss: 0.01586
Epoch: 11978, Training Loss: 0.01218
Epoch: 11979, Training Loss: 0.01389
Epoch: 11979, Training Loss: 0.01283
Epoch: 11979, Training Loss: 0.01586
Epoch: 11979, Training Loss: 0.01218
Epoch: 11980, Training Loss: 0.01389
Epoch: 11980, Training Loss: 0.01283
Epoch: 11980, Training Loss: 0.01586
Epoch: 11980, Training Loss: 0.01218
Epoch: 11981, Training Loss: 0.01389
Epoch: 11981, Training Loss: 0.01283
Epoch: 11981, Training Loss: 0.01586
Epoch: 11981, Training Loss: 0.01218
Epoch: 11982, Training Loss: 0.01389
Epoch: 11982, Training Loss: 0.01283
Epoch: 11982, Training Loss: 0.01586
Epoch: 11982, Training Loss: 0.01218
Epoch: 11983, Training Loss: 0.01389
Epoch: 11983, Training Loss: 0.01283
Epoch: 11983, Training Loss: 0.01586
Epoch: 11983, Training Loss: 0.01218
Epoch: 11984, Training Loss: 0.01389
Epoch: 11984, Training Loss: 0.01283
Epoch: 11984, Training Loss: 0.01586
Epoch: 11984, Training Loss: 0.01218
Epoch: 11985, Training Loss: 0.01388
Epoch: 11985, Training Loss: 0.01283
Epoch: 11985, Training Loss: 0.01586
Epoch: 11985, Training Loss: 0.01218
Epoch: 11986, Training Loss: 0.01388
Epoch: 11986, Training Loss: 0.01283
Epoch: 11986, Training Loss: 0.01586
Epoch: 11986, Training Loss: 0.01218
Epoch: 11987, Training Loss: 0.01388
Epoch: 11987, Training Loss: 0.01283
Epoch: 11987, Training Loss: 0.01586
Epoch: 11987, Training Loss: 0.01218
Epoch: 11988, Training Loss: 0.01388
Epoch: 11988, Training Loss: 0.01283
Epoch: 11988, Training Loss: 0.01585
Epoch: 11988, Training Loss: 0.01218
Epoch: 11989, Training Loss: 0.01388
Epoch: 11989, Training Loss: 0.01283
Epoch: 11989, Training Loss: 0.01585
Epoch: 11989, Training Loss: 0.01217
Epoch: 11990, Training Loss: 0.01388
Epoch: 11990, Training Loss: 0.01283
Epoch: 11990, Training Loss: 0.01585
Epoch: 11990, Training Loss: 0.01217
Epoch: 11991, Training Loss: 0.01388
Epoch: 11991, Training Loss: 0.01283
Epoch: 11991, Training Loss: 0.01585
Epoch: 11991, Training Loss: 0.01217
Epoch: 11992, Training Loss: 0.01388
Epoch: 11992, Training Loss: 0.01283
Epoch: 11992, Training Loss: 0.01585
Epoch: 11992, Training Loss: 0.01217
Epoch: 11993, Training Loss: 0.01388
Epoch: 11993, Training Loss: 0.01283
Epoch: 11993, Training Loss: 0.01585
Epoch: 11993, Training Loss: 0.01217
Epoch: 11994, Training Loss: 0.01388
Epoch: 11994, Training Loss: 0.01282
Epoch: 11994, Training Loss: 0.01585
Epoch: 11994, Training Loss: 0.01217
Epoch: 11995, Training Loss: 0.01388
Epoch: 11995, Training Loss: 0.01282
Epoch: 11995, Training Loss: 0.01585
Epoch: 11995, Training Loss: 0.01217
Epoch: 11996, Training Loss: 0.01388
Epoch: 11996, Training Loss: 0.01282
Epoch: 11996, Training Loss: 0.01585
Epoch: 11996, Training Loss: 0.01217
Epoch: 11997, Training Loss: 0.01388
Epoch: 11997, Training Loss: 0.01282
Epoch: 11997, Training Loss: 0.01585
Epoch: 11997, Training Loss: 0.01217
Epoch: 11998, Training Loss: 0.01388
Epoch: 11998, Training Loss: 0.01282
Epoch: 11998, Training Loss: 0.01585
Epoch: 11998, Training Loss: 0.01217
Epoch: 11999, Training Loss: 0.01388
Epoch: 11999, Training Loss: 0.01282
Epoch: 11999, Training Loss: 0.01585
Epoch: 11999, Training Loss: 0.01217
Epoch: 12000, Training Loss: 0.01388
Epoch: 12000, Training Loss: 0.01282
Epoch: 12000, Training Loss: 0.01585
Epoch: 12000, Training Loss: 0.01217
Epoch: 12001, Training Loss: 0.01387
Epoch: 12001, Training Loss: 0.01282
Epoch: 12001, Training Loss: 0.01584
Epoch: 12001, Training Loss: 0.01217
Epoch: 12002, Training Loss: 0.01387
Epoch: 12002, Training Loss: 0.01282
Epoch: 12002, Training Loss: 0.01584
Epoch: 12002, Training Loss: 0.01217
Epoch: 12003, Training Loss: 0.01387
Epoch: 12003, Training Loss: 0.01282
Epoch: 12003, Training Loss: 0.01584
Epoch: 12003, Training Loss: 0.01217
Epoch: 12004, Training Loss: 0.01387
Epoch: 12004, Training Loss: 0.01282
Epoch: 12004, Training Loss: 0.01584
Epoch: 12004, Training Loss: 0.01217
Epoch: 12005, Training Loss: 0.01387
Epoch: 12005, Training Loss: 0.01282
Epoch: 12005, Training Loss: 0.01584
Epoch: 12005, Training Loss: 0.01217
Epoch: 12006, Training Loss: 0.01387
Epoch: 12006, Training Loss: 0.01282
Epoch: 12006, Training Loss: 0.01584
Epoch: 12006, Training Loss: 0.01217
Epoch: 12007, Training Loss: 0.01387
Epoch: 12007, Training Loss: 0.01282
Epoch: 12007, Training Loss: 0.01584
Epoch: 12007, Training Loss: 0.01216
Epoch: 12008, Training Loss: 0.01387
Epoch: 12008, Training Loss: 0.01282
Epoch: 12008, Training Loss: 0.01584
Epoch: 12008, Training Loss: 0.01216
Epoch: 12009, Training Loss: 0.01387
Epoch: 12009, Training Loss: 0.01282
Epoch: 12009, Training Loss: 0.01584
Epoch: 12009, Training Loss: 0.01216
Epoch: 12010, Training Loss: 0.01387
Epoch: 12010, Training Loss: 0.01282
Epoch: 12010, Training Loss: 0.01584
Epoch: 12010, Training Loss: 0.01216
Epoch: 12011, Training Loss: 0.01387
Epoch: 12011, Training Loss: 0.01281
Epoch: 12011, Training Loss: 0.01584
Epoch: 12011, Training Loss: 0.01216
Epoch: 12012, Training Loss: 0.01387
Epoch: 12012, Training Loss: 0.01281
Epoch: 12012, Training Loss: 0.01584
Epoch: 12012, Training Loss: 0.01216
Epoch: 12013, Training Loss: 0.01387
Epoch: 12013, Training Loss: 0.01281
Epoch: 12013, Training Loss: 0.01584
Epoch: 12013, Training Loss: 0.01216
Epoch: 12014, Training Loss: 0.01387
Epoch: 12014, Training Loss: 0.01281
Epoch: 12014, Training Loss: 0.01584
Epoch: 12014, Training Loss: 0.01216
Epoch: 12015, Training Loss: 0.01387
Epoch: 12015, Training Loss: 0.01281
Epoch: 12015, Training Loss: 0.01583
Epoch: 12015, Training Loss: 0.01216
Epoch: 12016, Training Loss: 0.01386
Epoch: 12016, Training Loss: 0.01281
Epoch: 12016, Training Loss: 0.01583
Epoch: 12016, Training Loss: 0.01216
Epoch: 12017, Training Loss: 0.01386
Epoch: 12017, Training Loss: 0.01281
Epoch: 12017, Training Loss: 0.01583
Epoch: 12017, Training Loss: 0.01216
Epoch: 12018, Training Loss: 0.01386
Epoch: 12018, Training Loss: 0.01281
Epoch: 12018, Training Loss: 0.01583
Epoch: 12018, Training Loss: 0.01216
Epoch: 12019, Training Loss: 0.01386
Epoch: 12019, Training Loss: 0.01281
Epoch: 12019, Training Loss: 0.01583
Epoch: 12019, Training Loss: 0.01216
Epoch: 12020, Training Loss: 0.01386
Epoch: 12020, Training Loss: 0.01281
Epoch: 12020, Training Loss: 0.01583
Epoch: 12020, Training Loss: 0.01216
Epoch: 12021, Training Loss: 0.01386
Epoch: 12021, Training Loss: 0.01281
Epoch: 12021, Training Loss: 0.01583
Epoch: 12021, Training Loss: 0.01216
Epoch: 12022, Training Loss: 0.01386
Epoch: 12022, Training Loss: 0.01281
Epoch: 12022, Training Loss: 0.01583
Epoch: 12022, Training Loss: 0.01216
Epoch: 12023, Training Loss: 0.01386
Epoch: 12023, Training Loss: 0.01281
Epoch: 12023, Training Loss: 0.01583
Epoch: 12023, Training Loss: 0.01216
Epoch: 12024, Training Loss: 0.01386
Epoch: 12024, Training Loss: 0.01281
Epoch: 12024, Training Loss: 0.01583
Epoch: 12024, Training Loss: 0.01216
Epoch: 12025, Training Loss: 0.01386
Epoch: 12025, Training Loss: 0.01281
Epoch: 12025, Training Loss: 0.01583
Epoch: 12025, Training Loss: 0.01215
Epoch: 12026, Training Loss: 0.01386
Epoch: 12026, Training Loss: 0.01281
Epoch: 12026, Training Loss: 0.01583
Epoch: 12026, Training Loss: 0.01215
Epoch: 12027, Training Loss: 0.01386
Epoch: 12027, Training Loss: 0.01281
Epoch: 12027, Training Loss: 0.01583
Epoch: 12027, Training Loss: 0.01215
Epoch: 12028, Training Loss: 0.01386
Epoch: 12028, Training Loss: 0.01280
Epoch: 12028, Training Loss: 0.01583
Epoch: 12028, Training Loss: 0.01215
Epoch: 12029, Training Loss: 0.01386
Epoch: 12029, Training Loss: 0.01280
Epoch: 12029, Training Loss: 0.01582
Epoch: 12029, Training Loss: 0.01215
Epoch: 12030, Training Loss: 0.01386
Epoch: 12030, Training Loss: 0.01280
Epoch: 12030, Training Loss: 0.01582
Epoch: 12030, Training Loss: 0.01215
Epoch: 12031, Training Loss: 0.01386
Epoch: 12031, Training Loss: 0.01280
Epoch: 12031, Training Loss: 0.01582
Epoch: 12031, Training Loss: 0.01215
Epoch: 12032, Training Loss: 0.01385
Epoch: 12032, Training Loss: 0.01280
Epoch: 12032, Training Loss: 0.01582
Epoch: 12032, Training Loss: 0.01215
Epoch: 12033, Training Loss: 0.01385
Epoch: 12033, Training Loss: 0.01280
Epoch: 12033, Training Loss: 0.01582
Epoch: 12033, Training Loss: 0.01215
Epoch: 12034, Training Loss: 0.01385
Epoch: 12034, Training Loss: 0.01280
Epoch: 12034, Training Loss: 0.01582
Epoch: 12034, Training Loss: 0.01215
Epoch: 12035, Training Loss: 0.01385
Epoch: 12035, Training Loss: 0.01280
Epoch: 12035, Training Loss: 0.01582
Epoch: 12035, Training Loss: 0.01215
Epoch: 12036, Training Loss: 0.01385
Epoch: 12036, Training Loss: 0.01280
Epoch: 12036, Training Loss: 0.01582
Epoch: 12036, Training Loss: 0.01215
Epoch: 12037, Training Loss: 0.01385
Epoch: 12037, Training Loss: 0.01280
Epoch: 12037, Training Loss: 0.01582
Epoch: 12037, Training Loss: 0.01215
Epoch: 12038, Training Loss: 0.01385
Epoch: 12038, Training Loss: 0.01280
Epoch: 12038, Training Loss: 0.01582
Epoch: 12038, Training Loss: 0.01215
Epoch: 12039, Training Loss: 0.01385
Epoch: 12039, Training Loss: 0.01280
Epoch: 12039, Training Loss: 0.01582
Epoch: 12039, Training Loss: 0.01215
Epoch: 12040, Training Loss: 0.01385
Epoch: 12040, Training Loss: 0.01280
Epoch: 12040, Training Loss: 0.01582
Epoch: 12040, Training Loss: 0.01215
Epoch: 12041, Training Loss: 0.01385
Epoch: 12041, Training Loss: 0.01280
Epoch: 12041, Training Loss: 0.01582
Epoch: 12041, Training Loss: 0.01215
Epoch: 12042, Training Loss: 0.01385
Epoch: 12042, Training Loss: 0.01280
Epoch: 12042, Training Loss: 0.01581
Epoch: 12042, Training Loss: 0.01215
Epoch: 12043, Training Loss: 0.01385
Epoch: 12043, Training Loss: 0.01280
Epoch: 12043, Training Loss: 0.01581
Epoch: 12043, Training Loss: 0.01214
Epoch: 12044, Training Loss: 0.01385
Epoch: 12044, Training Loss: 0.01280
Epoch: 12044, Training Loss: 0.01581
Epoch: 12044, Training Loss: 0.01214
Epoch: 12045, Training Loss: 0.01385
Epoch: 12045, Training Loss: 0.01279
Epoch: 12045, Training Loss: 0.01581
Epoch: 12045, Training Loss: 0.01214
Epoch: 12046, Training Loss: 0.01385
Epoch: 12046, Training Loss: 0.01279
Epoch: 12046, Training Loss: 0.01581
Epoch: 12046, Training Loss: 0.01214
Epoch: 12047, Training Loss: 0.01384
Epoch: 12047, Training Loss: 0.01279
Epoch: 12047, Training Loss: 0.01581
Epoch: 12047, Training Loss: 0.01214
Epoch: 12048, Training Loss: 0.01384
Epoch: 12048, Training Loss: 0.01279
Epoch: 12048, Training Loss: 0.01581
Epoch: 12048, Training Loss: 0.01214
Epoch: 12049, Training Loss: 0.01384
Epoch: 12049, Training Loss: 0.01279
Epoch: 12049, Training Loss: 0.01581
Epoch: 12049, Training Loss: 0.01214
Epoch: 12050, Training Loss: 0.01384
Epoch: 12050, Training Loss: 0.01279
Epoch: 12050, Training Loss: 0.01581
Epoch: 12050, Training Loss: 0.01214
Epoch: 12051, Training Loss: 0.01384
Epoch: 12051, Training Loss: 0.01279
Epoch: 12051, Training Loss: 0.01581
Epoch: 12051, Training Loss: 0.01214
Epoch: 12052, Training Loss: 0.01384
Epoch: 12052, Training Loss: 0.01279
Epoch: 12052, Training Loss: 0.01581
Epoch: 12052, Training Loss: 0.01214
Epoch: 12053, Training Loss: 0.01384
Epoch: 12053, Training Loss: 0.01279
Epoch: 12053, Training Loss: 0.01581
Epoch: 12053, Training Loss: 0.01214
Epoch: 12054, Training Loss: 0.01384
Epoch: 12054, Training Loss: 0.01279
Epoch: 12054, Training Loss: 0.01581
Epoch: 12054, Training Loss: 0.01214
Epoch: 12055, Training Loss: 0.01384
Epoch: 12055, Training Loss: 0.01279
Epoch: 12055, Training Loss: 0.01581
Epoch: 12055, Training Loss: 0.01214
Epoch: 12056, Training Loss: 0.01384
Epoch: 12056, Training Loss: 0.01279
Epoch: 12056, Training Loss: 0.01580
Epoch: 12056, Training Loss: 0.01214
Epoch: 12057, Training Loss: 0.01384
Epoch: 12057, Training Loss: 0.01279
Epoch: 12057, Training Loss: 0.01580
Epoch: 12057, Training Loss: 0.01214
Epoch: 12058, Training Loss: 0.01384
Epoch: 12058, Training Loss: 0.01279
Epoch: 12058, Training Loss: 0.01580
Epoch: 12058, Training Loss: 0.01214
Epoch: 12059, Training Loss: 0.01384
Epoch: 12059, Training Loss: 0.01279
Epoch: 12059, Training Loss: 0.01580
Epoch: 12059, Training Loss: 0.01214
Epoch: 12060, Training Loss: 0.01384
Epoch: 12060, Training Loss: 0.01279
Epoch: 12060, Training Loss: 0.01580
Epoch: 12060, Training Loss: 0.01214
Epoch: 12061, Training Loss: 0.01384
Epoch: 12061, Training Loss: 0.01279
Epoch: 12061, Training Loss: 0.01580
Epoch: 12061, Training Loss: 0.01213
Epoch: 12062, Training Loss: 0.01384
Epoch: 12062, Training Loss: 0.01279
Epoch: 12062, Training Loss: 0.01580
Epoch: 12062, Training Loss: 0.01213
Epoch: 12063, Training Loss: 0.01383
Epoch: 12063, Training Loss: 0.01278
Epoch: 12063, Training Loss: 0.01580
Epoch: 12063, Training Loss: 0.01213
Epoch: 12064, Training Loss: 0.01383
Epoch: 12064, Training Loss: 0.01278
Epoch: 12064, Training Loss: 0.01580
Epoch: 12064, Training Loss: 0.01213
Epoch: 12065, Training Loss: 0.01383
Epoch: 12065, Training Loss: 0.01278
Epoch: 12065, Training Loss: 0.01580
Epoch: 12065, Training Loss: 0.01213
Epoch: 12066, Training Loss: 0.01383
Epoch: 12066, Training Loss: 0.01278
Epoch: 12066, Training Loss: 0.01580
Epoch: 12066, Training Loss: 0.01213
Epoch: 12067, Training Loss: 0.01383
Epoch: 12067, Training Loss: 0.01278
Epoch: 12067, Training Loss: 0.01580
Epoch: 12067, Training Loss: 0.01213
Epoch: 12068, Training Loss: 0.01383
Epoch: 12068, Training Loss: 0.01278
Epoch: 12068, Training Loss: 0.01580
Epoch: 12068, Training Loss: 0.01213
Epoch: 12069, Training Loss: 0.01383
Epoch: 12069, Training Loss: 0.01278
Epoch: 12069, Training Loss: 0.01579
Epoch: 12069, Training Loss: 0.01213
Epoch: 12070, Training Loss: 0.01383
Epoch: 12070, Training Loss: 0.01278
Epoch: 12070, Training Loss: 0.01579
Epoch: 12070, Training Loss: 0.01213
Epoch: 12071, Training Loss: 0.01383
Epoch: 12071, Training Loss: 0.01278
Epoch: 12071, Training Loss: 0.01579
Epoch: 12071, Training Loss: 0.01213
Epoch: 12072, Training Loss: 0.01383
Epoch: 12072, Training Loss: 0.01278
Epoch: 12072, Training Loss: 0.01579
Epoch: 12072, Training Loss: 0.01213
Epoch: 12073, Training Loss: 0.01383
Epoch: 12073, Training Loss: 0.01278
Epoch: 12073, Training Loss: 0.01579
Epoch: 12073, Training Loss: 0.01213
Epoch: 12074, Training Loss: 0.01383
Epoch: 12074, Training Loss: 0.01278
Epoch: 12074, Training Loss: 0.01579
Epoch: 12074, Training Loss: 0.01213
Epoch: 12075, Training Loss: 0.01383
Epoch: 12075, Training Loss: 0.01278
Epoch: 12075, Training Loss: 0.01579
Epoch: 12075, Training Loss: 0.01213
Epoch: 12076, Training Loss: 0.01383
Epoch: 12076, Training Loss: 0.01278
Epoch: 12076, Training Loss: 0.01579
Epoch: 12076, Training Loss: 0.01213
Epoch: 12077, Training Loss: 0.01383
Epoch: 12077, Training Loss: 0.01278
Epoch: 12077, Training Loss: 0.01579
Epoch: 12077, Training Loss: 0.01213
Epoch: 12078, Training Loss: 0.01382
Epoch: 12078, Training Loss: 0.01278
Epoch: 12078, Training Loss: 0.01579
Epoch: 12078, Training Loss: 0.01213
Epoch: 12079, Training Loss: 0.01382
Epoch: 12079, Training Loss: 0.01278
Epoch: 12079, Training Loss: 0.01579
Epoch: 12079, Training Loss: 0.01212
Epoch: 12080, Training Loss: 0.01382
Epoch: 12080, Training Loss: 0.01277
Epoch: 12080, Training Loss: 0.01579
Epoch: 12080, Training Loss: 0.01212
Epoch: 12081, Training Loss: 0.01382
Epoch: 12081, Training Loss: 0.01277
Epoch: 12081, Training Loss: 0.01579
Epoch: 12081, Training Loss: 0.01212
Epoch: 12082, Training Loss: 0.01382
Epoch: 12082, Training Loss: 0.01277
Epoch: 12082, Training Loss: 0.01579
Epoch: 12082, Training Loss: 0.01212
Epoch: 12083, Training Loss: 0.01382
Epoch: 12083, Training Loss: 0.01277
Epoch: 12083, Training Loss: 0.01578
Epoch: 12083, Training Loss: 0.01212
Epoch: 12084, Training Loss: 0.01382
Epoch: 12084, Training Loss: 0.01277
Epoch: 12084, Training Loss: 0.01578
Epoch: 12084, Training Loss: 0.01212
Epoch: 12085, Training Loss: 0.01382
Epoch: 12085, Training Loss: 0.01277
Epoch: 12085, Training Loss: 0.01578
Epoch: 12085, Training Loss: 0.01212
Epoch: 12086, Training Loss: 0.01382
Epoch: 12086, Training Loss: 0.01277
Epoch: 12086, Training Loss: 0.01578
Epoch: 12086, Training Loss: 0.01212
Epoch: 12087, Training Loss: 0.01382
Epoch: 12087, Training Loss: 0.01277
Epoch: 12087, Training Loss: 0.01578
Epoch: 12087, Training Loss: 0.01212
Epoch: 12088, Training Loss: 0.01382
Epoch: 12088, Training Loss: 0.01277
Epoch: 12088, Training Loss: 0.01578
Epoch: 12088, Training Loss: 0.01212
Epoch: 12089, Training Loss: 0.01382
Epoch: 12089, Training Loss: 0.01277
Epoch: 12089, Training Loss: 0.01578
Epoch: 12089, Training Loss: 0.01212
Epoch: 12090, Training Loss: 0.01382
Epoch: 12090, Training Loss: 0.01277
Epoch: 12090, Training Loss: 0.01578
Epoch: 12090, Training Loss: 0.01212
Epoch: 12091, Training Loss: 0.01382
Epoch: 12091, Training Loss: 0.01277
Epoch: 12091, Training Loss: 0.01578
Epoch: 12091, Training Loss: 0.01212
Epoch: 12092, Training Loss: 0.01382
Epoch: 12092, Training Loss: 0.01277
Epoch: 12092, Training Loss: 0.01578
Epoch: 12092, Training Loss: 0.01212
Epoch: 12093, Training Loss: 0.01382
Epoch: 12093, Training Loss: 0.01277
Epoch: 12093, Training Loss: 0.01578
Epoch: 12093, Training Loss: 0.01212
Epoch: 12094, Training Loss: 0.01381
Epoch: 12094, Training Loss: 0.01277
Epoch: 12094, Training Loss: 0.01578
Epoch: 12094, Training Loss: 0.01212
Epoch: 12095, Training Loss: 0.01381
Epoch: 12095, Training Loss: 0.01277
Epoch: 12095, Training Loss: 0.01578
Epoch: 12095, Training Loss: 0.01212
Epoch: 12096, Training Loss: 0.01381
Epoch: 12096, Training Loss: 0.01277
Epoch: 12096, Training Loss: 0.01578
Epoch: 12096, Training Loss: 0.01212
Epoch: 12097, Training Loss: 0.01381
Epoch: 12097, Training Loss: 0.01277
Epoch: 12097, Training Loss: 0.01577
Epoch: 12097, Training Loss: 0.01212
Epoch: 12098, Training Loss: 0.01381
Epoch: 12098, Training Loss: 0.01276
Epoch: 12098, Training Loss: 0.01577
Epoch: 12098, Training Loss: 0.01211
Epoch: 12099, Training Loss: 0.01381
Epoch: 12099, Training Loss: 0.01276
Epoch: 12099, Training Loss: 0.01577
Epoch: 12099, Training Loss: 0.01211
Epoch: 12100, Training Loss: 0.01381
Epoch: 12100, Training Loss: 0.01276
Epoch: 12100, Training Loss: 0.01577
Epoch: 12100, Training Loss: 0.01211
Epoch: 12101, Training Loss: 0.01381
Epoch: 12101, Training Loss: 0.01276
Epoch: 12101, Training Loss: 0.01577
Epoch: 12101, Training Loss: 0.01211
Epoch: 12102, Training Loss: 0.01381
Epoch: 12102, Training Loss: 0.01276
Epoch: 12102, Training Loss: 0.01577
Epoch: 12102, Training Loss: 0.01211
Epoch: 12103, Training Loss: 0.01381
Epoch: 12103, Training Loss: 0.01276
Epoch: 12103, Training Loss: 0.01577
Epoch: 12103, Training Loss: 0.01211
Epoch: 12104, Training Loss: 0.01381
Epoch: 12104, Training Loss: 0.01276
Epoch: 12104, Training Loss: 0.01577
Epoch: 12104, Training Loss: 0.01211
Epoch: 12105, Training Loss: 0.01381
Epoch: 12105, Training Loss: 0.01276
Epoch: 12105, Training Loss: 0.01577
Epoch: 12105, Training Loss: 0.01211
Epoch: 12106, Training Loss: 0.01381
Epoch: 12106, Training Loss: 0.01276
Epoch: 12106, Training Loss: 0.01577
Epoch: 12106, Training Loss: 0.01211
Epoch: 12107, Training Loss: 0.01381
Epoch: 12107, Training Loss: 0.01276
Epoch: 12107, Training Loss: 0.01577
Epoch: 12107, Training Loss: 0.01211
Epoch: 12108, Training Loss: 0.01381
Epoch: 12108, Training Loss: 0.01276
Epoch: 12108, Training Loss: 0.01577
Epoch: 12108, Training Loss: 0.01211
Epoch: 12109, Training Loss: 0.01381
Epoch: 12109, Training Loss: 0.01276
Epoch: 12109, Training Loss: 0.01577
Epoch: 12109, Training Loss: 0.01211
Epoch: 12110, Training Loss: 0.01380
Epoch: 12110, Training Loss: 0.01276
Epoch: 12110, Training Loss: 0.01576
Epoch: 12110, Training Loss: 0.01211
Epoch: 12111, Training Loss: 0.01380
Epoch: 12111, Training Loss: 0.01276
Epoch: 12111, Training Loss: 0.01576
Epoch: 12111, Training Loss: 0.01211
Epoch: 12112, Training Loss: 0.01380
Epoch: 12112, Training Loss: 0.01276
Epoch: 12112, Training Loss: 0.01576
Epoch: 12112, Training Loss: 0.01211
Epoch: 12113, Training Loss: 0.01380
Epoch: 12113, Training Loss: 0.01276
Epoch: 12113, Training Loss: 0.01576
Epoch: 12113, Training Loss: 0.01211
Epoch: 12114, Training Loss: 0.01380
Epoch: 12114, Training Loss: 0.01276
Epoch: 12114, Training Loss: 0.01576
Epoch: 12114, Training Loss: 0.01211
Epoch: 12115, Training Loss: 0.01380
Epoch: 12115, Training Loss: 0.01275
Epoch: 12115, Training Loss: 0.01576
Epoch: 12115, Training Loss: 0.01211
Epoch: 12116, Training Loss: 0.01380
Epoch: 12116, Training Loss: 0.01275
Epoch: 12116, Training Loss: 0.01576
Epoch: 12116, Training Loss: 0.01210
Epoch: 12117, Training Loss: 0.01380
Epoch: 12117, Training Loss: 0.01275
Epoch: 12117, Training Loss: 0.01576
Epoch: 12117, Training Loss: 0.01210
Epoch: 12118, Training Loss: 0.01380
Epoch: 12118, Training Loss: 0.01275
Epoch: 12118, Training Loss: 0.01576
Epoch: 12118, Training Loss: 0.01210
Epoch: 12119, Training Loss: 0.01380
Epoch: 12119, Training Loss: 0.01275
Epoch: 12119, Training Loss: 0.01576
Epoch: 12119, Training Loss: 0.01210
Epoch: 12120, Training Loss: 0.01380
Epoch: 12120, Training Loss: 0.01275
Epoch: 12120, Training Loss: 0.01576
Epoch: 12120, Training Loss: 0.01210
Epoch: 12121, Training Loss: 0.01380
Epoch: 12121, Training Loss: 0.01275
Epoch: 12121, Training Loss: 0.01576
Epoch: 12121, Training Loss: 0.01210
Epoch: 12122, Training Loss: 0.01380
Epoch: 12122, Training Loss: 0.01275
Epoch: 12122, Training Loss: 0.01576
Epoch: 12122, Training Loss: 0.01210
Epoch: 12123, Training Loss: 0.01380
Epoch: 12123, Training Loss: 0.01275
Epoch: 12123, Training Loss: 0.01576
Epoch: 12123, Training Loss: 0.01210
Epoch: 12124, Training Loss: 0.01380
Epoch: 12124, Training Loss: 0.01275
Epoch: 12124, Training Loss: 0.01575
Epoch: 12124, Training Loss: 0.01210
Epoch: 12125, Training Loss: 0.01380
Epoch: 12125, Training Loss: 0.01275
Epoch: 12125, Training Loss: 0.01575
Epoch: 12125, Training Loss: 0.01210
Epoch: 12126, Training Loss: 0.01379
Epoch: 12126, Training Loss: 0.01275
Epoch: 12126, Training Loss: 0.01575
Epoch: 12126, Training Loss: 0.01210
Epoch: 12127, Training Loss: 0.01379
Epoch: 12127, Training Loss: 0.01275
Epoch: 12127, Training Loss: 0.01575
Epoch: 12127, Training Loss: 0.01210
Epoch: 12128, Training Loss: 0.01379
Epoch: 12128, Training Loss: 0.01275
Epoch: 12128, Training Loss: 0.01575
Epoch: 12128, Training Loss: 0.01210
Epoch: 12129, Training Loss: 0.01379
Epoch: 12129, Training Loss: 0.01275
Epoch: 12129, Training Loss: 0.01575
Epoch: 12129, Training Loss: 0.01210
Epoch: 12130, Training Loss: 0.01379
Epoch: 12130, Training Loss: 0.01275
Epoch: 12130, Training Loss: 0.01575
Epoch: 12130, Training Loss: 0.01210
Epoch: 12131, Training Loss: 0.01379
Epoch: 12131, Training Loss: 0.01275
Epoch: 12131, Training Loss: 0.01575
Epoch: 12131, Training Loss: 0.01210
Epoch: 12132, Training Loss: 0.01379
Epoch: 12132, Training Loss: 0.01275
Epoch: 12132, Training Loss: 0.01575
Epoch: 12132, Training Loss: 0.01210
Epoch: 12133, Training Loss: 0.01379
Epoch: 12133, Training Loss: 0.01274
Epoch: 12133, Training Loss: 0.01575
Epoch: 12133, Training Loss: 0.01210
Epoch: 12134, Training Loss: 0.01379
Epoch: 12134, Training Loss: 0.01274
Epoch: 12134, Training Loss: 0.01575
Epoch: 12134, Training Loss: 0.01209
Epoch: 12135, Training Loss: 0.01379
Epoch: 12135, Training Loss: 0.01274
Epoch: 12135, Training Loss: 0.01575
Epoch: 12135, Training Loss: 0.01209
Epoch: 12136, Training Loss: 0.01379
Epoch: 12136, Training Loss: 0.01274
Epoch: 12136, Training Loss: 0.01575
Epoch: 12136, Training Loss: 0.01209
Epoch: 12137, Training Loss: 0.01379
Epoch: 12137, Training Loss: 0.01274
Epoch: 12137, Training Loss: 0.01575
Epoch: 12137, Training Loss: 0.01209
Epoch: 12138, Training Loss: 0.01379
Epoch: 12138, Training Loss: 0.01274
Epoch: 12138, Training Loss: 0.01574
Epoch: 12138, Training Loss: 0.01209
Epoch: 12139, Training Loss: 0.01379
Epoch: 12139, Training Loss: 0.01274
Epoch: 12139, Training Loss: 0.01574
Epoch: 12139, Training Loss: 0.01209
Epoch: 12140, Training Loss: 0.01379
Epoch: 12140, Training Loss: 0.01274
Epoch: 12140, Training Loss: 0.01574
Epoch: 12140, Training Loss: 0.01209
Epoch: 12141, Training Loss: 0.01378
Epoch: 12141, Training Loss: 0.01274
Epoch: 12141, Training Loss: 0.01574
Epoch: 12141, Training Loss: 0.01209
Epoch: 12142, Training Loss: 0.01378
Epoch: 12142, Training Loss: 0.01274
Epoch: 12142, Training Loss: 0.01574
Epoch: 12142, Training Loss: 0.01209
Epoch: 12143, Training Loss: 0.01378
Epoch: 12143, Training Loss: 0.01274
Epoch: 12143, Training Loss: 0.01574
Epoch: 12143, Training Loss: 0.01209
Epoch: 12144, Training Loss: 0.01378
Epoch: 12144, Training Loss: 0.01274
Epoch: 12144, Training Loss: 0.01574
Epoch: 12144, Training Loss: 0.01209
Epoch: 12145, Training Loss: 0.01378
Epoch: 12145, Training Loss: 0.01274
Epoch: 12145, Training Loss: 0.01574
Epoch: 12145, Training Loss: 0.01209
Epoch: 12146, Training Loss: 0.01378
Epoch: 12146, Training Loss: 0.01274
Epoch: 12146, Training Loss: 0.01574
Epoch: 12146, Training Loss: 0.01209
Epoch: 12147, Training Loss: 0.01378
Epoch: 12147, Training Loss: 0.01274
Epoch: 12147, Training Loss: 0.01574
Epoch: 12147, Training Loss: 0.01209
Epoch: 12148, Training Loss: 0.01378
Epoch: 12148, Training Loss: 0.01274
Epoch: 12148, Training Loss: 0.01574
Epoch: 12148, Training Loss: 0.01209
Epoch: 12149, Training Loss: 0.01378
Epoch: 12149, Training Loss: 0.01274
Epoch: 12149, Training Loss: 0.01574
Epoch: 12149, Training Loss: 0.01209
Epoch: 12150, Training Loss: 0.01378
Epoch: 12150, Training Loss: 0.01273
Epoch: 12150, Training Loss: 0.01574
Epoch: 12150, Training Loss: 0.01209
Epoch: 12151, Training Loss: 0.01378
Epoch: 12151, Training Loss: 0.01273
Epoch: 12151, Training Loss: 0.01574
Epoch: 12151, Training Loss: 0.01209
[... output truncated: four per-sample losses are printed every epoch and decrease very slowly (roughly 0.0157/0.0138/0.0127/0.0121 at epoch 12151 down to 0.0156/0.0136/0.0126/0.0120 by epoch 12393) ...]
Epoch: 12393, Training Loss: 0.01363
Epoch: 12393, Training Loss: 0.01260
Epoch: 12393, Training Loss: 0.01556
Epoch: 12393, Training Loss: 0.01196
Epoch: 12394, Training Loss: 0.01363
Epoch: 12394, Training Loss: 0.01260
Epoch: 12394, Training Loss: 0.01556
Epoch: 12394, Training Loss: 0.01196
Epoch: 12395, Training Loss: 0.01363
Epoch: 12395, Training Loss: 0.01260
Epoch: 12395, Training Loss: 0.01556
Epoch: 12395, Training Loss: 0.01195
Epoch: 12396, Training Loss: 0.01363
Epoch: 12396, Training Loss: 0.01260
Epoch: 12396, Training Loss: 0.01556
Epoch: 12396, Training Loss: 0.01195
Epoch: 12397, Training Loss: 0.01363
Epoch: 12397, Training Loss: 0.01260
Epoch: 12397, Training Loss: 0.01556
Epoch: 12397, Training Loss: 0.01195
Epoch: 12398, Training Loss: 0.01362
Epoch: 12398, Training Loss: 0.01260
Epoch: 12398, Training Loss: 0.01556
Epoch: 12398, Training Loss: 0.01195
Epoch: 12399, Training Loss: 0.01362
Epoch: 12399, Training Loss: 0.01260
Epoch: 12399, Training Loss: 0.01556
Epoch: 12399, Training Loss: 0.01195
Epoch: 12400, Training Loss: 0.01362
Epoch: 12400, Training Loss: 0.01259
Epoch: 12400, Training Loss: 0.01556
Epoch: 12400, Training Loss: 0.01195
Epoch: 12401, Training Loss: 0.01362
Epoch: 12401, Training Loss: 0.01259
Epoch: 12401, Training Loss: 0.01556
Epoch: 12401, Training Loss: 0.01195
Epoch: 12402, Training Loss: 0.01362
Epoch: 12402, Training Loss: 0.01259
Epoch: 12402, Training Loss: 0.01556
Epoch: 12402, Training Loss: 0.01195
Epoch: 12403, Training Loss: 0.01362
Epoch: 12403, Training Loss: 0.01259
Epoch: 12403, Training Loss: 0.01556
Epoch: 12403, Training Loss: 0.01195
Epoch: 12404, Training Loss: 0.01362
Epoch: 12404, Training Loss: 0.01259
Epoch: 12404, Training Loss: 0.01555
Epoch: 12404, Training Loss: 0.01195
Epoch: 12405, Training Loss: 0.01362
Epoch: 12405, Training Loss: 0.01259
Epoch: 12405, Training Loss: 0.01555
Epoch: 12405, Training Loss: 0.01195
Epoch: 12406, Training Loss: 0.01362
Epoch: 12406, Training Loss: 0.01259
Epoch: 12406, Training Loss: 0.01555
Epoch: 12406, Training Loss: 0.01195
Epoch: 12407, Training Loss: 0.01362
Epoch: 12407, Training Loss: 0.01259
Epoch: 12407, Training Loss: 0.01555
Epoch: 12407, Training Loss: 0.01195
Epoch: 12408, Training Loss: 0.01362
Epoch: 12408, Training Loss: 0.01259
Epoch: 12408, Training Loss: 0.01555
Epoch: 12408, Training Loss: 0.01195
Epoch: 12409, Training Loss: 0.01362
Epoch: 12409, Training Loss: 0.01259
Epoch: 12409, Training Loss: 0.01555
Epoch: 12409, Training Loss: 0.01195
Epoch: 12410, Training Loss: 0.01362
Epoch: 12410, Training Loss: 0.01259
Epoch: 12410, Training Loss: 0.01555
Epoch: 12410, Training Loss: 0.01195
Epoch: 12411, Training Loss: 0.01362
Epoch: 12411, Training Loss: 0.01259
Epoch: 12411, Training Loss: 0.01555
Epoch: 12411, Training Loss: 0.01195
Epoch: 12412, Training Loss: 0.01362
Epoch: 12412, Training Loss: 0.01259
Epoch: 12412, Training Loss: 0.01555
Epoch: 12412, Training Loss: 0.01195
Epoch: 12413, Training Loss: 0.01362
Epoch: 12413, Training Loss: 0.01259
Epoch: 12413, Training Loss: 0.01555
Epoch: 12413, Training Loss: 0.01195
Epoch: 12414, Training Loss: 0.01362
Epoch: 12414, Training Loss: 0.01259
Epoch: 12414, Training Loss: 0.01555
Epoch: 12414, Training Loss: 0.01194
Epoch: 12415, Training Loss: 0.01361
Epoch: 12415, Training Loss: 0.01259
Epoch: 12415, Training Loss: 0.01555
Epoch: 12415, Training Loss: 0.01194
Epoch: 12416, Training Loss: 0.01361
Epoch: 12416, Training Loss: 0.01259
Epoch: 12416, Training Loss: 0.01555
Epoch: 12416, Training Loss: 0.01194
Epoch: 12417, Training Loss: 0.01361
Epoch: 12417, Training Loss: 0.01259
Epoch: 12417, Training Loss: 0.01555
Epoch: 12417, Training Loss: 0.01194
Epoch: 12418, Training Loss: 0.01361
Epoch: 12418, Training Loss: 0.01258
Epoch: 12418, Training Loss: 0.01554
Epoch: 12418, Training Loss: 0.01194
Epoch: 12419, Training Loss: 0.01361
Epoch: 12419, Training Loss: 0.01258
Epoch: 12419, Training Loss: 0.01554
Epoch: 12419, Training Loss: 0.01194
Epoch: 12420, Training Loss: 0.01361
Epoch: 12420, Training Loss: 0.01258
Epoch: 12420, Training Loss: 0.01554
Epoch: 12420, Training Loss: 0.01194
Epoch: 12421, Training Loss: 0.01361
Epoch: 12421, Training Loss: 0.01258
Epoch: 12421, Training Loss: 0.01554
Epoch: 12421, Training Loss: 0.01194
Epoch: 12422, Training Loss: 0.01361
Epoch: 12422, Training Loss: 0.01258
Epoch: 12422, Training Loss: 0.01554
Epoch: 12422, Training Loss: 0.01194
Epoch: 12423, Training Loss: 0.01361
Epoch: 12423, Training Loss: 0.01258
Epoch: 12423, Training Loss: 0.01554
Epoch: 12423, Training Loss: 0.01194
Epoch: 12424, Training Loss: 0.01361
Epoch: 12424, Training Loss: 0.01258
Epoch: 12424, Training Loss: 0.01554
Epoch: 12424, Training Loss: 0.01194
Epoch: 12425, Training Loss: 0.01361
Epoch: 12425, Training Loss: 0.01258
Epoch: 12425, Training Loss: 0.01554
Epoch: 12425, Training Loss: 0.01194
Epoch: 12426, Training Loss: 0.01361
Epoch: 12426, Training Loss: 0.01258
Epoch: 12426, Training Loss: 0.01554
Epoch: 12426, Training Loss: 0.01194
Epoch: 12427, Training Loss: 0.01361
Epoch: 12427, Training Loss: 0.01258
Epoch: 12427, Training Loss: 0.01554
Epoch: 12427, Training Loss: 0.01194
Epoch: 12428, Training Loss: 0.01361
Epoch: 12428, Training Loss: 0.01258
Epoch: 12428, Training Loss: 0.01554
Epoch: 12428, Training Loss: 0.01194
Epoch: 12429, Training Loss: 0.01361
Epoch: 12429, Training Loss: 0.01258
Epoch: 12429, Training Loss: 0.01554
Epoch: 12429, Training Loss: 0.01194
Epoch: 12430, Training Loss: 0.01361
Epoch: 12430, Training Loss: 0.01258
Epoch: 12430, Training Loss: 0.01554
Epoch: 12430, Training Loss: 0.01194
Epoch: 12431, Training Loss: 0.01360
Epoch: 12431, Training Loss: 0.01258
Epoch: 12431, Training Loss: 0.01554
Epoch: 12431, Training Loss: 0.01194
Epoch: 12432, Training Loss: 0.01360
Epoch: 12432, Training Loss: 0.01258
Epoch: 12432, Training Loss: 0.01554
Epoch: 12432, Training Loss: 0.01194
Epoch: 12433, Training Loss: 0.01360
Epoch: 12433, Training Loss: 0.01258
Epoch: 12433, Training Loss: 0.01553
Epoch: 12433, Training Loss: 0.01193
Epoch: 12434, Training Loss: 0.01360
Epoch: 12434, Training Loss: 0.01258
Epoch: 12434, Training Loss: 0.01553
Epoch: 12434, Training Loss: 0.01193
Epoch: 12435, Training Loss: 0.01360
Epoch: 12435, Training Loss: 0.01258
Epoch: 12435, Training Loss: 0.01553
Epoch: 12435, Training Loss: 0.01193
Epoch: 12436, Training Loss: 0.01360
Epoch: 12436, Training Loss: 0.01257
Epoch: 12436, Training Loss: 0.01553
Epoch: 12436, Training Loss: 0.01193
Epoch: 12437, Training Loss: 0.01360
Epoch: 12437, Training Loss: 0.01257
Epoch: 12437, Training Loss: 0.01553
Epoch: 12437, Training Loss: 0.01193
Epoch: 12438, Training Loss: 0.01360
Epoch: 12438, Training Loss: 0.01257
Epoch: 12438, Training Loss: 0.01553
Epoch: 12438, Training Loss: 0.01193
Epoch: 12439, Training Loss: 0.01360
Epoch: 12439, Training Loss: 0.01257
Epoch: 12439, Training Loss: 0.01553
Epoch: 12439, Training Loss: 0.01193
Epoch: 12440, Training Loss: 0.01360
Epoch: 12440, Training Loss: 0.01257
Epoch: 12440, Training Loss: 0.01553
Epoch: 12440, Training Loss: 0.01193
Epoch: 12441, Training Loss: 0.01360
Epoch: 12441, Training Loss: 0.01257
Epoch: 12441, Training Loss: 0.01553
Epoch: 12441, Training Loss: 0.01193
Epoch: 12442, Training Loss: 0.01360
Epoch: 12442, Training Loss: 0.01257
Epoch: 12442, Training Loss: 0.01553
Epoch: 12442, Training Loss: 0.01193
Epoch: 12443, Training Loss: 0.01360
Epoch: 12443, Training Loss: 0.01257
Epoch: 12443, Training Loss: 0.01553
Epoch: 12443, Training Loss: 0.01193
Epoch: 12444, Training Loss: 0.01360
Epoch: 12444, Training Loss: 0.01257
Epoch: 12444, Training Loss: 0.01553
Epoch: 12444, Training Loss: 0.01193
Epoch: 12445, Training Loss: 0.01360
Epoch: 12445, Training Loss: 0.01257
Epoch: 12445, Training Loss: 0.01553
Epoch: 12445, Training Loss: 0.01193
Epoch: 12446, Training Loss: 0.01360
Epoch: 12446, Training Loss: 0.01257
Epoch: 12446, Training Loss: 0.01553
Epoch: 12446, Training Loss: 0.01193
Epoch: 12447, Training Loss: 0.01359
Epoch: 12447, Training Loss: 0.01257
Epoch: 12447, Training Loss: 0.01552
Epoch: 12447, Training Loss: 0.01193
Epoch: 12448, Training Loss: 0.01359
Epoch: 12448, Training Loss: 0.01257
Epoch: 12448, Training Loss: 0.01552
Epoch: 12448, Training Loss: 0.01193
Epoch: 12449, Training Loss: 0.01359
Epoch: 12449, Training Loss: 0.01257
Epoch: 12449, Training Loss: 0.01552
Epoch: 12449, Training Loss: 0.01193
Epoch: 12450, Training Loss: 0.01359
Epoch: 12450, Training Loss: 0.01257
Epoch: 12450, Training Loss: 0.01552
Epoch: 12450, Training Loss: 0.01193
Epoch: 12451, Training Loss: 0.01359
Epoch: 12451, Training Loss: 0.01257
Epoch: 12451, Training Loss: 0.01552
Epoch: 12451, Training Loss: 0.01193
Epoch: 12452, Training Loss: 0.01359
Epoch: 12452, Training Loss: 0.01257
Epoch: 12452, Training Loss: 0.01552
Epoch: 12452, Training Loss: 0.01192
Epoch: 12453, Training Loss: 0.01359
Epoch: 12453, Training Loss: 0.01257
Epoch: 12453, Training Loss: 0.01552
Epoch: 12453, Training Loss: 0.01192
Epoch: 12454, Training Loss: 0.01359
Epoch: 12454, Training Loss: 0.01257
Epoch: 12454, Training Loss: 0.01552
Epoch: 12454, Training Loss: 0.01192
Epoch: 12455, Training Loss: 0.01359
Epoch: 12455, Training Loss: 0.01256
Epoch: 12455, Training Loss: 0.01552
Epoch: 12455, Training Loss: 0.01192
Epoch: 12456, Training Loss: 0.01359
Epoch: 12456, Training Loss: 0.01256
Epoch: 12456, Training Loss: 0.01552
Epoch: 12456, Training Loss: 0.01192
Epoch: 12457, Training Loss: 0.01359
Epoch: 12457, Training Loss: 0.01256
Epoch: 12457, Training Loss: 0.01552
Epoch: 12457, Training Loss: 0.01192
Epoch: 12458, Training Loss: 0.01359
Epoch: 12458, Training Loss: 0.01256
Epoch: 12458, Training Loss: 0.01552
Epoch: 12458, Training Loss: 0.01192
Epoch: 12459, Training Loss: 0.01359
Epoch: 12459, Training Loss: 0.01256
Epoch: 12459, Training Loss: 0.01552
Epoch: 12459, Training Loss: 0.01192
Epoch: 12460, Training Loss: 0.01359
Epoch: 12460, Training Loss: 0.01256
Epoch: 12460, Training Loss: 0.01552
Epoch: 12460, Training Loss: 0.01192
Epoch: 12461, Training Loss: 0.01359
Epoch: 12461, Training Loss: 0.01256
Epoch: 12461, Training Loss: 0.01551
Epoch: 12461, Training Loss: 0.01192
Epoch: 12462, Training Loss: 0.01359
Epoch: 12462, Training Loss: 0.01256
Epoch: 12462, Training Loss: 0.01551
Epoch: 12462, Training Loss: 0.01192
Epoch: 12463, Training Loss: 0.01359
Epoch: 12463, Training Loss: 0.01256
Epoch: 12463, Training Loss: 0.01551
Epoch: 12463, Training Loss: 0.01192
Epoch: 12464, Training Loss: 0.01358
Epoch: 12464, Training Loss: 0.01256
Epoch: 12464, Training Loss: 0.01551
Epoch: 12464, Training Loss: 0.01192
Epoch: 12465, Training Loss: 0.01358
Epoch: 12465, Training Loss: 0.01256
Epoch: 12465, Training Loss: 0.01551
Epoch: 12465, Training Loss: 0.01192
Epoch: 12466, Training Loss: 0.01358
Epoch: 12466, Training Loss: 0.01256
Epoch: 12466, Training Loss: 0.01551
Epoch: 12466, Training Loss: 0.01192
Epoch: 12467, Training Loss: 0.01358
Epoch: 12467, Training Loss: 0.01256
Epoch: 12467, Training Loss: 0.01551
Epoch: 12467, Training Loss: 0.01192
Epoch: 12468, Training Loss: 0.01358
Epoch: 12468, Training Loss: 0.01256
Epoch: 12468, Training Loss: 0.01551
Epoch: 12468, Training Loss: 0.01192
Epoch: 12469, Training Loss: 0.01358
Epoch: 12469, Training Loss: 0.01256
Epoch: 12469, Training Loss: 0.01551
Epoch: 12469, Training Loss: 0.01192
Epoch: 12470, Training Loss: 0.01358
Epoch: 12470, Training Loss: 0.01256
Epoch: 12470, Training Loss: 0.01551
Epoch: 12470, Training Loss: 0.01192
Epoch: 12471, Training Loss: 0.01358
Epoch: 12471, Training Loss: 0.01256
Epoch: 12471, Training Loss: 0.01551
Epoch: 12471, Training Loss: 0.01191
Epoch: 12472, Training Loss: 0.01358
Epoch: 12472, Training Loss: 0.01256
Epoch: 12472, Training Loss: 0.01551
Epoch: 12472, Training Loss: 0.01191
Epoch: 12473, Training Loss: 0.01358
Epoch: 12473, Training Loss: 0.01255
Epoch: 12473, Training Loss: 0.01551
Epoch: 12473, Training Loss: 0.01191
Epoch: 12474, Training Loss: 0.01358
Epoch: 12474, Training Loss: 0.01255
Epoch: 12474, Training Loss: 0.01551
Epoch: 12474, Training Loss: 0.01191
Epoch: 12475, Training Loss: 0.01358
Epoch: 12475, Training Loss: 0.01255
Epoch: 12475, Training Loss: 0.01551
Epoch: 12475, Training Loss: 0.01191
Epoch: 12476, Training Loss: 0.01358
Epoch: 12476, Training Loss: 0.01255
Epoch: 12476, Training Loss: 0.01550
Epoch: 12476, Training Loss: 0.01191
Epoch: 12477, Training Loss: 0.01358
Epoch: 12477, Training Loss: 0.01255
Epoch: 12477, Training Loss: 0.01550
Epoch: 12477, Training Loss: 0.01191
Epoch: 12478, Training Loss: 0.01358
Epoch: 12478, Training Loss: 0.01255
Epoch: 12478, Training Loss: 0.01550
Epoch: 12478, Training Loss: 0.01191
Epoch: 12479, Training Loss: 0.01358
Epoch: 12479, Training Loss: 0.01255
Epoch: 12479, Training Loss: 0.01550
Epoch: 12479, Training Loss: 0.01191
Epoch: 12480, Training Loss: 0.01357
Epoch: 12480, Training Loss: 0.01255
Epoch: 12480, Training Loss: 0.01550
Epoch: 12480, Training Loss: 0.01191
Epoch: 12481, Training Loss: 0.01357
Epoch: 12481, Training Loss: 0.01255
Epoch: 12481, Training Loss: 0.01550
Epoch: 12481, Training Loss: 0.01191
Epoch: 12482, Training Loss: 0.01357
Epoch: 12482, Training Loss: 0.01255
Epoch: 12482, Training Loss: 0.01550
Epoch: 12482, Training Loss: 0.01191
Epoch: 12483, Training Loss: 0.01357
Epoch: 12483, Training Loss: 0.01255
Epoch: 12483, Training Loss: 0.01550
Epoch: 12483, Training Loss: 0.01191
Epoch: 12484, Training Loss: 0.01357
Epoch: 12484, Training Loss: 0.01255
Epoch: 12484, Training Loss: 0.01550
Epoch: 12484, Training Loss: 0.01191
Epoch: 12485, Training Loss: 0.01357
Epoch: 12485, Training Loss: 0.01255
Epoch: 12485, Training Loss: 0.01550
Epoch: 12485, Training Loss: 0.01191
Epoch: 12486, Training Loss: 0.01357
Epoch: 12486, Training Loss: 0.01255
Epoch: 12486, Training Loss: 0.01550
Epoch: 12486, Training Loss: 0.01191
Epoch: 12487, Training Loss: 0.01357
Epoch: 12487, Training Loss: 0.01255
Epoch: 12487, Training Loss: 0.01550
Epoch: 12487, Training Loss: 0.01191
Epoch: 12488, Training Loss: 0.01357
Epoch: 12488, Training Loss: 0.01255
Epoch: 12488, Training Loss: 0.01550
Epoch: 12488, Training Loss: 0.01191
Epoch: 12489, Training Loss: 0.01357
Epoch: 12489, Training Loss: 0.01255
Epoch: 12489, Training Loss: 0.01550
Epoch: 12489, Training Loss: 0.01191
Epoch: 12490, Training Loss: 0.01357
Epoch: 12490, Training Loss: 0.01255
Epoch: 12490, Training Loss: 0.01549
Epoch: 12490, Training Loss: 0.01190
Epoch: 12491, Training Loss: 0.01357
Epoch: 12491, Training Loss: 0.01254
Epoch: 12491, Training Loss: 0.01549
Epoch: 12491, Training Loss: 0.01190
Epoch: 12492, Training Loss: 0.01357
Epoch: 12492, Training Loss: 0.01254
Epoch: 12492, Training Loss: 0.01549
Epoch: 12492, Training Loss: 0.01190
Epoch: 12493, Training Loss: 0.01357
Epoch: 12493, Training Loss: 0.01254
Epoch: 12493, Training Loss: 0.01549
Epoch: 12493, Training Loss: 0.01190
Epoch: 12494, Training Loss: 0.01357
Epoch: 12494, Training Loss: 0.01254
Epoch: 12494, Training Loss: 0.01549
Epoch: 12494, Training Loss: 0.01190
Epoch: 12495, Training Loss: 0.01357
Epoch: 12495, Training Loss: 0.01254
Epoch: 12495, Training Loss: 0.01549
Epoch: 12495, Training Loss: 0.01190
Epoch: 12496, Training Loss: 0.01357
Epoch: 12496, Training Loss: 0.01254
Epoch: 12496, Training Loss: 0.01549
Epoch: 12496, Training Loss: 0.01190
Epoch: 12497, Training Loss: 0.01356
Epoch: 12497, Training Loss: 0.01254
Epoch: 12497, Training Loss: 0.01549
Epoch: 12497, Training Loss: 0.01190
Epoch: 12498, Training Loss: 0.01356
Epoch: 12498, Training Loss: 0.01254
Epoch: 12498, Training Loss: 0.01549
Epoch: 12498, Training Loss: 0.01190
Epoch: 12499, Training Loss: 0.01356
Epoch: 12499, Training Loss: 0.01254
Epoch: 12499, Training Loss: 0.01549
Epoch: 12499, Training Loss: 0.01190
Epoch: 12500, Training Loss: 0.01356
Epoch: 12500, Training Loss: 0.01254
Epoch: 12500, Training Loss: 0.01549
Epoch: 12500, Training Loss: 0.01190
Epoch: 12501, Training Loss: 0.01356
Epoch: 12501, Training Loss: 0.01254
Epoch: 12501, Training Loss: 0.01549
Epoch: 12501, Training Loss: 0.01190
Epoch: 12502, Training Loss: 0.01356
Epoch: 12502, Training Loss: 0.01254
Epoch: 12502, Training Loss: 0.01549
Epoch: 12502, Training Loss: 0.01190
Epoch: 12503, Training Loss: 0.01356
Epoch: 12503, Training Loss: 0.01254
Epoch: 12503, Training Loss: 0.01549
Epoch: 12503, Training Loss: 0.01190
Epoch: 12504, Training Loss: 0.01356
Epoch: 12504, Training Loss: 0.01254
Epoch: 12504, Training Loss: 0.01549
Epoch: 12504, Training Loss: 0.01190
Epoch: 12505, Training Loss: 0.01356
Epoch: 12505, Training Loss: 0.01254
Epoch: 12505, Training Loss: 0.01548
Epoch: 12505, Training Loss: 0.01190
Epoch: 12506, Training Loss: 0.01356
Epoch: 12506, Training Loss: 0.01254
Epoch: 12506, Training Loss: 0.01548
Epoch: 12506, Training Loss: 0.01190
Epoch: 12507, Training Loss: 0.01356
Epoch: 12507, Training Loss: 0.01254
Epoch: 12507, Training Loss: 0.01548
Epoch: 12507, Training Loss: 0.01190
Epoch: 12508, Training Loss: 0.01356
Epoch: 12508, Training Loss: 0.01254
Epoch: 12508, Training Loss: 0.01548
Epoch: 12508, Training Loss: 0.01190
Epoch: 12509, Training Loss: 0.01356
Epoch: 12509, Training Loss: 0.01254
Epoch: 12509, Training Loss: 0.01548
Epoch: 12509, Training Loss: 0.01190
Epoch: 12510, Training Loss: 0.01356
Epoch: 12510, Training Loss: 0.01253
Epoch: 12510, Training Loss: 0.01548
Epoch: 12510, Training Loss: 0.01189
Epoch: 12511, Training Loss: 0.01356
Epoch: 12511, Training Loss: 0.01253
Epoch: 12511, Training Loss: 0.01548
Epoch: 12511, Training Loss: 0.01189
Epoch: 12512, Training Loss: 0.01356
Epoch: 12512, Training Loss: 0.01253
Epoch: 12512, Training Loss: 0.01548
Epoch: 12512, Training Loss: 0.01189
Epoch: 12513, Training Loss: 0.01355
Epoch: 12513, Training Loss: 0.01253
Epoch: 12513, Training Loss: 0.01548
Epoch: 12513, Training Loss: 0.01189
Epoch: 12514, Training Loss: 0.01355
Epoch: 12514, Training Loss: 0.01253
Epoch: 12514, Training Loss: 0.01548
Epoch: 12514, Training Loss: 0.01189
Epoch: 12515, Training Loss: 0.01355
Epoch: 12515, Training Loss: 0.01253
Epoch: 12515, Training Loss: 0.01548
Epoch: 12515, Training Loss: 0.01189
Epoch: 12516, Training Loss: 0.01355
Epoch: 12516, Training Loss: 0.01253
Epoch: 12516, Training Loss: 0.01548
Epoch: 12516, Training Loss: 0.01189
Epoch: 12517, Training Loss: 0.01355
Epoch: 12517, Training Loss: 0.01253
Epoch: 12517, Training Loss: 0.01548
Epoch: 12517, Training Loss: 0.01189
Epoch: 12518, Training Loss: 0.01355
Epoch: 12518, Training Loss: 0.01253
Epoch: 12518, Training Loss: 0.01548
Epoch: 12518, Training Loss: 0.01189
Epoch: 12519, Training Loss: 0.01355
Epoch: 12519, Training Loss: 0.01253
Epoch: 12519, Training Loss: 0.01547
Epoch: 12519, Training Loss: 0.01189
Epoch: 12520, Training Loss: 0.01355
Epoch: 12520, Training Loss: 0.01253
Epoch: 12520, Training Loss: 0.01547
Epoch: 12520, Training Loss: 0.01189
Epoch: 12521, Training Loss: 0.01355
Epoch: 12521, Training Loss: 0.01253
Epoch: 12521, Training Loss: 0.01547
Epoch: 12521, Training Loss: 0.01189
Epoch: 12522, Training Loss: 0.01355
Epoch: 12522, Training Loss: 0.01253
Epoch: 12522, Training Loss: 0.01547
Epoch: 12522, Training Loss: 0.01189
Epoch: 12523, Training Loss: 0.01355
Epoch: 12523, Training Loss: 0.01253
Epoch: 12523, Training Loss: 0.01547
Epoch: 12523, Training Loss: 0.01189
Epoch: 12524, Training Loss: 0.01355
Epoch: 12524, Training Loss: 0.01253
Epoch: 12524, Training Loss: 0.01547
Epoch: 12524, Training Loss: 0.01189
Epoch: 12525, Training Loss: 0.01355
Epoch: 12525, Training Loss: 0.01253
Epoch: 12525, Training Loss: 0.01547
Epoch: 12525, Training Loss: 0.01189
Epoch: 12526, Training Loss: 0.01355
Epoch: 12526, Training Loss: 0.01253
Epoch: 12526, Training Loss: 0.01547
Epoch: 12526, Training Loss: 0.01189
Epoch: 12527, Training Loss: 0.01355
Epoch: 12527, Training Loss: 0.01253
Epoch: 12527, Training Loss: 0.01547
Epoch: 12527, Training Loss: 0.01189
Epoch: 12528, Training Loss: 0.01355
Epoch: 12528, Training Loss: 0.01252
Epoch: 12528, Training Loss: 0.01547
Epoch: 12528, Training Loss: 0.01189
Epoch: 12529, Training Loss: 0.01355
Epoch: 12529, Training Loss: 0.01252
Epoch: 12529, Training Loss: 0.01547
Epoch: 12529, Training Loss: 0.01188
Epoch: 12530, Training Loss: 0.01354
Epoch: 12530, Training Loss: 0.01252
Epoch: 12530, Training Loss: 0.01547
Epoch: 12530, Training Loss: 0.01188
Epoch: 12531, Training Loss: 0.01354
Epoch: 12531, Training Loss: 0.01252
Epoch: 12531, Training Loss: 0.01547
Epoch: 12531, Training Loss: 0.01188
Epoch: 12532, Training Loss: 0.01354
Epoch: 12532, Training Loss: 0.01252
Epoch: 12532, Training Loss: 0.01547
Epoch: 12532, Training Loss: 0.01188
Epoch: 12533, Training Loss: 0.01354
Epoch: 12533, Training Loss: 0.01252
Epoch: 12533, Training Loss: 0.01547
Epoch: 12533, Training Loss: 0.01188
Epoch: 12534, Training Loss: 0.01354
Epoch: 12534, Training Loss: 0.01252
Epoch: 12534, Training Loss: 0.01546
Epoch: 12534, Training Loss: 0.01188
Epoch: 12535, Training Loss: 0.01354
Epoch: 12535, Training Loss: 0.01252
Epoch: 12535, Training Loss: 0.01546
Epoch: 12535, Training Loss: 0.01188
Epoch: 12536, Training Loss: 0.01354
Epoch: 12536, Training Loss: 0.01252
Epoch: 12536, Training Loss: 0.01546
Epoch: 12536, Training Loss: 0.01188
Epoch: 12537, Training Loss: 0.01354
Epoch: 12537, Training Loss: 0.01252
Epoch: 12537, Training Loss: 0.01546
Epoch: 12537, Training Loss: 0.01188
Epoch: 12538, Training Loss: 0.01354
Epoch: 12538, Training Loss: 0.01252
Epoch: 12538, Training Loss: 0.01546
Epoch: 12538, Training Loss: 0.01188
Epoch: 12539, Training Loss: 0.01354
Epoch: 12539, Training Loss: 0.01252
Epoch: 12539, Training Loss: 0.01546
Epoch: 12539, Training Loss: 0.01188
Epoch: 12540, Training Loss: 0.01354
Epoch: 12540, Training Loss: 0.01252
Epoch: 12540, Training Loss: 0.01546
Epoch: 12540, Training Loss: 0.01188
Epoch: 12541, Training Loss: 0.01354
Epoch: 12541, Training Loss: 0.01252
Epoch: 12541, Training Loss: 0.01546
Epoch: 12541, Training Loss: 0.01188
Epoch: 12542, Training Loss: 0.01354
Epoch: 12542, Training Loss: 0.01252
Epoch: 12542, Training Loss: 0.01546
Epoch: 12542, Training Loss: 0.01188
Epoch: 12543, Training Loss: 0.01354
Epoch: 12543, Training Loss: 0.01252
Epoch: 12543, Training Loss: 0.01546
Epoch: 12543, Training Loss: 0.01188
Epoch: 12544, Training Loss: 0.01354
Epoch: 12544, Training Loss: 0.01252
Epoch: 12544, Training Loss: 0.01546
Epoch: 12544, Training Loss: 0.01188
Epoch: 12545, Training Loss: 0.01354
Epoch: 12545, Training Loss: 0.01252
Epoch: 12545, Training Loss: 0.01546
Epoch: 12545, Training Loss: 0.01188
Epoch: 12546, Training Loss: 0.01354
Epoch: 12546, Training Loss: 0.01251
Epoch: 12546, Training Loss: 0.01546
Epoch: 12546, Training Loss: 0.01188
Epoch: 12547, Training Loss: 0.01353
Epoch: 12547, Training Loss: 0.01251
Epoch: 12547, Training Loss: 0.01546
Epoch: 12547, Training Loss: 0.01188
Epoch: 12548, Training Loss: 0.01353
Epoch: 12548, Training Loss: 0.01251
Epoch: 12548, Training Loss: 0.01545
Epoch: 12548, Training Loss: 0.01187
Epoch: 12549, Training Loss: 0.01353
Epoch: 12549, Training Loss: 0.01251
Epoch: 12549, Training Loss: 0.01545
Epoch: 12549, Training Loss: 0.01187
Epoch: 12550, Training Loss: 0.01353
Epoch: 12550, Training Loss: 0.01251
Epoch: 12550, Training Loss: 0.01545
Epoch: 12550, Training Loss: 0.01187
Epoch: 12551, Training Loss: 0.01353
Epoch: 12551, Training Loss: 0.01251
Epoch: 12551, Training Loss: 0.01545
Epoch: 12551, Training Loss: 0.01187
Epoch: 12552, Training Loss: 0.01353
Epoch: 12552, Training Loss: 0.01251
Epoch: 12552, Training Loss: 0.01545
Epoch: 12552, Training Loss: 0.01187
Epoch: 12553, Training Loss: 0.01353
Epoch: 12553, Training Loss: 0.01251
Epoch: 12553, Training Loss: 0.01545
Epoch: 12553, Training Loss: 0.01187
Epoch: 12554, Training Loss: 0.01353
Epoch: 12554, Training Loss: 0.01251
Epoch: 12554, Training Loss: 0.01545
Epoch: 12554, Training Loss: 0.01187
Epoch: 12555, Training Loss: 0.01353
Epoch: 12555, Training Loss: 0.01251
Epoch: 12555, Training Loss: 0.01545
Epoch: 12555, Training Loss: 0.01187
Epoch: 12556, Training Loss: 0.01353
Epoch: 12556, Training Loss: 0.01251
Epoch: 12556, Training Loss: 0.01545
Epoch: 12556, Training Loss: 0.01187
Epoch: 12557, Training Loss: 0.01353
Epoch: 12557, Training Loss: 0.01251
Epoch: 12557, Training Loss: 0.01545
Epoch: 12557, Training Loss: 0.01187
Epoch: 12558, Training Loss: 0.01353
Epoch: 12558, Training Loss: 0.01251
Epoch: 12558, Training Loss: 0.01545
Epoch: 12558, Training Loss: 0.01187
Epoch: 12559, Training Loss: 0.01353
Epoch: 12559, Training Loss: 0.01251
Epoch: 12559, Training Loss: 0.01545
Epoch: 12559, Training Loss: 0.01187
Epoch: 12560, Training Loss: 0.01353
Epoch: 12560, Training Loss: 0.01251
Epoch: 12560, Training Loss: 0.01545
Epoch: 12560, Training Loss: 0.01187
Epoch: 12561, Training Loss: 0.01353
Epoch: 12561, Training Loss: 0.01251
Epoch: 12561, Training Loss: 0.01545
Epoch: 12561, Training Loss: 0.01187
Epoch: 12562, Training Loss: 0.01353
Epoch: 12562, Training Loss: 0.01251
Epoch: 12562, Training Loss: 0.01545
Epoch: 12562, Training Loss: 0.01187
Epoch: 12563, Training Loss: 0.01352
Epoch: 12563, Training Loss: 0.01251
Epoch: 12563, Training Loss: 0.01544
Epoch: 12563, Training Loss: 0.01187
Epoch: 12564, Training Loss: 0.01352
Epoch: 12564, Training Loss: 0.01251
Epoch: 12564, Training Loss: 0.01544
Epoch: 12564, Training Loss: 0.01187
Epoch: 12565, Training Loss: 0.01352
Epoch: 12565, Training Loss: 0.01250
Epoch: 12565, Training Loss: 0.01544
Epoch: 12565, Training Loss: 0.01187
Epoch: 12566, Training Loss: 0.01352
Epoch: 12566, Training Loss: 0.01250
Epoch: 12566, Training Loss: 0.01544
Epoch: 12566, Training Loss: 0.01187
Epoch: 12567, Training Loss: 0.01352
Epoch: 12567, Training Loss: 0.01250
Epoch: 12567, Training Loss: 0.01544
Epoch: 12567, Training Loss: 0.01187
Epoch: 12568, Training Loss: 0.01352
Epoch: 12568, Training Loss: 0.01250
Epoch: 12568, Training Loss: 0.01544
Epoch: 12568, Training Loss: 0.01186
Epoch: 12569, Training Loss: 0.01352
Epoch: 12569, Training Loss: 0.01250
Epoch: 12569, Training Loss: 0.01544
Epoch: 12569, Training Loss: 0.01186
Epoch: 12570, Training Loss: 0.01352
Epoch: 12570, Training Loss: 0.01250
Epoch: 12570, Training Loss: 0.01544
Epoch: 12570, Training Loss: 0.01186
Epoch: 12571, Training Loss: 0.01352
Epoch: 12571, Training Loss: 0.01250
Epoch: 12571, Training Loss: 0.01544
Epoch: 12571, Training Loss: 0.01186
Epoch: 12572, Training Loss: 0.01352
Epoch: 12572, Training Loss: 0.01250
Epoch: 12572, Training Loss: 0.01544
Epoch: 12572, Training Loss: 0.01186
Epoch: 12573, Training Loss: 0.01352
Epoch: 12573, Training Loss: 0.01250
Epoch: 12573, Training Loss: 0.01544
Epoch: 12573, Training Loss: 0.01186
Epoch: 12574, Training Loss: 0.01352
Epoch: 12574, Training Loss: 0.01250
Epoch: 12574, Training Loss: 0.01544
Epoch: 12574, Training Loss: 0.01186
Epoch: 12575, Training Loss: 0.01352
Epoch: 12575, Training Loss: 0.01250
Epoch: 12575, Training Loss: 0.01544
Epoch: 12575, Training Loss: 0.01186
Epoch: 12576, Training Loss: 0.01352
Epoch: 12576, Training Loss: 0.01250
Epoch: 12576, Training Loss: 0.01544
Epoch: 12576, Training Loss: 0.01186
Epoch: 12577, Training Loss: 0.01352
Epoch: 12577, Training Loss: 0.01250
Epoch: 12577, Training Loss: 0.01543
Epoch: 12577, Training Loss: 0.01186
Epoch: 12578, Training Loss: 0.01352
Epoch: 12578, Training Loss: 0.01250
Epoch: 12578, Training Loss: 0.01543
Epoch: 12578, Training Loss: 0.01186
Epoch: 12579, Training Loss: 0.01352
Epoch: 12579, Training Loss: 0.01250
Epoch: 12579, Training Loss: 0.01543
Epoch: 12579, Training Loss: 0.01186
Epoch: 12580, Training Loss: 0.01351
Epoch: 12580, Training Loss: 0.01250
Epoch: 12580, Training Loss: 0.01543
Epoch: 12580, Training Loss: 0.01186
Epoch: 12581, Training Loss: 0.01351
Epoch: 12581, Training Loss: 0.01250
Epoch: 12581, Training Loss: 0.01543
Epoch: 12581, Training Loss: 0.01186
Epoch: 12582, Training Loss: 0.01351
Epoch: 12582, Training Loss: 0.01250
Epoch: 12582, Training Loss: 0.01543
Epoch: 12582, Training Loss: 0.01186
Epoch: 12583, Training Loss: 0.01351
Epoch: 12583, Training Loss: 0.01250
Epoch: 12583, Training Loss: 0.01543
Epoch: 12583, Training Loss: 0.01186
... [repetitive per-pattern log output for epochs 12584-12825 elided: four loss values, one per XOR input pattern, are printed every epoch and decrease only slowly toward the 0.008 target] ...
Epoch: 12826, Training Loss: 0.01337
Epoch: 12826, Training Loss: 0.01237
Epoch: 12826, Training Loss: 0.01527
Epoch: 12826, Training Loss: 0.01173
Epoch: 12827, Training Loss: 0.01337
Epoch: 12827, Training Loss: 0.01237
Epoch: 12827, Training Loss: 0.01527
Epoch: 12827, Training Loss: 0.01173
Epoch: 12828, Training Loss: 0.01337
Epoch: 12828, Training Loss: 0.01237
Epoch: 12828, Training Loss: 0.01527
Epoch: 12828, Training Loss: 0.01173
Epoch: 12829, Training Loss: 0.01337
Epoch: 12829, Training Loss: 0.01236
Epoch: 12829, Training Loss: 0.01526
Epoch: 12829, Training Loss: 0.01173
Epoch: 12830, Training Loss: 0.01337
Epoch: 12830, Training Loss: 0.01236
Epoch: 12830, Training Loss: 0.01526
Epoch: 12830, Training Loss: 0.01173
Epoch: 12831, Training Loss: 0.01337
Epoch: 12831, Training Loss: 0.01236
Epoch: 12831, Training Loss: 0.01526
Epoch: 12831, Training Loss: 0.01173
Epoch: 12832, Training Loss: 0.01337
Epoch: 12832, Training Loss: 0.01236
Epoch: 12832, Training Loss: 0.01526
Epoch: 12832, Training Loss: 0.01173
Epoch: 12833, Training Loss: 0.01337
Epoch: 12833, Training Loss: 0.01236
Epoch: 12833, Training Loss: 0.01526
Epoch: 12833, Training Loss: 0.01173
Epoch: 12834, Training Loss: 0.01337
Epoch: 12834, Training Loss: 0.01236
Epoch: 12834, Training Loss: 0.01526
Epoch: 12834, Training Loss: 0.01173
Epoch: 12835, Training Loss: 0.01336
Epoch: 12835, Training Loss: 0.01236
Epoch: 12835, Training Loss: 0.01526
Epoch: 12835, Training Loss: 0.01173
Epoch: 12836, Training Loss: 0.01336
Epoch: 12836, Training Loss: 0.01236
Epoch: 12836, Training Loss: 0.01526
Epoch: 12836, Training Loss: 0.01173
Epoch: 12837, Training Loss: 0.01336
Epoch: 12837, Training Loss: 0.01236
Epoch: 12837, Training Loss: 0.01526
Epoch: 12837, Training Loss: 0.01173
Epoch: 12838, Training Loss: 0.01336
Epoch: 12838, Training Loss: 0.01236
Epoch: 12838, Training Loss: 0.01526
Epoch: 12838, Training Loss: 0.01173
Epoch: 12839, Training Loss: 0.01336
Epoch: 12839, Training Loss: 0.01236
Epoch: 12839, Training Loss: 0.01526
Epoch: 12839, Training Loss: 0.01173
Epoch: 12840, Training Loss: 0.01336
Epoch: 12840, Training Loss: 0.01236
Epoch: 12840, Training Loss: 0.01526
Epoch: 12840, Training Loss: 0.01173
Epoch: 12841, Training Loss: 0.01336
Epoch: 12841, Training Loss: 0.01236
Epoch: 12841, Training Loss: 0.01526
Epoch: 12841, Training Loss: 0.01173
Epoch: 12842, Training Loss: 0.01336
Epoch: 12842, Training Loss: 0.01236
Epoch: 12842, Training Loss: 0.01526
Epoch: 12842, Training Loss: 0.01173
Epoch: 12843, Training Loss: 0.01336
Epoch: 12843, Training Loss: 0.01236
Epoch: 12843, Training Loss: 0.01526
Epoch: 12843, Training Loss: 0.01173
Epoch: 12844, Training Loss: 0.01336
Epoch: 12844, Training Loss: 0.01236
Epoch: 12844, Training Loss: 0.01526
Epoch: 12844, Training Loss: 0.01172
Epoch: 12845, Training Loss: 0.01336
Epoch: 12845, Training Loss: 0.01236
Epoch: 12845, Training Loss: 0.01525
Epoch: 12845, Training Loss: 0.01172
Epoch: 12846, Training Loss: 0.01336
Epoch: 12846, Training Loss: 0.01236
Epoch: 12846, Training Loss: 0.01525
Epoch: 12846, Training Loss: 0.01172
Epoch: 12847, Training Loss: 0.01336
Epoch: 12847, Training Loss: 0.01236
Epoch: 12847, Training Loss: 0.01525
Epoch: 12847, Training Loss: 0.01172
Epoch: 12848, Training Loss: 0.01336
Epoch: 12848, Training Loss: 0.01235
Epoch: 12848, Training Loss: 0.01525
Epoch: 12848, Training Loss: 0.01172
Epoch: 12849, Training Loss: 0.01336
Epoch: 12849, Training Loss: 0.01235
Epoch: 12849, Training Loss: 0.01525
Epoch: 12849, Training Loss: 0.01172
Epoch: 12850, Training Loss: 0.01336
Epoch: 12850, Training Loss: 0.01235
Epoch: 12850, Training Loss: 0.01525
Epoch: 12850, Training Loss: 0.01172
Epoch: 12851, Training Loss: 0.01336
Epoch: 12851, Training Loss: 0.01235
Epoch: 12851, Training Loss: 0.01525
Epoch: 12851, Training Loss: 0.01172
Epoch: 12852, Training Loss: 0.01335
Epoch: 12852, Training Loss: 0.01235
Epoch: 12852, Training Loss: 0.01525
Epoch: 12852, Training Loss: 0.01172
Epoch: 12853, Training Loss: 0.01335
Epoch: 12853, Training Loss: 0.01235
Epoch: 12853, Training Loss: 0.01525
Epoch: 12853, Training Loss: 0.01172
Epoch: 12854, Training Loss: 0.01335
Epoch: 12854, Training Loss: 0.01235
Epoch: 12854, Training Loss: 0.01525
Epoch: 12854, Training Loss: 0.01172
Epoch: 12855, Training Loss: 0.01335
Epoch: 12855, Training Loss: 0.01235
Epoch: 12855, Training Loss: 0.01525
Epoch: 12855, Training Loss: 0.01172
Epoch: 12856, Training Loss: 0.01335
Epoch: 12856, Training Loss: 0.01235
Epoch: 12856, Training Loss: 0.01525
Epoch: 12856, Training Loss: 0.01172
Epoch: 12857, Training Loss: 0.01335
Epoch: 12857, Training Loss: 0.01235
Epoch: 12857, Training Loss: 0.01525
Epoch: 12857, Training Loss: 0.01172
Epoch: 12858, Training Loss: 0.01335
Epoch: 12858, Training Loss: 0.01235
Epoch: 12858, Training Loss: 0.01525
Epoch: 12858, Training Loss: 0.01172
Epoch: 12859, Training Loss: 0.01335
Epoch: 12859, Training Loss: 0.01235
Epoch: 12859, Training Loss: 0.01525
Epoch: 12859, Training Loss: 0.01172
Epoch: 12860, Training Loss: 0.01335
Epoch: 12860, Training Loss: 0.01235
Epoch: 12860, Training Loss: 0.01524
Epoch: 12860, Training Loss: 0.01172
Epoch: 12861, Training Loss: 0.01335
Epoch: 12861, Training Loss: 0.01235
Epoch: 12861, Training Loss: 0.01524
Epoch: 12861, Training Loss: 0.01172
Epoch: 12862, Training Loss: 0.01335
Epoch: 12862, Training Loss: 0.01235
Epoch: 12862, Training Loss: 0.01524
Epoch: 12862, Training Loss: 0.01172
Epoch: 12863, Training Loss: 0.01335
Epoch: 12863, Training Loss: 0.01235
Epoch: 12863, Training Loss: 0.01524
Epoch: 12863, Training Loss: 0.01172
Epoch: 12864, Training Loss: 0.01335
Epoch: 12864, Training Loss: 0.01235
Epoch: 12864, Training Loss: 0.01524
Epoch: 12864, Training Loss: 0.01171
Epoch: 12865, Training Loss: 0.01335
Epoch: 12865, Training Loss: 0.01235
Epoch: 12865, Training Loss: 0.01524
Epoch: 12865, Training Loss: 0.01171
Epoch: 12866, Training Loss: 0.01335
Epoch: 12866, Training Loss: 0.01235
Epoch: 12866, Training Loss: 0.01524
Epoch: 12866, Training Loss: 0.01171
Epoch: 12867, Training Loss: 0.01335
Epoch: 12867, Training Loss: 0.01234
Epoch: 12867, Training Loss: 0.01524
Epoch: 12867, Training Loss: 0.01171
Epoch: 12868, Training Loss: 0.01335
Epoch: 12868, Training Loss: 0.01234
Epoch: 12868, Training Loss: 0.01524
Epoch: 12868, Training Loss: 0.01171
Epoch: 12869, Training Loss: 0.01335
Epoch: 12869, Training Loss: 0.01234
Epoch: 12869, Training Loss: 0.01524
Epoch: 12869, Training Loss: 0.01171
Epoch: 12870, Training Loss: 0.01334
Epoch: 12870, Training Loss: 0.01234
Epoch: 12870, Training Loss: 0.01524
Epoch: 12870, Training Loss: 0.01171
Epoch: 12871, Training Loss: 0.01334
Epoch: 12871, Training Loss: 0.01234
Epoch: 12871, Training Loss: 0.01524
Epoch: 12871, Training Loss: 0.01171
Epoch: 12872, Training Loss: 0.01334
Epoch: 12872, Training Loss: 0.01234
Epoch: 12872, Training Loss: 0.01524
Epoch: 12872, Training Loss: 0.01171
Epoch: 12873, Training Loss: 0.01334
Epoch: 12873, Training Loss: 0.01234
Epoch: 12873, Training Loss: 0.01524
Epoch: 12873, Training Loss: 0.01171
Epoch: 12874, Training Loss: 0.01334
Epoch: 12874, Training Loss: 0.01234
Epoch: 12874, Training Loss: 0.01524
Epoch: 12874, Training Loss: 0.01171
Epoch: 12875, Training Loss: 0.01334
Epoch: 12875, Training Loss: 0.01234
Epoch: 12875, Training Loss: 0.01523
Epoch: 12875, Training Loss: 0.01171
Epoch: 12876, Training Loss: 0.01334
Epoch: 12876, Training Loss: 0.01234
Epoch: 12876, Training Loss: 0.01523
Epoch: 12876, Training Loss: 0.01171
Epoch: 12877, Training Loss: 0.01334
Epoch: 12877, Training Loss: 0.01234
Epoch: 12877, Training Loss: 0.01523
Epoch: 12877, Training Loss: 0.01171
Epoch: 12878, Training Loss: 0.01334
Epoch: 12878, Training Loss: 0.01234
Epoch: 12878, Training Loss: 0.01523
Epoch: 12878, Training Loss: 0.01171
Epoch: 12879, Training Loss: 0.01334
Epoch: 12879, Training Loss: 0.01234
Epoch: 12879, Training Loss: 0.01523
Epoch: 12879, Training Loss: 0.01171
Epoch: 12880, Training Loss: 0.01334
Epoch: 12880, Training Loss: 0.01234
Epoch: 12880, Training Loss: 0.01523
Epoch: 12880, Training Loss: 0.01171
Epoch: 12881, Training Loss: 0.01334
Epoch: 12881, Training Loss: 0.01234
Epoch: 12881, Training Loss: 0.01523
Epoch: 12881, Training Loss: 0.01171
Epoch: 12882, Training Loss: 0.01334
Epoch: 12882, Training Loss: 0.01234
Epoch: 12882, Training Loss: 0.01523
Epoch: 12882, Training Loss: 0.01171
Epoch: 12883, Training Loss: 0.01334
Epoch: 12883, Training Loss: 0.01234
Epoch: 12883, Training Loss: 0.01523
Epoch: 12883, Training Loss: 0.01171
Epoch: 12884, Training Loss: 0.01334
Epoch: 12884, Training Loss: 0.01234
Epoch: 12884, Training Loss: 0.01523
Epoch: 12884, Training Loss: 0.01170
Epoch: 12885, Training Loss: 0.01334
Epoch: 12885, Training Loss: 0.01234
Epoch: 12885, Training Loss: 0.01523
Epoch: 12885, Training Loss: 0.01170
Epoch: 12886, Training Loss: 0.01334
Epoch: 12886, Training Loss: 0.01234
Epoch: 12886, Training Loss: 0.01523
Epoch: 12886, Training Loss: 0.01170
Epoch: 12887, Training Loss: 0.01333
Epoch: 12887, Training Loss: 0.01233
Epoch: 12887, Training Loss: 0.01523
Epoch: 12887, Training Loss: 0.01170
Epoch: 12888, Training Loss: 0.01333
Epoch: 12888, Training Loss: 0.01233
Epoch: 12888, Training Loss: 0.01523
Epoch: 12888, Training Loss: 0.01170
Epoch: 12889, Training Loss: 0.01333
Epoch: 12889, Training Loss: 0.01233
Epoch: 12889, Training Loss: 0.01523
Epoch: 12889, Training Loss: 0.01170
Epoch: 12890, Training Loss: 0.01333
Epoch: 12890, Training Loss: 0.01233
Epoch: 12890, Training Loss: 0.01522
Epoch: 12890, Training Loss: 0.01170
Epoch: 12891, Training Loss: 0.01333
Epoch: 12891, Training Loss: 0.01233
Epoch: 12891, Training Loss: 0.01522
Epoch: 12891, Training Loss: 0.01170
Epoch: 12892, Training Loss: 0.01333
Epoch: 12892, Training Loss: 0.01233
Epoch: 12892, Training Loss: 0.01522
Epoch: 12892, Training Loss: 0.01170
Epoch: 12893, Training Loss: 0.01333
Epoch: 12893, Training Loss: 0.01233
Epoch: 12893, Training Loss: 0.01522
Epoch: 12893, Training Loss: 0.01170
Epoch: 12894, Training Loss: 0.01333
Epoch: 12894, Training Loss: 0.01233
Epoch: 12894, Training Loss: 0.01522
Epoch: 12894, Training Loss: 0.01170
Epoch: 12895, Training Loss: 0.01333
Epoch: 12895, Training Loss: 0.01233
Epoch: 12895, Training Loss: 0.01522
Epoch: 12895, Training Loss: 0.01170
Epoch: 12896, Training Loss: 0.01333
Epoch: 12896, Training Loss: 0.01233
Epoch: 12896, Training Loss: 0.01522
Epoch: 12896, Training Loss: 0.01170
Epoch: 12897, Training Loss: 0.01333
Epoch: 12897, Training Loss: 0.01233
Epoch: 12897, Training Loss: 0.01522
Epoch: 12897, Training Loss: 0.01170
Epoch: 12898, Training Loss: 0.01333
Epoch: 12898, Training Loss: 0.01233
Epoch: 12898, Training Loss: 0.01522
Epoch: 12898, Training Loss: 0.01170
Epoch: 12899, Training Loss: 0.01333
Epoch: 12899, Training Loss: 0.01233
Epoch: 12899, Training Loss: 0.01522
Epoch: 12899, Training Loss: 0.01170
Epoch: 12900, Training Loss: 0.01333
Epoch: 12900, Training Loss: 0.01233
Epoch: 12900, Training Loss: 0.01522
Epoch: 12900, Training Loss: 0.01170
Epoch: 12901, Training Loss: 0.01333
Epoch: 12901, Training Loss: 0.01233
Epoch: 12901, Training Loss: 0.01522
Epoch: 12901, Training Loss: 0.01170
Epoch: 12902, Training Loss: 0.01333
Epoch: 12902, Training Loss: 0.01233
Epoch: 12902, Training Loss: 0.01522
Epoch: 12902, Training Loss: 0.01170
Epoch: 12903, Training Loss: 0.01333
Epoch: 12903, Training Loss: 0.01233
Epoch: 12903, Training Loss: 0.01522
Epoch: 12903, Training Loss: 0.01170
Epoch: 12904, Training Loss: 0.01332
Epoch: 12904, Training Loss: 0.01233
Epoch: 12904, Training Loss: 0.01522
Epoch: 12904, Training Loss: 0.01169
Epoch: 12905, Training Loss: 0.01332
Epoch: 12905, Training Loss: 0.01233
Epoch: 12905, Training Loss: 0.01521
Epoch: 12905, Training Loss: 0.01169
Epoch: 12906, Training Loss: 0.01332
Epoch: 12906, Training Loss: 0.01232
Epoch: 12906, Training Loss: 0.01521
Epoch: 12906, Training Loss: 0.01169
Epoch: 12907, Training Loss: 0.01332
Epoch: 12907, Training Loss: 0.01232
Epoch: 12907, Training Loss: 0.01521
Epoch: 12907, Training Loss: 0.01169
Epoch: 12908, Training Loss: 0.01332
Epoch: 12908, Training Loss: 0.01232
Epoch: 12908, Training Loss: 0.01521
Epoch: 12908, Training Loss: 0.01169
Epoch: 12909, Training Loss: 0.01332
Epoch: 12909, Training Loss: 0.01232
Epoch: 12909, Training Loss: 0.01521
Epoch: 12909, Training Loss: 0.01169
Epoch: 12910, Training Loss: 0.01332
Epoch: 12910, Training Loss: 0.01232
Epoch: 12910, Training Loss: 0.01521
Epoch: 12910, Training Loss: 0.01169
Epoch: 12911, Training Loss: 0.01332
Epoch: 12911, Training Loss: 0.01232
Epoch: 12911, Training Loss: 0.01521
Epoch: 12911, Training Loss: 0.01169
Epoch: 12912, Training Loss: 0.01332
Epoch: 12912, Training Loss: 0.01232
Epoch: 12912, Training Loss: 0.01521
Epoch: 12912, Training Loss: 0.01169
Epoch: 12913, Training Loss: 0.01332
Epoch: 12913, Training Loss: 0.01232
Epoch: 12913, Training Loss: 0.01521
Epoch: 12913, Training Loss: 0.01169
Epoch: 12914, Training Loss: 0.01332
Epoch: 12914, Training Loss: 0.01232
Epoch: 12914, Training Loss: 0.01521
Epoch: 12914, Training Loss: 0.01169
Epoch: 12915, Training Loss: 0.01332
Epoch: 12915, Training Loss: 0.01232
Epoch: 12915, Training Loss: 0.01521
Epoch: 12915, Training Loss: 0.01169
Epoch: 12916, Training Loss: 0.01332
Epoch: 12916, Training Loss: 0.01232
Epoch: 12916, Training Loss: 0.01521
Epoch: 12916, Training Loss: 0.01169
Epoch: 12917, Training Loss: 0.01332
Epoch: 12917, Training Loss: 0.01232
Epoch: 12917, Training Loss: 0.01521
Epoch: 12917, Training Loss: 0.01169
Epoch: 12918, Training Loss: 0.01332
Epoch: 12918, Training Loss: 0.01232
Epoch: 12918, Training Loss: 0.01521
Epoch: 12918, Training Loss: 0.01169
Epoch: 12919, Training Loss: 0.01332
Epoch: 12919, Training Loss: 0.01232
Epoch: 12919, Training Loss: 0.01521
Epoch: 12919, Training Loss: 0.01169
Epoch: 12920, Training Loss: 0.01332
Epoch: 12920, Training Loss: 0.01232
Epoch: 12920, Training Loss: 0.01520
Epoch: 12920, Training Loss: 0.01169
Epoch: 12921, Training Loss: 0.01332
Epoch: 12921, Training Loss: 0.01232
Epoch: 12921, Training Loss: 0.01520
Epoch: 12921, Training Loss: 0.01169
Epoch: 12922, Training Loss: 0.01331
Epoch: 12922, Training Loss: 0.01232
Epoch: 12922, Training Loss: 0.01520
Epoch: 12922, Training Loss: 0.01169
Epoch: 12923, Training Loss: 0.01331
Epoch: 12923, Training Loss: 0.01232
Epoch: 12923, Training Loss: 0.01520
Epoch: 12923, Training Loss: 0.01169
Epoch: 12924, Training Loss: 0.01331
Epoch: 12924, Training Loss: 0.01232
Epoch: 12924, Training Loss: 0.01520
Epoch: 12924, Training Loss: 0.01169
Epoch: 12925, Training Loss: 0.01331
Epoch: 12925, Training Loss: 0.01231
Epoch: 12925, Training Loss: 0.01520
Epoch: 12925, Training Loss: 0.01168
Epoch: 12926, Training Loss: 0.01331
Epoch: 12926, Training Loss: 0.01231
Epoch: 12926, Training Loss: 0.01520
Epoch: 12926, Training Loss: 0.01168
Epoch: 12927, Training Loss: 0.01331
Epoch: 12927, Training Loss: 0.01231
Epoch: 12927, Training Loss: 0.01520
Epoch: 12927, Training Loss: 0.01168
Epoch: 12928, Training Loss: 0.01331
Epoch: 12928, Training Loss: 0.01231
Epoch: 12928, Training Loss: 0.01520
Epoch: 12928, Training Loss: 0.01168
Epoch: 12929, Training Loss: 0.01331
Epoch: 12929, Training Loss: 0.01231
Epoch: 12929, Training Loss: 0.01520
Epoch: 12929, Training Loss: 0.01168
Epoch: 12930, Training Loss: 0.01331
Epoch: 12930, Training Loss: 0.01231
Epoch: 12930, Training Loss: 0.01520
Epoch: 12930, Training Loss: 0.01168
Epoch: 12931, Training Loss: 0.01331
Epoch: 12931, Training Loss: 0.01231
Epoch: 12931, Training Loss: 0.01520
Epoch: 12931, Training Loss: 0.01168
Epoch: 12932, Training Loss: 0.01331
Epoch: 12932, Training Loss: 0.01231
Epoch: 12932, Training Loss: 0.01520
Epoch: 12932, Training Loss: 0.01168
Epoch: 12933, Training Loss: 0.01331
Epoch: 12933, Training Loss: 0.01231
Epoch: 12933, Training Loss: 0.01520
Epoch: 12933, Training Loss: 0.01168
Epoch: 12934, Training Loss: 0.01331
Epoch: 12934, Training Loss: 0.01231
Epoch: 12934, Training Loss: 0.01520
Epoch: 12934, Training Loss: 0.01168
Epoch: 12935, Training Loss: 0.01331
Epoch: 12935, Training Loss: 0.01231
Epoch: 12935, Training Loss: 0.01520
Epoch: 12935, Training Loss: 0.01168
Epoch: 12936, Training Loss: 0.01331
Epoch: 12936, Training Loss: 0.01231
Epoch: 12936, Training Loss: 0.01519
Epoch: 12936, Training Loss: 0.01168
Epoch: 12937, Training Loss: 0.01331
Epoch: 12937, Training Loss: 0.01231
Epoch: 12937, Training Loss: 0.01519
Epoch: 12937, Training Loss: 0.01168
Epoch: 12938, Training Loss: 0.01331
Epoch: 12938, Training Loss: 0.01231
Epoch: 12938, Training Loss: 0.01519
Epoch: 12938, Training Loss: 0.01168
Epoch: 12939, Training Loss: 0.01330
Epoch: 12939, Training Loss: 0.01231
Epoch: 12939, Training Loss: 0.01519
Epoch: 12939, Training Loss: 0.01168
Epoch: 12940, Training Loss: 0.01330
Epoch: 12940, Training Loss: 0.01231
Epoch: 12940, Training Loss: 0.01519
Epoch: 12940, Training Loss: 0.01168
Epoch: 12941, Training Loss: 0.01330
Epoch: 12941, Training Loss: 0.01231
Epoch: 12941, Training Loss: 0.01519
Epoch: 12941, Training Loss: 0.01168
Epoch: 12942, Training Loss: 0.01330
Epoch: 12942, Training Loss: 0.01231
Epoch: 12942, Training Loss: 0.01519
Epoch: 12942, Training Loss: 0.01168
Epoch: 12943, Training Loss: 0.01330
Epoch: 12943, Training Loss: 0.01231
Epoch: 12943, Training Loss: 0.01519
Epoch: 12943, Training Loss: 0.01168
Epoch: 12944, Training Loss: 0.01330
Epoch: 12944, Training Loss: 0.01231
Epoch: 12944, Training Loss: 0.01519
Epoch: 12944, Training Loss: 0.01168
Epoch: 12945, Training Loss: 0.01330
Epoch: 12945, Training Loss: 0.01230
Epoch: 12945, Training Loss: 0.01519
Epoch: 12945, Training Loss: 0.01167
Epoch: 12946, Training Loss: 0.01330
Epoch: 12946, Training Loss: 0.01230
Epoch: 12946, Training Loss: 0.01519
Epoch: 12946, Training Loss: 0.01167
Epoch: 12947, Training Loss: 0.01330
Epoch: 12947, Training Loss: 0.01230
Epoch: 12947, Training Loss: 0.01519
Epoch: 12947, Training Loss: 0.01167
Epoch: 12948, Training Loss: 0.01330
Epoch: 12948, Training Loss: 0.01230
Epoch: 12948, Training Loss: 0.01519
Epoch: 12948, Training Loss: 0.01167
Epoch: 12949, Training Loss: 0.01330
Epoch: 12949, Training Loss: 0.01230
Epoch: 12949, Training Loss: 0.01519
Epoch: 12949, Training Loss: 0.01167
Epoch: 12950, Training Loss: 0.01330
Epoch: 12950, Training Loss: 0.01230
Epoch: 12950, Training Loss: 0.01519
Epoch: 12950, Training Loss: 0.01167
Epoch: 12951, Training Loss: 0.01330
Epoch: 12951, Training Loss: 0.01230
Epoch: 12951, Training Loss: 0.01518
Epoch: 12951, Training Loss: 0.01167
Epoch: 12952, Training Loss: 0.01330
Epoch: 12952, Training Loss: 0.01230
Epoch: 12952, Training Loss: 0.01518
Epoch: 12952, Training Loss: 0.01167
Epoch: 12953, Training Loss: 0.01330
Epoch: 12953, Training Loss: 0.01230
Epoch: 12953, Training Loss: 0.01518
Epoch: 12953, Training Loss: 0.01167
Epoch: 12954, Training Loss: 0.01330
Epoch: 12954, Training Loss: 0.01230
Epoch: 12954, Training Loss: 0.01518
Epoch: 12954, Training Loss: 0.01167
Epoch: 12955, Training Loss: 0.01330
Epoch: 12955, Training Loss: 0.01230
Epoch: 12955, Training Loss: 0.01518
Epoch: 12955, Training Loss: 0.01167
Epoch: 12956, Training Loss: 0.01330
Epoch: 12956, Training Loss: 0.01230
Epoch: 12956, Training Loss: 0.01518
Epoch: 12956, Training Loss: 0.01167
Epoch: 12957, Training Loss: 0.01329
Epoch: 12957, Training Loss: 0.01230
Epoch: 12957, Training Loss: 0.01518
Epoch: 12957, Training Loss: 0.01167
Epoch: 12958, Training Loss: 0.01329
Epoch: 12958, Training Loss: 0.01230
Epoch: 12958, Training Loss: 0.01518
Epoch: 12958, Training Loss: 0.01167
Epoch: 12959, Training Loss: 0.01329
Epoch: 12959, Training Loss: 0.01230
Epoch: 12959, Training Loss: 0.01518
Epoch: 12959, Training Loss: 0.01167
Epoch: 12960, Training Loss: 0.01329
Epoch: 12960, Training Loss: 0.01230
Epoch: 12960, Training Loss: 0.01518
Epoch: 12960, Training Loss: 0.01167
Epoch: 12961, Training Loss: 0.01329
Epoch: 12961, Training Loss: 0.01230
Epoch: 12961, Training Loss: 0.01518
Epoch: 12961, Training Loss: 0.01167
Epoch: 12962, Training Loss: 0.01329
Epoch: 12962, Training Loss: 0.01230
Epoch: 12962, Training Loss: 0.01518
Epoch: 12962, Training Loss: 0.01167
Epoch: 12963, Training Loss: 0.01329
Epoch: 12963, Training Loss: 0.01230
Epoch: 12963, Training Loss: 0.01518
Epoch: 12963, Training Loss: 0.01167
Epoch: 12964, Training Loss: 0.01329
Epoch: 12964, Training Loss: 0.01229
Epoch: 12964, Training Loss: 0.01518
Epoch: 12964, Training Loss: 0.01167
Epoch: 12965, Training Loss: 0.01329
Epoch: 12965, Training Loss: 0.01229
Epoch: 12965, Training Loss: 0.01518
Epoch: 12965, Training Loss: 0.01166
Epoch: 12966, Training Loss: 0.01329
Epoch: 12966, Training Loss: 0.01229
Epoch: 12966, Training Loss: 0.01517
Epoch: 12966, Training Loss: 0.01166
Epoch: 12967, Training Loss: 0.01329
Epoch: 12967, Training Loss: 0.01229
Epoch: 12967, Training Loss: 0.01517
Epoch: 12967, Training Loss: 0.01166
Epoch: 12968, Training Loss: 0.01329
Epoch: 12968, Training Loss: 0.01229
Epoch: 12968, Training Loss: 0.01517
Epoch: 12968, Training Loss: 0.01166
Epoch: 12969, Training Loss: 0.01329
Epoch: 12969, Training Loss: 0.01229
Epoch: 12969, Training Loss: 0.01517
Epoch: 12969, Training Loss: 0.01166
Epoch: 12970, Training Loss: 0.01329
Epoch: 12970, Training Loss: 0.01229
Epoch: 12970, Training Loss: 0.01517
Epoch: 12970, Training Loss: 0.01166
Epoch: 12971, Training Loss: 0.01329
Epoch: 12971, Training Loss: 0.01229
Epoch: 12971, Training Loss: 0.01517
Epoch: 12971, Training Loss: 0.01166
Epoch: 12972, Training Loss: 0.01329
Epoch: 12972, Training Loss: 0.01229
Epoch: 12972, Training Loss: 0.01517
Epoch: 12972, Training Loss: 0.01166
Epoch: 12973, Training Loss: 0.01329
Epoch: 12973, Training Loss: 0.01229
Epoch: 12973, Training Loss: 0.01517
Epoch: 12973, Training Loss: 0.01166
Epoch: 12974, Training Loss: 0.01328
Epoch: 12974, Training Loss: 0.01229
Epoch: 12974, Training Loss: 0.01517
Epoch: 12974, Training Loss: 0.01166
Epoch: 12975, Training Loss: 0.01328
Epoch: 12975, Training Loss: 0.01229
Epoch: 12975, Training Loss: 0.01517
Epoch: 12975, Training Loss: 0.01166
Epoch: 12976, Training Loss: 0.01328
Epoch: 12976, Training Loss: 0.01229
Epoch: 12976, Training Loss: 0.01517
Epoch: 12976, Training Loss: 0.01166
Epoch: 12977, Training Loss: 0.01328
Epoch: 12977, Training Loss: 0.01229
Epoch: 12977, Training Loss: 0.01517
Epoch: 12977, Training Loss: 0.01166
Epoch: 12978, Training Loss: 0.01328
Epoch: 12978, Training Loss: 0.01229
Epoch: 12978, Training Loss: 0.01517
Epoch: 12978, Training Loss: 0.01166
Epoch: 12979, Training Loss: 0.01328
Epoch: 12979, Training Loss: 0.01229
Epoch: 12979, Training Loss: 0.01517
Epoch: 12979, Training Loss: 0.01166
Epoch: 12980, Training Loss: 0.01328
Epoch: 12980, Training Loss: 0.01229
Epoch: 12980, Training Loss: 0.01517
Epoch: 12980, Training Loss: 0.01166
Epoch: 12981, Training Loss: 0.01328
Epoch: 12981, Training Loss: 0.01229
Epoch: 12981, Training Loss: 0.01517
Epoch: 12981, Training Loss: 0.01166
Epoch: 12982, Training Loss: 0.01328
Epoch: 12982, Training Loss: 0.01229
Epoch: 12982, Training Loss: 0.01516
Epoch: 12982, Training Loss: 0.01166
Epoch: 12983, Training Loss: 0.01328
Epoch: 12983, Training Loss: 0.01229
Epoch: 12983, Training Loss: 0.01516
Epoch: 12983, Training Loss: 0.01166
Epoch: 12984, Training Loss: 0.01328
Epoch: 12984, Training Loss: 0.01228
Epoch: 12984, Training Loss: 0.01516
Epoch: 12984, Training Loss: 0.01166
Epoch: 12985, Training Loss: 0.01328
Epoch: 12985, Training Loss: 0.01228
Epoch: 12985, Training Loss: 0.01516
Epoch: 12985, Training Loss: 0.01166
Epoch: 12986, Training Loss: 0.01328
Epoch: 12986, Training Loss: 0.01228
Epoch: 12986, Training Loss: 0.01516
Epoch: 12986, Training Loss: 0.01165
Epoch: 12987, Training Loss: 0.01328
Epoch: 12987, Training Loss: 0.01228
Epoch: 12987, Training Loss: 0.01516
Epoch: 12987, Training Loss: 0.01165
Epoch: 12988, Training Loss: 0.01328
Epoch: 12988, Training Loss: 0.01228
Epoch: 12988, Training Loss: 0.01516
Epoch: 12988, Training Loss: 0.01165
Epoch: 12989, Training Loss: 0.01328
Epoch: 12989, Training Loss: 0.01228
Epoch: 12989, Training Loss: 0.01516
Epoch: 12989, Training Loss: 0.01165
Epoch: 12990, Training Loss: 0.01328
Epoch: 12990, Training Loss: 0.01228
Epoch: 12990, Training Loss: 0.01516
Epoch: 12990, Training Loss: 0.01165
Epoch: 12991, Training Loss: 0.01328
Epoch: 12991, Training Loss: 0.01228
Epoch: 12991, Training Loss: 0.01516
Epoch: 12991, Training Loss: 0.01165
Epoch: 12992, Training Loss: 0.01327
Epoch: 12992, Training Loss: 0.01228
Epoch: 12992, Training Loss: 0.01516
Epoch: 12992, Training Loss: 0.01165
Epoch: 12993, Training Loss: 0.01327
Epoch: 12993, Training Loss: 0.01228
Epoch: 12993, Training Loss: 0.01516
Epoch: 12993, Training Loss: 0.01165
Epoch: 12994, Training Loss: 0.01327
Epoch: 12994, Training Loss: 0.01228
Epoch: 12994, Training Loss: 0.01516
Epoch: 12994, Training Loss: 0.01165
Epoch: 12995, Training Loss: 0.01327
Epoch: 12995, Training Loss: 0.01228
Epoch: 12995, Training Loss: 0.01516
Epoch: 12995, Training Loss: 0.01165
Epoch: 12996, Training Loss: 0.01327
Epoch: 12996, Training Loss: 0.01228
Epoch: 12996, Training Loss: 0.01516
Epoch: 12996, Training Loss: 0.01165
Epoch: 12997, Training Loss: 0.01327
Epoch: 12997, Training Loss: 0.01228
Epoch: 12997, Training Loss: 0.01515
Epoch: 12997, Training Loss: 0.01165
Epoch: 12998, Training Loss: 0.01327
Epoch: 12998, Training Loss: 0.01228
Epoch: 12998, Training Loss: 0.01515
Epoch: 12998, Training Loss: 0.01165
Epoch: 12999, Training Loss: 0.01327
Epoch: 12999, Training Loss: 0.01228
Epoch: 12999, Training Loss: 0.01515
Epoch: 12999, Training Loss: 0.01165
Epoch: 13000, Training Loss: 0.01327
Epoch: 13000, Training Loss: 0.01228
Epoch: 13000, Training Loss: 0.01515
Epoch: 13000, Training Loss: 0.01165
Epoch: 13001, Training Loss: 0.01327
Epoch: 13001, Training Loss: 0.01228
Epoch: 13001, Training Loss: 0.01515
Epoch: 13001, Training Loss: 0.01165
Epoch: 13002, Training Loss: 0.01327
Epoch: 13002, Training Loss: 0.01228
Epoch: 13002, Training Loss: 0.01515
Epoch: 13002, Training Loss: 0.01165
Epoch: 13003, Training Loss: 0.01327
Epoch: 13003, Training Loss: 0.01227
Epoch: 13003, Training Loss: 0.01515
Epoch: 13003, Training Loss: 0.01165
Epoch: 13004, Training Loss: 0.01327
Epoch: 13004, Training Loss: 0.01227
Epoch: 13004, Training Loss: 0.01515
Epoch: 13004, Training Loss: 0.01165
Epoch: 13005, Training Loss: 0.01327
Epoch: 13005, Training Loss: 0.01227
Epoch: 13005, Training Loss: 0.01515
Epoch: 13005, Training Loss: 0.01165
Epoch: 13006, Training Loss: 0.01327
Epoch: 13006, Training Loss: 0.01227
Epoch: 13006, Training Loss: 0.01515
Epoch: 13006, Training Loss: 0.01164
Epoch: 13007, Training Loss: 0.01327
Epoch: 13007, Training Loss: 0.01227
Epoch: 13007, Training Loss: 0.01515
Epoch: 13007, Training Loss: 0.01164
Epoch: 13008, Training Loss: 0.01327
Epoch: 13008, Training Loss: 0.01227
Epoch: 13008, Training Loss: 0.01515
Epoch: 13008, Training Loss: 0.01164
Epoch: 13009, Training Loss: 0.01327
Epoch: 13009, Training Loss: 0.01227
Epoch: 13009, Training Loss: 0.01515
Epoch: 13009, Training Loss: 0.01164
Epoch: 13010, Training Loss: 0.01326
Epoch: 13010, Training Loss: 0.01227
Epoch: 13010, Training Loss: 0.01515
Epoch: 13010, Training Loss: 0.01164
Epoch: 13011, Training Loss: 0.01326
Epoch: 13011, Training Loss: 0.01227
Epoch: 13011, Training Loss: 0.01515
Epoch: 13011, Training Loss: 0.01164
Epoch: 13012, Training Loss: 0.01326
Epoch: 13012, Training Loss: 0.01227
Epoch: 13012, Training Loss: 0.01515
Epoch: 13012, Training Loss: 0.01164
Epoch: 13013, Training Loss: 0.01326
Epoch: 13013, Training Loss: 0.01227
Epoch: 13013, Training Loss: 0.01514
Epoch: 13013, Training Loss: 0.01164
Epoch: 13014, Training Loss: 0.01326
Epoch: 13014, Training Loss: 0.01227
Epoch: 13014, Training Loss: 0.01514
Epoch: 13014, Training Loss: 0.01164
Epoch: 13015, Training Loss: 0.01326
Epoch: 13015, Training Loss: 0.01227
Epoch: 13015, Training Loss: 0.01514
Epoch: 13015, Training Loss: 0.01164
[... per-sample losses omitted for epochs 13016-13258: the four losses decrease very slowly, from about 0.01326/0.01227/0.01514/0.01164 to about 0.01313/0.01215/0.01499/0.01153 ...]
Epoch: 13259, Training Loss: 0.01313
Epoch: 13259, Training Loss: 0.01215
Epoch: 13259, Training Loss: 0.01499
Epoch: 13259, Training Loss: 0.01152
Epoch: 13260, Training Loss: 0.01313
Epoch: 13260, Training Loss: 0.01215
Epoch: 13260, Training Loss: 0.01499
Epoch: 13260, Training Loss: 0.01152
Epoch: 13261, Training Loss: 0.01312
Epoch: 13261, Training Loss: 0.01215
Epoch: 13261, Training Loss: 0.01499
Epoch: 13261, Training Loss: 0.01152
Epoch: 13262, Training Loss: 0.01312
Epoch: 13262, Training Loss: 0.01214
Epoch: 13262, Training Loss: 0.01499
Epoch: 13262, Training Loss: 0.01152
Epoch: 13263, Training Loss: 0.01312
Epoch: 13263, Training Loss: 0.01214
Epoch: 13263, Training Loss: 0.01498
Epoch: 13263, Training Loss: 0.01152
Epoch: 13264, Training Loss: 0.01312
Epoch: 13264, Training Loss: 0.01214
Epoch: 13264, Training Loss: 0.01498
Epoch: 13264, Training Loss: 0.01152
Epoch: 13265, Training Loss: 0.01312
Epoch: 13265, Training Loss: 0.01214
Epoch: 13265, Training Loss: 0.01498
Epoch: 13265, Training Loss: 0.01152
Epoch: 13266, Training Loss: 0.01312
Epoch: 13266, Training Loss: 0.01214
Epoch: 13266, Training Loss: 0.01498
Epoch: 13266, Training Loss: 0.01152
Epoch: 13267, Training Loss: 0.01312
Epoch: 13267, Training Loss: 0.01214
Epoch: 13267, Training Loss: 0.01498
Epoch: 13267, Training Loss: 0.01152
Epoch: 13268, Training Loss: 0.01312
Epoch: 13268, Training Loss: 0.01214
Epoch: 13268, Training Loss: 0.01498
Epoch: 13268, Training Loss: 0.01152
Epoch: 13269, Training Loss: 0.01312
Epoch: 13269, Training Loss: 0.01214
Epoch: 13269, Training Loss: 0.01498
Epoch: 13269, Training Loss: 0.01152
Epoch: 13270, Training Loss: 0.01312
Epoch: 13270, Training Loss: 0.01214
Epoch: 13270, Training Loss: 0.01498
Epoch: 13270, Training Loss: 0.01152
Epoch: 13271, Training Loss: 0.01312
Epoch: 13271, Training Loss: 0.01214
Epoch: 13271, Training Loss: 0.01498
Epoch: 13271, Training Loss: 0.01152
Epoch: 13272, Training Loss: 0.01312
Epoch: 13272, Training Loss: 0.01214
Epoch: 13272, Training Loss: 0.01498
Epoch: 13272, Training Loss: 0.01152
Epoch: 13273, Training Loss: 0.01312
Epoch: 13273, Training Loss: 0.01214
Epoch: 13273, Training Loss: 0.01498
Epoch: 13273, Training Loss: 0.01152
Epoch: 13274, Training Loss: 0.01312
Epoch: 13274, Training Loss: 0.01214
Epoch: 13274, Training Loss: 0.01498
Epoch: 13274, Training Loss: 0.01152
Epoch: 13275, Training Loss: 0.01312
Epoch: 13275, Training Loss: 0.01214
Epoch: 13275, Training Loss: 0.01498
Epoch: 13275, Training Loss: 0.01152
Epoch: 13276, Training Loss: 0.01312
Epoch: 13276, Training Loss: 0.01214
Epoch: 13276, Training Loss: 0.01498
Epoch: 13276, Training Loss: 0.01152
Epoch: 13277, Training Loss: 0.01312
Epoch: 13277, Training Loss: 0.01214
Epoch: 13277, Training Loss: 0.01498
Epoch: 13277, Training Loss: 0.01151
Epoch: 13278, Training Loss: 0.01312
Epoch: 13278, Training Loss: 0.01214
Epoch: 13278, Training Loss: 0.01498
Epoch: 13278, Training Loss: 0.01151
Epoch: 13279, Training Loss: 0.01311
Epoch: 13279, Training Loss: 0.01214
Epoch: 13279, Training Loss: 0.01497
Epoch: 13279, Training Loss: 0.01151
Epoch: 13280, Training Loss: 0.01311
Epoch: 13280, Training Loss: 0.01214
Epoch: 13280, Training Loss: 0.01497
Epoch: 13280, Training Loss: 0.01151
Epoch: 13281, Training Loss: 0.01311
Epoch: 13281, Training Loss: 0.01214
Epoch: 13281, Training Loss: 0.01497
Epoch: 13281, Training Loss: 0.01151
Epoch: 13282, Training Loss: 0.01311
Epoch: 13282, Training Loss: 0.01213
Epoch: 13282, Training Loss: 0.01497
Epoch: 13282, Training Loss: 0.01151
Epoch: 13283, Training Loss: 0.01311
Epoch: 13283, Training Loss: 0.01213
Epoch: 13283, Training Loss: 0.01497
Epoch: 13283, Training Loss: 0.01151
Epoch: 13284, Training Loss: 0.01311
Epoch: 13284, Training Loss: 0.01213
Epoch: 13284, Training Loss: 0.01497
Epoch: 13284, Training Loss: 0.01151
Epoch: 13285, Training Loss: 0.01311
Epoch: 13285, Training Loss: 0.01213
Epoch: 13285, Training Loss: 0.01497
Epoch: 13285, Training Loss: 0.01151
Epoch: 13286, Training Loss: 0.01311
Epoch: 13286, Training Loss: 0.01213
Epoch: 13286, Training Loss: 0.01497
Epoch: 13286, Training Loss: 0.01151
Epoch: 13287, Training Loss: 0.01311
Epoch: 13287, Training Loss: 0.01213
Epoch: 13287, Training Loss: 0.01497
Epoch: 13287, Training Loss: 0.01151
Epoch: 13288, Training Loss: 0.01311
Epoch: 13288, Training Loss: 0.01213
Epoch: 13288, Training Loss: 0.01497
Epoch: 13288, Training Loss: 0.01151
Epoch: 13289, Training Loss: 0.01311
Epoch: 13289, Training Loss: 0.01213
Epoch: 13289, Training Loss: 0.01497
Epoch: 13289, Training Loss: 0.01151
Epoch: 13290, Training Loss: 0.01311
Epoch: 13290, Training Loss: 0.01213
Epoch: 13290, Training Loss: 0.01497
Epoch: 13290, Training Loss: 0.01151
Epoch: 13291, Training Loss: 0.01311
Epoch: 13291, Training Loss: 0.01213
Epoch: 13291, Training Loss: 0.01497
Epoch: 13291, Training Loss: 0.01151
Epoch: 13292, Training Loss: 0.01311
Epoch: 13292, Training Loss: 0.01213
Epoch: 13292, Training Loss: 0.01497
Epoch: 13292, Training Loss: 0.01151
Epoch: 13293, Training Loss: 0.01311
Epoch: 13293, Training Loss: 0.01213
Epoch: 13293, Training Loss: 0.01497
Epoch: 13293, Training Loss: 0.01151
Epoch: 13294, Training Loss: 0.01311
Epoch: 13294, Training Loss: 0.01213
Epoch: 13294, Training Loss: 0.01497
Epoch: 13294, Training Loss: 0.01151
Epoch: 13295, Training Loss: 0.01311
Epoch: 13295, Training Loss: 0.01213
Epoch: 13295, Training Loss: 0.01496
Epoch: 13295, Training Loss: 0.01151
Epoch: 13296, Training Loss: 0.01311
Epoch: 13296, Training Loss: 0.01213
Epoch: 13296, Training Loss: 0.01496
Epoch: 13296, Training Loss: 0.01151
Epoch: 13297, Training Loss: 0.01310
Epoch: 13297, Training Loss: 0.01213
Epoch: 13297, Training Loss: 0.01496
Epoch: 13297, Training Loss: 0.01151
Epoch: 13298, Training Loss: 0.01310
Epoch: 13298, Training Loss: 0.01213
Epoch: 13298, Training Loss: 0.01496
Epoch: 13298, Training Loss: 0.01150
Epoch: 13299, Training Loss: 0.01310
Epoch: 13299, Training Loss: 0.01213
Epoch: 13299, Training Loss: 0.01496
Epoch: 13299, Training Loss: 0.01150
Epoch: 13300, Training Loss: 0.01310
Epoch: 13300, Training Loss: 0.01213
Epoch: 13300, Training Loss: 0.01496
Epoch: 13300, Training Loss: 0.01150
Epoch: 13301, Training Loss: 0.01310
Epoch: 13301, Training Loss: 0.01213
Epoch: 13301, Training Loss: 0.01496
Epoch: 13301, Training Loss: 0.01150
Epoch: 13302, Training Loss: 0.01310
Epoch: 13302, Training Loss: 0.01212
Epoch: 13302, Training Loss: 0.01496
Epoch: 13302, Training Loss: 0.01150
Epoch: 13303, Training Loss: 0.01310
Epoch: 13303, Training Loss: 0.01212
Epoch: 13303, Training Loss: 0.01496
Epoch: 13303, Training Loss: 0.01150
Epoch: 13304, Training Loss: 0.01310
Epoch: 13304, Training Loss: 0.01212
Epoch: 13304, Training Loss: 0.01496
Epoch: 13304, Training Loss: 0.01150
Epoch: 13305, Training Loss: 0.01310
Epoch: 13305, Training Loss: 0.01212
Epoch: 13305, Training Loss: 0.01496
Epoch: 13305, Training Loss: 0.01150
Epoch: 13306, Training Loss: 0.01310
Epoch: 13306, Training Loss: 0.01212
Epoch: 13306, Training Loss: 0.01496
Epoch: 13306, Training Loss: 0.01150
Epoch: 13307, Training Loss: 0.01310
Epoch: 13307, Training Loss: 0.01212
Epoch: 13307, Training Loss: 0.01496
Epoch: 13307, Training Loss: 0.01150
Epoch: 13308, Training Loss: 0.01310
Epoch: 13308, Training Loss: 0.01212
Epoch: 13308, Training Loss: 0.01496
Epoch: 13308, Training Loss: 0.01150
Epoch: 13309, Training Loss: 0.01310
Epoch: 13309, Training Loss: 0.01212
Epoch: 13309, Training Loss: 0.01496
Epoch: 13309, Training Loss: 0.01150
Epoch: 13310, Training Loss: 0.01310
Epoch: 13310, Training Loss: 0.01212
Epoch: 13310, Training Loss: 0.01496
Epoch: 13310, Training Loss: 0.01150
Epoch: 13311, Training Loss: 0.01310
Epoch: 13311, Training Loss: 0.01212
Epoch: 13311, Training Loss: 0.01495
Epoch: 13311, Training Loss: 0.01150
Epoch: 13312, Training Loss: 0.01310
Epoch: 13312, Training Loss: 0.01212
Epoch: 13312, Training Loss: 0.01495
Epoch: 13312, Training Loss: 0.01150
Epoch: 13313, Training Loss: 0.01310
Epoch: 13313, Training Loss: 0.01212
Epoch: 13313, Training Loss: 0.01495
Epoch: 13313, Training Loss: 0.01150
Epoch: 13314, Training Loss: 0.01310
Epoch: 13314, Training Loss: 0.01212
Epoch: 13314, Training Loss: 0.01495
Epoch: 13314, Training Loss: 0.01150
Epoch: 13315, Training Loss: 0.01310
Epoch: 13315, Training Loss: 0.01212
Epoch: 13315, Training Loss: 0.01495
Epoch: 13315, Training Loss: 0.01150
Epoch: 13316, Training Loss: 0.01309
Epoch: 13316, Training Loss: 0.01212
Epoch: 13316, Training Loss: 0.01495
Epoch: 13316, Training Loss: 0.01150
Epoch: 13317, Training Loss: 0.01309
Epoch: 13317, Training Loss: 0.01212
Epoch: 13317, Training Loss: 0.01495
Epoch: 13317, Training Loss: 0.01150
Epoch: 13318, Training Loss: 0.01309
Epoch: 13318, Training Loss: 0.01212
Epoch: 13318, Training Loss: 0.01495
Epoch: 13318, Training Loss: 0.01150
Epoch: 13319, Training Loss: 0.01309
Epoch: 13319, Training Loss: 0.01212
Epoch: 13319, Training Loss: 0.01495
Epoch: 13319, Training Loss: 0.01149
Epoch: 13320, Training Loss: 0.01309
Epoch: 13320, Training Loss: 0.01212
Epoch: 13320, Training Loss: 0.01495
Epoch: 13320, Training Loss: 0.01149
Epoch: 13321, Training Loss: 0.01309
Epoch: 13321, Training Loss: 0.01212
Epoch: 13321, Training Loss: 0.01495
Epoch: 13321, Training Loss: 0.01149
Epoch: 13322, Training Loss: 0.01309
Epoch: 13322, Training Loss: 0.01212
Epoch: 13322, Training Loss: 0.01495
Epoch: 13322, Training Loss: 0.01149
Epoch: 13323, Training Loss: 0.01309
Epoch: 13323, Training Loss: 0.01211
Epoch: 13323, Training Loss: 0.01495
Epoch: 13323, Training Loss: 0.01149
Epoch: 13324, Training Loss: 0.01309
Epoch: 13324, Training Loss: 0.01211
Epoch: 13324, Training Loss: 0.01495
Epoch: 13324, Training Loss: 0.01149
Epoch: 13325, Training Loss: 0.01309
Epoch: 13325, Training Loss: 0.01211
Epoch: 13325, Training Loss: 0.01495
Epoch: 13325, Training Loss: 0.01149
Epoch: 13326, Training Loss: 0.01309
Epoch: 13326, Training Loss: 0.01211
Epoch: 13326, Training Loss: 0.01495
Epoch: 13326, Training Loss: 0.01149
Epoch: 13327, Training Loss: 0.01309
Epoch: 13327, Training Loss: 0.01211
Epoch: 13327, Training Loss: 0.01494
Epoch: 13327, Training Loss: 0.01149
Epoch: 13328, Training Loss: 0.01309
Epoch: 13328, Training Loss: 0.01211
Epoch: 13328, Training Loss: 0.01494
Epoch: 13328, Training Loss: 0.01149
Epoch: 13329, Training Loss: 0.01309
Epoch: 13329, Training Loss: 0.01211
Epoch: 13329, Training Loss: 0.01494
Epoch: 13329, Training Loss: 0.01149
Epoch: 13330, Training Loss: 0.01309
Epoch: 13330, Training Loss: 0.01211
Epoch: 13330, Training Loss: 0.01494
Epoch: 13330, Training Loss: 0.01149
Epoch: 13331, Training Loss: 0.01309
Epoch: 13331, Training Loss: 0.01211
Epoch: 13331, Training Loss: 0.01494
Epoch: 13331, Training Loss: 0.01149
Epoch: 13332, Training Loss: 0.01309
Epoch: 13332, Training Loss: 0.01211
Epoch: 13332, Training Loss: 0.01494
Epoch: 13332, Training Loss: 0.01149
Epoch: 13333, Training Loss: 0.01309
Epoch: 13333, Training Loss: 0.01211
Epoch: 13333, Training Loss: 0.01494
Epoch: 13333, Training Loss: 0.01149
Epoch: 13334, Training Loss: 0.01308
Epoch: 13334, Training Loss: 0.01211
Epoch: 13334, Training Loss: 0.01494
Epoch: 13334, Training Loss: 0.01149
Epoch: 13335, Training Loss: 0.01308
Epoch: 13335, Training Loss: 0.01211
Epoch: 13335, Training Loss: 0.01494
Epoch: 13335, Training Loss: 0.01149
Epoch: 13336, Training Loss: 0.01308
Epoch: 13336, Training Loss: 0.01211
Epoch: 13336, Training Loss: 0.01494
Epoch: 13336, Training Loss: 0.01149
Epoch: 13337, Training Loss: 0.01308
Epoch: 13337, Training Loss: 0.01211
Epoch: 13337, Training Loss: 0.01494
Epoch: 13337, Training Loss: 0.01149
Epoch: 13338, Training Loss: 0.01308
Epoch: 13338, Training Loss: 0.01211
Epoch: 13338, Training Loss: 0.01494
Epoch: 13338, Training Loss: 0.01149
Epoch: 13339, Training Loss: 0.01308
Epoch: 13339, Training Loss: 0.01211
Epoch: 13339, Training Loss: 0.01494
Epoch: 13339, Training Loss: 0.01149
Epoch: 13340, Training Loss: 0.01308
Epoch: 13340, Training Loss: 0.01211
Epoch: 13340, Training Loss: 0.01494
Epoch: 13340, Training Loss: 0.01149
Epoch: 13341, Training Loss: 0.01308
Epoch: 13341, Training Loss: 0.01211
Epoch: 13341, Training Loss: 0.01494
Epoch: 13341, Training Loss: 0.01148
Epoch: 13342, Training Loss: 0.01308
Epoch: 13342, Training Loss: 0.01211
Epoch: 13342, Training Loss: 0.01494
Epoch: 13342, Training Loss: 0.01148
Epoch: 13343, Training Loss: 0.01308
Epoch: 13343, Training Loss: 0.01210
Epoch: 13343, Training Loss: 0.01493
Epoch: 13343, Training Loss: 0.01148
Epoch: 13344, Training Loss: 0.01308
Epoch: 13344, Training Loss: 0.01210
Epoch: 13344, Training Loss: 0.01493
Epoch: 13344, Training Loss: 0.01148
Epoch: 13345, Training Loss: 0.01308
Epoch: 13345, Training Loss: 0.01210
Epoch: 13345, Training Loss: 0.01493
Epoch: 13345, Training Loss: 0.01148
Epoch: 13346, Training Loss: 0.01308
Epoch: 13346, Training Loss: 0.01210
Epoch: 13346, Training Loss: 0.01493
Epoch: 13346, Training Loss: 0.01148
Epoch: 13347, Training Loss: 0.01308
Epoch: 13347, Training Loss: 0.01210
Epoch: 13347, Training Loss: 0.01493
Epoch: 13347, Training Loss: 0.01148
Epoch: 13348, Training Loss: 0.01308
Epoch: 13348, Training Loss: 0.01210
Epoch: 13348, Training Loss: 0.01493
Epoch: 13348, Training Loss: 0.01148
Epoch: 13349, Training Loss: 0.01308
Epoch: 13349, Training Loss: 0.01210
Epoch: 13349, Training Loss: 0.01493
Epoch: 13349, Training Loss: 0.01148
Epoch: 13350, Training Loss: 0.01308
Epoch: 13350, Training Loss: 0.01210
Epoch: 13350, Training Loss: 0.01493
Epoch: 13350, Training Loss: 0.01148
Epoch: 13351, Training Loss: 0.01308
Epoch: 13351, Training Loss: 0.01210
Epoch: 13351, Training Loss: 0.01493
Epoch: 13351, Training Loss: 0.01148
Epoch: 13352, Training Loss: 0.01307
Epoch: 13352, Training Loss: 0.01210
Epoch: 13352, Training Loss: 0.01493
Epoch: 13352, Training Loss: 0.01148
Epoch: 13353, Training Loss: 0.01307
Epoch: 13353, Training Loss: 0.01210
Epoch: 13353, Training Loss: 0.01493
Epoch: 13353, Training Loss: 0.01148
Epoch: 13354, Training Loss: 0.01307
Epoch: 13354, Training Loss: 0.01210
Epoch: 13354, Training Loss: 0.01493
Epoch: 13354, Training Loss: 0.01148
Epoch: 13355, Training Loss: 0.01307
Epoch: 13355, Training Loss: 0.01210
Epoch: 13355, Training Loss: 0.01493
Epoch: 13355, Training Loss: 0.01148
Epoch: 13356, Training Loss: 0.01307
Epoch: 13356, Training Loss: 0.01210
Epoch: 13356, Training Loss: 0.01493
Epoch: 13356, Training Loss: 0.01148
Epoch: 13357, Training Loss: 0.01307
Epoch: 13357, Training Loss: 0.01210
Epoch: 13357, Training Loss: 0.01493
Epoch: 13357, Training Loss: 0.01148
Epoch: 13358, Training Loss: 0.01307
Epoch: 13358, Training Loss: 0.01210
Epoch: 13358, Training Loss: 0.01493
Epoch: 13358, Training Loss: 0.01148
Epoch: 13359, Training Loss: 0.01307
Epoch: 13359, Training Loss: 0.01210
Epoch: 13359, Training Loss: 0.01492
Epoch: 13359, Training Loss: 0.01148
Epoch: 13360, Training Loss: 0.01307
Epoch: 13360, Training Loss: 0.01210
Epoch: 13360, Training Loss: 0.01492
Epoch: 13360, Training Loss: 0.01148
Epoch: 13361, Training Loss: 0.01307
Epoch: 13361, Training Loss: 0.01210
Epoch: 13361, Training Loss: 0.01492
Epoch: 13361, Training Loss: 0.01148
Epoch: 13362, Training Loss: 0.01307
Epoch: 13362, Training Loss: 0.01210
Epoch: 13362, Training Loss: 0.01492
Epoch: 13362, Training Loss: 0.01147
Epoch: 13363, Training Loss: 0.01307
Epoch: 13363, Training Loss: 0.01209
Epoch: 13363, Training Loss: 0.01492
Epoch: 13363, Training Loss: 0.01147
Epoch: 13364, Training Loss: 0.01307
Epoch: 13364, Training Loss: 0.01209
Epoch: 13364, Training Loss: 0.01492
Epoch: 13364, Training Loss: 0.01147
Epoch: 13365, Training Loss: 0.01307
Epoch: 13365, Training Loss: 0.01209
Epoch: 13365, Training Loss: 0.01492
Epoch: 13365, Training Loss: 0.01147
Epoch: 13366, Training Loss: 0.01307
Epoch: 13366, Training Loss: 0.01209
Epoch: 13366, Training Loss: 0.01492
Epoch: 13366, Training Loss: 0.01147
Epoch: 13367, Training Loss: 0.01307
Epoch: 13367, Training Loss: 0.01209
Epoch: 13367, Training Loss: 0.01492
Epoch: 13367, Training Loss: 0.01147
Epoch: 13368, Training Loss: 0.01307
Epoch: 13368, Training Loss: 0.01209
Epoch: 13368, Training Loss: 0.01492
Epoch: 13368, Training Loss: 0.01147
Epoch: 13369, Training Loss: 0.01307
Epoch: 13369, Training Loss: 0.01209
Epoch: 13369, Training Loss: 0.01492
Epoch: 13369, Training Loss: 0.01147
Epoch: 13370, Training Loss: 0.01307
Epoch: 13370, Training Loss: 0.01209
Epoch: 13370, Training Loss: 0.01492
Epoch: 13370, Training Loss: 0.01147
Epoch: 13371, Training Loss: 0.01306
Epoch: 13371, Training Loss: 0.01209
Epoch: 13371, Training Loss: 0.01492
Epoch: 13371, Training Loss: 0.01147
Epoch: 13372, Training Loss: 0.01306
Epoch: 13372, Training Loss: 0.01209
Epoch: 13372, Training Loss: 0.01492
Epoch: 13372, Training Loss: 0.01147
Epoch: 13373, Training Loss: 0.01306
Epoch: 13373, Training Loss: 0.01209
Epoch: 13373, Training Loss: 0.01492
Epoch: 13373, Training Loss: 0.01147
Epoch: 13374, Training Loss: 0.01306
Epoch: 13374, Training Loss: 0.01209
Epoch: 13374, Training Loss: 0.01492
Epoch: 13374, Training Loss: 0.01147
Epoch: 13375, Training Loss: 0.01306
Epoch: 13375, Training Loss: 0.01209
Epoch: 13375, Training Loss: 0.01492
Epoch: 13375, Training Loss: 0.01147
Epoch: 13376, Training Loss: 0.01306
Epoch: 13376, Training Loss: 0.01209
Epoch: 13376, Training Loss: 0.01491
Epoch: 13376, Training Loss: 0.01147
Epoch: 13377, Training Loss: 0.01306
Epoch: 13377, Training Loss: 0.01209
Epoch: 13377, Training Loss: 0.01491
Epoch: 13377, Training Loss: 0.01147
Epoch: 13378, Training Loss: 0.01306
Epoch: 13378, Training Loss: 0.01209
Epoch: 13378, Training Loss: 0.01491
Epoch: 13378, Training Loss: 0.01147
Epoch: 13379, Training Loss: 0.01306
Epoch: 13379, Training Loss: 0.01209
Epoch: 13379, Training Loss: 0.01491
Epoch: 13379, Training Loss: 0.01147
Epoch: 13380, Training Loss: 0.01306
Epoch: 13380, Training Loss: 0.01209
Epoch: 13380, Training Loss: 0.01491
Epoch: 13380, Training Loss: 0.01147
Epoch: 13381, Training Loss: 0.01306
Epoch: 13381, Training Loss: 0.01209
Epoch: 13381, Training Loss: 0.01491
Epoch: 13381, Training Loss: 0.01147
Epoch: 13382, Training Loss: 0.01306
Epoch: 13382, Training Loss: 0.01209
Epoch: 13382, Training Loss: 0.01491
Epoch: 13382, Training Loss: 0.01147
Epoch: 13383, Training Loss: 0.01306
Epoch: 13383, Training Loss: 0.01209
Epoch: 13383, Training Loss: 0.01491
Epoch: 13383, Training Loss: 0.01147
Epoch: 13384, Training Loss: 0.01306
Epoch: 13384, Training Loss: 0.01208
Epoch: 13384, Training Loss: 0.01491
Epoch: 13384, Training Loss: 0.01146
Epoch: 13385, Training Loss: 0.01306
Epoch: 13385, Training Loss: 0.01208
Epoch: 13385, Training Loss: 0.01491
Epoch: 13385, Training Loss: 0.01146
Epoch: 13386, Training Loss: 0.01306
Epoch: 13386, Training Loss: 0.01208
Epoch: 13386, Training Loss: 0.01491
Epoch: 13386, Training Loss: 0.01146
Epoch: 13387, Training Loss: 0.01306
Epoch: 13387, Training Loss: 0.01208
Epoch: 13387, Training Loss: 0.01491
Epoch: 13387, Training Loss: 0.01146
Epoch: 13388, Training Loss: 0.01306
Epoch: 13388, Training Loss: 0.01208
Epoch: 13388, Training Loss: 0.01491
Epoch: 13388, Training Loss: 0.01146
Epoch: 13389, Training Loss: 0.01305
Epoch: 13389, Training Loss: 0.01208
Epoch: 13389, Training Loss: 0.01491
Epoch: 13389, Training Loss: 0.01146
Epoch: 13390, Training Loss: 0.01305
Epoch: 13390, Training Loss: 0.01208
Epoch: 13390, Training Loss: 0.01491
Epoch: 13390, Training Loss: 0.01146
Epoch: 13391, Training Loss: 0.01305
Epoch: 13391, Training Loss: 0.01208
Epoch: 13391, Training Loss: 0.01491
Epoch: 13391, Training Loss: 0.01146
Epoch: 13392, Training Loss: 0.01305
Epoch: 13392, Training Loss: 0.01208
Epoch: 13392, Training Loss: 0.01490
Epoch: 13392, Training Loss: 0.01146
Epoch: 13393, Training Loss: 0.01305
Epoch: 13393, Training Loss: 0.01208
Epoch: 13393, Training Loss: 0.01490
Epoch: 13393, Training Loss: 0.01146
Epoch: 13394, Training Loss: 0.01305
Epoch: 13394, Training Loss: 0.01208
Epoch: 13394, Training Loss: 0.01490
Epoch: 13394, Training Loss: 0.01146
Epoch: 13395, Training Loss: 0.01305
Epoch: 13395, Training Loss: 0.01208
Epoch: 13395, Training Loss: 0.01490
Epoch: 13395, Training Loss: 0.01146
Epoch: 13396, Training Loss: 0.01305
Epoch: 13396, Training Loss: 0.01208
Epoch: 13396, Training Loss: 0.01490
Epoch: 13396, Training Loss: 0.01146
Epoch: 13397, Training Loss: 0.01305
Epoch: 13397, Training Loss: 0.01208
Epoch: 13397, Training Loss: 0.01490
Epoch: 13397, Training Loss: 0.01146
Epoch: 13398, Training Loss: 0.01305
Epoch: 13398, Training Loss: 0.01208
Epoch: 13398, Training Loss: 0.01490
Epoch: 13398, Training Loss: 0.01146
Epoch: 13399, Training Loss: 0.01305
Epoch: 13399, Training Loss: 0.01208
Epoch: 13399, Training Loss: 0.01490
Epoch: 13399, Training Loss: 0.01146
Epoch: 13400, Training Loss: 0.01305
Epoch: 13400, Training Loss: 0.01208
Epoch: 13400, Training Loss: 0.01490
Epoch: 13400, Training Loss: 0.01146
Epoch: 13401, Training Loss: 0.01305
Epoch: 13401, Training Loss: 0.01208
Epoch: 13401, Training Loss: 0.01490
Epoch: 13401, Training Loss: 0.01146
Epoch: 13402, Training Loss: 0.01305
Epoch: 13402, Training Loss: 0.01208
Epoch: 13402, Training Loss: 0.01490
Epoch: 13402, Training Loss: 0.01146
Epoch: 13403, Training Loss: 0.01305
Epoch: 13403, Training Loss: 0.01208
Epoch: 13403, Training Loss: 0.01490
Epoch: 13403, Training Loss: 0.01146
Epoch: 13404, Training Loss: 0.01305
Epoch: 13404, Training Loss: 0.01207
Epoch: 13404, Training Loss: 0.01490
Epoch: 13404, Training Loss: 0.01146
Epoch: 13405, Training Loss: 0.01305
Epoch: 13405, Training Loss: 0.01207
Epoch: 13405, Training Loss: 0.01490
Epoch: 13405, Training Loss: 0.01145
Epoch: 13406, Training Loss: 0.01305
Epoch: 13406, Training Loss: 0.01207
Epoch: 13406, Training Loss: 0.01490
Epoch: 13406, Training Loss: 0.01145
Epoch: 13407, Training Loss: 0.01305
Epoch: 13407, Training Loss: 0.01207
Epoch: 13407, Training Loss: 0.01490
Epoch: 13407, Training Loss: 0.01145
Epoch: 13408, Training Loss: 0.01304
Epoch: 13408, Training Loss: 0.01207
Epoch: 13408, Training Loss: 0.01489
Epoch: 13408, Training Loss: 0.01145
Epoch: 13409, Training Loss: 0.01304
Epoch: 13409, Training Loss: 0.01207
Epoch: 13409, Training Loss: 0.01489
Epoch: 13409, Training Loss: 0.01145
Epoch: 13410, Training Loss: 0.01304
Epoch: 13410, Training Loss: 0.01207
Epoch: 13410, Training Loss: 0.01489
Epoch: 13410, Training Loss: 0.01145
Epoch: 13411, Training Loss: 0.01304
Epoch: 13411, Training Loss: 0.01207
Epoch: 13411, Training Loss: 0.01489
Epoch: 13411, Training Loss: 0.01145
Epoch: 13412, Training Loss: 0.01304
Epoch: 13412, Training Loss: 0.01207
Epoch: 13412, Training Loss: 0.01489
Epoch: 13412, Training Loss: 0.01145
Epoch: 13413, Training Loss: 0.01304
Epoch: 13413, Training Loss: 0.01207
Epoch: 13413, Training Loss: 0.01489
Epoch: 13413, Training Loss: 0.01145
Epoch: 13414, Training Loss: 0.01304
Epoch: 13414, Training Loss: 0.01207
Epoch: 13414, Training Loss: 0.01489
Epoch: 13414, Training Loss: 0.01145
Epoch: 13415, Training Loss: 0.01304
Epoch: 13415, Training Loss: 0.01207
Epoch: 13415, Training Loss: 0.01489
Epoch: 13415, Training Loss: 0.01145
Epoch: 13416, Training Loss: 0.01304
Epoch: 13416, Training Loss: 0.01207
Epoch: 13416, Training Loss: 0.01489
Epoch: 13416, Training Loss: 0.01145
Epoch: 13417, Training Loss: 0.01304
Epoch: 13417, Training Loss: 0.01207
Epoch: 13417, Training Loss: 0.01489
Epoch: 13417, Training Loss: 0.01145
Epoch: 13418, Training Loss: 0.01304
Epoch: 13418, Training Loss: 0.01207
Epoch: 13418, Training Loss: 0.01489
Epoch: 13418, Training Loss: 0.01145
Epoch: 13419, Training Loss: 0.01304
Epoch: 13419, Training Loss: 0.01207
Epoch: 13419, Training Loss: 0.01489
Epoch: 13419, Training Loss: 0.01145
Epoch: 13420, Training Loss: 0.01304
Epoch: 13420, Training Loss: 0.01207
Epoch: 13420, Training Loss: 0.01489
Epoch: 13420, Training Loss: 0.01145
Epoch: 13421, Training Loss: 0.01304
Epoch: 13421, Training Loss: 0.01207
Epoch: 13421, Training Loss: 0.01489
Epoch: 13421, Training Loss: 0.01145
Epoch: 13422, Training Loss: 0.01304
Epoch: 13422, Training Loss: 0.01207
Epoch: 13422, Training Loss: 0.01489
Epoch: 13422, Training Loss: 0.01145
Epoch: 13423, Training Loss: 0.01304
Epoch: 13423, Training Loss: 0.01207
Epoch: 13423, Training Loss: 0.01489
Epoch: 13423, Training Loss: 0.01145
Epoch: 13424, Training Loss: 0.01304
Epoch: 13424, Training Loss: 0.01207
Epoch: 13424, Training Loss: 0.01488
Epoch: 13424, Training Loss: 0.01145
Epoch: 13425, Training Loss: 0.01304
Epoch: 13425, Training Loss: 0.01206
Epoch: 13425, Training Loss: 0.01488
Epoch: 13425, Training Loss: 0.01145
Epoch: 13426, Training Loss: 0.01303
Epoch: 13426, Training Loss: 0.01206
Epoch: 13426, Training Loss: 0.01488
Epoch: 13426, Training Loss: 0.01145
Epoch: 13427, Training Loss: 0.01303
Epoch: 13427, Training Loss: 0.01206
Epoch: 13427, Training Loss: 0.01488
Epoch: 13427, Training Loss: 0.01144
Epoch: 13428, Training Loss: 0.01303
Epoch: 13428, Training Loss: 0.01206
Epoch: 13428, Training Loss: 0.01488
Epoch: 13428, Training Loss: 0.01144
Epoch: 13429, Training Loss: 0.01303
Epoch: 13429, Training Loss: 0.01206
Epoch: 13429, Training Loss: 0.01488
Epoch: 13429, Training Loss: 0.01144
Epoch: 13430, Training Loss: 0.01303
Epoch: 13430, Training Loss: 0.01206
Epoch: 13430, Training Loss: 0.01488
Epoch: 13430, Training Loss: 0.01144
Epoch: 13431, Training Loss: 0.01303
Epoch: 13431, Training Loss: 0.01206
Epoch: 13431, Training Loss: 0.01488
Epoch: 13431, Training Loss: 0.01144
Epoch: 13432, Training Loss: 0.01303
Epoch: 13432, Training Loss: 0.01206
Epoch: 13432, Training Loss: 0.01488
Epoch: 13432, Training Loss: 0.01144
Epoch: 13433, Training Loss: 0.01303
Epoch: 13433, Training Loss: 0.01206
Epoch: 13433, Training Loss: 0.01488
Epoch: 13433, Training Loss: 0.01144
Epoch: 13434, Training Loss: 0.01303
Epoch: 13434, Training Loss: 0.01206
Epoch: 13434, Training Loss: 0.01488
Epoch: 13434, Training Loss: 0.01144
Epoch: 13435, Training Loss: 0.01303
Epoch: 13435, Training Loss: 0.01206
Epoch: 13435, Training Loss: 0.01488
Epoch: 13435, Training Loss: 0.01144
Epoch: 13436, Training Loss: 0.01303
Epoch: 13436, Training Loss: 0.01206
Epoch: 13436, Training Loss: 0.01488
Epoch: 13436, Training Loss: 0.01144
Epoch: 13437, Training Loss: 0.01303
Epoch: 13437, Training Loss: 0.01206
Epoch: 13437, Training Loss: 0.01488
Epoch: 13437, Training Loss: 0.01144
Epoch: 13438, Training Loss: 0.01303
Epoch: 13438, Training Loss: 0.01206
Epoch: 13438, Training Loss: 0.01488
Epoch: 13438, Training Loss: 0.01144
Epoch: 13439, Training Loss: 0.01303
Epoch: 13439, Training Loss: 0.01206
Epoch: 13439, Training Loss: 0.01488
Epoch: 13439, Training Loss: 0.01144
Epoch: 13440, Training Loss: 0.01303
Epoch: 13440, Training Loss: 0.01206
Epoch: 13440, Training Loss: 0.01487
Epoch: 13440, Training Loss: 0.01144
Epoch: 13441, Training Loss: 0.01303
Epoch: 13441, Training Loss: 0.01206
Epoch: 13441, Training Loss: 0.01487
Epoch: 13441, Training Loss: 0.01144
Epoch: 13442, Training Loss: 0.01303
Epoch: 13442, Training Loss: 0.01206
Epoch: 13442, Training Loss: 0.01487
Epoch: 13442, Training Loss: 0.01144
Epoch: 13443, Training Loss: 0.01303
Epoch: 13443, Training Loss: 0.01206
Epoch: 13443, Training Loss: 0.01487
Epoch: 13443, Training Loss: 0.01144
Epoch: 13444, Training Loss: 0.01303
Epoch: 13444, Training Loss: 0.01206
Epoch: 13444, Training Loss: 0.01487
Epoch: 13444, Training Loss: 0.01144
Epoch: 13445, Training Loss: 0.01302
Epoch: 13445, Training Loss: 0.01206
Epoch: 13445, Training Loss: 0.01487
Epoch: 13445, Training Loss: 0.01144
Epoch: 13446, Training Loss: 0.01302
Epoch: 13446, Training Loss: 0.01205
Epoch: 13446, Training Loss: 0.01487
Epoch: 13446, Training Loss: 0.01144
Epoch: 13447, Training Loss: 0.01302
Epoch: 13447, Training Loss: 0.01205
Epoch: 13447, Training Loss: 0.01487
Epoch: 13447, Training Loss: 0.01144
Epoch: 13448, Training Loss: 0.01302
Epoch: 13448, Training Loss: 0.01205
Epoch: 13448, Training Loss: 0.01487
Epoch: 13448, Training Loss: 0.01143
... [repetitive per-epoch output elided: across epochs 13448–13691 the four per-sample losses decrease only slightly, from (0.01302, 0.01205, 0.01487, 0.01143) to roughly (0.01289, 0.01194, 0.01472, 0.01132), still well above the 0.008 target] ...
Epoch: 13691, Training Loss: 0.01289
Epoch: 13691, Training Loss: 0.01194
Epoch: 13691, Training Loss: 0.01472
Epoch: 13691, Training Loss: 0.01132
Epoch: 13692, Training Loss: 0.01289
Epoch: 13692, Training Loss: 0.01194
Epoch: 13692, Training Loss: 0.01472
Epoch: 13692, Training Loss: 0.01132
Epoch: 13693, Training Loss: 0.01289
Epoch: 13693, Training Loss: 0.01194
Epoch: 13693, Training Loss: 0.01472
Epoch: 13693, Training Loss: 0.01132
Epoch: 13694, Training Loss: 0.01289
Epoch: 13694, Training Loss: 0.01194
Epoch: 13694, Training Loss: 0.01472
Epoch: 13694, Training Loss: 0.01132
Epoch: 13695, Training Loss: 0.01289
Epoch: 13695, Training Loss: 0.01194
Epoch: 13695, Training Loss: 0.01472
Epoch: 13695, Training Loss: 0.01132
Epoch: 13696, Training Loss: 0.01289
Epoch: 13696, Training Loss: 0.01194
Epoch: 13696, Training Loss: 0.01472
Epoch: 13696, Training Loss: 0.01132
Epoch: 13697, Training Loss: 0.01289
Epoch: 13697, Training Loss: 0.01193
Epoch: 13697, Training Loss: 0.01472
Epoch: 13697, Training Loss: 0.01132
Epoch: 13698, Training Loss: 0.01289
Epoch: 13698, Training Loss: 0.01193
Epoch: 13698, Training Loss: 0.01472
Epoch: 13698, Training Loss: 0.01132
Epoch: 13699, Training Loss: 0.01289
Epoch: 13699, Training Loss: 0.01193
Epoch: 13699, Training Loss: 0.01472
Epoch: 13699, Training Loss: 0.01132
Epoch: 13700, Training Loss: 0.01289
Epoch: 13700, Training Loss: 0.01193
Epoch: 13700, Training Loss: 0.01472
Epoch: 13700, Training Loss: 0.01132
Epoch: 13701, Training Loss: 0.01289
Epoch: 13701, Training Loss: 0.01193
Epoch: 13701, Training Loss: 0.01472
Epoch: 13701, Training Loss: 0.01132
Epoch: 13702, Training Loss: 0.01289
Epoch: 13702, Training Loss: 0.01193
Epoch: 13702, Training Loss: 0.01472
Epoch: 13702, Training Loss: 0.01132
Epoch: 13703, Training Loss: 0.01289
Epoch: 13703, Training Loss: 0.01193
Epoch: 13703, Training Loss: 0.01472
Epoch: 13703, Training Loss: 0.01132
Epoch: 13704, Training Loss: 0.01289
Epoch: 13704, Training Loss: 0.01193
Epoch: 13704, Training Loss: 0.01472
Epoch: 13704, Training Loss: 0.01132
Epoch: 13705, Training Loss: 0.01289
Epoch: 13705, Training Loss: 0.01193
Epoch: 13705, Training Loss: 0.01471
Epoch: 13705, Training Loss: 0.01132
Epoch: 13706, Training Loss: 0.01289
Epoch: 13706, Training Loss: 0.01193
Epoch: 13706, Training Loss: 0.01471
Epoch: 13706, Training Loss: 0.01132
Epoch: 13707, Training Loss: 0.01289
Epoch: 13707, Training Loss: 0.01193
Epoch: 13707, Training Loss: 0.01471
Epoch: 13707, Training Loss: 0.01132
Epoch: 13708, Training Loss: 0.01289
Epoch: 13708, Training Loss: 0.01193
Epoch: 13708, Training Loss: 0.01471
Epoch: 13708, Training Loss: 0.01132
Epoch: 13709, Training Loss: 0.01289
Epoch: 13709, Training Loss: 0.01193
Epoch: 13709, Training Loss: 0.01471
Epoch: 13709, Training Loss: 0.01132
Epoch: 13710, Training Loss: 0.01288
Epoch: 13710, Training Loss: 0.01193
Epoch: 13710, Training Loss: 0.01471
Epoch: 13710, Training Loss: 0.01132
Epoch: 13711, Training Loss: 0.01288
Epoch: 13711, Training Loss: 0.01193
Epoch: 13711, Training Loss: 0.01471
Epoch: 13711, Training Loss: 0.01132
Epoch: 13712, Training Loss: 0.01288
Epoch: 13712, Training Loss: 0.01193
Epoch: 13712, Training Loss: 0.01471
Epoch: 13712, Training Loss: 0.01131
Epoch: 13713, Training Loss: 0.01288
Epoch: 13713, Training Loss: 0.01193
Epoch: 13713, Training Loss: 0.01471
Epoch: 13713, Training Loss: 0.01131
Epoch: 13714, Training Loss: 0.01288
Epoch: 13714, Training Loss: 0.01193
Epoch: 13714, Training Loss: 0.01471
Epoch: 13714, Training Loss: 0.01131
Epoch: 13715, Training Loss: 0.01288
Epoch: 13715, Training Loss: 0.01193
Epoch: 13715, Training Loss: 0.01471
Epoch: 13715, Training Loss: 0.01131
Epoch: 13716, Training Loss: 0.01288
Epoch: 13716, Training Loss: 0.01193
Epoch: 13716, Training Loss: 0.01471
Epoch: 13716, Training Loss: 0.01131
Epoch: 13717, Training Loss: 0.01288
Epoch: 13717, Training Loss: 0.01193
Epoch: 13717, Training Loss: 0.01471
Epoch: 13717, Training Loss: 0.01131
Epoch: 13718, Training Loss: 0.01288
Epoch: 13718, Training Loss: 0.01192
Epoch: 13718, Training Loss: 0.01471
Epoch: 13718, Training Loss: 0.01131
Epoch: 13719, Training Loss: 0.01288
Epoch: 13719, Training Loss: 0.01192
Epoch: 13719, Training Loss: 0.01471
Epoch: 13719, Training Loss: 0.01131
Epoch: 13720, Training Loss: 0.01288
Epoch: 13720, Training Loss: 0.01192
Epoch: 13720, Training Loss: 0.01471
Epoch: 13720, Training Loss: 0.01131
Epoch: 13721, Training Loss: 0.01288
Epoch: 13721, Training Loss: 0.01192
Epoch: 13721, Training Loss: 0.01471
Epoch: 13721, Training Loss: 0.01131
Epoch: 13722, Training Loss: 0.01288
Epoch: 13722, Training Loss: 0.01192
Epoch: 13722, Training Loss: 0.01470
Epoch: 13722, Training Loss: 0.01131
Epoch: 13723, Training Loss: 0.01288
Epoch: 13723, Training Loss: 0.01192
Epoch: 13723, Training Loss: 0.01470
Epoch: 13723, Training Loss: 0.01131
Epoch: 13724, Training Loss: 0.01288
Epoch: 13724, Training Loss: 0.01192
Epoch: 13724, Training Loss: 0.01470
Epoch: 13724, Training Loss: 0.01131
Epoch: 13725, Training Loss: 0.01288
Epoch: 13725, Training Loss: 0.01192
Epoch: 13725, Training Loss: 0.01470
Epoch: 13725, Training Loss: 0.01131
Epoch: 13726, Training Loss: 0.01288
Epoch: 13726, Training Loss: 0.01192
Epoch: 13726, Training Loss: 0.01470
Epoch: 13726, Training Loss: 0.01131
Epoch: 13727, Training Loss: 0.01288
Epoch: 13727, Training Loss: 0.01192
Epoch: 13727, Training Loss: 0.01470
Epoch: 13727, Training Loss: 0.01131
Epoch: 13728, Training Loss: 0.01288
Epoch: 13728, Training Loss: 0.01192
Epoch: 13728, Training Loss: 0.01470
Epoch: 13728, Training Loss: 0.01131
Epoch: 13729, Training Loss: 0.01287
Epoch: 13729, Training Loss: 0.01192
Epoch: 13729, Training Loss: 0.01470
Epoch: 13729, Training Loss: 0.01131
Epoch: 13730, Training Loss: 0.01287
Epoch: 13730, Training Loss: 0.01192
Epoch: 13730, Training Loss: 0.01470
Epoch: 13730, Training Loss: 0.01131
Epoch: 13731, Training Loss: 0.01287
Epoch: 13731, Training Loss: 0.01192
Epoch: 13731, Training Loss: 0.01470
Epoch: 13731, Training Loss: 0.01131
Epoch: 13732, Training Loss: 0.01287
Epoch: 13732, Training Loss: 0.01192
Epoch: 13732, Training Loss: 0.01470
Epoch: 13732, Training Loss: 0.01131
Epoch: 13733, Training Loss: 0.01287
Epoch: 13733, Training Loss: 0.01192
Epoch: 13733, Training Loss: 0.01470
Epoch: 13733, Training Loss: 0.01131
Epoch: 13734, Training Loss: 0.01287
Epoch: 13734, Training Loss: 0.01192
Epoch: 13734, Training Loss: 0.01470
Epoch: 13734, Training Loss: 0.01130
Epoch: 13735, Training Loss: 0.01287
Epoch: 13735, Training Loss: 0.01192
Epoch: 13735, Training Loss: 0.01470
Epoch: 13735, Training Loss: 0.01130
Epoch: 13736, Training Loss: 0.01287
Epoch: 13736, Training Loss: 0.01192
Epoch: 13736, Training Loss: 0.01470
Epoch: 13736, Training Loss: 0.01130
Epoch: 13737, Training Loss: 0.01287
Epoch: 13737, Training Loss: 0.01192
Epoch: 13737, Training Loss: 0.01470
Epoch: 13737, Training Loss: 0.01130
Epoch: 13738, Training Loss: 0.01287
Epoch: 13738, Training Loss: 0.01192
Epoch: 13738, Training Loss: 0.01470
Epoch: 13738, Training Loss: 0.01130
Epoch: 13739, Training Loss: 0.01287
Epoch: 13739, Training Loss: 0.01192
Epoch: 13739, Training Loss: 0.01469
Epoch: 13739, Training Loss: 0.01130
Epoch: 13740, Training Loss: 0.01287
Epoch: 13740, Training Loss: 0.01191
Epoch: 13740, Training Loss: 0.01469
Epoch: 13740, Training Loss: 0.01130
Epoch: 13741, Training Loss: 0.01287
Epoch: 13741, Training Loss: 0.01191
Epoch: 13741, Training Loss: 0.01469
Epoch: 13741, Training Loss: 0.01130
Epoch: 13742, Training Loss: 0.01287
Epoch: 13742, Training Loss: 0.01191
Epoch: 13742, Training Loss: 0.01469
Epoch: 13742, Training Loss: 0.01130
Epoch: 13743, Training Loss: 0.01287
Epoch: 13743, Training Loss: 0.01191
Epoch: 13743, Training Loss: 0.01469
Epoch: 13743, Training Loss: 0.01130
Epoch: 13744, Training Loss: 0.01287
Epoch: 13744, Training Loss: 0.01191
Epoch: 13744, Training Loss: 0.01469
Epoch: 13744, Training Loss: 0.01130
Epoch: 13745, Training Loss: 0.01287
Epoch: 13745, Training Loss: 0.01191
Epoch: 13745, Training Loss: 0.01469
Epoch: 13745, Training Loss: 0.01130
Epoch: 13746, Training Loss: 0.01287
Epoch: 13746, Training Loss: 0.01191
Epoch: 13746, Training Loss: 0.01469
Epoch: 13746, Training Loss: 0.01130
Epoch: 13747, Training Loss: 0.01287
Epoch: 13747, Training Loss: 0.01191
Epoch: 13747, Training Loss: 0.01469
Epoch: 13747, Training Loss: 0.01130
Epoch: 13748, Training Loss: 0.01287
Epoch: 13748, Training Loss: 0.01191
Epoch: 13748, Training Loss: 0.01469
Epoch: 13748, Training Loss: 0.01130
Epoch: 13749, Training Loss: 0.01286
Epoch: 13749, Training Loss: 0.01191
Epoch: 13749, Training Loss: 0.01469
Epoch: 13749, Training Loss: 0.01130
Epoch: 13750, Training Loss: 0.01286
Epoch: 13750, Training Loss: 0.01191
Epoch: 13750, Training Loss: 0.01469
Epoch: 13750, Training Loss: 0.01130
Epoch: 13751, Training Loss: 0.01286
Epoch: 13751, Training Loss: 0.01191
Epoch: 13751, Training Loss: 0.01469
Epoch: 13751, Training Loss: 0.01130
Epoch: 13752, Training Loss: 0.01286
Epoch: 13752, Training Loss: 0.01191
Epoch: 13752, Training Loss: 0.01469
Epoch: 13752, Training Loss: 0.01130
Epoch: 13753, Training Loss: 0.01286
Epoch: 13753, Training Loss: 0.01191
Epoch: 13753, Training Loss: 0.01469
Epoch: 13753, Training Loss: 0.01130
Epoch: 13754, Training Loss: 0.01286
Epoch: 13754, Training Loss: 0.01191
Epoch: 13754, Training Loss: 0.01469
Epoch: 13754, Training Loss: 0.01130
Epoch: 13755, Training Loss: 0.01286
Epoch: 13755, Training Loss: 0.01191
Epoch: 13755, Training Loss: 0.01469
Epoch: 13755, Training Loss: 0.01130
Epoch: 13756, Training Loss: 0.01286
Epoch: 13756, Training Loss: 0.01191
Epoch: 13756, Training Loss: 0.01468
Epoch: 13756, Training Loss: 0.01129
Epoch: 13757, Training Loss: 0.01286
Epoch: 13757, Training Loss: 0.01191
Epoch: 13757, Training Loss: 0.01468
Epoch: 13757, Training Loss: 0.01129
Epoch: 13758, Training Loss: 0.01286
Epoch: 13758, Training Loss: 0.01191
Epoch: 13758, Training Loss: 0.01468
Epoch: 13758, Training Loss: 0.01129
Epoch: 13759, Training Loss: 0.01286
Epoch: 13759, Training Loss: 0.01191
Epoch: 13759, Training Loss: 0.01468
Epoch: 13759, Training Loss: 0.01129
Epoch: 13760, Training Loss: 0.01286
Epoch: 13760, Training Loss: 0.01191
Epoch: 13760, Training Loss: 0.01468
Epoch: 13760, Training Loss: 0.01129
Epoch: 13761, Training Loss: 0.01286
Epoch: 13761, Training Loss: 0.01190
Epoch: 13761, Training Loss: 0.01468
Epoch: 13761, Training Loss: 0.01129
Epoch: 13762, Training Loss: 0.01286
Epoch: 13762, Training Loss: 0.01190
Epoch: 13762, Training Loss: 0.01468
Epoch: 13762, Training Loss: 0.01129
Epoch: 13763, Training Loss: 0.01286
Epoch: 13763, Training Loss: 0.01190
Epoch: 13763, Training Loss: 0.01468
Epoch: 13763, Training Loss: 0.01129
Epoch: 13764, Training Loss: 0.01286
Epoch: 13764, Training Loss: 0.01190
Epoch: 13764, Training Loss: 0.01468
Epoch: 13764, Training Loss: 0.01129
Epoch: 13765, Training Loss: 0.01286
Epoch: 13765, Training Loss: 0.01190
Epoch: 13765, Training Loss: 0.01468
Epoch: 13765, Training Loss: 0.01129
Epoch: 13766, Training Loss: 0.01286
Epoch: 13766, Training Loss: 0.01190
Epoch: 13766, Training Loss: 0.01468
Epoch: 13766, Training Loss: 0.01129
Epoch: 13767, Training Loss: 0.01286
Epoch: 13767, Training Loss: 0.01190
Epoch: 13767, Training Loss: 0.01468
Epoch: 13767, Training Loss: 0.01129
Epoch: 13768, Training Loss: 0.01285
Epoch: 13768, Training Loss: 0.01190
Epoch: 13768, Training Loss: 0.01468
Epoch: 13768, Training Loss: 0.01129
Epoch: 13769, Training Loss: 0.01285
Epoch: 13769, Training Loss: 0.01190
Epoch: 13769, Training Loss: 0.01468
Epoch: 13769, Training Loss: 0.01129
Epoch: 13770, Training Loss: 0.01285
Epoch: 13770, Training Loss: 0.01190
Epoch: 13770, Training Loss: 0.01468
Epoch: 13770, Training Loss: 0.01129
Epoch: 13771, Training Loss: 0.01285
Epoch: 13771, Training Loss: 0.01190
Epoch: 13771, Training Loss: 0.01468
Epoch: 13771, Training Loss: 0.01129
Epoch: 13772, Training Loss: 0.01285
Epoch: 13772, Training Loss: 0.01190
Epoch: 13772, Training Loss: 0.01467
Epoch: 13772, Training Loss: 0.01129
Epoch: 13773, Training Loss: 0.01285
Epoch: 13773, Training Loss: 0.01190
Epoch: 13773, Training Loss: 0.01467
Epoch: 13773, Training Loss: 0.01129
Epoch: 13774, Training Loss: 0.01285
Epoch: 13774, Training Loss: 0.01190
Epoch: 13774, Training Loss: 0.01467
Epoch: 13774, Training Loss: 0.01129
Epoch: 13775, Training Loss: 0.01285
Epoch: 13775, Training Loss: 0.01190
Epoch: 13775, Training Loss: 0.01467
Epoch: 13775, Training Loss: 0.01129
Epoch: 13776, Training Loss: 0.01285
Epoch: 13776, Training Loss: 0.01190
Epoch: 13776, Training Loss: 0.01467
Epoch: 13776, Training Loss: 0.01129
Epoch: 13777, Training Loss: 0.01285
Epoch: 13777, Training Loss: 0.01190
Epoch: 13777, Training Loss: 0.01467
Epoch: 13777, Training Loss: 0.01129
Epoch: 13778, Training Loss: 0.01285
Epoch: 13778, Training Loss: 0.01190
Epoch: 13778, Training Loss: 0.01467
Epoch: 13778, Training Loss: 0.01129
Epoch: 13779, Training Loss: 0.01285
Epoch: 13779, Training Loss: 0.01190
Epoch: 13779, Training Loss: 0.01467
Epoch: 13779, Training Loss: 0.01128
Epoch: 13780, Training Loss: 0.01285
Epoch: 13780, Training Loss: 0.01190
Epoch: 13780, Training Loss: 0.01467
Epoch: 13780, Training Loss: 0.01128
Epoch: 13781, Training Loss: 0.01285
Epoch: 13781, Training Loss: 0.01190
Epoch: 13781, Training Loss: 0.01467
Epoch: 13781, Training Loss: 0.01128
Epoch: 13782, Training Loss: 0.01285
Epoch: 13782, Training Loss: 0.01190
Epoch: 13782, Training Loss: 0.01467
Epoch: 13782, Training Loss: 0.01128
Epoch: 13783, Training Loss: 0.01285
Epoch: 13783, Training Loss: 0.01189
Epoch: 13783, Training Loss: 0.01467
Epoch: 13783, Training Loss: 0.01128
Epoch: 13784, Training Loss: 0.01285
Epoch: 13784, Training Loss: 0.01189
Epoch: 13784, Training Loss: 0.01467
Epoch: 13784, Training Loss: 0.01128
Epoch: 13785, Training Loss: 0.01285
Epoch: 13785, Training Loss: 0.01189
Epoch: 13785, Training Loss: 0.01467
Epoch: 13785, Training Loss: 0.01128
Epoch: 13786, Training Loss: 0.01285
Epoch: 13786, Training Loss: 0.01189
Epoch: 13786, Training Loss: 0.01467
Epoch: 13786, Training Loss: 0.01128
Epoch: 13787, Training Loss: 0.01284
Epoch: 13787, Training Loss: 0.01189
Epoch: 13787, Training Loss: 0.01467
Epoch: 13787, Training Loss: 0.01128
Epoch: 13788, Training Loss: 0.01284
Epoch: 13788, Training Loss: 0.01189
Epoch: 13788, Training Loss: 0.01467
Epoch: 13788, Training Loss: 0.01128
Epoch: 13789, Training Loss: 0.01284
Epoch: 13789, Training Loss: 0.01189
Epoch: 13789, Training Loss: 0.01466
Epoch: 13789, Training Loss: 0.01128
Epoch: 13790, Training Loss: 0.01284
Epoch: 13790, Training Loss: 0.01189
Epoch: 13790, Training Loss: 0.01466
Epoch: 13790, Training Loss: 0.01128
Epoch: 13791, Training Loss: 0.01284
Epoch: 13791, Training Loss: 0.01189
Epoch: 13791, Training Loss: 0.01466
Epoch: 13791, Training Loss: 0.01128
Epoch: 13792, Training Loss: 0.01284
Epoch: 13792, Training Loss: 0.01189
Epoch: 13792, Training Loss: 0.01466
Epoch: 13792, Training Loss: 0.01128
Epoch: 13793, Training Loss: 0.01284
Epoch: 13793, Training Loss: 0.01189
Epoch: 13793, Training Loss: 0.01466
Epoch: 13793, Training Loss: 0.01128
Epoch: 13794, Training Loss: 0.01284
Epoch: 13794, Training Loss: 0.01189
Epoch: 13794, Training Loss: 0.01466
Epoch: 13794, Training Loss: 0.01128
Epoch: 13795, Training Loss: 0.01284
Epoch: 13795, Training Loss: 0.01189
Epoch: 13795, Training Loss: 0.01466
Epoch: 13795, Training Loss: 0.01128
Epoch: 13796, Training Loss: 0.01284
Epoch: 13796, Training Loss: 0.01189
Epoch: 13796, Training Loss: 0.01466
Epoch: 13796, Training Loss: 0.01128
Epoch: 13797, Training Loss: 0.01284
Epoch: 13797, Training Loss: 0.01189
Epoch: 13797, Training Loss: 0.01466
Epoch: 13797, Training Loss: 0.01128
Epoch: 13798, Training Loss: 0.01284
Epoch: 13798, Training Loss: 0.01189
Epoch: 13798, Training Loss: 0.01466
Epoch: 13798, Training Loss: 0.01128
Epoch: 13799, Training Loss: 0.01284
Epoch: 13799, Training Loss: 0.01189
Epoch: 13799, Training Loss: 0.01466
Epoch: 13799, Training Loss: 0.01128
Epoch: 13800, Training Loss: 0.01284
Epoch: 13800, Training Loss: 0.01189
Epoch: 13800, Training Loss: 0.01466
Epoch: 13800, Training Loss: 0.01128
Epoch: 13801, Training Loss: 0.01284
Epoch: 13801, Training Loss: 0.01189
Epoch: 13801, Training Loss: 0.01466
Epoch: 13801, Training Loss: 0.01127
Epoch: 13802, Training Loss: 0.01284
Epoch: 13802, Training Loss: 0.01189
Epoch: 13802, Training Loss: 0.01466
Epoch: 13802, Training Loss: 0.01127
Epoch: 13803, Training Loss: 0.01284
Epoch: 13803, Training Loss: 0.01189
Epoch: 13803, Training Loss: 0.01466
Epoch: 13803, Training Loss: 0.01127
Epoch: 13804, Training Loss: 0.01284
Epoch: 13804, Training Loss: 0.01188
Epoch: 13804, Training Loss: 0.01466
Epoch: 13804, Training Loss: 0.01127
Epoch: 13805, Training Loss: 0.01284
Epoch: 13805, Training Loss: 0.01188
Epoch: 13805, Training Loss: 0.01466
Epoch: 13805, Training Loss: 0.01127
Epoch: 13806, Training Loss: 0.01284
Epoch: 13806, Training Loss: 0.01188
Epoch: 13806, Training Loss: 0.01465
Epoch: 13806, Training Loss: 0.01127
Epoch: 13807, Training Loss: 0.01283
Epoch: 13807, Training Loss: 0.01188
Epoch: 13807, Training Loss: 0.01465
Epoch: 13807, Training Loss: 0.01127
Epoch: 13808, Training Loss: 0.01283
Epoch: 13808, Training Loss: 0.01188
Epoch: 13808, Training Loss: 0.01465
Epoch: 13808, Training Loss: 0.01127
Epoch: 13809, Training Loss: 0.01283
Epoch: 13809, Training Loss: 0.01188
Epoch: 13809, Training Loss: 0.01465
Epoch: 13809, Training Loss: 0.01127
Epoch: 13810, Training Loss: 0.01283
Epoch: 13810, Training Loss: 0.01188
Epoch: 13810, Training Loss: 0.01465
Epoch: 13810, Training Loss: 0.01127
Epoch: 13811, Training Loss: 0.01283
Epoch: 13811, Training Loss: 0.01188
Epoch: 13811, Training Loss: 0.01465
Epoch: 13811, Training Loss: 0.01127
Epoch: 13812, Training Loss: 0.01283
Epoch: 13812, Training Loss: 0.01188
Epoch: 13812, Training Loss: 0.01465
Epoch: 13812, Training Loss: 0.01127
Epoch: 13813, Training Loss: 0.01283
Epoch: 13813, Training Loss: 0.01188
Epoch: 13813, Training Loss: 0.01465
Epoch: 13813, Training Loss: 0.01127
Epoch: 13814, Training Loss: 0.01283
Epoch: 13814, Training Loss: 0.01188
Epoch: 13814, Training Loss: 0.01465
Epoch: 13814, Training Loss: 0.01127
Epoch: 13815, Training Loss: 0.01283
Epoch: 13815, Training Loss: 0.01188
Epoch: 13815, Training Loss: 0.01465
Epoch: 13815, Training Loss: 0.01127
Epoch: 13816, Training Loss: 0.01283
Epoch: 13816, Training Loss: 0.01188
Epoch: 13816, Training Loss: 0.01465
Epoch: 13816, Training Loss: 0.01127
Epoch: 13817, Training Loss: 0.01283
Epoch: 13817, Training Loss: 0.01188
Epoch: 13817, Training Loss: 0.01465
Epoch: 13817, Training Loss: 0.01127
Epoch: 13818, Training Loss: 0.01283
Epoch: 13818, Training Loss: 0.01188
Epoch: 13818, Training Loss: 0.01465
Epoch: 13818, Training Loss: 0.01127
Epoch: 13819, Training Loss: 0.01283
Epoch: 13819, Training Loss: 0.01188
Epoch: 13819, Training Loss: 0.01465
Epoch: 13819, Training Loss: 0.01127
Epoch: 13820, Training Loss: 0.01283
Epoch: 13820, Training Loss: 0.01188
Epoch: 13820, Training Loss: 0.01465
Epoch: 13820, Training Loss: 0.01127
Epoch: 13821, Training Loss: 0.01283
Epoch: 13821, Training Loss: 0.01188
Epoch: 13821, Training Loss: 0.01465
Epoch: 13821, Training Loss: 0.01127
Epoch: 13822, Training Loss: 0.01283
Epoch: 13822, Training Loss: 0.01188
Epoch: 13822, Training Loss: 0.01465
Epoch: 13822, Training Loss: 0.01127
Epoch: 13823, Training Loss: 0.01283
Epoch: 13823, Training Loss: 0.01188
Epoch: 13823, Training Loss: 0.01464
Epoch: 13823, Training Loss: 0.01127
Epoch: 13824, Training Loss: 0.01283
Epoch: 13824, Training Loss: 0.01188
Epoch: 13824, Training Loss: 0.01464
Epoch: 13824, Training Loss: 0.01126
Epoch: 13825, Training Loss: 0.01283
Epoch: 13825, Training Loss: 0.01188
Epoch: 13825, Training Loss: 0.01464
Epoch: 13825, Training Loss: 0.01126
Epoch: 13826, Training Loss: 0.01282
Epoch: 13826, Training Loss: 0.01187
Epoch: 13826, Training Loss: 0.01464
Epoch: 13826, Training Loss: 0.01126
Epoch: 13827, Training Loss: 0.01282
Epoch: 13827, Training Loss: 0.01187
Epoch: 13827, Training Loss: 0.01464
Epoch: 13827, Training Loss: 0.01126
Epoch: 13828, Training Loss: 0.01282
Epoch: 13828, Training Loss: 0.01187
Epoch: 13828, Training Loss: 0.01464
Epoch: 13828, Training Loss: 0.01126
Epoch: 13829, Training Loss: 0.01282
Epoch: 13829, Training Loss: 0.01187
Epoch: 13829, Training Loss: 0.01464
Epoch: 13829, Training Loss: 0.01126
Epoch: 13830, Training Loss: 0.01282
Epoch: 13830, Training Loss: 0.01187
Epoch: 13830, Training Loss: 0.01464
Epoch: 13830, Training Loss: 0.01126
Epoch: 13831, Training Loss: 0.01282
Epoch: 13831, Training Loss: 0.01187
Epoch: 13831, Training Loss: 0.01464
Epoch: 13831, Training Loss: 0.01126
Epoch: 13832, Training Loss: 0.01282
Epoch: 13832, Training Loss: 0.01187
Epoch: 13832, Training Loss: 0.01464
Epoch: 13832, Training Loss: 0.01126
Epoch: 13833, Training Loss: 0.01282
Epoch: 13833, Training Loss: 0.01187
Epoch: 13833, Training Loss: 0.01464
Epoch: 13833, Training Loss: 0.01126
Epoch: 13834, Training Loss: 0.01282
Epoch: 13834, Training Loss: 0.01187
Epoch: 13834, Training Loss: 0.01464
Epoch: 13834, Training Loss: 0.01126
Epoch: 13835, Training Loss: 0.01282
Epoch: 13835, Training Loss: 0.01187
Epoch: 13835, Training Loss: 0.01464
Epoch: 13835, Training Loss: 0.01126
Epoch: 13836, Training Loss: 0.01282
Epoch: 13836, Training Loss: 0.01187
Epoch: 13836, Training Loss: 0.01464
Epoch: 13836, Training Loss: 0.01126
Epoch: 13837, Training Loss: 0.01282
Epoch: 13837, Training Loss: 0.01187
Epoch: 13837, Training Loss: 0.01464
Epoch: 13837, Training Loss: 0.01126
Epoch: 13838, Training Loss: 0.01282
Epoch: 13838, Training Loss: 0.01187
Epoch: 13838, Training Loss: 0.01464
Epoch: 13838, Training Loss: 0.01126
Epoch: 13839, Training Loss: 0.01282
Epoch: 13839, Training Loss: 0.01187
Epoch: 13839, Training Loss: 0.01464
Epoch: 13839, Training Loss: 0.01126
Epoch: 13840, Training Loss: 0.01282
Epoch: 13840, Training Loss: 0.01187
Epoch: 13840, Training Loss: 0.01463
Epoch: 13840, Training Loss: 0.01126
Epoch: 13841, Training Loss: 0.01282
Epoch: 13841, Training Loss: 0.01187
Epoch: 13841, Training Loss: 0.01463
Epoch: 13841, Training Loss: 0.01126
Epoch: 13842, Training Loss: 0.01282
Epoch: 13842, Training Loss: 0.01187
Epoch: 13842, Training Loss: 0.01463
Epoch: 13842, Training Loss: 0.01126
Epoch: 13843, Training Loss: 0.01282
Epoch: 13843, Training Loss: 0.01187
Epoch: 13843, Training Loss: 0.01463
Epoch: 13843, Training Loss: 0.01126
Epoch: 13844, Training Loss: 0.01282
Epoch: 13844, Training Loss: 0.01187
Epoch: 13844, Training Loss: 0.01463
Epoch: 13844, Training Loss: 0.01126
Epoch: 13845, Training Loss: 0.01282
Epoch: 13845, Training Loss: 0.01187
Epoch: 13845, Training Loss: 0.01463
Epoch: 13845, Training Loss: 0.01126
Epoch: 13846, Training Loss: 0.01281
Epoch: 13846, Training Loss: 0.01187
Epoch: 13846, Training Loss: 0.01463
Epoch: 13846, Training Loss: 0.01126
Epoch: 13847, Training Loss: 0.01281
Epoch: 13847, Training Loss: 0.01186
Epoch: 13847, Training Loss: 0.01463
Epoch: 13847, Training Loss: 0.01125
Epoch: 13848, Training Loss: 0.01281
Epoch: 13848, Training Loss: 0.01186
Epoch: 13848, Training Loss: 0.01463
Epoch: 13848, Training Loss: 0.01125
Epoch: 13849, Training Loss: 0.01281
Epoch: 13849, Training Loss: 0.01186
Epoch: 13849, Training Loss: 0.01463
Epoch: 13849, Training Loss: 0.01125
Epoch: 13850, Training Loss: 0.01281
Epoch: 13850, Training Loss: 0.01186
Epoch: 13850, Training Loss: 0.01463
Epoch: 13850, Training Loss: 0.01125
Epoch: 13851, Training Loss: 0.01281
Epoch: 13851, Training Loss: 0.01186
Epoch: 13851, Training Loss: 0.01463
Epoch: 13851, Training Loss: 0.01125
Epoch: 13852, Training Loss: 0.01281
Epoch: 13852, Training Loss: 0.01186
Epoch: 13852, Training Loss: 0.01463
Epoch: 13852, Training Loss: 0.01125
Epoch: 13853, Training Loss: 0.01281
Epoch: 13853, Training Loss: 0.01186
Epoch: 13853, Training Loss: 0.01463
Epoch: 13853, Training Loss: 0.01125
Epoch: 13854, Training Loss: 0.01281
Epoch: 13854, Training Loss: 0.01186
Epoch: 13854, Training Loss: 0.01463
Epoch: 13854, Training Loss: 0.01125
Epoch: 13855, Training Loss: 0.01281
Epoch: 13855, Training Loss: 0.01186
Epoch: 13855, Training Loss: 0.01463
Epoch: 13855, Training Loss: 0.01125
Epoch: 13856, Training Loss: 0.01281
Epoch: 13856, Training Loss: 0.01186
Epoch: 13856, Training Loss: 0.01463
Epoch: 13856, Training Loss: 0.01125
Epoch: 13857, Training Loss: 0.01281
Epoch: 13857, Training Loss: 0.01186
Epoch: 13857, Training Loss: 0.01463
Epoch: 13857, Training Loss: 0.01125
Epoch: 13858, Training Loss: 0.01281
Epoch: 13858, Training Loss: 0.01186
Epoch: 13858, Training Loss: 0.01462
Epoch: 13858, Training Loss: 0.01125
Epoch: 13859, Training Loss: 0.01281
Epoch: 13859, Training Loss: 0.01186
Epoch: 13859, Training Loss: 0.01462
Epoch: 13859, Training Loss: 0.01125
Epoch: 13860, Training Loss: 0.01281
Epoch: 13860, Training Loss: 0.01186
Epoch: 13860, Training Loss: 0.01462
Epoch: 13860, Training Loss: 0.01125
Epoch: 13861, Training Loss: 0.01281
Epoch: 13861, Training Loss: 0.01186
Epoch: 13861, Training Loss: 0.01462
Epoch: 13861, Training Loss: 0.01125
Epoch: 13862, Training Loss: 0.01281
Epoch: 13862, Training Loss: 0.01186
Epoch: 13862, Training Loss: 0.01462
Epoch: 13862, Training Loss: 0.01125
Epoch: 13863, Training Loss: 0.01281
Epoch: 13863, Training Loss: 0.01186
Epoch: 13863, Training Loss: 0.01462
Epoch: 13863, Training Loss: 0.01125
Epoch: 13864, Training Loss: 0.01281
Epoch: 13864, Training Loss: 0.01186
Epoch: 13864, Training Loss: 0.01462
Epoch: 13864, Training Loss: 0.01125
Epoch: 13865, Training Loss: 0.01280
Epoch: 13865, Training Loss: 0.01186
Epoch: 13865, Training Loss: 0.01462
Epoch: 13865, Training Loss: 0.01125
Epoch: 13866, Training Loss: 0.01280
Epoch: 13866, Training Loss: 0.01186
Epoch: 13866, Training Loss: 0.01462
Epoch: 13866, Training Loss: 0.01125
Epoch: 13867, Training Loss: 0.01280
Epoch: 13867, Training Loss: 0.01186
Epoch: 13867, Training Loss: 0.01462
Epoch: 13867, Training Loss: 0.01125
Epoch: 13868, Training Loss: 0.01280
Epoch: 13868, Training Loss: 0.01186
Epoch: 13868, Training Loss: 0.01462
Epoch: 13868, Training Loss: 0.01125
Epoch: 13869, Training Loss: 0.01280
Epoch: 13869, Training Loss: 0.01185
Epoch: 13869, Training Loss: 0.01462
Epoch: 13869, Training Loss: 0.01124
Epoch: 13870, Training Loss: 0.01280
Epoch: 13870, Training Loss: 0.01185
Epoch: 13870, Training Loss: 0.01462
Epoch: 13870, Training Loss: 0.01124
Epoch: 13871, Training Loss: 0.01280
Epoch: 13871, Training Loss: 0.01185
Epoch: 13871, Training Loss: 0.01462
Epoch: 13871, Training Loss: 0.01124
Epoch: 13872, Training Loss: 0.01280
Epoch: 13872, Training Loss: 0.01185
Epoch: 13872, Training Loss: 0.01462
Epoch: 13872, Training Loss: 0.01124
Epoch: 13873, Training Loss: 0.01280
Epoch: 13873, Training Loss: 0.01185
Epoch: 13873, Training Loss: 0.01462
Epoch: 13873, Training Loss: 0.01124
Epoch: 13874, Training Loss: 0.01280
Epoch: 13874, Training Loss: 0.01185
Epoch: 13874, Training Loss: 0.01462
Epoch: 13874, Training Loss: 0.01124
Epoch: 13875, Training Loss: 0.01280
Epoch: 13875, Training Loss: 0.01185
Epoch: 13875, Training Loss: 0.01461
Epoch: 13875, Training Loss: 0.01124
Epoch: 13876, Training Loss: 0.01280
Epoch: 13876, Training Loss: 0.01185
Epoch: 13876, Training Loss: 0.01461
Epoch: 13876, Training Loss: 0.01124
Epoch: 13877, Training Loss: 0.01280
Epoch: 13877, Training Loss: 0.01185
Epoch: 13877, Training Loss: 0.01461
Epoch: 13877, Training Loss: 0.01124
Epoch: 13878, Training Loss: 0.01280
Epoch: 13878, Training Loss: 0.01185
Epoch: 13878, Training Loss: 0.01461
Epoch: 13878, Training Loss: 0.01124
Epoch: 13879, Training Loss: 0.01280
Epoch: 13879, Training Loss: 0.01185
Epoch: 13879, Training Loss: 0.01461
Epoch: 13879, Training Loss: 0.01124
Epoch: 13880, Training Loss: 0.01280
Epoch: 13880, Training Loss: 0.01185
[Training log truncated: the network prints one loss per XOR sample per epoch. The four per-sample losses decrease very slowly, from about 0.01280 / 0.01185 / 0.01461 / 0.01124 at epoch 13880 to about 0.01267 / 0.01174 / 0.01447 / 0.01113 at epoch 14124.]
Epoch: 14124, Training Loss: 0.01174
Epoch: 14124, Training Loss: 0.01447
Epoch: 14124, Training Loss: 0.01113
Epoch: 14125, Training Loss: 0.01267
Epoch: 14125, Training Loss: 0.01174
Epoch: 14125, Training Loss: 0.01447
Epoch: 14125, Training Loss: 0.01113
Epoch: 14126, Training Loss: 0.01267
Epoch: 14126, Training Loss: 0.01174
Epoch: 14126, Training Loss: 0.01447
Epoch: 14126, Training Loss: 0.01113
Epoch: 14127, Training Loss: 0.01267
Epoch: 14127, Training Loss: 0.01174
Epoch: 14127, Training Loss: 0.01447
Epoch: 14127, Training Loss: 0.01113
Epoch: 14128, Training Loss: 0.01267
Epoch: 14128, Training Loss: 0.01174
Epoch: 14128, Training Loss: 0.01447
Epoch: 14128, Training Loss: 0.01113
Epoch: 14129, Training Loss: 0.01267
Epoch: 14129, Training Loss: 0.01174
Epoch: 14129, Training Loss: 0.01447
Epoch: 14129, Training Loss: 0.01113
Epoch: 14130, Training Loss: 0.01267
Epoch: 14130, Training Loss: 0.01174
Epoch: 14130, Training Loss: 0.01447
Epoch: 14130, Training Loss: 0.01113
Epoch: 14131, Training Loss: 0.01267
Epoch: 14131, Training Loss: 0.01174
Epoch: 14131, Training Loss: 0.01447
Epoch: 14131, Training Loss: 0.01113
Epoch: 14132, Training Loss: 0.01267
Epoch: 14132, Training Loss: 0.01174
Epoch: 14132, Training Loss: 0.01447
Epoch: 14132, Training Loss: 0.01113
Epoch: 14133, Training Loss: 0.01267
Epoch: 14133, Training Loss: 0.01173
Epoch: 14133, Training Loss: 0.01447
Epoch: 14133, Training Loss: 0.01113
Epoch: 14134, Training Loss: 0.01267
Epoch: 14134, Training Loss: 0.01173
Epoch: 14134, Training Loss: 0.01447
Epoch: 14134, Training Loss: 0.01113
Epoch: 14135, Training Loss: 0.01267
Epoch: 14135, Training Loss: 0.01173
Epoch: 14135, Training Loss: 0.01447
Epoch: 14135, Training Loss: 0.01113
Epoch: 14136, Training Loss: 0.01267
Epoch: 14136, Training Loss: 0.01173
Epoch: 14136, Training Loss: 0.01446
Epoch: 14136, Training Loss: 0.01113
Epoch: 14137, Training Loss: 0.01267
Epoch: 14137, Training Loss: 0.01173
Epoch: 14137, Training Loss: 0.01446
Epoch: 14137, Training Loss: 0.01113
Epoch: 14138, Training Loss: 0.01267
Epoch: 14138, Training Loss: 0.01173
Epoch: 14138, Training Loss: 0.01446
Epoch: 14138, Training Loss: 0.01113
Epoch: 14139, Training Loss: 0.01267
Epoch: 14139, Training Loss: 0.01173
Epoch: 14139, Training Loss: 0.01446
Epoch: 14139, Training Loss: 0.01113
Epoch: 14140, Training Loss: 0.01267
Epoch: 14140, Training Loss: 0.01173
Epoch: 14140, Training Loss: 0.01446
Epoch: 14140, Training Loss: 0.01113
Epoch: 14141, Training Loss: 0.01267
Epoch: 14141, Training Loss: 0.01173
Epoch: 14141, Training Loss: 0.01446
Epoch: 14141, Training Loss: 0.01113
Epoch: 14142, Training Loss: 0.01267
Epoch: 14142, Training Loss: 0.01173
Epoch: 14142, Training Loss: 0.01446
Epoch: 14142, Training Loss: 0.01113
Epoch: 14143, Training Loss: 0.01267
Epoch: 14143, Training Loss: 0.01173
Epoch: 14143, Training Loss: 0.01446
Epoch: 14143, Training Loss: 0.01113
Epoch: 14144, Training Loss: 0.01266
Epoch: 14144, Training Loss: 0.01173
Epoch: 14144, Training Loss: 0.01446
Epoch: 14144, Training Loss: 0.01113
Epoch: 14145, Training Loss: 0.01266
Epoch: 14145, Training Loss: 0.01173
Epoch: 14145, Training Loss: 0.01446
Epoch: 14145, Training Loss: 0.01113
Epoch: 14146, Training Loss: 0.01266
Epoch: 14146, Training Loss: 0.01173
Epoch: 14146, Training Loss: 0.01446
Epoch: 14146, Training Loss: 0.01112
Epoch: 14147, Training Loss: 0.01266
Epoch: 14147, Training Loss: 0.01173
Epoch: 14147, Training Loss: 0.01446
Epoch: 14147, Training Loss: 0.01112
Epoch: 14148, Training Loss: 0.01266
Epoch: 14148, Training Loss: 0.01173
Epoch: 14148, Training Loss: 0.01446
Epoch: 14148, Training Loss: 0.01112
Epoch: 14149, Training Loss: 0.01266
Epoch: 14149, Training Loss: 0.01173
Epoch: 14149, Training Loss: 0.01446
Epoch: 14149, Training Loss: 0.01112
Epoch: 14150, Training Loss: 0.01266
Epoch: 14150, Training Loss: 0.01173
Epoch: 14150, Training Loss: 0.01446
Epoch: 14150, Training Loss: 0.01112
Epoch: 14151, Training Loss: 0.01266
Epoch: 14151, Training Loss: 0.01173
Epoch: 14151, Training Loss: 0.01446
Epoch: 14151, Training Loss: 0.01112
Epoch: 14152, Training Loss: 0.01266
Epoch: 14152, Training Loss: 0.01173
Epoch: 14152, Training Loss: 0.01446
Epoch: 14152, Training Loss: 0.01112
Epoch: 14153, Training Loss: 0.01266
Epoch: 14153, Training Loss: 0.01173
Epoch: 14153, Training Loss: 0.01445
Epoch: 14153, Training Loss: 0.01112
Epoch: 14154, Training Loss: 0.01266
Epoch: 14154, Training Loss: 0.01173
Epoch: 14154, Training Loss: 0.01445
Epoch: 14154, Training Loss: 0.01112
Epoch: 14155, Training Loss: 0.01266
Epoch: 14155, Training Loss: 0.01173
Epoch: 14155, Training Loss: 0.01445
Epoch: 14155, Training Loss: 0.01112
Epoch: 14156, Training Loss: 0.01266
Epoch: 14156, Training Loss: 0.01172
Epoch: 14156, Training Loss: 0.01445
Epoch: 14156, Training Loss: 0.01112
Epoch: 14157, Training Loss: 0.01266
Epoch: 14157, Training Loss: 0.01172
Epoch: 14157, Training Loss: 0.01445
Epoch: 14157, Training Loss: 0.01112
Epoch: 14158, Training Loss: 0.01266
Epoch: 14158, Training Loss: 0.01172
Epoch: 14158, Training Loss: 0.01445
Epoch: 14158, Training Loss: 0.01112
Epoch: 14159, Training Loss: 0.01266
Epoch: 14159, Training Loss: 0.01172
Epoch: 14159, Training Loss: 0.01445
Epoch: 14159, Training Loss: 0.01112
Epoch: 14160, Training Loss: 0.01266
Epoch: 14160, Training Loss: 0.01172
Epoch: 14160, Training Loss: 0.01445
Epoch: 14160, Training Loss: 0.01112
Epoch: 14161, Training Loss: 0.01266
Epoch: 14161, Training Loss: 0.01172
Epoch: 14161, Training Loss: 0.01445
Epoch: 14161, Training Loss: 0.01112
Epoch: 14162, Training Loss: 0.01266
Epoch: 14162, Training Loss: 0.01172
Epoch: 14162, Training Loss: 0.01445
Epoch: 14162, Training Loss: 0.01112
Epoch: 14163, Training Loss: 0.01266
Epoch: 14163, Training Loss: 0.01172
Epoch: 14163, Training Loss: 0.01445
Epoch: 14163, Training Loss: 0.01112
Epoch: 14164, Training Loss: 0.01265
Epoch: 14164, Training Loss: 0.01172
Epoch: 14164, Training Loss: 0.01445
Epoch: 14164, Training Loss: 0.01112
Epoch: 14165, Training Loss: 0.01265
Epoch: 14165, Training Loss: 0.01172
Epoch: 14165, Training Loss: 0.01445
Epoch: 14165, Training Loss: 0.01112
Epoch: 14166, Training Loss: 0.01265
Epoch: 14166, Training Loss: 0.01172
Epoch: 14166, Training Loss: 0.01445
Epoch: 14166, Training Loss: 0.01112
Epoch: 14167, Training Loss: 0.01265
Epoch: 14167, Training Loss: 0.01172
Epoch: 14167, Training Loss: 0.01445
Epoch: 14167, Training Loss: 0.01112
Epoch: 14168, Training Loss: 0.01265
Epoch: 14168, Training Loss: 0.01172
Epoch: 14168, Training Loss: 0.01445
Epoch: 14168, Training Loss: 0.01112
Epoch: 14169, Training Loss: 0.01265
Epoch: 14169, Training Loss: 0.01172
Epoch: 14169, Training Loss: 0.01445
Epoch: 14169, Training Loss: 0.01112
Epoch: 14170, Training Loss: 0.01265
Epoch: 14170, Training Loss: 0.01172
Epoch: 14170, Training Loss: 0.01445
Epoch: 14170, Training Loss: 0.01111
Epoch: 14171, Training Loss: 0.01265
Epoch: 14171, Training Loss: 0.01172
Epoch: 14171, Training Loss: 0.01444
Epoch: 14171, Training Loss: 0.01111
Epoch: 14172, Training Loss: 0.01265
Epoch: 14172, Training Loss: 0.01172
Epoch: 14172, Training Loss: 0.01444
Epoch: 14172, Training Loss: 0.01111
Epoch: 14173, Training Loss: 0.01265
Epoch: 14173, Training Loss: 0.01172
Epoch: 14173, Training Loss: 0.01444
Epoch: 14173, Training Loss: 0.01111
Epoch: 14174, Training Loss: 0.01265
Epoch: 14174, Training Loss: 0.01172
Epoch: 14174, Training Loss: 0.01444
Epoch: 14174, Training Loss: 0.01111
Epoch: 14175, Training Loss: 0.01265
Epoch: 14175, Training Loss: 0.01172
Epoch: 14175, Training Loss: 0.01444
Epoch: 14175, Training Loss: 0.01111
Epoch: 14176, Training Loss: 0.01265
Epoch: 14176, Training Loss: 0.01172
Epoch: 14176, Training Loss: 0.01444
Epoch: 14176, Training Loss: 0.01111
Epoch: 14177, Training Loss: 0.01265
Epoch: 14177, Training Loss: 0.01172
Epoch: 14177, Training Loss: 0.01444
Epoch: 14177, Training Loss: 0.01111
Epoch: 14178, Training Loss: 0.01265
Epoch: 14178, Training Loss: 0.01171
Epoch: 14178, Training Loss: 0.01444
Epoch: 14178, Training Loss: 0.01111
Epoch: 14179, Training Loss: 0.01265
Epoch: 14179, Training Loss: 0.01171
Epoch: 14179, Training Loss: 0.01444
Epoch: 14179, Training Loss: 0.01111
Epoch: 14180, Training Loss: 0.01265
Epoch: 14180, Training Loss: 0.01171
Epoch: 14180, Training Loss: 0.01444
Epoch: 14180, Training Loss: 0.01111
Epoch: 14181, Training Loss: 0.01265
Epoch: 14181, Training Loss: 0.01171
Epoch: 14181, Training Loss: 0.01444
Epoch: 14181, Training Loss: 0.01111
Epoch: 14182, Training Loss: 0.01265
Epoch: 14182, Training Loss: 0.01171
Epoch: 14182, Training Loss: 0.01444
Epoch: 14182, Training Loss: 0.01111
Epoch: 14183, Training Loss: 0.01265
Epoch: 14183, Training Loss: 0.01171
Epoch: 14183, Training Loss: 0.01444
Epoch: 14183, Training Loss: 0.01111
Epoch: 14184, Training Loss: 0.01265
Epoch: 14184, Training Loss: 0.01171
Epoch: 14184, Training Loss: 0.01444
Epoch: 14184, Training Loss: 0.01111
Epoch: 14185, Training Loss: 0.01264
Epoch: 14185, Training Loss: 0.01171
Epoch: 14185, Training Loss: 0.01444
Epoch: 14185, Training Loss: 0.01111
Epoch: 14186, Training Loss: 0.01264
Epoch: 14186, Training Loss: 0.01171
Epoch: 14186, Training Loss: 0.01444
Epoch: 14186, Training Loss: 0.01111
Epoch: 14187, Training Loss: 0.01264
Epoch: 14187, Training Loss: 0.01171
Epoch: 14187, Training Loss: 0.01444
Epoch: 14187, Training Loss: 0.01111
Epoch: 14188, Training Loss: 0.01264
Epoch: 14188, Training Loss: 0.01171
Epoch: 14188, Training Loss: 0.01444
Epoch: 14188, Training Loss: 0.01111
Epoch: 14189, Training Loss: 0.01264
Epoch: 14189, Training Loss: 0.01171
Epoch: 14189, Training Loss: 0.01443
Epoch: 14189, Training Loss: 0.01111
Epoch: 14190, Training Loss: 0.01264
Epoch: 14190, Training Loss: 0.01171
Epoch: 14190, Training Loss: 0.01443
Epoch: 14190, Training Loss: 0.01111
Epoch: 14191, Training Loss: 0.01264
Epoch: 14191, Training Loss: 0.01171
Epoch: 14191, Training Loss: 0.01443
Epoch: 14191, Training Loss: 0.01111
Epoch: 14192, Training Loss: 0.01264
Epoch: 14192, Training Loss: 0.01171
Epoch: 14192, Training Loss: 0.01443
Epoch: 14192, Training Loss: 0.01111
Epoch: 14193, Training Loss: 0.01264
Epoch: 14193, Training Loss: 0.01171
Epoch: 14193, Training Loss: 0.01443
Epoch: 14193, Training Loss: 0.01110
Epoch: 14194, Training Loss: 0.01264
Epoch: 14194, Training Loss: 0.01171
Epoch: 14194, Training Loss: 0.01443
Epoch: 14194, Training Loss: 0.01110
Epoch: 14195, Training Loss: 0.01264
Epoch: 14195, Training Loss: 0.01171
Epoch: 14195, Training Loss: 0.01443
Epoch: 14195, Training Loss: 0.01110
Epoch: 14196, Training Loss: 0.01264
Epoch: 14196, Training Loss: 0.01171
Epoch: 14196, Training Loss: 0.01443
Epoch: 14196, Training Loss: 0.01110
Epoch: 14197, Training Loss: 0.01264
Epoch: 14197, Training Loss: 0.01171
Epoch: 14197, Training Loss: 0.01443
Epoch: 14197, Training Loss: 0.01110
Epoch: 14198, Training Loss: 0.01264
Epoch: 14198, Training Loss: 0.01171
Epoch: 14198, Training Loss: 0.01443
Epoch: 14198, Training Loss: 0.01110
Epoch: 14199, Training Loss: 0.01264
Epoch: 14199, Training Loss: 0.01171
Epoch: 14199, Training Loss: 0.01443
Epoch: 14199, Training Loss: 0.01110
Epoch: 14200, Training Loss: 0.01264
Epoch: 14200, Training Loss: 0.01171
Epoch: 14200, Training Loss: 0.01443
Epoch: 14200, Training Loss: 0.01110
Epoch: 14201, Training Loss: 0.01264
Epoch: 14201, Training Loss: 0.01170
Epoch: 14201, Training Loss: 0.01443
Epoch: 14201, Training Loss: 0.01110
Epoch: 14202, Training Loss: 0.01264
Epoch: 14202, Training Loss: 0.01170
Epoch: 14202, Training Loss: 0.01443
Epoch: 14202, Training Loss: 0.01110
Epoch: 14203, Training Loss: 0.01264
Epoch: 14203, Training Loss: 0.01170
Epoch: 14203, Training Loss: 0.01443
Epoch: 14203, Training Loss: 0.01110
Epoch: 14204, Training Loss: 0.01264
Epoch: 14204, Training Loss: 0.01170
Epoch: 14204, Training Loss: 0.01443
Epoch: 14204, Training Loss: 0.01110
Epoch: 14205, Training Loss: 0.01263
Epoch: 14205, Training Loss: 0.01170
Epoch: 14205, Training Loss: 0.01443
Epoch: 14205, Training Loss: 0.01110
Epoch: 14206, Training Loss: 0.01263
Epoch: 14206, Training Loss: 0.01170
Epoch: 14206, Training Loss: 0.01443
Epoch: 14206, Training Loss: 0.01110
Epoch: 14207, Training Loss: 0.01263
Epoch: 14207, Training Loss: 0.01170
Epoch: 14207, Training Loss: 0.01442
Epoch: 14207, Training Loss: 0.01110
Epoch: 14208, Training Loss: 0.01263
Epoch: 14208, Training Loss: 0.01170
Epoch: 14208, Training Loss: 0.01442
Epoch: 14208, Training Loss: 0.01110
Epoch: 14209, Training Loss: 0.01263
Epoch: 14209, Training Loss: 0.01170
Epoch: 14209, Training Loss: 0.01442
Epoch: 14209, Training Loss: 0.01110
Epoch: 14210, Training Loss: 0.01263
Epoch: 14210, Training Loss: 0.01170
Epoch: 14210, Training Loss: 0.01442
Epoch: 14210, Training Loss: 0.01110
Epoch: 14211, Training Loss: 0.01263
Epoch: 14211, Training Loss: 0.01170
Epoch: 14211, Training Loss: 0.01442
Epoch: 14211, Training Loss: 0.01110
Epoch: 14212, Training Loss: 0.01263
Epoch: 14212, Training Loss: 0.01170
Epoch: 14212, Training Loss: 0.01442
Epoch: 14212, Training Loss: 0.01110
Epoch: 14213, Training Loss: 0.01263
Epoch: 14213, Training Loss: 0.01170
Epoch: 14213, Training Loss: 0.01442
Epoch: 14213, Training Loss: 0.01110
Epoch: 14214, Training Loss: 0.01263
Epoch: 14214, Training Loss: 0.01170
Epoch: 14214, Training Loss: 0.01442
Epoch: 14214, Training Loss: 0.01110
Epoch: 14215, Training Loss: 0.01263
Epoch: 14215, Training Loss: 0.01170
Epoch: 14215, Training Loss: 0.01442
Epoch: 14215, Training Loss: 0.01110
Epoch: 14216, Training Loss: 0.01263
Epoch: 14216, Training Loss: 0.01170
Epoch: 14216, Training Loss: 0.01442
Epoch: 14216, Training Loss: 0.01110
Epoch: 14217, Training Loss: 0.01263
Epoch: 14217, Training Loss: 0.01170
Epoch: 14217, Training Loss: 0.01442
Epoch: 14217, Training Loss: 0.01109
Epoch: 14218, Training Loss: 0.01263
Epoch: 14218, Training Loss: 0.01170
Epoch: 14218, Training Loss: 0.01442
Epoch: 14218, Training Loss: 0.01109
Epoch: 14219, Training Loss: 0.01263
Epoch: 14219, Training Loss: 0.01170
Epoch: 14219, Training Loss: 0.01442
Epoch: 14219, Training Loss: 0.01109
Epoch: 14220, Training Loss: 0.01263
Epoch: 14220, Training Loss: 0.01170
Epoch: 14220, Training Loss: 0.01442
Epoch: 14220, Training Loss: 0.01109
Epoch: 14221, Training Loss: 0.01263
Epoch: 14221, Training Loss: 0.01170
Epoch: 14221, Training Loss: 0.01442
Epoch: 14221, Training Loss: 0.01109
Epoch: 14222, Training Loss: 0.01263
Epoch: 14222, Training Loss: 0.01170
Epoch: 14222, Training Loss: 0.01442
Epoch: 14222, Training Loss: 0.01109
Epoch: 14223, Training Loss: 0.01263
Epoch: 14223, Training Loss: 0.01169
Epoch: 14223, Training Loss: 0.01442
Epoch: 14223, Training Loss: 0.01109
Epoch: 14224, Training Loss: 0.01263
Epoch: 14224, Training Loss: 0.01169
Epoch: 14224, Training Loss: 0.01442
Epoch: 14224, Training Loss: 0.01109
Epoch: 14225, Training Loss: 0.01262
Epoch: 14225, Training Loss: 0.01169
Epoch: 14225, Training Loss: 0.01441
Epoch: 14225, Training Loss: 0.01109
Epoch: 14226, Training Loss: 0.01262
Epoch: 14226, Training Loss: 0.01169
Epoch: 14226, Training Loss: 0.01441
Epoch: 14226, Training Loss: 0.01109
Epoch: 14227, Training Loss: 0.01262
Epoch: 14227, Training Loss: 0.01169
Epoch: 14227, Training Loss: 0.01441
Epoch: 14227, Training Loss: 0.01109
Epoch: 14228, Training Loss: 0.01262
Epoch: 14228, Training Loss: 0.01169
Epoch: 14228, Training Loss: 0.01441
Epoch: 14228, Training Loss: 0.01109
Epoch: 14229, Training Loss: 0.01262
Epoch: 14229, Training Loss: 0.01169
Epoch: 14229, Training Loss: 0.01441
Epoch: 14229, Training Loss: 0.01109
Epoch: 14230, Training Loss: 0.01262
Epoch: 14230, Training Loss: 0.01169
Epoch: 14230, Training Loss: 0.01441
Epoch: 14230, Training Loss: 0.01109
Epoch: 14231, Training Loss: 0.01262
Epoch: 14231, Training Loss: 0.01169
Epoch: 14231, Training Loss: 0.01441
Epoch: 14231, Training Loss: 0.01109
Epoch: 14232, Training Loss: 0.01262
Epoch: 14232, Training Loss: 0.01169
Epoch: 14232, Training Loss: 0.01441
Epoch: 14232, Training Loss: 0.01109
Epoch: 14233, Training Loss: 0.01262
Epoch: 14233, Training Loss: 0.01169
Epoch: 14233, Training Loss: 0.01441
Epoch: 14233, Training Loss: 0.01109
Epoch: 14234, Training Loss: 0.01262
Epoch: 14234, Training Loss: 0.01169
Epoch: 14234, Training Loss: 0.01441
Epoch: 14234, Training Loss: 0.01109
Epoch: 14235, Training Loss: 0.01262
Epoch: 14235, Training Loss: 0.01169
Epoch: 14235, Training Loss: 0.01441
Epoch: 14235, Training Loss: 0.01109
Epoch: 14236, Training Loss: 0.01262
Epoch: 14236, Training Loss: 0.01169
Epoch: 14236, Training Loss: 0.01441
Epoch: 14236, Training Loss: 0.01109
Epoch: 14237, Training Loss: 0.01262
Epoch: 14237, Training Loss: 0.01169
Epoch: 14237, Training Loss: 0.01441
Epoch: 14237, Training Loss: 0.01109
Epoch: 14238, Training Loss: 0.01262
Epoch: 14238, Training Loss: 0.01169
Epoch: 14238, Training Loss: 0.01441
Epoch: 14238, Training Loss: 0.01109
Epoch: 14239, Training Loss: 0.01262
Epoch: 14239, Training Loss: 0.01169
Epoch: 14239, Training Loss: 0.01441
Epoch: 14239, Training Loss: 0.01109
Epoch: 14240, Training Loss: 0.01262
Epoch: 14240, Training Loss: 0.01169
Epoch: 14240, Training Loss: 0.01441
Epoch: 14240, Training Loss: 0.01109
Epoch: 14241, Training Loss: 0.01262
Epoch: 14241, Training Loss: 0.01169
Epoch: 14241, Training Loss: 0.01441
Epoch: 14241, Training Loss: 0.01108
Epoch: 14242, Training Loss: 0.01262
Epoch: 14242, Training Loss: 0.01169
Epoch: 14242, Training Loss: 0.01440
Epoch: 14242, Training Loss: 0.01108
Epoch: 14243, Training Loss: 0.01262
Epoch: 14243, Training Loss: 0.01169
Epoch: 14243, Training Loss: 0.01440
Epoch: 14243, Training Loss: 0.01108
Epoch: 14244, Training Loss: 0.01262
Epoch: 14244, Training Loss: 0.01169
Epoch: 14244, Training Loss: 0.01440
Epoch: 14244, Training Loss: 0.01108
Epoch: 14245, Training Loss: 0.01262
Epoch: 14245, Training Loss: 0.01169
Epoch: 14245, Training Loss: 0.01440
Epoch: 14245, Training Loss: 0.01108
Epoch: 14246, Training Loss: 0.01261
Epoch: 14246, Training Loss: 0.01168
Epoch: 14246, Training Loss: 0.01440
Epoch: 14246, Training Loss: 0.01108
Epoch: 14247, Training Loss: 0.01261
Epoch: 14247, Training Loss: 0.01168
Epoch: 14247, Training Loss: 0.01440
Epoch: 14247, Training Loss: 0.01108
Epoch: 14248, Training Loss: 0.01261
Epoch: 14248, Training Loss: 0.01168
Epoch: 14248, Training Loss: 0.01440
Epoch: 14248, Training Loss: 0.01108
Epoch: 14249, Training Loss: 0.01261
Epoch: 14249, Training Loss: 0.01168
Epoch: 14249, Training Loss: 0.01440
Epoch: 14249, Training Loss: 0.01108
Epoch: 14250, Training Loss: 0.01261
Epoch: 14250, Training Loss: 0.01168
Epoch: 14250, Training Loss: 0.01440
Epoch: 14250, Training Loss: 0.01108
Epoch: 14251, Training Loss: 0.01261
Epoch: 14251, Training Loss: 0.01168
Epoch: 14251, Training Loss: 0.01440
Epoch: 14251, Training Loss: 0.01108
Epoch: 14252, Training Loss: 0.01261
Epoch: 14252, Training Loss: 0.01168
Epoch: 14252, Training Loss: 0.01440
Epoch: 14252, Training Loss: 0.01108
Epoch: 14253, Training Loss: 0.01261
Epoch: 14253, Training Loss: 0.01168
Epoch: 14253, Training Loss: 0.01440
Epoch: 14253, Training Loss: 0.01108
Epoch: 14254, Training Loss: 0.01261
Epoch: 14254, Training Loss: 0.01168
Epoch: 14254, Training Loss: 0.01440
Epoch: 14254, Training Loss: 0.01108
Epoch: 14255, Training Loss: 0.01261
Epoch: 14255, Training Loss: 0.01168
Epoch: 14255, Training Loss: 0.01440
Epoch: 14255, Training Loss: 0.01108
Epoch: 14256, Training Loss: 0.01261
Epoch: 14256, Training Loss: 0.01168
Epoch: 14256, Training Loss: 0.01440
Epoch: 14256, Training Loss: 0.01108
Epoch: 14257, Training Loss: 0.01261
Epoch: 14257, Training Loss: 0.01168
Epoch: 14257, Training Loss: 0.01440
Epoch: 14257, Training Loss: 0.01108
Epoch: 14258, Training Loss: 0.01261
Epoch: 14258, Training Loss: 0.01168
Epoch: 14258, Training Loss: 0.01440
Epoch: 14258, Training Loss: 0.01108
Epoch: 14259, Training Loss: 0.01261
Epoch: 14259, Training Loss: 0.01168
Epoch: 14259, Training Loss: 0.01440
Epoch: 14259, Training Loss: 0.01108
Epoch: 14260, Training Loss: 0.01261
Epoch: 14260, Training Loss: 0.01168
Epoch: 14260, Training Loss: 0.01439
Epoch: 14260, Training Loss: 0.01108
Epoch: 14261, Training Loss: 0.01261
Epoch: 14261, Training Loss: 0.01168
Epoch: 14261, Training Loss: 0.01439
Epoch: 14261, Training Loss: 0.01108
Epoch: 14262, Training Loss: 0.01261
Epoch: 14262, Training Loss: 0.01168
Epoch: 14262, Training Loss: 0.01439
Epoch: 14262, Training Loss: 0.01108
Epoch: 14263, Training Loss: 0.01261
Epoch: 14263, Training Loss: 0.01168
Epoch: 14263, Training Loss: 0.01439
Epoch: 14263, Training Loss: 0.01108
Epoch: 14264, Training Loss: 0.01261
Epoch: 14264, Training Loss: 0.01168
Epoch: 14264, Training Loss: 0.01439
Epoch: 14264, Training Loss: 0.01107
Epoch: 14265, Training Loss: 0.01261
Epoch: 14265, Training Loss: 0.01168
Epoch: 14265, Training Loss: 0.01439
Epoch: 14265, Training Loss: 0.01107
Epoch: 14266, Training Loss: 0.01260
Epoch: 14266, Training Loss: 0.01168
Epoch: 14266, Training Loss: 0.01439
Epoch: 14266, Training Loss: 0.01107
Epoch: 14267, Training Loss: 0.01260
Epoch: 14267, Training Loss: 0.01168
Epoch: 14267, Training Loss: 0.01439
Epoch: 14267, Training Loss: 0.01107
Epoch: 14268, Training Loss: 0.01260
Epoch: 14268, Training Loss: 0.01168
Epoch: 14268, Training Loss: 0.01439
Epoch: 14268, Training Loss: 0.01107
Epoch: 14269, Training Loss: 0.01260
Epoch: 14269, Training Loss: 0.01167
Epoch: 14269, Training Loss: 0.01439
Epoch: 14269, Training Loss: 0.01107
Epoch: 14270, Training Loss: 0.01260
Epoch: 14270, Training Loss: 0.01167
Epoch: 14270, Training Loss: 0.01439
Epoch: 14270, Training Loss: 0.01107
Epoch: 14271, Training Loss: 0.01260
Epoch: 14271, Training Loss: 0.01167
Epoch: 14271, Training Loss: 0.01439
Epoch: 14271, Training Loss: 0.01107
Epoch: 14272, Training Loss: 0.01260
Epoch: 14272, Training Loss: 0.01167
Epoch: 14272, Training Loss: 0.01439
Epoch: 14272, Training Loss: 0.01107
Epoch: 14273, Training Loss: 0.01260
Epoch: 14273, Training Loss: 0.01167
Epoch: 14273, Training Loss: 0.01439
Epoch: 14273, Training Loss: 0.01107
Epoch: 14274, Training Loss: 0.01260
Epoch: 14274, Training Loss: 0.01167
Epoch: 14274, Training Loss: 0.01439
Epoch: 14274, Training Loss: 0.01107
Epoch: 14275, Training Loss: 0.01260
Epoch: 14275, Training Loss: 0.01167
Epoch: 14275, Training Loss: 0.01439
Epoch: 14275, Training Loss: 0.01107
Epoch: 14276, Training Loss: 0.01260
Epoch: 14276, Training Loss: 0.01167
Epoch: 14276, Training Loss: 0.01439
Epoch: 14276, Training Loss: 0.01107
Epoch: 14277, Training Loss: 0.01260
Epoch: 14277, Training Loss: 0.01167
Epoch: 14277, Training Loss: 0.01439
Epoch: 14277, Training Loss: 0.01107
Epoch: 14278, Training Loss: 0.01260
Epoch: 14278, Training Loss: 0.01167
Epoch: 14278, Training Loss: 0.01438
Epoch: 14278, Training Loss: 0.01107
Epoch: 14279, Training Loss: 0.01260
Epoch: 14279, Training Loss: 0.01167
Epoch: 14279, Training Loss: 0.01438
Epoch: 14279, Training Loss: 0.01107
Epoch: 14280, Training Loss: 0.01260
Epoch: 14280, Training Loss: 0.01167
Epoch: 14280, Training Loss: 0.01438
Epoch: 14280, Training Loss: 0.01107
Epoch: 14281, Training Loss: 0.01260
Epoch: 14281, Training Loss: 0.01167
Epoch: 14281, Training Loss: 0.01438
Epoch: 14281, Training Loss: 0.01107
Epoch: 14282, Training Loss: 0.01260
Epoch: 14282, Training Loss: 0.01167
Epoch: 14282, Training Loss: 0.01438
Epoch: 14282, Training Loss: 0.01107
Epoch: 14283, Training Loss: 0.01260
Epoch: 14283, Training Loss: 0.01167
Epoch: 14283, Training Loss: 0.01438
Epoch: 14283, Training Loss: 0.01107
Epoch: 14284, Training Loss: 0.01260
Epoch: 14284, Training Loss: 0.01167
Epoch: 14284, Training Loss: 0.01438
Epoch: 14284, Training Loss: 0.01107
Epoch: 14285, Training Loss: 0.01260
Epoch: 14285, Training Loss: 0.01167
Epoch: 14285, Training Loss: 0.01438
Epoch: 14285, Training Loss: 0.01107
Epoch: 14286, Training Loss: 0.01260
Epoch: 14286, Training Loss: 0.01167
Epoch: 14286, Training Loss: 0.01438
Epoch: 14286, Training Loss: 0.01107
Epoch: 14287, Training Loss: 0.01259
Epoch: 14287, Training Loss: 0.01167
Epoch: 14287, Training Loss: 0.01438
Epoch: 14287, Training Loss: 0.01107
Epoch: 14288, Training Loss: 0.01259
Epoch: 14288, Training Loss: 0.01167
Epoch: 14288, Training Loss: 0.01438
Epoch: 14288, Training Loss: 0.01106
Epoch: 14289, Training Loss: 0.01259
Epoch: 14289, Training Loss: 0.01167
Epoch: 14289, Training Loss: 0.01438
Epoch: 14289, Training Loss: 0.01106
Epoch: 14290, Training Loss: 0.01259
Epoch: 14290, Training Loss: 0.01167
Epoch: 14290, Training Loss: 0.01438
Epoch: 14290, Training Loss: 0.01106
Epoch: 14291, Training Loss: 0.01259
Epoch: 14291, Training Loss: 0.01166
Epoch: 14291, Training Loss: 0.01438
Epoch: 14291, Training Loss: 0.01106
Epoch: 14292, Training Loss: 0.01259
Epoch: 14292, Training Loss: 0.01166
Epoch: 14292, Training Loss: 0.01438
Epoch: 14292, Training Loss: 0.01106
Epoch: 14293, Training Loss: 0.01259
Epoch: 14293, Training Loss: 0.01166
Epoch: 14293, Training Loss: 0.01438
Epoch: 14293, Training Loss: 0.01106
Epoch: 14294, Training Loss: 0.01259
Epoch: 14294, Training Loss: 0.01166
Epoch: 14294, Training Loss: 0.01438
Epoch: 14294, Training Loss: 0.01106
Epoch: 14295, Training Loss: 0.01259
Epoch: 14295, Training Loss: 0.01166
Epoch: 14295, Training Loss: 0.01438
Epoch: 14295, Training Loss: 0.01106
Epoch: 14296, Training Loss: 0.01259
Epoch: 14296, Training Loss: 0.01166
Epoch: 14296, Training Loss: 0.01437
Epoch: 14296, Training Loss: 0.01106
Epoch: 14297, Training Loss: 0.01259
Epoch: 14297, Training Loss: 0.01166
Epoch: 14297, Training Loss: 0.01437
Epoch: 14297, Training Loss: 0.01106
Epoch: 14298, Training Loss: 0.01259
Epoch: 14298, Training Loss: 0.01166
Epoch: 14298, Training Loss: 0.01437
Epoch: 14298, Training Loss: 0.01106
Epoch: 14299, Training Loss: 0.01259
Epoch: 14299, Training Loss: 0.01166
Epoch: 14299, Training Loss: 0.01437
Epoch: 14299, Training Loss: 0.01106
Epoch: 14300, Training Loss: 0.01259
Epoch: 14300, Training Loss: 0.01166
Epoch: 14300, Training Loss: 0.01437
Epoch: 14300, Training Loss: 0.01106
Epoch: 14301, Training Loss: 0.01259
Epoch: 14301, Training Loss: 0.01166
Epoch: 14301, Training Loss: 0.01437
Epoch: 14301, Training Loss: 0.01106
Epoch: 14302, Training Loss: 0.01259
Epoch: 14302, Training Loss: 0.01166
Epoch: 14302, Training Loss: 0.01437
Epoch: 14302, Training Loss: 0.01106
Epoch: 14303, Training Loss: 0.01259
Epoch: 14303, Training Loss: 0.01166
Epoch: 14303, Training Loss: 0.01437
Epoch: 14303, Training Loss: 0.01106
Epoch: 14304, Training Loss: 0.01259
Epoch: 14304, Training Loss: 0.01166
Epoch: 14304, Training Loss: 0.01437
Epoch: 14304, Training Loss: 0.01106
Epoch: 14305, Training Loss: 0.01259
Epoch: 14305, Training Loss: 0.01166
Epoch: 14305, Training Loss: 0.01437
Epoch: 14305, Training Loss: 0.01106
Epoch: 14306, Training Loss: 0.01259
Epoch: 14306, Training Loss: 0.01166
Epoch: 14306, Training Loss: 0.01437
Epoch: 14306, Training Loss: 0.01106
Epoch: 14307, Training Loss: 0.01258
Epoch: 14307, Training Loss: 0.01166
Epoch: 14307, Training Loss: 0.01437
Epoch: 14307, Training Loss: 0.01106
Epoch: 14308, Training Loss: 0.01258
Epoch: 14308, Training Loss: 0.01166
Epoch: 14308, Training Loss: 0.01437
Epoch: 14308, Training Loss: 0.01106
Epoch: 14309, Training Loss: 0.01258
Epoch: 14309, Training Loss: 0.01166
Epoch: 14309, Training Loss: 0.01437
Epoch: 14309, Training Loss: 0.01106
Epoch: 14310, Training Loss: 0.01258
Epoch: 14310, Training Loss: 0.01166
Epoch: 14310, Training Loss: 0.01437
Epoch: 14310, Training Loss: 0.01106
Epoch: 14311, Training Loss: 0.01258
Epoch: 14311, Training Loss: 0.01166
Epoch: 14311, Training Loss: 0.01437
Epoch: 14311, Training Loss: 0.01106
Epoch: 14312, Training Loss: 0.01258
Epoch: 14312, Training Loss: 0.01166
Epoch: 14312, Training Loss: 0.01437
Epoch: 14312, Training Loss: 0.01105
Epoch: 14313, Training Loss: 0.01258
Epoch: 14313, Training Loss: 0.01166
Epoch: 14313, Training Loss: 0.01437
Epoch: 14313, Training Loss: 0.01105
... (per-sample losses for epochs 14314-14555 omitted: the four losses decrease slowly and monotonically, from roughly 0.01258/0.01166/0.01437/0.01105 to roughly 0.01247/0.01155/0.01423/0.01095) ...
Epoch: 14556, Training Loss: 0.01247
Epoch: 14556, Training Loss: 0.01155
Epoch: 14556, Training Loss: 0.01423
Epoch: 14556, Training Loss: 0.01095
Epoch: 14557, Training Loss: 0.01247
Epoch: 14557, Training Loss: 0.01155
Epoch: 14557, Training Loss: 0.01423
Epoch: 14557, Training Loss: 0.01095
Epoch: 14558, Training Loss: 0.01246
Epoch: 14558, Training Loss: 0.01155
Epoch: 14558, Training Loss: 0.01423
Epoch: 14558, Training Loss: 0.01095
Epoch: 14559, Training Loss: 0.01246
Epoch: 14559, Training Loss: 0.01155
Epoch: 14559, Training Loss: 0.01423
Epoch: 14559, Training Loss: 0.01095
Epoch: 14560, Training Loss: 0.01246
Epoch: 14560, Training Loss: 0.01155
Epoch: 14560, Training Loss: 0.01423
Epoch: 14560, Training Loss: 0.01095
Epoch: 14561, Training Loss: 0.01246
Epoch: 14561, Training Loss: 0.01155
Epoch: 14561, Training Loss: 0.01423
Epoch: 14561, Training Loss: 0.01095
Epoch: 14562, Training Loss: 0.01246
Epoch: 14562, Training Loss: 0.01155
Epoch: 14562, Training Loss: 0.01423
Epoch: 14562, Training Loss: 0.01095
Epoch: 14563, Training Loss: 0.01246
Epoch: 14563, Training Loss: 0.01155
Epoch: 14563, Training Loss: 0.01423
Epoch: 14563, Training Loss: 0.01095
Epoch: 14564, Training Loss: 0.01246
Epoch: 14564, Training Loss: 0.01155
Epoch: 14564, Training Loss: 0.01423
Epoch: 14564, Training Loss: 0.01095
Epoch: 14565, Training Loss: 0.01246
Epoch: 14565, Training Loss: 0.01155
Epoch: 14565, Training Loss: 0.01423
Epoch: 14565, Training Loss: 0.01095
Epoch: 14566, Training Loss: 0.01246
Epoch: 14566, Training Loss: 0.01155
Epoch: 14566, Training Loss: 0.01423
Epoch: 14566, Training Loss: 0.01095
Epoch: 14567, Training Loss: 0.01246
Epoch: 14567, Training Loss: 0.01155
Epoch: 14567, Training Loss: 0.01423
Epoch: 14567, Training Loss: 0.01095
Epoch: 14568, Training Loss: 0.01246
Epoch: 14568, Training Loss: 0.01155
Epoch: 14568, Training Loss: 0.01423
Epoch: 14568, Training Loss: 0.01095
Epoch: 14569, Training Loss: 0.01246
Epoch: 14569, Training Loss: 0.01154
Epoch: 14569, Training Loss: 0.01423
Epoch: 14569, Training Loss: 0.01095
Epoch: 14570, Training Loss: 0.01246
Epoch: 14570, Training Loss: 0.01154
Epoch: 14570, Training Loss: 0.01422
Epoch: 14570, Training Loss: 0.01095
Epoch: 14571, Training Loss: 0.01246
Epoch: 14571, Training Loss: 0.01154
Epoch: 14571, Training Loss: 0.01422
Epoch: 14571, Training Loss: 0.01095
Epoch: 14572, Training Loss: 0.01246
Epoch: 14572, Training Loss: 0.01154
Epoch: 14572, Training Loss: 0.01422
Epoch: 14572, Training Loss: 0.01095
Epoch: 14573, Training Loss: 0.01246
Epoch: 14573, Training Loss: 0.01154
Epoch: 14573, Training Loss: 0.01422
Epoch: 14573, Training Loss: 0.01095
Epoch: 14574, Training Loss: 0.01246
Epoch: 14574, Training Loss: 0.01154
Epoch: 14574, Training Loss: 0.01422
Epoch: 14574, Training Loss: 0.01095
Epoch: 14575, Training Loss: 0.01246
Epoch: 14575, Training Loss: 0.01154
Epoch: 14575, Training Loss: 0.01422
Epoch: 14575, Training Loss: 0.01095
Epoch: 14576, Training Loss: 0.01246
Epoch: 14576, Training Loss: 0.01154
Epoch: 14576, Training Loss: 0.01422
Epoch: 14576, Training Loss: 0.01095
Epoch: 14577, Training Loss: 0.01246
Epoch: 14577, Training Loss: 0.01154
Epoch: 14577, Training Loss: 0.01422
Epoch: 14577, Training Loss: 0.01095
Epoch: 14578, Training Loss: 0.01246
Epoch: 14578, Training Loss: 0.01154
Epoch: 14578, Training Loss: 0.01422
Epoch: 14578, Training Loss: 0.01095
Epoch: 14579, Training Loss: 0.01246
Epoch: 14579, Training Loss: 0.01154
Epoch: 14579, Training Loss: 0.01422
Epoch: 14579, Training Loss: 0.01094
Epoch: 14580, Training Loss: 0.01245
Epoch: 14580, Training Loss: 0.01154
Epoch: 14580, Training Loss: 0.01422
Epoch: 14580, Training Loss: 0.01094
Epoch: 14581, Training Loss: 0.01245
Epoch: 14581, Training Loss: 0.01154
Epoch: 14581, Training Loss: 0.01422
Epoch: 14581, Training Loss: 0.01094
Epoch: 14582, Training Loss: 0.01245
Epoch: 14582, Training Loss: 0.01154
Epoch: 14582, Training Loss: 0.01422
Epoch: 14582, Training Loss: 0.01094
Epoch: 14583, Training Loss: 0.01245
Epoch: 14583, Training Loss: 0.01154
Epoch: 14583, Training Loss: 0.01422
Epoch: 14583, Training Loss: 0.01094
Epoch: 14584, Training Loss: 0.01245
Epoch: 14584, Training Loss: 0.01154
Epoch: 14584, Training Loss: 0.01422
Epoch: 14584, Training Loss: 0.01094
Epoch: 14585, Training Loss: 0.01245
Epoch: 14585, Training Loss: 0.01154
Epoch: 14585, Training Loss: 0.01422
Epoch: 14585, Training Loss: 0.01094
Epoch: 14586, Training Loss: 0.01245
Epoch: 14586, Training Loss: 0.01154
Epoch: 14586, Training Loss: 0.01422
Epoch: 14586, Training Loss: 0.01094
Epoch: 14587, Training Loss: 0.01245
Epoch: 14587, Training Loss: 0.01154
Epoch: 14587, Training Loss: 0.01422
Epoch: 14587, Training Loss: 0.01094
Epoch: 14588, Training Loss: 0.01245
Epoch: 14588, Training Loss: 0.01154
Epoch: 14588, Training Loss: 0.01422
Epoch: 14588, Training Loss: 0.01094
Epoch: 14589, Training Loss: 0.01245
Epoch: 14589, Training Loss: 0.01154
Epoch: 14589, Training Loss: 0.01421
Epoch: 14589, Training Loss: 0.01094
Epoch: 14590, Training Loss: 0.01245
Epoch: 14590, Training Loss: 0.01154
Epoch: 14590, Training Loss: 0.01421
Epoch: 14590, Training Loss: 0.01094
Epoch: 14591, Training Loss: 0.01245
Epoch: 14591, Training Loss: 0.01154
Epoch: 14591, Training Loss: 0.01421
Epoch: 14591, Training Loss: 0.01094
Epoch: 14592, Training Loss: 0.01245
Epoch: 14592, Training Loss: 0.01153
Epoch: 14592, Training Loss: 0.01421
Epoch: 14592, Training Loss: 0.01094
Epoch: 14593, Training Loss: 0.01245
Epoch: 14593, Training Loss: 0.01153
Epoch: 14593, Training Loss: 0.01421
Epoch: 14593, Training Loss: 0.01094
Epoch: 14594, Training Loss: 0.01245
Epoch: 14594, Training Loss: 0.01153
Epoch: 14594, Training Loss: 0.01421
Epoch: 14594, Training Loss: 0.01094
Epoch: 14595, Training Loss: 0.01245
Epoch: 14595, Training Loss: 0.01153
Epoch: 14595, Training Loss: 0.01421
Epoch: 14595, Training Loss: 0.01094
Epoch: 14596, Training Loss: 0.01245
Epoch: 14596, Training Loss: 0.01153
Epoch: 14596, Training Loss: 0.01421
Epoch: 14596, Training Loss: 0.01094
Epoch: 14597, Training Loss: 0.01245
Epoch: 14597, Training Loss: 0.01153
Epoch: 14597, Training Loss: 0.01421
Epoch: 14597, Training Loss: 0.01094
Epoch: 14598, Training Loss: 0.01245
Epoch: 14598, Training Loss: 0.01153
Epoch: 14598, Training Loss: 0.01421
Epoch: 14598, Training Loss: 0.01094
Epoch: 14599, Training Loss: 0.01245
Epoch: 14599, Training Loss: 0.01153
Epoch: 14599, Training Loss: 0.01421
Epoch: 14599, Training Loss: 0.01094
Epoch: 14600, Training Loss: 0.01245
Epoch: 14600, Training Loss: 0.01153
Epoch: 14600, Training Loss: 0.01421
Epoch: 14600, Training Loss: 0.01094
Epoch: 14601, Training Loss: 0.01244
Epoch: 14601, Training Loss: 0.01153
Epoch: 14601, Training Loss: 0.01421
Epoch: 14601, Training Loss: 0.01094
Epoch: 14602, Training Loss: 0.01244
Epoch: 14602, Training Loss: 0.01153
Epoch: 14602, Training Loss: 0.01421
Epoch: 14602, Training Loss: 0.01094
Epoch: 14603, Training Loss: 0.01244
Epoch: 14603, Training Loss: 0.01153
Epoch: 14603, Training Loss: 0.01421
Epoch: 14603, Training Loss: 0.01094
Epoch: 14604, Training Loss: 0.01244
Epoch: 14604, Training Loss: 0.01153
Epoch: 14604, Training Loss: 0.01421
Epoch: 14604, Training Loss: 0.01093
Epoch: 14605, Training Loss: 0.01244
Epoch: 14605, Training Loss: 0.01153
Epoch: 14605, Training Loss: 0.01421
Epoch: 14605, Training Loss: 0.01093
Epoch: 14606, Training Loss: 0.01244
Epoch: 14606, Training Loss: 0.01153
Epoch: 14606, Training Loss: 0.01421
Epoch: 14606, Training Loss: 0.01093
Epoch: 14607, Training Loss: 0.01244
Epoch: 14607, Training Loss: 0.01153
Epoch: 14607, Training Loss: 0.01421
Epoch: 14607, Training Loss: 0.01093
Epoch: 14608, Training Loss: 0.01244
Epoch: 14608, Training Loss: 0.01153
Epoch: 14608, Training Loss: 0.01420
Epoch: 14608, Training Loss: 0.01093
Epoch: 14609, Training Loss: 0.01244
Epoch: 14609, Training Loss: 0.01153
Epoch: 14609, Training Loss: 0.01420
Epoch: 14609, Training Loss: 0.01093
Epoch: 14610, Training Loss: 0.01244
Epoch: 14610, Training Loss: 0.01153
Epoch: 14610, Training Loss: 0.01420
Epoch: 14610, Training Loss: 0.01093
Epoch: 14611, Training Loss: 0.01244
Epoch: 14611, Training Loss: 0.01153
Epoch: 14611, Training Loss: 0.01420
Epoch: 14611, Training Loss: 0.01093
Epoch: 14612, Training Loss: 0.01244
Epoch: 14612, Training Loss: 0.01153
Epoch: 14612, Training Loss: 0.01420
Epoch: 14612, Training Loss: 0.01093
Epoch: 14613, Training Loss: 0.01244
Epoch: 14613, Training Loss: 0.01153
Epoch: 14613, Training Loss: 0.01420
Epoch: 14613, Training Loss: 0.01093
Epoch: 14614, Training Loss: 0.01244
Epoch: 14614, Training Loss: 0.01153
Epoch: 14614, Training Loss: 0.01420
Epoch: 14614, Training Loss: 0.01093
Epoch: 14615, Training Loss: 0.01244
Epoch: 14615, Training Loss: 0.01153
Epoch: 14615, Training Loss: 0.01420
Epoch: 14615, Training Loss: 0.01093
Epoch: 14616, Training Loss: 0.01244
Epoch: 14616, Training Loss: 0.01152
Epoch: 14616, Training Loss: 0.01420
Epoch: 14616, Training Loss: 0.01093
Epoch: 14617, Training Loss: 0.01244
Epoch: 14617, Training Loss: 0.01152
Epoch: 14617, Training Loss: 0.01420
Epoch: 14617, Training Loss: 0.01093
Epoch: 14618, Training Loss: 0.01244
Epoch: 14618, Training Loss: 0.01152
Epoch: 14618, Training Loss: 0.01420
Epoch: 14618, Training Loss: 0.01093
Epoch: 14619, Training Loss: 0.01244
Epoch: 14619, Training Loss: 0.01152
Epoch: 14619, Training Loss: 0.01420
Epoch: 14619, Training Loss: 0.01093
Epoch: 14620, Training Loss: 0.01244
Epoch: 14620, Training Loss: 0.01152
Epoch: 14620, Training Loss: 0.01420
Epoch: 14620, Training Loss: 0.01093
Epoch: 14621, Training Loss: 0.01244
Epoch: 14621, Training Loss: 0.01152
Epoch: 14621, Training Loss: 0.01420
Epoch: 14621, Training Loss: 0.01093
Epoch: 14622, Training Loss: 0.01243
Epoch: 14622, Training Loss: 0.01152
Epoch: 14622, Training Loss: 0.01420
Epoch: 14622, Training Loss: 0.01093
Epoch: 14623, Training Loss: 0.01243
Epoch: 14623, Training Loss: 0.01152
Epoch: 14623, Training Loss: 0.01420
Epoch: 14623, Training Loss: 0.01093
Epoch: 14624, Training Loss: 0.01243
Epoch: 14624, Training Loss: 0.01152
Epoch: 14624, Training Loss: 0.01420
Epoch: 14624, Training Loss: 0.01093
Epoch: 14625, Training Loss: 0.01243
Epoch: 14625, Training Loss: 0.01152
Epoch: 14625, Training Loss: 0.01420
Epoch: 14625, Training Loss: 0.01093
Epoch: 14626, Training Loss: 0.01243
Epoch: 14626, Training Loss: 0.01152
Epoch: 14626, Training Loss: 0.01419
Epoch: 14626, Training Loss: 0.01093
Epoch: 14627, Training Loss: 0.01243
Epoch: 14627, Training Loss: 0.01152
Epoch: 14627, Training Loss: 0.01419
Epoch: 14627, Training Loss: 0.01093
Epoch: 14628, Training Loss: 0.01243
Epoch: 14628, Training Loss: 0.01152
Epoch: 14628, Training Loss: 0.01419
Epoch: 14628, Training Loss: 0.01092
Epoch: 14629, Training Loss: 0.01243
Epoch: 14629, Training Loss: 0.01152
Epoch: 14629, Training Loss: 0.01419
Epoch: 14629, Training Loss: 0.01092
Epoch: 14630, Training Loss: 0.01243
Epoch: 14630, Training Loss: 0.01152
Epoch: 14630, Training Loss: 0.01419
Epoch: 14630, Training Loss: 0.01092
Epoch: 14631, Training Loss: 0.01243
Epoch: 14631, Training Loss: 0.01152
Epoch: 14631, Training Loss: 0.01419
Epoch: 14631, Training Loss: 0.01092
Epoch: 14632, Training Loss: 0.01243
Epoch: 14632, Training Loss: 0.01152
Epoch: 14632, Training Loss: 0.01419
Epoch: 14632, Training Loss: 0.01092
Epoch: 14633, Training Loss: 0.01243
Epoch: 14633, Training Loss: 0.01152
Epoch: 14633, Training Loss: 0.01419
Epoch: 14633, Training Loss: 0.01092
Epoch: 14634, Training Loss: 0.01243
Epoch: 14634, Training Loss: 0.01152
Epoch: 14634, Training Loss: 0.01419
Epoch: 14634, Training Loss: 0.01092
Epoch: 14635, Training Loss: 0.01243
Epoch: 14635, Training Loss: 0.01152
Epoch: 14635, Training Loss: 0.01419
Epoch: 14635, Training Loss: 0.01092
Epoch: 14636, Training Loss: 0.01243
Epoch: 14636, Training Loss: 0.01152
Epoch: 14636, Training Loss: 0.01419
Epoch: 14636, Training Loss: 0.01092
Epoch: 14637, Training Loss: 0.01243
Epoch: 14637, Training Loss: 0.01152
Epoch: 14637, Training Loss: 0.01419
Epoch: 14637, Training Loss: 0.01092
Epoch: 14638, Training Loss: 0.01243
Epoch: 14638, Training Loss: 0.01152
Epoch: 14638, Training Loss: 0.01419
Epoch: 14638, Training Loss: 0.01092
Epoch: 14639, Training Loss: 0.01243
Epoch: 14639, Training Loss: 0.01151
Epoch: 14639, Training Loss: 0.01419
Epoch: 14639, Training Loss: 0.01092
Epoch: 14640, Training Loss: 0.01243
Epoch: 14640, Training Loss: 0.01151
Epoch: 14640, Training Loss: 0.01419
Epoch: 14640, Training Loss: 0.01092
Epoch: 14641, Training Loss: 0.01243
Epoch: 14641, Training Loss: 0.01151
Epoch: 14641, Training Loss: 0.01419
Epoch: 14641, Training Loss: 0.01092
Epoch: 14642, Training Loss: 0.01243
Epoch: 14642, Training Loss: 0.01151
Epoch: 14642, Training Loss: 0.01419
Epoch: 14642, Training Loss: 0.01092
Epoch: 14643, Training Loss: 0.01242
Epoch: 14643, Training Loss: 0.01151
Epoch: 14643, Training Loss: 0.01419
Epoch: 14643, Training Loss: 0.01092
Epoch: 14644, Training Loss: 0.01242
Epoch: 14644, Training Loss: 0.01151
Epoch: 14644, Training Loss: 0.01419
Epoch: 14644, Training Loss: 0.01092
Epoch: 14645, Training Loss: 0.01242
Epoch: 14645, Training Loss: 0.01151
Epoch: 14645, Training Loss: 0.01418
Epoch: 14645, Training Loss: 0.01092
Epoch: 14646, Training Loss: 0.01242
Epoch: 14646, Training Loss: 0.01151
Epoch: 14646, Training Loss: 0.01418
Epoch: 14646, Training Loss: 0.01092
Epoch: 14647, Training Loss: 0.01242
Epoch: 14647, Training Loss: 0.01151
Epoch: 14647, Training Loss: 0.01418
Epoch: 14647, Training Loss: 0.01092
Epoch: 14648, Training Loss: 0.01242
Epoch: 14648, Training Loss: 0.01151
Epoch: 14648, Training Loss: 0.01418
Epoch: 14648, Training Loss: 0.01092
Epoch: 14649, Training Loss: 0.01242
Epoch: 14649, Training Loss: 0.01151
Epoch: 14649, Training Loss: 0.01418
Epoch: 14649, Training Loss: 0.01092
Epoch: 14650, Training Loss: 0.01242
Epoch: 14650, Training Loss: 0.01151
Epoch: 14650, Training Loss: 0.01418
Epoch: 14650, Training Loss: 0.01092
Epoch: 14651, Training Loss: 0.01242
Epoch: 14651, Training Loss: 0.01151
Epoch: 14651, Training Loss: 0.01418
Epoch: 14651, Training Loss: 0.01092
Epoch: 14652, Training Loss: 0.01242
Epoch: 14652, Training Loss: 0.01151
Epoch: 14652, Training Loss: 0.01418
Epoch: 14652, Training Loss: 0.01092
Epoch: 14653, Training Loss: 0.01242
Epoch: 14653, Training Loss: 0.01151
Epoch: 14653, Training Loss: 0.01418
Epoch: 14653, Training Loss: 0.01091
Epoch: 14654, Training Loss: 0.01242
Epoch: 14654, Training Loss: 0.01151
Epoch: 14654, Training Loss: 0.01418
Epoch: 14654, Training Loss: 0.01091
Epoch: 14655, Training Loss: 0.01242
Epoch: 14655, Training Loss: 0.01151
Epoch: 14655, Training Loss: 0.01418
Epoch: 14655, Training Loss: 0.01091
Epoch: 14656, Training Loss: 0.01242
Epoch: 14656, Training Loss: 0.01151
Epoch: 14656, Training Loss: 0.01418
Epoch: 14656, Training Loss: 0.01091
Epoch: 14657, Training Loss: 0.01242
Epoch: 14657, Training Loss: 0.01151
Epoch: 14657, Training Loss: 0.01418
Epoch: 14657, Training Loss: 0.01091
Epoch: 14658, Training Loss: 0.01242
Epoch: 14658, Training Loss: 0.01151
Epoch: 14658, Training Loss: 0.01418
Epoch: 14658, Training Loss: 0.01091
Epoch: 14659, Training Loss: 0.01242
Epoch: 14659, Training Loss: 0.01151
Epoch: 14659, Training Loss: 0.01418
Epoch: 14659, Training Loss: 0.01091
Epoch: 14660, Training Loss: 0.01242
Epoch: 14660, Training Loss: 0.01151
Epoch: 14660, Training Loss: 0.01418
Epoch: 14660, Training Loss: 0.01091
Epoch: 14661, Training Loss: 0.01242
Epoch: 14661, Training Loss: 0.01151
Epoch: 14661, Training Loss: 0.01418
Epoch: 14661, Training Loss: 0.01091
Epoch: 14662, Training Loss: 0.01242
Epoch: 14662, Training Loss: 0.01151
Epoch: 14662, Training Loss: 0.01418
Epoch: 14662, Training Loss: 0.01091
Epoch: 14663, Training Loss: 0.01242
Epoch: 14663, Training Loss: 0.01150
Epoch: 14663, Training Loss: 0.01418
Epoch: 14663, Training Loss: 0.01091
Epoch: 14664, Training Loss: 0.01242
Epoch: 14664, Training Loss: 0.01150
Epoch: 14664, Training Loss: 0.01417
Epoch: 14664, Training Loss: 0.01091
Epoch: 14665, Training Loss: 0.01241
Epoch: 14665, Training Loss: 0.01150
Epoch: 14665, Training Loss: 0.01417
Epoch: 14665, Training Loss: 0.01091
Epoch: 14666, Training Loss: 0.01241
Epoch: 14666, Training Loss: 0.01150
Epoch: 14666, Training Loss: 0.01417
Epoch: 14666, Training Loss: 0.01091
Epoch: 14667, Training Loss: 0.01241
Epoch: 14667, Training Loss: 0.01150
Epoch: 14667, Training Loss: 0.01417
Epoch: 14667, Training Loss: 0.01091
Epoch: 14668, Training Loss: 0.01241
Epoch: 14668, Training Loss: 0.01150
Epoch: 14668, Training Loss: 0.01417
Epoch: 14668, Training Loss: 0.01091
Epoch: 14669, Training Loss: 0.01241
Epoch: 14669, Training Loss: 0.01150
Epoch: 14669, Training Loss: 0.01417
Epoch: 14669, Training Loss: 0.01091
Epoch: 14670, Training Loss: 0.01241
Epoch: 14670, Training Loss: 0.01150
Epoch: 14670, Training Loss: 0.01417
Epoch: 14670, Training Loss: 0.01091
Epoch: 14671, Training Loss: 0.01241
Epoch: 14671, Training Loss: 0.01150
Epoch: 14671, Training Loss: 0.01417
Epoch: 14671, Training Loss: 0.01091
Epoch: 14672, Training Loss: 0.01241
Epoch: 14672, Training Loss: 0.01150
Epoch: 14672, Training Loss: 0.01417
Epoch: 14672, Training Loss: 0.01091
Epoch: 14673, Training Loss: 0.01241
Epoch: 14673, Training Loss: 0.01150
Epoch: 14673, Training Loss: 0.01417
Epoch: 14673, Training Loss: 0.01091
Epoch: 14674, Training Loss: 0.01241
Epoch: 14674, Training Loss: 0.01150
Epoch: 14674, Training Loss: 0.01417
Epoch: 14674, Training Loss: 0.01091
Epoch: 14675, Training Loss: 0.01241
Epoch: 14675, Training Loss: 0.01150
Epoch: 14675, Training Loss: 0.01417
Epoch: 14675, Training Loss: 0.01091
Epoch: 14676, Training Loss: 0.01241
Epoch: 14676, Training Loss: 0.01150
Epoch: 14676, Training Loss: 0.01417
Epoch: 14676, Training Loss: 0.01091
Epoch: 14677, Training Loss: 0.01241
Epoch: 14677, Training Loss: 0.01150
Epoch: 14677, Training Loss: 0.01417
Epoch: 14677, Training Loss: 0.01091
Epoch: 14678, Training Loss: 0.01241
Epoch: 14678, Training Loss: 0.01150
Epoch: 14678, Training Loss: 0.01417
Epoch: 14678, Training Loss: 0.01090
Epoch: 14679, Training Loss: 0.01241
Epoch: 14679, Training Loss: 0.01150
Epoch: 14679, Training Loss: 0.01417
Epoch: 14679, Training Loss: 0.01090
Epoch: 14680, Training Loss: 0.01241
Epoch: 14680, Training Loss: 0.01150
Epoch: 14680, Training Loss: 0.01417
Epoch: 14680, Training Loss: 0.01090
Epoch: 14681, Training Loss: 0.01241
Epoch: 14681, Training Loss: 0.01150
Epoch: 14681, Training Loss: 0.01417
Epoch: 14681, Training Loss: 0.01090
Epoch: 14682, Training Loss: 0.01241
Epoch: 14682, Training Loss: 0.01150
Epoch: 14682, Training Loss: 0.01416
Epoch: 14682, Training Loss: 0.01090
Epoch: 14683, Training Loss: 0.01241
Epoch: 14683, Training Loss: 0.01150
Epoch: 14683, Training Loss: 0.01416
Epoch: 14683, Training Loss: 0.01090
Epoch: 14684, Training Loss: 0.01241
Epoch: 14684, Training Loss: 0.01150
Epoch: 14684, Training Loss: 0.01416
Epoch: 14684, Training Loss: 0.01090
Epoch: 14685, Training Loss: 0.01241
Epoch: 14685, Training Loss: 0.01150
Epoch: 14685, Training Loss: 0.01416
Epoch: 14685, Training Loss: 0.01090
Epoch: 14686, Training Loss: 0.01240
Epoch: 14686, Training Loss: 0.01150
Epoch: 14686, Training Loss: 0.01416
Epoch: 14686, Training Loss: 0.01090
Epoch: 14687, Training Loss: 0.01240
Epoch: 14687, Training Loss: 0.01149
Epoch: 14687, Training Loss: 0.01416
Epoch: 14687, Training Loss: 0.01090
Epoch: 14688, Training Loss: 0.01240
Epoch: 14688, Training Loss: 0.01149
Epoch: 14688, Training Loss: 0.01416
Epoch: 14688, Training Loss: 0.01090
Epoch: 14689, Training Loss: 0.01240
Epoch: 14689, Training Loss: 0.01149
Epoch: 14689, Training Loss: 0.01416
Epoch: 14689, Training Loss: 0.01090
Epoch: 14690, Training Loss: 0.01240
Epoch: 14690, Training Loss: 0.01149
Epoch: 14690, Training Loss: 0.01416
Epoch: 14690, Training Loss: 0.01090
Epoch: 14691, Training Loss: 0.01240
Epoch: 14691, Training Loss: 0.01149
Epoch: 14691, Training Loss: 0.01416
Epoch: 14691, Training Loss: 0.01090
Epoch: 14692, Training Loss: 0.01240
Epoch: 14692, Training Loss: 0.01149
Epoch: 14692, Training Loss: 0.01416
Epoch: 14692, Training Loss: 0.01090
Epoch: 14693, Training Loss: 0.01240
Epoch: 14693, Training Loss: 0.01149
Epoch: 14693, Training Loss: 0.01416
Epoch: 14693, Training Loss: 0.01090
Epoch: 14694, Training Loss: 0.01240
Epoch: 14694, Training Loss: 0.01149
Epoch: 14694, Training Loss: 0.01416
Epoch: 14694, Training Loss: 0.01090
Epoch: 14695, Training Loss: 0.01240
Epoch: 14695, Training Loss: 0.01149
Epoch: 14695, Training Loss: 0.01416
Epoch: 14695, Training Loss: 0.01090
Epoch: 14696, Training Loss: 0.01240
Epoch: 14696, Training Loss: 0.01149
Epoch: 14696, Training Loss: 0.01416
Epoch: 14696, Training Loss: 0.01090
Epoch: 14697, Training Loss: 0.01240
Epoch: 14697, Training Loss: 0.01149
Epoch: 14697, Training Loss: 0.01416
Epoch: 14697, Training Loss: 0.01090
Epoch: 14698, Training Loss: 0.01240
Epoch: 14698, Training Loss: 0.01149
Epoch: 14698, Training Loss: 0.01416
Epoch: 14698, Training Loss: 0.01090
Epoch: 14699, Training Loss: 0.01240
Epoch: 14699, Training Loss: 0.01149
Epoch: 14699, Training Loss: 0.01416
Epoch: 14699, Training Loss: 0.01090
Epoch: 14700, Training Loss: 0.01240
Epoch: 14700, Training Loss: 0.01149
Epoch: 14700, Training Loss: 0.01416
Epoch: 14700, Training Loss: 0.01090
Epoch: 14701, Training Loss: 0.01240
Epoch: 14701, Training Loss: 0.01149
Epoch: 14701, Training Loss: 0.01415
Epoch: 14701, Training Loss: 0.01090
Epoch: 14702, Training Loss: 0.01240
Epoch: 14702, Training Loss: 0.01149
Epoch: 14702, Training Loss: 0.01415
Epoch: 14702, Training Loss: 0.01090
Epoch: 14703, Training Loss: 0.01240
Epoch: 14703, Training Loss: 0.01149
Epoch: 14703, Training Loss: 0.01415
Epoch: 14703, Training Loss: 0.01089
Epoch: 14704, Training Loss: 0.01240
Epoch: 14704, Training Loss: 0.01149
Epoch: 14704, Training Loss: 0.01415
Epoch: 14704, Training Loss: 0.01089
Epoch: 14705, Training Loss: 0.01240
Epoch: 14705, Training Loss: 0.01149
Epoch: 14705, Training Loss: 0.01415
Epoch: 14705, Training Loss: 0.01089
Epoch: 14706, Training Loss: 0.01240
Epoch: 14706, Training Loss: 0.01149
Epoch: 14706, Training Loss: 0.01415
Epoch: 14706, Training Loss: 0.01089
Epoch: 14707, Training Loss: 0.01240
Epoch: 14707, Training Loss: 0.01149
Epoch: 14707, Training Loss: 0.01415
Epoch: 14707, Training Loss: 0.01089
Epoch: 14708, Training Loss: 0.01239
Epoch: 14708, Training Loss: 0.01149
Epoch: 14708, Training Loss: 0.01415
Epoch: 14708, Training Loss: 0.01089
Epoch: 14709, Training Loss: 0.01239
Epoch: 14709, Training Loss: 0.01149
Epoch: 14709, Training Loss: 0.01415
Epoch: 14709, Training Loss: 0.01089
Epoch: 14710, Training Loss: 0.01239
Epoch: 14710, Training Loss: 0.01149
Epoch: 14710, Training Loss: 0.01415
Epoch: 14710, Training Loss: 0.01089
Epoch: 14711, Training Loss: 0.01239
Epoch: 14711, Training Loss: 0.01148
Epoch: 14711, Training Loss: 0.01415
Epoch: 14711, Training Loss: 0.01089
Epoch: 14712, Training Loss: 0.01239
Epoch: 14712, Training Loss: 0.01148
Epoch: 14712, Training Loss: 0.01415
Epoch: 14712, Training Loss: 0.01089
Epoch: 14713, Training Loss: 0.01239
Epoch: 14713, Training Loss: 0.01148
Epoch: 14713, Training Loss: 0.01415
Epoch: 14713, Training Loss: 0.01089
Epoch: 14714, Training Loss: 0.01239
Epoch: 14714, Training Loss: 0.01148
Epoch: 14714, Training Loss: 0.01415
Epoch: 14714, Training Loss: 0.01089
Epoch: 14715, Training Loss: 0.01239
Epoch: 14715, Training Loss: 0.01148
Epoch: 14715, Training Loss: 0.01415
Epoch: 14715, Training Loss: 0.01089
Epoch: 14716, Training Loss: 0.01239
Epoch: 14716, Training Loss: 0.01148
Epoch: 14716, Training Loss: 0.01415
Epoch: 14716, Training Loss: 0.01089
Epoch: 14717, Training Loss: 0.01239
Epoch: 14717, Training Loss: 0.01148
Epoch: 14717, Training Loss: 0.01415
Epoch: 14717, Training Loss: 0.01089
Epoch: 14718, Training Loss: 0.01239
Epoch: 14718, Training Loss: 0.01148
Epoch: 14718, Training Loss: 0.01415
Epoch: 14718, Training Loss: 0.01089
Epoch: 14719, Training Loss: 0.01239
Epoch: 14719, Training Loss: 0.01148
Epoch: 14719, Training Loss: 0.01415
Epoch: 14719, Training Loss: 0.01089
Epoch: 14720, Training Loss: 0.01239
Epoch: 14720, Training Loss: 0.01148
Epoch: 14720, Training Loss: 0.01414
Epoch: 14720, Training Loss: 0.01089
Epoch: 14721, Training Loss: 0.01239
Epoch: 14721, Training Loss: 0.01148
Epoch: 14721, Training Loss: 0.01414
Epoch: 14721, Training Loss: 0.01089
Epoch: 14722, Training Loss: 0.01239
Epoch: 14722, Training Loss: 0.01148
Epoch: 14722, Training Loss: 0.01414
Epoch: 14722, Training Loss: 0.01089
Epoch: 14723, Training Loss: 0.01239
Epoch: 14723, Training Loss: 0.01148
Epoch: 14723, Training Loss: 0.01414
Epoch: 14723, Training Loss: 0.01089
Epoch: 14724, Training Loss: 0.01239
Epoch: 14724, Training Loss: 0.01148
Epoch: 14724, Training Loss: 0.01414
Epoch: 14724, Training Loss: 0.01089
Epoch: 14725, Training Loss: 0.01239
Epoch: 14725, Training Loss: 0.01148
Epoch: 14725, Training Loss: 0.01414
Epoch: 14725, Training Loss: 0.01089
Epoch: 14726, Training Loss: 0.01239
Epoch: 14726, Training Loss: 0.01148
Epoch: 14726, Training Loss: 0.01414
Epoch: 14726, Training Loss: 0.01089
Epoch: 14727, Training Loss: 0.01239
Epoch: 14727, Training Loss: 0.01148
Epoch: 14727, Training Loss: 0.01414
Epoch: 14727, Training Loss: 0.01089
Epoch: 14728, Training Loss: 0.01239
Epoch: 14728, Training Loss: 0.01148
Epoch: 14728, Training Loss: 0.01414
Epoch: 14728, Training Loss: 0.01088
Epoch: 14729, Training Loss: 0.01238
Epoch: 14729, Training Loss: 0.01148
Epoch: 14729, Training Loss: 0.01414
Epoch: 14729, Training Loss: 0.01088
Epoch: 14730, Training Loss: 0.01238
Epoch: 14730, Training Loss: 0.01148
Epoch: 14730, Training Loss: 0.01414
Epoch: 14730, Training Loss: 0.01088
Epoch: 14731, Training Loss: 0.01238
Epoch: 14731, Training Loss: 0.01148
Epoch: 14731, Training Loss: 0.01414
Epoch: 14731, Training Loss: 0.01088
Epoch: 14732, Training Loss: 0.01238
Epoch: 14732, Training Loss: 0.01148
Epoch: 14732, Training Loss: 0.01414
Epoch: 14732, Training Loss: 0.01088
Epoch: 14733, Training Loss: 0.01238
Epoch: 14733, Training Loss: 0.01148
Epoch: 14733, Training Loss: 0.01414
Epoch: 14733, Training Loss: 0.01088
Epoch: 14734, Training Loss: 0.01238
Epoch: 14734, Training Loss: 0.01147
Epoch: 14734, Training Loss: 0.01414
Epoch: 14734, Training Loss: 0.01088
Epoch: 14735, Training Loss: 0.01238
Epoch: 14735, Training Loss: 0.01147
Epoch: 14735, Training Loss: 0.01414
Epoch: 14735, Training Loss: 0.01088
Epoch: 14736, Training Loss: 0.01238
Epoch: 14736, Training Loss: 0.01147
Epoch: 14736, Training Loss: 0.01414
Epoch: 14736, Training Loss: 0.01088
Epoch: 14737, Training Loss: 0.01238
Epoch: 14737, Training Loss: 0.01147
Epoch: 14737, Training Loss: 0.01414
Epoch: 14737, Training Loss: 0.01088
Epoch: 14738, Training Loss: 0.01238
Epoch: 14738, Training Loss: 0.01147
Epoch: 14738, Training Loss: 0.01414
Epoch: 14738, Training Loss: 0.01088
Epoch: 14739, Training Loss: 0.01238
Epoch: 14739, Training Loss: 0.01147
Epoch: 14739, Training Loss: 0.01413
Epoch: 14739, Training Loss: 0.01088
Epoch: 14740, Training Loss: 0.01238
Epoch: 14740, Training Loss: 0.01147
Epoch: 14740, Training Loss: 0.01413
Epoch: 14740, Training Loss: 0.01088
Epoch: 14741, Training Loss: 0.01238
Epoch: 14741, Training Loss: 0.01147
Epoch: 14741, Training Loss: 0.01413
Epoch: 14741, Training Loss: 0.01088
Epoch: 14742, Training Loss: 0.01238
Epoch: 14742, Training Loss: 0.01147
Epoch: 14742, Training Loss: 0.01413
Epoch: 14742, Training Loss: 0.01088
Epoch: 14743, Training Loss: 0.01238
Epoch: 14743, Training Loss: 0.01147
Epoch: 14743, Training Loss: 0.01413
Epoch: 14743, Training Loss: 0.01088
Epoch: 14744, Training Loss: 0.01238
Epoch: 14744, Training Loss: 0.01147
Epoch: 14744, Training Loss: 0.01413
Epoch: 14744, Training Loss: 0.01088
Epoch: 14745, Training Loss: 0.01238
Epoch: 14745, Training Loss: 0.01147
Epoch: 14745, Training Loss: 0.01238
Epoch: 14745, Training Loss: 0.01147
Epoch: 14745, Training Loss: 0.01413
Epoch: 14745, Training Loss: 0.01088
[... output truncated: the loop prints the four per-sample losses every epoch; over epochs 14746-14987 they decline only gradually, from about 0.01238/0.01147/0.01413/0.01088 to about 0.01227/0.01137/0.01401/0.01078 ...]
Epoch: 14988, Training Loss: 0.01227
Epoch: 14988, Training Loss: 0.01137
Epoch: 14988, Training Loss: 0.01400
Epoch: 14988, Training Loss: 0.01078
Epoch: 14989, Training Loss: 0.01227
Epoch: 14989, Training Loss: 0.01137
Epoch: 14989, Training Loss: 0.01400
Epoch: 14989, Training Loss: 0.01078
Epoch: 14990, Training Loss: 0.01227
Epoch: 14990, Training Loss: 0.01137
Epoch: 14990, Training Loss: 0.01400
Epoch: 14990, Training Loss: 0.01078
Epoch: 14991, Training Loss: 0.01227
Epoch: 14991, Training Loss: 0.01137
Epoch: 14991, Training Loss: 0.01400
Epoch: 14991, Training Loss: 0.01078
Epoch: 14992, Training Loss: 0.01226
Epoch: 14992, Training Loss: 0.01137
Epoch: 14992, Training Loss: 0.01400
Epoch: 14992, Training Loss: 0.01078
Epoch: 14993, Training Loss: 0.01226
Epoch: 14993, Training Loss: 0.01137
Epoch: 14993, Training Loss: 0.01400
Epoch: 14993, Training Loss: 0.01078
Epoch: 14994, Training Loss: 0.01226
Epoch: 14994, Training Loss: 0.01137
Epoch: 14994, Training Loss: 0.01400
Epoch: 14994, Training Loss: 0.01078
Epoch: 14995, Training Loss: 0.01226
Epoch: 14995, Training Loss: 0.01137
Epoch: 14995, Training Loss: 0.01400
Epoch: 14995, Training Loss: 0.01078
Epoch: 14996, Training Loss: 0.01226
Epoch: 14996, Training Loss: 0.01137
Epoch: 14996, Training Loss: 0.01400
Epoch: 14996, Training Loss: 0.01078
Epoch: 14997, Training Loss: 0.01226
Epoch: 14997, Training Loss: 0.01137
Epoch: 14997, Training Loss: 0.01400
Epoch: 14997, Training Loss: 0.01078
Epoch: 14998, Training Loss: 0.01226
Epoch: 14998, Training Loss: 0.01137
Epoch: 14998, Training Loss: 0.01400
Epoch: 14998, Training Loss: 0.01078
Epoch: 14999, Training Loss: 0.01226
Epoch: 14999, Training Loss: 0.01137
Epoch: 14999, Training Loss: 0.01400
Epoch: 14999, Training Loss: 0.01078
Epoch: 15000, Training Loss: 0.01226
Epoch: 15000, Training Loss: 0.01137
Epoch: 15000, Training Loss: 0.01400
Epoch: 15000, Training Loss: 0.01078
Epoch: 15001, Training Loss: 0.01226
Epoch: 15001, Training Loss: 0.01136
Epoch: 15001, Training Loss: 0.01400
Epoch: 15001, Training Loss: 0.01078
Epoch: 15002, Training Loss: 0.01226
Epoch: 15002, Training Loss: 0.01136
Epoch: 15002, Training Loss: 0.01400
Epoch: 15002, Training Loss: 0.01078
Epoch: 15003, Training Loss: 0.01226
Epoch: 15003, Training Loss: 0.01136
Epoch: 15003, Training Loss: 0.01400
Epoch: 15003, Training Loss: 0.01078
Epoch: 15004, Training Loss: 0.01226
Epoch: 15004, Training Loss: 0.01136
Epoch: 15004, Training Loss: 0.01400
Epoch: 15004, Training Loss: 0.01078
Epoch: 15005, Training Loss: 0.01226
Epoch: 15005, Training Loss: 0.01136
Epoch: 15005, Training Loss: 0.01400
Epoch: 15005, Training Loss: 0.01078
Epoch: 15006, Training Loss: 0.01226
Epoch: 15006, Training Loss: 0.01136
Epoch: 15006, Training Loss: 0.01400
Epoch: 15006, Training Loss: 0.01078
Epoch: 15007, Training Loss: 0.01226
Epoch: 15007, Training Loss: 0.01136
Epoch: 15007, Training Loss: 0.01400
Epoch: 15007, Training Loss: 0.01077
Epoch: 15008, Training Loss: 0.01226
Epoch: 15008, Training Loss: 0.01136
Epoch: 15008, Training Loss: 0.01399
Epoch: 15008, Training Loss: 0.01077
Epoch: 15009, Training Loss: 0.01226
Epoch: 15009, Training Loss: 0.01136
Epoch: 15009, Training Loss: 0.01399
Epoch: 15009, Training Loss: 0.01077
Epoch: 15010, Training Loss: 0.01226
Epoch: 15010, Training Loss: 0.01136
Epoch: 15010, Training Loss: 0.01399
Epoch: 15010, Training Loss: 0.01077
Epoch: 15011, Training Loss: 0.01226
Epoch: 15011, Training Loss: 0.01136
Epoch: 15011, Training Loss: 0.01399
Epoch: 15011, Training Loss: 0.01077
Epoch: 15012, Training Loss: 0.01226
Epoch: 15012, Training Loss: 0.01136
Epoch: 15012, Training Loss: 0.01399
Epoch: 15012, Training Loss: 0.01077
Epoch: 15013, Training Loss: 0.01226
Epoch: 15013, Training Loss: 0.01136
Epoch: 15013, Training Loss: 0.01399
Epoch: 15013, Training Loss: 0.01077
Epoch: 15014, Training Loss: 0.01226
Epoch: 15014, Training Loss: 0.01136
Epoch: 15014, Training Loss: 0.01399
Epoch: 15014, Training Loss: 0.01077
Epoch: 15015, Training Loss: 0.01225
Epoch: 15015, Training Loss: 0.01136
Epoch: 15015, Training Loss: 0.01399
Epoch: 15015, Training Loss: 0.01077
Epoch: 15016, Training Loss: 0.01225
Epoch: 15016, Training Loss: 0.01136
Epoch: 15016, Training Loss: 0.01399
Epoch: 15016, Training Loss: 0.01077
Epoch: 15017, Training Loss: 0.01225
Epoch: 15017, Training Loss: 0.01136
Epoch: 15017, Training Loss: 0.01399
Epoch: 15017, Training Loss: 0.01077
Epoch: 15018, Training Loss: 0.01225
Epoch: 15018, Training Loss: 0.01136
Epoch: 15018, Training Loss: 0.01399
Epoch: 15018, Training Loss: 0.01077
Epoch: 15019, Training Loss: 0.01225
Epoch: 15019, Training Loss: 0.01136
Epoch: 15019, Training Loss: 0.01399
Epoch: 15019, Training Loss: 0.01077
Epoch: 15020, Training Loss: 0.01225
Epoch: 15020, Training Loss: 0.01136
Epoch: 15020, Training Loss: 0.01399
Epoch: 15020, Training Loss: 0.01077
Epoch: 15021, Training Loss: 0.01225
Epoch: 15021, Training Loss: 0.01136
Epoch: 15021, Training Loss: 0.01399
Epoch: 15021, Training Loss: 0.01077
Epoch: 15022, Training Loss: 0.01225
Epoch: 15022, Training Loss: 0.01136
Epoch: 15022, Training Loss: 0.01399
Epoch: 15022, Training Loss: 0.01077
Epoch: 15023, Training Loss: 0.01225
Epoch: 15023, Training Loss: 0.01136
Epoch: 15023, Training Loss: 0.01399
Epoch: 15023, Training Loss: 0.01077
Epoch: 15024, Training Loss: 0.01225
Epoch: 15024, Training Loss: 0.01136
Epoch: 15024, Training Loss: 0.01399
Epoch: 15024, Training Loss: 0.01077
Epoch: 15025, Training Loss: 0.01225
Epoch: 15025, Training Loss: 0.01136
Epoch: 15025, Training Loss: 0.01399
Epoch: 15025, Training Loss: 0.01077
Epoch: 15026, Training Loss: 0.01225
Epoch: 15026, Training Loss: 0.01135
Epoch: 15026, Training Loss: 0.01399
Epoch: 15026, Training Loss: 0.01077
Epoch: 15027, Training Loss: 0.01225
Epoch: 15027, Training Loss: 0.01135
Epoch: 15027, Training Loss: 0.01398
Epoch: 15027, Training Loss: 0.01077
Epoch: 15028, Training Loss: 0.01225
Epoch: 15028, Training Loss: 0.01135
Epoch: 15028, Training Loss: 0.01398
Epoch: 15028, Training Loss: 0.01077
Epoch: 15029, Training Loss: 0.01225
Epoch: 15029, Training Loss: 0.01135
Epoch: 15029, Training Loss: 0.01398
Epoch: 15029, Training Loss: 0.01077
Epoch: 15030, Training Loss: 0.01225
Epoch: 15030, Training Loss: 0.01135
Epoch: 15030, Training Loss: 0.01398
Epoch: 15030, Training Loss: 0.01077
Epoch: 15031, Training Loss: 0.01225
Epoch: 15031, Training Loss: 0.01135
Epoch: 15031, Training Loss: 0.01398
Epoch: 15031, Training Loss: 0.01077
Epoch: 15032, Training Loss: 0.01225
Epoch: 15032, Training Loss: 0.01135
Epoch: 15032, Training Loss: 0.01398
Epoch: 15032, Training Loss: 0.01077
Epoch: 15033, Training Loss: 0.01225
Epoch: 15033, Training Loss: 0.01135
Epoch: 15033, Training Loss: 0.01398
Epoch: 15033, Training Loss: 0.01076
Epoch: 15034, Training Loss: 0.01225
Epoch: 15034, Training Loss: 0.01135
Epoch: 15034, Training Loss: 0.01398
Epoch: 15034, Training Loss: 0.01076
Epoch: 15035, Training Loss: 0.01225
Epoch: 15035, Training Loss: 0.01135
Epoch: 15035, Training Loss: 0.01398
Epoch: 15035, Training Loss: 0.01076
Epoch: 15036, Training Loss: 0.01225
Epoch: 15036, Training Loss: 0.01135
Epoch: 15036, Training Loss: 0.01398
Epoch: 15036, Training Loss: 0.01076
Epoch: 15037, Training Loss: 0.01224
Epoch: 15037, Training Loss: 0.01135
Epoch: 15037, Training Loss: 0.01398
Epoch: 15037, Training Loss: 0.01076
Epoch: 15038, Training Loss: 0.01224
Epoch: 15038, Training Loss: 0.01135
Epoch: 15038, Training Loss: 0.01398
Epoch: 15038, Training Loss: 0.01076
Epoch: 15039, Training Loss: 0.01224
Epoch: 15039, Training Loss: 0.01135
Epoch: 15039, Training Loss: 0.01398
Epoch: 15039, Training Loss: 0.01076
Epoch: 15040, Training Loss: 0.01224
Epoch: 15040, Training Loss: 0.01135
Epoch: 15040, Training Loss: 0.01398
Epoch: 15040, Training Loss: 0.01076
Epoch: 15041, Training Loss: 0.01224
Epoch: 15041, Training Loss: 0.01135
Epoch: 15041, Training Loss: 0.01398
Epoch: 15041, Training Loss: 0.01076
Epoch: 15042, Training Loss: 0.01224
Epoch: 15042, Training Loss: 0.01135
Epoch: 15042, Training Loss: 0.01398
Epoch: 15042, Training Loss: 0.01076
Epoch: 15043, Training Loss: 0.01224
Epoch: 15043, Training Loss: 0.01135
Epoch: 15043, Training Loss: 0.01398
Epoch: 15043, Training Loss: 0.01076
Epoch: 15044, Training Loss: 0.01224
Epoch: 15044, Training Loss: 0.01135
Epoch: 15044, Training Loss: 0.01398
Epoch: 15044, Training Loss: 0.01076
Epoch: 15045, Training Loss: 0.01224
Epoch: 15045, Training Loss: 0.01135
Epoch: 15045, Training Loss: 0.01398
Epoch: 15045, Training Loss: 0.01076
Epoch: 15046, Training Loss: 0.01224
Epoch: 15046, Training Loss: 0.01135
Epoch: 15046, Training Loss: 0.01398
Epoch: 15046, Training Loss: 0.01076
Epoch: 15047, Training Loss: 0.01224
Epoch: 15047, Training Loss: 0.01135
Epoch: 15047, Training Loss: 0.01397
Epoch: 15047, Training Loss: 0.01076
Epoch: 15048, Training Loss: 0.01224
Epoch: 15048, Training Loss: 0.01135
Epoch: 15048, Training Loss: 0.01397
Epoch: 15048, Training Loss: 0.01076
Epoch: 15049, Training Loss: 0.01224
Epoch: 15049, Training Loss: 0.01135
Epoch: 15049, Training Loss: 0.01397
Epoch: 15049, Training Loss: 0.01076
Epoch: 15050, Training Loss: 0.01224
Epoch: 15050, Training Loss: 0.01134
Epoch: 15050, Training Loss: 0.01397
Epoch: 15050, Training Loss: 0.01076
Epoch: 15051, Training Loss: 0.01224
Epoch: 15051, Training Loss: 0.01134
Epoch: 15051, Training Loss: 0.01397
Epoch: 15051, Training Loss: 0.01076
Epoch: 15052, Training Loss: 0.01224
Epoch: 15052, Training Loss: 0.01134
Epoch: 15052, Training Loss: 0.01397
Epoch: 15052, Training Loss: 0.01076
Epoch: 15053, Training Loss: 0.01224
Epoch: 15053, Training Loss: 0.01134
Epoch: 15053, Training Loss: 0.01397
Epoch: 15053, Training Loss: 0.01076
Epoch: 15054, Training Loss: 0.01224
Epoch: 15054, Training Loss: 0.01134
Epoch: 15054, Training Loss: 0.01397
Epoch: 15054, Training Loss: 0.01076
Epoch: 15055, Training Loss: 0.01224
Epoch: 15055, Training Loss: 0.01134
Epoch: 15055, Training Loss: 0.01397
Epoch: 15055, Training Loss: 0.01076
Epoch: 15056, Training Loss: 0.01224
Epoch: 15056, Training Loss: 0.01134
Epoch: 15056, Training Loss: 0.01397
Epoch: 15056, Training Loss: 0.01076
Epoch: 15057, Training Loss: 0.01224
Epoch: 15057, Training Loss: 0.01134
Epoch: 15057, Training Loss: 0.01397
Epoch: 15057, Training Loss: 0.01076
Epoch: 15058, Training Loss: 0.01224
Epoch: 15058, Training Loss: 0.01134
Epoch: 15058, Training Loss: 0.01397
Epoch: 15058, Training Loss: 0.01076
Epoch: 15059, Training Loss: 0.01223
Epoch: 15059, Training Loss: 0.01134
Epoch: 15059, Training Loss: 0.01397
Epoch: 15059, Training Loss: 0.01075
Epoch: 15060, Training Loss: 0.01223
Epoch: 15060, Training Loss: 0.01134
Epoch: 15060, Training Loss: 0.01397
Epoch: 15060, Training Loss: 0.01075
Epoch: 15061, Training Loss: 0.01223
Epoch: 15061, Training Loss: 0.01134
Epoch: 15061, Training Loss: 0.01397
Epoch: 15061, Training Loss: 0.01075
Epoch: 15062, Training Loss: 0.01223
Epoch: 15062, Training Loss: 0.01134
Epoch: 15062, Training Loss: 0.01397
Epoch: 15062, Training Loss: 0.01075
Epoch: 15063, Training Loss: 0.01223
Epoch: 15063, Training Loss: 0.01134
Epoch: 15063, Training Loss: 0.01397
Epoch: 15063, Training Loss: 0.01075
Epoch: 15064, Training Loss: 0.01223
Epoch: 15064, Training Loss: 0.01134
Epoch: 15064, Training Loss: 0.01397
Epoch: 15064, Training Loss: 0.01075
Epoch: 15065, Training Loss: 0.01223
Epoch: 15065, Training Loss: 0.01134
Epoch: 15065, Training Loss: 0.01397
Epoch: 15065, Training Loss: 0.01075
Epoch: 15066, Training Loss: 0.01223
Epoch: 15066, Training Loss: 0.01134
Epoch: 15066, Training Loss: 0.01396
Epoch: 15066, Training Loss: 0.01075
Epoch: 15067, Training Loss: 0.01223
Epoch: 15067, Training Loss: 0.01134
Epoch: 15067, Training Loss: 0.01396
Epoch: 15067, Training Loss: 0.01075
Epoch: 15068, Training Loss: 0.01223
Epoch: 15068, Training Loss: 0.01134
Epoch: 15068, Training Loss: 0.01396
Epoch: 15068, Training Loss: 0.01075
Epoch: 15069, Training Loss: 0.01223
Epoch: 15069, Training Loss: 0.01134
Epoch: 15069, Training Loss: 0.01396
Epoch: 15069, Training Loss: 0.01075
Epoch: 15070, Training Loss: 0.01223
Epoch: 15070, Training Loss: 0.01134
Epoch: 15070, Training Loss: 0.01396
Epoch: 15070, Training Loss: 0.01075
Epoch: 15071, Training Loss: 0.01223
Epoch: 15071, Training Loss: 0.01134
Epoch: 15071, Training Loss: 0.01396
Epoch: 15071, Training Loss: 0.01075
Epoch: 15072, Training Loss: 0.01223
Epoch: 15072, Training Loss: 0.01134
Epoch: 15072, Training Loss: 0.01396
Epoch: 15072, Training Loss: 0.01075
Epoch: 15073, Training Loss: 0.01223
Epoch: 15073, Training Loss: 0.01134
Epoch: 15073, Training Loss: 0.01396
Epoch: 15073, Training Loss: 0.01075
Epoch: 15074, Training Loss: 0.01223
Epoch: 15074, Training Loss: 0.01134
Epoch: 15074, Training Loss: 0.01396
Epoch: 15074, Training Loss: 0.01075
Epoch: 15075, Training Loss: 0.01223
Epoch: 15075, Training Loss: 0.01133
Epoch: 15075, Training Loss: 0.01396
Epoch: 15075, Training Loss: 0.01075
Epoch: 15076, Training Loss: 0.01223
Epoch: 15076, Training Loss: 0.01133
Epoch: 15076, Training Loss: 0.01396
Epoch: 15076, Training Loss: 0.01075
Epoch: 15077, Training Loss: 0.01223
Epoch: 15077, Training Loss: 0.01133
Epoch: 15077, Training Loss: 0.01396
Epoch: 15077, Training Loss: 0.01075
Epoch: 15078, Training Loss: 0.01223
Epoch: 15078, Training Loss: 0.01133
Epoch: 15078, Training Loss: 0.01396
Epoch: 15078, Training Loss: 0.01075
Epoch: 15079, Training Loss: 0.01223
Epoch: 15079, Training Loss: 0.01133
Epoch: 15079, Training Loss: 0.01396
Epoch: 15079, Training Loss: 0.01075
Epoch: 15080, Training Loss: 0.01223
Epoch: 15080, Training Loss: 0.01133
Epoch: 15080, Training Loss: 0.01396
Epoch: 15080, Training Loss: 0.01075
Epoch: 15081, Training Loss: 0.01223
Epoch: 15081, Training Loss: 0.01133
Epoch: 15081, Training Loss: 0.01396
Epoch: 15081, Training Loss: 0.01075
Epoch: 15082, Training Loss: 0.01222
Epoch: 15082, Training Loss: 0.01133
Epoch: 15082, Training Loss: 0.01396
Epoch: 15082, Training Loss: 0.01075
Epoch: 15083, Training Loss: 0.01222
Epoch: 15083, Training Loss: 0.01133
Epoch: 15083, Training Loss: 0.01396
Epoch: 15083, Training Loss: 0.01075
Epoch: 15084, Training Loss: 0.01222
Epoch: 15084, Training Loss: 0.01133
Epoch: 15084, Training Loss: 0.01396
Epoch: 15084, Training Loss: 0.01075
Epoch: 15085, Training Loss: 0.01222
Epoch: 15085, Training Loss: 0.01133
Epoch: 15085, Training Loss: 0.01396
Epoch: 15085, Training Loss: 0.01074
Epoch: 15086, Training Loss: 0.01222
Epoch: 15086, Training Loss: 0.01133
Epoch: 15086, Training Loss: 0.01395
Epoch: 15086, Training Loss: 0.01074
Epoch: 15087, Training Loss: 0.01222
Epoch: 15087, Training Loss: 0.01133
Epoch: 15087, Training Loss: 0.01395
Epoch: 15087, Training Loss: 0.01074
Epoch: 15088, Training Loss: 0.01222
Epoch: 15088, Training Loss: 0.01133
Epoch: 15088, Training Loss: 0.01395
Epoch: 15088, Training Loss: 0.01074
Epoch: 15089, Training Loss: 0.01222
Epoch: 15089, Training Loss: 0.01133
Epoch: 15089, Training Loss: 0.01395
Epoch: 15089, Training Loss: 0.01074
Epoch: 15090, Training Loss: 0.01222
Epoch: 15090, Training Loss: 0.01133
Epoch: 15090, Training Loss: 0.01395
Epoch: 15090, Training Loss: 0.01074
Epoch: 15091, Training Loss: 0.01222
Epoch: 15091, Training Loss: 0.01133
Epoch: 15091, Training Loss: 0.01395
Epoch: 15091, Training Loss: 0.01074
Epoch: 15092, Training Loss: 0.01222
Epoch: 15092, Training Loss: 0.01133
Epoch: 15092, Training Loss: 0.01395
Epoch: 15092, Training Loss: 0.01074
Epoch: 15093, Training Loss: 0.01222
Epoch: 15093, Training Loss: 0.01133
Epoch: 15093, Training Loss: 0.01395
Epoch: 15093, Training Loss: 0.01074
Epoch: 15094, Training Loss: 0.01222
Epoch: 15094, Training Loss: 0.01133
Epoch: 15094, Training Loss: 0.01395
Epoch: 15094, Training Loss: 0.01074
Epoch: 15095, Training Loss: 0.01222
Epoch: 15095, Training Loss: 0.01133
Epoch: 15095, Training Loss: 0.01395
Epoch: 15095, Training Loss: 0.01074
Epoch: 15096, Training Loss: 0.01222
Epoch: 15096, Training Loss: 0.01133
Epoch: 15096, Training Loss: 0.01395
Epoch: 15096, Training Loss: 0.01074
Epoch: 15097, Training Loss: 0.01222
Epoch: 15097, Training Loss: 0.01133
Epoch: 15097, Training Loss: 0.01395
Epoch: 15097, Training Loss: 0.01074
Epoch: 15098, Training Loss: 0.01222
Epoch: 15098, Training Loss: 0.01133
Epoch: 15098, Training Loss: 0.01395
Epoch: 15098, Training Loss: 0.01074
Epoch: 15099, Training Loss: 0.01222
Epoch: 15099, Training Loss: 0.01133
Epoch: 15099, Training Loss: 0.01395
Epoch: 15099, Training Loss: 0.01074
Epoch: 15100, Training Loss: 0.01222
Epoch: 15100, Training Loss: 0.01132
Epoch: 15100, Training Loss: 0.01395
Epoch: 15100, Training Loss: 0.01074
Epoch: 15101, Training Loss: 0.01222
Epoch: 15101, Training Loss: 0.01132
Epoch: 15101, Training Loss: 0.01395
Epoch: 15101, Training Loss: 0.01074
Epoch: 15102, Training Loss: 0.01222
Epoch: 15102, Training Loss: 0.01132
Epoch: 15102, Training Loss: 0.01395
Epoch: 15102, Training Loss: 0.01074
Epoch: 15103, Training Loss: 0.01222
Epoch: 15103, Training Loss: 0.01132
Epoch: 15103, Training Loss: 0.01395
Epoch: 15103, Training Loss: 0.01074
Epoch: 15104, Training Loss: 0.01221
Epoch: 15104, Training Loss: 0.01132
Epoch: 15104, Training Loss: 0.01395
Epoch: 15104, Training Loss: 0.01074
Epoch: 15105, Training Loss: 0.01221
Epoch: 15105, Training Loss: 0.01132
Epoch: 15105, Training Loss: 0.01395
Epoch: 15105, Training Loss: 0.01074
Epoch: 15106, Training Loss: 0.01221
Epoch: 15106, Training Loss: 0.01132
Epoch: 15106, Training Loss: 0.01394
Epoch: 15106, Training Loss: 0.01074
Epoch: 15107, Training Loss: 0.01221
Epoch: 15107, Training Loss: 0.01132
Epoch: 15107, Training Loss: 0.01394
Epoch: 15107, Training Loss: 0.01074
Epoch: 15108, Training Loss: 0.01221
Epoch: 15108, Training Loss: 0.01132
Epoch: 15108, Training Loss: 0.01394
Epoch: 15108, Training Loss: 0.01074
Epoch: 15109, Training Loss: 0.01221
Epoch: 15109, Training Loss: 0.01132
Epoch: 15109, Training Loss: 0.01394
Epoch: 15109, Training Loss: 0.01074
Epoch: 15110, Training Loss: 0.01221
Epoch: 15110, Training Loss: 0.01132
Epoch: 15110, Training Loss: 0.01394
Epoch: 15110, Training Loss: 0.01074
Epoch: 15111, Training Loss: 0.01221
Epoch: 15111, Training Loss: 0.01132
Epoch: 15111, Training Loss: 0.01394
Epoch: 15111, Training Loss: 0.01073
Epoch: 15112, Training Loss: 0.01221
Epoch: 15112, Training Loss: 0.01132
Epoch: 15112, Training Loss: 0.01394
Epoch: 15112, Training Loss: 0.01073
Epoch: 15113, Training Loss: 0.01221
Epoch: 15113, Training Loss: 0.01132
Epoch: 15113, Training Loss: 0.01394
Epoch: 15113, Training Loss: 0.01073
Epoch: 15114, Training Loss: 0.01221
Epoch: 15114, Training Loss: 0.01132
Epoch: 15114, Training Loss: 0.01394
Epoch: 15114, Training Loss: 0.01073
Epoch: 15115, Training Loss: 0.01221
Epoch: 15115, Training Loss: 0.01132
Epoch: 15115, Training Loss: 0.01394
Epoch: 15115, Training Loss: 0.01073
Epoch: 15116, Training Loss: 0.01221
Epoch: 15116, Training Loss: 0.01132
Epoch: 15116, Training Loss: 0.01394
Epoch: 15116, Training Loss: 0.01073
Epoch: 15117, Training Loss: 0.01221
Epoch: 15117, Training Loss: 0.01132
Epoch: 15117, Training Loss: 0.01394
Epoch: 15117, Training Loss: 0.01073
Epoch: 15118, Training Loss: 0.01221
Epoch: 15118, Training Loss: 0.01132
Epoch: 15118, Training Loss: 0.01394
Epoch: 15118, Training Loss: 0.01073
Epoch: 15119, Training Loss: 0.01221
Epoch: 15119, Training Loss: 0.01132
Epoch: 15119, Training Loss: 0.01394
Epoch: 15119, Training Loss: 0.01073
Epoch: 15120, Training Loss: 0.01221
Epoch: 15120, Training Loss: 0.01132
Epoch: 15120, Training Loss: 0.01394
Epoch: 15120, Training Loss: 0.01073
Epoch: 15121, Training Loss: 0.01221
Epoch: 15121, Training Loss: 0.01132
Epoch: 15121, Training Loss: 0.01394
Epoch: 15121, Training Loss: 0.01073
Epoch: 15122, Training Loss: 0.01221
Epoch: 15122, Training Loss: 0.01132
Epoch: 15122, Training Loss: 0.01394
Epoch: 15122, Training Loss: 0.01073
Epoch: 15123, Training Loss: 0.01221
Epoch: 15123, Training Loss: 0.01132
Epoch: 15123, Training Loss: 0.01394
Epoch: 15123, Training Loss: 0.01073
Epoch: 15124, Training Loss: 0.01221
Epoch: 15124, Training Loss: 0.01132
Epoch: 15124, Training Loss: 0.01394
Epoch: 15124, Training Loss: 0.01073
Epoch: 15125, Training Loss: 0.01221
Epoch: 15125, Training Loss: 0.01131
Epoch: 15125, Training Loss: 0.01393
Epoch: 15125, Training Loss: 0.01073
Epoch: 15126, Training Loss: 0.01221
Epoch: 15126, Training Loss: 0.01131
Epoch: 15126, Training Loss: 0.01393
Epoch: 15126, Training Loss: 0.01073
Epoch: 15127, Training Loss: 0.01220
Epoch: 15127, Training Loss: 0.01131
Epoch: 15127, Training Loss: 0.01393
Epoch: 15127, Training Loss: 0.01073
Epoch: 15128, Training Loss: 0.01220
Epoch: 15128, Training Loss: 0.01131
Epoch: 15128, Training Loss: 0.01393
Epoch: 15128, Training Loss: 0.01073
Epoch: 15129, Training Loss: 0.01220
Epoch: 15129, Training Loss: 0.01131
Epoch: 15129, Training Loss: 0.01393
Epoch: 15129, Training Loss: 0.01073
Epoch: 15130, Training Loss: 0.01220
Epoch: 15130, Training Loss: 0.01131
Epoch: 15130, Training Loss: 0.01393
Epoch: 15130, Training Loss: 0.01073
Epoch: 15131, Training Loss: 0.01220
Epoch: 15131, Training Loss: 0.01131
Epoch: 15131, Training Loss: 0.01393
Epoch: 15131, Training Loss: 0.01073
Epoch: 15132, Training Loss: 0.01220
Epoch: 15132, Training Loss: 0.01131
Epoch: 15132, Training Loss: 0.01393
Epoch: 15132, Training Loss: 0.01073
Epoch: 15133, Training Loss: 0.01220
Epoch: 15133, Training Loss: 0.01131
Epoch: 15133, Training Loss: 0.01393
Epoch: 15133, Training Loss: 0.01073
Epoch: 15134, Training Loss: 0.01220
Epoch: 15134, Training Loss: 0.01131
Epoch: 15134, Training Loss: 0.01393
Epoch: 15134, Training Loss: 0.01073
Epoch: 15135, Training Loss: 0.01220
Epoch: 15135, Training Loss: 0.01131
Epoch: 15135, Training Loss: 0.01393
Epoch: 15135, Training Loss: 0.01073
Epoch: 15136, Training Loss: 0.01220
Epoch: 15136, Training Loss: 0.01131
Epoch: 15136, Training Loss: 0.01393
Epoch: 15136, Training Loss: 0.01073
Epoch: 15137, Training Loss: 0.01220
Epoch: 15137, Training Loss: 0.01131
Epoch: 15137, Training Loss: 0.01393
Epoch: 15137, Training Loss: 0.01072
Epoch: 15138, Training Loss: 0.01220
Epoch: 15138, Training Loss: 0.01131
Epoch: 15138, Training Loss: 0.01393
Epoch: 15138, Training Loss: 0.01072
Epoch: 15139, Training Loss: 0.01220
Epoch: 15139, Training Loss: 0.01131
Epoch: 15139, Training Loss: 0.01393
Epoch: 15139, Training Loss: 0.01072
Epoch: 15140, Training Loss: 0.01220
Epoch: 15140, Training Loss: 0.01131
Epoch: 15140, Training Loss: 0.01393
Epoch: 15140, Training Loss: 0.01072
Epoch: 15141, Training Loss: 0.01220
Epoch: 15141, Training Loss: 0.01131
Epoch: 15141, Training Loss: 0.01393
Epoch: 15141, Training Loss: 0.01072
Epoch: 15142, Training Loss: 0.01220
Epoch: 15142, Training Loss: 0.01131
Epoch: 15142, Training Loss: 0.01393
Epoch: 15142, Training Loss: 0.01072
Epoch: 15143, Training Loss: 0.01220
Epoch: 15143, Training Loss: 0.01131
Epoch: 15143, Training Loss: 0.01393
Epoch: 15143, Training Loss: 0.01072
Epoch: 15144, Training Loss: 0.01220
Epoch: 15144, Training Loss: 0.01131
Epoch: 15144, Training Loss: 0.01393
Epoch: 15144, Training Loss: 0.01072
Epoch: 15145, Training Loss: 0.01220
Epoch: 15145, Training Loss: 0.01131
Epoch: 15145, Training Loss: 0.01392
Epoch: 15145, Training Loss: 0.01072
Epoch: 15146, Training Loss: 0.01220
Epoch: 15146, Training Loss: 0.01131
Epoch: 15146, Training Loss: 0.01392
Epoch: 15146, Training Loss: 0.01072
Epoch: 15147, Training Loss: 0.01220
Epoch: 15147, Training Loss: 0.01131
Epoch: 15147, Training Loss: 0.01392
Epoch: 15147, Training Loss: 0.01072
Epoch: 15148, Training Loss: 0.01220
Epoch: 15148, Training Loss: 0.01131
Epoch: 15148, Training Loss: 0.01392
Epoch: 15148, Training Loss: 0.01072
Epoch: 15149, Training Loss: 0.01219
Epoch: 15149, Training Loss: 0.01131
Epoch: 15149, Training Loss: 0.01392
Epoch: 15149, Training Loss: 0.01072
Epoch: 15150, Training Loss: 0.01219
Epoch: 15150, Training Loss: 0.01130
Epoch: 15150, Training Loss: 0.01392
Epoch: 15150, Training Loss: 0.01072
Epoch: 15151, Training Loss: 0.01219
Epoch: 15151, Training Loss: 0.01130
Epoch: 15151, Training Loss: 0.01392
Epoch: 15151, Training Loss: 0.01072
Epoch: 15152, Training Loss: 0.01219
Epoch: 15152, Training Loss: 0.01130
Epoch: 15152, Training Loss: 0.01392
Epoch: 15152, Training Loss: 0.01072
Epoch: 15153, Training Loss: 0.01219
Epoch: 15153, Training Loss: 0.01130
Epoch: 15153, Training Loss: 0.01392
Epoch: 15153, Training Loss: 0.01072
Epoch: 15154, Training Loss: 0.01219
Epoch: 15154, Training Loss: 0.01130
Epoch: 15154, Training Loss: 0.01392
Epoch: 15154, Training Loss: 0.01072
Epoch: 15155, Training Loss: 0.01219
Epoch: 15155, Training Loss: 0.01130
Epoch: 15155, Training Loss: 0.01392
Epoch: 15155, Training Loss: 0.01072
Epoch: 15156, Training Loss: 0.01219
Epoch: 15156, Training Loss: 0.01130
Epoch: 15156, Training Loss: 0.01392
Epoch: 15156, Training Loss: 0.01072
Epoch: 15157, Training Loss: 0.01219
Epoch: 15157, Training Loss: 0.01130
Epoch: 15157, Training Loss: 0.01392
Epoch: 15157, Training Loss: 0.01072
Epoch: 15158, Training Loss: 0.01219
Epoch: 15158, Training Loss: 0.01130
Epoch: 15158, Training Loss: 0.01392
Epoch: 15158, Training Loss: 0.01072
Epoch: 15159, Training Loss: 0.01219
Epoch: 15159, Training Loss: 0.01130
Epoch: 15159, Training Loss: 0.01392
Epoch: 15159, Training Loss: 0.01072
Epoch: 15160, Training Loss: 0.01219
Epoch: 15160, Training Loss: 0.01130
Epoch: 15160, Training Loss: 0.01392
Epoch: 15160, Training Loss: 0.01072
Epoch: 15161, Training Loss: 0.01219
Epoch: 15161, Training Loss: 0.01130
Epoch: 15161, Training Loss: 0.01392
Epoch: 15161, Training Loss: 0.01072
Epoch: 15162, Training Loss: 0.01219
Epoch: 15162, Training Loss: 0.01130
Epoch: 15162, Training Loss: 0.01392
Epoch: 15162, Training Loss: 0.01072
Epoch: 15163, Training Loss: 0.01219
Epoch: 15163, Training Loss: 0.01130
Epoch: 15163, Training Loss: 0.01392
Epoch: 15163, Training Loss: 0.01071
Epoch: 15164, Training Loss: 0.01219
Epoch: 15164, Training Loss: 0.01130
Epoch: 15164, Training Loss: 0.01392
Epoch: 15164, Training Loss: 0.01071
Epoch: 15165, Training Loss: 0.01219
Epoch: 15165, Training Loss: 0.01130
Epoch: 15165, Training Loss: 0.01391
Epoch: 15165, Training Loss: 0.01071
Epoch: 15166, Training Loss: 0.01219
Epoch: 15166, Training Loss: 0.01130
Epoch: 15166, Training Loss: 0.01391
Epoch: 15166, Training Loss: 0.01071
Epoch: 15167, Training Loss: 0.01219
Epoch: 15167, Training Loss: 0.01130
Epoch: 15167, Training Loss: 0.01391
Epoch: 15167, Training Loss: 0.01071
Epoch: 15168, Training Loss: 0.01219
Epoch: 15168, Training Loss: 0.01130
Epoch: 15168, Training Loss: 0.01391
Epoch: 15168, Training Loss: 0.01071
Epoch: 15169, Training Loss: 0.01219
Epoch: 15169, Training Loss: 0.01130
Epoch: 15169, Training Loss: 0.01391
Epoch: 15169, Training Loss: 0.01071
Epoch: 15170, Training Loss: 0.01219
Epoch: 15170, Training Loss: 0.01130
Epoch: 15170, Training Loss: 0.01391
Epoch: 15170, Training Loss: 0.01071
Epoch: 15171, Training Loss: 0.01219
Epoch: 15171, Training Loss: 0.01130
Epoch: 15171, Training Loss: 0.01391
Epoch: 15171, Training Loss: 0.01071
Epoch: 15172, Training Loss: 0.01218
Epoch: 15172, Training Loss: 0.01130
Epoch: 15172, Training Loss: 0.01391
Epoch: 15172, Training Loss: 0.01071
Epoch: 15173, Training Loss: 0.01218
Epoch: 15173, Training Loss: 0.01130
Epoch: 15173, Training Loss: 0.01391
Epoch: 15173, Training Loss: 0.01071
Epoch: 15174, Training Loss: 0.01218
Epoch: 15174, Training Loss: 0.01130
Epoch: 15174, Training Loss: 0.01391
Epoch: 15174, Training Loss: 0.01071
Epoch: 15175, Training Loss: 0.01218
Epoch: 15175, Training Loss: 0.01129
Epoch: 15175, Training Loss: 0.01391
Epoch: 15175, Training Loss: 0.01071
Epoch: 15176, Training Loss: 0.01218
Epoch: 15176, Training Loss: 0.01129
Epoch: 15176, Training Loss: 0.01391
Epoch: 15176, Training Loss: 0.01071
Epoch: 15177, Training Loss: 0.01218
Epoch: 15177, Training Loss: 0.01129
Epoch: 15177, Training Loss: 0.01391
Epoch: 15177, Training Loss: 0.01071
Epoch: 15178, Training Loss: 0.01218
Epoch: 15178, Training Loss: 0.01129
Epoch: 15178, Training Loss: 0.01391
Epoch: 15178, Training Loss: 0.01071
... (four per-sample losses printed every epoch; values decrease very slowly over epochs 15178-15421) ...
Epoch: 15421, Training Loss: 0.01208
Epoch: 15421, Training Loss: 0.01120
Epoch: 15421, Training Loss: 0.01379
Epoch: 15421, Training Loss: 0.01062
Epoch: 15422, Training Loss: 0.01208
Epoch: 15422, Training Loss: 0.01120
Epoch: 15422, Training Loss: 0.01379
Epoch: 15422, Training Loss: 0.01062
Epoch: 15423, Training Loss: 0.01208
Epoch: 15423, Training Loss: 0.01120
Epoch: 15423, Training Loss: 0.01379
Epoch: 15423, Training Loss: 0.01062
Epoch: 15424, Training Loss: 0.01208
Epoch: 15424, Training Loss: 0.01120
Epoch: 15424, Training Loss: 0.01379
Epoch: 15424, Training Loss: 0.01062
Epoch: 15425, Training Loss: 0.01207
Epoch: 15425, Training Loss: 0.01120
Epoch: 15425, Training Loss: 0.01379
Epoch: 15425, Training Loss: 0.01062
Epoch: 15426, Training Loss: 0.01207
Epoch: 15426, Training Loss: 0.01120
Epoch: 15426, Training Loss: 0.01378
Epoch: 15426, Training Loss: 0.01062
Epoch: 15427, Training Loss: 0.01207
Epoch: 15427, Training Loss: 0.01120
Epoch: 15427, Training Loss: 0.01378
Epoch: 15427, Training Loss: 0.01062
Epoch: 15428, Training Loss: 0.01207
Epoch: 15428, Training Loss: 0.01119
Epoch: 15428, Training Loss: 0.01378
Epoch: 15428, Training Loss: 0.01062
Epoch: 15429, Training Loss: 0.01207
Epoch: 15429, Training Loss: 0.01119
Epoch: 15429, Training Loss: 0.01378
Epoch: 15429, Training Loss: 0.01061
Epoch: 15430, Training Loss: 0.01207
Epoch: 15430, Training Loss: 0.01119
Epoch: 15430, Training Loss: 0.01378
Epoch: 15430, Training Loss: 0.01061
Epoch: 15431, Training Loss: 0.01207
Epoch: 15431, Training Loss: 0.01119
Epoch: 15431, Training Loss: 0.01378
Epoch: 15431, Training Loss: 0.01061
Epoch: 15432, Training Loss: 0.01207
Epoch: 15432, Training Loss: 0.01119
Epoch: 15432, Training Loss: 0.01378
Epoch: 15432, Training Loss: 0.01061
Epoch: 15433, Training Loss: 0.01207
Epoch: 15433, Training Loss: 0.01119
Epoch: 15433, Training Loss: 0.01378
Epoch: 15433, Training Loss: 0.01061
Epoch: 15434, Training Loss: 0.01207
Epoch: 15434, Training Loss: 0.01119
Epoch: 15434, Training Loss: 0.01378
Epoch: 15434, Training Loss: 0.01061
Epoch: 15435, Training Loss: 0.01207
Epoch: 15435, Training Loss: 0.01119
Epoch: 15435, Training Loss: 0.01378
Epoch: 15435, Training Loss: 0.01061
Epoch: 15436, Training Loss: 0.01207
Epoch: 15436, Training Loss: 0.01119
Epoch: 15436, Training Loss: 0.01378
Epoch: 15436, Training Loss: 0.01061
Epoch: 15437, Training Loss: 0.01207
Epoch: 15437, Training Loss: 0.01119
Epoch: 15437, Training Loss: 0.01378
Epoch: 15437, Training Loss: 0.01061
Epoch: 15438, Training Loss: 0.01207
Epoch: 15438, Training Loss: 0.01119
Epoch: 15438, Training Loss: 0.01378
Epoch: 15438, Training Loss: 0.01061
Epoch: 15439, Training Loss: 0.01207
Epoch: 15439, Training Loss: 0.01119
Epoch: 15439, Training Loss: 0.01378
Epoch: 15439, Training Loss: 0.01061
Epoch: 15440, Training Loss: 0.01207
Epoch: 15440, Training Loss: 0.01119
Epoch: 15440, Training Loss: 0.01378
Epoch: 15440, Training Loss: 0.01061
Epoch: 15441, Training Loss: 0.01207
Epoch: 15441, Training Loss: 0.01119
Epoch: 15441, Training Loss: 0.01378
Epoch: 15441, Training Loss: 0.01061
Epoch: 15442, Training Loss: 0.01207
Epoch: 15442, Training Loss: 0.01119
Epoch: 15442, Training Loss: 0.01378
Epoch: 15442, Training Loss: 0.01061
Epoch: 15443, Training Loss: 0.01207
Epoch: 15443, Training Loss: 0.01119
Epoch: 15443, Training Loss: 0.01378
Epoch: 15443, Training Loss: 0.01061
Epoch: 15444, Training Loss: 0.01207
Epoch: 15444, Training Loss: 0.01119
Epoch: 15444, Training Loss: 0.01378
Epoch: 15444, Training Loss: 0.01061
Epoch: 15445, Training Loss: 0.01207
Epoch: 15445, Training Loss: 0.01119
Epoch: 15445, Training Loss: 0.01378
Epoch: 15445, Training Loss: 0.01061
Epoch: 15446, Training Loss: 0.01207
Epoch: 15446, Training Loss: 0.01119
Epoch: 15446, Training Loss: 0.01377
Epoch: 15446, Training Loss: 0.01061
Epoch: 15447, Training Loss: 0.01207
Epoch: 15447, Training Loss: 0.01119
Epoch: 15447, Training Loss: 0.01377
Epoch: 15447, Training Loss: 0.01061
Epoch: 15448, Training Loss: 0.01206
Epoch: 15448, Training Loss: 0.01119
Epoch: 15448, Training Loss: 0.01377
Epoch: 15448, Training Loss: 0.01061
Epoch: 15449, Training Loss: 0.01206
Epoch: 15449, Training Loss: 0.01119
Epoch: 15449, Training Loss: 0.01377
Epoch: 15449, Training Loss: 0.01061
Epoch: 15450, Training Loss: 0.01206
Epoch: 15450, Training Loss: 0.01119
Epoch: 15450, Training Loss: 0.01377
Epoch: 15450, Training Loss: 0.01061
Epoch: 15451, Training Loss: 0.01206
Epoch: 15451, Training Loss: 0.01119
Epoch: 15451, Training Loss: 0.01377
Epoch: 15451, Training Loss: 0.01061
Epoch: 15452, Training Loss: 0.01206
Epoch: 15452, Training Loss: 0.01119
Epoch: 15452, Training Loss: 0.01377
Epoch: 15452, Training Loss: 0.01061
Epoch: 15453, Training Loss: 0.01206
Epoch: 15453, Training Loss: 0.01119
Epoch: 15453, Training Loss: 0.01377
Epoch: 15453, Training Loss: 0.01061
Epoch: 15454, Training Loss: 0.01206
Epoch: 15454, Training Loss: 0.01118
Epoch: 15454, Training Loss: 0.01377
Epoch: 15454, Training Loss: 0.01061
Epoch: 15455, Training Loss: 0.01206
Epoch: 15455, Training Loss: 0.01118
Epoch: 15455, Training Loss: 0.01377
Epoch: 15455, Training Loss: 0.01061
Epoch: 15456, Training Loss: 0.01206
Epoch: 15456, Training Loss: 0.01118
Epoch: 15456, Training Loss: 0.01377
Epoch: 15456, Training Loss: 0.01060
Epoch: 15457, Training Loss: 0.01206
Epoch: 15457, Training Loss: 0.01118
Epoch: 15457, Training Loss: 0.01377
Epoch: 15457, Training Loss: 0.01060
Epoch: 15458, Training Loss: 0.01206
Epoch: 15458, Training Loss: 0.01118
Epoch: 15458, Training Loss: 0.01377
Epoch: 15458, Training Loss: 0.01060
Epoch: 15459, Training Loss: 0.01206
Epoch: 15459, Training Loss: 0.01118
Epoch: 15459, Training Loss: 0.01377
Epoch: 15459, Training Loss: 0.01060
Epoch: 15460, Training Loss: 0.01206
Epoch: 15460, Training Loss: 0.01118
Epoch: 15460, Training Loss: 0.01377
Epoch: 15460, Training Loss: 0.01060
Epoch: 15461, Training Loss: 0.01206
Epoch: 15461, Training Loss: 0.01118
Epoch: 15461, Training Loss: 0.01377
Epoch: 15461, Training Loss: 0.01060
Epoch: 15462, Training Loss: 0.01206
Epoch: 15462, Training Loss: 0.01118
Epoch: 15462, Training Loss: 0.01377
Epoch: 15462, Training Loss: 0.01060
Epoch: 15463, Training Loss: 0.01206
Epoch: 15463, Training Loss: 0.01118
Epoch: 15463, Training Loss: 0.01377
Epoch: 15463, Training Loss: 0.01060
Epoch: 15464, Training Loss: 0.01206
Epoch: 15464, Training Loss: 0.01118
Epoch: 15464, Training Loss: 0.01377
Epoch: 15464, Training Loss: 0.01060
Epoch: 15465, Training Loss: 0.01206
Epoch: 15465, Training Loss: 0.01118
Epoch: 15465, Training Loss: 0.01377
Epoch: 15465, Training Loss: 0.01060
Epoch: 15466, Training Loss: 0.01206
Epoch: 15466, Training Loss: 0.01118
Epoch: 15466, Training Loss: 0.01377
Epoch: 15466, Training Loss: 0.01060
Epoch: 15467, Training Loss: 0.01206
Epoch: 15467, Training Loss: 0.01118
Epoch: 15467, Training Loss: 0.01376
Epoch: 15467, Training Loss: 0.01060
Epoch: 15468, Training Loss: 0.01206
Epoch: 15468, Training Loss: 0.01118
Epoch: 15468, Training Loss: 0.01376
Epoch: 15468, Training Loss: 0.01060
Epoch: 15469, Training Loss: 0.01206
Epoch: 15469, Training Loss: 0.01118
Epoch: 15469, Training Loss: 0.01376
Epoch: 15469, Training Loss: 0.01060
Epoch: 15470, Training Loss: 0.01206
Epoch: 15470, Training Loss: 0.01118
Epoch: 15470, Training Loss: 0.01376
Epoch: 15470, Training Loss: 0.01060
Epoch: 15471, Training Loss: 0.01205
Epoch: 15471, Training Loss: 0.01118
Epoch: 15471, Training Loss: 0.01376
Epoch: 15471, Training Loss: 0.01060
Epoch: 15472, Training Loss: 0.01205
Epoch: 15472, Training Loss: 0.01118
Epoch: 15472, Training Loss: 0.01376
Epoch: 15472, Training Loss: 0.01060
Epoch: 15473, Training Loss: 0.01205
Epoch: 15473, Training Loss: 0.01118
Epoch: 15473, Training Loss: 0.01376
Epoch: 15473, Training Loss: 0.01060
Epoch: 15474, Training Loss: 0.01205
Epoch: 15474, Training Loss: 0.01118
Epoch: 15474, Training Loss: 0.01376
Epoch: 15474, Training Loss: 0.01060
Epoch: 15475, Training Loss: 0.01205
Epoch: 15475, Training Loss: 0.01118
Epoch: 15475, Training Loss: 0.01376
Epoch: 15475, Training Loss: 0.01060
Epoch: 15476, Training Loss: 0.01205
Epoch: 15476, Training Loss: 0.01118
Epoch: 15476, Training Loss: 0.01376
Epoch: 15476, Training Loss: 0.01060
Epoch: 15477, Training Loss: 0.01205
Epoch: 15477, Training Loss: 0.01118
Epoch: 15477, Training Loss: 0.01376
Epoch: 15477, Training Loss: 0.01060
Epoch: 15478, Training Loss: 0.01205
Epoch: 15478, Training Loss: 0.01118
Epoch: 15478, Training Loss: 0.01376
Epoch: 15478, Training Loss: 0.01060
Epoch: 15479, Training Loss: 0.01205
Epoch: 15479, Training Loss: 0.01118
Epoch: 15479, Training Loss: 0.01376
Epoch: 15479, Training Loss: 0.01060
Epoch: 15480, Training Loss: 0.01205
Epoch: 15480, Training Loss: 0.01117
Epoch: 15480, Training Loss: 0.01376
Epoch: 15480, Training Loss: 0.01060
Epoch: 15481, Training Loss: 0.01205
Epoch: 15481, Training Loss: 0.01117
Epoch: 15481, Training Loss: 0.01376
Epoch: 15481, Training Loss: 0.01060
Epoch: 15482, Training Loss: 0.01205
Epoch: 15482, Training Loss: 0.01117
Epoch: 15482, Training Loss: 0.01376
Epoch: 15482, Training Loss: 0.01060
Epoch: 15483, Training Loss: 0.01205
Epoch: 15483, Training Loss: 0.01117
Epoch: 15483, Training Loss: 0.01376
Epoch: 15483, Training Loss: 0.01059
Epoch: 15484, Training Loss: 0.01205
Epoch: 15484, Training Loss: 0.01117
Epoch: 15484, Training Loss: 0.01376
Epoch: 15484, Training Loss: 0.01059
Epoch: 15485, Training Loss: 0.01205
Epoch: 15485, Training Loss: 0.01117
Epoch: 15485, Training Loss: 0.01376
Epoch: 15485, Training Loss: 0.01059
Epoch: 15486, Training Loss: 0.01205
Epoch: 15486, Training Loss: 0.01117
Epoch: 15486, Training Loss: 0.01376
Epoch: 15486, Training Loss: 0.01059
Epoch: 15487, Training Loss: 0.01205
Epoch: 15487, Training Loss: 0.01117
Epoch: 15487, Training Loss: 0.01375
Epoch: 15487, Training Loss: 0.01059
Epoch: 15488, Training Loss: 0.01205
Epoch: 15488, Training Loss: 0.01117
Epoch: 15488, Training Loss: 0.01375
Epoch: 15488, Training Loss: 0.01059
Epoch: 15489, Training Loss: 0.01205
Epoch: 15489, Training Loss: 0.01117
Epoch: 15489, Training Loss: 0.01375
Epoch: 15489, Training Loss: 0.01059
Epoch: 15490, Training Loss: 0.01205
Epoch: 15490, Training Loss: 0.01117
Epoch: 15490, Training Loss: 0.01375
Epoch: 15490, Training Loss: 0.01059
Epoch: 15491, Training Loss: 0.01205
Epoch: 15491, Training Loss: 0.01117
Epoch: 15491, Training Loss: 0.01375
Epoch: 15491, Training Loss: 0.01059
Epoch: 15492, Training Loss: 0.01205
Epoch: 15492, Training Loss: 0.01117
Epoch: 15492, Training Loss: 0.01375
Epoch: 15492, Training Loss: 0.01059
Epoch: 15493, Training Loss: 0.01205
Epoch: 15493, Training Loss: 0.01117
Epoch: 15493, Training Loss: 0.01375
Epoch: 15493, Training Loss: 0.01059
Epoch: 15494, Training Loss: 0.01205
Epoch: 15494, Training Loss: 0.01117
Epoch: 15494, Training Loss: 0.01375
Epoch: 15494, Training Loss: 0.01059
Epoch: 15495, Training Loss: 0.01204
Epoch: 15495, Training Loss: 0.01117
Epoch: 15495, Training Loss: 0.01375
Epoch: 15495, Training Loss: 0.01059
Epoch: 15496, Training Loss: 0.01204
Epoch: 15496, Training Loss: 0.01117
Epoch: 15496, Training Loss: 0.01375
Epoch: 15496, Training Loss: 0.01059
Epoch: 15497, Training Loss: 0.01204
Epoch: 15497, Training Loss: 0.01117
Epoch: 15497, Training Loss: 0.01375
Epoch: 15497, Training Loss: 0.01059
Epoch: 15498, Training Loss: 0.01204
Epoch: 15498, Training Loss: 0.01117
Epoch: 15498, Training Loss: 0.01375
Epoch: 15498, Training Loss: 0.01059
Epoch: 15499, Training Loss: 0.01204
Epoch: 15499, Training Loss: 0.01117
Epoch: 15499, Training Loss: 0.01375
Epoch: 15499, Training Loss: 0.01059
Epoch: 15500, Training Loss: 0.01204
Epoch: 15500, Training Loss: 0.01117
Epoch: 15500, Training Loss: 0.01375
Epoch: 15500, Training Loss: 0.01059
Epoch: 15501, Training Loss: 0.01204
Epoch: 15501, Training Loss: 0.01117
Epoch: 15501, Training Loss: 0.01375
Epoch: 15501, Training Loss: 0.01059
Epoch: 15502, Training Loss: 0.01204
Epoch: 15502, Training Loss: 0.01117
Epoch: 15502, Training Loss: 0.01375
Epoch: 15502, Training Loss: 0.01059
Epoch: 15503, Training Loss: 0.01204
Epoch: 15503, Training Loss: 0.01117
Epoch: 15503, Training Loss: 0.01375
Epoch: 15503, Training Loss: 0.01059
Epoch: 15504, Training Loss: 0.01204
Epoch: 15504, Training Loss: 0.01117
Epoch: 15504, Training Loss: 0.01375
Epoch: 15504, Training Loss: 0.01059
Epoch: 15505, Training Loss: 0.01204
Epoch: 15505, Training Loss: 0.01117
Epoch: 15505, Training Loss: 0.01375
Epoch: 15505, Training Loss: 0.01059
Epoch: 15506, Training Loss: 0.01204
Epoch: 15506, Training Loss: 0.01116
Epoch: 15506, Training Loss: 0.01375
Epoch: 15506, Training Loss: 0.01059
Epoch: 15507, Training Loss: 0.01204
Epoch: 15507, Training Loss: 0.01116
Epoch: 15507, Training Loss: 0.01375
Epoch: 15507, Training Loss: 0.01059
Epoch: 15508, Training Loss: 0.01204
Epoch: 15508, Training Loss: 0.01116
Epoch: 15508, Training Loss: 0.01374
Epoch: 15508, Training Loss: 0.01059
Epoch: 15509, Training Loss: 0.01204
Epoch: 15509, Training Loss: 0.01116
Epoch: 15509, Training Loss: 0.01374
Epoch: 15509, Training Loss: 0.01059
Epoch: 15510, Training Loss: 0.01204
Epoch: 15510, Training Loss: 0.01116
Epoch: 15510, Training Loss: 0.01374
Epoch: 15510, Training Loss: 0.01058
Epoch: 15511, Training Loss: 0.01204
Epoch: 15511, Training Loss: 0.01116
Epoch: 15511, Training Loss: 0.01374
Epoch: 15511, Training Loss: 0.01058
Epoch: 15512, Training Loss: 0.01204
Epoch: 15512, Training Loss: 0.01116
Epoch: 15512, Training Loss: 0.01374
Epoch: 15512, Training Loss: 0.01058
Epoch: 15513, Training Loss: 0.01204
Epoch: 15513, Training Loss: 0.01116
Epoch: 15513, Training Loss: 0.01374
Epoch: 15513, Training Loss: 0.01058
Epoch: 15514, Training Loss: 0.01204
Epoch: 15514, Training Loss: 0.01116
Epoch: 15514, Training Loss: 0.01374
Epoch: 15514, Training Loss: 0.01058
Epoch: 15515, Training Loss: 0.01204
Epoch: 15515, Training Loss: 0.01116
Epoch: 15515, Training Loss: 0.01374
Epoch: 15515, Training Loss: 0.01058
Epoch: 15516, Training Loss: 0.01204
Epoch: 15516, Training Loss: 0.01116
Epoch: 15516, Training Loss: 0.01374
Epoch: 15516, Training Loss: 0.01058
Epoch: 15517, Training Loss: 0.01204
Epoch: 15517, Training Loss: 0.01116
Epoch: 15517, Training Loss: 0.01374
Epoch: 15517, Training Loss: 0.01058
Epoch: 15518, Training Loss: 0.01203
Epoch: 15518, Training Loss: 0.01116
Epoch: 15518, Training Loss: 0.01374
Epoch: 15518, Training Loss: 0.01058
Epoch: 15519, Training Loss: 0.01203
Epoch: 15519, Training Loss: 0.01116
Epoch: 15519, Training Loss: 0.01374
Epoch: 15519, Training Loss: 0.01058
Epoch: 15520, Training Loss: 0.01203
Epoch: 15520, Training Loss: 0.01116
Epoch: 15520, Training Loss: 0.01374
Epoch: 15520, Training Loss: 0.01058
Epoch: 15521, Training Loss: 0.01203
Epoch: 15521, Training Loss: 0.01116
Epoch: 15521, Training Loss: 0.01374
Epoch: 15521, Training Loss: 0.01058
Epoch: 15522, Training Loss: 0.01203
Epoch: 15522, Training Loss: 0.01116
Epoch: 15522, Training Loss: 0.01374
Epoch: 15522, Training Loss: 0.01058
Epoch: 15523, Training Loss: 0.01203
Epoch: 15523, Training Loss: 0.01116
Epoch: 15523, Training Loss: 0.01374
Epoch: 15523, Training Loss: 0.01058
Epoch: 15524, Training Loss: 0.01203
Epoch: 15524, Training Loss: 0.01116
Epoch: 15524, Training Loss: 0.01374
Epoch: 15524, Training Loss: 0.01058
Epoch: 15525, Training Loss: 0.01203
Epoch: 15525, Training Loss: 0.01116
Epoch: 15525, Training Loss: 0.01374
Epoch: 15525, Training Loss: 0.01058
Epoch: 15526, Training Loss: 0.01203
Epoch: 15526, Training Loss: 0.01116
Epoch: 15526, Training Loss: 0.01374
Epoch: 15526, Training Loss: 0.01058
Epoch: 15527, Training Loss: 0.01203
Epoch: 15527, Training Loss: 0.01116
Epoch: 15527, Training Loss: 0.01374
Epoch: 15527, Training Loss: 0.01058
Epoch: 15528, Training Loss: 0.01203
Epoch: 15528, Training Loss: 0.01116
Epoch: 15528, Training Loss: 0.01373
Epoch: 15528, Training Loss: 0.01058
Epoch: 15529, Training Loss: 0.01203
Epoch: 15529, Training Loss: 0.01116
Epoch: 15529, Training Loss: 0.01373
Epoch: 15529, Training Loss: 0.01058
Epoch: 15530, Training Loss: 0.01203
Epoch: 15530, Training Loss: 0.01116
Epoch: 15530, Training Loss: 0.01373
Epoch: 15530, Training Loss: 0.01058
Epoch: 15531, Training Loss: 0.01203
Epoch: 15531, Training Loss: 0.01116
Epoch: 15531, Training Loss: 0.01373
Epoch: 15531, Training Loss: 0.01058
Epoch: 15532, Training Loss: 0.01203
Epoch: 15532, Training Loss: 0.01115
Epoch: 15532, Training Loss: 0.01373
Epoch: 15532, Training Loss: 0.01058
Epoch: 15533, Training Loss: 0.01203
Epoch: 15533, Training Loss: 0.01115
Epoch: 15533, Training Loss: 0.01373
Epoch: 15533, Training Loss: 0.01058
Epoch: 15534, Training Loss: 0.01203
Epoch: 15534, Training Loss: 0.01115
Epoch: 15534, Training Loss: 0.01373
Epoch: 15534, Training Loss: 0.01058
Epoch: 15535, Training Loss: 0.01203
Epoch: 15535, Training Loss: 0.01115
Epoch: 15535, Training Loss: 0.01373
Epoch: 15535, Training Loss: 0.01058
Epoch: 15536, Training Loss: 0.01203
Epoch: 15536, Training Loss: 0.01115
Epoch: 15536, Training Loss: 0.01373
Epoch: 15536, Training Loss: 0.01058
Epoch: 15537, Training Loss: 0.01203
Epoch: 15537, Training Loss: 0.01115
Epoch: 15537, Training Loss: 0.01373
Epoch: 15537, Training Loss: 0.01057
Epoch: 15538, Training Loss: 0.01203
Epoch: 15538, Training Loss: 0.01115
Epoch: 15538, Training Loss: 0.01373
Epoch: 15538, Training Loss: 0.01057
Epoch: 15539, Training Loss: 0.01203
Epoch: 15539, Training Loss: 0.01115
Epoch: 15539, Training Loss: 0.01373
Epoch: 15539, Training Loss: 0.01057
Epoch: 15540, Training Loss: 0.01203
Epoch: 15540, Training Loss: 0.01115
Epoch: 15540, Training Loss: 0.01373
Epoch: 15540, Training Loss: 0.01057
Epoch: 15541, Training Loss: 0.01203
Epoch: 15541, Training Loss: 0.01115
Epoch: 15541, Training Loss: 0.01373
Epoch: 15541, Training Loss: 0.01057
Epoch: 15542, Training Loss: 0.01202
Epoch: 15542, Training Loss: 0.01115
Epoch: 15542, Training Loss: 0.01373
Epoch: 15542, Training Loss: 0.01057
Epoch: 15543, Training Loss: 0.01202
Epoch: 15543, Training Loss: 0.01115
Epoch: 15543, Training Loss: 0.01373
Epoch: 15543, Training Loss: 0.01057
Epoch: 15544, Training Loss: 0.01202
Epoch: 15544, Training Loss: 0.01115
Epoch: 15544, Training Loss: 0.01373
Epoch: 15544, Training Loss: 0.01057
Epoch: 15545, Training Loss: 0.01202
Epoch: 15545, Training Loss: 0.01115
Epoch: 15545, Training Loss: 0.01373
Epoch: 15545, Training Loss: 0.01057
Epoch: 15546, Training Loss: 0.01202
Epoch: 15546, Training Loss: 0.01115
Epoch: 15546, Training Loss: 0.01373
Epoch: 15546, Training Loss: 0.01057
Epoch: 15547, Training Loss: 0.01202
Epoch: 15547, Training Loss: 0.01115
Epoch: 15547, Training Loss: 0.01373
Epoch: 15547, Training Loss: 0.01057
Epoch: 15548, Training Loss: 0.01202
Epoch: 15548, Training Loss: 0.01115
Epoch: 15548, Training Loss: 0.01373
Epoch: 15548, Training Loss: 0.01057
Epoch: 15549, Training Loss: 0.01202
Epoch: 15549, Training Loss: 0.01115
Epoch: 15549, Training Loss: 0.01372
Epoch: 15549, Training Loss: 0.01057
Epoch: 15550, Training Loss: 0.01202
Epoch: 15550, Training Loss: 0.01115
Epoch: 15550, Training Loss: 0.01372
Epoch: 15550, Training Loss: 0.01057
Epoch: 15551, Training Loss: 0.01202
Epoch: 15551, Training Loss: 0.01115
Epoch: 15551, Training Loss: 0.01372
Epoch: 15551, Training Loss: 0.01057
Epoch: 15552, Training Loss: 0.01202
Epoch: 15552, Training Loss: 0.01115
Epoch: 15552, Training Loss: 0.01372
Epoch: 15552, Training Loss: 0.01057
Epoch: 15553, Training Loss: 0.01202
Epoch: 15553, Training Loss: 0.01115
Epoch: 15553, Training Loss: 0.01372
Epoch: 15553, Training Loss: 0.01057
Epoch: 15554, Training Loss: 0.01202
Epoch: 15554, Training Loss: 0.01115
Epoch: 15554, Training Loss: 0.01372
Epoch: 15554, Training Loss: 0.01057
Epoch: 15555, Training Loss: 0.01202
Epoch: 15555, Training Loss: 0.01115
Epoch: 15555, Training Loss: 0.01372
Epoch: 15555, Training Loss: 0.01057
Epoch: 15556, Training Loss: 0.01202
Epoch: 15556, Training Loss: 0.01115
Epoch: 15556, Training Loss: 0.01372
Epoch: 15556, Training Loss: 0.01057
Epoch: 15557, Training Loss: 0.01202
Epoch: 15557, Training Loss: 0.01115
Epoch: 15557, Training Loss: 0.01372
Epoch: 15557, Training Loss: 0.01057
Epoch: 15558, Training Loss: 0.01202
Epoch: 15558, Training Loss: 0.01114
Epoch: 15558, Training Loss: 0.01372
Epoch: 15558, Training Loss: 0.01057
Epoch: 15559, Training Loss: 0.01202
Epoch: 15559, Training Loss: 0.01114
Epoch: 15559, Training Loss: 0.01372
Epoch: 15559, Training Loss: 0.01057
Epoch: 15560, Training Loss: 0.01202
Epoch: 15560, Training Loss: 0.01114
Epoch: 15560, Training Loss: 0.01372
Epoch: 15560, Training Loss: 0.01057
Epoch: 15561, Training Loss: 0.01202
Epoch: 15561, Training Loss: 0.01114
Epoch: 15561, Training Loss: 0.01372
Epoch: 15561, Training Loss: 0.01057
Epoch: 15562, Training Loss: 0.01202
Epoch: 15562, Training Loss: 0.01114
Epoch: 15562, Training Loss: 0.01372
Epoch: 15562, Training Loss: 0.01057
Epoch: 15563, Training Loss: 0.01202
Epoch: 15563, Training Loss: 0.01114
Epoch: 15563, Training Loss: 0.01372
Epoch: 15563, Training Loss: 0.01057
Epoch: 15564, Training Loss: 0.01202
Epoch: 15564, Training Loss: 0.01114
Epoch: 15564, Training Loss: 0.01372
Epoch: 15564, Training Loss: 0.01057
Epoch: 15565, Training Loss: 0.01201
Epoch: 15565, Training Loss: 0.01114
Epoch: 15565, Training Loss: 0.01372
Epoch: 15565, Training Loss: 0.01056
Epoch: 15566, Training Loss: 0.01201
Epoch: 15566, Training Loss: 0.01114
Epoch: 15566, Training Loss: 0.01372
Epoch: 15566, Training Loss: 0.01056
Epoch: 15567, Training Loss: 0.01201
Epoch: 15567, Training Loss: 0.01114
Epoch: 15567, Training Loss: 0.01372
Epoch: 15567, Training Loss: 0.01056
Epoch: 15568, Training Loss: 0.01201
Epoch: 15568, Training Loss: 0.01114
Epoch: 15568, Training Loss: 0.01372
Epoch: 15568, Training Loss: 0.01056
Epoch: 15569, Training Loss: 0.01201
Epoch: 15569, Training Loss: 0.01114
Epoch: 15569, Training Loss: 0.01372
Epoch: 15569, Training Loss: 0.01056
Epoch: 15570, Training Loss: 0.01201
Epoch: 15570, Training Loss: 0.01114
Epoch: 15570, Training Loss: 0.01371
Epoch: 15570, Training Loss: 0.01056
Epoch: 15571, Training Loss: 0.01201
Epoch: 15571, Training Loss: 0.01114
Epoch: 15571, Training Loss: 0.01371
Epoch: 15571, Training Loss: 0.01056
Epoch: 15572, Training Loss: 0.01201
Epoch: 15572, Training Loss: 0.01114
Epoch: 15572, Training Loss: 0.01371
Epoch: 15572, Training Loss: 0.01056
Epoch: 15573, Training Loss: 0.01201
Epoch: 15573, Training Loss: 0.01114
Epoch: 15573, Training Loss: 0.01371
Epoch: 15573, Training Loss: 0.01056
Epoch: 15574, Training Loss: 0.01201
Epoch: 15574, Training Loss: 0.01114
Epoch: 15574, Training Loss: 0.01371
Epoch: 15574, Training Loss: 0.01056
Epoch: 15575, Training Loss: 0.01201
Epoch: 15575, Training Loss: 0.01114
Epoch: 15575, Training Loss: 0.01371
Epoch: 15575, Training Loss: 0.01056
Epoch: 15576, Training Loss: 0.01201
Epoch: 15576, Training Loss: 0.01114
Epoch: 15576, Training Loss: 0.01371
Epoch: 15576, Training Loss: 0.01056
Epoch: 15577, Training Loss: 0.01201
Epoch: 15577, Training Loss: 0.01114
Epoch: 15577, Training Loss: 0.01371
Epoch: 15577, Training Loss: 0.01056
Epoch: 15578, Training Loss: 0.01201
Epoch: 15578, Training Loss: 0.01114
Epoch: 15578, Training Loss: 0.01371
Epoch: 15578, Training Loss: 0.01056
Epoch: 15579, Training Loss: 0.01201
Epoch: 15579, Training Loss: 0.01114
Epoch: 15579, Training Loss: 0.01371
Epoch: 15579, Training Loss: 0.01056
Epoch: 15580, Training Loss: 0.01201
Epoch: 15580, Training Loss: 0.01114
Epoch: 15580, Training Loss: 0.01371
Epoch: 15580, Training Loss: 0.01056
Epoch: 15581, Training Loss: 0.01201
Epoch: 15581, Training Loss: 0.01114
Epoch: 15581, Training Loss: 0.01371
Epoch: 15581, Training Loss: 0.01056
Epoch: 15582, Training Loss: 0.01201
Epoch: 15582, Training Loss: 0.01114
Epoch: 15582, Training Loss: 0.01371
Epoch: 15582, Training Loss: 0.01056
Epoch: 15583, Training Loss: 0.01201
Epoch: 15583, Training Loss: 0.01114
Epoch: 15583, Training Loss: 0.01371
Epoch: 15583, Training Loss: 0.01056
Epoch: 15584, Training Loss: 0.01201
Epoch: 15584, Training Loss: 0.01113
Epoch: 15584, Training Loss: 0.01371
Epoch: 15584, Training Loss: 0.01056
Epoch: 15585, Training Loss: 0.01201
Epoch: 15585, Training Loss: 0.01113
Epoch: 15585, Training Loss: 0.01371
Epoch: 15585, Training Loss: 0.01056
Epoch: 15586, Training Loss: 0.01201
Epoch: 15586, Training Loss: 0.01113
Epoch: 15586, Training Loss: 0.01371
Epoch: 15586, Training Loss: 0.01056
Epoch: 15587, Training Loss: 0.01201
Epoch: 15587, Training Loss: 0.01113
Epoch: 15587, Training Loss: 0.01371
Epoch: 15587, Training Loss: 0.01056
Epoch: 15588, Training Loss: 0.01201
Epoch: 15588, Training Loss: 0.01113
Epoch: 15588, Training Loss: 0.01371
Epoch: 15588, Training Loss: 0.01056
Epoch: 15589, Training Loss: 0.01200
Epoch: 15589, Training Loss: 0.01113
Epoch: 15589, Training Loss: 0.01371
Epoch: 15589, Training Loss: 0.01056
Epoch: 15590, Training Loss: 0.01200
Epoch: 15590, Training Loss: 0.01113
Epoch: 15590, Training Loss: 0.01370
Epoch: 15590, Training Loss: 0.01056
Epoch: 15591, Training Loss: 0.01200
Epoch: 15591, Training Loss: 0.01113
Epoch: 15591, Training Loss: 0.01370
Epoch: 15591, Training Loss: 0.01056
Epoch: 15592, Training Loss: 0.01200
Epoch: 15592, Training Loss: 0.01113
Epoch: 15592, Training Loss: 0.01370
Epoch: 15592, Training Loss: 0.01055
Epoch: 15593, Training Loss: 0.01200
Epoch: 15593, Training Loss: 0.01113
Epoch: 15593, Training Loss: 0.01370
Epoch: 15593, Training Loss: 0.01055
Epoch: 15594, Training Loss: 0.01200
Epoch: 15594, Training Loss: 0.01113
Epoch: 15594, Training Loss: 0.01370
Epoch: 15594, Training Loss: 0.01055
Epoch: 15595, Training Loss: 0.01200
Epoch: 15595, Training Loss: 0.01113
Epoch: 15595, Training Loss: 0.01370
Epoch: 15595, Training Loss: 0.01055
Epoch: 15596, Training Loss: 0.01200
Epoch: 15596, Training Loss: 0.01113
Epoch: 15596, Training Loss: 0.01370
Epoch: 15596, Training Loss: 0.01055
Epoch: 15597, Training Loss: 0.01200
Epoch: 15597, Training Loss: 0.01113
Epoch: 15597, Training Loss: 0.01370
Epoch: 15597, Training Loss: 0.01055
Epoch: 15598, Training Loss: 0.01200
Epoch: 15598, Training Loss: 0.01113
Epoch: 15598, Training Loss: 0.01370
Epoch: 15598, Training Loss: 0.01055
Epoch: 15599, Training Loss: 0.01200
Epoch: 15599, Training Loss: 0.01113
Epoch: 15599, Training Loss: 0.01370
Epoch: 15599, Training Loss: 0.01055
Epoch: 15600, Training Loss: 0.01200
Epoch: 15600, Training Loss: 0.01113
Epoch: 15600, Training Loss: 0.01370
Epoch: 15600, Training Loss: 0.01055
Epoch: 15601, Training Loss: 0.01200
Epoch: 15601, Training Loss: 0.01113
Epoch: 15601, Training Loss: 0.01370
Epoch: 15601, Training Loss: 0.01055
Epoch: 15602, Training Loss: 0.01200
Epoch: 15602, Training Loss: 0.01113
Epoch: 15602, Training Loss: 0.01370
Epoch: 15602, Training Loss: 0.01055
Epoch: 15603, Training Loss: 0.01200
Epoch: 15603, Training Loss: 0.01113
Epoch: 15603, Training Loss: 0.01370
Epoch: 15603, Training Loss: 0.01055
Epoch: 15604, Training Loss: 0.01200
Epoch: 15604, Training Loss: 0.01113
Epoch: 15604, Training Loss: 0.01370
Epoch: 15604, Training Loss: 0.01055
Epoch: 15605, Training Loss: 0.01200
Epoch: 15605, Training Loss: 0.01113
Epoch: 15605, Training Loss: 0.01370
Epoch: 15605, Training Loss: 0.01055
Epoch: 15606, Training Loss: 0.01200
Epoch: 15606, Training Loss: 0.01113
Epoch: 15606, Training Loss: 0.01370
Epoch: 15606, Training Loss: 0.01055
Epoch: 15607, Training Loss: 0.01200
Epoch: 15607, Training Loss: 0.01113
Epoch: 15607, Training Loss: 0.01370
Epoch: 15607, Training Loss: 0.01055
Epoch: 15608, Training Loss: 0.01200
Epoch: 15608, Training Loss: 0.01113
Epoch: 15608, Training Loss: 0.01370
Epoch: 15608, Training Loss: 0.01055
Epoch: 15609, Training Loss: 0.01200
Epoch: 15609, Training Loss: 0.01113
Epoch: 15609, Training Loss: 0.01370
Epoch: 15609, Training Loss: 0.01055
Epoch: 15610, Training Loss: 0.01200
Epoch: 15610, Training Loss: 0.01112
Epoch: 15610, Training Loss: 0.01370
Epoch: 15610, Training Loss: 0.01055
Epoch: 15611, Training Loss: 0.01200
Epoch: 15611, Training Loss: 0.01112
Epoch: 15611, Training Loss: 0.01369
Epoch: 15611, Training Loss: 0.01055
[... output truncated: the loop prints one loss line per XOR pattern every epoch. Over epochs 15611-15853 the four per-pattern losses decrease very slowly, from (0.01200, 0.01112, 0.01369, 0.01055) to (0.01189, 0.01103, 0.01358, 0.01046) — all still above the 0.008 target, so training continues ...]
Epoch: 15854, Training Loss: 0.01189
Epoch: 15854, Training Loss: 0.01103
Epoch: 15854, Training Loss: 0.01358
Epoch: 15854, Training Loss: 0.01046
Epoch: 15855, Training Loss: 0.01189
Epoch: 15855, Training Loss: 0.01103
Epoch: 15855, Training Loss: 0.01358
Epoch: 15855, Training Loss: 0.01046
Epoch: 15856, Training Loss: 0.01189
Epoch: 15856, Training Loss: 0.01103
Epoch: 15856, Training Loss: 0.01358
Epoch: 15856, Training Loss: 0.01046
Epoch: 15857, Training Loss: 0.01189
Epoch: 15857, Training Loss: 0.01103
Epoch: 15857, Training Loss: 0.01358
Epoch: 15857, Training Loss: 0.01046
Epoch: 15858, Training Loss: 0.01189
Epoch: 15858, Training Loss: 0.01103
Epoch: 15858, Training Loss: 0.01358
Epoch: 15858, Training Loss: 0.01046
Epoch: 15859, Training Loss: 0.01189
Epoch: 15859, Training Loss: 0.01103
Epoch: 15859, Training Loss: 0.01358
Epoch: 15859, Training Loss: 0.01046
Epoch: 15860, Training Loss: 0.01189
Epoch: 15860, Training Loss: 0.01103
Epoch: 15860, Training Loss: 0.01358
Epoch: 15860, Training Loss: 0.01046
Epoch: 15861, Training Loss: 0.01189
Epoch: 15861, Training Loss: 0.01103
Epoch: 15861, Training Loss: 0.01358
Epoch: 15861, Training Loss: 0.01046
Epoch: 15862, Training Loss: 0.01189
Epoch: 15862, Training Loss: 0.01103
Epoch: 15862, Training Loss: 0.01358
Epoch: 15862, Training Loss: 0.01046
Epoch: 15863, Training Loss: 0.01189
Epoch: 15863, Training Loss: 0.01103
Epoch: 15863, Training Loss: 0.01357
Epoch: 15863, Training Loss: 0.01046
Epoch: 15864, Training Loss: 0.01189
Epoch: 15864, Training Loss: 0.01103
Epoch: 15864, Training Loss: 0.01357
Epoch: 15864, Training Loss: 0.01046
Epoch: 15865, Training Loss: 0.01189
Epoch: 15865, Training Loss: 0.01103
Epoch: 15865, Training Loss: 0.01357
Epoch: 15865, Training Loss: 0.01046
Epoch: 15866, Training Loss: 0.01189
Epoch: 15866, Training Loss: 0.01103
Epoch: 15866, Training Loss: 0.01357
Epoch: 15866, Training Loss: 0.01046
Epoch: 15867, Training Loss: 0.01189
Epoch: 15867, Training Loss: 0.01103
Epoch: 15867, Training Loss: 0.01357
Epoch: 15867, Training Loss: 0.01046
Epoch: 15868, Training Loss: 0.01189
Epoch: 15868, Training Loss: 0.01103
Epoch: 15868, Training Loss: 0.01357
Epoch: 15868, Training Loss: 0.01046
Epoch: 15869, Training Loss: 0.01189
Epoch: 15869, Training Loss: 0.01103
Epoch: 15869, Training Loss: 0.01357
Epoch: 15869, Training Loss: 0.01046
Epoch: 15870, Training Loss: 0.01189
Epoch: 15870, Training Loss: 0.01103
Epoch: 15870, Training Loss: 0.01357
Epoch: 15870, Training Loss: 0.01045
Epoch: 15871, Training Loss: 0.01189
Epoch: 15871, Training Loss: 0.01103
Epoch: 15871, Training Loss: 0.01357
Epoch: 15871, Training Loss: 0.01045
Epoch: 15872, Training Loss: 0.01189
Epoch: 15872, Training Loss: 0.01103
Epoch: 15872, Training Loss: 0.01357
Epoch: 15872, Training Loss: 0.01045
Epoch: 15873, Training Loss: 0.01189
Epoch: 15873, Training Loss: 0.01103
Epoch: 15873, Training Loss: 0.01357
Epoch: 15873, Training Loss: 0.01045
Epoch: 15874, Training Loss: 0.01189
Epoch: 15874, Training Loss: 0.01103
Epoch: 15874, Training Loss: 0.01357
Epoch: 15874, Training Loss: 0.01045
Epoch: 15875, Training Loss: 0.01189
Epoch: 15875, Training Loss: 0.01102
Epoch: 15875, Training Loss: 0.01357
Epoch: 15875, Training Loss: 0.01045
Epoch: 15876, Training Loss: 0.01189
Epoch: 15876, Training Loss: 0.01102
Epoch: 15876, Training Loss: 0.01357
Epoch: 15876, Training Loss: 0.01045
Epoch: 15877, Training Loss: 0.01188
Epoch: 15877, Training Loss: 0.01102
Epoch: 15877, Training Loss: 0.01357
Epoch: 15877, Training Loss: 0.01045
Epoch: 15878, Training Loss: 0.01188
Epoch: 15878, Training Loss: 0.01102
Epoch: 15878, Training Loss: 0.01357
Epoch: 15878, Training Loss: 0.01045
Epoch: 15879, Training Loss: 0.01188
Epoch: 15879, Training Loss: 0.01102
Epoch: 15879, Training Loss: 0.01357
Epoch: 15879, Training Loss: 0.01045
Epoch: 15880, Training Loss: 0.01188
Epoch: 15880, Training Loss: 0.01102
Epoch: 15880, Training Loss: 0.01357
Epoch: 15880, Training Loss: 0.01045
Epoch: 15881, Training Loss: 0.01188
Epoch: 15881, Training Loss: 0.01102
Epoch: 15881, Training Loss: 0.01357
Epoch: 15881, Training Loss: 0.01045
Epoch: 15882, Training Loss: 0.01188
Epoch: 15882, Training Loss: 0.01102
Epoch: 15882, Training Loss: 0.01357
Epoch: 15882, Training Loss: 0.01045
Epoch: 15883, Training Loss: 0.01188
Epoch: 15883, Training Loss: 0.01102
Epoch: 15883, Training Loss: 0.01357
Epoch: 15883, Training Loss: 0.01045
Epoch: 15884, Training Loss: 0.01188
Epoch: 15884, Training Loss: 0.01102
Epoch: 15884, Training Loss: 0.01357
Epoch: 15884, Training Loss: 0.01045
Epoch: 15885, Training Loss: 0.01188
Epoch: 15885, Training Loss: 0.01102
Epoch: 15885, Training Loss: 0.01356
Epoch: 15885, Training Loss: 0.01045
Epoch: 15886, Training Loss: 0.01188
Epoch: 15886, Training Loss: 0.01102
Epoch: 15886, Training Loss: 0.01356
Epoch: 15886, Training Loss: 0.01045
Epoch: 15887, Training Loss: 0.01188
Epoch: 15887, Training Loss: 0.01102
Epoch: 15887, Training Loss: 0.01356
Epoch: 15887, Training Loss: 0.01045
Epoch: 15888, Training Loss: 0.01188
Epoch: 15888, Training Loss: 0.01102
Epoch: 15888, Training Loss: 0.01356
Epoch: 15888, Training Loss: 0.01045
Epoch: 15889, Training Loss: 0.01188
Epoch: 15889, Training Loss: 0.01102
Epoch: 15889, Training Loss: 0.01356
Epoch: 15889, Training Loss: 0.01045
Epoch: 15890, Training Loss: 0.01188
Epoch: 15890, Training Loss: 0.01102
Epoch: 15890, Training Loss: 0.01356
Epoch: 15890, Training Loss: 0.01045
Epoch: 15891, Training Loss: 0.01188
Epoch: 15891, Training Loss: 0.01102
Epoch: 15891, Training Loss: 0.01356
Epoch: 15891, Training Loss: 0.01045
Epoch: 15892, Training Loss: 0.01188
Epoch: 15892, Training Loss: 0.01102
Epoch: 15892, Training Loss: 0.01356
Epoch: 15892, Training Loss: 0.01045
Epoch: 15893, Training Loss: 0.01188
Epoch: 15893, Training Loss: 0.01102
Epoch: 15893, Training Loss: 0.01356
Epoch: 15893, Training Loss: 0.01045
Epoch: 15894, Training Loss: 0.01188
Epoch: 15894, Training Loss: 0.01102
Epoch: 15894, Training Loss: 0.01356
Epoch: 15894, Training Loss: 0.01045
Epoch: 15895, Training Loss: 0.01188
Epoch: 15895, Training Loss: 0.01102
Epoch: 15895, Training Loss: 0.01356
Epoch: 15895, Training Loss: 0.01045
Epoch: 15896, Training Loss: 0.01188
Epoch: 15896, Training Loss: 0.01102
Epoch: 15896, Training Loss: 0.01356
Epoch: 15896, Training Loss: 0.01045
Epoch: 15897, Training Loss: 0.01188
Epoch: 15897, Training Loss: 0.01102
Epoch: 15897, Training Loss: 0.01356
Epoch: 15897, Training Loss: 0.01045
Epoch: 15898, Training Loss: 0.01188
Epoch: 15898, Training Loss: 0.01102
Epoch: 15898, Training Loss: 0.01356
Epoch: 15898, Training Loss: 0.01044
Epoch: 15899, Training Loss: 0.01188
Epoch: 15899, Training Loss: 0.01102
Epoch: 15899, Training Loss: 0.01356
Epoch: 15899, Training Loss: 0.01044
Epoch: 15900, Training Loss: 0.01188
Epoch: 15900, Training Loss: 0.01102
Epoch: 15900, Training Loss: 0.01356
Epoch: 15900, Training Loss: 0.01044
Epoch: 15901, Training Loss: 0.01188
Epoch: 15901, Training Loss: 0.01102
Epoch: 15901, Training Loss: 0.01356
Epoch: 15901, Training Loss: 0.01044
Epoch: 15902, Training Loss: 0.01187
Epoch: 15902, Training Loss: 0.01101
Epoch: 15902, Training Loss: 0.01356
Epoch: 15902, Training Loss: 0.01044
Epoch: 15903, Training Loss: 0.01187
Epoch: 15903, Training Loss: 0.01101
Epoch: 15903, Training Loss: 0.01356
Epoch: 15903, Training Loss: 0.01044
Epoch: 15904, Training Loss: 0.01187
Epoch: 15904, Training Loss: 0.01101
Epoch: 15904, Training Loss: 0.01356
Epoch: 15904, Training Loss: 0.01044
Epoch: 15905, Training Loss: 0.01187
Epoch: 15905, Training Loss: 0.01101
Epoch: 15905, Training Loss: 0.01356
Epoch: 15905, Training Loss: 0.01044
Epoch: 15906, Training Loss: 0.01187
Epoch: 15906, Training Loss: 0.01101
Epoch: 15906, Training Loss: 0.01355
Epoch: 15906, Training Loss: 0.01044
Epoch: 15907, Training Loss: 0.01187
Epoch: 15907, Training Loss: 0.01101
Epoch: 15907, Training Loss: 0.01355
Epoch: 15907, Training Loss: 0.01044
Epoch: 15908, Training Loss: 0.01187
Epoch: 15908, Training Loss: 0.01101
Epoch: 15908, Training Loss: 0.01355
Epoch: 15908, Training Loss: 0.01044
Epoch: 15909, Training Loss: 0.01187
Epoch: 15909, Training Loss: 0.01101
Epoch: 15909, Training Loss: 0.01355
Epoch: 15909, Training Loss: 0.01044
Epoch: 15910, Training Loss: 0.01187
Epoch: 15910, Training Loss: 0.01101
Epoch: 15910, Training Loss: 0.01355
Epoch: 15910, Training Loss: 0.01044
Epoch: 15911, Training Loss: 0.01187
Epoch: 15911, Training Loss: 0.01101
Epoch: 15911, Training Loss: 0.01355
Epoch: 15911, Training Loss: 0.01044
Epoch: 15912, Training Loss: 0.01187
Epoch: 15912, Training Loss: 0.01101
Epoch: 15912, Training Loss: 0.01355
Epoch: 15912, Training Loss: 0.01044
Epoch: 15913, Training Loss: 0.01187
Epoch: 15913, Training Loss: 0.01101
Epoch: 15913, Training Loss: 0.01355
Epoch: 15913, Training Loss: 0.01044
Epoch: 15914, Training Loss: 0.01187
Epoch: 15914, Training Loss: 0.01101
Epoch: 15914, Training Loss: 0.01355
Epoch: 15914, Training Loss: 0.01044
Epoch: 15915, Training Loss: 0.01187
Epoch: 15915, Training Loss: 0.01101
Epoch: 15915, Training Loss: 0.01355
Epoch: 15915, Training Loss: 0.01044
Epoch: 15916, Training Loss: 0.01187
Epoch: 15916, Training Loss: 0.01101
Epoch: 15916, Training Loss: 0.01355
Epoch: 15916, Training Loss: 0.01044
Epoch: 15917, Training Loss: 0.01187
Epoch: 15917, Training Loss: 0.01101
Epoch: 15917, Training Loss: 0.01355
Epoch: 15917, Training Loss: 0.01044
Epoch: 15918, Training Loss: 0.01187
Epoch: 15918, Training Loss: 0.01101
Epoch: 15918, Training Loss: 0.01355
Epoch: 15918, Training Loss: 0.01044
Epoch: 15919, Training Loss: 0.01187
Epoch: 15919, Training Loss: 0.01101
Epoch: 15919, Training Loss: 0.01355
Epoch: 15919, Training Loss: 0.01044
Epoch: 15920, Training Loss: 0.01187
Epoch: 15920, Training Loss: 0.01101
Epoch: 15920, Training Loss: 0.01355
Epoch: 15920, Training Loss: 0.01044
Epoch: 15921, Training Loss: 0.01187
Epoch: 15921, Training Loss: 0.01101
Epoch: 15921, Training Loss: 0.01355
Epoch: 15921, Training Loss: 0.01044
Epoch: 15922, Training Loss: 0.01187
Epoch: 15922, Training Loss: 0.01101
Epoch: 15922, Training Loss: 0.01355
Epoch: 15922, Training Loss: 0.01044
Epoch: 15923, Training Loss: 0.01187
Epoch: 15923, Training Loss: 0.01101
Epoch: 15923, Training Loss: 0.01355
Epoch: 15923, Training Loss: 0.01044
Epoch: 15924, Training Loss: 0.01187
Epoch: 15924, Training Loss: 0.01101
Epoch: 15924, Training Loss: 0.01355
Epoch: 15924, Training Loss: 0.01044
Epoch: 15925, Training Loss: 0.01187
Epoch: 15925, Training Loss: 0.01101
Epoch: 15925, Training Loss: 0.01355
Epoch: 15925, Training Loss: 0.01044
Epoch: 15926, Training Loss: 0.01186
Epoch: 15926, Training Loss: 0.01101
Epoch: 15926, Training Loss: 0.01355
Epoch: 15926, Training Loss: 0.01044
Epoch: 15927, Training Loss: 0.01186
Epoch: 15927, Training Loss: 0.01101
Epoch: 15927, Training Loss: 0.01354
Epoch: 15927, Training Loss: 0.01043
Epoch: 15928, Training Loss: 0.01186
Epoch: 15928, Training Loss: 0.01101
Epoch: 15928, Training Loss: 0.01354
Epoch: 15928, Training Loss: 0.01043
Epoch: 15929, Training Loss: 0.01186
Epoch: 15929, Training Loss: 0.01100
Epoch: 15929, Training Loss: 0.01354
Epoch: 15929, Training Loss: 0.01043
Epoch: 15930, Training Loss: 0.01186
Epoch: 15930, Training Loss: 0.01100
Epoch: 15930, Training Loss: 0.01354
Epoch: 15930, Training Loss: 0.01043
Epoch: 15931, Training Loss: 0.01186
Epoch: 15931, Training Loss: 0.01100
Epoch: 15931, Training Loss: 0.01354
Epoch: 15931, Training Loss: 0.01043
Epoch: 15932, Training Loss: 0.01186
Epoch: 15932, Training Loss: 0.01100
Epoch: 15932, Training Loss: 0.01354
Epoch: 15932, Training Loss: 0.01043
Epoch: 15933, Training Loss: 0.01186
Epoch: 15933, Training Loss: 0.01100
Epoch: 15933, Training Loss: 0.01354
Epoch: 15933, Training Loss: 0.01043
Epoch: 15934, Training Loss: 0.01186
Epoch: 15934, Training Loss: 0.01100
Epoch: 15934, Training Loss: 0.01354
Epoch: 15934, Training Loss: 0.01043
Epoch: 15935, Training Loss: 0.01186
Epoch: 15935, Training Loss: 0.01100
Epoch: 15935, Training Loss: 0.01354
Epoch: 15935, Training Loss: 0.01043
Epoch: 15936, Training Loss: 0.01186
Epoch: 15936, Training Loss: 0.01100
Epoch: 15936, Training Loss: 0.01354
Epoch: 15936, Training Loss: 0.01043
Epoch: 15937, Training Loss: 0.01186
Epoch: 15937, Training Loss: 0.01100
Epoch: 15937, Training Loss: 0.01354
Epoch: 15937, Training Loss: 0.01043
Epoch: 15938, Training Loss: 0.01186
Epoch: 15938, Training Loss: 0.01100
Epoch: 15938, Training Loss: 0.01354
Epoch: 15938, Training Loss: 0.01043
Epoch: 15939, Training Loss: 0.01186
Epoch: 15939, Training Loss: 0.01100
Epoch: 15939, Training Loss: 0.01354
Epoch: 15939, Training Loss: 0.01043
Epoch: 15940, Training Loss: 0.01186
Epoch: 15940, Training Loss: 0.01100
Epoch: 15940, Training Loss: 0.01354
Epoch: 15940, Training Loss: 0.01043
Epoch: 15941, Training Loss: 0.01186
Epoch: 15941, Training Loss: 0.01100
Epoch: 15941, Training Loss: 0.01354
Epoch: 15941, Training Loss: 0.01043
Epoch: 15942, Training Loss: 0.01186
Epoch: 15942, Training Loss: 0.01100
Epoch: 15942, Training Loss: 0.01354
Epoch: 15942, Training Loss: 0.01043
Epoch: 15943, Training Loss: 0.01186
Epoch: 15943, Training Loss: 0.01100
Epoch: 15943, Training Loss: 0.01354
Epoch: 15943, Training Loss: 0.01043
Epoch: 15944, Training Loss: 0.01186
Epoch: 15944, Training Loss: 0.01100
Epoch: 15944, Training Loss: 0.01354
Epoch: 15944, Training Loss: 0.01043
Epoch: 15945, Training Loss: 0.01186
Epoch: 15945, Training Loss: 0.01100
Epoch: 15945, Training Loss: 0.01354
Epoch: 15945, Training Loss: 0.01043
Epoch: 15946, Training Loss: 0.01186
Epoch: 15946, Training Loss: 0.01100
Epoch: 15946, Training Loss: 0.01354
Epoch: 15946, Training Loss: 0.01043
Epoch: 15947, Training Loss: 0.01186
Epoch: 15947, Training Loss: 0.01100
Epoch: 15947, Training Loss: 0.01354
Epoch: 15947, Training Loss: 0.01043
Epoch: 15948, Training Loss: 0.01186
Epoch: 15948, Training Loss: 0.01100
Epoch: 15948, Training Loss: 0.01354
Epoch: 15948, Training Loss: 0.01043
Epoch: 15949, Training Loss: 0.01186
Epoch: 15949, Training Loss: 0.01100
Epoch: 15949, Training Loss: 0.01353
Epoch: 15949, Training Loss: 0.01043
Epoch: 15950, Training Loss: 0.01186
Epoch: 15950, Training Loss: 0.01100
Epoch: 15950, Training Loss: 0.01353
Epoch: 15950, Training Loss: 0.01043
Epoch: 15951, Training Loss: 0.01185
Epoch: 15951, Training Loss: 0.01100
Epoch: 15951, Training Loss: 0.01353
Epoch: 15951, Training Loss: 0.01043
Epoch: 15952, Training Loss: 0.01185
Epoch: 15952, Training Loss: 0.01100
Epoch: 15952, Training Loss: 0.01353
Epoch: 15952, Training Loss: 0.01043
Epoch: 15953, Training Loss: 0.01185
Epoch: 15953, Training Loss: 0.01100
Epoch: 15953, Training Loss: 0.01353
Epoch: 15953, Training Loss: 0.01043
Epoch: 15954, Training Loss: 0.01185
Epoch: 15954, Training Loss: 0.01100
Epoch: 15954, Training Loss: 0.01353
Epoch: 15954, Training Loss: 0.01043
Epoch: 15955, Training Loss: 0.01185
Epoch: 15955, Training Loss: 0.01100
Epoch: 15955, Training Loss: 0.01353
Epoch: 15955, Training Loss: 0.01042
Epoch: 15956, Training Loss: 0.01185
Epoch: 15956, Training Loss: 0.01099
Epoch: 15956, Training Loss: 0.01353
Epoch: 15956, Training Loss: 0.01042
Epoch: 15957, Training Loss: 0.01185
Epoch: 15957, Training Loss: 0.01099
Epoch: 15957, Training Loss: 0.01353
Epoch: 15957, Training Loss: 0.01042
Epoch: 15958, Training Loss: 0.01185
Epoch: 15958, Training Loss: 0.01099
Epoch: 15958, Training Loss: 0.01353
Epoch: 15958, Training Loss: 0.01042
Epoch: 15959, Training Loss: 0.01185
Epoch: 15959, Training Loss: 0.01099
Epoch: 15959, Training Loss: 0.01353
Epoch: 15959, Training Loss: 0.01042
Epoch: 15960, Training Loss: 0.01185
Epoch: 15960, Training Loss: 0.01099
Epoch: 15960, Training Loss: 0.01353
Epoch: 15960, Training Loss: 0.01042
Epoch: 15961, Training Loss: 0.01185
Epoch: 15961, Training Loss: 0.01099
Epoch: 15961, Training Loss: 0.01353
Epoch: 15961, Training Loss: 0.01042
Epoch: 15962, Training Loss: 0.01185
Epoch: 15962, Training Loss: 0.01099
Epoch: 15962, Training Loss: 0.01353
Epoch: 15962, Training Loss: 0.01042
Epoch: 15963, Training Loss: 0.01185
Epoch: 15963, Training Loss: 0.01099
Epoch: 15963, Training Loss: 0.01353
Epoch: 15963, Training Loss: 0.01042
Epoch: 15964, Training Loss: 0.01185
Epoch: 15964, Training Loss: 0.01099
Epoch: 15964, Training Loss: 0.01353
Epoch: 15964, Training Loss: 0.01042
Epoch: 15965, Training Loss: 0.01185
Epoch: 15965, Training Loss: 0.01099
Epoch: 15965, Training Loss: 0.01353
Epoch: 15965, Training Loss: 0.01042
Epoch: 15966, Training Loss: 0.01185
Epoch: 15966, Training Loss: 0.01099
Epoch: 15966, Training Loss: 0.01353
Epoch: 15966, Training Loss: 0.01042
Epoch: 15967, Training Loss: 0.01185
Epoch: 15967, Training Loss: 0.01099
Epoch: 15967, Training Loss: 0.01353
Epoch: 15967, Training Loss: 0.01042
Epoch: 15968, Training Loss: 0.01185
Epoch: 15968, Training Loss: 0.01099
Epoch: 15968, Training Loss: 0.01353
Epoch: 15968, Training Loss: 0.01042
Epoch: 15969, Training Loss: 0.01185
Epoch: 15969, Training Loss: 0.01099
Epoch: 15969, Training Loss: 0.01353
Epoch: 15969, Training Loss: 0.01042
Epoch: 15970, Training Loss: 0.01185
Epoch: 15970, Training Loss: 0.01099
Epoch: 15970, Training Loss: 0.01352
Epoch: 15970, Training Loss: 0.01042
Epoch: 15971, Training Loss: 0.01185
Epoch: 15971, Training Loss: 0.01099
Epoch: 15971, Training Loss: 0.01352
Epoch: 15971, Training Loss: 0.01042
Epoch: 15972, Training Loss: 0.01185
Epoch: 15972, Training Loss: 0.01099
Epoch: 15972, Training Loss: 0.01352
Epoch: 15972, Training Loss: 0.01042
Epoch: 15973, Training Loss: 0.01185
Epoch: 15973, Training Loss: 0.01099
Epoch: 15973, Training Loss: 0.01352
Epoch: 15973, Training Loss: 0.01042
Epoch: 15974, Training Loss: 0.01185
Epoch: 15974, Training Loss: 0.01099
Epoch: 15974, Training Loss: 0.01352
Epoch: 15974, Training Loss: 0.01042
Epoch: 15975, Training Loss: 0.01184
Epoch: 15975, Training Loss: 0.01099
Epoch: 15975, Training Loss: 0.01352
Epoch: 15975, Training Loss: 0.01042
Epoch: 15976, Training Loss: 0.01184
Epoch: 15976, Training Loss: 0.01099
Epoch: 15976, Training Loss: 0.01352
Epoch: 15976, Training Loss: 0.01042
Epoch: 15977, Training Loss: 0.01184
Epoch: 15977, Training Loss: 0.01099
Epoch: 15977, Training Loss: 0.01352
Epoch: 15977, Training Loss: 0.01042
Epoch: 15978, Training Loss: 0.01184
Epoch: 15978, Training Loss: 0.01099
Epoch: 15978, Training Loss: 0.01352
Epoch: 15978, Training Loss: 0.01042
Epoch: 15979, Training Loss: 0.01184
Epoch: 15979, Training Loss: 0.01099
Epoch: 15979, Training Loss: 0.01352
Epoch: 15979, Training Loss: 0.01042
Epoch: 15980, Training Loss: 0.01184
Epoch: 15980, Training Loss: 0.01099
Epoch: 15980, Training Loss: 0.01352
Epoch: 15980, Training Loss: 0.01042
Epoch: 15981, Training Loss: 0.01184
Epoch: 15981, Training Loss: 0.01099
Epoch: 15981, Training Loss: 0.01352
Epoch: 15981, Training Loss: 0.01042
Epoch: 15982, Training Loss: 0.01184
Epoch: 15982, Training Loss: 0.01099
Epoch: 15982, Training Loss: 0.01352
Epoch: 15982, Training Loss: 0.01042
Epoch: 15983, Training Loss: 0.01184
Epoch: 15983, Training Loss: 0.01099
Epoch: 15983, Training Loss: 0.01352
Epoch: 15983, Training Loss: 0.01041
Epoch: 15984, Training Loss: 0.01184
Epoch: 15984, Training Loss: 0.01098
Epoch: 15984, Training Loss: 0.01352
Epoch: 15984, Training Loss: 0.01041
Epoch: 15985, Training Loss: 0.01184
Epoch: 15985, Training Loss: 0.01098
Epoch: 15985, Training Loss: 0.01352
Epoch: 15985, Training Loss: 0.01041
Epoch: 15986, Training Loss: 0.01184
Epoch: 15986, Training Loss: 0.01098
Epoch: 15986, Training Loss: 0.01352
Epoch: 15986, Training Loss: 0.01041
Epoch: 15987, Training Loss: 0.01184
Epoch: 15987, Training Loss: 0.01098
Epoch: 15987, Training Loss: 0.01352
Epoch: 15987, Training Loss: 0.01041
Epoch: 15988, Training Loss: 0.01184
Epoch: 15988, Training Loss: 0.01098
Epoch: 15988, Training Loss: 0.01352
Epoch: 15988, Training Loss: 0.01041
Epoch: 15989, Training Loss: 0.01184
Epoch: 15989, Training Loss: 0.01098
Epoch: 15989, Training Loss: 0.01352
Epoch: 15989, Training Loss: 0.01041
Epoch: 15990, Training Loss: 0.01184
Epoch: 15990, Training Loss: 0.01098
Epoch: 15990, Training Loss: 0.01352
Epoch: 15990, Training Loss: 0.01041
Epoch: 15991, Training Loss: 0.01184
Epoch: 15991, Training Loss: 0.01098
Epoch: 15991, Training Loss: 0.01352
Epoch: 15991, Training Loss: 0.01041
Epoch: 15992, Training Loss: 0.01184
Epoch: 15992, Training Loss: 0.01098
Epoch: 15992, Training Loss: 0.01351
Epoch: 15992, Training Loss: 0.01041
Epoch: 15993, Training Loss: 0.01184
Epoch: 15993, Training Loss: 0.01098
Epoch: 15993, Training Loss: 0.01351
Epoch: 15993, Training Loss: 0.01041
Epoch: 15994, Training Loss: 0.01184
Epoch: 15994, Training Loss: 0.01098
Epoch: 15994, Training Loss: 0.01351
Epoch: 15994, Training Loss: 0.01041
Epoch: 15995, Training Loss: 0.01184
Epoch: 15995, Training Loss: 0.01098
Epoch: 15995, Training Loss: 0.01351
Epoch: 15995, Training Loss: 0.01041
Epoch: 15996, Training Loss: 0.01184
Epoch: 15996, Training Loss: 0.01098
Epoch: 15996, Training Loss: 0.01351
Epoch: 15996, Training Loss: 0.01041
Epoch: 15997, Training Loss: 0.01184
Epoch: 15997, Training Loss: 0.01098
Epoch: 15997, Training Loss: 0.01351
Epoch: 15997, Training Loss: 0.01041
Epoch: 15998, Training Loss: 0.01184
Epoch: 15998, Training Loss: 0.01098
Epoch: 15998, Training Loss: 0.01351
Epoch: 15998, Training Loss: 0.01041
Epoch: 15999, Training Loss: 0.01184
Epoch: 15999, Training Loss: 0.01098
Epoch: 15999, Training Loss: 0.01351
Epoch: 15999, Training Loss: 0.01041
Epoch: 16000, Training Loss: 0.01183
Epoch: 16000, Training Loss: 0.01098
Epoch: 16000, Training Loss: 0.01351
Epoch: 16000, Training Loss: 0.01041
Epoch: 16001, Training Loss: 0.01183
Epoch: 16001, Training Loss: 0.01098
Epoch: 16001, Training Loss: 0.01351
Epoch: 16001, Training Loss: 0.01041
Epoch: 16002, Training Loss: 0.01183
Epoch: 16002, Training Loss: 0.01098
Epoch: 16002, Training Loss: 0.01351
Epoch: 16002, Training Loss: 0.01041
Epoch: 16003, Training Loss: 0.01183
Epoch: 16003, Training Loss: 0.01098
Epoch: 16003, Training Loss: 0.01351
Epoch: 16003, Training Loss: 0.01041
Epoch: 16004, Training Loss: 0.01183
Epoch: 16004, Training Loss: 0.01098
Epoch: 16004, Training Loss: 0.01351
Epoch: 16004, Training Loss: 0.01041
Epoch: 16005, Training Loss: 0.01183
Epoch: 16005, Training Loss: 0.01098
Epoch: 16005, Training Loss: 0.01351
Epoch: 16005, Training Loss: 0.01041
Epoch: 16006, Training Loss: 0.01183
Epoch: 16006, Training Loss: 0.01098
Epoch: 16006, Training Loss: 0.01351
Epoch: 16006, Training Loss: 0.01041
Epoch: 16007, Training Loss: 0.01183
Epoch: 16007, Training Loss: 0.01098
Epoch: 16007, Training Loss: 0.01351
Epoch: 16007, Training Loss: 0.01041
Epoch: 16008, Training Loss: 0.01183
Epoch: 16008, Training Loss: 0.01098
Epoch: 16008, Training Loss: 0.01351
Epoch: 16008, Training Loss: 0.01041
Epoch: 16009, Training Loss: 0.01183
Epoch: 16009, Training Loss: 0.01098
Epoch: 16009, Training Loss: 0.01351
Epoch: 16009, Training Loss: 0.01041
Epoch: 16010, Training Loss: 0.01183
Epoch: 16010, Training Loss: 0.01098
Epoch: 16010, Training Loss: 0.01351
Epoch: 16010, Training Loss: 0.01041
Epoch: 16011, Training Loss: 0.01183
Epoch: 16011, Training Loss: 0.01097
Epoch: 16011, Training Loss: 0.01351
Epoch: 16011, Training Loss: 0.01041
Epoch: 16012, Training Loss: 0.01183
Epoch: 16012, Training Loss: 0.01097
Epoch: 16012, Training Loss: 0.01351
Epoch: 16012, Training Loss: 0.01040
Epoch: 16013, Training Loss: 0.01183
Epoch: 16013, Training Loss: 0.01097
Epoch: 16013, Training Loss: 0.01351
Epoch: 16013, Training Loss: 0.01040
Epoch: 16014, Training Loss: 0.01183
Epoch: 16014, Training Loss: 0.01097
Epoch: 16014, Training Loss: 0.01350
Epoch: 16014, Training Loss: 0.01040
Epoch: 16015, Training Loss: 0.01183
Epoch: 16015, Training Loss: 0.01097
Epoch: 16015, Training Loss: 0.01350
Epoch: 16015, Training Loss: 0.01040
Epoch: 16016, Training Loss: 0.01183
Epoch: 16016, Training Loss: 0.01097
Epoch: 16016, Training Loss: 0.01350
Epoch: 16016, Training Loss: 0.01040
Epoch: 16017, Training Loss: 0.01183
Epoch: 16017, Training Loss: 0.01097
Epoch: 16017, Training Loss: 0.01350
Epoch: 16017, Training Loss: 0.01040
Epoch: 16018, Training Loss: 0.01183
Epoch: 16018, Training Loss: 0.01097
Epoch: 16018, Training Loss: 0.01350
Epoch: 16018, Training Loss: 0.01040
Epoch: 16019, Training Loss: 0.01183
Epoch: 16019, Training Loss: 0.01097
Epoch: 16019, Training Loss: 0.01350
Epoch: 16019, Training Loss: 0.01040
Epoch: 16020, Training Loss: 0.01183
Epoch: 16020, Training Loss: 0.01097
Epoch: 16020, Training Loss: 0.01350
Epoch: 16020, Training Loss: 0.01040
Epoch: 16021, Training Loss: 0.01183
Epoch: 16021, Training Loss: 0.01097
Epoch: 16021, Training Loss: 0.01350
Epoch: 16021, Training Loss: 0.01040
Epoch: 16022, Training Loss: 0.01183
Epoch: 16022, Training Loss: 0.01097
Epoch: 16022, Training Loss: 0.01350
Epoch: 16022, Training Loss: 0.01040
Epoch: 16023, Training Loss: 0.01183
Epoch: 16023, Training Loss: 0.01097
Epoch: 16023, Training Loss: 0.01350
Epoch: 16023, Training Loss: 0.01040
Epoch: 16024, Training Loss: 0.01183
Epoch: 16024, Training Loss: 0.01097
Epoch: 16024, Training Loss: 0.01350
Epoch: 16024, Training Loss: 0.01040
Epoch: 16025, Training Loss: 0.01182
Epoch: 16025, Training Loss: 0.01097
Epoch: 16025, Training Loss: 0.01350
Epoch: 16025, Training Loss: 0.01040
Epoch: 16026, Training Loss: 0.01182
Epoch: 16026, Training Loss: 0.01097
Epoch: 16026, Training Loss: 0.01350
Epoch: 16026, Training Loss: 0.01040
Epoch: 16027, Training Loss: 0.01182
Epoch: 16027, Training Loss: 0.01097
Epoch: 16027, Training Loss: 0.01350
Epoch: 16027, Training Loss: 0.01040
Epoch: 16028, Training Loss: 0.01182
Epoch: 16028, Training Loss: 0.01097
Epoch: 16028, Training Loss: 0.01350
Epoch: 16028, Training Loss: 0.01040
Epoch: 16029, Training Loss: 0.01182
Epoch: 16029, Training Loss: 0.01097
Epoch: 16029, Training Loss: 0.01350
Epoch: 16029, Training Loss: 0.01040
Epoch: 16030, Training Loss: 0.01182
Epoch: 16030, Training Loss: 0.01097
Epoch: 16030, Training Loss: 0.01350
Epoch: 16030, Training Loss: 0.01040
Epoch: 16031, Training Loss: 0.01182
Epoch: 16031, Training Loss: 0.01097
Epoch: 16031, Training Loss: 0.01350
Epoch: 16031, Training Loss: 0.01040
Epoch: 16032, Training Loss: 0.01182
Epoch: 16032, Training Loss: 0.01097
Epoch: 16032, Training Loss: 0.01350
Epoch: 16032, Training Loss: 0.01040
Epoch: 16033, Training Loss: 0.01182
Epoch: 16033, Training Loss: 0.01097
Epoch: 16033, Training Loss: 0.01350
Epoch: 16033, Training Loss: 0.01040
Epoch: 16034, Training Loss: 0.01182
Epoch: 16034, Training Loss: 0.01097
Epoch: 16034, Training Loss: 0.01350
Epoch: 16034, Training Loss: 0.01040
Epoch: 16035, Training Loss: 0.01182
Epoch: 16035, Training Loss: 0.01097
Epoch: 16035, Training Loss: 0.01349
Epoch: 16035, Training Loss: 0.01040
Epoch: 16036, Training Loss: 0.01182
Epoch: 16036, Training Loss: 0.01097
Epoch: 16036, Training Loss: 0.01349
Epoch: 16036, Training Loss: 0.01040
Epoch: 16037, Training Loss: 0.01182
Epoch: 16037, Training Loss: 0.01097
Epoch: 16037, Training Loss: 0.01349
Epoch: 16037, Training Loss: 0.01040
Epoch: 16038, Training Loss: 0.01182
Epoch: 16038, Training Loss: 0.01096
Epoch: 16038, Training Loss: 0.01349
Epoch: 16038, Training Loss: 0.01040
Epoch: 16039, Training Loss: 0.01182
Epoch: 16039, Training Loss: 0.01096
Epoch: 16039, Training Loss: 0.01349
Epoch: 16039, Training Loss: 0.01040
Epoch: 16040, Training Loss: 0.01182
Epoch: 16040, Training Loss: 0.01096
Epoch: 16040, Training Loss: 0.01349
Epoch: 16040, Training Loss: 0.01040
Epoch: 16041, Training Loss: 0.01182
Epoch: 16041, Training Loss: 0.01096
Epoch: 16041, Training Loss: 0.01349
Epoch: 16041, Training Loss: 0.01039
Epoch: 16042, Training Loss: 0.01182
Epoch: 16042, Training Loss: 0.01096
Epoch: 16042, Training Loss: 0.01349
Epoch: 16042, Training Loss: 0.01039
Epoch: 16043, Training Loss: 0.01182
Epoch: 16043, Training Loss: 0.01096
Epoch: 16043, Training Loss: 0.01349
Epoch: 16043, Training Loss: 0.01039
... (repetitive output omitted: the four per-sample losses are printed every epoch from 16044 through 16285, decreasing only slightly — roughly 0.01182→0.01172, 0.01096→0.01088, 0.01349→0.01338, 0.01039→0.01031 — still above the 0.008 target)
Epoch: 16286, Training Loss: 0.01172
Epoch: 16286, Training Loss: 0.01088
Epoch: 16286, Training Loss: 0.01338
Epoch: 16286, Training Loss: 0.01031
Epoch: 16287, Training Loss: 0.01172
Epoch: 16287, Training Loss: 0.01087
Epoch: 16287, Training Loss: 0.01338
Epoch: 16287, Training Loss: 0.01031
Epoch: 16288, Training Loss: 0.01172
Epoch: 16288, Training Loss: 0.01087
Epoch: 16288, Training Loss: 0.01338
Epoch: 16288, Training Loss: 0.01031
Epoch: 16289, Training Loss: 0.01172
Epoch: 16289, Training Loss: 0.01087
Epoch: 16289, Training Loss: 0.01338
Epoch: 16289, Training Loss: 0.01031
Epoch: 16290, Training Loss: 0.01172
Epoch: 16290, Training Loss: 0.01087
Epoch: 16290, Training Loss: 0.01338
Epoch: 16290, Training Loss: 0.01031
Epoch: 16291, Training Loss: 0.01172
Epoch: 16291, Training Loss: 0.01087
Epoch: 16291, Training Loss: 0.01338
Epoch: 16291, Training Loss: 0.01031
Epoch: 16292, Training Loss: 0.01172
Epoch: 16292, Training Loss: 0.01087
Epoch: 16292, Training Loss: 0.01338
Epoch: 16292, Training Loss: 0.01031
Epoch: 16293, Training Loss: 0.01172
Epoch: 16293, Training Loss: 0.01087
Epoch: 16293, Training Loss: 0.01338
Epoch: 16293, Training Loss: 0.01031
Epoch: 16294, Training Loss: 0.01172
Epoch: 16294, Training Loss: 0.01087
Epoch: 16294, Training Loss: 0.01338
Epoch: 16294, Training Loss: 0.01031
Epoch: 16295, Training Loss: 0.01172
Epoch: 16295, Training Loss: 0.01087
Epoch: 16295, Training Loss: 0.01338
Epoch: 16295, Training Loss: 0.01031
Epoch: 16296, Training Loss: 0.01172
Epoch: 16296, Training Loss: 0.01087
Epoch: 16296, Training Loss: 0.01338
Epoch: 16296, Training Loss: 0.01031
Epoch: 16297, Training Loss: 0.01172
Epoch: 16297, Training Loss: 0.01087
Epoch: 16297, Training Loss: 0.01338
Epoch: 16297, Training Loss: 0.01031
Epoch: 16298, Training Loss: 0.01172
Epoch: 16298, Training Loss: 0.01087
Epoch: 16298, Training Loss: 0.01338
Epoch: 16298, Training Loss: 0.01031
Epoch: 16299, Training Loss: 0.01172
Epoch: 16299, Training Loss: 0.01087
Epoch: 16299, Training Loss: 0.01337
Epoch: 16299, Training Loss: 0.01031
Epoch: 16300, Training Loss: 0.01172
Epoch: 16300, Training Loss: 0.01087
Epoch: 16300, Training Loss: 0.01337
Epoch: 16300, Training Loss: 0.01031
Epoch: 16301, Training Loss: 0.01171
Epoch: 16301, Training Loss: 0.01087
Epoch: 16301, Training Loss: 0.01337
Epoch: 16301, Training Loss: 0.01031
Epoch: 16302, Training Loss: 0.01171
Epoch: 16302, Training Loss: 0.01087
Epoch: 16302, Training Loss: 0.01337
Epoch: 16302, Training Loss: 0.01030
Epoch: 16303, Training Loss: 0.01171
Epoch: 16303, Training Loss: 0.01087
Epoch: 16303, Training Loss: 0.01337
Epoch: 16303, Training Loss: 0.01030
Epoch: 16304, Training Loss: 0.01171
Epoch: 16304, Training Loss: 0.01087
Epoch: 16304, Training Loss: 0.01337
Epoch: 16304, Training Loss: 0.01030
Epoch: 16305, Training Loss: 0.01171
Epoch: 16305, Training Loss: 0.01087
Epoch: 16305, Training Loss: 0.01337
Epoch: 16305, Training Loss: 0.01030
Epoch: 16306, Training Loss: 0.01171
Epoch: 16306, Training Loss: 0.01087
Epoch: 16306, Training Loss: 0.01337
Epoch: 16306, Training Loss: 0.01030
Epoch: 16307, Training Loss: 0.01171
Epoch: 16307, Training Loss: 0.01087
Epoch: 16307, Training Loss: 0.01337
Epoch: 16307, Training Loss: 0.01030
Epoch: 16308, Training Loss: 0.01171
Epoch: 16308, Training Loss: 0.01087
Epoch: 16308, Training Loss: 0.01337
Epoch: 16308, Training Loss: 0.01030
Epoch: 16309, Training Loss: 0.01171
Epoch: 16309, Training Loss: 0.01087
Epoch: 16309, Training Loss: 0.01337
Epoch: 16309, Training Loss: 0.01030
Epoch: 16310, Training Loss: 0.01171
Epoch: 16310, Training Loss: 0.01087
Epoch: 16310, Training Loss: 0.01337
Epoch: 16310, Training Loss: 0.01030
Epoch: 16311, Training Loss: 0.01171
Epoch: 16311, Training Loss: 0.01087
Epoch: 16311, Training Loss: 0.01337
Epoch: 16311, Training Loss: 0.01030
Epoch: 16312, Training Loss: 0.01171
Epoch: 16312, Training Loss: 0.01087
Epoch: 16312, Training Loss: 0.01337
Epoch: 16312, Training Loss: 0.01030
Epoch: 16313, Training Loss: 0.01171
Epoch: 16313, Training Loss: 0.01087
Epoch: 16313, Training Loss: 0.01337
Epoch: 16313, Training Loss: 0.01030
Epoch: 16314, Training Loss: 0.01171
Epoch: 16314, Training Loss: 0.01087
Epoch: 16314, Training Loss: 0.01337
Epoch: 16314, Training Loss: 0.01030
Epoch: 16315, Training Loss: 0.01171
Epoch: 16315, Training Loss: 0.01086
Epoch: 16315, Training Loss: 0.01337
Epoch: 16315, Training Loss: 0.01030
Epoch: 16316, Training Loss: 0.01171
Epoch: 16316, Training Loss: 0.01086
Epoch: 16316, Training Loss: 0.01337
Epoch: 16316, Training Loss: 0.01030
Epoch: 16317, Training Loss: 0.01171
Epoch: 16317, Training Loss: 0.01086
Epoch: 16317, Training Loss: 0.01337
Epoch: 16317, Training Loss: 0.01030
Epoch: 16318, Training Loss: 0.01171
Epoch: 16318, Training Loss: 0.01086
Epoch: 16318, Training Loss: 0.01337
Epoch: 16318, Training Loss: 0.01030
Epoch: 16319, Training Loss: 0.01171
Epoch: 16319, Training Loss: 0.01086
Epoch: 16319, Training Loss: 0.01337
Epoch: 16319, Training Loss: 0.01030
Epoch: 16320, Training Loss: 0.01171
Epoch: 16320, Training Loss: 0.01086
Epoch: 16320, Training Loss: 0.01337
Epoch: 16320, Training Loss: 0.01030
Epoch: 16321, Training Loss: 0.01171
Epoch: 16321, Training Loss: 0.01086
Epoch: 16321, Training Loss: 0.01336
Epoch: 16321, Training Loss: 0.01030
Epoch: 16322, Training Loss: 0.01171
Epoch: 16322, Training Loss: 0.01086
Epoch: 16322, Training Loss: 0.01336
Epoch: 16322, Training Loss: 0.01030
Epoch: 16323, Training Loss: 0.01171
Epoch: 16323, Training Loss: 0.01086
Epoch: 16323, Training Loss: 0.01336
Epoch: 16323, Training Loss: 0.01030
Epoch: 16324, Training Loss: 0.01171
Epoch: 16324, Training Loss: 0.01086
Epoch: 16324, Training Loss: 0.01336
Epoch: 16324, Training Loss: 0.01030
Epoch: 16325, Training Loss: 0.01171
Epoch: 16325, Training Loss: 0.01086
Epoch: 16325, Training Loss: 0.01336
Epoch: 16325, Training Loss: 0.01030
Epoch: 16326, Training Loss: 0.01170
Epoch: 16326, Training Loss: 0.01086
Epoch: 16326, Training Loss: 0.01336
Epoch: 16326, Training Loss: 0.01030
Epoch: 16327, Training Loss: 0.01170
Epoch: 16327, Training Loss: 0.01086
Epoch: 16327, Training Loss: 0.01336
Epoch: 16327, Training Loss: 0.01030
Epoch: 16328, Training Loss: 0.01170
Epoch: 16328, Training Loss: 0.01086
Epoch: 16328, Training Loss: 0.01336
Epoch: 16328, Training Loss: 0.01030
Epoch: 16329, Training Loss: 0.01170
Epoch: 16329, Training Loss: 0.01086
Epoch: 16329, Training Loss: 0.01336
Epoch: 16329, Training Loss: 0.01030
Epoch: 16330, Training Loss: 0.01170
Epoch: 16330, Training Loss: 0.01086
Epoch: 16330, Training Loss: 0.01336
Epoch: 16330, Training Loss: 0.01030
Epoch: 16331, Training Loss: 0.01170
Epoch: 16331, Training Loss: 0.01086
Epoch: 16331, Training Loss: 0.01336
Epoch: 16331, Training Loss: 0.01030
Epoch: 16332, Training Loss: 0.01170
Epoch: 16332, Training Loss: 0.01086
Epoch: 16332, Training Loss: 0.01336
Epoch: 16332, Training Loss: 0.01029
Epoch: 16333, Training Loss: 0.01170
Epoch: 16333, Training Loss: 0.01086
Epoch: 16333, Training Loss: 0.01336
Epoch: 16333, Training Loss: 0.01029
Epoch: 16334, Training Loss: 0.01170
Epoch: 16334, Training Loss: 0.01086
Epoch: 16334, Training Loss: 0.01336
Epoch: 16334, Training Loss: 0.01029
Epoch: 16335, Training Loss: 0.01170
Epoch: 16335, Training Loss: 0.01086
Epoch: 16335, Training Loss: 0.01336
Epoch: 16335, Training Loss: 0.01029
Epoch: 16336, Training Loss: 0.01170
Epoch: 16336, Training Loss: 0.01086
Epoch: 16336, Training Loss: 0.01336
Epoch: 16336, Training Loss: 0.01029
Epoch: 16337, Training Loss: 0.01170
Epoch: 16337, Training Loss: 0.01086
Epoch: 16337, Training Loss: 0.01336
Epoch: 16337, Training Loss: 0.01029
Epoch: 16338, Training Loss: 0.01170
Epoch: 16338, Training Loss: 0.01086
Epoch: 16338, Training Loss: 0.01336
Epoch: 16338, Training Loss: 0.01029
Epoch: 16339, Training Loss: 0.01170
Epoch: 16339, Training Loss: 0.01086
Epoch: 16339, Training Loss: 0.01336
Epoch: 16339, Training Loss: 0.01029
Epoch: 16340, Training Loss: 0.01170
Epoch: 16340, Training Loss: 0.01086
Epoch: 16340, Training Loss: 0.01336
Epoch: 16340, Training Loss: 0.01029
Epoch: 16341, Training Loss: 0.01170
Epoch: 16341, Training Loss: 0.01086
Epoch: 16341, Training Loss: 0.01336
Epoch: 16341, Training Loss: 0.01029
Epoch: 16342, Training Loss: 0.01170
Epoch: 16342, Training Loss: 0.01086
Epoch: 16342, Training Loss: 0.01336
Epoch: 16342, Training Loss: 0.01029
Epoch: 16343, Training Loss: 0.01170
Epoch: 16343, Training Loss: 0.01085
Epoch: 16343, Training Loss: 0.01336
Epoch: 16343, Training Loss: 0.01029
Epoch: 16344, Training Loss: 0.01170
Epoch: 16344, Training Loss: 0.01085
Epoch: 16344, Training Loss: 0.01335
Epoch: 16344, Training Loss: 0.01029
Epoch: 16345, Training Loss: 0.01170
Epoch: 16345, Training Loss: 0.01085
Epoch: 16345, Training Loss: 0.01335
Epoch: 16345, Training Loss: 0.01029
Epoch: 16346, Training Loss: 0.01170
Epoch: 16346, Training Loss: 0.01085
Epoch: 16346, Training Loss: 0.01335
Epoch: 16346, Training Loss: 0.01029
Epoch: 16347, Training Loss: 0.01170
Epoch: 16347, Training Loss: 0.01085
Epoch: 16347, Training Loss: 0.01335
Epoch: 16347, Training Loss: 0.01029
Epoch: 16348, Training Loss: 0.01170
Epoch: 16348, Training Loss: 0.01085
Epoch: 16348, Training Loss: 0.01335
Epoch: 16348, Training Loss: 0.01029
Epoch: 16349, Training Loss: 0.01170
Epoch: 16349, Training Loss: 0.01085
Epoch: 16349, Training Loss: 0.01335
Epoch: 16349, Training Loss: 0.01029
Epoch: 16350, Training Loss: 0.01170
Epoch: 16350, Training Loss: 0.01085
Epoch: 16350, Training Loss: 0.01335
Epoch: 16350, Training Loss: 0.01029
Epoch: 16351, Training Loss: 0.01170
Epoch: 16351, Training Loss: 0.01085
Epoch: 16351, Training Loss: 0.01335
Epoch: 16351, Training Loss: 0.01029
Epoch: 16352, Training Loss: 0.01169
Epoch: 16352, Training Loss: 0.01085
Epoch: 16352, Training Loss: 0.01335
Epoch: 16352, Training Loss: 0.01029
Epoch: 16353, Training Loss: 0.01169
Epoch: 16353, Training Loss: 0.01085
Epoch: 16353, Training Loss: 0.01335
Epoch: 16353, Training Loss: 0.01029
Epoch: 16354, Training Loss: 0.01169
Epoch: 16354, Training Loss: 0.01085
Epoch: 16354, Training Loss: 0.01335
Epoch: 16354, Training Loss: 0.01029
Epoch: 16355, Training Loss: 0.01169
Epoch: 16355, Training Loss: 0.01085
Epoch: 16355, Training Loss: 0.01335
Epoch: 16355, Training Loss: 0.01029
Epoch: 16356, Training Loss: 0.01169
Epoch: 16356, Training Loss: 0.01085
Epoch: 16356, Training Loss: 0.01335
Epoch: 16356, Training Loss: 0.01029
Epoch: 16357, Training Loss: 0.01169
Epoch: 16357, Training Loss: 0.01085
Epoch: 16357, Training Loss: 0.01335
Epoch: 16357, Training Loss: 0.01029
Epoch: 16358, Training Loss: 0.01169
Epoch: 16358, Training Loss: 0.01085
Epoch: 16358, Training Loss: 0.01335
Epoch: 16358, Training Loss: 0.01029
Epoch: 16359, Training Loss: 0.01169
Epoch: 16359, Training Loss: 0.01085
Epoch: 16359, Training Loss: 0.01335
Epoch: 16359, Training Loss: 0.01029
Epoch: 16360, Training Loss: 0.01169
Epoch: 16360, Training Loss: 0.01085
Epoch: 16360, Training Loss: 0.01335
Epoch: 16360, Training Loss: 0.01029
Epoch: 16361, Training Loss: 0.01169
Epoch: 16361, Training Loss: 0.01085
Epoch: 16361, Training Loss: 0.01335
Epoch: 16361, Training Loss: 0.01028
Epoch: 16362, Training Loss: 0.01169
Epoch: 16362, Training Loss: 0.01085
Epoch: 16362, Training Loss: 0.01335
Epoch: 16362, Training Loss: 0.01028
Epoch: 16363, Training Loss: 0.01169
Epoch: 16363, Training Loss: 0.01085
Epoch: 16363, Training Loss: 0.01335
Epoch: 16363, Training Loss: 0.01028
Epoch: 16364, Training Loss: 0.01169
Epoch: 16364, Training Loss: 0.01085
Epoch: 16364, Training Loss: 0.01335
Epoch: 16364, Training Loss: 0.01028
Epoch: 16365, Training Loss: 0.01169
Epoch: 16365, Training Loss: 0.01085
Epoch: 16365, Training Loss: 0.01335
Epoch: 16365, Training Loss: 0.01028
Epoch: 16366, Training Loss: 0.01169
Epoch: 16366, Training Loss: 0.01085
Epoch: 16366, Training Loss: 0.01334
Epoch: 16366, Training Loss: 0.01028
Epoch: 16367, Training Loss: 0.01169
Epoch: 16367, Training Loss: 0.01085
Epoch: 16367, Training Loss: 0.01334
Epoch: 16367, Training Loss: 0.01028
Epoch: 16368, Training Loss: 0.01169
Epoch: 16368, Training Loss: 0.01085
Epoch: 16368, Training Loss: 0.01334
Epoch: 16368, Training Loss: 0.01028
Epoch: 16369, Training Loss: 0.01169
Epoch: 16369, Training Loss: 0.01085
Epoch: 16369, Training Loss: 0.01334
Epoch: 16369, Training Loss: 0.01028
Epoch: 16370, Training Loss: 0.01169
Epoch: 16370, Training Loss: 0.01085
Epoch: 16370, Training Loss: 0.01334
Epoch: 16370, Training Loss: 0.01028
Epoch: 16371, Training Loss: 0.01169
Epoch: 16371, Training Loss: 0.01084
Epoch: 16371, Training Loss: 0.01334
Epoch: 16371, Training Loss: 0.01028
Epoch: 16372, Training Loss: 0.01169
Epoch: 16372, Training Loss: 0.01084
Epoch: 16372, Training Loss: 0.01334
Epoch: 16372, Training Loss: 0.01028
Epoch: 16373, Training Loss: 0.01169
Epoch: 16373, Training Loss: 0.01084
Epoch: 16373, Training Loss: 0.01334
Epoch: 16373, Training Loss: 0.01028
Epoch: 16374, Training Loss: 0.01169
Epoch: 16374, Training Loss: 0.01084
Epoch: 16374, Training Loss: 0.01334
Epoch: 16374, Training Loss: 0.01028
Epoch: 16375, Training Loss: 0.01169
Epoch: 16375, Training Loss: 0.01084
Epoch: 16375, Training Loss: 0.01334
Epoch: 16375, Training Loss: 0.01028
Epoch: 16376, Training Loss: 0.01169
Epoch: 16376, Training Loss: 0.01084
Epoch: 16376, Training Loss: 0.01334
Epoch: 16376, Training Loss: 0.01028
Epoch: 16377, Training Loss: 0.01168
Epoch: 16377, Training Loss: 0.01084
Epoch: 16377, Training Loss: 0.01334
Epoch: 16377, Training Loss: 0.01028
Epoch: 16378, Training Loss: 0.01168
Epoch: 16378, Training Loss: 0.01084
Epoch: 16378, Training Loss: 0.01334
Epoch: 16378, Training Loss: 0.01028
Epoch: 16379, Training Loss: 0.01168
Epoch: 16379, Training Loss: 0.01084
Epoch: 16379, Training Loss: 0.01334
Epoch: 16379, Training Loss: 0.01028
Epoch: 16380, Training Loss: 0.01168
Epoch: 16380, Training Loss: 0.01084
Epoch: 16380, Training Loss: 0.01334
Epoch: 16380, Training Loss: 0.01028
Epoch: 16381, Training Loss: 0.01168
Epoch: 16381, Training Loss: 0.01084
Epoch: 16381, Training Loss: 0.01334
Epoch: 16381, Training Loss: 0.01028
Epoch: 16382, Training Loss: 0.01168
Epoch: 16382, Training Loss: 0.01084
Epoch: 16382, Training Loss: 0.01334
Epoch: 16382, Training Loss: 0.01028
Epoch: 16383, Training Loss: 0.01168
Epoch: 16383, Training Loss: 0.01084
Epoch: 16383, Training Loss: 0.01334
Epoch: 16383, Training Loss: 0.01028
Epoch: 16384, Training Loss: 0.01168
Epoch: 16384, Training Loss: 0.01084
Epoch: 16384, Training Loss: 0.01334
Epoch: 16384, Training Loss: 0.01028
Epoch: 16385, Training Loss: 0.01168
Epoch: 16385, Training Loss: 0.01084
Epoch: 16385, Training Loss: 0.01334
Epoch: 16385, Training Loss: 0.01028
Epoch: 16386, Training Loss: 0.01168
Epoch: 16386, Training Loss: 0.01084
Epoch: 16386, Training Loss: 0.01334
Epoch: 16386, Training Loss: 0.01028
Epoch: 16387, Training Loss: 0.01168
Epoch: 16387, Training Loss: 0.01084
Epoch: 16387, Training Loss: 0.01334
Epoch: 16387, Training Loss: 0.01028
Epoch: 16388, Training Loss: 0.01168
Epoch: 16388, Training Loss: 0.01084
Epoch: 16388, Training Loss: 0.01333
Epoch: 16388, Training Loss: 0.01028
Epoch: 16389, Training Loss: 0.01168
Epoch: 16389, Training Loss: 0.01084
Epoch: 16389, Training Loss: 0.01333
Epoch: 16389, Training Loss: 0.01028
Epoch: 16390, Training Loss: 0.01168
Epoch: 16390, Training Loss: 0.01084
Epoch: 16390, Training Loss: 0.01333
Epoch: 16390, Training Loss: 0.01028
Epoch: 16391, Training Loss: 0.01168
Epoch: 16391, Training Loss: 0.01084
Epoch: 16391, Training Loss: 0.01333
Epoch: 16391, Training Loss: 0.01027
Epoch: 16392, Training Loss: 0.01168
Epoch: 16392, Training Loss: 0.01084
Epoch: 16392, Training Loss: 0.01333
Epoch: 16392, Training Loss: 0.01027
Epoch: 16393, Training Loss: 0.01168
Epoch: 16393, Training Loss: 0.01084
Epoch: 16393, Training Loss: 0.01333
Epoch: 16393, Training Loss: 0.01027
Epoch: 16394, Training Loss: 0.01168
Epoch: 16394, Training Loss: 0.01084
Epoch: 16394, Training Loss: 0.01333
Epoch: 16394, Training Loss: 0.01027
Epoch: 16395, Training Loss: 0.01168
Epoch: 16395, Training Loss: 0.01084
Epoch: 16395, Training Loss: 0.01333
Epoch: 16395, Training Loss: 0.01027
Epoch: 16396, Training Loss: 0.01168
Epoch: 16396, Training Loss: 0.01084
Epoch: 16396, Training Loss: 0.01333
Epoch: 16396, Training Loss: 0.01027
Epoch: 16397, Training Loss: 0.01168
Epoch: 16397, Training Loss: 0.01084
Epoch: 16397, Training Loss: 0.01333
Epoch: 16397, Training Loss: 0.01027
Epoch: 16398, Training Loss: 0.01168
Epoch: 16398, Training Loss: 0.01084
Epoch: 16398, Training Loss: 0.01333
Epoch: 16398, Training Loss: 0.01027
Epoch: 16399, Training Loss: 0.01168
Epoch: 16399, Training Loss: 0.01084
Epoch: 16399, Training Loss: 0.01333
Epoch: 16399, Training Loss: 0.01027
Epoch: 16400, Training Loss: 0.01168
Epoch: 16400, Training Loss: 0.01083
Epoch: 16400, Training Loss: 0.01333
Epoch: 16400, Training Loss: 0.01027
Epoch: 16401, Training Loss: 0.01168
Epoch: 16401, Training Loss: 0.01083
Epoch: 16401, Training Loss: 0.01333
Epoch: 16401, Training Loss: 0.01027
Epoch: 16402, Training Loss: 0.01168
Epoch: 16402, Training Loss: 0.01083
Epoch: 16402, Training Loss: 0.01333
Epoch: 16402, Training Loss: 0.01027
Epoch: 16403, Training Loss: 0.01167
Epoch: 16403, Training Loss: 0.01083
Epoch: 16403, Training Loss: 0.01333
Epoch: 16403, Training Loss: 0.01027
Epoch: 16404, Training Loss: 0.01167
Epoch: 16404, Training Loss: 0.01083
Epoch: 16404, Training Loss: 0.01333
Epoch: 16404, Training Loss: 0.01027
Epoch: 16405, Training Loss: 0.01167
Epoch: 16405, Training Loss: 0.01083
Epoch: 16405, Training Loss: 0.01333
Epoch: 16405, Training Loss: 0.01027
Epoch: 16406, Training Loss: 0.01167
Epoch: 16406, Training Loss: 0.01083
Epoch: 16406, Training Loss: 0.01333
Epoch: 16406, Training Loss: 0.01027
Epoch: 16407, Training Loss: 0.01167
Epoch: 16407, Training Loss: 0.01083
Epoch: 16407, Training Loss: 0.01333
Epoch: 16407, Training Loss: 0.01027
Epoch: 16408, Training Loss: 0.01167
Epoch: 16408, Training Loss: 0.01083
Epoch: 16408, Training Loss: 0.01333
Epoch: 16408, Training Loss: 0.01027
Epoch: 16409, Training Loss: 0.01167
Epoch: 16409, Training Loss: 0.01083
Epoch: 16409, Training Loss: 0.01333
Epoch: 16409, Training Loss: 0.01027
Epoch: 16410, Training Loss: 0.01167
Epoch: 16410, Training Loss: 0.01083
Epoch: 16410, Training Loss: 0.01333
Epoch: 16410, Training Loss: 0.01027
Epoch: 16411, Training Loss: 0.01167
Epoch: 16411, Training Loss: 0.01083
Epoch: 16411, Training Loss: 0.01332
Epoch: 16411, Training Loss: 0.01027
Epoch: 16412, Training Loss: 0.01167
Epoch: 16412, Training Loss: 0.01083
Epoch: 16412, Training Loss: 0.01332
Epoch: 16412, Training Loss: 0.01027
Epoch: 16413, Training Loss: 0.01167
Epoch: 16413, Training Loss: 0.01083
Epoch: 16413, Training Loss: 0.01332
Epoch: 16413, Training Loss: 0.01027
Epoch: 16414, Training Loss: 0.01167
Epoch: 16414, Training Loss: 0.01083
Epoch: 16414, Training Loss: 0.01332
Epoch: 16414, Training Loss: 0.01027
Epoch: 16415, Training Loss: 0.01167
Epoch: 16415, Training Loss: 0.01083
Epoch: 16415, Training Loss: 0.01332
Epoch: 16415, Training Loss: 0.01027
Epoch: 16416, Training Loss: 0.01167
Epoch: 16416, Training Loss: 0.01083
Epoch: 16416, Training Loss: 0.01332
Epoch: 16416, Training Loss: 0.01027
Epoch: 16417, Training Loss: 0.01167
Epoch: 16417, Training Loss: 0.01083
Epoch: 16417, Training Loss: 0.01332
Epoch: 16417, Training Loss: 0.01027
Epoch: 16418, Training Loss: 0.01167
Epoch: 16418, Training Loss: 0.01083
Epoch: 16418, Training Loss: 0.01332
Epoch: 16418, Training Loss: 0.01027
Epoch: 16419, Training Loss: 0.01167
Epoch: 16419, Training Loss: 0.01083
Epoch: 16419, Training Loss: 0.01332
Epoch: 16419, Training Loss: 0.01027
Epoch: 16420, Training Loss: 0.01167
Epoch: 16420, Training Loss: 0.01083
Epoch: 16420, Training Loss: 0.01332
Epoch: 16420, Training Loss: 0.01026
Epoch: 16421, Training Loss: 0.01167
Epoch: 16421, Training Loss: 0.01083
Epoch: 16421, Training Loss: 0.01332
Epoch: 16421, Training Loss: 0.01026
Epoch: 16422, Training Loss: 0.01167
Epoch: 16422, Training Loss: 0.01083
Epoch: 16422, Training Loss: 0.01332
Epoch: 16422, Training Loss: 0.01026
Epoch: 16423, Training Loss: 0.01167
Epoch: 16423, Training Loss: 0.01083
Epoch: 16423, Training Loss: 0.01332
Epoch: 16423, Training Loss: 0.01026
Epoch: 16424, Training Loss: 0.01167
Epoch: 16424, Training Loss: 0.01083
Epoch: 16424, Training Loss: 0.01332
Epoch: 16424, Training Loss: 0.01026
Epoch: 16425, Training Loss: 0.01167
Epoch: 16425, Training Loss: 0.01083
Epoch: 16425, Training Loss: 0.01332
Epoch: 16425, Training Loss: 0.01026
Epoch: 16426, Training Loss: 0.01167
Epoch: 16426, Training Loss: 0.01083
Epoch: 16426, Training Loss: 0.01332
Epoch: 16426, Training Loss: 0.01026
Epoch: 16427, Training Loss: 0.01167
Epoch: 16427, Training Loss: 0.01083
Epoch: 16427, Training Loss: 0.01332
Epoch: 16427, Training Loss: 0.01026
Epoch: 16428, Training Loss: 0.01167
Epoch: 16428, Training Loss: 0.01082
Epoch: 16428, Training Loss: 0.01332
Epoch: 16428, Training Loss: 0.01026
Epoch: 16429, Training Loss: 0.01166
Epoch: 16429, Training Loss: 0.01082
Epoch: 16429, Training Loss: 0.01332
Epoch: 16429, Training Loss: 0.01026
Epoch: 16430, Training Loss: 0.01166
Epoch: 16430, Training Loss: 0.01082
Epoch: 16430, Training Loss: 0.01332
Epoch: 16430, Training Loss: 0.01026
Epoch: 16431, Training Loss: 0.01166
Epoch: 16431, Training Loss: 0.01082
Epoch: 16431, Training Loss: 0.01332
Epoch: 16431, Training Loss: 0.01026
Epoch: 16432, Training Loss: 0.01166
Epoch: 16432, Training Loss: 0.01082
Epoch: 16432, Training Loss: 0.01332
Epoch: 16432, Training Loss: 0.01026
Epoch: 16433, Training Loss: 0.01166
Epoch: 16433, Training Loss: 0.01082
Epoch: 16433, Training Loss: 0.01331
Epoch: 16433, Training Loss: 0.01026
Epoch: 16434, Training Loss: 0.01166
Epoch: 16434, Training Loss: 0.01082
Epoch: 16434, Training Loss: 0.01331
Epoch: 16434, Training Loss: 0.01026
Epoch: 16435, Training Loss: 0.01166
Epoch: 16435, Training Loss: 0.01082
Epoch: 16435, Training Loss: 0.01331
Epoch: 16435, Training Loss: 0.01026
Epoch: 16436, Training Loss: 0.01166
Epoch: 16436, Training Loss: 0.01082
Epoch: 16436, Training Loss: 0.01331
Epoch: 16436, Training Loss: 0.01026
Epoch: 16437, Training Loss: 0.01166
Epoch: 16437, Training Loss: 0.01082
Epoch: 16437, Training Loss: 0.01331
Epoch: 16437, Training Loss: 0.01026
Epoch: 16438, Training Loss: 0.01166
Epoch: 16438, Training Loss: 0.01082
Epoch: 16438, Training Loss: 0.01331
Epoch: 16438, Training Loss: 0.01026
Epoch: 16439, Training Loss: 0.01166
Epoch: 16439, Training Loss: 0.01082
Epoch: 16439, Training Loss: 0.01331
Epoch: 16439, Training Loss: 0.01026
Epoch: 16440, Training Loss: 0.01166
Epoch: 16440, Training Loss: 0.01082
Epoch: 16440, Training Loss: 0.01331
Epoch: 16440, Training Loss: 0.01026
Epoch: 16441, Training Loss: 0.01166
Epoch: 16441, Training Loss: 0.01082
Epoch: 16441, Training Loss: 0.01331
Epoch: 16441, Training Loss: 0.01026
Epoch: 16442, Training Loss: 0.01166
Epoch: 16442, Training Loss: 0.01082
Epoch: 16442, Training Loss: 0.01331
Epoch: 16442, Training Loss: 0.01026
Epoch: 16443, Training Loss: 0.01166
Epoch: 16443, Training Loss: 0.01082
Epoch: 16443, Training Loss: 0.01331
Epoch: 16443, Training Loss: 0.01026
Epoch: 16444, Training Loss: 0.01166
Epoch: 16444, Training Loss: 0.01082
Epoch: 16444, Training Loss: 0.01331
Epoch: 16444, Training Loss: 0.01026
Epoch: 16445, Training Loss: 0.01166
Epoch: 16445, Training Loss: 0.01082
Epoch: 16445, Training Loss: 0.01331
Epoch: 16445, Training Loss: 0.01026
Epoch: 16446, Training Loss: 0.01166
Epoch: 16446, Training Loss: 0.01082
Epoch: 16446, Training Loss: 0.01331
Epoch: 16446, Training Loss: 0.01026
Epoch: 16447, Training Loss: 0.01166
Epoch: 16447, Training Loss: 0.01082
Epoch: 16447, Training Loss: 0.01331
Epoch: 16447, Training Loss: 0.01026
Epoch: 16448, Training Loss: 0.01166
Epoch: 16448, Training Loss: 0.01082
Epoch: 16448, Training Loss: 0.01331
Epoch: 16448, Training Loss: 0.01026
Epoch: 16449, Training Loss: 0.01166
Epoch: 16449, Training Loss: 0.01082
Epoch: 16449, Training Loss: 0.01331
Epoch: 16449, Training Loss: 0.01026
Epoch: 16450, Training Loss: 0.01166
Epoch: 16450, Training Loss: 0.01082
Epoch: 16450, Training Loss: 0.01331
Epoch: 16450, Training Loss: 0.01025
Epoch: 16451, Training Loss: 0.01166
Epoch: 16451, Training Loss: 0.01082
Epoch: 16451, Training Loss: 0.01331
Epoch: 16451, Training Loss: 0.01025
Epoch: 16452, Training Loss: 0.01166
Epoch: 16452, Training Loss: 0.01082
Epoch: 16452, Training Loss: 0.01331
Epoch: 16452, Training Loss: 0.01025
Epoch: 16453, Training Loss: 0.01166
Epoch: 16453, Training Loss: 0.01082
Epoch: 16453, Training Loss: 0.01331
Epoch: 16453, Training Loss: 0.01025
Epoch: 16454, Training Loss: 0.01165
Epoch: 16454, Training Loss: 0.01082
Epoch: 16454, Training Loss: 0.01331
Epoch: 16454, Training Loss: 0.01025
Epoch: 16455, Training Loss: 0.01165
Epoch: 16455, Training Loss: 0.01082
Epoch: 16455, Training Loss: 0.01331
Epoch: 16455, Training Loss: 0.01025
Epoch: 16456, Training Loss: 0.01165
Epoch: 16456, Training Loss: 0.01082
Epoch: 16456, Training Loss: 0.01330
Epoch: 16456, Training Loss: 0.01025
Epoch: 16457, Training Loss: 0.01165
Epoch: 16457, Training Loss: 0.01081
Epoch: 16457, Training Loss: 0.01330
Epoch: 16457, Training Loss: 0.01025
Epoch: 16458, Training Loss: 0.01165
Epoch: 16458, Training Loss: 0.01081
Epoch: 16458, Training Loss: 0.01330
Epoch: 16458, Training Loss: 0.01025
Epoch: 16459, Training Loss: 0.01165
Epoch: 16459, Training Loss: 0.01081
Epoch: 16459, Training Loss: 0.01330
Epoch: 16459, Training Loss: 0.01025
Epoch: 16460, Training Loss: 0.01165
Epoch: 16460, Training Loss: 0.01081
Epoch: 16460, Training Loss: 0.01330
Epoch: 16460, Training Loss: 0.01025
Epoch: 16461, Training Loss: 0.01165
Epoch: 16461, Training Loss: 0.01081
Epoch: 16461, Training Loss: 0.01330
Epoch: 16461, Training Loss: 0.01025
Epoch: 16462, Training Loss: 0.01165
Epoch: 16462, Training Loss: 0.01081
Epoch: 16462, Training Loss: 0.01330
Epoch: 16462, Training Loss: 0.01025
Epoch: 16463, Training Loss: 0.01165
Epoch: 16463, Training Loss: 0.01081
Epoch: 16463, Training Loss: 0.01330
Epoch: 16463, Training Loss: 0.01025
Epoch: 16464, Training Loss: 0.01165
Epoch: 16464, Training Loss: 0.01081
Epoch: 16464, Training Loss: 0.01330
Epoch: 16464, Training Loss: 0.01025
Epoch: 16465, Training Loss: 0.01165
Epoch: 16465, Training Loss: 0.01081
Epoch: 16465, Training Loss: 0.01330
Epoch: 16465, Training Loss: 0.01025
Epoch: 16466, Training Loss: 0.01165
Epoch: 16466, Training Loss: 0.01081
Epoch: 16466, Training Loss: 0.01330
Epoch: 16466, Training Loss: 0.01025
Epoch: 16467, Training Loss: 0.01165
Epoch: 16467, Training Loss: 0.01081
Epoch: 16467, Training Loss: 0.01330
Epoch: 16467, Training Loss: 0.01025
Epoch: 16468, Training Loss: 0.01165
Epoch: 16468, Training Loss: 0.01081
Epoch: 16468, Training Loss: 0.01330
Epoch: 16468, Training Loss: 0.01025
Epoch: 16469, Training Loss: 0.01165
Epoch: 16469, Training Loss: 0.01081
Epoch: 16469, Training Loss: 0.01330
Epoch: 16469, Training Loss: 0.01025
Epoch: 16470, Training Loss: 0.01165
Epoch: 16470, Training Loss: 0.01081
Epoch: 16470, Training Loss: 0.01330
Epoch: 16470, Training Loss: 0.01025
Epoch: 16471, Training Loss: 0.01165
Epoch: 16471, Training Loss: 0.01081
Epoch: 16471, Training Loss: 0.01330
Epoch: 16471, Training Loss: 0.01025
Epoch: 16472, Training Loss: 0.01165
Epoch: 16472, Training Loss: 0.01081
Epoch: 16472, Training Loss: 0.01330
Epoch: 16472, Training Loss: 0.01025
Epoch: 16473, Training Loss: 0.01165
Epoch: 16473, Training Loss: 0.01081
Epoch: 16473, Training Loss: 0.01330
Epoch: 16473, Training Loss: 0.01025
Epoch: 16474, Training Loss: 0.01165
Epoch: 16474, Training Loss: 0.01081
Epoch: 16474, Training Loss: 0.01330
Epoch: 16474, Training Loss: 0.01025
Epoch: 16475, Training Loss: 0.01165
Epoch: 16475, Training Loss: 0.01081
Epoch: 16475, Training Loss: 0.01330
[output truncated: the cell prints four per-sample losses per epoch; between epochs 16475 and 16719 they decrease only slowly, from roughly 0.01165 / 0.01081 / 0.01330 / 0.01025 to roughly 0.01155 / 0.01072 / 0.01319 / 0.01017, still well above the 0.008 target]
Epoch: 16719, Training Loss: 0.01072
Epoch: 16719, Training Loss: 0.01319
Epoch: 16719, Training Loss: 0.01017
Epoch: 16720, Training Loss: 0.01155
Epoch: 16720, Training Loss: 0.01072
Epoch: 16720, Training Loss: 0.01319
Epoch: 16720, Training Loss: 0.01017
Epoch: 16721, Training Loss: 0.01155
Epoch: 16721, Training Loss: 0.01072
Epoch: 16721, Training Loss: 0.01319
Epoch: 16721, Training Loss: 0.01017
Epoch: 16722, Training Loss: 0.01155
Epoch: 16722, Training Loss: 0.01072
Epoch: 16722, Training Loss: 0.01319
Epoch: 16722, Training Loss: 0.01016
Epoch: 16723, Training Loss: 0.01155
Epoch: 16723, Training Loss: 0.01072
Epoch: 16723, Training Loss: 0.01319
Epoch: 16723, Training Loss: 0.01016
Epoch: 16724, Training Loss: 0.01155
Epoch: 16724, Training Loss: 0.01072
Epoch: 16724, Training Loss: 0.01319
Epoch: 16724, Training Loss: 0.01016
Epoch: 16725, Training Loss: 0.01155
Epoch: 16725, Training Loss: 0.01072
Epoch: 16725, Training Loss: 0.01319
Epoch: 16725, Training Loss: 0.01016
Epoch: 16726, Training Loss: 0.01155
Epoch: 16726, Training Loss: 0.01072
Epoch: 16726, Training Loss: 0.01319
Epoch: 16726, Training Loss: 0.01016
Epoch: 16727, Training Loss: 0.01155
Epoch: 16727, Training Loss: 0.01072
Epoch: 16727, Training Loss: 0.01319
Epoch: 16727, Training Loss: 0.01016
Epoch: 16728, Training Loss: 0.01155
Epoch: 16728, Training Loss: 0.01072
Epoch: 16728, Training Loss: 0.01319
Epoch: 16728, Training Loss: 0.01016
Epoch: 16729, Training Loss: 0.01155
Epoch: 16729, Training Loss: 0.01072
Epoch: 16729, Training Loss: 0.01319
Epoch: 16729, Training Loss: 0.01016
Epoch: 16730, Training Loss: 0.01155
Epoch: 16730, Training Loss: 0.01072
Epoch: 16730, Training Loss: 0.01319
Epoch: 16730, Training Loss: 0.01016
Epoch: 16731, Training Loss: 0.01155
Epoch: 16731, Training Loss: 0.01072
Epoch: 16731, Training Loss: 0.01318
Epoch: 16731, Training Loss: 0.01016
Epoch: 16732, Training Loss: 0.01155
Epoch: 16732, Training Loss: 0.01072
Epoch: 16732, Training Loss: 0.01318
Epoch: 16732, Training Loss: 0.01016
Epoch: 16733, Training Loss: 0.01155
Epoch: 16733, Training Loss: 0.01072
Epoch: 16733, Training Loss: 0.01318
Epoch: 16733, Training Loss: 0.01016
Epoch: 16734, Training Loss: 0.01155
Epoch: 16734, Training Loss: 0.01072
Epoch: 16734, Training Loss: 0.01318
Epoch: 16734, Training Loss: 0.01016
Epoch: 16735, Training Loss: 0.01155
Epoch: 16735, Training Loss: 0.01072
Epoch: 16735, Training Loss: 0.01318
Epoch: 16735, Training Loss: 0.01016
Epoch: 16736, Training Loss: 0.01155
Epoch: 16736, Training Loss: 0.01072
Epoch: 16736, Training Loss: 0.01318
Epoch: 16736, Training Loss: 0.01016
Epoch: 16737, Training Loss: 0.01155
Epoch: 16737, Training Loss: 0.01072
Epoch: 16737, Training Loss: 0.01318
Epoch: 16737, Training Loss: 0.01016
Epoch: 16738, Training Loss: 0.01155
Epoch: 16738, Training Loss: 0.01072
Epoch: 16738, Training Loss: 0.01318
Epoch: 16738, Training Loss: 0.01016
Epoch: 16739, Training Loss: 0.01155
Epoch: 16739, Training Loss: 0.01072
Epoch: 16739, Training Loss: 0.01318
Epoch: 16739, Training Loss: 0.01016
Epoch: 16740, Training Loss: 0.01155
Epoch: 16740, Training Loss: 0.01072
Epoch: 16740, Training Loss: 0.01318
Epoch: 16740, Training Loss: 0.01016
Epoch: 16741, Training Loss: 0.01155
Epoch: 16741, Training Loss: 0.01072
Epoch: 16741, Training Loss: 0.01318
Epoch: 16741, Training Loss: 0.01016
Epoch: 16742, Training Loss: 0.01155
Epoch: 16742, Training Loss: 0.01072
Epoch: 16742, Training Loss: 0.01318
Epoch: 16742, Training Loss: 0.01016
Epoch: 16743, Training Loss: 0.01154
Epoch: 16743, Training Loss: 0.01072
Epoch: 16743, Training Loss: 0.01318
Epoch: 16743, Training Loss: 0.01016
Epoch: 16744, Training Loss: 0.01154
Epoch: 16744, Training Loss: 0.01072
Epoch: 16744, Training Loss: 0.01318
Epoch: 16744, Training Loss: 0.01016
Epoch: 16745, Training Loss: 0.01154
Epoch: 16745, Training Loss: 0.01071
Epoch: 16745, Training Loss: 0.01318
Epoch: 16745, Training Loss: 0.01016
Epoch: 16746, Training Loss: 0.01154
Epoch: 16746, Training Loss: 0.01071
Epoch: 16746, Training Loss: 0.01318
Epoch: 16746, Training Loss: 0.01016
Epoch: 16747, Training Loss: 0.01154
Epoch: 16747, Training Loss: 0.01071
Epoch: 16747, Training Loss: 0.01318
Epoch: 16747, Training Loss: 0.01016
Epoch: 16748, Training Loss: 0.01154
Epoch: 16748, Training Loss: 0.01071
Epoch: 16748, Training Loss: 0.01318
Epoch: 16748, Training Loss: 0.01016
Epoch: 16749, Training Loss: 0.01154
Epoch: 16749, Training Loss: 0.01071
Epoch: 16749, Training Loss: 0.01318
Epoch: 16749, Training Loss: 0.01016
Epoch: 16750, Training Loss: 0.01154
Epoch: 16750, Training Loss: 0.01071
Epoch: 16750, Training Loss: 0.01318
Epoch: 16750, Training Loss: 0.01016
Epoch: 16751, Training Loss: 0.01154
Epoch: 16751, Training Loss: 0.01071
Epoch: 16751, Training Loss: 0.01318
Epoch: 16751, Training Loss: 0.01016
Epoch: 16752, Training Loss: 0.01154
Epoch: 16752, Training Loss: 0.01071
Epoch: 16752, Training Loss: 0.01318
Epoch: 16752, Training Loss: 0.01016
Epoch: 16753, Training Loss: 0.01154
Epoch: 16753, Training Loss: 0.01071
Epoch: 16753, Training Loss: 0.01318
Epoch: 16753, Training Loss: 0.01015
Epoch: 16754, Training Loss: 0.01154
Epoch: 16754, Training Loss: 0.01071
Epoch: 16754, Training Loss: 0.01317
Epoch: 16754, Training Loss: 0.01015
Epoch: 16755, Training Loss: 0.01154
Epoch: 16755, Training Loss: 0.01071
Epoch: 16755, Training Loss: 0.01317
Epoch: 16755, Training Loss: 0.01015
Epoch: 16756, Training Loss: 0.01154
Epoch: 16756, Training Loss: 0.01071
Epoch: 16756, Training Loss: 0.01317
Epoch: 16756, Training Loss: 0.01015
Epoch: 16757, Training Loss: 0.01154
Epoch: 16757, Training Loss: 0.01071
Epoch: 16757, Training Loss: 0.01317
Epoch: 16757, Training Loss: 0.01015
Epoch: 16758, Training Loss: 0.01154
Epoch: 16758, Training Loss: 0.01071
Epoch: 16758, Training Loss: 0.01317
Epoch: 16758, Training Loss: 0.01015
Epoch: 16759, Training Loss: 0.01154
Epoch: 16759, Training Loss: 0.01071
Epoch: 16759, Training Loss: 0.01317
Epoch: 16759, Training Loss: 0.01015
Epoch: 16760, Training Loss: 0.01154
Epoch: 16760, Training Loss: 0.01071
Epoch: 16760, Training Loss: 0.01317
Epoch: 16760, Training Loss: 0.01015
Epoch: 16761, Training Loss: 0.01154
Epoch: 16761, Training Loss: 0.01071
Epoch: 16761, Training Loss: 0.01317
Epoch: 16761, Training Loss: 0.01015
Epoch: 16762, Training Loss: 0.01154
Epoch: 16762, Training Loss: 0.01071
Epoch: 16762, Training Loss: 0.01317
Epoch: 16762, Training Loss: 0.01015
Epoch: 16763, Training Loss: 0.01154
Epoch: 16763, Training Loss: 0.01071
Epoch: 16763, Training Loss: 0.01317
Epoch: 16763, Training Loss: 0.01015
Epoch: 16764, Training Loss: 0.01154
Epoch: 16764, Training Loss: 0.01071
Epoch: 16764, Training Loss: 0.01317
Epoch: 16764, Training Loss: 0.01015
Epoch: 16765, Training Loss: 0.01154
Epoch: 16765, Training Loss: 0.01071
Epoch: 16765, Training Loss: 0.01317
Epoch: 16765, Training Loss: 0.01015
Epoch: 16766, Training Loss: 0.01154
Epoch: 16766, Training Loss: 0.01071
Epoch: 16766, Training Loss: 0.01317
Epoch: 16766, Training Loss: 0.01015
Epoch: 16767, Training Loss: 0.01154
Epoch: 16767, Training Loss: 0.01071
Epoch: 16767, Training Loss: 0.01317
Epoch: 16767, Training Loss: 0.01015
Epoch: 16768, Training Loss: 0.01154
Epoch: 16768, Training Loss: 0.01071
Epoch: 16768, Training Loss: 0.01317
Epoch: 16768, Training Loss: 0.01015
Epoch: 16769, Training Loss: 0.01153
Epoch: 16769, Training Loss: 0.01071
Epoch: 16769, Training Loss: 0.01317
Epoch: 16769, Training Loss: 0.01015
Epoch: 16770, Training Loss: 0.01153
Epoch: 16770, Training Loss: 0.01071
Epoch: 16770, Training Loss: 0.01317
Epoch: 16770, Training Loss: 0.01015
Epoch: 16771, Training Loss: 0.01153
Epoch: 16771, Training Loss: 0.01071
Epoch: 16771, Training Loss: 0.01317
Epoch: 16771, Training Loss: 0.01015
Epoch: 16772, Training Loss: 0.01153
Epoch: 16772, Training Loss: 0.01071
Epoch: 16772, Training Loss: 0.01317
Epoch: 16772, Training Loss: 0.01015
Epoch: 16773, Training Loss: 0.01153
Epoch: 16773, Training Loss: 0.01071
Epoch: 16773, Training Loss: 0.01317
Epoch: 16773, Training Loss: 0.01015
Epoch: 16774, Training Loss: 0.01153
Epoch: 16774, Training Loss: 0.01071
Epoch: 16774, Training Loss: 0.01317
Epoch: 16774, Training Loss: 0.01015
Epoch: 16775, Training Loss: 0.01153
Epoch: 16775, Training Loss: 0.01070
Epoch: 16775, Training Loss: 0.01317
Epoch: 16775, Training Loss: 0.01015
Epoch: 16776, Training Loss: 0.01153
Epoch: 16776, Training Loss: 0.01070
Epoch: 16776, Training Loss: 0.01317
Epoch: 16776, Training Loss: 0.01015
Epoch: 16777, Training Loss: 0.01153
Epoch: 16777, Training Loss: 0.01070
Epoch: 16777, Training Loss: 0.01316
Epoch: 16777, Training Loss: 0.01015
Epoch: 16778, Training Loss: 0.01153
Epoch: 16778, Training Loss: 0.01070
Epoch: 16778, Training Loss: 0.01316
Epoch: 16778, Training Loss: 0.01015
Epoch: 16779, Training Loss: 0.01153
Epoch: 16779, Training Loss: 0.01070
Epoch: 16779, Training Loss: 0.01316
Epoch: 16779, Training Loss: 0.01015
Epoch: 16780, Training Loss: 0.01153
Epoch: 16780, Training Loss: 0.01070
Epoch: 16780, Training Loss: 0.01316
Epoch: 16780, Training Loss: 0.01015
Epoch: 16781, Training Loss: 0.01153
Epoch: 16781, Training Loss: 0.01070
Epoch: 16781, Training Loss: 0.01316
Epoch: 16781, Training Loss: 0.01015
Epoch: 16782, Training Loss: 0.01153
Epoch: 16782, Training Loss: 0.01070
Epoch: 16782, Training Loss: 0.01316
Epoch: 16782, Training Loss: 0.01015
Epoch: 16783, Training Loss: 0.01153
Epoch: 16783, Training Loss: 0.01070
Epoch: 16783, Training Loss: 0.01316
Epoch: 16783, Training Loss: 0.01015
Epoch: 16784, Training Loss: 0.01153
Epoch: 16784, Training Loss: 0.01070
Epoch: 16784, Training Loss: 0.01316
Epoch: 16784, Training Loss: 0.01014
Epoch: 16785, Training Loss: 0.01153
Epoch: 16785, Training Loss: 0.01070
Epoch: 16785, Training Loss: 0.01316
Epoch: 16785, Training Loss: 0.01014
Epoch: 16786, Training Loss: 0.01153
Epoch: 16786, Training Loss: 0.01070
Epoch: 16786, Training Loss: 0.01316
Epoch: 16786, Training Loss: 0.01014
Epoch: 16787, Training Loss: 0.01153
Epoch: 16787, Training Loss: 0.01070
Epoch: 16787, Training Loss: 0.01316
Epoch: 16787, Training Loss: 0.01014
Epoch: 16788, Training Loss: 0.01153
Epoch: 16788, Training Loss: 0.01070
Epoch: 16788, Training Loss: 0.01316
Epoch: 16788, Training Loss: 0.01014
Epoch: 16789, Training Loss: 0.01153
Epoch: 16789, Training Loss: 0.01070
Epoch: 16789, Training Loss: 0.01316
Epoch: 16789, Training Loss: 0.01014
Epoch: 16790, Training Loss: 0.01153
Epoch: 16790, Training Loss: 0.01070
Epoch: 16790, Training Loss: 0.01316
Epoch: 16790, Training Loss: 0.01014
Epoch: 16791, Training Loss: 0.01153
Epoch: 16791, Training Loss: 0.01070
Epoch: 16791, Training Loss: 0.01316
Epoch: 16791, Training Loss: 0.01014
Epoch: 16792, Training Loss: 0.01153
Epoch: 16792, Training Loss: 0.01070
Epoch: 16792, Training Loss: 0.01316
Epoch: 16792, Training Loss: 0.01014
Epoch: 16793, Training Loss: 0.01153
Epoch: 16793, Training Loss: 0.01070
Epoch: 16793, Training Loss: 0.01316
Epoch: 16793, Training Loss: 0.01014
Epoch: 16794, Training Loss: 0.01153
Epoch: 16794, Training Loss: 0.01070
Epoch: 16794, Training Loss: 0.01316
Epoch: 16794, Training Loss: 0.01014
Epoch: 16795, Training Loss: 0.01153
Epoch: 16795, Training Loss: 0.01070
Epoch: 16795, Training Loss: 0.01316
Epoch: 16795, Training Loss: 0.01014
Epoch: 16796, Training Loss: 0.01152
Epoch: 16796, Training Loss: 0.01070
Epoch: 16796, Training Loss: 0.01316
Epoch: 16796, Training Loss: 0.01014
Epoch: 16797, Training Loss: 0.01152
Epoch: 16797, Training Loss: 0.01070
Epoch: 16797, Training Loss: 0.01316
Epoch: 16797, Training Loss: 0.01014
Epoch: 16798, Training Loss: 0.01152
Epoch: 16798, Training Loss: 0.01070
Epoch: 16798, Training Loss: 0.01316
Epoch: 16798, Training Loss: 0.01014
Epoch: 16799, Training Loss: 0.01152
Epoch: 16799, Training Loss: 0.01070
Epoch: 16799, Training Loss: 0.01316
Epoch: 16799, Training Loss: 0.01014
Epoch: 16800, Training Loss: 0.01152
Epoch: 16800, Training Loss: 0.01070
Epoch: 16800, Training Loss: 0.01316
Epoch: 16800, Training Loss: 0.01014
Epoch: 16801, Training Loss: 0.01152
Epoch: 16801, Training Loss: 0.01070
Epoch: 16801, Training Loss: 0.01315
Epoch: 16801, Training Loss: 0.01014
Epoch: 16802, Training Loss: 0.01152
Epoch: 16802, Training Loss: 0.01070
Epoch: 16802, Training Loss: 0.01315
Epoch: 16802, Training Loss: 0.01014
Epoch: 16803, Training Loss: 0.01152
Epoch: 16803, Training Loss: 0.01070
Epoch: 16803, Training Loss: 0.01315
Epoch: 16803, Training Loss: 0.01014
Epoch: 16804, Training Loss: 0.01152
Epoch: 16804, Training Loss: 0.01069
Epoch: 16804, Training Loss: 0.01315
Epoch: 16804, Training Loss: 0.01014
Epoch: 16805, Training Loss: 0.01152
Epoch: 16805, Training Loss: 0.01069
Epoch: 16805, Training Loss: 0.01315
Epoch: 16805, Training Loss: 0.01014
Epoch: 16806, Training Loss: 0.01152
Epoch: 16806, Training Loss: 0.01069
Epoch: 16806, Training Loss: 0.01315
Epoch: 16806, Training Loss: 0.01014
Epoch: 16807, Training Loss: 0.01152
Epoch: 16807, Training Loss: 0.01069
Epoch: 16807, Training Loss: 0.01315
Epoch: 16807, Training Loss: 0.01014
Epoch: 16808, Training Loss: 0.01152
Epoch: 16808, Training Loss: 0.01069
Epoch: 16808, Training Loss: 0.01315
Epoch: 16808, Training Loss: 0.01014
Epoch: 16809, Training Loss: 0.01152
Epoch: 16809, Training Loss: 0.01069
Epoch: 16809, Training Loss: 0.01315
Epoch: 16809, Training Loss: 0.01014
Epoch: 16810, Training Loss: 0.01152
Epoch: 16810, Training Loss: 0.01069
Epoch: 16810, Training Loss: 0.01315
Epoch: 16810, Training Loss: 0.01014
Epoch: 16811, Training Loss: 0.01152
Epoch: 16811, Training Loss: 0.01069
Epoch: 16811, Training Loss: 0.01315
Epoch: 16811, Training Loss: 0.01014
Epoch: 16812, Training Loss: 0.01152
Epoch: 16812, Training Loss: 0.01069
Epoch: 16812, Training Loss: 0.01315
Epoch: 16812, Training Loss: 0.01014
Epoch: 16813, Training Loss: 0.01152
Epoch: 16813, Training Loss: 0.01069
Epoch: 16813, Training Loss: 0.01315
Epoch: 16813, Training Loss: 0.01014
Epoch: 16814, Training Loss: 0.01152
Epoch: 16814, Training Loss: 0.01069
Epoch: 16814, Training Loss: 0.01315
Epoch: 16814, Training Loss: 0.01014
Epoch: 16815, Training Loss: 0.01152
Epoch: 16815, Training Loss: 0.01069
Epoch: 16815, Training Loss: 0.01315
Epoch: 16815, Training Loss: 0.01013
Epoch: 16816, Training Loss: 0.01152
Epoch: 16816, Training Loss: 0.01069
Epoch: 16816, Training Loss: 0.01315
Epoch: 16816, Training Loss: 0.01013
Epoch: 16817, Training Loss: 0.01152
Epoch: 16817, Training Loss: 0.01069
Epoch: 16817, Training Loss: 0.01315
Epoch: 16817, Training Loss: 0.01013
Epoch: 16818, Training Loss: 0.01152
Epoch: 16818, Training Loss: 0.01069
Epoch: 16818, Training Loss: 0.01315
Epoch: 16818, Training Loss: 0.01013
Epoch: 16819, Training Loss: 0.01152
Epoch: 16819, Training Loss: 0.01069
Epoch: 16819, Training Loss: 0.01315
Epoch: 16819, Training Loss: 0.01013
Epoch: 16820, Training Loss: 0.01152
Epoch: 16820, Training Loss: 0.01069
Epoch: 16820, Training Loss: 0.01315
Epoch: 16820, Training Loss: 0.01013
Epoch: 16821, Training Loss: 0.01152
Epoch: 16821, Training Loss: 0.01069
Epoch: 16821, Training Loss: 0.01315
Epoch: 16821, Training Loss: 0.01013
Epoch: 16822, Training Loss: 0.01151
Epoch: 16822, Training Loss: 0.01069
Epoch: 16822, Training Loss: 0.01315
Epoch: 16822, Training Loss: 0.01013
Epoch: 16823, Training Loss: 0.01151
Epoch: 16823, Training Loss: 0.01069
Epoch: 16823, Training Loss: 0.01315
Epoch: 16823, Training Loss: 0.01013
Epoch: 16824, Training Loss: 0.01151
Epoch: 16824, Training Loss: 0.01069
Epoch: 16824, Training Loss: 0.01314
Epoch: 16824, Training Loss: 0.01013
Epoch: 16825, Training Loss: 0.01151
Epoch: 16825, Training Loss: 0.01069
Epoch: 16825, Training Loss: 0.01314
Epoch: 16825, Training Loss: 0.01013
Epoch: 16826, Training Loss: 0.01151
Epoch: 16826, Training Loss: 0.01069
Epoch: 16826, Training Loss: 0.01314
Epoch: 16826, Training Loss: 0.01013
Epoch: 16827, Training Loss: 0.01151
Epoch: 16827, Training Loss: 0.01069
Epoch: 16827, Training Loss: 0.01314
Epoch: 16827, Training Loss: 0.01013
Epoch: 16828, Training Loss: 0.01151
Epoch: 16828, Training Loss: 0.01069
Epoch: 16828, Training Loss: 0.01314
Epoch: 16828, Training Loss: 0.01013
Epoch: 16829, Training Loss: 0.01151
Epoch: 16829, Training Loss: 0.01069
Epoch: 16829, Training Loss: 0.01314
Epoch: 16829, Training Loss: 0.01013
Epoch: 16830, Training Loss: 0.01151
Epoch: 16830, Training Loss: 0.01069
Epoch: 16830, Training Loss: 0.01314
Epoch: 16830, Training Loss: 0.01013
Epoch: 16831, Training Loss: 0.01151
Epoch: 16831, Training Loss: 0.01069
Epoch: 16831, Training Loss: 0.01314
Epoch: 16831, Training Loss: 0.01013
Epoch: 16832, Training Loss: 0.01151
Epoch: 16832, Training Loss: 0.01069
Epoch: 16832, Training Loss: 0.01314
Epoch: 16832, Training Loss: 0.01013
Epoch: 16833, Training Loss: 0.01151
Epoch: 16833, Training Loss: 0.01068
Epoch: 16833, Training Loss: 0.01314
Epoch: 16833, Training Loss: 0.01013
Epoch: 16834, Training Loss: 0.01151
Epoch: 16834, Training Loss: 0.01068
Epoch: 16834, Training Loss: 0.01314
Epoch: 16834, Training Loss: 0.01013
Epoch: 16835, Training Loss: 0.01151
Epoch: 16835, Training Loss: 0.01068
Epoch: 16835, Training Loss: 0.01314
Epoch: 16835, Training Loss: 0.01013
Epoch: 16836, Training Loss: 0.01151
Epoch: 16836, Training Loss: 0.01068
Epoch: 16836, Training Loss: 0.01314
Epoch: 16836, Training Loss: 0.01013
Epoch: 16837, Training Loss: 0.01151
Epoch: 16837, Training Loss: 0.01068
Epoch: 16837, Training Loss: 0.01314
Epoch: 16837, Training Loss: 0.01013
Epoch: 16838, Training Loss: 0.01151
Epoch: 16838, Training Loss: 0.01068
Epoch: 16838, Training Loss: 0.01314
Epoch: 16838, Training Loss: 0.01013
Epoch: 16839, Training Loss: 0.01151
Epoch: 16839, Training Loss: 0.01068
Epoch: 16839, Training Loss: 0.01314
Epoch: 16839, Training Loss: 0.01013
Epoch: 16840, Training Loss: 0.01151
Epoch: 16840, Training Loss: 0.01068
Epoch: 16840, Training Loss: 0.01314
Epoch: 16840, Training Loss: 0.01013
Epoch: 16841, Training Loss: 0.01151
Epoch: 16841, Training Loss: 0.01068
Epoch: 16841, Training Loss: 0.01314
Epoch: 16841, Training Loss: 0.01013
Epoch: 16842, Training Loss: 0.01151
Epoch: 16842, Training Loss: 0.01068
Epoch: 16842, Training Loss: 0.01314
Epoch: 16842, Training Loss: 0.01013
Epoch: 16843, Training Loss: 0.01151
Epoch: 16843, Training Loss: 0.01068
Epoch: 16843, Training Loss: 0.01314
Epoch: 16843, Training Loss: 0.01013
Epoch: 16844, Training Loss: 0.01151
Epoch: 16844, Training Loss: 0.01068
Epoch: 16844, Training Loss: 0.01314
Epoch: 16844, Training Loss: 0.01013
Epoch: 16845, Training Loss: 0.01151
Epoch: 16845, Training Loss: 0.01068
Epoch: 16845, Training Loss: 0.01314
Epoch: 16845, Training Loss: 0.01013
Epoch: 16846, Training Loss: 0.01151
Epoch: 16846, Training Loss: 0.01068
Epoch: 16846, Training Loss: 0.01314
Epoch: 16846, Training Loss: 0.01012
Epoch: 16847, Training Loss: 0.01151
Epoch: 16847, Training Loss: 0.01068
Epoch: 16847, Training Loss: 0.01314
Epoch: 16847, Training Loss: 0.01012
Epoch: 16848, Training Loss: 0.01151
Epoch: 16848, Training Loss: 0.01068
Epoch: 16848, Training Loss: 0.01313
Epoch: 16848, Training Loss: 0.01012
Epoch: 16849, Training Loss: 0.01150
Epoch: 16849, Training Loss: 0.01068
Epoch: 16849, Training Loss: 0.01313
Epoch: 16849, Training Loss: 0.01012
Epoch: 16850, Training Loss: 0.01150
Epoch: 16850, Training Loss: 0.01068
Epoch: 16850, Training Loss: 0.01313
Epoch: 16850, Training Loss: 0.01012
Epoch: 16851, Training Loss: 0.01150
Epoch: 16851, Training Loss: 0.01068
Epoch: 16851, Training Loss: 0.01313
Epoch: 16851, Training Loss: 0.01012
Epoch: 16852, Training Loss: 0.01150
Epoch: 16852, Training Loss: 0.01068
Epoch: 16852, Training Loss: 0.01313
Epoch: 16852, Training Loss: 0.01012
Epoch: 16853, Training Loss: 0.01150
Epoch: 16853, Training Loss: 0.01068
Epoch: 16853, Training Loss: 0.01313
Epoch: 16853, Training Loss: 0.01012
Epoch: 16854, Training Loss: 0.01150
Epoch: 16854, Training Loss: 0.01068
Epoch: 16854, Training Loss: 0.01313
Epoch: 16854, Training Loss: 0.01012
Epoch: 16855, Training Loss: 0.01150
Epoch: 16855, Training Loss: 0.01068
Epoch: 16855, Training Loss: 0.01313
Epoch: 16855, Training Loss: 0.01012
Epoch: 16856, Training Loss: 0.01150
Epoch: 16856, Training Loss: 0.01068
Epoch: 16856, Training Loss: 0.01313
Epoch: 16856, Training Loss: 0.01012
Epoch: 16857, Training Loss: 0.01150
Epoch: 16857, Training Loss: 0.01068
Epoch: 16857, Training Loss: 0.01313
Epoch: 16857, Training Loss: 0.01012
Epoch: 16858, Training Loss: 0.01150
Epoch: 16858, Training Loss: 0.01068
Epoch: 16858, Training Loss: 0.01313
Epoch: 16858, Training Loss: 0.01012
Epoch: 16859, Training Loss: 0.01150
Epoch: 16859, Training Loss: 0.01068
Epoch: 16859, Training Loss: 0.01313
Epoch: 16859, Training Loss: 0.01012
Epoch: 16860, Training Loss: 0.01150
Epoch: 16860, Training Loss: 0.01068
Epoch: 16860, Training Loss: 0.01313
Epoch: 16860, Training Loss: 0.01012
Epoch: 16861, Training Loss: 0.01150
Epoch: 16861, Training Loss: 0.01068
Epoch: 16861, Training Loss: 0.01313
Epoch: 16861, Training Loss: 0.01012
Epoch: 16862, Training Loss: 0.01150
Epoch: 16862, Training Loss: 0.01068
Epoch: 16862, Training Loss: 0.01313
Epoch: 16862, Training Loss: 0.01012
Epoch: 16863, Training Loss: 0.01150
Epoch: 16863, Training Loss: 0.01067
Epoch: 16863, Training Loss: 0.01313
Epoch: 16863, Training Loss: 0.01012
Epoch: 16864, Training Loss: 0.01150
Epoch: 16864, Training Loss: 0.01067
Epoch: 16864, Training Loss: 0.01313
Epoch: 16864, Training Loss: 0.01012
Epoch: 16865, Training Loss: 0.01150
Epoch: 16865, Training Loss: 0.01067
Epoch: 16865, Training Loss: 0.01313
Epoch: 16865, Training Loss: 0.01012
Epoch: 16866, Training Loss: 0.01150
Epoch: 16866, Training Loss: 0.01067
Epoch: 16866, Training Loss: 0.01313
Epoch: 16866, Training Loss: 0.01012
Epoch: 16867, Training Loss: 0.01150
Epoch: 16867, Training Loss: 0.01067
Epoch: 16867, Training Loss: 0.01313
Epoch: 16867, Training Loss: 0.01012
Epoch: 16868, Training Loss: 0.01150
Epoch: 16868, Training Loss: 0.01067
Epoch: 16868, Training Loss: 0.01313
Epoch: 16868, Training Loss: 0.01012
Epoch: 16869, Training Loss: 0.01150
Epoch: 16869, Training Loss: 0.01067
Epoch: 16869, Training Loss: 0.01313
Epoch: 16869, Training Loss: 0.01012
Epoch: 16870, Training Loss: 0.01150
Epoch: 16870, Training Loss: 0.01067
Epoch: 16870, Training Loss: 0.01313
Epoch: 16870, Training Loss: 0.01012
Epoch: 16871, Training Loss: 0.01150
Epoch: 16871, Training Loss: 0.01067
Epoch: 16871, Training Loss: 0.01312
Epoch: 16871, Training Loss: 0.01012
Epoch: 16872, Training Loss: 0.01150
Epoch: 16872, Training Loss: 0.01067
Epoch: 16872, Training Loss: 0.01312
Epoch: 16872, Training Loss: 0.01012
Epoch: 16873, Training Loss: 0.01150
Epoch: 16873, Training Loss: 0.01067
Epoch: 16873, Training Loss: 0.01312
Epoch: 16873, Training Loss: 0.01012
Epoch: 16874, Training Loss: 0.01150
Epoch: 16874, Training Loss: 0.01067
Epoch: 16874, Training Loss: 0.01312
Epoch: 16874, Training Loss: 0.01012
Epoch: 16875, Training Loss: 0.01150
Epoch: 16875, Training Loss: 0.01067
Epoch: 16875, Training Loss: 0.01312
Epoch: 16875, Training Loss: 0.01012
Epoch: 16876, Training Loss: 0.01149
Epoch: 16876, Training Loss: 0.01067
Epoch: 16876, Training Loss: 0.01312
Epoch: 16876, Training Loss: 0.01012
Epoch: 16877, Training Loss: 0.01149
Epoch: 16877, Training Loss: 0.01067
Epoch: 16877, Training Loss: 0.01312
Epoch: 16877, Training Loss: 0.01011
Epoch: 16878, Training Loss: 0.01149
Epoch: 16878, Training Loss: 0.01067
Epoch: 16878, Training Loss: 0.01312
Epoch: 16878, Training Loss: 0.01011
Epoch: 16879, Training Loss: 0.01149
Epoch: 16879, Training Loss: 0.01067
Epoch: 16879, Training Loss: 0.01312
Epoch: 16879, Training Loss: 0.01011
Epoch: 16880, Training Loss: 0.01149
Epoch: 16880, Training Loss: 0.01067
Epoch: 16880, Training Loss: 0.01312
Epoch: 16880, Training Loss: 0.01011
Epoch: 16881, Training Loss: 0.01149
Epoch: 16881, Training Loss: 0.01067
Epoch: 16881, Training Loss: 0.01312
Epoch: 16881, Training Loss: 0.01011
Epoch: 16882, Training Loss: 0.01149
Epoch: 16882, Training Loss: 0.01067
Epoch: 16882, Training Loss: 0.01312
Epoch: 16882, Training Loss: 0.01011
Epoch: 16883, Training Loss: 0.01149
Epoch: 16883, Training Loss: 0.01067
Epoch: 16883, Training Loss: 0.01312
Epoch: 16883, Training Loss: 0.01011
Epoch: 16884, Training Loss: 0.01149
Epoch: 16884, Training Loss: 0.01067
Epoch: 16884, Training Loss: 0.01312
Epoch: 16884, Training Loss: 0.01011
Epoch: 16885, Training Loss: 0.01149
Epoch: 16885, Training Loss: 0.01067
Epoch: 16885, Training Loss: 0.01312
Epoch: 16885, Training Loss: 0.01011
Epoch: 16886, Training Loss: 0.01149
Epoch: 16886, Training Loss: 0.01067
Epoch: 16886, Training Loss: 0.01312
Epoch: 16886, Training Loss: 0.01011
Epoch: 16887, Training Loss: 0.01149
Epoch: 16887, Training Loss: 0.01067
Epoch: 16887, Training Loss: 0.01312
Epoch: 16887, Training Loss: 0.01011
Epoch: 16888, Training Loss: 0.01149
Epoch: 16888, Training Loss: 0.01067
Epoch: 16888, Training Loss: 0.01312
Epoch: 16888, Training Loss: 0.01011
Epoch: 16889, Training Loss: 0.01149
Epoch: 16889, Training Loss: 0.01067
Epoch: 16889, Training Loss: 0.01312
Epoch: 16889, Training Loss: 0.01011
Epoch: 16890, Training Loss: 0.01149
Epoch: 16890, Training Loss: 0.01067
Epoch: 16890, Training Loss: 0.01312
Epoch: 16890, Training Loss: 0.01011
Epoch: 16891, Training Loss: 0.01149
Epoch: 16891, Training Loss: 0.01067
Epoch: 16891, Training Loss: 0.01312
Epoch: 16891, Training Loss: 0.01011
Epoch: 16892, Training Loss: 0.01149
Epoch: 16892, Training Loss: 0.01067
Epoch: 16892, Training Loss: 0.01312
Epoch: 16892, Training Loss: 0.01011
Epoch: 16893, Training Loss: 0.01149
Epoch: 16893, Training Loss: 0.01066
Epoch: 16893, Training Loss: 0.01312
Epoch: 16893, Training Loss: 0.01011
Epoch: 16894, Training Loss: 0.01149
Epoch: 16894, Training Loss: 0.01066
Epoch: 16894, Training Loss: 0.01312
Epoch: 16894, Training Loss: 0.01011
Epoch: 16895, Training Loss: 0.01149
Epoch: 16895, Training Loss: 0.01066
Epoch: 16895, Training Loss: 0.01311
Epoch: 16895, Training Loss: 0.01011
Epoch: 16896, Training Loss: 0.01149
Epoch: 16896, Training Loss: 0.01066
Epoch: 16896, Training Loss: 0.01311
Epoch: 16896, Training Loss: 0.01011
Epoch: 16897, Training Loss: 0.01149
Epoch: 16897, Training Loss: 0.01066
Epoch: 16897, Training Loss: 0.01311
Epoch: 16897, Training Loss: 0.01011
Epoch: 16898, Training Loss: 0.01149
Epoch: 16898, Training Loss: 0.01066
Epoch: 16898, Training Loss: 0.01311
Epoch: 16898, Training Loss: 0.01011
Epoch: 16899, Training Loss: 0.01149
Epoch: 16899, Training Loss: 0.01066
Epoch: 16899, Training Loss: 0.01311
Epoch: 16899, Training Loss: 0.01011
Epoch: 16900, Training Loss: 0.01149
Epoch: 16900, Training Loss: 0.01066
Epoch: 16900, Training Loss: 0.01311
Epoch: 16900, Training Loss: 0.01011
Epoch: 16901, Training Loss: 0.01149
Epoch: 16901, Training Loss: 0.01066
Epoch: 16901, Training Loss: 0.01311
Epoch: 16901, Training Loss: 0.01011
Epoch: 16902, Training Loss: 0.01149
Epoch: 16902, Training Loss: 0.01066
Epoch: 16902, Training Loss: 0.01311
Epoch: 16902, Training Loss: 0.01011
Epoch: 16903, Training Loss: 0.01148
Epoch: 16903, Training Loss: 0.01066
Epoch: 16903, Training Loss: 0.01311
Epoch: 16903, Training Loss: 0.01011
Epoch: 16904, Training Loss: 0.01148
Epoch: 16904, Training Loss: 0.01066
Epoch: 16904, Training Loss: 0.01311
Epoch: 16904, Training Loss: 0.01011
Epoch: 16905, Training Loss: 0.01148
Epoch: 16905, Training Loss: 0.01066
Epoch: 16905, Training Loss: 0.01311
Epoch: 16905, Training Loss: 0.01011
Epoch: 16906, Training Loss: 0.01148
Epoch: 16906, Training Loss: 0.01066
Epoch: 16906, Training Loss: 0.01311
Epoch: 16906, Training Loss: 0.01011
Epoch: 16907, Training Loss: 0.01148
Epoch: 16907, Training Loss: 0.01066
Epoch: 16907, Training Loss: 0.01311
Epoch: 16907, Training Loss: 0.01011
Epoch: 16908, Training Loss: 0.01148
Epoch: 16908, Training Loss: 0.01066
Epoch: 16908, Training Loss: 0.01311
Epoch: 16908, Training Loss: 0.01010
... [output truncated: one loss line per training sample per epoch; over epochs 16908-17151 the four per-sample losses decrease only very slowly, from about 0.01148 / 0.01066 / 0.01311 / 0.01010 to about 0.01139 / 0.01058 / 0.01301 / 0.01003] ...
Epoch: 17151, Training Loss: 0.01139
Epoch: 17151, Training Loss: 0.01058
Epoch: 17151, Training Loss: 0.01301
Epoch: 17151, Training Loss: 0.01003
Epoch: 17152, Training Loss: 0.01139
Epoch: 17152, Training Loss: 0.01058
Epoch: 17152, Training Loss: 0.01301
Epoch: 17152, Training Loss: 0.01003
Epoch: 17153, Training Loss: 0.01139
Epoch: 17153, Training Loss: 0.01058
Epoch: 17153, Training Loss: 0.01301
Epoch: 17153, Training Loss: 0.01003
Epoch: 17154, Training Loss: 0.01139
Epoch: 17154, Training Loss: 0.01058
Epoch: 17154, Training Loss: 0.01301
Epoch: 17154, Training Loss: 0.01003
Epoch: 17155, Training Loss: 0.01139
Epoch: 17155, Training Loss: 0.01058
Epoch: 17155, Training Loss: 0.01301
Epoch: 17155, Training Loss: 0.01003
Epoch: 17156, Training Loss: 0.01139
Epoch: 17156, Training Loss: 0.01058
Epoch: 17156, Training Loss: 0.01301
Epoch: 17156, Training Loss: 0.01003
Epoch: 17157, Training Loss: 0.01139
Epoch: 17157, Training Loss: 0.01058
Epoch: 17157, Training Loss: 0.01300
Epoch: 17157, Training Loss: 0.01003
Epoch: 17158, Training Loss: 0.01139
Epoch: 17158, Training Loss: 0.01058
Epoch: 17158, Training Loss: 0.01300
Epoch: 17158, Training Loss: 0.01003
Epoch: 17159, Training Loss: 0.01139
Epoch: 17159, Training Loss: 0.01058
Epoch: 17159, Training Loss: 0.01300
Epoch: 17159, Training Loss: 0.01003
Epoch: 17160, Training Loss: 0.01139
Epoch: 17160, Training Loss: 0.01058
Epoch: 17160, Training Loss: 0.01300
Epoch: 17160, Training Loss: 0.01002
Epoch: 17161, Training Loss: 0.01139
Epoch: 17161, Training Loss: 0.01058
Epoch: 17161, Training Loss: 0.01300
Epoch: 17161, Training Loss: 0.01002
Epoch: 17162, Training Loss: 0.01139
Epoch: 17162, Training Loss: 0.01058
Epoch: 17162, Training Loss: 0.01300
Epoch: 17162, Training Loss: 0.01002
Epoch: 17163, Training Loss: 0.01139
Epoch: 17163, Training Loss: 0.01057
Epoch: 17163, Training Loss: 0.01300
Epoch: 17163, Training Loss: 0.01002
Epoch: 17164, Training Loss: 0.01139
Epoch: 17164, Training Loss: 0.01057
Epoch: 17164, Training Loss: 0.01300
Epoch: 17164, Training Loss: 0.01002
Epoch: 17165, Training Loss: 0.01139
Epoch: 17165, Training Loss: 0.01057
Epoch: 17165, Training Loss: 0.01300
Epoch: 17165, Training Loss: 0.01002
Epoch: 17166, Training Loss: 0.01139
Epoch: 17166, Training Loss: 0.01057
Epoch: 17166, Training Loss: 0.01300
Epoch: 17166, Training Loss: 0.01002
Epoch: 17167, Training Loss: 0.01139
Epoch: 17167, Training Loss: 0.01057
Epoch: 17167, Training Loss: 0.01300
Epoch: 17167, Training Loss: 0.01002
Epoch: 17168, Training Loss: 0.01139
Epoch: 17168, Training Loss: 0.01057
Epoch: 17168, Training Loss: 0.01300
Epoch: 17168, Training Loss: 0.01002
Epoch: 17169, Training Loss: 0.01139
Epoch: 17169, Training Loss: 0.01057
Epoch: 17169, Training Loss: 0.01300
Epoch: 17169, Training Loss: 0.01002
Epoch: 17170, Training Loss: 0.01139
Epoch: 17170, Training Loss: 0.01057
Epoch: 17170, Training Loss: 0.01300
Epoch: 17170, Training Loss: 0.01002
Epoch: 17171, Training Loss: 0.01139
Epoch: 17171, Training Loss: 0.01057
Epoch: 17171, Training Loss: 0.01300
Epoch: 17171, Training Loss: 0.01002
Epoch: 17172, Training Loss: 0.01139
Epoch: 17172, Training Loss: 0.01057
Epoch: 17172, Training Loss: 0.01300
Epoch: 17172, Training Loss: 0.01002
Epoch: 17173, Training Loss: 0.01139
Epoch: 17173, Training Loss: 0.01057
Epoch: 17173, Training Loss: 0.01300
Epoch: 17173, Training Loss: 0.01002
Epoch: 17174, Training Loss: 0.01139
Epoch: 17174, Training Loss: 0.01057
Epoch: 17174, Training Loss: 0.01300
Epoch: 17174, Training Loss: 0.01002
Epoch: 17175, Training Loss: 0.01139
Epoch: 17175, Training Loss: 0.01057
Epoch: 17175, Training Loss: 0.01300
Epoch: 17175, Training Loss: 0.01002
Epoch: 17176, Training Loss: 0.01138
Epoch: 17176, Training Loss: 0.01057
Epoch: 17176, Training Loss: 0.01300
Epoch: 17176, Training Loss: 0.01002
Epoch: 17177, Training Loss: 0.01138
Epoch: 17177, Training Loss: 0.01057
Epoch: 17177, Training Loss: 0.01300
Epoch: 17177, Training Loss: 0.01002
Epoch: 17178, Training Loss: 0.01138
Epoch: 17178, Training Loss: 0.01057
Epoch: 17178, Training Loss: 0.01300
Epoch: 17178, Training Loss: 0.01002
Epoch: 17179, Training Loss: 0.01138
Epoch: 17179, Training Loss: 0.01057
Epoch: 17179, Training Loss: 0.01300
Epoch: 17179, Training Loss: 0.01002
Epoch: 17180, Training Loss: 0.01138
Epoch: 17180, Training Loss: 0.01057
Epoch: 17180, Training Loss: 0.01300
Epoch: 17180, Training Loss: 0.01002
Epoch: 17181, Training Loss: 0.01138
Epoch: 17181, Training Loss: 0.01057
Epoch: 17181, Training Loss: 0.01300
Epoch: 17181, Training Loss: 0.01002
Epoch: 17182, Training Loss: 0.01138
Epoch: 17182, Training Loss: 0.01057
Epoch: 17182, Training Loss: 0.01299
Epoch: 17182, Training Loss: 0.01002
Epoch: 17183, Training Loss: 0.01138
Epoch: 17183, Training Loss: 0.01057
Epoch: 17183, Training Loss: 0.01299
Epoch: 17183, Training Loss: 0.01002
Epoch: 17184, Training Loss: 0.01138
Epoch: 17184, Training Loss: 0.01057
Epoch: 17184, Training Loss: 0.01299
Epoch: 17184, Training Loss: 0.01002
Epoch: 17185, Training Loss: 0.01138
Epoch: 17185, Training Loss: 0.01057
Epoch: 17185, Training Loss: 0.01299
Epoch: 17185, Training Loss: 0.01002
Epoch: 17186, Training Loss: 0.01138
Epoch: 17186, Training Loss: 0.01057
Epoch: 17186, Training Loss: 0.01299
Epoch: 17186, Training Loss: 0.01002
Epoch: 17187, Training Loss: 0.01138
Epoch: 17187, Training Loss: 0.01057
Epoch: 17187, Training Loss: 0.01299
Epoch: 17187, Training Loss: 0.01002
Epoch: 17188, Training Loss: 0.01138
Epoch: 17188, Training Loss: 0.01057
Epoch: 17188, Training Loss: 0.01299
Epoch: 17188, Training Loss: 0.01002
Epoch: 17189, Training Loss: 0.01138
Epoch: 17189, Training Loss: 0.01057
Epoch: 17189, Training Loss: 0.01299
Epoch: 17189, Training Loss: 0.01002
Epoch: 17190, Training Loss: 0.01138
Epoch: 17190, Training Loss: 0.01057
Epoch: 17190, Training Loss: 0.01299
Epoch: 17190, Training Loss: 0.01002
Epoch: 17191, Training Loss: 0.01138
Epoch: 17191, Training Loss: 0.01057
Epoch: 17191, Training Loss: 0.01299
Epoch: 17191, Training Loss: 0.01002
Epoch: 17192, Training Loss: 0.01138
Epoch: 17192, Training Loss: 0.01057
Epoch: 17192, Training Loss: 0.01299
Epoch: 17192, Training Loss: 0.01001
Epoch: 17193, Training Loss: 0.01138
Epoch: 17193, Training Loss: 0.01056
Epoch: 17193, Training Loss: 0.01299
Epoch: 17193, Training Loss: 0.01001
Epoch: 17194, Training Loss: 0.01138
Epoch: 17194, Training Loss: 0.01056
Epoch: 17194, Training Loss: 0.01299
Epoch: 17194, Training Loss: 0.01001
Epoch: 17195, Training Loss: 0.01138
Epoch: 17195, Training Loss: 0.01056
Epoch: 17195, Training Loss: 0.01299
Epoch: 17195, Training Loss: 0.01001
Epoch: 17196, Training Loss: 0.01138
Epoch: 17196, Training Loss: 0.01056
Epoch: 17196, Training Loss: 0.01299
Epoch: 17196, Training Loss: 0.01001
Epoch: 17197, Training Loss: 0.01138
Epoch: 17197, Training Loss: 0.01056
Epoch: 17197, Training Loss: 0.01299
Epoch: 17197, Training Loss: 0.01001
Epoch: 17198, Training Loss: 0.01138
Epoch: 17198, Training Loss: 0.01056
Epoch: 17198, Training Loss: 0.01299
Epoch: 17198, Training Loss: 0.01001
Epoch: 17199, Training Loss: 0.01138
Epoch: 17199, Training Loss: 0.01056
Epoch: 17199, Training Loss: 0.01299
Epoch: 17199, Training Loss: 0.01001
Epoch: 17200, Training Loss: 0.01138
Epoch: 17200, Training Loss: 0.01056
Epoch: 17200, Training Loss: 0.01299
Epoch: 17200, Training Loss: 0.01001
Epoch: 17201, Training Loss: 0.01138
Epoch: 17201, Training Loss: 0.01056
Epoch: 17201, Training Loss: 0.01299
Epoch: 17201, Training Loss: 0.01001
Epoch: 17202, Training Loss: 0.01138
Epoch: 17202, Training Loss: 0.01056
Epoch: 17202, Training Loss: 0.01299
Epoch: 17202, Training Loss: 0.01001
Epoch: 17203, Training Loss: 0.01138
Epoch: 17203, Training Loss: 0.01056
Epoch: 17203, Training Loss: 0.01299
Epoch: 17203, Training Loss: 0.01001
Epoch: 17204, Training Loss: 0.01137
Epoch: 17204, Training Loss: 0.01056
Epoch: 17204, Training Loss: 0.01299
Epoch: 17204, Training Loss: 0.01001
Epoch: 17205, Training Loss: 0.01137
Epoch: 17205, Training Loss: 0.01056
Epoch: 17205, Training Loss: 0.01299
Epoch: 17205, Training Loss: 0.01001
Epoch: 17206, Training Loss: 0.01137
Epoch: 17206, Training Loss: 0.01056
Epoch: 17206, Training Loss: 0.01298
Epoch: 17206, Training Loss: 0.01001
Epoch: 17207, Training Loss: 0.01137
Epoch: 17207, Training Loss: 0.01056
Epoch: 17207, Training Loss: 0.01298
Epoch: 17207, Training Loss: 0.01001
Epoch: 17208, Training Loss: 0.01137
Epoch: 17208, Training Loss: 0.01056
Epoch: 17208, Training Loss: 0.01298
Epoch: 17208, Training Loss: 0.01001
Epoch: 17209, Training Loss: 0.01137
Epoch: 17209, Training Loss: 0.01056
Epoch: 17209, Training Loss: 0.01298
Epoch: 17209, Training Loss: 0.01001
Epoch: 17210, Training Loss: 0.01137
Epoch: 17210, Training Loss: 0.01056
Epoch: 17210, Training Loss: 0.01298
Epoch: 17210, Training Loss: 0.01001
Epoch: 17211, Training Loss: 0.01137
Epoch: 17211, Training Loss: 0.01056
Epoch: 17211, Training Loss: 0.01298
Epoch: 17211, Training Loss: 0.01001
Epoch: 17212, Training Loss: 0.01137
Epoch: 17212, Training Loss: 0.01056
Epoch: 17212, Training Loss: 0.01298
Epoch: 17212, Training Loss: 0.01001
Epoch: 17213, Training Loss: 0.01137
Epoch: 17213, Training Loss: 0.01056
Epoch: 17213, Training Loss: 0.01298
Epoch: 17213, Training Loss: 0.01001
Epoch: 17214, Training Loss: 0.01137
Epoch: 17214, Training Loss: 0.01056
Epoch: 17214, Training Loss: 0.01298
Epoch: 17214, Training Loss: 0.01001
Epoch: 17215, Training Loss: 0.01137
Epoch: 17215, Training Loss: 0.01056
Epoch: 17215, Training Loss: 0.01298
Epoch: 17215, Training Loss: 0.01001
Epoch: 17216, Training Loss: 0.01137
Epoch: 17216, Training Loss: 0.01056
Epoch: 17216, Training Loss: 0.01298
Epoch: 17216, Training Loss: 0.01001
Epoch: 17217, Training Loss: 0.01137
Epoch: 17217, Training Loss: 0.01056
Epoch: 17217, Training Loss: 0.01298
Epoch: 17217, Training Loss: 0.01001
Epoch: 17218, Training Loss: 0.01137
Epoch: 17218, Training Loss: 0.01056
Epoch: 17218, Training Loss: 0.01298
Epoch: 17218, Training Loss: 0.01001
Epoch: 17219, Training Loss: 0.01137
Epoch: 17219, Training Loss: 0.01056
Epoch: 17219, Training Loss: 0.01298
Epoch: 17219, Training Loss: 0.01001
Epoch: 17220, Training Loss: 0.01137
Epoch: 17220, Training Loss: 0.01056
Epoch: 17220, Training Loss: 0.01298
Epoch: 17220, Training Loss: 0.01001
Epoch: 17221, Training Loss: 0.01137
Epoch: 17221, Training Loss: 0.01056
Epoch: 17221, Training Loss: 0.01298
Epoch: 17221, Training Loss: 0.01001
Epoch: 17222, Training Loss: 0.01137
Epoch: 17222, Training Loss: 0.01056
Epoch: 17222, Training Loss: 0.01298
Epoch: 17222, Training Loss: 0.01001
Epoch: 17223, Training Loss: 0.01137
Epoch: 17223, Training Loss: 0.01056
Epoch: 17223, Training Loss: 0.01298
Epoch: 17223, Training Loss: 0.01001
Epoch: 17224, Training Loss: 0.01137
Epoch: 17224, Training Loss: 0.01055
Epoch: 17224, Training Loss: 0.01298
Epoch: 17224, Training Loss: 0.01000
Epoch: 17225, Training Loss: 0.01137
Epoch: 17225, Training Loss: 0.01055
Epoch: 17225, Training Loss: 0.01298
Epoch: 17225, Training Loss: 0.01000
Epoch: 17226, Training Loss: 0.01137
Epoch: 17226, Training Loss: 0.01055
Epoch: 17226, Training Loss: 0.01298
Epoch: 17226, Training Loss: 0.01000
Epoch: 17227, Training Loss: 0.01137
Epoch: 17227, Training Loss: 0.01055
Epoch: 17227, Training Loss: 0.01298
Epoch: 17227, Training Loss: 0.01000
Epoch: 17228, Training Loss: 0.01137
Epoch: 17228, Training Loss: 0.01055
Epoch: 17228, Training Loss: 0.01298
Epoch: 17228, Training Loss: 0.01000
Epoch: 17229, Training Loss: 0.01137
Epoch: 17229, Training Loss: 0.01055
Epoch: 17229, Training Loss: 0.01298
Epoch: 17229, Training Loss: 0.01000
Epoch: 17230, Training Loss: 0.01137
Epoch: 17230, Training Loss: 0.01055
Epoch: 17230, Training Loss: 0.01297
Epoch: 17230, Training Loss: 0.01000
Epoch: 17231, Training Loss: 0.01137
Epoch: 17231, Training Loss: 0.01055
Epoch: 17231, Training Loss: 0.01297
Epoch: 17231, Training Loss: 0.01000
Epoch: 17232, Training Loss: 0.01136
Epoch: 17232, Training Loss: 0.01055
Epoch: 17232, Training Loss: 0.01297
Epoch: 17232, Training Loss: 0.01000
Epoch: 17233, Training Loss: 0.01136
Epoch: 17233, Training Loss: 0.01055
Epoch: 17233, Training Loss: 0.01297
Epoch: 17233, Training Loss: 0.01000
Epoch: 17234, Training Loss: 0.01136
Epoch: 17234, Training Loss: 0.01055
Epoch: 17234, Training Loss: 0.01297
Epoch: 17234, Training Loss: 0.01000
Epoch: 17235, Training Loss: 0.01136
Epoch: 17235, Training Loss: 0.01055
Epoch: 17235, Training Loss: 0.01297
Epoch: 17235, Training Loss: 0.01000
Epoch: 17236, Training Loss: 0.01136
Epoch: 17236, Training Loss: 0.01055
Epoch: 17236, Training Loss: 0.01297
Epoch: 17236, Training Loss: 0.01000
Epoch: 17237, Training Loss: 0.01136
Epoch: 17237, Training Loss: 0.01055
Epoch: 17237, Training Loss: 0.01297
Epoch: 17237, Training Loss: 0.01000
Epoch: 17238, Training Loss: 0.01136
Epoch: 17238, Training Loss: 0.01055
Epoch: 17238, Training Loss: 0.01297
Epoch: 17238, Training Loss: 0.01000
Epoch: 17239, Training Loss: 0.01136
Epoch: 17239, Training Loss: 0.01055
Epoch: 17239, Training Loss: 0.01297
Epoch: 17239, Training Loss: 0.01000
Epoch: 17240, Training Loss: 0.01136
Epoch: 17240, Training Loss: 0.01055
Epoch: 17240, Training Loss: 0.01297
Epoch: 17240, Training Loss: 0.01000
Epoch: 17241, Training Loss: 0.01136
Epoch: 17241, Training Loss: 0.01055
Epoch: 17241, Training Loss: 0.01297
Epoch: 17241, Training Loss: 0.01000
Epoch: 17242, Training Loss: 0.01136
Epoch: 17242, Training Loss: 0.01055
Epoch: 17242, Training Loss: 0.01297
Epoch: 17242, Training Loss: 0.01000
Epoch: 17243, Training Loss: 0.01136
Epoch: 17243, Training Loss: 0.01055
Epoch: 17243, Training Loss: 0.01297
Epoch: 17243, Training Loss: 0.01000
Epoch: 17244, Training Loss: 0.01136
Epoch: 17244, Training Loss: 0.01055
Epoch: 17244, Training Loss: 0.01297
Epoch: 17244, Training Loss: 0.01000
Epoch: 17245, Training Loss: 0.01136
Epoch: 17245, Training Loss: 0.01055
Epoch: 17245, Training Loss: 0.01297
Epoch: 17245, Training Loss: 0.01000
Epoch: 17246, Training Loss: 0.01136
Epoch: 17246, Training Loss: 0.01055
Epoch: 17246, Training Loss: 0.01297
Epoch: 17246, Training Loss: 0.01000
Epoch: 17247, Training Loss: 0.01136
Epoch: 17247, Training Loss: 0.01055
Epoch: 17247, Training Loss: 0.01297
Epoch: 17247, Training Loss: 0.01000
Epoch: 17248, Training Loss: 0.01136
Epoch: 17248, Training Loss: 0.01055
Epoch: 17248, Training Loss: 0.01297
Epoch: 17248, Training Loss: 0.01000
Epoch: 17249, Training Loss: 0.01136
Epoch: 17249, Training Loss: 0.01055
Epoch: 17249, Training Loss: 0.01297
Epoch: 17249, Training Loss: 0.01000
Epoch: 17250, Training Loss: 0.01136
Epoch: 17250, Training Loss: 0.01055
Epoch: 17250, Training Loss: 0.01297
Epoch: 17250, Training Loss: 0.01000
Epoch: 17251, Training Loss: 0.01136
Epoch: 17251, Training Loss: 0.01055
Epoch: 17251, Training Loss: 0.01297
Epoch: 17251, Training Loss: 0.01000
Epoch: 17252, Training Loss: 0.01136
Epoch: 17252, Training Loss: 0.01055
Epoch: 17252, Training Loss: 0.01297
Epoch: 17252, Training Loss: 0.01000
Epoch: 17253, Training Loss: 0.01136
Epoch: 17253, Training Loss: 0.01055
Epoch: 17253, Training Loss: 0.01297
Epoch: 17253, Training Loss: 0.01000
Epoch: 17254, Training Loss: 0.01136
Epoch: 17254, Training Loss: 0.01055
Epoch: 17254, Training Loss: 0.01297
Epoch: 17254, Training Loss: 0.01000
Epoch: 17255, Training Loss: 0.01136
Epoch: 17255, Training Loss: 0.01054
Epoch: 17255, Training Loss: 0.01296
Epoch: 17255, Training Loss: 0.01000
Epoch: 17256, Training Loss: 0.01136
Epoch: 17256, Training Loss: 0.01054
Epoch: 17256, Training Loss: 0.01296
Epoch: 17256, Training Loss: 0.01000
Epoch: 17257, Training Loss: 0.01136
Epoch: 17257, Training Loss: 0.01054
Epoch: 17257, Training Loss: 0.01296
Epoch: 17257, Training Loss: 0.00999
Epoch: 17258, Training Loss: 0.01136
Epoch: 17258, Training Loss: 0.01054
Epoch: 17258, Training Loss: 0.01296
Epoch: 17258, Training Loss: 0.00999
Epoch: 17259, Training Loss: 0.01135
Epoch: 17259, Training Loss: 0.01054
Epoch: 17259, Training Loss: 0.01296
Epoch: 17259, Training Loss: 0.00999
Epoch: 17260, Training Loss: 0.01135
Epoch: 17260, Training Loss: 0.01054
Epoch: 17260, Training Loss: 0.01296
Epoch: 17260, Training Loss: 0.00999
Epoch: 17261, Training Loss: 0.01135
Epoch: 17261, Training Loss: 0.01054
Epoch: 17261, Training Loss: 0.01296
Epoch: 17261, Training Loss: 0.00999
Epoch: 17262, Training Loss: 0.01135
Epoch: 17262, Training Loss: 0.01054
Epoch: 17262, Training Loss: 0.01296
Epoch: 17262, Training Loss: 0.00999
Epoch: 17263, Training Loss: 0.01135
Epoch: 17263, Training Loss: 0.01054
Epoch: 17263, Training Loss: 0.01296
Epoch: 17263, Training Loss: 0.00999
Epoch: 17264, Training Loss: 0.01135
Epoch: 17264, Training Loss: 0.01054
Epoch: 17264, Training Loss: 0.01296
Epoch: 17264, Training Loss: 0.00999
Epoch: 17265, Training Loss: 0.01135
Epoch: 17265, Training Loss: 0.01054
Epoch: 17265, Training Loss: 0.01296
Epoch: 17265, Training Loss: 0.00999
Epoch: 17266, Training Loss: 0.01135
Epoch: 17266, Training Loss: 0.01054
Epoch: 17266, Training Loss: 0.01296
Epoch: 17266, Training Loss: 0.00999
Epoch: 17267, Training Loss: 0.01135
Epoch: 17267, Training Loss: 0.01054
Epoch: 17267, Training Loss: 0.01296
Epoch: 17267, Training Loss: 0.00999
Epoch: 17268, Training Loss: 0.01135
Epoch: 17268, Training Loss: 0.01054
Epoch: 17268, Training Loss: 0.01296
Epoch: 17268, Training Loss: 0.00999
Epoch: 17269, Training Loss: 0.01135
Epoch: 17269, Training Loss: 0.01054
Epoch: 17269, Training Loss: 0.01296
Epoch: 17269, Training Loss: 0.00999
Epoch: 17270, Training Loss: 0.01135
Epoch: 17270, Training Loss: 0.01054
Epoch: 17270, Training Loss: 0.01296
Epoch: 17270, Training Loss: 0.00999
Epoch: 17271, Training Loss: 0.01135
Epoch: 17271, Training Loss: 0.01054
Epoch: 17271, Training Loss: 0.01296
Epoch: 17271, Training Loss: 0.00999
Epoch: 17272, Training Loss: 0.01135
Epoch: 17272, Training Loss: 0.01054
Epoch: 17272, Training Loss: 0.01296
Epoch: 17272, Training Loss: 0.00999
Epoch: 17273, Training Loss: 0.01135
Epoch: 17273, Training Loss: 0.01054
Epoch: 17273, Training Loss: 0.01296
Epoch: 17273, Training Loss: 0.00999
Epoch: 17274, Training Loss: 0.01135
Epoch: 17274, Training Loss: 0.01054
Epoch: 17274, Training Loss: 0.01296
Epoch: 17274, Training Loss: 0.00999
Epoch: 17275, Training Loss: 0.01135
Epoch: 17275, Training Loss: 0.01054
Epoch: 17275, Training Loss: 0.01296
Epoch: 17275, Training Loss: 0.00999
Epoch: 17276, Training Loss: 0.01135
Epoch: 17276, Training Loss: 0.01054
Epoch: 17276, Training Loss: 0.01296
Epoch: 17276, Training Loss: 0.00999
Epoch: 17277, Training Loss: 0.01135
Epoch: 17277, Training Loss: 0.01054
Epoch: 17277, Training Loss: 0.01296
Epoch: 17277, Training Loss: 0.00999
Epoch: 17278, Training Loss: 0.01135
Epoch: 17278, Training Loss: 0.01054
Epoch: 17278, Training Loss: 0.01296
Epoch: 17278, Training Loss: 0.00999
Epoch: 17279, Training Loss: 0.01135
Epoch: 17279, Training Loss: 0.01054
Epoch: 17279, Training Loss: 0.01295
Epoch: 17279, Training Loss: 0.00999
Epoch: 17280, Training Loss: 0.01135
Epoch: 17280, Training Loss: 0.01054
Epoch: 17280, Training Loss: 0.01295
Epoch: 17280, Training Loss: 0.00999
Epoch: 17281, Training Loss: 0.01135
Epoch: 17281, Training Loss: 0.01054
Epoch: 17281, Training Loss: 0.01295
Epoch: 17281, Training Loss: 0.00999
Epoch: 17282, Training Loss: 0.01135
Epoch: 17282, Training Loss: 0.01054
Epoch: 17282, Training Loss: 0.01295
Epoch: 17282, Training Loss: 0.00999
Epoch: 17283, Training Loss: 0.01135
Epoch: 17283, Training Loss: 0.01054
Epoch: 17283, Training Loss: 0.01295
Epoch: 17283, Training Loss: 0.00999
Epoch: 17284, Training Loss: 0.01135
Epoch: 17284, Training Loss: 0.01054
Epoch: 17284, Training Loss: 0.01295
Epoch: 17284, Training Loss: 0.00999
Epoch: 17285, Training Loss: 0.01135
Epoch: 17285, Training Loss: 0.01053
Epoch: 17285, Training Loss: 0.01295
Epoch: 17285, Training Loss: 0.00999
Epoch: 17286, Training Loss: 0.01135
Epoch: 17286, Training Loss: 0.01053
Epoch: 17286, Training Loss: 0.01295
Epoch: 17286, Training Loss: 0.00999
Epoch: 17287, Training Loss: 0.01134
Epoch: 17287, Training Loss: 0.01053
Epoch: 17287, Training Loss: 0.01295
Epoch: 17287, Training Loss: 0.00999
Epoch: 17288, Training Loss: 0.01134
Epoch: 17288, Training Loss: 0.01053
Epoch: 17288, Training Loss: 0.01295
Epoch: 17288, Training Loss: 0.00999
Epoch: 17289, Training Loss: 0.01134
Epoch: 17289, Training Loss: 0.01053
Epoch: 17289, Training Loss: 0.01295
Epoch: 17289, Training Loss: 0.00998
Epoch: 17290, Training Loss: 0.01134
Epoch: 17290, Training Loss: 0.01053
Epoch: 17290, Training Loss: 0.01295
Epoch: 17290, Training Loss: 0.00998
Epoch: 17291, Training Loss: 0.01134
Epoch: 17291, Training Loss: 0.01053
Epoch: 17291, Training Loss: 0.01295
Epoch: 17291, Training Loss: 0.00998
Epoch: 17292, Training Loss: 0.01134
Epoch: 17292, Training Loss: 0.01053
Epoch: 17292, Training Loss: 0.01295
Epoch: 17292, Training Loss: 0.00998
Epoch: 17293, Training Loss: 0.01134
Epoch: 17293, Training Loss: 0.01053
Epoch: 17293, Training Loss: 0.01295
Epoch: 17293, Training Loss: 0.00998
Epoch: 17294, Training Loss: 0.01134
Epoch: 17294, Training Loss: 0.01053
Epoch: 17294, Training Loss: 0.01295
Epoch: 17294, Training Loss: 0.00998
Epoch: 17295, Training Loss: 0.01134
Epoch: 17295, Training Loss: 0.01053
Epoch: 17295, Training Loss: 0.01295
Epoch: 17295, Training Loss: 0.00998
Epoch: 17296, Training Loss: 0.01134
Epoch: 17296, Training Loss: 0.01053
Epoch: 17296, Training Loss: 0.01295
Epoch: 17296, Training Loss: 0.00998
Epoch: 17297, Training Loss: 0.01134
Epoch: 17297, Training Loss: 0.01053
Epoch: 17297, Training Loss: 0.01295
Epoch: 17297, Training Loss: 0.00998
Epoch: 17298, Training Loss: 0.01134
Epoch: 17298, Training Loss: 0.01053
Epoch: 17298, Training Loss: 0.01295
Epoch: 17298, Training Loss: 0.00998
Epoch: 17299, Training Loss: 0.01134
Epoch: 17299, Training Loss: 0.01053
Epoch: 17299, Training Loss: 0.01295
Epoch: 17299, Training Loss: 0.00998
Epoch: 17300, Training Loss: 0.01134
Epoch: 17300, Training Loss: 0.01053
Epoch: 17300, Training Loss: 0.01295
Epoch: 17300, Training Loss: 0.00998
Epoch: 17301, Training Loss: 0.01134
Epoch: 17301, Training Loss: 0.01053
Epoch: 17301, Training Loss: 0.01295
Epoch: 17301, Training Loss: 0.00998
Epoch: 17302, Training Loss: 0.01134
Epoch: 17302, Training Loss: 0.01053
Epoch: 17302, Training Loss: 0.01295
Epoch: 17302, Training Loss: 0.00998
Epoch: 17303, Training Loss: 0.01134
Epoch: 17303, Training Loss: 0.01053
Epoch: 17303, Training Loss: 0.01295
Epoch: 17303, Training Loss: 0.00998
Epoch: 17304, Training Loss: 0.01134
Epoch: 17304, Training Loss: 0.01053
Epoch: 17304, Training Loss: 0.01294
Epoch: 17304, Training Loss: 0.00998
Epoch: 17305, Training Loss: 0.01134
Epoch: 17305, Training Loss: 0.01053
Epoch: 17305, Training Loss: 0.01294
Epoch: 17305, Training Loss: 0.00998
Epoch: 17306, Training Loss: 0.01134
Epoch: 17306, Training Loss: 0.01053
Epoch: 17306, Training Loss: 0.01294
Epoch: 17306, Training Loss: 0.00998
Epoch: 17307, Training Loss: 0.01134
Epoch: 17307, Training Loss: 0.01053
Epoch: 17307, Training Loss: 0.01294
Epoch: 17307, Training Loss: 0.00998
Epoch: 17308, Training Loss: 0.01134
Epoch: 17308, Training Loss: 0.01053
Epoch: 17308, Training Loss: 0.01294
Epoch: 17308, Training Loss: 0.00998
Epoch: 17309, Training Loss: 0.01134
Epoch: 17309, Training Loss: 0.01053
Epoch: 17309, Training Loss: 0.01294
Epoch: 17309, Training Loss: 0.00998
Epoch: 17310, Training Loss: 0.01134
Epoch: 17310, Training Loss: 0.01053
Epoch: 17310, Training Loss: 0.01294
Epoch: 17310, Training Loss: 0.00998
Epoch: 17311, Training Loss: 0.01134
Epoch: 17311, Training Loss: 0.01053
Epoch: 17311, Training Loss: 0.01294
Epoch: 17311, Training Loss: 0.00998
Epoch: 17312, Training Loss: 0.01134
Epoch: 17312, Training Loss: 0.01053
Epoch: 17312, Training Loss: 0.01294
Epoch: 17312, Training Loss: 0.00998
Epoch: 17313, Training Loss: 0.01134
Epoch: 17313, Training Loss: 0.01053
Epoch: 17313, Training Loss: 0.01294
Epoch: 17313, Training Loss: 0.00998
Epoch: 17314, Training Loss: 0.01134
Epoch: 17314, Training Loss: 0.01053
Epoch: 17314, Training Loss: 0.01294
Epoch: 17314, Training Loss: 0.00998
Epoch: 17315, Training Loss: 0.01133
Epoch: 17315, Training Loss: 0.01053
Epoch: 17315, Training Loss: 0.01294
Epoch: 17315, Training Loss: 0.00998
Epoch: 17316, Training Loss: 0.01133
Epoch: 17316, Training Loss: 0.01052
Epoch: 17316, Training Loss: 0.01294
Epoch: 17316, Training Loss: 0.00998
Epoch: 17317, Training Loss: 0.01133
Epoch: 17317, Training Loss: 0.01052
Epoch: 17317, Training Loss: 0.01294
Epoch: 17317, Training Loss: 0.00998
Epoch: 17318, Training Loss: 0.01133
Epoch: 17318, Training Loss: 0.01052
Epoch: 17318, Training Loss: 0.01294
Epoch: 17318, Training Loss: 0.00998
Epoch: 17319, Training Loss: 0.01133
Epoch: 17319, Training Loss: 0.01052
Epoch: 17319, Training Loss: 0.01294
Epoch: 17319, Training Loss: 0.00998
Epoch: 17320, Training Loss: 0.01133
Epoch: 17320, Training Loss: 0.01052
Epoch: 17320, Training Loss: 0.01294
Epoch: 17320, Training Loss: 0.00998
Epoch: 17321, Training Loss: 0.01133
Epoch: 17321, Training Loss: 0.01052
Epoch: 17321, Training Loss: 0.01294
Epoch: 17321, Training Loss: 0.00997
Epoch: 17322, Training Loss: 0.01133
Epoch: 17322, Training Loss: 0.01052
Epoch: 17322, Training Loss: 0.01294
Epoch: 17322, Training Loss: 0.00997
Epoch: 17323, Training Loss: 0.01133
Epoch: 17323, Training Loss: 0.01052
Epoch: 17323, Training Loss: 0.01294
Epoch: 17323, Training Loss: 0.00997
Epoch: 17324, Training Loss: 0.01133
Epoch: 17324, Training Loss: 0.01052
Epoch: 17324, Training Loss: 0.01294
Epoch: 17324, Training Loss: 0.00997
Epoch: 17325, Training Loss: 0.01133
Epoch: 17325, Training Loss: 0.01052
Epoch: 17325, Training Loss: 0.01294
Epoch: 17325, Training Loss: 0.00997
Epoch: 17326, Training Loss: 0.01133
Epoch: 17326, Training Loss: 0.01052
Epoch: 17326, Training Loss: 0.01294
Epoch: 17326, Training Loss: 0.00997
Epoch: 17327, Training Loss: 0.01133
Epoch: 17327, Training Loss: 0.01052
Epoch: 17327, Training Loss: 0.01294
Epoch: 17327, Training Loss: 0.00997
Epoch: 17328, Training Loss: 0.01133
Epoch: 17328, Training Loss: 0.01052
Epoch: 17328, Training Loss: 0.01293
Epoch: 17328, Training Loss: 0.00997
Epoch: 17329, Training Loss: 0.01133
Epoch: 17329, Training Loss: 0.01052
Epoch: 17329, Training Loss: 0.01293
Epoch: 17329, Training Loss: 0.00997
Epoch: 17330, Training Loss: 0.01133
Epoch: 17330, Training Loss: 0.01052
Epoch: 17330, Training Loss: 0.01293
Epoch: 17330, Training Loss: 0.00997
Epoch: 17331, Training Loss: 0.01133
Epoch: 17331, Training Loss: 0.01052
Epoch: 17331, Training Loss: 0.01293
Epoch: 17331, Training Loss: 0.00997
Epoch: 17332, Training Loss: 0.01133
Epoch: 17332, Training Loss: 0.01052
Epoch: 17332, Training Loss: 0.01293
Epoch: 17332, Training Loss: 0.00997
Epoch: 17333, Training Loss: 0.01133
Epoch: 17333, Training Loss: 0.01052
Epoch: 17333, Training Loss: 0.01293
Epoch: 17333, Training Loss: 0.00997
Epoch: 17334, Training Loss: 0.01133
Epoch: 17334, Training Loss: 0.01052
Epoch: 17334, Training Loss: 0.01293
Epoch: 17334, Training Loss: 0.00997
Epoch: 17335, Training Loss: 0.01133
Epoch: 17335, Training Loss: 0.01052
Epoch: 17335, Training Loss: 0.01293
Epoch: 17335, Training Loss: 0.00997
Epoch: 17336, Training Loss: 0.01133
Epoch: 17336, Training Loss: 0.01052
Epoch: 17336, Training Loss: 0.01293
Epoch: 17336, Training Loss: 0.00997
Epoch: 17337, Training Loss: 0.01133
Epoch: 17337, Training Loss: 0.01052
Epoch: 17337, Training Loss: 0.01293
Epoch: 17337, Training Loss: 0.00997
Epoch: 17338, Training Loss: 0.01133
Epoch: 17338, Training Loss: 0.01052
Epoch: 17338, Training Loss: 0.01293
Epoch: 17338, Training Loss: 0.00997
Epoch: 17339, Training Loss: 0.01133
Epoch: 17339, Training Loss: 0.01052
Epoch: 17339, Training Loss: 0.01293
Epoch: 17339, Training Loss: 0.00997
Epoch: 17340, Training Loss: 0.01133
Epoch: 17340, Training Loss: 0.01052
Epoch: 17340, Training Loss: 0.01293
...
Epoch: 17584, Training Loss: 0.01124
(repetitive output condensed: each epoch prints four losses, one per XOR input pattern; over epochs 17340-17584 they decrease gradually from about 0.01293 / 0.01133 / 0.01052 / 0.00997 to about 0.01283 / 0.01124 / 0.01044 / 0.00990)
Epoch: 17584, Training Loss: 0.01044
Epoch: 17584, Training Loss: 0.01283
Epoch: 17584, Training Loss: 0.00989
Epoch: 17585, Training Loss: 0.01124
Epoch: 17585, Training Loss: 0.01044
Epoch: 17585, Training Loss: 0.01283
Epoch: 17585, Training Loss: 0.00989
Epoch: 17586, Training Loss: 0.01124
Epoch: 17586, Training Loss: 0.01044
Epoch: 17586, Training Loss: 0.01283
Epoch: 17586, Training Loss: 0.00989
Epoch: 17587, Training Loss: 0.01124
Epoch: 17587, Training Loss: 0.01044
Epoch: 17587, Training Loss: 0.01283
Epoch: 17587, Training Loss: 0.00989
Epoch: 17588, Training Loss: 0.01124
Epoch: 17588, Training Loss: 0.01044
Epoch: 17588, Training Loss: 0.01283
Epoch: 17588, Training Loss: 0.00989
Epoch: 17589, Training Loss: 0.01124
Epoch: 17589, Training Loss: 0.01044
Epoch: 17589, Training Loss: 0.01283
Epoch: 17589, Training Loss: 0.00989
Epoch: 17590, Training Loss: 0.01124
Epoch: 17590, Training Loss: 0.01044
Epoch: 17590, Training Loss: 0.01283
Epoch: 17590, Training Loss: 0.00989
Epoch: 17591, Training Loss: 0.01124
Epoch: 17591, Training Loss: 0.01044
Epoch: 17591, Training Loss: 0.01283
Epoch: 17591, Training Loss: 0.00989
Epoch: 17592, Training Loss: 0.01124
Epoch: 17592, Training Loss: 0.01044
Epoch: 17592, Training Loss: 0.01283
Epoch: 17592, Training Loss: 0.00989
Epoch: 17593, Training Loss: 0.01124
Epoch: 17593, Training Loss: 0.01044
Epoch: 17593, Training Loss: 0.01283
Epoch: 17593, Training Loss: 0.00989
Epoch: 17594, Training Loss: 0.01124
Epoch: 17594, Training Loss: 0.01044
Epoch: 17594, Training Loss: 0.01283
Epoch: 17594, Training Loss: 0.00989
Epoch: 17595, Training Loss: 0.01124
Epoch: 17595, Training Loss: 0.01044
Epoch: 17595, Training Loss: 0.01283
Epoch: 17595, Training Loss: 0.00989
Epoch: 17596, Training Loss: 0.01124
Epoch: 17596, Training Loss: 0.01044
Epoch: 17596, Training Loss: 0.01283
Epoch: 17596, Training Loss: 0.00989
Epoch: 17597, Training Loss: 0.01124
Epoch: 17597, Training Loss: 0.01043
Epoch: 17597, Training Loss: 0.01283
Epoch: 17597, Training Loss: 0.00989
Epoch: 17598, Training Loss: 0.01124
Epoch: 17598, Training Loss: 0.01043
Epoch: 17598, Training Loss: 0.01283
Epoch: 17598, Training Loss: 0.00989
Epoch: 17599, Training Loss: 0.01123
Epoch: 17599, Training Loss: 0.01043
Epoch: 17599, Training Loss: 0.01283
Epoch: 17599, Training Loss: 0.00989
Epoch: 17600, Training Loss: 0.01123
Epoch: 17600, Training Loss: 0.01043
Epoch: 17600, Training Loss: 0.01283
Epoch: 17600, Training Loss: 0.00989
Epoch: 17601, Training Loss: 0.01123
Epoch: 17601, Training Loss: 0.01043
Epoch: 17601, Training Loss: 0.01283
Epoch: 17601, Training Loss: 0.00989
Epoch: 17602, Training Loss: 0.01123
Epoch: 17602, Training Loss: 0.01043
Epoch: 17602, Training Loss: 0.01282
Epoch: 17602, Training Loss: 0.00989
Epoch: 17603, Training Loss: 0.01123
Epoch: 17603, Training Loss: 0.01043
Epoch: 17603, Training Loss: 0.01282
Epoch: 17603, Training Loss: 0.00989
Epoch: 17604, Training Loss: 0.01123
Epoch: 17604, Training Loss: 0.01043
Epoch: 17604, Training Loss: 0.01282
Epoch: 17604, Training Loss: 0.00989
Epoch: 17605, Training Loss: 0.01123
Epoch: 17605, Training Loss: 0.01043
Epoch: 17605, Training Loss: 0.01282
Epoch: 17605, Training Loss: 0.00989
Epoch: 17606, Training Loss: 0.01123
Epoch: 17606, Training Loss: 0.01043
Epoch: 17606, Training Loss: 0.01282
Epoch: 17606, Training Loss: 0.00989
Epoch: 17607, Training Loss: 0.01123
Epoch: 17607, Training Loss: 0.01043
Epoch: 17607, Training Loss: 0.01282
Epoch: 17607, Training Loss: 0.00989
Epoch: 17608, Training Loss: 0.01123
Epoch: 17608, Training Loss: 0.01043
Epoch: 17608, Training Loss: 0.01282
Epoch: 17608, Training Loss: 0.00989
Epoch: 17609, Training Loss: 0.01123
Epoch: 17609, Training Loss: 0.01043
Epoch: 17609, Training Loss: 0.01282
Epoch: 17609, Training Loss: 0.00989
Epoch: 17610, Training Loss: 0.01123
Epoch: 17610, Training Loss: 0.01043
Epoch: 17610, Training Loss: 0.01282
Epoch: 17610, Training Loss: 0.00989
Epoch: 17611, Training Loss: 0.01123
Epoch: 17611, Training Loss: 0.01043
Epoch: 17611, Training Loss: 0.01282
Epoch: 17611, Training Loss: 0.00989
Epoch: 17612, Training Loss: 0.01123
Epoch: 17612, Training Loss: 0.01043
Epoch: 17612, Training Loss: 0.01282
Epoch: 17612, Training Loss: 0.00989
Epoch: 17613, Training Loss: 0.01123
Epoch: 17613, Training Loss: 0.01043
Epoch: 17613, Training Loss: 0.01282
Epoch: 17613, Training Loss: 0.00989
Epoch: 17614, Training Loss: 0.01123
Epoch: 17614, Training Loss: 0.01043
Epoch: 17614, Training Loss: 0.01282
Epoch: 17614, Training Loss: 0.00989
Epoch: 17615, Training Loss: 0.01123
Epoch: 17615, Training Loss: 0.01043
Epoch: 17615, Training Loss: 0.01282
Epoch: 17615, Training Loss: 0.00989
Epoch: 17616, Training Loss: 0.01123
Epoch: 17616, Training Loss: 0.01043
Epoch: 17616, Training Loss: 0.01282
Epoch: 17616, Training Loss: 0.00989
Epoch: 17617, Training Loss: 0.01123
Epoch: 17617, Training Loss: 0.01043
Epoch: 17617, Training Loss: 0.01282
Epoch: 17617, Training Loss: 0.00988
Epoch: 17618, Training Loss: 0.01123
Epoch: 17618, Training Loss: 0.01043
Epoch: 17618, Training Loss: 0.01282
Epoch: 17618, Training Loss: 0.00988
Epoch: 17619, Training Loss: 0.01123
Epoch: 17619, Training Loss: 0.01043
Epoch: 17619, Training Loss: 0.01282
Epoch: 17619, Training Loss: 0.00988
Epoch: 17620, Training Loss: 0.01123
Epoch: 17620, Training Loss: 0.01043
Epoch: 17620, Training Loss: 0.01282
Epoch: 17620, Training Loss: 0.00988
Epoch: 17621, Training Loss: 0.01123
Epoch: 17621, Training Loss: 0.01043
Epoch: 17621, Training Loss: 0.01282
Epoch: 17621, Training Loss: 0.00988
Epoch: 17622, Training Loss: 0.01123
Epoch: 17622, Training Loss: 0.01043
Epoch: 17622, Training Loss: 0.01282
Epoch: 17622, Training Loss: 0.00988
Epoch: 17623, Training Loss: 0.01123
Epoch: 17623, Training Loss: 0.01043
Epoch: 17623, Training Loss: 0.01282
Epoch: 17623, Training Loss: 0.00988
Epoch: 17624, Training Loss: 0.01123
Epoch: 17624, Training Loss: 0.01043
Epoch: 17624, Training Loss: 0.01282
Epoch: 17624, Training Loss: 0.00988
Epoch: 17625, Training Loss: 0.01123
Epoch: 17625, Training Loss: 0.01043
Epoch: 17625, Training Loss: 0.01282
Epoch: 17625, Training Loss: 0.00988
Epoch: 17626, Training Loss: 0.01123
Epoch: 17626, Training Loss: 0.01043
Epoch: 17626, Training Loss: 0.01282
Epoch: 17626, Training Loss: 0.00988
Epoch: 17627, Training Loss: 0.01123
Epoch: 17627, Training Loss: 0.01043
Epoch: 17627, Training Loss: 0.01281
Epoch: 17627, Training Loss: 0.00988
Epoch: 17628, Training Loss: 0.01122
Epoch: 17628, Training Loss: 0.01043
Epoch: 17628, Training Loss: 0.01281
Epoch: 17628, Training Loss: 0.00988
Epoch: 17629, Training Loss: 0.01122
Epoch: 17629, Training Loss: 0.01042
Epoch: 17629, Training Loss: 0.01281
Epoch: 17629, Training Loss: 0.00988
Epoch: 17630, Training Loss: 0.01122
Epoch: 17630, Training Loss: 0.01042
Epoch: 17630, Training Loss: 0.01281
Epoch: 17630, Training Loss: 0.00988
Epoch: 17631, Training Loss: 0.01122
Epoch: 17631, Training Loss: 0.01042
Epoch: 17631, Training Loss: 0.01281
Epoch: 17631, Training Loss: 0.00988
Epoch: 17632, Training Loss: 0.01122
Epoch: 17632, Training Loss: 0.01042
Epoch: 17632, Training Loss: 0.01281
Epoch: 17632, Training Loss: 0.00988
Epoch: 17633, Training Loss: 0.01122
Epoch: 17633, Training Loss: 0.01042
Epoch: 17633, Training Loss: 0.01281
Epoch: 17633, Training Loss: 0.00988
Epoch: 17634, Training Loss: 0.01122
Epoch: 17634, Training Loss: 0.01042
Epoch: 17634, Training Loss: 0.01281
Epoch: 17634, Training Loss: 0.00988
Epoch: 17635, Training Loss: 0.01122
Epoch: 17635, Training Loss: 0.01042
Epoch: 17635, Training Loss: 0.01281
Epoch: 17635, Training Loss: 0.00988
Epoch: 17636, Training Loss: 0.01122
Epoch: 17636, Training Loss: 0.01042
Epoch: 17636, Training Loss: 0.01281
Epoch: 17636, Training Loss: 0.00988
Epoch: 17637, Training Loss: 0.01122
Epoch: 17637, Training Loss: 0.01042
Epoch: 17637, Training Loss: 0.01281
Epoch: 17637, Training Loss: 0.00988
Epoch: 17638, Training Loss: 0.01122
Epoch: 17638, Training Loss: 0.01042
Epoch: 17638, Training Loss: 0.01281
Epoch: 17638, Training Loss: 0.00988
Epoch: 17639, Training Loss: 0.01122
Epoch: 17639, Training Loss: 0.01042
Epoch: 17639, Training Loss: 0.01281
Epoch: 17639, Training Loss: 0.00988
Epoch: 17640, Training Loss: 0.01122
Epoch: 17640, Training Loss: 0.01042
Epoch: 17640, Training Loss: 0.01281
Epoch: 17640, Training Loss: 0.00988
Epoch: 17641, Training Loss: 0.01122
Epoch: 17641, Training Loss: 0.01042
Epoch: 17641, Training Loss: 0.01281
Epoch: 17641, Training Loss: 0.00988
Epoch: 17642, Training Loss: 0.01122
Epoch: 17642, Training Loss: 0.01042
Epoch: 17642, Training Loss: 0.01281
Epoch: 17642, Training Loss: 0.00988
Epoch: 17643, Training Loss: 0.01122
Epoch: 17643, Training Loss: 0.01042
Epoch: 17643, Training Loss: 0.01281
Epoch: 17643, Training Loss: 0.00988
Epoch: 17644, Training Loss: 0.01122
Epoch: 17644, Training Loss: 0.01042
Epoch: 17644, Training Loss: 0.01281
Epoch: 17644, Training Loss: 0.00988
Epoch: 17645, Training Loss: 0.01122
Epoch: 17645, Training Loss: 0.01042
Epoch: 17645, Training Loss: 0.01281
Epoch: 17645, Training Loss: 0.00988
Epoch: 17646, Training Loss: 0.01122
Epoch: 17646, Training Loss: 0.01042
Epoch: 17646, Training Loss: 0.01281
Epoch: 17646, Training Loss: 0.00988
Epoch: 17647, Training Loss: 0.01122
Epoch: 17647, Training Loss: 0.01042
Epoch: 17647, Training Loss: 0.01281
Epoch: 17647, Training Loss: 0.00988
Epoch: 17648, Training Loss: 0.01122
Epoch: 17648, Training Loss: 0.01042
Epoch: 17648, Training Loss: 0.01281
Epoch: 17648, Training Loss: 0.00988
Epoch: 17649, Training Loss: 0.01122
Epoch: 17649, Training Loss: 0.01042
Epoch: 17649, Training Loss: 0.01281
Epoch: 17649, Training Loss: 0.00988
Epoch: 17650, Training Loss: 0.01122
Epoch: 17650, Training Loss: 0.01042
Epoch: 17650, Training Loss: 0.01281
Epoch: 17650, Training Loss: 0.00987
Epoch: 17651, Training Loss: 0.01122
Epoch: 17651, Training Loss: 0.01042
Epoch: 17651, Training Loss: 0.01281
Epoch: 17651, Training Loss: 0.00987
Epoch: 17652, Training Loss: 0.01122
Epoch: 17652, Training Loss: 0.01042
Epoch: 17652, Training Loss: 0.01280
Epoch: 17652, Training Loss: 0.00987
Epoch: 17653, Training Loss: 0.01122
Epoch: 17653, Training Loss: 0.01042
Epoch: 17653, Training Loss: 0.01280
Epoch: 17653, Training Loss: 0.00987
Epoch: 17654, Training Loss: 0.01122
Epoch: 17654, Training Loss: 0.01042
Epoch: 17654, Training Loss: 0.01280
Epoch: 17654, Training Loss: 0.00987
Epoch: 17655, Training Loss: 0.01122
Epoch: 17655, Training Loss: 0.01042
Epoch: 17655, Training Loss: 0.01280
Epoch: 17655, Training Loss: 0.00987
Epoch: 17656, Training Loss: 0.01122
Epoch: 17656, Training Loss: 0.01042
Epoch: 17656, Training Loss: 0.01280
Epoch: 17656, Training Loss: 0.00987
Epoch: 17657, Training Loss: 0.01121
Epoch: 17657, Training Loss: 0.01042
Epoch: 17657, Training Loss: 0.01280
Epoch: 17657, Training Loss: 0.00987
Epoch: 17658, Training Loss: 0.01121
Epoch: 17658, Training Loss: 0.01042
Epoch: 17658, Training Loss: 0.01280
Epoch: 17658, Training Loss: 0.00987
Epoch: 17659, Training Loss: 0.01121
Epoch: 17659, Training Loss: 0.01042
Epoch: 17659, Training Loss: 0.01280
Epoch: 17659, Training Loss: 0.00987
Epoch: 17660, Training Loss: 0.01121
Epoch: 17660, Training Loss: 0.01042
Epoch: 17660, Training Loss: 0.01280
Epoch: 17660, Training Loss: 0.00987
Epoch: 17661, Training Loss: 0.01121
Epoch: 17661, Training Loss: 0.01041
Epoch: 17661, Training Loss: 0.01280
Epoch: 17661, Training Loss: 0.00987
Epoch: 17662, Training Loss: 0.01121
Epoch: 17662, Training Loss: 0.01041
Epoch: 17662, Training Loss: 0.01280
Epoch: 17662, Training Loss: 0.00987
Epoch: 17663, Training Loss: 0.01121
Epoch: 17663, Training Loss: 0.01041
Epoch: 17663, Training Loss: 0.01280
Epoch: 17663, Training Loss: 0.00987
Epoch: 17664, Training Loss: 0.01121
Epoch: 17664, Training Loss: 0.01041
Epoch: 17664, Training Loss: 0.01280
Epoch: 17664, Training Loss: 0.00987
Epoch: 17665, Training Loss: 0.01121
Epoch: 17665, Training Loss: 0.01041
Epoch: 17665, Training Loss: 0.01280
Epoch: 17665, Training Loss: 0.00987
Epoch: 17666, Training Loss: 0.01121
Epoch: 17666, Training Loss: 0.01041
Epoch: 17666, Training Loss: 0.01280
Epoch: 17666, Training Loss: 0.00987
Epoch: 17667, Training Loss: 0.01121
Epoch: 17667, Training Loss: 0.01041
Epoch: 17667, Training Loss: 0.01280
Epoch: 17667, Training Loss: 0.00987
Epoch: 17668, Training Loss: 0.01121
Epoch: 17668, Training Loss: 0.01041
Epoch: 17668, Training Loss: 0.01280
Epoch: 17668, Training Loss: 0.00987
Epoch: 17669, Training Loss: 0.01121
Epoch: 17669, Training Loss: 0.01041
Epoch: 17669, Training Loss: 0.01280
Epoch: 17669, Training Loss: 0.00987
Epoch: 17670, Training Loss: 0.01121
Epoch: 17670, Training Loss: 0.01041
Epoch: 17670, Training Loss: 0.01280
Epoch: 17670, Training Loss: 0.00987
Epoch: 17671, Training Loss: 0.01121
Epoch: 17671, Training Loss: 0.01041
Epoch: 17671, Training Loss: 0.01280
Epoch: 17671, Training Loss: 0.00987
Epoch: 17672, Training Loss: 0.01121
Epoch: 17672, Training Loss: 0.01041
Epoch: 17672, Training Loss: 0.01280
Epoch: 17672, Training Loss: 0.00987
Epoch: 17673, Training Loss: 0.01121
Epoch: 17673, Training Loss: 0.01041
Epoch: 17673, Training Loss: 0.01280
Epoch: 17673, Training Loss: 0.00987
Epoch: 17674, Training Loss: 0.01121
Epoch: 17674, Training Loss: 0.01041
Epoch: 17674, Training Loss: 0.01280
Epoch: 17674, Training Loss: 0.00987
Epoch: 17675, Training Loss: 0.01121
Epoch: 17675, Training Loss: 0.01041
Epoch: 17675, Training Loss: 0.01280
Epoch: 17675, Training Loss: 0.00987
Epoch: 17676, Training Loss: 0.01121
Epoch: 17676, Training Loss: 0.01041
Epoch: 17676, Training Loss: 0.01280
Epoch: 17676, Training Loss: 0.00987
Epoch: 17677, Training Loss: 0.01121
Epoch: 17677, Training Loss: 0.01041
Epoch: 17677, Training Loss: 0.01280
Epoch: 17677, Training Loss: 0.00987
Epoch: 17678, Training Loss: 0.01121
Epoch: 17678, Training Loss: 0.01041
Epoch: 17678, Training Loss: 0.01279
Epoch: 17678, Training Loss: 0.00987
Epoch: 17679, Training Loss: 0.01121
Epoch: 17679, Training Loss: 0.01041
Epoch: 17679, Training Loss: 0.01279
Epoch: 17679, Training Loss: 0.00987
Epoch: 17680, Training Loss: 0.01121
Epoch: 17680, Training Loss: 0.01041
Epoch: 17680, Training Loss: 0.01279
Epoch: 17680, Training Loss: 0.00987
Epoch: 17681, Training Loss: 0.01121
Epoch: 17681, Training Loss: 0.01041
Epoch: 17681, Training Loss: 0.01279
Epoch: 17681, Training Loss: 0.00987
Epoch: 17682, Training Loss: 0.01121
Epoch: 17682, Training Loss: 0.01041
Epoch: 17682, Training Loss: 0.01279
Epoch: 17682, Training Loss: 0.00987
Epoch: 17683, Training Loss: 0.01121
Epoch: 17683, Training Loss: 0.01041
Epoch: 17683, Training Loss: 0.01279
Epoch: 17683, Training Loss: 0.00987
Epoch: 17684, Training Loss: 0.01121
Epoch: 17684, Training Loss: 0.01041
Epoch: 17684, Training Loss: 0.01279
Epoch: 17684, Training Loss: 0.00986
Epoch: 17685, Training Loss: 0.01121
Epoch: 17685, Training Loss: 0.01041
Epoch: 17685, Training Loss: 0.01279
Epoch: 17685, Training Loss: 0.00986
Epoch: 17686, Training Loss: 0.01120
Epoch: 17686, Training Loss: 0.01041
Epoch: 17686, Training Loss: 0.01279
Epoch: 17686, Training Loss: 0.00986
Epoch: 17687, Training Loss: 0.01120
Epoch: 17687, Training Loss: 0.01041
Epoch: 17687, Training Loss: 0.01279
Epoch: 17687, Training Loss: 0.00986
Epoch: 17688, Training Loss: 0.01120
Epoch: 17688, Training Loss: 0.01041
Epoch: 17688, Training Loss: 0.01279
Epoch: 17688, Training Loss: 0.00986
Epoch: 17689, Training Loss: 0.01120
Epoch: 17689, Training Loss: 0.01041
Epoch: 17689, Training Loss: 0.01279
Epoch: 17689, Training Loss: 0.00986
Epoch: 17690, Training Loss: 0.01120
Epoch: 17690, Training Loss: 0.01041
Epoch: 17690, Training Loss: 0.01279
Epoch: 17690, Training Loss: 0.00986
Epoch: 17691, Training Loss: 0.01120
Epoch: 17691, Training Loss: 0.01041
Epoch: 17691, Training Loss: 0.01279
Epoch: 17691, Training Loss: 0.00986
Epoch: 17692, Training Loss: 0.01120
Epoch: 17692, Training Loss: 0.01041
Epoch: 17692, Training Loss: 0.01279
Epoch: 17692, Training Loss: 0.00986
Epoch: 17693, Training Loss: 0.01120
Epoch: 17693, Training Loss: 0.01040
Epoch: 17693, Training Loss: 0.01279
Epoch: 17693, Training Loss: 0.00986
Epoch: 17694, Training Loss: 0.01120
Epoch: 17694, Training Loss: 0.01040
Epoch: 17694, Training Loss: 0.01279
Epoch: 17694, Training Loss: 0.00986
Epoch: 17695, Training Loss: 0.01120
Epoch: 17695, Training Loss: 0.01040
Epoch: 17695, Training Loss: 0.01279
Epoch: 17695, Training Loss: 0.00986
Epoch: 17696, Training Loss: 0.01120
Epoch: 17696, Training Loss: 0.01040
Epoch: 17696, Training Loss: 0.01279
Epoch: 17696, Training Loss: 0.00986
Epoch: 17697, Training Loss: 0.01120
Epoch: 17697, Training Loss: 0.01040
Epoch: 17697, Training Loss: 0.01279
Epoch: 17697, Training Loss: 0.00986
Epoch: 17698, Training Loss: 0.01120
Epoch: 17698, Training Loss: 0.01040
Epoch: 17698, Training Loss: 0.01279
Epoch: 17698, Training Loss: 0.00986
Epoch: 17699, Training Loss: 0.01120
Epoch: 17699, Training Loss: 0.01040
Epoch: 17699, Training Loss: 0.01279
Epoch: 17699, Training Loss: 0.00986
Epoch: 17700, Training Loss: 0.01120
Epoch: 17700, Training Loss: 0.01040
Epoch: 17700, Training Loss: 0.01279
Epoch: 17700, Training Loss: 0.00986
Epoch: 17701, Training Loss: 0.01120
Epoch: 17701, Training Loss: 0.01040
Epoch: 17701, Training Loss: 0.01279
Epoch: 17701, Training Loss: 0.00986
Epoch: 17702, Training Loss: 0.01120
Epoch: 17702, Training Loss: 0.01040
Epoch: 17702, Training Loss: 0.01279
Epoch: 17702, Training Loss: 0.00986
Epoch: 17703, Training Loss: 0.01120
Epoch: 17703, Training Loss: 0.01040
Epoch: 17703, Training Loss: 0.01278
Epoch: 17703, Training Loss: 0.00986
Epoch: 17704, Training Loss: 0.01120
Epoch: 17704, Training Loss: 0.01040
Epoch: 17704, Training Loss: 0.01278
Epoch: 17704, Training Loss: 0.00986
Epoch: 17705, Training Loss: 0.01120
Epoch: 17705, Training Loss: 0.01040
Epoch: 17705, Training Loss: 0.01278
Epoch: 17705, Training Loss: 0.00986
Epoch: 17706, Training Loss: 0.01120
Epoch: 17706, Training Loss: 0.01040
Epoch: 17706, Training Loss: 0.01278
Epoch: 17706, Training Loss: 0.00986
Epoch: 17707, Training Loss: 0.01120
Epoch: 17707, Training Loss: 0.01040
Epoch: 17707, Training Loss: 0.01278
Epoch: 17707, Training Loss: 0.00986
Epoch: 17708, Training Loss: 0.01120
Epoch: 17708, Training Loss: 0.01040
Epoch: 17708, Training Loss: 0.01278
Epoch: 17708, Training Loss: 0.00986
Epoch: 17709, Training Loss: 0.01120
Epoch: 17709, Training Loss: 0.01040
Epoch: 17709, Training Loss: 0.01278
Epoch: 17709, Training Loss: 0.00986
Epoch: 17710, Training Loss: 0.01120
Epoch: 17710, Training Loss: 0.01040
Epoch: 17710, Training Loss: 0.01278
Epoch: 17710, Training Loss: 0.00986
Epoch: 17711, Training Loss: 0.01120
Epoch: 17711, Training Loss: 0.01040
Epoch: 17711, Training Loss: 0.01278
Epoch: 17711, Training Loss: 0.00986
Epoch: 17712, Training Loss: 0.01120
Epoch: 17712, Training Loss: 0.01040
Epoch: 17712, Training Loss: 0.01278
Epoch: 17712, Training Loss: 0.00986
Epoch: 17713, Training Loss: 0.01120
Epoch: 17713, Training Loss: 0.01040
Epoch: 17713, Training Loss: 0.01278
Epoch: 17713, Training Loss: 0.00986
Epoch: 17714, Training Loss: 0.01120
Epoch: 17714, Training Loss: 0.01040
Epoch: 17714, Training Loss: 0.01278
Epoch: 17714, Training Loss: 0.00986
Epoch: 17715, Training Loss: 0.01119
Epoch: 17715, Training Loss: 0.01040
Epoch: 17715, Training Loss: 0.01278
Epoch: 17715, Training Loss: 0.00986
Epoch: 17716, Training Loss: 0.01119
Epoch: 17716, Training Loss: 0.01040
Epoch: 17716, Training Loss: 0.01278
Epoch: 17716, Training Loss: 0.00986
Epoch: 17717, Training Loss: 0.01119
Epoch: 17717, Training Loss: 0.01040
Epoch: 17717, Training Loss: 0.01278
Epoch: 17717, Training Loss: 0.00985
Epoch: 17718, Training Loss: 0.01119
Epoch: 17718, Training Loss: 0.01040
Epoch: 17718, Training Loss: 0.01278
Epoch: 17718, Training Loss: 0.00985
Epoch: 17719, Training Loss: 0.01119
Epoch: 17719, Training Loss: 0.01040
Epoch: 17719, Training Loss: 0.01278
Epoch: 17719, Training Loss: 0.00985
Epoch: 17720, Training Loss: 0.01119
Epoch: 17720, Training Loss: 0.01040
Epoch: 17720, Training Loss: 0.01278
Epoch: 17720, Training Loss: 0.00985
Epoch: 17721, Training Loss: 0.01119
Epoch: 17721, Training Loss: 0.01040
Epoch: 17721, Training Loss: 0.01278
Epoch: 17721, Training Loss: 0.00985
Epoch: 17722, Training Loss: 0.01119
Epoch: 17722, Training Loss: 0.01040
Epoch: 17722, Training Loss: 0.01278
Epoch: 17722, Training Loss: 0.00985
Epoch: 17723, Training Loss: 0.01119
Epoch: 17723, Training Loss: 0.01040
Epoch: 17723, Training Loss: 0.01278
Epoch: 17723, Training Loss: 0.00985
Epoch: 17724, Training Loss: 0.01119
Epoch: 17724, Training Loss: 0.01040
Epoch: 17724, Training Loss: 0.01278
Epoch: 17724, Training Loss: 0.00985
Epoch: 17725, Training Loss: 0.01119
Epoch: 17725, Training Loss: 0.01039
Epoch: 17725, Training Loss: 0.01278
Epoch: 17725, Training Loss: 0.00985
Epoch: 17726, Training Loss: 0.01119
Epoch: 17726, Training Loss: 0.01039
Epoch: 17726, Training Loss: 0.01278
Epoch: 17726, Training Loss: 0.00985
Epoch: 17727, Training Loss: 0.01119
Epoch: 17727, Training Loss: 0.01039
Epoch: 17727, Training Loss: 0.01278
Epoch: 17727, Training Loss: 0.00985
Epoch: 17728, Training Loss: 0.01119
Epoch: 17728, Training Loss: 0.01039
Epoch: 17728, Training Loss: 0.01277
Epoch: 17728, Training Loss: 0.00985
Epoch: 17729, Training Loss: 0.01119
Epoch: 17729, Training Loss: 0.01039
Epoch: 17729, Training Loss: 0.01277
Epoch: 17729, Training Loss: 0.00985
Epoch: 17730, Training Loss: 0.01119
Epoch: 17730, Training Loss: 0.01039
Epoch: 17730, Training Loss: 0.01277
Epoch: 17730, Training Loss: 0.00985
Epoch: 17731, Training Loss: 0.01119
Epoch: 17731, Training Loss: 0.01039
Epoch: 17731, Training Loss: 0.01277
Epoch: 17731, Training Loss: 0.00985
Epoch: 17732, Training Loss: 0.01119
Epoch: 17732, Training Loss: 0.01039
Epoch: 17732, Training Loss: 0.01277
Epoch: 17732, Training Loss: 0.00985
Epoch: 17733, Training Loss: 0.01119
Epoch: 17733, Training Loss: 0.01039
Epoch: 17733, Training Loss: 0.01277
Epoch: 17733, Training Loss: 0.00985
Epoch: 17734, Training Loss: 0.01119
Epoch: 17734, Training Loss: 0.01039
Epoch: 17734, Training Loss: 0.01277
Epoch: 17734, Training Loss: 0.00985
Epoch: 17735, Training Loss: 0.01119
Epoch: 17735, Training Loss: 0.01039
Epoch: 17735, Training Loss: 0.01277
Epoch: 17735, Training Loss: 0.00985
Epoch: 17736, Training Loss: 0.01119
Epoch: 17736, Training Loss: 0.01039
Epoch: 17736, Training Loss: 0.01277
Epoch: 17736, Training Loss: 0.00985
Epoch: 17737, Training Loss: 0.01119
Epoch: 17737, Training Loss: 0.01039
Epoch: 17737, Training Loss: 0.01277
Epoch: 17737, Training Loss: 0.00985
Epoch: 17738, Training Loss: 0.01119
Epoch: 17738, Training Loss: 0.01039
Epoch: 17738, Training Loss: 0.01277
Epoch: 17738, Training Loss: 0.00985
Epoch: 17739, Training Loss: 0.01119
Epoch: 17739, Training Loss: 0.01039
Epoch: 17739, Training Loss: 0.01277
Epoch: 17739, Training Loss: 0.00985
Epoch: 17740, Training Loss: 0.01119
Epoch: 17740, Training Loss: 0.01039
Epoch: 17740, Training Loss: 0.01277
Epoch: 17740, Training Loss: 0.00985
Epoch: 17741, Training Loss: 0.01119
Epoch: 17741, Training Loss: 0.01039
Epoch: 17741, Training Loss: 0.01277
Epoch: 17741, Training Loss: 0.00985
Epoch: 17742, Training Loss: 0.01119
Epoch: 17742, Training Loss: 0.01039
Epoch: 17742, Training Loss: 0.01277
Epoch: 17742, Training Loss: 0.00985
Epoch: 17743, Training Loss: 0.01119
Epoch: 17743, Training Loss: 0.01039
Epoch: 17743, Training Loss: 0.01277
Epoch: 17743, Training Loss: 0.00985
Epoch: 17744, Training Loss: 0.01118
Epoch: 17744, Training Loss: 0.01039
Epoch: 17744, Training Loss: 0.01277
Epoch: 17744, Training Loss: 0.00985
Epoch: 17745, Training Loss: 0.01118
Epoch: 17745, Training Loss: 0.01039
Epoch: 17745, Training Loss: 0.01277
Epoch: 17745, Training Loss: 0.00985
Epoch: 17746, Training Loss: 0.01118
Epoch: 17746, Training Loss: 0.01039
Epoch: 17746, Training Loss: 0.01277
Epoch: 17746, Training Loss: 0.00985
Epoch: 17747, Training Loss: 0.01118
Epoch: 17747, Training Loss: 0.01039
Epoch: 17747, Training Loss: 0.01277
Epoch: 17747, Training Loss: 0.00985
Epoch: 17748, Training Loss: 0.01118
Epoch: 17748, Training Loss: 0.01039
Epoch: 17748, Training Loss: 0.01277
Epoch: 17748, Training Loss: 0.00985
Epoch: 17749, Training Loss: 0.01118
Epoch: 17749, Training Loss: 0.01039
Epoch: 17749, Training Loss: 0.01277
Epoch: 17749, Training Loss: 0.00985
Epoch: 17750, Training Loss: 0.01118
Epoch: 17750, Training Loss: 0.01039
Epoch: 17750, Training Loss: 0.01277
Epoch: 17750, Training Loss: 0.00985
Epoch: 17751, Training Loss: 0.01118
Epoch: 17751, Training Loss: 0.01039
Epoch: 17751, Training Loss: 0.01277
Epoch: 17751, Training Loss: 0.00984
Epoch: 17752, Training Loss: 0.01118
Epoch: 17752, Training Loss: 0.01039
Epoch: 17752, Training Loss: 0.01277
Epoch: 17752, Training Loss: 0.00984
Epoch: 17753, Training Loss: 0.01118
Epoch: 17753, Training Loss: 0.01039
Epoch: 17753, Training Loss: 0.01277
Epoch: 17753, Training Loss: 0.00984
Epoch: 17754, Training Loss: 0.01118
Epoch: 17754, Training Loss: 0.01039
Epoch: 17754, Training Loss: 0.01276
Epoch: 17754, Training Loss: 0.00984
Epoch: 17755, Training Loss: 0.01118
Epoch: 17755, Training Loss: 0.01039
Epoch: 17755, Training Loss: 0.01276
Epoch: 17755, Training Loss: 0.00984
Epoch: 17756, Training Loss: 0.01118
Epoch: 17756, Training Loss: 0.01039
Epoch: 17756, Training Loss: 0.01276
Epoch: 17756, Training Loss: 0.00984
Epoch: 17757, Training Loss: 0.01118
Epoch: 17757, Training Loss: 0.01038
Epoch: 17757, Training Loss: 0.01276
Epoch: 17757, Training Loss: 0.00984
Epoch: 17758, Training Loss: 0.01118
Epoch: 17758, Training Loss: 0.01038
Epoch: 17758, Training Loss: 0.01276
Epoch: 17758, Training Loss: 0.00984
Epoch: 17759, Training Loss: 0.01118
Epoch: 17759, Training Loss: 0.01038
Epoch: 17759, Training Loss: 0.01276
Epoch: 17759, Training Loss: 0.00984
Epoch: 17760, Training Loss: 0.01118
Epoch: 17760, Training Loss: 0.01038
Epoch: 17760, Training Loss: 0.01276
Epoch: 17760, Training Loss: 0.00984
Epoch: 17761, Training Loss: 0.01118
Epoch: 17761, Training Loss: 0.01038
Epoch: 17761, Training Loss: 0.01276
Epoch: 17761, Training Loss: 0.00984
Epoch: 17762, Training Loss: 0.01118
Epoch: 17762, Training Loss: 0.01038
Epoch: 17762, Training Loss: 0.01276
Epoch: 17762, Training Loss: 0.00984
Epoch: 17763, Training Loss: 0.01118
Epoch: 17763, Training Loss: 0.01038
Epoch: 17763, Training Loss: 0.01276
Epoch: 17763, Training Loss: 0.00984
Epoch: 17764, Training Loss: 0.01118
Epoch: 17764, Training Loss: 0.01038
Epoch: 17764, Training Loss: 0.01276
Epoch: 17764, Training Loss: 0.00984
Epoch: 17765, Training Loss: 0.01118
Epoch: 17765, Training Loss: 0.01038
Epoch: 17765, Training Loss: 0.01276
Epoch: 17765, Training Loss: 0.00984
Epoch: 17766, Training Loss: 0.01118
Epoch: 17766, Training Loss: 0.01038
Epoch: 17766, Training Loss: 0.01276
Epoch: 17766, Training Loss: 0.00984
Epoch: 17767, Training Loss: 0.01118
Epoch: 17767, Training Loss: 0.01038
Epoch: 17767, Training Loss: 0.01276
Epoch: 17767, Training Loss: 0.00984
Epoch: 17768, Training Loss: 0.01118
Epoch: 17768, Training Loss: 0.01038
Epoch: 17768, Training Loss: 0.01276
Epoch: 17768, Training Loss: 0.00984
Epoch: 17769, Training Loss: 0.01118
Epoch: 17769, Training Loss: 0.01038
Epoch: 17769, Training Loss: 0.01276
Epoch: 17769, Training Loss: 0.00984
Epoch: 17770, Training Loss: 0.01118
Epoch: 17770, Training Loss: 0.01038
Epoch: 17770, Training Loss: 0.01276
Epoch: 17770, Training Loss: 0.00984
Epoch: 17771, Training Loss: 0.01118
Epoch: 17771, Training Loss: 0.01038
Epoch: 17771, Training Loss: 0.01276
Epoch: 17771, Training Loss: 0.00984
Epoch: 17772, Training Loss: 0.01118
Epoch: 17772, Training Loss: 0.01038
Epoch: 17772, Training Loss: 0.01276
Epoch: 17772, Training Loss: 0.00984
Epoch: 17773, Training Loss: 0.01117
Epoch: 17773, Training Loss: 0.01038
Epoch: 17773, Training Loss: 0.01276
Epoch: 17773, Training Loss: 0.00984
    ... [output truncated: the log repeats four lines per epoch, one per XOR training pattern, for epochs 17774–18015; the per-pattern losses decline only gradually, from roughly (0.01117, 0.01038, 0.01276, 0.00984) to (0.01109, 0.01031, 0.01266, 0.00977)] ...
Epoch: 18016, Training Loss: 0.01109
Epoch: 18016, Training Loss: 0.01031
Epoch: 18016, Training Loss: 0.01266
Epoch: 18016, Training Loss: 0.00977
Epoch: 18017, Training Loss: 0.01109
Epoch: 18017, Training Loss: 0.01030
Epoch: 18017, Training Loss: 0.01266
Epoch: 18017, Training Loss: 0.00977
Epoch: 18018, Training Loss: 0.01109
Epoch: 18018, Training Loss: 0.01030
Epoch: 18018, Training Loss: 0.01266
Epoch: 18018, Training Loss: 0.00977
Epoch: 18019, Training Loss: 0.01109
Epoch: 18019, Training Loss: 0.01030
Epoch: 18019, Training Loss: 0.01266
Epoch: 18019, Training Loss: 0.00977
Epoch: 18020, Training Loss: 0.01109
Epoch: 18020, Training Loss: 0.01030
Epoch: 18020, Training Loss: 0.01266
Epoch: 18020, Training Loss: 0.00977
Epoch: 18021, Training Loss: 0.01109
Epoch: 18021, Training Loss: 0.01030
Epoch: 18021, Training Loss: 0.01266
Epoch: 18021, Training Loss: 0.00977
Epoch: 18022, Training Loss: 0.01109
Epoch: 18022, Training Loss: 0.01030
Epoch: 18022, Training Loss: 0.01266
Epoch: 18022, Training Loss: 0.00977
Epoch: 18023, Training Loss: 0.01109
Epoch: 18023, Training Loss: 0.01030
Epoch: 18023, Training Loss: 0.01266
Epoch: 18023, Training Loss: 0.00977
Epoch: 18024, Training Loss: 0.01109
Epoch: 18024, Training Loss: 0.01030
Epoch: 18024, Training Loss: 0.01266
Epoch: 18024, Training Loss: 0.00976
Epoch: 18025, Training Loss: 0.01109
Epoch: 18025, Training Loss: 0.01030
Epoch: 18025, Training Loss: 0.01266
Epoch: 18025, Training Loss: 0.00976
Epoch: 18026, Training Loss: 0.01109
Epoch: 18026, Training Loss: 0.01030
Epoch: 18026, Training Loss: 0.01266
Epoch: 18026, Training Loss: 0.00976
Epoch: 18027, Training Loss: 0.01109
Epoch: 18027, Training Loss: 0.01030
Epoch: 18027, Training Loss: 0.01266
Epoch: 18027, Training Loss: 0.00976
Epoch: 18028, Training Loss: 0.01109
Epoch: 18028, Training Loss: 0.01030
Epoch: 18028, Training Loss: 0.01266
Epoch: 18028, Training Loss: 0.00976
Epoch: 18029, Training Loss: 0.01109
Epoch: 18029, Training Loss: 0.01030
Epoch: 18029, Training Loss: 0.01266
Epoch: 18029, Training Loss: 0.00976
Epoch: 18030, Training Loss: 0.01109
Epoch: 18030, Training Loss: 0.01030
Epoch: 18030, Training Loss: 0.01266
Epoch: 18030, Training Loss: 0.00976
Epoch: 18031, Training Loss: 0.01109
Epoch: 18031, Training Loss: 0.01030
Epoch: 18031, Training Loss: 0.01266
Epoch: 18031, Training Loss: 0.00976
Epoch: 18032, Training Loss: 0.01109
Epoch: 18032, Training Loss: 0.01030
Epoch: 18032, Training Loss: 0.01266
Epoch: 18032, Training Loss: 0.00976
Epoch: 18033, Training Loss: 0.01109
Epoch: 18033, Training Loss: 0.01030
Epoch: 18033, Training Loss: 0.01266
Epoch: 18033, Training Loss: 0.00976
Epoch: 18034, Training Loss: 0.01109
Epoch: 18034, Training Loss: 0.01030
Epoch: 18034, Training Loss: 0.01266
Epoch: 18034, Training Loss: 0.00976
Epoch: 18035, Training Loss: 0.01109
Epoch: 18035, Training Loss: 0.01030
Epoch: 18035, Training Loss: 0.01266
Epoch: 18035, Training Loss: 0.00976
Epoch: 18036, Training Loss: 0.01109
Epoch: 18036, Training Loss: 0.01030
Epoch: 18036, Training Loss: 0.01266
Epoch: 18036, Training Loss: 0.00976
Epoch: 18037, Training Loss: 0.01109
Epoch: 18037, Training Loss: 0.01030
Epoch: 18037, Training Loss: 0.01266
Epoch: 18037, Training Loss: 0.00976
Epoch: 18038, Training Loss: 0.01109
Epoch: 18038, Training Loss: 0.01030
Epoch: 18038, Training Loss: 0.01266
Epoch: 18038, Training Loss: 0.00976
Epoch: 18039, Training Loss: 0.01108
Epoch: 18039, Training Loss: 0.01030
Epoch: 18039, Training Loss: 0.01265
Epoch: 18039, Training Loss: 0.00976
Epoch: 18040, Training Loss: 0.01108
Epoch: 18040, Training Loss: 0.01030
Epoch: 18040, Training Loss: 0.01265
Epoch: 18040, Training Loss: 0.00976
Epoch: 18041, Training Loss: 0.01108
Epoch: 18041, Training Loss: 0.01030
Epoch: 18041, Training Loss: 0.01265
Epoch: 18041, Training Loss: 0.00976
Epoch: 18042, Training Loss: 0.01108
Epoch: 18042, Training Loss: 0.01030
Epoch: 18042, Training Loss: 0.01265
Epoch: 18042, Training Loss: 0.00976
Epoch: 18043, Training Loss: 0.01108
Epoch: 18043, Training Loss: 0.01030
Epoch: 18043, Training Loss: 0.01265
Epoch: 18043, Training Loss: 0.00976
Epoch: 18044, Training Loss: 0.01108
Epoch: 18044, Training Loss: 0.01030
Epoch: 18044, Training Loss: 0.01265
Epoch: 18044, Training Loss: 0.00976
Epoch: 18045, Training Loss: 0.01108
Epoch: 18045, Training Loss: 0.01030
Epoch: 18045, Training Loss: 0.01265
Epoch: 18045, Training Loss: 0.00976
Epoch: 18046, Training Loss: 0.01108
Epoch: 18046, Training Loss: 0.01030
Epoch: 18046, Training Loss: 0.01265
Epoch: 18046, Training Loss: 0.00976
Epoch: 18047, Training Loss: 0.01108
Epoch: 18047, Training Loss: 0.01030
Epoch: 18047, Training Loss: 0.01265
Epoch: 18047, Training Loss: 0.00976
Epoch: 18048, Training Loss: 0.01108
Epoch: 18048, Training Loss: 0.01030
Epoch: 18048, Training Loss: 0.01265
Epoch: 18048, Training Loss: 0.00976
Epoch: 18049, Training Loss: 0.01108
Epoch: 18049, Training Loss: 0.01029
Epoch: 18049, Training Loss: 0.01265
Epoch: 18049, Training Loss: 0.00976
Epoch: 18050, Training Loss: 0.01108
Epoch: 18050, Training Loss: 0.01029
Epoch: 18050, Training Loss: 0.01265
Epoch: 18050, Training Loss: 0.00976
Epoch: 18051, Training Loss: 0.01108
Epoch: 18051, Training Loss: 0.01029
Epoch: 18051, Training Loss: 0.01265
Epoch: 18051, Training Loss: 0.00976
Epoch: 18052, Training Loss: 0.01108
Epoch: 18052, Training Loss: 0.01029
Epoch: 18052, Training Loss: 0.01265
Epoch: 18052, Training Loss: 0.00976
Epoch: 18053, Training Loss: 0.01108
Epoch: 18053, Training Loss: 0.01029
Epoch: 18053, Training Loss: 0.01265
Epoch: 18053, Training Loss: 0.00976
Epoch: 18054, Training Loss: 0.01108
Epoch: 18054, Training Loss: 0.01029
Epoch: 18054, Training Loss: 0.01265
Epoch: 18054, Training Loss: 0.00976
Epoch: 18055, Training Loss: 0.01108
Epoch: 18055, Training Loss: 0.01029
Epoch: 18055, Training Loss: 0.01265
Epoch: 18055, Training Loss: 0.00976
Epoch: 18056, Training Loss: 0.01108
Epoch: 18056, Training Loss: 0.01029
Epoch: 18056, Training Loss: 0.01265
Epoch: 18056, Training Loss: 0.00976
Epoch: 18057, Training Loss: 0.01108
Epoch: 18057, Training Loss: 0.01029
Epoch: 18057, Training Loss: 0.01265
Epoch: 18057, Training Loss: 0.00976
Epoch: 18058, Training Loss: 0.01108
Epoch: 18058, Training Loss: 0.01029
Epoch: 18058, Training Loss: 0.01265
Epoch: 18058, Training Loss: 0.00975
Epoch: 18059, Training Loss: 0.01108
Epoch: 18059, Training Loss: 0.01029
Epoch: 18059, Training Loss: 0.01265
Epoch: 18059, Training Loss: 0.00975
Epoch: 18060, Training Loss: 0.01108
Epoch: 18060, Training Loss: 0.01029
Epoch: 18060, Training Loss: 0.01265
Epoch: 18060, Training Loss: 0.00975
Epoch: 18061, Training Loss: 0.01108
Epoch: 18061, Training Loss: 0.01029
Epoch: 18061, Training Loss: 0.01265
Epoch: 18061, Training Loss: 0.00975
Epoch: 18062, Training Loss: 0.01108
Epoch: 18062, Training Loss: 0.01029
Epoch: 18062, Training Loss: 0.01265
Epoch: 18062, Training Loss: 0.00975
Epoch: 18063, Training Loss: 0.01108
Epoch: 18063, Training Loss: 0.01029
Epoch: 18063, Training Loss: 0.01265
Epoch: 18063, Training Loss: 0.00975
Epoch: 18064, Training Loss: 0.01108
Epoch: 18064, Training Loss: 0.01029
Epoch: 18064, Training Loss: 0.01265
Epoch: 18064, Training Loss: 0.00975
Epoch: 18065, Training Loss: 0.01108
Epoch: 18065, Training Loss: 0.01029
Epoch: 18065, Training Loss: 0.01264
Epoch: 18065, Training Loss: 0.00975
Epoch: 18066, Training Loss: 0.01108
Epoch: 18066, Training Loss: 0.01029
Epoch: 18066, Training Loss: 0.01264
Epoch: 18066, Training Loss: 0.00975
Epoch: 18067, Training Loss: 0.01108
Epoch: 18067, Training Loss: 0.01029
Epoch: 18067, Training Loss: 0.01264
Epoch: 18067, Training Loss: 0.00975
Epoch: 18068, Training Loss: 0.01108
Epoch: 18068, Training Loss: 0.01029
Epoch: 18068, Training Loss: 0.01264
Epoch: 18068, Training Loss: 0.00975
Epoch: 18069, Training Loss: 0.01107
Epoch: 18069, Training Loss: 0.01029
Epoch: 18069, Training Loss: 0.01264
Epoch: 18069, Training Loss: 0.00975
Epoch: 18070, Training Loss: 0.01107
Epoch: 18070, Training Loss: 0.01029
Epoch: 18070, Training Loss: 0.01264
Epoch: 18070, Training Loss: 0.00975
Epoch: 18071, Training Loss: 0.01107
Epoch: 18071, Training Loss: 0.01029
Epoch: 18071, Training Loss: 0.01264
Epoch: 18071, Training Loss: 0.00975
Epoch: 18072, Training Loss: 0.01107
Epoch: 18072, Training Loss: 0.01029
Epoch: 18072, Training Loss: 0.01264
Epoch: 18072, Training Loss: 0.00975
Epoch: 18073, Training Loss: 0.01107
Epoch: 18073, Training Loss: 0.01029
Epoch: 18073, Training Loss: 0.01264
Epoch: 18073, Training Loss: 0.00975
Epoch: 18074, Training Loss: 0.01107
Epoch: 18074, Training Loss: 0.01029
Epoch: 18074, Training Loss: 0.01264
Epoch: 18074, Training Loss: 0.00975
Epoch: 18075, Training Loss: 0.01107
Epoch: 18075, Training Loss: 0.01029
Epoch: 18075, Training Loss: 0.01264
Epoch: 18075, Training Loss: 0.00975
Epoch: 18076, Training Loss: 0.01107
Epoch: 18076, Training Loss: 0.01029
Epoch: 18076, Training Loss: 0.01264
Epoch: 18076, Training Loss: 0.00975
Epoch: 18077, Training Loss: 0.01107
Epoch: 18077, Training Loss: 0.01029
Epoch: 18077, Training Loss: 0.01264
Epoch: 18077, Training Loss: 0.00975
Epoch: 18078, Training Loss: 0.01107
Epoch: 18078, Training Loss: 0.01029
Epoch: 18078, Training Loss: 0.01264
Epoch: 18078, Training Loss: 0.00975
Epoch: 18079, Training Loss: 0.01107
Epoch: 18079, Training Loss: 0.01029
Epoch: 18079, Training Loss: 0.01264
Epoch: 18079, Training Loss: 0.00975
Epoch: 18080, Training Loss: 0.01107
Epoch: 18080, Training Loss: 0.01029
Epoch: 18080, Training Loss: 0.01264
Epoch: 18080, Training Loss: 0.00975
Epoch: 18081, Training Loss: 0.01107
Epoch: 18081, Training Loss: 0.01029
Epoch: 18081, Training Loss: 0.01264
Epoch: 18081, Training Loss: 0.00975
Epoch: 18082, Training Loss: 0.01107
Epoch: 18082, Training Loss: 0.01028
Epoch: 18082, Training Loss: 0.01264
Epoch: 18082, Training Loss: 0.00975
Epoch: 18083, Training Loss: 0.01107
Epoch: 18083, Training Loss: 0.01028
Epoch: 18083, Training Loss: 0.01264
Epoch: 18083, Training Loss: 0.00975
Epoch: 18084, Training Loss: 0.01107
Epoch: 18084, Training Loss: 0.01028
Epoch: 18084, Training Loss: 0.01264
Epoch: 18084, Training Loss: 0.00975
Epoch: 18085, Training Loss: 0.01107
Epoch: 18085, Training Loss: 0.01028
Epoch: 18085, Training Loss: 0.01264
Epoch: 18085, Training Loss: 0.00975
Epoch: 18086, Training Loss: 0.01107
Epoch: 18086, Training Loss: 0.01028
Epoch: 18086, Training Loss: 0.01264
Epoch: 18086, Training Loss: 0.00975
Epoch: 18087, Training Loss: 0.01107
Epoch: 18087, Training Loss: 0.01028
Epoch: 18087, Training Loss: 0.01264
Epoch: 18087, Training Loss: 0.00975
Epoch: 18088, Training Loss: 0.01107
Epoch: 18088, Training Loss: 0.01028
Epoch: 18088, Training Loss: 0.01264
Epoch: 18088, Training Loss: 0.00975
Epoch: 18089, Training Loss: 0.01107
Epoch: 18089, Training Loss: 0.01028
Epoch: 18089, Training Loss: 0.01264
Epoch: 18089, Training Loss: 0.00975
Epoch: 18090, Training Loss: 0.01107
Epoch: 18090, Training Loss: 0.01028
Epoch: 18090, Training Loss: 0.01264
Epoch: 18090, Training Loss: 0.00975
Epoch: 18091, Training Loss: 0.01107
Epoch: 18091, Training Loss: 0.01028
Epoch: 18091, Training Loss: 0.01263
Epoch: 18091, Training Loss: 0.00975
Epoch: 18092, Training Loss: 0.01107
Epoch: 18092, Training Loss: 0.01028
Epoch: 18092, Training Loss: 0.01263
Epoch: 18092, Training Loss: 0.00975
Epoch: 18093, Training Loss: 0.01107
Epoch: 18093, Training Loss: 0.01028
Epoch: 18093, Training Loss: 0.01263
Epoch: 18093, Training Loss: 0.00974
Epoch: 18094, Training Loss: 0.01107
Epoch: 18094, Training Loss: 0.01028
Epoch: 18094, Training Loss: 0.01263
Epoch: 18094, Training Loss: 0.00974
Epoch: 18095, Training Loss: 0.01107
Epoch: 18095, Training Loss: 0.01028
Epoch: 18095, Training Loss: 0.01263
Epoch: 18095, Training Loss: 0.00974
Epoch: 18096, Training Loss: 0.01107
Epoch: 18096, Training Loss: 0.01028
Epoch: 18096, Training Loss: 0.01263
Epoch: 18096, Training Loss: 0.00974
Epoch: 18097, Training Loss: 0.01107
Epoch: 18097, Training Loss: 0.01028
Epoch: 18097, Training Loss: 0.01263
Epoch: 18097, Training Loss: 0.00974
Epoch: 18098, Training Loss: 0.01107
Epoch: 18098, Training Loss: 0.01028
Epoch: 18098, Training Loss: 0.01263
Epoch: 18098, Training Loss: 0.00974
Epoch: 18099, Training Loss: 0.01106
Epoch: 18099, Training Loss: 0.01028
Epoch: 18099, Training Loss: 0.01263
Epoch: 18099, Training Loss: 0.00974
Epoch: 18100, Training Loss: 0.01106
Epoch: 18100, Training Loss: 0.01028
Epoch: 18100, Training Loss: 0.01263
Epoch: 18100, Training Loss: 0.00974
Epoch: 18101, Training Loss: 0.01106
Epoch: 18101, Training Loss: 0.01028
Epoch: 18101, Training Loss: 0.01263
Epoch: 18101, Training Loss: 0.00974
Epoch: 18102, Training Loss: 0.01106
Epoch: 18102, Training Loss: 0.01028
Epoch: 18102, Training Loss: 0.01263
Epoch: 18102, Training Loss: 0.00974
Epoch: 18103, Training Loss: 0.01106
Epoch: 18103, Training Loss: 0.01028
Epoch: 18103, Training Loss: 0.01263
Epoch: 18103, Training Loss: 0.00974
Epoch: 18104, Training Loss: 0.01106
Epoch: 18104, Training Loss: 0.01028
Epoch: 18104, Training Loss: 0.01263
Epoch: 18104, Training Loss: 0.00974
Epoch: 18105, Training Loss: 0.01106
Epoch: 18105, Training Loss: 0.01028
Epoch: 18105, Training Loss: 0.01263
Epoch: 18105, Training Loss: 0.00974
Epoch: 18106, Training Loss: 0.01106
Epoch: 18106, Training Loss: 0.01028
Epoch: 18106, Training Loss: 0.01263
Epoch: 18106, Training Loss: 0.00974
Epoch: 18107, Training Loss: 0.01106
Epoch: 18107, Training Loss: 0.01028
Epoch: 18107, Training Loss: 0.01263
Epoch: 18107, Training Loss: 0.00974
Epoch: 18108, Training Loss: 0.01106
Epoch: 18108, Training Loss: 0.01028
Epoch: 18108, Training Loss: 0.01263
Epoch: 18108, Training Loss: 0.00974
Epoch: 18109, Training Loss: 0.01106
Epoch: 18109, Training Loss: 0.01028
Epoch: 18109, Training Loss: 0.01263
Epoch: 18109, Training Loss: 0.00974
Epoch: 18110, Training Loss: 0.01106
Epoch: 18110, Training Loss: 0.01028
Epoch: 18110, Training Loss: 0.01263
Epoch: 18110, Training Loss: 0.00974
Epoch: 18111, Training Loss: 0.01106
Epoch: 18111, Training Loss: 0.01028
Epoch: 18111, Training Loss: 0.01263
Epoch: 18111, Training Loss: 0.00974
Epoch: 18112, Training Loss: 0.01106
Epoch: 18112, Training Loss: 0.01028
Epoch: 18112, Training Loss: 0.01263
Epoch: 18112, Training Loss: 0.00974
Epoch: 18113, Training Loss: 0.01106
Epoch: 18113, Training Loss: 0.01028
Epoch: 18113, Training Loss: 0.01263
Epoch: 18113, Training Loss: 0.00974
Epoch: 18114, Training Loss: 0.01106
Epoch: 18114, Training Loss: 0.01028
Epoch: 18114, Training Loss: 0.01263
Epoch: 18114, Training Loss: 0.00974
Epoch: 18115, Training Loss: 0.01106
Epoch: 18115, Training Loss: 0.01028
Epoch: 18115, Training Loss: 0.01263
Epoch: 18115, Training Loss: 0.00974
Epoch: 18116, Training Loss: 0.01106
Epoch: 18116, Training Loss: 0.01027
Epoch: 18116, Training Loss: 0.01263
Epoch: 18116, Training Loss: 0.00974
Epoch: 18117, Training Loss: 0.01106
Epoch: 18117, Training Loss: 0.01027
Epoch: 18117, Training Loss: 0.01263
Epoch: 18117, Training Loss: 0.00974
Epoch: 18118, Training Loss: 0.01106
Epoch: 18118, Training Loss: 0.01027
Epoch: 18118, Training Loss: 0.01262
Epoch: 18118, Training Loss: 0.00974
Epoch: 18119, Training Loss: 0.01106
Epoch: 18119, Training Loss: 0.01027
Epoch: 18119, Training Loss: 0.01262
Epoch: 18119, Training Loss: 0.00974
Epoch: 18120, Training Loss: 0.01106
Epoch: 18120, Training Loss: 0.01027
Epoch: 18120, Training Loss: 0.01262
Epoch: 18120, Training Loss: 0.00974
Epoch: 18121, Training Loss: 0.01106
Epoch: 18121, Training Loss: 0.01027
Epoch: 18121, Training Loss: 0.01262
Epoch: 18121, Training Loss: 0.00974
Epoch: 18122, Training Loss: 0.01106
Epoch: 18122, Training Loss: 0.01027
Epoch: 18122, Training Loss: 0.01262
Epoch: 18122, Training Loss: 0.00974
Epoch: 18123, Training Loss: 0.01106
Epoch: 18123, Training Loss: 0.01027
Epoch: 18123, Training Loss: 0.01262
Epoch: 18123, Training Loss: 0.00974
Epoch: 18124, Training Loss: 0.01106
Epoch: 18124, Training Loss: 0.01027
Epoch: 18124, Training Loss: 0.01262
Epoch: 18124, Training Loss: 0.00974
Epoch: 18125, Training Loss: 0.01106
Epoch: 18125, Training Loss: 0.01027
Epoch: 18125, Training Loss: 0.01262
Epoch: 18125, Training Loss: 0.00974
Epoch: 18126, Training Loss: 0.01106
Epoch: 18126, Training Loss: 0.01027
Epoch: 18126, Training Loss: 0.01262
Epoch: 18126, Training Loss: 0.00974
Epoch: 18127, Training Loss: 0.01106
Epoch: 18127, Training Loss: 0.01027
Epoch: 18127, Training Loss: 0.01262
Epoch: 18127, Training Loss: 0.00974
Epoch: 18128, Training Loss: 0.01106
Epoch: 18128, Training Loss: 0.01027
Epoch: 18128, Training Loss: 0.01262
Epoch: 18128, Training Loss: 0.00973
Epoch: 18129, Training Loss: 0.01106
Epoch: 18129, Training Loss: 0.01027
Epoch: 18129, Training Loss: 0.01262
Epoch: 18129, Training Loss: 0.00973
Epoch: 18130, Training Loss: 0.01105
Epoch: 18130, Training Loss: 0.01027
Epoch: 18130, Training Loss: 0.01262
Epoch: 18130, Training Loss: 0.00973
Epoch: 18131, Training Loss: 0.01105
Epoch: 18131, Training Loss: 0.01027
Epoch: 18131, Training Loss: 0.01262
Epoch: 18131, Training Loss: 0.00973
Epoch: 18132, Training Loss: 0.01105
Epoch: 18132, Training Loss: 0.01027
Epoch: 18132, Training Loss: 0.01262
Epoch: 18132, Training Loss: 0.00973
Epoch: 18133, Training Loss: 0.01105
Epoch: 18133, Training Loss: 0.01027
Epoch: 18133, Training Loss: 0.01262
Epoch: 18133, Training Loss: 0.00973
Epoch: 18134, Training Loss: 0.01105
Epoch: 18134, Training Loss: 0.01027
Epoch: 18134, Training Loss: 0.01262
Epoch: 18134, Training Loss: 0.00973
Epoch: 18135, Training Loss: 0.01105
Epoch: 18135, Training Loss: 0.01027
Epoch: 18135, Training Loss: 0.01262
Epoch: 18135, Training Loss: 0.00973
Epoch: 18136, Training Loss: 0.01105
Epoch: 18136, Training Loss: 0.01027
Epoch: 18136, Training Loss: 0.01262
Epoch: 18136, Training Loss: 0.00973
Epoch: 18137, Training Loss: 0.01105
Epoch: 18137, Training Loss: 0.01027
Epoch: 18137, Training Loss: 0.01262
Epoch: 18137, Training Loss: 0.00973
Epoch: 18138, Training Loss: 0.01105
Epoch: 18138, Training Loss: 0.01027
Epoch: 18138, Training Loss: 0.01262
Epoch: 18138, Training Loss: 0.00973
Epoch: 18139, Training Loss: 0.01105
Epoch: 18139, Training Loss: 0.01027
Epoch: 18139, Training Loss: 0.01262
Epoch: 18139, Training Loss: 0.00973
Epoch: 18140, Training Loss: 0.01105
Epoch: 18140, Training Loss: 0.01027
Epoch: 18140, Training Loss: 0.01262
Epoch: 18140, Training Loss: 0.00973
Epoch: 18141, Training Loss: 0.01105
Epoch: 18141, Training Loss: 0.01027
Epoch: 18141, Training Loss: 0.01262
Epoch: 18141, Training Loss: 0.00973
Epoch: 18142, Training Loss: 0.01105
Epoch: 18142, Training Loss: 0.01027
Epoch: 18142, Training Loss: 0.01262
Epoch: 18142, Training Loss: 0.00973
Epoch: 18143, Training Loss: 0.01105
Epoch: 18143, Training Loss: 0.01027
Epoch: 18143, Training Loss: 0.01262
Epoch: 18143, Training Loss: 0.00973
Epoch: 18144, Training Loss: 0.01105
Epoch: 18144, Training Loss: 0.01027
Epoch: 18144, Training Loss: 0.01261
Epoch: 18144, Training Loss: 0.00973
Epoch: 18145, Training Loss: 0.01105
Epoch: 18145, Training Loss: 0.01027
Epoch: 18145, Training Loss: 0.01261
Epoch: 18145, Training Loss: 0.00973
Epoch: 18146, Training Loss: 0.01105
Epoch: 18146, Training Loss: 0.01027
Epoch: 18146, Training Loss: 0.01261
Epoch: 18146, Training Loss: 0.00973
Epoch: 18147, Training Loss: 0.01105
Epoch: 18147, Training Loss: 0.01027
Epoch: 18147, Training Loss: 0.01261
Epoch: 18147, Training Loss: 0.00973
Epoch: 18148, Training Loss: 0.01105
Epoch: 18148, Training Loss: 0.01027
Epoch: 18148, Training Loss: 0.01261
Epoch: 18148, Training Loss: 0.00973
Epoch: 18149, Training Loss: 0.01105
Epoch: 18149, Training Loss: 0.01026
Epoch: 18149, Training Loss: 0.01261
Epoch: 18149, Training Loss: 0.00973
Epoch: 18150, Training Loss: 0.01105
Epoch: 18150, Training Loss: 0.01026
Epoch: 18150, Training Loss: 0.01261
Epoch: 18150, Training Loss: 0.00973
Epoch: 18151, Training Loss: 0.01105
Epoch: 18151, Training Loss: 0.01026
Epoch: 18151, Training Loss: 0.01261
Epoch: 18151, Training Loss: 0.00973
Epoch: 18152, Training Loss: 0.01105
Epoch: 18152, Training Loss: 0.01026
Epoch: 18152, Training Loss: 0.01261
Epoch: 18152, Training Loss: 0.00973
Epoch: 18153, Training Loss: 0.01105
Epoch: 18153, Training Loss: 0.01026
Epoch: 18153, Training Loss: 0.01261
Epoch: 18153, Training Loss: 0.00973
Epoch: 18154, Training Loss: 0.01105
Epoch: 18154, Training Loss: 0.01026
Epoch: 18154, Training Loss: 0.01261
Epoch: 18154, Training Loss: 0.00973
Epoch: 18155, Training Loss: 0.01105
Epoch: 18155, Training Loss: 0.01026
Epoch: 18155, Training Loss: 0.01261
Epoch: 18155, Training Loss: 0.00973
Epoch: 18156, Training Loss: 0.01105
Epoch: 18156, Training Loss: 0.01026
Epoch: 18156, Training Loss: 0.01261
Epoch: 18156, Training Loss: 0.00973
Epoch: 18157, Training Loss: 0.01105
Epoch: 18157, Training Loss: 0.01026
Epoch: 18157, Training Loss: 0.01261
Epoch: 18157, Training Loss: 0.00973
Epoch: 18158, Training Loss: 0.01105
Epoch: 18158, Training Loss: 0.01026
Epoch: 18158, Training Loss: 0.01261
Epoch: 18158, Training Loss: 0.00973
Epoch: 18159, Training Loss: 0.01105
Epoch: 18159, Training Loss: 0.01026
Epoch: 18159, Training Loss: 0.01261
Epoch: 18159, Training Loss: 0.00973
Epoch: 18160, Training Loss: 0.01104
Epoch: 18160, Training Loss: 0.01026
Epoch: 18160, Training Loss: 0.01261
Epoch: 18160, Training Loss: 0.00973
Epoch: 18161, Training Loss: 0.01104
Epoch: 18161, Training Loss: 0.01026
Epoch: 18161, Training Loss: 0.01261
Epoch: 18161, Training Loss: 0.00973
Epoch: 18162, Training Loss: 0.01104
Epoch: 18162, Training Loss: 0.01026
Epoch: 18162, Training Loss: 0.01261
Epoch: 18162, Training Loss: 0.00972
Epoch: 18163, Training Loss: 0.01104
Epoch: 18163, Training Loss: 0.01026
Epoch: 18163, Training Loss: 0.01261
Epoch: 18163, Training Loss: 0.00972
Epoch: 18164, Training Loss: 0.01104
Epoch: 18164, Training Loss: 0.01026
Epoch: 18164, Training Loss: 0.01261
Epoch: 18164, Training Loss: 0.00972
Epoch: 18165, Training Loss: 0.01104
Epoch: 18165, Training Loss: 0.01026
Epoch: 18165, Training Loss: 0.01261
Epoch: 18165, Training Loss: 0.00972
Epoch: 18166, Training Loss: 0.01104
Epoch: 18166, Training Loss: 0.01026
Epoch: 18166, Training Loss: 0.01261
Epoch: 18166, Training Loss: 0.00972
Epoch: 18167, Training Loss: 0.01104
Epoch: 18167, Training Loss: 0.01026
Epoch: 18167, Training Loss: 0.01261
Epoch: 18167, Training Loss: 0.00972
Epoch: 18168, Training Loss: 0.01104
Epoch: 18168, Training Loss: 0.01026
Epoch: 18168, Training Loss: 0.01261
Epoch: 18168, Training Loss: 0.00972
Epoch: 18169, Training Loss: 0.01104
Epoch: 18169, Training Loss: 0.01026
Epoch: 18169, Training Loss: 0.01261
Epoch: 18169, Training Loss: 0.00972
Epoch: 18170, Training Loss: 0.01104
Epoch: 18170, Training Loss: 0.01026
Epoch: 18170, Training Loss: 0.01260
Epoch: 18170, Training Loss: 0.00972
Epoch: 18171, Training Loss: 0.01104
Epoch: 18171, Training Loss: 0.01026
Epoch: 18171, Training Loss: 0.01260
Epoch: 18171, Training Loss: 0.00972
Epoch: 18172, Training Loss: 0.01104
Epoch: 18172, Training Loss: 0.01026
Epoch: 18172, Training Loss: 0.01260
Epoch: 18172, Training Loss: 0.00972
Epoch: 18173, Training Loss: 0.01104
Epoch: 18173, Training Loss: 0.01026
Epoch: 18173, Training Loss: 0.01260
Epoch: 18173, Training Loss: 0.00972
Epoch: 18174, Training Loss: 0.01104
Epoch: 18174, Training Loss: 0.01026
Epoch: 18174, Training Loss: 0.01260
Epoch: 18174, Training Loss: 0.00972
Epoch: 18175, Training Loss: 0.01104
Epoch: 18175, Training Loss: 0.01026
Epoch: 18175, Training Loss: 0.01260
Epoch: 18175, Training Loss: 0.00972
Epoch: 18176, Training Loss: 0.01104
Epoch: 18176, Training Loss: 0.01026
Epoch: 18176, Training Loss: 0.01260
Epoch: 18176, Training Loss: 0.00972
Epoch: 18177, Training Loss: 0.01104
Epoch: 18177, Training Loss: 0.01026
Epoch: 18177, Training Loss: 0.01260
Epoch: 18177, Training Loss: 0.00972
Epoch: 18178, Training Loss: 0.01104
Epoch: 18178, Training Loss: 0.01026
Epoch: 18178, Training Loss: 0.01260
Epoch: 18178, Training Loss: 0.00972
Epoch: 18179, Training Loss: 0.01104
Epoch: 18179, Training Loss: 0.01026
Epoch: 18179, Training Loss: 0.01260
Epoch: 18179, Training Loss: 0.00972
Epoch: 18180, Training Loss: 0.01104
Epoch: 18180, Training Loss: 0.01026
Epoch: 18180, Training Loss: 0.01260
Epoch: 18180, Training Loss: 0.00972
Epoch: 18181, Training Loss: 0.01104
Epoch: 18181, Training Loss: 0.01026
Epoch: 18181, Training Loss: 0.01260
Epoch: 18181, Training Loss: 0.00972
Epoch: 18182, Training Loss: 0.01104
Epoch: 18182, Training Loss: 0.01025
Epoch: 18182, Training Loss: 0.01260
Epoch: 18182, Training Loss: 0.00972
Epoch: 18183, Training Loss: 0.01104
Epoch: 18183, Training Loss: 0.01025
Epoch: 18183, Training Loss: 0.01260
Epoch: 18183, Training Loss: 0.00972
Epoch: 18184, Training Loss: 0.01104
Epoch: 18184, Training Loss: 0.01025
Epoch: 18184, Training Loss: 0.01260
Epoch: 18184, Training Loss: 0.00972
Epoch: 18185, Training Loss: 0.01104
Epoch: 18185, Training Loss: 0.01025
Epoch: 18185, Training Loss: 0.01260
Epoch: 18185, Training Loss: 0.00972
Epoch: 18186, Training Loss: 0.01104
Epoch: 18186, Training Loss: 0.01025
Epoch: 18186, Training Loss: 0.01260
Epoch: 18186, Training Loss: 0.00972
Epoch: 18187, Training Loss: 0.01104
Epoch: 18187, Training Loss: 0.01025
Epoch: 18187, Training Loss: 0.01260
Epoch: 18187, Training Loss: 0.00972
Epoch: 18188, Training Loss: 0.01104
Epoch: 18188, Training Loss: 0.01025
Epoch: 18188, Training Loss: 0.01260
Epoch: 18188, Training Loss: 0.00972
Epoch: 18189, Training Loss: 0.01104
Epoch: 18189, Training Loss: 0.01025
Epoch: 18189, Training Loss: 0.01260
Epoch: 18189, Training Loss: 0.00972
Epoch: 18190, Training Loss: 0.01103
Epoch: 18190, Training Loss: 0.01025
Epoch: 18190, Training Loss: 0.01260
Epoch: 18190, Training Loss: 0.00972
Epoch: 18191, Training Loss: 0.01103
Epoch: 18191, Training Loss: 0.01025
Epoch: 18191, Training Loss: 0.01260
Epoch: 18191, Training Loss: 0.00972
Epoch: 18192, Training Loss: 0.01103
Epoch: 18192, Training Loss: 0.01025
Epoch: 18192, Training Loss: 0.01260
Epoch: 18192, Training Loss: 0.00972
Epoch: 18193, Training Loss: 0.01103
Epoch: 18193, Training Loss: 0.01025
Epoch: 18193, Training Loss: 0.01260
Epoch: 18193, Training Loss: 0.00972
Epoch: 18194, Training Loss: 0.01103
Epoch: 18194, Training Loss: 0.01025
Epoch: 18194, Training Loss: 0.01260
Epoch: 18194, Training Loss: 0.00972
Epoch: 18195, Training Loss: 0.01103
Epoch: 18195, Training Loss: 0.01025
Epoch: 18195, Training Loss: 0.01260
Epoch: 18195, Training Loss: 0.00972
Epoch: 18196, Training Loss: 0.01103
Epoch: 18196, Training Loss: 0.01025
Epoch: 18196, Training Loss: 0.01260
Epoch: 18196, Training Loss: 0.00972
Epoch: 18197, Training Loss: 0.01103
Epoch: 18197, Training Loss: 0.01025
Epoch: 18197, Training Loss: 0.01259
Epoch: 18197, Training Loss: 0.00971
Epoch: 18198, Training Loss: 0.01103
Epoch: 18198, Training Loss: 0.01025
Epoch: 18198, Training Loss: 0.01259
Epoch: 18198, Training Loss: 0.00971
Epoch: 18199, Training Loss: 0.01103
Epoch: 18199, Training Loss: 0.01025
Epoch: 18199, Training Loss: 0.01259
Epoch: 18199, Training Loss: 0.00971
Epoch: 18200, Training Loss: 0.01103
Epoch: 18200, Training Loss: 0.01025
Epoch: 18200, Training Loss: 0.01259
Epoch: 18200, Training Loss: 0.00971
Epoch: 18201, Training Loss: 0.01103
Epoch: 18201, Training Loss: 0.01025
Epoch: 18201, Training Loss: 0.01259
Epoch: 18201, Training Loss: 0.00971
Epoch: 18202, Training Loss: 0.01103
Epoch: 18202, Training Loss: 0.01025
Epoch: 18202, Training Loss: 0.01259
Epoch: 18202, Training Loss: 0.00971
Epoch: 18203, Training Loss: 0.01103
Epoch: 18203, Training Loss: 0.01025
Epoch: 18203, Training Loss: 0.01259
Epoch: 18203, Training Loss: 0.00971
Epoch: 18204, Training Loss: 0.01103
Epoch: 18204, Training Loss: 0.01025
Epoch: 18204, Training Loss: 0.01259
Epoch: 18204, Training Loss: 0.00971
Epoch: 18205, Training Loss: 0.01103
Epoch: 18205, Training Loss: 0.01025
[Training log condensed — the network prints the quadratic loss for each of the four XOR patterns at every epoch. Over epochs 18205–18449 the four per-pattern losses decrease only very slowly, from approximately (0.01103, 0.01025, 0.01259, 0.00971) at epoch 18205 to (0.01095, 0.01018, 0.01250, 0.00964) at epoch 18449, illustrating how slowly the basic sigmoid/quadratic-cost setup converges toward the 0.008 error target at this stage of training.]
Epoch: 18449, Training Loss: 0.01018
Epoch: 18449, Training Loss: 0.01250
Epoch: 18449, Training Loss: 0.00964
Epoch: 18450, Training Loss: 0.01095
Epoch: 18450, Training Loss: 0.01018
Epoch: 18450, Training Loss: 0.01250
Epoch: 18450, Training Loss: 0.00964
Epoch: 18451, Training Loss: 0.01095
Epoch: 18451, Training Loss: 0.01018
Epoch: 18451, Training Loss: 0.01250
Epoch: 18451, Training Loss: 0.00964
Epoch: 18452, Training Loss: 0.01095
Epoch: 18452, Training Loss: 0.01017
Epoch: 18452, Training Loss: 0.01250
Epoch: 18452, Training Loss: 0.00964
Epoch: 18453, Training Loss: 0.01095
Epoch: 18453, Training Loss: 0.01017
Epoch: 18453, Training Loss: 0.01250
Epoch: 18453, Training Loss: 0.00964
Epoch: 18454, Training Loss: 0.01095
Epoch: 18454, Training Loss: 0.01017
Epoch: 18454, Training Loss: 0.01250
Epoch: 18454, Training Loss: 0.00964
Epoch: 18455, Training Loss: 0.01095
Epoch: 18455, Training Loss: 0.01017
Epoch: 18455, Training Loss: 0.01250
Epoch: 18455, Training Loss: 0.00964
Epoch: 18456, Training Loss: 0.01095
Epoch: 18456, Training Loss: 0.01017
Epoch: 18456, Training Loss: 0.01250
Epoch: 18456, Training Loss: 0.00964
Epoch: 18457, Training Loss: 0.01095
Epoch: 18457, Training Loss: 0.01017
Epoch: 18457, Training Loss: 0.01250
Epoch: 18457, Training Loss: 0.00964
Epoch: 18458, Training Loss: 0.01095
Epoch: 18458, Training Loss: 0.01017
Epoch: 18458, Training Loss: 0.01250
Epoch: 18458, Training Loss: 0.00964
Epoch: 18459, Training Loss: 0.01095
Epoch: 18459, Training Loss: 0.01017
Epoch: 18459, Training Loss: 0.01250
Epoch: 18459, Training Loss: 0.00964
Epoch: 18460, Training Loss: 0.01095
Epoch: 18460, Training Loss: 0.01017
Epoch: 18460, Training Loss: 0.01250
Epoch: 18460, Training Loss: 0.00964
Epoch: 18461, Training Loss: 0.01095
Epoch: 18461, Training Loss: 0.01017
Epoch: 18461, Training Loss: 0.01250
Epoch: 18461, Training Loss: 0.00964
Epoch: 18462, Training Loss: 0.01095
Epoch: 18462, Training Loss: 0.01017
Epoch: 18462, Training Loss: 0.01250
Epoch: 18462, Training Loss: 0.00964
Epoch: 18463, Training Loss: 0.01095
Epoch: 18463, Training Loss: 0.01017
Epoch: 18463, Training Loss: 0.01250
Epoch: 18463, Training Loss: 0.00964
Epoch: 18464, Training Loss: 0.01095
Epoch: 18464, Training Loss: 0.01017
Epoch: 18464, Training Loss: 0.01250
Epoch: 18464, Training Loss: 0.00964
Epoch: 18465, Training Loss: 0.01095
Epoch: 18465, Training Loss: 0.01017
Epoch: 18465, Training Loss: 0.01250
Epoch: 18465, Training Loss: 0.00964
Epoch: 18466, Training Loss: 0.01094
Epoch: 18466, Training Loss: 0.01017
Epoch: 18466, Training Loss: 0.01249
Epoch: 18466, Training Loss: 0.00964
Epoch: 18467, Training Loss: 0.01094
Epoch: 18467, Training Loss: 0.01017
Epoch: 18467, Training Loss: 0.01249
Epoch: 18467, Training Loss: 0.00964
Epoch: 18468, Training Loss: 0.01094
Epoch: 18468, Training Loss: 0.01017
Epoch: 18468, Training Loss: 0.01249
Epoch: 18468, Training Loss: 0.00964
Epoch: 18469, Training Loss: 0.01094
Epoch: 18469, Training Loss: 0.01017
Epoch: 18469, Training Loss: 0.01249
Epoch: 18469, Training Loss: 0.00964
Epoch: 18470, Training Loss: 0.01094
Epoch: 18470, Training Loss: 0.01017
Epoch: 18470, Training Loss: 0.01249
Epoch: 18470, Training Loss: 0.00964
Epoch: 18471, Training Loss: 0.01094
Epoch: 18471, Training Loss: 0.01017
Epoch: 18471, Training Loss: 0.01249
Epoch: 18471, Training Loss: 0.00964
Epoch: 18472, Training Loss: 0.01094
Epoch: 18472, Training Loss: 0.01017
Epoch: 18472, Training Loss: 0.01249
Epoch: 18472, Training Loss: 0.00964
Epoch: 18473, Training Loss: 0.01094
Epoch: 18473, Training Loss: 0.01017
Epoch: 18473, Training Loss: 0.01249
Epoch: 18473, Training Loss: 0.00964
Epoch: 18474, Training Loss: 0.01094
Epoch: 18474, Training Loss: 0.01017
Epoch: 18474, Training Loss: 0.01249
Epoch: 18474, Training Loss: 0.00964
Epoch: 18475, Training Loss: 0.01094
Epoch: 18475, Training Loss: 0.01017
Epoch: 18475, Training Loss: 0.01249
Epoch: 18475, Training Loss: 0.00964
Epoch: 18476, Training Loss: 0.01094
Epoch: 18476, Training Loss: 0.01017
Epoch: 18476, Training Loss: 0.01249
Epoch: 18476, Training Loss: 0.00964
Epoch: 18477, Training Loss: 0.01094
Epoch: 18477, Training Loss: 0.01017
Epoch: 18477, Training Loss: 0.01249
Epoch: 18477, Training Loss: 0.00964
Epoch: 18478, Training Loss: 0.01094
Epoch: 18478, Training Loss: 0.01017
Epoch: 18478, Training Loss: 0.01249
Epoch: 18478, Training Loss: 0.00964
Epoch: 18479, Training Loss: 0.01094
Epoch: 18479, Training Loss: 0.01017
Epoch: 18479, Training Loss: 0.01249
Epoch: 18479, Training Loss: 0.00964
Epoch: 18480, Training Loss: 0.01094
Epoch: 18480, Training Loss: 0.01017
Epoch: 18480, Training Loss: 0.01249
Epoch: 18480, Training Loss: 0.00964
Epoch: 18481, Training Loss: 0.01094
Epoch: 18481, Training Loss: 0.01017
Epoch: 18481, Training Loss: 0.01249
Epoch: 18481, Training Loss: 0.00963
Epoch: 18482, Training Loss: 0.01094
Epoch: 18482, Training Loss: 0.01017
Epoch: 18482, Training Loss: 0.01249
Epoch: 18482, Training Loss: 0.00963
Epoch: 18483, Training Loss: 0.01094
Epoch: 18483, Training Loss: 0.01017
Epoch: 18483, Training Loss: 0.01249
Epoch: 18483, Training Loss: 0.00963
Epoch: 18484, Training Loss: 0.01094
Epoch: 18484, Training Loss: 0.01017
Epoch: 18484, Training Loss: 0.01249
Epoch: 18484, Training Loss: 0.00963
Epoch: 18485, Training Loss: 0.01094
Epoch: 18485, Training Loss: 0.01017
Epoch: 18485, Training Loss: 0.01249
Epoch: 18485, Training Loss: 0.00963
Epoch: 18486, Training Loss: 0.01094
Epoch: 18486, Training Loss: 0.01016
Epoch: 18486, Training Loss: 0.01249
Epoch: 18486, Training Loss: 0.00963
Epoch: 18487, Training Loss: 0.01094
Epoch: 18487, Training Loss: 0.01016
Epoch: 18487, Training Loss: 0.01249
Epoch: 18487, Training Loss: 0.00963
Epoch: 18488, Training Loss: 0.01094
Epoch: 18488, Training Loss: 0.01016
Epoch: 18488, Training Loss: 0.01249
Epoch: 18488, Training Loss: 0.00963
Epoch: 18489, Training Loss: 0.01094
Epoch: 18489, Training Loss: 0.01016
Epoch: 18489, Training Loss: 0.01249
Epoch: 18489, Training Loss: 0.00963
Epoch: 18490, Training Loss: 0.01094
Epoch: 18490, Training Loss: 0.01016
Epoch: 18490, Training Loss: 0.01249
Epoch: 18490, Training Loss: 0.00963
Epoch: 18491, Training Loss: 0.01094
Epoch: 18491, Training Loss: 0.01016
Epoch: 18491, Training Loss: 0.01249
Epoch: 18491, Training Loss: 0.00963
Epoch: 18492, Training Loss: 0.01094
Epoch: 18492, Training Loss: 0.01016
Epoch: 18492, Training Loss: 0.01249
Epoch: 18492, Training Loss: 0.00963
Epoch: 18493, Training Loss: 0.01094
Epoch: 18493, Training Loss: 0.01016
Epoch: 18493, Training Loss: 0.01248
Epoch: 18493, Training Loss: 0.00963
Epoch: 18494, Training Loss: 0.01094
Epoch: 18494, Training Loss: 0.01016
Epoch: 18494, Training Loss: 0.01248
Epoch: 18494, Training Loss: 0.00963
Epoch: 18495, Training Loss: 0.01094
Epoch: 18495, Training Loss: 0.01016
Epoch: 18495, Training Loss: 0.01248
Epoch: 18495, Training Loss: 0.00963
Epoch: 18496, Training Loss: 0.01094
Epoch: 18496, Training Loss: 0.01016
Epoch: 18496, Training Loss: 0.01248
Epoch: 18496, Training Loss: 0.00963
Epoch: 18497, Training Loss: 0.01093
Epoch: 18497, Training Loss: 0.01016
Epoch: 18497, Training Loss: 0.01248
Epoch: 18497, Training Loss: 0.00963
Epoch: 18498, Training Loss: 0.01093
Epoch: 18498, Training Loss: 0.01016
Epoch: 18498, Training Loss: 0.01248
Epoch: 18498, Training Loss: 0.00963
Epoch: 18499, Training Loss: 0.01093
Epoch: 18499, Training Loss: 0.01016
Epoch: 18499, Training Loss: 0.01248
Epoch: 18499, Training Loss: 0.00963
Epoch: 18500, Training Loss: 0.01093
Epoch: 18500, Training Loss: 0.01016
Epoch: 18500, Training Loss: 0.01248
Epoch: 18500, Training Loss: 0.00963
Epoch: 18501, Training Loss: 0.01093
Epoch: 18501, Training Loss: 0.01016
Epoch: 18501, Training Loss: 0.01248
Epoch: 18501, Training Loss: 0.00963
Epoch: 18502, Training Loss: 0.01093
Epoch: 18502, Training Loss: 0.01016
Epoch: 18502, Training Loss: 0.01248
Epoch: 18502, Training Loss: 0.00963
Epoch: 18503, Training Loss: 0.01093
Epoch: 18503, Training Loss: 0.01016
Epoch: 18503, Training Loss: 0.01248
Epoch: 18503, Training Loss: 0.00963
Epoch: 18504, Training Loss: 0.01093
Epoch: 18504, Training Loss: 0.01016
Epoch: 18504, Training Loss: 0.01248
Epoch: 18504, Training Loss: 0.00963
Epoch: 18505, Training Loss: 0.01093
Epoch: 18505, Training Loss: 0.01016
Epoch: 18505, Training Loss: 0.01248
Epoch: 18505, Training Loss: 0.00963
Epoch: 18506, Training Loss: 0.01093
Epoch: 18506, Training Loss: 0.01016
Epoch: 18506, Training Loss: 0.01248
Epoch: 18506, Training Loss: 0.00963
Epoch: 18507, Training Loss: 0.01093
Epoch: 18507, Training Loss: 0.01016
Epoch: 18507, Training Loss: 0.01248
Epoch: 18507, Training Loss: 0.00963
Epoch: 18508, Training Loss: 0.01093
Epoch: 18508, Training Loss: 0.01016
Epoch: 18508, Training Loss: 0.01248
Epoch: 18508, Training Loss: 0.00963
Epoch: 18509, Training Loss: 0.01093
Epoch: 18509, Training Loss: 0.01016
Epoch: 18509, Training Loss: 0.01248
Epoch: 18509, Training Loss: 0.00963
Epoch: 18510, Training Loss: 0.01093
Epoch: 18510, Training Loss: 0.01016
Epoch: 18510, Training Loss: 0.01248
Epoch: 18510, Training Loss: 0.00963
Epoch: 18511, Training Loss: 0.01093
Epoch: 18511, Training Loss: 0.01016
Epoch: 18511, Training Loss: 0.01248
Epoch: 18511, Training Loss: 0.00963
Epoch: 18512, Training Loss: 0.01093
Epoch: 18512, Training Loss: 0.01016
Epoch: 18512, Training Loss: 0.01248
Epoch: 18512, Training Loss: 0.00963
Epoch: 18513, Training Loss: 0.01093
Epoch: 18513, Training Loss: 0.01016
Epoch: 18513, Training Loss: 0.01248
Epoch: 18513, Training Loss: 0.00963
Epoch: 18514, Training Loss: 0.01093
Epoch: 18514, Training Loss: 0.01016
Epoch: 18514, Training Loss: 0.01248
Epoch: 18514, Training Loss: 0.00963
Epoch: 18515, Training Loss: 0.01093
Epoch: 18515, Training Loss: 0.01016
Epoch: 18515, Training Loss: 0.01248
Epoch: 18515, Training Loss: 0.00963
Epoch: 18516, Training Loss: 0.01093
Epoch: 18516, Training Loss: 0.01016
Epoch: 18516, Training Loss: 0.01248
Epoch: 18516, Training Loss: 0.00963
Epoch: 18517, Training Loss: 0.01093
Epoch: 18517, Training Loss: 0.01016
Epoch: 18517, Training Loss: 0.01248
Epoch: 18517, Training Loss: 0.00962
Epoch: 18518, Training Loss: 0.01093
Epoch: 18518, Training Loss: 0.01016
Epoch: 18518, Training Loss: 0.01248
Epoch: 18518, Training Loss: 0.00962
Epoch: 18519, Training Loss: 0.01093
Epoch: 18519, Training Loss: 0.01016
Epoch: 18519, Training Loss: 0.01248
Epoch: 18519, Training Loss: 0.00962
Epoch: 18520, Training Loss: 0.01093
Epoch: 18520, Training Loss: 0.01015
Epoch: 18520, Training Loss: 0.01247
Epoch: 18520, Training Loss: 0.00962
Epoch: 18521, Training Loss: 0.01093
Epoch: 18521, Training Loss: 0.01015
Epoch: 18521, Training Loss: 0.01247
Epoch: 18521, Training Loss: 0.00962
Epoch: 18522, Training Loss: 0.01093
Epoch: 18522, Training Loss: 0.01015
Epoch: 18522, Training Loss: 0.01247
Epoch: 18522, Training Loss: 0.00962
Epoch: 18523, Training Loss: 0.01093
Epoch: 18523, Training Loss: 0.01015
Epoch: 18523, Training Loss: 0.01247
Epoch: 18523, Training Loss: 0.00962
Epoch: 18524, Training Loss: 0.01093
Epoch: 18524, Training Loss: 0.01015
Epoch: 18524, Training Loss: 0.01247
Epoch: 18524, Training Loss: 0.00962
Epoch: 18525, Training Loss: 0.01093
Epoch: 18525, Training Loss: 0.01015
Epoch: 18525, Training Loss: 0.01247
Epoch: 18525, Training Loss: 0.00962
Epoch: 18526, Training Loss: 0.01093
Epoch: 18526, Training Loss: 0.01015
Epoch: 18526, Training Loss: 0.01247
Epoch: 18526, Training Loss: 0.00962
Epoch: 18527, Training Loss: 0.01093
Epoch: 18527, Training Loss: 0.01015
Epoch: 18527, Training Loss: 0.01247
Epoch: 18527, Training Loss: 0.00962
Epoch: 18528, Training Loss: 0.01093
Epoch: 18528, Training Loss: 0.01015
Epoch: 18528, Training Loss: 0.01247
Epoch: 18528, Training Loss: 0.00962
Epoch: 18529, Training Loss: 0.01092
Epoch: 18529, Training Loss: 0.01015
Epoch: 18529, Training Loss: 0.01247
Epoch: 18529, Training Loss: 0.00962
Epoch: 18530, Training Loss: 0.01092
Epoch: 18530, Training Loss: 0.01015
Epoch: 18530, Training Loss: 0.01247
Epoch: 18530, Training Loss: 0.00962
Epoch: 18531, Training Loss: 0.01092
Epoch: 18531, Training Loss: 0.01015
Epoch: 18531, Training Loss: 0.01247
Epoch: 18531, Training Loss: 0.00962
Epoch: 18532, Training Loss: 0.01092
Epoch: 18532, Training Loss: 0.01015
Epoch: 18532, Training Loss: 0.01247
Epoch: 18532, Training Loss: 0.00962
Epoch: 18533, Training Loss: 0.01092
Epoch: 18533, Training Loss: 0.01015
Epoch: 18533, Training Loss: 0.01247
Epoch: 18533, Training Loss: 0.00962
Epoch: 18534, Training Loss: 0.01092
Epoch: 18534, Training Loss: 0.01015
Epoch: 18534, Training Loss: 0.01247
Epoch: 18534, Training Loss: 0.00962
Epoch: 18535, Training Loss: 0.01092
Epoch: 18535, Training Loss: 0.01015
Epoch: 18535, Training Loss: 0.01247
Epoch: 18535, Training Loss: 0.00962
Epoch: 18536, Training Loss: 0.01092
Epoch: 18536, Training Loss: 0.01015
Epoch: 18536, Training Loss: 0.01247
Epoch: 18536, Training Loss: 0.00962
Epoch: 18537, Training Loss: 0.01092
Epoch: 18537, Training Loss: 0.01015
Epoch: 18537, Training Loss: 0.01247
Epoch: 18537, Training Loss: 0.00962
Epoch: 18538, Training Loss: 0.01092
Epoch: 18538, Training Loss: 0.01015
Epoch: 18538, Training Loss: 0.01247
Epoch: 18538, Training Loss: 0.00962
Epoch: 18539, Training Loss: 0.01092
Epoch: 18539, Training Loss: 0.01015
Epoch: 18539, Training Loss: 0.01247
Epoch: 18539, Training Loss: 0.00962
Epoch: 18540, Training Loss: 0.01092
Epoch: 18540, Training Loss: 0.01015
Epoch: 18540, Training Loss: 0.01247
Epoch: 18540, Training Loss: 0.00962
Epoch: 18541, Training Loss: 0.01092
Epoch: 18541, Training Loss: 0.01015
Epoch: 18541, Training Loss: 0.01247
Epoch: 18541, Training Loss: 0.00962
Epoch: 18542, Training Loss: 0.01092
Epoch: 18542, Training Loss: 0.01015
Epoch: 18542, Training Loss: 0.01247
Epoch: 18542, Training Loss: 0.00962
Epoch: 18543, Training Loss: 0.01092
Epoch: 18543, Training Loss: 0.01015
Epoch: 18543, Training Loss: 0.01247
Epoch: 18543, Training Loss: 0.00962
Epoch: 18544, Training Loss: 0.01092
Epoch: 18544, Training Loss: 0.01015
Epoch: 18544, Training Loss: 0.01247
Epoch: 18544, Training Loss: 0.00962
Epoch: 18545, Training Loss: 0.01092
Epoch: 18545, Training Loss: 0.01015
Epoch: 18545, Training Loss: 0.01247
Epoch: 18545, Training Loss: 0.00962
Epoch: 18546, Training Loss: 0.01092
Epoch: 18546, Training Loss: 0.01015
Epoch: 18546, Training Loss: 0.01247
Epoch: 18546, Training Loss: 0.00962
Epoch: 18547, Training Loss: 0.01092
Epoch: 18547, Training Loss: 0.01015
Epoch: 18547, Training Loss: 0.01247
Epoch: 18547, Training Loss: 0.00962
Epoch: 18548, Training Loss: 0.01092
Epoch: 18548, Training Loss: 0.01015
Epoch: 18548, Training Loss: 0.01246
Epoch: 18548, Training Loss: 0.00962
Epoch: 18549, Training Loss: 0.01092
Epoch: 18549, Training Loss: 0.01015
Epoch: 18549, Training Loss: 0.01246
Epoch: 18549, Training Loss: 0.00962
Epoch: 18550, Training Loss: 0.01092
Epoch: 18550, Training Loss: 0.01015
Epoch: 18550, Training Loss: 0.01246
Epoch: 18550, Training Loss: 0.00962
Epoch: 18551, Training Loss: 0.01092
Epoch: 18551, Training Loss: 0.01015
Epoch: 18551, Training Loss: 0.01246
Epoch: 18551, Training Loss: 0.00962
Epoch: 18552, Training Loss: 0.01092
Epoch: 18552, Training Loss: 0.01015
Epoch: 18552, Training Loss: 0.01246
Epoch: 18552, Training Loss: 0.00962
Epoch: 18553, Training Loss: 0.01092
Epoch: 18553, Training Loss: 0.01015
Epoch: 18553, Training Loss: 0.01246
Epoch: 18553, Training Loss: 0.00961
Epoch: 18554, Training Loss: 0.01092
Epoch: 18554, Training Loss: 0.01014
Epoch: 18554, Training Loss: 0.01246
Epoch: 18554, Training Loss: 0.00961
Epoch: 18555, Training Loss: 0.01092
Epoch: 18555, Training Loss: 0.01014
Epoch: 18555, Training Loss: 0.01246
Epoch: 18555, Training Loss: 0.00961
Epoch: 18556, Training Loss: 0.01092
Epoch: 18556, Training Loss: 0.01014
Epoch: 18556, Training Loss: 0.01246
Epoch: 18556, Training Loss: 0.00961
Epoch: 18557, Training Loss: 0.01092
Epoch: 18557, Training Loss: 0.01014
Epoch: 18557, Training Loss: 0.01246
Epoch: 18557, Training Loss: 0.00961
Epoch: 18558, Training Loss: 0.01092
Epoch: 18558, Training Loss: 0.01014
Epoch: 18558, Training Loss: 0.01246
Epoch: 18558, Training Loss: 0.00961
Epoch: 18559, Training Loss: 0.01092
Epoch: 18559, Training Loss: 0.01014
Epoch: 18559, Training Loss: 0.01246
Epoch: 18559, Training Loss: 0.00961
Epoch: 18560, Training Loss: 0.01091
Epoch: 18560, Training Loss: 0.01014
Epoch: 18560, Training Loss: 0.01246
Epoch: 18560, Training Loss: 0.00961
Epoch: 18561, Training Loss: 0.01091
Epoch: 18561, Training Loss: 0.01014
Epoch: 18561, Training Loss: 0.01246
Epoch: 18561, Training Loss: 0.00961
Epoch: 18562, Training Loss: 0.01091
Epoch: 18562, Training Loss: 0.01014
Epoch: 18562, Training Loss: 0.01246
Epoch: 18562, Training Loss: 0.00961
Epoch: 18563, Training Loss: 0.01091
Epoch: 18563, Training Loss: 0.01014
Epoch: 18563, Training Loss: 0.01246
Epoch: 18563, Training Loss: 0.00961
Epoch: 18564, Training Loss: 0.01091
Epoch: 18564, Training Loss: 0.01014
Epoch: 18564, Training Loss: 0.01246
Epoch: 18564, Training Loss: 0.00961
Epoch: 18565, Training Loss: 0.01091
Epoch: 18565, Training Loss: 0.01014
Epoch: 18565, Training Loss: 0.01246
Epoch: 18565, Training Loss: 0.00961
Epoch: 18566, Training Loss: 0.01091
Epoch: 18566, Training Loss: 0.01014
Epoch: 18566, Training Loss: 0.01246
Epoch: 18566, Training Loss: 0.00961
Epoch: 18567, Training Loss: 0.01091
Epoch: 18567, Training Loss: 0.01014
Epoch: 18567, Training Loss: 0.01246
Epoch: 18567, Training Loss: 0.00961
Epoch: 18568, Training Loss: 0.01091
Epoch: 18568, Training Loss: 0.01014
Epoch: 18568, Training Loss: 0.01246
Epoch: 18568, Training Loss: 0.00961
Epoch: 18569, Training Loss: 0.01091
Epoch: 18569, Training Loss: 0.01014
Epoch: 18569, Training Loss: 0.01246
Epoch: 18569, Training Loss: 0.00961
Epoch: 18570, Training Loss: 0.01091
Epoch: 18570, Training Loss: 0.01014
Epoch: 18570, Training Loss: 0.01246
Epoch: 18570, Training Loss: 0.00961
Epoch: 18571, Training Loss: 0.01091
Epoch: 18571, Training Loss: 0.01014
Epoch: 18571, Training Loss: 0.01246
Epoch: 18571, Training Loss: 0.00961
Epoch: 18572, Training Loss: 0.01091
Epoch: 18572, Training Loss: 0.01014
Epoch: 18572, Training Loss: 0.01246
Epoch: 18572, Training Loss: 0.00961
Epoch: 18573, Training Loss: 0.01091
Epoch: 18573, Training Loss: 0.01014
Epoch: 18573, Training Loss: 0.01246
Epoch: 18573, Training Loss: 0.00961
Epoch: 18574, Training Loss: 0.01091
Epoch: 18574, Training Loss: 0.01014
Epoch: 18574, Training Loss: 0.01246
Epoch: 18574, Training Loss: 0.00961
Epoch: 18575, Training Loss: 0.01091
Epoch: 18575, Training Loss: 0.01014
Epoch: 18575, Training Loss: 0.01245
Epoch: 18575, Training Loss: 0.00961
Epoch: 18576, Training Loss: 0.01091
Epoch: 18576, Training Loss: 0.01014
Epoch: 18576, Training Loss: 0.01245
Epoch: 18576, Training Loss: 0.00961
Epoch: 18577, Training Loss: 0.01091
Epoch: 18577, Training Loss: 0.01014
Epoch: 18577, Training Loss: 0.01245
Epoch: 18577, Training Loss: 0.00961
Epoch: 18578, Training Loss: 0.01091
Epoch: 18578, Training Loss: 0.01014
Epoch: 18578, Training Loss: 0.01245
Epoch: 18578, Training Loss: 0.00961
Epoch: 18579, Training Loss: 0.01091
Epoch: 18579, Training Loss: 0.01014
Epoch: 18579, Training Loss: 0.01245
Epoch: 18579, Training Loss: 0.00961
Epoch: 18580, Training Loss: 0.01091
Epoch: 18580, Training Loss: 0.01014
Epoch: 18580, Training Loss: 0.01245
Epoch: 18580, Training Loss: 0.00961
Epoch: 18581, Training Loss: 0.01091
Epoch: 18581, Training Loss: 0.01014
Epoch: 18581, Training Loss: 0.01245
Epoch: 18581, Training Loss: 0.00961
Epoch: 18582, Training Loss: 0.01091
Epoch: 18582, Training Loss: 0.01014
Epoch: 18582, Training Loss: 0.01245
Epoch: 18582, Training Loss: 0.00961
Epoch: 18583, Training Loss: 0.01091
Epoch: 18583, Training Loss: 0.01014
Epoch: 18583, Training Loss: 0.01245
Epoch: 18583, Training Loss: 0.00961
Epoch: 18584, Training Loss: 0.01091
Epoch: 18584, Training Loss: 0.01014
Epoch: 18584, Training Loss: 0.01245
Epoch: 18584, Training Loss: 0.00961
Epoch: 18585, Training Loss: 0.01091
Epoch: 18585, Training Loss: 0.01014
Epoch: 18585, Training Loss: 0.01245
Epoch: 18585, Training Loss: 0.00961
Epoch: 18586, Training Loss: 0.01091
Epoch: 18586, Training Loss: 0.01014
Epoch: 18586, Training Loss: 0.01245
Epoch: 18586, Training Loss: 0.00961
Epoch: 18587, Training Loss: 0.01091
Epoch: 18587, Training Loss: 0.01014
Epoch: 18587, Training Loss: 0.01245
Epoch: 18587, Training Loss: 0.00961
Epoch: 18588, Training Loss: 0.01091
Epoch: 18588, Training Loss: 0.01014
Epoch: 18588, Training Loss: 0.01245
Epoch: 18588, Training Loss: 0.00961
Epoch: 18589, Training Loss: 0.01091
Epoch: 18589, Training Loss: 0.01013
Epoch: 18589, Training Loss: 0.01245
Epoch: 18589, Training Loss: 0.00960
Epoch: 18590, Training Loss: 0.01091
Epoch: 18590, Training Loss: 0.01013
Epoch: 18590, Training Loss: 0.01245
Epoch: 18590, Training Loss: 0.00960
Epoch: 18591, Training Loss: 0.01090
Epoch: 18591, Training Loss: 0.01013
Epoch: 18591, Training Loss: 0.01245
Epoch: 18591, Training Loss: 0.00960
Epoch: 18592, Training Loss: 0.01090
Epoch: 18592, Training Loss: 0.01013
Epoch: 18592, Training Loss: 0.01245
Epoch: 18592, Training Loss: 0.00960
Epoch: 18593, Training Loss: 0.01090
Epoch: 18593, Training Loss: 0.01013
Epoch: 18593, Training Loss: 0.01245
Epoch: 18593, Training Loss: 0.00960
Epoch: 18594, Training Loss: 0.01090
Epoch: 18594, Training Loss: 0.01013
Epoch: 18594, Training Loss: 0.01245
Epoch: 18594, Training Loss: 0.00960
Epoch: 18595, Training Loss: 0.01090
Epoch: 18595, Training Loss: 0.01013
Epoch: 18595, Training Loss: 0.01245
Epoch: 18595, Training Loss: 0.00960
Epoch: 18596, Training Loss: 0.01090
Epoch: 18596, Training Loss: 0.01013
Epoch: 18596, Training Loss: 0.01245
Epoch: 18596, Training Loss: 0.00960
Epoch: 18597, Training Loss: 0.01090
Epoch: 18597, Training Loss: 0.01013
Epoch: 18597, Training Loss: 0.01245
Epoch: 18597, Training Loss: 0.00960
Epoch: 18598, Training Loss: 0.01090
Epoch: 18598, Training Loss: 0.01013
Epoch: 18598, Training Loss: 0.01245
Epoch: 18598, Training Loss: 0.00960
Epoch: 18599, Training Loss: 0.01090
Epoch: 18599, Training Loss: 0.01013
Epoch: 18599, Training Loss: 0.01245
Epoch: 18599, Training Loss: 0.00960
Epoch: 18600, Training Loss: 0.01090
Epoch: 18600, Training Loss: 0.01013
Epoch: 18600, Training Loss: 0.01245
Epoch: 18600, Training Loss: 0.00960
Epoch: 18601, Training Loss: 0.01090
Epoch: 18601, Training Loss: 0.01013
Epoch: 18601, Training Loss: 0.01245
Epoch: 18601, Training Loss: 0.00960
Epoch: 18602, Training Loss: 0.01090
Epoch: 18602, Training Loss: 0.01013
Epoch: 18602, Training Loss: 0.01245
Epoch: 18602, Training Loss: 0.00960
Epoch: 18603, Training Loss: 0.01090
Epoch: 18603, Training Loss: 0.01013
Epoch: 18603, Training Loss: 0.01244
Epoch: 18603, Training Loss: 0.00960
Epoch: 18604, Training Loss: 0.01090
Epoch: 18604, Training Loss: 0.01013
Epoch: 18604, Training Loss: 0.01244
Epoch: 18604, Training Loss: 0.00960
Epoch: 18605, Training Loss: 0.01090
Epoch: 18605, Training Loss: 0.01013
Epoch: 18605, Training Loss: 0.01244
Epoch: 18605, Training Loss: 0.00960
Epoch: 18606, Training Loss: 0.01090
Epoch: 18606, Training Loss: 0.01013
Epoch: 18606, Training Loss: 0.01244
Epoch: 18606, Training Loss: 0.00960
Epoch: 18607, Training Loss: 0.01090
Epoch: 18607, Training Loss: 0.01013
Epoch: 18607, Training Loss: 0.01244
Epoch: 18607, Training Loss: 0.00960
Epoch: 18608, Training Loss: 0.01090
Epoch: 18608, Training Loss: 0.01013
Epoch: 18608, Training Loss: 0.01244
Epoch: 18608, Training Loss: 0.00960
Epoch: 18609, Training Loss: 0.01090
Epoch: 18609, Training Loss: 0.01013
Epoch: 18609, Training Loss: 0.01244
Epoch: 18609, Training Loss: 0.00960
Epoch: 18610, Training Loss: 0.01090
Epoch: 18610, Training Loss: 0.01013
Epoch: 18610, Training Loss: 0.01244
Epoch: 18610, Training Loss: 0.00960
Epoch: 18611, Training Loss: 0.01090
Epoch: 18611, Training Loss: 0.01013
Epoch: 18611, Training Loss: 0.01244
Epoch: 18611, Training Loss: 0.00960
Epoch: 18612, Training Loss: 0.01090
Epoch: 18612, Training Loss: 0.01013
Epoch: 18612, Training Loss: 0.01244
Epoch: 18612, Training Loss: 0.00960
Epoch: 18613, Training Loss: 0.01090
Epoch: 18613, Training Loss: 0.01013
Epoch: 18613, Training Loss: 0.01244
Epoch: 18613, Training Loss: 0.00960
Epoch: 18614, Training Loss: 0.01090
Epoch: 18614, Training Loss: 0.01013
Epoch: 18614, Training Loss: 0.01244
Epoch: 18614, Training Loss: 0.00960
Epoch: 18615, Training Loss: 0.01090
Epoch: 18615, Training Loss: 0.01013
Epoch: 18615, Training Loss: 0.01244
Epoch: 18615, Training Loss: 0.00960
Epoch: 18616, Training Loss: 0.01090
Epoch: 18616, Training Loss: 0.01013
Epoch: 18616, Training Loss: 0.01244
Epoch: 18616, Training Loss: 0.00960
Epoch: 18617, Training Loss: 0.01090
Epoch: 18617, Training Loss: 0.01013
Epoch: 18617, Training Loss: 0.01244
Epoch: 18617, Training Loss: 0.00960
Epoch: 18618, Training Loss: 0.01090
Epoch: 18618, Training Loss: 0.01013
Epoch: 18618, Training Loss: 0.01244
Epoch: 18618, Training Loss: 0.00960
Epoch: 18619, Training Loss: 0.01090
Epoch: 18619, Training Loss: 0.01013
Epoch: 18619, Training Loss: 0.01244
Epoch: 18619, Training Loss: 0.00960
Epoch: 18620, Training Loss: 0.01090
Epoch: 18620, Training Loss: 0.01013
Epoch: 18620, Training Loss: 0.01244
Epoch: 18620, Training Loss: 0.00960
Epoch: 18621, Training Loss: 0.01090
Epoch: 18621, Training Loss: 0.01013
Epoch: 18621, Training Loss: 0.01244
Epoch: 18621, Training Loss: 0.00960
Epoch: 18622, Training Loss: 0.01090
Epoch: 18622, Training Loss: 0.01013
Epoch: 18622, Training Loss: 0.01244
Epoch: 18622, Training Loss: 0.00960
Epoch: 18623, Training Loss: 0.01089
Epoch: 18623, Training Loss: 0.01012
Epoch: 18623, Training Loss: 0.01244
Epoch: 18623, Training Loss: 0.00960
Epoch: 18624, Training Loss: 0.01089
Epoch: 18624, Training Loss: 0.01012
Epoch: 18624, Training Loss: 0.01244
Epoch: 18624, Training Loss: 0.00960
Epoch: 18625, Training Loss: 0.01089
Epoch: 18625, Training Loss: 0.01012
Epoch: 18625, Training Loss: 0.01244
Epoch: 18625, Training Loss: 0.00960
Epoch: 18626, Training Loss: 0.01089
Epoch: 18626, Training Loss: 0.01012
Epoch: 18626, Training Loss: 0.01244
Epoch: 18626, Training Loss: 0.00959
Epoch: 18627, Training Loss: 0.01089
Epoch: 18627, Training Loss: 0.01012
Epoch: 18627, Training Loss: 0.01244
Epoch: 18627, Training Loss: 0.00959
Epoch: 18628, Training Loss: 0.01089
Epoch: 18628, Training Loss: 0.01012
Epoch: 18628, Training Loss: 0.01244
Epoch: 18628, Training Loss: 0.00959
Epoch: 18629, Training Loss: 0.01089
Epoch: 18629, Training Loss: 0.01012
Epoch: 18629, Training Loss: 0.01244
Epoch: 18629, Training Loss: 0.00959
Epoch: 18630, Training Loss: 0.01089
Epoch: 18630, Training Loss: 0.01012
Epoch: 18630, Training Loss: 0.01243
Epoch: 18630, Training Loss: 0.00959
Epoch: 18631, Training Loss: 0.01089
Epoch: 18631, Training Loss: 0.01012
Epoch: 18631, Training Loss: 0.01243
Epoch: 18631, Training Loss: 0.00959
Epoch: 18632, Training Loss: 0.01089
Epoch: 18632, Training Loss: 0.01012
Epoch: 18632, Training Loss: 0.01243
Epoch: 18632, Training Loss: 0.00959
Epoch: 18633, Training Loss: 0.01089
Epoch: 18633, Training Loss: 0.01012
Epoch: 18633, Training Loss: 0.01243
Epoch: 18633, Training Loss: 0.00959
Epoch: 18634, Training Loss: 0.01089
Epoch: 18634, Training Loss: 0.01012
Epoch: 18634, Training Loss: 0.01243
Epoch: 18634, Training Loss: 0.00959
Epoch: 18635, Training Loss: 0.01089
Epoch: 18635, Training Loss: 0.01012
Epoch: 18635, Training Loss: 0.01243
Epoch: 18635, Training Loss: 0.00959
Epoch: 18636, Training Loss: 0.01089
Epoch: 18636, Training Loss: 0.01012
Epoch: 18636, Training Loss: 0.01243
Epoch: 18636, Training Loss: 0.00959
Epoch: 18637, Training Loss: 0.01089
Epoch: 18637, Training Loss: 0.01012
Epoch: 18637, Training Loss: 0.01243
Epoch: 18637, Training Loss: 0.00959
Epoch: 18638, Training Loss: 0.01089
Epoch: 18638, Training Loss: 0.01012
Epoch: 18638, Training Loss: 0.01243
Epoch: 18638, Training Loss: 0.00959
... (repetitive output elided: the four per-sample losses decrease very slowly, by roughly 0.0001 every ~150 epochs, over the next ~240 epochs) ...
Epoch: 18881, Training Loss: 0.01081
Epoch: 18881, Training Loss: 0.01005
Epoch: 18881, Training Loss: 0.01234
Epoch: 18881, Training Loss: 0.00953
Epoch: 18882, Training Loss: 0.01081
Epoch: 18882, Training Loss: 0.01005
Epoch: 18882, Training Loss: 0.01234
Epoch: 18882, Training Loss: 0.00953
Epoch: 18883, Training Loss: 0.01081
Epoch: 18883, Training Loss: 0.01005
Epoch: 18883, Training Loss: 0.01234
Epoch: 18883, Training Loss: 0.00952
Epoch: 18884, Training Loss: 0.01081
Epoch: 18884, Training Loss: 0.01005
Epoch: 18884, Training Loss: 0.01234
Epoch: 18884, Training Loss: 0.00952
Epoch: 18885, Training Loss: 0.01081
Epoch: 18885, Training Loss: 0.01005
Epoch: 18885, Training Loss: 0.01234
Epoch: 18885, Training Loss: 0.00952
Epoch: 18886, Training Loss: 0.01081
Epoch: 18886, Training Loss: 0.01005
Epoch: 18886, Training Loss: 0.01234
Epoch: 18886, Training Loss: 0.00952
Epoch: 18887, Training Loss: 0.01081
Epoch: 18887, Training Loss: 0.01005
Epoch: 18887, Training Loss: 0.01234
Epoch: 18887, Training Loss: 0.00952
Epoch: 18888, Training Loss: 0.01081
Epoch: 18888, Training Loss: 0.01005
Epoch: 18888, Training Loss: 0.01234
Epoch: 18888, Training Loss: 0.00952
Epoch: 18889, Training Loss: 0.01081
Epoch: 18889, Training Loss: 0.01005
Epoch: 18889, Training Loss: 0.01234
Epoch: 18889, Training Loss: 0.00952
Epoch: 18890, Training Loss: 0.01081
Epoch: 18890, Training Loss: 0.01005
Epoch: 18890, Training Loss: 0.01234
Epoch: 18890, Training Loss: 0.00952
Epoch: 18891, Training Loss: 0.01081
Epoch: 18891, Training Loss: 0.01005
Epoch: 18891, Training Loss: 0.01234
Epoch: 18891, Training Loss: 0.00952
Epoch: 18892, Training Loss: 0.01081
Epoch: 18892, Training Loss: 0.01005
Epoch: 18892, Training Loss: 0.01234
Epoch: 18892, Training Loss: 0.00952
Epoch: 18893, Training Loss: 0.01081
Epoch: 18893, Training Loss: 0.01005
Epoch: 18893, Training Loss: 0.01234
Epoch: 18893, Training Loss: 0.00952
Epoch: 18894, Training Loss: 0.01081
Epoch: 18894, Training Loss: 0.01005
Epoch: 18894, Training Loss: 0.01234
Epoch: 18894, Training Loss: 0.00952
Epoch: 18895, Training Loss: 0.01081
Epoch: 18895, Training Loss: 0.01005
Epoch: 18895, Training Loss: 0.01234
Epoch: 18895, Training Loss: 0.00952
Epoch: 18896, Training Loss: 0.01081
Epoch: 18896, Training Loss: 0.01005
Epoch: 18896, Training Loss: 0.01234
Epoch: 18896, Training Loss: 0.00952
Epoch: 18897, Training Loss: 0.01081
Epoch: 18897, Training Loss: 0.01005
Epoch: 18897, Training Loss: 0.01234
Epoch: 18897, Training Loss: 0.00952
Epoch: 18898, Training Loss: 0.01081
Epoch: 18898, Training Loss: 0.01005
Epoch: 18898, Training Loss: 0.01234
Epoch: 18898, Training Loss: 0.00952
Epoch: 18899, Training Loss: 0.01081
Epoch: 18899, Training Loss: 0.01005
Epoch: 18899, Training Loss: 0.01234
Epoch: 18899, Training Loss: 0.00952
Epoch: 18900, Training Loss: 0.01081
Epoch: 18900, Training Loss: 0.01005
Epoch: 18900, Training Loss: 0.01234
Epoch: 18900, Training Loss: 0.00952
Epoch: 18901, Training Loss: 0.01081
Epoch: 18901, Training Loss: 0.01005
Epoch: 18901, Training Loss: 0.01234
Epoch: 18901, Training Loss: 0.00952
Epoch: 18902, Training Loss: 0.01081
Epoch: 18902, Training Loss: 0.01005
Epoch: 18902, Training Loss: 0.01234
Epoch: 18902, Training Loss: 0.00952
Epoch: 18903, Training Loss: 0.01081
Epoch: 18903, Training Loss: 0.01005
Epoch: 18903, Training Loss: 0.01234
Epoch: 18903, Training Loss: 0.00952
Epoch: 18904, Training Loss: 0.01081
Epoch: 18904, Training Loss: 0.01004
Epoch: 18904, Training Loss: 0.01234
Epoch: 18904, Training Loss: 0.00952
Epoch: 18905, Training Loss: 0.01081
Epoch: 18905, Training Loss: 0.01004
Epoch: 18905, Training Loss: 0.01234
Epoch: 18905, Training Loss: 0.00952
Epoch: 18906, Training Loss: 0.01081
Epoch: 18906, Training Loss: 0.01004
Epoch: 18906, Training Loss: 0.01234
Epoch: 18906, Training Loss: 0.00952
Epoch: 18907, Training Loss: 0.01081
Epoch: 18907, Training Loss: 0.01004
Epoch: 18907, Training Loss: 0.01234
Epoch: 18907, Training Loss: 0.00952
Epoch: 18908, Training Loss: 0.01081
Epoch: 18908, Training Loss: 0.01004
Epoch: 18908, Training Loss: 0.01234
Epoch: 18908, Training Loss: 0.00952
Epoch: 18909, Training Loss: 0.01080
Epoch: 18909, Training Loss: 0.01004
Epoch: 18909, Training Loss: 0.01234
Epoch: 18909, Training Loss: 0.00952
Epoch: 18910, Training Loss: 0.01080
Epoch: 18910, Training Loss: 0.01004
Epoch: 18910, Training Loss: 0.01233
Epoch: 18910, Training Loss: 0.00952
Epoch: 18911, Training Loss: 0.01080
Epoch: 18911, Training Loss: 0.01004
Epoch: 18911, Training Loss: 0.01233
Epoch: 18911, Training Loss: 0.00952
Epoch: 18912, Training Loss: 0.01080
Epoch: 18912, Training Loss: 0.01004
Epoch: 18912, Training Loss: 0.01233
Epoch: 18912, Training Loss: 0.00952
Epoch: 18913, Training Loss: 0.01080
Epoch: 18913, Training Loss: 0.01004
Epoch: 18913, Training Loss: 0.01233
Epoch: 18913, Training Loss: 0.00952
Epoch: 18914, Training Loss: 0.01080
Epoch: 18914, Training Loss: 0.01004
Epoch: 18914, Training Loss: 0.01233
Epoch: 18914, Training Loss: 0.00952
Epoch: 18915, Training Loss: 0.01080
Epoch: 18915, Training Loss: 0.01004
Epoch: 18915, Training Loss: 0.01233
Epoch: 18915, Training Loss: 0.00952
Epoch: 18916, Training Loss: 0.01080
Epoch: 18916, Training Loss: 0.01004
Epoch: 18916, Training Loss: 0.01233
Epoch: 18916, Training Loss: 0.00952
Epoch: 18917, Training Loss: 0.01080
Epoch: 18917, Training Loss: 0.01004
Epoch: 18917, Training Loss: 0.01233
Epoch: 18917, Training Loss: 0.00952
Epoch: 18918, Training Loss: 0.01080
Epoch: 18918, Training Loss: 0.01004
Epoch: 18918, Training Loss: 0.01233
Epoch: 18918, Training Loss: 0.00952
Epoch: 18919, Training Loss: 0.01080
Epoch: 18919, Training Loss: 0.01004
Epoch: 18919, Training Loss: 0.01233
Epoch: 18919, Training Loss: 0.00952
Epoch: 18920, Training Loss: 0.01080
Epoch: 18920, Training Loss: 0.01004
Epoch: 18920, Training Loss: 0.01233
Epoch: 18920, Training Loss: 0.00951
Epoch: 18921, Training Loss: 0.01080
Epoch: 18921, Training Loss: 0.01004
Epoch: 18921, Training Loss: 0.01233
Epoch: 18921, Training Loss: 0.00951
Epoch: 18922, Training Loss: 0.01080
Epoch: 18922, Training Loss: 0.01004
Epoch: 18922, Training Loss: 0.01233
Epoch: 18922, Training Loss: 0.00951
Epoch: 18923, Training Loss: 0.01080
Epoch: 18923, Training Loss: 0.01004
Epoch: 18923, Training Loss: 0.01233
Epoch: 18923, Training Loss: 0.00951
Epoch: 18924, Training Loss: 0.01080
Epoch: 18924, Training Loss: 0.01004
Epoch: 18924, Training Loss: 0.01233
Epoch: 18924, Training Loss: 0.00951
Epoch: 18925, Training Loss: 0.01080
Epoch: 18925, Training Loss: 0.01004
Epoch: 18925, Training Loss: 0.01233
Epoch: 18925, Training Loss: 0.00951
Epoch: 18926, Training Loss: 0.01080
Epoch: 18926, Training Loss: 0.01004
Epoch: 18926, Training Loss: 0.01233
Epoch: 18926, Training Loss: 0.00951
Epoch: 18927, Training Loss: 0.01080
Epoch: 18927, Training Loss: 0.01004
Epoch: 18927, Training Loss: 0.01233
Epoch: 18927, Training Loss: 0.00951
Epoch: 18928, Training Loss: 0.01080
Epoch: 18928, Training Loss: 0.01004
Epoch: 18928, Training Loss: 0.01233
Epoch: 18928, Training Loss: 0.00951
Epoch: 18929, Training Loss: 0.01080
Epoch: 18929, Training Loss: 0.01004
Epoch: 18929, Training Loss: 0.01233
Epoch: 18929, Training Loss: 0.00951
Epoch: 18930, Training Loss: 0.01080
Epoch: 18930, Training Loss: 0.01004
Epoch: 18930, Training Loss: 0.01233
Epoch: 18930, Training Loss: 0.00951
Epoch: 18931, Training Loss: 0.01080
Epoch: 18931, Training Loss: 0.01004
Epoch: 18931, Training Loss: 0.01233
Epoch: 18931, Training Loss: 0.00951
Epoch: 18932, Training Loss: 0.01080
Epoch: 18932, Training Loss: 0.01004
Epoch: 18932, Training Loss: 0.01233
Epoch: 18932, Training Loss: 0.00951
Epoch: 18933, Training Loss: 0.01080
Epoch: 18933, Training Loss: 0.01004
Epoch: 18933, Training Loss: 0.01233
Epoch: 18933, Training Loss: 0.00951
Epoch: 18934, Training Loss: 0.01080
Epoch: 18934, Training Loss: 0.01004
Epoch: 18934, Training Loss: 0.01233
Epoch: 18934, Training Loss: 0.00951
Epoch: 18935, Training Loss: 0.01080
Epoch: 18935, Training Loss: 0.01004
Epoch: 18935, Training Loss: 0.01233
Epoch: 18935, Training Loss: 0.00951
Epoch: 18936, Training Loss: 0.01080
Epoch: 18936, Training Loss: 0.01004
Epoch: 18936, Training Loss: 0.01233
Epoch: 18936, Training Loss: 0.00951
Epoch: 18937, Training Loss: 0.01080
Epoch: 18937, Training Loss: 0.01004
Epoch: 18937, Training Loss: 0.01233
Epoch: 18937, Training Loss: 0.00951
Epoch: 18938, Training Loss: 0.01080
Epoch: 18938, Training Loss: 0.01004
Epoch: 18938, Training Loss: 0.01232
Epoch: 18938, Training Loss: 0.00951
Epoch: 18939, Training Loss: 0.01080
Epoch: 18939, Training Loss: 0.01003
Epoch: 18939, Training Loss: 0.01232
Epoch: 18939, Training Loss: 0.00951
Epoch: 18940, Training Loss: 0.01080
Epoch: 18940, Training Loss: 0.01003
Epoch: 18940, Training Loss: 0.01232
Epoch: 18940, Training Loss: 0.00951
Epoch: 18941, Training Loss: 0.01080
Epoch: 18941, Training Loss: 0.01003
Epoch: 18941, Training Loss: 0.01232
Epoch: 18941, Training Loss: 0.00951
Epoch: 18942, Training Loss: 0.01079
Epoch: 18942, Training Loss: 0.01003
Epoch: 18942, Training Loss: 0.01232
Epoch: 18942, Training Loss: 0.00951
Epoch: 18943, Training Loss: 0.01079
Epoch: 18943, Training Loss: 0.01003
Epoch: 18943, Training Loss: 0.01232
Epoch: 18943, Training Loss: 0.00951
Epoch: 18944, Training Loss: 0.01079
Epoch: 18944, Training Loss: 0.01003
Epoch: 18944, Training Loss: 0.01232
Epoch: 18944, Training Loss: 0.00951
Epoch: 18945, Training Loss: 0.01079
Epoch: 18945, Training Loss: 0.01003
Epoch: 18945, Training Loss: 0.01232
Epoch: 18945, Training Loss: 0.00951
Epoch: 18946, Training Loss: 0.01079
Epoch: 18946, Training Loss: 0.01003
Epoch: 18946, Training Loss: 0.01232
Epoch: 18946, Training Loss: 0.00951
Epoch: 18947, Training Loss: 0.01079
Epoch: 18947, Training Loss: 0.01003
Epoch: 18947, Training Loss: 0.01232
Epoch: 18947, Training Loss: 0.00951
Epoch: 18948, Training Loss: 0.01079
Epoch: 18948, Training Loss: 0.01003
Epoch: 18948, Training Loss: 0.01232
Epoch: 18948, Training Loss: 0.00951
Epoch: 18949, Training Loss: 0.01079
Epoch: 18949, Training Loss: 0.01003
Epoch: 18949, Training Loss: 0.01232
Epoch: 18949, Training Loss: 0.00951
Epoch: 18950, Training Loss: 0.01079
Epoch: 18950, Training Loss: 0.01003
Epoch: 18950, Training Loss: 0.01232
Epoch: 18950, Training Loss: 0.00951
Epoch: 18951, Training Loss: 0.01079
Epoch: 18951, Training Loss: 0.01003
Epoch: 18951, Training Loss: 0.01232
Epoch: 18951, Training Loss: 0.00951
Epoch: 18952, Training Loss: 0.01079
Epoch: 18952, Training Loss: 0.01003
Epoch: 18952, Training Loss: 0.01232
Epoch: 18952, Training Loss: 0.00951
Epoch: 18953, Training Loss: 0.01079
Epoch: 18953, Training Loss: 0.01003
Epoch: 18953, Training Loss: 0.01232
Epoch: 18953, Training Loss: 0.00951
Epoch: 18954, Training Loss: 0.01079
Epoch: 18954, Training Loss: 0.01003
Epoch: 18954, Training Loss: 0.01232
Epoch: 18954, Training Loss: 0.00951
Epoch: 18955, Training Loss: 0.01079
Epoch: 18955, Training Loss: 0.01003
Epoch: 18955, Training Loss: 0.01232
Epoch: 18955, Training Loss: 0.00951
Epoch: 18956, Training Loss: 0.01079
Epoch: 18956, Training Loss: 0.01003
Epoch: 18956, Training Loss: 0.01232
Epoch: 18956, Training Loss: 0.00951
Epoch: 18957, Training Loss: 0.01079
Epoch: 18957, Training Loss: 0.01003
Epoch: 18957, Training Loss: 0.01232
Epoch: 18957, Training Loss: 0.00951
Epoch: 18958, Training Loss: 0.01079
Epoch: 18958, Training Loss: 0.01003
Epoch: 18958, Training Loss: 0.01232
Epoch: 18958, Training Loss: 0.00950
Epoch: 18959, Training Loss: 0.01079
Epoch: 18959, Training Loss: 0.01003
Epoch: 18959, Training Loss: 0.01232
Epoch: 18959, Training Loss: 0.00950
Epoch: 18960, Training Loss: 0.01079
Epoch: 18960, Training Loss: 0.01003
Epoch: 18960, Training Loss: 0.01232
Epoch: 18960, Training Loss: 0.00950
Epoch: 18961, Training Loss: 0.01079
Epoch: 18961, Training Loss: 0.01003
Epoch: 18961, Training Loss: 0.01232
Epoch: 18961, Training Loss: 0.00950
Epoch: 18962, Training Loss: 0.01079
Epoch: 18962, Training Loss: 0.01003
Epoch: 18962, Training Loss: 0.01232
Epoch: 18962, Training Loss: 0.00950
Epoch: 18963, Training Loss: 0.01079
Epoch: 18963, Training Loss: 0.01003
Epoch: 18963, Training Loss: 0.01232
Epoch: 18963, Training Loss: 0.00950
Epoch: 18964, Training Loss: 0.01079
Epoch: 18964, Training Loss: 0.01003
Epoch: 18964, Training Loss: 0.01232
Epoch: 18964, Training Loss: 0.00950
Epoch: 18965, Training Loss: 0.01079
Epoch: 18965, Training Loss: 0.01003
Epoch: 18965, Training Loss: 0.01232
Epoch: 18965, Training Loss: 0.00950
Epoch: 18966, Training Loss: 0.01079
Epoch: 18966, Training Loss: 0.01003
Epoch: 18966, Training Loss: 0.01231
Epoch: 18966, Training Loss: 0.00950
Epoch: 18967, Training Loss: 0.01079
Epoch: 18967, Training Loss: 0.01003
Epoch: 18967, Training Loss: 0.01231
Epoch: 18967, Training Loss: 0.00950
Epoch: 18968, Training Loss: 0.01079
Epoch: 18968, Training Loss: 0.01003
Epoch: 18968, Training Loss: 0.01231
Epoch: 18968, Training Loss: 0.00950
Epoch: 18969, Training Loss: 0.01079
Epoch: 18969, Training Loss: 0.01003
Epoch: 18969, Training Loss: 0.01231
Epoch: 18969, Training Loss: 0.00950
Epoch: 18970, Training Loss: 0.01079
Epoch: 18970, Training Loss: 0.01003
Epoch: 18970, Training Loss: 0.01231
Epoch: 18970, Training Loss: 0.00950
Epoch: 18971, Training Loss: 0.01079
Epoch: 18971, Training Loss: 0.01003
Epoch: 18971, Training Loss: 0.01231
Epoch: 18971, Training Loss: 0.00950
Epoch: 18972, Training Loss: 0.01079
Epoch: 18972, Training Loss: 0.01003
Epoch: 18972, Training Loss: 0.01231
Epoch: 18972, Training Loss: 0.00950
Epoch: 18973, Training Loss: 0.01079
Epoch: 18973, Training Loss: 0.01003
Epoch: 18973, Training Loss: 0.01231
Epoch: 18973, Training Loss: 0.00950
Epoch: 18974, Training Loss: 0.01078
Epoch: 18974, Training Loss: 0.01003
Epoch: 18974, Training Loss: 0.01231
Epoch: 18974, Training Loss: 0.00950
Epoch: 18975, Training Loss: 0.01078
Epoch: 18975, Training Loss: 0.01002
Epoch: 18975, Training Loss: 0.01231
Epoch: 18975, Training Loss: 0.00950
Epoch: 18976, Training Loss: 0.01078
Epoch: 18976, Training Loss: 0.01002
Epoch: 18976, Training Loss: 0.01231
Epoch: 18976, Training Loss: 0.00950
Epoch: 18977, Training Loss: 0.01078
Epoch: 18977, Training Loss: 0.01002
Epoch: 18977, Training Loss: 0.01231
Epoch: 18977, Training Loss: 0.00950
Epoch: 18978, Training Loss: 0.01078
Epoch: 18978, Training Loss: 0.01002
Epoch: 18978, Training Loss: 0.01231
Epoch: 18978, Training Loss: 0.00950
Epoch: 18979, Training Loss: 0.01078
Epoch: 18979, Training Loss: 0.01002
Epoch: 18979, Training Loss: 0.01231
Epoch: 18979, Training Loss: 0.00950
Epoch: 18980, Training Loss: 0.01078
Epoch: 18980, Training Loss: 0.01002
Epoch: 18980, Training Loss: 0.01231
Epoch: 18980, Training Loss: 0.00950
Epoch: 18981, Training Loss: 0.01078
Epoch: 18981, Training Loss: 0.01002
Epoch: 18981, Training Loss: 0.01231
Epoch: 18981, Training Loss: 0.00950
Epoch: 18982, Training Loss: 0.01078
Epoch: 18982, Training Loss: 0.01002
Epoch: 18982, Training Loss: 0.01231
Epoch: 18982, Training Loss: 0.00950
Epoch: 18983, Training Loss: 0.01078
Epoch: 18983, Training Loss: 0.01002
Epoch: 18983, Training Loss: 0.01231
Epoch: 18983, Training Loss: 0.00950
Epoch: 18984, Training Loss: 0.01078
Epoch: 18984, Training Loss: 0.01002
Epoch: 18984, Training Loss: 0.01231
Epoch: 18984, Training Loss: 0.00950
Epoch: 18985, Training Loss: 0.01078
Epoch: 18985, Training Loss: 0.01002
Epoch: 18985, Training Loss: 0.01231
Epoch: 18985, Training Loss: 0.00950
Epoch: 18986, Training Loss: 0.01078
Epoch: 18986, Training Loss: 0.01002
Epoch: 18986, Training Loss: 0.01231
Epoch: 18986, Training Loss: 0.00950
Epoch: 18987, Training Loss: 0.01078
Epoch: 18987, Training Loss: 0.01002
Epoch: 18987, Training Loss: 0.01231
Epoch: 18987, Training Loss: 0.00950
Epoch: 18988, Training Loss: 0.01078
Epoch: 18988, Training Loss: 0.01002
Epoch: 18988, Training Loss: 0.01231
Epoch: 18988, Training Loss: 0.00950
Epoch: 18989, Training Loss: 0.01078
Epoch: 18989, Training Loss: 0.01002
Epoch: 18989, Training Loss: 0.01231
Epoch: 18989, Training Loss: 0.00950
Epoch: 18990, Training Loss: 0.01078
Epoch: 18990, Training Loss: 0.01002
Epoch: 18990, Training Loss: 0.01231
Epoch: 18990, Training Loss: 0.00950
Epoch: 18991, Training Loss: 0.01078
Epoch: 18991, Training Loss: 0.01002
Epoch: 18991, Training Loss: 0.01231
Epoch: 18991, Training Loss: 0.00950
Epoch: 18992, Training Loss: 0.01078
Epoch: 18992, Training Loss: 0.01002
Epoch: 18992, Training Loss: 0.01231
Epoch: 18992, Training Loss: 0.00950
Epoch: 18993, Training Loss: 0.01078
Epoch: 18993, Training Loss: 0.01002
Epoch: 18993, Training Loss: 0.01231
Epoch: 18993, Training Loss: 0.00950
Epoch: 18994, Training Loss: 0.01078
Epoch: 18994, Training Loss: 0.01002
Epoch: 18994, Training Loss: 0.01231
Epoch: 18994, Training Loss: 0.00950
Epoch: 18995, Training Loss: 0.01078
Epoch: 18995, Training Loss: 0.01002
Epoch: 18995, Training Loss: 0.01230
Epoch: 18995, Training Loss: 0.00949
Epoch: 18996, Training Loss: 0.01078
Epoch: 18996, Training Loss: 0.01002
Epoch: 18996, Training Loss: 0.01230
Epoch: 18996, Training Loss: 0.00949
Epoch: 18997, Training Loss: 0.01078
Epoch: 18997, Training Loss: 0.01002
Epoch: 18997, Training Loss: 0.01230
Epoch: 18997, Training Loss: 0.00949
Epoch: 18998, Training Loss: 0.01078
Epoch: 18998, Training Loss: 0.01002
Epoch: 18998, Training Loss: 0.01230
Epoch: 18998, Training Loss: 0.00949
Epoch: 18999, Training Loss: 0.01078
Epoch: 18999, Training Loss: 0.01002
Epoch: 18999, Training Loss: 0.01230
Epoch: 18999, Training Loss: 0.00949
Epoch: 19000, Training Loss: 0.01078
Epoch: 19000, Training Loss: 0.01002
Epoch: 19000, Training Loss: 0.01230
Epoch: 19000, Training Loss: 0.00949
Epoch: 19001, Training Loss: 0.01078
Epoch: 19001, Training Loss: 0.01002
Epoch: 19001, Training Loss: 0.01230
Epoch: 19001, Training Loss: 0.00949
Epoch: 19002, Training Loss: 0.01078
Epoch: 19002, Training Loss: 0.01002
Epoch: 19002, Training Loss: 0.01230
Epoch: 19002, Training Loss: 0.00949
Epoch: 19003, Training Loss: 0.01078
Epoch: 19003, Training Loss: 0.01002
Epoch: 19003, Training Loss: 0.01230
Epoch: 19003, Training Loss: 0.00949
Epoch: 19004, Training Loss: 0.01078
Epoch: 19004, Training Loss: 0.01002
Epoch: 19004, Training Loss: 0.01230
Epoch: 19004, Training Loss: 0.00949
Epoch: 19005, Training Loss: 0.01078
Epoch: 19005, Training Loss: 0.01002
Epoch: 19005, Training Loss: 0.01230
Epoch: 19005, Training Loss: 0.00949
Epoch: 19006, Training Loss: 0.01078
Epoch: 19006, Training Loss: 0.01002
Epoch: 19006, Training Loss: 0.01230
Epoch: 19006, Training Loss: 0.00949
Epoch: 19007, Training Loss: 0.01077
Epoch: 19007, Training Loss: 0.01002
Epoch: 19007, Training Loss: 0.01230
Epoch: 19007, Training Loss: 0.00949
Epoch: 19008, Training Loss: 0.01077
Epoch: 19008, Training Loss: 0.01002
Epoch: 19008, Training Loss: 0.01230
Epoch: 19008, Training Loss: 0.00949
Epoch: 19009, Training Loss: 0.01077
Epoch: 19009, Training Loss: 0.01002
Epoch: 19009, Training Loss: 0.01230
Epoch: 19009, Training Loss: 0.00949
Epoch: 19010, Training Loss: 0.01077
Epoch: 19010, Training Loss: 0.01001
Epoch: 19010, Training Loss: 0.01230
Epoch: 19010, Training Loss: 0.00949
Epoch: 19011, Training Loss: 0.01077
Epoch: 19011, Training Loss: 0.01001
Epoch: 19011, Training Loss: 0.01230
Epoch: 19011, Training Loss: 0.00949
Epoch: 19012, Training Loss: 0.01077
Epoch: 19012, Training Loss: 0.01001
Epoch: 19012, Training Loss: 0.01230
Epoch: 19012, Training Loss: 0.00949
Epoch: 19013, Training Loss: 0.01077
Epoch: 19013, Training Loss: 0.01001
Epoch: 19013, Training Loss: 0.01230
Epoch: 19013, Training Loss: 0.00949
Epoch: 19014, Training Loss: 0.01077
Epoch: 19014, Training Loss: 0.01001
Epoch: 19014, Training Loss: 0.01230
Epoch: 19014, Training Loss: 0.00949
Epoch: 19015, Training Loss: 0.01077
Epoch: 19015, Training Loss: 0.01001
Epoch: 19015, Training Loss: 0.01230
Epoch: 19015, Training Loss: 0.00949
Epoch: 19016, Training Loss: 0.01077
Epoch: 19016, Training Loss: 0.01001
Epoch: 19016, Training Loss: 0.01230
Epoch: 19016, Training Loss: 0.00949
Epoch: 19017, Training Loss: 0.01077
Epoch: 19017, Training Loss: 0.01001
Epoch: 19017, Training Loss: 0.01230
Epoch: 19017, Training Loss: 0.00949
Epoch: 19018, Training Loss: 0.01077
Epoch: 19018, Training Loss: 0.01001
Epoch: 19018, Training Loss: 0.01230
Epoch: 19018, Training Loss: 0.00949
Epoch: 19019, Training Loss: 0.01077
Epoch: 19019, Training Loss: 0.01001
Epoch: 19019, Training Loss: 0.01230
Epoch: 19019, Training Loss: 0.00949
Epoch: 19020, Training Loss: 0.01077
Epoch: 19020, Training Loss: 0.01001
Epoch: 19020, Training Loss: 0.01230
Epoch: 19020, Training Loss: 0.00949
Epoch: 19021, Training Loss: 0.01077
Epoch: 19021, Training Loss: 0.01001
Epoch: 19021, Training Loss: 0.01230
Epoch: 19021, Training Loss: 0.00949
Epoch: 19022, Training Loss: 0.01077
Epoch: 19022, Training Loss: 0.01001
Epoch: 19022, Training Loss: 0.01230
Epoch: 19022, Training Loss: 0.00949
Epoch: 19023, Training Loss: 0.01077
Epoch: 19023, Training Loss: 0.01001
Epoch: 19023, Training Loss: 0.01229
Epoch: 19023, Training Loss: 0.00949
Epoch: 19024, Training Loss: 0.01077
Epoch: 19024, Training Loss: 0.01001
Epoch: 19024, Training Loss: 0.01229
Epoch: 19024, Training Loss: 0.00949
Epoch: 19025, Training Loss: 0.01077
Epoch: 19025, Training Loss: 0.01001
Epoch: 19025, Training Loss: 0.01229
Epoch: 19025, Training Loss: 0.00949
Epoch: 19026, Training Loss: 0.01077
Epoch: 19026, Training Loss: 0.01001
Epoch: 19026, Training Loss: 0.01229
Epoch: 19026, Training Loss: 0.00949
Epoch: 19027, Training Loss: 0.01077
Epoch: 19027, Training Loss: 0.01001
Epoch: 19027, Training Loss: 0.01229
Epoch: 19027, Training Loss: 0.00949
Epoch: 19028, Training Loss: 0.01077
Epoch: 19028, Training Loss: 0.01001
Epoch: 19028, Training Loss: 0.01229
Epoch: 19028, Training Loss: 0.00949
Epoch: 19029, Training Loss: 0.01077
Epoch: 19029, Training Loss: 0.01001
Epoch: 19029, Training Loss: 0.01229
Epoch: 19029, Training Loss: 0.00949
Epoch: 19030, Training Loss: 0.01077
Epoch: 19030, Training Loss: 0.01001
Epoch: 19030, Training Loss: 0.01229
Epoch: 19030, Training Loss: 0.00949
Epoch: 19031, Training Loss: 0.01077
Epoch: 19031, Training Loss: 0.01001
Epoch: 19031, Training Loss: 0.01229
Epoch: 19031, Training Loss: 0.00949
Epoch: 19032, Training Loss: 0.01077
Epoch: 19032, Training Loss: 0.01001
Epoch: 19032, Training Loss: 0.01229
Epoch: 19032, Training Loss: 0.00949
Epoch: 19033, Training Loss: 0.01077
Epoch: 19033, Training Loss: 0.01001
Epoch: 19033, Training Loss: 0.01229
Epoch: 19033, Training Loss: 0.00948
Epoch: 19034, Training Loss: 0.01077
Epoch: 19034, Training Loss: 0.01001
Epoch: 19034, Training Loss: 0.01229
Epoch: 19034, Training Loss: 0.00948
Epoch: 19035, Training Loss: 0.01077
Epoch: 19035, Training Loss: 0.01001
Epoch: 19035, Training Loss: 0.01229
Epoch: 19035, Training Loss: 0.00948
Epoch: 19036, Training Loss: 0.01077
Epoch: 19036, Training Loss: 0.01001
Epoch: 19036, Training Loss: 0.01229
Epoch: 19036, Training Loss: 0.00948
Epoch: 19037, Training Loss: 0.01077
Epoch: 19037, Training Loss: 0.01001
Epoch: 19037, Training Loss: 0.01229
Epoch: 19037, Training Loss: 0.00948
Epoch: 19038, Training Loss: 0.01077
Epoch: 19038, Training Loss: 0.01001
Epoch: 19038, Training Loss: 0.01229
Epoch: 19038, Training Loss: 0.00948
Epoch: 19039, Training Loss: 0.01076
Epoch: 19039, Training Loss: 0.01001
Epoch: 19039, Training Loss: 0.01229
Epoch: 19039, Training Loss: 0.00948
Epoch: 19040, Training Loss: 0.01076
Epoch: 19040, Training Loss: 0.01001
Epoch: 19040, Training Loss: 0.01229
Epoch: 19040, Training Loss: 0.00948
Epoch: 19041, Training Loss: 0.01076
Epoch: 19041, Training Loss: 0.01001
Epoch: 19041, Training Loss: 0.01229
Epoch: 19041, Training Loss: 0.00948
Epoch: 19042, Training Loss: 0.01076
Epoch: 19042, Training Loss: 0.01001
Epoch: 19042, Training Loss: 0.01229
Epoch: 19042, Training Loss: 0.00948
Epoch: 19043, Training Loss: 0.01076
Epoch: 19043, Training Loss: 0.01001
Epoch: 19043, Training Loss: 0.01229
Epoch: 19043, Training Loss: 0.00948
Epoch: 19044, Training Loss: 0.01076
Epoch: 19044, Training Loss: 0.01001
Epoch: 19044, Training Loss: 0.01229
Epoch: 19044, Training Loss: 0.00948
Epoch: 19045, Training Loss: 0.01076
Epoch: 19045, Training Loss: 0.01001
Epoch: 19045, Training Loss: 0.01229
Epoch: 19045, Training Loss: 0.00948
Epoch: 19046, Training Loss: 0.01076
Epoch: 19046, Training Loss: 0.01000
Epoch: 19046, Training Loss: 0.01229
Epoch: 19046, Training Loss: 0.00948
Epoch: 19047, Training Loss: 0.01076
Epoch: 19047, Training Loss: 0.01000
Epoch: 19047, Training Loss: 0.01229
Epoch: 19047, Training Loss: 0.00948
Epoch: 19048, Training Loss: 0.01076
Epoch: 19048, Training Loss: 0.01000
Epoch: 19048, Training Loss: 0.01229
Epoch: 19048, Training Loss: 0.00948
Epoch: 19049, Training Loss: 0.01076
Epoch: 19049, Training Loss: 0.01000
Epoch: 19049, Training Loss: 0.01229
Epoch: 19049, Training Loss: 0.00948
Epoch: 19050, Training Loss: 0.01076
Epoch: 19050, Training Loss: 0.01000
Epoch: 19050, Training Loss: 0.01229
Epoch: 19050, Training Loss: 0.00948
Epoch: 19051, Training Loss: 0.01076
Epoch: 19051, Training Loss: 0.01000
Epoch: 19051, Training Loss: 0.01229
Epoch: 19051, Training Loss: 0.00948
Epoch: 19052, Training Loss: 0.01076
Epoch: 19052, Training Loss: 0.01000
Epoch: 19052, Training Loss: 0.01228
Epoch: 19052, Training Loss: 0.00948
Epoch: 19053, Training Loss: 0.01076
Epoch: 19053, Training Loss: 0.01000
Epoch: 19053, Training Loss: 0.01228
Epoch: 19053, Training Loss: 0.00948
Epoch: 19054, Training Loss: 0.01076
Epoch: 19054, Training Loss: 0.01000
Epoch: 19054, Training Loss: 0.01228
Epoch: 19054, Training Loss: 0.00948
Epoch: 19055, Training Loss: 0.01076
Epoch: 19055, Training Loss: 0.01000
Epoch: 19055, Training Loss: 0.01228
Epoch: 19055, Training Loss: 0.00948
Epoch: 19056, Training Loss: 0.01076
Epoch: 19056, Training Loss: 0.01000
Epoch: 19056, Training Loss: 0.01228
Epoch: 19056, Training Loss: 0.00948
Epoch: 19057, Training Loss: 0.01076
Epoch: 19057, Training Loss: 0.01000
Epoch: 19057, Training Loss: 0.01228
Epoch: 19057, Training Loss: 0.00948
Epoch: 19058, Training Loss: 0.01076
Epoch: 19058, Training Loss: 0.01000
Epoch: 19058, Training Loss: 0.01228
Epoch: 19058, Training Loss: 0.00948
Epoch: 19059, Training Loss: 0.01076
Epoch: 19059, Training Loss: 0.01000
Epoch: 19059, Training Loss: 0.01228
Epoch: 19059, Training Loss: 0.00948
Epoch: 19060, Training Loss: 0.01076
Epoch: 19060, Training Loss: 0.01000
Epoch: 19060, Training Loss: 0.01228
Epoch: 19060, Training Loss: 0.00948
Epoch: 19061, Training Loss: 0.01076
Epoch: 19061, Training Loss: 0.01000
Epoch: 19061, Training Loss: 0.01228
Epoch: 19061, Training Loss: 0.00948
Epoch: 19062, Training Loss: 0.01076
Epoch: 19062, Training Loss: 0.01000
Epoch: 19062, Training Loss: 0.01228
Epoch: 19062, Training Loss: 0.00948
Epoch: 19063, Training Loss: 0.01076
Epoch: 19063, Training Loss: 0.01000
Epoch: 19063, Training Loss: 0.01228
Epoch: 19063, Training Loss: 0.00948
Epoch: 19064, Training Loss: 0.01076
Epoch: 19064, Training Loss: 0.01000
Epoch: 19064, Training Loss: 0.01228
Epoch: 19064, Training Loss: 0.00948
Epoch: 19065, Training Loss: 0.01076
Epoch: 19065, Training Loss: 0.01000
Epoch: 19065, Training Loss: 0.01228
Epoch: 19065, Training Loss: 0.00948
Epoch: 19066, Training Loss: 0.01076
Epoch: 19066, Training Loss: 0.01000
Epoch: 19066, Training Loss: 0.01228
Epoch: 19066, Training Loss: 0.00948
Epoch: 19067, Training Loss: 0.01076
Epoch: 19067, Training Loss: 0.01000
Epoch: 19067, Training Loss: 0.01228
Epoch: 19067, Training Loss: 0.00948
Epoch: 19068, Training Loss: 0.01076
Epoch: 19068, Training Loss: 0.01000
Epoch: 19068, Training Loss: 0.01228
Epoch: 19068, Training Loss: 0.00948
Epoch: 19069, Training Loss: 0.01076
Epoch: 19069, Training Loss: 0.01000
Epoch: 19069, Training Loss: 0.01228
Epoch: 19069, Training Loss: 0.00948
Epoch: 19070, Training Loss: 0.01076
Epoch: 19070, Training Loss: 0.01000
Epoch: 19070, Training Loss: 0.01228
Epoch: 19070, Training Loss: 0.00947
Epoch: 19071, Training Loss: 0.01076
Epoch: 19071, Training Loss: 0.01000
Epoch: 19071, Training Loss: 0.01228
Epoch: 19071, Training Loss: 0.00947
...
Epoch: 19313, Training Loss: 0.01068
Epoch: 19313, Training Loss: 0.00993
Epoch: 19313, Training Loss: 0.01219
Epoch: 19313, Training Loss: 0.00941

[Repetitive per-pattern loss output trimmed. Four lines are printed per epoch, one per XOR input pattern. Between epochs 19070 and 19313 the four per-pattern losses fall only from about (0.01076, 0.01000, 0.01228, 0.00947) to (0.01068, 0.00993, 0.01219, 0.00941) — roughly 0.0001 each over ~240 epochs — so the basic quadratic-cost network is still converging very slowly toward the 0.008 error target at this point.]
Epoch: 19314, Training Loss: 0.00993
Epoch: 19314, Training Loss: 0.01219
Epoch: 19314, Training Loss: 0.00941
Epoch: 19315, Training Loss: 0.01068
Epoch: 19315, Training Loss: 0.00993
Epoch: 19315, Training Loss: 0.01219
Epoch: 19315, Training Loss: 0.00941
Epoch: 19316, Training Loss: 0.01068
Epoch: 19316, Training Loss: 0.00993
Epoch: 19316, Training Loss: 0.01219
Epoch: 19316, Training Loss: 0.00941
Epoch: 19317, Training Loss: 0.01068
Epoch: 19317, Training Loss: 0.00993
Epoch: 19317, Training Loss: 0.01219
Epoch: 19317, Training Loss: 0.00941
Epoch: 19318, Training Loss: 0.01068
Epoch: 19318, Training Loss: 0.00993
Epoch: 19318, Training Loss: 0.01219
Epoch: 19318, Training Loss: 0.00941
Epoch: 19319, Training Loss: 0.01068
Epoch: 19319, Training Loss: 0.00993
Epoch: 19319, Training Loss: 0.01219
Epoch: 19319, Training Loss: 0.00941
Epoch: 19320, Training Loss: 0.01068
Epoch: 19320, Training Loss: 0.00993
Epoch: 19320, Training Loss: 0.01219
Epoch: 19320, Training Loss: 0.00941
Epoch: 19321, Training Loss: 0.01068
Epoch: 19321, Training Loss: 0.00993
Epoch: 19321, Training Loss: 0.01219
Epoch: 19321, Training Loss: 0.00941
Epoch: 19322, Training Loss: 0.01068
Epoch: 19322, Training Loss: 0.00993
Epoch: 19322, Training Loss: 0.01219
Epoch: 19322, Training Loss: 0.00941
Epoch: 19323, Training Loss: 0.01068
Epoch: 19323, Training Loss: 0.00993
Epoch: 19323, Training Loss: 0.01219
Epoch: 19323, Training Loss: 0.00941
Epoch: 19324, Training Loss: 0.01068
Epoch: 19324, Training Loss: 0.00993
Epoch: 19324, Training Loss: 0.01219
Epoch: 19324, Training Loss: 0.00941
Epoch: 19325, Training Loss: 0.01068
Epoch: 19325, Training Loss: 0.00993
Epoch: 19325, Training Loss: 0.01219
Epoch: 19325, Training Loss: 0.00941
Epoch: 19326, Training Loss: 0.01068
Epoch: 19326, Training Loss: 0.00993
Epoch: 19326, Training Loss: 0.01219
Epoch: 19326, Training Loss: 0.00941
Epoch: 19327, Training Loss: 0.01068
Epoch: 19327, Training Loss: 0.00993
Epoch: 19327, Training Loss: 0.01219
Epoch: 19327, Training Loss: 0.00941
Epoch: 19328, Training Loss: 0.01068
Epoch: 19328, Training Loss: 0.00993
Epoch: 19328, Training Loss: 0.01219
Epoch: 19328, Training Loss: 0.00941
Epoch: 19329, Training Loss: 0.01068
Epoch: 19329, Training Loss: 0.00993
Epoch: 19329, Training Loss: 0.01219
Epoch: 19329, Training Loss: 0.00941
Epoch: 19330, Training Loss: 0.01068
Epoch: 19330, Training Loss: 0.00993
Epoch: 19330, Training Loss: 0.01219
Epoch: 19330, Training Loss: 0.00941
Epoch: 19331, Training Loss: 0.01068
Epoch: 19331, Training Loss: 0.00993
Epoch: 19331, Training Loss: 0.01219
Epoch: 19331, Training Loss: 0.00941
Epoch: 19332, Training Loss: 0.01068
Epoch: 19332, Training Loss: 0.00993
Epoch: 19332, Training Loss: 0.01219
Epoch: 19332, Training Loss: 0.00941
Epoch: 19333, Training Loss: 0.01068
Epoch: 19333, Training Loss: 0.00993
Epoch: 19333, Training Loss: 0.01219
Epoch: 19333, Training Loss: 0.00941
Epoch: 19334, Training Loss: 0.01068
Epoch: 19334, Training Loss: 0.00993
Epoch: 19334, Training Loss: 0.01219
Epoch: 19334, Training Loss: 0.00941
Epoch: 19335, Training Loss: 0.01068
Epoch: 19335, Training Loss: 0.00993
Epoch: 19335, Training Loss: 0.01219
Epoch: 19335, Training Loss: 0.00941
Epoch: 19336, Training Loss: 0.01067
Epoch: 19336, Training Loss: 0.00992
Epoch: 19336, Training Loss: 0.01219
Epoch: 19336, Training Loss: 0.00941
Epoch: 19337, Training Loss: 0.01067
Epoch: 19337, Training Loss: 0.00992
Epoch: 19337, Training Loss: 0.01219
Epoch: 19337, Training Loss: 0.00940
Epoch: 19338, Training Loss: 0.01067
Epoch: 19338, Training Loss: 0.00992
Epoch: 19338, Training Loss: 0.01219
Epoch: 19338, Training Loss: 0.00940
Epoch: 19339, Training Loss: 0.01067
Epoch: 19339, Training Loss: 0.00992
Epoch: 19339, Training Loss: 0.01219
Epoch: 19339, Training Loss: 0.00940
Epoch: 19340, Training Loss: 0.01067
Epoch: 19340, Training Loss: 0.00992
Epoch: 19340, Training Loss: 0.01219
Epoch: 19340, Training Loss: 0.00940
Epoch: 19341, Training Loss: 0.01067
Epoch: 19341, Training Loss: 0.00992
Epoch: 19341, Training Loss: 0.01218
Epoch: 19341, Training Loss: 0.00940
Epoch: 19342, Training Loss: 0.01067
Epoch: 19342, Training Loss: 0.00992
Epoch: 19342, Training Loss: 0.01218
Epoch: 19342, Training Loss: 0.00940
Epoch: 19343, Training Loss: 0.01067
Epoch: 19343, Training Loss: 0.00992
Epoch: 19343, Training Loss: 0.01218
Epoch: 19343, Training Loss: 0.00940
Epoch: 19344, Training Loss: 0.01067
Epoch: 19344, Training Loss: 0.00992
Epoch: 19344, Training Loss: 0.01218
Epoch: 19344, Training Loss: 0.00940
Epoch: 19345, Training Loss: 0.01067
Epoch: 19345, Training Loss: 0.00992
Epoch: 19345, Training Loss: 0.01218
Epoch: 19345, Training Loss: 0.00940
Epoch: 19346, Training Loss: 0.01067
Epoch: 19346, Training Loss: 0.00992
Epoch: 19346, Training Loss: 0.01218
Epoch: 19346, Training Loss: 0.00940
Epoch: 19347, Training Loss: 0.01067
Epoch: 19347, Training Loss: 0.00992
Epoch: 19347, Training Loss: 0.01218
Epoch: 19347, Training Loss: 0.00940
Epoch: 19348, Training Loss: 0.01067
Epoch: 19348, Training Loss: 0.00992
Epoch: 19348, Training Loss: 0.01218
Epoch: 19348, Training Loss: 0.00940
Epoch: 19349, Training Loss: 0.01067
Epoch: 19349, Training Loss: 0.00992
Epoch: 19349, Training Loss: 0.01218
Epoch: 19349, Training Loss: 0.00940
Epoch: 19350, Training Loss: 0.01067
Epoch: 19350, Training Loss: 0.00992
Epoch: 19350, Training Loss: 0.01218
Epoch: 19350, Training Loss: 0.00940
Epoch: 19351, Training Loss: 0.01067
Epoch: 19351, Training Loss: 0.00992
Epoch: 19351, Training Loss: 0.01218
Epoch: 19351, Training Loss: 0.00940
Epoch: 19352, Training Loss: 0.01067
Epoch: 19352, Training Loss: 0.00992
Epoch: 19352, Training Loss: 0.01218
Epoch: 19352, Training Loss: 0.00940
Epoch: 19353, Training Loss: 0.01067
Epoch: 19353, Training Loss: 0.00992
Epoch: 19353, Training Loss: 0.01218
Epoch: 19353, Training Loss: 0.00940
Epoch: 19354, Training Loss: 0.01067
Epoch: 19354, Training Loss: 0.00992
Epoch: 19354, Training Loss: 0.01218
Epoch: 19354, Training Loss: 0.00940
Epoch: 19355, Training Loss: 0.01067
Epoch: 19355, Training Loss: 0.00992
Epoch: 19355, Training Loss: 0.01218
Epoch: 19355, Training Loss: 0.00940
Epoch: 19356, Training Loss: 0.01067
Epoch: 19356, Training Loss: 0.00992
Epoch: 19356, Training Loss: 0.01218
Epoch: 19356, Training Loss: 0.00940
Epoch: 19357, Training Loss: 0.01067
Epoch: 19357, Training Loss: 0.00992
Epoch: 19357, Training Loss: 0.01218
Epoch: 19357, Training Loss: 0.00940
Epoch: 19358, Training Loss: 0.01067
Epoch: 19358, Training Loss: 0.00992
Epoch: 19358, Training Loss: 0.01218
Epoch: 19358, Training Loss: 0.00940
Epoch: 19359, Training Loss: 0.01067
Epoch: 19359, Training Loss: 0.00992
Epoch: 19359, Training Loss: 0.01218
Epoch: 19359, Training Loss: 0.00940
Epoch: 19360, Training Loss: 0.01067
Epoch: 19360, Training Loss: 0.00992
Epoch: 19360, Training Loss: 0.01218
Epoch: 19360, Training Loss: 0.00940
Epoch: 19361, Training Loss: 0.01067
Epoch: 19361, Training Loss: 0.00992
Epoch: 19361, Training Loss: 0.01218
Epoch: 19361, Training Loss: 0.00940
Epoch: 19362, Training Loss: 0.01067
Epoch: 19362, Training Loss: 0.00992
Epoch: 19362, Training Loss: 0.01218
Epoch: 19362, Training Loss: 0.00940
Epoch: 19363, Training Loss: 0.01067
Epoch: 19363, Training Loss: 0.00992
Epoch: 19363, Training Loss: 0.01218
Epoch: 19363, Training Loss: 0.00940
Epoch: 19364, Training Loss: 0.01067
Epoch: 19364, Training Loss: 0.00992
Epoch: 19364, Training Loss: 0.01218
Epoch: 19364, Training Loss: 0.00940
Epoch: 19365, Training Loss: 0.01067
Epoch: 19365, Training Loss: 0.00992
Epoch: 19365, Training Loss: 0.01218
Epoch: 19365, Training Loss: 0.00940
Epoch: 19366, Training Loss: 0.01067
Epoch: 19366, Training Loss: 0.00992
Epoch: 19366, Training Loss: 0.01218
Epoch: 19366, Training Loss: 0.00940
Epoch: 19367, Training Loss: 0.01067
Epoch: 19367, Training Loss: 0.00992
Epoch: 19367, Training Loss: 0.01218
Epoch: 19367, Training Loss: 0.00940
Epoch: 19368, Training Loss: 0.01067
Epoch: 19368, Training Loss: 0.00992
Epoch: 19368, Training Loss: 0.01218
Epoch: 19368, Training Loss: 0.00940
Epoch: 19369, Training Loss: 0.01067
Epoch: 19369, Training Loss: 0.00992
Epoch: 19369, Training Loss: 0.01218
Epoch: 19369, Training Loss: 0.00940
Epoch: 19370, Training Loss: 0.01066
Epoch: 19370, Training Loss: 0.00992
Epoch: 19370, Training Loss: 0.01218
Epoch: 19370, Training Loss: 0.00940
Epoch: 19371, Training Loss: 0.01066
Epoch: 19371, Training Loss: 0.00992
Epoch: 19371, Training Loss: 0.01217
Epoch: 19371, Training Loss: 0.00940
Epoch: 19372, Training Loss: 0.01066
Epoch: 19372, Training Loss: 0.00992
Epoch: 19372, Training Loss: 0.01217
Epoch: 19372, Training Loss: 0.00940
Epoch: 19373, Training Loss: 0.01066
Epoch: 19373, Training Loss: 0.00991
Epoch: 19373, Training Loss: 0.01217
Epoch: 19373, Training Loss: 0.00940
Epoch: 19374, Training Loss: 0.01066
Epoch: 19374, Training Loss: 0.00991
Epoch: 19374, Training Loss: 0.01217
Epoch: 19374, Training Loss: 0.00940
Epoch: 19375, Training Loss: 0.01066
Epoch: 19375, Training Loss: 0.00991
Epoch: 19375, Training Loss: 0.01217
Epoch: 19375, Training Loss: 0.00940
Epoch: 19376, Training Loss: 0.01066
Epoch: 19376, Training Loss: 0.00991
Epoch: 19376, Training Loss: 0.01217
Epoch: 19376, Training Loss: 0.00939
Epoch: 19377, Training Loss: 0.01066
Epoch: 19377, Training Loss: 0.00991
Epoch: 19377, Training Loss: 0.01217
Epoch: 19377, Training Loss: 0.00939
Epoch: 19378, Training Loss: 0.01066
Epoch: 19378, Training Loss: 0.00991
Epoch: 19378, Training Loss: 0.01217
Epoch: 19378, Training Loss: 0.00939
Epoch: 19379, Training Loss: 0.01066
Epoch: 19379, Training Loss: 0.00991
Epoch: 19379, Training Loss: 0.01217
Epoch: 19379, Training Loss: 0.00939
Epoch: 19380, Training Loss: 0.01066
Epoch: 19380, Training Loss: 0.00991
Epoch: 19380, Training Loss: 0.01217
Epoch: 19380, Training Loss: 0.00939
Epoch: 19381, Training Loss: 0.01066
Epoch: 19381, Training Loss: 0.00991
Epoch: 19381, Training Loss: 0.01217
Epoch: 19381, Training Loss: 0.00939
Epoch: 19382, Training Loss: 0.01066
Epoch: 19382, Training Loss: 0.00991
Epoch: 19382, Training Loss: 0.01217
Epoch: 19382, Training Loss: 0.00939
Epoch: 19383, Training Loss: 0.01066
Epoch: 19383, Training Loss: 0.00991
Epoch: 19383, Training Loss: 0.01217
Epoch: 19383, Training Loss: 0.00939
Epoch: 19384, Training Loss: 0.01066
Epoch: 19384, Training Loss: 0.00991
Epoch: 19384, Training Loss: 0.01217
Epoch: 19384, Training Loss: 0.00939
Epoch: 19385, Training Loss: 0.01066
Epoch: 19385, Training Loss: 0.00991
Epoch: 19385, Training Loss: 0.01217
Epoch: 19385, Training Loss: 0.00939
Epoch: 19386, Training Loss: 0.01066
Epoch: 19386, Training Loss: 0.00991
Epoch: 19386, Training Loss: 0.01217
Epoch: 19386, Training Loss: 0.00939
Epoch: 19387, Training Loss: 0.01066
Epoch: 19387, Training Loss: 0.00991
Epoch: 19387, Training Loss: 0.01217
Epoch: 19387, Training Loss: 0.00939
Epoch: 19388, Training Loss: 0.01066
Epoch: 19388, Training Loss: 0.00991
Epoch: 19388, Training Loss: 0.01217
Epoch: 19388, Training Loss: 0.00939
Epoch: 19389, Training Loss: 0.01066
Epoch: 19389, Training Loss: 0.00991
Epoch: 19389, Training Loss: 0.01217
Epoch: 19389, Training Loss: 0.00939
Epoch: 19390, Training Loss: 0.01066
Epoch: 19390, Training Loss: 0.00991
Epoch: 19390, Training Loss: 0.01217
Epoch: 19390, Training Loss: 0.00939
Epoch: 19391, Training Loss: 0.01066
Epoch: 19391, Training Loss: 0.00991
Epoch: 19391, Training Loss: 0.01217
Epoch: 19391, Training Loss: 0.00939
Epoch: 19392, Training Loss: 0.01066
Epoch: 19392, Training Loss: 0.00991
Epoch: 19392, Training Loss: 0.01217
Epoch: 19392, Training Loss: 0.00939
Epoch: 19393, Training Loss: 0.01066
Epoch: 19393, Training Loss: 0.00991
Epoch: 19393, Training Loss: 0.01217
Epoch: 19393, Training Loss: 0.00939
Epoch: 19394, Training Loss: 0.01066
Epoch: 19394, Training Loss: 0.00991
Epoch: 19394, Training Loss: 0.01217
Epoch: 19394, Training Loss: 0.00939
Epoch: 19395, Training Loss: 0.01066
Epoch: 19395, Training Loss: 0.00991
Epoch: 19395, Training Loss: 0.01217
Epoch: 19395, Training Loss: 0.00939
Epoch: 19396, Training Loss: 0.01066
Epoch: 19396, Training Loss: 0.00991
Epoch: 19396, Training Loss: 0.01217
Epoch: 19396, Training Loss: 0.00939
Epoch: 19397, Training Loss: 0.01066
Epoch: 19397, Training Loss: 0.00991
Epoch: 19397, Training Loss: 0.01217
Epoch: 19397, Training Loss: 0.00939
Epoch: 19398, Training Loss: 0.01066
Epoch: 19398, Training Loss: 0.00991
Epoch: 19398, Training Loss: 0.01217
Epoch: 19398, Training Loss: 0.00939
Epoch: 19399, Training Loss: 0.01066
Epoch: 19399, Training Loss: 0.00991
Epoch: 19399, Training Loss: 0.01217
Epoch: 19399, Training Loss: 0.00939
Epoch: 19400, Training Loss: 0.01066
Epoch: 19400, Training Loss: 0.00991
Epoch: 19400, Training Loss: 0.01216
Epoch: 19400, Training Loss: 0.00939
Epoch: 19401, Training Loss: 0.01066
Epoch: 19401, Training Loss: 0.00991
Epoch: 19401, Training Loss: 0.01216
Epoch: 19401, Training Loss: 0.00939
Epoch: 19402, Training Loss: 0.01066
Epoch: 19402, Training Loss: 0.00991
Epoch: 19402, Training Loss: 0.01216
Epoch: 19402, Training Loss: 0.00939
Epoch: 19403, Training Loss: 0.01065
Epoch: 19403, Training Loss: 0.00991
Epoch: 19403, Training Loss: 0.01216
Epoch: 19403, Training Loss: 0.00939
Epoch: 19404, Training Loss: 0.01065
Epoch: 19404, Training Loss: 0.00991
Epoch: 19404, Training Loss: 0.01216
Epoch: 19404, Training Loss: 0.00939
Epoch: 19405, Training Loss: 0.01065
Epoch: 19405, Training Loss: 0.00991
Epoch: 19405, Training Loss: 0.01216
Epoch: 19405, Training Loss: 0.00939
Epoch: 19406, Training Loss: 0.01065
Epoch: 19406, Training Loss: 0.00991
Epoch: 19406, Training Loss: 0.01216
Epoch: 19406, Training Loss: 0.00939
Epoch: 19407, Training Loss: 0.01065
Epoch: 19407, Training Loss: 0.00991
Epoch: 19407, Training Loss: 0.01216
Epoch: 19407, Training Loss: 0.00939
Epoch: 19408, Training Loss: 0.01065
Epoch: 19408, Training Loss: 0.00991
Epoch: 19408, Training Loss: 0.01216
Epoch: 19408, Training Loss: 0.00939
Epoch: 19409, Training Loss: 0.01065
Epoch: 19409, Training Loss: 0.00991
Epoch: 19409, Training Loss: 0.01216
Epoch: 19409, Training Loss: 0.00939
Epoch: 19410, Training Loss: 0.01065
Epoch: 19410, Training Loss: 0.00990
Epoch: 19410, Training Loss: 0.01216
Epoch: 19410, Training Loss: 0.00939
Epoch: 19411, Training Loss: 0.01065
Epoch: 19411, Training Loss: 0.00990
Epoch: 19411, Training Loss: 0.01216
Epoch: 19411, Training Loss: 0.00939
Epoch: 19412, Training Loss: 0.01065
Epoch: 19412, Training Loss: 0.00990
Epoch: 19412, Training Loss: 0.01216
Epoch: 19412, Training Loss: 0.00939
Epoch: 19413, Training Loss: 0.01065
Epoch: 19413, Training Loss: 0.00990
Epoch: 19413, Training Loss: 0.01216
Epoch: 19413, Training Loss: 0.00939
Epoch: 19414, Training Loss: 0.01065
Epoch: 19414, Training Loss: 0.00990
Epoch: 19414, Training Loss: 0.01216
Epoch: 19414, Training Loss: 0.00939
Epoch: 19415, Training Loss: 0.01065
Epoch: 19415, Training Loss: 0.00990
Epoch: 19415, Training Loss: 0.01216
Epoch: 19415, Training Loss: 0.00938
Epoch: 19416, Training Loss: 0.01065
Epoch: 19416, Training Loss: 0.00990
Epoch: 19416, Training Loss: 0.01216
Epoch: 19416, Training Loss: 0.00938
Epoch: 19417, Training Loss: 0.01065
Epoch: 19417, Training Loss: 0.00990
Epoch: 19417, Training Loss: 0.01216
Epoch: 19417, Training Loss: 0.00938
Epoch: 19418, Training Loss: 0.01065
Epoch: 19418, Training Loss: 0.00990
Epoch: 19418, Training Loss: 0.01216
Epoch: 19418, Training Loss: 0.00938
Epoch: 19419, Training Loss: 0.01065
Epoch: 19419, Training Loss: 0.00990
Epoch: 19419, Training Loss: 0.01216
Epoch: 19419, Training Loss: 0.00938
Epoch: 19420, Training Loss: 0.01065
Epoch: 19420, Training Loss: 0.00990
Epoch: 19420, Training Loss: 0.01216
Epoch: 19420, Training Loss: 0.00938
Epoch: 19421, Training Loss: 0.01065
Epoch: 19421, Training Loss: 0.00990
Epoch: 19421, Training Loss: 0.01216
Epoch: 19421, Training Loss: 0.00938
Epoch: 19422, Training Loss: 0.01065
Epoch: 19422, Training Loss: 0.00990
Epoch: 19422, Training Loss: 0.01216
Epoch: 19422, Training Loss: 0.00938
Epoch: 19423, Training Loss: 0.01065
Epoch: 19423, Training Loss: 0.00990
Epoch: 19423, Training Loss: 0.01216
Epoch: 19423, Training Loss: 0.00938
Epoch: 19424, Training Loss: 0.01065
Epoch: 19424, Training Loss: 0.00990
Epoch: 19424, Training Loss: 0.01216
Epoch: 19424, Training Loss: 0.00938
Epoch: 19425, Training Loss: 0.01065
Epoch: 19425, Training Loss: 0.00990
Epoch: 19425, Training Loss: 0.01216
Epoch: 19425, Training Loss: 0.00938
Epoch: 19426, Training Loss: 0.01065
Epoch: 19426, Training Loss: 0.00990
Epoch: 19426, Training Loss: 0.01216
Epoch: 19426, Training Loss: 0.00938
Epoch: 19427, Training Loss: 0.01065
Epoch: 19427, Training Loss: 0.00990
Epoch: 19427, Training Loss: 0.01216
Epoch: 19427, Training Loss: 0.00938
Epoch: 19428, Training Loss: 0.01065
Epoch: 19428, Training Loss: 0.00990
Epoch: 19428, Training Loss: 0.01216
Epoch: 19428, Training Loss: 0.00938
Epoch: 19429, Training Loss: 0.01065
Epoch: 19429, Training Loss: 0.00990
Epoch: 19429, Training Loss: 0.01216
Epoch: 19429, Training Loss: 0.00938
Epoch: 19430, Training Loss: 0.01065
Epoch: 19430, Training Loss: 0.00990
Epoch: 19430, Training Loss: 0.01215
Epoch: 19430, Training Loss: 0.00938
Epoch: 19431, Training Loss: 0.01065
Epoch: 19431, Training Loss: 0.00990
Epoch: 19431, Training Loss: 0.01215
Epoch: 19431, Training Loss: 0.00938
Epoch: 19432, Training Loss: 0.01065
Epoch: 19432, Training Loss: 0.00990
Epoch: 19432, Training Loss: 0.01215
Epoch: 19432, Training Loss: 0.00938
Epoch: 19433, Training Loss: 0.01065
Epoch: 19433, Training Loss: 0.00990
Epoch: 19433, Training Loss: 0.01215
Epoch: 19433, Training Loss: 0.00938
Epoch: 19434, Training Loss: 0.01065
Epoch: 19434, Training Loss: 0.00990
Epoch: 19434, Training Loss: 0.01215
Epoch: 19434, Training Loss: 0.00938
Epoch: 19435, Training Loss: 0.01065
Epoch: 19435, Training Loss: 0.00990
Epoch: 19435, Training Loss: 0.01215
Epoch: 19435, Training Loss: 0.00938
Epoch: 19436, Training Loss: 0.01065
Epoch: 19436, Training Loss: 0.00990
Epoch: 19436, Training Loss: 0.01215
Epoch: 19436, Training Loss: 0.00938
Epoch: 19437, Training Loss: 0.01064
Epoch: 19437, Training Loss: 0.00990
Epoch: 19437, Training Loss: 0.01215
Epoch: 19437, Training Loss: 0.00938
Epoch: 19438, Training Loss: 0.01064
Epoch: 19438, Training Loss: 0.00990
Epoch: 19438, Training Loss: 0.01215
Epoch: 19438, Training Loss: 0.00938
Epoch: 19439, Training Loss: 0.01064
Epoch: 19439, Training Loss: 0.00990
Epoch: 19439, Training Loss: 0.01215
Epoch: 19439, Training Loss: 0.00938
Epoch: 19440, Training Loss: 0.01064
Epoch: 19440, Training Loss: 0.00990
Epoch: 19440, Training Loss: 0.01215
Epoch: 19440, Training Loss: 0.00938
Epoch: 19441, Training Loss: 0.01064
Epoch: 19441, Training Loss: 0.00990
Epoch: 19441, Training Loss: 0.01215
Epoch: 19441, Training Loss: 0.00938
Epoch: 19442, Training Loss: 0.01064
Epoch: 19442, Training Loss: 0.00990
Epoch: 19442, Training Loss: 0.01215
Epoch: 19442, Training Loss: 0.00938
Epoch: 19443, Training Loss: 0.01064
Epoch: 19443, Training Loss: 0.00990
Epoch: 19443, Training Loss: 0.01215
Epoch: 19443, Training Loss: 0.00938
Epoch: 19444, Training Loss: 0.01064
Epoch: 19444, Training Loss: 0.00990
Epoch: 19444, Training Loss: 0.01215
Epoch: 19444, Training Loss: 0.00938
Epoch: 19445, Training Loss: 0.01064
Epoch: 19445, Training Loss: 0.00990
Epoch: 19445, Training Loss: 0.01215
Epoch: 19445, Training Loss: 0.00938
Epoch: 19446, Training Loss: 0.01064
Epoch: 19446, Training Loss: 0.00990
Epoch: 19446, Training Loss: 0.01215
Epoch: 19446, Training Loss: 0.00938
Epoch: 19447, Training Loss: 0.01064
Epoch: 19447, Training Loss: 0.00989
Epoch: 19447, Training Loss: 0.01215
Epoch: 19447, Training Loss: 0.00938
Epoch: 19448, Training Loss: 0.01064
Epoch: 19448, Training Loss: 0.00989
Epoch: 19448, Training Loss: 0.01215
Epoch: 19448, Training Loss: 0.00938
Epoch: 19449, Training Loss: 0.01064
Epoch: 19449, Training Loss: 0.00989
Epoch: 19449, Training Loss: 0.01215
Epoch: 19449, Training Loss: 0.00938
Epoch: 19450, Training Loss: 0.01064
Epoch: 19450, Training Loss: 0.00989
Epoch: 19450, Training Loss: 0.01215
Epoch: 19450, Training Loss: 0.00938
Epoch: 19451, Training Loss: 0.01064
Epoch: 19451, Training Loss: 0.00989
Epoch: 19451, Training Loss: 0.01215
Epoch: 19451, Training Loss: 0.00938
Epoch: 19452, Training Loss: 0.01064
Epoch: 19452, Training Loss: 0.00989
Epoch: 19452, Training Loss: 0.01215
Epoch: 19452, Training Loss: 0.00938
Epoch: 19453, Training Loss: 0.01064
Epoch: 19453, Training Loss: 0.00989
Epoch: 19453, Training Loss: 0.01215
Epoch: 19453, Training Loss: 0.00938
Epoch: 19454, Training Loss: 0.01064
Epoch: 19454, Training Loss: 0.00989
Epoch: 19454, Training Loss: 0.01215
Epoch: 19454, Training Loss: 0.00937
Epoch: 19455, Training Loss: 0.01064
Epoch: 19455, Training Loss: 0.00989
Epoch: 19455, Training Loss: 0.01215
Epoch: 19455, Training Loss: 0.00937
Epoch: 19456, Training Loss: 0.01064
Epoch: 19456, Training Loss: 0.00989
Epoch: 19456, Training Loss: 0.01215
Epoch: 19456, Training Loss: 0.00937
Epoch: 19457, Training Loss: 0.01064
Epoch: 19457, Training Loss: 0.00989
Epoch: 19457, Training Loss: 0.01215
Epoch: 19457, Training Loss: 0.00937
Epoch: 19458, Training Loss: 0.01064
Epoch: 19458, Training Loss: 0.00989
Epoch: 19458, Training Loss: 0.01215
Epoch: 19458, Training Loss: 0.00937
Epoch: 19459, Training Loss: 0.01064
Epoch: 19459, Training Loss: 0.00989
Epoch: 19459, Training Loss: 0.01214
Epoch: 19459, Training Loss: 0.00937
Epoch: 19460, Training Loss: 0.01064
Epoch: 19460, Training Loss: 0.00989
Epoch: 19460, Training Loss: 0.01214
Epoch: 19460, Training Loss: 0.00937
Epoch: 19461, Training Loss: 0.01064
Epoch: 19461, Training Loss: 0.00989
Epoch: 19461, Training Loss: 0.01214
Epoch: 19461, Training Loss: 0.00937
Epoch: 19462, Training Loss: 0.01064
Epoch: 19462, Training Loss: 0.00989
Epoch: 19462, Training Loss: 0.01214
Epoch: 19462, Training Loss: 0.00937
Epoch: 19463, Training Loss: 0.01064
Epoch: 19463, Training Loss: 0.00989
Epoch: 19463, Training Loss: 0.01214
Epoch: 19463, Training Loss: 0.00937
Epoch: 19464, Training Loss: 0.01064
Epoch: 19464, Training Loss: 0.00989
Epoch: 19464, Training Loss: 0.01214
Epoch: 19464, Training Loss: 0.00937
Epoch: 19465, Training Loss: 0.01064
Epoch: 19465, Training Loss: 0.00989
Epoch: 19465, Training Loss: 0.01214
Epoch: 19465, Training Loss: 0.00937
Epoch: 19466, Training Loss: 0.01064
Epoch: 19466, Training Loss: 0.00989
Epoch: 19466, Training Loss: 0.01214
Epoch: 19466, Training Loss: 0.00937
Epoch: 19467, Training Loss: 0.01064
Epoch: 19467, Training Loss: 0.00989
Epoch: 19467, Training Loss: 0.01214
Epoch: 19467, Training Loss: 0.00937
Epoch: 19468, Training Loss: 0.01064
Epoch: 19468, Training Loss: 0.00989
Epoch: 19468, Training Loss: 0.01214
Epoch: 19468, Training Loss: 0.00937
Epoch: 19469, Training Loss: 0.01064
Epoch: 19469, Training Loss: 0.00989
Epoch: 19469, Training Loss: 0.01214
Epoch: 19469, Training Loss: 0.00937
Epoch: 19470, Training Loss: 0.01064
Epoch: 19470, Training Loss: 0.00989
Epoch: 19470, Training Loss: 0.01214
Epoch: 19470, Training Loss: 0.00937
Epoch: 19471, Training Loss: 0.01063
Epoch: 19471, Training Loss: 0.00989
Epoch: 19471, Training Loss: 0.01214
Epoch: 19471, Training Loss: 0.00937
Epoch: 19472, Training Loss: 0.01063
Epoch: 19472, Training Loss: 0.00989
Epoch: 19472, Training Loss: 0.01214
Epoch: 19472, Training Loss: 0.00937
Epoch: 19473, Training Loss: 0.01063
Epoch: 19473, Training Loss: 0.00989
Epoch: 19473, Training Loss: 0.01214
Epoch: 19473, Training Loss: 0.00937
Epoch: 19474, Training Loss: 0.01063
Epoch: 19474, Training Loss: 0.00989
Epoch: 19474, Training Loss: 0.01214
Epoch: 19474, Training Loss: 0.00937
Epoch: 19475, Training Loss: 0.01063
Epoch: 19475, Training Loss: 0.00989
Epoch: 19475, Training Loss: 0.01214
Epoch: 19475, Training Loss: 0.00937
Epoch: 19476, Training Loss: 0.01063
Epoch: 19476, Training Loss: 0.00989
Epoch: 19476, Training Loss: 0.01214
Epoch: 19476, Training Loss: 0.00937
Epoch: 19477, Training Loss: 0.01063
Epoch: 19477, Training Loss: 0.00989
Epoch: 19477, Training Loss: 0.01214
Epoch: 19477, Training Loss: 0.00937
Epoch: 19478, Training Loss: 0.01063
Epoch: 19478, Training Loss: 0.00989
Epoch: 19478, Training Loss: 0.01214
Epoch: 19478, Training Loss: 0.00937
Epoch: 19479, Training Loss: 0.01063
Epoch: 19479, Training Loss: 0.00989
Epoch: 19479, Training Loss: 0.01214
Epoch: 19479, Training Loss: 0.00937
Epoch: 19480, Training Loss: 0.01063
Epoch: 19480, Training Loss: 0.00989
Epoch: 19480, Training Loss: 0.01214
Epoch: 19480, Training Loss: 0.00937
Epoch: 19481, Training Loss: 0.01063
Epoch: 19481, Training Loss: 0.00989
Epoch: 19481, Training Loss: 0.01214
Epoch: 19481, Training Loss: 0.00937
Epoch: 19482, Training Loss: 0.01063
Epoch: 19482, Training Loss: 0.00989
Epoch: 19482, Training Loss: 0.01214
Epoch: 19482, Training Loss: 0.00937
Epoch: 19483, Training Loss: 0.01063
Epoch: 19483, Training Loss: 0.00989
Epoch: 19483, Training Loss: 0.01214
Epoch: 19483, Training Loss: 0.00937
Epoch: 19484, Training Loss: 0.01063
Epoch: 19484, Training Loss: 0.00988
Epoch: 19484, Training Loss: 0.01214
Epoch: 19484, Training Loss: 0.00937
Epoch: 19485, Training Loss: 0.01063
Epoch: 19485, Training Loss: 0.00988
Epoch: 19485, Training Loss: 0.01214
Epoch: 19485, Training Loss: 0.00937
Epoch: 19486, Training Loss: 0.01063
Epoch: 19486, Training Loss: 0.00988
Epoch: 19486, Training Loss: 0.01214
Epoch: 19486, Training Loss: 0.00937
Epoch: 19487, Training Loss: 0.01063
Epoch: 19487, Training Loss: 0.00988
Epoch: 19487, Training Loss: 0.01214
Epoch: 19487, Training Loss: 0.00937
Epoch: 19488, Training Loss: 0.01063
Epoch: 19488, Training Loss: 0.00988
Epoch: 19488, Training Loss: 0.01214
Epoch: 19488, Training Loss: 0.00937
Epoch: 19489, Training Loss: 0.01063
Epoch: 19489, Training Loss: 0.00988
Epoch: 19489, Training Loss: 0.01213
Epoch: 19489, Training Loss: 0.00937
Epoch: 19490, Training Loss: 0.01063
Epoch: 19490, Training Loss: 0.00988
Epoch: 19490, Training Loss: 0.01213
Epoch: 19490, Training Loss: 0.00937
Epoch: 19491, Training Loss: 0.01063
Epoch: 19491, Training Loss: 0.00988
Epoch: 19491, Training Loss: 0.01213
Epoch: 19491, Training Loss: 0.00937
Epoch: 19492, Training Loss: 0.01063
Epoch: 19492, Training Loss: 0.00988
Epoch: 19492, Training Loss: 0.01213
Epoch: 19492, Training Loss: 0.00937
Epoch: 19493, Training Loss: 0.01063
Epoch: 19493, Training Loss: 0.00988
Epoch: 19493, Training Loss: 0.01213
Epoch: 19493, Training Loss: 0.00936
Epoch: 19494, Training Loss: 0.01063
Epoch: 19494, Training Loss: 0.00988
Epoch: 19494, Training Loss: 0.01213
Epoch: 19494, Training Loss: 0.00936
Epoch: 19495, Training Loss: 0.01063
Epoch: 19495, Training Loss: 0.00988
Epoch: 19495, Training Loss: 0.01213
Epoch: 19495, Training Loss: 0.00936
Epoch: 19496, Training Loss: 0.01063
Epoch: 19496, Training Loss: 0.00988
Epoch: 19496, Training Loss: 0.01213
Epoch: 19496, Training Loss: 0.00936
Epoch: 19497, Training Loss: 0.01063
Epoch: 19497, Training Loss: 0.00988
Epoch: 19497, Training Loss: 0.01213
Epoch: 19497, Training Loss: 0.00936
Epoch: 19498, Training Loss: 0.01063
Epoch: 19498, Training Loss: 0.00988
Epoch: 19498, Training Loss: 0.01213
Epoch: 19498, Training Loss: 0.00936
Epoch: 19499, Training Loss: 0.01063
Epoch: 19499, Training Loss: 0.00988
Epoch: 19499, Training Loss: 0.01213
Epoch: 19499, Training Loss: 0.00936
Epoch: 19500, Training Loss: 0.01063
Epoch: 19500, Training Loss: 0.00988
Epoch: 19500, Training Loss: 0.01213
Epoch: 19500, Training Loss: 0.00936
Epoch: 19501, Training Loss: 0.01063
Epoch: 19501, Training Loss: 0.00988
Epoch: 19501, Training Loss: 0.01213
Epoch: 19501, Training Loss: 0.00936
Epoch: 19502, Training Loss: 0.01063
Epoch: 19502, Training Loss: 0.00988
Epoch: 19502, Training Loss: 0.01213
Epoch: 19502, Training Loss: 0.00936
[Output truncated: epochs 19503–19746, four per-pattern losses printed per epoch, decreasing only slowly from ≈(0.01063, 0.00988, 0.01213, 0.00936) to ≈(0.01055, 0.00982, 0.01205, 0.00930) — the network has not yet reached the target error of 0.008 at this point.]
Epoch: 19746, Training Loss: 0.00930
Epoch: 19747, Training Loss: 0.01055
Epoch: 19747, Training Loss: 0.00981
Epoch: 19747, Training Loss: 0.01205
Epoch: 19747, Training Loss: 0.00930
Epoch: 19748, Training Loss: 0.01055
Epoch: 19748, Training Loss: 0.00981
Epoch: 19748, Training Loss: 0.01205
Epoch: 19748, Training Loss: 0.00930
Epoch: 19749, Training Loss: 0.01055
Epoch: 19749, Training Loss: 0.00981
Epoch: 19749, Training Loss: 0.01205
Epoch: 19749, Training Loss: 0.00930
Epoch: 19750, Training Loss: 0.01055
Epoch: 19750, Training Loss: 0.00981
Epoch: 19750, Training Loss: 0.01205
Epoch: 19750, Training Loss: 0.00930
Epoch: 19751, Training Loss: 0.01055
Epoch: 19751, Training Loss: 0.00981
Epoch: 19751, Training Loss: 0.01205
Epoch: 19751, Training Loss: 0.00930
Epoch: 19752, Training Loss: 0.01055
Epoch: 19752, Training Loss: 0.00981
Epoch: 19752, Training Loss: 0.01205
Epoch: 19752, Training Loss: 0.00930
Epoch: 19753, Training Loss: 0.01055
Epoch: 19753, Training Loss: 0.00981
Epoch: 19753, Training Loss: 0.01205
Epoch: 19753, Training Loss: 0.00930
Epoch: 19754, Training Loss: 0.01055
Epoch: 19754, Training Loss: 0.00981
Epoch: 19754, Training Loss: 0.01205
Epoch: 19754, Training Loss: 0.00930
Epoch: 19755, Training Loss: 0.01055
Epoch: 19755, Training Loss: 0.00981
Epoch: 19755, Training Loss: 0.01205
Epoch: 19755, Training Loss: 0.00930
Epoch: 19756, Training Loss: 0.01055
Epoch: 19756, Training Loss: 0.00981
Epoch: 19756, Training Loss: 0.01205
Epoch: 19756, Training Loss: 0.00930
Epoch: 19757, Training Loss: 0.01055
Epoch: 19757, Training Loss: 0.00981
Epoch: 19757, Training Loss: 0.01205
Epoch: 19757, Training Loss: 0.00930
Epoch: 19758, Training Loss: 0.01055
Epoch: 19758, Training Loss: 0.00981
Epoch: 19758, Training Loss: 0.01205
Epoch: 19758, Training Loss: 0.00930
Epoch: 19759, Training Loss: 0.01055
Epoch: 19759, Training Loss: 0.00981
Epoch: 19759, Training Loss: 0.01204
Epoch: 19759, Training Loss: 0.00930
Epoch: 19760, Training Loss: 0.01055
Epoch: 19760, Training Loss: 0.00981
Epoch: 19760, Training Loss: 0.01204
Epoch: 19760, Training Loss: 0.00930
Epoch: 19761, Training Loss: 0.01055
Epoch: 19761, Training Loss: 0.00981
Epoch: 19761, Training Loss: 0.01204
Epoch: 19761, Training Loss: 0.00930
Epoch: 19762, Training Loss: 0.01055
Epoch: 19762, Training Loss: 0.00981
Epoch: 19762, Training Loss: 0.01204
Epoch: 19762, Training Loss: 0.00930
Epoch: 19763, Training Loss: 0.01055
Epoch: 19763, Training Loss: 0.00981
Epoch: 19763, Training Loss: 0.01204
Epoch: 19763, Training Loss: 0.00930
Epoch: 19764, Training Loss: 0.01055
Epoch: 19764, Training Loss: 0.00981
Epoch: 19764, Training Loss: 0.01204
Epoch: 19764, Training Loss: 0.00930
Epoch: 19765, Training Loss: 0.01055
Epoch: 19765, Training Loss: 0.00981
Epoch: 19765, Training Loss: 0.01204
Epoch: 19765, Training Loss: 0.00930
Epoch: 19766, Training Loss: 0.01055
Epoch: 19766, Training Loss: 0.00981
Epoch: 19766, Training Loss: 0.01204
Epoch: 19766, Training Loss: 0.00930
Epoch: 19767, Training Loss: 0.01055
Epoch: 19767, Training Loss: 0.00981
Epoch: 19767, Training Loss: 0.01204
Epoch: 19767, Training Loss: 0.00930
Epoch: 19768, Training Loss: 0.01055
Epoch: 19768, Training Loss: 0.00981
Epoch: 19768, Training Loss: 0.01204
Epoch: 19768, Training Loss: 0.00930
Epoch: 19769, Training Loss: 0.01055
Epoch: 19769, Training Loss: 0.00981
Epoch: 19769, Training Loss: 0.01204
Epoch: 19769, Training Loss: 0.00929
Epoch: 19770, Training Loss: 0.01055
Epoch: 19770, Training Loss: 0.00981
Epoch: 19770, Training Loss: 0.01204
Epoch: 19770, Training Loss: 0.00929
Epoch: 19771, Training Loss: 0.01055
Epoch: 19771, Training Loss: 0.00981
Epoch: 19771, Training Loss: 0.01204
Epoch: 19771, Training Loss: 0.00929
Epoch: 19772, Training Loss: 0.01055
Epoch: 19772, Training Loss: 0.00981
Epoch: 19772, Training Loss: 0.01204
Epoch: 19772, Training Loss: 0.00929
Epoch: 19773, Training Loss: 0.01055
Epoch: 19773, Training Loss: 0.00981
Epoch: 19773, Training Loss: 0.01204
Epoch: 19773, Training Loss: 0.00929
Epoch: 19774, Training Loss: 0.01055
Epoch: 19774, Training Loss: 0.00981
Epoch: 19774, Training Loss: 0.01204
Epoch: 19774, Training Loss: 0.00929
Epoch: 19775, Training Loss: 0.01055
Epoch: 19775, Training Loss: 0.00981
Epoch: 19775, Training Loss: 0.01204
Epoch: 19775, Training Loss: 0.00929
Epoch: 19776, Training Loss: 0.01055
Epoch: 19776, Training Loss: 0.00981
Epoch: 19776, Training Loss: 0.01204
Epoch: 19776, Training Loss: 0.00929
Epoch: 19777, Training Loss: 0.01055
Epoch: 19777, Training Loss: 0.00981
Epoch: 19777, Training Loss: 0.01204
Epoch: 19777, Training Loss: 0.00929
Epoch: 19778, Training Loss: 0.01055
Epoch: 19778, Training Loss: 0.00981
Epoch: 19778, Training Loss: 0.01204
Epoch: 19778, Training Loss: 0.00929
Epoch: 19779, Training Loss: 0.01054
Epoch: 19779, Training Loss: 0.00981
Epoch: 19779, Training Loss: 0.01204
Epoch: 19779, Training Loss: 0.00929
Epoch: 19780, Training Loss: 0.01054
Epoch: 19780, Training Loss: 0.00981
Epoch: 19780, Training Loss: 0.01204
Epoch: 19780, Training Loss: 0.00929
Epoch: 19781, Training Loss: 0.01054
Epoch: 19781, Training Loss: 0.00981
Epoch: 19781, Training Loss: 0.01204
Epoch: 19781, Training Loss: 0.00929
Epoch: 19782, Training Loss: 0.01054
Epoch: 19782, Training Loss: 0.00981
Epoch: 19782, Training Loss: 0.01204
Epoch: 19782, Training Loss: 0.00929
Epoch: 19783, Training Loss: 0.01054
Epoch: 19783, Training Loss: 0.00981
Epoch: 19783, Training Loss: 0.01204
Epoch: 19783, Training Loss: 0.00929
Epoch: 19784, Training Loss: 0.01054
Epoch: 19784, Training Loss: 0.00981
Epoch: 19784, Training Loss: 0.01204
Epoch: 19784, Training Loss: 0.00929
Epoch: 19785, Training Loss: 0.01054
Epoch: 19785, Training Loss: 0.00980
Epoch: 19785, Training Loss: 0.01204
Epoch: 19785, Training Loss: 0.00929
Epoch: 19786, Training Loss: 0.01054
Epoch: 19786, Training Loss: 0.00980
Epoch: 19786, Training Loss: 0.01204
Epoch: 19786, Training Loss: 0.00929
Epoch: 19787, Training Loss: 0.01054
Epoch: 19787, Training Loss: 0.00980
Epoch: 19787, Training Loss: 0.01204
Epoch: 19787, Training Loss: 0.00929
Epoch: 19788, Training Loss: 0.01054
Epoch: 19788, Training Loss: 0.00980
Epoch: 19788, Training Loss: 0.01204
Epoch: 19788, Training Loss: 0.00929
Epoch: 19789, Training Loss: 0.01054
Epoch: 19789, Training Loss: 0.00980
Epoch: 19789, Training Loss: 0.01203
Epoch: 19789, Training Loss: 0.00929
Epoch: 19790, Training Loss: 0.01054
Epoch: 19790, Training Loss: 0.00980
Epoch: 19790, Training Loss: 0.01203
Epoch: 19790, Training Loss: 0.00929
Epoch: 19791, Training Loss: 0.01054
Epoch: 19791, Training Loss: 0.00980
Epoch: 19791, Training Loss: 0.01203
Epoch: 19791, Training Loss: 0.00929
Epoch: 19792, Training Loss: 0.01054
Epoch: 19792, Training Loss: 0.00980
Epoch: 19792, Training Loss: 0.01203
Epoch: 19792, Training Loss: 0.00929
Epoch: 19793, Training Loss: 0.01054
Epoch: 19793, Training Loss: 0.00980
Epoch: 19793, Training Loss: 0.01203
Epoch: 19793, Training Loss: 0.00929
Epoch: 19794, Training Loss: 0.01054
Epoch: 19794, Training Loss: 0.00980
Epoch: 19794, Training Loss: 0.01203
Epoch: 19794, Training Loss: 0.00929
Epoch: 19795, Training Loss: 0.01054
Epoch: 19795, Training Loss: 0.00980
Epoch: 19795, Training Loss: 0.01203
Epoch: 19795, Training Loss: 0.00929
Epoch: 19796, Training Loss: 0.01054
Epoch: 19796, Training Loss: 0.00980
Epoch: 19796, Training Loss: 0.01203
Epoch: 19796, Training Loss: 0.00929
Epoch: 19797, Training Loss: 0.01054
Epoch: 19797, Training Loss: 0.00980
Epoch: 19797, Training Loss: 0.01203
Epoch: 19797, Training Loss: 0.00929
Epoch: 19798, Training Loss: 0.01054
Epoch: 19798, Training Loss: 0.00980
Epoch: 19798, Training Loss: 0.01203
Epoch: 19798, Training Loss: 0.00929
Epoch: 19799, Training Loss: 0.01054
Epoch: 19799, Training Loss: 0.00980
Epoch: 19799, Training Loss: 0.01203
Epoch: 19799, Training Loss: 0.00929
Epoch: 19800, Training Loss: 0.01054
Epoch: 19800, Training Loss: 0.00980
Epoch: 19800, Training Loss: 0.01203
Epoch: 19800, Training Loss: 0.00929
Epoch: 19801, Training Loss: 0.01054
Epoch: 19801, Training Loss: 0.00980
Epoch: 19801, Training Loss: 0.01203
Epoch: 19801, Training Loss: 0.00929
Epoch: 19802, Training Loss: 0.01054
Epoch: 19802, Training Loss: 0.00980
Epoch: 19802, Training Loss: 0.01203
Epoch: 19802, Training Loss: 0.00929
Epoch: 19803, Training Loss: 0.01054
Epoch: 19803, Training Loss: 0.00980
Epoch: 19803, Training Loss: 0.01203
Epoch: 19803, Training Loss: 0.00929
Epoch: 19804, Training Loss: 0.01054
Epoch: 19804, Training Loss: 0.00980
Epoch: 19804, Training Loss: 0.01203
Epoch: 19804, Training Loss: 0.00929
Epoch: 19805, Training Loss: 0.01054
Epoch: 19805, Training Loss: 0.00980
Epoch: 19805, Training Loss: 0.01203
Epoch: 19805, Training Loss: 0.00929
Epoch: 19806, Training Loss: 0.01054
Epoch: 19806, Training Loss: 0.00980
Epoch: 19806, Training Loss: 0.01203
Epoch: 19806, Training Loss: 0.00929
Epoch: 19807, Training Loss: 0.01054
Epoch: 19807, Training Loss: 0.00980
Epoch: 19807, Training Loss: 0.01203
Epoch: 19807, Training Loss: 0.00929
Epoch: 19808, Training Loss: 0.01054
Epoch: 19808, Training Loss: 0.00980
Epoch: 19808, Training Loss: 0.01203
Epoch: 19808, Training Loss: 0.00929
Epoch: 19809, Training Loss: 0.01054
Epoch: 19809, Training Loss: 0.00980
Epoch: 19809, Training Loss: 0.01203
Epoch: 19809, Training Loss: 0.00928
Epoch: 19810, Training Loss: 0.01054
Epoch: 19810, Training Loss: 0.00980
Epoch: 19810, Training Loss: 0.01203
Epoch: 19810, Training Loss: 0.00928
Epoch: 19811, Training Loss: 0.01054
Epoch: 19811, Training Loss: 0.00980
Epoch: 19811, Training Loss: 0.01203
Epoch: 19811, Training Loss: 0.00928
Epoch: 19812, Training Loss: 0.01054
Epoch: 19812, Training Loss: 0.00980
Epoch: 19812, Training Loss: 0.01203
Epoch: 19812, Training Loss: 0.00928
Epoch: 19813, Training Loss: 0.01054
Epoch: 19813, Training Loss: 0.00980
Epoch: 19813, Training Loss: 0.01203
Epoch: 19813, Training Loss: 0.00928
Epoch: 19814, Training Loss: 0.01053
Epoch: 19814, Training Loss: 0.00980
Epoch: 19814, Training Loss: 0.01203
Epoch: 19814, Training Loss: 0.00928
Epoch: 19815, Training Loss: 0.01053
Epoch: 19815, Training Loss: 0.00980
Epoch: 19815, Training Loss: 0.01203
Epoch: 19815, Training Loss: 0.00928
Epoch: 19816, Training Loss: 0.01053
Epoch: 19816, Training Loss: 0.00980
Epoch: 19816, Training Loss: 0.01203
Epoch: 19816, Training Loss: 0.00928
Epoch: 19817, Training Loss: 0.01053
Epoch: 19817, Training Loss: 0.00980
Epoch: 19817, Training Loss: 0.01203
Epoch: 19817, Training Loss: 0.00928
Epoch: 19818, Training Loss: 0.01053
Epoch: 19818, Training Loss: 0.00980
Epoch: 19818, Training Loss: 0.01203
Epoch: 19818, Training Loss: 0.00928
Epoch: 19819, Training Loss: 0.01053
Epoch: 19819, Training Loss: 0.00980
Epoch: 19819, Training Loss: 0.01203
Epoch: 19819, Training Loss: 0.00928
Epoch: 19820, Training Loss: 0.01053
Epoch: 19820, Training Loss: 0.00980
Epoch: 19820, Training Loss: 0.01202
Epoch: 19820, Training Loss: 0.00928
Epoch: 19821, Training Loss: 0.01053
Epoch: 19821, Training Loss: 0.00980
Epoch: 19821, Training Loss: 0.01202
Epoch: 19821, Training Loss: 0.00928
Epoch: 19822, Training Loss: 0.01053
Epoch: 19822, Training Loss: 0.00980
Epoch: 19822, Training Loss: 0.01202
Epoch: 19822, Training Loss: 0.00928
Epoch: 19823, Training Loss: 0.01053
Epoch: 19823, Training Loss: 0.00979
Epoch: 19823, Training Loss: 0.01202
Epoch: 19823, Training Loss: 0.00928
Epoch: 19824, Training Loss: 0.01053
Epoch: 19824, Training Loss: 0.00979
Epoch: 19824, Training Loss: 0.01202
Epoch: 19824, Training Loss: 0.00928
Epoch: 19825, Training Loss: 0.01053
Epoch: 19825, Training Loss: 0.00979
Epoch: 19825, Training Loss: 0.01202
Epoch: 19825, Training Loss: 0.00928
Epoch: 19826, Training Loss: 0.01053
Epoch: 19826, Training Loss: 0.00979
Epoch: 19826, Training Loss: 0.01202
Epoch: 19826, Training Loss: 0.00928
Epoch: 19827, Training Loss: 0.01053
Epoch: 19827, Training Loss: 0.00979
Epoch: 19827, Training Loss: 0.01202
Epoch: 19827, Training Loss: 0.00928
Epoch: 19828, Training Loss: 0.01053
Epoch: 19828, Training Loss: 0.00979
Epoch: 19828, Training Loss: 0.01202
Epoch: 19828, Training Loss: 0.00928
Epoch: 19829, Training Loss: 0.01053
Epoch: 19829, Training Loss: 0.00979
Epoch: 19829, Training Loss: 0.01202
Epoch: 19829, Training Loss: 0.00928
Epoch: 19830, Training Loss: 0.01053
Epoch: 19830, Training Loss: 0.00979
Epoch: 19830, Training Loss: 0.01202
Epoch: 19830, Training Loss: 0.00928
Epoch: 19831, Training Loss: 0.01053
Epoch: 19831, Training Loss: 0.00979
Epoch: 19831, Training Loss: 0.01202
Epoch: 19831, Training Loss: 0.00928
Epoch: 19832, Training Loss: 0.01053
Epoch: 19832, Training Loss: 0.00979
Epoch: 19832, Training Loss: 0.01202
Epoch: 19832, Training Loss: 0.00928
Epoch: 19833, Training Loss: 0.01053
Epoch: 19833, Training Loss: 0.00979
Epoch: 19833, Training Loss: 0.01202
Epoch: 19833, Training Loss: 0.00928
Epoch: 19834, Training Loss: 0.01053
Epoch: 19834, Training Loss: 0.00979
Epoch: 19834, Training Loss: 0.01202
Epoch: 19834, Training Loss: 0.00928
Epoch: 19835, Training Loss: 0.01053
Epoch: 19835, Training Loss: 0.00979
Epoch: 19835, Training Loss: 0.01202
Epoch: 19835, Training Loss: 0.00928
Epoch: 19836, Training Loss: 0.01053
Epoch: 19836, Training Loss: 0.00979
Epoch: 19836, Training Loss: 0.01202
Epoch: 19836, Training Loss: 0.00928
Epoch: 19837, Training Loss: 0.01053
Epoch: 19837, Training Loss: 0.00979
Epoch: 19837, Training Loss: 0.01202
Epoch: 19837, Training Loss: 0.00928
Epoch: 19838, Training Loss: 0.01053
Epoch: 19838, Training Loss: 0.00979
Epoch: 19838, Training Loss: 0.01202
Epoch: 19838, Training Loss: 0.00928
Epoch: 19839, Training Loss: 0.01053
Epoch: 19839, Training Loss: 0.00979
Epoch: 19839, Training Loss: 0.01202
Epoch: 19839, Training Loss: 0.00928
Epoch: 19840, Training Loss: 0.01053
Epoch: 19840, Training Loss: 0.00979
Epoch: 19840, Training Loss: 0.01202
Epoch: 19840, Training Loss: 0.00928
Epoch: 19841, Training Loss: 0.01053
Epoch: 19841, Training Loss: 0.00979
Epoch: 19841, Training Loss: 0.01202
Epoch: 19841, Training Loss: 0.00928
Epoch: 19842, Training Loss: 0.01053
Epoch: 19842, Training Loss: 0.00979
Epoch: 19842, Training Loss: 0.01202
Epoch: 19842, Training Loss: 0.00928
Epoch: 19843, Training Loss: 0.01053
Epoch: 19843, Training Loss: 0.00979
Epoch: 19843, Training Loss: 0.01202
Epoch: 19843, Training Loss: 0.00928
Epoch: 19844, Training Loss: 0.01053
Epoch: 19844, Training Loss: 0.00979
Epoch: 19844, Training Loss: 0.01202
Epoch: 19844, Training Loss: 0.00928
Epoch: 19845, Training Loss: 0.01053
Epoch: 19845, Training Loss: 0.00979
Epoch: 19845, Training Loss: 0.01202
Epoch: 19845, Training Loss: 0.00928
Epoch: 19846, Training Loss: 0.01053
Epoch: 19846, Training Loss: 0.00979
Epoch: 19846, Training Loss: 0.01202
Epoch: 19846, Training Loss: 0.00928
Epoch: 19847, Training Loss: 0.01053
Epoch: 19847, Training Loss: 0.00979
Epoch: 19847, Training Loss: 0.01202
Epoch: 19847, Training Loss: 0.00928
Epoch: 19848, Training Loss: 0.01052
Epoch: 19848, Training Loss: 0.00979
Epoch: 19848, Training Loss: 0.01202
Epoch: 19848, Training Loss: 0.00928
Epoch: 19849, Training Loss: 0.01052
Epoch: 19849, Training Loss: 0.00979
Epoch: 19849, Training Loss: 0.01202
Epoch: 19849, Training Loss: 0.00927
Epoch: 19850, Training Loss: 0.01052
Epoch: 19850, Training Loss: 0.00979
Epoch: 19850, Training Loss: 0.01201
Epoch: 19850, Training Loss: 0.00927
Epoch: 19851, Training Loss: 0.01052
Epoch: 19851, Training Loss: 0.00979
Epoch: 19851, Training Loss: 0.01201
Epoch: 19851, Training Loss: 0.00927
Epoch: 19852, Training Loss: 0.01052
Epoch: 19852, Training Loss: 0.00979
Epoch: 19852, Training Loss: 0.01201
Epoch: 19852, Training Loss: 0.00927
Epoch: 19853, Training Loss: 0.01052
Epoch: 19853, Training Loss: 0.00979
Epoch: 19853, Training Loss: 0.01201
Epoch: 19853, Training Loss: 0.00927
Epoch: 19854, Training Loss: 0.01052
Epoch: 19854, Training Loss: 0.00979
Epoch: 19854, Training Loss: 0.01201
Epoch: 19854, Training Loss: 0.00927
Epoch: 19855, Training Loss: 0.01052
Epoch: 19855, Training Loss: 0.00979
Epoch: 19855, Training Loss: 0.01201
Epoch: 19855, Training Loss: 0.00927
Epoch: 19856, Training Loss: 0.01052
Epoch: 19856, Training Loss: 0.00979
Epoch: 19856, Training Loss: 0.01201
Epoch: 19856, Training Loss: 0.00927
Epoch: 19857, Training Loss: 0.01052
Epoch: 19857, Training Loss: 0.00979
Epoch: 19857, Training Loss: 0.01201
Epoch: 19857, Training Loss: 0.00927
Epoch: 19858, Training Loss: 0.01052
Epoch: 19858, Training Loss: 0.00979
Epoch: 19858, Training Loss: 0.01201
Epoch: 19858, Training Loss: 0.00927
Epoch: 19859, Training Loss: 0.01052
Epoch: 19859, Training Loss: 0.00979
Epoch: 19859, Training Loss: 0.01201
Epoch: 19859, Training Loss: 0.00927
Epoch: 19860, Training Loss: 0.01052
Epoch: 19860, Training Loss: 0.00979
Epoch: 19860, Training Loss: 0.01201
Epoch: 19860, Training Loss: 0.00927
Epoch: 19861, Training Loss: 0.01052
Epoch: 19861, Training Loss: 0.00978
Epoch: 19861, Training Loss: 0.01201
Epoch: 19861, Training Loss: 0.00927
Epoch: 19862, Training Loss: 0.01052
Epoch: 19862, Training Loss: 0.00978
Epoch: 19862, Training Loss: 0.01201
Epoch: 19862, Training Loss: 0.00927
Epoch: 19863, Training Loss: 0.01052
Epoch: 19863, Training Loss: 0.00978
Epoch: 19863, Training Loss: 0.01201
Epoch: 19863, Training Loss: 0.00927
Epoch: 19864, Training Loss: 0.01052
Epoch: 19864, Training Loss: 0.00978
Epoch: 19864, Training Loss: 0.01201
Epoch: 19864, Training Loss: 0.00927
Epoch: 19865, Training Loss: 0.01052
Epoch: 19865, Training Loss: 0.00978
Epoch: 19865, Training Loss: 0.01201
Epoch: 19865, Training Loss: 0.00927
Epoch: 19866, Training Loss: 0.01052
Epoch: 19866, Training Loss: 0.00978
Epoch: 19866, Training Loss: 0.01201
Epoch: 19866, Training Loss: 0.00927
Epoch: 19867, Training Loss: 0.01052
Epoch: 19867, Training Loss: 0.00978
Epoch: 19867, Training Loss: 0.01201
Epoch: 19867, Training Loss: 0.00927
Epoch: 19868, Training Loss: 0.01052
Epoch: 19868, Training Loss: 0.00978
Epoch: 19868, Training Loss: 0.01201
Epoch: 19868, Training Loss: 0.00927
Epoch: 19869, Training Loss: 0.01052
Epoch: 19869, Training Loss: 0.00978
Epoch: 19869, Training Loss: 0.01201
Epoch: 19869, Training Loss: 0.00927
Epoch: 19870, Training Loss: 0.01052
Epoch: 19870, Training Loss: 0.00978
Epoch: 19870, Training Loss: 0.01201
Epoch: 19870, Training Loss: 0.00927
Epoch: 19871, Training Loss: 0.01052
Epoch: 19871, Training Loss: 0.00978
Epoch: 19871, Training Loss: 0.01201
Epoch: 19871, Training Loss: 0.00927
Epoch: 19872, Training Loss: 0.01052
Epoch: 19872, Training Loss: 0.00978
Epoch: 19872, Training Loss: 0.01201
Epoch: 19872, Training Loss: 0.00927
Epoch: 19873, Training Loss: 0.01052
Epoch: 19873, Training Loss: 0.00978
Epoch: 19873, Training Loss: 0.01201
Epoch: 19873, Training Loss: 0.00927
Epoch: 19874, Training Loss: 0.01052
Epoch: 19874, Training Loss: 0.00978
Epoch: 19874, Training Loss: 0.01201
Epoch: 19874, Training Loss: 0.00927
Epoch: 19875, Training Loss: 0.01052
Epoch: 19875, Training Loss: 0.00978
Epoch: 19875, Training Loss: 0.01201
Epoch: 19875, Training Loss: 0.00927
Epoch: 19876, Training Loss: 0.01052
Epoch: 19876, Training Loss: 0.00978
Epoch: 19876, Training Loss: 0.01201
Epoch: 19876, Training Loss: 0.00927
Epoch: 19877, Training Loss: 0.01052
Epoch: 19877, Training Loss: 0.00978
Epoch: 19877, Training Loss: 0.01201
Epoch: 19877, Training Loss: 0.00927
Epoch: 19878, Training Loss: 0.01052
Epoch: 19878, Training Loss: 0.00978
Epoch: 19878, Training Loss: 0.01201
Epoch: 19878, Training Loss: 0.00927
Epoch: 19879, Training Loss: 0.01052
Epoch: 19879, Training Loss: 0.00978
Epoch: 19879, Training Loss: 0.01201
Epoch: 19879, Training Loss: 0.00927
Epoch: 19880, Training Loss: 0.01052
Epoch: 19880, Training Loss: 0.00978
Epoch: 19880, Training Loss: 0.01201
Epoch: 19880, Training Loss: 0.00927
Epoch: 19881, Training Loss: 0.01052
Epoch: 19881, Training Loss: 0.00978
Epoch: 19881, Training Loss: 0.01200
Epoch: 19881, Training Loss: 0.00927
Epoch: 19882, Training Loss: 0.01052
Epoch: 19882, Training Loss: 0.00978
Epoch: 19882, Training Loss: 0.01200
Epoch: 19882, Training Loss: 0.00927
Epoch: 19883, Training Loss: 0.01051
Epoch: 19883, Training Loss: 0.00978
Epoch: 19883, Training Loss: 0.01200
Epoch: 19883, Training Loss: 0.00927
Epoch: 19884, Training Loss: 0.01051
Epoch: 19884, Training Loss: 0.00978
Epoch: 19884, Training Loss: 0.01200
Epoch: 19884, Training Loss: 0.00927
Epoch: 19885, Training Loss: 0.01051
Epoch: 19885, Training Loss: 0.00978
Epoch: 19885, Training Loss: 0.01200
Epoch: 19885, Training Loss: 0.00927
Epoch: 19886, Training Loss: 0.01051
Epoch: 19886, Training Loss: 0.00978
Epoch: 19886, Training Loss: 0.01200
Epoch: 19886, Training Loss: 0.00927
Epoch: 19887, Training Loss: 0.01051
Epoch: 19887, Training Loss: 0.00978
Epoch: 19887, Training Loss: 0.01200
Epoch: 19887, Training Loss: 0.00927
Epoch: 19888, Training Loss: 0.01051
Epoch: 19888, Training Loss: 0.00978
Epoch: 19888, Training Loss: 0.01200
Epoch: 19888, Training Loss: 0.00927
Epoch: 19889, Training Loss: 0.01051
Epoch: 19889, Training Loss: 0.00978
Epoch: 19889, Training Loss: 0.01200
Epoch: 19889, Training Loss: 0.00927
Epoch: 19890, Training Loss: 0.01051
Epoch: 19890, Training Loss: 0.00978
Epoch: 19890, Training Loss: 0.01200
Epoch: 19890, Training Loss: 0.00926
Epoch: 19891, Training Loss: 0.01051
Epoch: 19891, Training Loss: 0.00978
Epoch: 19891, Training Loss: 0.01200
Epoch: 19891, Training Loss: 0.00926
Epoch: 19892, Training Loss: 0.01051
Epoch: 19892, Training Loss: 0.00978
Epoch: 19892, Training Loss: 0.01200
Epoch: 19892, Training Loss: 0.00926
Epoch: 19893, Training Loss: 0.01051
Epoch: 19893, Training Loss: 0.00978
Epoch: 19893, Training Loss: 0.01200
Epoch: 19893, Training Loss: 0.00926
Epoch: 19894, Training Loss: 0.01051
Epoch: 19894, Training Loss: 0.00978
Epoch: 19894, Training Loss: 0.01200
Epoch: 19894, Training Loss: 0.00926
Epoch: 19895, Training Loss: 0.01051
Epoch: 19895, Training Loss: 0.00978
Epoch: 19895, Training Loss: 0.01200
Epoch: 19895, Training Loss: 0.00926
Epoch: 19896, Training Loss: 0.01051
Epoch: 19896, Training Loss: 0.00978
Epoch: 19896, Training Loss: 0.01200
Epoch: 19896, Training Loss: 0.00926
Epoch: 19897, Training Loss: 0.01051
Epoch: 19897, Training Loss: 0.00978
Epoch: 19897, Training Loss: 0.01200
Epoch: 19897, Training Loss: 0.00926
Epoch: 19898, Training Loss: 0.01051
Epoch: 19898, Training Loss: 0.00978
Epoch: 19898, Training Loss: 0.01200
Epoch: 19898, Training Loss: 0.00926
Epoch: 19899, Training Loss: 0.01051
Epoch: 19899, Training Loss: 0.00978
Epoch: 19899, Training Loss: 0.01200
Epoch: 19899, Training Loss: 0.00926
Epoch: 19900, Training Loss: 0.01051
Epoch: 19900, Training Loss: 0.00977
Epoch: 19900, Training Loss: 0.01200
Epoch: 19900, Training Loss: 0.00926
Epoch: 19901, Training Loss: 0.01051
Epoch: 19901, Training Loss: 0.00977
Epoch: 19901, Training Loss: 0.01200
Epoch: 19901, Training Loss: 0.00926
Epoch: 19902, Training Loss: 0.01051
Epoch: 19902, Training Loss: 0.00977
Epoch: 19902, Training Loss: 0.01200
Epoch: 19902, Training Loss: 0.00926
Epoch: 19903, Training Loss: 0.01051
Epoch: 19903, Training Loss: 0.00977
Epoch: 19903, Training Loss: 0.01200
Epoch: 19903, Training Loss: 0.00926
Epoch: 19904, Training Loss: 0.01051
Epoch: 19904, Training Loss: 0.00977
Epoch: 19904, Training Loss: 0.01200
Epoch: 19904, Training Loss: 0.00926
Epoch: 19905, Training Loss: 0.01051
Epoch: 19905, Training Loss: 0.00977
Epoch: 19905, Training Loss: 0.01200
Epoch: 19905, Training Loss: 0.00926
Epoch: 19906, Training Loss: 0.01051
Epoch: 19906, Training Loss: 0.00977
Epoch: 19906, Training Loss: 0.01200
Epoch: 19906, Training Loss: 0.00926
Epoch: 19907, Training Loss: 0.01051
Epoch: 19907, Training Loss: 0.00977
Epoch: 19907, Training Loss: 0.01200
Epoch: 19907, Training Loss: 0.00926
Epoch: 19908, Training Loss: 0.01051
Epoch: 19908, Training Loss: 0.00977
Epoch: 19908, Training Loss: 0.01200
Epoch: 19908, Training Loss: 0.00926
Epoch: 19909, Training Loss: 0.01051
Epoch: 19909, Training Loss: 0.00977
Epoch: 19909, Training Loss: 0.01200
Epoch: 19909, Training Loss: 0.00926
Epoch: 19910, Training Loss: 0.01051
Epoch: 19910, Training Loss: 0.00977
Epoch: 19910, Training Loss: 0.01200
Epoch: 19910, Training Loss: 0.00926
Epoch: 19911, Training Loss: 0.01051
Epoch: 19911, Training Loss: 0.00977
Epoch: 19911, Training Loss: 0.01199
Epoch: 19911, Training Loss: 0.00926
Epoch: 19912, Training Loss: 0.01051
Epoch: 19912, Training Loss: 0.00977
Epoch: 19912, Training Loss: 0.01199
Epoch: 19912, Training Loss: 0.00926
Epoch: 19913, Training Loss: 0.01051
Epoch: 19913, Training Loss: 0.00977
Epoch: 19913, Training Loss: 0.01199
Epoch: 19913, Training Loss: 0.00926
Epoch: 19914, Training Loss: 0.01051
Epoch: 19914, Training Loss: 0.00977
Epoch: 19914, Training Loss: 0.01199
Epoch: 19914, Training Loss: 0.00926
Epoch: 19915, Training Loss: 0.01051
Epoch: 19915, Training Loss: 0.00977
Epoch: 19915, Training Loss: 0.01199
Epoch: 19915, Training Loss: 0.00926
Epoch: 19916, Training Loss: 0.01051
Epoch: 19916, Training Loss: 0.00977
Epoch: 19916, Training Loss: 0.01199
Epoch: 19916, Training Loss: 0.00926
Epoch: 19917, Training Loss: 0.01051
Epoch: 19917, Training Loss: 0.00977
Epoch: 19917, Training Loss: 0.01199
Epoch: 19917, Training Loss: 0.00926
Epoch: 19918, Training Loss: 0.01050
Epoch: 19918, Training Loss: 0.00977
Epoch: 19918, Training Loss: 0.01199
Epoch: 19918, Training Loss: 0.00926
Epoch: 19919, Training Loss: 0.01050
Epoch: 19919, Training Loss: 0.00977
Epoch: 19919, Training Loss: 0.01199
Epoch: 19919, Training Loss: 0.00926
Epoch: 19920, Training Loss: 0.01050
Epoch: 19920, Training Loss: 0.00977
Epoch: 19920, Training Loss: 0.01199
Epoch: 19920, Training Loss: 0.00926
Epoch: 19921, Training Loss: 0.01050
Epoch: 19921, Training Loss: 0.00977
Epoch: 19921, Training Loss: 0.01199
Epoch: 19921, Training Loss: 0.00926
Epoch: 19922, Training Loss: 0.01050
Epoch: 19922, Training Loss: 0.00977
Epoch: 19922, Training Loss: 0.01199
Epoch: 19922, Training Loss: 0.00926
Epoch: 19923, Training Loss: 0.01050
Epoch: 19923, Training Loss: 0.00977
Epoch: 19923, Training Loss: 0.01199
Epoch: 19923, Training Loss: 0.00926
Epoch: 19924, Training Loss: 0.01050
Epoch: 19924, Training Loss: 0.00977
Epoch: 19924, Training Loss: 0.01199
Epoch: 19924, Training Loss: 0.00926
Epoch: 19925, Training Loss: 0.01050
Epoch: 19925, Training Loss: 0.00977
Epoch: 19925, Training Loss: 0.01199
Epoch: 19925, Training Loss: 0.00926
Epoch: 19926, Training Loss: 0.01050
Epoch: 19926, Training Loss: 0.00977
Epoch: 19926, Training Loss: 0.01199
Epoch: 19926, Training Loss: 0.00926
Epoch: 19927, Training Loss: 0.01050
Epoch: 19927, Training Loss: 0.00977
Epoch: 19927, Training Loss: 0.01199
Epoch: 19927, Training Loss: 0.00926
Epoch: 19928, Training Loss: 0.01050
Epoch: 19928, Training Loss: 0.00977
Epoch: 19928, Training Loss: 0.01199
Epoch: 19928, Training Loss: 0.00926
Epoch: 19929, Training Loss: 0.01050
Epoch: 19929, Training Loss: 0.00977
Epoch: 19929, Training Loss: 0.01199
Epoch: 19929, Training Loss: 0.00926
Epoch: 19930, Training Loss: 0.01050
Epoch: 19930, Training Loss: 0.00977
Epoch: 19930, Training Loss: 0.01199
Epoch: 19930, Training Loss: 0.00925
Epoch: 19931, Training Loss: 0.01050
Epoch: 19931, Training Loss: 0.00977
Epoch: 19931, Training Loss: 0.01199
Epoch: 19931, Training Loss: 0.00925
Epoch: 19932, Training Loss: 0.01050
Epoch: 19932, Training Loss: 0.00977
Epoch: 19932, Training Loss: 0.01199
Epoch: 19932, Training Loss: 0.00925
Epoch: 19933, Training Loss: 0.01050
Epoch: 19933, Training Loss: 0.00977
Epoch: 19933, Training Loss: 0.01199
Epoch: 19933, Training Loss: 0.00925
Epoch: 19934, Training Loss: 0.01050
Epoch: 19934, Training Loss: 0.00977
Epoch: 19934, Training Loss: 0.01199
Epoch: 19934, Training Loss: 0.00925
Epoch: 19935, Training Loss: 0.01050
Epoch: 19935, Training Loss: 0.00977
Epoch: 19935, Training Loss: 0.01050
Epoch: 19935, Training Loss: 0.00977
Epoch: 19935, Training Loss: 0.01199
Epoch: 19935, Training Loss: 0.00925
...
Epoch: 20179, Training Loss: 0.01043
[output truncated — four loss values are printed per epoch, one per XOR training pattern. Over epochs 19935–20179 they decrease extremely slowly, plateauing near 0.0104, 0.0097, 0.0119, and 0.0092, so the training error is still above the 0.008 target at epoch 20179.]
Epoch: 20179, Training Loss: 0.00970
Epoch: 20179, Training Loss: 0.01191
Epoch: 20179, Training Loss: 0.00919
Epoch: 20180, Training Loss: 0.01043
Epoch: 20180, Training Loss: 0.00970
Epoch: 20180, Training Loss: 0.01191
Epoch: 20180, Training Loss: 0.00919
Epoch: 20181, Training Loss: 0.01043
Epoch: 20181, Training Loss: 0.00970
Epoch: 20181, Training Loss: 0.01191
Epoch: 20181, Training Loss: 0.00919
Epoch: 20182, Training Loss: 0.01043
Epoch: 20182, Training Loss: 0.00970
Epoch: 20182, Training Loss: 0.01191
Epoch: 20182, Training Loss: 0.00919
Epoch: 20183, Training Loss: 0.01043
Epoch: 20183, Training Loss: 0.00970
Epoch: 20183, Training Loss: 0.01191
Epoch: 20183, Training Loss: 0.00919
Epoch: 20184, Training Loss: 0.01043
Epoch: 20184, Training Loss: 0.00970
Epoch: 20184, Training Loss: 0.01191
Epoch: 20184, Training Loss: 0.00919
Epoch: 20185, Training Loss: 0.01043
Epoch: 20185, Training Loss: 0.00970
Epoch: 20185, Training Loss: 0.01191
Epoch: 20185, Training Loss: 0.00919
Epoch: 20186, Training Loss: 0.01043
Epoch: 20186, Training Loss: 0.00970
Epoch: 20186, Training Loss: 0.01191
Epoch: 20186, Training Loss: 0.00919
Epoch: 20187, Training Loss: 0.01043
Epoch: 20187, Training Loss: 0.00970
Epoch: 20187, Training Loss: 0.01191
Epoch: 20187, Training Loss: 0.00919
Epoch: 20188, Training Loss: 0.01043
Epoch: 20188, Training Loss: 0.00970
Epoch: 20188, Training Loss: 0.01191
Epoch: 20188, Training Loss: 0.00919
Epoch: 20189, Training Loss: 0.01043
Epoch: 20189, Training Loss: 0.00970
Epoch: 20189, Training Loss: 0.01191
Epoch: 20189, Training Loss: 0.00919
Epoch: 20190, Training Loss: 0.01043
Epoch: 20190, Training Loss: 0.00970
Epoch: 20190, Training Loss: 0.01191
Epoch: 20190, Training Loss: 0.00919
Epoch: 20191, Training Loss: 0.01043
Epoch: 20191, Training Loss: 0.00970
Epoch: 20191, Training Loss: 0.01190
Epoch: 20191, Training Loss: 0.00919
Epoch: 20192, Training Loss: 0.01043
Epoch: 20192, Training Loss: 0.00970
Epoch: 20192, Training Loss: 0.01190
Epoch: 20192, Training Loss: 0.00919
Epoch: 20193, Training Loss: 0.01043
Epoch: 20193, Training Loss: 0.00970
Epoch: 20193, Training Loss: 0.01190
Epoch: 20193, Training Loss: 0.00919
Epoch: 20194, Training Loss: 0.01043
Epoch: 20194, Training Loss: 0.00970
Epoch: 20194, Training Loss: 0.01190
Epoch: 20194, Training Loss: 0.00919
Epoch: 20195, Training Loss: 0.01043
Epoch: 20195, Training Loss: 0.00970
Epoch: 20195, Training Loss: 0.01190
Epoch: 20195, Training Loss: 0.00919
Epoch: 20196, Training Loss: 0.01043
Epoch: 20196, Training Loss: 0.00970
Epoch: 20196, Training Loss: 0.01190
Epoch: 20196, Training Loss: 0.00919
Epoch: 20197, Training Loss: 0.01043
Epoch: 20197, Training Loss: 0.00970
Epoch: 20197, Training Loss: 0.01190
Epoch: 20197, Training Loss: 0.00919
Epoch: 20198, Training Loss: 0.01043
Epoch: 20198, Training Loss: 0.00970
Epoch: 20198, Training Loss: 0.01190
Epoch: 20198, Training Loss: 0.00919
Epoch: 20199, Training Loss: 0.01043
Epoch: 20199, Training Loss: 0.00970
Epoch: 20199, Training Loss: 0.01190
Epoch: 20199, Training Loss: 0.00919
Epoch: 20200, Training Loss: 0.01043
Epoch: 20200, Training Loss: 0.00970
Epoch: 20200, Training Loss: 0.01190
Epoch: 20200, Training Loss: 0.00919
Epoch: 20201, Training Loss: 0.01043
Epoch: 20201, Training Loss: 0.00970
Epoch: 20201, Training Loss: 0.01190
Epoch: 20201, Training Loss: 0.00919
Epoch: 20202, Training Loss: 0.01042
Epoch: 20202, Training Loss: 0.00970
Epoch: 20202, Training Loss: 0.01190
Epoch: 20202, Training Loss: 0.00919
Epoch: 20203, Training Loss: 0.01042
Epoch: 20203, Training Loss: 0.00970
Epoch: 20203, Training Loss: 0.01190
Epoch: 20203, Training Loss: 0.00919
Epoch: 20204, Training Loss: 0.01042
Epoch: 20204, Training Loss: 0.00970
Epoch: 20204, Training Loss: 0.01190
Epoch: 20204, Training Loss: 0.00919
Epoch: 20205, Training Loss: 0.01042
Epoch: 20205, Training Loss: 0.00970
Epoch: 20205, Training Loss: 0.01190
Epoch: 20205, Training Loss: 0.00919
Epoch: 20206, Training Loss: 0.01042
Epoch: 20206, Training Loss: 0.00970
Epoch: 20206, Training Loss: 0.01190
Epoch: 20206, Training Loss: 0.00919
Epoch: 20207, Training Loss: 0.01042
Epoch: 20207, Training Loss: 0.00970
Epoch: 20207, Training Loss: 0.01190
Epoch: 20207, Training Loss: 0.00919
Epoch: 20208, Training Loss: 0.01042
Epoch: 20208, Training Loss: 0.00970
Epoch: 20208, Training Loss: 0.01190
Epoch: 20208, Training Loss: 0.00919
Epoch: 20209, Training Loss: 0.01042
Epoch: 20209, Training Loss: 0.00970
Epoch: 20209, Training Loss: 0.01190
Epoch: 20209, Training Loss: 0.00919
Epoch: 20210, Training Loss: 0.01042
Epoch: 20210, Training Loss: 0.00970
Epoch: 20210, Training Loss: 0.01190
Epoch: 20210, Training Loss: 0.00919
Epoch: 20211, Training Loss: 0.01042
Epoch: 20211, Training Loss: 0.00969
Epoch: 20211, Training Loss: 0.01190
Epoch: 20211, Training Loss: 0.00919
Epoch: 20212, Training Loss: 0.01042
Epoch: 20212, Training Loss: 0.00969
Epoch: 20212, Training Loss: 0.01190
Epoch: 20212, Training Loss: 0.00919
Epoch: 20213, Training Loss: 0.01042
Epoch: 20213, Training Loss: 0.00969
Epoch: 20213, Training Loss: 0.01190
Epoch: 20213, Training Loss: 0.00919
Epoch: 20214, Training Loss: 0.01042
Epoch: 20214, Training Loss: 0.00969
Epoch: 20214, Training Loss: 0.01190
Epoch: 20214, Training Loss: 0.00919
Epoch: 20215, Training Loss: 0.01042
Epoch: 20215, Training Loss: 0.00969
Epoch: 20215, Training Loss: 0.01190
Epoch: 20215, Training Loss: 0.00919
Epoch: 20216, Training Loss: 0.01042
Epoch: 20216, Training Loss: 0.00969
Epoch: 20216, Training Loss: 0.01190
Epoch: 20216, Training Loss: 0.00918
Epoch: 20217, Training Loss: 0.01042
Epoch: 20217, Training Loss: 0.00969
Epoch: 20217, Training Loss: 0.01190
Epoch: 20217, Training Loss: 0.00918
Epoch: 20218, Training Loss: 0.01042
Epoch: 20218, Training Loss: 0.00969
Epoch: 20218, Training Loss: 0.01190
Epoch: 20218, Training Loss: 0.00918
Epoch: 20219, Training Loss: 0.01042
Epoch: 20219, Training Loss: 0.00969
Epoch: 20219, Training Loss: 0.01190
Epoch: 20219, Training Loss: 0.00918
Epoch: 20220, Training Loss: 0.01042
Epoch: 20220, Training Loss: 0.00969
Epoch: 20220, Training Loss: 0.01190
Epoch: 20220, Training Loss: 0.00918
Epoch: 20221, Training Loss: 0.01042
Epoch: 20221, Training Loss: 0.00969
Epoch: 20221, Training Loss: 0.01190
Epoch: 20221, Training Loss: 0.00918
Epoch: 20222, Training Loss: 0.01042
Epoch: 20222, Training Loss: 0.00969
Epoch: 20222, Training Loss: 0.01189
Epoch: 20222, Training Loss: 0.00918
Epoch: 20223, Training Loss: 0.01042
Epoch: 20223, Training Loss: 0.00969
Epoch: 20223, Training Loss: 0.01189
Epoch: 20223, Training Loss: 0.00918
Epoch: 20224, Training Loss: 0.01042
Epoch: 20224, Training Loss: 0.00969
Epoch: 20224, Training Loss: 0.01189
Epoch: 20224, Training Loss: 0.00918
Epoch: 20225, Training Loss: 0.01042
Epoch: 20225, Training Loss: 0.00969
Epoch: 20225, Training Loss: 0.01189
Epoch: 20225, Training Loss: 0.00918
Epoch: 20226, Training Loss: 0.01042
Epoch: 20226, Training Loss: 0.00969
Epoch: 20226, Training Loss: 0.01189
Epoch: 20226, Training Loss: 0.00918
Epoch: 20227, Training Loss: 0.01042
Epoch: 20227, Training Loss: 0.00969
Epoch: 20227, Training Loss: 0.01189
Epoch: 20227, Training Loss: 0.00918
Epoch: 20228, Training Loss: 0.01042
Epoch: 20228, Training Loss: 0.00969
Epoch: 20228, Training Loss: 0.01189
Epoch: 20228, Training Loss: 0.00918
Epoch: 20229, Training Loss: 0.01042
Epoch: 20229, Training Loss: 0.00969
Epoch: 20229, Training Loss: 0.01189
Epoch: 20229, Training Loss: 0.00918
Epoch: 20230, Training Loss: 0.01042
Epoch: 20230, Training Loss: 0.00969
Epoch: 20230, Training Loss: 0.01189
Epoch: 20230, Training Loss: 0.00918
Epoch: 20231, Training Loss: 0.01042
Epoch: 20231, Training Loss: 0.00969
Epoch: 20231, Training Loss: 0.01189
Epoch: 20231, Training Loss: 0.00918
Epoch: 20232, Training Loss: 0.01042
Epoch: 20232, Training Loss: 0.00969
Epoch: 20232, Training Loss: 0.01189
Epoch: 20232, Training Loss: 0.00918
Epoch: 20233, Training Loss: 0.01042
Epoch: 20233, Training Loss: 0.00969
Epoch: 20233, Training Loss: 0.01189
Epoch: 20233, Training Loss: 0.00918
Epoch: 20234, Training Loss: 0.01042
Epoch: 20234, Training Loss: 0.00969
Epoch: 20234, Training Loss: 0.01189
Epoch: 20234, Training Loss: 0.00918
Epoch: 20235, Training Loss: 0.01042
Epoch: 20235, Training Loss: 0.00969
Epoch: 20235, Training Loss: 0.01189
Epoch: 20235, Training Loss: 0.00918
Epoch: 20236, Training Loss: 0.01042
Epoch: 20236, Training Loss: 0.00969
Epoch: 20236, Training Loss: 0.01189
Epoch: 20236, Training Loss: 0.00918
Epoch: 20237, Training Loss: 0.01042
Epoch: 20237, Training Loss: 0.00969
Epoch: 20237, Training Loss: 0.01189
Epoch: 20237, Training Loss: 0.00918
Epoch: 20238, Training Loss: 0.01041
Epoch: 20238, Training Loss: 0.00969
Epoch: 20238, Training Loss: 0.01189
Epoch: 20238, Training Loss: 0.00918
Epoch: 20239, Training Loss: 0.01041
Epoch: 20239, Training Loss: 0.00969
Epoch: 20239, Training Loss: 0.01189
Epoch: 20239, Training Loss: 0.00918
Epoch: 20240, Training Loss: 0.01041
Epoch: 20240, Training Loss: 0.00969
Epoch: 20240, Training Loss: 0.01189
Epoch: 20240, Training Loss: 0.00918
Epoch: 20241, Training Loss: 0.01041
Epoch: 20241, Training Loss: 0.00969
Epoch: 20241, Training Loss: 0.01189
Epoch: 20241, Training Loss: 0.00918
Epoch: 20242, Training Loss: 0.01041
Epoch: 20242, Training Loss: 0.00969
Epoch: 20242, Training Loss: 0.01189
Epoch: 20242, Training Loss: 0.00918
Epoch: 20243, Training Loss: 0.01041
Epoch: 20243, Training Loss: 0.00969
Epoch: 20243, Training Loss: 0.01189
Epoch: 20243, Training Loss: 0.00918
Epoch: 20244, Training Loss: 0.01041
Epoch: 20244, Training Loss: 0.00969
Epoch: 20244, Training Loss: 0.01189
Epoch: 20244, Training Loss: 0.00918
Epoch: 20245, Training Loss: 0.01041
Epoch: 20245, Training Loss: 0.00969
Epoch: 20245, Training Loss: 0.01189
Epoch: 20245, Training Loss: 0.00918
Epoch: 20246, Training Loss: 0.01041
Epoch: 20246, Training Loss: 0.00969
Epoch: 20246, Training Loss: 0.01189
Epoch: 20246, Training Loss: 0.00918
Epoch: 20247, Training Loss: 0.01041
Epoch: 20247, Training Loss: 0.00969
Epoch: 20247, Training Loss: 0.01189
Epoch: 20247, Training Loss: 0.00918
Epoch: 20248, Training Loss: 0.01041
Epoch: 20248, Training Loss: 0.00969
Epoch: 20248, Training Loss: 0.01189
Epoch: 20248, Training Loss: 0.00918
Epoch: 20249, Training Loss: 0.01041
Epoch: 20249, Training Loss: 0.00969
Epoch: 20249, Training Loss: 0.01189
Epoch: 20249, Training Loss: 0.00918
Epoch: 20250, Training Loss: 0.01041
Epoch: 20250, Training Loss: 0.00968
Epoch: 20250, Training Loss: 0.01189
Epoch: 20250, Training Loss: 0.00918
Epoch: 20251, Training Loss: 0.01041
Epoch: 20251, Training Loss: 0.00968
Epoch: 20251, Training Loss: 0.01189
Epoch: 20251, Training Loss: 0.00918
Epoch: 20252, Training Loss: 0.01041
Epoch: 20252, Training Loss: 0.00968
Epoch: 20252, Training Loss: 0.01189
Epoch: 20252, Training Loss: 0.00918
Epoch: 20253, Training Loss: 0.01041
Epoch: 20253, Training Loss: 0.00968
Epoch: 20253, Training Loss: 0.01189
Epoch: 20253, Training Loss: 0.00918
Epoch: 20254, Training Loss: 0.01041
Epoch: 20254, Training Loss: 0.00968
Epoch: 20254, Training Loss: 0.01188
Epoch: 20254, Training Loss: 0.00918
Epoch: 20255, Training Loss: 0.01041
Epoch: 20255, Training Loss: 0.00968
Epoch: 20255, Training Loss: 0.01188
Epoch: 20255, Training Loss: 0.00918
Epoch: 20256, Training Loss: 0.01041
Epoch: 20256, Training Loss: 0.00968
Epoch: 20256, Training Loss: 0.01188
Epoch: 20256, Training Loss: 0.00918
Epoch: 20257, Training Loss: 0.01041
Epoch: 20257, Training Loss: 0.00968
Epoch: 20257, Training Loss: 0.01188
Epoch: 20257, Training Loss: 0.00918
Epoch: 20258, Training Loss: 0.01041
Epoch: 20258, Training Loss: 0.00968
Epoch: 20258, Training Loss: 0.01188
Epoch: 20258, Training Loss: 0.00917
Epoch: 20259, Training Loss: 0.01041
Epoch: 20259, Training Loss: 0.00968
Epoch: 20259, Training Loss: 0.01188
Epoch: 20259, Training Loss: 0.00917
Epoch: 20260, Training Loss: 0.01041
Epoch: 20260, Training Loss: 0.00968
Epoch: 20260, Training Loss: 0.01188
Epoch: 20260, Training Loss: 0.00917
Epoch: 20261, Training Loss: 0.01041
Epoch: 20261, Training Loss: 0.00968
Epoch: 20261, Training Loss: 0.01188
Epoch: 20261, Training Loss: 0.00917
Epoch: 20262, Training Loss: 0.01041
Epoch: 20262, Training Loss: 0.00968
Epoch: 20262, Training Loss: 0.01188
Epoch: 20262, Training Loss: 0.00917
Epoch: 20263, Training Loss: 0.01041
Epoch: 20263, Training Loss: 0.00968
Epoch: 20263, Training Loss: 0.01188
Epoch: 20263, Training Loss: 0.00917
Epoch: 20264, Training Loss: 0.01041
Epoch: 20264, Training Loss: 0.00968
Epoch: 20264, Training Loss: 0.01188
Epoch: 20264, Training Loss: 0.00917
Epoch: 20265, Training Loss: 0.01041
Epoch: 20265, Training Loss: 0.00968
Epoch: 20265, Training Loss: 0.01188
Epoch: 20265, Training Loss: 0.00917
Epoch: 20266, Training Loss: 0.01041
Epoch: 20266, Training Loss: 0.00968
Epoch: 20266, Training Loss: 0.01188
Epoch: 20266, Training Loss: 0.00917
Epoch: 20267, Training Loss: 0.01041
Epoch: 20267, Training Loss: 0.00968
Epoch: 20267, Training Loss: 0.01188
Epoch: 20267, Training Loss: 0.00917
Epoch: 20268, Training Loss: 0.01041
Epoch: 20268, Training Loss: 0.00968
Epoch: 20268, Training Loss: 0.01188
Epoch: 20268, Training Loss: 0.00917
Epoch: 20269, Training Loss: 0.01041
Epoch: 20269, Training Loss: 0.00968
Epoch: 20269, Training Loss: 0.01188
Epoch: 20269, Training Loss: 0.00917
Epoch: 20270, Training Loss: 0.01041
Epoch: 20270, Training Loss: 0.00968
Epoch: 20270, Training Loss: 0.01188
Epoch: 20270, Training Loss: 0.00917
Epoch: 20271, Training Loss: 0.01041
Epoch: 20271, Training Loss: 0.00968
Epoch: 20271, Training Loss: 0.01188
Epoch: 20271, Training Loss: 0.00917
Epoch: 20272, Training Loss: 0.01041
Epoch: 20272, Training Loss: 0.00968
Epoch: 20272, Training Loss: 0.01188
Epoch: 20272, Training Loss: 0.00917
Epoch: 20273, Training Loss: 0.01041
Epoch: 20273, Training Loss: 0.00968
Epoch: 20273, Training Loss: 0.01188
Epoch: 20273, Training Loss: 0.00917
Epoch: 20274, Training Loss: 0.01040
Epoch: 20274, Training Loss: 0.00968
Epoch: 20274, Training Loss: 0.01188
Epoch: 20274, Training Loss: 0.00917
Epoch: 20275, Training Loss: 0.01040
Epoch: 20275, Training Loss: 0.00968
Epoch: 20275, Training Loss: 0.01188
Epoch: 20275, Training Loss: 0.00917
Epoch: 20276, Training Loss: 0.01040
Epoch: 20276, Training Loss: 0.00968
Epoch: 20276, Training Loss: 0.01188
Epoch: 20276, Training Loss: 0.00917
Epoch: 20277, Training Loss: 0.01040
Epoch: 20277, Training Loss: 0.00968
Epoch: 20277, Training Loss: 0.01188
Epoch: 20277, Training Loss: 0.00917
Epoch: 20278, Training Loss: 0.01040
Epoch: 20278, Training Loss: 0.00968
Epoch: 20278, Training Loss: 0.01188
Epoch: 20278, Training Loss: 0.00917
Epoch: 20279, Training Loss: 0.01040
Epoch: 20279, Training Loss: 0.00968
Epoch: 20279, Training Loss: 0.01188
Epoch: 20279, Training Loss: 0.00917
Epoch: 20280, Training Loss: 0.01040
Epoch: 20280, Training Loss: 0.00968
Epoch: 20280, Training Loss: 0.01188
Epoch: 20280, Training Loss: 0.00917
Epoch: 20281, Training Loss: 0.01040
Epoch: 20281, Training Loss: 0.00968
Epoch: 20281, Training Loss: 0.01188
Epoch: 20281, Training Loss: 0.00917
Epoch: 20282, Training Loss: 0.01040
Epoch: 20282, Training Loss: 0.00968
Epoch: 20282, Training Loss: 0.01188
Epoch: 20282, Training Loss: 0.00917
Epoch: 20283, Training Loss: 0.01040
Epoch: 20283, Training Loss: 0.00968
Epoch: 20283, Training Loss: 0.01188
Epoch: 20283, Training Loss: 0.00917
Epoch: 20284, Training Loss: 0.01040
Epoch: 20284, Training Loss: 0.00968
Epoch: 20284, Training Loss: 0.01188
Epoch: 20284, Training Loss: 0.00917
Epoch: 20285, Training Loss: 0.01040
Epoch: 20285, Training Loss: 0.00968
Epoch: 20285, Training Loss: 0.01187
Epoch: 20285, Training Loss: 0.00917
Epoch: 20286, Training Loss: 0.01040
Epoch: 20286, Training Loss: 0.00968
Epoch: 20286, Training Loss: 0.01187
Epoch: 20286, Training Loss: 0.00917
Epoch: 20287, Training Loss: 0.01040
Epoch: 20287, Training Loss: 0.00968
Epoch: 20287, Training Loss: 0.01187
Epoch: 20287, Training Loss: 0.00917
Epoch: 20288, Training Loss: 0.01040
Epoch: 20288, Training Loss: 0.00968
Epoch: 20288, Training Loss: 0.01187
Epoch: 20288, Training Loss: 0.00917
Epoch: 20289, Training Loss: 0.01040
Epoch: 20289, Training Loss: 0.00967
Epoch: 20289, Training Loss: 0.01187
Epoch: 20289, Training Loss: 0.00917
Epoch: 20290, Training Loss: 0.01040
Epoch: 20290, Training Loss: 0.00967
Epoch: 20290, Training Loss: 0.01187
Epoch: 20290, Training Loss: 0.00917
Epoch: 20291, Training Loss: 0.01040
Epoch: 20291, Training Loss: 0.00967
Epoch: 20291, Training Loss: 0.01187
Epoch: 20291, Training Loss: 0.00917
Epoch: 20292, Training Loss: 0.01040
Epoch: 20292, Training Loss: 0.00967
Epoch: 20292, Training Loss: 0.01187
Epoch: 20292, Training Loss: 0.00917
Epoch: 20293, Training Loss: 0.01040
Epoch: 20293, Training Loss: 0.00967
Epoch: 20293, Training Loss: 0.01187
Epoch: 20293, Training Loss: 0.00917
Epoch: 20294, Training Loss: 0.01040
Epoch: 20294, Training Loss: 0.00967
Epoch: 20294, Training Loss: 0.01187
Epoch: 20294, Training Loss: 0.00917
Epoch: 20295, Training Loss: 0.01040
Epoch: 20295, Training Loss: 0.00967
Epoch: 20295, Training Loss: 0.01187
Epoch: 20295, Training Loss: 0.00917
Epoch: 20296, Training Loss: 0.01040
Epoch: 20296, Training Loss: 0.00967
Epoch: 20296, Training Loss: 0.01187
Epoch: 20296, Training Loss: 0.00917
Epoch: 20297, Training Loss: 0.01040
Epoch: 20297, Training Loss: 0.00967
Epoch: 20297, Training Loss: 0.01187
Epoch: 20297, Training Loss: 0.00917
Epoch: 20298, Training Loss: 0.01040
Epoch: 20298, Training Loss: 0.00967
Epoch: 20298, Training Loss: 0.01187
Epoch: 20298, Training Loss: 0.00917
Epoch: 20299, Training Loss: 0.01040
Epoch: 20299, Training Loss: 0.00967
Epoch: 20299, Training Loss: 0.01187
Epoch: 20299, Training Loss: 0.00916
Epoch: 20300, Training Loss: 0.01040
Epoch: 20300, Training Loss: 0.00967
Epoch: 20300, Training Loss: 0.01187
Epoch: 20300, Training Loss: 0.00916
Epoch: 20301, Training Loss: 0.01040
Epoch: 20301, Training Loss: 0.00967
Epoch: 20301, Training Loss: 0.01187
Epoch: 20301, Training Loss: 0.00916
Epoch: 20302, Training Loss: 0.01040
Epoch: 20302, Training Loss: 0.00967
Epoch: 20302, Training Loss: 0.01187
Epoch: 20302, Training Loss: 0.00916
Epoch: 20303, Training Loss: 0.01040
Epoch: 20303, Training Loss: 0.00967
Epoch: 20303, Training Loss: 0.01187
Epoch: 20303, Training Loss: 0.00916
Epoch: 20304, Training Loss: 0.01040
Epoch: 20304, Training Loss: 0.00967
Epoch: 20304, Training Loss: 0.01187
Epoch: 20304, Training Loss: 0.00916
Epoch: 20305, Training Loss: 0.01040
Epoch: 20305, Training Loss: 0.00967
Epoch: 20305, Training Loss: 0.01187
Epoch: 20305, Training Loss: 0.00916
Epoch: 20306, Training Loss: 0.01040
Epoch: 20306, Training Loss: 0.00967
Epoch: 20306, Training Loss: 0.01187
Epoch: 20306, Training Loss: 0.00916
Epoch: 20307, Training Loss: 0.01040
Epoch: 20307, Training Loss: 0.00967
Epoch: 20307, Training Loss: 0.01187
Epoch: 20307, Training Loss: 0.00916
Epoch: 20308, Training Loss: 0.01040
Epoch: 20308, Training Loss: 0.00967
Epoch: 20308, Training Loss: 0.01187
Epoch: 20308, Training Loss: 0.00916
Epoch: 20309, Training Loss: 0.01040
Epoch: 20309, Training Loss: 0.00967
Epoch: 20309, Training Loss: 0.01187
Epoch: 20309, Training Loss: 0.00916
Epoch: 20310, Training Loss: 0.01039
Epoch: 20310, Training Loss: 0.00967
Epoch: 20310, Training Loss: 0.01187
Epoch: 20310, Training Loss: 0.00916
Epoch: 20311, Training Loss: 0.01039
Epoch: 20311, Training Loss: 0.00967
Epoch: 20311, Training Loss: 0.01187
Epoch: 20311, Training Loss: 0.00916
Epoch: 20312, Training Loss: 0.01039
Epoch: 20312, Training Loss: 0.00967
Epoch: 20312, Training Loss: 0.01187
Epoch: 20312, Training Loss: 0.00916
Epoch: 20313, Training Loss: 0.01039
Epoch: 20313, Training Loss: 0.00967
Epoch: 20313, Training Loss: 0.01187
Epoch: 20313, Training Loss: 0.00916
Epoch: 20314, Training Loss: 0.01039
Epoch: 20314, Training Loss: 0.00967
Epoch: 20314, Training Loss: 0.01187
Epoch: 20314, Training Loss: 0.00916
Epoch: 20315, Training Loss: 0.01039
Epoch: 20315, Training Loss: 0.00967
Epoch: 20315, Training Loss: 0.01187
Epoch: 20315, Training Loss: 0.00916
Epoch: 20316, Training Loss: 0.01039
Epoch: 20316, Training Loss: 0.00967
Epoch: 20316, Training Loss: 0.01187
Epoch: 20316, Training Loss: 0.00916
Epoch: 20317, Training Loss: 0.01039
Epoch: 20317, Training Loss: 0.00967
Epoch: 20317, Training Loss: 0.01186
Epoch: 20317, Training Loss: 0.00916
Epoch: 20318, Training Loss: 0.01039
Epoch: 20318, Training Loss: 0.00967
Epoch: 20318, Training Loss: 0.01186
Epoch: 20318, Training Loss: 0.00916
Epoch: 20319, Training Loss: 0.01039
Epoch: 20319, Training Loss: 0.00967
Epoch: 20319, Training Loss: 0.01186
Epoch: 20319, Training Loss: 0.00916
Epoch: 20320, Training Loss: 0.01039
Epoch: 20320, Training Loss: 0.00967
Epoch: 20320, Training Loss: 0.01186
Epoch: 20320, Training Loss: 0.00916
Epoch: 20321, Training Loss: 0.01039
Epoch: 20321, Training Loss: 0.00967
Epoch: 20321, Training Loss: 0.01186
Epoch: 20321, Training Loss: 0.00916
Epoch: 20322, Training Loss: 0.01039
Epoch: 20322, Training Loss: 0.00967
Epoch: 20322, Training Loss: 0.01186
Epoch: 20322, Training Loss: 0.00916
Epoch: 20323, Training Loss: 0.01039
Epoch: 20323, Training Loss: 0.00967
Epoch: 20323, Training Loss: 0.01186
Epoch: 20323, Training Loss: 0.00916
Epoch: 20324, Training Loss: 0.01039
Epoch: 20324, Training Loss: 0.00967
Epoch: 20324, Training Loss: 0.01186
Epoch: 20324, Training Loss: 0.00916
Epoch: 20325, Training Loss: 0.01039
Epoch: 20325, Training Loss: 0.00967
Epoch: 20325, Training Loss: 0.01186
Epoch: 20325, Training Loss: 0.00916
Epoch: 20326, Training Loss: 0.01039
Epoch: 20326, Training Loss: 0.00967
Epoch: 20326, Training Loss: 0.01186
Epoch: 20326, Training Loss: 0.00916
Epoch: 20327, Training Loss: 0.01039
Epoch: 20327, Training Loss: 0.00967
Epoch: 20327, Training Loss: 0.01186
Epoch: 20327, Training Loss: 0.00916
Epoch: 20328, Training Loss: 0.01039
Epoch: 20328, Training Loss: 0.00967
Epoch: 20328, Training Loss: 0.01186
Epoch: 20328, Training Loss: 0.00916
Epoch: 20329, Training Loss: 0.01039
Epoch: 20329, Training Loss: 0.00966
Epoch: 20329, Training Loss: 0.01186
Epoch: 20329, Training Loss: 0.00916
Epoch: 20330, Training Loss: 0.01039
Epoch: 20330, Training Loss: 0.00966
Epoch: 20330, Training Loss: 0.01186
Epoch: 20330, Training Loss: 0.00916
Epoch: 20331, Training Loss: 0.01039
Epoch: 20331, Training Loss: 0.00966
Epoch: 20331, Training Loss: 0.01186
Epoch: 20331, Training Loss: 0.00916
Epoch: 20332, Training Loss: 0.01039
Epoch: 20332, Training Loss: 0.00966
Epoch: 20332, Training Loss: 0.01186
Epoch: 20332, Training Loss: 0.00916
Epoch: 20333, Training Loss: 0.01039
Epoch: 20333, Training Loss: 0.00966
Epoch: 20333, Training Loss: 0.01186
Epoch: 20333, Training Loss: 0.00916
Epoch: 20334, Training Loss: 0.01039
Epoch: 20334, Training Loss: 0.00966
Epoch: 20334, Training Loss: 0.01186
Epoch: 20334, Training Loss: 0.00916
Epoch: 20335, Training Loss: 0.01039
Epoch: 20335, Training Loss: 0.00966
Epoch: 20335, Training Loss: 0.01186
Epoch: 20335, Training Loss: 0.00916
Epoch: 20336, Training Loss: 0.01039
Epoch: 20336, Training Loss: 0.00966
Epoch: 20336, Training Loss: 0.01186
Epoch: 20336, Training Loss: 0.00916
Epoch: 20337, Training Loss: 0.01039
Epoch: 20337, Training Loss: 0.00966
Epoch: 20337, Training Loss: 0.01186
Epoch: 20337, Training Loss: 0.00916
Epoch: 20338, Training Loss: 0.01039
Epoch: 20338, Training Loss: 0.00966
Epoch: 20338, Training Loss: 0.01186
Epoch: 20338, Training Loss: 0.00916
Epoch: 20339, Training Loss: 0.01039
Epoch: 20339, Training Loss: 0.00966
Epoch: 20339, Training Loss: 0.01186
Epoch: 20339, Training Loss: 0.00916
Epoch: 20340, Training Loss: 0.01039
Epoch: 20340, Training Loss: 0.00966
Epoch: 20340, Training Loss: 0.01186
Epoch: 20340, Training Loss: 0.00916
Epoch: 20341, Training Loss: 0.01039
Epoch: 20341, Training Loss: 0.00966
Epoch: 20341, Training Loss: 0.01186
Epoch: 20341, Training Loss: 0.00915
Epoch: 20342, Training Loss: 0.01039
Epoch: 20342, Training Loss: 0.00966
Epoch: 20342, Training Loss: 0.01186
Epoch: 20342, Training Loss: 0.00915
Epoch: 20343, Training Loss: 0.01039
Epoch: 20343, Training Loss: 0.00966
Epoch: 20343, Training Loss: 0.01186
Epoch: 20343, Training Loss: 0.00915
Epoch: 20344, Training Loss: 0.01039
Epoch: 20344, Training Loss: 0.00966
Epoch: 20344, Training Loss: 0.01186
Epoch: 20344, Training Loss: 0.00915
Epoch: 20345, Training Loss: 0.01039
Epoch: 20345, Training Loss: 0.00966
Epoch: 20345, Training Loss: 0.01186
Epoch: 20345, Training Loss: 0.00915
Epoch: 20346, Training Loss: 0.01038
Epoch: 20346, Training Loss: 0.00966
Epoch: 20346, Training Loss: 0.01186
Epoch: 20346, Training Loss: 0.00915
Epoch: 20347, Training Loss: 0.01038
Epoch: 20347, Training Loss: 0.00966
Epoch: 20347, Training Loss: 0.01186
Epoch: 20347, Training Loss: 0.00915
Epoch: 20348, Training Loss: 0.01038
Epoch: 20348, Training Loss: 0.00966
Epoch: 20348, Training Loss: 0.01186
Epoch: 20348, Training Loss: 0.00915
Epoch: 20349, Training Loss: 0.01038
Epoch: 20349, Training Loss: 0.00966
Epoch: 20349, Training Loss: 0.01185
Epoch: 20349, Training Loss: 0.00915
Epoch: 20350, Training Loss: 0.01038
Epoch: 20350, Training Loss: 0.00966
Epoch: 20350, Training Loss: 0.01185
Epoch: 20350, Training Loss: 0.00915
Epoch: 20351, Training Loss: 0.01038
Epoch: 20351, Training Loss: 0.00966
Epoch: 20351, Training Loss: 0.01185
Epoch: 20351, Training Loss: 0.00915
Epoch: 20352, Training Loss: 0.01038
Epoch: 20352, Training Loss: 0.00966
Epoch: 20352, Training Loss: 0.01185
Epoch: 20352, Training Loss: 0.00915
Epoch: 20353, Training Loss: 0.01038
Epoch: 20353, Training Loss: 0.00966
Epoch: 20353, Training Loss: 0.01185
Epoch: 20353, Training Loss: 0.00915
Epoch: 20354, Training Loss: 0.01038
Epoch: 20354, Training Loss: 0.00966
Epoch: 20354, Training Loss: 0.01185
Epoch: 20354, Training Loss: 0.00915
Epoch: 20355, Training Loss: 0.01038
Epoch: 20355, Training Loss: 0.00966
Epoch: 20355, Training Loss: 0.01185
Epoch: 20355, Training Loss: 0.00915
Epoch: 20356, Training Loss: 0.01038
Epoch: 20356, Training Loss: 0.00966
Epoch: 20356, Training Loss: 0.01185
Epoch: 20356, Training Loss: 0.00915
Epoch: 20357, Training Loss: 0.01038
Epoch: 20357, Training Loss: 0.00966
Epoch: 20357, Training Loss: 0.01185
Epoch: 20357, Training Loss: 0.00915
Epoch: 20358, Training Loss: 0.01038
Epoch: 20358, Training Loss: 0.00966
Epoch: 20358, Training Loss: 0.01185
Epoch: 20358, Training Loss: 0.00915
Epoch: 20359, Training Loss: 0.01038
Epoch: 20359, Training Loss: 0.00966
Epoch: 20359, Training Loss: 0.01185
Epoch: 20359, Training Loss: 0.00915
Epoch: 20360, Training Loss: 0.01038
Epoch: 20360, Training Loss: 0.00966
Epoch: 20360, Training Loss: 0.01185
Epoch: 20360, Training Loss: 0.00915
Epoch: 20361, Training Loss: 0.01038
Epoch: 20361, Training Loss: 0.00966
Epoch: 20361, Training Loss: 0.01185
Epoch: 20361, Training Loss: 0.00915
Epoch: 20362, Training Loss: 0.01038
Epoch: 20362, Training Loss: 0.00966
Epoch: 20362, Training Loss: 0.01185
Epoch: 20362, Training Loss: 0.00915
Epoch: 20363, Training Loss: 0.01038
Epoch: 20363, Training Loss: 0.00966
Epoch: 20363, Training Loss: 0.01185
Epoch: 20363, Training Loss: 0.00915
Epoch: 20364, Training Loss: 0.01038
Epoch: 20364, Training Loss: 0.00966
Epoch: 20364, Training Loss: 0.01185
Epoch: 20364, Training Loss: 0.00915
Epoch: 20365, Training Loss: 0.01038
Epoch: 20365, Training Loss: 0.00966
Epoch: 20365, Training Loss: 0.01185
Epoch: 20365, Training Loss: 0.00915
Epoch: 20366, Training Loss: 0.01038
Epoch: 20366, Training Loss: 0.00966
Epoch: 20366, Training Loss: 0.01185
Epoch: 20366, Training Loss: 0.00915
Epoch: 20367, Training Loss: 0.01038
Epoch: 20367, Training Loss: 0.00966
Epoch: 20367, Training Loss: 0.01185
Epoch: 20367, Training Loss: 0.00915
Epoch: 20368, Training Loss: 0.01038
Epoch: 20368, Training Loss: 0.00966
Epoch: 20368, Training Loss: 0.01185
Epoch: 20368, Training Loss: 0.00915
... [output truncated: one loss line per XOR training pattern per epoch; over epochs 20368-20611 the four losses decrease only slightly, from roughly 0.01038/0.00966/0.01185/0.00915 to 0.01031/0.00959/0.01177/0.00909] ...
Epoch: 20610, Training Loss: 0.01031
Epoch: 20610, Training Loss: 0.00959
Epoch: 20610, Training Loss: 0.01177
Epoch: 20610, Training Loss: 0.00909
Epoch: 20611, Training Loss: 0.01031
Epoch: 20611, Training Loss: 0.00959
Epoch: 20611, Training Loss: 0.01177
Epoch: 20611, Training Loss: 0.00909
Epoch: 20612, Training Loss: 0.01031
Epoch: 20612, Training Loss: 0.00959
Epoch: 20612, Training Loss: 0.01177
Epoch: 20612, Training Loss: 0.00909
Epoch: 20613, Training Loss: 0.01031
Epoch: 20613, Training Loss: 0.00959
Epoch: 20613, Training Loss: 0.01177
Epoch: 20613, Training Loss: 0.00909
Epoch: 20614, Training Loss: 0.01031
Epoch: 20614, Training Loss: 0.00959
Epoch: 20614, Training Loss: 0.01177
Epoch: 20614, Training Loss: 0.00909
Epoch: 20615, Training Loss: 0.01031
Epoch: 20615, Training Loss: 0.00959
Epoch: 20615, Training Loss: 0.01177
Epoch: 20615, Training Loss: 0.00909
Epoch: 20616, Training Loss: 0.01031
Epoch: 20616, Training Loss: 0.00959
Epoch: 20616, Training Loss: 0.01177
Epoch: 20616, Training Loss: 0.00909
Epoch: 20617, Training Loss: 0.01031
Epoch: 20617, Training Loss: 0.00959
Epoch: 20617, Training Loss: 0.01177
Epoch: 20617, Training Loss: 0.00909
Epoch: 20618, Training Loss: 0.01031
Epoch: 20618, Training Loss: 0.00959
Epoch: 20618, Training Loss: 0.01177
Epoch: 20618, Training Loss: 0.00909
Epoch: 20619, Training Loss: 0.01031
Epoch: 20619, Training Loss: 0.00959
Epoch: 20619, Training Loss: 0.01177
Epoch: 20619, Training Loss: 0.00909
Epoch: 20620, Training Loss: 0.01031
Epoch: 20620, Training Loss: 0.00959
Epoch: 20620, Training Loss: 0.01177
Epoch: 20620, Training Loss: 0.00909
Epoch: 20621, Training Loss: 0.01031
Epoch: 20621, Training Loss: 0.00959
Epoch: 20621, Training Loss: 0.01177
Epoch: 20621, Training Loss: 0.00909
Epoch: 20622, Training Loss: 0.01031
Epoch: 20622, Training Loss: 0.00959
Epoch: 20622, Training Loss: 0.01177
Epoch: 20622, Training Loss: 0.00909
Epoch: 20623, Training Loss: 0.01031
Epoch: 20623, Training Loss: 0.00959
Epoch: 20623, Training Loss: 0.01177
Epoch: 20623, Training Loss: 0.00909
Epoch: 20624, Training Loss: 0.01031
Epoch: 20624, Training Loss: 0.00959
Epoch: 20624, Training Loss: 0.01177
Epoch: 20624, Training Loss: 0.00909
Epoch: 20625, Training Loss: 0.01031
Epoch: 20625, Training Loss: 0.00959
Epoch: 20625, Training Loss: 0.01177
Epoch: 20625, Training Loss: 0.00909
Epoch: 20626, Training Loss: 0.01031
Epoch: 20626, Training Loss: 0.00959
Epoch: 20626, Training Loss: 0.01177
Epoch: 20626, Training Loss: 0.00909
Epoch: 20627, Training Loss: 0.01031
Epoch: 20627, Training Loss: 0.00959
Epoch: 20627, Training Loss: 0.01177
Epoch: 20627, Training Loss: 0.00909
Epoch: 20628, Training Loss: 0.01031
Epoch: 20628, Training Loss: 0.00959
Epoch: 20628, Training Loss: 0.01177
Epoch: 20628, Training Loss: 0.00909
Epoch: 20629, Training Loss: 0.01031
Epoch: 20629, Training Loss: 0.00959
Epoch: 20629, Training Loss: 0.01177
Epoch: 20629, Training Loss: 0.00909
Epoch: 20630, Training Loss: 0.01031
Epoch: 20630, Training Loss: 0.00959
Epoch: 20630, Training Loss: 0.01177
Epoch: 20630, Training Loss: 0.00909
Epoch: 20631, Training Loss: 0.01031
Epoch: 20631, Training Loss: 0.00959
Epoch: 20631, Training Loss: 0.01177
Epoch: 20631, Training Loss: 0.00909
Epoch: 20632, Training Loss: 0.01031
Epoch: 20632, Training Loss: 0.00959
Epoch: 20632, Training Loss: 0.01177
Epoch: 20632, Training Loss: 0.00909
Epoch: 20633, Training Loss: 0.01031
Epoch: 20633, Training Loss: 0.00959
Epoch: 20633, Training Loss: 0.01177
Epoch: 20633, Training Loss: 0.00909
Epoch: 20634, Training Loss: 0.01031
Epoch: 20634, Training Loss: 0.00959
Epoch: 20634, Training Loss: 0.01177
Epoch: 20634, Training Loss: 0.00909
Epoch: 20635, Training Loss: 0.01031
Epoch: 20635, Training Loss: 0.00959
Epoch: 20635, Training Loss: 0.01177
Epoch: 20635, Training Loss: 0.00909
Epoch: 20636, Training Loss: 0.01031
Epoch: 20636, Training Loss: 0.00959
Epoch: 20636, Training Loss: 0.01177
Epoch: 20636, Training Loss: 0.00909
Epoch: 20637, Training Loss: 0.01031
Epoch: 20637, Training Loss: 0.00959
Epoch: 20637, Training Loss: 0.01177
Epoch: 20637, Training Loss: 0.00908
Epoch: 20638, Training Loss: 0.01031
Epoch: 20638, Training Loss: 0.00959
Epoch: 20638, Training Loss: 0.01176
Epoch: 20638, Training Loss: 0.00908
Epoch: 20639, Training Loss: 0.01030
Epoch: 20639, Training Loss: 0.00959
Epoch: 20639, Training Loss: 0.01176
Epoch: 20639, Training Loss: 0.00908
Epoch: 20640, Training Loss: 0.01030
Epoch: 20640, Training Loss: 0.00959
Epoch: 20640, Training Loss: 0.01176
Epoch: 20640, Training Loss: 0.00908
Epoch: 20641, Training Loss: 0.01030
Epoch: 20641, Training Loss: 0.00959
Epoch: 20641, Training Loss: 0.01176
Epoch: 20641, Training Loss: 0.00908
Epoch: 20642, Training Loss: 0.01030
Epoch: 20642, Training Loss: 0.00959
Epoch: 20642, Training Loss: 0.01176
Epoch: 20642, Training Loss: 0.00908
Epoch: 20643, Training Loss: 0.01030
Epoch: 20643, Training Loss: 0.00959
Epoch: 20643, Training Loss: 0.01176
Epoch: 20643, Training Loss: 0.00908
Epoch: 20644, Training Loss: 0.01030
Epoch: 20644, Training Loss: 0.00959
Epoch: 20644, Training Loss: 0.01176
Epoch: 20644, Training Loss: 0.00908
Epoch: 20645, Training Loss: 0.01030
Epoch: 20645, Training Loss: 0.00959
Epoch: 20645, Training Loss: 0.01176
Epoch: 20645, Training Loss: 0.00908
Epoch: 20646, Training Loss: 0.01030
Epoch: 20646, Training Loss: 0.00959
Epoch: 20646, Training Loss: 0.01176
Epoch: 20646, Training Loss: 0.00908
Epoch: 20647, Training Loss: 0.01030
Epoch: 20647, Training Loss: 0.00959
Epoch: 20647, Training Loss: 0.01176
Epoch: 20647, Training Loss: 0.00908
Epoch: 20648, Training Loss: 0.01030
Epoch: 20648, Training Loss: 0.00959
Epoch: 20648, Training Loss: 0.01176
Epoch: 20648, Training Loss: 0.00908
Epoch: 20649, Training Loss: 0.01030
Epoch: 20649, Training Loss: 0.00959
Epoch: 20649, Training Loss: 0.01176
Epoch: 20649, Training Loss: 0.00908
Epoch: 20650, Training Loss: 0.01030
Epoch: 20650, Training Loss: 0.00959
Epoch: 20650, Training Loss: 0.01176
Epoch: 20650, Training Loss: 0.00908
Epoch: 20651, Training Loss: 0.01030
Epoch: 20651, Training Loss: 0.00958
Epoch: 20651, Training Loss: 0.01176
Epoch: 20651, Training Loss: 0.00908
Epoch: 20652, Training Loss: 0.01030
Epoch: 20652, Training Loss: 0.00958
Epoch: 20652, Training Loss: 0.01176
Epoch: 20652, Training Loss: 0.00908
Epoch: 20653, Training Loss: 0.01030
Epoch: 20653, Training Loss: 0.00958
Epoch: 20653, Training Loss: 0.01176
Epoch: 20653, Training Loss: 0.00908
Epoch: 20654, Training Loss: 0.01030
Epoch: 20654, Training Loss: 0.00958
Epoch: 20654, Training Loss: 0.01176
Epoch: 20654, Training Loss: 0.00908
Epoch: 20655, Training Loss: 0.01030
Epoch: 20655, Training Loss: 0.00958
Epoch: 20655, Training Loss: 0.01176
Epoch: 20655, Training Loss: 0.00908
Epoch: 20656, Training Loss: 0.01030
Epoch: 20656, Training Loss: 0.00958
Epoch: 20656, Training Loss: 0.01176
Epoch: 20656, Training Loss: 0.00908
Epoch: 20657, Training Loss: 0.01030
Epoch: 20657, Training Loss: 0.00958
Epoch: 20657, Training Loss: 0.01176
Epoch: 20657, Training Loss: 0.00908
Epoch: 20658, Training Loss: 0.01030
Epoch: 20658, Training Loss: 0.00958
Epoch: 20658, Training Loss: 0.01176
Epoch: 20658, Training Loss: 0.00908
Epoch: 20659, Training Loss: 0.01030
Epoch: 20659, Training Loss: 0.00958
Epoch: 20659, Training Loss: 0.01176
Epoch: 20659, Training Loss: 0.00908
Epoch: 20660, Training Loss: 0.01030
Epoch: 20660, Training Loss: 0.00958
Epoch: 20660, Training Loss: 0.01176
Epoch: 20660, Training Loss: 0.00908
Epoch: 20661, Training Loss: 0.01030
Epoch: 20661, Training Loss: 0.00958
Epoch: 20661, Training Loss: 0.01176
Epoch: 20661, Training Loss: 0.00908
Epoch: 20662, Training Loss: 0.01030
Epoch: 20662, Training Loss: 0.00958
Epoch: 20662, Training Loss: 0.01176
Epoch: 20662, Training Loss: 0.00908
Epoch: 20663, Training Loss: 0.01030
Epoch: 20663, Training Loss: 0.00958
Epoch: 20663, Training Loss: 0.01176
Epoch: 20663, Training Loss: 0.00908
Epoch: 20664, Training Loss: 0.01030
Epoch: 20664, Training Loss: 0.00958
Epoch: 20664, Training Loss: 0.01176
Epoch: 20664, Training Loss: 0.00908
Epoch: 20665, Training Loss: 0.01030
Epoch: 20665, Training Loss: 0.00958
Epoch: 20665, Training Loss: 0.01176
Epoch: 20665, Training Loss: 0.00908
Epoch: 20666, Training Loss: 0.01030
Epoch: 20666, Training Loss: 0.00958
Epoch: 20666, Training Loss: 0.01176
Epoch: 20666, Training Loss: 0.00908
Epoch: 20667, Training Loss: 0.01030
Epoch: 20667, Training Loss: 0.00958
Epoch: 20667, Training Loss: 0.01176
Epoch: 20667, Training Loss: 0.00908
Epoch: 20668, Training Loss: 0.01030
Epoch: 20668, Training Loss: 0.00958
Epoch: 20668, Training Loss: 0.01176
Epoch: 20668, Training Loss: 0.00908
Epoch: 20669, Training Loss: 0.01030
Epoch: 20669, Training Loss: 0.00958
Epoch: 20669, Training Loss: 0.01176
Epoch: 20669, Training Loss: 0.00908
Epoch: 20670, Training Loss: 0.01030
Epoch: 20670, Training Loss: 0.00958
Epoch: 20670, Training Loss: 0.01176
Epoch: 20670, Training Loss: 0.00908
Epoch: 20671, Training Loss: 0.01030
Epoch: 20671, Training Loss: 0.00958
Epoch: 20671, Training Loss: 0.01175
Epoch: 20671, Training Loss: 0.00908
Epoch: 20672, Training Loss: 0.01030
Epoch: 20672, Training Loss: 0.00958
Epoch: 20672, Training Loss: 0.01175
Epoch: 20672, Training Loss: 0.00908
Epoch: 20673, Training Loss: 0.01030
Epoch: 20673, Training Loss: 0.00958
Epoch: 20673, Training Loss: 0.01175
Epoch: 20673, Training Loss: 0.00908
Epoch: 20674, Training Loss: 0.01030
Epoch: 20674, Training Loss: 0.00958
Epoch: 20674, Training Loss: 0.01175
Epoch: 20674, Training Loss: 0.00908
Epoch: 20675, Training Loss: 0.01030
Epoch: 20675, Training Loss: 0.00958
Epoch: 20675, Training Loss: 0.01175
Epoch: 20675, Training Loss: 0.00908
Epoch: 20676, Training Loss: 0.01030
Epoch: 20676, Training Loss: 0.00958
Epoch: 20676, Training Loss: 0.01175
Epoch: 20676, Training Loss: 0.00908
Epoch: 20677, Training Loss: 0.01029
Epoch: 20677, Training Loss: 0.00958
Epoch: 20677, Training Loss: 0.01175
Epoch: 20677, Training Loss: 0.00908
Epoch: 20678, Training Loss: 0.01029
Epoch: 20678, Training Loss: 0.00958
Epoch: 20678, Training Loss: 0.01175
Epoch: 20678, Training Loss: 0.00908
Epoch: 20679, Training Loss: 0.01029
Epoch: 20679, Training Loss: 0.00958
Epoch: 20679, Training Loss: 0.01175
Epoch: 20679, Training Loss: 0.00908
Epoch: 20680, Training Loss: 0.01029
Epoch: 20680, Training Loss: 0.00958
Epoch: 20680, Training Loss: 0.01175
Epoch: 20680, Training Loss: 0.00907
Epoch: 20681, Training Loss: 0.01029
Epoch: 20681, Training Loss: 0.00958
Epoch: 20681, Training Loss: 0.01175
Epoch: 20681, Training Loss: 0.00907
Epoch: 20682, Training Loss: 0.01029
Epoch: 20682, Training Loss: 0.00958
Epoch: 20682, Training Loss: 0.01175
Epoch: 20682, Training Loss: 0.00907
Epoch: 20683, Training Loss: 0.01029
Epoch: 20683, Training Loss: 0.00958
Epoch: 20683, Training Loss: 0.01175
Epoch: 20683, Training Loss: 0.00907
Epoch: 20684, Training Loss: 0.01029
Epoch: 20684, Training Loss: 0.00958
Epoch: 20684, Training Loss: 0.01175
Epoch: 20684, Training Loss: 0.00907
Epoch: 20685, Training Loss: 0.01029
Epoch: 20685, Training Loss: 0.00958
Epoch: 20685, Training Loss: 0.01175
Epoch: 20685, Training Loss: 0.00907
Epoch: 20686, Training Loss: 0.01029
Epoch: 20686, Training Loss: 0.00958
Epoch: 20686, Training Loss: 0.01175
Epoch: 20686, Training Loss: 0.00907
Epoch: 20687, Training Loss: 0.01029
Epoch: 20687, Training Loss: 0.00958
Epoch: 20687, Training Loss: 0.01175
Epoch: 20687, Training Loss: 0.00907
Epoch: 20688, Training Loss: 0.01029
Epoch: 20688, Training Loss: 0.00958
Epoch: 20688, Training Loss: 0.01175
Epoch: 20688, Training Loss: 0.00907
Epoch: 20689, Training Loss: 0.01029
Epoch: 20689, Training Loss: 0.00958
Epoch: 20689, Training Loss: 0.01175
Epoch: 20689, Training Loss: 0.00907
Epoch: 20690, Training Loss: 0.01029
Epoch: 20690, Training Loss: 0.00958
Epoch: 20690, Training Loss: 0.01175
Epoch: 20690, Training Loss: 0.00907
Epoch: 20691, Training Loss: 0.01029
Epoch: 20691, Training Loss: 0.00958
Epoch: 20691, Training Loss: 0.01175
Epoch: 20691, Training Loss: 0.00907
Epoch: 20692, Training Loss: 0.01029
Epoch: 20692, Training Loss: 0.00957
Epoch: 20692, Training Loss: 0.01175
Epoch: 20692, Training Loss: 0.00907
Epoch: 20693, Training Loss: 0.01029
Epoch: 20693, Training Loss: 0.00957
Epoch: 20693, Training Loss: 0.01175
Epoch: 20693, Training Loss: 0.00907
Epoch: 20694, Training Loss: 0.01029
Epoch: 20694, Training Loss: 0.00957
Epoch: 20694, Training Loss: 0.01175
Epoch: 20694, Training Loss: 0.00907
Epoch: 20695, Training Loss: 0.01029
Epoch: 20695, Training Loss: 0.00957
Epoch: 20695, Training Loss: 0.01175
Epoch: 20695, Training Loss: 0.00907
Epoch: 20696, Training Loss: 0.01029
Epoch: 20696, Training Loss: 0.00957
Epoch: 20696, Training Loss: 0.01175
Epoch: 20696, Training Loss: 0.00907
Epoch: 20697, Training Loss: 0.01029
Epoch: 20697, Training Loss: 0.00957
Epoch: 20697, Training Loss: 0.01175
Epoch: 20697, Training Loss: 0.00907
Epoch: 20698, Training Loss: 0.01029
Epoch: 20698, Training Loss: 0.00957
Epoch: 20698, Training Loss: 0.01175
Epoch: 20698, Training Loss: 0.00907
Epoch: 20699, Training Loss: 0.01029
Epoch: 20699, Training Loss: 0.00957
Epoch: 20699, Training Loss: 0.01175
Epoch: 20699, Training Loss: 0.00907
Epoch: 20700, Training Loss: 0.01029
Epoch: 20700, Training Loss: 0.00957
Epoch: 20700, Training Loss: 0.01175
Epoch: 20700, Training Loss: 0.00907
Epoch: 20701, Training Loss: 0.01029
Epoch: 20701, Training Loss: 0.00957
Epoch: 20701, Training Loss: 0.01175
Epoch: 20701, Training Loss: 0.00907
Epoch: 20702, Training Loss: 0.01029
Epoch: 20702, Training Loss: 0.00957
Epoch: 20702, Training Loss: 0.01175
Epoch: 20702, Training Loss: 0.00907
Epoch: 20703, Training Loss: 0.01029
Epoch: 20703, Training Loss: 0.00957
Epoch: 20703, Training Loss: 0.01174
Epoch: 20703, Training Loss: 0.00907
Epoch: 20704, Training Loss: 0.01029
Epoch: 20704, Training Loss: 0.00957
Epoch: 20704, Training Loss: 0.01174
Epoch: 20704, Training Loss: 0.00907
Epoch: 20705, Training Loss: 0.01029
Epoch: 20705, Training Loss: 0.00957
Epoch: 20705, Training Loss: 0.01174
Epoch: 20705, Training Loss: 0.00907
Epoch: 20706, Training Loss: 0.01029
Epoch: 20706, Training Loss: 0.00957
Epoch: 20706, Training Loss: 0.01174
Epoch: 20706, Training Loss: 0.00907
Epoch: 20707, Training Loss: 0.01029
Epoch: 20707, Training Loss: 0.00957
Epoch: 20707, Training Loss: 0.01174
Epoch: 20707, Training Loss: 0.00907
Epoch: 20708, Training Loss: 0.01029
Epoch: 20708, Training Loss: 0.00957
Epoch: 20708, Training Loss: 0.01174
Epoch: 20708, Training Loss: 0.00907
Epoch: 20709, Training Loss: 0.01029
Epoch: 20709, Training Loss: 0.00957
Epoch: 20709, Training Loss: 0.01174
Epoch: 20709, Training Loss: 0.00907
Epoch: 20710, Training Loss: 0.01029
Epoch: 20710, Training Loss: 0.00957
Epoch: 20710, Training Loss: 0.01174
Epoch: 20710, Training Loss: 0.00907
Epoch: 20711, Training Loss: 0.01029
Epoch: 20711, Training Loss: 0.00957
Epoch: 20711, Training Loss: 0.01174
Epoch: 20711, Training Loss: 0.00907
Epoch: 20712, Training Loss: 0.01029
Epoch: 20712, Training Loss: 0.00957
Epoch: 20712, Training Loss: 0.01174
Epoch: 20712, Training Loss: 0.00907
Epoch: 20713, Training Loss: 0.01029
Epoch: 20713, Training Loss: 0.00957
Epoch: 20713, Training Loss: 0.01174
Epoch: 20713, Training Loss: 0.00907
Epoch: 20714, Training Loss: 0.01028
Epoch: 20714, Training Loss: 0.00957
Epoch: 20714, Training Loss: 0.01174
Epoch: 20714, Training Loss: 0.00907
Epoch: 20715, Training Loss: 0.01028
Epoch: 20715, Training Loss: 0.00957
Epoch: 20715, Training Loss: 0.01174
Epoch: 20715, Training Loss: 0.00907
Epoch: 20716, Training Loss: 0.01028
Epoch: 20716, Training Loss: 0.00957
Epoch: 20716, Training Loss: 0.01174
Epoch: 20716, Training Loss: 0.00907
Epoch: 20717, Training Loss: 0.01028
Epoch: 20717, Training Loss: 0.00957
Epoch: 20717, Training Loss: 0.01174
Epoch: 20717, Training Loss: 0.00907
Epoch: 20718, Training Loss: 0.01028
Epoch: 20718, Training Loss: 0.00957
Epoch: 20718, Training Loss: 0.01174
Epoch: 20718, Training Loss: 0.00907
Epoch: 20719, Training Loss: 0.01028
Epoch: 20719, Training Loss: 0.00957
Epoch: 20719, Training Loss: 0.01174
Epoch: 20719, Training Loss: 0.00907
Epoch: 20720, Training Loss: 0.01028
Epoch: 20720, Training Loss: 0.00957
Epoch: 20720, Training Loss: 0.01174
Epoch: 20720, Training Loss: 0.00907
Epoch: 20721, Training Loss: 0.01028
Epoch: 20721, Training Loss: 0.00957
Epoch: 20721, Training Loss: 0.01174
Epoch: 20721, Training Loss: 0.00907
Epoch: 20722, Training Loss: 0.01028
Epoch: 20722, Training Loss: 0.00957
Epoch: 20722, Training Loss: 0.01174
Epoch: 20722, Training Loss: 0.00907
Epoch: 20723, Training Loss: 0.01028
Epoch: 20723, Training Loss: 0.00957
Epoch: 20723, Training Loss: 0.01174
Epoch: 20723, Training Loss: 0.00906
Epoch: 20724, Training Loss: 0.01028
Epoch: 20724, Training Loss: 0.00957
Epoch: 20724, Training Loss: 0.01174
Epoch: 20724, Training Loss: 0.00906
Epoch: 20725, Training Loss: 0.01028
Epoch: 20725, Training Loss: 0.00957
Epoch: 20725, Training Loss: 0.01174
Epoch: 20725, Training Loss: 0.00906
Epoch: 20726, Training Loss: 0.01028
Epoch: 20726, Training Loss: 0.00957
Epoch: 20726, Training Loss: 0.01174
Epoch: 20726, Training Loss: 0.00906
Epoch: 20727, Training Loss: 0.01028
Epoch: 20727, Training Loss: 0.00957
Epoch: 20727, Training Loss: 0.01174
Epoch: 20727, Training Loss: 0.00906
Epoch: 20728, Training Loss: 0.01028
Epoch: 20728, Training Loss: 0.00957
Epoch: 20728, Training Loss: 0.01174
Epoch: 20728, Training Loss: 0.00906
Epoch: 20729, Training Loss: 0.01028
Epoch: 20729, Training Loss: 0.00957
Epoch: 20729, Training Loss: 0.01174
Epoch: 20729, Training Loss: 0.00906
Epoch: 20730, Training Loss: 0.01028
Epoch: 20730, Training Loss: 0.00957
Epoch: 20730, Training Loss: 0.01174
Epoch: 20730, Training Loss: 0.00906
Epoch: 20731, Training Loss: 0.01028
Epoch: 20731, Training Loss: 0.00957
Epoch: 20731, Training Loss: 0.01174
Epoch: 20731, Training Loss: 0.00906
Epoch: 20732, Training Loss: 0.01028
Epoch: 20732, Training Loss: 0.00956
Epoch: 20732, Training Loss: 0.01174
Epoch: 20732, Training Loss: 0.00906
Epoch: 20733, Training Loss: 0.01028
Epoch: 20733, Training Loss: 0.00956
Epoch: 20733, Training Loss: 0.01174
Epoch: 20733, Training Loss: 0.00906
Epoch: 20734, Training Loss: 0.01028
Epoch: 20734, Training Loss: 0.00956
Epoch: 20734, Training Loss: 0.01174
Epoch: 20734, Training Loss: 0.00906
Epoch: 20735, Training Loss: 0.01028
Epoch: 20735, Training Loss: 0.00956
Epoch: 20735, Training Loss: 0.01174
Epoch: 20735, Training Loss: 0.00906
Epoch: 20736, Training Loss: 0.01028
Epoch: 20736, Training Loss: 0.00956
Epoch: 20736, Training Loss: 0.01173
Epoch: 20736, Training Loss: 0.00906
Epoch: 20737, Training Loss: 0.01028
Epoch: 20737, Training Loss: 0.00956
Epoch: 20737, Training Loss: 0.01173
Epoch: 20737, Training Loss: 0.00906
Epoch: 20738, Training Loss: 0.01028
Epoch: 20738, Training Loss: 0.00956
Epoch: 20738, Training Loss: 0.01173
Epoch: 20738, Training Loss: 0.00906
Epoch: 20739, Training Loss: 0.01028
Epoch: 20739, Training Loss: 0.00956
Epoch: 20739, Training Loss: 0.01173
Epoch: 20739, Training Loss: 0.00906
Epoch: 20740, Training Loss: 0.01028
Epoch: 20740, Training Loss: 0.00956
Epoch: 20740, Training Loss: 0.01173
Epoch: 20740, Training Loss: 0.00906
Epoch: 20741, Training Loss: 0.01028
Epoch: 20741, Training Loss: 0.00956
Epoch: 20741, Training Loss: 0.01173
Epoch: 20741, Training Loss: 0.00906
Epoch: 20742, Training Loss: 0.01028
Epoch: 20742, Training Loss: 0.00956
Epoch: 20742, Training Loss: 0.01173
Epoch: 20742, Training Loss: 0.00906
Epoch: 20743, Training Loss: 0.01028
Epoch: 20743, Training Loss: 0.00956
Epoch: 20743, Training Loss: 0.01173
Epoch: 20743, Training Loss: 0.00906
Epoch: 20744, Training Loss: 0.01028
Epoch: 20744, Training Loss: 0.00956
Epoch: 20744, Training Loss: 0.01173
Epoch: 20744, Training Loss: 0.00906
Epoch: 20745, Training Loss: 0.01028
Epoch: 20745, Training Loss: 0.00956
Epoch: 20745, Training Loss: 0.01173
Epoch: 20745, Training Loss: 0.00906
Epoch: 20746, Training Loss: 0.01028
Epoch: 20746, Training Loss: 0.00956
Epoch: 20746, Training Loss: 0.01173
Epoch: 20746, Training Loss: 0.00906
Epoch: 20747, Training Loss: 0.01028
Epoch: 20747, Training Loss: 0.00956
Epoch: 20747, Training Loss: 0.01173
Epoch: 20747, Training Loss: 0.00906
Epoch: 20748, Training Loss: 0.01028
Epoch: 20748, Training Loss: 0.00956
Epoch: 20748, Training Loss: 0.01173
Epoch: 20748, Training Loss: 0.00906
Epoch: 20749, Training Loss: 0.01028
Epoch: 20749, Training Loss: 0.00956
Epoch: 20749, Training Loss: 0.01173
Epoch: 20749, Training Loss: 0.00906
Epoch: 20750, Training Loss: 0.01028
Epoch: 20750, Training Loss: 0.00956
Epoch: 20750, Training Loss: 0.01173
Epoch: 20750, Training Loss: 0.00906
Epoch: 20751, Training Loss: 0.01027
Epoch: 20751, Training Loss: 0.00956
Epoch: 20751, Training Loss: 0.01173
Epoch: 20751, Training Loss: 0.00906
Epoch: 20752, Training Loss: 0.01027
Epoch: 20752, Training Loss: 0.00956
Epoch: 20752, Training Loss: 0.01173
Epoch: 20752, Training Loss: 0.00906
Epoch: 20753, Training Loss: 0.01027
Epoch: 20753, Training Loss: 0.00956
Epoch: 20753, Training Loss: 0.01173
Epoch: 20753, Training Loss: 0.00906
Epoch: 20754, Training Loss: 0.01027
Epoch: 20754, Training Loss: 0.00956
Epoch: 20754, Training Loss: 0.01173
Epoch: 20754, Training Loss: 0.00906
Epoch: 20755, Training Loss: 0.01027
Epoch: 20755, Training Loss: 0.00956
Epoch: 20755, Training Loss: 0.01173
Epoch: 20755, Training Loss: 0.00906
Epoch: 20756, Training Loss: 0.01027
Epoch: 20756, Training Loss: 0.00956
Epoch: 20756, Training Loss: 0.01173
Epoch: 20756, Training Loss: 0.00906
Epoch: 20757, Training Loss: 0.01027
Epoch: 20757, Training Loss: 0.00956
Epoch: 20757, Training Loss: 0.01173
Epoch: 20757, Training Loss: 0.00906
Epoch: 20758, Training Loss: 0.01027
Epoch: 20758, Training Loss: 0.00956
Epoch: 20758, Training Loss: 0.01173
Epoch: 20758, Training Loss: 0.00906
Epoch: 20759, Training Loss: 0.01027
Epoch: 20759, Training Loss: 0.00956
Epoch: 20759, Training Loss: 0.01173
Epoch: 20759, Training Loss: 0.00906
Epoch: 20760, Training Loss: 0.01027
Epoch: 20760, Training Loss: 0.00956
Epoch: 20760, Training Loss: 0.01173
Epoch: 20760, Training Loss: 0.00906
Epoch: 20761, Training Loss: 0.01027
Epoch: 20761, Training Loss: 0.00956
Epoch: 20761, Training Loss: 0.01173
Epoch: 20761, Training Loss: 0.00906
Epoch: 20762, Training Loss: 0.01027
Epoch: 20762, Training Loss: 0.00956
Epoch: 20762, Training Loss: 0.01173
Epoch: 20762, Training Loss: 0.00906
Epoch: 20763, Training Loss: 0.01027
Epoch: 20763, Training Loss: 0.00956
Epoch: 20763, Training Loss: 0.01173
Epoch: 20763, Training Loss: 0.00906
Epoch: 20764, Training Loss: 0.01027
Epoch: 20764, Training Loss: 0.00956
Epoch: 20764, Training Loss: 0.01173
Epoch: 20764, Training Loss: 0.00906
Epoch: 20765, Training Loss: 0.01027
Epoch: 20765, Training Loss: 0.00956
Epoch: 20765, Training Loss: 0.01173
Epoch: 20765, Training Loss: 0.00906
Epoch: 20766, Training Loss: 0.01027
Epoch: 20766, Training Loss: 0.00956
Epoch: 20766, Training Loss: 0.01173
Epoch: 20766, Training Loss: 0.00905
Epoch: 20767, Training Loss: 0.01027
Epoch: 20767, Training Loss: 0.00956
Epoch: 20767, Training Loss: 0.01173
Epoch: 20767, Training Loss: 0.00905
Epoch: 20768, Training Loss: 0.01027
Epoch: 20768, Training Loss: 0.00956
Epoch: 20768, Training Loss: 0.01173
Epoch: 20768, Training Loss: 0.00905
Epoch: 20769, Training Loss: 0.01027
Epoch: 20769, Training Loss: 0.00956
Epoch: 20769, Training Loss: 0.01172
Epoch: 20769, Training Loss: 0.00905
Epoch: 20770, Training Loss: 0.01027
Epoch: 20770, Training Loss: 0.00956
Epoch: 20770, Training Loss: 0.01172
Epoch: 20770, Training Loss: 0.00905
Epoch: 20771, Training Loss: 0.01027
Epoch: 20771, Training Loss: 0.00956
Epoch: 20771, Training Loss: 0.01172
Epoch: 20771, Training Loss: 0.00905
Epoch: 20772, Training Loss: 0.01027
Epoch: 20772, Training Loss: 0.00956
Epoch: 20772, Training Loss: 0.01172
Epoch: 20772, Training Loss: 0.00905
Epoch: 20773, Training Loss: 0.01027
Epoch: 20773, Training Loss: 0.00955
Epoch: 20773, Training Loss: 0.01172
Epoch: 20773, Training Loss: 0.00905
Epoch: 20774, Training Loss: 0.01027
Epoch: 20774, Training Loss: 0.00955
Epoch: 20774, Training Loss: 0.01172
Epoch: 20774, Training Loss: 0.00905
Epoch: 20775, Training Loss: 0.01027
Epoch: 20775, Training Loss: 0.00955
Epoch: 20775, Training Loss: 0.01172
Epoch: 20775, Training Loss: 0.00905
Epoch: 20776, Training Loss: 0.01027
Epoch: 20776, Training Loss: 0.00955
Epoch: 20776, Training Loss: 0.01172
Epoch: 20776, Training Loss: 0.00905
Epoch: 20777, Training Loss: 0.01027
Epoch: 20777, Training Loss: 0.00955
Epoch: 20777, Training Loss: 0.01172
Epoch: 20777, Training Loss: 0.00905
Epoch: 20778, Training Loss: 0.01027
Epoch: 20778, Training Loss: 0.00955
Epoch: 20778, Training Loss: 0.01172
Epoch: 20778, Training Loss: 0.00905
Epoch: 20779, Training Loss: 0.01027
Epoch: 20779, Training Loss: 0.00955
Epoch: 20779, Training Loss: 0.01172
Epoch: 20779, Training Loss: 0.00905
Epoch: 20780, Training Loss: 0.01027
Epoch: 20780, Training Loss: 0.00955
Epoch: 20780, Training Loss: 0.01172
Epoch: 20780, Training Loss: 0.00905
Epoch: 20781, Training Loss: 0.01027
Epoch: 20781, Training Loss: 0.00955
Epoch: 20781, Training Loss: 0.01172
Epoch: 20781, Training Loss: 0.00905
Epoch: 20782, Training Loss: 0.01027
Epoch: 20782, Training Loss: 0.00955
Epoch: 20782, Training Loss: 0.01172
Epoch: 20782, Training Loss: 0.00905
Epoch: 20783, Training Loss: 0.01027
Epoch: 20783, Training Loss: 0.00955
Epoch: 20783, Training Loss: 0.01172
Epoch: 20783, Training Loss: 0.00905
Epoch: 20784, Training Loss: 0.01027
Epoch: 20784, Training Loss: 0.00955
Epoch: 20784, Training Loss: 0.01172
Epoch: 20784, Training Loss: 0.00905
Epoch: 20785, Training Loss: 0.01027
Epoch: 20785, Training Loss: 0.00955
Epoch: 20785, Training Loss: 0.01172
Epoch: 20785, Training Loss: 0.00905
Epoch: 20786, Training Loss: 0.01027
Epoch: 20786, Training Loss: 0.00955
Epoch: 20786, Training Loss: 0.01172
Epoch: 20786, Training Loss: 0.00905
Epoch: 20787, Training Loss: 0.01027
Epoch: 20787, Training Loss: 0.00955
Epoch: 20787, Training Loss: 0.01172
Epoch: 20787, Training Loss: 0.00905
Epoch: 20788, Training Loss: 0.01027
Epoch: 20788, Training Loss: 0.00955
Epoch: 20788, Training Loss: 0.01172
Epoch: 20788, Training Loss: 0.00905
Epoch: 20789, Training Loss: 0.01026
Epoch: 20789, Training Loss: 0.00955
Epoch: 20789, Training Loss: 0.01172
Epoch: 20789, Training Loss: 0.00905
Epoch: 20790, Training Loss: 0.01026
Epoch: 20790, Training Loss: 0.00955
Epoch: 20790, Training Loss: 0.01172
Epoch: 20790, Training Loss: 0.00905
Epoch: 20791, Training Loss: 0.01026
Epoch: 20791, Training Loss: 0.00955
Epoch: 20791, Training Loss: 0.01172
Epoch: 20791, Training Loss: 0.00905
Epoch: 20792, Training Loss: 0.01026
Epoch: 20792, Training Loss: 0.00955
Epoch: 20792, Training Loss: 0.01172
Epoch: 20792, Training Loss: 0.00905
Epoch: 20793, Training Loss: 0.01026
Epoch: 20793, Training Loss: 0.00955
Epoch: 20793, Training Loss: 0.01172
Epoch: 20793, Training Loss: 0.00905
Epoch: 20794, Training Loss: 0.01026
Epoch: 20794, Training Loss: 0.00955
Epoch: 20794, Training Loss: 0.01172
Epoch: 20794, Training Loss: 0.00905
Epoch: 20795, Training Loss: 0.01026
Epoch: 20795, Training Loss: 0.00955
Epoch: 20795, Training Loss: 0.01172
Epoch: 20795, Training Loss: 0.00905
Epoch: 20796, Training Loss: 0.01026
Epoch: 20796, Training Loss: 0.00955
Epoch: 20796, Training Loss: 0.01172
Epoch: 20796, Training Loss: 0.00905
Epoch: 20797, Training Loss: 0.01026
Epoch: 20797, Training Loss: 0.00955
Epoch: 20797, Training Loss: 0.01172
Epoch: 20797, Training Loss: 0.00905
Epoch: 20798, Training Loss: 0.01026
Epoch: 20798, Training Loss: 0.00955
Epoch: 20798, Training Loss: 0.01172
Epoch: 20798, Training Loss: 0.00905
Epoch: 20799, Training Loss: 0.01026
Epoch: 20799, Training Loss: 0.00955
Epoch: 20799, Training Loss: 0.01172
Epoch: 20799, Training Loss: 0.00905
Epoch: 20800, Training Loss: 0.01026
Epoch: 20800, Training Loss: 0.00955
Epoch: 20800, Training Loss: 0.01172
Epoch: 20800, Training Loss: 0.00905
...
Epoch: 21044, Training Loss: 0.01164
Epoch: 21044, Training Loss: 0.01020
[output truncated: the four per-pattern losses are printed each epoch; from epoch 20800 to 21044 they decrease only slightly (roughly 0.0117/0.0103/0.0095/0.0090 down to 0.0116/0.0102/0.0095/0.0090), still above the 0.008 target]
Epoch: 21044, Training Loss: 0.00949
Epoch: 21044, Training Loss: 0.01164
Epoch: 21044, Training Loss: 0.00899
Epoch: 21045, Training Loss: 0.01020
Epoch: 21045, Training Loss: 0.00949
Epoch: 21045, Training Loss: 0.01164
Epoch: 21045, Training Loss: 0.00899
Epoch: 21046, Training Loss: 0.01020
Epoch: 21046, Training Loss: 0.00949
Epoch: 21046, Training Loss: 0.01164
Epoch: 21046, Training Loss: 0.00899
Epoch: 21047, Training Loss: 0.01020
Epoch: 21047, Training Loss: 0.00949
Epoch: 21047, Training Loss: 0.01164
Epoch: 21047, Training Loss: 0.00899
Epoch: 21048, Training Loss: 0.01020
Epoch: 21048, Training Loss: 0.00949
Epoch: 21048, Training Loss: 0.01164
Epoch: 21048, Training Loss: 0.00899
Epoch: 21049, Training Loss: 0.01020
Epoch: 21049, Training Loss: 0.00949
Epoch: 21049, Training Loss: 0.01164
Epoch: 21049, Training Loss: 0.00899
Epoch: 21050, Training Loss: 0.01020
Epoch: 21050, Training Loss: 0.00949
Epoch: 21050, Training Loss: 0.01164
Epoch: 21050, Training Loss: 0.00899
Epoch: 21051, Training Loss: 0.01020
Epoch: 21051, Training Loss: 0.00949
Epoch: 21051, Training Loss: 0.01164
Epoch: 21051, Training Loss: 0.00899
Epoch: 21052, Training Loss: 0.01020
Epoch: 21052, Training Loss: 0.00949
Epoch: 21052, Training Loss: 0.01164
Epoch: 21052, Training Loss: 0.00899
Epoch: 21053, Training Loss: 0.01020
Epoch: 21053, Training Loss: 0.00949
Epoch: 21053, Training Loss: 0.01164
Epoch: 21053, Training Loss: 0.00899
Epoch: 21054, Training Loss: 0.01019
Epoch: 21054, Training Loss: 0.00949
Epoch: 21054, Training Loss: 0.01164
Epoch: 21054, Training Loss: 0.00899
Epoch: 21055, Training Loss: 0.01019
Epoch: 21055, Training Loss: 0.00949
Epoch: 21055, Training Loss: 0.01164
Epoch: 21055, Training Loss: 0.00899
Epoch: 21056, Training Loss: 0.01019
Epoch: 21056, Training Loss: 0.00949
Epoch: 21056, Training Loss: 0.01164
Epoch: 21056, Training Loss: 0.00899
Epoch: 21057, Training Loss: 0.01019
Epoch: 21057, Training Loss: 0.00949
Epoch: 21057, Training Loss: 0.01164
Epoch: 21057, Training Loss: 0.00899
Epoch: 21058, Training Loss: 0.01019
Epoch: 21058, Training Loss: 0.00949
Epoch: 21058, Training Loss: 0.01164
Epoch: 21058, Training Loss: 0.00899
Epoch: 21059, Training Loss: 0.01019
Epoch: 21059, Training Loss: 0.00949
Epoch: 21059, Training Loss: 0.01164
Epoch: 21059, Training Loss: 0.00899
Epoch: 21060, Training Loss: 0.01019
Epoch: 21060, Training Loss: 0.00949
Epoch: 21060, Training Loss: 0.01164
Epoch: 21060, Training Loss: 0.00899
Epoch: 21061, Training Loss: 0.01019
Epoch: 21061, Training Loss: 0.00949
Epoch: 21061, Training Loss: 0.01164
Epoch: 21061, Training Loss: 0.00899
Epoch: 21062, Training Loss: 0.01019
Epoch: 21062, Training Loss: 0.00949
Epoch: 21062, Training Loss: 0.01164
Epoch: 21062, Training Loss: 0.00899
Epoch: 21063, Training Loss: 0.01019
Epoch: 21063, Training Loss: 0.00949
Epoch: 21063, Training Loss: 0.01164
Epoch: 21063, Training Loss: 0.00899
Epoch: 21064, Training Loss: 0.01019
Epoch: 21064, Training Loss: 0.00948
Epoch: 21064, Training Loss: 0.01164
Epoch: 21064, Training Loss: 0.00899
Epoch: 21065, Training Loss: 0.01019
Epoch: 21065, Training Loss: 0.00948
Epoch: 21065, Training Loss: 0.01164
Epoch: 21065, Training Loss: 0.00899
Epoch: 21066, Training Loss: 0.01019
Epoch: 21066, Training Loss: 0.00948
Epoch: 21066, Training Loss: 0.01164
Epoch: 21066, Training Loss: 0.00899
Epoch: 21067, Training Loss: 0.01019
Epoch: 21067, Training Loss: 0.00948
Epoch: 21067, Training Loss: 0.01164
Epoch: 21067, Training Loss: 0.00899
Epoch: 21068, Training Loss: 0.01019
Epoch: 21068, Training Loss: 0.00948
Epoch: 21068, Training Loss: 0.01163
Epoch: 21068, Training Loss: 0.00899
Epoch: 21069, Training Loss: 0.01019
Epoch: 21069, Training Loss: 0.00948
Epoch: 21069, Training Loss: 0.01163
Epoch: 21069, Training Loss: 0.00899
Epoch: 21070, Training Loss: 0.01019
Epoch: 21070, Training Loss: 0.00948
Epoch: 21070, Training Loss: 0.01163
Epoch: 21070, Training Loss: 0.00899
Epoch: 21071, Training Loss: 0.01019
Epoch: 21071, Training Loss: 0.00948
Epoch: 21071, Training Loss: 0.01163
Epoch: 21071, Training Loss: 0.00899
Epoch: 21072, Training Loss: 0.01019
Epoch: 21072, Training Loss: 0.00948
Epoch: 21072, Training Loss: 0.01163
Epoch: 21072, Training Loss: 0.00898
Epoch: 21073, Training Loss: 0.01019
Epoch: 21073, Training Loss: 0.00948
Epoch: 21073, Training Loss: 0.01163
Epoch: 21073, Training Loss: 0.00898
Epoch: 21074, Training Loss: 0.01019
Epoch: 21074, Training Loss: 0.00948
Epoch: 21074, Training Loss: 0.01163
Epoch: 21074, Training Loss: 0.00898
Epoch: 21075, Training Loss: 0.01019
Epoch: 21075, Training Loss: 0.00948
Epoch: 21075, Training Loss: 0.01163
Epoch: 21075, Training Loss: 0.00898
Epoch: 21076, Training Loss: 0.01019
Epoch: 21076, Training Loss: 0.00948
Epoch: 21076, Training Loss: 0.01163
Epoch: 21076, Training Loss: 0.00898
Epoch: 21077, Training Loss: 0.01019
Epoch: 21077, Training Loss: 0.00948
Epoch: 21077, Training Loss: 0.01163
Epoch: 21077, Training Loss: 0.00898
Epoch: 21078, Training Loss: 0.01019
Epoch: 21078, Training Loss: 0.00948
Epoch: 21078, Training Loss: 0.01163
Epoch: 21078, Training Loss: 0.00898
Epoch: 21079, Training Loss: 0.01019
Epoch: 21079, Training Loss: 0.00948
Epoch: 21079, Training Loss: 0.01163
Epoch: 21079, Training Loss: 0.00898
Epoch: 21080, Training Loss: 0.01019
Epoch: 21080, Training Loss: 0.00948
Epoch: 21080, Training Loss: 0.01163
Epoch: 21080, Training Loss: 0.00898
Epoch: 21081, Training Loss: 0.01019
Epoch: 21081, Training Loss: 0.00948
Epoch: 21081, Training Loss: 0.01163
Epoch: 21081, Training Loss: 0.00898
Epoch: 21082, Training Loss: 0.01019
Epoch: 21082, Training Loss: 0.00948
Epoch: 21082, Training Loss: 0.01163
Epoch: 21082, Training Loss: 0.00898
Epoch: 21083, Training Loss: 0.01019
Epoch: 21083, Training Loss: 0.00948
Epoch: 21083, Training Loss: 0.01163
Epoch: 21083, Training Loss: 0.00898
Epoch: 21084, Training Loss: 0.01019
Epoch: 21084, Training Loss: 0.00948
Epoch: 21084, Training Loss: 0.01163
Epoch: 21084, Training Loss: 0.00898
Epoch: 21085, Training Loss: 0.01019
Epoch: 21085, Training Loss: 0.00948
Epoch: 21085, Training Loss: 0.01163
Epoch: 21085, Training Loss: 0.00898
Epoch: 21086, Training Loss: 0.01019
Epoch: 21086, Training Loss: 0.00948
Epoch: 21086, Training Loss: 0.01163
Epoch: 21086, Training Loss: 0.00898
Epoch: 21087, Training Loss: 0.01019
Epoch: 21087, Training Loss: 0.00948
Epoch: 21087, Training Loss: 0.01163
Epoch: 21087, Training Loss: 0.00898
Epoch: 21088, Training Loss: 0.01019
Epoch: 21088, Training Loss: 0.00948
Epoch: 21088, Training Loss: 0.01163
Epoch: 21088, Training Loss: 0.00898
Epoch: 21089, Training Loss: 0.01019
Epoch: 21089, Training Loss: 0.00948
Epoch: 21089, Training Loss: 0.01163
Epoch: 21089, Training Loss: 0.00898
Epoch: 21090, Training Loss: 0.01019
Epoch: 21090, Training Loss: 0.00948
Epoch: 21090, Training Loss: 0.01163
Epoch: 21090, Training Loss: 0.00898
Epoch: 21091, Training Loss: 0.01019
Epoch: 21091, Training Loss: 0.00948
Epoch: 21091, Training Loss: 0.01163
Epoch: 21091, Training Loss: 0.00898
Epoch: 21092, Training Loss: 0.01018
Epoch: 21092, Training Loss: 0.00948
Epoch: 21092, Training Loss: 0.01163
Epoch: 21092, Training Loss: 0.00898
Epoch: 21093, Training Loss: 0.01018
Epoch: 21093, Training Loss: 0.00948
Epoch: 21093, Training Loss: 0.01163
Epoch: 21093, Training Loss: 0.00898
Epoch: 21094, Training Loss: 0.01018
Epoch: 21094, Training Loss: 0.00948
Epoch: 21094, Training Loss: 0.01163
Epoch: 21094, Training Loss: 0.00898
Epoch: 21095, Training Loss: 0.01018
Epoch: 21095, Training Loss: 0.00948
Epoch: 21095, Training Loss: 0.01163
Epoch: 21095, Training Loss: 0.00898
Epoch: 21096, Training Loss: 0.01018
Epoch: 21096, Training Loss: 0.00948
Epoch: 21096, Training Loss: 0.01163
Epoch: 21096, Training Loss: 0.00898
Epoch: 21097, Training Loss: 0.01018
Epoch: 21097, Training Loss: 0.00948
Epoch: 21097, Training Loss: 0.01163
Epoch: 21097, Training Loss: 0.00898
Epoch: 21098, Training Loss: 0.01018
Epoch: 21098, Training Loss: 0.00948
Epoch: 21098, Training Loss: 0.01163
Epoch: 21098, Training Loss: 0.00898
Epoch: 21099, Training Loss: 0.01018
Epoch: 21099, Training Loss: 0.00948
Epoch: 21099, Training Loss: 0.01163
Epoch: 21099, Training Loss: 0.00898
Epoch: 21100, Training Loss: 0.01018
Epoch: 21100, Training Loss: 0.00948
Epoch: 21100, Training Loss: 0.01163
Epoch: 21100, Training Loss: 0.00898
Epoch: 21101, Training Loss: 0.01018
Epoch: 21101, Training Loss: 0.00948
Epoch: 21101, Training Loss: 0.01162
Epoch: 21101, Training Loss: 0.00898
Epoch: 21102, Training Loss: 0.01018
Epoch: 21102, Training Loss: 0.00948
Epoch: 21102, Training Loss: 0.01162
Epoch: 21102, Training Loss: 0.00898
Epoch: 21103, Training Loss: 0.01018
Epoch: 21103, Training Loss: 0.00948
Epoch: 21103, Training Loss: 0.01162
Epoch: 21103, Training Loss: 0.00898
Epoch: 21104, Training Loss: 0.01018
Epoch: 21104, Training Loss: 0.00948
Epoch: 21104, Training Loss: 0.01162
Epoch: 21104, Training Loss: 0.00898
Epoch: 21105, Training Loss: 0.01018
Epoch: 21105, Training Loss: 0.00948
Epoch: 21105, Training Loss: 0.01162
Epoch: 21105, Training Loss: 0.00898
Epoch: 21106, Training Loss: 0.01018
Epoch: 21106, Training Loss: 0.00947
Epoch: 21106, Training Loss: 0.01162
Epoch: 21106, Training Loss: 0.00898
Epoch: 21107, Training Loss: 0.01018
Epoch: 21107, Training Loss: 0.00947
Epoch: 21107, Training Loss: 0.01162
Epoch: 21107, Training Loss: 0.00898
Epoch: 21108, Training Loss: 0.01018
Epoch: 21108, Training Loss: 0.00947
Epoch: 21108, Training Loss: 0.01162
Epoch: 21108, Training Loss: 0.00898
Epoch: 21109, Training Loss: 0.01018
Epoch: 21109, Training Loss: 0.00947
Epoch: 21109, Training Loss: 0.01162
Epoch: 21109, Training Loss: 0.00898
Epoch: 21110, Training Loss: 0.01018
Epoch: 21110, Training Loss: 0.00947
Epoch: 21110, Training Loss: 0.01162
Epoch: 21110, Training Loss: 0.00898
Epoch: 21111, Training Loss: 0.01018
Epoch: 21111, Training Loss: 0.00947
Epoch: 21111, Training Loss: 0.01162
Epoch: 21111, Training Loss: 0.00898
Epoch: 21112, Training Loss: 0.01018
Epoch: 21112, Training Loss: 0.00947
Epoch: 21112, Training Loss: 0.01162
Epoch: 21112, Training Loss: 0.00898
Epoch: 21113, Training Loss: 0.01018
Epoch: 21113, Training Loss: 0.00947
Epoch: 21113, Training Loss: 0.01162
Epoch: 21113, Training Loss: 0.00898
Epoch: 21114, Training Loss: 0.01018
Epoch: 21114, Training Loss: 0.00947
Epoch: 21114, Training Loss: 0.01162
Epoch: 21114, Training Loss: 0.00898
Epoch: 21115, Training Loss: 0.01018
Epoch: 21115, Training Loss: 0.00947
Epoch: 21115, Training Loss: 0.01162
Epoch: 21115, Training Loss: 0.00898
Epoch: 21116, Training Loss: 0.01018
Epoch: 21116, Training Loss: 0.00947
Epoch: 21116, Training Loss: 0.01162
Epoch: 21116, Training Loss: 0.00897
Epoch: 21117, Training Loss: 0.01018
Epoch: 21117, Training Loss: 0.00947
Epoch: 21117, Training Loss: 0.01162
Epoch: 21117, Training Loss: 0.00897
Epoch: 21118, Training Loss: 0.01018
Epoch: 21118, Training Loss: 0.00947
Epoch: 21118, Training Loss: 0.01162
Epoch: 21118, Training Loss: 0.00897
Epoch: 21119, Training Loss: 0.01018
Epoch: 21119, Training Loss: 0.00947
Epoch: 21119, Training Loss: 0.01162
Epoch: 21119, Training Loss: 0.00897
Epoch: 21120, Training Loss: 0.01018
Epoch: 21120, Training Loss: 0.00947
Epoch: 21120, Training Loss: 0.01162
Epoch: 21120, Training Loss: 0.00897
Epoch: 21121, Training Loss: 0.01018
Epoch: 21121, Training Loss: 0.00947
Epoch: 21121, Training Loss: 0.01162
Epoch: 21121, Training Loss: 0.00897
Epoch: 21122, Training Loss: 0.01018
Epoch: 21122, Training Loss: 0.00947
Epoch: 21122, Training Loss: 0.01162
Epoch: 21122, Training Loss: 0.00897
Epoch: 21123, Training Loss: 0.01018
Epoch: 21123, Training Loss: 0.00947
Epoch: 21123, Training Loss: 0.01162
Epoch: 21123, Training Loss: 0.00897
Epoch: 21124, Training Loss: 0.01018
Epoch: 21124, Training Loss: 0.00947
Epoch: 21124, Training Loss: 0.01162
Epoch: 21124, Training Loss: 0.00897
Epoch: 21125, Training Loss: 0.01018
Epoch: 21125, Training Loss: 0.00947
Epoch: 21125, Training Loss: 0.01162
Epoch: 21125, Training Loss: 0.00897
Epoch: 21126, Training Loss: 0.01018
Epoch: 21126, Training Loss: 0.00947
Epoch: 21126, Training Loss: 0.01162
Epoch: 21126, Training Loss: 0.00897
Epoch: 21127, Training Loss: 0.01018
Epoch: 21127, Training Loss: 0.00947
Epoch: 21127, Training Loss: 0.01162
Epoch: 21127, Training Loss: 0.00897
Epoch: 21128, Training Loss: 0.01018
Epoch: 21128, Training Loss: 0.00947
Epoch: 21128, Training Loss: 0.01162
Epoch: 21128, Training Loss: 0.00897
Epoch: 21129, Training Loss: 0.01018
Epoch: 21129, Training Loss: 0.00947
Epoch: 21129, Training Loss: 0.01162
Epoch: 21129, Training Loss: 0.00897
Epoch: 21130, Training Loss: 0.01018
Epoch: 21130, Training Loss: 0.00947
Epoch: 21130, Training Loss: 0.01162
Epoch: 21130, Training Loss: 0.00897
Epoch: 21131, Training Loss: 0.01017
Epoch: 21131, Training Loss: 0.00947
Epoch: 21131, Training Loss: 0.01162
Epoch: 21131, Training Loss: 0.00897
Epoch: 21132, Training Loss: 0.01017
Epoch: 21132, Training Loss: 0.00947
Epoch: 21132, Training Loss: 0.01162
Epoch: 21132, Training Loss: 0.00897
Epoch: 21133, Training Loss: 0.01017
Epoch: 21133, Training Loss: 0.00947
Epoch: 21133, Training Loss: 0.01162
Epoch: 21133, Training Loss: 0.00897
Epoch: 21134, Training Loss: 0.01017
Epoch: 21134, Training Loss: 0.00947
Epoch: 21134, Training Loss: 0.01162
Epoch: 21134, Training Loss: 0.00897
Epoch: 21135, Training Loss: 0.01017
Epoch: 21135, Training Loss: 0.00947
Epoch: 21135, Training Loss: 0.01161
Epoch: 21135, Training Loss: 0.00897
Epoch: 21136, Training Loss: 0.01017
Epoch: 21136, Training Loss: 0.00947
Epoch: 21136, Training Loss: 0.01161
Epoch: 21136, Training Loss: 0.00897
Epoch: 21137, Training Loss: 0.01017
Epoch: 21137, Training Loss: 0.00947
Epoch: 21137, Training Loss: 0.01161
Epoch: 21137, Training Loss: 0.00897
Epoch: 21138, Training Loss: 0.01017
Epoch: 21138, Training Loss: 0.00947
Epoch: 21138, Training Loss: 0.01161
Epoch: 21138, Training Loss: 0.00897
Epoch: 21139, Training Loss: 0.01017
Epoch: 21139, Training Loss: 0.00947
Epoch: 21139, Training Loss: 0.01161
Epoch: 21139, Training Loss: 0.00897
Epoch: 21140, Training Loss: 0.01017
Epoch: 21140, Training Loss: 0.00947
Epoch: 21140, Training Loss: 0.01161
Epoch: 21140, Training Loss: 0.00897
Epoch: 21141, Training Loss: 0.01017
Epoch: 21141, Training Loss: 0.00947
Epoch: 21141, Training Loss: 0.01161
Epoch: 21141, Training Loss: 0.00897
Epoch: 21142, Training Loss: 0.01017
Epoch: 21142, Training Loss: 0.00947
Epoch: 21142, Training Loss: 0.01161
Epoch: 21142, Training Loss: 0.00897
Epoch: 21143, Training Loss: 0.01017
Epoch: 21143, Training Loss: 0.00947
Epoch: 21143, Training Loss: 0.01161
Epoch: 21143, Training Loss: 0.00897
Epoch: 21144, Training Loss: 0.01017
Epoch: 21144, Training Loss: 0.00947
Epoch: 21144, Training Loss: 0.01161
Epoch: 21144, Training Loss: 0.00897
Epoch: 21145, Training Loss: 0.01017
Epoch: 21145, Training Loss: 0.00947
Epoch: 21145, Training Loss: 0.01161
Epoch: 21145, Training Loss: 0.00897
Epoch: 21146, Training Loss: 0.01017
Epoch: 21146, Training Loss: 0.00947
Epoch: 21146, Training Loss: 0.01161
Epoch: 21146, Training Loss: 0.00897
Epoch: 21147, Training Loss: 0.01017
Epoch: 21147, Training Loss: 0.00947
Epoch: 21147, Training Loss: 0.01161
Epoch: 21147, Training Loss: 0.00897
Epoch: 21148, Training Loss: 0.01017
Epoch: 21148, Training Loss: 0.00946
Epoch: 21148, Training Loss: 0.01161
Epoch: 21148, Training Loss: 0.00897
Epoch: 21149, Training Loss: 0.01017
Epoch: 21149, Training Loss: 0.00946
Epoch: 21149, Training Loss: 0.01161
Epoch: 21149, Training Loss: 0.00897
Epoch: 21150, Training Loss: 0.01017
Epoch: 21150, Training Loss: 0.00946
Epoch: 21150, Training Loss: 0.01161
Epoch: 21150, Training Loss: 0.00897
Epoch: 21151, Training Loss: 0.01017
Epoch: 21151, Training Loss: 0.00946
Epoch: 21151, Training Loss: 0.01161
Epoch: 21151, Training Loss: 0.00897
Epoch: 21152, Training Loss: 0.01017
Epoch: 21152, Training Loss: 0.00946
Epoch: 21152, Training Loss: 0.01161
Epoch: 21152, Training Loss: 0.00897
Epoch: 21153, Training Loss: 0.01017
Epoch: 21153, Training Loss: 0.00946
Epoch: 21153, Training Loss: 0.01161
Epoch: 21153, Training Loss: 0.00897
Epoch: 21154, Training Loss: 0.01017
Epoch: 21154, Training Loss: 0.00946
Epoch: 21154, Training Loss: 0.01161
Epoch: 21154, Training Loss: 0.00897
Epoch: 21155, Training Loss: 0.01017
Epoch: 21155, Training Loss: 0.00946
Epoch: 21155, Training Loss: 0.01161
Epoch: 21155, Training Loss: 0.00897
Epoch: 21156, Training Loss: 0.01017
Epoch: 21156, Training Loss: 0.00946
Epoch: 21156, Training Loss: 0.01161
Epoch: 21156, Training Loss: 0.00897
Epoch: 21157, Training Loss: 0.01017
Epoch: 21157, Training Loss: 0.00946
Epoch: 21157, Training Loss: 0.01161
Epoch: 21157, Training Loss: 0.00897
Epoch: 21158, Training Loss: 0.01017
Epoch: 21158, Training Loss: 0.00946
Epoch: 21158, Training Loss: 0.01161
Epoch: 21158, Training Loss: 0.00897
Epoch: 21159, Training Loss: 0.01017
Epoch: 21159, Training Loss: 0.00946
Epoch: 21159, Training Loss: 0.01161
Epoch: 21159, Training Loss: 0.00897
Epoch: 21160, Training Loss: 0.01017
Epoch: 21160, Training Loss: 0.00946
Epoch: 21160, Training Loss: 0.01161
Epoch: 21160, Training Loss: 0.00896
Epoch: 21161, Training Loss: 0.01017
Epoch: 21161, Training Loss: 0.00946
Epoch: 21161, Training Loss: 0.01161
Epoch: 21161, Training Loss: 0.00896
Epoch: 21162, Training Loss: 0.01017
Epoch: 21162, Training Loss: 0.00946
Epoch: 21162, Training Loss: 0.01161
Epoch: 21162, Training Loss: 0.00896
Epoch: 21163, Training Loss: 0.01017
Epoch: 21163, Training Loss: 0.00946
Epoch: 21163, Training Loss: 0.01161
Epoch: 21163, Training Loss: 0.00896
Epoch: 21164, Training Loss: 0.01017
Epoch: 21164, Training Loss: 0.00946
Epoch: 21164, Training Loss: 0.01161
Epoch: 21164, Training Loss: 0.00896
Epoch: 21165, Training Loss: 0.01017
Epoch: 21165, Training Loss: 0.00946
Epoch: 21165, Training Loss: 0.01161
Epoch: 21165, Training Loss: 0.00896
Epoch: 21166, Training Loss: 0.01017
Epoch: 21166, Training Loss: 0.00946
Epoch: 21166, Training Loss: 0.01161
Epoch: 21166, Training Loss: 0.00896
Epoch: 21167, Training Loss: 0.01017
Epoch: 21167, Training Loss: 0.00946
Epoch: 21167, Training Loss: 0.01161
Epoch: 21167, Training Loss: 0.00896
Epoch: 21168, Training Loss: 0.01017
Epoch: 21168, Training Loss: 0.00946
Epoch: 21168, Training Loss: 0.01161
Epoch: 21168, Training Loss: 0.00896
Epoch: 21169, Training Loss: 0.01016
Epoch: 21169, Training Loss: 0.00946
Epoch: 21169, Training Loss: 0.01160
Epoch: 21169, Training Loss: 0.00896
Epoch: 21170, Training Loss: 0.01016
Epoch: 21170, Training Loss: 0.00946
Epoch: 21170, Training Loss: 0.01160
Epoch: 21170, Training Loss: 0.00896
Epoch: 21171, Training Loss: 0.01016
Epoch: 21171, Training Loss: 0.00946
Epoch: 21171, Training Loss: 0.01160
Epoch: 21171, Training Loss: 0.00896
Epoch: 21172, Training Loss: 0.01016
Epoch: 21172, Training Loss: 0.00946
Epoch: 21172, Training Loss: 0.01160
Epoch: 21172, Training Loss: 0.00896
Epoch: 21173, Training Loss: 0.01016
Epoch: 21173, Training Loss: 0.00946
Epoch: 21173, Training Loss: 0.01160
Epoch: 21173, Training Loss: 0.00896
Epoch: 21174, Training Loss: 0.01016
Epoch: 21174, Training Loss: 0.00946
Epoch: 21174, Training Loss: 0.01160
Epoch: 21174, Training Loss: 0.00896
Epoch: 21175, Training Loss: 0.01016
Epoch: 21175, Training Loss: 0.00946
Epoch: 21175, Training Loss: 0.01160
Epoch: 21175, Training Loss: 0.00896
Epoch: 21176, Training Loss: 0.01016
Epoch: 21176, Training Loss: 0.00946
Epoch: 21176, Training Loss: 0.01160
Epoch: 21176, Training Loss: 0.00896
Epoch: 21177, Training Loss: 0.01016
Epoch: 21177, Training Loss: 0.00946
Epoch: 21177, Training Loss: 0.01160
Epoch: 21177, Training Loss: 0.00896
Epoch: 21178, Training Loss: 0.01016
Epoch: 21178, Training Loss: 0.00946
Epoch: 21178, Training Loss: 0.01160
Epoch: 21178, Training Loss: 0.00896
Epoch: 21179, Training Loss: 0.01016
Epoch: 21179, Training Loss: 0.00946
Epoch: 21179, Training Loss: 0.01160
Epoch: 21179, Training Loss: 0.00896
Epoch: 21180, Training Loss: 0.01016
Epoch: 21180, Training Loss: 0.00946
Epoch: 21180, Training Loss: 0.01160
Epoch: 21180, Training Loss: 0.00896
Epoch: 21181, Training Loss: 0.01016
Epoch: 21181, Training Loss: 0.00946
Epoch: 21181, Training Loss: 0.01160
Epoch: 21181, Training Loss: 0.00896
Epoch: 21182, Training Loss: 0.01016
Epoch: 21182, Training Loss: 0.00946
Epoch: 21182, Training Loss: 0.01160
Epoch: 21182, Training Loss: 0.00896
Epoch: 21183, Training Loss: 0.01016
Epoch: 21183, Training Loss: 0.00946
Epoch: 21183, Training Loss: 0.01160
Epoch: 21183, Training Loss: 0.00896
Epoch: 21184, Training Loss: 0.01016
Epoch: 21184, Training Loss: 0.00946
Epoch: 21184, Training Loss: 0.01160
Epoch: 21184, Training Loss: 0.00896
Epoch: 21185, Training Loss: 0.01016
Epoch: 21185, Training Loss: 0.00946
Epoch: 21185, Training Loss: 0.01160
Epoch: 21185, Training Loss: 0.00896
Epoch: 21186, Training Loss: 0.01016
Epoch: 21186, Training Loss: 0.00946
Epoch: 21186, Training Loss: 0.01160
Epoch: 21186, Training Loss: 0.00896
Epoch: 21187, Training Loss: 0.01016
Epoch: 21187, Training Loss: 0.00946
Epoch: 21187, Training Loss: 0.01160
Epoch: 21187, Training Loss: 0.00896
Epoch: 21188, Training Loss: 0.01016
Epoch: 21188, Training Loss: 0.00946
Epoch: 21188, Training Loss: 0.01160
Epoch: 21188, Training Loss: 0.00896
Epoch: 21189, Training Loss: 0.01016
Epoch: 21189, Training Loss: 0.00946
Epoch: 21189, Training Loss: 0.01160
Epoch: 21189, Training Loss: 0.00896
Epoch: 21190, Training Loss: 0.01016
Epoch: 21190, Training Loss: 0.00946
Epoch: 21190, Training Loss: 0.01160
Epoch: 21190, Training Loss: 0.00896
Epoch: 21191, Training Loss: 0.01016
Epoch: 21191, Training Loss: 0.00945
Epoch: 21191, Training Loss: 0.01160
Epoch: 21191, Training Loss: 0.00896
Epoch: 21192, Training Loss: 0.01016
Epoch: 21192, Training Loss: 0.00945
Epoch: 21192, Training Loss: 0.01160
Epoch: 21192, Training Loss: 0.00896
Epoch: 21193, Training Loss: 0.01016
Epoch: 21193, Training Loss: 0.00945
Epoch: 21193, Training Loss: 0.01160
Epoch: 21193, Training Loss: 0.00896
Epoch: 21194, Training Loss: 0.01016
Epoch: 21194, Training Loss: 0.00945
Epoch: 21194, Training Loss: 0.01160
Epoch: 21194, Training Loss: 0.00896
Epoch: 21195, Training Loss: 0.01016
Epoch: 21195, Training Loss: 0.00945
Epoch: 21195, Training Loss: 0.01160
Epoch: 21195, Training Loss: 0.00896
Epoch: 21196, Training Loss: 0.01016
Epoch: 21196, Training Loss: 0.00945
Epoch: 21196, Training Loss: 0.01160
Epoch: 21196, Training Loss: 0.00896
Epoch: 21197, Training Loss: 0.01016
Epoch: 21197, Training Loss: 0.00945
Epoch: 21197, Training Loss: 0.01160
Epoch: 21197, Training Loss: 0.00896
Epoch: 21198, Training Loss: 0.01016
Epoch: 21198, Training Loss: 0.00945
Epoch: 21198, Training Loss: 0.01160
Epoch: 21198, Training Loss: 0.00896
Epoch: 21199, Training Loss: 0.01016
Epoch: 21199, Training Loss: 0.00945
Epoch: 21199, Training Loss: 0.01160
Epoch: 21199, Training Loss: 0.00896
Epoch: 21200, Training Loss: 0.01016
Epoch: 21200, Training Loss: 0.00945
Epoch: 21200, Training Loss: 0.01160
Epoch: 21200, Training Loss: 0.00896
Epoch: 21201, Training Loss: 0.01016
Epoch: 21201, Training Loss: 0.00945
Epoch: 21201, Training Loss: 0.01160
Epoch: 21201, Training Loss: 0.00896
Epoch: 21202, Training Loss: 0.01016
Epoch: 21202, Training Loss: 0.00945
Epoch: 21202, Training Loss: 0.01160
Epoch: 21202, Training Loss: 0.00896
Epoch: 21203, Training Loss: 0.01016
Epoch: 21203, Training Loss: 0.00945
Epoch: 21203, Training Loss: 0.01159
Epoch: 21203, Training Loss: 0.00896
Epoch: 21204, Training Loss: 0.01016
Epoch: 21204, Training Loss: 0.00945
Epoch: 21204, Training Loss: 0.01159
Epoch: 21204, Training Loss: 0.00896
Epoch: 21205, Training Loss: 0.01016
Epoch: 21205, Training Loss: 0.00945
Epoch: 21205, Training Loss: 0.01159
Epoch: 21205, Training Loss: 0.00895
Epoch: 21206, Training Loss: 0.01016
Epoch: 21206, Training Loss: 0.00945
Epoch: 21206, Training Loss: 0.01159
Epoch: 21206, Training Loss: 0.00895
Epoch: 21207, Training Loss: 0.01016
Epoch: 21207, Training Loss: 0.00945
Epoch: 21207, Training Loss: 0.01159
Epoch: 21207, Training Loss: 0.00895
Epoch: 21208, Training Loss: 0.01015
Epoch: 21208, Training Loss: 0.00945
Epoch: 21208, Training Loss: 0.01159
Epoch: 21208, Training Loss: 0.00895
Epoch: 21209, Training Loss: 0.01015
Epoch: 21209, Training Loss: 0.00945
Epoch: 21209, Training Loss: 0.01159
Epoch: 21209, Training Loss: 0.00895
Epoch: 21210, Training Loss: 0.01015
Epoch: 21210, Training Loss: 0.00945
Epoch: 21210, Training Loss: 0.01159
Epoch: 21210, Training Loss: 0.00895
Epoch: 21211, Training Loss: 0.01015
Epoch: 21211, Training Loss: 0.00945
Epoch: 21211, Training Loss: 0.01159
Epoch: 21211, Training Loss: 0.00895
Epoch: 21212, Training Loss: 0.01015
Epoch: 21212, Training Loss: 0.00945
Epoch: 21212, Training Loss: 0.01159
Epoch: 21212, Training Loss: 0.00895
Epoch: 21213, Training Loss: 0.01015
Epoch: 21213, Training Loss: 0.00945
Epoch: 21213, Training Loss: 0.01159
Epoch: 21213, Training Loss: 0.00895
Epoch: 21214, Training Loss: 0.01015
Epoch: 21214, Training Loss: 0.00945
Epoch: 21214, Training Loss: 0.01159
Epoch: 21214, Training Loss: 0.00895
Epoch: 21215, Training Loss: 0.01015
Epoch: 21215, Training Loss: 0.00945
Epoch: 21215, Training Loss: 0.01159
Epoch: 21215, Training Loss: 0.00895
Epoch: 21216, Training Loss: 0.01015
Epoch: 21216, Training Loss: 0.00945
Epoch: 21216, Training Loss: 0.01159
Epoch: 21216, Training Loss: 0.00895
Epoch: 21217, Training Loss: 0.01015
Epoch: 21217, Training Loss: 0.00945
Epoch: 21217, Training Loss: 0.01159
Epoch: 21217, Training Loss: 0.00895
Epoch: 21218, Training Loss: 0.01015
Epoch: 21218, Training Loss: 0.00945
Epoch: 21218, Training Loss: 0.01159
Epoch: 21218, Training Loss: 0.00895
Epoch: 21219, Training Loss: 0.01015
Epoch: 21219, Training Loss: 0.00945
Epoch: 21219, Training Loss: 0.01159
Epoch: 21219, Training Loss: 0.00895
Epoch: 21220, Training Loss: 0.01015
Epoch: 21220, Training Loss: 0.00945
Epoch: 21220, Training Loss: 0.01159
Epoch: 21220, Training Loss: 0.00895
Epoch: 21221, Training Loss: 0.01015
Epoch: 21221, Training Loss: 0.00945
Epoch: 21221, Training Loss: 0.01159
Epoch: 21221, Training Loss: 0.00895
Epoch: 21222, Training Loss: 0.01015
Epoch: 21222, Training Loss: 0.00945
Epoch: 21222, Training Loss: 0.01159
Epoch: 21222, Training Loss: 0.00895
Epoch: 21223, Training Loss: 0.01015
Epoch: 21223, Training Loss: 0.00945
Epoch: 21223, Training Loss: 0.01159
Epoch: 21223, Training Loss: 0.00895
Epoch: 21224, Training Loss: 0.01015
Epoch: 21224, Training Loss: 0.00945
Epoch: 21224, Training Loss: 0.01159
Epoch: 21224, Training Loss: 0.00895
Epoch: 21225, Training Loss: 0.01015
Epoch: 21225, Training Loss: 0.00945
Epoch: 21225, Training Loss: 0.01159
Epoch: 21225, Training Loss: 0.00895
Epoch: 21226, Training Loss: 0.01015
Epoch: 21226, Training Loss: 0.00945
Epoch: 21226, Training Loss: 0.01159
Epoch: 21226, Training Loss: 0.00895
Epoch: 21227, Training Loss: 0.01015
Epoch: 21227, Training Loss: 0.00945
Epoch: 21227, Training Loss: 0.01159
Epoch: 21227, Training Loss: 0.00895
Epoch: 21228, Training Loss: 0.01015
Epoch: 21228, Training Loss: 0.00945
Epoch: 21228, Training Loss: 0.01159
Epoch: 21228, Training Loss: 0.00895
Epoch: 21229, Training Loss: 0.01015
Epoch: 21229, Training Loss: 0.00945
Epoch: 21229, Training Loss: 0.01159
Epoch: 21229, Training Loss: 0.00895
Epoch: 21230, Training Loss: 0.01015
Epoch: 21230, Training Loss: 0.00945
Epoch: 21230, Training Loss: 0.01159
Epoch: 21230, Training Loss: 0.00895
Epoch: 21231, Training Loss: 0.01015
Epoch: 21231, Training Loss: 0.00945
Epoch: 21231, Training Loss: 0.01159
Epoch: 21231, Training Loss: 0.00895
Epoch: 21232, Training Loss: 0.01015
Epoch: 21232, Training Loss: 0.00945
Epoch: 21232, Training Loss: 0.01159
Epoch: 21232, Training Loss: 0.00895
... [log truncated: epochs 21233–21476, four per-pattern losses printed each epoch]
Over this range the losses decrease only very slowly:
0.01015 → 0.01009, 0.00944 → 0.00939, 0.01159 → 0.01152, 0.00895 → 0.00889
Epoch: 21476, Training Loss: 0.00889
Epoch: 21477, Training Loss: 0.01009
Epoch: 21477, Training Loss: 0.00939
Epoch: 21477, Training Loss: 0.01151
Epoch: 21477, Training Loss: 0.00889
Epoch: 21478, Training Loss: 0.01009
Epoch: 21478, Training Loss: 0.00939
Epoch: 21478, Training Loss: 0.01151
Epoch: 21478, Training Loss: 0.00889
Epoch: 21479, Training Loss: 0.01009
Epoch: 21479, Training Loss: 0.00939
Epoch: 21479, Training Loss: 0.01151
Epoch: 21479, Training Loss: 0.00889
Epoch: 21480, Training Loss: 0.01009
Epoch: 21480, Training Loss: 0.00939
Epoch: 21480, Training Loss: 0.01151
Epoch: 21480, Training Loss: 0.00889
Epoch: 21481, Training Loss: 0.01009
Epoch: 21481, Training Loss: 0.00939
Epoch: 21481, Training Loss: 0.01151
Epoch: 21481, Training Loss: 0.00889
Epoch: 21482, Training Loss: 0.01008
Epoch: 21482, Training Loss: 0.00939
Epoch: 21482, Training Loss: 0.01151
Epoch: 21482, Training Loss: 0.00889
Epoch: 21483, Training Loss: 0.01008
Epoch: 21483, Training Loss: 0.00939
Epoch: 21483, Training Loss: 0.01151
Epoch: 21483, Training Loss: 0.00889
Epoch: 21484, Training Loss: 0.01008
Epoch: 21484, Training Loss: 0.00939
Epoch: 21484, Training Loss: 0.01151
Epoch: 21484, Training Loss: 0.00889
Epoch: 21485, Training Loss: 0.01008
Epoch: 21485, Training Loss: 0.00939
Epoch: 21485, Training Loss: 0.01151
Epoch: 21485, Training Loss: 0.00889
Epoch: 21486, Training Loss: 0.01008
Epoch: 21486, Training Loss: 0.00939
Epoch: 21486, Training Loss: 0.01151
Epoch: 21486, Training Loss: 0.00889
Epoch: 21487, Training Loss: 0.01008
Epoch: 21487, Training Loss: 0.00939
Epoch: 21487, Training Loss: 0.01151
Epoch: 21487, Training Loss: 0.00889
Epoch: 21488, Training Loss: 0.01008
Epoch: 21488, Training Loss: 0.00939
Epoch: 21488, Training Loss: 0.01151
Epoch: 21488, Training Loss: 0.00889
Epoch: 21489, Training Loss: 0.01008
Epoch: 21489, Training Loss: 0.00939
Epoch: 21489, Training Loss: 0.01151
Epoch: 21489, Training Loss: 0.00889
Epoch: 21490, Training Loss: 0.01008
Epoch: 21490, Training Loss: 0.00939
Epoch: 21490, Training Loss: 0.01151
Epoch: 21490, Training Loss: 0.00889
Epoch: 21491, Training Loss: 0.01008
Epoch: 21491, Training Loss: 0.00938
Epoch: 21491, Training Loss: 0.01151
Epoch: 21491, Training Loss: 0.00889
Epoch: 21492, Training Loss: 0.01008
Epoch: 21492, Training Loss: 0.00938
Epoch: 21492, Training Loss: 0.01151
Epoch: 21492, Training Loss: 0.00889
Epoch: 21493, Training Loss: 0.01008
Epoch: 21493, Training Loss: 0.00938
Epoch: 21493, Training Loss: 0.01151
Epoch: 21493, Training Loss: 0.00889
Epoch: 21494, Training Loss: 0.01008
Epoch: 21494, Training Loss: 0.00938
Epoch: 21494, Training Loss: 0.01151
Epoch: 21494, Training Loss: 0.00889
Epoch: 21495, Training Loss: 0.01008
Epoch: 21495, Training Loss: 0.00938
Epoch: 21495, Training Loss: 0.01151
Epoch: 21495, Training Loss: 0.00889
Epoch: 21496, Training Loss: 0.01008
Epoch: 21496, Training Loss: 0.00938
Epoch: 21496, Training Loss: 0.01151
Epoch: 21496, Training Loss: 0.00889
Epoch: 21497, Training Loss: 0.01008
Epoch: 21497, Training Loss: 0.00938
Epoch: 21497, Training Loss: 0.01151
Epoch: 21497, Training Loss: 0.00889
Epoch: 21498, Training Loss: 0.01008
Epoch: 21498, Training Loss: 0.00938
Epoch: 21498, Training Loss: 0.01151
Epoch: 21498, Training Loss: 0.00889
Epoch: 21499, Training Loss: 0.01008
Epoch: 21499, Training Loss: 0.00938
Epoch: 21499, Training Loss: 0.01151
Epoch: 21499, Training Loss: 0.00889
Epoch: 21500, Training Loss: 0.01008
Epoch: 21500, Training Loss: 0.00938
Epoch: 21500, Training Loss: 0.01151
Epoch: 21500, Training Loss: 0.00889
Epoch: 21501, Training Loss: 0.01008
Epoch: 21501, Training Loss: 0.00938
Epoch: 21501, Training Loss: 0.01151
Epoch: 21501, Training Loss: 0.00889
Epoch: 21502, Training Loss: 0.01008
Epoch: 21502, Training Loss: 0.00938
Epoch: 21502, Training Loss: 0.01151
Epoch: 21502, Training Loss: 0.00889
Epoch: 21503, Training Loss: 0.01008
Epoch: 21503, Training Loss: 0.00938
Epoch: 21503, Training Loss: 0.01151
Epoch: 21503, Training Loss: 0.00889
Epoch: 21504, Training Loss: 0.01008
Epoch: 21504, Training Loss: 0.00938
Epoch: 21504, Training Loss: 0.01151
Epoch: 21504, Training Loss: 0.00889
Epoch: 21505, Training Loss: 0.01008
Epoch: 21505, Training Loss: 0.00938
Epoch: 21505, Training Loss: 0.01151
Epoch: 21505, Training Loss: 0.00889
Epoch: 21506, Training Loss: 0.01008
Epoch: 21506, Training Loss: 0.00938
Epoch: 21506, Training Loss: 0.01151
Epoch: 21506, Training Loss: 0.00889
Epoch: 21507, Training Loss: 0.01008
Epoch: 21507, Training Loss: 0.00938
Epoch: 21507, Training Loss: 0.01151
Epoch: 21507, Training Loss: 0.00889
Epoch: 21508, Training Loss: 0.01008
Epoch: 21508, Training Loss: 0.00938
Epoch: 21508, Training Loss: 0.01151
Epoch: 21508, Training Loss: 0.00889
Epoch: 21509, Training Loss: 0.01008
Epoch: 21509, Training Loss: 0.00938
Epoch: 21509, Training Loss: 0.01151
Epoch: 21509, Training Loss: 0.00889
Epoch: 21510, Training Loss: 0.01008
Epoch: 21510, Training Loss: 0.00938
Epoch: 21510, Training Loss: 0.01151
Epoch: 21510, Training Loss: 0.00889
Epoch: 21511, Training Loss: 0.01008
Epoch: 21511, Training Loss: 0.00938
Epoch: 21511, Training Loss: 0.01151
Epoch: 21511, Training Loss: 0.00889
Epoch: 21512, Training Loss: 0.01008
Epoch: 21512, Training Loss: 0.00938
Epoch: 21512, Training Loss: 0.01150
Epoch: 21512, Training Loss: 0.00889
Epoch: 21513, Training Loss: 0.01008
Epoch: 21513, Training Loss: 0.00938
Epoch: 21513, Training Loss: 0.01150
Epoch: 21513, Training Loss: 0.00889
Epoch: 21514, Training Loss: 0.01008
Epoch: 21514, Training Loss: 0.00938
Epoch: 21514, Training Loss: 0.01150
Epoch: 21514, Training Loss: 0.00889
Epoch: 21515, Training Loss: 0.01008
Epoch: 21515, Training Loss: 0.00938
Epoch: 21515, Training Loss: 0.01150
Epoch: 21515, Training Loss: 0.00889
Epoch: 21516, Training Loss: 0.01008
Epoch: 21516, Training Loss: 0.00938
Epoch: 21516, Training Loss: 0.01150
Epoch: 21516, Training Loss: 0.00889
Epoch: 21517, Training Loss: 0.01008
Epoch: 21517, Training Loss: 0.00938
Epoch: 21517, Training Loss: 0.01150
Epoch: 21517, Training Loss: 0.00889
Epoch: 21518, Training Loss: 0.01008
Epoch: 21518, Training Loss: 0.00938
Epoch: 21518, Training Loss: 0.01150
Epoch: 21518, Training Loss: 0.00889
Epoch: 21519, Training Loss: 0.01008
Epoch: 21519, Training Loss: 0.00938
Epoch: 21519, Training Loss: 0.01150
Epoch: 21519, Training Loss: 0.00889
Epoch: 21520, Training Loss: 0.01008
Epoch: 21520, Training Loss: 0.00938
Epoch: 21520, Training Loss: 0.01150
Epoch: 21520, Training Loss: 0.00889
Epoch: 21521, Training Loss: 0.01007
Epoch: 21521, Training Loss: 0.00938
Epoch: 21521, Training Loss: 0.01150
Epoch: 21521, Training Loss: 0.00888
Epoch: 21522, Training Loss: 0.01007
Epoch: 21522, Training Loss: 0.00938
Epoch: 21522, Training Loss: 0.01150
Epoch: 21522, Training Loss: 0.00888
Epoch: 21523, Training Loss: 0.01007
Epoch: 21523, Training Loss: 0.00938
Epoch: 21523, Training Loss: 0.01150
Epoch: 21523, Training Loss: 0.00888
Epoch: 21524, Training Loss: 0.01007
Epoch: 21524, Training Loss: 0.00938
Epoch: 21524, Training Loss: 0.01150
Epoch: 21524, Training Loss: 0.00888
Epoch: 21525, Training Loss: 0.01007
Epoch: 21525, Training Loss: 0.00938
Epoch: 21525, Training Loss: 0.01150
Epoch: 21525, Training Loss: 0.00888
Epoch: 21526, Training Loss: 0.01007
Epoch: 21526, Training Loss: 0.00938
Epoch: 21526, Training Loss: 0.01150
Epoch: 21526, Training Loss: 0.00888
Epoch: 21527, Training Loss: 0.01007
Epoch: 21527, Training Loss: 0.00938
Epoch: 21527, Training Loss: 0.01150
Epoch: 21527, Training Loss: 0.00888
Epoch: 21528, Training Loss: 0.01007
Epoch: 21528, Training Loss: 0.00938
Epoch: 21528, Training Loss: 0.01150
Epoch: 21528, Training Loss: 0.00888
Epoch: 21529, Training Loss: 0.01007
Epoch: 21529, Training Loss: 0.00938
Epoch: 21529, Training Loss: 0.01150
Epoch: 21529, Training Loss: 0.00888
Epoch: 21530, Training Loss: 0.01007
Epoch: 21530, Training Loss: 0.00938
Epoch: 21530, Training Loss: 0.01150
Epoch: 21530, Training Loss: 0.00888
Epoch: 21531, Training Loss: 0.01007
Epoch: 21531, Training Loss: 0.00938
Epoch: 21531, Training Loss: 0.01150
Epoch: 21531, Training Loss: 0.00888
Epoch: 21532, Training Loss: 0.01007
Epoch: 21532, Training Loss: 0.00938
Epoch: 21532, Training Loss: 0.01150
Epoch: 21532, Training Loss: 0.00888
Epoch: 21533, Training Loss: 0.01007
Epoch: 21533, Training Loss: 0.00938
Epoch: 21533, Training Loss: 0.01150
Epoch: 21533, Training Loss: 0.00888
Epoch: 21534, Training Loss: 0.01007
Epoch: 21534, Training Loss: 0.00937
Epoch: 21534, Training Loss: 0.01150
Epoch: 21534, Training Loss: 0.00888
Epoch: 21535, Training Loss: 0.01007
Epoch: 21535, Training Loss: 0.00937
Epoch: 21535, Training Loss: 0.01150
Epoch: 21535, Training Loss: 0.00888
Epoch: 21536, Training Loss: 0.01007
Epoch: 21536, Training Loss: 0.00937
Epoch: 21536, Training Loss: 0.01150
Epoch: 21536, Training Loss: 0.00888
Epoch: 21537, Training Loss: 0.01007
Epoch: 21537, Training Loss: 0.00937
Epoch: 21537, Training Loss: 0.01150
Epoch: 21537, Training Loss: 0.00888
Epoch: 21538, Training Loss: 0.01007
Epoch: 21538, Training Loss: 0.00937
Epoch: 21538, Training Loss: 0.01150
Epoch: 21538, Training Loss: 0.00888
Epoch: 21539, Training Loss: 0.01007
Epoch: 21539, Training Loss: 0.00937
Epoch: 21539, Training Loss: 0.01150
Epoch: 21539, Training Loss: 0.00888
Epoch: 21540, Training Loss: 0.01007
Epoch: 21540, Training Loss: 0.00937
Epoch: 21540, Training Loss: 0.01150
Epoch: 21540, Training Loss: 0.00888
Epoch: 21541, Training Loss: 0.01007
Epoch: 21541, Training Loss: 0.00937
Epoch: 21541, Training Loss: 0.01150
Epoch: 21541, Training Loss: 0.00888
Epoch: 21542, Training Loss: 0.01007
Epoch: 21542, Training Loss: 0.00937
Epoch: 21542, Training Loss: 0.01150
Epoch: 21542, Training Loss: 0.00888
Epoch: 21543, Training Loss: 0.01007
Epoch: 21543, Training Loss: 0.00937
Epoch: 21543, Training Loss: 0.01150
Epoch: 21543, Training Loss: 0.00888
Epoch: 21544, Training Loss: 0.01007
Epoch: 21544, Training Loss: 0.00937
Epoch: 21544, Training Loss: 0.01150
Epoch: 21544, Training Loss: 0.00888
Epoch: 21545, Training Loss: 0.01007
Epoch: 21545, Training Loss: 0.00937
Epoch: 21545, Training Loss: 0.01150
Epoch: 21545, Training Loss: 0.00888
Epoch: 21546, Training Loss: 0.01007
Epoch: 21546, Training Loss: 0.00937
Epoch: 21546, Training Loss: 0.01149
Epoch: 21546, Training Loss: 0.00888
Epoch: 21547, Training Loss: 0.01007
Epoch: 21547, Training Loss: 0.00937
Epoch: 21547, Training Loss: 0.01149
Epoch: 21547, Training Loss: 0.00888
Epoch: 21548, Training Loss: 0.01007
Epoch: 21548, Training Loss: 0.00937
Epoch: 21548, Training Loss: 0.01149
Epoch: 21548, Training Loss: 0.00888
Epoch: 21549, Training Loss: 0.01007
Epoch: 21549, Training Loss: 0.00937
Epoch: 21549, Training Loss: 0.01149
Epoch: 21549, Training Loss: 0.00888
Epoch: 21550, Training Loss: 0.01007
Epoch: 21550, Training Loss: 0.00937
Epoch: 21550, Training Loss: 0.01149
Epoch: 21550, Training Loss: 0.00888
Epoch: 21551, Training Loss: 0.01007
Epoch: 21551, Training Loss: 0.00937
Epoch: 21551, Training Loss: 0.01149
Epoch: 21551, Training Loss: 0.00888
Epoch: 21552, Training Loss: 0.01007
Epoch: 21552, Training Loss: 0.00937
Epoch: 21552, Training Loss: 0.01149
Epoch: 21552, Training Loss: 0.00888
Epoch: 21553, Training Loss: 0.01007
Epoch: 21553, Training Loss: 0.00937
Epoch: 21553, Training Loss: 0.01149
Epoch: 21553, Training Loss: 0.00888
Epoch: 21554, Training Loss: 0.01007
Epoch: 21554, Training Loss: 0.00937
Epoch: 21554, Training Loss: 0.01149
Epoch: 21554, Training Loss: 0.00888
Epoch: 21555, Training Loss: 0.01007
Epoch: 21555, Training Loss: 0.00937
Epoch: 21555, Training Loss: 0.01149
Epoch: 21555, Training Loss: 0.00888
Epoch: 21556, Training Loss: 0.01007
Epoch: 21556, Training Loss: 0.00937
Epoch: 21556, Training Loss: 0.01149
Epoch: 21556, Training Loss: 0.00888
Epoch: 21557, Training Loss: 0.01007
Epoch: 21557, Training Loss: 0.00937
Epoch: 21557, Training Loss: 0.01149
Epoch: 21557, Training Loss: 0.00888
Epoch: 21558, Training Loss: 0.01007
Epoch: 21558, Training Loss: 0.00937
Epoch: 21558, Training Loss: 0.01149
Epoch: 21558, Training Loss: 0.00888
Epoch: 21559, Training Loss: 0.01007
Epoch: 21559, Training Loss: 0.00937
Epoch: 21559, Training Loss: 0.01149
Epoch: 21559, Training Loss: 0.00888
Epoch: 21560, Training Loss: 0.01007
Epoch: 21560, Training Loss: 0.00937
Epoch: 21560, Training Loss: 0.01149
Epoch: 21560, Training Loss: 0.00888
Epoch: 21561, Training Loss: 0.01006
Epoch: 21561, Training Loss: 0.00937
Epoch: 21561, Training Loss: 0.01149
Epoch: 21561, Training Loss: 0.00888
Epoch: 21562, Training Loss: 0.01006
Epoch: 21562, Training Loss: 0.00937
Epoch: 21562, Training Loss: 0.01149
Epoch: 21562, Training Loss: 0.00888
Epoch: 21563, Training Loss: 0.01006
Epoch: 21563, Training Loss: 0.00937
Epoch: 21563, Training Loss: 0.01149
Epoch: 21563, Training Loss: 0.00888
Epoch: 21564, Training Loss: 0.01006
Epoch: 21564, Training Loss: 0.00937
Epoch: 21564, Training Loss: 0.01149
Epoch: 21564, Training Loss: 0.00888
Epoch: 21565, Training Loss: 0.01006
Epoch: 21565, Training Loss: 0.00937
Epoch: 21565, Training Loss: 0.01149
Epoch: 21565, Training Loss: 0.00888
Epoch: 21566, Training Loss: 0.01006
Epoch: 21566, Training Loss: 0.00937
Epoch: 21566, Training Loss: 0.01149
Epoch: 21566, Training Loss: 0.00887
Epoch: 21567, Training Loss: 0.01006
Epoch: 21567, Training Loss: 0.00937
Epoch: 21567, Training Loss: 0.01149
Epoch: 21567, Training Loss: 0.00887
Epoch: 21568, Training Loss: 0.01006
Epoch: 21568, Training Loss: 0.00937
Epoch: 21568, Training Loss: 0.01149
Epoch: 21568, Training Loss: 0.00887
Epoch: 21569, Training Loss: 0.01006
Epoch: 21569, Training Loss: 0.00937
Epoch: 21569, Training Loss: 0.01149
Epoch: 21569, Training Loss: 0.00887
Epoch: 21570, Training Loss: 0.01006
Epoch: 21570, Training Loss: 0.00937
Epoch: 21570, Training Loss: 0.01149
Epoch: 21570, Training Loss: 0.00887
Epoch: 21571, Training Loss: 0.01006
Epoch: 21571, Training Loss: 0.00937
Epoch: 21571, Training Loss: 0.01149
Epoch: 21571, Training Loss: 0.00887
Epoch: 21572, Training Loss: 0.01006
Epoch: 21572, Training Loss: 0.00937
Epoch: 21572, Training Loss: 0.01149
Epoch: 21572, Training Loss: 0.00887
Epoch: 21573, Training Loss: 0.01006
Epoch: 21573, Training Loss: 0.00937
Epoch: 21573, Training Loss: 0.01149
Epoch: 21573, Training Loss: 0.00887
Epoch: 21574, Training Loss: 0.01006
Epoch: 21574, Training Loss: 0.00937
Epoch: 21574, Training Loss: 0.01149
Epoch: 21574, Training Loss: 0.00887
Epoch: 21575, Training Loss: 0.01006
Epoch: 21575, Training Loss: 0.00937
Epoch: 21575, Training Loss: 0.01149
Epoch: 21575, Training Loss: 0.00887
Epoch: 21576, Training Loss: 0.01006
Epoch: 21576, Training Loss: 0.00937
Epoch: 21576, Training Loss: 0.01149
Epoch: 21576, Training Loss: 0.00887
Epoch: 21577, Training Loss: 0.01006
Epoch: 21577, Training Loss: 0.00937
Epoch: 21577, Training Loss: 0.01149
Epoch: 21577, Training Loss: 0.00887
Epoch: 21578, Training Loss: 0.01006
Epoch: 21578, Training Loss: 0.00936
Epoch: 21578, Training Loss: 0.01149
Epoch: 21578, Training Loss: 0.00887
Epoch: 21579, Training Loss: 0.01006
Epoch: 21579, Training Loss: 0.00936
Epoch: 21579, Training Loss: 0.01149
Epoch: 21579, Training Loss: 0.00887
Epoch: 21580, Training Loss: 0.01006
Epoch: 21580, Training Loss: 0.00936
Epoch: 21580, Training Loss: 0.01149
Epoch: 21580, Training Loss: 0.00887
Epoch: 21581, Training Loss: 0.01006
Epoch: 21581, Training Loss: 0.00936
Epoch: 21581, Training Loss: 0.01148
Epoch: 21581, Training Loss: 0.00887
Epoch: 21582, Training Loss: 0.01006
Epoch: 21582, Training Loss: 0.00936
Epoch: 21582, Training Loss: 0.01148
Epoch: 21582, Training Loss: 0.00887
Epoch: 21583, Training Loss: 0.01006
Epoch: 21583, Training Loss: 0.00936
Epoch: 21583, Training Loss: 0.01148
Epoch: 21583, Training Loss: 0.00887
Epoch: 21584, Training Loss: 0.01006
Epoch: 21584, Training Loss: 0.00936
Epoch: 21584, Training Loss: 0.01148
Epoch: 21584, Training Loss: 0.00887
Epoch: 21585, Training Loss: 0.01006
Epoch: 21585, Training Loss: 0.00936
Epoch: 21585, Training Loss: 0.01148
Epoch: 21585, Training Loss: 0.00887
Epoch: 21586, Training Loss: 0.01006
Epoch: 21586, Training Loss: 0.00936
Epoch: 21586, Training Loss: 0.01148
Epoch: 21586, Training Loss: 0.00887
Epoch: 21587, Training Loss: 0.01006
Epoch: 21587, Training Loss: 0.00936
Epoch: 21587, Training Loss: 0.01148
Epoch: 21587, Training Loss: 0.00887
Epoch: 21588, Training Loss: 0.01006
Epoch: 21588, Training Loss: 0.00936
Epoch: 21588, Training Loss: 0.01148
Epoch: 21588, Training Loss: 0.00887
Epoch: 21589, Training Loss: 0.01006
Epoch: 21589, Training Loss: 0.00936
Epoch: 21589, Training Loss: 0.01148
Epoch: 21589, Training Loss: 0.00887
Epoch: 21590, Training Loss: 0.01006
Epoch: 21590, Training Loss: 0.00936
Epoch: 21590, Training Loss: 0.01148
Epoch: 21590, Training Loss: 0.00887
Epoch: 21591, Training Loss: 0.01006
Epoch: 21591, Training Loss: 0.00936
Epoch: 21591, Training Loss: 0.01148
Epoch: 21591, Training Loss: 0.00887
Epoch: 21592, Training Loss: 0.01006
Epoch: 21592, Training Loss: 0.00936
Epoch: 21592, Training Loss: 0.01148
Epoch: 21592, Training Loss: 0.00887
Epoch: 21593, Training Loss: 0.01006
Epoch: 21593, Training Loss: 0.00936
Epoch: 21593, Training Loss: 0.01148
Epoch: 21593, Training Loss: 0.00887
Epoch: 21594, Training Loss: 0.01006
Epoch: 21594, Training Loss: 0.00936
Epoch: 21594, Training Loss: 0.01148
Epoch: 21594, Training Loss: 0.00887
Epoch: 21595, Training Loss: 0.01006
Epoch: 21595, Training Loss: 0.00936
Epoch: 21595, Training Loss: 0.01148
Epoch: 21595, Training Loss: 0.00887
Epoch: 21596, Training Loss: 0.01006
Epoch: 21596, Training Loss: 0.00936
Epoch: 21596, Training Loss: 0.01148
Epoch: 21596, Training Loss: 0.00887
Epoch: 21597, Training Loss: 0.01006
Epoch: 21597, Training Loss: 0.00936
Epoch: 21597, Training Loss: 0.01148
Epoch: 21597, Training Loss: 0.00887
Epoch: 21598, Training Loss: 0.01006
Epoch: 21598, Training Loss: 0.00936
Epoch: 21598, Training Loss: 0.01148
Epoch: 21598, Training Loss: 0.00887
Epoch: 21599, Training Loss: 0.01006
Epoch: 21599, Training Loss: 0.00936
Epoch: 21599, Training Loss: 0.01148
Epoch: 21599, Training Loss: 0.00887
Epoch: 21600, Training Loss: 0.01006
Epoch: 21600, Training Loss: 0.00936
Epoch: 21600, Training Loss: 0.01148
Epoch: 21600, Training Loss: 0.00887
Epoch: 21601, Training Loss: 0.01005
Epoch: 21601, Training Loss: 0.00936
Epoch: 21601, Training Loss: 0.01148
Epoch: 21601, Training Loss: 0.00887
Epoch: 21602, Training Loss: 0.01005
Epoch: 21602, Training Loss: 0.00936
Epoch: 21602, Training Loss: 0.01148
Epoch: 21602, Training Loss: 0.00887
Epoch: 21603, Training Loss: 0.01005
Epoch: 21603, Training Loss: 0.00936
Epoch: 21603, Training Loss: 0.01148
Epoch: 21603, Training Loss: 0.00887
Epoch: 21604, Training Loss: 0.01005
Epoch: 21604, Training Loss: 0.00936
Epoch: 21604, Training Loss: 0.01148
Epoch: 21604, Training Loss: 0.00887
Epoch: 21605, Training Loss: 0.01005
Epoch: 21605, Training Loss: 0.00936
Epoch: 21605, Training Loss: 0.01148
Epoch: 21605, Training Loss: 0.00887
Epoch: 21606, Training Loss: 0.01005
Epoch: 21606, Training Loss: 0.00936
Epoch: 21606, Training Loss: 0.01148
Epoch: 21606, Training Loss: 0.00887
Epoch: 21607, Training Loss: 0.01005
Epoch: 21607, Training Loss: 0.00936
Epoch: 21607, Training Loss: 0.01148
Epoch: 21607, Training Loss: 0.00887
Epoch: 21608, Training Loss: 0.01005
Epoch: 21608, Training Loss: 0.00936
Epoch: 21608, Training Loss: 0.01148
Epoch: 21608, Training Loss: 0.00887
Epoch: 21609, Training Loss: 0.01005
Epoch: 21609, Training Loss: 0.00936
Epoch: 21609, Training Loss: 0.01148
Epoch: 21609, Training Loss: 0.00887
Epoch: 21610, Training Loss: 0.01005
Epoch: 21610, Training Loss: 0.00936
Epoch: 21610, Training Loss: 0.01148
Epoch: 21610, Training Loss: 0.00887
Epoch: 21611, Training Loss: 0.01005
Epoch: 21611, Training Loss: 0.00936
Epoch: 21611, Training Loss: 0.01148
Epoch: 21611, Training Loss: 0.00887
Epoch: 21612, Training Loss: 0.01005
Epoch: 21612, Training Loss: 0.00936
Epoch: 21612, Training Loss: 0.01148
Epoch: 21612, Training Loss: 0.00886
Epoch: 21613, Training Loss: 0.01005
Epoch: 21613, Training Loss: 0.00936
Epoch: 21613, Training Loss: 0.01148
Epoch: 21613, Training Loss: 0.00886
Epoch: 21614, Training Loss: 0.01005
Epoch: 21614, Training Loss: 0.00936
Epoch: 21614, Training Loss: 0.01148
Epoch: 21614, Training Loss: 0.00886
Epoch: 21615, Training Loss: 0.01005
Epoch: 21615, Training Loss: 0.00936
Epoch: 21615, Training Loss: 0.01148
Epoch: 21615, Training Loss: 0.00886
Epoch: 21616, Training Loss: 0.01005
Epoch: 21616, Training Loss: 0.00936
Epoch: 21616, Training Loss: 0.01147
Epoch: 21616, Training Loss: 0.00886
Epoch: 21617, Training Loss: 0.01005
Epoch: 21617, Training Loss: 0.00936
Epoch: 21617, Training Loss: 0.01147
Epoch: 21617, Training Loss: 0.00886
Epoch: 21618, Training Loss: 0.01005
Epoch: 21618, Training Loss: 0.00936
Epoch: 21618, Training Loss: 0.01147
Epoch: 21618, Training Loss: 0.00886
Epoch: 21619, Training Loss: 0.01005
Epoch: 21619, Training Loss: 0.00936
Epoch: 21619, Training Loss: 0.01147
Epoch: 21619, Training Loss: 0.00886
Epoch: 21620, Training Loss: 0.01005
Epoch: 21620, Training Loss: 0.00936
Epoch: 21620, Training Loss: 0.01147
Epoch: 21620, Training Loss: 0.00886
Epoch: 21621, Training Loss: 0.01005
Epoch: 21621, Training Loss: 0.00935
Epoch: 21621, Training Loss: 0.01147
Epoch: 21621, Training Loss: 0.00886
Epoch: 21622, Training Loss: 0.01005
Epoch: 21622, Training Loss: 0.00935
Epoch: 21622, Training Loss: 0.01147
Epoch: 21622, Training Loss: 0.00886
Epoch: 21623, Training Loss: 0.01005
Epoch: 21623, Training Loss: 0.00935
Epoch: 21623, Training Loss: 0.01147
Epoch: 21623, Training Loss: 0.00886
Epoch: 21624, Training Loss: 0.01005
Epoch: 21624, Training Loss: 0.00935
Epoch: 21624, Training Loss: 0.01147
Epoch: 21624, Training Loss: 0.00886
Epoch: 21625, Training Loss: 0.01005
Epoch: 21625, Training Loss: 0.00935
Epoch: 21625, Training Loss: 0.01147
Epoch: 21625, Training Loss: 0.00886
Epoch: 21626, Training Loss: 0.01005
Epoch: 21626, Training Loss: 0.00935
Epoch: 21626, Training Loss: 0.01147
Epoch: 21626, Training Loss: 0.00886
Epoch: 21627, Training Loss: 0.01005
Epoch: 21627, Training Loss: 0.00935
Epoch: 21627, Training Loss: 0.01147
Epoch: 21627, Training Loss: 0.00886
Epoch: 21628, Training Loss: 0.01005
Epoch: 21628, Training Loss: 0.00935
Epoch: 21628, Training Loss: 0.01147
Epoch: 21628, Training Loss: 0.00886
Epoch: 21629, Training Loss: 0.01005
Epoch: 21629, Training Loss: 0.00935
Epoch: 21629, Training Loss: 0.01147
Epoch: 21629, Training Loss: 0.00886
Epoch: 21630, Training Loss: 0.01005
Epoch: 21630, Training Loss: 0.00935
Epoch: 21630, Training Loss: 0.01147
Epoch: 21630, Training Loss: 0.00886
Epoch: 21631, Training Loss: 0.01005
Epoch: 21631, Training Loss: 0.00935
Epoch: 21631, Training Loss: 0.01147
Epoch: 21631, Training Loss: 0.00886
Epoch: 21632, Training Loss: 0.01005
Epoch: 21632, Training Loss: 0.00935
Epoch: 21632, Training Loss: 0.01147
Epoch: 21632, Training Loss: 0.00886
Epoch: 21633, Training Loss: 0.01005
Epoch: 21633, Training Loss: 0.00935
Epoch: 21633, Training Loss: 0.01147
Epoch: 21633, Training Loss: 0.00886
Epoch: 21634, Training Loss: 0.01005
Epoch: 21634, Training Loss: 0.00935
Epoch: 21634, Training Loss: 0.01147
Epoch: 21634, Training Loss: 0.00886
Epoch: 21635, Training Loss: 0.01005
Epoch: 21635, Training Loss: 0.00935
Epoch: 21635, Training Loss: 0.01147
Epoch: 21635, Training Loss: 0.00886
Epoch: 21636, Training Loss: 0.01005
Epoch: 21636, Training Loss: 0.00935
Epoch: 21636, Training Loss: 0.01147
Epoch: 21636, Training Loss: 0.00886
Epoch: 21637, Training Loss: 0.01005
Epoch: 21637, Training Loss: 0.00935
Epoch: 21637, Training Loss: 0.01147
Epoch: 21637, Training Loss: 0.00886
Epoch: 21638, Training Loss: 0.01005
Epoch: 21638, Training Loss: 0.00935
Epoch: 21638, Training Loss: 0.01147
Epoch: 21638, Training Loss: 0.00886
Epoch: 21639, Training Loss: 0.01005
Epoch: 21639, Training Loss: 0.00935
Epoch: 21639, Training Loss: 0.01147
Epoch: 21639, Training Loss: 0.00886
Epoch: 21640, Training Loss: 0.01005
Epoch: 21640, Training Loss: 0.00935
Epoch: 21640, Training Loss: 0.01147
Epoch: 21640, Training Loss: 0.00886
Epoch: 21641, Training Loss: 0.01004
Epoch: 21641, Training Loss: 0.00935
Epoch: 21641, Training Loss: 0.01147
Epoch: 21641, Training Loss: 0.00886
Epoch: 21642, Training Loss: 0.01004
Epoch: 21642, Training Loss: 0.00935
Epoch: 21642, Training Loss: 0.01147
Epoch: 21642, Training Loss: 0.00886
Epoch: 21643, Training Loss: 0.01004
Epoch: 21643, Training Loss: 0.00935
Epoch: 21643, Training Loss: 0.01147
Epoch: 21643, Training Loss: 0.00886
Epoch: 21644, Training Loss: 0.01004
Epoch: 21644, Training Loss: 0.00935
Epoch: 21644, Training Loss: 0.01147
Epoch: 21644, Training Loss: 0.00886
Epoch: 21645, Training Loss: 0.01004
Epoch: 21645, Training Loss: 0.00935
Epoch: 21645, Training Loss: 0.01147
Epoch: 21645, Training Loss: 0.00886
Epoch: 21646, Training Loss: 0.01004
Epoch: 21646, Training Loss: 0.00935
Epoch: 21646, Training Loss: 0.01147
Epoch: 21646, Training Loss: 0.00886
Epoch: 21647, Training Loss: 0.01004
Epoch: 21647, Training Loss: 0.00935
Epoch: 21647, Training Loss: 0.01147
Epoch: 21647, Training Loss: 0.00886
Epoch: 21648, Training Loss: 0.01004
Epoch: 21648, Training Loss: 0.00935
Epoch: 21648, Training Loss: 0.01147
Epoch: 21648, Training Loss: 0.00886
Epoch: 21649, Training Loss: 0.01004
Epoch: 21649, Training Loss: 0.00935
Epoch: 21649, Training Loss: 0.01147
Epoch: 21649, Training Loss: 0.00886
Epoch: 21650, Training Loss: 0.01004
Epoch: 21650, Training Loss: 0.00935
Epoch: 21650, Training Loss: 0.01147
Epoch: 21650, Training Loss: 0.00886
Epoch: 21651, Training Loss: 0.01004
Epoch: 21651, Training Loss: 0.00935
Epoch: 21651, Training Loss: 0.01146
Epoch: 21651, Training Loss: 0.00886
Epoch: 21652, Training Loss: 0.01004
Epoch: 21652, Training Loss: 0.00935
Epoch: 21652, Training Loss: 0.01146
Epoch: 21652, Training Loss: 0.00886
Epoch: 21653, Training Loss: 0.01004
Epoch: 21653, Training Loss: 0.00935
Epoch: 21653, Training Loss: 0.01146
Epoch: 21653, Training Loss: 0.00886
Epoch: 21654, Training Loss: 0.01004
Epoch: 21654, Training Loss: 0.00935
Epoch: 21654, Training Loss: 0.01146
Epoch: 21654, Training Loss: 0.00886
Epoch: 21655, Training Loss: 0.01004
Epoch: 21655, Training Loss: 0.00935
Epoch: 21655, Training Loss: 0.01146
Epoch: 21655, Training Loss: 0.00886
Epoch: 21656, Training Loss: 0.01004
Epoch: 21656, Training Loss: 0.00935
Epoch: 21656, Training Loss: 0.01146
Epoch: 21656, Training Loss: 0.00886
Epoch: 21657, Training Loss: 0.01004
Epoch: 21657, Training Loss: 0.00935
Epoch: 21657, Training Loss: 0.01146
Epoch: 21657, Training Loss: 0.00886
Epoch: 21658, Training Loss: 0.01004
Epoch: 21658, Training Loss: 0.00935
Epoch: 21658, Training Loss: 0.01146
Epoch: 21658, Training Loss: 0.00885
Epoch: 21659, Training Loss: 0.01004
Epoch: 21659, Training Loss: 0.00935
Epoch: 21659, Training Loss: 0.01146
Epoch: 21659, Training Loss: 0.00885
Epoch: 21660, Training Loss: 0.01004
Epoch: 21660, Training Loss: 0.00935
Epoch: 21660, Training Loss: 0.01146
Epoch: 21660, Training Loss: 0.00885
Epoch: 21661, Training Loss: 0.01004
Epoch: 21661, Training Loss: 0.00935
Epoch: 21661, Training Loss: 0.01146
Epoch: 21661, Training Loss: 0.00885
Epoch: 21662, Training Loss: 0.01004
Epoch: 21662, Training Loss: 0.00935
Epoch: 21662, Training Loss: 0.01146
Epoch: 21662, Training Loss: 0.00885
Epoch: 21663, Training Loss: 0.01004
Epoch: 21663, Training Loss: 0.00935
Epoch: 21663, Training Loss: 0.01146
Epoch: 21663, Training Loss: 0.00885
Epoch: 21664, Training Loss: 0.01004
Epoch: 21664, Training Loss: 0.00935
Epoch: 21664, Training Loss: 0.01146
Epoch: 21664, Training Loss: 0.00885
Epoch: 21665, Training Loss: 0.01004
Epoch: 21665, Training Loss: 0.00934
Epoch: 21665, Training Loss: 0.01146
Epoch: 21665, Training Loss: 0.00885
... [output truncated: the loss for each of the four XOR patterns is printed every epoch; across epochs 21665-21909 the four per-pattern losses decrease only marginally, from roughly 0.01146 / 0.01004 / 0.00934 / 0.00885 to roughly 0.01140 / 0.00998 / 0.00929 / 0.00880, illustrating very slow convergence toward the 0.008 error target] ...
Epoch: 21909, Training Loss: 0.00998
Epoch: 21909, Training Loss: 0.00929
Epoch: 21909, Training Loss: 0.01139
Epoch: 21909, Training Loss: 0.00880
Epoch: 21910, Training Loss: 0.00998
Epoch: 21910, Training Loss: 0.00929
Epoch: 21910, Training Loss: 0.01139
Epoch: 21910, Training Loss: 0.00880
Epoch: 21911, Training Loss: 0.00998
Epoch: 21911, Training Loss: 0.00929
Epoch: 21911, Training Loss: 0.01139
Epoch: 21911, Training Loss: 0.00880
Epoch: 21912, Training Loss: 0.00998
Epoch: 21912, Training Loss: 0.00929
Epoch: 21912, Training Loss: 0.01139
Epoch: 21912, Training Loss: 0.00880
Epoch: 21913, Training Loss: 0.00998
Epoch: 21913, Training Loss: 0.00929
Epoch: 21913, Training Loss: 0.01139
Epoch: 21913, Training Loss: 0.00880
Epoch: 21914, Training Loss: 0.00998
Epoch: 21914, Training Loss: 0.00929
Epoch: 21914, Training Loss: 0.01139
Epoch: 21914, Training Loss: 0.00880
Epoch: 21915, Training Loss: 0.00998
Epoch: 21915, Training Loss: 0.00929
Epoch: 21915, Training Loss: 0.01139
Epoch: 21915, Training Loss: 0.00880
Epoch: 21916, Training Loss: 0.00998
Epoch: 21916, Training Loss: 0.00929
Epoch: 21916, Training Loss: 0.01139
Epoch: 21916, Training Loss: 0.00880
Epoch: 21917, Training Loss: 0.00998
Epoch: 21917, Training Loss: 0.00929
Epoch: 21917, Training Loss: 0.01139
Epoch: 21917, Training Loss: 0.00880
Epoch: 21918, Training Loss: 0.00998
Epoch: 21918, Training Loss: 0.00929
Epoch: 21918, Training Loss: 0.01139
Epoch: 21918, Training Loss: 0.00880
Epoch: 21919, Training Loss: 0.00998
Epoch: 21919, Training Loss: 0.00929
Epoch: 21919, Training Loss: 0.01139
Epoch: 21919, Training Loss: 0.00880
Epoch: 21920, Training Loss: 0.00998
Epoch: 21920, Training Loss: 0.00929
Epoch: 21920, Training Loss: 0.01139
Epoch: 21920, Training Loss: 0.00880
Epoch: 21921, Training Loss: 0.00998
Epoch: 21921, Training Loss: 0.00929
Epoch: 21921, Training Loss: 0.01139
Epoch: 21921, Training Loss: 0.00880
Epoch: 21922, Training Loss: 0.00998
Epoch: 21922, Training Loss: 0.00929
Epoch: 21922, Training Loss: 0.01139
Epoch: 21922, Training Loss: 0.00880
Epoch: 21923, Training Loss: 0.00998
Epoch: 21923, Training Loss: 0.00929
Epoch: 21923, Training Loss: 0.01139
Epoch: 21923, Training Loss: 0.00880
Epoch: 21924, Training Loss: 0.00997
Epoch: 21924, Training Loss: 0.00929
Epoch: 21924, Training Loss: 0.01139
Epoch: 21924, Training Loss: 0.00880
Epoch: 21925, Training Loss: 0.00997
Epoch: 21925, Training Loss: 0.00929
Epoch: 21925, Training Loss: 0.01139
Epoch: 21925, Training Loss: 0.00880
Epoch: 21926, Training Loss: 0.00997
Epoch: 21926, Training Loss: 0.00929
Epoch: 21926, Training Loss: 0.01139
Epoch: 21926, Training Loss: 0.00880
Epoch: 21927, Training Loss: 0.00997
Epoch: 21927, Training Loss: 0.00929
Epoch: 21927, Training Loss: 0.01139
Epoch: 21927, Training Loss: 0.00880
Epoch: 21928, Training Loss: 0.00997
Epoch: 21928, Training Loss: 0.00929
Epoch: 21928, Training Loss: 0.01139
Epoch: 21928, Training Loss: 0.00880
Epoch: 21929, Training Loss: 0.00997
Epoch: 21929, Training Loss: 0.00929
Epoch: 21929, Training Loss: 0.01139
Epoch: 21929, Training Loss: 0.00880
Epoch: 21930, Training Loss: 0.00997
Epoch: 21930, Training Loss: 0.00929
Epoch: 21930, Training Loss: 0.01139
Epoch: 21930, Training Loss: 0.00880
Epoch: 21931, Training Loss: 0.00997
Epoch: 21931, Training Loss: 0.00928
Epoch: 21931, Training Loss: 0.01139
Epoch: 21931, Training Loss: 0.00880
Epoch: 21932, Training Loss: 0.00997
Epoch: 21932, Training Loss: 0.00928
Epoch: 21932, Training Loss: 0.01139
Epoch: 21932, Training Loss: 0.00880
Epoch: 21933, Training Loss: 0.00997
Epoch: 21933, Training Loss: 0.00928
Epoch: 21933, Training Loss: 0.01139
Epoch: 21933, Training Loss: 0.00880
Epoch: 21934, Training Loss: 0.00997
Epoch: 21934, Training Loss: 0.00928
Epoch: 21934, Training Loss: 0.01139
Epoch: 21934, Training Loss: 0.00880
Epoch: 21935, Training Loss: 0.00997
Epoch: 21935, Training Loss: 0.00928
Epoch: 21935, Training Loss: 0.01138
Epoch: 21935, Training Loss: 0.00880
Epoch: 21936, Training Loss: 0.00997
Epoch: 21936, Training Loss: 0.00928
Epoch: 21936, Training Loss: 0.01138
Epoch: 21936, Training Loss: 0.00880
Epoch: 21937, Training Loss: 0.00997
Epoch: 21937, Training Loss: 0.00928
Epoch: 21937, Training Loss: 0.01138
Epoch: 21937, Training Loss: 0.00880
Epoch: 21938, Training Loss: 0.00997
Epoch: 21938, Training Loss: 0.00928
Epoch: 21938, Training Loss: 0.01138
Epoch: 21938, Training Loss: 0.00879
Epoch: 21939, Training Loss: 0.00997
Epoch: 21939, Training Loss: 0.00928
Epoch: 21939, Training Loss: 0.01138
Epoch: 21939, Training Loss: 0.00879
Epoch: 21940, Training Loss: 0.00997
Epoch: 21940, Training Loss: 0.00928
Epoch: 21940, Training Loss: 0.01138
Epoch: 21940, Training Loss: 0.00879
Epoch: 21941, Training Loss: 0.00997
Epoch: 21941, Training Loss: 0.00928
Epoch: 21941, Training Loss: 0.01138
Epoch: 21941, Training Loss: 0.00879
Epoch: 21942, Training Loss: 0.00997
Epoch: 21942, Training Loss: 0.00928
Epoch: 21942, Training Loss: 0.01138
Epoch: 21942, Training Loss: 0.00879
Epoch: 21943, Training Loss: 0.00997
Epoch: 21943, Training Loss: 0.00928
Epoch: 21943, Training Loss: 0.01138
Epoch: 21943, Training Loss: 0.00879
Epoch: 21944, Training Loss: 0.00997
Epoch: 21944, Training Loss: 0.00928
Epoch: 21944, Training Loss: 0.01138
Epoch: 21944, Training Loss: 0.00879
Epoch: 21945, Training Loss: 0.00997
Epoch: 21945, Training Loss: 0.00928
Epoch: 21945, Training Loss: 0.01138
Epoch: 21945, Training Loss: 0.00879
Epoch: 21946, Training Loss: 0.00997
Epoch: 21946, Training Loss: 0.00928
Epoch: 21946, Training Loss: 0.01138
Epoch: 21946, Training Loss: 0.00879
Epoch: 21947, Training Loss: 0.00997
Epoch: 21947, Training Loss: 0.00928
Epoch: 21947, Training Loss: 0.01138
Epoch: 21947, Training Loss: 0.00879
Epoch: 21948, Training Loss: 0.00997
Epoch: 21948, Training Loss: 0.00928
Epoch: 21948, Training Loss: 0.01138
Epoch: 21948, Training Loss: 0.00879
Epoch: 21949, Training Loss: 0.00997
Epoch: 21949, Training Loss: 0.00928
Epoch: 21949, Training Loss: 0.01138
Epoch: 21949, Training Loss: 0.00879
Epoch: 21950, Training Loss: 0.00997
Epoch: 21950, Training Loss: 0.00928
Epoch: 21950, Training Loss: 0.01138
Epoch: 21950, Training Loss: 0.00879
Epoch: 21951, Training Loss: 0.00997
Epoch: 21951, Training Loss: 0.00928
Epoch: 21951, Training Loss: 0.01138
Epoch: 21951, Training Loss: 0.00879
Epoch: 21952, Training Loss: 0.00997
Epoch: 21952, Training Loss: 0.00928
Epoch: 21952, Training Loss: 0.01138
Epoch: 21952, Training Loss: 0.00879
Epoch: 21953, Training Loss: 0.00997
Epoch: 21953, Training Loss: 0.00928
Epoch: 21953, Training Loss: 0.01138
Epoch: 21953, Training Loss: 0.00879
Epoch: 21954, Training Loss: 0.00997
Epoch: 21954, Training Loss: 0.00928
Epoch: 21954, Training Loss: 0.01138
Epoch: 21954, Training Loss: 0.00879
Epoch: 21955, Training Loss: 0.00997
Epoch: 21955, Training Loss: 0.00928
Epoch: 21955, Training Loss: 0.01138
Epoch: 21955, Training Loss: 0.00879
Epoch: 21956, Training Loss: 0.00997
Epoch: 21956, Training Loss: 0.00928
Epoch: 21956, Training Loss: 0.01138
Epoch: 21956, Training Loss: 0.00879
Epoch: 21957, Training Loss: 0.00997
Epoch: 21957, Training Loss: 0.00928
Epoch: 21957, Training Loss: 0.01138
Epoch: 21957, Training Loss: 0.00879
Epoch: 21958, Training Loss: 0.00997
Epoch: 21958, Training Loss: 0.00928
Epoch: 21958, Training Loss: 0.01138
Epoch: 21958, Training Loss: 0.00879
Epoch: 21959, Training Loss: 0.00997
Epoch: 21959, Training Loss: 0.00928
Epoch: 21959, Training Loss: 0.01138
Epoch: 21959, Training Loss: 0.00879
Epoch: 21960, Training Loss: 0.00997
Epoch: 21960, Training Loss: 0.00928
Epoch: 21960, Training Loss: 0.01138
Epoch: 21960, Training Loss: 0.00879
Epoch: 21961, Training Loss: 0.00997
Epoch: 21961, Training Loss: 0.00928
Epoch: 21961, Training Loss: 0.01138
Epoch: 21961, Training Loss: 0.00879
Epoch: 21962, Training Loss: 0.00997
Epoch: 21962, Training Loss: 0.00928
Epoch: 21962, Training Loss: 0.01138
Epoch: 21962, Training Loss: 0.00879
Epoch: 21963, Training Loss: 0.00997
Epoch: 21963, Training Loss: 0.00928
Epoch: 21963, Training Loss: 0.01138
Epoch: 21963, Training Loss: 0.00879
Epoch: 21964, Training Loss: 0.00996
Epoch: 21964, Training Loss: 0.00928
Epoch: 21964, Training Loss: 0.01138
Epoch: 21964, Training Loss: 0.00879
Epoch: 21965, Training Loss: 0.00996
Epoch: 21965, Training Loss: 0.00928
Epoch: 21965, Training Loss: 0.01138
Epoch: 21965, Training Loss: 0.00879
Epoch: 21966, Training Loss: 0.00996
Epoch: 21966, Training Loss: 0.00928
Epoch: 21966, Training Loss: 0.01138
Epoch: 21966, Training Loss: 0.00879
Epoch: 21967, Training Loss: 0.00996
Epoch: 21967, Training Loss: 0.00928
Epoch: 21967, Training Loss: 0.01138
Epoch: 21967, Training Loss: 0.00879
Epoch: 21968, Training Loss: 0.00996
Epoch: 21968, Training Loss: 0.00928
Epoch: 21968, Training Loss: 0.01138
Epoch: 21968, Training Loss: 0.00879
Epoch: 21969, Training Loss: 0.00996
Epoch: 21969, Training Loss: 0.00928
Epoch: 21969, Training Loss: 0.01138
Epoch: 21969, Training Loss: 0.00879
Epoch: 21970, Training Loss: 0.00996
Epoch: 21970, Training Loss: 0.00928
Epoch: 21970, Training Loss: 0.01138
Epoch: 21970, Training Loss: 0.00879
Epoch: 21971, Training Loss: 0.00996
Epoch: 21971, Training Loss: 0.00928
Epoch: 21971, Training Loss: 0.01137
Epoch: 21971, Training Loss: 0.00879
Epoch: 21972, Training Loss: 0.00996
Epoch: 21972, Training Loss: 0.00928
Epoch: 21972, Training Loss: 0.01137
Epoch: 21972, Training Loss: 0.00879
Epoch: 21973, Training Loss: 0.00996
Epoch: 21973, Training Loss: 0.00928
Epoch: 21973, Training Loss: 0.01137
Epoch: 21973, Training Loss: 0.00879
Epoch: 21974, Training Loss: 0.00996
Epoch: 21974, Training Loss: 0.00928
Epoch: 21974, Training Loss: 0.01137
Epoch: 21974, Training Loss: 0.00879
Epoch: 21975, Training Loss: 0.00996
Epoch: 21975, Training Loss: 0.00928
Epoch: 21975, Training Loss: 0.01137
Epoch: 21975, Training Loss: 0.00879
Epoch: 21976, Training Loss: 0.00996
Epoch: 21976, Training Loss: 0.00927
Epoch: 21976, Training Loss: 0.01137
Epoch: 21976, Training Loss: 0.00879
Epoch: 21977, Training Loss: 0.00996
Epoch: 21977, Training Loss: 0.00927
Epoch: 21977, Training Loss: 0.01137
Epoch: 21977, Training Loss: 0.00879
Epoch: 21978, Training Loss: 0.00996
Epoch: 21978, Training Loss: 0.00927
Epoch: 21978, Training Loss: 0.01137
Epoch: 21978, Training Loss: 0.00879
Epoch: 21979, Training Loss: 0.00996
Epoch: 21979, Training Loss: 0.00927
Epoch: 21979, Training Loss: 0.01137
Epoch: 21979, Training Loss: 0.00879
Epoch: 21980, Training Loss: 0.00996
Epoch: 21980, Training Loss: 0.00927
Epoch: 21980, Training Loss: 0.01137
Epoch: 21980, Training Loss: 0.00879
Epoch: 21981, Training Loss: 0.00996
Epoch: 21981, Training Loss: 0.00927
Epoch: 21981, Training Loss: 0.01137
Epoch: 21981, Training Loss: 0.00879
Epoch: 21982, Training Loss: 0.00996
Epoch: 21982, Training Loss: 0.00927
Epoch: 21982, Training Loss: 0.01137
Epoch: 21982, Training Loss: 0.00879
Epoch: 21983, Training Loss: 0.00996
Epoch: 21983, Training Loss: 0.00927
Epoch: 21983, Training Loss: 0.01137
Epoch: 21983, Training Loss: 0.00879
Epoch: 21984, Training Loss: 0.00996
Epoch: 21984, Training Loss: 0.00927
Epoch: 21984, Training Loss: 0.01137
Epoch: 21984, Training Loss: 0.00879
Epoch: 21985, Training Loss: 0.00996
Epoch: 21985, Training Loss: 0.00927
Epoch: 21985, Training Loss: 0.01137
Epoch: 21985, Training Loss: 0.00878
Epoch: 21986, Training Loss: 0.00996
Epoch: 21986, Training Loss: 0.00927
Epoch: 21986, Training Loss: 0.01137
Epoch: 21986, Training Loss: 0.00878
Epoch: 21987, Training Loss: 0.00996
Epoch: 21987, Training Loss: 0.00927
Epoch: 21987, Training Loss: 0.01137
Epoch: 21987, Training Loss: 0.00878
Epoch: 21988, Training Loss: 0.00996
Epoch: 21988, Training Loss: 0.00927
Epoch: 21988, Training Loss: 0.01137
Epoch: 21988, Training Loss: 0.00878
Epoch: 21989, Training Loss: 0.00996
Epoch: 21989, Training Loss: 0.00927
Epoch: 21989, Training Loss: 0.01137
Epoch: 21989, Training Loss: 0.00878
Epoch: 21990, Training Loss: 0.00996
Epoch: 21990, Training Loss: 0.00927
Epoch: 21990, Training Loss: 0.01137
Epoch: 21990, Training Loss: 0.00878
Epoch: 21991, Training Loss: 0.00996
Epoch: 21991, Training Loss: 0.00927
Epoch: 21991, Training Loss: 0.01137
Epoch: 21991, Training Loss: 0.00878
Epoch: 21992, Training Loss: 0.00996
Epoch: 21992, Training Loss: 0.00927
Epoch: 21992, Training Loss: 0.01137
Epoch: 21992, Training Loss: 0.00878
Epoch: 21993, Training Loss: 0.00996
Epoch: 21993, Training Loss: 0.00927
Epoch: 21993, Training Loss: 0.01137
Epoch: 21993, Training Loss: 0.00878
Epoch: 21994, Training Loss: 0.00996
Epoch: 21994, Training Loss: 0.00927
Epoch: 21994, Training Loss: 0.01137
Epoch: 21994, Training Loss: 0.00878
Epoch: 21995, Training Loss: 0.00996
Epoch: 21995, Training Loss: 0.00927
Epoch: 21995, Training Loss: 0.01137
Epoch: 21995, Training Loss: 0.00878
Epoch: 21996, Training Loss: 0.00996
Epoch: 21996, Training Loss: 0.00927
Epoch: 21996, Training Loss: 0.01137
Epoch: 21996, Training Loss: 0.00878
Epoch: 21997, Training Loss: 0.00996
Epoch: 21997, Training Loss: 0.00927
Epoch: 21997, Training Loss: 0.01137
Epoch: 21997, Training Loss: 0.00878
Epoch: 21998, Training Loss: 0.00996
Epoch: 21998, Training Loss: 0.00927
Epoch: 21998, Training Loss: 0.01137
Epoch: 21998, Training Loss: 0.00878
Epoch: 21999, Training Loss: 0.00996
Epoch: 21999, Training Loss: 0.00927
Epoch: 21999, Training Loss: 0.01137
Epoch: 21999, Training Loss: 0.00878
Epoch: 22000, Training Loss: 0.00996
Epoch: 22000, Training Loss: 0.00927
Epoch: 22000, Training Loss: 0.01137
Epoch: 22000, Training Loss: 0.00878
Epoch: 22001, Training Loss: 0.00996
Epoch: 22001, Training Loss: 0.00927
Epoch: 22001, Training Loss: 0.01137
Epoch: 22001, Training Loss: 0.00878
Epoch: 22002, Training Loss: 0.00996
Epoch: 22002, Training Loss: 0.00927
Epoch: 22002, Training Loss: 0.01137
Epoch: 22002, Training Loss: 0.00878
Epoch: 22003, Training Loss: 0.00996
Epoch: 22003, Training Loss: 0.00927
Epoch: 22003, Training Loss: 0.01137
Epoch: 22003, Training Loss: 0.00878
Epoch: 22004, Training Loss: 0.00996
Epoch: 22004, Training Loss: 0.00927
Epoch: 22004, Training Loss: 0.01137
Epoch: 22004, Training Loss: 0.00878
Epoch: 22005, Training Loss: 0.00995
Epoch: 22005, Training Loss: 0.00927
Epoch: 22005, Training Loss: 0.01137
Epoch: 22005, Training Loss: 0.00878
Epoch: 22006, Training Loss: 0.00995
Epoch: 22006, Training Loss: 0.00927
Epoch: 22006, Training Loss: 0.01137
Epoch: 22006, Training Loss: 0.00878
Epoch: 22007, Training Loss: 0.00995
Epoch: 22007, Training Loss: 0.00927
Epoch: 22007, Training Loss: 0.01136
Epoch: 22007, Training Loss: 0.00878
Epoch: 22008, Training Loss: 0.00995
Epoch: 22008, Training Loss: 0.00927
Epoch: 22008, Training Loss: 0.01136
Epoch: 22008, Training Loss: 0.00878
Epoch: 22009, Training Loss: 0.00995
Epoch: 22009, Training Loss: 0.00927
Epoch: 22009, Training Loss: 0.01136
Epoch: 22009, Training Loss: 0.00878
Epoch: 22010, Training Loss: 0.00995
Epoch: 22010, Training Loss: 0.00927
Epoch: 22010, Training Loss: 0.01136
Epoch: 22010, Training Loss: 0.00878
Epoch: 22011, Training Loss: 0.00995
Epoch: 22011, Training Loss: 0.00927
Epoch: 22011, Training Loss: 0.01136
Epoch: 22011, Training Loss: 0.00878
Epoch: 22012, Training Loss: 0.00995
Epoch: 22012, Training Loss: 0.00927
Epoch: 22012, Training Loss: 0.01136
Epoch: 22012, Training Loss: 0.00878
Epoch: 22013, Training Loss: 0.00995
Epoch: 22013, Training Loss: 0.00927
Epoch: 22013, Training Loss: 0.01136
Epoch: 22013, Training Loss: 0.00878
Epoch: 22014, Training Loss: 0.00995
Epoch: 22014, Training Loss: 0.00927
Epoch: 22014, Training Loss: 0.01136
Epoch: 22014, Training Loss: 0.00878
Epoch: 22015, Training Loss: 0.00995
Epoch: 22015, Training Loss: 0.00927
Epoch: 22015, Training Loss: 0.01136
Epoch: 22015, Training Loss: 0.00878
Epoch: 22016, Training Loss: 0.00995
Epoch: 22016, Training Loss: 0.00927
Epoch: 22016, Training Loss: 0.01136
Epoch: 22016, Training Loss: 0.00878
Epoch: 22017, Training Loss: 0.00995
Epoch: 22017, Training Loss: 0.00927
Epoch: 22017, Training Loss: 0.01136
Epoch: 22017, Training Loss: 0.00878
Epoch: 22018, Training Loss: 0.00995
Epoch: 22018, Training Loss: 0.00927
Epoch: 22018, Training Loss: 0.01136
Epoch: 22018, Training Loss: 0.00878
Epoch: 22019, Training Loss: 0.00995
Epoch: 22019, Training Loss: 0.00927
Epoch: 22019, Training Loss: 0.01136
Epoch: 22019, Training Loss: 0.00878
Epoch: 22020, Training Loss: 0.00995
Epoch: 22020, Training Loss: 0.00927
Epoch: 22020, Training Loss: 0.01136
Epoch: 22020, Training Loss: 0.00878
Epoch: 22021, Training Loss: 0.00995
Epoch: 22021, Training Loss: 0.00926
Epoch: 22021, Training Loss: 0.01136
Epoch: 22021, Training Loss: 0.00878
Epoch: 22022, Training Loss: 0.00995
Epoch: 22022, Training Loss: 0.00926
Epoch: 22022, Training Loss: 0.01136
Epoch: 22022, Training Loss: 0.00878
Epoch: 22023, Training Loss: 0.00995
Epoch: 22023, Training Loss: 0.00926
Epoch: 22023, Training Loss: 0.01136
Epoch: 22023, Training Loss: 0.00878
Epoch: 22024, Training Loss: 0.00995
Epoch: 22024, Training Loss: 0.00926
Epoch: 22024, Training Loss: 0.01136
Epoch: 22024, Training Loss: 0.00878
Epoch: 22025, Training Loss: 0.00995
Epoch: 22025, Training Loss: 0.00926
Epoch: 22025, Training Loss: 0.01136
Epoch: 22025, Training Loss: 0.00878
Epoch: 22026, Training Loss: 0.00995
Epoch: 22026, Training Loss: 0.00926
Epoch: 22026, Training Loss: 0.01136
Epoch: 22026, Training Loss: 0.00878
Epoch: 22027, Training Loss: 0.00995
Epoch: 22027, Training Loss: 0.00926
Epoch: 22027, Training Loss: 0.01136
Epoch: 22027, Training Loss: 0.00878
Epoch: 22028, Training Loss: 0.00995
Epoch: 22028, Training Loss: 0.00926
Epoch: 22028, Training Loss: 0.01136
Epoch: 22028, Training Loss: 0.00878
Epoch: 22029, Training Loss: 0.00995
Epoch: 22029, Training Loss: 0.00926
Epoch: 22029, Training Loss: 0.01136
Epoch: 22029, Training Loss: 0.00878
Epoch: 22030, Training Loss: 0.00995
Epoch: 22030, Training Loss: 0.00926
Epoch: 22030, Training Loss: 0.01136
Epoch: 22030, Training Loss: 0.00878
Epoch: 22031, Training Loss: 0.00995
Epoch: 22031, Training Loss: 0.00926
Epoch: 22031, Training Loss: 0.01136
Epoch: 22031, Training Loss: 0.00878
Epoch: 22032, Training Loss: 0.00995
Epoch: 22032, Training Loss: 0.00926
Epoch: 22032, Training Loss: 0.01136
Epoch: 22032, Training Loss: 0.00877
Epoch: 22033, Training Loss: 0.00995
Epoch: 22033, Training Loss: 0.00926
Epoch: 22033, Training Loss: 0.01136
Epoch: 22033, Training Loss: 0.00877
Epoch: 22034, Training Loss: 0.00995
Epoch: 22034, Training Loss: 0.00926
Epoch: 22034, Training Loss: 0.01136
Epoch: 22034, Training Loss: 0.00877
Epoch: 22035, Training Loss: 0.00995
Epoch: 22035, Training Loss: 0.00926
Epoch: 22035, Training Loss: 0.01136
Epoch: 22035, Training Loss: 0.00877
Epoch: 22036, Training Loss: 0.00995
Epoch: 22036, Training Loss: 0.00926
Epoch: 22036, Training Loss: 0.01136
Epoch: 22036, Training Loss: 0.00877
Epoch: 22037, Training Loss: 0.00995
Epoch: 22037, Training Loss: 0.00926
Epoch: 22037, Training Loss: 0.01136
Epoch: 22037, Training Loss: 0.00877
Epoch: 22038, Training Loss: 0.00995
Epoch: 22038, Training Loss: 0.00926
Epoch: 22038, Training Loss: 0.01136
Epoch: 22038, Training Loss: 0.00877
Epoch: 22039, Training Loss: 0.00995
Epoch: 22039, Training Loss: 0.00926
Epoch: 22039, Training Loss: 0.01136
Epoch: 22039, Training Loss: 0.00877
Epoch: 22040, Training Loss: 0.00995
Epoch: 22040, Training Loss: 0.00926
Epoch: 22040, Training Loss: 0.01136
Epoch: 22040, Training Loss: 0.00877
Epoch: 22041, Training Loss: 0.00995
Epoch: 22041, Training Loss: 0.00926
Epoch: 22041, Training Loss: 0.01136
Epoch: 22041, Training Loss: 0.00877
Epoch: 22042, Training Loss: 0.00995
Epoch: 22042, Training Loss: 0.00926
Epoch: 22042, Training Loss: 0.01136
Epoch: 22042, Training Loss: 0.00877
Epoch: 22043, Training Loss: 0.00995
Epoch: 22043, Training Loss: 0.00926
Epoch: 22043, Training Loss: 0.01135
Epoch: 22043, Training Loss: 0.00877
Epoch: 22044, Training Loss: 0.00995
Epoch: 22044, Training Loss: 0.00926
Epoch: 22044, Training Loss: 0.01135
Epoch: 22044, Training Loss: 0.00877
Epoch: 22045, Training Loss: 0.00995
Epoch: 22045, Training Loss: 0.00926
Epoch: 22045, Training Loss: 0.01135
Epoch: 22045, Training Loss: 0.00877
Epoch: 22046, Training Loss: 0.00995
Epoch: 22046, Training Loss: 0.00926
Epoch: 22046, Training Loss: 0.01135
Epoch: 22046, Training Loss: 0.00877
Epoch: 22047, Training Loss: 0.00994
Epoch: 22047, Training Loss: 0.00926
Epoch: 22047, Training Loss: 0.01135
Epoch: 22047, Training Loss: 0.00877
Epoch: 22048, Training Loss: 0.00994
Epoch: 22048, Training Loss: 0.00926
Epoch: 22048, Training Loss: 0.01135
Epoch: 22048, Training Loss: 0.00877
Epoch: 22049, Training Loss: 0.00994
Epoch: 22049, Training Loss: 0.00926
Epoch: 22049, Training Loss: 0.01135
Epoch: 22049, Training Loss: 0.00877
Epoch: 22050, Training Loss: 0.00994
Epoch: 22050, Training Loss: 0.00926
Epoch: 22050, Training Loss: 0.01135
Epoch: 22050, Training Loss: 0.00877
Epoch: 22051, Training Loss: 0.00994
Epoch: 22051, Training Loss: 0.00926
Epoch: 22051, Training Loss: 0.01135
Epoch: 22051, Training Loss: 0.00877
Epoch: 22052, Training Loss: 0.00994
Epoch: 22052, Training Loss: 0.00926
Epoch: 22052, Training Loss: 0.01135
Epoch: 22052, Training Loss: 0.00877
Epoch: 22053, Training Loss: 0.00994
Epoch: 22053, Training Loss: 0.00926
Epoch: 22053, Training Loss: 0.01135
Epoch: 22053, Training Loss: 0.00877
Epoch: 22054, Training Loss: 0.00994
Epoch: 22054, Training Loss: 0.00926
Epoch: 22054, Training Loss: 0.01135
Epoch: 22054, Training Loss: 0.00877
Epoch: 22055, Training Loss: 0.00994
Epoch: 22055, Training Loss: 0.00926
Epoch: 22055, Training Loss: 0.01135
Epoch: 22055, Training Loss: 0.00877
Epoch: 22056, Training Loss: 0.00994
Epoch: 22056, Training Loss: 0.00926
Epoch: 22056, Training Loss: 0.01135
Epoch: 22056, Training Loss: 0.00877
Epoch: 22057, Training Loss: 0.00994
Epoch: 22057, Training Loss: 0.00926
Epoch: 22057, Training Loss: 0.01135
Epoch: 22057, Training Loss: 0.00877
Epoch: 22058, Training Loss: 0.00994
Epoch: 22058, Training Loss: 0.00926
Epoch: 22058, Training Loss: 0.01135
Epoch: 22058, Training Loss: 0.00877
Epoch: 22059, Training Loss: 0.00994
Epoch: 22059, Training Loss: 0.00926
Epoch: 22059, Training Loss: 0.01135
Epoch: 22059, Training Loss: 0.00877
Epoch: 22060, Training Loss: 0.00994
Epoch: 22060, Training Loss: 0.00926
Epoch: 22060, Training Loss: 0.01135
Epoch: 22060, Training Loss: 0.00877
Epoch: 22061, Training Loss: 0.00994
Epoch: 22061, Training Loss: 0.00926
Epoch: 22061, Training Loss: 0.01135
Epoch: 22061, Training Loss: 0.00877
Epoch: 22062, Training Loss: 0.00994
Epoch: 22062, Training Loss: 0.00926
Epoch: 22062, Training Loss: 0.01135
Epoch: 22062, Training Loss: 0.00877
Epoch: 22063, Training Loss: 0.00994
Epoch: 22063, Training Loss: 0.00926
Epoch: 22063, Training Loss: 0.01135
Epoch: 22063, Training Loss: 0.00877
Epoch: 22064, Training Loss: 0.00994
Epoch: 22064, Training Loss: 0.00926
Epoch: 22064, Training Loss: 0.01135
Epoch: 22064, Training Loss: 0.00877
Epoch: 22065, Training Loss: 0.00994
Epoch: 22065, Training Loss: 0.00926
Epoch: 22065, Training Loss: 0.01135
Epoch: 22065, Training Loss: 0.00877
Epoch: 22066, Training Loss: 0.00994
Epoch: 22066, Training Loss: 0.00925
Epoch: 22066, Training Loss: 0.01135
Epoch: 22066, Training Loss: 0.00877
Epoch: 22067, Training Loss: 0.00994
Epoch: 22067, Training Loss: 0.00925
Epoch: 22067, Training Loss: 0.01135
Epoch: 22067, Training Loss: 0.00877
Epoch: 22068, Training Loss: 0.00994
Epoch: 22068, Training Loss: 0.00925
Epoch: 22068, Training Loss: 0.01135
Epoch: 22068, Training Loss: 0.00877
Epoch: 22069, Training Loss: 0.00994
Epoch: 22069, Training Loss: 0.00925
Epoch: 22069, Training Loss: 0.01135
Epoch: 22069, Training Loss: 0.00877
Epoch: 22070, Training Loss: 0.00994
Epoch: 22070, Training Loss: 0.00925
Epoch: 22070, Training Loss: 0.01135
Epoch: 22070, Training Loss: 0.00877
Epoch: 22071, Training Loss: 0.00994
Epoch: 22071, Training Loss: 0.00925
Epoch: 22071, Training Loss: 0.01135
Epoch: 22071, Training Loss: 0.00877
Epoch: 22072, Training Loss: 0.00994
Epoch: 22072, Training Loss: 0.00925
Epoch: 22072, Training Loss: 0.01135
Epoch: 22072, Training Loss: 0.00877
Epoch: 22073, Training Loss: 0.00994
Epoch: 22073, Training Loss: 0.00925
Epoch: 22073, Training Loss: 0.01135
Epoch: 22073, Training Loss: 0.00877
Epoch: 22074, Training Loss: 0.00994
Epoch: 22074, Training Loss: 0.00925
Epoch: 22074, Training Loss: 0.01135
Epoch: 22074, Training Loss: 0.00877
Epoch: 22075, Training Loss: 0.00994
Epoch: 22075, Training Loss: 0.00925
Epoch: 22075, Training Loss: 0.01135
Epoch: 22075, Training Loss: 0.00877
Epoch: 22076, Training Loss: 0.00994
Epoch: 22076, Training Loss: 0.00925
Epoch: 22076, Training Loss: 0.01135
Epoch: 22076, Training Loss: 0.00877
Epoch: 22077, Training Loss: 0.00994
Epoch: 22077, Training Loss: 0.00925
Epoch: 22077, Training Loss: 0.01135
Epoch: 22077, Training Loss: 0.00877
Epoch: 22078, Training Loss: 0.00994
Epoch: 22078, Training Loss: 0.00925
Epoch: 22078, Training Loss: 0.01135
Epoch: 22078, Training Loss: 0.00877
Epoch: 22079, Training Loss: 0.00994
Epoch: 22079, Training Loss: 0.00925
Epoch: 22079, Training Loss: 0.01134
Epoch: 22079, Training Loss: 0.00877
Epoch: 22080, Training Loss: 0.00994
Epoch: 22080, Training Loss: 0.00925
Epoch: 22080, Training Loss: 0.01134
Epoch: 22080, Training Loss: 0.00876
Epoch: 22081, Training Loss: 0.00994
Epoch: 22081, Training Loss: 0.00925
Epoch: 22081, Training Loss: 0.01134
Epoch: 22081, Training Loss: 0.00876
Epoch: 22082, Training Loss: 0.00994
Epoch: 22082, Training Loss: 0.00925
Epoch: 22082, Training Loss: 0.01134
Epoch: 22082, Training Loss: 0.00876
Epoch: 22083, Training Loss: 0.00994
Epoch: 22083, Training Loss: 0.00925
Epoch: 22083, Training Loss: 0.01134
Epoch: 22083, Training Loss: 0.00876
Epoch: 22084, Training Loss: 0.00994
Epoch: 22084, Training Loss: 0.00925
Epoch: 22084, Training Loss: 0.01134
Epoch: 22084, Training Loss: 0.00876
Epoch: 22085, Training Loss: 0.00994
Epoch: 22085, Training Loss: 0.00925
Epoch: 22085, Training Loss: 0.01134
Epoch: 22085, Training Loss: 0.00876
Epoch: 22086, Training Loss: 0.00994
Epoch: 22086, Training Loss: 0.00925
Epoch: 22086, Training Loss: 0.01134
Epoch: 22086, Training Loss: 0.00876
Epoch: 22087, Training Loss: 0.00994
Epoch: 22087, Training Loss: 0.00925
Epoch: 22087, Training Loss: 0.01134
Epoch: 22087, Training Loss: 0.00876
Epoch: 22088, Training Loss: 0.00993
Epoch: 22088, Training Loss: 0.00925
Epoch: 22088, Training Loss: 0.01134
Epoch: 22088, Training Loss: 0.00876
Epoch: 22089, Training Loss: 0.00993
Epoch: 22089, Training Loss: 0.00925
Epoch: 22089, Training Loss: 0.01134
Epoch: 22089, Training Loss: 0.00876
Epoch: 22090, Training Loss: 0.00993
Epoch: 22090, Training Loss: 0.00925
Epoch: 22090, Training Loss: 0.01134
Epoch: 22090, Training Loss: 0.00876
Epoch: 22091, Training Loss: 0.00993
Epoch: 22091, Training Loss: 0.00925
Epoch: 22091, Training Loss: 0.01134
Epoch: 22091, Training Loss: 0.00876
Epoch: 22092, Training Loss: 0.00993
Epoch: 22092, Training Loss: 0.00925
Epoch: 22092, Training Loss: 0.01134
Epoch: 22092, Training Loss: 0.00876
Epoch: 22093, Training Loss: 0.00993
Epoch: 22093, Training Loss: 0.00925
Epoch: 22093, Training Loss: 0.01134
Epoch: 22093, Training Loss: 0.00876
Epoch: 22094, Training Loss: 0.00993
Epoch: 22094, Training Loss: 0.00925
Epoch: 22094, Training Loss: 0.01134
Epoch: 22094, Training Loss: 0.00876
Epoch: 22095, Training Loss: 0.00993
Epoch: 22095, Training Loss: 0.00925
Epoch: 22095, Training Loss: 0.01134
Epoch: 22095, Training Loss: 0.00876
Epoch: 22096, Training Loss: 0.00993
Epoch: 22096, Training Loss: 0.00925
Epoch: 22096, Training Loss: 0.01134
Epoch: 22096, Training Loss: 0.00876
Epoch: 22097, Training Loss: 0.00993
Epoch: 22097, Training Loss: 0.00925
Epoch: 22097, Training Loss: 0.01134
Epoch: 22097, Training Loss: 0.00876
Epoch: 22098, Training Loss: 0.00993
Epoch: 22098, Training Loss: 0.00925
Epoch: 22098, Training Loss: 0.01134
Epoch: 22098, Training Loss: 0.00876
...
[output truncated: the four per-sample losses decrease very slowly, from roughly (0.00993, 0.00925, 0.01134, 0.00876) at epoch 22098 to roughly (0.00987, 0.00919, 0.01127, 0.00871) around epoch 22340]
Epoch: 22341, Training Loss: 0.00871
Epoch: 22342, Training Loss: 0.00987
Epoch: 22342, Training Loss: 0.00919
Epoch: 22342, Training Loss: 0.01127
Epoch: 22342, Training Loss: 0.00871
Epoch: 22343, Training Loss: 0.00987
Epoch: 22343, Training Loss: 0.00919
Epoch: 22343, Training Loss: 0.01127
Epoch: 22343, Training Loss: 0.00871
Epoch: 22344, Training Loss: 0.00987
Epoch: 22344, Training Loss: 0.00919
Epoch: 22344, Training Loss: 0.01127
Epoch: 22344, Training Loss: 0.00871
Epoch: 22345, Training Loss: 0.00987
Epoch: 22345, Training Loss: 0.00919
Epoch: 22345, Training Loss: 0.01127
Epoch: 22345, Training Loss: 0.00871
Epoch: 22346, Training Loss: 0.00987
Epoch: 22346, Training Loss: 0.00919
Epoch: 22346, Training Loss: 0.01127
Epoch: 22346, Training Loss: 0.00871
Epoch: 22347, Training Loss: 0.00987
Epoch: 22347, Training Loss: 0.00919
Epoch: 22347, Training Loss: 0.01127
Epoch: 22347, Training Loss: 0.00871
Epoch: 22348, Training Loss: 0.00987
Epoch: 22348, Training Loss: 0.00919
Epoch: 22348, Training Loss: 0.01127
Epoch: 22348, Training Loss: 0.00871
Epoch: 22349, Training Loss: 0.00987
Epoch: 22349, Training Loss: 0.00919
Epoch: 22349, Training Loss: 0.01127
Epoch: 22349, Training Loss: 0.00871
Epoch: 22350, Training Loss: 0.00987
Epoch: 22350, Training Loss: 0.00919
Epoch: 22350, Training Loss: 0.01127
Epoch: 22350, Training Loss: 0.00871
Epoch: 22351, Training Loss: 0.00987
Epoch: 22351, Training Loss: 0.00919
Epoch: 22351, Training Loss: 0.01127
Epoch: 22351, Training Loss: 0.00871
Epoch: 22352, Training Loss: 0.00987
Epoch: 22352, Training Loss: 0.00919
Epoch: 22352, Training Loss: 0.01127
Epoch: 22352, Training Loss: 0.00871
Epoch: 22353, Training Loss: 0.00987
Epoch: 22353, Training Loss: 0.00919
Epoch: 22353, Training Loss: 0.01127
Epoch: 22353, Training Loss: 0.00871
Epoch: 22354, Training Loss: 0.00987
Epoch: 22354, Training Loss: 0.00919
Epoch: 22354, Training Loss: 0.01127
Epoch: 22354, Training Loss: 0.00871
Epoch: 22355, Training Loss: 0.00987
Epoch: 22355, Training Loss: 0.00919
Epoch: 22355, Training Loss: 0.01127
Epoch: 22355, Training Loss: 0.00871
Epoch: 22356, Training Loss: 0.00987
Epoch: 22356, Training Loss: 0.00919
Epoch: 22356, Training Loss: 0.01127
Epoch: 22356, Training Loss: 0.00871
Epoch: 22357, Training Loss: 0.00987
Epoch: 22357, Training Loss: 0.00919
Epoch: 22357, Training Loss: 0.01127
Epoch: 22357, Training Loss: 0.00871
Epoch: 22358, Training Loss: 0.00987
Epoch: 22358, Training Loss: 0.00919
Epoch: 22358, Training Loss: 0.01127
Epoch: 22358, Training Loss: 0.00871
Epoch: 22359, Training Loss: 0.00987
Epoch: 22359, Training Loss: 0.00919
Epoch: 22359, Training Loss: 0.01127
Epoch: 22359, Training Loss: 0.00871
Epoch: 22360, Training Loss: 0.00987
Epoch: 22360, Training Loss: 0.00919
Epoch: 22360, Training Loss: 0.01127
Epoch: 22360, Training Loss: 0.00871
Epoch: 22361, Training Loss: 0.00987
Epoch: 22361, Training Loss: 0.00919
Epoch: 22361, Training Loss: 0.01127
Epoch: 22361, Training Loss: 0.00871
Epoch: 22362, Training Loss: 0.00987
Epoch: 22362, Training Loss: 0.00919
Epoch: 22362, Training Loss: 0.01127
Epoch: 22362, Training Loss: 0.00871
Epoch: 22363, Training Loss: 0.00987
Epoch: 22363, Training Loss: 0.00919
Epoch: 22363, Training Loss: 0.01127
Epoch: 22363, Training Loss: 0.00871
Epoch: 22364, Training Loss: 0.00987
Epoch: 22364, Training Loss: 0.00919
Epoch: 22364, Training Loss: 0.01127
Epoch: 22364, Training Loss: 0.00871
Epoch: 22365, Training Loss: 0.00987
Epoch: 22365, Training Loss: 0.00919
Epoch: 22365, Training Loss: 0.01127
Epoch: 22365, Training Loss: 0.00871
Epoch: 22366, Training Loss: 0.00987
Epoch: 22366, Training Loss: 0.00919
Epoch: 22366, Training Loss: 0.01127
Epoch: 22366, Training Loss: 0.00871
Epoch: 22367, Training Loss: 0.00987
Epoch: 22367, Training Loss: 0.00919
Epoch: 22367, Training Loss: 0.01127
Epoch: 22367, Training Loss: 0.00871
Epoch: 22368, Training Loss: 0.00987
Epoch: 22368, Training Loss: 0.00919
Epoch: 22368, Training Loss: 0.01127
Epoch: 22368, Training Loss: 0.00870
Epoch: 22369, Training Loss: 0.00987
Epoch: 22369, Training Loss: 0.00919
Epoch: 22369, Training Loss: 0.01127
Epoch: 22369, Training Loss: 0.00870
Epoch: 22370, Training Loss: 0.00987
Epoch: 22370, Training Loss: 0.00919
Epoch: 22370, Training Loss: 0.01127
Epoch: 22370, Training Loss: 0.00870
Epoch: 22371, Training Loss: 0.00987
Epoch: 22371, Training Loss: 0.00919
Epoch: 22371, Training Loss: 0.01127
Epoch: 22371, Training Loss: 0.00870
Epoch: 22372, Training Loss: 0.00987
Epoch: 22372, Training Loss: 0.00919
Epoch: 22372, Training Loss: 0.01126
Epoch: 22372, Training Loss: 0.00870
Epoch: 22373, Training Loss: 0.00987
Epoch: 22373, Training Loss: 0.00919
Epoch: 22373, Training Loss: 0.01126
Epoch: 22373, Training Loss: 0.00870
Epoch: 22374, Training Loss: 0.00987
Epoch: 22374, Training Loss: 0.00919
Epoch: 22374, Training Loss: 0.01126
Epoch: 22374, Training Loss: 0.00870
Epoch: 22375, Training Loss: 0.00987
Epoch: 22375, Training Loss: 0.00919
Epoch: 22375, Training Loss: 0.01126
Epoch: 22375, Training Loss: 0.00870
Epoch: 22376, Training Loss: 0.00987
Epoch: 22376, Training Loss: 0.00919
Epoch: 22376, Training Loss: 0.01126
Epoch: 22376, Training Loss: 0.00870
Epoch: 22377, Training Loss: 0.00987
Epoch: 22377, Training Loss: 0.00919
Epoch: 22377, Training Loss: 0.01126
Epoch: 22377, Training Loss: 0.00870
Epoch: 22378, Training Loss: 0.00987
Epoch: 22378, Training Loss: 0.00919
Epoch: 22378, Training Loss: 0.01126
Epoch: 22378, Training Loss: 0.00870
Epoch: 22379, Training Loss: 0.00987
Epoch: 22379, Training Loss: 0.00919
Epoch: 22379, Training Loss: 0.01126
Epoch: 22379, Training Loss: 0.00870
Epoch: 22380, Training Loss: 0.00986
Epoch: 22380, Training Loss: 0.00919
Epoch: 22380, Training Loss: 0.01126
Epoch: 22380, Training Loss: 0.00870
Epoch: 22381, Training Loss: 0.00986
Epoch: 22381, Training Loss: 0.00919
Epoch: 22381, Training Loss: 0.01126
Epoch: 22381, Training Loss: 0.00870
Epoch: 22382, Training Loss: 0.00986
Epoch: 22382, Training Loss: 0.00919
Epoch: 22382, Training Loss: 0.01126
Epoch: 22382, Training Loss: 0.00870
Epoch: 22383, Training Loss: 0.00986
Epoch: 22383, Training Loss: 0.00919
Epoch: 22383, Training Loss: 0.01126
Epoch: 22383, Training Loss: 0.00870
Epoch: 22384, Training Loss: 0.00986
Epoch: 22384, Training Loss: 0.00919
Epoch: 22384, Training Loss: 0.01126
Epoch: 22384, Training Loss: 0.00870
Epoch: 22385, Training Loss: 0.00986
Epoch: 22385, Training Loss: 0.00918
Epoch: 22385, Training Loss: 0.01126
Epoch: 22385, Training Loss: 0.00870
Epoch: 22386, Training Loss: 0.00986
Epoch: 22386, Training Loss: 0.00918
Epoch: 22386, Training Loss: 0.01126
Epoch: 22386, Training Loss: 0.00870
Epoch: 22387, Training Loss: 0.00986
Epoch: 22387, Training Loss: 0.00918
Epoch: 22387, Training Loss: 0.01126
Epoch: 22387, Training Loss: 0.00870
Epoch: 22388, Training Loss: 0.00986
Epoch: 22388, Training Loss: 0.00918
Epoch: 22388, Training Loss: 0.01126
Epoch: 22388, Training Loss: 0.00870
Epoch: 22389, Training Loss: 0.00986
Epoch: 22389, Training Loss: 0.00918
Epoch: 22389, Training Loss: 0.01126
Epoch: 22389, Training Loss: 0.00870
Epoch: 22390, Training Loss: 0.00986
Epoch: 22390, Training Loss: 0.00918
Epoch: 22390, Training Loss: 0.01126
Epoch: 22390, Training Loss: 0.00870
Epoch: 22391, Training Loss: 0.00986
Epoch: 22391, Training Loss: 0.00918
Epoch: 22391, Training Loss: 0.01126
Epoch: 22391, Training Loss: 0.00870
Epoch: 22392, Training Loss: 0.00986
Epoch: 22392, Training Loss: 0.00918
Epoch: 22392, Training Loss: 0.01126
Epoch: 22392, Training Loss: 0.00870
Epoch: 22393, Training Loss: 0.00986
Epoch: 22393, Training Loss: 0.00918
Epoch: 22393, Training Loss: 0.01126
Epoch: 22393, Training Loss: 0.00870
Epoch: 22394, Training Loss: 0.00986
Epoch: 22394, Training Loss: 0.00918
Epoch: 22394, Training Loss: 0.01126
Epoch: 22394, Training Loss: 0.00870
Epoch: 22395, Training Loss: 0.00986
Epoch: 22395, Training Loss: 0.00918
Epoch: 22395, Training Loss: 0.01126
Epoch: 22395, Training Loss: 0.00870
Epoch: 22396, Training Loss: 0.00986
Epoch: 22396, Training Loss: 0.00918
Epoch: 22396, Training Loss: 0.01126
Epoch: 22396, Training Loss: 0.00870
Epoch: 22397, Training Loss: 0.00986
Epoch: 22397, Training Loss: 0.00918
Epoch: 22397, Training Loss: 0.01126
Epoch: 22397, Training Loss: 0.00870
Epoch: 22398, Training Loss: 0.00986
Epoch: 22398, Training Loss: 0.00918
Epoch: 22398, Training Loss: 0.01126
Epoch: 22398, Training Loss: 0.00870
Epoch: 22399, Training Loss: 0.00986
Epoch: 22399, Training Loss: 0.00918
Epoch: 22399, Training Loss: 0.01126
Epoch: 22399, Training Loss: 0.00870
Epoch: 22400, Training Loss: 0.00986
Epoch: 22400, Training Loss: 0.00918
Epoch: 22400, Training Loss: 0.01126
Epoch: 22400, Training Loss: 0.00870
Epoch: 22401, Training Loss: 0.00986
Epoch: 22401, Training Loss: 0.00918
Epoch: 22401, Training Loss: 0.01126
Epoch: 22401, Training Loss: 0.00870
Epoch: 22402, Training Loss: 0.00986
Epoch: 22402, Training Loss: 0.00918
Epoch: 22402, Training Loss: 0.01126
Epoch: 22402, Training Loss: 0.00870
Epoch: 22403, Training Loss: 0.00986
Epoch: 22403, Training Loss: 0.00918
Epoch: 22403, Training Loss: 0.01126
Epoch: 22403, Training Loss: 0.00870
Epoch: 22404, Training Loss: 0.00986
Epoch: 22404, Training Loss: 0.00918
Epoch: 22404, Training Loss: 0.01126
Epoch: 22404, Training Loss: 0.00870
Epoch: 22405, Training Loss: 0.00986
Epoch: 22405, Training Loss: 0.00918
Epoch: 22405, Training Loss: 0.01126
Epoch: 22405, Training Loss: 0.00870
Epoch: 22406, Training Loss: 0.00986
Epoch: 22406, Training Loss: 0.00918
Epoch: 22406, Training Loss: 0.01126
Epoch: 22406, Training Loss: 0.00870
Epoch: 22407, Training Loss: 0.00986
Epoch: 22407, Training Loss: 0.00918
Epoch: 22407, Training Loss: 0.01126
Epoch: 22407, Training Loss: 0.00870
Epoch: 22408, Training Loss: 0.00986
Epoch: 22408, Training Loss: 0.00918
Epoch: 22408, Training Loss: 0.01126
Epoch: 22408, Training Loss: 0.00870
Epoch: 22409, Training Loss: 0.00986
Epoch: 22409, Training Loss: 0.00918
Epoch: 22409, Training Loss: 0.01125
Epoch: 22409, Training Loss: 0.00870
Epoch: 22410, Training Loss: 0.00986
Epoch: 22410, Training Loss: 0.00918
Epoch: 22410, Training Loss: 0.01125
Epoch: 22410, Training Loss: 0.00870
Epoch: 22411, Training Loss: 0.00986
Epoch: 22411, Training Loss: 0.00918
Epoch: 22411, Training Loss: 0.01125
Epoch: 22411, Training Loss: 0.00870
Epoch: 22412, Training Loss: 0.00986
Epoch: 22412, Training Loss: 0.00918
Epoch: 22412, Training Loss: 0.01125
Epoch: 22412, Training Loss: 0.00870
Epoch: 22413, Training Loss: 0.00986
Epoch: 22413, Training Loss: 0.00918
Epoch: 22413, Training Loss: 0.01125
Epoch: 22413, Training Loss: 0.00870
Epoch: 22414, Training Loss: 0.00986
Epoch: 22414, Training Loss: 0.00918
Epoch: 22414, Training Loss: 0.01125
Epoch: 22414, Training Loss: 0.00870
Epoch: 22415, Training Loss: 0.00986
Epoch: 22415, Training Loss: 0.00918
Epoch: 22415, Training Loss: 0.01125
Epoch: 22415, Training Loss: 0.00870
Epoch: 22416, Training Loss: 0.00986
Epoch: 22416, Training Loss: 0.00918
Epoch: 22416, Training Loss: 0.01125
Epoch: 22416, Training Loss: 0.00870
Epoch: 22417, Training Loss: 0.00986
Epoch: 22417, Training Loss: 0.00918
Epoch: 22417, Training Loss: 0.01125
Epoch: 22417, Training Loss: 0.00869
Epoch: 22418, Training Loss: 0.00986
Epoch: 22418, Training Loss: 0.00918
Epoch: 22418, Training Loss: 0.01125
Epoch: 22418, Training Loss: 0.00869
Epoch: 22419, Training Loss: 0.00986
Epoch: 22419, Training Loss: 0.00918
Epoch: 22419, Training Loss: 0.01125
Epoch: 22419, Training Loss: 0.00869
Epoch: 22420, Training Loss: 0.00986
Epoch: 22420, Training Loss: 0.00918
Epoch: 22420, Training Loss: 0.01125
Epoch: 22420, Training Loss: 0.00869
Epoch: 22421, Training Loss: 0.00986
Epoch: 22421, Training Loss: 0.00918
Epoch: 22421, Training Loss: 0.01125
Epoch: 22421, Training Loss: 0.00869
Epoch: 22422, Training Loss: 0.00985
Epoch: 22422, Training Loss: 0.00918
Epoch: 22422, Training Loss: 0.01125
Epoch: 22422, Training Loss: 0.00869
Epoch: 22423, Training Loss: 0.00985
Epoch: 22423, Training Loss: 0.00918
Epoch: 22423, Training Loss: 0.01125
Epoch: 22423, Training Loss: 0.00869
Epoch: 22424, Training Loss: 0.00985
Epoch: 22424, Training Loss: 0.00918
Epoch: 22424, Training Loss: 0.01125
Epoch: 22424, Training Loss: 0.00869
Epoch: 22425, Training Loss: 0.00985
Epoch: 22425, Training Loss: 0.00918
Epoch: 22425, Training Loss: 0.01125
Epoch: 22425, Training Loss: 0.00869
Epoch: 22426, Training Loss: 0.00985
Epoch: 22426, Training Loss: 0.00918
Epoch: 22426, Training Loss: 0.01125
Epoch: 22426, Training Loss: 0.00869
Epoch: 22427, Training Loss: 0.00985
Epoch: 22427, Training Loss: 0.00918
Epoch: 22427, Training Loss: 0.01125
Epoch: 22427, Training Loss: 0.00869
Epoch: 22428, Training Loss: 0.00985
Epoch: 22428, Training Loss: 0.00918
Epoch: 22428, Training Loss: 0.01125
Epoch: 22428, Training Loss: 0.00869
Epoch: 22429, Training Loss: 0.00985
Epoch: 22429, Training Loss: 0.00918
Epoch: 22429, Training Loss: 0.01125
Epoch: 22429, Training Loss: 0.00869
Epoch: 22430, Training Loss: 0.00985
Epoch: 22430, Training Loss: 0.00918
Epoch: 22430, Training Loss: 0.01125
Epoch: 22430, Training Loss: 0.00869
Epoch: 22431, Training Loss: 0.00985
Epoch: 22431, Training Loss: 0.00918
Epoch: 22431, Training Loss: 0.01125
Epoch: 22431, Training Loss: 0.00869
Epoch: 22432, Training Loss: 0.00985
Epoch: 22432, Training Loss: 0.00917
Epoch: 22432, Training Loss: 0.01125
Epoch: 22432, Training Loss: 0.00869
Epoch: 22433, Training Loss: 0.00985
Epoch: 22433, Training Loss: 0.00917
Epoch: 22433, Training Loss: 0.01125
Epoch: 22433, Training Loss: 0.00869
Epoch: 22434, Training Loss: 0.00985
Epoch: 22434, Training Loss: 0.00917
Epoch: 22434, Training Loss: 0.01125
Epoch: 22434, Training Loss: 0.00869
Epoch: 22435, Training Loss: 0.00985
Epoch: 22435, Training Loss: 0.00917
Epoch: 22435, Training Loss: 0.01125
Epoch: 22435, Training Loss: 0.00869
Epoch: 22436, Training Loss: 0.00985
Epoch: 22436, Training Loss: 0.00917
Epoch: 22436, Training Loss: 0.01125
Epoch: 22436, Training Loss: 0.00869
Epoch: 22437, Training Loss: 0.00985
Epoch: 22437, Training Loss: 0.00917
Epoch: 22437, Training Loss: 0.01125
Epoch: 22437, Training Loss: 0.00869
Epoch: 22438, Training Loss: 0.00985
Epoch: 22438, Training Loss: 0.00917
Epoch: 22438, Training Loss: 0.01125
Epoch: 22438, Training Loss: 0.00869
Epoch: 22439, Training Loss: 0.00985
Epoch: 22439, Training Loss: 0.00917
Epoch: 22439, Training Loss: 0.01125
Epoch: 22439, Training Loss: 0.00869
Epoch: 22440, Training Loss: 0.00985
Epoch: 22440, Training Loss: 0.00917
Epoch: 22440, Training Loss: 0.01125
Epoch: 22440, Training Loss: 0.00869
Epoch: 22441, Training Loss: 0.00985
Epoch: 22441, Training Loss: 0.00917
Epoch: 22441, Training Loss: 0.01125
Epoch: 22441, Training Loss: 0.00869
Epoch: 22442, Training Loss: 0.00985
Epoch: 22442, Training Loss: 0.00917
Epoch: 22442, Training Loss: 0.01125
Epoch: 22442, Training Loss: 0.00869
Epoch: 22443, Training Loss: 0.00985
Epoch: 22443, Training Loss: 0.00917
Epoch: 22443, Training Loss: 0.01125
Epoch: 22443, Training Loss: 0.00869
Epoch: 22444, Training Loss: 0.00985
Epoch: 22444, Training Loss: 0.00917
Epoch: 22444, Training Loss: 0.01125
Epoch: 22444, Training Loss: 0.00869
Epoch: 22445, Training Loss: 0.00985
Epoch: 22445, Training Loss: 0.00917
Epoch: 22445, Training Loss: 0.01125
Epoch: 22445, Training Loss: 0.00869
Epoch: 22446, Training Loss: 0.00985
Epoch: 22446, Training Loss: 0.00917
Epoch: 22446, Training Loss: 0.01124
Epoch: 22446, Training Loss: 0.00869
Epoch: 22447, Training Loss: 0.00985
Epoch: 22447, Training Loss: 0.00917
Epoch: 22447, Training Loss: 0.01124
Epoch: 22447, Training Loss: 0.00869
Epoch: 22448, Training Loss: 0.00985
Epoch: 22448, Training Loss: 0.00917
Epoch: 22448, Training Loss: 0.01124
Epoch: 22448, Training Loss: 0.00869
Epoch: 22449, Training Loss: 0.00985
Epoch: 22449, Training Loss: 0.00917
Epoch: 22449, Training Loss: 0.01124
Epoch: 22449, Training Loss: 0.00869
Epoch: 22450, Training Loss: 0.00985
Epoch: 22450, Training Loss: 0.00917
Epoch: 22450, Training Loss: 0.01124
Epoch: 22450, Training Loss: 0.00869
Epoch: 22451, Training Loss: 0.00985
Epoch: 22451, Training Loss: 0.00917
Epoch: 22451, Training Loss: 0.01124
Epoch: 22451, Training Loss: 0.00869
Epoch: 22452, Training Loss: 0.00985
Epoch: 22452, Training Loss: 0.00917
Epoch: 22452, Training Loss: 0.01124
Epoch: 22452, Training Loss: 0.00869
Epoch: 22453, Training Loss: 0.00985
Epoch: 22453, Training Loss: 0.00917
Epoch: 22453, Training Loss: 0.01124
Epoch: 22453, Training Loss: 0.00869
Epoch: 22454, Training Loss: 0.00985
Epoch: 22454, Training Loss: 0.00917
Epoch: 22454, Training Loss: 0.01124
Epoch: 22454, Training Loss: 0.00869
Epoch: 22455, Training Loss: 0.00985
Epoch: 22455, Training Loss: 0.00917
Epoch: 22455, Training Loss: 0.01124
Epoch: 22455, Training Loss: 0.00869
Epoch: 22456, Training Loss: 0.00985
Epoch: 22456, Training Loss: 0.00917
Epoch: 22456, Training Loss: 0.01124
Epoch: 22456, Training Loss: 0.00869
Epoch: 22457, Training Loss: 0.00985
Epoch: 22457, Training Loss: 0.00917
Epoch: 22457, Training Loss: 0.01124
Epoch: 22457, Training Loss: 0.00869
Epoch: 22458, Training Loss: 0.00985
Epoch: 22458, Training Loss: 0.00917
Epoch: 22458, Training Loss: 0.01124
Epoch: 22458, Training Loss: 0.00869
Epoch: 22459, Training Loss: 0.00985
Epoch: 22459, Training Loss: 0.00917
Epoch: 22459, Training Loss: 0.01124
Epoch: 22459, Training Loss: 0.00869
Epoch: 22460, Training Loss: 0.00985
Epoch: 22460, Training Loss: 0.00917
Epoch: 22460, Training Loss: 0.01124
Epoch: 22460, Training Loss: 0.00869
Epoch: 22461, Training Loss: 0.00985
Epoch: 22461, Training Loss: 0.00917
Epoch: 22461, Training Loss: 0.01124
Epoch: 22461, Training Loss: 0.00869
Epoch: 22462, Training Loss: 0.00985
Epoch: 22462, Training Loss: 0.00917
Epoch: 22462, Training Loss: 0.01124
Epoch: 22462, Training Loss: 0.00869
Epoch: 22463, Training Loss: 0.00985
Epoch: 22463, Training Loss: 0.00917
Epoch: 22463, Training Loss: 0.01124
Epoch: 22463, Training Loss: 0.00869
Epoch: 22464, Training Loss: 0.00985
Epoch: 22464, Training Loss: 0.00917
Epoch: 22464, Training Loss: 0.01124
Epoch: 22464, Training Loss: 0.00869
Epoch: 22465, Training Loss: 0.00984
Epoch: 22465, Training Loss: 0.00917
Epoch: 22465, Training Loss: 0.01124
Epoch: 22465, Training Loss: 0.00868
Epoch: 22466, Training Loss: 0.00984
Epoch: 22466, Training Loss: 0.00917
Epoch: 22466, Training Loss: 0.01124
Epoch: 22466, Training Loss: 0.00868
Epoch: 22467, Training Loss: 0.00984
Epoch: 22467, Training Loss: 0.00917
Epoch: 22467, Training Loss: 0.01124
Epoch: 22467, Training Loss: 0.00868
Epoch: 22468, Training Loss: 0.00984
Epoch: 22468, Training Loss: 0.00917
Epoch: 22468, Training Loss: 0.01124
Epoch: 22468, Training Loss: 0.00868
Epoch: 22469, Training Loss: 0.00984
Epoch: 22469, Training Loss: 0.00917
Epoch: 22469, Training Loss: 0.01124
Epoch: 22469, Training Loss: 0.00868
Epoch: 22470, Training Loss: 0.00984
Epoch: 22470, Training Loss: 0.00917
Epoch: 22470, Training Loss: 0.01124
Epoch: 22470, Training Loss: 0.00868
Epoch: 22471, Training Loss: 0.00984
Epoch: 22471, Training Loss: 0.00917
Epoch: 22471, Training Loss: 0.01124
Epoch: 22471, Training Loss: 0.00868
Epoch: 22472, Training Loss: 0.00984
Epoch: 22472, Training Loss: 0.00917
Epoch: 22472, Training Loss: 0.01124
Epoch: 22472, Training Loss: 0.00868
Epoch: 22473, Training Loss: 0.00984
Epoch: 22473, Training Loss: 0.00917
Epoch: 22473, Training Loss: 0.01124
Epoch: 22473, Training Loss: 0.00868
Epoch: 22474, Training Loss: 0.00984
Epoch: 22474, Training Loss: 0.00917
Epoch: 22474, Training Loss: 0.01124
Epoch: 22474, Training Loss: 0.00868
Epoch: 22475, Training Loss: 0.00984
Epoch: 22475, Training Loss: 0.00917
Epoch: 22475, Training Loss: 0.01124
Epoch: 22475, Training Loss: 0.00868
Epoch: 22476, Training Loss: 0.00984
Epoch: 22476, Training Loss: 0.00917
Epoch: 22476, Training Loss: 0.01124
Epoch: 22476, Training Loss: 0.00868
Epoch: 22477, Training Loss: 0.00984
Epoch: 22477, Training Loss: 0.00917
Epoch: 22477, Training Loss: 0.01124
Epoch: 22477, Training Loss: 0.00868
Epoch: 22478, Training Loss: 0.00984
Epoch: 22478, Training Loss: 0.00916
Epoch: 22478, Training Loss: 0.01124
Epoch: 22478, Training Loss: 0.00868
Epoch: 22479, Training Loss: 0.00984
Epoch: 22479, Training Loss: 0.00916
Epoch: 22479, Training Loss: 0.01124
Epoch: 22479, Training Loss: 0.00868
Epoch: 22480, Training Loss: 0.00984
Epoch: 22480, Training Loss: 0.00916
Epoch: 22480, Training Loss: 0.01124
Epoch: 22480, Training Loss: 0.00868
Epoch: 22481, Training Loss: 0.00984
Epoch: 22481, Training Loss: 0.00916
Epoch: 22481, Training Loss: 0.01124
Epoch: 22481, Training Loss: 0.00868
Epoch: 22482, Training Loss: 0.00984
Epoch: 22482, Training Loss: 0.00916
Epoch: 22482, Training Loss: 0.01124
Epoch: 22482, Training Loss: 0.00868
Epoch: 22483, Training Loss: 0.00984
Epoch: 22483, Training Loss: 0.00916
Epoch: 22483, Training Loss: 0.01123
Epoch: 22483, Training Loss: 0.00868
Epoch: 22484, Training Loss: 0.00984
Epoch: 22484, Training Loss: 0.00916
Epoch: 22484, Training Loss: 0.01123
Epoch: 22484, Training Loss: 0.00868
Epoch: 22485, Training Loss: 0.00984
Epoch: 22485, Training Loss: 0.00916
Epoch: 22485, Training Loss: 0.01123
Epoch: 22485, Training Loss: 0.00868
Epoch: 22486, Training Loss: 0.00984
Epoch: 22486, Training Loss: 0.00916
Epoch: 22486, Training Loss: 0.01123
Epoch: 22486, Training Loss: 0.00868
Epoch: 22487, Training Loss: 0.00984
Epoch: 22487, Training Loss: 0.00916
Epoch: 22487, Training Loss: 0.01123
Epoch: 22487, Training Loss: 0.00868
Epoch: 22488, Training Loss: 0.00984
Epoch: 22488, Training Loss: 0.00916
Epoch: 22488, Training Loss: 0.01123
Epoch: 22488, Training Loss: 0.00868
Epoch: 22489, Training Loss: 0.00984
Epoch: 22489, Training Loss: 0.00916
Epoch: 22489, Training Loss: 0.01123
Epoch: 22489, Training Loss: 0.00868
Epoch: 22490, Training Loss: 0.00984
Epoch: 22490, Training Loss: 0.00916
Epoch: 22490, Training Loss: 0.01123
Epoch: 22490, Training Loss: 0.00868
Epoch: 22491, Training Loss: 0.00984
Epoch: 22491, Training Loss: 0.00916
Epoch: 22491, Training Loss: 0.01123
Epoch: 22491, Training Loss: 0.00868
Epoch: 22492, Training Loss: 0.00984
Epoch: 22492, Training Loss: 0.00916
Epoch: 22492, Training Loss: 0.01123
Epoch: 22492, Training Loss: 0.00868
Epoch: 22493, Training Loss: 0.00984
Epoch: 22493, Training Loss: 0.00916
Epoch: 22493, Training Loss: 0.01123
Epoch: 22493, Training Loss: 0.00868
Epoch: 22494, Training Loss: 0.00984
Epoch: 22494, Training Loss: 0.00916
Epoch: 22494, Training Loss: 0.01123
Epoch: 22494, Training Loss: 0.00868
Epoch: 22495, Training Loss: 0.00984
Epoch: 22495, Training Loss: 0.00916
Epoch: 22495, Training Loss: 0.01123
Epoch: 22495, Training Loss: 0.00868
Epoch: 22496, Training Loss: 0.00984
Epoch: 22496, Training Loss: 0.00916
Epoch: 22496, Training Loss: 0.01123
Epoch: 22496, Training Loss: 0.00868
Epoch: 22497, Training Loss: 0.00984
Epoch: 22497, Training Loss: 0.00916
Epoch: 22497, Training Loss: 0.01123
Epoch: 22497, Training Loss: 0.00868
Epoch: 22498, Training Loss: 0.00984
Epoch: 22498, Training Loss: 0.00916
Epoch: 22498, Training Loss: 0.01123
Epoch: 22498, Training Loss: 0.00868
Epoch: 22499, Training Loss: 0.00984
Epoch: 22499, Training Loss: 0.00916
Epoch: 22499, Training Loss: 0.01123
Epoch: 22499, Training Loss: 0.00868
Epoch: 22500, Training Loss: 0.00984
Epoch: 22500, Training Loss: 0.00916
Epoch: 22500, Training Loss: 0.01123
Epoch: 22500, Training Loss: 0.00868
Epoch: 22501, Training Loss: 0.00984
Epoch: 22501, Training Loss: 0.00916
Epoch: 22501, Training Loss: 0.01123
Epoch: 22501, Training Loss: 0.00868
Epoch: 22502, Training Loss: 0.00984
Epoch: 22502, Training Loss: 0.00916
Epoch: 22502, Training Loss: 0.01123
Epoch: 22502, Training Loss: 0.00868
Epoch: 22503, Training Loss: 0.00984
Epoch: 22503, Training Loss: 0.00916
Epoch: 22503, Training Loss: 0.01123
Epoch: 22503, Training Loss: 0.00868
Epoch: 22504, Training Loss: 0.00984
Epoch: 22504, Training Loss: 0.00916
Epoch: 22504, Training Loss: 0.01123
Epoch: 22504, Training Loss: 0.00868
Epoch: 22505, Training Loss: 0.00984
Epoch: 22505, Training Loss: 0.00916
Epoch: 22505, Training Loss: 0.01123
Epoch: 22505, Training Loss: 0.00868
Epoch: 22506, Training Loss: 0.00984
Epoch: 22506, Training Loss: 0.00916
Epoch: 22506, Training Loss: 0.01123
Epoch: 22506, Training Loss: 0.00868
Epoch: 22507, Training Loss: 0.00983
Epoch: 22507, Training Loss: 0.00916
Epoch: 22507, Training Loss: 0.01123
Epoch: 22507, Training Loss: 0.00868
Epoch: 22508, Training Loss: 0.00983
Epoch: 22508, Training Loss: 0.00916
Epoch: 22508, Training Loss: 0.01123
Epoch: 22508, Training Loss: 0.00868
Epoch: 22509, Training Loss: 0.00983
Epoch: 22509, Training Loss: 0.00916
Epoch: 22509, Training Loss: 0.01123
Epoch: 22509, Training Loss: 0.00868
Epoch: 22510, Training Loss: 0.00983
Epoch: 22510, Training Loss: 0.00916
Epoch: 22510, Training Loss: 0.01123
Epoch: 22510, Training Loss: 0.00868
Epoch: 22511, Training Loss: 0.00983
Epoch: 22511, Training Loss: 0.00916
Epoch: 22511, Training Loss: 0.01123
Epoch: 22511, Training Loss: 0.00868
Epoch: 22512, Training Loss: 0.00983
Epoch: 22512, Training Loss: 0.00916
Epoch: 22512, Training Loss: 0.01123
Epoch: 22512, Training Loss: 0.00868
Epoch: 22513, Training Loss: 0.00983
Epoch: 22513, Training Loss: 0.00916
Epoch: 22513, Training Loss: 0.01123
Epoch: 22513, Training Loss: 0.00868
Epoch: 22514, Training Loss: 0.00983
Epoch: 22514, Training Loss: 0.00916
Epoch: 22514, Training Loss: 0.01123
Epoch: 22514, Training Loss: 0.00867
Epoch: 22515, Training Loss: 0.00983
Epoch: 22515, Training Loss: 0.00916
Epoch: 22515, Training Loss: 0.01123
Epoch: 22515, Training Loss: 0.00867
Epoch: 22516, Training Loss: 0.00983
Epoch: 22516, Training Loss: 0.00916
Epoch: 22516, Training Loss: 0.01123
Epoch: 22516, Training Loss: 0.00867
Epoch: 22517, Training Loss: 0.00983
Epoch: 22517, Training Loss: 0.00916
Epoch: 22517, Training Loss: 0.01123
Epoch: 22517, Training Loss: 0.00867
Epoch: 22518, Training Loss: 0.00983
Epoch: 22518, Training Loss: 0.00916
Epoch: 22518, Training Loss: 0.01123
Epoch: 22518, Training Loss: 0.00867
Epoch: 22519, Training Loss: 0.00983
Epoch: 22519, Training Loss: 0.00916
Epoch: 22519, Training Loss: 0.01123
Epoch: 22519, Training Loss: 0.00867
Epoch: 22520, Training Loss: 0.00983
Epoch: 22520, Training Loss: 0.00916
Epoch: 22520, Training Loss: 0.01122
Epoch: 22520, Training Loss: 0.00867
Epoch: 22521, Training Loss: 0.00983
Epoch: 22521, Training Loss: 0.00916
Epoch: 22521, Training Loss: 0.01122
Epoch: 22521, Training Loss: 0.00867
Epoch: 22522, Training Loss: 0.00983
Epoch: 22522, Training Loss: 0.00916
Epoch: 22522, Training Loss: 0.01122
Epoch: 22522, Training Loss: 0.00867
Epoch: 22523, Training Loss: 0.00983
Epoch: 22523, Training Loss: 0.00916
Epoch: 22523, Training Loss: 0.01122
Epoch: 22523, Training Loss: 0.00867
Epoch: 22524, Training Loss: 0.00983
Epoch: 22524, Training Loss: 0.00916
Epoch: 22524, Training Loss: 0.01122
Epoch: 22524, Training Loss: 0.00867
Epoch: 22525, Training Loss: 0.00983
Epoch: 22525, Training Loss: 0.00915
Epoch: 22525, Training Loss: 0.01122
Epoch: 22525, Training Loss: 0.00867
Epoch: 22526, Training Loss: 0.00983
Epoch: 22526, Training Loss: 0.00915
Epoch: 22526, Training Loss: 0.01122
Epoch: 22526, Training Loss: 0.00867
Epoch: 22527, Training Loss: 0.00983
Epoch: 22527, Training Loss: 0.00915
Epoch: 22527, Training Loss: 0.01122
Epoch: 22527, Training Loss: 0.00867
Epoch: 22528, Training Loss: 0.00983
Epoch: 22528, Training Loss: 0.00915
Epoch: 22528, Training Loss: 0.01122
Epoch: 22528, Training Loss: 0.00867
Epoch: 22529, Training Loss: 0.00983
Epoch: 22529, Training Loss: 0.00915
Epoch: 22529, Training Loss: 0.01122
Epoch: 22529, Training Loss: 0.00867
Epoch: 22530, Training Loss: 0.00983
Epoch: 22530, Training Loss: 0.00915
Epoch: 22530, Training Loss: 0.01122
Epoch: 22530, Training Loss: 0.00867
Epoch: 22531, Training Loss: 0.00983
Epoch: 22531, Training Loss: 0.00915

[... output truncated: the loss for each of the four training samples is printed every epoch; over epochs 22530–22774 the four per-sample losses decrease only slightly, from roughly 0.00983 / 0.00915 / 0.01122 / 0.00867 to roughly 0.00977 / 0.00910 / 0.01116 / 0.00862 ...]

Epoch: 22773, Training Loss: 0.01116
Epoch: 22773, Training Loss: 0.00862
Epoch: 22774, Training Loss: 0.00977
Epoch: 22774, Training Loss: 0.00910
Epoch: 22774, Training Loss: 0.01116
Epoch: 22774, Training Loss: 0.00862
Epoch: 22775, Training Loss: 0.00977
Epoch: 22775, Training Loss: 0.00910
Epoch: 22775, Training Loss: 0.01116
Epoch: 22775, Training Loss: 0.00862
Epoch: 22776, Training Loss: 0.00977
Epoch: 22776, Training Loss: 0.00910
Epoch: 22776, Training Loss: 0.01116
Epoch: 22776, Training Loss: 0.00862
Epoch: 22777, Training Loss: 0.00977
Epoch: 22777, Training Loss: 0.00910
Epoch: 22777, Training Loss: 0.01116
Epoch: 22777, Training Loss: 0.00862
Epoch: 22778, Training Loss: 0.00977
Epoch: 22778, Training Loss: 0.00910
Epoch: 22778, Training Loss: 0.01116
Epoch: 22778, Training Loss: 0.00862
Epoch: 22779, Training Loss: 0.00977
Epoch: 22779, Training Loss: 0.00910
Epoch: 22779, Training Loss: 0.01116
Epoch: 22779, Training Loss: 0.00862
Epoch: 22780, Training Loss: 0.00977
Epoch: 22780, Training Loss: 0.00910
Epoch: 22780, Training Loss: 0.01116
Epoch: 22780, Training Loss: 0.00862
Epoch: 22781, Training Loss: 0.00977
Epoch: 22781, Training Loss: 0.00910
Epoch: 22781, Training Loss: 0.01116
Epoch: 22781, Training Loss: 0.00862
Epoch: 22782, Training Loss: 0.00977
Epoch: 22782, Training Loss: 0.00910
Epoch: 22782, Training Loss: 0.01116
Epoch: 22782, Training Loss: 0.00862
Epoch: 22783, Training Loss: 0.00977
Epoch: 22783, Training Loss: 0.00910
Epoch: 22783, Training Loss: 0.01116
Epoch: 22783, Training Loss: 0.00862
Epoch: 22784, Training Loss: 0.00977
Epoch: 22784, Training Loss: 0.00910
Epoch: 22784, Training Loss: 0.01115
Epoch: 22784, Training Loss: 0.00862
Epoch: 22785, Training Loss: 0.00977
Epoch: 22785, Training Loss: 0.00910
Epoch: 22785, Training Loss: 0.01115
Epoch: 22785, Training Loss: 0.00862
Epoch: 22786, Training Loss: 0.00977
Epoch: 22786, Training Loss: 0.00910
Epoch: 22786, Training Loss: 0.01115
Epoch: 22786, Training Loss: 0.00862
Epoch: 22787, Training Loss: 0.00977
Epoch: 22787, Training Loss: 0.00910
Epoch: 22787, Training Loss: 0.01115
Epoch: 22787, Training Loss: 0.00862
Epoch: 22788, Training Loss: 0.00977
Epoch: 22788, Training Loss: 0.00910
Epoch: 22788, Training Loss: 0.01115
Epoch: 22788, Training Loss: 0.00862
Epoch: 22789, Training Loss: 0.00977
Epoch: 22789, Training Loss: 0.00910
Epoch: 22789, Training Loss: 0.01115
Epoch: 22789, Training Loss: 0.00862
Epoch: 22790, Training Loss: 0.00977
Epoch: 22790, Training Loss: 0.00910
Epoch: 22790, Training Loss: 0.01115
Epoch: 22790, Training Loss: 0.00862
Epoch: 22791, Training Loss: 0.00977
Epoch: 22791, Training Loss: 0.00910
Epoch: 22791, Training Loss: 0.01115
Epoch: 22791, Training Loss: 0.00862
Epoch: 22792, Training Loss: 0.00977
Epoch: 22792, Training Loss: 0.00910
Epoch: 22792, Training Loss: 0.01115
Epoch: 22792, Training Loss: 0.00862
Epoch: 22793, Training Loss: 0.00977
Epoch: 22793, Training Loss: 0.00910
Epoch: 22793, Training Loss: 0.01115
Epoch: 22793, Training Loss: 0.00862
Epoch: 22794, Training Loss: 0.00977
Epoch: 22794, Training Loss: 0.00910
Epoch: 22794, Training Loss: 0.01115
Epoch: 22794, Training Loss: 0.00862
Epoch: 22795, Training Loss: 0.00977
Epoch: 22795, Training Loss: 0.00910
Epoch: 22795, Training Loss: 0.01115
Epoch: 22795, Training Loss: 0.00862
Epoch: 22796, Training Loss: 0.00977
Epoch: 22796, Training Loss: 0.00910
Epoch: 22796, Training Loss: 0.01115
Epoch: 22796, Training Loss: 0.00862
Epoch: 22797, Training Loss: 0.00977
Epoch: 22797, Training Loss: 0.00910
Epoch: 22797, Training Loss: 0.01115
Epoch: 22797, Training Loss: 0.00862
Epoch: 22798, Training Loss: 0.00977
Epoch: 22798, Training Loss: 0.00910
Epoch: 22798, Training Loss: 0.01115
Epoch: 22798, Training Loss: 0.00862
Epoch: 22799, Training Loss: 0.00977
Epoch: 22799, Training Loss: 0.00910
Epoch: 22799, Training Loss: 0.01115
Epoch: 22799, Training Loss: 0.00862
Epoch: 22800, Training Loss: 0.00977
Epoch: 22800, Training Loss: 0.00910
Epoch: 22800, Training Loss: 0.01115
Epoch: 22800, Training Loss: 0.00862
Epoch: 22801, Training Loss: 0.00977
Epoch: 22801, Training Loss: 0.00910
Epoch: 22801, Training Loss: 0.01115
Epoch: 22801, Training Loss: 0.00862
Epoch: 22802, Training Loss: 0.00977
Epoch: 22802, Training Loss: 0.00910
Epoch: 22802, Training Loss: 0.01115
Epoch: 22802, Training Loss: 0.00862
Epoch: 22803, Training Loss: 0.00977
Epoch: 22803, Training Loss: 0.00910
Epoch: 22803, Training Loss: 0.01115
Epoch: 22803, Training Loss: 0.00862
Epoch: 22804, Training Loss: 0.00977
Epoch: 22804, Training Loss: 0.00910
Epoch: 22804, Training Loss: 0.01115
Epoch: 22804, Training Loss: 0.00862
Epoch: 22805, Training Loss: 0.00977
Epoch: 22805, Training Loss: 0.00910
Epoch: 22805, Training Loss: 0.01115
Epoch: 22805, Training Loss: 0.00862
Epoch: 22806, Training Loss: 0.00977
Epoch: 22806, Training Loss: 0.00910
Epoch: 22806, Training Loss: 0.01115
Epoch: 22806, Training Loss: 0.00862
Epoch: 22807, Training Loss: 0.00977
Epoch: 22807, Training Loss: 0.00909
Epoch: 22807, Training Loss: 0.01115
Epoch: 22807, Training Loss: 0.00862
Epoch: 22808, Training Loss: 0.00976
Epoch: 22808, Training Loss: 0.00909
Epoch: 22808, Training Loss: 0.01115
Epoch: 22808, Training Loss: 0.00862
Epoch: 22809, Training Loss: 0.00976
Epoch: 22809, Training Loss: 0.00909
Epoch: 22809, Training Loss: 0.01115
Epoch: 22809, Training Loss: 0.00862
Epoch: 22810, Training Loss: 0.00976
Epoch: 22810, Training Loss: 0.00909
Epoch: 22810, Training Loss: 0.01115
Epoch: 22810, Training Loss: 0.00862
Epoch: 22811, Training Loss: 0.00976
Epoch: 22811, Training Loss: 0.00909
Epoch: 22811, Training Loss: 0.01115
Epoch: 22811, Training Loss: 0.00862
Epoch: 22812, Training Loss: 0.00976
Epoch: 22812, Training Loss: 0.00909
Epoch: 22812, Training Loss: 0.01115
Epoch: 22812, Training Loss: 0.00861
Epoch: 22813, Training Loss: 0.00976
Epoch: 22813, Training Loss: 0.00909
Epoch: 22813, Training Loss: 0.01115
Epoch: 22813, Training Loss: 0.00861
Epoch: 22814, Training Loss: 0.00976
Epoch: 22814, Training Loss: 0.00909
Epoch: 22814, Training Loss: 0.01115
Epoch: 22814, Training Loss: 0.00861
Epoch: 22815, Training Loss: 0.00976
Epoch: 22815, Training Loss: 0.00909
Epoch: 22815, Training Loss: 0.01115
Epoch: 22815, Training Loss: 0.00861
Epoch: 22816, Training Loss: 0.00976
Epoch: 22816, Training Loss: 0.00909
Epoch: 22816, Training Loss: 0.01115
Epoch: 22816, Training Loss: 0.00861
Epoch: 22817, Training Loss: 0.00976
Epoch: 22817, Training Loss: 0.00909
Epoch: 22817, Training Loss: 0.01115
Epoch: 22817, Training Loss: 0.00861
Epoch: 22818, Training Loss: 0.00976
Epoch: 22818, Training Loss: 0.00909
Epoch: 22818, Training Loss: 0.01115
Epoch: 22818, Training Loss: 0.00861
Epoch: 22819, Training Loss: 0.00976
Epoch: 22819, Training Loss: 0.00909
Epoch: 22819, Training Loss: 0.01115
Epoch: 22819, Training Loss: 0.00861
Epoch: 22820, Training Loss: 0.00976
Epoch: 22820, Training Loss: 0.00909
Epoch: 22820, Training Loss: 0.01115
Epoch: 22820, Training Loss: 0.00861
Epoch: 22821, Training Loss: 0.00976
Epoch: 22821, Training Loss: 0.00909
Epoch: 22821, Training Loss: 0.01115
Epoch: 22821, Training Loss: 0.00861
Epoch: 22822, Training Loss: 0.00976
Epoch: 22822, Training Loss: 0.00909
Epoch: 22822, Training Loss: 0.01114
Epoch: 22822, Training Loss: 0.00861
Epoch: 22823, Training Loss: 0.00976
Epoch: 22823, Training Loss: 0.00909
Epoch: 22823, Training Loss: 0.01114
Epoch: 22823, Training Loss: 0.00861
Epoch: 22824, Training Loss: 0.00976
Epoch: 22824, Training Loss: 0.00909
Epoch: 22824, Training Loss: 0.01114
Epoch: 22824, Training Loss: 0.00861
Epoch: 22825, Training Loss: 0.00976
Epoch: 22825, Training Loss: 0.00909
Epoch: 22825, Training Loss: 0.01114
Epoch: 22825, Training Loss: 0.00861
Epoch: 22826, Training Loss: 0.00976
Epoch: 22826, Training Loss: 0.00909
Epoch: 22826, Training Loss: 0.01114
Epoch: 22826, Training Loss: 0.00861
Epoch: 22827, Training Loss: 0.00976
Epoch: 22827, Training Loss: 0.00909
Epoch: 22827, Training Loss: 0.01114
Epoch: 22827, Training Loss: 0.00861
Epoch: 22828, Training Loss: 0.00976
Epoch: 22828, Training Loss: 0.00909
Epoch: 22828, Training Loss: 0.01114
Epoch: 22828, Training Loss: 0.00861
Epoch: 22829, Training Loss: 0.00976
Epoch: 22829, Training Loss: 0.00909
Epoch: 22829, Training Loss: 0.01114
Epoch: 22829, Training Loss: 0.00861
Epoch: 22830, Training Loss: 0.00976
Epoch: 22830, Training Loss: 0.00909
Epoch: 22830, Training Loss: 0.01114
Epoch: 22830, Training Loss: 0.00861
Epoch: 22831, Training Loss: 0.00976
Epoch: 22831, Training Loss: 0.00909
Epoch: 22831, Training Loss: 0.01114
Epoch: 22831, Training Loss: 0.00861
Epoch: 22832, Training Loss: 0.00976
Epoch: 22832, Training Loss: 0.00909
Epoch: 22832, Training Loss: 0.01114
Epoch: 22832, Training Loss: 0.00861
Epoch: 22833, Training Loss: 0.00976
Epoch: 22833, Training Loss: 0.00909
Epoch: 22833, Training Loss: 0.01114
Epoch: 22833, Training Loss: 0.00861
Epoch: 22834, Training Loss: 0.00976
Epoch: 22834, Training Loss: 0.00909
Epoch: 22834, Training Loss: 0.01114
Epoch: 22834, Training Loss: 0.00861
Epoch: 22835, Training Loss: 0.00976
Epoch: 22835, Training Loss: 0.00909
Epoch: 22835, Training Loss: 0.01114
Epoch: 22835, Training Loss: 0.00861
Epoch: 22836, Training Loss: 0.00976
Epoch: 22836, Training Loss: 0.00909
Epoch: 22836, Training Loss: 0.01114
Epoch: 22836, Training Loss: 0.00861
Epoch: 22837, Training Loss: 0.00976
Epoch: 22837, Training Loss: 0.00909
Epoch: 22837, Training Loss: 0.01114
Epoch: 22837, Training Loss: 0.00861
Epoch: 22838, Training Loss: 0.00976
Epoch: 22838, Training Loss: 0.00909
Epoch: 22838, Training Loss: 0.01114
Epoch: 22838, Training Loss: 0.00861
Epoch: 22839, Training Loss: 0.00976
Epoch: 22839, Training Loss: 0.00909
Epoch: 22839, Training Loss: 0.01114
Epoch: 22839, Training Loss: 0.00861
Epoch: 22840, Training Loss: 0.00976
Epoch: 22840, Training Loss: 0.00909
Epoch: 22840, Training Loss: 0.01114
Epoch: 22840, Training Loss: 0.00861
Epoch: 22841, Training Loss: 0.00976
Epoch: 22841, Training Loss: 0.00909
Epoch: 22841, Training Loss: 0.01114
Epoch: 22841, Training Loss: 0.00861
Epoch: 22842, Training Loss: 0.00976
Epoch: 22842, Training Loss: 0.00909
Epoch: 22842, Training Loss: 0.01114
Epoch: 22842, Training Loss: 0.00861
Epoch: 22843, Training Loss: 0.00976
Epoch: 22843, Training Loss: 0.00909
Epoch: 22843, Training Loss: 0.01114
Epoch: 22843, Training Loss: 0.00861
Epoch: 22844, Training Loss: 0.00976
Epoch: 22844, Training Loss: 0.00909
Epoch: 22844, Training Loss: 0.01114
Epoch: 22844, Training Loss: 0.00861
Epoch: 22845, Training Loss: 0.00976
Epoch: 22845, Training Loss: 0.00909
Epoch: 22845, Training Loss: 0.01114
Epoch: 22845, Training Loss: 0.00861
Epoch: 22846, Training Loss: 0.00976
Epoch: 22846, Training Loss: 0.00909
Epoch: 22846, Training Loss: 0.01114
Epoch: 22846, Training Loss: 0.00861
Epoch: 22847, Training Loss: 0.00976
Epoch: 22847, Training Loss: 0.00909
Epoch: 22847, Training Loss: 0.01114
Epoch: 22847, Training Loss: 0.00861
Epoch: 22848, Training Loss: 0.00976
Epoch: 22848, Training Loss: 0.00909
Epoch: 22848, Training Loss: 0.01114
Epoch: 22848, Training Loss: 0.00861
Epoch: 22849, Training Loss: 0.00976
Epoch: 22849, Training Loss: 0.00909
Epoch: 22849, Training Loss: 0.01114
Epoch: 22849, Training Loss: 0.00861
Epoch: 22850, Training Loss: 0.00976
Epoch: 22850, Training Loss: 0.00909
Epoch: 22850, Training Loss: 0.01114
Epoch: 22850, Training Loss: 0.00861
Epoch: 22851, Training Loss: 0.00976
Epoch: 22851, Training Loss: 0.00909
Epoch: 22851, Training Loss: 0.01114
Epoch: 22851, Training Loss: 0.00861
Epoch: 22852, Training Loss: 0.00975
Epoch: 22852, Training Loss: 0.00909
Epoch: 22852, Training Loss: 0.01114
Epoch: 22852, Training Loss: 0.00861
Epoch: 22853, Training Loss: 0.00975
Epoch: 22853, Training Loss: 0.00909
Epoch: 22853, Training Loss: 0.01114
Epoch: 22853, Training Loss: 0.00861
Epoch: 22854, Training Loss: 0.00975
Epoch: 22854, Training Loss: 0.00909
Epoch: 22854, Training Loss: 0.01114
Epoch: 22854, Training Loss: 0.00861
Epoch: 22855, Training Loss: 0.00975
Epoch: 22855, Training Loss: 0.00908
Epoch: 22855, Training Loss: 0.01114
Epoch: 22855, Training Loss: 0.00861
Epoch: 22856, Training Loss: 0.00975
Epoch: 22856, Training Loss: 0.00908
Epoch: 22856, Training Loss: 0.01114
Epoch: 22856, Training Loss: 0.00861
Epoch: 22857, Training Loss: 0.00975
Epoch: 22857, Training Loss: 0.00908
Epoch: 22857, Training Loss: 0.01114
Epoch: 22857, Training Loss: 0.00861
Epoch: 22858, Training Loss: 0.00975
Epoch: 22858, Training Loss: 0.00908
Epoch: 22858, Training Loss: 0.01114
Epoch: 22858, Training Loss: 0.00861
Epoch: 22859, Training Loss: 0.00975
Epoch: 22859, Training Loss: 0.00908
Epoch: 22859, Training Loss: 0.01114
Epoch: 22859, Training Loss: 0.00861
Epoch: 22860, Training Loss: 0.00975
Epoch: 22860, Training Loss: 0.00908
Epoch: 22860, Training Loss: 0.01113
Epoch: 22860, Training Loss: 0.00861
Epoch: 22861, Training Loss: 0.00975
Epoch: 22861, Training Loss: 0.00908
Epoch: 22861, Training Loss: 0.01113
Epoch: 22861, Training Loss: 0.00861
Epoch: 22862, Training Loss: 0.00975
Epoch: 22862, Training Loss: 0.00908
Epoch: 22862, Training Loss: 0.01113
Epoch: 22862, Training Loss: 0.00860
Epoch: 22863, Training Loss: 0.00975
Epoch: 22863, Training Loss: 0.00908
Epoch: 22863, Training Loss: 0.01113
Epoch: 22863, Training Loss: 0.00860
Epoch: 22864, Training Loss: 0.00975
Epoch: 22864, Training Loss: 0.00908
Epoch: 22864, Training Loss: 0.01113
Epoch: 22864, Training Loss: 0.00860
Epoch: 22865, Training Loss: 0.00975
Epoch: 22865, Training Loss: 0.00908
Epoch: 22865, Training Loss: 0.01113
Epoch: 22865, Training Loss: 0.00860
Epoch: 22866, Training Loss: 0.00975
Epoch: 22866, Training Loss: 0.00908
Epoch: 22866, Training Loss: 0.01113
Epoch: 22866, Training Loss: 0.00860
Epoch: 22867, Training Loss: 0.00975
Epoch: 22867, Training Loss: 0.00908
Epoch: 22867, Training Loss: 0.01113
Epoch: 22867, Training Loss: 0.00860
Epoch: 22868, Training Loss: 0.00975
Epoch: 22868, Training Loss: 0.00908
Epoch: 22868, Training Loss: 0.01113
Epoch: 22868, Training Loss: 0.00860
Epoch: 22869, Training Loss: 0.00975
Epoch: 22869, Training Loss: 0.00908
Epoch: 22869, Training Loss: 0.01113
Epoch: 22869, Training Loss: 0.00860
Epoch: 22870, Training Loss: 0.00975
Epoch: 22870, Training Loss: 0.00908
Epoch: 22870, Training Loss: 0.01113
Epoch: 22870, Training Loss: 0.00860
Epoch: 22871, Training Loss: 0.00975
Epoch: 22871, Training Loss: 0.00908
Epoch: 22871, Training Loss: 0.01113
Epoch: 22871, Training Loss: 0.00860
Epoch: 22872, Training Loss: 0.00975
Epoch: 22872, Training Loss: 0.00908
Epoch: 22872, Training Loss: 0.01113
Epoch: 22872, Training Loss: 0.00860
Epoch: 22873, Training Loss: 0.00975
Epoch: 22873, Training Loss: 0.00908
Epoch: 22873, Training Loss: 0.01113
Epoch: 22873, Training Loss: 0.00860
Epoch: 22874, Training Loss: 0.00975
Epoch: 22874, Training Loss: 0.00908
Epoch: 22874, Training Loss: 0.01113
Epoch: 22874, Training Loss: 0.00860
Epoch: 22875, Training Loss: 0.00975
Epoch: 22875, Training Loss: 0.00908
Epoch: 22875, Training Loss: 0.01113
Epoch: 22875, Training Loss: 0.00860
Epoch: 22876, Training Loss: 0.00975
Epoch: 22876, Training Loss: 0.00908
Epoch: 22876, Training Loss: 0.01113
Epoch: 22876, Training Loss: 0.00860
Epoch: 22877, Training Loss: 0.00975
Epoch: 22877, Training Loss: 0.00908
Epoch: 22877, Training Loss: 0.01113
Epoch: 22877, Training Loss: 0.00860
Epoch: 22878, Training Loss: 0.00975
Epoch: 22878, Training Loss: 0.00908
Epoch: 22878, Training Loss: 0.01113
Epoch: 22878, Training Loss: 0.00860
Epoch: 22879, Training Loss: 0.00975
Epoch: 22879, Training Loss: 0.00908
Epoch: 22879, Training Loss: 0.01113
Epoch: 22879, Training Loss: 0.00860
Epoch: 22880, Training Loss: 0.00975
Epoch: 22880, Training Loss: 0.00908
Epoch: 22880, Training Loss: 0.01113
Epoch: 22880, Training Loss: 0.00860
Epoch: 22881, Training Loss: 0.00975
Epoch: 22881, Training Loss: 0.00908
Epoch: 22881, Training Loss: 0.01113
Epoch: 22881, Training Loss: 0.00860
Epoch: 22882, Training Loss: 0.00975
Epoch: 22882, Training Loss: 0.00908
Epoch: 22882, Training Loss: 0.01113
Epoch: 22882, Training Loss: 0.00860
Epoch: 22883, Training Loss: 0.00975
Epoch: 22883, Training Loss: 0.00908
Epoch: 22883, Training Loss: 0.01113
Epoch: 22883, Training Loss: 0.00860
Epoch: 22884, Training Loss: 0.00975
Epoch: 22884, Training Loss: 0.00908
Epoch: 22884, Training Loss: 0.01113
Epoch: 22884, Training Loss: 0.00860
Epoch: 22885, Training Loss: 0.00975
Epoch: 22885, Training Loss: 0.00908
Epoch: 22885, Training Loss: 0.01113
Epoch: 22885, Training Loss: 0.00860
Epoch: 22886, Training Loss: 0.00975
Epoch: 22886, Training Loss: 0.00908
Epoch: 22886, Training Loss: 0.01113
Epoch: 22886, Training Loss: 0.00860
Epoch: 22887, Training Loss: 0.00975
Epoch: 22887, Training Loss: 0.00908
Epoch: 22887, Training Loss: 0.01113
Epoch: 22887, Training Loss: 0.00860
Epoch: 22888, Training Loss: 0.00975
Epoch: 22888, Training Loss: 0.00908
Epoch: 22888, Training Loss: 0.01113
Epoch: 22888, Training Loss: 0.00860
Epoch: 22889, Training Loss: 0.00975
Epoch: 22889, Training Loss: 0.00908
Epoch: 22889, Training Loss: 0.01113
Epoch: 22889, Training Loss: 0.00860
Epoch: 22890, Training Loss: 0.00975
Epoch: 22890, Training Loss: 0.00908
Epoch: 22890, Training Loss: 0.01113
Epoch: 22890, Training Loss: 0.00860
Epoch: 22891, Training Loss: 0.00975
Epoch: 22891, Training Loss: 0.00908
Epoch: 22891, Training Loss: 0.01113
Epoch: 22891, Training Loss: 0.00860
Epoch: 22892, Training Loss: 0.00975
Epoch: 22892, Training Loss: 0.00908
Epoch: 22892, Training Loss: 0.01113
Epoch: 22892, Training Loss: 0.00860
Epoch: 22893, Training Loss: 0.00975
Epoch: 22893, Training Loss: 0.00908
Epoch: 22893, Training Loss: 0.01113
Epoch: 22893, Training Loss: 0.00860
Epoch: 22894, Training Loss: 0.00975
Epoch: 22894, Training Loss: 0.00908
Epoch: 22894, Training Loss: 0.01113
Epoch: 22894, Training Loss: 0.00860
Epoch: 22895, Training Loss: 0.00974
Epoch: 22895, Training Loss: 0.00908
Epoch: 22895, Training Loss: 0.01113
Epoch: 22895, Training Loss: 0.00860
Epoch: 22896, Training Loss: 0.00974
Epoch: 22896, Training Loss: 0.00908
Epoch: 22896, Training Loss: 0.01113
Epoch: 22896, Training Loss: 0.00860
Epoch: 22897, Training Loss: 0.00974
Epoch: 22897, Training Loss: 0.00908
Epoch: 22897, Training Loss: 0.01113
Epoch: 22897, Training Loss: 0.00860
Epoch: 22898, Training Loss: 0.00974
Epoch: 22898, Training Loss: 0.00908
Epoch: 22898, Training Loss: 0.01113
Epoch: 22898, Training Loss: 0.00860
Epoch: 22899, Training Loss: 0.00974
Epoch: 22899, Training Loss: 0.00908
Epoch: 22899, Training Loss: 0.01112
Epoch: 22899, Training Loss: 0.00860
Epoch: 22900, Training Loss: 0.00974
Epoch: 22900, Training Loss: 0.00908
Epoch: 22900, Training Loss: 0.01112
Epoch: 22900, Training Loss: 0.00860
Epoch: 22901, Training Loss: 0.00974
Epoch: 22901, Training Loss: 0.00908
Epoch: 22901, Training Loss: 0.01112
Epoch: 22901, Training Loss: 0.00860
Epoch: 22902, Training Loss: 0.00974
Epoch: 22902, Training Loss: 0.00907
Epoch: 22902, Training Loss: 0.01112
Epoch: 22902, Training Loss: 0.00860
Epoch: 22903, Training Loss: 0.00974
Epoch: 22903, Training Loss: 0.00907
Epoch: 22903, Training Loss: 0.01112
Epoch: 22903, Training Loss: 0.00860
Epoch: 22904, Training Loss: 0.00974
Epoch: 22904, Training Loss: 0.00907
Epoch: 22904, Training Loss: 0.01112
Epoch: 22904, Training Loss: 0.00860
Epoch: 22905, Training Loss: 0.00974
Epoch: 22905, Training Loss: 0.00907
Epoch: 22905, Training Loss: 0.01112
Epoch: 22905, Training Loss: 0.00860
Epoch: 22906, Training Loss: 0.00974
Epoch: 22906, Training Loss: 0.00907
Epoch: 22906, Training Loss: 0.01112
Epoch: 22906, Training Loss: 0.00860
Epoch: 22907, Training Loss: 0.00974
Epoch: 22907, Training Loss: 0.00907
Epoch: 22907, Training Loss: 0.01112
Epoch: 22907, Training Loss: 0.00860
Epoch: 22908, Training Loss: 0.00974
Epoch: 22908, Training Loss: 0.00907
Epoch: 22908, Training Loss: 0.01112
Epoch: 22908, Training Loss: 0.00860
Epoch: 22909, Training Loss: 0.00974
Epoch: 22909, Training Loss: 0.00907
Epoch: 22909, Training Loss: 0.01112
Epoch: 22909, Training Loss: 0.00860
Epoch: 22910, Training Loss: 0.00974
Epoch: 22910, Training Loss: 0.00907
Epoch: 22910, Training Loss: 0.01112
Epoch: 22910, Training Loss: 0.00860
Epoch: 22911, Training Loss: 0.00974
Epoch: 22911, Training Loss: 0.00907
Epoch: 22911, Training Loss: 0.01112
Epoch: 22911, Training Loss: 0.00860
Epoch: 22912, Training Loss: 0.00974
Epoch: 22912, Training Loss: 0.00907
Epoch: 22912, Training Loss: 0.01112
Epoch: 22912, Training Loss: 0.00859
Epoch: 22913, Training Loss: 0.00974
Epoch: 22913, Training Loss: 0.00907
Epoch: 22913, Training Loss: 0.01112
Epoch: 22913, Training Loss: 0.00859
Epoch: 22914, Training Loss: 0.00974
Epoch: 22914, Training Loss: 0.00907
Epoch: 22914, Training Loss: 0.01112
Epoch: 22914, Training Loss: 0.00859
Epoch: 22915, Training Loss: 0.00974
Epoch: 22915, Training Loss: 0.00907
Epoch: 22915, Training Loss: 0.01112
Epoch: 22915, Training Loss: 0.00859
Epoch: 22916, Training Loss: 0.00974
Epoch: 22916, Training Loss: 0.00907
Epoch: 22916, Training Loss: 0.01112
Epoch: 22916, Training Loss: 0.00859
Epoch: 22917, Training Loss: 0.00974
Epoch: 22917, Training Loss: 0.00907
Epoch: 22917, Training Loss: 0.01112
Epoch: 22917, Training Loss: 0.00859
Epoch: 22918, Training Loss: 0.00974
Epoch: 22918, Training Loss: 0.00907
Epoch: 22918, Training Loss: 0.01112
Epoch: 22918, Training Loss: 0.00859
Epoch: 22919, Training Loss: 0.00974
Epoch: 22919, Training Loss: 0.00907
Epoch: 22919, Training Loss: 0.01112
Epoch: 22919, Training Loss: 0.00859
Epoch: 22920, Training Loss: 0.00974
Epoch: 22920, Training Loss: 0.00907
Epoch: 22920, Training Loss: 0.01112
Epoch: 22920, Training Loss: 0.00859
Epoch: 22921, Training Loss: 0.00974
Epoch: 22921, Training Loss: 0.00907
Epoch: 22921, Training Loss: 0.01112
Epoch: 22921, Training Loss: 0.00859
Epoch: 22922, Training Loss: 0.00974
Epoch: 22922, Training Loss: 0.00907
Epoch: 22922, Training Loss: 0.01112
Epoch: 22922, Training Loss: 0.00859
Epoch: 22923, Training Loss: 0.00974
Epoch: 22923, Training Loss: 0.00907
Epoch: 22923, Training Loss: 0.01112
Epoch: 22923, Training Loss: 0.00859
Epoch: 22924, Training Loss: 0.00974
Epoch: 22924, Training Loss: 0.00907
Epoch: 22924, Training Loss: 0.01112
Epoch: 22924, Training Loss: 0.00859
Epoch: 22925, Training Loss: 0.00974
Epoch: 22925, Training Loss: 0.00907
Epoch: 22925, Training Loss: 0.01112
Epoch: 22925, Training Loss: 0.00859
Epoch: 22926, Training Loss: 0.00974
Epoch: 22926, Training Loss: 0.00907
Epoch: 22926, Training Loss: 0.01112
Epoch: 22926, Training Loss: 0.00859
Epoch: 22927, Training Loss: 0.00974
Epoch: 22927, Training Loss: 0.00907
Epoch: 22927, Training Loss: 0.01112
Epoch: 22927, Training Loss: 0.00859
Epoch: 22928, Training Loss: 0.00974
Epoch: 22928, Training Loss: 0.00907
Epoch: 22928, Training Loss: 0.01112
Epoch: 22928, Training Loss: 0.00859
Epoch: 22929, Training Loss: 0.00974
Epoch: 22929, Training Loss: 0.00907
Epoch: 22929, Training Loss: 0.01112
Epoch: 22929, Training Loss: 0.00859
Epoch: 22930, Training Loss: 0.00974
Epoch: 22930, Training Loss: 0.00907
Epoch: 22930, Training Loss: 0.01112
Epoch: 22930, Training Loss: 0.00859
Epoch: 22931, Training Loss: 0.00974
Epoch: 22931, Training Loss: 0.00907
Epoch: 22931, Training Loss: 0.01112
Epoch: 22931, Training Loss: 0.00859
Epoch: 22932, Training Loss: 0.00974
Epoch: 22932, Training Loss: 0.00907
Epoch: 22932, Training Loss: 0.01112
Epoch: 22932, Training Loss: 0.00859
Epoch: 22933, Training Loss: 0.00974
Epoch: 22933, Training Loss: 0.00907
Epoch: 22933, Training Loss: 0.01112
Epoch: 22933, Training Loss: 0.00859
Epoch: 22934, Training Loss: 0.00974
Epoch: 22934, Training Loss: 0.00907
Epoch: 22934, Training Loss: 0.01112
Epoch: 22934, Training Loss: 0.00859
Epoch: 22935, Training Loss: 0.00974
Epoch: 22935, Training Loss: 0.00907
Epoch: 22935, Training Loss: 0.01112
Epoch: 22935, Training Loss: 0.00859
Epoch: 22936, Training Loss: 0.00974
Epoch: 22936, Training Loss: 0.00907
Epoch: 22936, Training Loss: 0.01112
Epoch: 22936, Training Loss: 0.00859
Epoch: 22937, Training Loss: 0.00974
Epoch: 22937, Training Loss: 0.00907
Epoch: 22937, Training Loss: 0.01111
Epoch: 22937, Training Loss: 0.00859
Epoch: 22938, Training Loss: 0.00974
Epoch: 22938, Training Loss: 0.00907
Epoch: 22938, Training Loss: 0.01111
Epoch: 22938, Training Loss: 0.00859
Epoch: 22939, Training Loss: 0.00973
Epoch: 22939, Training Loss: 0.00907
Epoch: 22939, Training Loss: 0.01111
Epoch: 22939, Training Loss: 0.00859
Epoch: 22940, Training Loss: 0.00973
Epoch: 22940, Training Loss: 0.00907
Epoch: 22940, Training Loss: 0.01111
Epoch: 22940, Training Loss: 0.00859
Epoch: 22941, Training Loss: 0.00973
Epoch: 22941, Training Loss: 0.00907
Epoch: 22941, Training Loss: 0.01111
Epoch: 22941, Training Loss: 0.00859
Epoch: 22942, Training Loss: 0.00973
Epoch: 22942, Training Loss: 0.00907
Epoch: 22942, Training Loss: 0.01111
Epoch: 22942, Training Loss: 0.00859
Epoch: 22943, Training Loss: 0.00973
Epoch: 22943, Training Loss: 0.00907
Epoch: 22943, Training Loss: 0.01111
Epoch: 22943, Training Loss: 0.00859
Epoch: 22944, Training Loss: 0.00973
Epoch: 22944, Training Loss: 0.00907
Epoch: 22944, Training Loss: 0.01111
Epoch: 22944, Training Loss: 0.00859
Epoch: 22945, Training Loss: 0.00973
Epoch: 22945, Training Loss: 0.00907
Epoch: 22945, Training Loss: 0.01111
Epoch: 22945, Training Loss: 0.00859
Epoch: 22946, Training Loss: 0.00973
Epoch: 22946, Training Loss: 0.00907
Epoch: 22946, Training Loss: 0.01111
Epoch: 22946, Training Loss: 0.00859
Epoch: 22947, Training Loss: 0.00973
Epoch: 22947, Training Loss: 0.00907
Epoch: 22947, Training Loss: 0.01111
Epoch: 22947, Training Loss: 0.00859
Epoch: 22948, Training Loss: 0.00973
Epoch: 22948, Training Loss: 0.00907
Epoch: 22948, Training Loss: 0.01111
Epoch: 22948, Training Loss: 0.00859
Epoch: 22949, Training Loss: 0.00973
Epoch: 22949, Training Loss: 0.00907
Epoch: 22949, Training Loss: 0.01111
Epoch: 22949, Training Loss: 0.00859
Epoch: 22950, Training Loss: 0.00973
Epoch: 22950, Training Loss: 0.00906
Epoch: 22950, Training Loss: 0.01111
Epoch: 22950, Training Loss: 0.00859
Epoch: 22951, Training Loss: 0.00973
Epoch: 22951, Training Loss: 0.00906
Epoch: 22951, Training Loss: 0.01111
Epoch: 22951, Training Loss: 0.00859
Epoch: 22952, Training Loss: 0.00973
Epoch: 22952, Training Loss: 0.00906
Epoch: 22952, Training Loss: 0.01111
Epoch: 22952, Training Loss: 0.00859
Epoch: 22953, Training Loss: 0.00973
Epoch: 22953, Training Loss: 0.00906
Epoch: 22953, Training Loss: 0.01111
Epoch: 22953, Training Loss: 0.00859
Epoch: 22954, Training Loss: 0.00973
Epoch: 22954, Training Loss: 0.00906
Epoch: 22954, Training Loss: 0.01111
Epoch: 22954, Training Loss: 0.00859
Epoch: 22955, Training Loss: 0.00973
Epoch: 22955, Training Loss: 0.00906
Epoch: 22955, Training Loss: 0.01111
Epoch: 22955, Training Loss: 0.00859
Epoch: 22956, Training Loss: 0.00973
Epoch: 22956, Training Loss: 0.00906
Epoch: 22956, Training Loss: 0.01111
Epoch: 22956, Training Loss: 0.00859
Epoch: 22957, Training Loss: 0.00973
Epoch: 22957, Training Loss: 0.00906
Epoch: 22957, Training Loss: 0.01111
Epoch: 22957, Training Loss: 0.00859
Epoch: 22958, Training Loss: 0.00973
Epoch: 22958, Training Loss: 0.00906
Epoch: 22958, Training Loss: 0.01111
Epoch: 22958, Training Loss: 0.00859
Epoch: 22959, Training Loss: 0.00973
Epoch: 22959, Training Loss: 0.00906
Epoch: 22959, Training Loss: 0.01111
Epoch: 22959, Training Loss: 0.00859
Epoch: 22960, Training Loss: 0.00973
Epoch: 22960, Training Loss: 0.00906
Epoch: 22960, Training Loss: 0.01111
Epoch: 22960, Training Loss: 0.00859
Epoch: 22961, Training Loss: 0.00973
Epoch: 22961, Training Loss: 0.00906
Epoch: 22961, Training Loss: 0.01111
Epoch: 22961, Training Loss: 0.00859
Epoch: 22962, Training Loss: 0.00973
Epoch: 22962, Training Loss: 0.00906
Epoch: 22962, Training Loss: 0.01111
Epoch: 22962, Training Loss: 0.00858
Epoch: 22963, Training Loss: 0.00973
Epoch: 22963, Training Loss: 0.00906
Epoch: 22963, Training Loss: 0.01111
Epoch: 22963, Training Loss: 0.00858
[... output truncated: the four per-pattern losses decrease very slowly, by roughly 0.00005 each over epochs 22963-23206 ...]
Epoch: 23206, Training Loss: 0.00967
Epoch: 23206, Training Loss: 0.00901
Epoch: 23206, Training Loss: 0.01105
Epoch: 23206, Training Loss: 0.00854
Epoch: 23207, Training Loss: 0.00967
Epoch: 23207, Training Loss: 0.00901
Epoch: 23207, Training Loss: 0.01105
Epoch: 23207, Training Loss: 0.00854
Epoch: 23208, Training Loss: 0.00967
Epoch: 23208, Training Loss: 0.00901
Epoch: 23208, Training Loss: 0.01105
Epoch: 23208, Training Loss: 0.00854
Epoch: 23209, Training Loss: 0.00967
Epoch: 23209, Training Loss: 0.00901
Epoch: 23209, Training Loss: 0.01104
Epoch: 23209, Training Loss: 0.00854
Epoch: 23210, Training Loss: 0.00967
Epoch: 23210, Training Loss: 0.00901
Epoch: 23210, Training Loss: 0.01104
Epoch: 23210, Training Loss: 0.00854
Epoch: 23211, Training Loss: 0.00967
Epoch: 23211, Training Loss: 0.00901
Epoch: 23211, Training Loss: 0.01104
Epoch: 23211, Training Loss: 0.00854
Epoch: 23212, Training Loss: 0.00967
Epoch: 23212, Training Loss: 0.00901
Epoch: 23212, Training Loss: 0.01104
Epoch: 23212, Training Loss: 0.00854
Epoch: 23213, Training Loss: 0.00967
Epoch: 23213, Training Loss: 0.00901
Epoch: 23213, Training Loss: 0.01104
Epoch: 23213, Training Loss: 0.00854
Epoch: 23214, Training Loss: 0.00967
Epoch: 23214, Training Loss: 0.00901
Epoch: 23214, Training Loss: 0.01104
Epoch: 23214, Training Loss: 0.00854
Epoch: 23215, Training Loss: 0.00967
Epoch: 23215, Training Loss: 0.00901
Epoch: 23215, Training Loss: 0.01104
Epoch: 23215, Training Loss: 0.00854
Epoch: 23216, Training Loss: 0.00967
Epoch: 23216, Training Loss: 0.00901
Epoch: 23216, Training Loss: 0.01104
Epoch: 23216, Training Loss: 0.00854
Epoch: 23217, Training Loss: 0.00967
Epoch: 23217, Training Loss: 0.00901
Epoch: 23217, Training Loss: 0.01104
Epoch: 23217, Training Loss: 0.00853
Epoch: 23218, Training Loss: 0.00967
Epoch: 23218, Training Loss: 0.00901
Epoch: 23218, Training Loss: 0.01104
Epoch: 23218, Training Loss: 0.00853
Epoch: 23219, Training Loss: 0.00967
Epoch: 23219, Training Loss: 0.00901
Epoch: 23219, Training Loss: 0.01104
Epoch: 23219, Training Loss: 0.00853
Epoch: 23220, Training Loss: 0.00967
Epoch: 23220, Training Loss: 0.00901
Epoch: 23220, Training Loss: 0.01104
Epoch: 23220, Training Loss: 0.00853
Epoch: 23221, Training Loss: 0.00967
Epoch: 23221, Training Loss: 0.00901
Epoch: 23221, Training Loss: 0.01104
Epoch: 23221, Training Loss: 0.00853
Epoch: 23222, Training Loss: 0.00967
Epoch: 23222, Training Loss: 0.00901
Epoch: 23222, Training Loss: 0.01104
Epoch: 23222, Training Loss: 0.00853
Epoch: 23223, Training Loss: 0.00967
Epoch: 23223, Training Loss: 0.00901
Epoch: 23223, Training Loss: 0.01104
Epoch: 23223, Training Loss: 0.00853
Epoch: 23224, Training Loss: 0.00967
Epoch: 23224, Training Loss: 0.00901
Epoch: 23224, Training Loss: 0.01104
Epoch: 23224, Training Loss: 0.00853
Epoch: 23225, Training Loss: 0.00967
Epoch: 23225, Training Loss: 0.00901
Epoch: 23225, Training Loss: 0.01104
Epoch: 23225, Training Loss: 0.00853
Epoch: 23226, Training Loss: 0.00967
Epoch: 23226, Training Loss: 0.00901
Epoch: 23226, Training Loss: 0.01104
Epoch: 23226, Training Loss: 0.00853
Epoch: 23227, Training Loss: 0.00967
Epoch: 23227, Training Loss: 0.00901
Epoch: 23227, Training Loss: 0.01104
Epoch: 23227, Training Loss: 0.00853
Epoch: 23228, Training Loss: 0.00967
Epoch: 23228, Training Loss: 0.00901
Epoch: 23228, Training Loss: 0.01104
Epoch: 23228, Training Loss: 0.00853
Epoch: 23229, Training Loss: 0.00967
Epoch: 23229, Training Loss: 0.00901
Epoch: 23229, Training Loss: 0.01104
Epoch: 23229, Training Loss: 0.00853
Epoch: 23230, Training Loss: 0.00967
Epoch: 23230, Training Loss: 0.00901
Epoch: 23230, Training Loss: 0.01104
Epoch: 23230, Training Loss: 0.00853
Epoch: 23231, Training Loss: 0.00967
Epoch: 23231, Training Loss: 0.00901
Epoch: 23231, Training Loss: 0.01104
Epoch: 23231, Training Loss: 0.00853
Epoch: 23232, Training Loss: 0.00967
Epoch: 23232, Training Loss: 0.00901
Epoch: 23232, Training Loss: 0.01104
Epoch: 23232, Training Loss: 0.00853
Epoch: 23233, Training Loss: 0.00967
Epoch: 23233, Training Loss: 0.00901
Epoch: 23233, Training Loss: 0.01104
Epoch: 23233, Training Loss: 0.00853
Epoch: 23234, Training Loss: 0.00967
Epoch: 23234, Training Loss: 0.00901
Epoch: 23234, Training Loss: 0.01104
Epoch: 23234, Training Loss: 0.00853
Epoch: 23235, Training Loss: 0.00967
Epoch: 23235, Training Loss: 0.00901
Epoch: 23235, Training Loss: 0.01104
Epoch: 23235, Training Loss: 0.00853
Epoch: 23236, Training Loss: 0.00967
Epoch: 23236, Training Loss: 0.00901
Epoch: 23236, Training Loss: 0.01104
Epoch: 23236, Training Loss: 0.00853
Epoch: 23237, Training Loss: 0.00967
Epoch: 23237, Training Loss: 0.00901
Epoch: 23237, Training Loss: 0.01104
Epoch: 23237, Training Loss: 0.00853
Epoch: 23238, Training Loss: 0.00967
Epoch: 23238, Training Loss: 0.00901
Epoch: 23238, Training Loss: 0.01104
Epoch: 23238, Training Loss: 0.00853
Epoch: 23239, Training Loss: 0.00967
Epoch: 23239, Training Loss: 0.00901
Epoch: 23239, Training Loss: 0.01104
Epoch: 23239, Training Loss: 0.00853
Epoch: 23240, Training Loss: 0.00967
Epoch: 23240, Training Loss: 0.00901
Epoch: 23240, Training Loss: 0.01104
Epoch: 23240, Training Loss: 0.00853
Epoch: 23241, Training Loss: 0.00967
Epoch: 23241, Training Loss: 0.00900
Epoch: 23241, Training Loss: 0.01104
Epoch: 23241, Training Loss: 0.00853
Epoch: 23242, Training Loss: 0.00967
Epoch: 23242, Training Loss: 0.00900
Epoch: 23242, Training Loss: 0.01104
Epoch: 23242, Training Loss: 0.00853
Epoch: 23243, Training Loss: 0.00967
Epoch: 23243, Training Loss: 0.00900
Epoch: 23243, Training Loss: 0.01104
Epoch: 23243, Training Loss: 0.00853
Epoch: 23244, Training Loss: 0.00967
Epoch: 23244, Training Loss: 0.00900
Epoch: 23244, Training Loss: 0.01104
Epoch: 23244, Training Loss: 0.00853
Epoch: 23245, Training Loss: 0.00967
Epoch: 23245, Training Loss: 0.00900
Epoch: 23245, Training Loss: 0.01104
Epoch: 23245, Training Loss: 0.00853
Epoch: 23246, Training Loss: 0.00967
Epoch: 23246, Training Loss: 0.00900
Epoch: 23246, Training Loss: 0.01104
Epoch: 23246, Training Loss: 0.00853
Epoch: 23247, Training Loss: 0.00967
Epoch: 23247, Training Loss: 0.00900
Epoch: 23247, Training Loss: 0.01104
Epoch: 23247, Training Loss: 0.00853
Epoch: 23248, Training Loss: 0.00967
Epoch: 23248, Training Loss: 0.00900
Epoch: 23248, Training Loss: 0.01103
Epoch: 23248, Training Loss: 0.00853
Epoch: 23249, Training Loss: 0.00966
Epoch: 23249, Training Loss: 0.00900
Epoch: 23249, Training Loss: 0.01103
Epoch: 23249, Training Loss: 0.00853
Epoch: 23250, Training Loss: 0.00966
Epoch: 23250, Training Loss: 0.00900
Epoch: 23250, Training Loss: 0.01103
Epoch: 23250, Training Loss: 0.00853
Epoch: 23251, Training Loss: 0.00966
Epoch: 23251, Training Loss: 0.00900
Epoch: 23251, Training Loss: 0.01103
Epoch: 23251, Training Loss: 0.00853
Epoch: 23252, Training Loss: 0.00966
Epoch: 23252, Training Loss: 0.00900
Epoch: 23252, Training Loss: 0.01103
Epoch: 23252, Training Loss: 0.00853
Epoch: 23253, Training Loss: 0.00966
Epoch: 23253, Training Loss: 0.00900
Epoch: 23253, Training Loss: 0.01103
Epoch: 23253, Training Loss: 0.00853
Epoch: 23254, Training Loss: 0.00966
Epoch: 23254, Training Loss: 0.00900
Epoch: 23254, Training Loss: 0.01103
Epoch: 23254, Training Loss: 0.00853
Epoch: 23255, Training Loss: 0.00966
Epoch: 23255, Training Loss: 0.00900
Epoch: 23255, Training Loss: 0.01103
Epoch: 23255, Training Loss: 0.00853
Epoch: 23256, Training Loss: 0.00966
Epoch: 23256, Training Loss: 0.00900
Epoch: 23256, Training Loss: 0.01103
Epoch: 23256, Training Loss: 0.00853
Epoch: 23257, Training Loss: 0.00966
Epoch: 23257, Training Loss: 0.00900
Epoch: 23257, Training Loss: 0.01103
Epoch: 23257, Training Loss: 0.00853
Epoch: 23258, Training Loss: 0.00966
Epoch: 23258, Training Loss: 0.00900
Epoch: 23258, Training Loss: 0.01103
Epoch: 23258, Training Loss: 0.00853
Epoch: 23259, Training Loss: 0.00966
Epoch: 23259, Training Loss: 0.00900
Epoch: 23259, Training Loss: 0.01103
Epoch: 23259, Training Loss: 0.00853
Epoch: 23260, Training Loss: 0.00966
Epoch: 23260, Training Loss: 0.00900
Epoch: 23260, Training Loss: 0.01103
Epoch: 23260, Training Loss: 0.00853
Epoch: 23261, Training Loss: 0.00966
Epoch: 23261, Training Loss: 0.00900
Epoch: 23261, Training Loss: 0.01103
Epoch: 23261, Training Loss: 0.00853
Epoch: 23262, Training Loss: 0.00966
Epoch: 23262, Training Loss: 0.00900
Epoch: 23262, Training Loss: 0.01103
Epoch: 23262, Training Loss: 0.00853
Epoch: 23263, Training Loss: 0.00966
Epoch: 23263, Training Loss: 0.00900
Epoch: 23263, Training Loss: 0.01103
Epoch: 23263, Training Loss: 0.00853
Epoch: 23264, Training Loss: 0.00966
Epoch: 23264, Training Loss: 0.00900
Epoch: 23264, Training Loss: 0.01103
Epoch: 23264, Training Loss: 0.00853
Epoch: 23265, Training Loss: 0.00966
Epoch: 23265, Training Loss: 0.00900
Epoch: 23265, Training Loss: 0.01103
Epoch: 23265, Training Loss: 0.00853
Epoch: 23266, Training Loss: 0.00966
Epoch: 23266, Training Loss: 0.00900
Epoch: 23266, Training Loss: 0.01103
Epoch: 23266, Training Loss: 0.00853
Epoch: 23267, Training Loss: 0.00966
Epoch: 23267, Training Loss: 0.00900
Epoch: 23267, Training Loss: 0.01103
Epoch: 23267, Training Loss: 0.00853
Epoch: 23268, Training Loss: 0.00966
Epoch: 23268, Training Loss: 0.00900
Epoch: 23268, Training Loss: 0.01103
Epoch: 23268, Training Loss: 0.00853
Epoch: 23269, Training Loss: 0.00966
Epoch: 23269, Training Loss: 0.00900
Epoch: 23269, Training Loss: 0.01103
Epoch: 23269, Training Loss: 0.00852
Epoch: 23270, Training Loss: 0.00966
Epoch: 23270, Training Loss: 0.00900
Epoch: 23270, Training Loss: 0.01103
Epoch: 23270, Training Loss: 0.00852
Epoch: 23271, Training Loss: 0.00966
Epoch: 23271, Training Loss: 0.00900
Epoch: 23271, Training Loss: 0.01103
Epoch: 23271, Training Loss: 0.00852
Epoch: 23272, Training Loss: 0.00966
Epoch: 23272, Training Loss: 0.00900
Epoch: 23272, Training Loss: 0.01103
Epoch: 23272, Training Loss: 0.00852
Epoch: 23273, Training Loss: 0.00966
Epoch: 23273, Training Loss: 0.00900
Epoch: 23273, Training Loss: 0.01103
Epoch: 23273, Training Loss: 0.00852
Epoch: 23274, Training Loss: 0.00966
Epoch: 23274, Training Loss: 0.00900
Epoch: 23274, Training Loss: 0.01103
Epoch: 23274, Training Loss: 0.00852
Epoch: 23275, Training Loss: 0.00966
Epoch: 23275, Training Loss: 0.00900
Epoch: 23275, Training Loss: 0.01103
Epoch: 23275, Training Loss: 0.00852
Epoch: 23276, Training Loss: 0.00966
Epoch: 23276, Training Loss: 0.00900
Epoch: 23276, Training Loss: 0.01103
Epoch: 23276, Training Loss: 0.00852
Epoch: 23277, Training Loss: 0.00966
Epoch: 23277, Training Loss: 0.00900
Epoch: 23277, Training Loss: 0.01103
Epoch: 23277, Training Loss: 0.00852
Epoch: 23278, Training Loss: 0.00966
Epoch: 23278, Training Loss: 0.00900
Epoch: 23278, Training Loss: 0.01103
Epoch: 23278, Training Loss: 0.00852
Epoch: 23279, Training Loss: 0.00966
Epoch: 23279, Training Loss: 0.00900
Epoch: 23279, Training Loss: 0.01103
Epoch: 23279, Training Loss: 0.00852
Epoch: 23280, Training Loss: 0.00966
Epoch: 23280, Training Loss: 0.00900
Epoch: 23280, Training Loss: 0.01103
Epoch: 23280, Training Loss: 0.00852
Epoch: 23281, Training Loss: 0.00966
Epoch: 23281, Training Loss: 0.00900
Epoch: 23281, Training Loss: 0.01103
Epoch: 23281, Training Loss: 0.00852
Epoch: 23282, Training Loss: 0.00966
Epoch: 23282, Training Loss: 0.00900
Epoch: 23282, Training Loss: 0.01103
Epoch: 23282, Training Loss: 0.00852
Epoch: 23283, Training Loss: 0.00966
Epoch: 23283, Training Loss: 0.00900
Epoch: 23283, Training Loss: 0.01103
Epoch: 23283, Training Loss: 0.00852
Epoch: 23284, Training Loss: 0.00966
Epoch: 23284, Training Loss: 0.00900
Epoch: 23284, Training Loss: 0.01103
Epoch: 23284, Training Loss: 0.00852
Epoch: 23285, Training Loss: 0.00966
Epoch: 23285, Training Loss: 0.00900
Epoch: 23285, Training Loss: 0.01103
Epoch: 23285, Training Loss: 0.00852
Epoch: 23286, Training Loss: 0.00966
Epoch: 23286, Training Loss: 0.00900
Epoch: 23286, Training Loss: 0.01103
Epoch: 23286, Training Loss: 0.00852
Epoch: 23287, Training Loss: 0.00966
Epoch: 23287, Training Loss: 0.00900
Epoch: 23287, Training Loss: 0.01102
Epoch: 23287, Training Loss: 0.00852
Epoch: 23288, Training Loss: 0.00966
Epoch: 23288, Training Loss: 0.00900
Epoch: 23288, Training Loss: 0.01102
Epoch: 23288, Training Loss: 0.00852
Epoch: 23289, Training Loss: 0.00966
Epoch: 23289, Training Loss: 0.00900
Epoch: 23289, Training Loss: 0.01102
Epoch: 23289, Training Loss: 0.00852
Epoch: 23290, Training Loss: 0.00966
Epoch: 23290, Training Loss: 0.00899
Epoch: 23290, Training Loss: 0.01102
Epoch: 23290, Training Loss: 0.00852
Epoch: 23291, Training Loss: 0.00966
Epoch: 23291, Training Loss: 0.00899
Epoch: 23291, Training Loss: 0.01102
Epoch: 23291, Training Loss: 0.00852
Epoch: 23292, Training Loss: 0.00966
Epoch: 23292, Training Loss: 0.00899
Epoch: 23292, Training Loss: 0.01102
Epoch: 23292, Training Loss: 0.00852
Epoch: 23293, Training Loss: 0.00966
Epoch: 23293, Training Loss: 0.00899
Epoch: 23293, Training Loss: 0.01102
Epoch: 23293, Training Loss: 0.00852
Epoch: 23294, Training Loss: 0.00965
Epoch: 23294, Training Loss: 0.00899
Epoch: 23294, Training Loss: 0.01102
Epoch: 23294, Training Loss: 0.00852
Epoch: 23295, Training Loss: 0.00965
Epoch: 23295, Training Loss: 0.00899
Epoch: 23295, Training Loss: 0.01102
Epoch: 23295, Training Loss: 0.00852
Epoch: 23296, Training Loss: 0.00965
Epoch: 23296, Training Loss: 0.00899
Epoch: 23296, Training Loss: 0.01102
Epoch: 23296, Training Loss: 0.00852
Epoch: 23297, Training Loss: 0.00965
Epoch: 23297, Training Loss: 0.00899
Epoch: 23297, Training Loss: 0.01102
Epoch: 23297, Training Loss: 0.00852
Epoch: 23298, Training Loss: 0.00965
Epoch: 23298, Training Loss: 0.00899
Epoch: 23298, Training Loss: 0.01102
Epoch: 23298, Training Loss: 0.00852
Epoch: 23299, Training Loss: 0.00965
Epoch: 23299, Training Loss: 0.00899
Epoch: 23299, Training Loss: 0.01102
Epoch: 23299, Training Loss: 0.00852
Epoch: 23300, Training Loss: 0.00965
Epoch: 23300, Training Loss: 0.00899
Epoch: 23300, Training Loss: 0.01102
Epoch: 23300, Training Loss: 0.00852
Epoch: 23301, Training Loss: 0.00965
Epoch: 23301, Training Loss: 0.00899
Epoch: 23301, Training Loss: 0.01102
Epoch: 23301, Training Loss: 0.00852
Epoch: 23302, Training Loss: 0.00965
Epoch: 23302, Training Loss: 0.00899
Epoch: 23302, Training Loss: 0.01102
Epoch: 23302, Training Loss: 0.00852
Epoch: 23303, Training Loss: 0.00965
Epoch: 23303, Training Loss: 0.00899
Epoch: 23303, Training Loss: 0.01102
Epoch: 23303, Training Loss: 0.00852
Epoch: 23304, Training Loss: 0.00965
Epoch: 23304, Training Loss: 0.00899
Epoch: 23304, Training Loss: 0.01102
Epoch: 23304, Training Loss: 0.00852
Epoch: 23305, Training Loss: 0.00965
Epoch: 23305, Training Loss: 0.00899
Epoch: 23305, Training Loss: 0.01102
Epoch: 23305, Training Loss: 0.00852
Epoch: 23306, Training Loss: 0.00965
Epoch: 23306, Training Loss: 0.00899
Epoch: 23306, Training Loss: 0.01102
Epoch: 23306, Training Loss: 0.00852
Epoch: 23307, Training Loss: 0.00965
Epoch: 23307, Training Loss: 0.00899
Epoch: 23307, Training Loss: 0.01102
Epoch: 23307, Training Loss: 0.00852
Epoch: 23308, Training Loss: 0.00965
Epoch: 23308, Training Loss: 0.00899
Epoch: 23308, Training Loss: 0.01102
Epoch: 23308, Training Loss: 0.00852
Epoch: 23309, Training Loss: 0.00965
Epoch: 23309, Training Loss: 0.00899
Epoch: 23309, Training Loss: 0.01102
Epoch: 23309, Training Loss: 0.00852
Epoch: 23310, Training Loss: 0.00965
Epoch: 23310, Training Loss: 0.00899
Epoch: 23310, Training Loss: 0.01102
Epoch: 23310, Training Loss: 0.00852
Epoch: 23311, Training Loss: 0.00965
Epoch: 23311, Training Loss: 0.00899
Epoch: 23311, Training Loss: 0.01102
Epoch: 23311, Training Loss: 0.00852
Epoch: 23312, Training Loss: 0.00965
Epoch: 23312, Training Loss: 0.00899
Epoch: 23312, Training Loss: 0.01102
Epoch: 23312, Training Loss: 0.00852
Epoch: 23313, Training Loss: 0.00965
Epoch: 23313, Training Loss: 0.00899
Epoch: 23313, Training Loss: 0.01102
Epoch: 23313, Training Loss: 0.00852
Epoch: 23314, Training Loss: 0.00965
Epoch: 23314, Training Loss: 0.00899
Epoch: 23314, Training Loss: 0.01102
Epoch: 23314, Training Loss: 0.00852
Epoch: 23315, Training Loss: 0.00965
Epoch: 23315, Training Loss: 0.00899
Epoch: 23315, Training Loss: 0.01102
Epoch: 23315, Training Loss: 0.00852
Epoch: 23316, Training Loss: 0.00965
Epoch: 23316, Training Loss: 0.00899
Epoch: 23316, Training Loss: 0.01102
Epoch: 23316, Training Loss: 0.00852
Epoch: 23317, Training Loss: 0.00965
Epoch: 23317, Training Loss: 0.00899
Epoch: 23317, Training Loss: 0.01102
Epoch: 23317, Training Loss: 0.00852
Epoch: 23318, Training Loss: 0.00965
Epoch: 23318, Training Loss: 0.00899
Epoch: 23318, Training Loss: 0.01102
Epoch: 23318, Training Loss: 0.00852
Epoch: 23319, Training Loss: 0.00965
Epoch: 23319, Training Loss: 0.00899
Epoch: 23319, Training Loss: 0.01102
Epoch: 23319, Training Loss: 0.00852
Epoch: 23320, Training Loss: 0.00965
Epoch: 23320, Training Loss: 0.00899
Epoch: 23320, Training Loss: 0.01102
Epoch: 23320, Training Loss: 0.00852
Epoch: 23321, Training Loss: 0.00965
Epoch: 23321, Training Loss: 0.00899
Epoch: 23321, Training Loss: 0.01102
Epoch: 23321, Training Loss: 0.00851
Epoch: 23322, Training Loss: 0.00965
Epoch: 23322, Training Loss: 0.00899
Epoch: 23322, Training Loss: 0.01102
Epoch: 23322, Training Loss: 0.00851
Epoch: 23323, Training Loss: 0.00965
Epoch: 23323, Training Loss: 0.00899
Epoch: 23323, Training Loss: 0.01102
Epoch: 23323, Training Loss: 0.00851
Epoch: 23324, Training Loss: 0.00965
Epoch: 23324, Training Loss: 0.00899
Epoch: 23324, Training Loss: 0.01102
Epoch: 23324, Training Loss: 0.00851
Epoch: 23325, Training Loss: 0.00965
Epoch: 23325, Training Loss: 0.00899
Epoch: 23325, Training Loss: 0.01102
Epoch: 23325, Training Loss: 0.00851
Epoch: 23326, Training Loss: 0.00965
Epoch: 23326, Training Loss: 0.00899
Epoch: 23326, Training Loss: 0.01102
Epoch: 23326, Training Loss: 0.00851
Epoch: 23327, Training Loss: 0.00965
Epoch: 23327, Training Loss: 0.00899
Epoch: 23327, Training Loss: 0.01101
Epoch: 23327, Training Loss: 0.00851
Epoch: 23328, Training Loss: 0.00965
Epoch: 23328, Training Loss: 0.00899
Epoch: 23328, Training Loss: 0.01101
Epoch: 23328, Training Loss: 0.00851
Epoch: 23329, Training Loss: 0.00965
Epoch: 23329, Training Loss: 0.00899
Epoch: 23329, Training Loss: 0.01101
Epoch: 23329, Training Loss: 0.00851
Epoch: 23330, Training Loss: 0.00965
Epoch: 23330, Training Loss: 0.00899
Epoch: 23330, Training Loss: 0.01101
Epoch: 23330, Training Loss: 0.00851
Epoch: 23331, Training Loss: 0.00965
Epoch: 23331, Training Loss: 0.00899
Epoch: 23331, Training Loss: 0.01101
Epoch: 23331, Training Loss: 0.00851
Epoch: 23332, Training Loss: 0.00965
Epoch: 23332, Training Loss: 0.00899
Epoch: 23332, Training Loss: 0.01101
Epoch: 23332, Training Loss: 0.00851
Epoch: 23333, Training Loss: 0.00965
Epoch: 23333, Training Loss: 0.00899
Epoch: 23333, Training Loss: 0.01101
Epoch: 23333, Training Loss: 0.00851
Epoch: 23334, Training Loss: 0.00965
Epoch: 23334, Training Loss: 0.00899
Epoch: 23334, Training Loss: 0.01101
Epoch: 23334, Training Loss: 0.00851
Epoch: 23335, Training Loss: 0.00965
Epoch: 23335, Training Loss: 0.00899
Epoch: 23335, Training Loss: 0.01101
Epoch: 23335, Training Loss: 0.00851
Epoch: 23336, Training Loss: 0.00965
Epoch: 23336, Training Loss: 0.00899
Epoch: 23336, Training Loss: 0.01101
Epoch: 23336, Training Loss: 0.00851
Epoch: 23337, Training Loss: 0.00965
Epoch: 23337, Training Loss: 0.00899
Epoch: 23337, Training Loss: 0.01101
Epoch: 23337, Training Loss: 0.00851
Epoch: 23338, Training Loss: 0.00965
Epoch: 23338, Training Loss: 0.00899
Epoch: 23338, Training Loss: 0.01101
Epoch: 23338, Training Loss: 0.00851
Epoch: 23339, Training Loss: 0.00964
Epoch: 23339, Training Loss: 0.00899
Epoch: 23339, Training Loss: 0.01101
Epoch: 23339, Training Loss: 0.00851
Epoch: 23340, Training Loss: 0.00964
Epoch: 23340, Training Loss: 0.00898
Epoch: 23340, Training Loss: 0.01101
Epoch: 23340, Training Loss: 0.00851
Epoch: 23341, Training Loss: 0.00964
Epoch: 23341, Training Loss: 0.00898
Epoch: 23341, Training Loss: 0.01101
Epoch: 23341, Training Loss: 0.00851
Epoch: 23342, Training Loss: 0.00964
Epoch: 23342, Training Loss: 0.00898
Epoch: 23342, Training Loss: 0.01101
Epoch: 23342, Training Loss: 0.00851
Epoch: 23343, Training Loss: 0.00964
Epoch: 23343, Training Loss: 0.00898
Epoch: 23343, Training Loss: 0.01101
Epoch: 23343, Training Loss: 0.00851
Epoch: 23344, Training Loss: 0.00964
Epoch: 23344, Training Loss: 0.00898
Epoch: 23344, Training Loss: 0.01101
Epoch: 23344, Training Loss: 0.00851
Epoch: 23345, Training Loss: 0.00964
Epoch: 23345, Training Loss: 0.00898
Epoch: 23345, Training Loss: 0.01101
Epoch: 23345, Training Loss: 0.00851
Epoch: 23346, Training Loss: 0.00964
Epoch: 23346, Training Loss: 0.00898
Epoch: 23346, Training Loss: 0.01101
Epoch: 23346, Training Loss: 0.00851
Epoch: 23347, Training Loss: 0.00964
Epoch: 23347, Training Loss: 0.00898
Epoch: 23347, Training Loss: 0.01101
Epoch: 23347, Training Loss: 0.00851
Epoch: 23348, Training Loss: 0.00964
Epoch: 23348, Training Loss: 0.00898
Epoch: 23348, Training Loss: 0.01101
Epoch: 23348, Training Loss: 0.00851
Epoch: 23349, Training Loss: 0.00964
Epoch: 23349, Training Loss: 0.00898
Epoch: 23349, Training Loss: 0.01101
Epoch: 23349, Training Loss: 0.00851
Epoch: 23350, Training Loss: 0.00964
Epoch: 23350, Training Loss: 0.00898
Epoch: 23350, Training Loss: 0.01101
Epoch: 23350, Training Loss: 0.00851
Epoch: 23351, Training Loss: 0.00964
Epoch: 23351, Training Loss: 0.00898
Epoch: 23351, Training Loss: 0.01101
Epoch: 23351, Training Loss: 0.00851
Epoch: 23352, Training Loss: 0.00964
Epoch: 23352, Training Loss: 0.00898
Epoch: 23352, Training Loss: 0.01101
Epoch: 23352, Training Loss: 0.00851
Epoch: 23353, Training Loss: 0.00964
Epoch: 23353, Training Loss: 0.00898
Epoch: 23353, Training Loss: 0.01101
Epoch: 23353, Training Loss: 0.00851
Epoch: 23354, Training Loss: 0.00964
Epoch: 23354, Training Loss: 0.00898
Epoch: 23354, Training Loss: 0.01101
Epoch: 23354, Training Loss: 0.00851
Epoch: 23355, Training Loss: 0.00964
Epoch: 23355, Training Loss: 0.00898
Epoch: 23355, Training Loss: 0.01101
Epoch: 23355, Training Loss: 0.00851
Epoch: 23356, Training Loss: 0.00964
Epoch: 23356, Training Loss: 0.00898
Epoch: 23356, Training Loss: 0.01101
Epoch: 23356, Training Loss: 0.00851
Epoch: 23357, Training Loss: 0.00964
Epoch: 23357, Training Loss: 0.00898
Epoch: 23357, Training Loss: 0.01101
Epoch: 23357, Training Loss: 0.00851
Epoch: 23358, Training Loss: 0.00964
Epoch: 23358, Training Loss: 0.00898
Epoch: 23358, Training Loss: 0.01101
Epoch: 23358, Training Loss: 0.00851
Epoch: 23359, Training Loss: 0.00964
Epoch: 23359, Training Loss: 0.00898
Epoch: 23359, Training Loss: 0.01101
Epoch: 23359, Training Loss: 0.00851
Epoch: 23360, Training Loss: 0.00964
Epoch: 23360, Training Loss: 0.00898
Epoch: 23360, Training Loss: 0.01101
Epoch: 23360, Training Loss: 0.00851
Epoch: 23361, Training Loss: 0.00964
Epoch: 23361, Training Loss: 0.00898
Epoch: 23361, Training Loss: 0.01101
Epoch: 23361, Training Loss: 0.00851
Epoch: 23362, Training Loss: 0.00964
Epoch: 23362, Training Loss: 0.00898
Epoch: 23362, Training Loss: 0.01101
Epoch: 23362, Training Loss: 0.00851
Epoch: 23363, Training Loss: 0.00964
Epoch: 23363, Training Loss: 0.00898
Epoch: 23363, Training Loss: 0.01101
Epoch: 23363, Training Loss: 0.00851
Epoch: 23364, Training Loss: 0.00964
Epoch: 23364, Training Loss: 0.00898
Epoch: 23364, Training Loss: 0.01101
Epoch: 23364, Training Loss: 0.00851
Epoch: 23365, Training Loss: 0.00964
Epoch: 23365, Training Loss: 0.00898
Epoch: 23365, Training Loss: 0.01101
Epoch: 23365, Training Loss: 0.00851
Epoch: 23366, Training Loss: 0.00964
Epoch: 23366, Training Loss: 0.00898
Epoch: 23366, Training Loss: 0.01100
Epoch: 23366, Training Loss: 0.00851
Epoch: 23367, Training Loss: 0.00964
Epoch: 23367, Training Loss: 0.00898
Epoch: 23367, Training Loss: 0.01100
Epoch: 23367, Training Loss: 0.00851
Epoch: 23368, Training Loss: 0.00964
Epoch: 23368, Training Loss: 0.00898
Epoch: 23368, Training Loss: 0.01100
Epoch: 23368, Training Loss: 0.00851
Epoch: 23369, Training Loss: 0.00964
Epoch: 23369, Training Loss: 0.00898
Epoch: 23369, Training Loss: 0.01100
Epoch: 23369, Training Loss: 0.00851
Epoch: 23370, Training Loss: 0.00964
Epoch: 23370, Training Loss: 0.00898
Epoch: 23370, Training Loss: 0.01100
Epoch: 23370, Training Loss: 0.00851
Epoch: 23371, Training Loss: 0.00964
Epoch: 23371, Training Loss: 0.00898
Epoch: 23371, Training Loss: 0.01100
Epoch: 23371, Training Loss: 0.00851
Epoch: 23372, Training Loss: 0.00964
Epoch: 23372, Training Loss: 0.00898
Epoch: 23372, Training Loss: 0.01100
Epoch: 23372, Training Loss: 0.00851
Epoch: 23373, Training Loss: 0.00964
Epoch: 23373, Training Loss: 0.00898
Epoch: 23373, Training Loss: 0.01100
Epoch: 23373, Training Loss: 0.00850
Epoch: 23374, Training Loss: 0.00964
Epoch: 23374, Training Loss: 0.00898
Epoch: 23374, Training Loss: 0.01100
Epoch: 23374, Training Loss: 0.00850
Epoch: 23375, Training Loss: 0.00964
Epoch: 23375, Training Loss: 0.00898
Epoch: 23375, Training Loss: 0.01100
Epoch: 23375, Training Loss: 0.00850
Epoch: 23376, Training Loss: 0.00964
Epoch: 23376, Training Loss: 0.00898
Epoch: 23376, Training Loss: 0.01100
Epoch: 23376, Training Loss: 0.00850
Epoch: 23377, Training Loss: 0.00964
Epoch: 23377, Training Loss: 0.00898
Epoch: 23377, Training Loss: 0.01100
Epoch: 23377, Training Loss: 0.00850
Epoch: 23378, Training Loss: 0.00964
Epoch: 23378, Training Loss: 0.00898
Epoch: 23378, Training Loss: 0.01100
Epoch: 23378, Training Loss: 0.00850
Epoch: 23379, Training Loss: 0.00964
Epoch: 23379, Training Loss: 0.00898
Epoch: 23379, Training Loss: 0.01100
Epoch: 23379, Training Loss: 0.00850
Epoch: 23380, Training Loss: 0.00964
Epoch: 23380, Training Loss: 0.00898
Epoch: 23380, Training Loss: 0.01100
Epoch: 23380, Training Loss: 0.00850
Epoch: 23381, Training Loss: 0.00964
Epoch: 23381, Training Loss: 0.00898
Epoch: 23381, Training Loss: 0.01100
Epoch: 23381, Training Loss: 0.00850
Epoch: 23382, Training Loss: 0.00964
Epoch: 23382, Training Loss: 0.00898
Epoch: 23382, Training Loss: 0.01100
Epoch: 23382, Training Loss: 0.00850
Epoch: 23383, Training Loss: 0.00964
Epoch: 23383, Training Loss: 0.00898
Epoch: 23383, Training Loss: 0.01100
Epoch: 23383, Training Loss: 0.00850
Epoch: 23384, Training Loss: 0.00963
Epoch: 23384, Training Loss: 0.00898
Epoch: 23384, Training Loss: 0.01100
Epoch: 23384, Training Loss: 0.00850
Epoch: 23385, Training Loss: 0.00963
Epoch: 23385, Training Loss: 0.00898
Epoch: 23385, Training Loss: 0.01100
Epoch: 23385, Training Loss: 0.00850
Epoch: 23386, Training Loss: 0.00963
Epoch: 23386, Training Loss: 0.00898
Epoch: 23386, Training Loss: 0.01100
Epoch: 23386, Training Loss: 0.00850
Epoch: 23387, Training Loss: 0.00963
Epoch: 23387, Training Loss: 0.00898
Epoch: 23387, Training Loss: 0.01100
Epoch: 23387, Training Loss: 0.00850
Epoch: 23388, Training Loss: 0.00963
Epoch: 23388, Training Loss: 0.00898
Epoch: 23388, Training Loss: 0.01100
Epoch: 23388, Training Loss: 0.00850
Epoch: 23389, Training Loss: 0.00963
Epoch: 23389, Training Loss: 0.00897
Epoch: 23389, Training Loss: 0.01100
Epoch: 23389, Training Loss: 0.00850
Epoch: 23390, Training Loss: 0.00963
Epoch: 23390, Training Loss: 0.00897
Epoch: 23390, Training Loss: 0.01100
Epoch: 23390, Training Loss: 0.00850
Epoch: 23391, Training Loss: 0.00963
Epoch: 23391, Training Loss: 0.00897
Epoch: 23391, Training Loss: 0.01100
Epoch: 23391, Training Loss: 0.00850
Epoch: 23392, Training Loss: 0.00963
Epoch: 23392, Training Loss: 0.00897
Epoch: 23392, Training Loss: 0.01100
Epoch: 23392, Training Loss: 0.00850
Epoch: 23393, Training Loss: 0.00963
Epoch: 23393, Training Loss: 0.00897
Epoch: 23393, Training Loss: 0.01100
Epoch: 23393, Training Loss: 0.00850
Epoch: 23394, Training Loss: 0.00963
Epoch: 23394, Training Loss: 0.00897
Epoch: 23394, Training Loss: 0.01100
Epoch: 23394, Training Loss: 0.00850
Epoch: 23395, Training Loss: 0.00963
Epoch: 23395, Training Loss: 0.00897
[Output truncated: the training cell prints one loss per XOR pattern per epoch. Over epochs 23395–23639 the four per-pattern losses decline very slowly, from approximately (0.00963, 0.00897, 0.01100, 0.00850) to (0.00958, 0.00892, 0.01094, 0.00845).]
Epoch: 23639, Training Loss: 0.00892
Epoch: 23639, Training Loss: 0.01094
Epoch: 23639, Training Loss: 0.00845
Epoch: 23640, Training Loss: 0.00958
Epoch: 23640, Training Loss: 0.00892
Epoch: 23640, Training Loss: 0.01094
Epoch: 23640, Training Loss: 0.00845
Epoch: 23641, Training Loss: 0.00958
Epoch: 23641, Training Loss: 0.00892
Epoch: 23641, Training Loss: 0.01094
Epoch: 23641, Training Loss: 0.00845
Epoch: 23642, Training Loss: 0.00958
Epoch: 23642, Training Loss: 0.00892
Epoch: 23642, Training Loss: 0.01094
Epoch: 23642, Training Loss: 0.00845
Epoch: 23643, Training Loss: 0.00958
Epoch: 23643, Training Loss: 0.00892
Epoch: 23643, Training Loss: 0.01094
Epoch: 23643, Training Loss: 0.00845
Epoch: 23644, Training Loss: 0.00958
Epoch: 23644, Training Loss: 0.00892
Epoch: 23644, Training Loss: 0.01094
Epoch: 23644, Training Loss: 0.00845
Epoch: 23645, Training Loss: 0.00958
Epoch: 23645, Training Loss: 0.00892
Epoch: 23645, Training Loss: 0.01094
Epoch: 23645, Training Loss: 0.00845
Epoch: 23646, Training Loss: 0.00958
Epoch: 23646, Training Loss: 0.00892
Epoch: 23646, Training Loss: 0.01093
Epoch: 23646, Training Loss: 0.00845
Epoch: 23647, Training Loss: 0.00958
Epoch: 23647, Training Loss: 0.00892
Epoch: 23647, Training Loss: 0.01093
Epoch: 23647, Training Loss: 0.00845
Epoch: 23648, Training Loss: 0.00958
Epoch: 23648, Training Loss: 0.00892
Epoch: 23648, Training Loss: 0.01093
Epoch: 23648, Training Loss: 0.00845
Epoch: 23649, Training Loss: 0.00958
Epoch: 23649, Training Loss: 0.00892
Epoch: 23649, Training Loss: 0.01093
Epoch: 23649, Training Loss: 0.00845
Epoch: 23650, Training Loss: 0.00958
Epoch: 23650, Training Loss: 0.00892
Epoch: 23650, Training Loss: 0.01093
Epoch: 23650, Training Loss: 0.00845
Epoch: 23651, Training Loss: 0.00958
Epoch: 23651, Training Loss: 0.00892
Epoch: 23651, Training Loss: 0.01093
Epoch: 23651, Training Loss: 0.00845
Epoch: 23652, Training Loss: 0.00958
Epoch: 23652, Training Loss: 0.00892
Epoch: 23652, Training Loss: 0.01093
Epoch: 23652, Training Loss: 0.00845
Epoch: 23653, Training Loss: 0.00958
Epoch: 23653, Training Loss: 0.00892
Epoch: 23653, Training Loss: 0.01093
Epoch: 23653, Training Loss: 0.00845
Epoch: 23654, Training Loss: 0.00958
Epoch: 23654, Training Loss: 0.00892
Epoch: 23654, Training Loss: 0.01093
Epoch: 23654, Training Loss: 0.00845
Epoch: 23655, Training Loss: 0.00958
Epoch: 23655, Training Loss: 0.00892
Epoch: 23655, Training Loss: 0.01093
Epoch: 23655, Training Loss: 0.00845
Epoch: 23656, Training Loss: 0.00958
Epoch: 23656, Training Loss: 0.00892
Epoch: 23656, Training Loss: 0.01093
Epoch: 23656, Training Loss: 0.00845
Epoch: 23657, Training Loss: 0.00958
Epoch: 23657, Training Loss: 0.00892
Epoch: 23657, Training Loss: 0.01093
Epoch: 23657, Training Loss: 0.00845
Epoch: 23658, Training Loss: 0.00957
Epoch: 23658, Training Loss: 0.00892
Epoch: 23658, Training Loss: 0.01093
Epoch: 23658, Training Loss: 0.00845
Epoch: 23659, Training Loss: 0.00957
Epoch: 23659, Training Loss: 0.00892
Epoch: 23659, Training Loss: 0.01093
Epoch: 23659, Training Loss: 0.00845
Epoch: 23660, Training Loss: 0.00957
Epoch: 23660, Training Loss: 0.00892
Epoch: 23660, Training Loss: 0.01093
Epoch: 23660, Training Loss: 0.00845
Epoch: 23661, Training Loss: 0.00957
Epoch: 23661, Training Loss: 0.00892
Epoch: 23661, Training Loss: 0.01093
Epoch: 23661, Training Loss: 0.00845
Epoch: 23662, Training Loss: 0.00957
Epoch: 23662, Training Loss: 0.00892
Epoch: 23662, Training Loss: 0.01093
Epoch: 23662, Training Loss: 0.00845
Epoch: 23663, Training Loss: 0.00957
Epoch: 23663, Training Loss: 0.00892
Epoch: 23663, Training Loss: 0.01093
Epoch: 23663, Training Loss: 0.00845
Epoch: 23664, Training Loss: 0.00957
Epoch: 23664, Training Loss: 0.00892
Epoch: 23664, Training Loss: 0.01093
Epoch: 23664, Training Loss: 0.00845
Epoch: 23665, Training Loss: 0.00957
Epoch: 23665, Training Loss: 0.00892
Epoch: 23665, Training Loss: 0.01093
Epoch: 23665, Training Loss: 0.00845
Epoch: 23666, Training Loss: 0.00957
Epoch: 23666, Training Loss: 0.00892
Epoch: 23666, Training Loss: 0.01093
Epoch: 23666, Training Loss: 0.00845
Epoch: 23667, Training Loss: 0.00957
Epoch: 23667, Training Loss: 0.00892
Epoch: 23667, Training Loss: 0.01093
Epoch: 23667, Training Loss: 0.00845
Epoch: 23668, Training Loss: 0.00957
Epoch: 23668, Training Loss: 0.00892
Epoch: 23668, Training Loss: 0.01093
Epoch: 23668, Training Loss: 0.00845
Epoch: 23669, Training Loss: 0.00957
Epoch: 23669, Training Loss: 0.00892
Epoch: 23669, Training Loss: 0.01093
Epoch: 23669, Training Loss: 0.00845
Epoch: 23670, Training Loss: 0.00957
Epoch: 23670, Training Loss: 0.00892
Epoch: 23670, Training Loss: 0.01093
Epoch: 23670, Training Loss: 0.00845
Epoch: 23671, Training Loss: 0.00957
Epoch: 23671, Training Loss: 0.00892
Epoch: 23671, Training Loss: 0.01093
Epoch: 23671, Training Loss: 0.00845
Epoch: 23672, Training Loss: 0.00957
Epoch: 23672, Training Loss: 0.00892
Epoch: 23672, Training Loss: 0.01093
Epoch: 23672, Training Loss: 0.00845
Epoch: 23673, Training Loss: 0.00957
Epoch: 23673, Training Loss: 0.00892
Epoch: 23673, Training Loss: 0.01093
Epoch: 23673, Training Loss: 0.00845
Epoch: 23674, Training Loss: 0.00957
Epoch: 23674, Training Loss: 0.00892
Epoch: 23674, Training Loss: 0.01093
Epoch: 23674, Training Loss: 0.00845
Epoch: 23675, Training Loss: 0.00957
Epoch: 23675, Training Loss: 0.00892
Epoch: 23675, Training Loss: 0.01093
Epoch: 23675, Training Loss: 0.00845
Epoch: 23676, Training Loss: 0.00957
Epoch: 23676, Training Loss: 0.00892
Epoch: 23676, Training Loss: 0.01093
Epoch: 23676, Training Loss: 0.00845
Epoch: 23677, Training Loss: 0.00957
Epoch: 23677, Training Loss: 0.00892
Epoch: 23677, Training Loss: 0.01093
Epoch: 23677, Training Loss: 0.00845
Epoch: 23678, Training Loss: 0.00957
Epoch: 23678, Training Loss: 0.00892
Epoch: 23678, Training Loss: 0.01093
Epoch: 23678, Training Loss: 0.00845
Epoch: 23679, Training Loss: 0.00957
Epoch: 23679, Training Loss: 0.00892
Epoch: 23679, Training Loss: 0.01093
Epoch: 23679, Training Loss: 0.00845
Epoch: 23680, Training Loss: 0.00957
Epoch: 23680, Training Loss: 0.00892
Epoch: 23680, Training Loss: 0.01093
Epoch: 23680, Training Loss: 0.00845
Epoch: 23681, Training Loss: 0.00957
Epoch: 23681, Training Loss: 0.00892
Epoch: 23681, Training Loss: 0.01093
Epoch: 23681, Training Loss: 0.00845
Epoch: 23682, Training Loss: 0.00957
Epoch: 23682, Training Loss: 0.00892
Epoch: 23682, Training Loss: 0.01093
Epoch: 23682, Training Loss: 0.00845
Epoch: 23683, Training Loss: 0.00957
Epoch: 23683, Training Loss: 0.00892
Epoch: 23683, Training Loss: 0.01093
Epoch: 23683, Training Loss: 0.00845
Epoch: 23684, Training Loss: 0.00957
Epoch: 23684, Training Loss: 0.00892
Epoch: 23684, Training Loss: 0.01093
Epoch: 23684, Training Loss: 0.00845
Epoch: 23685, Training Loss: 0.00957
Epoch: 23685, Training Loss: 0.00892
Epoch: 23685, Training Loss: 0.01093
Epoch: 23685, Training Loss: 0.00845
Epoch: 23686, Training Loss: 0.00957
Epoch: 23686, Training Loss: 0.00892
Epoch: 23686, Training Loss: 0.01093
Epoch: 23686, Training Loss: 0.00845
Epoch: 23687, Training Loss: 0.00957
Epoch: 23687, Training Loss: 0.00892
Epoch: 23687, Training Loss: 0.01092
Epoch: 23687, Training Loss: 0.00845
Epoch: 23688, Training Loss: 0.00957
Epoch: 23688, Training Loss: 0.00892
Epoch: 23688, Training Loss: 0.01092
Epoch: 23688, Training Loss: 0.00844
Epoch: 23689, Training Loss: 0.00957
Epoch: 23689, Training Loss: 0.00891
Epoch: 23689, Training Loss: 0.01092
Epoch: 23689, Training Loss: 0.00844
Epoch: 23690, Training Loss: 0.00957
Epoch: 23690, Training Loss: 0.00891
Epoch: 23690, Training Loss: 0.01092
Epoch: 23690, Training Loss: 0.00844
Epoch: 23691, Training Loss: 0.00957
Epoch: 23691, Training Loss: 0.00891
Epoch: 23691, Training Loss: 0.01092
Epoch: 23691, Training Loss: 0.00844
Epoch: 23692, Training Loss: 0.00957
Epoch: 23692, Training Loss: 0.00891
Epoch: 23692, Training Loss: 0.01092
Epoch: 23692, Training Loss: 0.00844
Epoch: 23693, Training Loss: 0.00957
Epoch: 23693, Training Loss: 0.00891
Epoch: 23693, Training Loss: 0.01092
Epoch: 23693, Training Loss: 0.00844
Epoch: 23694, Training Loss: 0.00957
Epoch: 23694, Training Loss: 0.00891
Epoch: 23694, Training Loss: 0.01092
Epoch: 23694, Training Loss: 0.00844
Epoch: 23695, Training Loss: 0.00957
Epoch: 23695, Training Loss: 0.00891
Epoch: 23695, Training Loss: 0.01092
Epoch: 23695, Training Loss: 0.00844
Epoch: 23696, Training Loss: 0.00957
Epoch: 23696, Training Loss: 0.00891
Epoch: 23696, Training Loss: 0.01092
Epoch: 23696, Training Loss: 0.00844
Epoch: 23697, Training Loss: 0.00957
Epoch: 23697, Training Loss: 0.00891
Epoch: 23697, Training Loss: 0.01092
Epoch: 23697, Training Loss: 0.00844
Epoch: 23698, Training Loss: 0.00957
Epoch: 23698, Training Loss: 0.00891
Epoch: 23698, Training Loss: 0.01092
Epoch: 23698, Training Loss: 0.00844
Epoch: 23699, Training Loss: 0.00957
Epoch: 23699, Training Loss: 0.00891
Epoch: 23699, Training Loss: 0.01092
Epoch: 23699, Training Loss: 0.00844
Epoch: 23700, Training Loss: 0.00957
Epoch: 23700, Training Loss: 0.00891
Epoch: 23700, Training Loss: 0.01092
Epoch: 23700, Training Loss: 0.00844
Epoch: 23701, Training Loss: 0.00957
Epoch: 23701, Training Loss: 0.00891
Epoch: 23701, Training Loss: 0.01092
Epoch: 23701, Training Loss: 0.00844
Epoch: 23702, Training Loss: 0.00957
Epoch: 23702, Training Loss: 0.00891
Epoch: 23702, Training Loss: 0.01092
Epoch: 23702, Training Loss: 0.00844
Epoch: 23703, Training Loss: 0.00957
Epoch: 23703, Training Loss: 0.00891
Epoch: 23703, Training Loss: 0.01092
Epoch: 23703, Training Loss: 0.00844
Epoch: 23704, Training Loss: 0.00956
Epoch: 23704, Training Loss: 0.00891
Epoch: 23704, Training Loss: 0.01092
Epoch: 23704, Training Loss: 0.00844
Epoch: 23705, Training Loss: 0.00956
Epoch: 23705, Training Loss: 0.00891
Epoch: 23705, Training Loss: 0.01092
Epoch: 23705, Training Loss: 0.00844
Epoch: 23706, Training Loss: 0.00956
Epoch: 23706, Training Loss: 0.00891
Epoch: 23706, Training Loss: 0.01092
Epoch: 23706, Training Loss: 0.00844
Epoch: 23707, Training Loss: 0.00956
Epoch: 23707, Training Loss: 0.00891
Epoch: 23707, Training Loss: 0.01092
Epoch: 23707, Training Loss: 0.00844
Epoch: 23708, Training Loss: 0.00956
Epoch: 23708, Training Loss: 0.00891
Epoch: 23708, Training Loss: 0.01092
Epoch: 23708, Training Loss: 0.00844
Epoch: 23709, Training Loss: 0.00956
Epoch: 23709, Training Loss: 0.00891
Epoch: 23709, Training Loss: 0.01092
Epoch: 23709, Training Loss: 0.00844
Epoch: 23710, Training Loss: 0.00956
Epoch: 23710, Training Loss: 0.00891
Epoch: 23710, Training Loss: 0.01092
Epoch: 23710, Training Loss: 0.00844
Epoch: 23711, Training Loss: 0.00956
Epoch: 23711, Training Loss: 0.00891
Epoch: 23711, Training Loss: 0.01092
Epoch: 23711, Training Loss: 0.00844
Epoch: 23712, Training Loss: 0.00956
Epoch: 23712, Training Loss: 0.00891
Epoch: 23712, Training Loss: 0.01092
Epoch: 23712, Training Loss: 0.00844
Epoch: 23713, Training Loss: 0.00956
Epoch: 23713, Training Loss: 0.00891
Epoch: 23713, Training Loss: 0.01092
Epoch: 23713, Training Loss: 0.00844
Epoch: 23714, Training Loss: 0.00956
Epoch: 23714, Training Loss: 0.00891
Epoch: 23714, Training Loss: 0.01092
Epoch: 23714, Training Loss: 0.00844
Epoch: 23715, Training Loss: 0.00956
Epoch: 23715, Training Loss: 0.00891
Epoch: 23715, Training Loss: 0.01092
Epoch: 23715, Training Loss: 0.00844
Epoch: 23716, Training Loss: 0.00956
Epoch: 23716, Training Loss: 0.00891
Epoch: 23716, Training Loss: 0.01092
Epoch: 23716, Training Loss: 0.00844
Epoch: 23717, Training Loss: 0.00956
Epoch: 23717, Training Loss: 0.00891
Epoch: 23717, Training Loss: 0.01092
Epoch: 23717, Training Loss: 0.00844
Epoch: 23718, Training Loss: 0.00956
Epoch: 23718, Training Loss: 0.00891
Epoch: 23718, Training Loss: 0.01092
Epoch: 23718, Training Loss: 0.00844
Epoch: 23719, Training Loss: 0.00956
Epoch: 23719, Training Loss: 0.00891
Epoch: 23719, Training Loss: 0.01092
Epoch: 23719, Training Loss: 0.00844
Epoch: 23720, Training Loss: 0.00956
Epoch: 23720, Training Loss: 0.00891
Epoch: 23720, Training Loss: 0.01092
Epoch: 23720, Training Loss: 0.00844
Epoch: 23721, Training Loss: 0.00956
Epoch: 23721, Training Loss: 0.00891
Epoch: 23721, Training Loss: 0.01092
Epoch: 23721, Training Loss: 0.00844
Epoch: 23722, Training Loss: 0.00956
Epoch: 23722, Training Loss: 0.00891
Epoch: 23722, Training Loss: 0.01092
Epoch: 23722, Training Loss: 0.00844
Epoch: 23723, Training Loss: 0.00956
Epoch: 23723, Training Loss: 0.00891
Epoch: 23723, Training Loss: 0.01092
Epoch: 23723, Training Loss: 0.00844
Epoch: 23724, Training Loss: 0.00956
Epoch: 23724, Training Loss: 0.00891
Epoch: 23724, Training Loss: 0.01092
Epoch: 23724, Training Loss: 0.00844
Epoch: 23725, Training Loss: 0.00956
Epoch: 23725, Training Loss: 0.00891
Epoch: 23725, Training Loss: 0.01092
Epoch: 23725, Training Loss: 0.00844
Epoch: 23726, Training Loss: 0.00956
Epoch: 23726, Training Loss: 0.00891
Epoch: 23726, Training Loss: 0.01092
Epoch: 23726, Training Loss: 0.00844
Epoch: 23727, Training Loss: 0.00956
Epoch: 23727, Training Loss: 0.00891
Epoch: 23727, Training Loss: 0.01091
Epoch: 23727, Training Loss: 0.00844
Epoch: 23728, Training Loss: 0.00956
Epoch: 23728, Training Loss: 0.00891
Epoch: 23728, Training Loss: 0.01091
Epoch: 23728, Training Loss: 0.00844
Epoch: 23729, Training Loss: 0.00956
Epoch: 23729, Training Loss: 0.00891
Epoch: 23729, Training Loss: 0.01091
Epoch: 23729, Training Loss: 0.00844
Epoch: 23730, Training Loss: 0.00956
Epoch: 23730, Training Loss: 0.00891
Epoch: 23730, Training Loss: 0.01091
Epoch: 23730, Training Loss: 0.00844
Epoch: 23731, Training Loss: 0.00956
Epoch: 23731, Training Loss: 0.00891
Epoch: 23731, Training Loss: 0.01091
Epoch: 23731, Training Loss: 0.00844
Epoch: 23732, Training Loss: 0.00956
Epoch: 23732, Training Loss: 0.00891
Epoch: 23732, Training Loss: 0.01091
Epoch: 23732, Training Loss: 0.00844
Epoch: 23733, Training Loss: 0.00956
Epoch: 23733, Training Loss: 0.00891
Epoch: 23733, Training Loss: 0.01091
Epoch: 23733, Training Loss: 0.00844
Epoch: 23734, Training Loss: 0.00956
Epoch: 23734, Training Loss: 0.00891
Epoch: 23734, Training Loss: 0.01091
Epoch: 23734, Training Loss: 0.00844
Epoch: 23735, Training Loss: 0.00956
Epoch: 23735, Training Loss: 0.00891
Epoch: 23735, Training Loss: 0.01091
Epoch: 23735, Training Loss: 0.00844
Epoch: 23736, Training Loss: 0.00956
Epoch: 23736, Training Loss: 0.00891
Epoch: 23736, Training Loss: 0.01091
Epoch: 23736, Training Loss: 0.00844
Epoch: 23737, Training Loss: 0.00956
Epoch: 23737, Training Loss: 0.00891
Epoch: 23737, Training Loss: 0.01091
Epoch: 23737, Training Loss: 0.00844
Epoch: 23738, Training Loss: 0.00956
Epoch: 23738, Training Loss: 0.00891
Epoch: 23738, Training Loss: 0.01091
Epoch: 23738, Training Loss: 0.00844
Epoch: 23739, Training Loss: 0.00956
Epoch: 23739, Training Loss: 0.00890
Epoch: 23739, Training Loss: 0.01091
Epoch: 23739, Training Loss: 0.00844
Epoch: 23740, Training Loss: 0.00956
Epoch: 23740, Training Loss: 0.00890
Epoch: 23740, Training Loss: 0.01091
Epoch: 23740, Training Loss: 0.00844
Epoch: 23741, Training Loss: 0.00956
Epoch: 23741, Training Loss: 0.00890
Epoch: 23741, Training Loss: 0.01091
Epoch: 23741, Training Loss: 0.00843
Epoch: 23742, Training Loss: 0.00956
Epoch: 23742, Training Loss: 0.00890
Epoch: 23742, Training Loss: 0.01091
Epoch: 23742, Training Loss: 0.00843
Epoch: 23743, Training Loss: 0.00956
Epoch: 23743, Training Loss: 0.00890
Epoch: 23743, Training Loss: 0.01091
Epoch: 23743, Training Loss: 0.00843
Epoch: 23744, Training Loss: 0.00956
Epoch: 23744, Training Loss: 0.00890
Epoch: 23744, Training Loss: 0.01091
Epoch: 23744, Training Loss: 0.00843
Epoch: 23745, Training Loss: 0.00956
Epoch: 23745, Training Loss: 0.00890
Epoch: 23745, Training Loss: 0.01091
Epoch: 23745, Training Loss: 0.00843
Epoch: 23746, Training Loss: 0.00956
Epoch: 23746, Training Loss: 0.00890
Epoch: 23746, Training Loss: 0.01091
Epoch: 23746, Training Loss: 0.00843
Epoch: 23747, Training Loss: 0.00956
Epoch: 23747, Training Loss: 0.00890
Epoch: 23747, Training Loss: 0.01091
Epoch: 23747, Training Loss: 0.00843
Epoch: 23748, Training Loss: 0.00956
Epoch: 23748, Training Loss: 0.00890
Epoch: 23748, Training Loss: 0.01091
Epoch: 23748, Training Loss: 0.00843
Epoch: 23749, Training Loss: 0.00956
Epoch: 23749, Training Loss: 0.00890
Epoch: 23749, Training Loss: 0.01091
Epoch: 23749, Training Loss: 0.00843
Epoch: 23750, Training Loss: 0.00955
Epoch: 23750, Training Loss: 0.00890
Epoch: 23750, Training Loss: 0.01091
Epoch: 23750, Training Loss: 0.00843
Epoch: 23751, Training Loss: 0.00955
Epoch: 23751, Training Loss: 0.00890
Epoch: 23751, Training Loss: 0.01091
Epoch: 23751, Training Loss: 0.00843
Epoch: 23752, Training Loss: 0.00955
Epoch: 23752, Training Loss: 0.00890
Epoch: 23752, Training Loss: 0.01091
Epoch: 23752, Training Loss: 0.00843
Epoch: 23753, Training Loss: 0.00955
Epoch: 23753, Training Loss: 0.00890
Epoch: 23753, Training Loss: 0.01091
Epoch: 23753, Training Loss: 0.00843
Epoch: 23754, Training Loss: 0.00955
Epoch: 23754, Training Loss: 0.00890
Epoch: 23754, Training Loss: 0.01091
Epoch: 23754, Training Loss: 0.00843
Epoch: 23755, Training Loss: 0.00955
Epoch: 23755, Training Loss: 0.00890
Epoch: 23755, Training Loss: 0.01091
Epoch: 23755, Training Loss: 0.00843
Epoch: 23756, Training Loss: 0.00955
Epoch: 23756, Training Loss: 0.00890
Epoch: 23756, Training Loss: 0.01091
Epoch: 23756, Training Loss: 0.00843
Epoch: 23757, Training Loss: 0.00955
Epoch: 23757, Training Loss: 0.00890
Epoch: 23757, Training Loss: 0.01091
Epoch: 23757, Training Loss: 0.00843
Epoch: 23758, Training Loss: 0.00955
Epoch: 23758, Training Loss: 0.00890
Epoch: 23758, Training Loss: 0.01091
Epoch: 23758, Training Loss: 0.00843
Epoch: 23759, Training Loss: 0.00955
Epoch: 23759, Training Loss: 0.00890
Epoch: 23759, Training Loss: 0.01091
Epoch: 23759, Training Loss: 0.00843
Epoch: 23760, Training Loss: 0.00955
Epoch: 23760, Training Loss: 0.00890
Epoch: 23760, Training Loss: 0.01091
Epoch: 23760, Training Loss: 0.00843
Epoch: 23761, Training Loss: 0.00955
Epoch: 23761, Training Loss: 0.00890
Epoch: 23761, Training Loss: 0.01091
Epoch: 23761, Training Loss: 0.00843
Epoch: 23762, Training Loss: 0.00955
Epoch: 23762, Training Loss: 0.00890
Epoch: 23762, Training Loss: 0.01091
Epoch: 23762, Training Loss: 0.00843
Epoch: 23763, Training Loss: 0.00955
Epoch: 23763, Training Loss: 0.00890
Epoch: 23763, Training Loss: 0.01091
Epoch: 23763, Training Loss: 0.00843
Epoch: 23764, Training Loss: 0.00955
Epoch: 23764, Training Loss: 0.00890
Epoch: 23764, Training Loss: 0.01091
Epoch: 23764, Training Loss: 0.00843
Epoch: 23765, Training Loss: 0.00955
Epoch: 23765, Training Loss: 0.00890
Epoch: 23765, Training Loss: 0.01091
Epoch: 23765, Training Loss: 0.00843
Epoch: 23766, Training Loss: 0.00955
Epoch: 23766, Training Loss: 0.00890
Epoch: 23766, Training Loss: 0.01091
Epoch: 23766, Training Loss: 0.00843
Epoch: 23767, Training Loss: 0.00955
Epoch: 23767, Training Loss: 0.00890
Epoch: 23767, Training Loss: 0.01091
Epoch: 23767, Training Loss: 0.00843
Epoch: 23768, Training Loss: 0.00955
Epoch: 23768, Training Loss: 0.00890
Epoch: 23768, Training Loss: 0.01090
Epoch: 23768, Training Loss: 0.00843
Epoch: 23769, Training Loss: 0.00955
Epoch: 23769, Training Loss: 0.00890
Epoch: 23769, Training Loss: 0.01090
Epoch: 23769, Training Loss: 0.00843
Epoch: 23770, Training Loss: 0.00955
Epoch: 23770, Training Loss: 0.00890
Epoch: 23770, Training Loss: 0.01090
Epoch: 23770, Training Loss: 0.00843
Epoch: 23771, Training Loss: 0.00955
Epoch: 23771, Training Loss: 0.00890
Epoch: 23771, Training Loss: 0.01090
Epoch: 23771, Training Loss: 0.00843
Epoch: 23772, Training Loss: 0.00955
Epoch: 23772, Training Loss: 0.00890
Epoch: 23772, Training Loss: 0.01090
Epoch: 23772, Training Loss: 0.00843
Epoch: 23773, Training Loss: 0.00955
Epoch: 23773, Training Loss: 0.00890
Epoch: 23773, Training Loss: 0.01090
Epoch: 23773, Training Loss: 0.00843
Epoch: 23774, Training Loss: 0.00955
Epoch: 23774, Training Loss: 0.00890
Epoch: 23774, Training Loss: 0.01090
Epoch: 23774, Training Loss: 0.00843
Epoch: 23775, Training Loss: 0.00955
Epoch: 23775, Training Loss: 0.00890
Epoch: 23775, Training Loss: 0.01090
Epoch: 23775, Training Loss: 0.00843
Epoch: 23776, Training Loss: 0.00955
Epoch: 23776, Training Loss: 0.00890
Epoch: 23776, Training Loss: 0.01090
Epoch: 23776, Training Loss: 0.00843
Epoch: 23777, Training Loss: 0.00955
Epoch: 23777, Training Loss: 0.00890
Epoch: 23777, Training Loss: 0.01090
Epoch: 23777, Training Loss: 0.00843
Epoch: 23778, Training Loss: 0.00955
Epoch: 23778, Training Loss: 0.00890
Epoch: 23778, Training Loss: 0.01090
Epoch: 23778, Training Loss: 0.00843
Epoch: 23779, Training Loss: 0.00955
Epoch: 23779, Training Loss: 0.00890
Epoch: 23779, Training Loss: 0.01090
Epoch: 23779, Training Loss: 0.00843
Epoch: 23780, Training Loss: 0.00955
Epoch: 23780, Training Loss: 0.00890
Epoch: 23780, Training Loss: 0.01090
Epoch: 23780, Training Loss: 0.00843
Epoch: 23781, Training Loss: 0.00955
Epoch: 23781, Training Loss: 0.00890
Epoch: 23781, Training Loss: 0.01090
Epoch: 23781, Training Loss: 0.00843
Epoch: 23782, Training Loss: 0.00955
Epoch: 23782, Training Loss: 0.00890
Epoch: 23782, Training Loss: 0.01090
Epoch: 23782, Training Loss: 0.00843
Epoch: 23783, Training Loss: 0.00955
Epoch: 23783, Training Loss: 0.00890
Epoch: 23783, Training Loss: 0.01090
Epoch: 23783, Training Loss: 0.00843
Epoch: 23784, Training Loss: 0.00955
Epoch: 23784, Training Loss: 0.00890
Epoch: 23784, Training Loss: 0.01090
Epoch: 23784, Training Loss: 0.00843
Epoch: 23785, Training Loss: 0.00955
Epoch: 23785, Training Loss: 0.00890
Epoch: 23785, Training Loss: 0.01090
Epoch: 23785, Training Loss: 0.00843
Epoch: 23786, Training Loss: 0.00955
Epoch: 23786, Training Loss: 0.00890
Epoch: 23786, Training Loss: 0.01090
Epoch: 23786, Training Loss: 0.00843
Epoch: 23787, Training Loss: 0.00955
Epoch: 23787, Training Loss: 0.00890
Epoch: 23787, Training Loss: 0.01090
Epoch: 23787, Training Loss: 0.00843
Epoch: 23788, Training Loss: 0.00955
Epoch: 23788, Training Loss: 0.00890
Epoch: 23788, Training Loss: 0.01090
Epoch: 23788, Training Loss: 0.00843
Epoch: 23789, Training Loss: 0.00955
Epoch: 23789, Training Loss: 0.00890
Epoch: 23789, Training Loss: 0.01090
Epoch: 23789, Training Loss: 0.00843
Epoch: 23790, Training Loss: 0.00955
Epoch: 23790, Training Loss: 0.00889
Epoch: 23790, Training Loss: 0.01090
Epoch: 23790, Training Loss: 0.00843
Epoch: 23791, Training Loss: 0.00955
Epoch: 23791, Training Loss: 0.00889
Epoch: 23791, Training Loss: 0.01090
Epoch: 23791, Training Loss: 0.00843
Epoch: 23792, Training Loss: 0.00955
Epoch: 23792, Training Loss: 0.00889
Epoch: 23792, Training Loss: 0.01090
Epoch: 23792, Training Loss: 0.00843
Epoch: 23793, Training Loss: 0.00955
Epoch: 23793, Training Loss: 0.00889
Epoch: 23793, Training Loss: 0.01090
Epoch: 23793, Training Loss: 0.00843
Epoch: 23794, Training Loss: 0.00955
Epoch: 23794, Training Loss: 0.00889
Epoch: 23794, Training Loss: 0.01090
Epoch: 23794, Training Loss: 0.00842
Epoch: 23795, Training Loss: 0.00955
Epoch: 23795, Training Loss: 0.00889
Epoch: 23795, Training Loss: 0.01090
Epoch: 23795, Training Loss: 0.00842
Epoch: 23796, Training Loss: 0.00955
Epoch: 23796, Training Loss: 0.00889
Epoch: 23796, Training Loss: 0.01090
Epoch: 23796, Training Loss: 0.00842
Epoch: 23797, Training Loss: 0.00954
Epoch: 23797, Training Loss: 0.00889
Epoch: 23797, Training Loss: 0.01090
Epoch: 23797, Training Loss: 0.00842
Epoch: 23798, Training Loss: 0.00954
Epoch: 23798, Training Loss: 0.00889
Epoch: 23798, Training Loss: 0.01090
Epoch: 23798, Training Loss: 0.00842
Epoch: 23799, Training Loss: 0.00954
Epoch: 23799, Training Loss: 0.00889
Epoch: 23799, Training Loss: 0.01090
Epoch: 23799, Training Loss: 0.00842
Epoch: 23800, Training Loss: 0.00954
Epoch: 23800, Training Loss: 0.00889
Epoch: 23800, Training Loss: 0.01090
Epoch: 23800, Training Loss: 0.00842
Epoch: 23801, Training Loss: 0.00954
Epoch: 23801, Training Loss: 0.00889
Epoch: 23801, Training Loss: 0.01090
Epoch: 23801, Training Loss: 0.00842
Epoch: 23802, Training Loss: 0.00954
Epoch: 23802, Training Loss: 0.00889
Epoch: 23802, Training Loss: 0.01090
Epoch: 23802, Training Loss: 0.00842
Epoch: 23803, Training Loss: 0.00954
Epoch: 23803, Training Loss: 0.00889
Epoch: 23803, Training Loss: 0.01090
Epoch: 23803, Training Loss: 0.00842
Epoch: 23804, Training Loss: 0.00954
Epoch: 23804, Training Loss: 0.00889
Epoch: 23804, Training Loss: 0.01090
Epoch: 23804, Training Loss: 0.00842
Epoch: 23805, Training Loss: 0.00954
Epoch: 23805, Training Loss: 0.00889
Epoch: 23805, Training Loss: 0.01090
Epoch: 23805, Training Loss: 0.00842
Epoch: 23806, Training Loss: 0.00954
Epoch: 23806, Training Loss: 0.00889
Epoch: 23806, Training Loss: 0.01090
Epoch: 23806, Training Loss: 0.00842
Epoch: 23807, Training Loss: 0.00954
Epoch: 23807, Training Loss: 0.00889
Epoch: 23807, Training Loss: 0.01090
Epoch: 23807, Training Loss: 0.00842
Epoch: 23808, Training Loss: 0.00954
Epoch: 23808, Training Loss: 0.00889
Epoch: 23808, Training Loss: 0.01090
Epoch: 23808, Training Loss: 0.00842
Epoch: 23809, Training Loss: 0.00954
Epoch: 23809, Training Loss: 0.00889
Epoch: 23809, Training Loss: 0.01089
Epoch: 23809, Training Loss: 0.00842
Epoch: 23810, Training Loss: 0.00954
Epoch: 23810, Training Loss: 0.00889
Epoch: 23810, Training Loss: 0.01089
Epoch: 23810, Training Loss: 0.00842
Epoch: 23811, Training Loss: 0.00954
Epoch: 23811, Training Loss: 0.00889
Epoch: 23811, Training Loss: 0.01089
Epoch: 23811, Training Loss: 0.00842
Epoch: 23812, Training Loss: 0.00954
Epoch: 23812, Training Loss: 0.00889
Epoch: 23812, Training Loss: 0.01089
Epoch: 23812, Training Loss: 0.00842
Epoch: 23813, Training Loss: 0.00954
Epoch: 23813, Training Loss: 0.00889
Epoch: 23813, Training Loss: 0.01089
Epoch: 23813, Training Loss: 0.00842
Epoch: 23814, Training Loss: 0.00954
Epoch: 23814, Training Loss: 0.00889
Epoch: 23814, Training Loss: 0.01089
Epoch: 23814, Training Loss: 0.00842
Epoch: 23815, Training Loss: 0.00954
Epoch: 23815, Training Loss: 0.00889
Epoch: 23815, Training Loss: 0.01089
Epoch: 23815, Training Loss: 0.00842
Epoch: 23816, Training Loss: 0.00954
Epoch: 23816, Training Loss: 0.00889
Epoch: 23816, Training Loss: 0.01089
Epoch: 23816, Training Loss: 0.00842
Epoch: 23817, Training Loss: 0.00954
Epoch: 23817, Training Loss: 0.00889
Epoch: 23817, Training Loss: 0.01089
Epoch: 23817, Training Loss: 0.00842
Epoch: 23818, Training Loss: 0.00954
Epoch: 23818, Training Loss: 0.00889
Epoch: 23818, Training Loss: 0.01089
Epoch: 23818, Training Loss: 0.00842
Epoch: 23819, Training Loss: 0.00954
Epoch: 23819, Training Loss: 0.00889
Epoch: 23819, Training Loss: 0.01089
Epoch: 23819, Training Loss: 0.00842
Epoch: 23820, Training Loss: 0.00954
Epoch: 23820, Training Loss: 0.00889
Epoch: 23820, Training Loss: 0.01089
Epoch: 23820, Training Loss: 0.00842
Epoch: 23821, Training Loss: 0.00954
Epoch: 23821, Training Loss: 0.00889
Epoch: 23821, Training Loss: 0.01089
Epoch: 23821, Training Loss: 0.00842
Epoch: 23822, Training Loss: 0.00954
Epoch: 23822, Training Loss: 0.00889
Epoch: 23822, Training Loss: 0.01089
Epoch: 23822, Training Loss: 0.00842
Epoch: 23823, Training Loss: 0.00954
Epoch: 23823, Training Loss: 0.00889
Epoch: 23823, Training Loss: 0.01089
Epoch: 23823, Training Loss: 0.00842
Epoch: 23824, Training Loss: 0.00954
Epoch: 23824, Training Loss: 0.00889
Epoch: 23824, Training Loss: 0.01089
Epoch: 23824, Training Loss: 0.00842
Epoch: 23825, Training Loss: 0.00954
Epoch: 23825, Training Loss: 0.00889
Epoch: 23825, Training Loss: 0.01089
Epoch: 23825, Training Loss: 0.00842
Epoch: 23826, Training Loss: 0.00954
Epoch: 23826, Training Loss: 0.00889
Epoch: 23826, Training Loss: 0.01089
Epoch: 23826, Training Loss: 0.00842
Epoch: 23827, Training Loss: 0.00954
Epoch: 23827, Training Loss: 0.00889
Epoch: 23827, Training Loss: 0.01089
Epoch: 23827, Training Loss: 0.00842
Epoch: 23828, Training Loss: 0.00954
Epoch: 23828, Training Loss: 0.00889
Epoch: 23828, Training Loss: 0.01089
Epoch: 23828, Training Loss: 0.00842
...
Epoch: 24071, Training Loss: 0.00949
Epoch: 24071, Training Loss: 0.00884
Epoch: 24071, Training Loss: 0.01083
Epoch: 24071, Training Loss: 0.00837
Epoch: 24072, Training Loss: 0.00949
Epoch: 24072, Training Loss: 0.00884
Epoch: 24072, Training Loss: 0.01083
Epoch: 24072, Training Loss: 0.00837
Epoch: 24073, Training Loss: 0.00949
Epoch: 24073, Training Loss: 0.00884
Epoch: 24073, Training Loss: 0.01083
Epoch: 24073, Training Loss: 0.00837
Epoch: 24074, Training Loss: 0.00949
Epoch: 24074, Training Loss: 0.00884
Epoch: 24074, Training Loss: 0.01083
Epoch: 24074, Training Loss: 0.00837
Epoch: 24075, Training Loss: 0.00949
Epoch: 24075, Training Loss: 0.00884
Epoch: 24075, Training Loss: 0.01083
Epoch: 24075, Training Loss: 0.00837
Epoch: 24076, Training Loss: 0.00949
Epoch: 24076, Training Loss: 0.00884
Epoch: 24076, Training Loss: 0.01083
Epoch: 24076, Training Loss: 0.00837
Epoch: 24077, Training Loss: 0.00949
Epoch: 24077, Training Loss: 0.00884
Epoch: 24077, Training Loss: 0.01083
Epoch: 24077, Training Loss: 0.00837
Epoch: 24078, Training Loss: 0.00948
Epoch: 24078, Training Loss: 0.00884
Epoch: 24078, Training Loss: 0.01083
Epoch: 24078, Training Loss: 0.00837
Epoch: 24079, Training Loss: 0.00948
Epoch: 24079, Training Loss: 0.00884
Epoch: 24079, Training Loss: 0.01083
Epoch: 24079, Training Loss: 0.00837
Epoch: 24080, Training Loss: 0.00948
Epoch: 24080, Training Loss: 0.00884
Epoch: 24080, Training Loss: 0.01083
Epoch: 24080, Training Loss: 0.00837
Epoch: 24081, Training Loss: 0.00948
Epoch: 24081, Training Loss: 0.00884
Epoch: 24081, Training Loss: 0.01083
Epoch: 24081, Training Loss: 0.00837
Epoch: 24082, Training Loss: 0.00948
Epoch: 24082, Training Loss: 0.00884
Epoch: 24082, Training Loss: 0.01083
Epoch: 24082, Training Loss: 0.00837
Epoch: 24083, Training Loss: 0.00948
Epoch: 24083, Training Loss: 0.00884
Epoch: 24083, Training Loss: 0.01083
Epoch: 24083, Training Loss: 0.00837
Epoch: 24084, Training Loss: 0.00948
Epoch: 24084, Training Loss: 0.00884
Epoch: 24084, Training Loss: 0.01083
Epoch: 24084, Training Loss: 0.00837
Epoch: 24085, Training Loss: 0.00948
Epoch: 24085, Training Loss: 0.00884
Epoch: 24085, Training Loss: 0.01083
Epoch: 24085, Training Loss: 0.00837
Epoch: 24086, Training Loss: 0.00948
Epoch: 24086, Training Loss: 0.00884
Epoch: 24086, Training Loss: 0.01083
Epoch: 24086, Training Loss: 0.00837
Epoch: 24087, Training Loss: 0.00948
Epoch: 24087, Training Loss: 0.00884
Epoch: 24087, Training Loss: 0.01083
Epoch: 24087, Training Loss: 0.00837
Epoch: 24088, Training Loss: 0.00948
Epoch: 24088, Training Loss: 0.00884
Epoch: 24088, Training Loss: 0.01083
Epoch: 24088, Training Loss: 0.00837
Epoch: 24089, Training Loss: 0.00948
Epoch: 24089, Training Loss: 0.00884
Epoch: 24089, Training Loss: 0.01083
Epoch: 24089, Training Loss: 0.00837
Epoch: 24090, Training Loss: 0.00948
Epoch: 24090, Training Loss: 0.00884
Epoch: 24090, Training Loss: 0.01083
Epoch: 24090, Training Loss: 0.00837
Epoch: 24091, Training Loss: 0.00948
Epoch: 24091, Training Loss: 0.00884
Epoch: 24091, Training Loss: 0.01083
Epoch: 24091, Training Loss: 0.00837
Epoch: 24092, Training Loss: 0.00948
Epoch: 24092, Training Loss: 0.00884
Epoch: 24092, Training Loss: 0.01083
Epoch: 24092, Training Loss: 0.00837
Epoch: 24093, Training Loss: 0.00948
Epoch: 24093, Training Loss: 0.00884
Epoch: 24093, Training Loss: 0.01083
Epoch: 24093, Training Loss: 0.00837
Epoch: 24094, Training Loss: 0.00948
Epoch: 24094, Training Loss: 0.00884
Epoch: 24094, Training Loss: 0.01083
Epoch: 24094, Training Loss: 0.00837
Epoch: 24095, Training Loss: 0.00948
Epoch: 24095, Training Loss: 0.00884
Epoch: 24095, Training Loss: 0.01083
Epoch: 24095, Training Loss: 0.00837
Epoch: 24096, Training Loss: 0.00948
Epoch: 24096, Training Loss: 0.00884
Epoch: 24096, Training Loss: 0.01083
Epoch: 24096, Training Loss: 0.00837
Epoch: 24097, Training Loss: 0.00948
Epoch: 24097, Training Loss: 0.00884
Epoch: 24097, Training Loss: 0.01082
Epoch: 24097, Training Loss: 0.00837
Epoch: 24098, Training Loss: 0.00948
Epoch: 24098, Training Loss: 0.00883
Epoch: 24098, Training Loss: 0.01082
Epoch: 24098, Training Loss: 0.00837
Epoch: 24099, Training Loss: 0.00948
Epoch: 24099, Training Loss: 0.00883
Epoch: 24099, Training Loss: 0.01082
Epoch: 24099, Training Loss: 0.00837
Epoch: 24100, Training Loss: 0.00948
Epoch: 24100, Training Loss: 0.00883
Epoch: 24100, Training Loss: 0.01082
Epoch: 24100, Training Loss: 0.00837
Epoch: 24101, Training Loss: 0.00948
Epoch: 24101, Training Loss: 0.00883
Epoch: 24101, Training Loss: 0.01082
Epoch: 24101, Training Loss: 0.00837
Epoch: 24102, Training Loss: 0.00948
Epoch: 24102, Training Loss: 0.00883
Epoch: 24102, Training Loss: 0.01082
Epoch: 24102, Training Loss: 0.00837
Epoch: 24103, Training Loss: 0.00948
Epoch: 24103, Training Loss: 0.00883
Epoch: 24103, Training Loss: 0.01082
Epoch: 24103, Training Loss: 0.00837
Epoch: 24104, Training Loss: 0.00948
Epoch: 24104, Training Loss: 0.00883
Epoch: 24104, Training Loss: 0.01082
Epoch: 24104, Training Loss: 0.00837
Epoch: 24105, Training Loss: 0.00948
Epoch: 24105, Training Loss: 0.00883
Epoch: 24105, Training Loss: 0.01082
Epoch: 24105, Training Loss: 0.00837
Epoch: 24106, Training Loss: 0.00948
Epoch: 24106, Training Loss: 0.00883
Epoch: 24106, Training Loss: 0.01082
Epoch: 24106, Training Loss: 0.00837
Epoch: 24107, Training Loss: 0.00948
Epoch: 24107, Training Loss: 0.00883
Epoch: 24107, Training Loss: 0.01082
Epoch: 24107, Training Loss: 0.00837
Epoch: 24108, Training Loss: 0.00948
Epoch: 24108, Training Loss: 0.00883
Epoch: 24108, Training Loss: 0.01082
Epoch: 24108, Training Loss: 0.00837
Epoch: 24109, Training Loss: 0.00948
Epoch: 24109, Training Loss: 0.00883
Epoch: 24109, Training Loss: 0.01082
Epoch: 24109, Training Loss: 0.00837
Epoch: 24110, Training Loss: 0.00948
Epoch: 24110, Training Loss: 0.00883
Epoch: 24110, Training Loss: 0.01082
Epoch: 24110, Training Loss: 0.00837
Epoch: 24111, Training Loss: 0.00948
Epoch: 24111, Training Loss: 0.00883
Epoch: 24111, Training Loss: 0.01082
Epoch: 24111, Training Loss: 0.00837
Epoch: 24112, Training Loss: 0.00948
Epoch: 24112, Training Loss: 0.00883
Epoch: 24112, Training Loss: 0.01082
Epoch: 24112, Training Loss: 0.00837
Epoch: 24113, Training Loss: 0.00948
Epoch: 24113, Training Loss: 0.00883
Epoch: 24113, Training Loss: 0.01082
Epoch: 24113, Training Loss: 0.00837
Epoch: 24114, Training Loss: 0.00948
Epoch: 24114, Training Loss: 0.00883
Epoch: 24114, Training Loss: 0.01082
Epoch: 24114, Training Loss: 0.00837
Epoch: 24115, Training Loss: 0.00948
Epoch: 24115, Training Loss: 0.00883
Epoch: 24115, Training Loss: 0.01082
Epoch: 24115, Training Loss: 0.00837
Epoch: 24116, Training Loss: 0.00948
Epoch: 24116, Training Loss: 0.00883
Epoch: 24116, Training Loss: 0.01082
Epoch: 24116, Training Loss: 0.00837
Epoch: 24117, Training Loss: 0.00948
Epoch: 24117, Training Loss: 0.00883
Epoch: 24117, Training Loss: 0.01082
Epoch: 24117, Training Loss: 0.00837
Epoch: 24118, Training Loss: 0.00948
Epoch: 24118, Training Loss: 0.00883
Epoch: 24118, Training Loss: 0.01082
Epoch: 24118, Training Loss: 0.00837
Epoch: 24119, Training Loss: 0.00948
Epoch: 24119, Training Loss: 0.00883
Epoch: 24119, Training Loss: 0.01082
Epoch: 24119, Training Loss: 0.00836
Epoch: 24120, Training Loss: 0.00948
Epoch: 24120, Training Loss: 0.00883
Epoch: 24120, Training Loss: 0.01082
Epoch: 24120, Training Loss: 0.00836
Epoch: 24121, Training Loss: 0.00948
Epoch: 24121, Training Loss: 0.00883
Epoch: 24121, Training Loss: 0.01082
Epoch: 24121, Training Loss: 0.00836
Epoch: 24122, Training Loss: 0.00948
Epoch: 24122, Training Loss: 0.00883
Epoch: 24122, Training Loss: 0.01082
Epoch: 24122, Training Loss: 0.00836
Epoch: 24123, Training Loss: 0.00948
Epoch: 24123, Training Loss: 0.00883
Epoch: 24123, Training Loss: 0.01082
Epoch: 24123, Training Loss: 0.00836
Epoch: 24124, Training Loss: 0.00948
Epoch: 24124, Training Loss: 0.00883
Epoch: 24124, Training Loss: 0.01082
Epoch: 24124, Training Loss: 0.00836
Epoch: 24125, Training Loss: 0.00948
Epoch: 24125, Training Loss: 0.00883
Epoch: 24125, Training Loss: 0.01082
Epoch: 24125, Training Loss: 0.00836
Epoch: 24126, Training Loss: 0.00947
Epoch: 24126, Training Loss: 0.00883
Epoch: 24126, Training Loss: 0.01082
Epoch: 24126, Training Loss: 0.00836
Epoch: 24127, Training Loss: 0.00947
Epoch: 24127, Training Loss: 0.00883
Epoch: 24127, Training Loss: 0.01082
Epoch: 24127, Training Loss: 0.00836
Epoch: 24128, Training Loss: 0.00947
Epoch: 24128, Training Loss: 0.00883
Epoch: 24128, Training Loss: 0.01082
Epoch: 24128, Training Loss: 0.00836
Epoch: 24129, Training Loss: 0.00947
Epoch: 24129, Training Loss: 0.00883
Epoch: 24129, Training Loss: 0.01082
Epoch: 24129, Training Loss: 0.00836
Epoch: 24130, Training Loss: 0.00947
Epoch: 24130, Training Loss: 0.00883
Epoch: 24130, Training Loss: 0.01082
Epoch: 24130, Training Loss: 0.00836
Epoch: 24131, Training Loss: 0.00947
Epoch: 24131, Training Loss: 0.00883
Epoch: 24131, Training Loss: 0.01082
Epoch: 24131, Training Loss: 0.00836
Epoch: 24132, Training Loss: 0.00947
Epoch: 24132, Training Loss: 0.00883
Epoch: 24132, Training Loss: 0.01082
Epoch: 24132, Training Loss: 0.00836
Epoch: 24133, Training Loss: 0.00947
Epoch: 24133, Training Loss: 0.00883
Epoch: 24133, Training Loss: 0.01082
Epoch: 24133, Training Loss: 0.00836
Epoch: 24134, Training Loss: 0.00947
Epoch: 24134, Training Loss: 0.00883
Epoch: 24134, Training Loss: 0.01082
Epoch: 24134, Training Loss: 0.00836
Epoch: 24135, Training Loss: 0.00947
Epoch: 24135, Training Loss: 0.00883
Epoch: 24135, Training Loss: 0.01082
Epoch: 24135, Training Loss: 0.00836
Epoch: 24136, Training Loss: 0.00947
Epoch: 24136, Training Loss: 0.00883
Epoch: 24136, Training Loss: 0.01082
Epoch: 24136, Training Loss: 0.00836
Epoch: 24137, Training Loss: 0.00947
Epoch: 24137, Training Loss: 0.00883
Epoch: 24137, Training Loss: 0.01082
Epoch: 24137, Training Loss: 0.00836
Epoch: 24138, Training Loss: 0.00947
Epoch: 24138, Training Loss: 0.00883
Epoch: 24138, Training Loss: 0.01082
Epoch: 24138, Training Loss: 0.00836
Epoch: 24139, Training Loss: 0.00947
Epoch: 24139, Training Loss: 0.00883
Epoch: 24139, Training Loss: 0.01081
Epoch: 24139, Training Loss: 0.00836
Epoch: 24140, Training Loss: 0.00947
Epoch: 24140, Training Loss: 0.00883
Epoch: 24140, Training Loss: 0.01081
Epoch: 24140, Training Loss: 0.00836
Epoch: 24141, Training Loss: 0.00947
Epoch: 24141, Training Loss: 0.00883
Epoch: 24141, Training Loss: 0.01081
Epoch: 24141, Training Loss: 0.00836
Epoch: 24142, Training Loss: 0.00947
Epoch: 24142, Training Loss: 0.00883
Epoch: 24142, Training Loss: 0.01081
Epoch: 24142, Training Loss: 0.00836
Epoch: 24143, Training Loss: 0.00947
Epoch: 24143, Training Loss: 0.00883
Epoch: 24143, Training Loss: 0.01081
Epoch: 24143, Training Loss: 0.00836
Epoch: 24144, Training Loss: 0.00947
Epoch: 24144, Training Loss: 0.00883
Epoch: 24144, Training Loss: 0.01081
Epoch: 24144, Training Loss: 0.00836
Epoch: 24145, Training Loss: 0.00947
Epoch: 24145, Training Loss: 0.00883
Epoch: 24145, Training Loss: 0.01081
Epoch: 24145, Training Loss: 0.00836
Epoch: 24146, Training Loss: 0.00947
Epoch: 24146, Training Loss: 0.00883
Epoch: 24146, Training Loss: 0.01081
Epoch: 24146, Training Loss: 0.00836
Epoch: 24147, Training Loss: 0.00947
Epoch: 24147, Training Loss: 0.00883
Epoch: 24147, Training Loss: 0.01081
Epoch: 24147, Training Loss: 0.00836
Epoch: 24148, Training Loss: 0.00947
Epoch: 24148, Training Loss: 0.00883
Epoch: 24148, Training Loss: 0.01081
Epoch: 24148, Training Loss: 0.00836
Epoch: 24149, Training Loss: 0.00947
Epoch: 24149, Training Loss: 0.00883
Epoch: 24149, Training Loss: 0.01081
Epoch: 24149, Training Loss: 0.00836
Epoch: 24150, Training Loss: 0.00947
Epoch: 24150, Training Loss: 0.00882
Epoch: 24150, Training Loss: 0.01081
Epoch: 24150, Training Loss: 0.00836
Epoch: 24151, Training Loss: 0.00947
Epoch: 24151, Training Loss: 0.00882
Epoch: 24151, Training Loss: 0.01081
Epoch: 24151, Training Loss: 0.00836
Epoch: 24152, Training Loss: 0.00947
Epoch: 24152, Training Loss: 0.00882
Epoch: 24152, Training Loss: 0.01081
Epoch: 24152, Training Loss: 0.00836
Epoch: 24153, Training Loss: 0.00947
Epoch: 24153, Training Loss: 0.00882
Epoch: 24153, Training Loss: 0.01081
Epoch: 24153, Training Loss: 0.00836
Epoch: 24154, Training Loss: 0.00947
Epoch: 24154, Training Loss: 0.00882
Epoch: 24154, Training Loss: 0.01081
Epoch: 24154, Training Loss: 0.00836
Epoch: 24155, Training Loss: 0.00947
Epoch: 24155, Training Loss: 0.00882
Epoch: 24155, Training Loss: 0.01081
Epoch: 24155, Training Loss: 0.00836
Epoch: 24156, Training Loss: 0.00947
Epoch: 24156, Training Loss: 0.00882
Epoch: 24156, Training Loss: 0.01081
Epoch: 24156, Training Loss: 0.00836
Epoch: 24157, Training Loss: 0.00947
Epoch: 24157, Training Loss: 0.00882
Epoch: 24157, Training Loss: 0.01081
Epoch: 24157, Training Loss: 0.00836
Epoch: 24158, Training Loss: 0.00947
Epoch: 24158, Training Loss: 0.00882
Epoch: 24158, Training Loss: 0.01081
Epoch: 24158, Training Loss: 0.00836
Epoch: 24159, Training Loss: 0.00947
Epoch: 24159, Training Loss: 0.00882
Epoch: 24159, Training Loss: 0.01081
Epoch: 24159, Training Loss: 0.00836
Epoch: 24160, Training Loss: 0.00947
Epoch: 24160, Training Loss: 0.00882
Epoch: 24160, Training Loss: 0.01081
Epoch: 24160, Training Loss: 0.00836
Epoch: 24161, Training Loss: 0.00947
Epoch: 24161, Training Loss: 0.00882
Epoch: 24161, Training Loss: 0.01081
Epoch: 24161, Training Loss: 0.00836
Epoch: 24162, Training Loss: 0.00947
Epoch: 24162, Training Loss: 0.00882
Epoch: 24162, Training Loss: 0.01081
Epoch: 24162, Training Loss: 0.00836
Epoch: 24163, Training Loss: 0.00947
Epoch: 24163, Training Loss: 0.00882
Epoch: 24163, Training Loss: 0.01081
Epoch: 24163, Training Loss: 0.00836
Epoch: 24164, Training Loss: 0.00947
Epoch: 24164, Training Loss: 0.00882
Epoch: 24164, Training Loss: 0.01081
Epoch: 24164, Training Loss: 0.00836
Epoch: 24165, Training Loss: 0.00947
Epoch: 24165, Training Loss: 0.00882
Epoch: 24165, Training Loss: 0.01081
Epoch: 24165, Training Loss: 0.00836
Epoch: 24166, Training Loss: 0.00947
Epoch: 24166, Training Loss: 0.00882
Epoch: 24166, Training Loss: 0.01081
Epoch: 24166, Training Loss: 0.00836
Epoch: 24167, Training Loss: 0.00947
Epoch: 24167, Training Loss: 0.00882
Epoch: 24167, Training Loss: 0.01081
Epoch: 24167, Training Loss: 0.00836
Epoch: 24168, Training Loss: 0.00947
Epoch: 24168, Training Loss: 0.00882
Epoch: 24168, Training Loss: 0.01081
Epoch: 24168, Training Loss: 0.00836
Epoch: 24169, Training Loss: 0.00947
Epoch: 24169, Training Loss: 0.00882
Epoch: 24169, Training Loss: 0.01081
Epoch: 24169, Training Loss: 0.00836
Epoch: 24170, Training Loss: 0.00947
Epoch: 24170, Training Loss: 0.00882
Epoch: 24170, Training Loss: 0.01081
Epoch: 24170, Training Loss: 0.00836
Epoch: 24171, Training Loss: 0.00947
Epoch: 24171, Training Loss: 0.00882
Epoch: 24171, Training Loss: 0.01081
Epoch: 24171, Training Loss: 0.00836
Epoch: 24172, Training Loss: 0.00947
Epoch: 24172, Training Loss: 0.00882
Epoch: 24172, Training Loss: 0.01081
Epoch: 24172, Training Loss: 0.00836
Epoch: 24173, Training Loss: 0.00946
Epoch: 24173, Training Loss: 0.00882
Epoch: 24173, Training Loss: 0.01081
Epoch: 24173, Training Loss: 0.00835
Epoch: 24174, Training Loss: 0.00946
Epoch: 24174, Training Loss: 0.00882
Epoch: 24174, Training Loss: 0.01081
Epoch: 24174, Training Loss: 0.00835
Epoch: 24175, Training Loss: 0.00946
Epoch: 24175, Training Loss: 0.00882
Epoch: 24175, Training Loss: 0.01081
Epoch: 24175, Training Loss: 0.00835
Epoch: 24176, Training Loss: 0.00946
Epoch: 24176, Training Loss: 0.00882
Epoch: 24176, Training Loss: 0.01081
Epoch: 24176, Training Loss: 0.00835
Epoch: 24177, Training Loss: 0.00946
Epoch: 24177, Training Loss: 0.00882
Epoch: 24177, Training Loss: 0.01081
Epoch: 24177, Training Loss: 0.00835
Epoch: 24178, Training Loss: 0.00946
Epoch: 24178, Training Loss: 0.00882
Epoch: 24178, Training Loss: 0.01081
Epoch: 24178, Training Loss: 0.00835
Epoch: 24179, Training Loss: 0.00946
Epoch: 24179, Training Loss: 0.00882
Epoch: 24179, Training Loss: 0.01081
Epoch: 24179, Training Loss: 0.00835
Epoch: 24180, Training Loss: 0.00946
Epoch: 24180, Training Loss: 0.00882
Epoch: 24180, Training Loss: 0.01080
Epoch: 24180, Training Loss: 0.00835
Epoch: 24181, Training Loss: 0.00946
Epoch: 24181, Training Loss: 0.00882
Epoch: 24181, Training Loss: 0.01080
Epoch: 24181, Training Loss: 0.00835
Epoch: 24182, Training Loss: 0.00946
Epoch: 24182, Training Loss: 0.00882
Epoch: 24182, Training Loss: 0.01080
Epoch: 24182, Training Loss: 0.00835
Epoch: 24183, Training Loss: 0.00946
Epoch: 24183, Training Loss: 0.00882
Epoch: 24183, Training Loss: 0.01080
Epoch: 24183, Training Loss: 0.00835
Epoch: 24184, Training Loss: 0.00946
Epoch: 24184, Training Loss: 0.00882
Epoch: 24184, Training Loss: 0.01080
Epoch: 24184, Training Loss: 0.00835
Epoch: 24185, Training Loss: 0.00946
Epoch: 24185, Training Loss: 0.00882
Epoch: 24185, Training Loss: 0.01080
Epoch: 24185, Training Loss: 0.00835
Epoch: 24186, Training Loss: 0.00946
Epoch: 24186, Training Loss: 0.00882
Epoch: 24186, Training Loss: 0.01080
Epoch: 24186, Training Loss: 0.00835
Epoch: 24187, Training Loss: 0.00946
Epoch: 24187, Training Loss: 0.00882
Epoch: 24187, Training Loss: 0.01080
Epoch: 24187, Training Loss: 0.00835
Epoch: 24188, Training Loss: 0.00946
Epoch: 24188, Training Loss: 0.00882
Epoch: 24188, Training Loss: 0.01080
Epoch: 24188, Training Loss: 0.00835
Epoch: 24189, Training Loss: 0.00946
Epoch: 24189, Training Loss: 0.00882
Epoch: 24189, Training Loss: 0.01080
Epoch: 24189, Training Loss: 0.00835
Epoch: 24190, Training Loss: 0.00946
Epoch: 24190, Training Loss: 0.00882
Epoch: 24190, Training Loss: 0.01080
Epoch: 24190, Training Loss: 0.00835
Epoch: 24191, Training Loss: 0.00946
Epoch: 24191, Training Loss: 0.00882
Epoch: 24191, Training Loss: 0.01080
Epoch: 24191, Training Loss: 0.00835
Epoch: 24192, Training Loss: 0.00946
Epoch: 24192, Training Loss: 0.00882
Epoch: 24192, Training Loss: 0.01080
Epoch: 24192, Training Loss: 0.00835
Epoch: 24193, Training Loss: 0.00946
Epoch: 24193, Training Loss: 0.00882
Epoch: 24193, Training Loss: 0.01080
Epoch: 24193, Training Loss: 0.00835
Epoch: 24194, Training Loss: 0.00946
Epoch: 24194, Training Loss: 0.00882
Epoch: 24194, Training Loss: 0.01080
Epoch: 24194, Training Loss: 0.00835
Epoch: 24195, Training Loss: 0.00946
Epoch: 24195, Training Loss: 0.00882
Epoch: 24195, Training Loss: 0.01080
Epoch: 24195, Training Loss: 0.00835
Epoch: 24196, Training Loss: 0.00946
Epoch: 24196, Training Loss: 0.00882
Epoch: 24196, Training Loss: 0.01080
Epoch: 24196, Training Loss: 0.00835
Epoch: 24197, Training Loss: 0.00946
Epoch: 24197, Training Loss: 0.00882
Epoch: 24197, Training Loss: 0.01080
Epoch: 24197, Training Loss: 0.00835
Epoch: 24198, Training Loss: 0.00946
Epoch: 24198, Training Loss: 0.00882
Epoch: 24198, Training Loss: 0.01080
Epoch: 24198, Training Loss: 0.00835
Epoch: 24199, Training Loss: 0.00946
Epoch: 24199, Training Loss: 0.00882
Epoch: 24199, Training Loss: 0.01080
Epoch: 24199, Training Loss: 0.00835
Epoch: 24200, Training Loss: 0.00946
Epoch: 24200, Training Loss: 0.00882
Epoch: 24200, Training Loss: 0.01080
Epoch: 24200, Training Loss: 0.00835
Epoch: 24201, Training Loss: 0.00946
Epoch: 24201, Training Loss: 0.00882
Epoch: 24201, Training Loss: 0.01080
Epoch: 24201, Training Loss: 0.00835
Epoch: 24202, Training Loss: 0.00946
Epoch: 24202, Training Loss: 0.00881
Epoch: 24202, Training Loss: 0.01080
Epoch: 24202, Training Loss: 0.00835
Epoch: 24203, Training Loss: 0.00946
Epoch: 24203, Training Loss: 0.00881
Epoch: 24203, Training Loss: 0.01080
Epoch: 24203, Training Loss: 0.00835
Epoch: 24204, Training Loss: 0.00946
Epoch: 24204, Training Loss: 0.00881
Epoch: 24204, Training Loss: 0.01080
Epoch: 24204, Training Loss: 0.00835
Epoch: 24205, Training Loss: 0.00946
Epoch: 24205, Training Loss: 0.00881
Epoch: 24205, Training Loss: 0.01080
Epoch: 24205, Training Loss: 0.00835
Epoch: 24206, Training Loss: 0.00946
Epoch: 24206, Training Loss: 0.00881
Epoch: 24206, Training Loss: 0.01080
Epoch: 24206, Training Loss: 0.00835
Epoch: 24207, Training Loss: 0.00946
Epoch: 24207, Training Loss: 0.00881
Epoch: 24207, Training Loss: 0.01080
Epoch: 24207, Training Loss: 0.00835
Epoch: 24208, Training Loss: 0.00946
Epoch: 24208, Training Loss: 0.00881
Epoch: 24208, Training Loss: 0.01080
Epoch: 24208, Training Loss: 0.00835
Epoch: 24209, Training Loss: 0.00946
Epoch: 24209, Training Loss: 0.00881
Epoch: 24209, Training Loss: 0.01080
Epoch: 24209, Training Loss: 0.00835
Epoch: 24210, Training Loss: 0.00946
Epoch: 24210, Training Loss: 0.00881
Epoch: 24210, Training Loss: 0.01080
Epoch: 24210, Training Loss: 0.00835
Epoch: 24211, Training Loss: 0.00946
Epoch: 24211, Training Loss: 0.00881
Epoch: 24211, Training Loss: 0.01080
Epoch: 24211, Training Loss: 0.00835
Epoch: 24212, Training Loss: 0.00946
Epoch: 24212, Training Loss: 0.00881
Epoch: 24212, Training Loss: 0.01080
Epoch: 24212, Training Loss: 0.00835
Epoch: 24213, Training Loss: 0.00946
Epoch: 24213, Training Loss: 0.00881
Epoch: 24213, Training Loss: 0.01080
Epoch: 24213, Training Loss: 0.00835
Epoch: 24214, Training Loss: 0.00946
Epoch: 24214, Training Loss: 0.00881
Epoch: 24214, Training Loss: 0.01080
Epoch: 24214, Training Loss: 0.00835
Epoch: 24215, Training Loss: 0.00946
Epoch: 24215, Training Loss: 0.00881
Epoch: 24215, Training Loss: 0.01080
Epoch: 24215, Training Loss: 0.00835
Epoch: 24216, Training Loss: 0.00946
Epoch: 24216, Training Loss: 0.00881
Epoch: 24216, Training Loss: 0.01080
Epoch: 24216, Training Loss: 0.00835
Epoch: 24217, Training Loss: 0.00946
Epoch: 24217, Training Loss: 0.00881
Epoch: 24217, Training Loss: 0.01080
Epoch: 24217, Training Loss: 0.00835
Epoch: 24218, Training Loss: 0.00946
Epoch: 24218, Training Loss: 0.00881
Epoch: 24218, Training Loss: 0.01080
Epoch: 24218, Training Loss: 0.00835
Epoch: 24219, Training Loss: 0.00946
Epoch: 24219, Training Loss: 0.00881
Epoch: 24219, Training Loss: 0.01080
Epoch: 24219, Training Loss: 0.00835
Epoch: 24220, Training Loss: 0.00946
Epoch: 24220, Training Loss: 0.00881
Epoch: 24220, Training Loss: 0.01080
Epoch: 24220, Training Loss: 0.00835
Epoch: 24221, Training Loss: 0.00945
Epoch: 24221, Training Loss: 0.00881
Epoch: 24221, Training Loss: 0.01080
Epoch: 24221, Training Loss: 0.00835
Epoch: 24222, Training Loss: 0.00945
Epoch: 24222, Training Loss: 0.00881
Epoch: 24222, Training Loss: 0.01079
Epoch: 24222, Training Loss: 0.00835
Epoch: 24223, Training Loss: 0.00945
Epoch: 24223, Training Loss: 0.00881
Epoch: 24223, Training Loss: 0.01079
Epoch: 24223, Training Loss: 0.00835
Epoch: 24224, Training Loss: 0.00945
Epoch: 24224, Training Loss: 0.00881
Epoch: 24224, Training Loss: 0.01079
Epoch: 24224, Training Loss: 0.00835
Epoch: 24225, Training Loss: 0.00945
Epoch: 24225, Training Loss: 0.00881
Epoch: 24225, Training Loss: 0.01079
Epoch: 24225, Training Loss: 0.00835
Epoch: 24226, Training Loss: 0.00945
Epoch: 24226, Training Loss: 0.00881
Epoch: 24226, Training Loss: 0.01079
Epoch: 24226, Training Loss: 0.00835
Epoch: 24227, Training Loss: 0.00945
Epoch: 24227, Training Loss: 0.00881
Epoch: 24227, Training Loss: 0.01079
Epoch: 24227, Training Loss: 0.00835
Epoch: 24228, Training Loss: 0.00945
Epoch: 24228, Training Loss: 0.00881
Epoch: 24228, Training Loss: 0.01079
Epoch: 24228, Training Loss: 0.00834
Epoch: 24229, Training Loss: 0.00945
Epoch: 24229, Training Loss: 0.00881
Epoch: 24229, Training Loss: 0.01079
Epoch: 24229, Training Loss: 0.00834
Epoch: 24230, Training Loss: 0.00945
Epoch: 24230, Training Loss: 0.00881
Epoch: 24230, Training Loss: 0.01079
Epoch: 24230, Training Loss: 0.00834
Epoch: 24231, Training Loss: 0.00945
Epoch: 24231, Training Loss: 0.00881
Epoch: 24231, Training Loss: 0.01079
Epoch: 24231, Training Loss: 0.00834
Epoch: 24232, Training Loss: 0.00945
Epoch: 24232, Training Loss: 0.00881
Epoch: 24232, Training Loss: 0.01079
Epoch: 24232, Training Loss: 0.00834
Epoch: 24233, Training Loss: 0.00945
Epoch: 24233, Training Loss: 0.00881
Epoch: 24233, Training Loss: 0.01079
Epoch: 24233, Training Loss: 0.00834
Epoch: 24234, Training Loss: 0.00945
Epoch: 24234, Training Loss: 0.00881
Epoch: 24234, Training Loss: 0.01079
Epoch: 24234, Training Loss: 0.00834
Epoch: 24235, Training Loss: 0.00945
Epoch: 24235, Training Loss: 0.00881
Epoch: 24235, Training Loss: 0.01079
Epoch: 24235, Training Loss: 0.00834
Epoch: 24236, Training Loss: 0.00945
Epoch: 24236, Training Loss: 0.00881
Epoch: 24236, Training Loss: 0.01079
Epoch: 24236, Training Loss: 0.00834
Epoch: 24237, Training Loss: 0.00945
Epoch: 24237, Training Loss: 0.00881
Epoch: 24237, Training Loss: 0.01079
Epoch: 24237, Training Loss: 0.00834
Epoch: 24238, Training Loss: 0.00945
Epoch: 24238, Training Loss: 0.00881
Epoch: 24238, Training Loss: 0.01079
Epoch: 24238, Training Loss: 0.00834
Epoch: 24239, Training Loss: 0.00945
Epoch: 24239, Training Loss: 0.00881
Epoch: 24239, Training Loss: 0.01079
Epoch: 24239, Training Loss: 0.00834
Epoch: 24240, Training Loss: 0.00945
Epoch: 24240, Training Loss: 0.00881
Epoch: 24240, Training Loss: 0.01079
Epoch: 24240, Training Loss: 0.00834
Epoch: 24241, Training Loss: 0.00945
Epoch: 24241, Training Loss: 0.00881
Epoch: 24241, Training Loss: 0.01079
Epoch: 24241, Training Loss: 0.00834
Epoch: 24242, Training Loss: 0.00945
Epoch: 24242, Training Loss: 0.00881
Epoch: 24242, Training Loss: 0.01079
Epoch: 24242, Training Loss: 0.00834
Epoch: 24243, Training Loss: 0.00945
Epoch: 24243, Training Loss: 0.00881
Epoch: 24243, Training Loss: 0.01079
Epoch: 24243, Training Loss: 0.00834
Epoch: 24244, Training Loss: 0.00945
Epoch: 24244, Training Loss: 0.00881
Epoch: 24244, Training Loss: 0.01079
Epoch: 24244, Training Loss: 0.00834
Epoch: 24245, Training Loss: 0.00945
Epoch: 24245, Training Loss: 0.00881
Epoch: 24245, Training Loss: 0.01079
Epoch: 24245, Training Loss: 0.00834
Epoch: 24246, Training Loss: 0.00945
Epoch: 24246, Training Loss: 0.00881
Epoch: 24246, Training Loss: 0.01079
Epoch: 24246, Training Loss: 0.00834
Epoch: 24247, Training Loss: 0.00945
Epoch: 24247, Training Loss: 0.00881
Epoch: 24247, Training Loss: 0.01079
Epoch: 24247, Training Loss: 0.00834
Epoch: 24248, Training Loss: 0.00945
Epoch: 24248, Training Loss: 0.00881
Epoch: 24248, Training Loss: 0.01079
Epoch: 24248, Training Loss: 0.00834
Epoch: 24249, Training Loss: 0.00945
Epoch: 24249, Training Loss: 0.00881
Epoch: 24249, Training Loss: 0.01079
Epoch: 24249, Training Loss: 0.00834
Epoch: 24250, Training Loss: 0.00945
Epoch: 24250, Training Loss: 0.00881
Epoch: 24250, Training Loss: 0.01079
Epoch: 24250, Training Loss: 0.00834
Epoch: 24251, Training Loss: 0.00945
Epoch: 24251, Training Loss: 0.00881
Epoch: 24251, Training Loss: 0.01079
Epoch: 24251, Training Loss: 0.00834
Epoch: 24252, Training Loss: 0.00945
Epoch: 24252, Training Loss: 0.00881
Epoch: 24252, Training Loss: 0.01079
Epoch: 24252, Training Loss: 0.00834
Epoch: 24253, Training Loss: 0.00945
Epoch: 24253, Training Loss: 0.00881
Epoch: 24253, Training Loss: 0.01079
Epoch: 24253, Training Loss: 0.00834
Epoch: 24254, Training Loss: 0.00945
Epoch: 24254, Training Loss: 0.00880
Epoch: 24254, Training Loss: 0.01079
Epoch: 24254, Training Loss: 0.00834
Epoch: 24255, Training Loss: 0.00945
Epoch: 24255, Training Loss: 0.00880
Epoch: 24255, Training Loss: 0.01079
Epoch: 24255, Training Loss: 0.00834
Epoch: 24256, Training Loss: 0.00945
Epoch: 24256, Training Loss: 0.00880
Epoch: 24256, Training Loss: 0.01079
Epoch: 24256, Training Loss: 0.00834
Epoch: 24257, Training Loss: 0.00945
Epoch: 24257, Training Loss: 0.00880
Epoch: 24257, Training Loss: 0.01079
Epoch: 24257, Training Loss: 0.00834
Epoch: 24258, Training Loss: 0.00945
Epoch: 24258, Training Loss: 0.00880
Epoch: 24258, Training Loss: 0.01079
Epoch: 24258, Training Loss: 0.00834
Epoch: 24259, Training Loss: 0.00945
Epoch: 24259, Training Loss: 0.00880
Epoch: 24259, Training Loss: 0.01079
Epoch: 24259, Training Loss: 0.00834
Epoch: 24260, Training Loss: 0.00945
Epoch: 24260, Training Loss: 0.00880
Epoch: 24260, Training Loss: 0.01079
Epoch: 24260, Training Loss: 0.00834
[... output elided: epochs 24261–24503 each print four per-sample quadratic losses, decreasing very slowly from ≈0.00945, 0.00880, 0.01079, 0.00834 to ≈0.00940, 0.00876, 0.01073, 0.00830 ...]
Epoch: 24504, Training Loss: 0.00940
Epoch: 24504, Training Loss: 0.00876
Epoch: 24504, Training Loss: 0.01073
Epoch: 24504, Training Loss: 0.00830
Epoch: 24505, Training Loss: 0.00940
Epoch: 24505, Training Loss: 0.00876
Epoch: 24505, Training Loss: 0.01073
Epoch: 24505, Training Loss: 0.00830
Epoch: 24506, Training Loss: 0.00940
Epoch: 24506, Training Loss: 0.00876
Epoch: 24506, Training Loss: 0.01073
Epoch: 24506, Training Loss: 0.00829
Epoch: 24507, Training Loss: 0.00940
Epoch: 24507, Training Loss: 0.00876
Epoch: 24507, Training Loss: 0.01073
Epoch: 24507, Training Loss: 0.00829
Epoch: 24508, Training Loss: 0.00940
Epoch: 24508, Training Loss: 0.00876
Epoch: 24508, Training Loss: 0.01073
Epoch: 24508, Training Loss: 0.00829
Epoch: 24509, Training Loss: 0.00940
Epoch: 24509, Training Loss: 0.00876
Epoch: 24509, Training Loss: 0.01073
Epoch: 24509, Training Loss: 0.00829
Epoch: 24510, Training Loss: 0.00939
Epoch: 24510, Training Loss: 0.00876
Epoch: 24510, Training Loss: 0.01073
Epoch: 24510, Training Loss: 0.00829
Epoch: 24511, Training Loss: 0.00939
Epoch: 24511, Training Loss: 0.00876
Epoch: 24511, Training Loss: 0.01073
Epoch: 24511, Training Loss: 0.00829
Epoch: 24512, Training Loss: 0.00939
Epoch: 24512, Training Loss: 0.00876
Epoch: 24512, Training Loss: 0.01073
Epoch: 24512, Training Loss: 0.00829
Epoch: 24513, Training Loss: 0.00939
Epoch: 24513, Training Loss: 0.00876
Epoch: 24513, Training Loss: 0.01073
Epoch: 24513, Training Loss: 0.00829
Epoch: 24514, Training Loss: 0.00939
Epoch: 24514, Training Loss: 0.00876
Epoch: 24514, Training Loss: 0.01073
Epoch: 24514, Training Loss: 0.00829
Epoch: 24515, Training Loss: 0.00939
Epoch: 24515, Training Loss: 0.00876
Epoch: 24515, Training Loss: 0.01073
Epoch: 24515, Training Loss: 0.00829
Epoch: 24516, Training Loss: 0.00939
Epoch: 24516, Training Loss: 0.00876
Epoch: 24516, Training Loss: 0.01073
Epoch: 24516, Training Loss: 0.00829
Epoch: 24517, Training Loss: 0.00939
Epoch: 24517, Training Loss: 0.00876
Epoch: 24517, Training Loss: 0.01073
Epoch: 24517, Training Loss: 0.00829
Epoch: 24518, Training Loss: 0.00939
Epoch: 24518, Training Loss: 0.00875
Epoch: 24518, Training Loss: 0.01073
Epoch: 24518, Training Loss: 0.00829
Epoch: 24519, Training Loss: 0.00939
Epoch: 24519, Training Loss: 0.00875
Epoch: 24519, Training Loss: 0.01072
Epoch: 24519, Training Loss: 0.00829
Epoch: 24520, Training Loss: 0.00939
Epoch: 24520, Training Loss: 0.00875
Epoch: 24520, Training Loss: 0.01072
Epoch: 24520, Training Loss: 0.00829
Epoch: 24521, Training Loss: 0.00939
Epoch: 24521, Training Loss: 0.00875
Epoch: 24521, Training Loss: 0.01072
Epoch: 24521, Training Loss: 0.00829
Epoch: 24522, Training Loss: 0.00939
Epoch: 24522, Training Loss: 0.00875
Epoch: 24522, Training Loss: 0.01072
Epoch: 24522, Training Loss: 0.00829
Epoch: 24523, Training Loss: 0.00939
Epoch: 24523, Training Loss: 0.00875
Epoch: 24523, Training Loss: 0.01072
Epoch: 24523, Training Loss: 0.00829
Epoch: 24524, Training Loss: 0.00939
Epoch: 24524, Training Loss: 0.00875
Epoch: 24524, Training Loss: 0.01072
Epoch: 24524, Training Loss: 0.00829
Epoch: 24525, Training Loss: 0.00939
Epoch: 24525, Training Loss: 0.00875
Epoch: 24525, Training Loss: 0.01072
Epoch: 24525, Training Loss: 0.00829
Epoch: 24526, Training Loss: 0.00939
Epoch: 24526, Training Loss: 0.00875
Epoch: 24526, Training Loss: 0.01072
Epoch: 24526, Training Loss: 0.00829
Epoch: 24527, Training Loss: 0.00939
Epoch: 24527, Training Loss: 0.00875
Epoch: 24527, Training Loss: 0.01072
Epoch: 24527, Training Loss: 0.00829
Epoch: 24528, Training Loss: 0.00939
Epoch: 24528, Training Loss: 0.00875
Epoch: 24528, Training Loss: 0.01072
Epoch: 24528, Training Loss: 0.00829
Epoch: 24529, Training Loss: 0.00939
Epoch: 24529, Training Loss: 0.00875
Epoch: 24529, Training Loss: 0.01072
Epoch: 24529, Training Loss: 0.00829
Epoch: 24530, Training Loss: 0.00939
Epoch: 24530, Training Loss: 0.00875
Epoch: 24530, Training Loss: 0.01072
Epoch: 24530, Training Loss: 0.00829
Epoch: 24531, Training Loss: 0.00939
Epoch: 24531, Training Loss: 0.00875
Epoch: 24531, Training Loss: 0.01072
Epoch: 24531, Training Loss: 0.00829
Epoch: 24532, Training Loss: 0.00939
Epoch: 24532, Training Loss: 0.00875
Epoch: 24532, Training Loss: 0.01072
Epoch: 24532, Training Loss: 0.00829
Epoch: 24533, Training Loss: 0.00939
Epoch: 24533, Training Loss: 0.00875
Epoch: 24533, Training Loss: 0.01072
Epoch: 24533, Training Loss: 0.00829
Epoch: 24534, Training Loss: 0.00939
Epoch: 24534, Training Loss: 0.00875
Epoch: 24534, Training Loss: 0.01072
Epoch: 24534, Training Loss: 0.00829
Epoch: 24535, Training Loss: 0.00939
Epoch: 24535, Training Loss: 0.00875
Epoch: 24535, Training Loss: 0.01072
Epoch: 24535, Training Loss: 0.00829
Epoch: 24536, Training Loss: 0.00939
Epoch: 24536, Training Loss: 0.00875
Epoch: 24536, Training Loss: 0.01072
Epoch: 24536, Training Loss: 0.00829
Epoch: 24537, Training Loss: 0.00939
Epoch: 24537, Training Loss: 0.00875
Epoch: 24537, Training Loss: 0.01072
Epoch: 24537, Training Loss: 0.00829
Epoch: 24538, Training Loss: 0.00939
Epoch: 24538, Training Loss: 0.00875
Epoch: 24538, Training Loss: 0.01072
Epoch: 24538, Training Loss: 0.00829
Epoch: 24539, Training Loss: 0.00939
Epoch: 24539, Training Loss: 0.00875
Epoch: 24539, Training Loss: 0.01072
Epoch: 24539, Training Loss: 0.00829
Epoch: 24540, Training Loss: 0.00939
Epoch: 24540, Training Loss: 0.00875
Epoch: 24540, Training Loss: 0.01072
Epoch: 24540, Training Loss: 0.00829
Epoch: 24541, Training Loss: 0.00939
Epoch: 24541, Training Loss: 0.00875
Epoch: 24541, Training Loss: 0.01072
Epoch: 24541, Training Loss: 0.00829
Epoch: 24542, Training Loss: 0.00939
Epoch: 24542, Training Loss: 0.00875
Epoch: 24542, Training Loss: 0.01072
Epoch: 24542, Training Loss: 0.00829
Epoch: 24543, Training Loss: 0.00939
Epoch: 24543, Training Loss: 0.00875
Epoch: 24543, Training Loss: 0.01072
Epoch: 24543, Training Loss: 0.00829
Epoch: 24544, Training Loss: 0.00939
Epoch: 24544, Training Loss: 0.00875
Epoch: 24544, Training Loss: 0.01072
Epoch: 24544, Training Loss: 0.00829
Epoch: 24545, Training Loss: 0.00939
Epoch: 24545, Training Loss: 0.00875
Epoch: 24545, Training Loss: 0.01072
Epoch: 24545, Training Loss: 0.00829
Epoch: 24546, Training Loss: 0.00939
Epoch: 24546, Training Loss: 0.00875
Epoch: 24546, Training Loss: 0.01072
Epoch: 24546, Training Loss: 0.00829
Epoch: 24547, Training Loss: 0.00939
Epoch: 24547, Training Loss: 0.00875
Epoch: 24547, Training Loss: 0.01072
Epoch: 24547, Training Loss: 0.00829
Epoch: 24548, Training Loss: 0.00939
Epoch: 24548, Training Loss: 0.00875
Epoch: 24548, Training Loss: 0.01072
Epoch: 24548, Training Loss: 0.00829
Epoch: 24549, Training Loss: 0.00939
Epoch: 24549, Training Loss: 0.00875
Epoch: 24549, Training Loss: 0.01072
Epoch: 24549, Training Loss: 0.00829
Epoch: 24550, Training Loss: 0.00939
Epoch: 24550, Training Loss: 0.00875
Epoch: 24550, Training Loss: 0.01072
Epoch: 24550, Training Loss: 0.00829
Epoch: 24551, Training Loss: 0.00939
Epoch: 24551, Training Loss: 0.00875
Epoch: 24551, Training Loss: 0.01072
Epoch: 24551, Training Loss: 0.00829
Epoch: 24552, Training Loss: 0.00939
Epoch: 24552, Training Loss: 0.00875
Epoch: 24552, Training Loss: 0.01072
Epoch: 24552, Training Loss: 0.00829
Epoch: 24553, Training Loss: 0.00939
Epoch: 24553, Training Loss: 0.00875
Epoch: 24553, Training Loss: 0.01072
Epoch: 24553, Training Loss: 0.00829
Epoch: 24554, Training Loss: 0.00939
Epoch: 24554, Training Loss: 0.00875
Epoch: 24554, Training Loss: 0.01072
Epoch: 24554, Training Loss: 0.00829
Epoch: 24555, Training Loss: 0.00939
Epoch: 24555, Training Loss: 0.00875
Epoch: 24555, Training Loss: 0.01072
Epoch: 24555, Training Loss: 0.00829
Epoch: 24556, Training Loss: 0.00939
Epoch: 24556, Training Loss: 0.00875
Epoch: 24556, Training Loss: 0.01072
Epoch: 24556, Training Loss: 0.00829
Epoch: 24557, Training Loss: 0.00939
Epoch: 24557, Training Loss: 0.00875
Epoch: 24557, Training Loss: 0.01072
Epoch: 24557, Training Loss: 0.00829
Epoch: 24558, Training Loss: 0.00939
Epoch: 24558, Training Loss: 0.00875
Epoch: 24558, Training Loss: 0.01072
Epoch: 24558, Training Loss: 0.00829
Epoch: 24559, Training Loss: 0.00938
Epoch: 24559, Training Loss: 0.00875
Epoch: 24559, Training Loss: 0.01072
Epoch: 24559, Training Loss: 0.00829
Epoch: 24560, Training Loss: 0.00938
Epoch: 24560, Training Loss: 0.00875
Epoch: 24560, Training Loss: 0.01072
Epoch: 24560, Training Loss: 0.00829
Epoch: 24561, Training Loss: 0.00938
Epoch: 24561, Training Loss: 0.00875
Epoch: 24561, Training Loss: 0.01071
Epoch: 24561, Training Loss: 0.00829
Epoch: 24562, Training Loss: 0.00938
Epoch: 24562, Training Loss: 0.00875
Epoch: 24562, Training Loss: 0.01071
Epoch: 24562, Training Loss: 0.00828
Epoch: 24563, Training Loss: 0.00938
Epoch: 24563, Training Loss: 0.00875
Epoch: 24563, Training Loss: 0.01071
Epoch: 24563, Training Loss: 0.00828
Epoch: 24564, Training Loss: 0.00938
Epoch: 24564, Training Loss: 0.00875
Epoch: 24564, Training Loss: 0.01071
Epoch: 24564, Training Loss: 0.00828
Epoch: 24565, Training Loss: 0.00938
Epoch: 24565, Training Loss: 0.00875
Epoch: 24565, Training Loss: 0.01071
Epoch: 24565, Training Loss: 0.00828
Epoch: 24566, Training Loss: 0.00938
Epoch: 24566, Training Loss: 0.00875
Epoch: 24566, Training Loss: 0.01071
Epoch: 24566, Training Loss: 0.00828
Epoch: 24567, Training Loss: 0.00938
Epoch: 24567, Training Loss: 0.00875
Epoch: 24567, Training Loss: 0.01071
Epoch: 24567, Training Loss: 0.00828
Epoch: 24568, Training Loss: 0.00938
Epoch: 24568, Training Loss: 0.00875
Epoch: 24568, Training Loss: 0.01071
Epoch: 24568, Training Loss: 0.00828
Epoch: 24569, Training Loss: 0.00938
Epoch: 24569, Training Loss: 0.00875
Epoch: 24569, Training Loss: 0.01071
Epoch: 24569, Training Loss: 0.00828
Epoch: 24570, Training Loss: 0.00938
Epoch: 24570, Training Loss: 0.00875
Epoch: 24570, Training Loss: 0.01071
Epoch: 24570, Training Loss: 0.00828
Epoch: 24571, Training Loss: 0.00938
Epoch: 24571, Training Loss: 0.00874
Epoch: 24571, Training Loss: 0.01071
Epoch: 24571, Training Loss: 0.00828
Epoch: 24572, Training Loss: 0.00938
Epoch: 24572, Training Loss: 0.00874
Epoch: 24572, Training Loss: 0.01071
Epoch: 24572, Training Loss: 0.00828
Epoch: 24573, Training Loss: 0.00938
Epoch: 24573, Training Loss: 0.00874
Epoch: 24573, Training Loss: 0.01071
Epoch: 24573, Training Loss: 0.00828
Epoch: 24574, Training Loss: 0.00938
Epoch: 24574, Training Loss: 0.00874
Epoch: 24574, Training Loss: 0.01071
Epoch: 24574, Training Loss: 0.00828
Epoch: 24575, Training Loss: 0.00938
Epoch: 24575, Training Loss: 0.00874
Epoch: 24575, Training Loss: 0.01071
Epoch: 24575, Training Loss: 0.00828
Epoch: 24576, Training Loss: 0.00938
Epoch: 24576, Training Loss: 0.00874
Epoch: 24576, Training Loss: 0.01071
Epoch: 24576, Training Loss: 0.00828
Epoch: 24577, Training Loss: 0.00938
Epoch: 24577, Training Loss: 0.00874
Epoch: 24577, Training Loss: 0.01071
Epoch: 24577, Training Loss: 0.00828
Epoch: 24578, Training Loss: 0.00938
Epoch: 24578, Training Loss: 0.00874
Epoch: 24578, Training Loss: 0.01071
Epoch: 24578, Training Loss: 0.00828
Epoch: 24579, Training Loss: 0.00938
Epoch: 24579, Training Loss: 0.00874
Epoch: 24579, Training Loss: 0.01071
Epoch: 24579, Training Loss: 0.00828
Epoch: 24580, Training Loss: 0.00938
Epoch: 24580, Training Loss: 0.00874
Epoch: 24580, Training Loss: 0.01071
Epoch: 24580, Training Loss: 0.00828
Epoch: 24581, Training Loss: 0.00938
Epoch: 24581, Training Loss: 0.00874
Epoch: 24581, Training Loss: 0.01071
Epoch: 24581, Training Loss: 0.00828
Epoch: 24582, Training Loss: 0.00938
Epoch: 24582, Training Loss: 0.00874
Epoch: 24582, Training Loss: 0.01071
Epoch: 24582, Training Loss: 0.00828
Epoch: 24583, Training Loss: 0.00938
Epoch: 24583, Training Loss: 0.00874
Epoch: 24583, Training Loss: 0.01071
Epoch: 24583, Training Loss: 0.00828
Epoch: 24584, Training Loss: 0.00938
Epoch: 24584, Training Loss: 0.00874
Epoch: 24584, Training Loss: 0.01071
Epoch: 24584, Training Loss: 0.00828
Epoch: 24585, Training Loss: 0.00938
Epoch: 24585, Training Loss: 0.00874
Epoch: 24585, Training Loss: 0.01071
Epoch: 24585, Training Loss: 0.00828
Epoch: 24586, Training Loss: 0.00938
Epoch: 24586, Training Loss: 0.00874
Epoch: 24586, Training Loss: 0.01071
Epoch: 24586, Training Loss: 0.00828
Epoch: 24587, Training Loss: 0.00938
Epoch: 24587, Training Loss: 0.00874
Epoch: 24587, Training Loss: 0.01071
Epoch: 24587, Training Loss: 0.00828
Epoch: 24588, Training Loss: 0.00938
Epoch: 24588, Training Loss: 0.00874
Epoch: 24588, Training Loss: 0.01071
Epoch: 24588, Training Loss: 0.00828
Epoch: 24589, Training Loss: 0.00938
Epoch: 24589, Training Loss: 0.00874
Epoch: 24589, Training Loss: 0.01071
Epoch: 24589, Training Loss: 0.00828
Epoch: 24590, Training Loss: 0.00938
Epoch: 24590, Training Loss: 0.00874
Epoch: 24590, Training Loss: 0.01071
Epoch: 24590, Training Loss: 0.00828
Epoch: 24591, Training Loss: 0.00938
Epoch: 24591, Training Loss: 0.00874
Epoch: 24591, Training Loss: 0.01071
Epoch: 24591, Training Loss: 0.00828
Epoch: 24592, Training Loss: 0.00938
Epoch: 24592, Training Loss: 0.00874
Epoch: 24592, Training Loss: 0.01071
Epoch: 24592, Training Loss: 0.00828
Epoch: 24593, Training Loss: 0.00938
Epoch: 24593, Training Loss: 0.00874
Epoch: 24593, Training Loss: 0.01071
Epoch: 24593, Training Loss: 0.00828
Epoch: 24594, Training Loss: 0.00938
Epoch: 24594, Training Loss: 0.00874
Epoch: 24594, Training Loss: 0.01071
Epoch: 24594, Training Loss: 0.00828
Epoch: 24595, Training Loss: 0.00938
Epoch: 24595, Training Loss: 0.00874
Epoch: 24595, Training Loss: 0.01071
Epoch: 24595, Training Loss: 0.00828
Epoch: 24596, Training Loss: 0.00938
Epoch: 24596, Training Loss: 0.00874
Epoch: 24596, Training Loss: 0.01071
Epoch: 24596, Training Loss: 0.00828
Epoch: 24597, Training Loss: 0.00938
Epoch: 24597, Training Loss: 0.00874
Epoch: 24597, Training Loss: 0.01071
Epoch: 24597, Training Loss: 0.00828
Epoch: 24598, Training Loss: 0.00938
Epoch: 24598, Training Loss: 0.00874
Epoch: 24598, Training Loss: 0.01071
Epoch: 24598, Training Loss: 0.00828
Epoch: 24599, Training Loss: 0.00938
Epoch: 24599, Training Loss: 0.00874
Epoch: 24599, Training Loss: 0.01071
Epoch: 24599, Training Loss: 0.00828
Epoch: 24600, Training Loss: 0.00938
Epoch: 24600, Training Loss: 0.00874
Epoch: 24600, Training Loss: 0.01071
Epoch: 24600, Training Loss: 0.00828
Epoch: 24601, Training Loss: 0.00938
Epoch: 24601, Training Loss: 0.00874
Epoch: 24601, Training Loss: 0.01071
Epoch: 24601, Training Loss: 0.00828
Epoch: 24602, Training Loss: 0.00938
Epoch: 24602, Training Loss: 0.00874
Epoch: 24602, Training Loss: 0.01071
Epoch: 24602, Training Loss: 0.00828
Epoch: 24603, Training Loss: 0.00938
Epoch: 24603, Training Loss: 0.00874
Epoch: 24603, Training Loss: 0.01071
Epoch: 24603, Training Loss: 0.00828
Epoch: 24604, Training Loss: 0.00938
Epoch: 24604, Training Loss: 0.00874
Epoch: 24604, Training Loss: 0.01070
Epoch: 24604, Training Loss: 0.00828
Epoch: 24605, Training Loss: 0.00938
Epoch: 24605, Training Loss: 0.00874
Epoch: 24605, Training Loss: 0.01070
Epoch: 24605, Training Loss: 0.00828
Epoch: 24606, Training Loss: 0.00938
Epoch: 24606, Training Loss: 0.00874
Epoch: 24606, Training Loss: 0.01070
Epoch: 24606, Training Loss: 0.00828
Epoch: 24607, Training Loss: 0.00938
Epoch: 24607, Training Loss: 0.00874
Epoch: 24607, Training Loss: 0.01070
Epoch: 24607, Training Loss: 0.00828
Epoch: 24608, Training Loss: 0.00937
Epoch: 24608, Training Loss: 0.00874
Epoch: 24608, Training Loss: 0.01070
Epoch: 24608, Training Loss: 0.00828
Epoch: 24609, Training Loss: 0.00937
Epoch: 24609, Training Loss: 0.00874
Epoch: 24609, Training Loss: 0.01070
Epoch: 24609, Training Loss: 0.00828
Epoch: 24610, Training Loss: 0.00937
Epoch: 24610, Training Loss: 0.00874
Epoch: 24610, Training Loss: 0.01070
Epoch: 24610, Training Loss: 0.00828
Epoch: 24611, Training Loss: 0.00937
Epoch: 24611, Training Loss: 0.00874
Epoch: 24611, Training Loss: 0.01070
Epoch: 24611, Training Loss: 0.00828
Epoch: 24612, Training Loss: 0.00937
Epoch: 24612, Training Loss: 0.00874
Epoch: 24612, Training Loss: 0.01070
Epoch: 24612, Training Loss: 0.00828
Epoch: 24613, Training Loss: 0.00937
Epoch: 24613, Training Loss: 0.00874
Epoch: 24613, Training Loss: 0.01070
Epoch: 24613, Training Loss: 0.00828
Epoch: 24614, Training Loss: 0.00937
Epoch: 24614, Training Loss: 0.00874
Epoch: 24614, Training Loss: 0.01070
Epoch: 24614, Training Loss: 0.00828
Epoch: 24615, Training Loss: 0.00937
Epoch: 24615, Training Loss: 0.00874
Epoch: 24615, Training Loss: 0.01070
Epoch: 24615, Training Loss: 0.00828
Epoch: 24616, Training Loss: 0.00937
Epoch: 24616, Training Loss: 0.00874
Epoch: 24616, Training Loss: 0.01070
Epoch: 24616, Training Loss: 0.00828
Epoch: 24617, Training Loss: 0.00937
Epoch: 24617, Training Loss: 0.00874
Epoch: 24617, Training Loss: 0.01070
Epoch: 24617, Training Loss: 0.00828
Epoch: 24618, Training Loss: 0.00937
Epoch: 24618, Training Loss: 0.00874
Epoch: 24618, Training Loss: 0.01070
Epoch: 24618, Training Loss: 0.00827
Epoch: 24619, Training Loss: 0.00937
Epoch: 24619, Training Loss: 0.00874
Epoch: 24619, Training Loss: 0.01070
Epoch: 24619, Training Loss: 0.00827
Epoch: 24620, Training Loss: 0.00937
Epoch: 24620, Training Loss: 0.00874
Epoch: 24620, Training Loss: 0.01070
Epoch: 24620, Training Loss: 0.00827
Epoch: 24621, Training Loss: 0.00937
Epoch: 24621, Training Loss: 0.00874
Epoch: 24621, Training Loss: 0.01070
Epoch: 24621, Training Loss: 0.00827
Epoch: 24622, Training Loss: 0.00937
Epoch: 24622, Training Loss: 0.00874
Epoch: 24622, Training Loss: 0.01070
Epoch: 24622, Training Loss: 0.00827
Epoch: 24623, Training Loss: 0.00937
Epoch: 24623, Training Loss: 0.00874
Epoch: 24623, Training Loss: 0.01070
Epoch: 24623, Training Loss: 0.00827
Epoch: 24624, Training Loss: 0.00937
Epoch: 24624, Training Loss: 0.00874
Epoch: 24624, Training Loss: 0.01070
Epoch: 24624, Training Loss: 0.00827
Epoch: 24625, Training Loss: 0.00937
Epoch: 24625, Training Loss: 0.00873
Epoch: 24625, Training Loss: 0.01070
Epoch: 24625, Training Loss: 0.00827
Epoch: 24626, Training Loss: 0.00937
Epoch: 24626, Training Loss: 0.00873
Epoch: 24626, Training Loss: 0.01070
Epoch: 24626, Training Loss: 0.00827
Epoch: 24627, Training Loss: 0.00937
Epoch: 24627, Training Loss: 0.00873
Epoch: 24627, Training Loss: 0.01070
Epoch: 24627, Training Loss: 0.00827
Epoch: 24628, Training Loss: 0.00937
Epoch: 24628, Training Loss: 0.00873
Epoch: 24628, Training Loss: 0.01070
Epoch: 24628, Training Loss: 0.00827
Epoch: 24629, Training Loss: 0.00937
Epoch: 24629, Training Loss: 0.00873
Epoch: 24629, Training Loss: 0.01070
Epoch: 24629, Training Loss: 0.00827
Epoch: 24630, Training Loss: 0.00937
Epoch: 24630, Training Loss: 0.00873
Epoch: 24630, Training Loss: 0.01070
Epoch: 24630, Training Loss: 0.00827
Epoch: 24631, Training Loss: 0.00937
Epoch: 24631, Training Loss: 0.00873
Epoch: 24631, Training Loss: 0.01070
Epoch: 24631, Training Loss: 0.00827
Epoch: 24632, Training Loss: 0.00937
Epoch: 24632, Training Loss: 0.00873
Epoch: 24632, Training Loss: 0.01070
Epoch: 24632, Training Loss: 0.00827
Epoch: 24633, Training Loss: 0.00937
Epoch: 24633, Training Loss: 0.00873
Epoch: 24633, Training Loss: 0.01070
Epoch: 24633, Training Loss: 0.00827
Epoch: 24634, Training Loss: 0.00937
Epoch: 24634, Training Loss: 0.00873
Epoch: 24634, Training Loss: 0.01070
Epoch: 24634, Training Loss: 0.00827
Epoch: 24635, Training Loss: 0.00937
Epoch: 24635, Training Loss: 0.00873
Epoch: 24635, Training Loss: 0.01070
Epoch: 24635, Training Loss: 0.00827
Epoch: 24636, Training Loss: 0.00937
Epoch: 24636, Training Loss: 0.00873
Epoch: 24636, Training Loss: 0.01070
Epoch: 24636, Training Loss: 0.00827
Epoch: 24637, Training Loss: 0.00937
Epoch: 24637, Training Loss: 0.00873
Epoch: 24637, Training Loss: 0.01070
Epoch: 24637, Training Loss: 0.00827
Epoch: 24638, Training Loss: 0.00937
Epoch: 24638, Training Loss: 0.00873
Epoch: 24638, Training Loss: 0.01070
Epoch: 24638, Training Loss: 0.00827
Epoch: 24639, Training Loss: 0.00937
Epoch: 24639, Training Loss: 0.00873
Epoch: 24639, Training Loss: 0.01070
Epoch: 24639, Training Loss: 0.00827
Epoch: 24640, Training Loss: 0.00937
Epoch: 24640, Training Loss: 0.00873
Epoch: 24640, Training Loss: 0.01070
Epoch: 24640, Training Loss: 0.00827
Epoch: 24641, Training Loss: 0.00937
Epoch: 24641, Training Loss: 0.00873
Epoch: 24641, Training Loss: 0.01070
Epoch: 24641, Training Loss: 0.00827
Epoch: 24642, Training Loss: 0.00937
Epoch: 24642, Training Loss: 0.00873
Epoch: 24642, Training Loss: 0.01070
Epoch: 24642, Training Loss: 0.00827
Epoch: 24643, Training Loss: 0.00937
Epoch: 24643, Training Loss: 0.00873
Epoch: 24643, Training Loss: 0.01070
Epoch: 24643, Training Loss: 0.00827
Epoch: 24644, Training Loss: 0.00937
Epoch: 24644, Training Loss: 0.00873
Epoch: 24644, Training Loss: 0.01070
Epoch: 24644, Training Loss: 0.00827
Epoch: 24645, Training Loss: 0.00937
Epoch: 24645, Training Loss: 0.00873
Epoch: 24645, Training Loss: 0.01070
Epoch: 24645, Training Loss: 0.00827
Epoch: 24646, Training Loss: 0.00937
Epoch: 24646, Training Loss: 0.00873
Epoch: 24646, Training Loss: 0.01070
Epoch: 24646, Training Loss: 0.00827
Epoch: 24647, Training Loss: 0.00937
Epoch: 24647, Training Loss: 0.00873
Epoch: 24647, Training Loss: 0.01069
Epoch: 24647, Training Loss: 0.00827
Epoch: 24648, Training Loss: 0.00937
Epoch: 24648, Training Loss: 0.00873
Epoch: 24648, Training Loss: 0.01069
Epoch: 24648, Training Loss: 0.00827
Epoch: 24649, Training Loss: 0.00937
Epoch: 24649, Training Loss: 0.00873
Epoch: 24649, Training Loss: 0.01069
Epoch: 24649, Training Loss: 0.00827
Epoch: 24650, Training Loss: 0.00937
Epoch: 24650, Training Loss: 0.00873
Epoch: 24650, Training Loss: 0.01069
Epoch: 24650, Training Loss: 0.00827
Epoch: 24651, Training Loss: 0.00937
Epoch: 24651, Training Loss: 0.00873
Epoch: 24651, Training Loss: 0.01069
Epoch: 24651, Training Loss: 0.00827
Epoch: 24652, Training Loss: 0.00937
Epoch: 24652, Training Loss: 0.00873
Epoch: 24652, Training Loss: 0.01069
Epoch: 24652, Training Loss: 0.00827
Epoch: 24653, Training Loss: 0.00937
Epoch: 24653, Training Loss: 0.00873
Epoch: 24653, Training Loss: 0.01069
Epoch: 24653, Training Loss: 0.00827
Epoch: 24654, Training Loss: 0.00937
Epoch: 24654, Training Loss: 0.00873
Epoch: 24654, Training Loss: 0.01069
Epoch: 24654, Training Loss: 0.00827
Epoch: 24655, Training Loss: 0.00937
Epoch: 24655, Training Loss: 0.00873
Epoch: 24655, Training Loss: 0.01069
Epoch: 24655, Training Loss: 0.00827
Epoch: 24656, Training Loss: 0.00937
Epoch: 24656, Training Loss: 0.00873
Epoch: 24656, Training Loss: 0.01069
Epoch: 24656, Training Loss: 0.00827
Epoch: 24657, Training Loss: 0.00936
Epoch: 24657, Training Loss: 0.00873
Epoch: 24657, Training Loss: 0.01069
Epoch: 24657, Training Loss: 0.00827
Epoch: 24658, Training Loss: 0.00936
Epoch: 24658, Training Loss: 0.00873
Epoch: 24658, Training Loss: 0.01069
Epoch: 24658, Training Loss: 0.00827
Epoch: 24659, Training Loss: 0.00936
Epoch: 24659, Training Loss: 0.00873
Epoch: 24659, Training Loss: 0.01069
Epoch: 24659, Training Loss: 0.00827
Epoch: 24660, Training Loss: 0.00936
Epoch: 24660, Training Loss: 0.00873
Epoch: 24660, Training Loss: 0.01069
Epoch: 24660, Training Loss: 0.00827
Epoch: 24661, Training Loss: 0.00936
Epoch: 24661, Training Loss: 0.00873
Epoch: 24661, Training Loss: 0.01069
Epoch: 24661, Training Loss: 0.00827
Epoch: 24662, Training Loss: 0.00936
Epoch: 24662, Training Loss: 0.00873
Epoch: 24662, Training Loss: 0.01069
Epoch: 24662, Training Loss: 0.00827
Epoch: 24663, Training Loss: 0.00936
Epoch: 24663, Training Loss: 0.00873
Epoch: 24663, Training Loss: 0.01069
Epoch: 24663, Training Loss: 0.00827
Epoch: 24664, Training Loss: 0.00936
Epoch: 24664, Training Loss: 0.00873
Epoch: 24664, Training Loss: 0.01069
Epoch: 24664, Training Loss: 0.00827
Epoch: 24665, Training Loss: 0.00936
Epoch: 24665, Training Loss: 0.00873
Epoch: 24665, Training Loss: 0.01069
Epoch: 24665, Training Loss: 0.00827
Epoch: 24666, Training Loss: 0.00936
Epoch: 24666, Training Loss: 0.00873
Epoch: 24666, Training Loss: 0.01069
Epoch: 24666, Training Loss: 0.00827
Epoch: 24667, Training Loss: 0.00936
Epoch: 24667, Training Loss: 0.00873
Epoch: 24667, Training Loss: 0.01069
Epoch: 24667, Training Loss: 0.00827
Epoch: 24668, Training Loss: 0.00936
Epoch: 24668, Training Loss: 0.00873
Epoch: 24668, Training Loss: 0.01069
Epoch: 24668, Training Loss: 0.00827
Epoch: 24669, Training Loss: 0.00936
Epoch: 24669, Training Loss: 0.00873
Epoch: 24669, Training Loss: 0.01069
Epoch: 24669, Training Loss: 0.00827
Epoch: 24670, Training Loss: 0.00936
Epoch: 24670, Training Loss: 0.00873
Epoch: 24670, Training Loss: 0.01069
Epoch: 24670, Training Loss: 0.00827
Epoch: 24671, Training Loss: 0.00936
Epoch: 24671, Training Loss: 0.00873
Epoch: 24671, Training Loss: 0.01069
Epoch: 24671, Training Loss: 0.00827
Epoch: 24672, Training Loss: 0.00936
Epoch: 24672, Training Loss: 0.00873
Epoch: 24672, Training Loss: 0.01069
Epoch: 24672, Training Loss: 0.00827
Epoch: 24673, Training Loss: 0.00936
Epoch: 24673, Training Loss: 0.00873
Epoch: 24673, Training Loss: 0.01069
Epoch: 24673, Training Loss: 0.00827
Epoch: 24674, Training Loss: 0.00936
Epoch: 24674, Training Loss: 0.00873
Epoch: 24674, Training Loss: 0.01069
Epoch: 24674, Training Loss: 0.00827
Epoch: 24675, Training Loss: 0.00936
Epoch: 24675, Training Loss: 0.00873
Epoch: 24675, Training Loss: 0.01069
Epoch: 24675, Training Loss: 0.00826
Epoch: 24676, Training Loss: 0.00936
Epoch: 24676, Training Loss: 0.00873
Epoch: 24676, Training Loss: 0.01069
Epoch: 24676, Training Loss: 0.00826
Epoch: 24677, Training Loss: 0.00936
Epoch: 24677, Training Loss: 0.00873
Epoch: 24677, Training Loss: 0.01069
Epoch: 24677, Training Loss: 0.00826
Epoch: 24678, Training Loss: 0.00936
Epoch: 24678, Training Loss: 0.00872
Epoch: 24678, Training Loss: 0.01069
Epoch: 24678, Training Loss: 0.00826
Epoch: 24679, Training Loss: 0.00936
Epoch: 24679, Training Loss: 0.00872
Epoch: 24679, Training Loss: 0.01069
Epoch: 24679, Training Loss: 0.00826
Epoch: 24680, Training Loss: 0.00936
Epoch: 24680, Training Loss: 0.00872
Epoch: 24680, Training Loss: 0.01069
Epoch: 24680, Training Loss: 0.00826
Epoch: 24681, Training Loss: 0.00936
Epoch: 24681, Training Loss: 0.00872
Epoch: 24681, Training Loss: 0.01069
Epoch: 24681, Training Loss: 0.00826
Epoch: 24682, Training Loss: 0.00936
Epoch: 24682, Training Loss: 0.00872
Epoch: 24682, Training Loss: 0.01069
Epoch: 24682, Training Loss: 0.00826
Epoch: 24683, Training Loss: 0.00936
Epoch: 24683, Training Loss: 0.00872
Epoch: 24683, Training Loss: 0.01069
Epoch: 24683, Training Loss: 0.00826
Epoch: 24684, Training Loss: 0.00936
Epoch: 24684, Training Loss: 0.00872
Epoch: 24684, Training Loss: 0.01069
Epoch: 24684, Training Loss: 0.00826
Epoch: 24685, Training Loss: 0.00936
Epoch: 24685, Training Loss: 0.00872
Epoch: 24685, Training Loss: 0.01069
Epoch: 24685, Training Loss: 0.00826
Epoch: 24686, Training Loss: 0.00936
Epoch: 24686, Training Loss: 0.00872
Epoch: 24686, Training Loss: 0.01069
Epoch: 24686, Training Loss: 0.00826
Epoch: 24687, Training Loss: 0.00936
Epoch: 24687, Training Loss: 0.00872
Epoch: 24687, Training Loss: 0.01069
Epoch: 24687, Training Loss: 0.00826
Epoch: 24688, Training Loss: 0.00936
Epoch: 24688, Training Loss: 0.00872
Epoch: 24688, Training Loss: 0.01069
Epoch: 24688, Training Loss: 0.00826
Epoch: 24689, Training Loss: 0.00936
Epoch: 24689, Training Loss: 0.00872
Epoch: 24689, Training Loss: 0.01069
Epoch: 24689, Training Loss: 0.00826
Epoch: 24690, Training Loss: 0.00936
Epoch: 24690, Training Loss: 0.00872
Epoch: 24690, Training Loss: 0.01069
Epoch: 24690, Training Loss: 0.00826
Epoch: 24691, Training Loss: 0.00936
Epoch: 24691, Training Loss: 0.00872
Epoch: 24691, Training Loss: 0.01068
Epoch: 24691, Training Loss: 0.00826
Epoch: 24692, Training Loss: 0.00936
Epoch: 24692, Training Loss: 0.00872
Epoch: 24692, Training Loss: 0.01068
Epoch: 24692, Training Loss: 0.00826
Epoch: 24693, Training Loss: 0.00936
Epoch: 24693, Training Loss: 0.00872
Epoch: 24693, Training Loss: 0.01068
Epoch: 24693, Training Loss: 0.00826
... (epochs 24694-24935 omitted: four per-pattern losses are printed every epoch and decrease very slowly, from roughly 0.00936 / 0.00872 / 0.01068 / 0.00826 to 0.00931 / 0.00868 / 0.01063 / 0.00822) ...
Epoch: 24936, Training Loss: 0.00931
Epoch: 24936, Training Loss: 0.00868
Epoch: 24936, Training Loss: 0.01063
Epoch: 24936, Training Loss: 0.00822
Epoch: 24937, Training Loss: 0.00931
Epoch: 24937, Training Loss: 0.00868
Epoch: 24937, Training Loss: 0.01063
Epoch: 24937, Training Loss: 0.00822
Epoch: 24938, Training Loss: 0.00931
Epoch: 24938, Training Loss: 0.00868
Epoch: 24938, Training Loss: 0.01063
Epoch: 24938, Training Loss: 0.00822
Epoch: 24939, Training Loss: 0.00931
Epoch: 24939, Training Loss: 0.00868
Epoch: 24939, Training Loss: 0.01063
Epoch: 24939, Training Loss: 0.00822
Epoch: 24940, Training Loss: 0.00931
Epoch: 24940, Training Loss: 0.00868
Epoch: 24940, Training Loss: 0.01063
Epoch: 24940, Training Loss: 0.00822
Epoch: 24941, Training Loss: 0.00931
Epoch: 24941, Training Loss: 0.00868
Epoch: 24941, Training Loss: 0.01063
Epoch: 24941, Training Loss: 0.00822
Epoch: 24942, Training Loss: 0.00931
Epoch: 24942, Training Loss: 0.00868
Epoch: 24942, Training Loss: 0.01063
Epoch: 24942, Training Loss: 0.00822
Epoch: 24943, Training Loss: 0.00931
Epoch: 24943, Training Loss: 0.00868
Epoch: 24943, Training Loss: 0.01063
Epoch: 24943, Training Loss: 0.00822
Epoch: 24944, Training Loss: 0.00931
Epoch: 24944, Training Loss: 0.00868
Epoch: 24944, Training Loss: 0.01063
Epoch: 24944, Training Loss: 0.00822
Epoch: 24945, Training Loss: 0.00931
Epoch: 24945, Training Loss: 0.00868
Epoch: 24945, Training Loss: 0.01063
Epoch: 24945, Training Loss: 0.00822
Epoch: 24946, Training Loss: 0.00931
Epoch: 24946, Training Loss: 0.00868
Epoch: 24946, Training Loss: 0.01063
Epoch: 24946, Training Loss: 0.00822
Epoch: 24947, Training Loss: 0.00931
Epoch: 24947, Training Loss: 0.00868
Epoch: 24947, Training Loss: 0.01063
Epoch: 24947, Training Loss: 0.00822
Epoch: 24948, Training Loss: 0.00931
Epoch: 24948, Training Loss: 0.00868
Epoch: 24948, Training Loss: 0.01063
Epoch: 24948, Training Loss: 0.00822
Epoch: 24949, Training Loss: 0.00931
Epoch: 24949, Training Loss: 0.00868
Epoch: 24949, Training Loss: 0.01063
Epoch: 24949, Training Loss: 0.00822
Epoch: 24950, Training Loss: 0.00931
Epoch: 24950, Training Loss: 0.00867
Epoch: 24950, Training Loss: 0.01063
Epoch: 24950, Training Loss: 0.00822
Epoch: 24951, Training Loss: 0.00931
Epoch: 24951, Training Loss: 0.00867
Epoch: 24951, Training Loss: 0.01063
Epoch: 24951, Training Loss: 0.00822
Epoch: 24952, Training Loss: 0.00931
Epoch: 24952, Training Loss: 0.00867
Epoch: 24952, Training Loss: 0.01062
Epoch: 24952, Training Loss: 0.00822
Epoch: 24953, Training Loss: 0.00931
Epoch: 24953, Training Loss: 0.00867
Epoch: 24953, Training Loss: 0.01062
Epoch: 24953, Training Loss: 0.00822
Epoch: 24954, Training Loss: 0.00931
Epoch: 24954, Training Loss: 0.00867
Epoch: 24954, Training Loss: 0.01062
Epoch: 24954, Training Loss: 0.00822
Epoch: 24955, Training Loss: 0.00930
Epoch: 24955, Training Loss: 0.00867
Epoch: 24955, Training Loss: 0.01062
Epoch: 24955, Training Loss: 0.00822
Epoch: 24956, Training Loss: 0.00930
Epoch: 24956, Training Loss: 0.00867
Epoch: 24956, Training Loss: 0.01062
Epoch: 24956, Training Loss: 0.00822
Epoch: 24957, Training Loss: 0.00930
Epoch: 24957, Training Loss: 0.00867
Epoch: 24957, Training Loss: 0.01062
Epoch: 24957, Training Loss: 0.00822
Epoch: 24958, Training Loss: 0.00930
Epoch: 24958, Training Loss: 0.00867
Epoch: 24958, Training Loss: 0.01062
Epoch: 24958, Training Loss: 0.00822
Epoch: 24959, Training Loss: 0.00930
Epoch: 24959, Training Loss: 0.00867
Epoch: 24959, Training Loss: 0.01062
Epoch: 24959, Training Loss: 0.00822
Epoch: 24960, Training Loss: 0.00930
Epoch: 24960, Training Loss: 0.00867
Epoch: 24960, Training Loss: 0.01062
Epoch: 24960, Training Loss: 0.00821
Epoch: 24961, Training Loss: 0.00930
Epoch: 24961, Training Loss: 0.00867
Epoch: 24961, Training Loss: 0.01062
Epoch: 24961, Training Loss: 0.00821
Epoch: 24962, Training Loss: 0.00930
Epoch: 24962, Training Loss: 0.00867
Epoch: 24962, Training Loss: 0.01062
Epoch: 24962, Training Loss: 0.00821
Epoch: 24963, Training Loss: 0.00930
Epoch: 24963, Training Loss: 0.00867
Epoch: 24963, Training Loss: 0.01062
Epoch: 24963, Training Loss: 0.00821
Epoch: 24964, Training Loss: 0.00930
Epoch: 24964, Training Loss: 0.00867
Epoch: 24964, Training Loss: 0.01062
Epoch: 24964, Training Loss: 0.00821
Epoch: 24965, Training Loss: 0.00930
Epoch: 24965, Training Loss: 0.00867
Epoch: 24965, Training Loss: 0.01062
Epoch: 24965, Training Loss: 0.00821
Epoch: 24966, Training Loss: 0.00930
Epoch: 24966, Training Loss: 0.00867
Epoch: 24966, Training Loss: 0.01062
Epoch: 24966, Training Loss: 0.00821
Epoch: 24967, Training Loss: 0.00930
Epoch: 24967, Training Loss: 0.00867
Epoch: 24967, Training Loss: 0.01062
Epoch: 24967, Training Loss: 0.00821
Epoch: 24968, Training Loss: 0.00930
Epoch: 24968, Training Loss: 0.00867
Epoch: 24968, Training Loss: 0.01062
Epoch: 24968, Training Loss: 0.00821
Epoch: 24969, Training Loss: 0.00930
Epoch: 24969, Training Loss: 0.00867
Epoch: 24969, Training Loss: 0.01062
Epoch: 24969, Training Loss: 0.00821
Epoch: 24970, Training Loss: 0.00930
Epoch: 24970, Training Loss: 0.00867
Epoch: 24970, Training Loss: 0.01062
Epoch: 24970, Training Loss: 0.00821
Epoch: 24971, Training Loss: 0.00930
Epoch: 24971, Training Loss: 0.00867
Epoch: 24971, Training Loss: 0.01062
Epoch: 24971, Training Loss: 0.00821
Epoch: 24972, Training Loss: 0.00930
Epoch: 24972, Training Loss: 0.00867
Epoch: 24972, Training Loss: 0.01062
Epoch: 24972, Training Loss: 0.00821
Epoch: 24973, Training Loss: 0.00930
Epoch: 24973, Training Loss: 0.00867
Epoch: 24973, Training Loss: 0.01062
Epoch: 24973, Training Loss: 0.00821
Epoch: 24974, Training Loss: 0.00930
Epoch: 24974, Training Loss: 0.00867
Epoch: 24974, Training Loss: 0.01062
Epoch: 24974, Training Loss: 0.00821
Epoch: 24975, Training Loss: 0.00930
Epoch: 24975, Training Loss: 0.00867
Epoch: 24975, Training Loss: 0.01062
Epoch: 24975, Training Loss: 0.00821
Epoch: 24976, Training Loss: 0.00930
Epoch: 24976, Training Loss: 0.00867
Epoch: 24976, Training Loss: 0.01062
Epoch: 24976, Training Loss: 0.00821
Epoch: 24977, Training Loss: 0.00930
Epoch: 24977, Training Loss: 0.00867
Epoch: 24977, Training Loss: 0.01062
Epoch: 24977, Training Loss: 0.00821
Epoch: 24978, Training Loss: 0.00930
Epoch: 24978, Training Loss: 0.00867
Epoch: 24978, Training Loss: 0.01062
Epoch: 24978, Training Loss: 0.00821
Epoch: 24979, Training Loss: 0.00930
Epoch: 24979, Training Loss: 0.00867
Epoch: 24979, Training Loss: 0.01062
Epoch: 24979, Training Loss: 0.00821
Epoch: 24980, Training Loss: 0.00930
Epoch: 24980, Training Loss: 0.00867
Epoch: 24980, Training Loss: 0.01062
Epoch: 24980, Training Loss: 0.00821
Epoch: 24981, Training Loss: 0.00930
Epoch: 24981, Training Loss: 0.00867
Epoch: 24981, Training Loss: 0.01062
Epoch: 24981, Training Loss: 0.00821
Epoch: 24982, Training Loss: 0.00930
Epoch: 24982, Training Loss: 0.00867
Epoch: 24982, Training Loss: 0.01062
Epoch: 24982, Training Loss: 0.00821
Epoch: 24983, Training Loss: 0.00930
Epoch: 24983, Training Loss: 0.00867
Epoch: 24983, Training Loss: 0.01062
Epoch: 24983, Training Loss: 0.00821
Epoch: 24984, Training Loss: 0.00930
Epoch: 24984, Training Loss: 0.00867
Epoch: 24984, Training Loss: 0.01062
Epoch: 24984, Training Loss: 0.00821
Epoch: 24985, Training Loss: 0.00930
Epoch: 24985, Training Loss: 0.00867
Epoch: 24985, Training Loss: 0.01062
Epoch: 24985, Training Loss: 0.00821
Epoch: 24986, Training Loss: 0.00930
Epoch: 24986, Training Loss: 0.00867
Epoch: 24986, Training Loss: 0.01062
Epoch: 24986, Training Loss: 0.00821
Epoch: 24987, Training Loss: 0.00930
Epoch: 24987, Training Loss: 0.00867
Epoch: 24987, Training Loss: 0.01062
Epoch: 24987, Training Loss: 0.00821
Epoch: 24988, Training Loss: 0.00930
Epoch: 24988, Training Loss: 0.00867
Epoch: 24988, Training Loss: 0.01062
Epoch: 24988, Training Loss: 0.00821
Epoch: 24989, Training Loss: 0.00930
Epoch: 24989, Training Loss: 0.00867
Epoch: 24989, Training Loss: 0.01062
Epoch: 24989, Training Loss: 0.00821
Epoch: 24990, Training Loss: 0.00930
Epoch: 24990, Training Loss: 0.00867
Epoch: 24990, Training Loss: 0.01062
Epoch: 24990, Training Loss: 0.00821
Epoch: 24991, Training Loss: 0.00930
Epoch: 24991, Training Loss: 0.00867
Epoch: 24991, Training Loss: 0.01062
Epoch: 24991, Training Loss: 0.00821
Epoch: 24992, Training Loss: 0.00930
Epoch: 24992, Training Loss: 0.00867
Epoch: 24992, Training Loss: 0.01062
Epoch: 24992, Training Loss: 0.00821
Epoch: 24993, Training Loss: 0.00930
Epoch: 24993, Training Loss: 0.00867
Epoch: 24993, Training Loss: 0.01062
Epoch: 24993, Training Loss: 0.00821
Epoch: 24994, Training Loss: 0.00930
Epoch: 24994, Training Loss: 0.00867
Epoch: 24994, Training Loss: 0.01062
Epoch: 24994, Training Loss: 0.00821
Epoch: 24995, Training Loss: 0.00930
Epoch: 24995, Training Loss: 0.00867
Epoch: 24995, Training Loss: 0.01062
Epoch: 24995, Training Loss: 0.00821
Epoch: 24996, Training Loss: 0.00930
Epoch: 24996, Training Loss: 0.00867
Epoch: 24996, Training Loss: 0.01061
Epoch: 24996, Training Loss: 0.00821
Epoch: 24997, Training Loss: 0.00930
Epoch: 24997, Training Loss: 0.00867
Epoch: 24997, Training Loss: 0.01061
Epoch: 24997, Training Loss: 0.00821
Epoch: 24998, Training Loss: 0.00930
Epoch: 24998, Training Loss: 0.00867
Epoch: 24998, Training Loss: 0.01061
Epoch: 24998, Training Loss: 0.00821
Epoch: 24999, Training Loss: 0.00930
Epoch: 24999, Training Loss: 0.00867
Epoch: 24999, Training Loss: 0.01061
Epoch: 24999, Training Loss: 0.00821
Epoch: 25000, Training Loss: 0.00930
Epoch: 25000, Training Loss: 0.00867
Epoch: 25000, Training Loss: 0.01061
Epoch: 25000, Training Loss: 0.00821
Epoch: 25001, Training Loss: 0.00930
Epoch: 25001, Training Loss: 0.00867
Epoch: 25001, Training Loss: 0.01061
Epoch: 25001, Training Loss: 0.00821
Epoch: 25002, Training Loss: 0.00930
Epoch: 25002, Training Loss: 0.00867
Epoch: 25002, Training Loss: 0.01061
Epoch: 25002, Training Loss: 0.00821
Epoch: 25003, Training Loss: 0.00930
Epoch: 25003, Training Loss: 0.00867
Epoch: 25003, Training Loss: 0.01061
Epoch: 25003, Training Loss: 0.00821
Epoch: 25004, Training Loss: 0.00930
Epoch: 25004, Training Loss: 0.00866
Epoch: 25004, Training Loss: 0.01061
Epoch: 25004, Training Loss: 0.00821
Epoch: 25005, Training Loss: 0.00929
Epoch: 25005, Training Loss: 0.00866
Epoch: 25005, Training Loss: 0.01061
Epoch: 25005, Training Loss: 0.00821
Epoch: 25006, Training Loss: 0.00929
Epoch: 25006, Training Loss: 0.00866
Epoch: 25006, Training Loss: 0.01061
Epoch: 25006, Training Loss: 0.00821
Epoch: 25007, Training Loss: 0.00929
Epoch: 25007, Training Loss: 0.00866
Epoch: 25007, Training Loss: 0.01061
Epoch: 25007, Training Loss: 0.00821
Epoch: 25008, Training Loss: 0.00929
Epoch: 25008, Training Loss: 0.00866
Epoch: 25008, Training Loss: 0.01061
Epoch: 25008, Training Loss: 0.00821
Epoch: 25009, Training Loss: 0.00929
Epoch: 25009, Training Loss: 0.00866
Epoch: 25009, Training Loss: 0.01061
Epoch: 25009, Training Loss: 0.00821
Epoch: 25010, Training Loss: 0.00929
Epoch: 25010, Training Loss: 0.00866
Epoch: 25010, Training Loss: 0.01061
Epoch: 25010, Training Loss: 0.00821
Epoch: 25011, Training Loss: 0.00929
Epoch: 25011, Training Loss: 0.00866
Epoch: 25011, Training Loss: 0.01061
Epoch: 25011, Training Loss: 0.00821
Epoch: 25012, Training Loss: 0.00929
Epoch: 25012, Training Loss: 0.00866
Epoch: 25012, Training Loss: 0.01061
Epoch: 25012, Training Loss: 0.00821
Epoch: 25013, Training Loss: 0.00929
Epoch: 25013, Training Loss: 0.00866
Epoch: 25013, Training Loss: 0.01061
Epoch: 25013, Training Loss: 0.00821
Epoch: 25014, Training Loss: 0.00929
Epoch: 25014, Training Loss: 0.00866
Epoch: 25014, Training Loss: 0.01061
Epoch: 25014, Training Loss: 0.00821
Epoch: 25015, Training Loss: 0.00929
Epoch: 25015, Training Loss: 0.00866
Epoch: 25015, Training Loss: 0.01061
Epoch: 25015, Training Loss: 0.00821
Epoch: 25016, Training Loss: 0.00929
Epoch: 25016, Training Loss: 0.00866
Epoch: 25016, Training Loss: 0.01061
Epoch: 25016, Training Loss: 0.00821
Epoch: 25017, Training Loss: 0.00929
Epoch: 25017, Training Loss: 0.00866
Epoch: 25017, Training Loss: 0.01061
Epoch: 25017, Training Loss: 0.00821
Epoch: 25018, Training Loss: 0.00929
Epoch: 25018, Training Loss: 0.00866
Epoch: 25018, Training Loss: 0.01061
Epoch: 25018, Training Loss: 0.00820
Epoch: 25019, Training Loss: 0.00929
Epoch: 25019, Training Loss: 0.00866
Epoch: 25019, Training Loss: 0.01061
Epoch: 25019, Training Loss: 0.00820
Epoch: 25020, Training Loss: 0.00929
Epoch: 25020, Training Loss: 0.00866
Epoch: 25020, Training Loss: 0.01061
Epoch: 25020, Training Loss: 0.00820
Epoch: 25021, Training Loss: 0.00929
Epoch: 25021, Training Loss: 0.00866
Epoch: 25021, Training Loss: 0.01061
Epoch: 25021, Training Loss: 0.00820
Epoch: 25022, Training Loss: 0.00929
Epoch: 25022, Training Loss: 0.00866
Epoch: 25022, Training Loss: 0.01061
Epoch: 25022, Training Loss: 0.00820
Epoch: 25023, Training Loss: 0.00929
Epoch: 25023, Training Loss: 0.00866
Epoch: 25023, Training Loss: 0.01061
Epoch: 25023, Training Loss: 0.00820
Epoch: 25024, Training Loss: 0.00929
Epoch: 25024, Training Loss: 0.00866
Epoch: 25024, Training Loss: 0.01061
Epoch: 25024, Training Loss: 0.00820
Epoch: 25025, Training Loss: 0.00929
Epoch: 25025, Training Loss: 0.00866
Epoch: 25025, Training Loss: 0.01061
Epoch: 25025, Training Loss: 0.00820
Epoch: 25026, Training Loss: 0.00929
Epoch: 25026, Training Loss: 0.00866
Epoch: 25026, Training Loss: 0.01061
Epoch: 25026, Training Loss: 0.00820
Epoch: 25027, Training Loss: 0.00929
Epoch: 25027, Training Loss: 0.00866
Epoch: 25027, Training Loss: 0.01061
Epoch: 25027, Training Loss: 0.00820
Epoch: 25028, Training Loss: 0.00929
Epoch: 25028, Training Loss: 0.00866
Epoch: 25028, Training Loss: 0.01061
Epoch: 25028, Training Loss: 0.00820
Epoch: 25029, Training Loss: 0.00929
Epoch: 25029, Training Loss: 0.00866
Epoch: 25029, Training Loss: 0.01061
Epoch: 25029, Training Loss: 0.00820
Epoch: 25030, Training Loss: 0.00929
Epoch: 25030, Training Loss: 0.00866
Epoch: 25030, Training Loss: 0.01061
Epoch: 25030, Training Loss: 0.00820
Epoch: 25031, Training Loss: 0.00929
Epoch: 25031, Training Loss: 0.00866
Epoch: 25031, Training Loss: 0.01061
Epoch: 25031, Training Loss: 0.00820
Epoch: 25032, Training Loss: 0.00929
Epoch: 25032, Training Loss: 0.00866
Epoch: 25032, Training Loss: 0.01061
Epoch: 25032, Training Loss: 0.00820
Epoch: 25033, Training Loss: 0.00929
Epoch: 25033, Training Loss: 0.00866
Epoch: 25033, Training Loss: 0.01061
Epoch: 25033, Training Loss: 0.00820
Epoch: 25034, Training Loss: 0.00929
Epoch: 25034, Training Loss: 0.00866
Epoch: 25034, Training Loss: 0.01061
Epoch: 25034, Training Loss: 0.00820
Epoch: 25035, Training Loss: 0.00929
Epoch: 25035, Training Loss: 0.00866
Epoch: 25035, Training Loss: 0.01061
Epoch: 25035, Training Loss: 0.00820
Epoch: 25036, Training Loss: 0.00929
Epoch: 25036, Training Loss: 0.00866
Epoch: 25036, Training Loss: 0.01061
Epoch: 25036, Training Loss: 0.00820
Epoch: 25037, Training Loss: 0.00929
Epoch: 25037, Training Loss: 0.00866
Epoch: 25037, Training Loss: 0.01061
Epoch: 25037, Training Loss: 0.00820
Epoch: 25038, Training Loss: 0.00929
Epoch: 25038, Training Loss: 0.00866
Epoch: 25038, Training Loss: 0.01061
Epoch: 25038, Training Loss: 0.00820
Epoch: 25039, Training Loss: 0.00929
Epoch: 25039, Training Loss: 0.00866
Epoch: 25039, Training Loss: 0.01061
Epoch: 25039, Training Loss: 0.00820
Epoch: 25040, Training Loss: 0.00929
Epoch: 25040, Training Loss: 0.00866
Epoch: 25040, Training Loss: 0.01060
Epoch: 25040, Training Loss: 0.00820
Epoch: 25041, Training Loss: 0.00929
Epoch: 25041, Training Loss: 0.00866
Epoch: 25041, Training Loss: 0.01060
Epoch: 25041, Training Loss: 0.00820
Epoch: 25042, Training Loss: 0.00929
Epoch: 25042, Training Loss: 0.00866
Epoch: 25042, Training Loss: 0.01060
Epoch: 25042, Training Loss: 0.00820
Epoch: 25043, Training Loss: 0.00929
Epoch: 25043, Training Loss: 0.00866
Epoch: 25043, Training Loss: 0.01060
Epoch: 25043, Training Loss: 0.00820
Epoch: 25044, Training Loss: 0.00929
Epoch: 25044, Training Loss: 0.00866
Epoch: 25044, Training Loss: 0.01060
Epoch: 25044, Training Loss: 0.00820
Epoch: 25045, Training Loss: 0.00929
Epoch: 25045, Training Loss: 0.00866
Epoch: 25045, Training Loss: 0.01060
Epoch: 25045, Training Loss: 0.00820
Epoch: 25046, Training Loss: 0.00929
Epoch: 25046, Training Loss: 0.00866
Epoch: 25046, Training Loss: 0.01060
Epoch: 25046, Training Loss: 0.00820
Epoch: 25047, Training Loss: 0.00929
Epoch: 25047, Training Loss: 0.00866
Epoch: 25047, Training Loss: 0.01060
Epoch: 25047, Training Loss: 0.00820
Epoch: 25048, Training Loss: 0.00929
Epoch: 25048, Training Loss: 0.00866
Epoch: 25048, Training Loss: 0.01060
Epoch: 25048, Training Loss: 0.00820
Epoch: 25049, Training Loss: 0.00929
Epoch: 25049, Training Loss: 0.00866
Epoch: 25049, Training Loss: 0.01060
Epoch: 25049, Training Loss: 0.00820
Epoch: 25050, Training Loss: 0.00929
Epoch: 25050, Training Loss: 0.00866
Epoch: 25050, Training Loss: 0.01060
Epoch: 25050, Training Loss: 0.00820
Epoch: 25051, Training Loss: 0.00929
Epoch: 25051, Training Loss: 0.00866
Epoch: 25051, Training Loss: 0.01060
Epoch: 25051, Training Loss: 0.00820
Epoch: 25052, Training Loss: 0.00929
Epoch: 25052, Training Loss: 0.00866
Epoch: 25052, Training Loss: 0.01060
Epoch: 25052, Training Loss: 0.00820
Epoch: 25053, Training Loss: 0.00929
Epoch: 25053, Training Loss: 0.00866
Epoch: 25053, Training Loss: 0.01060
Epoch: 25053, Training Loss: 0.00820
Epoch: 25054, Training Loss: 0.00929
Epoch: 25054, Training Loss: 0.00866
Epoch: 25054, Training Loss: 0.01060
Epoch: 25054, Training Loss: 0.00820
Epoch: 25055, Training Loss: 0.00928
Epoch: 25055, Training Loss: 0.00866
Epoch: 25055, Training Loss: 0.01060
Epoch: 25055, Training Loss: 0.00820
Epoch: 25056, Training Loss: 0.00928
Epoch: 25056, Training Loss: 0.00866
Epoch: 25056, Training Loss: 0.01060
Epoch: 25056, Training Loss: 0.00820
Epoch: 25057, Training Loss: 0.00928
Epoch: 25057, Training Loss: 0.00866
Epoch: 25057, Training Loss: 0.01060
Epoch: 25057, Training Loss: 0.00820
Epoch: 25058, Training Loss: 0.00928
Epoch: 25058, Training Loss: 0.00866
Epoch: 25058, Training Loss: 0.01060
Epoch: 25058, Training Loss: 0.00820
Epoch: 25059, Training Loss: 0.00928
Epoch: 25059, Training Loss: 0.00865
Epoch: 25059, Training Loss: 0.01060
Epoch: 25059, Training Loss: 0.00820
Epoch: 25060, Training Loss: 0.00928
Epoch: 25060, Training Loss: 0.00865
Epoch: 25060, Training Loss: 0.01060
Epoch: 25060, Training Loss: 0.00820
Epoch: 25061, Training Loss: 0.00928
Epoch: 25061, Training Loss: 0.00865
Epoch: 25061, Training Loss: 0.01060
Epoch: 25061, Training Loss: 0.00820
Epoch: 25062, Training Loss: 0.00928
Epoch: 25062, Training Loss: 0.00865
Epoch: 25062, Training Loss: 0.01060
Epoch: 25062, Training Loss: 0.00820
Epoch: 25063, Training Loss: 0.00928
Epoch: 25063, Training Loss: 0.00865
Epoch: 25063, Training Loss: 0.01060
Epoch: 25063, Training Loss: 0.00820
Epoch: 25064, Training Loss: 0.00928
Epoch: 25064, Training Loss: 0.00865
Epoch: 25064, Training Loss: 0.01060
Epoch: 25064, Training Loss: 0.00820
Epoch: 25065, Training Loss: 0.00928
Epoch: 25065, Training Loss: 0.00865
Epoch: 25065, Training Loss: 0.01060
Epoch: 25065, Training Loss: 0.00820
Epoch: 25066, Training Loss: 0.00928
Epoch: 25066, Training Loss: 0.00865
Epoch: 25066, Training Loss: 0.01060
Epoch: 25066, Training Loss: 0.00820
Epoch: 25067, Training Loss: 0.00928
Epoch: 25067, Training Loss: 0.00865
Epoch: 25067, Training Loss: 0.01060
Epoch: 25067, Training Loss: 0.00820
Epoch: 25068, Training Loss: 0.00928
Epoch: 25068, Training Loss: 0.00865
Epoch: 25068, Training Loss: 0.01060
Epoch: 25068, Training Loss: 0.00820
Epoch: 25069, Training Loss: 0.00928
Epoch: 25069, Training Loss: 0.00865
Epoch: 25069, Training Loss: 0.01060
Epoch: 25069, Training Loss: 0.00820
Epoch: 25070, Training Loss: 0.00928
Epoch: 25070, Training Loss: 0.00865
Epoch: 25070, Training Loss: 0.01060
Epoch: 25070, Training Loss: 0.00820
Epoch: 25071, Training Loss: 0.00928
Epoch: 25071, Training Loss: 0.00865
Epoch: 25071, Training Loss: 0.01060
Epoch: 25071, Training Loss: 0.00820
Epoch: 25072, Training Loss: 0.00928
Epoch: 25072, Training Loss: 0.00865
Epoch: 25072, Training Loss: 0.01060
Epoch: 25072, Training Loss: 0.00820
Epoch: 25073, Training Loss: 0.00928
Epoch: 25073, Training Loss: 0.00865
Epoch: 25073, Training Loss: 0.01060
Epoch: 25073, Training Loss: 0.00820
Epoch: 25074, Training Loss: 0.00928
Epoch: 25074, Training Loss: 0.00865
Epoch: 25074, Training Loss: 0.01060
Epoch: 25074, Training Loss: 0.00820
Epoch: 25075, Training Loss: 0.00928
Epoch: 25075, Training Loss: 0.00865
Epoch: 25075, Training Loss: 0.01060
Epoch: 25075, Training Loss: 0.00820
Epoch: 25076, Training Loss: 0.00928
Epoch: 25076, Training Loss: 0.00865
Epoch: 25076, Training Loss: 0.01060
Epoch: 25076, Training Loss: 0.00819
Epoch: 25077, Training Loss: 0.00928
Epoch: 25077, Training Loss: 0.00865
Epoch: 25077, Training Loss: 0.01060
Epoch: 25077, Training Loss: 0.00819
Epoch: 25078, Training Loss: 0.00928
Epoch: 25078, Training Loss: 0.00865
Epoch: 25078, Training Loss: 0.01060
Epoch: 25078, Training Loss: 0.00819
Epoch: 25079, Training Loss: 0.00928
Epoch: 25079, Training Loss: 0.00865
Epoch: 25079, Training Loss: 0.01060
Epoch: 25079, Training Loss: 0.00819
Epoch: 25080, Training Loss: 0.00928
Epoch: 25080, Training Loss: 0.00865
Epoch: 25080, Training Loss: 0.01060
Epoch: 25080, Training Loss: 0.00819
Epoch: 25081, Training Loss: 0.00928
Epoch: 25081, Training Loss: 0.00865
Epoch: 25081, Training Loss: 0.01060
Epoch: 25081, Training Loss: 0.00819
Epoch: 25082, Training Loss: 0.00928
Epoch: 25082, Training Loss: 0.00865
Epoch: 25082, Training Loss: 0.01060
Epoch: 25082, Training Loss: 0.00819
Epoch: 25083, Training Loss: 0.00928
Epoch: 25083, Training Loss: 0.00865
Epoch: 25083, Training Loss: 0.01060
Epoch: 25083, Training Loss: 0.00819
Epoch: 25084, Training Loss: 0.00928
Epoch: 25084, Training Loss: 0.00865
Epoch: 25084, Training Loss: 0.01059
Epoch: 25084, Training Loss: 0.00819
Epoch: 25085, Training Loss: 0.00928
Epoch: 25085, Training Loss: 0.00865
Epoch: 25085, Training Loss: 0.01059
Epoch: 25085, Training Loss: 0.00819
Epoch: 25086, Training Loss: 0.00928
Epoch: 25086, Training Loss: 0.00865
Epoch: 25086, Training Loss: 0.01059
Epoch: 25086, Training Loss: 0.00819
Epoch: 25087, Training Loss: 0.00928
Epoch: 25087, Training Loss: 0.00865
Epoch: 25087, Training Loss: 0.01059
Epoch: 25087, Training Loss: 0.00819
Epoch: 25088, Training Loss: 0.00928
Epoch: 25088, Training Loss: 0.00865
Epoch: 25088, Training Loss: 0.01059
Epoch: 25088, Training Loss: 0.00819
Epoch: 25089, Training Loss: 0.00928
Epoch: 25089, Training Loss: 0.00865
Epoch: 25089, Training Loss: 0.01059
Epoch: 25089, Training Loss: 0.00819
Epoch: 25090, Training Loss: 0.00928
Epoch: 25090, Training Loss: 0.00865
Epoch: 25090, Training Loss: 0.01059
Epoch: 25090, Training Loss: 0.00819
Epoch: 25091, Training Loss: 0.00928
Epoch: 25091, Training Loss: 0.00865
Epoch: 25091, Training Loss: 0.01059
Epoch: 25091, Training Loss: 0.00819
Epoch: 25092, Training Loss: 0.00928
Epoch: 25092, Training Loss: 0.00865
Epoch: 25092, Training Loss: 0.01059
Epoch: 25092, Training Loss: 0.00819
Epoch: 25093, Training Loss: 0.00928
Epoch: 25093, Training Loss: 0.00865
Epoch: 25093, Training Loss: 0.01059
Epoch: 25093, Training Loss: 0.00819
Epoch: 25094, Training Loss: 0.00928
Epoch: 25094, Training Loss: 0.00865
Epoch: 25094, Training Loss: 0.01059
Epoch: 25094, Training Loss: 0.00819
Epoch: 25095, Training Loss: 0.00928
Epoch: 25095, Training Loss: 0.00865
Epoch: 25095, Training Loss: 0.01059
Epoch: 25095, Training Loss: 0.00819
Epoch: 25096, Training Loss: 0.00928
Epoch: 25096, Training Loss: 0.00865
Epoch: 25096, Training Loss: 0.01059
Epoch: 25096, Training Loss: 0.00819
Epoch: 25097, Training Loss: 0.00928
Epoch: 25097, Training Loss: 0.00865
Epoch: 25097, Training Loss: 0.01059
Epoch: 25097, Training Loss: 0.00819
Epoch: 25098, Training Loss: 0.00928
Epoch: 25098, Training Loss: 0.00865
Epoch: 25098, Training Loss: 0.01059
Epoch: 25098, Training Loss: 0.00819
Epoch: 25099, Training Loss: 0.00928
Epoch: 25099, Training Loss: 0.00865
Epoch: 25099, Training Loss: 0.01059
Epoch: 25099, Training Loss: 0.00819
Epoch: 25100, Training Loss: 0.00928
Epoch: 25100, Training Loss: 0.00865
Epoch: 25100, Training Loss: 0.01059
Epoch: 25100, Training Loss: 0.00819
Epoch: 25101, Training Loss: 0.00928
Epoch: 25101, Training Loss: 0.00865
Epoch: 25101, Training Loss: 0.01059
Epoch: 25101, Training Loss: 0.00819
Epoch: 25102, Training Loss: 0.00928
Epoch: 25102, Training Loss: 0.00865
Epoch: 25102, Training Loss: 0.01059
Epoch: 25102, Training Loss: 0.00819
Epoch: 25103, Training Loss: 0.00928
Epoch: 25103, Training Loss: 0.00865
Epoch: 25103, Training Loss: 0.01059
Epoch: 25103, Training Loss: 0.00819
Epoch: 25104, Training Loss: 0.00928
Epoch: 25104, Training Loss: 0.00865
Epoch: 25104, Training Loss: 0.01059
Epoch: 25104, Training Loss: 0.00819
Epoch: 25105, Training Loss: 0.00928
Epoch: 25105, Training Loss: 0.00865
Epoch: 25105, Training Loss: 0.01059
Epoch: 25105, Training Loss: 0.00819
Epoch: 25106, Training Loss: 0.00927
Epoch: 25106, Training Loss: 0.00865
Epoch: 25106, Training Loss: 0.01059
Epoch: 25106, Training Loss: 0.00819
Epoch: 25107, Training Loss: 0.00927
Epoch: 25107, Training Loss: 0.00865
Epoch: 25107, Training Loss: 0.01059
Epoch: 25107, Training Loss: 0.00819
Epoch: 25108, Training Loss: 0.00927
Epoch: 25108, Training Loss: 0.00865
Epoch: 25108, Training Loss: 0.01059
Epoch: 25108, Training Loss: 0.00819
Epoch: 25109, Training Loss: 0.00927
Epoch: 25109, Training Loss: 0.00865
Epoch: 25109, Training Loss: 0.01059
Epoch: 25109, Training Loss: 0.00819
Epoch: 25110, Training Loss: 0.00927
Epoch: 25110, Training Loss: 0.00865
Epoch: 25110, Training Loss: 0.01059
Epoch: 25110, Training Loss: 0.00819
Epoch: 25111, Training Loss: 0.00927
Epoch: 25111, Training Loss: 0.00865
Epoch: 25111, Training Loss: 0.01059
Epoch: 25111, Training Loss: 0.00819
Epoch: 25112, Training Loss: 0.00927
Epoch: 25112, Training Loss: 0.00865
Epoch: 25112, Training Loss: 0.01059
Epoch: 25112, Training Loss: 0.00819
Epoch: 25113, Training Loss: 0.00927
Epoch: 25113, Training Loss: 0.00865
Epoch: 25113, Training Loss: 0.01059
Epoch: 25113, Training Loss: 0.00819
Epoch: 25114, Training Loss: 0.00927
Epoch: 25114, Training Loss: 0.00865
Epoch: 25114, Training Loss: 0.01059
Epoch: 25114, Training Loss: 0.00819
Epoch: 25115, Training Loss: 0.00927
Epoch: 25115, Training Loss: 0.00864
Epoch: 25115, Training Loss: 0.01059
Epoch: 25115, Training Loss: 0.00819
Epoch: 25116, Training Loss: 0.00927
Epoch: 25116, Training Loss: 0.00864
Epoch: 25116, Training Loss: 0.01059
Epoch: 25116, Training Loss: 0.00819
Epoch: 25117, Training Loss: 0.00927
Epoch: 25117, Training Loss: 0.00864
Epoch: 25117, Training Loss: 0.01059
Epoch: 25117, Training Loss: 0.00819
Epoch: 25118, Training Loss: 0.00927
Epoch: 25118, Training Loss: 0.00864
Epoch: 25118, Training Loss: 0.01059
Epoch: 25118, Training Loss: 0.00819
Epoch: 25119, Training Loss: 0.00927
Epoch: 25119, Training Loss: 0.00864
Epoch: 25119, Training Loss: 0.01059
Epoch: 25119, Training Loss: 0.00819
Epoch: 25120, Training Loss: 0.00927
Epoch: 25120, Training Loss: 0.00864
Epoch: 25120, Training Loss: 0.01059
Epoch: 25120, Training Loss: 0.00819
Epoch: 25121, Training Loss: 0.00927
Epoch: 25121, Training Loss: 0.00864
Epoch: 25121, Training Loss: 0.01059
Epoch: 25121, Training Loss: 0.00819
Epoch: 25122, Training Loss: 0.00927
Epoch: 25122, Training Loss: 0.00864
Epoch: 25122, Training Loss: 0.01059
Epoch: 25122, Training Loss: 0.00819
Epoch: 25123, Training Loss: 0.00927
Epoch: 25123, Training Loss: 0.00864
Epoch: 25123, Training Loss: 0.01059
Epoch: 25123, Training Loss: 0.00819
Epoch: 25124, Training Loss: 0.00927
Epoch: 25124, Training Loss: 0.00864
Epoch: 25124, Training Loss: 0.01059
Epoch: 25124, Training Loss: 0.00819
Epoch: 25125, Training Loss: 0.00927
Epoch: 25125, Training Loss: 0.00864
Epoch: 25125, Training Loss: 0.01059
Epoch: 25125, Training Loss: 0.00819
Epoch: 25126, Training Loss: 0.00927
Epoch: 25126, Training Loss: 0.00864
[... repetitive output elided: the four per-pattern losses decrease only marginally, from (0.00927, 0.00864, 0.01059, 0.00819) at epoch 25126 to (0.00922, 0.00860, 0.01053, 0.00815) by epoch 25368 ...]
Epoch: 25369, Training Loss: 0.00922
Epoch: 25369, Training Loss: 0.00860
Epoch: 25369, Training Loss: 0.01053
Epoch: 25369, Training Loss: 0.00814
Epoch: 25370, Training Loss: 0.00922
Epoch: 25370, Training Loss: 0.00860
Epoch: 25370, Training Loss: 0.01053
Epoch: 25370, Training Loss: 0.00814
Epoch: 25371, Training Loss: 0.00922
Epoch: 25371, Training Loss: 0.00860
Epoch: 25371, Training Loss: 0.01053
Epoch: 25371, Training Loss: 0.00814
Epoch: 25372, Training Loss: 0.00922
Epoch: 25372, Training Loss: 0.00860
Epoch: 25372, Training Loss: 0.01053
Epoch: 25372, Training Loss: 0.00814
Epoch: 25373, Training Loss: 0.00922
Epoch: 25373, Training Loss: 0.00860
Epoch: 25373, Training Loss: 0.01053
Epoch: 25373, Training Loss: 0.00814
Epoch: 25374, Training Loss: 0.00922
Epoch: 25374, Training Loss: 0.00860
Epoch: 25374, Training Loss: 0.01053
Epoch: 25374, Training Loss: 0.00814
Epoch: 25375, Training Loss: 0.00922
Epoch: 25375, Training Loss: 0.00860
Epoch: 25375, Training Loss: 0.01053
Epoch: 25375, Training Loss: 0.00814
Epoch: 25376, Training Loss: 0.00922
Epoch: 25376, Training Loss: 0.00860
Epoch: 25376, Training Loss: 0.01053
Epoch: 25376, Training Loss: 0.00814
Epoch: 25377, Training Loss: 0.00922
Epoch: 25377, Training Loss: 0.00860
Epoch: 25377, Training Loss: 0.01053
Epoch: 25377, Training Loss: 0.00814
Epoch: 25378, Training Loss: 0.00922
Epoch: 25378, Training Loss: 0.00860
Epoch: 25378, Training Loss: 0.01053
Epoch: 25378, Training Loss: 0.00814
Epoch: 25379, Training Loss: 0.00922
Epoch: 25379, Training Loss: 0.00860
Epoch: 25379, Training Loss: 0.01053
Epoch: 25379, Training Loss: 0.00814
Epoch: 25380, Training Loss: 0.00922
Epoch: 25380, Training Loss: 0.00860
Epoch: 25380, Training Loss: 0.01053
Epoch: 25380, Training Loss: 0.00814
Epoch: 25381, Training Loss: 0.00922
Epoch: 25381, Training Loss: 0.00860
Epoch: 25381, Training Loss: 0.01053
Epoch: 25381, Training Loss: 0.00814
Epoch: 25382, Training Loss: 0.00922
Epoch: 25382, Training Loss: 0.00860
Epoch: 25382, Training Loss: 0.01053
Epoch: 25382, Training Loss: 0.00814
Epoch: 25383, Training Loss: 0.00922
Epoch: 25383, Training Loss: 0.00860
Epoch: 25383, Training Loss: 0.01053
Epoch: 25383, Training Loss: 0.00814
Epoch: 25384, Training Loss: 0.00922
Epoch: 25384, Training Loss: 0.00860
Epoch: 25384, Training Loss: 0.01053
Epoch: 25384, Training Loss: 0.00814
Epoch: 25385, Training Loss: 0.00922
Epoch: 25385, Training Loss: 0.00860
Epoch: 25385, Training Loss: 0.01053
Epoch: 25385, Training Loss: 0.00814
Epoch: 25386, Training Loss: 0.00922
Epoch: 25386, Training Loss: 0.00860
Epoch: 25386, Training Loss: 0.01053
Epoch: 25386, Training Loss: 0.00814
Epoch: 25387, Training Loss: 0.00922
Epoch: 25387, Training Loss: 0.00860
Epoch: 25387, Training Loss: 0.01053
Epoch: 25387, Training Loss: 0.00814
Epoch: 25388, Training Loss: 0.00922
Epoch: 25388, Training Loss: 0.00860
Epoch: 25388, Training Loss: 0.01053
Epoch: 25388, Training Loss: 0.00814
Epoch: 25389, Training Loss: 0.00922
Epoch: 25389, Training Loss: 0.00860
Epoch: 25389, Training Loss: 0.01053
Epoch: 25389, Training Loss: 0.00814
Epoch: 25390, Training Loss: 0.00922
Epoch: 25390, Training Loss: 0.00860
Epoch: 25390, Training Loss: 0.01053
Epoch: 25390, Training Loss: 0.00814
Epoch: 25391, Training Loss: 0.00922
Epoch: 25391, Training Loss: 0.00860
Epoch: 25391, Training Loss: 0.01053
Epoch: 25391, Training Loss: 0.00814
Epoch: 25392, Training Loss: 0.00922
Epoch: 25392, Training Loss: 0.00860
Epoch: 25392, Training Loss: 0.01053
Epoch: 25392, Training Loss: 0.00814
Epoch: 25393, Training Loss: 0.00922
Epoch: 25393, Training Loss: 0.00859
Epoch: 25393, Training Loss: 0.01053
Epoch: 25393, Training Loss: 0.00814
Epoch: 25394, Training Loss: 0.00922
Epoch: 25394, Training Loss: 0.00859
Epoch: 25394, Training Loss: 0.01053
Epoch: 25394, Training Loss: 0.00814
Epoch: 25395, Training Loss: 0.00922
Epoch: 25395, Training Loss: 0.00859
Epoch: 25395, Training Loss: 0.01053
Epoch: 25395, Training Loss: 0.00814
Epoch: 25396, Training Loss: 0.00922
Epoch: 25396, Training Loss: 0.00859
Epoch: 25396, Training Loss: 0.01053
Epoch: 25396, Training Loss: 0.00814
Epoch: 25397, Training Loss: 0.00922
Epoch: 25397, Training Loss: 0.00859
Epoch: 25397, Training Loss: 0.01053
Epoch: 25397, Training Loss: 0.00814
Epoch: 25398, Training Loss: 0.00922
Epoch: 25398, Training Loss: 0.00859
Epoch: 25398, Training Loss: 0.01052
Epoch: 25398, Training Loss: 0.00814
Epoch: 25399, Training Loss: 0.00922
Epoch: 25399, Training Loss: 0.00859
Epoch: 25399, Training Loss: 0.01052
Epoch: 25399, Training Loss: 0.00814
Epoch: 25400, Training Loss: 0.00922
Epoch: 25400, Training Loss: 0.00859
Epoch: 25400, Training Loss: 0.01052
Epoch: 25400, Training Loss: 0.00814
Epoch: 25401, Training Loss: 0.00922
Epoch: 25401, Training Loss: 0.00859
Epoch: 25401, Training Loss: 0.01052
Epoch: 25401, Training Loss: 0.00814
Epoch: 25402, Training Loss: 0.00922
Epoch: 25402, Training Loss: 0.00859
Epoch: 25402, Training Loss: 0.01052
Epoch: 25402, Training Loss: 0.00814
Epoch: 25403, Training Loss: 0.00922
Epoch: 25403, Training Loss: 0.00859
Epoch: 25403, Training Loss: 0.01052
Epoch: 25403, Training Loss: 0.00814
Epoch: 25404, Training Loss: 0.00922
Epoch: 25404, Training Loss: 0.00859
Epoch: 25404, Training Loss: 0.01052
Epoch: 25404, Training Loss: 0.00814
Epoch: 25405, Training Loss: 0.00922
Epoch: 25405, Training Loss: 0.00859
Epoch: 25405, Training Loss: 0.01052
Epoch: 25405, Training Loss: 0.00814
Epoch: 25406, Training Loss: 0.00922
Epoch: 25406, Training Loss: 0.00859
Epoch: 25406, Training Loss: 0.01052
Epoch: 25406, Training Loss: 0.00814
Epoch: 25407, Training Loss: 0.00922
Epoch: 25407, Training Loss: 0.00859
Epoch: 25407, Training Loss: 0.01052
Epoch: 25407, Training Loss: 0.00814
Epoch: 25408, Training Loss: 0.00922
Epoch: 25408, Training Loss: 0.00859
Epoch: 25408, Training Loss: 0.01052
Epoch: 25408, Training Loss: 0.00814
Epoch: 25409, Training Loss: 0.00922
Epoch: 25409, Training Loss: 0.00859
Epoch: 25409, Training Loss: 0.01052
Epoch: 25409, Training Loss: 0.00814
Epoch: 25410, Training Loss: 0.00922
Epoch: 25410, Training Loss: 0.00859
Epoch: 25410, Training Loss: 0.01052
Epoch: 25410, Training Loss: 0.00814
Epoch: 25411, Training Loss: 0.00922
Epoch: 25411, Training Loss: 0.00859
Epoch: 25411, Training Loss: 0.01052
Epoch: 25411, Training Loss: 0.00814
Epoch: 25412, Training Loss: 0.00921
Epoch: 25412, Training Loss: 0.00859
Epoch: 25412, Training Loss: 0.01052
Epoch: 25412, Training Loss: 0.00814
Epoch: 25413, Training Loss: 0.00921
Epoch: 25413, Training Loss: 0.00859
Epoch: 25413, Training Loss: 0.01052
Epoch: 25413, Training Loss: 0.00814
Epoch: 25414, Training Loss: 0.00921
Epoch: 25414, Training Loss: 0.00859
Epoch: 25414, Training Loss: 0.01052
Epoch: 25414, Training Loss: 0.00814
Epoch: 25415, Training Loss: 0.00921
Epoch: 25415, Training Loss: 0.00859
Epoch: 25415, Training Loss: 0.01052
Epoch: 25415, Training Loss: 0.00814
Epoch: 25416, Training Loss: 0.00921
Epoch: 25416, Training Loss: 0.00859
Epoch: 25416, Training Loss: 0.01052
Epoch: 25416, Training Loss: 0.00814
Epoch: 25417, Training Loss: 0.00921
Epoch: 25417, Training Loss: 0.00859
Epoch: 25417, Training Loss: 0.01052
Epoch: 25417, Training Loss: 0.00814
Epoch: 25418, Training Loss: 0.00921
Epoch: 25418, Training Loss: 0.00859
Epoch: 25418, Training Loss: 0.01052
Epoch: 25418, Training Loss: 0.00814
Epoch: 25419, Training Loss: 0.00921
Epoch: 25419, Training Loss: 0.00859
Epoch: 25419, Training Loss: 0.01052
Epoch: 25419, Training Loss: 0.00814
Epoch: 25420, Training Loss: 0.00921
Epoch: 25420, Training Loss: 0.00859
Epoch: 25420, Training Loss: 0.01052
Epoch: 25420, Training Loss: 0.00814
Epoch: 25421, Training Loss: 0.00921
Epoch: 25421, Training Loss: 0.00859
Epoch: 25421, Training Loss: 0.01052
Epoch: 25421, Training Loss: 0.00814
Epoch: 25422, Training Loss: 0.00921
Epoch: 25422, Training Loss: 0.00859
Epoch: 25422, Training Loss: 0.01052
Epoch: 25422, Training Loss: 0.00814
Epoch: 25423, Training Loss: 0.00921
Epoch: 25423, Training Loss: 0.00859
Epoch: 25423, Training Loss: 0.01052
Epoch: 25423, Training Loss: 0.00814
Epoch: 25424, Training Loss: 0.00921
Epoch: 25424, Training Loss: 0.00859
Epoch: 25424, Training Loss: 0.01052
Epoch: 25424, Training Loss: 0.00814
Epoch: 25425, Training Loss: 0.00921
Epoch: 25425, Training Loss: 0.00859
Epoch: 25425, Training Loss: 0.01052
Epoch: 25425, Training Loss: 0.00814
Epoch: 25426, Training Loss: 0.00921
Epoch: 25426, Training Loss: 0.00859
Epoch: 25426, Training Loss: 0.01052
Epoch: 25426, Training Loss: 0.00814
Epoch: 25427, Training Loss: 0.00921
Epoch: 25427, Training Loss: 0.00859
Epoch: 25427, Training Loss: 0.01052
Epoch: 25427, Training Loss: 0.00814
Epoch: 25428, Training Loss: 0.00921
Epoch: 25428, Training Loss: 0.00859
Epoch: 25428, Training Loss: 0.01052
Epoch: 25428, Training Loss: 0.00813
Epoch: 25429, Training Loss: 0.00921
Epoch: 25429, Training Loss: 0.00859
Epoch: 25429, Training Loss: 0.01052
Epoch: 25429, Training Loss: 0.00813
Epoch: 25430, Training Loss: 0.00921
Epoch: 25430, Training Loss: 0.00859
Epoch: 25430, Training Loss: 0.01052
Epoch: 25430, Training Loss: 0.00813
Epoch: 25431, Training Loss: 0.00921
Epoch: 25431, Training Loss: 0.00859
Epoch: 25431, Training Loss: 0.01052
Epoch: 25431, Training Loss: 0.00813
Epoch: 25432, Training Loss: 0.00921
Epoch: 25432, Training Loss: 0.00859
Epoch: 25432, Training Loss: 0.01052
Epoch: 25432, Training Loss: 0.00813
Epoch: 25433, Training Loss: 0.00921
Epoch: 25433, Training Loss: 0.00859
Epoch: 25433, Training Loss: 0.01052
Epoch: 25433, Training Loss: 0.00813
Epoch: 25434, Training Loss: 0.00921
Epoch: 25434, Training Loss: 0.00859
Epoch: 25434, Training Loss: 0.01052
Epoch: 25434, Training Loss: 0.00813
Epoch: 25435, Training Loss: 0.00921
Epoch: 25435, Training Loss: 0.00859
Epoch: 25435, Training Loss: 0.01052
Epoch: 25435, Training Loss: 0.00813
Epoch: 25436, Training Loss: 0.00921
Epoch: 25436, Training Loss: 0.00859
Epoch: 25436, Training Loss: 0.01052
Epoch: 25436, Training Loss: 0.00813
Epoch: 25437, Training Loss: 0.00921
Epoch: 25437, Training Loss: 0.00859
Epoch: 25437, Training Loss: 0.01052
Epoch: 25437, Training Loss: 0.00813
Epoch: 25438, Training Loss: 0.00921
Epoch: 25438, Training Loss: 0.00859
Epoch: 25438, Training Loss: 0.01052
Epoch: 25438, Training Loss: 0.00813
Epoch: 25439, Training Loss: 0.00921
Epoch: 25439, Training Loss: 0.00859
Epoch: 25439, Training Loss: 0.01052
Epoch: 25439, Training Loss: 0.00813
Epoch: 25440, Training Loss: 0.00921
Epoch: 25440, Training Loss: 0.00859
Epoch: 25440, Training Loss: 0.01052
Epoch: 25440, Training Loss: 0.00813
Epoch: 25441, Training Loss: 0.00921
Epoch: 25441, Training Loss: 0.00859
Epoch: 25441, Training Loss: 0.01052
Epoch: 25441, Training Loss: 0.00813
Epoch: 25442, Training Loss: 0.00921
Epoch: 25442, Training Loss: 0.00859
Epoch: 25442, Training Loss: 0.01052
Epoch: 25442, Training Loss: 0.00813
Epoch: 25443, Training Loss: 0.00921
Epoch: 25443, Training Loss: 0.00859
Epoch: 25443, Training Loss: 0.01051
Epoch: 25443, Training Loss: 0.00813
Epoch: 25444, Training Loss: 0.00921
Epoch: 25444, Training Loss: 0.00859
Epoch: 25444, Training Loss: 0.01051
Epoch: 25444, Training Loss: 0.00813
Epoch: 25445, Training Loss: 0.00921
Epoch: 25445, Training Loss: 0.00859
Epoch: 25445, Training Loss: 0.01051
Epoch: 25445, Training Loss: 0.00813
Epoch: 25446, Training Loss: 0.00921
Epoch: 25446, Training Loss: 0.00859
Epoch: 25446, Training Loss: 0.01051
Epoch: 25446, Training Loss: 0.00813
Epoch: 25447, Training Loss: 0.00921
Epoch: 25447, Training Loss: 0.00859
Epoch: 25447, Training Loss: 0.01051
Epoch: 25447, Training Loss: 0.00813
Epoch: 25448, Training Loss: 0.00921
Epoch: 25448, Training Loss: 0.00859
Epoch: 25448, Training Loss: 0.01051
Epoch: 25448, Training Loss: 0.00813
Epoch: 25449, Training Loss: 0.00921
Epoch: 25449, Training Loss: 0.00859
Epoch: 25449, Training Loss: 0.01051
Epoch: 25449, Training Loss: 0.00813
Epoch: 25450, Training Loss: 0.00921
Epoch: 25450, Training Loss: 0.00858
Epoch: 25450, Training Loss: 0.01051
Epoch: 25450, Training Loss: 0.00813
Epoch: 25451, Training Loss: 0.00921
Epoch: 25451, Training Loss: 0.00858
Epoch: 25451, Training Loss: 0.01051
Epoch: 25451, Training Loss: 0.00813
Epoch: 25452, Training Loss: 0.00921
Epoch: 25452, Training Loss: 0.00858
Epoch: 25452, Training Loss: 0.01051
Epoch: 25452, Training Loss: 0.00813
Epoch: 25453, Training Loss: 0.00921
Epoch: 25453, Training Loss: 0.00858
Epoch: 25453, Training Loss: 0.01051
Epoch: 25453, Training Loss: 0.00813
Epoch: 25454, Training Loss: 0.00921
Epoch: 25454, Training Loss: 0.00858
Epoch: 25454, Training Loss: 0.01051
Epoch: 25454, Training Loss: 0.00813
Epoch: 25455, Training Loss: 0.00921
Epoch: 25455, Training Loss: 0.00858
Epoch: 25455, Training Loss: 0.01051
Epoch: 25455, Training Loss: 0.00813
Epoch: 25456, Training Loss: 0.00921
Epoch: 25456, Training Loss: 0.00858
Epoch: 25456, Training Loss: 0.01051
Epoch: 25456, Training Loss: 0.00813
Epoch: 25457, Training Loss: 0.00921
Epoch: 25457, Training Loss: 0.00858
Epoch: 25457, Training Loss: 0.01051
Epoch: 25457, Training Loss: 0.00813
Epoch: 25458, Training Loss: 0.00921
Epoch: 25458, Training Loss: 0.00858
Epoch: 25458, Training Loss: 0.01051
Epoch: 25458, Training Loss: 0.00813
Epoch: 25459, Training Loss: 0.00921
Epoch: 25459, Training Loss: 0.00858
Epoch: 25459, Training Loss: 0.01051
Epoch: 25459, Training Loss: 0.00813
Epoch: 25460, Training Loss: 0.00921
Epoch: 25460, Training Loss: 0.00858
Epoch: 25460, Training Loss: 0.01051
Epoch: 25460, Training Loss: 0.00813
Epoch: 25461, Training Loss: 0.00921
Epoch: 25461, Training Loss: 0.00858
Epoch: 25461, Training Loss: 0.01051
Epoch: 25461, Training Loss: 0.00813
Epoch: 25462, Training Loss: 0.00921
Epoch: 25462, Training Loss: 0.00858
Epoch: 25462, Training Loss: 0.01051
Epoch: 25462, Training Loss: 0.00813
Epoch: 25463, Training Loss: 0.00921
Epoch: 25463, Training Loss: 0.00858
Epoch: 25463, Training Loss: 0.01051
Epoch: 25463, Training Loss: 0.00813
Epoch: 25464, Training Loss: 0.00920
Epoch: 25464, Training Loss: 0.00858
Epoch: 25464, Training Loss: 0.01051
Epoch: 25464, Training Loss: 0.00813
Epoch: 25465, Training Loss: 0.00920
Epoch: 25465, Training Loss: 0.00858
Epoch: 25465, Training Loss: 0.01051
Epoch: 25465, Training Loss: 0.00813
Epoch: 25466, Training Loss: 0.00920
Epoch: 25466, Training Loss: 0.00858
Epoch: 25466, Training Loss: 0.01051
Epoch: 25466, Training Loss: 0.00813
Epoch: 25467, Training Loss: 0.00920
Epoch: 25467, Training Loss: 0.00858
Epoch: 25467, Training Loss: 0.01051
Epoch: 25467, Training Loss: 0.00813
Epoch: 25468, Training Loss: 0.00920
Epoch: 25468, Training Loss: 0.00858
Epoch: 25468, Training Loss: 0.01051
Epoch: 25468, Training Loss: 0.00813
Epoch: 25469, Training Loss: 0.00920
Epoch: 25469, Training Loss: 0.00858
Epoch: 25469, Training Loss: 0.01051
Epoch: 25469, Training Loss: 0.00813
Epoch: 25470, Training Loss: 0.00920
Epoch: 25470, Training Loss: 0.00858
Epoch: 25470, Training Loss: 0.01051
Epoch: 25470, Training Loss: 0.00813
Epoch: 25471, Training Loss: 0.00920
Epoch: 25471, Training Loss: 0.00858
Epoch: 25471, Training Loss: 0.01051
Epoch: 25471, Training Loss: 0.00813
Epoch: 25472, Training Loss: 0.00920
Epoch: 25472, Training Loss: 0.00858
Epoch: 25472, Training Loss: 0.01051
Epoch: 25472, Training Loss: 0.00813
Epoch: 25473, Training Loss: 0.00920
Epoch: 25473, Training Loss: 0.00858
Epoch: 25473, Training Loss: 0.01051
Epoch: 25473, Training Loss: 0.00813
Epoch: 25474, Training Loss: 0.00920
Epoch: 25474, Training Loss: 0.00858
Epoch: 25474, Training Loss: 0.01051
Epoch: 25474, Training Loss: 0.00813
Epoch: 25475, Training Loss: 0.00920
Epoch: 25475, Training Loss: 0.00858
Epoch: 25475, Training Loss: 0.01051
Epoch: 25475, Training Loss: 0.00813
Epoch: 25476, Training Loss: 0.00920
Epoch: 25476, Training Loss: 0.00858
Epoch: 25476, Training Loss: 0.01051
Epoch: 25476, Training Loss: 0.00813
Epoch: 25477, Training Loss: 0.00920
Epoch: 25477, Training Loss: 0.00858
Epoch: 25477, Training Loss: 0.01051
Epoch: 25477, Training Loss: 0.00813
Epoch: 25478, Training Loss: 0.00920
Epoch: 25478, Training Loss: 0.00858
Epoch: 25478, Training Loss: 0.01051
Epoch: 25478, Training Loss: 0.00813
Epoch: 25479, Training Loss: 0.00920
Epoch: 25479, Training Loss: 0.00858
Epoch: 25479, Training Loss: 0.01051
Epoch: 25479, Training Loss: 0.00813
Epoch: 25480, Training Loss: 0.00920
Epoch: 25480, Training Loss: 0.00858
Epoch: 25480, Training Loss: 0.01051
Epoch: 25480, Training Loss: 0.00813
Epoch: 25481, Training Loss: 0.00920
Epoch: 25481, Training Loss: 0.00858
Epoch: 25481, Training Loss: 0.01051
Epoch: 25481, Training Loss: 0.00813
Epoch: 25482, Training Loss: 0.00920
Epoch: 25482, Training Loss: 0.00858
Epoch: 25482, Training Loss: 0.01051
Epoch: 25482, Training Loss: 0.00813
Epoch: 25483, Training Loss: 0.00920
Epoch: 25483, Training Loss: 0.00858
Epoch: 25483, Training Loss: 0.01051
Epoch: 25483, Training Loss: 0.00813
Epoch: 25484, Training Loss: 0.00920
Epoch: 25484, Training Loss: 0.00858
Epoch: 25484, Training Loss: 0.01051
Epoch: 25484, Training Loss: 0.00813
Epoch: 25485, Training Loss: 0.00920
Epoch: 25485, Training Loss: 0.00858
Epoch: 25485, Training Loss: 0.01051
Epoch: 25485, Training Loss: 0.00813
Epoch: 25486, Training Loss: 0.00920
Epoch: 25486, Training Loss: 0.00858
Epoch: 25486, Training Loss: 0.01051
Epoch: 25486, Training Loss: 0.00813
Epoch: 25487, Training Loss: 0.00920
Epoch: 25487, Training Loss: 0.00858
Epoch: 25487, Training Loss: 0.01051
Epoch: 25487, Training Loss: 0.00812
Epoch: 25488, Training Loss: 0.00920
Epoch: 25488, Training Loss: 0.00858
Epoch: 25488, Training Loss: 0.01050
Epoch: 25488, Training Loss: 0.00812
Epoch: 25489, Training Loss: 0.00920
Epoch: 25489, Training Loss: 0.00858
Epoch: 25489, Training Loss: 0.01050
Epoch: 25489, Training Loss: 0.00812
Epoch: 25490, Training Loss: 0.00920
Epoch: 25490, Training Loss: 0.00858
Epoch: 25490, Training Loss: 0.01050
Epoch: 25490, Training Loss: 0.00812
Epoch: 25491, Training Loss: 0.00920
Epoch: 25491, Training Loss: 0.00858
Epoch: 25491, Training Loss: 0.01050
Epoch: 25491, Training Loss: 0.00812
Epoch: 25492, Training Loss: 0.00920
Epoch: 25492, Training Loss: 0.00858
Epoch: 25492, Training Loss: 0.01050
Epoch: 25492, Training Loss: 0.00812
Epoch: 25493, Training Loss: 0.00920
Epoch: 25493, Training Loss: 0.00858
Epoch: 25493, Training Loss: 0.01050
Epoch: 25493, Training Loss: 0.00812
Epoch: 25494, Training Loss: 0.00920
Epoch: 25494, Training Loss: 0.00858
Epoch: 25494, Training Loss: 0.01050
Epoch: 25494, Training Loss: 0.00812
Epoch: 25495, Training Loss: 0.00920
Epoch: 25495, Training Loss: 0.00858
Epoch: 25495, Training Loss: 0.01050
Epoch: 25495, Training Loss: 0.00812
Epoch: 25496, Training Loss: 0.00920
Epoch: 25496, Training Loss: 0.00858
Epoch: 25496, Training Loss: 0.01050
Epoch: 25496, Training Loss: 0.00812
Epoch: 25497, Training Loss: 0.00920
Epoch: 25497, Training Loss: 0.00858
Epoch: 25497, Training Loss: 0.01050
Epoch: 25497, Training Loss: 0.00812
Epoch: 25498, Training Loss: 0.00920
Epoch: 25498, Training Loss: 0.00858
Epoch: 25498, Training Loss: 0.01050
Epoch: 25498, Training Loss: 0.00812
Epoch: 25499, Training Loss: 0.00920
Epoch: 25499, Training Loss: 0.00858
Epoch: 25499, Training Loss: 0.01050
Epoch: 25499, Training Loss: 0.00812
Epoch: 25500, Training Loss: 0.00920
Epoch: 25500, Training Loss: 0.00858
Epoch: 25500, Training Loss: 0.01050
Epoch: 25500, Training Loss: 0.00812
Epoch: 25501, Training Loss: 0.00920
Epoch: 25501, Training Loss: 0.00858
Epoch: 25501, Training Loss: 0.01050
Epoch: 25501, Training Loss: 0.00812
Epoch: 25502, Training Loss: 0.00920
Epoch: 25502, Training Loss: 0.00858
Epoch: 25502, Training Loss: 0.01050
Epoch: 25502, Training Loss: 0.00812
Epoch: 25503, Training Loss: 0.00920
Epoch: 25503, Training Loss: 0.00858
Epoch: 25503, Training Loss: 0.01050
Epoch: 25503, Training Loss: 0.00812
Epoch: 25504, Training Loss: 0.00920
Epoch: 25504, Training Loss: 0.00858
Epoch: 25504, Training Loss: 0.01050
Epoch: 25504, Training Loss: 0.00812
Epoch: 25505, Training Loss: 0.00920
Epoch: 25505, Training Loss: 0.00858
Epoch: 25505, Training Loss: 0.01050
Epoch: 25505, Training Loss: 0.00812
Epoch: 25506, Training Loss: 0.00920
Epoch: 25506, Training Loss: 0.00857
Epoch: 25506, Training Loss: 0.01050
Epoch: 25506, Training Loss: 0.00812
Epoch: 25507, Training Loss: 0.00920
Epoch: 25507, Training Loss: 0.00857
Epoch: 25507, Training Loss: 0.01050
Epoch: 25507, Training Loss: 0.00812
Epoch: 25508, Training Loss: 0.00920
Epoch: 25508, Training Loss: 0.00857
Epoch: 25508, Training Loss: 0.01050
Epoch: 25508, Training Loss: 0.00812
Epoch: 25509, Training Loss: 0.00920
Epoch: 25509, Training Loss: 0.00857
Epoch: 25509, Training Loss: 0.01050
Epoch: 25509, Training Loss: 0.00812
Epoch: 25510, Training Loss: 0.00920
Epoch: 25510, Training Loss: 0.00857
Epoch: 25510, Training Loss: 0.01050
Epoch: 25510, Training Loss: 0.00812
Epoch: 25511, Training Loss: 0.00920
Epoch: 25511, Training Loss: 0.00857
Epoch: 25511, Training Loss: 0.01050
Epoch: 25511, Training Loss: 0.00812
Epoch: 25512, Training Loss: 0.00920
Epoch: 25512, Training Loss: 0.00857
Epoch: 25512, Training Loss: 0.01050
Epoch: 25512, Training Loss: 0.00812
Epoch: 25513, Training Loss: 0.00920
Epoch: 25513, Training Loss: 0.00857
Epoch: 25513, Training Loss: 0.01050
Epoch: 25513, Training Loss: 0.00812
Epoch: 25514, Training Loss: 0.00920
Epoch: 25514, Training Loss: 0.00857
Epoch: 25514, Training Loss: 0.01050
Epoch: 25514, Training Loss: 0.00812
Epoch: 25515, Training Loss: 0.00920
Epoch: 25515, Training Loss: 0.00857
Epoch: 25515, Training Loss: 0.01050
Epoch: 25515, Training Loss: 0.00812
Epoch: 25516, Training Loss: 0.00919
Epoch: 25516, Training Loss: 0.00857
Epoch: 25516, Training Loss: 0.01050
Epoch: 25516, Training Loss: 0.00812
Epoch: 25517, Training Loss: 0.00919
Epoch: 25517, Training Loss: 0.00857
Epoch: 25517, Training Loss: 0.01050
Epoch: 25517, Training Loss: 0.00812
Epoch: 25518, Training Loss: 0.00919
Epoch: 25518, Training Loss: 0.00857
Epoch: 25518, Training Loss: 0.01050
Epoch: 25518, Training Loss: 0.00812
Epoch: 25519, Training Loss: 0.00919
Epoch: 25519, Training Loss: 0.00857
Epoch: 25519, Training Loss: 0.01050
Epoch: 25519, Training Loss: 0.00812
Epoch: 25520, Training Loss: 0.00919
Epoch: 25520, Training Loss: 0.00857
Epoch: 25520, Training Loss: 0.01050
Epoch: 25520, Training Loss: 0.00812
Epoch: 25521, Training Loss: 0.00919
Epoch: 25521, Training Loss: 0.00857
Epoch: 25521, Training Loss: 0.01050
Epoch: 25521, Training Loss: 0.00812
Epoch: 25522, Training Loss: 0.00919
Epoch: 25522, Training Loss: 0.00857
Epoch: 25522, Training Loss: 0.01050
Epoch: 25522, Training Loss: 0.00812
Epoch: 25523, Training Loss: 0.00919
Epoch: 25523, Training Loss: 0.00857
Epoch: 25523, Training Loss: 0.01050
Epoch: 25523, Training Loss: 0.00812
Epoch: 25524, Training Loss: 0.00919
Epoch: 25524, Training Loss: 0.00857
Epoch: 25524, Training Loss: 0.01050
Epoch: 25524, Training Loss: 0.00812
Epoch: 25525, Training Loss: 0.00919
Epoch: 25525, Training Loss: 0.00857
Epoch: 25525, Training Loss: 0.01050
Epoch: 25525, Training Loss: 0.00812
Epoch: 25526, Training Loss: 0.00919
Epoch: 25526, Training Loss: 0.00857
Epoch: 25526, Training Loss: 0.01050
Epoch: 25526, Training Loss: 0.00812
Epoch: 25527, Training Loss: 0.00919
Epoch: 25527, Training Loss: 0.00857
Epoch: 25527, Training Loss: 0.01050
Epoch: 25527, Training Loss: 0.00812
Epoch: 25528, Training Loss: 0.00919
Epoch: 25528, Training Loss: 0.00857
Epoch: 25528, Training Loss: 0.01050
Epoch: 25528, Training Loss: 0.00812
Epoch: 25529, Training Loss: 0.00919
Epoch: 25529, Training Loss: 0.00857
Epoch: 25529, Training Loss: 0.01050
Epoch: 25529, Training Loss: 0.00812
Epoch: 25530, Training Loss: 0.00919
Epoch: 25530, Training Loss: 0.00857
Epoch: 25530, Training Loss: 0.01050
Epoch: 25530, Training Loss: 0.00812
Epoch: 25531, Training Loss: 0.00919
Epoch: 25531, Training Loss: 0.00857
Epoch: 25531, Training Loss: 0.01050
Epoch: 25531, Training Loss: 0.00812
Epoch: 25532, Training Loss: 0.00919
Epoch: 25532, Training Loss: 0.00857
Epoch: 25532, Training Loss: 0.01050
Epoch: 25532, Training Loss: 0.00812
Epoch: 25533, Training Loss: 0.00919
Epoch: 25533, Training Loss: 0.00857
Epoch: 25533, Training Loss: 0.01050
Epoch: 25533, Training Loss: 0.00812
Epoch: 25534, Training Loss: 0.00919
Epoch: 25534, Training Loss: 0.00857
Epoch: 25534, Training Loss: 0.01049
Epoch: 25534, Training Loss: 0.00812
Epoch: 25535, Training Loss: 0.00919
Epoch: 25535, Training Loss: 0.00857
Epoch: 25535, Training Loss: 0.01049
Epoch: 25535, Training Loss: 0.00812
Epoch: 25536, Training Loss: 0.00919
Epoch: 25536, Training Loss: 0.00857
Epoch: 25536, Training Loss: 0.01049
Epoch: 25536, Training Loss: 0.00812
Epoch: 25537, Training Loss: 0.00919
Epoch: 25537, Training Loss: 0.00857
Epoch: 25537, Training Loss: 0.01049
Epoch: 25537, Training Loss: 0.00812
Epoch: 25538, Training Loss: 0.00919
Epoch: 25538, Training Loss: 0.00857
Epoch: 25538, Training Loss: 0.01049
Epoch: 25538, Training Loss: 0.00812
Epoch: 25539, Training Loss: 0.00919
Epoch: 25539, Training Loss: 0.00857
Epoch: 25539, Training Loss: 0.01049
Epoch: 25539, Training Loss: 0.00812
Epoch: 25540, Training Loss: 0.00919
Epoch: 25540, Training Loss: 0.00857
Epoch: 25540, Training Loss: 0.01049
Epoch: 25540, Training Loss: 0.00812
Epoch: 25541, Training Loss: 0.00919
Epoch: 25541, Training Loss: 0.00857
Epoch: 25541, Training Loss: 0.01049
Epoch: 25541, Training Loss: 0.00812
Epoch: 25542, Training Loss: 0.00919
Epoch: 25542, Training Loss: 0.00857
Epoch: 25542, Training Loss: 0.01049
Epoch: 25542, Training Loss: 0.00812
Epoch: 25543, Training Loss: 0.00919
Epoch: 25543, Training Loss: 0.00857
Epoch: 25543, Training Loss: 0.01049
Epoch: 25543, Training Loss: 0.00812
Epoch: 25544, Training Loss: 0.00919
Epoch: 25544, Training Loss: 0.00857
Epoch: 25544, Training Loss: 0.01049
Epoch: 25544, Training Loss: 0.00812
Epoch: 25545, Training Loss: 0.00919
Epoch: 25545, Training Loss: 0.00857
Epoch: 25545, Training Loss: 0.01049
Epoch: 25545, Training Loss: 0.00812
Epoch: 25546, Training Loss: 0.00919
Epoch: 25546, Training Loss: 0.00857
Epoch: 25546, Training Loss: 0.01049
Epoch: 25546, Training Loss: 0.00812
Epoch: 25547, Training Loss: 0.00919
Epoch: 25547, Training Loss: 0.00857
Epoch: 25547, Training Loss: 0.01049
Epoch: 25547, Training Loss: 0.00811
Epoch: 25548, Training Loss: 0.00919
Epoch: 25548, Training Loss: 0.00857
Epoch: 25548, Training Loss: 0.01049
Epoch: 25548, Training Loss: 0.00811
Epoch: 25549, Training Loss: 0.00919
Epoch: 25549, Training Loss: 0.00857
Epoch: 25549, Training Loss: 0.01049
Epoch: 25549, Training Loss: 0.00811
Epoch: 25550, Training Loss: 0.00919
Epoch: 25550, Training Loss: 0.00857
Epoch: 25550, Training Loss: 0.01049
Epoch: 25550, Training Loss: 0.00811
Epoch: 25551, Training Loss: 0.00919
Epoch: 25551, Training Loss: 0.00857
Epoch: 25551, Training Loss: 0.01049
Epoch: 25551, Training Loss: 0.00811
Epoch: 25552, Training Loss: 0.00919
Epoch: 25552, Training Loss: 0.00857
Epoch: 25552, Training Loss: 0.01049
Epoch: 25552, Training Loss: 0.00811
Epoch: 25553, Training Loss: 0.00919
Epoch: 25553, Training Loss: 0.00857
Epoch: 25553, Training Loss: 0.01049
Epoch: 25553, Training Loss: 0.00811
Epoch: 25554, Training Loss: 0.00919
Epoch: 25554, Training Loss: 0.00857
Epoch: 25554, Training Loss: 0.01049
Epoch: 25554, Training Loss: 0.00811
Epoch: 25555, Training Loss: 0.00919
Epoch: 25555, Training Loss: 0.00857
Epoch: 25555, Training Loss: 0.01049
Epoch: 25555, Training Loss: 0.00811
Epoch: 25556, Training Loss: 0.00919
Epoch: 25556, Training Loss: 0.00857
Epoch: 25556, Training Loss: 0.01049
Epoch: 25556, Training Loss: 0.00811
Epoch: 25557, Training Loss: 0.00919
Epoch: 25557, Training Loss: 0.00857
Epoch: 25557, Training Loss: 0.01049
Epoch: 25557, Training Loss: 0.00811
Epoch: 25558, Training Loss: 0.00919
Epoch: 25558, Training Loss: 0.00857
Epoch: 25558, Training Loss: 0.01049
Epoch: 25558, Training Loss: 0.00811
... [repetitive per-epoch output truncated: the four per-pattern losses decrease very slowly over epochs 25558-25801] ...
Epoch: 25801, Training Loss: 0.00914
Epoch: 25801, Training Loss: 0.00852
Epoch: 25801, Training Loss: 0.01044
Epoch: 25801, Training Loss: 0.00807
Epoch: 25802, Training Loss: 0.00914
Epoch: 25802, Training Loss: 0.00852
Epoch: 25802, Training Loss: 0.01044
Epoch: 25802, Training Loss: 0.00807
Epoch: 25803, Training Loss: 0.00914
Epoch: 25803, Training Loss: 0.00852
Epoch: 25803, Training Loss: 0.01044
Epoch: 25803, Training Loss: 0.00807
Epoch: 25804, Training Loss: 0.00914
Epoch: 25804, Training Loss: 0.00852
Epoch: 25804, Training Loss: 0.01044
Epoch: 25804, Training Loss: 0.00807
Epoch: 25805, Training Loss: 0.00914
Epoch: 25805, Training Loss: 0.00852
Epoch: 25805, Training Loss: 0.01044
Epoch: 25805, Training Loss: 0.00807
Epoch: 25806, Training Loss: 0.00914
Epoch: 25806, Training Loss: 0.00852
Epoch: 25806, Training Loss: 0.01044
Epoch: 25806, Training Loss: 0.00807
Epoch: 25807, Training Loss: 0.00914
Epoch: 25807, Training Loss: 0.00852
Epoch: 25807, Training Loss: 0.01044
Epoch: 25807, Training Loss: 0.00807
Epoch: 25808, Training Loss: 0.00914
Epoch: 25808, Training Loss: 0.00852
Epoch: 25808, Training Loss: 0.01044
Epoch: 25808, Training Loss: 0.00807
Epoch: 25809, Training Loss: 0.00914
Epoch: 25809, Training Loss: 0.00852
Epoch: 25809, Training Loss: 0.01044
Epoch: 25809, Training Loss: 0.00807
Epoch: 25810, Training Loss: 0.00914
Epoch: 25810, Training Loss: 0.00852
Epoch: 25810, Training Loss: 0.01043
Epoch: 25810, Training Loss: 0.00807
Epoch: 25811, Training Loss: 0.00914
Epoch: 25811, Training Loss: 0.00852
Epoch: 25811, Training Loss: 0.01043
Epoch: 25811, Training Loss: 0.00807
Epoch: 25812, Training Loss: 0.00914
Epoch: 25812, Training Loss: 0.00852
Epoch: 25812, Training Loss: 0.01043
Epoch: 25812, Training Loss: 0.00807
Epoch: 25813, Training Loss: 0.00914
Epoch: 25813, Training Loss: 0.00852
Epoch: 25813, Training Loss: 0.01043
Epoch: 25813, Training Loss: 0.00807
Epoch: 25814, Training Loss: 0.00914
Epoch: 25814, Training Loss: 0.00852
Epoch: 25814, Training Loss: 0.01043
Epoch: 25814, Training Loss: 0.00807
Epoch: 25815, Training Loss: 0.00914
Epoch: 25815, Training Loss: 0.00852
Epoch: 25815, Training Loss: 0.01043
Epoch: 25815, Training Loss: 0.00807
Epoch: 25816, Training Loss: 0.00914
Epoch: 25816, Training Loss: 0.00852
Epoch: 25816, Training Loss: 0.01043
Epoch: 25816, Training Loss: 0.00807
Epoch: 25817, Training Loss: 0.00914
Epoch: 25817, Training Loss: 0.00852
Epoch: 25817, Training Loss: 0.01043
Epoch: 25817, Training Loss: 0.00807
Epoch: 25818, Training Loss: 0.00914
Epoch: 25818, Training Loss: 0.00852
Epoch: 25818, Training Loss: 0.01043
Epoch: 25818, Training Loss: 0.00807
Epoch: 25819, Training Loss: 0.00914
Epoch: 25819, Training Loss: 0.00852
Epoch: 25819, Training Loss: 0.01043
Epoch: 25819, Training Loss: 0.00807
Epoch: 25820, Training Loss: 0.00914
Epoch: 25820, Training Loss: 0.00852
Epoch: 25820, Training Loss: 0.01043
Epoch: 25820, Training Loss: 0.00807
Epoch: 25821, Training Loss: 0.00914
Epoch: 25821, Training Loss: 0.00852
Epoch: 25821, Training Loss: 0.01043
Epoch: 25821, Training Loss: 0.00807
Epoch: 25822, Training Loss: 0.00914
Epoch: 25822, Training Loss: 0.00852
Epoch: 25822, Training Loss: 0.01043
Epoch: 25822, Training Loss: 0.00807
Epoch: 25823, Training Loss: 0.00914
Epoch: 25823, Training Loss: 0.00852
Epoch: 25823, Training Loss: 0.01043
Epoch: 25823, Training Loss: 0.00807
Epoch: 25824, Training Loss: 0.00914
Epoch: 25824, Training Loss: 0.00852
Epoch: 25824, Training Loss: 0.01043
Epoch: 25824, Training Loss: 0.00807
Epoch: 25825, Training Loss: 0.00914
Epoch: 25825, Training Loss: 0.00852
Epoch: 25825, Training Loss: 0.01043
Epoch: 25825, Training Loss: 0.00807
Epoch: 25826, Training Loss: 0.00914
Epoch: 25826, Training Loss: 0.00852
Epoch: 25826, Training Loss: 0.01043
Epoch: 25826, Training Loss: 0.00807
Epoch: 25827, Training Loss: 0.00914
Epoch: 25827, Training Loss: 0.00852
Epoch: 25827, Training Loss: 0.01043
Epoch: 25827, Training Loss: 0.00807
Epoch: 25828, Training Loss: 0.00914
Epoch: 25828, Training Loss: 0.00852
Epoch: 25828, Training Loss: 0.01043
Epoch: 25828, Training Loss: 0.00807
Epoch: 25829, Training Loss: 0.00914
Epoch: 25829, Training Loss: 0.00852
Epoch: 25829, Training Loss: 0.01043
Epoch: 25829, Training Loss: 0.00807
Epoch: 25830, Training Loss: 0.00913
Epoch: 25830, Training Loss: 0.00852
Epoch: 25830, Training Loss: 0.01043
Epoch: 25830, Training Loss: 0.00807
Epoch: 25831, Training Loss: 0.00913
Epoch: 25831, Training Loss: 0.00852
Epoch: 25831, Training Loss: 0.01043
Epoch: 25831, Training Loss: 0.00807
Epoch: 25832, Training Loss: 0.00913
Epoch: 25832, Training Loss: 0.00852
Epoch: 25832, Training Loss: 0.01043
Epoch: 25832, Training Loss: 0.00807
Epoch: 25833, Training Loss: 0.00913
Epoch: 25833, Training Loss: 0.00852
Epoch: 25833, Training Loss: 0.01043
Epoch: 25833, Training Loss: 0.00807
Epoch: 25834, Training Loss: 0.00913
Epoch: 25834, Training Loss: 0.00852
Epoch: 25834, Training Loss: 0.01043
Epoch: 25834, Training Loss: 0.00807
Epoch: 25835, Training Loss: 0.00913
Epoch: 25835, Training Loss: 0.00852
Epoch: 25835, Training Loss: 0.01043
Epoch: 25835, Training Loss: 0.00807
Epoch: 25836, Training Loss: 0.00913
Epoch: 25836, Training Loss: 0.00852
Epoch: 25836, Training Loss: 0.01043
Epoch: 25836, Training Loss: 0.00807
Epoch: 25837, Training Loss: 0.00913
Epoch: 25837, Training Loss: 0.00852
Epoch: 25837, Training Loss: 0.01043
Epoch: 25837, Training Loss: 0.00807
Epoch: 25838, Training Loss: 0.00913
Epoch: 25838, Training Loss: 0.00852
Epoch: 25838, Training Loss: 0.01043
Epoch: 25838, Training Loss: 0.00807
Epoch: 25839, Training Loss: 0.00913
Epoch: 25839, Training Loss: 0.00852
Epoch: 25839, Training Loss: 0.01043
Epoch: 25839, Training Loss: 0.00807
Epoch: 25840, Training Loss: 0.00913
Epoch: 25840, Training Loss: 0.00852
Epoch: 25840, Training Loss: 0.01043
Epoch: 25840, Training Loss: 0.00807
Epoch: 25841, Training Loss: 0.00913
Epoch: 25841, Training Loss: 0.00852
Epoch: 25841, Training Loss: 0.01043
Epoch: 25841, Training Loss: 0.00807
Epoch: 25842, Training Loss: 0.00913
Epoch: 25842, Training Loss: 0.00852
Epoch: 25842, Training Loss: 0.01043
Epoch: 25842, Training Loss: 0.00807
Epoch: 25843, Training Loss: 0.00913
Epoch: 25843, Training Loss: 0.00852
Epoch: 25843, Training Loss: 0.01043
Epoch: 25843, Training Loss: 0.00807
Epoch: 25844, Training Loss: 0.00913
Epoch: 25844, Training Loss: 0.00852
Epoch: 25844, Training Loss: 0.01043
Epoch: 25844, Training Loss: 0.00807
Epoch: 25845, Training Loss: 0.00913
Epoch: 25845, Training Loss: 0.00852
Epoch: 25845, Training Loss: 0.01043
Epoch: 25845, Training Loss: 0.00807
Epoch: 25846, Training Loss: 0.00913
Epoch: 25846, Training Loss: 0.00852
Epoch: 25846, Training Loss: 0.01043
Epoch: 25846, Training Loss: 0.00807
Epoch: 25847, Training Loss: 0.00913
Epoch: 25847, Training Loss: 0.00852
Epoch: 25847, Training Loss: 0.01043
Epoch: 25847, Training Loss: 0.00807
Epoch: 25848, Training Loss: 0.00913
Epoch: 25848, Training Loss: 0.00852
Epoch: 25848, Training Loss: 0.01043
Epoch: 25848, Training Loss: 0.00807
Epoch: 25849, Training Loss: 0.00913
Epoch: 25849, Training Loss: 0.00851
Epoch: 25849, Training Loss: 0.01043
Epoch: 25849, Training Loss: 0.00806
Epoch: 25850, Training Loss: 0.00913
Epoch: 25850, Training Loss: 0.00851
Epoch: 25850, Training Loss: 0.01043
Epoch: 25850, Training Loss: 0.00806
Epoch: 25851, Training Loss: 0.00913
Epoch: 25851, Training Loss: 0.00851
Epoch: 25851, Training Loss: 0.01043
Epoch: 25851, Training Loss: 0.00806
Epoch: 25852, Training Loss: 0.00913
Epoch: 25852, Training Loss: 0.00851
Epoch: 25852, Training Loss: 0.01043
Epoch: 25852, Training Loss: 0.00806
Epoch: 25853, Training Loss: 0.00913
Epoch: 25853, Training Loss: 0.00851
Epoch: 25853, Training Loss: 0.01043
Epoch: 25853, Training Loss: 0.00806
Epoch: 25854, Training Loss: 0.00913
Epoch: 25854, Training Loss: 0.00851
Epoch: 25854, Training Loss: 0.01043
Epoch: 25854, Training Loss: 0.00806
Epoch: 25855, Training Loss: 0.00913
Epoch: 25855, Training Loss: 0.00851
Epoch: 25855, Training Loss: 0.01043
Epoch: 25855, Training Loss: 0.00806
Epoch: 25856, Training Loss: 0.00913
Epoch: 25856, Training Loss: 0.00851
Epoch: 25856, Training Loss: 0.01042
Epoch: 25856, Training Loss: 0.00806
Epoch: 25857, Training Loss: 0.00913
Epoch: 25857, Training Loss: 0.00851
Epoch: 25857, Training Loss: 0.01042
Epoch: 25857, Training Loss: 0.00806
Epoch: 25858, Training Loss: 0.00913
Epoch: 25858, Training Loss: 0.00851
Epoch: 25858, Training Loss: 0.01042
Epoch: 25858, Training Loss: 0.00806
Epoch: 25859, Training Loss: 0.00913
Epoch: 25859, Training Loss: 0.00851
Epoch: 25859, Training Loss: 0.01042
Epoch: 25859, Training Loss: 0.00806
Epoch: 25860, Training Loss: 0.00913
Epoch: 25860, Training Loss: 0.00851
Epoch: 25860, Training Loss: 0.01042
Epoch: 25860, Training Loss: 0.00806
Epoch: 25861, Training Loss: 0.00913
Epoch: 25861, Training Loss: 0.00851
Epoch: 25861, Training Loss: 0.01042
Epoch: 25861, Training Loss: 0.00806
Epoch: 25862, Training Loss: 0.00913
Epoch: 25862, Training Loss: 0.00851
Epoch: 25862, Training Loss: 0.01042
Epoch: 25862, Training Loss: 0.00806
Epoch: 25863, Training Loss: 0.00913
Epoch: 25863, Training Loss: 0.00851
Epoch: 25863, Training Loss: 0.01042
Epoch: 25863, Training Loss: 0.00806
Epoch: 25864, Training Loss: 0.00913
Epoch: 25864, Training Loss: 0.00851
Epoch: 25864, Training Loss: 0.01042
Epoch: 25864, Training Loss: 0.00806
Epoch: 25865, Training Loss: 0.00913
Epoch: 25865, Training Loss: 0.00851
Epoch: 25865, Training Loss: 0.01042
Epoch: 25865, Training Loss: 0.00806
Epoch: 25866, Training Loss: 0.00913
Epoch: 25866, Training Loss: 0.00851
Epoch: 25866, Training Loss: 0.01042
Epoch: 25866, Training Loss: 0.00806
Epoch: 25867, Training Loss: 0.00913
Epoch: 25867, Training Loss: 0.00851
Epoch: 25867, Training Loss: 0.01042
Epoch: 25867, Training Loss: 0.00806
Epoch: 25868, Training Loss: 0.00913
Epoch: 25868, Training Loss: 0.00851
Epoch: 25868, Training Loss: 0.01042
Epoch: 25868, Training Loss: 0.00806
Epoch: 25869, Training Loss: 0.00913
Epoch: 25869, Training Loss: 0.00851
Epoch: 25869, Training Loss: 0.01042
Epoch: 25869, Training Loss: 0.00806
Epoch: 25870, Training Loss: 0.00913
Epoch: 25870, Training Loss: 0.00851
Epoch: 25870, Training Loss: 0.01042
Epoch: 25870, Training Loss: 0.00806
Epoch: 25871, Training Loss: 0.00913
Epoch: 25871, Training Loss: 0.00851
Epoch: 25871, Training Loss: 0.01042
Epoch: 25871, Training Loss: 0.00806
Epoch: 25872, Training Loss: 0.00913
Epoch: 25872, Training Loss: 0.00851
Epoch: 25872, Training Loss: 0.01042
Epoch: 25872, Training Loss: 0.00806
Epoch: 25873, Training Loss: 0.00913
Epoch: 25873, Training Loss: 0.00851
Epoch: 25873, Training Loss: 0.01042
Epoch: 25873, Training Loss: 0.00806
Epoch: 25874, Training Loss: 0.00913
Epoch: 25874, Training Loss: 0.00851
Epoch: 25874, Training Loss: 0.01042
Epoch: 25874, Training Loss: 0.00806
Epoch: 25875, Training Loss: 0.00913
Epoch: 25875, Training Loss: 0.00851
Epoch: 25875, Training Loss: 0.01042
Epoch: 25875, Training Loss: 0.00806
Epoch: 25876, Training Loss: 0.00913
Epoch: 25876, Training Loss: 0.00851
Epoch: 25876, Training Loss: 0.01042
Epoch: 25876, Training Loss: 0.00806
Epoch: 25877, Training Loss: 0.00913
Epoch: 25877, Training Loss: 0.00851
Epoch: 25877, Training Loss: 0.01042
Epoch: 25877, Training Loss: 0.00806
Epoch: 25878, Training Loss: 0.00913
Epoch: 25878, Training Loss: 0.00851
Epoch: 25878, Training Loss: 0.01042
Epoch: 25878, Training Loss: 0.00806
Epoch: 25879, Training Loss: 0.00913
Epoch: 25879, Training Loss: 0.00851
Epoch: 25879, Training Loss: 0.01042
Epoch: 25879, Training Loss: 0.00806
Epoch: 25880, Training Loss: 0.00913
Epoch: 25880, Training Loss: 0.00851
Epoch: 25880, Training Loss: 0.01042
Epoch: 25880, Training Loss: 0.00806
Epoch: 25881, Training Loss: 0.00913
Epoch: 25881, Training Loss: 0.00851
Epoch: 25881, Training Loss: 0.01042
Epoch: 25881, Training Loss: 0.00806
Epoch: 25882, Training Loss: 0.00913
Epoch: 25882, Training Loss: 0.00851
Epoch: 25882, Training Loss: 0.01042
Epoch: 25882, Training Loss: 0.00806
Epoch: 25883, Training Loss: 0.00912
Epoch: 25883, Training Loss: 0.00851
Epoch: 25883, Training Loss: 0.01042
Epoch: 25883, Training Loss: 0.00806
Epoch: 25884, Training Loss: 0.00912
Epoch: 25884, Training Loss: 0.00851
Epoch: 25884, Training Loss: 0.01042
Epoch: 25884, Training Loss: 0.00806
Epoch: 25885, Training Loss: 0.00912
Epoch: 25885, Training Loss: 0.00851
Epoch: 25885, Training Loss: 0.01042
Epoch: 25885, Training Loss: 0.00806
Epoch: 25886, Training Loss: 0.00912
Epoch: 25886, Training Loss: 0.00851
Epoch: 25886, Training Loss: 0.01042
Epoch: 25886, Training Loss: 0.00806
Epoch: 25887, Training Loss: 0.00912
Epoch: 25887, Training Loss: 0.00851
Epoch: 25887, Training Loss: 0.01042
Epoch: 25887, Training Loss: 0.00806
Epoch: 25888, Training Loss: 0.00912
Epoch: 25888, Training Loss: 0.00851
Epoch: 25888, Training Loss: 0.01042
Epoch: 25888, Training Loss: 0.00806
Epoch: 25889, Training Loss: 0.00912
Epoch: 25889, Training Loss: 0.00851
Epoch: 25889, Training Loss: 0.01042
Epoch: 25889, Training Loss: 0.00806
Epoch: 25890, Training Loss: 0.00912
Epoch: 25890, Training Loss: 0.00851
Epoch: 25890, Training Loss: 0.01042
Epoch: 25890, Training Loss: 0.00806
Epoch: 25891, Training Loss: 0.00912
Epoch: 25891, Training Loss: 0.00851
Epoch: 25891, Training Loss: 0.01042
Epoch: 25891, Training Loss: 0.00806
Epoch: 25892, Training Loss: 0.00912
Epoch: 25892, Training Loss: 0.00851
Epoch: 25892, Training Loss: 0.01042
Epoch: 25892, Training Loss: 0.00806
Epoch: 25893, Training Loss: 0.00912
Epoch: 25893, Training Loss: 0.00851
Epoch: 25893, Training Loss: 0.01042
Epoch: 25893, Training Loss: 0.00806
Epoch: 25894, Training Loss: 0.00912
Epoch: 25894, Training Loss: 0.00851
Epoch: 25894, Training Loss: 0.01042
Epoch: 25894, Training Loss: 0.00806
Epoch: 25895, Training Loss: 0.00912
Epoch: 25895, Training Loss: 0.00851
Epoch: 25895, Training Loss: 0.01042
Epoch: 25895, Training Loss: 0.00806
Epoch: 25896, Training Loss: 0.00912
Epoch: 25896, Training Loss: 0.00851
Epoch: 25896, Training Loss: 0.01042
Epoch: 25896, Training Loss: 0.00806
Epoch: 25897, Training Loss: 0.00912
Epoch: 25897, Training Loss: 0.00851
Epoch: 25897, Training Loss: 0.01042
Epoch: 25897, Training Loss: 0.00806
Epoch: 25898, Training Loss: 0.00912
Epoch: 25898, Training Loss: 0.00851
Epoch: 25898, Training Loss: 0.01042
Epoch: 25898, Training Loss: 0.00806
Epoch: 25899, Training Loss: 0.00912
Epoch: 25899, Training Loss: 0.00851
Epoch: 25899, Training Loss: 0.01042
Epoch: 25899, Training Loss: 0.00806
Epoch: 25900, Training Loss: 0.00912
Epoch: 25900, Training Loss: 0.00851
Epoch: 25900, Training Loss: 0.01042
Epoch: 25900, Training Loss: 0.00806
Epoch: 25901, Training Loss: 0.00912
Epoch: 25901, Training Loss: 0.00851
Epoch: 25901, Training Loss: 0.01042
Epoch: 25901, Training Loss: 0.00806
Epoch: 25902, Training Loss: 0.00912
Epoch: 25902, Training Loss: 0.00851
Epoch: 25902, Training Loss: 0.01042
Epoch: 25902, Training Loss: 0.00806
Epoch: 25903, Training Loss: 0.00912
Epoch: 25903, Training Loss: 0.00851
Epoch: 25903, Training Loss: 0.01041
Epoch: 25903, Training Loss: 0.00806
Epoch: 25904, Training Loss: 0.00912
Epoch: 25904, Training Loss: 0.00851
Epoch: 25904, Training Loss: 0.01041
Epoch: 25904, Training Loss: 0.00806
Epoch: 25905, Training Loss: 0.00912
Epoch: 25905, Training Loss: 0.00851
Epoch: 25905, Training Loss: 0.01041
Epoch: 25905, Training Loss: 0.00806
Epoch: 25906, Training Loss: 0.00912
Epoch: 25906, Training Loss: 0.00851
Epoch: 25906, Training Loss: 0.01041
Epoch: 25906, Training Loss: 0.00806
Epoch: 25907, Training Loss: 0.00912
Epoch: 25907, Training Loss: 0.00850
Epoch: 25907, Training Loss: 0.01041
Epoch: 25907, Training Loss: 0.00806
Epoch: 25908, Training Loss: 0.00912
Epoch: 25908, Training Loss: 0.00850
Epoch: 25908, Training Loss: 0.01041
Epoch: 25908, Training Loss: 0.00806
Epoch: 25909, Training Loss: 0.00912
Epoch: 25909, Training Loss: 0.00850
Epoch: 25909, Training Loss: 0.01041
Epoch: 25909, Training Loss: 0.00805
Epoch: 25910, Training Loss: 0.00912
Epoch: 25910, Training Loss: 0.00850
Epoch: 25910, Training Loss: 0.01041
Epoch: 25910, Training Loss: 0.00805
Epoch: 25911, Training Loss: 0.00912
Epoch: 25911, Training Loss: 0.00850
Epoch: 25911, Training Loss: 0.01041
Epoch: 25911, Training Loss: 0.00805
Epoch: 25912, Training Loss: 0.00912
Epoch: 25912, Training Loss: 0.00850
Epoch: 25912, Training Loss: 0.01041
Epoch: 25912, Training Loss: 0.00805
Epoch: 25913, Training Loss: 0.00912
Epoch: 25913, Training Loss: 0.00850
Epoch: 25913, Training Loss: 0.01041
Epoch: 25913, Training Loss: 0.00805
Epoch: 25914, Training Loss: 0.00912
Epoch: 25914, Training Loss: 0.00850
Epoch: 25914, Training Loss: 0.01041
Epoch: 25914, Training Loss: 0.00805
Epoch: 25915, Training Loss: 0.00912
Epoch: 25915, Training Loss: 0.00850
Epoch: 25915, Training Loss: 0.01041
Epoch: 25915, Training Loss: 0.00805
Epoch: 25916, Training Loss: 0.00912
Epoch: 25916, Training Loss: 0.00850
Epoch: 25916, Training Loss: 0.01041
Epoch: 25916, Training Loss: 0.00805
Epoch: 25917, Training Loss: 0.00912
Epoch: 25917, Training Loss: 0.00850
Epoch: 25917, Training Loss: 0.01041
Epoch: 25917, Training Loss: 0.00805
Epoch: 25918, Training Loss: 0.00912
Epoch: 25918, Training Loss: 0.00850
Epoch: 25918, Training Loss: 0.01041
Epoch: 25918, Training Loss: 0.00805
Epoch: 25919, Training Loss: 0.00912
Epoch: 25919, Training Loss: 0.00850
Epoch: 25919, Training Loss: 0.01041
Epoch: 25919, Training Loss: 0.00805
Epoch: 25920, Training Loss: 0.00912
Epoch: 25920, Training Loss: 0.00850
Epoch: 25920, Training Loss: 0.01041
Epoch: 25920, Training Loss: 0.00805
Epoch: 25921, Training Loss: 0.00912
Epoch: 25921, Training Loss: 0.00850
Epoch: 25921, Training Loss: 0.01041
Epoch: 25921, Training Loss: 0.00805
Epoch: 25922, Training Loss: 0.00912
Epoch: 25922, Training Loss: 0.00850
Epoch: 25922, Training Loss: 0.01041
Epoch: 25922, Training Loss: 0.00805
Epoch: 25923, Training Loss: 0.00912
Epoch: 25923, Training Loss: 0.00850
Epoch: 25923, Training Loss: 0.01041
Epoch: 25923, Training Loss: 0.00805
Epoch: 25924, Training Loss: 0.00912
Epoch: 25924, Training Loss: 0.00850
Epoch: 25924, Training Loss: 0.01041
Epoch: 25924, Training Loss: 0.00805
Epoch: 25925, Training Loss: 0.00912
Epoch: 25925, Training Loss: 0.00850
Epoch: 25925, Training Loss: 0.01041
Epoch: 25925, Training Loss: 0.00805
Epoch: 25926, Training Loss: 0.00912
Epoch: 25926, Training Loss: 0.00850
Epoch: 25926, Training Loss: 0.01041
Epoch: 25926, Training Loss: 0.00805
Epoch: 25927, Training Loss: 0.00912
Epoch: 25927, Training Loss: 0.00850
Epoch: 25927, Training Loss: 0.01041
Epoch: 25927, Training Loss: 0.00805
Epoch: 25928, Training Loss: 0.00912
Epoch: 25928, Training Loss: 0.00850
Epoch: 25928, Training Loss: 0.01041
Epoch: 25928, Training Loss: 0.00805
Epoch: 25929, Training Loss: 0.00912
Epoch: 25929, Training Loss: 0.00850
Epoch: 25929, Training Loss: 0.01041
Epoch: 25929, Training Loss: 0.00805
Epoch: 25930, Training Loss: 0.00912
Epoch: 25930, Training Loss: 0.00850
Epoch: 25930, Training Loss: 0.01041
Epoch: 25930, Training Loss: 0.00805
Epoch: 25931, Training Loss: 0.00912
Epoch: 25931, Training Loss: 0.00850
Epoch: 25931, Training Loss: 0.01041
Epoch: 25931, Training Loss: 0.00805
Epoch: 25932, Training Loss: 0.00912
Epoch: 25932, Training Loss: 0.00850
Epoch: 25932, Training Loss: 0.01041
Epoch: 25932, Training Loss: 0.00805
Epoch: 25933, Training Loss: 0.00912
Epoch: 25933, Training Loss: 0.00850
Epoch: 25933, Training Loss: 0.01041
Epoch: 25933, Training Loss: 0.00805
Epoch: 25934, Training Loss: 0.00912
Epoch: 25934, Training Loss: 0.00850
Epoch: 25934, Training Loss: 0.01041
Epoch: 25934, Training Loss: 0.00805
Epoch: 25935, Training Loss: 0.00912
Epoch: 25935, Training Loss: 0.00850
Epoch: 25935, Training Loss: 0.01041
Epoch: 25935, Training Loss: 0.00805
Epoch: 25936, Training Loss: 0.00911
Epoch: 25936, Training Loss: 0.00850
Epoch: 25936, Training Loss: 0.01041
Epoch: 25936, Training Loss: 0.00805
Epoch: 25937, Training Loss: 0.00911
Epoch: 25937, Training Loss: 0.00850
Epoch: 25937, Training Loss: 0.01041
Epoch: 25937, Training Loss: 0.00805
Epoch: 25938, Training Loss: 0.00911
Epoch: 25938, Training Loss: 0.00850
Epoch: 25938, Training Loss: 0.01041
Epoch: 25938, Training Loss: 0.00805
Epoch: 25939, Training Loss: 0.00911
Epoch: 25939, Training Loss: 0.00850
Epoch: 25939, Training Loss: 0.01041
Epoch: 25939, Training Loss: 0.00805
Epoch: 25940, Training Loss: 0.00911
Epoch: 25940, Training Loss: 0.00850
Epoch: 25940, Training Loss: 0.01041
Epoch: 25940, Training Loss: 0.00805
Epoch: 25941, Training Loss: 0.00911
Epoch: 25941, Training Loss: 0.00850
Epoch: 25941, Training Loss: 0.01041
Epoch: 25941, Training Loss: 0.00805
Epoch: 25942, Training Loss: 0.00911
Epoch: 25942, Training Loss: 0.00850
Epoch: 25942, Training Loss: 0.01041
Epoch: 25942, Training Loss: 0.00805
Epoch: 25943, Training Loss: 0.00911
Epoch: 25943, Training Loss: 0.00850
Epoch: 25943, Training Loss: 0.01041
Epoch: 25943, Training Loss: 0.00805
Epoch: 25944, Training Loss: 0.00911
Epoch: 25944, Training Loss: 0.00850
Epoch: 25944, Training Loss: 0.01041
Epoch: 25944, Training Loss: 0.00805
Epoch: 25945, Training Loss: 0.00911
Epoch: 25945, Training Loss: 0.00850
Epoch: 25945, Training Loss: 0.01041
Epoch: 25945, Training Loss: 0.00805
Epoch: 25946, Training Loss: 0.00911
Epoch: 25946, Training Loss: 0.00850
Epoch: 25946, Training Loss: 0.01041
Epoch: 25946, Training Loss: 0.00805
Epoch: 25947, Training Loss: 0.00911
Epoch: 25947, Training Loss: 0.00850
Epoch: 25947, Training Loss: 0.01041
Epoch: 25947, Training Loss: 0.00805
Epoch: 25948, Training Loss: 0.00911
Epoch: 25948, Training Loss: 0.00850
Epoch: 25948, Training Loss: 0.01041
Epoch: 25948, Training Loss: 0.00805
Epoch: 25949, Training Loss: 0.00911
Epoch: 25949, Training Loss: 0.00850
Epoch: 25949, Training Loss: 0.01040
Epoch: 25949, Training Loss: 0.00805
Epoch: 25950, Training Loss: 0.00911
Epoch: 25950, Training Loss: 0.00850
Epoch: 25950, Training Loss: 0.01040
Epoch: 25950, Training Loss: 0.00805
Epoch: 25951, Training Loss: 0.00911
Epoch: 25951, Training Loss: 0.00850
Epoch: 25951, Training Loss: 0.01040
Epoch: 25951, Training Loss: 0.00805
Epoch: 25952, Training Loss: 0.00911
Epoch: 25952, Training Loss: 0.00850
Epoch: 25952, Training Loss: 0.01040
Epoch: 25952, Training Loss: 0.00805
Epoch: 25953, Training Loss: 0.00911
Epoch: 25953, Training Loss: 0.00850
Epoch: 25953, Training Loss: 0.01040
Epoch: 25953, Training Loss: 0.00805
Epoch: 25954, Training Loss: 0.00911
Epoch: 25954, Training Loss: 0.00850
Epoch: 25954, Training Loss: 0.01040
Epoch: 25954, Training Loss: 0.00805
Epoch: 25955, Training Loss: 0.00911
Epoch: 25955, Training Loss: 0.00850
Epoch: 25955, Training Loss: 0.01040
Epoch: 25955, Training Loss: 0.00805
Epoch: 25956, Training Loss: 0.00911
Epoch: 25956, Training Loss: 0.00850
Epoch: 25956, Training Loss: 0.01040
Epoch: 25956, Training Loss: 0.00805
Epoch: 25957, Training Loss: 0.00911
Epoch: 25957, Training Loss: 0.00850
Epoch: 25957, Training Loss: 0.01040
Epoch: 25957, Training Loss: 0.00805
Epoch: 25958, Training Loss: 0.00911
Epoch: 25958, Training Loss: 0.00850
Epoch: 25958, Training Loss: 0.01040
Epoch: 25958, Training Loss: 0.00805
Epoch: 25959, Training Loss: 0.00911
Epoch: 25959, Training Loss: 0.00850
Epoch: 25959, Training Loss: 0.01040
Epoch: 25959, Training Loss: 0.00805
Epoch: 25960, Training Loss: 0.00911
Epoch: 25960, Training Loss: 0.00850
Epoch: 25960, Training Loss: 0.01040
Epoch: 25960, Training Loss: 0.00805
Epoch: 25961, Training Loss: 0.00911
Epoch: 25961, Training Loss: 0.00850
Epoch: 25961, Training Loss: 0.01040
Epoch: 25961, Training Loss: 0.00805
Epoch: 25962, Training Loss: 0.00911
Epoch: 25962, Training Loss: 0.00850
Epoch: 25962, Training Loss: 0.01040
Epoch: 25962, Training Loss: 0.00805
Epoch: 25963, Training Loss: 0.00911
Epoch: 25963, Training Loss: 0.00850
Epoch: 25963, Training Loss: 0.01040
Epoch: 25963, Training Loss: 0.00805
Epoch: 25964, Training Loss: 0.00911
Epoch: 25964, Training Loss: 0.00850
Epoch: 25964, Training Loss: 0.01040
Epoch: 25964, Training Loss: 0.00805
Epoch: 25965, Training Loss: 0.00911
Epoch: 25965, Training Loss: 0.00849
Epoch: 25965, Training Loss: 0.01040
Epoch: 25965, Training Loss: 0.00805
Epoch: 25966, Training Loss: 0.00911
Epoch: 25966, Training Loss: 0.00849
Epoch: 25966, Training Loss: 0.01040
Epoch: 25966, Training Loss: 0.00805
Epoch: 25967, Training Loss: 0.00911
Epoch: 25967, Training Loss: 0.00849
Epoch: 25967, Training Loss: 0.01040
Epoch: 25967, Training Loss: 0.00805
Epoch: 25968, Training Loss: 0.00911
Epoch: 25968, Training Loss: 0.00849
Epoch: 25968, Training Loss: 0.01040
Epoch: 25968, Training Loss: 0.00805
Epoch: 25969, Training Loss: 0.00911
Epoch: 25969, Training Loss: 0.00849
Epoch: 25969, Training Loss: 0.01040
Epoch: 25969, Training Loss: 0.00805
Epoch: 25970, Training Loss: 0.00911
Epoch: 25970, Training Loss: 0.00849
Epoch: 25970, Training Loss: 0.01040
Epoch: 25970, Training Loss: 0.00805
Epoch: 25971, Training Loss: 0.00911
Epoch: 25971, Training Loss: 0.00849
Epoch: 25971, Training Loss: 0.01040
Epoch: 25971, Training Loss: 0.00804
Epoch: 25972, Training Loss: 0.00911
Epoch: 25972, Training Loss: 0.00849
Epoch: 25972, Training Loss: 0.01040
Epoch: 25972, Training Loss: 0.00804
Epoch: 25973, Training Loss: 0.00911
Epoch: 25973, Training Loss: 0.00849
Epoch: 25973, Training Loss: 0.01040
Epoch: 25973, Training Loss: 0.00804
Epoch: 25974, Training Loss: 0.00911
Epoch: 25974, Training Loss: 0.00849
Epoch: 25974, Training Loss: 0.01040
Epoch: 25974, Training Loss: 0.00804
Epoch: 25975, Training Loss: 0.00911
Epoch: 25975, Training Loss: 0.00849
Epoch: 25975, Training Loss: 0.01040
Epoch: 25975, Training Loss: 0.00804
Epoch: 25976, Training Loss: 0.00911
Epoch: 25976, Training Loss: 0.00849
Epoch: 25976, Training Loss: 0.01040
Epoch: 25976, Training Loss: 0.00804
Epoch: 25977, Training Loss: 0.00911
Epoch: 25977, Training Loss: 0.00849
Epoch: 25977, Training Loss: 0.01040
Epoch: 25977, Training Loss: 0.00804
Epoch: 25978, Training Loss: 0.00911
Epoch: 25978, Training Loss: 0.00849
Epoch: 25978, Training Loss: 0.01040
Epoch: 25978, Training Loss: 0.00804
Epoch: 25979, Training Loss: 0.00911
Epoch: 25979, Training Loss: 0.00849
Epoch: 25979, Training Loss: 0.01040
Epoch: 25979, Training Loss: 0.00804
Epoch: 25980, Training Loss: 0.00911
Epoch: 25980, Training Loss: 0.00849
Epoch: 25980, Training Loss: 0.01040
Epoch: 25980, Training Loss: 0.00804
Epoch: 25981, Training Loss: 0.00911
Epoch: 25981, Training Loss: 0.00849
Epoch: 25981, Training Loss: 0.01040
Epoch: 25981, Training Loss: 0.00804
Epoch: 25982, Training Loss: 0.00911
Epoch: 25982, Training Loss: 0.00849
Epoch: 25982, Training Loss: 0.01040
Epoch: 25982, Training Loss: 0.00804
Epoch: 25983, Training Loss: 0.00911
Epoch: 25983, Training Loss: 0.00849
Epoch: 25983, Training Loss: 0.01040
Epoch: 25983, Training Loss: 0.00804
Epoch: 25984, Training Loss: 0.00911
Epoch: 25984, Training Loss: 0.00849
Epoch: 25984, Training Loss: 0.01040
Epoch: 25984, Training Loss: 0.00804
Epoch: 25985, Training Loss: 0.00911
Epoch: 25985, Training Loss: 0.00849
Epoch: 25985, Training Loss: 0.01040
Epoch: 25985, Training Loss: 0.00804
Epoch: 25986, Training Loss: 0.00911
Epoch: 25986, Training Loss: 0.00849
Epoch: 25986, Training Loss: 0.01040
Epoch: 25986, Training Loss: 0.00804
Epoch: 25987, Training Loss: 0.00911
Epoch: 25987, Training Loss: 0.00849
Epoch: 25987, Training Loss: 0.01040
Epoch: 25987, Training Loss: 0.00804
Epoch: 25988, Training Loss: 0.00911
Epoch: 25988, Training Loss: 0.00849
Epoch: 25988, Training Loss: 0.01040
Epoch: 25988, Training Loss: 0.00804
Epoch: 25989, Training Loss: 0.00910
Epoch: 25989, Training Loss: 0.00849
Epoch: 25989, Training Loss: 0.01040
Epoch: 25989, Training Loss: 0.00804
Epoch: 25990, Training Loss: 0.00910
Epoch: 25990, Training Loss: 0.00849
Epoch: 25990, Training Loss: 0.01040
Epoch: 25990, Training Loss: 0.00804
Epoch: 25991, Training Loss: 0.00910
Epoch: 25991, Training Loss: 0.00849
Epoch: 25991, Training Loss: 0.01040
Epoch: 25991, Training Loss: 0.00804
...
Epoch: 26234, Training Loss: 0.00906
[output truncated: the loss is printed once per training sample, four times per epoch; from epoch 25990 to 26234 the per-sample losses decrease only slightly, from roughly 0.0104 / 0.0091 / 0.0085 / 0.0080 toward the 0.008 target]
Epoch: 26234, Training Loss: 0.00845
Epoch: 26234, Training Loss: 0.01034
Epoch: 26234, Training Loss: 0.00800
Epoch: 26235, Training Loss: 0.00906
Epoch: 26235, Training Loss: 0.00845
Epoch: 26235, Training Loss: 0.01034
Epoch: 26235, Training Loss: 0.00800
Epoch: 26236, Training Loss: 0.00906
Epoch: 26236, Training Loss: 0.00845
Epoch: 26236, Training Loss: 0.01034
Epoch: 26236, Training Loss: 0.00800
Epoch: 26237, Training Loss: 0.00906
Epoch: 26237, Training Loss: 0.00845
Epoch: 26237, Training Loss: 0.01034
Epoch: 26237, Training Loss: 0.00800
Epoch: 26238, Training Loss: 0.00906
Epoch: 26238, Training Loss: 0.00845
Epoch: 26238, Training Loss: 0.01034
Epoch: 26238, Training Loss: 0.00800
Epoch: 26239, Training Loss: 0.00906
Epoch: 26239, Training Loss: 0.00845
Epoch: 26239, Training Loss: 0.01034
Epoch: 26239, Training Loss: 0.00800
Epoch: 26240, Training Loss: 0.00906
Epoch: 26240, Training Loss: 0.00845
Epoch: 26240, Training Loss: 0.01034
Epoch: 26240, Training Loss: 0.00800
Epoch: 26241, Training Loss: 0.00906
Epoch: 26241, Training Loss: 0.00845
Epoch: 26241, Training Loss: 0.01034
Epoch: 26241, Training Loss: 0.00800
Epoch: 26242, Training Loss: 0.00906
Epoch: 26242, Training Loss: 0.00845
Epoch: 26242, Training Loss: 0.01034
Epoch: 26242, Training Loss: 0.00800
Epoch: 26243, Training Loss: 0.00906
Epoch: 26243, Training Loss: 0.00845
Epoch: 26243, Training Loss: 0.01034
Epoch: 26243, Training Loss: 0.00800
Epoch: 26244, Training Loss: 0.00906
Epoch: 26244, Training Loss: 0.00845
Epoch: 26244, Training Loss: 0.01034
Epoch: 26244, Training Loss: 0.00800
Epoch: 26245, Training Loss: 0.00906
Epoch: 26245, Training Loss: 0.00845
Epoch: 26245, Training Loss: 0.01034
Epoch: 26245, Training Loss: 0.00800
Epoch: 26246, Training Loss: 0.00906
Epoch: 26246, Training Loss: 0.00845
Epoch: 26246, Training Loss: 0.01034
Epoch: 26246, Training Loss: 0.00800
Epoch: 26247, Training Loss: 0.00906
Epoch: 26247, Training Loss: 0.00845
Epoch: 26247, Training Loss: 0.01034
Epoch: 26247, Training Loss: 0.00800
Epoch: 26248, Training Loss: 0.00906
Epoch: 26248, Training Loss: 0.00845
Epoch: 26248, Training Loss: 0.01034
Epoch: 26248, Training Loss: 0.00800
Epoch: 26249, Training Loss: 0.00906
Epoch: 26249, Training Loss: 0.00845
Epoch: 26249, Training Loss: 0.01034
Epoch: 26249, Training Loss: 0.00800
Epoch: 26250, Training Loss: 0.00906
Epoch: 26250, Training Loss: 0.00845
Epoch: 26250, Training Loss: 0.01034
Epoch: 26250, Training Loss: 0.00800
Epoch: 26251, Training Loss: 0.00906
Epoch: 26251, Training Loss: 0.00845
Epoch: 26251, Training Loss: 0.01034
Epoch: 26251, Training Loss: 0.00800
Epoch: 26252, Training Loss: 0.00906
Epoch: 26252, Training Loss: 0.00845
Epoch: 26252, Training Loss: 0.01034
Epoch: 26252, Training Loss: 0.00800
Epoch: 26253, Training Loss: 0.00906
Epoch: 26253, Training Loss: 0.00845
Epoch: 26253, Training Loss: 0.01034
Epoch: 26253, Training Loss: 0.00800
Epoch: 26254, Training Loss: 0.00906
Epoch: 26254, Training Loss: 0.00845
Epoch: 26254, Training Loss: 0.01034
Epoch: 26254, Training Loss: 0.00800
Epoch: 26255, Training Loss: 0.00906
Epoch: 26255, Training Loss: 0.00845
Epoch: 26255, Training Loss: 0.01034
Epoch: 26255, Training Loss: 0.00800
Epoch: 26256, Training Loss: 0.00906
Epoch: 26256, Training Loss: 0.00845
Epoch: 26256, Training Loss: 0.01034
Epoch: 26256, Training Loss: 0.00800
Epoch: 26257, Training Loss: 0.00906
Epoch: 26257, Training Loss: 0.00845
Epoch: 26257, Training Loss: 0.01034
Epoch: 26257, Training Loss: 0.00800
Epoch: 26258, Training Loss: 0.00906
Epoch: 26258, Training Loss: 0.00845
Epoch: 26258, Training Loss: 0.01034
Epoch: 26258, Training Loss: 0.00800
Epoch: 26259, Training Loss: 0.00905
Epoch: 26259, Training Loss: 0.00844
Epoch: 26259, Training Loss: 0.01034
Epoch: 26259, Training Loss: 0.00800
Epoch: 26260, Training Loss: 0.00905
Epoch: 26260, Training Loss: 0.00844
Epoch: 26260, Training Loss: 0.01034
Epoch: 26260, Training Loss: 0.00800
Epoch: 26261, Training Loss: 0.00905
Epoch: 26261, Training Loss: 0.00844
Epoch: 26261, Training Loss: 0.01034
Epoch: 26261, Training Loss: 0.00800
Epoch: 26262, Training Loss: 0.00905
Epoch: 26262, Training Loss: 0.00844
Epoch: 26262, Training Loss: 0.01034
Epoch: 26262, Training Loss: 0.00800
Epoch: 26263, Training Loss: 0.00905
Epoch: 26263, Training Loss: 0.00844
Epoch: 26263, Training Loss: 0.01034
Epoch: 26263, Training Loss: 0.00800
Epoch: 26264, Training Loss: 0.00905
Epoch: 26264, Training Loss: 0.00844
Epoch: 26264, Training Loss: 0.01034
Epoch: 26264, Training Loss: 0.00800
Epoch: 26265, Training Loss: 0.00905
Epoch: 26265, Training Loss: 0.00844
Epoch: 26265, Training Loss: 0.01034
Epoch: 26265, Training Loss: 0.00800
Epoch: 26266, Training Loss: 0.00905
Epoch: 26266, Training Loss: 0.00844
Epoch: 26266, Training Loss: 0.01034
Epoch: 26266, Training Loss: 0.00800
Epoch: 26267, Training Loss: 0.00905
Epoch: 26267, Training Loss: 0.00844
Epoch: 26267, Training Loss: 0.01034
Epoch: 26267, Training Loss: 0.00800
Epoch: 26268, Training Loss: 0.00905
Epoch: 26268, Training Loss: 0.00844
Epoch: 26268, Training Loss: 0.01034
Epoch: 26268, Training Loss: 0.00800
Epoch: 26269, Training Loss: 0.00905
Epoch: 26269, Training Loss: 0.00844
Epoch: 26269, Training Loss: 0.01034
Epoch: 26269, Training Loss: 0.00800
Epoch: 26270, Training Loss: 0.00905
Epoch: 26270, Training Loss: 0.00844
Epoch: 26270, Training Loss: 0.01034
Epoch: 26270, Training Loss: 0.00800
Epoch: 26271, Training Loss: 0.00905
Epoch: 26271, Training Loss: 0.00844
Epoch: 26271, Training Loss: 0.01034
Epoch: 26271, Training Loss: 0.00800
Epoch: 26272, Training Loss: 0.00905
Epoch: 26272, Training Loss: 0.00844
Epoch: 26272, Training Loss: 0.01034
Epoch: 26272, Training Loss: 0.00800
Epoch: 26273, Training Loss: 0.00905
Epoch: 26273, Training Loss: 0.00844
Epoch: 26273, Training Loss: 0.01034
Epoch: 26273, Training Loss: 0.00800
Epoch: 26274, Training Loss: 0.00905
Epoch: 26274, Training Loss: 0.00844
Epoch: 26274, Training Loss: 0.01034
Epoch: 26274, Training Loss: 0.00800
Epoch: 26275, Training Loss: 0.00905
Epoch: 26275, Training Loss: 0.00844
Epoch: 26275, Training Loss: 0.01034
Epoch: 26275, Training Loss: 0.00800
Epoch: 26276, Training Loss: 0.00905
Epoch: 26276, Training Loss: 0.00844
Epoch: 26276, Training Loss: 0.01034
Epoch: 26276, Training Loss: 0.00800
Epoch: 26277, Training Loss: 0.00905
Epoch: 26277, Training Loss: 0.00844
Epoch: 26277, Training Loss: 0.01034
Epoch: 26277, Training Loss: 0.00800
Epoch: 26278, Training Loss: 0.00905
Epoch: 26278, Training Loss: 0.00844
Epoch: 26278, Training Loss: 0.01034
Epoch: 26278, Training Loss: 0.00800
Epoch: 26279, Training Loss: 0.00905
Epoch: 26279, Training Loss: 0.00844
Epoch: 26279, Training Loss: 0.01034
Epoch: 26279, Training Loss: 0.00800
Epoch: 26280, Training Loss: 0.00905
Epoch: 26280, Training Loss: 0.00844
Epoch: 26280, Training Loss: 0.01033
Epoch: 26280, Training Loss: 0.00799
Epoch: 26281, Training Loss: 0.00905
Epoch: 26281, Training Loss: 0.00844
Epoch: 26281, Training Loss: 0.01033
Epoch: 26281, Training Loss: 0.00799
Epoch: 26282, Training Loss: 0.00905
Epoch: 26282, Training Loss: 0.00844
Epoch: 26282, Training Loss: 0.01033
Epoch: 26282, Training Loss: 0.00799
Epoch: 26283, Training Loss: 0.00905
Epoch: 26283, Training Loss: 0.00844
Epoch: 26283, Training Loss: 0.01033
Epoch: 26283, Training Loss: 0.00799
Epoch: 26284, Training Loss: 0.00905
Epoch: 26284, Training Loss: 0.00844
Epoch: 26284, Training Loss: 0.01033
Epoch: 26284, Training Loss: 0.00799
Epoch: 26285, Training Loss: 0.00905
Epoch: 26285, Training Loss: 0.00844
Epoch: 26285, Training Loss: 0.01033
Epoch: 26285, Training Loss: 0.00799
Epoch: 26286, Training Loss: 0.00905
Epoch: 26286, Training Loss: 0.00844
Epoch: 26286, Training Loss: 0.01033
Epoch: 26286, Training Loss: 0.00799
Epoch: 26287, Training Loss: 0.00905
Epoch: 26287, Training Loss: 0.00844
Epoch: 26287, Training Loss: 0.01033
Epoch: 26287, Training Loss: 0.00799
Epoch: 26288, Training Loss: 0.00905
Epoch: 26288, Training Loss: 0.00844
Epoch: 26288, Training Loss: 0.01033
Epoch: 26288, Training Loss: 0.00799
Epoch: 26289, Training Loss: 0.00905
Epoch: 26289, Training Loss: 0.00844
Epoch: 26289, Training Loss: 0.01033
Epoch: 26289, Training Loss: 0.00799
Epoch: 26290, Training Loss: 0.00905
Epoch: 26290, Training Loss: 0.00844
Epoch: 26290, Training Loss: 0.01033
Epoch: 26290, Training Loss: 0.00799
Epoch: 26291, Training Loss: 0.00905
Epoch: 26291, Training Loss: 0.00844
Epoch: 26291, Training Loss: 0.01033
Epoch: 26291, Training Loss: 0.00799
Epoch: 26292, Training Loss: 0.00905
Epoch: 26292, Training Loss: 0.00844
Epoch: 26292, Training Loss: 0.01033
Epoch: 26292, Training Loss: 0.00799
Epoch: 26293, Training Loss: 0.00905
Epoch: 26293, Training Loss: 0.00844
Epoch: 26293, Training Loss: 0.01033
Epoch: 26293, Training Loss: 0.00799
Epoch: 26294, Training Loss: 0.00905
Epoch: 26294, Training Loss: 0.00844
Epoch: 26294, Training Loss: 0.01033
Epoch: 26294, Training Loss: 0.00799
Epoch: 26295, Training Loss: 0.00905
Epoch: 26295, Training Loss: 0.00844
Epoch: 26295, Training Loss: 0.01033
Epoch: 26295, Training Loss: 0.00799
Epoch: 26296, Training Loss: 0.00905
Epoch: 26296, Training Loss: 0.00844
Epoch: 26296, Training Loss: 0.01033
Epoch: 26296, Training Loss: 0.00799
Epoch: 26297, Training Loss: 0.00905
Epoch: 26297, Training Loss: 0.00844
Epoch: 26297, Training Loss: 0.01033
Epoch: 26297, Training Loss: 0.00799
Epoch: 26298, Training Loss: 0.00905
Epoch: 26298, Training Loss: 0.00844
Epoch: 26298, Training Loss: 0.01033
Epoch: 26298, Training Loss: 0.00799
Epoch: 26299, Training Loss: 0.00905
Epoch: 26299, Training Loss: 0.00844
Epoch: 26299, Training Loss: 0.01033
Epoch: 26299, Training Loss: 0.00799
Epoch: 26300, Training Loss: 0.00905
Epoch: 26300, Training Loss: 0.00844
Epoch: 26300, Training Loss: 0.01033
Epoch: 26300, Training Loss: 0.00799
Epoch: 26301, Training Loss: 0.00905
Epoch: 26301, Training Loss: 0.00844
Epoch: 26301, Training Loss: 0.01033
Epoch: 26301, Training Loss: 0.00799
Epoch: 26302, Training Loss: 0.00905
Epoch: 26302, Training Loss: 0.00844
Epoch: 26302, Training Loss: 0.01033
Epoch: 26302, Training Loss: 0.00799
Epoch: 26303, Training Loss: 0.00905
Epoch: 26303, Training Loss: 0.00844
Epoch: 26303, Training Loss: 0.01033
Epoch: 26303, Training Loss: 0.00799
Epoch: 26304, Training Loss: 0.00905
Epoch: 26304, Training Loss: 0.00844
Epoch: 26304, Training Loss: 0.01033
Epoch: 26304, Training Loss: 0.00799
Epoch: 26305, Training Loss: 0.00905
Epoch: 26305, Training Loss: 0.00844
Epoch: 26305, Training Loss: 0.01033
Epoch: 26305, Training Loss: 0.00799
Epoch: 26306, Training Loss: 0.00905
Epoch: 26306, Training Loss: 0.00844
Epoch: 26306, Training Loss: 0.01033
Epoch: 26306, Training Loss: 0.00799
Epoch: 26307, Training Loss: 0.00905
Epoch: 26307, Training Loss: 0.00844
Epoch: 26307, Training Loss: 0.01033
Epoch: 26307, Training Loss: 0.00799
Epoch: 26308, Training Loss: 0.00905
Epoch: 26308, Training Loss: 0.00844
Epoch: 26308, Training Loss: 0.01033
Epoch: 26308, Training Loss: 0.00799
Epoch: 26309, Training Loss: 0.00905
Epoch: 26309, Training Loss: 0.00844
Epoch: 26309, Training Loss: 0.01033
Epoch: 26309, Training Loss: 0.00799
Epoch: 26310, Training Loss: 0.00905
Epoch: 26310, Training Loss: 0.00844
Epoch: 26310, Training Loss: 0.01033
Epoch: 26310, Training Loss: 0.00799
Epoch: 26311, Training Loss: 0.00905
Epoch: 26311, Training Loss: 0.00844
Epoch: 26311, Training Loss: 0.01033
Epoch: 26311, Training Loss: 0.00799
Epoch: 26312, Training Loss: 0.00905
Epoch: 26312, Training Loss: 0.00844
Epoch: 26312, Training Loss: 0.01033
Epoch: 26312, Training Loss: 0.00799
Epoch: 26313, Training Loss: 0.00904
Epoch: 26313, Training Loss: 0.00844
Epoch: 26313, Training Loss: 0.01033
Epoch: 26313, Training Loss: 0.00799
Epoch: 26314, Training Loss: 0.00904
Epoch: 26314, Training Loss: 0.00844
Epoch: 26314, Training Loss: 0.01033
Epoch: 26314, Training Loss: 0.00799
Epoch: 26315, Training Loss: 0.00904
Epoch: 26315, Training Loss: 0.00844
Epoch: 26315, Training Loss: 0.01033
Epoch: 26315, Training Loss: 0.00799
Epoch: 26316, Training Loss: 0.00904
Epoch: 26316, Training Loss: 0.00844
Epoch: 26316, Training Loss: 0.01033
Epoch: 26316, Training Loss: 0.00799
Epoch: 26317, Training Loss: 0.00904
Epoch: 26317, Training Loss: 0.00844
Epoch: 26317, Training Loss: 0.01033
Epoch: 26317, Training Loss: 0.00799
Epoch: 26318, Training Loss: 0.00904
Epoch: 26318, Training Loss: 0.00843
Epoch: 26318, Training Loss: 0.01033
Epoch: 26318, Training Loss: 0.00799
Epoch: 26319, Training Loss: 0.00904
Epoch: 26319, Training Loss: 0.00843
Epoch: 26319, Training Loss: 0.01033
Epoch: 26319, Training Loss: 0.00799
Epoch: 26320, Training Loss: 0.00904
Epoch: 26320, Training Loss: 0.00843
Epoch: 26320, Training Loss: 0.01033
Epoch: 26320, Training Loss: 0.00799
Epoch: 26321, Training Loss: 0.00904
Epoch: 26321, Training Loss: 0.00843
Epoch: 26321, Training Loss: 0.01033
Epoch: 26321, Training Loss: 0.00799
Epoch: 26322, Training Loss: 0.00904
Epoch: 26322, Training Loss: 0.00843
Epoch: 26322, Training Loss: 0.01033
Epoch: 26322, Training Loss: 0.00799
Epoch: 26323, Training Loss: 0.00904
Epoch: 26323, Training Loss: 0.00843
Epoch: 26323, Training Loss: 0.01033
Epoch: 26323, Training Loss: 0.00799
Epoch: 26324, Training Loss: 0.00904
Epoch: 26324, Training Loss: 0.00843
Epoch: 26324, Training Loss: 0.01033
Epoch: 26324, Training Loss: 0.00799
Epoch: 26325, Training Loss: 0.00904
Epoch: 26325, Training Loss: 0.00843
Epoch: 26325, Training Loss: 0.01033
Epoch: 26325, Training Loss: 0.00799
Epoch: 26326, Training Loss: 0.00904
Epoch: 26326, Training Loss: 0.00843
Epoch: 26326, Training Loss: 0.01033
Epoch: 26326, Training Loss: 0.00799
Epoch: 26327, Training Loss: 0.00904
Epoch: 26327, Training Loss: 0.00843
Epoch: 26327, Training Loss: 0.01033
Epoch: 26327, Training Loss: 0.00799
Epoch: 26328, Training Loss: 0.00904
Epoch: 26328, Training Loss: 0.00843
Epoch: 26328, Training Loss: 0.01032
Epoch: 26328, Training Loss: 0.00799
Epoch: 26329, Training Loss: 0.00904
Epoch: 26329, Training Loss: 0.00843
Epoch: 26329, Training Loss: 0.01032
Epoch: 26329, Training Loss: 0.00799
Epoch: 26330, Training Loss: 0.00904
Epoch: 26330, Training Loss: 0.00843
Epoch: 26330, Training Loss: 0.01032
Epoch: 26330, Training Loss: 0.00799
Epoch: 26331, Training Loss: 0.00904
Epoch: 26331, Training Loss: 0.00843
Epoch: 26331, Training Loss: 0.01032
Epoch: 26331, Training Loss: 0.00799
Epoch: 26332, Training Loss: 0.00904
Epoch: 26332, Training Loss: 0.00843
Epoch: 26332, Training Loss: 0.01032
Epoch: 26332, Training Loss: 0.00799
Epoch: 26333, Training Loss: 0.00904
Epoch: 26333, Training Loss: 0.00843
Epoch: 26333, Training Loss: 0.01032
Epoch: 26333, Training Loss: 0.00799
Epoch: 26334, Training Loss: 0.00904
Epoch: 26334, Training Loss: 0.00843
Epoch: 26334, Training Loss: 0.01032
Epoch: 26334, Training Loss: 0.00799
Epoch: 26335, Training Loss: 0.00904
Epoch: 26335, Training Loss: 0.00843
Epoch: 26335, Training Loss: 0.01032
Epoch: 26335, Training Loss: 0.00799
Epoch: 26336, Training Loss: 0.00904
Epoch: 26336, Training Loss: 0.00843
Epoch: 26336, Training Loss: 0.01032
Epoch: 26336, Training Loss: 0.00799
Epoch: 26337, Training Loss: 0.00904
Epoch: 26337, Training Loss: 0.00843
Epoch: 26337, Training Loss: 0.01032
Epoch: 26337, Training Loss: 0.00799
Epoch: 26338, Training Loss: 0.00904
Epoch: 26338, Training Loss: 0.00843
Epoch: 26338, Training Loss: 0.01032
Epoch: 26338, Training Loss: 0.00799
Epoch: 26339, Training Loss: 0.00904
Epoch: 26339, Training Loss: 0.00843
Epoch: 26339, Training Loss: 0.01032
Epoch: 26339, Training Loss: 0.00799
Epoch: 26340, Training Loss: 0.00904
Epoch: 26340, Training Loss: 0.00843
Epoch: 26340, Training Loss: 0.01032
Epoch: 26340, Training Loss: 0.00799
Epoch: 26341, Training Loss: 0.00904
Epoch: 26341, Training Loss: 0.00843
Epoch: 26341, Training Loss: 0.01032
Epoch: 26341, Training Loss: 0.00799
Epoch: 26342, Training Loss: 0.00904
Epoch: 26342, Training Loss: 0.00843
Epoch: 26342, Training Loss: 0.01032
Epoch: 26342, Training Loss: 0.00799
Epoch: 26343, Training Loss: 0.00904
Epoch: 26343, Training Loss: 0.00843
Epoch: 26343, Training Loss: 0.01032
Epoch: 26343, Training Loss: 0.00798
Epoch: 26344, Training Loss: 0.00904
Epoch: 26344, Training Loss: 0.00843
Epoch: 26344, Training Loss: 0.01032
Epoch: 26344, Training Loss: 0.00798
Epoch: 26345, Training Loss: 0.00904
Epoch: 26345, Training Loss: 0.00843
Epoch: 26345, Training Loss: 0.01032
Epoch: 26345, Training Loss: 0.00798
Epoch: 26346, Training Loss: 0.00904
Epoch: 26346, Training Loss: 0.00843
Epoch: 26346, Training Loss: 0.01032
Epoch: 26346, Training Loss: 0.00798
Epoch: 26347, Training Loss: 0.00904
Epoch: 26347, Training Loss: 0.00843
Epoch: 26347, Training Loss: 0.01032
Epoch: 26347, Training Loss: 0.00798
Epoch: 26348, Training Loss: 0.00904
Epoch: 26348, Training Loss: 0.00843
Epoch: 26348, Training Loss: 0.01032
Epoch: 26348, Training Loss: 0.00798
Epoch: 26349, Training Loss: 0.00904
Epoch: 26349, Training Loss: 0.00843
Epoch: 26349, Training Loss: 0.01032
Epoch: 26349, Training Loss: 0.00798
Epoch: 26350, Training Loss: 0.00904
Epoch: 26350, Training Loss: 0.00843
Epoch: 26350, Training Loss: 0.01032
Epoch: 26350, Training Loss: 0.00798
Epoch: 26351, Training Loss: 0.00904
Epoch: 26351, Training Loss: 0.00843
Epoch: 26351, Training Loss: 0.01032
Epoch: 26351, Training Loss: 0.00798
Epoch: 26352, Training Loss: 0.00904
Epoch: 26352, Training Loss: 0.00843
Epoch: 26352, Training Loss: 0.01032
Epoch: 26352, Training Loss: 0.00798
Epoch: 26353, Training Loss: 0.00904
Epoch: 26353, Training Loss: 0.00843
Epoch: 26353, Training Loss: 0.01032
Epoch: 26353, Training Loss: 0.00798
Epoch: 26354, Training Loss: 0.00904
Epoch: 26354, Training Loss: 0.00843
Epoch: 26354, Training Loss: 0.01032
Epoch: 26354, Training Loss: 0.00798
Epoch: 26355, Training Loss: 0.00904
Epoch: 26355, Training Loss: 0.00843
Epoch: 26355, Training Loss: 0.01032
Epoch: 26355, Training Loss: 0.00798
Epoch: 26356, Training Loss: 0.00904
Epoch: 26356, Training Loss: 0.00843
Epoch: 26356, Training Loss: 0.01032
Epoch: 26356, Training Loss: 0.00798
Epoch: 26357, Training Loss: 0.00904
Epoch: 26357, Training Loss: 0.00843
Epoch: 26357, Training Loss: 0.01032
Epoch: 26357, Training Loss: 0.00798
Epoch: 26358, Training Loss: 0.00904
Epoch: 26358, Training Loss: 0.00843
Epoch: 26358, Training Loss: 0.01032
Epoch: 26358, Training Loss: 0.00798
Epoch: 26359, Training Loss: 0.00904
Epoch: 26359, Training Loss: 0.00843
Epoch: 26359, Training Loss: 0.01032
Epoch: 26359, Training Loss: 0.00798
Epoch: 26360, Training Loss: 0.00904
Epoch: 26360, Training Loss: 0.00843
Epoch: 26360, Training Loss: 0.01032
Epoch: 26360, Training Loss: 0.00798
Epoch: 26361, Training Loss: 0.00904
Epoch: 26361, Training Loss: 0.00843
Epoch: 26361, Training Loss: 0.01032
Epoch: 26361, Training Loss: 0.00798
Epoch: 26362, Training Loss: 0.00904
Epoch: 26362, Training Loss: 0.00843
Epoch: 26362, Training Loss: 0.01032
Epoch: 26362, Training Loss: 0.00798
Epoch: 26363, Training Loss: 0.00904
Epoch: 26363, Training Loss: 0.00843
Epoch: 26363, Training Loss: 0.01032
Epoch: 26363, Training Loss: 0.00798
Epoch: 26364, Training Loss: 0.00904
Epoch: 26364, Training Loss: 0.00843
Epoch: 26364, Training Loss: 0.01032
Epoch: 26364, Training Loss: 0.00798
Epoch: 26365, Training Loss: 0.00904
Epoch: 26365, Training Loss: 0.00843
Epoch: 26365, Training Loss: 0.01032
Epoch: 26365, Training Loss: 0.00798
Epoch: 26366, Training Loss: 0.00904
Epoch: 26366, Training Loss: 0.00843
Epoch: 26366, Training Loss: 0.01032
Epoch: 26366, Training Loss: 0.00798
Epoch: 26367, Training Loss: 0.00904
Epoch: 26367, Training Loss: 0.00843
Epoch: 26367, Training Loss: 0.01032
Epoch: 26367, Training Loss: 0.00798
Epoch: 26368, Training Loss: 0.00903
Epoch: 26368, Training Loss: 0.00843
Epoch: 26368, Training Loss: 0.01032
Epoch: 26368, Training Loss: 0.00798
Epoch: 26369, Training Loss: 0.00903
Epoch: 26369, Training Loss: 0.00843
Epoch: 26369, Training Loss: 0.01032
Epoch: 26369, Training Loss: 0.00798
Epoch: 26370, Training Loss: 0.00903
Epoch: 26370, Training Loss: 0.00843
Epoch: 26370, Training Loss: 0.01032
Epoch: 26370, Training Loss: 0.00798
Epoch: 26371, Training Loss: 0.00903
Epoch: 26371, Training Loss: 0.00843
Epoch: 26371, Training Loss: 0.01032
Epoch: 26371, Training Loss: 0.00798
Epoch: 26372, Training Loss: 0.00903
Epoch: 26372, Training Loss: 0.00843
Epoch: 26372, Training Loss: 0.01032
Epoch: 26372, Training Loss: 0.00798
Epoch: 26373, Training Loss: 0.00903
Epoch: 26373, Training Loss: 0.00843
Epoch: 26373, Training Loss: 0.01032
Epoch: 26373, Training Loss: 0.00798
Epoch: 26374, Training Loss: 0.00903
Epoch: 26374, Training Loss: 0.00843
Epoch: 26374, Training Loss: 0.01032
Epoch: 26374, Training Loss: 0.00798
Epoch: 26375, Training Loss: 0.00903
Epoch: 26375, Training Loss: 0.00843
Epoch: 26375, Training Loss: 0.01031
Epoch: 26375, Training Loss: 0.00798
Epoch: 26376, Training Loss: 0.00903
Epoch: 26376, Training Loss: 0.00843
Epoch: 26376, Training Loss: 0.01031
Epoch: 26376, Training Loss: 0.00798
Epoch: 26377, Training Loss: 0.00903
Epoch: 26377, Training Loss: 0.00843
Epoch: 26377, Training Loss: 0.01031
Epoch: 26377, Training Loss: 0.00798
Epoch: 26378, Training Loss: 0.00903
Epoch: 26378, Training Loss: 0.00842
Epoch: 26378, Training Loss: 0.01031
Epoch: 26378, Training Loss: 0.00798
Epoch: 26379, Training Loss: 0.00903
Epoch: 26379, Training Loss: 0.00842
Epoch: 26379, Training Loss: 0.01031
Epoch: 26379, Training Loss: 0.00798
Epoch: 26380, Training Loss: 0.00903
Epoch: 26380, Training Loss: 0.00842
Epoch: 26380, Training Loss: 0.01031
Epoch: 26380, Training Loss: 0.00798
Epoch: 26381, Training Loss: 0.00903
Epoch: 26381, Training Loss: 0.00842
Epoch: 26381, Training Loss: 0.01031
Epoch: 26381, Training Loss: 0.00798
Epoch: 26382, Training Loss: 0.00903
Epoch: 26382, Training Loss: 0.00842
Epoch: 26382, Training Loss: 0.01031
Epoch: 26382, Training Loss: 0.00798
Epoch: 26383, Training Loss: 0.00903
Epoch: 26383, Training Loss: 0.00842
Epoch: 26383, Training Loss: 0.01031
Epoch: 26383, Training Loss: 0.00798
Epoch: 26384, Training Loss: 0.00903
Epoch: 26384, Training Loss: 0.00842
Epoch: 26384, Training Loss: 0.01031
Epoch: 26384, Training Loss: 0.00798
Epoch: 26385, Training Loss: 0.00903
Epoch: 26385, Training Loss: 0.00842
Epoch: 26385, Training Loss: 0.01031
Epoch: 26385, Training Loss: 0.00798
Epoch: 26386, Training Loss: 0.00903
Epoch: 26386, Training Loss: 0.00842
Epoch: 26386, Training Loss: 0.01031
Epoch: 26386, Training Loss: 0.00798
Epoch: 26387, Training Loss: 0.00903
Epoch: 26387, Training Loss: 0.00842
Epoch: 26387, Training Loss: 0.01031
Epoch: 26387, Training Loss: 0.00798
Epoch: 26388, Training Loss: 0.00903
Epoch: 26388, Training Loss: 0.00842
Epoch: 26388, Training Loss: 0.01031
Epoch: 26388, Training Loss: 0.00798
Epoch: 26389, Training Loss: 0.00903
Epoch: 26389, Training Loss: 0.00842
Epoch: 26389, Training Loss: 0.01031
Epoch: 26389, Training Loss: 0.00798
Epoch: 26390, Training Loss: 0.00903
Epoch: 26390, Training Loss: 0.00842
Epoch: 26390, Training Loss: 0.01031
Epoch: 26390, Training Loss: 0.00798
Epoch: 26391, Training Loss: 0.00903
Epoch: 26391, Training Loss: 0.00842
Epoch: 26391, Training Loss: 0.01031
Epoch: 26391, Training Loss: 0.00798
Epoch: 26392, Training Loss: 0.00903
Epoch: 26392, Training Loss: 0.00842
Epoch: 26392, Training Loss: 0.01031
Epoch: 26392, Training Loss: 0.00798
Epoch: 26393, Training Loss: 0.00903
Epoch: 26393, Training Loss: 0.00842
Epoch: 26393, Training Loss: 0.01031
Epoch: 26393, Training Loss: 0.00798
Epoch: 26394, Training Loss: 0.00903
Epoch: 26394, Training Loss: 0.00842
Epoch: 26394, Training Loss: 0.01031
Epoch: 26394, Training Loss: 0.00798
Epoch: 26395, Training Loss: 0.00903
Epoch: 26395, Training Loss: 0.00842
Epoch: 26395, Training Loss: 0.01031
Epoch: 26395, Training Loss: 0.00798
Epoch: 26396, Training Loss: 0.00903
Epoch: 26396, Training Loss: 0.00842
Epoch: 26396, Training Loss: 0.01031
Epoch: 26396, Training Loss: 0.00798
Epoch: 26397, Training Loss: 0.00903
Epoch: 26397, Training Loss: 0.00842
Epoch: 26397, Training Loss: 0.01031
Epoch: 26397, Training Loss: 0.00798
Epoch: 26398, Training Loss: 0.00903
Epoch: 26398, Training Loss: 0.00842
Epoch: 26398, Training Loss: 0.01031
Epoch: 26398, Training Loss: 0.00798
Epoch: 26399, Training Loss: 0.00903
Epoch: 26399, Training Loss: 0.00842
Epoch: 26399, Training Loss: 0.01031
Epoch: 26399, Training Loss: 0.00798
Epoch: 26400, Training Loss: 0.00903
Epoch: 26400, Training Loss: 0.00842
Epoch: 26400, Training Loss: 0.01031
Epoch: 26400, Training Loss: 0.00798
Epoch: 26401, Training Loss: 0.00903
Epoch: 26401, Training Loss: 0.00842
Epoch: 26401, Training Loss: 0.01031
Epoch: 26401, Training Loss: 0.00798
Epoch: 26402, Training Loss: 0.00903
Epoch: 26402, Training Loss: 0.00842
Epoch: 26402, Training Loss: 0.01031
Epoch: 26402, Training Loss: 0.00798
Epoch: 26403, Training Loss: 0.00903
Epoch: 26403, Training Loss: 0.00842
Epoch: 26403, Training Loss: 0.01031
Epoch: 26403, Training Loss: 0.00798
Epoch: 26404, Training Loss: 0.00903
Epoch: 26404, Training Loss: 0.00842
Epoch: 26404, Training Loss: 0.01031
Epoch: 26404, Training Loss: 0.00798
Epoch: 26405, Training Loss: 0.00903
Epoch: 26405, Training Loss: 0.00842
Epoch: 26405, Training Loss: 0.01031
Epoch: 26405, Training Loss: 0.00798
Epoch: 26406, Training Loss: 0.00903
Epoch: 26406, Training Loss: 0.00842
Epoch: 26406, Training Loss: 0.01031
Epoch: 26406, Training Loss: 0.00797
Epoch: 26407, Training Loss: 0.00903
Epoch: 26407, Training Loss: 0.00842
Epoch: 26407, Training Loss: 0.01031
Epoch: 26407, Training Loss: 0.00797
Epoch: 26408, Training Loss: 0.00903
Epoch: 26408, Training Loss: 0.00842
Epoch: 26408, Training Loss: 0.01031
Epoch: 26408, Training Loss: 0.00797
Epoch: 26409, Training Loss: 0.00903
Epoch: 26409, Training Loss: 0.00842
Epoch: 26409, Training Loss: 0.01031
Epoch: 26409, Training Loss: 0.00797
Epoch: 26410, Training Loss: 0.00903
Epoch: 26410, Training Loss: 0.00842
Epoch: 26410, Training Loss: 0.01031
Epoch: 26410, Training Loss: 0.00797
Epoch: 26411, Training Loss: 0.00903
Epoch: 26411, Training Loss: 0.00842
Epoch: 26411, Training Loss: 0.01031
Epoch: 26411, Training Loss: 0.00797
Epoch: 26412, Training Loss: 0.00903
Epoch: 26412, Training Loss: 0.00842
Epoch: 26412, Training Loss: 0.01031
Epoch: 26412, Training Loss: 0.00797
Epoch: 26413, Training Loss: 0.00903
Epoch: 26413, Training Loss: 0.00842
Epoch: 26413, Training Loss: 0.01031
Epoch: 26413, Training Loss: 0.00797
Epoch: 26414, Training Loss: 0.00903
Epoch: 26414, Training Loss: 0.00842
Epoch: 26414, Training Loss: 0.01031
Epoch: 26414, Training Loss: 0.00797
Epoch: 26415, Training Loss: 0.00903
Epoch: 26415, Training Loss: 0.00842
Epoch: 26415, Training Loss: 0.01031
Epoch: 26415, Training Loss: 0.00797
Epoch: 26416, Training Loss: 0.00903
Epoch: 26416, Training Loss: 0.00842
Epoch: 26416, Training Loss: 0.01031
Epoch: 26416, Training Loss: 0.00797
Epoch: 26417, Training Loss: 0.00903
Epoch: 26417, Training Loss: 0.00842
Epoch: 26417, Training Loss: 0.01031
Epoch: 26417, Training Loss: 0.00797
Epoch: 26418, Training Loss: 0.00903
Epoch: 26418, Training Loss: 0.00842
Epoch: 26418, Training Loss: 0.01031
Epoch: 26418, Training Loss: 0.00797
Epoch: 26419, Training Loss: 0.00903
Epoch: 26419, Training Loss: 0.00842
Epoch: 26419, Training Loss: 0.01031
Epoch: 26419, Training Loss: 0.00797
Epoch: 26420, Training Loss: 0.00903
Epoch: 26420, Training Loss: 0.00842
Epoch: 26420, Training Loss: 0.01031
Epoch: 26420, Training Loss: 0.00797
Epoch: 26421, Training Loss: 0.00903
Epoch: 26421, Training Loss: 0.00842
Epoch: 26421, Training Loss: 0.01031
Epoch: 26421, Training Loss: 0.00797
Epoch: 26422, Training Loss: 0.00902
Epoch: 26422, Training Loss: 0.00842
Epoch: 26422, Training Loss: 0.01031
Epoch: 26422, Training Loss: 0.00797
Epoch: 26423, Training Loss: 0.00902
Epoch: 26423, Training Loss: 0.00842
Epoch: 26423, Training Loss: 0.01031
Epoch: 26423, Training Loss: 0.00797
... (the loss for each of the four XOR patterns is printed every epoch; hundreds of near-identical lines omitted — per-pattern losses decrease only in the fourth decimal place over this range) ...
Epoch: 26666, Training Loss: 0.00898
Epoch: 26666, Training Loss: 0.00838
Epoch: 26666, Training Loss: 0.01025
Epoch: 26666, Training Loss: 0.00793
Epoch: 26667, Training Loss: 0.00898
Epoch: 26667, Training Loss: 0.00838
Epoch: 26667, Training Loss: 0.01025
Epoch: 26667, Training Loss: 0.00793
Epoch: 26668, Training Loss: 0.00898
Epoch: 26668, Training Loss: 0.00838
Epoch: 26668, Training Loss: 0.01025
Epoch: 26668, Training Loss: 0.00793
Epoch: 26669, Training Loss: 0.00898
Epoch: 26669, Training Loss: 0.00838
Epoch: 26669, Training Loss: 0.01025
Epoch: 26669, Training Loss: 0.00793
Epoch: 26670, Training Loss: 0.00898
Epoch: 26670, Training Loss: 0.00838
Epoch: 26670, Training Loss: 0.01025
Epoch: 26670, Training Loss: 0.00793
Epoch: 26671, Training Loss: 0.00898
Epoch: 26671, Training Loss: 0.00838
Epoch: 26671, Training Loss: 0.01025
Epoch: 26671, Training Loss: 0.00793
Epoch: 26672, Training Loss: 0.00898
Epoch: 26672, Training Loss: 0.00838
Epoch: 26672, Training Loss: 0.01025
Epoch: 26672, Training Loss: 0.00793
Epoch: 26673, Training Loss: 0.00898
Epoch: 26673, Training Loss: 0.00838
Epoch: 26673, Training Loss: 0.01025
Epoch: 26673, Training Loss: 0.00793
Epoch: 26674, Training Loss: 0.00898
Epoch: 26674, Training Loss: 0.00838
Epoch: 26674, Training Loss: 0.01025
Epoch: 26674, Training Loss: 0.00793
Epoch: 26675, Training Loss: 0.00898
Epoch: 26675, Training Loss: 0.00838
Epoch: 26675, Training Loss: 0.01025
Epoch: 26675, Training Loss: 0.00793
Epoch: 26676, Training Loss: 0.00898
Epoch: 26676, Training Loss: 0.00838
Epoch: 26676, Training Loss: 0.01025
Epoch: 26676, Training Loss: 0.00793
Epoch: 26677, Training Loss: 0.00898
Epoch: 26677, Training Loss: 0.00838
Epoch: 26677, Training Loss: 0.01025
Epoch: 26677, Training Loss: 0.00793
Epoch: 26678, Training Loss: 0.00898
Epoch: 26678, Training Loss: 0.00838
Epoch: 26678, Training Loss: 0.01025
Epoch: 26678, Training Loss: 0.00793
Epoch: 26679, Training Loss: 0.00898
Epoch: 26679, Training Loss: 0.00837
Epoch: 26679, Training Loss: 0.01025
Epoch: 26679, Training Loss: 0.00793
Epoch: 26680, Training Loss: 0.00898
Epoch: 26680, Training Loss: 0.00837
Epoch: 26680, Training Loss: 0.01025
Epoch: 26680, Training Loss: 0.00793
Epoch: 26681, Training Loss: 0.00898
Epoch: 26681, Training Loss: 0.00837
Epoch: 26681, Training Loss: 0.01025
Epoch: 26681, Training Loss: 0.00793
Epoch: 26682, Training Loss: 0.00898
Epoch: 26682, Training Loss: 0.00837
Epoch: 26682, Training Loss: 0.01025
Epoch: 26682, Training Loss: 0.00793
Epoch: 26683, Training Loss: 0.00898
Epoch: 26683, Training Loss: 0.00837
Epoch: 26683, Training Loss: 0.01025
Epoch: 26683, Training Loss: 0.00793
Epoch: 26684, Training Loss: 0.00898
Epoch: 26684, Training Loss: 0.00837
Epoch: 26684, Training Loss: 0.01025
Epoch: 26684, Training Loss: 0.00793
Epoch: 26685, Training Loss: 0.00898
Epoch: 26685, Training Loss: 0.00837
Epoch: 26685, Training Loss: 0.01025
Epoch: 26685, Training Loss: 0.00793
Epoch: 26686, Training Loss: 0.00898
Epoch: 26686, Training Loss: 0.00837
Epoch: 26686, Training Loss: 0.01025
Epoch: 26686, Training Loss: 0.00793
Epoch: 26687, Training Loss: 0.00898
Epoch: 26687, Training Loss: 0.00837
Epoch: 26687, Training Loss: 0.01025
Epoch: 26687, Training Loss: 0.00793
Epoch: 26688, Training Loss: 0.00898
Epoch: 26688, Training Loss: 0.00837
Epoch: 26688, Training Loss: 0.01025
Epoch: 26688, Training Loss: 0.00793
Epoch: 26689, Training Loss: 0.00898
Epoch: 26689, Training Loss: 0.00837
Epoch: 26689, Training Loss: 0.01025
Epoch: 26689, Training Loss: 0.00793
Epoch: 26690, Training Loss: 0.00898
Epoch: 26690, Training Loss: 0.00837
Epoch: 26690, Training Loss: 0.01025
Epoch: 26690, Training Loss: 0.00793
Epoch: 26691, Training Loss: 0.00898
Epoch: 26691, Training Loss: 0.00837
Epoch: 26691, Training Loss: 0.01025
Epoch: 26691, Training Loss: 0.00793
Epoch: 26692, Training Loss: 0.00898
Epoch: 26692, Training Loss: 0.00837
Epoch: 26692, Training Loss: 0.01025
Epoch: 26692, Training Loss: 0.00793
Epoch: 26693, Training Loss: 0.00898
Epoch: 26693, Training Loss: 0.00837
Epoch: 26693, Training Loss: 0.01025
Epoch: 26693, Training Loss: 0.00793
Epoch: 26694, Training Loss: 0.00898
Epoch: 26694, Training Loss: 0.00837
Epoch: 26694, Training Loss: 0.01025
Epoch: 26694, Training Loss: 0.00793
Epoch: 26695, Training Loss: 0.00898
Epoch: 26695, Training Loss: 0.00837
Epoch: 26695, Training Loss: 0.01025
Epoch: 26695, Training Loss: 0.00793
Epoch: 26696, Training Loss: 0.00898
Epoch: 26696, Training Loss: 0.00837
Epoch: 26696, Training Loss: 0.01025
Epoch: 26696, Training Loss: 0.00793
Epoch: 26697, Training Loss: 0.00898
Epoch: 26697, Training Loss: 0.00837
Epoch: 26697, Training Loss: 0.01025
Epoch: 26697, Training Loss: 0.00793
Epoch: 26698, Training Loss: 0.00898
Epoch: 26698, Training Loss: 0.00837
Epoch: 26698, Training Loss: 0.01025
Epoch: 26698, Training Loss: 0.00793
Epoch: 26699, Training Loss: 0.00897
Epoch: 26699, Training Loss: 0.00837
Epoch: 26699, Training Loss: 0.01025
Epoch: 26699, Training Loss: 0.00793
Epoch: 26700, Training Loss: 0.00897
Epoch: 26700, Training Loss: 0.00837
Epoch: 26700, Training Loss: 0.01025
Epoch: 26700, Training Loss: 0.00793
Epoch: 26701, Training Loss: 0.00897
Epoch: 26701, Training Loss: 0.00837
Epoch: 26701, Training Loss: 0.01025
Epoch: 26701, Training Loss: 0.00793
Epoch: 26702, Training Loss: 0.00897
Epoch: 26702, Training Loss: 0.00837
Epoch: 26702, Training Loss: 0.01025
Epoch: 26702, Training Loss: 0.00793
Epoch: 26703, Training Loss: 0.00897
Epoch: 26703, Training Loss: 0.00837
Epoch: 26703, Training Loss: 0.01025
Epoch: 26703, Training Loss: 0.00793
Epoch: 26704, Training Loss: 0.00897
Epoch: 26704, Training Loss: 0.00837
Epoch: 26704, Training Loss: 0.01025
Epoch: 26704, Training Loss: 0.00793
Epoch: 26705, Training Loss: 0.00897
Epoch: 26705, Training Loss: 0.00837
Epoch: 26705, Training Loss: 0.01025
Epoch: 26705, Training Loss: 0.00793
Epoch: 26706, Training Loss: 0.00897
Epoch: 26706, Training Loss: 0.00837
Epoch: 26706, Training Loss: 0.01025
Epoch: 26706, Training Loss: 0.00793
Epoch: 26707, Training Loss: 0.00897
Epoch: 26707, Training Loss: 0.00837
Epoch: 26707, Training Loss: 0.01025
Epoch: 26707, Training Loss: 0.00793
Epoch: 26708, Training Loss: 0.00897
Epoch: 26708, Training Loss: 0.00837
Epoch: 26708, Training Loss: 0.01025
Epoch: 26708, Training Loss: 0.00793
Epoch: 26709, Training Loss: 0.00897
Epoch: 26709, Training Loss: 0.00837
Epoch: 26709, Training Loss: 0.01025
Epoch: 26709, Training Loss: 0.00793
Epoch: 26710, Training Loss: 0.00897
Epoch: 26710, Training Loss: 0.00837
Epoch: 26710, Training Loss: 0.01025
Epoch: 26710, Training Loss: 0.00793
Epoch: 26711, Training Loss: 0.00897
Epoch: 26711, Training Loss: 0.00837
Epoch: 26711, Training Loss: 0.01025
Epoch: 26711, Training Loss: 0.00793
Epoch: 26712, Training Loss: 0.00897
Epoch: 26712, Training Loss: 0.00837
Epoch: 26712, Training Loss: 0.01025
Epoch: 26712, Training Loss: 0.00793
Epoch: 26713, Training Loss: 0.00897
Epoch: 26713, Training Loss: 0.00837
Epoch: 26713, Training Loss: 0.01025
Epoch: 26713, Training Loss: 0.00793
Epoch: 26714, Training Loss: 0.00897
Epoch: 26714, Training Loss: 0.00837
Epoch: 26714, Training Loss: 0.01025
Epoch: 26714, Training Loss: 0.00793
Epoch: 26715, Training Loss: 0.00897
Epoch: 26715, Training Loss: 0.00837
Epoch: 26715, Training Loss: 0.01024
Epoch: 26715, Training Loss: 0.00793
Epoch: 26716, Training Loss: 0.00897
Epoch: 26716, Training Loss: 0.00837
Epoch: 26716, Training Loss: 0.01024
Epoch: 26716, Training Loss: 0.00793
Epoch: 26717, Training Loss: 0.00897
Epoch: 26717, Training Loss: 0.00837
Epoch: 26717, Training Loss: 0.01024
Epoch: 26717, Training Loss: 0.00793
Epoch: 26718, Training Loss: 0.00897
Epoch: 26718, Training Loss: 0.00837
Epoch: 26718, Training Loss: 0.01024
Epoch: 26718, Training Loss: 0.00793
Epoch: 26719, Training Loss: 0.00897
Epoch: 26719, Training Loss: 0.00837
Epoch: 26719, Training Loss: 0.01024
Epoch: 26719, Training Loss: 0.00793
Epoch: 26720, Training Loss: 0.00897
Epoch: 26720, Training Loss: 0.00837
Epoch: 26720, Training Loss: 0.01024
Epoch: 26720, Training Loss: 0.00793
Epoch: 26721, Training Loss: 0.00897
Epoch: 26721, Training Loss: 0.00837
Epoch: 26721, Training Loss: 0.01024
Epoch: 26721, Training Loss: 0.00793
Epoch: 26722, Training Loss: 0.00897
Epoch: 26722, Training Loss: 0.00837
Epoch: 26722, Training Loss: 0.01024
Epoch: 26722, Training Loss: 0.00793
Epoch: 26723, Training Loss: 0.00897
Epoch: 26723, Training Loss: 0.00837
Epoch: 26723, Training Loss: 0.01024
Epoch: 26723, Training Loss: 0.00792
Epoch: 26724, Training Loss: 0.00897
Epoch: 26724, Training Loss: 0.00837
Epoch: 26724, Training Loss: 0.01024
Epoch: 26724, Training Loss: 0.00792
Epoch: 26725, Training Loss: 0.00897
Epoch: 26725, Training Loss: 0.00837
Epoch: 26725, Training Loss: 0.01024
Epoch: 26725, Training Loss: 0.00792
Epoch: 26726, Training Loss: 0.00897
Epoch: 26726, Training Loss: 0.00837
Epoch: 26726, Training Loss: 0.01024
Epoch: 26726, Training Loss: 0.00792
Epoch: 26727, Training Loss: 0.00897
Epoch: 26727, Training Loss: 0.00837
Epoch: 26727, Training Loss: 0.01024
Epoch: 26727, Training Loss: 0.00792
Epoch: 26728, Training Loss: 0.00897
Epoch: 26728, Training Loss: 0.00837
Epoch: 26728, Training Loss: 0.01024
Epoch: 26728, Training Loss: 0.00792
Epoch: 26729, Training Loss: 0.00897
Epoch: 26729, Training Loss: 0.00837
Epoch: 26729, Training Loss: 0.01024
Epoch: 26729, Training Loss: 0.00792
Epoch: 26730, Training Loss: 0.00897
Epoch: 26730, Training Loss: 0.00837
Epoch: 26730, Training Loss: 0.01024
Epoch: 26730, Training Loss: 0.00792
Epoch: 26731, Training Loss: 0.00897
Epoch: 26731, Training Loss: 0.00837
Epoch: 26731, Training Loss: 0.01024
Epoch: 26731, Training Loss: 0.00792
Epoch: 26732, Training Loss: 0.00897
Epoch: 26732, Training Loss: 0.00837
Epoch: 26732, Training Loss: 0.01024
Epoch: 26732, Training Loss: 0.00792
Epoch: 26733, Training Loss: 0.00897
Epoch: 26733, Training Loss: 0.00837
Epoch: 26733, Training Loss: 0.01024
Epoch: 26733, Training Loss: 0.00792
Epoch: 26734, Training Loss: 0.00897
Epoch: 26734, Training Loss: 0.00837
Epoch: 26734, Training Loss: 0.01024
Epoch: 26734, Training Loss: 0.00792
Epoch: 26735, Training Loss: 0.00897
Epoch: 26735, Training Loss: 0.00837
Epoch: 26735, Training Loss: 0.01024
Epoch: 26735, Training Loss: 0.00792
Epoch: 26736, Training Loss: 0.00897
Epoch: 26736, Training Loss: 0.00837
Epoch: 26736, Training Loss: 0.01024
Epoch: 26736, Training Loss: 0.00792
Epoch: 26737, Training Loss: 0.00897
Epoch: 26737, Training Loss: 0.00837
Epoch: 26737, Training Loss: 0.01024
Epoch: 26737, Training Loss: 0.00792
Epoch: 26738, Training Loss: 0.00897
Epoch: 26738, Training Loss: 0.00837
Epoch: 26738, Training Loss: 0.01024
Epoch: 26738, Training Loss: 0.00792
Epoch: 26739, Training Loss: 0.00897
Epoch: 26739, Training Loss: 0.00837
Epoch: 26739, Training Loss: 0.01024
Epoch: 26739, Training Loss: 0.00792
Epoch: 26740, Training Loss: 0.00897
Epoch: 26740, Training Loss: 0.00836
Epoch: 26740, Training Loss: 0.01024
Epoch: 26740, Training Loss: 0.00792
Epoch: 26741, Training Loss: 0.00897
Epoch: 26741, Training Loss: 0.00836
Epoch: 26741, Training Loss: 0.01024
Epoch: 26741, Training Loss: 0.00792
Epoch: 26742, Training Loss: 0.00897
Epoch: 26742, Training Loss: 0.00836
Epoch: 26742, Training Loss: 0.01024
Epoch: 26742, Training Loss: 0.00792
Epoch: 26743, Training Loss: 0.00897
Epoch: 26743, Training Loss: 0.00836
Epoch: 26743, Training Loss: 0.01024
Epoch: 26743, Training Loss: 0.00792
Epoch: 26744, Training Loss: 0.00897
Epoch: 26744, Training Loss: 0.00836
Epoch: 26744, Training Loss: 0.01024
Epoch: 26744, Training Loss: 0.00792
Epoch: 26745, Training Loss: 0.00897
Epoch: 26745, Training Loss: 0.00836
Epoch: 26745, Training Loss: 0.01024
Epoch: 26745, Training Loss: 0.00792
Epoch: 26746, Training Loss: 0.00897
Epoch: 26746, Training Loss: 0.00836
Epoch: 26746, Training Loss: 0.01024
Epoch: 26746, Training Loss: 0.00792
Epoch: 26747, Training Loss: 0.00897
Epoch: 26747, Training Loss: 0.00836
Epoch: 26747, Training Loss: 0.01024
Epoch: 26747, Training Loss: 0.00792
Epoch: 26748, Training Loss: 0.00897
Epoch: 26748, Training Loss: 0.00836
Epoch: 26748, Training Loss: 0.01024
Epoch: 26748, Training Loss: 0.00792
Epoch: 26749, Training Loss: 0.00897
Epoch: 26749, Training Loss: 0.00836
Epoch: 26749, Training Loss: 0.01024
Epoch: 26749, Training Loss: 0.00792
Epoch: 26750, Training Loss: 0.00897
Epoch: 26750, Training Loss: 0.00836
Epoch: 26750, Training Loss: 0.01024
Epoch: 26750, Training Loss: 0.00792
Epoch: 26751, Training Loss: 0.00897
Epoch: 26751, Training Loss: 0.00836
Epoch: 26751, Training Loss: 0.01024
Epoch: 26751, Training Loss: 0.00792
Epoch: 26752, Training Loss: 0.00897
Epoch: 26752, Training Loss: 0.00836
Epoch: 26752, Training Loss: 0.01024
Epoch: 26752, Training Loss: 0.00792
Epoch: 26753, Training Loss: 0.00897
Epoch: 26753, Training Loss: 0.00836
Epoch: 26753, Training Loss: 0.01024
Epoch: 26753, Training Loss: 0.00792
Epoch: 26754, Training Loss: 0.00896
Epoch: 26754, Training Loss: 0.00836
Epoch: 26754, Training Loss: 0.01024
Epoch: 26754, Training Loss: 0.00792
Epoch: 26755, Training Loss: 0.00896
Epoch: 26755, Training Loss: 0.00836
Epoch: 26755, Training Loss: 0.01024
Epoch: 26755, Training Loss: 0.00792
Epoch: 26756, Training Loss: 0.00896
Epoch: 26756, Training Loss: 0.00836
Epoch: 26756, Training Loss: 0.01024
Epoch: 26756, Training Loss: 0.00792
Epoch: 26757, Training Loss: 0.00896
Epoch: 26757, Training Loss: 0.00836
Epoch: 26757, Training Loss: 0.01024
Epoch: 26757, Training Loss: 0.00792
Epoch: 26758, Training Loss: 0.00896
Epoch: 26758, Training Loss: 0.00836
Epoch: 26758, Training Loss: 0.01024
Epoch: 26758, Training Loss: 0.00792
Epoch: 26759, Training Loss: 0.00896
Epoch: 26759, Training Loss: 0.00836
Epoch: 26759, Training Loss: 0.01024
Epoch: 26759, Training Loss: 0.00792
Epoch: 26760, Training Loss: 0.00896
Epoch: 26760, Training Loss: 0.00836
Epoch: 26760, Training Loss: 0.01024
Epoch: 26760, Training Loss: 0.00792
Epoch: 26761, Training Loss: 0.00896
Epoch: 26761, Training Loss: 0.00836
Epoch: 26761, Training Loss: 0.01024
Epoch: 26761, Training Loss: 0.00792
Epoch: 26762, Training Loss: 0.00896
Epoch: 26762, Training Loss: 0.00836
Epoch: 26762, Training Loss: 0.01024
Epoch: 26762, Training Loss: 0.00792
Epoch: 26763, Training Loss: 0.00896
Epoch: 26763, Training Loss: 0.00836
Epoch: 26763, Training Loss: 0.01024
Epoch: 26763, Training Loss: 0.00792
Epoch: 26764, Training Loss: 0.00896
Epoch: 26764, Training Loss: 0.00836
Epoch: 26764, Training Loss: 0.01023
Epoch: 26764, Training Loss: 0.00792
Epoch: 26765, Training Loss: 0.00896
Epoch: 26765, Training Loss: 0.00836
Epoch: 26765, Training Loss: 0.01023
Epoch: 26765, Training Loss: 0.00792
Epoch: 26766, Training Loss: 0.00896
Epoch: 26766, Training Loss: 0.00836
Epoch: 26766, Training Loss: 0.01023
Epoch: 26766, Training Loss: 0.00792
Epoch: 26767, Training Loss: 0.00896
Epoch: 26767, Training Loss: 0.00836
Epoch: 26767, Training Loss: 0.01023
Epoch: 26767, Training Loss: 0.00792
Epoch: 26768, Training Loss: 0.00896
Epoch: 26768, Training Loss: 0.00836
Epoch: 26768, Training Loss: 0.01023
Epoch: 26768, Training Loss: 0.00792
Epoch: 26769, Training Loss: 0.00896
Epoch: 26769, Training Loss: 0.00836
Epoch: 26769, Training Loss: 0.01023
Epoch: 26769, Training Loss: 0.00792
Epoch: 26770, Training Loss: 0.00896
Epoch: 26770, Training Loss: 0.00836
Epoch: 26770, Training Loss: 0.01023
Epoch: 26770, Training Loss: 0.00792
Epoch: 26771, Training Loss: 0.00896
Epoch: 26771, Training Loss: 0.00836
Epoch: 26771, Training Loss: 0.01023
Epoch: 26771, Training Loss: 0.00792
Epoch: 26772, Training Loss: 0.00896
Epoch: 26772, Training Loss: 0.00836
Epoch: 26772, Training Loss: 0.01023
Epoch: 26772, Training Loss: 0.00792
Epoch: 26773, Training Loss: 0.00896
Epoch: 26773, Training Loss: 0.00836
Epoch: 26773, Training Loss: 0.01023
Epoch: 26773, Training Loss: 0.00792
Epoch: 26774, Training Loss: 0.00896
Epoch: 26774, Training Loss: 0.00836
Epoch: 26774, Training Loss: 0.01023
Epoch: 26774, Training Loss: 0.00792
Epoch: 26775, Training Loss: 0.00896
Epoch: 26775, Training Loss: 0.00836
Epoch: 26775, Training Loss: 0.01023
Epoch: 26775, Training Loss: 0.00792
Epoch: 26776, Training Loss: 0.00896
Epoch: 26776, Training Loss: 0.00836
Epoch: 26776, Training Loss: 0.01023
Epoch: 26776, Training Loss: 0.00792
Epoch: 26777, Training Loss: 0.00896
Epoch: 26777, Training Loss: 0.00836
Epoch: 26777, Training Loss: 0.01023
Epoch: 26777, Training Loss: 0.00792
Epoch: 26778, Training Loss: 0.00896
Epoch: 26778, Training Loss: 0.00836
Epoch: 26778, Training Loss: 0.01023
Epoch: 26778, Training Loss: 0.00792
Epoch: 26779, Training Loss: 0.00896
Epoch: 26779, Training Loss: 0.00836
Epoch: 26779, Training Loss: 0.01023
Epoch: 26779, Training Loss: 0.00792
Epoch: 26780, Training Loss: 0.00896
Epoch: 26780, Training Loss: 0.00836
Epoch: 26780, Training Loss: 0.01023
Epoch: 26780, Training Loss: 0.00792
Epoch: 26781, Training Loss: 0.00896
Epoch: 26781, Training Loss: 0.00836
Epoch: 26781, Training Loss: 0.01023
Epoch: 26781, Training Loss: 0.00792
Epoch: 26782, Training Loss: 0.00896
Epoch: 26782, Training Loss: 0.00836
Epoch: 26782, Training Loss: 0.01023
Epoch: 26782, Training Loss: 0.00792
Epoch: 26783, Training Loss: 0.00896
Epoch: 26783, Training Loss: 0.00836
Epoch: 26783, Training Loss: 0.01023
Epoch: 26783, Training Loss: 0.00792
Epoch: 26784, Training Loss: 0.00896
Epoch: 26784, Training Loss: 0.00836
Epoch: 26784, Training Loss: 0.01023
Epoch: 26784, Training Loss: 0.00792
Epoch: 26785, Training Loss: 0.00896
Epoch: 26785, Training Loss: 0.00836
Epoch: 26785, Training Loss: 0.01023
Epoch: 26785, Training Loss: 0.00792
Epoch: 26786, Training Loss: 0.00896
Epoch: 26786, Training Loss: 0.00836
Epoch: 26786, Training Loss: 0.01023
Epoch: 26786, Training Loss: 0.00792
Epoch: 26787, Training Loss: 0.00896
Epoch: 26787, Training Loss: 0.00836
Epoch: 26787, Training Loss: 0.01023
Epoch: 26787, Training Loss: 0.00791
Epoch: 26788, Training Loss: 0.00896
Epoch: 26788, Training Loss: 0.00836
Epoch: 26788, Training Loss: 0.01023
Epoch: 26788, Training Loss: 0.00791
Epoch: 26789, Training Loss: 0.00896
Epoch: 26789, Training Loss: 0.00836
Epoch: 26789, Training Loss: 0.01023
Epoch: 26789, Training Loss: 0.00791
Epoch: 26790, Training Loss: 0.00896
Epoch: 26790, Training Loss: 0.00836
Epoch: 26790, Training Loss: 0.01023
Epoch: 26790, Training Loss: 0.00791
Epoch: 26791, Training Loss: 0.00896
Epoch: 26791, Training Loss: 0.00836
Epoch: 26791, Training Loss: 0.01023
Epoch: 26791, Training Loss: 0.00791
Epoch: 26792, Training Loss: 0.00896
Epoch: 26792, Training Loss: 0.00836
Epoch: 26792, Training Loss: 0.01023
Epoch: 26792, Training Loss: 0.00791
Epoch: 26793, Training Loss: 0.00896
Epoch: 26793, Training Loss: 0.00836
Epoch: 26793, Training Loss: 0.01023
Epoch: 26793, Training Loss: 0.00791
Epoch: 26794, Training Loss: 0.00896
Epoch: 26794, Training Loss: 0.00836
Epoch: 26794, Training Loss: 0.01023
Epoch: 26794, Training Loss: 0.00791
Epoch: 26795, Training Loss: 0.00896
Epoch: 26795, Training Loss: 0.00836
Epoch: 26795, Training Loss: 0.01023
Epoch: 26795, Training Loss: 0.00791
Epoch: 26796, Training Loss: 0.00896
Epoch: 26796, Training Loss: 0.00836
Epoch: 26796, Training Loss: 0.01023
Epoch: 26796, Training Loss: 0.00791
Epoch: 26797, Training Loss: 0.00896
Epoch: 26797, Training Loss: 0.00836
Epoch: 26797, Training Loss: 0.01023
Epoch: 26797, Training Loss: 0.00791
Epoch: 26798, Training Loss: 0.00896
Epoch: 26798, Training Loss: 0.00836
Epoch: 26798, Training Loss: 0.01023
Epoch: 26798, Training Loss: 0.00791
Epoch: 26799, Training Loss: 0.00896
Epoch: 26799, Training Loss: 0.00836
Epoch: 26799, Training Loss: 0.01023
Epoch: 26799, Training Loss: 0.00791
Epoch: 26800, Training Loss: 0.00896
Epoch: 26800, Training Loss: 0.00836
Epoch: 26800, Training Loss: 0.01023
Epoch: 26800, Training Loss: 0.00791
Epoch: 26801, Training Loss: 0.00896
Epoch: 26801, Training Loss: 0.00835
Epoch: 26801, Training Loss: 0.01023
Epoch: 26801, Training Loss: 0.00791
Epoch: 26802, Training Loss: 0.00896
Epoch: 26802, Training Loss: 0.00835
Epoch: 26802, Training Loss: 0.01023
Epoch: 26802, Training Loss: 0.00791
Epoch: 26803, Training Loss: 0.00896
Epoch: 26803, Training Loss: 0.00835
Epoch: 26803, Training Loss: 0.01023
Epoch: 26803, Training Loss: 0.00791
Epoch: 26804, Training Loss: 0.00896
Epoch: 26804, Training Loss: 0.00835
Epoch: 26804, Training Loss: 0.01023
Epoch: 26804, Training Loss: 0.00791
Epoch: 26805, Training Loss: 0.00896
Epoch: 26805, Training Loss: 0.00835
Epoch: 26805, Training Loss: 0.01023
Epoch: 26805, Training Loss: 0.00791
Epoch: 26806, Training Loss: 0.00896
Epoch: 26806, Training Loss: 0.00835
Epoch: 26806, Training Loss: 0.01023
Epoch: 26806, Training Loss: 0.00791
Epoch: 26807, Training Loss: 0.00896
Epoch: 26807, Training Loss: 0.00835
Epoch: 26807, Training Loss: 0.01023
Epoch: 26807, Training Loss: 0.00791
Epoch: 26808, Training Loss: 0.00896
Epoch: 26808, Training Loss: 0.00835
Epoch: 26808, Training Loss: 0.01023
Epoch: 26808, Training Loss: 0.00791
Epoch: 26809, Training Loss: 0.00896
Epoch: 26809, Training Loss: 0.00835
Epoch: 26809, Training Loss: 0.01023
Epoch: 26809, Training Loss: 0.00791
Epoch: 26810, Training Loss: 0.00895
Epoch: 26810, Training Loss: 0.00835
Epoch: 26810, Training Loss: 0.01023
Epoch: 26810, Training Loss: 0.00791
Epoch: 26811, Training Loss: 0.00895
Epoch: 26811, Training Loss: 0.00835
Epoch: 26811, Training Loss: 0.01023
Epoch: 26811, Training Loss: 0.00791
Epoch: 26812, Training Loss: 0.00895
Epoch: 26812, Training Loss: 0.00835
Epoch: 26812, Training Loss: 0.01023
Epoch: 26812, Training Loss: 0.00791
Epoch: 26813, Training Loss: 0.00895
Epoch: 26813, Training Loss: 0.00835
Epoch: 26813, Training Loss: 0.01022
Epoch: 26813, Training Loss: 0.00791
Epoch: 26814, Training Loss: 0.00895
Epoch: 26814, Training Loss: 0.00835
Epoch: 26814, Training Loss: 0.01022
Epoch: 26814, Training Loss: 0.00791
Epoch: 26815, Training Loss: 0.00895
Epoch: 26815, Training Loss: 0.00835
Epoch: 26815, Training Loss: 0.01022
Epoch: 26815, Training Loss: 0.00791
Epoch: 26816, Training Loss: 0.00895
Epoch: 26816, Training Loss: 0.00835
Epoch: 26816, Training Loss: 0.01022
Epoch: 26816, Training Loss: 0.00791
Epoch: 26817, Training Loss: 0.00895
Epoch: 26817, Training Loss: 0.00835
Epoch: 26817, Training Loss: 0.01022
Epoch: 26817, Training Loss: 0.00791
Epoch: 26818, Training Loss: 0.00895
Epoch: 26818, Training Loss: 0.00835
Epoch: 26818, Training Loss: 0.01022
Epoch: 26818, Training Loss: 0.00791
Epoch: 26819, Training Loss: 0.00895
Epoch: 26819, Training Loss: 0.00835
Epoch: 26819, Training Loss: 0.01022
Epoch: 26819, Training Loss: 0.00791
Epoch: 26820, Training Loss: 0.00895
Epoch: 26820, Training Loss: 0.00835
Epoch: 26820, Training Loss: 0.01022
Epoch: 26820, Training Loss: 0.00791
Epoch: 26821, Training Loss: 0.00895
Epoch: 26821, Training Loss: 0.00835
Epoch: 26821, Training Loss: 0.01022
Epoch: 26821, Training Loss: 0.00791
Epoch: 26822, Training Loss: 0.00895
Epoch: 26822, Training Loss: 0.00835
Epoch: 26822, Training Loss: 0.01022
Epoch: 26822, Training Loss: 0.00791
Epoch: 26823, Training Loss: 0.00895
Epoch: 26823, Training Loss: 0.00835
Epoch: 26823, Training Loss: 0.01022
Epoch: 26823, Training Loss: 0.00791
Epoch: 26824, Training Loss: 0.00895
Epoch: 26824, Training Loss: 0.00835
Epoch: 26824, Training Loss: 0.01022
Epoch: 26824, Training Loss: 0.00791
Epoch: 26825, Training Loss: 0.00895
Epoch: 26825, Training Loss: 0.00835
Epoch: 26825, Training Loss: 0.01022
Epoch: 26825, Training Loss: 0.00791
Epoch: 26826, Training Loss: 0.00895
Epoch: 26826, Training Loss: 0.00835
Epoch: 26826, Training Loss: 0.01022
Epoch: 26826, Training Loss: 0.00791
Epoch: 26827, Training Loss: 0.00895
Epoch: 26827, Training Loss: 0.00835
Epoch: 26827, Training Loss: 0.01022
Epoch: 26827, Training Loss: 0.00791
Epoch: 26828, Training Loss: 0.00895
Epoch: 26828, Training Loss: 0.00835
Epoch: 26828, Training Loss: 0.01022
Epoch: 26828, Training Loss: 0.00791
Epoch: 26829, Training Loss: 0.00895
Epoch: 26829, Training Loss: 0.00835
Epoch: 26829, Training Loss: 0.01022
Epoch: 26829, Training Loss: 0.00791
Epoch: 26830, Training Loss: 0.00895
Epoch: 26830, Training Loss: 0.00835
Epoch: 26830, Training Loss: 0.01022
Epoch: 26830, Training Loss: 0.00791
Epoch: 26831, Training Loss: 0.00895
Epoch: 26831, Training Loss: 0.00835
Epoch: 26831, Training Loss: 0.01022
Epoch: 26831, Training Loss: 0.00791
Epoch: 26832, Training Loss: 0.00895
Epoch: 26832, Training Loss: 0.00835
Epoch: 26832, Training Loss: 0.01022
Epoch: 26832, Training Loss: 0.00791
Epoch: 26833, Training Loss: 0.00895
Epoch: 26833, Training Loss: 0.00835
Epoch: 26833, Training Loss: 0.01022
Epoch: 26833, Training Loss: 0.00791
Epoch: 26834, Training Loss: 0.00895
Epoch: 26834, Training Loss: 0.00835
Epoch: 26834, Training Loss: 0.01022
Epoch: 26834, Training Loss: 0.00791
Epoch: 26835, Training Loss: 0.00895
Epoch: 26835, Training Loss: 0.00835
Epoch: 26835, Training Loss: 0.01022
Epoch: 26835, Training Loss: 0.00791
Epoch: 26836, Training Loss: 0.00895
Epoch: 26836, Training Loss: 0.00835
Epoch: 26836, Training Loss: 0.01022
Epoch: 26836, Training Loss: 0.00791
Epoch: 26837, Training Loss: 0.00895
Epoch: 26837, Training Loss: 0.00835
Epoch: 26837, Training Loss: 0.01022
Epoch: 26837, Training Loss: 0.00791
Epoch: 26838, Training Loss: 0.00895
Epoch: 26838, Training Loss: 0.00835
Epoch: 26838, Training Loss: 0.01022
Epoch: 26838, Training Loss: 0.00791
Epoch: 26839, Training Loss: 0.00895
Epoch: 26839, Training Loss: 0.00835
Epoch: 26839, Training Loss: 0.01022
Epoch: 26839, Training Loss: 0.00791
Epoch: 26840, Training Loss: 0.00895
Epoch: 26840, Training Loss: 0.00835
Epoch: 26840, Training Loss: 0.01022
Epoch: 26840, Training Loss: 0.00791
Epoch: 26841, Training Loss: 0.00895
Epoch: 26841, Training Loss: 0.00835
Epoch: 26841, Training Loss: 0.01022
Epoch: 26841, Training Loss: 0.00791
Epoch: 26842, Training Loss: 0.00895
Epoch: 26842, Training Loss: 0.00835
Epoch: 26842, Training Loss: 0.01022
Epoch: 26842, Training Loss: 0.00791
Epoch: 26843, Training Loss: 0.00895
Epoch: 26843, Training Loss: 0.00835
Epoch: 26843, Training Loss: 0.01022
Epoch: 26843, Training Loss: 0.00791
Epoch: 26844, Training Loss: 0.00895
Epoch: 26844, Training Loss: 0.00835
Epoch: 26844, Training Loss: 0.01022
Epoch: 26844, Training Loss: 0.00791
Epoch: 26845, Training Loss: 0.00895
Epoch: 26845, Training Loss: 0.00835
Epoch: 26845, Training Loss: 0.01022
Epoch: 26845, Training Loss: 0.00791
Epoch: 26846, Training Loss: 0.00895
Epoch: 26846, Training Loss: 0.00835
Epoch: 26846, Training Loss: 0.01022
Epoch: 26846, Training Loss: 0.00791
Epoch: 26847, Training Loss: 0.00895
Epoch: 26847, Training Loss: 0.00835
Epoch: 26847, Training Loss: 0.01022
Epoch: 26847, Training Loss: 0.00791
Epoch: 26848, Training Loss: 0.00895
Epoch: 26848, Training Loss: 0.00835
Epoch: 26848, Training Loss: 0.01022
Epoch: 26848, Training Loss: 0.00791
Epoch: 26849, Training Loss: 0.00895
Epoch: 26849, Training Loss: 0.00835
Epoch: 26849, Training Loss: 0.01022
Epoch: 26849, Training Loss: 0.00791
Epoch: 26850, Training Loss: 0.00895
Epoch: 26850, Training Loss: 0.00835
Epoch: 26850, Training Loss: 0.01022
Epoch: 26850, Training Loss: 0.00791
Epoch: 26851, Training Loss: 0.00895
Epoch: 26851, Training Loss: 0.00835
Epoch: 26851, Training Loss: 0.01022
Epoch: 26851, Training Loss: 0.00791
Epoch: 26852, Training Loss: 0.00895
Epoch: 26852, Training Loss: 0.00835
Epoch: 26852, Training Loss: 0.01022
Epoch: 26852, Training Loss: 0.00790
Epoch: 26853, Training Loss: 0.00895
Epoch: 26853, Training Loss: 0.00835
Epoch: 26853, Training Loss: 0.01022
Epoch: 26853, Training Loss: 0.00790
Epoch: 26854, Training Loss: 0.00895
Epoch: 26854, Training Loss: 0.00835
Epoch: 26854, Training Loss: 0.01022
Epoch: 26854, Training Loss: 0.00790
Epoch: 26855, Training Loss: 0.00895
Epoch: 26855, Training Loss: 0.00835
Epoch: 26855, Training Loss: 0.01022
Epoch: 26855, Training Loss: 0.00790
Epoch: 26856, Training Loss: 0.00895
Epoch: 26856, Training Loss: 0.00835
Epoch: 26856, Training Loss: 0.01022
Epoch: 26856, Training Loss: 0.00790
...
Epoch: 27099, Training Loss: 0.00890
[log elided: the four per-pattern XOR losses decrease only gradually over this range, from roughly (0.00895, 0.00835, 0.01022, 0.00790) at epoch 26856 to roughly (0.00890, 0.00831, 0.01017, 0.00787) at epoch 27099]
Epoch: 27099, Training Loss: 0.00831
Epoch: 27099, Training Loss: 0.01017
Epoch: 27099, Training Loss: 0.00787
Epoch: 27100, Training Loss: 0.00890
Epoch: 27100, Training Loss: 0.00831
Epoch: 27100, Training Loss: 0.01017
Epoch: 27100, Training Loss: 0.00787
Epoch: 27101, Training Loss: 0.00890
Epoch: 27101, Training Loss: 0.00831
Epoch: 27101, Training Loss: 0.01017
Epoch: 27101, Training Loss: 0.00787
Epoch: 27102, Training Loss: 0.00890
Epoch: 27102, Training Loss: 0.00831
Epoch: 27102, Training Loss: 0.01017
Epoch: 27102, Training Loss: 0.00787
Epoch: 27103, Training Loss: 0.00890
Epoch: 27103, Training Loss: 0.00831
Epoch: 27103, Training Loss: 0.01017
Epoch: 27103, Training Loss: 0.00787
Epoch: 27104, Training Loss: 0.00890
Epoch: 27104, Training Loss: 0.00831
Epoch: 27104, Training Loss: 0.01017
Epoch: 27104, Training Loss: 0.00787
Epoch: 27105, Training Loss: 0.00890
Epoch: 27105, Training Loss: 0.00831
Epoch: 27105, Training Loss: 0.01017
Epoch: 27105, Training Loss: 0.00787
Epoch: 27106, Training Loss: 0.00890
Epoch: 27106, Training Loss: 0.00831
Epoch: 27106, Training Loss: 0.01017
Epoch: 27106, Training Loss: 0.00787
Epoch: 27107, Training Loss: 0.00890
Epoch: 27107, Training Loss: 0.00831
Epoch: 27107, Training Loss: 0.01017
Epoch: 27107, Training Loss: 0.00787
Epoch: 27108, Training Loss: 0.00890
Epoch: 27108, Training Loss: 0.00831
Epoch: 27108, Training Loss: 0.01017
Epoch: 27108, Training Loss: 0.00787
Epoch: 27109, Training Loss: 0.00890
Epoch: 27109, Training Loss: 0.00830
Epoch: 27109, Training Loss: 0.01017
Epoch: 27109, Training Loss: 0.00787
Epoch: 27110, Training Loss: 0.00890
Epoch: 27110, Training Loss: 0.00830
Epoch: 27110, Training Loss: 0.01017
Epoch: 27110, Training Loss: 0.00787
Epoch: 27111, Training Loss: 0.00890
Epoch: 27111, Training Loss: 0.00830
Epoch: 27111, Training Loss: 0.01016
Epoch: 27111, Training Loss: 0.00787
Epoch: 27112, Training Loss: 0.00890
Epoch: 27112, Training Loss: 0.00830
Epoch: 27112, Training Loss: 0.01016
Epoch: 27112, Training Loss: 0.00786
Epoch: 27113, Training Loss: 0.00890
Epoch: 27113, Training Loss: 0.00830
Epoch: 27113, Training Loss: 0.01016
Epoch: 27113, Training Loss: 0.00786
Epoch: 27114, Training Loss: 0.00890
Epoch: 27114, Training Loss: 0.00830
Epoch: 27114, Training Loss: 0.01016
Epoch: 27114, Training Loss: 0.00786
Epoch: 27115, Training Loss: 0.00890
Epoch: 27115, Training Loss: 0.00830
Epoch: 27115, Training Loss: 0.01016
Epoch: 27115, Training Loss: 0.00786
Epoch: 27116, Training Loss: 0.00890
Epoch: 27116, Training Loss: 0.00830
Epoch: 27116, Training Loss: 0.01016
Epoch: 27116, Training Loss: 0.00786
Epoch: 27117, Training Loss: 0.00890
Epoch: 27117, Training Loss: 0.00830
Epoch: 27117, Training Loss: 0.01016
Epoch: 27117, Training Loss: 0.00786
Epoch: 27118, Training Loss: 0.00890
Epoch: 27118, Training Loss: 0.00830
Epoch: 27118, Training Loss: 0.01016
Epoch: 27118, Training Loss: 0.00786
Epoch: 27119, Training Loss: 0.00890
Epoch: 27119, Training Loss: 0.00830
Epoch: 27119, Training Loss: 0.01016
Epoch: 27119, Training Loss: 0.00786
Epoch: 27120, Training Loss: 0.00890
Epoch: 27120, Training Loss: 0.00830
Epoch: 27120, Training Loss: 0.01016
Epoch: 27120, Training Loss: 0.00786
Epoch: 27121, Training Loss: 0.00890
Epoch: 27121, Training Loss: 0.00830
Epoch: 27121, Training Loss: 0.01016
Epoch: 27121, Training Loss: 0.00786
Epoch: 27122, Training Loss: 0.00890
Epoch: 27122, Training Loss: 0.00830
Epoch: 27122, Training Loss: 0.01016
Epoch: 27122, Training Loss: 0.00786
Epoch: 27123, Training Loss: 0.00890
Epoch: 27123, Training Loss: 0.00830
Epoch: 27123, Training Loss: 0.01016
Epoch: 27123, Training Loss: 0.00786
Epoch: 27124, Training Loss: 0.00890
Epoch: 27124, Training Loss: 0.00830
Epoch: 27124, Training Loss: 0.01016
Epoch: 27124, Training Loss: 0.00786
Epoch: 27125, Training Loss: 0.00890
Epoch: 27125, Training Loss: 0.00830
Epoch: 27125, Training Loss: 0.01016
Epoch: 27125, Training Loss: 0.00786
Epoch: 27126, Training Loss: 0.00890
Epoch: 27126, Training Loss: 0.00830
Epoch: 27126, Training Loss: 0.01016
Epoch: 27126, Training Loss: 0.00786
Epoch: 27127, Training Loss: 0.00890
Epoch: 27127, Training Loss: 0.00830
Epoch: 27127, Training Loss: 0.01016
Epoch: 27127, Training Loss: 0.00786
Epoch: 27128, Training Loss: 0.00890
Epoch: 27128, Training Loss: 0.00830
Epoch: 27128, Training Loss: 0.01016
Epoch: 27128, Training Loss: 0.00786
Epoch: 27129, Training Loss: 0.00890
Epoch: 27129, Training Loss: 0.00830
Epoch: 27129, Training Loss: 0.01016
Epoch: 27129, Training Loss: 0.00786
Epoch: 27130, Training Loss: 0.00890
Epoch: 27130, Training Loss: 0.00830
Epoch: 27130, Training Loss: 0.01016
Epoch: 27130, Training Loss: 0.00786
Epoch: 27131, Training Loss: 0.00890
Epoch: 27131, Training Loss: 0.00830
Epoch: 27131, Training Loss: 0.01016
Epoch: 27131, Training Loss: 0.00786
Epoch: 27132, Training Loss: 0.00890
Epoch: 27132, Training Loss: 0.00830
Epoch: 27132, Training Loss: 0.01016
Epoch: 27132, Training Loss: 0.00786
Epoch: 27133, Training Loss: 0.00890
Epoch: 27133, Training Loss: 0.00830
Epoch: 27133, Training Loss: 0.01016
Epoch: 27133, Training Loss: 0.00786
Epoch: 27134, Training Loss: 0.00890
Epoch: 27134, Training Loss: 0.00830
Epoch: 27134, Training Loss: 0.01016
Epoch: 27134, Training Loss: 0.00786
Epoch: 27135, Training Loss: 0.00890
Epoch: 27135, Training Loss: 0.00830
Epoch: 27135, Training Loss: 0.01016
Epoch: 27135, Training Loss: 0.00786
Epoch: 27136, Training Loss: 0.00890
Epoch: 27136, Training Loss: 0.00830
Epoch: 27136, Training Loss: 0.01016
Epoch: 27136, Training Loss: 0.00786
Epoch: 27137, Training Loss: 0.00890
Epoch: 27137, Training Loss: 0.00830
Epoch: 27137, Training Loss: 0.01016
Epoch: 27137, Training Loss: 0.00786
Epoch: 27138, Training Loss: 0.00890
Epoch: 27138, Training Loss: 0.00830
Epoch: 27138, Training Loss: 0.01016
Epoch: 27138, Training Loss: 0.00786
Epoch: 27139, Training Loss: 0.00890
Epoch: 27139, Training Loss: 0.00830
Epoch: 27139, Training Loss: 0.01016
Epoch: 27139, Training Loss: 0.00786
Epoch: 27140, Training Loss: 0.00890
Epoch: 27140, Training Loss: 0.00830
Epoch: 27140, Training Loss: 0.01016
Epoch: 27140, Training Loss: 0.00786
Epoch: 27141, Training Loss: 0.00890
Epoch: 27141, Training Loss: 0.00830
Epoch: 27141, Training Loss: 0.01016
Epoch: 27141, Training Loss: 0.00786
Epoch: 27142, Training Loss: 0.00890
Epoch: 27142, Training Loss: 0.00830
Epoch: 27142, Training Loss: 0.01016
Epoch: 27142, Training Loss: 0.00786
Epoch: 27143, Training Loss: 0.00890
Epoch: 27143, Training Loss: 0.00830
Epoch: 27143, Training Loss: 0.01016
Epoch: 27143, Training Loss: 0.00786
Epoch: 27144, Training Loss: 0.00890
Epoch: 27144, Training Loss: 0.00830
Epoch: 27144, Training Loss: 0.01016
Epoch: 27144, Training Loss: 0.00786
Epoch: 27145, Training Loss: 0.00890
Epoch: 27145, Training Loss: 0.00830
Epoch: 27145, Training Loss: 0.01016
Epoch: 27145, Training Loss: 0.00786
Epoch: 27146, Training Loss: 0.00890
Epoch: 27146, Training Loss: 0.00830
Epoch: 27146, Training Loss: 0.01016
Epoch: 27146, Training Loss: 0.00786
Epoch: 27147, Training Loss: 0.00890
Epoch: 27147, Training Loss: 0.00830
Epoch: 27147, Training Loss: 0.01016
Epoch: 27147, Training Loss: 0.00786
Epoch: 27148, Training Loss: 0.00890
Epoch: 27148, Training Loss: 0.00830
Epoch: 27148, Training Loss: 0.01016
Epoch: 27148, Training Loss: 0.00786
Epoch: 27149, Training Loss: 0.00890
Epoch: 27149, Training Loss: 0.00830
Epoch: 27149, Training Loss: 0.01016
Epoch: 27149, Training Loss: 0.00786
Epoch: 27150, Training Loss: 0.00889
Epoch: 27150, Training Loss: 0.00830
Epoch: 27150, Training Loss: 0.01016
Epoch: 27150, Training Loss: 0.00786
Epoch: 27151, Training Loss: 0.00889
Epoch: 27151, Training Loss: 0.00830
Epoch: 27151, Training Loss: 0.01016
Epoch: 27151, Training Loss: 0.00786
Epoch: 27152, Training Loss: 0.00889
Epoch: 27152, Training Loss: 0.00830
Epoch: 27152, Training Loss: 0.01016
Epoch: 27152, Training Loss: 0.00786
Epoch: 27153, Training Loss: 0.00889
Epoch: 27153, Training Loss: 0.00830
Epoch: 27153, Training Loss: 0.01016
Epoch: 27153, Training Loss: 0.00786
Epoch: 27154, Training Loss: 0.00889
Epoch: 27154, Training Loss: 0.00830
Epoch: 27154, Training Loss: 0.01016
Epoch: 27154, Training Loss: 0.00786
Epoch: 27155, Training Loss: 0.00889
Epoch: 27155, Training Loss: 0.00830
Epoch: 27155, Training Loss: 0.01016
Epoch: 27155, Training Loss: 0.00786
Epoch: 27156, Training Loss: 0.00889
Epoch: 27156, Training Loss: 0.00830
Epoch: 27156, Training Loss: 0.01016
Epoch: 27156, Training Loss: 0.00786
Epoch: 27157, Training Loss: 0.00889
Epoch: 27157, Training Loss: 0.00830
Epoch: 27157, Training Loss: 0.01016
Epoch: 27157, Training Loss: 0.00786
Epoch: 27158, Training Loss: 0.00889
Epoch: 27158, Training Loss: 0.00830
Epoch: 27158, Training Loss: 0.01016
Epoch: 27158, Training Loss: 0.00786
Epoch: 27159, Training Loss: 0.00889
Epoch: 27159, Training Loss: 0.00830
Epoch: 27159, Training Loss: 0.01016
Epoch: 27159, Training Loss: 0.00786
Epoch: 27160, Training Loss: 0.00889
Epoch: 27160, Training Loss: 0.00830
Epoch: 27160, Training Loss: 0.01016
Epoch: 27160, Training Loss: 0.00786
Epoch: 27161, Training Loss: 0.00889
Epoch: 27161, Training Loss: 0.00830
Epoch: 27161, Training Loss: 0.01015
Epoch: 27161, Training Loss: 0.00786
Epoch: 27162, Training Loss: 0.00889
Epoch: 27162, Training Loss: 0.00830
Epoch: 27162, Training Loss: 0.01015
Epoch: 27162, Training Loss: 0.00786
Epoch: 27163, Training Loss: 0.00889
Epoch: 27163, Training Loss: 0.00830
Epoch: 27163, Training Loss: 0.01015
Epoch: 27163, Training Loss: 0.00786
Epoch: 27164, Training Loss: 0.00889
Epoch: 27164, Training Loss: 0.00830
Epoch: 27164, Training Loss: 0.01015
Epoch: 27164, Training Loss: 0.00786
Epoch: 27165, Training Loss: 0.00889
Epoch: 27165, Training Loss: 0.00830
Epoch: 27165, Training Loss: 0.01015
Epoch: 27165, Training Loss: 0.00786
Epoch: 27166, Training Loss: 0.00889
Epoch: 27166, Training Loss: 0.00830
Epoch: 27166, Training Loss: 0.01015
Epoch: 27166, Training Loss: 0.00786
Epoch: 27167, Training Loss: 0.00889
Epoch: 27167, Training Loss: 0.00830
Epoch: 27167, Training Loss: 0.01015
Epoch: 27167, Training Loss: 0.00786
Epoch: 27168, Training Loss: 0.00889
Epoch: 27168, Training Loss: 0.00830
Epoch: 27168, Training Loss: 0.01015
Epoch: 27168, Training Loss: 0.00786
Epoch: 27169, Training Loss: 0.00889
Epoch: 27169, Training Loss: 0.00830
Epoch: 27169, Training Loss: 0.01015
Epoch: 27169, Training Loss: 0.00786
Epoch: 27170, Training Loss: 0.00889
Epoch: 27170, Training Loss: 0.00830
Epoch: 27170, Training Loss: 0.01015
Epoch: 27170, Training Loss: 0.00786
Epoch: 27171, Training Loss: 0.00889
Epoch: 27171, Training Loss: 0.00829
Epoch: 27171, Training Loss: 0.01015
Epoch: 27171, Training Loss: 0.00786
Epoch: 27172, Training Loss: 0.00889
Epoch: 27172, Training Loss: 0.00829
Epoch: 27172, Training Loss: 0.01015
Epoch: 27172, Training Loss: 0.00786
Epoch: 27173, Training Loss: 0.00889
Epoch: 27173, Training Loss: 0.00829
Epoch: 27173, Training Loss: 0.01015
Epoch: 27173, Training Loss: 0.00786
Epoch: 27174, Training Loss: 0.00889
Epoch: 27174, Training Loss: 0.00829
Epoch: 27174, Training Loss: 0.01015
Epoch: 27174, Training Loss: 0.00786
Epoch: 27175, Training Loss: 0.00889
Epoch: 27175, Training Loss: 0.00829
Epoch: 27175, Training Loss: 0.01015
Epoch: 27175, Training Loss: 0.00786
Epoch: 27176, Training Loss: 0.00889
Epoch: 27176, Training Loss: 0.00829
Epoch: 27176, Training Loss: 0.01015
Epoch: 27176, Training Loss: 0.00786
Epoch: 27177, Training Loss: 0.00889
Epoch: 27177, Training Loss: 0.00829
Epoch: 27177, Training Loss: 0.01015
Epoch: 27177, Training Loss: 0.00786
Epoch: 27178, Training Loss: 0.00889
Epoch: 27178, Training Loss: 0.00829
Epoch: 27178, Training Loss: 0.01015
Epoch: 27178, Training Loss: 0.00785
Epoch: 27179, Training Loss: 0.00889
Epoch: 27179, Training Loss: 0.00829
Epoch: 27179, Training Loss: 0.01015
Epoch: 27179, Training Loss: 0.00785
Epoch: 27180, Training Loss: 0.00889
Epoch: 27180, Training Loss: 0.00829
Epoch: 27180, Training Loss: 0.01015
Epoch: 27180, Training Loss: 0.00785
Epoch: 27181, Training Loss: 0.00889
Epoch: 27181, Training Loss: 0.00829
Epoch: 27181, Training Loss: 0.01015
Epoch: 27181, Training Loss: 0.00785
Epoch: 27182, Training Loss: 0.00889
Epoch: 27182, Training Loss: 0.00829
Epoch: 27182, Training Loss: 0.01015
Epoch: 27182, Training Loss: 0.00785
Epoch: 27183, Training Loss: 0.00889
Epoch: 27183, Training Loss: 0.00829
Epoch: 27183, Training Loss: 0.01015
Epoch: 27183, Training Loss: 0.00785
Epoch: 27184, Training Loss: 0.00889
Epoch: 27184, Training Loss: 0.00829
Epoch: 27184, Training Loss: 0.01015
Epoch: 27184, Training Loss: 0.00785
Epoch: 27185, Training Loss: 0.00889
Epoch: 27185, Training Loss: 0.00829
Epoch: 27185, Training Loss: 0.01015
Epoch: 27185, Training Loss: 0.00785
Epoch: 27186, Training Loss: 0.00889
Epoch: 27186, Training Loss: 0.00829
Epoch: 27186, Training Loss: 0.01015
Epoch: 27186, Training Loss: 0.00785
Epoch: 27187, Training Loss: 0.00889
Epoch: 27187, Training Loss: 0.00829
Epoch: 27187, Training Loss: 0.01015
Epoch: 27187, Training Loss: 0.00785
Epoch: 27188, Training Loss: 0.00889
Epoch: 27188, Training Loss: 0.00829
Epoch: 27188, Training Loss: 0.01015
Epoch: 27188, Training Loss: 0.00785
Epoch: 27189, Training Loss: 0.00889
Epoch: 27189, Training Loss: 0.00829
Epoch: 27189, Training Loss: 0.01015
Epoch: 27189, Training Loss: 0.00785
Epoch: 27190, Training Loss: 0.00889
Epoch: 27190, Training Loss: 0.00829
Epoch: 27190, Training Loss: 0.01015
Epoch: 27190, Training Loss: 0.00785
Epoch: 27191, Training Loss: 0.00889
Epoch: 27191, Training Loss: 0.00829
Epoch: 27191, Training Loss: 0.01015
Epoch: 27191, Training Loss: 0.00785
Epoch: 27192, Training Loss: 0.00889
Epoch: 27192, Training Loss: 0.00829
Epoch: 27192, Training Loss: 0.01015
Epoch: 27192, Training Loss: 0.00785
Epoch: 27193, Training Loss: 0.00889
Epoch: 27193, Training Loss: 0.00829
Epoch: 27193, Training Loss: 0.01015
Epoch: 27193, Training Loss: 0.00785
Epoch: 27194, Training Loss: 0.00889
Epoch: 27194, Training Loss: 0.00829
Epoch: 27194, Training Loss: 0.01015
Epoch: 27194, Training Loss: 0.00785
Epoch: 27195, Training Loss: 0.00889
Epoch: 27195, Training Loss: 0.00829
Epoch: 27195, Training Loss: 0.01015
Epoch: 27195, Training Loss: 0.00785
Epoch: 27196, Training Loss: 0.00889
Epoch: 27196, Training Loss: 0.00829
Epoch: 27196, Training Loss: 0.01015
Epoch: 27196, Training Loss: 0.00785
Epoch: 27197, Training Loss: 0.00889
Epoch: 27197, Training Loss: 0.00829
Epoch: 27197, Training Loss: 0.01015
Epoch: 27197, Training Loss: 0.00785
Epoch: 27198, Training Loss: 0.00889
Epoch: 27198, Training Loss: 0.00829
Epoch: 27198, Training Loss: 0.01015
Epoch: 27198, Training Loss: 0.00785
Epoch: 27199, Training Loss: 0.00889
Epoch: 27199, Training Loss: 0.00829
Epoch: 27199, Training Loss: 0.01015
Epoch: 27199, Training Loss: 0.00785
Epoch: 27200, Training Loss: 0.00889
Epoch: 27200, Training Loss: 0.00829
Epoch: 27200, Training Loss: 0.01015
Epoch: 27200, Training Loss: 0.00785
Epoch: 27201, Training Loss: 0.00889
Epoch: 27201, Training Loss: 0.00829
Epoch: 27201, Training Loss: 0.01015
Epoch: 27201, Training Loss: 0.00785
Epoch: 27202, Training Loss: 0.00889
Epoch: 27202, Training Loss: 0.00829
Epoch: 27202, Training Loss: 0.01015
Epoch: 27202, Training Loss: 0.00785
Epoch: 27203, Training Loss: 0.00889
Epoch: 27203, Training Loss: 0.00829
Epoch: 27203, Training Loss: 0.01015
Epoch: 27203, Training Loss: 0.00785
Epoch: 27204, Training Loss: 0.00889
Epoch: 27204, Training Loss: 0.00829
Epoch: 27204, Training Loss: 0.01015
Epoch: 27204, Training Loss: 0.00785
Epoch: 27205, Training Loss: 0.00889
Epoch: 27205, Training Loss: 0.00829
Epoch: 27205, Training Loss: 0.01015
Epoch: 27205, Training Loss: 0.00785
Epoch: 27206, Training Loss: 0.00889
Epoch: 27206, Training Loss: 0.00829
Epoch: 27206, Training Loss: 0.01015
Epoch: 27206, Training Loss: 0.00785
Epoch: 27207, Training Loss: 0.00889
Epoch: 27207, Training Loss: 0.00829
Epoch: 27207, Training Loss: 0.01015
Epoch: 27207, Training Loss: 0.00785
Epoch: 27208, Training Loss: 0.00888
Epoch: 27208, Training Loss: 0.00829
Epoch: 27208, Training Loss: 0.01015
Epoch: 27208, Training Loss: 0.00785
Epoch: 27209, Training Loss: 0.00888
Epoch: 27209, Training Loss: 0.00829
Epoch: 27209, Training Loss: 0.01015
Epoch: 27209, Training Loss: 0.00785
Epoch: 27210, Training Loss: 0.00888
Epoch: 27210, Training Loss: 0.00829
Epoch: 27210, Training Loss: 0.01015
Epoch: 27210, Training Loss: 0.00785
Epoch: 27211, Training Loss: 0.00888
Epoch: 27211, Training Loss: 0.00829
Epoch: 27211, Training Loss: 0.01014
Epoch: 27211, Training Loss: 0.00785
Epoch: 27212, Training Loss: 0.00888
Epoch: 27212, Training Loss: 0.00829
Epoch: 27212, Training Loss: 0.01014
Epoch: 27212, Training Loss: 0.00785
Epoch: 27213, Training Loss: 0.00888
Epoch: 27213, Training Loss: 0.00829
Epoch: 27213, Training Loss: 0.01014
Epoch: 27213, Training Loss: 0.00785
Epoch: 27214, Training Loss: 0.00888
Epoch: 27214, Training Loss: 0.00829
Epoch: 27214, Training Loss: 0.01014
Epoch: 27214, Training Loss: 0.00785
Epoch: 27215, Training Loss: 0.00888
Epoch: 27215, Training Loss: 0.00829
Epoch: 27215, Training Loss: 0.01014
Epoch: 27215, Training Loss: 0.00785
Epoch: 27216, Training Loss: 0.00888
Epoch: 27216, Training Loss: 0.00829
Epoch: 27216, Training Loss: 0.01014
Epoch: 27216, Training Loss: 0.00785
Epoch: 27217, Training Loss: 0.00888
Epoch: 27217, Training Loss: 0.00829
Epoch: 27217, Training Loss: 0.01014
Epoch: 27217, Training Loss: 0.00785
Epoch: 27218, Training Loss: 0.00888
Epoch: 27218, Training Loss: 0.00829
Epoch: 27218, Training Loss: 0.01014
Epoch: 27218, Training Loss: 0.00785
Epoch: 27219, Training Loss: 0.00888
Epoch: 27219, Training Loss: 0.00829
Epoch: 27219, Training Loss: 0.01014
Epoch: 27219, Training Loss: 0.00785
Epoch: 27220, Training Loss: 0.00888
Epoch: 27220, Training Loss: 0.00829
Epoch: 27220, Training Loss: 0.01014
Epoch: 27220, Training Loss: 0.00785
Epoch: 27221, Training Loss: 0.00888
Epoch: 27221, Training Loss: 0.00829
Epoch: 27221, Training Loss: 0.01014
Epoch: 27221, Training Loss: 0.00785
Epoch: 27222, Training Loss: 0.00888
Epoch: 27222, Training Loss: 0.00829
Epoch: 27222, Training Loss: 0.01014
Epoch: 27222, Training Loss: 0.00785
Epoch: 27223, Training Loss: 0.00888
Epoch: 27223, Training Loss: 0.00829
Epoch: 27223, Training Loss: 0.01014
Epoch: 27223, Training Loss: 0.00785
Epoch: 27224, Training Loss: 0.00888
Epoch: 27224, Training Loss: 0.00829
Epoch: 27224, Training Loss: 0.01014
Epoch: 27224, Training Loss: 0.00785
Epoch: 27225, Training Loss: 0.00888
Epoch: 27225, Training Loss: 0.00829
Epoch: 27225, Training Loss: 0.01014
Epoch: 27225, Training Loss: 0.00785
Epoch: 27226, Training Loss: 0.00888
Epoch: 27226, Training Loss: 0.00829
Epoch: 27226, Training Loss: 0.01014
Epoch: 27226, Training Loss: 0.00785
Epoch: 27227, Training Loss: 0.00888
Epoch: 27227, Training Loss: 0.00829
Epoch: 27227, Training Loss: 0.01014
Epoch: 27227, Training Loss: 0.00785
Epoch: 27228, Training Loss: 0.00888
Epoch: 27228, Training Loss: 0.00829
Epoch: 27228, Training Loss: 0.01014
Epoch: 27228, Training Loss: 0.00785
Epoch: 27229, Training Loss: 0.00888
Epoch: 27229, Training Loss: 0.00829
Epoch: 27229, Training Loss: 0.01014
Epoch: 27229, Training Loss: 0.00785
Epoch: 27230, Training Loss: 0.00888
Epoch: 27230, Training Loss: 0.00829
Epoch: 27230, Training Loss: 0.01014
Epoch: 27230, Training Loss: 0.00785
Epoch: 27231, Training Loss: 0.00888
Epoch: 27231, Training Loss: 0.00829
Epoch: 27231, Training Loss: 0.01014
Epoch: 27231, Training Loss: 0.00785
Epoch: 27232, Training Loss: 0.00888
Epoch: 27232, Training Loss: 0.00829
Epoch: 27232, Training Loss: 0.01014
Epoch: 27232, Training Loss: 0.00785
Epoch: 27233, Training Loss: 0.00888
Epoch: 27233, Training Loss: 0.00829
Epoch: 27233, Training Loss: 0.01014
Epoch: 27233, Training Loss: 0.00785
Epoch: 27234, Training Loss: 0.00888
Epoch: 27234, Training Loss: 0.00828
Epoch: 27234, Training Loss: 0.01014
Epoch: 27234, Training Loss: 0.00785
Epoch: 27235, Training Loss: 0.00888
Epoch: 27235, Training Loss: 0.00828
Epoch: 27235, Training Loss: 0.01014
Epoch: 27235, Training Loss: 0.00785
Epoch: 27236, Training Loss: 0.00888
Epoch: 27236, Training Loss: 0.00828
Epoch: 27236, Training Loss: 0.01014
Epoch: 27236, Training Loss: 0.00785
Epoch: 27237, Training Loss: 0.00888
Epoch: 27237, Training Loss: 0.00828
Epoch: 27237, Training Loss: 0.01014
Epoch: 27237, Training Loss: 0.00785
Epoch: 27238, Training Loss: 0.00888
Epoch: 27238, Training Loss: 0.00828
Epoch: 27238, Training Loss: 0.01014
Epoch: 27238, Training Loss: 0.00785
Epoch: 27239, Training Loss: 0.00888
Epoch: 27239, Training Loss: 0.00828
Epoch: 27239, Training Loss: 0.01014
Epoch: 27239, Training Loss: 0.00785
Epoch: 27240, Training Loss: 0.00888
Epoch: 27240, Training Loss: 0.00828
Epoch: 27240, Training Loss: 0.01014
Epoch: 27240, Training Loss: 0.00785
Epoch: 27241, Training Loss: 0.00888
Epoch: 27241, Training Loss: 0.00828
Epoch: 27241, Training Loss: 0.01014
Epoch: 27241, Training Loss: 0.00785
Epoch: 27242, Training Loss: 0.00888
Epoch: 27242, Training Loss: 0.00828
Epoch: 27242, Training Loss: 0.01014
Epoch: 27242, Training Loss: 0.00785
Epoch: 27243, Training Loss: 0.00888
Epoch: 27243, Training Loss: 0.00828
Epoch: 27243, Training Loss: 0.01014
Epoch: 27243, Training Loss: 0.00785
Epoch: 27244, Training Loss: 0.00888
Epoch: 27244, Training Loss: 0.00828
Epoch: 27244, Training Loss: 0.01014
Epoch: 27244, Training Loss: 0.00784
Epoch: 27245, Training Loss: 0.00888
Epoch: 27245, Training Loss: 0.00828
Epoch: 27245, Training Loss: 0.01014
Epoch: 27245, Training Loss: 0.00784
Epoch: 27246, Training Loss: 0.00888
Epoch: 27246, Training Loss: 0.00828
Epoch: 27246, Training Loss: 0.01014
Epoch: 27246, Training Loss: 0.00784
Epoch: 27247, Training Loss: 0.00888
Epoch: 27247, Training Loss: 0.00828
Epoch: 27247, Training Loss: 0.01014
Epoch: 27247, Training Loss: 0.00784
Epoch: 27248, Training Loss: 0.00888
Epoch: 27248, Training Loss: 0.00828
Epoch: 27248, Training Loss: 0.01014
Epoch: 27248, Training Loss: 0.00784
Epoch: 27249, Training Loss: 0.00888
Epoch: 27249, Training Loss: 0.00828
Epoch: 27249, Training Loss: 0.01014
Epoch: 27249, Training Loss: 0.00784
Epoch: 27250, Training Loss: 0.00888
Epoch: 27250, Training Loss: 0.00828
Epoch: 27250, Training Loss: 0.01014
Epoch: 27250, Training Loss: 0.00784
Epoch: 27251, Training Loss: 0.00888
Epoch: 27251, Training Loss: 0.00828
Epoch: 27251, Training Loss: 0.01014
Epoch: 27251, Training Loss: 0.00784
Epoch: 27252, Training Loss: 0.00888
Epoch: 27252, Training Loss: 0.00828
Epoch: 27252, Training Loss: 0.01014
Epoch: 27252, Training Loss: 0.00784
Epoch: 27253, Training Loss: 0.00888
Epoch: 27253, Training Loss: 0.00828
Epoch: 27253, Training Loss: 0.01014
Epoch: 27253, Training Loss: 0.00784
Epoch: 27254, Training Loss: 0.00888
Epoch: 27254, Training Loss: 0.00828
Epoch: 27254, Training Loss: 0.01014
Epoch: 27254, Training Loss: 0.00784
Epoch: 27255, Training Loss: 0.00888
Epoch: 27255, Training Loss: 0.00828
Epoch: 27255, Training Loss: 0.01014
Epoch: 27255, Training Loss: 0.00784
Epoch: 27256, Training Loss: 0.00888
Epoch: 27256, Training Loss: 0.00828
Epoch: 27256, Training Loss: 0.01014
Epoch: 27256, Training Loss: 0.00784
Epoch: 27257, Training Loss: 0.00888
Epoch: 27257, Training Loss: 0.00828
Epoch: 27257, Training Loss: 0.01014
Epoch: 27257, Training Loss: 0.00784
Epoch: 27258, Training Loss: 0.00888
Epoch: 27258, Training Loss: 0.00828
Epoch: 27258, Training Loss: 0.01014
Epoch: 27258, Training Loss: 0.00784
Epoch: 27259, Training Loss: 0.00888
Epoch: 27259, Training Loss: 0.00828
Epoch: 27259, Training Loss: 0.01014
Epoch: 27259, Training Loss: 0.00784
Epoch: 27260, Training Loss: 0.00888
Epoch: 27260, Training Loss: 0.00828
Epoch: 27260, Training Loss: 0.01014
Epoch: 27260, Training Loss: 0.00784
Epoch: 27261, Training Loss: 0.00888
Epoch: 27261, Training Loss: 0.00828
Epoch: 27261, Training Loss: 0.01014
Epoch: 27261, Training Loss: 0.00784
Epoch: 27262, Training Loss: 0.00888
Epoch: 27262, Training Loss: 0.00828
Epoch: 27262, Training Loss: 0.01013
Epoch: 27262, Training Loss: 0.00784
Epoch: 27263, Training Loss: 0.00888
Epoch: 27263, Training Loss: 0.00828
Epoch: 27263, Training Loss: 0.01013
Epoch: 27263, Training Loss: 0.00784
Epoch: 27264, Training Loss: 0.00888
Epoch: 27264, Training Loss: 0.00828
Epoch: 27264, Training Loss: 0.01013
Epoch: 27264, Training Loss: 0.00784
Epoch: 27265, Training Loss: 0.00887
Epoch: 27265, Training Loss: 0.00828
Epoch: 27265, Training Loss: 0.01013
Epoch: 27265, Training Loss: 0.00784
Epoch: 27266, Training Loss: 0.00887
Epoch: 27266, Training Loss: 0.00828
Epoch: 27266, Training Loss: 0.01013
Epoch: 27266, Training Loss: 0.00784
Epoch: 27267, Training Loss: 0.00887
Epoch: 27267, Training Loss: 0.00828
Epoch: 27267, Training Loss: 0.01013
Epoch: 27267, Training Loss: 0.00784
Epoch: 27268, Training Loss: 0.00887
Epoch: 27268, Training Loss: 0.00828
Epoch: 27268, Training Loss: 0.01013
Epoch: 27268, Training Loss: 0.00784
Epoch: 27269, Training Loss: 0.00887
Epoch: 27269, Training Loss: 0.00828
Epoch: 27269, Training Loss: 0.01013
Epoch: 27269, Training Loss: 0.00784
Epoch: 27270, Training Loss: 0.00887
Epoch: 27270, Training Loss: 0.00828
Epoch: 27270, Training Loss: 0.01013
Epoch: 27270, Training Loss: 0.00784
Epoch: 27271, Training Loss: 0.00887
Epoch: 27271, Training Loss: 0.00828
Epoch: 27271, Training Loss: 0.01013
Epoch: 27271, Training Loss: 0.00784
Epoch: 27272, Training Loss: 0.00887
Epoch: 27272, Training Loss: 0.00828
Epoch: 27272, Training Loss: 0.01013
Epoch: 27272, Training Loss: 0.00784
Epoch: 27273, Training Loss: 0.00887
Epoch: 27273, Training Loss: 0.00828
Epoch: 27273, Training Loss: 0.01013
Epoch: 27273, Training Loss: 0.00784
Epoch: 27274, Training Loss: 0.00887
Epoch: 27274, Training Loss: 0.00828
Epoch: 27274, Training Loss: 0.01013
Epoch: 27274, Training Loss: 0.00784
Epoch: 27275, Training Loss: 0.00887
Epoch: 27275, Training Loss: 0.00828
Epoch: 27275, Training Loss: 0.01013
Epoch: 27275, Training Loss: 0.00784
Epoch: 27276, Training Loss: 0.00887
Epoch: 27276, Training Loss: 0.00828
Epoch: 27276, Training Loss: 0.01013
Epoch: 27276, Training Loss: 0.00784
Epoch: 27277, Training Loss: 0.00887
Epoch: 27277, Training Loss: 0.00828
Epoch: 27277, Training Loss: 0.01013
Epoch: 27277, Training Loss: 0.00784
Epoch: 27278, Training Loss: 0.00887
Epoch: 27278, Training Loss: 0.00828
Epoch: 27278, Training Loss: 0.01013
Epoch: 27278, Training Loss: 0.00784
Epoch: 27279, Training Loss: 0.00887
Epoch: 27279, Training Loss: 0.00828
Epoch: 27279, Training Loss: 0.01013
Epoch: 27279, Training Loss: 0.00784
Epoch: 27280, Training Loss: 0.00887
Epoch: 27280, Training Loss: 0.00828
Epoch: 27280, Training Loss: 0.01013
Epoch: 27280, Training Loss: 0.00784
Epoch: 27281, Training Loss: 0.00887
Epoch: 27281, Training Loss: 0.00828
Epoch: 27281, Training Loss: 0.01013
Epoch: 27281, Training Loss: 0.00784
Epoch: 27282, Training Loss: 0.00887
Epoch: 27282, Training Loss: 0.00828
Epoch: 27282, Training Loss: 0.01013
Epoch: 27282, Training Loss: 0.00784
Epoch: 27283, Training Loss: 0.00887
Epoch: 27283, Training Loss: 0.00828
Epoch: 27283, Training Loss: 0.01013
Epoch: 27283, Training Loss: 0.00784
Epoch: 27284, Training Loss: 0.00887
Epoch: 27284, Training Loss: 0.00828
Epoch: 27284, Training Loss: 0.01013
Epoch: 27284, Training Loss: 0.00784
Epoch: 27285, Training Loss: 0.00887
Epoch: 27285, Training Loss: 0.00828
Epoch: 27285, Training Loss: 0.01013
Epoch: 27285, Training Loss: 0.00784
Epoch: 27286, Training Loss: 0.00887
Epoch: 27286, Training Loss: 0.00828
Epoch: 27286, Training Loss: 0.01013
Epoch: 27286, Training Loss: 0.00784
Epoch: 27287, Training Loss: 0.00887
Epoch: 27287, Training Loss: 0.00828
Epoch: 27287, Training Loss: 0.01013
Epoch: 27287, Training Loss: 0.00784
Epoch: 27288, Training Loss: 0.00887
Epoch: 27288, Training Loss: 0.00828
Epoch: 27288, Training Loss: 0.01013
Epoch: 27288, Training Loss: 0.00784
[... output trimmed: one loss line per training pattern (four per epoch) for epochs 27289-27530, decreasing very slowly ...]
Epoch: 27531, Training Loss: 0.00883
Epoch: 27531, Training Loss: 0.00824
Epoch: 27531, Training Loss: 0.01008
Epoch: 27531, Training Loss: 0.00780
Epoch: 27532, Training Loss: 0.00883
Epoch: 27532, Training Loss: 0.00824
Epoch: 27532, Training Loss: 0.01008
Epoch: 27532, Training Loss: 0.00780
Epoch: 27533, Training Loss: 0.00883
Epoch: 27533, Training Loss: 0.00824
Epoch: 27533, Training Loss: 0.01008
Epoch: 27533, Training Loss: 0.00780
Epoch: 27534, Training Loss: 0.00883
Epoch: 27534, Training Loss: 0.00824
Epoch: 27534, Training Loss: 0.01008
Epoch: 27534, Training Loss: 0.00780
Epoch: 27535, Training Loss: 0.00883
Epoch: 27535, Training Loss: 0.00824
Epoch: 27535, Training Loss: 0.01008
Epoch: 27535, Training Loss: 0.00780
Epoch: 27536, Training Loss: 0.00883
Epoch: 27536, Training Loss: 0.00824
Epoch: 27536, Training Loss: 0.01008
Epoch: 27536, Training Loss: 0.00780
Epoch: 27537, Training Loss: 0.00883
Epoch: 27537, Training Loss: 0.00824
Epoch: 27537, Training Loss: 0.01008
Epoch: 27537, Training Loss: 0.00780
Epoch: 27538, Training Loss: 0.00883
Epoch: 27538, Training Loss: 0.00824
Epoch: 27538, Training Loss: 0.01008
Epoch: 27538, Training Loss: 0.00780
Epoch: 27539, Training Loss: 0.00883
Epoch: 27539, Training Loss: 0.00824
Epoch: 27539, Training Loss: 0.01008
Epoch: 27539, Training Loss: 0.00780
Epoch: 27540, Training Loss: 0.00883
Epoch: 27540, Training Loss: 0.00824
Epoch: 27540, Training Loss: 0.01008
Epoch: 27540, Training Loss: 0.00780
Epoch: 27541, Training Loss: 0.00883
Epoch: 27541, Training Loss: 0.00824
Epoch: 27541, Training Loss: 0.01008
Epoch: 27541, Training Loss: 0.00780
Epoch: 27542, Training Loss: 0.00883
Epoch: 27542, Training Loss: 0.00824
Epoch: 27542, Training Loss: 0.01008
Epoch: 27542, Training Loss: 0.00780
Epoch: 27543, Training Loss: 0.00883
Epoch: 27543, Training Loss: 0.00824
Epoch: 27543, Training Loss: 0.01008
Epoch: 27543, Training Loss: 0.00780
Epoch: 27544, Training Loss: 0.00883
Epoch: 27544, Training Loss: 0.00824
Epoch: 27544, Training Loss: 0.01008
Epoch: 27544, Training Loss: 0.00780
Epoch: 27545, Training Loss: 0.00883
Epoch: 27545, Training Loss: 0.00824
Epoch: 27545, Training Loss: 0.01008
Epoch: 27545, Training Loss: 0.00780
Epoch: 27546, Training Loss: 0.00883
Epoch: 27546, Training Loss: 0.00824
Epoch: 27546, Training Loss: 0.01008
Epoch: 27546, Training Loss: 0.00780
Epoch: 27547, Training Loss: 0.00883
Epoch: 27547, Training Loss: 0.00824
Epoch: 27547, Training Loss: 0.01008
Epoch: 27547, Training Loss: 0.00780
Epoch: 27548, Training Loss: 0.00883
Epoch: 27548, Training Loss: 0.00824
Epoch: 27548, Training Loss: 0.01008
Epoch: 27548, Training Loss: 0.00780
Epoch: 27549, Training Loss: 0.00883
Epoch: 27549, Training Loss: 0.00824
Epoch: 27549, Training Loss: 0.01008
Epoch: 27549, Training Loss: 0.00780
Epoch: 27550, Training Loss: 0.00883
Epoch: 27550, Training Loss: 0.00823
Epoch: 27550, Training Loss: 0.01008
Epoch: 27550, Training Loss: 0.00780
Epoch: 27551, Training Loss: 0.00883
Epoch: 27551, Training Loss: 0.00823
Epoch: 27551, Training Loss: 0.01008
Epoch: 27551, Training Loss: 0.00780
Epoch: 27552, Training Loss: 0.00883
Epoch: 27552, Training Loss: 0.00823
Epoch: 27552, Training Loss: 0.01008
Epoch: 27552, Training Loss: 0.00780
Epoch: 27553, Training Loss: 0.00883
Epoch: 27553, Training Loss: 0.00823
Epoch: 27553, Training Loss: 0.01008
Epoch: 27553, Training Loss: 0.00780
Epoch: 27554, Training Loss: 0.00883
Epoch: 27554, Training Loss: 0.00823
Epoch: 27554, Training Loss: 0.01008
Epoch: 27554, Training Loss: 0.00780
Epoch: 27555, Training Loss: 0.00882
Epoch: 27555, Training Loss: 0.00823
Epoch: 27555, Training Loss: 0.01008
Epoch: 27555, Training Loss: 0.00780
Epoch: 27556, Training Loss: 0.00882
Epoch: 27556, Training Loss: 0.00823
Epoch: 27556, Training Loss: 0.01008
Epoch: 27556, Training Loss: 0.00780
Epoch: 27557, Training Loss: 0.00882
Epoch: 27557, Training Loss: 0.00823
Epoch: 27557, Training Loss: 0.01008
Epoch: 27557, Training Loss: 0.00780
Epoch: 27558, Training Loss: 0.00882
Epoch: 27558, Training Loss: 0.00823
Epoch: 27558, Training Loss: 0.01008
Epoch: 27558, Training Loss: 0.00780
Epoch: 27559, Training Loss: 0.00882
Epoch: 27559, Training Loss: 0.00823
Epoch: 27559, Training Loss: 0.01008
Epoch: 27559, Training Loss: 0.00780
Epoch: 27560, Training Loss: 0.00882
Epoch: 27560, Training Loss: 0.00823
Epoch: 27560, Training Loss: 0.01008
Epoch: 27560, Training Loss: 0.00780
Epoch: 27561, Training Loss: 0.00882
Epoch: 27561, Training Loss: 0.00823
Epoch: 27561, Training Loss: 0.01008
Epoch: 27561, Training Loss: 0.00780
Epoch: 27562, Training Loss: 0.00882
Epoch: 27562, Training Loss: 0.00823
Epoch: 27562, Training Loss: 0.01008
Epoch: 27562, Training Loss: 0.00780
Epoch: 27563, Training Loss: 0.00882
Epoch: 27563, Training Loss: 0.00823
Epoch: 27563, Training Loss: 0.01008
Epoch: 27563, Training Loss: 0.00780
Epoch: 27564, Training Loss: 0.00882
Epoch: 27564, Training Loss: 0.00823
Epoch: 27564, Training Loss: 0.01008
Epoch: 27564, Training Loss: 0.00780
Epoch: 27565, Training Loss: 0.00882
Epoch: 27565, Training Loss: 0.00823
Epoch: 27565, Training Loss: 0.01008
Epoch: 27565, Training Loss: 0.00780
Epoch: 27566, Training Loss: 0.00882
Epoch: 27566, Training Loss: 0.00823
Epoch: 27566, Training Loss: 0.01008
Epoch: 27566, Training Loss: 0.00780
Epoch: 27567, Training Loss: 0.00882
Epoch: 27567, Training Loss: 0.00823
Epoch: 27567, Training Loss: 0.01008
Epoch: 27567, Training Loss: 0.00780
Epoch: 27568, Training Loss: 0.00882
Epoch: 27568, Training Loss: 0.00823
Epoch: 27568, Training Loss: 0.01007
Epoch: 27568, Training Loss: 0.00780
Epoch: 27569, Training Loss: 0.00882
Epoch: 27569, Training Loss: 0.00823
Epoch: 27569, Training Loss: 0.01007
Epoch: 27569, Training Loss: 0.00780
Epoch: 27570, Training Loss: 0.00882
Epoch: 27570, Training Loss: 0.00823
Epoch: 27570, Training Loss: 0.01007
Epoch: 27570, Training Loss: 0.00780
Epoch: 27571, Training Loss: 0.00882
Epoch: 27571, Training Loss: 0.00823
Epoch: 27571, Training Loss: 0.01007
Epoch: 27571, Training Loss: 0.00780
Epoch: 27572, Training Loss: 0.00882
Epoch: 27572, Training Loss: 0.00823
Epoch: 27572, Training Loss: 0.01007
Epoch: 27572, Training Loss: 0.00780
Epoch: 27573, Training Loss: 0.00882
Epoch: 27573, Training Loss: 0.00823
Epoch: 27573, Training Loss: 0.01007
Epoch: 27573, Training Loss: 0.00780
Epoch: 27574, Training Loss: 0.00882
Epoch: 27574, Training Loss: 0.00823
Epoch: 27574, Training Loss: 0.01007
Epoch: 27574, Training Loss: 0.00780
Epoch: 27575, Training Loss: 0.00882
Epoch: 27575, Training Loss: 0.00823
Epoch: 27575, Training Loss: 0.01007
Epoch: 27575, Training Loss: 0.00780
Epoch: 27576, Training Loss: 0.00882
Epoch: 27576, Training Loss: 0.00823
Epoch: 27576, Training Loss: 0.01007
Epoch: 27576, Training Loss: 0.00780
Epoch: 27577, Training Loss: 0.00882
Epoch: 27577, Training Loss: 0.00823
Epoch: 27577, Training Loss: 0.01007
Epoch: 27577, Training Loss: 0.00779
Epoch: 27578, Training Loss: 0.00882
Epoch: 27578, Training Loss: 0.00823
Epoch: 27578, Training Loss: 0.01007
Epoch: 27578, Training Loss: 0.00779
Epoch: 27579, Training Loss: 0.00882
Epoch: 27579, Training Loss: 0.00823
Epoch: 27579, Training Loss: 0.01007
Epoch: 27579, Training Loss: 0.00779
Epoch: 27580, Training Loss: 0.00882
Epoch: 27580, Training Loss: 0.00823
Epoch: 27580, Training Loss: 0.01007
Epoch: 27580, Training Loss: 0.00779
Epoch: 27581, Training Loss: 0.00882
Epoch: 27581, Training Loss: 0.00823
Epoch: 27581, Training Loss: 0.01007
Epoch: 27581, Training Loss: 0.00779
Epoch: 27582, Training Loss: 0.00882
Epoch: 27582, Training Loss: 0.00823
Epoch: 27582, Training Loss: 0.01007
Epoch: 27582, Training Loss: 0.00779
Epoch: 27583, Training Loss: 0.00882
Epoch: 27583, Training Loss: 0.00823
Epoch: 27583, Training Loss: 0.01007
Epoch: 27583, Training Loss: 0.00779
Epoch: 27584, Training Loss: 0.00882
Epoch: 27584, Training Loss: 0.00823
Epoch: 27584, Training Loss: 0.01007
Epoch: 27584, Training Loss: 0.00779
Epoch: 27585, Training Loss: 0.00882
Epoch: 27585, Training Loss: 0.00823
Epoch: 27585, Training Loss: 0.01007
Epoch: 27585, Training Loss: 0.00779
Epoch: 27586, Training Loss: 0.00882
Epoch: 27586, Training Loss: 0.00823
Epoch: 27586, Training Loss: 0.01007
Epoch: 27586, Training Loss: 0.00779
Epoch: 27587, Training Loss: 0.00882
Epoch: 27587, Training Loss: 0.00823
Epoch: 27587, Training Loss: 0.01007
Epoch: 27587, Training Loss: 0.00779
Epoch: 27588, Training Loss: 0.00882
Epoch: 27588, Training Loss: 0.00823
Epoch: 27588, Training Loss: 0.01007
Epoch: 27588, Training Loss: 0.00779
Epoch: 27589, Training Loss: 0.00882
Epoch: 27589, Training Loss: 0.00823
Epoch: 27589, Training Loss: 0.01007
Epoch: 27589, Training Loss: 0.00779
Epoch: 27590, Training Loss: 0.00882
Epoch: 27590, Training Loss: 0.00823
Epoch: 27590, Training Loss: 0.01007
Epoch: 27590, Training Loss: 0.00779
Epoch: 27591, Training Loss: 0.00882
Epoch: 27591, Training Loss: 0.00823
Epoch: 27591, Training Loss: 0.01007
Epoch: 27591, Training Loss: 0.00779
Epoch: 27592, Training Loss: 0.00882
Epoch: 27592, Training Loss: 0.00823
Epoch: 27592, Training Loss: 0.01007
Epoch: 27592, Training Loss: 0.00779
Epoch: 27593, Training Loss: 0.00882
Epoch: 27593, Training Loss: 0.00823
Epoch: 27593, Training Loss: 0.01007
Epoch: 27593, Training Loss: 0.00779
Epoch: 27594, Training Loss: 0.00882
Epoch: 27594, Training Loss: 0.00823
Epoch: 27594, Training Loss: 0.01007
Epoch: 27594, Training Loss: 0.00779
Epoch: 27595, Training Loss: 0.00882
Epoch: 27595, Training Loss: 0.00823
Epoch: 27595, Training Loss: 0.01007
Epoch: 27595, Training Loss: 0.00779
Epoch: 27596, Training Loss: 0.00882
Epoch: 27596, Training Loss: 0.00823
Epoch: 27596, Training Loss: 0.01007
Epoch: 27596, Training Loss: 0.00779
Epoch: 27597, Training Loss: 0.00882
Epoch: 27597, Training Loss: 0.00823
Epoch: 27597, Training Loss: 0.01007
Epoch: 27597, Training Loss: 0.00779
Epoch: 27598, Training Loss: 0.00882
Epoch: 27598, Training Loss: 0.00823
Epoch: 27598, Training Loss: 0.01007
Epoch: 27598, Training Loss: 0.00779
Epoch: 27599, Training Loss: 0.00882
Epoch: 27599, Training Loss: 0.00823
Epoch: 27599, Training Loss: 0.01007
Epoch: 27599, Training Loss: 0.00779
Epoch: 27600, Training Loss: 0.00882
Epoch: 27600, Training Loss: 0.00823
Epoch: 27600, Training Loss: 0.01007
Epoch: 27600, Training Loss: 0.00779
Epoch: 27601, Training Loss: 0.00882
Epoch: 27601, Training Loss: 0.00823
Epoch: 27601, Training Loss: 0.01007
Epoch: 27601, Training Loss: 0.00779
Epoch: 27602, Training Loss: 0.00882
Epoch: 27602, Training Loss: 0.00823
Epoch: 27602, Training Loss: 0.01007
Epoch: 27602, Training Loss: 0.00779
Epoch: 27603, Training Loss: 0.00882
Epoch: 27603, Training Loss: 0.00823
Epoch: 27603, Training Loss: 0.01007
Epoch: 27603, Training Loss: 0.00779
Epoch: 27604, Training Loss: 0.00882
Epoch: 27604, Training Loss: 0.00823
Epoch: 27604, Training Loss: 0.01007
Epoch: 27604, Training Loss: 0.00779
Epoch: 27605, Training Loss: 0.00882
Epoch: 27605, Training Loss: 0.00823
Epoch: 27605, Training Loss: 0.01007
Epoch: 27605, Training Loss: 0.00779
Epoch: 27606, Training Loss: 0.00882
Epoch: 27606, Training Loss: 0.00823
Epoch: 27606, Training Loss: 0.01007
Epoch: 27606, Training Loss: 0.00779
Epoch: 27607, Training Loss: 0.00882
Epoch: 27607, Training Loss: 0.00823
Epoch: 27607, Training Loss: 0.01007
Epoch: 27607, Training Loss: 0.00779
Epoch: 27608, Training Loss: 0.00882
Epoch: 27608, Training Loss: 0.00823
Epoch: 27608, Training Loss: 0.01007
Epoch: 27608, Training Loss: 0.00779
Epoch: 27609, Training Loss: 0.00882
Epoch: 27609, Training Loss: 0.00823
Epoch: 27609, Training Loss: 0.01007
Epoch: 27609, Training Loss: 0.00779
Epoch: 27610, Training Loss: 0.00882
Epoch: 27610, Training Loss: 0.00823
Epoch: 27610, Training Loss: 0.01007
Epoch: 27610, Training Loss: 0.00779
Epoch: 27611, Training Loss: 0.00882
Epoch: 27611, Training Loss: 0.00823
Epoch: 27611, Training Loss: 0.01007
Epoch: 27611, Training Loss: 0.00779
Epoch: 27612, Training Loss: 0.00882
Epoch: 27612, Training Loss: 0.00823
Epoch: 27612, Training Loss: 0.01007
Epoch: 27612, Training Loss: 0.00779
Epoch: 27613, Training Loss: 0.00882
Epoch: 27613, Training Loss: 0.00823
Epoch: 27613, Training Loss: 0.01007
Epoch: 27613, Training Loss: 0.00779
Epoch: 27614, Training Loss: 0.00881
Epoch: 27614, Training Loss: 0.00822
Epoch: 27614, Training Loss: 0.01007
Epoch: 27614, Training Loss: 0.00779
Epoch: 27615, Training Loss: 0.00881
Epoch: 27615, Training Loss: 0.00822
Epoch: 27615, Training Loss: 0.01007
Epoch: 27615, Training Loss: 0.00779
Epoch: 27616, Training Loss: 0.00881
Epoch: 27616, Training Loss: 0.00822
Epoch: 27616, Training Loss: 0.01007
Epoch: 27616, Training Loss: 0.00779
Epoch: 27617, Training Loss: 0.00881
Epoch: 27617, Training Loss: 0.00822
Epoch: 27617, Training Loss: 0.01007
Epoch: 27617, Training Loss: 0.00779
Epoch: 27618, Training Loss: 0.00881
Epoch: 27618, Training Loss: 0.00822
Epoch: 27618, Training Loss: 0.01007
Epoch: 27618, Training Loss: 0.00779
Epoch: 27619, Training Loss: 0.00881
Epoch: 27619, Training Loss: 0.00822
Epoch: 27619, Training Loss: 0.01006
Epoch: 27619, Training Loss: 0.00779
Epoch: 27620, Training Loss: 0.00881
Epoch: 27620, Training Loss: 0.00822
Epoch: 27620, Training Loss: 0.01006
Epoch: 27620, Training Loss: 0.00779
Epoch: 27621, Training Loss: 0.00881
Epoch: 27621, Training Loss: 0.00822
Epoch: 27621, Training Loss: 0.01006
Epoch: 27621, Training Loss: 0.00779
Epoch: 27622, Training Loss: 0.00881
Epoch: 27622, Training Loss: 0.00822
Epoch: 27622, Training Loss: 0.01006
Epoch: 27622, Training Loss: 0.00779
Epoch: 27623, Training Loss: 0.00881
Epoch: 27623, Training Loss: 0.00822
Epoch: 27623, Training Loss: 0.01006
Epoch: 27623, Training Loss: 0.00779
Epoch: 27624, Training Loss: 0.00881
Epoch: 27624, Training Loss: 0.00822
Epoch: 27624, Training Loss: 0.01006
Epoch: 27624, Training Loss: 0.00779
Epoch: 27625, Training Loss: 0.00881
Epoch: 27625, Training Loss: 0.00822
Epoch: 27625, Training Loss: 0.01006
Epoch: 27625, Training Loss: 0.00779
Epoch: 27626, Training Loss: 0.00881
Epoch: 27626, Training Loss: 0.00822
Epoch: 27626, Training Loss: 0.01006
Epoch: 27626, Training Loss: 0.00779
Epoch: 27627, Training Loss: 0.00881
Epoch: 27627, Training Loss: 0.00822
Epoch: 27627, Training Loss: 0.01006
Epoch: 27627, Training Loss: 0.00779
Epoch: 27628, Training Loss: 0.00881
Epoch: 27628, Training Loss: 0.00822
Epoch: 27628, Training Loss: 0.01006
Epoch: 27628, Training Loss: 0.00779
Epoch: 27629, Training Loss: 0.00881
Epoch: 27629, Training Loss: 0.00822
Epoch: 27629, Training Loss: 0.01006
Epoch: 27629, Training Loss: 0.00779
Epoch: 27630, Training Loss: 0.00881
Epoch: 27630, Training Loss: 0.00822
Epoch: 27630, Training Loss: 0.01006
Epoch: 27630, Training Loss: 0.00779
Epoch: 27631, Training Loss: 0.00881
Epoch: 27631, Training Loss: 0.00822
Epoch: 27631, Training Loss: 0.01006
Epoch: 27631, Training Loss: 0.00779
Epoch: 27632, Training Loss: 0.00881
Epoch: 27632, Training Loss: 0.00822
Epoch: 27632, Training Loss: 0.01006
Epoch: 27632, Training Loss: 0.00779
Epoch: 27633, Training Loss: 0.00881
Epoch: 27633, Training Loss: 0.00822
Epoch: 27633, Training Loss: 0.01006
Epoch: 27633, Training Loss: 0.00779
Epoch: 27634, Training Loss: 0.00881
Epoch: 27634, Training Loss: 0.00822
Epoch: 27634, Training Loss: 0.01006
Epoch: 27634, Training Loss: 0.00779
Epoch: 27635, Training Loss: 0.00881
Epoch: 27635, Training Loss: 0.00822
Epoch: 27635, Training Loss: 0.01006
Epoch: 27635, Training Loss: 0.00779
Epoch: 27636, Training Loss: 0.00881
Epoch: 27636, Training Loss: 0.00822
Epoch: 27636, Training Loss: 0.01006
Epoch: 27636, Training Loss: 0.00779
Epoch: 27637, Training Loss: 0.00881
Epoch: 27637, Training Loss: 0.00822
Epoch: 27637, Training Loss: 0.01006
Epoch: 27637, Training Loss: 0.00779
Epoch: 27638, Training Loss: 0.00881
Epoch: 27638, Training Loss: 0.00822
Epoch: 27638, Training Loss: 0.01006
Epoch: 27638, Training Loss: 0.00779
Epoch: 27639, Training Loss: 0.00881
Epoch: 27639, Training Loss: 0.00822
Epoch: 27639, Training Loss: 0.01006
Epoch: 27639, Training Loss: 0.00779
Epoch: 27640, Training Loss: 0.00881
Epoch: 27640, Training Loss: 0.00822
Epoch: 27640, Training Loss: 0.01006
Epoch: 27640, Training Loss: 0.00779
Epoch: 27641, Training Loss: 0.00881
Epoch: 27641, Training Loss: 0.00822
Epoch: 27641, Training Loss: 0.01006
Epoch: 27641, Training Loss: 0.00779
Epoch: 27642, Training Loss: 0.00881
Epoch: 27642, Training Loss: 0.00822
Epoch: 27642, Training Loss: 0.01006
Epoch: 27642, Training Loss: 0.00779
Epoch: 27643, Training Loss: 0.00881
Epoch: 27643, Training Loss: 0.00822
Epoch: 27643, Training Loss: 0.01006
Epoch: 27643, Training Loss: 0.00779
Epoch: 27644, Training Loss: 0.00881
Epoch: 27644, Training Loss: 0.00822
Epoch: 27644, Training Loss: 0.01006
Epoch: 27644, Training Loss: 0.00779
Epoch: 27645, Training Loss: 0.00881
Epoch: 27645, Training Loss: 0.00822
Epoch: 27645, Training Loss: 0.01006
Epoch: 27645, Training Loss: 0.00778
Epoch: 27646, Training Loss: 0.00881
Epoch: 27646, Training Loss: 0.00822
Epoch: 27646, Training Loss: 0.01006
Epoch: 27646, Training Loss: 0.00778
Epoch: 27647, Training Loss: 0.00881
Epoch: 27647, Training Loss: 0.00822
Epoch: 27647, Training Loss: 0.01006
Epoch: 27647, Training Loss: 0.00778
Epoch: 27648, Training Loss: 0.00881
Epoch: 27648, Training Loss: 0.00822
Epoch: 27648, Training Loss: 0.01006
Epoch: 27648, Training Loss: 0.00778
Epoch: 27649, Training Loss: 0.00881
Epoch: 27649, Training Loss: 0.00822
Epoch: 27649, Training Loss: 0.01006
Epoch: 27649, Training Loss: 0.00778
Epoch: 27650, Training Loss: 0.00881
Epoch: 27650, Training Loss: 0.00822
Epoch: 27650, Training Loss: 0.01006
Epoch: 27650, Training Loss: 0.00778
Epoch: 27651, Training Loss: 0.00881
Epoch: 27651, Training Loss: 0.00822
Epoch: 27651, Training Loss: 0.01006
Epoch: 27651, Training Loss: 0.00778
Epoch: 27652, Training Loss: 0.00881
Epoch: 27652, Training Loss: 0.00822
Epoch: 27652, Training Loss: 0.01006
Epoch: 27652, Training Loss: 0.00778
Epoch: 27653, Training Loss: 0.00881
Epoch: 27653, Training Loss: 0.00822
Epoch: 27653, Training Loss: 0.01006
Epoch: 27653, Training Loss: 0.00778
Epoch: 27654, Training Loss: 0.00881
Epoch: 27654, Training Loss: 0.00822
Epoch: 27654, Training Loss: 0.01006
Epoch: 27654, Training Loss: 0.00778
Epoch: 27655, Training Loss: 0.00881
Epoch: 27655, Training Loss: 0.00822
Epoch: 27655, Training Loss: 0.01006
Epoch: 27655, Training Loss: 0.00778
Epoch: 27656, Training Loss: 0.00881
Epoch: 27656, Training Loss: 0.00822
Epoch: 27656, Training Loss: 0.01006
Epoch: 27656, Training Loss: 0.00778
Epoch: 27657, Training Loss: 0.00881
Epoch: 27657, Training Loss: 0.00822
Epoch: 27657, Training Loss: 0.01006
Epoch: 27657, Training Loss: 0.00778
Epoch: 27658, Training Loss: 0.00881
Epoch: 27658, Training Loss: 0.00822
Epoch: 27658, Training Loss: 0.01006
Epoch: 27658, Training Loss: 0.00778
Epoch: 27659, Training Loss: 0.00881
Epoch: 27659, Training Loss: 0.00822
Epoch: 27659, Training Loss: 0.01006
Epoch: 27659, Training Loss: 0.00778
Epoch: 27660, Training Loss: 0.00881
Epoch: 27660, Training Loss: 0.00822
Epoch: 27660, Training Loss: 0.01006
Epoch: 27660, Training Loss: 0.00778
Epoch: 27661, Training Loss: 0.00881
Epoch: 27661, Training Loss: 0.00822
Epoch: 27661, Training Loss: 0.01006
Epoch: 27661, Training Loss: 0.00778
Epoch: 27662, Training Loss: 0.00881
Epoch: 27662, Training Loss: 0.00822
Epoch: 27662, Training Loss: 0.01006
Epoch: 27662, Training Loss: 0.00778
Epoch: 27663, Training Loss: 0.00881
Epoch: 27663, Training Loss: 0.00822
Epoch: 27663, Training Loss: 0.01006
Epoch: 27663, Training Loss: 0.00778
Epoch: 27664, Training Loss: 0.00881
Epoch: 27664, Training Loss: 0.00822
Epoch: 27664, Training Loss: 0.01006
Epoch: 27664, Training Loss: 0.00778
Epoch: 27665, Training Loss: 0.00881
Epoch: 27665, Training Loss: 0.00822
Epoch: 27665, Training Loss: 0.01006
Epoch: 27665, Training Loss: 0.00778
Epoch: 27666, Training Loss: 0.00881
Epoch: 27666, Training Loss: 0.00822
Epoch: 27666, Training Loss: 0.01006
Epoch: 27666, Training Loss: 0.00778
Epoch: 27667, Training Loss: 0.00881
Epoch: 27667, Training Loss: 0.00822
Epoch: 27667, Training Loss: 0.01006
Epoch: 27667, Training Loss: 0.00778
Epoch: 27668, Training Loss: 0.00881
Epoch: 27668, Training Loss: 0.00822
Epoch: 27668, Training Loss: 0.01006
Epoch: 27668, Training Loss: 0.00778
Epoch: 27669, Training Loss: 0.00881
Epoch: 27669, Training Loss: 0.00822
Epoch: 27669, Training Loss: 0.01006
Epoch: 27669, Training Loss: 0.00778
Epoch: 27670, Training Loss: 0.00881
Epoch: 27670, Training Loss: 0.00822
Epoch: 27670, Training Loss: 0.01006
Epoch: 27670, Training Loss: 0.00778
Epoch: 27671, Training Loss: 0.00881
Epoch: 27671, Training Loss: 0.00822
Epoch: 27671, Training Loss: 0.01005
Epoch: 27671, Training Loss: 0.00778
Epoch: 27672, Training Loss: 0.00881
Epoch: 27672, Training Loss: 0.00822
Epoch: 27672, Training Loss: 0.01005
Epoch: 27672, Training Loss: 0.00778
Epoch: 27673, Training Loss: 0.00880
Epoch: 27673, Training Loss: 0.00822
Epoch: 27673, Training Loss: 0.01005
Epoch: 27673, Training Loss: 0.00778
Epoch: 27674, Training Loss: 0.00880
Epoch: 27674, Training Loss: 0.00822
Epoch: 27674, Training Loss: 0.01005
Epoch: 27674, Training Loss: 0.00778
Epoch: 27675, Training Loss: 0.00880
Epoch: 27675, Training Loss: 0.00822
Epoch: 27675, Training Loss: 0.01005
Epoch: 27675, Training Loss: 0.00778
Epoch: 27676, Training Loss: 0.00880
Epoch: 27676, Training Loss: 0.00822
Epoch: 27676, Training Loss: 0.01005
Epoch: 27676, Training Loss: 0.00778
Epoch: 27677, Training Loss: 0.00880
Epoch: 27677, Training Loss: 0.00822
Epoch: 27677, Training Loss: 0.01005
Epoch: 27677, Training Loss: 0.00778
Epoch: 27678, Training Loss: 0.00880
Epoch: 27678, Training Loss: 0.00821
Epoch: 27678, Training Loss: 0.01005
Epoch: 27678, Training Loss: 0.00778
Epoch: 27679, Training Loss: 0.00880
Epoch: 27679, Training Loss: 0.00821
Epoch: 27679, Training Loss: 0.01005
Epoch: 27679, Training Loss: 0.00778
Epoch: 27680, Training Loss: 0.00880
Epoch: 27680, Training Loss: 0.00821
Epoch: 27680, Training Loss: 0.01005
Epoch: 27680, Training Loss: 0.00778
Epoch: 27681, Training Loss: 0.00880
Epoch: 27681, Training Loss: 0.00821
Epoch: 27681, Training Loss: 0.01005
Epoch: 27681, Training Loss: 0.00778
Epoch: 27682, Training Loss: 0.00880
Epoch: 27682, Training Loss: 0.00821
Epoch: 27682, Training Loss: 0.01005
Epoch: 27682, Training Loss: 0.00778
Epoch: 27683, Training Loss: 0.00880
Epoch: 27683, Training Loss: 0.00821
Epoch: 27683, Training Loss: 0.01005
Epoch: 27683, Training Loss: 0.00778
Epoch: 27684, Training Loss: 0.00880
Epoch: 27684, Training Loss: 0.00821
Epoch: 27684, Training Loss: 0.01005
Epoch: 27684, Training Loss: 0.00778
Epoch: 27685, Training Loss: 0.00880
Epoch: 27685, Training Loss: 0.00821
Epoch: 27685, Training Loss: 0.01005
Epoch: 27685, Training Loss: 0.00778
Epoch: 27686, Training Loss: 0.00880
Epoch: 27686, Training Loss: 0.00821
Epoch: 27686, Training Loss: 0.01005
Epoch: 27686, Training Loss: 0.00778
Epoch: 27687, Training Loss: 0.00880
Epoch: 27687, Training Loss: 0.00821
Epoch: 27687, Training Loss: 0.01005
Epoch: 27687, Training Loss: 0.00778
Epoch: 27688, Training Loss: 0.00880
Epoch: 27688, Training Loss: 0.00821
Epoch: 27688, Training Loss: 0.01005
Epoch: 27688, Training Loss: 0.00778
Epoch: 27689, Training Loss: 0.00880
Epoch: 27689, Training Loss: 0.00821
Epoch: 27689, Training Loss: 0.01005
Epoch: 27689, Training Loss: 0.00778
Epoch: 27690, Training Loss: 0.00880
Epoch: 27690, Training Loss: 0.00821
Epoch: 27690, Training Loss: 0.01005
Epoch: 27690, Training Loss: 0.00778
Epoch: 27691, Training Loss: 0.00880
Epoch: 27691, Training Loss: 0.00821
Epoch: 27691, Training Loss: 0.01005
Epoch: 27691, Training Loss: 0.00778
Epoch: 27692, Training Loss: 0.00880
Epoch: 27692, Training Loss: 0.00821
Epoch: 27692, Training Loss: 0.01005
Epoch: 27692, Training Loss: 0.00778
Epoch: 27693, Training Loss: 0.00880
Epoch: 27693, Training Loss: 0.00821
Epoch: 27693, Training Loss: 0.01005
Epoch: 27693, Training Loss: 0.00778
Epoch: 27694, Training Loss: 0.00880
Epoch: 27694, Training Loss: 0.00821
Epoch: 27694, Training Loss: 0.01005
Epoch: 27694, Training Loss: 0.00778
Epoch: 27695, Training Loss: 0.00880
Epoch: 27695, Training Loss: 0.00821
Epoch: 27695, Training Loss: 0.01005
Epoch: 27695, Training Loss: 0.00778
Epoch: 27696, Training Loss: 0.00880
Epoch: 27696, Training Loss: 0.00821
Epoch: 27696, Training Loss: 0.01005
Epoch: 27696, Training Loss: 0.00778
Epoch: 27697, Training Loss: 0.00880
Epoch: 27697, Training Loss: 0.00821
Epoch: 27697, Training Loss: 0.01005
Epoch: 27697, Training Loss: 0.00778
Epoch: 27698, Training Loss: 0.00880
Epoch: 27698, Training Loss: 0.00821
Epoch: 27698, Training Loss: 0.01005
Epoch: 27698, Training Loss: 0.00778
Epoch: 27699, Training Loss: 0.00880
Epoch: 27699, Training Loss: 0.00821
Epoch: 27699, Training Loss: 0.01005
Epoch: 27699, Training Loss: 0.00778
Epoch: 27700, Training Loss: 0.00880
Epoch: 27700, Training Loss: 0.00821
Epoch: 27700, Training Loss: 0.01005
Epoch: 27700, Training Loss: 0.00778
Epoch: 27701, Training Loss: 0.00880
Epoch: 27701, Training Loss: 0.00821
Epoch: 27701, Training Loss: 0.01005
Epoch: 27701, Training Loss: 0.00778
Epoch: 27702, Training Loss: 0.00880
Epoch: 27702, Training Loss: 0.00821
Epoch: 27702, Training Loss: 0.01005
Epoch: 27702, Training Loss: 0.00778
Epoch: 27703, Training Loss: 0.00880
Epoch: 27703, Training Loss: 0.00821
Epoch: 27703, Training Loss: 0.01005
Epoch: 27703, Training Loss: 0.00778
Epoch: 27704, Training Loss: 0.00880
Epoch: 27704, Training Loss: 0.00821
Epoch: 27704, Training Loss: 0.01005
Epoch: 27704, Training Loss: 0.00778
Epoch: 27705, Training Loss: 0.00880
Epoch: 27705, Training Loss: 0.00821
Epoch: 27705, Training Loss: 0.01005
Epoch: 27705, Training Loss: 0.00778
Epoch: 27706, Training Loss: 0.00880
Epoch: 27706, Training Loss: 0.00821
Epoch: 27706, Training Loss: 0.01005
Epoch: 27706, Training Loss: 0.00778
Epoch: 27707, Training Loss: 0.00880
Epoch: 27707, Training Loss: 0.00821
Epoch: 27707, Training Loss: 0.01005
Epoch: 27707, Training Loss: 0.00778
Epoch: 27708, Training Loss: 0.00880
Epoch: 27708, Training Loss: 0.00821
Epoch: 27708, Training Loss: 0.01005
Epoch: 27708, Training Loss: 0.00778
Epoch: 27709, Training Loss: 0.00880
Epoch: 27709, Training Loss: 0.00821
Epoch: 27709, Training Loss: 0.01005
Epoch: 27709, Training Loss: 0.00778
Epoch: 27710, Training Loss: 0.00880
Epoch: 27710, Training Loss: 0.00821
Epoch: 27710, Training Loss: 0.01005
Epoch: 27710, Training Loss: 0.00778
Epoch: 27711, Training Loss: 0.00880
Epoch: 27711, Training Loss: 0.00821
Epoch: 27711, Training Loss: 0.01005
Epoch: 27711, Training Loss: 0.00778
Epoch: 27712, Training Loss: 0.00880
Epoch: 27712, Training Loss: 0.00821
Epoch: 27712, Training Loss: 0.01005
Epoch: 27712, Training Loss: 0.00778
Epoch: 27713, Training Loss: 0.00880
Epoch: 27713, Training Loss: 0.00821
Epoch: 27713, Training Loss: 0.01005
Epoch: 27713, Training Loss: 0.00777
Epoch: 27714, Training Loss: 0.00880
Epoch: 27714, Training Loss: 0.00821
Epoch: 27714, Training Loss: 0.01005
Epoch: 27714, Training Loss: 0.00777
Epoch: 27715, Training Loss: 0.00880
Epoch: 27715, Training Loss: 0.00821
Epoch: 27715, Training Loss: 0.01005
Epoch: 27715, Training Loss: 0.00777
Epoch: 27716, Training Loss: 0.00880
Epoch: 27716, Training Loss: 0.00821
Epoch: 27716, Training Loss: 0.01005
Epoch: 27716, Training Loss: 0.00777
Epoch: 27717, Training Loss: 0.00880
Epoch: 27717, Training Loss: 0.00821
Epoch: 27717, Training Loss: 0.01005
Epoch: 27717, Training Loss: 0.00777
Epoch: 27718, Training Loss: 0.00880
Epoch: 27718, Training Loss: 0.00821
Epoch: 27718, Training Loss: 0.01005
Epoch: 27718, Training Loss: 0.00777
Epoch: 27719, Training Loss: 0.00880
Epoch: 27719, Training Loss: 0.00821
Epoch: 27719, Training Loss: 0.01005
Epoch: 27719, Training Loss: 0.00777
Epoch: 27720, Training Loss: 0.00880
Epoch: 27720, Training Loss: 0.00821
Epoch: 27720, Training Loss: 0.01005
Epoch: 27720, Training Loss: 0.00777
Epoch: 27721, Training Loss: 0.00880
Epoch: 27721, Training Loss: 0.00821
[... output truncated: one loss line per training sample, four per epoch; the per-sample losses decrease only gradually, from (0.00880, 0.00821, 0.01005, 0.00777) at epoch 27721 to (0.00876, 0.00817, 0.01000, 0.00774) by epoch 27963 ...]
Epoch: 27964, Training Loss: 0.00876
Epoch: 27964, Training Loss: 0.00817
Epoch: 27964, Training Loss: 0.01000
Epoch: 27964, Training Loss: 0.00774
Epoch: 27965, Training Loss: 0.00876
Epoch: 27965, Training Loss: 0.00817
Epoch: 27965, Training Loss: 0.01000
Epoch: 27965, Training Loss: 0.00774
Epoch: 27966, Training Loss: 0.00876
Epoch: 27966, Training Loss: 0.00817
Epoch: 27966, Training Loss: 0.01000
Epoch: 27966, Training Loss: 0.00774
Epoch: 27967, Training Loss: 0.00876
Epoch: 27967, Training Loss: 0.00817
Epoch: 27967, Training Loss: 0.01000
Epoch: 27967, Training Loss: 0.00774
Epoch: 27968, Training Loss: 0.00876
Epoch: 27968, Training Loss: 0.00817
Epoch: 27968, Training Loss: 0.01000
Epoch: 27968, Training Loss: 0.00774
Epoch: 27969, Training Loss: 0.00876
Epoch: 27969, Training Loss: 0.00817
Epoch: 27969, Training Loss: 0.01000
Epoch: 27969, Training Loss: 0.00774
Epoch: 27970, Training Loss: 0.00875
Epoch: 27970, Training Loss: 0.00817
Epoch: 27970, Training Loss: 0.01000
Epoch: 27970, Training Loss: 0.00774
Epoch: 27971, Training Loss: 0.00875
Epoch: 27971, Training Loss: 0.00817
Epoch: 27971, Training Loss: 0.01000
Epoch: 27971, Training Loss: 0.00774
Epoch: 27972, Training Loss: 0.00875
Epoch: 27972, Training Loss: 0.00817
Epoch: 27972, Training Loss: 0.01000
Epoch: 27972, Training Loss: 0.00774
Epoch: 27973, Training Loss: 0.00875
Epoch: 27973, Training Loss: 0.00817
Epoch: 27973, Training Loss: 0.01000
Epoch: 27973, Training Loss: 0.00774
Epoch: 27974, Training Loss: 0.00875
Epoch: 27974, Training Loss: 0.00817
Epoch: 27974, Training Loss: 0.01000
Epoch: 27974, Training Loss: 0.00774
Epoch: 27975, Training Loss: 0.00875
Epoch: 27975, Training Loss: 0.00817
Epoch: 27975, Training Loss: 0.01000
Epoch: 27975, Training Loss: 0.00774
Epoch: 27976, Training Loss: 0.00875
Epoch: 27976, Training Loss: 0.00817
Epoch: 27976, Training Loss: 0.01000
Epoch: 27976, Training Loss: 0.00774
Epoch: 27977, Training Loss: 0.00875
Epoch: 27977, Training Loss: 0.00817
Epoch: 27977, Training Loss: 0.01000
Epoch: 27977, Training Loss: 0.00774
Epoch: 27978, Training Loss: 0.00875
Epoch: 27978, Training Loss: 0.00817
Epoch: 27978, Training Loss: 0.01000
Epoch: 27978, Training Loss: 0.00774
Epoch: 27979, Training Loss: 0.00875
Epoch: 27979, Training Loss: 0.00817
Epoch: 27979, Training Loss: 0.01000
Epoch: 27979, Training Loss: 0.00774
Epoch: 27980, Training Loss: 0.00875
Epoch: 27980, Training Loss: 0.00817
Epoch: 27980, Training Loss: 0.01000
Epoch: 27980, Training Loss: 0.00774
Epoch: 27981, Training Loss: 0.00875
Epoch: 27981, Training Loss: 0.00817
Epoch: 27981, Training Loss: 0.01000
Epoch: 27981, Training Loss: 0.00774
Epoch: 27982, Training Loss: 0.00875
Epoch: 27982, Training Loss: 0.00817
Epoch: 27982, Training Loss: 0.01000
Epoch: 27982, Training Loss: 0.00774
Epoch: 27983, Training Loss: 0.00875
Epoch: 27983, Training Loss: 0.00817
Epoch: 27983, Training Loss: 0.01000
Epoch: 27983, Training Loss: 0.00774
Epoch: 27984, Training Loss: 0.00875
Epoch: 27984, Training Loss: 0.00817
Epoch: 27984, Training Loss: 0.00999
Epoch: 27984, Training Loss: 0.00774
Epoch: 27985, Training Loss: 0.00875
Epoch: 27985, Training Loss: 0.00817
Epoch: 27985, Training Loss: 0.00999
Epoch: 27985, Training Loss: 0.00774
Epoch: 27986, Training Loss: 0.00875
Epoch: 27986, Training Loss: 0.00817
Epoch: 27986, Training Loss: 0.00999
Epoch: 27986, Training Loss: 0.00773
Epoch: 27987, Training Loss: 0.00875
Epoch: 27987, Training Loss: 0.00817
Epoch: 27987, Training Loss: 0.00999
Epoch: 27987, Training Loss: 0.00773
Epoch: 27988, Training Loss: 0.00875
Epoch: 27988, Training Loss: 0.00817
Epoch: 27988, Training Loss: 0.00999
Epoch: 27988, Training Loss: 0.00773
Epoch: 27989, Training Loss: 0.00875
Epoch: 27989, Training Loss: 0.00817
Epoch: 27989, Training Loss: 0.00999
Epoch: 27989, Training Loss: 0.00773
Epoch: 27990, Training Loss: 0.00875
Epoch: 27990, Training Loss: 0.00817
Epoch: 27990, Training Loss: 0.00999
Epoch: 27990, Training Loss: 0.00773
Epoch: 27991, Training Loss: 0.00875
Epoch: 27991, Training Loss: 0.00817
Epoch: 27991, Training Loss: 0.00999
Epoch: 27991, Training Loss: 0.00773
Epoch: 27992, Training Loss: 0.00875
Epoch: 27992, Training Loss: 0.00817
Epoch: 27992, Training Loss: 0.00999
Epoch: 27992, Training Loss: 0.00773
Epoch: 27993, Training Loss: 0.00875
Epoch: 27993, Training Loss: 0.00817
Epoch: 27993, Training Loss: 0.00999
Epoch: 27993, Training Loss: 0.00773
Epoch: 27994, Training Loss: 0.00875
Epoch: 27994, Training Loss: 0.00817
Epoch: 27994, Training Loss: 0.00999
Epoch: 27994, Training Loss: 0.00773
Epoch: 27995, Training Loss: 0.00875
Epoch: 27995, Training Loss: 0.00817
Epoch: 27995, Training Loss: 0.00999
Epoch: 27995, Training Loss: 0.00773
Epoch: 27996, Training Loss: 0.00875
Epoch: 27996, Training Loss: 0.00817
Epoch: 27996, Training Loss: 0.00999
Epoch: 27996, Training Loss: 0.00773
Epoch: 27997, Training Loss: 0.00875
Epoch: 27997, Training Loss: 0.00817
Epoch: 27997, Training Loss: 0.00999
Epoch: 27997, Training Loss: 0.00773
Epoch: 27998, Training Loss: 0.00875
Epoch: 27998, Training Loss: 0.00817
Epoch: 27998, Training Loss: 0.00999
Epoch: 27998, Training Loss: 0.00773
Epoch: 27999, Training Loss: 0.00875
Epoch: 27999, Training Loss: 0.00817
Epoch: 27999, Training Loss: 0.00999
Epoch: 27999, Training Loss: 0.00773
Epoch: 28000, Training Loss: 0.00875
Epoch: 28000, Training Loss: 0.00817
Epoch: 28000, Training Loss: 0.00999
Epoch: 28000, Training Loss: 0.00773
Epoch: 28001, Training Loss: 0.00875
Epoch: 28001, Training Loss: 0.00817
Epoch: 28001, Training Loss: 0.00999
Epoch: 28001, Training Loss: 0.00773
Epoch: 28002, Training Loss: 0.00875
Epoch: 28002, Training Loss: 0.00817
Epoch: 28002, Training Loss: 0.00999
Epoch: 28002, Training Loss: 0.00773
Epoch: 28003, Training Loss: 0.00875
Epoch: 28003, Training Loss: 0.00816
Epoch: 28003, Training Loss: 0.00999
Epoch: 28003, Training Loss: 0.00773
Epoch: 28004, Training Loss: 0.00875
Epoch: 28004, Training Loss: 0.00816
Epoch: 28004, Training Loss: 0.00999
Epoch: 28004, Training Loss: 0.00773
Epoch: 28005, Training Loss: 0.00875
Epoch: 28005, Training Loss: 0.00816
Epoch: 28005, Training Loss: 0.00999
Epoch: 28005, Training Loss: 0.00773
Epoch: 28006, Training Loss: 0.00875
Epoch: 28006, Training Loss: 0.00816
Epoch: 28006, Training Loss: 0.00999
Epoch: 28006, Training Loss: 0.00773
Epoch: 28007, Training Loss: 0.00875
Epoch: 28007, Training Loss: 0.00816
Epoch: 28007, Training Loss: 0.00999
Epoch: 28007, Training Loss: 0.00773
Epoch: 28008, Training Loss: 0.00875
Epoch: 28008, Training Loss: 0.00816
Epoch: 28008, Training Loss: 0.00999
Epoch: 28008, Training Loss: 0.00773
Epoch: 28009, Training Loss: 0.00875
Epoch: 28009, Training Loss: 0.00816
Epoch: 28009, Training Loss: 0.00999
Epoch: 28009, Training Loss: 0.00773
Epoch: 28010, Training Loss: 0.00875
Epoch: 28010, Training Loss: 0.00816
Epoch: 28010, Training Loss: 0.00999
Epoch: 28010, Training Loss: 0.00773
Epoch: 28011, Training Loss: 0.00875
Epoch: 28011, Training Loss: 0.00816
Epoch: 28011, Training Loss: 0.00999
Epoch: 28011, Training Loss: 0.00773
Epoch: 28012, Training Loss: 0.00875
Epoch: 28012, Training Loss: 0.00816
Epoch: 28012, Training Loss: 0.00999
Epoch: 28012, Training Loss: 0.00773
Epoch: 28013, Training Loss: 0.00875
Epoch: 28013, Training Loss: 0.00816
Epoch: 28013, Training Loss: 0.00999
Epoch: 28013, Training Loss: 0.00773
Epoch: 28014, Training Loss: 0.00875
Epoch: 28014, Training Loss: 0.00816
Epoch: 28014, Training Loss: 0.00999
Epoch: 28014, Training Loss: 0.00773
Epoch: 28015, Training Loss: 0.00875
Epoch: 28015, Training Loss: 0.00816
Epoch: 28015, Training Loss: 0.00999
Epoch: 28015, Training Loss: 0.00773
Epoch: 28016, Training Loss: 0.00875
Epoch: 28016, Training Loss: 0.00816
Epoch: 28016, Training Loss: 0.00999
Epoch: 28016, Training Loss: 0.00773
Epoch: 28017, Training Loss: 0.00875
Epoch: 28017, Training Loss: 0.00816
Epoch: 28017, Training Loss: 0.00999
Epoch: 28017, Training Loss: 0.00773
Epoch: 28018, Training Loss: 0.00875
Epoch: 28018, Training Loss: 0.00816
Epoch: 28018, Training Loss: 0.00999
Epoch: 28018, Training Loss: 0.00773
Epoch: 28019, Training Loss: 0.00875
Epoch: 28019, Training Loss: 0.00816
Epoch: 28019, Training Loss: 0.00999
Epoch: 28019, Training Loss: 0.00773
Epoch: 28020, Training Loss: 0.00875
Epoch: 28020, Training Loss: 0.00816
Epoch: 28020, Training Loss: 0.00999
Epoch: 28020, Training Loss: 0.00773
Epoch: 28021, Training Loss: 0.00875
Epoch: 28021, Training Loss: 0.00816
Epoch: 28021, Training Loss: 0.00999
Epoch: 28021, Training Loss: 0.00773
Epoch: 28022, Training Loss: 0.00875
Epoch: 28022, Training Loss: 0.00816
Epoch: 28022, Training Loss: 0.00999
Epoch: 28022, Training Loss: 0.00773
Epoch: 28023, Training Loss: 0.00875
Epoch: 28023, Training Loss: 0.00816
Epoch: 28023, Training Loss: 0.00999
Epoch: 28023, Training Loss: 0.00773
Epoch: 28024, Training Loss: 0.00875
Epoch: 28024, Training Loss: 0.00816
Epoch: 28024, Training Loss: 0.00999
Epoch: 28024, Training Loss: 0.00773
Epoch: 28025, Training Loss: 0.00875
Epoch: 28025, Training Loss: 0.00816
Epoch: 28025, Training Loss: 0.00999
Epoch: 28025, Training Loss: 0.00773
Epoch: 28026, Training Loss: 0.00875
Epoch: 28026, Training Loss: 0.00816
Epoch: 28026, Training Loss: 0.00999
Epoch: 28026, Training Loss: 0.00773
Epoch: 28027, Training Loss: 0.00875
Epoch: 28027, Training Loss: 0.00816
Epoch: 28027, Training Loss: 0.00999
Epoch: 28027, Training Loss: 0.00773
Epoch: 28028, Training Loss: 0.00875
Epoch: 28028, Training Loss: 0.00816
Epoch: 28028, Training Loss: 0.00999
Epoch: 28028, Training Loss: 0.00773
Epoch: 28029, Training Loss: 0.00875
Epoch: 28029, Training Loss: 0.00816
Epoch: 28029, Training Loss: 0.00999
Epoch: 28029, Training Loss: 0.00773
Epoch: 28030, Training Loss: 0.00874
Epoch: 28030, Training Loss: 0.00816
Epoch: 28030, Training Loss: 0.00999
Epoch: 28030, Training Loss: 0.00773
Epoch: 28031, Training Loss: 0.00874
Epoch: 28031, Training Loss: 0.00816
Epoch: 28031, Training Loss: 0.00999
Epoch: 28031, Training Loss: 0.00773
Epoch: 28032, Training Loss: 0.00874
Epoch: 28032, Training Loss: 0.00816
Epoch: 28032, Training Loss: 0.00999
Epoch: 28032, Training Loss: 0.00773
Epoch: 28033, Training Loss: 0.00874
Epoch: 28033, Training Loss: 0.00816
Epoch: 28033, Training Loss: 0.00999
Epoch: 28033, Training Loss: 0.00773
Epoch: 28034, Training Loss: 0.00874
Epoch: 28034, Training Loss: 0.00816
Epoch: 28034, Training Loss: 0.00999
Epoch: 28034, Training Loss: 0.00773
Epoch: 28035, Training Loss: 0.00874
Epoch: 28035, Training Loss: 0.00816
Epoch: 28035, Training Loss: 0.00999
Epoch: 28035, Training Loss: 0.00773
Epoch: 28036, Training Loss: 0.00874
Epoch: 28036, Training Loss: 0.00816
Epoch: 28036, Training Loss: 0.00999
Epoch: 28036, Training Loss: 0.00773
Epoch: 28037, Training Loss: 0.00874
Epoch: 28037, Training Loss: 0.00816
Epoch: 28037, Training Loss: 0.00998
Epoch: 28037, Training Loss: 0.00773
Epoch: 28038, Training Loss: 0.00874
Epoch: 28038, Training Loss: 0.00816
Epoch: 28038, Training Loss: 0.00998
Epoch: 28038, Training Loss: 0.00773
Epoch: 28039, Training Loss: 0.00874
Epoch: 28039, Training Loss: 0.00816
Epoch: 28039, Training Loss: 0.00998
Epoch: 28039, Training Loss: 0.00773
Epoch: 28040, Training Loss: 0.00874
Epoch: 28040, Training Loss: 0.00816
Epoch: 28040, Training Loss: 0.00998
Epoch: 28040, Training Loss: 0.00773
Epoch: 28041, Training Loss: 0.00874
Epoch: 28041, Training Loss: 0.00816
Epoch: 28041, Training Loss: 0.00998
Epoch: 28041, Training Loss: 0.00773
Epoch: 28042, Training Loss: 0.00874
Epoch: 28042, Training Loss: 0.00816
Epoch: 28042, Training Loss: 0.00998
Epoch: 28042, Training Loss: 0.00773
Epoch: 28043, Training Loss: 0.00874
Epoch: 28043, Training Loss: 0.00816
Epoch: 28043, Training Loss: 0.00998
Epoch: 28043, Training Loss: 0.00773
Epoch: 28044, Training Loss: 0.00874
Epoch: 28044, Training Loss: 0.00816
Epoch: 28044, Training Loss: 0.00998
Epoch: 28044, Training Loss: 0.00773
Epoch: 28045, Training Loss: 0.00874
Epoch: 28045, Training Loss: 0.00816
Epoch: 28045, Training Loss: 0.00998
Epoch: 28045, Training Loss: 0.00773
Epoch: 28046, Training Loss: 0.00874
Epoch: 28046, Training Loss: 0.00816
Epoch: 28046, Training Loss: 0.00998
Epoch: 28046, Training Loss: 0.00773
Epoch: 28047, Training Loss: 0.00874
Epoch: 28047, Training Loss: 0.00816
Epoch: 28047, Training Loss: 0.00998
Epoch: 28047, Training Loss: 0.00773
Epoch: 28048, Training Loss: 0.00874
Epoch: 28048, Training Loss: 0.00816
Epoch: 28048, Training Loss: 0.00998
Epoch: 28048, Training Loss: 0.00773
Epoch: 28049, Training Loss: 0.00874
Epoch: 28049, Training Loss: 0.00816
Epoch: 28049, Training Loss: 0.00998
Epoch: 28049, Training Loss: 0.00773
Epoch: 28050, Training Loss: 0.00874
Epoch: 28050, Training Loss: 0.00816
Epoch: 28050, Training Loss: 0.00998
Epoch: 28050, Training Loss: 0.00773
Epoch: 28051, Training Loss: 0.00874
Epoch: 28051, Training Loss: 0.00816
Epoch: 28051, Training Loss: 0.00998
Epoch: 28051, Training Loss: 0.00773
Epoch: 28052, Training Loss: 0.00874
Epoch: 28052, Training Loss: 0.00816
Epoch: 28052, Training Loss: 0.00998
Epoch: 28052, Training Loss: 0.00773
Epoch: 28053, Training Loss: 0.00874
Epoch: 28053, Training Loss: 0.00816
Epoch: 28053, Training Loss: 0.00998
Epoch: 28053, Training Loss: 0.00773
Epoch: 28054, Training Loss: 0.00874
Epoch: 28054, Training Loss: 0.00816
Epoch: 28054, Training Loss: 0.00998
Epoch: 28054, Training Loss: 0.00773
Epoch: 28055, Training Loss: 0.00874
Epoch: 28055, Training Loss: 0.00816
Epoch: 28055, Training Loss: 0.00998
Epoch: 28055, Training Loss: 0.00772
Epoch: 28056, Training Loss: 0.00874
Epoch: 28056, Training Loss: 0.00816
Epoch: 28056, Training Loss: 0.00998
Epoch: 28056, Training Loss: 0.00772
Epoch: 28057, Training Loss: 0.00874
Epoch: 28057, Training Loss: 0.00816
Epoch: 28057, Training Loss: 0.00998
Epoch: 28057, Training Loss: 0.00772
Epoch: 28058, Training Loss: 0.00874
Epoch: 28058, Training Loss: 0.00816
Epoch: 28058, Training Loss: 0.00998
Epoch: 28058, Training Loss: 0.00772
Epoch: 28059, Training Loss: 0.00874
Epoch: 28059, Training Loss: 0.00816
Epoch: 28059, Training Loss: 0.00998
Epoch: 28059, Training Loss: 0.00772
Epoch: 28060, Training Loss: 0.00874
Epoch: 28060, Training Loss: 0.00816
Epoch: 28060, Training Loss: 0.00998
Epoch: 28060, Training Loss: 0.00772
Epoch: 28061, Training Loss: 0.00874
Epoch: 28061, Training Loss: 0.00816
Epoch: 28061, Training Loss: 0.00998
Epoch: 28061, Training Loss: 0.00772
Epoch: 28062, Training Loss: 0.00874
Epoch: 28062, Training Loss: 0.00816
Epoch: 28062, Training Loss: 0.00998
Epoch: 28062, Training Loss: 0.00772
Epoch: 28063, Training Loss: 0.00874
Epoch: 28063, Training Loss: 0.00816
Epoch: 28063, Training Loss: 0.00998
Epoch: 28063, Training Loss: 0.00772
Epoch: 28064, Training Loss: 0.00874
Epoch: 28064, Training Loss: 0.00816
Epoch: 28064, Training Loss: 0.00998
Epoch: 28064, Training Loss: 0.00772
Epoch: 28065, Training Loss: 0.00874
Epoch: 28065, Training Loss: 0.00816
Epoch: 28065, Training Loss: 0.00998
Epoch: 28065, Training Loss: 0.00772
Epoch: 28066, Training Loss: 0.00874
Epoch: 28066, Training Loss: 0.00816
Epoch: 28066, Training Loss: 0.00998
Epoch: 28066, Training Loss: 0.00772
Epoch: 28067, Training Loss: 0.00874
Epoch: 28067, Training Loss: 0.00816
Epoch: 28067, Training Loss: 0.00998
Epoch: 28067, Training Loss: 0.00772
Epoch: 28068, Training Loss: 0.00874
Epoch: 28068, Training Loss: 0.00815
Epoch: 28068, Training Loss: 0.00998
Epoch: 28068, Training Loss: 0.00772
Epoch: 28069, Training Loss: 0.00874
Epoch: 28069, Training Loss: 0.00815
Epoch: 28069, Training Loss: 0.00998
Epoch: 28069, Training Loss: 0.00772
Epoch: 28070, Training Loss: 0.00874
Epoch: 28070, Training Loss: 0.00815
Epoch: 28070, Training Loss: 0.00998
Epoch: 28070, Training Loss: 0.00772
Epoch: 28071, Training Loss: 0.00874
Epoch: 28071, Training Loss: 0.00815
Epoch: 28071, Training Loss: 0.00998
Epoch: 28071, Training Loss: 0.00772
Epoch: 28072, Training Loss: 0.00874
Epoch: 28072, Training Loss: 0.00815
Epoch: 28072, Training Loss: 0.00998
Epoch: 28072, Training Loss: 0.00772
Epoch: 28073, Training Loss: 0.00874
Epoch: 28073, Training Loss: 0.00815
Epoch: 28073, Training Loss: 0.00998
Epoch: 28073, Training Loss: 0.00772
Epoch: 28074, Training Loss: 0.00874
Epoch: 28074, Training Loss: 0.00815
Epoch: 28074, Training Loss: 0.00998
Epoch: 28074, Training Loss: 0.00772
Epoch: 28075, Training Loss: 0.00874
Epoch: 28075, Training Loss: 0.00815
Epoch: 28075, Training Loss: 0.00998
Epoch: 28075, Training Loss: 0.00772
Epoch: 28076, Training Loss: 0.00874
Epoch: 28076, Training Loss: 0.00815
Epoch: 28076, Training Loss: 0.00998
Epoch: 28076, Training Loss: 0.00772
Epoch: 28077, Training Loss: 0.00874
Epoch: 28077, Training Loss: 0.00815
Epoch: 28077, Training Loss: 0.00998
Epoch: 28077, Training Loss: 0.00772
Epoch: 28078, Training Loss: 0.00874
Epoch: 28078, Training Loss: 0.00815
Epoch: 28078, Training Loss: 0.00998
Epoch: 28078, Training Loss: 0.00772
Epoch: 28079, Training Loss: 0.00874
Epoch: 28079, Training Loss: 0.00815
Epoch: 28079, Training Loss: 0.00998
Epoch: 28079, Training Loss: 0.00772
Epoch: 28080, Training Loss: 0.00874
Epoch: 28080, Training Loss: 0.00815
Epoch: 28080, Training Loss: 0.00998
Epoch: 28080, Training Loss: 0.00772
Epoch: 28081, Training Loss: 0.00874
Epoch: 28081, Training Loss: 0.00815
Epoch: 28081, Training Loss: 0.00998
Epoch: 28081, Training Loss: 0.00772
Epoch: 28082, Training Loss: 0.00874
Epoch: 28082, Training Loss: 0.00815
Epoch: 28082, Training Loss: 0.00998
Epoch: 28082, Training Loss: 0.00772
Epoch: 28083, Training Loss: 0.00874
Epoch: 28083, Training Loss: 0.00815
Epoch: 28083, Training Loss: 0.00998
Epoch: 28083, Training Loss: 0.00772
Epoch: 28084, Training Loss: 0.00874
Epoch: 28084, Training Loss: 0.00815
Epoch: 28084, Training Loss: 0.00998
Epoch: 28084, Training Loss: 0.00772
Epoch: 28085, Training Loss: 0.00874
Epoch: 28085, Training Loss: 0.00815
Epoch: 28085, Training Loss: 0.00998
Epoch: 28085, Training Loss: 0.00772
Epoch: 28086, Training Loss: 0.00874
Epoch: 28086, Training Loss: 0.00815
Epoch: 28086, Training Loss: 0.00998
Epoch: 28086, Training Loss: 0.00772
Epoch: 28087, Training Loss: 0.00874
Epoch: 28087, Training Loss: 0.00815
Epoch: 28087, Training Loss: 0.00998
Epoch: 28087, Training Loss: 0.00772
Epoch: 28088, Training Loss: 0.00874
Epoch: 28088, Training Loss: 0.00815
Epoch: 28088, Training Loss: 0.00998
Epoch: 28088, Training Loss: 0.00772
Epoch: 28089, Training Loss: 0.00874
Epoch: 28089, Training Loss: 0.00815
Epoch: 28089, Training Loss: 0.00998
Epoch: 28089, Training Loss: 0.00772
Epoch: 28090, Training Loss: 0.00873
Epoch: 28090, Training Loss: 0.00815
Epoch: 28090, Training Loss: 0.00997
Epoch: 28090, Training Loss: 0.00772
Epoch: 28091, Training Loss: 0.00873
Epoch: 28091, Training Loss: 0.00815
Epoch: 28091, Training Loss: 0.00997
Epoch: 28091, Training Loss: 0.00772
Epoch: 28092, Training Loss: 0.00873
Epoch: 28092, Training Loss: 0.00815
Epoch: 28092, Training Loss: 0.00997
Epoch: 28092, Training Loss: 0.00772
Epoch: 28093, Training Loss: 0.00873
Epoch: 28093, Training Loss: 0.00815
Epoch: 28093, Training Loss: 0.00997
Epoch: 28093, Training Loss: 0.00772
Epoch: 28094, Training Loss: 0.00873
Epoch: 28094, Training Loss: 0.00815
Epoch: 28094, Training Loss: 0.00997
Epoch: 28094, Training Loss: 0.00772
Epoch: 28095, Training Loss: 0.00873
Epoch: 28095, Training Loss: 0.00815
Epoch: 28095, Training Loss: 0.00997
Epoch: 28095, Training Loss: 0.00772
Epoch: 28096, Training Loss: 0.00873
Epoch: 28096, Training Loss: 0.00815
Epoch: 28096, Training Loss: 0.00997
Epoch: 28096, Training Loss: 0.00772
Epoch: 28097, Training Loss: 0.00873
Epoch: 28097, Training Loss: 0.00815
Epoch: 28097, Training Loss: 0.00997
Epoch: 28097, Training Loss: 0.00772
Epoch: 28098, Training Loss: 0.00873
Epoch: 28098, Training Loss: 0.00815
Epoch: 28098, Training Loss: 0.00997
Epoch: 28098, Training Loss: 0.00772
Epoch: 28099, Training Loss: 0.00873
Epoch: 28099, Training Loss: 0.00815
Epoch: 28099, Training Loss: 0.00997
Epoch: 28099, Training Loss: 0.00772
Epoch: 28100, Training Loss: 0.00873
Epoch: 28100, Training Loss: 0.00815
Epoch: 28100, Training Loss: 0.00997
Epoch: 28100, Training Loss: 0.00772
Epoch: 28101, Training Loss: 0.00873
Epoch: 28101, Training Loss: 0.00815
Epoch: 28101, Training Loss: 0.00997
Epoch: 28101, Training Loss: 0.00772
Epoch: 28102, Training Loss: 0.00873
Epoch: 28102, Training Loss: 0.00815
Epoch: 28102, Training Loss: 0.00997
Epoch: 28102, Training Loss: 0.00772
Epoch: 28103, Training Loss: 0.00873
Epoch: 28103, Training Loss: 0.00815
Epoch: 28103, Training Loss: 0.00997
Epoch: 28103, Training Loss: 0.00772
Epoch: 28104, Training Loss: 0.00873
Epoch: 28104, Training Loss: 0.00815
Epoch: 28104, Training Loss: 0.00997
Epoch: 28104, Training Loss: 0.00772
Epoch: 28105, Training Loss: 0.00873
Epoch: 28105, Training Loss: 0.00815
Epoch: 28105, Training Loss: 0.00997
Epoch: 28105, Training Loss: 0.00772
Epoch: 28106, Training Loss: 0.00873
Epoch: 28106, Training Loss: 0.00815
Epoch: 28106, Training Loss: 0.00997
Epoch: 28106, Training Loss: 0.00772
Epoch: 28107, Training Loss: 0.00873
Epoch: 28107, Training Loss: 0.00815
Epoch: 28107, Training Loss: 0.00997
Epoch: 28107, Training Loss: 0.00772
Epoch: 28108, Training Loss: 0.00873
Epoch: 28108, Training Loss: 0.00815
Epoch: 28108, Training Loss: 0.00997
Epoch: 28108, Training Loss: 0.00772
Epoch: 28109, Training Loss: 0.00873
Epoch: 28109, Training Loss: 0.00815
Epoch: 28109, Training Loss: 0.00997
Epoch: 28109, Training Loss: 0.00772
Epoch: 28110, Training Loss: 0.00873
Epoch: 28110, Training Loss: 0.00815
Epoch: 28110, Training Loss: 0.00997
Epoch: 28110, Training Loss: 0.00772
Epoch: 28111, Training Loss: 0.00873
Epoch: 28111, Training Loss: 0.00815
Epoch: 28111, Training Loss: 0.00997
Epoch: 28111, Training Loss: 0.00772
Epoch: 28112, Training Loss: 0.00873
Epoch: 28112, Training Loss: 0.00815
Epoch: 28112, Training Loss: 0.00997
Epoch: 28112, Training Loss: 0.00772
Epoch: 28113, Training Loss: 0.00873
Epoch: 28113, Training Loss: 0.00815
Epoch: 28113, Training Loss: 0.00997
Epoch: 28113, Training Loss: 0.00772
Epoch: 28114, Training Loss: 0.00873
Epoch: 28114, Training Loss: 0.00815
Epoch: 28114, Training Loss: 0.00997
Epoch: 28114, Training Loss: 0.00772
Epoch: 28115, Training Loss: 0.00873
Epoch: 28115, Training Loss: 0.00815
Epoch: 28115, Training Loss: 0.00997
Epoch: 28115, Training Loss: 0.00772
Epoch: 28116, Training Loss: 0.00873
Epoch: 28116, Training Loss: 0.00815
Epoch: 28116, Training Loss: 0.00997
Epoch: 28116, Training Loss: 0.00772
Epoch: 28117, Training Loss: 0.00873
Epoch: 28117, Training Loss: 0.00815
Epoch: 28117, Training Loss: 0.00997
Epoch: 28117, Training Loss: 0.00772
Epoch: 28118, Training Loss: 0.00873
Epoch: 28118, Training Loss: 0.00815
Epoch: 28118, Training Loss: 0.00997
Epoch: 28118, Training Loss: 0.00772
Epoch: 28119, Training Loss: 0.00873
Epoch: 28119, Training Loss: 0.00815
Epoch: 28119, Training Loss: 0.00997
Epoch: 28119, Training Loss: 0.00772
Epoch: 28120, Training Loss: 0.00873
Epoch: 28120, Training Loss: 0.00815
Epoch: 28120, Training Loss: 0.00997
Epoch: 28120, Training Loss: 0.00772
Epoch: 28121, Training Loss: 0.00873
Epoch: 28121, Training Loss: 0.00815
Epoch: 28121, Training Loss: 0.00997
Epoch: 28121, Training Loss: 0.00772
Epoch: 28122, Training Loss: 0.00873
Epoch: 28122, Training Loss: 0.00815
Epoch: 28122, Training Loss: 0.00997
Epoch: 28122, Training Loss: 0.00772
Epoch: 28123, Training Loss: 0.00873
Epoch: 28123, Training Loss: 0.00815
Epoch: 28123, Training Loss: 0.00997
Epoch: 28123, Training Loss: 0.00772
Epoch: 28124, Training Loss: 0.00873
Epoch: 28124, Training Loss: 0.00815
Epoch: 28124, Training Loss: 0.00997
Epoch: 28124, Training Loss: 0.00771
Epoch: 28125, Training Loss: 0.00873
Epoch: 28125, Training Loss: 0.00815
Epoch: 28125, Training Loss: 0.00997
Epoch: 28125, Training Loss: 0.00771
Epoch: 28126, Training Loss: 0.00873
Epoch: 28126, Training Loss: 0.00815
Epoch: 28126, Training Loss: 0.00997
Epoch: 28126, Training Loss: 0.00771
Epoch: 28127, Training Loss: 0.00873
Epoch: 28127, Training Loss: 0.00815
Epoch: 28127, Training Loss: 0.00997
Epoch: 28127, Training Loss: 0.00771
Epoch: 28128, Training Loss: 0.00873
Epoch: 28128, Training Loss: 0.00815
Epoch: 28128, Training Loss: 0.00997
Epoch: 28128, Training Loss: 0.00771
Epoch: 28129, Training Loss: 0.00873
Epoch: 28129, Training Loss: 0.00815
Epoch: 28129, Training Loss: 0.00997
Epoch: 28129, Training Loss: 0.00771
Epoch: 28130, Training Loss: 0.00873
Epoch: 28130, Training Loss: 0.00815
Epoch: 28130, Training Loss: 0.00997
Epoch: 28130, Training Loss: 0.00771
Epoch: 28131, Training Loss: 0.00873
Epoch: 28131, Training Loss: 0.00815
Epoch: 28131, Training Loss: 0.00997
Epoch: 28131, Training Loss: 0.00771
Epoch: 28132, Training Loss: 0.00873
Epoch: 28132, Training Loss: 0.00815
Epoch: 28132, Training Loss: 0.00997
Epoch: 28132, Training Loss: 0.00771
Epoch: 28133, Training Loss: 0.00873
Epoch: 28133, Training Loss: 0.00815
Epoch: 28133, Training Loss: 0.00997
Epoch: 28133, Training Loss: 0.00771
Epoch: 28134, Training Loss: 0.00873
Epoch: 28134, Training Loss: 0.00814
Epoch: 28134, Training Loss: 0.00997
Epoch: 28134, Training Loss: 0.00771
Epoch: 28135, Training Loss: 0.00873
Epoch: 28135, Training Loss: 0.00814
Epoch: 28135, Training Loss: 0.00997
Epoch: 28135, Training Loss: 0.00771
Epoch: 28136, Training Loss: 0.00873
Epoch: 28136, Training Loss: 0.00814
Epoch: 28136, Training Loss: 0.00997
Epoch: 28136, Training Loss: 0.00771
Epoch: 28137, Training Loss: 0.00873
Epoch: 28137, Training Loss: 0.00814
Epoch: 28137, Training Loss: 0.00997
Epoch: 28137, Training Loss: 0.00771
Epoch: 28138, Training Loss: 0.00873
Epoch: 28138, Training Loss: 0.00814
Epoch: 28138, Training Loss: 0.00997
Epoch: 28138, Training Loss: 0.00771
Epoch: 28139, Training Loss: 0.00873
Epoch: 28139, Training Loss: 0.00814
Epoch: 28139, Training Loss: 0.00997
Epoch: 28139, Training Loss: 0.00771
Epoch: 28140, Training Loss: 0.00873
Epoch: 28140, Training Loss: 0.00814
Epoch: 28140, Training Loss: 0.00997
Epoch: 28140, Training Loss: 0.00771
Epoch: 28141, Training Loss: 0.00873
Epoch: 28141, Training Loss: 0.00814
Epoch: 28141, Training Loss: 0.00997
Epoch: 28141, Training Loss: 0.00771
Epoch: 28142, Training Loss: 0.00873
Epoch: 28142, Training Loss: 0.00814
Epoch: 28142, Training Loss: 0.00997
Epoch: 28142, Training Loss: 0.00771
Epoch: 28143, Training Loss: 0.00873
Epoch: 28143, Training Loss: 0.00814
Epoch: 28143, Training Loss: 0.00996
Epoch: 28143, Training Loss: 0.00771
Epoch: 28144, Training Loss: 0.00873
Epoch: 28144, Training Loss: 0.00814
Epoch: 28144, Training Loss: 0.00996
Epoch: 28144, Training Loss: 0.00771
Epoch: 28145, Training Loss: 0.00873
Epoch: 28145, Training Loss: 0.00814
Epoch: 28145, Training Loss: 0.00996
Epoch: 28145, Training Loss: 0.00771
Epoch: 28146, Training Loss: 0.00873
Epoch: 28146, Training Loss: 0.00814
Epoch: 28146, Training Loss: 0.00996
Epoch: 28146, Training Loss: 0.00771
Epoch: 28147, Training Loss: 0.00873
Epoch: 28147, Training Loss: 0.00814
Epoch: 28147, Training Loss: 0.00996
Epoch: 28147, Training Loss: 0.00771
Epoch: 28148, Training Loss: 0.00873
Epoch: 28148, Training Loss: 0.00814
Epoch: 28148, Training Loss: 0.00996
Epoch: 28148, Training Loss: 0.00771
Epoch: 28149, Training Loss: 0.00873
Epoch: 28149, Training Loss: 0.00814
Epoch: 28149, Training Loss: 0.00996
Epoch: 28149, Training Loss: 0.00771
Epoch: 28150, Training Loss: 0.00873
Epoch: 28150, Training Loss: 0.00814
Epoch: 28150, Training Loss: 0.00996
Epoch: 28150, Training Loss: 0.00771
Epoch: 28151, Training Loss: 0.00872
Epoch: 28151, Training Loss: 0.00814
Epoch: 28151, Training Loss: 0.00996
Epoch: 28151, Training Loss: 0.00771
Epoch: 28152, Training Loss: 0.00872
Epoch: 28152, Training Loss: 0.00814
Epoch: 28152, Training Loss: 0.00996
Epoch: 28152, Training Loss: 0.00771
Epoch: 28153, Training Loss: 0.00872
Epoch: 28153, Training Loss: 0.00814
Epoch: 28153, Training Loss: 0.00996
Epoch: 28153, Training Loss: 0.00771
... (per-pattern loss lines for epochs 28154-28395 omitted; the four losses decrease very slowly, e.g. 0.00996 -> 0.00992) ...
Epoch: 28396, Training Loss: 0.00868
Epoch: 28396, Training Loss: 0.00811
Epoch: 28396, Training Loss: 0.00992
Epoch: 28396, Training Loss: 0.00768
Epoch: 28397, Training Loss: 0.00868
Epoch: 28397, Training Loss: 0.00811
Epoch: 28397, Training Loss: 0.00992
Epoch: 28397, Training Loss: 0.00768
Epoch: 28398, Training Loss: 0.00868
Epoch: 28398, Training Loss: 0.00811
Epoch: 28398, Training Loss: 0.00992
Epoch: 28398, Training Loss: 0.00768
Epoch: 28399, Training Loss: 0.00868
Epoch: 28399, Training Loss: 0.00811
Epoch: 28399, Training Loss: 0.00992
Epoch: 28399, Training Loss: 0.00768
Epoch: 28400, Training Loss: 0.00868
Epoch: 28400, Training Loss: 0.00810
Epoch: 28400, Training Loss: 0.00992
Epoch: 28400, Training Loss: 0.00768
Epoch: 28401, Training Loss: 0.00868
Epoch: 28401, Training Loss: 0.00810
Epoch: 28401, Training Loss: 0.00992
Epoch: 28401, Training Loss: 0.00768
Epoch: 28402, Training Loss: 0.00868
Epoch: 28402, Training Loss: 0.00810
Epoch: 28402, Training Loss: 0.00992
Epoch: 28402, Training Loss: 0.00768
Epoch: 28403, Training Loss: 0.00868
Epoch: 28403, Training Loss: 0.00810
Epoch: 28403, Training Loss: 0.00992
Epoch: 28403, Training Loss: 0.00768
Epoch: 28404, Training Loss: 0.00868
Epoch: 28404, Training Loss: 0.00810
Epoch: 28404, Training Loss: 0.00992
Epoch: 28404, Training Loss: 0.00767
Epoch: 28405, Training Loss: 0.00868
Epoch: 28405, Training Loss: 0.00810
Epoch: 28405, Training Loss: 0.00992
Epoch: 28405, Training Loss: 0.00767
Epoch: 28406, Training Loss: 0.00868
Epoch: 28406, Training Loss: 0.00810
Epoch: 28406, Training Loss: 0.00992
Epoch: 28406, Training Loss: 0.00767
Epoch: 28407, Training Loss: 0.00868
Epoch: 28407, Training Loss: 0.00810
Epoch: 28407, Training Loss: 0.00992
Epoch: 28407, Training Loss: 0.00767
Epoch: 28408, Training Loss: 0.00868
Epoch: 28408, Training Loss: 0.00810
Epoch: 28408, Training Loss: 0.00992
Epoch: 28408, Training Loss: 0.00767
Epoch: 28409, Training Loss: 0.00868
Epoch: 28409, Training Loss: 0.00810
Epoch: 28409, Training Loss: 0.00992
Epoch: 28409, Training Loss: 0.00767
Epoch: 28410, Training Loss: 0.00868
Epoch: 28410, Training Loss: 0.00810
Epoch: 28410, Training Loss: 0.00991
Epoch: 28410, Training Loss: 0.00767
Epoch: 28411, Training Loss: 0.00868
Epoch: 28411, Training Loss: 0.00810
Epoch: 28411, Training Loss: 0.00991
Epoch: 28411, Training Loss: 0.00767
Epoch: 28412, Training Loss: 0.00868
Epoch: 28412, Training Loss: 0.00810
Epoch: 28412, Training Loss: 0.00991
Epoch: 28412, Training Loss: 0.00767
Epoch: 28413, Training Loss: 0.00868
Epoch: 28413, Training Loss: 0.00810
Epoch: 28413, Training Loss: 0.00991
Epoch: 28413, Training Loss: 0.00767
Epoch: 28414, Training Loss: 0.00868
Epoch: 28414, Training Loss: 0.00810
Epoch: 28414, Training Loss: 0.00991
Epoch: 28414, Training Loss: 0.00767
Epoch: 28415, Training Loss: 0.00868
Epoch: 28415, Training Loss: 0.00810
Epoch: 28415, Training Loss: 0.00991
Epoch: 28415, Training Loss: 0.00767
Epoch: 28416, Training Loss: 0.00868
Epoch: 28416, Training Loss: 0.00810
Epoch: 28416, Training Loss: 0.00991
Epoch: 28416, Training Loss: 0.00767
Epoch: 28417, Training Loss: 0.00868
Epoch: 28417, Training Loss: 0.00810
Epoch: 28417, Training Loss: 0.00991
Epoch: 28417, Training Loss: 0.00767
Epoch: 28418, Training Loss: 0.00868
Epoch: 28418, Training Loss: 0.00810
Epoch: 28418, Training Loss: 0.00991
Epoch: 28418, Training Loss: 0.00767
Epoch: 28419, Training Loss: 0.00868
Epoch: 28419, Training Loss: 0.00810
Epoch: 28419, Training Loss: 0.00991
Epoch: 28419, Training Loss: 0.00767
Epoch: 28420, Training Loss: 0.00868
Epoch: 28420, Training Loss: 0.00810
Epoch: 28420, Training Loss: 0.00991
Epoch: 28420, Training Loss: 0.00767
Epoch: 28421, Training Loss: 0.00868
Epoch: 28421, Training Loss: 0.00810
Epoch: 28421, Training Loss: 0.00991
Epoch: 28421, Training Loss: 0.00767
Epoch: 28422, Training Loss: 0.00868
Epoch: 28422, Training Loss: 0.00810
Epoch: 28422, Training Loss: 0.00991
Epoch: 28422, Training Loss: 0.00767
Epoch: 28423, Training Loss: 0.00868
Epoch: 28423, Training Loss: 0.00810
Epoch: 28423, Training Loss: 0.00991
Epoch: 28423, Training Loss: 0.00767
Epoch: 28424, Training Loss: 0.00868
Epoch: 28424, Training Loss: 0.00810
Epoch: 28424, Training Loss: 0.00991
Epoch: 28424, Training Loss: 0.00767
Epoch: 28425, Training Loss: 0.00868
Epoch: 28425, Training Loss: 0.00810
Epoch: 28425, Training Loss: 0.00991
Epoch: 28425, Training Loss: 0.00767
Epoch: 28426, Training Loss: 0.00868
Epoch: 28426, Training Loss: 0.00810
Epoch: 28426, Training Loss: 0.00991
Epoch: 28426, Training Loss: 0.00767
Epoch: 28427, Training Loss: 0.00868
Epoch: 28427, Training Loss: 0.00810
Epoch: 28427, Training Loss: 0.00991
Epoch: 28427, Training Loss: 0.00767
Epoch: 28428, Training Loss: 0.00868
Epoch: 28428, Training Loss: 0.00810
Epoch: 28428, Training Loss: 0.00991
Epoch: 28428, Training Loss: 0.00767
Epoch: 28429, Training Loss: 0.00868
Epoch: 28429, Training Loss: 0.00810
Epoch: 28429, Training Loss: 0.00991
Epoch: 28429, Training Loss: 0.00767
Epoch: 28430, Training Loss: 0.00868
Epoch: 28430, Training Loss: 0.00810
Epoch: 28430, Training Loss: 0.00991
Epoch: 28430, Training Loss: 0.00767
Epoch: 28431, Training Loss: 0.00868
Epoch: 28431, Training Loss: 0.00810
Epoch: 28431, Training Loss: 0.00991
Epoch: 28431, Training Loss: 0.00767
Epoch: 28432, Training Loss: 0.00868
Epoch: 28432, Training Loss: 0.00810
Epoch: 28432, Training Loss: 0.00991
Epoch: 28432, Training Loss: 0.00767
Epoch: 28433, Training Loss: 0.00868
Epoch: 28433, Training Loss: 0.00810
Epoch: 28433, Training Loss: 0.00991
Epoch: 28433, Training Loss: 0.00767
Epoch: 28434, Training Loss: 0.00868
Epoch: 28434, Training Loss: 0.00810
Epoch: 28434, Training Loss: 0.00991
Epoch: 28434, Training Loss: 0.00767
Epoch: 28435, Training Loss: 0.00868
Epoch: 28435, Training Loss: 0.00810
Epoch: 28435, Training Loss: 0.00991
Epoch: 28435, Training Loss: 0.00767
Epoch: 28436, Training Loss: 0.00868
Epoch: 28436, Training Loss: 0.00810
Epoch: 28436, Training Loss: 0.00991
Epoch: 28436, Training Loss: 0.00767
Epoch: 28437, Training Loss: 0.00868
Epoch: 28437, Training Loss: 0.00810
Epoch: 28437, Training Loss: 0.00991
Epoch: 28437, Training Loss: 0.00767
Epoch: 28438, Training Loss: 0.00868
Epoch: 28438, Training Loss: 0.00810
Epoch: 28438, Training Loss: 0.00991
Epoch: 28438, Training Loss: 0.00767
Epoch: 28439, Training Loss: 0.00868
Epoch: 28439, Training Loss: 0.00810
Epoch: 28439, Training Loss: 0.00991
Epoch: 28439, Training Loss: 0.00767
Epoch: 28440, Training Loss: 0.00868
Epoch: 28440, Training Loss: 0.00810
Epoch: 28440, Training Loss: 0.00991
Epoch: 28440, Training Loss: 0.00767
Epoch: 28441, Training Loss: 0.00868
Epoch: 28441, Training Loss: 0.00810
Epoch: 28441, Training Loss: 0.00991
Epoch: 28441, Training Loss: 0.00767
Epoch: 28442, Training Loss: 0.00868
Epoch: 28442, Training Loss: 0.00810
Epoch: 28442, Training Loss: 0.00991
Epoch: 28442, Training Loss: 0.00767
Epoch: 28443, Training Loss: 0.00868
Epoch: 28443, Training Loss: 0.00810
Epoch: 28443, Training Loss: 0.00991
Epoch: 28443, Training Loss: 0.00767
Epoch: 28444, Training Loss: 0.00868
Epoch: 28444, Training Loss: 0.00810
Epoch: 28444, Training Loss: 0.00991
Epoch: 28444, Training Loss: 0.00767
Epoch: 28445, Training Loss: 0.00868
Epoch: 28445, Training Loss: 0.00810
Epoch: 28445, Training Loss: 0.00991
Epoch: 28445, Training Loss: 0.00767
Epoch: 28446, Training Loss: 0.00868
Epoch: 28446, Training Loss: 0.00810
Epoch: 28446, Training Loss: 0.00991
Epoch: 28446, Training Loss: 0.00767
Epoch: 28447, Training Loss: 0.00868
Epoch: 28447, Training Loss: 0.00810
Epoch: 28447, Training Loss: 0.00991
Epoch: 28447, Training Loss: 0.00767
Epoch: 28448, Training Loss: 0.00868
Epoch: 28448, Training Loss: 0.00810
Epoch: 28448, Training Loss: 0.00991
Epoch: 28448, Training Loss: 0.00767
Epoch: 28449, Training Loss: 0.00868
Epoch: 28449, Training Loss: 0.00810
Epoch: 28449, Training Loss: 0.00991
Epoch: 28449, Training Loss: 0.00767
Epoch: 28450, Training Loss: 0.00868
Epoch: 28450, Training Loss: 0.00810
Epoch: 28450, Training Loss: 0.00991
Epoch: 28450, Training Loss: 0.00767
Epoch: 28451, Training Loss: 0.00868
Epoch: 28451, Training Loss: 0.00810
Epoch: 28451, Training Loss: 0.00991
Epoch: 28451, Training Loss: 0.00767
Epoch: 28452, Training Loss: 0.00868
Epoch: 28452, Training Loss: 0.00810
Epoch: 28452, Training Loss: 0.00991
Epoch: 28452, Training Loss: 0.00767
Epoch: 28453, Training Loss: 0.00868
Epoch: 28453, Training Loss: 0.00810
Epoch: 28453, Training Loss: 0.00991
Epoch: 28453, Training Loss: 0.00767
Epoch: 28454, Training Loss: 0.00868
Epoch: 28454, Training Loss: 0.00810
Epoch: 28454, Training Loss: 0.00991
Epoch: 28454, Training Loss: 0.00767
Epoch: 28455, Training Loss: 0.00868
Epoch: 28455, Training Loss: 0.00810
Epoch: 28455, Training Loss: 0.00991
Epoch: 28455, Training Loss: 0.00767
Epoch: 28456, Training Loss: 0.00867
Epoch: 28456, Training Loss: 0.00810
Epoch: 28456, Training Loss: 0.00991
Epoch: 28456, Training Loss: 0.00767
Epoch: 28457, Training Loss: 0.00867
Epoch: 28457, Training Loss: 0.00810
Epoch: 28457, Training Loss: 0.00991
Epoch: 28457, Training Loss: 0.00767
Epoch: 28458, Training Loss: 0.00867
Epoch: 28458, Training Loss: 0.00810
Epoch: 28458, Training Loss: 0.00991
Epoch: 28458, Training Loss: 0.00767
Epoch: 28459, Training Loss: 0.00867
Epoch: 28459, Training Loss: 0.00810
Epoch: 28459, Training Loss: 0.00991
Epoch: 28459, Training Loss: 0.00767
Epoch: 28460, Training Loss: 0.00867
Epoch: 28460, Training Loss: 0.00810
Epoch: 28460, Training Loss: 0.00991
Epoch: 28460, Training Loss: 0.00767
Epoch: 28461, Training Loss: 0.00867
Epoch: 28461, Training Loss: 0.00810
Epoch: 28461, Training Loss: 0.00991
Epoch: 28461, Training Loss: 0.00767
Epoch: 28462, Training Loss: 0.00867
Epoch: 28462, Training Loss: 0.00810
Epoch: 28462, Training Loss: 0.00991
Epoch: 28462, Training Loss: 0.00767
Epoch: 28463, Training Loss: 0.00867
Epoch: 28463, Training Loss: 0.00810
Epoch: 28463, Training Loss: 0.00991
Epoch: 28463, Training Loss: 0.00767
Epoch: 28464, Training Loss: 0.00867
Epoch: 28464, Training Loss: 0.00810
Epoch: 28464, Training Loss: 0.00990
Epoch: 28464, Training Loss: 0.00767
Epoch: 28465, Training Loss: 0.00867
Epoch: 28465, Training Loss: 0.00810
Epoch: 28465, Training Loss: 0.00990
Epoch: 28465, Training Loss: 0.00767
Epoch: 28466, Training Loss: 0.00867
Epoch: 28466, Training Loss: 0.00810
Epoch: 28466, Training Loss: 0.00990
Epoch: 28466, Training Loss: 0.00767
Epoch: 28467, Training Loss: 0.00867
Epoch: 28467, Training Loss: 0.00809
Epoch: 28467, Training Loss: 0.00990
Epoch: 28467, Training Loss: 0.00767
Epoch: 28468, Training Loss: 0.00867
Epoch: 28468, Training Loss: 0.00809
Epoch: 28468, Training Loss: 0.00990
Epoch: 28468, Training Loss: 0.00767
Epoch: 28469, Training Loss: 0.00867
Epoch: 28469, Training Loss: 0.00809
Epoch: 28469, Training Loss: 0.00990
Epoch: 28469, Training Loss: 0.00767
Epoch: 28470, Training Loss: 0.00867
Epoch: 28470, Training Loss: 0.00809
Epoch: 28470, Training Loss: 0.00990
Epoch: 28470, Training Loss: 0.00767
Epoch: 28471, Training Loss: 0.00867
Epoch: 28471, Training Loss: 0.00809
Epoch: 28471, Training Loss: 0.00990
Epoch: 28471, Training Loss: 0.00767
Epoch: 28472, Training Loss: 0.00867
Epoch: 28472, Training Loss: 0.00809
Epoch: 28472, Training Loss: 0.00990
Epoch: 28472, Training Loss: 0.00767
Epoch: 28473, Training Loss: 0.00867
Epoch: 28473, Training Loss: 0.00809
Epoch: 28473, Training Loss: 0.00990
Epoch: 28473, Training Loss: 0.00767
Epoch: 28474, Training Loss: 0.00867
Epoch: 28474, Training Loss: 0.00809
Epoch: 28474, Training Loss: 0.00990
Epoch: 28474, Training Loss: 0.00767
Epoch: 28475, Training Loss: 0.00867
Epoch: 28475, Training Loss: 0.00809
Epoch: 28475, Training Loss: 0.00990
Epoch: 28475, Training Loss: 0.00766
Epoch: 28476, Training Loss: 0.00867
Epoch: 28476, Training Loss: 0.00809
Epoch: 28476, Training Loss: 0.00990
Epoch: 28476, Training Loss: 0.00766
Epoch: 28477, Training Loss: 0.00867
Epoch: 28477, Training Loss: 0.00809
Epoch: 28477, Training Loss: 0.00990
Epoch: 28477, Training Loss: 0.00766
Epoch: 28478, Training Loss: 0.00867
Epoch: 28478, Training Loss: 0.00809
Epoch: 28478, Training Loss: 0.00990
Epoch: 28478, Training Loss: 0.00766
Epoch: 28479, Training Loss: 0.00867
Epoch: 28479, Training Loss: 0.00809
Epoch: 28479, Training Loss: 0.00990
Epoch: 28479, Training Loss: 0.00766
Epoch: 28480, Training Loss: 0.00867
Epoch: 28480, Training Loss: 0.00809
Epoch: 28480, Training Loss: 0.00990
Epoch: 28480, Training Loss: 0.00766
Epoch: 28481, Training Loss: 0.00867
Epoch: 28481, Training Loss: 0.00809
Epoch: 28481, Training Loss: 0.00990
Epoch: 28481, Training Loss: 0.00766
Epoch: 28482, Training Loss: 0.00867
Epoch: 28482, Training Loss: 0.00809
Epoch: 28482, Training Loss: 0.00990
Epoch: 28482, Training Loss: 0.00766
Epoch: 28483, Training Loss: 0.00867
Epoch: 28483, Training Loss: 0.00809
Epoch: 28483, Training Loss: 0.00990
Epoch: 28483, Training Loss: 0.00766
Epoch: 28484, Training Loss: 0.00867
Epoch: 28484, Training Loss: 0.00809
Epoch: 28484, Training Loss: 0.00990
Epoch: 28484, Training Loss: 0.00766
Epoch: 28485, Training Loss: 0.00867
Epoch: 28485, Training Loss: 0.00809
Epoch: 28485, Training Loss: 0.00990
Epoch: 28485, Training Loss: 0.00766
Epoch: 28486, Training Loss: 0.00867
Epoch: 28486, Training Loss: 0.00809
Epoch: 28486, Training Loss: 0.00990
Epoch: 28486, Training Loss: 0.00766
Epoch: 28487, Training Loss: 0.00867
Epoch: 28487, Training Loss: 0.00809
Epoch: 28487, Training Loss: 0.00990
Epoch: 28487, Training Loss: 0.00766
Epoch: 28488, Training Loss: 0.00867
Epoch: 28488, Training Loss: 0.00809
Epoch: 28488, Training Loss: 0.00990
Epoch: 28488, Training Loss: 0.00766
Epoch: 28489, Training Loss: 0.00867
Epoch: 28489, Training Loss: 0.00809
Epoch: 28489, Training Loss: 0.00990
Epoch: 28489, Training Loss: 0.00766
Epoch: 28490, Training Loss: 0.00867
Epoch: 28490, Training Loss: 0.00809
Epoch: 28490, Training Loss: 0.00990
Epoch: 28490, Training Loss: 0.00766
Epoch: 28491, Training Loss: 0.00867
Epoch: 28491, Training Loss: 0.00809
Epoch: 28491, Training Loss: 0.00990
Epoch: 28491, Training Loss: 0.00766
Epoch: 28492, Training Loss: 0.00867
Epoch: 28492, Training Loss: 0.00809
Epoch: 28492, Training Loss: 0.00990
Epoch: 28492, Training Loss: 0.00766
Epoch: 28493, Training Loss: 0.00867
Epoch: 28493, Training Loss: 0.00809
Epoch: 28493, Training Loss: 0.00990
Epoch: 28493, Training Loss: 0.00766
Epoch: 28494, Training Loss: 0.00867
Epoch: 28494, Training Loss: 0.00809
Epoch: 28494, Training Loss: 0.00990
Epoch: 28494, Training Loss: 0.00766
Epoch: 28495, Training Loss: 0.00867
Epoch: 28495, Training Loss: 0.00809
Epoch: 28495, Training Loss: 0.00990
Epoch: 28495, Training Loss: 0.00766
Epoch: 28496, Training Loss: 0.00867
Epoch: 28496, Training Loss: 0.00809
Epoch: 28496, Training Loss: 0.00990
Epoch: 28496, Training Loss: 0.00766
Epoch: 28497, Training Loss: 0.00867
Epoch: 28497, Training Loss: 0.00809
Epoch: 28497, Training Loss: 0.00990
Epoch: 28497, Training Loss: 0.00766
Epoch: 28498, Training Loss: 0.00867
Epoch: 28498, Training Loss: 0.00809
Epoch: 28498, Training Loss: 0.00990
Epoch: 28498, Training Loss: 0.00766
Epoch: 28499, Training Loss: 0.00867
Epoch: 28499, Training Loss: 0.00809
Epoch: 28499, Training Loss: 0.00990
Epoch: 28499, Training Loss: 0.00766
Epoch: 28500, Training Loss: 0.00867
Epoch: 28500, Training Loss: 0.00809
Epoch: 28500, Training Loss: 0.00990
Epoch: 28500, Training Loss: 0.00766
Epoch: 28501, Training Loss: 0.00867
Epoch: 28501, Training Loss: 0.00809
Epoch: 28501, Training Loss: 0.00990
Epoch: 28501, Training Loss: 0.00766
Epoch: 28502, Training Loss: 0.00867
Epoch: 28502, Training Loss: 0.00809
Epoch: 28502, Training Loss: 0.00990
Epoch: 28502, Training Loss: 0.00766
Epoch: 28503, Training Loss: 0.00867
Epoch: 28503, Training Loss: 0.00809
Epoch: 28503, Training Loss: 0.00990
Epoch: 28503, Training Loss: 0.00766
Epoch: 28504, Training Loss: 0.00867
Epoch: 28504, Training Loss: 0.00809
Epoch: 28504, Training Loss: 0.00990
Epoch: 28504, Training Loss: 0.00766
Epoch: 28505, Training Loss: 0.00867
Epoch: 28505, Training Loss: 0.00809
Epoch: 28505, Training Loss: 0.00990
Epoch: 28505, Training Loss: 0.00766
Epoch: 28506, Training Loss: 0.00867
Epoch: 28506, Training Loss: 0.00809
Epoch: 28506, Training Loss: 0.00990
Epoch: 28506, Training Loss: 0.00766
Epoch: 28507, Training Loss: 0.00867
Epoch: 28507, Training Loss: 0.00809
Epoch: 28507, Training Loss: 0.00990
Epoch: 28507, Training Loss: 0.00766
Epoch: 28508, Training Loss: 0.00867
Epoch: 28508, Training Loss: 0.00809
Epoch: 28508, Training Loss: 0.00990
Epoch: 28508, Training Loss: 0.00766
Epoch: 28509, Training Loss: 0.00867
Epoch: 28509, Training Loss: 0.00809
Epoch: 28509, Training Loss: 0.00990
Epoch: 28509, Training Loss: 0.00766
Epoch: 28510, Training Loss: 0.00867
Epoch: 28510, Training Loss: 0.00809
Epoch: 28510, Training Loss: 0.00990
Epoch: 28510, Training Loss: 0.00766
Epoch: 28511, Training Loss: 0.00867
Epoch: 28511, Training Loss: 0.00809
Epoch: 28511, Training Loss: 0.00990
Epoch: 28511, Training Loss: 0.00766
Epoch: 28512, Training Loss: 0.00867
Epoch: 28512, Training Loss: 0.00809
Epoch: 28512, Training Loss: 0.00990
Epoch: 28512, Training Loss: 0.00766
Epoch: 28513, Training Loss: 0.00867
Epoch: 28513, Training Loss: 0.00809
Epoch: 28513, Training Loss: 0.00990
Epoch: 28513, Training Loss: 0.00766
Epoch: 28514, Training Loss: 0.00867
Epoch: 28514, Training Loss: 0.00809
Epoch: 28514, Training Loss: 0.00990
Epoch: 28514, Training Loss: 0.00766
Epoch: 28515, Training Loss: 0.00867
Epoch: 28515, Training Loss: 0.00809
Epoch: 28515, Training Loss: 0.00990
Epoch: 28515, Training Loss: 0.00766
Epoch: 28516, Training Loss: 0.00867
Epoch: 28516, Training Loss: 0.00809
Epoch: 28516, Training Loss: 0.00990
Epoch: 28516, Training Loss: 0.00766
Epoch: 28517, Training Loss: 0.00867
Epoch: 28517, Training Loss: 0.00809
Epoch: 28517, Training Loss: 0.00990
Epoch: 28517, Training Loss: 0.00766
Epoch: 28518, Training Loss: 0.00866
Epoch: 28518, Training Loss: 0.00809
Epoch: 28518, Training Loss: 0.00990
Epoch: 28518, Training Loss: 0.00766
Epoch: 28519, Training Loss: 0.00866
Epoch: 28519, Training Loss: 0.00809
Epoch: 28519, Training Loss: 0.00989
Epoch: 28519, Training Loss: 0.00766
Epoch: 28520, Training Loss: 0.00866
Epoch: 28520, Training Loss: 0.00809
Epoch: 28520, Training Loss: 0.00989
Epoch: 28520, Training Loss: 0.00766
Epoch: 28521, Training Loss: 0.00866
Epoch: 28521, Training Loss: 0.00809
Epoch: 28521, Training Loss: 0.00989
Epoch: 28521, Training Loss: 0.00766
Epoch: 28522, Training Loss: 0.00866
Epoch: 28522, Training Loss: 0.00809
Epoch: 28522, Training Loss: 0.00989
Epoch: 28522, Training Loss: 0.00766
Epoch: 28523, Training Loss: 0.00866
Epoch: 28523, Training Loss: 0.00809
Epoch: 28523, Training Loss: 0.00989
Epoch: 28523, Training Loss: 0.00766
Epoch: 28524, Training Loss: 0.00866
Epoch: 28524, Training Loss: 0.00809
Epoch: 28524, Training Loss: 0.00989
Epoch: 28524, Training Loss: 0.00766
Epoch: 28525, Training Loss: 0.00866
Epoch: 28525, Training Loss: 0.00809
Epoch: 28525, Training Loss: 0.00989
Epoch: 28525, Training Loss: 0.00766
Epoch: 28526, Training Loss: 0.00866
Epoch: 28526, Training Loss: 0.00809
Epoch: 28526, Training Loss: 0.00989
Epoch: 28526, Training Loss: 0.00766
Epoch: 28527, Training Loss: 0.00866
Epoch: 28527, Training Loss: 0.00809
Epoch: 28527, Training Loss: 0.00989
Epoch: 28527, Training Loss: 0.00766
Epoch: 28528, Training Loss: 0.00866
Epoch: 28528, Training Loss: 0.00809
Epoch: 28528, Training Loss: 0.00989
Epoch: 28528, Training Loss: 0.00766
Epoch: 28529, Training Loss: 0.00866
Epoch: 28529, Training Loss: 0.00809
Epoch: 28529, Training Loss: 0.00989
Epoch: 28529, Training Loss: 0.00766
Epoch: 28530, Training Loss: 0.00866
Epoch: 28530, Training Loss: 0.00809
Epoch: 28530, Training Loss: 0.00989
Epoch: 28530, Training Loss: 0.00766
Epoch: 28531, Training Loss: 0.00866
Epoch: 28531, Training Loss: 0.00809
Epoch: 28531, Training Loss: 0.00989
Epoch: 28531, Training Loss: 0.00766
Epoch: 28532, Training Loss: 0.00866
Epoch: 28532, Training Loss: 0.00809
Epoch: 28532, Training Loss: 0.00989
Epoch: 28532, Training Loss: 0.00766
Epoch: 28533, Training Loss: 0.00866
Epoch: 28533, Training Loss: 0.00809
Epoch: 28533, Training Loss: 0.00989
Epoch: 28533, Training Loss: 0.00766
Epoch: 28534, Training Loss: 0.00866
Epoch: 28534, Training Loss: 0.00808
Epoch: 28534, Training Loss: 0.00989
Epoch: 28534, Training Loss: 0.00766
Epoch: 28535, Training Loss: 0.00866
Epoch: 28535, Training Loss: 0.00808
Epoch: 28535, Training Loss: 0.00989
Epoch: 28535, Training Loss: 0.00766
Epoch: 28536, Training Loss: 0.00866
Epoch: 28536, Training Loss: 0.00808
Epoch: 28536, Training Loss: 0.00989
Epoch: 28536, Training Loss: 0.00766
Epoch: 28537, Training Loss: 0.00866
Epoch: 28537, Training Loss: 0.00808
Epoch: 28537, Training Loss: 0.00989
Epoch: 28537, Training Loss: 0.00766
Epoch: 28538, Training Loss: 0.00866
Epoch: 28538, Training Loss: 0.00808
Epoch: 28538, Training Loss: 0.00989
Epoch: 28538, Training Loss: 0.00766
Epoch: 28539, Training Loss: 0.00866
Epoch: 28539, Training Loss: 0.00808
Epoch: 28539, Training Loss: 0.00989
Epoch: 28539, Training Loss: 0.00766
Epoch: 28540, Training Loss: 0.00866
Epoch: 28540, Training Loss: 0.00808
Epoch: 28540, Training Loss: 0.00989
Epoch: 28540, Training Loss: 0.00766
Epoch: 28541, Training Loss: 0.00866
Epoch: 28541, Training Loss: 0.00808
Epoch: 28541, Training Loss: 0.00989
Epoch: 28541, Training Loss: 0.00766
Epoch: 28542, Training Loss: 0.00866
Epoch: 28542, Training Loss: 0.00808
Epoch: 28542, Training Loss: 0.00989
Epoch: 28542, Training Loss: 0.00766
Epoch: 28543, Training Loss: 0.00866
Epoch: 28543, Training Loss: 0.00808
Epoch: 28543, Training Loss: 0.00989
Epoch: 28543, Training Loss: 0.00766
Epoch: 28544, Training Loss: 0.00866
Epoch: 28544, Training Loss: 0.00808
Epoch: 28544, Training Loss: 0.00989
Epoch: 28544, Training Loss: 0.00766
Epoch: 28545, Training Loss: 0.00866
Epoch: 28545, Training Loss: 0.00808
Epoch: 28545, Training Loss: 0.00989
Epoch: 28545, Training Loss: 0.00766
Epoch: 28546, Training Loss: 0.00866
Epoch: 28546, Training Loss: 0.00808
Epoch: 28546, Training Loss: 0.00989
Epoch: 28546, Training Loss: 0.00765
Epoch: 28547, Training Loss: 0.00866
Epoch: 28547, Training Loss: 0.00808
Epoch: 28547, Training Loss: 0.00989
Epoch: 28547, Training Loss: 0.00765
Epoch: 28548, Training Loss: 0.00866
Epoch: 28548, Training Loss: 0.00808
Epoch: 28548, Training Loss: 0.00989
Epoch: 28548, Training Loss: 0.00765
Epoch: 28549, Training Loss: 0.00866
Epoch: 28549, Training Loss: 0.00808
Epoch: 28549, Training Loss: 0.00989
Epoch: 28549, Training Loss: 0.00765
Epoch: 28550, Training Loss: 0.00866
Epoch: 28550, Training Loss: 0.00808
Epoch: 28550, Training Loss: 0.00989
Epoch: 28550, Training Loss: 0.00765
Epoch: 28551, Training Loss: 0.00866
Epoch: 28551, Training Loss: 0.00808
Epoch: 28551, Training Loss: 0.00989
Epoch: 28551, Training Loss: 0.00765
Epoch: 28552, Training Loss: 0.00866
Epoch: 28552, Training Loss: 0.00808
Epoch: 28552, Training Loss: 0.00989
Epoch: 28552, Training Loss: 0.00765
Epoch: 28553, Training Loss: 0.00866
Epoch: 28553, Training Loss: 0.00808
Epoch: 28553, Training Loss: 0.00989
Epoch: 28553, Training Loss: 0.00765
Epoch: 28554, Training Loss: 0.00866
Epoch: 28554, Training Loss: 0.00808
Epoch: 28554, Training Loss: 0.00989
Epoch: 28554, Training Loss: 0.00765
Epoch: 28555, Training Loss: 0.00866
Epoch: 28555, Training Loss: 0.00808
Epoch: 28555, Training Loss: 0.00989
Epoch: 28555, Training Loss: 0.00765
Epoch: 28556, Training Loss: 0.00866
Epoch: 28556, Training Loss: 0.00808
Epoch: 28556, Training Loss: 0.00989
Epoch: 28556, Training Loss: 0.00765
Epoch: 28557, Training Loss: 0.00866
Epoch: 28557, Training Loss: 0.00808
Epoch: 28557, Training Loss: 0.00989
Epoch: 28557, Training Loss: 0.00765
Epoch: 28558, Training Loss: 0.00866
Epoch: 28558, Training Loss: 0.00808
Epoch: 28558, Training Loss: 0.00989
Epoch: 28558, Training Loss: 0.00765
Epoch: 28559, Training Loss: 0.00866
Epoch: 28559, Training Loss: 0.00808
Epoch: 28559, Training Loss: 0.00989
Epoch: 28559, Training Loss: 0.00765
Epoch: 28560, Training Loss: 0.00866
Epoch: 28560, Training Loss: 0.00808
Epoch: 28560, Training Loss: 0.00989
Epoch: 28560, Training Loss: 0.00765
Epoch: 28561, Training Loss: 0.00866
Epoch: 28561, Training Loss: 0.00808
Epoch: 28561, Training Loss: 0.00989
Epoch: 28561, Training Loss: 0.00765
Epoch: 28562, Training Loss: 0.00866
Epoch: 28562, Training Loss: 0.00808
Epoch: 28562, Training Loss: 0.00989
Epoch: 28562, Training Loss: 0.00765
Epoch: 28563, Training Loss: 0.00866
Epoch: 28563, Training Loss: 0.00808
Epoch: 28563, Training Loss: 0.00989
Epoch: 28563, Training Loss: 0.00765
Epoch: 28564, Training Loss: 0.00866
Epoch: 28564, Training Loss: 0.00808
Epoch: 28564, Training Loss: 0.00989
Epoch: 28564, Training Loss: 0.00765
Epoch: 28565, Training Loss: 0.00866
Epoch: 28565, Training Loss: 0.00808
Epoch: 28565, Training Loss: 0.00989
Epoch: 28565, Training Loss: 0.00765
Epoch: 28566, Training Loss: 0.00866
Epoch: 28566, Training Loss: 0.00808
Epoch: 28566, Training Loss: 0.00989
Epoch: 28566, Training Loss: 0.00765
Epoch: 28567, Training Loss: 0.00866
Epoch: 28567, Training Loss: 0.00808
Epoch: 28567, Training Loss: 0.00989
Epoch: 28567, Training Loss: 0.00765
Epoch: 28568, Training Loss: 0.00866
Epoch: 28568, Training Loss: 0.00808
Epoch: 28568, Training Loss: 0.00989
Epoch: 28568, Training Loss: 0.00765
Epoch: 28569, Training Loss: 0.00866
Epoch: 28569, Training Loss: 0.00808
Epoch: 28569, Training Loss: 0.00989
Epoch: 28569, Training Loss: 0.00765
Epoch: 28570, Training Loss: 0.00866
Epoch: 28570, Training Loss: 0.00808
Epoch: 28570, Training Loss: 0.00989
Epoch: 28570, Training Loss: 0.00765
Epoch: 28571, Training Loss: 0.00866
Epoch: 28571, Training Loss: 0.00808
Epoch: 28571, Training Loss: 0.00989
Epoch: 28571, Training Loss: 0.00765
Epoch: 28572, Training Loss: 0.00866
Epoch: 28572, Training Loss: 0.00808
Epoch: 28572, Training Loss: 0.00989
Epoch: 28572, Training Loss: 0.00765
Epoch: 28573, Training Loss: 0.00866
Epoch: 28573, Training Loss: 0.00808
Epoch: 28573, Training Loss: 0.00988
Epoch: 28573, Training Loss: 0.00765
Epoch: 28574, Training Loss: 0.00866
Epoch: 28574, Training Loss: 0.00808
Epoch: 28574, Training Loss: 0.00988
Epoch: 28574, Training Loss: 0.00765
Epoch: 28575, Training Loss: 0.00866
Epoch: 28575, Training Loss: 0.00808
Epoch: 28575, Training Loss: 0.00988
Epoch: 28575, Training Loss: 0.00765
Epoch: 28576, Training Loss: 0.00866
Epoch: 28576, Training Loss: 0.00808
Epoch: 28576, Training Loss: 0.00988
Epoch: 28576, Training Loss: 0.00765
Epoch: 28577, Training Loss: 0.00866
Epoch: 28577, Training Loss: 0.00808
Epoch: 28577, Training Loss: 0.00988
Epoch: 28577, Training Loss: 0.00765
Epoch: 28578, Training Loss: 0.00866
Epoch: 28578, Training Loss: 0.00808
Epoch: 28578, Training Loss: 0.00988
Epoch: 28578, Training Loss: 0.00765
Epoch: 28579, Training Loss: 0.00866
Epoch: 28579, Training Loss: 0.00808
Epoch: 28579, Training Loss: 0.00988
Epoch: 28579, Training Loss: 0.00765
Epoch: 28580, Training Loss: 0.00865
Epoch: 28580, Training Loss: 0.00808
Epoch: 28580, Training Loss: 0.00988
Epoch: 28580, Training Loss: 0.00765
Epoch: 28581, Training Loss: 0.00865
Epoch: 28581, Training Loss: 0.00808
Epoch: 28581, Training Loss: 0.00988
Epoch: 28581, Training Loss: 0.00765
Epoch: 28582, Training Loss: 0.00865
Epoch: 28582, Training Loss: 0.00808
Epoch: 28582, Training Loss: 0.00988
Epoch: 28582, Training Loss: 0.00765
Epoch: 28583, Training Loss: 0.00865
Epoch: 28583, Training Loss: 0.00808
Epoch: 28583, Training Loss: 0.00988
Epoch: 28583, Training Loss: 0.00765
Epoch: 28584, Training Loss: 0.00865
Epoch: 28584, Training Loss: 0.00808
Epoch: 28584, Training Loss: 0.00988
Epoch: 28584, Training Loss: 0.00765
Epoch: 28585, Training Loss: 0.00865
Epoch: 28585, Training Loss: 0.00808
Epoch: 28585, Training Loss: 0.00988
Epoch: 28585, Training Loss: 0.00765
[... output truncated: epochs 28586-28828 each print four per-pattern losses, which decrease only slightly over this range, from about 0.00865 / 0.00808 / 0.00988 / 0.00765 to about 0.00862 / 0.00804 / 0.00984 / 0.00762 ...]
Epoch: 28829, Training Loss: 0.00862
Epoch: 28829, Training Loss: 0.00804
Epoch: 28829, Training Loss: 0.00984
Epoch: 28829, Training Loss: 0.00762
Epoch: 28830, Training Loss: 0.00861
Epoch: 28830, Training Loss: 0.00804
Epoch: 28830, Training Loss: 0.00984
Epoch: 28830, Training Loss: 0.00762
Epoch: 28831, Training Loss: 0.00861
Epoch: 28831, Training Loss: 0.00804
Epoch: 28831, Training Loss: 0.00984
Epoch: 28831, Training Loss: 0.00762
Epoch: 28832, Training Loss: 0.00861
Epoch: 28832, Training Loss: 0.00804
Epoch: 28832, Training Loss: 0.00984
Epoch: 28832, Training Loss: 0.00761
Epoch: 28833, Training Loss: 0.00861
Epoch: 28833, Training Loss: 0.00804
Epoch: 28833, Training Loss: 0.00984
Epoch: 28833, Training Loss: 0.00761
Epoch: 28834, Training Loss: 0.00861
Epoch: 28834, Training Loss: 0.00804
Epoch: 28834, Training Loss: 0.00984
Epoch: 28834, Training Loss: 0.00761
Epoch: 28835, Training Loss: 0.00861
Epoch: 28835, Training Loss: 0.00804
Epoch: 28835, Training Loss: 0.00984
Epoch: 28835, Training Loss: 0.00761
Epoch: 28836, Training Loss: 0.00861
Epoch: 28836, Training Loss: 0.00804
Epoch: 28836, Training Loss: 0.00984
Epoch: 28836, Training Loss: 0.00761
Epoch: 28837, Training Loss: 0.00861
Epoch: 28837, Training Loss: 0.00804
Epoch: 28837, Training Loss: 0.00984
Epoch: 28837, Training Loss: 0.00761
Epoch: 28838, Training Loss: 0.00861
Epoch: 28838, Training Loss: 0.00804
Epoch: 28838, Training Loss: 0.00984
Epoch: 28838, Training Loss: 0.00761
Epoch: 28839, Training Loss: 0.00861
Epoch: 28839, Training Loss: 0.00804
Epoch: 28839, Training Loss: 0.00984
Epoch: 28839, Training Loss: 0.00761
Epoch: 28840, Training Loss: 0.00861
Epoch: 28840, Training Loss: 0.00804
Epoch: 28840, Training Loss: 0.00984
Epoch: 28840, Training Loss: 0.00761
Epoch: 28841, Training Loss: 0.00861
Epoch: 28841, Training Loss: 0.00804
Epoch: 28841, Training Loss: 0.00984
Epoch: 28841, Training Loss: 0.00761
Epoch: 28842, Training Loss: 0.00861
Epoch: 28842, Training Loss: 0.00804
Epoch: 28842, Training Loss: 0.00984
Epoch: 28842, Training Loss: 0.00761
Epoch: 28843, Training Loss: 0.00861
Epoch: 28843, Training Loss: 0.00804
Epoch: 28843, Training Loss: 0.00984
Epoch: 28843, Training Loss: 0.00761
Epoch: 28844, Training Loss: 0.00861
Epoch: 28844, Training Loss: 0.00804
Epoch: 28844, Training Loss: 0.00984
Epoch: 28844, Training Loss: 0.00761
Epoch: 28845, Training Loss: 0.00861
Epoch: 28845, Training Loss: 0.00804
Epoch: 28845, Training Loss: 0.00984
Epoch: 28845, Training Loss: 0.00761
Epoch: 28846, Training Loss: 0.00861
Epoch: 28846, Training Loss: 0.00804
Epoch: 28846, Training Loss: 0.00984
Epoch: 28846, Training Loss: 0.00761
Epoch: 28847, Training Loss: 0.00861
Epoch: 28847, Training Loss: 0.00804
Epoch: 28847, Training Loss: 0.00983
Epoch: 28847, Training Loss: 0.00761
Epoch: 28848, Training Loss: 0.00861
Epoch: 28848, Training Loss: 0.00804
Epoch: 28848, Training Loss: 0.00983
Epoch: 28848, Training Loss: 0.00761
Epoch: 28849, Training Loss: 0.00861
Epoch: 28849, Training Loss: 0.00804
Epoch: 28849, Training Loss: 0.00983
Epoch: 28849, Training Loss: 0.00761
Epoch: 28850, Training Loss: 0.00861
Epoch: 28850, Training Loss: 0.00804
Epoch: 28850, Training Loss: 0.00983
Epoch: 28850, Training Loss: 0.00761
Epoch: 28851, Training Loss: 0.00861
Epoch: 28851, Training Loss: 0.00804
Epoch: 28851, Training Loss: 0.00983
Epoch: 28851, Training Loss: 0.00761
Epoch: 28852, Training Loss: 0.00861
Epoch: 28852, Training Loss: 0.00804
Epoch: 28852, Training Loss: 0.00983
Epoch: 28852, Training Loss: 0.00761
Epoch: 28853, Training Loss: 0.00861
Epoch: 28853, Training Loss: 0.00804
Epoch: 28853, Training Loss: 0.00983
Epoch: 28853, Training Loss: 0.00761
Epoch: 28854, Training Loss: 0.00861
Epoch: 28854, Training Loss: 0.00804
Epoch: 28854, Training Loss: 0.00983
Epoch: 28854, Training Loss: 0.00761
Epoch: 28855, Training Loss: 0.00861
Epoch: 28855, Training Loss: 0.00804
Epoch: 28855, Training Loss: 0.00983
Epoch: 28855, Training Loss: 0.00761
Epoch: 28856, Training Loss: 0.00861
Epoch: 28856, Training Loss: 0.00804
Epoch: 28856, Training Loss: 0.00983
Epoch: 28856, Training Loss: 0.00761
Epoch: 28857, Training Loss: 0.00861
Epoch: 28857, Training Loss: 0.00804
Epoch: 28857, Training Loss: 0.00983
Epoch: 28857, Training Loss: 0.00761
Epoch: 28858, Training Loss: 0.00861
Epoch: 28858, Training Loss: 0.00804
Epoch: 28858, Training Loss: 0.00983
Epoch: 28858, Training Loss: 0.00761
Epoch: 28859, Training Loss: 0.00861
Epoch: 28859, Training Loss: 0.00804
Epoch: 28859, Training Loss: 0.00983
Epoch: 28859, Training Loss: 0.00761
Epoch: 28860, Training Loss: 0.00861
Epoch: 28860, Training Loss: 0.00804
Epoch: 28860, Training Loss: 0.00983
Epoch: 28860, Training Loss: 0.00761
Epoch: 28861, Training Loss: 0.00861
Epoch: 28861, Training Loss: 0.00804
Epoch: 28861, Training Loss: 0.00983
Epoch: 28861, Training Loss: 0.00761
Epoch: 28862, Training Loss: 0.00861
Epoch: 28862, Training Loss: 0.00804
Epoch: 28862, Training Loss: 0.00983
Epoch: 28862, Training Loss: 0.00761
Epoch: 28863, Training Loss: 0.00861
Epoch: 28863, Training Loss: 0.00804
Epoch: 28863, Training Loss: 0.00983
Epoch: 28863, Training Loss: 0.00761
Epoch: 28864, Training Loss: 0.00861
Epoch: 28864, Training Loss: 0.00804
Epoch: 28864, Training Loss: 0.00983
Epoch: 28864, Training Loss: 0.00761
Epoch: 28865, Training Loss: 0.00861
Epoch: 28865, Training Loss: 0.00804
Epoch: 28865, Training Loss: 0.00983
Epoch: 28865, Training Loss: 0.00761
Epoch: 28866, Training Loss: 0.00861
Epoch: 28866, Training Loss: 0.00804
Epoch: 28866, Training Loss: 0.00983
Epoch: 28866, Training Loss: 0.00761
Epoch: 28867, Training Loss: 0.00861
Epoch: 28867, Training Loss: 0.00804
Epoch: 28867, Training Loss: 0.00983
Epoch: 28867, Training Loss: 0.00761
Epoch: 28868, Training Loss: 0.00861
Epoch: 28868, Training Loss: 0.00804
Epoch: 28868, Training Loss: 0.00983
Epoch: 28868, Training Loss: 0.00761
Epoch: 28869, Training Loss: 0.00861
Epoch: 28869, Training Loss: 0.00804
Epoch: 28869, Training Loss: 0.00983
Epoch: 28869, Training Loss: 0.00761
Epoch: 28870, Training Loss: 0.00861
Epoch: 28870, Training Loss: 0.00804
Epoch: 28870, Training Loss: 0.00983
Epoch: 28870, Training Loss: 0.00761
Epoch: 28871, Training Loss: 0.00861
Epoch: 28871, Training Loss: 0.00804
Epoch: 28871, Training Loss: 0.00983
Epoch: 28871, Training Loss: 0.00761
Epoch: 28872, Training Loss: 0.00861
Epoch: 28872, Training Loss: 0.00804
Epoch: 28872, Training Loss: 0.00983
Epoch: 28872, Training Loss: 0.00761
Epoch: 28873, Training Loss: 0.00861
Epoch: 28873, Training Loss: 0.00804
Epoch: 28873, Training Loss: 0.00983
Epoch: 28873, Training Loss: 0.00761
Epoch: 28874, Training Loss: 0.00861
Epoch: 28874, Training Loss: 0.00803
Epoch: 28874, Training Loss: 0.00983
Epoch: 28874, Training Loss: 0.00761
Epoch: 28875, Training Loss: 0.00861
Epoch: 28875, Training Loss: 0.00803
Epoch: 28875, Training Loss: 0.00983
Epoch: 28875, Training Loss: 0.00761
Epoch: 28876, Training Loss: 0.00861
Epoch: 28876, Training Loss: 0.00803
Epoch: 28876, Training Loss: 0.00983
Epoch: 28876, Training Loss: 0.00761
Epoch: 28877, Training Loss: 0.00861
Epoch: 28877, Training Loss: 0.00803
Epoch: 28877, Training Loss: 0.00983
Epoch: 28877, Training Loss: 0.00761
Epoch: 28878, Training Loss: 0.00861
Epoch: 28878, Training Loss: 0.00803
Epoch: 28878, Training Loss: 0.00983
Epoch: 28878, Training Loss: 0.00761
Epoch: 28879, Training Loss: 0.00861
Epoch: 28879, Training Loss: 0.00803
Epoch: 28879, Training Loss: 0.00983
Epoch: 28879, Training Loss: 0.00761
Epoch: 28880, Training Loss: 0.00861
Epoch: 28880, Training Loss: 0.00803
Epoch: 28880, Training Loss: 0.00983
Epoch: 28880, Training Loss: 0.00761
Epoch: 28881, Training Loss: 0.00861
Epoch: 28881, Training Loss: 0.00803
Epoch: 28881, Training Loss: 0.00983
Epoch: 28881, Training Loss: 0.00761
Epoch: 28882, Training Loss: 0.00861
Epoch: 28882, Training Loss: 0.00803
Epoch: 28882, Training Loss: 0.00983
Epoch: 28882, Training Loss: 0.00761
Epoch: 28883, Training Loss: 0.00861
Epoch: 28883, Training Loss: 0.00803
Epoch: 28883, Training Loss: 0.00983
Epoch: 28883, Training Loss: 0.00761
Epoch: 28884, Training Loss: 0.00861
Epoch: 28884, Training Loss: 0.00803
Epoch: 28884, Training Loss: 0.00983
Epoch: 28884, Training Loss: 0.00761
Epoch: 28885, Training Loss: 0.00861
Epoch: 28885, Training Loss: 0.00803
Epoch: 28885, Training Loss: 0.00983
Epoch: 28885, Training Loss: 0.00761
Epoch: 28886, Training Loss: 0.00861
Epoch: 28886, Training Loss: 0.00803
Epoch: 28886, Training Loss: 0.00983
Epoch: 28886, Training Loss: 0.00761
Epoch: 28887, Training Loss: 0.00861
Epoch: 28887, Training Loss: 0.00803
Epoch: 28887, Training Loss: 0.00983
Epoch: 28887, Training Loss: 0.00761
Epoch: 28888, Training Loss: 0.00861
Epoch: 28888, Training Loss: 0.00803
Epoch: 28888, Training Loss: 0.00983
Epoch: 28888, Training Loss: 0.00761
Epoch: 28889, Training Loss: 0.00861
Epoch: 28889, Training Loss: 0.00803
Epoch: 28889, Training Loss: 0.00983
Epoch: 28889, Training Loss: 0.00761
Epoch: 28890, Training Loss: 0.00861
Epoch: 28890, Training Loss: 0.00803
Epoch: 28890, Training Loss: 0.00983
Epoch: 28890, Training Loss: 0.00761
Epoch: 28891, Training Loss: 0.00861
Epoch: 28891, Training Loss: 0.00803
Epoch: 28891, Training Loss: 0.00983
Epoch: 28891, Training Loss: 0.00761
Epoch: 28892, Training Loss: 0.00861
Epoch: 28892, Training Loss: 0.00803
Epoch: 28892, Training Loss: 0.00983
Epoch: 28892, Training Loss: 0.00761
Epoch: 28893, Training Loss: 0.00860
Epoch: 28893, Training Loss: 0.00803
Epoch: 28893, Training Loss: 0.00983
Epoch: 28893, Training Loss: 0.00761
Epoch: 28894, Training Loss: 0.00860
Epoch: 28894, Training Loss: 0.00803
Epoch: 28894, Training Loss: 0.00983
Epoch: 28894, Training Loss: 0.00761
Epoch: 28895, Training Loss: 0.00860
Epoch: 28895, Training Loss: 0.00803
Epoch: 28895, Training Loss: 0.00983
Epoch: 28895, Training Loss: 0.00761
Epoch: 28896, Training Loss: 0.00860
Epoch: 28896, Training Loss: 0.00803
Epoch: 28896, Training Loss: 0.00983
Epoch: 28896, Training Loss: 0.00761
Epoch: 28897, Training Loss: 0.00860
Epoch: 28897, Training Loss: 0.00803
Epoch: 28897, Training Loss: 0.00983
Epoch: 28897, Training Loss: 0.00761
Epoch: 28898, Training Loss: 0.00860
Epoch: 28898, Training Loss: 0.00803
Epoch: 28898, Training Loss: 0.00983
Epoch: 28898, Training Loss: 0.00761
Epoch: 28899, Training Loss: 0.00860
Epoch: 28899, Training Loss: 0.00803
Epoch: 28899, Training Loss: 0.00983
Epoch: 28899, Training Loss: 0.00761
Epoch: 28900, Training Loss: 0.00860
Epoch: 28900, Training Loss: 0.00803
Epoch: 28900, Training Loss: 0.00983
Epoch: 28900, Training Loss: 0.00761
Epoch: 28901, Training Loss: 0.00860
Epoch: 28901, Training Loss: 0.00803
Epoch: 28901, Training Loss: 0.00983
Epoch: 28901, Training Loss: 0.00761
Epoch: 28902, Training Loss: 0.00860
Epoch: 28902, Training Loss: 0.00803
Epoch: 28902, Training Loss: 0.00982
Epoch: 28902, Training Loss: 0.00761
Epoch: 28903, Training Loss: 0.00860
Epoch: 28903, Training Loss: 0.00803
Epoch: 28903, Training Loss: 0.00982
Epoch: 28903, Training Loss: 0.00761
Epoch: 28904, Training Loss: 0.00860
Epoch: 28904, Training Loss: 0.00803
Epoch: 28904, Training Loss: 0.00982
Epoch: 28904, Training Loss: 0.00761
Epoch: 28905, Training Loss: 0.00860
Epoch: 28905, Training Loss: 0.00803
Epoch: 28905, Training Loss: 0.00982
Epoch: 28905, Training Loss: 0.00760
Epoch: 28906, Training Loss: 0.00860
Epoch: 28906, Training Loss: 0.00803
Epoch: 28906, Training Loss: 0.00982
Epoch: 28906, Training Loss: 0.00760
Epoch: 28907, Training Loss: 0.00860
Epoch: 28907, Training Loss: 0.00803
Epoch: 28907, Training Loss: 0.00982
Epoch: 28907, Training Loss: 0.00760
Epoch: 28908, Training Loss: 0.00860
Epoch: 28908, Training Loss: 0.00803
Epoch: 28908, Training Loss: 0.00982
Epoch: 28908, Training Loss: 0.00760
Epoch: 28909, Training Loss: 0.00860
Epoch: 28909, Training Loss: 0.00803
Epoch: 28909, Training Loss: 0.00982
Epoch: 28909, Training Loss: 0.00760
Epoch: 28910, Training Loss: 0.00860
Epoch: 28910, Training Loss: 0.00803
Epoch: 28910, Training Loss: 0.00982
Epoch: 28910, Training Loss: 0.00760
Epoch: 28911, Training Loss: 0.00860
Epoch: 28911, Training Loss: 0.00803
Epoch: 28911, Training Loss: 0.00982
Epoch: 28911, Training Loss: 0.00760
Epoch: 28912, Training Loss: 0.00860
Epoch: 28912, Training Loss: 0.00803
Epoch: 28912, Training Loss: 0.00982
Epoch: 28912, Training Loss: 0.00760
Epoch: 28913, Training Loss: 0.00860
Epoch: 28913, Training Loss: 0.00803
Epoch: 28913, Training Loss: 0.00982
Epoch: 28913, Training Loss: 0.00760
Epoch: 28914, Training Loss: 0.00860
Epoch: 28914, Training Loss: 0.00803
Epoch: 28914, Training Loss: 0.00982
Epoch: 28914, Training Loss: 0.00760
Epoch: 28915, Training Loss: 0.00860
Epoch: 28915, Training Loss: 0.00803
Epoch: 28915, Training Loss: 0.00982
Epoch: 28915, Training Loss: 0.00760
Epoch: 28916, Training Loss: 0.00860
Epoch: 28916, Training Loss: 0.00803
Epoch: 28916, Training Loss: 0.00982
Epoch: 28916, Training Loss: 0.00760
Epoch: 28917, Training Loss: 0.00860
Epoch: 28917, Training Loss: 0.00803
Epoch: 28917, Training Loss: 0.00982
Epoch: 28917, Training Loss: 0.00760
Epoch: 28918, Training Loss: 0.00860
Epoch: 28918, Training Loss: 0.00803
Epoch: 28918, Training Loss: 0.00982
Epoch: 28918, Training Loss: 0.00760
Epoch: 28919, Training Loss: 0.00860
Epoch: 28919, Training Loss: 0.00803
Epoch: 28919, Training Loss: 0.00982
Epoch: 28919, Training Loss: 0.00760
Epoch: 28920, Training Loss: 0.00860
Epoch: 28920, Training Loss: 0.00803
Epoch: 28920, Training Loss: 0.00982
Epoch: 28920, Training Loss: 0.00760
Epoch: 28921, Training Loss: 0.00860
Epoch: 28921, Training Loss: 0.00803
Epoch: 28921, Training Loss: 0.00982
Epoch: 28921, Training Loss: 0.00760
Epoch: 28922, Training Loss: 0.00860
Epoch: 28922, Training Loss: 0.00803
Epoch: 28922, Training Loss: 0.00982
Epoch: 28922, Training Loss: 0.00760
Epoch: 28923, Training Loss: 0.00860
Epoch: 28923, Training Loss: 0.00803
Epoch: 28923, Training Loss: 0.00982
Epoch: 28923, Training Loss: 0.00760
Epoch: 28924, Training Loss: 0.00860
Epoch: 28924, Training Loss: 0.00803
Epoch: 28924, Training Loss: 0.00982
Epoch: 28924, Training Loss: 0.00760
Epoch: 28925, Training Loss: 0.00860
Epoch: 28925, Training Loss: 0.00803
Epoch: 28925, Training Loss: 0.00982
Epoch: 28925, Training Loss: 0.00760
Epoch: 28926, Training Loss: 0.00860
Epoch: 28926, Training Loss: 0.00803
Epoch: 28926, Training Loss: 0.00982
Epoch: 28926, Training Loss: 0.00760
Epoch: 28927, Training Loss: 0.00860
Epoch: 28927, Training Loss: 0.00803
Epoch: 28927, Training Loss: 0.00982
Epoch: 28927, Training Loss: 0.00760
Epoch: 28928, Training Loss: 0.00860
Epoch: 28928, Training Loss: 0.00803
Epoch: 28928, Training Loss: 0.00982
Epoch: 28928, Training Loss: 0.00760
Epoch: 28929, Training Loss: 0.00860
Epoch: 28929, Training Loss: 0.00803
Epoch: 28929, Training Loss: 0.00982
Epoch: 28929, Training Loss: 0.00760
Epoch: 28930, Training Loss: 0.00860
Epoch: 28930, Training Loss: 0.00803
Epoch: 28930, Training Loss: 0.00982
Epoch: 28930, Training Loss: 0.00760
Epoch: 28931, Training Loss: 0.00860
Epoch: 28931, Training Loss: 0.00803
Epoch: 28931, Training Loss: 0.00982
Epoch: 28931, Training Loss: 0.00760
Epoch: 28932, Training Loss: 0.00860
Epoch: 28932, Training Loss: 0.00803
Epoch: 28932, Training Loss: 0.00982
Epoch: 28932, Training Loss: 0.00760
Epoch: 28933, Training Loss: 0.00860
Epoch: 28933, Training Loss: 0.00803
Epoch: 28933, Training Loss: 0.00982
Epoch: 28933, Training Loss: 0.00760
Epoch: 28934, Training Loss: 0.00860
Epoch: 28934, Training Loss: 0.00803
Epoch: 28934, Training Loss: 0.00982
Epoch: 28934, Training Loss: 0.00760
Epoch: 28935, Training Loss: 0.00860
Epoch: 28935, Training Loss: 0.00803
Epoch: 28935, Training Loss: 0.00982
Epoch: 28935, Training Loss: 0.00760
Epoch: 28936, Training Loss: 0.00860
Epoch: 28936, Training Loss: 0.00803
Epoch: 28936, Training Loss: 0.00982
Epoch: 28936, Training Loss: 0.00760
Epoch: 28937, Training Loss: 0.00860
Epoch: 28937, Training Loss: 0.00803
Epoch: 28937, Training Loss: 0.00982
Epoch: 28937, Training Loss: 0.00760
Epoch: 28938, Training Loss: 0.00860
Epoch: 28938, Training Loss: 0.00803
Epoch: 28938, Training Loss: 0.00982
Epoch: 28938, Training Loss: 0.00760
Epoch: 28939, Training Loss: 0.00860
Epoch: 28939, Training Loss: 0.00803
Epoch: 28939, Training Loss: 0.00982
Epoch: 28939, Training Loss: 0.00760
Epoch: 28940, Training Loss: 0.00860
Epoch: 28940, Training Loss: 0.00803
Epoch: 28940, Training Loss: 0.00982
Epoch: 28940, Training Loss: 0.00760
Epoch: 28941, Training Loss: 0.00860
Epoch: 28941, Training Loss: 0.00803
Epoch: 28941, Training Loss: 0.00982
Epoch: 28941, Training Loss: 0.00760
Epoch: 28942, Training Loss: 0.00860
Epoch: 28942, Training Loss: 0.00803
Epoch: 28942, Training Loss: 0.00982
Epoch: 28942, Training Loss: 0.00760
Epoch: 28943, Training Loss: 0.00860
Epoch: 28943, Training Loss: 0.00802
Epoch: 28943, Training Loss: 0.00982
Epoch: 28943, Training Loss: 0.00760
Epoch: 28944, Training Loss: 0.00860
Epoch: 28944, Training Loss: 0.00802
Epoch: 28944, Training Loss: 0.00982
Epoch: 28944, Training Loss: 0.00760
Epoch: 28945, Training Loss: 0.00860
Epoch: 28945, Training Loss: 0.00802
Epoch: 28945, Training Loss: 0.00982
Epoch: 28945, Training Loss: 0.00760
Epoch: 28946, Training Loss: 0.00860
Epoch: 28946, Training Loss: 0.00802
Epoch: 28946, Training Loss: 0.00982
Epoch: 28946, Training Loss: 0.00760
Epoch: 28947, Training Loss: 0.00860
Epoch: 28947, Training Loss: 0.00802
Epoch: 28947, Training Loss: 0.00982
Epoch: 28947, Training Loss: 0.00760
Epoch: 28948, Training Loss: 0.00860
Epoch: 28948, Training Loss: 0.00802
Epoch: 28948, Training Loss: 0.00982
Epoch: 28948, Training Loss: 0.00760
Epoch: 28949, Training Loss: 0.00860
Epoch: 28949, Training Loss: 0.00802
Epoch: 28949, Training Loss: 0.00982
Epoch: 28949, Training Loss: 0.00760
Epoch: 28950, Training Loss: 0.00860
Epoch: 28950, Training Loss: 0.00802
Epoch: 28950, Training Loss: 0.00982
Epoch: 28950, Training Loss: 0.00760
Epoch: 28951, Training Loss: 0.00860
Epoch: 28951, Training Loss: 0.00802
Epoch: 28951, Training Loss: 0.00982
Epoch: 28951, Training Loss: 0.00760
Epoch: 28952, Training Loss: 0.00860
Epoch: 28952, Training Loss: 0.00802
Epoch: 28952, Training Loss: 0.00982
Epoch: 28952, Training Loss: 0.00760
Epoch: 28953, Training Loss: 0.00860
Epoch: 28953, Training Loss: 0.00802
Epoch: 28953, Training Loss: 0.00982
Epoch: 28953, Training Loss: 0.00760
Epoch: 28954, Training Loss: 0.00860
Epoch: 28954, Training Loss: 0.00802
Epoch: 28954, Training Loss: 0.00982
Epoch: 28954, Training Loss: 0.00760
Epoch: 28955, Training Loss: 0.00860
Epoch: 28955, Training Loss: 0.00802
Epoch: 28955, Training Loss: 0.00982
Epoch: 28955, Training Loss: 0.00760
Epoch: 28956, Training Loss: 0.00859
Epoch: 28956, Training Loss: 0.00802
Epoch: 28956, Training Loss: 0.00982
Epoch: 28956, Training Loss: 0.00760
Epoch: 28957, Training Loss: 0.00859
Epoch: 28957, Training Loss: 0.00802
Epoch: 28957, Training Loss: 0.00982
Epoch: 28957, Training Loss: 0.00760
Epoch: 28958, Training Loss: 0.00859
Epoch: 28958, Training Loss: 0.00802
Epoch: 28958, Training Loss: 0.00981
Epoch: 28958, Training Loss: 0.00760
Epoch: 28959, Training Loss: 0.00859
Epoch: 28959, Training Loss: 0.00802
Epoch: 28959, Training Loss: 0.00981
Epoch: 28959, Training Loss: 0.00760
Epoch: 28960, Training Loss: 0.00859
Epoch: 28960, Training Loss: 0.00802
Epoch: 28960, Training Loss: 0.00981
Epoch: 28960, Training Loss: 0.00760
Epoch: 28961, Training Loss: 0.00859
Epoch: 28961, Training Loss: 0.00802
Epoch: 28961, Training Loss: 0.00981
Epoch: 28961, Training Loss: 0.00760
Epoch: 28962, Training Loss: 0.00859
Epoch: 28962, Training Loss: 0.00802
Epoch: 28962, Training Loss: 0.00981
Epoch: 28962, Training Loss: 0.00760
Epoch: 28963, Training Loss: 0.00859
Epoch: 28963, Training Loss: 0.00802
Epoch: 28963, Training Loss: 0.00981
Epoch: 28963, Training Loss: 0.00760
Epoch: 28964, Training Loss: 0.00859
Epoch: 28964, Training Loss: 0.00802
Epoch: 28964, Training Loss: 0.00981
Epoch: 28964, Training Loss: 0.00760
Epoch: 28965, Training Loss: 0.00859
Epoch: 28965, Training Loss: 0.00802
Epoch: 28965, Training Loss: 0.00981
Epoch: 28965, Training Loss: 0.00760
Epoch: 28966, Training Loss: 0.00859
Epoch: 28966, Training Loss: 0.00802
Epoch: 28966, Training Loss: 0.00981
Epoch: 28966, Training Loss: 0.00760
Epoch: 28967, Training Loss: 0.00859
Epoch: 28967, Training Loss: 0.00802
Epoch: 28967, Training Loss: 0.00981
Epoch: 28967, Training Loss: 0.00760
Epoch: 28968, Training Loss: 0.00859
Epoch: 28968, Training Loss: 0.00802
Epoch: 28968, Training Loss: 0.00981
Epoch: 28968, Training Loss: 0.00760
Epoch: 28969, Training Loss: 0.00859
Epoch: 28969, Training Loss: 0.00802
Epoch: 28969, Training Loss: 0.00981
Epoch: 28969, Training Loss: 0.00760
Epoch: 28970, Training Loss: 0.00859
Epoch: 28970, Training Loss: 0.00802
Epoch: 28970, Training Loss: 0.00981
Epoch: 28970, Training Loss: 0.00760
Epoch: 28971, Training Loss: 0.00859
Epoch: 28971, Training Loss: 0.00802
Epoch: 28971, Training Loss: 0.00981
Epoch: 28971, Training Loss: 0.00760
Epoch: 28972, Training Loss: 0.00859
Epoch: 28972, Training Loss: 0.00802
Epoch: 28972, Training Loss: 0.00981
Epoch: 28972, Training Loss: 0.00760
Epoch: 28973, Training Loss: 0.00859
Epoch: 28973, Training Loss: 0.00802
Epoch: 28973, Training Loss: 0.00981
Epoch: 28973, Training Loss: 0.00760
Epoch: 28974, Training Loss: 0.00859
Epoch: 28974, Training Loss: 0.00802
Epoch: 28974, Training Loss: 0.00981
Epoch: 28974, Training Loss: 0.00760
Epoch: 28975, Training Loss: 0.00859
Epoch: 28975, Training Loss: 0.00802
Epoch: 28975, Training Loss: 0.00981
Epoch: 28975, Training Loss: 0.00760
Epoch: 28976, Training Loss: 0.00859
Epoch: 28976, Training Loss: 0.00802
Epoch: 28976, Training Loss: 0.00981
Epoch: 28976, Training Loss: 0.00760
Epoch: 28977, Training Loss: 0.00859
Epoch: 28977, Training Loss: 0.00802
Epoch: 28977, Training Loss: 0.00981
Epoch: 28977, Training Loss: 0.00759
Epoch: 28978, Training Loss: 0.00859
Epoch: 28978, Training Loss: 0.00802
Epoch: 28978, Training Loss: 0.00981
Epoch: 28978, Training Loss: 0.00759
Epoch: 28979, Training Loss: 0.00859
Epoch: 28979, Training Loss: 0.00802
Epoch: 28979, Training Loss: 0.00981
Epoch: 28979, Training Loss: 0.00759
Epoch: 28980, Training Loss: 0.00859
Epoch: 28980, Training Loss: 0.00802
Epoch: 28980, Training Loss: 0.00981
Epoch: 28980, Training Loss: 0.00759
Epoch: 28981, Training Loss: 0.00859
Epoch: 28981, Training Loss: 0.00802
Epoch: 28981, Training Loss: 0.00981
Epoch: 28981, Training Loss: 0.00759
Epoch: 28982, Training Loss: 0.00859
Epoch: 28982, Training Loss: 0.00802
Epoch: 28982, Training Loss: 0.00981
Epoch: 28982, Training Loss: 0.00759
Epoch: 28983, Training Loss: 0.00859
Epoch: 28983, Training Loss: 0.00802
Epoch: 28983, Training Loss: 0.00981
Epoch: 28983, Training Loss: 0.00759
Epoch: 28984, Training Loss: 0.00859
Epoch: 28984, Training Loss: 0.00802
Epoch: 28984, Training Loss: 0.00981
Epoch: 28984, Training Loss: 0.00759
Epoch: 28985, Training Loss: 0.00859
Epoch: 28985, Training Loss: 0.00802
Epoch: 28985, Training Loss: 0.00981
Epoch: 28985, Training Loss: 0.00759
Epoch: 28986, Training Loss: 0.00859
Epoch: 28986, Training Loss: 0.00802
Epoch: 28986, Training Loss: 0.00981
Epoch: 28986, Training Loss: 0.00759
Epoch: 28987, Training Loss: 0.00859
Epoch: 28987, Training Loss: 0.00802
Epoch: 28987, Training Loss: 0.00981
Epoch: 28987, Training Loss: 0.00759
Epoch: 28988, Training Loss: 0.00859
Epoch: 28988, Training Loss: 0.00802
Epoch: 28988, Training Loss: 0.00981
Epoch: 28988, Training Loss: 0.00759
Epoch: 28989, Training Loss: 0.00859
Epoch: 28989, Training Loss: 0.00802
Epoch: 28989, Training Loss: 0.00981
Epoch: 28989, Training Loss: 0.00759
Epoch: 28990, Training Loss: 0.00859
Epoch: 28990, Training Loss: 0.00802
Epoch: 28990, Training Loss: 0.00981
Epoch: 28990, Training Loss: 0.00759
Epoch: 28991, Training Loss: 0.00859
Epoch: 28991, Training Loss: 0.00802
Epoch: 28991, Training Loss: 0.00981
Epoch: 28991, Training Loss: 0.00759
Epoch: 28992, Training Loss: 0.00859
Epoch: 28992, Training Loss: 0.00802
Epoch: 28992, Training Loss: 0.00981
Epoch: 28992, Training Loss: 0.00759
Epoch: 28993, Training Loss: 0.00859
Epoch: 28993, Training Loss: 0.00802
Epoch: 28993, Training Loss: 0.00981
Epoch: 28993, Training Loss: 0.00759
Epoch: 28994, Training Loss: 0.00859
Epoch: 28994, Training Loss: 0.00802
Epoch: 28994, Training Loss: 0.00981
Epoch: 28994, Training Loss: 0.00759
Epoch: 28995, Training Loss: 0.00859
Epoch: 28995, Training Loss: 0.00802
Epoch: 28995, Training Loss: 0.00981
Epoch: 28995, Training Loss: 0.00759
Epoch: 28996, Training Loss: 0.00859
Epoch: 28996, Training Loss: 0.00802
Epoch: 28996, Training Loss: 0.00981
Epoch: 28996, Training Loss: 0.00759
Epoch: 28997, Training Loss: 0.00859
Epoch: 28997, Training Loss: 0.00802
Epoch: 28997, Training Loss: 0.00981
Epoch: 28997, Training Loss: 0.00759
Epoch: 28998, Training Loss: 0.00859
Epoch: 28998, Training Loss: 0.00802
Epoch: 28998, Training Loss: 0.00981
Epoch: 28998, Training Loss: 0.00759
Epoch: 28999, Training Loss: 0.00859
Epoch: 28999, Training Loss: 0.00802
Epoch: 28999, Training Loss: 0.00981
Epoch: 28999, Training Loss: 0.00759
Epoch: 29000, Training Loss: 0.00859
Epoch: 29000, Training Loss: 0.00802
Epoch: 29000, Training Loss: 0.00981
Epoch: 29000, Training Loss: 0.00759
Epoch: 29001, Training Loss: 0.00859
Epoch: 29001, Training Loss: 0.00802
Epoch: 29001, Training Loss: 0.00981
Epoch: 29001, Training Loss: 0.00759
Epoch: 29002, Training Loss: 0.00859
Epoch: 29002, Training Loss: 0.00802
Epoch: 29002, Training Loss: 0.00981
Epoch: 29002, Training Loss: 0.00759
Epoch: 29003, Training Loss: 0.00859
Epoch: 29003, Training Loss: 0.00802
Epoch: 29003, Training Loss: 0.00981
Epoch: 29003, Training Loss: 0.00759
Epoch: 29004, Training Loss: 0.00859
Epoch: 29004, Training Loss: 0.00802
Epoch: 29004, Training Loss: 0.00981
Epoch: 29004, Training Loss: 0.00759
Epoch: 29005, Training Loss: 0.00859
Epoch: 29005, Training Loss: 0.00802
Epoch: 29005, Training Loss: 0.00981
Epoch: 29005, Training Loss: 0.00759
Epoch: 29006, Training Loss: 0.00859
Epoch: 29006, Training Loss: 0.00802
Epoch: 29006, Training Loss: 0.00981
Epoch: 29006, Training Loss: 0.00759
Epoch: 29007, Training Loss: 0.00859
Epoch: 29007, Training Loss: 0.00802
Epoch: 29007, Training Loss: 0.00981
Epoch: 29007, Training Loss: 0.00759
Epoch: 29008, Training Loss: 0.00859
Epoch: 29008, Training Loss: 0.00802
Epoch: 29008, Training Loss: 0.00981
Epoch: 29008, Training Loss: 0.00759
Epoch: 29009, Training Loss: 0.00859
Epoch: 29009, Training Loss: 0.00802
Epoch: 29009, Training Loss: 0.00981
Epoch: 29009, Training Loss: 0.00759
Epoch: 29010, Training Loss: 0.00859
Epoch: 29010, Training Loss: 0.00802
Epoch: 29010, Training Loss: 0.00981
Epoch: 29010, Training Loss: 0.00759
Epoch: 29011, Training Loss: 0.00859
Epoch: 29011, Training Loss: 0.00802
Epoch: 29011, Training Loss: 0.00981
Epoch: 29011, Training Loss: 0.00759
Epoch: 29012, Training Loss: 0.00859
Epoch: 29012, Training Loss: 0.00801
Epoch: 29012, Training Loss: 0.00981
Epoch: 29012, Training Loss: 0.00759
Epoch: 29013, Training Loss: 0.00859
Epoch: 29013, Training Loss: 0.00801
Epoch: 29013, Training Loss: 0.00980
Epoch: 29013, Training Loss: 0.00759
Epoch: 29014, Training Loss: 0.00859
Epoch: 29014, Training Loss: 0.00801
Epoch: 29014, Training Loss: 0.00980
Epoch: 29014, Training Loss: 0.00759
Epoch: 29015, Training Loss: 0.00859
Epoch: 29015, Training Loss: 0.00801
Epoch: 29015, Training Loss: 0.00980
Epoch: 29015, Training Loss: 0.00759
Epoch: 29016, Training Loss: 0.00859
Epoch: 29016, Training Loss: 0.00801
Epoch: 29016, Training Loss: 0.00980
Epoch: 29016, Training Loss: 0.00759
Epoch: 29017, Training Loss: 0.00859
Epoch: 29017, Training Loss: 0.00801
Epoch: 29017, Training Loss: 0.00980
Epoch: 29017, Training Loss: 0.00759
Epoch: 29018, Training Loss: 0.00859
Epoch: 29018, Training Loss: 0.00801
Epoch: 29018, Training Loss: 0.00980
Epoch: 29018, Training Loss: 0.00759
...
Epoch: 29261, Training Loss: 0.00855
Epoch: 29261, Training Loss: 0.00798
Epoch: 29261, Training Loss: 0.00976
[output truncated: four per-sample losses are printed each epoch; over epochs 29018-29261 they decrease very slowly, from roughly 0.00859/0.00801/0.00980/0.00759 to 0.00855/0.00798/0.00976/0.00756, still approaching the 0.008 target]
Epoch: 29261, Training Loss: 0.00756
Epoch: 29262, Training Loss: 0.00855
Epoch: 29262, Training Loss: 0.00798
Epoch: 29262, Training Loss: 0.00976
Epoch: 29262, Training Loss: 0.00756
Epoch: 29263, Training Loss: 0.00855
Epoch: 29263, Training Loss: 0.00798
Epoch: 29263, Training Loss: 0.00976
Epoch: 29263, Training Loss: 0.00756
Epoch: 29264, Training Loss: 0.00855
Epoch: 29264, Training Loss: 0.00798
Epoch: 29264, Training Loss: 0.00976
Epoch: 29264, Training Loss: 0.00756
Epoch: 29265, Training Loss: 0.00855
Epoch: 29265, Training Loss: 0.00798
Epoch: 29265, Training Loss: 0.00976
Epoch: 29265, Training Loss: 0.00756
Epoch: 29266, Training Loss: 0.00855
Epoch: 29266, Training Loss: 0.00798
Epoch: 29266, Training Loss: 0.00976
Epoch: 29266, Training Loss: 0.00756
Epoch: 29267, Training Loss: 0.00855
Epoch: 29267, Training Loss: 0.00798
Epoch: 29267, Training Loss: 0.00976
Epoch: 29267, Training Loss: 0.00756
Epoch: 29268, Training Loss: 0.00855
Epoch: 29268, Training Loss: 0.00798
Epoch: 29268, Training Loss: 0.00976
Epoch: 29268, Training Loss: 0.00756
Epoch: 29269, Training Loss: 0.00855
Epoch: 29269, Training Loss: 0.00798
Epoch: 29269, Training Loss: 0.00976
Epoch: 29269, Training Loss: 0.00756
Epoch: 29270, Training Loss: 0.00855
Epoch: 29270, Training Loss: 0.00798
Epoch: 29270, Training Loss: 0.00976
Epoch: 29270, Training Loss: 0.00756
Epoch: 29271, Training Loss: 0.00855
Epoch: 29271, Training Loss: 0.00798
Epoch: 29271, Training Loss: 0.00976
Epoch: 29271, Training Loss: 0.00755
Epoch: 29272, Training Loss: 0.00855
Epoch: 29272, Training Loss: 0.00798
Epoch: 29272, Training Loss: 0.00976
Epoch: 29272, Training Loss: 0.00755
Epoch: 29273, Training Loss: 0.00855
Epoch: 29273, Training Loss: 0.00798
Epoch: 29273, Training Loss: 0.00976
Epoch: 29273, Training Loss: 0.00755
Epoch: 29274, Training Loss: 0.00855
Epoch: 29274, Training Loss: 0.00798
Epoch: 29274, Training Loss: 0.00976
Epoch: 29274, Training Loss: 0.00755
Epoch: 29275, Training Loss: 0.00854
Epoch: 29275, Training Loss: 0.00798
Epoch: 29275, Training Loss: 0.00976
Epoch: 29275, Training Loss: 0.00755
Epoch: 29276, Training Loss: 0.00854
Epoch: 29276, Training Loss: 0.00798
Epoch: 29276, Training Loss: 0.00976
Epoch: 29276, Training Loss: 0.00755
Epoch: 29277, Training Loss: 0.00854
Epoch: 29277, Training Loss: 0.00798
Epoch: 29277, Training Loss: 0.00976
Epoch: 29277, Training Loss: 0.00755
Epoch: 29278, Training Loss: 0.00854
Epoch: 29278, Training Loss: 0.00798
Epoch: 29278, Training Loss: 0.00976
Epoch: 29278, Training Loss: 0.00755
Epoch: 29279, Training Loss: 0.00854
Epoch: 29279, Training Loss: 0.00798
Epoch: 29279, Training Loss: 0.00976
Epoch: 29279, Training Loss: 0.00755
Epoch: 29280, Training Loss: 0.00854
Epoch: 29280, Training Loss: 0.00798
Epoch: 29280, Training Loss: 0.00976
Epoch: 29280, Training Loss: 0.00755
Epoch: 29281, Training Loss: 0.00854
Epoch: 29281, Training Loss: 0.00798
Epoch: 29281, Training Loss: 0.00976
Epoch: 29281, Training Loss: 0.00755
Epoch: 29282, Training Loss: 0.00854
Epoch: 29282, Training Loss: 0.00798
Epoch: 29282, Training Loss: 0.00976
Epoch: 29282, Training Loss: 0.00755
Epoch: 29283, Training Loss: 0.00854
Epoch: 29283, Training Loss: 0.00798
Epoch: 29283, Training Loss: 0.00976
Epoch: 29283, Training Loss: 0.00755
Epoch: 29284, Training Loss: 0.00854
Epoch: 29284, Training Loss: 0.00798
Epoch: 29284, Training Loss: 0.00976
Epoch: 29284, Training Loss: 0.00755
Epoch: 29285, Training Loss: 0.00854
Epoch: 29285, Training Loss: 0.00798
Epoch: 29285, Training Loss: 0.00976
Epoch: 29285, Training Loss: 0.00755
Epoch: 29286, Training Loss: 0.00854
Epoch: 29286, Training Loss: 0.00798
Epoch: 29286, Training Loss: 0.00976
Epoch: 29286, Training Loss: 0.00755
Epoch: 29287, Training Loss: 0.00854
Epoch: 29287, Training Loss: 0.00798
Epoch: 29287, Training Loss: 0.00976
Epoch: 29287, Training Loss: 0.00755
Epoch: 29288, Training Loss: 0.00854
Epoch: 29288, Training Loss: 0.00798
Epoch: 29288, Training Loss: 0.00976
Epoch: 29288, Training Loss: 0.00755
Epoch: 29289, Training Loss: 0.00854
Epoch: 29289, Training Loss: 0.00798
Epoch: 29289, Training Loss: 0.00976
Epoch: 29289, Training Loss: 0.00755
Epoch: 29290, Training Loss: 0.00854
Epoch: 29290, Training Loss: 0.00798
Epoch: 29290, Training Loss: 0.00976
Epoch: 29290, Training Loss: 0.00755
Epoch: 29291, Training Loss: 0.00854
Epoch: 29291, Training Loss: 0.00797
Epoch: 29291, Training Loss: 0.00976
Epoch: 29291, Training Loss: 0.00755
Epoch: 29292, Training Loss: 0.00854
Epoch: 29292, Training Loss: 0.00797
Epoch: 29292, Training Loss: 0.00976
Epoch: 29292, Training Loss: 0.00755
Epoch: 29293, Training Loss: 0.00854
Epoch: 29293, Training Loss: 0.00797
Epoch: 29293, Training Loss: 0.00976
Epoch: 29293, Training Loss: 0.00755
Epoch: 29294, Training Loss: 0.00854
Epoch: 29294, Training Loss: 0.00797
Epoch: 29294, Training Loss: 0.00975
Epoch: 29294, Training Loss: 0.00755
Epoch: 29295, Training Loss: 0.00854
Epoch: 29295, Training Loss: 0.00797
Epoch: 29295, Training Loss: 0.00975
Epoch: 29295, Training Loss: 0.00755
Epoch: 29296, Training Loss: 0.00854
Epoch: 29296, Training Loss: 0.00797
Epoch: 29296, Training Loss: 0.00975
Epoch: 29296, Training Loss: 0.00755
Epoch: 29297, Training Loss: 0.00854
Epoch: 29297, Training Loss: 0.00797
Epoch: 29297, Training Loss: 0.00975
Epoch: 29297, Training Loss: 0.00755
Epoch: 29298, Training Loss: 0.00854
Epoch: 29298, Training Loss: 0.00797
Epoch: 29298, Training Loss: 0.00975
Epoch: 29298, Training Loss: 0.00755
Epoch: 29299, Training Loss: 0.00854
Epoch: 29299, Training Loss: 0.00797
Epoch: 29299, Training Loss: 0.00975
Epoch: 29299, Training Loss: 0.00755
Epoch: 29300, Training Loss: 0.00854
Epoch: 29300, Training Loss: 0.00797
Epoch: 29300, Training Loss: 0.00975
Epoch: 29300, Training Loss: 0.00755
Epoch: 29301, Training Loss: 0.00854
Epoch: 29301, Training Loss: 0.00797
Epoch: 29301, Training Loss: 0.00975
Epoch: 29301, Training Loss: 0.00755
Epoch: 29302, Training Loss: 0.00854
Epoch: 29302, Training Loss: 0.00797
Epoch: 29302, Training Loss: 0.00975
Epoch: 29302, Training Loss: 0.00755
Epoch: 29303, Training Loss: 0.00854
Epoch: 29303, Training Loss: 0.00797
Epoch: 29303, Training Loss: 0.00975
Epoch: 29303, Training Loss: 0.00755
Epoch: 29304, Training Loss: 0.00854
Epoch: 29304, Training Loss: 0.00797
Epoch: 29304, Training Loss: 0.00975
Epoch: 29304, Training Loss: 0.00755
Epoch: 29305, Training Loss: 0.00854
Epoch: 29305, Training Loss: 0.00797
Epoch: 29305, Training Loss: 0.00975
Epoch: 29305, Training Loss: 0.00755
Epoch: 29306, Training Loss: 0.00854
Epoch: 29306, Training Loss: 0.00797
Epoch: 29306, Training Loss: 0.00975
Epoch: 29306, Training Loss: 0.00755
Epoch: 29307, Training Loss: 0.00854
Epoch: 29307, Training Loss: 0.00797
Epoch: 29307, Training Loss: 0.00975
Epoch: 29307, Training Loss: 0.00755
Epoch: 29308, Training Loss: 0.00854
Epoch: 29308, Training Loss: 0.00797
Epoch: 29308, Training Loss: 0.00975
Epoch: 29308, Training Loss: 0.00755
Epoch: 29309, Training Loss: 0.00854
Epoch: 29309, Training Loss: 0.00797
Epoch: 29309, Training Loss: 0.00975
Epoch: 29309, Training Loss: 0.00755
Epoch: 29310, Training Loss: 0.00854
Epoch: 29310, Training Loss: 0.00797
Epoch: 29310, Training Loss: 0.00975
Epoch: 29310, Training Loss: 0.00755
Epoch: 29311, Training Loss: 0.00854
Epoch: 29311, Training Loss: 0.00797
Epoch: 29311, Training Loss: 0.00975
Epoch: 29311, Training Loss: 0.00755
Epoch: 29312, Training Loss: 0.00854
Epoch: 29312, Training Loss: 0.00797
Epoch: 29312, Training Loss: 0.00975
Epoch: 29312, Training Loss: 0.00755
Epoch: 29313, Training Loss: 0.00854
Epoch: 29313, Training Loss: 0.00797
Epoch: 29313, Training Loss: 0.00975
Epoch: 29313, Training Loss: 0.00755
Epoch: 29314, Training Loss: 0.00854
Epoch: 29314, Training Loss: 0.00797
Epoch: 29314, Training Loss: 0.00975
Epoch: 29314, Training Loss: 0.00755
Epoch: 29315, Training Loss: 0.00854
Epoch: 29315, Training Loss: 0.00797
Epoch: 29315, Training Loss: 0.00975
Epoch: 29315, Training Loss: 0.00755
Epoch: 29316, Training Loss: 0.00854
Epoch: 29316, Training Loss: 0.00797
Epoch: 29316, Training Loss: 0.00975
Epoch: 29316, Training Loss: 0.00755
Epoch: 29317, Training Loss: 0.00854
Epoch: 29317, Training Loss: 0.00797
Epoch: 29317, Training Loss: 0.00975
Epoch: 29317, Training Loss: 0.00755
Epoch: 29318, Training Loss: 0.00854
Epoch: 29318, Training Loss: 0.00797
Epoch: 29318, Training Loss: 0.00975
Epoch: 29318, Training Loss: 0.00755
Epoch: 29319, Training Loss: 0.00854
Epoch: 29319, Training Loss: 0.00797
Epoch: 29319, Training Loss: 0.00975
Epoch: 29319, Training Loss: 0.00755
Epoch: 29320, Training Loss: 0.00854
Epoch: 29320, Training Loss: 0.00797
Epoch: 29320, Training Loss: 0.00975
Epoch: 29320, Training Loss: 0.00755
Epoch: 29321, Training Loss: 0.00854
Epoch: 29321, Training Loss: 0.00797
Epoch: 29321, Training Loss: 0.00975
Epoch: 29321, Training Loss: 0.00755
Epoch: 29322, Training Loss: 0.00854
Epoch: 29322, Training Loss: 0.00797
Epoch: 29322, Training Loss: 0.00975
Epoch: 29322, Training Loss: 0.00755
Epoch: 29323, Training Loss: 0.00854
Epoch: 29323, Training Loss: 0.00797
Epoch: 29323, Training Loss: 0.00975
Epoch: 29323, Training Loss: 0.00755
Epoch: 29324, Training Loss: 0.00854
Epoch: 29324, Training Loss: 0.00797
Epoch: 29324, Training Loss: 0.00975
Epoch: 29324, Training Loss: 0.00755
Epoch: 29325, Training Loss: 0.00854
Epoch: 29325, Training Loss: 0.00797
Epoch: 29325, Training Loss: 0.00975
Epoch: 29325, Training Loss: 0.00755
Epoch: 29326, Training Loss: 0.00854
Epoch: 29326, Training Loss: 0.00797
Epoch: 29326, Training Loss: 0.00975
Epoch: 29326, Training Loss: 0.00755
Epoch: 29327, Training Loss: 0.00854
Epoch: 29327, Training Loss: 0.00797
Epoch: 29327, Training Loss: 0.00975
Epoch: 29327, Training Loss: 0.00755
Epoch: 29328, Training Loss: 0.00854
Epoch: 29328, Training Loss: 0.00797
Epoch: 29328, Training Loss: 0.00975
Epoch: 29328, Training Loss: 0.00755
Epoch: 29329, Training Loss: 0.00854
Epoch: 29329, Training Loss: 0.00797
Epoch: 29329, Training Loss: 0.00975
Epoch: 29329, Training Loss: 0.00755
Epoch: 29330, Training Loss: 0.00854
Epoch: 29330, Training Loss: 0.00797
Epoch: 29330, Training Loss: 0.00975
Epoch: 29330, Training Loss: 0.00755
Epoch: 29331, Training Loss: 0.00854
Epoch: 29331, Training Loss: 0.00797
Epoch: 29331, Training Loss: 0.00975
Epoch: 29331, Training Loss: 0.00755
Epoch: 29332, Training Loss: 0.00854
Epoch: 29332, Training Loss: 0.00797
Epoch: 29332, Training Loss: 0.00975
Epoch: 29332, Training Loss: 0.00755
Epoch: 29333, Training Loss: 0.00854
Epoch: 29333, Training Loss: 0.00797
Epoch: 29333, Training Loss: 0.00975
Epoch: 29333, Training Loss: 0.00755
Epoch: 29334, Training Loss: 0.00854
Epoch: 29334, Training Loss: 0.00797
Epoch: 29334, Training Loss: 0.00975
Epoch: 29334, Training Loss: 0.00755
Epoch: 29335, Training Loss: 0.00854
Epoch: 29335, Training Loss: 0.00797
Epoch: 29335, Training Loss: 0.00975
Epoch: 29335, Training Loss: 0.00755
Epoch: 29336, Training Loss: 0.00854
Epoch: 29336, Training Loss: 0.00797
Epoch: 29336, Training Loss: 0.00975
Epoch: 29336, Training Loss: 0.00755
Epoch: 29337, Training Loss: 0.00854
Epoch: 29337, Training Loss: 0.00797
Epoch: 29337, Training Loss: 0.00975
Epoch: 29337, Training Loss: 0.00755
Epoch: 29338, Training Loss: 0.00854
Epoch: 29338, Training Loss: 0.00797
Epoch: 29338, Training Loss: 0.00975
Epoch: 29338, Training Loss: 0.00755
Epoch: 29339, Training Loss: 0.00854
Epoch: 29339, Training Loss: 0.00797
Epoch: 29339, Training Loss: 0.00975
Epoch: 29339, Training Loss: 0.00755
Epoch: 29340, Training Loss: 0.00853
Epoch: 29340, Training Loss: 0.00797
Epoch: 29340, Training Loss: 0.00975
Epoch: 29340, Training Loss: 0.00755
Epoch: 29341, Training Loss: 0.00853
Epoch: 29341, Training Loss: 0.00797
Epoch: 29341, Training Loss: 0.00975
Epoch: 29341, Training Loss: 0.00755
Epoch: 29342, Training Loss: 0.00853
Epoch: 29342, Training Loss: 0.00797
Epoch: 29342, Training Loss: 0.00975
Epoch: 29342, Training Loss: 0.00755
Epoch: 29343, Training Loss: 0.00853
Epoch: 29343, Training Loss: 0.00797
Epoch: 29343, Training Loss: 0.00975
Epoch: 29343, Training Loss: 0.00755
Epoch: 29344, Training Loss: 0.00853
Epoch: 29344, Training Loss: 0.00797
Epoch: 29344, Training Loss: 0.00975
Epoch: 29344, Training Loss: 0.00755
Epoch: 29345, Training Loss: 0.00853
Epoch: 29345, Training Loss: 0.00797
Epoch: 29345, Training Loss: 0.00975
Epoch: 29345, Training Loss: 0.00754
Epoch: 29346, Training Loss: 0.00853
Epoch: 29346, Training Loss: 0.00797
Epoch: 29346, Training Loss: 0.00975
Epoch: 29346, Training Loss: 0.00754
Epoch: 29347, Training Loss: 0.00853
Epoch: 29347, Training Loss: 0.00797
Epoch: 29347, Training Loss: 0.00975
Epoch: 29347, Training Loss: 0.00754
Epoch: 29348, Training Loss: 0.00853
Epoch: 29348, Training Loss: 0.00797
Epoch: 29348, Training Loss: 0.00975
Epoch: 29348, Training Loss: 0.00754
Epoch: 29349, Training Loss: 0.00853
Epoch: 29349, Training Loss: 0.00797
Epoch: 29349, Training Loss: 0.00975
Epoch: 29349, Training Loss: 0.00754
Epoch: 29350, Training Loss: 0.00853
Epoch: 29350, Training Loss: 0.00797
Epoch: 29350, Training Loss: 0.00975
Epoch: 29350, Training Loss: 0.00754
Epoch: 29351, Training Loss: 0.00853
Epoch: 29351, Training Loss: 0.00797
Epoch: 29351, Training Loss: 0.00974
Epoch: 29351, Training Loss: 0.00754
Epoch: 29352, Training Loss: 0.00853
Epoch: 29352, Training Loss: 0.00797
Epoch: 29352, Training Loss: 0.00974
Epoch: 29352, Training Loss: 0.00754
Epoch: 29353, Training Loss: 0.00853
Epoch: 29353, Training Loss: 0.00797
Epoch: 29353, Training Loss: 0.00974
Epoch: 29353, Training Loss: 0.00754
Epoch: 29354, Training Loss: 0.00853
Epoch: 29354, Training Loss: 0.00797
Epoch: 29354, Training Loss: 0.00974
Epoch: 29354, Training Loss: 0.00754
Epoch: 29355, Training Loss: 0.00853
Epoch: 29355, Training Loss: 0.00797
Epoch: 29355, Training Loss: 0.00974
Epoch: 29355, Training Loss: 0.00754
Epoch: 29356, Training Loss: 0.00853
Epoch: 29356, Training Loss: 0.00797
Epoch: 29356, Training Loss: 0.00974
Epoch: 29356, Training Loss: 0.00754
Epoch: 29357, Training Loss: 0.00853
Epoch: 29357, Training Loss: 0.00797
Epoch: 29357, Training Loss: 0.00974
Epoch: 29357, Training Loss: 0.00754
Epoch: 29358, Training Loss: 0.00853
Epoch: 29358, Training Loss: 0.00797
Epoch: 29358, Training Loss: 0.00974
Epoch: 29358, Training Loss: 0.00754
Epoch: 29359, Training Loss: 0.00853
Epoch: 29359, Training Loss: 0.00797
Epoch: 29359, Training Loss: 0.00974
Epoch: 29359, Training Loss: 0.00754
Epoch: 29360, Training Loss: 0.00853
Epoch: 29360, Training Loss: 0.00797
Epoch: 29360, Training Loss: 0.00974
Epoch: 29360, Training Loss: 0.00754
Epoch: 29361, Training Loss: 0.00853
Epoch: 29361, Training Loss: 0.00796
Epoch: 29361, Training Loss: 0.00974
Epoch: 29361, Training Loss: 0.00754
Epoch: 29362, Training Loss: 0.00853
Epoch: 29362, Training Loss: 0.00796
Epoch: 29362, Training Loss: 0.00974
Epoch: 29362, Training Loss: 0.00754
Epoch: 29363, Training Loss: 0.00853
Epoch: 29363, Training Loss: 0.00796
Epoch: 29363, Training Loss: 0.00974
Epoch: 29363, Training Loss: 0.00754
Epoch: 29364, Training Loss: 0.00853
Epoch: 29364, Training Loss: 0.00796
Epoch: 29364, Training Loss: 0.00974
Epoch: 29364, Training Loss: 0.00754
Epoch: 29365, Training Loss: 0.00853
Epoch: 29365, Training Loss: 0.00796
Epoch: 29365, Training Loss: 0.00974
Epoch: 29365, Training Loss: 0.00754
Epoch: 29366, Training Loss: 0.00853
Epoch: 29366, Training Loss: 0.00796
Epoch: 29366, Training Loss: 0.00974
Epoch: 29366, Training Loss: 0.00754
Epoch: 29367, Training Loss: 0.00853
Epoch: 29367, Training Loss: 0.00796
Epoch: 29367, Training Loss: 0.00974
Epoch: 29367, Training Loss: 0.00754
Epoch: 29368, Training Loss: 0.00853
Epoch: 29368, Training Loss: 0.00796
Epoch: 29368, Training Loss: 0.00974
Epoch: 29368, Training Loss: 0.00754
Epoch: 29369, Training Loss: 0.00853
Epoch: 29369, Training Loss: 0.00796
Epoch: 29369, Training Loss: 0.00974
Epoch: 29369, Training Loss: 0.00754
Epoch: 29370, Training Loss: 0.00853
Epoch: 29370, Training Loss: 0.00796
Epoch: 29370, Training Loss: 0.00974
Epoch: 29370, Training Loss: 0.00754
Epoch: 29371, Training Loss: 0.00853
Epoch: 29371, Training Loss: 0.00796
Epoch: 29371, Training Loss: 0.00974
Epoch: 29371, Training Loss: 0.00754
Epoch: 29372, Training Loss: 0.00853
Epoch: 29372, Training Loss: 0.00796
Epoch: 29372, Training Loss: 0.00974
Epoch: 29372, Training Loss: 0.00754
Epoch: 29373, Training Loss: 0.00853
Epoch: 29373, Training Loss: 0.00796
Epoch: 29373, Training Loss: 0.00974
Epoch: 29373, Training Loss: 0.00754
Epoch: 29374, Training Loss: 0.00853
Epoch: 29374, Training Loss: 0.00796
Epoch: 29374, Training Loss: 0.00974
Epoch: 29374, Training Loss: 0.00754
Epoch: 29375, Training Loss: 0.00853
Epoch: 29375, Training Loss: 0.00796
Epoch: 29375, Training Loss: 0.00974
Epoch: 29375, Training Loss: 0.00754
Epoch: 29376, Training Loss: 0.00853
Epoch: 29376, Training Loss: 0.00796
Epoch: 29376, Training Loss: 0.00974
Epoch: 29376, Training Loss: 0.00754
Epoch: 29377, Training Loss: 0.00853
Epoch: 29377, Training Loss: 0.00796
Epoch: 29377, Training Loss: 0.00974
Epoch: 29377, Training Loss: 0.00754
Epoch: 29378, Training Loss: 0.00853
Epoch: 29378, Training Loss: 0.00796
Epoch: 29378, Training Loss: 0.00974
Epoch: 29378, Training Loss: 0.00754
Epoch: 29379, Training Loss: 0.00853
Epoch: 29379, Training Loss: 0.00796
Epoch: 29379, Training Loss: 0.00974
Epoch: 29379, Training Loss: 0.00754
Epoch: 29380, Training Loss: 0.00853
Epoch: 29380, Training Loss: 0.00796
Epoch: 29380, Training Loss: 0.00974
Epoch: 29380, Training Loss: 0.00754
Epoch: 29381, Training Loss: 0.00853
Epoch: 29381, Training Loss: 0.00796
Epoch: 29381, Training Loss: 0.00974
Epoch: 29381, Training Loss: 0.00754
Epoch: 29382, Training Loss: 0.00853
Epoch: 29382, Training Loss: 0.00796
Epoch: 29382, Training Loss: 0.00974
Epoch: 29382, Training Loss: 0.00754
Epoch: 29383, Training Loss: 0.00853
Epoch: 29383, Training Loss: 0.00796
Epoch: 29383, Training Loss: 0.00974
Epoch: 29383, Training Loss: 0.00754
Epoch: 29384, Training Loss: 0.00853
Epoch: 29384, Training Loss: 0.00796
Epoch: 29384, Training Loss: 0.00974
Epoch: 29384, Training Loss: 0.00754
Epoch: 29385, Training Loss: 0.00853
Epoch: 29385, Training Loss: 0.00796
Epoch: 29385, Training Loss: 0.00974
Epoch: 29385, Training Loss: 0.00754
Epoch: 29386, Training Loss: 0.00853
Epoch: 29386, Training Loss: 0.00796
Epoch: 29386, Training Loss: 0.00974
Epoch: 29386, Training Loss: 0.00754
Epoch: 29387, Training Loss: 0.00853
Epoch: 29387, Training Loss: 0.00796
Epoch: 29387, Training Loss: 0.00974
Epoch: 29387, Training Loss: 0.00754
Epoch: 29388, Training Loss: 0.00853
Epoch: 29388, Training Loss: 0.00796
Epoch: 29388, Training Loss: 0.00974
Epoch: 29388, Training Loss: 0.00754
Epoch: 29389, Training Loss: 0.00853
Epoch: 29389, Training Loss: 0.00796
Epoch: 29389, Training Loss: 0.00974
Epoch: 29389, Training Loss: 0.00754
Epoch: 29390, Training Loss: 0.00853
Epoch: 29390, Training Loss: 0.00796
Epoch: 29390, Training Loss: 0.00974
Epoch: 29390, Training Loss: 0.00754
Epoch: 29391, Training Loss: 0.00853
Epoch: 29391, Training Loss: 0.00796
Epoch: 29391, Training Loss: 0.00974
Epoch: 29391, Training Loss: 0.00754
Epoch: 29392, Training Loss: 0.00853
Epoch: 29392, Training Loss: 0.00796
Epoch: 29392, Training Loss: 0.00974
Epoch: 29392, Training Loss: 0.00754
Epoch: 29393, Training Loss: 0.00853
Epoch: 29393, Training Loss: 0.00796
Epoch: 29393, Training Loss: 0.00974
Epoch: 29393, Training Loss: 0.00754
Epoch: 29394, Training Loss: 0.00853
Epoch: 29394, Training Loss: 0.00796
Epoch: 29394, Training Loss: 0.00974
Epoch: 29394, Training Loss: 0.00754
Epoch: 29395, Training Loss: 0.00853
Epoch: 29395, Training Loss: 0.00796
Epoch: 29395, Training Loss: 0.00974
Epoch: 29395, Training Loss: 0.00754
Epoch: 29396, Training Loss: 0.00853
Epoch: 29396, Training Loss: 0.00796
Epoch: 29396, Training Loss: 0.00974
Epoch: 29396, Training Loss: 0.00754
Epoch: 29397, Training Loss: 0.00853
Epoch: 29397, Training Loss: 0.00796
Epoch: 29397, Training Loss: 0.00974
Epoch: 29397, Training Loss: 0.00754
Epoch: 29398, Training Loss: 0.00853
Epoch: 29398, Training Loss: 0.00796
Epoch: 29398, Training Loss: 0.00974
Epoch: 29398, Training Loss: 0.00754
Epoch: 29399, Training Loss: 0.00853
Epoch: 29399, Training Loss: 0.00796
Epoch: 29399, Training Loss: 0.00974
Epoch: 29399, Training Loss: 0.00754
Epoch: 29400, Training Loss: 0.00853
Epoch: 29400, Training Loss: 0.00796
Epoch: 29400, Training Loss: 0.00974
Epoch: 29400, Training Loss: 0.00754
Epoch: 29401, Training Loss: 0.00853
Epoch: 29401, Training Loss: 0.00796
Epoch: 29401, Training Loss: 0.00974
Epoch: 29401, Training Loss: 0.00754
Epoch: 29402, Training Loss: 0.00853
Epoch: 29402, Training Loss: 0.00796
Epoch: 29402, Training Loss: 0.00974
Epoch: 29402, Training Loss: 0.00754
Epoch: 29403, Training Loss: 0.00853
Epoch: 29403, Training Loss: 0.00796
Epoch: 29403, Training Loss: 0.00974
Epoch: 29403, Training Loss: 0.00754
Epoch: 29404, Training Loss: 0.00852
Epoch: 29404, Training Loss: 0.00796
Epoch: 29404, Training Loss: 0.00974
Epoch: 29404, Training Loss: 0.00754
Epoch: 29405, Training Loss: 0.00852
Epoch: 29405, Training Loss: 0.00796
Epoch: 29405, Training Loss: 0.00974
Epoch: 29405, Training Loss: 0.00754
Epoch: 29406, Training Loss: 0.00852
Epoch: 29406, Training Loss: 0.00796
Epoch: 29406, Training Loss: 0.00974
Epoch: 29406, Training Loss: 0.00754
Epoch: 29407, Training Loss: 0.00852
Epoch: 29407, Training Loss: 0.00796
Epoch: 29407, Training Loss: 0.00974
Epoch: 29407, Training Loss: 0.00754
Epoch: 29408, Training Loss: 0.00852
Epoch: 29408, Training Loss: 0.00796
Epoch: 29408, Training Loss: 0.00973
Epoch: 29408, Training Loss: 0.00754
Epoch: 29409, Training Loss: 0.00852
Epoch: 29409, Training Loss: 0.00796
Epoch: 29409, Training Loss: 0.00973
Epoch: 29409, Training Loss: 0.00754
Epoch: 29410, Training Loss: 0.00852
Epoch: 29410, Training Loss: 0.00796
Epoch: 29410, Training Loss: 0.00973
Epoch: 29410, Training Loss: 0.00754
Epoch: 29411, Training Loss: 0.00852
Epoch: 29411, Training Loss: 0.00796
Epoch: 29411, Training Loss: 0.00973
Epoch: 29411, Training Loss: 0.00754
Epoch: 29412, Training Loss: 0.00852
Epoch: 29412, Training Loss: 0.00796
Epoch: 29412, Training Loss: 0.00973
Epoch: 29412, Training Loss: 0.00754
Epoch: 29413, Training Loss: 0.00852
Epoch: 29413, Training Loss: 0.00796
Epoch: 29413, Training Loss: 0.00973
Epoch: 29413, Training Loss: 0.00754
Epoch: 29414, Training Loss: 0.00852
Epoch: 29414, Training Loss: 0.00796
Epoch: 29414, Training Loss: 0.00973
Epoch: 29414, Training Loss: 0.00754
Epoch: 29415, Training Loss: 0.00852
Epoch: 29415, Training Loss: 0.00796
Epoch: 29415, Training Loss: 0.00973
Epoch: 29415, Training Loss: 0.00754
Epoch: 29416, Training Loss: 0.00852
Epoch: 29416, Training Loss: 0.00796
Epoch: 29416, Training Loss: 0.00973
Epoch: 29416, Training Loss: 0.00754
Epoch: 29417, Training Loss: 0.00852
Epoch: 29417, Training Loss: 0.00796
Epoch: 29417, Training Loss: 0.00973
Epoch: 29417, Training Loss: 0.00754
Epoch: 29418, Training Loss: 0.00852
Epoch: 29418, Training Loss: 0.00796
Epoch: 29418, Training Loss: 0.00973
Epoch: 29418, Training Loss: 0.00754
Epoch: 29419, Training Loss: 0.00852
Epoch: 29419, Training Loss: 0.00796
Epoch: 29419, Training Loss: 0.00973
Epoch: 29419, Training Loss: 0.00753
Epoch: 29420, Training Loss: 0.00852
Epoch: 29420, Training Loss: 0.00796
Epoch: 29420, Training Loss: 0.00973
Epoch: 29420, Training Loss: 0.00753
Epoch: 29421, Training Loss: 0.00852
Epoch: 29421, Training Loss: 0.00796
Epoch: 29421, Training Loss: 0.00973
Epoch: 29421, Training Loss: 0.00753
Epoch: 29422, Training Loss: 0.00852
Epoch: 29422, Training Loss: 0.00796
Epoch: 29422, Training Loss: 0.00973
Epoch: 29422, Training Loss: 0.00753
Epoch: 29423, Training Loss: 0.00852
Epoch: 29423, Training Loss: 0.00796
Epoch: 29423, Training Loss: 0.00973
Epoch: 29423, Training Loss: 0.00753
Epoch: 29424, Training Loss: 0.00852
Epoch: 29424, Training Loss: 0.00796
Epoch: 29424, Training Loss: 0.00973
Epoch: 29424, Training Loss: 0.00753
Epoch: 29425, Training Loss: 0.00852
Epoch: 29425, Training Loss: 0.00796
Epoch: 29425, Training Loss: 0.00973
Epoch: 29425, Training Loss: 0.00753
Epoch: 29426, Training Loss: 0.00852
Epoch: 29426, Training Loss: 0.00796
Epoch: 29426, Training Loss: 0.00973
Epoch: 29426, Training Loss: 0.00753
Epoch: 29427, Training Loss: 0.00852
Epoch: 29427, Training Loss: 0.00796
Epoch: 29427, Training Loss: 0.00973
Epoch: 29427, Training Loss: 0.00753
Epoch: 29428, Training Loss: 0.00852
Epoch: 29428, Training Loss: 0.00796
Epoch: 29428, Training Loss: 0.00973
Epoch: 29428, Training Loss: 0.00753
Epoch: 29429, Training Loss: 0.00852
Epoch: 29429, Training Loss: 0.00796
Epoch: 29429, Training Loss: 0.00973
Epoch: 29429, Training Loss: 0.00753
Epoch: 29430, Training Loss: 0.00852
Epoch: 29430, Training Loss: 0.00796
Epoch: 29430, Training Loss: 0.00973
Epoch: 29430, Training Loss: 0.00753
Epoch: 29431, Training Loss: 0.00852
Epoch: 29431, Training Loss: 0.00796
Epoch: 29431, Training Loss: 0.00973
Epoch: 29431, Training Loss: 0.00753
Epoch: 29432, Training Loss: 0.00852
Epoch: 29432, Training Loss: 0.00795
Epoch: 29432, Training Loss: 0.00973
Epoch: 29432, Training Loss: 0.00753
Epoch: 29433, Training Loss: 0.00852
Epoch: 29433, Training Loss: 0.00795
Epoch: 29433, Training Loss: 0.00973
Epoch: 29433, Training Loss: 0.00753
Epoch: 29434, Training Loss: 0.00852
Epoch: 29434, Training Loss: 0.00795
Epoch: 29434, Training Loss: 0.00973
Epoch: 29434, Training Loss: 0.00753
Epoch: 29435, Training Loss: 0.00852
Epoch: 29435, Training Loss: 0.00795
Epoch: 29435, Training Loss: 0.00973
Epoch: 29435, Training Loss: 0.00753
Epoch: 29436, Training Loss: 0.00852
Epoch: 29436, Training Loss: 0.00795
Epoch: 29436, Training Loss: 0.00973
Epoch: 29436, Training Loss: 0.00753
Epoch: 29437, Training Loss: 0.00852
Epoch: 29437, Training Loss: 0.00795
Epoch: 29437, Training Loss: 0.00973
Epoch: 29437, Training Loss: 0.00753
Epoch: 29438, Training Loss: 0.00852
Epoch: 29438, Training Loss: 0.00795
Epoch: 29438, Training Loss: 0.00973
Epoch: 29438, Training Loss: 0.00753
Epoch: 29439, Training Loss: 0.00852
Epoch: 29439, Training Loss: 0.00795
Epoch: 29439, Training Loss: 0.00973
Epoch: 29439, Training Loss: 0.00753
Epoch: 29440, Training Loss: 0.00852
Epoch: 29440, Training Loss: 0.00795
Epoch: 29440, Training Loss: 0.00973
Epoch: 29440, Training Loss: 0.00753
Epoch: 29441, Training Loss: 0.00852
Epoch: 29441, Training Loss: 0.00795
Epoch: 29441, Training Loss: 0.00973
Epoch: 29441, Training Loss: 0.00753
Epoch: 29442, Training Loss: 0.00852
Epoch: 29442, Training Loss: 0.00795
Epoch: 29442, Training Loss: 0.00973
Epoch: 29442, Training Loss: 0.00753
Epoch: 29443, Training Loss: 0.00852
Epoch: 29443, Training Loss: 0.00795
Epoch: 29443, Training Loss: 0.00973
Epoch: 29443, Training Loss: 0.00753
Epoch: 29444, Training Loss: 0.00852
Epoch: 29444, Training Loss: 0.00795
Epoch: 29444, Training Loss: 0.00973
Epoch: 29444, Training Loss: 0.00753
Epoch: 29445, Training Loss: 0.00852
Epoch: 29445, Training Loss: 0.00795
Epoch: 29445, Training Loss: 0.00973
Epoch: 29445, Training Loss: 0.00753
Epoch: 29446, Training Loss: 0.00852
Epoch: 29446, Training Loss: 0.00795
Epoch: 29446, Training Loss: 0.00973
Epoch: 29446, Training Loss: 0.00753
Epoch: 29447, Training Loss: 0.00852
Epoch: 29447, Training Loss: 0.00795
Epoch: 29447, Training Loss: 0.00973
Epoch: 29447, Training Loss: 0.00753
Epoch: 29448, Training Loss: 0.00852
Epoch: 29448, Training Loss: 0.00795
Epoch: 29448, Training Loss: 0.00973
Epoch: 29448, Training Loss: 0.00753
Epoch: 29449, Training Loss: 0.00852
Epoch: 29449, Training Loss: 0.00795
Epoch: 29449, Training Loss: 0.00973
Epoch: 29449, Training Loss: 0.00753
Epoch: 29450, Training Loss: 0.00852
Epoch: 29450, Training Loss: 0.00795
Epoch: 29450, Training Loss: 0.00852
Epoch: 29450, Training Loss: 0.00795
Epoch: 29450, Training Loss: 0.00973
Epoch: 29450, Training Loss: 0.00753
...
Epoch: 29694, Training Loss: 0.00848
[output truncated: from epoch 29450 to epoch 29694 the four per-pattern quadratic costs hover near 0.0085, 0.0079, 0.0097, and 0.0075, improving only in the fifth decimal place per hundred epochs; two of the four patterns are still above the 0.008 error target]
Epoch: 29694, Training Loss: 0.00792
Epoch: 29694, Training Loss: 0.00969
Epoch: 29694, Training Loss: 0.00750
Epoch: 29695, Training Loss: 0.00848
Epoch: 29695, Training Loss: 0.00792
Epoch: 29695, Training Loss: 0.00968
Epoch: 29695, Training Loss: 0.00750
Epoch: 29696, Training Loss: 0.00848
Epoch: 29696, Training Loss: 0.00792
Epoch: 29696, Training Loss: 0.00968
Epoch: 29696, Training Loss: 0.00750
Epoch: 29697, Training Loss: 0.00848
Epoch: 29697, Training Loss: 0.00792
Epoch: 29697, Training Loss: 0.00968
Epoch: 29697, Training Loss: 0.00750
Epoch: 29698, Training Loss: 0.00848
Epoch: 29698, Training Loss: 0.00792
Epoch: 29698, Training Loss: 0.00968
Epoch: 29698, Training Loss: 0.00750
Epoch: 29699, Training Loss: 0.00848
Epoch: 29699, Training Loss: 0.00792
Epoch: 29699, Training Loss: 0.00968
Epoch: 29699, Training Loss: 0.00750
Epoch: 29700, Training Loss: 0.00848
Epoch: 29700, Training Loss: 0.00792
Epoch: 29700, Training Loss: 0.00968
Epoch: 29700, Training Loss: 0.00750
Epoch: 29701, Training Loss: 0.00848
Epoch: 29701, Training Loss: 0.00792
Epoch: 29701, Training Loss: 0.00968
Epoch: 29701, Training Loss: 0.00750
Epoch: 29702, Training Loss: 0.00848
Epoch: 29702, Training Loss: 0.00792
Epoch: 29702, Training Loss: 0.00968
Epoch: 29702, Training Loss: 0.00750
Epoch: 29703, Training Loss: 0.00848
Epoch: 29703, Training Loss: 0.00792
Epoch: 29703, Training Loss: 0.00968
Epoch: 29703, Training Loss: 0.00750
Epoch: 29704, Training Loss: 0.00848
Epoch: 29704, Training Loss: 0.00792
Epoch: 29704, Training Loss: 0.00968
Epoch: 29704, Training Loss: 0.00750
Epoch: 29705, Training Loss: 0.00848
Epoch: 29705, Training Loss: 0.00792
Epoch: 29705, Training Loss: 0.00968
Epoch: 29705, Training Loss: 0.00750
Epoch: 29706, Training Loss: 0.00848
Epoch: 29706, Training Loss: 0.00792
Epoch: 29706, Training Loss: 0.00968
Epoch: 29706, Training Loss: 0.00750
Epoch: 29707, Training Loss: 0.00848
Epoch: 29707, Training Loss: 0.00792
Epoch: 29707, Training Loss: 0.00968
Epoch: 29707, Training Loss: 0.00750
Epoch: 29708, Training Loss: 0.00848
Epoch: 29708, Training Loss: 0.00792
Epoch: 29708, Training Loss: 0.00968
Epoch: 29708, Training Loss: 0.00750
Epoch: 29709, Training Loss: 0.00848
Epoch: 29709, Training Loss: 0.00792
Epoch: 29709, Training Loss: 0.00968
Epoch: 29709, Training Loss: 0.00750
Epoch: 29710, Training Loss: 0.00848
Epoch: 29710, Training Loss: 0.00792
Epoch: 29710, Training Loss: 0.00968
Epoch: 29710, Training Loss: 0.00750
Epoch: 29711, Training Loss: 0.00848
Epoch: 29711, Training Loss: 0.00792
Epoch: 29711, Training Loss: 0.00968
Epoch: 29711, Training Loss: 0.00750
Epoch: 29712, Training Loss: 0.00848
Epoch: 29712, Training Loss: 0.00792
Epoch: 29712, Training Loss: 0.00968
Epoch: 29712, Training Loss: 0.00750
Epoch: 29713, Training Loss: 0.00848
Epoch: 29713, Training Loss: 0.00792
Epoch: 29713, Training Loss: 0.00968
Epoch: 29713, Training Loss: 0.00750
Epoch: 29714, Training Loss: 0.00848
Epoch: 29714, Training Loss: 0.00792
Epoch: 29714, Training Loss: 0.00968
Epoch: 29714, Training Loss: 0.00750
Epoch: 29715, Training Loss: 0.00848
Epoch: 29715, Training Loss: 0.00792
Epoch: 29715, Training Loss: 0.00968
Epoch: 29715, Training Loss: 0.00750
Epoch: 29716, Training Loss: 0.00848
Epoch: 29716, Training Loss: 0.00792
Epoch: 29716, Training Loss: 0.00968
Epoch: 29716, Training Loss: 0.00750
Epoch: 29717, Training Loss: 0.00848
Epoch: 29717, Training Loss: 0.00791
Epoch: 29717, Training Loss: 0.00968
Epoch: 29717, Training Loss: 0.00750
Epoch: 29718, Training Loss: 0.00848
Epoch: 29718, Training Loss: 0.00791
Epoch: 29718, Training Loss: 0.00968
Epoch: 29718, Training Loss: 0.00750
Epoch: 29719, Training Loss: 0.00848
Epoch: 29719, Training Loss: 0.00791
Epoch: 29719, Training Loss: 0.00968
Epoch: 29719, Training Loss: 0.00749
Epoch: 29720, Training Loss: 0.00848
Epoch: 29720, Training Loss: 0.00791
Epoch: 29720, Training Loss: 0.00968
Epoch: 29720, Training Loss: 0.00749
Epoch: 29721, Training Loss: 0.00848
Epoch: 29721, Training Loss: 0.00791
Epoch: 29721, Training Loss: 0.00968
Epoch: 29721, Training Loss: 0.00749
Epoch: 29722, Training Loss: 0.00848
Epoch: 29722, Training Loss: 0.00791
Epoch: 29722, Training Loss: 0.00968
Epoch: 29722, Training Loss: 0.00749
Epoch: 29723, Training Loss: 0.00848
Epoch: 29723, Training Loss: 0.00791
Epoch: 29723, Training Loss: 0.00968
Epoch: 29723, Training Loss: 0.00749
Epoch: 29724, Training Loss: 0.00848
Epoch: 29724, Training Loss: 0.00791
Epoch: 29724, Training Loss: 0.00968
Epoch: 29724, Training Loss: 0.00749
Epoch: 29725, Training Loss: 0.00848
Epoch: 29725, Training Loss: 0.00791
Epoch: 29725, Training Loss: 0.00968
Epoch: 29725, Training Loss: 0.00749
Epoch: 29726, Training Loss: 0.00848
Epoch: 29726, Training Loss: 0.00791
Epoch: 29726, Training Loss: 0.00968
Epoch: 29726, Training Loss: 0.00749
Epoch: 29727, Training Loss: 0.00848
Epoch: 29727, Training Loss: 0.00791
Epoch: 29727, Training Loss: 0.00968
Epoch: 29727, Training Loss: 0.00749
Epoch: 29728, Training Loss: 0.00848
Epoch: 29728, Training Loss: 0.00791
Epoch: 29728, Training Loss: 0.00968
Epoch: 29728, Training Loss: 0.00749
Epoch: 29729, Training Loss: 0.00848
Epoch: 29729, Training Loss: 0.00791
Epoch: 29729, Training Loss: 0.00968
Epoch: 29729, Training Loss: 0.00749
Epoch: 29730, Training Loss: 0.00848
Epoch: 29730, Training Loss: 0.00791
Epoch: 29730, Training Loss: 0.00968
Epoch: 29730, Training Loss: 0.00749
Epoch: 29731, Training Loss: 0.00847
Epoch: 29731, Training Loss: 0.00791
Epoch: 29731, Training Loss: 0.00968
Epoch: 29731, Training Loss: 0.00749
Epoch: 29732, Training Loss: 0.00847
Epoch: 29732, Training Loss: 0.00791
Epoch: 29732, Training Loss: 0.00968
Epoch: 29732, Training Loss: 0.00749
Epoch: 29733, Training Loss: 0.00847
Epoch: 29733, Training Loss: 0.00791
Epoch: 29733, Training Loss: 0.00968
Epoch: 29733, Training Loss: 0.00749
Epoch: 29734, Training Loss: 0.00847
Epoch: 29734, Training Loss: 0.00791
Epoch: 29734, Training Loss: 0.00968
Epoch: 29734, Training Loss: 0.00749
Epoch: 29735, Training Loss: 0.00847
Epoch: 29735, Training Loss: 0.00791
Epoch: 29735, Training Loss: 0.00968
Epoch: 29735, Training Loss: 0.00749
Epoch: 29736, Training Loss: 0.00847
Epoch: 29736, Training Loss: 0.00791
Epoch: 29736, Training Loss: 0.00968
Epoch: 29736, Training Loss: 0.00749
Epoch: 29737, Training Loss: 0.00847
Epoch: 29737, Training Loss: 0.00791
Epoch: 29737, Training Loss: 0.00968
Epoch: 29737, Training Loss: 0.00749
Epoch: 29738, Training Loss: 0.00847
Epoch: 29738, Training Loss: 0.00791
Epoch: 29738, Training Loss: 0.00968
Epoch: 29738, Training Loss: 0.00749
Epoch: 29739, Training Loss: 0.00847
Epoch: 29739, Training Loss: 0.00791
Epoch: 29739, Training Loss: 0.00968
Epoch: 29739, Training Loss: 0.00749
Epoch: 29740, Training Loss: 0.00847
Epoch: 29740, Training Loss: 0.00791
Epoch: 29740, Training Loss: 0.00968
Epoch: 29740, Training Loss: 0.00749
Epoch: 29741, Training Loss: 0.00847
Epoch: 29741, Training Loss: 0.00791
Epoch: 29741, Training Loss: 0.00968
Epoch: 29741, Training Loss: 0.00749
Epoch: 29742, Training Loss: 0.00847
Epoch: 29742, Training Loss: 0.00791
Epoch: 29742, Training Loss: 0.00968
Epoch: 29742, Training Loss: 0.00749
Epoch: 29743, Training Loss: 0.00847
Epoch: 29743, Training Loss: 0.00791
Epoch: 29743, Training Loss: 0.00968
Epoch: 29743, Training Loss: 0.00749
Epoch: 29744, Training Loss: 0.00847
Epoch: 29744, Training Loss: 0.00791
Epoch: 29744, Training Loss: 0.00968
Epoch: 29744, Training Loss: 0.00749
Epoch: 29745, Training Loss: 0.00847
Epoch: 29745, Training Loss: 0.00791
Epoch: 29745, Training Loss: 0.00968
Epoch: 29745, Training Loss: 0.00749
Epoch: 29746, Training Loss: 0.00847
Epoch: 29746, Training Loss: 0.00791
Epoch: 29746, Training Loss: 0.00968
Epoch: 29746, Training Loss: 0.00749
Epoch: 29747, Training Loss: 0.00847
Epoch: 29747, Training Loss: 0.00791
Epoch: 29747, Training Loss: 0.00968
Epoch: 29747, Training Loss: 0.00749
Epoch: 29748, Training Loss: 0.00847
Epoch: 29748, Training Loss: 0.00791
Epoch: 29748, Training Loss: 0.00968
Epoch: 29748, Training Loss: 0.00749
Epoch: 29749, Training Loss: 0.00847
Epoch: 29749, Training Loss: 0.00791
Epoch: 29749, Training Loss: 0.00968
Epoch: 29749, Training Loss: 0.00749
Epoch: 29750, Training Loss: 0.00847
Epoch: 29750, Training Loss: 0.00791
Epoch: 29750, Training Loss: 0.00968
Epoch: 29750, Training Loss: 0.00749
Epoch: 29751, Training Loss: 0.00847
Epoch: 29751, Training Loss: 0.00791
Epoch: 29751, Training Loss: 0.00968
Epoch: 29751, Training Loss: 0.00749
Epoch: 29752, Training Loss: 0.00847
Epoch: 29752, Training Loss: 0.00791
Epoch: 29752, Training Loss: 0.00968
Epoch: 29752, Training Loss: 0.00749
Epoch: 29753, Training Loss: 0.00847
Epoch: 29753, Training Loss: 0.00791
Epoch: 29753, Training Loss: 0.00967
Epoch: 29753, Training Loss: 0.00749
Epoch: 29754, Training Loss: 0.00847
Epoch: 29754, Training Loss: 0.00791
Epoch: 29754, Training Loss: 0.00967
Epoch: 29754, Training Loss: 0.00749
Epoch: 29755, Training Loss: 0.00847
Epoch: 29755, Training Loss: 0.00791
Epoch: 29755, Training Loss: 0.00967
Epoch: 29755, Training Loss: 0.00749
Epoch: 29756, Training Loss: 0.00847
Epoch: 29756, Training Loss: 0.00791
Epoch: 29756, Training Loss: 0.00967
Epoch: 29756, Training Loss: 0.00749
Epoch: 29757, Training Loss: 0.00847
Epoch: 29757, Training Loss: 0.00791
Epoch: 29757, Training Loss: 0.00967
Epoch: 29757, Training Loss: 0.00749
Epoch: 29758, Training Loss: 0.00847
Epoch: 29758, Training Loss: 0.00791
Epoch: 29758, Training Loss: 0.00967
Epoch: 29758, Training Loss: 0.00749
Epoch: 29759, Training Loss: 0.00847
Epoch: 29759, Training Loss: 0.00791
Epoch: 29759, Training Loss: 0.00967
Epoch: 29759, Training Loss: 0.00749
Epoch: 29760, Training Loss: 0.00847
Epoch: 29760, Training Loss: 0.00791
Epoch: 29760, Training Loss: 0.00967
Epoch: 29760, Training Loss: 0.00749
Epoch: 29761, Training Loss: 0.00847
Epoch: 29761, Training Loss: 0.00791
Epoch: 29761, Training Loss: 0.00967
Epoch: 29761, Training Loss: 0.00749
Epoch: 29762, Training Loss: 0.00847
Epoch: 29762, Training Loss: 0.00791
Epoch: 29762, Training Loss: 0.00967
Epoch: 29762, Training Loss: 0.00749
Epoch: 29763, Training Loss: 0.00847
Epoch: 29763, Training Loss: 0.00791
Epoch: 29763, Training Loss: 0.00967
Epoch: 29763, Training Loss: 0.00749
Epoch: 29764, Training Loss: 0.00847
Epoch: 29764, Training Loss: 0.00791
Epoch: 29764, Training Loss: 0.00967
Epoch: 29764, Training Loss: 0.00749
Epoch: 29765, Training Loss: 0.00847
Epoch: 29765, Training Loss: 0.00791
Epoch: 29765, Training Loss: 0.00967
Epoch: 29765, Training Loss: 0.00749
Epoch: 29766, Training Loss: 0.00847
Epoch: 29766, Training Loss: 0.00791
Epoch: 29766, Training Loss: 0.00967
Epoch: 29766, Training Loss: 0.00749
Epoch: 29767, Training Loss: 0.00847
Epoch: 29767, Training Loss: 0.00791
Epoch: 29767, Training Loss: 0.00967
Epoch: 29767, Training Loss: 0.00749
Epoch: 29768, Training Loss: 0.00847
Epoch: 29768, Training Loss: 0.00791
Epoch: 29768, Training Loss: 0.00967
Epoch: 29768, Training Loss: 0.00749
Epoch: 29769, Training Loss: 0.00847
Epoch: 29769, Training Loss: 0.00791
Epoch: 29769, Training Loss: 0.00967
Epoch: 29769, Training Loss: 0.00749
Epoch: 29770, Training Loss: 0.00847
Epoch: 29770, Training Loss: 0.00791
Epoch: 29770, Training Loss: 0.00967
Epoch: 29770, Training Loss: 0.00749
Epoch: 29771, Training Loss: 0.00847
Epoch: 29771, Training Loss: 0.00791
Epoch: 29771, Training Loss: 0.00967
Epoch: 29771, Training Loss: 0.00749
Epoch: 29772, Training Loss: 0.00847
Epoch: 29772, Training Loss: 0.00791
Epoch: 29772, Training Loss: 0.00967
Epoch: 29772, Training Loss: 0.00749
Epoch: 29773, Training Loss: 0.00847
Epoch: 29773, Training Loss: 0.00791
Epoch: 29773, Training Loss: 0.00967
Epoch: 29773, Training Loss: 0.00749
Epoch: 29774, Training Loss: 0.00847
Epoch: 29774, Training Loss: 0.00791
Epoch: 29774, Training Loss: 0.00967
Epoch: 29774, Training Loss: 0.00749
Epoch: 29775, Training Loss: 0.00847
Epoch: 29775, Training Loss: 0.00791
Epoch: 29775, Training Loss: 0.00967
Epoch: 29775, Training Loss: 0.00749
Epoch: 29776, Training Loss: 0.00847
Epoch: 29776, Training Loss: 0.00791
Epoch: 29776, Training Loss: 0.00967
Epoch: 29776, Training Loss: 0.00749
Epoch: 29777, Training Loss: 0.00847
Epoch: 29777, Training Loss: 0.00791
Epoch: 29777, Training Loss: 0.00967
Epoch: 29777, Training Loss: 0.00749
Epoch: 29778, Training Loss: 0.00847
Epoch: 29778, Training Loss: 0.00791
Epoch: 29778, Training Loss: 0.00967
Epoch: 29778, Training Loss: 0.00749
Epoch: 29779, Training Loss: 0.00847
Epoch: 29779, Training Loss: 0.00791
Epoch: 29779, Training Loss: 0.00967
Epoch: 29779, Training Loss: 0.00749
Epoch: 29780, Training Loss: 0.00847
Epoch: 29780, Training Loss: 0.00791
Epoch: 29780, Training Loss: 0.00967
Epoch: 29780, Training Loss: 0.00749
Epoch: 29781, Training Loss: 0.00847
Epoch: 29781, Training Loss: 0.00791
Epoch: 29781, Training Loss: 0.00967
Epoch: 29781, Training Loss: 0.00749
Epoch: 29782, Training Loss: 0.00847
Epoch: 29782, Training Loss: 0.00791
Epoch: 29782, Training Loss: 0.00967
Epoch: 29782, Training Loss: 0.00749
Epoch: 29783, Training Loss: 0.00847
Epoch: 29783, Training Loss: 0.00791
Epoch: 29783, Training Loss: 0.00967
Epoch: 29783, Training Loss: 0.00749
Epoch: 29784, Training Loss: 0.00847
Epoch: 29784, Training Loss: 0.00791
Epoch: 29784, Training Loss: 0.00967
Epoch: 29784, Training Loss: 0.00749
Epoch: 29785, Training Loss: 0.00847
Epoch: 29785, Training Loss: 0.00791
Epoch: 29785, Training Loss: 0.00967
Epoch: 29785, Training Loss: 0.00749
Epoch: 29786, Training Loss: 0.00847
Epoch: 29786, Training Loss: 0.00791
Epoch: 29786, Training Loss: 0.00967
Epoch: 29786, Training Loss: 0.00749
Epoch: 29787, Training Loss: 0.00847
Epoch: 29787, Training Loss: 0.00791
Epoch: 29787, Training Loss: 0.00967
Epoch: 29787, Training Loss: 0.00749
Epoch: 29788, Training Loss: 0.00847
Epoch: 29788, Training Loss: 0.00791
Epoch: 29788, Training Loss: 0.00967
Epoch: 29788, Training Loss: 0.00749
Epoch: 29789, Training Loss: 0.00847
Epoch: 29789, Training Loss: 0.00790
Epoch: 29789, Training Loss: 0.00967
Epoch: 29789, Training Loss: 0.00749
Epoch: 29790, Training Loss: 0.00847
Epoch: 29790, Training Loss: 0.00790
Epoch: 29790, Training Loss: 0.00967
Epoch: 29790, Training Loss: 0.00749
Epoch: 29791, Training Loss: 0.00847
Epoch: 29791, Training Loss: 0.00790
Epoch: 29791, Training Loss: 0.00967
Epoch: 29791, Training Loss: 0.00749
Epoch: 29792, Training Loss: 0.00847
Epoch: 29792, Training Loss: 0.00790
Epoch: 29792, Training Loss: 0.00967
Epoch: 29792, Training Loss: 0.00749
Epoch: 29793, Training Loss: 0.00847
Epoch: 29793, Training Loss: 0.00790
Epoch: 29793, Training Loss: 0.00967
Epoch: 29793, Training Loss: 0.00749
Epoch: 29794, Training Loss: 0.00847
Epoch: 29794, Training Loss: 0.00790
Epoch: 29794, Training Loss: 0.00967
Epoch: 29794, Training Loss: 0.00749
Epoch: 29795, Training Loss: 0.00847
Epoch: 29795, Training Loss: 0.00790
Epoch: 29795, Training Loss: 0.00967
Epoch: 29795, Training Loss: 0.00748
Epoch: 29796, Training Loss: 0.00847
Epoch: 29796, Training Loss: 0.00790
Epoch: 29796, Training Loss: 0.00967
Epoch: 29796, Training Loss: 0.00748
Epoch: 29797, Training Loss: 0.00846
Epoch: 29797, Training Loss: 0.00790
Epoch: 29797, Training Loss: 0.00967
Epoch: 29797, Training Loss: 0.00748
Epoch: 29798, Training Loss: 0.00846
Epoch: 29798, Training Loss: 0.00790
Epoch: 29798, Training Loss: 0.00967
Epoch: 29798, Training Loss: 0.00748
Epoch: 29799, Training Loss: 0.00846
Epoch: 29799, Training Loss: 0.00790
Epoch: 29799, Training Loss: 0.00967
Epoch: 29799, Training Loss: 0.00748
Epoch: 29800, Training Loss: 0.00846
Epoch: 29800, Training Loss: 0.00790
Epoch: 29800, Training Loss: 0.00967
Epoch: 29800, Training Loss: 0.00748
Epoch: 29801, Training Loss: 0.00846
Epoch: 29801, Training Loss: 0.00790
Epoch: 29801, Training Loss: 0.00967
Epoch: 29801, Training Loss: 0.00748
Epoch: 29802, Training Loss: 0.00846
Epoch: 29802, Training Loss: 0.00790
Epoch: 29802, Training Loss: 0.00967
Epoch: 29802, Training Loss: 0.00748
Epoch: 29803, Training Loss: 0.00846
Epoch: 29803, Training Loss: 0.00790
Epoch: 29803, Training Loss: 0.00967
Epoch: 29803, Training Loss: 0.00748
Epoch: 29804, Training Loss: 0.00846
Epoch: 29804, Training Loss: 0.00790
Epoch: 29804, Training Loss: 0.00967
Epoch: 29804, Training Loss: 0.00748
Epoch: 29805, Training Loss: 0.00846
Epoch: 29805, Training Loss: 0.00790
Epoch: 29805, Training Loss: 0.00967
Epoch: 29805, Training Loss: 0.00748
Epoch: 29806, Training Loss: 0.00846
Epoch: 29806, Training Loss: 0.00790
Epoch: 29806, Training Loss: 0.00967
Epoch: 29806, Training Loss: 0.00748
Epoch: 29807, Training Loss: 0.00846
Epoch: 29807, Training Loss: 0.00790
Epoch: 29807, Training Loss: 0.00967
Epoch: 29807, Training Loss: 0.00748
Epoch: 29808, Training Loss: 0.00846
Epoch: 29808, Training Loss: 0.00790
Epoch: 29808, Training Loss: 0.00967
Epoch: 29808, Training Loss: 0.00748
Epoch: 29809, Training Loss: 0.00846
Epoch: 29809, Training Loss: 0.00790
Epoch: 29809, Training Loss: 0.00967
Epoch: 29809, Training Loss: 0.00748
Epoch: 29810, Training Loss: 0.00846
Epoch: 29810, Training Loss: 0.00790
Epoch: 29810, Training Loss: 0.00967
Epoch: 29810, Training Loss: 0.00748
Epoch: 29811, Training Loss: 0.00846
Epoch: 29811, Training Loss: 0.00790
Epoch: 29811, Training Loss: 0.00966
Epoch: 29811, Training Loss: 0.00748
Epoch: 29812, Training Loss: 0.00846
Epoch: 29812, Training Loss: 0.00790
Epoch: 29812, Training Loss: 0.00966
Epoch: 29812, Training Loss: 0.00748
Epoch: 29813, Training Loss: 0.00846
Epoch: 29813, Training Loss: 0.00790
Epoch: 29813, Training Loss: 0.00966
Epoch: 29813, Training Loss: 0.00748
Epoch: 29814, Training Loss: 0.00846
Epoch: 29814, Training Loss: 0.00790
Epoch: 29814, Training Loss: 0.00966
Epoch: 29814, Training Loss: 0.00748
Epoch: 29815, Training Loss: 0.00846
Epoch: 29815, Training Loss: 0.00790
Epoch: 29815, Training Loss: 0.00966
Epoch: 29815, Training Loss: 0.00748
Epoch: 29816, Training Loss: 0.00846
Epoch: 29816, Training Loss: 0.00790
Epoch: 29816, Training Loss: 0.00966
Epoch: 29816, Training Loss: 0.00748
Epoch: 29817, Training Loss: 0.00846
Epoch: 29817, Training Loss: 0.00790
Epoch: 29817, Training Loss: 0.00966
Epoch: 29817, Training Loss: 0.00748
Epoch: 29818, Training Loss: 0.00846
Epoch: 29818, Training Loss: 0.00790
Epoch: 29818, Training Loss: 0.00966
Epoch: 29818, Training Loss: 0.00748
Epoch: 29819, Training Loss: 0.00846
Epoch: 29819, Training Loss: 0.00790
Epoch: 29819, Training Loss: 0.00966
Epoch: 29819, Training Loss: 0.00748
Epoch: 29820, Training Loss: 0.00846
Epoch: 29820, Training Loss: 0.00790
Epoch: 29820, Training Loss: 0.00966
Epoch: 29820, Training Loss: 0.00748
Epoch: 29821, Training Loss: 0.00846
Epoch: 29821, Training Loss: 0.00790
Epoch: 29821, Training Loss: 0.00966
Epoch: 29821, Training Loss: 0.00748
Epoch: 29822, Training Loss: 0.00846
Epoch: 29822, Training Loss: 0.00790
Epoch: 29822, Training Loss: 0.00966
Epoch: 29822, Training Loss: 0.00748
Epoch: 29823, Training Loss: 0.00846
Epoch: 29823, Training Loss: 0.00790
Epoch: 29823, Training Loss: 0.00966
Epoch: 29823, Training Loss: 0.00748
Epoch: 29824, Training Loss: 0.00846
Epoch: 29824, Training Loss: 0.00790
Epoch: 29824, Training Loss: 0.00966
Epoch: 29824, Training Loss: 0.00748
Epoch: 29825, Training Loss: 0.00846
Epoch: 29825, Training Loss: 0.00790
Epoch: 29825, Training Loss: 0.00966
Epoch: 29825, Training Loss: 0.00748
Epoch: 29826, Training Loss: 0.00846
Epoch: 29826, Training Loss: 0.00790
Epoch: 29826, Training Loss: 0.00966
Epoch: 29826, Training Loss: 0.00748
Epoch: 29827, Training Loss: 0.00846
Epoch: 29827, Training Loss: 0.00790
Epoch: 29827, Training Loss: 0.00966
Epoch: 29827, Training Loss: 0.00748
Epoch: 29828, Training Loss: 0.00846
Epoch: 29828, Training Loss: 0.00790
Epoch: 29828, Training Loss: 0.00966
Epoch: 29828, Training Loss: 0.00748
Epoch: 29829, Training Loss: 0.00846
Epoch: 29829, Training Loss: 0.00790
Epoch: 29829, Training Loss: 0.00966
Epoch: 29829, Training Loss: 0.00748
Epoch: 29830, Training Loss: 0.00846
Epoch: 29830, Training Loss: 0.00790
Epoch: 29830, Training Loss: 0.00966
Epoch: 29830, Training Loss: 0.00748
Epoch: 29831, Training Loss: 0.00846
Epoch: 29831, Training Loss: 0.00790
Epoch: 29831, Training Loss: 0.00966
Epoch: 29831, Training Loss: 0.00748
Epoch: 29832, Training Loss: 0.00846
Epoch: 29832, Training Loss: 0.00790
Epoch: 29832, Training Loss: 0.00966
Epoch: 29832, Training Loss: 0.00748
Epoch: 29833, Training Loss: 0.00846
Epoch: 29833, Training Loss: 0.00790
Epoch: 29833, Training Loss: 0.00966
Epoch: 29833, Training Loss: 0.00748
Epoch: 29834, Training Loss: 0.00846
Epoch: 29834, Training Loss: 0.00790
Epoch: 29834, Training Loss: 0.00966
Epoch: 29834, Training Loss: 0.00748
Epoch: 29835, Training Loss: 0.00846
Epoch: 29835, Training Loss: 0.00790
Epoch: 29835, Training Loss: 0.00966
Epoch: 29835, Training Loss: 0.00748
Epoch: 29836, Training Loss: 0.00846
Epoch: 29836, Training Loss: 0.00790
Epoch: 29836, Training Loss: 0.00966
Epoch: 29836, Training Loss: 0.00748
Epoch: 29837, Training Loss: 0.00846
Epoch: 29837, Training Loss: 0.00790
Epoch: 29837, Training Loss: 0.00966
Epoch: 29837, Training Loss: 0.00748
Epoch: 29838, Training Loss: 0.00846
Epoch: 29838, Training Loss: 0.00790
Epoch: 29838, Training Loss: 0.00966
Epoch: 29838, Training Loss: 0.00748
Epoch: 29839, Training Loss: 0.00846
Epoch: 29839, Training Loss: 0.00790
Epoch: 29839, Training Loss: 0.00966
Epoch: 29839, Training Loss: 0.00748
Epoch: 29840, Training Loss: 0.00846
Epoch: 29840, Training Loss: 0.00790
Epoch: 29840, Training Loss: 0.00966
Epoch: 29840, Training Loss: 0.00748
Epoch: 29841, Training Loss: 0.00846
Epoch: 29841, Training Loss: 0.00790
Epoch: 29841, Training Loss: 0.00966
Epoch: 29841, Training Loss: 0.00748
Epoch: 29842, Training Loss: 0.00846
Epoch: 29842, Training Loss: 0.00790
Epoch: 29842, Training Loss: 0.00966
Epoch: 29842, Training Loss: 0.00748
Epoch: 29843, Training Loss: 0.00846
Epoch: 29843, Training Loss: 0.00790
Epoch: 29843, Training Loss: 0.00966
Epoch: 29843, Training Loss: 0.00748
Epoch: 29844, Training Loss: 0.00846
Epoch: 29844, Training Loss: 0.00790
Epoch: 29844, Training Loss: 0.00966
Epoch: 29844, Training Loss: 0.00748
Epoch: 29845, Training Loss: 0.00846
Epoch: 29845, Training Loss: 0.00790
Epoch: 29845, Training Loss: 0.00966
Epoch: 29845, Training Loss: 0.00748
Epoch: 29846, Training Loss: 0.00846
Epoch: 29846, Training Loss: 0.00790
Epoch: 29846, Training Loss: 0.00966
Epoch: 29846, Training Loss: 0.00748
Epoch: 29847, Training Loss: 0.00846
Epoch: 29847, Training Loss: 0.00790
Epoch: 29847, Training Loss: 0.00966
Epoch: 29847, Training Loss: 0.00748
Epoch: 29848, Training Loss: 0.00846
Epoch: 29848, Training Loss: 0.00790
Epoch: 29848, Training Loss: 0.00966
Epoch: 29848, Training Loss: 0.00748
Epoch: 29849, Training Loss: 0.00846
Epoch: 29849, Training Loss: 0.00790
Epoch: 29849, Training Loss: 0.00966
Epoch: 29849, Training Loss: 0.00748
Epoch: 29850, Training Loss: 0.00846
Epoch: 29850, Training Loss: 0.00790
Epoch: 29850, Training Loss: 0.00966
Epoch: 29850, Training Loss: 0.00748
Epoch: 29851, Training Loss: 0.00846
Epoch: 29851, Training Loss: 0.00790
Epoch: 29851, Training Loss: 0.00966
Epoch: 29851, Training Loss: 0.00748
Epoch: 29852, Training Loss: 0.00846
Epoch: 29852, Training Loss: 0.00790
Epoch: 29852, Training Loss: 0.00966
Epoch: 29852, Training Loss: 0.00748
Epoch: 29853, Training Loss: 0.00846
Epoch: 29853, Training Loss: 0.00790
Epoch: 29853, Training Loss: 0.00966
Epoch: 29853, Training Loss: 0.00748
Epoch: 29854, Training Loss: 0.00846
Epoch: 29854, Training Loss: 0.00790
Epoch: 29854, Training Loss: 0.00966
Epoch: 29854, Training Loss: 0.00748
Epoch: 29855, Training Loss: 0.00846
Epoch: 29855, Training Loss: 0.00790
Epoch: 29855, Training Loss: 0.00966
Epoch: 29855, Training Loss: 0.00748
Epoch: 29856, Training Loss: 0.00846
Epoch: 29856, Training Loss: 0.00790
Epoch: 29856, Training Loss: 0.00966
Epoch: 29856, Training Loss: 0.00748
Epoch: 29857, Training Loss: 0.00846
Epoch: 29857, Training Loss: 0.00790
Epoch: 29857, Training Loss: 0.00966
Epoch: 29857, Training Loss: 0.00748
Epoch: 29858, Training Loss: 0.00846
Epoch: 29858, Training Loss: 0.00790
Epoch: 29858, Training Loss: 0.00966
Epoch: 29858, Training Loss: 0.00748
Epoch: 29859, Training Loss: 0.00846
Epoch: 29859, Training Loss: 0.00790
Epoch: 29859, Training Loss: 0.00966
Epoch: 29859, Training Loss: 0.00748
Epoch: 29860, Training Loss: 0.00846
Epoch: 29860, Training Loss: 0.00790
Epoch: 29860, Training Loss: 0.00966
Epoch: 29860, Training Loss: 0.00748
Epoch: 29861, Training Loss: 0.00846
Epoch: 29861, Training Loss: 0.00789
Epoch: 29861, Training Loss: 0.00966
Epoch: 29861, Training Loss: 0.00748
Epoch: 29862, Training Loss: 0.00846
Epoch: 29862, Training Loss: 0.00789
Epoch: 29862, Training Loss: 0.00966
Epoch: 29862, Training Loss: 0.00748
Epoch: 29863, Training Loss: 0.00846
Epoch: 29863, Training Loss: 0.00789
Epoch: 29863, Training Loss: 0.00966
Epoch: 29863, Training Loss: 0.00748
Epoch: 29864, Training Loss: 0.00845
Epoch: 29864, Training Loss: 0.00789
Epoch: 29864, Training Loss: 0.00966
Epoch: 29864, Training Loss: 0.00748
Epoch: 29865, Training Loss: 0.00845
Epoch: 29865, Training Loss: 0.00789
Epoch: 29865, Training Loss: 0.00966
Epoch: 29865, Training Loss: 0.00748
Epoch: 29866, Training Loss: 0.00845
Epoch: 29866, Training Loss: 0.00789
Epoch: 29866, Training Loss: 0.00966
Epoch: 29866, Training Loss: 0.00748
Epoch: 29867, Training Loss: 0.00845
Epoch: 29867, Training Loss: 0.00789
Epoch: 29867, Training Loss: 0.00966
Epoch: 29867, Training Loss: 0.00748
Epoch: 29868, Training Loss: 0.00845
Epoch: 29868, Training Loss: 0.00789
Epoch: 29868, Training Loss: 0.00966
Epoch: 29868, Training Loss: 0.00748
Epoch: 29869, Training Loss: 0.00845
Epoch: 29869, Training Loss: 0.00789
Epoch: 29869, Training Loss: 0.00965
Epoch: 29869, Training Loss: 0.00748
Epoch: 29870, Training Loss: 0.00845
Epoch: 29870, Training Loss: 0.00789
Epoch: 29870, Training Loss: 0.00965
Epoch: 29870, Training Loss: 0.00748
Epoch: 29871, Training Loss: 0.00845
Epoch: 29871, Training Loss: 0.00789
Epoch: 29871, Training Loss: 0.00965
Epoch: 29871, Training Loss: 0.00747
Epoch: 29872, Training Loss: 0.00845
Epoch: 29872, Training Loss: 0.00789
Epoch: 29872, Training Loss: 0.00965
Epoch: 29872, Training Loss: 0.00747
Epoch: 29873, Training Loss: 0.00845
Epoch: 29873, Training Loss: 0.00789
Epoch: 29873, Training Loss: 0.00965
Epoch: 29873, Training Loss: 0.00747
Epoch: 29874, Training Loss: 0.00845
Epoch: 29874, Training Loss: 0.00789
Epoch: 29874, Training Loss: 0.00965
Epoch: 29874, Training Loss: 0.00747
Epoch: 29875, Training Loss: 0.00845
Epoch: 29875, Training Loss: 0.00789
Epoch: 29875, Training Loss: 0.00965
Epoch: 29875, Training Loss: 0.00747
Epoch: 29876, Training Loss: 0.00845
Epoch: 29876, Training Loss: 0.00789
Epoch: 29876, Training Loss: 0.00965
Epoch: 29876, Training Loss: 0.00747
Epoch: 29877, Training Loss: 0.00845
Epoch: 29877, Training Loss: 0.00789
Epoch: 29877, Training Loss: 0.00965
Epoch: 29877, Training Loss: 0.00747
Epoch: 29878, Training Loss: 0.00845
Epoch: 29878, Training Loss: 0.00789
Epoch: 29878, Training Loss: 0.00965
Epoch: 29878, Training Loss: 0.00747
Epoch: 29879, Training Loss: 0.00845
Epoch: 29879, Training Loss: 0.00789
Epoch: 29879, Training Loss: 0.00965
Epoch: 29879, Training Loss: 0.00747
Epoch: 29880, Training Loss: 0.00845
Epoch: 29880, Training Loss: 0.00789
Epoch: 29880, Training Loss: 0.00965
Epoch: 29880, Training Loss: 0.00747
Epoch: 29881, Training Loss: 0.00845
Epoch: 29881, Training Loss: 0.00789
Epoch: 29881, Training Loss: 0.00965
Epoch: 29881, Training Loss: 0.00747
Epoch: 29882, Training Loss: 0.00845
Epoch: 29882, Training Loss: 0.00789
Epoch: 29882, Training Loss: 0.00965
Epoch: 29882, Training Loss: 0.00747
Epoch: 29883, Training Loss: 0.00845
Epoch: 29883, Training Loss: 0.00789
Epoch: 29883, Training Loss: 0.00965
Epoch: 29883, Training Loss: 0.00747
... [per-sample losses for epochs 29884 through 30126 omitted; the four values decrease only in the fifth decimal place over this range] ...
Epoch: 30126, Training Loss: 0.00842
Epoch: 30126, Training Loss: 0.00786
Epoch: 30126, Training Loss: 0.00961
Epoch: 30126, Training Loss: 0.00744
Epoch: 30127, Training Loss: 0.00842
Epoch: 30127, Training Loss: 0.00786
Epoch: 30127, Training Loss: 0.00961
Epoch: 30127, Training Loss: 0.00744
Epoch: 30128, Training Loss: 0.00842
Epoch: 30128, Training Loss: 0.00786
Epoch: 30128, Training Loss: 0.00961
Epoch: 30128, Training Loss: 0.00744
Epoch: 30129, Training Loss: 0.00842
Epoch: 30129, Training Loss: 0.00786
Epoch: 30129, Training Loss: 0.00961
Epoch: 30129, Training Loss: 0.00744
Epoch: 30130, Training Loss: 0.00842
Epoch: 30130, Training Loss: 0.00786
Epoch: 30130, Training Loss: 0.00961
Epoch: 30130, Training Loss: 0.00744
Epoch: 30131, Training Loss: 0.00841
Epoch: 30131, Training Loss: 0.00786
Epoch: 30131, Training Loss: 0.00961
Epoch: 30131, Training Loss: 0.00744
Epoch: 30132, Training Loss: 0.00841
Epoch: 30132, Training Loss: 0.00786
Epoch: 30132, Training Loss: 0.00961
Epoch: 30132, Training Loss: 0.00744
Epoch: 30133, Training Loss: 0.00841
Epoch: 30133, Training Loss: 0.00786
Epoch: 30133, Training Loss: 0.00961
Epoch: 30133, Training Loss: 0.00744
Epoch: 30134, Training Loss: 0.00841
Epoch: 30134, Training Loss: 0.00786
Epoch: 30134, Training Loss: 0.00961
Epoch: 30134, Training Loss: 0.00744
Epoch: 30135, Training Loss: 0.00841
Epoch: 30135, Training Loss: 0.00786
Epoch: 30135, Training Loss: 0.00961
Epoch: 30135, Training Loss: 0.00744
Epoch: 30136, Training Loss: 0.00841
Epoch: 30136, Training Loss: 0.00786
Epoch: 30136, Training Loss: 0.00961
Epoch: 30136, Training Loss: 0.00744
Epoch: 30137, Training Loss: 0.00841
Epoch: 30137, Training Loss: 0.00786
Epoch: 30137, Training Loss: 0.00961
Epoch: 30137, Training Loss: 0.00744
Epoch: 30138, Training Loss: 0.00841
Epoch: 30138, Training Loss: 0.00786
Epoch: 30138, Training Loss: 0.00961
Epoch: 30138, Training Loss: 0.00744
Epoch: 30139, Training Loss: 0.00841
Epoch: 30139, Training Loss: 0.00786
Epoch: 30139, Training Loss: 0.00961
Epoch: 30139, Training Loss: 0.00744
Epoch: 30140, Training Loss: 0.00841
Epoch: 30140, Training Loss: 0.00786
Epoch: 30140, Training Loss: 0.00961
Epoch: 30140, Training Loss: 0.00744
Epoch: 30141, Training Loss: 0.00841
Epoch: 30141, Training Loss: 0.00786
Epoch: 30141, Training Loss: 0.00961
Epoch: 30141, Training Loss: 0.00744
Epoch: 30142, Training Loss: 0.00841
Epoch: 30142, Training Loss: 0.00786
Epoch: 30142, Training Loss: 0.00961
Epoch: 30142, Training Loss: 0.00744
Epoch: 30143, Training Loss: 0.00841
Epoch: 30143, Training Loss: 0.00786
Epoch: 30143, Training Loss: 0.00961
Epoch: 30143, Training Loss: 0.00744
Epoch: 30144, Training Loss: 0.00841
Epoch: 30144, Training Loss: 0.00786
Epoch: 30144, Training Loss: 0.00961
Epoch: 30144, Training Loss: 0.00744
Epoch: 30145, Training Loss: 0.00841
Epoch: 30145, Training Loss: 0.00786
Epoch: 30145, Training Loss: 0.00961
Epoch: 30145, Training Loss: 0.00744
Epoch: 30146, Training Loss: 0.00841
Epoch: 30146, Training Loss: 0.00786
Epoch: 30146, Training Loss: 0.00961
Epoch: 30146, Training Loss: 0.00744
Epoch: 30147, Training Loss: 0.00841
Epoch: 30147, Training Loss: 0.00786
Epoch: 30147, Training Loss: 0.00961
Epoch: 30147, Training Loss: 0.00744
Epoch: 30148, Training Loss: 0.00841
Epoch: 30148, Training Loss: 0.00786
Epoch: 30148, Training Loss: 0.00961
Epoch: 30148, Training Loss: 0.00744
Epoch: 30149, Training Loss: 0.00841
Epoch: 30149, Training Loss: 0.00786
Epoch: 30149, Training Loss: 0.00961
Epoch: 30149, Training Loss: 0.00744
Epoch: 30150, Training Loss: 0.00841
Epoch: 30150, Training Loss: 0.00786
Epoch: 30150, Training Loss: 0.00961
Epoch: 30150, Training Loss: 0.00744
Epoch: 30151, Training Loss: 0.00841
Epoch: 30151, Training Loss: 0.00786
Epoch: 30151, Training Loss: 0.00961
Epoch: 30151, Training Loss: 0.00744
Epoch: 30152, Training Loss: 0.00841
Epoch: 30152, Training Loss: 0.00785
Epoch: 30152, Training Loss: 0.00961
Epoch: 30152, Training Loss: 0.00744
Epoch: 30153, Training Loss: 0.00841
Epoch: 30153, Training Loss: 0.00785
Epoch: 30153, Training Loss: 0.00961
Epoch: 30153, Training Loss: 0.00744
Epoch: 30154, Training Loss: 0.00841
Epoch: 30154, Training Loss: 0.00785
Epoch: 30154, Training Loss: 0.00961
Epoch: 30154, Training Loss: 0.00744
Epoch: 30155, Training Loss: 0.00841
Epoch: 30155, Training Loss: 0.00785
Epoch: 30155, Training Loss: 0.00961
Epoch: 30155, Training Loss: 0.00744
Epoch: 30156, Training Loss: 0.00841
Epoch: 30156, Training Loss: 0.00785
Epoch: 30156, Training Loss: 0.00961
Epoch: 30156, Training Loss: 0.00744
Epoch: 30157, Training Loss: 0.00841
Epoch: 30157, Training Loss: 0.00785
Epoch: 30157, Training Loss: 0.00961
Epoch: 30157, Training Loss: 0.00744
Epoch: 30158, Training Loss: 0.00841
Epoch: 30158, Training Loss: 0.00785
Epoch: 30158, Training Loss: 0.00961
Epoch: 30158, Training Loss: 0.00744
Epoch: 30159, Training Loss: 0.00841
Epoch: 30159, Training Loss: 0.00785
Epoch: 30159, Training Loss: 0.00961
Epoch: 30159, Training Loss: 0.00744
Epoch: 30160, Training Loss: 0.00841
Epoch: 30160, Training Loss: 0.00785
Epoch: 30160, Training Loss: 0.00961
Epoch: 30160, Training Loss: 0.00744
Epoch: 30161, Training Loss: 0.00841
Epoch: 30161, Training Loss: 0.00785
Epoch: 30161, Training Loss: 0.00961
Epoch: 30161, Training Loss: 0.00744
Epoch: 30162, Training Loss: 0.00841
Epoch: 30162, Training Loss: 0.00785
Epoch: 30162, Training Loss: 0.00961
Epoch: 30162, Training Loss: 0.00744
Epoch: 30163, Training Loss: 0.00841
Epoch: 30163, Training Loss: 0.00785
Epoch: 30163, Training Loss: 0.00960
Epoch: 30163, Training Loss: 0.00744
Epoch: 30164, Training Loss: 0.00841
Epoch: 30164, Training Loss: 0.00785
Epoch: 30164, Training Loss: 0.00960
Epoch: 30164, Training Loss: 0.00744
Epoch: 30165, Training Loss: 0.00841
Epoch: 30165, Training Loss: 0.00785
Epoch: 30165, Training Loss: 0.00960
Epoch: 30165, Training Loss: 0.00744
Epoch: 30166, Training Loss: 0.00841
Epoch: 30166, Training Loss: 0.00785
Epoch: 30166, Training Loss: 0.00960
Epoch: 30166, Training Loss: 0.00744
Epoch: 30167, Training Loss: 0.00841
Epoch: 30167, Training Loss: 0.00785
Epoch: 30167, Training Loss: 0.00960
Epoch: 30167, Training Loss: 0.00744
Epoch: 30168, Training Loss: 0.00841
Epoch: 30168, Training Loss: 0.00785
Epoch: 30168, Training Loss: 0.00960
Epoch: 30168, Training Loss: 0.00744
Epoch: 30169, Training Loss: 0.00841
Epoch: 30169, Training Loss: 0.00785
Epoch: 30169, Training Loss: 0.00960
Epoch: 30169, Training Loss: 0.00744
Epoch: 30170, Training Loss: 0.00841
Epoch: 30170, Training Loss: 0.00785
Epoch: 30170, Training Loss: 0.00960
Epoch: 30170, Training Loss: 0.00744
Epoch: 30171, Training Loss: 0.00841
Epoch: 30171, Training Loss: 0.00785
Epoch: 30171, Training Loss: 0.00960
Epoch: 30171, Training Loss: 0.00744
Epoch: 30172, Training Loss: 0.00841
Epoch: 30172, Training Loss: 0.00785
Epoch: 30172, Training Loss: 0.00960
Epoch: 30172, Training Loss: 0.00744
Epoch: 30173, Training Loss: 0.00841
Epoch: 30173, Training Loss: 0.00785
Epoch: 30173, Training Loss: 0.00960
Epoch: 30173, Training Loss: 0.00744
Epoch: 30174, Training Loss: 0.00841
Epoch: 30174, Training Loss: 0.00785
Epoch: 30174, Training Loss: 0.00960
Epoch: 30174, Training Loss: 0.00744
Epoch: 30175, Training Loss: 0.00841
Epoch: 30175, Training Loss: 0.00785
Epoch: 30175, Training Loss: 0.00960
Epoch: 30175, Training Loss: 0.00744
Epoch: 30176, Training Loss: 0.00841
Epoch: 30176, Training Loss: 0.00785
Epoch: 30176, Training Loss: 0.00960
Epoch: 30176, Training Loss: 0.00744
Epoch: 30177, Training Loss: 0.00841
Epoch: 30177, Training Loss: 0.00785
Epoch: 30177, Training Loss: 0.00960
Epoch: 30177, Training Loss: 0.00744
Epoch: 30178, Training Loss: 0.00841
Epoch: 30178, Training Loss: 0.00785
Epoch: 30178, Training Loss: 0.00960
Epoch: 30178, Training Loss: 0.00744
Epoch: 30179, Training Loss: 0.00841
Epoch: 30179, Training Loss: 0.00785
Epoch: 30179, Training Loss: 0.00960
Epoch: 30179, Training Loss: 0.00743
Epoch: 30180, Training Loss: 0.00841
Epoch: 30180, Training Loss: 0.00785
Epoch: 30180, Training Loss: 0.00960
Epoch: 30180, Training Loss: 0.00743
Epoch: 30181, Training Loss: 0.00841
Epoch: 30181, Training Loss: 0.00785
Epoch: 30181, Training Loss: 0.00960
Epoch: 30181, Training Loss: 0.00743
Epoch: 30182, Training Loss: 0.00841
Epoch: 30182, Training Loss: 0.00785
Epoch: 30182, Training Loss: 0.00960
Epoch: 30182, Training Loss: 0.00743
Epoch: 30183, Training Loss: 0.00841
Epoch: 30183, Training Loss: 0.00785
Epoch: 30183, Training Loss: 0.00960
Epoch: 30183, Training Loss: 0.00743
Epoch: 30184, Training Loss: 0.00841
Epoch: 30184, Training Loss: 0.00785
Epoch: 30184, Training Loss: 0.00960
Epoch: 30184, Training Loss: 0.00743
Epoch: 30185, Training Loss: 0.00841
Epoch: 30185, Training Loss: 0.00785
Epoch: 30185, Training Loss: 0.00960
Epoch: 30185, Training Loss: 0.00743
Epoch: 30186, Training Loss: 0.00841
Epoch: 30186, Training Loss: 0.00785
Epoch: 30186, Training Loss: 0.00960
Epoch: 30186, Training Loss: 0.00743
Epoch: 30187, Training Loss: 0.00841
Epoch: 30187, Training Loss: 0.00785
Epoch: 30187, Training Loss: 0.00960
Epoch: 30187, Training Loss: 0.00743
Epoch: 30188, Training Loss: 0.00841
Epoch: 30188, Training Loss: 0.00785
Epoch: 30188, Training Loss: 0.00960
Epoch: 30188, Training Loss: 0.00743
Epoch: 30189, Training Loss: 0.00841
Epoch: 30189, Training Loss: 0.00785
Epoch: 30189, Training Loss: 0.00960
Epoch: 30189, Training Loss: 0.00743
Epoch: 30190, Training Loss: 0.00841
Epoch: 30190, Training Loss: 0.00785
Epoch: 30190, Training Loss: 0.00960
Epoch: 30190, Training Loss: 0.00743
Epoch: 30191, Training Loss: 0.00841
Epoch: 30191, Training Loss: 0.00785
Epoch: 30191, Training Loss: 0.00960
Epoch: 30191, Training Loss: 0.00743
Epoch: 30192, Training Loss: 0.00841
Epoch: 30192, Training Loss: 0.00785
Epoch: 30192, Training Loss: 0.00960
Epoch: 30192, Training Loss: 0.00743
Epoch: 30193, Training Loss: 0.00841
Epoch: 30193, Training Loss: 0.00785
Epoch: 30193, Training Loss: 0.00960
Epoch: 30193, Training Loss: 0.00743
Epoch: 30194, Training Loss: 0.00841
Epoch: 30194, Training Loss: 0.00785
Epoch: 30194, Training Loss: 0.00960
Epoch: 30194, Training Loss: 0.00743
Epoch: 30195, Training Loss: 0.00841
Epoch: 30195, Training Loss: 0.00785
Epoch: 30195, Training Loss: 0.00960
Epoch: 30195, Training Loss: 0.00743
Epoch: 30196, Training Loss: 0.00841
Epoch: 30196, Training Loss: 0.00785
Epoch: 30196, Training Loss: 0.00960
Epoch: 30196, Training Loss: 0.00743
Epoch: 30197, Training Loss: 0.00841
Epoch: 30197, Training Loss: 0.00785
Epoch: 30197, Training Loss: 0.00960
Epoch: 30197, Training Loss: 0.00743
Epoch: 30198, Training Loss: 0.00841
Epoch: 30198, Training Loss: 0.00785
Epoch: 30198, Training Loss: 0.00960
Epoch: 30198, Training Loss: 0.00743
Epoch: 30199, Training Loss: 0.00840
Epoch: 30199, Training Loss: 0.00785
Epoch: 30199, Training Loss: 0.00960
Epoch: 30199, Training Loss: 0.00743
Epoch: 30200, Training Loss: 0.00840
Epoch: 30200, Training Loss: 0.00785
Epoch: 30200, Training Loss: 0.00960
Epoch: 30200, Training Loss: 0.00743
Epoch: 30201, Training Loss: 0.00840
Epoch: 30201, Training Loss: 0.00785
Epoch: 30201, Training Loss: 0.00960
Epoch: 30201, Training Loss: 0.00743
Epoch: 30202, Training Loss: 0.00840
Epoch: 30202, Training Loss: 0.00785
Epoch: 30202, Training Loss: 0.00960
Epoch: 30202, Training Loss: 0.00743
Epoch: 30203, Training Loss: 0.00840
Epoch: 30203, Training Loss: 0.00785
Epoch: 30203, Training Loss: 0.00960
Epoch: 30203, Training Loss: 0.00743
Epoch: 30204, Training Loss: 0.00840
Epoch: 30204, Training Loss: 0.00785
Epoch: 30204, Training Loss: 0.00960
Epoch: 30204, Training Loss: 0.00743
Epoch: 30205, Training Loss: 0.00840
Epoch: 30205, Training Loss: 0.00785
Epoch: 30205, Training Loss: 0.00960
Epoch: 30205, Training Loss: 0.00743
Epoch: 30206, Training Loss: 0.00840
Epoch: 30206, Training Loss: 0.00785
Epoch: 30206, Training Loss: 0.00960
Epoch: 30206, Training Loss: 0.00743
Epoch: 30207, Training Loss: 0.00840
Epoch: 30207, Training Loss: 0.00785
Epoch: 30207, Training Loss: 0.00960
Epoch: 30207, Training Loss: 0.00743
Epoch: 30208, Training Loss: 0.00840
Epoch: 30208, Training Loss: 0.00785
Epoch: 30208, Training Loss: 0.00960
Epoch: 30208, Training Loss: 0.00743
Epoch: 30209, Training Loss: 0.00840
Epoch: 30209, Training Loss: 0.00785
Epoch: 30209, Training Loss: 0.00960
Epoch: 30209, Training Loss: 0.00743
Epoch: 30210, Training Loss: 0.00840
Epoch: 30210, Training Loss: 0.00785
Epoch: 30210, Training Loss: 0.00960
Epoch: 30210, Training Loss: 0.00743
Epoch: 30211, Training Loss: 0.00840
Epoch: 30211, Training Loss: 0.00785
Epoch: 30211, Training Loss: 0.00960
Epoch: 30211, Training Loss: 0.00743
Epoch: 30212, Training Loss: 0.00840
Epoch: 30212, Training Loss: 0.00785
Epoch: 30212, Training Loss: 0.00960
Epoch: 30212, Training Loss: 0.00743
Epoch: 30213, Training Loss: 0.00840
Epoch: 30213, Training Loss: 0.00785
Epoch: 30213, Training Loss: 0.00960
Epoch: 30213, Training Loss: 0.00743
Epoch: 30214, Training Loss: 0.00840
Epoch: 30214, Training Loss: 0.00785
Epoch: 30214, Training Loss: 0.00960
Epoch: 30214, Training Loss: 0.00743
Epoch: 30215, Training Loss: 0.00840
Epoch: 30215, Training Loss: 0.00785
Epoch: 30215, Training Loss: 0.00960
Epoch: 30215, Training Loss: 0.00743
Epoch: 30216, Training Loss: 0.00840
Epoch: 30216, Training Loss: 0.00785
Epoch: 30216, Training Loss: 0.00960
Epoch: 30216, Training Loss: 0.00743
Epoch: 30217, Training Loss: 0.00840
Epoch: 30217, Training Loss: 0.00785
Epoch: 30217, Training Loss: 0.00960
Epoch: 30217, Training Loss: 0.00743
Epoch: 30218, Training Loss: 0.00840
Epoch: 30218, Training Loss: 0.00785
Epoch: 30218, Training Loss: 0.00960
Epoch: 30218, Training Loss: 0.00743
Epoch: 30219, Training Loss: 0.00840
Epoch: 30219, Training Loss: 0.00785
Epoch: 30219, Training Loss: 0.00960
Epoch: 30219, Training Loss: 0.00743
Epoch: 30220, Training Loss: 0.00840
Epoch: 30220, Training Loss: 0.00785
Epoch: 30220, Training Loss: 0.00960
Epoch: 30220, Training Loss: 0.00743
Epoch: 30221, Training Loss: 0.00840
Epoch: 30221, Training Loss: 0.00785
Epoch: 30221, Training Loss: 0.00960
Epoch: 30221, Training Loss: 0.00743
Epoch: 30222, Training Loss: 0.00840
Epoch: 30222, Training Loss: 0.00785
Epoch: 30222, Training Loss: 0.00959
Epoch: 30222, Training Loss: 0.00743
Epoch: 30223, Training Loss: 0.00840
Epoch: 30223, Training Loss: 0.00785
Epoch: 30223, Training Loss: 0.00959
Epoch: 30223, Training Loss: 0.00743
Epoch: 30224, Training Loss: 0.00840
Epoch: 30224, Training Loss: 0.00785
Epoch: 30224, Training Loss: 0.00959
Epoch: 30224, Training Loss: 0.00743
Epoch: 30225, Training Loss: 0.00840
Epoch: 30225, Training Loss: 0.00785
Epoch: 30225, Training Loss: 0.00959
Epoch: 30225, Training Loss: 0.00743
Epoch: 30226, Training Loss: 0.00840
Epoch: 30226, Training Loss: 0.00784
Epoch: 30226, Training Loss: 0.00959
Epoch: 30226, Training Loss: 0.00743
Epoch: 30227, Training Loss: 0.00840
Epoch: 30227, Training Loss: 0.00784
Epoch: 30227, Training Loss: 0.00959
Epoch: 30227, Training Loss: 0.00743
Epoch: 30228, Training Loss: 0.00840
Epoch: 30228, Training Loss: 0.00784
Epoch: 30228, Training Loss: 0.00959
Epoch: 30228, Training Loss: 0.00743
Epoch: 30229, Training Loss: 0.00840
Epoch: 30229, Training Loss: 0.00784
Epoch: 30229, Training Loss: 0.00959
Epoch: 30229, Training Loss: 0.00743
Epoch: 30230, Training Loss: 0.00840
Epoch: 30230, Training Loss: 0.00784
Epoch: 30230, Training Loss: 0.00959
Epoch: 30230, Training Loss: 0.00743
Epoch: 30231, Training Loss: 0.00840
Epoch: 30231, Training Loss: 0.00784
Epoch: 30231, Training Loss: 0.00959
Epoch: 30231, Training Loss: 0.00743
Epoch: 30232, Training Loss: 0.00840
Epoch: 30232, Training Loss: 0.00784
Epoch: 30232, Training Loss: 0.00959
Epoch: 30232, Training Loss: 0.00743
Epoch: 30233, Training Loss: 0.00840
Epoch: 30233, Training Loss: 0.00784
Epoch: 30233, Training Loss: 0.00959
Epoch: 30233, Training Loss: 0.00743
Epoch: 30234, Training Loss: 0.00840
Epoch: 30234, Training Loss: 0.00784
Epoch: 30234, Training Loss: 0.00959
Epoch: 30234, Training Loss: 0.00743
Epoch: 30235, Training Loss: 0.00840
Epoch: 30235, Training Loss: 0.00784
Epoch: 30235, Training Loss: 0.00959
Epoch: 30235, Training Loss: 0.00743
Epoch: 30236, Training Loss: 0.00840
Epoch: 30236, Training Loss: 0.00784
Epoch: 30236, Training Loss: 0.00959
Epoch: 30236, Training Loss: 0.00743
Epoch: 30237, Training Loss: 0.00840
Epoch: 30237, Training Loss: 0.00784
Epoch: 30237, Training Loss: 0.00959
Epoch: 30237, Training Loss: 0.00743
Epoch: 30238, Training Loss: 0.00840
Epoch: 30238, Training Loss: 0.00784
Epoch: 30238, Training Loss: 0.00959
Epoch: 30238, Training Loss: 0.00743
Epoch: 30239, Training Loss: 0.00840
Epoch: 30239, Training Loss: 0.00784
Epoch: 30239, Training Loss: 0.00959
Epoch: 30239, Training Loss: 0.00743
Epoch: 30240, Training Loss: 0.00840
Epoch: 30240, Training Loss: 0.00784
Epoch: 30240, Training Loss: 0.00959
Epoch: 30240, Training Loss: 0.00743
Epoch: 30241, Training Loss: 0.00840
Epoch: 30241, Training Loss: 0.00784
Epoch: 30241, Training Loss: 0.00959
Epoch: 30241, Training Loss: 0.00743
Epoch: 30242, Training Loss: 0.00840
Epoch: 30242, Training Loss: 0.00784
Epoch: 30242, Training Loss: 0.00959
Epoch: 30242, Training Loss: 0.00743
Epoch: 30243, Training Loss: 0.00840
Epoch: 30243, Training Loss: 0.00784
Epoch: 30243, Training Loss: 0.00959
Epoch: 30243, Training Loss: 0.00743
Epoch: 30244, Training Loss: 0.00840
Epoch: 30244, Training Loss: 0.00784
Epoch: 30244, Training Loss: 0.00959
Epoch: 30244, Training Loss: 0.00743
Epoch: 30245, Training Loss: 0.00840
Epoch: 30245, Training Loss: 0.00784
Epoch: 30245, Training Loss: 0.00959
Epoch: 30245, Training Loss: 0.00743
Epoch: 30246, Training Loss: 0.00840
Epoch: 30246, Training Loss: 0.00784
Epoch: 30246, Training Loss: 0.00959
Epoch: 30246, Training Loss: 0.00743
Epoch: 30247, Training Loss: 0.00840
Epoch: 30247, Training Loss: 0.00784
Epoch: 30247, Training Loss: 0.00959
Epoch: 30247, Training Loss: 0.00743
Epoch: 30248, Training Loss: 0.00840
Epoch: 30248, Training Loss: 0.00784
Epoch: 30248, Training Loss: 0.00959
Epoch: 30248, Training Loss: 0.00743
Epoch: 30249, Training Loss: 0.00840
Epoch: 30249, Training Loss: 0.00784
Epoch: 30249, Training Loss: 0.00959
Epoch: 30249, Training Loss: 0.00743
Epoch: 30250, Training Loss: 0.00840
Epoch: 30250, Training Loss: 0.00784
Epoch: 30250, Training Loss: 0.00959
Epoch: 30250, Training Loss: 0.00743
Epoch: 30251, Training Loss: 0.00840
Epoch: 30251, Training Loss: 0.00784
Epoch: 30251, Training Loss: 0.00959
Epoch: 30251, Training Loss: 0.00743
Epoch: 30252, Training Loss: 0.00840
Epoch: 30252, Training Loss: 0.00784
Epoch: 30252, Training Loss: 0.00959
Epoch: 30252, Training Loss: 0.00743
Epoch: 30253, Training Loss: 0.00840
Epoch: 30253, Training Loss: 0.00784
Epoch: 30253, Training Loss: 0.00959
Epoch: 30253, Training Loss: 0.00743
Epoch: 30254, Training Loss: 0.00840
Epoch: 30254, Training Loss: 0.00784
Epoch: 30254, Training Loss: 0.00959
Epoch: 30254, Training Loss: 0.00743
Epoch: 30255, Training Loss: 0.00840
Epoch: 30255, Training Loss: 0.00784
Epoch: 30255, Training Loss: 0.00959
Epoch: 30255, Training Loss: 0.00743
Epoch: 30256, Training Loss: 0.00840
Epoch: 30256, Training Loss: 0.00784
Epoch: 30256, Training Loss: 0.00959
Epoch: 30256, Training Loss: 0.00742
Epoch: 30257, Training Loss: 0.00840
Epoch: 30257, Training Loss: 0.00784
Epoch: 30257, Training Loss: 0.00959
Epoch: 30257, Training Loss: 0.00742
Epoch: 30258, Training Loss: 0.00840
Epoch: 30258, Training Loss: 0.00784
Epoch: 30258, Training Loss: 0.00959
Epoch: 30258, Training Loss: 0.00742
Epoch: 30259, Training Loss: 0.00840
Epoch: 30259, Training Loss: 0.00784
Epoch: 30259, Training Loss: 0.00959
Epoch: 30259, Training Loss: 0.00742
Epoch: 30260, Training Loss: 0.00840
Epoch: 30260, Training Loss: 0.00784
Epoch: 30260, Training Loss: 0.00959
Epoch: 30260, Training Loss: 0.00742
Epoch: 30261, Training Loss: 0.00840
Epoch: 30261, Training Loss: 0.00784
Epoch: 30261, Training Loss: 0.00959
Epoch: 30261, Training Loss: 0.00742
Epoch: 30262, Training Loss: 0.00840
Epoch: 30262, Training Loss: 0.00784
Epoch: 30262, Training Loss: 0.00959
Epoch: 30262, Training Loss: 0.00742
Epoch: 30263, Training Loss: 0.00840
Epoch: 30263, Training Loss: 0.00784
Epoch: 30263, Training Loss: 0.00959
Epoch: 30263, Training Loss: 0.00742
Epoch: 30264, Training Loss: 0.00840
Epoch: 30264, Training Loss: 0.00784
Epoch: 30264, Training Loss: 0.00959
Epoch: 30264, Training Loss: 0.00742
Epoch: 30265, Training Loss: 0.00840
Epoch: 30265, Training Loss: 0.00784
Epoch: 30265, Training Loss: 0.00959
Epoch: 30265, Training Loss: 0.00742
Epoch: 30266, Training Loss: 0.00840
Epoch: 30266, Training Loss: 0.00784
Epoch: 30266, Training Loss: 0.00959
Epoch: 30266, Training Loss: 0.00742
Epoch: 30267, Training Loss: 0.00839
Epoch: 30267, Training Loss: 0.00784
Epoch: 30267, Training Loss: 0.00959
Epoch: 30267, Training Loss: 0.00742
Epoch: 30268, Training Loss: 0.00839
Epoch: 30268, Training Loss: 0.00784
Epoch: 30268, Training Loss: 0.00959
Epoch: 30268, Training Loss: 0.00742
Epoch: 30269, Training Loss: 0.00839
Epoch: 30269, Training Loss: 0.00784
Epoch: 30269, Training Loss: 0.00959
Epoch: 30269, Training Loss: 0.00742
Epoch: 30270, Training Loss: 0.00839
Epoch: 30270, Training Loss: 0.00784
Epoch: 30270, Training Loss: 0.00959
Epoch: 30270, Training Loss: 0.00742
Epoch: 30271, Training Loss: 0.00839
Epoch: 30271, Training Loss: 0.00784
Epoch: 30271, Training Loss: 0.00959
Epoch: 30271, Training Loss: 0.00742
Epoch: 30272, Training Loss: 0.00839
Epoch: 30272, Training Loss: 0.00784
Epoch: 30272, Training Loss: 0.00959
Epoch: 30272, Training Loss: 0.00742
Epoch: 30273, Training Loss: 0.00839
Epoch: 30273, Training Loss: 0.00784
Epoch: 30273, Training Loss: 0.00959
Epoch: 30273, Training Loss: 0.00742
Epoch: 30274, Training Loss: 0.00839
Epoch: 30274, Training Loss: 0.00784
Epoch: 30274, Training Loss: 0.00959
Epoch: 30274, Training Loss: 0.00742
Epoch: 30275, Training Loss: 0.00839
Epoch: 30275, Training Loss: 0.00784
Epoch: 30275, Training Loss: 0.00959
Epoch: 30275, Training Loss: 0.00742
Epoch: 30276, Training Loss: 0.00839
Epoch: 30276, Training Loss: 0.00784
Epoch: 30276, Training Loss: 0.00959
Epoch: 30276, Training Loss: 0.00742
Epoch: 30277, Training Loss: 0.00839
Epoch: 30277, Training Loss: 0.00784
Epoch: 30277, Training Loss: 0.00959
Epoch: 30277, Training Loss: 0.00742
Epoch: 30278, Training Loss: 0.00839
Epoch: 30278, Training Loss: 0.00784
Epoch: 30278, Training Loss: 0.00959
Epoch: 30278, Training Loss: 0.00742
Epoch: 30279, Training Loss: 0.00839
Epoch: 30279, Training Loss: 0.00784
Epoch: 30279, Training Loss: 0.00959
Epoch: 30279, Training Loss: 0.00742
Epoch: 30280, Training Loss: 0.00839
Epoch: 30280, Training Loss: 0.00784
Epoch: 30280, Training Loss: 0.00959
Epoch: 30280, Training Loss: 0.00742
Epoch: 30281, Training Loss: 0.00839
Epoch: 30281, Training Loss: 0.00784
Epoch: 30281, Training Loss: 0.00959
Epoch: 30281, Training Loss: 0.00742
Epoch: 30282, Training Loss: 0.00839
Epoch: 30282, Training Loss: 0.00784
Epoch: 30282, Training Loss: 0.00958
Epoch: 30282, Training Loss: 0.00742
Epoch: 30283, Training Loss: 0.00839
Epoch: 30283, Training Loss: 0.00784
Epoch: 30283, Training Loss: 0.00958
Epoch: 30283, Training Loss: 0.00742
Epoch: 30284, Training Loss: 0.00839
Epoch: 30284, Training Loss: 0.00784
Epoch: 30284, Training Loss: 0.00958
Epoch: 30284, Training Loss: 0.00742
Epoch: 30285, Training Loss: 0.00839
Epoch: 30285, Training Loss: 0.00784
Epoch: 30285, Training Loss: 0.00958
Epoch: 30285, Training Loss: 0.00742
Epoch: 30286, Training Loss: 0.00839
Epoch: 30286, Training Loss: 0.00784
Epoch: 30286, Training Loss: 0.00958
Epoch: 30286, Training Loss: 0.00742
Epoch: 30287, Training Loss: 0.00839
Epoch: 30287, Training Loss: 0.00784
Epoch: 30287, Training Loss: 0.00958
Epoch: 30287, Training Loss: 0.00742
Epoch: 30288, Training Loss: 0.00839
Epoch: 30288, Training Loss: 0.00784
Epoch: 30288, Training Loss: 0.00958
Epoch: 30288, Training Loss: 0.00742
Epoch: 30289, Training Loss: 0.00839
Epoch: 30289, Training Loss: 0.00784
Epoch: 30289, Training Loss: 0.00958
Epoch: 30289, Training Loss: 0.00742
Epoch: 30290, Training Loss: 0.00839
Epoch: 30290, Training Loss: 0.00784
Epoch: 30290, Training Loss: 0.00958
Epoch: 30290, Training Loss: 0.00742
Epoch: 30291, Training Loss: 0.00839
Epoch: 30291, Training Loss: 0.00784
Epoch: 30291, Training Loss: 0.00958
Epoch: 30291, Training Loss: 0.00742
Epoch: 30292, Training Loss: 0.00839
Epoch: 30292, Training Loss: 0.00784
Epoch: 30292, Training Loss: 0.00958
Epoch: 30292, Training Loss: 0.00742
Epoch: 30293, Training Loss: 0.00839
Epoch: 30293, Training Loss: 0.00784
Epoch: 30293, Training Loss: 0.00958
Epoch: 30293, Training Loss: 0.00742
Epoch: 30294, Training Loss: 0.00839
Epoch: 30294, Training Loss: 0.00784
Epoch: 30294, Training Loss: 0.00958
Epoch: 30294, Training Loss: 0.00742
Epoch: 30295, Training Loss: 0.00839
Epoch: 30295, Training Loss: 0.00784
Epoch: 30295, Training Loss: 0.00958
Epoch: 30295, Training Loss: 0.00742
Epoch: 30296, Training Loss: 0.00839
Epoch: 30296, Training Loss: 0.00784
Epoch: 30296, Training Loss: 0.00958
Epoch: 30296, Training Loss: 0.00742
Epoch: 30297, Training Loss: 0.00839
Epoch: 30297, Training Loss: 0.00784
Epoch: 30297, Training Loss: 0.00958
Epoch: 30297, Training Loss: 0.00742
Epoch: 30298, Training Loss: 0.00839
Epoch: 30298, Training Loss: 0.00784
Epoch: 30298, Training Loss: 0.00958
Epoch: 30298, Training Loss: 0.00742
Epoch: 30299, Training Loss: 0.00839
Epoch: 30299, Training Loss: 0.00784
Epoch: 30299, Training Loss: 0.00958
Epoch: 30299, Training Loss: 0.00742
Epoch: 30300, Training Loss: 0.00839
Epoch: 30300, Training Loss: 0.00783
Epoch: 30300, Training Loss: 0.00958
Epoch: 30300, Training Loss: 0.00742
Epoch: 30301, Training Loss: 0.00839
Epoch: 30301, Training Loss: 0.00783
Epoch: 30301, Training Loss: 0.00958
Epoch: 30301, Training Loss: 0.00742
Epoch: 30302, Training Loss: 0.00839
Epoch: 30302, Training Loss: 0.00783
Epoch: 30302, Training Loss: 0.00958
Epoch: 30302, Training Loss: 0.00742
Epoch: 30303, Training Loss: 0.00839
Epoch: 30303, Training Loss: 0.00783
Epoch: 30303, Training Loss: 0.00958
Epoch: 30303, Training Loss: 0.00742
Epoch: 30304, Training Loss: 0.00839
Epoch: 30304, Training Loss: 0.00783
Epoch: 30304, Training Loss: 0.00958
Epoch: 30304, Training Loss: 0.00742
Epoch: 30305, Training Loss: 0.00839
Epoch: 30305, Training Loss: 0.00783
Epoch: 30305, Training Loss: 0.00958
Epoch: 30305, Training Loss: 0.00742
Epoch: 30306, Training Loss: 0.00839
Epoch: 30306, Training Loss: 0.00783
Epoch: 30306, Training Loss: 0.00958
Epoch: 30306, Training Loss: 0.00742
Epoch: 30307, Training Loss: 0.00839
Epoch: 30307, Training Loss: 0.00783
Epoch: 30307, Training Loss: 0.00958
Epoch: 30307, Training Loss: 0.00742
Epoch: 30308, Training Loss: 0.00839
Epoch: 30308, Training Loss: 0.00783
Epoch: 30308, Training Loss: 0.00958
Epoch: 30308, Training Loss: 0.00742
Epoch: 30309, Training Loss: 0.00839
Epoch: 30309, Training Loss: 0.00783
Epoch: 30309, Training Loss: 0.00958
Epoch: 30309, Training Loss: 0.00742
Epoch: 30310, Training Loss: 0.00839
Epoch: 30310, Training Loss: 0.00783
Epoch: 30310, Training Loss: 0.00958
Epoch: 30310, Training Loss: 0.00742
Epoch: 30311, Training Loss: 0.00839
Epoch: 30311, Training Loss: 0.00783
Epoch: 30311, Training Loss: 0.00958
Epoch: 30311, Training Loss: 0.00742
Epoch: 30312, Training Loss: 0.00839
Epoch: 30312, Training Loss: 0.00783
Epoch: 30312, Training Loss: 0.00958
Epoch: 30312, Training Loss: 0.00742
Epoch: 30313, Training Loss: 0.00839
Epoch: 30313, Training Loss: 0.00783
Epoch: 30313, Training Loss: 0.00958
Epoch: 30313, Training Loss: 0.00742
Epoch: 30314, Training Loss: 0.00839
Epoch: 30314, Training Loss: 0.00783
Epoch: 30314, Training Loss: 0.00958
Epoch: 30314, Training Loss: 0.00742
Epoch: 30315, Training Loss: 0.00839
Epoch: 30315, Training Loss: 0.00783
Epoch: 30315, Training Loss: 0.00958
Epoch: 30315, Training Loss: 0.00742
Epoch: 30316, Training Loss: 0.00839
Epoch: 30316, Training Loss: 0.00783
Epoch: 30316, Training Loss: 0.00958
Epoch: 30316, Training Loss: 0.00742
... [output truncated: epochs 30317-30558 print the same four per-pattern losses
each epoch, decreasing very slowly from approximately (0.00839, 0.00783, 0.00958,
0.00742) to approximately (0.00835, 0.00780, 0.00954, 0.00739)] ...
Epoch: 30559, Training Loss: 0.00835
Epoch: 30559, Training Loss: 0.00780
Epoch: 30559, Training Loss: 0.00954
Epoch: 30559, Training Loss: 0.00739
Epoch: 30560, Training Loss: 0.00835
Epoch: 30560, Training Loss: 0.00780
Epoch: 30560, Training Loss: 0.00954
Epoch: 30560, Training Loss: 0.00739
Epoch: 30561, Training Loss: 0.00835
Epoch: 30561, Training Loss: 0.00780
Epoch: 30561, Training Loss: 0.00954
Epoch: 30561, Training Loss: 0.00739
Epoch: 30562, Training Loss: 0.00835
Epoch: 30562, Training Loss: 0.00780
Epoch: 30562, Training Loss: 0.00954
Epoch: 30562, Training Loss: 0.00739
Epoch: 30563, Training Loss: 0.00835
Epoch: 30563, Training Loss: 0.00780
Epoch: 30563, Training Loss: 0.00954
Epoch: 30563, Training Loss: 0.00739
Epoch: 30564, Training Loss: 0.00835
Epoch: 30564, Training Loss: 0.00780
Epoch: 30564, Training Loss: 0.00954
Epoch: 30564, Training Loss: 0.00739
Epoch: 30565, Training Loss: 0.00835
Epoch: 30565, Training Loss: 0.00780
Epoch: 30565, Training Loss: 0.00954
Epoch: 30565, Training Loss: 0.00739
Epoch: 30566, Training Loss: 0.00835
Epoch: 30566, Training Loss: 0.00780
Epoch: 30566, Training Loss: 0.00954
Epoch: 30566, Training Loss: 0.00739
Epoch: 30567, Training Loss: 0.00835
Epoch: 30567, Training Loss: 0.00780
Epoch: 30567, Training Loss: 0.00954
Epoch: 30567, Training Loss: 0.00739
Epoch: 30568, Training Loss: 0.00835
Epoch: 30568, Training Loss: 0.00780
Epoch: 30568, Training Loss: 0.00954
Epoch: 30568, Training Loss: 0.00739
Epoch: 30569, Training Loss: 0.00835
Epoch: 30569, Training Loss: 0.00780
Epoch: 30569, Training Loss: 0.00954
Epoch: 30569, Training Loss: 0.00739
Epoch: 30570, Training Loss: 0.00835
Epoch: 30570, Training Loss: 0.00780
Epoch: 30570, Training Loss: 0.00954
Epoch: 30570, Training Loss: 0.00738
Epoch: 30571, Training Loss: 0.00835
Epoch: 30571, Training Loss: 0.00780
Epoch: 30571, Training Loss: 0.00954
Epoch: 30571, Training Loss: 0.00738
Epoch: 30572, Training Loss: 0.00835
Epoch: 30572, Training Loss: 0.00780
Epoch: 30572, Training Loss: 0.00954
Epoch: 30572, Training Loss: 0.00738
Epoch: 30573, Training Loss: 0.00835
Epoch: 30573, Training Loss: 0.00780
Epoch: 30573, Training Loss: 0.00954
Epoch: 30573, Training Loss: 0.00738
Epoch: 30574, Training Loss: 0.00835
Epoch: 30574, Training Loss: 0.00780
Epoch: 30574, Training Loss: 0.00954
Epoch: 30574, Training Loss: 0.00738
Epoch: 30575, Training Loss: 0.00835
Epoch: 30575, Training Loss: 0.00780
Epoch: 30575, Training Loss: 0.00954
Epoch: 30575, Training Loss: 0.00738
Epoch: 30576, Training Loss: 0.00835
Epoch: 30576, Training Loss: 0.00780
Epoch: 30576, Training Loss: 0.00954
Epoch: 30576, Training Loss: 0.00738
Epoch: 30577, Training Loss: 0.00835
Epoch: 30577, Training Loss: 0.00780
Epoch: 30577, Training Loss: 0.00954
Epoch: 30577, Training Loss: 0.00738
Epoch: 30578, Training Loss: 0.00835
Epoch: 30578, Training Loss: 0.00780
Epoch: 30578, Training Loss: 0.00954
Epoch: 30578, Training Loss: 0.00738
Epoch: 30579, Training Loss: 0.00835
Epoch: 30579, Training Loss: 0.00780
Epoch: 30579, Training Loss: 0.00954
Epoch: 30579, Training Loss: 0.00738
Epoch: 30580, Training Loss: 0.00835
Epoch: 30580, Training Loss: 0.00780
Epoch: 30580, Training Loss: 0.00954
Epoch: 30580, Training Loss: 0.00738
Epoch: 30581, Training Loss: 0.00835
Epoch: 30581, Training Loss: 0.00780
Epoch: 30581, Training Loss: 0.00954
Epoch: 30581, Training Loss: 0.00738
Epoch: 30582, Training Loss: 0.00835
Epoch: 30582, Training Loss: 0.00780
Epoch: 30582, Training Loss: 0.00953
Epoch: 30582, Training Loss: 0.00738
Epoch: 30583, Training Loss: 0.00835
Epoch: 30583, Training Loss: 0.00780
Epoch: 30583, Training Loss: 0.00953
Epoch: 30583, Training Loss: 0.00738
Epoch: 30584, Training Loss: 0.00835
Epoch: 30584, Training Loss: 0.00780
Epoch: 30584, Training Loss: 0.00953
Epoch: 30584, Training Loss: 0.00738
Epoch: 30585, Training Loss: 0.00835
Epoch: 30585, Training Loss: 0.00780
Epoch: 30585, Training Loss: 0.00953
Epoch: 30585, Training Loss: 0.00738
Epoch: 30586, Training Loss: 0.00835
Epoch: 30586, Training Loss: 0.00780
Epoch: 30586, Training Loss: 0.00953
Epoch: 30586, Training Loss: 0.00738
Epoch: 30587, Training Loss: 0.00835
Epoch: 30587, Training Loss: 0.00780
Epoch: 30587, Training Loss: 0.00953
Epoch: 30587, Training Loss: 0.00738
Epoch: 30588, Training Loss: 0.00835
Epoch: 30588, Training Loss: 0.00780
Epoch: 30588, Training Loss: 0.00953
Epoch: 30588, Training Loss: 0.00738
Epoch: 30589, Training Loss: 0.00835
Epoch: 30589, Training Loss: 0.00780
Epoch: 30589, Training Loss: 0.00953
Epoch: 30589, Training Loss: 0.00738
Epoch: 30590, Training Loss: 0.00835
Epoch: 30590, Training Loss: 0.00780
Epoch: 30590, Training Loss: 0.00953
Epoch: 30590, Training Loss: 0.00738
Epoch: 30591, Training Loss: 0.00835
Epoch: 30591, Training Loss: 0.00780
Epoch: 30591, Training Loss: 0.00953
Epoch: 30591, Training Loss: 0.00738
Epoch: 30592, Training Loss: 0.00835
Epoch: 30592, Training Loss: 0.00780
Epoch: 30592, Training Loss: 0.00953
Epoch: 30592, Training Loss: 0.00738
Epoch: 30593, Training Loss: 0.00835
Epoch: 30593, Training Loss: 0.00780
Epoch: 30593, Training Loss: 0.00953
Epoch: 30593, Training Loss: 0.00738
Epoch: 30594, Training Loss: 0.00835
Epoch: 30594, Training Loss: 0.00780
Epoch: 30594, Training Loss: 0.00953
Epoch: 30594, Training Loss: 0.00738
Epoch: 30595, Training Loss: 0.00835
Epoch: 30595, Training Loss: 0.00780
Epoch: 30595, Training Loss: 0.00953
Epoch: 30595, Training Loss: 0.00738
Epoch: 30596, Training Loss: 0.00835
Epoch: 30596, Training Loss: 0.00780
Epoch: 30596, Training Loss: 0.00953
Epoch: 30596, Training Loss: 0.00738
Epoch: 30597, Training Loss: 0.00835
Epoch: 30597, Training Loss: 0.00780
Epoch: 30597, Training Loss: 0.00953
Epoch: 30597, Training Loss: 0.00738
Epoch: 30598, Training Loss: 0.00835
Epoch: 30598, Training Loss: 0.00779
Epoch: 30598, Training Loss: 0.00953
Epoch: 30598, Training Loss: 0.00738
Epoch: 30599, Training Loss: 0.00835
Epoch: 30599, Training Loss: 0.00779
Epoch: 30599, Training Loss: 0.00953
Epoch: 30599, Training Loss: 0.00738
Epoch: 30600, Training Loss: 0.00835
Epoch: 30600, Training Loss: 0.00779
Epoch: 30600, Training Loss: 0.00953
Epoch: 30600, Training Loss: 0.00738
Epoch: 30601, Training Loss: 0.00835
Epoch: 30601, Training Loss: 0.00779
Epoch: 30601, Training Loss: 0.00953
Epoch: 30601, Training Loss: 0.00738
Epoch: 30602, Training Loss: 0.00835
Epoch: 30602, Training Loss: 0.00779
Epoch: 30602, Training Loss: 0.00953
Epoch: 30602, Training Loss: 0.00738
Epoch: 30603, Training Loss: 0.00835
Epoch: 30603, Training Loss: 0.00779
Epoch: 30603, Training Loss: 0.00953
Epoch: 30603, Training Loss: 0.00738
Epoch: 30604, Training Loss: 0.00835
Epoch: 30604, Training Loss: 0.00779
Epoch: 30604, Training Loss: 0.00953
Epoch: 30604, Training Loss: 0.00738
Epoch: 30605, Training Loss: 0.00835
Epoch: 30605, Training Loss: 0.00779
Epoch: 30605, Training Loss: 0.00953
Epoch: 30605, Training Loss: 0.00738
Epoch: 30606, Training Loss: 0.00835
Epoch: 30606, Training Loss: 0.00779
Epoch: 30606, Training Loss: 0.00953
Epoch: 30606, Training Loss: 0.00738
Epoch: 30607, Training Loss: 0.00835
Epoch: 30607, Training Loss: 0.00779
Epoch: 30607, Training Loss: 0.00953
Epoch: 30607, Training Loss: 0.00738
Epoch: 30608, Training Loss: 0.00835
Epoch: 30608, Training Loss: 0.00779
Epoch: 30608, Training Loss: 0.00953
Epoch: 30608, Training Loss: 0.00738
Epoch: 30609, Training Loss: 0.00834
Epoch: 30609, Training Loss: 0.00779
Epoch: 30609, Training Loss: 0.00953
Epoch: 30609, Training Loss: 0.00738
Epoch: 30610, Training Loss: 0.00834
Epoch: 30610, Training Loss: 0.00779
Epoch: 30610, Training Loss: 0.00953
Epoch: 30610, Training Loss: 0.00738
Epoch: 30611, Training Loss: 0.00834
Epoch: 30611, Training Loss: 0.00779
Epoch: 30611, Training Loss: 0.00953
Epoch: 30611, Training Loss: 0.00738
Epoch: 30612, Training Loss: 0.00834
Epoch: 30612, Training Loss: 0.00779
Epoch: 30612, Training Loss: 0.00953
Epoch: 30612, Training Loss: 0.00738
Epoch: 30613, Training Loss: 0.00834
Epoch: 30613, Training Loss: 0.00779
Epoch: 30613, Training Loss: 0.00953
Epoch: 30613, Training Loss: 0.00738
Epoch: 30614, Training Loss: 0.00834
Epoch: 30614, Training Loss: 0.00779
Epoch: 30614, Training Loss: 0.00953
Epoch: 30614, Training Loss: 0.00738
Epoch: 30615, Training Loss: 0.00834
Epoch: 30615, Training Loss: 0.00779
Epoch: 30615, Training Loss: 0.00953
Epoch: 30615, Training Loss: 0.00738
Epoch: 30616, Training Loss: 0.00834
Epoch: 30616, Training Loss: 0.00779
Epoch: 30616, Training Loss: 0.00953
Epoch: 30616, Training Loss: 0.00738
Epoch: 30617, Training Loss: 0.00834
Epoch: 30617, Training Loss: 0.00779
Epoch: 30617, Training Loss: 0.00953
Epoch: 30617, Training Loss: 0.00738
Epoch: 30618, Training Loss: 0.00834
Epoch: 30618, Training Loss: 0.00779
Epoch: 30618, Training Loss: 0.00953
Epoch: 30618, Training Loss: 0.00738
Epoch: 30619, Training Loss: 0.00834
Epoch: 30619, Training Loss: 0.00779
Epoch: 30619, Training Loss: 0.00953
Epoch: 30619, Training Loss: 0.00738
Epoch: 30620, Training Loss: 0.00834
Epoch: 30620, Training Loss: 0.00779
Epoch: 30620, Training Loss: 0.00953
Epoch: 30620, Training Loss: 0.00738
Epoch: 30621, Training Loss: 0.00834
Epoch: 30621, Training Loss: 0.00779
Epoch: 30621, Training Loss: 0.00953
Epoch: 30621, Training Loss: 0.00738
Epoch: 30622, Training Loss: 0.00834
Epoch: 30622, Training Loss: 0.00779
Epoch: 30622, Training Loss: 0.00953
Epoch: 30622, Training Loss: 0.00738
Epoch: 30623, Training Loss: 0.00834
Epoch: 30623, Training Loss: 0.00779
Epoch: 30623, Training Loss: 0.00953
Epoch: 30623, Training Loss: 0.00738
Epoch: 30624, Training Loss: 0.00834
Epoch: 30624, Training Loss: 0.00779
Epoch: 30624, Training Loss: 0.00953
Epoch: 30624, Training Loss: 0.00738
Epoch: 30625, Training Loss: 0.00834
Epoch: 30625, Training Loss: 0.00779
Epoch: 30625, Training Loss: 0.00953
Epoch: 30625, Training Loss: 0.00738
Epoch: 30626, Training Loss: 0.00834
Epoch: 30626, Training Loss: 0.00779
Epoch: 30626, Training Loss: 0.00953
Epoch: 30626, Training Loss: 0.00738
Epoch: 30627, Training Loss: 0.00834
Epoch: 30627, Training Loss: 0.00779
Epoch: 30627, Training Loss: 0.00953
Epoch: 30627, Training Loss: 0.00738
Epoch: 30628, Training Loss: 0.00834
Epoch: 30628, Training Loss: 0.00779
Epoch: 30628, Training Loss: 0.00953
Epoch: 30628, Training Loss: 0.00738
Epoch: 30629, Training Loss: 0.00834
Epoch: 30629, Training Loss: 0.00779
Epoch: 30629, Training Loss: 0.00953
Epoch: 30629, Training Loss: 0.00738
Epoch: 30630, Training Loss: 0.00834
Epoch: 30630, Training Loss: 0.00779
Epoch: 30630, Training Loss: 0.00953
Epoch: 30630, Training Loss: 0.00738
Epoch: 30631, Training Loss: 0.00834
Epoch: 30631, Training Loss: 0.00779
Epoch: 30631, Training Loss: 0.00953
Epoch: 30631, Training Loss: 0.00738
Epoch: 30632, Training Loss: 0.00834
Epoch: 30632, Training Loss: 0.00779
Epoch: 30632, Training Loss: 0.00953
Epoch: 30632, Training Loss: 0.00738
Epoch: 30633, Training Loss: 0.00834
Epoch: 30633, Training Loss: 0.00779
Epoch: 30633, Training Loss: 0.00953
Epoch: 30633, Training Loss: 0.00738
Epoch: 30634, Training Loss: 0.00834
Epoch: 30634, Training Loss: 0.00779
Epoch: 30634, Training Loss: 0.00953
Epoch: 30634, Training Loss: 0.00738
Epoch: 30635, Training Loss: 0.00834
Epoch: 30635, Training Loss: 0.00779
Epoch: 30635, Training Loss: 0.00953
Epoch: 30635, Training Loss: 0.00738
Epoch: 30636, Training Loss: 0.00834
Epoch: 30636, Training Loss: 0.00779
Epoch: 30636, Training Loss: 0.00953
Epoch: 30636, Training Loss: 0.00738
Epoch: 30637, Training Loss: 0.00834
Epoch: 30637, Training Loss: 0.00779
Epoch: 30637, Training Loss: 0.00953
Epoch: 30637, Training Loss: 0.00738
Epoch: 30638, Training Loss: 0.00834
Epoch: 30638, Training Loss: 0.00779
Epoch: 30638, Training Loss: 0.00953
Epoch: 30638, Training Loss: 0.00738
Epoch: 30639, Training Loss: 0.00834
Epoch: 30639, Training Loss: 0.00779
Epoch: 30639, Training Loss: 0.00953
Epoch: 30639, Training Loss: 0.00738
Epoch: 30640, Training Loss: 0.00834
Epoch: 30640, Training Loss: 0.00779
Epoch: 30640, Training Loss: 0.00953
Epoch: 30640, Training Loss: 0.00738
Epoch: 30641, Training Loss: 0.00834
Epoch: 30641, Training Loss: 0.00779
Epoch: 30641, Training Loss: 0.00953
Epoch: 30641, Training Loss: 0.00738
Epoch: 30642, Training Loss: 0.00834
Epoch: 30642, Training Loss: 0.00779
Epoch: 30642, Training Loss: 0.00953
Epoch: 30642, Training Loss: 0.00738
Epoch: 30643, Training Loss: 0.00834
Epoch: 30643, Training Loss: 0.00779
Epoch: 30643, Training Loss: 0.00952
Epoch: 30643, Training Loss: 0.00738
Epoch: 30644, Training Loss: 0.00834
Epoch: 30644, Training Loss: 0.00779
Epoch: 30644, Training Loss: 0.00952
Epoch: 30644, Training Loss: 0.00738
Epoch: 30645, Training Loss: 0.00834
Epoch: 30645, Training Loss: 0.00779
Epoch: 30645, Training Loss: 0.00952
Epoch: 30645, Training Loss: 0.00738
Epoch: 30646, Training Loss: 0.00834
Epoch: 30646, Training Loss: 0.00779
Epoch: 30646, Training Loss: 0.00952
Epoch: 30646, Training Loss: 0.00738
Epoch: 30647, Training Loss: 0.00834
Epoch: 30647, Training Loss: 0.00779
Epoch: 30647, Training Loss: 0.00952
Epoch: 30647, Training Loss: 0.00738
Epoch: 30648, Training Loss: 0.00834
Epoch: 30648, Training Loss: 0.00779
Epoch: 30648, Training Loss: 0.00952
Epoch: 30648, Training Loss: 0.00738
Epoch: 30649, Training Loss: 0.00834
Epoch: 30649, Training Loss: 0.00779
Epoch: 30649, Training Loss: 0.00952
Epoch: 30649, Training Loss: 0.00737
Epoch: 30650, Training Loss: 0.00834
Epoch: 30650, Training Loss: 0.00779
Epoch: 30650, Training Loss: 0.00952
Epoch: 30650, Training Loss: 0.00737
Epoch: 30651, Training Loss: 0.00834
Epoch: 30651, Training Loss: 0.00779
Epoch: 30651, Training Loss: 0.00952
Epoch: 30651, Training Loss: 0.00737
Epoch: 30652, Training Loss: 0.00834
Epoch: 30652, Training Loss: 0.00779
Epoch: 30652, Training Loss: 0.00952
Epoch: 30652, Training Loss: 0.00737
Epoch: 30653, Training Loss: 0.00834
Epoch: 30653, Training Loss: 0.00779
Epoch: 30653, Training Loss: 0.00952
Epoch: 30653, Training Loss: 0.00737
Epoch: 30654, Training Loss: 0.00834
Epoch: 30654, Training Loss: 0.00779
Epoch: 30654, Training Loss: 0.00952
Epoch: 30654, Training Loss: 0.00737
Epoch: 30655, Training Loss: 0.00834
Epoch: 30655, Training Loss: 0.00779
Epoch: 30655, Training Loss: 0.00952
Epoch: 30655, Training Loss: 0.00737
Epoch: 30656, Training Loss: 0.00834
Epoch: 30656, Training Loss: 0.00779
Epoch: 30656, Training Loss: 0.00952
Epoch: 30656, Training Loss: 0.00737
Epoch: 30657, Training Loss: 0.00834
Epoch: 30657, Training Loss: 0.00779
Epoch: 30657, Training Loss: 0.00952
Epoch: 30657, Training Loss: 0.00737
Epoch: 30658, Training Loss: 0.00834
Epoch: 30658, Training Loss: 0.00779
Epoch: 30658, Training Loss: 0.00952
Epoch: 30658, Training Loss: 0.00737
Epoch: 30659, Training Loss: 0.00834
Epoch: 30659, Training Loss: 0.00779
Epoch: 30659, Training Loss: 0.00952
Epoch: 30659, Training Loss: 0.00737
Epoch: 30660, Training Loss: 0.00834
Epoch: 30660, Training Loss: 0.00779
Epoch: 30660, Training Loss: 0.00952
Epoch: 30660, Training Loss: 0.00737
Epoch: 30661, Training Loss: 0.00834
Epoch: 30661, Training Loss: 0.00779
Epoch: 30661, Training Loss: 0.00952
Epoch: 30661, Training Loss: 0.00737
Epoch: 30662, Training Loss: 0.00834
Epoch: 30662, Training Loss: 0.00779
Epoch: 30662, Training Loss: 0.00952
Epoch: 30662, Training Loss: 0.00737
Epoch: 30663, Training Loss: 0.00834
Epoch: 30663, Training Loss: 0.00779
Epoch: 30663, Training Loss: 0.00952
Epoch: 30663, Training Loss: 0.00737
Epoch: 30664, Training Loss: 0.00834
Epoch: 30664, Training Loss: 0.00779
Epoch: 30664, Training Loss: 0.00952
Epoch: 30664, Training Loss: 0.00737
Epoch: 30665, Training Loss: 0.00834
Epoch: 30665, Training Loss: 0.00779
Epoch: 30665, Training Loss: 0.00952
Epoch: 30665, Training Loss: 0.00737
Epoch: 30666, Training Loss: 0.00834
Epoch: 30666, Training Loss: 0.00779
Epoch: 30666, Training Loss: 0.00952
Epoch: 30666, Training Loss: 0.00737
Epoch: 30667, Training Loss: 0.00834
Epoch: 30667, Training Loss: 0.00779
Epoch: 30667, Training Loss: 0.00952
Epoch: 30667, Training Loss: 0.00737
Epoch: 30668, Training Loss: 0.00834
Epoch: 30668, Training Loss: 0.00779
Epoch: 30668, Training Loss: 0.00952
Epoch: 30668, Training Loss: 0.00737
Epoch: 30669, Training Loss: 0.00834
Epoch: 30669, Training Loss: 0.00779
Epoch: 30669, Training Loss: 0.00952
Epoch: 30669, Training Loss: 0.00737
Epoch: 30670, Training Loss: 0.00834
Epoch: 30670, Training Loss: 0.00779
Epoch: 30670, Training Loss: 0.00952
Epoch: 30670, Training Loss: 0.00737
Epoch: 30671, Training Loss: 0.00834
Epoch: 30671, Training Loss: 0.00779
Epoch: 30671, Training Loss: 0.00952
Epoch: 30671, Training Loss: 0.00737
Epoch: 30672, Training Loss: 0.00834
Epoch: 30672, Training Loss: 0.00779
Epoch: 30672, Training Loss: 0.00952
Epoch: 30672, Training Loss: 0.00737
Epoch: 30673, Training Loss: 0.00834
Epoch: 30673, Training Loss: 0.00778
Epoch: 30673, Training Loss: 0.00952
Epoch: 30673, Training Loss: 0.00737
Epoch: 30674, Training Loss: 0.00834
Epoch: 30674, Training Loss: 0.00778
Epoch: 30674, Training Loss: 0.00952
Epoch: 30674, Training Loss: 0.00737
Epoch: 30675, Training Loss: 0.00834
Epoch: 30675, Training Loss: 0.00778
Epoch: 30675, Training Loss: 0.00952
Epoch: 30675, Training Loss: 0.00737
Epoch: 30676, Training Loss: 0.00834
Epoch: 30676, Training Loss: 0.00778
Epoch: 30676, Training Loss: 0.00952
Epoch: 30676, Training Loss: 0.00737
Epoch: 30677, Training Loss: 0.00834
Epoch: 30677, Training Loss: 0.00778
Epoch: 30677, Training Loss: 0.00952
Epoch: 30677, Training Loss: 0.00737
Epoch: 30678, Training Loss: 0.00833
Epoch: 30678, Training Loss: 0.00778
Epoch: 30678, Training Loss: 0.00952
Epoch: 30678, Training Loss: 0.00737
Epoch: 30679, Training Loss: 0.00833
Epoch: 30679, Training Loss: 0.00778
Epoch: 30679, Training Loss: 0.00952
Epoch: 30679, Training Loss: 0.00737
Epoch: 30680, Training Loss: 0.00833
Epoch: 30680, Training Loss: 0.00778
Epoch: 30680, Training Loss: 0.00952
Epoch: 30680, Training Loss: 0.00737
Epoch: 30681, Training Loss: 0.00833
Epoch: 30681, Training Loss: 0.00778
Epoch: 30681, Training Loss: 0.00952
Epoch: 30681, Training Loss: 0.00737
Epoch: 30682, Training Loss: 0.00833
Epoch: 30682, Training Loss: 0.00778
Epoch: 30682, Training Loss: 0.00952
Epoch: 30682, Training Loss: 0.00737
Epoch: 30683, Training Loss: 0.00833
Epoch: 30683, Training Loss: 0.00778
Epoch: 30683, Training Loss: 0.00952
Epoch: 30683, Training Loss: 0.00737
Epoch: 30684, Training Loss: 0.00833
Epoch: 30684, Training Loss: 0.00778
Epoch: 30684, Training Loss: 0.00952
Epoch: 30684, Training Loss: 0.00737
Epoch: 30685, Training Loss: 0.00833
Epoch: 30685, Training Loss: 0.00778
Epoch: 30685, Training Loss: 0.00952
Epoch: 30685, Training Loss: 0.00737
Epoch: 30686, Training Loss: 0.00833
Epoch: 30686, Training Loss: 0.00778
Epoch: 30686, Training Loss: 0.00952
Epoch: 30686, Training Loss: 0.00737
Epoch: 30687, Training Loss: 0.00833
Epoch: 30687, Training Loss: 0.00778
Epoch: 30687, Training Loss: 0.00952
Epoch: 30687, Training Loss: 0.00737
Epoch: 30688, Training Loss: 0.00833
Epoch: 30688, Training Loss: 0.00778
Epoch: 30688, Training Loss: 0.00952
Epoch: 30688, Training Loss: 0.00737
Epoch: 30689, Training Loss: 0.00833
Epoch: 30689, Training Loss: 0.00778
Epoch: 30689, Training Loss: 0.00952
Epoch: 30689, Training Loss: 0.00737
Epoch: 30690, Training Loss: 0.00833
Epoch: 30690, Training Loss: 0.00778
Epoch: 30690, Training Loss: 0.00952
Epoch: 30690, Training Loss: 0.00737
Epoch: 30691, Training Loss: 0.00833
Epoch: 30691, Training Loss: 0.00778
Epoch: 30691, Training Loss: 0.00952
Epoch: 30691, Training Loss: 0.00737
Epoch: 30692, Training Loss: 0.00833
Epoch: 30692, Training Loss: 0.00778
Epoch: 30692, Training Loss: 0.00952
Epoch: 30692, Training Loss: 0.00737
Epoch: 30693, Training Loss: 0.00833
Epoch: 30693, Training Loss: 0.00778
Epoch: 30693, Training Loss: 0.00952
Epoch: 30693, Training Loss: 0.00737
Epoch: 30694, Training Loss: 0.00833
Epoch: 30694, Training Loss: 0.00778
Epoch: 30694, Training Loss: 0.00952
Epoch: 30694, Training Loss: 0.00737
Epoch: 30695, Training Loss: 0.00833
Epoch: 30695, Training Loss: 0.00778
Epoch: 30695, Training Loss: 0.00952
Epoch: 30695, Training Loss: 0.00737
Epoch: 30696, Training Loss: 0.00833
Epoch: 30696, Training Loss: 0.00778
Epoch: 30696, Training Loss: 0.00952
Epoch: 30696, Training Loss: 0.00737
Epoch: 30697, Training Loss: 0.00833
Epoch: 30697, Training Loss: 0.00778
Epoch: 30697, Training Loss: 0.00952
Epoch: 30697, Training Loss: 0.00737
Epoch: 30698, Training Loss: 0.00833
Epoch: 30698, Training Loss: 0.00778
Epoch: 30698, Training Loss: 0.00952
Epoch: 30698, Training Loss: 0.00737
Epoch: 30699, Training Loss: 0.00833
Epoch: 30699, Training Loss: 0.00778
Epoch: 30699, Training Loss: 0.00952
Epoch: 30699, Training Loss: 0.00737
Epoch: 30700, Training Loss: 0.00833
Epoch: 30700, Training Loss: 0.00778
Epoch: 30700, Training Loss: 0.00952
Epoch: 30700, Training Loss: 0.00737
Epoch: 30701, Training Loss: 0.00833
Epoch: 30701, Training Loss: 0.00778
Epoch: 30701, Training Loss: 0.00952
Epoch: 30701, Training Loss: 0.00737
Epoch: 30702, Training Loss: 0.00833
Epoch: 30702, Training Loss: 0.00778
Epoch: 30702, Training Loss: 0.00952
Epoch: 30702, Training Loss: 0.00737
Epoch: 30703, Training Loss: 0.00833
Epoch: 30703, Training Loss: 0.00778
Epoch: 30703, Training Loss: 0.00952
Epoch: 30703, Training Loss: 0.00737
Epoch: 30704, Training Loss: 0.00833
Epoch: 30704, Training Loss: 0.00778
Epoch: 30704, Training Loss: 0.00951
Epoch: 30704, Training Loss: 0.00737
Epoch: 30705, Training Loss: 0.00833
Epoch: 30705, Training Loss: 0.00778
Epoch: 30705, Training Loss: 0.00951
Epoch: 30705, Training Loss: 0.00737
Epoch: 30706, Training Loss: 0.00833
Epoch: 30706, Training Loss: 0.00778
Epoch: 30706, Training Loss: 0.00951
Epoch: 30706, Training Loss: 0.00737
Epoch: 30707, Training Loss: 0.00833
Epoch: 30707, Training Loss: 0.00778
Epoch: 30707, Training Loss: 0.00951
Epoch: 30707, Training Loss: 0.00737
Epoch: 30708, Training Loss: 0.00833
Epoch: 30708, Training Loss: 0.00778
Epoch: 30708, Training Loss: 0.00951
Epoch: 30708, Training Loss: 0.00737
Epoch: 30709, Training Loss: 0.00833
Epoch: 30709, Training Loss: 0.00778
Epoch: 30709, Training Loss: 0.00951
Epoch: 30709, Training Loss: 0.00737
Epoch: 30710, Training Loss: 0.00833
Epoch: 30710, Training Loss: 0.00778
Epoch: 30710, Training Loss: 0.00951
Epoch: 30710, Training Loss: 0.00737
Epoch: 30711, Training Loss: 0.00833
Epoch: 30711, Training Loss: 0.00778
Epoch: 30711, Training Loss: 0.00951
Epoch: 30711, Training Loss: 0.00737
Epoch: 30712, Training Loss: 0.00833
Epoch: 30712, Training Loss: 0.00778
Epoch: 30712, Training Loss: 0.00951
Epoch: 30712, Training Loss: 0.00737
Epoch: 30713, Training Loss: 0.00833
Epoch: 30713, Training Loss: 0.00778
Epoch: 30713, Training Loss: 0.00951
Epoch: 30713, Training Loss: 0.00737
Epoch: 30714, Training Loss: 0.00833
Epoch: 30714, Training Loss: 0.00778
Epoch: 30714, Training Loss: 0.00951
Epoch: 30714, Training Loss: 0.00737
Epoch: 30715, Training Loss: 0.00833
Epoch: 30715, Training Loss: 0.00778
Epoch: 30715, Training Loss: 0.00951
Epoch: 30715, Training Loss: 0.00737
Epoch: 30716, Training Loss: 0.00833
Epoch: 30716, Training Loss: 0.00778
Epoch: 30716, Training Loss: 0.00951
Epoch: 30716, Training Loss: 0.00737
Epoch: 30717, Training Loss: 0.00833
Epoch: 30717, Training Loss: 0.00778
Epoch: 30717, Training Loss: 0.00951
Epoch: 30717, Training Loss: 0.00737
Epoch: 30718, Training Loss: 0.00833
Epoch: 30718, Training Loss: 0.00778
Epoch: 30718, Training Loss: 0.00951
Epoch: 30718, Training Loss: 0.00737
Epoch: 30719, Training Loss: 0.00833
Epoch: 30719, Training Loss: 0.00778
Epoch: 30719, Training Loss: 0.00951
Epoch: 30719, Training Loss: 0.00737
Epoch: 30720, Training Loss: 0.00833
Epoch: 30720, Training Loss: 0.00778
Epoch: 30720, Training Loss: 0.00951
Epoch: 30720, Training Loss: 0.00737
Epoch: 30721, Training Loss: 0.00833
Epoch: 30721, Training Loss: 0.00778
Epoch: 30721, Training Loss: 0.00951
Epoch: 30721, Training Loss: 0.00737
Epoch: 30722, Training Loss: 0.00833
Epoch: 30722, Training Loss: 0.00778
Epoch: 30722, Training Loss: 0.00951
Epoch: 30722, Training Loss: 0.00737
Epoch: 30723, Training Loss: 0.00833
Epoch: 30723, Training Loss: 0.00778
Epoch: 30723, Training Loss: 0.00951
Epoch: 30723, Training Loss: 0.00737
Epoch: 30724, Training Loss: 0.00833
Epoch: 30724, Training Loss: 0.00778
Epoch: 30724, Training Loss: 0.00951
Epoch: 30724, Training Loss: 0.00737
Epoch: 30725, Training Loss: 0.00833
Epoch: 30725, Training Loss: 0.00778
Epoch: 30725, Training Loss: 0.00951
Epoch: 30725, Training Loss: 0.00737
Epoch: 30726, Training Loss: 0.00833
Epoch: 30726, Training Loss: 0.00778
Epoch: 30726, Training Loss: 0.00951
Epoch: 30726, Training Loss: 0.00737
Epoch: 30727, Training Loss: 0.00833
Epoch: 30727, Training Loss: 0.00778
Epoch: 30727, Training Loss: 0.00951
Epoch: 30727, Training Loss: 0.00737
Epoch: 30728, Training Loss: 0.00833
Epoch: 30728, Training Loss: 0.00778
Epoch: 30728, Training Loss: 0.00951
Epoch: 30728, Training Loss: 0.00737
Epoch: 30729, Training Loss: 0.00833
Epoch: 30729, Training Loss: 0.00778
Epoch: 30729, Training Loss: 0.00951
Epoch: 30729, Training Loss: 0.00736
Epoch: 30730, Training Loss: 0.00833
Epoch: 30730, Training Loss: 0.00778
Epoch: 30730, Training Loss: 0.00951
Epoch: 30730, Training Loss: 0.00736
Epoch: 30731, Training Loss: 0.00833
Epoch: 30731, Training Loss: 0.00778
Epoch: 30731, Training Loss: 0.00951
Epoch: 30731, Training Loss: 0.00736
Epoch: 30732, Training Loss: 0.00833
Epoch: 30732, Training Loss: 0.00778
Epoch: 30732, Training Loss: 0.00951
Epoch: 30732, Training Loss: 0.00736
Epoch: 30733, Training Loss: 0.00833
Epoch: 30733, Training Loss: 0.00778
Epoch: 30733, Training Loss: 0.00951
Epoch: 30733, Training Loss: 0.00736
Epoch: 30734, Training Loss: 0.00833
Epoch: 30734, Training Loss: 0.00778
Epoch: 30734, Training Loss: 0.00951
Epoch: 30734, Training Loss: 0.00736
Epoch: 30735, Training Loss: 0.00833
Epoch: 30735, Training Loss: 0.00778
Epoch: 30735, Training Loss: 0.00951
Epoch: 30735, Training Loss: 0.00736
Epoch: 30736, Training Loss: 0.00833
Epoch: 30736, Training Loss: 0.00778
Epoch: 30736, Training Loss: 0.00951
Epoch: 30736, Training Loss: 0.00736
Epoch: 30737, Training Loss: 0.00833
Epoch: 30737, Training Loss: 0.00778
Epoch: 30737, Training Loss: 0.00951
Epoch: 30737, Training Loss: 0.00736
Epoch: 30738, Training Loss: 0.00833
Epoch: 30738, Training Loss: 0.00778
Epoch: 30738, Training Loss: 0.00951
Epoch: 30738, Training Loss: 0.00736
Epoch: 30739, Training Loss: 0.00833
Epoch: 30739, Training Loss: 0.00778
Epoch: 30739, Training Loss: 0.00951
Epoch: 30739, Training Loss: 0.00736
Epoch: 30740, Training Loss: 0.00833
Epoch: 30740, Training Loss: 0.00778
Epoch: 30740, Training Loss: 0.00951
Epoch: 30740, Training Loss: 0.00736
Epoch: 30741, Training Loss: 0.00833
Epoch: 30741, Training Loss: 0.00778
Epoch: 30741, Training Loss: 0.00951
Epoch: 30741, Training Loss: 0.00736
Epoch: 30742, Training Loss: 0.00833
Epoch: 30742, Training Loss: 0.00778
Epoch: 30742, Training Loss: 0.00951
Epoch: 30742, Training Loss: 0.00736
Epoch: 30743, Training Loss: 0.00833
Epoch: 30743, Training Loss: 0.00778
Epoch: 30743, Training Loss: 0.00951
Epoch: 30743, Training Loss: 0.00736
Epoch: 30744, Training Loss: 0.00833
Epoch: 30744, Training Loss: 0.00778
Epoch: 30744, Training Loss: 0.00951
Epoch: 30744, Training Loss: 0.00736
Epoch: 30745, Training Loss: 0.00833
Epoch: 30745, Training Loss: 0.00778
Epoch: 30745, Training Loss: 0.00951
Epoch: 30745, Training Loss: 0.00736
Epoch: 30746, Training Loss: 0.00833
Epoch: 30746, Training Loss: 0.00778
Epoch: 30746, Training Loss: 0.00951
Epoch: 30746, Training Loss: 0.00736
Epoch: 30747, Training Loss: 0.00832
Epoch: 30747, Training Loss: 0.00778
Epoch: 30747, Training Loss: 0.00951
Epoch: 30747, Training Loss: 0.00736
Epoch: 30748, Training Loss: 0.00832
Epoch: 30748, Training Loss: 0.00778
Epoch: 30748, Training Loss: 0.00951
Epoch: 30748, Training Loss: 0.00736
...
[Output truncated: the per-sample loss for each of the four XOR patterns is printed every epoch. Over epochs 30748-30991 the four losses decrease very slowly, from (0.00832, 0.00778, 0.00951, 0.00736) to (0.00829, 0.00774, 0.00947, 0.00733).]
...
Epoch: 30991, Training Loss: 0.00829
Epoch: 30991, Training Loss: 0.00774
Epoch: 30991, Training Loss: 0.00947
Epoch: 30991, Training Loss: 0.00733
Epoch: 30992, Training Loss: 0.00829
Epoch: 30992, Training Loss: 0.00774
Epoch: 30992, Training Loss: 0.00947
Epoch: 30992, Training Loss: 0.00733
Epoch: 30993, Training Loss: 0.00829
Epoch: 30993, Training Loss: 0.00774
Epoch: 30993, Training Loss: 0.00947
Epoch: 30993, Training Loss: 0.00733
Epoch: 30994, Training Loss: 0.00829
Epoch: 30994, Training Loss: 0.00774
Epoch: 30994, Training Loss: 0.00947
Epoch: 30994, Training Loss: 0.00733
Epoch: 30995, Training Loss: 0.00829
Epoch: 30995, Training Loss: 0.00774
Epoch: 30995, Training Loss: 0.00947
Epoch: 30995, Training Loss: 0.00733
Epoch: 30996, Training Loss: 0.00829
Epoch: 30996, Training Loss: 0.00774
Epoch: 30996, Training Loss: 0.00947
Epoch: 30996, Training Loss: 0.00733
Epoch: 30997, Training Loss: 0.00829
Epoch: 30997, Training Loss: 0.00774
Epoch: 30997, Training Loss: 0.00947
Epoch: 30997, Training Loss: 0.00733
Epoch: 30998, Training Loss: 0.00829
Epoch: 30998, Training Loss: 0.00774
Epoch: 30998, Training Loss: 0.00947
Epoch: 30998, Training Loss: 0.00733
Epoch: 30999, Training Loss: 0.00829
Epoch: 30999, Training Loss: 0.00774
Epoch: 30999, Training Loss: 0.00947
Epoch: 30999, Training Loss: 0.00733
Epoch: 31000, Training Loss: 0.00829
Epoch: 31000, Training Loss: 0.00774
Epoch: 31000, Training Loss: 0.00947
Epoch: 31000, Training Loss: 0.00733
Epoch: 31001, Training Loss: 0.00829
Epoch: 31001, Training Loss: 0.00774
Epoch: 31001, Training Loss: 0.00947
Epoch: 31001, Training Loss: 0.00733
Epoch: 31002, Training Loss: 0.00829
Epoch: 31002, Training Loss: 0.00774
Epoch: 31002, Training Loss: 0.00947
Epoch: 31002, Training Loss: 0.00733
Epoch: 31003, Training Loss: 0.00829
Epoch: 31003, Training Loss: 0.00774
Epoch: 31003, Training Loss: 0.00947
Epoch: 31003, Training Loss: 0.00733
Epoch: 31004, Training Loss: 0.00829
Epoch: 31004, Training Loss: 0.00774
Epoch: 31004, Training Loss: 0.00947
Epoch: 31004, Training Loss: 0.00733
Epoch: 31005, Training Loss: 0.00829
Epoch: 31005, Training Loss: 0.00774
Epoch: 31005, Training Loss: 0.00947
Epoch: 31005, Training Loss: 0.00733
Epoch: 31006, Training Loss: 0.00829
Epoch: 31006, Training Loss: 0.00774
Epoch: 31006, Training Loss: 0.00947
Epoch: 31006, Training Loss: 0.00733
Epoch: 31007, Training Loss: 0.00829
Epoch: 31007, Training Loss: 0.00774
Epoch: 31007, Training Loss: 0.00947
Epoch: 31007, Training Loss: 0.00733
Epoch: 31008, Training Loss: 0.00829
Epoch: 31008, Training Loss: 0.00774
Epoch: 31008, Training Loss: 0.00947
Epoch: 31008, Training Loss: 0.00733
Epoch: 31009, Training Loss: 0.00829
Epoch: 31009, Training Loss: 0.00774
Epoch: 31009, Training Loss: 0.00947
Epoch: 31009, Training Loss: 0.00733
Epoch: 31010, Training Loss: 0.00829
Epoch: 31010, Training Loss: 0.00774
Epoch: 31010, Training Loss: 0.00947
Epoch: 31010, Training Loss: 0.00733
Epoch: 31011, Training Loss: 0.00829
Epoch: 31011, Training Loss: 0.00774
Epoch: 31011, Training Loss: 0.00946
Epoch: 31011, Training Loss: 0.00733
Epoch: 31012, Training Loss: 0.00829
Epoch: 31012, Training Loss: 0.00774
Epoch: 31012, Training Loss: 0.00946
Epoch: 31012, Training Loss: 0.00733
Epoch: 31013, Training Loss: 0.00829
Epoch: 31013, Training Loss: 0.00774
Epoch: 31013, Training Loss: 0.00946
Epoch: 31013, Training Loss: 0.00733
Epoch: 31014, Training Loss: 0.00829
Epoch: 31014, Training Loss: 0.00774
Epoch: 31014, Training Loss: 0.00946
Epoch: 31014, Training Loss: 0.00733
Epoch: 31015, Training Loss: 0.00829
Epoch: 31015, Training Loss: 0.00774
Epoch: 31015, Training Loss: 0.00946
Epoch: 31015, Training Loss: 0.00733
Epoch: 31016, Training Loss: 0.00829
Epoch: 31016, Training Loss: 0.00774
Epoch: 31016, Training Loss: 0.00946
Epoch: 31016, Training Loss: 0.00733
Epoch: 31017, Training Loss: 0.00829
Epoch: 31017, Training Loss: 0.00774
Epoch: 31017, Training Loss: 0.00946
Epoch: 31017, Training Loss: 0.00733
Epoch: 31018, Training Loss: 0.00829
Epoch: 31018, Training Loss: 0.00774
Epoch: 31018, Training Loss: 0.00946
Epoch: 31018, Training Loss: 0.00733
Epoch: 31019, Training Loss: 0.00829
Epoch: 31019, Training Loss: 0.00774
Epoch: 31019, Training Loss: 0.00946
Epoch: 31019, Training Loss: 0.00733
Epoch: 31020, Training Loss: 0.00829
Epoch: 31020, Training Loss: 0.00774
Epoch: 31020, Training Loss: 0.00946
Epoch: 31020, Training Loss: 0.00733
Epoch: 31021, Training Loss: 0.00829
Epoch: 31021, Training Loss: 0.00774
Epoch: 31021, Training Loss: 0.00946
Epoch: 31021, Training Loss: 0.00733
Epoch: 31022, Training Loss: 0.00829
Epoch: 31022, Training Loss: 0.00774
Epoch: 31022, Training Loss: 0.00946
Epoch: 31022, Training Loss: 0.00733
Epoch: 31023, Training Loss: 0.00829
Epoch: 31023, Training Loss: 0.00774
Epoch: 31023, Training Loss: 0.00946
Epoch: 31023, Training Loss: 0.00733
Epoch: 31024, Training Loss: 0.00829
Epoch: 31024, Training Loss: 0.00774
Epoch: 31024, Training Loss: 0.00946
Epoch: 31024, Training Loss: 0.00733
Epoch: 31025, Training Loss: 0.00829
Epoch: 31025, Training Loss: 0.00774
Epoch: 31025, Training Loss: 0.00946
Epoch: 31025, Training Loss: 0.00733
Epoch: 31026, Training Loss: 0.00829
Epoch: 31026, Training Loss: 0.00774
Epoch: 31026, Training Loss: 0.00946
Epoch: 31026, Training Loss: 0.00733
Epoch: 31027, Training Loss: 0.00829
Epoch: 31027, Training Loss: 0.00774
Epoch: 31027, Training Loss: 0.00946
Epoch: 31027, Training Loss: 0.00733
Epoch: 31028, Training Loss: 0.00828
Epoch: 31028, Training Loss: 0.00774
Epoch: 31028, Training Loss: 0.00946
Epoch: 31028, Training Loss: 0.00733
Epoch: 31029, Training Loss: 0.00828
Epoch: 31029, Training Loss: 0.00774
Epoch: 31029, Training Loss: 0.00946
Epoch: 31029, Training Loss: 0.00733
Epoch: 31030, Training Loss: 0.00828
Epoch: 31030, Training Loss: 0.00774
Epoch: 31030, Training Loss: 0.00946
Epoch: 31030, Training Loss: 0.00733
Epoch: 31031, Training Loss: 0.00828
Epoch: 31031, Training Loss: 0.00774
Epoch: 31031, Training Loss: 0.00946
Epoch: 31031, Training Loss: 0.00733
Epoch: 31032, Training Loss: 0.00828
Epoch: 31032, Training Loss: 0.00774
Epoch: 31032, Training Loss: 0.00946
Epoch: 31032, Training Loss: 0.00733
Epoch: 31033, Training Loss: 0.00828
Epoch: 31033, Training Loss: 0.00774
Epoch: 31033, Training Loss: 0.00946
Epoch: 31033, Training Loss: 0.00733
Epoch: 31034, Training Loss: 0.00828
Epoch: 31034, Training Loss: 0.00774
Epoch: 31034, Training Loss: 0.00946
Epoch: 31034, Training Loss: 0.00733
Epoch: 31035, Training Loss: 0.00828
Epoch: 31035, Training Loss: 0.00774
Epoch: 31035, Training Loss: 0.00946
Epoch: 31035, Training Loss: 0.00733
Epoch: 31036, Training Loss: 0.00828
Epoch: 31036, Training Loss: 0.00774
Epoch: 31036, Training Loss: 0.00946
Epoch: 31036, Training Loss: 0.00733
Epoch: 31037, Training Loss: 0.00828
Epoch: 31037, Training Loss: 0.00774
Epoch: 31037, Training Loss: 0.00946
Epoch: 31037, Training Loss: 0.00733
Epoch: 31038, Training Loss: 0.00828
Epoch: 31038, Training Loss: 0.00774
Epoch: 31038, Training Loss: 0.00946
Epoch: 31038, Training Loss: 0.00733
Epoch: 31039, Training Loss: 0.00828
Epoch: 31039, Training Loss: 0.00774
Epoch: 31039, Training Loss: 0.00946
Epoch: 31039, Training Loss: 0.00733
Epoch: 31040, Training Loss: 0.00828
Epoch: 31040, Training Loss: 0.00774
Epoch: 31040, Training Loss: 0.00946
Epoch: 31040, Training Loss: 0.00733
Epoch: 31041, Training Loss: 0.00828
Epoch: 31041, Training Loss: 0.00774
Epoch: 31041, Training Loss: 0.00946
Epoch: 31041, Training Loss: 0.00733
Epoch: 31042, Training Loss: 0.00828
Epoch: 31042, Training Loss: 0.00774
Epoch: 31042, Training Loss: 0.00946
Epoch: 31042, Training Loss: 0.00733
Epoch: 31043, Training Loss: 0.00828
Epoch: 31043, Training Loss: 0.00774
Epoch: 31043, Training Loss: 0.00946
Epoch: 31043, Training Loss: 0.00733
Epoch: 31044, Training Loss: 0.00828
Epoch: 31044, Training Loss: 0.00774
Epoch: 31044, Training Loss: 0.00946
Epoch: 31044, Training Loss: 0.00733
Epoch: 31045, Training Loss: 0.00828
Epoch: 31045, Training Loss: 0.00774
Epoch: 31045, Training Loss: 0.00946
Epoch: 31045, Training Loss: 0.00733
Epoch: 31046, Training Loss: 0.00828
Epoch: 31046, Training Loss: 0.00774
Epoch: 31046, Training Loss: 0.00946
Epoch: 31046, Training Loss: 0.00733
Epoch: 31047, Training Loss: 0.00828
Epoch: 31047, Training Loss: 0.00774
Epoch: 31047, Training Loss: 0.00946
Epoch: 31047, Training Loss: 0.00733
Epoch: 31048, Training Loss: 0.00828
Epoch: 31048, Training Loss: 0.00774
Epoch: 31048, Training Loss: 0.00946
Epoch: 31048, Training Loss: 0.00733
Epoch: 31049, Training Loss: 0.00828
Epoch: 31049, Training Loss: 0.00774
Epoch: 31049, Training Loss: 0.00946
Epoch: 31049, Training Loss: 0.00733
Epoch: 31050, Training Loss: 0.00828
Epoch: 31050, Training Loss: 0.00774
Epoch: 31050, Training Loss: 0.00946
Epoch: 31050, Training Loss: 0.00732
Epoch: 31051, Training Loss: 0.00828
Epoch: 31051, Training Loss: 0.00774
Epoch: 31051, Training Loss: 0.00946
Epoch: 31051, Training Loss: 0.00732
Epoch: 31052, Training Loss: 0.00828
Epoch: 31052, Training Loss: 0.00774
Epoch: 31052, Training Loss: 0.00946
Epoch: 31052, Training Loss: 0.00732
Epoch: 31053, Training Loss: 0.00828
Epoch: 31053, Training Loss: 0.00774
Epoch: 31053, Training Loss: 0.00946
Epoch: 31053, Training Loss: 0.00732
Epoch: 31054, Training Loss: 0.00828
Epoch: 31054, Training Loss: 0.00773
Epoch: 31054, Training Loss: 0.00946
Epoch: 31054, Training Loss: 0.00732
Epoch: 31055, Training Loss: 0.00828
Epoch: 31055, Training Loss: 0.00773
Epoch: 31055, Training Loss: 0.00946
Epoch: 31055, Training Loss: 0.00732
Epoch: 31056, Training Loss: 0.00828
Epoch: 31056, Training Loss: 0.00773
Epoch: 31056, Training Loss: 0.00946
Epoch: 31056, Training Loss: 0.00732
Epoch: 31057, Training Loss: 0.00828
Epoch: 31057, Training Loss: 0.00773
Epoch: 31057, Training Loss: 0.00946
Epoch: 31057, Training Loss: 0.00732
Epoch: 31058, Training Loss: 0.00828
Epoch: 31058, Training Loss: 0.00773
Epoch: 31058, Training Loss: 0.00946
Epoch: 31058, Training Loss: 0.00732
Epoch: 31059, Training Loss: 0.00828
Epoch: 31059, Training Loss: 0.00773
Epoch: 31059, Training Loss: 0.00946
Epoch: 31059, Training Loss: 0.00732
Epoch: 31060, Training Loss: 0.00828
Epoch: 31060, Training Loss: 0.00773
Epoch: 31060, Training Loss: 0.00946
Epoch: 31060, Training Loss: 0.00732
Epoch: 31061, Training Loss: 0.00828
Epoch: 31061, Training Loss: 0.00773
Epoch: 31061, Training Loss: 0.00946
Epoch: 31061, Training Loss: 0.00732
Epoch: 31062, Training Loss: 0.00828
Epoch: 31062, Training Loss: 0.00773
Epoch: 31062, Training Loss: 0.00946
Epoch: 31062, Training Loss: 0.00732
Epoch: 31063, Training Loss: 0.00828
Epoch: 31063, Training Loss: 0.00773
Epoch: 31063, Training Loss: 0.00946
Epoch: 31063, Training Loss: 0.00732
Epoch: 31064, Training Loss: 0.00828
Epoch: 31064, Training Loss: 0.00773
Epoch: 31064, Training Loss: 0.00946
Epoch: 31064, Training Loss: 0.00732
Epoch: 31065, Training Loss: 0.00828
Epoch: 31065, Training Loss: 0.00773
Epoch: 31065, Training Loss: 0.00946
Epoch: 31065, Training Loss: 0.00732
Epoch: 31066, Training Loss: 0.00828
Epoch: 31066, Training Loss: 0.00773
Epoch: 31066, Training Loss: 0.00946
Epoch: 31066, Training Loss: 0.00732
Epoch: 31067, Training Loss: 0.00828
Epoch: 31067, Training Loss: 0.00773
Epoch: 31067, Training Loss: 0.00946
Epoch: 31067, Training Loss: 0.00732
Epoch: 31068, Training Loss: 0.00828
Epoch: 31068, Training Loss: 0.00773
Epoch: 31068, Training Loss: 0.00946
Epoch: 31068, Training Loss: 0.00732
Epoch: 31069, Training Loss: 0.00828
Epoch: 31069, Training Loss: 0.00773
Epoch: 31069, Training Loss: 0.00946
Epoch: 31069, Training Loss: 0.00732
Epoch: 31070, Training Loss: 0.00828
Epoch: 31070, Training Loss: 0.00773
Epoch: 31070, Training Loss: 0.00946
Epoch: 31070, Training Loss: 0.00732
Epoch: 31071, Training Loss: 0.00828
Epoch: 31071, Training Loss: 0.00773
Epoch: 31071, Training Loss: 0.00946
Epoch: 31071, Training Loss: 0.00732
Epoch: 31072, Training Loss: 0.00828
Epoch: 31072, Training Loss: 0.00773
Epoch: 31072, Training Loss: 0.00946
Epoch: 31072, Training Loss: 0.00732
Epoch: 31073, Training Loss: 0.00828
Epoch: 31073, Training Loss: 0.00773
Epoch: 31073, Training Loss: 0.00945
Epoch: 31073, Training Loss: 0.00732
Epoch: 31074, Training Loss: 0.00828
Epoch: 31074, Training Loss: 0.00773
Epoch: 31074, Training Loss: 0.00945
Epoch: 31074, Training Loss: 0.00732
Epoch: 31075, Training Loss: 0.00828
Epoch: 31075, Training Loss: 0.00773
Epoch: 31075, Training Loss: 0.00945
Epoch: 31075, Training Loss: 0.00732
Epoch: 31076, Training Loss: 0.00828
Epoch: 31076, Training Loss: 0.00773
Epoch: 31076, Training Loss: 0.00945
Epoch: 31076, Training Loss: 0.00732
Epoch: 31077, Training Loss: 0.00828
Epoch: 31077, Training Loss: 0.00773
Epoch: 31077, Training Loss: 0.00945
Epoch: 31077, Training Loss: 0.00732
Epoch: 31078, Training Loss: 0.00828
Epoch: 31078, Training Loss: 0.00773
Epoch: 31078, Training Loss: 0.00945
Epoch: 31078, Training Loss: 0.00732
Epoch: 31079, Training Loss: 0.00828
Epoch: 31079, Training Loss: 0.00773
Epoch: 31079, Training Loss: 0.00945
Epoch: 31079, Training Loss: 0.00732
Epoch: 31080, Training Loss: 0.00828
Epoch: 31080, Training Loss: 0.00773
Epoch: 31080, Training Loss: 0.00945
Epoch: 31080, Training Loss: 0.00732
Epoch: 31081, Training Loss: 0.00828
Epoch: 31081, Training Loss: 0.00773
Epoch: 31081, Training Loss: 0.00945
Epoch: 31081, Training Loss: 0.00732
Epoch: 31082, Training Loss: 0.00828
Epoch: 31082, Training Loss: 0.00773
Epoch: 31082, Training Loss: 0.00945
Epoch: 31082, Training Loss: 0.00732
Epoch: 31083, Training Loss: 0.00828
Epoch: 31083, Training Loss: 0.00773
Epoch: 31083, Training Loss: 0.00945
Epoch: 31083, Training Loss: 0.00732
Epoch: 31084, Training Loss: 0.00828
Epoch: 31084, Training Loss: 0.00773
Epoch: 31084, Training Loss: 0.00945
Epoch: 31084, Training Loss: 0.00732
Epoch: 31085, Training Loss: 0.00828
Epoch: 31085, Training Loss: 0.00773
Epoch: 31085, Training Loss: 0.00945
Epoch: 31085, Training Loss: 0.00732
Epoch: 31086, Training Loss: 0.00828
Epoch: 31086, Training Loss: 0.00773
Epoch: 31086, Training Loss: 0.00945
Epoch: 31086, Training Loss: 0.00732
Epoch: 31087, Training Loss: 0.00828
Epoch: 31087, Training Loss: 0.00773
Epoch: 31087, Training Loss: 0.00945
Epoch: 31087, Training Loss: 0.00732
Epoch: 31088, Training Loss: 0.00828
Epoch: 31088, Training Loss: 0.00773
Epoch: 31088, Training Loss: 0.00945
Epoch: 31088, Training Loss: 0.00732
Epoch: 31089, Training Loss: 0.00828
Epoch: 31089, Training Loss: 0.00773
Epoch: 31089, Training Loss: 0.00945
Epoch: 31089, Training Loss: 0.00732
Epoch: 31090, Training Loss: 0.00828
Epoch: 31090, Training Loss: 0.00773
Epoch: 31090, Training Loss: 0.00945
Epoch: 31090, Training Loss: 0.00732
Epoch: 31091, Training Loss: 0.00828
Epoch: 31091, Training Loss: 0.00773
Epoch: 31091, Training Loss: 0.00945
Epoch: 31091, Training Loss: 0.00732
Epoch: 31092, Training Loss: 0.00828
Epoch: 31092, Training Loss: 0.00773
Epoch: 31092, Training Loss: 0.00945
Epoch: 31092, Training Loss: 0.00732
Epoch: 31093, Training Loss: 0.00828
Epoch: 31093, Training Loss: 0.00773
Epoch: 31093, Training Loss: 0.00945
Epoch: 31093, Training Loss: 0.00732
Epoch: 31094, Training Loss: 0.00828
Epoch: 31094, Training Loss: 0.00773
Epoch: 31094, Training Loss: 0.00945
Epoch: 31094, Training Loss: 0.00732
Epoch: 31095, Training Loss: 0.00828
Epoch: 31095, Training Loss: 0.00773
Epoch: 31095, Training Loss: 0.00945
Epoch: 31095, Training Loss: 0.00732
Epoch: 31096, Training Loss: 0.00828
Epoch: 31096, Training Loss: 0.00773
Epoch: 31096, Training Loss: 0.00945
Epoch: 31096, Training Loss: 0.00732
Epoch: 31097, Training Loss: 0.00828
Epoch: 31097, Training Loss: 0.00773
Epoch: 31097, Training Loss: 0.00945
Epoch: 31097, Training Loss: 0.00732
Epoch: 31098, Training Loss: 0.00827
Epoch: 31098, Training Loss: 0.00773
Epoch: 31098, Training Loss: 0.00945
Epoch: 31098, Training Loss: 0.00732
Epoch: 31099, Training Loss: 0.00827
Epoch: 31099, Training Loss: 0.00773
Epoch: 31099, Training Loss: 0.00945
Epoch: 31099, Training Loss: 0.00732
Epoch: 31100, Training Loss: 0.00827
Epoch: 31100, Training Loss: 0.00773
Epoch: 31100, Training Loss: 0.00945
Epoch: 31100, Training Loss: 0.00732
Epoch: 31101, Training Loss: 0.00827
Epoch: 31101, Training Loss: 0.00773
Epoch: 31101, Training Loss: 0.00945
Epoch: 31101, Training Loss: 0.00732
Epoch: 31102, Training Loss: 0.00827
Epoch: 31102, Training Loss: 0.00773
Epoch: 31102, Training Loss: 0.00945
Epoch: 31102, Training Loss: 0.00732
Epoch: 31103, Training Loss: 0.00827
Epoch: 31103, Training Loss: 0.00773
Epoch: 31103, Training Loss: 0.00945
Epoch: 31103, Training Loss: 0.00732
Epoch: 31104, Training Loss: 0.00827
Epoch: 31104, Training Loss: 0.00773
Epoch: 31104, Training Loss: 0.00945
Epoch: 31104, Training Loss: 0.00732
Epoch: 31105, Training Loss: 0.00827
Epoch: 31105, Training Loss: 0.00773
Epoch: 31105, Training Loss: 0.00945
Epoch: 31105, Training Loss: 0.00732
Epoch: 31106, Training Loss: 0.00827
Epoch: 31106, Training Loss: 0.00773
Epoch: 31106, Training Loss: 0.00945
Epoch: 31106, Training Loss: 0.00732
Epoch: 31107, Training Loss: 0.00827
Epoch: 31107, Training Loss: 0.00773
Epoch: 31107, Training Loss: 0.00945
Epoch: 31107, Training Loss: 0.00732
Epoch: 31108, Training Loss: 0.00827
Epoch: 31108, Training Loss: 0.00773
Epoch: 31108, Training Loss: 0.00945
Epoch: 31108, Training Loss: 0.00732
Epoch: 31109, Training Loss: 0.00827
Epoch: 31109, Training Loss: 0.00773
Epoch: 31109, Training Loss: 0.00945
Epoch: 31109, Training Loss: 0.00732
Epoch: 31110, Training Loss: 0.00827
Epoch: 31110, Training Loss: 0.00773
Epoch: 31110, Training Loss: 0.00945
Epoch: 31110, Training Loss: 0.00732
Epoch: 31111, Training Loss: 0.00827
Epoch: 31111, Training Loss: 0.00773
Epoch: 31111, Training Loss: 0.00945
Epoch: 31111, Training Loss: 0.00732
Epoch: 31112, Training Loss: 0.00827
Epoch: 31112, Training Loss: 0.00773
Epoch: 31112, Training Loss: 0.00945
Epoch: 31112, Training Loss: 0.00732
Epoch: 31113, Training Loss: 0.00827
Epoch: 31113, Training Loss: 0.00773
Epoch: 31113, Training Loss: 0.00945
Epoch: 31113, Training Loss: 0.00732
Epoch: 31114, Training Loss: 0.00827
Epoch: 31114, Training Loss: 0.00773
Epoch: 31114, Training Loss: 0.00945
Epoch: 31114, Training Loss: 0.00732
Epoch: 31115, Training Loss: 0.00827
Epoch: 31115, Training Loss: 0.00773
Epoch: 31115, Training Loss: 0.00945
Epoch: 31115, Training Loss: 0.00732
Epoch: 31116, Training Loss: 0.00827
Epoch: 31116, Training Loss: 0.00773
Epoch: 31116, Training Loss: 0.00945
Epoch: 31116, Training Loss: 0.00732
Epoch: 31117, Training Loss: 0.00827
Epoch: 31117, Training Loss: 0.00773
Epoch: 31117, Training Loss: 0.00945
Epoch: 31117, Training Loss: 0.00732
Epoch: 31118, Training Loss: 0.00827
Epoch: 31118, Training Loss: 0.00773
Epoch: 31118, Training Loss: 0.00945
Epoch: 31118, Training Loss: 0.00732
Epoch: 31119, Training Loss: 0.00827
Epoch: 31119, Training Loss: 0.00773
Epoch: 31119, Training Loss: 0.00945
Epoch: 31119, Training Loss: 0.00732
Epoch: 31120, Training Loss: 0.00827
Epoch: 31120, Training Loss: 0.00773
Epoch: 31120, Training Loss: 0.00945
Epoch: 31120, Training Loss: 0.00732
Epoch: 31121, Training Loss: 0.00827
Epoch: 31121, Training Loss: 0.00773
Epoch: 31121, Training Loss: 0.00945
Epoch: 31121, Training Loss: 0.00732
Epoch: 31122, Training Loss: 0.00827
Epoch: 31122, Training Loss: 0.00773
Epoch: 31122, Training Loss: 0.00945
Epoch: 31122, Training Loss: 0.00732
Epoch: 31123, Training Loss: 0.00827
Epoch: 31123, Training Loss: 0.00773
Epoch: 31123, Training Loss: 0.00945
Epoch: 31123, Training Loss: 0.00732
Epoch: 31124, Training Loss: 0.00827
Epoch: 31124, Training Loss: 0.00773
Epoch: 31124, Training Loss: 0.00945
Epoch: 31124, Training Loss: 0.00732
Epoch: 31125, Training Loss: 0.00827
Epoch: 31125, Training Loss: 0.00773
Epoch: 31125, Training Loss: 0.00945
Epoch: 31125, Training Loss: 0.00732
Epoch: 31126, Training Loss: 0.00827
Epoch: 31126, Training Loss: 0.00773
Epoch: 31126, Training Loss: 0.00945
Epoch: 31126, Training Loss: 0.00732
Epoch: 31127, Training Loss: 0.00827
Epoch: 31127, Training Loss: 0.00773
Epoch: 31127, Training Loss: 0.00945
Epoch: 31127, Training Loss: 0.00732
Epoch: 31128, Training Loss: 0.00827
Epoch: 31128, Training Loss: 0.00773
Epoch: 31128, Training Loss: 0.00945
Epoch: 31128, Training Loss: 0.00732
Epoch: 31129, Training Loss: 0.00827
Epoch: 31129, Training Loss: 0.00773
Epoch: 31129, Training Loss: 0.00945
Epoch: 31129, Training Loss: 0.00732
Epoch: 31130, Training Loss: 0.00827
Epoch: 31130, Training Loss: 0.00773
Epoch: 31130, Training Loss: 0.00945
Epoch: 31130, Training Loss: 0.00732
Epoch: 31131, Training Loss: 0.00827
Epoch: 31131, Training Loss: 0.00772
Epoch: 31131, Training Loss: 0.00945
Epoch: 31131, Training Loss: 0.00731
Epoch: 31132, Training Loss: 0.00827
Epoch: 31132, Training Loss: 0.00772
Epoch: 31132, Training Loss: 0.00945
Epoch: 31132, Training Loss: 0.00731
Epoch: 31133, Training Loss: 0.00827
Epoch: 31133, Training Loss: 0.00772
Epoch: 31133, Training Loss: 0.00945
Epoch: 31133, Training Loss: 0.00731
Epoch: 31134, Training Loss: 0.00827
Epoch: 31134, Training Loss: 0.00772
Epoch: 31134, Training Loss: 0.00945
Epoch: 31134, Training Loss: 0.00731
Epoch: 31135, Training Loss: 0.00827
Epoch: 31135, Training Loss: 0.00772
Epoch: 31135, Training Loss: 0.00944
Epoch: 31135, Training Loss: 0.00731
Epoch: 31136, Training Loss: 0.00827
Epoch: 31136, Training Loss: 0.00772
Epoch: 31136, Training Loss: 0.00944
Epoch: 31136, Training Loss: 0.00731
Epoch: 31137, Training Loss: 0.00827
Epoch: 31137, Training Loss: 0.00772
Epoch: 31137, Training Loss: 0.00944
Epoch: 31137, Training Loss: 0.00731
Epoch: 31138, Training Loss: 0.00827
Epoch: 31138, Training Loss: 0.00772
Epoch: 31138, Training Loss: 0.00944
Epoch: 31138, Training Loss: 0.00731
Epoch: 31139, Training Loss: 0.00827
Epoch: 31139, Training Loss: 0.00772
Epoch: 31139, Training Loss: 0.00944
Epoch: 31139, Training Loss: 0.00731
Epoch: 31140, Training Loss: 0.00827
Epoch: 31140, Training Loss: 0.00772
Epoch: 31140, Training Loss: 0.00944
Epoch: 31140, Training Loss: 0.00731
Epoch: 31141, Training Loss: 0.00827
Epoch: 31141, Training Loss: 0.00772
Epoch: 31141, Training Loss: 0.00944
Epoch: 31141, Training Loss: 0.00731
Epoch: 31142, Training Loss: 0.00827
Epoch: 31142, Training Loss: 0.00772
Epoch: 31142, Training Loss: 0.00944
Epoch: 31142, Training Loss: 0.00731
Epoch: 31143, Training Loss: 0.00827
Epoch: 31143, Training Loss: 0.00772
Epoch: 31143, Training Loss: 0.00944
Epoch: 31143, Training Loss: 0.00731
Epoch: 31144, Training Loss: 0.00827
Epoch: 31144, Training Loss: 0.00772
Epoch: 31144, Training Loss: 0.00944
Epoch: 31144, Training Loss: 0.00731
Epoch: 31145, Training Loss: 0.00827
Epoch: 31145, Training Loss: 0.00772
Epoch: 31145, Training Loss: 0.00944
Epoch: 31145, Training Loss: 0.00731
Epoch: 31146, Training Loss: 0.00827
Epoch: 31146, Training Loss: 0.00772
Epoch: 31146, Training Loss: 0.00944
Epoch: 31146, Training Loss: 0.00731
Epoch: 31147, Training Loss: 0.00827
Epoch: 31147, Training Loss: 0.00772
Epoch: 31147, Training Loss: 0.00944
Epoch: 31147, Training Loss: 0.00731
Epoch: 31148, Training Loss: 0.00827
Epoch: 31148, Training Loss: 0.00772
Epoch: 31148, Training Loss: 0.00944
Epoch: 31148, Training Loss: 0.00731
Epoch: 31149, Training Loss: 0.00827
Epoch: 31149, Training Loss: 0.00772
Epoch: 31149, Training Loss: 0.00944
Epoch: 31149, Training Loss: 0.00731
Epoch: 31150, Training Loss: 0.00827
Epoch: 31150, Training Loss: 0.00772
Epoch: 31150, Training Loss: 0.00944
Epoch: 31150, Training Loss: 0.00731
Epoch: 31151, Training Loss: 0.00827
Epoch: 31151, Training Loss: 0.00772
Epoch: 31151, Training Loss: 0.00944
Epoch: 31151, Training Loss: 0.00731
Epoch: 31152, Training Loss: 0.00827
Epoch: 31152, Training Loss: 0.00772
Epoch: 31152, Training Loss: 0.00944
Epoch: 31152, Training Loss: 0.00731
Epoch: 31153, Training Loss: 0.00827
Epoch: 31153, Training Loss: 0.00772
Epoch: 31153, Training Loss: 0.00944
Epoch: 31153, Training Loss: 0.00731
Epoch: 31154, Training Loss: 0.00827
Epoch: 31154, Training Loss: 0.00772
Epoch: 31154, Training Loss: 0.00944
Epoch: 31154, Training Loss: 0.00731
Epoch: 31155, Training Loss: 0.00827
Epoch: 31155, Training Loss: 0.00772
Epoch: 31155, Training Loss: 0.00944
Epoch: 31155, Training Loss: 0.00731
Epoch: 31156, Training Loss: 0.00827
Epoch: 31156, Training Loss: 0.00772
Epoch: 31156, Training Loss: 0.00944
Epoch: 31156, Training Loss: 0.00731
Epoch: 31157, Training Loss: 0.00827
Epoch: 31157, Training Loss: 0.00772
Epoch: 31157, Training Loss: 0.00944
Epoch: 31157, Training Loss: 0.00731
Epoch: 31158, Training Loss: 0.00827
Epoch: 31158, Training Loss: 0.00772
Epoch: 31158, Training Loss: 0.00944
Epoch: 31158, Training Loss: 0.00731
Epoch: 31159, Training Loss: 0.00827
Epoch: 31159, Training Loss: 0.00772
Epoch: 31159, Training Loss: 0.00944
Epoch: 31159, Training Loss: 0.00731
Epoch: 31160, Training Loss: 0.00827
Epoch: 31160, Training Loss: 0.00772
Epoch: 31160, Training Loss: 0.00944
Epoch: 31160, Training Loss: 0.00731
Epoch: 31161, Training Loss: 0.00827
Epoch: 31161, Training Loss: 0.00772
Epoch: 31161, Training Loss: 0.00944
Epoch: 31161, Training Loss: 0.00731
Epoch: 31162, Training Loss: 0.00827
Epoch: 31162, Training Loss: 0.00772
Epoch: 31162, Training Loss: 0.00944
Epoch: 31162, Training Loss: 0.00731
Epoch: 31163, Training Loss: 0.00827
Epoch: 31163, Training Loss: 0.00772
Epoch: 31163, Training Loss: 0.00944
Epoch: 31163, Training Loss: 0.00731
Epoch: 31164, Training Loss: 0.00827
Epoch: 31164, Training Loss: 0.00772
Epoch: 31164, Training Loss: 0.00944
Epoch: 31164, Training Loss: 0.00731
Epoch: 31165, Training Loss: 0.00827
Epoch: 31165, Training Loss: 0.00772
Epoch: 31165, Training Loss: 0.00944
Epoch: 31165, Training Loss: 0.00731
Epoch: 31166, Training Loss: 0.00827
Epoch: 31166, Training Loss: 0.00772
Epoch: 31166, Training Loss: 0.00944
Epoch: 31166, Training Loss: 0.00731
Epoch: 31167, Training Loss: 0.00827
Epoch: 31167, Training Loss: 0.00772
Epoch: 31167, Training Loss: 0.00944
Epoch: 31167, Training Loss: 0.00731
Epoch: 31168, Training Loss: 0.00827
Epoch: 31168, Training Loss: 0.00772
Epoch: 31168, Training Loss: 0.00944
Epoch: 31168, Training Loss: 0.00731
Epoch: 31169, Training Loss: 0.00826
Epoch: 31169, Training Loss: 0.00772
Epoch: 31169, Training Loss: 0.00944
Epoch: 31169, Training Loss: 0.00731
Epoch: 31170, Training Loss: 0.00826
Epoch: 31170, Training Loss: 0.00772
Epoch: 31170, Training Loss: 0.00944
Epoch: 31170, Training Loss: 0.00731
Epoch: 31171, Training Loss: 0.00826
Epoch: 31171, Training Loss: 0.00772
Epoch: 31171, Training Loss: 0.00944
Epoch: 31171, Training Loss: 0.00731
Epoch: 31172, Training Loss: 0.00826
Epoch: 31172, Training Loss: 0.00772
Epoch: 31172, Training Loss: 0.00944
Epoch: 31172, Training Loss: 0.00731
Epoch: 31173, Training Loss: 0.00826
Epoch: 31173, Training Loss: 0.00772
Epoch: 31173, Training Loss: 0.00944
Epoch: 31173, Training Loss: 0.00731
Epoch: 31174, Training Loss: 0.00826
Epoch: 31174, Training Loss: 0.00772
Epoch: 31174, Training Loss: 0.00944
Epoch: 31174, Training Loss: 0.00731
Epoch: 31175, Training Loss: 0.00826
Epoch: 31175, Training Loss: 0.00772
Epoch: 31175, Training Loss: 0.00944
Epoch: 31175, Training Loss: 0.00731
Epoch: 31176, Training Loss: 0.00826
Epoch: 31176, Training Loss: 0.00772
Epoch: 31176, Training Loss: 0.00944
Epoch: 31176, Training Loss: 0.00731
Epoch: 31177, Training Loss: 0.00826
Epoch: 31177, Training Loss: 0.00772
Epoch: 31177, Training Loss: 0.00944
Epoch: 31177, Training Loss: 0.00731
Epoch: 31178, Training Loss: 0.00826
Epoch: 31178, Training Loss: 0.00772
Epoch: 31178, Training Loss: 0.00944
Epoch: 31178, Training Loss: 0.00731
Epoch: 31179, Training Loss: 0.00826
Epoch: 31179, Training Loss: 0.00772
Epoch: 31179, Training Loss: 0.00944
Epoch: 31179, Training Loss: 0.00731
Epoch: 31180, Training Loss: 0.00826
Epoch: 31180, Training Loss: 0.00772
[... output truncated: epochs 31180-31424, four lines per epoch (one per XOR input pattern); per-pattern training losses decrease very slowly, from roughly 0.00826 / 0.00772 / 0.00944 / 0.00731 at epoch 31180 to roughly 0.00823 / 0.00769 / 0.00940 / 0.00728 at epoch 31424 ...]
Epoch: 31424, Training Loss: 0.00769
Epoch: 31424, Training Loss: 0.00940
Epoch: 31424, Training Loss: 0.00728
Epoch: 31425, Training Loss: 0.00823
Epoch: 31425, Training Loss: 0.00769
Epoch: 31425, Training Loss: 0.00940
Epoch: 31425, Training Loss: 0.00728
Epoch: 31426, Training Loss: 0.00823
Epoch: 31426, Training Loss: 0.00769
Epoch: 31426, Training Loss: 0.00940
Epoch: 31426, Training Loss: 0.00728
Epoch: 31427, Training Loss: 0.00823
Epoch: 31427, Training Loss: 0.00769
Epoch: 31427, Training Loss: 0.00940
Epoch: 31427, Training Loss: 0.00728
Epoch: 31428, Training Loss: 0.00823
Epoch: 31428, Training Loss: 0.00769
Epoch: 31428, Training Loss: 0.00940
Epoch: 31428, Training Loss: 0.00728
Epoch: 31429, Training Loss: 0.00823
Epoch: 31429, Training Loss: 0.00769
Epoch: 31429, Training Loss: 0.00940
Epoch: 31429, Training Loss: 0.00728
Epoch: 31430, Training Loss: 0.00823
Epoch: 31430, Training Loss: 0.00769
Epoch: 31430, Training Loss: 0.00940
Epoch: 31430, Training Loss: 0.00728
Epoch: 31431, Training Loss: 0.00823
Epoch: 31431, Training Loss: 0.00769
Epoch: 31431, Training Loss: 0.00940
Epoch: 31431, Training Loss: 0.00728
Epoch: 31432, Training Loss: 0.00823
Epoch: 31432, Training Loss: 0.00769
Epoch: 31432, Training Loss: 0.00940
Epoch: 31432, Training Loss: 0.00728
Epoch: 31433, Training Loss: 0.00823
Epoch: 31433, Training Loss: 0.00769
Epoch: 31433, Training Loss: 0.00940
Epoch: 31433, Training Loss: 0.00728
Epoch: 31434, Training Loss: 0.00823
Epoch: 31434, Training Loss: 0.00769
Epoch: 31434, Training Loss: 0.00940
Epoch: 31434, Training Loss: 0.00728
Epoch: 31435, Training Loss: 0.00823
Epoch: 31435, Training Loss: 0.00769
Epoch: 31435, Training Loss: 0.00940
Epoch: 31435, Training Loss: 0.00728
Epoch: 31436, Training Loss: 0.00823
Epoch: 31436, Training Loss: 0.00769
Epoch: 31436, Training Loss: 0.00940
Epoch: 31436, Training Loss: 0.00728
Epoch: 31437, Training Loss: 0.00823
Epoch: 31437, Training Loss: 0.00769
Epoch: 31437, Training Loss: 0.00940
Epoch: 31437, Training Loss: 0.00728
Epoch: 31438, Training Loss: 0.00823
Epoch: 31438, Training Loss: 0.00769
Epoch: 31438, Training Loss: 0.00940
Epoch: 31438, Training Loss: 0.00728
Epoch: 31439, Training Loss: 0.00823
Epoch: 31439, Training Loss: 0.00769
Epoch: 31439, Training Loss: 0.00940
Epoch: 31439, Training Loss: 0.00728
Epoch: 31440, Training Loss: 0.00823
Epoch: 31440, Training Loss: 0.00769
Epoch: 31440, Training Loss: 0.00940
Epoch: 31440, Training Loss: 0.00728
Epoch: 31441, Training Loss: 0.00823
Epoch: 31441, Training Loss: 0.00769
Epoch: 31441, Training Loss: 0.00940
Epoch: 31441, Training Loss: 0.00728
Epoch: 31442, Training Loss: 0.00823
Epoch: 31442, Training Loss: 0.00768
Epoch: 31442, Training Loss: 0.00940
Epoch: 31442, Training Loss: 0.00728
Epoch: 31443, Training Loss: 0.00823
Epoch: 31443, Training Loss: 0.00768
Epoch: 31443, Training Loss: 0.00940
Epoch: 31443, Training Loss: 0.00728
Epoch: 31444, Training Loss: 0.00823
Epoch: 31444, Training Loss: 0.00768
Epoch: 31444, Training Loss: 0.00940
Epoch: 31444, Training Loss: 0.00728
Epoch: 31445, Training Loss: 0.00823
Epoch: 31445, Training Loss: 0.00768
Epoch: 31445, Training Loss: 0.00940
Epoch: 31445, Training Loss: 0.00728
Epoch: 31446, Training Loss: 0.00823
Epoch: 31446, Training Loss: 0.00768
Epoch: 31446, Training Loss: 0.00940
Epoch: 31446, Training Loss: 0.00728
Epoch: 31447, Training Loss: 0.00823
Epoch: 31447, Training Loss: 0.00768
Epoch: 31447, Training Loss: 0.00940
Epoch: 31447, Training Loss: 0.00728
Epoch: 31448, Training Loss: 0.00823
Epoch: 31448, Training Loss: 0.00768
Epoch: 31448, Training Loss: 0.00940
Epoch: 31448, Training Loss: 0.00728
Epoch: 31449, Training Loss: 0.00823
Epoch: 31449, Training Loss: 0.00768
Epoch: 31449, Training Loss: 0.00939
Epoch: 31449, Training Loss: 0.00728
Epoch: 31450, Training Loss: 0.00823
Epoch: 31450, Training Loss: 0.00768
Epoch: 31450, Training Loss: 0.00939
Epoch: 31450, Training Loss: 0.00728
Epoch: 31451, Training Loss: 0.00823
Epoch: 31451, Training Loss: 0.00768
Epoch: 31451, Training Loss: 0.00939
Epoch: 31451, Training Loss: 0.00728
Epoch: 31452, Training Loss: 0.00823
Epoch: 31452, Training Loss: 0.00768
Epoch: 31452, Training Loss: 0.00939
Epoch: 31452, Training Loss: 0.00728
Epoch: 31453, Training Loss: 0.00823
Epoch: 31453, Training Loss: 0.00768
Epoch: 31453, Training Loss: 0.00939
Epoch: 31453, Training Loss: 0.00728
Epoch: 31454, Training Loss: 0.00823
Epoch: 31454, Training Loss: 0.00768
Epoch: 31454, Training Loss: 0.00939
Epoch: 31454, Training Loss: 0.00728
Epoch: 31455, Training Loss: 0.00822
Epoch: 31455, Training Loss: 0.00768
Epoch: 31455, Training Loss: 0.00939
Epoch: 31455, Training Loss: 0.00728
Epoch: 31456, Training Loss: 0.00822
Epoch: 31456, Training Loss: 0.00768
Epoch: 31456, Training Loss: 0.00939
Epoch: 31456, Training Loss: 0.00728
Epoch: 31457, Training Loss: 0.00822
Epoch: 31457, Training Loss: 0.00768
Epoch: 31457, Training Loss: 0.00939
Epoch: 31457, Training Loss: 0.00728
Epoch: 31458, Training Loss: 0.00822
Epoch: 31458, Training Loss: 0.00768
Epoch: 31458, Training Loss: 0.00939
Epoch: 31458, Training Loss: 0.00728
Epoch: 31459, Training Loss: 0.00822
Epoch: 31459, Training Loss: 0.00768
Epoch: 31459, Training Loss: 0.00939
Epoch: 31459, Training Loss: 0.00727
Epoch: 31460, Training Loss: 0.00822
Epoch: 31460, Training Loss: 0.00768
Epoch: 31460, Training Loss: 0.00939
Epoch: 31460, Training Loss: 0.00727
Epoch: 31461, Training Loss: 0.00822
Epoch: 31461, Training Loss: 0.00768
Epoch: 31461, Training Loss: 0.00939
Epoch: 31461, Training Loss: 0.00727
Epoch: 31462, Training Loss: 0.00822
Epoch: 31462, Training Loss: 0.00768
Epoch: 31462, Training Loss: 0.00939
Epoch: 31462, Training Loss: 0.00727
Epoch: 31463, Training Loss: 0.00822
Epoch: 31463, Training Loss: 0.00768
Epoch: 31463, Training Loss: 0.00939
Epoch: 31463, Training Loss: 0.00727
Epoch: 31464, Training Loss: 0.00822
Epoch: 31464, Training Loss: 0.00768
Epoch: 31464, Training Loss: 0.00939
Epoch: 31464, Training Loss: 0.00727
Epoch: 31465, Training Loss: 0.00822
Epoch: 31465, Training Loss: 0.00768
Epoch: 31465, Training Loss: 0.00939
Epoch: 31465, Training Loss: 0.00727
Epoch: 31466, Training Loss: 0.00822
Epoch: 31466, Training Loss: 0.00768
Epoch: 31466, Training Loss: 0.00939
Epoch: 31466, Training Loss: 0.00727
Epoch: 31467, Training Loss: 0.00822
Epoch: 31467, Training Loss: 0.00768
Epoch: 31467, Training Loss: 0.00939
Epoch: 31467, Training Loss: 0.00727
Epoch: 31468, Training Loss: 0.00822
Epoch: 31468, Training Loss: 0.00768
Epoch: 31468, Training Loss: 0.00939
Epoch: 31468, Training Loss: 0.00727
Epoch: 31469, Training Loss: 0.00822
Epoch: 31469, Training Loss: 0.00768
Epoch: 31469, Training Loss: 0.00939
Epoch: 31469, Training Loss: 0.00727
Epoch: 31470, Training Loss: 0.00822
Epoch: 31470, Training Loss: 0.00768
Epoch: 31470, Training Loss: 0.00939
Epoch: 31470, Training Loss: 0.00727
Epoch: 31471, Training Loss: 0.00822
Epoch: 31471, Training Loss: 0.00768
Epoch: 31471, Training Loss: 0.00939
Epoch: 31471, Training Loss: 0.00727
Epoch: 31472, Training Loss: 0.00822
Epoch: 31472, Training Loss: 0.00768
Epoch: 31472, Training Loss: 0.00939
Epoch: 31472, Training Loss: 0.00727
Epoch: 31473, Training Loss: 0.00822
Epoch: 31473, Training Loss: 0.00768
Epoch: 31473, Training Loss: 0.00939
Epoch: 31473, Training Loss: 0.00727
Epoch: 31474, Training Loss: 0.00822
Epoch: 31474, Training Loss: 0.00768
Epoch: 31474, Training Loss: 0.00939
Epoch: 31474, Training Loss: 0.00727
Epoch: 31475, Training Loss: 0.00822
Epoch: 31475, Training Loss: 0.00768
Epoch: 31475, Training Loss: 0.00939
Epoch: 31475, Training Loss: 0.00727
Epoch: 31476, Training Loss: 0.00822
Epoch: 31476, Training Loss: 0.00768
Epoch: 31476, Training Loss: 0.00939
Epoch: 31476, Training Loss: 0.00727
Epoch: 31477, Training Loss: 0.00822
Epoch: 31477, Training Loss: 0.00768
Epoch: 31477, Training Loss: 0.00939
Epoch: 31477, Training Loss: 0.00727
Epoch: 31478, Training Loss: 0.00822
Epoch: 31478, Training Loss: 0.00768
Epoch: 31478, Training Loss: 0.00939
Epoch: 31478, Training Loss: 0.00727
Epoch: 31479, Training Loss: 0.00822
Epoch: 31479, Training Loss: 0.00768
Epoch: 31479, Training Loss: 0.00939
Epoch: 31479, Training Loss: 0.00727
Epoch: 31480, Training Loss: 0.00822
Epoch: 31480, Training Loss: 0.00768
Epoch: 31480, Training Loss: 0.00939
Epoch: 31480, Training Loss: 0.00727
Epoch: 31481, Training Loss: 0.00822
Epoch: 31481, Training Loss: 0.00768
Epoch: 31481, Training Loss: 0.00939
Epoch: 31481, Training Loss: 0.00727
Epoch: 31482, Training Loss: 0.00822
Epoch: 31482, Training Loss: 0.00768
Epoch: 31482, Training Loss: 0.00939
Epoch: 31482, Training Loss: 0.00727
Epoch: 31483, Training Loss: 0.00822
Epoch: 31483, Training Loss: 0.00768
Epoch: 31483, Training Loss: 0.00939
Epoch: 31483, Training Loss: 0.00727
Epoch: 31484, Training Loss: 0.00822
Epoch: 31484, Training Loss: 0.00768
Epoch: 31484, Training Loss: 0.00939
Epoch: 31484, Training Loss: 0.00727
Epoch: 31485, Training Loss: 0.00822
Epoch: 31485, Training Loss: 0.00768
Epoch: 31485, Training Loss: 0.00939
Epoch: 31485, Training Loss: 0.00727
Epoch: 31486, Training Loss: 0.00822
Epoch: 31486, Training Loss: 0.00768
Epoch: 31486, Training Loss: 0.00939
Epoch: 31486, Training Loss: 0.00727
Epoch: 31487, Training Loss: 0.00822
Epoch: 31487, Training Loss: 0.00768
Epoch: 31487, Training Loss: 0.00939
Epoch: 31487, Training Loss: 0.00727
Epoch: 31488, Training Loss: 0.00822
Epoch: 31488, Training Loss: 0.00768
Epoch: 31488, Training Loss: 0.00939
Epoch: 31488, Training Loss: 0.00727
Epoch: 31489, Training Loss: 0.00822
Epoch: 31489, Training Loss: 0.00768
Epoch: 31489, Training Loss: 0.00939
Epoch: 31489, Training Loss: 0.00727
Epoch: 31490, Training Loss: 0.00822
Epoch: 31490, Training Loss: 0.00768
Epoch: 31490, Training Loss: 0.00939
Epoch: 31490, Training Loss: 0.00727
Epoch: 31491, Training Loss: 0.00822
Epoch: 31491, Training Loss: 0.00768
Epoch: 31491, Training Loss: 0.00939
Epoch: 31491, Training Loss: 0.00727
Epoch: 31492, Training Loss: 0.00822
Epoch: 31492, Training Loss: 0.00768
Epoch: 31492, Training Loss: 0.00939
Epoch: 31492, Training Loss: 0.00727
Epoch: 31493, Training Loss: 0.00822
Epoch: 31493, Training Loss: 0.00768
Epoch: 31493, Training Loss: 0.00939
Epoch: 31493, Training Loss: 0.00727
Epoch: 31494, Training Loss: 0.00822
Epoch: 31494, Training Loss: 0.00768
Epoch: 31494, Training Loss: 0.00939
Epoch: 31494, Training Loss: 0.00727
Epoch: 31495, Training Loss: 0.00822
Epoch: 31495, Training Loss: 0.00768
Epoch: 31495, Training Loss: 0.00939
Epoch: 31495, Training Loss: 0.00727
Epoch: 31496, Training Loss: 0.00822
Epoch: 31496, Training Loss: 0.00768
Epoch: 31496, Training Loss: 0.00939
Epoch: 31496, Training Loss: 0.00727
Epoch: 31497, Training Loss: 0.00822
Epoch: 31497, Training Loss: 0.00768
Epoch: 31497, Training Loss: 0.00939
Epoch: 31497, Training Loss: 0.00727
Epoch: 31498, Training Loss: 0.00822
Epoch: 31498, Training Loss: 0.00768
Epoch: 31498, Training Loss: 0.00939
Epoch: 31498, Training Loss: 0.00727
Epoch: 31499, Training Loss: 0.00822
Epoch: 31499, Training Loss: 0.00768
Epoch: 31499, Training Loss: 0.00939
Epoch: 31499, Training Loss: 0.00727
Epoch: 31500, Training Loss: 0.00822
Epoch: 31500, Training Loss: 0.00768
Epoch: 31500, Training Loss: 0.00939
Epoch: 31500, Training Loss: 0.00727
Epoch: 31501, Training Loss: 0.00822
Epoch: 31501, Training Loss: 0.00768
Epoch: 31501, Training Loss: 0.00939
Epoch: 31501, Training Loss: 0.00727
Epoch: 31502, Training Loss: 0.00822
Epoch: 31502, Training Loss: 0.00768
Epoch: 31502, Training Loss: 0.00939
Epoch: 31502, Training Loss: 0.00727
Epoch: 31503, Training Loss: 0.00822
Epoch: 31503, Training Loss: 0.00768
Epoch: 31503, Training Loss: 0.00939
Epoch: 31503, Training Loss: 0.00727
Epoch: 31504, Training Loss: 0.00822
Epoch: 31504, Training Loss: 0.00768
Epoch: 31504, Training Loss: 0.00939
Epoch: 31504, Training Loss: 0.00727
Epoch: 31505, Training Loss: 0.00822
Epoch: 31505, Training Loss: 0.00768
Epoch: 31505, Training Loss: 0.00939
Epoch: 31505, Training Loss: 0.00727
Epoch: 31506, Training Loss: 0.00822
Epoch: 31506, Training Loss: 0.00768
Epoch: 31506, Training Loss: 0.00939
Epoch: 31506, Training Loss: 0.00727
Epoch: 31507, Training Loss: 0.00822
Epoch: 31507, Training Loss: 0.00768
Epoch: 31507, Training Loss: 0.00939
Epoch: 31507, Training Loss: 0.00727
Epoch: 31508, Training Loss: 0.00822
Epoch: 31508, Training Loss: 0.00768
Epoch: 31508, Training Loss: 0.00939
Epoch: 31508, Training Loss: 0.00727
Epoch: 31509, Training Loss: 0.00822
Epoch: 31509, Training Loss: 0.00768
Epoch: 31509, Training Loss: 0.00939
Epoch: 31509, Training Loss: 0.00727
Epoch: 31510, Training Loss: 0.00822
Epoch: 31510, Training Loss: 0.00768
Epoch: 31510, Training Loss: 0.00939
Epoch: 31510, Training Loss: 0.00727
Epoch: 31511, Training Loss: 0.00822
Epoch: 31511, Training Loss: 0.00768
Epoch: 31511, Training Loss: 0.00939
Epoch: 31511, Training Loss: 0.00727
Epoch: 31512, Training Loss: 0.00822
Epoch: 31512, Training Loss: 0.00768
Epoch: 31512, Training Loss: 0.00938
Epoch: 31512, Training Loss: 0.00727
Epoch: 31513, Training Loss: 0.00822
Epoch: 31513, Training Loss: 0.00768
Epoch: 31513, Training Loss: 0.00938
Epoch: 31513, Training Loss: 0.00727
Epoch: 31514, Training Loss: 0.00822
Epoch: 31514, Training Loss: 0.00768
Epoch: 31514, Training Loss: 0.00938
Epoch: 31514, Training Loss: 0.00727
Epoch: 31515, Training Loss: 0.00822
Epoch: 31515, Training Loss: 0.00768
Epoch: 31515, Training Loss: 0.00938
Epoch: 31515, Training Loss: 0.00727
Epoch: 31516, Training Loss: 0.00822
Epoch: 31516, Training Loss: 0.00768
Epoch: 31516, Training Loss: 0.00938
Epoch: 31516, Training Loss: 0.00727
Epoch: 31517, Training Loss: 0.00822
Epoch: 31517, Training Loss: 0.00768
Epoch: 31517, Training Loss: 0.00938
Epoch: 31517, Training Loss: 0.00727
Epoch: 31518, Training Loss: 0.00822
Epoch: 31518, Training Loss: 0.00768
Epoch: 31518, Training Loss: 0.00938
Epoch: 31518, Training Loss: 0.00727
Epoch: 31519, Training Loss: 0.00822
Epoch: 31519, Training Loss: 0.00768
Epoch: 31519, Training Loss: 0.00938
Epoch: 31519, Training Loss: 0.00727
Epoch: 31520, Training Loss: 0.00822
Epoch: 31520, Training Loss: 0.00767
Epoch: 31520, Training Loss: 0.00938
Epoch: 31520, Training Loss: 0.00727
Epoch: 31521, Training Loss: 0.00822
Epoch: 31521, Training Loss: 0.00767
Epoch: 31521, Training Loss: 0.00938
Epoch: 31521, Training Loss: 0.00727
Epoch: 31522, Training Loss: 0.00822
Epoch: 31522, Training Loss: 0.00767
Epoch: 31522, Training Loss: 0.00938
Epoch: 31522, Training Loss: 0.00727
Epoch: 31523, Training Loss: 0.00822
Epoch: 31523, Training Loss: 0.00767
Epoch: 31523, Training Loss: 0.00938
Epoch: 31523, Training Loss: 0.00727
Epoch: 31524, Training Loss: 0.00822
Epoch: 31524, Training Loss: 0.00767
Epoch: 31524, Training Loss: 0.00938
Epoch: 31524, Training Loss: 0.00727
Epoch: 31525, Training Loss: 0.00822
Epoch: 31525, Training Loss: 0.00767
Epoch: 31525, Training Loss: 0.00938
Epoch: 31525, Training Loss: 0.00727
Epoch: 31526, Training Loss: 0.00822
Epoch: 31526, Training Loss: 0.00767
Epoch: 31526, Training Loss: 0.00938
Epoch: 31526, Training Loss: 0.00727
Epoch: 31527, Training Loss: 0.00822
Epoch: 31527, Training Loss: 0.00767
Epoch: 31527, Training Loss: 0.00938
Epoch: 31527, Training Loss: 0.00727
Epoch: 31528, Training Loss: 0.00821
Epoch: 31528, Training Loss: 0.00767
Epoch: 31528, Training Loss: 0.00938
Epoch: 31528, Training Loss: 0.00727
Epoch: 31529, Training Loss: 0.00821
Epoch: 31529, Training Loss: 0.00767
Epoch: 31529, Training Loss: 0.00938
Epoch: 31529, Training Loss: 0.00727
Epoch: 31530, Training Loss: 0.00821
Epoch: 31530, Training Loss: 0.00767
Epoch: 31530, Training Loss: 0.00938
Epoch: 31530, Training Loss: 0.00727
Epoch: 31531, Training Loss: 0.00821
Epoch: 31531, Training Loss: 0.00767
Epoch: 31531, Training Loss: 0.00938
Epoch: 31531, Training Loss: 0.00727
Epoch: 31532, Training Loss: 0.00821
Epoch: 31532, Training Loss: 0.00767
Epoch: 31532, Training Loss: 0.00938
Epoch: 31532, Training Loss: 0.00727
Epoch: 31533, Training Loss: 0.00821
Epoch: 31533, Training Loss: 0.00767
Epoch: 31533, Training Loss: 0.00938
Epoch: 31533, Training Loss: 0.00727
Epoch: 31534, Training Loss: 0.00821
Epoch: 31534, Training Loss: 0.00767
Epoch: 31534, Training Loss: 0.00938
Epoch: 31534, Training Loss: 0.00727
Epoch: 31535, Training Loss: 0.00821
Epoch: 31535, Training Loss: 0.00767
Epoch: 31535, Training Loss: 0.00938
Epoch: 31535, Training Loss: 0.00727
Epoch: 31536, Training Loss: 0.00821
Epoch: 31536, Training Loss: 0.00767
Epoch: 31536, Training Loss: 0.00938
Epoch: 31536, Training Loss: 0.00727
Epoch: 31537, Training Loss: 0.00821
Epoch: 31537, Training Loss: 0.00767
Epoch: 31537, Training Loss: 0.00938
Epoch: 31537, Training Loss: 0.00727
Epoch: 31538, Training Loss: 0.00821
Epoch: 31538, Training Loss: 0.00767
Epoch: 31538, Training Loss: 0.00938
Epoch: 31538, Training Loss: 0.00727
Epoch: 31539, Training Loss: 0.00821
Epoch: 31539, Training Loss: 0.00767
Epoch: 31539, Training Loss: 0.00938
Epoch: 31539, Training Loss: 0.00727
Epoch: 31540, Training Loss: 0.00821
Epoch: 31540, Training Loss: 0.00767
Epoch: 31540, Training Loss: 0.00938
Epoch: 31540, Training Loss: 0.00727
Epoch: 31541, Training Loss: 0.00821
Epoch: 31541, Training Loss: 0.00767
Epoch: 31541, Training Loss: 0.00938
Epoch: 31541, Training Loss: 0.00727
Epoch: 31542, Training Loss: 0.00821
Epoch: 31542, Training Loss: 0.00767
Epoch: 31542, Training Loss: 0.00938
Epoch: 31542, Training Loss: 0.00726
Epoch: 31543, Training Loss: 0.00821
Epoch: 31543, Training Loss: 0.00767
Epoch: 31543, Training Loss: 0.00938
Epoch: 31543, Training Loss: 0.00726
Epoch: 31544, Training Loss: 0.00821
Epoch: 31544, Training Loss: 0.00767
Epoch: 31544, Training Loss: 0.00938
Epoch: 31544, Training Loss: 0.00726
Epoch: 31545, Training Loss: 0.00821
Epoch: 31545, Training Loss: 0.00767
Epoch: 31545, Training Loss: 0.00938
Epoch: 31545, Training Loss: 0.00726
Epoch: 31546, Training Loss: 0.00821
Epoch: 31546, Training Loss: 0.00767
Epoch: 31546, Training Loss: 0.00938
Epoch: 31546, Training Loss: 0.00726
Epoch: 31547, Training Loss: 0.00821
Epoch: 31547, Training Loss: 0.00767
Epoch: 31547, Training Loss: 0.00938
Epoch: 31547, Training Loss: 0.00726
Epoch: 31548, Training Loss: 0.00821
Epoch: 31548, Training Loss: 0.00767
Epoch: 31548, Training Loss: 0.00938
Epoch: 31548, Training Loss: 0.00726
Epoch: 31549, Training Loss: 0.00821
Epoch: 31549, Training Loss: 0.00767
Epoch: 31549, Training Loss: 0.00938
Epoch: 31549, Training Loss: 0.00726
Epoch: 31550, Training Loss: 0.00821
Epoch: 31550, Training Loss: 0.00767
Epoch: 31550, Training Loss: 0.00938
Epoch: 31550, Training Loss: 0.00726
Epoch: 31551, Training Loss: 0.00821
Epoch: 31551, Training Loss: 0.00767
Epoch: 31551, Training Loss: 0.00938
Epoch: 31551, Training Loss: 0.00726
Epoch: 31552, Training Loss: 0.00821
Epoch: 31552, Training Loss: 0.00767
Epoch: 31552, Training Loss: 0.00938
Epoch: 31552, Training Loss: 0.00726
Epoch: 31553, Training Loss: 0.00821
Epoch: 31553, Training Loss: 0.00767
Epoch: 31553, Training Loss: 0.00938
Epoch: 31553, Training Loss: 0.00726
Epoch: 31554, Training Loss: 0.00821
Epoch: 31554, Training Loss: 0.00767
Epoch: 31554, Training Loss: 0.00938
Epoch: 31554, Training Loss: 0.00726
Epoch: 31555, Training Loss: 0.00821
Epoch: 31555, Training Loss: 0.00767
Epoch: 31555, Training Loss: 0.00938
Epoch: 31555, Training Loss: 0.00726
Epoch: 31556, Training Loss: 0.00821
Epoch: 31556, Training Loss: 0.00767
Epoch: 31556, Training Loss: 0.00938
Epoch: 31556, Training Loss: 0.00726
Epoch: 31557, Training Loss: 0.00821
Epoch: 31557, Training Loss: 0.00767
Epoch: 31557, Training Loss: 0.00938
Epoch: 31557, Training Loss: 0.00726
Epoch: 31558, Training Loss: 0.00821
Epoch: 31558, Training Loss: 0.00767
Epoch: 31558, Training Loss: 0.00938
Epoch: 31558, Training Loss: 0.00726
Epoch: 31559, Training Loss: 0.00821
Epoch: 31559, Training Loss: 0.00767
Epoch: 31559, Training Loss: 0.00938
Epoch: 31559, Training Loss: 0.00726
Epoch: 31560, Training Loss: 0.00821
Epoch: 31560, Training Loss: 0.00767
Epoch: 31560, Training Loss: 0.00938
Epoch: 31560, Training Loss: 0.00726
Epoch: 31561, Training Loss: 0.00821
Epoch: 31561, Training Loss: 0.00767
Epoch: 31561, Training Loss: 0.00938
Epoch: 31561, Training Loss: 0.00726
Epoch: 31562, Training Loss: 0.00821
Epoch: 31562, Training Loss: 0.00767
Epoch: 31562, Training Loss: 0.00938
Epoch: 31562, Training Loss: 0.00726
Epoch: 31563, Training Loss: 0.00821
Epoch: 31563, Training Loss: 0.00767
Epoch: 31563, Training Loss: 0.00938
Epoch: 31563, Training Loss: 0.00726
Epoch: 31564, Training Loss: 0.00821
Epoch: 31564, Training Loss: 0.00767
Epoch: 31564, Training Loss: 0.00938
Epoch: 31564, Training Loss: 0.00726
Epoch: 31565, Training Loss: 0.00821
Epoch: 31565, Training Loss: 0.00767
Epoch: 31565, Training Loss: 0.00938
Epoch: 31565, Training Loss: 0.00726
Epoch: 31566, Training Loss: 0.00821
Epoch: 31566, Training Loss: 0.00767
Epoch: 31566, Training Loss: 0.00938
Epoch: 31566, Training Loss: 0.00726
Epoch: 31567, Training Loss: 0.00821
Epoch: 31567, Training Loss: 0.00767
Epoch: 31567, Training Loss: 0.00938
Epoch: 31567, Training Loss: 0.00726
Epoch: 31568, Training Loss: 0.00821
Epoch: 31568, Training Loss: 0.00767
Epoch: 31568, Training Loss: 0.00938
Epoch: 31568, Training Loss: 0.00726
Epoch: 31569, Training Loss: 0.00821
Epoch: 31569, Training Loss: 0.00767
Epoch: 31569, Training Loss: 0.00938
Epoch: 31569, Training Loss: 0.00726
Epoch: 31570, Training Loss: 0.00821
Epoch: 31570, Training Loss: 0.00767
Epoch: 31570, Training Loss: 0.00938
Epoch: 31570, Training Loss: 0.00726
Epoch: 31571, Training Loss: 0.00821
Epoch: 31571, Training Loss: 0.00767
Epoch: 31571, Training Loss: 0.00938
Epoch: 31571, Training Loss: 0.00726
Epoch: 31572, Training Loss: 0.00821
Epoch: 31572, Training Loss: 0.00767
Epoch: 31572, Training Loss: 0.00938
Epoch: 31572, Training Loss: 0.00726
Epoch: 31573, Training Loss: 0.00821
Epoch: 31573, Training Loss: 0.00767
Epoch: 31573, Training Loss: 0.00938
Epoch: 31573, Training Loss: 0.00726
Epoch: 31574, Training Loss: 0.00821
Epoch: 31574, Training Loss: 0.00767
Epoch: 31574, Training Loss: 0.00938
Epoch: 31574, Training Loss: 0.00726
Epoch: 31575, Training Loss: 0.00821
Epoch: 31575, Training Loss: 0.00767
Epoch: 31575, Training Loss: 0.00938
Epoch: 31575, Training Loss: 0.00726
Epoch: 31576, Training Loss: 0.00821
Epoch: 31576, Training Loss: 0.00767
Epoch: 31576, Training Loss: 0.00937
Epoch: 31576, Training Loss: 0.00726
Epoch: 31577, Training Loss: 0.00821
Epoch: 31577, Training Loss: 0.00767
Epoch: 31577, Training Loss: 0.00937
Epoch: 31577, Training Loss: 0.00726
Epoch: 31578, Training Loss: 0.00821
Epoch: 31578, Training Loss: 0.00767
Epoch: 31578, Training Loss: 0.00937
Epoch: 31578, Training Loss: 0.00726
Epoch: 31579, Training Loss: 0.00821
Epoch: 31579, Training Loss: 0.00767
Epoch: 31579, Training Loss: 0.00937
Epoch: 31579, Training Loss: 0.00726
Epoch: 31580, Training Loss: 0.00821
Epoch: 31580, Training Loss: 0.00767
Epoch: 31580, Training Loss: 0.00937
Epoch: 31580, Training Loss: 0.00726
Epoch: 31581, Training Loss: 0.00821
Epoch: 31581, Training Loss: 0.00767
Epoch: 31581, Training Loss: 0.00937
Epoch: 31581, Training Loss: 0.00726
Epoch: 31582, Training Loss: 0.00821
Epoch: 31582, Training Loss: 0.00767
Epoch: 31582, Training Loss: 0.00937
Epoch: 31582, Training Loss: 0.00726
Epoch: 31583, Training Loss: 0.00821
Epoch: 31583, Training Loss: 0.00767
Epoch: 31583, Training Loss: 0.00937
Epoch: 31583, Training Loss: 0.00726
Epoch: 31584, Training Loss: 0.00821
Epoch: 31584, Training Loss: 0.00767
Epoch: 31584, Training Loss: 0.00937
Epoch: 31584, Training Loss: 0.00726
Epoch: 31585, Training Loss: 0.00821
Epoch: 31585, Training Loss: 0.00767
Epoch: 31585, Training Loss: 0.00937
Epoch: 31585, Training Loss: 0.00726
Epoch: 31586, Training Loss: 0.00821
Epoch: 31586, Training Loss: 0.00767
Epoch: 31586, Training Loss: 0.00937
Epoch: 31586, Training Loss: 0.00726
Epoch: 31587, Training Loss: 0.00821
Epoch: 31587, Training Loss: 0.00767
Epoch: 31587, Training Loss: 0.00937
Epoch: 31587, Training Loss: 0.00726
Epoch: 31588, Training Loss: 0.00821
Epoch: 31588, Training Loss: 0.00767
Epoch: 31588, Training Loss: 0.00937
Epoch: 31588, Training Loss: 0.00726
Epoch: 31589, Training Loss: 0.00821
Epoch: 31589, Training Loss: 0.00767
Epoch: 31589, Training Loss: 0.00937
Epoch: 31589, Training Loss: 0.00726
Epoch: 31590, Training Loss: 0.00821
Epoch: 31590, Training Loss: 0.00767
Epoch: 31590, Training Loss: 0.00937
Epoch: 31590, Training Loss: 0.00726
Epoch: 31591, Training Loss: 0.00821
Epoch: 31591, Training Loss: 0.00767
Epoch: 31591, Training Loss: 0.00937
Epoch: 31591, Training Loss: 0.00726
Epoch: 31592, Training Loss: 0.00821
Epoch: 31592, Training Loss: 0.00767
Epoch: 31592, Training Loss: 0.00937
Epoch: 31592, Training Loss: 0.00726
Epoch: 31593, Training Loss: 0.00821
Epoch: 31593, Training Loss: 0.00767
Epoch: 31593, Training Loss: 0.00937
Epoch: 31593, Training Loss: 0.00726
Epoch: 31594, Training Loss: 0.00821
Epoch: 31594, Training Loss: 0.00767
Epoch: 31594, Training Loss: 0.00937
Epoch: 31594, Training Loss: 0.00726
Epoch: 31595, Training Loss: 0.00821
Epoch: 31595, Training Loss: 0.00767
Epoch: 31595, Training Loss: 0.00937
Epoch: 31595, Training Loss: 0.00726
Epoch: 31596, Training Loss: 0.00821
Epoch: 31596, Training Loss: 0.00767
Epoch: 31596, Training Loss: 0.00937
Epoch: 31596, Training Loss: 0.00726
Epoch: 31597, Training Loss: 0.00821
Epoch: 31597, Training Loss: 0.00767
Epoch: 31597, Training Loss: 0.00937
Epoch: 31597, Training Loss: 0.00726
Epoch: 31598, Training Loss: 0.00821
Epoch: 31598, Training Loss: 0.00767
Epoch: 31598, Training Loss: 0.00937
Epoch: 31598, Training Loss: 0.00726
Epoch: 31599, Training Loss: 0.00821
Epoch: 31599, Training Loss: 0.00766
Epoch: 31599, Training Loss: 0.00937
Epoch: 31599, Training Loss: 0.00726
Epoch: 31600, Training Loss: 0.00820
Epoch: 31600, Training Loss: 0.00766
Epoch: 31600, Training Loss: 0.00937
Epoch: 31600, Training Loss: 0.00726
Epoch: 31601, Training Loss: 0.00820
Epoch: 31601, Training Loss: 0.00766
Epoch: 31601, Training Loss: 0.00937
Epoch: 31601, Training Loss: 0.00726
Epoch: 31602, Training Loss: 0.00820
Epoch: 31602, Training Loss: 0.00766
Epoch: 31602, Training Loss: 0.00937
Epoch: 31602, Training Loss: 0.00726
Epoch: 31603, Training Loss: 0.00820
Epoch: 31603, Training Loss: 0.00766
Epoch: 31603, Training Loss: 0.00937
Epoch: 31603, Training Loss: 0.00726
Epoch: 31604, Training Loss: 0.00820
Epoch: 31604, Training Loss: 0.00766
Epoch: 31604, Training Loss: 0.00937
Epoch: 31604, Training Loss: 0.00726
Epoch: 31605, Training Loss: 0.00820
Epoch: 31605, Training Loss: 0.00766
Epoch: 31605, Training Loss: 0.00937
Epoch: 31605, Training Loss: 0.00726
Epoch: 31606, Training Loss: 0.00820
Epoch: 31606, Training Loss: 0.00766
Epoch: 31606, Training Loss: 0.00937
Epoch: 31606, Training Loss: 0.00726
Epoch: 31607, Training Loss: 0.00820
Epoch: 31607, Training Loss: 0.00766
Epoch: 31607, Training Loss: 0.00937
Epoch: 31607, Training Loss: 0.00726
Epoch: 31608, Training Loss: 0.00820
Epoch: 31608, Training Loss: 0.00766
Epoch: 31608, Training Loss: 0.00937
Epoch: 31608, Training Loss: 0.00726
Epoch: 31609, Training Loss: 0.00820
Epoch: 31609, Training Loss: 0.00766
Epoch: 31609, Training Loss: 0.00937
Epoch: 31609, Training Loss: 0.00726
Epoch: 31610, Training Loss: 0.00820
Epoch: 31610, Training Loss: 0.00766
Epoch: 31610, Training Loss: 0.00937
Epoch: 31610, Training Loss: 0.00726
Epoch: 31611, Training Loss: 0.00820
Epoch: 31611, Training Loss: 0.00766
Epoch: 31611, Training Loss: 0.00937
Epoch: 31611, Training Loss: 0.00726
Epoch: 31612, Training Loss: 0.00820
Epoch: 31612, Training Loss: 0.00766
Epoch: 31612, Training Loss: 0.00937
Epoch: 31612, Training Loss: 0.00726
Epoch: 31613, Training Loss: 0.00820
Epoch: 31613, Training Loss: 0.00766
Epoch: 31613, Training Loss: 0.00937
Epoch: 31613, Training Loss: 0.00726
...
[output truncated: the four per-sample losses decrease very slowly over epochs 31613-31856, from roughly 0.00937 / 0.00820 / 0.00766 / 0.00726 to roughly 0.00933 / 0.00817 / 0.00763 / 0.00723]
...
Epoch: 31856, Training Loss: 0.00817
Epoch: 31856, Training Loss: 0.00763
Epoch: 31856, Training Loss: 0.00933
Epoch: 31856, Training Loss: 0.00723
Epoch: 31857, Training Loss: 0.00817
Epoch: 31857, Training Loss: 0.00763
Epoch: 31857, Training Loss: 0.00933
Epoch: 31857, Training Loss: 0.00723
Epoch: 31858, Training Loss: 0.00817
Epoch: 31858, Training Loss: 0.00763
Epoch: 31858, Training Loss: 0.00933
Epoch: 31858, Training Loss: 0.00723
Epoch: 31859, Training Loss: 0.00817
Epoch: 31859, Training Loss: 0.00763
Epoch: 31859, Training Loss: 0.00933
Epoch: 31859, Training Loss: 0.00723
Epoch: 31860, Training Loss: 0.00817
Epoch: 31860, Training Loss: 0.00763
Epoch: 31860, Training Loss: 0.00933
Epoch: 31860, Training Loss: 0.00723
Epoch: 31861, Training Loss: 0.00817
Epoch: 31861, Training Loss: 0.00763
Epoch: 31861, Training Loss: 0.00933
Epoch: 31861, Training Loss: 0.00723
Epoch: 31862, Training Loss: 0.00817
Epoch: 31862, Training Loss: 0.00763
Epoch: 31862, Training Loss: 0.00933
Epoch: 31862, Training Loss: 0.00723
Epoch: 31863, Training Loss: 0.00817
Epoch: 31863, Training Loss: 0.00763
Epoch: 31863, Training Loss: 0.00933
Epoch: 31863, Training Loss: 0.00723
Epoch: 31864, Training Loss: 0.00817
Epoch: 31864, Training Loss: 0.00763
Epoch: 31864, Training Loss: 0.00933
Epoch: 31864, Training Loss: 0.00723
Epoch: 31865, Training Loss: 0.00817
Epoch: 31865, Training Loss: 0.00763
Epoch: 31865, Training Loss: 0.00933
Epoch: 31865, Training Loss: 0.00723
Epoch: 31866, Training Loss: 0.00817
Epoch: 31866, Training Loss: 0.00763
Epoch: 31866, Training Loss: 0.00933
Epoch: 31866, Training Loss: 0.00723
Epoch: 31867, Training Loss: 0.00817
Epoch: 31867, Training Loss: 0.00763
Epoch: 31867, Training Loss: 0.00933
Epoch: 31867, Training Loss: 0.00723
Epoch: 31868, Training Loss: 0.00817
Epoch: 31868, Training Loss: 0.00763
Epoch: 31868, Training Loss: 0.00933
Epoch: 31868, Training Loss: 0.00723
Epoch: 31869, Training Loss: 0.00817
Epoch: 31869, Training Loss: 0.00763
Epoch: 31869, Training Loss: 0.00933
Epoch: 31869, Training Loss: 0.00723
Epoch: 31870, Training Loss: 0.00817
Epoch: 31870, Training Loss: 0.00763
Epoch: 31870, Training Loss: 0.00933
Epoch: 31870, Training Loss: 0.00723
Epoch: 31871, Training Loss: 0.00817
Epoch: 31871, Training Loss: 0.00763
Epoch: 31871, Training Loss: 0.00933
Epoch: 31871, Training Loss: 0.00723
Epoch: 31872, Training Loss: 0.00817
Epoch: 31872, Training Loss: 0.00763
Epoch: 31872, Training Loss: 0.00933
Epoch: 31872, Training Loss: 0.00723
Epoch: 31873, Training Loss: 0.00817
Epoch: 31873, Training Loss: 0.00763
Epoch: 31873, Training Loss: 0.00933
Epoch: 31873, Training Loss: 0.00723
Epoch: 31874, Training Loss: 0.00817
Epoch: 31874, Training Loss: 0.00763
Epoch: 31874, Training Loss: 0.00933
Epoch: 31874, Training Loss: 0.00723
Epoch: 31875, Training Loss: 0.00817
Epoch: 31875, Training Loss: 0.00763
Epoch: 31875, Training Loss: 0.00933
Epoch: 31875, Training Loss: 0.00723
Epoch: 31876, Training Loss: 0.00817
Epoch: 31876, Training Loss: 0.00763
Epoch: 31876, Training Loss: 0.00933
Epoch: 31876, Training Loss: 0.00723
Epoch: 31877, Training Loss: 0.00817
Epoch: 31877, Training Loss: 0.00763
Epoch: 31877, Training Loss: 0.00933
Epoch: 31877, Training Loss: 0.00722
Epoch: 31878, Training Loss: 0.00817
Epoch: 31878, Training Loss: 0.00763
Epoch: 31878, Training Loss: 0.00933
Epoch: 31878, Training Loss: 0.00722
Epoch: 31879, Training Loss: 0.00817
Epoch: 31879, Training Loss: 0.00763
Epoch: 31879, Training Loss: 0.00933
Epoch: 31879, Training Loss: 0.00722
Epoch: 31880, Training Loss: 0.00817
Epoch: 31880, Training Loss: 0.00763
Epoch: 31880, Training Loss: 0.00933
Epoch: 31880, Training Loss: 0.00722
Epoch: 31881, Training Loss: 0.00817
Epoch: 31881, Training Loss: 0.00763
Epoch: 31881, Training Loss: 0.00933
Epoch: 31881, Training Loss: 0.00722
Epoch: 31882, Training Loss: 0.00817
Epoch: 31882, Training Loss: 0.00763
Epoch: 31882, Training Loss: 0.00933
Epoch: 31882, Training Loss: 0.00722
Epoch: 31883, Training Loss: 0.00817
Epoch: 31883, Training Loss: 0.00763
Epoch: 31883, Training Loss: 0.00933
Epoch: 31883, Training Loss: 0.00722
Epoch: 31884, Training Loss: 0.00817
Epoch: 31884, Training Loss: 0.00763
Epoch: 31884, Training Loss: 0.00933
Epoch: 31884, Training Loss: 0.00722
Epoch: 31885, Training Loss: 0.00817
Epoch: 31885, Training Loss: 0.00763
Epoch: 31885, Training Loss: 0.00933
Epoch: 31885, Training Loss: 0.00722
Epoch: 31886, Training Loss: 0.00817
Epoch: 31886, Training Loss: 0.00763
Epoch: 31886, Training Loss: 0.00933
Epoch: 31886, Training Loss: 0.00722
Epoch: 31887, Training Loss: 0.00817
Epoch: 31887, Training Loss: 0.00763
Epoch: 31887, Training Loss: 0.00933
Epoch: 31887, Training Loss: 0.00722
Epoch: 31888, Training Loss: 0.00817
Epoch: 31888, Training Loss: 0.00763
Epoch: 31888, Training Loss: 0.00933
Epoch: 31888, Training Loss: 0.00722
Epoch: 31889, Training Loss: 0.00817
Epoch: 31889, Training Loss: 0.00763
Epoch: 31889, Training Loss: 0.00933
Epoch: 31889, Training Loss: 0.00722
Epoch: 31890, Training Loss: 0.00817
Epoch: 31890, Training Loss: 0.00763
Epoch: 31890, Training Loss: 0.00933
Epoch: 31890, Training Loss: 0.00722
Epoch: 31891, Training Loss: 0.00817
Epoch: 31891, Training Loss: 0.00763
Epoch: 31891, Training Loss: 0.00933
Epoch: 31891, Training Loss: 0.00722
Epoch: 31892, Training Loss: 0.00817
Epoch: 31892, Training Loss: 0.00763
Epoch: 31892, Training Loss: 0.00933
Epoch: 31892, Training Loss: 0.00722
Epoch: 31893, Training Loss: 0.00816
Epoch: 31893, Training Loss: 0.00763
Epoch: 31893, Training Loss: 0.00933
Epoch: 31893, Training Loss: 0.00722
Epoch: 31894, Training Loss: 0.00816
Epoch: 31894, Training Loss: 0.00763
Epoch: 31894, Training Loss: 0.00933
Epoch: 31894, Training Loss: 0.00722
Epoch: 31895, Training Loss: 0.00816
Epoch: 31895, Training Loss: 0.00763
Epoch: 31895, Training Loss: 0.00933
Epoch: 31895, Training Loss: 0.00722
Epoch: 31896, Training Loss: 0.00816
Epoch: 31896, Training Loss: 0.00763
Epoch: 31896, Training Loss: 0.00932
Epoch: 31896, Training Loss: 0.00722
Epoch: 31897, Training Loss: 0.00816
Epoch: 31897, Training Loss: 0.00763
Epoch: 31897, Training Loss: 0.00932
Epoch: 31897, Training Loss: 0.00722
Epoch: 31898, Training Loss: 0.00816
Epoch: 31898, Training Loss: 0.00763
Epoch: 31898, Training Loss: 0.00932
Epoch: 31898, Training Loss: 0.00722
Epoch: 31899, Training Loss: 0.00816
Epoch: 31899, Training Loss: 0.00763
Epoch: 31899, Training Loss: 0.00932
Epoch: 31899, Training Loss: 0.00722
Epoch: 31900, Training Loss: 0.00816
Epoch: 31900, Training Loss: 0.00763
Epoch: 31900, Training Loss: 0.00932
Epoch: 31900, Training Loss: 0.00722
Epoch: 31901, Training Loss: 0.00816
Epoch: 31901, Training Loss: 0.00763
Epoch: 31901, Training Loss: 0.00932
Epoch: 31901, Training Loss: 0.00722
Epoch: 31902, Training Loss: 0.00816
Epoch: 31902, Training Loss: 0.00763
Epoch: 31902, Training Loss: 0.00932
Epoch: 31902, Training Loss: 0.00722
Epoch: 31903, Training Loss: 0.00816
Epoch: 31903, Training Loss: 0.00763
Epoch: 31903, Training Loss: 0.00932
Epoch: 31903, Training Loss: 0.00722
Epoch: 31904, Training Loss: 0.00816
Epoch: 31904, Training Loss: 0.00763
Epoch: 31904, Training Loss: 0.00932
Epoch: 31904, Training Loss: 0.00722
Epoch: 31905, Training Loss: 0.00816
Epoch: 31905, Training Loss: 0.00763
Epoch: 31905, Training Loss: 0.00932
Epoch: 31905, Training Loss: 0.00722
Epoch: 31906, Training Loss: 0.00816
Epoch: 31906, Training Loss: 0.00763
Epoch: 31906, Training Loss: 0.00932
Epoch: 31906, Training Loss: 0.00722
Epoch: 31907, Training Loss: 0.00816
Epoch: 31907, Training Loss: 0.00763
Epoch: 31907, Training Loss: 0.00932
Epoch: 31907, Training Loss: 0.00722
Epoch: 31908, Training Loss: 0.00816
Epoch: 31908, Training Loss: 0.00763
Epoch: 31908, Training Loss: 0.00932
Epoch: 31908, Training Loss: 0.00722
Epoch: 31909, Training Loss: 0.00816
Epoch: 31909, Training Loss: 0.00763
Epoch: 31909, Training Loss: 0.00932
Epoch: 31909, Training Loss: 0.00722
Epoch: 31910, Training Loss: 0.00816
Epoch: 31910, Training Loss: 0.00763
Epoch: 31910, Training Loss: 0.00932
Epoch: 31910, Training Loss: 0.00722
Epoch: 31911, Training Loss: 0.00816
Epoch: 31911, Training Loss: 0.00763
Epoch: 31911, Training Loss: 0.00932
Epoch: 31911, Training Loss: 0.00722
Epoch: 31912, Training Loss: 0.00816
Epoch: 31912, Training Loss: 0.00763
Epoch: 31912, Training Loss: 0.00932
Epoch: 31912, Training Loss: 0.00722
Epoch: 31913, Training Loss: 0.00816
Epoch: 31913, Training Loss: 0.00763
Epoch: 31913, Training Loss: 0.00932
Epoch: 31913, Training Loss: 0.00722
Epoch: 31914, Training Loss: 0.00816
Epoch: 31914, Training Loss: 0.00763
Epoch: 31914, Training Loss: 0.00932
Epoch: 31914, Training Loss: 0.00722
Epoch: 31915, Training Loss: 0.00816
Epoch: 31915, Training Loss: 0.00763
Epoch: 31915, Training Loss: 0.00932
Epoch: 31915, Training Loss: 0.00722
Epoch: 31916, Training Loss: 0.00816
Epoch: 31916, Training Loss: 0.00763
Epoch: 31916, Training Loss: 0.00932
Epoch: 31916, Training Loss: 0.00722
Epoch: 31917, Training Loss: 0.00816
Epoch: 31917, Training Loss: 0.00762
Epoch: 31917, Training Loss: 0.00932
Epoch: 31917, Training Loss: 0.00722
Epoch: 31918, Training Loss: 0.00816
Epoch: 31918, Training Loss: 0.00762
Epoch: 31918, Training Loss: 0.00932
Epoch: 31918, Training Loss: 0.00722
Epoch: 31919, Training Loss: 0.00816
Epoch: 31919, Training Loss: 0.00762
Epoch: 31919, Training Loss: 0.00932
Epoch: 31919, Training Loss: 0.00722
Epoch: 31920, Training Loss: 0.00816
Epoch: 31920, Training Loss: 0.00762
Epoch: 31920, Training Loss: 0.00932
Epoch: 31920, Training Loss: 0.00722
Epoch: 31921, Training Loss: 0.00816
Epoch: 31921, Training Loss: 0.00762
Epoch: 31921, Training Loss: 0.00932
Epoch: 31921, Training Loss: 0.00722
Epoch: 31922, Training Loss: 0.00816
Epoch: 31922, Training Loss: 0.00762
Epoch: 31922, Training Loss: 0.00932
Epoch: 31922, Training Loss: 0.00722
Epoch: 31923, Training Loss: 0.00816
Epoch: 31923, Training Loss: 0.00762
Epoch: 31923, Training Loss: 0.00932
Epoch: 31923, Training Loss: 0.00722
Epoch: 31924, Training Loss: 0.00816
Epoch: 31924, Training Loss: 0.00762
Epoch: 31924, Training Loss: 0.00932
Epoch: 31924, Training Loss: 0.00722
Epoch: 31925, Training Loss: 0.00816
Epoch: 31925, Training Loss: 0.00762
Epoch: 31925, Training Loss: 0.00932
Epoch: 31925, Training Loss: 0.00722
Epoch: 31926, Training Loss: 0.00816
Epoch: 31926, Training Loss: 0.00762
Epoch: 31926, Training Loss: 0.00932
Epoch: 31926, Training Loss: 0.00722
Epoch: 31927, Training Loss: 0.00816
Epoch: 31927, Training Loss: 0.00762
Epoch: 31927, Training Loss: 0.00932
Epoch: 31927, Training Loss: 0.00722
Epoch: 31928, Training Loss: 0.00816
Epoch: 31928, Training Loss: 0.00762
Epoch: 31928, Training Loss: 0.00932
Epoch: 31928, Training Loss: 0.00722
Epoch: 31929, Training Loss: 0.00816
Epoch: 31929, Training Loss: 0.00762
Epoch: 31929, Training Loss: 0.00932
Epoch: 31929, Training Loss: 0.00722
Epoch: 31930, Training Loss: 0.00816
Epoch: 31930, Training Loss: 0.00762
Epoch: 31930, Training Loss: 0.00932
Epoch: 31930, Training Loss: 0.00722
Epoch: 31931, Training Loss: 0.00816
Epoch: 31931, Training Loss: 0.00762
Epoch: 31931, Training Loss: 0.00932
Epoch: 31931, Training Loss: 0.00722
Epoch: 31932, Training Loss: 0.00816
Epoch: 31932, Training Loss: 0.00762
Epoch: 31932, Training Loss: 0.00932
Epoch: 31932, Training Loss: 0.00722
Epoch: 31933, Training Loss: 0.00816
Epoch: 31933, Training Loss: 0.00762
Epoch: 31933, Training Loss: 0.00932
Epoch: 31933, Training Loss: 0.00722
Epoch: 31934, Training Loss: 0.00816
Epoch: 31934, Training Loss: 0.00762
Epoch: 31934, Training Loss: 0.00932
Epoch: 31934, Training Loss: 0.00722
Epoch: 31935, Training Loss: 0.00816
Epoch: 31935, Training Loss: 0.00762
Epoch: 31935, Training Loss: 0.00932
Epoch: 31935, Training Loss: 0.00722
Epoch: 31936, Training Loss: 0.00816
Epoch: 31936, Training Loss: 0.00762
Epoch: 31936, Training Loss: 0.00932
Epoch: 31936, Training Loss: 0.00722
Epoch: 31937, Training Loss: 0.00816
Epoch: 31937, Training Loss: 0.00762
Epoch: 31937, Training Loss: 0.00932
Epoch: 31937, Training Loss: 0.00722
Epoch: 31938, Training Loss: 0.00816
Epoch: 31938, Training Loss: 0.00762
Epoch: 31938, Training Loss: 0.00932
Epoch: 31938, Training Loss: 0.00722
Epoch: 31939, Training Loss: 0.00816
Epoch: 31939, Training Loss: 0.00762
Epoch: 31939, Training Loss: 0.00932
Epoch: 31939, Training Loss: 0.00722
Epoch: 31940, Training Loss: 0.00816
Epoch: 31940, Training Loss: 0.00762
Epoch: 31940, Training Loss: 0.00932
Epoch: 31940, Training Loss: 0.00722
Epoch: 31941, Training Loss: 0.00816
Epoch: 31941, Training Loss: 0.00762
Epoch: 31941, Training Loss: 0.00932
Epoch: 31941, Training Loss: 0.00722
Epoch: 31942, Training Loss: 0.00816
Epoch: 31942, Training Loss: 0.00762
Epoch: 31942, Training Loss: 0.00932
Epoch: 31942, Training Loss: 0.00722
Epoch: 31943, Training Loss: 0.00816
Epoch: 31943, Training Loss: 0.00762
Epoch: 31943, Training Loss: 0.00932
Epoch: 31943, Training Loss: 0.00722
Epoch: 31944, Training Loss: 0.00816
Epoch: 31944, Training Loss: 0.00762
Epoch: 31944, Training Loss: 0.00932
Epoch: 31944, Training Loss: 0.00722
Epoch: 31945, Training Loss: 0.00816
Epoch: 31945, Training Loss: 0.00762
Epoch: 31945, Training Loss: 0.00932
Epoch: 31945, Training Loss: 0.00722
Epoch: 31946, Training Loss: 0.00816
Epoch: 31946, Training Loss: 0.00762
Epoch: 31946, Training Loss: 0.00932
Epoch: 31946, Training Loss: 0.00722
Epoch: 31947, Training Loss: 0.00816
Epoch: 31947, Training Loss: 0.00762
Epoch: 31947, Training Loss: 0.00932
Epoch: 31947, Training Loss: 0.00722
Epoch: 31948, Training Loss: 0.00816
Epoch: 31948, Training Loss: 0.00762
Epoch: 31948, Training Loss: 0.00932
Epoch: 31948, Training Loss: 0.00722
Epoch: 31949, Training Loss: 0.00816
Epoch: 31949, Training Loss: 0.00762
Epoch: 31949, Training Loss: 0.00932
Epoch: 31949, Training Loss: 0.00722
Epoch: 31950, Training Loss: 0.00816
Epoch: 31950, Training Loss: 0.00762
Epoch: 31950, Training Loss: 0.00932
Epoch: 31950, Training Loss: 0.00722
Epoch: 31951, Training Loss: 0.00816
Epoch: 31951, Training Loss: 0.00762
Epoch: 31951, Training Loss: 0.00932
Epoch: 31951, Training Loss: 0.00722
Epoch: 31952, Training Loss: 0.00816
Epoch: 31952, Training Loss: 0.00762
Epoch: 31952, Training Loss: 0.00932
Epoch: 31952, Training Loss: 0.00722
Epoch: 31953, Training Loss: 0.00816
Epoch: 31953, Training Loss: 0.00762
Epoch: 31953, Training Loss: 0.00932
Epoch: 31953, Training Loss: 0.00722
Epoch: 31954, Training Loss: 0.00816
Epoch: 31954, Training Loss: 0.00762
Epoch: 31954, Training Loss: 0.00932
Epoch: 31954, Training Loss: 0.00722
Epoch: 31955, Training Loss: 0.00816
Epoch: 31955, Training Loss: 0.00762
Epoch: 31955, Training Loss: 0.00932
Epoch: 31955, Training Loss: 0.00722
Epoch: 31956, Training Loss: 0.00816
Epoch: 31956, Training Loss: 0.00762
Epoch: 31956, Training Loss: 0.00932
Epoch: 31956, Training Loss: 0.00722
Epoch: 31957, Training Loss: 0.00816
Epoch: 31957, Training Loss: 0.00762
Epoch: 31957, Training Loss: 0.00932
Epoch: 31957, Training Loss: 0.00722
Epoch: 31958, Training Loss: 0.00816
Epoch: 31958, Training Loss: 0.00762
Epoch: 31958, Training Loss: 0.00932
Epoch: 31958, Training Loss: 0.00722
Epoch: 31959, Training Loss: 0.00816
Epoch: 31959, Training Loss: 0.00762
Epoch: 31959, Training Loss: 0.00932
Epoch: 31959, Training Loss: 0.00722
Epoch: 31960, Training Loss: 0.00816
Epoch: 31960, Training Loss: 0.00762
Epoch: 31960, Training Loss: 0.00932
Epoch: 31960, Training Loss: 0.00722
Epoch: 31961, Training Loss: 0.00816
Epoch: 31961, Training Loss: 0.00762
Epoch: 31961, Training Loss: 0.00931
Epoch: 31961, Training Loss: 0.00722
Epoch: 31962, Training Loss: 0.00816
Epoch: 31962, Training Loss: 0.00762
Epoch: 31962, Training Loss: 0.00931
Epoch: 31962, Training Loss: 0.00721
Epoch: 31963, Training Loss: 0.00816
Epoch: 31963, Training Loss: 0.00762
Epoch: 31963, Training Loss: 0.00931
Epoch: 31963, Training Loss: 0.00721
Epoch: 31964, Training Loss: 0.00816
Epoch: 31964, Training Loss: 0.00762
Epoch: 31964, Training Loss: 0.00931
Epoch: 31964, Training Loss: 0.00721
Epoch: 31965, Training Loss: 0.00816
Epoch: 31965, Training Loss: 0.00762
Epoch: 31965, Training Loss: 0.00931
Epoch: 31965, Training Loss: 0.00721
Epoch: 31966, Training Loss: 0.00815
Epoch: 31966, Training Loss: 0.00762
Epoch: 31966, Training Loss: 0.00931
Epoch: 31966, Training Loss: 0.00721
Epoch: 31967, Training Loss: 0.00815
Epoch: 31967, Training Loss: 0.00762
Epoch: 31967, Training Loss: 0.00931
Epoch: 31967, Training Loss: 0.00721
Epoch: 31968, Training Loss: 0.00815
Epoch: 31968, Training Loss: 0.00762
Epoch: 31968, Training Loss: 0.00931
Epoch: 31968, Training Loss: 0.00721
Epoch: 31969, Training Loss: 0.00815
Epoch: 31969, Training Loss: 0.00762
Epoch: 31969, Training Loss: 0.00931
Epoch: 31969, Training Loss: 0.00721
Epoch: 31970, Training Loss: 0.00815
Epoch: 31970, Training Loss: 0.00762
Epoch: 31970, Training Loss: 0.00931
Epoch: 31970, Training Loss: 0.00721
Epoch: 31971, Training Loss: 0.00815
Epoch: 31971, Training Loss: 0.00762
Epoch: 31971, Training Loss: 0.00931
Epoch: 31971, Training Loss: 0.00721
Epoch: 31972, Training Loss: 0.00815
Epoch: 31972, Training Loss: 0.00762
Epoch: 31972, Training Loss: 0.00931
Epoch: 31972, Training Loss: 0.00721
Epoch: 31973, Training Loss: 0.00815
Epoch: 31973, Training Loss: 0.00762
Epoch: 31973, Training Loss: 0.00931
Epoch: 31973, Training Loss: 0.00721
Epoch: 31974, Training Loss: 0.00815
Epoch: 31974, Training Loss: 0.00762
Epoch: 31974, Training Loss: 0.00931
Epoch: 31974, Training Loss: 0.00721
Epoch: 31975, Training Loss: 0.00815
Epoch: 31975, Training Loss: 0.00762
Epoch: 31975, Training Loss: 0.00931
Epoch: 31975, Training Loss: 0.00721
Epoch: 31976, Training Loss: 0.00815
Epoch: 31976, Training Loss: 0.00762
Epoch: 31976, Training Loss: 0.00931
Epoch: 31976, Training Loss: 0.00721
Epoch: 31977, Training Loss: 0.00815
Epoch: 31977, Training Loss: 0.00762
Epoch: 31977, Training Loss: 0.00931
Epoch: 31977, Training Loss: 0.00721
Epoch: 31978, Training Loss: 0.00815
Epoch: 31978, Training Loss: 0.00762
Epoch: 31978, Training Loss: 0.00931
Epoch: 31978, Training Loss: 0.00721
Epoch: 31979, Training Loss: 0.00815
Epoch: 31979, Training Loss: 0.00762
Epoch: 31979, Training Loss: 0.00931
Epoch: 31979, Training Loss: 0.00721
Epoch: 31980, Training Loss: 0.00815
Epoch: 31980, Training Loss: 0.00762
Epoch: 31980, Training Loss: 0.00931
Epoch: 31980, Training Loss: 0.00721
Epoch: 31981, Training Loss: 0.00815
Epoch: 31981, Training Loss: 0.00762
Epoch: 31981, Training Loss: 0.00931
Epoch: 31981, Training Loss: 0.00721
Epoch: 31982, Training Loss: 0.00815
Epoch: 31982, Training Loss: 0.00762
Epoch: 31982, Training Loss: 0.00931
Epoch: 31982, Training Loss: 0.00721
Epoch: 31983, Training Loss: 0.00815
Epoch: 31983, Training Loss: 0.00762
Epoch: 31983, Training Loss: 0.00931
Epoch: 31983, Training Loss: 0.00721
Epoch: 31984, Training Loss: 0.00815
Epoch: 31984, Training Loss: 0.00762
Epoch: 31984, Training Loss: 0.00931
Epoch: 31984, Training Loss: 0.00721
Epoch: 31985, Training Loss: 0.00815
Epoch: 31985, Training Loss: 0.00762
Epoch: 31985, Training Loss: 0.00931
Epoch: 31985, Training Loss: 0.00721
Epoch: 31986, Training Loss: 0.00815
Epoch: 31986, Training Loss: 0.00762
Epoch: 31986, Training Loss: 0.00931
Epoch: 31986, Training Loss: 0.00721
Epoch: 31987, Training Loss: 0.00815
Epoch: 31987, Training Loss: 0.00762
Epoch: 31987, Training Loss: 0.00931
Epoch: 31987, Training Loss: 0.00721
Epoch: 31988, Training Loss: 0.00815
Epoch: 31988, Training Loss: 0.00762
Epoch: 31988, Training Loss: 0.00931
Epoch: 31988, Training Loss: 0.00721
Epoch: 31989, Training Loss: 0.00815
Epoch: 31989, Training Loss: 0.00762
Epoch: 31989, Training Loss: 0.00931
Epoch: 31989, Training Loss: 0.00721
Epoch: 31990, Training Loss: 0.00815
Epoch: 31990, Training Loss: 0.00762
Epoch: 31990, Training Loss: 0.00931
Epoch: 31990, Training Loss: 0.00721
Epoch: 31991, Training Loss: 0.00815
Epoch: 31991, Training Loss: 0.00762
Epoch: 31991, Training Loss: 0.00931
Epoch: 31991, Training Loss: 0.00721
Epoch: 31992, Training Loss: 0.00815
Epoch: 31992, Training Loss: 0.00762
Epoch: 31992, Training Loss: 0.00931
Epoch: 31992, Training Loss: 0.00721
Epoch: 31993, Training Loss: 0.00815
Epoch: 31993, Training Loss: 0.00762
Epoch: 31993, Training Loss: 0.00931
Epoch: 31993, Training Loss: 0.00721
Epoch: 31994, Training Loss: 0.00815
Epoch: 31994, Training Loss: 0.00762
Epoch: 31994, Training Loss: 0.00931
Epoch: 31994, Training Loss: 0.00721
Epoch: 31995, Training Loss: 0.00815
Epoch: 31995, Training Loss: 0.00762
Epoch: 31995, Training Loss: 0.00931
Epoch: 31995, Training Loss: 0.00721
Epoch: 31996, Training Loss: 0.00815
Epoch: 31996, Training Loss: 0.00762
Epoch: 31996, Training Loss: 0.00931
Epoch: 31996, Training Loss: 0.00721
Epoch: 31997, Training Loss: 0.00815
Epoch: 31997, Training Loss: 0.00762
Epoch: 31997, Training Loss: 0.00931
Epoch: 31997, Training Loss: 0.00721
Epoch: 31998, Training Loss: 0.00815
Epoch: 31998, Training Loss: 0.00761
Epoch: 31998, Training Loss: 0.00931
Epoch: 31998, Training Loss: 0.00721
Epoch: 31999, Training Loss: 0.00815
Epoch: 31999, Training Loss: 0.00761
Epoch: 31999, Training Loss: 0.00931
Epoch: 31999, Training Loss: 0.00721
Epoch: 32000, Training Loss: 0.00815
Epoch: 32000, Training Loss: 0.00761
Epoch: 32000, Training Loss: 0.00931
Epoch: 32000, Training Loss: 0.00721
Epoch: 32001, Training Loss: 0.00815
Epoch: 32001, Training Loss: 0.00761
Epoch: 32001, Training Loss: 0.00931
Epoch: 32001, Training Loss: 0.00721
Epoch: 32002, Training Loss: 0.00815
Epoch: 32002, Training Loss: 0.00761
Epoch: 32002, Training Loss: 0.00931
Epoch: 32002, Training Loss: 0.00721
Epoch: 32003, Training Loss: 0.00815
Epoch: 32003, Training Loss: 0.00761
Epoch: 32003, Training Loss: 0.00931
Epoch: 32003, Training Loss: 0.00721
Epoch: 32004, Training Loss: 0.00815
Epoch: 32004, Training Loss: 0.00761
Epoch: 32004, Training Loss: 0.00931
Epoch: 32004, Training Loss: 0.00721
Epoch: 32005, Training Loss: 0.00815
Epoch: 32005, Training Loss: 0.00761
Epoch: 32005, Training Loss: 0.00931
Epoch: 32005, Training Loss: 0.00721
Epoch: 32006, Training Loss: 0.00815
Epoch: 32006, Training Loss: 0.00761
Epoch: 32006, Training Loss: 0.00931
Epoch: 32006, Training Loss: 0.00721
Epoch: 32007, Training Loss: 0.00815
Epoch: 32007, Training Loss: 0.00761
Epoch: 32007, Training Loss: 0.00931
Epoch: 32007, Training Loss: 0.00721
Epoch: 32008, Training Loss: 0.00815
Epoch: 32008, Training Loss: 0.00761
Epoch: 32008, Training Loss: 0.00931
Epoch: 32008, Training Loss: 0.00721
Epoch: 32009, Training Loss: 0.00815
Epoch: 32009, Training Loss: 0.00761
Epoch: 32009, Training Loss: 0.00931
Epoch: 32009, Training Loss: 0.00721
Epoch: 32010, Training Loss: 0.00815
Epoch: 32010, Training Loss: 0.00761
Epoch: 32010, Training Loss: 0.00931
Epoch: 32010, Training Loss: 0.00721
Epoch: 32011, Training Loss: 0.00815
Epoch: 32011, Training Loss: 0.00761
Epoch: 32011, Training Loss: 0.00931
Epoch: 32011, Training Loss: 0.00721
Epoch: 32012, Training Loss: 0.00815
Epoch: 32012, Training Loss: 0.00761
Epoch: 32012, Training Loss: 0.00931
Epoch: 32012, Training Loss: 0.00721
Epoch: 32013, Training Loss: 0.00815
Epoch: 32013, Training Loss: 0.00761
Epoch: 32013, Training Loss: 0.00931
Epoch: 32013, Training Loss: 0.00721
Epoch: 32014, Training Loss: 0.00815
Epoch: 32014, Training Loss: 0.00761
Epoch: 32014, Training Loss: 0.00931
Epoch: 32014, Training Loss: 0.00721
Epoch: 32015, Training Loss: 0.00815
Epoch: 32015, Training Loss: 0.00761
Epoch: 32015, Training Loss: 0.00931
Epoch: 32015, Training Loss: 0.00721
Epoch: 32016, Training Loss: 0.00815
Epoch: 32016, Training Loss: 0.00761
Epoch: 32016, Training Loss: 0.00931
Epoch: 32016, Training Loss: 0.00721
Epoch: 32017, Training Loss: 0.00815
Epoch: 32017, Training Loss: 0.00761
Epoch: 32017, Training Loss: 0.00931
Epoch: 32017, Training Loss: 0.00721
Epoch: 32018, Training Loss: 0.00815
Epoch: 32018, Training Loss: 0.00761
Epoch: 32018, Training Loss: 0.00931
Epoch: 32018, Training Loss: 0.00721
Epoch: 32019, Training Loss: 0.00815
Epoch: 32019, Training Loss: 0.00761
Epoch: 32019, Training Loss: 0.00931
Epoch: 32019, Training Loss: 0.00721
Epoch: 32020, Training Loss: 0.00815
Epoch: 32020, Training Loss: 0.00761
Epoch: 32020, Training Loss: 0.00931
Epoch: 32020, Training Loss: 0.00721
Epoch: 32021, Training Loss: 0.00815
Epoch: 32021, Training Loss: 0.00761
Epoch: 32021, Training Loss: 0.00931
Epoch: 32021, Training Loss: 0.00721
Epoch: 32022, Training Loss: 0.00815
Epoch: 32022, Training Loss: 0.00761
Epoch: 32022, Training Loss: 0.00931
Epoch: 32022, Training Loss: 0.00721
Epoch: 32023, Training Loss: 0.00815
Epoch: 32023, Training Loss: 0.00761
Epoch: 32023, Training Loss: 0.00931
Epoch: 32023, Training Loss: 0.00721
Epoch: 32024, Training Loss: 0.00815
Epoch: 32024, Training Loss: 0.00761
Epoch: 32024, Training Loss: 0.00931
Epoch: 32024, Training Loss: 0.00721
Epoch: 32025, Training Loss: 0.00815
Epoch: 32025, Training Loss: 0.00761
Epoch: 32025, Training Loss: 0.00931
Epoch: 32025, Training Loss: 0.00721
Epoch: 32026, Training Loss: 0.00815
Epoch: 32026, Training Loss: 0.00761
Epoch: 32026, Training Loss: 0.00930
Epoch: 32026, Training Loss: 0.00721
Epoch: 32027, Training Loss: 0.00815
Epoch: 32027, Training Loss: 0.00761
Epoch: 32027, Training Loss: 0.00930
Epoch: 32027, Training Loss: 0.00721
Epoch: 32028, Training Loss: 0.00815
Epoch: 32028, Training Loss: 0.00761
Epoch: 32028, Training Loss: 0.00930
Epoch: 32028, Training Loss: 0.00721
Epoch: 32029, Training Loss: 0.00815
Epoch: 32029, Training Loss: 0.00761
Epoch: 32029, Training Loss: 0.00930
Epoch: 32029, Training Loss: 0.00721
Epoch: 32030, Training Loss: 0.00815
Epoch: 32030, Training Loss: 0.00761
Epoch: 32030, Training Loss: 0.00930
Epoch: 32030, Training Loss: 0.00721
Epoch: 32031, Training Loss: 0.00815
Epoch: 32031, Training Loss: 0.00761
Epoch: 32031, Training Loss: 0.00930
Epoch: 32031, Training Loss: 0.00721
Epoch: 32032, Training Loss: 0.00815
Epoch: 32032, Training Loss: 0.00761
Epoch: 32032, Training Loss: 0.00930
Epoch: 32032, Training Loss: 0.00721
Epoch: 32033, Training Loss: 0.00815
Epoch: 32033, Training Loss: 0.00761
Epoch: 32033, Training Loss: 0.00930
Epoch: 32033, Training Loss: 0.00721
Epoch: 32034, Training Loss: 0.00815
Epoch: 32034, Training Loss: 0.00761
Epoch: 32034, Training Loss: 0.00930
Epoch: 32034, Training Loss: 0.00721
Epoch: 32035, Training Loss: 0.00815
Epoch: 32035, Training Loss: 0.00761
Epoch: 32035, Training Loss: 0.00930
Epoch: 32035, Training Loss: 0.00721
Epoch: 32036, Training Loss: 0.00815
Epoch: 32036, Training Loss: 0.00761
Epoch: 32036, Training Loss: 0.00930
Epoch: 32036, Training Loss: 0.00721
Epoch: 32037, Training Loss: 0.00815
Epoch: 32037, Training Loss: 0.00761
Epoch: 32037, Training Loss: 0.00930
Epoch: 32037, Training Loss: 0.00721
Epoch: 32038, Training Loss: 0.00815
Epoch: 32038, Training Loss: 0.00761
Epoch: 32038, Training Loss: 0.00930
Epoch: 32038, Training Loss: 0.00721
Epoch: 32039, Training Loss: 0.00815
Epoch: 32039, Training Loss: 0.00761
Epoch: 32039, Training Loss: 0.00930
Epoch: 32039, Training Loss: 0.00721
Epoch: 32040, Training Loss: 0.00814
Epoch: 32040, Training Loss: 0.00761
Epoch: 32040, Training Loss: 0.00930
Epoch: 32040, Training Loss: 0.00721
Epoch: 32041, Training Loss: 0.00814
Epoch: 32041, Training Loss: 0.00761
Epoch: 32041, Training Loss: 0.00930
Epoch: 32041, Training Loss: 0.00721
Epoch: 32042, Training Loss: 0.00814
Epoch: 32042, Training Loss: 0.00761
Epoch: 32042, Training Loss: 0.00930
Epoch: 32042, Training Loss: 0.00721
Epoch: 32043, Training Loss: 0.00814
Epoch: 32043, Training Loss: 0.00761
Epoch: 32043, Training Loss: 0.00930
Epoch: 32043, Training Loss: 0.00721
Epoch: 32044, Training Loss: 0.00814
Epoch: 32044, Training Loss: 0.00761
Epoch: 32044, Training Loss: 0.00930
Epoch: 32044, Training Loss: 0.00721
Epoch: 32045, Training Loss: 0.00814
Epoch: 32045, Training Loss: 0.00761
Epoch: 32045, Training Loss: 0.00930
Epoch: 32045, Training Loss: 0.00721
Epoch: 32046, Training Loss: 0.00814
Epoch: 32046, Training Loss: 0.00761
Epoch: 32046, Training Loss: 0.00930
Epoch: 32046, Training Loss: 0.00720
...
Epoch: 32289, Training Loss: 0.00811

[Output truncated. Four loss values are printed per epoch, one for each XOR training pattern. Over epochs 32045 to 32289 they decrease only marginally, from roughly 0.00814 / 0.00761 / 0.00930 / 0.00721 to roughly 0.00811 / 0.00758 / 0.00926 / 0.00718, illustrating the very slow convergence of the basic quadratic-cost/sigmoid network toward the 0.008 error target.]
Epoch: 32289, Training Loss: 0.00758
Epoch: 32289, Training Loss: 0.00926
Epoch: 32289, Training Loss: 0.00718
Epoch: 32290, Training Loss: 0.00811
Epoch: 32290, Training Loss: 0.00758
Epoch: 32290, Training Loss: 0.00926
Epoch: 32290, Training Loss: 0.00718
Epoch: 32291, Training Loss: 0.00811
Epoch: 32291, Training Loss: 0.00758
Epoch: 32291, Training Loss: 0.00926
Epoch: 32291, Training Loss: 0.00718
Epoch: 32292, Training Loss: 0.00811
Epoch: 32292, Training Loss: 0.00758
Epoch: 32292, Training Loss: 0.00926
Epoch: 32292, Training Loss: 0.00718
Epoch: 32293, Training Loss: 0.00811
Epoch: 32293, Training Loss: 0.00758
Epoch: 32293, Training Loss: 0.00926
Epoch: 32293, Training Loss: 0.00718
Epoch: 32294, Training Loss: 0.00811
Epoch: 32294, Training Loss: 0.00758
Epoch: 32294, Training Loss: 0.00926
Epoch: 32294, Training Loss: 0.00718
Epoch: 32295, Training Loss: 0.00811
Epoch: 32295, Training Loss: 0.00758
Epoch: 32295, Training Loss: 0.00926
Epoch: 32295, Training Loss: 0.00718
Epoch: 32296, Training Loss: 0.00811
Epoch: 32296, Training Loss: 0.00758
Epoch: 32296, Training Loss: 0.00926
Epoch: 32296, Training Loss: 0.00718
Epoch: 32297, Training Loss: 0.00811
Epoch: 32297, Training Loss: 0.00758
Epoch: 32297, Training Loss: 0.00926
Epoch: 32297, Training Loss: 0.00718
Epoch: 32298, Training Loss: 0.00811
Epoch: 32298, Training Loss: 0.00758
Epoch: 32298, Training Loss: 0.00926
Epoch: 32298, Training Loss: 0.00718
Epoch: 32299, Training Loss: 0.00811
Epoch: 32299, Training Loss: 0.00758
Epoch: 32299, Training Loss: 0.00926
Epoch: 32299, Training Loss: 0.00718
Epoch: 32300, Training Loss: 0.00811
Epoch: 32300, Training Loss: 0.00758
Epoch: 32300, Training Loss: 0.00926
Epoch: 32300, Training Loss: 0.00718
Epoch: 32301, Training Loss: 0.00811
Epoch: 32301, Training Loss: 0.00758
Epoch: 32301, Training Loss: 0.00926
Epoch: 32301, Training Loss: 0.00718
Epoch: 32302, Training Loss: 0.00811
Epoch: 32302, Training Loss: 0.00758
Epoch: 32302, Training Loss: 0.00926
Epoch: 32302, Training Loss: 0.00718
Epoch: 32303, Training Loss: 0.00811
Epoch: 32303, Training Loss: 0.00758
Epoch: 32303, Training Loss: 0.00926
Epoch: 32303, Training Loss: 0.00717
Epoch: 32304, Training Loss: 0.00811
Epoch: 32304, Training Loss: 0.00758
Epoch: 32304, Training Loss: 0.00926
Epoch: 32304, Training Loss: 0.00717
Epoch: 32305, Training Loss: 0.00811
Epoch: 32305, Training Loss: 0.00758
Epoch: 32305, Training Loss: 0.00926
Epoch: 32305, Training Loss: 0.00717
Epoch: 32306, Training Loss: 0.00811
Epoch: 32306, Training Loss: 0.00758
Epoch: 32306, Training Loss: 0.00926
Epoch: 32306, Training Loss: 0.00717
Epoch: 32307, Training Loss: 0.00811
Epoch: 32307, Training Loss: 0.00758
Epoch: 32307, Training Loss: 0.00926
Epoch: 32307, Training Loss: 0.00717
Epoch: 32308, Training Loss: 0.00811
Epoch: 32308, Training Loss: 0.00758
Epoch: 32308, Training Loss: 0.00926
Epoch: 32308, Training Loss: 0.00717
Epoch: 32309, Training Loss: 0.00811
Epoch: 32309, Training Loss: 0.00758
Epoch: 32309, Training Loss: 0.00926
Epoch: 32309, Training Loss: 0.00717
Epoch: 32310, Training Loss: 0.00811
Epoch: 32310, Training Loss: 0.00758
Epoch: 32310, Training Loss: 0.00926
Epoch: 32310, Training Loss: 0.00717
Epoch: 32311, Training Loss: 0.00811
Epoch: 32311, Training Loss: 0.00758
Epoch: 32311, Training Loss: 0.00926
Epoch: 32311, Training Loss: 0.00717
Epoch: 32312, Training Loss: 0.00811
Epoch: 32312, Training Loss: 0.00758
Epoch: 32312, Training Loss: 0.00926
Epoch: 32312, Training Loss: 0.00717
Epoch: 32313, Training Loss: 0.00811
Epoch: 32313, Training Loss: 0.00758
Epoch: 32313, Training Loss: 0.00926
Epoch: 32313, Training Loss: 0.00717
Epoch: 32314, Training Loss: 0.00811
Epoch: 32314, Training Loss: 0.00758
Epoch: 32314, Training Loss: 0.00926
Epoch: 32314, Training Loss: 0.00717
Epoch: 32315, Training Loss: 0.00811
Epoch: 32315, Training Loss: 0.00758
Epoch: 32315, Training Loss: 0.00926
Epoch: 32315, Training Loss: 0.00717
Epoch: 32316, Training Loss: 0.00811
Epoch: 32316, Training Loss: 0.00758
Epoch: 32316, Training Loss: 0.00926
Epoch: 32316, Training Loss: 0.00717
Epoch: 32317, Training Loss: 0.00811
Epoch: 32317, Training Loss: 0.00758
Epoch: 32317, Training Loss: 0.00926
Epoch: 32317, Training Loss: 0.00717
Epoch: 32318, Training Loss: 0.00811
Epoch: 32318, Training Loss: 0.00758
Epoch: 32318, Training Loss: 0.00926
Epoch: 32318, Training Loss: 0.00717
Epoch: 32319, Training Loss: 0.00811
Epoch: 32319, Training Loss: 0.00758
Epoch: 32319, Training Loss: 0.00926
Epoch: 32319, Training Loss: 0.00717
Epoch: 32320, Training Loss: 0.00811
Epoch: 32320, Training Loss: 0.00758
Epoch: 32320, Training Loss: 0.00926
Epoch: 32320, Training Loss: 0.00717
Epoch: 32321, Training Loss: 0.00811
Epoch: 32321, Training Loss: 0.00758
Epoch: 32321, Training Loss: 0.00926
Epoch: 32321, Training Loss: 0.00717
Epoch: 32322, Training Loss: 0.00811
Epoch: 32322, Training Loss: 0.00757
Epoch: 32322, Training Loss: 0.00926
Epoch: 32322, Training Loss: 0.00717
Epoch: 32323, Training Loss: 0.00811
Epoch: 32323, Training Loss: 0.00757
Epoch: 32323, Training Loss: 0.00926
Epoch: 32323, Training Loss: 0.00717
Epoch: 32324, Training Loss: 0.00811
Epoch: 32324, Training Loss: 0.00757
Epoch: 32324, Training Loss: 0.00926
Epoch: 32324, Training Loss: 0.00717
Epoch: 32325, Training Loss: 0.00811
Epoch: 32325, Training Loss: 0.00757
Epoch: 32325, Training Loss: 0.00926
Epoch: 32325, Training Loss: 0.00717
Epoch: 32326, Training Loss: 0.00811
Epoch: 32326, Training Loss: 0.00757
Epoch: 32326, Training Loss: 0.00926
Epoch: 32326, Training Loss: 0.00717
Epoch: 32327, Training Loss: 0.00811
Epoch: 32327, Training Loss: 0.00757
Epoch: 32327, Training Loss: 0.00926
Epoch: 32327, Training Loss: 0.00717
Epoch: 32328, Training Loss: 0.00811
Epoch: 32328, Training Loss: 0.00757
Epoch: 32328, Training Loss: 0.00926
Epoch: 32328, Training Loss: 0.00717
Epoch: 32329, Training Loss: 0.00811
Epoch: 32329, Training Loss: 0.00757
Epoch: 32329, Training Loss: 0.00926
Epoch: 32329, Training Loss: 0.00717
Epoch: 32330, Training Loss: 0.00811
Epoch: 32330, Training Loss: 0.00757
Epoch: 32330, Training Loss: 0.00926
Epoch: 32330, Training Loss: 0.00717
Epoch: 32331, Training Loss: 0.00811
Epoch: 32331, Training Loss: 0.00757
Epoch: 32331, Training Loss: 0.00926
Epoch: 32331, Training Loss: 0.00717
Epoch: 32332, Training Loss: 0.00811
Epoch: 32332, Training Loss: 0.00757
Epoch: 32332, Training Loss: 0.00926
Epoch: 32332, Training Loss: 0.00717
Epoch: 32333, Training Loss: 0.00811
Epoch: 32333, Training Loss: 0.00757
Epoch: 32333, Training Loss: 0.00926
Epoch: 32333, Training Loss: 0.00717
Epoch: 32334, Training Loss: 0.00811
Epoch: 32334, Training Loss: 0.00757
Epoch: 32334, Training Loss: 0.00926
Epoch: 32334, Training Loss: 0.00717
Epoch: 32335, Training Loss: 0.00811
Epoch: 32335, Training Loss: 0.00757
Epoch: 32335, Training Loss: 0.00926
Epoch: 32335, Training Loss: 0.00717
Epoch: 32336, Training Loss: 0.00811
Epoch: 32336, Training Loss: 0.00757
Epoch: 32336, Training Loss: 0.00926
Epoch: 32336, Training Loss: 0.00717
Epoch: 32337, Training Loss: 0.00811
Epoch: 32337, Training Loss: 0.00757
Epoch: 32337, Training Loss: 0.00926
Epoch: 32337, Training Loss: 0.00717
Epoch: 32338, Training Loss: 0.00811
Epoch: 32338, Training Loss: 0.00757
Epoch: 32338, Training Loss: 0.00926
Epoch: 32338, Training Loss: 0.00717
Epoch: 32339, Training Loss: 0.00810
Epoch: 32339, Training Loss: 0.00757
Epoch: 32339, Training Loss: 0.00926
Epoch: 32339, Training Loss: 0.00717
Epoch: 32340, Training Loss: 0.00810
Epoch: 32340, Training Loss: 0.00757
Epoch: 32340, Training Loss: 0.00926
Epoch: 32340, Training Loss: 0.00717
Epoch: 32341, Training Loss: 0.00810
Epoch: 32341, Training Loss: 0.00757
Epoch: 32341, Training Loss: 0.00926
Epoch: 32341, Training Loss: 0.00717
Epoch: 32342, Training Loss: 0.00810
Epoch: 32342, Training Loss: 0.00757
Epoch: 32342, Training Loss: 0.00926
Epoch: 32342, Training Loss: 0.00717
Epoch: 32343, Training Loss: 0.00810
Epoch: 32343, Training Loss: 0.00757
Epoch: 32343, Training Loss: 0.00926
Epoch: 32343, Training Loss: 0.00717
Epoch: 32344, Training Loss: 0.00810
Epoch: 32344, Training Loss: 0.00757
Epoch: 32344, Training Loss: 0.00926
Epoch: 32344, Training Loss: 0.00717
Epoch: 32345, Training Loss: 0.00810
Epoch: 32345, Training Loss: 0.00757
Epoch: 32345, Training Loss: 0.00926
Epoch: 32345, Training Loss: 0.00717
Epoch: 32346, Training Loss: 0.00810
Epoch: 32346, Training Loss: 0.00757
Epoch: 32346, Training Loss: 0.00926
Epoch: 32346, Training Loss: 0.00717
Epoch: 32347, Training Loss: 0.00810
Epoch: 32347, Training Loss: 0.00757
Epoch: 32347, Training Loss: 0.00926
Epoch: 32347, Training Loss: 0.00717
Epoch: 32348, Training Loss: 0.00810
Epoch: 32348, Training Loss: 0.00757
Epoch: 32348, Training Loss: 0.00926
Epoch: 32348, Training Loss: 0.00717
Epoch: 32349, Training Loss: 0.00810
Epoch: 32349, Training Loss: 0.00757
Epoch: 32349, Training Loss: 0.00926
Epoch: 32349, Training Loss: 0.00717
Epoch: 32350, Training Loss: 0.00810
Epoch: 32350, Training Loss: 0.00757
Epoch: 32350, Training Loss: 0.00926
Epoch: 32350, Training Loss: 0.00717
Epoch: 32351, Training Loss: 0.00810
Epoch: 32351, Training Loss: 0.00757
Epoch: 32351, Training Loss: 0.00926
Epoch: 32351, Training Loss: 0.00717
Epoch: 32352, Training Loss: 0.00810
Epoch: 32352, Training Loss: 0.00757
Epoch: 32352, Training Loss: 0.00926
Epoch: 32352, Training Loss: 0.00717
Epoch: 32353, Training Loss: 0.00810
Epoch: 32353, Training Loss: 0.00757
Epoch: 32353, Training Loss: 0.00926
Epoch: 32353, Training Loss: 0.00717
Epoch: 32354, Training Loss: 0.00810
Epoch: 32354, Training Loss: 0.00757
Epoch: 32354, Training Loss: 0.00925
Epoch: 32354, Training Loss: 0.00717
Epoch: 32355, Training Loss: 0.00810
Epoch: 32355, Training Loss: 0.00757
Epoch: 32355, Training Loss: 0.00925
Epoch: 32355, Training Loss: 0.00717
Epoch: 32356, Training Loss: 0.00810
Epoch: 32356, Training Loss: 0.00757
Epoch: 32356, Training Loss: 0.00925
Epoch: 32356, Training Loss: 0.00717
Epoch: 32357, Training Loss: 0.00810
Epoch: 32357, Training Loss: 0.00757
Epoch: 32357, Training Loss: 0.00925
Epoch: 32357, Training Loss: 0.00717
Epoch: 32358, Training Loss: 0.00810
Epoch: 32358, Training Loss: 0.00757
Epoch: 32358, Training Loss: 0.00925
Epoch: 32358, Training Loss: 0.00717
Epoch: 32359, Training Loss: 0.00810
Epoch: 32359, Training Loss: 0.00757
Epoch: 32359, Training Loss: 0.00925
Epoch: 32359, Training Loss: 0.00717
Epoch: 32360, Training Loss: 0.00810
Epoch: 32360, Training Loss: 0.00757
Epoch: 32360, Training Loss: 0.00925
Epoch: 32360, Training Loss: 0.00717
Epoch: 32361, Training Loss: 0.00810
Epoch: 32361, Training Loss: 0.00757
Epoch: 32361, Training Loss: 0.00925
Epoch: 32361, Training Loss: 0.00717
Epoch: 32362, Training Loss: 0.00810
Epoch: 32362, Training Loss: 0.00757
Epoch: 32362, Training Loss: 0.00925
Epoch: 32362, Training Loss: 0.00717
Epoch: 32363, Training Loss: 0.00810
Epoch: 32363, Training Loss: 0.00757
Epoch: 32363, Training Loss: 0.00925
Epoch: 32363, Training Loss: 0.00717
Epoch: 32364, Training Loss: 0.00810
Epoch: 32364, Training Loss: 0.00757
Epoch: 32364, Training Loss: 0.00925
Epoch: 32364, Training Loss: 0.00717
Epoch: 32365, Training Loss: 0.00810
Epoch: 32365, Training Loss: 0.00757
Epoch: 32365, Training Loss: 0.00925
Epoch: 32365, Training Loss: 0.00717
Epoch: 32366, Training Loss: 0.00810
Epoch: 32366, Training Loss: 0.00757
Epoch: 32366, Training Loss: 0.00925
Epoch: 32366, Training Loss: 0.00717
Epoch: 32367, Training Loss: 0.00810
Epoch: 32367, Training Loss: 0.00757
Epoch: 32367, Training Loss: 0.00925
Epoch: 32367, Training Loss: 0.00717
Epoch: 32368, Training Loss: 0.00810
Epoch: 32368, Training Loss: 0.00757
Epoch: 32368, Training Loss: 0.00925
Epoch: 32368, Training Loss: 0.00717
Epoch: 32369, Training Loss: 0.00810
Epoch: 32369, Training Loss: 0.00757
Epoch: 32369, Training Loss: 0.00925
Epoch: 32369, Training Loss: 0.00717
Epoch: 32370, Training Loss: 0.00810
Epoch: 32370, Training Loss: 0.00757
Epoch: 32370, Training Loss: 0.00925
Epoch: 32370, Training Loss: 0.00717
Epoch: 32371, Training Loss: 0.00810
Epoch: 32371, Training Loss: 0.00757
Epoch: 32371, Training Loss: 0.00925
Epoch: 32371, Training Loss: 0.00717
Epoch: 32372, Training Loss: 0.00810
Epoch: 32372, Training Loss: 0.00757
Epoch: 32372, Training Loss: 0.00925
Epoch: 32372, Training Loss: 0.00717
Epoch: 32373, Training Loss: 0.00810
Epoch: 32373, Training Loss: 0.00757
Epoch: 32373, Training Loss: 0.00925
Epoch: 32373, Training Loss: 0.00717
Epoch: 32374, Training Loss: 0.00810
Epoch: 32374, Training Loss: 0.00757
Epoch: 32374, Training Loss: 0.00925
Epoch: 32374, Training Loss: 0.00717
Epoch: 32375, Training Loss: 0.00810
Epoch: 32375, Training Loss: 0.00757
Epoch: 32375, Training Loss: 0.00925
Epoch: 32375, Training Loss: 0.00717
Epoch: 32376, Training Loss: 0.00810
Epoch: 32376, Training Loss: 0.00757
Epoch: 32376, Training Loss: 0.00925
Epoch: 32376, Training Loss: 0.00717
Epoch: 32377, Training Loss: 0.00810
Epoch: 32377, Training Loss: 0.00757
Epoch: 32377, Training Loss: 0.00925
Epoch: 32377, Training Loss: 0.00717
Epoch: 32378, Training Loss: 0.00810
Epoch: 32378, Training Loss: 0.00757
Epoch: 32378, Training Loss: 0.00925
Epoch: 32378, Training Loss: 0.00717
Epoch: 32379, Training Loss: 0.00810
Epoch: 32379, Training Loss: 0.00757
Epoch: 32379, Training Loss: 0.00925
Epoch: 32379, Training Loss: 0.00717
Epoch: 32380, Training Loss: 0.00810
Epoch: 32380, Training Loss: 0.00757
Epoch: 32380, Training Loss: 0.00925
Epoch: 32380, Training Loss: 0.00717
Epoch: 32381, Training Loss: 0.00810
Epoch: 32381, Training Loss: 0.00757
Epoch: 32381, Training Loss: 0.00925
Epoch: 32381, Training Loss: 0.00717
Epoch: 32382, Training Loss: 0.00810
Epoch: 32382, Training Loss: 0.00757
Epoch: 32382, Training Loss: 0.00925
Epoch: 32382, Training Loss: 0.00717
Epoch: 32383, Training Loss: 0.00810
Epoch: 32383, Training Loss: 0.00757
Epoch: 32383, Training Loss: 0.00925
Epoch: 32383, Training Loss: 0.00717
Epoch: 32384, Training Loss: 0.00810
Epoch: 32384, Training Loss: 0.00757
Epoch: 32384, Training Loss: 0.00925
Epoch: 32384, Training Loss: 0.00717
Epoch: 32385, Training Loss: 0.00810
Epoch: 32385, Training Loss: 0.00757
Epoch: 32385, Training Loss: 0.00925
Epoch: 32385, Training Loss: 0.00717
Epoch: 32386, Training Loss: 0.00810
Epoch: 32386, Training Loss: 0.00757
Epoch: 32386, Training Loss: 0.00925
Epoch: 32386, Training Loss: 0.00717
Epoch: 32387, Training Loss: 0.00810
Epoch: 32387, Training Loss: 0.00757
Epoch: 32387, Training Loss: 0.00925
Epoch: 32387, Training Loss: 0.00717
Epoch: 32388, Training Loss: 0.00810
Epoch: 32388, Training Loss: 0.00757
Epoch: 32388, Training Loss: 0.00925
Epoch: 32388, Training Loss: 0.00717
Epoch: 32389, Training Loss: 0.00810
Epoch: 32389, Training Loss: 0.00757
Epoch: 32389, Training Loss: 0.00925
Epoch: 32389, Training Loss: 0.00717
Epoch: 32390, Training Loss: 0.00810
Epoch: 32390, Training Loss: 0.00757
Epoch: 32390, Training Loss: 0.00925
Epoch: 32390, Training Loss: 0.00716
Epoch: 32391, Training Loss: 0.00810
Epoch: 32391, Training Loss: 0.00757
Epoch: 32391, Training Loss: 0.00925
Epoch: 32391, Training Loss: 0.00716
Epoch: 32392, Training Loss: 0.00810
Epoch: 32392, Training Loss: 0.00757
Epoch: 32392, Training Loss: 0.00925
Epoch: 32392, Training Loss: 0.00716
Epoch: 32393, Training Loss: 0.00810
Epoch: 32393, Training Loss: 0.00757
Epoch: 32393, Training Loss: 0.00925
Epoch: 32393, Training Loss: 0.00716
Epoch: 32394, Training Loss: 0.00810
Epoch: 32394, Training Loss: 0.00757
Epoch: 32394, Training Loss: 0.00925
Epoch: 32394, Training Loss: 0.00716
Epoch: 32395, Training Loss: 0.00810
Epoch: 32395, Training Loss: 0.00757
Epoch: 32395, Training Loss: 0.00925
Epoch: 32395, Training Loss: 0.00716
Epoch: 32396, Training Loss: 0.00810
Epoch: 32396, Training Loss: 0.00757
Epoch: 32396, Training Loss: 0.00925
Epoch: 32396, Training Loss: 0.00716
Epoch: 32397, Training Loss: 0.00810
Epoch: 32397, Training Loss: 0.00757
Epoch: 32397, Training Loss: 0.00925
Epoch: 32397, Training Loss: 0.00716
Epoch: 32398, Training Loss: 0.00810
Epoch: 32398, Training Loss: 0.00757
Epoch: 32398, Training Loss: 0.00925
Epoch: 32398, Training Loss: 0.00716
Epoch: 32399, Training Loss: 0.00810
Epoch: 32399, Training Loss: 0.00757
Epoch: 32399, Training Loss: 0.00925
Epoch: 32399, Training Loss: 0.00716
Epoch: 32400, Training Loss: 0.00810
Epoch: 32400, Training Loss: 0.00757
Epoch: 32400, Training Loss: 0.00925
Epoch: 32400, Training Loss: 0.00716
Epoch: 32401, Training Loss: 0.00810
Epoch: 32401, Training Loss: 0.00757
Epoch: 32401, Training Loss: 0.00925
Epoch: 32401, Training Loss: 0.00716
Epoch: 32402, Training Loss: 0.00810
Epoch: 32402, Training Loss: 0.00757
Epoch: 32402, Training Loss: 0.00925
Epoch: 32402, Training Loss: 0.00716
Epoch: 32403, Training Loss: 0.00810
Epoch: 32403, Training Loss: 0.00757
Epoch: 32403, Training Loss: 0.00925
Epoch: 32403, Training Loss: 0.00716
Epoch: 32404, Training Loss: 0.00810
Epoch: 32404, Training Loss: 0.00756
Epoch: 32404, Training Loss: 0.00925
Epoch: 32404, Training Loss: 0.00716
Epoch: 32405, Training Loss: 0.00810
Epoch: 32405, Training Loss: 0.00756
Epoch: 32405, Training Loss: 0.00925
Epoch: 32405, Training Loss: 0.00716
Epoch: 32406, Training Loss: 0.00810
Epoch: 32406, Training Loss: 0.00756
Epoch: 32406, Training Loss: 0.00925
Epoch: 32406, Training Loss: 0.00716
Epoch: 32407, Training Loss: 0.00810
Epoch: 32407, Training Loss: 0.00756
Epoch: 32407, Training Loss: 0.00925
Epoch: 32407, Training Loss: 0.00716
Epoch: 32408, Training Loss: 0.00810
Epoch: 32408, Training Loss: 0.00756
Epoch: 32408, Training Loss: 0.00925
Epoch: 32408, Training Loss: 0.00716
Epoch: 32409, Training Loss: 0.00810
Epoch: 32409, Training Loss: 0.00756
Epoch: 32409, Training Loss: 0.00925
Epoch: 32409, Training Loss: 0.00716
Epoch: 32410, Training Loss: 0.00810
Epoch: 32410, Training Loss: 0.00756
Epoch: 32410, Training Loss: 0.00925
Epoch: 32410, Training Loss: 0.00716
Epoch: 32411, Training Loss: 0.00810
Epoch: 32411, Training Loss: 0.00756
Epoch: 32411, Training Loss: 0.00925
Epoch: 32411, Training Loss: 0.00716
Epoch: 32412, Training Loss: 0.00810
Epoch: 32412, Training Loss: 0.00756
Epoch: 32412, Training Loss: 0.00925
Epoch: 32412, Training Loss: 0.00716
Epoch: 32413, Training Loss: 0.00810
Epoch: 32413, Training Loss: 0.00756
Epoch: 32413, Training Loss: 0.00925
Epoch: 32413, Training Loss: 0.00716
Epoch: 32414, Training Loss: 0.00810
Epoch: 32414, Training Loss: 0.00756
Epoch: 32414, Training Loss: 0.00925
Epoch: 32414, Training Loss: 0.00716
Epoch: 32415, Training Loss: 0.00809
Epoch: 32415, Training Loss: 0.00756
Epoch: 32415, Training Loss: 0.00925
Epoch: 32415, Training Loss: 0.00716
Epoch: 32416, Training Loss: 0.00809
Epoch: 32416, Training Loss: 0.00756
Epoch: 32416, Training Loss: 0.00925
Epoch: 32416, Training Loss: 0.00716
Epoch: 32417, Training Loss: 0.00809
Epoch: 32417, Training Loss: 0.00756
Epoch: 32417, Training Loss: 0.00925
Epoch: 32417, Training Loss: 0.00716
Epoch: 32418, Training Loss: 0.00809
Epoch: 32418, Training Loss: 0.00756
Epoch: 32418, Training Loss: 0.00925
Epoch: 32418, Training Loss: 0.00716
Epoch: 32419, Training Loss: 0.00809
Epoch: 32419, Training Loss: 0.00756
Epoch: 32419, Training Loss: 0.00925
Epoch: 32419, Training Loss: 0.00716
Epoch: 32420, Training Loss: 0.00809
Epoch: 32420, Training Loss: 0.00756
Epoch: 32420, Training Loss: 0.00925
Epoch: 32420, Training Loss: 0.00716
Epoch: 32421, Training Loss: 0.00809
Epoch: 32421, Training Loss: 0.00756
Epoch: 32421, Training Loss: 0.00924
Epoch: 32421, Training Loss: 0.00716
Epoch: 32422, Training Loss: 0.00809
Epoch: 32422, Training Loss: 0.00756
Epoch: 32422, Training Loss: 0.00924
Epoch: 32422, Training Loss: 0.00716
Epoch: 32423, Training Loss: 0.00809
Epoch: 32423, Training Loss: 0.00756
Epoch: 32423, Training Loss: 0.00924
Epoch: 32423, Training Loss: 0.00716
Epoch: 32424, Training Loss: 0.00809
Epoch: 32424, Training Loss: 0.00756
Epoch: 32424, Training Loss: 0.00924
Epoch: 32424, Training Loss: 0.00716
Epoch: 32425, Training Loss: 0.00809
Epoch: 32425, Training Loss: 0.00756
Epoch: 32425, Training Loss: 0.00924
Epoch: 32425, Training Loss: 0.00716
Epoch: 32426, Training Loss: 0.00809
Epoch: 32426, Training Loss: 0.00756
Epoch: 32426, Training Loss: 0.00924
Epoch: 32426, Training Loss: 0.00716
Epoch: 32427, Training Loss: 0.00809
Epoch: 32427, Training Loss: 0.00756
Epoch: 32427, Training Loss: 0.00924
Epoch: 32427, Training Loss: 0.00716
Epoch: 32428, Training Loss: 0.00809
Epoch: 32428, Training Loss: 0.00756
Epoch: 32428, Training Loss: 0.00924
Epoch: 32428, Training Loss: 0.00716
Epoch: 32429, Training Loss: 0.00809
Epoch: 32429, Training Loss: 0.00756
Epoch: 32429, Training Loss: 0.00924
Epoch: 32429, Training Loss: 0.00716
Epoch: 32430, Training Loss: 0.00809
Epoch: 32430, Training Loss: 0.00756
Epoch: 32430, Training Loss: 0.00924
Epoch: 32430, Training Loss: 0.00716
Epoch: 32431, Training Loss: 0.00809
Epoch: 32431, Training Loss: 0.00756
Epoch: 32431, Training Loss: 0.00924
Epoch: 32431, Training Loss: 0.00716
Epoch: 32432, Training Loss: 0.00809
Epoch: 32432, Training Loss: 0.00756
Epoch: 32432, Training Loss: 0.00924
Epoch: 32432, Training Loss: 0.00716
Epoch: 32433, Training Loss: 0.00809
Epoch: 32433, Training Loss: 0.00756
Epoch: 32433, Training Loss: 0.00924
Epoch: 32433, Training Loss: 0.00716
Epoch: 32434, Training Loss: 0.00809
Epoch: 32434, Training Loss: 0.00756
Epoch: 32434, Training Loss: 0.00924
Epoch: 32434, Training Loss: 0.00716
Epoch: 32435, Training Loss: 0.00809
Epoch: 32435, Training Loss: 0.00756
Epoch: 32435, Training Loss: 0.00924
Epoch: 32435, Training Loss: 0.00716
Epoch: 32436, Training Loss: 0.00809
Epoch: 32436, Training Loss: 0.00756
Epoch: 32436, Training Loss: 0.00924
Epoch: 32436, Training Loss: 0.00716
Epoch: 32437, Training Loss: 0.00809
Epoch: 32437, Training Loss: 0.00756
Epoch: 32437, Training Loss: 0.00924
Epoch: 32437, Training Loss: 0.00716
Epoch: 32438, Training Loss: 0.00809
Epoch: 32438, Training Loss: 0.00756
Epoch: 32438, Training Loss: 0.00924
Epoch: 32438, Training Loss: 0.00716
Epoch: 32439, Training Loss: 0.00809
Epoch: 32439, Training Loss: 0.00756
Epoch: 32439, Training Loss: 0.00924
Epoch: 32439, Training Loss: 0.00716
Epoch: 32440, Training Loss: 0.00809
Epoch: 32440, Training Loss: 0.00756
Epoch: 32440, Training Loss: 0.00924
Epoch: 32440, Training Loss: 0.00716
Epoch: 32441, Training Loss: 0.00809
Epoch: 32441, Training Loss: 0.00756
Epoch: 32441, Training Loss: 0.00924
Epoch: 32441, Training Loss: 0.00716
Epoch: 32442, Training Loss: 0.00809
Epoch: 32442, Training Loss: 0.00756
Epoch: 32442, Training Loss: 0.00924
Epoch: 32442, Training Loss: 0.00716
Epoch: 32443, Training Loss: 0.00809
Epoch: 32443, Training Loss: 0.00756
Epoch: 32443, Training Loss: 0.00924
Epoch: 32443, Training Loss: 0.00716
Epoch: 32444, Training Loss: 0.00809
Epoch: 32444, Training Loss: 0.00756
Epoch: 32444, Training Loss: 0.00924
Epoch: 32444, Training Loss: 0.00716
Epoch: 32445, Training Loss: 0.00809
Epoch: 32445, Training Loss: 0.00756
Epoch: 32445, Training Loss: 0.00924
Epoch: 32445, Training Loss: 0.00716
Epoch: 32446, Training Loss: 0.00809
Epoch: 32446, Training Loss: 0.00756
Epoch: 32446, Training Loss: 0.00924
Epoch: 32446, Training Loss: 0.00716
Epoch: 32447, Training Loss: 0.00809
Epoch: 32447, Training Loss: 0.00756
Epoch: 32447, Training Loss: 0.00924
Epoch: 32447, Training Loss: 0.00716
Epoch: 32448, Training Loss: 0.00809
Epoch: 32448, Training Loss: 0.00756
Epoch: 32448, Training Loss: 0.00924
Epoch: 32448, Training Loss: 0.00716
Epoch: 32449, Training Loss: 0.00809
Epoch: 32449, Training Loss: 0.00756
Epoch: 32449, Training Loss: 0.00924
Epoch: 32449, Training Loss: 0.00716
Epoch: 32450, Training Loss: 0.00809
Epoch: 32450, Training Loss: 0.00756
Epoch: 32450, Training Loss: 0.00924
Epoch: 32450, Training Loss: 0.00716
Epoch: 32451, Training Loss: 0.00809
Epoch: 32451, Training Loss: 0.00756
Epoch: 32451, Training Loss: 0.00924
Epoch: 32451, Training Loss: 0.00716
Epoch: 32452, Training Loss: 0.00809
Epoch: 32452, Training Loss: 0.00756
Epoch: 32452, Training Loss: 0.00924
Epoch: 32452, Training Loss: 0.00716
Epoch: 32453, Training Loss: 0.00809
Epoch: 32453, Training Loss: 0.00756
Epoch: 32453, Training Loss: 0.00924
Epoch: 32453, Training Loss: 0.00716
Epoch: 32454, Training Loss: 0.00809
Epoch: 32454, Training Loss: 0.00756
Epoch: 32454, Training Loss: 0.00924
Epoch: 32454, Training Loss: 0.00716
Epoch: 32455, Training Loss: 0.00809
Epoch: 32455, Training Loss: 0.00756
Epoch: 32455, Training Loss: 0.00924
Epoch: 32455, Training Loss: 0.00716
Epoch: 32456, Training Loss: 0.00809
Epoch: 32456, Training Loss: 0.00756
Epoch: 32456, Training Loss: 0.00924
Epoch: 32456, Training Loss: 0.00716
Epoch: 32457, Training Loss: 0.00809
Epoch: 32457, Training Loss: 0.00756
Epoch: 32457, Training Loss: 0.00924
Epoch: 32457, Training Loss: 0.00716
Epoch: 32458, Training Loss: 0.00809
Epoch: 32458, Training Loss: 0.00756
Epoch: 32458, Training Loss: 0.00924
Epoch: 32458, Training Loss: 0.00716
Epoch: 32459, Training Loss: 0.00809
Epoch: 32459, Training Loss: 0.00756
Epoch: 32459, Training Loss: 0.00924
Epoch: 32459, Training Loss: 0.00716
Epoch: 32460, Training Loss: 0.00809
Epoch: 32460, Training Loss: 0.00756
Epoch: 32460, Training Loss: 0.00924
Epoch: 32460, Training Loss: 0.00716
Epoch: 32461, Training Loss: 0.00809
Epoch: 32461, Training Loss: 0.00756
Epoch: 32461, Training Loss: 0.00924
Epoch: 32461, Training Loss: 0.00716
Epoch: 32462, Training Loss: 0.00809
Epoch: 32462, Training Loss: 0.00756
Epoch: 32462, Training Loss: 0.00924
Epoch: 32462, Training Loss: 0.00716
Epoch: 32463, Training Loss: 0.00809
Epoch: 32463, Training Loss: 0.00756
Epoch: 32463, Training Loss: 0.00924
Epoch: 32463, Training Loss: 0.00716
Epoch: 32464, Training Loss: 0.00809
Epoch: 32464, Training Loss: 0.00756
Epoch: 32464, Training Loss: 0.00924
Epoch: 32464, Training Loss: 0.00716
Epoch: 32465, Training Loss: 0.00809
Epoch: 32465, Training Loss: 0.00756
Epoch: 32465, Training Loss: 0.00924
Epoch: 32465, Training Loss: 0.00716
Epoch: 32466, Training Loss: 0.00809
Epoch: 32466, Training Loss: 0.00756
Epoch: 32466, Training Loss: 0.00924
Epoch: 32466, Training Loss: 0.00716
Epoch: 32467, Training Loss: 0.00809
Epoch: 32467, Training Loss: 0.00756
Epoch: 32467, Training Loss: 0.00924
Epoch: 32467, Training Loss: 0.00716
Epoch: 32468, Training Loss: 0.00809
Epoch: 32468, Training Loss: 0.00756
Epoch: 32468, Training Loss: 0.00924
Epoch: 32468, Training Loss: 0.00716
Epoch: 32469, Training Loss: 0.00809
Epoch: 32469, Training Loss: 0.00756
Epoch: 32469, Training Loss: 0.00924
Epoch: 32469, Training Loss: 0.00716
Epoch: 32470, Training Loss: 0.00809
Epoch: 32470, Training Loss: 0.00756
Epoch: 32470, Training Loss: 0.00924
Epoch: 32470, Training Loss: 0.00716
Epoch: 32471, Training Loss: 0.00809
Epoch: 32471, Training Loss: 0.00756
Epoch: 32471, Training Loss: 0.00924
Epoch: 32471, Training Loss: 0.00716
Epoch: 32472, Training Loss: 0.00809
Epoch: 32472, Training Loss: 0.00756
Epoch: 32472, Training Loss: 0.00924
Epoch: 32472, Training Loss: 0.00716
Epoch: 32473, Training Loss: 0.00809
Epoch: 32473, Training Loss: 0.00756
Epoch: 32473, Training Loss: 0.00924
Epoch: 32473, Training Loss: 0.00716
Epoch: 32474, Training Loss: 0.00809
Epoch: 32474, Training Loss: 0.00756
Epoch: 32474, Training Loss: 0.00924
Epoch: 32474, Training Loss: 0.00716
Epoch: 32475, Training Loss: 0.00809
Epoch: 32475, Training Loss: 0.00756
Epoch: 32475, Training Loss: 0.00924
Epoch: 32475, Training Loss: 0.00716
Epoch: 32476, Training Loss: 0.00809
Epoch: 32476, Training Loss: 0.00756
Epoch: 32476, Training Loss: 0.00924
Epoch: 32476, Training Loss: 0.00715
Epoch: 32477, Training Loss: 0.00809
Epoch: 32477, Training Loss: 0.00756
Epoch: 32477, Training Loss: 0.00924
Epoch: 32477, Training Loss: 0.00715
Epoch: 32478, Training Loss: 0.00809
Epoch: 32478, Training Loss: 0.00756
Epoch: 32478, Training Loss: 0.00924
Epoch: 32478, Training Loss: 0.00715
[... output truncated: the four per-pattern losses shrink only marginally (by about 0.00003–0.00004 each) between epochs 32478 and 32720, indicating the network is near a plateau while one pattern's loss still exceeds the 0.008 target ...]
Epoch: 32720, Training Loss: 0.00805
Epoch: 32720, Training Loss: 0.00753
Epoch: 32720, Training Loss: 0.00920
Epoch: 32720, Training Loss: 0.00713
Epoch: 32721, Training Loss: 0.00805
Epoch: 32721, Training Loss: 0.00753
Epoch: 32721, Training Loss: 0.00920
Epoch: 32721, Training Loss: 0.00713
Epoch: 32722, Training Loss: 0.00805
Epoch: 32722, Training Loss: 0.00753
Epoch: 32722, Training Loss: 0.00920
Epoch: 32722, Training Loss: 0.00713
Epoch: 32723, Training Loss: 0.00805
Epoch: 32723, Training Loss: 0.00753
Epoch: 32723, Training Loss: 0.00920
Epoch: 32723, Training Loss: 0.00713
Epoch: 32724, Training Loss: 0.00805
Epoch: 32724, Training Loss: 0.00753
Epoch: 32724, Training Loss: 0.00920
Epoch: 32724, Training Loss: 0.00713
Epoch: 32725, Training Loss: 0.00805
Epoch: 32725, Training Loss: 0.00753
Epoch: 32725, Training Loss: 0.00920
Epoch: 32725, Training Loss: 0.00713
Epoch: 32726, Training Loss: 0.00805
Epoch: 32726, Training Loss: 0.00753
Epoch: 32726, Training Loss: 0.00920
Epoch: 32726, Training Loss: 0.00713
Epoch: 32727, Training Loss: 0.00805
Epoch: 32727, Training Loss: 0.00753
Epoch: 32727, Training Loss: 0.00920
Epoch: 32727, Training Loss: 0.00713
Epoch: 32728, Training Loss: 0.00805
Epoch: 32728, Training Loss: 0.00753
Epoch: 32728, Training Loss: 0.00920
Epoch: 32728, Training Loss: 0.00713
Epoch: 32729, Training Loss: 0.00805
Epoch: 32729, Training Loss: 0.00753
Epoch: 32729, Training Loss: 0.00920
Epoch: 32729, Training Loss: 0.00713
Epoch: 32730, Training Loss: 0.00805
Epoch: 32730, Training Loss: 0.00753
Epoch: 32730, Training Loss: 0.00920
Epoch: 32730, Training Loss: 0.00713
Epoch: 32731, Training Loss: 0.00805
Epoch: 32731, Training Loss: 0.00753
Epoch: 32731, Training Loss: 0.00920
Epoch: 32731, Training Loss: 0.00713
Epoch: 32732, Training Loss: 0.00805
Epoch: 32732, Training Loss: 0.00753
Epoch: 32732, Training Loss: 0.00920
Epoch: 32732, Training Loss: 0.00713
Epoch: 32733, Training Loss: 0.00805
Epoch: 32733, Training Loss: 0.00753
Epoch: 32733, Training Loss: 0.00920
Epoch: 32733, Training Loss: 0.00713
Epoch: 32734, Training Loss: 0.00805
Epoch: 32734, Training Loss: 0.00753
Epoch: 32734, Training Loss: 0.00920
Epoch: 32734, Training Loss: 0.00713
Epoch: 32735, Training Loss: 0.00805
Epoch: 32735, Training Loss: 0.00752
Epoch: 32735, Training Loss: 0.00920
Epoch: 32735, Training Loss: 0.00713
Epoch: 32736, Training Loss: 0.00805
Epoch: 32736, Training Loss: 0.00752
Epoch: 32736, Training Loss: 0.00920
Epoch: 32736, Training Loss: 0.00713
Epoch: 32737, Training Loss: 0.00805
Epoch: 32737, Training Loss: 0.00752
Epoch: 32737, Training Loss: 0.00920
Epoch: 32737, Training Loss: 0.00713
Epoch: 32738, Training Loss: 0.00805
Epoch: 32738, Training Loss: 0.00752
Epoch: 32738, Training Loss: 0.00920
Epoch: 32738, Training Loss: 0.00712
Epoch: 32739, Training Loss: 0.00805
Epoch: 32739, Training Loss: 0.00752
Epoch: 32739, Training Loss: 0.00920
Epoch: 32739, Training Loss: 0.00712
Epoch: 32740, Training Loss: 0.00805
Epoch: 32740, Training Loss: 0.00752
Epoch: 32740, Training Loss: 0.00920
Epoch: 32740, Training Loss: 0.00712
Epoch: 32741, Training Loss: 0.00805
Epoch: 32741, Training Loss: 0.00752
Epoch: 32741, Training Loss: 0.00920
Epoch: 32741, Training Loss: 0.00712
Epoch: 32742, Training Loss: 0.00805
Epoch: 32742, Training Loss: 0.00752
Epoch: 32742, Training Loss: 0.00920
Epoch: 32742, Training Loss: 0.00712
Epoch: 32743, Training Loss: 0.00805
Epoch: 32743, Training Loss: 0.00752
Epoch: 32743, Training Loss: 0.00920
Epoch: 32743, Training Loss: 0.00712
Epoch: 32744, Training Loss: 0.00805
Epoch: 32744, Training Loss: 0.00752
Epoch: 32744, Training Loss: 0.00920
Epoch: 32744, Training Loss: 0.00712
Epoch: 32745, Training Loss: 0.00805
Epoch: 32745, Training Loss: 0.00752
Epoch: 32745, Training Loss: 0.00920
Epoch: 32745, Training Loss: 0.00712
Epoch: 32746, Training Loss: 0.00805
Epoch: 32746, Training Loss: 0.00752
Epoch: 32746, Training Loss: 0.00920
Epoch: 32746, Training Loss: 0.00712
Epoch: 32747, Training Loss: 0.00805
Epoch: 32747, Training Loss: 0.00752
Epoch: 32747, Training Loss: 0.00920
Epoch: 32747, Training Loss: 0.00712
Epoch: 32748, Training Loss: 0.00805
Epoch: 32748, Training Loss: 0.00752
Epoch: 32748, Training Loss: 0.00920
Epoch: 32748, Training Loss: 0.00712
Epoch: 32749, Training Loss: 0.00805
Epoch: 32749, Training Loss: 0.00752
Epoch: 32749, Training Loss: 0.00920
Epoch: 32749, Training Loss: 0.00712
Epoch: 32750, Training Loss: 0.00805
Epoch: 32750, Training Loss: 0.00752
Epoch: 32750, Training Loss: 0.00920
Epoch: 32750, Training Loss: 0.00712
Epoch: 32751, Training Loss: 0.00805
Epoch: 32751, Training Loss: 0.00752
Epoch: 32751, Training Loss: 0.00920
Epoch: 32751, Training Loss: 0.00712
Epoch: 32752, Training Loss: 0.00805
Epoch: 32752, Training Loss: 0.00752
Epoch: 32752, Training Loss: 0.00920
Epoch: 32752, Training Loss: 0.00712
Epoch: 32753, Training Loss: 0.00805
Epoch: 32753, Training Loss: 0.00752
Epoch: 32753, Training Loss: 0.00920
Epoch: 32753, Training Loss: 0.00712
Epoch: 32754, Training Loss: 0.00805
Epoch: 32754, Training Loss: 0.00752
Epoch: 32754, Training Loss: 0.00920
Epoch: 32754, Training Loss: 0.00712
Epoch: 32755, Training Loss: 0.00805
Epoch: 32755, Training Loss: 0.00752
Epoch: 32755, Training Loss: 0.00919
Epoch: 32755, Training Loss: 0.00712
Epoch: 32756, Training Loss: 0.00805
Epoch: 32756, Training Loss: 0.00752
Epoch: 32756, Training Loss: 0.00919
Epoch: 32756, Training Loss: 0.00712
Epoch: 32757, Training Loss: 0.00805
Epoch: 32757, Training Loss: 0.00752
Epoch: 32757, Training Loss: 0.00919
Epoch: 32757, Training Loss: 0.00712
Epoch: 32758, Training Loss: 0.00805
Epoch: 32758, Training Loss: 0.00752
Epoch: 32758, Training Loss: 0.00919
Epoch: 32758, Training Loss: 0.00712
Epoch: 32759, Training Loss: 0.00805
Epoch: 32759, Training Loss: 0.00752
Epoch: 32759, Training Loss: 0.00919
Epoch: 32759, Training Loss: 0.00712
Epoch: 32760, Training Loss: 0.00805
Epoch: 32760, Training Loss: 0.00752
Epoch: 32760, Training Loss: 0.00919
Epoch: 32760, Training Loss: 0.00712
Epoch: 32761, Training Loss: 0.00805
Epoch: 32761, Training Loss: 0.00752
Epoch: 32761, Training Loss: 0.00919
Epoch: 32761, Training Loss: 0.00712
Epoch: 32762, Training Loss: 0.00805
Epoch: 32762, Training Loss: 0.00752
Epoch: 32762, Training Loss: 0.00919
Epoch: 32762, Training Loss: 0.00712
Epoch: 32763, Training Loss: 0.00805
Epoch: 32763, Training Loss: 0.00752
Epoch: 32763, Training Loss: 0.00919
Epoch: 32763, Training Loss: 0.00712
Epoch: 32764, Training Loss: 0.00805
Epoch: 32764, Training Loss: 0.00752
Epoch: 32764, Training Loss: 0.00919
Epoch: 32764, Training Loss: 0.00712
Epoch: 32765, Training Loss: 0.00805
Epoch: 32765, Training Loss: 0.00752
Epoch: 32765, Training Loss: 0.00919
Epoch: 32765, Training Loss: 0.00712
Epoch: 32766, Training Loss: 0.00805
Epoch: 32766, Training Loss: 0.00752
Epoch: 32766, Training Loss: 0.00919
Epoch: 32766, Training Loss: 0.00712
Epoch: 32767, Training Loss: 0.00805
Epoch: 32767, Training Loss: 0.00752
Epoch: 32767, Training Loss: 0.00919
Epoch: 32767, Training Loss: 0.00712
Epoch: 32768, Training Loss: 0.00805
Epoch: 32768, Training Loss: 0.00752
Epoch: 32768, Training Loss: 0.00919
Epoch: 32768, Training Loss: 0.00712
Epoch: 32769, Training Loss: 0.00805
Epoch: 32769, Training Loss: 0.00752
Epoch: 32769, Training Loss: 0.00919
Epoch: 32769, Training Loss: 0.00712
Epoch: 32770, Training Loss: 0.00805
Epoch: 32770, Training Loss: 0.00752
Epoch: 32770, Training Loss: 0.00919
Epoch: 32770, Training Loss: 0.00712
Epoch: 32771, Training Loss: 0.00805
Epoch: 32771, Training Loss: 0.00752
Epoch: 32771, Training Loss: 0.00919
Epoch: 32771, Training Loss: 0.00712
Epoch: 32772, Training Loss: 0.00805
Epoch: 32772, Training Loss: 0.00752
Epoch: 32772, Training Loss: 0.00919
Epoch: 32772, Training Loss: 0.00712
Epoch: 32773, Training Loss: 0.00805
Epoch: 32773, Training Loss: 0.00752
Epoch: 32773, Training Loss: 0.00919
Epoch: 32773, Training Loss: 0.00712
Epoch: 32774, Training Loss: 0.00805
Epoch: 32774, Training Loss: 0.00752
Epoch: 32774, Training Loss: 0.00919
Epoch: 32774, Training Loss: 0.00712
Epoch: 32775, Training Loss: 0.00805
Epoch: 32775, Training Loss: 0.00752
Epoch: 32775, Training Loss: 0.00919
Epoch: 32775, Training Loss: 0.00712
Epoch: 32776, Training Loss: 0.00805
Epoch: 32776, Training Loss: 0.00752
Epoch: 32776, Training Loss: 0.00919
Epoch: 32776, Training Loss: 0.00712
Epoch: 32777, Training Loss: 0.00805
Epoch: 32777, Training Loss: 0.00752
Epoch: 32777, Training Loss: 0.00919
Epoch: 32777, Training Loss: 0.00712
Epoch: 32778, Training Loss: 0.00805
Epoch: 32778, Training Loss: 0.00752
Epoch: 32778, Training Loss: 0.00919
Epoch: 32778, Training Loss: 0.00712
Epoch: 32779, Training Loss: 0.00805
Epoch: 32779, Training Loss: 0.00752
Epoch: 32779, Training Loss: 0.00919
Epoch: 32779, Training Loss: 0.00712
Epoch: 32780, Training Loss: 0.00805
Epoch: 32780, Training Loss: 0.00752
Epoch: 32780, Training Loss: 0.00919
Epoch: 32780, Training Loss: 0.00712
Epoch: 32781, Training Loss: 0.00805
Epoch: 32781, Training Loss: 0.00752
Epoch: 32781, Training Loss: 0.00919
Epoch: 32781, Training Loss: 0.00712
Epoch: 32782, Training Loss: 0.00805
Epoch: 32782, Training Loss: 0.00752
Epoch: 32782, Training Loss: 0.00919
Epoch: 32782, Training Loss: 0.00712
Epoch: 32783, Training Loss: 0.00805
Epoch: 32783, Training Loss: 0.00752
Epoch: 32783, Training Loss: 0.00919
Epoch: 32783, Training Loss: 0.00712
Epoch: 32784, Training Loss: 0.00805
Epoch: 32784, Training Loss: 0.00752
Epoch: 32784, Training Loss: 0.00919
Epoch: 32784, Training Loss: 0.00712
Epoch: 32785, Training Loss: 0.00805
Epoch: 32785, Training Loss: 0.00752
Epoch: 32785, Training Loss: 0.00919
Epoch: 32785, Training Loss: 0.00712
Epoch: 32786, Training Loss: 0.00805
Epoch: 32786, Training Loss: 0.00752
Epoch: 32786, Training Loss: 0.00919
Epoch: 32786, Training Loss: 0.00712
Epoch: 32787, Training Loss: 0.00805
Epoch: 32787, Training Loss: 0.00752
Epoch: 32787, Training Loss: 0.00919
Epoch: 32787, Training Loss: 0.00712
Epoch: 32788, Training Loss: 0.00805
Epoch: 32788, Training Loss: 0.00752
Epoch: 32788, Training Loss: 0.00919
Epoch: 32788, Training Loss: 0.00712
Epoch: 32789, Training Loss: 0.00805
Epoch: 32789, Training Loss: 0.00752
Epoch: 32789, Training Loss: 0.00919
Epoch: 32789, Training Loss: 0.00712
Epoch: 32790, Training Loss: 0.00805
Epoch: 32790, Training Loss: 0.00752
Epoch: 32790, Training Loss: 0.00919
Epoch: 32790, Training Loss: 0.00712
Epoch: 32791, Training Loss: 0.00805
Epoch: 32791, Training Loss: 0.00752
Epoch: 32791, Training Loss: 0.00919
Epoch: 32791, Training Loss: 0.00712
Epoch: 32792, Training Loss: 0.00805
Epoch: 32792, Training Loss: 0.00752
Epoch: 32792, Training Loss: 0.00919
Epoch: 32792, Training Loss: 0.00712
Epoch: 32793, Training Loss: 0.00805
Epoch: 32793, Training Loss: 0.00752
Epoch: 32793, Training Loss: 0.00919
Epoch: 32793, Training Loss: 0.00712
Epoch: 32794, Training Loss: 0.00805
Epoch: 32794, Training Loss: 0.00752
Epoch: 32794, Training Loss: 0.00919
Epoch: 32794, Training Loss: 0.00712
Epoch: 32795, Training Loss: 0.00805
Epoch: 32795, Training Loss: 0.00752
Epoch: 32795, Training Loss: 0.00919
Epoch: 32795, Training Loss: 0.00712
Epoch: 32796, Training Loss: 0.00804
Epoch: 32796, Training Loss: 0.00752
Epoch: 32796, Training Loss: 0.00919
Epoch: 32796, Training Loss: 0.00712
Epoch: 32797, Training Loss: 0.00804
Epoch: 32797, Training Loss: 0.00752
Epoch: 32797, Training Loss: 0.00919
Epoch: 32797, Training Loss: 0.00712
Epoch: 32798, Training Loss: 0.00804
Epoch: 32798, Training Loss: 0.00752
Epoch: 32798, Training Loss: 0.00919
Epoch: 32798, Training Loss: 0.00712
Epoch: 32799, Training Loss: 0.00804
Epoch: 32799, Training Loss: 0.00752
Epoch: 32799, Training Loss: 0.00919
Epoch: 32799, Training Loss: 0.00712
Epoch: 32800, Training Loss: 0.00804
Epoch: 32800, Training Loss: 0.00752
Epoch: 32800, Training Loss: 0.00919
Epoch: 32800, Training Loss: 0.00712
Epoch: 32801, Training Loss: 0.00804
Epoch: 32801, Training Loss: 0.00752
Epoch: 32801, Training Loss: 0.00919
Epoch: 32801, Training Loss: 0.00712
Epoch: 32802, Training Loss: 0.00804
Epoch: 32802, Training Loss: 0.00752
Epoch: 32802, Training Loss: 0.00919
Epoch: 32802, Training Loss: 0.00712
Epoch: 32803, Training Loss: 0.00804
Epoch: 32803, Training Loss: 0.00752
Epoch: 32803, Training Loss: 0.00919
Epoch: 32803, Training Loss: 0.00712
Epoch: 32804, Training Loss: 0.00804
Epoch: 32804, Training Loss: 0.00752
Epoch: 32804, Training Loss: 0.00919
Epoch: 32804, Training Loss: 0.00712
Epoch: 32805, Training Loss: 0.00804
Epoch: 32805, Training Loss: 0.00752
Epoch: 32805, Training Loss: 0.00919
Epoch: 32805, Training Loss: 0.00712
Epoch: 32806, Training Loss: 0.00804
Epoch: 32806, Training Loss: 0.00752
Epoch: 32806, Training Loss: 0.00919
Epoch: 32806, Training Loss: 0.00712
Epoch: 32807, Training Loss: 0.00804
Epoch: 32807, Training Loss: 0.00752
Epoch: 32807, Training Loss: 0.00919
Epoch: 32807, Training Loss: 0.00712
Epoch: 32808, Training Loss: 0.00804
Epoch: 32808, Training Loss: 0.00752
Epoch: 32808, Training Loss: 0.00919
Epoch: 32808, Training Loss: 0.00712
Epoch: 32809, Training Loss: 0.00804
Epoch: 32809, Training Loss: 0.00752
Epoch: 32809, Training Loss: 0.00919
Epoch: 32809, Training Loss: 0.00712
Epoch: 32810, Training Loss: 0.00804
Epoch: 32810, Training Loss: 0.00752
Epoch: 32810, Training Loss: 0.00919
Epoch: 32810, Training Loss: 0.00712
Epoch: 32811, Training Loss: 0.00804
Epoch: 32811, Training Loss: 0.00752
Epoch: 32811, Training Loss: 0.00919
Epoch: 32811, Training Loss: 0.00712
Epoch: 32812, Training Loss: 0.00804
Epoch: 32812, Training Loss: 0.00752
Epoch: 32812, Training Loss: 0.00919
Epoch: 32812, Training Loss: 0.00712
Epoch: 32813, Training Loss: 0.00804
Epoch: 32813, Training Loss: 0.00752
Epoch: 32813, Training Loss: 0.00919
Epoch: 32813, Training Loss: 0.00712
Epoch: 32814, Training Loss: 0.00804
Epoch: 32814, Training Loss: 0.00752
Epoch: 32814, Training Loss: 0.00919
Epoch: 32814, Training Loss: 0.00712
Epoch: 32815, Training Loss: 0.00804
Epoch: 32815, Training Loss: 0.00752
Epoch: 32815, Training Loss: 0.00919
Epoch: 32815, Training Loss: 0.00712
Epoch: 32816, Training Loss: 0.00804
Epoch: 32816, Training Loss: 0.00752
Epoch: 32816, Training Loss: 0.00919
Epoch: 32816, Training Loss: 0.00712
Epoch: 32817, Training Loss: 0.00804
Epoch: 32817, Training Loss: 0.00752
Epoch: 32817, Training Loss: 0.00919
Epoch: 32817, Training Loss: 0.00712
Epoch: 32818, Training Loss: 0.00804
Epoch: 32818, Training Loss: 0.00752
Epoch: 32818, Training Loss: 0.00919
Epoch: 32818, Training Loss: 0.00712
Epoch: 32819, Training Loss: 0.00804
Epoch: 32819, Training Loss: 0.00751
Epoch: 32819, Training Loss: 0.00919
Epoch: 32819, Training Loss: 0.00712
Epoch: 32820, Training Loss: 0.00804
Epoch: 32820, Training Loss: 0.00751
Epoch: 32820, Training Loss: 0.00919
Epoch: 32820, Training Loss: 0.00712
Epoch: 32821, Training Loss: 0.00804
Epoch: 32821, Training Loss: 0.00751
Epoch: 32821, Training Loss: 0.00919
Epoch: 32821, Training Loss: 0.00712
Epoch: 32822, Training Loss: 0.00804
Epoch: 32822, Training Loss: 0.00751
Epoch: 32822, Training Loss: 0.00919
Epoch: 32822, Training Loss: 0.00712
Epoch: 32823, Training Loss: 0.00804
Epoch: 32823, Training Loss: 0.00751
Epoch: 32823, Training Loss: 0.00918
Epoch: 32823, Training Loss: 0.00712
Epoch: 32824, Training Loss: 0.00804
Epoch: 32824, Training Loss: 0.00751
Epoch: 32824, Training Loss: 0.00918
Epoch: 32824, Training Loss: 0.00712
Epoch: 32825, Training Loss: 0.00804
Epoch: 32825, Training Loss: 0.00751
Epoch: 32825, Training Loss: 0.00918
Epoch: 32825, Training Loss: 0.00712
Epoch: 32826, Training Loss: 0.00804
Epoch: 32826, Training Loss: 0.00751
Epoch: 32826, Training Loss: 0.00918
Epoch: 32826, Training Loss: 0.00712
Epoch: 32827, Training Loss: 0.00804
Epoch: 32827, Training Loss: 0.00751
Epoch: 32827, Training Loss: 0.00918
Epoch: 32827, Training Loss: 0.00711
Epoch: 32828, Training Loss: 0.00804
Epoch: 32828, Training Loss: 0.00751
Epoch: 32828, Training Loss: 0.00918
Epoch: 32828, Training Loss: 0.00711
Epoch: 32829, Training Loss: 0.00804
Epoch: 32829, Training Loss: 0.00751
Epoch: 32829, Training Loss: 0.00918
Epoch: 32829, Training Loss: 0.00711
Epoch: 32830, Training Loss: 0.00804
Epoch: 32830, Training Loss: 0.00751
Epoch: 32830, Training Loss: 0.00918
Epoch: 32830, Training Loss: 0.00711
Epoch: 32831, Training Loss: 0.00804
Epoch: 32831, Training Loss: 0.00751
Epoch: 32831, Training Loss: 0.00918
Epoch: 32831, Training Loss: 0.00711
Epoch: 32832, Training Loss: 0.00804
Epoch: 32832, Training Loss: 0.00751
Epoch: 32832, Training Loss: 0.00918
Epoch: 32832, Training Loss: 0.00711
Epoch: 32833, Training Loss: 0.00804
Epoch: 32833, Training Loss: 0.00751
Epoch: 32833, Training Loss: 0.00918
Epoch: 32833, Training Loss: 0.00711
Epoch: 32834, Training Loss: 0.00804
Epoch: 32834, Training Loss: 0.00751
Epoch: 32834, Training Loss: 0.00918
Epoch: 32834, Training Loss: 0.00711
Epoch: 32835, Training Loss: 0.00804
Epoch: 32835, Training Loss: 0.00751
Epoch: 32835, Training Loss: 0.00918
Epoch: 32835, Training Loss: 0.00711
Epoch: 32836, Training Loss: 0.00804
Epoch: 32836, Training Loss: 0.00751
Epoch: 32836, Training Loss: 0.00918
Epoch: 32836, Training Loss: 0.00711
Epoch: 32837, Training Loss: 0.00804
Epoch: 32837, Training Loss: 0.00751
Epoch: 32837, Training Loss: 0.00918
Epoch: 32837, Training Loss: 0.00711
Epoch: 32838, Training Loss: 0.00804
Epoch: 32838, Training Loss: 0.00751
Epoch: 32838, Training Loss: 0.00918
Epoch: 32838, Training Loss: 0.00711
Epoch: 32839, Training Loss: 0.00804
Epoch: 32839, Training Loss: 0.00751
Epoch: 32839, Training Loss: 0.00918
Epoch: 32839, Training Loss: 0.00711
Epoch: 32840, Training Loss: 0.00804
Epoch: 32840, Training Loss: 0.00751
Epoch: 32840, Training Loss: 0.00918
Epoch: 32840, Training Loss: 0.00711
Epoch: 32841, Training Loss: 0.00804
Epoch: 32841, Training Loss: 0.00751
Epoch: 32841, Training Loss: 0.00918
Epoch: 32841, Training Loss: 0.00711
Epoch: 32842, Training Loss: 0.00804
Epoch: 32842, Training Loss: 0.00751
Epoch: 32842, Training Loss: 0.00918
Epoch: 32842, Training Loss: 0.00711
Epoch: 32843, Training Loss: 0.00804
Epoch: 32843, Training Loss: 0.00751
Epoch: 32843, Training Loss: 0.00918
Epoch: 32843, Training Loss: 0.00711
Epoch: 32844, Training Loss: 0.00804
Epoch: 32844, Training Loss: 0.00751
Epoch: 32844, Training Loss: 0.00918
Epoch: 32844, Training Loss: 0.00711
Epoch: 32845, Training Loss: 0.00804
Epoch: 32845, Training Loss: 0.00751
Epoch: 32845, Training Loss: 0.00918
Epoch: 32845, Training Loss: 0.00711
Epoch: 32846, Training Loss: 0.00804
Epoch: 32846, Training Loss: 0.00751
Epoch: 32846, Training Loss: 0.00918
Epoch: 32846, Training Loss: 0.00711
Epoch: 32847, Training Loss: 0.00804
Epoch: 32847, Training Loss: 0.00751
Epoch: 32847, Training Loss: 0.00918
Epoch: 32847, Training Loss: 0.00711
Epoch: 32848, Training Loss: 0.00804
Epoch: 32848, Training Loss: 0.00751
Epoch: 32848, Training Loss: 0.00918
Epoch: 32848, Training Loss: 0.00711
Epoch: 32849, Training Loss: 0.00804
Epoch: 32849, Training Loss: 0.00751
Epoch: 32849, Training Loss: 0.00918
Epoch: 32849, Training Loss: 0.00711
Epoch: 32850, Training Loss: 0.00804
Epoch: 32850, Training Loss: 0.00751
Epoch: 32850, Training Loss: 0.00918
Epoch: 32850, Training Loss: 0.00711
Epoch: 32851, Training Loss: 0.00804
Epoch: 32851, Training Loss: 0.00751
Epoch: 32851, Training Loss: 0.00918
Epoch: 32851, Training Loss: 0.00711
Epoch: 32852, Training Loss: 0.00804
Epoch: 32852, Training Loss: 0.00751
Epoch: 32852, Training Loss: 0.00918
Epoch: 32852, Training Loss: 0.00711
Epoch: 32853, Training Loss: 0.00804
Epoch: 32853, Training Loss: 0.00751
Epoch: 32853, Training Loss: 0.00918
Epoch: 32853, Training Loss: 0.00711
Epoch: 32854, Training Loss: 0.00804
Epoch: 32854, Training Loss: 0.00751
Epoch: 32854, Training Loss: 0.00918
Epoch: 32854, Training Loss: 0.00711
Epoch: 32855, Training Loss: 0.00804
Epoch: 32855, Training Loss: 0.00751
Epoch: 32855, Training Loss: 0.00918
Epoch: 32855, Training Loss: 0.00711
Epoch: 32856, Training Loss: 0.00804
Epoch: 32856, Training Loss: 0.00751
Epoch: 32856, Training Loss: 0.00918
Epoch: 32856, Training Loss: 0.00711
Epoch: 32857, Training Loss: 0.00804
Epoch: 32857, Training Loss: 0.00751
Epoch: 32857, Training Loss: 0.00918
Epoch: 32857, Training Loss: 0.00711
Epoch: 32858, Training Loss: 0.00804
Epoch: 32858, Training Loss: 0.00751
Epoch: 32858, Training Loss: 0.00918
Epoch: 32858, Training Loss: 0.00711
Epoch: 32859, Training Loss: 0.00804
Epoch: 32859, Training Loss: 0.00751
Epoch: 32859, Training Loss: 0.00918
Epoch: 32859, Training Loss: 0.00711
Epoch: 32860, Training Loss: 0.00804
Epoch: 32860, Training Loss: 0.00751
Epoch: 32860, Training Loss: 0.00918
Epoch: 32860, Training Loss: 0.00711
Epoch: 32861, Training Loss: 0.00804
Epoch: 32861, Training Loss: 0.00751
Epoch: 32861, Training Loss: 0.00918
Epoch: 32861, Training Loss: 0.00711
Epoch: 32862, Training Loss: 0.00804
Epoch: 32862, Training Loss: 0.00751
Epoch: 32862, Training Loss: 0.00918
Epoch: 32862, Training Loss: 0.00711
Epoch: 32863, Training Loss: 0.00804
Epoch: 32863, Training Loss: 0.00751
Epoch: 32863, Training Loss: 0.00918
Epoch: 32863, Training Loss: 0.00711
Epoch: 32864, Training Loss: 0.00804
Epoch: 32864, Training Loss: 0.00751
Epoch: 32864, Training Loss: 0.00918
Epoch: 32864, Training Loss: 0.00711
Epoch: 32865, Training Loss: 0.00804
Epoch: 32865, Training Loss: 0.00751
Epoch: 32865, Training Loss: 0.00918
Epoch: 32865, Training Loss: 0.00711
Epoch: 32866, Training Loss: 0.00804
Epoch: 32866, Training Loss: 0.00751
Epoch: 32866, Training Loss: 0.00918
Epoch: 32866, Training Loss: 0.00711
Epoch: 32867, Training Loss: 0.00804
Epoch: 32867, Training Loss: 0.00751
Epoch: 32867, Training Loss: 0.00918
Epoch: 32867, Training Loss: 0.00711
Epoch: 32868, Training Loss: 0.00804
Epoch: 32868, Training Loss: 0.00751
Epoch: 32868, Training Loss: 0.00918
Epoch: 32868, Training Loss: 0.00711
Epoch: 32869, Training Loss: 0.00804
Epoch: 32869, Training Loss: 0.00751
Epoch: 32869, Training Loss: 0.00918
Epoch: 32869, Training Loss: 0.00711
Epoch: 32870, Training Loss: 0.00804
Epoch: 32870, Training Loss: 0.00751
Epoch: 32870, Training Loss: 0.00918
Epoch: 32870, Training Loss: 0.00711
Epoch: 32871, Training Loss: 0.00804
Epoch: 32871, Training Loss: 0.00751
Epoch: 32871, Training Loss: 0.00918
Epoch: 32871, Training Loss: 0.00711
Epoch: 32872, Training Loss: 0.00804
Epoch: 32872, Training Loss: 0.00751
Epoch: 32872, Training Loss: 0.00918
Epoch: 32872, Training Loss: 0.00711
Epoch: 32873, Training Loss: 0.00803
Epoch: 32873, Training Loss: 0.00751
Epoch: 32873, Training Loss: 0.00918
Epoch: 32873, Training Loss: 0.00711
Epoch: 32874, Training Loss: 0.00803
Epoch: 32874, Training Loss: 0.00751
Epoch: 32874, Training Loss: 0.00918
Epoch: 32874, Training Loss: 0.00711
Epoch: 32875, Training Loss: 0.00803
Epoch: 32875, Training Loss: 0.00751
Epoch: 32875, Training Loss: 0.00918
Epoch: 32875, Training Loss: 0.00711
Epoch: 32876, Training Loss: 0.00803
Epoch: 32876, Training Loss: 0.00751
Epoch: 32876, Training Loss: 0.00918
Epoch: 32876, Training Loss: 0.00711
Epoch: 32877, Training Loss: 0.00803
Epoch: 32877, Training Loss: 0.00751
Epoch: 32877, Training Loss: 0.00918
Epoch: 32877, Training Loss: 0.00711
Epoch: 32878, Training Loss: 0.00803
Epoch: 32878, Training Loss: 0.00751
Epoch: 32878, Training Loss: 0.00918
Epoch: 32878, Training Loss: 0.00711
Epoch: 32879, Training Loss: 0.00803
Epoch: 32879, Training Loss: 0.00751
Epoch: 32879, Training Loss: 0.00918
Epoch: 32879, Training Loss: 0.00711
Epoch: 32880, Training Loss: 0.00803
Epoch: 32880, Training Loss: 0.00751
Epoch: 32880, Training Loss: 0.00918
Epoch: 32880, Training Loss: 0.00711
Epoch: 32881, Training Loss: 0.00803
Epoch: 32881, Training Loss: 0.00751
Epoch: 32881, Training Loss: 0.00918
Epoch: 32881, Training Loss: 0.00711
Epoch: 32882, Training Loss: 0.00803
Epoch: 32882, Training Loss: 0.00751
Epoch: 32882, Training Loss: 0.00918
Epoch: 32882, Training Loss: 0.00711
Epoch: 32883, Training Loss: 0.00803
Epoch: 32883, Training Loss: 0.00751
Epoch: 32883, Training Loss: 0.00918
Epoch: 32883, Training Loss: 0.00711
Epoch: 32884, Training Loss: 0.00803
Epoch: 32884, Training Loss: 0.00751
Epoch: 32884, Training Loss: 0.00918
Epoch: 32884, Training Loss: 0.00711
Epoch: 32885, Training Loss: 0.00803
Epoch: 32885, Training Loss: 0.00751
Epoch: 32885, Training Loss: 0.00918
Epoch: 32885, Training Loss: 0.00711
Epoch: 32886, Training Loss: 0.00803
Epoch: 32886, Training Loss: 0.00751
Epoch: 32886, Training Loss: 0.00918
Epoch: 32886, Training Loss: 0.00711
Epoch: 32887, Training Loss: 0.00803
Epoch: 32887, Training Loss: 0.00751
Epoch: 32887, Training Loss: 0.00918
Epoch: 32887, Training Loss: 0.00711
Epoch: 32888, Training Loss: 0.00803
Epoch: 32888, Training Loss: 0.00751
Epoch: 32888, Training Loss: 0.00918
Epoch: 32888, Training Loss: 0.00711
Epoch: 32889, Training Loss: 0.00803
Epoch: 32889, Training Loss: 0.00751
Epoch: 32889, Training Loss: 0.00918
Epoch: 32889, Training Loss: 0.00711
Epoch: 32890, Training Loss: 0.00803
Epoch: 32890, Training Loss: 0.00751
Epoch: 32890, Training Loss: 0.00917
Epoch: 32890, Training Loss: 0.00711
Epoch: 32891, Training Loss: 0.00803
Epoch: 32891, Training Loss: 0.00751
Epoch: 32891, Training Loss: 0.00917
Epoch: 32891, Training Loss: 0.00711
Epoch: 32892, Training Loss: 0.00803
Epoch: 32892, Training Loss: 0.00751
Epoch: 32892, Training Loss: 0.00917
Epoch: 32892, Training Loss: 0.00711
Epoch: 32893, Training Loss: 0.00803
Epoch: 32893, Training Loss: 0.00751
Epoch: 32893, Training Loss: 0.00917
Epoch: 32893, Training Loss: 0.00711
Epoch: 32894, Training Loss: 0.00803
Epoch: 32894, Training Loss: 0.00751
Epoch: 32894, Training Loss: 0.00917
Epoch: 32894, Training Loss: 0.00711
Epoch: 32895, Training Loss: 0.00803
Epoch: 32895, Training Loss: 0.00751
Epoch: 32895, Training Loss: 0.00917
Epoch: 32895, Training Loss: 0.00711
Epoch: 32896, Training Loss: 0.00803
Epoch: 32896, Training Loss: 0.00751
Epoch: 32896, Training Loss: 0.00917
Epoch: 32896, Training Loss: 0.00711
Epoch: 32897, Training Loss: 0.00803
Epoch: 32897, Training Loss: 0.00751
Epoch: 32897, Training Loss: 0.00917
Epoch: 32897, Training Loss: 0.00711
Epoch: 32898, Training Loss: 0.00803
Epoch: 32898, Training Loss: 0.00751
Epoch: 32898, Training Loss: 0.00917
Epoch: 32898, Training Loss: 0.00711
Epoch: 32899, Training Loss: 0.00803
Epoch: 32899, Training Loss: 0.00751
Epoch: 32899, Training Loss: 0.00917
Epoch: 32899, Training Loss: 0.00711
Epoch: 32900, Training Loss: 0.00803
Epoch: 32900, Training Loss: 0.00751
Epoch: 32900, Training Loss: 0.00917
Epoch: 32900, Training Loss: 0.00711
Epoch: 32901, Training Loss: 0.00803
Epoch: 32901, Training Loss: 0.00751
Epoch: 32901, Training Loss: 0.00917
Epoch: 32901, Training Loss: 0.00711
Epoch: 32902, Training Loss: 0.00803
Epoch: 32902, Training Loss: 0.00751
Epoch: 32902, Training Loss: 0.00917
Epoch: 32902, Training Loss: 0.00711
Epoch: 32903, Training Loss: 0.00803
Epoch: 32903, Training Loss: 0.00750
Epoch: 32903, Training Loss: 0.00917
Epoch: 32903, Training Loss: 0.00711
Epoch: 32904, Training Loss: 0.00803
Epoch: 32904, Training Loss: 0.00750
Epoch: 32904, Training Loss: 0.00917
Epoch: 32904, Training Loss: 0.00711
Epoch: 32905, Training Loss: 0.00803
Epoch: 32905, Training Loss: 0.00750
Epoch: 32905, Training Loss: 0.00917
Epoch: 32905, Training Loss: 0.00711
Epoch: 32906, Training Loss: 0.00803
Epoch: 32906, Training Loss: 0.00750
Epoch: 32906, Training Loss: 0.00917
Epoch: 32906, Training Loss: 0.00711
Epoch: 32907, Training Loss: 0.00803
Epoch: 32907, Training Loss: 0.00750
Epoch: 32907, Training Loss: 0.00917
Epoch: 32907, Training Loss: 0.00711
Epoch: 32908, Training Loss: 0.00803
Epoch: 32908, Training Loss: 0.00750
Epoch: 32908, Training Loss: 0.00917
Epoch: 32908, Training Loss: 0.00711
Epoch: 32909, Training Loss: 0.00803
Epoch: 32909, Training Loss: 0.00750
Epoch: 32909, Training Loss: 0.00917
Epoch: 32909, Training Loss: 0.00711
Epoch: 32910, Training Loss: 0.00803
Epoch: 32910, Training Loss: 0.00750
Epoch: 32910, Training Loss: 0.00917
Epoch: 32910, Training Loss: 0.00711
Epoch: 32911, Training Loss: 0.00803
Epoch: 32911, Training Loss: 0.00750
Epoch: 32911, Training Loss: 0.00917
Epoch: 32911, Training Loss: 0.00711
...
Epoch: 33153, Training Loss: 0.00801
Epoch: 33153, Training Loss: 0.00748
Epoch: 33153, Training Loss: 0.00914
Epoch: 33153, Training Loss: 0.00708
Epoch: 33154, Training Loss: 0.00800
[repetitive output truncated: the loss for each of the four XOR patterns is printed every epoch. Over the ~244 epochs shown, the four per-pattern losses decrease only from (0.00803, 0.00750, 0.00917, 0.00711) at epoch 32910 to (0.00800, 0.00748, 0.00914, 0.00708) at epoch 33154 -- the third pattern is still above the 0.008 target, illustrating the slow convergence of the quadratic cost with sigmoid units once the outputs are near saturation.]
Epoch: 33154, Training Loss: 0.00748
Epoch: 33154, Training Loss: 0.00914
Epoch: 33154, Training Loss: 0.00708
Epoch: 33155, Training Loss: 0.00800
Epoch: 33155, Training Loss: 0.00748
Epoch: 33155, Training Loss: 0.00914
Epoch: 33155, Training Loss: 0.00708
Epoch: 33156, Training Loss: 0.00800
Epoch: 33156, Training Loss: 0.00747
Epoch: 33156, Training Loss: 0.00914
Epoch: 33156, Training Loss: 0.00708
Epoch: 33157, Training Loss: 0.00800
Epoch: 33157, Training Loss: 0.00747
Epoch: 33157, Training Loss: 0.00914
Epoch: 33157, Training Loss: 0.00708
Epoch: 33158, Training Loss: 0.00800
Epoch: 33158, Training Loss: 0.00747
Epoch: 33158, Training Loss: 0.00914
Epoch: 33158, Training Loss: 0.00708
Epoch: 33159, Training Loss: 0.00800
Epoch: 33159, Training Loss: 0.00747
Epoch: 33159, Training Loss: 0.00914
Epoch: 33159, Training Loss: 0.00708
Epoch: 33160, Training Loss: 0.00800
Epoch: 33160, Training Loss: 0.00747
Epoch: 33160, Training Loss: 0.00914
Epoch: 33160, Training Loss: 0.00708
Epoch: 33161, Training Loss: 0.00800
Epoch: 33161, Training Loss: 0.00747
Epoch: 33161, Training Loss: 0.00914
Epoch: 33161, Training Loss: 0.00708
Epoch: 33162, Training Loss: 0.00800
Epoch: 33162, Training Loss: 0.00747
Epoch: 33162, Training Loss: 0.00914
Epoch: 33162, Training Loss: 0.00708
Epoch: 33163, Training Loss: 0.00800
Epoch: 33163, Training Loss: 0.00747
Epoch: 33163, Training Loss: 0.00913
Epoch: 33163, Training Loss: 0.00708
Epoch: 33164, Training Loss: 0.00800
Epoch: 33164, Training Loss: 0.00747
Epoch: 33164, Training Loss: 0.00913
Epoch: 33164, Training Loss: 0.00708
Epoch: 33165, Training Loss: 0.00800
Epoch: 33165, Training Loss: 0.00747
Epoch: 33165, Training Loss: 0.00913
Epoch: 33165, Training Loss: 0.00708
Epoch: 33166, Training Loss: 0.00800
Epoch: 33166, Training Loss: 0.00747
Epoch: 33166, Training Loss: 0.00913
Epoch: 33166, Training Loss: 0.00708
Epoch: 33167, Training Loss: 0.00800
Epoch: 33167, Training Loss: 0.00747
Epoch: 33167, Training Loss: 0.00913
Epoch: 33167, Training Loss: 0.00708
Epoch: 33168, Training Loss: 0.00800
Epoch: 33168, Training Loss: 0.00747
Epoch: 33168, Training Loss: 0.00913
Epoch: 33168, Training Loss: 0.00708
Epoch: 33169, Training Loss: 0.00800
Epoch: 33169, Training Loss: 0.00747
Epoch: 33169, Training Loss: 0.00913
Epoch: 33169, Training Loss: 0.00708
Epoch: 33170, Training Loss: 0.00800
Epoch: 33170, Training Loss: 0.00747
Epoch: 33170, Training Loss: 0.00913
Epoch: 33170, Training Loss: 0.00708
Epoch: 33171, Training Loss: 0.00800
Epoch: 33171, Training Loss: 0.00747
Epoch: 33171, Training Loss: 0.00913
Epoch: 33171, Training Loss: 0.00708
Epoch: 33172, Training Loss: 0.00800
Epoch: 33172, Training Loss: 0.00747
Epoch: 33172, Training Loss: 0.00913
Epoch: 33172, Training Loss: 0.00708
Epoch: 33173, Training Loss: 0.00800
Epoch: 33173, Training Loss: 0.00747
Epoch: 33173, Training Loss: 0.00913
Epoch: 33173, Training Loss: 0.00708
Epoch: 33174, Training Loss: 0.00800
Epoch: 33174, Training Loss: 0.00747
Epoch: 33174, Training Loss: 0.00913
Epoch: 33174, Training Loss: 0.00708
Epoch: 33175, Training Loss: 0.00800
Epoch: 33175, Training Loss: 0.00747
Epoch: 33175, Training Loss: 0.00913
Epoch: 33175, Training Loss: 0.00708
Epoch: 33176, Training Loss: 0.00800
Epoch: 33176, Training Loss: 0.00747
Epoch: 33176, Training Loss: 0.00913
Epoch: 33176, Training Loss: 0.00708
Epoch: 33177, Training Loss: 0.00800
Epoch: 33177, Training Loss: 0.00747
Epoch: 33177, Training Loss: 0.00913
Epoch: 33177, Training Loss: 0.00708
Epoch: 33178, Training Loss: 0.00800
Epoch: 33178, Training Loss: 0.00747
Epoch: 33178, Training Loss: 0.00913
Epoch: 33178, Training Loss: 0.00708
Epoch: 33179, Training Loss: 0.00800
Epoch: 33179, Training Loss: 0.00747
Epoch: 33179, Training Loss: 0.00913
Epoch: 33179, Training Loss: 0.00708
Epoch: 33180, Training Loss: 0.00800
Epoch: 33180, Training Loss: 0.00747
Epoch: 33180, Training Loss: 0.00913
Epoch: 33180, Training Loss: 0.00708
Epoch: 33181, Training Loss: 0.00800
Epoch: 33181, Training Loss: 0.00747
Epoch: 33181, Training Loss: 0.00913
Epoch: 33181, Training Loss: 0.00708
Epoch: 33182, Training Loss: 0.00800
Epoch: 33182, Training Loss: 0.00747
Epoch: 33182, Training Loss: 0.00913
Epoch: 33182, Training Loss: 0.00708
Epoch: 33183, Training Loss: 0.00800
Epoch: 33183, Training Loss: 0.00747
Epoch: 33183, Training Loss: 0.00913
Epoch: 33183, Training Loss: 0.00707
Epoch: 33184, Training Loss: 0.00799
Epoch: 33184, Training Loss: 0.00747
Epoch: 33184, Training Loss: 0.00913
Epoch: 33184, Training Loss: 0.00707
Epoch: 33185, Training Loss: 0.00799
Epoch: 33185, Training Loss: 0.00747
Epoch: 33185, Training Loss: 0.00913
Epoch: 33185, Training Loss: 0.00707
Epoch: 33186, Training Loss: 0.00799
Epoch: 33186, Training Loss: 0.00747
Epoch: 33186, Training Loss: 0.00913
Epoch: 33186, Training Loss: 0.00707
Epoch: 33187, Training Loss: 0.00799
Epoch: 33187, Training Loss: 0.00747
Epoch: 33187, Training Loss: 0.00913
Epoch: 33187, Training Loss: 0.00707
Epoch: 33188, Training Loss: 0.00799
Epoch: 33188, Training Loss: 0.00747
Epoch: 33188, Training Loss: 0.00913
Epoch: 33188, Training Loss: 0.00707
Epoch: 33189, Training Loss: 0.00799
Epoch: 33189, Training Loss: 0.00747
Epoch: 33189, Training Loss: 0.00913
Epoch: 33189, Training Loss: 0.00707
Epoch: 33190, Training Loss: 0.00799
Epoch: 33190, Training Loss: 0.00747
Epoch: 33190, Training Loss: 0.00913
Epoch: 33190, Training Loss: 0.00707
Epoch: 33191, Training Loss: 0.00799
Epoch: 33191, Training Loss: 0.00747
Epoch: 33191, Training Loss: 0.00913
Epoch: 33191, Training Loss: 0.00707
Epoch: 33192, Training Loss: 0.00799
Epoch: 33192, Training Loss: 0.00747
Epoch: 33192, Training Loss: 0.00913
Epoch: 33192, Training Loss: 0.00707
Epoch: 33193, Training Loss: 0.00799
Epoch: 33193, Training Loss: 0.00747
Epoch: 33193, Training Loss: 0.00913
Epoch: 33193, Training Loss: 0.00707
Epoch: 33194, Training Loss: 0.00799
Epoch: 33194, Training Loss: 0.00747
Epoch: 33194, Training Loss: 0.00913
Epoch: 33194, Training Loss: 0.00707
Epoch: 33195, Training Loss: 0.00799
Epoch: 33195, Training Loss: 0.00747
Epoch: 33195, Training Loss: 0.00913
Epoch: 33195, Training Loss: 0.00707
Epoch: 33196, Training Loss: 0.00799
Epoch: 33196, Training Loss: 0.00747
Epoch: 33196, Training Loss: 0.00913
Epoch: 33196, Training Loss: 0.00707
Epoch: 33197, Training Loss: 0.00799
Epoch: 33197, Training Loss: 0.00747
Epoch: 33197, Training Loss: 0.00913
Epoch: 33197, Training Loss: 0.00707
Epoch: 33198, Training Loss: 0.00799
Epoch: 33198, Training Loss: 0.00747
Epoch: 33198, Training Loss: 0.00913
Epoch: 33198, Training Loss: 0.00707
Epoch: 33199, Training Loss: 0.00799
Epoch: 33199, Training Loss: 0.00747
Epoch: 33199, Training Loss: 0.00913
Epoch: 33199, Training Loss: 0.00707
Epoch: 33200, Training Loss: 0.00799
Epoch: 33200, Training Loss: 0.00747
Epoch: 33200, Training Loss: 0.00913
Epoch: 33200, Training Loss: 0.00707
Epoch: 33201, Training Loss: 0.00799
Epoch: 33201, Training Loss: 0.00747
Epoch: 33201, Training Loss: 0.00913
Epoch: 33201, Training Loss: 0.00707
Epoch: 33202, Training Loss: 0.00799
Epoch: 33202, Training Loss: 0.00747
Epoch: 33202, Training Loss: 0.00913
Epoch: 33202, Training Loss: 0.00707
Epoch: 33203, Training Loss: 0.00799
Epoch: 33203, Training Loss: 0.00747
Epoch: 33203, Training Loss: 0.00913
Epoch: 33203, Training Loss: 0.00707
Epoch: 33204, Training Loss: 0.00799
Epoch: 33204, Training Loss: 0.00747
Epoch: 33204, Training Loss: 0.00913
Epoch: 33204, Training Loss: 0.00707
Epoch: 33205, Training Loss: 0.00799
Epoch: 33205, Training Loss: 0.00747
Epoch: 33205, Training Loss: 0.00913
Epoch: 33205, Training Loss: 0.00707
Epoch: 33206, Training Loss: 0.00799
Epoch: 33206, Training Loss: 0.00747
Epoch: 33206, Training Loss: 0.00913
Epoch: 33206, Training Loss: 0.00707
Epoch: 33207, Training Loss: 0.00799
Epoch: 33207, Training Loss: 0.00747
Epoch: 33207, Training Loss: 0.00913
Epoch: 33207, Training Loss: 0.00707
Epoch: 33208, Training Loss: 0.00799
Epoch: 33208, Training Loss: 0.00747
Epoch: 33208, Training Loss: 0.00913
Epoch: 33208, Training Loss: 0.00707
Epoch: 33209, Training Loss: 0.00799
Epoch: 33209, Training Loss: 0.00747
Epoch: 33209, Training Loss: 0.00913
Epoch: 33209, Training Loss: 0.00707
Epoch: 33210, Training Loss: 0.00799
Epoch: 33210, Training Loss: 0.00747
Epoch: 33210, Training Loss: 0.00913
Epoch: 33210, Training Loss: 0.00707
Epoch: 33211, Training Loss: 0.00799
Epoch: 33211, Training Loss: 0.00747
Epoch: 33211, Training Loss: 0.00913
Epoch: 33211, Training Loss: 0.00707
Epoch: 33212, Training Loss: 0.00799
Epoch: 33212, Training Loss: 0.00747
Epoch: 33212, Training Loss: 0.00913
Epoch: 33212, Training Loss: 0.00707
Epoch: 33213, Training Loss: 0.00799
Epoch: 33213, Training Loss: 0.00747
Epoch: 33213, Training Loss: 0.00913
Epoch: 33213, Training Loss: 0.00707
Epoch: 33214, Training Loss: 0.00799
Epoch: 33214, Training Loss: 0.00747
Epoch: 33214, Training Loss: 0.00913
Epoch: 33214, Training Loss: 0.00707
Epoch: 33215, Training Loss: 0.00799
Epoch: 33215, Training Loss: 0.00747
Epoch: 33215, Training Loss: 0.00913
Epoch: 33215, Training Loss: 0.00707
Epoch: 33216, Training Loss: 0.00799
Epoch: 33216, Training Loss: 0.00747
Epoch: 33216, Training Loss: 0.00913
Epoch: 33216, Training Loss: 0.00707
Epoch: 33217, Training Loss: 0.00799
Epoch: 33217, Training Loss: 0.00747
Epoch: 33217, Training Loss: 0.00913
Epoch: 33217, Training Loss: 0.00707
Epoch: 33218, Training Loss: 0.00799
Epoch: 33218, Training Loss: 0.00747
Epoch: 33218, Training Loss: 0.00913
Epoch: 33218, Training Loss: 0.00707
Epoch: 33219, Training Loss: 0.00799
Epoch: 33219, Training Loss: 0.00747
Epoch: 33219, Training Loss: 0.00913
Epoch: 33219, Training Loss: 0.00707
Epoch: 33220, Training Loss: 0.00799
Epoch: 33220, Training Loss: 0.00747
Epoch: 33220, Training Loss: 0.00913
Epoch: 33220, Training Loss: 0.00707
Epoch: 33221, Training Loss: 0.00799
Epoch: 33221, Training Loss: 0.00747
Epoch: 33221, Training Loss: 0.00913
Epoch: 33221, Training Loss: 0.00707
Epoch: 33222, Training Loss: 0.00799
Epoch: 33222, Training Loss: 0.00747
Epoch: 33222, Training Loss: 0.00913
Epoch: 33222, Training Loss: 0.00707
Epoch: 33223, Training Loss: 0.00799
Epoch: 33223, Training Loss: 0.00747
Epoch: 33223, Training Loss: 0.00913
Epoch: 33223, Training Loss: 0.00707
Epoch: 33224, Training Loss: 0.00799
Epoch: 33224, Training Loss: 0.00747
Epoch: 33224, Training Loss: 0.00913
Epoch: 33224, Training Loss: 0.00707
Epoch: 33225, Training Loss: 0.00799
Epoch: 33225, Training Loss: 0.00747
Epoch: 33225, Training Loss: 0.00913
Epoch: 33225, Training Loss: 0.00707
Epoch: 33226, Training Loss: 0.00799
Epoch: 33226, Training Loss: 0.00747
Epoch: 33226, Training Loss: 0.00913
Epoch: 33226, Training Loss: 0.00707
Epoch: 33227, Training Loss: 0.00799
Epoch: 33227, Training Loss: 0.00747
Epoch: 33227, Training Loss: 0.00913
Epoch: 33227, Training Loss: 0.00707
Epoch: 33228, Training Loss: 0.00799
Epoch: 33228, Training Loss: 0.00747
Epoch: 33228, Training Loss: 0.00913
Epoch: 33228, Training Loss: 0.00707
Epoch: 33229, Training Loss: 0.00799
Epoch: 33229, Training Loss: 0.00747
Epoch: 33229, Training Loss: 0.00913
Epoch: 33229, Training Loss: 0.00707
Epoch: 33230, Training Loss: 0.00799
Epoch: 33230, Training Loss: 0.00747
Epoch: 33230, Training Loss: 0.00913
Epoch: 33230, Training Loss: 0.00707
Epoch: 33231, Training Loss: 0.00799
Epoch: 33231, Training Loss: 0.00747
Epoch: 33231, Training Loss: 0.00913
Epoch: 33231, Training Loss: 0.00707
Epoch: 33232, Training Loss: 0.00799
Epoch: 33232, Training Loss: 0.00747
Epoch: 33232, Training Loss: 0.00912
Epoch: 33232, Training Loss: 0.00707
Epoch: 33233, Training Loss: 0.00799
Epoch: 33233, Training Loss: 0.00747
Epoch: 33233, Training Loss: 0.00912
Epoch: 33233, Training Loss: 0.00707
Epoch: 33234, Training Loss: 0.00799
Epoch: 33234, Training Loss: 0.00747
Epoch: 33234, Training Loss: 0.00912
Epoch: 33234, Training Loss: 0.00707
Epoch: 33235, Training Loss: 0.00799
Epoch: 33235, Training Loss: 0.00747
Epoch: 33235, Training Loss: 0.00912
Epoch: 33235, Training Loss: 0.00707
Epoch: 33236, Training Loss: 0.00799
Epoch: 33236, Training Loss: 0.00747
Epoch: 33236, Training Loss: 0.00912
Epoch: 33236, Training Loss: 0.00707
Epoch: 33237, Training Loss: 0.00799
Epoch: 33237, Training Loss: 0.00747
Epoch: 33237, Training Loss: 0.00912
Epoch: 33237, Training Loss: 0.00707
Epoch: 33238, Training Loss: 0.00799
Epoch: 33238, Training Loss: 0.00747
Epoch: 33238, Training Loss: 0.00912
Epoch: 33238, Training Loss: 0.00707
Epoch: 33239, Training Loss: 0.00799
Epoch: 33239, Training Loss: 0.00747
Epoch: 33239, Training Loss: 0.00912
Epoch: 33239, Training Loss: 0.00707
Epoch: 33240, Training Loss: 0.00799
Epoch: 33240, Training Loss: 0.00747
Epoch: 33240, Training Loss: 0.00912
Epoch: 33240, Training Loss: 0.00707
Epoch: 33241, Training Loss: 0.00799
Epoch: 33241, Training Loss: 0.00747
Epoch: 33241, Training Loss: 0.00912
Epoch: 33241, Training Loss: 0.00707
Epoch: 33242, Training Loss: 0.00799
Epoch: 33242, Training Loss: 0.00746
Epoch: 33242, Training Loss: 0.00912
Epoch: 33242, Training Loss: 0.00707
Epoch: 33243, Training Loss: 0.00799
Epoch: 33243, Training Loss: 0.00746
Epoch: 33243, Training Loss: 0.00912
Epoch: 33243, Training Loss: 0.00707
Epoch: 33244, Training Loss: 0.00799
Epoch: 33244, Training Loss: 0.00746
Epoch: 33244, Training Loss: 0.00912
Epoch: 33244, Training Loss: 0.00707
Epoch: 33245, Training Loss: 0.00799
Epoch: 33245, Training Loss: 0.00746
Epoch: 33245, Training Loss: 0.00912
Epoch: 33245, Training Loss: 0.00707
Epoch: 33246, Training Loss: 0.00799
Epoch: 33246, Training Loss: 0.00746
Epoch: 33246, Training Loss: 0.00912
Epoch: 33246, Training Loss: 0.00707
Epoch: 33247, Training Loss: 0.00799
Epoch: 33247, Training Loss: 0.00746
Epoch: 33247, Training Loss: 0.00912
Epoch: 33247, Training Loss: 0.00707
Epoch: 33248, Training Loss: 0.00799
Epoch: 33248, Training Loss: 0.00746
Epoch: 33248, Training Loss: 0.00912
Epoch: 33248, Training Loss: 0.00707
Epoch: 33249, Training Loss: 0.00799
Epoch: 33249, Training Loss: 0.00746
Epoch: 33249, Training Loss: 0.00912
Epoch: 33249, Training Loss: 0.00707
Epoch: 33250, Training Loss: 0.00799
Epoch: 33250, Training Loss: 0.00746
Epoch: 33250, Training Loss: 0.00912
Epoch: 33250, Training Loss: 0.00707
Epoch: 33251, Training Loss: 0.00799
Epoch: 33251, Training Loss: 0.00746
Epoch: 33251, Training Loss: 0.00912
Epoch: 33251, Training Loss: 0.00707
Epoch: 33252, Training Loss: 0.00799
Epoch: 33252, Training Loss: 0.00746
Epoch: 33252, Training Loss: 0.00912
Epoch: 33252, Training Loss: 0.00707
Epoch: 33253, Training Loss: 0.00799
Epoch: 33253, Training Loss: 0.00746
Epoch: 33253, Training Loss: 0.00912
Epoch: 33253, Training Loss: 0.00707
Epoch: 33254, Training Loss: 0.00799
Epoch: 33254, Training Loss: 0.00746
Epoch: 33254, Training Loss: 0.00912
Epoch: 33254, Training Loss: 0.00707
Epoch: 33255, Training Loss: 0.00799
Epoch: 33255, Training Loss: 0.00746
Epoch: 33255, Training Loss: 0.00912
Epoch: 33255, Training Loss: 0.00707
Epoch: 33256, Training Loss: 0.00799
Epoch: 33256, Training Loss: 0.00746
Epoch: 33256, Training Loss: 0.00912
Epoch: 33256, Training Loss: 0.00707
Epoch: 33257, Training Loss: 0.00799
Epoch: 33257, Training Loss: 0.00746
Epoch: 33257, Training Loss: 0.00912
Epoch: 33257, Training Loss: 0.00707
Epoch: 33258, Training Loss: 0.00799
Epoch: 33258, Training Loss: 0.00746
Epoch: 33258, Training Loss: 0.00912
Epoch: 33258, Training Loss: 0.00707
Epoch: 33259, Training Loss: 0.00799
Epoch: 33259, Training Loss: 0.00746
Epoch: 33259, Training Loss: 0.00912
Epoch: 33259, Training Loss: 0.00707
Epoch: 33260, Training Loss: 0.00799
Epoch: 33260, Training Loss: 0.00746
Epoch: 33260, Training Loss: 0.00912
Epoch: 33260, Training Loss: 0.00707
Epoch: 33261, Training Loss: 0.00799
Epoch: 33261, Training Loss: 0.00746
Epoch: 33261, Training Loss: 0.00912
Epoch: 33261, Training Loss: 0.00707
Epoch: 33262, Training Loss: 0.00799
Epoch: 33262, Training Loss: 0.00746
Epoch: 33262, Training Loss: 0.00912
Epoch: 33262, Training Loss: 0.00707
Epoch: 33263, Training Loss: 0.00798
Epoch: 33263, Training Loss: 0.00746
Epoch: 33263, Training Loss: 0.00912
Epoch: 33263, Training Loss: 0.00707
Epoch: 33264, Training Loss: 0.00798
Epoch: 33264, Training Loss: 0.00746
Epoch: 33264, Training Loss: 0.00912
Epoch: 33264, Training Loss: 0.00707
Epoch: 33265, Training Loss: 0.00798
Epoch: 33265, Training Loss: 0.00746
Epoch: 33265, Training Loss: 0.00912
Epoch: 33265, Training Loss: 0.00707
Epoch: 33266, Training Loss: 0.00798
Epoch: 33266, Training Loss: 0.00746
Epoch: 33266, Training Loss: 0.00912
Epoch: 33266, Training Loss: 0.00707
Epoch: 33267, Training Loss: 0.00798
Epoch: 33267, Training Loss: 0.00746
Epoch: 33267, Training Loss: 0.00912
Epoch: 33267, Training Loss: 0.00707
Epoch: 33268, Training Loss: 0.00798
Epoch: 33268, Training Loss: 0.00746
Epoch: 33268, Training Loss: 0.00912
Epoch: 33268, Training Loss: 0.00707
Epoch: 33269, Training Loss: 0.00798
Epoch: 33269, Training Loss: 0.00746
Epoch: 33269, Training Loss: 0.00912
Epoch: 33269, Training Loss: 0.00707
Epoch: 33270, Training Loss: 0.00798
Epoch: 33270, Training Loss: 0.00746
Epoch: 33270, Training Loss: 0.00912
Epoch: 33270, Training Loss: 0.00707
Epoch: 33271, Training Loss: 0.00798
Epoch: 33271, Training Loss: 0.00746
Epoch: 33271, Training Loss: 0.00912
Epoch: 33271, Training Loss: 0.00707
Epoch: 33272, Training Loss: 0.00798
Epoch: 33272, Training Loss: 0.00746
Epoch: 33272, Training Loss: 0.00912
Epoch: 33272, Training Loss: 0.00707
Epoch: 33273, Training Loss: 0.00798
Epoch: 33273, Training Loss: 0.00746
Epoch: 33273, Training Loss: 0.00912
Epoch: 33273, Training Loss: 0.00706
Epoch: 33274, Training Loss: 0.00798
Epoch: 33274, Training Loss: 0.00746
Epoch: 33274, Training Loss: 0.00912
Epoch: 33274, Training Loss: 0.00706
Epoch: 33275, Training Loss: 0.00798
Epoch: 33275, Training Loss: 0.00746
Epoch: 33275, Training Loss: 0.00912
Epoch: 33275, Training Loss: 0.00706
Epoch: 33276, Training Loss: 0.00798
Epoch: 33276, Training Loss: 0.00746
Epoch: 33276, Training Loss: 0.00912
Epoch: 33276, Training Loss: 0.00706
Epoch: 33277, Training Loss: 0.00798
Epoch: 33277, Training Loss: 0.00746
Epoch: 33277, Training Loss: 0.00912
Epoch: 33277, Training Loss: 0.00706
Epoch: 33278, Training Loss: 0.00798
Epoch: 33278, Training Loss: 0.00746
Epoch: 33278, Training Loss: 0.00912
Epoch: 33278, Training Loss: 0.00706
Epoch: 33279, Training Loss: 0.00798
Epoch: 33279, Training Loss: 0.00746
Epoch: 33279, Training Loss: 0.00912
Epoch: 33279, Training Loss: 0.00706
Epoch: 33280, Training Loss: 0.00798
Epoch: 33280, Training Loss: 0.00746
Epoch: 33280, Training Loss: 0.00912
Epoch: 33280, Training Loss: 0.00706
Epoch: 33281, Training Loss: 0.00798
Epoch: 33281, Training Loss: 0.00746
Epoch: 33281, Training Loss: 0.00912
Epoch: 33281, Training Loss: 0.00706
Epoch: 33282, Training Loss: 0.00798
Epoch: 33282, Training Loss: 0.00746
Epoch: 33282, Training Loss: 0.00912
Epoch: 33282, Training Loss: 0.00706
Epoch: 33283, Training Loss: 0.00798
Epoch: 33283, Training Loss: 0.00746
Epoch: 33283, Training Loss: 0.00912
Epoch: 33283, Training Loss: 0.00706
Epoch: 33284, Training Loss: 0.00798
Epoch: 33284, Training Loss: 0.00746
Epoch: 33284, Training Loss: 0.00912
Epoch: 33284, Training Loss: 0.00706
Epoch: 33285, Training Loss: 0.00798
Epoch: 33285, Training Loss: 0.00746
Epoch: 33285, Training Loss: 0.00912
Epoch: 33285, Training Loss: 0.00706
Epoch: 33286, Training Loss: 0.00798
Epoch: 33286, Training Loss: 0.00746
Epoch: 33286, Training Loss: 0.00912
Epoch: 33286, Training Loss: 0.00706
Epoch: 33287, Training Loss: 0.00798
Epoch: 33287, Training Loss: 0.00746
Epoch: 33287, Training Loss: 0.00912
Epoch: 33287, Training Loss: 0.00706
Epoch: 33288, Training Loss: 0.00798
Epoch: 33288, Training Loss: 0.00746
Epoch: 33288, Training Loss: 0.00912
Epoch: 33288, Training Loss: 0.00706
Epoch: 33289, Training Loss: 0.00798
Epoch: 33289, Training Loss: 0.00746
Epoch: 33289, Training Loss: 0.00912
Epoch: 33289, Training Loss: 0.00706
Epoch: 33290, Training Loss: 0.00798
Epoch: 33290, Training Loss: 0.00746
Epoch: 33290, Training Loss: 0.00912
Epoch: 33290, Training Loss: 0.00706
Epoch: 33291, Training Loss: 0.00798
Epoch: 33291, Training Loss: 0.00746
Epoch: 33291, Training Loss: 0.00912
Epoch: 33291, Training Loss: 0.00706
Epoch: 33292, Training Loss: 0.00798
Epoch: 33292, Training Loss: 0.00746
Epoch: 33292, Training Loss: 0.00912
Epoch: 33292, Training Loss: 0.00706
Epoch: 33293, Training Loss: 0.00798
Epoch: 33293, Training Loss: 0.00746
Epoch: 33293, Training Loss: 0.00912
Epoch: 33293, Training Loss: 0.00706
Epoch: 33294, Training Loss: 0.00798
Epoch: 33294, Training Loss: 0.00746
Epoch: 33294, Training Loss: 0.00912
Epoch: 33294, Training Loss: 0.00706
Epoch: 33295, Training Loss: 0.00798
Epoch: 33295, Training Loss: 0.00746
Epoch: 33295, Training Loss: 0.00912
Epoch: 33295, Training Loss: 0.00706
Epoch: 33296, Training Loss: 0.00798
Epoch: 33296, Training Loss: 0.00746
Epoch: 33296, Training Loss: 0.00912
Epoch: 33296, Training Loss: 0.00706
Epoch: 33297, Training Loss: 0.00798
Epoch: 33297, Training Loss: 0.00746
Epoch: 33297, Training Loss: 0.00912
Epoch: 33297, Training Loss: 0.00706
Epoch: 33298, Training Loss: 0.00798
Epoch: 33298, Training Loss: 0.00746
Epoch: 33298, Training Loss: 0.00912
Epoch: 33298, Training Loss: 0.00706
Epoch: 33299, Training Loss: 0.00798
Epoch: 33299, Training Loss: 0.00746
Epoch: 33299, Training Loss: 0.00912
Epoch: 33299, Training Loss: 0.00706
Epoch: 33300, Training Loss: 0.00798
Epoch: 33300, Training Loss: 0.00746
Epoch: 33300, Training Loss: 0.00912
Epoch: 33300, Training Loss: 0.00706
Epoch: 33301, Training Loss: 0.00798
Epoch: 33301, Training Loss: 0.00746
Epoch: 33301, Training Loss: 0.00911
Epoch: 33301, Training Loss: 0.00706
Epoch: 33302, Training Loss: 0.00798
Epoch: 33302, Training Loss: 0.00746
Epoch: 33302, Training Loss: 0.00911
Epoch: 33302, Training Loss: 0.00706
Epoch: 33303, Training Loss: 0.00798
Epoch: 33303, Training Loss: 0.00746
Epoch: 33303, Training Loss: 0.00911
Epoch: 33303, Training Loss: 0.00706
Epoch: 33304, Training Loss: 0.00798
Epoch: 33304, Training Loss: 0.00746
Epoch: 33304, Training Loss: 0.00911
Epoch: 33304, Training Loss: 0.00706
Epoch: 33305, Training Loss: 0.00798
Epoch: 33305, Training Loss: 0.00746
Epoch: 33305, Training Loss: 0.00911
Epoch: 33305, Training Loss: 0.00706
Epoch: 33306, Training Loss: 0.00798
Epoch: 33306, Training Loss: 0.00746
Epoch: 33306, Training Loss: 0.00911
Epoch: 33306, Training Loss: 0.00706
Epoch: 33307, Training Loss: 0.00798
Epoch: 33307, Training Loss: 0.00746
Epoch: 33307, Training Loss: 0.00911
Epoch: 33307, Training Loss: 0.00706
Epoch: 33308, Training Loss: 0.00798
Epoch: 33308, Training Loss: 0.00746
Epoch: 33308, Training Loss: 0.00911
Epoch: 33308, Training Loss: 0.00706
Epoch: 33309, Training Loss: 0.00798
Epoch: 33309, Training Loss: 0.00746
Epoch: 33309, Training Loss: 0.00911
Epoch: 33309, Training Loss: 0.00706
Epoch: 33310, Training Loss: 0.00798
Epoch: 33310, Training Loss: 0.00746
Epoch: 33310, Training Loss: 0.00911
Epoch: 33310, Training Loss: 0.00706
Epoch: 33311, Training Loss: 0.00798
Epoch: 33311, Training Loss: 0.00746
Epoch: 33311, Training Loss: 0.00911
Epoch: 33311, Training Loss: 0.00706
Epoch: 33312, Training Loss: 0.00798
Epoch: 33312, Training Loss: 0.00746
Epoch: 33312, Training Loss: 0.00911
Epoch: 33312, Training Loss: 0.00706
Epoch: 33313, Training Loss: 0.00798
Epoch: 33313, Training Loss: 0.00746
Epoch: 33313, Training Loss: 0.00911
Epoch: 33313, Training Loss: 0.00706
Epoch: 33314, Training Loss: 0.00798
Epoch: 33314, Training Loss: 0.00746
Epoch: 33314, Training Loss: 0.00911
Epoch: 33314, Training Loss: 0.00706
Epoch: 33315, Training Loss: 0.00798
Epoch: 33315, Training Loss: 0.00746
Epoch: 33315, Training Loss: 0.00911
Epoch: 33315, Training Loss: 0.00706
Epoch: 33316, Training Loss: 0.00798
Epoch: 33316, Training Loss: 0.00746
Epoch: 33316, Training Loss: 0.00911
Epoch: 33316, Training Loss: 0.00706
Epoch: 33317, Training Loss: 0.00798
Epoch: 33317, Training Loss: 0.00746
Epoch: 33317, Training Loss: 0.00911
Epoch: 33317, Training Loss: 0.00706
Epoch: 33318, Training Loss: 0.00798
Epoch: 33318, Training Loss: 0.00746
Epoch: 33318, Training Loss: 0.00911
Epoch: 33318, Training Loss: 0.00706
Epoch: 33319, Training Loss: 0.00798
Epoch: 33319, Training Loss: 0.00746
Epoch: 33319, Training Loss: 0.00911
Epoch: 33319, Training Loss: 0.00706
Epoch: 33320, Training Loss: 0.00798
Epoch: 33320, Training Loss: 0.00746
Epoch: 33320, Training Loss: 0.00911
Epoch: 33320, Training Loss: 0.00706
Epoch: 33321, Training Loss: 0.00798
Epoch: 33321, Training Loss: 0.00746
Epoch: 33321, Training Loss: 0.00911
Epoch: 33321, Training Loss: 0.00706
Epoch: 33322, Training Loss: 0.00798
Epoch: 33322, Training Loss: 0.00746
Epoch: 33322, Training Loss: 0.00911
Epoch: 33322, Training Loss: 0.00706
Epoch: 33323, Training Loss: 0.00798
Epoch: 33323, Training Loss: 0.00746
Epoch: 33323, Training Loss: 0.00911
Epoch: 33323, Training Loss: 0.00706
Epoch: 33324, Training Loss: 0.00798
Epoch: 33324, Training Loss: 0.00746
Epoch: 33324, Training Loss: 0.00911
Epoch: 33324, Training Loss: 0.00706
Epoch: 33325, Training Loss: 0.00798
Epoch: 33325, Training Loss: 0.00746
Epoch: 33325, Training Loss: 0.00911
Epoch: 33325, Training Loss: 0.00706
Epoch: 33326, Training Loss: 0.00798
Epoch: 33326, Training Loss: 0.00746
Epoch: 33326, Training Loss: 0.00911
Epoch: 33326, Training Loss: 0.00706
Epoch: 33327, Training Loss: 0.00798
Epoch: 33327, Training Loss: 0.00745
Epoch: 33327, Training Loss: 0.00911
Epoch: 33327, Training Loss: 0.00706
Epoch: 33328, Training Loss: 0.00798
Epoch: 33328, Training Loss: 0.00745
Epoch: 33328, Training Loss: 0.00911
Epoch: 33328, Training Loss: 0.00706
Epoch: 33329, Training Loss: 0.00798
Epoch: 33329, Training Loss: 0.00745
Epoch: 33329, Training Loss: 0.00911
Epoch: 33329, Training Loss: 0.00706
Epoch: 33330, Training Loss: 0.00798
Epoch: 33330, Training Loss: 0.00745
Epoch: 33330, Training Loss: 0.00911
Epoch: 33330, Training Loss: 0.00706
Epoch: 33331, Training Loss: 0.00798
Epoch: 33331, Training Loss: 0.00745
Epoch: 33331, Training Loss: 0.00911
Epoch: 33331, Training Loss: 0.00706
Epoch: 33332, Training Loss: 0.00798
Epoch: 33332, Training Loss: 0.00745
Epoch: 33332, Training Loss: 0.00911
Epoch: 33332, Training Loss: 0.00706
Epoch: 33333, Training Loss: 0.00798
Epoch: 33333, Training Loss: 0.00745
Epoch: 33333, Training Loss: 0.00911
Epoch: 33333, Training Loss: 0.00706
Epoch: 33334, Training Loss: 0.00798
Epoch: 33334, Training Loss: 0.00745
Epoch: 33334, Training Loss: 0.00911
Epoch: 33334, Training Loss: 0.00706
Epoch: 33335, Training Loss: 0.00798
Epoch: 33335, Training Loss: 0.00745
Epoch: 33335, Training Loss: 0.00911
Epoch: 33335, Training Loss: 0.00706
Epoch: 33336, Training Loss: 0.00798
Epoch: 33336, Training Loss: 0.00745
Epoch: 33336, Training Loss: 0.00911
Epoch: 33336, Training Loss: 0.00706
Epoch: 33337, Training Loss: 0.00798
Epoch: 33337, Training Loss: 0.00745
Epoch: 33337, Training Loss: 0.00911
Epoch: 33337, Training Loss: 0.00706
Epoch: 33338, Training Loss: 0.00798
Epoch: 33338, Training Loss: 0.00745
Epoch: 33338, Training Loss: 0.00911
Epoch: 33338, Training Loss: 0.00706
Epoch: 33339, Training Loss: 0.00798
Epoch: 33339, Training Loss: 0.00745
Epoch: 33339, Training Loss: 0.00911
Epoch: 33339, Training Loss: 0.00706
Epoch: 33340, Training Loss: 0.00798
Epoch: 33340, Training Loss: 0.00745
Epoch: 33340, Training Loss: 0.00911
Epoch: 33340, Training Loss: 0.00706
Epoch: 33341, Training Loss: 0.00797
Epoch: 33341, Training Loss: 0.00745
Epoch: 33341, Training Loss: 0.00911
Epoch: 33341, Training Loss: 0.00706
Epoch: 33342, Training Loss: 0.00797
Epoch: 33342, Training Loss: 0.00745
Epoch: 33342, Training Loss: 0.00911
Epoch: 33342, Training Loss: 0.00706
Epoch: 33343, Training Loss: 0.00797
Epoch: 33343, Training Loss: 0.00745
Epoch: 33343, Training Loss: 0.00911
Epoch: 33343, Training Loss: 0.00706
...
Epoch: 33586, Training Loss: 0.00794
Epoch: 33586, Training Loss: 0.00742
Epoch: 33586, Training Loss: 0.00907
[output condensed: the four per-sample losses are printed every epoch and decrease very slowly (e.g. 0.00797 → 0.00794, 0.00911 → 0.00907 over ~240 epochs), with the largest sample loss still above the 0.008 target at epoch 33586]
Epoch: 33586, Training Loss: 0.00703
Epoch: 33587, Training Loss: 0.00794
Epoch: 33587, Training Loss: 0.00742
Epoch: 33587, Training Loss: 0.00907
Epoch: 33587, Training Loss: 0.00703
Epoch: 33588, Training Loss: 0.00794
Epoch: 33588, Training Loss: 0.00742
Epoch: 33588, Training Loss: 0.00907
Epoch: 33588, Training Loss: 0.00703
Epoch: 33589, Training Loss: 0.00794
Epoch: 33589, Training Loss: 0.00742
Epoch: 33589, Training Loss: 0.00907
Epoch: 33589, Training Loss: 0.00703
Epoch: 33590, Training Loss: 0.00794
Epoch: 33590, Training Loss: 0.00742
Epoch: 33590, Training Loss: 0.00907
Epoch: 33590, Training Loss: 0.00703
Epoch: 33591, Training Loss: 0.00794
Epoch: 33591, Training Loss: 0.00742
Epoch: 33591, Training Loss: 0.00907
Epoch: 33591, Training Loss: 0.00703
Epoch: 33592, Training Loss: 0.00794
Epoch: 33592, Training Loss: 0.00742
Epoch: 33592, Training Loss: 0.00907
Epoch: 33592, Training Loss: 0.00703
Epoch: 33593, Training Loss: 0.00794
Epoch: 33593, Training Loss: 0.00742
Epoch: 33593, Training Loss: 0.00907
Epoch: 33593, Training Loss: 0.00703
Epoch: 33594, Training Loss: 0.00794
Epoch: 33594, Training Loss: 0.00742
Epoch: 33594, Training Loss: 0.00907
Epoch: 33594, Training Loss: 0.00703
Epoch: 33595, Training Loss: 0.00794
Epoch: 33595, Training Loss: 0.00742
Epoch: 33595, Training Loss: 0.00907
Epoch: 33595, Training Loss: 0.00703
Epoch: 33596, Training Loss: 0.00794
Epoch: 33596, Training Loss: 0.00742
Epoch: 33596, Training Loss: 0.00907
Epoch: 33596, Training Loss: 0.00703
Epoch: 33597, Training Loss: 0.00794
Epoch: 33597, Training Loss: 0.00742
Epoch: 33597, Training Loss: 0.00907
Epoch: 33597, Training Loss: 0.00703
Epoch: 33598, Training Loss: 0.00794
Epoch: 33598, Training Loss: 0.00742
Epoch: 33598, Training Loss: 0.00907
Epoch: 33598, Training Loss: 0.00703
Epoch: 33599, Training Loss: 0.00794
Epoch: 33599, Training Loss: 0.00742
Epoch: 33599, Training Loss: 0.00907
Epoch: 33599, Training Loss: 0.00703
Epoch: 33600, Training Loss: 0.00794
Epoch: 33600, Training Loss: 0.00742
Epoch: 33600, Training Loss: 0.00907
Epoch: 33600, Training Loss: 0.00703
Epoch: 33601, Training Loss: 0.00794
Epoch: 33601, Training Loss: 0.00742
Epoch: 33601, Training Loss: 0.00907
Epoch: 33601, Training Loss: 0.00703
Epoch: 33602, Training Loss: 0.00794
Epoch: 33602, Training Loss: 0.00742
Epoch: 33602, Training Loss: 0.00907
Epoch: 33602, Training Loss: 0.00703
Epoch: 33603, Training Loss: 0.00794
Epoch: 33603, Training Loss: 0.00742
Epoch: 33603, Training Loss: 0.00907
Epoch: 33603, Training Loss: 0.00703
Epoch: 33604, Training Loss: 0.00794
Epoch: 33604, Training Loss: 0.00742
Epoch: 33604, Training Loss: 0.00907
Epoch: 33604, Training Loss: 0.00703
Epoch: 33605, Training Loss: 0.00794
Epoch: 33605, Training Loss: 0.00742
Epoch: 33605, Training Loss: 0.00907
Epoch: 33605, Training Loss: 0.00703
Epoch: 33606, Training Loss: 0.00794
Epoch: 33606, Training Loss: 0.00742
Epoch: 33606, Training Loss: 0.00907
Epoch: 33606, Training Loss: 0.00703
Epoch: 33607, Training Loss: 0.00794
Epoch: 33607, Training Loss: 0.00742
Epoch: 33607, Training Loss: 0.00907
Epoch: 33607, Training Loss: 0.00703
Epoch: 33608, Training Loss: 0.00794
Epoch: 33608, Training Loss: 0.00742
Epoch: 33608, Training Loss: 0.00907
Epoch: 33608, Training Loss: 0.00703
Epoch: 33609, Training Loss: 0.00794
Epoch: 33609, Training Loss: 0.00742
Epoch: 33609, Training Loss: 0.00907
Epoch: 33609, Training Loss: 0.00703
Epoch: 33610, Training Loss: 0.00794
Epoch: 33610, Training Loss: 0.00742
Epoch: 33610, Training Loss: 0.00907
Epoch: 33610, Training Loss: 0.00703
Epoch: 33611, Training Loss: 0.00794
Epoch: 33611, Training Loss: 0.00742
Epoch: 33611, Training Loss: 0.00907
Epoch: 33611, Training Loss: 0.00703
Epoch: 33612, Training Loss: 0.00794
Epoch: 33612, Training Loss: 0.00742
Epoch: 33612, Training Loss: 0.00907
Epoch: 33612, Training Loss: 0.00703
Epoch: 33613, Training Loss: 0.00794
Epoch: 33613, Training Loss: 0.00742
Epoch: 33613, Training Loss: 0.00907
Epoch: 33613, Training Loss: 0.00703
Epoch: 33614, Training Loss: 0.00794
Epoch: 33614, Training Loss: 0.00742
Epoch: 33614, Training Loss: 0.00907
Epoch: 33614, Training Loss: 0.00703
Epoch: 33615, Training Loss: 0.00794
Epoch: 33615, Training Loss: 0.00742
Epoch: 33615, Training Loss: 0.00907
Epoch: 33615, Training Loss: 0.00703
Epoch: 33616, Training Loss: 0.00794
Epoch: 33616, Training Loss: 0.00742
Epoch: 33616, Training Loss: 0.00907
Epoch: 33616, Training Loss: 0.00703
Epoch: 33617, Training Loss: 0.00794
Epoch: 33617, Training Loss: 0.00742
Epoch: 33617, Training Loss: 0.00907
Epoch: 33617, Training Loss: 0.00703
Epoch: 33618, Training Loss: 0.00794
Epoch: 33618, Training Loss: 0.00742
Epoch: 33618, Training Loss: 0.00907
Epoch: 33618, Training Loss: 0.00703
Epoch: 33619, Training Loss: 0.00794
Epoch: 33619, Training Loss: 0.00742
Epoch: 33619, Training Loss: 0.00907
Epoch: 33619, Training Loss: 0.00703
Epoch: 33620, Training Loss: 0.00794
Epoch: 33620, Training Loss: 0.00742
Epoch: 33620, Training Loss: 0.00907
Epoch: 33620, Training Loss: 0.00703
Epoch: 33621, Training Loss: 0.00794
Epoch: 33621, Training Loss: 0.00742
Epoch: 33621, Training Loss: 0.00907
Epoch: 33621, Training Loss: 0.00703
Epoch: 33622, Training Loss: 0.00794
Epoch: 33622, Training Loss: 0.00742
Epoch: 33622, Training Loss: 0.00907
Epoch: 33622, Training Loss: 0.00703
Epoch: 33623, Training Loss: 0.00794
Epoch: 33623, Training Loss: 0.00742
Epoch: 33623, Training Loss: 0.00907
Epoch: 33623, Training Loss: 0.00703
Epoch: 33624, Training Loss: 0.00794
Epoch: 33624, Training Loss: 0.00742
Epoch: 33624, Training Loss: 0.00907
Epoch: 33624, Training Loss: 0.00703
Epoch: 33625, Training Loss: 0.00794
Epoch: 33625, Training Loss: 0.00742
Epoch: 33625, Training Loss: 0.00907
Epoch: 33625, Training Loss: 0.00703
Epoch: 33626, Training Loss: 0.00794
Epoch: 33626, Training Loss: 0.00742
Epoch: 33626, Training Loss: 0.00907
Epoch: 33626, Training Loss: 0.00703
Epoch: 33627, Training Loss: 0.00794
Epoch: 33627, Training Loss: 0.00742
Epoch: 33627, Training Loss: 0.00907
Epoch: 33627, Training Loss: 0.00703
Epoch: 33628, Training Loss: 0.00794
Epoch: 33628, Training Loss: 0.00742
Epoch: 33628, Training Loss: 0.00907
Epoch: 33628, Training Loss: 0.00703
Epoch: 33629, Training Loss: 0.00794
Epoch: 33629, Training Loss: 0.00742
Epoch: 33629, Training Loss: 0.00907
Epoch: 33629, Training Loss: 0.00703
Epoch: 33630, Training Loss: 0.00794
Epoch: 33630, Training Loss: 0.00742
Epoch: 33630, Training Loss: 0.00907
Epoch: 33630, Training Loss: 0.00703
Epoch: 33631, Training Loss: 0.00794
Epoch: 33631, Training Loss: 0.00742
Epoch: 33631, Training Loss: 0.00907
Epoch: 33631, Training Loss: 0.00703
Epoch: 33632, Training Loss: 0.00794
Epoch: 33632, Training Loss: 0.00742
Epoch: 33632, Training Loss: 0.00907
Epoch: 33632, Training Loss: 0.00703
Epoch: 33633, Training Loss: 0.00794
Epoch: 33633, Training Loss: 0.00742
Epoch: 33633, Training Loss: 0.00907
Epoch: 33633, Training Loss: 0.00703
Epoch: 33634, Training Loss: 0.00794
Epoch: 33634, Training Loss: 0.00742
Epoch: 33634, Training Loss: 0.00907
Epoch: 33634, Training Loss: 0.00703
Epoch: 33635, Training Loss: 0.00794
Epoch: 33635, Training Loss: 0.00742
Epoch: 33635, Training Loss: 0.00907
Epoch: 33635, Training Loss: 0.00703
Epoch: 33636, Training Loss: 0.00794
Epoch: 33636, Training Loss: 0.00742
Epoch: 33636, Training Loss: 0.00907
Epoch: 33636, Training Loss: 0.00703
Epoch: 33637, Training Loss: 0.00794
Epoch: 33637, Training Loss: 0.00742
Epoch: 33637, Training Loss: 0.00907
Epoch: 33637, Training Loss: 0.00702
Epoch: 33638, Training Loss: 0.00794
Epoch: 33638, Training Loss: 0.00742
Epoch: 33638, Training Loss: 0.00907
Epoch: 33638, Training Loss: 0.00702
Epoch: 33639, Training Loss: 0.00794
Epoch: 33639, Training Loss: 0.00742
Epoch: 33639, Training Loss: 0.00907
Epoch: 33639, Training Loss: 0.00702
Epoch: 33640, Training Loss: 0.00794
Epoch: 33640, Training Loss: 0.00742
Epoch: 33640, Training Loss: 0.00907
Epoch: 33640, Training Loss: 0.00702
Epoch: 33641, Training Loss: 0.00794
Epoch: 33641, Training Loss: 0.00742
Epoch: 33641, Training Loss: 0.00907
Epoch: 33641, Training Loss: 0.00702
Epoch: 33642, Training Loss: 0.00794
Epoch: 33642, Training Loss: 0.00742
Epoch: 33642, Training Loss: 0.00907
Epoch: 33642, Training Loss: 0.00702
Epoch: 33643, Training Loss: 0.00794
Epoch: 33643, Training Loss: 0.00742
Epoch: 33643, Training Loss: 0.00907
Epoch: 33643, Training Loss: 0.00702
Epoch: 33644, Training Loss: 0.00794
Epoch: 33644, Training Loss: 0.00742
Epoch: 33644, Training Loss: 0.00907
Epoch: 33644, Training Loss: 0.00702
Epoch: 33645, Training Loss: 0.00794
Epoch: 33645, Training Loss: 0.00742
Epoch: 33645, Training Loss: 0.00907
Epoch: 33645, Training Loss: 0.00702
Epoch: 33646, Training Loss: 0.00794
Epoch: 33646, Training Loss: 0.00742
Epoch: 33646, Training Loss: 0.00907
Epoch: 33646, Training Loss: 0.00702
Epoch: 33647, Training Loss: 0.00794
Epoch: 33647, Training Loss: 0.00742
Epoch: 33647, Training Loss: 0.00907
Epoch: 33647, Training Loss: 0.00702
Epoch: 33648, Training Loss: 0.00794
Epoch: 33648, Training Loss: 0.00742
Epoch: 33648, Training Loss: 0.00907
Epoch: 33648, Training Loss: 0.00702
Epoch: 33649, Training Loss: 0.00794
Epoch: 33649, Training Loss: 0.00742
Epoch: 33649, Training Loss: 0.00907
Epoch: 33649, Training Loss: 0.00702
Epoch: 33650, Training Loss: 0.00794
Epoch: 33650, Training Loss: 0.00742
Epoch: 33650, Training Loss: 0.00906
Epoch: 33650, Training Loss: 0.00702
Epoch: 33651, Training Loss: 0.00794
Epoch: 33651, Training Loss: 0.00742
Epoch: 33651, Training Loss: 0.00906
Epoch: 33651, Training Loss: 0.00702
Epoch: 33652, Training Loss: 0.00794
Epoch: 33652, Training Loss: 0.00742
Epoch: 33652, Training Loss: 0.00906
Epoch: 33652, Training Loss: 0.00702
Epoch: 33653, Training Loss: 0.00794
Epoch: 33653, Training Loss: 0.00742
Epoch: 33653, Training Loss: 0.00906
Epoch: 33653, Training Loss: 0.00702
Epoch: 33654, Training Loss: 0.00794
Epoch: 33654, Training Loss: 0.00742
Epoch: 33654, Training Loss: 0.00906
Epoch: 33654, Training Loss: 0.00702
Epoch: 33655, Training Loss: 0.00794
Epoch: 33655, Training Loss: 0.00742
Epoch: 33655, Training Loss: 0.00906
Epoch: 33655, Training Loss: 0.00702
Epoch: 33656, Training Loss: 0.00794
Epoch: 33656, Training Loss: 0.00742
Epoch: 33656, Training Loss: 0.00906
Epoch: 33656, Training Loss: 0.00702
Epoch: 33657, Training Loss: 0.00794
Epoch: 33657, Training Loss: 0.00742
Epoch: 33657, Training Loss: 0.00906
Epoch: 33657, Training Loss: 0.00702
Epoch: 33658, Training Loss: 0.00794
Epoch: 33658, Training Loss: 0.00742
Epoch: 33658, Training Loss: 0.00906
Epoch: 33658, Training Loss: 0.00702
Epoch: 33659, Training Loss: 0.00794
Epoch: 33659, Training Loss: 0.00742
Epoch: 33659, Training Loss: 0.00906
Epoch: 33659, Training Loss: 0.00702
Epoch: 33660, Training Loss: 0.00793
Epoch: 33660, Training Loss: 0.00742
Epoch: 33660, Training Loss: 0.00906
Epoch: 33660, Training Loss: 0.00702
Epoch: 33661, Training Loss: 0.00793
Epoch: 33661, Training Loss: 0.00742
Epoch: 33661, Training Loss: 0.00906
Epoch: 33661, Training Loss: 0.00702
Epoch: 33662, Training Loss: 0.00793
Epoch: 33662, Training Loss: 0.00742
Epoch: 33662, Training Loss: 0.00906
Epoch: 33662, Training Loss: 0.00702
Epoch: 33663, Training Loss: 0.00793
Epoch: 33663, Training Loss: 0.00742
Epoch: 33663, Training Loss: 0.00906
Epoch: 33663, Training Loss: 0.00702
Epoch: 33664, Training Loss: 0.00793
Epoch: 33664, Training Loss: 0.00742
Epoch: 33664, Training Loss: 0.00906
Epoch: 33664, Training Loss: 0.00702
Epoch: 33665, Training Loss: 0.00793
Epoch: 33665, Training Loss: 0.00742
Epoch: 33665, Training Loss: 0.00906
Epoch: 33665, Training Loss: 0.00702
Epoch: 33666, Training Loss: 0.00793
Epoch: 33666, Training Loss: 0.00742
Epoch: 33666, Training Loss: 0.00906
Epoch: 33666, Training Loss: 0.00702
Epoch: 33667, Training Loss: 0.00793
Epoch: 33667, Training Loss: 0.00742
Epoch: 33667, Training Loss: 0.00906
Epoch: 33667, Training Loss: 0.00702
Epoch: 33668, Training Loss: 0.00793
Epoch: 33668, Training Loss: 0.00742
Epoch: 33668, Training Loss: 0.00906
Epoch: 33668, Training Loss: 0.00702
Epoch: 33669, Training Loss: 0.00793
Epoch: 33669, Training Loss: 0.00742
Epoch: 33669, Training Loss: 0.00906
Epoch: 33669, Training Loss: 0.00702
Epoch: 33670, Training Loss: 0.00793
Epoch: 33670, Training Loss: 0.00742
Epoch: 33670, Training Loss: 0.00906
Epoch: 33670, Training Loss: 0.00702
Epoch: 33671, Training Loss: 0.00793
Epoch: 33671, Training Loss: 0.00742
Epoch: 33671, Training Loss: 0.00906
Epoch: 33671, Training Loss: 0.00702
Epoch: 33672, Training Loss: 0.00793
Epoch: 33672, Training Loss: 0.00742
Epoch: 33672, Training Loss: 0.00906
Epoch: 33672, Training Loss: 0.00702
Epoch: 33673, Training Loss: 0.00793
Epoch: 33673, Training Loss: 0.00741
Epoch: 33673, Training Loss: 0.00906
Epoch: 33673, Training Loss: 0.00702
Epoch: 33674, Training Loss: 0.00793
Epoch: 33674, Training Loss: 0.00741
Epoch: 33674, Training Loss: 0.00906
Epoch: 33674, Training Loss: 0.00702
Epoch: 33675, Training Loss: 0.00793
Epoch: 33675, Training Loss: 0.00741
Epoch: 33675, Training Loss: 0.00906
Epoch: 33675, Training Loss: 0.00702
Epoch: 33676, Training Loss: 0.00793
Epoch: 33676, Training Loss: 0.00741
Epoch: 33676, Training Loss: 0.00906
Epoch: 33676, Training Loss: 0.00702
Epoch: 33677, Training Loss: 0.00793
Epoch: 33677, Training Loss: 0.00741
Epoch: 33677, Training Loss: 0.00906
Epoch: 33677, Training Loss: 0.00702
Epoch: 33678, Training Loss: 0.00793
Epoch: 33678, Training Loss: 0.00741
Epoch: 33678, Training Loss: 0.00906
Epoch: 33678, Training Loss: 0.00702
Epoch: 33679, Training Loss: 0.00793
Epoch: 33679, Training Loss: 0.00741
Epoch: 33679, Training Loss: 0.00906
Epoch: 33679, Training Loss: 0.00702
Epoch: 33680, Training Loss: 0.00793
Epoch: 33680, Training Loss: 0.00741
Epoch: 33680, Training Loss: 0.00906
Epoch: 33680, Training Loss: 0.00702
Epoch: 33681, Training Loss: 0.00793
Epoch: 33681, Training Loss: 0.00741
Epoch: 33681, Training Loss: 0.00906
Epoch: 33681, Training Loss: 0.00702
Epoch: 33682, Training Loss: 0.00793
Epoch: 33682, Training Loss: 0.00741
Epoch: 33682, Training Loss: 0.00906
Epoch: 33682, Training Loss: 0.00702
Epoch: 33683, Training Loss: 0.00793
Epoch: 33683, Training Loss: 0.00741
Epoch: 33683, Training Loss: 0.00906
Epoch: 33683, Training Loss: 0.00702
Epoch: 33684, Training Loss: 0.00793
Epoch: 33684, Training Loss: 0.00741
Epoch: 33684, Training Loss: 0.00906
Epoch: 33684, Training Loss: 0.00702
Epoch: 33685, Training Loss: 0.00793
Epoch: 33685, Training Loss: 0.00741
Epoch: 33685, Training Loss: 0.00906
Epoch: 33685, Training Loss: 0.00702
Epoch: 33686, Training Loss: 0.00793
Epoch: 33686, Training Loss: 0.00741
Epoch: 33686, Training Loss: 0.00906
Epoch: 33686, Training Loss: 0.00702
Epoch: 33687, Training Loss: 0.00793
Epoch: 33687, Training Loss: 0.00741
Epoch: 33687, Training Loss: 0.00906
Epoch: 33687, Training Loss: 0.00702
Epoch: 33688, Training Loss: 0.00793
Epoch: 33688, Training Loss: 0.00741
Epoch: 33688, Training Loss: 0.00906
Epoch: 33688, Training Loss: 0.00702
Epoch: 33689, Training Loss: 0.00793
Epoch: 33689, Training Loss: 0.00741
Epoch: 33689, Training Loss: 0.00906
Epoch: 33689, Training Loss: 0.00702
Epoch: 33690, Training Loss: 0.00793
Epoch: 33690, Training Loss: 0.00741
Epoch: 33690, Training Loss: 0.00906
Epoch: 33690, Training Loss: 0.00702
Epoch: 33691, Training Loss: 0.00793
Epoch: 33691, Training Loss: 0.00741
Epoch: 33691, Training Loss: 0.00906
Epoch: 33691, Training Loss: 0.00702
Epoch: 33692, Training Loss: 0.00793
Epoch: 33692, Training Loss: 0.00741
Epoch: 33692, Training Loss: 0.00906
Epoch: 33692, Training Loss: 0.00702
Epoch: 33693, Training Loss: 0.00793
Epoch: 33693, Training Loss: 0.00741
Epoch: 33693, Training Loss: 0.00906
Epoch: 33693, Training Loss: 0.00702
Epoch: 33694, Training Loss: 0.00793
Epoch: 33694, Training Loss: 0.00741
Epoch: 33694, Training Loss: 0.00906
Epoch: 33694, Training Loss: 0.00702
Epoch: 33695, Training Loss: 0.00793
Epoch: 33695, Training Loss: 0.00741
Epoch: 33695, Training Loss: 0.00906
Epoch: 33695, Training Loss: 0.00702
Epoch: 33696, Training Loss: 0.00793
Epoch: 33696, Training Loss: 0.00741
Epoch: 33696, Training Loss: 0.00906
Epoch: 33696, Training Loss: 0.00702
Epoch: 33697, Training Loss: 0.00793
Epoch: 33697, Training Loss: 0.00741
Epoch: 33697, Training Loss: 0.00906
Epoch: 33697, Training Loss: 0.00702
Epoch: 33698, Training Loss: 0.00793
Epoch: 33698, Training Loss: 0.00741
Epoch: 33698, Training Loss: 0.00906
Epoch: 33698, Training Loss: 0.00702
Epoch: 33699, Training Loss: 0.00793
Epoch: 33699, Training Loss: 0.00741
Epoch: 33699, Training Loss: 0.00906
Epoch: 33699, Training Loss: 0.00702
Epoch: 33700, Training Loss: 0.00793
Epoch: 33700, Training Loss: 0.00741
Epoch: 33700, Training Loss: 0.00906
Epoch: 33700, Training Loss: 0.00702
Epoch: 33701, Training Loss: 0.00793
Epoch: 33701, Training Loss: 0.00741
Epoch: 33701, Training Loss: 0.00906
Epoch: 33701, Training Loss: 0.00702
Epoch: 33702, Training Loss: 0.00793
Epoch: 33702, Training Loss: 0.00741
Epoch: 33702, Training Loss: 0.00906
Epoch: 33702, Training Loss: 0.00702
Epoch: 33703, Training Loss: 0.00793
Epoch: 33703, Training Loss: 0.00741
Epoch: 33703, Training Loss: 0.00906
Epoch: 33703, Training Loss: 0.00702
Epoch: 33704, Training Loss: 0.00793
Epoch: 33704, Training Loss: 0.00741
Epoch: 33704, Training Loss: 0.00906
Epoch: 33704, Training Loss: 0.00702
Epoch: 33705, Training Loss: 0.00793
Epoch: 33705, Training Loss: 0.00741
Epoch: 33705, Training Loss: 0.00906
Epoch: 33705, Training Loss: 0.00702
Epoch: 33706, Training Loss: 0.00793
Epoch: 33706, Training Loss: 0.00741
Epoch: 33706, Training Loss: 0.00906
Epoch: 33706, Training Loss: 0.00702
Epoch: 33707, Training Loss: 0.00793
Epoch: 33707, Training Loss: 0.00741
Epoch: 33707, Training Loss: 0.00906
Epoch: 33707, Training Loss: 0.00702
Epoch: 33708, Training Loss: 0.00793
Epoch: 33708, Training Loss: 0.00741
Epoch: 33708, Training Loss: 0.00906
Epoch: 33708, Training Loss: 0.00702
Epoch: 33709, Training Loss: 0.00793
Epoch: 33709, Training Loss: 0.00741
Epoch: 33709, Training Loss: 0.00906
Epoch: 33709, Training Loss: 0.00702
Epoch: 33710, Training Loss: 0.00793
Epoch: 33710, Training Loss: 0.00741
Epoch: 33710, Training Loss: 0.00906
Epoch: 33710, Training Loss: 0.00702
Epoch: 33711, Training Loss: 0.00793
Epoch: 33711, Training Loss: 0.00741
Epoch: 33711, Training Loss: 0.00906
Epoch: 33711, Training Loss: 0.00702
Epoch: 33712, Training Loss: 0.00793
Epoch: 33712, Training Loss: 0.00741
Epoch: 33712, Training Loss: 0.00906
Epoch: 33712, Training Loss: 0.00702
Epoch: 33713, Training Loss: 0.00793
Epoch: 33713, Training Loss: 0.00741
Epoch: 33713, Training Loss: 0.00906
Epoch: 33713, Training Loss: 0.00702
Epoch: 33714, Training Loss: 0.00793
Epoch: 33714, Training Loss: 0.00741
Epoch: 33714, Training Loss: 0.00906
Epoch: 33714, Training Loss: 0.00702
Epoch: 33715, Training Loss: 0.00793
Epoch: 33715, Training Loss: 0.00741
Epoch: 33715, Training Loss: 0.00906
Epoch: 33715, Training Loss: 0.00702
Epoch: 33716, Training Loss: 0.00793
Epoch: 33716, Training Loss: 0.00741
Epoch: 33716, Training Loss: 0.00906
Epoch: 33716, Training Loss: 0.00702
Epoch: 33717, Training Loss: 0.00793
Epoch: 33717, Training Loss: 0.00741
Epoch: 33717, Training Loss: 0.00906
Epoch: 33717, Training Loss: 0.00702
Epoch: 33718, Training Loss: 0.00793
Epoch: 33718, Training Loss: 0.00741
Epoch: 33718, Training Loss: 0.00906
Epoch: 33718, Training Loss: 0.00702
Epoch: 33719, Training Loss: 0.00793
Epoch: 33719, Training Loss: 0.00741
Epoch: 33719, Training Loss: 0.00906
Epoch: 33719, Training Loss: 0.00702
Epoch: 33720, Training Loss: 0.00793
Epoch: 33720, Training Loss: 0.00741
Epoch: 33720, Training Loss: 0.00906
Epoch: 33720, Training Loss: 0.00702
Epoch: 33721, Training Loss: 0.00793
Epoch: 33721, Training Loss: 0.00741
Epoch: 33721, Training Loss: 0.00905
Epoch: 33721, Training Loss: 0.00702
Epoch: 33722, Training Loss: 0.00793
Epoch: 33722, Training Loss: 0.00741
Epoch: 33722, Training Loss: 0.00905
Epoch: 33722, Training Loss: 0.00702
Epoch: 33723, Training Loss: 0.00793
Epoch: 33723, Training Loss: 0.00741
Epoch: 33723, Training Loss: 0.00905
Epoch: 33723, Training Loss: 0.00702
Epoch: 33724, Training Loss: 0.00793
Epoch: 33724, Training Loss: 0.00741
Epoch: 33724, Training Loss: 0.00905
Epoch: 33724, Training Loss: 0.00702
Epoch: 33725, Training Loss: 0.00793
Epoch: 33725, Training Loss: 0.00741
Epoch: 33725, Training Loss: 0.00905
Epoch: 33725, Training Loss: 0.00702
Epoch: 33726, Training Loss: 0.00793
Epoch: 33726, Training Loss: 0.00741
Epoch: 33726, Training Loss: 0.00905
Epoch: 33726, Training Loss: 0.00702
Epoch: 33727, Training Loss: 0.00793
Epoch: 33727, Training Loss: 0.00741
Epoch: 33727, Training Loss: 0.00905
Epoch: 33727, Training Loss: 0.00702
Epoch: 33728, Training Loss: 0.00793
Epoch: 33728, Training Loss: 0.00741
Epoch: 33728, Training Loss: 0.00905
Epoch: 33728, Training Loss: 0.00702
Epoch: 33729, Training Loss: 0.00793
Epoch: 33729, Training Loss: 0.00741
Epoch: 33729, Training Loss: 0.00905
Epoch: 33729, Training Loss: 0.00701
Epoch: 33730, Training Loss: 0.00793
Epoch: 33730, Training Loss: 0.00741
Epoch: 33730, Training Loss: 0.00905
Epoch: 33730, Training Loss: 0.00701
Epoch: 33731, Training Loss: 0.00793
Epoch: 33731, Training Loss: 0.00741
Epoch: 33731, Training Loss: 0.00905
Epoch: 33731, Training Loss: 0.00701
Epoch: 33732, Training Loss: 0.00793
Epoch: 33732, Training Loss: 0.00741
Epoch: 33732, Training Loss: 0.00905
Epoch: 33732, Training Loss: 0.00701
Epoch: 33733, Training Loss: 0.00793
Epoch: 33733, Training Loss: 0.00741
Epoch: 33733, Training Loss: 0.00905
Epoch: 33733, Training Loss: 0.00701
Epoch: 33734, Training Loss: 0.00793
Epoch: 33734, Training Loss: 0.00741
Epoch: 33734, Training Loss: 0.00905
Epoch: 33734, Training Loss: 0.00701
Epoch: 33735, Training Loss: 0.00793
Epoch: 33735, Training Loss: 0.00741
Epoch: 33735, Training Loss: 0.00905
Epoch: 33735, Training Loss: 0.00701
Epoch: 33736, Training Loss: 0.00793
Epoch: 33736, Training Loss: 0.00741
Epoch: 33736, Training Loss: 0.00905
Epoch: 33736, Training Loss: 0.00701
Epoch: 33737, Training Loss: 0.00793
Epoch: 33737, Training Loss: 0.00741
Epoch: 33737, Training Loss: 0.00905
Epoch: 33737, Training Loss: 0.00701
Epoch: 33738, Training Loss: 0.00793
Epoch: 33738, Training Loss: 0.00741
Epoch: 33738, Training Loss: 0.00905
Epoch: 33738, Training Loss: 0.00701
Epoch: 33739, Training Loss: 0.00793
Epoch: 33739, Training Loss: 0.00741
Epoch: 33739, Training Loss: 0.00905
Epoch: 33739, Training Loss: 0.00701
Epoch: 33740, Training Loss: 0.00792
Epoch: 33740, Training Loss: 0.00741
Epoch: 33740, Training Loss: 0.00905
Epoch: 33740, Training Loss: 0.00701
Epoch: 33741, Training Loss: 0.00792
Epoch: 33741, Training Loss: 0.00741
Epoch: 33741, Training Loss: 0.00905
Epoch: 33741, Training Loss: 0.00701
Epoch: 33742, Training Loss: 0.00792
Epoch: 33742, Training Loss: 0.00741
Epoch: 33742, Training Loss: 0.00905
Epoch: 33742, Training Loss: 0.00701
Epoch: 33743, Training Loss: 0.00792
Epoch: 33743, Training Loss: 0.00741
Epoch: 33743, Training Loss: 0.00905
Epoch: 33743, Training Loss: 0.00701
Epoch: 33744, Training Loss: 0.00792
Epoch: 33744, Training Loss: 0.00741
Epoch: 33744, Training Loss: 0.00905
Epoch: 33744, Training Loss: 0.00701
Epoch: 33745, Training Loss: 0.00792
Epoch: 33745, Training Loss: 0.00741
Epoch: 33745, Training Loss: 0.00905
Epoch: 33745, Training Loss: 0.00701
Epoch: 33746, Training Loss: 0.00792
Epoch: 33746, Training Loss: 0.00741
Epoch: 33746, Training Loss: 0.00905
Epoch: 33746, Training Loss: 0.00701
Epoch: 33747, Training Loss: 0.00792
Epoch: 33747, Training Loss: 0.00741
Epoch: 33747, Training Loss: 0.00905
Epoch: 33747, Training Loss: 0.00701
Epoch: 33748, Training Loss: 0.00792
Epoch: 33748, Training Loss: 0.00741
Epoch: 33748, Training Loss: 0.00905
Epoch: 33748, Training Loss: 0.00701
Epoch: 33749, Training Loss: 0.00792
Epoch: 33749, Training Loss: 0.00741
Epoch: 33749, Training Loss: 0.00905
Epoch: 33749, Training Loss: 0.00701
Epoch: 33750, Training Loss: 0.00792
Epoch: 33750, Training Loss: 0.00741
Epoch: 33750, Training Loss: 0.00905
Epoch: 33750, Training Loss: 0.00701
Epoch: 33751, Training Loss: 0.00792
Epoch: 33751, Training Loss: 0.00741
Epoch: 33751, Training Loss: 0.00905
Epoch: 33751, Training Loss: 0.00701
Epoch: 33752, Training Loss: 0.00792
Epoch: 33752, Training Loss: 0.00741
Epoch: 33752, Training Loss: 0.00905
Epoch: 33752, Training Loss: 0.00701
Epoch: 33753, Training Loss: 0.00792
Epoch: 33753, Training Loss: 0.00741
Epoch: 33753, Training Loss: 0.00905
Epoch: 33753, Training Loss: 0.00701
Epoch: 33754, Training Loss: 0.00792
Epoch: 33754, Training Loss: 0.00741
Epoch: 33754, Training Loss: 0.00905
Epoch: 33754, Training Loss: 0.00701
Epoch: 33755, Training Loss: 0.00792
Epoch: 33755, Training Loss: 0.00741
Epoch: 33755, Training Loss: 0.00905
Epoch: 33755, Training Loss: 0.00701
Epoch: 33756, Training Loss: 0.00792
Epoch: 33756, Training Loss: 0.00741
Epoch: 33756, Training Loss: 0.00905
Epoch: 33756, Training Loss: 0.00701
Epoch: 33757, Training Loss: 0.00792
Epoch: 33757, Training Loss: 0.00741
Epoch: 33757, Training Loss: 0.00905
Epoch: 33757, Training Loss: 0.00701
Epoch: 33758, Training Loss: 0.00792
Epoch: 33758, Training Loss: 0.00741
Epoch: 33758, Training Loss: 0.00905
Epoch: 33758, Training Loss: 0.00701
Epoch: 33759, Training Loss: 0.00792
Epoch: 33759, Training Loss: 0.00741
Epoch: 33759, Training Loss: 0.00905
Epoch: 33759, Training Loss: 0.00701
Epoch: 33760, Training Loss: 0.00792
Epoch: 33760, Training Loss: 0.00740
Epoch: 33760, Training Loss: 0.00905
Epoch: 33760, Training Loss: 0.00701
Epoch: 33761, Training Loss: 0.00792
Epoch: 33761, Training Loss: 0.00740
Epoch: 33761, Training Loss: 0.00905
Epoch: 33761, Training Loss: 0.00701
Epoch: 33762, Training Loss: 0.00792
Epoch: 33762, Training Loss: 0.00740
Epoch: 33762, Training Loss: 0.00905
Epoch: 33762, Training Loss: 0.00701
Epoch: 33763, Training Loss: 0.00792
Epoch: 33763, Training Loss: 0.00740
Epoch: 33763, Training Loss: 0.00905
Epoch: 33763, Training Loss: 0.00701
Epoch: 33764, Training Loss: 0.00792
Epoch: 33764, Training Loss: 0.00740
Epoch: 33764, Training Loss: 0.00905
Epoch: 33764, Training Loss: 0.00701
Epoch: 33765, Training Loss: 0.00792
Epoch: 33765, Training Loss: 0.00740
Epoch: 33765, Training Loss: 0.00905
Epoch: 33765, Training Loss: 0.00701
Epoch: 33766, Training Loss: 0.00792
Epoch: 33766, Training Loss: 0.00740
Epoch: 33766, Training Loss: 0.00905
Epoch: 33766, Training Loss: 0.00701
Epoch: 33767, Training Loss: 0.00792
Epoch: 33767, Training Loss: 0.00740
Epoch: 33767, Training Loss: 0.00905
Epoch: 33767, Training Loss: 0.00701
Epoch: 33768, Training Loss: 0.00792
Epoch: 33768, Training Loss: 0.00740
Epoch: 33768, Training Loss: 0.00905
Epoch: 33768, Training Loss: 0.00701
Epoch: 33769, Training Loss: 0.00792
Epoch: 33769, Training Loss: 0.00740
Epoch: 33769, Training Loss: 0.00905
Epoch: 33769, Training Loss: 0.00701
Epoch: 33770, Training Loss: 0.00792
Epoch: 33770, Training Loss: 0.00740
Epoch: 33770, Training Loss: 0.00905
Epoch: 33770, Training Loss: 0.00701
Epoch: 33771, Training Loss: 0.00792
Epoch: 33771, Training Loss: 0.00740
Epoch: 33771, Training Loss: 0.00905
Epoch: 33771, Training Loss: 0.00701
Epoch: 33772, Training Loss: 0.00792
Epoch: 33772, Training Loss: 0.00740
Epoch: 33772, Training Loss: 0.00905
Epoch: 33772, Training Loss: 0.00701
Epoch: 33773, Training Loss: 0.00792
Epoch: 33773, Training Loss: 0.00740
Epoch: 33773, Training Loss: 0.00905
Epoch: 33773, Training Loss: 0.00701
Epoch: 33774, Training Loss: 0.00792
Epoch: 33774, Training Loss: 0.00740
Epoch: 33774, Training Loss: 0.00905
Epoch: 33774, Training Loss: 0.00701
Epoch: 33775, Training Loss: 0.00792
Epoch: 33775, Training Loss: 0.00740
Epoch: 33775, Training Loss: 0.00905
Epoch: 33775, Training Loss: 0.00701
Epoch: 33776, Training Loss: 0.00792
Epoch: 33776, Training Loss: 0.00740
Epoch: 33776, Training Loss: 0.00905
Epoch: 33776, Training Loss: 0.00701
... [repetitive output elided: four per-sample losses are printed each epoch and decrease only slightly (e.g. 0.00905 → 0.00901, 0.00740 → 0.00738) over epochs 33776–34018] ...
Epoch: 34019, Training Loss: 0.00789
Epoch: 34019, Training Loss: 0.00738
Epoch: 34019, Training Loss: 0.00901
Epoch: 34019, Training Loss: 0.00698
Epoch: 34020, Training Loss: 0.00789
Epoch: 34020, Training Loss: 0.00738
Epoch: 34020, Training Loss: 0.00901
Epoch: 34020, Training Loss: 0.00698
Epoch: 34021, Training Loss: 0.00789
Epoch: 34021, Training Loss: 0.00738
Epoch: 34021, Training Loss: 0.00901
Epoch: 34021, Training Loss: 0.00698
Epoch: 34022, Training Loss: 0.00789
Epoch: 34022, Training Loss: 0.00738
Epoch: 34022, Training Loss: 0.00901
Epoch: 34022, Training Loss: 0.00698
Epoch: 34023, Training Loss: 0.00789
Epoch: 34023, Training Loss: 0.00738
Epoch: 34023, Training Loss: 0.00901
Epoch: 34023, Training Loss: 0.00698
Epoch: 34024, Training Loss: 0.00789
Epoch: 34024, Training Loss: 0.00737
Epoch: 34024, Training Loss: 0.00901
Epoch: 34024, Training Loss: 0.00698
Epoch: 34025, Training Loss: 0.00789
Epoch: 34025, Training Loss: 0.00737
Epoch: 34025, Training Loss: 0.00901
Epoch: 34025, Training Loss: 0.00698
Epoch: 34026, Training Loss: 0.00789
Epoch: 34026, Training Loss: 0.00737
Epoch: 34026, Training Loss: 0.00901
Epoch: 34026, Training Loss: 0.00698
Epoch: 34027, Training Loss: 0.00789
Epoch: 34027, Training Loss: 0.00737
Epoch: 34027, Training Loss: 0.00901
Epoch: 34027, Training Loss: 0.00698
Epoch: 34028, Training Loss: 0.00789
Epoch: 34028, Training Loss: 0.00737
Epoch: 34028, Training Loss: 0.00901
Epoch: 34028, Training Loss: 0.00698
Epoch: 34029, Training Loss: 0.00789
Epoch: 34029, Training Loss: 0.00737
Epoch: 34029, Training Loss: 0.00901
Epoch: 34029, Training Loss: 0.00698
Epoch: 34030, Training Loss: 0.00789
Epoch: 34030, Training Loss: 0.00737
Epoch: 34030, Training Loss: 0.00901
Epoch: 34030, Training Loss: 0.00698
Epoch: 34031, Training Loss: 0.00789
Epoch: 34031, Training Loss: 0.00737
Epoch: 34031, Training Loss: 0.00901
Epoch: 34031, Training Loss: 0.00698
Epoch: 34032, Training Loss: 0.00789
Epoch: 34032, Training Loss: 0.00737
Epoch: 34032, Training Loss: 0.00901
Epoch: 34032, Training Loss: 0.00698
Epoch: 34033, Training Loss: 0.00789
Epoch: 34033, Training Loss: 0.00737
Epoch: 34033, Training Loss: 0.00901
Epoch: 34033, Training Loss: 0.00698
Epoch: 34034, Training Loss: 0.00789
Epoch: 34034, Training Loss: 0.00737
Epoch: 34034, Training Loss: 0.00901
Epoch: 34034, Training Loss: 0.00698
Epoch: 34035, Training Loss: 0.00789
Epoch: 34035, Training Loss: 0.00737
Epoch: 34035, Training Loss: 0.00901
Epoch: 34035, Training Loss: 0.00698
Epoch: 34036, Training Loss: 0.00789
Epoch: 34036, Training Loss: 0.00737
Epoch: 34036, Training Loss: 0.00901
Epoch: 34036, Training Loss: 0.00698
Epoch: 34037, Training Loss: 0.00789
Epoch: 34037, Training Loss: 0.00737
Epoch: 34037, Training Loss: 0.00901
Epoch: 34037, Training Loss: 0.00698
Epoch: 34038, Training Loss: 0.00789
Epoch: 34038, Training Loss: 0.00737
Epoch: 34038, Training Loss: 0.00901
Epoch: 34038, Training Loss: 0.00698
Epoch: 34039, Training Loss: 0.00789
Epoch: 34039, Training Loss: 0.00737
Epoch: 34039, Training Loss: 0.00901
Epoch: 34039, Training Loss: 0.00698
Epoch: 34040, Training Loss: 0.00789
Epoch: 34040, Training Loss: 0.00737
Epoch: 34040, Training Loss: 0.00901
Epoch: 34040, Training Loss: 0.00698
Epoch: 34041, Training Loss: 0.00789
Epoch: 34041, Training Loss: 0.00737
Epoch: 34041, Training Loss: 0.00901
Epoch: 34041, Training Loss: 0.00698
Epoch: 34042, Training Loss: 0.00789
Epoch: 34042, Training Loss: 0.00737
Epoch: 34042, Training Loss: 0.00901
Epoch: 34042, Training Loss: 0.00698
Epoch: 34043, Training Loss: 0.00789
Epoch: 34043, Training Loss: 0.00737
Epoch: 34043, Training Loss: 0.00901
Epoch: 34043, Training Loss: 0.00698
Epoch: 34044, Training Loss: 0.00789
Epoch: 34044, Training Loss: 0.00737
Epoch: 34044, Training Loss: 0.00901
Epoch: 34044, Training Loss: 0.00698
Epoch: 34045, Training Loss: 0.00789
Epoch: 34045, Training Loss: 0.00737
Epoch: 34045, Training Loss: 0.00901
Epoch: 34045, Training Loss: 0.00698
Epoch: 34046, Training Loss: 0.00789
Epoch: 34046, Training Loss: 0.00737
Epoch: 34046, Training Loss: 0.00901
Epoch: 34046, Training Loss: 0.00698
Epoch: 34047, Training Loss: 0.00789
Epoch: 34047, Training Loss: 0.00737
Epoch: 34047, Training Loss: 0.00901
Epoch: 34047, Training Loss: 0.00698
Epoch: 34048, Training Loss: 0.00789
Epoch: 34048, Training Loss: 0.00737
Epoch: 34048, Training Loss: 0.00901
Epoch: 34048, Training Loss: 0.00698
Epoch: 34049, Training Loss: 0.00789
Epoch: 34049, Training Loss: 0.00737
Epoch: 34049, Training Loss: 0.00901
Epoch: 34049, Training Loss: 0.00698
Epoch: 34050, Training Loss: 0.00789
Epoch: 34050, Training Loss: 0.00737
Epoch: 34050, Training Loss: 0.00901
Epoch: 34050, Training Loss: 0.00698
Epoch: 34051, Training Loss: 0.00789
Epoch: 34051, Training Loss: 0.00737
Epoch: 34051, Training Loss: 0.00901
Epoch: 34051, Training Loss: 0.00698
Epoch: 34052, Training Loss: 0.00789
Epoch: 34052, Training Loss: 0.00737
Epoch: 34052, Training Loss: 0.00901
Epoch: 34052, Training Loss: 0.00698
Epoch: 34053, Training Loss: 0.00789
Epoch: 34053, Training Loss: 0.00737
Epoch: 34053, Training Loss: 0.00901
Epoch: 34053, Training Loss: 0.00698
Epoch: 34054, Training Loss: 0.00789
Epoch: 34054, Training Loss: 0.00737
Epoch: 34054, Training Loss: 0.00901
Epoch: 34054, Training Loss: 0.00698
Epoch: 34055, Training Loss: 0.00789
Epoch: 34055, Training Loss: 0.00737
Epoch: 34055, Training Loss: 0.00901
Epoch: 34055, Training Loss: 0.00698
Epoch: 34056, Training Loss: 0.00789
Epoch: 34056, Training Loss: 0.00737
Epoch: 34056, Training Loss: 0.00901
Epoch: 34056, Training Loss: 0.00698
Epoch: 34057, Training Loss: 0.00789
Epoch: 34057, Training Loss: 0.00737
Epoch: 34057, Training Loss: 0.00901
Epoch: 34057, Training Loss: 0.00698
Epoch: 34058, Training Loss: 0.00789
Epoch: 34058, Training Loss: 0.00737
Epoch: 34058, Training Loss: 0.00901
Epoch: 34058, Training Loss: 0.00698
Epoch: 34059, Training Loss: 0.00789
Epoch: 34059, Training Loss: 0.00737
Epoch: 34059, Training Loss: 0.00901
Epoch: 34059, Training Loss: 0.00698
Epoch: 34060, Training Loss: 0.00789
Epoch: 34060, Training Loss: 0.00737
Epoch: 34060, Training Loss: 0.00901
Epoch: 34060, Training Loss: 0.00698
Epoch: 34061, Training Loss: 0.00789
Epoch: 34061, Training Loss: 0.00737
Epoch: 34061, Training Loss: 0.00901
Epoch: 34061, Training Loss: 0.00698
Epoch: 34062, Training Loss: 0.00789
Epoch: 34062, Training Loss: 0.00737
Epoch: 34062, Training Loss: 0.00901
Epoch: 34062, Training Loss: 0.00698
Epoch: 34063, Training Loss: 0.00789
Epoch: 34063, Training Loss: 0.00737
Epoch: 34063, Training Loss: 0.00901
Epoch: 34063, Training Loss: 0.00698
Epoch: 34064, Training Loss: 0.00788
Epoch: 34064, Training Loss: 0.00737
Epoch: 34064, Training Loss: 0.00901
Epoch: 34064, Training Loss: 0.00698
Epoch: 34065, Training Loss: 0.00788
Epoch: 34065, Training Loss: 0.00737
Epoch: 34065, Training Loss: 0.00901
Epoch: 34065, Training Loss: 0.00698
Epoch: 34066, Training Loss: 0.00788
Epoch: 34066, Training Loss: 0.00737
Epoch: 34066, Training Loss: 0.00901
Epoch: 34066, Training Loss: 0.00698
Epoch: 34067, Training Loss: 0.00788
Epoch: 34067, Training Loss: 0.00737
Epoch: 34067, Training Loss: 0.00901
Epoch: 34067, Training Loss: 0.00698
Epoch: 34068, Training Loss: 0.00788
Epoch: 34068, Training Loss: 0.00737
Epoch: 34068, Training Loss: 0.00901
Epoch: 34068, Training Loss: 0.00698
Epoch: 34069, Training Loss: 0.00788
Epoch: 34069, Training Loss: 0.00737
Epoch: 34069, Training Loss: 0.00901
Epoch: 34069, Training Loss: 0.00698
Epoch: 34070, Training Loss: 0.00788
Epoch: 34070, Training Loss: 0.00737
Epoch: 34070, Training Loss: 0.00901
Epoch: 34070, Training Loss: 0.00698
Epoch: 34071, Training Loss: 0.00788
Epoch: 34071, Training Loss: 0.00737
Epoch: 34071, Training Loss: 0.00901
Epoch: 34071, Training Loss: 0.00698
Epoch: 34072, Training Loss: 0.00788
Epoch: 34072, Training Loss: 0.00737
Epoch: 34072, Training Loss: 0.00901
Epoch: 34072, Training Loss: 0.00698
Epoch: 34073, Training Loss: 0.00788
Epoch: 34073, Training Loss: 0.00737
Epoch: 34073, Training Loss: 0.00901
Epoch: 34073, Training Loss: 0.00698
Epoch: 34074, Training Loss: 0.00788
Epoch: 34074, Training Loss: 0.00737
Epoch: 34074, Training Loss: 0.00901
Epoch: 34074, Training Loss: 0.00698
Epoch: 34075, Training Loss: 0.00788
Epoch: 34075, Training Loss: 0.00737
Epoch: 34075, Training Loss: 0.00901
Epoch: 34075, Training Loss: 0.00698
Epoch: 34076, Training Loss: 0.00788
Epoch: 34076, Training Loss: 0.00737
Epoch: 34076, Training Loss: 0.00900
Epoch: 34076, Training Loss: 0.00698
Epoch: 34077, Training Loss: 0.00788
Epoch: 34077, Training Loss: 0.00737
Epoch: 34077, Training Loss: 0.00900
Epoch: 34077, Training Loss: 0.00698
Epoch: 34078, Training Loss: 0.00788
Epoch: 34078, Training Loss: 0.00737
Epoch: 34078, Training Loss: 0.00900
Epoch: 34078, Training Loss: 0.00698
Epoch: 34079, Training Loss: 0.00788
Epoch: 34079, Training Loss: 0.00737
Epoch: 34079, Training Loss: 0.00900
Epoch: 34079, Training Loss: 0.00698
Epoch: 34080, Training Loss: 0.00788
Epoch: 34080, Training Loss: 0.00737
Epoch: 34080, Training Loss: 0.00900
Epoch: 34080, Training Loss: 0.00698
Epoch: 34081, Training Loss: 0.00788
Epoch: 34081, Training Loss: 0.00737
Epoch: 34081, Training Loss: 0.00900
Epoch: 34081, Training Loss: 0.00698
Epoch: 34082, Training Loss: 0.00788
Epoch: 34082, Training Loss: 0.00737
Epoch: 34082, Training Loss: 0.00900
Epoch: 34082, Training Loss: 0.00698
Epoch: 34083, Training Loss: 0.00788
Epoch: 34083, Training Loss: 0.00737
Epoch: 34083, Training Loss: 0.00900
Epoch: 34083, Training Loss: 0.00698
Epoch: 34084, Training Loss: 0.00788
Epoch: 34084, Training Loss: 0.00737
Epoch: 34084, Training Loss: 0.00900
Epoch: 34084, Training Loss: 0.00698
Epoch: 34085, Training Loss: 0.00788
Epoch: 34085, Training Loss: 0.00737
Epoch: 34085, Training Loss: 0.00900
Epoch: 34085, Training Loss: 0.00698
Epoch: 34086, Training Loss: 0.00788
Epoch: 34086, Training Loss: 0.00737
Epoch: 34086, Training Loss: 0.00900
Epoch: 34086, Training Loss: 0.00698
Epoch: 34087, Training Loss: 0.00788
Epoch: 34087, Training Loss: 0.00737
Epoch: 34087, Training Loss: 0.00900
Epoch: 34087, Training Loss: 0.00698
Epoch: 34088, Training Loss: 0.00788
Epoch: 34088, Training Loss: 0.00737
Epoch: 34088, Training Loss: 0.00900
Epoch: 34088, Training Loss: 0.00698
Epoch: 34089, Training Loss: 0.00788
Epoch: 34089, Training Loss: 0.00737
Epoch: 34089, Training Loss: 0.00900
Epoch: 34089, Training Loss: 0.00698
Epoch: 34090, Training Loss: 0.00788
Epoch: 34090, Training Loss: 0.00737
Epoch: 34090, Training Loss: 0.00900
Epoch: 34090, Training Loss: 0.00698
Epoch: 34091, Training Loss: 0.00788
Epoch: 34091, Training Loss: 0.00737
Epoch: 34091, Training Loss: 0.00900
Epoch: 34091, Training Loss: 0.00698
Epoch: 34092, Training Loss: 0.00788
Epoch: 34092, Training Loss: 0.00737
Epoch: 34092, Training Loss: 0.00900
Epoch: 34092, Training Loss: 0.00698
Epoch: 34093, Training Loss: 0.00788
Epoch: 34093, Training Loss: 0.00737
Epoch: 34093, Training Loss: 0.00900
Epoch: 34093, Training Loss: 0.00698
Epoch: 34094, Training Loss: 0.00788
Epoch: 34094, Training Loss: 0.00737
Epoch: 34094, Training Loss: 0.00900
Epoch: 34094, Training Loss: 0.00698
Epoch: 34095, Training Loss: 0.00788
Epoch: 34095, Training Loss: 0.00737
Epoch: 34095, Training Loss: 0.00900
Epoch: 34095, Training Loss: 0.00698
Epoch: 34096, Training Loss: 0.00788
Epoch: 34096, Training Loss: 0.00737
Epoch: 34096, Training Loss: 0.00900
Epoch: 34096, Training Loss: 0.00698
Epoch: 34097, Training Loss: 0.00788
Epoch: 34097, Training Loss: 0.00737
Epoch: 34097, Training Loss: 0.00900
Epoch: 34097, Training Loss: 0.00698
Epoch: 34098, Training Loss: 0.00788
Epoch: 34098, Training Loss: 0.00737
Epoch: 34098, Training Loss: 0.00900
Epoch: 34098, Training Loss: 0.00698
Epoch: 34099, Training Loss: 0.00788
Epoch: 34099, Training Loss: 0.00737
Epoch: 34099, Training Loss: 0.00900
Epoch: 34099, Training Loss: 0.00698
Epoch: 34100, Training Loss: 0.00788
Epoch: 34100, Training Loss: 0.00737
Epoch: 34100, Training Loss: 0.00900
Epoch: 34100, Training Loss: 0.00697
Epoch: 34101, Training Loss: 0.00788
Epoch: 34101, Training Loss: 0.00737
Epoch: 34101, Training Loss: 0.00900
Epoch: 34101, Training Loss: 0.00697
Epoch: 34102, Training Loss: 0.00788
Epoch: 34102, Training Loss: 0.00737
Epoch: 34102, Training Loss: 0.00900
Epoch: 34102, Training Loss: 0.00697
Epoch: 34103, Training Loss: 0.00788
Epoch: 34103, Training Loss: 0.00737
Epoch: 34103, Training Loss: 0.00900
Epoch: 34103, Training Loss: 0.00697
Epoch: 34104, Training Loss: 0.00788
Epoch: 34104, Training Loss: 0.00737
Epoch: 34104, Training Loss: 0.00900
Epoch: 34104, Training Loss: 0.00697
Epoch: 34105, Training Loss: 0.00788
Epoch: 34105, Training Loss: 0.00737
Epoch: 34105, Training Loss: 0.00900
Epoch: 34105, Training Loss: 0.00697
Epoch: 34106, Training Loss: 0.00788
Epoch: 34106, Training Loss: 0.00737
Epoch: 34106, Training Loss: 0.00900
Epoch: 34106, Training Loss: 0.00697
Epoch: 34107, Training Loss: 0.00788
Epoch: 34107, Training Loss: 0.00737
Epoch: 34107, Training Loss: 0.00900
Epoch: 34107, Training Loss: 0.00697
Epoch: 34108, Training Loss: 0.00788
Epoch: 34108, Training Loss: 0.00737
Epoch: 34108, Training Loss: 0.00900
Epoch: 34108, Training Loss: 0.00697
Epoch: 34109, Training Loss: 0.00788
Epoch: 34109, Training Loss: 0.00737
Epoch: 34109, Training Loss: 0.00900
Epoch: 34109, Training Loss: 0.00697
Epoch: 34110, Training Loss: 0.00788
Epoch: 34110, Training Loss: 0.00737
Epoch: 34110, Training Loss: 0.00900
Epoch: 34110, Training Loss: 0.00697
Epoch: 34111, Training Loss: 0.00788
Epoch: 34111, Training Loss: 0.00737
Epoch: 34111, Training Loss: 0.00900
Epoch: 34111, Training Loss: 0.00697
Epoch: 34112, Training Loss: 0.00788
Epoch: 34112, Training Loss: 0.00737
Epoch: 34112, Training Loss: 0.00900
Epoch: 34112, Training Loss: 0.00697
Epoch: 34113, Training Loss: 0.00788
Epoch: 34113, Training Loss: 0.00736
Epoch: 34113, Training Loss: 0.00900
Epoch: 34113, Training Loss: 0.00697
Epoch: 34114, Training Loss: 0.00788
Epoch: 34114, Training Loss: 0.00736
Epoch: 34114, Training Loss: 0.00900
Epoch: 34114, Training Loss: 0.00697
Epoch: 34115, Training Loss: 0.00788
Epoch: 34115, Training Loss: 0.00736
Epoch: 34115, Training Loss: 0.00900
Epoch: 34115, Training Loss: 0.00697
Epoch: 34116, Training Loss: 0.00788
Epoch: 34116, Training Loss: 0.00736
Epoch: 34116, Training Loss: 0.00900
Epoch: 34116, Training Loss: 0.00697
Epoch: 34117, Training Loss: 0.00788
Epoch: 34117, Training Loss: 0.00736
Epoch: 34117, Training Loss: 0.00900
Epoch: 34117, Training Loss: 0.00697
Epoch: 34118, Training Loss: 0.00788
Epoch: 34118, Training Loss: 0.00736
Epoch: 34118, Training Loss: 0.00900
Epoch: 34118, Training Loss: 0.00697
Epoch: 34119, Training Loss: 0.00788
Epoch: 34119, Training Loss: 0.00736
Epoch: 34119, Training Loss: 0.00900
Epoch: 34119, Training Loss: 0.00697
Epoch: 34120, Training Loss: 0.00788
Epoch: 34120, Training Loss: 0.00736
Epoch: 34120, Training Loss: 0.00900
Epoch: 34120, Training Loss: 0.00697
Epoch: 34121, Training Loss: 0.00788
Epoch: 34121, Training Loss: 0.00736
Epoch: 34121, Training Loss: 0.00900
Epoch: 34121, Training Loss: 0.00697
Epoch: 34122, Training Loss: 0.00788
Epoch: 34122, Training Loss: 0.00736
Epoch: 34122, Training Loss: 0.00900
Epoch: 34122, Training Loss: 0.00697
Epoch: 34123, Training Loss: 0.00788
Epoch: 34123, Training Loss: 0.00736
Epoch: 34123, Training Loss: 0.00900
Epoch: 34123, Training Loss: 0.00697
Epoch: 34124, Training Loss: 0.00788
Epoch: 34124, Training Loss: 0.00736
Epoch: 34124, Training Loss: 0.00900
Epoch: 34124, Training Loss: 0.00697
Epoch: 34125, Training Loss: 0.00788
Epoch: 34125, Training Loss: 0.00736
Epoch: 34125, Training Loss: 0.00900
Epoch: 34125, Training Loss: 0.00697
Epoch: 34126, Training Loss: 0.00788
Epoch: 34126, Training Loss: 0.00736
Epoch: 34126, Training Loss: 0.00900
Epoch: 34126, Training Loss: 0.00697
Epoch: 34127, Training Loss: 0.00788
Epoch: 34127, Training Loss: 0.00736
Epoch: 34127, Training Loss: 0.00900
Epoch: 34127, Training Loss: 0.00697
Epoch: 34128, Training Loss: 0.00788
Epoch: 34128, Training Loss: 0.00736
Epoch: 34128, Training Loss: 0.00900
Epoch: 34128, Training Loss: 0.00697
Epoch: 34129, Training Loss: 0.00788
Epoch: 34129, Training Loss: 0.00736
Epoch: 34129, Training Loss: 0.00900
Epoch: 34129, Training Loss: 0.00697
Epoch: 34130, Training Loss: 0.00788
Epoch: 34130, Training Loss: 0.00736
Epoch: 34130, Training Loss: 0.00900
Epoch: 34130, Training Loss: 0.00697
Epoch: 34131, Training Loss: 0.00788
Epoch: 34131, Training Loss: 0.00736
Epoch: 34131, Training Loss: 0.00900
Epoch: 34131, Training Loss: 0.00697
Epoch: 34132, Training Loss: 0.00788
Epoch: 34132, Training Loss: 0.00736
Epoch: 34132, Training Loss: 0.00900
Epoch: 34132, Training Loss: 0.00697
Epoch: 34133, Training Loss: 0.00788
Epoch: 34133, Training Loss: 0.00736
Epoch: 34133, Training Loss: 0.00900
Epoch: 34133, Training Loss: 0.00697
Epoch: 34134, Training Loss: 0.00788
Epoch: 34134, Training Loss: 0.00736
Epoch: 34134, Training Loss: 0.00900
Epoch: 34134, Training Loss: 0.00697
Epoch: 34135, Training Loss: 0.00788
Epoch: 34135, Training Loss: 0.00736
Epoch: 34135, Training Loss: 0.00900
Epoch: 34135, Training Loss: 0.00697
Epoch: 34136, Training Loss: 0.00788
Epoch: 34136, Training Loss: 0.00736
Epoch: 34136, Training Loss: 0.00900
Epoch: 34136, Training Loss: 0.00697
Epoch: 34137, Training Loss: 0.00788
Epoch: 34137, Training Loss: 0.00736
Epoch: 34137, Training Loss: 0.00900
Epoch: 34137, Training Loss: 0.00697
Epoch: 34138, Training Loss: 0.00788
Epoch: 34138, Training Loss: 0.00736
Epoch: 34138, Training Loss: 0.00900
Epoch: 34138, Training Loss: 0.00697
Epoch: 34139, Training Loss: 0.00788
Epoch: 34139, Training Loss: 0.00736
Epoch: 34139, Training Loss: 0.00900
Epoch: 34139, Training Loss: 0.00697
Epoch: 34140, Training Loss: 0.00788
Epoch: 34140, Training Loss: 0.00736
Epoch: 34140, Training Loss: 0.00900
Epoch: 34140, Training Loss: 0.00697
Epoch: 34141, Training Loss: 0.00788
Epoch: 34141, Training Loss: 0.00736
Epoch: 34141, Training Loss: 0.00900
Epoch: 34141, Training Loss: 0.00697
Epoch: 34142, Training Loss: 0.00788
Epoch: 34142, Training Loss: 0.00736
Epoch: 34142, Training Loss: 0.00900
Epoch: 34142, Training Loss: 0.00697
Epoch: 34143, Training Loss: 0.00788
Epoch: 34143, Training Loss: 0.00736
Epoch: 34143, Training Loss: 0.00900
Epoch: 34143, Training Loss: 0.00697
Epoch: 34144, Training Loss: 0.00788
Epoch: 34144, Training Loss: 0.00736
Epoch: 34144, Training Loss: 0.00900
Epoch: 34144, Training Loss: 0.00697
Epoch: 34145, Training Loss: 0.00788
Epoch: 34145, Training Loss: 0.00736
Epoch: 34145, Training Loss: 0.00900
Epoch: 34145, Training Loss: 0.00697
Epoch: 34146, Training Loss: 0.00787
Epoch: 34146, Training Loss: 0.00736
Epoch: 34146, Training Loss: 0.00900
Epoch: 34146, Training Loss: 0.00697
Epoch: 34147, Training Loss: 0.00787
Epoch: 34147, Training Loss: 0.00736
Epoch: 34147, Training Loss: 0.00900
Epoch: 34147, Training Loss: 0.00697
Epoch: 34148, Training Loss: 0.00787
Epoch: 34148, Training Loss: 0.00736
Epoch: 34148, Training Loss: 0.00899
Epoch: 34148, Training Loss: 0.00697
Epoch: 34149, Training Loss: 0.00787
Epoch: 34149, Training Loss: 0.00736
Epoch: 34149, Training Loss: 0.00899
Epoch: 34149, Training Loss: 0.00697
Epoch: 34150, Training Loss: 0.00787
Epoch: 34150, Training Loss: 0.00736
Epoch: 34150, Training Loss: 0.00899
Epoch: 34150, Training Loss: 0.00697
Epoch: 34151, Training Loss: 0.00787
Epoch: 34151, Training Loss: 0.00736
Epoch: 34151, Training Loss: 0.00899
Epoch: 34151, Training Loss: 0.00697
Epoch: 34152, Training Loss: 0.00787
Epoch: 34152, Training Loss: 0.00736
Epoch: 34152, Training Loss: 0.00899
Epoch: 34152, Training Loss: 0.00697
Epoch: 34153, Training Loss: 0.00787
Epoch: 34153, Training Loss: 0.00736
Epoch: 34153, Training Loss: 0.00899
Epoch: 34153, Training Loss: 0.00697
Epoch: 34154, Training Loss: 0.00787
Epoch: 34154, Training Loss: 0.00736
Epoch: 34154, Training Loss: 0.00899
Epoch: 34154, Training Loss: 0.00697
Epoch: 34155, Training Loss: 0.00787
Epoch: 34155, Training Loss: 0.00736
Epoch: 34155, Training Loss: 0.00899
Epoch: 34155, Training Loss: 0.00697
Epoch: 34156, Training Loss: 0.00787
Epoch: 34156, Training Loss: 0.00736
Epoch: 34156, Training Loss: 0.00899
Epoch: 34156, Training Loss: 0.00697
Epoch: 34157, Training Loss: 0.00787
Epoch: 34157, Training Loss: 0.00736
Epoch: 34157, Training Loss: 0.00899
Epoch: 34157, Training Loss: 0.00697
Epoch: 34158, Training Loss: 0.00787
Epoch: 34158, Training Loss: 0.00736
Epoch: 34158, Training Loss: 0.00899
Epoch: 34158, Training Loss: 0.00697
Epoch: 34159, Training Loss: 0.00787
Epoch: 34159, Training Loss: 0.00736
Epoch: 34159, Training Loss: 0.00899
Epoch: 34159, Training Loss: 0.00697
Epoch: 34160, Training Loss: 0.00787
Epoch: 34160, Training Loss: 0.00736
Epoch: 34160, Training Loss: 0.00899
Epoch: 34160, Training Loss: 0.00697
Epoch: 34161, Training Loss: 0.00787
Epoch: 34161, Training Loss: 0.00736
Epoch: 34161, Training Loss: 0.00899
Epoch: 34161, Training Loss: 0.00697
Epoch: 34162, Training Loss: 0.00787
Epoch: 34162, Training Loss: 0.00736
Epoch: 34162, Training Loss: 0.00899
Epoch: 34162, Training Loss: 0.00697
Epoch: 34163, Training Loss: 0.00787
Epoch: 34163, Training Loss: 0.00736
Epoch: 34163, Training Loss: 0.00899
Epoch: 34163, Training Loss: 0.00697
Epoch: 34164, Training Loss: 0.00787
Epoch: 34164, Training Loss: 0.00736
Epoch: 34164, Training Loss: 0.00899
Epoch: 34164, Training Loss: 0.00697
Epoch: 34165, Training Loss: 0.00787
Epoch: 34165, Training Loss: 0.00736
Epoch: 34165, Training Loss: 0.00899
Epoch: 34165, Training Loss: 0.00697
Epoch: 34166, Training Loss: 0.00787
Epoch: 34166, Training Loss: 0.00736
Epoch: 34166, Training Loss: 0.00899
Epoch: 34166, Training Loss: 0.00697
Epoch: 34167, Training Loss: 0.00787
Epoch: 34167, Training Loss: 0.00736
Epoch: 34167, Training Loss: 0.00899
Epoch: 34167, Training Loss: 0.00697
Epoch: 34168, Training Loss: 0.00787
Epoch: 34168, Training Loss: 0.00736
Epoch: 34168, Training Loss: 0.00899
Epoch: 34168, Training Loss: 0.00697
Epoch: 34169, Training Loss: 0.00787
Epoch: 34169, Training Loss: 0.00736
Epoch: 34169, Training Loss: 0.00899
Epoch: 34169, Training Loss: 0.00697
Epoch: 34170, Training Loss: 0.00787
Epoch: 34170, Training Loss: 0.00736
Epoch: 34170, Training Loss: 0.00899
Epoch: 34170, Training Loss: 0.00697
Epoch: 34171, Training Loss: 0.00787
Epoch: 34171, Training Loss: 0.00736
Epoch: 34171, Training Loss: 0.00899
Epoch: 34171, Training Loss: 0.00697
Epoch: 34172, Training Loss: 0.00787
Epoch: 34172, Training Loss: 0.00736
Epoch: 34172, Training Loss: 0.00899
Epoch: 34172, Training Loss: 0.00697
Epoch: 34173, Training Loss: 0.00787
Epoch: 34173, Training Loss: 0.00736
Epoch: 34173, Training Loss: 0.00899
Epoch: 34173, Training Loss: 0.00697
Epoch: 34174, Training Loss: 0.00787
Epoch: 34174, Training Loss: 0.00736
Epoch: 34174, Training Loss: 0.00899
Epoch: 34174, Training Loss: 0.00697
Epoch: 34175, Training Loss: 0.00787
Epoch: 34175, Training Loss: 0.00736
Epoch: 34175, Training Loss: 0.00899
Epoch: 34175, Training Loss: 0.00697
Epoch: 34176, Training Loss: 0.00787
Epoch: 34176, Training Loss: 0.00736
Epoch: 34176, Training Loss: 0.00899
Epoch: 34176, Training Loss: 0.00697
Epoch: 34177, Training Loss: 0.00787
Epoch: 34177, Training Loss: 0.00736
Epoch: 34177, Training Loss: 0.00899
Epoch: 34177, Training Loss: 0.00697
Epoch: 34178, Training Loss: 0.00787
Epoch: 34178, Training Loss: 0.00736
Epoch: 34178, Training Loss: 0.00899
Epoch: 34178, Training Loss: 0.00697
Epoch: 34179, Training Loss: 0.00787
Epoch: 34179, Training Loss: 0.00736
Epoch: 34179, Training Loss: 0.00899
Epoch: 34179, Training Loss: 0.00697
Epoch: 34180, Training Loss: 0.00787
Epoch: 34180, Training Loss: 0.00736
Epoch: 34180, Training Loss: 0.00899
Epoch: 34180, Training Loss: 0.00697
Epoch: 34181, Training Loss: 0.00787
Epoch: 34181, Training Loss: 0.00736
Epoch: 34181, Training Loss: 0.00899
Epoch: 34181, Training Loss: 0.00697
Epoch: 34182, Training Loss: 0.00787
Epoch: 34182, Training Loss: 0.00736
Epoch: 34182, Training Loss: 0.00899
Epoch: 34182, Training Loss: 0.00697
Epoch: 34183, Training Loss: 0.00787
Epoch: 34183, Training Loss: 0.00736
Epoch: 34183, Training Loss: 0.00899
Epoch: 34183, Training Loss: 0.00697
Epoch: 34184, Training Loss: 0.00787
Epoch: 34184, Training Loss: 0.00736
Epoch: 34184, Training Loss: 0.00899
Epoch: 34184, Training Loss: 0.00697
Epoch: 34185, Training Loss: 0.00787
Epoch: 34185, Training Loss: 0.00736
Epoch: 34185, Training Loss: 0.00899
Epoch: 34185, Training Loss: 0.00697
Epoch: 34186, Training Loss: 0.00787
Epoch: 34186, Training Loss: 0.00736
Epoch: 34186, Training Loss: 0.00899
Epoch: 34186, Training Loss: 0.00697
Epoch: 34187, Training Loss: 0.00787
Epoch: 34187, Training Loss: 0.00736
Epoch: 34187, Training Loss: 0.00899
Epoch: 34187, Training Loss: 0.00697
Epoch: 34188, Training Loss: 0.00787
Epoch: 34188, Training Loss: 0.00736
Epoch: 34188, Training Loss: 0.00899
Epoch: 34188, Training Loss: 0.00697
Epoch: 34189, Training Loss: 0.00787
Epoch: 34189, Training Loss: 0.00736
Epoch: 34189, Training Loss: 0.00899
Epoch: 34189, Training Loss: 0.00697
Epoch: 34190, Training Loss: 0.00787
Epoch: 34190, Training Loss: 0.00736
Epoch: 34190, Training Loss: 0.00899
Epoch: 34190, Training Loss: 0.00697
Epoch: 34191, Training Loss: 0.00787
Epoch: 34191, Training Loss: 0.00736
Epoch: 34191, Training Loss: 0.00899
Epoch: 34191, Training Loss: 0.00697
Epoch: 34192, Training Loss: 0.00787
Epoch: 34192, Training Loss: 0.00736
Epoch: 34192, Training Loss: 0.00899
Epoch: 34192, Training Loss: 0.00697
Epoch: 34193, Training Loss: 0.00787
Epoch: 34193, Training Loss: 0.00736
Epoch: 34193, Training Loss: 0.00899
Epoch: 34193, Training Loss: 0.00697
Epoch: 34194, Training Loss: 0.00787
Epoch: 34194, Training Loss: 0.00736
Epoch: 34194, Training Loss: 0.00899
Epoch: 34194, Training Loss: 0.00696
Epoch: 34195, Training Loss: 0.00787
Epoch: 34195, Training Loss: 0.00736
Epoch: 34195, Training Loss: 0.00899
Epoch: 34195, Training Loss: 0.00696
Epoch: 34196, Training Loss: 0.00787
Epoch: 34196, Training Loss: 0.00736
Epoch: 34196, Training Loss: 0.00899
Epoch: 34196, Training Loss: 0.00696
Epoch: 34197, Training Loss: 0.00787
Epoch: 34197, Training Loss: 0.00736
Epoch: 34197, Training Loss: 0.00899
Epoch: 34197, Training Loss: 0.00696
Epoch: 34198, Training Loss: 0.00787
Epoch: 34198, Training Loss: 0.00736
Epoch: 34198, Training Loss: 0.00899
Epoch: 34198, Training Loss: 0.00696
Epoch: 34199, Training Loss: 0.00787
Epoch: 34199, Training Loss: 0.00736
Epoch: 34199, Training Loss: 0.00899
Epoch: 34199, Training Loss: 0.00696
Epoch: 34200, Training Loss: 0.00787
Epoch: 34200, Training Loss: 0.00736
Epoch: 34200, Training Loss: 0.00899
Epoch: 34200, Training Loss: 0.00696
Epoch: 34201, Training Loss: 0.00787
Epoch: 34201, Training Loss: 0.00736
Epoch: 34201, Training Loss: 0.00899
Epoch: 34201, Training Loss: 0.00696
Epoch: 34202, Training Loss: 0.00787
Epoch: 34202, Training Loss: 0.00735
Epoch: 34202, Training Loss: 0.00899
Epoch: 34202, Training Loss: 0.00696
Epoch: 34203, Training Loss: 0.00787
Epoch: 34203, Training Loss: 0.00735
Epoch: 34203, Training Loss: 0.00899
Epoch: 34203, Training Loss: 0.00696
Epoch: 34204, Training Loss: 0.00787
Epoch: 34204, Training Loss: 0.00735
Epoch: 34204, Training Loss: 0.00899
Epoch: 34204, Training Loss: 0.00696
Epoch: 34205, Training Loss: 0.00787
Epoch: 34205, Training Loss: 0.00735
Epoch: 34205, Training Loss: 0.00899
Epoch: 34205, Training Loss: 0.00696
Epoch: 34206, Training Loss: 0.00787
Epoch: 34206, Training Loss: 0.00735
Epoch: 34206, Training Loss: 0.00899
Epoch: 34206, Training Loss: 0.00696
Epoch: 34207, Training Loss: 0.00787
Epoch: 34207, Training Loss: 0.00735
Epoch: 34207, Training Loss: 0.00899
Epoch: 34207, Training Loss: 0.00696
Epoch: 34208, Training Loss: 0.00787
Epoch: 34208, Training Loss: 0.00735
Epoch: 34208, Training Loss: 0.00899
Epoch: 34208, Training Loss: 0.00696
... (per-pattern losses for epochs 34209-34450 omitted; each value decreases only in the last decimal place as training creeps toward the 0.008 target) ...
Epoch: 34451, Training Loss: 0.00785
Epoch: 34451, Training Loss: 0.00733
Epoch: 34451, Training Loss: 0.00895
Epoch: 34451, Training Loss: 0.00694
Epoch: 34452, Training Loss: 0.00784
Epoch: 34452, Training Loss: 0.00733
Epoch: 34452, Training Loss: 0.00895
Epoch: 34452, Training Loss: 0.00694
Epoch: 34453, Training Loss: 0.00784
Epoch: 34453, Training Loss: 0.00733
Epoch: 34453, Training Loss: 0.00895
Epoch: 34453, Training Loss: 0.00694
Epoch: 34454, Training Loss: 0.00784
Epoch: 34454, Training Loss: 0.00733
Epoch: 34454, Training Loss: 0.00895
Epoch: 34454, Training Loss: 0.00694
Epoch: 34455, Training Loss: 0.00784
Epoch: 34455, Training Loss: 0.00733
Epoch: 34455, Training Loss: 0.00895
Epoch: 34455, Training Loss: 0.00694
Epoch: 34456, Training Loss: 0.00784
Epoch: 34456, Training Loss: 0.00733
Epoch: 34456, Training Loss: 0.00895
Epoch: 34456, Training Loss: 0.00694
Epoch: 34457, Training Loss: 0.00784
Epoch: 34457, Training Loss: 0.00733
Epoch: 34457, Training Loss: 0.00895
Epoch: 34457, Training Loss: 0.00694
Epoch: 34458, Training Loss: 0.00784
Epoch: 34458, Training Loss: 0.00733
Epoch: 34458, Training Loss: 0.00895
Epoch: 34458, Training Loss: 0.00694
Epoch: 34459, Training Loss: 0.00784
Epoch: 34459, Training Loss: 0.00733
Epoch: 34459, Training Loss: 0.00895
Epoch: 34459, Training Loss: 0.00694
Epoch: 34460, Training Loss: 0.00784
Epoch: 34460, Training Loss: 0.00733
Epoch: 34460, Training Loss: 0.00895
Epoch: 34460, Training Loss: 0.00694
Epoch: 34461, Training Loss: 0.00784
Epoch: 34461, Training Loss: 0.00733
Epoch: 34461, Training Loss: 0.00895
Epoch: 34461, Training Loss: 0.00694
Epoch: 34462, Training Loss: 0.00784
Epoch: 34462, Training Loss: 0.00733
Epoch: 34462, Training Loss: 0.00895
Epoch: 34462, Training Loss: 0.00694
Epoch: 34463, Training Loss: 0.00784
Epoch: 34463, Training Loss: 0.00733
Epoch: 34463, Training Loss: 0.00895
Epoch: 34463, Training Loss: 0.00694
Epoch: 34464, Training Loss: 0.00784
Epoch: 34464, Training Loss: 0.00733
Epoch: 34464, Training Loss: 0.00895
Epoch: 34464, Training Loss: 0.00694
Epoch: 34465, Training Loss: 0.00784
Epoch: 34465, Training Loss: 0.00733
Epoch: 34465, Training Loss: 0.00895
Epoch: 34465, Training Loss: 0.00694
Epoch: 34466, Training Loss: 0.00784
Epoch: 34466, Training Loss: 0.00733
Epoch: 34466, Training Loss: 0.00895
Epoch: 34466, Training Loss: 0.00694
Epoch: 34467, Training Loss: 0.00784
Epoch: 34467, Training Loss: 0.00733
Epoch: 34467, Training Loss: 0.00895
Epoch: 34467, Training Loss: 0.00694
Epoch: 34468, Training Loss: 0.00784
Epoch: 34468, Training Loss: 0.00733
Epoch: 34468, Training Loss: 0.00895
Epoch: 34468, Training Loss: 0.00694
Epoch: 34469, Training Loss: 0.00784
Epoch: 34469, Training Loss: 0.00733
Epoch: 34469, Training Loss: 0.00895
Epoch: 34469, Training Loss: 0.00694
Epoch: 34470, Training Loss: 0.00784
Epoch: 34470, Training Loss: 0.00733
Epoch: 34470, Training Loss: 0.00895
Epoch: 34470, Training Loss: 0.00694
Epoch: 34471, Training Loss: 0.00784
Epoch: 34471, Training Loss: 0.00732
Epoch: 34471, Training Loss: 0.00895
Epoch: 34471, Training Loss: 0.00694
Epoch: 34472, Training Loss: 0.00784
Epoch: 34472, Training Loss: 0.00732
Epoch: 34472, Training Loss: 0.00895
Epoch: 34472, Training Loss: 0.00694
Epoch: 34473, Training Loss: 0.00784
Epoch: 34473, Training Loss: 0.00732
Epoch: 34473, Training Loss: 0.00895
Epoch: 34473, Training Loss: 0.00694
Epoch: 34474, Training Loss: 0.00784
Epoch: 34474, Training Loss: 0.00732
Epoch: 34474, Training Loss: 0.00895
Epoch: 34474, Training Loss: 0.00694
Epoch: 34475, Training Loss: 0.00784
Epoch: 34475, Training Loss: 0.00732
Epoch: 34475, Training Loss: 0.00895
Epoch: 34475, Training Loss: 0.00694
Epoch: 34476, Training Loss: 0.00783
Epoch: 34476, Training Loss: 0.00732
Epoch: 34476, Training Loss: 0.00895
Epoch: 34476, Training Loss: 0.00694
Epoch: 34477, Training Loss: 0.00783
Epoch: 34477, Training Loss: 0.00732
Epoch: 34477, Training Loss: 0.00895
Epoch: 34477, Training Loss: 0.00694
Epoch: 34478, Training Loss: 0.00783
Epoch: 34478, Training Loss: 0.00732
Epoch: 34478, Training Loss: 0.00895
Epoch: 34478, Training Loss: 0.00693
Epoch: 34479, Training Loss: 0.00783
Epoch: 34479, Training Loss: 0.00732
Epoch: 34479, Training Loss: 0.00895
Epoch: 34479, Training Loss: 0.00693
Epoch: 34480, Training Loss: 0.00783
Epoch: 34480, Training Loss: 0.00732
Epoch: 34480, Training Loss: 0.00895
Epoch: 34480, Training Loss: 0.00693
Epoch: 34481, Training Loss: 0.00783
Epoch: 34481, Training Loss: 0.00732
Epoch: 34481, Training Loss: 0.00895
Epoch: 34481, Training Loss: 0.00693
Epoch: 34482, Training Loss: 0.00783
Epoch: 34482, Training Loss: 0.00732
Epoch: 34482, Training Loss: 0.00895
Epoch: 34482, Training Loss: 0.00693
Epoch: 34483, Training Loss: 0.00783
Epoch: 34483, Training Loss: 0.00732
Epoch: 34483, Training Loss: 0.00895
Epoch: 34483, Training Loss: 0.00693
Epoch: 34484, Training Loss: 0.00783
Epoch: 34484, Training Loss: 0.00732
Epoch: 34484, Training Loss: 0.00895
Epoch: 34484, Training Loss: 0.00693
Epoch: 34485, Training Loss: 0.00783
Epoch: 34485, Training Loss: 0.00732
Epoch: 34485, Training Loss: 0.00895
Epoch: 34485, Training Loss: 0.00693
Epoch: 34486, Training Loss: 0.00783
Epoch: 34486, Training Loss: 0.00732
Epoch: 34486, Training Loss: 0.00895
Epoch: 34486, Training Loss: 0.00693
Epoch: 34487, Training Loss: 0.00783
Epoch: 34487, Training Loss: 0.00732
Epoch: 34487, Training Loss: 0.00895
Epoch: 34487, Training Loss: 0.00693
Epoch: 34488, Training Loss: 0.00783
Epoch: 34488, Training Loss: 0.00732
Epoch: 34488, Training Loss: 0.00895
Epoch: 34488, Training Loss: 0.00693
Epoch: 34489, Training Loss: 0.00783
Epoch: 34489, Training Loss: 0.00732
Epoch: 34489, Training Loss: 0.00895
Epoch: 34489, Training Loss: 0.00693
Epoch: 34490, Training Loss: 0.00783
Epoch: 34490, Training Loss: 0.00732
Epoch: 34490, Training Loss: 0.00895
Epoch: 34490, Training Loss: 0.00693
Epoch: 34491, Training Loss: 0.00783
Epoch: 34491, Training Loss: 0.00732
Epoch: 34491, Training Loss: 0.00895
Epoch: 34491, Training Loss: 0.00693
Epoch: 34492, Training Loss: 0.00783
Epoch: 34492, Training Loss: 0.00732
Epoch: 34492, Training Loss: 0.00895
Epoch: 34492, Training Loss: 0.00693
Epoch: 34493, Training Loss: 0.00783
Epoch: 34493, Training Loss: 0.00732
Epoch: 34493, Training Loss: 0.00895
Epoch: 34493, Training Loss: 0.00693
Epoch: 34494, Training Loss: 0.00783
Epoch: 34494, Training Loss: 0.00732
Epoch: 34494, Training Loss: 0.00895
Epoch: 34494, Training Loss: 0.00693
Epoch: 34495, Training Loss: 0.00783
Epoch: 34495, Training Loss: 0.00732
Epoch: 34495, Training Loss: 0.00895
Epoch: 34495, Training Loss: 0.00693
Epoch: 34496, Training Loss: 0.00783
Epoch: 34496, Training Loss: 0.00732
Epoch: 34496, Training Loss: 0.00895
Epoch: 34496, Training Loss: 0.00693
Epoch: 34497, Training Loss: 0.00783
Epoch: 34497, Training Loss: 0.00732
Epoch: 34497, Training Loss: 0.00895
Epoch: 34497, Training Loss: 0.00693
Epoch: 34498, Training Loss: 0.00783
Epoch: 34498, Training Loss: 0.00732
Epoch: 34498, Training Loss: 0.00895
Epoch: 34498, Training Loss: 0.00693
Epoch: 34499, Training Loss: 0.00783
Epoch: 34499, Training Loss: 0.00732
Epoch: 34499, Training Loss: 0.00895
Epoch: 34499, Training Loss: 0.00693
Epoch: 34500, Training Loss: 0.00783
Epoch: 34500, Training Loss: 0.00732
Epoch: 34500, Training Loss: 0.00895
Epoch: 34500, Training Loss: 0.00693
Epoch: 34501, Training Loss: 0.00783
Epoch: 34501, Training Loss: 0.00732
Epoch: 34501, Training Loss: 0.00895
Epoch: 34501, Training Loss: 0.00693
Epoch: 34502, Training Loss: 0.00783
Epoch: 34502, Training Loss: 0.00732
Epoch: 34502, Training Loss: 0.00895
Epoch: 34502, Training Loss: 0.00693
Epoch: 34503, Training Loss: 0.00783
Epoch: 34503, Training Loss: 0.00732
Epoch: 34503, Training Loss: 0.00895
Epoch: 34503, Training Loss: 0.00693
Epoch: 34504, Training Loss: 0.00783
Epoch: 34504, Training Loss: 0.00732
Epoch: 34504, Training Loss: 0.00895
Epoch: 34504, Training Loss: 0.00693
Epoch: 34505, Training Loss: 0.00783
Epoch: 34505, Training Loss: 0.00732
Epoch: 34505, Training Loss: 0.00895
Epoch: 34505, Training Loss: 0.00693
Epoch: 34506, Training Loss: 0.00783
Epoch: 34506, Training Loss: 0.00732
Epoch: 34506, Training Loss: 0.00895
Epoch: 34506, Training Loss: 0.00693
Epoch: 34507, Training Loss: 0.00783
Epoch: 34507, Training Loss: 0.00732
Epoch: 34507, Training Loss: 0.00895
Epoch: 34507, Training Loss: 0.00693
Epoch: 34508, Training Loss: 0.00783
Epoch: 34508, Training Loss: 0.00732
Epoch: 34508, Training Loss: 0.00895
Epoch: 34508, Training Loss: 0.00693
Epoch: 34509, Training Loss: 0.00783
Epoch: 34509, Training Loss: 0.00732
Epoch: 34509, Training Loss: 0.00895
Epoch: 34509, Training Loss: 0.00693
Epoch: 34510, Training Loss: 0.00783
Epoch: 34510, Training Loss: 0.00732
Epoch: 34510, Training Loss: 0.00895
Epoch: 34510, Training Loss: 0.00693
Epoch: 34511, Training Loss: 0.00783
Epoch: 34511, Training Loss: 0.00732
Epoch: 34511, Training Loss: 0.00894
Epoch: 34511, Training Loss: 0.00693
Epoch: 34512, Training Loss: 0.00783
Epoch: 34512, Training Loss: 0.00732
Epoch: 34512, Training Loss: 0.00894
Epoch: 34512, Training Loss: 0.00693
Epoch: 34513, Training Loss: 0.00783
Epoch: 34513, Training Loss: 0.00732
Epoch: 34513, Training Loss: 0.00894
Epoch: 34513, Training Loss: 0.00693
Epoch: 34514, Training Loss: 0.00783
Epoch: 34514, Training Loss: 0.00732
Epoch: 34514, Training Loss: 0.00894
Epoch: 34514, Training Loss: 0.00693
Epoch: 34515, Training Loss: 0.00783
Epoch: 34515, Training Loss: 0.00732
Epoch: 34515, Training Loss: 0.00894
Epoch: 34515, Training Loss: 0.00693
Epoch: 34516, Training Loss: 0.00783
Epoch: 34516, Training Loss: 0.00732
Epoch: 34516, Training Loss: 0.00894
Epoch: 34516, Training Loss: 0.00693
Epoch: 34517, Training Loss: 0.00783
Epoch: 34517, Training Loss: 0.00732
Epoch: 34517, Training Loss: 0.00894
Epoch: 34517, Training Loss: 0.00693
Epoch: 34518, Training Loss: 0.00783
Epoch: 34518, Training Loss: 0.00732
Epoch: 34518, Training Loss: 0.00894
Epoch: 34518, Training Loss: 0.00693
Epoch: 34519, Training Loss: 0.00783
Epoch: 34519, Training Loss: 0.00732
Epoch: 34519, Training Loss: 0.00894
Epoch: 34519, Training Loss: 0.00693
Epoch: 34520, Training Loss: 0.00783
Epoch: 34520, Training Loss: 0.00732
Epoch: 34520, Training Loss: 0.00894
Epoch: 34520, Training Loss: 0.00693
Epoch: 34521, Training Loss: 0.00783
Epoch: 34521, Training Loss: 0.00732
Epoch: 34521, Training Loss: 0.00894
Epoch: 34521, Training Loss: 0.00693
Epoch: 34522, Training Loss: 0.00783
Epoch: 34522, Training Loss: 0.00732
Epoch: 34522, Training Loss: 0.00894
Epoch: 34522, Training Loss: 0.00693
Epoch: 34523, Training Loss: 0.00783
Epoch: 34523, Training Loss: 0.00732
Epoch: 34523, Training Loss: 0.00894
Epoch: 34523, Training Loss: 0.00693
Epoch: 34524, Training Loss: 0.00783
Epoch: 34524, Training Loss: 0.00732
Epoch: 34524, Training Loss: 0.00894
Epoch: 34524, Training Loss: 0.00693
Epoch: 34525, Training Loss: 0.00783
Epoch: 34525, Training Loss: 0.00732
Epoch: 34525, Training Loss: 0.00894
Epoch: 34525, Training Loss: 0.00693
Epoch: 34526, Training Loss: 0.00783
Epoch: 34526, Training Loss: 0.00732
Epoch: 34526, Training Loss: 0.00894
Epoch: 34526, Training Loss: 0.00693
Epoch: 34527, Training Loss: 0.00783
Epoch: 34527, Training Loss: 0.00732
Epoch: 34527, Training Loss: 0.00894
Epoch: 34527, Training Loss: 0.00693
Epoch: 34528, Training Loss: 0.00783
Epoch: 34528, Training Loss: 0.00732
Epoch: 34528, Training Loss: 0.00894
Epoch: 34528, Training Loss: 0.00693
Epoch: 34529, Training Loss: 0.00783
Epoch: 34529, Training Loss: 0.00732
Epoch: 34529, Training Loss: 0.00894
Epoch: 34529, Training Loss: 0.00693
Epoch: 34530, Training Loss: 0.00783
Epoch: 34530, Training Loss: 0.00732
Epoch: 34530, Training Loss: 0.00894
Epoch: 34530, Training Loss: 0.00693
Epoch: 34531, Training Loss: 0.00783
Epoch: 34531, Training Loss: 0.00732
Epoch: 34531, Training Loss: 0.00894
Epoch: 34531, Training Loss: 0.00693
Epoch: 34532, Training Loss: 0.00783
Epoch: 34532, Training Loss: 0.00732
Epoch: 34532, Training Loss: 0.00894
Epoch: 34532, Training Loss: 0.00693
Epoch: 34533, Training Loss: 0.00783
Epoch: 34533, Training Loss: 0.00732
Epoch: 34533, Training Loss: 0.00894
Epoch: 34533, Training Loss: 0.00693
Epoch: 34534, Training Loss: 0.00783
Epoch: 34534, Training Loss: 0.00732
Epoch: 34534, Training Loss: 0.00894
Epoch: 34534, Training Loss: 0.00693
Epoch: 34535, Training Loss: 0.00783
Epoch: 34535, Training Loss: 0.00732
Epoch: 34535, Training Loss: 0.00894
Epoch: 34535, Training Loss: 0.00693
Epoch: 34536, Training Loss: 0.00783
Epoch: 34536, Training Loss: 0.00732
Epoch: 34536, Training Loss: 0.00894
Epoch: 34536, Training Loss: 0.00693
Epoch: 34537, Training Loss: 0.00783
Epoch: 34537, Training Loss: 0.00732
Epoch: 34537, Training Loss: 0.00894
Epoch: 34537, Training Loss: 0.00693
Epoch: 34538, Training Loss: 0.00783
Epoch: 34538, Training Loss: 0.00732
Epoch: 34538, Training Loss: 0.00894
Epoch: 34538, Training Loss: 0.00693
Epoch: 34539, Training Loss: 0.00783
Epoch: 34539, Training Loss: 0.00732
Epoch: 34539, Training Loss: 0.00894
Epoch: 34539, Training Loss: 0.00693
Epoch: 34540, Training Loss: 0.00783
Epoch: 34540, Training Loss: 0.00732
Epoch: 34540, Training Loss: 0.00894
Epoch: 34540, Training Loss: 0.00693
Epoch: 34541, Training Loss: 0.00783
Epoch: 34541, Training Loss: 0.00732
Epoch: 34541, Training Loss: 0.00894
Epoch: 34541, Training Loss: 0.00693
Epoch: 34542, Training Loss: 0.00783
Epoch: 34542, Training Loss: 0.00732
Epoch: 34542, Training Loss: 0.00894
Epoch: 34542, Training Loss: 0.00693
Epoch: 34543, Training Loss: 0.00783
Epoch: 34543, Training Loss: 0.00732
Epoch: 34543, Training Loss: 0.00894
Epoch: 34543, Training Loss: 0.00693
Epoch: 34544, Training Loss: 0.00783
Epoch: 34544, Training Loss: 0.00732
Epoch: 34544, Training Loss: 0.00894
Epoch: 34544, Training Loss: 0.00693
Epoch: 34545, Training Loss: 0.00783
Epoch: 34545, Training Loss: 0.00732
Epoch: 34545, Training Loss: 0.00894
Epoch: 34545, Training Loss: 0.00693
Epoch: 34546, Training Loss: 0.00783
Epoch: 34546, Training Loss: 0.00732
Epoch: 34546, Training Loss: 0.00894
Epoch: 34546, Training Loss: 0.00693
Epoch: 34547, Training Loss: 0.00783
Epoch: 34547, Training Loss: 0.00732
Epoch: 34547, Training Loss: 0.00894
Epoch: 34547, Training Loss: 0.00693
Epoch: 34548, Training Loss: 0.00783
Epoch: 34548, Training Loss: 0.00732
Epoch: 34548, Training Loss: 0.00894
Epoch: 34548, Training Loss: 0.00693
Epoch: 34549, Training Loss: 0.00783
Epoch: 34549, Training Loss: 0.00732
Epoch: 34549, Training Loss: 0.00894
Epoch: 34549, Training Loss: 0.00693
Epoch: 34550, Training Loss: 0.00783
Epoch: 34550, Training Loss: 0.00732
Epoch: 34550, Training Loss: 0.00894
Epoch: 34550, Training Loss: 0.00693
Epoch: 34551, Training Loss: 0.00783
Epoch: 34551, Training Loss: 0.00732
Epoch: 34551, Training Loss: 0.00894
Epoch: 34551, Training Loss: 0.00693
Epoch: 34552, Training Loss: 0.00783
Epoch: 34552, Training Loss: 0.00732
Epoch: 34552, Training Loss: 0.00894
Epoch: 34552, Training Loss: 0.00693
Epoch: 34553, Training Loss: 0.00783
Epoch: 34553, Training Loss: 0.00732
Epoch: 34553, Training Loss: 0.00894
Epoch: 34553, Training Loss: 0.00693
Epoch: 34554, Training Loss: 0.00783
Epoch: 34554, Training Loss: 0.00732
Epoch: 34554, Training Loss: 0.00894
Epoch: 34554, Training Loss: 0.00693
Epoch: 34555, Training Loss: 0.00783
Epoch: 34555, Training Loss: 0.00732
Epoch: 34555, Training Loss: 0.00894
Epoch: 34555, Training Loss: 0.00693
Epoch: 34556, Training Loss: 0.00783
Epoch: 34556, Training Loss: 0.00732
Epoch: 34556, Training Loss: 0.00894
Epoch: 34556, Training Loss: 0.00693
Epoch: 34557, Training Loss: 0.00783
Epoch: 34557, Training Loss: 0.00732
Epoch: 34557, Training Loss: 0.00894
Epoch: 34557, Training Loss: 0.00693
Epoch: 34558, Training Loss: 0.00783
Epoch: 34558, Training Loss: 0.00732
Epoch: 34558, Training Loss: 0.00894
Epoch: 34558, Training Loss: 0.00693
Epoch: 34559, Training Loss: 0.00782
Epoch: 34559, Training Loss: 0.00732
Epoch: 34559, Training Loss: 0.00894
Epoch: 34559, Training Loss: 0.00693
Epoch: 34560, Training Loss: 0.00782
Epoch: 34560, Training Loss: 0.00732
Epoch: 34560, Training Loss: 0.00894
Epoch: 34560, Training Loss: 0.00693
Epoch: 34561, Training Loss: 0.00782
Epoch: 34561, Training Loss: 0.00732
Epoch: 34561, Training Loss: 0.00894
Epoch: 34561, Training Loss: 0.00693
Epoch: 34562, Training Loss: 0.00782
Epoch: 34562, Training Loss: 0.00731
Epoch: 34562, Training Loss: 0.00894
Epoch: 34562, Training Loss: 0.00693
Epoch: 34563, Training Loss: 0.00782
Epoch: 34563, Training Loss: 0.00731
Epoch: 34563, Training Loss: 0.00894
Epoch: 34563, Training Loss: 0.00693
Epoch: 34564, Training Loss: 0.00782
Epoch: 34564, Training Loss: 0.00731
Epoch: 34564, Training Loss: 0.00894
Epoch: 34564, Training Loss: 0.00693
Epoch: 34565, Training Loss: 0.00782
Epoch: 34565, Training Loss: 0.00731
Epoch: 34565, Training Loss: 0.00894
Epoch: 34565, Training Loss: 0.00693
Epoch: 34566, Training Loss: 0.00782
Epoch: 34566, Training Loss: 0.00731
Epoch: 34566, Training Loss: 0.00894
Epoch: 34566, Training Loss: 0.00693
Epoch: 34567, Training Loss: 0.00782
Epoch: 34567, Training Loss: 0.00731
Epoch: 34567, Training Loss: 0.00894
Epoch: 34567, Training Loss: 0.00693
Epoch: 34568, Training Loss: 0.00782
Epoch: 34568, Training Loss: 0.00731
Epoch: 34568, Training Loss: 0.00894
Epoch: 34568, Training Loss: 0.00693
Epoch: 34569, Training Loss: 0.00782
Epoch: 34569, Training Loss: 0.00731
Epoch: 34569, Training Loss: 0.00894
Epoch: 34569, Training Loss: 0.00693
Epoch: 34570, Training Loss: 0.00782
Epoch: 34570, Training Loss: 0.00731
Epoch: 34570, Training Loss: 0.00894
Epoch: 34570, Training Loss: 0.00693
Epoch: 34571, Training Loss: 0.00782
Epoch: 34571, Training Loss: 0.00731
Epoch: 34571, Training Loss: 0.00894
Epoch: 34571, Training Loss: 0.00693
Epoch: 34572, Training Loss: 0.00782
Epoch: 34572, Training Loss: 0.00731
Epoch: 34572, Training Loss: 0.00894
Epoch: 34572, Training Loss: 0.00693
Epoch: 34573, Training Loss: 0.00782
Epoch: 34573, Training Loss: 0.00731
Epoch: 34573, Training Loss: 0.00894
Epoch: 34573, Training Loss: 0.00693
Epoch: 34574, Training Loss: 0.00782
Epoch: 34574, Training Loss: 0.00731
Epoch: 34574, Training Loss: 0.00894
Epoch: 34574, Training Loss: 0.00692
Epoch: 34575, Training Loss: 0.00782
Epoch: 34575, Training Loss: 0.00731
Epoch: 34575, Training Loss: 0.00894
Epoch: 34575, Training Loss: 0.00692
Epoch: 34576, Training Loss: 0.00782
Epoch: 34576, Training Loss: 0.00731
Epoch: 34576, Training Loss: 0.00894
Epoch: 34576, Training Loss: 0.00692
Epoch: 34577, Training Loss: 0.00782
Epoch: 34577, Training Loss: 0.00731
Epoch: 34577, Training Loss: 0.00894
Epoch: 34577, Training Loss: 0.00692
Epoch: 34578, Training Loss: 0.00782
Epoch: 34578, Training Loss: 0.00731
Epoch: 34578, Training Loss: 0.00894
Epoch: 34578, Training Loss: 0.00692
Epoch: 34579, Training Loss: 0.00782
Epoch: 34579, Training Loss: 0.00731
Epoch: 34579, Training Loss: 0.00894
Epoch: 34579, Training Loss: 0.00692
Epoch: 34580, Training Loss: 0.00782
Epoch: 34580, Training Loss: 0.00731
Epoch: 34580, Training Loss: 0.00894
Epoch: 34580, Training Loss: 0.00692
Epoch: 34581, Training Loss: 0.00782
Epoch: 34581, Training Loss: 0.00731
Epoch: 34581, Training Loss: 0.00894
Epoch: 34581, Training Loss: 0.00692
Epoch: 34582, Training Loss: 0.00782
Epoch: 34582, Training Loss: 0.00731
Epoch: 34582, Training Loss: 0.00894
Epoch: 34582, Training Loss: 0.00692
Epoch: 34583, Training Loss: 0.00782
Epoch: 34583, Training Loss: 0.00731
Epoch: 34583, Training Loss: 0.00894
Epoch: 34583, Training Loss: 0.00692
Epoch: 34584, Training Loss: 0.00782
Epoch: 34584, Training Loss: 0.00731
Epoch: 34584, Training Loss: 0.00893
Epoch: 34584, Training Loss: 0.00692
Epoch: 34585, Training Loss: 0.00782
Epoch: 34585, Training Loss: 0.00731
Epoch: 34585, Training Loss: 0.00893
Epoch: 34585, Training Loss: 0.00692
Epoch: 34586, Training Loss: 0.00782
Epoch: 34586, Training Loss: 0.00731
Epoch: 34586, Training Loss: 0.00893
Epoch: 34586, Training Loss: 0.00692
Epoch: 34587, Training Loss: 0.00782
Epoch: 34587, Training Loss: 0.00731
Epoch: 34587, Training Loss: 0.00893
Epoch: 34587, Training Loss: 0.00692
Epoch: 34588, Training Loss: 0.00782
Epoch: 34588, Training Loss: 0.00731
Epoch: 34588, Training Loss: 0.00893
Epoch: 34588, Training Loss: 0.00692
Epoch: 34589, Training Loss: 0.00782
Epoch: 34589, Training Loss: 0.00731
Epoch: 34589, Training Loss: 0.00893
Epoch: 34589, Training Loss: 0.00692
Epoch: 34590, Training Loss: 0.00782
Epoch: 34590, Training Loss: 0.00731
Epoch: 34590, Training Loss: 0.00893
Epoch: 34590, Training Loss: 0.00692
Epoch: 34591, Training Loss: 0.00782
Epoch: 34591, Training Loss: 0.00731
Epoch: 34591, Training Loss: 0.00893
Epoch: 34591, Training Loss: 0.00692
Epoch: 34592, Training Loss: 0.00782
Epoch: 34592, Training Loss: 0.00731
Epoch: 34592, Training Loss: 0.00893
Epoch: 34592, Training Loss: 0.00692
Epoch: 34593, Training Loss: 0.00782
Epoch: 34593, Training Loss: 0.00731
Epoch: 34593, Training Loss: 0.00893
Epoch: 34593, Training Loss: 0.00692
Epoch: 34594, Training Loss: 0.00782
Epoch: 34594, Training Loss: 0.00731
Epoch: 34594, Training Loss: 0.00893
Epoch: 34594, Training Loss: 0.00692
Epoch: 34595, Training Loss: 0.00782
Epoch: 34595, Training Loss: 0.00731
Epoch: 34595, Training Loss: 0.00893
Epoch: 34595, Training Loss: 0.00692
Epoch: 34596, Training Loss: 0.00782
Epoch: 34596, Training Loss: 0.00731
Epoch: 34596, Training Loss: 0.00893
Epoch: 34596, Training Loss: 0.00692
Epoch: 34597, Training Loss: 0.00782
Epoch: 34597, Training Loss: 0.00731
Epoch: 34597, Training Loss: 0.00893
Epoch: 34597, Training Loss: 0.00692
Epoch: 34598, Training Loss: 0.00782
Epoch: 34598, Training Loss: 0.00731
Epoch: 34598, Training Loss: 0.00893
Epoch: 34598, Training Loss: 0.00692
Epoch: 34599, Training Loss: 0.00782
Epoch: 34599, Training Loss: 0.00731
Epoch: 34599, Training Loss: 0.00893
Epoch: 34599, Training Loss: 0.00692
Epoch: 34600, Training Loss: 0.00782
Epoch: 34600, Training Loss: 0.00731
Epoch: 34600, Training Loss: 0.00893
Epoch: 34600, Training Loss: 0.00692
Epoch: 34601, Training Loss: 0.00782
Epoch: 34601, Training Loss: 0.00731
Epoch: 34601, Training Loss: 0.00893
Epoch: 34601, Training Loss: 0.00692
Epoch: 34602, Training Loss: 0.00782
Epoch: 34602, Training Loss: 0.00731
Epoch: 34602, Training Loss: 0.00893
Epoch: 34602, Training Loss: 0.00692
Epoch: 34603, Training Loss: 0.00782
Epoch: 34603, Training Loss: 0.00731
Epoch: 34603, Training Loss: 0.00893
Epoch: 34603, Training Loss: 0.00692
Epoch: 34604, Training Loss: 0.00782
Epoch: 34604, Training Loss: 0.00731
Epoch: 34604, Training Loss: 0.00893
Epoch: 34604, Training Loss: 0.00692
Epoch: 34605, Training Loss: 0.00782
Epoch: 34605, Training Loss: 0.00731
Epoch: 34605, Training Loss: 0.00893
Epoch: 34605, Training Loss: 0.00692
Epoch: 34606, Training Loss: 0.00782
Epoch: 34606, Training Loss: 0.00731
Epoch: 34606, Training Loss: 0.00893
Epoch: 34606, Training Loss: 0.00692
Epoch: 34607, Training Loss: 0.00782
Epoch: 34607, Training Loss: 0.00731
Epoch: 34607, Training Loss: 0.00893
Epoch: 34607, Training Loss: 0.00692
Epoch: 34608, Training Loss: 0.00782
Epoch: 34608, Training Loss: 0.00731
Epoch: 34608, Training Loss: 0.00893
Epoch: 34608, Training Loss: 0.00692
Epoch: 34609, Training Loss: 0.00782
Epoch: 34609, Training Loss: 0.00731
Epoch: 34609, Training Loss: 0.00893
Epoch: 34609, Training Loss: 0.00692
Epoch: 34610, Training Loss: 0.00782
Epoch: 34610, Training Loss: 0.00731
Epoch: 34610, Training Loss: 0.00893
Epoch: 34610, Training Loss: 0.00692
Epoch: 34611, Training Loss: 0.00782
Epoch: 34611, Training Loss: 0.00731
Epoch: 34611, Training Loss: 0.00893
Epoch: 34611, Training Loss: 0.00692
Epoch: 34612, Training Loss: 0.00782
Epoch: 34612, Training Loss: 0.00731
Epoch: 34612, Training Loss: 0.00893
Epoch: 34612, Training Loss: 0.00692
Epoch: 34613, Training Loss: 0.00782
Epoch: 34613, Training Loss: 0.00731
Epoch: 34613, Training Loss: 0.00893
Epoch: 34613, Training Loss: 0.00692
Epoch: 34614, Training Loss: 0.00782
Epoch: 34614, Training Loss: 0.00731
Epoch: 34614, Training Loss: 0.00893
Epoch: 34614, Training Loss: 0.00692
Epoch: 34615, Training Loss: 0.00782
Epoch: 34615, Training Loss: 0.00731
Epoch: 34615, Training Loss: 0.00893
Epoch: 34615, Training Loss: 0.00692
Epoch: 34616, Training Loss: 0.00782
Epoch: 34616, Training Loss: 0.00731
Epoch: 34616, Training Loss: 0.00893
Epoch: 34616, Training Loss: 0.00692
Epoch: 34617, Training Loss: 0.00782
Epoch: 34617, Training Loss: 0.00731
Epoch: 34617, Training Loss: 0.00893
Epoch: 34617, Training Loss: 0.00692
Epoch: 34618, Training Loss: 0.00782
Epoch: 34618, Training Loss: 0.00731
Epoch: 34618, Training Loss: 0.00893
Epoch: 34618, Training Loss: 0.00692
Epoch: 34619, Training Loss: 0.00782
Epoch: 34619, Training Loss: 0.00731
Epoch: 34619, Training Loss: 0.00893
Epoch: 34619, Training Loss: 0.00692
Epoch: 34620, Training Loss: 0.00782
Epoch: 34620, Training Loss: 0.00731
Epoch: 34620, Training Loss: 0.00893
Epoch: 34620, Training Loss: 0.00692
Epoch: 34621, Training Loss: 0.00782
Epoch: 34621, Training Loss: 0.00731
Epoch: 34621, Training Loss: 0.00893
Epoch: 34621, Training Loss: 0.00692
Epoch: 34622, Training Loss: 0.00782
Epoch: 34622, Training Loss: 0.00731
Epoch: 34622, Training Loss: 0.00893
Epoch: 34622, Training Loss: 0.00692
Epoch: 34623, Training Loss: 0.00782
Epoch: 34623, Training Loss: 0.00731
Epoch: 34623, Training Loss: 0.00893
Epoch: 34623, Training Loss: 0.00692
Epoch: 34624, Training Loss: 0.00782
Epoch: 34624, Training Loss: 0.00731
Epoch: 34624, Training Loss: 0.00893
Epoch: 34624, Training Loss: 0.00692
Epoch: 34625, Training Loss: 0.00782
Epoch: 34625, Training Loss: 0.00731
Epoch: 34625, Training Loss: 0.00893
Epoch: 34625, Training Loss: 0.00692
Epoch: 34626, Training Loss: 0.00782
Epoch: 34626, Training Loss: 0.00731
Epoch: 34626, Training Loss: 0.00893
Epoch: 34626, Training Loss: 0.00692
Epoch: 34627, Training Loss: 0.00782
Epoch: 34627, Training Loss: 0.00731
Epoch: 34627, Training Loss: 0.00893
Epoch: 34627, Training Loss: 0.00692
Epoch: 34628, Training Loss: 0.00782
Epoch: 34628, Training Loss: 0.00731
Epoch: 34628, Training Loss: 0.00893
Epoch: 34628, Training Loss: 0.00692
Epoch: 34629, Training Loss: 0.00782
Epoch: 34629, Training Loss: 0.00731
Epoch: 34629, Training Loss: 0.00893
Epoch: 34629, Training Loss: 0.00692
Epoch: 34630, Training Loss: 0.00782
Epoch: 34630, Training Loss: 0.00731
Epoch: 34630, Training Loss: 0.00893
Epoch: 34630, Training Loss: 0.00692
Epoch: 34631, Training Loss: 0.00782
Epoch: 34631, Training Loss: 0.00731
Epoch: 34631, Training Loss: 0.00893
Epoch: 34631, Training Loss: 0.00692
Epoch: 34632, Training Loss: 0.00782
Epoch: 34632, Training Loss: 0.00731
Epoch: 34632, Training Loss: 0.00893
Epoch: 34632, Training Loss: 0.00692
Epoch: 34633, Training Loss: 0.00782
Epoch: 34633, Training Loss: 0.00731
Epoch: 34633, Training Loss: 0.00893
Epoch: 34633, Training Loss: 0.00692
Epoch: 34634, Training Loss: 0.00782
Epoch: 34634, Training Loss: 0.00731
Epoch: 34634, Training Loss: 0.00893
Epoch: 34634, Training Loss: 0.00692
Epoch: 34635, Training Loss: 0.00782
Epoch: 34635, Training Loss: 0.00731
Epoch: 34635, Training Loss: 0.00893
Epoch: 34635, Training Loss: 0.00692
Epoch: 34636, Training Loss: 0.00782
Epoch: 34636, Training Loss: 0.00731
Epoch: 34636, Training Loss: 0.00893
Epoch: 34636, Training Loss: 0.00692
Epoch: 34637, Training Loss: 0.00782
Epoch: 34637, Training Loss: 0.00731
Epoch: 34637, Training Loss: 0.00893
Epoch: 34637, Training Loss: 0.00692
Epoch: 34638, Training Loss: 0.00782
Epoch: 34638, Training Loss: 0.00731
Epoch: 34638, Training Loss: 0.00893
Epoch: 34638, Training Loss: 0.00692
Epoch: 34639, Training Loss: 0.00782
Epoch: 34639, Training Loss: 0.00731
Epoch: 34639, Training Loss: 0.00893
Epoch: 34639, Training Loss: 0.00692
Epoch: 34640, Training Loss: 0.00782
Epoch: 34640, Training Loss: 0.00731
Epoch: 34641, Training Loss: 0.00782
Epoch: 34641, Training Loss: 0.00731
Epoch: 34641, Training Loss: 0.00893
Epoch: 34641, Training Loss: 0.00692
... (four per-pattern losses are printed each epoch, one for each XOR input; the log repeats with the losses decreasing very slowly) ...
Epoch: 34883, Training Loss: 0.00779
Epoch: 34883, Training Loss: 0.00728
Epoch: 34883, Training Loss: 0.00889
Epoch: 34883, Training Loss: 0.00689
Epoch: 34884, Training Loss: 0.00728
Epoch: 34884, Training Loss: 0.00889
Epoch: 34884, Training Loss: 0.00689
Epoch: 34885, Training Loss: 0.00779
Epoch: 34885, Training Loss: 0.00728
Epoch: 34885, Training Loss: 0.00889
Epoch: 34885, Training Loss: 0.00689
Epoch: 34886, Training Loss: 0.00779
Epoch: 34886, Training Loss: 0.00728
Epoch: 34886, Training Loss: 0.00889
Epoch: 34886, Training Loss: 0.00689
Epoch: 34887, Training Loss: 0.00779
Epoch: 34887, Training Loss: 0.00728
Epoch: 34887, Training Loss: 0.00889
Epoch: 34887, Training Loss: 0.00689
Epoch: 34888, Training Loss: 0.00779
Epoch: 34888, Training Loss: 0.00728
Epoch: 34888, Training Loss: 0.00889
Epoch: 34888, Training Loss: 0.00689
Epoch: 34889, Training Loss: 0.00779
Epoch: 34889, Training Loss: 0.00728
Epoch: 34889, Training Loss: 0.00889
Epoch: 34889, Training Loss: 0.00689
Epoch: 34890, Training Loss: 0.00779
Epoch: 34890, Training Loss: 0.00728
Epoch: 34890, Training Loss: 0.00889
Epoch: 34890, Training Loss: 0.00689
Epoch: 34891, Training Loss: 0.00779
Epoch: 34891, Training Loss: 0.00728
Epoch: 34891, Training Loss: 0.00889
Epoch: 34891, Training Loss: 0.00689
Epoch: 34892, Training Loss: 0.00779
Epoch: 34892, Training Loss: 0.00728
Epoch: 34892, Training Loss: 0.00889
Epoch: 34892, Training Loss: 0.00689
Epoch: 34893, Training Loss: 0.00779
Epoch: 34893, Training Loss: 0.00728
Epoch: 34893, Training Loss: 0.00889
Epoch: 34893, Training Loss: 0.00689
Epoch: 34894, Training Loss: 0.00779
Epoch: 34894, Training Loss: 0.00728
Epoch: 34894, Training Loss: 0.00889
Epoch: 34894, Training Loss: 0.00689
Epoch: 34895, Training Loss: 0.00779
Epoch: 34895, Training Loss: 0.00728
Epoch: 34895, Training Loss: 0.00889
Epoch: 34895, Training Loss: 0.00689
Epoch: 34896, Training Loss: 0.00778
Epoch: 34896, Training Loss: 0.00728
Epoch: 34896, Training Loss: 0.00889
Epoch: 34896, Training Loss: 0.00689
Epoch: 34897, Training Loss: 0.00778
Epoch: 34897, Training Loss: 0.00728
Epoch: 34897, Training Loss: 0.00889
Epoch: 34897, Training Loss: 0.00689
Epoch: 34898, Training Loss: 0.00778
Epoch: 34898, Training Loss: 0.00728
Epoch: 34898, Training Loss: 0.00889
Epoch: 34898, Training Loss: 0.00689
Epoch: 34899, Training Loss: 0.00778
Epoch: 34899, Training Loss: 0.00728
Epoch: 34899, Training Loss: 0.00889
Epoch: 34899, Training Loss: 0.00689
Epoch: 34900, Training Loss: 0.00778
Epoch: 34900, Training Loss: 0.00728
Epoch: 34900, Training Loss: 0.00889
Epoch: 34900, Training Loss: 0.00689
Epoch: 34901, Training Loss: 0.00778
Epoch: 34901, Training Loss: 0.00728
Epoch: 34901, Training Loss: 0.00889
Epoch: 34901, Training Loss: 0.00689
Epoch: 34902, Training Loss: 0.00778
Epoch: 34902, Training Loss: 0.00728
Epoch: 34902, Training Loss: 0.00889
Epoch: 34902, Training Loss: 0.00689
Epoch: 34903, Training Loss: 0.00778
Epoch: 34903, Training Loss: 0.00728
Epoch: 34903, Training Loss: 0.00889
Epoch: 34903, Training Loss: 0.00689
Epoch: 34904, Training Loss: 0.00778
Epoch: 34904, Training Loss: 0.00728
Epoch: 34904, Training Loss: 0.00889
Epoch: 34904, Training Loss: 0.00689
Epoch: 34905, Training Loss: 0.00778
Epoch: 34905, Training Loss: 0.00728
Epoch: 34905, Training Loss: 0.00889
Epoch: 34905, Training Loss: 0.00689
Epoch: 34906, Training Loss: 0.00778
Epoch: 34906, Training Loss: 0.00728
Epoch: 34906, Training Loss: 0.00889
Epoch: 34906, Training Loss: 0.00689
Epoch: 34907, Training Loss: 0.00778
Epoch: 34907, Training Loss: 0.00728
Epoch: 34907, Training Loss: 0.00889
Epoch: 34907, Training Loss: 0.00689
Epoch: 34908, Training Loss: 0.00778
Epoch: 34908, Training Loss: 0.00728
Epoch: 34908, Training Loss: 0.00889
Epoch: 34908, Training Loss: 0.00689
Epoch: 34909, Training Loss: 0.00778
Epoch: 34909, Training Loss: 0.00728
Epoch: 34909, Training Loss: 0.00889
Epoch: 34909, Training Loss: 0.00689
Epoch: 34910, Training Loss: 0.00778
Epoch: 34910, Training Loss: 0.00728
Epoch: 34910, Training Loss: 0.00889
Epoch: 34910, Training Loss: 0.00689
Epoch: 34911, Training Loss: 0.00778
Epoch: 34911, Training Loss: 0.00728
Epoch: 34911, Training Loss: 0.00889
Epoch: 34911, Training Loss: 0.00689
Epoch: 34912, Training Loss: 0.00778
Epoch: 34912, Training Loss: 0.00728
Epoch: 34912, Training Loss: 0.00889
Epoch: 34912, Training Loss: 0.00689
Epoch: 34913, Training Loss: 0.00778
Epoch: 34913, Training Loss: 0.00728
Epoch: 34913, Training Loss: 0.00889
Epoch: 34913, Training Loss: 0.00689
Epoch: 34914, Training Loss: 0.00778
Epoch: 34914, Training Loss: 0.00728
Epoch: 34914, Training Loss: 0.00889
Epoch: 34914, Training Loss: 0.00689
Epoch: 34915, Training Loss: 0.00778
Epoch: 34915, Training Loss: 0.00728
Epoch: 34915, Training Loss: 0.00889
Epoch: 34915, Training Loss: 0.00689
Epoch: 34916, Training Loss: 0.00778
Epoch: 34916, Training Loss: 0.00728
Epoch: 34916, Training Loss: 0.00889
Epoch: 34916, Training Loss: 0.00689
Epoch: 34917, Training Loss: 0.00778
Epoch: 34917, Training Loss: 0.00728
Epoch: 34917, Training Loss: 0.00889
Epoch: 34917, Training Loss: 0.00689
Epoch: 34918, Training Loss: 0.00778
Epoch: 34918, Training Loss: 0.00728
Epoch: 34918, Training Loss: 0.00889
Epoch: 34918, Training Loss: 0.00689
Epoch: 34919, Training Loss: 0.00778
Epoch: 34919, Training Loss: 0.00728
Epoch: 34919, Training Loss: 0.00889
Epoch: 34919, Training Loss: 0.00689
Epoch: 34920, Training Loss: 0.00778
Epoch: 34920, Training Loss: 0.00728
Epoch: 34920, Training Loss: 0.00889
Epoch: 34920, Training Loss: 0.00689
Epoch: 34921, Training Loss: 0.00778
Epoch: 34921, Training Loss: 0.00728
Epoch: 34921, Training Loss: 0.00889
Epoch: 34921, Training Loss: 0.00689
Epoch: 34922, Training Loss: 0.00778
Epoch: 34922, Training Loss: 0.00728
Epoch: 34922, Training Loss: 0.00889
Epoch: 34922, Training Loss: 0.00689
Epoch: 34923, Training Loss: 0.00778
Epoch: 34923, Training Loss: 0.00728
Epoch: 34923, Training Loss: 0.00889
Epoch: 34923, Training Loss: 0.00689
Epoch: 34924, Training Loss: 0.00778
Epoch: 34924, Training Loss: 0.00728
Epoch: 34924, Training Loss: 0.00889
Epoch: 34924, Training Loss: 0.00689
Epoch: 34925, Training Loss: 0.00778
Epoch: 34925, Training Loss: 0.00728
Epoch: 34925, Training Loss: 0.00889
Epoch: 34925, Training Loss: 0.00689
Epoch: 34926, Training Loss: 0.00778
Epoch: 34926, Training Loss: 0.00728
Epoch: 34926, Training Loss: 0.00889
Epoch: 34926, Training Loss: 0.00689
Epoch: 34927, Training Loss: 0.00778
Epoch: 34927, Training Loss: 0.00728
Epoch: 34927, Training Loss: 0.00889
Epoch: 34927, Training Loss: 0.00689
Epoch: 34928, Training Loss: 0.00778
Epoch: 34928, Training Loss: 0.00727
Epoch: 34928, Training Loss: 0.00889
Epoch: 34928, Training Loss: 0.00689
Epoch: 34929, Training Loss: 0.00778
Epoch: 34929, Training Loss: 0.00727
Epoch: 34929, Training Loss: 0.00889
Epoch: 34929, Training Loss: 0.00689
Epoch: 34930, Training Loss: 0.00778
Epoch: 34930, Training Loss: 0.00727
Epoch: 34930, Training Loss: 0.00889
Epoch: 34930, Training Loss: 0.00689
Epoch: 34931, Training Loss: 0.00778
Epoch: 34931, Training Loss: 0.00727
Epoch: 34931, Training Loss: 0.00889
Epoch: 34931, Training Loss: 0.00689
Epoch: 34932, Training Loss: 0.00778
Epoch: 34932, Training Loss: 0.00727
Epoch: 34932, Training Loss: 0.00889
Epoch: 34932, Training Loss: 0.00689
Epoch: 34933, Training Loss: 0.00778
Epoch: 34933, Training Loss: 0.00727
Epoch: 34933, Training Loss: 0.00889
Epoch: 34933, Training Loss: 0.00689
Epoch: 34934, Training Loss: 0.00778
Epoch: 34934, Training Loss: 0.00727
Epoch: 34934, Training Loss: 0.00889
Epoch: 34934, Training Loss: 0.00689
Epoch: 34935, Training Loss: 0.00778
Epoch: 34935, Training Loss: 0.00727
Epoch: 34935, Training Loss: 0.00889
Epoch: 34935, Training Loss: 0.00689
Epoch: 34936, Training Loss: 0.00778
Epoch: 34936, Training Loss: 0.00727
Epoch: 34936, Training Loss: 0.00889
Epoch: 34936, Training Loss: 0.00689
Epoch: 34937, Training Loss: 0.00778
Epoch: 34937, Training Loss: 0.00727
Epoch: 34937, Training Loss: 0.00889
Epoch: 34937, Training Loss: 0.00689
Epoch: 34938, Training Loss: 0.00778
Epoch: 34938, Training Loss: 0.00727
Epoch: 34938, Training Loss: 0.00889
Epoch: 34938, Training Loss: 0.00689
Epoch: 34939, Training Loss: 0.00778
Epoch: 34939, Training Loss: 0.00727
Epoch: 34939, Training Loss: 0.00889
Epoch: 34939, Training Loss: 0.00689
Epoch: 34940, Training Loss: 0.00778
Epoch: 34940, Training Loss: 0.00727
Epoch: 34940, Training Loss: 0.00889
Epoch: 34940, Training Loss: 0.00689
Epoch: 34941, Training Loss: 0.00778
Epoch: 34941, Training Loss: 0.00727
Epoch: 34941, Training Loss: 0.00889
Epoch: 34941, Training Loss: 0.00689
Epoch: 34942, Training Loss: 0.00778
Epoch: 34942, Training Loss: 0.00727
Epoch: 34942, Training Loss: 0.00889
Epoch: 34942, Training Loss: 0.00689
Epoch: 34943, Training Loss: 0.00778
Epoch: 34943, Training Loss: 0.00727
Epoch: 34943, Training Loss: 0.00889
Epoch: 34943, Training Loss: 0.00689
Epoch: 34944, Training Loss: 0.00778
Epoch: 34944, Training Loss: 0.00727
Epoch: 34944, Training Loss: 0.00889
Epoch: 34944, Training Loss: 0.00689
Epoch: 34945, Training Loss: 0.00778
Epoch: 34945, Training Loss: 0.00727
Epoch: 34945, Training Loss: 0.00889
Epoch: 34945, Training Loss: 0.00689
Epoch: 34946, Training Loss: 0.00778
Epoch: 34946, Training Loss: 0.00727
Epoch: 34946, Training Loss: 0.00889
Epoch: 34946, Training Loss: 0.00689
Epoch: 34947, Training Loss: 0.00778
Epoch: 34947, Training Loss: 0.00727
Epoch: 34947, Training Loss: 0.00889
Epoch: 34947, Training Loss: 0.00689
Epoch: 34948, Training Loss: 0.00778
Epoch: 34948, Training Loss: 0.00727
Epoch: 34948, Training Loss: 0.00889
Epoch: 34948, Training Loss: 0.00689
Epoch: 34949, Training Loss: 0.00778
Epoch: 34949, Training Loss: 0.00727
Epoch: 34949, Training Loss: 0.00889
Epoch: 34949, Training Loss: 0.00689
Epoch: 34950, Training Loss: 0.00778
Epoch: 34950, Training Loss: 0.00727
Epoch: 34950, Training Loss: 0.00889
Epoch: 34950, Training Loss: 0.00689
Epoch: 34951, Training Loss: 0.00778
Epoch: 34951, Training Loss: 0.00727
Epoch: 34951, Training Loss: 0.00889
Epoch: 34951, Training Loss: 0.00689
Epoch: 34952, Training Loss: 0.00778
Epoch: 34952, Training Loss: 0.00727
Epoch: 34952, Training Loss: 0.00889
Epoch: 34952, Training Loss: 0.00689
Epoch: 34953, Training Loss: 0.00778
Epoch: 34953, Training Loss: 0.00727
Epoch: 34953, Training Loss: 0.00889
Epoch: 34953, Training Loss: 0.00689
Epoch: 34954, Training Loss: 0.00778
Epoch: 34954, Training Loss: 0.00727
Epoch: 34954, Training Loss: 0.00889
Epoch: 34954, Training Loss: 0.00689
Epoch: 34955, Training Loss: 0.00778
Epoch: 34955, Training Loss: 0.00727
Epoch: 34955, Training Loss: 0.00888
Epoch: 34955, Training Loss: 0.00689
Epoch: 34956, Training Loss: 0.00778
Epoch: 34956, Training Loss: 0.00727
Epoch: 34956, Training Loss: 0.00888
Epoch: 34956, Training Loss: 0.00689
Epoch: 34957, Training Loss: 0.00778
Epoch: 34957, Training Loss: 0.00727
Epoch: 34957, Training Loss: 0.00888
Epoch: 34957, Training Loss: 0.00689
Epoch: 34958, Training Loss: 0.00778
Epoch: 34958, Training Loss: 0.00727
Epoch: 34958, Training Loss: 0.00888
Epoch: 34958, Training Loss: 0.00689
Epoch: 34959, Training Loss: 0.00778
Epoch: 34959, Training Loss: 0.00727
Epoch: 34959, Training Loss: 0.00888
Epoch: 34959, Training Loss: 0.00689
Epoch: 34960, Training Loss: 0.00778
Epoch: 34960, Training Loss: 0.00727
Epoch: 34960, Training Loss: 0.00888
Epoch: 34960, Training Loss: 0.00688
Epoch: 34961, Training Loss: 0.00778
Epoch: 34961, Training Loss: 0.00727
Epoch: 34961, Training Loss: 0.00888
Epoch: 34961, Training Loss: 0.00688
Epoch: 34962, Training Loss: 0.00778
Epoch: 34962, Training Loss: 0.00727
Epoch: 34962, Training Loss: 0.00888
Epoch: 34962, Training Loss: 0.00688
Epoch: 34963, Training Loss: 0.00778
Epoch: 34963, Training Loss: 0.00727
Epoch: 34963, Training Loss: 0.00888
Epoch: 34963, Training Loss: 0.00688
Epoch: 34964, Training Loss: 0.00778
Epoch: 34964, Training Loss: 0.00727
Epoch: 34964, Training Loss: 0.00888
Epoch: 34964, Training Loss: 0.00688
Epoch: 34965, Training Loss: 0.00778
Epoch: 34965, Training Loss: 0.00727
Epoch: 34965, Training Loss: 0.00888
Epoch: 34965, Training Loss: 0.00688
Epoch: 34966, Training Loss: 0.00778
Epoch: 34966, Training Loss: 0.00727
Epoch: 34966, Training Loss: 0.00888
Epoch: 34966, Training Loss: 0.00688
Epoch: 34967, Training Loss: 0.00778
Epoch: 34967, Training Loss: 0.00727
Epoch: 34967, Training Loss: 0.00888
Epoch: 34967, Training Loss: 0.00688
Epoch: 34968, Training Loss: 0.00778
Epoch: 34968, Training Loss: 0.00727
Epoch: 34968, Training Loss: 0.00888
Epoch: 34968, Training Loss: 0.00688
Epoch: 34969, Training Loss: 0.00778
Epoch: 34969, Training Loss: 0.00727
Epoch: 34969, Training Loss: 0.00888
Epoch: 34969, Training Loss: 0.00688
Epoch: 34970, Training Loss: 0.00778
Epoch: 34970, Training Loss: 0.00727
Epoch: 34970, Training Loss: 0.00888
Epoch: 34970, Training Loss: 0.00688
Epoch: 34971, Training Loss: 0.00778
Epoch: 34971, Training Loss: 0.00727
Epoch: 34971, Training Loss: 0.00888
Epoch: 34971, Training Loss: 0.00688
Epoch: 34972, Training Loss: 0.00778
Epoch: 34972, Training Loss: 0.00727
Epoch: 34972, Training Loss: 0.00888
Epoch: 34972, Training Loss: 0.00688
Epoch: 34973, Training Loss: 0.00778
Epoch: 34973, Training Loss: 0.00727
Epoch: 34973, Training Loss: 0.00888
Epoch: 34973, Training Loss: 0.00688
Epoch: 34974, Training Loss: 0.00778
Epoch: 34974, Training Loss: 0.00727
Epoch: 34974, Training Loss: 0.00888
Epoch: 34974, Training Loss: 0.00688
Epoch: 34975, Training Loss: 0.00778
Epoch: 34975, Training Loss: 0.00727
Epoch: 34975, Training Loss: 0.00888
Epoch: 34975, Training Loss: 0.00688
Epoch: 34976, Training Loss: 0.00778
Epoch: 34976, Training Loss: 0.00727
Epoch: 34976, Training Loss: 0.00888
Epoch: 34976, Training Loss: 0.00688
Epoch: 34977, Training Loss: 0.00778
Epoch: 34977, Training Loss: 0.00727
Epoch: 34977, Training Loss: 0.00888
Epoch: 34977, Training Loss: 0.00688
Epoch: 34978, Training Loss: 0.00778
Epoch: 34978, Training Loss: 0.00727
Epoch: 34978, Training Loss: 0.00888
Epoch: 34978, Training Loss: 0.00688
Epoch: 34979, Training Loss: 0.00778
Epoch: 34979, Training Loss: 0.00727
Epoch: 34979, Training Loss: 0.00888
Epoch: 34979, Training Loss: 0.00688
Epoch: 34980, Training Loss: 0.00778
Epoch: 34980, Training Loss: 0.00727
Epoch: 34980, Training Loss: 0.00888
Epoch: 34980, Training Loss: 0.00688
Epoch: 34981, Training Loss: 0.00777
Epoch: 34981, Training Loss: 0.00727
Epoch: 34981, Training Loss: 0.00888
Epoch: 34981, Training Loss: 0.00688
Epoch: 34982, Training Loss: 0.00777
Epoch: 34982, Training Loss: 0.00727
Epoch: 34982, Training Loss: 0.00888
Epoch: 34982, Training Loss: 0.00688
Epoch: 34983, Training Loss: 0.00777
Epoch: 34983, Training Loss: 0.00727
Epoch: 34983, Training Loss: 0.00888
Epoch: 34983, Training Loss: 0.00688
Epoch: 34984, Training Loss: 0.00777
Epoch: 34984, Training Loss: 0.00727
Epoch: 34984, Training Loss: 0.00888
Epoch: 34984, Training Loss: 0.00688
Epoch: 34985, Training Loss: 0.00777
Epoch: 34985, Training Loss: 0.00727
Epoch: 34985, Training Loss: 0.00888
Epoch: 34985, Training Loss: 0.00688
Epoch: 34986, Training Loss: 0.00777
Epoch: 34986, Training Loss: 0.00727
Epoch: 34986, Training Loss: 0.00888
Epoch: 34986, Training Loss: 0.00688
Epoch: 34987, Training Loss: 0.00777
Epoch: 34987, Training Loss: 0.00727
Epoch: 34987, Training Loss: 0.00888
Epoch: 34987, Training Loss: 0.00688
Epoch: 34988, Training Loss: 0.00777
Epoch: 34988, Training Loss: 0.00727
Epoch: 34988, Training Loss: 0.00888
Epoch: 34988, Training Loss: 0.00688
Epoch: 34989, Training Loss: 0.00777
Epoch: 34989, Training Loss: 0.00727
Epoch: 34989, Training Loss: 0.00888
Epoch: 34989, Training Loss: 0.00688
Epoch: 34990, Training Loss: 0.00777
Epoch: 34990, Training Loss: 0.00727
Epoch: 34990, Training Loss: 0.00888
Epoch: 34990, Training Loss: 0.00688
Epoch: 34991, Training Loss: 0.00777
Epoch: 34991, Training Loss: 0.00727
Epoch: 34991, Training Loss: 0.00888
Epoch: 34991, Training Loss: 0.00688
Epoch: 34992, Training Loss: 0.00777
Epoch: 34992, Training Loss: 0.00727
Epoch: 34992, Training Loss: 0.00888
Epoch: 34992, Training Loss: 0.00688
Epoch: 34993, Training Loss: 0.00777
Epoch: 34993, Training Loss: 0.00727
Epoch: 34993, Training Loss: 0.00888
Epoch: 34993, Training Loss: 0.00688
Epoch: 34994, Training Loss: 0.00777
Epoch: 34994, Training Loss: 0.00727
Epoch: 34994, Training Loss: 0.00888
Epoch: 34994, Training Loss: 0.00688
Epoch: 34995, Training Loss: 0.00777
Epoch: 34995, Training Loss: 0.00727
Epoch: 34995, Training Loss: 0.00888
Epoch: 34995, Training Loss: 0.00688
Epoch: 34996, Training Loss: 0.00777
Epoch: 34996, Training Loss: 0.00727
Epoch: 34996, Training Loss: 0.00888
Epoch: 34996, Training Loss: 0.00688
Epoch: 34997, Training Loss: 0.00777
Epoch: 34997, Training Loss: 0.00727
Epoch: 34997, Training Loss: 0.00888
Epoch: 34997, Training Loss: 0.00688
Epoch: 34998, Training Loss: 0.00777
Epoch: 34998, Training Loss: 0.00727
Epoch: 34998, Training Loss: 0.00888
Epoch: 34998, Training Loss: 0.00688
Epoch: 34999, Training Loss: 0.00777
Epoch: 34999, Training Loss: 0.00727
Epoch: 34999, Training Loss: 0.00888
Epoch: 34999, Training Loss: 0.00688
Epoch: 35000, Training Loss: 0.00777
Epoch: 35000, Training Loss: 0.00727
Epoch: 35000, Training Loss: 0.00888
Epoch: 35000, Training Loss: 0.00688
Epoch: 35001, Training Loss: 0.00777
Epoch: 35001, Training Loss: 0.00727
Epoch: 35001, Training Loss: 0.00888
Epoch: 35001, Training Loss: 0.00688
Epoch: 35002, Training Loss: 0.00777
Epoch: 35002, Training Loss: 0.00727
Epoch: 35002, Training Loss: 0.00888
Epoch: 35002, Training Loss: 0.00688
Epoch: 35003, Training Loss: 0.00777
Epoch: 35003, Training Loss: 0.00727
Epoch: 35003, Training Loss: 0.00888
Epoch: 35003, Training Loss: 0.00688
Epoch: 35004, Training Loss: 0.00777
Epoch: 35004, Training Loss: 0.00727
Epoch: 35004, Training Loss: 0.00888
Epoch: 35004, Training Loss: 0.00688
Epoch: 35005, Training Loss: 0.00777
Epoch: 35005, Training Loss: 0.00727
Epoch: 35005, Training Loss: 0.00888
Epoch: 35005, Training Loss: 0.00688
Epoch: 35006, Training Loss: 0.00777
Epoch: 35006, Training Loss: 0.00727
Epoch: 35006, Training Loss: 0.00888
Epoch: 35006, Training Loss: 0.00688
Epoch: 35007, Training Loss: 0.00777
Epoch: 35007, Training Loss: 0.00727
Epoch: 35007, Training Loss: 0.00888
Epoch: 35007, Training Loss: 0.00688
Epoch: 35008, Training Loss: 0.00777
Epoch: 35008, Training Loss: 0.00727
Epoch: 35008, Training Loss: 0.00888
Epoch: 35008, Training Loss: 0.00688
Epoch: 35009, Training Loss: 0.00777
Epoch: 35009, Training Loss: 0.00727
Epoch: 35009, Training Loss: 0.00888
Epoch: 35009, Training Loss: 0.00688
Epoch: 35010, Training Loss: 0.00777
Epoch: 35010, Training Loss: 0.00727
Epoch: 35010, Training Loss: 0.00888
Epoch: 35010, Training Loss: 0.00688
Epoch: 35011, Training Loss: 0.00777
Epoch: 35011, Training Loss: 0.00727
Epoch: 35011, Training Loss: 0.00888
Epoch: 35011, Training Loss: 0.00688
Epoch: 35012, Training Loss: 0.00777
Epoch: 35012, Training Loss: 0.00727
Epoch: 35012, Training Loss: 0.00888
Epoch: 35012, Training Loss: 0.00688
Epoch: 35013, Training Loss: 0.00777
Epoch: 35013, Training Loss: 0.00727
Epoch: 35013, Training Loss: 0.00888
Epoch: 35013, Training Loss: 0.00688
Epoch: 35014, Training Loss: 0.00777
Epoch: 35014, Training Loss: 0.00727
Epoch: 35014, Training Loss: 0.00888
Epoch: 35014, Training Loss: 0.00688
Epoch: 35015, Training Loss: 0.00777
Epoch: 35015, Training Loss: 0.00727
Epoch: 35015, Training Loss: 0.00888
Epoch: 35015, Training Loss: 0.00688
Epoch: 35016, Training Loss: 0.00777
Epoch: 35016, Training Loss: 0.00727
Epoch: 35016, Training Loss: 0.00888
Epoch: 35016, Training Loss: 0.00688
Epoch: 35017, Training Loss: 0.00777
Epoch: 35017, Training Loss: 0.00727
Epoch: 35017, Training Loss: 0.00888
Epoch: 35017, Training Loss: 0.00688
Epoch: 35018, Training Loss: 0.00777
Epoch: 35018, Training Loss: 0.00727
Epoch: 35018, Training Loss: 0.00888
Epoch: 35018, Training Loss: 0.00688
Epoch: 35019, Training Loss: 0.00777
Epoch: 35019, Training Loss: 0.00727
Epoch: 35019, Training Loss: 0.00888
Epoch: 35019, Training Loss: 0.00688
Epoch: 35020, Training Loss: 0.00777
Epoch: 35020, Training Loss: 0.00726
Epoch: 35020, Training Loss: 0.00888
Epoch: 35020, Training Loss: 0.00688
Epoch: 35021, Training Loss: 0.00777
Epoch: 35021, Training Loss: 0.00726
Epoch: 35021, Training Loss: 0.00888
Epoch: 35021, Training Loss: 0.00688
Epoch: 35022, Training Loss: 0.00777
Epoch: 35022, Training Loss: 0.00726
Epoch: 35022, Training Loss: 0.00888
Epoch: 35022, Training Loss: 0.00688
Epoch: 35023, Training Loss: 0.00777
Epoch: 35023, Training Loss: 0.00726
Epoch: 35023, Training Loss: 0.00888
Epoch: 35023, Training Loss: 0.00688
Epoch: 35024, Training Loss: 0.00777
Epoch: 35024, Training Loss: 0.00726
Epoch: 35024, Training Loss: 0.00888
Epoch: 35024, Training Loss: 0.00688
Epoch: 35025, Training Loss: 0.00777
Epoch: 35025, Training Loss: 0.00726
Epoch: 35025, Training Loss: 0.00888
Epoch: 35025, Training Loss: 0.00688
Epoch: 35026, Training Loss: 0.00777
Epoch: 35026, Training Loss: 0.00726
Epoch: 35026, Training Loss: 0.00888
Epoch: 35026, Training Loss: 0.00688
Epoch: 35027, Training Loss: 0.00777
Epoch: 35027, Training Loss: 0.00726
Epoch: 35027, Training Loss: 0.00888
Epoch: 35027, Training Loss: 0.00688
Epoch: 35028, Training Loss: 0.00777
Epoch: 35028, Training Loss: 0.00726
Epoch: 35028, Training Loss: 0.00888
Epoch: 35028, Training Loss: 0.00688
Epoch: 35029, Training Loss: 0.00777
Epoch: 35029, Training Loss: 0.00726
Epoch: 35029, Training Loss: 0.00887
Epoch: 35029, Training Loss: 0.00688
Epoch: 35030, Training Loss: 0.00777
Epoch: 35030, Training Loss: 0.00726
Epoch: 35030, Training Loss: 0.00887
Epoch: 35030, Training Loss: 0.00688
Epoch: 35031, Training Loss: 0.00777
Epoch: 35031, Training Loss: 0.00726
Epoch: 35031, Training Loss: 0.00887
Epoch: 35031, Training Loss: 0.00688
Epoch: 35032, Training Loss: 0.00777
Epoch: 35032, Training Loss: 0.00726
Epoch: 35032, Training Loss: 0.00887
Epoch: 35032, Training Loss: 0.00688
Epoch: 35033, Training Loss: 0.00777
Epoch: 35033, Training Loss: 0.00726
Epoch: 35033, Training Loss: 0.00887
Epoch: 35033, Training Loss: 0.00688
Epoch: 35034, Training Loss: 0.00777
Epoch: 35034, Training Loss: 0.00726
Epoch: 35034, Training Loss: 0.00887
Epoch: 35034, Training Loss: 0.00688
Epoch: 35035, Training Loss: 0.00777
Epoch: 35035, Training Loss: 0.00726
Epoch: 35035, Training Loss: 0.00887
Epoch: 35035, Training Loss: 0.00688
Epoch: 35036, Training Loss: 0.00777
Epoch: 35036, Training Loss: 0.00726
Epoch: 35036, Training Loss: 0.00887
Epoch: 35036, Training Loss: 0.00688
Epoch: 35037, Training Loss: 0.00777
Epoch: 35037, Training Loss: 0.00726
Epoch: 35037, Training Loss: 0.00887
Epoch: 35037, Training Loss: 0.00688
Epoch: 35038, Training Loss: 0.00777
Epoch: 35038, Training Loss: 0.00726
Epoch: 35038, Training Loss: 0.00887
Epoch: 35038, Training Loss: 0.00688
Epoch: 35039, Training Loss: 0.00777
Epoch: 35039, Training Loss: 0.00726
Epoch: 35039, Training Loss: 0.00887
Epoch: 35039, Training Loss: 0.00688
Epoch: 35040, Training Loss: 0.00777
Epoch: 35040, Training Loss: 0.00726
Epoch: 35040, Training Loss: 0.00887
Epoch: 35040, Training Loss: 0.00688
Epoch: 35041, Training Loss: 0.00777
Epoch: 35041, Training Loss: 0.00726
Epoch: 35041, Training Loss: 0.00887
Epoch: 35041, Training Loss: 0.00688
Epoch: 35042, Training Loss: 0.00777
Epoch: 35042, Training Loss: 0.00726
Epoch: 35042, Training Loss: 0.00887
Epoch: 35042, Training Loss: 0.00688
Epoch: 35043, Training Loss: 0.00777
Epoch: 35043, Training Loss: 0.00726
Epoch: 35043, Training Loss: 0.00887
Epoch: 35043, Training Loss: 0.00688
Epoch: 35044, Training Loss: 0.00777
Epoch: 35044, Training Loss: 0.00726
Epoch: 35044, Training Loss: 0.00887
Epoch: 35044, Training Loss: 0.00688
Epoch: 35045, Training Loss: 0.00777
Epoch: 35045, Training Loss: 0.00726
Epoch: 35045, Training Loss: 0.00887
Epoch: 35045, Training Loss: 0.00688
Epoch: 35046, Training Loss: 0.00777
Epoch: 35046, Training Loss: 0.00726
Epoch: 35046, Training Loss: 0.00887
Epoch: 35046, Training Loss: 0.00688
Epoch: 35047, Training Loss: 0.00777
Epoch: 35047, Training Loss: 0.00726
Epoch: 35047, Training Loss: 0.00887
Epoch: 35047, Training Loss: 0.00688
Epoch: 35048, Training Loss: 0.00777
Epoch: 35048, Training Loss: 0.00726
Epoch: 35048, Training Loss: 0.00887
Epoch: 35048, Training Loss: 0.00688
Epoch: 35049, Training Loss: 0.00777
Epoch: 35049, Training Loss: 0.00726
Epoch: 35049, Training Loss: 0.00887
Epoch: 35049, Training Loss: 0.00688
Epoch: 35050, Training Loss: 0.00777
Epoch: 35050, Training Loss: 0.00726
Epoch: 35050, Training Loss: 0.00887
Epoch: 35050, Training Loss: 0.00688
Epoch: 35051, Training Loss: 0.00777
Epoch: 35051, Training Loss: 0.00726
Epoch: 35051, Training Loss: 0.00887
Epoch: 35051, Training Loss: 0.00688
Epoch: 35052, Training Loss: 0.00777
Epoch: 35052, Training Loss: 0.00726
Epoch: 35052, Training Loss: 0.00887
Epoch: 35052, Training Loss: 0.00688
Epoch: 35053, Training Loss: 0.00777
Epoch: 35053, Training Loss: 0.00726
Epoch: 35053, Training Loss: 0.00887
Epoch: 35053, Training Loss: 0.00688
Epoch: 35054, Training Loss: 0.00777
Epoch: 35054, Training Loss: 0.00726
Epoch: 35054, Training Loss: 0.00887
Epoch: 35054, Training Loss: 0.00688
Epoch: 35055, Training Loss: 0.00777
Epoch: 35055, Training Loss: 0.00726
Epoch: 35055, Training Loss: 0.00887
Epoch: 35055, Training Loss: 0.00688
Epoch: 35056, Training Loss: 0.00777
Epoch: 35056, Training Loss: 0.00726
Epoch: 35056, Training Loss: 0.00887
Epoch: 35056, Training Loss: 0.00688
Epoch: 35057, Training Loss: 0.00777
Epoch: 35057, Training Loss: 0.00726
Epoch: 35057, Training Loss: 0.00887
Epoch: 35057, Training Loss: 0.00687
Epoch: 35058, Training Loss: 0.00777
Epoch: 35058, Training Loss: 0.00726
Epoch: 35058, Training Loss: 0.00887
Epoch: 35058, Training Loss: 0.00687
Epoch: 35059, Training Loss: 0.00777
Epoch: 35059, Training Loss: 0.00726
Epoch: 35059, Training Loss: 0.00887
Epoch: 35059, Training Loss: 0.00687
Epoch: 35060, Training Loss: 0.00777
Epoch: 35060, Training Loss: 0.00726
Epoch: 35060, Training Loss: 0.00887
Epoch: 35060, Training Loss: 0.00687
Epoch: 35061, Training Loss: 0.00777
Epoch: 35061, Training Loss: 0.00726
Epoch: 35061, Training Loss: 0.00887
Epoch: 35061, Training Loss: 0.00687
Epoch: 35062, Training Loss: 0.00777
Epoch: 35062, Training Loss: 0.00726
Epoch: 35062, Training Loss: 0.00887
Epoch: 35062, Training Loss: 0.00687
Epoch: 35063, Training Loss: 0.00777
Epoch: 35063, Training Loss: 0.00726
Epoch: 35063, Training Loss: 0.00887
Epoch: 35063, Training Loss: 0.00687
Epoch: 35064, Training Loss: 0.00777
Epoch: 35064, Training Loss: 0.00726
Epoch: 35064, Training Loss: 0.00887
Epoch: 35064, Training Loss: 0.00687
Epoch: 35065, Training Loss: 0.00777
Epoch: 35065, Training Loss: 0.00726
Epoch: 35065, Training Loss: 0.00887
Epoch: 35065, Training Loss: 0.00687
Epoch: 35066, Training Loss: 0.00776
Epoch: 35066, Training Loss: 0.00726
Epoch: 35066, Training Loss: 0.00887
Epoch: 35066, Training Loss: 0.00687
Epoch: 35067, Training Loss: 0.00776
Epoch: 35067, Training Loss: 0.00726
Epoch: 35067, Training Loss: 0.00887
Epoch: 35067, Training Loss: 0.00687
Epoch: 35068, Training Loss: 0.00776
Epoch: 35068, Training Loss: 0.00726
Epoch: 35068, Training Loss: 0.00887
Epoch: 35068, Training Loss: 0.00687
Epoch: 35069, Training Loss: 0.00776
Epoch: 35069, Training Loss: 0.00726
Epoch: 35069, Training Loss: 0.00887
Epoch: 35069, Training Loss: 0.00687
Epoch: 35070, Training Loss: 0.00776
Epoch: 35070, Training Loss: 0.00726
Epoch: 35070, Training Loss: 0.00887
Epoch: 35070, Training Loss: 0.00687
Epoch: 35071, Training Loss: 0.00776
Epoch: 35071, Training Loss: 0.00726
Epoch: 35071, Training Loss: 0.00887
Epoch: 35071, Training Loss: 0.00687
Epoch: 35072, Training Loss: 0.00776
Epoch: 35072, Training Loss: 0.00726
Epoch: 35072, Training Loss: 0.00887
Epoch: 35072, Training Loss: 0.00687
Epoch: 35073, Training Loss: 0.00776
Epoch: 35073, Training Loss: 0.00726
Epoch: 35073, Training Loss: 0.00887
Epoch: 35073, Training Loss: 0.00687
... [four per-sample losses printed every epoch; near-identical lines for epochs 35074-35315 omitted, losses decreasing only in the fifth decimal place] ...
Epoch: 35316, Training Loss: 0.00774
Epoch: 35316, Training Loss: 0.00723
Epoch: 35316, Training Loss: 0.00884
Epoch: 35316, Training Loss: 0.00685
Epoch: 35317, Training Loss: 0.00774
Epoch: 35317, Training Loss: 0.00723
Epoch: 35317, Training Loss: 0.00884
Epoch: 35317, Training Loss: 0.00685
Epoch: 35318, Training Loss: 0.00774
Epoch: 35318, Training Loss: 0.00723
Epoch: 35318, Training Loss: 0.00884
Epoch: 35318, Training Loss: 0.00685
Epoch: 35319, Training Loss: 0.00774
Epoch: 35319, Training Loss: 0.00723
Epoch: 35319, Training Loss: 0.00884
Epoch: 35319, Training Loss: 0.00685
Epoch: 35320, Training Loss: 0.00774
Epoch: 35320, Training Loss: 0.00723
Epoch: 35320, Training Loss: 0.00884
Epoch: 35320, Training Loss: 0.00685
Epoch: 35321, Training Loss: 0.00774
Epoch: 35321, Training Loss: 0.00723
Epoch: 35321, Training Loss: 0.00884
Epoch: 35321, Training Loss: 0.00685
Epoch: 35322, Training Loss: 0.00774
Epoch: 35322, Training Loss: 0.00723
Epoch: 35322, Training Loss: 0.00884
Epoch: 35322, Training Loss: 0.00685
Epoch: 35323, Training Loss: 0.00774
Epoch: 35323, Training Loss: 0.00723
Epoch: 35323, Training Loss: 0.00884
Epoch: 35323, Training Loss: 0.00685
Epoch: 35324, Training Loss: 0.00773
Epoch: 35324, Training Loss: 0.00723
Epoch: 35324, Training Loss: 0.00884
Epoch: 35324, Training Loss: 0.00685
Epoch: 35325, Training Loss: 0.00773
Epoch: 35325, Training Loss: 0.00723
Epoch: 35325, Training Loss: 0.00884
Epoch: 35325, Training Loss: 0.00685
Epoch: 35326, Training Loss: 0.00773
Epoch: 35326, Training Loss: 0.00723
Epoch: 35326, Training Loss: 0.00884
Epoch: 35326, Training Loss: 0.00685
Epoch: 35327, Training Loss: 0.00773
Epoch: 35327, Training Loss: 0.00723
Epoch: 35327, Training Loss: 0.00884
Epoch: 35327, Training Loss: 0.00685
Epoch: 35328, Training Loss: 0.00773
Epoch: 35328, Training Loss: 0.00723
Epoch: 35328, Training Loss: 0.00884
Epoch: 35328, Training Loss: 0.00685
Epoch: 35329, Training Loss: 0.00773
Epoch: 35329, Training Loss: 0.00723
Epoch: 35329, Training Loss: 0.00884
Epoch: 35329, Training Loss: 0.00685
Epoch: 35330, Training Loss: 0.00773
Epoch: 35330, Training Loss: 0.00723
Epoch: 35330, Training Loss: 0.00884
Epoch: 35330, Training Loss: 0.00685
Epoch: 35331, Training Loss: 0.00773
Epoch: 35331, Training Loss: 0.00723
Epoch: 35331, Training Loss: 0.00883
Epoch: 35331, Training Loss: 0.00685
Epoch: 35332, Training Loss: 0.00773
Epoch: 35332, Training Loss: 0.00723
Epoch: 35332, Training Loss: 0.00883
Epoch: 35332, Training Loss: 0.00685
Epoch: 35333, Training Loss: 0.00773
Epoch: 35333, Training Loss: 0.00723
Epoch: 35333, Training Loss: 0.00883
Epoch: 35333, Training Loss: 0.00685
Epoch: 35334, Training Loss: 0.00773
Epoch: 35334, Training Loss: 0.00723
Epoch: 35334, Training Loss: 0.00883
Epoch: 35334, Training Loss: 0.00685
Epoch: 35335, Training Loss: 0.00773
Epoch: 35335, Training Loss: 0.00723
Epoch: 35335, Training Loss: 0.00883
Epoch: 35335, Training Loss: 0.00685
Epoch: 35336, Training Loss: 0.00773
Epoch: 35336, Training Loss: 0.00723
Epoch: 35336, Training Loss: 0.00883
Epoch: 35336, Training Loss: 0.00685
Epoch: 35337, Training Loss: 0.00773
Epoch: 35337, Training Loss: 0.00723
Epoch: 35337, Training Loss: 0.00883
Epoch: 35337, Training Loss: 0.00685
Epoch: 35338, Training Loss: 0.00773
Epoch: 35338, Training Loss: 0.00723
Epoch: 35338, Training Loss: 0.00883
Epoch: 35338, Training Loss: 0.00685
Epoch: 35339, Training Loss: 0.00773
Epoch: 35339, Training Loss: 0.00723
Epoch: 35339, Training Loss: 0.00883
Epoch: 35339, Training Loss: 0.00685
Epoch: 35340, Training Loss: 0.00773
Epoch: 35340, Training Loss: 0.00723
Epoch: 35340, Training Loss: 0.00883
Epoch: 35340, Training Loss: 0.00685
Epoch: 35341, Training Loss: 0.00773
Epoch: 35341, Training Loss: 0.00723
Epoch: 35341, Training Loss: 0.00883
Epoch: 35341, Training Loss: 0.00685
Epoch: 35342, Training Loss: 0.00773
Epoch: 35342, Training Loss: 0.00723
Epoch: 35342, Training Loss: 0.00883
Epoch: 35342, Training Loss: 0.00685
Epoch: 35343, Training Loss: 0.00773
Epoch: 35343, Training Loss: 0.00723
Epoch: 35343, Training Loss: 0.00883
Epoch: 35343, Training Loss: 0.00685
Epoch: 35344, Training Loss: 0.00773
Epoch: 35344, Training Loss: 0.00723
Epoch: 35344, Training Loss: 0.00883
Epoch: 35344, Training Loss: 0.00685
Epoch: 35345, Training Loss: 0.00773
Epoch: 35345, Training Loss: 0.00723
Epoch: 35345, Training Loss: 0.00883
Epoch: 35345, Training Loss: 0.00685
Epoch: 35346, Training Loss: 0.00773
Epoch: 35346, Training Loss: 0.00723
Epoch: 35346, Training Loss: 0.00883
Epoch: 35346, Training Loss: 0.00685
Epoch: 35347, Training Loss: 0.00773
Epoch: 35347, Training Loss: 0.00723
Epoch: 35347, Training Loss: 0.00883
Epoch: 35347, Training Loss: 0.00685
Epoch: 35348, Training Loss: 0.00773
Epoch: 35348, Training Loss: 0.00723
Epoch: 35348, Training Loss: 0.00883
Epoch: 35348, Training Loss: 0.00685
Epoch: 35349, Training Loss: 0.00773
Epoch: 35349, Training Loss: 0.00723
Epoch: 35349, Training Loss: 0.00883
Epoch: 35349, Training Loss: 0.00685
Epoch: 35350, Training Loss: 0.00773
Epoch: 35350, Training Loss: 0.00723
Epoch: 35350, Training Loss: 0.00883
Epoch: 35350, Training Loss: 0.00685
Epoch: 35351, Training Loss: 0.00773
Epoch: 35351, Training Loss: 0.00723
Epoch: 35351, Training Loss: 0.00883
Epoch: 35351, Training Loss: 0.00685
Epoch: 35352, Training Loss: 0.00773
Epoch: 35352, Training Loss: 0.00723
Epoch: 35352, Training Loss: 0.00883
Epoch: 35352, Training Loss: 0.00685
Epoch: 35353, Training Loss: 0.00773
Epoch: 35353, Training Loss: 0.00723
Epoch: 35353, Training Loss: 0.00883
Epoch: 35353, Training Loss: 0.00684
Epoch: 35354, Training Loss: 0.00773
Epoch: 35354, Training Loss: 0.00723
Epoch: 35354, Training Loss: 0.00883
Epoch: 35354, Training Loss: 0.00684
Epoch: 35355, Training Loss: 0.00773
Epoch: 35355, Training Loss: 0.00723
Epoch: 35355, Training Loss: 0.00883
Epoch: 35355, Training Loss: 0.00684
Epoch: 35356, Training Loss: 0.00773
Epoch: 35356, Training Loss: 0.00723
Epoch: 35356, Training Loss: 0.00883
Epoch: 35356, Training Loss: 0.00684
Epoch: 35357, Training Loss: 0.00773
Epoch: 35357, Training Loss: 0.00723
Epoch: 35357, Training Loss: 0.00883
Epoch: 35357, Training Loss: 0.00684
Epoch: 35358, Training Loss: 0.00773
Epoch: 35358, Training Loss: 0.00723
Epoch: 35358, Training Loss: 0.00883
Epoch: 35358, Training Loss: 0.00684
Epoch: 35359, Training Loss: 0.00773
Epoch: 35359, Training Loss: 0.00723
Epoch: 35359, Training Loss: 0.00883
Epoch: 35359, Training Loss: 0.00684
Epoch: 35360, Training Loss: 0.00773
Epoch: 35360, Training Loss: 0.00723
Epoch: 35360, Training Loss: 0.00883
Epoch: 35360, Training Loss: 0.00684
Epoch: 35361, Training Loss: 0.00773
Epoch: 35361, Training Loss: 0.00723
Epoch: 35361, Training Loss: 0.00883
Epoch: 35361, Training Loss: 0.00684
Epoch: 35362, Training Loss: 0.00773
Epoch: 35362, Training Loss: 0.00723
Epoch: 35362, Training Loss: 0.00883
Epoch: 35362, Training Loss: 0.00684
Epoch: 35363, Training Loss: 0.00773
Epoch: 35363, Training Loss: 0.00723
Epoch: 35363, Training Loss: 0.00883
Epoch: 35363, Training Loss: 0.00684
Epoch: 35364, Training Loss: 0.00773
Epoch: 35364, Training Loss: 0.00723
Epoch: 35364, Training Loss: 0.00883
Epoch: 35364, Training Loss: 0.00684
Epoch: 35365, Training Loss: 0.00773
Epoch: 35365, Training Loss: 0.00723
Epoch: 35365, Training Loss: 0.00883
Epoch: 35365, Training Loss: 0.00684
Epoch: 35366, Training Loss: 0.00773
Epoch: 35366, Training Loss: 0.00723
Epoch: 35366, Training Loss: 0.00883
Epoch: 35366, Training Loss: 0.00684
Epoch: 35367, Training Loss: 0.00773
Epoch: 35367, Training Loss: 0.00723
Epoch: 35367, Training Loss: 0.00883
Epoch: 35367, Training Loss: 0.00684
Epoch: 35368, Training Loss: 0.00773
Epoch: 35368, Training Loss: 0.00723
Epoch: 35368, Training Loss: 0.00883
Epoch: 35368, Training Loss: 0.00684
Epoch: 35369, Training Loss: 0.00773
Epoch: 35369, Training Loss: 0.00723
Epoch: 35369, Training Loss: 0.00883
Epoch: 35369, Training Loss: 0.00684
Epoch: 35370, Training Loss: 0.00773
Epoch: 35370, Training Loss: 0.00723
Epoch: 35370, Training Loss: 0.00883
Epoch: 35370, Training Loss: 0.00684
Epoch: 35371, Training Loss: 0.00773
Epoch: 35371, Training Loss: 0.00723
Epoch: 35371, Training Loss: 0.00883
Epoch: 35371, Training Loss: 0.00684
Epoch: 35372, Training Loss: 0.00773
Epoch: 35372, Training Loss: 0.00723
Epoch: 35372, Training Loss: 0.00883
Epoch: 35372, Training Loss: 0.00684
Epoch: 35373, Training Loss: 0.00773
Epoch: 35373, Training Loss: 0.00723
Epoch: 35373, Training Loss: 0.00883
Epoch: 35373, Training Loss: 0.00684
Epoch: 35374, Training Loss: 0.00773
Epoch: 35374, Training Loss: 0.00723
Epoch: 35374, Training Loss: 0.00883
Epoch: 35374, Training Loss: 0.00684
Epoch: 35375, Training Loss: 0.00773
Epoch: 35375, Training Loss: 0.00723
Epoch: 35375, Training Loss: 0.00883
Epoch: 35375, Training Loss: 0.00684
Epoch: 35376, Training Loss: 0.00773
Epoch: 35376, Training Loss: 0.00723
Epoch: 35376, Training Loss: 0.00883
Epoch: 35376, Training Loss: 0.00684
Epoch: 35377, Training Loss: 0.00773
Epoch: 35377, Training Loss: 0.00723
Epoch: 35377, Training Loss: 0.00883
Epoch: 35377, Training Loss: 0.00684
Epoch: 35378, Training Loss: 0.00773
Epoch: 35378, Training Loss: 0.00723
Epoch: 35378, Training Loss: 0.00883
Epoch: 35378, Training Loss: 0.00684
Epoch: 35379, Training Loss: 0.00773
Epoch: 35379, Training Loss: 0.00723
Epoch: 35379, Training Loss: 0.00883
Epoch: 35379, Training Loss: 0.00684
Epoch: 35380, Training Loss: 0.00773
Epoch: 35380, Training Loss: 0.00723
Epoch: 35380, Training Loss: 0.00883
Epoch: 35380, Training Loss: 0.00684
Epoch: 35381, Training Loss: 0.00773
Epoch: 35381, Training Loss: 0.00723
Epoch: 35381, Training Loss: 0.00883
Epoch: 35381, Training Loss: 0.00684
Epoch: 35382, Training Loss: 0.00773
Epoch: 35382, Training Loss: 0.00723
Epoch: 35382, Training Loss: 0.00883
Epoch: 35382, Training Loss: 0.00684
Epoch: 35383, Training Loss: 0.00773
Epoch: 35383, Training Loss: 0.00723
Epoch: 35383, Training Loss: 0.00883
Epoch: 35383, Training Loss: 0.00684
Epoch: 35384, Training Loss: 0.00773
Epoch: 35384, Training Loss: 0.00723
Epoch: 35384, Training Loss: 0.00883
Epoch: 35384, Training Loss: 0.00684
Epoch: 35385, Training Loss: 0.00773
Epoch: 35385, Training Loss: 0.00723
Epoch: 35385, Training Loss: 0.00883
Epoch: 35385, Training Loss: 0.00684
Epoch: 35386, Training Loss: 0.00773
Epoch: 35386, Training Loss: 0.00723
Epoch: 35386, Training Loss: 0.00883
Epoch: 35386, Training Loss: 0.00684
Epoch: 35387, Training Loss: 0.00773
Epoch: 35387, Training Loss: 0.00723
Epoch: 35387, Training Loss: 0.00883
Epoch: 35387, Training Loss: 0.00684
Epoch: 35388, Training Loss: 0.00773
Epoch: 35388, Training Loss: 0.00723
Epoch: 35388, Training Loss: 0.00883
Epoch: 35388, Training Loss: 0.00684
Epoch: 35389, Training Loss: 0.00773
Epoch: 35389, Training Loss: 0.00723
Epoch: 35389, Training Loss: 0.00883
Epoch: 35389, Training Loss: 0.00684
Epoch: 35390, Training Loss: 0.00773
Epoch: 35390, Training Loss: 0.00723
Epoch: 35390, Training Loss: 0.00883
Epoch: 35390, Training Loss: 0.00684
Epoch: 35391, Training Loss: 0.00773
Epoch: 35391, Training Loss: 0.00723
Epoch: 35391, Training Loss: 0.00883
Epoch: 35391, Training Loss: 0.00684
Epoch: 35392, Training Loss: 0.00773
Epoch: 35392, Training Loss: 0.00723
Epoch: 35392, Training Loss: 0.00883
Epoch: 35392, Training Loss: 0.00684
Epoch: 35393, Training Loss: 0.00773
Epoch: 35393, Training Loss: 0.00722
Epoch: 35393, Training Loss: 0.00883
Epoch: 35393, Training Loss: 0.00684
Epoch: 35394, Training Loss: 0.00773
Epoch: 35394, Training Loss: 0.00722
Epoch: 35394, Training Loss: 0.00883
Epoch: 35394, Training Loss: 0.00684
Epoch: 35395, Training Loss: 0.00773
Epoch: 35395, Training Loss: 0.00722
Epoch: 35395, Training Loss: 0.00883
Epoch: 35395, Training Loss: 0.00684
Epoch: 35396, Training Loss: 0.00773
Epoch: 35396, Training Loss: 0.00722
Epoch: 35396, Training Loss: 0.00883
Epoch: 35396, Training Loss: 0.00684
Epoch: 35397, Training Loss: 0.00773
Epoch: 35397, Training Loss: 0.00722
Epoch: 35397, Training Loss: 0.00883
Epoch: 35397, Training Loss: 0.00684
Epoch: 35398, Training Loss: 0.00773
Epoch: 35398, Training Loss: 0.00722
Epoch: 35398, Training Loss: 0.00883
Epoch: 35398, Training Loss: 0.00684
Epoch: 35399, Training Loss: 0.00773
Epoch: 35399, Training Loss: 0.00722
Epoch: 35399, Training Loss: 0.00883
Epoch: 35399, Training Loss: 0.00684
Epoch: 35400, Training Loss: 0.00773
Epoch: 35400, Training Loss: 0.00722
Epoch: 35400, Training Loss: 0.00883
Epoch: 35400, Training Loss: 0.00684
Epoch: 35401, Training Loss: 0.00773
Epoch: 35401, Training Loss: 0.00722
Epoch: 35401, Training Loss: 0.00883
Epoch: 35401, Training Loss: 0.00684
Epoch: 35402, Training Loss: 0.00773
Epoch: 35402, Training Loss: 0.00722
Epoch: 35402, Training Loss: 0.00883
Epoch: 35402, Training Loss: 0.00684
Epoch: 35403, Training Loss: 0.00773
Epoch: 35403, Training Loss: 0.00722
Epoch: 35403, Training Loss: 0.00883
Epoch: 35403, Training Loss: 0.00684
Epoch: 35404, Training Loss: 0.00773
Epoch: 35404, Training Loss: 0.00722
Epoch: 35404, Training Loss: 0.00883
Epoch: 35404, Training Loss: 0.00684
Epoch: 35405, Training Loss: 0.00773
Epoch: 35405, Training Loss: 0.00722
Epoch: 35405, Training Loss: 0.00883
Epoch: 35405, Training Loss: 0.00684
Epoch: 35406, Training Loss: 0.00773
Epoch: 35406, Training Loss: 0.00722
Epoch: 35406, Training Loss: 0.00883
Epoch: 35406, Training Loss: 0.00684
Epoch: 35407, Training Loss: 0.00773
Epoch: 35407, Training Loss: 0.00722
Epoch: 35407, Training Loss: 0.00882
Epoch: 35407, Training Loss: 0.00684
Epoch: 35408, Training Loss: 0.00773
Epoch: 35408, Training Loss: 0.00722
Epoch: 35408, Training Loss: 0.00882
Epoch: 35408, Training Loss: 0.00684
Epoch: 35409, Training Loss: 0.00773
Epoch: 35409, Training Loss: 0.00722
Epoch: 35409, Training Loss: 0.00882
Epoch: 35409, Training Loss: 0.00684
Epoch: 35410, Training Loss: 0.00772
Epoch: 35410, Training Loss: 0.00722
Epoch: 35410, Training Loss: 0.00882
Epoch: 35410, Training Loss: 0.00684
Epoch: 35411, Training Loss: 0.00772
Epoch: 35411, Training Loss: 0.00722
Epoch: 35411, Training Loss: 0.00882
Epoch: 35411, Training Loss: 0.00684
Epoch: 35412, Training Loss: 0.00772
Epoch: 35412, Training Loss: 0.00722
Epoch: 35412, Training Loss: 0.00882
Epoch: 35412, Training Loss: 0.00684
Epoch: 35413, Training Loss: 0.00772
Epoch: 35413, Training Loss: 0.00722
Epoch: 35413, Training Loss: 0.00882
Epoch: 35413, Training Loss: 0.00684
Epoch: 35414, Training Loss: 0.00772
Epoch: 35414, Training Loss: 0.00722
Epoch: 35414, Training Loss: 0.00882
Epoch: 35414, Training Loss: 0.00684
Epoch: 35415, Training Loss: 0.00772
Epoch: 35415, Training Loss: 0.00722
Epoch: 35415, Training Loss: 0.00882
Epoch: 35415, Training Loss: 0.00684
Epoch: 35416, Training Loss: 0.00772
Epoch: 35416, Training Loss: 0.00722
Epoch: 35416, Training Loss: 0.00882
Epoch: 35416, Training Loss: 0.00684
Epoch: 35417, Training Loss: 0.00772
Epoch: 35417, Training Loss: 0.00722
Epoch: 35417, Training Loss: 0.00882
Epoch: 35417, Training Loss: 0.00684
Epoch: 35418, Training Loss: 0.00772
Epoch: 35418, Training Loss: 0.00722
Epoch: 35418, Training Loss: 0.00882
Epoch: 35418, Training Loss: 0.00684
Epoch: 35419, Training Loss: 0.00772
Epoch: 35419, Training Loss: 0.00722
Epoch: 35419, Training Loss: 0.00882
Epoch: 35419, Training Loss: 0.00684
Epoch: 35420, Training Loss: 0.00772
Epoch: 35420, Training Loss: 0.00722
Epoch: 35420, Training Loss: 0.00882
Epoch: 35420, Training Loss: 0.00684
Epoch: 35421, Training Loss: 0.00772
Epoch: 35421, Training Loss: 0.00722
Epoch: 35421, Training Loss: 0.00882
Epoch: 35421, Training Loss: 0.00684
Epoch: 35422, Training Loss: 0.00772
Epoch: 35422, Training Loss: 0.00722
Epoch: 35422, Training Loss: 0.00882
Epoch: 35422, Training Loss: 0.00684
Epoch: 35423, Training Loss: 0.00772
Epoch: 35423, Training Loss: 0.00722
Epoch: 35423, Training Loss: 0.00882
Epoch: 35423, Training Loss: 0.00684
Epoch: 35424, Training Loss: 0.00772
Epoch: 35424, Training Loss: 0.00722
Epoch: 35424, Training Loss: 0.00882
Epoch: 35424, Training Loss: 0.00684
Epoch: 35425, Training Loss: 0.00772
Epoch: 35425, Training Loss: 0.00722
Epoch: 35425, Training Loss: 0.00882
Epoch: 35425, Training Loss: 0.00684
Epoch: 35426, Training Loss: 0.00772
Epoch: 35426, Training Loss: 0.00722
Epoch: 35426, Training Loss: 0.00882
Epoch: 35426, Training Loss: 0.00684
Epoch: 35427, Training Loss: 0.00772
Epoch: 35427, Training Loss: 0.00722
Epoch: 35427, Training Loss: 0.00882
Epoch: 35427, Training Loss: 0.00684
Epoch: 35428, Training Loss: 0.00772
Epoch: 35428, Training Loss: 0.00722
Epoch: 35428, Training Loss: 0.00882
Epoch: 35428, Training Loss: 0.00684
Epoch: 35429, Training Loss: 0.00772
Epoch: 35429, Training Loss: 0.00722
Epoch: 35429, Training Loss: 0.00882
Epoch: 35429, Training Loss: 0.00684
Epoch: 35430, Training Loss: 0.00772
Epoch: 35430, Training Loss: 0.00722
Epoch: 35430, Training Loss: 0.00882
Epoch: 35430, Training Loss: 0.00684
Epoch: 35431, Training Loss: 0.00772
Epoch: 35431, Training Loss: 0.00722
Epoch: 35431, Training Loss: 0.00882
Epoch: 35431, Training Loss: 0.00684
Epoch: 35432, Training Loss: 0.00772
Epoch: 35432, Training Loss: 0.00722
Epoch: 35432, Training Loss: 0.00882
Epoch: 35432, Training Loss: 0.00684
Epoch: 35433, Training Loss: 0.00772
Epoch: 35433, Training Loss: 0.00722
Epoch: 35433, Training Loss: 0.00882
Epoch: 35433, Training Loss: 0.00684
Epoch: 35434, Training Loss: 0.00772
Epoch: 35434, Training Loss: 0.00722
Epoch: 35434, Training Loss: 0.00882
Epoch: 35434, Training Loss: 0.00684
Epoch: 35435, Training Loss: 0.00772
Epoch: 35435, Training Loss: 0.00722
Epoch: 35435, Training Loss: 0.00882
Epoch: 35435, Training Loss: 0.00684
Epoch: 35436, Training Loss: 0.00772
Epoch: 35436, Training Loss: 0.00722
Epoch: 35436, Training Loss: 0.00882
Epoch: 35436, Training Loss: 0.00684
Epoch: 35437, Training Loss: 0.00772
Epoch: 35437, Training Loss: 0.00722
Epoch: 35437, Training Loss: 0.00882
Epoch: 35437, Training Loss: 0.00684
Epoch: 35438, Training Loss: 0.00772
Epoch: 35438, Training Loss: 0.00722
Epoch: 35438, Training Loss: 0.00882
Epoch: 35438, Training Loss: 0.00684
Epoch: 35439, Training Loss: 0.00772
Epoch: 35439, Training Loss: 0.00722
Epoch: 35439, Training Loss: 0.00882
Epoch: 35439, Training Loss: 0.00684
Epoch: 35440, Training Loss: 0.00772
Epoch: 35440, Training Loss: 0.00722
Epoch: 35440, Training Loss: 0.00882
Epoch: 35440, Training Loss: 0.00684
Epoch: 35441, Training Loss: 0.00772
Epoch: 35441, Training Loss: 0.00722
Epoch: 35441, Training Loss: 0.00882
Epoch: 35441, Training Loss: 0.00684
Epoch: 35442, Training Loss: 0.00772
Epoch: 35442, Training Loss: 0.00722
Epoch: 35442, Training Loss: 0.00882
Epoch: 35442, Training Loss: 0.00684
Epoch: 35443, Training Loss: 0.00772
Epoch: 35443, Training Loss: 0.00722
Epoch: 35443, Training Loss: 0.00882
Epoch: 35443, Training Loss: 0.00684
Epoch: 35444, Training Loss: 0.00772
Epoch: 35444, Training Loss: 0.00722
Epoch: 35444, Training Loss: 0.00882
Epoch: 35444, Training Loss: 0.00684
Epoch: 35445, Training Loss: 0.00772
Epoch: 35445, Training Loss: 0.00722
Epoch: 35445, Training Loss: 0.00882
Epoch: 35445, Training Loss: 0.00684
Epoch: 35446, Training Loss: 0.00772
Epoch: 35446, Training Loss: 0.00722
Epoch: 35446, Training Loss: 0.00882
Epoch: 35446, Training Loss: 0.00684
Epoch: 35447, Training Loss: 0.00772
Epoch: 35447, Training Loss: 0.00722
Epoch: 35447, Training Loss: 0.00882
Epoch: 35447, Training Loss: 0.00684
Epoch: 35448, Training Loss: 0.00772
Epoch: 35448, Training Loss: 0.00722
Epoch: 35448, Training Loss: 0.00882
Epoch: 35448, Training Loss: 0.00684
Epoch: 35449, Training Loss: 0.00772
Epoch: 35449, Training Loss: 0.00722
Epoch: 35449, Training Loss: 0.00882
Epoch: 35449, Training Loss: 0.00684
Epoch: 35450, Training Loss: 0.00772
Epoch: 35450, Training Loss: 0.00722
Epoch: 35450, Training Loss: 0.00882
Epoch: 35450, Training Loss: 0.00684
Epoch: 35451, Training Loss: 0.00772
Epoch: 35451, Training Loss: 0.00722
Epoch: 35451, Training Loss: 0.00882
Epoch: 35451, Training Loss: 0.00684
Epoch: 35452, Training Loss: 0.00772
Epoch: 35452, Training Loss: 0.00722
Epoch: 35452, Training Loss: 0.00882
Epoch: 35452, Training Loss: 0.00683
Epoch: 35453, Training Loss: 0.00772
Epoch: 35453, Training Loss: 0.00722
Epoch: 35453, Training Loss: 0.00882
Epoch: 35453, Training Loss: 0.00683
Epoch: 35454, Training Loss: 0.00772
Epoch: 35454, Training Loss: 0.00722
Epoch: 35454, Training Loss: 0.00882
Epoch: 35454, Training Loss: 0.00683
Epoch: 35455, Training Loss: 0.00772
Epoch: 35455, Training Loss: 0.00722
Epoch: 35455, Training Loss: 0.00882
Epoch: 35455, Training Loss: 0.00683
Epoch: 35456, Training Loss: 0.00772
Epoch: 35456, Training Loss: 0.00722
Epoch: 35456, Training Loss: 0.00882
Epoch: 35456, Training Loss: 0.00683
Epoch: 35457, Training Loss: 0.00772
Epoch: 35457, Training Loss: 0.00722
Epoch: 35457, Training Loss: 0.00882
Epoch: 35457, Training Loss: 0.00683
Epoch: 35458, Training Loss: 0.00772
Epoch: 35458, Training Loss: 0.00722
Epoch: 35458, Training Loss: 0.00882
Epoch: 35458, Training Loss: 0.00683
Epoch: 35459, Training Loss: 0.00772
Epoch: 35459, Training Loss: 0.00722
Epoch: 35459, Training Loss: 0.00882
Epoch: 35459, Training Loss: 0.00683
Epoch: 35460, Training Loss: 0.00772
Epoch: 35460, Training Loss: 0.00722
Epoch: 35460, Training Loss: 0.00882
Epoch: 35460, Training Loss: 0.00683
Epoch: 35461, Training Loss: 0.00772
Epoch: 35461, Training Loss: 0.00722
Epoch: 35461, Training Loss: 0.00882
Epoch: 35461, Training Loss: 0.00683
Epoch: 35462, Training Loss: 0.00772
Epoch: 35462, Training Loss: 0.00722
Epoch: 35462, Training Loss: 0.00882
Epoch: 35462, Training Loss: 0.00683
Epoch: 35463, Training Loss: 0.00772
Epoch: 35463, Training Loss: 0.00722
Epoch: 35463, Training Loss: 0.00882
Epoch: 35463, Training Loss: 0.00683
Epoch: 35464, Training Loss: 0.00772
Epoch: 35464, Training Loss: 0.00722
Epoch: 35464, Training Loss: 0.00882
Epoch: 35464, Training Loss: 0.00683
Epoch: 35465, Training Loss: 0.00772
Epoch: 35465, Training Loss: 0.00722
Epoch: 35465, Training Loss: 0.00882
Epoch: 35465, Training Loss: 0.00683
Epoch: 35466, Training Loss: 0.00772
Epoch: 35466, Training Loss: 0.00722
Epoch: 35466, Training Loss: 0.00882
Epoch: 35466, Training Loss: 0.00683
Epoch: 35467, Training Loss: 0.00772
Epoch: 35467, Training Loss: 0.00722
Epoch: 35467, Training Loss: 0.00882
Epoch: 35467, Training Loss: 0.00683
Epoch: 35468, Training Loss: 0.00772
Epoch: 35468, Training Loss: 0.00722
Epoch: 35468, Training Loss: 0.00882
Epoch: 35468, Training Loss: 0.00683
Epoch: 35469, Training Loss: 0.00772
Epoch: 35469, Training Loss: 0.00722
Epoch: 35469, Training Loss: 0.00882
Epoch: 35469, Training Loss: 0.00683
Epoch: 35470, Training Loss: 0.00772
Epoch: 35470, Training Loss: 0.00722
Epoch: 35470, Training Loss: 0.00882
Epoch: 35470, Training Loss: 0.00683
Epoch: 35471, Training Loss: 0.00772
Epoch: 35471, Training Loss: 0.00722
Epoch: 35471, Training Loss: 0.00882
Epoch: 35471, Training Loss: 0.00683
Epoch: 35472, Training Loss: 0.00772
Epoch: 35472, Training Loss: 0.00722
Epoch: 35472, Training Loss: 0.00882
Epoch: 35472, Training Loss: 0.00683
Epoch: 35473, Training Loss: 0.00772
Epoch: 35473, Training Loss: 0.00722
Epoch: 35473, Training Loss: 0.00882
Epoch: 35473, Training Loss: 0.00683
Epoch: 35474, Training Loss: 0.00772
Epoch: 35474, Training Loss: 0.00722
Epoch: 35474, Training Loss: 0.00882
Epoch: 35474, Training Loss: 0.00683
Epoch: 35475, Training Loss: 0.00772
Epoch: 35475, Training Loss: 0.00722
Epoch: 35475, Training Loss: 0.00882
Epoch: 35475, Training Loss: 0.00683
Epoch: 35476, Training Loss: 0.00772
Epoch: 35476, Training Loss: 0.00722
Epoch: 35476, Training Loss: 0.00882
Epoch: 35476, Training Loss: 0.00683
Epoch: 35477, Training Loss: 0.00772
Epoch: 35477, Training Loss: 0.00722
Epoch: 35477, Training Loss: 0.00882
Epoch: 35477, Training Loss: 0.00683
Epoch: 35478, Training Loss: 0.00772
Epoch: 35478, Training Loss: 0.00722
Epoch: 35478, Training Loss: 0.00882
Epoch: 35478, Training Loss: 0.00683
Epoch: 35479, Training Loss: 0.00772
Epoch: 35479, Training Loss: 0.00722
Epoch: 35479, Training Loss: 0.00882
Epoch: 35479, Training Loss: 0.00683
Epoch: 35480, Training Loss: 0.00772
Epoch: 35480, Training Loss: 0.00722
Epoch: 35480, Training Loss: 0.00882
Epoch: 35480, Training Loss: 0.00683
Epoch: 35481, Training Loss: 0.00772
Epoch: 35481, Training Loss: 0.00722
Epoch: 35481, Training Loss: 0.00882
Epoch: 35481, Training Loss: 0.00683
Epoch: 35482, Training Loss: 0.00772
Epoch: 35482, Training Loss: 0.00722
Epoch: 35482, Training Loss: 0.00882
Epoch: 35482, Training Loss: 0.00683
Epoch: 35483, Training Loss: 0.00772
Epoch: 35483, Training Loss: 0.00722
Epoch: 35483, Training Loss: 0.00881
Epoch: 35483, Training Loss: 0.00683
Epoch: 35484, Training Loss: 0.00772
Epoch: 35484, Training Loss: 0.00722
Epoch: 35484, Training Loss: 0.00881
Epoch: 35484, Training Loss: 0.00683
Epoch: 35485, Training Loss: 0.00772
Epoch: 35485, Training Loss: 0.00722
Epoch: 35485, Training Loss: 0.00881
Epoch: 35485, Training Loss: 0.00683
Epoch: 35486, Training Loss: 0.00772
Epoch: 35486, Training Loss: 0.00722
Epoch: 35486, Training Loss: 0.00881
Epoch: 35486, Training Loss: 0.00683
Epoch: 35487, Training Loss: 0.00772
Epoch: 35487, Training Loss: 0.00722
Epoch: 35487, Training Loss: 0.00881
Epoch: 35487, Training Loss: 0.00683
Epoch: 35488, Training Loss: 0.00772
Epoch: 35488, Training Loss: 0.00721
Epoch: 35488, Training Loss: 0.00881
Epoch: 35488, Training Loss: 0.00683
Epoch: 35489, Training Loss: 0.00772
Epoch: 35489, Training Loss: 0.00721
Epoch: 35489, Training Loss: 0.00881
Epoch: 35489, Training Loss: 0.00683
Epoch: 35490, Training Loss: 0.00772
Epoch: 35490, Training Loss: 0.00721
Epoch: 35490, Training Loss: 0.00881
Epoch: 35490, Training Loss: 0.00683
Epoch: 35491, Training Loss: 0.00772
Epoch: 35491, Training Loss: 0.00721
Epoch: 35491, Training Loss: 0.00881
Epoch: 35491, Training Loss: 0.00683
Epoch: 35492, Training Loss: 0.00772
Epoch: 35492, Training Loss: 0.00721
Epoch: 35492, Training Loss: 0.00881
Epoch: 35492, Training Loss: 0.00683
Epoch: 35493, Training Loss: 0.00772
Epoch: 35493, Training Loss: 0.00721
Epoch: 35493, Training Loss: 0.00881
Epoch: 35493, Training Loss: 0.00683
Epoch: 35494, Training Loss: 0.00772
Epoch: 35494, Training Loss: 0.00721
Epoch: 35494, Training Loss: 0.00881
Epoch: 35494, Training Loss: 0.00683
Epoch: 35495, Training Loss: 0.00772
Epoch: 35495, Training Loss: 0.00721
Epoch: 35495, Training Loss: 0.00881
Epoch: 35495, Training Loss: 0.00683
Epoch: 35496, Training Loss: 0.00772
Epoch: 35496, Training Loss: 0.00721
Epoch: 35496, Training Loss: 0.00881
Epoch: 35496, Training Loss: 0.00683
Epoch: 35497, Training Loss: 0.00771
Epoch: 35497, Training Loss: 0.00721
Epoch: 35497, Training Loss: 0.00881
Epoch: 35497, Training Loss: 0.00683
Epoch: 35498, Training Loss: 0.00771
Epoch: 35498, Training Loss: 0.00721
Epoch: 35498, Training Loss: 0.00881
Epoch: 35498, Training Loss: 0.00683
Epoch: 35499, Training Loss: 0.00771
Epoch: 35499, Training Loss: 0.00721
Epoch: 35499, Training Loss: 0.00881
Epoch: 35499, Training Loss: 0.00683
Epoch: 35500, Training Loss: 0.00771
Epoch: 35500, Training Loss: 0.00721
Epoch: 35500, Training Loss: 0.00881
Epoch: 35500, Training Loss: 0.00683
Epoch: 35501, Training Loss: 0.00771
Epoch: 35501, Training Loss: 0.00721
Epoch: 35501, Training Loss: 0.00881
Epoch: 35501, Training Loss: 0.00683
Epoch: 35502, Training Loss: 0.00771
Epoch: 35502, Training Loss: 0.00721
Epoch: 35502, Training Loss: 0.00881
Epoch: 35502, Training Loss: 0.00683
Epoch: 35503, Training Loss: 0.00771
Epoch: 35503, Training Loss: 0.00721
Epoch: 35503, Training Loss: 0.00881
Epoch: 35503, Training Loss: 0.00683
Epoch: 35504, Training Loss: 0.00771
Epoch: 35504, Training Loss: 0.00721
Epoch: 35504, Training Loss: 0.00881
Epoch: 35504, Training Loss: 0.00683
Epoch: 35505, Training Loss: 0.00771
Epoch: 35505, Training Loss: 0.00721
Epoch: 35505, Training Loss: 0.00881
Epoch: 35505, Training Loss: 0.00683
...
Epoch: 35749, Training Loss: 0.00769
[Output truncated. One loss line is printed per training sample (4 per epoch). Over epochs 35505–35749 the four per-sample losses decrease very slowly: 0.00771 → 0.00769, 0.00721 → 0.00719, 0.00881 → 0.00878, 0.00683 → 0.00681, still converging toward the 0.008 error target.]
Epoch: 35749, Training Loss: 0.00719
Epoch: 35749, Training Loss: 0.00878
Epoch: 35749, Training Loss: 0.00681
Epoch: 35750, Training Loss: 0.00769
Epoch: 35750, Training Loss: 0.00719
Epoch: 35750, Training Loss: 0.00878
Epoch: 35750, Training Loss: 0.00681
Epoch: 35751, Training Loss: 0.00769
Epoch: 35751, Training Loss: 0.00719
Epoch: 35751, Training Loss: 0.00878
Epoch: 35751, Training Loss: 0.00681
Epoch: 35752, Training Loss: 0.00769
Epoch: 35752, Training Loss: 0.00719
Epoch: 35752, Training Loss: 0.00878
Epoch: 35752, Training Loss: 0.00680
Epoch: 35753, Training Loss: 0.00769
Epoch: 35753, Training Loss: 0.00719
Epoch: 35753, Training Loss: 0.00878
Epoch: 35753, Training Loss: 0.00680
Epoch: 35754, Training Loss: 0.00769
Epoch: 35754, Training Loss: 0.00719
Epoch: 35754, Training Loss: 0.00878
Epoch: 35754, Training Loss: 0.00680
Epoch: 35755, Training Loss: 0.00769
Epoch: 35755, Training Loss: 0.00719
Epoch: 35755, Training Loss: 0.00878
Epoch: 35755, Training Loss: 0.00680
Epoch: 35756, Training Loss: 0.00769
Epoch: 35756, Training Loss: 0.00719
Epoch: 35756, Training Loss: 0.00878
Epoch: 35756, Training Loss: 0.00680
Epoch: 35757, Training Loss: 0.00769
Epoch: 35757, Training Loss: 0.00719
Epoch: 35757, Training Loss: 0.00878
Epoch: 35757, Training Loss: 0.00680
Epoch: 35758, Training Loss: 0.00769
Epoch: 35758, Training Loss: 0.00719
Epoch: 35758, Training Loss: 0.00878
Epoch: 35758, Training Loss: 0.00680
Epoch: 35759, Training Loss: 0.00769
Epoch: 35759, Training Loss: 0.00719
Epoch: 35759, Training Loss: 0.00878
Epoch: 35759, Training Loss: 0.00680
Epoch: 35760, Training Loss: 0.00768
Epoch: 35760, Training Loss: 0.00719
Epoch: 35760, Training Loss: 0.00878
Epoch: 35760, Training Loss: 0.00680
Epoch: 35761, Training Loss: 0.00768
Epoch: 35761, Training Loss: 0.00719
Epoch: 35761, Training Loss: 0.00878
Epoch: 35761, Training Loss: 0.00680
Epoch: 35762, Training Loss: 0.00768
Epoch: 35762, Training Loss: 0.00719
Epoch: 35762, Training Loss: 0.00878
Epoch: 35762, Training Loss: 0.00680
Epoch: 35763, Training Loss: 0.00768
Epoch: 35763, Training Loss: 0.00719
Epoch: 35763, Training Loss: 0.00878
Epoch: 35763, Training Loss: 0.00680
Epoch: 35764, Training Loss: 0.00768
Epoch: 35764, Training Loss: 0.00719
Epoch: 35764, Training Loss: 0.00878
Epoch: 35764, Training Loss: 0.00680
Epoch: 35765, Training Loss: 0.00768
Epoch: 35765, Training Loss: 0.00719
Epoch: 35765, Training Loss: 0.00878
Epoch: 35765, Training Loss: 0.00680
Epoch: 35766, Training Loss: 0.00768
Epoch: 35766, Training Loss: 0.00719
Epoch: 35766, Training Loss: 0.00878
Epoch: 35766, Training Loss: 0.00680
Epoch: 35767, Training Loss: 0.00768
Epoch: 35767, Training Loss: 0.00719
Epoch: 35767, Training Loss: 0.00878
Epoch: 35767, Training Loss: 0.00680
Epoch: 35768, Training Loss: 0.00768
Epoch: 35768, Training Loss: 0.00719
Epoch: 35768, Training Loss: 0.00878
Epoch: 35768, Training Loss: 0.00680
Epoch: 35769, Training Loss: 0.00768
Epoch: 35769, Training Loss: 0.00719
Epoch: 35769, Training Loss: 0.00878
Epoch: 35769, Training Loss: 0.00680
Epoch: 35770, Training Loss: 0.00768
Epoch: 35770, Training Loss: 0.00719
Epoch: 35770, Training Loss: 0.00878
Epoch: 35770, Training Loss: 0.00680
Epoch: 35771, Training Loss: 0.00768
Epoch: 35771, Training Loss: 0.00719
Epoch: 35771, Training Loss: 0.00878
Epoch: 35771, Training Loss: 0.00680
Epoch: 35772, Training Loss: 0.00768
Epoch: 35772, Training Loss: 0.00719
Epoch: 35772, Training Loss: 0.00878
Epoch: 35772, Training Loss: 0.00680
Epoch: 35773, Training Loss: 0.00768
Epoch: 35773, Training Loss: 0.00718
Epoch: 35773, Training Loss: 0.00878
Epoch: 35773, Training Loss: 0.00680
Epoch: 35774, Training Loss: 0.00768
Epoch: 35774, Training Loss: 0.00718
Epoch: 35774, Training Loss: 0.00878
Epoch: 35774, Training Loss: 0.00680
Epoch: 35775, Training Loss: 0.00768
Epoch: 35775, Training Loss: 0.00718
Epoch: 35775, Training Loss: 0.00878
Epoch: 35775, Training Loss: 0.00680
Epoch: 35776, Training Loss: 0.00768
Epoch: 35776, Training Loss: 0.00718
Epoch: 35776, Training Loss: 0.00878
Epoch: 35776, Training Loss: 0.00680
Epoch: 35777, Training Loss: 0.00768
Epoch: 35777, Training Loss: 0.00718
Epoch: 35777, Training Loss: 0.00878
Epoch: 35777, Training Loss: 0.00680
Epoch: 35778, Training Loss: 0.00768
Epoch: 35778, Training Loss: 0.00718
Epoch: 35778, Training Loss: 0.00878
Epoch: 35778, Training Loss: 0.00680
Epoch: 35779, Training Loss: 0.00768
Epoch: 35779, Training Loss: 0.00718
Epoch: 35779, Training Loss: 0.00878
Epoch: 35779, Training Loss: 0.00680
Epoch: 35780, Training Loss: 0.00768
Epoch: 35780, Training Loss: 0.00718
Epoch: 35780, Training Loss: 0.00878
Epoch: 35780, Training Loss: 0.00680
Epoch: 35781, Training Loss: 0.00768
Epoch: 35781, Training Loss: 0.00718
Epoch: 35781, Training Loss: 0.00878
Epoch: 35781, Training Loss: 0.00680
Epoch: 35782, Training Loss: 0.00768
Epoch: 35782, Training Loss: 0.00718
Epoch: 35782, Training Loss: 0.00878
Epoch: 35782, Training Loss: 0.00680
Epoch: 35783, Training Loss: 0.00768
Epoch: 35783, Training Loss: 0.00718
Epoch: 35783, Training Loss: 0.00878
Epoch: 35783, Training Loss: 0.00680
Epoch: 35784, Training Loss: 0.00768
Epoch: 35784, Training Loss: 0.00718
Epoch: 35784, Training Loss: 0.00878
Epoch: 35784, Training Loss: 0.00680
Epoch: 35785, Training Loss: 0.00768
Epoch: 35785, Training Loss: 0.00718
Epoch: 35785, Training Loss: 0.00878
Epoch: 35785, Training Loss: 0.00680
Epoch: 35786, Training Loss: 0.00768
Epoch: 35786, Training Loss: 0.00718
Epoch: 35786, Training Loss: 0.00878
Epoch: 35786, Training Loss: 0.00680
Epoch: 35787, Training Loss: 0.00768
Epoch: 35787, Training Loss: 0.00718
Epoch: 35787, Training Loss: 0.00878
Epoch: 35787, Training Loss: 0.00680
Epoch: 35788, Training Loss: 0.00768
Epoch: 35788, Training Loss: 0.00718
Epoch: 35788, Training Loss: 0.00878
Epoch: 35788, Training Loss: 0.00680
Epoch: 35789, Training Loss: 0.00768
Epoch: 35789, Training Loss: 0.00718
Epoch: 35789, Training Loss: 0.00878
Epoch: 35789, Training Loss: 0.00680
Epoch: 35790, Training Loss: 0.00768
Epoch: 35790, Training Loss: 0.00718
Epoch: 35790, Training Loss: 0.00878
Epoch: 35790, Training Loss: 0.00680
Epoch: 35791, Training Loss: 0.00768
Epoch: 35791, Training Loss: 0.00718
Epoch: 35791, Training Loss: 0.00877
Epoch: 35791, Training Loss: 0.00680
Epoch: 35792, Training Loss: 0.00768
Epoch: 35792, Training Loss: 0.00718
Epoch: 35792, Training Loss: 0.00877
Epoch: 35792, Training Loss: 0.00680
Epoch: 35793, Training Loss: 0.00768
Epoch: 35793, Training Loss: 0.00718
Epoch: 35793, Training Loss: 0.00877
Epoch: 35793, Training Loss: 0.00680
Epoch: 35794, Training Loss: 0.00768
Epoch: 35794, Training Loss: 0.00718
Epoch: 35794, Training Loss: 0.00877
Epoch: 35794, Training Loss: 0.00680
Epoch: 35795, Training Loss: 0.00768
Epoch: 35795, Training Loss: 0.00718
Epoch: 35795, Training Loss: 0.00877
Epoch: 35795, Training Loss: 0.00680
Epoch: 35796, Training Loss: 0.00768
Epoch: 35796, Training Loss: 0.00718
Epoch: 35796, Training Loss: 0.00877
Epoch: 35796, Training Loss: 0.00680
Epoch: 35797, Training Loss: 0.00768
Epoch: 35797, Training Loss: 0.00718
Epoch: 35797, Training Loss: 0.00877
Epoch: 35797, Training Loss: 0.00680
Epoch: 35798, Training Loss: 0.00768
Epoch: 35798, Training Loss: 0.00718
Epoch: 35798, Training Loss: 0.00877
Epoch: 35798, Training Loss: 0.00680
Epoch: 35799, Training Loss: 0.00768
Epoch: 35799, Training Loss: 0.00718
Epoch: 35799, Training Loss: 0.00877
Epoch: 35799, Training Loss: 0.00680
Epoch: 35800, Training Loss: 0.00768
Epoch: 35800, Training Loss: 0.00718
Epoch: 35800, Training Loss: 0.00877
Epoch: 35800, Training Loss: 0.00680
Epoch: 35801, Training Loss: 0.00768
Epoch: 35801, Training Loss: 0.00718
Epoch: 35801, Training Loss: 0.00877
Epoch: 35801, Training Loss: 0.00680
Epoch: 35802, Training Loss: 0.00768
Epoch: 35802, Training Loss: 0.00718
Epoch: 35802, Training Loss: 0.00877
Epoch: 35802, Training Loss: 0.00680
Epoch: 35803, Training Loss: 0.00768
Epoch: 35803, Training Loss: 0.00718
Epoch: 35803, Training Loss: 0.00877
Epoch: 35803, Training Loss: 0.00680
Epoch: 35804, Training Loss: 0.00768
Epoch: 35804, Training Loss: 0.00718
Epoch: 35804, Training Loss: 0.00877
Epoch: 35804, Training Loss: 0.00680
Epoch: 35805, Training Loss: 0.00768
Epoch: 35805, Training Loss: 0.00718
Epoch: 35805, Training Loss: 0.00877
Epoch: 35805, Training Loss: 0.00680
Epoch: 35806, Training Loss: 0.00768
Epoch: 35806, Training Loss: 0.00718
Epoch: 35806, Training Loss: 0.00877
Epoch: 35806, Training Loss: 0.00680
Epoch: 35807, Training Loss: 0.00768
Epoch: 35807, Training Loss: 0.00718
Epoch: 35807, Training Loss: 0.00877
Epoch: 35807, Training Loss: 0.00680
Epoch: 35808, Training Loss: 0.00768
Epoch: 35808, Training Loss: 0.00718
Epoch: 35808, Training Loss: 0.00877
Epoch: 35808, Training Loss: 0.00680
Epoch: 35809, Training Loss: 0.00768
Epoch: 35809, Training Loss: 0.00718
Epoch: 35809, Training Loss: 0.00877
Epoch: 35809, Training Loss: 0.00680
Epoch: 35810, Training Loss: 0.00768
Epoch: 35810, Training Loss: 0.00718
Epoch: 35810, Training Loss: 0.00877
Epoch: 35810, Training Loss: 0.00680
Epoch: 35811, Training Loss: 0.00768
Epoch: 35811, Training Loss: 0.00718
Epoch: 35811, Training Loss: 0.00877
Epoch: 35811, Training Loss: 0.00680
Epoch: 35812, Training Loss: 0.00768
Epoch: 35812, Training Loss: 0.00718
Epoch: 35812, Training Loss: 0.00877
Epoch: 35812, Training Loss: 0.00680
Epoch: 35813, Training Loss: 0.00768
Epoch: 35813, Training Loss: 0.00718
Epoch: 35813, Training Loss: 0.00877
Epoch: 35813, Training Loss: 0.00680
Epoch: 35814, Training Loss: 0.00768
Epoch: 35814, Training Loss: 0.00718
Epoch: 35814, Training Loss: 0.00877
Epoch: 35814, Training Loss: 0.00680
Epoch: 35815, Training Loss: 0.00768
Epoch: 35815, Training Loss: 0.00718
Epoch: 35815, Training Loss: 0.00877
Epoch: 35815, Training Loss: 0.00680
Epoch: 35816, Training Loss: 0.00768
Epoch: 35816, Training Loss: 0.00718
Epoch: 35816, Training Loss: 0.00877
Epoch: 35816, Training Loss: 0.00680
Epoch: 35817, Training Loss: 0.00768
Epoch: 35817, Training Loss: 0.00718
Epoch: 35817, Training Loss: 0.00877
Epoch: 35817, Training Loss: 0.00680
Epoch: 35818, Training Loss: 0.00768
Epoch: 35818, Training Loss: 0.00718
Epoch: 35818, Training Loss: 0.00877
Epoch: 35818, Training Loss: 0.00680
Epoch: 35819, Training Loss: 0.00768
Epoch: 35819, Training Loss: 0.00718
Epoch: 35819, Training Loss: 0.00877
Epoch: 35819, Training Loss: 0.00680
Epoch: 35820, Training Loss: 0.00768
Epoch: 35820, Training Loss: 0.00718
Epoch: 35820, Training Loss: 0.00877
Epoch: 35820, Training Loss: 0.00680
Epoch: 35821, Training Loss: 0.00768
Epoch: 35821, Training Loss: 0.00718
Epoch: 35821, Training Loss: 0.00877
Epoch: 35821, Training Loss: 0.00680
Epoch: 35822, Training Loss: 0.00768
Epoch: 35822, Training Loss: 0.00718
Epoch: 35822, Training Loss: 0.00877
Epoch: 35822, Training Loss: 0.00680
Epoch: 35823, Training Loss: 0.00768
Epoch: 35823, Training Loss: 0.00718
Epoch: 35823, Training Loss: 0.00877
Epoch: 35823, Training Loss: 0.00680
Epoch: 35824, Training Loss: 0.00768
Epoch: 35824, Training Loss: 0.00718
Epoch: 35824, Training Loss: 0.00877
Epoch: 35824, Training Loss: 0.00680
Epoch: 35825, Training Loss: 0.00768
Epoch: 35825, Training Loss: 0.00718
Epoch: 35825, Training Loss: 0.00877
Epoch: 35825, Training Loss: 0.00680
Epoch: 35826, Training Loss: 0.00768
Epoch: 35826, Training Loss: 0.00718
Epoch: 35826, Training Loss: 0.00877
Epoch: 35826, Training Loss: 0.00680
Epoch: 35827, Training Loss: 0.00768
Epoch: 35827, Training Loss: 0.00718
Epoch: 35827, Training Loss: 0.00877
Epoch: 35827, Training Loss: 0.00680
Epoch: 35828, Training Loss: 0.00768
Epoch: 35828, Training Loss: 0.00718
Epoch: 35828, Training Loss: 0.00877
Epoch: 35828, Training Loss: 0.00680
Epoch: 35829, Training Loss: 0.00768
Epoch: 35829, Training Loss: 0.00718
Epoch: 35829, Training Loss: 0.00877
Epoch: 35829, Training Loss: 0.00680
Epoch: 35830, Training Loss: 0.00768
Epoch: 35830, Training Loss: 0.00718
Epoch: 35830, Training Loss: 0.00877
Epoch: 35830, Training Loss: 0.00680
Epoch: 35831, Training Loss: 0.00768
Epoch: 35831, Training Loss: 0.00718
Epoch: 35831, Training Loss: 0.00877
Epoch: 35831, Training Loss: 0.00680
Epoch: 35832, Training Loss: 0.00768
Epoch: 35832, Training Loss: 0.00718
Epoch: 35832, Training Loss: 0.00877
Epoch: 35832, Training Loss: 0.00680
Epoch: 35833, Training Loss: 0.00768
Epoch: 35833, Training Loss: 0.00718
Epoch: 35833, Training Loss: 0.00877
Epoch: 35833, Training Loss: 0.00680
Epoch: 35834, Training Loss: 0.00768
Epoch: 35834, Training Loss: 0.00718
Epoch: 35834, Training Loss: 0.00877
Epoch: 35834, Training Loss: 0.00680
Epoch: 35835, Training Loss: 0.00768
Epoch: 35835, Training Loss: 0.00718
Epoch: 35835, Training Loss: 0.00877
Epoch: 35835, Training Loss: 0.00680
Epoch: 35836, Training Loss: 0.00768
Epoch: 35836, Training Loss: 0.00718
Epoch: 35836, Training Loss: 0.00877
Epoch: 35836, Training Loss: 0.00680
Epoch: 35837, Training Loss: 0.00768
Epoch: 35837, Training Loss: 0.00718
Epoch: 35837, Training Loss: 0.00877
Epoch: 35837, Training Loss: 0.00680
Epoch: 35838, Training Loss: 0.00768
Epoch: 35838, Training Loss: 0.00718
Epoch: 35838, Training Loss: 0.00877
Epoch: 35838, Training Loss: 0.00680
Epoch: 35839, Training Loss: 0.00768
Epoch: 35839, Training Loss: 0.00718
Epoch: 35839, Training Loss: 0.00877
Epoch: 35839, Training Loss: 0.00680
Epoch: 35840, Training Loss: 0.00768
Epoch: 35840, Training Loss: 0.00718
Epoch: 35840, Training Loss: 0.00877
Epoch: 35840, Training Loss: 0.00680
Epoch: 35841, Training Loss: 0.00768
Epoch: 35841, Training Loss: 0.00718
Epoch: 35841, Training Loss: 0.00877
Epoch: 35841, Training Loss: 0.00680
Epoch: 35842, Training Loss: 0.00768
Epoch: 35842, Training Loss: 0.00718
Epoch: 35842, Training Loss: 0.00877
Epoch: 35842, Training Loss: 0.00680
Epoch: 35843, Training Loss: 0.00768
Epoch: 35843, Training Loss: 0.00718
Epoch: 35843, Training Loss: 0.00877
Epoch: 35843, Training Loss: 0.00680
Epoch: 35844, Training Loss: 0.00768
Epoch: 35844, Training Loss: 0.00718
Epoch: 35844, Training Loss: 0.00877
Epoch: 35844, Training Loss: 0.00680
Epoch: 35845, Training Loss: 0.00768
Epoch: 35845, Training Loss: 0.00718
Epoch: 35845, Training Loss: 0.00877
Epoch: 35845, Training Loss: 0.00680
Epoch: 35846, Training Loss: 0.00768
Epoch: 35846, Training Loss: 0.00718
Epoch: 35846, Training Loss: 0.00877
Epoch: 35846, Training Loss: 0.00680
Epoch: 35847, Training Loss: 0.00768
Epoch: 35847, Training Loss: 0.00718
Epoch: 35847, Training Loss: 0.00877
Epoch: 35847, Training Loss: 0.00680
Epoch: 35848, Training Loss: 0.00767
Epoch: 35848, Training Loss: 0.00718
Epoch: 35848, Training Loss: 0.00877
Epoch: 35848, Training Loss: 0.00680
Epoch: 35849, Training Loss: 0.00767
Epoch: 35849, Training Loss: 0.00718
Epoch: 35849, Training Loss: 0.00877
Epoch: 35849, Training Loss: 0.00680
Epoch: 35850, Training Loss: 0.00767
Epoch: 35850, Training Loss: 0.00718
Epoch: 35850, Training Loss: 0.00877
Epoch: 35850, Training Loss: 0.00680
Epoch: 35851, Training Loss: 0.00767
Epoch: 35851, Training Loss: 0.00718
Epoch: 35851, Training Loss: 0.00877
Epoch: 35851, Training Loss: 0.00680
Epoch: 35852, Training Loss: 0.00767
Epoch: 35852, Training Loss: 0.00718
Epoch: 35852, Training Loss: 0.00877
Epoch: 35852, Training Loss: 0.00680
Epoch: 35853, Training Loss: 0.00767
Epoch: 35853, Training Loss: 0.00718
Epoch: 35853, Training Loss: 0.00877
Epoch: 35853, Training Loss: 0.00679
Epoch: 35854, Training Loss: 0.00767
Epoch: 35854, Training Loss: 0.00718
Epoch: 35854, Training Loss: 0.00877
Epoch: 35854, Training Loss: 0.00679
Epoch: 35855, Training Loss: 0.00767
Epoch: 35855, Training Loss: 0.00718
Epoch: 35855, Training Loss: 0.00877
Epoch: 35855, Training Loss: 0.00679
Epoch: 35856, Training Loss: 0.00767
Epoch: 35856, Training Loss: 0.00718
Epoch: 35856, Training Loss: 0.00877
Epoch: 35856, Training Loss: 0.00679
Epoch: 35857, Training Loss: 0.00767
Epoch: 35857, Training Loss: 0.00718
Epoch: 35857, Training Loss: 0.00877
Epoch: 35857, Training Loss: 0.00679
Epoch: 35858, Training Loss: 0.00767
Epoch: 35858, Training Loss: 0.00718
Epoch: 35858, Training Loss: 0.00877
Epoch: 35858, Training Loss: 0.00679
Epoch: 35859, Training Loss: 0.00767
Epoch: 35859, Training Loss: 0.00718
Epoch: 35859, Training Loss: 0.00877
Epoch: 35859, Training Loss: 0.00679
Epoch: 35860, Training Loss: 0.00767
Epoch: 35860, Training Loss: 0.00718
Epoch: 35860, Training Loss: 0.00877
Epoch: 35860, Training Loss: 0.00679
Epoch: 35861, Training Loss: 0.00767
Epoch: 35861, Training Loss: 0.00718
Epoch: 35861, Training Loss: 0.00877
Epoch: 35861, Training Loss: 0.00679
Epoch: 35862, Training Loss: 0.00767
Epoch: 35862, Training Loss: 0.00718
Epoch: 35862, Training Loss: 0.00877
Epoch: 35862, Training Loss: 0.00679
Epoch: 35863, Training Loss: 0.00767
Epoch: 35863, Training Loss: 0.00718
Epoch: 35863, Training Loss: 0.00877
Epoch: 35863, Training Loss: 0.00679
Epoch: 35864, Training Loss: 0.00767
Epoch: 35864, Training Loss: 0.00718
Epoch: 35864, Training Loss: 0.00877
Epoch: 35864, Training Loss: 0.00679
Epoch: 35865, Training Loss: 0.00767
Epoch: 35865, Training Loss: 0.00718
Epoch: 35865, Training Loss: 0.00877
Epoch: 35865, Training Loss: 0.00679
Epoch: 35866, Training Loss: 0.00767
Epoch: 35866, Training Loss: 0.00718
Epoch: 35866, Training Loss: 0.00877
Epoch: 35866, Training Loss: 0.00679
Epoch: 35867, Training Loss: 0.00767
Epoch: 35867, Training Loss: 0.00718
Epoch: 35867, Training Loss: 0.00877
Epoch: 35867, Training Loss: 0.00679
Epoch: 35868, Training Loss: 0.00767
Epoch: 35868, Training Loss: 0.00718
Epoch: 35868, Training Loss: 0.00877
Epoch: 35868, Training Loss: 0.00679
Epoch: 35869, Training Loss: 0.00767
Epoch: 35869, Training Loss: 0.00717
Epoch: 35869, Training Loss: 0.00876
Epoch: 35869, Training Loss: 0.00679
Epoch: 35870, Training Loss: 0.00767
Epoch: 35870, Training Loss: 0.00717
Epoch: 35870, Training Loss: 0.00876
Epoch: 35870, Training Loss: 0.00679
Epoch: 35871, Training Loss: 0.00767
Epoch: 35871, Training Loss: 0.00717
Epoch: 35871, Training Loss: 0.00876
Epoch: 35871, Training Loss: 0.00679
Epoch: 35872, Training Loss: 0.00767
Epoch: 35872, Training Loss: 0.00717
Epoch: 35872, Training Loss: 0.00876
Epoch: 35872, Training Loss: 0.00679
Epoch: 35873, Training Loss: 0.00767
Epoch: 35873, Training Loss: 0.00717
Epoch: 35873, Training Loss: 0.00876
Epoch: 35873, Training Loss: 0.00679
Epoch: 35874, Training Loss: 0.00767
Epoch: 35874, Training Loss: 0.00717
Epoch: 35874, Training Loss: 0.00876
Epoch: 35874, Training Loss: 0.00679
Epoch: 35875, Training Loss: 0.00767
Epoch: 35875, Training Loss: 0.00717
Epoch: 35875, Training Loss: 0.00876
Epoch: 35875, Training Loss: 0.00679
Epoch: 35876, Training Loss: 0.00767
Epoch: 35876, Training Loss: 0.00717
Epoch: 35876, Training Loss: 0.00876
Epoch: 35876, Training Loss: 0.00679
Epoch: 35877, Training Loss: 0.00767
Epoch: 35877, Training Loss: 0.00717
Epoch: 35877, Training Loss: 0.00876
Epoch: 35877, Training Loss: 0.00679
Epoch: 35878, Training Loss: 0.00767
Epoch: 35878, Training Loss: 0.00717
Epoch: 35878, Training Loss: 0.00876
Epoch: 35878, Training Loss: 0.00679
Epoch: 35879, Training Loss: 0.00767
Epoch: 35879, Training Loss: 0.00717
Epoch: 35879, Training Loss: 0.00876
Epoch: 35879, Training Loss: 0.00679
Epoch: 35880, Training Loss: 0.00767
Epoch: 35880, Training Loss: 0.00717
Epoch: 35880, Training Loss: 0.00876
Epoch: 35880, Training Loss: 0.00679
Epoch: 35881, Training Loss: 0.00767
Epoch: 35881, Training Loss: 0.00717
Epoch: 35881, Training Loss: 0.00876
Epoch: 35881, Training Loss: 0.00679
Epoch: 35882, Training Loss: 0.00767
Epoch: 35882, Training Loss: 0.00717
Epoch: 35882, Training Loss: 0.00876
Epoch: 35882, Training Loss: 0.00679
Epoch: 35883, Training Loss: 0.00767
Epoch: 35883, Training Loss: 0.00717
Epoch: 35883, Training Loss: 0.00876
Epoch: 35883, Training Loss: 0.00679
Epoch: 35884, Training Loss: 0.00767
Epoch: 35884, Training Loss: 0.00717
Epoch: 35884, Training Loss: 0.00876
Epoch: 35884, Training Loss: 0.00679
Epoch: 35885, Training Loss: 0.00767
Epoch: 35885, Training Loss: 0.00717
Epoch: 35885, Training Loss: 0.00876
Epoch: 35885, Training Loss: 0.00679
Epoch: 35886, Training Loss: 0.00767
Epoch: 35886, Training Loss: 0.00717
Epoch: 35886, Training Loss: 0.00876
Epoch: 35886, Training Loss: 0.00679
Epoch: 35887, Training Loss: 0.00767
Epoch: 35887, Training Loss: 0.00717
Epoch: 35887, Training Loss: 0.00876
Epoch: 35887, Training Loss: 0.00679
Epoch: 35888, Training Loss: 0.00767
Epoch: 35888, Training Loss: 0.00717
Epoch: 35888, Training Loss: 0.00876
Epoch: 35888, Training Loss: 0.00679
Epoch: 35889, Training Loss: 0.00767
Epoch: 35889, Training Loss: 0.00717
Epoch: 35889, Training Loss: 0.00876
Epoch: 35889, Training Loss: 0.00679
Epoch: 35890, Training Loss: 0.00767
Epoch: 35890, Training Loss: 0.00717
Epoch: 35890, Training Loss: 0.00876
Epoch: 35890, Training Loss: 0.00679
Epoch: 35891, Training Loss: 0.00767
Epoch: 35891, Training Loss: 0.00717
Epoch: 35891, Training Loss: 0.00876
Epoch: 35891, Training Loss: 0.00679
Epoch: 35892, Training Loss: 0.00767
Epoch: 35892, Training Loss: 0.00717
Epoch: 35892, Training Loss: 0.00876
Epoch: 35892, Training Loss: 0.00679
Epoch: 35893, Training Loss: 0.00767
Epoch: 35893, Training Loss: 0.00717
Epoch: 35893, Training Loss: 0.00876
Epoch: 35893, Training Loss: 0.00679
Epoch: 35894, Training Loss: 0.00767
Epoch: 35894, Training Loss: 0.00717
Epoch: 35894, Training Loss: 0.00876
Epoch: 35894, Training Loss: 0.00679
Epoch: 35895, Training Loss: 0.00767
Epoch: 35895, Training Loss: 0.00717
Epoch: 35895, Training Loss: 0.00876
Epoch: 35895, Training Loss: 0.00679
Epoch: 35896, Training Loss: 0.00767
Epoch: 35896, Training Loss: 0.00717
Epoch: 35896, Training Loss: 0.00876
Epoch: 35896, Training Loss: 0.00679
Epoch: 35897, Training Loss: 0.00767
Epoch: 35897, Training Loss: 0.00717
Epoch: 35897, Training Loss: 0.00876
Epoch: 35897, Training Loss: 0.00679
Epoch: 35898, Training Loss: 0.00767
Epoch: 35898, Training Loss: 0.00717
Epoch: 35898, Training Loss: 0.00876
Epoch: 35898, Training Loss: 0.00679
Epoch: 35899, Training Loss: 0.00767
Epoch: 35899, Training Loss: 0.00717
Epoch: 35899, Training Loss: 0.00876
Epoch: 35899, Training Loss: 0.00679
Epoch: 35900, Training Loss: 0.00767
Epoch: 35900, Training Loss: 0.00717
Epoch: 35900, Training Loss: 0.00876
Epoch: 35900, Training Loss: 0.00679
Epoch: 35901, Training Loss: 0.00767
Epoch: 35901, Training Loss: 0.00717
Epoch: 35901, Training Loss: 0.00876
Epoch: 35901, Training Loss: 0.00679
Epoch: 35902, Training Loss: 0.00767
Epoch: 35902, Training Loss: 0.00717
Epoch: 35902, Training Loss: 0.00876
Epoch: 35902, Training Loss: 0.00679
Epoch: 35903, Training Loss: 0.00767
Epoch: 35903, Training Loss: 0.00717
Epoch: 35903, Training Loss: 0.00876
Epoch: 35903, Training Loss: 0.00679
Epoch: 35904, Training Loss: 0.00767
Epoch: 35904, Training Loss: 0.00717
Epoch: 35904, Training Loss: 0.00876
Epoch: 35904, Training Loss: 0.00679
Epoch: 35905, Training Loss: 0.00767
Epoch: 35905, Training Loss: 0.00717
Epoch: 35905, Training Loss: 0.00876
Epoch: 35905, Training Loss: 0.00679
Epoch: 35906, Training Loss: 0.00767
Epoch: 35906, Training Loss: 0.00717
Epoch: 35906, Training Loss: 0.00876
Epoch: 35906, Training Loss: 0.00679
Epoch: 35907, Training Loss: 0.00767
Epoch: 35907, Training Loss: 0.00717
Epoch: 35907, Training Loss: 0.00876
Epoch: 35907, Training Loss: 0.00679
Epoch: 35908, Training Loss: 0.00767
Epoch: 35908, Training Loss: 0.00717
Epoch: 35908, Training Loss: 0.00876
Epoch: 35908, Training Loss: 0.00679
Epoch: 35909, Training Loss: 0.00767
Epoch: 35909, Training Loss: 0.00717
Epoch: 35909, Training Loss: 0.00876
Epoch: 35909, Training Loss: 0.00679
Epoch: 35910, Training Loss: 0.00767
Epoch: 35910, Training Loss: 0.00717
Epoch: 35910, Training Loss: 0.00876
Epoch: 35910, Training Loss: 0.00679
Epoch: 35911, Training Loss: 0.00767
Epoch: 35911, Training Loss: 0.00717
Epoch: 35911, Training Loss: 0.00876
Epoch: 35911, Training Loss: 0.00679
Epoch: 35912, Training Loss: 0.00767
Epoch: 35912, Training Loss: 0.00717
Epoch: 35912, Training Loss: 0.00876
Epoch: 35912, Training Loss: 0.00679
Epoch: 35913, Training Loss: 0.00767
Epoch: 35913, Training Loss: 0.00717
Epoch: 35913, Training Loss: 0.00876
Epoch: 35913, Training Loss: 0.00679
Epoch: 35914, Training Loss: 0.00767
Epoch: 35914, Training Loss: 0.00717
Epoch: 35914, Training Loss: 0.00876
Epoch: 35914, Training Loss: 0.00679
Epoch: 35915, Training Loss: 0.00767
Epoch: 35915, Training Loss: 0.00717
Epoch: 35915, Training Loss: 0.00876
Epoch: 35915, Training Loss: 0.00679
Epoch: 35916, Training Loss: 0.00767
Epoch: 35916, Training Loss: 0.00717
Epoch: 35916, Training Loss: 0.00876
Epoch: 35916, Training Loss: 0.00679
Epoch: 35917, Training Loss: 0.00767
Epoch: 35917, Training Loss: 0.00717
Epoch: 35917, Training Loss: 0.00876
Epoch: 35917, Training Loss: 0.00679
Epoch: 35918, Training Loss: 0.00767
Epoch: 35918, Training Loss: 0.00717
Epoch: 35918, Training Loss: 0.00876
Epoch: 35918, Training Loss: 0.00679
Epoch: 35919, Training Loss: 0.00767
Epoch: 35919, Training Loss: 0.00717
Epoch: 35919, Training Loss: 0.00876
Epoch: 35919, Training Loss: 0.00679
Epoch: 35920, Training Loss: 0.00767
Epoch: 35920, Training Loss: 0.00717
Epoch: 35920, Training Loss: 0.00876
Epoch: 35920, Training Loss: 0.00679
Epoch: 35921, Training Loss: 0.00767
Epoch: 35921, Training Loss: 0.00717
Epoch: 35921, Training Loss: 0.00876
Epoch: 35921, Training Loss: 0.00679
Epoch: 35922, Training Loss: 0.00767
Epoch: 35922, Training Loss: 0.00717
Epoch: 35922, Training Loss: 0.00876
Epoch: 35922, Training Loss: 0.00679
Epoch: 35923, Training Loss: 0.00767
Epoch: 35923, Training Loss: 0.00717
Epoch: 35923, Training Loss: 0.00876
Epoch: 35923, Training Loss: 0.00679
Epoch: 35924, Training Loss: 0.00767
Epoch: 35924, Training Loss: 0.00717
Epoch: 35924, Training Loss: 0.00876
Epoch: 35924, Training Loss: 0.00679
Epoch: 35925, Training Loss: 0.00767
Epoch: 35925, Training Loss: 0.00717
Epoch: 35925, Training Loss: 0.00876
Epoch: 35925, Training Loss: 0.00679
Epoch: 35926, Training Loss: 0.00767
Epoch: 35926, Training Loss: 0.00717
Epoch: 35926, Training Loss: 0.00876
Epoch: 35926, Training Loss: 0.00679
Epoch: 35927, Training Loss: 0.00767
Epoch: 35927, Training Loss: 0.00717
Epoch: 35927, Training Loss: 0.00876
Epoch: 35927, Training Loss: 0.00679
Epoch: 35928, Training Loss: 0.00767
Epoch: 35928, Training Loss: 0.00717
Epoch: 35928, Training Loss: 0.00876
Epoch: 35928, Training Loss: 0.00679
Epoch: 35929, Training Loss: 0.00767
Epoch: 35929, Training Loss: 0.00717
Epoch: 35929, Training Loss: 0.00876
Epoch: 35929, Training Loss: 0.00679
Epoch: 35930, Training Loss: 0.00767
Epoch: 35930, Training Loss: 0.00717
Epoch: 35930, Training Loss: 0.00876
Epoch: 35930, Training Loss: 0.00679
Epoch: 35931, Training Loss: 0.00767
Epoch: 35931, Training Loss: 0.00717
Epoch: 35931, Training Loss: 0.00876
Epoch: 35931, Training Loss: 0.00679
Epoch: 35932, Training Loss: 0.00767
Epoch: 35932, Training Loss: 0.00717
Epoch: 35932, Training Loss: 0.00876
Epoch: 35932, Training Loss: 0.00679
Epoch: 35933, Training Loss: 0.00767
Epoch: 35933, Training Loss: 0.00717
Epoch: 35933, Training Loss: 0.00876
Epoch: 35933, Training Loss: 0.00679
Epoch: 35934, Training Loss: 0.00767
Epoch: 35934, Training Loss: 0.00717
Epoch: 35934, Training Loss: 0.00876
Epoch: 35934, Training Loss: 0.00679
Epoch: 35935, Training Loss: 0.00767
Epoch: 35935, Training Loss: 0.00717
Epoch: 35935, Training Loss: 0.00876
Epoch: 35935, Training Loss: 0.00679
Epoch: 35936, Training Loss: 0.00767
Epoch: 35936, Training Loss: 0.00717
Epoch: 35936, Training Loss: 0.00876
Epoch: 35936, Training Loss: 0.00679
Epoch: 35937, Training Loss: 0.00766
Epoch: 35937, Training Loss: 0.00717
Epoch: 35937, Training Loss: 0.00876
Epoch: 35937, Training Loss: 0.00679
Epoch: 35938, Training Loss: 0.00766
Epoch: 35938, Training Loss: 0.00717
Epoch: 35938, Training Loss: 0.00876
Epoch: 35938, Training Loss: 0.00679
[... four per-pattern losses printed every epoch; epochs 35939-36180 omitted. Over this range each loss decreases by at most 0.00003 ...]
Epoch: 36181, Training Loss: 0.00764
Epoch: 36181, Training Loss: 0.00714
Epoch: 36181, Training Loss: 0.00873
Epoch: 36181, Training Loss: 0.00676
Epoch: 36182, Training Loss: 0.00764
Epoch: 36182, Training Loss: 0.00714
Epoch: 36182, Training Loss: 0.00872
Epoch: 36182, Training Loss: 0.00676
Epoch: 36183, Training Loss: 0.00764
Epoch: 36183, Training Loss: 0.00714
Epoch: 36183, Training Loss: 0.00872
Epoch: 36183, Training Loss: 0.00676
Epoch: 36184, Training Loss: 0.00764
Epoch: 36184, Training Loss: 0.00714
Epoch: 36184, Training Loss: 0.00872
Epoch: 36184, Training Loss: 0.00676
Epoch: 36185, Training Loss: 0.00764
Epoch: 36185, Training Loss: 0.00714
Epoch: 36185, Training Loss: 0.00872
Epoch: 36185, Training Loss: 0.00676
Epoch: 36186, Training Loss: 0.00764
Epoch: 36186, Training Loss: 0.00714
Epoch: 36186, Training Loss: 0.00872
Epoch: 36186, Training Loss: 0.00676
Epoch: 36187, Training Loss: 0.00764
Epoch: 36187, Training Loss: 0.00714
Epoch: 36187, Training Loss: 0.00872
Epoch: 36187, Training Loss: 0.00676
Epoch: 36188, Training Loss: 0.00764
Epoch: 36188, Training Loss: 0.00714
Epoch: 36188, Training Loss: 0.00872
Epoch: 36188, Training Loss: 0.00676
Epoch: 36189, Training Loss: 0.00764
Epoch: 36189, Training Loss: 0.00714
Epoch: 36189, Training Loss: 0.00872
Epoch: 36189, Training Loss: 0.00676
Epoch: 36190, Training Loss: 0.00764
Epoch: 36190, Training Loss: 0.00714
Epoch: 36190, Training Loss: 0.00872
Epoch: 36190, Training Loss: 0.00676
Epoch: 36191, Training Loss: 0.00764
Epoch: 36191, Training Loss: 0.00714
Epoch: 36191, Training Loss: 0.00872
Epoch: 36191, Training Loss: 0.00676
Epoch: 36192, Training Loss: 0.00764
Epoch: 36192, Training Loss: 0.00714
Epoch: 36192, Training Loss: 0.00872
Epoch: 36192, Training Loss: 0.00676
Epoch: 36193, Training Loss: 0.00764
Epoch: 36193, Training Loss: 0.00714
Epoch: 36193, Training Loss: 0.00872
Epoch: 36193, Training Loss: 0.00676
Epoch: 36194, Training Loss: 0.00764
Epoch: 36194, Training Loss: 0.00714
Epoch: 36194, Training Loss: 0.00872
Epoch: 36194, Training Loss: 0.00676
Epoch: 36195, Training Loss: 0.00764
Epoch: 36195, Training Loss: 0.00714
Epoch: 36195, Training Loss: 0.00872
Epoch: 36195, Training Loss: 0.00676
Epoch: 36196, Training Loss: 0.00764
Epoch: 36196, Training Loss: 0.00714
Epoch: 36196, Training Loss: 0.00872
Epoch: 36196, Training Loss: 0.00676
Epoch: 36197, Training Loss: 0.00764
Epoch: 36197, Training Loss: 0.00714
Epoch: 36197, Training Loss: 0.00872
Epoch: 36197, Training Loss: 0.00676
Epoch: 36198, Training Loss: 0.00764
Epoch: 36198, Training Loss: 0.00714
Epoch: 36198, Training Loss: 0.00872
Epoch: 36198, Training Loss: 0.00676
Epoch: 36199, Training Loss: 0.00764
Epoch: 36199, Training Loss: 0.00714
Epoch: 36199, Training Loss: 0.00872
Epoch: 36199, Training Loss: 0.00676
Epoch: 36200, Training Loss: 0.00764
Epoch: 36200, Training Loss: 0.00714
Epoch: 36200, Training Loss: 0.00872
Epoch: 36200, Training Loss: 0.00676
Epoch: 36201, Training Loss: 0.00764
Epoch: 36201, Training Loss: 0.00714
Epoch: 36201, Training Loss: 0.00872
Epoch: 36201, Training Loss: 0.00676
Epoch: 36202, Training Loss: 0.00764
Epoch: 36202, Training Loss: 0.00714
Epoch: 36202, Training Loss: 0.00872
Epoch: 36202, Training Loss: 0.00676
Epoch: 36203, Training Loss: 0.00764
Epoch: 36203, Training Loss: 0.00714
Epoch: 36203, Training Loss: 0.00872
Epoch: 36203, Training Loss: 0.00676
Epoch: 36204, Training Loss: 0.00764
Epoch: 36204, Training Loss: 0.00714
Epoch: 36204, Training Loss: 0.00872
Epoch: 36204, Training Loss: 0.00676
Epoch: 36205, Training Loss: 0.00763
Epoch: 36205, Training Loss: 0.00714
Epoch: 36205, Training Loss: 0.00872
Epoch: 36205, Training Loss: 0.00676
Epoch: 36206, Training Loss: 0.00763
Epoch: 36206, Training Loss: 0.00714
Epoch: 36206, Training Loss: 0.00872
Epoch: 36206, Training Loss: 0.00676
Epoch: 36207, Training Loss: 0.00763
Epoch: 36207, Training Loss: 0.00714
Epoch: 36207, Training Loss: 0.00872
Epoch: 36207, Training Loss: 0.00676
Epoch: 36208, Training Loss: 0.00763
Epoch: 36208, Training Loss: 0.00714
Epoch: 36208, Training Loss: 0.00872
Epoch: 36208, Training Loss: 0.00676
Epoch: 36209, Training Loss: 0.00763
Epoch: 36209, Training Loss: 0.00714
Epoch: 36209, Training Loss: 0.00872
Epoch: 36209, Training Loss: 0.00676
Epoch: 36210, Training Loss: 0.00763
Epoch: 36210, Training Loss: 0.00714
Epoch: 36210, Training Loss: 0.00872
Epoch: 36210, Training Loss: 0.00676
Epoch: 36211, Training Loss: 0.00763
Epoch: 36211, Training Loss: 0.00714
Epoch: 36211, Training Loss: 0.00872
Epoch: 36211, Training Loss: 0.00676
Epoch: 36212, Training Loss: 0.00763
Epoch: 36212, Training Loss: 0.00714
Epoch: 36212, Training Loss: 0.00872
Epoch: 36212, Training Loss: 0.00676
Epoch: 36213, Training Loss: 0.00763
Epoch: 36213, Training Loss: 0.00714
Epoch: 36213, Training Loss: 0.00872
Epoch: 36213, Training Loss: 0.00676
Epoch: 36214, Training Loss: 0.00763
Epoch: 36214, Training Loss: 0.00714
Epoch: 36214, Training Loss: 0.00872
Epoch: 36214, Training Loss: 0.00676
Epoch: 36215, Training Loss: 0.00763
Epoch: 36215, Training Loss: 0.00714
Epoch: 36215, Training Loss: 0.00872
Epoch: 36215, Training Loss: 0.00676
Epoch: 36216, Training Loss: 0.00763
Epoch: 36216, Training Loss: 0.00714
Epoch: 36216, Training Loss: 0.00872
Epoch: 36216, Training Loss: 0.00676
Epoch: 36217, Training Loss: 0.00763
Epoch: 36217, Training Loss: 0.00714
Epoch: 36217, Training Loss: 0.00872
Epoch: 36217, Training Loss: 0.00676
Epoch: 36218, Training Loss: 0.00763
Epoch: 36218, Training Loss: 0.00714
Epoch: 36218, Training Loss: 0.00872
Epoch: 36218, Training Loss: 0.00676
Epoch: 36219, Training Loss: 0.00763
Epoch: 36219, Training Loss: 0.00714
Epoch: 36219, Training Loss: 0.00872
Epoch: 36219, Training Loss: 0.00676
Epoch: 36220, Training Loss: 0.00763
Epoch: 36220, Training Loss: 0.00714
Epoch: 36220, Training Loss: 0.00872
Epoch: 36220, Training Loss: 0.00676
Epoch: 36221, Training Loss: 0.00763
Epoch: 36221, Training Loss: 0.00714
Epoch: 36221, Training Loss: 0.00872
Epoch: 36221, Training Loss: 0.00676
Epoch: 36222, Training Loss: 0.00763
Epoch: 36222, Training Loss: 0.00714
Epoch: 36222, Training Loss: 0.00872
Epoch: 36222, Training Loss: 0.00676
Epoch: 36223, Training Loss: 0.00763
Epoch: 36223, Training Loss: 0.00714
Epoch: 36223, Training Loss: 0.00872
Epoch: 36223, Training Loss: 0.00676
Epoch: 36224, Training Loss: 0.00763
Epoch: 36224, Training Loss: 0.00714
Epoch: 36224, Training Loss: 0.00872
Epoch: 36224, Training Loss: 0.00676
Epoch: 36225, Training Loss: 0.00763
Epoch: 36225, Training Loss: 0.00714
Epoch: 36225, Training Loss: 0.00872
Epoch: 36225, Training Loss: 0.00676
Epoch: 36226, Training Loss: 0.00763
Epoch: 36226, Training Loss: 0.00714
Epoch: 36226, Training Loss: 0.00872
Epoch: 36226, Training Loss: 0.00676
Epoch: 36227, Training Loss: 0.00763
Epoch: 36227, Training Loss: 0.00714
Epoch: 36227, Training Loss: 0.00872
Epoch: 36227, Training Loss: 0.00676
Epoch: 36228, Training Loss: 0.00763
Epoch: 36228, Training Loss: 0.00714
Epoch: 36228, Training Loss: 0.00872
Epoch: 36228, Training Loss: 0.00676
Epoch: 36229, Training Loss: 0.00763
Epoch: 36229, Training Loss: 0.00714
Epoch: 36229, Training Loss: 0.00872
Epoch: 36229, Training Loss: 0.00676
Epoch: 36230, Training Loss: 0.00763
Epoch: 36230, Training Loss: 0.00714
Epoch: 36230, Training Loss: 0.00872
Epoch: 36230, Training Loss: 0.00676
Epoch: 36231, Training Loss: 0.00763
Epoch: 36231, Training Loss: 0.00714
Epoch: 36231, Training Loss: 0.00872
Epoch: 36231, Training Loss: 0.00676
Epoch: 36232, Training Loss: 0.00763
Epoch: 36232, Training Loss: 0.00714
Epoch: 36232, Training Loss: 0.00872
Epoch: 36232, Training Loss: 0.00676
Epoch: 36233, Training Loss: 0.00763
Epoch: 36233, Training Loss: 0.00714
Epoch: 36233, Training Loss: 0.00872
Epoch: 36233, Training Loss: 0.00676
Epoch: 36234, Training Loss: 0.00763
Epoch: 36234, Training Loss: 0.00714
Epoch: 36234, Training Loss: 0.00872
Epoch: 36234, Training Loss: 0.00676
Epoch: 36235, Training Loss: 0.00763
Epoch: 36235, Training Loss: 0.00714
Epoch: 36235, Training Loss: 0.00872
Epoch: 36235, Training Loss: 0.00676
Epoch: 36236, Training Loss: 0.00763
Epoch: 36236, Training Loss: 0.00714
Epoch: 36236, Training Loss: 0.00872
Epoch: 36236, Training Loss: 0.00676
Epoch: 36237, Training Loss: 0.00763
Epoch: 36237, Training Loss: 0.00714
Epoch: 36237, Training Loss: 0.00872
Epoch: 36237, Training Loss: 0.00676
Epoch: 36238, Training Loss: 0.00763
Epoch: 36238, Training Loss: 0.00714
Epoch: 36238, Training Loss: 0.00872
Epoch: 36238, Training Loss: 0.00676
Epoch: 36239, Training Loss: 0.00763
Epoch: 36239, Training Loss: 0.00714
Epoch: 36239, Training Loss: 0.00872
Epoch: 36239, Training Loss: 0.00676
Epoch: 36240, Training Loss: 0.00763
Epoch: 36240, Training Loss: 0.00714
Epoch: 36240, Training Loss: 0.00872
Epoch: 36240, Training Loss: 0.00676
Epoch: 36241, Training Loss: 0.00763
Epoch: 36241, Training Loss: 0.00714
Epoch: 36241, Training Loss: 0.00872
Epoch: 36241, Training Loss: 0.00676
Epoch: 36242, Training Loss: 0.00763
Epoch: 36242, Training Loss: 0.00714
Epoch: 36242, Training Loss: 0.00872
Epoch: 36242, Training Loss: 0.00676
Epoch: 36243, Training Loss: 0.00763
Epoch: 36243, Training Loss: 0.00714
Epoch: 36243, Training Loss: 0.00872
Epoch: 36243, Training Loss: 0.00676
Epoch: 36244, Training Loss: 0.00763
Epoch: 36244, Training Loss: 0.00714
Epoch: 36244, Training Loss: 0.00872
Epoch: 36244, Training Loss: 0.00676
Epoch: 36245, Training Loss: 0.00763
Epoch: 36245, Training Loss: 0.00714
Epoch: 36245, Training Loss: 0.00872
Epoch: 36245, Training Loss: 0.00676
Epoch: 36246, Training Loss: 0.00763
Epoch: 36246, Training Loss: 0.00714
Epoch: 36246, Training Loss: 0.00872
Epoch: 36246, Training Loss: 0.00676
Epoch: 36247, Training Loss: 0.00763
Epoch: 36247, Training Loss: 0.00714
Epoch: 36247, Training Loss: 0.00872
Epoch: 36247, Training Loss: 0.00676
Epoch: 36248, Training Loss: 0.00763
Epoch: 36248, Training Loss: 0.00714
Epoch: 36248, Training Loss: 0.00872
Epoch: 36248, Training Loss: 0.00676
Epoch: 36249, Training Loss: 0.00763
Epoch: 36249, Training Loss: 0.00714
Epoch: 36249, Training Loss: 0.00872
Epoch: 36249, Training Loss: 0.00676
Epoch: 36250, Training Loss: 0.00763
Epoch: 36250, Training Loss: 0.00714
Epoch: 36250, Training Loss: 0.00872
Epoch: 36250, Training Loss: 0.00676
Epoch: 36251, Training Loss: 0.00763
Epoch: 36251, Training Loss: 0.00714
Epoch: 36251, Training Loss: 0.00872
Epoch: 36251, Training Loss: 0.00676
Epoch: 36252, Training Loss: 0.00763
Epoch: 36252, Training Loss: 0.00714
Epoch: 36252, Training Loss: 0.00872
Epoch: 36252, Training Loss: 0.00676
Epoch: 36253, Training Loss: 0.00763
Epoch: 36253, Training Loss: 0.00714
Epoch: 36253, Training Loss: 0.00872
Epoch: 36253, Training Loss: 0.00676
Epoch: 36254, Training Loss: 0.00763
Epoch: 36254, Training Loss: 0.00714
Epoch: 36254, Training Loss: 0.00872
Epoch: 36254, Training Loss: 0.00676
Epoch: 36255, Training Loss: 0.00763
Epoch: 36255, Training Loss: 0.00714
Epoch: 36255, Training Loss: 0.00872
Epoch: 36255, Training Loss: 0.00676
Epoch: 36256, Training Loss: 0.00763
Epoch: 36256, Training Loss: 0.00713
Epoch: 36256, Training Loss: 0.00872
Epoch: 36256, Training Loss: 0.00676
Epoch: 36257, Training Loss: 0.00763
Epoch: 36257, Training Loss: 0.00713
Epoch: 36257, Training Loss: 0.00872
Epoch: 36257, Training Loss: 0.00676
Epoch: 36258, Training Loss: 0.00763
Epoch: 36258, Training Loss: 0.00713
Epoch: 36258, Training Loss: 0.00872
Epoch: 36258, Training Loss: 0.00676
Epoch: 36259, Training Loss: 0.00763
Epoch: 36259, Training Loss: 0.00713
Epoch: 36259, Training Loss: 0.00872
Epoch: 36259, Training Loss: 0.00676
Epoch: 36260, Training Loss: 0.00763
Epoch: 36260, Training Loss: 0.00713
Epoch: 36260, Training Loss: 0.00871
Epoch: 36260, Training Loss: 0.00676
Epoch: 36261, Training Loss: 0.00763
Epoch: 36261, Training Loss: 0.00713
Epoch: 36261, Training Loss: 0.00871
Epoch: 36261, Training Loss: 0.00676
Epoch: 36262, Training Loss: 0.00763
Epoch: 36262, Training Loss: 0.00713
Epoch: 36262, Training Loss: 0.00871
Epoch: 36262, Training Loss: 0.00675
Epoch: 36263, Training Loss: 0.00763
Epoch: 36263, Training Loss: 0.00713
Epoch: 36263, Training Loss: 0.00871
Epoch: 36263, Training Loss: 0.00675
Epoch: 36264, Training Loss: 0.00763
Epoch: 36264, Training Loss: 0.00713
Epoch: 36264, Training Loss: 0.00871
Epoch: 36264, Training Loss: 0.00675
Epoch: 36265, Training Loss: 0.00763
Epoch: 36265, Training Loss: 0.00713
Epoch: 36265, Training Loss: 0.00871
Epoch: 36265, Training Loss: 0.00675
Epoch: 36266, Training Loss: 0.00763
Epoch: 36266, Training Loss: 0.00713
Epoch: 36266, Training Loss: 0.00871
Epoch: 36266, Training Loss: 0.00675
Epoch: 36267, Training Loss: 0.00763
Epoch: 36267, Training Loss: 0.00713
Epoch: 36267, Training Loss: 0.00871
Epoch: 36267, Training Loss: 0.00675
Epoch: 36268, Training Loss: 0.00763
Epoch: 36268, Training Loss: 0.00713
Epoch: 36268, Training Loss: 0.00871
Epoch: 36268, Training Loss: 0.00675
Epoch: 36269, Training Loss: 0.00763
Epoch: 36269, Training Loss: 0.00713
Epoch: 36269, Training Loss: 0.00871
Epoch: 36269, Training Loss: 0.00675
Epoch: 36270, Training Loss: 0.00763
Epoch: 36270, Training Loss: 0.00713
Epoch: 36270, Training Loss: 0.00871
Epoch: 36270, Training Loss: 0.00675
Epoch: 36271, Training Loss: 0.00763
Epoch: 36271, Training Loss: 0.00713
Epoch: 36271, Training Loss: 0.00871
Epoch: 36271, Training Loss: 0.00675
Epoch: 36272, Training Loss: 0.00763
Epoch: 36272, Training Loss: 0.00713
Epoch: 36272, Training Loss: 0.00871
Epoch: 36272, Training Loss: 0.00675
Epoch: 36273, Training Loss: 0.00763
Epoch: 36273, Training Loss: 0.00713
Epoch: 36273, Training Loss: 0.00871
Epoch: 36273, Training Loss: 0.00675
Epoch: 36274, Training Loss: 0.00763
Epoch: 36274, Training Loss: 0.00713
Epoch: 36274, Training Loss: 0.00871
Epoch: 36274, Training Loss: 0.00675
Epoch: 36275, Training Loss: 0.00763
Epoch: 36275, Training Loss: 0.00713
Epoch: 36275, Training Loss: 0.00871
Epoch: 36275, Training Loss: 0.00675
Epoch: 36276, Training Loss: 0.00763
Epoch: 36276, Training Loss: 0.00713
Epoch: 36276, Training Loss: 0.00871
Epoch: 36276, Training Loss: 0.00675
Epoch: 36277, Training Loss: 0.00763
Epoch: 36277, Training Loss: 0.00713
Epoch: 36277, Training Loss: 0.00871
Epoch: 36277, Training Loss: 0.00675
Epoch: 36278, Training Loss: 0.00763
Epoch: 36278, Training Loss: 0.00713
Epoch: 36278, Training Loss: 0.00871
Epoch: 36278, Training Loss: 0.00675
Epoch: 36279, Training Loss: 0.00763
Epoch: 36279, Training Loss: 0.00713
Epoch: 36279, Training Loss: 0.00871
Epoch: 36279, Training Loss: 0.00675
Epoch: 36280, Training Loss: 0.00763
Epoch: 36280, Training Loss: 0.00713
Epoch: 36280, Training Loss: 0.00871
Epoch: 36280, Training Loss: 0.00675
Epoch: 36281, Training Loss: 0.00763
Epoch: 36281, Training Loss: 0.00713
Epoch: 36281, Training Loss: 0.00871
Epoch: 36281, Training Loss: 0.00675
Epoch: 36282, Training Loss: 0.00763
Epoch: 36282, Training Loss: 0.00713
Epoch: 36282, Training Loss: 0.00871
Epoch: 36282, Training Loss: 0.00675
Epoch: 36283, Training Loss: 0.00763
Epoch: 36283, Training Loss: 0.00713
Epoch: 36283, Training Loss: 0.00871
Epoch: 36283, Training Loss: 0.00675
Epoch: 36284, Training Loss: 0.00763
Epoch: 36284, Training Loss: 0.00713
Epoch: 36284, Training Loss: 0.00871
Epoch: 36284, Training Loss: 0.00675
Epoch: 36285, Training Loss: 0.00763
Epoch: 36285, Training Loss: 0.00713
Epoch: 36285, Training Loss: 0.00871
Epoch: 36285, Training Loss: 0.00675
Epoch: 36286, Training Loss: 0.00763
Epoch: 36286, Training Loss: 0.00713
Epoch: 36286, Training Loss: 0.00871
Epoch: 36286, Training Loss: 0.00675
Epoch: 36287, Training Loss: 0.00763
Epoch: 36287, Training Loss: 0.00713
Epoch: 36287, Training Loss: 0.00871
Epoch: 36287, Training Loss: 0.00675
Epoch: 36288, Training Loss: 0.00763
Epoch: 36288, Training Loss: 0.00713
Epoch: 36288, Training Loss: 0.00871
Epoch: 36288, Training Loss: 0.00675
Epoch: 36289, Training Loss: 0.00763
Epoch: 36289, Training Loss: 0.00713
Epoch: 36289, Training Loss: 0.00871
Epoch: 36289, Training Loss: 0.00675
Epoch: 36290, Training Loss: 0.00763
Epoch: 36290, Training Loss: 0.00713
Epoch: 36290, Training Loss: 0.00871
Epoch: 36290, Training Loss: 0.00675
Epoch: 36291, Training Loss: 0.00763
Epoch: 36291, Training Loss: 0.00713
Epoch: 36291, Training Loss: 0.00871
Epoch: 36291, Training Loss: 0.00675
Epoch: 36292, Training Loss: 0.00763
Epoch: 36292, Training Loss: 0.00713
Epoch: 36292, Training Loss: 0.00871
Epoch: 36292, Training Loss: 0.00675
Epoch: 36293, Training Loss: 0.00763
Epoch: 36293, Training Loss: 0.00713
Epoch: 36293, Training Loss: 0.00871
Epoch: 36293, Training Loss: 0.00675
Epoch: 36294, Training Loss: 0.00762
Epoch: 36294, Training Loss: 0.00713
Epoch: 36294, Training Loss: 0.00871
Epoch: 36294, Training Loss: 0.00675
Epoch: 36295, Training Loss: 0.00762
Epoch: 36295, Training Loss: 0.00713
Epoch: 36295, Training Loss: 0.00871
Epoch: 36295, Training Loss: 0.00675
Epoch: 36296, Training Loss: 0.00762
Epoch: 36296, Training Loss: 0.00713
Epoch: 36296, Training Loss: 0.00871
Epoch: 36296, Training Loss: 0.00675
Epoch: 36297, Training Loss: 0.00762
Epoch: 36297, Training Loss: 0.00713
Epoch: 36297, Training Loss: 0.00871
Epoch: 36297, Training Loss: 0.00675
Epoch: 36298, Training Loss: 0.00762
Epoch: 36298, Training Loss: 0.00713
Epoch: 36298, Training Loss: 0.00871
Epoch: 36298, Training Loss: 0.00675
Epoch: 36299, Training Loss: 0.00762
Epoch: 36299, Training Loss: 0.00713
Epoch: 36299, Training Loss: 0.00871
Epoch: 36299, Training Loss: 0.00675
Epoch: 36300, Training Loss: 0.00762
Epoch: 36300, Training Loss: 0.00713
Epoch: 36300, Training Loss: 0.00871
Epoch: 36300, Training Loss: 0.00675
Epoch: 36301, Training Loss: 0.00762
Epoch: 36301, Training Loss: 0.00713
Epoch: 36301, Training Loss: 0.00871
Epoch: 36301, Training Loss: 0.00675
Epoch: 36302, Training Loss: 0.00762
Epoch: 36302, Training Loss: 0.00713
Epoch: 36302, Training Loss: 0.00871
Epoch: 36302, Training Loss: 0.00675
Epoch: 36303, Training Loss: 0.00762
Epoch: 36303, Training Loss: 0.00713
Epoch: 36303, Training Loss: 0.00871
Epoch: 36303, Training Loss: 0.00675
Epoch: 36304, Training Loss: 0.00762
Epoch: 36304, Training Loss: 0.00713
Epoch: 36304, Training Loss: 0.00871
Epoch: 36304, Training Loss: 0.00675
Epoch: 36305, Training Loss: 0.00762
Epoch: 36305, Training Loss: 0.00713
Epoch: 36305, Training Loss: 0.00871
Epoch: 36305, Training Loss: 0.00675
Epoch: 36306, Training Loss: 0.00762
Epoch: 36306, Training Loss: 0.00713
Epoch: 36306, Training Loss: 0.00871
Epoch: 36306, Training Loss: 0.00675
Epoch: 36307, Training Loss: 0.00762
Epoch: 36307, Training Loss: 0.00713
Epoch: 36307, Training Loss: 0.00871
Epoch: 36307, Training Loss: 0.00675
Epoch: 36308, Training Loss: 0.00762
Epoch: 36308, Training Loss: 0.00713
Epoch: 36308, Training Loss: 0.00871
Epoch: 36308, Training Loss: 0.00675
Epoch: 36309, Training Loss: 0.00762
Epoch: 36309, Training Loss: 0.00713
Epoch: 36309, Training Loss: 0.00871
Epoch: 36309, Training Loss: 0.00675
Epoch: 36310, Training Loss: 0.00762
Epoch: 36310, Training Loss: 0.00713
Epoch: 36310, Training Loss: 0.00871
Epoch: 36310, Training Loss: 0.00675
Epoch: 36311, Training Loss: 0.00762
Epoch: 36311, Training Loss: 0.00713
Epoch: 36311, Training Loss: 0.00871
Epoch: 36311, Training Loss: 0.00675
Epoch: 36312, Training Loss: 0.00762
Epoch: 36312, Training Loss: 0.00713
Epoch: 36312, Training Loss: 0.00871
Epoch: 36312, Training Loss: 0.00675
Epoch: 36313, Training Loss: 0.00762
Epoch: 36313, Training Loss: 0.00713
Epoch: 36313, Training Loss: 0.00871
Epoch: 36313, Training Loss: 0.00675
Epoch: 36314, Training Loss: 0.00762
Epoch: 36314, Training Loss: 0.00713
Epoch: 36314, Training Loss: 0.00871
Epoch: 36314, Training Loss: 0.00675
Epoch: 36315, Training Loss: 0.00762
Epoch: 36315, Training Loss: 0.00713
Epoch: 36315, Training Loss: 0.00871
Epoch: 36315, Training Loss: 0.00675
Epoch: 36316, Training Loss: 0.00762
Epoch: 36316, Training Loss: 0.00713
Epoch: 36316, Training Loss: 0.00871
Epoch: 36316, Training Loss: 0.00675
Epoch: 36317, Training Loss: 0.00762
Epoch: 36317, Training Loss: 0.00713
Epoch: 36317, Training Loss: 0.00871
Epoch: 36317, Training Loss: 0.00675
Epoch: 36318, Training Loss: 0.00762
Epoch: 36318, Training Loss: 0.00713
Epoch: 36318, Training Loss: 0.00871
Epoch: 36318, Training Loss: 0.00675
Epoch: 36319, Training Loss: 0.00762
Epoch: 36319, Training Loss: 0.00713
Epoch: 36319, Training Loss: 0.00871
Epoch: 36319, Training Loss: 0.00675
Epoch: 36320, Training Loss: 0.00762
Epoch: 36320, Training Loss: 0.00713
Epoch: 36320, Training Loss: 0.00871
Epoch: 36320, Training Loss: 0.00675
Epoch: 36321, Training Loss: 0.00762
Epoch: 36321, Training Loss: 0.00713
Epoch: 36321, Training Loss: 0.00871
Epoch: 36321, Training Loss: 0.00675
Epoch: 36322, Training Loss: 0.00762
Epoch: 36322, Training Loss: 0.00713
Epoch: 36322, Training Loss: 0.00871
Epoch: 36322, Training Loss: 0.00675
Epoch: 36323, Training Loss: 0.00762
Epoch: 36323, Training Loss: 0.00713
Epoch: 36323, Training Loss: 0.00871
Epoch: 36323, Training Loss: 0.00675
Epoch: 36324, Training Loss: 0.00762
Epoch: 36324, Training Loss: 0.00713
Epoch: 36324, Training Loss: 0.00871
Epoch: 36324, Training Loss: 0.00675
Epoch: 36325, Training Loss: 0.00762
Epoch: 36325, Training Loss: 0.00713
Epoch: 36325, Training Loss: 0.00871
Epoch: 36325, Training Loss: 0.00675
Epoch: 36326, Training Loss: 0.00762
Epoch: 36326, Training Loss: 0.00713
Epoch: 36326, Training Loss: 0.00871
Epoch: 36326, Training Loss: 0.00675
Epoch: 36327, Training Loss: 0.00762
Epoch: 36327, Training Loss: 0.00713
Epoch: 36327, Training Loss: 0.00871
Epoch: 36327, Training Loss: 0.00675
Epoch: 36328, Training Loss: 0.00762
Epoch: 36328, Training Loss: 0.00713
Epoch: 36328, Training Loss: 0.00871
Epoch: 36328, Training Loss: 0.00675
Epoch: 36329, Training Loss: 0.00762
Epoch: 36329, Training Loss: 0.00713
Epoch: 36329, Training Loss: 0.00871
Epoch: 36329, Training Loss: 0.00675
Epoch: 36330, Training Loss: 0.00762
Epoch: 36330, Training Loss: 0.00713
Epoch: 36330, Training Loss: 0.00871
Epoch: 36330, Training Loss: 0.00675
Epoch: 36331, Training Loss: 0.00762
Epoch: 36331, Training Loss: 0.00713
Epoch: 36331, Training Loss: 0.00871
Epoch: 36331, Training Loss: 0.00675
Epoch: 36332, Training Loss: 0.00762
Epoch: 36332, Training Loss: 0.00713
Epoch: 36332, Training Loss: 0.00871
Epoch: 36332, Training Loss: 0.00675
Epoch: 36333, Training Loss: 0.00762
Epoch: 36333, Training Loss: 0.00713
Epoch: 36333, Training Loss: 0.00871
Epoch: 36333, Training Loss: 0.00675
Epoch: 36334, Training Loss: 0.00762
Epoch: 36334, Training Loss: 0.00713
Epoch: 36334, Training Loss: 0.00871
Epoch: 36334, Training Loss: 0.00675
Epoch: 36335, Training Loss: 0.00762
Epoch: 36335, Training Loss: 0.00713
Epoch: 36335, Training Loss: 0.00871
Epoch: 36335, Training Loss: 0.00675
Epoch: 36336, Training Loss: 0.00762
Epoch: 36336, Training Loss: 0.00713
Epoch: 36336, Training Loss: 0.00871
Epoch: 36336, Training Loss: 0.00675
Epoch: 36337, Training Loss: 0.00762
Epoch: 36337, Training Loss: 0.00713
Epoch: 36337, Training Loss: 0.00871
Epoch: 36337, Training Loss: 0.00675
Epoch: 36338, Training Loss: 0.00762
Epoch: 36338, Training Loss: 0.00713
Epoch: 36338, Training Loss: 0.00871
Epoch: 36338, Training Loss: 0.00675
Epoch: 36339, Training Loss: 0.00762
Epoch: 36339, Training Loss: 0.00713
Epoch: 36339, Training Loss: 0.00871
Epoch: 36339, Training Loss: 0.00675
Epoch: 36340, Training Loss: 0.00762
Epoch: 36340, Training Loss: 0.00713
Epoch: 36340, Training Loss: 0.00870
Epoch: 36340, Training Loss: 0.00675
Epoch: 36341, Training Loss: 0.00762
Epoch: 36341, Training Loss: 0.00713
Epoch: 36341, Training Loss: 0.00870
Epoch: 36341, Training Loss: 0.00675
Epoch: 36342, Training Loss: 0.00762
Epoch: 36342, Training Loss: 0.00713
Epoch: 36342, Training Loss: 0.00870
Epoch: 36342, Training Loss: 0.00675
Epoch: 36343, Training Loss: 0.00762
Epoch: 36343, Training Loss: 0.00713
Epoch: 36343, Training Loss: 0.00870
Epoch: 36343, Training Loss: 0.00675
Epoch: 36344, Training Loss: 0.00762
Epoch: 36344, Training Loss: 0.00713
Epoch: 36344, Training Loss: 0.00870
Epoch: 36344, Training Loss: 0.00675
Epoch: 36345, Training Loss: 0.00762
Epoch: 36345, Training Loss: 0.00713
Epoch: 36345, Training Loss: 0.00870
Epoch: 36345, Training Loss: 0.00675
Epoch: 36346, Training Loss: 0.00762
Epoch: 36346, Training Loss: 0.00713
Epoch: 36346, Training Loss: 0.00870
Epoch: 36346, Training Loss: 0.00675
Epoch: 36347, Training Loss: 0.00762
Epoch: 36347, Training Loss: 0.00713
Epoch: 36347, Training Loss: 0.00870
Epoch: 36347, Training Loss: 0.00675
Epoch: 36348, Training Loss: 0.00762
Epoch: 36348, Training Loss: 0.00713
Epoch: 36348, Training Loss: 0.00870
Epoch: 36348, Training Loss: 0.00675
Epoch: 36349, Training Loss: 0.00762
Epoch: 36349, Training Loss: 0.00713
Epoch: 36349, Training Loss: 0.00870
Epoch: 36349, Training Loss: 0.00675
Epoch: 36350, Training Loss: 0.00762
Epoch: 36350, Training Loss: 0.00713
Epoch: 36350, Training Loss: 0.00870
Epoch: 36350, Training Loss: 0.00675
Epoch: 36351, Training Loss: 0.00762
Epoch: 36351, Training Loss: 0.00713
Epoch: 36351, Training Loss: 0.00870
Epoch: 36351, Training Loss: 0.00675
Epoch: 36352, Training Loss: 0.00762
Epoch: 36352, Training Loss: 0.00713
Epoch: 36352, Training Loss: 0.00870
Epoch: 36352, Training Loss: 0.00675
Epoch: 36353, Training Loss: 0.00762
Epoch: 36353, Training Loss: 0.00713
Epoch: 36353, Training Loss: 0.00870
Epoch: 36353, Training Loss: 0.00675
Epoch: 36354, Training Loss: 0.00762
Epoch: 36354, Training Loss: 0.00712
Epoch: 36354, Training Loss: 0.00870
Epoch: 36354, Training Loss: 0.00675
Epoch: 36355, Training Loss: 0.00762
Epoch: 36355, Training Loss: 0.00712
Epoch: 36355, Training Loss: 0.00870
Epoch: 36355, Training Loss: 0.00675
Epoch: 36356, Training Loss: 0.00762
Epoch: 36356, Training Loss: 0.00712
Epoch: 36356, Training Loss: 0.00870
Epoch: 36356, Training Loss: 0.00675
Epoch: 36357, Training Loss: 0.00762
Epoch: 36357, Training Loss: 0.00712
Epoch: 36357, Training Loss: 0.00870
Epoch: 36357, Training Loss: 0.00675
Epoch: 36358, Training Loss: 0.00762
Epoch: 36358, Training Loss: 0.00712
Epoch: 36358, Training Loss: 0.00870
Epoch: 36358, Training Loss: 0.00675
Epoch: 36359, Training Loss: 0.00762
Epoch: 36359, Training Loss: 0.00712
Epoch: 36359, Training Loss: 0.00870
Epoch: 36359, Training Loss: 0.00675
Epoch: 36360, Training Loss: 0.00762
Epoch: 36360, Training Loss: 0.00712
Epoch: 36360, Training Loss: 0.00870
Epoch: 36360, Training Loss: 0.00675
Epoch: 36361, Training Loss: 0.00762
Epoch: 36361, Training Loss: 0.00712
Epoch: 36361, Training Loss: 0.00870
Epoch: 36361, Training Loss: 0.00675
Epoch: 36362, Training Loss: 0.00762
Epoch: 36362, Training Loss: 0.00712
Epoch: 36362, Training Loss: 0.00870
Epoch: 36362, Training Loss: 0.00675
Epoch: 36363, Training Loss: 0.00762
Epoch: 36363, Training Loss: 0.00712
Epoch: 36363, Training Loss: 0.00870
Epoch: 36363, Training Loss: 0.00675
Epoch: 36364, Training Loss: 0.00762
Epoch: 36364, Training Loss: 0.00712
Epoch: 36364, Training Loss: 0.00870
Epoch: 36364, Training Loss: 0.00675
Epoch: 36365, Training Loss: 0.00762
Epoch: 36365, Training Loss: 0.00712
Epoch: 36365, Training Loss: 0.00870
Epoch: 36365, Training Loss: 0.00674
Epoch: 36366, Training Loss: 0.00762
Epoch: 36366, Training Loss: 0.00712
Epoch: 36366, Training Loss: 0.00870
Epoch: 36366, Training Loss: 0.00674
Epoch: 36367, Training Loss: 0.00762
Epoch: 36367, Training Loss: 0.00712
Epoch: 36367, Training Loss: 0.00870
Epoch: 36367, Training Loss: 0.00674
Epoch: 36368, Training Loss: 0.00762
Epoch: 36368, Training Loss: 0.00712
Epoch: 36368, Training Loss: 0.00870
Epoch: 36368, Training Loss: 0.00674
Epoch: 36369, Training Loss: 0.00762
Epoch: 36369, Training Loss: 0.00712
Epoch: 36369, Training Loss: 0.00870
Epoch: 36369, Training Loss: 0.00674
Epoch: 36370, Training Loss: 0.00762
Epoch: 36370, Training Loss: 0.00712
Epoch: 36370, Training Loss: 0.00870
Epoch: 36370, Training Loss: 0.00674
Epoch: 36371, Training Loss: 0.00762
Epoch: 36371, Training Loss: 0.00712
Epoch: 36371, Training Loss: 0.00870
Epoch: 36371, Training Loss: 0.00674
...
[output truncated: the loss for each of the four XOR samples is printed every epoch and decreases very slowly, from about (0.00762, 0.00712, 0.00870, 0.00674) at epoch 36371 to about (0.00759, 0.00710, 0.00867, 0.00672) by epoch 36614]
...
Epoch: 36613, Training Loss: 0.00868
Epoch: 36613, Training Loss: 0.00672
Epoch: 36614, Training Loss: 0.00759
Epoch: 36614, Training Loss: 0.00710
Epoch: 36614, Training Loss: 0.00867
Epoch: 36614, Training Loss: 0.00672
Epoch: 36615, Training Loss: 0.00759
Epoch: 36615, Training Loss: 0.00710
Epoch: 36615, Training Loss: 0.00867
Epoch: 36615, Training Loss: 0.00672
Epoch: 36616, Training Loss: 0.00759
Epoch: 36616, Training Loss: 0.00710
Epoch: 36616, Training Loss: 0.00867
Epoch: 36616, Training Loss: 0.00672
Epoch: 36617, Training Loss: 0.00759
Epoch: 36617, Training Loss: 0.00710
Epoch: 36617, Training Loss: 0.00867
Epoch: 36617, Training Loss: 0.00672
Epoch: 36618, Training Loss: 0.00759
Epoch: 36618, Training Loss: 0.00710
Epoch: 36618, Training Loss: 0.00867
Epoch: 36618, Training Loss: 0.00672
Epoch: 36619, Training Loss: 0.00759
Epoch: 36619, Training Loss: 0.00710
Epoch: 36619, Training Loss: 0.00867
Epoch: 36619, Training Loss: 0.00672
Epoch: 36620, Training Loss: 0.00759
Epoch: 36620, Training Loss: 0.00710
Epoch: 36620, Training Loss: 0.00867
Epoch: 36620, Training Loss: 0.00672
Epoch: 36621, Training Loss: 0.00759
Epoch: 36621, Training Loss: 0.00710
Epoch: 36621, Training Loss: 0.00867
Epoch: 36621, Training Loss: 0.00672
Epoch: 36622, Training Loss: 0.00759
Epoch: 36622, Training Loss: 0.00710
Epoch: 36622, Training Loss: 0.00867
Epoch: 36622, Training Loss: 0.00672
Epoch: 36623, Training Loss: 0.00759
Epoch: 36623, Training Loss: 0.00710
Epoch: 36623, Training Loss: 0.00867
Epoch: 36623, Training Loss: 0.00672
Epoch: 36624, Training Loss: 0.00759
Epoch: 36624, Training Loss: 0.00710
Epoch: 36624, Training Loss: 0.00867
Epoch: 36624, Training Loss: 0.00672
Epoch: 36625, Training Loss: 0.00759
Epoch: 36625, Training Loss: 0.00710
Epoch: 36625, Training Loss: 0.00867
Epoch: 36625, Training Loss: 0.00672
Epoch: 36626, Training Loss: 0.00759
Epoch: 36626, Training Loss: 0.00710
Epoch: 36626, Training Loss: 0.00867
Epoch: 36626, Training Loss: 0.00672
Epoch: 36627, Training Loss: 0.00759
Epoch: 36627, Training Loss: 0.00710
Epoch: 36627, Training Loss: 0.00867
Epoch: 36627, Training Loss: 0.00672
Epoch: 36628, Training Loss: 0.00759
Epoch: 36628, Training Loss: 0.00710
Epoch: 36628, Training Loss: 0.00867
Epoch: 36628, Training Loss: 0.00672
Epoch: 36629, Training Loss: 0.00759
Epoch: 36629, Training Loss: 0.00710
Epoch: 36629, Training Loss: 0.00867
Epoch: 36629, Training Loss: 0.00672
Epoch: 36630, Training Loss: 0.00759
Epoch: 36630, Training Loss: 0.00710
Epoch: 36630, Training Loss: 0.00867
Epoch: 36630, Training Loss: 0.00672
Epoch: 36631, Training Loss: 0.00759
Epoch: 36631, Training Loss: 0.00710
Epoch: 36631, Training Loss: 0.00867
Epoch: 36631, Training Loss: 0.00672
Epoch: 36632, Training Loss: 0.00759
Epoch: 36632, Training Loss: 0.00710
Epoch: 36632, Training Loss: 0.00867
Epoch: 36632, Training Loss: 0.00672
Epoch: 36633, Training Loss: 0.00759
Epoch: 36633, Training Loss: 0.00710
Epoch: 36633, Training Loss: 0.00867
Epoch: 36633, Training Loss: 0.00672
Epoch: 36634, Training Loss: 0.00759
Epoch: 36634, Training Loss: 0.00710
Epoch: 36634, Training Loss: 0.00867
Epoch: 36634, Training Loss: 0.00672
Epoch: 36635, Training Loss: 0.00759
Epoch: 36635, Training Loss: 0.00710
Epoch: 36635, Training Loss: 0.00867
Epoch: 36635, Training Loss: 0.00672
Epoch: 36636, Training Loss: 0.00759
Epoch: 36636, Training Loss: 0.00710
Epoch: 36636, Training Loss: 0.00867
Epoch: 36636, Training Loss: 0.00672
Epoch: 36637, Training Loss: 0.00759
Epoch: 36637, Training Loss: 0.00710
Epoch: 36637, Training Loss: 0.00867
Epoch: 36637, Training Loss: 0.00672
Epoch: 36638, Training Loss: 0.00759
Epoch: 36638, Training Loss: 0.00710
Epoch: 36638, Training Loss: 0.00867
Epoch: 36638, Training Loss: 0.00672
Epoch: 36639, Training Loss: 0.00759
Epoch: 36639, Training Loss: 0.00710
Epoch: 36639, Training Loss: 0.00867
Epoch: 36639, Training Loss: 0.00672
Epoch: 36640, Training Loss: 0.00759
Epoch: 36640, Training Loss: 0.00710
Epoch: 36640, Training Loss: 0.00867
Epoch: 36640, Training Loss: 0.00672
Epoch: 36641, Training Loss: 0.00759
Epoch: 36641, Training Loss: 0.00710
Epoch: 36641, Training Loss: 0.00867
Epoch: 36641, Training Loss: 0.00672
Epoch: 36642, Training Loss: 0.00759
Epoch: 36642, Training Loss: 0.00710
Epoch: 36642, Training Loss: 0.00867
Epoch: 36642, Training Loss: 0.00672
Epoch: 36643, Training Loss: 0.00759
Epoch: 36643, Training Loss: 0.00710
Epoch: 36643, Training Loss: 0.00867
Epoch: 36643, Training Loss: 0.00672
Epoch: 36644, Training Loss: 0.00759
Epoch: 36644, Training Loss: 0.00710
Epoch: 36644, Training Loss: 0.00867
Epoch: 36644, Training Loss: 0.00672
Epoch: 36645, Training Loss: 0.00759
Epoch: 36645, Training Loss: 0.00710
Epoch: 36645, Training Loss: 0.00867
Epoch: 36645, Training Loss: 0.00672
Epoch: 36646, Training Loss: 0.00759
Epoch: 36646, Training Loss: 0.00710
Epoch: 36646, Training Loss: 0.00867
Epoch: 36646, Training Loss: 0.00672
Epoch: 36647, Training Loss: 0.00759
Epoch: 36647, Training Loss: 0.00710
Epoch: 36647, Training Loss: 0.00867
Epoch: 36647, Training Loss: 0.00672
Epoch: 36648, Training Loss: 0.00759
Epoch: 36648, Training Loss: 0.00710
Epoch: 36648, Training Loss: 0.00867
Epoch: 36648, Training Loss: 0.00672
Epoch: 36649, Training Loss: 0.00759
Epoch: 36649, Training Loss: 0.00710
Epoch: 36649, Training Loss: 0.00867
Epoch: 36649, Training Loss: 0.00672
Epoch: 36650, Training Loss: 0.00759
Epoch: 36650, Training Loss: 0.00709
Epoch: 36650, Training Loss: 0.00867
Epoch: 36650, Training Loss: 0.00672
Epoch: 36651, Training Loss: 0.00759
Epoch: 36651, Training Loss: 0.00709
Epoch: 36651, Training Loss: 0.00867
Epoch: 36651, Training Loss: 0.00672
Epoch: 36652, Training Loss: 0.00759
Epoch: 36652, Training Loss: 0.00709
Epoch: 36652, Training Loss: 0.00867
Epoch: 36652, Training Loss: 0.00672
Epoch: 36653, Training Loss: 0.00759
Epoch: 36653, Training Loss: 0.00709
Epoch: 36653, Training Loss: 0.00867
Epoch: 36653, Training Loss: 0.00672
Epoch: 36654, Training Loss: 0.00759
Epoch: 36654, Training Loss: 0.00709
Epoch: 36654, Training Loss: 0.00867
Epoch: 36654, Training Loss: 0.00672
Epoch: 36655, Training Loss: 0.00759
Epoch: 36655, Training Loss: 0.00709
Epoch: 36655, Training Loss: 0.00867
Epoch: 36655, Training Loss: 0.00672
Epoch: 36656, Training Loss: 0.00759
Epoch: 36656, Training Loss: 0.00709
Epoch: 36656, Training Loss: 0.00867
Epoch: 36656, Training Loss: 0.00672
Epoch: 36657, Training Loss: 0.00759
Epoch: 36657, Training Loss: 0.00709
Epoch: 36657, Training Loss: 0.00867
Epoch: 36657, Training Loss: 0.00672
Epoch: 36658, Training Loss: 0.00758
Epoch: 36658, Training Loss: 0.00709
Epoch: 36658, Training Loss: 0.00867
Epoch: 36658, Training Loss: 0.00672
Epoch: 36659, Training Loss: 0.00758
Epoch: 36659, Training Loss: 0.00709
Epoch: 36659, Training Loss: 0.00866
Epoch: 36659, Training Loss: 0.00672
Epoch: 36660, Training Loss: 0.00758
Epoch: 36660, Training Loss: 0.00709
Epoch: 36660, Training Loss: 0.00866
Epoch: 36660, Training Loss: 0.00672
Epoch: 36661, Training Loss: 0.00758
Epoch: 36661, Training Loss: 0.00709
Epoch: 36661, Training Loss: 0.00866
Epoch: 36661, Training Loss: 0.00672
Epoch: 36662, Training Loss: 0.00758
Epoch: 36662, Training Loss: 0.00709
Epoch: 36662, Training Loss: 0.00866
Epoch: 36662, Training Loss: 0.00672
Epoch: 36663, Training Loss: 0.00758
Epoch: 36663, Training Loss: 0.00709
Epoch: 36663, Training Loss: 0.00866
Epoch: 36663, Training Loss: 0.00672
Epoch: 36664, Training Loss: 0.00758
Epoch: 36664, Training Loss: 0.00709
Epoch: 36664, Training Loss: 0.00866
Epoch: 36664, Training Loss: 0.00672
Epoch: 36665, Training Loss: 0.00758
Epoch: 36665, Training Loss: 0.00709
Epoch: 36665, Training Loss: 0.00866
Epoch: 36665, Training Loss: 0.00672
Epoch: 36666, Training Loss: 0.00758
Epoch: 36666, Training Loss: 0.00709
Epoch: 36666, Training Loss: 0.00866
Epoch: 36666, Training Loss: 0.00672
Epoch: 36667, Training Loss: 0.00758
Epoch: 36667, Training Loss: 0.00709
Epoch: 36667, Training Loss: 0.00866
Epoch: 36667, Training Loss: 0.00672
Epoch: 36668, Training Loss: 0.00758
Epoch: 36668, Training Loss: 0.00709
Epoch: 36668, Training Loss: 0.00866
Epoch: 36668, Training Loss: 0.00672
Epoch: 36669, Training Loss: 0.00758
Epoch: 36669, Training Loss: 0.00709
Epoch: 36669, Training Loss: 0.00866
Epoch: 36669, Training Loss: 0.00672
Epoch: 36670, Training Loss: 0.00758
Epoch: 36670, Training Loss: 0.00709
Epoch: 36670, Training Loss: 0.00866
Epoch: 36670, Training Loss: 0.00672
Epoch: 36671, Training Loss: 0.00758
Epoch: 36671, Training Loss: 0.00709
Epoch: 36671, Training Loss: 0.00866
Epoch: 36671, Training Loss: 0.00672
Epoch: 36672, Training Loss: 0.00758
Epoch: 36672, Training Loss: 0.00709
Epoch: 36672, Training Loss: 0.00866
Epoch: 36672, Training Loss: 0.00672
Epoch: 36673, Training Loss: 0.00758
Epoch: 36673, Training Loss: 0.00709
Epoch: 36673, Training Loss: 0.00866
Epoch: 36673, Training Loss: 0.00672
Epoch: 36674, Training Loss: 0.00758
Epoch: 36674, Training Loss: 0.00709
Epoch: 36674, Training Loss: 0.00866
Epoch: 36674, Training Loss: 0.00672
Epoch: 36675, Training Loss: 0.00758
Epoch: 36675, Training Loss: 0.00709
Epoch: 36675, Training Loss: 0.00866
Epoch: 36675, Training Loss: 0.00672
Epoch: 36676, Training Loss: 0.00758
Epoch: 36676, Training Loss: 0.00709
Epoch: 36676, Training Loss: 0.00866
Epoch: 36676, Training Loss: 0.00672
Epoch: 36677, Training Loss: 0.00758
Epoch: 36677, Training Loss: 0.00709
Epoch: 36677, Training Loss: 0.00866
Epoch: 36677, Training Loss: 0.00672
Epoch: 36678, Training Loss: 0.00758
Epoch: 36678, Training Loss: 0.00709
Epoch: 36678, Training Loss: 0.00866
Epoch: 36678, Training Loss: 0.00671
Epoch: 36679, Training Loss: 0.00758
Epoch: 36679, Training Loss: 0.00709
Epoch: 36679, Training Loss: 0.00866
Epoch: 36679, Training Loss: 0.00671
Epoch: 36680, Training Loss: 0.00758
Epoch: 36680, Training Loss: 0.00709
Epoch: 36680, Training Loss: 0.00866
Epoch: 36680, Training Loss: 0.00671
Epoch: 36681, Training Loss: 0.00758
Epoch: 36681, Training Loss: 0.00709
Epoch: 36681, Training Loss: 0.00866
Epoch: 36681, Training Loss: 0.00671
Epoch: 36682, Training Loss: 0.00758
Epoch: 36682, Training Loss: 0.00709
Epoch: 36682, Training Loss: 0.00866
Epoch: 36682, Training Loss: 0.00671
Epoch: 36683, Training Loss: 0.00758
Epoch: 36683, Training Loss: 0.00709
Epoch: 36683, Training Loss: 0.00866
Epoch: 36683, Training Loss: 0.00671
Epoch: 36684, Training Loss: 0.00758
Epoch: 36684, Training Loss: 0.00709
Epoch: 36684, Training Loss: 0.00866
Epoch: 36684, Training Loss: 0.00671
Epoch: 36685, Training Loss: 0.00758
Epoch: 36685, Training Loss: 0.00709
Epoch: 36685, Training Loss: 0.00866
Epoch: 36685, Training Loss: 0.00671
Epoch: 36686, Training Loss: 0.00758
Epoch: 36686, Training Loss: 0.00709
Epoch: 36686, Training Loss: 0.00866
Epoch: 36686, Training Loss: 0.00671
Epoch: 36687, Training Loss: 0.00758
Epoch: 36687, Training Loss: 0.00709
Epoch: 36687, Training Loss: 0.00866
Epoch: 36687, Training Loss: 0.00671
Epoch: 36688, Training Loss: 0.00758
Epoch: 36688, Training Loss: 0.00709
Epoch: 36688, Training Loss: 0.00866
Epoch: 36688, Training Loss: 0.00671
Epoch: 36689, Training Loss: 0.00758
Epoch: 36689, Training Loss: 0.00709
Epoch: 36689, Training Loss: 0.00866
Epoch: 36689, Training Loss: 0.00671
Epoch: 36690, Training Loss: 0.00758
Epoch: 36690, Training Loss: 0.00709
Epoch: 36690, Training Loss: 0.00866
Epoch: 36690, Training Loss: 0.00671
Epoch: 36691, Training Loss: 0.00758
Epoch: 36691, Training Loss: 0.00709
Epoch: 36691, Training Loss: 0.00866
Epoch: 36691, Training Loss: 0.00671
Epoch: 36692, Training Loss: 0.00758
Epoch: 36692, Training Loss: 0.00709
Epoch: 36692, Training Loss: 0.00866
Epoch: 36692, Training Loss: 0.00671
Epoch: 36693, Training Loss: 0.00758
Epoch: 36693, Training Loss: 0.00709
Epoch: 36693, Training Loss: 0.00866
Epoch: 36693, Training Loss: 0.00671
Epoch: 36694, Training Loss: 0.00758
Epoch: 36694, Training Loss: 0.00709
Epoch: 36694, Training Loss: 0.00866
Epoch: 36694, Training Loss: 0.00671
Epoch: 36695, Training Loss: 0.00758
Epoch: 36695, Training Loss: 0.00709
Epoch: 36695, Training Loss: 0.00866
Epoch: 36695, Training Loss: 0.00671
Epoch: 36696, Training Loss: 0.00758
Epoch: 36696, Training Loss: 0.00709
Epoch: 36696, Training Loss: 0.00866
Epoch: 36696, Training Loss: 0.00671
Epoch: 36697, Training Loss: 0.00758
Epoch: 36697, Training Loss: 0.00709
Epoch: 36697, Training Loss: 0.00866
Epoch: 36697, Training Loss: 0.00671
Epoch: 36698, Training Loss: 0.00758
Epoch: 36698, Training Loss: 0.00709
Epoch: 36698, Training Loss: 0.00866
Epoch: 36698, Training Loss: 0.00671
Epoch: 36699, Training Loss: 0.00758
Epoch: 36699, Training Loss: 0.00709
Epoch: 36699, Training Loss: 0.00866
Epoch: 36699, Training Loss: 0.00671
Epoch: 36700, Training Loss: 0.00758
Epoch: 36700, Training Loss: 0.00709
Epoch: 36700, Training Loss: 0.00866
Epoch: 36700, Training Loss: 0.00671
Epoch: 36701, Training Loss: 0.00758
Epoch: 36701, Training Loss: 0.00709
Epoch: 36701, Training Loss: 0.00866
Epoch: 36701, Training Loss: 0.00671
Epoch: 36702, Training Loss: 0.00758
Epoch: 36702, Training Loss: 0.00709
Epoch: 36702, Training Loss: 0.00866
Epoch: 36702, Training Loss: 0.00671
Epoch: 36703, Training Loss: 0.00758
Epoch: 36703, Training Loss: 0.00709
Epoch: 36703, Training Loss: 0.00866
Epoch: 36703, Training Loss: 0.00671
Epoch: 36704, Training Loss: 0.00758
Epoch: 36704, Training Loss: 0.00709
Epoch: 36704, Training Loss: 0.00866
Epoch: 36704, Training Loss: 0.00671
Epoch: 36705, Training Loss: 0.00758
Epoch: 36705, Training Loss: 0.00709
Epoch: 36705, Training Loss: 0.00866
Epoch: 36705, Training Loss: 0.00671
Epoch: 36706, Training Loss: 0.00758
Epoch: 36706, Training Loss: 0.00709
Epoch: 36706, Training Loss: 0.00866
Epoch: 36706, Training Loss: 0.00671
Epoch: 36707, Training Loss: 0.00758
Epoch: 36707, Training Loss: 0.00709
Epoch: 36707, Training Loss: 0.00866
Epoch: 36707, Training Loss: 0.00671
Epoch: 36708, Training Loss: 0.00758
Epoch: 36708, Training Loss: 0.00709
Epoch: 36708, Training Loss: 0.00866
Epoch: 36708, Training Loss: 0.00671
Epoch: 36709, Training Loss: 0.00758
Epoch: 36709, Training Loss: 0.00709
Epoch: 36709, Training Loss: 0.00866
Epoch: 36709, Training Loss: 0.00671
Epoch: 36710, Training Loss: 0.00758
Epoch: 36710, Training Loss: 0.00709
Epoch: 36710, Training Loss: 0.00866
Epoch: 36710, Training Loss: 0.00671
Epoch: 36711, Training Loss: 0.00758
Epoch: 36711, Training Loss: 0.00709
Epoch: 36711, Training Loss: 0.00866
Epoch: 36711, Training Loss: 0.00671
Epoch: 36712, Training Loss: 0.00758
Epoch: 36712, Training Loss: 0.00709
Epoch: 36712, Training Loss: 0.00866
Epoch: 36712, Training Loss: 0.00671
Epoch: 36713, Training Loss: 0.00758
Epoch: 36713, Training Loss: 0.00709
Epoch: 36713, Training Loss: 0.00866
Epoch: 36713, Training Loss: 0.00671
Epoch: 36714, Training Loss: 0.00758
Epoch: 36714, Training Loss: 0.00709
Epoch: 36714, Training Loss: 0.00866
Epoch: 36714, Training Loss: 0.00671
Epoch: 36715, Training Loss: 0.00758
Epoch: 36715, Training Loss: 0.00709
Epoch: 36715, Training Loss: 0.00866
Epoch: 36715, Training Loss: 0.00671
Epoch: 36716, Training Loss: 0.00758
Epoch: 36716, Training Loss: 0.00709
Epoch: 36716, Training Loss: 0.00866
Epoch: 36716, Training Loss: 0.00671
Epoch: 36717, Training Loss: 0.00758
Epoch: 36717, Training Loss: 0.00709
Epoch: 36717, Training Loss: 0.00866
Epoch: 36717, Training Loss: 0.00671
Epoch: 36718, Training Loss: 0.00758
Epoch: 36718, Training Loss: 0.00709
Epoch: 36718, Training Loss: 0.00866
Epoch: 36718, Training Loss: 0.00671
Epoch: 36719, Training Loss: 0.00758
Epoch: 36719, Training Loss: 0.00709
Epoch: 36719, Training Loss: 0.00866
Epoch: 36719, Training Loss: 0.00671
Epoch: 36720, Training Loss: 0.00758
Epoch: 36720, Training Loss: 0.00709
Epoch: 36720, Training Loss: 0.00866
Epoch: 36720, Training Loss: 0.00671
Epoch: 36721, Training Loss: 0.00758
Epoch: 36721, Training Loss: 0.00709
Epoch: 36721, Training Loss: 0.00866
Epoch: 36721, Training Loss: 0.00671
Epoch: 36722, Training Loss: 0.00758
Epoch: 36722, Training Loss: 0.00709
Epoch: 36722, Training Loss: 0.00866
Epoch: 36722, Training Loss: 0.00671
Epoch: 36723, Training Loss: 0.00758
Epoch: 36723, Training Loss: 0.00709
Epoch: 36723, Training Loss: 0.00866
Epoch: 36723, Training Loss: 0.00671
Epoch: 36724, Training Loss: 0.00758
Epoch: 36724, Training Loss: 0.00709
Epoch: 36724, Training Loss: 0.00866
Epoch: 36724, Training Loss: 0.00671
Epoch: 36725, Training Loss: 0.00758
Epoch: 36725, Training Loss: 0.00709
Epoch: 36725, Training Loss: 0.00866
Epoch: 36725, Training Loss: 0.00671
Epoch: 36726, Training Loss: 0.00758
Epoch: 36726, Training Loss: 0.00709
Epoch: 36726, Training Loss: 0.00866
Epoch: 36726, Training Loss: 0.00671
Epoch: 36727, Training Loss: 0.00758
Epoch: 36727, Training Loss: 0.00709
Epoch: 36727, Training Loss: 0.00866
Epoch: 36727, Training Loss: 0.00671
Epoch: 36728, Training Loss: 0.00758
Epoch: 36728, Training Loss: 0.00709
Epoch: 36728, Training Loss: 0.00866
Epoch: 36728, Training Loss: 0.00671
Epoch: 36729, Training Loss: 0.00758
Epoch: 36729, Training Loss: 0.00709
Epoch: 36729, Training Loss: 0.00866
Epoch: 36729, Training Loss: 0.00671
Epoch: 36730, Training Loss: 0.00758
Epoch: 36730, Training Loss: 0.00709
Epoch: 36730, Training Loss: 0.00866
Epoch: 36730, Training Loss: 0.00671
Epoch: 36731, Training Loss: 0.00758
Epoch: 36731, Training Loss: 0.00709
Epoch: 36731, Training Loss: 0.00866
Epoch: 36731, Training Loss: 0.00671
Epoch: 36732, Training Loss: 0.00758
Epoch: 36732, Training Loss: 0.00709
Epoch: 36732, Training Loss: 0.00866
Epoch: 36732, Training Loss: 0.00671
Epoch: 36733, Training Loss: 0.00758
Epoch: 36733, Training Loss: 0.00709
Epoch: 36733, Training Loss: 0.00866
Epoch: 36733, Training Loss: 0.00671
Epoch: 36734, Training Loss: 0.00758
Epoch: 36734, Training Loss: 0.00709
Epoch: 36734, Training Loss: 0.00866
Epoch: 36734, Training Loss: 0.00671
Epoch: 36735, Training Loss: 0.00758
Epoch: 36735, Training Loss: 0.00709
Epoch: 36735, Training Loss: 0.00866
Epoch: 36735, Training Loss: 0.00671
Epoch: 36736, Training Loss: 0.00758
Epoch: 36736, Training Loss: 0.00709
Epoch: 36736, Training Loss: 0.00866
Epoch: 36736, Training Loss: 0.00671
Epoch: 36737, Training Loss: 0.00758
Epoch: 36737, Training Loss: 0.00709
Epoch: 36737, Training Loss: 0.00866
Epoch: 36737, Training Loss: 0.00671
Epoch: 36738, Training Loss: 0.00758
Epoch: 36738, Training Loss: 0.00709
Epoch: 36738, Training Loss: 0.00866
Epoch: 36738, Training Loss: 0.00671
Epoch: 36739, Training Loss: 0.00758
Epoch: 36739, Training Loss: 0.00709
Epoch: 36739, Training Loss: 0.00866
Epoch: 36739, Training Loss: 0.00671
Epoch: 36740, Training Loss: 0.00758
Epoch: 36740, Training Loss: 0.00709
Epoch: 36740, Training Loss: 0.00865
Epoch: 36740, Training Loss: 0.00671
Epoch: 36741, Training Loss: 0.00758
Epoch: 36741, Training Loss: 0.00709
Epoch: 36741, Training Loss: 0.00865
Epoch: 36741, Training Loss: 0.00671
Epoch: 36742, Training Loss: 0.00758
Epoch: 36742, Training Loss: 0.00709
Epoch: 36742, Training Loss: 0.00865
Epoch: 36742, Training Loss: 0.00671
Epoch: 36743, Training Loss: 0.00758
Epoch: 36743, Training Loss: 0.00709
Epoch: 36743, Training Loss: 0.00865
Epoch: 36743, Training Loss: 0.00671
Epoch: 36744, Training Loss: 0.00758
Epoch: 36744, Training Loss: 0.00709
Epoch: 36744, Training Loss: 0.00865
Epoch: 36744, Training Loss: 0.00671
Epoch: 36745, Training Loss: 0.00758
Epoch: 36745, Training Loss: 0.00709
Epoch: 36745, Training Loss: 0.00865
Epoch: 36745, Training Loss: 0.00671
Epoch: 36746, Training Loss: 0.00758
Epoch: 36746, Training Loss: 0.00709
Epoch: 36746, Training Loss: 0.00865
Epoch: 36746, Training Loss: 0.00671
Epoch: 36747, Training Loss: 0.00758
Epoch: 36747, Training Loss: 0.00709
Epoch: 36747, Training Loss: 0.00865
Epoch: 36747, Training Loss: 0.00671
Epoch: 36748, Training Loss: 0.00758
Epoch: 36748, Training Loss: 0.00709
Epoch: 36748, Training Loss: 0.00865
Epoch: 36748, Training Loss: 0.00671
Epoch: 36749, Training Loss: 0.00758
Epoch: 36749, Training Loss: 0.00709
Epoch: 36749, Training Loss: 0.00865
Epoch: 36749, Training Loss: 0.00671
Epoch: 36750, Training Loss: 0.00757
Epoch: 36750, Training Loss: 0.00708
Epoch: 36750, Training Loss: 0.00865
Epoch: 36750, Training Loss: 0.00671
Epoch: 36751, Training Loss: 0.00757
Epoch: 36751, Training Loss: 0.00708
Epoch: 36751, Training Loss: 0.00865
Epoch: 36751, Training Loss: 0.00671
Epoch: 36752, Training Loss: 0.00757
Epoch: 36752, Training Loss: 0.00708
Epoch: 36752, Training Loss: 0.00865
Epoch: 36752, Training Loss: 0.00671
Epoch: 36753, Training Loss: 0.00757
Epoch: 36753, Training Loss: 0.00708
Epoch: 36753, Training Loss: 0.00865
Epoch: 36753, Training Loss: 0.00671
Epoch: 36754, Training Loss: 0.00757
Epoch: 36754, Training Loss: 0.00708
Epoch: 36754, Training Loss: 0.00865
Epoch: 36754, Training Loss: 0.00671
Epoch: 36755, Training Loss: 0.00757
Epoch: 36755, Training Loss: 0.00708
Epoch: 36755, Training Loss: 0.00865
Epoch: 36755, Training Loss: 0.00671
Epoch: 36756, Training Loss: 0.00757
Epoch: 36756, Training Loss: 0.00708
Epoch: 36756, Training Loss: 0.00865
Epoch: 36756, Training Loss: 0.00671
Epoch: 36757, Training Loss: 0.00757
Epoch: 36757, Training Loss: 0.00708
Epoch: 36757, Training Loss: 0.00865
Epoch: 36757, Training Loss: 0.00671
Epoch: 36758, Training Loss: 0.00757
Epoch: 36758, Training Loss: 0.00708
Epoch: 36758, Training Loss: 0.00865
Epoch: 36758, Training Loss: 0.00671
Epoch: 36759, Training Loss: 0.00757
Epoch: 36759, Training Loss: 0.00708
Epoch: 36759, Training Loss: 0.00865
Epoch: 36759, Training Loss: 0.00671
Epoch: 36760, Training Loss: 0.00757
Epoch: 36760, Training Loss: 0.00708
Epoch: 36760, Training Loss: 0.00865
Epoch: 36760, Training Loss: 0.00671
Epoch: 36761, Training Loss: 0.00757
Epoch: 36761, Training Loss: 0.00708
Epoch: 36761, Training Loss: 0.00865
Epoch: 36761, Training Loss: 0.00671
Epoch: 36762, Training Loss: 0.00757
Epoch: 36762, Training Loss: 0.00708
Epoch: 36762, Training Loss: 0.00865
Epoch: 36762, Training Loss: 0.00671
Epoch: 36763, Training Loss: 0.00757
Epoch: 36763, Training Loss: 0.00708
Epoch: 36763, Training Loss: 0.00865
Epoch: 36763, Training Loss: 0.00671
Epoch: 36764, Training Loss: 0.00757
Epoch: 36764, Training Loss: 0.00708
Epoch: 36764, Training Loss: 0.00865
Epoch: 36764, Training Loss: 0.00671
Epoch: 36765, Training Loss: 0.00757
Epoch: 36765, Training Loss: 0.00708
Epoch: 36765, Training Loss: 0.00865
Epoch: 36765, Training Loss: 0.00671
Epoch: 36766, Training Loss: 0.00757
Epoch: 36766, Training Loss: 0.00708
Epoch: 36766, Training Loss: 0.00865
Epoch: 36766, Training Loss: 0.00671
Epoch: 36767, Training Loss: 0.00757
Epoch: 36767, Training Loss: 0.00708
Epoch: 36767, Training Loss: 0.00865
Epoch: 36767, Training Loss: 0.00671
Epoch: 36768, Training Loss: 0.00757
Epoch: 36768, Training Loss: 0.00708
Epoch: 36768, Training Loss: 0.00865
Epoch: 36768, Training Loss: 0.00671
Epoch: 36769, Training Loss: 0.00757
Epoch: 36769, Training Loss: 0.00708
Epoch: 36769, Training Loss: 0.00865
Epoch: 36769, Training Loss: 0.00671
Epoch: 36770, Training Loss: 0.00757
Epoch: 36770, Training Loss: 0.00708
Epoch: 36770, Training Loss: 0.00865
Epoch: 36770, Training Loss: 0.00671
Epoch: 36771, Training Loss: 0.00757
Epoch: 36771, Training Loss: 0.00708
Epoch: 36771, Training Loss: 0.00865
Epoch: 36771, Training Loss: 0.00671
Epoch: 36772, Training Loss: 0.00757
Epoch: 36772, Training Loss: 0.00708
Epoch: 36772, Training Loss: 0.00865
Epoch: 36772, Training Loss: 0.00671
Epoch: 36773, Training Loss: 0.00757
Epoch: 36773, Training Loss: 0.00708
Epoch: 36773, Training Loss: 0.00865
Epoch: 36773, Training Loss: 0.00671
Epoch: 36774, Training Loss: 0.00757
Epoch: 36774, Training Loss: 0.00708
Epoch: 36774, Training Loss: 0.00865
Epoch: 36774, Training Loss: 0.00671
Epoch: 36775, Training Loss: 0.00757
Epoch: 36775, Training Loss: 0.00708
Epoch: 36775, Training Loss: 0.00865
Epoch: 36775, Training Loss: 0.00671
Epoch: 36776, Training Loss: 0.00757
Epoch: 36776, Training Loss: 0.00708
Epoch: 36776, Training Loss: 0.00865
Epoch: 36776, Training Loss: 0.00671
Epoch: 36777, Training Loss: 0.00757
Epoch: 36777, Training Loss: 0.00708
Epoch: 36777, Training Loss: 0.00865
Epoch: 36777, Training Loss: 0.00671
Epoch: 36778, Training Loss: 0.00757
Epoch: 36778, Training Loss: 0.00708
Epoch: 36778, Training Loss: 0.00865
Epoch: 36778, Training Loss: 0.00671
Epoch: 36779, Training Loss: 0.00757
Epoch: 36779, Training Loss: 0.00708
Epoch: 36779, Training Loss: 0.00865
Epoch: 36779, Training Loss: 0.00671
Epoch: 36780, Training Loss: 0.00757
Epoch: 36780, Training Loss: 0.00708
Epoch: 36780, Training Loss: 0.00865
Epoch: 36780, Training Loss: 0.00671
Epoch: 36781, Training Loss: 0.00757
Epoch: 36781, Training Loss: 0.00708
Epoch: 36781, Training Loss: 0.00865
Epoch: 36781, Training Loss: 0.00671
Epoch: 36782, Training Loss: 0.00757
Epoch: 36782, Training Loss: 0.00708
Epoch: 36782, Training Loss: 0.00865
Epoch: 36782, Training Loss: 0.00671
Epoch: 36783, Training Loss: 0.00757
Epoch: 36783, Training Loss: 0.00708
Epoch: 36783, Training Loss: 0.00865
Epoch: 36783, Training Loss: 0.00670
Epoch: 36784, Training Loss: 0.00757
Epoch: 36784, Training Loss: 0.00708
Epoch: 36784, Training Loss: 0.00865
Epoch: 36784, Training Loss: 0.00670
Epoch: 36785, Training Loss: 0.00757
Epoch: 36785, Training Loss: 0.00708
Epoch: 36785, Training Loss: 0.00865
Epoch: 36785, Training Loss: 0.00670
Epoch: 36786, Training Loss: 0.00757
Epoch: 36786, Training Loss: 0.00708
Epoch: 36786, Training Loss: 0.00865
Epoch: 36786, Training Loss: 0.00670
Epoch: 36787, Training Loss: 0.00757
Epoch: 36787, Training Loss: 0.00708
Epoch: 36787, Training Loss: 0.00865
Epoch: 36787, Training Loss: 0.00670
Epoch: 36788, Training Loss: 0.00757
Epoch: 36788, Training Loss: 0.00708
Epoch: 36788, Training Loss: 0.00865
Epoch: 36788, Training Loss: 0.00670
Epoch: 36789, Training Loss: 0.00757
Epoch: 36789, Training Loss: 0.00708
Epoch: 36789, Training Loss: 0.00865
Epoch: 36789, Training Loss: 0.00670
Epoch: 36790, Training Loss: 0.00757
Epoch: 36790, Training Loss: 0.00708
Epoch: 36790, Training Loss: 0.00865
Epoch: 36790, Training Loss: 0.00670
Epoch: 36791, Training Loss: 0.00757
Epoch: 36791, Training Loss: 0.00708
Epoch: 36791, Training Loss: 0.00865
Epoch: 36791, Training Loss: 0.00670
Epoch: 36792, Training Loss: 0.00757
Epoch: 36792, Training Loss: 0.00708
Epoch: 36792, Training Loss: 0.00865
Epoch: 36792, Training Loss: 0.00670
Epoch: 36793, Training Loss: 0.00757
Epoch: 36793, Training Loss: 0.00708
Epoch: 36793, Training Loss: 0.00865
Epoch: 36793, Training Loss: 0.00670
Epoch: 36794, Training Loss: 0.00757
Epoch: 36794, Training Loss: 0.00708
Epoch: 36794, Training Loss: 0.00865
Epoch: 36794, Training Loss: 0.00670
Epoch: 36795, Training Loss: 0.00757
Epoch: 36795, Training Loss: 0.00708
Epoch: 36795, Training Loss: 0.00865
Epoch: 36795, Training Loss: 0.00670
Epoch: 36796, Training Loss: 0.00757
Epoch: 36796, Training Loss: 0.00708
Epoch: 36796, Training Loss: 0.00865
Epoch: 36796, Training Loss: 0.00670
Epoch: 36797, Training Loss: 0.00757
Epoch: 36797, Training Loss: 0.00708
Epoch: 36797, Training Loss: 0.00865
Epoch: 36797, Training Loss: 0.00670
Epoch: 36798, Training Loss: 0.00757
Epoch: 36798, Training Loss: 0.00708
Epoch: 36798, Training Loss: 0.00865
Epoch: 36798, Training Loss: 0.00670
Epoch: 36799, Training Loss: 0.00757
Epoch: 36799, Training Loss: 0.00708
Epoch: 36799, Training Loss: 0.00865
Epoch: 36799, Training Loss: 0.00670
Epoch: 36800, Training Loss: 0.00757
Epoch: 36800, Training Loss: 0.00708
Epoch: 36800, Training Loss: 0.00865
Epoch: 36800, Training Loss: 0.00670
Epoch: 36801, Training Loss: 0.00757
Epoch: 36801, Training Loss: 0.00708
Epoch: 36801, Training Loss: 0.00865
Epoch: 36801, Training Loss: 0.00670
Epoch: 36802, Training Loss: 0.00757
Epoch: 36802, Training Loss: 0.00708
Epoch: 36802, Training Loss: 0.00865
Epoch: 36802, Training Loss: 0.00670
Epoch: 36803, Training Loss: 0.00757
Epoch: 36803, Training Loss: 0.00708
Epoch: 36803, Training Loss: 0.00865
Epoch: 36803, Training Loss: 0.00670
... [repeated per-pattern loss lines elided: over epochs 36803-37046 the four XOR-pattern losses decrease only from 0.00757 / 0.00708 / 0.00865 / 0.00670 to 0.00754 / 0.00706 / 0.00862 / 0.00668, showing very slow late-stage convergence] ...
Epoch: 37046, Training Loss: 0.00754
Epoch: 37046, Training Loss: 0.00706
Epoch: 37046, Training Loss: 0.00862
Epoch: 37046, Training Loss: 0.00668
Epoch: 37047, Training Loss: 0.00754
Epoch: 37047, Training Loss: 0.00706
Epoch: 37047, Training Loss: 0.00862
Epoch: 37047, Training Loss: 0.00668
Epoch: 37048, Training Loss: 0.00754
Epoch: 37048, Training Loss: 0.00706
Epoch: 37048, Training Loss: 0.00862
Epoch: 37048, Training Loss: 0.00668
Epoch: 37049, Training Loss: 0.00754
Epoch: 37049, Training Loss: 0.00706
Epoch: 37049, Training Loss: 0.00862
Epoch: 37049, Training Loss: 0.00668
Epoch: 37050, Training Loss: 0.00754
Epoch: 37050, Training Loss: 0.00706
Epoch: 37050, Training Loss: 0.00862
Epoch: 37050, Training Loss: 0.00668
Epoch: 37051, Training Loss: 0.00754
Epoch: 37051, Training Loss: 0.00705
Epoch: 37051, Training Loss: 0.00862
Epoch: 37051, Training Loss: 0.00668
Epoch: 37052, Training Loss: 0.00754
Epoch: 37052, Training Loss: 0.00705
Epoch: 37052, Training Loss: 0.00862
Epoch: 37052, Training Loss: 0.00668
Epoch: 37053, Training Loss: 0.00754
Epoch: 37053, Training Loss: 0.00705
Epoch: 37053, Training Loss: 0.00862
Epoch: 37053, Training Loss: 0.00668
Epoch: 37054, Training Loss: 0.00754
Epoch: 37054, Training Loss: 0.00705
Epoch: 37054, Training Loss: 0.00862
Epoch: 37054, Training Loss: 0.00668
Epoch: 37055, Training Loss: 0.00754
Epoch: 37055, Training Loss: 0.00705
Epoch: 37055, Training Loss: 0.00862
Epoch: 37055, Training Loss: 0.00668
Epoch: 37056, Training Loss: 0.00754
Epoch: 37056, Training Loss: 0.00705
Epoch: 37056, Training Loss: 0.00862
Epoch: 37056, Training Loss: 0.00668
Epoch: 37057, Training Loss: 0.00754
Epoch: 37057, Training Loss: 0.00705
Epoch: 37057, Training Loss: 0.00862
Epoch: 37057, Training Loss: 0.00668
Epoch: 37058, Training Loss: 0.00754
Epoch: 37058, Training Loss: 0.00705
Epoch: 37058, Training Loss: 0.00862
Epoch: 37058, Training Loss: 0.00668
Epoch: 37059, Training Loss: 0.00754
Epoch: 37059, Training Loss: 0.00705
Epoch: 37059, Training Loss: 0.00862
Epoch: 37059, Training Loss: 0.00668
Epoch: 37060, Training Loss: 0.00754
Epoch: 37060, Training Loss: 0.00705
Epoch: 37060, Training Loss: 0.00862
Epoch: 37060, Training Loss: 0.00668
Epoch: 37061, Training Loss: 0.00754
Epoch: 37061, Training Loss: 0.00705
Epoch: 37061, Training Loss: 0.00862
Epoch: 37061, Training Loss: 0.00668
Epoch: 37062, Training Loss: 0.00754
Epoch: 37062, Training Loss: 0.00705
Epoch: 37062, Training Loss: 0.00862
Epoch: 37062, Training Loss: 0.00668
Epoch: 37063, Training Loss: 0.00754
Epoch: 37063, Training Loss: 0.00705
Epoch: 37063, Training Loss: 0.00862
Epoch: 37063, Training Loss: 0.00668
Epoch: 37064, Training Loss: 0.00754
Epoch: 37064, Training Loss: 0.00705
Epoch: 37064, Training Loss: 0.00862
Epoch: 37064, Training Loss: 0.00668
Epoch: 37065, Training Loss: 0.00754
Epoch: 37065, Training Loss: 0.00705
Epoch: 37065, Training Loss: 0.00861
Epoch: 37065, Training Loss: 0.00668
Epoch: 37066, Training Loss: 0.00754
Epoch: 37066, Training Loss: 0.00705
Epoch: 37066, Training Loss: 0.00861
Epoch: 37066, Training Loss: 0.00668
Epoch: 37067, Training Loss: 0.00754
Epoch: 37067, Training Loss: 0.00705
Epoch: 37067, Training Loss: 0.00861
Epoch: 37067, Training Loss: 0.00668
Epoch: 37068, Training Loss: 0.00754
Epoch: 37068, Training Loss: 0.00705
Epoch: 37068, Training Loss: 0.00861
Epoch: 37068, Training Loss: 0.00668
Epoch: 37069, Training Loss: 0.00754
Epoch: 37069, Training Loss: 0.00705
Epoch: 37069, Training Loss: 0.00861
Epoch: 37069, Training Loss: 0.00668
Epoch: 37070, Training Loss: 0.00754
Epoch: 37070, Training Loss: 0.00705
Epoch: 37070, Training Loss: 0.00861
Epoch: 37070, Training Loss: 0.00668
Epoch: 37071, Training Loss: 0.00754
Epoch: 37071, Training Loss: 0.00705
Epoch: 37071, Training Loss: 0.00861
Epoch: 37071, Training Loss: 0.00668
Epoch: 37072, Training Loss: 0.00754
Epoch: 37072, Training Loss: 0.00705
Epoch: 37072, Training Loss: 0.00861
Epoch: 37072, Training Loss: 0.00668
Epoch: 37073, Training Loss: 0.00754
Epoch: 37073, Training Loss: 0.00705
Epoch: 37073, Training Loss: 0.00861
Epoch: 37073, Training Loss: 0.00668
Epoch: 37074, Training Loss: 0.00754
Epoch: 37074, Training Loss: 0.00705
Epoch: 37074, Training Loss: 0.00861
Epoch: 37074, Training Loss: 0.00668
Epoch: 37075, Training Loss: 0.00754
Epoch: 37075, Training Loss: 0.00705
Epoch: 37075, Training Loss: 0.00861
Epoch: 37075, Training Loss: 0.00668
Epoch: 37076, Training Loss: 0.00754
Epoch: 37076, Training Loss: 0.00705
Epoch: 37076, Training Loss: 0.00861
Epoch: 37076, Training Loss: 0.00668
Epoch: 37077, Training Loss: 0.00754
Epoch: 37077, Training Loss: 0.00705
Epoch: 37077, Training Loss: 0.00861
Epoch: 37077, Training Loss: 0.00668
Epoch: 37078, Training Loss: 0.00754
Epoch: 37078, Training Loss: 0.00705
Epoch: 37078, Training Loss: 0.00861
Epoch: 37078, Training Loss: 0.00668
Epoch: 37079, Training Loss: 0.00754
Epoch: 37079, Training Loss: 0.00705
Epoch: 37079, Training Loss: 0.00861
Epoch: 37079, Training Loss: 0.00668
Epoch: 37080, Training Loss: 0.00754
Epoch: 37080, Training Loss: 0.00705
Epoch: 37080, Training Loss: 0.00861
Epoch: 37080, Training Loss: 0.00668
Epoch: 37081, Training Loss: 0.00754
Epoch: 37081, Training Loss: 0.00705
Epoch: 37081, Training Loss: 0.00861
Epoch: 37081, Training Loss: 0.00668
Epoch: 37082, Training Loss: 0.00754
Epoch: 37082, Training Loss: 0.00705
Epoch: 37082, Training Loss: 0.00861
Epoch: 37082, Training Loss: 0.00668
Epoch: 37083, Training Loss: 0.00754
Epoch: 37083, Training Loss: 0.00705
Epoch: 37083, Training Loss: 0.00861
Epoch: 37083, Training Loss: 0.00668
Epoch: 37084, Training Loss: 0.00754
Epoch: 37084, Training Loss: 0.00705
Epoch: 37084, Training Loss: 0.00861
Epoch: 37084, Training Loss: 0.00668
Epoch: 37085, Training Loss: 0.00754
Epoch: 37085, Training Loss: 0.00705
Epoch: 37085, Training Loss: 0.00861
Epoch: 37085, Training Loss: 0.00668
Epoch: 37086, Training Loss: 0.00754
Epoch: 37086, Training Loss: 0.00705
Epoch: 37086, Training Loss: 0.00861
Epoch: 37086, Training Loss: 0.00668
Epoch: 37087, Training Loss: 0.00754
Epoch: 37087, Training Loss: 0.00705
Epoch: 37087, Training Loss: 0.00861
Epoch: 37087, Training Loss: 0.00668
Epoch: 37088, Training Loss: 0.00754
Epoch: 37088, Training Loss: 0.00705
Epoch: 37088, Training Loss: 0.00861
Epoch: 37088, Training Loss: 0.00668
Epoch: 37089, Training Loss: 0.00754
Epoch: 37089, Training Loss: 0.00705
Epoch: 37089, Training Loss: 0.00861
Epoch: 37089, Training Loss: 0.00668
Epoch: 37090, Training Loss: 0.00754
Epoch: 37090, Training Loss: 0.00705
Epoch: 37090, Training Loss: 0.00861
Epoch: 37090, Training Loss: 0.00668
Epoch: 37091, Training Loss: 0.00754
Epoch: 37091, Training Loss: 0.00705
Epoch: 37091, Training Loss: 0.00861
Epoch: 37091, Training Loss: 0.00668
Epoch: 37092, Training Loss: 0.00754
Epoch: 37092, Training Loss: 0.00705
Epoch: 37092, Training Loss: 0.00861
Epoch: 37092, Training Loss: 0.00668
Epoch: 37093, Training Loss: 0.00754
Epoch: 37093, Training Loss: 0.00705
Epoch: 37093, Training Loss: 0.00861
Epoch: 37093, Training Loss: 0.00668
Epoch: 37094, Training Loss: 0.00754
Epoch: 37094, Training Loss: 0.00705
Epoch: 37094, Training Loss: 0.00861
Epoch: 37094, Training Loss: 0.00668
Epoch: 37095, Training Loss: 0.00754
Epoch: 37095, Training Loss: 0.00705
Epoch: 37095, Training Loss: 0.00861
Epoch: 37095, Training Loss: 0.00668
Epoch: 37096, Training Loss: 0.00754
Epoch: 37096, Training Loss: 0.00705
Epoch: 37096, Training Loss: 0.00861
Epoch: 37096, Training Loss: 0.00668
Epoch: 37097, Training Loss: 0.00754
Epoch: 37097, Training Loss: 0.00705
Epoch: 37097, Training Loss: 0.00861
Epoch: 37097, Training Loss: 0.00668
Epoch: 37098, Training Loss: 0.00754
Epoch: 37098, Training Loss: 0.00705
Epoch: 37098, Training Loss: 0.00861
Epoch: 37098, Training Loss: 0.00668
Epoch: 37099, Training Loss: 0.00754
Epoch: 37099, Training Loss: 0.00705
Epoch: 37099, Training Loss: 0.00861
Epoch: 37099, Training Loss: 0.00668
Epoch: 37100, Training Loss: 0.00754
Epoch: 37100, Training Loss: 0.00705
Epoch: 37100, Training Loss: 0.00861
Epoch: 37100, Training Loss: 0.00668
Epoch: 37101, Training Loss: 0.00754
Epoch: 37101, Training Loss: 0.00705
Epoch: 37101, Training Loss: 0.00861
Epoch: 37101, Training Loss: 0.00667
Epoch: 37102, Training Loss: 0.00754
Epoch: 37102, Training Loss: 0.00705
Epoch: 37102, Training Loss: 0.00861
Epoch: 37102, Training Loss: 0.00667
Epoch: 37103, Training Loss: 0.00754
Epoch: 37103, Training Loss: 0.00705
Epoch: 37103, Training Loss: 0.00861
Epoch: 37103, Training Loss: 0.00667
Epoch: 37104, Training Loss: 0.00754
Epoch: 37104, Training Loss: 0.00705
Epoch: 37104, Training Loss: 0.00861
Epoch: 37104, Training Loss: 0.00667
Epoch: 37105, Training Loss: 0.00754
Epoch: 37105, Training Loss: 0.00705
Epoch: 37105, Training Loss: 0.00861
Epoch: 37105, Training Loss: 0.00667
Epoch: 37106, Training Loss: 0.00754
Epoch: 37106, Training Loss: 0.00705
Epoch: 37106, Training Loss: 0.00861
Epoch: 37106, Training Loss: 0.00667
Epoch: 37107, Training Loss: 0.00754
Epoch: 37107, Training Loss: 0.00705
Epoch: 37107, Training Loss: 0.00861
Epoch: 37107, Training Loss: 0.00667
Epoch: 37108, Training Loss: 0.00754
Epoch: 37108, Training Loss: 0.00705
Epoch: 37108, Training Loss: 0.00861
Epoch: 37108, Training Loss: 0.00667
Epoch: 37109, Training Loss: 0.00754
Epoch: 37109, Training Loss: 0.00705
Epoch: 37109, Training Loss: 0.00861
Epoch: 37109, Training Loss: 0.00667
Epoch: 37110, Training Loss: 0.00754
Epoch: 37110, Training Loss: 0.00705
Epoch: 37110, Training Loss: 0.00861
Epoch: 37110, Training Loss: 0.00667
Epoch: 37111, Training Loss: 0.00754
Epoch: 37111, Training Loss: 0.00705
Epoch: 37111, Training Loss: 0.00861
Epoch: 37111, Training Loss: 0.00667
Epoch: 37112, Training Loss: 0.00754
Epoch: 37112, Training Loss: 0.00705
Epoch: 37112, Training Loss: 0.00861
Epoch: 37112, Training Loss: 0.00667
Epoch: 37113, Training Loss: 0.00754
Epoch: 37113, Training Loss: 0.00705
Epoch: 37113, Training Loss: 0.00861
Epoch: 37113, Training Loss: 0.00667
Epoch: 37114, Training Loss: 0.00754
Epoch: 37114, Training Loss: 0.00705
Epoch: 37114, Training Loss: 0.00861
Epoch: 37114, Training Loss: 0.00667
Epoch: 37115, Training Loss: 0.00754
Epoch: 37115, Training Loss: 0.00705
Epoch: 37115, Training Loss: 0.00861
Epoch: 37115, Training Loss: 0.00667
Epoch: 37116, Training Loss: 0.00754
Epoch: 37116, Training Loss: 0.00705
Epoch: 37116, Training Loss: 0.00861
Epoch: 37116, Training Loss: 0.00667
Epoch: 37117, Training Loss: 0.00754
Epoch: 37117, Training Loss: 0.00705
Epoch: 37117, Training Loss: 0.00861
Epoch: 37117, Training Loss: 0.00667
Epoch: 37118, Training Loss: 0.00754
Epoch: 37118, Training Loss: 0.00705
Epoch: 37118, Training Loss: 0.00861
Epoch: 37118, Training Loss: 0.00667
Epoch: 37119, Training Loss: 0.00754
Epoch: 37119, Training Loss: 0.00705
Epoch: 37119, Training Loss: 0.00861
Epoch: 37119, Training Loss: 0.00667
Epoch: 37120, Training Loss: 0.00753
Epoch: 37120, Training Loss: 0.00705
Epoch: 37120, Training Loss: 0.00861
Epoch: 37120, Training Loss: 0.00667
Epoch: 37121, Training Loss: 0.00753
Epoch: 37121, Training Loss: 0.00705
Epoch: 37121, Training Loss: 0.00861
Epoch: 37121, Training Loss: 0.00667
Epoch: 37122, Training Loss: 0.00753
Epoch: 37122, Training Loss: 0.00705
Epoch: 37122, Training Loss: 0.00861
Epoch: 37122, Training Loss: 0.00667
Epoch: 37123, Training Loss: 0.00753
Epoch: 37123, Training Loss: 0.00705
Epoch: 37123, Training Loss: 0.00861
Epoch: 37123, Training Loss: 0.00667
Epoch: 37124, Training Loss: 0.00753
Epoch: 37124, Training Loss: 0.00705
Epoch: 37124, Training Loss: 0.00861
Epoch: 37124, Training Loss: 0.00667
Epoch: 37125, Training Loss: 0.00753
Epoch: 37125, Training Loss: 0.00705
Epoch: 37125, Training Loss: 0.00861
Epoch: 37125, Training Loss: 0.00667
Epoch: 37126, Training Loss: 0.00753
Epoch: 37126, Training Loss: 0.00705
Epoch: 37126, Training Loss: 0.00861
Epoch: 37126, Training Loss: 0.00667
Epoch: 37127, Training Loss: 0.00753
Epoch: 37127, Training Loss: 0.00705
Epoch: 37127, Training Loss: 0.00861
Epoch: 37127, Training Loss: 0.00667
Epoch: 37128, Training Loss: 0.00753
Epoch: 37128, Training Loss: 0.00705
Epoch: 37128, Training Loss: 0.00861
Epoch: 37128, Training Loss: 0.00667
Epoch: 37129, Training Loss: 0.00753
Epoch: 37129, Training Loss: 0.00705
Epoch: 37129, Training Loss: 0.00861
Epoch: 37129, Training Loss: 0.00667
Epoch: 37130, Training Loss: 0.00753
Epoch: 37130, Training Loss: 0.00705
Epoch: 37130, Training Loss: 0.00861
Epoch: 37130, Training Loss: 0.00667
Epoch: 37131, Training Loss: 0.00753
Epoch: 37131, Training Loss: 0.00705
Epoch: 37131, Training Loss: 0.00861
Epoch: 37131, Training Loss: 0.00667
Epoch: 37132, Training Loss: 0.00753
Epoch: 37132, Training Loss: 0.00705
Epoch: 37132, Training Loss: 0.00861
Epoch: 37132, Training Loss: 0.00667
Epoch: 37133, Training Loss: 0.00753
Epoch: 37133, Training Loss: 0.00705
Epoch: 37133, Training Loss: 0.00861
Epoch: 37133, Training Loss: 0.00667
Epoch: 37134, Training Loss: 0.00753
Epoch: 37134, Training Loss: 0.00705
Epoch: 37134, Training Loss: 0.00861
Epoch: 37134, Training Loss: 0.00667
Epoch: 37135, Training Loss: 0.00753
Epoch: 37135, Training Loss: 0.00705
Epoch: 37135, Training Loss: 0.00861
Epoch: 37135, Training Loss: 0.00667
Epoch: 37136, Training Loss: 0.00753
Epoch: 37136, Training Loss: 0.00705
Epoch: 37136, Training Loss: 0.00861
Epoch: 37136, Training Loss: 0.00667
Epoch: 37137, Training Loss: 0.00753
Epoch: 37137, Training Loss: 0.00705
Epoch: 37137, Training Loss: 0.00861
Epoch: 37137, Training Loss: 0.00667
Epoch: 37138, Training Loss: 0.00753
Epoch: 37138, Training Loss: 0.00705
Epoch: 37138, Training Loss: 0.00861
Epoch: 37138, Training Loss: 0.00667
Epoch: 37139, Training Loss: 0.00753
Epoch: 37139, Training Loss: 0.00705
Epoch: 37139, Training Loss: 0.00861
Epoch: 37139, Training Loss: 0.00667
Epoch: 37140, Training Loss: 0.00753
Epoch: 37140, Training Loss: 0.00705
Epoch: 37140, Training Loss: 0.00861
Epoch: 37140, Training Loss: 0.00667
Epoch: 37141, Training Loss: 0.00753
Epoch: 37141, Training Loss: 0.00705
Epoch: 37141, Training Loss: 0.00861
Epoch: 37141, Training Loss: 0.00667
Epoch: 37142, Training Loss: 0.00753
Epoch: 37142, Training Loss: 0.00705
Epoch: 37142, Training Loss: 0.00861
Epoch: 37142, Training Loss: 0.00667
Epoch: 37143, Training Loss: 0.00753
Epoch: 37143, Training Loss: 0.00705
Epoch: 37143, Training Loss: 0.00861
Epoch: 37143, Training Loss: 0.00667
Epoch: 37144, Training Loss: 0.00753
Epoch: 37144, Training Loss: 0.00705
Epoch: 37144, Training Loss: 0.00861
Epoch: 37144, Training Loss: 0.00667
Epoch: 37145, Training Loss: 0.00753
Epoch: 37145, Training Loss: 0.00705
Epoch: 37145, Training Loss: 0.00861
Epoch: 37145, Training Loss: 0.00667
Epoch: 37146, Training Loss: 0.00753
Epoch: 37146, Training Loss: 0.00705
Epoch: 37146, Training Loss: 0.00860
Epoch: 37146, Training Loss: 0.00667
Epoch: 37147, Training Loss: 0.00753
Epoch: 37147, Training Loss: 0.00705
Epoch: 37147, Training Loss: 0.00860
Epoch: 37147, Training Loss: 0.00667
Epoch: 37148, Training Loss: 0.00753
Epoch: 37148, Training Loss: 0.00705
Epoch: 37148, Training Loss: 0.00860
Epoch: 37148, Training Loss: 0.00667
Epoch: 37149, Training Loss: 0.00753
Epoch: 37149, Training Loss: 0.00705
Epoch: 37149, Training Loss: 0.00860
Epoch: 37149, Training Loss: 0.00667
Epoch: 37150, Training Loss: 0.00753
Epoch: 37150, Training Loss: 0.00705
Epoch: 37150, Training Loss: 0.00860
Epoch: 37150, Training Loss: 0.00667
Epoch: 37151, Training Loss: 0.00753
Epoch: 37151, Training Loss: 0.00705
Epoch: 37151, Training Loss: 0.00860
Epoch: 37151, Training Loss: 0.00667
Epoch: 37152, Training Loss: 0.00753
Epoch: 37152, Training Loss: 0.00704
Epoch: 37152, Training Loss: 0.00860
Epoch: 37152, Training Loss: 0.00667
Epoch: 37153, Training Loss: 0.00753
Epoch: 37153, Training Loss: 0.00704
Epoch: 37153, Training Loss: 0.00860
Epoch: 37153, Training Loss: 0.00667
Epoch: 37154, Training Loss: 0.00753
Epoch: 37154, Training Loss: 0.00704
Epoch: 37154, Training Loss: 0.00860
Epoch: 37154, Training Loss: 0.00667
Epoch: 37155, Training Loss: 0.00753
Epoch: 37155, Training Loss: 0.00704
Epoch: 37155, Training Loss: 0.00860
Epoch: 37155, Training Loss: 0.00667
Epoch: 37156, Training Loss: 0.00753
Epoch: 37156, Training Loss: 0.00704
Epoch: 37156, Training Loss: 0.00860
Epoch: 37156, Training Loss: 0.00667
Epoch: 37157, Training Loss: 0.00753
Epoch: 37157, Training Loss: 0.00704
Epoch: 37157, Training Loss: 0.00860
Epoch: 37157, Training Loss: 0.00667
Epoch: 37158, Training Loss: 0.00753
Epoch: 37158, Training Loss: 0.00704
Epoch: 37158, Training Loss: 0.00860
Epoch: 37158, Training Loss: 0.00667
Epoch: 37159, Training Loss: 0.00753
Epoch: 37159, Training Loss: 0.00704
Epoch: 37159, Training Loss: 0.00860
Epoch: 37159, Training Loss: 0.00667
Epoch: 37160, Training Loss: 0.00753
Epoch: 37160, Training Loss: 0.00704
Epoch: 37160, Training Loss: 0.00860
Epoch: 37160, Training Loss: 0.00667
Epoch: 37161, Training Loss: 0.00753
Epoch: 37161, Training Loss: 0.00704
Epoch: 37161, Training Loss: 0.00860
Epoch: 37161, Training Loss: 0.00667
Epoch: 37162, Training Loss: 0.00753
Epoch: 37162, Training Loss: 0.00704
Epoch: 37162, Training Loss: 0.00860
Epoch: 37162, Training Loss: 0.00667
Epoch: 37163, Training Loss: 0.00753
Epoch: 37163, Training Loss: 0.00704
Epoch: 37163, Training Loss: 0.00860
Epoch: 37163, Training Loss: 0.00667
Epoch: 37164, Training Loss: 0.00753
Epoch: 37164, Training Loss: 0.00704
Epoch: 37164, Training Loss: 0.00860
Epoch: 37164, Training Loss: 0.00667
Epoch: 37165, Training Loss: 0.00753
Epoch: 37165, Training Loss: 0.00704
Epoch: 37165, Training Loss: 0.00860
Epoch: 37165, Training Loss: 0.00667
Epoch: 37166, Training Loss: 0.00753
Epoch: 37166, Training Loss: 0.00704
Epoch: 37166, Training Loss: 0.00860
Epoch: 37166, Training Loss: 0.00667
Epoch: 37167, Training Loss: 0.00753
Epoch: 37167, Training Loss: 0.00704
Epoch: 37167, Training Loss: 0.00860
Epoch: 37167, Training Loss: 0.00667
Epoch: 37168, Training Loss: 0.00753
Epoch: 37168, Training Loss: 0.00704
Epoch: 37168, Training Loss: 0.00860
Epoch: 37168, Training Loss: 0.00667
Epoch: 37169, Training Loss: 0.00753
Epoch: 37169, Training Loss: 0.00704
Epoch: 37169, Training Loss: 0.00860
Epoch: 37169, Training Loss: 0.00667
Epoch: 37170, Training Loss: 0.00753
Epoch: 37170, Training Loss: 0.00704
Epoch: 37170, Training Loss: 0.00860
Epoch: 37170, Training Loss: 0.00667
Epoch: 37171, Training Loss: 0.00753
Epoch: 37171, Training Loss: 0.00704
Epoch: 37171, Training Loss: 0.00860
Epoch: 37171, Training Loss: 0.00667
Epoch: 37172, Training Loss: 0.00753
Epoch: 37172, Training Loss: 0.00704
Epoch: 37172, Training Loss: 0.00860
Epoch: 37172, Training Loss: 0.00667
Epoch: 37173, Training Loss: 0.00753
Epoch: 37173, Training Loss: 0.00704
Epoch: 37173, Training Loss: 0.00860
Epoch: 37173, Training Loss: 0.00667
Epoch: 37174, Training Loss: 0.00753
Epoch: 37174, Training Loss: 0.00704
Epoch: 37174, Training Loss: 0.00860
Epoch: 37174, Training Loss: 0.00667
Epoch: 37175, Training Loss: 0.00753
Epoch: 37175, Training Loss: 0.00704
Epoch: 37175, Training Loss: 0.00860
Epoch: 37175, Training Loss: 0.00667
Epoch: 37176, Training Loss: 0.00753
Epoch: 37176, Training Loss: 0.00704
Epoch: 37176, Training Loss: 0.00860
Epoch: 37176, Training Loss: 0.00667
Epoch: 37177, Training Loss: 0.00753
Epoch: 37177, Training Loss: 0.00704
Epoch: 37177, Training Loss: 0.00860
Epoch: 37177, Training Loss: 0.00667
Epoch: 37178, Training Loss: 0.00753
Epoch: 37178, Training Loss: 0.00704
Epoch: 37178, Training Loss: 0.00860
Epoch: 37178, Training Loss: 0.00667
Epoch: 37179, Training Loss: 0.00753
Epoch: 37179, Training Loss: 0.00704
Epoch: 37179, Training Loss: 0.00860
Epoch: 37179, Training Loss: 0.00667
Epoch: 37180, Training Loss: 0.00753
Epoch: 37180, Training Loss: 0.00704
Epoch: 37180, Training Loss: 0.00860
Epoch: 37180, Training Loss: 0.00667
Epoch: 37181, Training Loss: 0.00753
Epoch: 37181, Training Loss: 0.00704
Epoch: 37181, Training Loss: 0.00860
Epoch: 37181, Training Loss: 0.00667
Epoch: 37182, Training Loss: 0.00753
Epoch: 37182, Training Loss: 0.00704
Epoch: 37182, Training Loss: 0.00860
Epoch: 37182, Training Loss: 0.00667
Epoch: 37183, Training Loss: 0.00753
Epoch: 37183, Training Loss: 0.00704
Epoch: 37183, Training Loss: 0.00860
Epoch: 37183, Training Loss: 0.00667
Epoch: 37184, Training Loss: 0.00753
Epoch: 37184, Training Loss: 0.00704
Epoch: 37184, Training Loss: 0.00860
Epoch: 37184, Training Loss: 0.00667
Epoch: 37185, Training Loss: 0.00753
Epoch: 37185, Training Loss: 0.00704
Epoch: 37185, Training Loss: 0.00860
Epoch: 37185, Training Loss: 0.00667
Epoch: 37186, Training Loss: 0.00753
Epoch: 37186, Training Loss: 0.00704
Epoch: 37186, Training Loss: 0.00860
Epoch: 37186, Training Loss: 0.00667
Epoch: 37187, Training Loss: 0.00753
Epoch: 37187, Training Loss: 0.00704
Epoch: 37187, Training Loss: 0.00860
Epoch: 37187, Training Loss: 0.00667
Epoch: 37188, Training Loss: 0.00753
Epoch: 37188, Training Loss: 0.00704
Epoch: 37188, Training Loss: 0.00860
Epoch: 37188, Training Loss: 0.00667
Epoch: 37189, Training Loss: 0.00753
Epoch: 37189, Training Loss: 0.00704
Epoch: 37189, Training Loss: 0.00860
Epoch: 37189, Training Loss: 0.00667
Epoch: 37190, Training Loss: 0.00753
Epoch: 37190, Training Loss: 0.00704
Epoch: 37190, Training Loss: 0.00860
Epoch: 37190, Training Loss: 0.00667
Epoch: 37191, Training Loss: 0.00753
Epoch: 37191, Training Loss: 0.00704
Epoch: 37191, Training Loss: 0.00860
Epoch: 37191, Training Loss: 0.00667
Epoch: 37192, Training Loss: 0.00753
Epoch: 37192, Training Loss: 0.00704
Epoch: 37192, Training Loss: 0.00860
Epoch: 37192, Training Loss: 0.00667
Epoch: 37193, Training Loss: 0.00753
Epoch: 37193, Training Loss: 0.00704
Epoch: 37193, Training Loss: 0.00860
Epoch: 37193, Training Loss: 0.00667
Epoch: 37194, Training Loss: 0.00753
Epoch: 37194, Training Loss: 0.00704
Epoch: 37194, Training Loss: 0.00860
Epoch: 37194, Training Loss: 0.00667
Epoch: 37195, Training Loss: 0.00753
Epoch: 37195, Training Loss: 0.00704
Epoch: 37195, Training Loss: 0.00860
Epoch: 37195, Training Loss: 0.00667
Epoch: 37196, Training Loss: 0.00753
Epoch: 37196, Training Loss: 0.00704
Epoch: 37196, Training Loss: 0.00860
Epoch: 37196, Training Loss: 0.00667
Epoch: 37197, Training Loss: 0.00753
Epoch: 37197, Training Loss: 0.00704
Epoch: 37197, Training Loss: 0.00860
Epoch: 37197, Training Loss: 0.00667
Epoch: 37198, Training Loss: 0.00753
Epoch: 37198, Training Loss: 0.00704
Epoch: 37198, Training Loss: 0.00860
Epoch: 37198, Training Loss: 0.00667
Epoch: 37199, Training Loss: 0.00753
Epoch: 37199, Training Loss: 0.00704
Epoch: 37199, Training Loss: 0.00860
Epoch: 37199, Training Loss: 0.00667
Epoch: 37200, Training Loss: 0.00753
Epoch: 37200, Training Loss: 0.00704
Epoch: 37200, Training Loss: 0.00860
Epoch: 37200, Training Loss: 0.00667
Epoch: 37201, Training Loss: 0.00753
Epoch: 37201, Training Loss: 0.00704
Epoch: 37201, Training Loss: 0.00860
Epoch: 37201, Training Loss: 0.00667
Epoch: 37202, Training Loss: 0.00753
Epoch: 37202, Training Loss: 0.00704
Epoch: 37202, Training Loss: 0.00860
Epoch: 37202, Training Loss: 0.00667
Epoch: 37203, Training Loss: 0.00753
Epoch: 37203, Training Loss: 0.00704
Epoch: 37203, Training Loss: 0.00860
Epoch: 37203, Training Loss: 0.00667
Epoch: 37204, Training Loss: 0.00753
Epoch: 37204, Training Loss: 0.00704
Epoch: 37204, Training Loss: 0.00860
Epoch: 37204, Training Loss: 0.00667
Epoch: 37205, Training Loss: 0.00753
Epoch: 37205, Training Loss: 0.00704
Epoch: 37205, Training Loss: 0.00860
Epoch: 37205, Training Loss: 0.00667
Epoch: 37206, Training Loss: 0.00753
Epoch: 37206, Training Loss: 0.00704
Epoch: 37206, Training Loss: 0.00860
Epoch: 37206, Training Loss: 0.00667
Epoch: 37207, Training Loss: 0.00753
Epoch: 37207, Training Loss: 0.00704
Epoch: 37207, Training Loss: 0.00860
Epoch: 37207, Training Loss: 0.00667
Epoch: 37208, Training Loss: 0.00753
Epoch: 37208, Training Loss: 0.00704
Epoch: 37208, Training Loss: 0.00860
Epoch: 37208, Training Loss: 0.00666
Epoch: 37209, Training Loss: 0.00753
Epoch: 37209, Training Loss: 0.00704
Epoch: 37209, Training Loss: 0.00860
Epoch: 37209, Training Loss: 0.00666
Epoch: 37210, Training Loss: 0.00753
Epoch: 37210, Training Loss: 0.00704
Epoch: 37210, Training Loss: 0.00860
Epoch: 37210, Training Loss: 0.00666
Epoch: 37211, Training Loss: 0.00753
Epoch: 37211, Training Loss: 0.00704
Epoch: 37211, Training Loss: 0.00860
Epoch: 37211, Training Loss: 0.00666
Epoch: 37212, Training Loss: 0.00753
Epoch: 37212, Training Loss: 0.00704
Epoch: 37212, Training Loss: 0.00860
Epoch: 37212, Training Loss: 0.00666
Epoch: 37213, Training Loss: 0.00753
Epoch: 37213, Training Loss: 0.00704
Epoch: 37213, Training Loss: 0.00860
Epoch: 37213, Training Loss: 0.00666
Epoch: 37214, Training Loss: 0.00752
Epoch: 37214, Training Loss: 0.00704
Epoch: 37214, Training Loss: 0.00860
Epoch: 37214, Training Loss: 0.00666
Epoch: 37215, Training Loss: 0.00752
Epoch: 37215, Training Loss: 0.00704
Epoch: 37215, Training Loss: 0.00860
Epoch: 37215, Training Loss: 0.00666
Epoch: 37216, Training Loss: 0.00752
Epoch: 37216, Training Loss: 0.00704
Epoch: 37216, Training Loss: 0.00860
Epoch: 37216, Training Loss: 0.00666
Epoch: 37217, Training Loss: 0.00752
Epoch: 37217, Training Loss: 0.00704
Epoch: 37217, Training Loss: 0.00860
Epoch: 37217, Training Loss: 0.00666
Epoch: 37218, Training Loss: 0.00752
Epoch: 37218, Training Loss: 0.00704
Epoch: 37218, Training Loss: 0.00860
Epoch: 37218, Training Loss: 0.00666
Epoch: 37219, Training Loss: 0.00752
Epoch: 37219, Training Loss: 0.00704
Epoch: 37219, Training Loss: 0.00860
Epoch: 37219, Training Loss: 0.00666
Epoch: 37220, Training Loss: 0.00752
Epoch: 37220, Training Loss: 0.00704
Epoch: 37220, Training Loss: 0.00860
Epoch: 37220, Training Loss: 0.00666
Epoch: 37221, Training Loss: 0.00752
Epoch: 37221, Training Loss: 0.00704
Epoch: 37221, Training Loss: 0.00860
Epoch: 37221, Training Loss: 0.00666
Epoch: 37222, Training Loss: 0.00752
Epoch: 37222, Training Loss: 0.00704
Epoch: 37222, Training Loss: 0.00860
Epoch: 37222, Training Loss: 0.00666
Epoch: 37223, Training Loss: 0.00752
Epoch: 37223, Training Loss: 0.00704
Epoch: 37223, Training Loss: 0.00860
Epoch: 37223, Training Loss: 0.00666
Epoch: 37224, Training Loss: 0.00752
Epoch: 37224, Training Loss: 0.00704
Epoch: 37224, Training Loss: 0.00860
Epoch: 37224, Training Loss: 0.00666
Epoch: 37225, Training Loss: 0.00752
Epoch: 37225, Training Loss: 0.00704
Epoch: 37225, Training Loss: 0.00860
Epoch: 37225, Training Loss: 0.00666
Epoch: 37226, Training Loss: 0.00752
Epoch: 37226, Training Loss: 0.00704
Epoch: 37226, Training Loss: 0.00860
Epoch: 37226, Training Loss: 0.00666
Epoch: 37227, Training Loss: 0.00752
Epoch: 37227, Training Loss: 0.00704
Epoch: 37227, Training Loss: 0.00860
Epoch: 37227, Training Loss: 0.00666
Epoch: 37228, Training Loss: 0.00752
Epoch: 37228, Training Loss: 0.00704
Epoch: 37228, Training Loss: 0.00860
Epoch: 37228, Training Loss: 0.00666
Epoch: 37229, Training Loss: 0.00752
Epoch: 37229, Training Loss: 0.00704
Epoch: 37229, Training Loss: 0.00859
Epoch: 37229, Training Loss: 0.00666
Epoch: 37230, Training Loss: 0.00752
Epoch: 37230, Training Loss: 0.00704
Epoch: 37230, Training Loss: 0.00859
Epoch: 37230, Training Loss: 0.00666
Epoch: 37231, Training Loss: 0.00752
Epoch: 37231, Training Loss: 0.00704
Epoch: 37231, Training Loss: 0.00859
Epoch: 37231, Training Loss: 0.00666
Epoch: 37232, Training Loss: 0.00752
Epoch: 37232, Training Loss: 0.00704
Epoch: 37232, Training Loss: 0.00859
Epoch: 37232, Training Loss: 0.00666
Epoch: 37233, Training Loss: 0.00752
Epoch: 37233, Training Loss: 0.00704
Epoch: 37233, Training Loss: 0.00859
Epoch: 37233, Training Loss: 0.00666
Epoch: 37234, Training Loss: 0.00752
Epoch: 37234, Training Loss: 0.00704
Epoch: 37234, Training Loss: 0.00859
Epoch: 37234, Training Loss: 0.00666
Epoch: 37235, Training Loss: 0.00752
Epoch: 37235, Training Loss: 0.00704
Epoch: 37235, Training Loss: 0.00859
Epoch: 37235, Training Loss: 0.00666
[... four per-sample losses printed every epoch; from epoch 37235 to 37479 they decrease only marginally, from (0.00752, 0.00704, 0.00859, 0.00666) to (0.00750, 0.00701, 0.00856, 0.00664) ...]
Epoch: 37479, Training Loss: 0.00750
Epoch: 37479, Training Loss: 0.00701
Epoch: 37479, Training Loss: 0.00856
Epoch: 37479, Training Loss: 0.00664
Epoch: 37480, Training Loss: 0.00750
Epoch: 37480, Training Loss: 0.00701
Epoch: 37480, Training Loss: 0.00856
Epoch: 37480, Training Loss: 0.00664
Epoch: 37481, Training Loss: 0.00750
Epoch: 37481, Training Loss: 0.00701
Epoch: 37481, Training Loss: 0.00856
Epoch: 37481, Training Loss: 0.00664
Epoch: 37482, Training Loss: 0.00750
Epoch: 37482, Training Loss: 0.00701
Epoch: 37482, Training Loss: 0.00856
Epoch: 37482, Training Loss: 0.00664
Epoch: 37483, Training Loss: 0.00750
Epoch: 37483, Training Loss: 0.00701
Epoch: 37483, Training Loss: 0.00856
Epoch: 37483, Training Loss: 0.00664
Epoch: 37484, Training Loss: 0.00750
Epoch: 37484, Training Loss: 0.00701
Epoch: 37484, Training Loss: 0.00856
Epoch: 37484, Training Loss: 0.00664
Epoch: 37485, Training Loss: 0.00750
Epoch: 37485, Training Loss: 0.00701
Epoch: 37485, Training Loss: 0.00856
Epoch: 37485, Training Loss: 0.00664
Epoch: 37486, Training Loss: 0.00750
Epoch: 37486, Training Loss: 0.00701
Epoch: 37486, Training Loss: 0.00856
Epoch: 37486, Training Loss: 0.00664
Epoch: 37487, Training Loss: 0.00750
Epoch: 37487, Training Loss: 0.00701
Epoch: 37487, Training Loss: 0.00856
Epoch: 37487, Training Loss: 0.00664
Epoch: 37488, Training Loss: 0.00750
Epoch: 37488, Training Loss: 0.00701
Epoch: 37488, Training Loss: 0.00856
Epoch: 37488, Training Loss: 0.00664
Epoch: 37489, Training Loss: 0.00750
Epoch: 37489, Training Loss: 0.00701
Epoch: 37489, Training Loss: 0.00856
Epoch: 37489, Training Loss: 0.00664
Epoch: 37490, Training Loss: 0.00750
Epoch: 37490, Training Loss: 0.00701
Epoch: 37490, Training Loss: 0.00856
Epoch: 37490, Training Loss: 0.00664
Epoch: 37491, Training Loss: 0.00750
Epoch: 37491, Training Loss: 0.00701
Epoch: 37491, Training Loss: 0.00856
Epoch: 37491, Training Loss: 0.00664
Epoch: 37492, Training Loss: 0.00750
Epoch: 37492, Training Loss: 0.00701
Epoch: 37492, Training Loss: 0.00856
Epoch: 37492, Training Loss: 0.00664
Epoch: 37493, Training Loss: 0.00750
Epoch: 37493, Training Loss: 0.00701
Epoch: 37493, Training Loss: 0.00856
Epoch: 37493, Training Loss: 0.00664
Epoch: 37494, Training Loss: 0.00750
Epoch: 37494, Training Loss: 0.00701
Epoch: 37494, Training Loss: 0.00856
Epoch: 37494, Training Loss: 0.00664
Epoch: 37495, Training Loss: 0.00750
Epoch: 37495, Training Loss: 0.00701
Epoch: 37495, Training Loss: 0.00856
Epoch: 37495, Training Loss: 0.00664
Epoch: 37496, Training Loss: 0.00749
Epoch: 37496, Training Loss: 0.00701
Epoch: 37496, Training Loss: 0.00856
Epoch: 37496, Training Loss: 0.00664
Epoch: 37497, Training Loss: 0.00749
Epoch: 37497, Training Loss: 0.00701
Epoch: 37497, Training Loss: 0.00856
Epoch: 37497, Training Loss: 0.00664
Epoch: 37498, Training Loss: 0.00749
Epoch: 37498, Training Loss: 0.00701
Epoch: 37498, Training Loss: 0.00856
Epoch: 37498, Training Loss: 0.00664
Epoch: 37499, Training Loss: 0.00749
Epoch: 37499, Training Loss: 0.00701
Epoch: 37499, Training Loss: 0.00856
Epoch: 37499, Training Loss: 0.00664
Epoch: 37500, Training Loss: 0.00749
Epoch: 37500, Training Loss: 0.00701
Epoch: 37500, Training Loss: 0.00856
Epoch: 37500, Training Loss: 0.00664
Epoch: 37501, Training Loss: 0.00749
Epoch: 37501, Training Loss: 0.00701
Epoch: 37501, Training Loss: 0.00856
Epoch: 37501, Training Loss: 0.00664
Epoch: 37502, Training Loss: 0.00749
Epoch: 37502, Training Loss: 0.00701
Epoch: 37502, Training Loss: 0.00856
Epoch: 37502, Training Loss: 0.00664
Epoch: 37503, Training Loss: 0.00749
Epoch: 37503, Training Loss: 0.00701
Epoch: 37503, Training Loss: 0.00856
Epoch: 37503, Training Loss: 0.00664
Epoch: 37504, Training Loss: 0.00749
Epoch: 37504, Training Loss: 0.00701
Epoch: 37504, Training Loss: 0.00856
Epoch: 37504, Training Loss: 0.00664
Epoch: 37505, Training Loss: 0.00749
Epoch: 37505, Training Loss: 0.00701
Epoch: 37505, Training Loss: 0.00856
Epoch: 37505, Training Loss: 0.00664
Epoch: 37506, Training Loss: 0.00749
Epoch: 37506, Training Loss: 0.00701
Epoch: 37506, Training Loss: 0.00856
Epoch: 37506, Training Loss: 0.00664
Epoch: 37507, Training Loss: 0.00749
Epoch: 37507, Training Loss: 0.00701
Epoch: 37507, Training Loss: 0.00856
Epoch: 37507, Training Loss: 0.00664
Epoch: 37508, Training Loss: 0.00749
Epoch: 37508, Training Loss: 0.00701
Epoch: 37508, Training Loss: 0.00856
Epoch: 37508, Training Loss: 0.00664
Epoch: 37509, Training Loss: 0.00749
Epoch: 37509, Training Loss: 0.00701
Epoch: 37509, Training Loss: 0.00856
Epoch: 37509, Training Loss: 0.00664
Epoch: 37510, Training Loss: 0.00749
Epoch: 37510, Training Loss: 0.00701
Epoch: 37510, Training Loss: 0.00856
Epoch: 37510, Training Loss: 0.00664
Epoch: 37511, Training Loss: 0.00749
Epoch: 37511, Training Loss: 0.00701
Epoch: 37511, Training Loss: 0.00856
Epoch: 37511, Training Loss: 0.00664
Epoch: 37512, Training Loss: 0.00749
Epoch: 37512, Training Loss: 0.00701
Epoch: 37512, Training Loss: 0.00856
Epoch: 37512, Training Loss: 0.00664
Epoch: 37513, Training Loss: 0.00749
Epoch: 37513, Training Loss: 0.00701
Epoch: 37513, Training Loss: 0.00856
Epoch: 37513, Training Loss: 0.00664
Epoch: 37514, Training Loss: 0.00749
Epoch: 37514, Training Loss: 0.00701
Epoch: 37514, Training Loss: 0.00856
Epoch: 37514, Training Loss: 0.00664
Epoch: 37515, Training Loss: 0.00749
Epoch: 37515, Training Loss: 0.00701
Epoch: 37515, Training Loss: 0.00856
Epoch: 37515, Training Loss: 0.00664
Epoch: 37516, Training Loss: 0.00749
Epoch: 37516, Training Loss: 0.00701
Epoch: 37516, Training Loss: 0.00856
Epoch: 37516, Training Loss: 0.00664
Epoch: 37517, Training Loss: 0.00749
Epoch: 37517, Training Loss: 0.00701
Epoch: 37517, Training Loss: 0.00856
Epoch: 37517, Training Loss: 0.00664
Epoch: 37518, Training Loss: 0.00749
Epoch: 37518, Training Loss: 0.00701
Epoch: 37518, Training Loss: 0.00856
Epoch: 37518, Training Loss: 0.00664
Epoch: 37519, Training Loss: 0.00749
Epoch: 37519, Training Loss: 0.00701
Epoch: 37519, Training Loss: 0.00856
Epoch: 37519, Training Loss: 0.00664
Epoch: 37520, Training Loss: 0.00749
Epoch: 37520, Training Loss: 0.00701
Epoch: 37520, Training Loss: 0.00856
Epoch: 37520, Training Loss: 0.00664
Epoch: 37521, Training Loss: 0.00749
Epoch: 37521, Training Loss: 0.00701
Epoch: 37521, Training Loss: 0.00856
Epoch: 37521, Training Loss: 0.00664
Epoch: 37522, Training Loss: 0.00749
Epoch: 37522, Training Loss: 0.00701
Epoch: 37522, Training Loss: 0.00856
Epoch: 37522, Training Loss: 0.00664
Epoch: 37523, Training Loss: 0.00749
Epoch: 37523, Training Loss: 0.00701
Epoch: 37523, Training Loss: 0.00856
Epoch: 37523, Training Loss: 0.00664
Epoch: 37524, Training Loss: 0.00749
Epoch: 37524, Training Loss: 0.00701
Epoch: 37524, Training Loss: 0.00856
Epoch: 37524, Training Loss: 0.00664
Epoch: 37525, Training Loss: 0.00749
Epoch: 37525, Training Loss: 0.00701
Epoch: 37525, Training Loss: 0.00856
Epoch: 37525, Training Loss: 0.00664
Epoch: 37526, Training Loss: 0.00749
Epoch: 37526, Training Loss: 0.00701
Epoch: 37526, Training Loss: 0.00856
Epoch: 37526, Training Loss: 0.00664
Epoch: 37527, Training Loss: 0.00749
Epoch: 37527, Training Loss: 0.00701
Epoch: 37527, Training Loss: 0.00856
Epoch: 37527, Training Loss: 0.00664
Epoch: 37528, Training Loss: 0.00749
Epoch: 37528, Training Loss: 0.00701
Epoch: 37528, Training Loss: 0.00856
Epoch: 37528, Training Loss: 0.00664
Epoch: 37529, Training Loss: 0.00749
Epoch: 37529, Training Loss: 0.00701
Epoch: 37529, Training Loss: 0.00856
Epoch: 37529, Training Loss: 0.00664
Epoch: 37530, Training Loss: 0.00749
Epoch: 37530, Training Loss: 0.00701
Epoch: 37530, Training Loss: 0.00856
Epoch: 37530, Training Loss: 0.00664
Epoch: 37531, Training Loss: 0.00749
Epoch: 37531, Training Loss: 0.00701
Epoch: 37531, Training Loss: 0.00856
Epoch: 37531, Training Loss: 0.00664
Epoch: 37532, Training Loss: 0.00749
Epoch: 37532, Training Loss: 0.00701
Epoch: 37532, Training Loss: 0.00856
Epoch: 37532, Training Loss: 0.00663
Epoch: 37533, Training Loss: 0.00749
Epoch: 37533, Training Loss: 0.00701
Epoch: 37533, Training Loss: 0.00856
Epoch: 37533, Training Loss: 0.00663
Epoch: 37534, Training Loss: 0.00749
Epoch: 37534, Training Loss: 0.00701
Epoch: 37534, Training Loss: 0.00856
Epoch: 37534, Training Loss: 0.00663
Epoch: 37535, Training Loss: 0.00749
Epoch: 37535, Training Loss: 0.00701
Epoch: 37535, Training Loss: 0.00856
Epoch: 37535, Training Loss: 0.00663
Epoch: 37536, Training Loss: 0.00749
Epoch: 37536, Training Loss: 0.00701
Epoch: 37536, Training Loss: 0.00856
Epoch: 37536, Training Loss: 0.00663
Epoch: 37537, Training Loss: 0.00749
Epoch: 37537, Training Loss: 0.00701
Epoch: 37537, Training Loss: 0.00856
Epoch: 37537, Training Loss: 0.00663
Epoch: 37538, Training Loss: 0.00749
Epoch: 37538, Training Loss: 0.00701
Epoch: 37538, Training Loss: 0.00856
Epoch: 37538, Training Loss: 0.00663
Epoch: 37539, Training Loss: 0.00749
Epoch: 37539, Training Loss: 0.00701
Epoch: 37539, Training Loss: 0.00856
Epoch: 37539, Training Loss: 0.00663
Epoch: 37540, Training Loss: 0.00749
Epoch: 37540, Training Loss: 0.00701
Epoch: 37540, Training Loss: 0.00856
Epoch: 37540, Training Loss: 0.00663
Epoch: 37541, Training Loss: 0.00749
Epoch: 37541, Training Loss: 0.00701
Epoch: 37541, Training Loss: 0.00856
Epoch: 37541, Training Loss: 0.00663
Epoch: 37542, Training Loss: 0.00749
Epoch: 37542, Training Loss: 0.00701
Epoch: 37542, Training Loss: 0.00856
Epoch: 37542, Training Loss: 0.00663
Epoch: 37543, Training Loss: 0.00749
Epoch: 37543, Training Loss: 0.00701
Epoch: 37543, Training Loss: 0.00856
Epoch: 37543, Training Loss: 0.00663
Epoch: 37544, Training Loss: 0.00749
Epoch: 37544, Training Loss: 0.00701
Epoch: 37544, Training Loss: 0.00856
Epoch: 37544, Training Loss: 0.00663
Epoch: 37545, Training Loss: 0.00749
Epoch: 37545, Training Loss: 0.00701
Epoch: 37545, Training Loss: 0.00856
Epoch: 37545, Training Loss: 0.00663
Epoch: 37546, Training Loss: 0.00749
Epoch: 37546, Training Loss: 0.00701
Epoch: 37546, Training Loss: 0.00856
Epoch: 37546, Training Loss: 0.00663
Epoch: 37547, Training Loss: 0.00749
Epoch: 37547, Training Loss: 0.00701
Epoch: 37547, Training Loss: 0.00856
Epoch: 37547, Training Loss: 0.00663
Epoch: 37548, Training Loss: 0.00749
Epoch: 37548, Training Loss: 0.00701
Epoch: 37548, Training Loss: 0.00856
Epoch: 37548, Training Loss: 0.00663
Epoch: 37549, Training Loss: 0.00749
Epoch: 37549, Training Loss: 0.00701
Epoch: 37549, Training Loss: 0.00856
Epoch: 37549, Training Loss: 0.00663
Epoch: 37550, Training Loss: 0.00749
Epoch: 37550, Training Loss: 0.00701
Epoch: 37550, Training Loss: 0.00856
Epoch: 37550, Training Loss: 0.00663
Epoch: 37551, Training Loss: 0.00749
Epoch: 37551, Training Loss: 0.00701
Epoch: 37551, Training Loss: 0.00856
Epoch: 37551, Training Loss: 0.00663
Epoch: 37552, Training Loss: 0.00749
Epoch: 37552, Training Loss: 0.00701
Epoch: 37552, Training Loss: 0.00856
Epoch: 37552, Training Loss: 0.00663
Epoch: 37553, Training Loss: 0.00749
Epoch: 37553, Training Loss: 0.00701
Epoch: 37553, Training Loss: 0.00856
Epoch: 37553, Training Loss: 0.00663
Epoch: 37554, Training Loss: 0.00749
Epoch: 37554, Training Loss: 0.00701
Epoch: 37554, Training Loss: 0.00856
Epoch: 37554, Training Loss: 0.00663
Epoch: 37555, Training Loss: 0.00749
Epoch: 37555, Training Loss: 0.00701
Epoch: 37555, Training Loss: 0.00856
Epoch: 37555, Training Loss: 0.00663
Epoch: 37556, Training Loss: 0.00749
Epoch: 37556, Training Loss: 0.00701
Epoch: 37556, Training Loss: 0.00856
Epoch: 37556, Training Loss: 0.00663
Epoch: 37557, Training Loss: 0.00749
Epoch: 37557, Training Loss: 0.00701
Epoch: 37557, Training Loss: 0.00856
Epoch: 37557, Training Loss: 0.00663
Epoch: 37558, Training Loss: 0.00749
Epoch: 37558, Training Loss: 0.00701
Epoch: 37558, Training Loss: 0.00856
Epoch: 37558, Training Loss: 0.00663
Epoch: 37559, Training Loss: 0.00749
Epoch: 37559, Training Loss: 0.00701
Epoch: 37559, Training Loss: 0.00856
Epoch: 37559, Training Loss: 0.00663
Epoch: 37560, Training Loss: 0.00749
Epoch: 37560, Training Loss: 0.00701
Epoch: 37560, Training Loss: 0.00855
Epoch: 37560, Training Loss: 0.00663
Epoch: 37561, Training Loss: 0.00749
Epoch: 37561, Training Loss: 0.00700
Epoch: 37561, Training Loss: 0.00855
Epoch: 37561, Training Loss: 0.00663
Epoch: 37562, Training Loss: 0.00749
Epoch: 37562, Training Loss: 0.00700
Epoch: 37562, Training Loss: 0.00855
Epoch: 37562, Training Loss: 0.00663
Epoch: 37563, Training Loss: 0.00749
Epoch: 37563, Training Loss: 0.00700
Epoch: 37563, Training Loss: 0.00855
Epoch: 37563, Training Loss: 0.00663
Epoch: 37564, Training Loss: 0.00749
Epoch: 37564, Training Loss: 0.00700
Epoch: 37564, Training Loss: 0.00855
Epoch: 37564, Training Loss: 0.00663
Epoch: 37565, Training Loss: 0.00749
Epoch: 37565, Training Loss: 0.00700
Epoch: 37565, Training Loss: 0.00855
Epoch: 37565, Training Loss: 0.00663
Epoch: 37566, Training Loss: 0.00749
Epoch: 37566, Training Loss: 0.00700
Epoch: 37566, Training Loss: 0.00855
Epoch: 37566, Training Loss: 0.00663
Epoch: 37567, Training Loss: 0.00749
Epoch: 37567, Training Loss: 0.00700
Epoch: 37567, Training Loss: 0.00855
Epoch: 37567, Training Loss: 0.00663
Epoch: 37568, Training Loss: 0.00749
Epoch: 37568, Training Loss: 0.00700
Epoch: 37568, Training Loss: 0.00855
Epoch: 37568, Training Loss: 0.00663
Epoch: 37569, Training Loss: 0.00749
Epoch: 37569, Training Loss: 0.00700
Epoch: 37569, Training Loss: 0.00855
Epoch: 37569, Training Loss: 0.00663
Epoch: 37570, Training Loss: 0.00749
Epoch: 37570, Training Loss: 0.00700
Epoch: 37570, Training Loss: 0.00855
Epoch: 37570, Training Loss: 0.00663
Epoch: 37571, Training Loss: 0.00749
Epoch: 37571, Training Loss: 0.00700
Epoch: 37571, Training Loss: 0.00855
Epoch: 37571, Training Loss: 0.00663
Epoch: 37572, Training Loss: 0.00749
Epoch: 37572, Training Loss: 0.00700
Epoch: 37572, Training Loss: 0.00855
Epoch: 37572, Training Loss: 0.00663
Epoch: 37573, Training Loss: 0.00749
Epoch: 37573, Training Loss: 0.00700
Epoch: 37573, Training Loss: 0.00855
Epoch: 37573, Training Loss: 0.00663
Epoch: 37574, Training Loss: 0.00749
Epoch: 37574, Training Loss: 0.00700
Epoch: 37574, Training Loss: 0.00855
Epoch: 37574, Training Loss: 0.00663
Epoch: 37575, Training Loss: 0.00749
Epoch: 37575, Training Loss: 0.00700
Epoch: 37575, Training Loss: 0.00855
Epoch: 37575, Training Loss: 0.00663
Epoch: 37576, Training Loss: 0.00749
Epoch: 37576, Training Loss: 0.00700
Epoch: 37576, Training Loss: 0.00855
Epoch: 37576, Training Loss: 0.00663
Epoch: 37577, Training Loss: 0.00749
Epoch: 37577, Training Loss: 0.00700
Epoch: 37577, Training Loss: 0.00855
Epoch: 37577, Training Loss: 0.00663
Epoch: 37578, Training Loss: 0.00749
Epoch: 37578, Training Loss: 0.00700
Epoch: 37578, Training Loss: 0.00855
Epoch: 37578, Training Loss: 0.00663
Epoch: 37579, Training Loss: 0.00749
Epoch: 37579, Training Loss: 0.00700
Epoch: 37579, Training Loss: 0.00855
Epoch: 37579, Training Loss: 0.00663
Epoch: 37580, Training Loss: 0.00749
Epoch: 37580, Training Loss: 0.00700
Epoch: 37580, Training Loss: 0.00855
Epoch: 37580, Training Loss: 0.00663
Epoch: 37581, Training Loss: 0.00749
Epoch: 37581, Training Loss: 0.00700
Epoch: 37581, Training Loss: 0.00855
Epoch: 37581, Training Loss: 0.00663
Epoch: 37582, Training Loss: 0.00749
Epoch: 37582, Training Loss: 0.00700
Epoch: 37582, Training Loss: 0.00855
Epoch: 37582, Training Loss: 0.00663
Epoch: 37583, Training Loss: 0.00749
Epoch: 37583, Training Loss: 0.00700
Epoch: 37583, Training Loss: 0.00855
Epoch: 37583, Training Loss: 0.00663
Epoch: 37584, Training Loss: 0.00749
Epoch: 37584, Training Loss: 0.00700
Epoch: 37584, Training Loss: 0.00855
Epoch: 37584, Training Loss: 0.00663
Epoch: 37585, Training Loss: 0.00749
Epoch: 37585, Training Loss: 0.00700
Epoch: 37585, Training Loss: 0.00855
Epoch: 37585, Training Loss: 0.00663
Epoch: 37586, Training Loss: 0.00749
Epoch: 37586, Training Loss: 0.00700
Epoch: 37586, Training Loss: 0.00855
Epoch: 37586, Training Loss: 0.00663
Epoch: 37587, Training Loss: 0.00749
Epoch: 37587, Training Loss: 0.00700
Epoch: 37587, Training Loss: 0.00855
Epoch: 37587, Training Loss: 0.00663
Epoch: 37588, Training Loss: 0.00749
Epoch: 37588, Training Loss: 0.00700
Epoch: 37588, Training Loss: 0.00855
Epoch: 37588, Training Loss: 0.00663
Epoch: 37589, Training Loss: 0.00749
Epoch: 37589, Training Loss: 0.00700
Epoch: 37589, Training Loss: 0.00855
Epoch: 37589, Training Loss: 0.00663
Epoch: 37590, Training Loss: 0.00749
Epoch: 37590, Training Loss: 0.00700
Epoch: 37590, Training Loss: 0.00855
Epoch: 37590, Training Loss: 0.00663
Epoch: 37591, Training Loss: 0.00748
Epoch: 37591, Training Loss: 0.00700
Epoch: 37591, Training Loss: 0.00855
Epoch: 37591, Training Loss: 0.00663
Epoch: 37592, Training Loss: 0.00748
Epoch: 37592, Training Loss: 0.00700
Epoch: 37592, Training Loss: 0.00855
Epoch: 37592, Training Loss: 0.00663
Epoch: 37593, Training Loss: 0.00748
Epoch: 37593, Training Loss: 0.00700
Epoch: 37593, Training Loss: 0.00855
Epoch: 37593, Training Loss: 0.00663
Epoch: 37594, Training Loss: 0.00748
Epoch: 37594, Training Loss: 0.00700
Epoch: 37594, Training Loss: 0.00855
Epoch: 37594, Training Loss: 0.00663
Epoch: 37595, Training Loss: 0.00748
Epoch: 37595, Training Loss: 0.00700
Epoch: 37595, Training Loss: 0.00855
Epoch: 37595, Training Loss: 0.00663
Epoch: 37596, Training Loss: 0.00748
Epoch: 37596, Training Loss: 0.00700
Epoch: 37596, Training Loss: 0.00855
Epoch: 37596, Training Loss: 0.00663
Epoch: 37597, Training Loss: 0.00748
Epoch: 37597, Training Loss: 0.00700
Epoch: 37597, Training Loss: 0.00855
Epoch: 37597, Training Loss: 0.00663
Epoch: 37598, Training Loss: 0.00748
Epoch: 37598, Training Loss: 0.00700
Epoch: 37598, Training Loss: 0.00855
Epoch: 37598, Training Loss: 0.00663
Epoch: 37599, Training Loss: 0.00748
Epoch: 37599, Training Loss: 0.00700
Epoch: 37599, Training Loss: 0.00855
Epoch: 37599, Training Loss: 0.00663
Epoch: 37600, Training Loss: 0.00748
Epoch: 37600, Training Loss: 0.00700
Epoch: 37600, Training Loss: 0.00855
Epoch: 37600, Training Loss: 0.00663
Epoch: 37601, Training Loss: 0.00748
Epoch: 37601, Training Loss: 0.00700
Epoch: 37601, Training Loss: 0.00855
Epoch: 37601, Training Loss: 0.00663
Epoch: 37602, Training Loss: 0.00748
Epoch: 37602, Training Loss: 0.00700
Epoch: 37602, Training Loss: 0.00855
Epoch: 37602, Training Loss: 0.00663
Epoch: 37603, Training Loss: 0.00748
Epoch: 37603, Training Loss: 0.00700
Epoch: 37603, Training Loss: 0.00855
Epoch: 37603, Training Loss: 0.00663
Epoch: 37604, Training Loss: 0.00748
Epoch: 37604, Training Loss: 0.00700
Epoch: 37604, Training Loss: 0.00855
Epoch: 37604, Training Loss: 0.00663
Epoch: 37605, Training Loss: 0.00748
Epoch: 37605, Training Loss: 0.00700
Epoch: 37605, Training Loss: 0.00855
Epoch: 37605, Training Loss: 0.00663
Epoch: 37606, Training Loss: 0.00748
Epoch: 37606, Training Loss: 0.00700
Epoch: 37606, Training Loss: 0.00855
Epoch: 37606, Training Loss: 0.00663
Epoch: 37607, Training Loss: 0.00748
Epoch: 37607, Training Loss: 0.00700
Epoch: 37607, Training Loss: 0.00855
Epoch: 37607, Training Loss: 0.00663
Epoch: 37608, Training Loss: 0.00748
Epoch: 37608, Training Loss: 0.00700
Epoch: 37608, Training Loss: 0.00855
Epoch: 37608, Training Loss: 0.00663
Epoch: 37609, Training Loss: 0.00748
Epoch: 37609, Training Loss: 0.00700
Epoch: 37609, Training Loss: 0.00855
Epoch: 37609, Training Loss: 0.00663
Epoch: 37610, Training Loss: 0.00748
Epoch: 37610, Training Loss: 0.00700
Epoch: 37610, Training Loss: 0.00855
Epoch: 37610, Training Loss: 0.00663
Epoch: 37611, Training Loss: 0.00748
Epoch: 37611, Training Loss: 0.00700
Epoch: 37611, Training Loss: 0.00855
Epoch: 37611, Training Loss: 0.00663
Epoch: 37612, Training Loss: 0.00748
Epoch: 37612, Training Loss: 0.00700
Epoch: 37612, Training Loss: 0.00855
Epoch: 37612, Training Loss: 0.00663
Epoch: 37613, Training Loss: 0.00748
Epoch: 37613, Training Loss: 0.00700
Epoch: 37613, Training Loss: 0.00855
Epoch: 37613, Training Loss: 0.00663
Epoch: 37614, Training Loss: 0.00748
Epoch: 37614, Training Loss: 0.00700
Epoch: 37614, Training Loss: 0.00855
Epoch: 37614, Training Loss: 0.00663
Epoch: 37615, Training Loss: 0.00748
Epoch: 37615, Training Loss: 0.00700
Epoch: 37615, Training Loss: 0.00855
Epoch: 37615, Training Loss: 0.00663
Epoch: 37616, Training Loss: 0.00748
Epoch: 37616, Training Loss: 0.00700
Epoch: 37616, Training Loss: 0.00855
Epoch: 37616, Training Loss: 0.00663
Epoch: 37617, Training Loss: 0.00748
Epoch: 37617, Training Loss: 0.00700
Epoch: 37617, Training Loss: 0.00855
Epoch: 37617, Training Loss: 0.00663
Epoch: 37618, Training Loss: 0.00748
Epoch: 37618, Training Loss: 0.00700
Epoch: 37618, Training Loss: 0.00855
Epoch: 37618, Training Loss: 0.00663
Epoch: 37619, Training Loss: 0.00748
Epoch: 37619, Training Loss: 0.00700
Epoch: 37619, Training Loss: 0.00855
Epoch: 37619, Training Loss: 0.00663
Epoch: 37620, Training Loss: 0.00748
Epoch: 37620, Training Loss: 0.00700
Epoch: 37620, Training Loss: 0.00855
Epoch: 37620, Training Loss: 0.00663
Epoch: 37621, Training Loss: 0.00748
Epoch: 37621, Training Loss: 0.00700
Epoch: 37621, Training Loss: 0.00855
Epoch: 37621, Training Loss: 0.00663
Epoch: 37622, Training Loss: 0.00748
Epoch: 37622, Training Loss: 0.00700
Epoch: 37622, Training Loss: 0.00855
Epoch: 37622, Training Loss: 0.00663
Epoch: 37623, Training Loss: 0.00748
Epoch: 37623, Training Loss: 0.00700
Epoch: 37623, Training Loss: 0.00855
Epoch: 37623, Training Loss: 0.00663
Epoch: 37624, Training Loss: 0.00748
Epoch: 37624, Training Loss: 0.00700
Epoch: 37624, Training Loss: 0.00855
Epoch: 37624, Training Loss: 0.00663
Epoch: 37625, Training Loss: 0.00748
Epoch: 37625, Training Loss: 0.00700
Epoch: 37625, Training Loss: 0.00855
Epoch: 37625, Training Loss: 0.00663
Epoch: 37626, Training Loss: 0.00748
Epoch: 37626, Training Loss: 0.00700
Epoch: 37626, Training Loss: 0.00855
Epoch: 37626, Training Loss: 0.00663
Epoch: 37627, Training Loss: 0.00748
Epoch: 37627, Training Loss: 0.00700
Epoch: 37627, Training Loss: 0.00855
Epoch: 37627, Training Loss: 0.00663
Epoch: 37628, Training Loss: 0.00748
Epoch: 37628, Training Loss: 0.00700
Epoch: 37628, Training Loss: 0.00855
Epoch: 37628, Training Loss: 0.00663
Epoch: 37629, Training Loss: 0.00748
Epoch: 37629, Training Loss: 0.00700
Epoch: 37629, Training Loss: 0.00855
Epoch: 37629, Training Loss: 0.00663
Epoch: 37630, Training Loss: 0.00748
Epoch: 37630, Training Loss: 0.00700
Epoch: 37630, Training Loss: 0.00855
Epoch: 37630, Training Loss: 0.00663
Epoch: 37631, Training Loss: 0.00748
Epoch: 37631, Training Loss: 0.00700
Epoch: 37631, Training Loss: 0.00855
Epoch: 37631, Training Loss: 0.00663
Epoch: 37632, Training Loss: 0.00748
Epoch: 37632, Training Loss: 0.00700
Epoch: 37632, Training Loss: 0.00855
Epoch: 37632, Training Loss: 0.00663
Epoch: 37633, Training Loss: 0.00748
Epoch: 37633, Training Loss: 0.00700
Epoch: 37633, Training Loss: 0.00855
Epoch: 37633, Training Loss: 0.00663
Epoch: 37634, Training Loss: 0.00748
Epoch: 37634, Training Loss: 0.00700
Epoch: 37634, Training Loss: 0.00855
Epoch: 37634, Training Loss: 0.00663
Epoch: 37635, Training Loss: 0.00748
Epoch: 37635, Training Loss: 0.00700
Epoch: 37635, Training Loss: 0.00855
Epoch: 37635, Training Loss: 0.00663
Epoch: 37636, Training Loss: 0.00748
Epoch: 37636, Training Loss: 0.00700
Epoch: 37636, Training Loss: 0.00855
Epoch: 37636, Training Loss: 0.00663
Epoch: 37637, Training Loss: 0.00748
Epoch: 37637, Training Loss: 0.00700
Epoch: 37637, Training Loss: 0.00855
Epoch: 37637, Training Loss: 0.00663
Epoch: 37638, Training Loss: 0.00748
Epoch: 37638, Training Loss: 0.00700
Epoch: 37638, Training Loss: 0.00855
Epoch: 37638, Training Loss: 0.00663
Epoch: 37639, Training Loss: 0.00748
Epoch: 37639, Training Loss: 0.00700
Epoch: 37639, Training Loss: 0.00855
Epoch: 37639, Training Loss: 0.00663
Epoch: 37640, Training Loss: 0.00748
Epoch: 37640, Training Loss: 0.00700
Epoch: 37640, Training Loss: 0.00855
Epoch: 37640, Training Loss: 0.00663
Epoch: 37641, Training Loss: 0.00748
Epoch: 37641, Training Loss: 0.00700
Epoch: 37641, Training Loss: 0.00855
Epoch: 37641, Training Loss: 0.00662
Epoch: 37642, Training Loss: 0.00748
Epoch: 37642, Training Loss: 0.00700
Epoch: 37642, Training Loss: 0.00855
Epoch: 37642, Training Loss: 0.00662
Epoch: 37643, Training Loss: 0.00748
Epoch: 37643, Training Loss: 0.00700
Epoch: 37643, Training Loss: 0.00855
Epoch: 37643, Training Loss: 0.00662
Epoch: 37644, Training Loss: 0.00748
Epoch: 37644, Training Loss: 0.00700
Epoch: 37644, Training Loss: 0.00854
Epoch: 37644, Training Loss: 0.00662
Epoch: 37645, Training Loss: 0.00748
Epoch: 37645, Training Loss: 0.00700
Epoch: 37645, Training Loss: 0.00854
Epoch: 37645, Training Loss: 0.00662
Epoch: 37646, Training Loss: 0.00748
Epoch: 37646, Training Loss: 0.00700
Epoch: 37646, Training Loss: 0.00854
Epoch: 37646, Training Loss: 0.00662
Epoch: 37647, Training Loss: 0.00748
Epoch: 37647, Training Loss: 0.00700
Epoch: 37647, Training Loss: 0.00854
Epoch: 37647, Training Loss: 0.00662
Epoch: 37648, Training Loss: 0.00748
Epoch: 37648, Training Loss: 0.00700
Epoch: 37648, Training Loss: 0.00854
Epoch: 37648, Training Loss: 0.00662
Epoch: 37649, Training Loss: 0.00748
Epoch: 37649, Training Loss: 0.00700
Epoch: 37649, Training Loss: 0.00854
Epoch: 37649, Training Loss: 0.00662
Epoch: 37650, Training Loss: 0.00748
Epoch: 37650, Training Loss: 0.00700
Epoch: 37650, Training Loss: 0.00854
Epoch: 37650, Training Loss: 0.00662
Epoch: 37651, Training Loss: 0.00748
Epoch: 37651, Training Loss: 0.00700
Epoch: 37651, Training Loss: 0.00854
Epoch: 37651, Training Loss: 0.00662
Epoch: 37652, Training Loss: 0.00748
Epoch: 37652, Training Loss: 0.00700
Epoch: 37652, Training Loss: 0.00854
Epoch: 37652, Training Loss: 0.00662
Epoch: 37653, Training Loss: 0.00748
Epoch: 37653, Training Loss: 0.00700
Epoch: 37653, Training Loss: 0.00854
Epoch: 37653, Training Loss: 0.00662
Epoch: 37654, Training Loss: 0.00748
Epoch: 37654, Training Loss: 0.00700
Epoch: 37654, Training Loss: 0.00854
Epoch: 37654, Training Loss: 0.00662
Epoch: 37655, Training Loss: 0.00748
Epoch: 37655, Training Loss: 0.00700
Epoch: 37655, Training Loss: 0.00854
Epoch: 37655, Training Loss: 0.00662
Epoch: 37656, Training Loss: 0.00748
Epoch: 37656, Training Loss: 0.00700
Epoch: 37656, Training Loss: 0.00854
Epoch: 37656, Training Loss: 0.00662
Epoch: 37657, Training Loss: 0.00748
Epoch: 37657, Training Loss: 0.00700
Epoch: 37657, Training Loss: 0.00854
Epoch: 37657, Training Loss: 0.00662
Epoch: 37658, Training Loss: 0.00748
Epoch: 37658, Training Loss: 0.00700
Epoch: 37658, Training Loss: 0.00854
Epoch: 37658, Training Loss: 0.00662
Epoch: 37659, Training Loss: 0.00748
Epoch: 37659, Training Loss: 0.00700
Epoch: 37659, Training Loss: 0.00854
Epoch: 37659, Training Loss: 0.00662
Epoch: 37660, Training Loss: 0.00748
Epoch: 37660, Training Loss: 0.00700
Epoch: 37660, Training Loss: 0.00854
Epoch: 37660, Training Loss: 0.00662
Epoch: 37661, Training Loss: 0.00748
Epoch: 37661, Training Loss: 0.00700
Epoch: 37661, Training Loss: 0.00854
Epoch: 37661, Training Loss: 0.00662
Epoch: 37662, Training Loss: 0.00748
Epoch: 37662, Training Loss: 0.00700
Epoch: 37662, Training Loss: 0.00854
Epoch: 37662, Training Loss: 0.00662
Epoch: 37663, Training Loss: 0.00748
Epoch: 37663, Training Loss: 0.00700
Epoch: 37663, Training Loss: 0.00854
Epoch: 37663, Training Loss: 0.00662
Epoch: 37664, Training Loss: 0.00748
Epoch: 37664, Training Loss: 0.00699
Epoch: 37664, Training Loss: 0.00854
Epoch: 37664, Training Loss: 0.00662
Epoch: 37665, Training Loss: 0.00748
Epoch: 37665, Training Loss: 0.00699
Epoch: 37665, Training Loss: 0.00854
Epoch: 37665, Training Loss: 0.00662
Epoch: 37666, Training Loss: 0.00748
Epoch: 37666, Training Loss: 0.00699
Epoch: 37666, Training Loss: 0.00854
Epoch: 37666, Training Loss: 0.00662
Epoch: 37667, Training Loss: 0.00748
Epoch: 37667, Training Loss: 0.00699
Epoch: 37667, Training Loss: 0.00854
Epoch: 37667, Training Loss: 0.00662
Epoch: 37668, Training Loss: 0.00748
Epoch: 37668, Training Loss: 0.00699
Epoch: 37668, Training Loss: 0.00854
Epoch: 37668, Training Loss: 0.00662
...
Epoch: 37911, Training Loss: 0.00745
Epoch: 37911, Training Loss: 0.00697
Epoch: 37911, Training Loss: 0.00851
Epoch: 37911, Training Loss: 0.00660
Epoch: 37912, Training Loss: 0.00745
Epoch: 37912, Training Loss: 0.00697
Epoch: 37912, Training Loss: 0.00851
Epoch: 37912, Training Loss: 0.00660
Epoch: 37913, Training Loss: 0.00745
Epoch: 37913, Training Loss: 0.00697
Epoch: 37913, Training Loss: 0.00851
Epoch: 37913, Training Loss: 0.00660
Epoch: 37914, Training Loss: 0.00745
Epoch: 37914, Training Loss: 0.00697
Epoch: 37914, Training Loss: 0.00851
Epoch: 37914, Training Loss: 0.00660
Epoch: 37915, Training Loss: 0.00745
Epoch: 37915, Training Loss: 0.00697
Epoch: 37915, Training Loss: 0.00851
Epoch: 37915, Training Loss: 0.00660
Epoch: 37916, Training Loss: 0.00745
Epoch: 37916, Training Loss: 0.00697
Epoch: 37916, Training Loss: 0.00851
Epoch: 37916, Training Loss: 0.00660
Epoch: 37917, Training Loss: 0.00745
Epoch: 37917, Training Loss: 0.00697
Epoch: 37917, Training Loss: 0.00851
Epoch: 37917, Training Loss: 0.00660
Epoch: 37918, Training Loss: 0.00745
Epoch: 37918, Training Loss: 0.00697
Epoch: 37918, Training Loss: 0.00851
Epoch: 37918, Training Loss: 0.00660
Epoch: 37919, Training Loss: 0.00745
Epoch: 37919, Training Loss: 0.00697
Epoch: 37919, Training Loss: 0.00851
Epoch: 37919, Training Loss: 0.00660
Epoch: 37920, Training Loss: 0.00745
Epoch: 37920, Training Loss: 0.00697
Epoch: 37920, Training Loss: 0.00851
Epoch: 37920, Training Loss: 0.00660
Epoch: 37921, Training Loss: 0.00745
Epoch: 37921, Training Loss: 0.00697
Epoch: 37921, Training Loss: 0.00851
Epoch: 37921, Training Loss: 0.00660
Epoch: 37922, Training Loss: 0.00745
Epoch: 37922, Training Loss: 0.00697
Epoch: 37922, Training Loss: 0.00851
Epoch: 37922, Training Loss: 0.00660
Epoch: 37923, Training Loss: 0.00745
Epoch: 37923, Training Loss: 0.00697
Epoch: 37923, Training Loss: 0.00851
Epoch: 37923, Training Loss: 0.00660
Epoch: 37924, Training Loss: 0.00745
Epoch: 37924, Training Loss: 0.00697
Epoch: 37924, Training Loss: 0.00851
Epoch: 37924, Training Loss: 0.00660
Epoch: 37925, Training Loss: 0.00745
Epoch: 37925, Training Loss: 0.00697
Epoch: 37925, Training Loss: 0.00851
Epoch: 37925, Training Loss: 0.00660
Epoch: 37926, Training Loss: 0.00745
Epoch: 37926, Training Loss: 0.00697
Epoch: 37926, Training Loss: 0.00851
Epoch: 37926, Training Loss: 0.00660
Epoch: 37927, Training Loss: 0.00745
Epoch: 37927, Training Loss: 0.00697
Epoch: 37927, Training Loss: 0.00851
Epoch: 37927, Training Loss: 0.00660
Epoch: 37928, Training Loss: 0.00745
Epoch: 37928, Training Loss: 0.00697
Epoch: 37928, Training Loss: 0.00851
Epoch: 37928, Training Loss: 0.00660
Epoch: 37929, Training Loss: 0.00745
Epoch: 37929, Training Loss: 0.00697
Epoch: 37929, Training Loss: 0.00851
Epoch: 37929, Training Loss: 0.00660
Epoch: 37930, Training Loss: 0.00745
Epoch: 37930, Training Loss: 0.00697
Epoch: 37930, Training Loss: 0.00851
Epoch: 37930, Training Loss: 0.00660
Epoch: 37931, Training Loss: 0.00745
Epoch: 37931, Training Loss: 0.00697
Epoch: 37931, Training Loss: 0.00851
Epoch: 37931, Training Loss: 0.00660
Epoch: 37932, Training Loss: 0.00745
Epoch: 37932, Training Loss: 0.00697
Epoch: 37932, Training Loss: 0.00851
Epoch: 37932, Training Loss: 0.00660
Epoch: 37933, Training Loss: 0.00745
Epoch: 37933, Training Loss: 0.00697
Epoch: 37933, Training Loss: 0.00851
Epoch: 37933, Training Loss: 0.00660
Epoch: 37934, Training Loss: 0.00745
Epoch: 37934, Training Loss: 0.00697
Epoch: 37934, Training Loss: 0.00851
Epoch: 37934, Training Loss: 0.00660
Epoch: 37935, Training Loss: 0.00745
Epoch: 37935, Training Loss: 0.00697
Epoch: 37935, Training Loss: 0.00851
Epoch: 37935, Training Loss: 0.00660
Epoch: 37936, Training Loss: 0.00745
Epoch: 37936, Training Loss: 0.00697
Epoch: 37936, Training Loss: 0.00851
Epoch: 37936, Training Loss: 0.00660
Epoch: 37937, Training Loss: 0.00745
Epoch: 37937, Training Loss: 0.00697
Epoch: 37937, Training Loss: 0.00851
Epoch: 37937, Training Loss: 0.00660
Epoch: 37938, Training Loss: 0.00745
Epoch: 37938, Training Loss: 0.00697
Epoch: 37938, Training Loss: 0.00851
Epoch: 37938, Training Loss: 0.00660
Epoch: 37939, Training Loss: 0.00745
Epoch: 37939, Training Loss: 0.00697
Epoch: 37939, Training Loss: 0.00851
Epoch: 37939, Training Loss: 0.00660
Epoch: 37940, Training Loss: 0.00745
Epoch: 37940, Training Loss: 0.00697
Epoch: 37940, Training Loss: 0.00851
Epoch: 37940, Training Loss: 0.00660
Epoch: 37941, Training Loss: 0.00745
Epoch: 37941, Training Loss: 0.00697
Epoch: 37941, Training Loss: 0.00851
Epoch: 37941, Training Loss: 0.00660
Epoch: 37942, Training Loss: 0.00745
Epoch: 37942, Training Loss: 0.00697
Epoch: 37942, Training Loss: 0.00851
Epoch: 37942, Training Loss: 0.00660
Epoch: 37943, Training Loss: 0.00745
Epoch: 37943, Training Loss: 0.00697
Epoch: 37943, Training Loss: 0.00851
Epoch: 37943, Training Loss: 0.00660
Epoch: 37944, Training Loss: 0.00745
Epoch: 37944, Training Loss: 0.00697
Epoch: 37944, Training Loss: 0.00851
Epoch: 37944, Training Loss: 0.00660
Epoch: 37945, Training Loss: 0.00745
Epoch: 37945, Training Loss: 0.00697
Epoch: 37945, Training Loss: 0.00851
Epoch: 37945, Training Loss: 0.00660
Epoch: 37946, Training Loss: 0.00745
Epoch: 37946, Training Loss: 0.00697
Epoch: 37946, Training Loss: 0.00851
Epoch: 37946, Training Loss: 0.00660
Epoch: 37947, Training Loss: 0.00745
Epoch: 37947, Training Loss: 0.00697
Epoch: 37947, Training Loss: 0.00851
Epoch: 37947, Training Loss: 0.00660
Epoch: 37948, Training Loss: 0.00745
Epoch: 37948, Training Loss: 0.00697
Epoch: 37948, Training Loss: 0.00851
Epoch: 37948, Training Loss: 0.00660
Epoch: 37949, Training Loss: 0.00745
Epoch: 37949, Training Loss: 0.00697
Epoch: 37949, Training Loss: 0.00851
Epoch: 37949, Training Loss: 0.00660
Epoch: 37950, Training Loss: 0.00745
Epoch: 37950, Training Loss: 0.00697
Epoch: 37950, Training Loss: 0.00851
Epoch: 37950, Training Loss: 0.00660
Epoch: 37951, Training Loss: 0.00745
Epoch: 37951, Training Loss: 0.00697
Epoch: 37951, Training Loss: 0.00851
Epoch: 37951, Training Loss: 0.00660
Epoch: 37952, Training Loss: 0.00745
Epoch: 37952, Training Loss: 0.00697
Epoch: 37952, Training Loss: 0.00851
Epoch: 37952, Training Loss: 0.00660
Epoch: 37953, Training Loss: 0.00745
Epoch: 37953, Training Loss: 0.00697
Epoch: 37953, Training Loss: 0.00851
Epoch: 37953, Training Loss: 0.00660
Epoch: 37954, Training Loss: 0.00745
Epoch: 37954, Training Loss: 0.00697
Epoch: 37954, Training Loss: 0.00851
Epoch: 37954, Training Loss: 0.00660
Epoch: 37955, Training Loss: 0.00745
Epoch: 37955, Training Loss: 0.00697
Epoch: 37955, Training Loss: 0.00851
Epoch: 37955, Training Loss: 0.00660
Epoch: 37956, Training Loss: 0.00745
Epoch: 37956, Training Loss: 0.00697
Epoch: 37956, Training Loss: 0.00851
Epoch: 37956, Training Loss: 0.00660
Epoch: 37957, Training Loss: 0.00745
Epoch: 37957, Training Loss: 0.00697
Epoch: 37957, Training Loss: 0.00851
Epoch: 37957, Training Loss: 0.00660
Epoch: 37958, Training Loss: 0.00745
Epoch: 37958, Training Loss: 0.00697
Epoch: 37958, Training Loss: 0.00851
Epoch: 37958, Training Loss: 0.00660
Epoch: 37959, Training Loss: 0.00745
Epoch: 37959, Training Loss: 0.00697
Epoch: 37959, Training Loss: 0.00851
Epoch: 37959, Training Loss: 0.00660
Epoch: 37960, Training Loss: 0.00745
Epoch: 37960, Training Loss: 0.00697
Epoch: 37960, Training Loss: 0.00851
Epoch: 37960, Training Loss: 0.00660
Epoch: 37961, Training Loss: 0.00745
Epoch: 37961, Training Loss: 0.00697
Epoch: 37961, Training Loss: 0.00851
Epoch: 37961, Training Loss: 0.00660
Epoch: 37962, Training Loss: 0.00745
Epoch: 37962, Training Loss: 0.00697
Epoch: 37962, Training Loss: 0.00851
Epoch: 37962, Training Loss: 0.00660
Epoch: 37963, Training Loss: 0.00745
Epoch: 37963, Training Loss: 0.00697
Epoch: 37963, Training Loss: 0.00851
Epoch: 37963, Training Loss: 0.00660
Epoch: 37964, Training Loss: 0.00745
Epoch: 37964, Training Loss: 0.00697
Epoch: 37964, Training Loss: 0.00851
Epoch: 37964, Training Loss: 0.00660
Epoch: 37965, Training Loss: 0.00745
Epoch: 37965, Training Loss: 0.00697
Epoch: 37965, Training Loss: 0.00851
Epoch: 37965, Training Loss: 0.00660
Epoch: 37966, Training Loss: 0.00745
Epoch: 37966, Training Loss: 0.00697
Epoch: 37966, Training Loss: 0.00851
Epoch: 37966, Training Loss: 0.00660
Epoch: 37967, Training Loss: 0.00745
Epoch: 37967, Training Loss: 0.00697
Epoch: 37967, Training Loss: 0.00851
Epoch: 37967, Training Loss: 0.00660
Epoch: 37968, Training Loss: 0.00745
Epoch: 37968, Training Loss: 0.00697
Epoch: 37968, Training Loss: 0.00851
Epoch: 37968, Training Loss: 0.00660
Epoch: 37969, Training Loss: 0.00745
Epoch: 37969, Training Loss: 0.00697
Epoch: 37969, Training Loss: 0.00851
Epoch: 37969, Training Loss: 0.00660
Epoch: 37970, Training Loss: 0.00745
Epoch: 37970, Training Loss: 0.00697
Epoch: 37970, Training Loss: 0.00851
Epoch: 37970, Training Loss: 0.00660
Epoch: 37971, Training Loss: 0.00745
Epoch: 37971, Training Loss: 0.00697
Epoch: 37971, Training Loss: 0.00851
Epoch: 37971, Training Loss: 0.00659
Epoch: 37972, Training Loss: 0.00745
Epoch: 37972, Training Loss: 0.00697
Epoch: 37972, Training Loss: 0.00851
Epoch: 37972, Training Loss: 0.00659
Epoch: 37973, Training Loss: 0.00745
Epoch: 37973, Training Loss: 0.00697
Epoch: 37973, Training Loss: 0.00851
Epoch: 37973, Training Loss: 0.00659
Epoch: 37974, Training Loss: 0.00745
Epoch: 37974, Training Loss: 0.00697
Epoch: 37974, Training Loss: 0.00851
Epoch: 37974, Training Loss: 0.00659
Epoch: 37975, Training Loss: 0.00744
Epoch: 37975, Training Loss: 0.00697
Epoch: 37975, Training Loss: 0.00851
Epoch: 37975, Training Loss: 0.00659
Epoch: 37976, Training Loss: 0.00744
Epoch: 37976, Training Loss: 0.00697
Epoch: 37976, Training Loss: 0.00851
Epoch: 37976, Training Loss: 0.00659
Epoch: 37977, Training Loss: 0.00744
Epoch: 37977, Training Loss: 0.00696
Epoch: 37977, Training Loss: 0.00851
Epoch: 37977, Training Loss: 0.00659
Epoch: 37978, Training Loss: 0.00744
Epoch: 37978, Training Loss: 0.00696
Epoch: 37978, Training Loss: 0.00851
Epoch: 37978, Training Loss: 0.00659
Epoch: 37979, Training Loss: 0.00744
Epoch: 37979, Training Loss: 0.00696
Epoch: 37979, Training Loss: 0.00851
Epoch: 37979, Training Loss: 0.00659
Epoch: 37980, Training Loss: 0.00744
Epoch: 37980, Training Loss: 0.00696
Epoch: 37980, Training Loss: 0.00851
Epoch: 37980, Training Loss: 0.00659
Epoch: 37981, Training Loss: 0.00744
Epoch: 37981, Training Loss: 0.00696
Epoch: 37981, Training Loss: 0.00851
Epoch: 37981, Training Loss: 0.00659
Epoch: 37982, Training Loss: 0.00744
Epoch: 37982, Training Loss: 0.00696
Epoch: 37982, Training Loss: 0.00850
Epoch: 37982, Training Loss: 0.00659
Epoch: 37983, Training Loss: 0.00744
Epoch: 37983, Training Loss: 0.00696
Epoch: 37983, Training Loss: 0.00850
Epoch: 37983, Training Loss: 0.00659
Epoch: 37984, Training Loss: 0.00744
Epoch: 37984, Training Loss: 0.00696
Epoch: 37984, Training Loss: 0.00850
Epoch: 37984, Training Loss: 0.00659
Epoch: 37985, Training Loss: 0.00744
Epoch: 37985, Training Loss: 0.00696
Epoch: 37985, Training Loss: 0.00850
Epoch: 37985, Training Loss: 0.00659
Epoch: 37986, Training Loss: 0.00744
Epoch: 37986, Training Loss: 0.00696
Epoch: 37986, Training Loss: 0.00850
Epoch: 37986, Training Loss: 0.00659
Epoch: 37987, Training Loss: 0.00744
Epoch: 37987, Training Loss: 0.00696
Epoch: 37987, Training Loss: 0.00850
Epoch: 37987, Training Loss: 0.00659
Epoch: 37988, Training Loss: 0.00744
Epoch: 37988, Training Loss: 0.00696
Epoch: 37988, Training Loss: 0.00850
Epoch: 37988, Training Loss: 0.00659
Epoch: 37989, Training Loss: 0.00744
Epoch: 37989, Training Loss: 0.00696
Epoch: 37989, Training Loss: 0.00850
Epoch: 37989, Training Loss: 0.00659
Epoch: 37990, Training Loss: 0.00744
Epoch: 37990, Training Loss: 0.00696
Epoch: 37990, Training Loss: 0.00850
Epoch: 37990, Training Loss: 0.00659
Epoch: 37991, Training Loss: 0.00744
Epoch: 37991, Training Loss: 0.00696
Epoch: 37991, Training Loss: 0.00850
Epoch: 37991, Training Loss: 0.00659
Epoch: 37992, Training Loss: 0.00744
Epoch: 37992, Training Loss: 0.00696
Epoch: 37992, Training Loss: 0.00850
Epoch: 37992, Training Loss: 0.00659
Epoch: 37993, Training Loss: 0.00744
Epoch: 37993, Training Loss: 0.00696
Epoch: 37993, Training Loss: 0.00850
Epoch: 37993, Training Loss: 0.00659
Epoch: 37994, Training Loss: 0.00744
Epoch: 37994, Training Loss: 0.00696
Epoch: 37994, Training Loss: 0.00850
Epoch: 37994, Training Loss: 0.00659
Epoch: 37995, Training Loss: 0.00744
Epoch: 37995, Training Loss: 0.00696
Epoch: 37995, Training Loss: 0.00850
Epoch: 37995, Training Loss: 0.00659
Epoch: 37996, Training Loss: 0.00744
Epoch: 37996, Training Loss: 0.00696
Epoch: 37996, Training Loss: 0.00850
Epoch: 37996, Training Loss: 0.00659
Epoch: 37997, Training Loss: 0.00744
Epoch: 37997, Training Loss: 0.00696
Epoch: 37997, Training Loss: 0.00850
Epoch: 37997, Training Loss: 0.00659
Epoch: 37998, Training Loss: 0.00744
Epoch: 37998, Training Loss: 0.00696
Epoch: 37998, Training Loss: 0.00850
Epoch: 37998, Training Loss: 0.00659
Epoch: 37999, Training Loss: 0.00744
Epoch: 37999, Training Loss: 0.00696
Epoch: 37999, Training Loss: 0.00850
Epoch: 37999, Training Loss: 0.00659
Epoch: 38000, Training Loss: 0.00744
Epoch: 38000, Training Loss: 0.00696
Epoch: 38000, Training Loss: 0.00850
Epoch: 38000, Training Loss: 0.00659
Epoch: 38001, Training Loss: 0.00744
Epoch: 38001, Training Loss: 0.00696
Epoch: 38001, Training Loss: 0.00850
Epoch: 38001, Training Loss: 0.00659
Epoch: 38002, Training Loss: 0.00744
Epoch: 38002, Training Loss: 0.00696
Epoch: 38002, Training Loss: 0.00850
Epoch: 38002, Training Loss: 0.00659
Epoch: 38003, Training Loss: 0.00744
Epoch: 38003, Training Loss: 0.00696
Epoch: 38003, Training Loss: 0.00850
Epoch: 38003, Training Loss: 0.00659
Epoch: 38004, Training Loss: 0.00744
Epoch: 38004, Training Loss: 0.00696
Epoch: 38004, Training Loss: 0.00850
Epoch: 38004, Training Loss: 0.00659
Epoch: 38005, Training Loss: 0.00744
Epoch: 38005, Training Loss: 0.00696
Epoch: 38005, Training Loss: 0.00850
Epoch: 38005, Training Loss: 0.00659
Epoch: 38006, Training Loss: 0.00744
Epoch: 38006, Training Loss: 0.00696
Epoch: 38006, Training Loss: 0.00850
Epoch: 38006, Training Loss: 0.00659
Epoch: 38007, Training Loss: 0.00744
Epoch: 38007, Training Loss: 0.00696
Epoch: 38007, Training Loss: 0.00850
Epoch: 38007, Training Loss: 0.00659
Epoch: 38008, Training Loss: 0.00744
Epoch: 38008, Training Loss: 0.00696
Epoch: 38008, Training Loss: 0.00850
Epoch: 38008, Training Loss: 0.00659
Epoch: 38009, Training Loss: 0.00744
Epoch: 38009, Training Loss: 0.00696
Epoch: 38009, Training Loss: 0.00850
Epoch: 38009, Training Loss: 0.00659
Epoch: 38010, Training Loss: 0.00744
Epoch: 38010, Training Loss: 0.00696
Epoch: 38010, Training Loss: 0.00850
Epoch: 38010, Training Loss: 0.00659
Epoch: 38011, Training Loss: 0.00744
Epoch: 38011, Training Loss: 0.00696
Epoch: 38011, Training Loss: 0.00850
Epoch: 38011, Training Loss: 0.00659
Epoch: 38012, Training Loss: 0.00744
Epoch: 38012, Training Loss: 0.00696
Epoch: 38012, Training Loss: 0.00850
Epoch: 38012, Training Loss: 0.00659
Epoch: 38013, Training Loss: 0.00744
Epoch: 38013, Training Loss: 0.00696
Epoch: 38013, Training Loss: 0.00850
Epoch: 38013, Training Loss: 0.00659
Epoch: 38014, Training Loss: 0.00744
Epoch: 38014, Training Loss: 0.00696
Epoch: 38014, Training Loss: 0.00850
Epoch: 38014, Training Loss: 0.00659
Epoch: 38015, Training Loss: 0.00744
Epoch: 38015, Training Loss: 0.00696
Epoch: 38015, Training Loss: 0.00850
Epoch: 38015, Training Loss: 0.00659
Epoch: 38016, Training Loss: 0.00744
Epoch: 38016, Training Loss: 0.00696
Epoch: 38016, Training Loss: 0.00850
Epoch: 38016, Training Loss: 0.00659
Epoch: 38017, Training Loss: 0.00744
Epoch: 38017, Training Loss: 0.00696
Epoch: 38017, Training Loss: 0.00850
Epoch: 38017, Training Loss: 0.00659
Epoch: 38018, Training Loss: 0.00744
Epoch: 38018, Training Loss: 0.00696
Epoch: 38018, Training Loss: 0.00850
Epoch: 38018, Training Loss: 0.00659
Epoch: 38019, Training Loss: 0.00744
Epoch: 38019, Training Loss: 0.00696
Epoch: 38019, Training Loss: 0.00850
Epoch: 38019, Training Loss: 0.00659
Epoch: 38020, Training Loss: 0.00744
Epoch: 38020, Training Loss: 0.00696
Epoch: 38020, Training Loss: 0.00850
Epoch: 38020, Training Loss: 0.00659
Epoch: 38021, Training Loss: 0.00744
Epoch: 38021, Training Loss: 0.00696
Epoch: 38021, Training Loss: 0.00850
Epoch: 38021, Training Loss: 0.00659
Epoch: 38022, Training Loss: 0.00744
Epoch: 38022, Training Loss: 0.00696
Epoch: 38022, Training Loss: 0.00850
Epoch: 38022, Training Loss: 0.00659
Epoch: 38023, Training Loss: 0.00744
Epoch: 38023, Training Loss: 0.00696
Epoch: 38023, Training Loss: 0.00850
Epoch: 38023, Training Loss: 0.00659
Epoch: 38024, Training Loss: 0.00744
Epoch: 38024, Training Loss: 0.00696
Epoch: 38024, Training Loss: 0.00850
Epoch: 38024, Training Loss: 0.00659
Epoch: 38025, Training Loss: 0.00744
Epoch: 38025, Training Loss: 0.00696
Epoch: 38025, Training Loss: 0.00850
Epoch: 38025, Training Loss: 0.00659
Epoch: 38026, Training Loss: 0.00744
Epoch: 38026, Training Loss: 0.00696
Epoch: 38026, Training Loss: 0.00850
Epoch: 38026, Training Loss: 0.00659
Epoch: 38027, Training Loss: 0.00744
Epoch: 38027, Training Loss: 0.00696
Epoch: 38027, Training Loss: 0.00850
Epoch: 38027, Training Loss: 0.00659
Epoch: 38028, Training Loss: 0.00744
Epoch: 38028, Training Loss: 0.00696
Epoch: 38028, Training Loss: 0.00850
Epoch: 38028, Training Loss: 0.00659
Epoch: 38029, Training Loss: 0.00744
Epoch: 38029, Training Loss: 0.00696
Epoch: 38029, Training Loss: 0.00850
Epoch: 38029, Training Loss: 0.00659
Epoch: 38030, Training Loss: 0.00744
Epoch: 38030, Training Loss: 0.00696
Epoch: 38030, Training Loss: 0.00850
Epoch: 38030, Training Loss: 0.00659
Epoch: 38031, Training Loss: 0.00744
Epoch: 38031, Training Loss: 0.00696
Epoch: 38031, Training Loss: 0.00850
Epoch: 38031, Training Loss: 0.00659
Epoch: 38032, Training Loss: 0.00744
Epoch: 38032, Training Loss: 0.00696
Epoch: 38032, Training Loss: 0.00850
Epoch: 38032, Training Loss: 0.00659
Epoch: 38033, Training Loss: 0.00744
Epoch: 38033, Training Loss: 0.00696
Epoch: 38033, Training Loss: 0.00850
Epoch: 38033, Training Loss: 0.00659
Epoch: 38034, Training Loss: 0.00744
Epoch: 38034, Training Loss: 0.00696
Epoch: 38034, Training Loss: 0.00850
Epoch: 38034, Training Loss: 0.00659
Epoch: 38035, Training Loss: 0.00744
Epoch: 38035, Training Loss: 0.00696
Epoch: 38035, Training Loss: 0.00850
Epoch: 38035, Training Loss: 0.00659
Epoch: 38036, Training Loss: 0.00744
Epoch: 38036, Training Loss: 0.00696
Epoch: 38036, Training Loss: 0.00850
Epoch: 38036, Training Loss: 0.00659
Epoch: 38037, Training Loss: 0.00744
Epoch: 38037, Training Loss: 0.00696
Epoch: 38037, Training Loss: 0.00850
Epoch: 38037, Training Loss: 0.00659
Epoch: 38038, Training Loss: 0.00744
Epoch: 38038, Training Loss: 0.00696
Epoch: 38038, Training Loss: 0.00850
Epoch: 38038, Training Loss: 0.00659
Epoch: 38039, Training Loss: 0.00744
Epoch: 38039, Training Loss: 0.00696
Epoch: 38039, Training Loss: 0.00850
Epoch: 38039, Training Loss: 0.00659
Epoch: 38040, Training Loss: 0.00744
Epoch: 38040, Training Loss: 0.00696
Epoch: 38040, Training Loss: 0.00850
Epoch: 38040, Training Loss: 0.00659
Epoch: 38041, Training Loss: 0.00744
Epoch: 38041, Training Loss: 0.00696
Epoch: 38041, Training Loss: 0.00850
Epoch: 38041, Training Loss: 0.00659
Epoch: 38042, Training Loss: 0.00744
Epoch: 38042, Training Loss: 0.00696
Epoch: 38042, Training Loss: 0.00850
Epoch: 38042, Training Loss: 0.00659
Epoch: 38043, Training Loss: 0.00744
Epoch: 38043, Training Loss: 0.00696
Epoch: 38043, Training Loss: 0.00850
Epoch: 38043, Training Loss: 0.00659
Epoch: 38044, Training Loss: 0.00744
Epoch: 38044, Training Loss: 0.00696
Epoch: 38044, Training Loss: 0.00850
Epoch: 38044, Training Loss: 0.00659
Epoch: 38045, Training Loss: 0.00744
Epoch: 38045, Training Loss: 0.00696
Epoch: 38045, Training Loss: 0.00850
Epoch: 38045, Training Loss: 0.00659
Epoch: 38046, Training Loss: 0.00744
Epoch: 38046, Training Loss: 0.00696
Epoch: 38046, Training Loss: 0.00850
Epoch: 38046, Training Loss: 0.00659
Epoch: 38047, Training Loss: 0.00744
Epoch: 38047, Training Loss: 0.00696
Epoch: 38047, Training Loss: 0.00850
Epoch: 38047, Training Loss: 0.00659
Epoch: 38048, Training Loss: 0.00744
Epoch: 38048, Training Loss: 0.00696
Epoch: 38048, Training Loss: 0.00850
Epoch: 38048, Training Loss: 0.00659
Epoch: 38049, Training Loss: 0.00744
Epoch: 38049, Training Loss: 0.00696
Epoch: 38049, Training Loss: 0.00850
Epoch: 38049, Training Loss: 0.00659
Epoch: 38050, Training Loss: 0.00744
Epoch: 38050, Training Loss: 0.00696
Epoch: 38050, Training Loss: 0.00850
Epoch: 38050, Training Loss: 0.00659
Epoch: 38051, Training Loss: 0.00744
Epoch: 38051, Training Loss: 0.00696
Epoch: 38051, Training Loss: 0.00850
Epoch: 38051, Training Loss: 0.00659
Epoch: 38052, Training Loss: 0.00744
Epoch: 38052, Training Loss: 0.00696
Epoch: 38052, Training Loss: 0.00850
Epoch: 38052, Training Loss: 0.00659
Epoch: 38053, Training Loss: 0.00744
Epoch: 38053, Training Loss: 0.00696
Epoch: 38053, Training Loss: 0.00850
Epoch: 38053, Training Loss: 0.00659
Epoch: 38054, Training Loss: 0.00744
Epoch: 38054, Training Loss: 0.00696
Epoch: 38054, Training Loss: 0.00850
Epoch: 38054, Training Loss: 0.00659
Epoch: 38055, Training Loss: 0.00744
Epoch: 38055, Training Loss: 0.00696
Epoch: 38055, Training Loss: 0.00850
Epoch: 38055, Training Loss: 0.00659
Epoch: 38056, Training Loss: 0.00744
Epoch: 38056, Training Loss: 0.00696
Epoch: 38056, Training Loss: 0.00850
Epoch: 38056, Training Loss: 0.00659
Epoch: 38057, Training Loss: 0.00744
Epoch: 38057, Training Loss: 0.00696
Epoch: 38057, Training Loss: 0.00850
Epoch: 38057, Training Loss: 0.00659
Epoch: 38058, Training Loss: 0.00744
Epoch: 38058, Training Loss: 0.00696
Epoch: 38058, Training Loss: 0.00850
Epoch: 38058, Training Loss: 0.00659
Epoch: 38059, Training Loss: 0.00744
Epoch: 38059, Training Loss: 0.00696
Epoch: 38059, Training Loss: 0.00850
Epoch: 38059, Training Loss: 0.00659
Epoch: 38060, Training Loss: 0.00744
Epoch: 38060, Training Loss: 0.00696
Epoch: 38060, Training Loss: 0.00850
Epoch: 38060, Training Loss: 0.00659
Epoch: 38061, Training Loss: 0.00744
Epoch: 38061, Training Loss: 0.00696
Epoch: 38061, Training Loss: 0.00850
Epoch: 38061, Training Loss: 0.00659
Epoch: 38062, Training Loss: 0.00744
Epoch: 38062, Training Loss: 0.00696
Epoch: 38062, Training Loss: 0.00850
Epoch: 38062, Training Loss: 0.00659
Epoch: 38063, Training Loss: 0.00744
Epoch: 38063, Training Loss: 0.00696
Epoch: 38063, Training Loss: 0.00850
Epoch: 38063, Training Loss: 0.00659
Epoch: 38064, Training Loss: 0.00744
Epoch: 38064, Training Loss: 0.00696
Epoch: 38064, Training Loss: 0.00850
Epoch: 38064, Training Loss: 0.00659
Epoch: 38065, Training Loss: 0.00744
Epoch: 38065, Training Loss: 0.00696
Epoch: 38065, Training Loss: 0.00850
Epoch: 38065, Training Loss: 0.00659
Epoch: 38066, Training Loss: 0.00744
Epoch: 38066, Training Loss: 0.00696
Epoch: 38066, Training Loss: 0.00850
Epoch: 38066, Training Loss: 0.00659
Epoch: 38067, Training Loss: 0.00744
Epoch: 38067, Training Loss: 0.00696
Epoch: 38067, Training Loss: 0.00849
Epoch: 38067, Training Loss: 0.00659
Epoch: 38068, Training Loss: 0.00744
Epoch: 38068, Training Loss: 0.00696
Epoch: 38068, Training Loss: 0.00849
Epoch: 38068, Training Loss: 0.00659
Epoch: 38069, Training Loss: 0.00744
Epoch: 38069, Training Loss: 0.00696
Epoch: 38069, Training Loss: 0.00849
Epoch: 38069, Training Loss: 0.00659
Epoch: 38070, Training Loss: 0.00744
Epoch: 38070, Training Loss: 0.00696
Epoch: 38070, Training Loss: 0.00849
Epoch: 38070, Training Loss: 0.00659
Epoch: 38071, Training Loss: 0.00744
Epoch: 38071, Training Loss: 0.00696
Epoch: 38071, Training Loss: 0.00849
Epoch: 38071, Training Loss: 0.00659
Epoch: 38072, Training Loss: 0.00743
Epoch: 38072, Training Loss: 0.00696
Epoch: 38072, Training Loss: 0.00849
Epoch: 38072, Training Loss: 0.00659
Epoch: 38073, Training Loss: 0.00743
Epoch: 38073, Training Loss: 0.00696
Epoch: 38073, Training Loss: 0.00849
Epoch: 38073, Training Loss: 0.00659
Epoch: 38074, Training Loss: 0.00743
Epoch: 38074, Training Loss: 0.00696
Epoch: 38074, Training Loss: 0.00849
Epoch: 38074, Training Loss: 0.00659
Epoch: 38075, Training Loss: 0.00743
Epoch: 38075, Training Loss: 0.00696
Epoch: 38075, Training Loss: 0.00849
Epoch: 38075, Training Loss: 0.00659
Epoch: 38076, Training Loss: 0.00743
Epoch: 38076, Training Loss: 0.00696
Epoch: 38076, Training Loss: 0.00849
Epoch: 38076, Training Loss: 0.00659
Epoch: 38077, Training Loss: 0.00743
Epoch: 38077, Training Loss: 0.00696
Epoch: 38077, Training Loss: 0.00849
Epoch: 38077, Training Loss: 0.00659
Epoch: 38078, Training Loss: 0.00743
Epoch: 38078, Training Loss: 0.00696
Epoch: 38078, Training Loss: 0.00849
Epoch: 38078, Training Loss: 0.00659
Epoch: 38079, Training Loss: 0.00743
Epoch: 38079, Training Loss: 0.00696
Epoch: 38079, Training Loss: 0.00849
Epoch: 38079, Training Loss: 0.00659
Epoch: 38080, Training Loss: 0.00743
Epoch: 38080, Training Loss: 0.00696
Epoch: 38080, Training Loss: 0.00849
Epoch: 38080, Training Loss: 0.00659
Epoch: 38081, Training Loss: 0.00743
Epoch: 38081, Training Loss: 0.00696
Epoch: 38081, Training Loss: 0.00849
Epoch: 38081, Training Loss: 0.00659
Epoch: 38082, Training Loss: 0.00743
Epoch: 38082, Training Loss: 0.00695
Epoch: 38082, Training Loss: 0.00849
Epoch: 38082, Training Loss: 0.00658
Epoch: 38083, Training Loss: 0.00743
Epoch: 38083, Training Loss: 0.00695
Epoch: 38083, Training Loss: 0.00849
Epoch: 38083, Training Loss: 0.00658
Epoch: 38084, Training Loss: 0.00743
Epoch: 38084, Training Loss: 0.00695
Epoch: 38084, Training Loss: 0.00849
Epoch: 38084, Training Loss: 0.00658
Epoch: 38085, Training Loss: 0.00743
Epoch: 38085, Training Loss: 0.00695
Epoch: 38085, Training Loss: 0.00849
Epoch: 38085, Training Loss: 0.00658
Epoch: 38086, Training Loss: 0.00743
Epoch: 38086, Training Loss: 0.00695
Epoch: 38086, Training Loss: 0.00849
Epoch: 38086, Training Loss: 0.00658
Epoch: 38087, Training Loss: 0.00743
Epoch: 38087, Training Loss: 0.00695
Epoch: 38087, Training Loss: 0.00849
Epoch: 38087, Training Loss: 0.00658
Epoch: 38088, Training Loss: 0.00743
Epoch: 38088, Training Loss: 0.00695
Epoch: 38088, Training Loss: 0.00849
Epoch: 38088, Training Loss: 0.00658
Epoch: 38089, Training Loss: 0.00743
Epoch: 38089, Training Loss: 0.00695
Epoch: 38089, Training Loss: 0.00849
Epoch: 38089, Training Loss: 0.00658
Epoch: 38090, Training Loss: 0.00743
Epoch: 38090, Training Loss: 0.00695
Epoch: 38090, Training Loss: 0.00849
Epoch: 38090, Training Loss: 0.00658
Epoch: 38091, Training Loss: 0.00743
Epoch: 38091, Training Loss: 0.00695
Epoch: 38091, Training Loss: 0.00849
Epoch: 38091, Training Loss: 0.00658
Epoch: 38092, Training Loss: 0.00743
Epoch: 38092, Training Loss: 0.00695
Epoch: 38092, Training Loss: 0.00849
Epoch: 38092, Training Loss: 0.00658
Epoch: 38093, Training Loss: 0.00743
Epoch: 38093, Training Loss: 0.00695
Epoch: 38093, Training Loss: 0.00849
Epoch: 38093, Training Loss: 0.00658
Epoch: 38094, Training Loss: 0.00743
Epoch: 38094, Training Loss: 0.00695
Epoch: 38094, Training Loss: 0.00849
Epoch: 38094, Training Loss: 0.00658
Epoch: 38095, Training Loss: 0.00743
Epoch: 38095, Training Loss: 0.00695
Epoch: 38095, Training Loss: 0.00849
Epoch: 38095, Training Loss: 0.00658
Epoch: 38096, Training Loss: 0.00743
Epoch: 38096, Training Loss: 0.00695
Epoch: 38096, Training Loss: 0.00849
Epoch: 38096, Training Loss: 0.00658
Epoch: 38097, Training Loss: 0.00743
Epoch: 38097, Training Loss: 0.00695
Epoch: 38097, Training Loss: 0.00849
Epoch: 38097, Training Loss: 0.00658
Epoch: 38098, Training Loss: 0.00743
Epoch: 38098, Training Loss: 0.00695
Epoch: 38098, Training Loss: 0.00849
Epoch: 38098, Training Loss: 0.00658
Epoch: 38099, Training Loss: 0.00743
Epoch: 38099, Training Loss: 0.00695
Epoch: 38099, Training Loss: 0.00849
Epoch: 38099, Training Loss: 0.00658
Epoch: 38100, Training Loss: 0.00743
Epoch: 38100, Training Loss: 0.00695
Epoch: 38100, Training Loss: 0.00849
Epoch: 38100, Training Loss: 0.00658
... [output truncated: one loss is printed per training sample each epoch; from epoch 38101 through 38343 the four per-sample losses decrease only marginally, from (0.00743, 0.00695, 0.00849, 0.00658) to (0.00741, 0.00693, 0.00846, 0.00656)]
Epoch: 38344, Training Loss: 0.00741
Epoch: 38344, Training Loss: 0.00693
Epoch: 38344, Training Loss: 0.00846
Epoch: 38344, Training Loss: 0.00656
Epoch: 38345, Training Loss: 0.00741
Epoch: 38345, Training Loss: 0.00693
Epoch: 38345, Training Loss: 0.00846
Epoch: 38345, Training Loss: 0.00656
Epoch: 38346, Training Loss: 0.00741
Epoch: 38346, Training Loss: 0.00693
Epoch: 38346, Training Loss: 0.00846
Epoch: 38346, Training Loss: 0.00656
Epoch: 38347, Training Loss: 0.00741
Epoch: 38347, Training Loss: 0.00693
Epoch: 38347, Training Loss: 0.00846
Epoch: 38347, Training Loss: 0.00656
Epoch: 38348, Training Loss: 0.00741
Epoch: 38348, Training Loss: 0.00693
Epoch: 38348, Training Loss: 0.00846
Epoch: 38348, Training Loss: 0.00656
Epoch: 38349, Training Loss: 0.00741
Epoch: 38349, Training Loss: 0.00693
Epoch: 38349, Training Loss: 0.00846
Epoch: 38349, Training Loss: 0.00656
Epoch: 38350, Training Loss: 0.00741
Epoch: 38350, Training Loss: 0.00693
Epoch: 38350, Training Loss: 0.00846
Epoch: 38350, Training Loss: 0.00656
Epoch: 38351, Training Loss: 0.00741
Epoch: 38351, Training Loss: 0.00693
Epoch: 38351, Training Loss: 0.00846
Epoch: 38351, Training Loss: 0.00656
Epoch: 38352, Training Loss: 0.00741
Epoch: 38352, Training Loss: 0.00693
Epoch: 38352, Training Loss: 0.00846
Epoch: 38352, Training Loss: 0.00656
Epoch: 38353, Training Loss: 0.00741
Epoch: 38353, Training Loss: 0.00693
Epoch: 38353, Training Loss: 0.00846
Epoch: 38353, Training Loss: 0.00656
Epoch: 38354, Training Loss: 0.00741
Epoch: 38354, Training Loss: 0.00693
Epoch: 38354, Training Loss: 0.00846
Epoch: 38354, Training Loss: 0.00656
Epoch: 38355, Training Loss: 0.00741
Epoch: 38355, Training Loss: 0.00693
Epoch: 38355, Training Loss: 0.00846
Epoch: 38355, Training Loss: 0.00656
Epoch: 38356, Training Loss: 0.00741
Epoch: 38356, Training Loss: 0.00693
Epoch: 38356, Training Loss: 0.00846
Epoch: 38356, Training Loss: 0.00656
Epoch: 38357, Training Loss: 0.00741
Epoch: 38357, Training Loss: 0.00693
Epoch: 38357, Training Loss: 0.00846
Epoch: 38357, Training Loss: 0.00656
Epoch: 38358, Training Loss: 0.00741
Epoch: 38358, Training Loss: 0.00693
Epoch: 38358, Training Loss: 0.00846
Epoch: 38358, Training Loss: 0.00656
Epoch: 38359, Training Loss: 0.00741
Epoch: 38359, Training Loss: 0.00693
Epoch: 38359, Training Loss: 0.00846
Epoch: 38359, Training Loss: 0.00656
Epoch: 38360, Training Loss: 0.00741
Epoch: 38360, Training Loss: 0.00693
Epoch: 38360, Training Loss: 0.00846
Epoch: 38360, Training Loss: 0.00656
Epoch: 38361, Training Loss: 0.00741
Epoch: 38361, Training Loss: 0.00693
Epoch: 38361, Training Loss: 0.00846
Epoch: 38361, Training Loss: 0.00656
Epoch: 38362, Training Loss: 0.00741
Epoch: 38362, Training Loss: 0.00693
Epoch: 38362, Training Loss: 0.00846
Epoch: 38362, Training Loss: 0.00656
Epoch: 38363, Training Loss: 0.00741
Epoch: 38363, Training Loss: 0.00693
Epoch: 38363, Training Loss: 0.00846
Epoch: 38363, Training Loss: 0.00656
Epoch: 38364, Training Loss: 0.00741
Epoch: 38364, Training Loss: 0.00693
Epoch: 38364, Training Loss: 0.00846
Epoch: 38364, Training Loss: 0.00656
Epoch: 38365, Training Loss: 0.00740
Epoch: 38365, Training Loss: 0.00693
Epoch: 38365, Training Loss: 0.00846
Epoch: 38365, Training Loss: 0.00656
Epoch: 38366, Training Loss: 0.00740
Epoch: 38366, Training Loss: 0.00693
Epoch: 38366, Training Loss: 0.00846
Epoch: 38366, Training Loss: 0.00656
Epoch: 38367, Training Loss: 0.00740
Epoch: 38367, Training Loss: 0.00693
Epoch: 38367, Training Loss: 0.00846
Epoch: 38367, Training Loss: 0.00656
Epoch: 38368, Training Loss: 0.00740
Epoch: 38368, Training Loss: 0.00693
Epoch: 38368, Training Loss: 0.00846
Epoch: 38368, Training Loss: 0.00656
Epoch: 38369, Training Loss: 0.00740
Epoch: 38369, Training Loss: 0.00693
Epoch: 38369, Training Loss: 0.00846
Epoch: 38369, Training Loss: 0.00656
Epoch: 38370, Training Loss: 0.00740
Epoch: 38370, Training Loss: 0.00693
Epoch: 38370, Training Loss: 0.00846
Epoch: 38370, Training Loss: 0.00656
Epoch: 38371, Training Loss: 0.00740
Epoch: 38371, Training Loss: 0.00693
Epoch: 38371, Training Loss: 0.00846
Epoch: 38371, Training Loss: 0.00656
Epoch: 38372, Training Loss: 0.00740
Epoch: 38372, Training Loss: 0.00693
Epoch: 38372, Training Loss: 0.00846
Epoch: 38372, Training Loss: 0.00656
Epoch: 38373, Training Loss: 0.00740
Epoch: 38373, Training Loss: 0.00693
Epoch: 38373, Training Loss: 0.00846
Epoch: 38373, Training Loss: 0.00656
Epoch: 38374, Training Loss: 0.00740
Epoch: 38374, Training Loss: 0.00693
Epoch: 38374, Training Loss: 0.00846
Epoch: 38374, Training Loss: 0.00656
Epoch: 38375, Training Loss: 0.00740
Epoch: 38375, Training Loss: 0.00693
Epoch: 38375, Training Loss: 0.00846
Epoch: 38375, Training Loss: 0.00656
Epoch: 38376, Training Loss: 0.00740
Epoch: 38376, Training Loss: 0.00693
Epoch: 38376, Training Loss: 0.00846
Epoch: 38376, Training Loss: 0.00656
Epoch: 38377, Training Loss: 0.00740
Epoch: 38377, Training Loss: 0.00693
Epoch: 38377, Training Loss: 0.00846
Epoch: 38377, Training Loss: 0.00656
Epoch: 38378, Training Loss: 0.00740
Epoch: 38378, Training Loss: 0.00693
Epoch: 38378, Training Loss: 0.00846
Epoch: 38378, Training Loss: 0.00656
Epoch: 38379, Training Loss: 0.00740
Epoch: 38379, Training Loss: 0.00693
Epoch: 38379, Training Loss: 0.00846
Epoch: 38379, Training Loss: 0.00656
Epoch: 38380, Training Loss: 0.00740
Epoch: 38380, Training Loss: 0.00693
Epoch: 38380, Training Loss: 0.00846
Epoch: 38380, Training Loss: 0.00656
Epoch: 38381, Training Loss: 0.00740
Epoch: 38381, Training Loss: 0.00693
Epoch: 38381, Training Loss: 0.00846
Epoch: 38381, Training Loss: 0.00656
Epoch: 38382, Training Loss: 0.00740
Epoch: 38382, Training Loss: 0.00693
Epoch: 38382, Training Loss: 0.00846
Epoch: 38382, Training Loss: 0.00656
Epoch: 38383, Training Loss: 0.00740
Epoch: 38383, Training Loss: 0.00693
Epoch: 38383, Training Loss: 0.00846
Epoch: 38383, Training Loss: 0.00656
Epoch: 38384, Training Loss: 0.00740
Epoch: 38384, Training Loss: 0.00693
Epoch: 38384, Training Loss: 0.00846
Epoch: 38384, Training Loss: 0.00656
Epoch: 38385, Training Loss: 0.00740
Epoch: 38385, Training Loss: 0.00693
Epoch: 38385, Training Loss: 0.00846
Epoch: 38385, Training Loss: 0.00656
Epoch: 38386, Training Loss: 0.00740
Epoch: 38386, Training Loss: 0.00693
Epoch: 38386, Training Loss: 0.00846
Epoch: 38386, Training Loss: 0.00656
Epoch: 38387, Training Loss: 0.00740
Epoch: 38387, Training Loss: 0.00693
Epoch: 38387, Training Loss: 0.00846
Epoch: 38387, Training Loss: 0.00656
Epoch: 38388, Training Loss: 0.00740
Epoch: 38388, Training Loss: 0.00693
Epoch: 38388, Training Loss: 0.00846
Epoch: 38388, Training Loss: 0.00656
Epoch: 38389, Training Loss: 0.00740
Epoch: 38389, Training Loss: 0.00693
Epoch: 38389, Training Loss: 0.00846
Epoch: 38389, Training Loss: 0.00656
Epoch: 38390, Training Loss: 0.00740
Epoch: 38390, Training Loss: 0.00693
Epoch: 38390, Training Loss: 0.00846
Epoch: 38390, Training Loss: 0.00656
Epoch: 38391, Training Loss: 0.00740
Epoch: 38391, Training Loss: 0.00693
Epoch: 38391, Training Loss: 0.00846
Epoch: 38391, Training Loss: 0.00656
Epoch: 38392, Training Loss: 0.00740
Epoch: 38392, Training Loss: 0.00693
Epoch: 38392, Training Loss: 0.00846
Epoch: 38392, Training Loss: 0.00656
Epoch: 38393, Training Loss: 0.00740
Epoch: 38393, Training Loss: 0.00693
Epoch: 38393, Training Loss: 0.00846
Epoch: 38393, Training Loss: 0.00656
Epoch: 38394, Training Loss: 0.00740
Epoch: 38394, Training Loss: 0.00693
Epoch: 38394, Training Loss: 0.00846
Epoch: 38394, Training Loss: 0.00656
Epoch: 38395, Training Loss: 0.00740
Epoch: 38395, Training Loss: 0.00693
Epoch: 38395, Training Loss: 0.00846
Epoch: 38395, Training Loss: 0.00656
Epoch: 38396, Training Loss: 0.00740
Epoch: 38396, Training Loss: 0.00693
Epoch: 38396, Training Loss: 0.00846
Epoch: 38396, Training Loss: 0.00656
Epoch: 38397, Training Loss: 0.00740
Epoch: 38397, Training Loss: 0.00693
Epoch: 38397, Training Loss: 0.00846
Epoch: 38397, Training Loss: 0.00656
Epoch: 38398, Training Loss: 0.00740
Epoch: 38398, Training Loss: 0.00693
Epoch: 38398, Training Loss: 0.00846
Epoch: 38398, Training Loss: 0.00656
Epoch: 38399, Training Loss: 0.00740
Epoch: 38399, Training Loss: 0.00693
Epoch: 38399, Training Loss: 0.00846
Epoch: 38399, Training Loss: 0.00656
Epoch: 38400, Training Loss: 0.00740
Epoch: 38400, Training Loss: 0.00692
Epoch: 38400, Training Loss: 0.00846
Epoch: 38400, Training Loss: 0.00656
Epoch: 38401, Training Loss: 0.00740
Epoch: 38401, Training Loss: 0.00692
Epoch: 38401, Training Loss: 0.00846
Epoch: 38401, Training Loss: 0.00656
Epoch: 38402, Training Loss: 0.00740
Epoch: 38402, Training Loss: 0.00692
Epoch: 38402, Training Loss: 0.00846
Epoch: 38402, Training Loss: 0.00656
Epoch: 38403, Training Loss: 0.00740
Epoch: 38403, Training Loss: 0.00692
Epoch: 38403, Training Loss: 0.00846
Epoch: 38403, Training Loss: 0.00656
Epoch: 38404, Training Loss: 0.00740
Epoch: 38404, Training Loss: 0.00692
Epoch: 38404, Training Loss: 0.00846
Epoch: 38404, Training Loss: 0.00656
Epoch: 38405, Training Loss: 0.00740
Epoch: 38405, Training Loss: 0.00692
Epoch: 38405, Training Loss: 0.00846
Epoch: 38405, Training Loss: 0.00656
Epoch: 38406, Training Loss: 0.00740
Epoch: 38406, Training Loss: 0.00692
Epoch: 38406, Training Loss: 0.00846
Epoch: 38406, Training Loss: 0.00656
Epoch: 38407, Training Loss: 0.00740
Epoch: 38407, Training Loss: 0.00692
Epoch: 38407, Training Loss: 0.00846
Epoch: 38407, Training Loss: 0.00656
Epoch: 38408, Training Loss: 0.00740
Epoch: 38408, Training Loss: 0.00692
Epoch: 38408, Training Loss: 0.00846
Epoch: 38408, Training Loss: 0.00656
Epoch: 38409, Training Loss: 0.00740
Epoch: 38409, Training Loss: 0.00692
Epoch: 38409, Training Loss: 0.00846
Epoch: 38409, Training Loss: 0.00656
Epoch: 38410, Training Loss: 0.00740
Epoch: 38410, Training Loss: 0.00692
Epoch: 38410, Training Loss: 0.00845
Epoch: 38410, Training Loss: 0.00656
Epoch: 38411, Training Loss: 0.00740
Epoch: 38411, Training Loss: 0.00692
Epoch: 38411, Training Loss: 0.00845
Epoch: 38411, Training Loss: 0.00656
Epoch: 38412, Training Loss: 0.00740
Epoch: 38412, Training Loss: 0.00692
Epoch: 38412, Training Loss: 0.00845
Epoch: 38412, Training Loss: 0.00656
Epoch: 38413, Training Loss: 0.00740
Epoch: 38413, Training Loss: 0.00692
Epoch: 38413, Training Loss: 0.00845
Epoch: 38413, Training Loss: 0.00656
Epoch: 38414, Training Loss: 0.00740
Epoch: 38414, Training Loss: 0.00692
Epoch: 38414, Training Loss: 0.00845
Epoch: 38414, Training Loss: 0.00656
Epoch: 38415, Training Loss: 0.00740
Epoch: 38415, Training Loss: 0.00692
Epoch: 38415, Training Loss: 0.00845
Epoch: 38415, Training Loss: 0.00656
Epoch: 38416, Training Loss: 0.00740
Epoch: 38416, Training Loss: 0.00692
Epoch: 38416, Training Loss: 0.00845
Epoch: 38416, Training Loss: 0.00656
Epoch: 38417, Training Loss: 0.00740
Epoch: 38417, Training Loss: 0.00692
Epoch: 38417, Training Loss: 0.00845
Epoch: 38417, Training Loss: 0.00656
Epoch: 38418, Training Loss: 0.00740
Epoch: 38418, Training Loss: 0.00692
Epoch: 38418, Training Loss: 0.00845
Epoch: 38418, Training Loss: 0.00655
Epoch: 38419, Training Loss: 0.00740
Epoch: 38419, Training Loss: 0.00692
Epoch: 38419, Training Loss: 0.00845
Epoch: 38419, Training Loss: 0.00655
Epoch: 38420, Training Loss: 0.00740
Epoch: 38420, Training Loss: 0.00692
Epoch: 38420, Training Loss: 0.00845
Epoch: 38420, Training Loss: 0.00655
Epoch: 38421, Training Loss: 0.00740
Epoch: 38421, Training Loss: 0.00692
Epoch: 38421, Training Loss: 0.00845
Epoch: 38421, Training Loss: 0.00655
Epoch: 38422, Training Loss: 0.00740
Epoch: 38422, Training Loss: 0.00692
Epoch: 38422, Training Loss: 0.00845
Epoch: 38422, Training Loss: 0.00655
Epoch: 38423, Training Loss: 0.00740
Epoch: 38423, Training Loss: 0.00692
Epoch: 38423, Training Loss: 0.00845
Epoch: 38423, Training Loss: 0.00655
Epoch: 38424, Training Loss: 0.00740
Epoch: 38424, Training Loss: 0.00692
Epoch: 38424, Training Loss: 0.00845
Epoch: 38424, Training Loss: 0.00655
Epoch: 38425, Training Loss: 0.00740
Epoch: 38425, Training Loss: 0.00692
Epoch: 38425, Training Loss: 0.00845
Epoch: 38425, Training Loss: 0.00655
Epoch: 38426, Training Loss: 0.00740
Epoch: 38426, Training Loss: 0.00692
Epoch: 38426, Training Loss: 0.00845
Epoch: 38426, Training Loss: 0.00655
Epoch: 38427, Training Loss: 0.00740
Epoch: 38427, Training Loss: 0.00692
Epoch: 38427, Training Loss: 0.00845
Epoch: 38427, Training Loss: 0.00655
Epoch: 38428, Training Loss: 0.00740
Epoch: 38428, Training Loss: 0.00692
Epoch: 38428, Training Loss: 0.00845
Epoch: 38428, Training Loss: 0.00655
Epoch: 38429, Training Loss: 0.00740
Epoch: 38429, Training Loss: 0.00692
Epoch: 38429, Training Loss: 0.00845
Epoch: 38429, Training Loss: 0.00655
Epoch: 38430, Training Loss: 0.00740
Epoch: 38430, Training Loss: 0.00692
Epoch: 38430, Training Loss: 0.00845
Epoch: 38430, Training Loss: 0.00655
Epoch: 38431, Training Loss: 0.00740
Epoch: 38431, Training Loss: 0.00692
Epoch: 38431, Training Loss: 0.00845
Epoch: 38431, Training Loss: 0.00655
Epoch: 38432, Training Loss: 0.00740
Epoch: 38432, Training Loss: 0.00692
Epoch: 38432, Training Loss: 0.00845
Epoch: 38432, Training Loss: 0.00655
Epoch: 38433, Training Loss: 0.00740
Epoch: 38433, Training Loss: 0.00692
Epoch: 38433, Training Loss: 0.00845
Epoch: 38433, Training Loss: 0.00655
Epoch: 38434, Training Loss: 0.00740
Epoch: 38434, Training Loss: 0.00692
Epoch: 38434, Training Loss: 0.00845
Epoch: 38434, Training Loss: 0.00655
Epoch: 38435, Training Loss: 0.00740
Epoch: 38435, Training Loss: 0.00692
Epoch: 38435, Training Loss: 0.00845
Epoch: 38435, Training Loss: 0.00655
Epoch: 38436, Training Loss: 0.00740
Epoch: 38436, Training Loss: 0.00692
Epoch: 38436, Training Loss: 0.00845
Epoch: 38436, Training Loss: 0.00655
Epoch: 38437, Training Loss: 0.00740
Epoch: 38437, Training Loss: 0.00692
Epoch: 38437, Training Loss: 0.00845
Epoch: 38437, Training Loss: 0.00655
Epoch: 38438, Training Loss: 0.00740
Epoch: 38438, Training Loss: 0.00692
Epoch: 38438, Training Loss: 0.00845
Epoch: 38438, Training Loss: 0.00655
Epoch: 38439, Training Loss: 0.00740
Epoch: 38439, Training Loss: 0.00692
Epoch: 38439, Training Loss: 0.00845
Epoch: 38439, Training Loss: 0.00655
Epoch: 38440, Training Loss: 0.00740
Epoch: 38440, Training Loss: 0.00692
Epoch: 38440, Training Loss: 0.00845
Epoch: 38440, Training Loss: 0.00655
Epoch: 38441, Training Loss: 0.00740
Epoch: 38441, Training Loss: 0.00692
Epoch: 38441, Training Loss: 0.00845
Epoch: 38441, Training Loss: 0.00655
Epoch: 38442, Training Loss: 0.00740
Epoch: 38442, Training Loss: 0.00692
Epoch: 38442, Training Loss: 0.00845
Epoch: 38442, Training Loss: 0.00655
Epoch: 38443, Training Loss: 0.00740
Epoch: 38443, Training Loss: 0.00692
Epoch: 38443, Training Loss: 0.00845
Epoch: 38443, Training Loss: 0.00655
Epoch: 38444, Training Loss: 0.00740
Epoch: 38444, Training Loss: 0.00692
Epoch: 38444, Training Loss: 0.00845
Epoch: 38444, Training Loss: 0.00655
Epoch: 38445, Training Loss: 0.00740
Epoch: 38445, Training Loss: 0.00692
Epoch: 38445, Training Loss: 0.00845
Epoch: 38445, Training Loss: 0.00655
Epoch: 38446, Training Loss: 0.00740
Epoch: 38446, Training Loss: 0.00692
Epoch: 38446, Training Loss: 0.00845
Epoch: 38446, Training Loss: 0.00655
Epoch: 38447, Training Loss: 0.00740
Epoch: 38447, Training Loss: 0.00692
Epoch: 38447, Training Loss: 0.00845
Epoch: 38447, Training Loss: 0.00655
Epoch: 38448, Training Loss: 0.00740
Epoch: 38448, Training Loss: 0.00692
Epoch: 38448, Training Loss: 0.00845
Epoch: 38448, Training Loss: 0.00655
Epoch: 38449, Training Loss: 0.00740
Epoch: 38449, Training Loss: 0.00692
Epoch: 38449, Training Loss: 0.00845
Epoch: 38449, Training Loss: 0.00655
Epoch: 38450, Training Loss: 0.00740
Epoch: 38450, Training Loss: 0.00692
Epoch: 38450, Training Loss: 0.00845
Epoch: 38450, Training Loss: 0.00655
Epoch: 38451, Training Loss: 0.00740
Epoch: 38451, Training Loss: 0.00692
Epoch: 38451, Training Loss: 0.00845
Epoch: 38451, Training Loss: 0.00655
Epoch: 38452, Training Loss: 0.00740
Epoch: 38452, Training Loss: 0.00692
Epoch: 38452, Training Loss: 0.00845
Epoch: 38452, Training Loss: 0.00655
Epoch: 38453, Training Loss: 0.00740
Epoch: 38453, Training Loss: 0.00692
Epoch: 38453, Training Loss: 0.00845
Epoch: 38453, Training Loss: 0.00655
Epoch: 38454, Training Loss: 0.00740
Epoch: 38454, Training Loss: 0.00692
Epoch: 38454, Training Loss: 0.00845
Epoch: 38454, Training Loss: 0.00655
Epoch: 38455, Training Loss: 0.00740
Epoch: 38455, Training Loss: 0.00692
Epoch: 38455, Training Loss: 0.00845
Epoch: 38455, Training Loss: 0.00655
Epoch: 38456, Training Loss: 0.00740
Epoch: 38456, Training Loss: 0.00692
Epoch: 38456, Training Loss: 0.00845
Epoch: 38456, Training Loss: 0.00655
Epoch: 38457, Training Loss: 0.00740
Epoch: 38457, Training Loss: 0.00692
Epoch: 38457, Training Loss: 0.00845
Epoch: 38457, Training Loss: 0.00655
Epoch: 38458, Training Loss: 0.00740
Epoch: 38458, Training Loss: 0.00692
Epoch: 38458, Training Loss: 0.00845
Epoch: 38458, Training Loss: 0.00655
Epoch: 38459, Training Loss: 0.00740
Epoch: 38459, Training Loss: 0.00692
Epoch: 38459, Training Loss: 0.00845
Epoch: 38459, Training Loss: 0.00655
Epoch: 38460, Training Loss: 0.00740
Epoch: 38460, Training Loss: 0.00692
Epoch: 38460, Training Loss: 0.00845
Epoch: 38460, Training Loss: 0.00655
Epoch: 38461, Training Loss: 0.00740
Epoch: 38461, Training Loss: 0.00692
Epoch: 38461, Training Loss: 0.00845
Epoch: 38461, Training Loss: 0.00655
Epoch: 38462, Training Loss: 0.00740
Epoch: 38462, Training Loss: 0.00692
Epoch: 38462, Training Loss: 0.00845
Epoch: 38462, Training Loss: 0.00655
Epoch: 38463, Training Loss: 0.00740
Epoch: 38463, Training Loss: 0.00692
Epoch: 38463, Training Loss: 0.00845
Epoch: 38463, Training Loss: 0.00655
Epoch: 38464, Training Loss: 0.00739
Epoch: 38464, Training Loss: 0.00692
Epoch: 38464, Training Loss: 0.00845
Epoch: 38464, Training Loss: 0.00655
Epoch: 38465, Training Loss: 0.00739
Epoch: 38465, Training Loss: 0.00692
Epoch: 38465, Training Loss: 0.00845
Epoch: 38465, Training Loss: 0.00655
Epoch: 38466, Training Loss: 0.00739
Epoch: 38466, Training Loss: 0.00692
Epoch: 38466, Training Loss: 0.00845
Epoch: 38466, Training Loss: 0.00655
Epoch: 38467, Training Loss: 0.00739
Epoch: 38467, Training Loss: 0.00692
Epoch: 38467, Training Loss: 0.00845
Epoch: 38467, Training Loss: 0.00655
Epoch: 38468, Training Loss: 0.00739
Epoch: 38468, Training Loss: 0.00692
Epoch: 38468, Training Loss: 0.00845
Epoch: 38468, Training Loss: 0.00655
Epoch: 38469, Training Loss: 0.00739
Epoch: 38469, Training Loss: 0.00692
Epoch: 38469, Training Loss: 0.00845
Epoch: 38469, Training Loss: 0.00655
Epoch: 38470, Training Loss: 0.00739
Epoch: 38470, Training Loss: 0.00692
Epoch: 38470, Training Loss: 0.00845
Epoch: 38470, Training Loss: 0.00655
Epoch: 38471, Training Loss: 0.00739
Epoch: 38471, Training Loss: 0.00692
Epoch: 38471, Training Loss: 0.00845
Epoch: 38471, Training Loss: 0.00655
Epoch: 38472, Training Loss: 0.00739
Epoch: 38472, Training Loss: 0.00692
Epoch: 38472, Training Loss: 0.00845
Epoch: 38472, Training Loss: 0.00655
Epoch: 38473, Training Loss: 0.00739
Epoch: 38473, Training Loss: 0.00692
Epoch: 38473, Training Loss: 0.00845
Epoch: 38473, Training Loss: 0.00655
Epoch: 38474, Training Loss: 0.00739
Epoch: 38474, Training Loss: 0.00692
Epoch: 38474, Training Loss: 0.00845
Epoch: 38474, Training Loss: 0.00655
Epoch: 38475, Training Loss: 0.00739
Epoch: 38475, Training Loss: 0.00692
Epoch: 38475, Training Loss: 0.00845
Epoch: 38475, Training Loss: 0.00655
Epoch: 38476, Training Loss: 0.00739
Epoch: 38476, Training Loss: 0.00692
Epoch: 38476, Training Loss: 0.00845
Epoch: 38476, Training Loss: 0.00655
Epoch: 38477, Training Loss: 0.00739
Epoch: 38477, Training Loss: 0.00692
Epoch: 38477, Training Loss: 0.00845
Epoch: 38477, Training Loss: 0.00655
Epoch: 38478, Training Loss: 0.00739
Epoch: 38478, Training Loss: 0.00692
Epoch: 38478, Training Loss: 0.00845
Epoch: 38478, Training Loss: 0.00655
Epoch: 38479, Training Loss: 0.00739
Epoch: 38479, Training Loss: 0.00692
Epoch: 38479, Training Loss: 0.00845
Epoch: 38479, Training Loss: 0.00655
Epoch: 38480, Training Loss: 0.00739
Epoch: 38480, Training Loss: 0.00692
Epoch: 38480, Training Loss: 0.00845
Epoch: 38480, Training Loss: 0.00655
Epoch: 38481, Training Loss: 0.00739
Epoch: 38481, Training Loss: 0.00692
Epoch: 38481, Training Loss: 0.00845
Epoch: 38481, Training Loss: 0.00655
Epoch: 38482, Training Loss: 0.00739
Epoch: 38482, Training Loss: 0.00692
Epoch: 38482, Training Loss: 0.00845
Epoch: 38482, Training Loss: 0.00655
Epoch: 38483, Training Loss: 0.00739
Epoch: 38483, Training Loss: 0.00692
Epoch: 38483, Training Loss: 0.00845
Epoch: 38483, Training Loss: 0.00655
Epoch: 38484, Training Loss: 0.00739
Epoch: 38484, Training Loss: 0.00692
Epoch: 38484, Training Loss: 0.00845
Epoch: 38484, Training Loss: 0.00655
Epoch: 38485, Training Loss: 0.00739
Epoch: 38485, Training Loss: 0.00692
Epoch: 38485, Training Loss: 0.00845
Epoch: 38485, Training Loss: 0.00655
Epoch: 38486, Training Loss: 0.00739
Epoch: 38486, Training Loss: 0.00692
Epoch: 38486, Training Loss: 0.00845
Epoch: 38486, Training Loss: 0.00655
Epoch: 38487, Training Loss: 0.00739
Epoch: 38487, Training Loss: 0.00692
Epoch: 38487, Training Loss: 0.00845
Epoch: 38487, Training Loss: 0.00655
Epoch: 38488, Training Loss: 0.00739
Epoch: 38488, Training Loss: 0.00692
Epoch: 38488, Training Loss: 0.00845
Epoch: 38488, Training Loss: 0.00655
Epoch: 38489, Training Loss: 0.00739
Epoch: 38489, Training Loss: 0.00692
Epoch: 38489, Training Loss: 0.00845
Epoch: 38489, Training Loss: 0.00655
Epoch: 38490, Training Loss: 0.00739
Epoch: 38490, Training Loss: 0.00692
Epoch: 38490, Training Loss: 0.00845
Epoch: 38490, Training Loss: 0.00655
Epoch: 38491, Training Loss: 0.00739
Epoch: 38491, Training Loss: 0.00692
Epoch: 38491, Training Loss: 0.00845
Epoch: 38491, Training Loss: 0.00655
Epoch: 38492, Training Loss: 0.00739
Epoch: 38492, Training Loss: 0.00692
Epoch: 38492, Training Loss: 0.00845
Epoch: 38492, Training Loss: 0.00655
Epoch: 38493, Training Loss: 0.00739
Epoch: 38493, Training Loss: 0.00692
Epoch: 38493, Training Loss: 0.00845
Epoch: 38493, Training Loss: 0.00655
Epoch: 38494, Training Loss: 0.00739
Epoch: 38494, Training Loss: 0.00692
Epoch: 38494, Training Loss: 0.00845
Epoch: 38494, Training Loss: 0.00655
Epoch: 38495, Training Loss: 0.00739
Epoch: 38495, Training Loss: 0.00692
Epoch: 38495, Training Loss: 0.00845
Epoch: 38495, Training Loss: 0.00655
Epoch: 38496, Training Loss: 0.00739
Epoch: 38496, Training Loss: 0.00692
Epoch: 38496, Training Loss: 0.00845
Epoch: 38496, Training Loss: 0.00655
Epoch: 38497, Training Loss: 0.00739
Epoch: 38497, Training Loss: 0.00692
Epoch: 38497, Training Loss: 0.00844
Epoch: 38497, Training Loss: 0.00655
Epoch: 38498, Training Loss: 0.00739
Epoch: 38498, Training Loss: 0.00692
Epoch: 38498, Training Loss: 0.00844
Epoch: 38498, Training Loss: 0.00655
Epoch: 38499, Training Loss: 0.00739
Epoch: 38499, Training Loss: 0.00692
Epoch: 38499, Training Loss: 0.00844
Epoch: 38499, Training Loss: 0.00655
Epoch: 38500, Training Loss: 0.00739
Epoch: 38500, Training Loss: 0.00692
Epoch: 38500, Training Loss: 0.00844
Epoch: 38500, Training Loss: 0.00655
Epoch: 38501, Training Loss: 0.00739
Epoch: 38501, Training Loss: 0.00692
Epoch: 38501, Training Loss: 0.00844
Epoch: 38501, Training Loss: 0.00655
Epoch: 38502, Training Loss: 0.00739
Epoch: 38502, Training Loss: 0.00692
Epoch: 38502, Training Loss: 0.00844
Epoch: 38502, Training Loss: 0.00655
Epoch: 38503, Training Loss: 0.00739
Epoch: 38503, Training Loss: 0.00692
Epoch: 38503, Training Loss: 0.00844
Epoch: 38503, Training Loss: 0.00655
Epoch: 38504, Training Loss: 0.00739
Epoch: 38504, Training Loss: 0.00692
Epoch: 38504, Training Loss: 0.00844
Epoch: 38504, Training Loss: 0.00655
Epoch: 38505, Training Loss: 0.00739
Epoch: 38505, Training Loss: 0.00692
Epoch: 38505, Training Loss: 0.00844
Epoch: 38505, Training Loss: 0.00655
Epoch: 38506, Training Loss: 0.00739
Epoch: 38506, Training Loss: 0.00692
Epoch: 38506, Training Loss: 0.00844
Epoch: 38506, Training Loss: 0.00655
Epoch: 38507, Training Loss: 0.00739
Epoch: 38507, Training Loss: 0.00691
Epoch: 38507, Training Loss: 0.00844
Epoch: 38507, Training Loss: 0.00655
Epoch: 38508, Training Loss: 0.00739
Epoch: 38508, Training Loss: 0.00691
Epoch: 38508, Training Loss: 0.00844
Epoch: 38508, Training Loss: 0.00655
Epoch: 38509, Training Loss: 0.00739
Epoch: 38509, Training Loss: 0.00691
Epoch: 38509, Training Loss: 0.00844
Epoch: 38509, Training Loss: 0.00655
Epoch: 38510, Training Loss: 0.00739
Epoch: 38510, Training Loss: 0.00691
Epoch: 38510, Training Loss: 0.00844
Epoch: 38510, Training Loss: 0.00655
Epoch: 38511, Training Loss: 0.00739
Epoch: 38511, Training Loss: 0.00691
Epoch: 38511, Training Loss: 0.00844
Epoch: 38511, Training Loss: 0.00655
Epoch: 38512, Training Loss: 0.00739
Epoch: 38512, Training Loss: 0.00691
Epoch: 38512, Training Loss: 0.00844
Epoch: 38512, Training Loss: 0.00655
Epoch: 38513, Training Loss: 0.00739
Epoch: 38513, Training Loss: 0.00691
Epoch: 38513, Training Loss: 0.00844
Epoch: 38513, Training Loss: 0.00655
Epoch: 38514, Training Loss: 0.00739
Epoch: 38514, Training Loss: 0.00691
Epoch: 38514, Training Loss: 0.00844
Epoch: 38514, Training Loss: 0.00655
Epoch: 38515, Training Loss: 0.00739
Epoch: 38515, Training Loss: 0.00691
Epoch: 38515, Training Loss: 0.00844
Epoch: 38515, Training Loss: 0.00655
Epoch: 38516, Training Loss: 0.00739
Epoch: 38516, Training Loss: 0.00691
Epoch: 38516, Training Loss: 0.00844
Epoch: 38516, Training Loss: 0.00655
Epoch: 38517, Training Loss: 0.00739
Epoch: 38517, Training Loss: 0.00691
Epoch: 38517, Training Loss: 0.00844
Epoch: 38517, Training Loss: 0.00655
Epoch: 38518, Training Loss: 0.00739
Epoch: 38518, Training Loss: 0.00691
Epoch: 38518, Training Loss: 0.00844
Epoch: 38518, Training Loss: 0.00655
Epoch: 38519, Training Loss: 0.00739
Epoch: 38519, Training Loss: 0.00691
Epoch: 38519, Training Loss: 0.00844
Epoch: 38519, Training Loss: 0.00655
Epoch: 38520, Training Loss: 0.00739
Epoch: 38520, Training Loss: 0.00691
Epoch: 38520, Training Loss: 0.00844
Epoch: 38520, Training Loss: 0.00655
Epoch: 38521, Training Loss: 0.00739
Epoch: 38521, Training Loss: 0.00691
Epoch: 38521, Training Loss: 0.00844
Epoch: 38521, Training Loss: 0.00655
Epoch: 38522, Training Loss: 0.00739
Epoch: 38522, Training Loss: 0.00691
Epoch: 38522, Training Loss: 0.00844
Epoch: 38522, Training Loss: 0.00655
Epoch: 38523, Training Loss: 0.00739
Epoch: 38523, Training Loss: 0.00691
Epoch: 38523, Training Loss: 0.00844
Epoch: 38523, Training Loss: 0.00655
Epoch: 38524, Training Loss: 0.00739
Epoch: 38524, Training Loss: 0.00691
Epoch: 38524, Training Loss: 0.00844
Epoch: 38524, Training Loss: 0.00655
Epoch: 38525, Training Loss: 0.00739
Epoch: 38525, Training Loss: 0.00691
Epoch: 38525, Training Loss: 0.00844
Epoch: 38525, Training Loss: 0.00655
Epoch: 38526, Training Loss: 0.00739
Epoch: 38526, Training Loss: 0.00691
Epoch: 38526, Training Loss: 0.00844
Epoch: 38526, Training Loss: 0.00655
Epoch: 38527, Training Loss: 0.00739
Epoch: 38527, Training Loss: 0.00691
Epoch: 38527, Training Loss: 0.00844
Epoch: 38527, Training Loss: 0.00655
Epoch: 38528, Training Loss: 0.00739
Epoch: 38528, Training Loss: 0.00691
Epoch: 38528, Training Loss: 0.00844
Epoch: 38528, Training Loss: 0.00655
Epoch: 38529, Training Loss: 0.00739
Epoch: 38529, Training Loss: 0.00691
Epoch: 38529, Training Loss: 0.00844
Epoch: 38529, Training Loss: 0.00655
Epoch: 38530, Training Loss: 0.00739
Epoch: 38530, Training Loss: 0.00691
Epoch: 38530, Training Loss: 0.00844
Epoch: 38530, Training Loss: 0.00654
Epoch: 38531, Training Loss: 0.00739
Epoch: 38531, Training Loss: 0.00691
Epoch: 38531, Training Loss: 0.00844
Epoch: 38531, Training Loss: 0.00654
Epoch: 38532, Training Loss: 0.00739
Epoch: 38532, Training Loss: 0.00691
Epoch: 38532, Training Loss: 0.00844
Epoch: 38532, Training Loss: 0.00654
Epoch: 38533, Training Loss: 0.00739
Epoch: 38533, Training Loss: 0.00691
Epoch: 38533, Training Loss: 0.00844
Epoch: 38533, Training Loss: 0.00654
Epoch: 38534, Training Loss: 0.00739
Epoch: 38534, Training Loss: 0.00691
Epoch: 38534, Training Loss: 0.00844
Epoch: 38534, Training Loss: 0.00654
Epoch: 38535, Training Loss: 0.00739
Epoch: 38535, Training Loss: 0.00691
Epoch: 38535, Training Loss: 0.00844
Epoch: 38535, Training Loss: 0.00654
Epoch: 38536, Training Loss: 0.00739
Epoch: 38536, Training Loss: 0.00691
Epoch: 38536, Training Loss: 0.00844
Epoch: 38536, Training Loss: 0.00654
Epoch: 38537, Training Loss: 0.00739
Epoch: 38537, Training Loss: 0.00691
Epoch: 38537, Training Loss: 0.00844
Epoch: 38537, Training Loss: 0.00654
Epoch: 38538, Training Loss: 0.00739
Epoch: 38538, Training Loss: 0.00691
Epoch: 38538, Training Loss: 0.00844
Epoch: 38538, Training Loss: 0.00654
Epoch: 38539, Training Loss: 0.00739
Epoch: 38539, Training Loss: 0.00691
Epoch: 38539, Training Loss: 0.00844
Epoch: 38539, Training Loss: 0.00654
Epoch: 38540, Training Loss: 0.00739
Epoch: 38540, Training Loss: 0.00691
Epoch: 38540, Training Loss: 0.00844
Epoch: 38540, Training Loss: 0.00654
Epoch: 38541, Training Loss: 0.00739
Epoch: 38541, Training Loss: 0.00691
Epoch: 38541, Training Loss: 0.00844
Epoch: 38541, Training Loss: 0.00654
Epoch: 38542, Training Loss: 0.00739
Epoch: 38542, Training Loss: 0.00691
Epoch: 38542, Training Loss: 0.00844
Epoch: 38542, Training Loss: 0.00654
Epoch: 38543, Training Loss: 0.00739
Epoch: 38543, Training Loss: 0.00691
Epoch: 38543, Training Loss: 0.00844
Epoch: 38543, Training Loss: 0.00654
Epoch: 38544, Training Loss: 0.00739
Epoch: 38544, Training Loss: 0.00691
Epoch: 38544, Training Loss: 0.00844
Epoch: 38544, Training Loss: 0.00654
Epoch: 38545, Training Loss: 0.00739
Epoch: 38545, Training Loss: 0.00691
Epoch: 38545, Training Loss: 0.00844
Epoch: 38545, Training Loss: 0.00654
Epoch: 38546, Training Loss: 0.00739
Epoch: 38546, Training Loss: 0.00691
Epoch: 38546, Training Loss: 0.00844
Epoch: 38546, Training Loss: 0.00654
Epoch: 38547, Training Loss: 0.00739
Epoch: 38547, Training Loss: 0.00691
Epoch: 38547, Training Loss: 0.00844
Epoch: 38547, Training Loss: 0.00654
Epoch: 38548, Training Loss: 0.00739
Epoch: 38548, Training Loss: 0.00691
Epoch: 38548, Training Loss: 0.00844
Epoch: 38548, Training Loss: 0.00654
Epoch: 38549, Training Loss: 0.00739
Epoch: 38549, Training Loss: 0.00691
Epoch: 38549, Training Loss: 0.00844
Epoch: 38549, Training Loss: 0.00654
Epoch: 38550, Training Loss: 0.00739
Epoch: 38550, Training Loss: 0.00691
Epoch: 38550, Training Loss: 0.00844
Epoch: 38550, Training Loss: 0.00654
Epoch: 38551, Training Loss: 0.00739
Epoch: 38551, Training Loss: 0.00691
Epoch: 38551, Training Loss: 0.00844
Epoch: 38551, Training Loss: 0.00654
Epoch: 38552, Training Loss: 0.00739
Epoch: 38552, Training Loss: 0.00691
Epoch: 38552, Training Loss: 0.00844
Epoch: 38552, Training Loss: 0.00654
Epoch: 38553, Training Loss: 0.00739
Epoch: 38553, Training Loss: 0.00691
Epoch: 38553, Training Loss: 0.00844
Epoch: 38553, Training Loss: 0.00654
Epoch: 38554, Training Loss: 0.00739
Epoch: 38554, Training Loss: 0.00691
Epoch: 38554, Training Loss: 0.00844
Epoch: 38554, Training Loss: 0.00654
Epoch: 38555, Training Loss: 0.00739
Epoch: 38555, Training Loss: 0.00691
Epoch: 38555, Training Loss: 0.00844
Epoch: 38555, Training Loss: 0.00654
Epoch: 38556, Training Loss: 0.00739
Epoch: 38556, Training Loss: 0.00691
Epoch: 38556, Training Loss: 0.00844
Epoch: 38556, Training Loss: 0.00654
Epoch: 38557, Training Loss: 0.00739
Epoch: 38557, Training Loss: 0.00691
Epoch: 38557, Training Loss: 0.00844
Epoch: 38557, Training Loss: 0.00654
Epoch: 38558, Training Loss: 0.00739
Epoch: 38558, Training Loss: 0.00691
Epoch: 38558, Training Loss: 0.00844
Epoch: 38558, Training Loss: 0.00654
Epoch: 38559, Training Loss: 0.00739
Epoch: 38559, Training Loss: 0.00691
Epoch: 38559, Training Loss: 0.00844
Epoch: 38559, Training Loss: 0.00654
Epoch: 38560, Training Loss: 0.00739
Epoch: 38560, Training Loss: 0.00691
Epoch: 38560, Training Loss: 0.00844
Epoch: 38560, Training Loss: 0.00654
Epoch: 38561, Training Loss: 0.00739
Epoch: 38561, Training Loss: 0.00691
Epoch: 38561, Training Loss: 0.00844
Epoch: 38561, Training Loss: 0.00654
Epoch: 38562, Training Loss: 0.00738
Epoch: 38562, Training Loss: 0.00691
Epoch: 38562, Training Loss: 0.00844
Epoch: 38562, Training Loss: 0.00654
Epoch: 38563, Training Loss: 0.00738
Epoch: 38563, Training Loss: 0.00691
Epoch: 38563, Training Loss: 0.00844
Epoch: 38563, Training Loss: 0.00654
Epoch: 38564, Training Loss: 0.00738
Epoch: 38564, Training Loss: 0.00691
Epoch: 38564, Training Loss: 0.00844
Epoch: 38564, Training Loss: 0.00654
Epoch: 38565, Training Loss: 0.00738
Epoch: 38565, Training Loss: 0.00691
Epoch: 38565, Training Loss: 0.00844
Epoch: 38565, Training Loss: 0.00654
Epoch: 38566, Training Loss: 0.00738
Epoch: 38566, Training Loss: 0.00691
Epoch: 38566, Training Loss: 0.00844
Epoch: 38566, Training Loss: 0.00654
Epoch: 38567, Training Loss: 0.00738
Epoch: 38567, Training Loss: 0.00691
Epoch: 38567, Training Loss: 0.00844
Epoch: 38567, Training Loss: 0.00654
Epoch: 38568, Training Loss: 0.00738
Epoch: 38568, Training Loss: 0.00691
Epoch: 38568, Training Loss: 0.00844
Epoch: 38568, Training Loss: 0.00654
Epoch: 38569, Training Loss: 0.00738
Epoch: 38569, Training Loss: 0.00691
Epoch: 38569, Training Loss: 0.00844
Epoch: 38569, Training Loss: 0.00654
Epoch: 38570, Training Loss: 0.00738
Epoch: 38570, Training Loss: 0.00691
Epoch: 38570, Training Loss: 0.00844
Epoch: 38570, Training Loss: 0.00654
Epoch: 38571, Training Loss: 0.00738
Epoch: 38571, Training Loss: 0.00691
Epoch: 38571, Training Loss: 0.00844
Epoch: 38571, Training Loss: 0.00654
Epoch: 38572, Training Loss: 0.00738
Epoch: 38572, Training Loss: 0.00691
Epoch: 38572, Training Loss: 0.00844
Epoch: 38572, Training Loss: 0.00654
Epoch: 38573, Training Loss: 0.00738
Epoch: 38573, Training Loss: 0.00691
Epoch: 38573, Training Loss: 0.00844
Epoch: 38573, Training Loss: 0.00654
Epoch: 38574, Training Loss: 0.00738
Epoch: 38574, Training Loss: 0.00691
Epoch: 38574, Training Loss: 0.00844
Epoch: 38574, Training Loss: 0.00654
Epoch: 38575, Training Loss: 0.00738
Epoch: 38575, Training Loss: 0.00691
Epoch: 38575, Training Loss: 0.00844
Epoch: 38575, Training Loss: 0.00654
Epoch: 38576, Training Loss: 0.00738
Epoch: 38576, Training Loss: 0.00691
Epoch: 38576, Training Loss: 0.00844
Epoch: 38576, Training Loss: 0.00654
Epoch: 38577, Training Loss: 0.00738
Epoch: 38577, Training Loss: 0.00691
Epoch: 38577, Training Loss: 0.00844
Epoch: 38577, Training Loss: 0.00654
Epoch: 38578, Training Loss: 0.00738
Epoch: 38578, Training Loss: 0.00691
Epoch: 38578, Training Loss: 0.00844
Epoch: 38578, Training Loss: 0.00654
Epoch: 38579, Training Loss: 0.00738
Epoch: 38579, Training Loss: 0.00691
Epoch: 38579, Training Loss: 0.00844
Epoch: 38579, Training Loss: 0.00654
Epoch: 38580, Training Loss: 0.00738
Epoch: 38580, Training Loss: 0.00691
Epoch: 38580, Training Loss: 0.00844
Epoch: 38580, Training Loss: 0.00654
Epoch: 38581, Training Loss: 0.00738
Epoch: 38581, Training Loss: 0.00691
Epoch: 38581, Training Loss: 0.00844
Epoch: 38581, Training Loss: 0.00654
Epoch: 38582, Training Loss: 0.00738
Epoch: 38582, Training Loss: 0.00691
Epoch: 38582, Training Loss: 0.00844
Epoch: 38582, Training Loss: 0.00654
Epoch: 38583, Training Loss: 0.00738
Epoch: 38583, Training Loss: 0.00691
Epoch: 38583, Training Loss: 0.00844
Epoch: 38583, Training Loss: 0.00654
Epoch: 38584, Training Loss: 0.00738
Epoch: 38584, Training Loss: 0.00691
Epoch: 38584, Training Loss: 0.00843
Epoch: 38584, Training Loss: 0.00654
Epoch: 38585, Training Loss: 0.00738
Epoch: 38585, Training Loss: 0.00691
Epoch: 38585, Training Loss: 0.00843
Epoch: 38585, Training Loss: 0.00654
Epoch: 38586, Training Loss: 0.00738
Epoch: 38586, Training Loss: 0.00691
Epoch: 38586, Training Loss: 0.00843
Epoch: 38586, Training Loss: 0.00654
Epoch: 38587, Training Loss: 0.00738
Epoch: 38587, Training Loss: 0.00691
Epoch: 38587, Training Loss: 0.00843
Epoch: 38587, Training Loss: 0.00654
Epoch: 38588, Training Loss: 0.00738
Epoch: 38588, Training Loss: 0.00691
Epoch: 38588, Training Loss: 0.00843
Epoch: 38588, Training Loss: 0.00654
Epoch: 38589, Training Loss: 0.00738
Epoch: 38589, Training Loss: 0.00691
Epoch: 38589, Training Loss: 0.00843
Epoch: 38589, Training Loss: 0.00654
Epoch: 38590, Training Loss: 0.00738
Epoch: 38590, Training Loss: 0.00691
Epoch: 38590, Training Loss: 0.00843
Epoch: 38590, Training Loss: 0.00654
Epoch: 38591, Training Loss: 0.00738
Epoch: 38591, Training Loss: 0.00691
Epoch: 38591, Training Loss: 0.00843
Epoch: 38591, Training Loss: 0.00654
Epoch: 38592, Training Loss: 0.00738
Epoch: 38592, Training Loss: 0.00691
Epoch: 38592, Training Loss: 0.00843
Epoch: 38592, Training Loss: 0.00654
Epoch: 38593, Training Loss: 0.00738
Epoch: 38593, Training Loss: 0.00691
Epoch: 38593, Training Loss: 0.00843
Epoch: 38593, Training Loss: 0.00654
Epoch: 38594, Training Loss: 0.00738
Epoch: 38594, Training Loss: 0.00691
Epoch: 38594, Training Loss: 0.00843
Epoch: 38594, Training Loss: 0.00654
Epoch: 38595, Training Loss: 0.00738
Epoch: 38595, Training Loss: 0.00691
Epoch: 38595, Training Loss: 0.00843
Epoch: 38595, Training Loss: 0.00654
Epoch: 38596, Training Loss: 0.00738
Epoch: 38596, Training Loss: 0.00691
Epoch: 38596, Training Loss: 0.00843
Epoch: 38596, Training Loss: 0.00654
Epoch: 38597, Training Loss: 0.00738
Epoch: 38597, Training Loss: 0.00691
Epoch: 38597, Training Loss: 0.00843
Epoch: 38597, Training Loss: 0.00654
Epoch: 38598, Training Loss: 0.00738
Epoch: 38598, Training Loss: 0.00691
Epoch: 38598, Training Loss: 0.00843
Epoch: 38598, Training Loss: 0.00654
Epoch: 38599, Training Loss: 0.00738
Epoch: 38599, Training Loss: 0.00691
Epoch: 38599, Training Loss: 0.00843
Epoch: 38599, Training Loss: 0.00654
Epoch: 38600, Training Loss: 0.00738
Epoch: 38600, Training Loss: 0.00691
Epoch: 38600, Training Loss: 0.00843
Epoch: 38600, Training Loss: 0.00654
Epoch: 38601, Training Loss: 0.00738
Epoch: 38601, Training Loss: 0.00691
Epoch: 38601, Training Loss: 0.00843
Epoch: 38601, Training Loss: 0.00654
Epoch: 38602, Training Loss: 0.00738
Epoch: 38602, Training Loss: 0.00691
Epoch: 38602, Training Loss: 0.00843
Epoch: 38602, Training Loss: 0.00654
Epoch: 38603, Training Loss: 0.00738
Epoch: 38603, Training Loss: 0.00691
Epoch: 38603, Training Loss: 0.00843
Epoch: 38603, Training Loss: 0.00654
Epoch: 38604, Training Loss: 0.00738
Epoch: 38604, Training Loss: 0.00691
Epoch: 38604, Training Loss: 0.00843
Epoch: 38604, Training Loss: 0.00654
Epoch: 38605, Training Loss: 0.00738
Epoch: 38605, Training Loss: 0.00691
Epoch: 38605, Training Loss: 0.00843
Epoch: 38605, Training Loss: 0.00654
Epoch: 38606, Training Loss: 0.00738
Epoch: 38606, Training Loss: 0.00691
Epoch: 38606, Training Loss: 0.00843
Epoch: 38606, Training Loss: 0.00654
Epoch: 38607, Training Loss: 0.00738
Epoch: 38607, Training Loss: 0.00691
Epoch: 38607, Training Loss: 0.00843
Epoch: 38607, Training Loss: 0.00654
Epoch: 38608, Training Loss: 0.00738
Epoch: 38608, Training Loss: 0.00691
Epoch: 38608, Training Loss: 0.00843
Epoch: 38608, Training Loss: 0.00654
Epoch: 38609, Training Loss: 0.00738
Epoch: 38609, Training Loss: 0.00691
Epoch: 38609, Training Loss: 0.00843
Epoch: 38609, Training Loss: 0.00654
Epoch: 38610, Training Loss: 0.00738
Epoch: 38610, Training Loss: 0.00691
Epoch: 38610, Training Loss: 0.00843
Epoch: 38610, Training Loss: 0.00654
Epoch: 38611, Training Loss: 0.00738
Epoch: 38611, Training Loss: 0.00691
Epoch: 38611, Training Loss: 0.00843
Epoch: 38611, Training Loss: 0.00654
Epoch: 38612, Training Loss: 0.00738
Epoch: 38612, Training Loss: 0.00691
Epoch: 38612, Training Loss: 0.00843
Epoch: 38612, Training Loss: 0.00654
Epoch: 38613, Training Loss: 0.00738
Epoch: 38613, Training Loss: 0.00691
Epoch: 38613, Training Loss: 0.00843
Epoch: 38613, Training Loss: 0.00654
Epoch: 38614, Training Loss: 0.00738
Epoch: 38614, Training Loss: 0.00691
Epoch: 38614, Training Loss: 0.00843
Epoch: 38614, Training Loss: 0.00654
Epoch: 38615, Training Loss: 0.00738
Epoch: 38615, Training Loss: 0.00690
Epoch: 38615, Training Loss: 0.00843
Epoch: 38615, Training Loss: 0.00654
Epoch: 38616, Training Loss: 0.00738
Epoch: 38616, Training Loss: 0.00690
Epoch: 38616, Training Loss: 0.00843
Epoch: 38616, Training Loss: 0.00654
Epoch: 38617, Training Loss: 0.00738
Epoch: 38617, Training Loss: 0.00690
Epoch: 38617, Training Loss: 0.00843
Epoch: 38617, Training Loss: 0.00654
Epoch: 38618, Training Loss: 0.00738
Epoch: 38618, Training Loss: 0.00690
Epoch: 38618, Training Loss: 0.00843
Epoch: 38618, Training Loss: 0.00654
Epoch: 38619, Training Loss: 0.00738
Epoch: 38619, Training Loss: 0.00690
Epoch: 38619, Training Loss: 0.00843
Epoch: 38619, Training Loss: 0.00654
Epoch: 38620, Training Loss: 0.00738
Epoch: 38620, Training Loss: 0.00690
Epoch: 38620, Training Loss: 0.00843
Epoch: 38620, Training Loss: 0.00654
Epoch: 38621, Training Loss: 0.00738
Epoch: 38621, Training Loss: 0.00690
Epoch: 38621, Training Loss: 0.00843
Epoch: 38621, Training Loss: 0.00654
Epoch: 38622, Training Loss: 0.00738
Epoch: 38622, Training Loss: 0.00690
Epoch: 38622, Training Loss: 0.00843
Epoch: 38622, Training Loss: 0.00654
Epoch: 38623, Training Loss: 0.00738
Epoch: 38623, Training Loss: 0.00690
Epoch: 38623, Training Loss: 0.00843
Epoch: 38623, Training Loss: 0.00654
Epoch: 38624, Training Loss: 0.00738
Epoch: 38624, Training Loss: 0.00690
Epoch: 38624, Training Loss: 0.00843
Epoch: 38624, Training Loss: 0.00654
Epoch: 38625, Training Loss: 0.00738
Epoch: 38625, Training Loss: 0.00690
Epoch: 38625, Training Loss: 0.00843
Epoch: 38625, Training Loss: 0.00654
Epoch: 38626, Training Loss: 0.00738
Epoch: 38626, Training Loss: 0.00690
Epoch: 38626, Training Loss: 0.00843
Epoch: 38626, Training Loss: 0.00654
Epoch: 38627, Training Loss: 0.00738
Epoch: 38627, Training Loss: 0.00690
Epoch: 38627, Training Loss: 0.00843
Epoch: 38627, Training Loss: 0.00654
Epoch: 38628, Training Loss: 0.00738
Epoch: 38628, Training Loss: 0.00690
Epoch: 38628, Training Loss: 0.00843
Epoch: 38628, Training Loss: 0.00654
Epoch: 38629, Training Loss: 0.00738
Epoch: 38629, Training Loss: 0.00690
Epoch: 38629, Training Loss: 0.00843
Epoch: 38629, Training Loss: 0.00654
Epoch: 38630, Training Loss: 0.00738
Epoch: 38630, Training Loss: 0.00690
Epoch: 38630, Training Loss: 0.00843
Epoch: 38630, Training Loss: 0.00654
Epoch: 38631, Training Loss: 0.00738
Epoch: 38631, Training Loss: 0.00690
Epoch: 38631, Training Loss: 0.00843
Epoch: 38631, Training Loss: 0.00654
Epoch: 38632, Training Loss: 0.00738
Epoch: 38632, Training Loss: 0.00690
Epoch: 38632, Training Loss: 0.00843
Epoch: 38632, Training Loss: 0.00654
Epoch: 38633, Training Loss: 0.00738
Epoch: 38633, Training Loss: 0.00690
Epoch: 38633, Training Loss: 0.00843
Epoch: 38633, Training Loss: 0.00654
Epoch: 38634, Training Loss: 0.00738
Epoch: 38634, Training Loss: 0.00690
Epoch: 38634, Training Loss: 0.00843
Epoch: 38634, Training Loss: 0.00654
Epoch: 38635, Training Loss: 0.00738
Epoch: 38635, Training Loss: 0.00690
Epoch: 38635, Training Loss: 0.00843
Epoch: 38635, Training Loss: 0.00654
Epoch: 38636, Training Loss: 0.00738
Epoch: 38636, Training Loss: 0.00690
Epoch: 38636, Training Loss: 0.00843
Epoch: 38636, Training Loss: 0.00654
Epoch: 38637, Training Loss: 0.00738
Epoch: 38637, Training Loss: 0.00690
Epoch: 38637, Training Loss: 0.00843
Epoch: 38637, Training Loss: 0.00654
Epoch: 38638, Training Loss: 0.00738
Epoch: 38638, Training Loss: 0.00690
Epoch: 38638, Training Loss: 0.00843
Epoch: 38638, Training Loss: 0.00654
Epoch: 38639, Training Loss: 0.00738
Epoch: 38639, Training Loss: 0.00690
Epoch: 38639, Training Loss: 0.00843
Epoch: 38639, Training Loss: 0.00654
Epoch: 38640, Training Loss: 0.00738
Epoch: 38640, Training Loss: 0.00690
Epoch: 38640, Training Loss: 0.00843
Epoch: 38640, Training Loss: 0.00654
Epoch: 38641, Training Loss: 0.00738
Epoch: 38641, Training Loss: 0.00690
Epoch: 38641, Training Loss: 0.00843
Epoch: 38641, Training Loss: 0.00654
Epoch: 38642, Training Loss: 0.00738
Epoch: 38642, Training Loss: 0.00690
Epoch: 38642, Training Loss: 0.00843
Epoch: 38642, Training Loss: 0.00654
Epoch: 38643, Training Loss: 0.00738
Epoch: 38643, Training Loss: 0.00690
Epoch: 38643, Training Loss: 0.00843
Epoch: 38643, Training Loss: 0.00654
Epoch: 38644, Training Loss: 0.00738
Epoch: 38644, Training Loss: 0.00690
Epoch: 38644, Training Loss: 0.00843
Epoch: 38644, Training Loss: 0.00653
Epoch: 38645, Training Loss: 0.00738
Epoch: 38645, Training Loss: 0.00690
Epoch: 38645, Training Loss: 0.00843
Epoch: 38645, Training Loss: 0.00653
Epoch: 38646, Training Loss: 0.00738
Epoch: 38646, Training Loss: 0.00690
Epoch: 38646, Training Loss: 0.00843
Epoch: 38646, Training Loss: 0.00653
Epoch: 38647, Training Loss: 0.00738
Epoch: 38647, Training Loss: 0.00690
Epoch: 38647, Training Loss: 0.00843
Epoch: 38647, Training Loss: 0.00653
Epoch: 38648, Training Loss: 0.00738
Epoch: 38648, Training Loss: 0.00690
Epoch: 38648, Training Loss: 0.00843
Epoch: 38648, Training Loss: 0.00653
Epoch: 38649, Training Loss: 0.00738
Epoch: 38649, Training Loss: 0.00690
Epoch: 38649, Training Loss: 0.00843
Epoch: 38649, Training Loss: 0.00653
Epoch: 38650, Training Loss: 0.00738
Epoch: 38650, Training Loss: 0.00690
Epoch: 38650, Training Loss: 0.00843
Epoch: 38650, Training Loss: 0.00653
Epoch: 38651, Training Loss: 0.00738
Epoch: 38651, Training Loss: 0.00690
Epoch: 38651, Training Loss: 0.00843
Epoch: 38651, Training Loss: 0.00653
Epoch: 38652, Training Loss: 0.00738
Epoch: 38652, Training Loss: 0.00690
Epoch: 38652, Training Loss: 0.00843
Epoch: 38652, Training Loss: 0.00653
Epoch: 38653, Training Loss: 0.00738
Epoch: 38653, Training Loss: 0.00690
Epoch: 38653, Training Loss: 0.00843
Epoch: 38653, Training Loss: 0.00653
Epoch: 38654, Training Loss: 0.00738
Epoch: 38654, Training Loss: 0.00690
Epoch: 38654, Training Loss: 0.00843
Epoch: 38654, Training Loss: 0.00653
Epoch: 38655, Training Loss: 0.00738
Epoch: 38655, Training Loss: 0.00690
Epoch: 38655, Training Loss: 0.00843
Epoch: 38655, Training Loss: 0.00653
Epoch: 38656, Training Loss: 0.00738
Epoch: 38656, Training Loss: 0.00690
Epoch: 38656, Training Loss: 0.00843
Epoch: 38656, Training Loss: 0.00653
Epoch: 38657, Training Loss: 0.00738
Epoch: 38657, Training Loss: 0.00690
Epoch: 38657, Training Loss: 0.00843
Epoch: 38657, Training Loss: 0.00653
Epoch: 38658, Training Loss: 0.00738
Epoch: 38658, Training Loss: 0.00690
Epoch: 38658, Training Loss: 0.00843
Epoch: 38658, Training Loss: 0.00653
Epoch: 38659, Training Loss: 0.00738
Epoch: 38659, Training Loss: 0.00690
Epoch: 38659, Training Loss: 0.00843
Epoch: 38659, Training Loss: 0.00653
Epoch: 38660, Training Loss: 0.00738
Epoch: 38660, Training Loss: 0.00690
Epoch: 38660, Training Loss: 0.00843
Epoch: 38660, Training Loss: 0.00653
Epoch: 38661, Training Loss: 0.00738
Epoch: 38661, Training Loss: 0.00690
Epoch: 38661, Training Loss: 0.00843
Epoch: 38661, Training Loss: 0.00653
Epoch: 38662, Training Loss: 0.00737
Epoch: 38662, Training Loss: 0.00690
Epoch: 38662, Training Loss: 0.00843
Epoch: 38662, Training Loss: 0.00653
Epoch: 38663, Training Loss: 0.00737
Epoch: 38663, Training Loss: 0.00690
Epoch: 38663, Training Loss: 0.00843
Epoch: 38663, Training Loss: 0.00653
Epoch: 38664, Training Loss: 0.00737
Epoch: 38664, Training Loss: 0.00690
Epoch: 38664, Training Loss: 0.00843
Epoch: 38664, Training Loss: 0.00653
Epoch: 38665, Training Loss: 0.00737
Epoch: 38665, Training Loss: 0.00690
Epoch: 38665, Training Loss: 0.00843
Epoch: 38665, Training Loss: 0.00653
Epoch: 38666, Training Loss: 0.00737
Epoch: 38666, Training Loss: 0.00690
Epoch: 38666, Training Loss: 0.00843
Epoch: 38666, Training Loss: 0.00653
Epoch: 38667, Training Loss: 0.00737
Epoch: 38667, Training Loss: 0.00690
Epoch: 38667, Training Loss: 0.00843
Epoch: 38667, Training Loss: 0.00653
Epoch: 38668, Training Loss: 0.00737
Epoch: 38668, Training Loss: 0.00690
Epoch: 38668, Training Loss: 0.00843
Epoch: 38668, Training Loss: 0.00653
Epoch: 38669, Training Loss: 0.00737
Epoch: 38669, Training Loss: 0.00690
Epoch: 38669, Training Loss: 0.00843
Epoch: 38669, Training Loss: 0.00653
Epoch: 38670, Training Loss: 0.00737
Epoch: 38670, Training Loss: 0.00690
Epoch: 38670, Training Loss: 0.00843
Epoch: 38670, Training Loss: 0.00653
Epoch: 38671, Training Loss: 0.00737
Epoch: 38671, Training Loss: 0.00690
Epoch: 38671, Training Loss: 0.00842
Epoch: 38671, Training Loss: 0.00653
Epoch: 38672, Training Loss: 0.00737
Epoch: 38672, Training Loss: 0.00690
Epoch: 38672, Training Loss: 0.00842
Epoch: 38672, Training Loss: 0.00653
Epoch: 38673, Training Loss: 0.00737
Epoch: 38673, Training Loss: 0.00690
Epoch: 38673, Training Loss: 0.00842
Epoch: 38673, Training Loss: 0.00653
Epoch: 38674, Training Loss: 0.00737
Epoch: 38674, Training Loss: 0.00690
Epoch: 38674, Training Loss: 0.00842
Epoch: 38674, Training Loss: 0.00653
Epoch: 38675, Training Loss: 0.00737
Epoch: 38675, Training Loss: 0.00690
Epoch: 38675, Training Loss: 0.00842
Epoch: 38675, Training Loss: 0.00653
Epoch: 38676, Training Loss: 0.00737
Epoch: 38676, Training Loss: 0.00690
Epoch: 38676, Training Loss: 0.00842
Epoch: 38676, Training Loss: 0.00653
Epoch: 38677, Training Loss: 0.00737
Epoch: 38677, Training Loss: 0.00690
Epoch: 38677, Training Loss: 0.00842
Epoch: 38677, Training Loss: 0.00653
Epoch: 38678, Training Loss: 0.00737
Epoch: 38678, Training Loss: 0.00690
Epoch: 38678, Training Loss: 0.00842
Epoch: 38678, Training Loss: 0.00653
Epoch: 38679, Training Loss: 0.00737
Epoch: 38679, Training Loss: 0.00690
Epoch: 38679, Training Loss: 0.00842
Epoch: 38679, Training Loss: 0.00653
Epoch: 38680, Training Loss: 0.00737
Epoch: 38680, Training Loss: 0.00690
Epoch: 38680, Training Loss: 0.00842
Epoch: 38680, Training Loss: 0.00653
Epoch: 38681, Training Loss: 0.00737
Epoch: 38681, Training Loss: 0.00690
Epoch: 38681, Training Loss: 0.00842
Epoch: 38681, Training Loss: 0.00653
Epoch: 38682, Training Loss: 0.00737
Epoch: 38682, Training Loss: 0.00690
Epoch: 38682, Training Loss: 0.00842
Epoch: 38682, Training Loss: 0.00653
Epoch: 38683, Training Loss: 0.00737
Epoch: 38683, Training Loss: 0.00690
Epoch: 38683, Training Loss: 0.00842
Epoch: 38683, Training Loss: 0.00653
Epoch: 38684, Training Loss: 0.00737
Epoch: 38684, Training Loss: 0.00690
Epoch: 38684, Training Loss: 0.00842
Epoch: 38684, Training Loss: 0.00653
Epoch: 38685, Training Loss: 0.00737
Epoch: 38685, Training Loss: 0.00690
Epoch: 38685, Training Loss: 0.00842
Epoch: 38685, Training Loss: 0.00653
Epoch: 38686, Training Loss: 0.00737
Epoch: 38686, Training Loss: 0.00690
Epoch: 38686, Training Loss: 0.00842
Epoch: 38686, Training Loss: 0.00653
Epoch: 38687, Training Loss: 0.00737
Epoch: 38687, Training Loss: 0.00690
Epoch: 38687, Training Loss: 0.00842
Epoch: 38687, Training Loss: 0.00653
Epoch: 38688, Training Loss: 0.00737
Epoch: 38688, Training Loss: 0.00690
Epoch: 38688, Training Loss: 0.00842
Epoch: 38688, Training Loss: 0.00653
Epoch: 38689, Training Loss: 0.00737
Epoch: 38689, Training Loss: 0.00690
Epoch: 38689, Training Loss: 0.00842
Epoch: 38689, Training Loss: 0.00653
Epoch: 38690, Training Loss: 0.00737
Epoch: 38690, Training Loss: 0.00690
Epoch: 38690, Training Loss: 0.00842
Epoch: 38690, Training Loss: 0.00653
Epoch: 38691, Training Loss: 0.00737
Epoch: 38691, Training Loss: 0.00690
Epoch: 38691, Training Loss: 0.00842
Epoch: 38691, Training Loss: 0.00653
Epoch: 38692, Training Loss: 0.00737
Epoch: 38692, Training Loss: 0.00690
Epoch: 38692, Training Loss: 0.00842
Epoch: 38692, Training Loss: 0.00653
Epoch: 38693, Training Loss: 0.00737
Epoch: 38693, Training Loss: 0.00690
Epoch: 38693, Training Loss: 0.00842
Epoch: 38693, Training Loss: 0.00653
Epoch: 38694, Training Loss: 0.00737
Epoch: 38694, Training Loss: 0.00690
Epoch: 38694, Training Loss: 0.00842
Epoch: 38694, Training Loss: 0.00653
Epoch: 38695, Training Loss: 0.00737
Epoch: 38695, Training Loss: 0.00690
Epoch: 38695, Training Loss: 0.00842
Epoch: 38695, Training Loss: 0.00653
Epoch: 38696, Training Loss: 0.00737
Epoch: 38696, Training Loss: 0.00690
Epoch: 38696, Training Loss: 0.00842
Epoch: 38696, Training Loss: 0.00653
Epoch: 38697, Training Loss: 0.00737
Epoch: 38697, Training Loss: 0.00690
Epoch: 38697, Training Loss: 0.00842
Epoch: 38697, Training Loss: 0.00653
Epoch: 38698, Training Loss: 0.00737
Epoch: 38698, Training Loss: 0.00690
Epoch: 38698, Training Loss: 0.00842
Epoch: 38698, Training Loss: 0.00653
Epoch: 38699, Training Loss: 0.00737
Epoch: 38699, Training Loss: 0.00690
Epoch: 38699, Training Loss: 0.00842
Epoch: 38699, Training Loss: 0.00653
Epoch: 38700, Training Loss: 0.00737
Epoch: 38700, Training Loss: 0.00690
Epoch: 38700, Training Loss: 0.00842
Epoch: 38700, Training Loss: 0.00653
Epoch: 38701, Training Loss: 0.00737
Epoch: 38701, Training Loss: 0.00690
Epoch: 38701, Training Loss: 0.00842
Epoch: 38701, Training Loss: 0.00653
Epoch: 38702, Training Loss: 0.00737
Epoch: 38702, Training Loss: 0.00690
Epoch: 38702, Training Loss: 0.00842
Epoch: 38702, Training Loss: 0.00653
Epoch: 38703, Training Loss: 0.00737
Epoch: 38703, Training Loss: 0.00690
Epoch: 38703, Training Loss: 0.00842
Epoch: 38703, Training Loss: 0.00653
Epoch: 38704, Training Loss: 0.00737
Epoch: 38704, Training Loss: 0.00690
Epoch: 38704, Training Loss: 0.00842
Epoch: 38704, Training Loss: 0.00653
Epoch: 38705, Training Loss: 0.00737
Epoch: 38705, Training Loss: 0.00690
Epoch: 38705, Training Loss: 0.00842
Epoch: 38705, Training Loss: 0.00653
Epoch: 38706, Training Loss: 0.00737
Epoch: 38706, Training Loss: 0.00690
Epoch: 38706, Training Loss: 0.00842
Epoch: 38706, Training Loss: 0.00653
Epoch: 38707, Training Loss: 0.00737
Epoch: 38707, Training Loss: 0.00690
Epoch: 38707, Training Loss: 0.00842
Epoch: 38707, Training Loss: 0.00653
Epoch: 38708, Training Loss: 0.00737
Epoch: 38708, Training Loss: 0.00690
Epoch: 38708, Training Loss: 0.00842
Epoch: 38708, Training Loss: 0.00653
Epoch: 38709, Training Loss: 0.00737
Epoch: 38709, Training Loss: 0.00690
Epoch: 38709, Training Loss: 0.00842
Epoch: 38709, Training Loss: 0.00653
Epoch: 38710, Training Loss: 0.00737
Epoch: 38710, Training Loss: 0.00690
Epoch: 38710, Training Loss: 0.00842
Epoch: 38710, Training Loss: 0.00653
Epoch: 38711, Training Loss: 0.00737
Epoch: 38711, Training Loss: 0.00690
Epoch: 38711, Training Loss: 0.00842
Epoch: 38711, Training Loss: 0.00653
Epoch: 38712, Training Loss: 0.00737
Epoch: 38712, Training Loss: 0.00690
Epoch: 38712, Training Loss: 0.00842
Epoch: 38712, Training Loss: 0.00653
Epoch: 38713, Training Loss: 0.00737
Epoch: 38713, Training Loss: 0.00690
Epoch: 38713, Training Loss: 0.00842
Epoch: 38713, Training Loss: 0.00653
Epoch: 38714, Training Loss: 0.00737
Epoch: 38714, Training Loss: 0.00690
Epoch: 38714, Training Loss: 0.00842
Epoch: 38714, Training Loss: 0.00653
Epoch: 38715, Training Loss: 0.00737
Epoch: 38715, Training Loss: 0.00690
Epoch: 38715, Training Loss: 0.00842
Epoch: 38715, Training Loss: 0.00653
Epoch: 38716, Training Loss: 0.00737
Epoch: 38716, Training Loss: 0.00690
Epoch: 38716, Training Loss: 0.00842
Epoch: 38716, Training Loss: 0.00653
Epoch: 38717, Training Loss: 0.00737
Epoch: 38717, Training Loss: 0.00690
Epoch: 38717, Training Loss: 0.00842
Epoch: 38717, Training Loss: 0.00653
Epoch: 38718, Training Loss: 0.00737
Epoch: 38718, Training Loss: 0.00690
Epoch: 38718, Training Loss: 0.00842
Epoch: 38718, Training Loss: 0.00653
Epoch: 38719, Training Loss: 0.00737
Epoch: 38719, Training Loss: 0.00690
Epoch: 38719, Training Loss: 0.00842
Epoch: 38719, Training Loss: 0.00653
Epoch: 38720, Training Loss: 0.00737
Epoch: 38720, Training Loss: 0.00690
Epoch: 38720, Training Loss: 0.00842
Epoch: 38720, Training Loss: 0.00653
Epoch: 38721, Training Loss: 0.00737
Epoch: 38721, Training Loss: 0.00690
Epoch: 38721, Training Loss: 0.00842
Epoch: 38721, Training Loss: 0.00653
Epoch: 38722, Training Loss: 0.00737
Epoch: 38722, Training Loss: 0.00690
Epoch: 38722, Training Loss: 0.00842
Epoch: 38722, Training Loss: 0.00653
Epoch: 38723, Training Loss: 0.00737
Epoch: 38723, Training Loss: 0.00689
Epoch: 38723, Training Loss: 0.00842
Epoch: 38723, Training Loss: 0.00653
Epoch: 38724, Training Loss: 0.00737
Epoch: 38724, Training Loss: 0.00689
Epoch: 38724, Training Loss: 0.00842
Epoch: 38724, Training Loss: 0.00653
Epoch: 38725, Training Loss: 0.00737
Epoch: 38725, Training Loss: 0.00689
Epoch: 38725, Training Loss: 0.00842
Epoch: 38725, Training Loss: 0.00653
Epoch: 38726, Training Loss: 0.00737
Epoch: 38726, Training Loss: 0.00689
Epoch: 38726, Training Loss: 0.00842
Epoch: 38726, Training Loss: 0.00653
Epoch: 38727, Training Loss: 0.00737
Epoch: 38727, Training Loss: 0.00689
Epoch: 38727, Training Loss: 0.00842
Epoch: 38727, Training Loss: 0.00653
Epoch: 38728, Training Loss: 0.00737
Epoch: 38728, Training Loss: 0.00689
Epoch: 38728, Training Loss: 0.00842
Epoch: 38728, Training Loss: 0.00653
Epoch: 38729, Training Loss: 0.00737
Epoch: 38729, Training Loss: 0.00689
Epoch: 38729, Training Loss: 0.00842
Epoch: 38729, Training Loss: 0.00653
Epoch: 38730, Training Loss: 0.00737
Epoch: 38730, Training Loss: 0.00689
Epoch: 38730, Training Loss: 0.00842
Epoch: 38730, Training Loss: 0.00653
Epoch: 38731, Training Loss: 0.00737
Epoch: 38731, Training Loss: 0.00689
Epoch: 38731, Training Loss: 0.00842
Epoch: 38731, Training Loss: 0.00653
Epoch: 38732, Training Loss: 0.00737
Epoch: 38732, Training Loss: 0.00689
Epoch: 38732, Training Loss: 0.00842
Epoch: 38732, Training Loss: 0.00653
Epoch: 38733, Training Loss: 0.00737
Epoch: 38733, Training Loss: 0.00689
Epoch: 38733, Training Loss: 0.00842
Epoch: 38733, Training Loss: 0.00653
Epoch: 38734, Training Loss: 0.00737
Epoch: 38734, Training Loss: 0.00689
Epoch: 38734, Training Loss: 0.00842
Epoch: 38734, Training Loss: 0.00653
Epoch: 38735, Training Loss: 0.00737
Epoch: 38735, Training Loss: 0.00689
Epoch: 38735, Training Loss: 0.00842
Epoch: 38735, Training Loss: 0.00653
Epoch: 38736, Training Loss: 0.00737
Epoch: 38736, Training Loss: 0.00689
Epoch: 38736, Training Loss: 0.00842
Epoch: 38736, Training Loss: 0.00653
Epoch: 38737, Training Loss: 0.00737
Epoch: 38737, Training Loss: 0.00689
Epoch: 38737, Training Loss: 0.00842
Epoch: 38737, Training Loss: 0.00653
Epoch: 38738, Training Loss: 0.00737
Epoch: 38738, Training Loss: 0.00689
Epoch: 38738, Training Loss: 0.00842
Epoch: 38738, Training Loss: 0.00653
Epoch: 38739, Training Loss: 0.00737
Epoch: 38739, Training Loss: 0.00689
Epoch: 38739, Training Loss: 0.00842
Epoch: 38739, Training Loss: 0.00653
Epoch: 38740, Training Loss: 0.00737
Epoch: 38740, Training Loss: 0.00689
Epoch: 38740, Training Loss: 0.00842
Epoch: 38740, Training Loss: 0.00653
Epoch: 38741, Training Loss: 0.00737
Epoch: 38741, Training Loss: 0.00689
Epoch: 38741, Training Loss: 0.00842
Epoch: 38741, Training Loss: 0.00653
Epoch: 38742, Training Loss: 0.00737
Epoch: 38742, Training Loss: 0.00689
Epoch: 38742, Training Loss: 0.00842
Epoch: 38742, Training Loss: 0.00653
Epoch: 38743, Training Loss: 0.00737
Epoch: 38743, Training Loss: 0.00689
Epoch: 38743, Training Loss: 0.00842
Epoch: 38743, Training Loss: 0.00653
Epoch: 38744, Training Loss: 0.00737
Epoch: 38744, Training Loss: 0.00689
Epoch: 38744, Training Loss: 0.00842
Epoch: 38744, Training Loss: 0.00653
Epoch: 38745, Training Loss: 0.00737
Epoch: 38745, Training Loss: 0.00689
Epoch: 38745, Training Loss: 0.00842
Epoch: 38745, Training Loss: 0.00653
Epoch: 38746, Training Loss: 0.00737
Epoch: 38746, Training Loss: 0.00689
Epoch: 38746, Training Loss: 0.00842
Epoch: 38746, Training Loss: 0.00653
Epoch: 38747, Training Loss: 0.00737
Epoch: 38747, Training Loss: 0.00689
Epoch: 38747, Training Loss: 0.00842
Epoch: 38747, Training Loss: 0.00653
Epoch: 38748, Training Loss: 0.00737
Epoch: 38748, Training Loss: 0.00689
Epoch: 38748, Training Loss: 0.00842
Epoch: 38748, Training Loss: 0.00653
Epoch: 38749, Training Loss: 0.00737
Epoch: 38749, Training Loss: 0.00689
Epoch: 38749, Training Loss: 0.00842
Epoch: 38749, Training Loss: 0.00653
...
Epoch: 38992, Training Loss: 0.00734
Epoch: 38992, Training Loss: 0.00687
Epoch: 38992, Training Loss: 0.00839
Epoch: 38992, Training Loss: 0.00650
Epoch: 38993, Training Loss: 0.00734
Epoch: 38993, Training Loss: 0.00687
Epoch: 38993, Training Loss: 0.00839
Epoch: 38993, Training Loss: 0.00650
Epoch: 38994, Training Loss: 0.00734
Epoch: 38994, Training Loss: 0.00687
Epoch: 38994, Training Loss: 0.00839
Epoch: 38994, Training Loss: 0.00650
Epoch: 38995, Training Loss: 0.00734
Epoch: 38995, Training Loss: 0.00687
Epoch: 38995, Training Loss: 0.00839
Epoch: 38995, Training Loss: 0.00650
Epoch: 38996, Training Loss: 0.00734
Epoch: 38996, Training Loss: 0.00687
Epoch: 38996, Training Loss: 0.00839
Epoch: 38996, Training Loss: 0.00650
Epoch: 38997, Training Loss: 0.00734
Epoch: 38997, Training Loss: 0.00687
Epoch: 38997, Training Loss: 0.00839
Epoch: 38997, Training Loss: 0.00650
Epoch: 38998, Training Loss: 0.00734
Epoch: 38998, Training Loss: 0.00687
Epoch: 38998, Training Loss: 0.00839
Epoch: 38998, Training Loss: 0.00650
Epoch: 38999, Training Loss: 0.00734
Epoch: 38999, Training Loss: 0.00687
Epoch: 38999, Training Loss: 0.00839
Epoch: 38999, Training Loss: 0.00650
Epoch: 39000, Training Loss: 0.00734
Epoch: 39000, Training Loss: 0.00687
Epoch: 39000, Training Loss: 0.00839
Epoch: 39000, Training Loss: 0.00650
Epoch: 39001, Training Loss: 0.00734
Epoch: 39001, Training Loss: 0.00687
Epoch: 39001, Training Loss: 0.00839
Epoch: 39001, Training Loss: 0.00650
Epoch: 39002, Training Loss: 0.00734
Epoch: 39002, Training Loss: 0.00687
Epoch: 39002, Training Loss: 0.00839
Epoch: 39002, Training Loss: 0.00650
Epoch: 39003, Training Loss: 0.00734
Epoch: 39003, Training Loss: 0.00687
Epoch: 39003, Training Loss: 0.00839
Epoch: 39003, Training Loss: 0.00650
Epoch: 39004, Training Loss: 0.00734
Epoch: 39004, Training Loss: 0.00687
Epoch: 39004, Training Loss: 0.00839
Epoch: 39004, Training Loss: 0.00650
Epoch: 39005, Training Loss: 0.00734
Epoch: 39005, Training Loss: 0.00687
Epoch: 39005, Training Loss: 0.00839
Epoch: 39005, Training Loss: 0.00650
Epoch: 39006, Training Loss: 0.00734
Epoch: 39006, Training Loss: 0.00687
Epoch: 39006, Training Loss: 0.00839
Epoch: 39006, Training Loss: 0.00650
Epoch: 39007, Training Loss: 0.00734
Epoch: 39007, Training Loss: 0.00687
Epoch: 39007, Training Loss: 0.00839
Epoch: 39007, Training Loss: 0.00650
Epoch: 39008, Training Loss: 0.00734
Epoch: 39008, Training Loss: 0.00687
Epoch: 39008, Training Loss: 0.00839
Epoch: 39008, Training Loss: 0.00650
Epoch: 39009, Training Loss: 0.00734
Epoch: 39009, Training Loss: 0.00687
Epoch: 39009, Training Loss: 0.00839
Epoch: 39009, Training Loss: 0.00650
Epoch: 39010, Training Loss: 0.00734
Epoch: 39010, Training Loss: 0.00687
Epoch: 39010, Training Loss: 0.00839
Epoch: 39010, Training Loss: 0.00650
Epoch: 39011, Training Loss: 0.00734
Epoch: 39011, Training Loss: 0.00687
Epoch: 39011, Training Loss: 0.00839
Epoch: 39011, Training Loss: 0.00650
Epoch: 39012, Training Loss: 0.00734
Epoch: 39012, Training Loss: 0.00687
Epoch: 39012, Training Loss: 0.00839
Epoch: 39012, Training Loss: 0.00650
Epoch: 39013, Training Loss: 0.00734
Epoch: 39013, Training Loss: 0.00687
Epoch: 39013, Training Loss: 0.00839
Epoch: 39013, Training Loss: 0.00650
Epoch: 39014, Training Loss: 0.00734
Epoch: 39014, Training Loss: 0.00687
Epoch: 39014, Training Loss: 0.00839
Epoch: 39014, Training Loss: 0.00650
Epoch: 39015, Training Loss: 0.00734
Epoch: 39015, Training Loss: 0.00687
Epoch: 39015, Training Loss: 0.00839
Epoch: 39015, Training Loss: 0.00650
Epoch: 39016, Training Loss: 0.00734
Epoch: 39016, Training Loss: 0.00687
Epoch: 39016, Training Loss: 0.00839
Epoch: 39016, Training Loss: 0.00650
Epoch: 39017, Training Loss: 0.00734
Epoch: 39017, Training Loss: 0.00687
Epoch: 39017, Training Loss: 0.00839
Epoch: 39017, Training Loss: 0.00650
Epoch: 39018, Training Loss: 0.00734
Epoch: 39018, Training Loss: 0.00687
Epoch: 39018, Training Loss: 0.00839
Epoch: 39018, Training Loss: 0.00650
Epoch: 39019, Training Loss: 0.00734
Epoch: 39019, Training Loss: 0.00687
Epoch: 39019, Training Loss: 0.00839
Epoch: 39019, Training Loss: 0.00650
Epoch: 39020, Training Loss: 0.00734
Epoch: 39020, Training Loss: 0.00687
Epoch: 39020, Training Loss: 0.00839
Epoch: 39020, Training Loss: 0.00650
Epoch: 39021, Training Loss: 0.00734
Epoch: 39021, Training Loss: 0.00687
Epoch: 39021, Training Loss: 0.00839
Epoch: 39021, Training Loss: 0.00650
Epoch: 39022, Training Loss: 0.00734
Epoch: 39022, Training Loss: 0.00687
Epoch: 39022, Training Loss: 0.00839
Epoch: 39022, Training Loss: 0.00650
Epoch: 39023, Training Loss: 0.00734
Epoch: 39023, Training Loss: 0.00687
Epoch: 39023, Training Loss: 0.00838
Epoch: 39023, Training Loss: 0.00650
Epoch: 39024, Training Loss: 0.00734
Epoch: 39024, Training Loss: 0.00687
Epoch: 39024, Training Loss: 0.00838
Epoch: 39024, Training Loss: 0.00650
Epoch: 39025, Training Loss: 0.00734
Epoch: 39025, Training Loss: 0.00687
Epoch: 39025, Training Loss: 0.00838
Epoch: 39025, Training Loss: 0.00650
Epoch: 39026, Training Loss: 0.00734
Epoch: 39026, Training Loss: 0.00687
Epoch: 39026, Training Loss: 0.00838
Epoch: 39026, Training Loss: 0.00650
Epoch: 39027, Training Loss: 0.00734
Epoch: 39027, Training Loss: 0.00687
Epoch: 39027, Training Loss: 0.00838
Epoch: 39027, Training Loss: 0.00650
Epoch: 39028, Training Loss: 0.00734
Epoch: 39028, Training Loss: 0.00687
Epoch: 39028, Training Loss: 0.00838
Epoch: 39028, Training Loss: 0.00650
Epoch: 39029, Training Loss: 0.00734
Epoch: 39029, Training Loss: 0.00687
Epoch: 39029, Training Loss: 0.00838
Epoch: 39029, Training Loss: 0.00650
Epoch: 39030, Training Loss: 0.00734
Epoch: 39030, Training Loss: 0.00687
Epoch: 39030, Training Loss: 0.00838
Epoch: 39030, Training Loss: 0.00650
Epoch: 39031, Training Loss: 0.00734
Epoch: 39031, Training Loss: 0.00687
Epoch: 39031, Training Loss: 0.00838
Epoch: 39031, Training Loss: 0.00650
Epoch: 39032, Training Loss: 0.00734
Epoch: 39032, Training Loss: 0.00687
Epoch: 39032, Training Loss: 0.00838
Epoch: 39032, Training Loss: 0.00650
Epoch: 39033, Training Loss: 0.00734
Epoch: 39033, Training Loss: 0.00687
Epoch: 39033, Training Loss: 0.00838
Epoch: 39033, Training Loss: 0.00650
Epoch: 39034, Training Loss: 0.00734
Epoch: 39034, Training Loss: 0.00687
Epoch: 39034, Training Loss: 0.00838
Epoch: 39034, Training Loss: 0.00650
Epoch: 39035, Training Loss: 0.00734
Epoch: 39035, Training Loss: 0.00687
Epoch: 39035, Training Loss: 0.00838
Epoch: 39035, Training Loss: 0.00650
Epoch: 39036, Training Loss: 0.00734
Epoch: 39036, Training Loss: 0.00687
Epoch: 39036, Training Loss: 0.00838
Epoch: 39036, Training Loss: 0.00650
Epoch: 39037, Training Loss: 0.00734
Epoch: 39037, Training Loss: 0.00687
Epoch: 39037, Training Loss: 0.00838
Epoch: 39037, Training Loss: 0.00650
Epoch: 39038, Training Loss: 0.00734
Epoch: 39038, Training Loss: 0.00687
Epoch: 39038, Training Loss: 0.00838
Epoch: 39038, Training Loss: 0.00650
Epoch: 39039, Training Loss: 0.00734
Epoch: 39039, Training Loss: 0.00687
Epoch: 39039, Training Loss: 0.00838
Epoch: 39039, Training Loss: 0.00650
Epoch: 39040, Training Loss: 0.00734
Epoch: 39040, Training Loss: 0.00687
Epoch: 39040, Training Loss: 0.00838
Epoch: 39040, Training Loss: 0.00650
Epoch: 39041, Training Loss: 0.00734
Epoch: 39041, Training Loss: 0.00687
Epoch: 39041, Training Loss: 0.00838
Epoch: 39041, Training Loss: 0.00650
Epoch: 39042, Training Loss: 0.00734
Epoch: 39042, Training Loss: 0.00687
Epoch: 39042, Training Loss: 0.00838
Epoch: 39042, Training Loss: 0.00650
Epoch: 39043, Training Loss: 0.00734
Epoch: 39043, Training Loss: 0.00687
Epoch: 39043, Training Loss: 0.00838
Epoch: 39043, Training Loss: 0.00650
Epoch: 39044, Training Loss: 0.00734
Epoch: 39044, Training Loss: 0.00687
Epoch: 39044, Training Loss: 0.00838
Epoch: 39044, Training Loss: 0.00650
Epoch: 39045, Training Loss: 0.00734
Epoch: 39045, Training Loss: 0.00687
Epoch: 39045, Training Loss: 0.00838
Epoch: 39045, Training Loss: 0.00650
Epoch: 39046, Training Loss: 0.00734
Epoch: 39046, Training Loss: 0.00687
Epoch: 39046, Training Loss: 0.00838
Epoch: 39046, Training Loss: 0.00650
Epoch: 39047, Training Loss: 0.00734
Epoch: 39047, Training Loss: 0.00687
Epoch: 39047, Training Loss: 0.00838
Epoch: 39047, Training Loss: 0.00650
Epoch: 39048, Training Loss: 0.00734
Epoch: 39048, Training Loss: 0.00687
Epoch: 39048, Training Loss: 0.00838
Epoch: 39048, Training Loss: 0.00650
Epoch: 39049, Training Loss: 0.00734
Epoch: 39049, Training Loss: 0.00686
Epoch: 39049, Training Loss: 0.00838
Epoch: 39049, Training Loss: 0.00650
Epoch: 39050, Training Loss: 0.00734
Epoch: 39050, Training Loss: 0.00686
Epoch: 39050, Training Loss: 0.00838
Epoch: 39050, Training Loss: 0.00650
Epoch: 39051, Training Loss: 0.00734
Epoch: 39051, Training Loss: 0.00686
Epoch: 39051, Training Loss: 0.00838
Epoch: 39051, Training Loss: 0.00650
Epoch: 39052, Training Loss: 0.00734
Epoch: 39052, Training Loss: 0.00686
Epoch: 39052, Training Loss: 0.00838
Epoch: 39052, Training Loss: 0.00650
Epoch: 39053, Training Loss: 0.00734
Epoch: 39053, Training Loss: 0.00686
Epoch: 39053, Training Loss: 0.00838
Epoch: 39053, Training Loss: 0.00650
Epoch: 39054, Training Loss: 0.00734
Epoch: 39054, Training Loss: 0.00686
Epoch: 39054, Training Loss: 0.00838
Epoch: 39054, Training Loss: 0.00650
Epoch: 39055, Training Loss: 0.00734
Epoch: 39055, Training Loss: 0.00686
Epoch: 39055, Training Loss: 0.00838
Epoch: 39055, Training Loss: 0.00650
Epoch: 39056, Training Loss: 0.00734
Epoch: 39056, Training Loss: 0.00686
Epoch: 39056, Training Loss: 0.00838
Epoch: 39056, Training Loss: 0.00650
Epoch: 39057, Training Loss: 0.00734
Epoch: 39057, Training Loss: 0.00686
Epoch: 39057, Training Loss: 0.00838
Epoch: 39057, Training Loss: 0.00650
Epoch: 39058, Training Loss: 0.00734
Epoch: 39058, Training Loss: 0.00686
Epoch: 39058, Training Loss: 0.00838
Epoch: 39058, Training Loss: 0.00650
Epoch: 39059, Training Loss: 0.00734
Epoch: 39059, Training Loss: 0.00686
Epoch: 39059, Training Loss: 0.00838
Epoch: 39059, Training Loss: 0.00650
Epoch: 39060, Training Loss: 0.00734
Epoch: 39060, Training Loss: 0.00686
Epoch: 39060, Training Loss: 0.00838
Epoch: 39060, Training Loss: 0.00650
Epoch: 39061, Training Loss: 0.00734
Epoch: 39061, Training Loss: 0.00686
Epoch: 39061, Training Loss: 0.00838
Epoch: 39061, Training Loss: 0.00650
Epoch: 39062, Training Loss: 0.00734
Epoch: 39062, Training Loss: 0.00686
Epoch: 39062, Training Loss: 0.00838
Epoch: 39062, Training Loss: 0.00650
Epoch: 39063, Training Loss: 0.00733
Epoch: 39063, Training Loss: 0.00686
Epoch: 39063, Training Loss: 0.00838
Epoch: 39063, Training Loss: 0.00650
Epoch: 39064, Training Loss: 0.00733
Epoch: 39064, Training Loss: 0.00686
Epoch: 39064, Training Loss: 0.00838
Epoch: 39064, Training Loss: 0.00650
Epoch: 39065, Training Loss: 0.00733
Epoch: 39065, Training Loss: 0.00686
Epoch: 39065, Training Loss: 0.00838
Epoch: 39065, Training Loss: 0.00650
Epoch: 39066, Training Loss: 0.00733
Epoch: 39066, Training Loss: 0.00686
Epoch: 39066, Training Loss: 0.00838
Epoch: 39066, Training Loss: 0.00650
Epoch: 39067, Training Loss: 0.00733
Epoch: 39067, Training Loss: 0.00686
Epoch: 39067, Training Loss: 0.00838
Epoch: 39067, Training Loss: 0.00650
Epoch: 39068, Training Loss: 0.00733
Epoch: 39068, Training Loss: 0.00686
Epoch: 39068, Training Loss: 0.00838
Epoch: 39068, Training Loss: 0.00650
Epoch: 39069, Training Loss: 0.00733
Epoch: 39069, Training Loss: 0.00686
Epoch: 39069, Training Loss: 0.00838
Epoch: 39069, Training Loss: 0.00650
Epoch: 39070, Training Loss: 0.00733
Epoch: 39070, Training Loss: 0.00686
Epoch: 39070, Training Loss: 0.00838
Epoch: 39070, Training Loss: 0.00650
Epoch: 39071, Training Loss: 0.00733
Epoch: 39071, Training Loss: 0.00686
Epoch: 39071, Training Loss: 0.00838
Epoch: 39071, Training Loss: 0.00650
Epoch: 39072, Training Loss: 0.00733
Epoch: 39072, Training Loss: 0.00686
Epoch: 39072, Training Loss: 0.00838
Epoch: 39072, Training Loss: 0.00650
Epoch: 39073, Training Loss: 0.00733
Epoch: 39073, Training Loss: 0.00686
Epoch: 39073, Training Loss: 0.00838
Epoch: 39073, Training Loss: 0.00650
Epoch: 39074, Training Loss: 0.00733
Epoch: 39074, Training Loss: 0.00686
Epoch: 39074, Training Loss: 0.00838
Epoch: 39074, Training Loss: 0.00650
Epoch: 39075, Training Loss: 0.00733
Epoch: 39075, Training Loss: 0.00686
Epoch: 39075, Training Loss: 0.00838
Epoch: 39075, Training Loss: 0.00650
Epoch: 39076, Training Loss: 0.00733
Epoch: 39076, Training Loss: 0.00686
Epoch: 39076, Training Loss: 0.00838
Epoch: 39076, Training Loss: 0.00650
Epoch: 39077, Training Loss: 0.00733
Epoch: 39077, Training Loss: 0.00686
Epoch: 39077, Training Loss: 0.00838
Epoch: 39077, Training Loss: 0.00650
Epoch: 39078, Training Loss: 0.00733
Epoch: 39078, Training Loss: 0.00686
Epoch: 39078, Training Loss: 0.00838
Epoch: 39078, Training Loss: 0.00650
Epoch: 39079, Training Loss: 0.00733
Epoch: 39079, Training Loss: 0.00686
Epoch: 39079, Training Loss: 0.00838
Epoch: 39079, Training Loss: 0.00650
Epoch: 39080, Training Loss: 0.00733
Epoch: 39080, Training Loss: 0.00686
Epoch: 39080, Training Loss: 0.00838
Epoch: 39080, Training Loss: 0.00650
Epoch: 39081, Training Loss: 0.00733
Epoch: 39081, Training Loss: 0.00686
Epoch: 39081, Training Loss: 0.00838
Epoch: 39081, Training Loss: 0.00650
Epoch: 39082, Training Loss: 0.00733
Epoch: 39082, Training Loss: 0.00686
Epoch: 39082, Training Loss: 0.00838
Epoch: 39082, Training Loss: 0.00650
Epoch: 39083, Training Loss: 0.00733
Epoch: 39083, Training Loss: 0.00686
Epoch: 39083, Training Loss: 0.00838
Epoch: 39083, Training Loss: 0.00650
Epoch: 39084, Training Loss: 0.00733
Epoch: 39084, Training Loss: 0.00686
Epoch: 39084, Training Loss: 0.00838
Epoch: 39084, Training Loss: 0.00650
Epoch: 39085, Training Loss: 0.00733
Epoch: 39085, Training Loss: 0.00686
Epoch: 39085, Training Loss: 0.00838
Epoch: 39085, Training Loss: 0.00650
Epoch: 39086, Training Loss: 0.00733
Epoch: 39086, Training Loss: 0.00686
Epoch: 39086, Training Loss: 0.00838
Epoch: 39086, Training Loss: 0.00650
Epoch: 39087, Training Loss: 0.00733
Epoch: 39087, Training Loss: 0.00686
Epoch: 39087, Training Loss: 0.00838
Epoch: 39087, Training Loss: 0.00650
Epoch: 39088, Training Loss: 0.00733
Epoch: 39088, Training Loss: 0.00686
Epoch: 39088, Training Loss: 0.00838
Epoch: 39088, Training Loss: 0.00650
Epoch: 39089, Training Loss: 0.00733
Epoch: 39089, Training Loss: 0.00686
Epoch: 39089, Training Loss: 0.00838
Epoch: 39089, Training Loss: 0.00650
Epoch: 39090, Training Loss: 0.00733
Epoch: 39090, Training Loss: 0.00686
Epoch: 39090, Training Loss: 0.00838
Epoch: 39090, Training Loss: 0.00650
Epoch: 39091, Training Loss: 0.00733
Epoch: 39091, Training Loss: 0.00686
Epoch: 39091, Training Loss: 0.00838
Epoch: 39091, Training Loss: 0.00650
Epoch: 39092, Training Loss: 0.00733
Epoch: 39092, Training Loss: 0.00686
Epoch: 39092, Training Loss: 0.00838
Epoch: 39092, Training Loss: 0.00650
Epoch: 39093, Training Loss: 0.00733
Epoch: 39093, Training Loss: 0.00686
Epoch: 39093, Training Loss: 0.00838
Epoch: 39093, Training Loss: 0.00650
Epoch: 39094, Training Loss: 0.00733
Epoch: 39094, Training Loss: 0.00686
Epoch: 39094, Training Loss: 0.00838
Epoch: 39094, Training Loss: 0.00650
Epoch: 39095, Training Loss: 0.00733
Epoch: 39095, Training Loss: 0.00686
Epoch: 39095, Training Loss: 0.00838
Epoch: 39095, Training Loss: 0.00650
Epoch: 39096, Training Loss: 0.00733
Epoch: 39096, Training Loss: 0.00686
Epoch: 39096, Training Loss: 0.00838
Epoch: 39096, Training Loss: 0.00650
Epoch: 39097, Training Loss: 0.00733
Epoch: 39097, Training Loss: 0.00686
Epoch: 39097, Training Loss: 0.00838
Epoch: 39097, Training Loss: 0.00650
Epoch: 39098, Training Loss: 0.00733
Epoch: 39098, Training Loss: 0.00686
Epoch: 39098, Training Loss: 0.00838
Epoch: 39098, Training Loss: 0.00650
Epoch: 39099, Training Loss: 0.00733
Epoch: 39099, Training Loss: 0.00686
Epoch: 39099, Training Loss: 0.00838
Epoch: 39099, Training Loss: 0.00650
Epoch: 39100, Training Loss: 0.00733
Epoch: 39100, Training Loss: 0.00686
Epoch: 39100, Training Loss: 0.00838
Epoch: 39100, Training Loss: 0.00650
Epoch: 39101, Training Loss: 0.00733
Epoch: 39101, Training Loss: 0.00686
Epoch: 39101, Training Loss: 0.00838
Epoch: 39101, Training Loss: 0.00650
Epoch: 39102, Training Loss: 0.00733
Epoch: 39102, Training Loss: 0.00686
Epoch: 39102, Training Loss: 0.00838
Epoch: 39102, Training Loss: 0.00650
Epoch: 39103, Training Loss: 0.00733
Epoch: 39103, Training Loss: 0.00686
Epoch: 39103, Training Loss: 0.00838
Epoch: 39103, Training Loss: 0.00649
Epoch: 39104, Training Loss: 0.00733
Epoch: 39104, Training Loss: 0.00686
Epoch: 39104, Training Loss: 0.00838
Epoch: 39104, Training Loss: 0.00649
Epoch: 39105, Training Loss: 0.00733
Epoch: 39105, Training Loss: 0.00686
Epoch: 39105, Training Loss: 0.00838
Epoch: 39105, Training Loss: 0.00649
Epoch: 39106, Training Loss: 0.00733
Epoch: 39106, Training Loss: 0.00686
Epoch: 39106, Training Loss: 0.00838
Epoch: 39106, Training Loss: 0.00649
Epoch: 39107, Training Loss: 0.00733
Epoch: 39107, Training Loss: 0.00686
Epoch: 39107, Training Loss: 0.00838
Epoch: 39107, Training Loss: 0.00649
Epoch: 39108, Training Loss: 0.00733
Epoch: 39108, Training Loss: 0.00686
Epoch: 39108, Training Loss: 0.00838
Epoch: 39108, Training Loss: 0.00649
Epoch: 39109, Training Loss: 0.00733
Epoch: 39109, Training Loss: 0.00686
Epoch: 39109, Training Loss: 0.00838
Epoch: 39109, Training Loss: 0.00649
Epoch: 39110, Training Loss: 0.00733
Epoch: 39110, Training Loss: 0.00686
Epoch: 39110, Training Loss: 0.00838
Epoch: 39110, Training Loss: 0.00649
Epoch: 39111, Training Loss: 0.00733
Epoch: 39111, Training Loss: 0.00686
Epoch: 39111, Training Loss: 0.00838
Epoch: 39111, Training Loss: 0.00649
Epoch: 39112, Training Loss: 0.00733
Epoch: 39112, Training Loss: 0.00686
Epoch: 39112, Training Loss: 0.00837
Epoch: 39112, Training Loss: 0.00649
Epoch: 39113, Training Loss: 0.00733
Epoch: 39113, Training Loss: 0.00686
Epoch: 39113, Training Loss: 0.00837
Epoch: 39113, Training Loss: 0.00649
Epoch: 39114, Training Loss: 0.00733
Epoch: 39114, Training Loss: 0.00686
Epoch: 39114, Training Loss: 0.00837
Epoch: 39114, Training Loss: 0.00649
Epoch: 39115, Training Loss: 0.00733
Epoch: 39115, Training Loss: 0.00686
Epoch: 39115, Training Loss: 0.00837
Epoch: 39115, Training Loss: 0.00649
Epoch: 39116, Training Loss: 0.00733
Epoch: 39116, Training Loss: 0.00686
Epoch: 39116, Training Loss: 0.00837
Epoch: 39116, Training Loss: 0.00649
Epoch: 39117, Training Loss: 0.00733
Epoch: 39117, Training Loss: 0.00686
Epoch: 39117, Training Loss: 0.00837
Epoch: 39117, Training Loss: 0.00649
Epoch: 39118, Training Loss: 0.00733
Epoch: 39118, Training Loss: 0.00686
Epoch: 39118, Training Loss: 0.00837
Epoch: 39118, Training Loss: 0.00649
Epoch: 39119, Training Loss: 0.00733
Epoch: 39119, Training Loss: 0.00686
Epoch: 39119, Training Loss: 0.00837
Epoch: 39119, Training Loss: 0.00649
Epoch: 39120, Training Loss: 0.00733
Epoch: 39120, Training Loss: 0.00686
Epoch: 39120, Training Loss: 0.00837
Epoch: 39120, Training Loss: 0.00649
Epoch: 39121, Training Loss: 0.00733
Epoch: 39121, Training Loss: 0.00686
Epoch: 39121, Training Loss: 0.00837
Epoch: 39121, Training Loss: 0.00649
Epoch: 39122, Training Loss: 0.00733
Epoch: 39122, Training Loss: 0.00686
Epoch: 39122, Training Loss: 0.00837
Epoch: 39122, Training Loss: 0.00649
Epoch: 39123, Training Loss: 0.00733
Epoch: 39123, Training Loss: 0.00686
Epoch: 39123, Training Loss: 0.00837
Epoch: 39123, Training Loss: 0.00649
Epoch: 39124, Training Loss: 0.00733
Epoch: 39124, Training Loss: 0.00686
Epoch: 39124, Training Loss: 0.00837
Epoch: 39124, Training Loss: 0.00649
Epoch: 39125, Training Loss: 0.00733
Epoch: 39125, Training Loss: 0.00686
Epoch: 39125, Training Loss: 0.00837
Epoch: 39125, Training Loss: 0.00649
Epoch: 39126, Training Loss: 0.00733
Epoch: 39126, Training Loss: 0.00686
Epoch: 39126, Training Loss: 0.00837
Epoch: 39126, Training Loss: 0.00649
Epoch: 39127, Training Loss: 0.00733
Epoch: 39127, Training Loss: 0.00686
Epoch: 39127, Training Loss: 0.00837
Epoch: 39127, Training Loss: 0.00649
Epoch: 39128, Training Loss: 0.00733
Epoch: 39128, Training Loss: 0.00686
Epoch: 39128, Training Loss: 0.00837
Epoch: 39128, Training Loss: 0.00649
Epoch: 39129, Training Loss: 0.00733
Epoch: 39129, Training Loss: 0.00686
Epoch: 39129, Training Loss: 0.00837
Epoch: 39129, Training Loss: 0.00649
Epoch: 39130, Training Loss: 0.00733
Epoch: 39130, Training Loss: 0.00686
Epoch: 39130, Training Loss: 0.00837
Epoch: 39130, Training Loss: 0.00649
Epoch: 39131, Training Loss: 0.00733
Epoch: 39131, Training Loss: 0.00686
Epoch: 39131, Training Loss: 0.00837
Epoch: 39131, Training Loss: 0.00649
Epoch: 39132, Training Loss: 0.00733
Epoch: 39132, Training Loss: 0.00686
Epoch: 39132, Training Loss: 0.00837
Epoch: 39132, Training Loss: 0.00649
Epoch: 39133, Training Loss: 0.00733
Epoch: 39133, Training Loss: 0.00686
Epoch: 39133, Training Loss: 0.00837
Epoch: 39133, Training Loss: 0.00649
Epoch: 39134, Training Loss: 0.00733
Epoch: 39134, Training Loss: 0.00686
Epoch: 39134, Training Loss: 0.00837
Epoch: 39134, Training Loss: 0.00649
Epoch: 39135, Training Loss: 0.00733
Epoch: 39135, Training Loss: 0.00686
Epoch: 39135, Training Loss: 0.00837
Epoch: 39135, Training Loss: 0.00649
Epoch: 39136, Training Loss: 0.00733
Epoch: 39136, Training Loss: 0.00686
Epoch: 39136, Training Loss: 0.00837
Epoch: 39136, Training Loss: 0.00649
Epoch: 39137, Training Loss: 0.00733
Epoch: 39137, Training Loss: 0.00686
Epoch: 39137, Training Loss: 0.00837
Epoch: 39137, Training Loss: 0.00649
Epoch: 39138, Training Loss: 0.00733
Epoch: 39138, Training Loss: 0.00686
Epoch: 39138, Training Loss: 0.00837
Epoch: 39138, Training Loss: 0.00649
Epoch: 39139, Training Loss: 0.00733
Epoch: 39139, Training Loss: 0.00686
Epoch: 39139, Training Loss: 0.00837
Epoch: 39139, Training Loss: 0.00649
Epoch: 39140, Training Loss: 0.00733
Epoch: 39140, Training Loss: 0.00686
Epoch: 39140, Training Loss: 0.00837
Epoch: 39140, Training Loss: 0.00649
Epoch: 39141, Training Loss: 0.00733
Epoch: 39141, Training Loss: 0.00686
Epoch: 39141, Training Loss: 0.00837
Epoch: 39141, Training Loss: 0.00649
Epoch: 39142, Training Loss: 0.00733
Epoch: 39142, Training Loss: 0.00686
Epoch: 39142, Training Loss: 0.00837
Epoch: 39142, Training Loss: 0.00649
Epoch: 39143, Training Loss: 0.00733
Epoch: 39143, Training Loss: 0.00686
Epoch: 39143, Training Loss: 0.00837
Epoch: 39143, Training Loss: 0.00649
Epoch: 39144, Training Loss: 0.00733
Epoch: 39144, Training Loss: 0.00686
Epoch: 39144, Training Loss: 0.00837
Epoch: 39144, Training Loss: 0.00649
Epoch: 39145, Training Loss: 0.00733
Epoch: 39145, Training Loss: 0.00686
Epoch: 39145, Training Loss: 0.00837
Epoch: 39145, Training Loss: 0.00649
Epoch: 39146, Training Loss: 0.00733
Epoch: 39146, Training Loss: 0.00686
Epoch: 39146, Training Loss: 0.00837
Epoch: 39146, Training Loss: 0.00649
Epoch: 39147, Training Loss: 0.00733
Epoch: 39147, Training Loss: 0.00686
Epoch: 39147, Training Loss: 0.00837
Epoch: 39147, Training Loss: 0.00649
Epoch: 39148, Training Loss: 0.00733
Epoch: 39148, Training Loss: 0.00686
Epoch: 39148, Training Loss: 0.00837
Epoch: 39148, Training Loss: 0.00649
Epoch: 39149, Training Loss: 0.00733
Epoch: 39149, Training Loss: 0.00686
Epoch: 39149, Training Loss: 0.00837
Epoch: 39149, Training Loss: 0.00649
Epoch: 39150, Training Loss: 0.00733
Epoch: 39150, Training Loss: 0.00686
Epoch: 39150, Training Loss: 0.00837
Epoch: 39150, Training Loss: 0.00649
Epoch: 39151, Training Loss: 0.00733
Epoch: 39151, Training Loss: 0.00686
Epoch: 39151, Training Loss: 0.00837
Epoch: 39151, Training Loss: 0.00649
Epoch: 39152, Training Loss: 0.00733
Epoch: 39152, Training Loss: 0.00686
Epoch: 39152, Training Loss: 0.00837
Epoch: 39152, Training Loss: 0.00649
Epoch: 39153, Training Loss: 0.00733
Epoch: 39153, Training Loss: 0.00686
Epoch: 39153, Training Loss: 0.00837
Epoch: 39153, Training Loss: 0.00649
Epoch: 39154, Training Loss: 0.00733
Epoch: 39154, Training Loss: 0.00686
Epoch: 39154, Training Loss: 0.00837
Epoch: 39154, Training Loss: 0.00649
Epoch: 39155, Training Loss: 0.00733
Epoch: 39155, Training Loss: 0.00686
Epoch: 39155, Training Loss: 0.00837
Epoch: 39155, Training Loss: 0.00649
Epoch: 39156, Training Loss: 0.00733
Epoch: 39156, Training Loss: 0.00686
Epoch: 39156, Training Loss: 0.00837
Epoch: 39156, Training Loss: 0.00649
Epoch: 39157, Training Loss: 0.00733
Epoch: 39157, Training Loss: 0.00686
Epoch: 39157, Training Loss: 0.00837
Epoch: 39157, Training Loss: 0.00649
Epoch: 39158, Training Loss: 0.00733
Epoch: 39158, Training Loss: 0.00686
Epoch: 39158, Training Loss: 0.00837
Epoch: 39158, Training Loss: 0.00649
Epoch: 39159, Training Loss: 0.00733
Epoch: 39159, Training Loss: 0.00685
Epoch: 39159, Training Loss: 0.00837
Epoch: 39159, Training Loss: 0.00649
Epoch: 39160, Training Loss: 0.00733
Epoch: 39160, Training Loss: 0.00685
Epoch: 39160, Training Loss: 0.00837
Epoch: 39160, Training Loss: 0.00649
Epoch: 39161, Training Loss: 0.00733
Epoch: 39161, Training Loss: 0.00685
Epoch: 39161, Training Loss: 0.00837
Epoch: 39161, Training Loss: 0.00649
Epoch: 39162, Training Loss: 0.00733
Epoch: 39162, Training Loss: 0.00685
Epoch: 39162, Training Loss: 0.00837
Epoch: 39162, Training Loss: 0.00649
Epoch: 39163, Training Loss: 0.00733
Epoch: 39163, Training Loss: 0.00685
Epoch: 39163, Training Loss: 0.00837
Epoch: 39163, Training Loss: 0.00649
Epoch: 39164, Training Loss: 0.00732
Epoch: 39164, Training Loss: 0.00685
Epoch: 39164, Training Loss: 0.00837
Epoch: 39164, Training Loss: 0.00649
Epoch: 39165, Training Loss: 0.00732
Epoch: 39165, Training Loss: 0.00685
Epoch: 39165, Training Loss: 0.00837
Epoch: 39165, Training Loss: 0.00649
Epoch: 39166, Training Loss: 0.00732
Epoch: 39166, Training Loss: 0.00685
Epoch: 39166, Training Loss: 0.00837
Epoch: 39166, Training Loss: 0.00649
Epoch: 39167, Training Loss: 0.00732
Epoch: 39167, Training Loss: 0.00685
Epoch: 39167, Training Loss: 0.00837
Epoch: 39167, Training Loss: 0.00649
Epoch: 39168, Training Loss: 0.00732
Epoch: 39168, Training Loss: 0.00685
Epoch: 39168, Training Loss: 0.00837
Epoch: 39168, Training Loss: 0.00649
Epoch: 39169, Training Loss: 0.00732
Epoch: 39169, Training Loss: 0.00685
Epoch: 39169, Training Loss: 0.00837
Epoch: 39169, Training Loss: 0.00649
Epoch: 39170, Training Loss: 0.00732
Epoch: 39170, Training Loss: 0.00685
Epoch: 39170, Training Loss: 0.00837
Epoch: 39170, Training Loss: 0.00649
Epoch: 39171, Training Loss: 0.00732
Epoch: 39171, Training Loss: 0.00685
Epoch: 39171, Training Loss: 0.00837
Epoch: 39171, Training Loss: 0.00649
Epoch: 39172, Training Loss: 0.00732
Epoch: 39172, Training Loss: 0.00685
Epoch: 39172, Training Loss: 0.00837
Epoch: 39172, Training Loss: 0.00649
Epoch: 39173, Training Loss: 0.00732
Epoch: 39173, Training Loss: 0.00685
Epoch: 39173, Training Loss: 0.00837
Epoch: 39173, Training Loss: 0.00649
Epoch: 39174, Training Loss: 0.00732
Epoch: 39174, Training Loss: 0.00685
Epoch: 39174, Training Loss: 0.00837
Epoch: 39174, Training Loss: 0.00649
Epoch: 39175, Training Loss: 0.00732
Epoch: 39175, Training Loss: 0.00685
Epoch: 39175, Training Loss: 0.00837
Epoch: 39175, Training Loss: 0.00649
Epoch: 39176, Training Loss: 0.00732
Epoch: 39176, Training Loss: 0.00685
Epoch: 39176, Training Loss: 0.00837
Epoch: 39176, Training Loss: 0.00649
Epoch: 39177, Training Loss: 0.00732
Epoch: 39177, Training Loss: 0.00685
Epoch: 39177, Training Loss: 0.00837
Epoch: 39177, Training Loss: 0.00649
Epoch: 39178, Training Loss: 0.00732
Epoch: 39178, Training Loss: 0.00685
Epoch: 39178, Training Loss: 0.00837
Epoch: 39178, Training Loss: 0.00649
Epoch: 39179, Training Loss: 0.00732
Epoch: 39179, Training Loss: 0.00685
Epoch: 39179, Training Loss: 0.00837
Epoch: 39179, Training Loss: 0.00649
Epoch: 39180, Training Loss: 0.00732
Epoch: 39180, Training Loss: 0.00685
Epoch: 39180, Training Loss: 0.00837
Epoch: 39180, Training Loss: 0.00649
Epoch: 39181, Training Loss: 0.00732
Epoch: 39181, Training Loss: 0.00685
Epoch: 39181, Training Loss: 0.00837
Epoch: 39181, Training Loss: 0.00649
[... repetitive log output truncated: over epochs 39182-39424 the four per-pattern losses decrease only marginally, from about 0.00732, 0.00685, 0.00837, 0.00649 to about 0.00730, 0.00683, 0.00834, 0.00647 ...]
Epoch: 39425, Training Loss: 0.00730
Epoch: 39425, Training Loss: 0.00683
Epoch: 39425, Training Loss: 0.00834
Epoch: 39425, Training Loss: 0.00647
Epoch: 39426, Training Loss: 0.00730
Epoch: 39426, Training Loss: 0.00683
Epoch: 39426, Training Loss: 0.00834
Epoch: 39426, Training Loss: 0.00647
Epoch: 39427, Training Loss: 0.00730
Epoch: 39427, Training Loss: 0.00683
Epoch: 39427, Training Loss: 0.00834
Epoch: 39427, Training Loss: 0.00647
Epoch: 39428, Training Loss: 0.00730
Epoch: 39428, Training Loss: 0.00683
Epoch: 39428, Training Loss: 0.00834
Epoch: 39428, Training Loss: 0.00647
Epoch: 39429, Training Loss: 0.00730
Epoch: 39429, Training Loss: 0.00683
Epoch: 39429, Training Loss: 0.00834
Epoch: 39429, Training Loss: 0.00647
Epoch: 39430, Training Loss: 0.00730
Epoch: 39430, Training Loss: 0.00683
Epoch: 39430, Training Loss: 0.00834
Epoch: 39430, Training Loss: 0.00647
Epoch: 39431, Training Loss: 0.00730
Epoch: 39431, Training Loss: 0.00683
Epoch: 39431, Training Loss: 0.00834
Epoch: 39431, Training Loss: 0.00647
Epoch: 39432, Training Loss: 0.00730
Epoch: 39432, Training Loss: 0.00683
Epoch: 39432, Training Loss: 0.00834
Epoch: 39432, Training Loss: 0.00647
Epoch: 39433, Training Loss: 0.00730
Epoch: 39433, Training Loss: 0.00683
Epoch: 39433, Training Loss: 0.00834
Epoch: 39433, Training Loss: 0.00647
Epoch: 39434, Training Loss: 0.00730
Epoch: 39434, Training Loss: 0.00683
Epoch: 39434, Training Loss: 0.00834
Epoch: 39434, Training Loss: 0.00647
Epoch: 39435, Training Loss: 0.00730
Epoch: 39435, Training Loss: 0.00683
Epoch: 39435, Training Loss: 0.00834
Epoch: 39435, Training Loss: 0.00647
Epoch: 39436, Training Loss: 0.00730
Epoch: 39436, Training Loss: 0.00683
Epoch: 39436, Training Loss: 0.00834
Epoch: 39436, Training Loss: 0.00647
Epoch: 39437, Training Loss: 0.00730
Epoch: 39437, Training Loss: 0.00683
Epoch: 39437, Training Loss: 0.00834
Epoch: 39437, Training Loss: 0.00647
Epoch: 39438, Training Loss: 0.00730
Epoch: 39438, Training Loss: 0.00683
Epoch: 39438, Training Loss: 0.00834
Epoch: 39438, Training Loss: 0.00647
Epoch: 39439, Training Loss: 0.00730
Epoch: 39439, Training Loss: 0.00683
Epoch: 39439, Training Loss: 0.00834
Epoch: 39439, Training Loss: 0.00647
Epoch: 39440, Training Loss: 0.00730
Epoch: 39440, Training Loss: 0.00683
Epoch: 39440, Training Loss: 0.00834
Epoch: 39440, Training Loss: 0.00647
Epoch: 39441, Training Loss: 0.00730
Epoch: 39441, Training Loss: 0.00683
Epoch: 39441, Training Loss: 0.00834
Epoch: 39441, Training Loss: 0.00647
Epoch: 39442, Training Loss: 0.00730
Epoch: 39442, Training Loss: 0.00683
Epoch: 39442, Training Loss: 0.00834
Epoch: 39442, Training Loss: 0.00647
Epoch: 39443, Training Loss: 0.00730
Epoch: 39443, Training Loss: 0.00683
Epoch: 39443, Training Loss: 0.00834
Epoch: 39443, Training Loss: 0.00647
Epoch: 39444, Training Loss: 0.00730
Epoch: 39444, Training Loss: 0.00683
Epoch: 39444, Training Loss: 0.00834
Epoch: 39444, Training Loss: 0.00647
Epoch: 39445, Training Loss: 0.00730
Epoch: 39445, Training Loss: 0.00683
Epoch: 39445, Training Loss: 0.00834
Epoch: 39445, Training Loss: 0.00647
Epoch: 39446, Training Loss: 0.00730
Epoch: 39446, Training Loss: 0.00683
Epoch: 39446, Training Loss: 0.00834
Epoch: 39446, Training Loss: 0.00647
Epoch: 39447, Training Loss: 0.00730
Epoch: 39447, Training Loss: 0.00683
Epoch: 39447, Training Loss: 0.00834
Epoch: 39447, Training Loss: 0.00647
Epoch: 39448, Training Loss: 0.00730
Epoch: 39448, Training Loss: 0.00683
Epoch: 39448, Training Loss: 0.00834
Epoch: 39448, Training Loss: 0.00647
Epoch: 39449, Training Loss: 0.00730
Epoch: 39449, Training Loss: 0.00683
Epoch: 39449, Training Loss: 0.00834
Epoch: 39449, Training Loss: 0.00647
Epoch: 39450, Training Loss: 0.00730
Epoch: 39450, Training Loss: 0.00683
Epoch: 39450, Training Loss: 0.00834
Epoch: 39450, Training Loss: 0.00647
Epoch: 39451, Training Loss: 0.00730
Epoch: 39451, Training Loss: 0.00683
Epoch: 39451, Training Loss: 0.00834
Epoch: 39451, Training Loss: 0.00647
Epoch: 39452, Training Loss: 0.00730
Epoch: 39452, Training Loss: 0.00683
Epoch: 39452, Training Loss: 0.00834
Epoch: 39452, Training Loss: 0.00647
Epoch: 39453, Training Loss: 0.00730
Epoch: 39453, Training Loss: 0.00683
Epoch: 39453, Training Loss: 0.00834
Epoch: 39453, Training Loss: 0.00646
Epoch: 39454, Training Loss: 0.00730
Epoch: 39454, Training Loss: 0.00683
Epoch: 39454, Training Loss: 0.00834
Epoch: 39454, Training Loss: 0.00646
Epoch: 39455, Training Loss: 0.00730
Epoch: 39455, Training Loss: 0.00683
Epoch: 39455, Training Loss: 0.00834
Epoch: 39455, Training Loss: 0.00646
Epoch: 39456, Training Loss: 0.00730
Epoch: 39456, Training Loss: 0.00683
Epoch: 39456, Training Loss: 0.00834
Epoch: 39456, Training Loss: 0.00646
Epoch: 39457, Training Loss: 0.00730
Epoch: 39457, Training Loss: 0.00683
Epoch: 39457, Training Loss: 0.00834
Epoch: 39457, Training Loss: 0.00646
Epoch: 39458, Training Loss: 0.00730
Epoch: 39458, Training Loss: 0.00683
Epoch: 39458, Training Loss: 0.00834
Epoch: 39458, Training Loss: 0.00646
Epoch: 39459, Training Loss: 0.00730
Epoch: 39459, Training Loss: 0.00683
Epoch: 39459, Training Loss: 0.00834
Epoch: 39459, Training Loss: 0.00646
Epoch: 39460, Training Loss: 0.00730
Epoch: 39460, Training Loss: 0.00683
Epoch: 39460, Training Loss: 0.00834
Epoch: 39460, Training Loss: 0.00646
Epoch: 39461, Training Loss: 0.00730
Epoch: 39461, Training Loss: 0.00683
Epoch: 39461, Training Loss: 0.00834
Epoch: 39461, Training Loss: 0.00646
Epoch: 39462, Training Loss: 0.00730
Epoch: 39462, Training Loss: 0.00683
Epoch: 39462, Training Loss: 0.00834
Epoch: 39462, Training Loss: 0.00646
Epoch: 39463, Training Loss: 0.00730
Epoch: 39463, Training Loss: 0.00683
Epoch: 39463, Training Loss: 0.00834
Epoch: 39463, Training Loss: 0.00646
Epoch: 39464, Training Loss: 0.00730
Epoch: 39464, Training Loss: 0.00683
Epoch: 39464, Training Loss: 0.00834
Epoch: 39464, Training Loss: 0.00646
Epoch: 39465, Training Loss: 0.00730
Epoch: 39465, Training Loss: 0.00683
Epoch: 39465, Training Loss: 0.00834
Epoch: 39465, Training Loss: 0.00646
Epoch: 39466, Training Loss: 0.00730
Epoch: 39466, Training Loss: 0.00683
Epoch: 39466, Training Loss: 0.00834
Epoch: 39466, Training Loss: 0.00646
Epoch: 39467, Training Loss: 0.00730
Epoch: 39467, Training Loss: 0.00683
Epoch: 39467, Training Loss: 0.00834
Epoch: 39467, Training Loss: 0.00646
Epoch: 39468, Training Loss: 0.00730
Epoch: 39468, Training Loss: 0.00683
Epoch: 39468, Training Loss: 0.00834
Epoch: 39468, Training Loss: 0.00646
Epoch: 39469, Training Loss: 0.00730
Epoch: 39469, Training Loss: 0.00683
Epoch: 39469, Training Loss: 0.00834
Epoch: 39469, Training Loss: 0.00646
Epoch: 39470, Training Loss: 0.00729
Epoch: 39470, Training Loss: 0.00683
Epoch: 39470, Training Loss: 0.00833
Epoch: 39470, Training Loss: 0.00646
Epoch: 39471, Training Loss: 0.00729
Epoch: 39471, Training Loss: 0.00683
Epoch: 39471, Training Loss: 0.00833
Epoch: 39471, Training Loss: 0.00646
Epoch: 39472, Training Loss: 0.00729
Epoch: 39472, Training Loss: 0.00683
Epoch: 39472, Training Loss: 0.00833
Epoch: 39472, Training Loss: 0.00646
Epoch: 39473, Training Loss: 0.00729
Epoch: 39473, Training Loss: 0.00683
Epoch: 39473, Training Loss: 0.00833
Epoch: 39473, Training Loss: 0.00646
Epoch: 39474, Training Loss: 0.00729
Epoch: 39474, Training Loss: 0.00683
Epoch: 39474, Training Loss: 0.00833
Epoch: 39474, Training Loss: 0.00646
Epoch: 39475, Training Loss: 0.00729
Epoch: 39475, Training Loss: 0.00683
Epoch: 39475, Training Loss: 0.00833
Epoch: 39475, Training Loss: 0.00646
Epoch: 39476, Training Loss: 0.00729
Epoch: 39476, Training Loss: 0.00683
Epoch: 39476, Training Loss: 0.00833
Epoch: 39476, Training Loss: 0.00646
Epoch: 39477, Training Loss: 0.00729
Epoch: 39477, Training Loss: 0.00683
Epoch: 39477, Training Loss: 0.00833
Epoch: 39477, Training Loss: 0.00646
Epoch: 39478, Training Loss: 0.00729
Epoch: 39478, Training Loss: 0.00683
Epoch: 39478, Training Loss: 0.00833
Epoch: 39478, Training Loss: 0.00646
Epoch: 39479, Training Loss: 0.00729
Epoch: 39479, Training Loss: 0.00683
Epoch: 39479, Training Loss: 0.00833
Epoch: 39479, Training Loss: 0.00646
Epoch: 39480, Training Loss: 0.00729
Epoch: 39480, Training Loss: 0.00683
Epoch: 39480, Training Loss: 0.00833
Epoch: 39480, Training Loss: 0.00646
Epoch: 39481, Training Loss: 0.00729
Epoch: 39481, Training Loss: 0.00683
Epoch: 39481, Training Loss: 0.00833
Epoch: 39481, Training Loss: 0.00646
Epoch: 39482, Training Loss: 0.00729
Epoch: 39482, Training Loss: 0.00683
Epoch: 39482, Training Loss: 0.00833
Epoch: 39482, Training Loss: 0.00646
Epoch: 39483, Training Loss: 0.00729
Epoch: 39483, Training Loss: 0.00683
Epoch: 39483, Training Loss: 0.00833
Epoch: 39483, Training Loss: 0.00646
Epoch: 39484, Training Loss: 0.00729
Epoch: 39484, Training Loss: 0.00683
Epoch: 39484, Training Loss: 0.00833
Epoch: 39484, Training Loss: 0.00646
Epoch: 39485, Training Loss: 0.00729
Epoch: 39485, Training Loss: 0.00683
Epoch: 39485, Training Loss: 0.00833
Epoch: 39485, Training Loss: 0.00646
Epoch: 39486, Training Loss: 0.00729
Epoch: 39486, Training Loss: 0.00683
Epoch: 39486, Training Loss: 0.00833
Epoch: 39486, Training Loss: 0.00646
Epoch: 39487, Training Loss: 0.00729
Epoch: 39487, Training Loss: 0.00683
Epoch: 39487, Training Loss: 0.00833
Epoch: 39487, Training Loss: 0.00646
Epoch: 39488, Training Loss: 0.00729
Epoch: 39488, Training Loss: 0.00683
Epoch: 39488, Training Loss: 0.00833
Epoch: 39488, Training Loss: 0.00646
Epoch: 39489, Training Loss: 0.00729
Epoch: 39489, Training Loss: 0.00683
Epoch: 39489, Training Loss: 0.00833
Epoch: 39489, Training Loss: 0.00646
Epoch: 39490, Training Loss: 0.00729
Epoch: 39490, Training Loss: 0.00683
Epoch: 39490, Training Loss: 0.00833
Epoch: 39490, Training Loss: 0.00646
Epoch: 39491, Training Loss: 0.00729
Epoch: 39491, Training Loss: 0.00682
Epoch: 39491, Training Loss: 0.00833
Epoch: 39491, Training Loss: 0.00646
Epoch: 39492, Training Loss: 0.00729
Epoch: 39492, Training Loss: 0.00682
Epoch: 39492, Training Loss: 0.00833
Epoch: 39492, Training Loss: 0.00646
Epoch: 39493, Training Loss: 0.00729
Epoch: 39493, Training Loss: 0.00682
Epoch: 39493, Training Loss: 0.00833
Epoch: 39493, Training Loss: 0.00646
Epoch: 39494, Training Loss: 0.00729
Epoch: 39494, Training Loss: 0.00682
Epoch: 39494, Training Loss: 0.00833
Epoch: 39494, Training Loss: 0.00646
Epoch: 39495, Training Loss: 0.00729
Epoch: 39495, Training Loss: 0.00682
Epoch: 39495, Training Loss: 0.00833
Epoch: 39495, Training Loss: 0.00646
Epoch: 39496, Training Loss: 0.00729
Epoch: 39496, Training Loss: 0.00682
Epoch: 39496, Training Loss: 0.00833
Epoch: 39496, Training Loss: 0.00646
Epoch: 39497, Training Loss: 0.00729
Epoch: 39497, Training Loss: 0.00682
Epoch: 39497, Training Loss: 0.00833
Epoch: 39497, Training Loss: 0.00646
Epoch: 39498, Training Loss: 0.00729
Epoch: 39498, Training Loss: 0.00682
Epoch: 39498, Training Loss: 0.00833
Epoch: 39498, Training Loss: 0.00646
Epoch: 39499, Training Loss: 0.00729
Epoch: 39499, Training Loss: 0.00682
Epoch: 39499, Training Loss: 0.00833
Epoch: 39499, Training Loss: 0.00646
Epoch: 39500, Training Loss: 0.00729
Epoch: 39500, Training Loss: 0.00682
Epoch: 39500, Training Loss: 0.00833
Epoch: 39500, Training Loss: 0.00646
Epoch: 39501, Training Loss: 0.00729
Epoch: 39501, Training Loss: 0.00682
Epoch: 39501, Training Loss: 0.00833
Epoch: 39501, Training Loss: 0.00646
Epoch: 39502, Training Loss: 0.00729
Epoch: 39502, Training Loss: 0.00682
Epoch: 39502, Training Loss: 0.00833
Epoch: 39502, Training Loss: 0.00646
Epoch: 39503, Training Loss: 0.00729
Epoch: 39503, Training Loss: 0.00682
Epoch: 39503, Training Loss: 0.00833
Epoch: 39503, Training Loss: 0.00646
Epoch: 39504, Training Loss: 0.00729
Epoch: 39504, Training Loss: 0.00682
Epoch: 39504, Training Loss: 0.00833
Epoch: 39504, Training Loss: 0.00646
Epoch: 39505, Training Loss: 0.00729
Epoch: 39505, Training Loss: 0.00682
Epoch: 39505, Training Loss: 0.00833
Epoch: 39505, Training Loss: 0.00646
Epoch: 39506, Training Loss: 0.00729
Epoch: 39506, Training Loss: 0.00682
Epoch: 39506, Training Loss: 0.00833
Epoch: 39506, Training Loss: 0.00646
Epoch: 39507, Training Loss: 0.00729
Epoch: 39507, Training Loss: 0.00682
Epoch: 39507, Training Loss: 0.00833
Epoch: 39507, Training Loss: 0.00646
Epoch: 39508, Training Loss: 0.00729
Epoch: 39508, Training Loss: 0.00682
Epoch: 39508, Training Loss: 0.00833
Epoch: 39508, Training Loss: 0.00646
Epoch: 39509, Training Loss: 0.00729
Epoch: 39509, Training Loss: 0.00682
Epoch: 39509, Training Loss: 0.00833
Epoch: 39509, Training Loss: 0.00646
Epoch: 39510, Training Loss: 0.00729
Epoch: 39510, Training Loss: 0.00682
Epoch: 39510, Training Loss: 0.00833
Epoch: 39510, Training Loss: 0.00646
Epoch: 39511, Training Loss: 0.00729
Epoch: 39511, Training Loss: 0.00682
Epoch: 39511, Training Loss: 0.00833
Epoch: 39511, Training Loss: 0.00646
Epoch: 39512, Training Loss: 0.00729
Epoch: 39512, Training Loss: 0.00682
Epoch: 39512, Training Loss: 0.00833
Epoch: 39512, Training Loss: 0.00646
Epoch: 39513, Training Loss: 0.00729
Epoch: 39513, Training Loss: 0.00682
Epoch: 39513, Training Loss: 0.00833
Epoch: 39513, Training Loss: 0.00646
Epoch: 39514, Training Loss: 0.00729
Epoch: 39514, Training Loss: 0.00682
Epoch: 39514, Training Loss: 0.00833
Epoch: 39514, Training Loss: 0.00646
Epoch: 39515, Training Loss: 0.00729
Epoch: 39515, Training Loss: 0.00682
Epoch: 39515, Training Loss: 0.00833
Epoch: 39515, Training Loss: 0.00646
Epoch: 39516, Training Loss: 0.00729
Epoch: 39516, Training Loss: 0.00682
Epoch: 39516, Training Loss: 0.00833
Epoch: 39516, Training Loss: 0.00646
Epoch: 39517, Training Loss: 0.00729
Epoch: 39517, Training Loss: 0.00682
Epoch: 39517, Training Loss: 0.00833
Epoch: 39517, Training Loss: 0.00646
Epoch: 39518, Training Loss: 0.00729
Epoch: 39518, Training Loss: 0.00682
Epoch: 39518, Training Loss: 0.00833
Epoch: 39518, Training Loss: 0.00646
Epoch: 39519, Training Loss: 0.00729
Epoch: 39519, Training Loss: 0.00682
Epoch: 39519, Training Loss: 0.00833
Epoch: 39519, Training Loss: 0.00646
Epoch: 39520, Training Loss: 0.00729
Epoch: 39520, Training Loss: 0.00682
Epoch: 39520, Training Loss: 0.00833
Epoch: 39520, Training Loss: 0.00646
Epoch: 39521, Training Loss: 0.00729
Epoch: 39521, Training Loss: 0.00682
Epoch: 39521, Training Loss: 0.00833
Epoch: 39521, Training Loss: 0.00646
Epoch: 39522, Training Loss: 0.00729
Epoch: 39522, Training Loss: 0.00682
Epoch: 39522, Training Loss: 0.00833
Epoch: 39522, Training Loss: 0.00646
Epoch: 39523, Training Loss: 0.00729
Epoch: 39523, Training Loss: 0.00682
Epoch: 39523, Training Loss: 0.00833
Epoch: 39523, Training Loss: 0.00646
Epoch: 39524, Training Loss: 0.00729
Epoch: 39524, Training Loss: 0.00682
Epoch: 39524, Training Loss: 0.00833
Epoch: 39524, Training Loss: 0.00646
Epoch: 39525, Training Loss: 0.00729
Epoch: 39525, Training Loss: 0.00682
Epoch: 39525, Training Loss: 0.00833
Epoch: 39525, Training Loss: 0.00646
Epoch: 39526, Training Loss: 0.00729
Epoch: 39526, Training Loss: 0.00682
Epoch: 39526, Training Loss: 0.00833
Epoch: 39526, Training Loss: 0.00646
Epoch: 39527, Training Loss: 0.00729
Epoch: 39527, Training Loss: 0.00682
Epoch: 39527, Training Loss: 0.00833
Epoch: 39527, Training Loss: 0.00646
Epoch: 39528, Training Loss: 0.00729
Epoch: 39528, Training Loss: 0.00682
Epoch: 39528, Training Loss: 0.00833
Epoch: 39528, Training Loss: 0.00646
Epoch: 39529, Training Loss: 0.00729
Epoch: 39529, Training Loss: 0.00682
Epoch: 39529, Training Loss: 0.00833
Epoch: 39529, Training Loss: 0.00646
Epoch: 39530, Training Loss: 0.00729
Epoch: 39530, Training Loss: 0.00682
Epoch: 39530, Training Loss: 0.00833
Epoch: 39530, Training Loss: 0.00646
Epoch: 39531, Training Loss: 0.00729
Epoch: 39531, Training Loss: 0.00682
Epoch: 39531, Training Loss: 0.00833
Epoch: 39531, Training Loss: 0.00646
Epoch: 39532, Training Loss: 0.00729
Epoch: 39532, Training Loss: 0.00682
Epoch: 39532, Training Loss: 0.00833
Epoch: 39532, Training Loss: 0.00646
Epoch: 39533, Training Loss: 0.00729
Epoch: 39533, Training Loss: 0.00682
Epoch: 39533, Training Loss: 0.00833
Epoch: 39533, Training Loss: 0.00646
Epoch: 39534, Training Loss: 0.00729
Epoch: 39534, Training Loss: 0.00682
Epoch: 39534, Training Loss: 0.00833
Epoch: 39534, Training Loss: 0.00646
Epoch: 39535, Training Loss: 0.00729
Epoch: 39535, Training Loss: 0.00682
Epoch: 39535, Training Loss: 0.00833
Epoch: 39535, Training Loss: 0.00646
Epoch: 39536, Training Loss: 0.00729
Epoch: 39536, Training Loss: 0.00682
Epoch: 39536, Training Loss: 0.00833
Epoch: 39536, Training Loss: 0.00646
Epoch: 39537, Training Loss: 0.00729
Epoch: 39537, Training Loss: 0.00682
Epoch: 39537, Training Loss: 0.00833
Epoch: 39537, Training Loss: 0.00646
Epoch: 39538, Training Loss: 0.00729
Epoch: 39538, Training Loss: 0.00682
Epoch: 39538, Training Loss: 0.00833
Epoch: 39538, Training Loss: 0.00646
Epoch: 39539, Training Loss: 0.00729
Epoch: 39539, Training Loss: 0.00682
Epoch: 39539, Training Loss: 0.00833
Epoch: 39539, Training Loss: 0.00646
Epoch: 39540, Training Loss: 0.00729
Epoch: 39540, Training Loss: 0.00682
Epoch: 39540, Training Loss: 0.00833
Epoch: 39540, Training Loss: 0.00646
Epoch: 39541, Training Loss: 0.00729
Epoch: 39541, Training Loss: 0.00682
Epoch: 39541, Training Loss: 0.00833
Epoch: 39541, Training Loss: 0.00646
Epoch: 39542, Training Loss: 0.00729
Epoch: 39542, Training Loss: 0.00682
Epoch: 39542, Training Loss: 0.00833
Epoch: 39542, Training Loss: 0.00646
Epoch: 39543, Training Loss: 0.00729
Epoch: 39543, Training Loss: 0.00682
Epoch: 39543, Training Loss: 0.00833
Epoch: 39543, Training Loss: 0.00646
Epoch: 39544, Training Loss: 0.00729
Epoch: 39544, Training Loss: 0.00682
Epoch: 39544, Training Loss: 0.00833
Epoch: 39544, Training Loss: 0.00646
Epoch: 39545, Training Loss: 0.00729
Epoch: 39545, Training Loss: 0.00682
Epoch: 39545, Training Loss: 0.00833
Epoch: 39545, Training Loss: 0.00646
Epoch: 39546, Training Loss: 0.00729
Epoch: 39546, Training Loss: 0.00682
Epoch: 39546, Training Loss: 0.00833
Epoch: 39546, Training Loss: 0.00646
Epoch: 39547, Training Loss: 0.00729
Epoch: 39547, Training Loss: 0.00682
Epoch: 39547, Training Loss: 0.00833
Epoch: 39547, Training Loss: 0.00646
Epoch: 39548, Training Loss: 0.00729
Epoch: 39548, Training Loss: 0.00682
Epoch: 39548, Training Loss: 0.00833
Epoch: 39548, Training Loss: 0.00646
Epoch: 39549, Training Loss: 0.00729
Epoch: 39549, Training Loss: 0.00682
Epoch: 39549, Training Loss: 0.00833
Epoch: 39549, Training Loss: 0.00646
Epoch: 39550, Training Loss: 0.00729
Epoch: 39550, Training Loss: 0.00682
Epoch: 39550, Training Loss: 0.00833
Epoch: 39550, Training Loss: 0.00646
Epoch: 39551, Training Loss: 0.00729
Epoch: 39551, Training Loss: 0.00682
Epoch: 39551, Training Loss: 0.00833
Epoch: 39551, Training Loss: 0.00646
Epoch: 39552, Training Loss: 0.00729
Epoch: 39552, Training Loss: 0.00682
Epoch: 39552, Training Loss: 0.00833
Epoch: 39552, Training Loss: 0.00646
Epoch: 39553, Training Loss: 0.00729
Epoch: 39553, Training Loss: 0.00682
Epoch: 39553, Training Loss: 0.00833
Epoch: 39553, Training Loss: 0.00646
Epoch: 39554, Training Loss: 0.00729
Epoch: 39554, Training Loss: 0.00682
Epoch: 39554, Training Loss: 0.00833
Epoch: 39554, Training Loss: 0.00646
Epoch: 39555, Training Loss: 0.00729
Epoch: 39555, Training Loss: 0.00682
Epoch: 39555, Training Loss: 0.00833
Epoch: 39555, Training Loss: 0.00646
Epoch: 39556, Training Loss: 0.00729
Epoch: 39556, Training Loss: 0.00682
Epoch: 39556, Training Loss: 0.00833
Epoch: 39556, Training Loss: 0.00646
Epoch: 39557, Training Loss: 0.00729
Epoch: 39557, Training Loss: 0.00682
Epoch: 39557, Training Loss: 0.00833
Epoch: 39557, Training Loss: 0.00646
Epoch: 39558, Training Loss: 0.00729
Epoch: 39558, Training Loss: 0.00682
Epoch: 39558, Training Loss: 0.00833
Epoch: 39558, Training Loss: 0.00646
Epoch: 39559, Training Loss: 0.00729
Epoch: 39559, Training Loss: 0.00682
Epoch: 39559, Training Loss: 0.00833
Epoch: 39559, Training Loss: 0.00646
Epoch: 39560, Training Loss: 0.00729
Epoch: 39560, Training Loss: 0.00682
Epoch: 39560, Training Loss: 0.00833
Epoch: 39560, Training Loss: 0.00646
Epoch: 39561, Training Loss: 0.00729
Epoch: 39561, Training Loss: 0.00682
Epoch: 39561, Training Loss: 0.00832
Epoch: 39561, Training Loss: 0.00646
Epoch: 39562, Training Loss: 0.00729
Epoch: 39562, Training Loss: 0.00682
Epoch: 39562, Training Loss: 0.00832
Epoch: 39562, Training Loss: 0.00646
Epoch: 39563, Training Loss: 0.00729
Epoch: 39563, Training Loss: 0.00682
Epoch: 39563, Training Loss: 0.00832
Epoch: 39563, Training Loss: 0.00646
Epoch: 39564, Training Loss: 0.00729
Epoch: 39564, Training Loss: 0.00682
Epoch: 39564, Training Loss: 0.00832
Epoch: 39564, Training Loss: 0.00646
Epoch: 39565, Training Loss: 0.00729
Epoch: 39565, Training Loss: 0.00682
Epoch: 39565, Training Loss: 0.00832
Epoch: 39565, Training Loss: 0.00646
Epoch: 39566, Training Loss: 0.00729
Epoch: 39566, Training Loss: 0.00682
Epoch: 39566, Training Loss: 0.00832
Epoch: 39566, Training Loss: 0.00646
Epoch: 39567, Training Loss: 0.00729
Epoch: 39567, Training Loss: 0.00682
Epoch: 39567, Training Loss: 0.00832
Epoch: 39567, Training Loss: 0.00646
Epoch: 39568, Training Loss: 0.00729
Epoch: 39568, Training Loss: 0.00682
Epoch: 39568, Training Loss: 0.00832
Epoch: 39568, Training Loss: 0.00646
Epoch: 39569, Training Loss: 0.00729
Epoch: 39569, Training Loss: 0.00682
Epoch: 39569, Training Loss: 0.00832
Epoch: 39569, Training Loss: 0.00646
Epoch: 39570, Training Loss: 0.00729
Epoch: 39570, Training Loss: 0.00682
Epoch: 39570, Training Loss: 0.00832
Epoch: 39570, Training Loss: 0.00645
Epoch: 39571, Training Loss: 0.00729
Epoch: 39571, Training Loss: 0.00682
Epoch: 39571, Training Loss: 0.00832
Epoch: 39571, Training Loss: 0.00645
Epoch: 39572, Training Loss: 0.00729
Epoch: 39572, Training Loss: 0.00682
Epoch: 39572, Training Loss: 0.00832
Epoch: 39572, Training Loss: 0.00645
Epoch: 39573, Training Loss: 0.00728
Epoch: 39573, Training Loss: 0.00682
Epoch: 39573, Training Loss: 0.00832
Epoch: 39573, Training Loss: 0.00645
Epoch: 39574, Training Loss: 0.00728
Epoch: 39574, Training Loss: 0.00682
Epoch: 39574, Training Loss: 0.00832
Epoch: 39574, Training Loss: 0.00645
Epoch: 39575, Training Loss: 0.00728
Epoch: 39575, Training Loss: 0.00682
Epoch: 39575, Training Loss: 0.00832
Epoch: 39575, Training Loss: 0.00645
Epoch: 39576, Training Loss: 0.00728
Epoch: 39576, Training Loss: 0.00682
Epoch: 39576, Training Loss: 0.00832
Epoch: 39576, Training Loss: 0.00645
Epoch: 39577, Training Loss: 0.00728
Epoch: 39577, Training Loss: 0.00682
Epoch: 39577, Training Loss: 0.00832
Epoch: 39577, Training Loss: 0.00645
Epoch: 39578, Training Loss: 0.00728
Epoch: 39578, Training Loss: 0.00682
Epoch: 39578, Training Loss: 0.00832
Epoch: 39578, Training Loss: 0.00645
Epoch: 39579, Training Loss: 0.00728
Epoch: 39579, Training Loss: 0.00682
Epoch: 39579, Training Loss: 0.00832
Epoch: 39579, Training Loss: 0.00645
Epoch: 39580, Training Loss: 0.00728
Epoch: 39580, Training Loss: 0.00682
Epoch: 39580, Training Loss: 0.00832
Epoch: 39580, Training Loss: 0.00645
Epoch: 39581, Training Loss: 0.00728
Epoch: 39581, Training Loss: 0.00682
Epoch: 39581, Training Loss: 0.00832
Epoch: 39581, Training Loss: 0.00645
Epoch: 39582, Training Loss: 0.00728
Epoch: 39582, Training Loss: 0.00682
Epoch: 39582, Training Loss: 0.00832
Epoch: 39582, Training Loss: 0.00645
Epoch: 39583, Training Loss: 0.00728
Epoch: 39583, Training Loss: 0.00682
Epoch: 39583, Training Loss: 0.00832
Epoch: 39583, Training Loss: 0.00645
Epoch: 39584, Training Loss: 0.00728
Epoch: 39584, Training Loss: 0.00682
Epoch: 39584, Training Loss: 0.00832
Epoch: 39584, Training Loss: 0.00645
Epoch: 39585, Training Loss: 0.00728
Epoch: 39585, Training Loss: 0.00682
Epoch: 39585, Training Loss: 0.00832
Epoch: 39585, Training Loss: 0.00645
Epoch: 39586, Training Loss: 0.00728
Epoch: 39586, Training Loss: 0.00682
Epoch: 39586, Training Loss: 0.00832
Epoch: 39586, Training Loss: 0.00645
Epoch: 39587, Training Loss: 0.00728
Epoch: 39587, Training Loss: 0.00682
Epoch: 39587, Training Loss: 0.00832
Epoch: 39587, Training Loss: 0.00645
Epoch: 39588, Training Loss: 0.00728
Epoch: 39588, Training Loss: 0.00682
Epoch: 39588, Training Loss: 0.00832
Epoch: 39588, Training Loss: 0.00645
Epoch: 39589, Training Loss: 0.00728
Epoch: 39589, Training Loss: 0.00682
Epoch: 39589, Training Loss: 0.00832
Epoch: 39589, Training Loss: 0.00645
Epoch: 39590, Training Loss: 0.00728
Epoch: 39590, Training Loss: 0.00682
Epoch: 39590, Training Loss: 0.00832
Epoch: 39590, Training Loss: 0.00645
Epoch: 39591, Training Loss: 0.00728
Epoch: 39591, Training Loss: 0.00682
Epoch: 39591, Training Loss: 0.00832
Epoch: 39591, Training Loss: 0.00645
Epoch: 39592, Training Loss: 0.00728
Epoch: 39592, Training Loss: 0.00682
Epoch: 39592, Training Loss: 0.00832
Epoch: 39592, Training Loss: 0.00645
Epoch: 39593, Training Loss: 0.00728
Epoch: 39593, Training Loss: 0.00682
Epoch: 39593, Training Loss: 0.00832
Epoch: 39593, Training Loss: 0.00645
Epoch: 39594, Training Loss: 0.00728
Epoch: 39594, Training Loss: 0.00682
Epoch: 39594, Training Loss: 0.00832
Epoch: 39594, Training Loss: 0.00645
Epoch: 39595, Training Loss: 0.00728
Epoch: 39595, Training Loss: 0.00682
Epoch: 39595, Training Loss: 0.00832
Epoch: 39595, Training Loss: 0.00645
Epoch: 39596, Training Loss: 0.00728
Epoch: 39596, Training Loss: 0.00682
Epoch: 39596, Training Loss: 0.00832
Epoch: 39596, Training Loss: 0.00645
Epoch: 39597, Training Loss: 0.00728
Epoch: 39597, Training Loss: 0.00682
Epoch: 39597, Training Loss: 0.00832
Epoch: 39597, Training Loss: 0.00645
Epoch: 39598, Training Loss: 0.00728
Epoch: 39598, Training Loss: 0.00682
Epoch: 39598, Training Loss: 0.00832
Epoch: 39598, Training Loss: 0.00645
Epoch: 39599, Training Loss: 0.00728
Epoch: 39599, Training Loss: 0.00682
Epoch: 39599, Training Loss: 0.00832
Epoch: 39599, Training Loss: 0.00645
Epoch: 39600, Training Loss: 0.00728
Epoch: 39600, Training Loss: 0.00682
Epoch: 39600, Training Loss: 0.00832
Epoch: 39600, Training Loss: 0.00645
Epoch: 39601, Training Loss: 0.00728
Epoch: 39601, Training Loss: 0.00682
Epoch: 39601, Training Loss: 0.00832
Epoch: 39601, Training Loss: 0.00645
Epoch: 39602, Training Loss: 0.00728
Epoch: 39602, Training Loss: 0.00682
Epoch: 39602, Training Loss: 0.00832
Epoch: 39602, Training Loss: 0.00645
Epoch: 39603, Training Loss: 0.00728
Epoch: 39603, Training Loss: 0.00681
Epoch: 39603, Training Loss: 0.00832
Epoch: 39603, Training Loss: 0.00645
Epoch: 39604, Training Loss: 0.00728
Epoch: 39604, Training Loss: 0.00681
Epoch: 39604, Training Loss: 0.00832
Epoch: 39604, Training Loss: 0.00645
Epoch: 39605, Training Loss: 0.00728
Epoch: 39605, Training Loss: 0.00681
Epoch: 39605, Training Loss: 0.00832
Epoch: 39605, Training Loss: 0.00645
Epoch: 39606, Training Loss: 0.00728
Epoch: 39606, Training Loss: 0.00681
Epoch: 39606, Training Loss: 0.00832
Epoch: 39606, Training Loss: 0.00645
Epoch: 39607, Training Loss: 0.00728
Epoch: 39607, Training Loss: 0.00681
Epoch: 39607, Training Loss: 0.00832
Epoch: 39607, Training Loss: 0.00645
Epoch: 39608, Training Loss: 0.00728
Epoch: 39608, Training Loss: 0.00681
Epoch: 39608, Training Loss: 0.00832
Epoch: 39608, Training Loss: 0.00645
Epoch: 39609, Training Loss: 0.00728
Epoch: 39609, Training Loss: 0.00681
Epoch: 39609, Training Loss: 0.00832
Epoch: 39609, Training Loss: 0.00645
Epoch: 39610, Training Loss: 0.00728
Epoch: 39610, Training Loss: 0.00681
Epoch: 39610, Training Loss: 0.00832
Epoch: 39610, Training Loss: 0.00645
Epoch: 39611, Training Loss: 0.00728
Epoch: 39611, Training Loss: 0.00681
Epoch: 39611, Training Loss: 0.00832
Epoch: 39611, Training Loss: 0.00645
Epoch: 39612, Training Loss: 0.00728
Epoch: 39612, Training Loss: 0.00681
Epoch: 39612, Training Loss: 0.00832
Epoch: 39612, Training Loss: 0.00645
Epoch: 39613, Training Loss: 0.00728
Epoch: 39613, Training Loss: 0.00681
Epoch: 39613, Training Loss: 0.00832
Epoch: 39613, Training Loss: 0.00645
Epoch: 39614, Training Loss: 0.00728
Epoch: 39614, Training Loss: 0.00681
Epoch: 39614, Training Loss: 0.00832
Epoch: 39614, Training Loss: 0.00645
... (per-sample losses logged every epoch; intermediate epochs omitted) ...
Epoch: 39857, Training Loss: 0.00726
Epoch: 39857, Training Loss: 0.00679
Epoch: 39857, Training Loss: 0.00829
Epoch: 39857, Training Loss: 0.00643
Epoch: 39858, Training Loss: 0.00726
Epoch: 39858, Training Loss: 0.00679
Epoch: 39858, Training Loss: 0.00829
Epoch: 39858, Training Loss: 0.00643
Epoch: 39859, Training Loss: 0.00726
Epoch: 39859, Training Loss: 0.00679
Epoch: 39859, Training Loss: 0.00829
Epoch: 39859, Training Loss: 0.00643
Epoch: 39860, Training Loss: 0.00726
Epoch: 39860, Training Loss: 0.00679
Epoch: 39860, Training Loss: 0.00829
Epoch: 39860, Training Loss: 0.00643
Epoch: 39861, Training Loss: 0.00726
Epoch: 39861, Training Loss: 0.00679
Epoch: 39861, Training Loss: 0.00829
Epoch: 39861, Training Loss: 0.00643
Epoch: 39862, Training Loss: 0.00726
Epoch: 39862, Training Loss: 0.00679
Epoch: 39862, Training Loss: 0.00829
Epoch: 39862, Training Loss: 0.00643
Epoch: 39863, Training Loss: 0.00726
Epoch: 39863, Training Loss: 0.00679
Epoch: 39863, Training Loss: 0.00829
Epoch: 39863, Training Loss: 0.00643
Epoch: 39864, Training Loss: 0.00726
Epoch: 39864, Training Loss: 0.00679
Epoch: 39864, Training Loss: 0.00829
Epoch: 39864, Training Loss: 0.00643
Epoch: 39865, Training Loss: 0.00726
Epoch: 39865, Training Loss: 0.00679
Epoch: 39865, Training Loss: 0.00829
Epoch: 39865, Training Loss: 0.00643
Epoch: 39866, Training Loss: 0.00726
Epoch: 39866, Training Loss: 0.00679
Epoch: 39866, Training Loss: 0.00829
Epoch: 39866, Training Loss: 0.00643
Epoch: 39867, Training Loss: 0.00726
Epoch: 39867, Training Loss: 0.00679
Epoch: 39867, Training Loss: 0.00829
Epoch: 39867, Training Loss: 0.00643
Epoch: 39868, Training Loss: 0.00726
Epoch: 39868, Training Loss: 0.00679
Epoch: 39868, Training Loss: 0.00829
Epoch: 39868, Training Loss: 0.00643
Epoch: 39869, Training Loss: 0.00726
Epoch: 39869, Training Loss: 0.00679
Epoch: 39869, Training Loss: 0.00829
Epoch: 39869, Training Loss: 0.00643
Epoch: 39870, Training Loss: 0.00726
Epoch: 39870, Training Loss: 0.00679
Epoch: 39870, Training Loss: 0.00829
Epoch: 39870, Training Loss: 0.00643
Epoch: 39871, Training Loss: 0.00726
Epoch: 39871, Training Loss: 0.00679
Epoch: 39871, Training Loss: 0.00829
Epoch: 39871, Training Loss: 0.00643
Epoch: 39872, Training Loss: 0.00726
Epoch: 39872, Training Loss: 0.00679
Epoch: 39872, Training Loss: 0.00829
Epoch: 39872, Training Loss: 0.00643
Epoch: 39873, Training Loss: 0.00726
Epoch: 39873, Training Loss: 0.00679
Epoch: 39873, Training Loss: 0.00829
Epoch: 39873, Training Loss: 0.00643
Epoch: 39874, Training Loss: 0.00726
Epoch: 39874, Training Loss: 0.00679
Epoch: 39874, Training Loss: 0.00829
Epoch: 39874, Training Loss: 0.00643
Epoch: 39875, Training Loss: 0.00726
Epoch: 39875, Training Loss: 0.00679
Epoch: 39875, Training Loss: 0.00829
Epoch: 39875, Training Loss: 0.00643
Epoch: 39876, Training Loss: 0.00726
Epoch: 39876, Training Loss: 0.00679
Epoch: 39876, Training Loss: 0.00829
Epoch: 39876, Training Loss: 0.00643
Epoch: 39877, Training Loss: 0.00726
Epoch: 39877, Training Loss: 0.00679
Epoch: 39877, Training Loss: 0.00829
Epoch: 39877, Training Loss: 0.00643
Epoch: 39878, Training Loss: 0.00726
Epoch: 39878, Training Loss: 0.00679
Epoch: 39878, Training Loss: 0.00829
Epoch: 39878, Training Loss: 0.00643
Epoch: 39879, Training Loss: 0.00726
Epoch: 39879, Training Loss: 0.00679
Epoch: 39879, Training Loss: 0.00829
Epoch: 39879, Training Loss: 0.00643
Epoch: 39880, Training Loss: 0.00726
Epoch: 39880, Training Loss: 0.00679
Epoch: 39880, Training Loss: 0.00829
Epoch: 39880, Training Loss: 0.00643
Epoch: 39881, Training Loss: 0.00726
Epoch: 39881, Training Loss: 0.00679
Epoch: 39881, Training Loss: 0.00829
Epoch: 39881, Training Loss: 0.00643
Epoch: 39882, Training Loss: 0.00726
Epoch: 39882, Training Loss: 0.00679
Epoch: 39882, Training Loss: 0.00829
Epoch: 39882, Training Loss: 0.00643
Epoch: 39883, Training Loss: 0.00726
Epoch: 39883, Training Loss: 0.00679
Epoch: 39883, Training Loss: 0.00829
Epoch: 39883, Training Loss: 0.00643
Epoch: 39884, Training Loss: 0.00726
Epoch: 39884, Training Loss: 0.00679
Epoch: 39884, Training Loss: 0.00829
Epoch: 39884, Training Loss: 0.00643
Epoch: 39885, Training Loss: 0.00725
Epoch: 39885, Training Loss: 0.00679
Epoch: 39885, Training Loss: 0.00829
Epoch: 39885, Training Loss: 0.00643
Epoch: 39886, Training Loss: 0.00725
Epoch: 39886, Training Loss: 0.00679
Epoch: 39886, Training Loss: 0.00829
Epoch: 39886, Training Loss: 0.00643
Epoch: 39887, Training Loss: 0.00725
Epoch: 39887, Training Loss: 0.00679
Epoch: 39887, Training Loss: 0.00829
Epoch: 39887, Training Loss: 0.00643
Epoch: 39888, Training Loss: 0.00725
Epoch: 39888, Training Loss: 0.00679
Epoch: 39888, Training Loss: 0.00829
Epoch: 39888, Training Loss: 0.00643
Epoch: 39889, Training Loss: 0.00725
Epoch: 39889, Training Loss: 0.00679
Epoch: 39889, Training Loss: 0.00829
Epoch: 39889, Training Loss: 0.00643
Epoch: 39890, Training Loss: 0.00725
Epoch: 39890, Training Loss: 0.00679
Epoch: 39890, Training Loss: 0.00829
Epoch: 39890, Training Loss: 0.00643
Epoch: 39891, Training Loss: 0.00725
Epoch: 39891, Training Loss: 0.00679
Epoch: 39891, Training Loss: 0.00829
Epoch: 39891, Training Loss: 0.00643
Epoch: 39892, Training Loss: 0.00725
Epoch: 39892, Training Loss: 0.00679
Epoch: 39892, Training Loss: 0.00829
Epoch: 39892, Training Loss: 0.00643
Epoch: 39893, Training Loss: 0.00725
Epoch: 39893, Training Loss: 0.00679
Epoch: 39893, Training Loss: 0.00829
Epoch: 39893, Training Loss: 0.00643
Epoch: 39894, Training Loss: 0.00725
Epoch: 39894, Training Loss: 0.00679
Epoch: 39894, Training Loss: 0.00829
Epoch: 39894, Training Loss: 0.00643
Epoch: 39895, Training Loss: 0.00725
Epoch: 39895, Training Loss: 0.00679
Epoch: 39895, Training Loss: 0.00829
Epoch: 39895, Training Loss: 0.00643
Epoch: 39896, Training Loss: 0.00725
Epoch: 39896, Training Loss: 0.00679
Epoch: 39896, Training Loss: 0.00829
Epoch: 39896, Training Loss: 0.00643
Epoch: 39897, Training Loss: 0.00725
Epoch: 39897, Training Loss: 0.00679
Epoch: 39897, Training Loss: 0.00829
Epoch: 39897, Training Loss: 0.00643
Epoch: 39898, Training Loss: 0.00725
Epoch: 39898, Training Loss: 0.00679
Epoch: 39898, Training Loss: 0.00829
Epoch: 39898, Training Loss: 0.00643
Epoch: 39899, Training Loss: 0.00725
Epoch: 39899, Training Loss: 0.00679
Epoch: 39899, Training Loss: 0.00829
Epoch: 39899, Training Loss: 0.00643
Epoch: 39900, Training Loss: 0.00725
Epoch: 39900, Training Loss: 0.00679
Epoch: 39900, Training Loss: 0.00829
Epoch: 39900, Training Loss: 0.00643
Epoch: 39901, Training Loss: 0.00725
Epoch: 39901, Training Loss: 0.00679
Epoch: 39901, Training Loss: 0.00829
Epoch: 39901, Training Loss: 0.00643
Epoch: 39902, Training Loss: 0.00725
Epoch: 39902, Training Loss: 0.00679
Epoch: 39902, Training Loss: 0.00829
Epoch: 39902, Training Loss: 0.00643
Epoch: 39903, Training Loss: 0.00725
Epoch: 39903, Training Loss: 0.00679
Epoch: 39903, Training Loss: 0.00829
Epoch: 39903, Training Loss: 0.00643
Epoch: 39904, Training Loss: 0.00725
Epoch: 39904, Training Loss: 0.00679
Epoch: 39904, Training Loss: 0.00829
Epoch: 39904, Training Loss: 0.00643
Epoch: 39905, Training Loss: 0.00725
Epoch: 39905, Training Loss: 0.00679
Epoch: 39905, Training Loss: 0.00829
Epoch: 39905, Training Loss: 0.00643
Epoch: 39906, Training Loss: 0.00725
Epoch: 39906, Training Loss: 0.00679
Epoch: 39906, Training Loss: 0.00829
Epoch: 39906, Training Loss: 0.00643
Epoch: 39907, Training Loss: 0.00725
Epoch: 39907, Training Loss: 0.00679
Epoch: 39907, Training Loss: 0.00829
Epoch: 39907, Training Loss: 0.00643
Epoch: 39908, Training Loss: 0.00725
Epoch: 39908, Training Loss: 0.00679
Epoch: 39908, Training Loss: 0.00829
Epoch: 39908, Training Loss: 0.00643
Epoch: 39909, Training Loss: 0.00725
Epoch: 39909, Training Loss: 0.00679
Epoch: 39909, Training Loss: 0.00829
Epoch: 39909, Training Loss: 0.00643
Epoch: 39910, Training Loss: 0.00725
Epoch: 39910, Training Loss: 0.00679
Epoch: 39910, Training Loss: 0.00829
Epoch: 39910, Training Loss: 0.00643
Epoch: 39911, Training Loss: 0.00725
Epoch: 39911, Training Loss: 0.00679
Epoch: 39911, Training Loss: 0.00829
Epoch: 39911, Training Loss: 0.00643
Epoch: 39912, Training Loss: 0.00725
Epoch: 39912, Training Loss: 0.00679
Epoch: 39912, Training Loss: 0.00829
Epoch: 39912, Training Loss: 0.00643
Epoch: 39913, Training Loss: 0.00725
Epoch: 39913, Training Loss: 0.00679
Epoch: 39913, Training Loss: 0.00829
Epoch: 39913, Training Loss: 0.00643
Epoch: 39914, Training Loss: 0.00725
Epoch: 39914, Training Loss: 0.00679
Epoch: 39914, Training Loss: 0.00829
Epoch: 39914, Training Loss: 0.00643
Epoch: 39915, Training Loss: 0.00725
Epoch: 39915, Training Loss: 0.00679
Epoch: 39915, Training Loss: 0.00829
Epoch: 39915, Training Loss: 0.00643
Epoch: 39916, Training Loss: 0.00725
Epoch: 39916, Training Loss: 0.00679
Epoch: 39916, Training Loss: 0.00829
Epoch: 39916, Training Loss: 0.00643
Epoch: 39917, Training Loss: 0.00725
Epoch: 39917, Training Loss: 0.00679
Epoch: 39917, Training Loss: 0.00829
Epoch: 39917, Training Loss: 0.00643
Epoch: 39918, Training Loss: 0.00725
Epoch: 39918, Training Loss: 0.00679
Epoch: 39918, Training Loss: 0.00829
Epoch: 39918, Training Loss: 0.00643
Epoch: 39919, Training Loss: 0.00725
Epoch: 39919, Training Loss: 0.00679
Epoch: 39919, Training Loss: 0.00829
Epoch: 39919, Training Loss: 0.00643
Epoch: 39920, Training Loss: 0.00725
Epoch: 39920, Training Loss: 0.00679
Epoch: 39920, Training Loss: 0.00829
Epoch: 39920, Training Loss: 0.00643
Epoch: 39921, Training Loss: 0.00725
Epoch: 39921, Training Loss: 0.00679
Epoch: 39921, Training Loss: 0.00829
Epoch: 39921, Training Loss: 0.00643
Epoch: 39922, Training Loss: 0.00725
Epoch: 39922, Training Loss: 0.00679
Epoch: 39922, Training Loss: 0.00829
Epoch: 39922, Training Loss: 0.00643
Epoch: 39923, Training Loss: 0.00725
Epoch: 39923, Training Loss: 0.00679
Epoch: 39923, Training Loss: 0.00829
Epoch: 39923, Training Loss: 0.00643
Epoch: 39924, Training Loss: 0.00725
Epoch: 39924, Training Loss: 0.00679
Epoch: 39924, Training Loss: 0.00829
Epoch: 39924, Training Loss: 0.00643
Epoch: 39925, Training Loss: 0.00725
Epoch: 39925, Training Loss: 0.00679
Epoch: 39925, Training Loss: 0.00828
Epoch: 39925, Training Loss: 0.00643
Epoch: 39926, Training Loss: 0.00725
Epoch: 39926, Training Loss: 0.00679
Epoch: 39926, Training Loss: 0.00828
Epoch: 39926, Training Loss: 0.00643
Epoch: 39927, Training Loss: 0.00725
Epoch: 39927, Training Loss: 0.00679
Epoch: 39927, Training Loss: 0.00828
Epoch: 39927, Training Loss: 0.00642
Epoch: 39928, Training Loss: 0.00725
Epoch: 39928, Training Loss: 0.00679
Epoch: 39928, Training Loss: 0.00828
Epoch: 39928, Training Loss: 0.00642
Epoch: 39929, Training Loss: 0.00725
Epoch: 39929, Training Loss: 0.00679
Epoch: 39929, Training Loss: 0.00828
Epoch: 39929, Training Loss: 0.00642
Epoch: 39930, Training Loss: 0.00725
Epoch: 39930, Training Loss: 0.00679
Epoch: 39930, Training Loss: 0.00828
Epoch: 39930, Training Loss: 0.00642
Epoch: 39931, Training Loss: 0.00725
Epoch: 39931, Training Loss: 0.00679
Epoch: 39931, Training Loss: 0.00828
Epoch: 39931, Training Loss: 0.00642
Epoch: 39932, Training Loss: 0.00725
Epoch: 39932, Training Loss: 0.00679
Epoch: 39932, Training Loss: 0.00828
Epoch: 39932, Training Loss: 0.00642
Epoch: 39933, Training Loss: 0.00725
Epoch: 39933, Training Loss: 0.00679
Epoch: 39933, Training Loss: 0.00828
Epoch: 39933, Training Loss: 0.00642
Epoch: 39934, Training Loss: 0.00725
Epoch: 39934, Training Loss: 0.00679
Epoch: 39934, Training Loss: 0.00828
Epoch: 39934, Training Loss: 0.00642
Epoch: 39935, Training Loss: 0.00725
Epoch: 39935, Training Loss: 0.00679
Epoch: 39935, Training Loss: 0.00828
Epoch: 39935, Training Loss: 0.00642
Epoch: 39936, Training Loss: 0.00725
Epoch: 39936, Training Loss: 0.00679
Epoch: 39936, Training Loss: 0.00828
Epoch: 39936, Training Loss: 0.00642
Epoch: 39937, Training Loss: 0.00725
Epoch: 39937, Training Loss: 0.00679
Epoch: 39937, Training Loss: 0.00828
Epoch: 39937, Training Loss: 0.00642
Epoch: 39938, Training Loss: 0.00725
Epoch: 39938, Training Loss: 0.00679
Epoch: 39938, Training Loss: 0.00828
Epoch: 39938, Training Loss: 0.00642
Epoch: 39939, Training Loss: 0.00725
Epoch: 39939, Training Loss: 0.00679
Epoch: 39939, Training Loss: 0.00828
Epoch: 39939, Training Loss: 0.00642
Epoch: 39940, Training Loss: 0.00725
Epoch: 39940, Training Loss: 0.00678
Epoch: 39940, Training Loss: 0.00828
Epoch: 39940, Training Loss: 0.00642
Epoch: 39941, Training Loss: 0.00725
Epoch: 39941, Training Loss: 0.00678
Epoch: 39941, Training Loss: 0.00828
Epoch: 39941, Training Loss: 0.00642
Epoch: 39942, Training Loss: 0.00725
Epoch: 39942, Training Loss: 0.00678
Epoch: 39942, Training Loss: 0.00828
Epoch: 39942, Training Loss: 0.00642
Epoch: 39943, Training Loss: 0.00725
Epoch: 39943, Training Loss: 0.00678
Epoch: 39943, Training Loss: 0.00828
Epoch: 39943, Training Loss: 0.00642
Epoch: 39944, Training Loss: 0.00725
Epoch: 39944, Training Loss: 0.00678
Epoch: 39944, Training Loss: 0.00828
Epoch: 39944, Training Loss: 0.00642
Epoch: 39945, Training Loss: 0.00725
Epoch: 39945, Training Loss: 0.00678
Epoch: 39945, Training Loss: 0.00828
Epoch: 39945, Training Loss: 0.00642
Epoch: 39946, Training Loss: 0.00725
Epoch: 39946, Training Loss: 0.00678
Epoch: 39946, Training Loss: 0.00828
Epoch: 39946, Training Loss: 0.00642
Epoch: 39947, Training Loss: 0.00725
Epoch: 39947, Training Loss: 0.00678
Epoch: 39947, Training Loss: 0.00828
Epoch: 39947, Training Loss: 0.00642
Epoch: 39948, Training Loss: 0.00725
Epoch: 39948, Training Loss: 0.00678
Epoch: 39948, Training Loss: 0.00828
Epoch: 39948, Training Loss: 0.00642
Epoch: 39949, Training Loss: 0.00725
Epoch: 39949, Training Loss: 0.00678
Epoch: 39949, Training Loss: 0.00828
Epoch: 39949, Training Loss: 0.00642
Epoch: 39950, Training Loss: 0.00725
Epoch: 39950, Training Loss: 0.00678
Epoch: 39950, Training Loss: 0.00828
Epoch: 39950, Training Loss: 0.00642
Epoch: 39951, Training Loss: 0.00725
Epoch: 39951, Training Loss: 0.00678
Epoch: 39951, Training Loss: 0.00828
Epoch: 39951, Training Loss: 0.00642
Epoch: 39952, Training Loss: 0.00725
Epoch: 39952, Training Loss: 0.00678
Epoch: 39952, Training Loss: 0.00828
Epoch: 39952, Training Loss: 0.00642
Epoch: 39953, Training Loss: 0.00725
Epoch: 39953, Training Loss: 0.00678
Epoch: 39953, Training Loss: 0.00828
Epoch: 39953, Training Loss: 0.00642
Epoch: 39954, Training Loss: 0.00725
Epoch: 39954, Training Loss: 0.00678
Epoch: 39954, Training Loss: 0.00828
Epoch: 39954, Training Loss: 0.00642
Epoch: 39955, Training Loss: 0.00725
Epoch: 39955, Training Loss: 0.00678
Epoch: 39955, Training Loss: 0.00828
Epoch: 39955, Training Loss: 0.00642
Epoch: 39956, Training Loss: 0.00725
Epoch: 39956, Training Loss: 0.00678
Epoch: 39956, Training Loss: 0.00828
Epoch: 39956, Training Loss: 0.00642
Epoch: 39957, Training Loss: 0.00725
Epoch: 39957, Training Loss: 0.00678
Epoch: 39957, Training Loss: 0.00828
Epoch: 39957, Training Loss: 0.00642
Epoch: 39958, Training Loss: 0.00725
Epoch: 39958, Training Loss: 0.00678
Epoch: 39958, Training Loss: 0.00828
Epoch: 39958, Training Loss: 0.00642
Epoch: 39959, Training Loss: 0.00725
Epoch: 39959, Training Loss: 0.00678
Epoch: 39959, Training Loss: 0.00828
Epoch: 39959, Training Loss: 0.00642
Epoch: 39960, Training Loss: 0.00725
Epoch: 39960, Training Loss: 0.00678
Epoch: 39960, Training Loss: 0.00828
Epoch: 39960, Training Loss: 0.00642
Epoch: 39961, Training Loss: 0.00725
Epoch: 39961, Training Loss: 0.00678
Epoch: 39961, Training Loss: 0.00828
Epoch: 39961, Training Loss: 0.00642
Epoch: 39962, Training Loss: 0.00725
Epoch: 39962, Training Loss: 0.00678
Epoch: 39962, Training Loss: 0.00828
Epoch: 39962, Training Loss: 0.00642
Epoch: 39963, Training Loss: 0.00725
Epoch: 39963, Training Loss: 0.00678
Epoch: 39963, Training Loss: 0.00828
Epoch: 39963, Training Loss: 0.00642
Epoch: 39964, Training Loss: 0.00725
Epoch: 39964, Training Loss: 0.00678
Epoch: 39964, Training Loss: 0.00828
Epoch: 39964, Training Loss: 0.00642
Epoch: 39965, Training Loss: 0.00725
Epoch: 39965, Training Loss: 0.00678
Epoch: 39965, Training Loss: 0.00828
Epoch: 39965, Training Loss: 0.00642
Epoch: 39966, Training Loss: 0.00725
Epoch: 39966, Training Loss: 0.00678
Epoch: 39966, Training Loss: 0.00828
Epoch: 39966, Training Loss: 0.00642
Epoch: 39967, Training Loss: 0.00725
Epoch: 39967, Training Loss: 0.00678
Epoch: 39967, Training Loss: 0.00828
Epoch: 39967, Training Loss: 0.00642
Epoch: 39968, Training Loss: 0.00725
Epoch: 39968, Training Loss: 0.00678
Epoch: 39968, Training Loss: 0.00828
Epoch: 39968, Training Loss: 0.00642
Epoch: 39969, Training Loss: 0.00725
Epoch: 39969, Training Loss: 0.00678
Epoch: 39969, Training Loss: 0.00828
Epoch: 39969, Training Loss: 0.00642
Epoch: 39970, Training Loss: 0.00725
Epoch: 39970, Training Loss: 0.00678
Epoch: 39970, Training Loss: 0.00828
Epoch: 39970, Training Loss: 0.00642
Epoch: 39971, Training Loss: 0.00725
Epoch: 39971, Training Loss: 0.00678
Epoch: 39971, Training Loss: 0.00828
Epoch: 39971, Training Loss: 0.00642
Epoch: 39972, Training Loss: 0.00725
Epoch: 39972, Training Loss: 0.00678
Epoch: 39972, Training Loss: 0.00828
Epoch: 39972, Training Loss: 0.00642
Epoch: 39973, Training Loss: 0.00725
Epoch: 39973, Training Loss: 0.00678
Epoch: 39973, Training Loss: 0.00828
Epoch: 39973, Training Loss: 0.00642
Epoch: 39974, Training Loss: 0.00725
Epoch: 39974, Training Loss: 0.00678
Epoch: 39974, Training Loss: 0.00828
Epoch: 39974, Training Loss: 0.00642
Epoch: 39975, Training Loss: 0.00725
Epoch: 39975, Training Loss: 0.00678
Epoch: 39975, Training Loss: 0.00828
Epoch: 39975, Training Loss: 0.00642
Epoch: 39976, Training Loss: 0.00725
Epoch: 39976, Training Loss: 0.00678
Epoch: 39976, Training Loss: 0.00828
Epoch: 39976, Training Loss: 0.00642
Epoch: 39977, Training Loss: 0.00725
Epoch: 39977, Training Loss: 0.00678
Epoch: 39977, Training Loss: 0.00828
Epoch: 39977, Training Loss: 0.00642
Epoch: 39978, Training Loss: 0.00725
Epoch: 39978, Training Loss: 0.00678
Epoch: 39978, Training Loss: 0.00828
Epoch: 39978, Training Loss: 0.00642
Epoch: 39979, Training Loss: 0.00725
Epoch: 39979, Training Loss: 0.00678
Epoch: 39979, Training Loss: 0.00828
Epoch: 39979, Training Loss: 0.00642
Epoch: 39980, Training Loss: 0.00725
Epoch: 39980, Training Loss: 0.00678
Epoch: 39980, Training Loss: 0.00828
Epoch: 39980, Training Loss: 0.00642
Epoch: 39981, Training Loss: 0.00725
Epoch: 39981, Training Loss: 0.00678
Epoch: 39981, Training Loss: 0.00828
Epoch: 39981, Training Loss: 0.00642
Epoch: 39982, Training Loss: 0.00725
Epoch: 39982, Training Loss: 0.00678
Epoch: 39982, Training Loss: 0.00828
Epoch: 39982, Training Loss: 0.00642
Epoch: 39983, Training Loss: 0.00725
Epoch: 39983, Training Loss: 0.00678
Epoch: 39983, Training Loss: 0.00828
Epoch: 39983, Training Loss: 0.00642
Epoch: 39984, Training Loss: 0.00725
Epoch: 39984, Training Loss: 0.00678
Epoch: 39984, Training Loss: 0.00828
Epoch: 39984, Training Loss: 0.00642
Epoch: 39985, Training Loss: 0.00725
Epoch: 39985, Training Loss: 0.00678
Epoch: 39985, Training Loss: 0.00828
Epoch: 39985, Training Loss: 0.00642
Epoch: 39986, Training Loss: 0.00725
Epoch: 39986, Training Loss: 0.00678
Epoch: 39986, Training Loss: 0.00828
Epoch: 39986, Training Loss: 0.00642
Epoch: 39987, Training Loss: 0.00725
Epoch: 39987, Training Loss: 0.00678
Epoch: 39987, Training Loss: 0.00828
Epoch: 39987, Training Loss: 0.00642
Epoch: 39988, Training Loss: 0.00725
Epoch: 39988, Training Loss: 0.00678
Epoch: 39988, Training Loss: 0.00828
Epoch: 39988, Training Loss: 0.00642
Epoch: 39989, Training Loss: 0.00724
Epoch: 39989, Training Loss: 0.00678
Epoch: 39989, Training Loss: 0.00828
Epoch: 39989, Training Loss: 0.00642
Epoch: 39990, Training Loss: 0.00724
Epoch: 39990, Training Loss: 0.00678
Epoch: 39990, Training Loss: 0.00828
Epoch: 39990, Training Loss: 0.00642
Epoch: 39991, Training Loss: 0.00724
Epoch: 39991, Training Loss: 0.00678
Epoch: 39991, Training Loss: 0.00828
Epoch: 39991, Training Loss: 0.00642
Epoch: 39992, Training Loss: 0.00724
Epoch: 39992, Training Loss: 0.00678
Epoch: 39992, Training Loss: 0.00828
Epoch: 39992, Training Loss: 0.00642
Epoch: 39993, Training Loss: 0.00724
Epoch: 39993, Training Loss: 0.00678
Epoch: 39993, Training Loss: 0.00828
Epoch: 39993, Training Loss: 0.00642
Epoch: 39994, Training Loss: 0.00724
Epoch: 39994, Training Loss: 0.00678
Epoch: 39994, Training Loss: 0.00828
Epoch: 39994, Training Loss: 0.00642
Epoch: 39995, Training Loss: 0.00724
Epoch: 39995, Training Loss: 0.00678
Epoch: 39995, Training Loss: 0.00828
Epoch: 39995, Training Loss: 0.00642
Epoch: 39996, Training Loss: 0.00724
Epoch: 39996, Training Loss: 0.00678
Epoch: 39996, Training Loss: 0.00828
Epoch: 39996, Training Loss: 0.00642
Epoch: 39997, Training Loss: 0.00724
Epoch: 39997, Training Loss: 0.00678
Epoch: 39997, Training Loss: 0.00828
Epoch: 39997, Training Loss: 0.00642
Epoch: 39998, Training Loss: 0.00724
Epoch: 39998, Training Loss: 0.00678
Epoch: 39998, Training Loss: 0.00828
Epoch: 39998, Training Loss: 0.00642
Epoch: 39999, Training Loss: 0.00724
Epoch: 39999, Training Loss: 0.00678
Epoch: 39999, Training Loss: 0.00828
Epoch: 39999, Training Loss: 0.00642
Epoch: 40000, Training Loss: 0.00724
Epoch: 40000, Training Loss: 0.00678
Epoch: 40000, Training Loss: 0.00828
Epoch: 40000, Training Loss: 0.00642
Epoch: 40001, Training Loss: 0.00724
Epoch: 40001, Training Loss: 0.00678
Epoch: 40001, Training Loss: 0.00828
Epoch: 40001, Training Loss: 0.00642
Epoch: 40002, Training Loss: 0.00724
Epoch: 40002, Training Loss: 0.00678
Epoch: 40002, Training Loss: 0.00828
Epoch: 40002, Training Loss: 0.00642
Epoch: 40003, Training Loss: 0.00724
Epoch: 40003, Training Loss: 0.00678
Epoch: 40003, Training Loss: 0.00828
Epoch: 40003, Training Loss: 0.00642
Epoch: 40004, Training Loss: 0.00724
Epoch: 40004, Training Loss: 0.00678
Epoch: 40004, Training Loss: 0.00828
Epoch: 40004, Training Loss: 0.00642
Epoch: 40005, Training Loss: 0.00724
Epoch: 40005, Training Loss: 0.00678
Epoch: 40005, Training Loss: 0.00828
Epoch: 40005, Training Loss: 0.00642
Epoch: 40006, Training Loss: 0.00724
Epoch: 40006, Training Loss: 0.00678
Epoch: 40006, Training Loss: 0.00828
Epoch: 40006, Training Loss: 0.00642
Epoch: 40007, Training Loss: 0.00724
Epoch: 40007, Training Loss: 0.00678
Epoch: 40007, Training Loss: 0.00828
Epoch: 40007, Training Loss: 0.00642
Epoch: 40008, Training Loss: 0.00724
Epoch: 40008, Training Loss: 0.00678
Epoch: 40008, Training Loss: 0.00828
Epoch: 40008, Training Loss: 0.00642
Epoch: 40009, Training Loss: 0.00724
Epoch: 40009, Training Loss: 0.00678
Epoch: 40009, Training Loss: 0.00828
Epoch: 40009, Training Loss: 0.00642
Epoch: 40010, Training Loss: 0.00724
Epoch: 40010, Training Loss: 0.00678
Epoch: 40010, Training Loss: 0.00828
Epoch: 40010, Training Loss: 0.00642
Epoch: 40011, Training Loss: 0.00724
Epoch: 40011, Training Loss: 0.00678
Epoch: 40011, Training Loss: 0.00828
Epoch: 40011, Training Loss: 0.00642
Epoch: 40012, Training Loss: 0.00724
Epoch: 40012, Training Loss: 0.00678
Epoch: 40012, Training Loss: 0.00828
Epoch: 40012, Training Loss: 0.00642
Epoch: 40013, Training Loss: 0.00724
Epoch: 40013, Training Loss: 0.00678
Epoch: 40013, Training Loss: 0.00828
Epoch: 40013, Training Loss: 0.00642
Epoch: 40014, Training Loss: 0.00724
Epoch: 40014, Training Loss: 0.00678
Epoch: 40014, Training Loss: 0.00828
Epoch: 40014, Training Loss: 0.00642
Epoch: 40015, Training Loss: 0.00724
Epoch: 40015, Training Loss: 0.00678
Epoch: 40015, Training Loss: 0.00828
Epoch: 40015, Training Loss: 0.00642
Epoch: 40016, Training Loss: 0.00724
Epoch: 40016, Training Loss: 0.00678
Epoch: 40016, Training Loss: 0.00828
Epoch: 40016, Training Loss: 0.00642
Epoch: 40017, Training Loss: 0.00724
Epoch: 40017, Training Loss: 0.00678
Epoch: 40017, Training Loss: 0.00827
Epoch: 40017, Training Loss: 0.00642
Epoch: 40018, Training Loss: 0.00724
Epoch: 40018, Training Loss: 0.00678
Epoch: 40018, Training Loss: 0.00827
Epoch: 40018, Training Loss: 0.00642
Epoch: 40019, Training Loss: 0.00724
Epoch: 40019, Training Loss: 0.00678
Epoch: 40019, Training Loss: 0.00827
Epoch: 40019, Training Loss: 0.00642
Epoch: 40020, Training Loss: 0.00724
Epoch: 40020, Training Loss: 0.00678
Epoch: 40020, Training Loss: 0.00827
Epoch: 40020, Training Loss: 0.00642
Epoch: 40021, Training Loss: 0.00724
Epoch: 40021, Training Loss: 0.00678
Epoch: 40021, Training Loss: 0.00827
Epoch: 40021, Training Loss: 0.00642
Epoch: 40022, Training Loss: 0.00724
Epoch: 40022, Training Loss: 0.00678
Epoch: 40022, Training Loss: 0.00827
Epoch: 40022, Training Loss: 0.00642
Epoch: 40023, Training Loss: 0.00724
Epoch: 40023, Training Loss: 0.00678
Epoch: 40023, Training Loss: 0.00827
Epoch: 40023, Training Loss: 0.00642
Epoch: 40024, Training Loss: 0.00724
Epoch: 40024, Training Loss: 0.00678
Epoch: 40024, Training Loss: 0.00827
Epoch: 40024, Training Loss: 0.00642
Epoch: 40025, Training Loss: 0.00724
Epoch: 40025, Training Loss: 0.00678
Epoch: 40025, Training Loss: 0.00827
Epoch: 40025, Training Loss: 0.00642
Epoch: 40026, Training Loss: 0.00724
Epoch: 40026, Training Loss: 0.00678
Epoch: 40026, Training Loss: 0.00827
Epoch: 40026, Training Loss: 0.00642
Epoch: 40027, Training Loss: 0.00724
Epoch: 40027, Training Loss: 0.00678
Epoch: 40027, Training Loss: 0.00827
Epoch: 40027, Training Loss: 0.00642
Epoch: 40028, Training Loss: 0.00724
Epoch: 40028, Training Loss: 0.00678
Epoch: 40028, Training Loss: 0.00827
Epoch: 40028, Training Loss: 0.00642
Epoch: 40029, Training Loss: 0.00724
Epoch: 40029, Training Loss: 0.00678
Epoch: 40029, Training Loss: 0.00827
Epoch: 40029, Training Loss: 0.00642
Epoch: 40030, Training Loss: 0.00724
Epoch: 40030, Training Loss: 0.00678
Epoch: 40030, Training Loss: 0.00827
Epoch: 40030, Training Loss: 0.00642
Epoch: 40031, Training Loss: 0.00724
Epoch: 40031, Training Loss: 0.00678
Epoch: 40031, Training Loss: 0.00827
Epoch: 40031, Training Loss: 0.00642
Epoch: 40032, Training Loss: 0.00724
Epoch: 40032, Training Loss: 0.00678
Epoch: 40032, Training Loss: 0.00827
Epoch: 40032, Training Loss: 0.00642
Epoch: 40033, Training Loss: 0.00724
Epoch: 40033, Training Loss: 0.00678
Epoch: 40033, Training Loss: 0.00827
Epoch: 40033, Training Loss: 0.00642
Epoch: 40034, Training Loss: 0.00724
Epoch: 40034, Training Loss: 0.00678
Epoch: 40034, Training Loss: 0.00827
Epoch: 40034, Training Loss: 0.00642
Epoch: 40035, Training Loss: 0.00724
Epoch: 40035, Training Loss: 0.00678
Epoch: 40035, Training Loss: 0.00827
Epoch: 40035, Training Loss: 0.00642
Epoch: 40036, Training Loss: 0.00724
Epoch: 40036, Training Loss: 0.00678
Epoch: 40036, Training Loss: 0.00827
Epoch: 40036, Training Loss: 0.00642
Epoch: 40037, Training Loss: 0.00724
Epoch: 40037, Training Loss: 0.00678
Epoch: 40037, Training Loss: 0.00827
Epoch: 40037, Training Loss: 0.00642
Epoch: 40038, Training Loss: 0.00724
Epoch: 40038, Training Loss: 0.00678
Epoch: 40038, Training Loss: 0.00827
Epoch: 40038, Training Loss: 0.00642
Epoch: 40039, Training Loss: 0.00724
Epoch: 40039, Training Loss: 0.00678
Epoch: 40039, Training Loss: 0.00827
Epoch: 40039, Training Loss: 0.00642
Epoch: 40040, Training Loss: 0.00724
Epoch: 40040, Training Loss: 0.00678
Epoch: 40040, Training Loss: 0.00827
Epoch: 40040, Training Loss: 0.00642
Epoch: 40041, Training Loss: 0.00724
Epoch: 40041, Training Loss: 0.00678
Epoch: 40041, Training Loss: 0.00827
Epoch: 40041, Training Loss: 0.00642
Epoch: 40042, Training Loss: 0.00724
Epoch: 40042, Training Loss: 0.00678
Epoch: 40042, Training Loss: 0.00827
Epoch: 40042, Training Loss: 0.00642
Epoch: 40043, Training Loss: 0.00724
Epoch: 40043, Training Loss: 0.00678
Epoch: 40043, Training Loss: 0.00827
Epoch: 40043, Training Loss: 0.00642
Epoch: 40044, Training Loss: 0.00724
Epoch: 40044, Training Loss: 0.00678
Epoch: 40044, Training Loss: 0.00827
Epoch: 40044, Training Loss: 0.00642
Epoch: 40045, Training Loss: 0.00724
Epoch: 40045, Training Loss: 0.00678
Epoch: 40045, Training Loss: 0.00827
Epoch: 40045, Training Loss: 0.00642
Epoch: 40046, Training Loss: 0.00724
Epoch: 40046, Training Loss: 0.00678
Epoch: 40046, Training Loss: 0.00827
Epoch: 40046, Training Loss: 0.00642
... (repetitive per-pattern log elided: across epochs 40046-40290 the four XOR-pattern losses decrease only marginally, from about 0.00724 / 0.00678 / 0.00827 / 0.00641 to about 0.00722 / 0.00675 / 0.00825 / 0.00639) ...
Epoch: 40290, Training Loss: 0.00675
Epoch: 40290, Training Loss: 0.00825
Epoch: 40290, Training Loss: 0.00639
Epoch: 40291, Training Loss: 0.00722
Epoch: 40291, Training Loss: 0.00675
Epoch: 40291, Training Loss: 0.00825
Epoch: 40291, Training Loss: 0.00639
Epoch: 40292, Training Loss: 0.00722
Epoch: 40292, Training Loss: 0.00675
Epoch: 40292, Training Loss: 0.00825
Epoch: 40292, Training Loss: 0.00639
Epoch: 40293, Training Loss: 0.00722
Epoch: 40293, Training Loss: 0.00675
Epoch: 40293, Training Loss: 0.00825
Epoch: 40293, Training Loss: 0.00639
Epoch: 40294, Training Loss: 0.00722
Epoch: 40294, Training Loss: 0.00675
Epoch: 40294, Training Loss: 0.00825
Epoch: 40294, Training Loss: 0.00639
Epoch: 40295, Training Loss: 0.00722
Epoch: 40295, Training Loss: 0.00675
Epoch: 40295, Training Loss: 0.00824
Epoch: 40295, Training Loss: 0.00639
Epoch: 40296, Training Loss: 0.00722
Epoch: 40296, Training Loss: 0.00675
Epoch: 40296, Training Loss: 0.00824
Epoch: 40296, Training Loss: 0.00639
Epoch: 40297, Training Loss: 0.00722
Epoch: 40297, Training Loss: 0.00675
Epoch: 40297, Training Loss: 0.00824
Epoch: 40297, Training Loss: 0.00639
Epoch: 40298, Training Loss: 0.00722
Epoch: 40298, Training Loss: 0.00675
Epoch: 40298, Training Loss: 0.00824
Epoch: 40298, Training Loss: 0.00639
Epoch: 40299, Training Loss: 0.00722
Epoch: 40299, Training Loss: 0.00675
Epoch: 40299, Training Loss: 0.00824
Epoch: 40299, Training Loss: 0.00639
Epoch: 40300, Training Loss: 0.00722
Epoch: 40300, Training Loss: 0.00675
Epoch: 40300, Training Loss: 0.00824
Epoch: 40300, Training Loss: 0.00639
Epoch: 40301, Training Loss: 0.00722
Epoch: 40301, Training Loss: 0.00675
Epoch: 40301, Training Loss: 0.00824
Epoch: 40301, Training Loss: 0.00639
Epoch: 40302, Training Loss: 0.00722
Epoch: 40302, Training Loss: 0.00675
Epoch: 40302, Training Loss: 0.00824
Epoch: 40302, Training Loss: 0.00639
Epoch: 40303, Training Loss: 0.00722
Epoch: 40303, Training Loss: 0.00675
Epoch: 40303, Training Loss: 0.00824
Epoch: 40303, Training Loss: 0.00639
Epoch: 40304, Training Loss: 0.00722
Epoch: 40304, Training Loss: 0.00675
Epoch: 40304, Training Loss: 0.00824
Epoch: 40304, Training Loss: 0.00639
Epoch: 40305, Training Loss: 0.00722
Epoch: 40305, Training Loss: 0.00675
Epoch: 40305, Training Loss: 0.00824
Epoch: 40305, Training Loss: 0.00639
Epoch: 40306, Training Loss: 0.00721
Epoch: 40306, Training Loss: 0.00675
Epoch: 40306, Training Loss: 0.00824
Epoch: 40306, Training Loss: 0.00639
Epoch: 40307, Training Loss: 0.00721
Epoch: 40307, Training Loss: 0.00675
Epoch: 40307, Training Loss: 0.00824
Epoch: 40307, Training Loss: 0.00639
Epoch: 40308, Training Loss: 0.00721
Epoch: 40308, Training Loss: 0.00675
Epoch: 40308, Training Loss: 0.00824
Epoch: 40308, Training Loss: 0.00639
Epoch: 40309, Training Loss: 0.00721
Epoch: 40309, Training Loss: 0.00675
Epoch: 40309, Training Loss: 0.00824
Epoch: 40309, Training Loss: 0.00639
Epoch: 40310, Training Loss: 0.00721
Epoch: 40310, Training Loss: 0.00675
Epoch: 40310, Training Loss: 0.00824
Epoch: 40310, Training Loss: 0.00639
Epoch: 40311, Training Loss: 0.00721
Epoch: 40311, Training Loss: 0.00675
Epoch: 40311, Training Loss: 0.00824
Epoch: 40311, Training Loss: 0.00639
Epoch: 40312, Training Loss: 0.00721
Epoch: 40312, Training Loss: 0.00675
Epoch: 40312, Training Loss: 0.00824
Epoch: 40312, Training Loss: 0.00639
Epoch: 40313, Training Loss: 0.00721
Epoch: 40313, Training Loss: 0.00675
Epoch: 40313, Training Loss: 0.00824
Epoch: 40313, Training Loss: 0.00639
Epoch: 40314, Training Loss: 0.00721
Epoch: 40314, Training Loss: 0.00675
Epoch: 40314, Training Loss: 0.00824
Epoch: 40314, Training Loss: 0.00639
Epoch: 40315, Training Loss: 0.00721
Epoch: 40315, Training Loss: 0.00675
Epoch: 40315, Training Loss: 0.00824
Epoch: 40315, Training Loss: 0.00639
Epoch: 40316, Training Loss: 0.00721
Epoch: 40316, Training Loss: 0.00675
Epoch: 40316, Training Loss: 0.00824
Epoch: 40316, Training Loss: 0.00639
Epoch: 40317, Training Loss: 0.00721
Epoch: 40317, Training Loss: 0.00675
Epoch: 40317, Training Loss: 0.00824
Epoch: 40317, Training Loss: 0.00639
Epoch: 40318, Training Loss: 0.00721
Epoch: 40318, Training Loss: 0.00675
Epoch: 40318, Training Loss: 0.00824
Epoch: 40318, Training Loss: 0.00639
Epoch: 40319, Training Loss: 0.00721
Epoch: 40319, Training Loss: 0.00675
Epoch: 40319, Training Loss: 0.00824
Epoch: 40319, Training Loss: 0.00639
Epoch: 40320, Training Loss: 0.00721
Epoch: 40320, Training Loss: 0.00675
Epoch: 40320, Training Loss: 0.00824
Epoch: 40320, Training Loss: 0.00639
Epoch: 40321, Training Loss: 0.00721
Epoch: 40321, Training Loss: 0.00675
Epoch: 40321, Training Loss: 0.00824
Epoch: 40321, Training Loss: 0.00639
Epoch: 40322, Training Loss: 0.00721
Epoch: 40322, Training Loss: 0.00675
Epoch: 40322, Training Loss: 0.00824
Epoch: 40322, Training Loss: 0.00639
Epoch: 40323, Training Loss: 0.00721
Epoch: 40323, Training Loss: 0.00675
Epoch: 40323, Training Loss: 0.00824
Epoch: 40323, Training Loss: 0.00639
Epoch: 40324, Training Loss: 0.00721
Epoch: 40324, Training Loss: 0.00675
Epoch: 40324, Training Loss: 0.00824
Epoch: 40324, Training Loss: 0.00639
Epoch: 40325, Training Loss: 0.00721
Epoch: 40325, Training Loss: 0.00675
Epoch: 40325, Training Loss: 0.00824
Epoch: 40325, Training Loss: 0.00639
Epoch: 40326, Training Loss: 0.00721
Epoch: 40326, Training Loss: 0.00675
Epoch: 40326, Training Loss: 0.00824
Epoch: 40326, Training Loss: 0.00639
Epoch: 40327, Training Loss: 0.00721
Epoch: 40327, Training Loss: 0.00675
Epoch: 40327, Training Loss: 0.00824
Epoch: 40327, Training Loss: 0.00639
Epoch: 40328, Training Loss: 0.00721
Epoch: 40328, Training Loss: 0.00675
Epoch: 40328, Training Loss: 0.00824
Epoch: 40328, Training Loss: 0.00639
Epoch: 40329, Training Loss: 0.00721
Epoch: 40329, Training Loss: 0.00675
Epoch: 40329, Training Loss: 0.00824
Epoch: 40329, Training Loss: 0.00639
Epoch: 40330, Training Loss: 0.00721
Epoch: 40330, Training Loss: 0.00675
Epoch: 40330, Training Loss: 0.00824
Epoch: 40330, Training Loss: 0.00639
Epoch: 40331, Training Loss: 0.00721
Epoch: 40331, Training Loss: 0.00675
Epoch: 40331, Training Loss: 0.00824
Epoch: 40331, Training Loss: 0.00639
Epoch: 40332, Training Loss: 0.00721
Epoch: 40332, Training Loss: 0.00675
Epoch: 40332, Training Loss: 0.00824
Epoch: 40332, Training Loss: 0.00639
Epoch: 40333, Training Loss: 0.00721
Epoch: 40333, Training Loss: 0.00675
Epoch: 40333, Training Loss: 0.00824
Epoch: 40333, Training Loss: 0.00639
Epoch: 40334, Training Loss: 0.00721
Epoch: 40334, Training Loss: 0.00675
Epoch: 40334, Training Loss: 0.00824
Epoch: 40334, Training Loss: 0.00639
Epoch: 40335, Training Loss: 0.00721
Epoch: 40335, Training Loss: 0.00675
Epoch: 40335, Training Loss: 0.00824
Epoch: 40335, Training Loss: 0.00639
Epoch: 40336, Training Loss: 0.00721
Epoch: 40336, Training Loss: 0.00675
Epoch: 40336, Training Loss: 0.00824
Epoch: 40336, Training Loss: 0.00639
Epoch: 40337, Training Loss: 0.00721
Epoch: 40337, Training Loss: 0.00675
Epoch: 40337, Training Loss: 0.00824
Epoch: 40337, Training Loss: 0.00639
Epoch: 40338, Training Loss: 0.00721
Epoch: 40338, Training Loss: 0.00675
Epoch: 40338, Training Loss: 0.00824
Epoch: 40338, Training Loss: 0.00639
Epoch: 40339, Training Loss: 0.00721
Epoch: 40339, Training Loss: 0.00675
Epoch: 40339, Training Loss: 0.00824
Epoch: 40339, Training Loss: 0.00639
Epoch: 40340, Training Loss: 0.00721
Epoch: 40340, Training Loss: 0.00675
Epoch: 40340, Training Loss: 0.00824
Epoch: 40340, Training Loss: 0.00639
Epoch: 40341, Training Loss: 0.00721
Epoch: 40341, Training Loss: 0.00675
Epoch: 40341, Training Loss: 0.00824
Epoch: 40341, Training Loss: 0.00639
Epoch: 40342, Training Loss: 0.00721
Epoch: 40342, Training Loss: 0.00675
Epoch: 40342, Training Loss: 0.00824
Epoch: 40342, Training Loss: 0.00639
Epoch: 40343, Training Loss: 0.00721
Epoch: 40343, Training Loss: 0.00675
Epoch: 40343, Training Loss: 0.00824
Epoch: 40343, Training Loss: 0.00639
Epoch: 40344, Training Loss: 0.00721
Epoch: 40344, Training Loss: 0.00675
Epoch: 40344, Training Loss: 0.00824
Epoch: 40344, Training Loss: 0.00639
Epoch: 40345, Training Loss: 0.00721
Epoch: 40345, Training Loss: 0.00675
Epoch: 40345, Training Loss: 0.00824
Epoch: 40345, Training Loss: 0.00639
Epoch: 40346, Training Loss: 0.00721
Epoch: 40346, Training Loss: 0.00675
Epoch: 40346, Training Loss: 0.00824
Epoch: 40346, Training Loss: 0.00639
Epoch: 40347, Training Loss: 0.00721
Epoch: 40347, Training Loss: 0.00675
Epoch: 40347, Training Loss: 0.00824
Epoch: 40347, Training Loss: 0.00639
Epoch: 40348, Training Loss: 0.00721
Epoch: 40348, Training Loss: 0.00675
Epoch: 40348, Training Loss: 0.00824
Epoch: 40348, Training Loss: 0.00639
Epoch: 40349, Training Loss: 0.00721
Epoch: 40349, Training Loss: 0.00675
Epoch: 40349, Training Loss: 0.00824
Epoch: 40349, Training Loss: 0.00639
Epoch: 40350, Training Loss: 0.00721
Epoch: 40350, Training Loss: 0.00675
Epoch: 40350, Training Loss: 0.00824
Epoch: 40350, Training Loss: 0.00639
Epoch: 40351, Training Loss: 0.00721
Epoch: 40351, Training Loss: 0.00675
Epoch: 40351, Training Loss: 0.00824
Epoch: 40351, Training Loss: 0.00639
Epoch: 40352, Training Loss: 0.00721
Epoch: 40352, Training Loss: 0.00675
Epoch: 40352, Training Loss: 0.00824
Epoch: 40352, Training Loss: 0.00639
Epoch: 40353, Training Loss: 0.00721
Epoch: 40353, Training Loss: 0.00675
Epoch: 40353, Training Loss: 0.00824
Epoch: 40353, Training Loss: 0.00639
Epoch: 40354, Training Loss: 0.00721
Epoch: 40354, Training Loss: 0.00675
Epoch: 40354, Training Loss: 0.00824
Epoch: 40354, Training Loss: 0.00639
Epoch: 40355, Training Loss: 0.00721
Epoch: 40355, Training Loss: 0.00675
Epoch: 40355, Training Loss: 0.00824
Epoch: 40355, Training Loss: 0.00639
Epoch: 40356, Training Loss: 0.00721
Epoch: 40356, Training Loss: 0.00675
Epoch: 40356, Training Loss: 0.00824
Epoch: 40356, Training Loss: 0.00639
Epoch: 40357, Training Loss: 0.00721
Epoch: 40357, Training Loss: 0.00675
Epoch: 40357, Training Loss: 0.00824
Epoch: 40357, Training Loss: 0.00639
Epoch: 40358, Training Loss: 0.00721
Epoch: 40358, Training Loss: 0.00675
Epoch: 40358, Training Loss: 0.00824
Epoch: 40358, Training Loss: 0.00639
Epoch: 40359, Training Loss: 0.00721
Epoch: 40359, Training Loss: 0.00675
Epoch: 40359, Training Loss: 0.00824
Epoch: 40359, Training Loss: 0.00639
Epoch: 40360, Training Loss: 0.00721
Epoch: 40360, Training Loss: 0.00675
Epoch: 40360, Training Loss: 0.00824
Epoch: 40360, Training Loss: 0.00639
Epoch: 40361, Training Loss: 0.00721
Epoch: 40361, Training Loss: 0.00675
Epoch: 40361, Training Loss: 0.00824
Epoch: 40361, Training Loss: 0.00639
Epoch: 40362, Training Loss: 0.00721
Epoch: 40362, Training Loss: 0.00675
Epoch: 40362, Training Loss: 0.00824
Epoch: 40362, Training Loss: 0.00639
Epoch: 40363, Training Loss: 0.00721
Epoch: 40363, Training Loss: 0.00675
Epoch: 40363, Training Loss: 0.00824
Epoch: 40363, Training Loss: 0.00639
Epoch: 40364, Training Loss: 0.00721
Epoch: 40364, Training Loss: 0.00675
Epoch: 40364, Training Loss: 0.00824
Epoch: 40364, Training Loss: 0.00639
Epoch: 40365, Training Loss: 0.00721
Epoch: 40365, Training Loss: 0.00675
Epoch: 40365, Training Loss: 0.00824
Epoch: 40365, Training Loss: 0.00639
Epoch: 40366, Training Loss: 0.00721
Epoch: 40366, Training Loss: 0.00675
Epoch: 40366, Training Loss: 0.00824
Epoch: 40366, Training Loss: 0.00639
Epoch: 40367, Training Loss: 0.00721
Epoch: 40367, Training Loss: 0.00675
Epoch: 40367, Training Loss: 0.00824
Epoch: 40367, Training Loss: 0.00639
Epoch: 40368, Training Loss: 0.00721
Epoch: 40368, Training Loss: 0.00675
Epoch: 40368, Training Loss: 0.00824
Epoch: 40368, Training Loss: 0.00639
Epoch: 40369, Training Loss: 0.00721
Epoch: 40369, Training Loss: 0.00675
Epoch: 40369, Training Loss: 0.00824
Epoch: 40369, Training Loss: 0.00639
Epoch: 40370, Training Loss: 0.00721
Epoch: 40370, Training Loss: 0.00675
Epoch: 40370, Training Loss: 0.00824
Epoch: 40370, Training Loss: 0.00639
Epoch: 40371, Training Loss: 0.00721
Epoch: 40371, Training Loss: 0.00675
Epoch: 40371, Training Loss: 0.00824
Epoch: 40371, Training Loss: 0.00639
Epoch: 40372, Training Loss: 0.00721
Epoch: 40372, Training Loss: 0.00675
Epoch: 40372, Training Loss: 0.00824
Epoch: 40372, Training Loss: 0.00639
Epoch: 40373, Training Loss: 0.00721
Epoch: 40373, Training Loss: 0.00675
Epoch: 40373, Training Loss: 0.00824
Epoch: 40373, Training Loss: 0.00639
Epoch: 40374, Training Loss: 0.00721
Epoch: 40374, Training Loss: 0.00675
Epoch: 40374, Training Loss: 0.00824
Epoch: 40374, Training Loss: 0.00639
Epoch: 40375, Training Loss: 0.00721
Epoch: 40375, Training Loss: 0.00675
Epoch: 40375, Training Loss: 0.00824
Epoch: 40375, Training Loss: 0.00639
Epoch: 40376, Training Loss: 0.00721
Epoch: 40376, Training Loss: 0.00675
Epoch: 40376, Training Loss: 0.00824
Epoch: 40376, Training Loss: 0.00639
Epoch: 40377, Training Loss: 0.00721
Epoch: 40377, Training Loss: 0.00675
Epoch: 40377, Training Loss: 0.00824
Epoch: 40377, Training Loss: 0.00639
Epoch: 40378, Training Loss: 0.00721
Epoch: 40378, Training Loss: 0.00675
Epoch: 40378, Training Loss: 0.00824
Epoch: 40378, Training Loss: 0.00639
Epoch: 40379, Training Loss: 0.00721
Epoch: 40379, Training Loss: 0.00675
Epoch: 40379, Training Loss: 0.00824
Epoch: 40379, Training Loss: 0.00639
Epoch: 40380, Training Loss: 0.00721
Epoch: 40380, Training Loss: 0.00675
Epoch: 40380, Training Loss: 0.00824
Epoch: 40380, Training Loss: 0.00639
Epoch: 40381, Training Loss: 0.00721
Epoch: 40381, Training Loss: 0.00675
Epoch: 40381, Training Loss: 0.00824
Epoch: 40381, Training Loss: 0.00639
Epoch: 40382, Training Loss: 0.00721
Epoch: 40382, Training Loss: 0.00675
Epoch: 40382, Training Loss: 0.00824
Epoch: 40382, Training Loss: 0.00639
Epoch: 40383, Training Loss: 0.00721
Epoch: 40383, Training Loss: 0.00675
Epoch: 40383, Training Loss: 0.00824
Epoch: 40383, Training Loss: 0.00639
Epoch: 40384, Training Loss: 0.00721
Epoch: 40384, Training Loss: 0.00675
Epoch: 40384, Training Loss: 0.00824
Epoch: 40384, Training Loss: 0.00639
Epoch: 40385, Training Loss: 0.00721
Epoch: 40385, Training Loss: 0.00675
Epoch: 40385, Training Loss: 0.00824
Epoch: 40385, Training Loss: 0.00639
Epoch: 40386, Training Loss: 0.00721
Epoch: 40386, Training Loss: 0.00675
Epoch: 40386, Training Loss: 0.00824
Epoch: 40386, Training Loss: 0.00639
Epoch: 40387, Training Loss: 0.00721
Epoch: 40387, Training Loss: 0.00675
Epoch: 40387, Training Loss: 0.00824
Epoch: 40387, Training Loss: 0.00639
Epoch: 40388, Training Loss: 0.00721
Epoch: 40388, Training Loss: 0.00675
Epoch: 40388, Training Loss: 0.00824
Epoch: 40388, Training Loss: 0.00639
Epoch: 40389, Training Loss: 0.00721
Epoch: 40389, Training Loss: 0.00675
Epoch: 40389, Training Loss: 0.00823
Epoch: 40389, Training Loss: 0.00639
Epoch: 40390, Training Loss: 0.00721
Epoch: 40390, Training Loss: 0.00675
Epoch: 40390, Training Loss: 0.00823
Epoch: 40390, Training Loss: 0.00639
Epoch: 40391, Training Loss: 0.00721
Epoch: 40391, Training Loss: 0.00675
Epoch: 40391, Training Loss: 0.00823
Epoch: 40391, Training Loss: 0.00639
Epoch: 40392, Training Loss: 0.00721
Epoch: 40392, Training Loss: 0.00675
Epoch: 40392, Training Loss: 0.00823
Epoch: 40392, Training Loss: 0.00639
Epoch: 40393, Training Loss: 0.00721
Epoch: 40393, Training Loss: 0.00675
Epoch: 40393, Training Loss: 0.00823
Epoch: 40393, Training Loss: 0.00639
Epoch: 40394, Training Loss: 0.00721
Epoch: 40394, Training Loss: 0.00675
Epoch: 40394, Training Loss: 0.00823
Epoch: 40394, Training Loss: 0.00639
Epoch: 40395, Training Loss: 0.00721
Epoch: 40395, Training Loss: 0.00675
Epoch: 40395, Training Loss: 0.00823
Epoch: 40395, Training Loss: 0.00639
Epoch: 40396, Training Loss: 0.00721
Epoch: 40396, Training Loss: 0.00675
Epoch: 40396, Training Loss: 0.00823
Epoch: 40396, Training Loss: 0.00639
Epoch: 40397, Training Loss: 0.00721
Epoch: 40397, Training Loss: 0.00675
Epoch: 40397, Training Loss: 0.00823
Epoch: 40397, Training Loss: 0.00639
Epoch: 40398, Training Loss: 0.00721
Epoch: 40398, Training Loss: 0.00674
Epoch: 40398, Training Loss: 0.00823
Epoch: 40398, Training Loss: 0.00639
Epoch: 40399, Training Loss: 0.00721
Epoch: 40399, Training Loss: 0.00674
Epoch: 40399, Training Loss: 0.00823
Epoch: 40399, Training Loss: 0.00639
Epoch: 40400, Training Loss: 0.00721
Epoch: 40400, Training Loss: 0.00674
Epoch: 40400, Training Loss: 0.00823
Epoch: 40400, Training Loss: 0.00639
Epoch: 40401, Training Loss: 0.00721
Epoch: 40401, Training Loss: 0.00674
Epoch: 40401, Training Loss: 0.00823
Epoch: 40401, Training Loss: 0.00639
Epoch: 40402, Training Loss: 0.00721
Epoch: 40402, Training Loss: 0.00674
Epoch: 40402, Training Loss: 0.00823
Epoch: 40402, Training Loss: 0.00639
Epoch: 40403, Training Loss: 0.00721
Epoch: 40403, Training Loss: 0.00674
Epoch: 40403, Training Loss: 0.00823
Epoch: 40403, Training Loss: 0.00639
Epoch: 40404, Training Loss: 0.00721
Epoch: 40404, Training Loss: 0.00674
Epoch: 40404, Training Loss: 0.00823
Epoch: 40404, Training Loss: 0.00639
Epoch: 40405, Training Loss: 0.00721
Epoch: 40405, Training Loss: 0.00674
Epoch: 40405, Training Loss: 0.00823
Epoch: 40405, Training Loss: 0.00639
Epoch: 40406, Training Loss: 0.00721
Epoch: 40406, Training Loss: 0.00674
Epoch: 40406, Training Loss: 0.00823
Epoch: 40406, Training Loss: 0.00639
Epoch: 40407, Training Loss: 0.00721
Epoch: 40407, Training Loss: 0.00674
Epoch: 40407, Training Loss: 0.00823
Epoch: 40407, Training Loss: 0.00639
Epoch: 40408, Training Loss: 0.00721
Epoch: 40408, Training Loss: 0.00674
Epoch: 40408, Training Loss: 0.00823
Epoch: 40408, Training Loss: 0.00639
Epoch: 40409, Training Loss: 0.00721
Epoch: 40409, Training Loss: 0.00674
Epoch: 40409, Training Loss: 0.00823
Epoch: 40409, Training Loss: 0.00639
Epoch: 40410, Training Loss: 0.00721
Epoch: 40410, Training Loss: 0.00674
Epoch: 40410, Training Loss: 0.00823
Epoch: 40410, Training Loss: 0.00638
Epoch: 40411, Training Loss: 0.00721
Epoch: 40411, Training Loss: 0.00674
Epoch: 40411, Training Loss: 0.00823
Epoch: 40411, Training Loss: 0.00638
Epoch: 40412, Training Loss: 0.00720
Epoch: 40412, Training Loss: 0.00674
Epoch: 40412, Training Loss: 0.00823
Epoch: 40412, Training Loss: 0.00638
Epoch: 40413, Training Loss: 0.00720
Epoch: 40413, Training Loss: 0.00674
Epoch: 40413, Training Loss: 0.00823
Epoch: 40413, Training Loss: 0.00638
Epoch: 40414, Training Loss: 0.00720
Epoch: 40414, Training Loss: 0.00674
Epoch: 40414, Training Loss: 0.00823
Epoch: 40414, Training Loss: 0.00638
Epoch: 40415, Training Loss: 0.00720
Epoch: 40415, Training Loss: 0.00674
Epoch: 40415, Training Loss: 0.00823
Epoch: 40415, Training Loss: 0.00638
Epoch: 40416, Training Loss: 0.00720
Epoch: 40416, Training Loss: 0.00674
Epoch: 40416, Training Loss: 0.00823
Epoch: 40416, Training Loss: 0.00638
Epoch: 40417, Training Loss: 0.00720
Epoch: 40417, Training Loss: 0.00674
Epoch: 40417, Training Loss: 0.00823
Epoch: 40417, Training Loss: 0.00638
Epoch: 40418, Training Loss: 0.00720
Epoch: 40418, Training Loss: 0.00674
Epoch: 40418, Training Loss: 0.00823
Epoch: 40418, Training Loss: 0.00638
Epoch: 40419, Training Loss: 0.00720
Epoch: 40419, Training Loss: 0.00674
Epoch: 40419, Training Loss: 0.00823
Epoch: 40419, Training Loss: 0.00638
Epoch: 40420, Training Loss: 0.00720
Epoch: 40420, Training Loss: 0.00674
Epoch: 40420, Training Loss: 0.00823
Epoch: 40420, Training Loss: 0.00638
Epoch: 40421, Training Loss: 0.00720
Epoch: 40421, Training Loss: 0.00674
Epoch: 40421, Training Loss: 0.00823
Epoch: 40421, Training Loss: 0.00638
Epoch: 40422, Training Loss: 0.00720
Epoch: 40422, Training Loss: 0.00674
Epoch: 40422, Training Loss: 0.00823
Epoch: 40422, Training Loss: 0.00638
Epoch: 40423, Training Loss: 0.00720
Epoch: 40423, Training Loss: 0.00674
Epoch: 40423, Training Loss: 0.00823
Epoch: 40423, Training Loss: 0.00638
Epoch: 40424, Training Loss: 0.00720
Epoch: 40424, Training Loss: 0.00674
Epoch: 40424, Training Loss: 0.00823
Epoch: 40424, Training Loss: 0.00638
Epoch: 40425, Training Loss: 0.00720
Epoch: 40425, Training Loss: 0.00674
Epoch: 40425, Training Loss: 0.00823
Epoch: 40425, Training Loss: 0.00638
Epoch: 40426, Training Loss: 0.00720
Epoch: 40426, Training Loss: 0.00674
Epoch: 40426, Training Loss: 0.00823
Epoch: 40426, Training Loss: 0.00638
Epoch: 40427, Training Loss: 0.00720
Epoch: 40427, Training Loss: 0.00674
Epoch: 40427, Training Loss: 0.00823
Epoch: 40427, Training Loss: 0.00638
Epoch: 40428, Training Loss: 0.00720
Epoch: 40428, Training Loss: 0.00674
Epoch: 40428, Training Loss: 0.00823
Epoch: 40428, Training Loss: 0.00638
Epoch: 40429, Training Loss: 0.00720
Epoch: 40429, Training Loss: 0.00674
Epoch: 40429, Training Loss: 0.00823
Epoch: 40429, Training Loss: 0.00638
Epoch: 40430, Training Loss: 0.00720
Epoch: 40430, Training Loss: 0.00674
Epoch: 40430, Training Loss: 0.00823
Epoch: 40430, Training Loss: 0.00638
Epoch: 40431, Training Loss: 0.00720
Epoch: 40431, Training Loss: 0.00674
Epoch: 40431, Training Loss: 0.00823
Epoch: 40431, Training Loss: 0.00638
Epoch: 40432, Training Loss: 0.00720
Epoch: 40432, Training Loss: 0.00674
Epoch: 40432, Training Loss: 0.00823
Epoch: 40432, Training Loss: 0.00638
Epoch: 40433, Training Loss: 0.00720
Epoch: 40433, Training Loss: 0.00674
Epoch: 40433, Training Loss: 0.00823
Epoch: 40433, Training Loss: 0.00638
Epoch: 40434, Training Loss: 0.00720
Epoch: 40434, Training Loss: 0.00674
Epoch: 40434, Training Loss: 0.00823
Epoch: 40434, Training Loss: 0.00638
Epoch: 40435, Training Loss: 0.00720
Epoch: 40435, Training Loss: 0.00674
Epoch: 40435, Training Loss: 0.00823
Epoch: 40435, Training Loss: 0.00638
Epoch: 40436, Training Loss: 0.00720
Epoch: 40436, Training Loss: 0.00674
Epoch: 40436, Training Loss: 0.00823
Epoch: 40436, Training Loss: 0.00638
Epoch: 40437, Training Loss: 0.00720
Epoch: 40437, Training Loss: 0.00674
Epoch: 40437, Training Loss: 0.00823
Epoch: 40437, Training Loss: 0.00638
Epoch: 40438, Training Loss: 0.00720
Epoch: 40438, Training Loss: 0.00674
Epoch: 40438, Training Loss: 0.00823
Epoch: 40438, Training Loss: 0.00638
Epoch: 40439, Training Loss: 0.00720
Epoch: 40439, Training Loss: 0.00674
Epoch: 40439, Training Loss: 0.00823
Epoch: 40439, Training Loss: 0.00638
Epoch: 40440, Training Loss: 0.00720
Epoch: 40440, Training Loss: 0.00674
Epoch: 40440, Training Loss: 0.00823
Epoch: 40440, Training Loss: 0.00638
Epoch: 40441, Training Loss: 0.00720
Epoch: 40441, Training Loss: 0.00674
Epoch: 40441, Training Loss: 0.00823
Epoch: 40441, Training Loss: 0.00638
Epoch: 40442, Training Loss: 0.00720
Epoch: 40442, Training Loss: 0.00674
Epoch: 40442, Training Loss: 0.00823
Epoch: 40442, Training Loss: 0.00638
Epoch: 40443, Training Loss: 0.00720
Epoch: 40443, Training Loss: 0.00674
Epoch: 40443, Training Loss: 0.00823
Epoch: 40443, Training Loss: 0.00638
Epoch: 40444, Training Loss: 0.00720
Epoch: 40444, Training Loss: 0.00674
Epoch: 40444, Training Loss: 0.00823
Epoch: 40444, Training Loss: 0.00638
Epoch: 40445, Training Loss: 0.00720
Epoch: 40445, Training Loss: 0.00674
Epoch: 40445, Training Loss: 0.00823
Epoch: 40445, Training Loss: 0.00638
Epoch: 40446, Training Loss: 0.00720
Epoch: 40446, Training Loss: 0.00674
Epoch: 40446, Training Loss: 0.00823
Epoch: 40446, Training Loss: 0.00638
Epoch: 40447, Training Loss: 0.00720
Epoch: 40447, Training Loss: 0.00674
Epoch: 40447, Training Loss: 0.00823
Epoch: 40447, Training Loss: 0.00638
Epoch: 40448, Training Loss: 0.00720
Epoch: 40448, Training Loss: 0.00674
Epoch: 40448, Training Loss: 0.00823
Epoch: 40448, Training Loss: 0.00638
Epoch: 40449, Training Loss: 0.00720
Epoch: 40449, Training Loss: 0.00674
Epoch: 40449, Training Loss: 0.00823
Epoch: 40449, Training Loss: 0.00638
Epoch: 40450, Training Loss: 0.00720
Epoch: 40450, Training Loss: 0.00674
Epoch: 40450, Training Loss: 0.00823
Epoch: 40450, Training Loss: 0.00638
Epoch: 40451, Training Loss: 0.00720
Epoch: 40451, Training Loss: 0.00674
Epoch: 40451, Training Loss: 0.00823
Epoch: 40451, Training Loss: 0.00638
Epoch: 40452, Training Loss: 0.00720
Epoch: 40452, Training Loss: 0.00674
Epoch: 40452, Training Loss: 0.00823
Epoch: 40452, Training Loss: 0.00638
Epoch: 40453, Training Loss: 0.00720
Epoch: 40453, Training Loss: 0.00674
Epoch: 40453, Training Loss: 0.00823
Epoch: 40453, Training Loss: 0.00638
Epoch: 40454, Training Loss: 0.00720
Epoch: 40454, Training Loss: 0.00674
Epoch: 40454, Training Loss: 0.00823
Epoch: 40454, Training Loss: 0.00638
Epoch: 40455, Training Loss: 0.00720
Epoch: 40455, Training Loss: 0.00674
Epoch: 40455, Training Loss: 0.00823
Epoch: 40455, Training Loss: 0.00638
Epoch: 40456, Training Loss: 0.00720
Epoch: 40456, Training Loss: 0.00674
Epoch: 40456, Training Loss: 0.00823
Epoch: 40456, Training Loss: 0.00638
Epoch: 40457, Training Loss: 0.00720
Epoch: 40457, Training Loss: 0.00674
Epoch: 40457, Training Loss: 0.00823
Epoch: 40457, Training Loss: 0.00638
Epoch: 40458, Training Loss: 0.00720
Epoch: 40458, Training Loss: 0.00674
Epoch: 40458, Training Loss: 0.00823
Epoch: 40458, Training Loss: 0.00638
Epoch: 40459, Training Loss: 0.00720
Epoch: 40459, Training Loss: 0.00674
Epoch: 40459, Training Loss: 0.00823
Epoch: 40459, Training Loss: 0.00638
Epoch: 40460, Training Loss: 0.00720
Epoch: 40460, Training Loss: 0.00674
Epoch: 40460, Training Loss: 0.00823
Epoch: 40460, Training Loss: 0.00638
Epoch: 40461, Training Loss: 0.00720
Epoch: 40461, Training Loss: 0.00674
Epoch: 40461, Training Loss: 0.00823
Epoch: 40461, Training Loss: 0.00638
Epoch: 40462, Training Loss: 0.00720
Epoch: 40462, Training Loss: 0.00674
Epoch: 40462, Training Loss: 0.00823
Epoch: 40462, Training Loss: 0.00638
Epoch: 40463, Training Loss: 0.00720
Epoch: 40463, Training Loss: 0.00674
Epoch: 40463, Training Loss: 0.00823
Epoch: 40463, Training Loss: 0.00638
Epoch: 40464, Training Loss: 0.00720
Epoch: 40464, Training Loss: 0.00674
Epoch: 40464, Training Loss: 0.00823
Epoch: 40464, Training Loss: 0.00638
Epoch: 40465, Training Loss: 0.00720
Epoch: 40465, Training Loss: 0.00674
Epoch: 40465, Training Loss: 0.00823
Epoch: 40465, Training Loss: 0.00638
Epoch: 40466, Training Loss: 0.00720
Epoch: 40466, Training Loss: 0.00674
Epoch: 40466, Training Loss: 0.00823
Epoch: 40466, Training Loss: 0.00638
Epoch: 40467, Training Loss: 0.00720
Epoch: 40467, Training Loss: 0.00674
Epoch: 40467, Training Loss: 0.00823
Epoch: 40467, Training Loss: 0.00638
Epoch: 40468, Training Loss: 0.00720
Epoch: 40468, Training Loss: 0.00674
Epoch: 40468, Training Loss: 0.00823
Epoch: 40468, Training Loss: 0.00638
Epoch: 40469, Training Loss: 0.00720
Epoch: 40469, Training Loss: 0.00674
Epoch: 40469, Training Loss: 0.00823
Epoch: 40469, Training Loss: 0.00638
Epoch: 40470, Training Loss: 0.00720
Epoch: 40470, Training Loss: 0.00674
Epoch: 40470, Training Loss: 0.00823
Epoch: 40470, Training Loss: 0.00638
Epoch: 40471, Training Loss: 0.00720
Epoch: 40471, Training Loss: 0.00674
Epoch: 40471, Training Loss: 0.00823
Epoch: 40471, Training Loss: 0.00638
Epoch: 40472, Training Loss: 0.00720
Epoch: 40472, Training Loss: 0.00674
Epoch: 40472, Training Loss: 0.00823
Epoch: 40472, Training Loss: 0.00638
Epoch: 40473, Training Loss: 0.00720
Epoch: 40473, Training Loss: 0.00674
Epoch: 40473, Training Loss: 0.00823
Epoch: 40473, Training Loss: 0.00638
Epoch: 40474, Training Loss: 0.00720
Epoch: 40474, Training Loss: 0.00674
Epoch: 40474, Training Loss: 0.00823
Epoch: 40474, Training Loss: 0.00638
Epoch: 40475, Training Loss: 0.00720
Epoch: 40475, Training Loss: 0.00674
Epoch: 40475, Training Loss: 0.00823
Epoch: 40475, Training Loss: 0.00638
Epoch: 40476, Training Loss: 0.00720
Epoch: 40476, Training Loss: 0.00674
Epoch: 40476, Training Loss: 0.00823
Epoch: 40476, Training Loss: 0.00638
Epoch: 40477, Training Loss: 0.00720
Epoch: 40477, Training Loss: 0.00674
Epoch: 40477, Training Loss: 0.00823
Epoch: 40477, Training Loss: 0.00638
Epoch: 40478, Training Loss: 0.00720
Epoch: 40478, Training Loss: 0.00674
Epoch: 40478, Training Loss: 0.00823
Epoch: 40478, Training Loss: 0.00638
Epoch: 40479, Training Loss: 0.00720
Epoch: 40479, Training Loss: 0.00674
Epoch: 40479, Training Loss: 0.00823
Epoch: 40479, Training Loss: 0.00638
[... output truncated: one loss line per training pattern per epoch; over epochs 40479-40722 the four per-pattern losses decrease only very slowly, from roughly (0.00720, 0.00674, 0.00823, 0.00638) to (0.00718, 0.00672, 0.00820, 0.00636) ...]
Epoch: 40722, Training Loss: 0.00718
Epoch: 40722, Training Loss: 0.00672
Epoch: 40722, Training Loss: 0.00820
Epoch: 40722, Training Loss: 0.00636
Epoch: 40723, Training Loss: 0.00718
Epoch: 40723, Training Loss: 0.00672
Epoch: 40723, Training Loss: 0.00820
Epoch: 40723, Training Loss: 0.00636
Epoch: 40724, Training Loss: 0.00718
Epoch: 40724, Training Loss: 0.00672
Epoch: 40724, Training Loss: 0.00820
Epoch: 40724, Training Loss: 0.00636
Epoch: 40725, Training Loss: 0.00718
Epoch: 40725, Training Loss: 0.00672
Epoch: 40725, Training Loss: 0.00820
Epoch: 40725, Training Loss: 0.00636
Epoch: 40726, Training Loss: 0.00718
Epoch: 40726, Training Loss: 0.00672
Epoch: 40726, Training Loss: 0.00820
Epoch: 40726, Training Loss: 0.00636
Epoch: 40727, Training Loss: 0.00718
Epoch: 40727, Training Loss: 0.00672
Epoch: 40727, Training Loss: 0.00820
Epoch: 40727, Training Loss: 0.00636
Epoch: 40728, Training Loss: 0.00718
Epoch: 40728, Training Loss: 0.00672
Epoch: 40728, Training Loss: 0.00820
Epoch: 40728, Training Loss: 0.00636
Epoch: 40729, Training Loss: 0.00718
Epoch: 40729, Training Loss: 0.00672
Epoch: 40729, Training Loss: 0.00820
Epoch: 40729, Training Loss: 0.00636
Epoch: 40730, Training Loss: 0.00718
Epoch: 40730, Training Loss: 0.00672
Epoch: 40730, Training Loss: 0.00820
Epoch: 40730, Training Loss: 0.00636
Epoch: 40731, Training Loss: 0.00718
Epoch: 40731, Training Loss: 0.00672
Epoch: 40731, Training Loss: 0.00820
Epoch: 40731, Training Loss: 0.00636
Epoch: 40732, Training Loss: 0.00718
Epoch: 40732, Training Loss: 0.00672
Epoch: 40732, Training Loss: 0.00820
Epoch: 40732, Training Loss: 0.00636
Epoch: 40733, Training Loss: 0.00717
Epoch: 40733, Training Loss: 0.00672
Epoch: 40733, Training Loss: 0.00820
Epoch: 40733, Training Loss: 0.00636
Epoch: 40734, Training Loss: 0.00717
Epoch: 40734, Training Loss: 0.00672
Epoch: 40734, Training Loss: 0.00820
Epoch: 40734, Training Loss: 0.00636
Epoch: 40735, Training Loss: 0.00717
Epoch: 40735, Training Loss: 0.00672
Epoch: 40735, Training Loss: 0.00820
Epoch: 40735, Training Loss: 0.00636
Epoch: 40736, Training Loss: 0.00717
Epoch: 40736, Training Loss: 0.00672
Epoch: 40736, Training Loss: 0.00820
Epoch: 40736, Training Loss: 0.00636
Epoch: 40737, Training Loss: 0.00717
Epoch: 40737, Training Loss: 0.00672
Epoch: 40737, Training Loss: 0.00820
Epoch: 40737, Training Loss: 0.00636
Epoch: 40738, Training Loss: 0.00717
Epoch: 40738, Training Loss: 0.00672
Epoch: 40738, Training Loss: 0.00820
Epoch: 40738, Training Loss: 0.00636
Epoch: 40739, Training Loss: 0.00717
Epoch: 40739, Training Loss: 0.00672
Epoch: 40739, Training Loss: 0.00820
Epoch: 40739, Training Loss: 0.00636
Epoch: 40740, Training Loss: 0.00717
Epoch: 40740, Training Loss: 0.00672
Epoch: 40740, Training Loss: 0.00820
Epoch: 40740, Training Loss: 0.00636
Epoch: 40741, Training Loss: 0.00717
Epoch: 40741, Training Loss: 0.00672
Epoch: 40741, Training Loss: 0.00820
Epoch: 40741, Training Loss: 0.00636
Epoch: 40742, Training Loss: 0.00717
Epoch: 40742, Training Loss: 0.00672
Epoch: 40742, Training Loss: 0.00820
Epoch: 40742, Training Loss: 0.00636
Epoch: 40743, Training Loss: 0.00717
Epoch: 40743, Training Loss: 0.00672
Epoch: 40743, Training Loss: 0.00820
Epoch: 40743, Training Loss: 0.00636
Epoch: 40744, Training Loss: 0.00717
Epoch: 40744, Training Loss: 0.00672
Epoch: 40744, Training Loss: 0.00820
Epoch: 40744, Training Loss: 0.00636
Epoch: 40745, Training Loss: 0.00717
Epoch: 40745, Training Loss: 0.00672
Epoch: 40745, Training Loss: 0.00820
Epoch: 40745, Training Loss: 0.00636
Epoch: 40746, Training Loss: 0.00717
Epoch: 40746, Training Loss: 0.00672
Epoch: 40746, Training Loss: 0.00820
Epoch: 40746, Training Loss: 0.00636
Epoch: 40747, Training Loss: 0.00717
Epoch: 40747, Training Loss: 0.00671
Epoch: 40747, Training Loss: 0.00820
Epoch: 40747, Training Loss: 0.00636
Epoch: 40748, Training Loss: 0.00717
Epoch: 40748, Training Loss: 0.00671
Epoch: 40748, Training Loss: 0.00820
Epoch: 40748, Training Loss: 0.00636
Epoch: 40749, Training Loss: 0.00717
Epoch: 40749, Training Loss: 0.00671
Epoch: 40749, Training Loss: 0.00820
Epoch: 40749, Training Loss: 0.00636
Epoch: 40750, Training Loss: 0.00717
Epoch: 40750, Training Loss: 0.00671
Epoch: 40750, Training Loss: 0.00820
Epoch: 40750, Training Loss: 0.00636
Epoch: 40751, Training Loss: 0.00717
Epoch: 40751, Training Loss: 0.00671
Epoch: 40751, Training Loss: 0.00820
Epoch: 40751, Training Loss: 0.00636
Epoch: 40752, Training Loss: 0.00717
Epoch: 40752, Training Loss: 0.00671
Epoch: 40752, Training Loss: 0.00820
Epoch: 40752, Training Loss: 0.00636
Epoch: 40753, Training Loss: 0.00717
Epoch: 40753, Training Loss: 0.00671
Epoch: 40753, Training Loss: 0.00820
Epoch: 40753, Training Loss: 0.00636
Epoch: 40754, Training Loss: 0.00717
Epoch: 40754, Training Loss: 0.00671
Epoch: 40754, Training Loss: 0.00820
Epoch: 40754, Training Loss: 0.00636
Epoch: 40755, Training Loss: 0.00717
Epoch: 40755, Training Loss: 0.00671
Epoch: 40755, Training Loss: 0.00820
Epoch: 40755, Training Loss: 0.00636
Epoch: 40756, Training Loss: 0.00717
Epoch: 40756, Training Loss: 0.00671
Epoch: 40756, Training Loss: 0.00820
Epoch: 40756, Training Loss: 0.00636
Epoch: 40757, Training Loss: 0.00717
Epoch: 40757, Training Loss: 0.00671
Epoch: 40757, Training Loss: 0.00820
Epoch: 40757, Training Loss: 0.00636
Epoch: 40758, Training Loss: 0.00717
Epoch: 40758, Training Loss: 0.00671
Epoch: 40758, Training Loss: 0.00820
Epoch: 40758, Training Loss: 0.00636
Epoch: 40759, Training Loss: 0.00717
Epoch: 40759, Training Loss: 0.00671
Epoch: 40759, Training Loss: 0.00820
Epoch: 40759, Training Loss: 0.00636
Epoch: 40760, Training Loss: 0.00717
Epoch: 40760, Training Loss: 0.00671
Epoch: 40760, Training Loss: 0.00820
Epoch: 40760, Training Loss: 0.00636
Epoch: 40761, Training Loss: 0.00717
Epoch: 40761, Training Loss: 0.00671
Epoch: 40761, Training Loss: 0.00820
Epoch: 40761, Training Loss: 0.00636
Epoch: 40762, Training Loss: 0.00717
Epoch: 40762, Training Loss: 0.00671
Epoch: 40762, Training Loss: 0.00820
Epoch: 40762, Training Loss: 0.00636
Epoch: 40763, Training Loss: 0.00717
Epoch: 40763, Training Loss: 0.00671
Epoch: 40763, Training Loss: 0.00820
Epoch: 40763, Training Loss: 0.00636
Epoch: 40764, Training Loss: 0.00717
Epoch: 40764, Training Loss: 0.00671
Epoch: 40764, Training Loss: 0.00820
Epoch: 40764, Training Loss: 0.00636
Epoch: 40765, Training Loss: 0.00717
Epoch: 40765, Training Loss: 0.00671
Epoch: 40765, Training Loss: 0.00820
Epoch: 40765, Training Loss: 0.00636
Epoch: 40766, Training Loss: 0.00717
Epoch: 40766, Training Loss: 0.00671
Epoch: 40766, Training Loss: 0.00819
Epoch: 40766, Training Loss: 0.00636
Epoch: 40767, Training Loss: 0.00717
Epoch: 40767, Training Loss: 0.00671
Epoch: 40767, Training Loss: 0.00819
Epoch: 40767, Training Loss: 0.00636
Epoch: 40768, Training Loss: 0.00717
Epoch: 40768, Training Loss: 0.00671
Epoch: 40768, Training Loss: 0.00819
Epoch: 40768, Training Loss: 0.00636
Epoch: 40769, Training Loss: 0.00717
Epoch: 40769, Training Loss: 0.00671
Epoch: 40769, Training Loss: 0.00819
Epoch: 40769, Training Loss: 0.00636
Epoch: 40770, Training Loss: 0.00717
Epoch: 40770, Training Loss: 0.00671
Epoch: 40770, Training Loss: 0.00819
Epoch: 40770, Training Loss: 0.00636
Epoch: 40771, Training Loss: 0.00717
Epoch: 40771, Training Loss: 0.00671
Epoch: 40771, Training Loss: 0.00819
Epoch: 40771, Training Loss: 0.00636
Epoch: 40772, Training Loss: 0.00717
Epoch: 40772, Training Loss: 0.00671
Epoch: 40772, Training Loss: 0.00819
Epoch: 40772, Training Loss: 0.00636
Epoch: 40773, Training Loss: 0.00717
Epoch: 40773, Training Loss: 0.00671
Epoch: 40773, Training Loss: 0.00819
Epoch: 40773, Training Loss: 0.00636
Epoch: 40774, Training Loss: 0.00717
Epoch: 40774, Training Loss: 0.00671
Epoch: 40774, Training Loss: 0.00819
Epoch: 40774, Training Loss: 0.00636
Epoch: 40775, Training Loss: 0.00717
Epoch: 40775, Training Loss: 0.00671
Epoch: 40775, Training Loss: 0.00819
Epoch: 40775, Training Loss: 0.00636
Epoch: 40776, Training Loss: 0.00717
Epoch: 40776, Training Loss: 0.00671
Epoch: 40776, Training Loss: 0.00819
Epoch: 40776, Training Loss: 0.00636
Epoch: 40777, Training Loss: 0.00717
Epoch: 40777, Training Loss: 0.00671
Epoch: 40777, Training Loss: 0.00819
Epoch: 40777, Training Loss: 0.00636
Epoch: 40778, Training Loss: 0.00717
Epoch: 40778, Training Loss: 0.00671
Epoch: 40778, Training Loss: 0.00819
Epoch: 40778, Training Loss: 0.00635
Epoch: 40779, Training Loss: 0.00717
Epoch: 40779, Training Loss: 0.00671
Epoch: 40779, Training Loss: 0.00819
Epoch: 40779, Training Loss: 0.00635
Epoch: 40780, Training Loss: 0.00717
Epoch: 40780, Training Loss: 0.00671
Epoch: 40780, Training Loss: 0.00819
Epoch: 40780, Training Loss: 0.00635
Epoch: 40781, Training Loss: 0.00717
Epoch: 40781, Training Loss: 0.00671
Epoch: 40781, Training Loss: 0.00819
Epoch: 40781, Training Loss: 0.00635
Epoch: 40782, Training Loss: 0.00717
Epoch: 40782, Training Loss: 0.00671
Epoch: 40782, Training Loss: 0.00819
Epoch: 40782, Training Loss: 0.00635
Epoch: 40783, Training Loss: 0.00717
Epoch: 40783, Training Loss: 0.00671
Epoch: 40783, Training Loss: 0.00819
Epoch: 40783, Training Loss: 0.00635
Epoch: 40784, Training Loss: 0.00717
Epoch: 40784, Training Loss: 0.00671
Epoch: 40784, Training Loss: 0.00819
Epoch: 40784, Training Loss: 0.00635
Epoch: 40785, Training Loss: 0.00717
Epoch: 40785, Training Loss: 0.00671
Epoch: 40785, Training Loss: 0.00819
Epoch: 40785, Training Loss: 0.00635
Epoch: 40786, Training Loss: 0.00717
Epoch: 40786, Training Loss: 0.00671
Epoch: 40786, Training Loss: 0.00819
Epoch: 40786, Training Loss: 0.00635
Epoch: 40787, Training Loss: 0.00717
Epoch: 40787, Training Loss: 0.00671
Epoch: 40787, Training Loss: 0.00819
Epoch: 40787, Training Loss: 0.00635
Epoch: 40788, Training Loss: 0.00717
Epoch: 40788, Training Loss: 0.00671
Epoch: 40788, Training Loss: 0.00819
Epoch: 40788, Training Loss: 0.00635
Epoch: 40789, Training Loss: 0.00717
Epoch: 40789, Training Loss: 0.00671
Epoch: 40789, Training Loss: 0.00819
Epoch: 40789, Training Loss: 0.00635
Epoch: 40790, Training Loss: 0.00717
Epoch: 40790, Training Loss: 0.00671
Epoch: 40790, Training Loss: 0.00819
Epoch: 40790, Training Loss: 0.00635
Epoch: 40791, Training Loss: 0.00717
Epoch: 40791, Training Loss: 0.00671
Epoch: 40791, Training Loss: 0.00819
Epoch: 40791, Training Loss: 0.00635
Epoch: 40792, Training Loss: 0.00717
Epoch: 40792, Training Loss: 0.00671
Epoch: 40792, Training Loss: 0.00819
Epoch: 40792, Training Loss: 0.00635
Epoch: 40793, Training Loss: 0.00717
Epoch: 40793, Training Loss: 0.00671
Epoch: 40793, Training Loss: 0.00819
Epoch: 40793, Training Loss: 0.00635
Epoch: 40794, Training Loss: 0.00717
Epoch: 40794, Training Loss: 0.00671
Epoch: 40794, Training Loss: 0.00819
Epoch: 40794, Training Loss: 0.00635
Epoch: 40795, Training Loss: 0.00717
Epoch: 40795, Training Loss: 0.00671
Epoch: 40795, Training Loss: 0.00819
Epoch: 40795, Training Loss: 0.00635
Epoch: 40796, Training Loss: 0.00717
Epoch: 40796, Training Loss: 0.00671
Epoch: 40796, Training Loss: 0.00819
Epoch: 40796, Training Loss: 0.00635
Epoch: 40797, Training Loss: 0.00717
Epoch: 40797, Training Loss: 0.00671
Epoch: 40797, Training Loss: 0.00819
Epoch: 40797, Training Loss: 0.00635
Epoch: 40798, Training Loss: 0.00717
Epoch: 40798, Training Loss: 0.00671
Epoch: 40798, Training Loss: 0.00819
Epoch: 40798, Training Loss: 0.00635
Epoch: 40799, Training Loss: 0.00717
Epoch: 40799, Training Loss: 0.00671
Epoch: 40799, Training Loss: 0.00819
Epoch: 40799, Training Loss: 0.00635
Epoch: 40800, Training Loss: 0.00717
Epoch: 40800, Training Loss: 0.00671
Epoch: 40800, Training Loss: 0.00819
Epoch: 40800, Training Loss: 0.00635
Epoch: 40801, Training Loss: 0.00717
Epoch: 40801, Training Loss: 0.00671
Epoch: 40801, Training Loss: 0.00819
Epoch: 40801, Training Loss: 0.00635
Epoch: 40802, Training Loss: 0.00717
Epoch: 40802, Training Loss: 0.00671
Epoch: 40802, Training Loss: 0.00819
Epoch: 40802, Training Loss: 0.00635
Epoch: 40803, Training Loss: 0.00717
Epoch: 40803, Training Loss: 0.00671
Epoch: 40803, Training Loss: 0.00819
Epoch: 40803, Training Loss: 0.00635
Epoch: 40804, Training Loss: 0.00717
Epoch: 40804, Training Loss: 0.00671
Epoch: 40804, Training Loss: 0.00819
Epoch: 40804, Training Loss: 0.00635
Epoch: 40805, Training Loss: 0.00717
Epoch: 40805, Training Loss: 0.00671
Epoch: 40805, Training Loss: 0.00819
Epoch: 40805, Training Loss: 0.00635
Epoch: 40806, Training Loss: 0.00717
Epoch: 40806, Training Loss: 0.00671
Epoch: 40806, Training Loss: 0.00819
Epoch: 40806, Training Loss: 0.00635
Epoch: 40807, Training Loss: 0.00717
Epoch: 40807, Training Loss: 0.00671
Epoch: 40807, Training Loss: 0.00819
Epoch: 40807, Training Loss: 0.00635
Epoch: 40808, Training Loss: 0.00717
Epoch: 40808, Training Loss: 0.00671
Epoch: 40808, Training Loss: 0.00819
Epoch: 40808, Training Loss: 0.00635
Epoch: 40809, Training Loss: 0.00717
Epoch: 40809, Training Loss: 0.00671
Epoch: 40809, Training Loss: 0.00819
Epoch: 40809, Training Loss: 0.00635
Epoch: 40810, Training Loss: 0.00717
Epoch: 40810, Training Loss: 0.00671
Epoch: 40810, Training Loss: 0.00819
Epoch: 40810, Training Loss: 0.00635
Epoch: 40811, Training Loss: 0.00717
Epoch: 40811, Training Loss: 0.00671
Epoch: 40811, Training Loss: 0.00819
Epoch: 40811, Training Loss: 0.00635
Epoch: 40812, Training Loss: 0.00717
Epoch: 40812, Training Loss: 0.00671
Epoch: 40812, Training Loss: 0.00819
Epoch: 40812, Training Loss: 0.00635
Epoch: 40813, Training Loss: 0.00717
Epoch: 40813, Training Loss: 0.00671
Epoch: 40813, Training Loss: 0.00819
Epoch: 40813, Training Loss: 0.00635
Epoch: 40814, Training Loss: 0.00717
Epoch: 40814, Training Loss: 0.00671
Epoch: 40814, Training Loss: 0.00819
Epoch: 40814, Training Loss: 0.00635
Epoch: 40815, Training Loss: 0.00717
Epoch: 40815, Training Loss: 0.00671
Epoch: 40815, Training Loss: 0.00819
Epoch: 40815, Training Loss: 0.00635
Epoch: 40816, Training Loss: 0.00717
Epoch: 40816, Training Loss: 0.00671
Epoch: 40816, Training Loss: 0.00819
Epoch: 40816, Training Loss: 0.00635
Epoch: 40817, Training Loss: 0.00717
Epoch: 40817, Training Loss: 0.00671
Epoch: 40817, Training Loss: 0.00819
Epoch: 40817, Training Loss: 0.00635
Epoch: 40818, Training Loss: 0.00717
Epoch: 40818, Training Loss: 0.00671
Epoch: 40818, Training Loss: 0.00819
Epoch: 40818, Training Loss: 0.00635
Epoch: 40819, Training Loss: 0.00717
Epoch: 40819, Training Loss: 0.00671
Epoch: 40819, Training Loss: 0.00819
Epoch: 40819, Training Loss: 0.00635
Epoch: 40820, Training Loss: 0.00717
Epoch: 40820, Training Loss: 0.00671
Epoch: 40820, Training Loss: 0.00819
Epoch: 40820, Training Loss: 0.00635
Epoch: 40821, Training Loss: 0.00717
Epoch: 40821, Training Loss: 0.00671
Epoch: 40821, Training Loss: 0.00819
Epoch: 40821, Training Loss: 0.00635
Epoch: 40822, Training Loss: 0.00717
Epoch: 40822, Training Loss: 0.00671
Epoch: 40822, Training Loss: 0.00819
Epoch: 40822, Training Loss: 0.00635
Epoch: 40823, Training Loss: 0.00717
Epoch: 40823, Training Loss: 0.00671
Epoch: 40823, Training Loss: 0.00819
Epoch: 40823, Training Loss: 0.00635
Epoch: 40824, Training Loss: 0.00717
Epoch: 40824, Training Loss: 0.00671
Epoch: 40824, Training Loss: 0.00819
Epoch: 40824, Training Loss: 0.00635
Epoch: 40825, Training Loss: 0.00717
Epoch: 40825, Training Loss: 0.00671
Epoch: 40825, Training Loss: 0.00819
Epoch: 40825, Training Loss: 0.00635
Epoch: 40826, Training Loss: 0.00717
Epoch: 40826, Training Loss: 0.00671
Epoch: 40826, Training Loss: 0.00819
Epoch: 40826, Training Loss: 0.00635
Epoch: 40827, Training Loss: 0.00717
Epoch: 40827, Training Loss: 0.00671
Epoch: 40827, Training Loss: 0.00819
Epoch: 40827, Training Loss: 0.00635
Epoch: 40828, Training Loss: 0.00717
Epoch: 40828, Training Loss: 0.00671
Epoch: 40828, Training Loss: 0.00819
Epoch: 40828, Training Loss: 0.00635
Epoch: 40829, Training Loss: 0.00717
Epoch: 40829, Training Loss: 0.00671
Epoch: 40829, Training Loss: 0.00819
Epoch: 40829, Training Loss: 0.00635
Epoch: 40830, Training Loss: 0.00717
Epoch: 40830, Training Loss: 0.00671
Epoch: 40830, Training Loss: 0.00819
Epoch: 40830, Training Loss: 0.00635
Epoch: 40831, Training Loss: 0.00717
Epoch: 40831, Training Loss: 0.00671
Epoch: 40831, Training Loss: 0.00819
Epoch: 40831, Training Loss: 0.00635
Epoch: 40832, Training Loss: 0.00717
Epoch: 40832, Training Loss: 0.00671
Epoch: 40832, Training Loss: 0.00819
Epoch: 40832, Training Loss: 0.00635
Epoch: 40833, Training Loss: 0.00717
Epoch: 40833, Training Loss: 0.00671
Epoch: 40833, Training Loss: 0.00819
Epoch: 40833, Training Loss: 0.00635
Epoch: 40834, Training Loss: 0.00717
Epoch: 40834, Training Loss: 0.00671
Epoch: 40834, Training Loss: 0.00819
Epoch: 40834, Training Loss: 0.00635
Epoch: 40835, Training Loss: 0.00717
Epoch: 40835, Training Loss: 0.00671
Epoch: 40835, Training Loss: 0.00819
Epoch: 40835, Training Loss: 0.00635
Epoch: 40836, Training Loss: 0.00717
Epoch: 40836, Training Loss: 0.00671
Epoch: 40836, Training Loss: 0.00819
Epoch: 40836, Training Loss: 0.00635
Epoch: 40837, Training Loss: 0.00717
Epoch: 40837, Training Loss: 0.00671
Epoch: 40837, Training Loss: 0.00819
Epoch: 40837, Training Loss: 0.00635
Epoch: 40838, Training Loss: 0.00717
Epoch: 40838, Training Loss: 0.00671
Epoch: 40838, Training Loss: 0.00819
Epoch: 40838, Training Loss: 0.00635
Epoch: 40839, Training Loss: 0.00717
Epoch: 40839, Training Loss: 0.00671
Epoch: 40839, Training Loss: 0.00819
Epoch: 40839, Training Loss: 0.00635
Epoch: 40840, Training Loss: 0.00717
Epoch: 40840, Training Loss: 0.00671
Epoch: 40840, Training Loss: 0.00819
Epoch: 40840, Training Loss: 0.00635
Epoch: 40841, Training Loss: 0.00717
Epoch: 40841, Training Loss: 0.00671
Epoch: 40841, Training Loss: 0.00819
Epoch: 40841, Training Loss: 0.00635
Epoch: 40842, Training Loss: 0.00716
Epoch: 40842, Training Loss: 0.00671
Epoch: 40842, Training Loss: 0.00819
Epoch: 40842, Training Loss: 0.00635
Epoch: 40843, Training Loss: 0.00716
Epoch: 40843, Training Loss: 0.00671
Epoch: 40843, Training Loss: 0.00819
Epoch: 40843, Training Loss: 0.00635
Epoch: 40844, Training Loss: 0.00716
Epoch: 40844, Training Loss: 0.00671
Epoch: 40844, Training Loss: 0.00819
Epoch: 40844, Training Loss: 0.00635
Epoch: 40845, Training Loss: 0.00716
Epoch: 40845, Training Loss: 0.00671
Epoch: 40845, Training Loss: 0.00819
Epoch: 40845, Training Loss: 0.00635
Epoch: 40846, Training Loss: 0.00716
Epoch: 40846, Training Loss: 0.00671
Epoch: 40846, Training Loss: 0.00819
Epoch: 40846, Training Loss: 0.00635
Epoch: 40847, Training Loss: 0.00716
Epoch: 40847, Training Loss: 0.00671
Epoch: 40847, Training Loss: 0.00819
Epoch: 40847, Training Loss: 0.00635
Epoch: 40848, Training Loss: 0.00716
Epoch: 40848, Training Loss: 0.00671
Epoch: 40848, Training Loss: 0.00819
Epoch: 40848, Training Loss: 0.00635
Epoch: 40849, Training Loss: 0.00716
Epoch: 40849, Training Loss: 0.00671
Epoch: 40849, Training Loss: 0.00819
Epoch: 40849, Training Loss: 0.00635
Epoch: 40850, Training Loss: 0.00716
Epoch: 40850, Training Loss: 0.00671
Epoch: 40850, Training Loss: 0.00819
Epoch: 40850, Training Loss: 0.00635
Epoch: 40851, Training Loss: 0.00716
Epoch: 40851, Training Loss: 0.00671
Epoch: 40851, Training Loss: 0.00819
Epoch: 40851, Training Loss: 0.00635
Epoch: 40852, Training Loss: 0.00716
Epoch: 40852, Training Loss: 0.00671
Epoch: 40852, Training Loss: 0.00819
Epoch: 40852, Training Loss: 0.00635
Epoch: 40853, Training Loss: 0.00716
Epoch: 40853, Training Loss: 0.00671
Epoch: 40853, Training Loss: 0.00819
Epoch: 40853, Training Loss: 0.00635
Epoch: 40854, Training Loss: 0.00716
Epoch: 40854, Training Loss: 0.00671
Epoch: 40854, Training Loss: 0.00819
Epoch: 40854, Training Loss: 0.00635
Epoch: 40855, Training Loss: 0.00716
Epoch: 40855, Training Loss: 0.00671
Epoch: 40855, Training Loss: 0.00819
Epoch: 40855, Training Loss: 0.00635
Epoch: 40856, Training Loss: 0.00716
Epoch: 40856, Training Loss: 0.00671
Epoch: 40856, Training Loss: 0.00819
Epoch: 40856, Training Loss: 0.00635
Epoch: 40857, Training Loss: 0.00716
Epoch: 40857, Training Loss: 0.00671
Epoch: 40857, Training Loss: 0.00819
Epoch: 40857, Training Loss: 0.00635
Epoch: 40858, Training Loss: 0.00716
Epoch: 40858, Training Loss: 0.00671
Epoch: 40858, Training Loss: 0.00819
Epoch: 40858, Training Loss: 0.00635
Epoch: 40859, Training Loss: 0.00716
Epoch: 40859, Training Loss: 0.00671
Epoch: 40859, Training Loss: 0.00819
Epoch: 40859, Training Loss: 0.00635
Epoch: 40860, Training Loss: 0.00716
Epoch: 40860, Training Loss: 0.00671
Epoch: 40860, Training Loss: 0.00819
Epoch: 40860, Training Loss: 0.00635
Epoch: 40861, Training Loss: 0.00716
Epoch: 40861, Training Loss: 0.00671
Epoch: 40861, Training Loss: 0.00818
Epoch: 40861, Training Loss: 0.00635
Epoch: 40862, Training Loss: 0.00716
Epoch: 40862, Training Loss: 0.00671
Epoch: 40862, Training Loss: 0.00818
Epoch: 40862, Training Loss: 0.00635
Epoch: 40863, Training Loss: 0.00716
Epoch: 40863, Training Loss: 0.00671
Epoch: 40863, Training Loss: 0.00818
Epoch: 40863, Training Loss: 0.00635
Epoch: 40864, Training Loss: 0.00716
Epoch: 40864, Training Loss: 0.00670
Epoch: 40864, Training Loss: 0.00818
Epoch: 40864, Training Loss: 0.00635
Epoch: 40865, Training Loss: 0.00716
Epoch: 40865, Training Loss: 0.00670
Epoch: 40865, Training Loss: 0.00818
Epoch: 40865, Training Loss: 0.00635
Epoch: 40866, Training Loss: 0.00716
Epoch: 40866, Training Loss: 0.00670
Epoch: 40866, Training Loss: 0.00818
Epoch: 40866, Training Loss: 0.00635
Epoch: 40867, Training Loss: 0.00716
Epoch: 40867, Training Loss: 0.00670
Epoch: 40867, Training Loss: 0.00818
Epoch: 40867, Training Loss: 0.00635
Epoch: 40868, Training Loss: 0.00716
Epoch: 40868, Training Loss: 0.00670
Epoch: 40868, Training Loss: 0.00818
Epoch: 40868, Training Loss: 0.00635
Epoch: 40869, Training Loss: 0.00716
Epoch: 40869, Training Loss: 0.00670
Epoch: 40869, Training Loss: 0.00818
Epoch: 40869, Training Loss: 0.00635
Epoch: 40870, Training Loss: 0.00716
Epoch: 40870, Training Loss: 0.00670
Epoch: 40870, Training Loss: 0.00818
Epoch: 40870, Training Loss: 0.00635
Epoch: 40871, Training Loss: 0.00716
Epoch: 40871, Training Loss: 0.00670
Epoch: 40871, Training Loss: 0.00818
Epoch: 40871, Training Loss: 0.00635
Epoch: 40872, Training Loss: 0.00716
Epoch: 40872, Training Loss: 0.00670
Epoch: 40872, Training Loss: 0.00818
Epoch: 40872, Training Loss: 0.00635
Epoch: 40873, Training Loss: 0.00716
Epoch: 40873, Training Loss: 0.00670
Epoch: 40873, Training Loss: 0.00818
Epoch: 40873, Training Loss: 0.00635
Epoch: 40874, Training Loss: 0.00716
Epoch: 40874, Training Loss: 0.00670
Epoch: 40874, Training Loss: 0.00818
Epoch: 40874, Training Loss: 0.00635
Epoch: 40875, Training Loss: 0.00716
Epoch: 40875, Training Loss: 0.00670
Epoch: 40875, Training Loss: 0.00818
Epoch: 40875, Training Loss: 0.00635
Epoch: 40876, Training Loss: 0.00716
Epoch: 40876, Training Loss: 0.00670
Epoch: 40876, Training Loss: 0.00818
Epoch: 40876, Training Loss: 0.00635
Epoch: 40877, Training Loss: 0.00716
Epoch: 40877, Training Loss: 0.00670
Epoch: 40877, Training Loss: 0.00818
Epoch: 40877, Training Loss: 0.00635
Epoch: 40878, Training Loss: 0.00716
Epoch: 40878, Training Loss: 0.00670
Epoch: 40878, Training Loss: 0.00818
Epoch: 40878, Training Loss: 0.00635
Epoch: 40879, Training Loss: 0.00716
Epoch: 40879, Training Loss: 0.00670
Epoch: 40879, Training Loss: 0.00818
Epoch: 40879, Training Loss: 0.00635
Epoch: 40880, Training Loss: 0.00716
Epoch: 40880, Training Loss: 0.00670
Epoch: 40880, Training Loss: 0.00818
Epoch: 40880, Training Loss: 0.00635
Epoch: 40881, Training Loss: 0.00716
Epoch: 40881, Training Loss: 0.00670
Epoch: 40881, Training Loss: 0.00818
Epoch: 40881, Training Loss: 0.00635
Epoch: 40882, Training Loss: 0.00716
Epoch: 40882, Training Loss: 0.00670
Epoch: 40882, Training Loss: 0.00818
Epoch: 40882, Training Loss: 0.00635
Epoch: 40883, Training Loss: 0.00716
Epoch: 40883, Training Loss: 0.00670
Epoch: 40883, Training Loss: 0.00818
Epoch: 40883, Training Loss: 0.00635
Epoch: 40884, Training Loss: 0.00716
Epoch: 40884, Training Loss: 0.00670
Epoch: 40884, Training Loss: 0.00818
Epoch: 40884, Training Loss: 0.00635
Epoch: 40885, Training Loss: 0.00716
Epoch: 40885, Training Loss: 0.00670
Epoch: 40885, Training Loss: 0.00818
Epoch: 40885, Training Loss: 0.00635
Epoch: 40886, Training Loss: 0.00716
Epoch: 40886, Training Loss: 0.00670
Epoch: 40886, Training Loss: 0.00818
Epoch: 40886, Training Loss: 0.00635
Epoch: 40887, Training Loss: 0.00716
Epoch: 40887, Training Loss: 0.00670
Epoch: 40887, Training Loss: 0.00818
Epoch: 40887, Training Loss: 0.00635
Epoch: 40888, Training Loss: 0.00716
Epoch: 40888, Training Loss: 0.00670
Epoch: 40888, Training Loss: 0.00818
Epoch: 40888, Training Loss: 0.00635
Epoch: 40889, Training Loss: 0.00716
Epoch: 40889, Training Loss: 0.00670
Epoch: 40889, Training Loss: 0.00818
Epoch: 40889, Training Loss: 0.00635
Epoch: 40890, Training Loss: 0.00716
Epoch: 40890, Training Loss: 0.00670
Epoch: 40890, Training Loss: 0.00818
Epoch: 40890, Training Loss: 0.00635
Epoch: 40891, Training Loss: 0.00716
Epoch: 40891, Training Loss: 0.00670
Epoch: 40891, Training Loss: 0.00818
Epoch: 40891, Training Loss: 0.00635
Epoch: 40892, Training Loss: 0.00716
Epoch: 40892, Training Loss: 0.00670
Epoch: 40892, Training Loss: 0.00818
Epoch: 40892, Training Loss: 0.00635
Epoch: 40893, Training Loss: 0.00716
Epoch: 40893, Training Loss: 0.00670
Epoch: 40893, Training Loss: 0.00818
Epoch: 40893, Training Loss: 0.00635
Epoch: 40894, Training Loss: 0.00716
Epoch: 40894, Training Loss: 0.00670
Epoch: 40894, Training Loss: 0.00818
Epoch: 40894, Training Loss: 0.00635
Epoch: 40895, Training Loss: 0.00716
Epoch: 40895, Training Loss: 0.00670
Epoch: 40895, Training Loss: 0.00818
Epoch: 40895, Training Loss: 0.00635
Epoch: 40896, Training Loss: 0.00716
Epoch: 40896, Training Loss: 0.00670
Epoch: 40896, Training Loss: 0.00818
Epoch: 40896, Training Loss: 0.00635
Epoch: 40897, Training Loss: 0.00716
Epoch: 40897, Training Loss: 0.00670
Epoch: 40897, Training Loss: 0.00818
Epoch: 40897, Training Loss: 0.00635
Epoch: 40898, Training Loss: 0.00716
Epoch: 40898, Training Loss: 0.00670
Epoch: 40898, Training Loss: 0.00818
Epoch: 40898, Training Loss: 0.00635
Epoch: 40899, Training Loss: 0.00716
Epoch: 40899, Training Loss: 0.00670
Epoch: 40899, Training Loss: 0.00818
Epoch: 40899, Training Loss: 0.00635
Epoch: 40900, Training Loss: 0.00716
Epoch: 40900, Training Loss: 0.00670
Epoch: 40900, Training Loss: 0.00818
Epoch: 40900, Training Loss: 0.00635
Epoch: 40901, Training Loss: 0.00716
Epoch: 40901, Training Loss: 0.00670
Epoch: 40901, Training Loss: 0.00818
Epoch: 40901, Training Loss: 0.00635
Epoch: 40902, Training Loss: 0.00716
Epoch: 40902, Training Loss: 0.00670
Epoch: 40902, Training Loss: 0.00818
Epoch: 40902, Training Loss: 0.00634
Epoch: 40903, Training Loss: 0.00716
Epoch: 40903, Training Loss: 0.00670
Epoch: 40903, Training Loss: 0.00818
Epoch: 40903, Training Loss: 0.00634
Epoch: 40904, Training Loss: 0.00716
Epoch: 40904, Training Loss: 0.00670
Epoch: 40904, Training Loss: 0.00818
Epoch: 40904, Training Loss: 0.00634
Epoch: 40905, Training Loss: 0.00716
Epoch: 40905, Training Loss: 0.00670
Epoch: 40905, Training Loss: 0.00818
Epoch: 40905, Training Loss: 0.00634
Epoch: 40906, Training Loss: 0.00716
Epoch: 40906, Training Loss: 0.00670
Epoch: 40906, Training Loss: 0.00818
Epoch: 40906, Training Loss: 0.00634
Epoch: 40907, Training Loss: 0.00716
Epoch: 40907, Training Loss: 0.00670
Epoch: 40907, Training Loss: 0.00818
Epoch: 40907, Training Loss: 0.00634
Epoch: 40908, Training Loss: 0.00716
Epoch: 40908, Training Loss: 0.00670
Epoch: 40908, Training Loss: 0.00818
Epoch: 40908, Training Loss: 0.00634
Epoch: 40909, Training Loss: 0.00716
Epoch: 40909, Training Loss: 0.00670
Epoch: 40909, Training Loss: 0.00818
Epoch: 40909, Training Loss: 0.00634
Epoch: 40910, Training Loss: 0.00716
Epoch: 40910, Training Loss: 0.00670
Epoch: 40910, Training Loss: 0.00818
Epoch: 40910, Training Loss: 0.00634
Epoch: 40911, Training Loss: 0.00716
Epoch: 40911, Training Loss: 0.00670
Epoch: 40911, Training Loss: 0.00818
Epoch: 40911, Training Loss: 0.00634
[... per-sample losses for epochs 40912-41154 omitted: the four training-sample losses decrease very slowly, from about 0.00716 / 0.00670 / 0.00818 / 0.00634 down to about 0.00714 / 0.00668 / 0.00815 / 0.00632 ...]
Epoch: 41155, Training Loss: 0.00714
Epoch: 41155, Training Loss: 0.00668
Epoch: 41155, Training Loss: 0.00815
Epoch: 41155, Training Loss: 0.00632
Epoch: 41156, Training Loss: 0.00714
Epoch: 41156, Training Loss: 0.00668
Epoch: 41156, Training Loss: 0.00815
Epoch: 41156, Training Loss: 0.00632
Epoch: 41157, Training Loss: 0.00714
Epoch: 41157, Training Loss: 0.00668
Epoch: 41157, Training Loss: 0.00815
Epoch: 41157, Training Loss: 0.00632
Epoch: 41158, Training Loss: 0.00714
Epoch: 41158, Training Loss: 0.00668
Epoch: 41158, Training Loss: 0.00815
Epoch: 41158, Training Loss: 0.00632
Epoch: 41159, Training Loss: 0.00714
Epoch: 41159, Training Loss: 0.00668
Epoch: 41159, Training Loss: 0.00815
Epoch: 41159, Training Loss: 0.00632
Epoch: 41160, Training Loss: 0.00714
Epoch: 41160, Training Loss: 0.00668
Epoch: 41160, Training Loss: 0.00815
Epoch: 41160, Training Loss: 0.00632
Epoch: 41161, Training Loss: 0.00714
Epoch: 41161, Training Loss: 0.00668
Epoch: 41161, Training Loss: 0.00815
Epoch: 41161, Training Loss: 0.00632
Epoch: 41162, Training Loss: 0.00714
Epoch: 41162, Training Loss: 0.00668
Epoch: 41162, Training Loss: 0.00815
Epoch: 41162, Training Loss: 0.00632
Epoch: 41163, Training Loss: 0.00714
Epoch: 41163, Training Loss: 0.00668
Epoch: 41163, Training Loss: 0.00815
Epoch: 41163, Training Loss: 0.00632
Epoch: 41164, Training Loss: 0.00714
Epoch: 41164, Training Loss: 0.00668
Epoch: 41164, Training Loss: 0.00815
Epoch: 41164, Training Loss: 0.00632
Epoch: 41165, Training Loss: 0.00714
Epoch: 41165, Training Loss: 0.00668
Epoch: 41165, Training Loss: 0.00815
Epoch: 41165, Training Loss: 0.00632
Epoch: 41166, Training Loss: 0.00714
Epoch: 41166, Training Loss: 0.00668
Epoch: 41166, Training Loss: 0.00815
Epoch: 41166, Training Loss: 0.00632
Epoch: 41167, Training Loss: 0.00714
Epoch: 41167, Training Loss: 0.00668
Epoch: 41167, Training Loss: 0.00815
Epoch: 41167, Training Loss: 0.00632
Epoch: 41168, Training Loss: 0.00714
Epoch: 41168, Training Loss: 0.00668
Epoch: 41168, Training Loss: 0.00815
Epoch: 41168, Training Loss: 0.00632
Epoch: 41169, Training Loss: 0.00713
Epoch: 41169, Training Loss: 0.00668
Epoch: 41169, Training Loss: 0.00815
Epoch: 41169, Training Loss: 0.00632
Epoch: 41170, Training Loss: 0.00713
Epoch: 41170, Training Loss: 0.00668
Epoch: 41170, Training Loss: 0.00815
Epoch: 41170, Training Loss: 0.00632
Epoch: 41171, Training Loss: 0.00713
Epoch: 41171, Training Loss: 0.00668
Epoch: 41171, Training Loss: 0.00815
Epoch: 41171, Training Loss: 0.00632
Epoch: 41172, Training Loss: 0.00713
Epoch: 41172, Training Loss: 0.00668
Epoch: 41172, Training Loss: 0.00815
Epoch: 41172, Training Loss: 0.00632
Epoch: 41173, Training Loss: 0.00713
Epoch: 41173, Training Loss: 0.00668
Epoch: 41173, Training Loss: 0.00815
Epoch: 41173, Training Loss: 0.00632
Epoch: 41174, Training Loss: 0.00713
Epoch: 41174, Training Loss: 0.00668
Epoch: 41174, Training Loss: 0.00815
Epoch: 41174, Training Loss: 0.00632
Epoch: 41175, Training Loss: 0.00713
Epoch: 41175, Training Loss: 0.00668
Epoch: 41175, Training Loss: 0.00815
Epoch: 41175, Training Loss: 0.00632
Epoch: 41176, Training Loss: 0.00713
Epoch: 41176, Training Loss: 0.00668
Epoch: 41176, Training Loss: 0.00815
Epoch: 41176, Training Loss: 0.00632
Epoch: 41177, Training Loss: 0.00713
Epoch: 41177, Training Loss: 0.00668
Epoch: 41177, Training Loss: 0.00815
Epoch: 41177, Training Loss: 0.00632
Epoch: 41178, Training Loss: 0.00713
Epoch: 41178, Training Loss: 0.00668
Epoch: 41178, Training Loss: 0.00815
Epoch: 41178, Training Loss: 0.00632
Epoch: 41179, Training Loss: 0.00713
Epoch: 41179, Training Loss: 0.00668
Epoch: 41179, Training Loss: 0.00815
Epoch: 41179, Training Loss: 0.00632
Epoch: 41180, Training Loss: 0.00713
Epoch: 41180, Training Loss: 0.00668
Epoch: 41180, Training Loss: 0.00815
Epoch: 41180, Training Loss: 0.00632
Epoch: 41181, Training Loss: 0.00713
Epoch: 41181, Training Loss: 0.00668
Epoch: 41181, Training Loss: 0.00815
Epoch: 41181, Training Loss: 0.00632
Epoch: 41182, Training Loss: 0.00713
Epoch: 41182, Training Loss: 0.00668
Epoch: 41182, Training Loss: 0.00815
Epoch: 41182, Training Loss: 0.00632
Epoch: 41183, Training Loss: 0.00713
Epoch: 41183, Training Loss: 0.00668
Epoch: 41183, Training Loss: 0.00815
Epoch: 41183, Training Loss: 0.00632
Epoch: 41184, Training Loss: 0.00713
Epoch: 41184, Training Loss: 0.00668
Epoch: 41184, Training Loss: 0.00815
Epoch: 41184, Training Loss: 0.00632
Epoch: 41185, Training Loss: 0.00713
Epoch: 41185, Training Loss: 0.00668
Epoch: 41185, Training Loss: 0.00815
Epoch: 41185, Training Loss: 0.00632
Epoch: 41186, Training Loss: 0.00713
Epoch: 41186, Training Loss: 0.00668
Epoch: 41186, Training Loss: 0.00815
Epoch: 41186, Training Loss: 0.00632
Epoch: 41187, Training Loss: 0.00713
Epoch: 41187, Training Loss: 0.00668
Epoch: 41187, Training Loss: 0.00815
Epoch: 41187, Training Loss: 0.00632
Epoch: 41188, Training Loss: 0.00713
Epoch: 41188, Training Loss: 0.00668
Epoch: 41188, Training Loss: 0.00815
Epoch: 41188, Training Loss: 0.00632
Epoch: 41189, Training Loss: 0.00713
Epoch: 41189, Training Loss: 0.00668
Epoch: 41189, Training Loss: 0.00815
Epoch: 41189, Training Loss: 0.00632
Epoch: 41190, Training Loss: 0.00713
Epoch: 41190, Training Loss: 0.00668
Epoch: 41190, Training Loss: 0.00815
Epoch: 41190, Training Loss: 0.00632
Epoch: 41191, Training Loss: 0.00713
Epoch: 41191, Training Loss: 0.00668
Epoch: 41191, Training Loss: 0.00815
Epoch: 41191, Training Loss: 0.00632
Epoch: 41192, Training Loss: 0.00713
Epoch: 41192, Training Loss: 0.00668
Epoch: 41192, Training Loss: 0.00815
Epoch: 41192, Training Loss: 0.00632
Epoch: 41193, Training Loss: 0.00713
Epoch: 41193, Training Loss: 0.00668
Epoch: 41193, Training Loss: 0.00815
Epoch: 41193, Training Loss: 0.00632
Epoch: 41194, Training Loss: 0.00713
Epoch: 41194, Training Loss: 0.00668
Epoch: 41194, Training Loss: 0.00815
Epoch: 41194, Training Loss: 0.00632
Epoch: 41195, Training Loss: 0.00713
Epoch: 41195, Training Loss: 0.00668
Epoch: 41195, Training Loss: 0.00815
Epoch: 41195, Training Loss: 0.00632
Epoch: 41196, Training Loss: 0.00713
Epoch: 41196, Training Loss: 0.00668
Epoch: 41196, Training Loss: 0.00815
Epoch: 41196, Training Loss: 0.00632
Epoch: 41197, Training Loss: 0.00713
Epoch: 41197, Training Loss: 0.00668
Epoch: 41197, Training Loss: 0.00815
Epoch: 41197, Training Loss: 0.00632
Epoch: 41198, Training Loss: 0.00713
Epoch: 41198, Training Loss: 0.00668
Epoch: 41198, Training Loss: 0.00815
Epoch: 41198, Training Loss: 0.00632
Epoch: 41199, Training Loss: 0.00713
Epoch: 41199, Training Loss: 0.00668
Epoch: 41199, Training Loss: 0.00815
Epoch: 41199, Training Loss: 0.00632
Epoch: 41200, Training Loss: 0.00713
Epoch: 41200, Training Loss: 0.00668
Epoch: 41200, Training Loss: 0.00815
Epoch: 41200, Training Loss: 0.00632
Epoch: 41201, Training Loss: 0.00713
Epoch: 41201, Training Loss: 0.00668
Epoch: 41201, Training Loss: 0.00815
Epoch: 41201, Training Loss: 0.00632
Epoch: 41202, Training Loss: 0.00713
Epoch: 41202, Training Loss: 0.00668
Epoch: 41202, Training Loss: 0.00815
Epoch: 41202, Training Loss: 0.00632
Epoch: 41203, Training Loss: 0.00713
Epoch: 41203, Training Loss: 0.00668
Epoch: 41203, Training Loss: 0.00815
Epoch: 41203, Training Loss: 0.00632
Epoch: 41204, Training Loss: 0.00713
Epoch: 41204, Training Loss: 0.00668
Epoch: 41204, Training Loss: 0.00815
Epoch: 41204, Training Loss: 0.00632
Epoch: 41205, Training Loss: 0.00713
Epoch: 41205, Training Loss: 0.00668
Epoch: 41205, Training Loss: 0.00815
Epoch: 41205, Training Loss: 0.00632
Epoch: 41206, Training Loss: 0.00713
Epoch: 41206, Training Loss: 0.00668
Epoch: 41206, Training Loss: 0.00815
Epoch: 41206, Training Loss: 0.00632
Epoch: 41207, Training Loss: 0.00713
Epoch: 41207, Training Loss: 0.00668
Epoch: 41207, Training Loss: 0.00815
Epoch: 41207, Training Loss: 0.00632
Epoch: 41208, Training Loss: 0.00713
Epoch: 41208, Training Loss: 0.00668
Epoch: 41208, Training Loss: 0.00815
Epoch: 41208, Training Loss: 0.00632
Epoch: 41209, Training Loss: 0.00713
Epoch: 41209, Training Loss: 0.00668
Epoch: 41209, Training Loss: 0.00815
Epoch: 41209, Training Loss: 0.00632
Epoch: 41210, Training Loss: 0.00713
Epoch: 41210, Training Loss: 0.00668
Epoch: 41210, Training Loss: 0.00815
Epoch: 41210, Training Loss: 0.00632
Epoch: 41211, Training Loss: 0.00713
Epoch: 41211, Training Loss: 0.00668
Epoch: 41211, Training Loss: 0.00815
Epoch: 41211, Training Loss: 0.00632
Epoch: 41212, Training Loss: 0.00713
Epoch: 41212, Training Loss: 0.00668
Epoch: 41212, Training Loss: 0.00815
Epoch: 41212, Training Loss: 0.00632
Epoch: 41213, Training Loss: 0.00713
Epoch: 41213, Training Loss: 0.00668
Epoch: 41213, Training Loss: 0.00815
Epoch: 41213, Training Loss: 0.00632
Epoch: 41214, Training Loss: 0.00713
Epoch: 41214, Training Loss: 0.00668
Epoch: 41214, Training Loss: 0.00815
Epoch: 41214, Training Loss: 0.00632
Epoch: 41215, Training Loss: 0.00713
Epoch: 41215, Training Loss: 0.00668
Epoch: 41215, Training Loss: 0.00815
Epoch: 41215, Training Loss: 0.00632
Epoch: 41216, Training Loss: 0.00713
Epoch: 41216, Training Loss: 0.00668
Epoch: 41216, Training Loss: 0.00815
Epoch: 41216, Training Loss: 0.00632
Epoch: 41217, Training Loss: 0.00713
Epoch: 41217, Training Loss: 0.00668
Epoch: 41217, Training Loss: 0.00815
Epoch: 41217, Training Loss: 0.00632
Epoch: 41218, Training Loss: 0.00713
Epoch: 41218, Training Loss: 0.00667
Epoch: 41218, Training Loss: 0.00815
Epoch: 41218, Training Loss: 0.00632
Epoch: 41219, Training Loss: 0.00713
Epoch: 41219, Training Loss: 0.00667
Epoch: 41219, Training Loss: 0.00815
Epoch: 41219, Training Loss: 0.00632
Epoch: 41220, Training Loss: 0.00713
Epoch: 41220, Training Loss: 0.00667
Epoch: 41220, Training Loss: 0.00815
Epoch: 41220, Training Loss: 0.00632
Epoch: 41221, Training Loss: 0.00713
Epoch: 41221, Training Loss: 0.00667
Epoch: 41221, Training Loss: 0.00815
Epoch: 41221, Training Loss: 0.00632
Epoch: 41222, Training Loss: 0.00713
Epoch: 41222, Training Loss: 0.00667
Epoch: 41222, Training Loss: 0.00815
Epoch: 41222, Training Loss: 0.00632
Epoch: 41223, Training Loss: 0.00713
Epoch: 41223, Training Loss: 0.00667
Epoch: 41223, Training Loss: 0.00815
Epoch: 41223, Training Loss: 0.00632
Epoch: 41224, Training Loss: 0.00713
Epoch: 41224, Training Loss: 0.00667
Epoch: 41224, Training Loss: 0.00815
Epoch: 41224, Training Loss: 0.00632
Epoch: 41225, Training Loss: 0.00713
Epoch: 41225, Training Loss: 0.00667
Epoch: 41225, Training Loss: 0.00815
Epoch: 41225, Training Loss: 0.00632
Epoch: 41226, Training Loss: 0.00713
Epoch: 41226, Training Loss: 0.00667
Epoch: 41226, Training Loss: 0.00815
Epoch: 41226, Training Loss: 0.00632
Epoch: 41227, Training Loss: 0.00713
Epoch: 41227, Training Loss: 0.00667
Epoch: 41227, Training Loss: 0.00815
Epoch: 41227, Training Loss: 0.00632
Epoch: 41228, Training Loss: 0.00713
Epoch: 41228, Training Loss: 0.00667
Epoch: 41228, Training Loss: 0.00815
Epoch: 41228, Training Loss: 0.00632
Epoch: 41229, Training Loss: 0.00713
Epoch: 41229, Training Loss: 0.00667
Epoch: 41229, Training Loss: 0.00815
Epoch: 41229, Training Loss: 0.00632
Epoch: 41230, Training Loss: 0.00713
Epoch: 41230, Training Loss: 0.00667
Epoch: 41230, Training Loss: 0.00815
Epoch: 41230, Training Loss: 0.00632
Epoch: 41231, Training Loss: 0.00713
Epoch: 41231, Training Loss: 0.00667
Epoch: 41231, Training Loss: 0.00815
Epoch: 41231, Training Loss: 0.00632
Epoch: 41232, Training Loss: 0.00713
Epoch: 41232, Training Loss: 0.00667
Epoch: 41232, Training Loss: 0.00815
Epoch: 41232, Training Loss: 0.00632
Epoch: 41233, Training Loss: 0.00713
Epoch: 41233, Training Loss: 0.00667
Epoch: 41233, Training Loss: 0.00815
Epoch: 41233, Training Loss: 0.00632
Epoch: 41234, Training Loss: 0.00713
Epoch: 41234, Training Loss: 0.00667
Epoch: 41234, Training Loss: 0.00815
Epoch: 41234, Training Loss: 0.00632
Epoch: 41235, Training Loss: 0.00713
Epoch: 41235, Training Loss: 0.00667
Epoch: 41235, Training Loss: 0.00815
Epoch: 41235, Training Loss: 0.00632
Epoch: 41236, Training Loss: 0.00713
Epoch: 41236, Training Loss: 0.00667
Epoch: 41236, Training Loss: 0.00815
Epoch: 41236, Training Loss: 0.00632
Epoch: 41237, Training Loss: 0.00713
Epoch: 41237, Training Loss: 0.00667
Epoch: 41237, Training Loss: 0.00815
Epoch: 41237, Training Loss: 0.00632
Epoch: 41238, Training Loss: 0.00713
Epoch: 41238, Training Loss: 0.00667
Epoch: 41238, Training Loss: 0.00815
Epoch: 41238, Training Loss: 0.00632
Epoch: 41239, Training Loss: 0.00713
Epoch: 41239, Training Loss: 0.00667
Epoch: 41239, Training Loss: 0.00815
Epoch: 41239, Training Loss: 0.00632
Epoch: 41240, Training Loss: 0.00713
Epoch: 41240, Training Loss: 0.00667
Epoch: 41240, Training Loss: 0.00815
Epoch: 41240, Training Loss: 0.00632
Epoch: 41241, Training Loss: 0.00713
Epoch: 41241, Training Loss: 0.00667
Epoch: 41241, Training Loss: 0.00815
Epoch: 41241, Training Loss: 0.00632
Epoch: 41242, Training Loss: 0.00713
Epoch: 41242, Training Loss: 0.00667
Epoch: 41242, Training Loss: 0.00815
Epoch: 41242, Training Loss: 0.00632
Epoch: 41243, Training Loss: 0.00713
Epoch: 41243, Training Loss: 0.00667
Epoch: 41243, Training Loss: 0.00815
Epoch: 41243, Training Loss: 0.00632
Epoch: 41244, Training Loss: 0.00713
Epoch: 41244, Training Loss: 0.00667
Epoch: 41244, Training Loss: 0.00814
Epoch: 41244, Training Loss: 0.00632
Epoch: 41245, Training Loss: 0.00713
Epoch: 41245, Training Loss: 0.00667
Epoch: 41245, Training Loss: 0.00814
Epoch: 41245, Training Loss: 0.00632
Epoch: 41246, Training Loss: 0.00713
Epoch: 41246, Training Loss: 0.00667
Epoch: 41246, Training Loss: 0.00814
Epoch: 41246, Training Loss: 0.00632
Epoch: 41247, Training Loss: 0.00713
Epoch: 41247, Training Loss: 0.00667
Epoch: 41247, Training Loss: 0.00814
Epoch: 41247, Training Loss: 0.00632
Epoch: 41248, Training Loss: 0.00713
Epoch: 41248, Training Loss: 0.00667
Epoch: 41248, Training Loss: 0.00814
Epoch: 41248, Training Loss: 0.00632
Epoch: 41249, Training Loss: 0.00713
Epoch: 41249, Training Loss: 0.00667
Epoch: 41249, Training Loss: 0.00814
Epoch: 41249, Training Loss: 0.00632
Epoch: 41250, Training Loss: 0.00713
Epoch: 41250, Training Loss: 0.00667
Epoch: 41250, Training Loss: 0.00814
Epoch: 41250, Training Loss: 0.00632
Epoch: 41251, Training Loss: 0.00713
Epoch: 41251, Training Loss: 0.00667
Epoch: 41251, Training Loss: 0.00814
Epoch: 41251, Training Loss: 0.00632
Epoch: 41252, Training Loss: 0.00713
Epoch: 41252, Training Loss: 0.00667
Epoch: 41252, Training Loss: 0.00814
Epoch: 41252, Training Loss: 0.00632
Epoch: 41253, Training Loss: 0.00713
Epoch: 41253, Training Loss: 0.00667
Epoch: 41253, Training Loss: 0.00814
Epoch: 41253, Training Loss: 0.00632
Epoch: 41254, Training Loss: 0.00713
Epoch: 41254, Training Loss: 0.00667
Epoch: 41254, Training Loss: 0.00814
Epoch: 41254, Training Loss: 0.00632
Epoch: 41255, Training Loss: 0.00713
Epoch: 41255, Training Loss: 0.00667
Epoch: 41255, Training Loss: 0.00814
Epoch: 41255, Training Loss: 0.00632
Epoch: 41256, Training Loss: 0.00713
Epoch: 41256, Training Loss: 0.00667
Epoch: 41256, Training Loss: 0.00814
Epoch: 41256, Training Loss: 0.00632
Epoch: 41257, Training Loss: 0.00713
Epoch: 41257, Training Loss: 0.00667
Epoch: 41257, Training Loss: 0.00814
Epoch: 41257, Training Loss: 0.00632
Epoch: 41258, Training Loss: 0.00713
Epoch: 41258, Training Loss: 0.00667
Epoch: 41258, Training Loss: 0.00814
Epoch: 41258, Training Loss: 0.00632
Epoch: 41259, Training Loss: 0.00713
Epoch: 41259, Training Loss: 0.00667
Epoch: 41259, Training Loss: 0.00814
Epoch: 41259, Training Loss: 0.00632
Epoch: 41260, Training Loss: 0.00713
Epoch: 41260, Training Loss: 0.00667
Epoch: 41260, Training Loss: 0.00814
Epoch: 41260, Training Loss: 0.00632
Epoch: 41261, Training Loss: 0.00713
Epoch: 41261, Training Loss: 0.00667
Epoch: 41261, Training Loss: 0.00814
Epoch: 41261, Training Loss: 0.00632
Epoch: 41262, Training Loss: 0.00713
Epoch: 41262, Training Loss: 0.00667
Epoch: 41262, Training Loss: 0.00814
Epoch: 41262, Training Loss: 0.00632
Epoch: 41263, Training Loss: 0.00713
Epoch: 41263, Training Loss: 0.00667
Epoch: 41263, Training Loss: 0.00814
Epoch: 41263, Training Loss: 0.00632
Epoch: 41264, Training Loss: 0.00713
Epoch: 41264, Training Loss: 0.00667
Epoch: 41264, Training Loss: 0.00814
Epoch: 41264, Training Loss: 0.00632
Epoch: 41265, Training Loss: 0.00713
Epoch: 41265, Training Loss: 0.00667
Epoch: 41265, Training Loss: 0.00814
Epoch: 41265, Training Loss: 0.00632
Epoch: 41266, Training Loss: 0.00713
Epoch: 41266, Training Loss: 0.00667
Epoch: 41266, Training Loss: 0.00814
Epoch: 41266, Training Loss: 0.00632
Epoch: 41267, Training Loss: 0.00713
Epoch: 41267, Training Loss: 0.00667
Epoch: 41267, Training Loss: 0.00814
Epoch: 41267, Training Loss: 0.00632
Epoch: 41268, Training Loss: 0.00713
Epoch: 41268, Training Loss: 0.00667
Epoch: 41268, Training Loss: 0.00814
Epoch: 41268, Training Loss: 0.00632
Epoch: 41269, Training Loss: 0.00713
Epoch: 41269, Training Loss: 0.00667
Epoch: 41269, Training Loss: 0.00814
Epoch: 41269, Training Loss: 0.00632
Epoch: 41270, Training Loss: 0.00713
Epoch: 41270, Training Loss: 0.00667
Epoch: 41270, Training Loss: 0.00814
Epoch: 41270, Training Loss: 0.00632
Epoch: 41271, Training Loss: 0.00713
Epoch: 41271, Training Loss: 0.00667
Epoch: 41271, Training Loss: 0.00814
Epoch: 41271, Training Loss: 0.00632
Epoch: 41272, Training Loss: 0.00713
Epoch: 41272, Training Loss: 0.00667
Epoch: 41272, Training Loss: 0.00814
Epoch: 41272, Training Loss: 0.00632
Epoch: 41273, Training Loss: 0.00713
Epoch: 41273, Training Loss: 0.00667
Epoch: 41273, Training Loss: 0.00814
Epoch: 41273, Training Loss: 0.00632
Epoch: 41274, Training Loss: 0.00713
Epoch: 41274, Training Loss: 0.00667
Epoch: 41274, Training Loss: 0.00814
Epoch: 41274, Training Loss: 0.00632
Epoch: 41275, Training Loss: 0.00713
Epoch: 41275, Training Loss: 0.00667
Epoch: 41275, Training Loss: 0.00814
Epoch: 41275, Training Loss: 0.00632
Epoch: 41276, Training Loss: 0.00713
Epoch: 41276, Training Loss: 0.00667
Epoch: 41276, Training Loss: 0.00814
Epoch: 41276, Training Loss: 0.00632
Epoch: 41277, Training Loss: 0.00713
Epoch: 41277, Training Loss: 0.00667
Epoch: 41277, Training Loss: 0.00814
Epoch: 41277, Training Loss: 0.00631
Epoch: 41278, Training Loss: 0.00712
Epoch: 41278, Training Loss: 0.00667
Epoch: 41278, Training Loss: 0.00814
Epoch: 41278, Training Loss: 0.00631
Epoch: 41279, Training Loss: 0.00712
Epoch: 41279, Training Loss: 0.00667
Epoch: 41279, Training Loss: 0.00814
Epoch: 41279, Training Loss: 0.00631
Epoch: 41280, Training Loss: 0.00712
Epoch: 41280, Training Loss: 0.00667
Epoch: 41280, Training Loss: 0.00814
Epoch: 41280, Training Loss: 0.00631
Epoch: 41281, Training Loss: 0.00712
Epoch: 41281, Training Loss: 0.00667
Epoch: 41281, Training Loss: 0.00814
Epoch: 41281, Training Loss: 0.00631
Epoch: 41282, Training Loss: 0.00712
Epoch: 41282, Training Loss: 0.00667
Epoch: 41282, Training Loss: 0.00814
Epoch: 41282, Training Loss: 0.00631
Epoch: 41283, Training Loss: 0.00712
Epoch: 41283, Training Loss: 0.00667
Epoch: 41283, Training Loss: 0.00814
Epoch: 41283, Training Loss: 0.00631
Epoch: 41284, Training Loss: 0.00712
Epoch: 41284, Training Loss: 0.00667
Epoch: 41284, Training Loss: 0.00814
Epoch: 41284, Training Loss: 0.00631
Epoch: 41285, Training Loss: 0.00712
Epoch: 41285, Training Loss: 0.00667
Epoch: 41285, Training Loss: 0.00814
Epoch: 41285, Training Loss: 0.00631
Epoch: 41286, Training Loss: 0.00712
Epoch: 41286, Training Loss: 0.00667
Epoch: 41286, Training Loss: 0.00814
Epoch: 41286, Training Loss: 0.00631
Epoch: 41287, Training Loss: 0.00712
Epoch: 41287, Training Loss: 0.00667
Epoch: 41287, Training Loss: 0.00814
Epoch: 41287, Training Loss: 0.00631
Epoch: 41288, Training Loss: 0.00712
Epoch: 41288, Training Loss: 0.00667
Epoch: 41288, Training Loss: 0.00814
Epoch: 41288, Training Loss: 0.00631
Epoch: 41289, Training Loss: 0.00712
Epoch: 41289, Training Loss: 0.00667
Epoch: 41289, Training Loss: 0.00814
Epoch: 41289, Training Loss: 0.00631
Epoch: 41290, Training Loss: 0.00712
Epoch: 41290, Training Loss: 0.00667
Epoch: 41290, Training Loss: 0.00814
Epoch: 41290, Training Loss: 0.00631
Epoch: 41291, Training Loss: 0.00712
Epoch: 41291, Training Loss: 0.00667
Epoch: 41291, Training Loss: 0.00814
Epoch: 41291, Training Loss: 0.00631
Epoch: 41292, Training Loss: 0.00712
Epoch: 41292, Training Loss: 0.00667
Epoch: 41292, Training Loss: 0.00814
Epoch: 41292, Training Loss: 0.00631
Epoch: 41293, Training Loss: 0.00712
Epoch: 41293, Training Loss: 0.00667
Epoch: 41293, Training Loss: 0.00814
Epoch: 41293, Training Loss: 0.00631
Epoch: 41294, Training Loss: 0.00712
Epoch: 41294, Training Loss: 0.00667
Epoch: 41294, Training Loss: 0.00814
Epoch: 41294, Training Loss: 0.00631
Epoch: 41295, Training Loss: 0.00712
Epoch: 41295, Training Loss: 0.00667
Epoch: 41295, Training Loss: 0.00814
Epoch: 41295, Training Loss: 0.00631
Epoch: 41296, Training Loss: 0.00712
Epoch: 41296, Training Loss: 0.00667
Epoch: 41296, Training Loss: 0.00814
Epoch: 41296, Training Loss: 0.00631
Epoch: 41297, Training Loss: 0.00712
Epoch: 41297, Training Loss: 0.00667
Epoch: 41297, Training Loss: 0.00814
Epoch: 41297, Training Loss: 0.00631
Epoch: 41298, Training Loss: 0.00712
Epoch: 41298, Training Loss: 0.00667
Epoch: 41298, Training Loss: 0.00814
Epoch: 41298, Training Loss: 0.00631
Epoch: 41299, Training Loss: 0.00712
Epoch: 41299, Training Loss: 0.00667
Epoch: 41299, Training Loss: 0.00814
Epoch: 41299, Training Loss: 0.00631
Epoch: 41300, Training Loss: 0.00712
Epoch: 41300, Training Loss: 0.00667
Epoch: 41300, Training Loss: 0.00814
Epoch: 41300, Training Loss: 0.00631
Epoch: 41301, Training Loss: 0.00712
Epoch: 41301, Training Loss: 0.00667
Epoch: 41301, Training Loss: 0.00814
Epoch: 41301, Training Loss: 0.00631
Epoch: 41302, Training Loss: 0.00712
Epoch: 41302, Training Loss: 0.00667
Epoch: 41302, Training Loss: 0.00814
Epoch: 41302, Training Loss: 0.00631
Epoch: 41303, Training Loss: 0.00712
Epoch: 41303, Training Loss: 0.00667
Epoch: 41303, Training Loss: 0.00814
Epoch: 41303, Training Loss: 0.00631
Epoch: 41304, Training Loss: 0.00712
Epoch: 41304, Training Loss: 0.00667
Epoch: 41304, Training Loss: 0.00814
Epoch: 41304, Training Loss: 0.00631
Epoch: 41305, Training Loss: 0.00712
Epoch: 41305, Training Loss: 0.00667
Epoch: 41305, Training Loss: 0.00814
Epoch: 41305, Training Loss: 0.00631
Epoch: 41306, Training Loss: 0.00712
Epoch: 41306, Training Loss: 0.00667
Epoch: 41306, Training Loss: 0.00814
Epoch: 41306, Training Loss: 0.00631
Epoch: 41307, Training Loss: 0.00712
Epoch: 41307, Training Loss: 0.00667
Epoch: 41307, Training Loss: 0.00814
Epoch: 41307, Training Loss: 0.00631
Epoch: 41308, Training Loss: 0.00712
Epoch: 41308, Training Loss: 0.00667
Epoch: 41308, Training Loss: 0.00814
Epoch: 41308, Training Loss: 0.00631
Epoch: 41309, Training Loss: 0.00712
Epoch: 41309, Training Loss: 0.00667
Epoch: 41309, Training Loss: 0.00814
Epoch: 41309, Training Loss: 0.00631
Epoch: 41310, Training Loss: 0.00712
Epoch: 41310, Training Loss: 0.00667
Epoch: 41310, Training Loss: 0.00814
Epoch: 41310, Training Loss: 0.00631
Epoch: 41311, Training Loss: 0.00712
Epoch: 41311, Training Loss: 0.00667
Epoch: 41311, Training Loss: 0.00814
Epoch: 41311, Training Loss: 0.00631
Epoch: 41312, Training Loss: 0.00712
Epoch: 41312, Training Loss: 0.00667
Epoch: 41312, Training Loss: 0.00814
Epoch: 41312, Training Loss: 0.00631
Epoch: 41313, Training Loss: 0.00712
Epoch: 41313, Training Loss: 0.00667
Epoch: 41313, Training Loss: 0.00814
Epoch: 41313, Training Loss: 0.00631
Epoch: 41314, Training Loss: 0.00712
Epoch: 41314, Training Loss: 0.00667
Epoch: 41314, Training Loss: 0.00814
Epoch: 41314, Training Loss: 0.00631
Epoch: 41315, Training Loss: 0.00712
Epoch: 41315, Training Loss: 0.00667
Epoch: 41315, Training Loss: 0.00814
Epoch: 41315, Training Loss: 0.00631
Epoch: 41316, Training Loss: 0.00712
Epoch: 41316, Training Loss: 0.00667
Epoch: 41316, Training Loss: 0.00814
Epoch: 41316, Training Loss: 0.00631
Epoch: 41317, Training Loss: 0.00712
Epoch: 41317, Training Loss: 0.00667
Epoch: 41317, Training Loss: 0.00814
Epoch: 41317, Training Loss: 0.00631
Epoch: 41318, Training Loss: 0.00712
Epoch: 41318, Training Loss: 0.00667
Epoch: 41318, Training Loss: 0.00814
Epoch: 41318, Training Loss: 0.00631
Epoch: 41319, Training Loss: 0.00712
Epoch: 41319, Training Loss: 0.00667
Epoch: 41319, Training Loss: 0.00814
Epoch: 41319, Training Loss: 0.00631
Epoch: 41320, Training Loss: 0.00712
Epoch: 41320, Training Loss: 0.00667
Epoch: 41320, Training Loss: 0.00814
Epoch: 41320, Training Loss: 0.00631
Epoch: 41321, Training Loss: 0.00712
Epoch: 41321, Training Loss: 0.00667
Epoch: 41321, Training Loss: 0.00814
Epoch: 41321, Training Loss: 0.00631
Epoch: 41322, Training Loss: 0.00712
Epoch: 41322, Training Loss: 0.00667
Epoch: 41322, Training Loss: 0.00814
Epoch: 41322, Training Loss: 0.00631
Epoch: 41323, Training Loss: 0.00712
Epoch: 41323, Training Loss: 0.00667
Epoch: 41323, Training Loss: 0.00814
Epoch: 41323, Training Loss: 0.00631
Epoch: 41324, Training Loss: 0.00712
Epoch: 41324, Training Loss: 0.00667
Epoch: 41324, Training Loss: 0.00814
Epoch: 41324, Training Loss: 0.00631
Epoch: 41325, Training Loss: 0.00712
Epoch: 41325, Training Loss: 0.00667
Epoch: 41325, Training Loss: 0.00814
Epoch: 41325, Training Loss: 0.00631
Epoch: 41326, Training Loss: 0.00712
Epoch: 41326, Training Loss: 0.00667
Epoch: 41326, Training Loss: 0.00814
Epoch: 41326, Training Loss: 0.00631
Epoch: 41327, Training Loss: 0.00712
Epoch: 41327, Training Loss: 0.00667
Epoch: 41327, Training Loss: 0.00814
Epoch: 41327, Training Loss: 0.00631
Epoch: 41328, Training Loss: 0.00712
Epoch: 41328, Training Loss: 0.00667
Epoch: 41328, Training Loss: 0.00814
Epoch: 41328, Training Loss: 0.00631
Epoch: 41329, Training Loss: 0.00712
Epoch: 41329, Training Loss: 0.00667
Epoch: 41329, Training Loss: 0.00814
Epoch: 41329, Training Loss: 0.00631
Epoch: 41330, Training Loss: 0.00712
Epoch: 41330, Training Loss: 0.00667
Epoch: 41330, Training Loss: 0.00814
Epoch: 41330, Training Loss: 0.00631
Epoch: 41331, Training Loss: 0.00712
Epoch: 41331, Training Loss: 0.00667
Epoch: 41331, Training Loss: 0.00814
Epoch: 41331, Training Loss: 0.00631
Epoch: 41332, Training Loss: 0.00712
Epoch: 41332, Training Loss: 0.00667
Epoch: 41332, Training Loss: 0.00814
Epoch: 41332, Training Loss: 0.00631
Epoch: 41333, Training Loss: 0.00712
Epoch: 41333, Training Loss: 0.00667
Epoch: 41333, Training Loss: 0.00814
Epoch: 41333, Training Loss: 0.00631
Epoch: 41334, Training Loss: 0.00712
Epoch: 41334, Training Loss: 0.00667
Epoch: 41334, Training Loss: 0.00814
Epoch: 41334, Training Loss: 0.00631
Epoch: 41335, Training Loss: 0.00712
Epoch: 41335, Training Loss: 0.00667
Epoch: 41335, Training Loss: 0.00814
Epoch: 41335, Training Loss: 0.00631
Epoch: 41336, Training Loss: 0.00712
Epoch: 41336, Training Loss: 0.00667
Epoch: 41336, Training Loss: 0.00814
Epoch: 41336, Training Loss: 0.00631
Epoch: 41337, Training Loss: 0.00712
Epoch: 41337, Training Loss: 0.00667
Epoch: 41337, Training Loss: 0.00814
Epoch: 41337, Training Loss: 0.00631
Epoch: 41338, Training Loss: 0.00712
Epoch: 41338, Training Loss: 0.00666
Epoch: 41338, Training Loss: 0.00814
Epoch: 41338, Training Loss: 0.00631
Epoch: 41339, Training Loss: 0.00712
Epoch: 41339, Training Loss: 0.00666
Epoch: 41339, Training Loss: 0.00814
Epoch: 41339, Training Loss: 0.00631
Epoch: 41340, Training Loss: 0.00712
Epoch: 41340, Training Loss: 0.00666
Epoch: 41340, Training Loss: 0.00814
Epoch: 41340, Training Loss: 0.00631
Epoch: 41341, Training Loss: 0.00712
Epoch: 41341, Training Loss: 0.00666
Epoch: 41341, Training Loss: 0.00813
Epoch: 41341, Training Loss: 0.00631
Epoch: 41342, Training Loss: 0.00712
Epoch: 41342, Training Loss: 0.00666
Epoch: 41342, Training Loss: 0.00813
Epoch: 41342, Training Loss: 0.00631
Epoch: 41343, Training Loss: 0.00712
Epoch: 41343, Training Loss: 0.00666
Epoch: 41343, Training Loss: 0.00813
Epoch: 41343, Training Loss: 0.00631
Epoch: 41344, Training Loss: 0.00712
Epoch: 41344, Training Loss: 0.00666
[... output truncated: the loss is printed four times per epoch, once for each
of the four XOR input patterns, and decreases very slowly. By epoch 41587 the
four per-pattern losses have crept down to roughly 0.00710, 0.00664, 0.00811
and 0.00629, so one pattern is still above the 0.008 target ...]
Epoch: 41587, Training Loss: 0.00629
Epoch: 41588, Training Loss: 0.00710
Epoch: 41588, Training Loss: 0.00664
Epoch: 41588, Training Loss: 0.00811
Epoch: 41588, Training Loss: 0.00629
Epoch: 41589, Training Loss: 0.00710
Epoch: 41589, Training Loss: 0.00664
Epoch: 41589, Training Loss: 0.00811
Epoch: 41589, Training Loss: 0.00629
Epoch: 41590, Training Loss: 0.00710
Epoch: 41590, Training Loss: 0.00664
Epoch: 41590, Training Loss: 0.00811
Epoch: 41590, Training Loss: 0.00629
Epoch: 41591, Training Loss: 0.00710
Epoch: 41591, Training Loss: 0.00664
Epoch: 41591, Training Loss: 0.00811
Epoch: 41591, Training Loss: 0.00629
Epoch: 41592, Training Loss: 0.00710
Epoch: 41592, Training Loss: 0.00664
Epoch: 41592, Training Loss: 0.00811
Epoch: 41592, Training Loss: 0.00629
Epoch: 41593, Training Loss: 0.00710
Epoch: 41593, Training Loss: 0.00664
Epoch: 41593, Training Loss: 0.00811
Epoch: 41593, Training Loss: 0.00629
Epoch: 41594, Training Loss: 0.00710
Epoch: 41594, Training Loss: 0.00664
Epoch: 41594, Training Loss: 0.00811
Epoch: 41594, Training Loss: 0.00629
Epoch: 41595, Training Loss: 0.00710
Epoch: 41595, Training Loss: 0.00664
Epoch: 41595, Training Loss: 0.00811
Epoch: 41595, Training Loss: 0.00629
Epoch: 41596, Training Loss: 0.00710
Epoch: 41596, Training Loss: 0.00664
Epoch: 41596, Training Loss: 0.00811
Epoch: 41596, Training Loss: 0.00629
Epoch: 41597, Training Loss: 0.00710
Epoch: 41597, Training Loss: 0.00664
Epoch: 41597, Training Loss: 0.00811
Epoch: 41597, Training Loss: 0.00629
Epoch: 41598, Training Loss: 0.00710
Epoch: 41598, Training Loss: 0.00664
Epoch: 41598, Training Loss: 0.00811
Epoch: 41598, Training Loss: 0.00629
Epoch: 41599, Training Loss: 0.00710
Epoch: 41599, Training Loss: 0.00664
Epoch: 41599, Training Loss: 0.00811
Epoch: 41599, Training Loss: 0.00629
Epoch: 41600, Training Loss: 0.00710
Epoch: 41600, Training Loss: 0.00664
Epoch: 41600, Training Loss: 0.00811
Epoch: 41600, Training Loss: 0.00629
Epoch: 41601, Training Loss: 0.00710
Epoch: 41601, Training Loss: 0.00664
Epoch: 41601, Training Loss: 0.00811
Epoch: 41601, Training Loss: 0.00629
Epoch: 41602, Training Loss: 0.00710
Epoch: 41602, Training Loss: 0.00664
Epoch: 41602, Training Loss: 0.00811
Epoch: 41602, Training Loss: 0.00629
Epoch: 41603, Training Loss: 0.00710
Epoch: 41603, Training Loss: 0.00664
Epoch: 41603, Training Loss: 0.00811
Epoch: 41603, Training Loss: 0.00629
Epoch: 41604, Training Loss: 0.00710
Epoch: 41604, Training Loss: 0.00664
Epoch: 41604, Training Loss: 0.00811
Epoch: 41604, Training Loss: 0.00629
Epoch: 41605, Training Loss: 0.00710
Epoch: 41605, Training Loss: 0.00664
Epoch: 41605, Training Loss: 0.00811
Epoch: 41605, Training Loss: 0.00629
Epoch: 41606, Training Loss: 0.00710
Epoch: 41606, Training Loss: 0.00664
Epoch: 41606, Training Loss: 0.00811
Epoch: 41606, Training Loss: 0.00629
Epoch: 41607, Training Loss: 0.00710
Epoch: 41607, Training Loss: 0.00664
Epoch: 41607, Training Loss: 0.00811
Epoch: 41607, Training Loss: 0.00629
Epoch: 41608, Training Loss: 0.00710
Epoch: 41608, Training Loss: 0.00664
Epoch: 41608, Training Loss: 0.00811
Epoch: 41608, Training Loss: 0.00629
Epoch: 41609, Training Loss: 0.00710
Epoch: 41609, Training Loss: 0.00664
Epoch: 41609, Training Loss: 0.00811
Epoch: 41609, Training Loss: 0.00629
Epoch: 41610, Training Loss: 0.00710
Epoch: 41610, Training Loss: 0.00664
Epoch: 41610, Training Loss: 0.00811
Epoch: 41610, Training Loss: 0.00629
Epoch: 41611, Training Loss: 0.00709
Epoch: 41611, Training Loss: 0.00664
Epoch: 41611, Training Loss: 0.00811
Epoch: 41611, Training Loss: 0.00629
Epoch: 41612, Training Loss: 0.00709
Epoch: 41612, Training Loss: 0.00664
Epoch: 41612, Training Loss: 0.00811
Epoch: 41612, Training Loss: 0.00629
Epoch: 41613, Training Loss: 0.00709
Epoch: 41613, Training Loss: 0.00664
Epoch: 41613, Training Loss: 0.00811
Epoch: 41613, Training Loss: 0.00629
Epoch: 41614, Training Loss: 0.00709
Epoch: 41614, Training Loss: 0.00664
Epoch: 41614, Training Loss: 0.00811
Epoch: 41614, Training Loss: 0.00629
Epoch: 41615, Training Loss: 0.00709
Epoch: 41615, Training Loss: 0.00664
Epoch: 41615, Training Loss: 0.00811
Epoch: 41615, Training Loss: 0.00629
Epoch: 41616, Training Loss: 0.00709
Epoch: 41616, Training Loss: 0.00664
Epoch: 41616, Training Loss: 0.00811
Epoch: 41616, Training Loss: 0.00629
Epoch: 41617, Training Loss: 0.00709
Epoch: 41617, Training Loss: 0.00664
Epoch: 41617, Training Loss: 0.00811
Epoch: 41617, Training Loss: 0.00629
Epoch: 41618, Training Loss: 0.00709
Epoch: 41618, Training Loss: 0.00664
Epoch: 41618, Training Loss: 0.00811
Epoch: 41618, Training Loss: 0.00629
Epoch: 41619, Training Loss: 0.00709
Epoch: 41619, Training Loss: 0.00664
Epoch: 41619, Training Loss: 0.00811
Epoch: 41619, Training Loss: 0.00629
Epoch: 41620, Training Loss: 0.00709
Epoch: 41620, Training Loss: 0.00664
Epoch: 41620, Training Loss: 0.00811
Epoch: 41620, Training Loss: 0.00629
Epoch: 41621, Training Loss: 0.00709
Epoch: 41621, Training Loss: 0.00664
Epoch: 41621, Training Loss: 0.00811
Epoch: 41621, Training Loss: 0.00629
Epoch: 41622, Training Loss: 0.00709
Epoch: 41622, Training Loss: 0.00664
Epoch: 41622, Training Loss: 0.00811
Epoch: 41622, Training Loss: 0.00629
Epoch: 41623, Training Loss: 0.00709
Epoch: 41623, Training Loss: 0.00664
Epoch: 41623, Training Loss: 0.00811
Epoch: 41623, Training Loss: 0.00629
Epoch: 41624, Training Loss: 0.00709
Epoch: 41624, Training Loss: 0.00664
Epoch: 41624, Training Loss: 0.00811
Epoch: 41624, Training Loss: 0.00629
Epoch: 41625, Training Loss: 0.00709
Epoch: 41625, Training Loss: 0.00664
Epoch: 41625, Training Loss: 0.00811
Epoch: 41625, Training Loss: 0.00629
Epoch: 41626, Training Loss: 0.00709
Epoch: 41626, Training Loss: 0.00664
Epoch: 41626, Training Loss: 0.00811
Epoch: 41626, Training Loss: 0.00629
Epoch: 41627, Training Loss: 0.00709
Epoch: 41627, Training Loss: 0.00664
Epoch: 41627, Training Loss: 0.00811
Epoch: 41627, Training Loss: 0.00629
Epoch: 41628, Training Loss: 0.00709
Epoch: 41628, Training Loss: 0.00664
Epoch: 41628, Training Loss: 0.00811
Epoch: 41628, Training Loss: 0.00629
Epoch: 41629, Training Loss: 0.00709
Epoch: 41629, Training Loss: 0.00664
Epoch: 41629, Training Loss: 0.00811
Epoch: 41629, Training Loss: 0.00629
Epoch: 41630, Training Loss: 0.00709
Epoch: 41630, Training Loss: 0.00664
Epoch: 41630, Training Loss: 0.00811
Epoch: 41630, Training Loss: 0.00629
Epoch: 41631, Training Loss: 0.00709
Epoch: 41631, Training Loss: 0.00664
Epoch: 41631, Training Loss: 0.00811
Epoch: 41631, Training Loss: 0.00629
Epoch: 41632, Training Loss: 0.00709
Epoch: 41632, Training Loss: 0.00664
Epoch: 41632, Training Loss: 0.00811
Epoch: 41632, Training Loss: 0.00629
Epoch: 41633, Training Loss: 0.00709
Epoch: 41633, Training Loss: 0.00664
Epoch: 41633, Training Loss: 0.00811
Epoch: 41633, Training Loss: 0.00629
Epoch: 41634, Training Loss: 0.00709
Epoch: 41634, Training Loss: 0.00664
Epoch: 41634, Training Loss: 0.00810
Epoch: 41634, Training Loss: 0.00629
Epoch: 41635, Training Loss: 0.00709
Epoch: 41635, Training Loss: 0.00664
Epoch: 41635, Training Loss: 0.00810
Epoch: 41635, Training Loss: 0.00629
Epoch: 41636, Training Loss: 0.00709
Epoch: 41636, Training Loss: 0.00664
Epoch: 41636, Training Loss: 0.00810
Epoch: 41636, Training Loss: 0.00629
Epoch: 41637, Training Loss: 0.00709
Epoch: 41637, Training Loss: 0.00664
Epoch: 41637, Training Loss: 0.00810
Epoch: 41637, Training Loss: 0.00629
Epoch: 41638, Training Loss: 0.00709
Epoch: 41638, Training Loss: 0.00664
Epoch: 41638, Training Loss: 0.00810
Epoch: 41638, Training Loss: 0.00629
Epoch: 41639, Training Loss: 0.00709
Epoch: 41639, Training Loss: 0.00664
Epoch: 41639, Training Loss: 0.00810
Epoch: 41639, Training Loss: 0.00629
Epoch: 41640, Training Loss: 0.00709
Epoch: 41640, Training Loss: 0.00664
Epoch: 41640, Training Loss: 0.00810
Epoch: 41640, Training Loss: 0.00629
Epoch: 41641, Training Loss: 0.00709
Epoch: 41641, Training Loss: 0.00664
Epoch: 41641, Training Loss: 0.00810
Epoch: 41641, Training Loss: 0.00629
Epoch: 41642, Training Loss: 0.00709
Epoch: 41642, Training Loss: 0.00664
Epoch: 41642, Training Loss: 0.00810
Epoch: 41642, Training Loss: 0.00629
Epoch: 41643, Training Loss: 0.00709
Epoch: 41643, Training Loss: 0.00664
Epoch: 41643, Training Loss: 0.00810
Epoch: 41643, Training Loss: 0.00629
Epoch: 41644, Training Loss: 0.00709
Epoch: 41644, Training Loss: 0.00664
Epoch: 41644, Training Loss: 0.00810
Epoch: 41644, Training Loss: 0.00629
Epoch: 41645, Training Loss: 0.00709
Epoch: 41645, Training Loss: 0.00664
Epoch: 41645, Training Loss: 0.00810
Epoch: 41645, Training Loss: 0.00629
Epoch: 41646, Training Loss: 0.00709
Epoch: 41646, Training Loss: 0.00664
Epoch: 41646, Training Loss: 0.00810
Epoch: 41646, Training Loss: 0.00629
Epoch: 41647, Training Loss: 0.00709
Epoch: 41647, Training Loss: 0.00664
Epoch: 41647, Training Loss: 0.00810
Epoch: 41647, Training Loss: 0.00629
Epoch: 41648, Training Loss: 0.00709
Epoch: 41648, Training Loss: 0.00664
Epoch: 41648, Training Loss: 0.00810
Epoch: 41648, Training Loss: 0.00629
Epoch: 41649, Training Loss: 0.00709
Epoch: 41649, Training Loss: 0.00664
Epoch: 41649, Training Loss: 0.00810
Epoch: 41649, Training Loss: 0.00629
Epoch: 41650, Training Loss: 0.00709
Epoch: 41650, Training Loss: 0.00664
Epoch: 41650, Training Loss: 0.00810
Epoch: 41650, Training Loss: 0.00629
Epoch: 41651, Training Loss: 0.00709
Epoch: 41651, Training Loss: 0.00664
Epoch: 41651, Training Loss: 0.00810
Epoch: 41651, Training Loss: 0.00629
Epoch: 41652, Training Loss: 0.00709
Epoch: 41652, Training Loss: 0.00664
Epoch: 41652, Training Loss: 0.00810
Epoch: 41652, Training Loss: 0.00629
Epoch: 41653, Training Loss: 0.00709
Epoch: 41653, Training Loss: 0.00664
Epoch: 41653, Training Loss: 0.00810
Epoch: 41653, Training Loss: 0.00629
Epoch: 41654, Training Loss: 0.00709
Epoch: 41654, Training Loss: 0.00664
Epoch: 41654, Training Loss: 0.00810
Epoch: 41654, Training Loss: 0.00629
Epoch: 41655, Training Loss: 0.00709
Epoch: 41655, Training Loss: 0.00664
Epoch: 41655, Training Loss: 0.00810
Epoch: 41655, Training Loss: 0.00629
Epoch: 41656, Training Loss: 0.00709
Epoch: 41656, Training Loss: 0.00664
Epoch: 41656, Training Loss: 0.00810
Epoch: 41656, Training Loss: 0.00629
Epoch: 41657, Training Loss: 0.00709
Epoch: 41657, Training Loss: 0.00664
Epoch: 41657, Training Loss: 0.00810
Epoch: 41657, Training Loss: 0.00628
Epoch: 41658, Training Loss: 0.00709
Epoch: 41658, Training Loss: 0.00664
Epoch: 41658, Training Loss: 0.00810
Epoch: 41658, Training Loss: 0.00628
Epoch: 41659, Training Loss: 0.00709
Epoch: 41659, Training Loss: 0.00664
Epoch: 41659, Training Loss: 0.00810
Epoch: 41659, Training Loss: 0.00628
Epoch: 41660, Training Loss: 0.00709
Epoch: 41660, Training Loss: 0.00664
Epoch: 41660, Training Loss: 0.00810
Epoch: 41660, Training Loss: 0.00628
Epoch: 41661, Training Loss: 0.00709
Epoch: 41661, Training Loss: 0.00664
Epoch: 41661, Training Loss: 0.00810
Epoch: 41661, Training Loss: 0.00628
Epoch: 41662, Training Loss: 0.00709
Epoch: 41662, Training Loss: 0.00664
Epoch: 41662, Training Loss: 0.00810
Epoch: 41662, Training Loss: 0.00628
Epoch: 41663, Training Loss: 0.00709
Epoch: 41663, Training Loss: 0.00664
Epoch: 41663, Training Loss: 0.00810
Epoch: 41663, Training Loss: 0.00628
Epoch: 41664, Training Loss: 0.00709
Epoch: 41664, Training Loss: 0.00664
Epoch: 41664, Training Loss: 0.00810
Epoch: 41664, Training Loss: 0.00628
Epoch: 41665, Training Loss: 0.00709
Epoch: 41665, Training Loss: 0.00664
Epoch: 41665, Training Loss: 0.00810
Epoch: 41665, Training Loss: 0.00628
Epoch: 41666, Training Loss: 0.00709
Epoch: 41666, Training Loss: 0.00664
Epoch: 41666, Training Loss: 0.00810
Epoch: 41666, Training Loss: 0.00628
Epoch: 41667, Training Loss: 0.00709
Epoch: 41667, Training Loss: 0.00664
Epoch: 41667, Training Loss: 0.00810
Epoch: 41667, Training Loss: 0.00628
Epoch: 41668, Training Loss: 0.00709
Epoch: 41668, Training Loss: 0.00664
Epoch: 41668, Training Loss: 0.00810
Epoch: 41668, Training Loss: 0.00628
Epoch: 41669, Training Loss: 0.00709
Epoch: 41669, Training Loss: 0.00664
Epoch: 41669, Training Loss: 0.00810
Epoch: 41669, Training Loss: 0.00628
Epoch: 41670, Training Loss: 0.00709
Epoch: 41670, Training Loss: 0.00664
Epoch: 41670, Training Loss: 0.00810
Epoch: 41670, Training Loss: 0.00628
Epoch: 41671, Training Loss: 0.00709
Epoch: 41671, Training Loss: 0.00664
Epoch: 41671, Training Loss: 0.00810
Epoch: 41671, Training Loss: 0.00628
Epoch: 41672, Training Loss: 0.00709
Epoch: 41672, Training Loss: 0.00664
Epoch: 41672, Training Loss: 0.00810
Epoch: 41672, Training Loss: 0.00628
Epoch: 41673, Training Loss: 0.00709
Epoch: 41673, Training Loss: 0.00664
Epoch: 41673, Training Loss: 0.00810
Epoch: 41673, Training Loss: 0.00628
Epoch: 41674, Training Loss: 0.00709
Epoch: 41674, Training Loss: 0.00664
Epoch: 41674, Training Loss: 0.00810
Epoch: 41674, Training Loss: 0.00628
Epoch: 41675, Training Loss: 0.00709
Epoch: 41675, Training Loss: 0.00664
Epoch: 41675, Training Loss: 0.00810
Epoch: 41675, Training Loss: 0.00628
Epoch: 41676, Training Loss: 0.00709
Epoch: 41676, Training Loss: 0.00664
Epoch: 41676, Training Loss: 0.00810
Epoch: 41676, Training Loss: 0.00628
Epoch: 41677, Training Loss: 0.00709
Epoch: 41677, Training Loss: 0.00664
Epoch: 41677, Training Loss: 0.00810
Epoch: 41677, Training Loss: 0.00628
Epoch: 41678, Training Loss: 0.00709
Epoch: 41678, Training Loss: 0.00664
Epoch: 41678, Training Loss: 0.00810
Epoch: 41678, Training Loss: 0.00628
Epoch: 41679, Training Loss: 0.00709
Epoch: 41679, Training Loss: 0.00664
Epoch: 41679, Training Loss: 0.00810
Epoch: 41679, Training Loss: 0.00628
Epoch: 41680, Training Loss: 0.00709
Epoch: 41680, Training Loss: 0.00664
Epoch: 41680, Training Loss: 0.00810
Epoch: 41680, Training Loss: 0.00628
Epoch: 41681, Training Loss: 0.00709
Epoch: 41681, Training Loss: 0.00664
Epoch: 41681, Training Loss: 0.00810
Epoch: 41681, Training Loss: 0.00628
Epoch: 41682, Training Loss: 0.00709
Epoch: 41682, Training Loss: 0.00664
Epoch: 41682, Training Loss: 0.00810
Epoch: 41682, Training Loss: 0.00628
Epoch: 41683, Training Loss: 0.00709
Epoch: 41683, Training Loss: 0.00664
Epoch: 41683, Training Loss: 0.00810
Epoch: 41683, Training Loss: 0.00628
Epoch: 41684, Training Loss: 0.00709
Epoch: 41684, Training Loss: 0.00664
Epoch: 41684, Training Loss: 0.00810
Epoch: 41684, Training Loss: 0.00628
Epoch: 41685, Training Loss: 0.00709
Epoch: 41685, Training Loss: 0.00664
Epoch: 41685, Training Loss: 0.00810
Epoch: 41685, Training Loss: 0.00628
Epoch: 41686, Training Loss: 0.00709
Epoch: 41686, Training Loss: 0.00664
Epoch: 41686, Training Loss: 0.00810
Epoch: 41686, Training Loss: 0.00628
Epoch: 41687, Training Loss: 0.00709
Epoch: 41687, Training Loss: 0.00664
Epoch: 41687, Training Loss: 0.00810
Epoch: 41687, Training Loss: 0.00628
Epoch: 41688, Training Loss: 0.00709
Epoch: 41688, Training Loss: 0.00664
Epoch: 41688, Training Loss: 0.00810
Epoch: 41688, Training Loss: 0.00628
Epoch: 41689, Training Loss: 0.00709
Epoch: 41689, Training Loss: 0.00664
Epoch: 41689, Training Loss: 0.00810
Epoch: 41689, Training Loss: 0.00628
Epoch: 41690, Training Loss: 0.00709
Epoch: 41690, Training Loss: 0.00664
Epoch: 41690, Training Loss: 0.00810
Epoch: 41690, Training Loss: 0.00628
Epoch: 41691, Training Loss: 0.00709
Epoch: 41691, Training Loss: 0.00664
Epoch: 41691, Training Loss: 0.00810
Epoch: 41691, Training Loss: 0.00628
Epoch: 41692, Training Loss: 0.00709
Epoch: 41692, Training Loss: 0.00664
Epoch: 41692, Training Loss: 0.00810
Epoch: 41692, Training Loss: 0.00628
Epoch: 41693, Training Loss: 0.00709
Epoch: 41693, Training Loss: 0.00664
Epoch: 41693, Training Loss: 0.00810
Epoch: 41693, Training Loss: 0.00628
Epoch: 41694, Training Loss: 0.00709
Epoch: 41694, Training Loss: 0.00664
Epoch: 41694, Training Loss: 0.00810
Epoch: 41694, Training Loss: 0.00628
Epoch: 41695, Training Loss: 0.00709
Epoch: 41695, Training Loss: 0.00664
Epoch: 41695, Training Loss: 0.00810
Epoch: 41695, Training Loss: 0.00628
Epoch: 41696, Training Loss: 0.00709
Epoch: 41696, Training Loss: 0.00664
Epoch: 41696, Training Loss: 0.00810
Epoch: 41696, Training Loss: 0.00628
Epoch: 41697, Training Loss: 0.00709
Epoch: 41697, Training Loss: 0.00664
Epoch: 41697, Training Loss: 0.00810
Epoch: 41697, Training Loss: 0.00628
Epoch: 41698, Training Loss: 0.00709
Epoch: 41698, Training Loss: 0.00664
Epoch: 41698, Training Loss: 0.00810
Epoch: 41698, Training Loss: 0.00628
Epoch: 41699, Training Loss: 0.00709
Epoch: 41699, Training Loss: 0.00663
Epoch: 41699, Training Loss: 0.00810
Epoch: 41699, Training Loss: 0.00628
Epoch: 41700, Training Loss: 0.00709
Epoch: 41700, Training Loss: 0.00663
Epoch: 41700, Training Loss: 0.00810
Epoch: 41700, Training Loss: 0.00628
Epoch: 41701, Training Loss: 0.00709
Epoch: 41701, Training Loss: 0.00663
Epoch: 41701, Training Loss: 0.00810
Epoch: 41701, Training Loss: 0.00628
Epoch: 41702, Training Loss: 0.00709
Epoch: 41702, Training Loss: 0.00663
Epoch: 41702, Training Loss: 0.00810
Epoch: 41702, Training Loss: 0.00628
Epoch: 41703, Training Loss: 0.00709
Epoch: 41703, Training Loss: 0.00663
Epoch: 41703, Training Loss: 0.00810
Epoch: 41703, Training Loss: 0.00628
Epoch: 41704, Training Loss: 0.00709
Epoch: 41704, Training Loss: 0.00663
Epoch: 41704, Training Loss: 0.00810
Epoch: 41704, Training Loss: 0.00628
Epoch: 41705, Training Loss: 0.00709
Epoch: 41705, Training Loss: 0.00663
Epoch: 41705, Training Loss: 0.00810
Epoch: 41705, Training Loss: 0.00628
Epoch: 41706, Training Loss: 0.00709
Epoch: 41706, Training Loss: 0.00663
Epoch: 41706, Training Loss: 0.00810
Epoch: 41706, Training Loss: 0.00628
Epoch: 41707, Training Loss: 0.00709
Epoch: 41707, Training Loss: 0.00663
Epoch: 41707, Training Loss: 0.00810
Epoch: 41707, Training Loss: 0.00628
Epoch: 41708, Training Loss: 0.00709
Epoch: 41708, Training Loss: 0.00663
Epoch: 41708, Training Loss: 0.00810
Epoch: 41708, Training Loss: 0.00628
Epoch: 41709, Training Loss: 0.00709
Epoch: 41709, Training Loss: 0.00663
Epoch: 41709, Training Loss: 0.00810
Epoch: 41709, Training Loss: 0.00628
Epoch: 41710, Training Loss: 0.00709
Epoch: 41710, Training Loss: 0.00663
Epoch: 41710, Training Loss: 0.00810
Epoch: 41710, Training Loss: 0.00628
Epoch: 41711, Training Loss: 0.00709
Epoch: 41711, Training Loss: 0.00663
Epoch: 41711, Training Loss: 0.00810
Epoch: 41711, Training Loss: 0.00628
Epoch: 41712, Training Loss: 0.00709
Epoch: 41712, Training Loss: 0.00663
Epoch: 41712, Training Loss: 0.00810
Epoch: 41712, Training Loss: 0.00628
Epoch: 41713, Training Loss: 0.00709
Epoch: 41713, Training Loss: 0.00663
Epoch: 41713, Training Loss: 0.00810
Epoch: 41713, Training Loss: 0.00628
Epoch: 41714, Training Loss: 0.00709
Epoch: 41714, Training Loss: 0.00663
Epoch: 41714, Training Loss: 0.00810
Epoch: 41714, Training Loss: 0.00628
Epoch: 41715, Training Loss: 0.00709
Epoch: 41715, Training Loss: 0.00663
Epoch: 41715, Training Loss: 0.00810
Epoch: 41715, Training Loss: 0.00628
Epoch: 41716, Training Loss: 0.00709
Epoch: 41716, Training Loss: 0.00663
Epoch: 41716, Training Loss: 0.00810
Epoch: 41716, Training Loss: 0.00628
Epoch: 41717, Training Loss: 0.00709
Epoch: 41717, Training Loss: 0.00663
Epoch: 41717, Training Loss: 0.00810
Epoch: 41717, Training Loss: 0.00628
Epoch: 41718, Training Loss: 0.00709
Epoch: 41718, Training Loss: 0.00663
Epoch: 41718, Training Loss: 0.00810
Epoch: 41718, Training Loss: 0.00628
Epoch: 41719, Training Loss: 0.00709
Epoch: 41719, Training Loss: 0.00663
Epoch: 41719, Training Loss: 0.00810
Epoch: 41719, Training Loss: 0.00628
Epoch: 41720, Training Loss: 0.00709
Epoch: 41720, Training Loss: 0.00663
Epoch: 41720, Training Loss: 0.00810
Epoch: 41720, Training Loss: 0.00628
Epoch: 41721, Training Loss: 0.00709
Epoch: 41721, Training Loss: 0.00663
Epoch: 41721, Training Loss: 0.00810
Epoch: 41721, Training Loss: 0.00628
Epoch: 41722, Training Loss: 0.00709
Epoch: 41722, Training Loss: 0.00663
Epoch: 41722, Training Loss: 0.00810
Epoch: 41722, Training Loss: 0.00628
Epoch: 41723, Training Loss: 0.00708
Epoch: 41723, Training Loss: 0.00663
Epoch: 41723, Training Loss: 0.00810
Epoch: 41723, Training Loss: 0.00628
Epoch: 41724, Training Loss: 0.00708
Epoch: 41724, Training Loss: 0.00663
Epoch: 41724, Training Loss: 0.00810
Epoch: 41724, Training Loss: 0.00628
Epoch: 41725, Training Loss: 0.00708
Epoch: 41725, Training Loss: 0.00663
Epoch: 41725, Training Loss: 0.00810
Epoch: 41725, Training Loss: 0.00628
Epoch: 41726, Training Loss: 0.00708
Epoch: 41726, Training Loss: 0.00663
Epoch: 41726, Training Loss: 0.00810
Epoch: 41726, Training Loss: 0.00628
Epoch: 41727, Training Loss: 0.00708
Epoch: 41727, Training Loss: 0.00663
Epoch: 41727, Training Loss: 0.00810
Epoch: 41727, Training Loss: 0.00628
Epoch: 41728, Training Loss: 0.00708
Epoch: 41728, Training Loss: 0.00663
Epoch: 41728, Training Loss: 0.00810
Epoch: 41728, Training Loss: 0.00628
Epoch: 41729, Training Loss: 0.00708
Epoch: 41729, Training Loss: 0.00663
Epoch: 41729, Training Loss: 0.00810
Epoch: 41729, Training Loss: 0.00628
Epoch: 41730, Training Loss: 0.00708
Epoch: 41730, Training Loss: 0.00663
Epoch: 41730, Training Loss: 0.00810
Epoch: 41730, Training Loss: 0.00628
Epoch: 41731, Training Loss: 0.00708
Epoch: 41731, Training Loss: 0.00663
Epoch: 41731, Training Loss: 0.00810
Epoch: 41731, Training Loss: 0.00628
Epoch: 41732, Training Loss: 0.00708
Epoch: 41732, Training Loss: 0.00663
Epoch: 41732, Training Loss: 0.00809
Epoch: 41732, Training Loss: 0.00628
Epoch: 41733, Training Loss: 0.00708
Epoch: 41733, Training Loss: 0.00663
Epoch: 41733, Training Loss: 0.00809
Epoch: 41733, Training Loss: 0.00628
Epoch: 41734, Training Loss: 0.00708
Epoch: 41734, Training Loss: 0.00663
Epoch: 41734, Training Loss: 0.00809
Epoch: 41734, Training Loss: 0.00628
Epoch: 41735, Training Loss: 0.00708
Epoch: 41735, Training Loss: 0.00663
Epoch: 41735, Training Loss: 0.00809
Epoch: 41735, Training Loss: 0.00628
Epoch: 41736, Training Loss: 0.00708
Epoch: 41736, Training Loss: 0.00663
Epoch: 41736, Training Loss: 0.00809
Epoch: 41736, Training Loss: 0.00628
Epoch: 41737, Training Loss: 0.00708
Epoch: 41737, Training Loss: 0.00663
Epoch: 41737, Training Loss: 0.00809
Epoch: 41737, Training Loss: 0.00628
Epoch: 41738, Training Loss: 0.00708
Epoch: 41738, Training Loss: 0.00663
Epoch: 41738, Training Loss: 0.00809
Epoch: 41738, Training Loss: 0.00628
Epoch: 41739, Training Loss: 0.00708
Epoch: 41739, Training Loss: 0.00663
Epoch: 41739, Training Loss: 0.00809
Epoch: 41739, Training Loss: 0.00628
Epoch: 41740, Training Loss: 0.00708
Epoch: 41740, Training Loss: 0.00663
Epoch: 41740, Training Loss: 0.00809
Epoch: 41740, Training Loss: 0.00628
Epoch: 41741, Training Loss: 0.00708
Epoch: 41741, Training Loss: 0.00663
Epoch: 41741, Training Loss: 0.00809
Epoch: 41741, Training Loss: 0.00628
Epoch: 41742, Training Loss: 0.00708
Epoch: 41742, Training Loss: 0.00663
Epoch: 41742, Training Loss: 0.00809
Epoch: 41742, Training Loss: 0.00628
Epoch: 41743, Training Loss: 0.00708
Epoch: 41743, Training Loss: 0.00663
Epoch: 41743, Training Loss: 0.00809
Epoch: 41743, Training Loss: 0.00628
Epoch: 41744, Training Loss: 0.00708
Epoch: 41744, Training Loss: 0.00663
Epoch: 41744, Training Loss: 0.00809
Epoch: 41744, Training Loss: 0.00628
Epoch: 41745, Training Loss: 0.00708
Epoch: 41745, Training Loss: 0.00663
Epoch: 41745, Training Loss: 0.00809
Epoch: 41745, Training Loss: 0.00628
Epoch: 41746, Training Loss: 0.00708
Epoch: 41746, Training Loss: 0.00663
Epoch: 41746, Training Loss: 0.00809
Epoch: 41746, Training Loss: 0.00628
Epoch: 41747, Training Loss: 0.00708
Epoch: 41747, Training Loss: 0.00663
Epoch: 41747, Training Loss: 0.00809
Epoch: 41747, Training Loss: 0.00628
Epoch: 41748, Training Loss: 0.00708
Epoch: 41748, Training Loss: 0.00663
Epoch: 41748, Training Loss: 0.00809
Epoch: 41748, Training Loss: 0.00628
Epoch: 41749, Training Loss: 0.00708
Epoch: 41749, Training Loss: 0.00663
Epoch: 41749, Training Loss: 0.00809
Epoch: 41749, Training Loss: 0.00628
Epoch: 41750, Training Loss: 0.00708
Epoch: 41750, Training Loss: 0.00663
Epoch: 41750, Training Loss: 0.00809
Epoch: 41750, Training Loss: 0.00628
Epoch: 41751, Training Loss: 0.00708
Epoch: 41751, Training Loss: 0.00663
Epoch: 41751, Training Loss: 0.00809
Epoch: 41751, Training Loss: 0.00628
Epoch: 41752, Training Loss: 0.00708
Epoch: 41752, Training Loss: 0.00663
Epoch: 41752, Training Loss: 0.00809
Epoch: 41752, Training Loss: 0.00628
Epoch: 41753, Training Loss: 0.00708
Epoch: 41753, Training Loss: 0.00663
Epoch: 41753, Training Loss: 0.00809
Epoch: 41753, Training Loss: 0.00628
Epoch: 41754, Training Loss: 0.00708
Epoch: 41754, Training Loss: 0.00663
Epoch: 41754, Training Loss: 0.00809
Epoch: 41754, Training Loss: 0.00628
Epoch: 41755, Training Loss: 0.00708
Epoch: 41755, Training Loss: 0.00663
Epoch: 41755, Training Loss: 0.00809
Epoch: 41755, Training Loss: 0.00628
Epoch: 41756, Training Loss: 0.00708
Epoch: 41756, Training Loss: 0.00663
Epoch: 41756, Training Loss: 0.00809
Epoch: 41756, Training Loss: 0.00628
Epoch: 41757, Training Loss: 0.00708
Epoch: 41757, Training Loss: 0.00663
Epoch: 41757, Training Loss: 0.00809
Epoch: 41757, Training Loss: 0.00628
Epoch: 41758, Training Loss: 0.00708
Epoch: 41758, Training Loss: 0.00663
Epoch: 41758, Training Loss: 0.00809
Epoch: 41758, Training Loss: 0.00628
Epoch: 41759, Training Loss: 0.00708
Epoch: 41759, Training Loss: 0.00663
Epoch: 41759, Training Loss: 0.00809
Epoch: 41759, Training Loss: 0.00628
Epoch: 41760, Training Loss: 0.00708
Epoch: 41760, Training Loss: 0.00663
Epoch: 41760, Training Loss: 0.00809
Epoch: 41760, Training Loss: 0.00628
Epoch: 41761, Training Loss: 0.00708
Epoch: 41761, Training Loss: 0.00663
Epoch: 41761, Training Loss: 0.00809
Epoch: 41761, Training Loss: 0.00628
Epoch: 41762, Training Loss: 0.00708
Epoch: 41762, Training Loss: 0.00663
Epoch: 41762, Training Loss: 0.00809
Epoch: 41762, Training Loss: 0.00628
Epoch: 41763, Training Loss: 0.00708
Epoch: 41763, Training Loss: 0.00663
Epoch: 41763, Training Loss: 0.00809
Epoch: 41763, Training Loss: 0.00628
Epoch: 41764, Training Loss: 0.00708
Epoch: 41764, Training Loss: 0.00663
Epoch: 41764, Training Loss: 0.00809
Epoch: 41764, Training Loss: 0.00628
Epoch: 41765, Training Loss: 0.00708
Epoch: 41765, Training Loss: 0.00663
Epoch: 41765, Training Loss: 0.00809
Epoch: 41765, Training Loss: 0.00628
Epoch: 41766, Training Loss: 0.00708
Epoch: 41766, Training Loss: 0.00663
Epoch: 41766, Training Loss: 0.00809
Epoch: 41766, Training Loss: 0.00628
Epoch: 41767, Training Loss: 0.00708
Epoch: 41767, Training Loss: 0.00663
Epoch: 41767, Training Loss: 0.00809
Epoch: 41767, Training Loss: 0.00628
Epoch: 41768, Training Loss: 0.00708
Epoch: 41768, Training Loss: 0.00663
Epoch: 41768, Training Loss: 0.00809
Epoch: 41768, Training Loss: 0.00628
Epoch: 41769, Training Loss: 0.00708
Epoch: 41769, Training Loss: 0.00663
Epoch: 41769, Training Loss: 0.00809
Epoch: 41769, Training Loss: 0.00628
Epoch: 41770, Training Loss: 0.00708
Epoch: 41770, Training Loss: 0.00663
Epoch: 41770, Training Loss: 0.00809
Epoch: 41770, Training Loss: 0.00628
Epoch: 41771, Training Loss: 0.00708
Epoch: 41771, Training Loss: 0.00663
Epoch: 41771, Training Loss: 0.00809
Epoch: 41771, Training Loss: 0.00628
Epoch: 41772, Training Loss: 0.00708
Epoch: 41772, Training Loss: 0.00663
Epoch: 41772, Training Loss: 0.00809
Epoch: 41772, Training Loss: 0.00628
Epoch: 41773, Training Loss: 0.00708
Epoch: 41773, Training Loss: 0.00663
Epoch: 41773, Training Loss: 0.00809
Epoch: 41773, Training Loss: 0.00628
Epoch: 41774, Training Loss: 0.00708
Epoch: 41774, Training Loss: 0.00663
Epoch: 41774, Training Loss: 0.00809
Epoch: 41774, Training Loss: 0.00628
Epoch: 41775, Training Loss: 0.00708
Epoch: 41775, Training Loss: 0.00663
Epoch: 41775, Training Loss: 0.00809
Epoch: 41775, Training Loss: 0.00628
Epoch: 41776, Training Loss: 0.00708
Epoch: 41776, Training Loss: 0.00663
Epoch: 41776, Training Loss: 0.00809
Epoch: 41777, Training Loss: 0.00708
Epoch: 41777, Training Loss: 0.00663
Epoch: 41777, Training Loss: 0.00809
Epoch: 41777, Training Loss: 0.00628
...
Epoch: 42020, Training Loss: 0.00706
Epoch: 42020, Training Loss: 0.00661
[repetitive log output condensed: one loss line is printed per training sample each epoch; over epochs 41777-42020 the four per-sample losses fall only gradually, from (0.00708, 0.00663, 0.00809, 0.00628) to roughly (0.00706, 0.00661, 0.00807, 0.00626), so the worst-case sample is still just above the 0.008 target]
Epoch: 42020, Training Loss: 0.00807
Epoch: 42020, Training Loss: 0.00626
Epoch: 42021, Training Loss: 0.00706
Epoch: 42021, Training Loss: 0.00661
Epoch: 42021, Training Loss: 0.00807
Epoch: 42021, Training Loss: 0.00626
Epoch: 42022, Training Loss: 0.00706
Epoch: 42022, Training Loss: 0.00661
Epoch: 42022, Training Loss: 0.00807
Epoch: 42022, Training Loss: 0.00626
Epoch: 42023, Training Loss: 0.00706
Epoch: 42023, Training Loss: 0.00661
Epoch: 42023, Training Loss: 0.00807
Epoch: 42023, Training Loss: 0.00626
Epoch: 42024, Training Loss: 0.00706
Epoch: 42024, Training Loss: 0.00661
Epoch: 42024, Training Loss: 0.00807
Epoch: 42024, Training Loss: 0.00626
Epoch: 42025, Training Loss: 0.00706
Epoch: 42025, Training Loss: 0.00661
Epoch: 42025, Training Loss: 0.00807
Epoch: 42025, Training Loss: 0.00626
Epoch: 42026, Training Loss: 0.00706
Epoch: 42026, Training Loss: 0.00661
Epoch: 42026, Training Loss: 0.00807
Epoch: 42026, Training Loss: 0.00626
Epoch: 42027, Training Loss: 0.00706
Epoch: 42027, Training Loss: 0.00661
Epoch: 42027, Training Loss: 0.00807
Epoch: 42027, Training Loss: 0.00626
Epoch: 42028, Training Loss: 0.00706
Epoch: 42028, Training Loss: 0.00661
Epoch: 42028, Training Loss: 0.00807
Epoch: 42028, Training Loss: 0.00626
Epoch: 42029, Training Loss: 0.00706
Epoch: 42029, Training Loss: 0.00661
Epoch: 42029, Training Loss: 0.00806
Epoch: 42029, Training Loss: 0.00626
Epoch: 42030, Training Loss: 0.00706
Epoch: 42030, Training Loss: 0.00661
Epoch: 42030, Training Loss: 0.00806
Epoch: 42030, Training Loss: 0.00626
Epoch: 42031, Training Loss: 0.00706
Epoch: 42031, Training Loss: 0.00661
Epoch: 42031, Training Loss: 0.00806
Epoch: 42031, Training Loss: 0.00626
Epoch: 42032, Training Loss: 0.00706
Epoch: 42032, Training Loss: 0.00661
Epoch: 42032, Training Loss: 0.00806
Epoch: 42032, Training Loss: 0.00626
Epoch: 42033, Training Loss: 0.00706
Epoch: 42033, Training Loss: 0.00661
Epoch: 42033, Training Loss: 0.00806
Epoch: 42033, Training Loss: 0.00626
Epoch: 42034, Training Loss: 0.00706
Epoch: 42034, Training Loss: 0.00661
Epoch: 42034, Training Loss: 0.00806
Epoch: 42034, Training Loss: 0.00626
Epoch: 42035, Training Loss: 0.00706
Epoch: 42035, Training Loss: 0.00661
Epoch: 42035, Training Loss: 0.00806
Epoch: 42035, Training Loss: 0.00626
Epoch: 42036, Training Loss: 0.00706
Epoch: 42036, Training Loss: 0.00661
Epoch: 42036, Training Loss: 0.00806
Epoch: 42036, Training Loss: 0.00626
Epoch: 42037, Training Loss: 0.00706
Epoch: 42037, Training Loss: 0.00661
Epoch: 42037, Training Loss: 0.00806
Epoch: 42037, Training Loss: 0.00626
Epoch: 42038, Training Loss: 0.00706
Epoch: 42038, Training Loss: 0.00661
Epoch: 42038, Training Loss: 0.00806
Epoch: 42038, Training Loss: 0.00626
Epoch: 42039, Training Loss: 0.00706
Epoch: 42039, Training Loss: 0.00661
Epoch: 42039, Training Loss: 0.00806
Epoch: 42039, Training Loss: 0.00626
Epoch: 42040, Training Loss: 0.00706
Epoch: 42040, Training Loss: 0.00661
Epoch: 42040, Training Loss: 0.00806
Epoch: 42040, Training Loss: 0.00626
Epoch: 42041, Training Loss: 0.00706
Epoch: 42041, Training Loss: 0.00661
Epoch: 42041, Training Loss: 0.00806
Epoch: 42041, Training Loss: 0.00626
Epoch: 42042, Training Loss: 0.00706
Epoch: 42042, Training Loss: 0.00661
Epoch: 42042, Training Loss: 0.00806
Epoch: 42042, Training Loss: 0.00626
Epoch: 42043, Training Loss: 0.00706
Epoch: 42043, Training Loss: 0.00661
Epoch: 42043, Training Loss: 0.00806
Epoch: 42043, Training Loss: 0.00625
Epoch: 42044, Training Loss: 0.00706
Epoch: 42044, Training Loss: 0.00661
Epoch: 42044, Training Loss: 0.00806
Epoch: 42044, Training Loss: 0.00625
Epoch: 42045, Training Loss: 0.00706
Epoch: 42045, Training Loss: 0.00661
Epoch: 42045, Training Loss: 0.00806
Epoch: 42045, Training Loss: 0.00625
Epoch: 42046, Training Loss: 0.00706
Epoch: 42046, Training Loss: 0.00661
Epoch: 42046, Training Loss: 0.00806
Epoch: 42046, Training Loss: 0.00625
Epoch: 42047, Training Loss: 0.00706
Epoch: 42047, Training Loss: 0.00661
Epoch: 42047, Training Loss: 0.00806
Epoch: 42047, Training Loss: 0.00625
Epoch: 42048, Training Loss: 0.00706
Epoch: 42048, Training Loss: 0.00661
Epoch: 42048, Training Loss: 0.00806
Epoch: 42048, Training Loss: 0.00625
Epoch: 42049, Training Loss: 0.00706
Epoch: 42049, Training Loss: 0.00661
Epoch: 42049, Training Loss: 0.00806
Epoch: 42049, Training Loss: 0.00625
Epoch: 42050, Training Loss: 0.00706
Epoch: 42050, Training Loss: 0.00661
Epoch: 42050, Training Loss: 0.00806
Epoch: 42050, Training Loss: 0.00625
Epoch: 42051, Training Loss: 0.00706
Epoch: 42051, Training Loss: 0.00661
Epoch: 42051, Training Loss: 0.00806
Epoch: 42051, Training Loss: 0.00625
Epoch: 42052, Training Loss: 0.00706
Epoch: 42052, Training Loss: 0.00661
Epoch: 42052, Training Loss: 0.00806
Epoch: 42052, Training Loss: 0.00625
Epoch: 42053, Training Loss: 0.00706
Epoch: 42053, Training Loss: 0.00661
Epoch: 42053, Training Loss: 0.00806
Epoch: 42053, Training Loss: 0.00625
Epoch: 42054, Training Loss: 0.00706
Epoch: 42054, Training Loss: 0.00661
Epoch: 42054, Training Loss: 0.00806
Epoch: 42054, Training Loss: 0.00625
Epoch: 42055, Training Loss: 0.00706
Epoch: 42055, Training Loss: 0.00661
Epoch: 42055, Training Loss: 0.00806
Epoch: 42055, Training Loss: 0.00625
Epoch: 42056, Training Loss: 0.00706
Epoch: 42056, Training Loss: 0.00661
Epoch: 42056, Training Loss: 0.00806
Epoch: 42056, Training Loss: 0.00625
Epoch: 42057, Training Loss: 0.00706
Epoch: 42057, Training Loss: 0.00661
Epoch: 42057, Training Loss: 0.00806
Epoch: 42057, Training Loss: 0.00625
Epoch: 42058, Training Loss: 0.00706
Epoch: 42058, Training Loss: 0.00661
Epoch: 42058, Training Loss: 0.00806
Epoch: 42058, Training Loss: 0.00625
Epoch: 42059, Training Loss: 0.00706
Epoch: 42059, Training Loss: 0.00661
Epoch: 42059, Training Loss: 0.00806
Epoch: 42059, Training Loss: 0.00625
Epoch: 42060, Training Loss: 0.00706
Epoch: 42060, Training Loss: 0.00661
Epoch: 42060, Training Loss: 0.00806
Epoch: 42060, Training Loss: 0.00625
Epoch: 42061, Training Loss: 0.00705
Epoch: 42061, Training Loss: 0.00661
Epoch: 42061, Training Loss: 0.00806
Epoch: 42061, Training Loss: 0.00625
Epoch: 42062, Training Loss: 0.00705
Epoch: 42062, Training Loss: 0.00661
Epoch: 42062, Training Loss: 0.00806
Epoch: 42062, Training Loss: 0.00625
Epoch: 42063, Training Loss: 0.00705
Epoch: 42063, Training Loss: 0.00661
Epoch: 42063, Training Loss: 0.00806
Epoch: 42063, Training Loss: 0.00625
Epoch: 42064, Training Loss: 0.00705
Epoch: 42064, Training Loss: 0.00661
Epoch: 42064, Training Loss: 0.00806
Epoch: 42064, Training Loss: 0.00625
Epoch: 42065, Training Loss: 0.00705
Epoch: 42065, Training Loss: 0.00660
Epoch: 42065, Training Loss: 0.00806
Epoch: 42065, Training Loss: 0.00625
Epoch: 42066, Training Loss: 0.00705
Epoch: 42066, Training Loss: 0.00660
Epoch: 42066, Training Loss: 0.00806
Epoch: 42066, Training Loss: 0.00625
Epoch: 42067, Training Loss: 0.00705
Epoch: 42067, Training Loss: 0.00660
Epoch: 42067, Training Loss: 0.00806
Epoch: 42067, Training Loss: 0.00625
Epoch: 42068, Training Loss: 0.00705
Epoch: 42068, Training Loss: 0.00660
Epoch: 42068, Training Loss: 0.00806
Epoch: 42068, Training Loss: 0.00625
Epoch: 42069, Training Loss: 0.00705
Epoch: 42069, Training Loss: 0.00660
Epoch: 42069, Training Loss: 0.00806
Epoch: 42069, Training Loss: 0.00625
Epoch: 42070, Training Loss: 0.00705
Epoch: 42070, Training Loss: 0.00660
Epoch: 42070, Training Loss: 0.00806
Epoch: 42070, Training Loss: 0.00625
Epoch: 42071, Training Loss: 0.00705
Epoch: 42071, Training Loss: 0.00660
Epoch: 42071, Training Loss: 0.00806
Epoch: 42071, Training Loss: 0.00625
Epoch: 42072, Training Loss: 0.00705
Epoch: 42072, Training Loss: 0.00660
Epoch: 42072, Training Loss: 0.00806
Epoch: 42072, Training Loss: 0.00625
Epoch: 42073, Training Loss: 0.00705
Epoch: 42073, Training Loss: 0.00660
Epoch: 42073, Training Loss: 0.00806
Epoch: 42073, Training Loss: 0.00625
Epoch: 42074, Training Loss: 0.00705
Epoch: 42074, Training Loss: 0.00660
Epoch: 42074, Training Loss: 0.00806
Epoch: 42074, Training Loss: 0.00625
Epoch: 42075, Training Loss: 0.00705
Epoch: 42075, Training Loss: 0.00660
Epoch: 42075, Training Loss: 0.00806
Epoch: 42075, Training Loss: 0.00625
Epoch: 42076, Training Loss: 0.00705
Epoch: 42076, Training Loss: 0.00660
Epoch: 42076, Training Loss: 0.00806
Epoch: 42076, Training Loss: 0.00625
Epoch: 42077, Training Loss: 0.00705
Epoch: 42077, Training Loss: 0.00660
Epoch: 42077, Training Loss: 0.00806
Epoch: 42077, Training Loss: 0.00625
Epoch: 42078, Training Loss: 0.00705
Epoch: 42078, Training Loss: 0.00660
Epoch: 42078, Training Loss: 0.00806
Epoch: 42078, Training Loss: 0.00625
Epoch: 42079, Training Loss: 0.00705
Epoch: 42079, Training Loss: 0.00660
Epoch: 42079, Training Loss: 0.00806
Epoch: 42079, Training Loss: 0.00625
Epoch: 42080, Training Loss: 0.00705
Epoch: 42080, Training Loss: 0.00660
Epoch: 42080, Training Loss: 0.00806
Epoch: 42080, Training Loss: 0.00625
Epoch: 42081, Training Loss: 0.00705
Epoch: 42081, Training Loss: 0.00660
Epoch: 42081, Training Loss: 0.00806
Epoch: 42081, Training Loss: 0.00625
Epoch: 42082, Training Loss: 0.00705
Epoch: 42082, Training Loss: 0.00660
Epoch: 42082, Training Loss: 0.00806
Epoch: 42082, Training Loss: 0.00625
Epoch: 42083, Training Loss: 0.00705
Epoch: 42083, Training Loss: 0.00660
Epoch: 42083, Training Loss: 0.00806
Epoch: 42083, Training Loss: 0.00625
Epoch: 42084, Training Loss: 0.00705
Epoch: 42084, Training Loss: 0.00660
Epoch: 42084, Training Loss: 0.00806
Epoch: 42084, Training Loss: 0.00625
Epoch: 42085, Training Loss: 0.00705
Epoch: 42085, Training Loss: 0.00660
Epoch: 42085, Training Loss: 0.00806
Epoch: 42085, Training Loss: 0.00625
Epoch: 42086, Training Loss: 0.00705
Epoch: 42086, Training Loss: 0.00660
Epoch: 42086, Training Loss: 0.00806
Epoch: 42086, Training Loss: 0.00625
Epoch: 42087, Training Loss: 0.00705
Epoch: 42087, Training Loss: 0.00660
Epoch: 42087, Training Loss: 0.00806
Epoch: 42087, Training Loss: 0.00625
Epoch: 42088, Training Loss: 0.00705
Epoch: 42088, Training Loss: 0.00660
Epoch: 42088, Training Loss: 0.00806
Epoch: 42088, Training Loss: 0.00625
Epoch: 42089, Training Loss: 0.00705
Epoch: 42089, Training Loss: 0.00660
Epoch: 42089, Training Loss: 0.00806
Epoch: 42089, Training Loss: 0.00625
Epoch: 42090, Training Loss: 0.00705
Epoch: 42090, Training Loss: 0.00660
Epoch: 42090, Training Loss: 0.00806
Epoch: 42090, Training Loss: 0.00625
Epoch: 42091, Training Loss: 0.00705
Epoch: 42091, Training Loss: 0.00660
Epoch: 42091, Training Loss: 0.00806
Epoch: 42091, Training Loss: 0.00625
Epoch: 42092, Training Loss: 0.00705
Epoch: 42092, Training Loss: 0.00660
Epoch: 42092, Training Loss: 0.00806
Epoch: 42092, Training Loss: 0.00625
Epoch: 42093, Training Loss: 0.00705
Epoch: 42093, Training Loss: 0.00660
Epoch: 42093, Training Loss: 0.00806
Epoch: 42093, Training Loss: 0.00625
Epoch: 42094, Training Loss: 0.00705
Epoch: 42094, Training Loss: 0.00660
Epoch: 42094, Training Loss: 0.00806
Epoch: 42094, Training Loss: 0.00625
Epoch: 42095, Training Loss: 0.00705
Epoch: 42095, Training Loss: 0.00660
Epoch: 42095, Training Loss: 0.00806
Epoch: 42095, Training Loss: 0.00625
Epoch: 42096, Training Loss: 0.00705
Epoch: 42096, Training Loss: 0.00660
Epoch: 42096, Training Loss: 0.00806
Epoch: 42096, Training Loss: 0.00625
Epoch: 42097, Training Loss: 0.00705
Epoch: 42097, Training Loss: 0.00660
Epoch: 42097, Training Loss: 0.00806
Epoch: 42097, Training Loss: 0.00625
Epoch: 42098, Training Loss: 0.00705
Epoch: 42098, Training Loss: 0.00660
Epoch: 42098, Training Loss: 0.00806
Epoch: 42098, Training Loss: 0.00625
Epoch: 42099, Training Loss: 0.00705
Epoch: 42099, Training Loss: 0.00660
Epoch: 42099, Training Loss: 0.00806
Epoch: 42099, Training Loss: 0.00625
Epoch: 42100, Training Loss: 0.00705
Epoch: 42100, Training Loss: 0.00660
Epoch: 42100, Training Loss: 0.00806
Epoch: 42100, Training Loss: 0.00625
Epoch: 42101, Training Loss: 0.00705
Epoch: 42101, Training Loss: 0.00660
Epoch: 42101, Training Loss: 0.00806
Epoch: 42101, Training Loss: 0.00625
Epoch: 42102, Training Loss: 0.00705
Epoch: 42102, Training Loss: 0.00660
Epoch: 42102, Training Loss: 0.00806
Epoch: 42102, Training Loss: 0.00625
Epoch: 42103, Training Loss: 0.00705
Epoch: 42103, Training Loss: 0.00660
Epoch: 42103, Training Loss: 0.00806
Epoch: 42103, Training Loss: 0.00625
Epoch: 42104, Training Loss: 0.00705
Epoch: 42104, Training Loss: 0.00660
Epoch: 42104, Training Loss: 0.00806
Epoch: 42104, Training Loss: 0.00625
Epoch: 42105, Training Loss: 0.00705
Epoch: 42105, Training Loss: 0.00660
Epoch: 42105, Training Loss: 0.00806
Epoch: 42105, Training Loss: 0.00625
Epoch: 42106, Training Loss: 0.00705
Epoch: 42106, Training Loss: 0.00660
Epoch: 42106, Training Loss: 0.00806
Epoch: 42106, Training Loss: 0.00625
Epoch: 42107, Training Loss: 0.00705
Epoch: 42107, Training Loss: 0.00660
Epoch: 42107, Training Loss: 0.00806
Epoch: 42107, Training Loss: 0.00625
Epoch: 42108, Training Loss: 0.00705
Epoch: 42108, Training Loss: 0.00660
Epoch: 42108, Training Loss: 0.00806
Epoch: 42108, Training Loss: 0.00625
Epoch: 42109, Training Loss: 0.00705
Epoch: 42109, Training Loss: 0.00660
Epoch: 42109, Training Loss: 0.00806
Epoch: 42109, Training Loss: 0.00625
Epoch: 42110, Training Loss: 0.00705
Epoch: 42110, Training Loss: 0.00660
Epoch: 42110, Training Loss: 0.00806
Epoch: 42110, Training Loss: 0.00625
Epoch: 42111, Training Loss: 0.00705
Epoch: 42111, Training Loss: 0.00660
Epoch: 42111, Training Loss: 0.00806
Epoch: 42111, Training Loss: 0.00625
Epoch: 42112, Training Loss: 0.00705
Epoch: 42112, Training Loss: 0.00660
Epoch: 42112, Training Loss: 0.00806
Epoch: 42112, Training Loss: 0.00625
Epoch: 42113, Training Loss: 0.00705
Epoch: 42113, Training Loss: 0.00660
Epoch: 42113, Training Loss: 0.00806
Epoch: 42113, Training Loss: 0.00625
Epoch: 42114, Training Loss: 0.00705
Epoch: 42114, Training Loss: 0.00660
Epoch: 42114, Training Loss: 0.00806
Epoch: 42114, Training Loss: 0.00625
Epoch: 42115, Training Loss: 0.00705
Epoch: 42115, Training Loss: 0.00660
Epoch: 42115, Training Loss: 0.00806
Epoch: 42115, Training Loss: 0.00625
Epoch: 42116, Training Loss: 0.00705
Epoch: 42116, Training Loss: 0.00660
Epoch: 42116, Training Loss: 0.00806
Epoch: 42116, Training Loss: 0.00625
Epoch: 42117, Training Loss: 0.00705
Epoch: 42117, Training Loss: 0.00660
Epoch: 42117, Training Loss: 0.00806
Epoch: 42117, Training Loss: 0.00625
Epoch: 42118, Training Loss: 0.00705
Epoch: 42118, Training Loss: 0.00660
Epoch: 42118, Training Loss: 0.00806
Epoch: 42118, Training Loss: 0.00625
Epoch: 42119, Training Loss: 0.00705
Epoch: 42119, Training Loss: 0.00660
Epoch: 42119, Training Loss: 0.00806
Epoch: 42119, Training Loss: 0.00625
Epoch: 42120, Training Loss: 0.00705
Epoch: 42120, Training Loss: 0.00660
Epoch: 42120, Training Loss: 0.00806
Epoch: 42120, Training Loss: 0.00625
Epoch: 42121, Training Loss: 0.00705
Epoch: 42121, Training Loss: 0.00660
Epoch: 42121, Training Loss: 0.00806
Epoch: 42121, Training Loss: 0.00625
Epoch: 42122, Training Loss: 0.00705
Epoch: 42122, Training Loss: 0.00660
Epoch: 42122, Training Loss: 0.00806
Epoch: 42122, Training Loss: 0.00625
Epoch: 42123, Training Loss: 0.00705
Epoch: 42123, Training Loss: 0.00660
Epoch: 42123, Training Loss: 0.00806
Epoch: 42123, Training Loss: 0.00625
Epoch: 42124, Training Loss: 0.00705
Epoch: 42124, Training Loss: 0.00660
Epoch: 42124, Training Loss: 0.00806
Epoch: 42124, Training Loss: 0.00625
Epoch: 42125, Training Loss: 0.00705
Epoch: 42125, Training Loss: 0.00660
Epoch: 42125, Training Loss: 0.00806
Epoch: 42125, Training Loss: 0.00625
Epoch: 42126, Training Loss: 0.00705
Epoch: 42126, Training Loss: 0.00660
Epoch: 42126, Training Loss: 0.00806
Epoch: 42126, Training Loss: 0.00625
Epoch: 42127, Training Loss: 0.00705
Epoch: 42127, Training Loss: 0.00660
Epoch: 42127, Training Loss: 0.00806
Epoch: 42127, Training Loss: 0.00625
Epoch: 42128, Training Loss: 0.00705
Epoch: 42128, Training Loss: 0.00660
Epoch: 42128, Training Loss: 0.00805
Epoch: 42128, Training Loss: 0.00625
Epoch: 42129, Training Loss: 0.00705
Epoch: 42129, Training Loss: 0.00660
Epoch: 42129, Training Loss: 0.00805
Epoch: 42129, Training Loss: 0.00625
Epoch: 42130, Training Loss: 0.00705
Epoch: 42130, Training Loss: 0.00660
Epoch: 42130, Training Loss: 0.00805
Epoch: 42130, Training Loss: 0.00625
Epoch: 42131, Training Loss: 0.00705
Epoch: 42131, Training Loss: 0.00660
Epoch: 42131, Training Loss: 0.00805
Epoch: 42131, Training Loss: 0.00625
Epoch: 42132, Training Loss: 0.00705
Epoch: 42132, Training Loss: 0.00660
Epoch: 42132, Training Loss: 0.00805
Epoch: 42132, Training Loss: 0.00625
Epoch: 42133, Training Loss: 0.00705
Epoch: 42133, Training Loss: 0.00660
Epoch: 42133, Training Loss: 0.00805
Epoch: 42133, Training Loss: 0.00625
Epoch: 42134, Training Loss: 0.00705
Epoch: 42134, Training Loss: 0.00660
Epoch: 42134, Training Loss: 0.00805
Epoch: 42134, Training Loss: 0.00625
Epoch: 42135, Training Loss: 0.00705
Epoch: 42135, Training Loss: 0.00660
Epoch: 42135, Training Loss: 0.00805
Epoch: 42135, Training Loss: 0.00625
Epoch: 42136, Training Loss: 0.00705
Epoch: 42136, Training Loss: 0.00660
Epoch: 42136, Training Loss: 0.00805
Epoch: 42136, Training Loss: 0.00625
Epoch: 42137, Training Loss: 0.00705
Epoch: 42137, Training Loss: 0.00660
Epoch: 42137, Training Loss: 0.00805
Epoch: 42137, Training Loss: 0.00625
Epoch: 42138, Training Loss: 0.00705
Epoch: 42138, Training Loss: 0.00660
Epoch: 42138, Training Loss: 0.00805
Epoch: 42138, Training Loss: 0.00625
Epoch: 42139, Training Loss: 0.00705
Epoch: 42139, Training Loss: 0.00660
Epoch: 42139, Training Loss: 0.00805
Epoch: 42139, Training Loss: 0.00625
Epoch: 42140, Training Loss: 0.00705
Epoch: 42140, Training Loss: 0.00660
Epoch: 42140, Training Loss: 0.00805
Epoch: 42140, Training Loss: 0.00625
Epoch: 42141, Training Loss: 0.00705
Epoch: 42141, Training Loss: 0.00660
Epoch: 42141, Training Loss: 0.00805
Epoch: 42141, Training Loss: 0.00625
Epoch: 42142, Training Loss: 0.00705
Epoch: 42142, Training Loss: 0.00660
Epoch: 42142, Training Loss: 0.00805
Epoch: 42142, Training Loss: 0.00625
Epoch: 42143, Training Loss: 0.00705
Epoch: 42143, Training Loss: 0.00660
Epoch: 42143, Training Loss: 0.00805
Epoch: 42143, Training Loss: 0.00625
Epoch: 42144, Training Loss: 0.00705
Epoch: 42144, Training Loss: 0.00660
Epoch: 42144, Training Loss: 0.00805
Epoch: 42144, Training Loss: 0.00625
Epoch: 42145, Training Loss: 0.00705
Epoch: 42145, Training Loss: 0.00660
Epoch: 42145, Training Loss: 0.00805
Epoch: 42145, Training Loss: 0.00625
Epoch: 42146, Training Loss: 0.00705
Epoch: 42146, Training Loss: 0.00660
Epoch: 42146, Training Loss: 0.00805
Epoch: 42146, Training Loss: 0.00625
Epoch: 42147, Training Loss: 0.00705
Epoch: 42147, Training Loss: 0.00660
Epoch: 42147, Training Loss: 0.00805
Epoch: 42147, Training Loss: 0.00625
Epoch: 42148, Training Loss: 0.00705
Epoch: 42148, Training Loss: 0.00660
Epoch: 42148, Training Loss: 0.00805
Epoch: 42148, Training Loss: 0.00625
Epoch: 42149, Training Loss: 0.00705
Epoch: 42149, Training Loss: 0.00660
Epoch: 42149, Training Loss: 0.00805
Epoch: 42149, Training Loss: 0.00625
Epoch: 42150, Training Loss: 0.00705
Epoch: 42150, Training Loss: 0.00660
Epoch: 42150, Training Loss: 0.00805
Epoch: 42150, Training Loss: 0.00625
Epoch: 42151, Training Loss: 0.00705
Epoch: 42151, Training Loss: 0.00660
Epoch: 42151, Training Loss: 0.00805
Epoch: 42151, Training Loss: 0.00625
Epoch: 42152, Training Loss: 0.00705
Epoch: 42152, Training Loss: 0.00660
Epoch: 42152, Training Loss: 0.00805
Epoch: 42152, Training Loss: 0.00625
Epoch: 42153, Training Loss: 0.00705
Epoch: 42153, Training Loss: 0.00660
Epoch: 42153, Training Loss: 0.00805
Epoch: 42153, Training Loss: 0.00625
Epoch: 42154, Training Loss: 0.00705
Epoch: 42154, Training Loss: 0.00660
Epoch: 42154, Training Loss: 0.00805
Epoch: 42154, Training Loss: 0.00625
Epoch: 42155, Training Loss: 0.00705
Epoch: 42155, Training Loss: 0.00660
Epoch: 42155, Training Loss: 0.00805
Epoch: 42155, Training Loss: 0.00625
Epoch: 42156, Training Loss: 0.00705
Epoch: 42156, Training Loss: 0.00660
Epoch: 42156, Training Loss: 0.00805
Epoch: 42156, Training Loss: 0.00625
Epoch: 42157, Training Loss: 0.00705
Epoch: 42157, Training Loss: 0.00660
Epoch: 42157, Training Loss: 0.00805
Epoch: 42157, Training Loss: 0.00625
Epoch: 42158, Training Loss: 0.00705
Epoch: 42158, Training Loss: 0.00660
Epoch: 42158, Training Loss: 0.00805
Epoch: 42158, Training Loss: 0.00625
Epoch: 42159, Training Loss: 0.00705
Epoch: 42159, Training Loss: 0.00660
Epoch: 42159, Training Loss: 0.00805
Epoch: 42159, Training Loss: 0.00625
Epoch: 42160, Training Loss: 0.00705
Epoch: 42160, Training Loss: 0.00660
Epoch: 42160, Training Loss: 0.00805
Epoch: 42160, Training Loss: 0.00625
Epoch: 42161, Training Loss: 0.00705
Epoch: 42161, Training Loss: 0.00660
Epoch: 42161, Training Loss: 0.00805
Epoch: 42161, Training Loss: 0.00625
Epoch: 42162, Training Loss: 0.00705
Epoch: 42162, Training Loss: 0.00660
Epoch: 42162, Training Loss: 0.00805
Epoch: 42162, Training Loss: 0.00625
Epoch: 42163, Training Loss: 0.00705
Epoch: 42163, Training Loss: 0.00660
Epoch: 42163, Training Loss: 0.00805
Epoch: 42163, Training Loss: 0.00625
Epoch: 42164, Training Loss: 0.00705
Epoch: 42164, Training Loss: 0.00660
Epoch: 42164, Training Loss: 0.00805
Epoch: 42164, Training Loss: 0.00625
Epoch: 42165, Training Loss: 0.00705
Epoch: 42165, Training Loss: 0.00660
Epoch: 42165, Training Loss: 0.00805
Epoch: 42165, Training Loss: 0.00625
Epoch: 42166, Training Loss: 0.00705
Epoch: 42166, Training Loss: 0.00660
Epoch: 42166, Training Loss: 0.00805
Epoch: 42166, Training Loss: 0.00625
Epoch: 42167, Training Loss: 0.00705
Epoch: 42167, Training Loss: 0.00660
Epoch: 42167, Training Loss: 0.00805
Epoch: 42167, Training Loss: 0.00625
Epoch: 42168, Training Loss: 0.00705
Epoch: 42168, Training Loss: 0.00660
Epoch: 42168, Training Loss: 0.00805
Epoch: 42168, Training Loss: 0.00625
Epoch: 42169, Training Loss: 0.00705
Epoch: 42169, Training Loss: 0.00660
Epoch: 42169, Training Loss: 0.00805
Epoch: 42169, Training Loss: 0.00625
Epoch: 42170, Training Loss: 0.00705
Epoch: 42170, Training Loss: 0.00660
Epoch: 42170, Training Loss: 0.00805
Epoch: 42170, Training Loss: 0.00625
Epoch: 42171, Training Loss: 0.00705
Epoch: 42171, Training Loss: 0.00660
Epoch: 42171, Training Loss: 0.00805
Epoch: 42171, Training Loss: 0.00625
Epoch: 42172, Training Loss: 0.00705
Epoch: 42172, Training Loss: 0.00660
Epoch: 42172, Training Loss: 0.00805
Epoch: 42172, Training Loss: 0.00625
Epoch: 42173, Training Loss: 0.00705
Epoch: 42173, Training Loss: 0.00660
Epoch: 42173, Training Loss: 0.00805
Epoch: 42173, Training Loss: 0.00624
Epoch: 42174, Training Loss: 0.00704
Epoch: 42174, Training Loss: 0.00660
Epoch: 42174, Training Loss: 0.00805
Epoch: 42174, Training Loss: 0.00624
Epoch: 42175, Training Loss: 0.00704
Epoch: 42175, Training Loss: 0.00660
Epoch: 42175, Training Loss: 0.00805
Epoch: 42175, Training Loss: 0.00624
Epoch: 42176, Training Loss: 0.00704
Epoch: 42176, Training Loss: 0.00660
Epoch: 42176, Training Loss: 0.00805
Epoch: 42176, Training Loss: 0.00624
Epoch: 42177, Training Loss: 0.00704
Epoch: 42177, Training Loss: 0.00660
Epoch: 42177, Training Loss: 0.00805
Epoch: 42177, Training Loss: 0.00624
Epoch: 42178, Training Loss: 0.00704
Epoch: 42178, Training Loss: 0.00660
Epoch: 42178, Training Loss: 0.00805
Epoch: 42178, Training Loss: 0.00624
Epoch: 42179, Training Loss: 0.00704
Epoch: 42179, Training Loss: 0.00660
Epoch: 42179, Training Loss: 0.00805
Epoch: 42179, Training Loss: 0.00624
Epoch: 42180, Training Loss: 0.00704
Epoch: 42180, Training Loss: 0.00660
Epoch: 42180, Training Loss: 0.00805
Epoch: 42180, Training Loss: 0.00624
Epoch: 42181, Training Loss: 0.00704
Epoch: 42181, Training Loss: 0.00660
Epoch: 42181, Training Loss: 0.00805
Epoch: 42181, Training Loss: 0.00624
Epoch: 42182, Training Loss: 0.00704
Epoch: 42182, Training Loss: 0.00660
Epoch: 42182, Training Loss: 0.00805
Epoch: 42182, Training Loss: 0.00624
Epoch: 42183, Training Loss: 0.00704
Epoch: 42183, Training Loss: 0.00660
Epoch: 42183, Training Loss: 0.00805
Epoch: 42183, Training Loss: 0.00624
Epoch: 42184, Training Loss: 0.00704
Epoch: 42184, Training Loss: 0.00660
Epoch: 42184, Training Loss: 0.00805
Epoch: 42184, Training Loss: 0.00624
Epoch: 42185, Training Loss: 0.00704
Epoch: 42185, Training Loss: 0.00660
Epoch: 42185, Training Loss: 0.00805
Epoch: 42185, Training Loss: 0.00624
Epoch: 42186, Training Loss: 0.00704
Epoch: 42186, Training Loss: 0.00660
Epoch: 42186, Training Loss: 0.00805
Epoch: 42186, Training Loss: 0.00624
Epoch: 42187, Training Loss: 0.00704
Epoch: 42187, Training Loss: 0.00660
Epoch: 42187, Training Loss: 0.00805
Epoch: 42187, Training Loss: 0.00624
Epoch: 42188, Training Loss: 0.00704
Epoch: 42188, Training Loss: 0.00659
Epoch: 42188, Training Loss: 0.00805
Epoch: 42188, Training Loss: 0.00624
Epoch: 42189, Training Loss: 0.00704
Epoch: 42189, Training Loss: 0.00659
Epoch: 42189, Training Loss: 0.00805
Epoch: 42189, Training Loss: 0.00624
Epoch: 42190, Training Loss: 0.00704
Epoch: 42190, Training Loss: 0.00659
Epoch: 42190, Training Loss: 0.00805
Epoch: 42190, Training Loss: 0.00624
Epoch: 42191, Training Loss: 0.00704
Epoch: 42191, Training Loss: 0.00659
Epoch: 42191, Training Loss: 0.00805
Epoch: 42191, Training Loss: 0.00624
Epoch: 42192, Training Loss: 0.00704
Epoch: 42192, Training Loss: 0.00659
Epoch: 42192, Training Loss: 0.00805
Epoch: 42192, Training Loss: 0.00624
Epoch: 42193, Training Loss: 0.00704
Epoch: 42193, Training Loss: 0.00659
Epoch: 42193, Training Loss: 0.00805
Epoch: 42193, Training Loss: 0.00624
Epoch: 42194, Training Loss: 0.00704
Epoch: 42194, Training Loss: 0.00659
Epoch: 42194, Training Loss: 0.00805
Epoch: 42194, Training Loss: 0.00624
Epoch: 42195, Training Loss: 0.00704
Epoch: 42195, Training Loss: 0.00659
Epoch: 42195, Training Loss: 0.00805
Epoch: 42195, Training Loss: 0.00624
Epoch: 42196, Training Loss: 0.00704
Epoch: 42196, Training Loss: 0.00659
Epoch: 42196, Training Loss: 0.00805
Epoch: 42196, Training Loss: 0.00624
Epoch: 42197, Training Loss: 0.00704
Epoch: 42197, Training Loss: 0.00659
Epoch: 42197, Training Loss: 0.00805
Epoch: 42197, Training Loss: 0.00624
Epoch: 42198, Training Loss: 0.00704
Epoch: 42198, Training Loss: 0.00659
Epoch: 42198, Training Loss: 0.00805
Epoch: 42198, Training Loss: 0.00624
Epoch: 42199, Training Loss: 0.00704
Epoch: 42199, Training Loss: 0.00659
Epoch: 42199, Training Loss: 0.00805
Epoch: 42199, Training Loss: 0.00624
Epoch: 42200, Training Loss: 0.00704
Epoch: 42200, Training Loss: 0.00659
Epoch: 42200, Training Loss: 0.00805
Epoch: 42200, Training Loss: 0.00624
Epoch: 42201, Training Loss: 0.00704
Epoch: 42201, Training Loss: 0.00659
Epoch: 42201, Training Loss: 0.00805
Epoch: 42201, Training Loss: 0.00624
Epoch: 42202, Training Loss: 0.00704
Epoch: 42202, Training Loss: 0.00659
Epoch: 42202, Training Loss: 0.00805
Epoch: 42202, Training Loss: 0.00624
Epoch: 42203, Training Loss: 0.00704
Epoch: 42203, Training Loss: 0.00659
Epoch: 42203, Training Loss: 0.00805
Epoch: 42203, Training Loss: 0.00624
Epoch: 42204, Training Loss: 0.00704
Epoch: 42204, Training Loss: 0.00659
Epoch: 42204, Training Loss: 0.00805
Epoch: 42204, Training Loss: 0.00624
Epoch: 42205, Training Loss: 0.00704
Epoch: 42205, Training Loss: 0.00659
Epoch: 42205, Training Loss: 0.00805
Epoch: 42205, Training Loss: 0.00624
Epoch: 42206, Training Loss: 0.00704
Epoch: 42206, Training Loss: 0.00659
Epoch: 42206, Training Loss: 0.00805
Epoch: 42206, Training Loss: 0.00624
Epoch: 42207, Training Loss: 0.00704
Epoch: 42207, Training Loss: 0.00659
Epoch: 42207, Training Loss: 0.00805
Epoch: 42207, Training Loss: 0.00624
Epoch: 42208, Training Loss: 0.00704
Epoch: 42208, Training Loss: 0.00659
Epoch: 42208, Training Loss: 0.00805
Epoch: 42208, Training Loss: 0.00624
Epoch: 42209, Training Loss: 0.00704
Epoch: 42209, Training Loss: 0.00659
...
Epoch: 42452, Training Loss: 0.00622

[output truncated — the loss is printed once per training pattern, i.e. four lines per epoch; across epochs 42209–42452 the four per-pattern losses decrease only marginally, from roughly 0.00704 / 0.00659 / 0.00805 / 0.00624 to 0.00702 / 0.00657 / 0.00802 / 0.00622]
Epoch: 42453, Training Loss: 0.00702
Epoch: 42453, Training Loss: 0.00657
Epoch: 42453, Training Loss: 0.00802
Epoch: 42453, Training Loss: 0.00622
Epoch: 42454, Training Loss: 0.00702
Epoch: 42454, Training Loss: 0.00657
Epoch: 42454, Training Loss: 0.00802
Epoch: 42454, Training Loss: 0.00622
Epoch: 42455, Training Loss: 0.00702
Epoch: 42455, Training Loss: 0.00657
Epoch: 42455, Training Loss: 0.00802
Epoch: 42455, Training Loss: 0.00622
Epoch: 42456, Training Loss: 0.00702
Epoch: 42456, Training Loss: 0.00657
Epoch: 42456, Training Loss: 0.00802
Epoch: 42456, Training Loss: 0.00622
Epoch: 42457, Training Loss: 0.00702
Epoch: 42457, Training Loss: 0.00657
Epoch: 42457, Training Loss: 0.00802
Epoch: 42457, Training Loss: 0.00622
Epoch: 42458, Training Loss: 0.00702
Epoch: 42458, Training Loss: 0.00657
Epoch: 42458, Training Loss: 0.00802
Epoch: 42458, Training Loss: 0.00622
Epoch: 42459, Training Loss: 0.00702
Epoch: 42459, Training Loss: 0.00657
Epoch: 42459, Training Loss: 0.00802
Epoch: 42459, Training Loss: 0.00622
Epoch: 42460, Training Loss: 0.00702
Epoch: 42460, Training Loss: 0.00657
Epoch: 42460, Training Loss: 0.00802
Epoch: 42460, Training Loss: 0.00622
Epoch: 42461, Training Loss: 0.00702
Epoch: 42461, Training Loss: 0.00657
Epoch: 42461, Training Loss: 0.00802
Epoch: 42461, Training Loss: 0.00622
Epoch: 42462, Training Loss: 0.00702
Epoch: 42462, Training Loss: 0.00657
Epoch: 42462, Training Loss: 0.00802
Epoch: 42462, Training Loss: 0.00622
Epoch: 42463, Training Loss: 0.00702
Epoch: 42463, Training Loss: 0.00657
Epoch: 42463, Training Loss: 0.00802
Epoch: 42463, Training Loss: 0.00622
Epoch: 42464, Training Loss: 0.00702
Epoch: 42464, Training Loss: 0.00657
Epoch: 42464, Training Loss: 0.00802
Epoch: 42464, Training Loss: 0.00622
Epoch: 42465, Training Loss: 0.00702
Epoch: 42465, Training Loss: 0.00657
Epoch: 42465, Training Loss: 0.00802
Epoch: 42465, Training Loss: 0.00622
Epoch: 42466, Training Loss: 0.00702
Epoch: 42466, Training Loss: 0.00657
Epoch: 42466, Training Loss: 0.00802
Epoch: 42466, Training Loss: 0.00622
Epoch: 42467, Training Loss: 0.00702
Epoch: 42467, Training Loss: 0.00657
Epoch: 42467, Training Loss: 0.00802
Epoch: 42467, Training Loss: 0.00622
Epoch: 42468, Training Loss: 0.00702
Epoch: 42468, Training Loss: 0.00657
Epoch: 42468, Training Loss: 0.00802
Epoch: 42468, Training Loss: 0.00622
Epoch: 42469, Training Loss: 0.00702
Epoch: 42469, Training Loss: 0.00657
Epoch: 42469, Training Loss: 0.00802
Epoch: 42469, Training Loss: 0.00622
Epoch: 42470, Training Loss: 0.00702
Epoch: 42470, Training Loss: 0.00657
Epoch: 42470, Training Loss: 0.00802
Epoch: 42470, Training Loss: 0.00622
Epoch: 42471, Training Loss: 0.00702
Epoch: 42471, Training Loss: 0.00657
Epoch: 42471, Training Loss: 0.00802
Epoch: 42471, Training Loss: 0.00622
Epoch: 42472, Training Loss: 0.00702
Epoch: 42472, Training Loss: 0.00657
Epoch: 42472, Training Loss: 0.00802
Epoch: 42472, Training Loss: 0.00622
Epoch: 42473, Training Loss: 0.00702
Epoch: 42473, Training Loss: 0.00657
Epoch: 42473, Training Loss: 0.00802
Epoch: 42473, Training Loss: 0.00622
Epoch: 42474, Training Loss: 0.00702
Epoch: 42474, Training Loss: 0.00657
Epoch: 42474, Training Loss: 0.00802
Epoch: 42474, Training Loss: 0.00622
Epoch: 42475, Training Loss: 0.00702
Epoch: 42475, Training Loss: 0.00657
Epoch: 42475, Training Loss: 0.00802
Epoch: 42475, Training Loss: 0.00622
Epoch: 42476, Training Loss: 0.00702
Epoch: 42476, Training Loss: 0.00657
Epoch: 42476, Training Loss: 0.00802
Epoch: 42476, Training Loss: 0.00622
Epoch: 42477, Training Loss: 0.00702
Epoch: 42477, Training Loss: 0.00657
Epoch: 42477, Training Loss: 0.00802
Epoch: 42477, Training Loss: 0.00622
Epoch: 42478, Training Loss: 0.00702
Epoch: 42478, Training Loss: 0.00657
Epoch: 42478, Training Loss: 0.00802
Epoch: 42478, Training Loss: 0.00622
Epoch: 42479, Training Loss: 0.00702
Epoch: 42479, Training Loss: 0.00657
Epoch: 42479, Training Loss: 0.00802
Epoch: 42479, Training Loss: 0.00622
Epoch: 42480, Training Loss: 0.00702
Epoch: 42480, Training Loss: 0.00657
Epoch: 42480, Training Loss: 0.00802
Epoch: 42480, Training Loss: 0.00622
Epoch: 42481, Training Loss: 0.00702
Epoch: 42481, Training Loss: 0.00657
Epoch: 42481, Training Loss: 0.00802
Epoch: 42481, Training Loss: 0.00622
Epoch: 42482, Training Loss: 0.00702
Epoch: 42482, Training Loss: 0.00657
Epoch: 42482, Training Loss: 0.00802
Epoch: 42482, Training Loss: 0.00622
Epoch: 42483, Training Loss: 0.00702
Epoch: 42483, Training Loss: 0.00657
Epoch: 42483, Training Loss: 0.00802
Epoch: 42483, Training Loss: 0.00622
Epoch: 42484, Training Loss: 0.00702
Epoch: 42484, Training Loss: 0.00657
Epoch: 42484, Training Loss: 0.00802
Epoch: 42484, Training Loss: 0.00622
Epoch: 42485, Training Loss: 0.00702
Epoch: 42485, Training Loss: 0.00657
Epoch: 42485, Training Loss: 0.00802
Epoch: 42485, Training Loss: 0.00622
Epoch: 42486, Training Loss: 0.00702
Epoch: 42486, Training Loss: 0.00657
Epoch: 42486, Training Loss: 0.00802
Epoch: 42486, Training Loss: 0.00622
Epoch: 42487, Training Loss: 0.00702
Epoch: 42487, Training Loss: 0.00657
Epoch: 42487, Training Loss: 0.00802
Epoch: 42487, Training Loss: 0.00622
Epoch: 42488, Training Loss: 0.00702
Epoch: 42488, Training Loss: 0.00657
Epoch: 42488, Training Loss: 0.00802
Epoch: 42488, Training Loss: 0.00622
Epoch: 42489, Training Loss: 0.00702
Epoch: 42489, Training Loss: 0.00657
Epoch: 42489, Training Loss: 0.00802
Epoch: 42489, Training Loss: 0.00622
Epoch: 42490, Training Loss: 0.00702
Epoch: 42490, Training Loss: 0.00657
Epoch: 42490, Training Loss: 0.00802
Epoch: 42490, Training Loss: 0.00622
Epoch: 42491, Training Loss: 0.00702
Epoch: 42491, Training Loss: 0.00657
Epoch: 42491, Training Loss: 0.00802
Epoch: 42491, Training Loss: 0.00622
Epoch: 42492, Training Loss: 0.00702
Epoch: 42492, Training Loss: 0.00657
Epoch: 42492, Training Loss: 0.00802
Epoch: 42492, Training Loss: 0.00622
Epoch: 42493, Training Loss: 0.00702
Epoch: 42493, Training Loss: 0.00657
Epoch: 42493, Training Loss: 0.00802
Epoch: 42493, Training Loss: 0.00622
Epoch: 42494, Training Loss: 0.00702
Epoch: 42494, Training Loss: 0.00657
Epoch: 42494, Training Loss: 0.00802
Epoch: 42494, Training Loss: 0.00622
Epoch: 42495, Training Loss: 0.00702
Epoch: 42495, Training Loss: 0.00657
Epoch: 42495, Training Loss: 0.00802
Epoch: 42495, Training Loss: 0.00622
Epoch: 42496, Training Loss: 0.00702
Epoch: 42496, Training Loss: 0.00657
Epoch: 42496, Training Loss: 0.00802
Epoch: 42496, Training Loss: 0.00622
Epoch: 42497, Training Loss: 0.00702
Epoch: 42497, Training Loss: 0.00657
Epoch: 42497, Training Loss: 0.00802
Epoch: 42497, Training Loss: 0.00622
Epoch: 42498, Training Loss: 0.00702
Epoch: 42498, Training Loss: 0.00657
Epoch: 42498, Training Loss: 0.00802
Epoch: 42498, Training Loss: 0.00622
Epoch: 42499, Training Loss: 0.00702
Epoch: 42499, Training Loss: 0.00657
Epoch: 42499, Training Loss: 0.00802
Epoch: 42499, Training Loss: 0.00622
Epoch: 42500, Training Loss: 0.00702
Epoch: 42500, Training Loss: 0.00657
Epoch: 42500, Training Loss: 0.00802
Epoch: 42500, Training Loss: 0.00622
Epoch: 42501, Training Loss: 0.00702
Epoch: 42501, Training Loss: 0.00657
Epoch: 42501, Training Loss: 0.00802
Epoch: 42501, Training Loss: 0.00622
Epoch: 42502, Training Loss: 0.00702
Epoch: 42502, Training Loss: 0.00657
Epoch: 42502, Training Loss: 0.00802
Epoch: 42502, Training Loss: 0.00622
Epoch: 42503, Training Loss: 0.00702
Epoch: 42503, Training Loss: 0.00657
Epoch: 42503, Training Loss: 0.00802
Epoch: 42503, Training Loss: 0.00622
Epoch: 42504, Training Loss: 0.00702
Epoch: 42504, Training Loss: 0.00657
Epoch: 42504, Training Loss: 0.00802
Epoch: 42504, Training Loss: 0.00622
Epoch: 42505, Training Loss: 0.00702
Epoch: 42505, Training Loss: 0.00657
Epoch: 42505, Training Loss: 0.00802
Epoch: 42505, Training Loss: 0.00622
Epoch: 42506, Training Loss: 0.00702
Epoch: 42506, Training Loss: 0.00657
Epoch: 42506, Training Loss: 0.00802
Epoch: 42506, Training Loss: 0.00622
Epoch: 42507, Training Loss: 0.00702
Epoch: 42507, Training Loss: 0.00657
Epoch: 42507, Training Loss: 0.00802
Epoch: 42507, Training Loss: 0.00622
Epoch: 42508, Training Loss: 0.00702
Epoch: 42508, Training Loss: 0.00657
Epoch: 42508, Training Loss: 0.00802
Epoch: 42508, Training Loss: 0.00622
Epoch: 42509, Training Loss: 0.00702
Epoch: 42509, Training Loss: 0.00657
Epoch: 42509, Training Loss: 0.00802
Epoch: 42509, Training Loss: 0.00622
Epoch: 42510, Training Loss: 0.00702
Epoch: 42510, Training Loss: 0.00657
Epoch: 42510, Training Loss: 0.00802
Epoch: 42510, Training Loss: 0.00622
Epoch: 42511, Training Loss: 0.00702
Epoch: 42511, Training Loss: 0.00657
Epoch: 42511, Training Loss: 0.00802
Epoch: 42511, Training Loss: 0.00622
Epoch: 42512, Training Loss: 0.00702
Epoch: 42512, Training Loss: 0.00657
Epoch: 42512, Training Loss: 0.00802
Epoch: 42512, Training Loss: 0.00622
Epoch: 42513, Training Loss: 0.00702
Epoch: 42513, Training Loss: 0.00657
Epoch: 42513, Training Loss: 0.00802
Epoch: 42513, Training Loss: 0.00622
Epoch: 42514, Training Loss: 0.00702
Epoch: 42514, Training Loss: 0.00657
Epoch: 42514, Training Loss: 0.00802
Epoch: 42514, Training Loss: 0.00622
Epoch: 42515, Training Loss: 0.00702
Epoch: 42515, Training Loss: 0.00657
Epoch: 42515, Training Loss: 0.00802
Epoch: 42515, Training Loss: 0.00622
Epoch: 42516, Training Loss: 0.00702
Epoch: 42516, Training Loss: 0.00657
Epoch: 42516, Training Loss: 0.00802
Epoch: 42516, Training Loss: 0.00622
Epoch: 42517, Training Loss: 0.00702
Epoch: 42517, Training Loss: 0.00657
Epoch: 42517, Training Loss: 0.00802
Epoch: 42517, Training Loss: 0.00622
Epoch: 42518, Training Loss: 0.00701
Epoch: 42518, Training Loss: 0.00657
Epoch: 42518, Training Loss: 0.00802
Epoch: 42518, Training Loss: 0.00622
Epoch: 42519, Training Loss: 0.00701
Epoch: 42519, Training Loss: 0.00657
Epoch: 42519, Training Loss: 0.00802
Epoch: 42519, Training Loss: 0.00622
Epoch: 42520, Training Loss: 0.00701
Epoch: 42520, Training Loss: 0.00657
Epoch: 42520, Training Loss: 0.00802
Epoch: 42520, Training Loss: 0.00622
Epoch: 42521, Training Loss: 0.00701
Epoch: 42521, Training Loss: 0.00657
Epoch: 42521, Training Loss: 0.00802
Epoch: 42521, Training Loss: 0.00622
Epoch: 42522, Training Loss: 0.00701
Epoch: 42522, Training Loss: 0.00657
Epoch: 42522, Training Loss: 0.00802
Epoch: 42522, Training Loss: 0.00622
Epoch: 42523, Training Loss: 0.00701
Epoch: 42523, Training Loss: 0.00657
Epoch: 42523, Training Loss: 0.00802
Epoch: 42523, Training Loss: 0.00622
Epoch: 42524, Training Loss: 0.00701
Epoch: 42524, Training Loss: 0.00657
Epoch: 42524, Training Loss: 0.00802
Epoch: 42524, Training Loss: 0.00622
Epoch: 42525, Training Loss: 0.00701
Epoch: 42525, Training Loss: 0.00657
Epoch: 42525, Training Loss: 0.00802
Epoch: 42525, Training Loss: 0.00622
Epoch: 42526, Training Loss: 0.00701
Epoch: 42526, Training Loss: 0.00657
Epoch: 42526, Training Loss: 0.00802
Epoch: 42526, Training Loss: 0.00622
Epoch: 42527, Training Loss: 0.00701
Epoch: 42527, Training Loss: 0.00657
Epoch: 42527, Training Loss: 0.00802
Epoch: 42527, Training Loss: 0.00622
Epoch: 42528, Training Loss: 0.00701
Epoch: 42528, Training Loss: 0.00657
Epoch: 42528, Training Loss: 0.00802
Epoch: 42528, Training Loss: 0.00622
Epoch: 42529, Training Loss: 0.00701
Epoch: 42529, Training Loss: 0.00657
Epoch: 42529, Training Loss: 0.00802
Epoch: 42529, Training Loss: 0.00622
Epoch: 42530, Training Loss: 0.00701
Epoch: 42530, Training Loss: 0.00657
Epoch: 42530, Training Loss: 0.00802
Epoch: 42530, Training Loss: 0.00622
Epoch: 42531, Training Loss: 0.00701
Epoch: 42531, Training Loss: 0.00657
Epoch: 42531, Training Loss: 0.00801
Epoch: 42531, Training Loss: 0.00622
Epoch: 42532, Training Loss: 0.00701
Epoch: 42532, Training Loss: 0.00657
Epoch: 42532, Training Loss: 0.00801
Epoch: 42532, Training Loss: 0.00622
Epoch: 42533, Training Loss: 0.00701
Epoch: 42533, Training Loss: 0.00657
Epoch: 42533, Training Loss: 0.00801
Epoch: 42533, Training Loss: 0.00622
Epoch: 42534, Training Loss: 0.00701
Epoch: 42534, Training Loss: 0.00657
Epoch: 42534, Training Loss: 0.00801
Epoch: 42534, Training Loss: 0.00622
Epoch: 42535, Training Loss: 0.00701
Epoch: 42535, Training Loss: 0.00657
Epoch: 42535, Training Loss: 0.00801
Epoch: 42535, Training Loss: 0.00622
Epoch: 42536, Training Loss: 0.00701
Epoch: 42536, Training Loss: 0.00657
Epoch: 42536, Training Loss: 0.00801
Epoch: 42536, Training Loss: 0.00622
Epoch: 42537, Training Loss: 0.00701
Epoch: 42537, Training Loss: 0.00657
Epoch: 42537, Training Loss: 0.00801
Epoch: 42537, Training Loss: 0.00622
Epoch: 42538, Training Loss: 0.00701
Epoch: 42538, Training Loss: 0.00657
Epoch: 42538, Training Loss: 0.00801
Epoch: 42538, Training Loss: 0.00622
Epoch: 42539, Training Loss: 0.00701
Epoch: 42539, Training Loss: 0.00657
Epoch: 42539, Training Loss: 0.00801
Epoch: 42539, Training Loss: 0.00622
Epoch: 42540, Training Loss: 0.00701
Epoch: 42540, Training Loss: 0.00657
Epoch: 42540, Training Loss: 0.00801
Epoch: 42540, Training Loss: 0.00622
Epoch: 42541, Training Loss: 0.00701
Epoch: 42541, Training Loss: 0.00657
Epoch: 42541, Training Loss: 0.00801
Epoch: 42541, Training Loss: 0.00622
Epoch: 42542, Training Loss: 0.00701
Epoch: 42542, Training Loss: 0.00657
Epoch: 42542, Training Loss: 0.00801
Epoch: 42542, Training Loss: 0.00622
Epoch: 42543, Training Loss: 0.00701
Epoch: 42543, Training Loss: 0.00657
Epoch: 42543, Training Loss: 0.00801
Epoch: 42543, Training Loss: 0.00622
Epoch: 42544, Training Loss: 0.00701
Epoch: 42544, Training Loss: 0.00657
Epoch: 42544, Training Loss: 0.00801
Epoch: 42544, Training Loss: 0.00622
Epoch: 42545, Training Loss: 0.00701
Epoch: 42545, Training Loss: 0.00657
Epoch: 42545, Training Loss: 0.00801
Epoch: 42545, Training Loss: 0.00622
Epoch: 42546, Training Loss: 0.00701
Epoch: 42546, Training Loss: 0.00657
Epoch: 42546, Training Loss: 0.00801
Epoch: 42546, Training Loss: 0.00622
Epoch: 42547, Training Loss: 0.00701
Epoch: 42547, Training Loss: 0.00657
Epoch: 42547, Training Loss: 0.00801
Epoch: 42547, Training Loss: 0.00622
Epoch: 42548, Training Loss: 0.00701
Epoch: 42548, Training Loss: 0.00657
Epoch: 42548, Training Loss: 0.00801
Epoch: 42548, Training Loss: 0.00622
Epoch: 42549, Training Loss: 0.00701
Epoch: 42549, Training Loss: 0.00657
Epoch: 42549, Training Loss: 0.00801
Epoch: 42549, Training Loss: 0.00622
Epoch: 42550, Training Loss: 0.00701
Epoch: 42550, Training Loss: 0.00657
Epoch: 42550, Training Loss: 0.00801
Epoch: 42550, Training Loss: 0.00622
Epoch: 42551, Training Loss: 0.00701
Epoch: 42551, Training Loss: 0.00657
Epoch: 42551, Training Loss: 0.00801
Epoch: 42551, Training Loss: 0.00622
Epoch: 42552, Training Loss: 0.00701
Epoch: 42552, Training Loss: 0.00657
Epoch: 42552, Training Loss: 0.00801
Epoch: 42552, Training Loss: 0.00622
Epoch: 42553, Training Loss: 0.00701
Epoch: 42553, Training Loss: 0.00657
Epoch: 42553, Training Loss: 0.00801
Epoch: 42553, Training Loss: 0.00622
Epoch: 42554, Training Loss: 0.00701
Epoch: 42554, Training Loss: 0.00657
Epoch: 42554, Training Loss: 0.00801
Epoch: 42554, Training Loss: 0.00622
Epoch: 42555, Training Loss: 0.00701
Epoch: 42555, Training Loss: 0.00657
Epoch: 42555, Training Loss: 0.00801
Epoch: 42555, Training Loss: 0.00622
Epoch: 42556, Training Loss: 0.00701
Epoch: 42556, Training Loss: 0.00657
Epoch: 42556, Training Loss: 0.00801
Epoch: 42556, Training Loss: 0.00622
Epoch: 42557, Training Loss: 0.00701
Epoch: 42557, Training Loss: 0.00657
Epoch: 42557, Training Loss: 0.00801
Epoch: 42557, Training Loss: 0.00622
Epoch: 42558, Training Loss: 0.00701
Epoch: 42558, Training Loss: 0.00657
Epoch: 42558, Training Loss: 0.00801
Epoch: 42558, Training Loss: 0.00622
Epoch: 42559, Training Loss: 0.00701
Epoch: 42559, Training Loss: 0.00657
Epoch: 42559, Training Loss: 0.00801
Epoch: 42559, Training Loss: 0.00622
Epoch: 42560, Training Loss: 0.00701
Epoch: 42560, Training Loss: 0.00657
Epoch: 42560, Training Loss: 0.00801
Epoch: 42560, Training Loss: 0.00622
Epoch: 42561, Training Loss: 0.00701
Epoch: 42561, Training Loss: 0.00656
Epoch: 42561, Training Loss: 0.00801
Epoch: 42561, Training Loss: 0.00622
Epoch: 42562, Training Loss: 0.00701
Epoch: 42562, Training Loss: 0.00656
Epoch: 42562, Training Loss: 0.00801
Epoch: 42562, Training Loss: 0.00622
Epoch: 42563, Training Loss: 0.00701
Epoch: 42563, Training Loss: 0.00656
Epoch: 42563, Training Loss: 0.00801
Epoch: 42563, Training Loss: 0.00622
Epoch: 42564, Training Loss: 0.00701
Epoch: 42564, Training Loss: 0.00656
Epoch: 42564, Training Loss: 0.00801
Epoch: 42564, Training Loss: 0.00622
Epoch: 42565, Training Loss: 0.00701
Epoch: 42565, Training Loss: 0.00656
Epoch: 42565, Training Loss: 0.00801
Epoch: 42565, Training Loss: 0.00622
Epoch: 42566, Training Loss: 0.00701
Epoch: 42566, Training Loss: 0.00656
Epoch: 42566, Training Loss: 0.00801
Epoch: 42566, Training Loss: 0.00621
Epoch: 42567, Training Loss: 0.00701
Epoch: 42567, Training Loss: 0.00656
Epoch: 42567, Training Loss: 0.00801
Epoch: 42567, Training Loss: 0.00621
Epoch: 42568, Training Loss: 0.00701
Epoch: 42568, Training Loss: 0.00656
Epoch: 42568, Training Loss: 0.00801
Epoch: 42568, Training Loss: 0.00621
Epoch: 42569, Training Loss: 0.00701
Epoch: 42569, Training Loss: 0.00656
Epoch: 42569, Training Loss: 0.00801
Epoch: 42569, Training Loss: 0.00621
Epoch: 42570, Training Loss: 0.00701
Epoch: 42570, Training Loss: 0.00656
Epoch: 42570, Training Loss: 0.00801
Epoch: 42570, Training Loss: 0.00621
Epoch: 42571, Training Loss: 0.00701
Epoch: 42571, Training Loss: 0.00656
Epoch: 42571, Training Loss: 0.00801
Epoch: 42571, Training Loss: 0.00621
Epoch: 42572, Training Loss: 0.00701
Epoch: 42572, Training Loss: 0.00656
Epoch: 42572, Training Loss: 0.00801
Epoch: 42572, Training Loss: 0.00621
Epoch: 42573, Training Loss: 0.00701
Epoch: 42573, Training Loss: 0.00656
Epoch: 42573, Training Loss: 0.00801
Epoch: 42573, Training Loss: 0.00621
Epoch: 42574, Training Loss: 0.00701
Epoch: 42574, Training Loss: 0.00656
Epoch: 42574, Training Loss: 0.00801
Epoch: 42574, Training Loss: 0.00621
Epoch: 42575, Training Loss: 0.00701
Epoch: 42575, Training Loss: 0.00656
Epoch: 42575, Training Loss: 0.00801
Epoch: 42575, Training Loss: 0.00621
Epoch: 42576, Training Loss: 0.00701
Epoch: 42576, Training Loss: 0.00656
Epoch: 42576, Training Loss: 0.00801
Epoch: 42576, Training Loss: 0.00621
Epoch: 42577, Training Loss: 0.00701
Epoch: 42577, Training Loss: 0.00656
Epoch: 42577, Training Loss: 0.00801
Epoch: 42577, Training Loss: 0.00621
Epoch: 42578, Training Loss: 0.00701
Epoch: 42578, Training Loss: 0.00656
Epoch: 42578, Training Loss: 0.00801
Epoch: 42578, Training Loss: 0.00621
Epoch: 42579, Training Loss: 0.00701
Epoch: 42579, Training Loss: 0.00656
Epoch: 42579, Training Loss: 0.00801
Epoch: 42579, Training Loss: 0.00621
Epoch: 42580, Training Loss: 0.00701
Epoch: 42580, Training Loss: 0.00656
Epoch: 42580, Training Loss: 0.00801
Epoch: 42580, Training Loss: 0.00621
Epoch: 42581, Training Loss: 0.00701
Epoch: 42581, Training Loss: 0.00656
Epoch: 42581, Training Loss: 0.00801
Epoch: 42581, Training Loss: 0.00621
Epoch: 42582, Training Loss: 0.00701
Epoch: 42582, Training Loss: 0.00656
Epoch: 42582, Training Loss: 0.00801
Epoch: 42582, Training Loss: 0.00621
Epoch: 42583, Training Loss: 0.00701
Epoch: 42583, Training Loss: 0.00656
Epoch: 42583, Training Loss: 0.00801
Epoch: 42583, Training Loss: 0.00621
Epoch: 42584, Training Loss: 0.00701
Epoch: 42584, Training Loss: 0.00656
Epoch: 42584, Training Loss: 0.00801
Epoch: 42584, Training Loss: 0.00621
Epoch: 42585, Training Loss: 0.00701
Epoch: 42585, Training Loss: 0.00656
Epoch: 42585, Training Loss: 0.00801
Epoch: 42585, Training Loss: 0.00621
Epoch: 42586, Training Loss: 0.00701
Epoch: 42586, Training Loss: 0.00656
Epoch: 42586, Training Loss: 0.00801
Epoch: 42586, Training Loss: 0.00621
Epoch: 42587, Training Loss: 0.00701
Epoch: 42587, Training Loss: 0.00656
Epoch: 42587, Training Loss: 0.00801
Epoch: 42587, Training Loss: 0.00621
Epoch: 42588, Training Loss: 0.00701
Epoch: 42588, Training Loss: 0.00656
Epoch: 42588, Training Loss: 0.00801
Epoch: 42588, Training Loss: 0.00621
Epoch: 42589, Training Loss: 0.00701
Epoch: 42589, Training Loss: 0.00656
Epoch: 42589, Training Loss: 0.00801
Epoch: 42589, Training Loss: 0.00621
Epoch: 42590, Training Loss: 0.00701
Epoch: 42590, Training Loss: 0.00656
Epoch: 42590, Training Loss: 0.00801
Epoch: 42590, Training Loss: 0.00621
Epoch: 42591, Training Loss: 0.00701
Epoch: 42591, Training Loss: 0.00656
Epoch: 42591, Training Loss: 0.00801
Epoch: 42591, Training Loss: 0.00621
Epoch: 42592, Training Loss: 0.00701
Epoch: 42592, Training Loss: 0.00656
Epoch: 42592, Training Loss: 0.00801
Epoch: 42592, Training Loss: 0.00621
Epoch: 42593, Training Loss: 0.00701
Epoch: 42593, Training Loss: 0.00656
Epoch: 42593, Training Loss: 0.00801
Epoch: 42593, Training Loss: 0.00621
Epoch: 42594, Training Loss: 0.00701
Epoch: 42594, Training Loss: 0.00656
Epoch: 42594, Training Loss: 0.00801
Epoch: 42594, Training Loss: 0.00621
Epoch: 42595, Training Loss: 0.00701
Epoch: 42595, Training Loss: 0.00656
Epoch: 42595, Training Loss: 0.00801
Epoch: 42595, Training Loss: 0.00621
Epoch: 42596, Training Loss: 0.00701
Epoch: 42596, Training Loss: 0.00656
Epoch: 42596, Training Loss: 0.00801
Epoch: 42596, Training Loss: 0.00621
Epoch: 42597, Training Loss: 0.00701
Epoch: 42597, Training Loss: 0.00656
Epoch: 42597, Training Loss: 0.00801
Epoch: 42597, Training Loss: 0.00621
Epoch: 42598, Training Loss: 0.00701
Epoch: 42598, Training Loss: 0.00656
Epoch: 42598, Training Loss: 0.00801
Epoch: 42598, Training Loss: 0.00621
Epoch: 42599, Training Loss: 0.00701
Epoch: 42599, Training Loss: 0.00656
Epoch: 42599, Training Loss: 0.00801
Epoch: 42599, Training Loss: 0.00621
Epoch: 42600, Training Loss: 0.00701
Epoch: 42600, Training Loss: 0.00656
Epoch: 42600, Training Loss: 0.00801
Epoch: 42600, Training Loss: 0.00621
Epoch: 42601, Training Loss: 0.00701
Epoch: 42601, Training Loss: 0.00656
Epoch: 42601, Training Loss: 0.00801
Epoch: 42601, Training Loss: 0.00621
Epoch: 42602, Training Loss: 0.00701
Epoch: 42602, Training Loss: 0.00656
Epoch: 42602, Training Loss: 0.00801
Epoch: 42602, Training Loss: 0.00621
Epoch: 42603, Training Loss: 0.00701
Epoch: 42603, Training Loss: 0.00656
Epoch: 42603, Training Loss: 0.00801
Epoch: 42603, Training Loss: 0.00621
Epoch: 42604, Training Loss: 0.00701
Epoch: 42604, Training Loss: 0.00656
Epoch: 42604, Training Loss: 0.00801
Epoch: 42604, Training Loss: 0.00621
Epoch: 42605, Training Loss: 0.00701
Epoch: 42605, Training Loss: 0.00656
Epoch: 42605, Training Loss: 0.00801
Epoch: 42605, Training Loss: 0.00621
Epoch: 42606, Training Loss: 0.00701
Epoch: 42606, Training Loss: 0.00656
Epoch: 42606, Training Loss: 0.00801
Epoch: 42606, Training Loss: 0.00621
Epoch: 42607, Training Loss: 0.00701
Epoch: 42607, Training Loss: 0.00656
Epoch: 42607, Training Loss: 0.00801
Epoch: 42607, Training Loss: 0.00621
Epoch: 42608, Training Loss: 0.00701
Epoch: 42608, Training Loss: 0.00656
Epoch: 42608, Training Loss: 0.00801
Epoch: 42608, Training Loss: 0.00621
Epoch: 42609, Training Loss: 0.00701
Epoch: 42609, Training Loss: 0.00656
Epoch: 42609, Training Loss: 0.00801
Epoch: 42609, Training Loss: 0.00621
Epoch: 42610, Training Loss: 0.00701
Epoch: 42610, Training Loss: 0.00656
Epoch: 42610, Training Loss: 0.00801
Epoch: 42610, Training Loss: 0.00621
Epoch: 42611, Training Loss: 0.00701
Epoch: 42611, Training Loss: 0.00656
Epoch: 42611, Training Loss: 0.00801
Epoch: 42611, Training Loss: 0.00621
Epoch: 42612, Training Loss: 0.00701
Epoch: 42612, Training Loss: 0.00656
Epoch: 42612, Training Loss: 0.00801
Epoch: 42612, Training Loss: 0.00621
Epoch: 42613, Training Loss: 0.00701
Epoch: 42613, Training Loss: 0.00656
Epoch: 42613, Training Loss: 0.00801
Epoch: 42613, Training Loss: 0.00621
Epoch: 42614, Training Loss: 0.00701
Epoch: 42614, Training Loss: 0.00656
Epoch: 42614, Training Loss: 0.00801
Epoch: 42614, Training Loss: 0.00621
Epoch: 42615, Training Loss: 0.00701
Epoch: 42615, Training Loss: 0.00656
Epoch: 42615, Training Loss: 0.00801
Epoch: 42615, Training Loss: 0.00621
Epoch: 42616, Training Loss: 0.00701
Epoch: 42616, Training Loss: 0.00656
Epoch: 42616, Training Loss: 0.00801
Epoch: 42616, Training Loss: 0.00621
Epoch: 42617, Training Loss: 0.00701
Epoch: 42617, Training Loss: 0.00656
Epoch: 42617, Training Loss: 0.00801
Epoch: 42617, Training Loss: 0.00621
Epoch: 42618, Training Loss: 0.00701
Epoch: 42618, Training Loss: 0.00656
Epoch: 42618, Training Loss: 0.00801
Epoch: 42618, Training Loss: 0.00621
Epoch: 42619, Training Loss: 0.00701
Epoch: 42619, Training Loss: 0.00656
Epoch: 42619, Training Loss: 0.00801
Epoch: 42619, Training Loss: 0.00621
Epoch: 42620, Training Loss: 0.00701
Epoch: 42620, Training Loss: 0.00656
Epoch: 42620, Training Loss: 0.00801
Epoch: 42620, Training Loss: 0.00621
Epoch: 42621, Training Loss: 0.00701
Epoch: 42621, Training Loss: 0.00656
Epoch: 42621, Training Loss: 0.00801
Epoch: 42621, Training Loss: 0.00621
Epoch: 42622, Training Loss: 0.00701
Epoch: 42622, Training Loss: 0.00656
Epoch: 42622, Training Loss: 0.00801
Epoch: 42622, Training Loss: 0.00621
Epoch: 42623, Training Loss: 0.00701
Epoch: 42623, Training Loss: 0.00656
Epoch: 42623, Training Loss: 0.00801
Epoch: 42623, Training Loss: 0.00621
Epoch: 42624, Training Loss: 0.00701
Epoch: 42624, Training Loss: 0.00656
Epoch: 42624, Training Loss: 0.00801
Epoch: 42624, Training Loss: 0.00621
Epoch: 42625, Training Loss: 0.00701
Epoch: 42625, Training Loss: 0.00656
Epoch: 42625, Training Loss: 0.00801
Epoch: 42625, Training Loss: 0.00621
Epoch: 42626, Training Loss: 0.00701
Epoch: 42626, Training Loss: 0.00656
Epoch: 42626, Training Loss: 0.00801
Epoch: 42626, Training Loss: 0.00621
Epoch: 42627, Training Loss: 0.00701
Epoch: 42627, Training Loss: 0.00656
Epoch: 42627, Training Loss: 0.00801
Epoch: 42627, Training Loss: 0.00621
Epoch: 42628, Training Loss: 0.00701
Epoch: 42628, Training Loss: 0.00656
Epoch: 42628, Training Loss: 0.00801
Epoch: 42628, Training Loss: 0.00621
Epoch: 42629, Training Loss: 0.00701
Epoch: 42629, Training Loss: 0.00656
Epoch: 42629, Training Loss: 0.00801
Epoch: 42629, Training Loss: 0.00621
Epoch: 42630, Training Loss: 0.00701
Epoch: 42630, Training Loss: 0.00656
Epoch: 42630, Training Loss: 0.00801
Epoch: 42630, Training Loss: 0.00621
Epoch: 42631, Training Loss: 0.00701
Epoch: 42631, Training Loss: 0.00656
Epoch: 42631, Training Loss: 0.00801
Epoch: 42631, Training Loss: 0.00621
Epoch: 42632, Training Loss: 0.00701
Epoch: 42632, Training Loss: 0.00656
Epoch: 42632, Training Loss: 0.00800
Epoch: 42632, Training Loss: 0.00621
Epoch: 42633, Training Loss: 0.00701
Epoch: 42633, Training Loss: 0.00656
Epoch: 42633, Training Loss: 0.00800
Epoch: 42633, Training Loss: 0.00621
Epoch: 42634, Training Loss: 0.00700
Epoch: 42634, Training Loss: 0.00656
Epoch: 42634, Training Loss: 0.00800
Epoch: 42634, Training Loss: 0.00621
Epoch: 42635, Training Loss: 0.00700
Epoch: 42635, Training Loss: 0.00656
Epoch: 42635, Training Loss: 0.00800
Epoch: 42635, Training Loss: 0.00621
Epoch: 42636, Training Loss: 0.00700
Epoch: 42636, Training Loss: 0.00656
Epoch: 42636, Training Loss: 0.00800
Epoch: 42636, Training Loss: 0.00621
Epoch: 42637, Training Loss: 0.00700
Epoch: 42637, Training Loss: 0.00656
Epoch: 42637, Training Loss: 0.00800
Epoch: 42637, Training Loss: 0.00621
Epoch: 42638, Training Loss: 0.00700
Epoch: 42638, Training Loss: 0.00656
Epoch: 42638, Training Loss: 0.00800
Epoch: 42638, Training Loss: 0.00621
Epoch: 42639, Training Loss: 0.00700
Epoch: 42639, Training Loss: 0.00656
Epoch: 42639, Training Loss: 0.00800
Epoch: 42639, Training Loss: 0.00621
Epoch: 42640, Training Loss: 0.00700
Epoch: 42640, Training Loss: 0.00656
Epoch: 42640, Training Loss: 0.00800
Epoch: 42640, Training Loss: 0.00621
Epoch: 42641, Training Loss: 0.00700
Epoch: 42641, Training Loss: 0.00656
Epoch: 42641, Training Loss: 0.00800
Epoch: 42641, Training Loss: 0.00621
    ...
Epoch: 42885, Training Loss: 0.00654
(Repetitive per-epoch output elided. Four loss values are printed each epoch, one per XOR input pattern; between epochs 42641 and 42885 they fall only marginally, from about 0.00700 / 0.00656 / 0.00800 / 0.00621 to 0.00698 / 0.00654 / 0.00798 / 0.00619 — the largest per-pattern loss is still hovering at the 0.008 error threshold, showing how slowly the quadratic-cost sigmoid network converges in this regime.)
Epoch: 42885, Training Loss: 0.00798
Epoch: 42885, Training Loss: 0.00619
Epoch: 42886, Training Loss: 0.00698
Epoch: 42886, Training Loss: 0.00654
Epoch: 42886, Training Loss: 0.00798
Epoch: 42886, Training Loss: 0.00619
Epoch: 42887, Training Loss: 0.00698
Epoch: 42887, Training Loss: 0.00654
Epoch: 42887, Training Loss: 0.00798
Epoch: 42887, Training Loss: 0.00619
Epoch: 42888, Training Loss: 0.00698
Epoch: 42888, Training Loss: 0.00654
Epoch: 42888, Training Loss: 0.00798
Epoch: 42888, Training Loss: 0.00619
Epoch: 42889, Training Loss: 0.00698
Epoch: 42889, Training Loss: 0.00654
Epoch: 42889, Training Loss: 0.00798
Epoch: 42889, Training Loss: 0.00619
Epoch: 42890, Training Loss: 0.00698
Epoch: 42890, Training Loss: 0.00654
Epoch: 42890, Training Loss: 0.00798
Epoch: 42890, Training Loss: 0.00619
Epoch: 42891, Training Loss: 0.00698
Epoch: 42891, Training Loss: 0.00654
Epoch: 42891, Training Loss: 0.00798
Epoch: 42891, Training Loss: 0.00619
Epoch: 42892, Training Loss: 0.00698
Epoch: 42892, Training Loss: 0.00654
Epoch: 42892, Training Loss: 0.00798
Epoch: 42892, Training Loss: 0.00619
Epoch: 42893, Training Loss: 0.00698
Epoch: 42893, Training Loss: 0.00654
Epoch: 42893, Training Loss: 0.00798
Epoch: 42893, Training Loss: 0.00619
Epoch: 42894, Training Loss: 0.00698
Epoch: 42894, Training Loss: 0.00654
Epoch: 42894, Training Loss: 0.00798
Epoch: 42894, Training Loss: 0.00619
Epoch: 42895, Training Loss: 0.00698
Epoch: 42895, Training Loss: 0.00654
Epoch: 42895, Training Loss: 0.00798
Epoch: 42895, Training Loss: 0.00619
Epoch: 42896, Training Loss: 0.00698
Epoch: 42896, Training Loss: 0.00654
Epoch: 42896, Training Loss: 0.00798
Epoch: 42896, Training Loss: 0.00619
Epoch: 42897, Training Loss: 0.00698
Epoch: 42897, Training Loss: 0.00654
Epoch: 42897, Training Loss: 0.00798
Epoch: 42897, Training Loss: 0.00619
Epoch: 42898, Training Loss: 0.00698
Epoch: 42898, Training Loss: 0.00654
Epoch: 42898, Training Loss: 0.00798
Epoch: 42898, Training Loss: 0.00619
Epoch: 42899, Training Loss: 0.00698
Epoch: 42899, Training Loss: 0.00654
Epoch: 42899, Training Loss: 0.00798
Epoch: 42899, Training Loss: 0.00619
Epoch: 42900, Training Loss: 0.00698
Epoch: 42900, Training Loss: 0.00654
Epoch: 42900, Training Loss: 0.00798
Epoch: 42900, Training Loss: 0.00619
Epoch: 42901, Training Loss: 0.00698
Epoch: 42901, Training Loss: 0.00654
Epoch: 42901, Training Loss: 0.00798
Epoch: 42901, Training Loss: 0.00619
Epoch: 42902, Training Loss: 0.00698
Epoch: 42902, Training Loss: 0.00654
Epoch: 42902, Training Loss: 0.00798
Epoch: 42902, Training Loss: 0.00619
Epoch: 42903, Training Loss: 0.00698
Epoch: 42903, Training Loss: 0.00654
Epoch: 42903, Training Loss: 0.00798
Epoch: 42903, Training Loss: 0.00619
Epoch: 42904, Training Loss: 0.00698
Epoch: 42904, Training Loss: 0.00654
Epoch: 42904, Training Loss: 0.00798
Epoch: 42904, Training Loss: 0.00619
Epoch: 42905, Training Loss: 0.00698
Epoch: 42905, Training Loss: 0.00654
Epoch: 42905, Training Loss: 0.00798
Epoch: 42905, Training Loss: 0.00619
Epoch: 42906, Training Loss: 0.00698
Epoch: 42906, Training Loss: 0.00654
Epoch: 42906, Training Loss: 0.00798
Epoch: 42906, Training Loss: 0.00619
Epoch: 42907, Training Loss: 0.00698
Epoch: 42907, Training Loss: 0.00654
Epoch: 42907, Training Loss: 0.00798
Epoch: 42907, Training Loss: 0.00619
Epoch: 42908, Training Loss: 0.00698
Epoch: 42908, Training Loss: 0.00654
Epoch: 42908, Training Loss: 0.00798
Epoch: 42908, Training Loss: 0.00619
Epoch: 42909, Training Loss: 0.00698
Epoch: 42909, Training Loss: 0.00654
Epoch: 42909, Training Loss: 0.00798
Epoch: 42909, Training Loss: 0.00619
Epoch: 42910, Training Loss: 0.00698
Epoch: 42910, Training Loss: 0.00654
Epoch: 42910, Training Loss: 0.00798
Epoch: 42910, Training Loss: 0.00619
Epoch: 42911, Training Loss: 0.00698
Epoch: 42911, Training Loss: 0.00654
Epoch: 42911, Training Loss: 0.00798
Epoch: 42911, Training Loss: 0.00619
Epoch: 42912, Training Loss: 0.00698
Epoch: 42912, Training Loss: 0.00654
Epoch: 42912, Training Loss: 0.00798
Epoch: 42912, Training Loss: 0.00619
Epoch: 42913, Training Loss: 0.00698
Epoch: 42913, Training Loss: 0.00654
Epoch: 42913, Training Loss: 0.00798
Epoch: 42913, Training Loss: 0.00619
Epoch: 42914, Training Loss: 0.00698
Epoch: 42914, Training Loss: 0.00654
Epoch: 42914, Training Loss: 0.00798
Epoch: 42914, Training Loss: 0.00619
Epoch: 42915, Training Loss: 0.00698
Epoch: 42915, Training Loss: 0.00654
Epoch: 42915, Training Loss: 0.00798
Epoch: 42915, Training Loss: 0.00619
Epoch: 42916, Training Loss: 0.00698
Epoch: 42916, Training Loss: 0.00654
Epoch: 42916, Training Loss: 0.00798
Epoch: 42916, Training Loss: 0.00619
Epoch: 42917, Training Loss: 0.00698
Epoch: 42917, Training Loss: 0.00654
Epoch: 42917, Training Loss: 0.00798
Epoch: 42917, Training Loss: 0.00619
Epoch: 42918, Training Loss: 0.00698
Epoch: 42918, Training Loss: 0.00654
Epoch: 42918, Training Loss: 0.00798
Epoch: 42918, Training Loss: 0.00619
Epoch: 42919, Training Loss: 0.00698
Epoch: 42919, Training Loss: 0.00654
Epoch: 42919, Training Loss: 0.00798
Epoch: 42919, Training Loss: 0.00619
Epoch: 42920, Training Loss: 0.00698
Epoch: 42920, Training Loss: 0.00654
Epoch: 42920, Training Loss: 0.00798
Epoch: 42920, Training Loss: 0.00619
Epoch: 42921, Training Loss: 0.00698
Epoch: 42921, Training Loss: 0.00654
Epoch: 42921, Training Loss: 0.00798
Epoch: 42921, Training Loss: 0.00619
Epoch: 42922, Training Loss: 0.00698
Epoch: 42922, Training Loss: 0.00654
Epoch: 42922, Training Loss: 0.00798
Epoch: 42922, Training Loss: 0.00619
Epoch: 42923, Training Loss: 0.00698
Epoch: 42923, Training Loss: 0.00654
Epoch: 42923, Training Loss: 0.00798
Epoch: 42923, Training Loss: 0.00619
Epoch: 42924, Training Loss: 0.00698
Epoch: 42924, Training Loss: 0.00654
Epoch: 42924, Training Loss: 0.00798
Epoch: 42924, Training Loss: 0.00619
Epoch: 42925, Training Loss: 0.00698
Epoch: 42925, Training Loss: 0.00654
Epoch: 42925, Training Loss: 0.00798
Epoch: 42925, Training Loss: 0.00619
Epoch: 42926, Training Loss: 0.00698
Epoch: 42926, Training Loss: 0.00654
Epoch: 42926, Training Loss: 0.00798
Epoch: 42926, Training Loss: 0.00619
Epoch: 42927, Training Loss: 0.00698
Epoch: 42927, Training Loss: 0.00654
Epoch: 42927, Training Loss: 0.00798
Epoch: 42927, Training Loss: 0.00619
Epoch: 42928, Training Loss: 0.00698
Epoch: 42928, Training Loss: 0.00654
Epoch: 42928, Training Loss: 0.00798
Epoch: 42928, Training Loss: 0.00619
Epoch: 42929, Training Loss: 0.00698
Epoch: 42929, Training Loss: 0.00654
Epoch: 42929, Training Loss: 0.00798
Epoch: 42929, Training Loss: 0.00619
Epoch: 42930, Training Loss: 0.00698
Epoch: 42930, Training Loss: 0.00654
Epoch: 42930, Training Loss: 0.00798
Epoch: 42930, Training Loss: 0.00619
Epoch: 42931, Training Loss: 0.00698
Epoch: 42931, Training Loss: 0.00654
Epoch: 42931, Training Loss: 0.00798
Epoch: 42931, Training Loss: 0.00619
Epoch: 42932, Training Loss: 0.00698
Epoch: 42932, Training Loss: 0.00654
Epoch: 42932, Training Loss: 0.00798
Epoch: 42932, Training Loss: 0.00619
Epoch: 42933, Training Loss: 0.00698
Epoch: 42933, Training Loss: 0.00654
Epoch: 42933, Training Loss: 0.00798
Epoch: 42933, Training Loss: 0.00619
Epoch: 42934, Training Loss: 0.00698
Epoch: 42934, Training Loss: 0.00654
Epoch: 42934, Training Loss: 0.00798
Epoch: 42934, Training Loss: 0.00619
Epoch: 42935, Training Loss: 0.00698
Epoch: 42935, Training Loss: 0.00654
Epoch: 42935, Training Loss: 0.00798
Epoch: 42935, Training Loss: 0.00619
Epoch: 42936, Training Loss: 0.00698
Epoch: 42936, Training Loss: 0.00654
Epoch: 42936, Training Loss: 0.00798
Epoch: 42936, Training Loss: 0.00619
Epoch: 42937, Training Loss: 0.00698
Epoch: 42937, Training Loss: 0.00654
Epoch: 42937, Training Loss: 0.00798
Epoch: 42937, Training Loss: 0.00619
Epoch: 42938, Training Loss: 0.00698
Epoch: 42938, Training Loss: 0.00653
Epoch: 42938, Training Loss: 0.00798
Epoch: 42938, Training Loss: 0.00619
Epoch: 42939, Training Loss: 0.00698
Epoch: 42939, Training Loss: 0.00653
Epoch: 42939, Training Loss: 0.00797
Epoch: 42939, Training Loss: 0.00619
Epoch: 42940, Training Loss: 0.00698
Epoch: 42940, Training Loss: 0.00653
Epoch: 42940, Training Loss: 0.00797
Epoch: 42940, Training Loss: 0.00619
Epoch: 42941, Training Loss: 0.00698
Epoch: 42941, Training Loss: 0.00653
Epoch: 42941, Training Loss: 0.00797
Epoch: 42941, Training Loss: 0.00619
Epoch: 42942, Training Loss: 0.00698
Epoch: 42942, Training Loss: 0.00653
Epoch: 42942, Training Loss: 0.00797
Epoch: 42942, Training Loss: 0.00619
Epoch: 42943, Training Loss: 0.00698
Epoch: 42943, Training Loss: 0.00653
Epoch: 42943, Training Loss: 0.00797
Epoch: 42943, Training Loss: 0.00619
Epoch: 42944, Training Loss: 0.00698
Epoch: 42944, Training Loss: 0.00653
Epoch: 42944, Training Loss: 0.00797
Epoch: 42944, Training Loss: 0.00619
Epoch: 42945, Training Loss: 0.00698
Epoch: 42945, Training Loss: 0.00653
Epoch: 42945, Training Loss: 0.00797
Epoch: 42945, Training Loss: 0.00619
Epoch: 42946, Training Loss: 0.00698
Epoch: 42946, Training Loss: 0.00653
Epoch: 42946, Training Loss: 0.00797
Epoch: 42946, Training Loss: 0.00619
Epoch: 42947, Training Loss: 0.00698
Epoch: 42947, Training Loss: 0.00653
Epoch: 42947, Training Loss: 0.00797
Epoch: 42947, Training Loss: 0.00619
Epoch: 42948, Training Loss: 0.00698
Epoch: 42948, Training Loss: 0.00653
Epoch: 42948, Training Loss: 0.00797
Epoch: 42948, Training Loss: 0.00619
Epoch: 42949, Training Loss: 0.00698
Epoch: 42949, Training Loss: 0.00653
Epoch: 42949, Training Loss: 0.00797
Epoch: 42949, Training Loss: 0.00619
Epoch: 42950, Training Loss: 0.00698
Epoch: 42950, Training Loss: 0.00653
Epoch: 42950, Training Loss: 0.00797
Epoch: 42950, Training Loss: 0.00619
Epoch: 42951, Training Loss: 0.00698
Epoch: 42951, Training Loss: 0.00653
Epoch: 42951, Training Loss: 0.00797
Epoch: 42951, Training Loss: 0.00619
Epoch: 42952, Training Loss: 0.00698
Epoch: 42952, Training Loss: 0.00653
Epoch: 42952, Training Loss: 0.00797
Epoch: 42952, Training Loss: 0.00619
Epoch: 42953, Training Loss: 0.00698
Epoch: 42953, Training Loss: 0.00653
Epoch: 42953, Training Loss: 0.00797
Epoch: 42953, Training Loss: 0.00619
Epoch: 42954, Training Loss: 0.00698
Epoch: 42954, Training Loss: 0.00653
Epoch: 42954, Training Loss: 0.00797
Epoch: 42954, Training Loss: 0.00619
Epoch: 42955, Training Loss: 0.00698
Epoch: 42955, Training Loss: 0.00653
Epoch: 42955, Training Loss: 0.00797
Epoch: 42955, Training Loss: 0.00619
Epoch: 42956, Training Loss: 0.00698
Epoch: 42956, Training Loss: 0.00653
Epoch: 42956, Training Loss: 0.00797
Epoch: 42956, Training Loss: 0.00619
Epoch: 42957, Training Loss: 0.00698
Epoch: 42957, Training Loss: 0.00653
Epoch: 42957, Training Loss: 0.00797
Epoch: 42957, Training Loss: 0.00619
Epoch: 42958, Training Loss: 0.00698
Epoch: 42958, Training Loss: 0.00653
Epoch: 42958, Training Loss: 0.00797
Epoch: 42958, Training Loss: 0.00619
Epoch: 42959, Training Loss: 0.00698
Epoch: 42959, Training Loss: 0.00653
Epoch: 42959, Training Loss: 0.00797
Epoch: 42959, Training Loss: 0.00619
Epoch: 42960, Training Loss: 0.00698
Epoch: 42960, Training Loss: 0.00653
Epoch: 42960, Training Loss: 0.00797
Epoch: 42960, Training Loss: 0.00619
Epoch: 42961, Training Loss: 0.00698
Epoch: 42961, Training Loss: 0.00653
Epoch: 42961, Training Loss: 0.00797
Epoch: 42961, Training Loss: 0.00619
Epoch: 42962, Training Loss: 0.00698
Epoch: 42962, Training Loss: 0.00653
Epoch: 42962, Training Loss: 0.00797
Epoch: 42962, Training Loss: 0.00619
Epoch: 42963, Training Loss: 0.00698
Epoch: 42963, Training Loss: 0.00653
Epoch: 42963, Training Loss: 0.00797
Epoch: 42963, Training Loss: 0.00619
Epoch: 42964, Training Loss: 0.00698
Epoch: 42964, Training Loss: 0.00653
Epoch: 42964, Training Loss: 0.00797
Epoch: 42964, Training Loss: 0.00619
Epoch: 42965, Training Loss: 0.00698
Epoch: 42965, Training Loss: 0.00653
Epoch: 42965, Training Loss: 0.00797
Epoch: 42965, Training Loss: 0.00618
Epoch: 42966, Training Loss: 0.00698
Epoch: 42966, Training Loss: 0.00653
Epoch: 42966, Training Loss: 0.00797
Epoch: 42966, Training Loss: 0.00618
Epoch: 42967, Training Loss: 0.00698
Epoch: 42967, Training Loss: 0.00653
Epoch: 42967, Training Loss: 0.00797
Epoch: 42967, Training Loss: 0.00618
Epoch: 42968, Training Loss: 0.00698
Epoch: 42968, Training Loss: 0.00653
Epoch: 42968, Training Loss: 0.00797
Epoch: 42968, Training Loss: 0.00618
Epoch: 42969, Training Loss: 0.00698
Epoch: 42969, Training Loss: 0.00653
Epoch: 42969, Training Loss: 0.00797
Epoch: 42969, Training Loss: 0.00618
Epoch: 42970, Training Loss: 0.00698
Epoch: 42970, Training Loss: 0.00653
Epoch: 42970, Training Loss: 0.00797
Epoch: 42970, Training Loss: 0.00618
Epoch: 42971, Training Loss: 0.00698
Epoch: 42971, Training Loss: 0.00653
Epoch: 42971, Training Loss: 0.00797
Epoch: 42971, Training Loss: 0.00618
Epoch: 42972, Training Loss: 0.00698
Epoch: 42972, Training Loss: 0.00653
Epoch: 42972, Training Loss: 0.00797
Epoch: 42972, Training Loss: 0.00618
Epoch: 42973, Training Loss: 0.00698
Epoch: 42973, Training Loss: 0.00653
Epoch: 42973, Training Loss: 0.00797
Epoch: 42973, Training Loss: 0.00618
Epoch: 42974, Training Loss: 0.00698
Epoch: 42974, Training Loss: 0.00653
Epoch: 42974, Training Loss: 0.00797
Epoch: 42974, Training Loss: 0.00618
Epoch: 42975, Training Loss: 0.00698
Epoch: 42975, Training Loss: 0.00653
Epoch: 42975, Training Loss: 0.00797
Epoch: 42975, Training Loss: 0.00618
Epoch: 42976, Training Loss: 0.00698
Epoch: 42976, Training Loss: 0.00653
Epoch: 42976, Training Loss: 0.00797
Epoch: 42976, Training Loss: 0.00618
Epoch: 42977, Training Loss: 0.00698
Epoch: 42977, Training Loss: 0.00653
Epoch: 42977, Training Loss: 0.00797
Epoch: 42977, Training Loss: 0.00618
Epoch: 42978, Training Loss: 0.00698
Epoch: 42978, Training Loss: 0.00653
Epoch: 42978, Training Loss: 0.00797
Epoch: 42978, Training Loss: 0.00618
Epoch: 42979, Training Loss: 0.00698
Epoch: 42979, Training Loss: 0.00653
Epoch: 42979, Training Loss: 0.00797
Epoch: 42979, Training Loss: 0.00618
Epoch: 42980, Training Loss: 0.00698
Epoch: 42980, Training Loss: 0.00653
Epoch: 42980, Training Loss: 0.00797
Epoch: 42980, Training Loss: 0.00618
Epoch: 42981, Training Loss: 0.00698
Epoch: 42981, Training Loss: 0.00653
Epoch: 42981, Training Loss: 0.00797
Epoch: 42981, Training Loss: 0.00618
Epoch: 42982, Training Loss: 0.00698
Epoch: 42982, Training Loss: 0.00653
Epoch: 42982, Training Loss: 0.00797
Epoch: 42982, Training Loss: 0.00618
Epoch: 42983, Training Loss: 0.00697
Epoch: 42983, Training Loss: 0.00653
Epoch: 42983, Training Loss: 0.00797
Epoch: 42983, Training Loss: 0.00618
Epoch: 42984, Training Loss: 0.00697
Epoch: 42984, Training Loss: 0.00653
Epoch: 42984, Training Loss: 0.00797
Epoch: 42984, Training Loss: 0.00618
Epoch: 42985, Training Loss: 0.00697
Epoch: 42985, Training Loss: 0.00653
Epoch: 42985, Training Loss: 0.00797
Epoch: 42985, Training Loss: 0.00618
Epoch: 42986, Training Loss: 0.00697
Epoch: 42986, Training Loss: 0.00653
Epoch: 42986, Training Loss: 0.00797
Epoch: 42986, Training Loss: 0.00618
Epoch: 42987, Training Loss: 0.00697
Epoch: 42987, Training Loss: 0.00653
Epoch: 42987, Training Loss: 0.00797
Epoch: 42987, Training Loss: 0.00618
Epoch: 42988, Training Loss: 0.00697
Epoch: 42988, Training Loss: 0.00653
Epoch: 42988, Training Loss: 0.00797
Epoch: 42988, Training Loss: 0.00618
Epoch: 42989, Training Loss: 0.00697
Epoch: 42989, Training Loss: 0.00653
Epoch: 42989, Training Loss: 0.00797
Epoch: 42989, Training Loss: 0.00618
Epoch: 42990, Training Loss: 0.00697
Epoch: 42990, Training Loss: 0.00653
Epoch: 42990, Training Loss: 0.00797
Epoch: 42990, Training Loss: 0.00618
Epoch: 42991, Training Loss: 0.00697
Epoch: 42991, Training Loss: 0.00653
Epoch: 42991, Training Loss: 0.00797
Epoch: 42991, Training Loss: 0.00618
Epoch: 42992, Training Loss: 0.00697
Epoch: 42992, Training Loss: 0.00653
Epoch: 42992, Training Loss: 0.00797
Epoch: 42992, Training Loss: 0.00618
Epoch: 42993, Training Loss: 0.00697
Epoch: 42993, Training Loss: 0.00653
Epoch: 42993, Training Loss: 0.00797
Epoch: 42993, Training Loss: 0.00618
Epoch: 42994, Training Loss: 0.00697
Epoch: 42994, Training Loss: 0.00653
Epoch: 42994, Training Loss: 0.00797
Epoch: 42994, Training Loss: 0.00618
Epoch: 42995, Training Loss: 0.00697
Epoch: 42995, Training Loss: 0.00653
Epoch: 42995, Training Loss: 0.00797
Epoch: 42995, Training Loss: 0.00618
Epoch: 42996, Training Loss: 0.00697
Epoch: 42996, Training Loss: 0.00653
Epoch: 42996, Training Loss: 0.00797
Epoch: 42996, Training Loss: 0.00618
Epoch: 42997, Training Loss: 0.00697
Epoch: 42997, Training Loss: 0.00653
Epoch: 42997, Training Loss: 0.00797
Epoch: 42997, Training Loss: 0.00618
Epoch: 42998, Training Loss: 0.00697
Epoch: 42998, Training Loss: 0.00653
Epoch: 42998, Training Loss: 0.00797
Epoch: 42998, Training Loss: 0.00618
Epoch: 42999, Training Loss: 0.00697
Epoch: 42999, Training Loss: 0.00653
Epoch: 42999, Training Loss: 0.00797
Epoch: 42999, Training Loss: 0.00618
Epoch: 43000, Training Loss: 0.00697
Epoch: 43000, Training Loss: 0.00653
Epoch: 43000, Training Loss: 0.00797
Epoch: 43000, Training Loss: 0.00618
Epoch: 43001, Training Loss: 0.00697
Epoch: 43001, Training Loss: 0.00653
Epoch: 43001, Training Loss: 0.00797
Epoch: 43001, Training Loss: 0.00618
Epoch: 43002, Training Loss: 0.00697
Epoch: 43002, Training Loss: 0.00653
Epoch: 43002, Training Loss: 0.00797
Epoch: 43002, Training Loss: 0.00618
Epoch: 43003, Training Loss: 0.00697
Epoch: 43003, Training Loss: 0.00653
Epoch: 43003, Training Loss: 0.00797
Epoch: 43003, Training Loss: 0.00618
Epoch: 43004, Training Loss: 0.00697
Epoch: 43004, Training Loss: 0.00653
Epoch: 43004, Training Loss: 0.00797
Epoch: 43004, Training Loss: 0.00618
Epoch: 43005, Training Loss: 0.00697
Epoch: 43005, Training Loss: 0.00653
Epoch: 43005, Training Loss: 0.00797
Epoch: 43005, Training Loss: 0.00618
Epoch: 43006, Training Loss: 0.00697
Epoch: 43006, Training Loss: 0.00653
Epoch: 43006, Training Loss: 0.00797
Epoch: 43006, Training Loss: 0.00618
Epoch: 43007, Training Loss: 0.00697
Epoch: 43007, Training Loss: 0.00653
Epoch: 43007, Training Loss: 0.00797
Epoch: 43007, Training Loss: 0.00618
Epoch: 43008, Training Loss: 0.00697
Epoch: 43008, Training Loss: 0.00653
Epoch: 43008, Training Loss: 0.00797
Epoch: 43008, Training Loss: 0.00618
Epoch: 43009, Training Loss: 0.00697
Epoch: 43009, Training Loss: 0.00653
Epoch: 43009, Training Loss: 0.00797
Epoch: 43009, Training Loss: 0.00618
Epoch: 43010, Training Loss: 0.00697
Epoch: 43010, Training Loss: 0.00653
Epoch: 43010, Training Loss: 0.00797
Epoch: 43010, Training Loss: 0.00618
Epoch: 43011, Training Loss: 0.00697
Epoch: 43011, Training Loss: 0.00653
Epoch: 43011, Training Loss: 0.00797
Epoch: 43011, Training Loss: 0.00618
Epoch: 43012, Training Loss: 0.00697
Epoch: 43012, Training Loss: 0.00653
Epoch: 43012, Training Loss: 0.00797
Epoch: 43012, Training Loss: 0.00618
Epoch: 43013, Training Loss: 0.00697
Epoch: 43013, Training Loss: 0.00653
Epoch: 43013, Training Loss: 0.00797
Epoch: 43013, Training Loss: 0.00618
Epoch: 43014, Training Loss: 0.00697
Epoch: 43014, Training Loss: 0.00653
Epoch: 43014, Training Loss: 0.00797
Epoch: 43014, Training Loss: 0.00618
Epoch: 43015, Training Loss: 0.00697
Epoch: 43015, Training Loss: 0.00653
Epoch: 43015, Training Loss: 0.00797
Epoch: 43015, Training Loss: 0.00618
Epoch: 43016, Training Loss: 0.00697
Epoch: 43016, Training Loss: 0.00653
Epoch: 43016, Training Loss: 0.00797
Epoch: 43016, Training Loss: 0.00618
Epoch: 43017, Training Loss: 0.00697
Epoch: 43017, Training Loss: 0.00653
Epoch: 43017, Training Loss: 0.00797
Epoch: 43017, Training Loss: 0.00618
Epoch: 43018, Training Loss: 0.00697
Epoch: 43018, Training Loss: 0.00653
Epoch: 43018, Training Loss: 0.00797
Epoch: 43018, Training Loss: 0.00618
Epoch: 43019, Training Loss: 0.00697
Epoch: 43019, Training Loss: 0.00653
Epoch: 43019, Training Loss: 0.00797
Epoch: 43019, Training Loss: 0.00618
Epoch: 43020, Training Loss: 0.00697
Epoch: 43020, Training Loss: 0.00653
Epoch: 43020, Training Loss: 0.00797
Epoch: 43020, Training Loss: 0.00618
Epoch: 43021, Training Loss: 0.00697
Epoch: 43021, Training Loss: 0.00653
Epoch: 43021, Training Loss: 0.00797
Epoch: 43021, Training Loss: 0.00618
Epoch: 43022, Training Loss: 0.00697
Epoch: 43022, Training Loss: 0.00653
Epoch: 43022, Training Loss: 0.00797
Epoch: 43022, Training Loss: 0.00618
Epoch: 43023, Training Loss: 0.00697
Epoch: 43023, Training Loss: 0.00653
Epoch: 43023, Training Loss: 0.00797
Epoch: 43023, Training Loss: 0.00618
Epoch: 43024, Training Loss: 0.00697
Epoch: 43024, Training Loss: 0.00653
Epoch: 43024, Training Loss: 0.00797
Epoch: 43024, Training Loss: 0.00618
Epoch: 43025, Training Loss: 0.00697
Epoch: 43025, Training Loss: 0.00653
Epoch: 43025, Training Loss: 0.00797
Epoch: 43025, Training Loss: 0.00618
Epoch: 43026, Training Loss: 0.00697
Epoch: 43026, Training Loss: 0.00653
Epoch: 43026, Training Loss: 0.00797
Epoch: 43026, Training Loss: 0.00618
Epoch: 43027, Training Loss: 0.00697
Epoch: 43027, Training Loss: 0.00653
Epoch: 43027, Training Loss: 0.00797
Epoch: 43027, Training Loss: 0.00618
Epoch: 43028, Training Loss: 0.00697
Epoch: 43028, Training Loss: 0.00653
Epoch: 43028, Training Loss: 0.00797
Epoch: 43028, Training Loss: 0.00618
Epoch: 43029, Training Loss: 0.00697
Epoch: 43029, Training Loss: 0.00653
Epoch: 43029, Training Loss: 0.00797
Epoch: 43029, Training Loss: 0.00618
Epoch: 43030, Training Loss: 0.00697
Epoch: 43030, Training Loss: 0.00653
Epoch: 43030, Training Loss: 0.00797
Epoch: 43030, Training Loss: 0.00618
Epoch: 43031, Training Loss: 0.00697
Epoch: 43031, Training Loss: 0.00653
Epoch: 43031, Training Loss: 0.00797
Epoch: 43031, Training Loss: 0.00618
Epoch: 43032, Training Loss: 0.00697
Epoch: 43032, Training Loss: 0.00653
Epoch: 43032, Training Loss: 0.00797
Epoch: 43032, Training Loss: 0.00618
Epoch: 43033, Training Loss: 0.00697
Epoch: 43033, Training Loss: 0.00653
Epoch: 43033, Training Loss: 0.00797
Epoch: 43033, Training Loss: 0.00618
Epoch: 43034, Training Loss: 0.00697
Epoch: 43034, Training Loss: 0.00653
Epoch: 43034, Training Loss: 0.00797
Epoch: 43034, Training Loss: 0.00618
Epoch: 43035, Training Loss: 0.00697
Epoch: 43035, Training Loss: 0.00653
Epoch: 43035, Training Loss: 0.00797
Epoch: 43035, Training Loss: 0.00618
Epoch: 43036, Training Loss: 0.00697
Epoch: 43036, Training Loss: 0.00653
Epoch: 43036, Training Loss: 0.00797
Epoch: 43036, Training Loss: 0.00618
Epoch: 43037, Training Loss: 0.00697
Epoch: 43037, Training Loss: 0.00653
Epoch: 43037, Training Loss: 0.00797
Epoch: 43037, Training Loss: 0.00618
Epoch: 43038, Training Loss: 0.00697
Epoch: 43038, Training Loss: 0.00653
Epoch: 43038, Training Loss: 0.00797
Epoch: 43038, Training Loss: 0.00618
Epoch: 43039, Training Loss: 0.00697
Epoch: 43039, Training Loss: 0.00653
Epoch: 43039, Training Loss: 0.00797
Epoch: 43039, Training Loss: 0.00618
Epoch: 43040, Training Loss: 0.00697
Epoch: 43040, Training Loss: 0.00653
Epoch: 43040, Training Loss: 0.00797
Epoch: 43040, Training Loss: 0.00618
Epoch: 43041, Training Loss: 0.00697
Epoch: 43041, Training Loss: 0.00653
Epoch: 43041, Training Loss: 0.00797
Epoch: 43041, Training Loss: 0.00618
Epoch: 43042, Training Loss: 0.00697
Epoch: 43042, Training Loss: 0.00653
Epoch: 43042, Training Loss: 0.00796
Epoch: 43042, Training Loss: 0.00618
Epoch: 43043, Training Loss: 0.00697
Epoch: 43043, Training Loss: 0.00653
Epoch: 43043, Training Loss: 0.00796
Epoch: 43043, Training Loss: 0.00618
Epoch: 43044, Training Loss: 0.00697
Epoch: 43044, Training Loss: 0.00653
Epoch: 43044, Training Loss: 0.00796
Epoch: 43044, Training Loss: 0.00618
Epoch: 43045, Training Loss: 0.00697
Epoch: 43045, Training Loss: 0.00653
Epoch: 43045, Training Loss: 0.00796
Epoch: 43045, Training Loss: 0.00618
Epoch: 43046, Training Loss: 0.00697
Epoch: 43046, Training Loss: 0.00653
Epoch: 43046, Training Loss: 0.00796
Epoch: 43046, Training Loss: 0.00618
Epoch: 43047, Training Loss: 0.00697
Epoch: 43047, Training Loss: 0.00653
Epoch: 43047, Training Loss: 0.00796
Epoch: 43047, Training Loss: 0.00618
Epoch: 43048, Training Loss: 0.00697
Epoch: 43048, Training Loss: 0.00653
Epoch: 43048, Training Loss: 0.00796
Epoch: 43048, Training Loss: 0.00618
Epoch: 43049, Training Loss: 0.00697
Epoch: 43049, Training Loss: 0.00653
Epoch: 43049, Training Loss: 0.00796
Epoch: 43049, Training Loss: 0.00618
Epoch: 43050, Training Loss: 0.00697
Epoch: 43050, Training Loss: 0.00653
Epoch: 43050, Training Loss: 0.00796
Epoch: 43050, Training Loss: 0.00618
Epoch: 43051, Training Loss: 0.00697
Epoch: 43051, Training Loss: 0.00653
Epoch: 43051, Training Loss: 0.00796
Epoch: 43051, Training Loss: 0.00618
Epoch: 43052, Training Loss: 0.00697
Epoch: 43052, Training Loss: 0.00653
Epoch: 43052, Training Loss: 0.00796
Epoch: 43052, Training Loss: 0.00618
Epoch: 43053, Training Loss: 0.00697
Epoch: 43053, Training Loss: 0.00653
Epoch: 43053, Training Loss: 0.00796
Epoch: 43053, Training Loss: 0.00618
Epoch: 43054, Training Loss: 0.00697
Epoch: 43054, Training Loss: 0.00653
Epoch: 43054, Training Loss: 0.00796
Epoch: 43054, Training Loss: 0.00618
Epoch: 43055, Training Loss: 0.00697
Epoch: 43055, Training Loss: 0.00653
Epoch: 43055, Training Loss: 0.00796
Epoch: 43055, Training Loss: 0.00618
Epoch: 43056, Training Loss: 0.00697
Epoch: 43056, Training Loss: 0.00653
Epoch: 43056, Training Loss: 0.00796
Epoch: 43056, Training Loss: 0.00618
Epoch: 43057, Training Loss: 0.00697
Epoch: 43057, Training Loss: 0.00653
Epoch: 43057, Training Loss: 0.00796
Epoch: 43057, Training Loss: 0.00618
Epoch: 43058, Training Loss: 0.00697
Epoch: 43058, Training Loss: 0.00653
Epoch: 43058, Training Loss: 0.00796
Epoch: 43058, Training Loss: 0.00618
Epoch: 43059, Training Loss: 0.00697
Epoch: 43059, Training Loss: 0.00653
Epoch: 43059, Training Loss: 0.00796
Epoch: 43059, Training Loss: 0.00618
Epoch: 43060, Training Loss: 0.00697
Epoch: 43060, Training Loss: 0.00653
Epoch: 43060, Training Loss: 0.00796
Epoch: 43060, Training Loss: 0.00618
Epoch: 43061, Training Loss: 0.00697
Epoch: 43061, Training Loss: 0.00653
Epoch: 43061, Training Loss: 0.00796
Epoch: 43061, Training Loss: 0.00618
Epoch: 43062, Training Loss: 0.00697
Epoch: 43062, Training Loss: 0.00653
Epoch: 43062, Training Loss: 0.00796
Epoch: 43062, Training Loss: 0.00618
Epoch: 43063, Training Loss: 0.00697
Epoch: 43063, Training Loss: 0.00653
Epoch: 43063, Training Loss: 0.00796
Epoch: 43063, Training Loss: 0.00618
Epoch: 43064, Training Loss: 0.00697
Epoch: 43064, Training Loss: 0.00653
Epoch: 43064, Training Loss: 0.00796
Epoch: 43064, Training Loss: 0.00618
Epoch: 43065, Training Loss: 0.00697
Epoch: 43065, Training Loss: 0.00652
Epoch: 43065, Training Loss: 0.00796
Epoch: 43065, Training Loss: 0.00618
Epoch: 43066, Training Loss: 0.00697
Epoch: 43066, Training Loss: 0.00652
Epoch: 43066, Training Loss: 0.00796
Epoch: 43066, Training Loss: 0.00618
Epoch: 43067, Training Loss: 0.00697
Epoch: 43067, Training Loss: 0.00652
Epoch: 43067, Training Loss: 0.00796
Epoch: 43067, Training Loss: 0.00618
Epoch: 43068, Training Loss: 0.00697
Epoch: 43068, Training Loss: 0.00652
Epoch: 43068, Training Loss: 0.00796
Epoch: 43068, Training Loss: 0.00618
Epoch: 43069, Training Loss: 0.00697
Epoch: 43069, Training Loss: 0.00652
Epoch: 43069, Training Loss: 0.00796
Epoch: 43069, Training Loss: 0.00618
Epoch: 43070, Training Loss: 0.00697
Epoch: 43070, Training Loss: 0.00652
Epoch: 43070, Training Loss: 0.00796
Epoch: 43070, Training Loss: 0.00618
Epoch: 43071, Training Loss: 0.00697
Epoch: 43071, Training Loss: 0.00652
Epoch: 43071, Training Loss: 0.00796
Epoch: 43071, Training Loss: 0.00618
Epoch: 43072, Training Loss: 0.00697
Epoch: 43072, Training Loss: 0.00652
Epoch: 43072, Training Loss: 0.00796
Epoch: 43072, Training Loss: 0.00618
Epoch: 43073, Training Loss: 0.00697
Epoch: 43073, Training Loss: 0.00652
Epoch: 43073, Training Loss: 0.00796
Epoch: 43073, Training Loss: 0.00618
Epoch: 43074, Training Loss: 0.00697
Epoch: 43074, Training Loss: 0.00652
Epoch: 43074, Training Loss: 0.00796
Epoch: 43074, Training Loss: 0.00618
... [output truncated: epochs 43075-43316 repeat the same four per-pattern losses,
    decreasing slowly from about 0.00697/0.00652/0.00796/0.00618
    to about 0.00695/0.00651/0.00794/0.00616] ...
Epoch: 43317, Training Loss: 0.00695
Epoch: 43317, Training Loss: 0.00651
Epoch: 43317, Training Loss: 0.00794
Epoch: 43317, Training Loss: 0.00616
Epoch: 43318, Training Loss: 0.00695
Epoch: 43318, Training Loss: 0.00651
Epoch: 43318, Training Loss: 0.00794
Epoch: 43318, Training Loss: 0.00616
Epoch: 43319, Training Loss: 0.00695
Epoch: 43319, Training Loss: 0.00651
Epoch: 43319, Training Loss: 0.00794
Epoch: 43319, Training Loss: 0.00616
Epoch: 43320, Training Loss: 0.00695
Epoch: 43320, Training Loss: 0.00651
Epoch: 43320, Training Loss: 0.00794
Epoch: 43320, Training Loss: 0.00616
Epoch: 43321, Training Loss: 0.00695
Epoch: 43321, Training Loss: 0.00650
Epoch: 43321, Training Loss: 0.00794
Epoch: 43321, Training Loss: 0.00616
Epoch: 43322, Training Loss: 0.00695
Epoch: 43322, Training Loss: 0.00650
Epoch: 43322, Training Loss: 0.00794
Epoch: 43322, Training Loss: 0.00616
Epoch: 43323, Training Loss: 0.00695
Epoch: 43323, Training Loss: 0.00650
Epoch: 43323, Training Loss: 0.00794
Epoch: 43323, Training Loss: 0.00616
Epoch: 43324, Training Loss: 0.00695
Epoch: 43324, Training Loss: 0.00650
Epoch: 43324, Training Loss: 0.00794
Epoch: 43324, Training Loss: 0.00616
Epoch: 43325, Training Loss: 0.00695
Epoch: 43325, Training Loss: 0.00650
Epoch: 43325, Training Loss: 0.00794
Epoch: 43325, Training Loss: 0.00616
Epoch: 43326, Training Loss: 0.00695
Epoch: 43326, Training Loss: 0.00650
Epoch: 43326, Training Loss: 0.00794
Epoch: 43326, Training Loss: 0.00616
Epoch: 43327, Training Loss: 0.00695
Epoch: 43327, Training Loss: 0.00650
Epoch: 43327, Training Loss: 0.00794
Epoch: 43327, Training Loss: 0.00616
Epoch: 43328, Training Loss: 0.00695
Epoch: 43328, Training Loss: 0.00650
Epoch: 43328, Training Loss: 0.00794
Epoch: 43328, Training Loss: 0.00616
Epoch: 43329, Training Loss: 0.00695
Epoch: 43329, Training Loss: 0.00650
Epoch: 43329, Training Loss: 0.00794
Epoch: 43329, Training Loss: 0.00616
Epoch: 43330, Training Loss: 0.00695
Epoch: 43330, Training Loss: 0.00650
Epoch: 43330, Training Loss: 0.00794
Epoch: 43330, Training Loss: 0.00616
Epoch: 43331, Training Loss: 0.00695
Epoch: 43331, Training Loss: 0.00650
Epoch: 43331, Training Loss: 0.00794
Epoch: 43331, Training Loss: 0.00616
Epoch: 43332, Training Loss: 0.00695
Epoch: 43332, Training Loss: 0.00650
Epoch: 43332, Training Loss: 0.00794
Epoch: 43332, Training Loss: 0.00616
Epoch: 43333, Training Loss: 0.00695
Epoch: 43333, Training Loss: 0.00650
Epoch: 43333, Training Loss: 0.00794
Epoch: 43333, Training Loss: 0.00616
Epoch: 43334, Training Loss: 0.00695
Epoch: 43334, Training Loss: 0.00650
Epoch: 43334, Training Loss: 0.00794
Epoch: 43334, Training Loss: 0.00616
Epoch: 43335, Training Loss: 0.00695
Epoch: 43335, Training Loss: 0.00650
Epoch: 43335, Training Loss: 0.00794
Epoch: 43335, Training Loss: 0.00616
Epoch: 43336, Training Loss: 0.00695
Epoch: 43336, Training Loss: 0.00650
Epoch: 43336, Training Loss: 0.00794
Epoch: 43336, Training Loss: 0.00616
Epoch: 43337, Training Loss: 0.00694
Epoch: 43337, Training Loss: 0.00650
Epoch: 43337, Training Loss: 0.00794
Epoch: 43337, Training Loss: 0.00616
Epoch: 43338, Training Loss: 0.00694
Epoch: 43338, Training Loss: 0.00650
Epoch: 43338, Training Loss: 0.00794
Epoch: 43338, Training Loss: 0.00616
Epoch: 43339, Training Loss: 0.00694
Epoch: 43339, Training Loss: 0.00650
Epoch: 43339, Training Loss: 0.00794
Epoch: 43339, Training Loss: 0.00616
Epoch: 43340, Training Loss: 0.00694
Epoch: 43340, Training Loss: 0.00650
Epoch: 43340, Training Loss: 0.00794
Epoch: 43340, Training Loss: 0.00616
Epoch: 43341, Training Loss: 0.00694
Epoch: 43341, Training Loss: 0.00650
Epoch: 43341, Training Loss: 0.00794
Epoch: 43341, Training Loss: 0.00616
Epoch: 43342, Training Loss: 0.00694
Epoch: 43342, Training Loss: 0.00650
Epoch: 43342, Training Loss: 0.00794
Epoch: 43342, Training Loss: 0.00616
Epoch: 43343, Training Loss: 0.00694
Epoch: 43343, Training Loss: 0.00650
Epoch: 43343, Training Loss: 0.00794
Epoch: 43343, Training Loss: 0.00616
Epoch: 43344, Training Loss: 0.00694
Epoch: 43344, Training Loss: 0.00650
Epoch: 43344, Training Loss: 0.00794
Epoch: 43344, Training Loss: 0.00616
Epoch: 43345, Training Loss: 0.00694
Epoch: 43345, Training Loss: 0.00650
Epoch: 43345, Training Loss: 0.00794
Epoch: 43345, Training Loss: 0.00616
Epoch: 43346, Training Loss: 0.00694
Epoch: 43346, Training Loss: 0.00650
Epoch: 43346, Training Loss: 0.00794
Epoch: 43346, Training Loss: 0.00616
Epoch: 43347, Training Loss: 0.00694
Epoch: 43347, Training Loss: 0.00650
Epoch: 43347, Training Loss: 0.00794
Epoch: 43347, Training Loss: 0.00616
Epoch: 43348, Training Loss: 0.00694
Epoch: 43348, Training Loss: 0.00650
Epoch: 43348, Training Loss: 0.00794
Epoch: 43348, Training Loss: 0.00616
Epoch: 43349, Training Loss: 0.00694
Epoch: 43349, Training Loss: 0.00650
Epoch: 43349, Training Loss: 0.00794
Epoch: 43349, Training Loss: 0.00616
Epoch: 43350, Training Loss: 0.00694
Epoch: 43350, Training Loss: 0.00650
Epoch: 43350, Training Loss: 0.00794
Epoch: 43350, Training Loss: 0.00616
Epoch: 43351, Training Loss: 0.00694
Epoch: 43351, Training Loss: 0.00650
Epoch: 43351, Training Loss: 0.00794
Epoch: 43351, Training Loss: 0.00616
Epoch: 43352, Training Loss: 0.00694
Epoch: 43352, Training Loss: 0.00650
Epoch: 43352, Training Loss: 0.00794
Epoch: 43352, Training Loss: 0.00616
Epoch: 43353, Training Loss: 0.00694
Epoch: 43353, Training Loss: 0.00650
Epoch: 43353, Training Loss: 0.00793
Epoch: 43353, Training Loss: 0.00616
Epoch: 43354, Training Loss: 0.00694
Epoch: 43354, Training Loss: 0.00650
Epoch: 43354, Training Loss: 0.00793
Epoch: 43354, Training Loss: 0.00616
Epoch: 43355, Training Loss: 0.00694
Epoch: 43355, Training Loss: 0.00650
Epoch: 43355, Training Loss: 0.00793
Epoch: 43355, Training Loss: 0.00616
Epoch: 43356, Training Loss: 0.00694
Epoch: 43356, Training Loss: 0.00650
Epoch: 43356, Training Loss: 0.00793
Epoch: 43356, Training Loss: 0.00616
Epoch: 43357, Training Loss: 0.00694
Epoch: 43357, Training Loss: 0.00650
Epoch: 43357, Training Loss: 0.00793
Epoch: 43357, Training Loss: 0.00616
Epoch: 43358, Training Loss: 0.00694
Epoch: 43358, Training Loss: 0.00650
Epoch: 43358, Training Loss: 0.00793
Epoch: 43358, Training Loss: 0.00616
Epoch: 43359, Training Loss: 0.00694
Epoch: 43359, Training Loss: 0.00650
Epoch: 43359, Training Loss: 0.00793
Epoch: 43359, Training Loss: 0.00616
Epoch: 43360, Training Loss: 0.00694
Epoch: 43360, Training Loss: 0.00650
Epoch: 43360, Training Loss: 0.00793
Epoch: 43360, Training Loss: 0.00616
Epoch: 43361, Training Loss: 0.00694
Epoch: 43361, Training Loss: 0.00650
Epoch: 43361, Training Loss: 0.00793
Epoch: 43361, Training Loss: 0.00616
Epoch: 43362, Training Loss: 0.00694
Epoch: 43362, Training Loss: 0.00650
Epoch: 43362, Training Loss: 0.00793
Epoch: 43362, Training Loss: 0.00616
Epoch: 43363, Training Loss: 0.00694
Epoch: 43363, Training Loss: 0.00650
Epoch: 43363, Training Loss: 0.00793
Epoch: 43363, Training Loss: 0.00616
Epoch: 43364, Training Loss: 0.00694
Epoch: 43364, Training Loss: 0.00650
Epoch: 43364, Training Loss: 0.00793
Epoch: 43364, Training Loss: 0.00616
Epoch: 43365, Training Loss: 0.00694
Epoch: 43365, Training Loss: 0.00650
Epoch: 43365, Training Loss: 0.00793
Epoch: 43365, Training Loss: 0.00616
Epoch: 43366, Training Loss: 0.00694
Epoch: 43366, Training Loss: 0.00650
Epoch: 43366, Training Loss: 0.00793
Epoch: 43366, Training Loss: 0.00616
Epoch: 43367, Training Loss: 0.00694
Epoch: 43367, Training Loss: 0.00650
Epoch: 43367, Training Loss: 0.00793
Epoch: 43367, Training Loss: 0.00616
Epoch: 43368, Training Loss: 0.00694
Epoch: 43368, Training Loss: 0.00650
Epoch: 43368, Training Loss: 0.00793
Epoch: 43368, Training Loss: 0.00616
Epoch: 43369, Training Loss: 0.00694
Epoch: 43369, Training Loss: 0.00650
Epoch: 43369, Training Loss: 0.00793
Epoch: 43369, Training Loss: 0.00616
Epoch: 43370, Training Loss: 0.00694
Epoch: 43370, Training Loss: 0.00650
Epoch: 43370, Training Loss: 0.00793
Epoch: 43370, Training Loss: 0.00615
Epoch: 43371, Training Loss: 0.00694
Epoch: 43371, Training Loss: 0.00650
Epoch: 43371, Training Loss: 0.00793
Epoch: 43371, Training Loss: 0.00615
Epoch: 43372, Training Loss: 0.00694
Epoch: 43372, Training Loss: 0.00650
Epoch: 43372, Training Loss: 0.00793
Epoch: 43372, Training Loss: 0.00615
Epoch: 43373, Training Loss: 0.00694
Epoch: 43373, Training Loss: 0.00650
Epoch: 43373, Training Loss: 0.00793
Epoch: 43373, Training Loss: 0.00615
Epoch: 43374, Training Loss: 0.00694
Epoch: 43374, Training Loss: 0.00650
Epoch: 43374, Training Loss: 0.00793
Epoch: 43374, Training Loss: 0.00615
Epoch: 43375, Training Loss: 0.00694
Epoch: 43375, Training Loss: 0.00650
Epoch: 43375, Training Loss: 0.00793
Epoch: 43375, Training Loss: 0.00615
Epoch: 43376, Training Loss: 0.00694
Epoch: 43376, Training Loss: 0.00650
Epoch: 43376, Training Loss: 0.00793
Epoch: 43376, Training Loss: 0.00615
Epoch: 43377, Training Loss: 0.00694
Epoch: 43377, Training Loss: 0.00650
Epoch: 43377, Training Loss: 0.00793
Epoch: 43377, Training Loss: 0.00615
Epoch: 43378, Training Loss: 0.00694
Epoch: 43378, Training Loss: 0.00650
Epoch: 43378, Training Loss: 0.00793
Epoch: 43378, Training Loss: 0.00615
Epoch: 43379, Training Loss: 0.00694
Epoch: 43379, Training Loss: 0.00650
Epoch: 43379, Training Loss: 0.00793
Epoch: 43379, Training Loss: 0.00615
Epoch: 43380, Training Loss: 0.00694
Epoch: 43380, Training Loss: 0.00650
Epoch: 43380, Training Loss: 0.00793
Epoch: 43380, Training Loss: 0.00615
Epoch: 43381, Training Loss: 0.00694
Epoch: 43381, Training Loss: 0.00650
Epoch: 43381, Training Loss: 0.00793
Epoch: 43381, Training Loss: 0.00615
Epoch: 43382, Training Loss: 0.00694
Epoch: 43382, Training Loss: 0.00650
Epoch: 43382, Training Loss: 0.00793
Epoch: 43382, Training Loss: 0.00615
Epoch: 43383, Training Loss: 0.00694
Epoch: 43383, Training Loss: 0.00650
Epoch: 43383, Training Loss: 0.00793
Epoch: 43383, Training Loss: 0.00615
Epoch: 43384, Training Loss: 0.00694
Epoch: 43384, Training Loss: 0.00650
Epoch: 43384, Training Loss: 0.00793
Epoch: 43384, Training Loss: 0.00615
Epoch: 43385, Training Loss: 0.00694
Epoch: 43385, Training Loss: 0.00650
Epoch: 43385, Training Loss: 0.00793
Epoch: 43385, Training Loss: 0.00615
Epoch: 43386, Training Loss: 0.00694
Epoch: 43386, Training Loss: 0.00650
Epoch: 43386, Training Loss: 0.00793
Epoch: 43386, Training Loss: 0.00615
Epoch: 43387, Training Loss: 0.00694
Epoch: 43387, Training Loss: 0.00650
Epoch: 43387, Training Loss: 0.00793
Epoch: 43387, Training Loss: 0.00615
Epoch: 43388, Training Loss: 0.00694
Epoch: 43388, Training Loss: 0.00650
Epoch: 43388, Training Loss: 0.00793
Epoch: 43388, Training Loss: 0.00615
Epoch: 43389, Training Loss: 0.00694
Epoch: 43389, Training Loss: 0.00650
Epoch: 43389, Training Loss: 0.00793
Epoch: 43389, Training Loss: 0.00615
Epoch: 43390, Training Loss: 0.00694
Epoch: 43390, Training Loss: 0.00650
Epoch: 43390, Training Loss: 0.00793
Epoch: 43390, Training Loss: 0.00615
Epoch: 43391, Training Loss: 0.00694
Epoch: 43391, Training Loss: 0.00650
Epoch: 43391, Training Loss: 0.00793
Epoch: 43391, Training Loss: 0.00615
Epoch: 43392, Training Loss: 0.00694
Epoch: 43392, Training Loss: 0.00650
Epoch: 43392, Training Loss: 0.00793
Epoch: 43392, Training Loss: 0.00615
Epoch: 43393, Training Loss: 0.00694
Epoch: 43393, Training Loss: 0.00650
Epoch: 43393, Training Loss: 0.00793
Epoch: 43393, Training Loss: 0.00615
Epoch: 43394, Training Loss: 0.00694
Epoch: 43394, Training Loss: 0.00650
Epoch: 43394, Training Loss: 0.00793
Epoch: 43394, Training Loss: 0.00615
Epoch: 43395, Training Loss: 0.00694
Epoch: 43395, Training Loss: 0.00650
Epoch: 43395, Training Loss: 0.00793
Epoch: 43395, Training Loss: 0.00615
Epoch: 43396, Training Loss: 0.00694
Epoch: 43396, Training Loss: 0.00650
Epoch: 43396, Training Loss: 0.00793
Epoch: 43396, Training Loss: 0.00615
Epoch: 43397, Training Loss: 0.00694
Epoch: 43397, Training Loss: 0.00650
Epoch: 43397, Training Loss: 0.00793
Epoch: 43397, Training Loss: 0.00615
Epoch: 43398, Training Loss: 0.00694
Epoch: 43398, Training Loss: 0.00650
Epoch: 43398, Training Loss: 0.00793
Epoch: 43398, Training Loss: 0.00615
Epoch: 43399, Training Loss: 0.00694
Epoch: 43399, Training Loss: 0.00650
Epoch: 43399, Training Loss: 0.00793
Epoch: 43399, Training Loss: 0.00615
Epoch: 43400, Training Loss: 0.00694
Epoch: 43400, Training Loss: 0.00650
Epoch: 43400, Training Loss: 0.00793
Epoch: 43400, Training Loss: 0.00615
Epoch: 43401, Training Loss: 0.00694
Epoch: 43401, Training Loss: 0.00650
Epoch: 43401, Training Loss: 0.00793
Epoch: 43401, Training Loss: 0.00615
Epoch: 43402, Training Loss: 0.00694
Epoch: 43402, Training Loss: 0.00650
Epoch: 43402, Training Loss: 0.00793
Epoch: 43402, Training Loss: 0.00615
Epoch: 43403, Training Loss: 0.00694
Epoch: 43403, Training Loss: 0.00650
Epoch: 43403, Training Loss: 0.00793
Epoch: 43403, Training Loss: 0.00615
Epoch: 43404, Training Loss: 0.00694
Epoch: 43404, Training Loss: 0.00650
Epoch: 43404, Training Loss: 0.00793
Epoch: 43404, Training Loss: 0.00615
Epoch: 43405, Training Loss: 0.00694
Epoch: 43405, Training Loss: 0.00650
Epoch: 43405, Training Loss: 0.00793
Epoch: 43405, Training Loss: 0.00615
Epoch: 43406, Training Loss: 0.00694
Epoch: 43406, Training Loss: 0.00650
Epoch: 43406, Training Loss: 0.00793
Epoch: 43406, Training Loss: 0.00615
Epoch: 43407, Training Loss: 0.00694
Epoch: 43407, Training Loss: 0.00650
Epoch: 43407, Training Loss: 0.00793
Epoch: 43407, Training Loss: 0.00615
Epoch: 43408, Training Loss: 0.00694
Epoch: 43408, Training Loss: 0.00650
Epoch: 43408, Training Loss: 0.00793
Epoch: 43408, Training Loss: 0.00615
Epoch: 43409, Training Loss: 0.00694
Epoch: 43409, Training Loss: 0.00650
Epoch: 43409, Training Loss: 0.00793
Epoch: 43409, Training Loss: 0.00615
Epoch: 43410, Training Loss: 0.00694
Epoch: 43410, Training Loss: 0.00650
Epoch: 43410, Training Loss: 0.00793
Epoch: 43410, Training Loss: 0.00615
Epoch: 43411, Training Loss: 0.00694
Epoch: 43411, Training Loss: 0.00650
Epoch: 43411, Training Loss: 0.00793
Epoch: 43411, Training Loss: 0.00615
Epoch: 43412, Training Loss: 0.00694
Epoch: 43412, Training Loss: 0.00650
Epoch: 43412, Training Loss: 0.00793
Epoch: 43412, Training Loss: 0.00615
Epoch: 43413, Training Loss: 0.00694
Epoch: 43413, Training Loss: 0.00650
Epoch: 43413, Training Loss: 0.00793
Epoch: 43413, Training Loss: 0.00615
Epoch: 43414, Training Loss: 0.00694
Epoch: 43414, Training Loss: 0.00650
Epoch: 43414, Training Loss: 0.00793
Epoch: 43414, Training Loss: 0.00615
Epoch: 43415, Training Loss: 0.00694
Epoch: 43415, Training Loss: 0.00650
Epoch: 43415, Training Loss: 0.00793
Epoch: 43415, Training Loss: 0.00615
Epoch: 43416, Training Loss: 0.00694
Epoch: 43416, Training Loss: 0.00650
Epoch: 43416, Training Loss: 0.00793
Epoch: 43416, Training Loss: 0.00615
Epoch: 43417, Training Loss: 0.00694
Epoch: 43417, Training Loss: 0.00650
Epoch: 43417, Training Loss: 0.00793
Epoch: 43417, Training Loss: 0.00615
Epoch: 43418, Training Loss: 0.00694
Epoch: 43418, Training Loss: 0.00650
Epoch: 43418, Training Loss: 0.00793
Epoch: 43418, Training Loss: 0.00615
Epoch: 43419, Training Loss: 0.00694
Epoch: 43419, Training Loss: 0.00650
Epoch: 43419, Training Loss: 0.00793
Epoch: 43419, Training Loss: 0.00615
Epoch: 43420, Training Loss: 0.00694
Epoch: 43420, Training Loss: 0.00650
Epoch: 43420, Training Loss: 0.00793
Epoch: 43420, Training Loss: 0.00615
Epoch: 43421, Training Loss: 0.00694
Epoch: 43421, Training Loss: 0.00650
Epoch: 43421, Training Loss: 0.00793
Epoch: 43421, Training Loss: 0.00615
Epoch: 43422, Training Loss: 0.00694
Epoch: 43422, Training Loss: 0.00650
Epoch: 43422, Training Loss: 0.00793
Epoch: 43422, Training Loss: 0.00615
Epoch: 43423, Training Loss: 0.00694
Epoch: 43423, Training Loss: 0.00650
Epoch: 43423, Training Loss: 0.00793
Epoch: 43423, Training Loss: 0.00615
Epoch: 43424, Training Loss: 0.00694
Epoch: 43424, Training Loss: 0.00650
Epoch: 43424, Training Loss: 0.00793
Epoch: 43424, Training Loss: 0.00615
Epoch: 43425, Training Loss: 0.00694
Epoch: 43425, Training Loss: 0.00650
Epoch: 43425, Training Loss: 0.00793
Epoch: 43425, Training Loss: 0.00615
Epoch: 43426, Training Loss: 0.00694
Epoch: 43426, Training Loss: 0.00650
Epoch: 43426, Training Loss: 0.00793
Epoch: 43426, Training Loss: 0.00615
Epoch: 43427, Training Loss: 0.00694
Epoch: 43427, Training Loss: 0.00650
Epoch: 43427, Training Loss: 0.00793
Epoch: 43427, Training Loss: 0.00615
Epoch: 43428, Training Loss: 0.00694
Epoch: 43428, Training Loss: 0.00650
Epoch: 43428, Training Loss: 0.00793
Epoch: 43428, Training Loss: 0.00615
Epoch: 43429, Training Loss: 0.00694
Epoch: 43429, Training Loss: 0.00650
Epoch: 43429, Training Loss: 0.00793
Epoch: 43429, Training Loss: 0.00615
Epoch: 43430, Training Loss: 0.00694
Epoch: 43430, Training Loss: 0.00650
Epoch: 43430, Training Loss: 0.00793
Epoch: 43430, Training Loss: 0.00615
Epoch: 43431, Training Loss: 0.00694
Epoch: 43431, Training Loss: 0.00650
Epoch: 43431, Training Loss: 0.00793
Epoch: 43431, Training Loss: 0.00615
Epoch: 43432, Training Loss: 0.00694
Epoch: 43432, Training Loss: 0.00650
Epoch: 43432, Training Loss: 0.00793
Epoch: 43432, Training Loss: 0.00615
Epoch: 43433, Training Loss: 0.00694
Epoch: 43433, Training Loss: 0.00650
Epoch: 43433, Training Loss: 0.00793
Epoch: 43433, Training Loss: 0.00615
Epoch: 43434, Training Loss: 0.00694
Epoch: 43434, Training Loss: 0.00650
Epoch: 43434, Training Loss: 0.00793
Epoch: 43434, Training Loss: 0.00615
Epoch: 43435, Training Loss: 0.00694
Epoch: 43435, Training Loss: 0.00650
Epoch: 43435, Training Loss: 0.00793
Epoch: 43435, Training Loss: 0.00615
Epoch: 43436, Training Loss: 0.00694
Epoch: 43436, Training Loss: 0.00650
Epoch: 43436, Training Loss: 0.00793
Epoch: 43436, Training Loss: 0.00615
Epoch: 43437, Training Loss: 0.00694
Epoch: 43437, Training Loss: 0.00650
Epoch: 43437, Training Loss: 0.00793
Epoch: 43437, Training Loss: 0.00615
Epoch: 43438, Training Loss: 0.00694
Epoch: 43438, Training Loss: 0.00650
Epoch: 43438, Training Loss: 0.00793
Epoch: 43438, Training Loss: 0.00615
Epoch: 43439, Training Loss: 0.00694
Epoch: 43439, Training Loss: 0.00650
Epoch: 43439, Training Loss: 0.00793
Epoch: 43439, Training Loss: 0.00615
Epoch: 43440, Training Loss: 0.00694
Epoch: 43440, Training Loss: 0.00650
Epoch: 43440, Training Loss: 0.00793
Epoch: 43440, Training Loss: 0.00615
Epoch: 43441, Training Loss: 0.00694
Epoch: 43441, Training Loss: 0.00650
Epoch: 43441, Training Loss: 0.00793
Epoch: 43441, Training Loss: 0.00615
Epoch: 43442, Training Loss: 0.00694
Epoch: 43442, Training Loss: 0.00650
Epoch: 43442, Training Loss: 0.00793
Epoch: 43442, Training Loss: 0.00615
Epoch: 43443, Training Loss: 0.00694
Epoch: 43443, Training Loss: 0.00650
Epoch: 43443, Training Loss: 0.00793
Epoch: 43443, Training Loss: 0.00615
Epoch: 43444, Training Loss: 0.00694
Epoch: 43444, Training Loss: 0.00650
Epoch: 43444, Training Loss: 0.00793
Epoch: 43444, Training Loss: 0.00615
Epoch: 43445, Training Loss: 0.00694
Epoch: 43445, Training Loss: 0.00650
Epoch: 43445, Training Loss: 0.00793
Epoch: 43445, Training Loss: 0.00615
Epoch: 43446, Training Loss: 0.00694
Epoch: 43446, Training Loss: 0.00650
Epoch: 43446, Training Loss: 0.00793
Epoch: 43446, Training Loss: 0.00615
Epoch: 43447, Training Loss: 0.00694
Epoch: 43447, Training Loss: 0.00650
Epoch: 43447, Training Loss: 0.00793
Epoch: 43447, Training Loss: 0.00615
Epoch: 43448, Training Loss: 0.00694
Epoch: 43448, Training Loss: 0.00650
Epoch: 43448, Training Loss: 0.00793
Epoch: 43448, Training Loss: 0.00615
Epoch: 43449, Training Loss: 0.00694
Epoch: 43449, Training Loss: 0.00650
Epoch: 43449, Training Loss: 0.00793
Epoch: 43449, Training Loss: 0.00615
Epoch: 43450, Training Loss: 0.00694
Epoch: 43450, Training Loss: 0.00649
Epoch: 43450, Training Loss: 0.00793
Epoch: 43450, Training Loss: 0.00615
Epoch: 43451, Training Loss: 0.00694
Epoch: 43451, Training Loss: 0.00649
Epoch: 43451, Training Loss: 0.00793
Epoch: 43451, Training Loss: 0.00615
Epoch: 43452, Training Loss: 0.00694
Epoch: 43452, Training Loss: 0.00649
Epoch: 43452, Training Loss: 0.00793
Epoch: 43452, Training Loss: 0.00615
Epoch: 43453, Training Loss: 0.00694
Epoch: 43453, Training Loss: 0.00649
Epoch: 43453, Training Loss: 0.00793
Epoch: 43453, Training Loss: 0.00615
Epoch: 43454, Training Loss: 0.00694
Epoch: 43454, Training Loss: 0.00649
Epoch: 43454, Training Loss: 0.00793
Epoch: 43454, Training Loss: 0.00615
Epoch: 43455, Training Loss: 0.00694
Epoch: 43455, Training Loss: 0.00649
Epoch: 43455, Training Loss: 0.00793
Epoch: 43455, Training Loss: 0.00615
Epoch: 43456, Training Loss: 0.00693
Epoch: 43456, Training Loss: 0.00649
Epoch: 43456, Training Loss: 0.00793
Epoch: 43456, Training Loss: 0.00615
Epoch: 43457, Training Loss: 0.00693
Epoch: 43457, Training Loss: 0.00649
Epoch: 43457, Training Loss: 0.00793
Epoch: 43457, Training Loss: 0.00615
Epoch: 43458, Training Loss: 0.00693
Epoch: 43458, Training Loss: 0.00649
Epoch: 43458, Training Loss: 0.00792
Epoch: 43458, Training Loss: 0.00615
Epoch: 43459, Training Loss: 0.00693
Epoch: 43459, Training Loss: 0.00649
Epoch: 43459, Training Loss: 0.00792
Epoch: 43459, Training Loss: 0.00615
Epoch: 43460, Training Loss: 0.00693
Epoch: 43460, Training Loss: 0.00649
Epoch: 43460, Training Loss: 0.00792
Epoch: 43460, Training Loss: 0.00615
Epoch: 43461, Training Loss: 0.00693
Epoch: 43461, Training Loss: 0.00649
Epoch: 43461, Training Loss: 0.00792
Epoch: 43461, Training Loss: 0.00615
Epoch: 43462, Training Loss: 0.00693
Epoch: 43462, Training Loss: 0.00649
Epoch: 43462, Training Loss: 0.00792
Epoch: 43462, Training Loss: 0.00615
Epoch: 43463, Training Loss: 0.00693
Epoch: 43463, Training Loss: 0.00649
Epoch: 43463, Training Loss: 0.00792
Epoch: 43463, Training Loss: 0.00615
Epoch: 43464, Training Loss: 0.00693
Epoch: 43464, Training Loss: 0.00649
Epoch: 43464, Training Loss: 0.00792
Epoch: 43464, Training Loss: 0.00615
Epoch: 43465, Training Loss: 0.00693
Epoch: 43465, Training Loss: 0.00649
Epoch: 43465, Training Loss: 0.00792
Epoch: 43465, Training Loss: 0.00615
Epoch: 43466, Training Loss: 0.00693
Epoch: 43466, Training Loss: 0.00649
Epoch: 43466, Training Loss: 0.00792
Epoch: 43466, Training Loss: 0.00615
Epoch: 43467, Training Loss: 0.00693
Epoch: 43467, Training Loss: 0.00649
Epoch: 43467, Training Loss: 0.00792
Epoch: 43467, Training Loss: 0.00615
Epoch: 43468, Training Loss: 0.00693
Epoch: 43468, Training Loss: 0.00649
Epoch: 43468, Training Loss: 0.00792
Epoch: 43468, Training Loss: 0.00615
Epoch: 43469, Training Loss: 0.00693
Epoch: 43469, Training Loss: 0.00649
Epoch: 43469, Training Loss: 0.00792
Epoch: 43469, Training Loss: 0.00615
Epoch: 43470, Training Loss: 0.00693
Epoch: 43470, Training Loss: 0.00649
Epoch: 43470, Training Loss: 0.00792
Epoch: 43470, Training Loss: 0.00615
Epoch: 43471, Training Loss: 0.00693
Epoch: 43471, Training Loss: 0.00649
Epoch: 43471, Training Loss: 0.00792
Epoch: 43471, Training Loss: 0.00615
Epoch: 43472, Training Loss: 0.00693
Epoch: 43472, Training Loss: 0.00649
Epoch: 43472, Training Loss: 0.00792
Epoch: 43472, Training Loss: 0.00615
Epoch: 43473, Training Loss: 0.00693
Epoch: 43473, Training Loss: 0.00649
Epoch: 43473, Training Loss: 0.00792
Epoch: 43473, Training Loss: 0.00615
Epoch: 43474, Training Loss: 0.00693
Epoch: 43474, Training Loss: 0.00649
Epoch: 43474, Training Loss: 0.00792
Epoch: 43474, Training Loss: 0.00615
Epoch: 43475, Training Loss: 0.00693
Epoch: 43475, Training Loss: 0.00649
Epoch: 43475, Training Loss: 0.00792
Epoch: 43475, Training Loss: 0.00615
Epoch: 43476, Training Loss: 0.00693
Epoch: 43476, Training Loss: 0.00649
Epoch: 43476, Training Loss: 0.00792
Epoch: 43476, Training Loss: 0.00615
Epoch: 43477, Training Loss: 0.00693
Epoch: 43477, Training Loss: 0.00649
Epoch: 43477, Training Loss: 0.00792
Epoch: 43477, Training Loss: 0.00615
Epoch: 43478, Training Loss: 0.00693
Epoch: 43478, Training Loss: 0.00649
Epoch: 43478, Training Loss: 0.00792
Epoch: 43478, Training Loss: 0.00615
Epoch: 43479, Training Loss: 0.00693
Epoch: 43479, Training Loss: 0.00649
Epoch: 43479, Training Loss: 0.00792
Epoch: 43479, Training Loss: 0.00615
Epoch: 43480, Training Loss: 0.00693
Epoch: 43480, Training Loss: 0.00649
Epoch: 43480, Training Loss: 0.00792
Epoch: 43480, Training Loss: 0.00615
Epoch: 43481, Training Loss: 0.00693
Epoch: 43481, Training Loss: 0.00649
Epoch: 43481, Training Loss: 0.00792
Epoch: 43481, Training Loss: 0.00615
Epoch: 43482, Training Loss: 0.00693
Epoch: 43482, Training Loss: 0.00649
Epoch: 43482, Training Loss: 0.00792
Epoch: 43482, Training Loss: 0.00615
Epoch: 43483, Training Loss: 0.00693
Epoch: 43483, Training Loss: 0.00649
Epoch: 43483, Training Loss: 0.00792
Epoch: 43483, Training Loss: 0.00615
Epoch: 43484, Training Loss: 0.00693
Epoch: 43484, Training Loss: 0.00649
Epoch: 43484, Training Loss: 0.00792
Epoch: 43484, Training Loss: 0.00615
Epoch: 43485, Training Loss: 0.00693
Epoch: 43485, Training Loss: 0.00649
Epoch: 43485, Training Loss: 0.00792
Epoch: 43485, Training Loss: 0.00615
Epoch: 43486, Training Loss: 0.00693
Epoch: 43486, Training Loss: 0.00649
Epoch: 43486, Training Loss: 0.00792
Epoch: 43486, Training Loss: 0.00615
Epoch: 43487, Training Loss: 0.00693
Epoch: 43487, Training Loss: 0.00649
Epoch: 43487, Training Loss: 0.00792
Epoch: 43487, Training Loss: 0.00615
Epoch: 43488, Training Loss: 0.00693
Epoch: 43488, Training Loss: 0.00649
Epoch: 43488, Training Loss: 0.00792
Epoch: 43488, Training Loss: 0.00615
Epoch: 43489, Training Loss: 0.00693
Epoch: 43489, Training Loss: 0.00649
Epoch: 43489, Training Loss: 0.00792
Epoch: 43489, Training Loss: 0.00615
Epoch: 43490, Training Loss: 0.00693
Epoch: 43490, Training Loss: 0.00649
Epoch: 43490, Training Loss: 0.00792
Epoch: 43490, Training Loss: 0.00615
Epoch: 43491, Training Loss: 0.00693
Epoch: 43491, Training Loss: 0.00649
Epoch: 43491, Training Loss: 0.00792
Epoch: 43491, Training Loss: 0.00615
Epoch: 43492, Training Loss: 0.00693
Epoch: 43492, Training Loss: 0.00649
Epoch: 43492, Training Loss: 0.00792
Epoch: 43492, Training Loss: 0.00615
Epoch: 43493, Training Loss: 0.00693
Epoch: 43493, Training Loss: 0.00649
Epoch: 43493, Training Loss: 0.00792
Epoch: 43493, Training Loss: 0.00615
Epoch: 43494, Training Loss: 0.00693
Epoch: 43494, Training Loss: 0.00649
Epoch: 43494, Training Loss: 0.00792
Epoch: 43494, Training Loss: 0.00615
Epoch: 43495, Training Loss: 0.00693
Epoch: 43495, Training Loss: 0.00649
Epoch: 43495, Training Loss: 0.00792
Epoch: 43495, Training Loss: 0.00615
Epoch: 43496, Training Loss: 0.00693
Epoch: 43496, Training Loss: 0.00649
Epoch: 43496, Training Loss: 0.00792
Epoch: 43496, Training Loss: 0.00615
Epoch: 43497, Training Loss: 0.00693
Epoch: 43497, Training Loss: 0.00649
Epoch: 43497, Training Loss: 0.00792
Epoch: 43497, Training Loss: 0.00615
Epoch: 43498, Training Loss: 0.00693
Epoch: 43498, Training Loss: 0.00649
Epoch: 43498, Training Loss: 0.00792
Epoch: 43498, Training Loss: 0.00615
Epoch: 43499, Training Loss: 0.00693
Epoch: 43499, Training Loss: 0.00649
Epoch: 43499, Training Loss: 0.00792
Epoch: 43499, Training Loss: 0.00615
Epoch: 43500, Training Loss: 0.00693
Epoch: 43500, Training Loss: 0.00649
Epoch: 43500, Training Loss: 0.00792
Epoch: 43500, Training Loss: 0.00615
Epoch: 43501, Training Loss: 0.00693
Epoch: 43501, Training Loss: 0.00649
Epoch: 43501, Training Loss: 0.00792
Epoch: 43501, Training Loss: 0.00615
Epoch: 43502, Training Loss: 0.00693
Epoch: 43502, Training Loss: 0.00649
Epoch: 43502, Training Loss: 0.00792
Epoch: 43502, Training Loss: 0.00615
Epoch: 43503, Training Loss: 0.00693
Epoch: 43503, Training Loss: 0.00649
Epoch: 43503, Training Loss: 0.00792
Epoch: 43503, Training Loss: 0.00615
Epoch: 43504, Training Loss: 0.00693
Epoch: 43504, Training Loss: 0.00649
Epoch: 43504, Training Loss: 0.00792
Epoch: 43504, Training Loss: 0.00615
Epoch: 43505, Training Loss: 0.00693
Epoch: 43505, Training Loss: 0.00649
Epoch: 43505, Training Loss: 0.00792
Epoch: 43505, Training Loss: 0.00615
Epoch: 43506, Training Loss: 0.00693
Epoch: 43506, Training Loss: 0.00649
Epoch: 43506, Training Loss: 0.00792
Epoch: 43506, Training Loss: 0.00614
Epoch: 43507, Training Loss: 0.00693
Epoch: 43507, Training Loss: 0.00649
Epoch: 43507, Training Loss: 0.00792
Epoch: 43507, Training Loss: 0.00614
[... output truncated: the same four per-pattern losses repeat each epoch through 43749, decreasing only marginally (0.00693 -> 0.00691, 0.00649 -> 0.00647, 0.00792 -> 0.00790, 0.00614 -> 0.00613) ...]
Epoch: 43750, Training Loss: 0.00691
Epoch: 43750, Training Loss: 0.00647
Epoch: 43750, Training Loss: 0.00790
Epoch: 43750, Training Loss: 0.00613
Epoch: 43751, Training Loss: 0.00691
Epoch: 43751, Training Loss: 0.00647
Epoch: 43751, Training Loss: 0.00790
Epoch: 43751, Training Loss: 0.00613
Epoch: 43752, Training Loss: 0.00691
Epoch: 43752, Training Loss: 0.00647
Epoch: 43752, Training Loss: 0.00790
Epoch: 43752, Training Loss: 0.00613
Epoch: 43753, Training Loss: 0.00691
Epoch: 43753, Training Loss: 0.00647
Epoch: 43753, Training Loss: 0.00790
Epoch: 43753, Training Loss: 0.00613
Epoch: 43754, Training Loss: 0.00691
Epoch: 43754, Training Loss: 0.00647
Epoch: 43754, Training Loss: 0.00790
Epoch: 43754, Training Loss: 0.00613
Epoch: 43755, Training Loss: 0.00691
Epoch: 43755, Training Loss: 0.00647
Epoch: 43755, Training Loss: 0.00790
Epoch: 43755, Training Loss: 0.00613
Epoch: 43756, Training Loss: 0.00691
Epoch: 43756, Training Loss: 0.00647
Epoch: 43756, Training Loss: 0.00790
Epoch: 43756, Training Loss: 0.00613
Epoch: 43757, Training Loss: 0.00691
Epoch: 43757, Training Loss: 0.00647
Epoch: 43757, Training Loss: 0.00790
Epoch: 43757, Training Loss: 0.00613
Epoch: 43758, Training Loss: 0.00691
Epoch: 43758, Training Loss: 0.00647
Epoch: 43758, Training Loss: 0.00790
Epoch: 43758, Training Loss: 0.00613
Epoch: 43759, Training Loss: 0.00691
Epoch: 43759, Training Loss: 0.00647
Epoch: 43759, Training Loss: 0.00790
Epoch: 43759, Training Loss: 0.00613
Epoch: 43760, Training Loss: 0.00691
Epoch: 43760, Training Loss: 0.00647
Epoch: 43760, Training Loss: 0.00790
Epoch: 43760, Training Loss: 0.00613
Epoch: 43761, Training Loss: 0.00691
Epoch: 43761, Training Loss: 0.00647
Epoch: 43761, Training Loss: 0.00790
Epoch: 43761, Training Loss: 0.00613
Epoch: 43762, Training Loss: 0.00691
Epoch: 43762, Training Loss: 0.00647
Epoch: 43762, Training Loss: 0.00790
Epoch: 43762, Training Loss: 0.00613
Epoch: 43763, Training Loss: 0.00691
Epoch: 43763, Training Loss: 0.00647
Epoch: 43763, Training Loss: 0.00790
Epoch: 43763, Training Loss: 0.00613
Epoch: 43764, Training Loss: 0.00691
Epoch: 43764, Training Loss: 0.00647
Epoch: 43764, Training Loss: 0.00790
Epoch: 43764, Training Loss: 0.00613
Epoch: 43765, Training Loss: 0.00691
Epoch: 43765, Training Loss: 0.00647
Epoch: 43765, Training Loss: 0.00790
Epoch: 43765, Training Loss: 0.00613
Epoch: 43766, Training Loss: 0.00691
Epoch: 43766, Training Loss: 0.00647
Epoch: 43766, Training Loss: 0.00790
Epoch: 43766, Training Loss: 0.00613
Epoch: 43767, Training Loss: 0.00691
Epoch: 43767, Training Loss: 0.00647
Epoch: 43767, Training Loss: 0.00790
Epoch: 43767, Training Loss: 0.00613
Epoch: 43768, Training Loss: 0.00691
Epoch: 43768, Training Loss: 0.00647
Epoch: 43768, Training Loss: 0.00790
Epoch: 43768, Training Loss: 0.00613
Epoch: 43769, Training Loss: 0.00691
Epoch: 43769, Training Loss: 0.00647
Epoch: 43769, Training Loss: 0.00790
Epoch: 43769, Training Loss: 0.00613
Epoch: 43770, Training Loss: 0.00691
Epoch: 43770, Training Loss: 0.00647
Epoch: 43770, Training Loss: 0.00790
Epoch: 43770, Training Loss: 0.00613
Epoch: 43771, Training Loss: 0.00691
Epoch: 43771, Training Loss: 0.00647
Epoch: 43771, Training Loss: 0.00790
Epoch: 43771, Training Loss: 0.00613
Epoch: 43772, Training Loss: 0.00691
Epoch: 43772, Training Loss: 0.00647
Epoch: 43772, Training Loss: 0.00790
Epoch: 43772, Training Loss: 0.00613
Epoch: 43773, Training Loss: 0.00691
Epoch: 43773, Training Loss: 0.00647
Epoch: 43773, Training Loss: 0.00790
Epoch: 43773, Training Loss: 0.00613
Epoch: 43774, Training Loss: 0.00691
Epoch: 43774, Training Loss: 0.00647
Epoch: 43774, Training Loss: 0.00789
Epoch: 43774, Training Loss: 0.00613
Epoch: 43775, Training Loss: 0.00691
Epoch: 43775, Training Loss: 0.00647
Epoch: 43775, Training Loss: 0.00789
Epoch: 43775, Training Loss: 0.00613
Epoch: 43776, Training Loss: 0.00691
Epoch: 43776, Training Loss: 0.00647
Epoch: 43776, Training Loss: 0.00789
Epoch: 43776, Training Loss: 0.00613
Epoch: 43777, Training Loss: 0.00691
Epoch: 43777, Training Loss: 0.00647
Epoch: 43777, Training Loss: 0.00789
Epoch: 43777, Training Loss: 0.00613
Epoch: 43778, Training Loss: 0.00691
Epoch: 43778, Training Loss: 0.00647
Epoch: 43778, Training Loss: 0.00789
Epoch: 43778, Training Loss: 0.00613
Epoch: 43779, Training Loss: 0.00691
Epoch: 43779, Training Loss: 0.00647
Epoch: 43779, Training Loss: 0.00789
Epoch: 43779, Training Loss: 0.00613
Epoch: 43780, Training Loss: 0.00691
Epoch: 43780, Training Loss: 0.00647
Epoch: 43780, Training Loss: 0.00789
Epoch: 43780, Training Loss: 0.00612
Epoch: 43781, Training Loss: 0.00691
Epoch: 43781, Training Loss: 0.00647
Epoch: 43781, Training Loss: 0.00789
Epoch: 43781, Training Loss: 0.00612
Epoch: 43782, Training Loss: 0.00691
Epoch: 43782, Training Loss: 0.00647
Epoch: 43782, Training Loss: 0.00789
Epoch: 43782, Training Loss: 0.00612
Epoch: 43783, Training Loss: 0.00691
Epoch: 43783, Training Loss: 0.00647
Epoch: 43783, Training Loss: 0.00789
Epoch: 43783, Training Loss: 0.00612
Epoch: 43784, Training Loss: 0.00691
Epoch: 43784, Training Loss: 0.00647
Epoch: 43784, Training Loss: 0.00789
Epoch: 43784, Training Loss: 0.00612
Epoch: 43785, Training Loss: 0.00691
Epoch: 43785, Training Loss: 0.00647
Epoch: 43785, Training Loss: 0.00789
Epoch: 43785, Training Loss: 0.00612
Epoch: 43786, Training Loss: 0.00691
Epoch: 43786, Training Loss: 0.00647
Epoch: 43786, Training Loss: 0.00789
Epoch: 43786, Training Loss: 0.00612
Epoch: 43787, Training Loss: 0.00691
Epoch: 43787, Training Loss: 0.00647
Epoch: 43787, Training Loss: 0.00789
Epoch: 43787, Training Loss: 0.00612
Epoch: 43788, Training Loss: 0.00691
Epoch: 43788, Training Loss: 0.00647
Epoch: 43788, Training Loss: 0.00789
Epoch: 43788, Training Loss: 0.00612
Epoch: 43789, Training Loss: 0.00691
Epoch: 43789, Training Loss: 0.00647
Epoch: 43789, Training Loss: 0.00789
Epoch: 43789, Training Loss: 0.00612
Epoch: 43790, Training Loss: 0.00691
Epoch: 43790, Training Loss: 0.00647
Epoch: 43790, Training Loss: 0.00789
Epoch: 43790, Training Loss: 0.00612
Epoch: 43791, Training Loss: 0.00691
Epoch: 43791, Training Loss: 0.00647
Epoch: 43791, Training Loss: 0.00789
Epoch: 43791, Training Loss: 0.00612
Epoch: 43792, Training Loss: 0.00691
Epoch: 43792, Training Loss: 0.00647
Epoch: 43792, Training Loss: 0.00789
Epoch: 43792, Training Loss: 0.00612
Epoch: 43793, Training Loss: 0.00691
Epoch: 43793, Training Loss: 0.00647
Epoch: 43793, Training Loss: 0.00789
Epoch: 43793, Training Loss: 0.00612
Epoch: 43794, Training Loss: 0.00691
Epoch: 43794, Training Loss: 0.00647
Epoch: 43794, Training Loss: 0.00789
Epoch: 43794, Training Loss: 0.00612
Epoch: 43795, Training Loss: 0.00691
Epoch: 43795, Training Loss: 0.00647
Epoch: 43795, Training Loss: 0.00789
Epoch: 43795, Training Loss: 0.00612
Epoch: 43796, Training Loss: 0.00691
Epoch: 43796, Training Loss: 0.00647
Epoch: 43796, Training Loss: 0.00789
Epoch: 43796, Training Loss: 0.00612
Epoch: 43797, Training Loss: 0.00691
Epoch: 43797, Training Loss: 0.00647
Epoch: 43797, Training Loss: 0.00789
Epoch: 43797, Training Loss: 0.00612
Epoch: 43798, Training Loss: 0.00691
Epoch: 43798, Training Loss: 0.00647
Epoch: 43798, Training Loss: 0.00789
Epoch: 43798, Training Loss: 0.00612
Epoch: 43799, Training Loss: 0.00691
Epoch: 43799, Training Loss: 0.00647
Epoch: 43799, Training Loss: 0.00789
Epoch: 43799, Training Loss: 0.00612
Epoch: 43800, Training Loss: 0.00691
Epoch: 43800, Training Loss: 0.00647
Epoch: 43800, Training Loss: 0.00789
Epoch: 43800, Training Loss: 0.00612
Epoch: 43801, Training Loss: 0.00691
Epoch: 43801, Training Loss: 0.00647
Epoch: 43801, Training Loss: 0.00789
Epoch: 43801, Training Loss: 0.00612
Epoch: 43802, Training Loss: 0.00691
Epoch: 43802, Training Loss: 0.00647
Epoch: 43802, Training Loss: 0.00789
Epoch: 43802, Training Loss: 0.00612
Epoch: 43803, Training Loss: 0.00691
Epoch: 43803, Training Loss: 0.00647
Epoch: 43803, Training Loss: 0.00789
Epoch: 43803, Training Loss: 0.00612
Epoch: 43804, Training Loss: 0.00691
Epoch: 43804, Training Loss: 0.00647
Epoch: 43804, Training Loss: 0.00789
Epoch: 43804, Training Loss: 0.00612
Epoch: 43805, Training Loss: 0.00691
Epoch: 43805, Training Loss: 0.00647
Epoch: 43805, Training Loss: 0.00789
Epoch: 43805, Training Loss: 0.00612
Epoch: 43806, Training Loss: 0.00691
Epoch: 43806, Training Loss: 0.00647
Epoch: 43806, Training Loss: 0.00789
Epoch: 43806, Training Loss: 0.00612
Epoch: 43807, Training Loss: 0.00691
Epoch: 43807, Training Loss: 0.00647
Epoch: 43807, Training Loss: 0.00789
Epoch: 43807, Training Loss: 0.00612
Epoch: 43808, Training Loss: 0.00691
Epoch: 43808, Training Loss: 0.00647
Epoch: 43808, Training Loss: 0.00789
Epoch: 43808, Training Loss: 0.00612
Epoch: 43809, Training Loss: 0.00691
Epoch: 43809, Training Loss: 0.00647
Epoch: 43809, Training Loss: 0.00789
Epoch: 43809, Training Loss: 0.00612
Epoch: 43810, Training Loss: 0.00691
Epoch: 43810, Training Loss: 0.00647
Epoch: 43810, Training Loss: 0.00789
Epoch: 43810, Training Loss: 0.00612
Epoch: 43811, Training Loss: 0.00691
Epoch: 43811, Training Loss: 0.00647
Epoch: 43811, Training Loss: 0.00789
Epoch: 43811, Training Loss: 0.00612
Epoch: 43812, Training Loss: 0.00691
Epoch: 43812, Training Loss: 0.00647
Epoch: 43812, Training Loss: 0.00789
Epoch: 43812, Training Loss: 0.00612
Epoch: 43813, Training Loss: 0.00691
Epoch: 43813, Training Loss: 0.00647
Epoch: 43813, Training Loss: 0.00789
Epoch: 43813, Training Loss: 0.00612
Epoch: 43814, Training Loss: 0.00691
Epoch: 43814, Training Loss: 0.00647
Epoch: 43814, Training Loss: 0.00789
Epoch: 43814, Training Loss: 0.00612
Epoch: 43815, Training Loss: 0.00691
Epoch: 43815, Training Loss: 0.00647
Epoch: 43815, Training Loss: 0.00789
Epoch: 43815, Training Loss: 0.00612
Epoch: 43816, Training Loss: 0.00690
Epoch: 43816, Training Loss: 0.00647
Epoch: 43816, Training Loss: 0.00789
Epoch: 43816, Training Loss: 0.00612
Epoch: 43817, Training Loss: 0.00690
Epoch: 43817, Training Loss: 0.00647
Epoch: 43817, Training Loss: 0.00789
Epoch: 43817, Training Loss: 0.00612
Epoch: 43818, Training Loss: 0.00690
Epoch: 43818, Training Loss: 0.00647
Epoch: 43818, Training Loss: 0.00789
Epoch: 43818, Training Loss: 0.00612
Epoch: 43819, Training Loss: 0.00690
Epoch: 43819, Training Loss: 0.00647
Epoch: 43819, Training Loss: 0.00789
Epoch: 43819, Training Loss: 0.00612
Epoch: 43820, Training Loss: 0.00690
Epoch: 43820, Training Loss: 0.00647
Epoch: 43820, Training Loss: 0.00789
Epoch: 43820, Training Loss: 0.00612
Epoch: 43821, Training Loss: 0.00690
Epoch: 43821, Training Loss: 0.00647
Epoch: 43821, Training Loss: 0.00789
Epoch: 43821, Training Loss: 0.00612
Epoch: 43822, Training Loss: 0.00690
Epoch: 43822, Training Loss: 0.00647
Epoch: 43822, Training Loss: 0.00789
Epoch: 43822, Training Loss: 0.00612
Epoch: 43823, Training Loss: 0.00690
Epoch: 43823, Training Loss: 0.00647
Epoch: 43823, Training Loss: 0.00789
Epoch: 43823, Training Loss: 0.00612
Epoch: 43824, Training Loss: 0.00690
Epoch: 43824, Training Loss: 0.00647
Epoch: 43824, Training Loss: 0.00789
Epoch: 43824, Training Loss: 0.00612
Epoch: 43825, Training Loss: 0.00690
Epoch: 43825, Training Loss: 0.00647
Epoch: 43825, Training Loss: 0.00789
Epoch: 43825, Training Loss: 0.00612
Epoch: 43826, Training Loss: 0.00690
Epoch: 43826, Training Loss: 0.00647
Epoch: 43826, Training Loss: 0.00789
Epoch: 43826, Training Loss: 0.00612
Epoch: 43827, Training Loss: 0.00690
Epoch: 43827, Training Loss: 0.00647
Epoch: 43827, Training Loss: 0.00789
Epoch: 43827, Training Loss: 0.00612
Epoch: 43828, Training Loss: 0.00690
Epoch: 43828, Training Loss: 0.00647
Epoch: 43828, Training Loss: 0.00789
Epoch: 43828, Training Loss: 0.00612
Epoch: 43829, Training Loss: 0.00690
Epoch: 43829, Training Loss: 0.00647
Epoch: 43829, Training Loss: 0.00789
Epoch: 43829, Training Loss: 0.00612
Epoch: 43830, Training Loss: 0.00690
Epoch: 43830, Training Loss: 0.00647
Epoch: 43830, Training Loss: 0.00789
Epoch: 43830, Training Loss: 0.00612
Epoch: 43831, Training Loss: 0.00690
Epoch: 43831, Training Loss: 0.00647
Epoch: 43831, Training Loss: 0.00789
Epoch: 43831, Training Loss: 0.00612
Epoch: 43832, Training Loss: 0.00690
Epoch: 43832, Training Loss: 0.00647
Epoch: 43832, Training Loss: 0.00789
Epoch: 43832, Training Loss: 0.00612
Epoch: 43833, Training Loss: 0.00690
Epoch: 43833, Training Loss: 0.00647
Epoch: 43833, Training Loss: 0.00789
Epoch: 43833, Training Loss: 0.00612
Epoch: 43834, Training Loss: 0.00690
Epoch: 43834, Training Loss: 0.00647
Epoch: 43834, Training Loss: 0.00789
Epoch: 43834, Training Loss: 0.00612
Epoch: 43835, Training Loss: 0.00690
Epoch: 43835, Training Loss: 0.00647
Epoch: 43835, Training Loss: 0.00789
Epoch: 43835, Training Loss: 0.00612
Epoch: 43836, Training Loss: 0.00690
Epoch: 43836, Training Loss: 0.00647
Epoch: 43836, Training Loss: 0.00789
Epoch: 43836, Training Loss: 0.00612
Epoch: 43837, Training Loss: 0.00690
Epoch: 43837, Training Loss: 0.00647
Epoch: 43837, Training Loss: 0.00789
Epoch: 43837, Training Loss: 0.00612
Epoch: 43838, Training Loss: 0.00690
Epoch: 43838, Training Loss: 0.00647
Epoch: 43838, Training Loss: 0.00789
Epoch: 43838, Training Loss: 0.00612
Epoch: 43839, Training Loss: 0.00690
Epoch: 43839, Training Loss: 0.00647
Epoch: 43839, Training Loss: 0.00789
Epoch: 43839, Training Loss: 0.00612
Epoch: 43840, Training Loss: 0.00690
Epoch: 43840, Training Loss: 0.00646
Epoch: 43840, Training Loss: 0.00789
Epoch: 43840, Training Loss: 0.00612
Epoch: 43841, Training Loss: 0.00690
Epoch: 43841, Training Loss: 0.00646
Epoch: 43841, Training Loss: 0.00789
Epoch: 43841, Training Loss: 0.00612
Epoch: 43842, Training Loss: 0.00690
Epoch: 43842, Training Loss: 0.00646
Epoch: 43842, Training Loss: 0.00789
Epoch: 43842, Training Loss: 0.00612
Epoch: 43843, Training Loss: 0.00690
Epoch: 43843, Training Loss: 0.00646
Epoch: 43843, Training Loss: 0.00789
Epoch: 43843, Training Loss: 0.00612
Epoch: 43844, Training Loss: 0.00690
Epoch: 43844, Training Loss: 0.00646
Epoch: 43844, Training Loss: 0.00789
Epoch: 43844, Training Loss: 0.00612
Epoch: 43845, Training Loss: 0.00690
Epoch: 43845, Training Loss: 0.00646
Epoch: 43845, Training Loss: 0.00789
Epoch: 43845, Training Loss: 0.00612
Epoch: 43846, Training Loss: 0.00690
Epoch: 43846, Training Loss: 0.00646
Epoch: 43846, Training Loss: 0.00789
Epoch: 43846, Training Loss: 0.00612
Epoch: 43847, Training Loss: 0.00690
Epoch: 43847, Training Loss: 0.00646
Epoch: 43847, Training Loss: 0.00789
Epoch: 43847, Training Loss: 0.00612
Epoch: 43848, Training Loss: 0.00690
Epoch: 43848, Training Loss: 0.00646
Epoch: 43848, Training Loss: 0.00789
Epoch: 43848, Training Loss: 0.00612
Epoch: 43849, Training Loss: 0.00690
Epoch: 43849, Training Loss: 0.00646
Epoch: 43849, Training Loss: 0.00789
Epoch: 43849, Training Loss: 0.00612
Epoch: 43850, Training Loss: 0.00690
Epoch: 43850, Training Loss: 0.00646
Epoch: 43850, Training Loss: 0.00789
Epoch: 43850, Training Loss: 0.00612
Epoch: 43851, Training Loss: 0.00690
Epoch: 43851, Training Loss: 0.00646
Epoch: 43851, Training Loss: 0.00789
Epoch: 43851, Training Loss: 0.00612
Epoch: 43852, Training Loss: 0.00690
Epoch: 43852, Training Loss: 0.00646
Epoch: 43852, Training Loss: 0.00789
Epoch: 43852, Training Loss: 0.00612
Epoch: 43853, Training Loss: 0.00690
Epoch: 43853, Training Loss: 0.00646
Epoch: 43853, Training Loss: 0.00789
Epoch: 43853, Training Loss: 0.00612
Epoch: 43854, Training Loss: 0.00690
Epoch: 43854, Training Loss: 0.00646
Epoch: 43854, Training Loss: 0.00789
Epoch: 43854, Training Loss: 0.00612
Epoch: 43855, Training Loss: 0.00690
Epoch: 43855, Training Loss: 0.00646
Epoch: 43855, Training Loss: 0.00789
Epoch: 43855, Training Loss: 0.00612
Epoch: 43856, Training Loss: 0.00690
Epoch: 43856, Training Loss: 0.00646
Epoch: 43856, Training Loss: 0.00789
Epoch: 43856, Training Loss: 0.00612
Epoch: 43857, Training Loss: 0.00690
Epoch: 43857, Training Loss: 0.00646
Epoch: 43857, Training Loss: 0.00789
Epoch: 43857, Training Loss: 0.00612
Epoch: 43858, Training Loss: 0.00690
Epoch: 43858, Training Loss: 0.00646
Epoch: 43858, Training Loss: 0.00789
Epoch: 43858, Training Loss: 0.00612
Epoch: 43859, Training Loss: 0.00690
Epoch: 43859, Training Loss: 0.00646
Epoch: 43859, Training Loss: 0.00789
Epoch: 43859, Training Loss: 0.00612
Epoch: 43860, Training Loss: 0.00690
Epoch: 43860, Training Loss: 0.00646
Epoch: 43860, Training Loss: 0.00789
Epoch: 43860, Training Loss: 0.00612
Epoch: 43861, Training Loss: 0.00690
Epoch: 43861, Training Loss: 0.00646
Epoch: 43861, Training Loss: 0.00789
Epoch: 43861, Training Loss: 0.00612
Epoch: 43862, Training Loss: 0.00690
Epoch: 43862, Training Loss: 0.00646
Epoch: 43862, Training Loss: 0.00789
Epoch: 43862, Training Loss: 0.00612
Epoch: 43863, Training Loss: 0.00690
Epoch: 43863, Training Loss: 0.00646
Epoch: 43863, Training Loss: 0.00789
Epoch: 43863, Training Loss: 0.00612
Epoch: 43864, Training Loss: 0.00690
Epoch: 43864, Training Loss: 0.00646
Epoch: 43864, Training Loss: 0.00789
Epoch: 43864, Training Loss: 0.00612
Epoch: 43865, Training Loss: 0.00690
Epoch: 43865, Training Loss: 0.00646
Epoch: 43865, Training Loss: 0.00789
Epoch: 43865, Training Loss: 0.00612
Epoch: 43866, Training Loss: 0.00690
Epoch: 43866, Training Loss: 0.00646
Epoch: 43866, Training Loss: 0.00789
Epoch: 43866, Training Loss: 0.00612
Epoch: 43867, Training Loss: 0.00690
Epoch: 43867, Training Loss: 0.00646
Epoch: 43867, Training Loss: 0.00789
Epoch: 43867, Training Loss: 0.00612
Epoch: 43868, Training Loss: 0.00690
Epoch: 43868, Training Loss: 0.00646
Epoch: 43868, Training Loss: 0.00789
Epoch: 43868, Training Loss: 0.00612
Epoch: 43869, Training Loss: 0.00690
Epoch: 43869, Training Loss: 0.00646
Epoch: 43869, Training Loss: 0.00789
Epoch: 43869, Training Loss: 0.00612
Epoch: 43870, Training Loss: 0.00690
Epoch: 43870, Training Loss: 0.00646
Epoch: 43870, Training Loss: 0.00789
Epoch: 43870, Training Loss: 0.00612
Epoch: 43871, Training Loss: 0.00690
Epoch: 43871, Training Loss: 0.00646
Epoch: 43871, Training Loss: 0.00789
Epoch: 43871, Training Loss: 0.00612
Epoch: 43872, Training Loss: 0.00690
Epoch: 43872, Training Loss: 0.00646
Epoch: 43872, Training Loss: 0.00789
Epoch: 43872, Training Loss: 0.00612
Epoch: 43873, Training Loss: 0.00690
Epoch: 43873, Training Loss: 0.00646
Epoch: 43873, Training Loss: 0.00789
Epoch: 43873, Training Loss: 0.00612
Epoch: 43874, Training Loss: 0.00690
Epoch: 43874, Training Loss: 0.00646
Epoch: 43874, Training Loss: 0.00789
Epoch: 43874, Training Loss: 0.00612
Epoch: 43875, Training Loss: 0.00690
Epoch: 43875, Training Loss: 0.00646
Epoch: 43875, Training Loss: 0.00789
Epoch: 43875, Training Loss: 0.00612
Epoch: 43876, Training Loss: 0.00690
Epoch: 43876, Training Loss: 0.00646
Epoch: 43876, Training Loss: 0.00789
Epoch: 43876, Training Loss: 0.00612
Epoch: 43877, Training Loss: 0.00690
Epoch: 43877, Training Loss: 0.00646
Epoch: 43877, Training Loss: 0.00789
Epoch: 43877, Training Loss: 0.00612
Epoch: 43878, Training Loss: 0.00690
Epoch: 43878, Training Loss: 0.00646
Epoch: 43878, Training Loss: 0.00789
Epoch: 43878, Training Loss: 0.00612
Epoch: 43879, Training Loss: 0.00690
Epoch: 43879, Training Loss: 0.00646
Epoch: 43879, Training Loss: 0.00789
Epoch: 43879, Training Loss: 0.00612
Epoch: 43880, Training Loss: 0.00690
Epoch: 43880, Training Loss: 0.00646
Epoch: 43880, Training Loss: 0.00788
Epoch: 43880, Training Loss: 0.00612
Epoch: 43881, Training Loss: 0.00690
Epoch: 43881, Training Loss: 0.00646
Epoch: 43881, Training Loss: 0.00788
Epoch: 43881, Training Loss: 0.00612
Epoch: 43882, Training Loss: 0.00690
Epoch: 43882, Training Loss: 0.00646
Epoch: 43882, Training Loss: 0.00788
Epoch: 43882, Training Loss: 0.00612
Epoch: 43883, Training Loss: 0.00690
Epoch: 43883, Training Loss: 0.00646
Epoch: 43883, Training Loss: 0.00788
Epoch: 43883, Training Loss: 0.00612
Epoch: 43884, Training Loss: 0.00690
Epoch: 43884, Training Loss: 0.00646
Epoch: 43884, Training Loss: 0.00788
Epoch: 43884, Training Loss: 0.00612
Epoch: 43885, Training Loss: 0.00690
Epoch: 43885, Training Loss: 0.00646
Epoch: 43885, Training Loss: 0.00788
Epoch: 43885, Training Loss: 0.00612
Epoch: 43886, Training Loss: 0.00690
Epoch: 43886, Training Loss: 0.00646
Epoch: 43886, Training Loss: 0.00788
Epoch: 43886, Training Loss: 0.00612
Epoch: 43887, Training Loss: 0.00690
Epoch: 43887, Training Loss: 0.00646
Epoch: 43887, Training Loss: 0.00788
Epoch: 43887, Training Loss: 0.00612
Epoch: 43888, Training Loss: 0.00690
Epoch: 43888, Training Loss: 0.00646
Epoch: 43888, Training Loss: 0.00788
Epoch: 43888, Training Loss: 0.00612
Epoch: 43889, Training Loss: 0.00690
Epoch: 43889, Training Loss: 0.00646
Epoch: 43889, Training Loss: 0.00788
Epoch: 43889, Training Loss: 0.00612
Epoch: 43890, Training Loss: 0.00690
Epoch: 43890, Training Loss: 0.00646
Epoch: 43890, Training Loss: 0.00788
Epoch: 43890, Training Loss: 0.00612
Epoch: 43891, Training Loss: 0.00690
Epoch: 43891, Training Loss: 0.00646
Epoch: 43891, Training Loss: 0.00788
Epoch: 43891, Training Loss: 0.00612
Epoch: 43892, Training Loss: 0.00690
Epoch: 43892, Training Loss: 0.00646
Epoch: 43892, Training Loss: 0.00788
Epoch: 43892, Training Loss: 0.00612
Epoch: 43893, Training Loss: 0.00690
Epoch: 43893, Training Loss: 0.00646
Epoch: 43893, Training Loss: 0.00788
Epoch: 43893, Training Loss: 0.00612
Epoch: 43894, Training Loss: 0.00690
Epoch: 43894, Training Loss: 0.00646
Epoch: 43894, Training Loss: 0.00788
Epoch: 43894, Training Loss: 0.00612
Epoch: 43895, Training Loss: 0.00690
Epoch: 43895, Training Loss: 0.00646
Epoch: 43895, Training Loss: 0.00788
Epoch: 43895, Training Loss: 0.00612
Epoch: 43896, Training Loss: 0.00690
Epoch: 43896, Training Loss: 0.00646
Epoch: 43896, Training Loss: 0.00788
Epoch: 43896, Training Loss: 0.00612
Epoch: 43897, Training Loss: 0.00690
Epoch: 43897, Training Loss: 0.00646
Epoch: 43897, Training Loss: 0.00788
Epoch: 43897, Training Loss: 0.00612
Epoch: 43898, Training Loss: 0.00690
Epoch: 43898, Training Loss: 0.00646
Epoch: 43898, Training Loss: 0.00788
Epoch: 43898, Training Loss: 0.00612
Epoch: 43899, Training Loss: 0.00690
Epoch: 43899, Training Loss: 0.00646
Epoch: 43899, Training Loss: 0.00788
Epoch: 43899, Training Loss: 0.00612
Epoch: 43900, Training Loss: 0.00690
Epoch: 43900, Training Loss: 0.00646
Epoch: 43900, Training Loss: 0.00788
Epoch: 43900, Training Loss: 0.00612
Epoch: 43901, Training Loss: 0.00690
Epoch: 43901, Training Loss: 0.00646
Epoch: 43901, Training Loss: 0.00788
Epoch: 43901, Training Loss: 0.00612
Epoch: 43902, Training Loss: 0.00690
Epoch: 43902, Training Loss: 0.00646
Epoch: 43902, Training Loss: 0.00788
Epoch: 43902, Training Loss: 0.00612
Epoch: 43903, Training Loss: 0.00690
Epoch: 43903, Training Loss: 0.00646
Epoch: 43903, Training Loss: 0.00788
Epoch: 43903, Training Loss: 0.00612
Epoch: 43904, Training Loss: 0.00690
Epoch: 43904, Training Loss: 0.00646
Epoch: 43904, Training Loss: 0.00788
Epoch: 43904, Training Loss: 0.00612
Epoch: 43905, Training Loss: 0.00690
Epoch: 43905, Training Loss: 0.00646
Epoch: 43905, Training Loss: 0.00788
Epoch: 43905, Training Loss: 0.00612
Epoch: 43906, Training Loss: 0.00690
Epoch: 43906, Training Loss: 0.00646
Epoch: 43906, Training Loss: 0.00788
Epoch: 43906, Training Loss: 0.00612
Epoch: 43907, Training Loss: 0.00690
Epoch: 43907, Training Loss: 0.00646
Epoch: 43907, Training Loss: 0.00788
Epoch: 43907, Training Loss: 0.00612
Epoch: 43908, Training Loss: 0.00690
Epoch: 43908, Training Loss: 0.00646
Epoch: 43908, Training Loss: 0.00788
Epoch: 43908, Training Loss: 0.00612
Epoch: 43909, Training Loss: 0.00690
Epoch: 43909, Training Loss: 0.00646
Epoch: 43909, Training Loss: 0.00788
Epoch: 43909, Training Loss: 0.00612
Epoch: 43910, Training Loss: 0.00690
Epoch: 43910, Training Loss: 0.00646
Epoch: 43910, Training Loss: 0.00788
Epoch: 43910, Training Loss: 0.00612
Epoch: 43911, Training Loss: 0.00690
Epoch: 43911, Training Loss: 0.00646
Epoch: 43911, Training Loss: 0.00788
Epoch: 43911, Training Loss: 0.00612
Epoch: 43912, Training Loss: 0.00690
Epoch: 43912, Training Loss: 0.00646
Epoch: 43912, Training Loss: 0.00788
Epoch: 43912, Training Loss: 0.00612
Epoch: 43913, Training Loss: 0.00690
Epoch: 43913, Training Loss: 0.00646
Epoch: 43913, Training Loss: 0.00788
Epoch: 43913, Training Loss: 0.00612
Epoch: 43914, Training Loss: 0.00690
Epoch: 43914, Training Loss: 0.00646
Epoch: 43914, Training Loss: 0.00788
Epoch: 43914, Training Loss: 0.00612
Epoch: 43915, Training Loss: 0.00690
Epoch: 43915, Training Loss: 0.00646
Epoch: 43915, Training Loss: 0.00788
Epoch: 43915, Training Loss: 0.00612
Epoch: 43916, Training Loss: 0.00690
Epoch: 43916, Training Loss: 0.00646
Epoch: 43916, Training Loss: 0.00788
Epoch: 43916, Training Loss: 0.00612
Epoch: 43917, Training Loss: 0.00690
Epoch: 43917, Training Loss: 0.00646
Epoch: 43917, Training Loss: 0.00788
Epoch: 43917, Training Loss: 0.00612
Epoch: 43918, Training Loss: 0.00690
Epoch: 43918, Training Loss: 0.00646
Epoch: 43918, Training Loss: 0.00788
Epoch: 43918, Training Loss: 0.00612
Epoch: 43919, Training Loss: 0.00690
Epoch: 43919, Training Loss: 0.00646
Epoch: 43919, Training Loss: 0.00788
Epoch: 43919, Training Loss: 0.00611
Epoch: 43920, Training Loss: 0.00690
Epoch: 43920, Training Loss: 0.00646
Epoch: 43920, Training Loss: 0.00788
Epoch: 43920, Training Loss: 0.00611
Epoch: 43921, Training Loss: 0.00690
Epoch: 43921, Training Loss: 0.00646
Epoch: 43921, Training Loss: 0.00788
Epoch: 43921, Training Loss: 0.00611
Epoch: 43922, Training Loss: 0.00690
Epoch: 43922, Training Loss: 0.00646
Epoch: 43922, Training Loss: 0.00788
Epoch: 43922, Training Loss: 0.00611
Epoch: 43923, Training Loss: 0.00690
Epoch: 43923, Training Loss: 0.00646
Epoch: 43923, Training Loss: 0.00788
Epoch: 43923, Training Loss: 0.00611
Epoch: 43924, Training Loss: 0.00690
Epoch: 43924, Training Loss: 0.00646
Epoch: 43924, Training Loss: 0.00788
Epoch: 43924, Training Loss: 0.00611
Epoch: 43925, Training Loss: 0.00690
Epoch: 43925, Training Loss: 0.00646
Epoch: 43925, Training Loss: 0.00788
Epoch: 43925, Training Loss: 0.00611
Epoch: 43926, Training Loss: 0.00690
Epoch: 43926, Training Loss: 0.00646
Epoch: 43926, Training Loss: 0.00788
Epoch: 43926, Training Loss: 0.00611
Epoch: 43927, Training Loss: 0.00690
Epoch: 43927, Training Loss: 0.00646
Epoch: 43927, Training Loss: 0.00788
Epoch: 43927, Training Loss: 0.00611
Epoch: 43928, Training Loss: 0.00690
Epoch: 43928, Training Loss: 0.00646
Epoch: 43928, Training Loss: 0.00788
Epoch: 43928, Training Loss: 0.00611
Epoch: 43929, Training Loss: 0.00690
Epoch: 43929, Training Loss: 0.00646
Epoch: 43929, Training Loss: 0.00788
Epoch: 43929, Training Loss: 0.00611
Epoch: 43930, Training Loss: 0.00690
Epoch: 43930, Training Loss: 0.00646
Epoch: 43930, Training Loss: 0.00788
Epoch: 43930, Training Loss: 0.00611
Epoch: 43931, Training Loss: 0.00690
Epoch: 43931, Training Loss: 0.00646
Epoch: 43931, Training Loss: 0.00788
Epoch: 43931, Training Loss: 0.00611
Epoch: 43932, Training Loss: 0.00690
Epoch: 43932, Training Loss: 0.00646
Epoch: 43932, Training Loss: 0.00788
Epoch: 43932, Training Loss: 0.00611
Epoch: 43933, Training Loss: 0.00690
Epoch: 43933, Training Loss: 0.00646
Epoch: 43933, Training Loss: 0.00788
Epoch: 43933, Training Loss: 0.00611
Epoch: 43934, Training Loss: 0.00690
Epoch: 43934, Training Loss: 0.00646
Epoch: 43934, Training Loss: 0.00788
Epoch: 43934, Training Loss: 0.00611
Epoch: 43935, Training Loss: 0.00690
Epoch: 43935, Training Loss: 0.00646
Epoch: 43935, Training Loss: 0.00788
Epoch: 43935, Training Loss: 0.00611
Epoch: 43936, Training Loss: 0.00690
Epoch: 43936, Training Loss: 0.00646
Epoch: 43936, Training Loss: 0.00788
Epoch: 43936, Training Loss: 0.00611
Epoch: 43937, Training Loss: 0.00689
Epoch: 43937, Training Loss: 0.00646
Epoch: 43937, Training Loss: 0.00788
Epoch: 43937, Training Loss: 0.00611
Epoch: 43938, Training Loss: 0.00689
Epoch: 43938, Training Loss: 0.00646
Epoch: 43938, Training Loss: 0.00788
Epoch: 43938, Training Loss: 0.00611
Epoch: 43939, Training Loss: 0.00689
Epoch: 43939, Training Loss: 0.00646
Epoch: 43939, Training Loss: 0.00788
Epoch: 43939, Training Loss: 0.00611
Epoch: 43940, Training Loss: 0.00689
Epoch: 43940, Training Loss: 0.00646
Epoch: 43940, Training Loss: 0.00788
Epoch: 43940, Training Loss: 0.00611
... [output truncated: one loss line per XOR pattern per epoch; over epochs 43940-44182 the four per-pattern losses decrease very slowly (0.00689 -> 0.00687, 0.00646 -> 0.00644, 0.00788 -> 0.00786, 0.00611 -> 0.00610)] ...
Epoch: 44182, Training Loss: 0.00687
Epoch: 44182, Training Loss: 0.00644
Epoch: 44182, Training Loss: 0.00786
Epoch: 44182, Training Loss: 0.00610
Epoch: 44183, Training Loss: 0.00687
Epoch: 44183, Training Loss: 0.00644
Epoch: 44183, Training Loss: 0.00786
Epoch: 44183, Training Loss: 0.00610
Epoch: 44184, Training Loss: 0.00687
Epoch: 44184, Training Loss: 0.00644
Epoch: 44184, Training Loss: 0.00786
Epoch: 44184, Training Loss: 0.00610
Epoch: 44185, Training Loss: 0.00687
Epoch: 44185, Training Loss: 0.00644
Epoch: 44185, Training Loss: 0.00786
Epoch: 44185, Training Loss: 0.00610
Epoch: 44186, Training Loss: 0.00687
Epoch: 44186, Training Loss: 0.00644
Epoch: 44186, Training Loss: 0.00786
Epoch: 44186, Training Loss: 0.00610
Epoch: 44187, Training Loss: 0.00687
Epoch: 44187, Training Loss: 0.00644
Epoch: 44187, Training Loss: 0.00786
Epoch: 44187, Training Loss: 0.00610
Epoch: 44188, Training Loss: 0.00687
Epoch: 44188, Training Loss: 0.00644
Epoch: 44188, Training Loss: 0.00786
Epoch: 44188, Training Loss: 0.00610
Epoch: 44189, Training Loss: 0.00687
Epoch: 44189, Training Loss: 0.00644
Epoch: 44189, Training Loss: 0.00786
Epoch: 44189, Training Loss: 0.00610
Epoch: 44190, Training Loss: 0.00687
Epoch: 44190, Training Loss: 0.00644
Epoch: 44190, Training Loss: 0.00786
Epoch: 44190, Training Loss: 0.00610
Epoch: 44191, Training Loss: 0.00687
Epoch: 44191, Training Loss: 0.00644
Epoch: 44191, Training Loss: 0.00786
Epoch: 44191, Training Loss: 0.00610
Epoch: 44192, Training Loss: 0.00687
Epoch: 44192, Training Loss: 0.00644
Epoch: 44192, Training Loss: 0.00786
Epoch: 44192, Training Loss: 0.00610
Epoch: 44193, Training Loss: 0.00687
Epoch: 44193, Training Loss: 0.00644
Epoch: 44193, Training Loss: 0.00786
Epoch: 44193, Training Loss: 0.00610
Epoch: 44194, Training Loss: 0.00687
Epoch: 44194, Training Loss: 0.00644
Epoch: 44194, Training Loss: 0.00786
Epoch: 44194, Training Loss: 0.00610
Epoch: 44195, Training Loss: 0.00687
Epoch: 44195, Training Loss: 0.00644
Epoch: 44195, Training Loss: 0.00786
Epoch: 44195, Training Loss: 0.00610
Epoch: 44196, Training Loss: 0.00687
Epoch: 44196, Training Loss: 0.00644
Epoch: 44196, Training Loss: 0.00786
Epoch: 44196, Training Loss: 0.00610
Epoch: 44197, Training Loss: 0.00687
Epoch: 44197, Training Loss: 0.00644
Epoch: 44197, Training Loss: 0.00786
Epoch: 44197, Training Loss: 0.00609
Epoch: 44198, Training Loss: 0.00687
Epoch: 44198, Training Loss: 0.00644
Epoch: 44198, Training Loss: 0.00786
Epoch: 44198, Training Loss: 0.00609
Epoch: 44199, Training Loss: 0.00687
Epoch: 44199, Training Loss: 0.00644
Epoch: 44199, Training Loss: 0.00786
Epoch: 44199, Training Loss: 0.00609
Epoch: 44200, Training Loss: 0.00687
Epoch: 44200, Training Loss: 0.00644
Epoch: 44200, Training Loss: 0.00786
Epoch: 44200, Training Loss: 0.00609
Epoch: 44201, Training Loss: 0.00687
Epoch: 44201, Training Loss: 0.00644
Epoch: 44201, Training Loss: 0.00785
Epoch: 44201, Training Loss: 0.00609
Epoch: 44202, Training Loss: 0.00687
Epoch: 44202, Training Loss: 0.00644
Epoch: 44202, Training Loss: 0.00785
Epoch: 44202, Training Loss: 0.00609
Epoch: 44203, Training Loss: 0.00687
Epoch: 44203, Training Loss: 0.00644
Epoch: 44203, Training Loss: 0.00785
Epoch: 44203, Training Loss: 0.00609
Epoch: 44204, Training Loss: 0.00687
Epoch: 44204, Training Loss: 0.00644
Epoch: 44204, Training Loss: 0.00785
Epoch: 44204, Training Loss: 0.00609
Epoch: 44205, Training Loss: 0.00687
Epoch: 44205, Training Loss: 0.00644
Epoch: 44205, Training Loss: 0.00785
Epoch: 44205, Training Loss: 0.00609
Epoch: 44206, Training Loss: 0.00687
Epoch: 44206, Training Loss: 0.00644
Epoch: 44206, Training Loss: 0.00785
Epoch: 44206, Training Loss: 0.00609
Epoch: 44207, Training Loss: 0.00687
Epoch: 44207, Training Loss: 0.00644
Epoch: 44207, Training Loss: 0.00785
Epoch: 44207, Training Loss: 0.00609
Epoch: 44208, Training Loss: 0.00687
Epoch: 44208, Training Loss: 0.00644
Epoch: 44208, Training Loss: 0.00785
Epoch: 44208, Training Loss: 0.00609
Epoch: 44209, Training Loss: 0.00687
Epoch: 44209, Training Loss: 0.00644
Epoch: 44209, Training Loss: 0.00785
Epoch: 44209, Training Loss: 0.00609
Epoch: 44210, Training Loss: 0.00687
Epoch: 44210, Training Loss: 0.00644
Epoch: 44210, Training Loss: 0.00785
Epoch: 44210, Training Loss: 0.00609
Epoch: 44211, Training Loss: 0.00687
Epoch: 44211, Training Loss: 0.00644
Epoch: 44211, Training Loss: 0.00785
Epoch: 44211, Training Loss: 0.00609
Epoch: 44212, Training Loss: 0.00687
Epoch: 44212, Training Loss: 0.00644
Epoch: 44212, Training Loss: 0.00785
Epoch: 44212, Training Loss: 0.00609
Epoch: 44213, Training Loss: 0.00687
Epoch: 44213, Training Loss: 0.00644
Epoch: 44213, Training Loss: 0.00785
Epoch: 44213, Training Loss: 0.00609
Epoch: 44214, Training Loss: 0.00687
Epoch: 44214, Training Loss: 0.00644
Epoch: 44214, Training Loss: 0.00785
Epoch: 44214, Training Loss: 0.00609
Epoch: 44215, Training Loss: 0.00687
Epoch: 44215, Training Loss: 0.00644
Epoch: 44215, Training Loss: 0.00785
Epoch: 44215, Training Loss: 0.00609
Epoch: 44216, Training Loss: 0.00687
Epoch: 44216, Training Loss: 0.00644
Epoch: 44216, Training Loss: 0.00785
Epoch: 44216, Training Loss: 0.00609
Epoch: 44217, Training Loss: 0.00687
Epoch: 44217, Training Loss: 0.00644
Epoch: 44217, Training Loss: 0.00785
Epoch: 44217, Training Loss: 0.00609
Epoch: 44218, Training Loss: 0.00687
Epoch: 44218, Training Loss: 0.00644
Epoch: 44218, Training Loss: 0.00785
Epoch: 44218, Training Loss: 0.00609
Epoch: 44219, Training Loss: 0.00687
Epoch: 44219, Training Loss: 0.00644
Epoch: 44219, Training Loss: 0.00785
Epoch: 44219, Training Loss: 0.00609
Epoch: 44220, Training Loss: 0.00687
Epoch: 44220, Training Loss: 0.00644
Epoch: 44220, Training Loss: 0.00785
Epoch: 44220, Training Loss: 0.00609
Epoch: 44221, Training Loss: 0.00687
Epoch: 44221, Training Loss: 0.00644
Epoch: 44221, Training Loss: 0.00785
Epoch: 44221, Training Loss: 0.00609
Epoch: 44222, Training Loss: 0.00687
Epoch: 44222, Training Loss: 0.00644
Epoch: 44222, Training Loss: 0.00785
Epoch: 44222, Training Loss: 0.00609
Epoch: 44223, Training Loss: 0.00687
Epoch: 44223, Training Loss: 0.00644
Epoch: 44223, Training Loss: 0.00785
Epoch: 44223, Training Loss: 0.00609
Epoch: 44224, Training Loss: 0.00687
Epoch: 44224, Training Loss: 0.00644
Epoch: 44224, Training Loss: 0.00785
Epoch: 44224, Training Loss: 0.00609
Epoch: 44225, Training Loss: 0.00687
Epoch: 44225, Training Loss: 0.00644
Epoch: 44225, Training Loss: 0.00785
Epoch: 44225, Training Loss: 0.00609
Epoch: 44226, Training Loss: 0.00687
Epoch: 44226, Training Loss: 0.00644
Epoch: 44226, Training Loss: 0.00785
Epoch: 44226, Training Loss: 0.00609
Epoch: 44227, Training Loss: 0.00687
Epoch: 44227, Training Loss: 0.00644
Epoch: 44227, Training Loss: 0.00785
Epoch: 44227, Training Loss: 0.00609
Epoch: 44228, Training Loss: 0.00687
Epoch: 44228, Training Loss: 0.00644
Epoch: 44228, Training Loss: 0.00785
Epoch: 44228, Training Loss: 0.00609
Epoch: 44229, Training Loss: 0.00687
Epoch: 44229, Training Loss: 0.00644
Epoch: 44229, Training Loss: 0.00785
Epoch: 44229, Training Loss: 0.00609
Epoch: 44230, Training Loss: 0.00687
Epoch: 44230, Training Loss: 0.00644
Epoch: 44230, Training Loss: 0.00785
Epoch: 44230, Training Loss: 0.00609
Epoch: 44231, Training Loss: 0.00687
Epoch: 44231, Training Loss: 0.00644
Epoch: 44231, Training Loss: 0.00785
Epoch: 44231, Training Loss: 0.00609
Epoch: 44232, Training Loss: 0.00687
Epoch: 44232, Training Loss: 0.00644
Epoch: 44232, Training Loss: 0.00785
Epoch: 44232, Training Loss: 0.00609
Epoch: 44233, Training Loss: 0.00687
Epoch: 44233, Training Loss: 0.00644
Epoch: 44233, Training Loss: 0.00785
Epoch: 44233, Training Loss: 0.00609
Epoch: 44234, Training Loss: 0.00687
Epoch: 44234, Training Loss: 0.00644
Epoch: 44234, Training Loss: 0.00785
Epoch: 44234, Training Loss: 0.00609
Epoch: 44235, Training Loss: 0.00687
Epoch: 44235, Training Loss: 0.00643
Epoch: 44235, Training Loss: 0.00785
Epoch: 44235, Training Loss: 0.00609
Epoch: 44236, Training Loss: 0.00687
Epoch: 44236, Training Loss: 0.00643
Epoch: 44236, Training Loss: 0.00785
Epoch: 44236, Training Loss: 0.00609
Epoch: 44237, Training Loss: 0.00687
Epoch: 44237, Training Loss: 0.00643
Epoch: 44237, Training Loss: 0.00785
Epoch: 44237, Training Loss: 0.00609
Epoch: 44238, Training Loss: 0.00687
Epoch: 44238, Training Loss: 0.00643
Epoch: 44238, Training Loss: 0.00785
Epoch: 44238, Training Loss: 0.00609
Epoch: 44239, Training Loss: 0.00687
Epoch: 44239, Training Loss: 0.00643
Epoch: 44239, Training Loss: 0.00785
Epoch: 44239, Training Loss: 0.00609
Epoch: 44240, Training Loss: 0.00687
Epoch: 44240, Training Loss: 0.00643
Epoch: 44240, Training Loss: 0.00785
Epoch: 44240, Training Loss: 0.00609
Epoch: 44241, Training Loss: 0.00687
Epoch: 44241, Training Loss: 0.00643
Epoch: 44241, Training Loss: 0.00785
Epoch: 44241, Training Loss: 0.00609
Epoch: 44242, Training Loss: 0.00687
Epoch: 44242, Training Loss: 0.00643
Epoch: 44242, Training Loss: 0.00785
Epoch: 44242, Training Loss: 0.00609
Epoch: 44243, Training Loss: 0.00687
Epoch: 44243, Training Loss: 0.00643
Epoch: 44243, Training Loss: 0.00785
Epoch: 44243, Training Loss: 0.00609
Epoch: 44244, Training Loss: 0.00687
Epoch: 44244, Training Loss: 0.00643
Epoch: 44244, Training Loss: 0.00785
Epoch: 44244, Training Loss: 0.00609
Epoch: 44245, Training Loss: 0.00687
Epoch: 44245, Training Loss: 0.00643
Epoch: 44245, Training Loss: 0.00785
Epoch: 44245, Training Loss: 0.00609
Epoch: 44246, Training Loss: 0.00687
Epoch: 44246, Training Loss: 0.00643
Epoch: 44246, Training Loss: 0.00785
Epoch: 44246, Training Loss: 0.00609
Epoch: 44247, Training Loss: 0.00687
Epoch: 44247, Training Loss: 0.00643
Epoch: 44247, Training Loss: 0.00785
Epoch: 44247, Training Loss: 0.00609
Epoch: 44248, Training Loss: 0.00687
Epoch: 44248, Training Loss: 0.00643
Epoch: 44248, Training Loss: 0.00785
Epoch: 44248, Training Loss: 0.00609
Epoch: 44249, Training Loss: 0.00687
Epoch: 44249, Training Loss: 0.00643
Epoch: 44249, Training Loss: 0.00785
Epoch: 44249, Training Loss: 0.00609
Epoch: 44250, Training Loss: 0.00687
Epoch: 44250, Training Loss: 0.00643
Epoch: 44250, Training Loss: 0.00785
Epoch: 44250, Training Loss: 0.00609
Epoch: 44251, Training Loss: 0.00687
Epoch: 44251, Training Loss: 0.00643
Epoch: 44251, Training Loss: 0.00785
Epoch: 44251, Training Loss: 0.00609
Epoch: 44252, Training Loss: 0.00687
Epoch: 44252, Training Loss: 0.00643
Epoch: 44252, Training Loss: 0.00785
Epoch: 44252, Training Loss: 0.00609
Epoch: 44253, Training Loss: 0.00687
Epoch: 44253, Training Loss: 0.00643
Epoch: 44253, Training Loss: 0.00785
Epoch: 44253, Training Loss: 0.00609
Epoch: 44254, Training Loss: 0.00687
Epoch: 44254, Training Loss: 0.00643
Epoch: 44254, Training Loss: 0.00785
Epoch: 44254, Training Loss: 0.00609
Epoch: 44255, Training Loss: 0.00687
Epoch: 44255, Training Loss: 0.00643
Epoch: 44255, Training Loss: 0.00785
Epoch: 44255, Training Loss: 0.00609
Epoch: 44256, Training Loss: 0.00687
Epoch: 44256, Training Loss: 0.00643
Epoch: 44256, Training Loss: 0.00785
Epoch: 44256, Training Loss: 0.00609
Epoch: 44257, Training Loss: 0.00687
Epoch: 44257, Training Loss: 0.00643
Epoch: 44257, Training Loss: 0.00785
Epoch: 44257, Training Loss: 0.00609
Epoch: 44258, Training Loss: 0.00687
Epoch: 44258, Training Loss: 0.00643
Epoch: 44258, Training Loss: 0.00785
Epoch: 44258, Training Loss: 0.00609
Epoch: 44259, Training Loss: 0.00687
Epoch: 44259, Training Loss: 0.00643
Epoch: 44259, Training Loss: 0.00785
Epoch: 44259, Training Loss: 0.00609
Epoch: 44260, Training Loss: 0.00687
Epoch: 44260, Training Loss: 0.00643
Epoch: 44260, Training Loss: 0.00785
Epoch: 44260, Training Loss: 0.00609
Epoch: 44261, Training Loss: 0.00687
Epoch: 44261, Training Loss: 0.00643
Epoch: 44261, Training Loss: 0.00785
Epoch: 44261, Training Loss: 0.00609
Epoch: 44262, Training Loss: 0.00687
Epoch: 44262, Training Loss: 0.00643
Epoch: 44262, Training Loss: 0.00785
Epoch: 44262, Training Loss: 0.00609
Epoch: 44263, Training Loss: 0.00687
Epoch: 44263, Training Loss: 0.00643
Epoch: 44263, Training Loss: 0.00785
Epoch: 44263, Training Loss: 0.00609
Epoch: 44264, Training Loss: 0.00687
Epoch: 44264, Training Loss: 0.00643
Epoch: 44264, Training Loss: 0.00785
Epoch: 44264, Training Loss: 0.00609
Epoch: 44265, Training Loss: 0.00687
Epoch: 44265, Training Loss: 0.00643
Epoch: 44265, Training Loss: 0.00785
Epoch: 44265, Training Loss: 0.00609
Epoch: 44266, Training Loss: 0.00687
Epoch: 44266, Training Loss: 0.00643
Epoch: 44266, Training Loss: 0.00785
Epoch: 44266, Training Loss: 0.00609
Epoch: 44267, Training Loss: 0.00687
Epoch: 44267, Training Loss: 0.00643
Epoch: 44267, Training Loss: 0.00785
Epoch: 44267, Training Loss: 0.00609
Epoch: 44268, Training Loss: 0.00687
Epoch: 44268, Training Loss: 0.00643
Epoch: 44268, Training Loss: 0.00785
Epoch: 44268, Training Loss: 0.00609
Epoch: 44269, Training Loss: 0.00687
Epoch: 44269, Training Loss: 0.00643
Epoch: 44269, Training Loss: 0.00785
Epoch: 44269, Training Loss: 0.00609
Epoch: 44270, Training Loss: 0.00687
Epoch: 44270, Training Loss: 0.00643
Epoch: 44270, Training Loss: 0.00785
Epoch: 44270, Training Loss: 0.00609
Epoch: 44271, Training Loss: 0.00687
Epoch: 44271, Training Loss: 0.00643
Epoch: 44271, Training Loss: 0.00785
Epoch: 44271, Training Loss: 0.00609
Epoch: 44272, Training Loss: 0.00687
Epoch: 44272, Training Loss: 0.00643
Epoch: 44272, Training Loss: 0.00785
Epoch: 44272, Training Loss: 0.00609
Epoch: 44273, Training Loss: 0.00687
Epoch: 44273, Training Loss: 0.00643
Epoch: 44273, Training Loss: 0.00785
Epoch: 44273, Training Loss: 0.00609
Epoch: 44274, Training Loss: 0.00687
Epoch: 44274, Training Loss: 0.00643
Epoch: 44274, Training Loss: 0.00785
Epoch: 44274, Training Loss: 0.00609
Epoch: 44275, Training Loss: 0.00687
Epoch: 44275, Training Loss: 0.00643
Epoch: 44275, Training Loss: 0.00785
Epoch: 44275, Training Loss: 0.00609
Epoch: 44276, Training Loss: 0.00687
Epoch: 44276, Training Loss: 0.00643
Epoch: 44276, Training Loss: 0.00785
Epoch: 44276, Training Loss: 0.00609
Epoch: 44277, Training Loss: 0.00687
Epoch: 44277, Training Loss: 0.00643
Epoch: 44277, Training Loss: 0.00785
Epoch: 44277, Training Loss: 0.00609
Epoch: 44278, Training Loss: 0.00687
Epoch: 44278, Training Loss: 0.00643
Epoch: 44278, Training Loss: 0.00785
Epoch: 44278, Training Loss: 0.00609
Epoch: 44279, Training Loss: 0.00687
Epoch: 44279, Training Loss: 0.00643
Epoch: 44279, Training Loss: 0.00785
Epoch: 44279, Training Loss: 0.00609
Epoch: 44280, Training Loss: 0.00687
Epoch: 44280, Training Loss: 0.00643
Epoch: 44280, Training Loss: 0.00785
Epoch: 44280, Training Loss: 0.00609
Epoch: 44281, Training Loss: 0.00687
Epoch: 44281, Training Loss: 0.00643
Epoch: 44281, Training Loss: 0.00785
Epoch: 44281, Training Loss: 0.00609
Epoch: 44282, Training Loss: 0.00687
Epoch: 44282, Training Loss: 0.00643
Epoch: 44282, Training Loss: 0.00785
Epoch: 44282, Training Loss: 0.00609
Epoch: 44283, Training Loss: 0.00687
Epoch: 44283, Training Loss: 0.00643
Epoch: 44283, Training Loss: 0.00785
Epoch: 44283, Training Loss: 0.00609
Epoch: 44284, Training Loss: 0.00687
Epoch: 44284, Training Loss: 0.00643
Epoch: 44284, Training Loss: 0.00785
Epoch: 44284, Training Loss: 0.00609
Epoch: 44285, Training Loss: 0.00687
Epoch: 44285, Training Loss: 0.00643
Epoch: 44285, Training Loss: 0.00785
Epoch: 44285, Training Loss: 0.00609
Epoch: 44286, Training Loss: 0.00687
Epoch: 44286, Training Loss: 0.00643
Epoch: 44286, Training Loss: 0.00785
Epoch: 44286, Training Loss: 0.00609
Epoch: 44287, Training Loss: 0.00687
Epoch: 44287, Training Loss: 0.00643
Epoch: 44287, Training Loss: 0.00785
Epoch: 44287, Training Loss: 0.00609
Epoch: 44288, Training Loss: 0.00687
Epoch: 44288, Training Loss: 0.00643
Epoch: 44288, Training Loss: 0.00785
Epoch: 44288, Training Loss: 0.00609
Epoch: 44289, Training Loss: 0.00687
Epoch: 44289, Training Loss: 0.00643
Epoch: 44289, Training Loss: 0.00785
Epoch: 44289, Training Loss: 0.00609
Epoch: 44290, Training Loss: 0.00687
Epoch: 44290, Training Loss: 0.00643
Epoch: 44290, Training Loss: 0.00785
Epoch: 44290, Training Loss: 0.00609
Epoch: 44291, Training Loss: 0.00687
Epoch: 44291, Training Loss: 0.00643
Epoch: 44291, Training Loss: 0.00785
Epoch: 44291, Training Loss: 0.00609
Epoch: 44292, Training Loss: 0.00687
Epoch: 44292, Training Loss: 0.00643
Epoch: 44292, Training Loss: 0.00785
Epoch: 44292, Training Loss: 0.00609
Epoch: 44293, Training Loss: 0.00687
Epoch: 44293, Training Loss: 0.00643
Epoch: 44293, Training Loss: 0.00785
Epoch: 44293, Training Loss: 0.00609
Epoch: 44294, Training Loss: 0.00687
Epoch: 44294, Training Loss: 0.00643
Epoch: 44294, Training Loss: 0.00785
Epoch: 44294, Training Loss: 0.00609
Epoch: 44295, Training Loss: 0.00687
Epoch: 44295, Training Loss: 0.00643
Epoch: 44295, Training Loss: 0.00785
Epoch: 44295, Training Loss: 0.00609
Epoch: 44296, Training Loss: 0.00687
Epoch: 44296, Training Loss: 0.00643
Epoch: 44296, Training Loss: 0.00785
Epoch: 44296, Training Loss: 0.00609
Epoch: 44297, Training Loss: 0.00687
Epoch: 44297, Training Loss: 0.00643
Epoch: 44297, Training Loss: 0.00785
Epoch: 44297, Training Loss: 0.00609
Epoch: 44298, Training Loss: 0.00687
Epoch: 44298, Training Loss: 0.00643
Epoch: 44298, Training Loss: 0.00785
Epoch: 44298, Training Loss: 0.00609
Epoch: 44299, Training Loss: 0.00687
Epoch: 44299, Training Loss: 0.00643
Epoch: 44299, Training Loss: 0.00785
Epoch: 44299, Training Loss: 0.00609
Epoch: 44300, Training Loss: 0.00687
Epoch: 44300, Training Loss: 0.00643
Epoch: 44300, Training Loss: 0.00785
Epoch: 44300, Training Loss: 0.00609
Epoch: 44301, Training Loss: 0.00687
Epoch: 44301, Training Loss: 0.00643
Epoch: 44301, Training Loss: 0.00785
Epoch: 44301, Training Loss: 0.00609
Epoch: 44302, Training Loss: 0.00687
Epoch: 44302, Training Loss: 0.00643
Epoch: 44302, Training Loss: 0.00785
Epoch: 44302, Training Loss: 0.00609
Epoch: 44303, Training Loss: 0.00687
Epoch: 44303, Training Loss: 0.00643
Epoch: 44303, Training Loss: 0.00785
Epoch: 44303, Training Loss: 0.00609
Epoch: 44304, Training Loss: 0.00686
Epoch: 44304, Training Loss: 0.00643
Epoch: 44304, Training Loss: 0.00785
Epoch: 44304, Training Loss: 0.00609
Epoch: 44305, Training Loss: 0.00686
Epoch: 44305, Training Loss: 0.00643
Epoch: 44305, Training Loss: 0.00785
Epoch: 44305, Training Loss: 0.00609
Epoch: 44306, Training Loss: 0.00686
Epoch: 44306, Training Loss: 0.00643
Epoch: 44306, Training Loss: 0.00785
Epoch: 44306, Training Loss: 0.00609
Epoch: 44307, Training Loss: 0.00686
Epoch: 44307, Training Loss: 0.00643
Epoch: 44307, Training Loss: 0.00785
Epoch: 44307, Training Loss: 0.00609
Epoch: 44308, Training Loss: 0.00686
Epoch: 44308, Training Loss: 0.00643
Epoch: 44308, Training Loss: 0.00785
Epoch: 44308, Training Loss: 0.00609
Epoch: 44309, Training Loss: 0.00686
Epoch: 44309, Training Loss: 0.00643
Epoch: 44309, Training Loss: 0.00784
Epoch: 44309, Training Loss: 0.00609
Epoch: 44310, Training Loss: 0.00686
Epoch: 44310, Training Loss: 0.00643
Epoch: 44310, Training Loss: 0.00784
Epoch: 44310, Training Loss: 0.00609
Epoch: 44311, Training Loss: 0.00686
Epoch: 44311, Training Loss: 0.00643
Epoch: 44311, Training Loss: 0.00784
Epoch: 44311, Training Loss: 0.00609
Epoch: 44312, Training Loss: 0.00686
Epoch: 44312, Training Loss: 0.00643
Epoch: 44312, Training Loss: 0.00784
Epoch: 44312, Training Loss: 0.00609
Epoch: 44313, Training Loss: 0.00686
Epoch: 44313, Training Loss: 0.00643
Epoch: 44313, Training Loss: 0.00784
Epoch: 44313, Training Loss: 0.00609
Epoch: 44314, Training Loss: 0.00686
Epoch: 44314, Training Loss: 0.00643
Epoch: 44314, Training Loss: 0.00784
Epoch: 44314, Training Loss: 0.00609
Epoch: 44315, Training Loss: 0.00686
Epoch: 44315, Training Loss: 0.00643
Epoch: 44315, Training Loss: 0.00784
Epoch: 44315, Training Loss: 0.00609
Epoch: 44316, Training Loss: 0.00686
Epoch: 44316, Training Loss: 0.00643
Epoch: 44316, Training Loss: 0.00784
Epoch: 44316, Training Loss: 0.00609
Epoch: 44317, Training Loss: 0.00686
Epoch: 44317, Training Loss: 0.00643
Epoch: 44317, Training Loss: 0.00784
Epoch: 44317, Training Loss: 0.00609
Epoch: 44318, Training Loss: 0.00686
Epoch: 44318, Training Loss: 0.00643
Epoch: 44318, Training Loss: 0.00784
Epoch: 44318, Training Loss: 0.00609
Epoch: 44319, Training Loss: 0.00686
Epoch: 44319, Training Loss: 0.00643
Epoch: 44319, Training Loss: 0.00784
Epoch: 44319, Training Loss: 0.00609
Epoch: 44320, Training Loss: 0.00686
Epoch: 44320, Training Loss: 0.00643
Epoch: 44320, Training Loss: 0.00784
Epoch: 44320, Training Loss: 0.00609
Epoch: 44321, Training Loss: 0.00686
Epoch: 44321, Training Loss: 0.00643
Epoch: 44321, Training Loss: 0.00784
Epoch: 44321, Training Loss: 0.00609
Epoch: 44322, Training Loss: 0.00686
Epoch: 44322, Training Loss: 0.00643
Epoch: 44322, Training Loss: 0.00784
Epoch: 44322, Training Loss: 0.00609
Epoch: 44323, Training Loss: 0.00686
Epoch: 44323, Training Loss: 0.00643
Epoch: 44323, Training Loss: 0.00784
Epoch: 44323, Training Loss: 0.00609
Epoch: 44324, Training Loss: 0.00686
Epoch: 44324, Training Loss: 0.00643
Epoch: 44324, Training Loss: 0.00784
Epoch: 44324, Training Loss: 0.00609
Epoch: 44325, Training Loss: 0.00686
Epoch: 44325, Training Loss: 0.00643
Epoch: 44325, Training Loss: 0.00784
Epoch: 44325, Training Loss: 0.00609
Epoch: 44326, Training Loss: 0.00686
Epoch: 44326, Training Loss: 0.00643
Epoch: 44326, Training Loss: 0.00784
Epoch: 44326, Training Loss: 0.00609
Epoch: 44327, Training Loss: 0.00686
Epoch: 44327, Training Loss: 0.00643
Epoch: 44327, Training Loss: 0.00784
Epoch: 44327, Training Loss: 0.00609
Epoch: 44328, Training Loss: 0.00686
Epoch: 44328, Training Loss: 0.00643
Epoch: 44328, Training Loss: 0.00784
Epoch: 44328, Training Loss: 0.00609
Epoch: 44329, Training Loss: 0.00686
Epoch: 44329, Training Loss: 0.00643
Epoch: 44329, Training Loss: 0.00784
Epoch: 44329, Training Loss: 0.00609
Epoch: 44330, Training Loss: 0.00686
Epoch: 44330, Training Loss: 0.00643
Epoch: 44330, Training Loss: 0.00784
Epoch: 44330, Training Loss: 0.00609
Epoch: 44331, Training Loss: 0.00686
Epoch: 44331, Training Loss: 0.00643
Epoch: 44331, Training Loss: 0.00784
Epoch: 44331, Training Loss: 0.00609
Epoch: 44332, Training Loss: 0.00686
Epoch: 44332, Training Loss: 0.00643
Epoch: 44332, Training Loss: 0.00784
Epoch: 44332, Training Loss: 0.00609
Epoch: 44333, Training Loss: 0.00686
Epoch: 44333, Training Loss: 0.00643
Epoch: 44333, Training Loss: 0.00784
Epoch: 44333, Training Loss: 0.00609
Epoch: 44334, Training Loss: 0.00686
Epoch: 44334, Training Loss: 0.00643
Epoch: 44334, Training Loss: 0.00784
Epoch: 44334, Training Loss: 0.00609
Epoch: 44335, Training Loss: 0.00686
Epoch: 44335, Training Loss: 0.00643
Epoch: 44335, Training Loss: 0.00784
Epoch: 44335, Training Loss: 0.00609
Epoch: 44336, Training Loss: 0.00686
Epoch: 44336, Training Loss: 0.00643
Epoch: 44336, Training Loss: 0.00784
Epoch: 44336, Training Loss: 0.00609
Epoch: 44337, Training Loss: 0.00686
Epoch: 44337, Training Loss: 0.00643
Epoch: 44337, Training Loss: 0.00784
Epoch: 44337, Training Loss: 0.00608
Epoch: 44338, Training Loss: 0.00686
Epoch: 44338, Training Loss: 0.00643
Epoch: 44338, Training Loss: 0.00784
Epoch: 44338, Training Loss: 0.00608
Epoch: 44339, Training Loss: 0.00686
Epoch: 44339, Training Loss: 0.00643
Epoch: 44339, Training Loss: 0.00784
Epoch: 44339, Training Loss: 0.00608
Epoch: 44340, Training Loss: 0.00686
Epoch: 44340, Training Loss: 0.00643
Epoch: 44340, Training Loss: 0.00784
Epoch: 44340, Training Loss: 0.00608
Epoch: 44341, Training Loss: 0.00686
Epoch: 44341, Training Loss: 0.00643
Epoch: 44341, Training Loss: 0.00784
Epoch: 44341, Training Loss: 0.00608
Epoch: 44342, Training Loss: 0.00686
Epoch: 44342, Training Loss: 0.00643
Epoch: 44342, Training Loss: 0.00784
Epoch: 44342, Training Loss: 0.00608
Epoch: 44343, Training Loss: 0.00686
Epoch: 44343, Training Loss: 0.00643
Epoch: 44343, Training Loss: 0.00784
Epoch: 44343, Training Loss: 0.00608
Epoch: 44344, Training Loss: 0.00686
Epoch: 44344, Training Loss: 0.00643
Epoch: 44344, Training Loss: 0.00784
Epoch: 44344, Training Loss: 0.00608
Epoch: 44345, Training Loss: 0.00686
Epoch: 44345, Training Loss: 0.00643
Epoch: 44345, Training Loss: 0.00784
Epoch: 44345, Training Loss: 0.00608
Epoch: 44346, Training Loss: 0.00686
Epoch: 44346, Training Loss: 0.00643
Epoch: 44346, Training Loss: 0.00784
Epoch: 44346, Training Loss: 0.00608
Epoch: 44347, Training Loss: 0.00686
Epoch: 44347, Training Loss: 0.00643
Epoch: 44347, Training Loss: 0.00784
Epoch: 44347, Training Loss: 0.00608
Epoch: 44348, Training Loss: 0.00686
Epoch: 44348, Training Loss: 0.00643
Epoch: 44348, Training Loss: 0.00784
Epoch: 44348, Training Loss: 0.00608
Epoch: 44349, Training Loss: 0.00686
Epoch: 44349, Training Loss: 0.00643
Epoch: 44349, Training Loss: 0.00784
Epoch: 44349, Training Loss: 0.00608
Epoch: 44350, Training Loss: 0.00686
Epoch: 44350, Training Loss: 0.00643
Epoch: 44350, Training Loss: 0.00784
Epoch: 44350, Training Loss: 0.00608
Epoch: 44351, Training Loss: 0.00686
Epoch: 44351, Training Loss: 0.00643
Epoch: 44351, Training Loss: 0.00784
Epoch: 44351, Training Loss: 0.00608
Epoch: 44352, Training Loss: 0.00686
Epoch: 44352, Training Loss: 0.00643
Epoch: 44352, Training Loss: 0.00784
Epoch: 44352, Training Loss: 0.00608
Epoch: 44353, Training Loss: 0.00686
Epoch: 44353, Training Loss: 0.00643
Epoch: 44353, Training Loss: 0.00784
Epoch: 44353, Training Loss: 0.00608
Epoch: 44354, Training Loss: 0.00686
Epoch: 44354, Training Loss: 0.00643
Epoch: 44354, Training Loss: 0.00784
Epoch: 44354, Training Loss: 0.00608
Epoch: 44355, Training Loss: 0.00686
Epoch: 44355, Training Loss: 0.00643
Epoch: 44355, Training Loss: 0.00784
Epoch: 44355, Training Loss: 0.00608
Epoch: 44356, Training Loss: 0.00686
Epoch: 44356, Training Loss: 0.00643
Epoch: 44356, Training Loss: 0.00784
Epoch: 44356, Training Loss: 0.00608
Epoch: 44357, Training Loss: 0.00686
Epoch: 44357, Training Loss: 0.00643
Epoch: 44357, Training Loss: 0.00784
Epoch: 44357, Training Loss: 0.00608
Epoch: 44358, Training Loss: 0.00686
Epoch: 44358, Training Loss: 0.00643
Epoch: 44358, Training Loss: 0.00784
Epoch: 44358, Training Loss: 0.00608
Epoch: 44359, Training Loss: 0.00686
Epoch: 44359, Training Loss: 0.00643
Epoch: 44359, Training Loss: 0.00784
Epoch: 44359, Training Loss: 0.00608
Epoch: 44360, Training Loss: 0.00686
Epoch: 44360, Training Loss: 0.00643
Epoch: 44360, Training Loss: 0.00784
Epoch: 44360, Training Loss: 0.00608
Epoch: 44361, Training Loss: 0.00686
Epoch: 44361, Training Loss: 0.00643
Epoch: 44361, Training Loss: 0.00784
Epoch: 44361, Training Loss: 0.00608
Epoch: 44362, Training Loss: 0.00686
Epoch: 44362, Training Loss: 0.00643
Epoch: 44362, Training Loss: 0.00784
Epoch: 44362, Training Loss: 0.00608
Epoch: 44363, Training Loss: 0.00686
Epoch: 44363, Training Loss: 0.00643
Epoch: 44363, Training Loss: 0.00784
Epoch: 44363, Training Loss: 0.00608
Epoch: 44364, Training Loss: 0.00686
Epoch: 44364, Training Loss: 0.00643
Epoch: 44364, Training Loss: 0.00784
Epoch: 44364, Training Loss: 0.00608
Epoch: 44365, Training Loss: 0.00686
Epoch: 44365, Training Loss: 0.00643
Epoch: 44365, Training Loss: 0.00784
Epoch: 44365, Training Loss: 0.00608
Epoch: 44366, Training Loss: 0.00686
Epoch: 44366, Training Loss: 0.00643
Epoch: 44366, Training Loss: 0.00784
Epoch: 44366, Training Loss: 0.00608
Epoch: 44367, Training Loss: 0.00686
Epoch: 44367, Training Loss: 0.00643
Epoch: 44367, Training Loss: 0.00784
Epoch: 44367, Training Loss: 0.00608
Epoch: 44368, Training Loss: 0.00686
Epoch: 44368, Training Loss: 0.00642
Epoch: 44368, Training Loss: 0.00784
Epoch: 44368, Training Loss: 0.00608
Epoch: 44369, Training Loss: 0.00686
Epoch: 44369, Training Loss: 0.00642
Epoch: 44369, Training Loss: 0.00784
Epoch: 44369, Training Loss: 0.00608
Epoch: 44370, Training Loss: 0.00686
Epoch: 44370, Training Loss: 0.00642
Epoch: 44370, Training Loss: 0.00784
Epoch: 44370, Training Loss: 0.00608
Epoch: 44371, Training Loss: 0.00686
Epoch: 44371, Training Loss: 0.00642
Epoch: 44371, Training Loss: 0.00784
Epoch: 44371, Training Loss: 0.00608
Epoch: 44372, Training Loss: 0.00686
Epoch: 44372, Training Loss: 0.00642
Epoch: 44372, Training Loss: 0.00784
Epoch: 44372, Training Loss: 0.00608
[... ~960 nearly identical lines omitted: four per-pattern losses are printed every epoch and decrease only marginally from epoch 44373 to 44614 (0.00686 → 0.00684, 0.00642 → 0.00641, 0.00784 → 0.00782, 0.00608 → 0.00607) ...]
Epoch: 44615, Training Loss: 0.00684
Epoch: 44615, Training Loss: 0.00641
Epoch: 44615, Training Loss: 0.00782
Epoch: 44615, Training Loss: 0.00607
Epoch: 44616, Training Loss: 0.00684
Epoch: 44616, Training Loss: 0.00641
Epoch: 44616, Training Loss: 0.00782
Epoch: 44616, Training Loss: 0.00607
Epoch: 44617, Training Loss: 0.00684
Epoch: 44617, Training Loss: 0.00641
Epoch: 44617, Training Loss: 0.00782
Epoch: 44617, Training Loss: 0.00607
Epoch: 44618, Training Loss: 0.00684
Epoch: 44618, Training Loss: 0.00641
Epoch: 44618, Training Loss: 0.00782
Epoch: 44618, Training Loss: 0.00607
Epoch: 44619, Training Loss: 0.00684
Epoch: 44619, Training Loss: 0.00641
Epoch: 44619, Training Loss: 0.00782
Epoch: 44619, Training Loss: 0.00607
Epoch: 44620, Training Loss: 0.00684
Epoch: 44620, Training Loss: 0.00641
Epoch: 44620, Training Loss: 0.00782
Epoch: 44620, Training Loss: 0.00606
Epoch: 44621, Training Loss: 0.00684
Epoch: 44621, Training Loss: 0.00641
Epoch: 44621, Training Loss: 0.00782
Epoch: 44621, Training Loss: 0.00606
Epoch: 44622, Training Loss: 0.00684
Epoch: 44622, Training Loss: 0.00641
Epoch: 44622, Training Loss: 0.00782
Epoch: 44622, Training Loss: 0.00606
Epoch: 44623, Training Loss: 0.00684
Epoch: 44623, Training Loss: 0.00641
Epoch: 44623, Training Loss: 0.00782
Epoch: 44623, Training Loss: 0.00606
Epoch: 44624, Training Loss: 0.00684
Epoch: 44624, Training Loss: 0.00641
Epoch: 44624, Training Loss: 0.00782
Epoch: 44624, Training Loss: 0.00606
Epoch: 44625, Training Loss: 0.00684
Epoch: 44625, Training Loss: 0.00641
Epoch: 44625, Training Loss: 0.00782
Epoch: 44625, Training Loss: 0.00606
Epoch: 44626, Training Loss: 0.00684
Epoch: 44626, Training Loss: 0.00641
Epoch: 44626, Training Loss: 0.00782
Epoch: 44626, Training Loss: 0.00606
Epoch: 44627, Training Loss: 0.00684
Epoch: 44627, Training Loss: 0.00641
Epoch: 44627, Training Loss: 0.00782
Epoch: 44627, Training Loss: 0.00606
Epoch: 44628, Training Loss: 0.00684
Epoch: 44628, Training Loss: 0.00641
Epoch: 44628, Training Loss: 0.00782
Epoch: 44628, Training Loss: 0.00606
Epoch: 44629, Training Loss: 0.00684
Epoch: 44629, Training Loss: 0.00641
Epoch: 44629, Training Loss: 0.00782
Epoch: 44629, Training Loss: 0.00606
Epoch: 44630, Training Loss: 0.00684
Epoch: 44630, Training Loss: 0.00641
Epoch: 44630, Training Loss: 0.00782
Epoch: 44630, Training Loss: 0.00606
Epoch: 44631, Training Loss: 0.00684
Epoch: 44631, Training Loss: 0.00641
Epoch: 44631, Training Loss: 0.00782
Epoch: 44631, Training Loss: 0.00606
Epoch: 44632, Training Loss: 0.00684
Epoch: 44632, Training Loss: 0.00641
Epoch: 44632, Training Loss: 0.00782
Epoch: 44632, Training Loss: 0.00606
Epoch: 44633, Training Loss: 0.00684
Epoch: 44633, Training Loss: 0.00641
Epoch: 44633, Training Loss: 0.00782
Epoch: 44633, Training Loss: 0.00606
Epoch: 44634, Training Loss: 0.00684
Epoch: 44634, Training Loss: 0.00641
Epoch: 44634, Training Loss: 0.00782
Epoch: 44634, Training Loss: 0.00606
Epoch: 44635, Training Loss: 0.00684
Epoch: 44635, Training Loss: 0.00641
Epoch: 44635, Training Loss: 0.00781
Epoch: 44635, Training Loss: 0.00606
Epoch: 44636, Training Loss: 0.00684
Epoch: 44636, Training Loss: 0.00640
Epoch: 44636, Training Loss: 0.00781
Epoch: 44636, Training Loss: 0.00606
Epoch: 44637, Training Loss: 0.00684
Epoch: 44637, Training Loss: 0.00640
Epoch: 44637, Training Loss: 0.00781
Epoch: 44637, Training Loss: 0.00606
Epoch: 44638, Training Loss: 0.00684
Epoch: 44638, Training Loss: 0.00640
Epoch: 44638, Training Loss: 0.00781
Epoch: 44638, Training Loss: 0.00606
Epoch: 44639, Training Loss: 0.00684
Epoch: 44639, Training Loss: 0.00640
Epoch: 44639, Training Loss: 0.00781
Epoch: 44639, Training Loss: 0.00606
Epoch: 44640, Training Loss: 0.00684
Epoch: 44640, Training Loss: 0.00640
Epoch: 44640, Training Loss: 0.00781
Epoch: 44640, Training Loss: 0.00606
Epoch: 44641, Training Loss: 0.00684
Epoch: 44641, Training Loss: 0.00640
Epoch: 44641, Training Loss: 0.00781
Epoch: 44641, Training Loss: 0.00606
Epoch: 44642, Training Loss: 0.00684
Epoch: 44642, Training Loss: 0.00640
Epoch: 44642, Training Loss: 0.00781
Epoch: 44642, Training Loss: 0.00606
Epoch: 44643, Training Loss: 0.00684
Epoch: 44643, Training Loss: 0.00640
Epoch: 44643, Training Loss: 0.00781
Epoch: 44643, Training Loss: 0.00606
Epoch: 44644, Training Loss: 0.00684
Epoch: 44644, Training Loss: 0.00640
Epoch: 44644, Training Loss: 0.00781
Epoch: 44644, Training Loss: 0.00606
Epoch: 44645, Training Loss: 0.00684
Epoch: 44645, Training Loss: 0.00640
Epoch: 44645, Training Loss: 0.00781
Epoch: 44645, Training Loss: 0.00606
Epoch: 44646, Training Loss: 0.00684
Epoch: 44646, Training Loss: 0.00640
Epoch: 44646, Training Loss: 0.00781
Epoch: 44646, Training Loss: 0.00606
Epoch: 44647, Training Loss: 0.00684
Epoch: 44647, Training Loss: 0.00640
Epoch: 44647, Training Loss: 0.00781
Epoch: 44647, Training Loss: 0.00606
Epoch: 44648, Training Loss: 0.00684
Epoch: 44648, Training Loss: 0.00640
Epoch: 44648, Training Loss: 0.00781
Epoch: 44648, Training Loss: 0.00606
Epoch: 44649, Training Loss: 0.00684
Epoch: 44649, Training Loss: 0.00640
Epoch: 44649, Training Loss: 0.00781
Epoch: 44649, Training Loss: 0.00606
Epoch: 44650, Training Loss: 0.00684
Epoch: 44650, Training Loss: 0.00640
Epoch: 44650, Training Loss: 0.00781
Epoch: 44650, Training Loss: 0.00606
Epoch: 44651, Training Loss: 0.00684
Epoch: 44651, Training Loss: 0.00640
Epoch: 44651, Training Loss: 0.00781
Epoch: 44651, Training Loss: 0.00606
Epoch: 44652, Training Loss: 0.00684
Epoch: 44652, Training Loss: 0.00640
Epoch: 44652, Training Loss: 0.00781
Epoch: 44652, Training Loss: 0.00606
Epoch: 44653, Training Loss: 0.00684
Epoch: 44653, Training Loss: 0.00640
Epoch: 44653, Training Loss: 0.00781
Epoch: 44653, Training Loss: 0.00606
Epoch: 44654, Training Loss: 0.00684
Epoch: 44654, Training Loss: 0.00640
Epoch: 44654, Training Loss: 0.00781
Epoch: 44654, Training Loss: 0.00606
Epoch: 44655, Training Loss: 0.00684
Epoch: 44655, Training Loss: 0.00640
Epoch: 44655, Training Loss: 0.00781
Epoch: 44655, Training Loss: 0.00606
Epoch: 44656, Training Loss: 0.00684
Epoch: 44656, Training Loss: 0.00640
Epoch: 44656, Training Loss: 0.00781
Epoch: 44656, Training Loss: 0.00606
Epoch: 44657, Training Loss: 0.00684
Epoch: 44657, Training Loss: 0.00640
Epoch: 44657, Training Loss: 0.00781
Epoch: 44657, Training Loss: 0.00606
Epoch: 44658, Training Loss: 0.00684
Epoch: 44658, Training Loss: 0.00640
Epoch: 44658, Training Loss: 0.00781
Epoch: 44658, Training Loss: 0.00606
Epoch: 44659, Training Loss: 0.00684
Epoch: 44659, Training Loss: 0.00640
Epoch: 44659, Training Loss: 0.00781
Epoch: 44659, Training Loss: 0.00606
Epoch: 44660, Training Loss: 0.00684
Epoch: 44660, Training Loss: 0.00640
Epoch: 44660, Training Loss: 0.00781
Epoch: 44660, Training Loss: 0.00606
Epoch: 44661, Training Loss: 0.00684
Epoch: 44661, Training Loss: 0.00640
Epoch: 44661, Training Loss: 0.00781
Epoch: 44661, Training Loss: 0.00606
Epoch: 44662, Training Loss: 0.00684
Epoch: 44662, Training Loss: 0.00640
Epoch: 44662, Training Loss: 0.00781
Epoch: 44662, Training Loss: 0.00606
Epoch: 44663, Training Loss: 0.00684
Epoch: 44663, Training Loss: 0.00640
Epoch: 44663, Training Loss: 0.00781
Epoch: 44663, Training Loss: 0.00606
Epoch: 44664, Training Loss: 0.00684
Epoch: 44664, Training Loss: 0.00640
Epoch: 44664, Training Loss: 0.00781
Epoch: 44664, Training Loss: 0.00606
Epoch: 44665, Training Loss: 0.00684
Epoch: 44665, Training Loss: 0.00640
Epoch: 44665, Training Loss: 0.00781
Epoch: 44665, Training Loss: 0.00606
Epoch: 44666, Training Loss: 0.00684
Epoch: 44666, Training Loss: 0.00640
Epoch: 44666, Training Loss: 0.00781
Epoch: 44666, Training Loss: 0.00606
Epoch: 44667, Training Loss: 0.00684
Epoch: 44667, Training Loss: 0.00640
Epoch: 44667, Training Loss: 0.00781
Epoch: 44667, Training Loss: 0.00606
Epoch: 44668, Training Loss: 0.00684
Epoch: 44668, Training Loss: 0.00640
Epoch: 44668, Training Loss: 0.00781
Epoch: 44668, Training Loss: 0.00606
Epoch: 44669, Training Loss: 0.00684
Epoch: 44669, Training Loss: 0.00640
Epoch: 44669, Training Loss: 0.00781
Epoch: 44669, Training Loss: 0.00606
Epoch: 44670, Training Loss: 0.00684
Epoch: 44670, Training Loss: 0.00640
Epoch: 44670, Training Loss: 0.00781
Epoch: 44670, Training Loss: 0.00606
Epoch: 44671, Training Loss: 0.00684
Epoch: 44671, Training Loss: 0.00640
Epoch: 44671, Training Loss: 0.00781
Epoch: 44671, Training Loss: 0.00606
Epoch: 44672, Training Loss: 0.00684
Epoch: 44672, Training Loss: 0.00640
Epoch: 44672, Training Loss: 0.00781
Epoch: 44672, Training Loss: 0.00606
Epoch: 44673, Training Loss: 0.00684
Epoch: 44673, Training Loss: 0.00640
Epoch: 44673, Training Loss: 0.00781
Epoch: 44673, Training Loss: 0.00606
Epoch: 44674, Training Loss: 0.00684
Epoch: 44674, Training Loss: 0.00640
Epoch: 44674, Training Loss: 0.00781
Epoch: 44674, Training Loss: 0.00606
Epoch: 44675, Training Loss: 0.00683
Epoch: 44675, Training Loss: 0.00640
Epoch: 44675, Training Loss: 0.00781
Epoch: 44675, Training Loss: 0.00606
Epoch: 44676, Training Loss: 0.00683
Epoch: 44676, Training Loss: 0.00640
Epoch: 44676, Training Loss: 0.00781
Epoch: 44676, Training Loss: 0.00606
Epoch: 44677, Training Loss: 0.00683
Epoch: 44677, Training Loss: 0.00640
Epoch: 44677, Training Loss: 0.00781
Epoch: 44677, Training Loss: 0.00606
Epoch: 44678, Training Loss: 0.00683
Epoch: 44678, Training Loss: 0.00640
Epoch: 44678, Training Loss: 0.00781
Epoch: 44678, Training Loss: 0.00606
Epoch: 44679, Training Loss: 0.00683
Epoch: 44679, Training Loss: 0.00640
Epoch: 44679, Training Loss: 0.00781
Epoch: 44679, Training Loss: 0.00606
Epoch: 44680, Training Loss: 0.00683
Epoch: 44680, Training Loss: 0.00640
Epoch: 44680, Training Loss: 0.00781
Epoch: 44680, Training Loss: 0.00606
Epoch: 44681, Training Loss: 0.00683
Epoch: 44681, Training Loss: 0.00640
Epoch: 44681, Training Loss: 0.00781
Epoch: 44681, Training Loss: 0.00606
Epoch: 44682, Training Loss: 0.00683
Epoch: 44682, Training Loss: 0.00640
Epoch: 44682, Training Loss: 0.00781
Epoch: 44682, Training Loss: 0.00606
Epoch: 44683, Training Loss: 0.00683
Epoch: 44683, Training Loss: 0.00640
Epoch: 44683, Training Loss: 0.00781
Epoch: 44683, Training Loss: 0.00606
Epoch: 44684, Training Loss: 0.00683
Epoch: 44684, Training Loss: 0.00640
Epoch: 44684, Training Loss: 0.00781
Epoch: 44684, Training Loss: 0.00606
Epoch: 44685, Training Loss: 0.00683
Epoch: 44685, Training Loss: 0.00640
Epoch: 44685, Training Loss: 0.00781
Epoch: 44685, Training Loss: 0.00606
Epoch: 44686, Training Loss: 0.00683
Epoch: 44686, Training Loss: 0.00640
Epoch: 44686, Training Loss: 0.00781
Epoch: 44686, Training Loss: 0.00606
Epoch: 44687, Training Loss: 0.00683
Epoch: 44687, Training Loss: 0.00640
Epoch: 44687, Training Loss: 0.00781
Epoch: 44687, Training Loss: 0.00606
Epoch: 44688, Training Loss: 0.00683
Epoch: 44688, Training Loss: 0.00640
Epoch: 44688, Training Loss: 0.00781
Epoch: 44688, Training Loss: 0.00606
Epoch: 44689, Training Loss: 0.00683
Epoch: 44689, Training Loss: 0.00640
Epoch: 44689, Training Loss: 0.00781
Epoch: 44689, Training Loss: 0.00606
Epoch: 44690, Training Loss: 0.00683
Epoch: 44690, Training Loss: 0.00640
Epoch: 44690, Training Loss: 0.00781
Epoch: 44690, Training Loss: 0.00606
Epoch: 44691, Training Loss: 0.00683
Epoch: 44691, Training Loss: 0.00640
Epoch: 44691, Training Loss: 0.00781
Epoch: 44691, Training Loss: 0.00606
Epoch: 44692, Training Loss: 0.00683
Epoch: 44692, Training Loss: 0.00640
Epoch: 44692, Training Loss: 0.00781
Epoch: 44692, Training Loss: 0.00606
Epoch: 44693, Training Loss: 0.00683
Epoch: 44693, Training Loss: 0.00640
Epoch: 44693, Training Loss: 0.00781
Epoch: 44693, Training Loss: 0.00606
Epoch: 44694, Training Loss: 0.00683
Epoch: 44694, Training Loss: 0.00640
Epoch: 44694, Training Loss: 0.00781
Epoch: 44694, Training Loss: 0.00606
Epoch: 44695, Training Loss: 0.00683
Epoch: 44695, Training Loss: 0.00640
Epoch: 44695, Training Loss: 0.00781
Epoch: 44695, Training Loss: 0.00606
Epoch: 44696, Training Loss: 0.00683
Epoch: 44696, Training Loss: 0.00640
Epoch: 44696, Training Loss: 0.00781
Epoch: 44696, Training Loss: 0.00606
Epoch: 44697, Training Loss: 0.00683
Epoch: 44697, Training Loss: 0.00640
Epoch: 44697, Training Loss: 0.00781
Epoch: 44697, Training Loss: 0.00606
Epoch: 44698, Training Loss: 0.00683
Epoch: 44698, Training Loss: 0.00640
Epoch: 44698, Training Loss: 0.00781
Epoch: 44698, Training Loss: 0.00606
Epoch: 44699, Training Loss: 0.00683
Epoch: 44699, Training Loss: 0.00640
Epoch: 44699, Training Loss: 0.00781
Epoch: 44699, Training Loss: 0.00606
Epoch: 44700, Training Loss: 0.00683
Epoch: 44700, Training Loss: 0.00640
Epoch: 44700, Training Loss: 0.00781
Epoch: 44700, Training Loss: 0.00606
Epoch: 44701, Training Loss: 0.00683
Epoch: 44701, Training Loss: 0.00640
Epoch: 44701, Training Loss: 0.00781
Epoch: 44701, Training Loss: 0.00606
Epoch: 44702, Training Loss: 0.00683
Epoch: 44702, Training Loss: 0.00640
Epoch: 44702, Training Loss: 0.00781
Epoch: 44702, Training Loss: 0.00606
Epoch: 44703, Training Loss: 0.00683
Epoch: 44703, Training Loss: 0.00640
Epoch: 44703, Training Loss: 0.00781
Epoch: 44703, Training Loss: 0.00606
Epoch: 44704, Training Loss: 0.00683
Epoch: 44704, Training Loss: 0.00640
Epoch: 44704, Training Loss: 0.00781
Epoch: 44704, Training Loss: 0.00606
Epoch: 44705, Training Loss: 0.00683
Epoch: 44705, Training Loss: 0.00640
Epoch: 44705, Training Loss: 0.00781
Epoch: 44705, Training Loss: 0.00606
Epoch: 44706, Training Loss: 0.00683
Epoch: 44706, Training Loss: 0.00640
Epoch: 44706, Training Loss: 0.00781
Epoch: 44706, Training Loss: 0.00606
Epoch: 44707, Training Loss: 0.00683
Epoch: 44707, Training Loss: 0.00640
Epoch: 44707, Training Loss: 0.00781
Epoch: 44707, Training Loss: 0.00606
Epoch: 44708, Training Loss: 0.00683
Epoch: 44708, Training Loss: 0.00640
Epoch: 44708, Training Loss: 0.00781
Epoch: 44708, Training Loss: 0.00606
Epoch: 44709, Training Loss: 0.00683
Epoch: 44709, Training Loss: 0.00640
Epoch: 44709, Training Loss: 0.00781
Epoch: 44709, Training Loss: 0.00606
Epoch: 44710, Training Loss: 0.00683
Epoch: 44710, Training Loss: 0.00640
Epoch: 44710, Training Loss: 0.00781
Epoch: 44710, Training Loss: 0.00606
Epoch: 44711, Training Loss: 0.00683
Epoch: 44711, Training Loss: 0.00640
Epoch: 44711, Training Loss: 0.00781
Epoch: 44711, Training Loss: 0.00606
Epoch: 44712, Training Loss: 0.00683
Epoch: 44712, Training Loss: 0.00640
Epoch: 44712, Training Loss: 0.00781
Epoch: 44712, Training Loss: 0.00606
Epoch: 44713, Training Loss: 0.00683
Epoch: 44713, Training Loss: 0.00640
Epoch: 44713, Training Loss: 0.00781
Epoch: 44713, Training Loss: 0.00606
Epoch: 44714, Training Loss: 0.00683
Epoch: 44714, Training Loss: 0.00640
Epoch: 44714, Training Loss: 0.00781
Epoch: 44714, Training Loss: 0.00606
Epoch: 44715, Training Loss: 0.00683
Epoch: 44715, Training Loss: 0.00640
Epoch: 44715, Training Loss: 0.00781
Epoch: 44715, Training Loss: 0.00606
Epoch: 44716, Training Loss: 0.00683
Epoch: 44716, Training Loss: 0.00640
Epoch: 44716, Training Loss: 0.00781
Epoch: 44716, Training Loss: 0.00606
Epoch: 44717, Training Loss: 0.00683
Epoch: 44717, Training Loss: 0.00640
Epoch: 44717, Training Loss: 0.00781
Epoch: 44717, Training Loss: 0.00606
Epoch: 44718, Training Loss: 0.00683
Epoch: 44718, Training Loss: 0.00640
Epoch: 44718, Training Loss: 0.00781
Epoch: 44718, Training Loss: 0.00606
Epoch: 44719, Training Loss: 0.00683
Epoch: 44719, Training Loss: 0.00640
Epoch: 44719, Training Loss: 0.00781
Epoch: 44719, Training Loss: 0.00606
Epoch: 44720, Training Loss: 0.00683
Epoch: 44720, Training Loss: 0.00640
Epoch: 44720, Training Loss: 0.00781
Epoch: 44720, Training Loss: 0.00606
Epoch: 44721, Training Loss: 0.00683
Epoch: 44721, Training Loss: 0.00640
Epoch: 44721, Training Loss: 0.00781
Epoch: 44721, Training Loss: 0.00606
Epoch: 44722, Training Loss: 0.00683
Epoch: 44722, Training Loss: 0.00640
Epoch: 44722, Training Loss: 0.00781
Epoch: 44722, Training Loss: 0.00606
Epoch: 44723, Training Loss: 0.00683
Epoch: 44723, Training Loss: 0.00640
Epoch: 44723, Training Loss: 0.00781
Epoch: 44723, Training Loss: 0.00606
Epoch: 44724, Training Loss: 0.00683
Epoch: 44724, Training Loss: 0.00640
Epoch: 44724, Training Loss: 0.00781
Epoch: 44724, Training Loss: 0.00606
Epoch: 44725, Training Loss: 0.00683
Epoch: 44725, Training Loss: 0.00640
Epoch: 44725, Training Loss: 0.00781
Epoch: 44725, Training Loss: 0.00606
Epoch: 44726, Training Loss: 0.00683
Epoch: 44726, Training Loss: 0.00640
Epoch: 44726, Training Loss: 0.00781
Epoch: 44726, Training Loss: 0.00606
Epoch: 44727, Training Loss: 0.00683
Epoch: 44727, Training Loss: 0.00640
Epoch: 44727, Training Loss: 0.00781
Epoch: 44727, Training Loss: 0.00606
Epoch: 44728, Training Loss: 0.00683
Epoch: 44728, Training Loss: 0.00640
Epoch: 44728, Training Loss: 0.00781
Epoch: 44728, Training Loss: 0.00606
Epoch: 44729, Training Loss: 0.00683
Epoch: 44729, Training Loss: 0.00640
Epoch: 44729, Training Loss: 0.00781
Epoch: 44729, Training Loss: 0.00606
Epoch: 44730, Training Loss: 0.00683
Epoch: 44730, Training Loss: 0.00640
Epoch: 44730, Training Loss: 0.00781
Epoch: 44730, Training Loss: 0.00606
Epoch: 44731, Training Loss: 0.00683
Epoch: 44731, Training Loss: 0.00640
Epoch: 44731, Training Loss: 0.00781
Epoch: 44731, Training Loss: 0.00606
Epoch: 44732, Training Loss: 0.00683
Epoch: 44732, Training Loss: 0.00640
Epoch: 44732, Training Loss: 0.00781
Epoch: 44732, Training Loss: 0.00606
Epoch: 44733, Training Loss: 0.00683
Epoch: 44733, Training Loss: 0.00640
Epoch: 44733, Training Loss: 0.00781
Epoch: 44733, Training Loss: 0.00606
Epoch: 44734, Training Loss: 0.00683
Epoch: 44734, Training Loss: 0.00640
Epoch: 44734, Training Loss: 0.00781
Epoch: 44734, Training Loss: 0.00606
Epoch: 44735, Training Loss: 0.00683
Epoch: 44735, Training Loss: 0.00640
Epoch: 44735, Training Loss: 0.00781
Epoch: 44735, Training Loss: 0.00606
Epoch: 44736, Training Loss: 0.00683
Epoch: 44736, Training Loss: 0.00640
Epoch: 44736, Training Loss: 0.00781
Epoch: 44736, Training Loss: 0.00606
Epoch: 44737, Training Loss: 0.00683
Epoch: 44737, Training Loss: 0.00640
Epoch: 44737, Training Loss: 0.00781
Epoch: 44737, Training Loss: 0.00606
Epoch: 44738, Training Loss: 0.00683
Epoch: 44738, Training Loss: 0.00640
Epoch: 44738, Training Loss: 0.00781
Epoch: 44738, Training Loss: 0.00606
Epoch: 44739, Training Loss: 0.00683
Epoch: 44739, Training Loss: 0.00640
Epoch: 44739, Training Loss: 0.00781
Epoch: 44739, Training Loss: 0.00606
Epoch: 44740, Training Loss: 0.00683
Epoch: 44740, Training Loss: 0.00640
Epoch: 44740, Training Loss: 0.00781
Epoch: 44740, Training Loss: 0.00606
Epoch: 44741, Training Loss: 0.00683
Epoch: 44741, Training Loss: 0.00640
Epoch: 44741, Training Loss: 0.00781
Epoch: 44741, Training Loss: 0.00606
Epoch: 44742, Training Loss: 0.00683
Epoch: 44742, Training Loss: 0.00640
Epoch: 44742, Training Loss: 0.00781
Epoch: 44742, Training Loss: 0.00606
Epoch: 44743, Training Loss: 0.00683
Epoch: 44743, Training Loss: 0.00640
Epoch: 44743, Training Loss: 0.00781
Epoch: 44743, Training Loss: 0.00606
Epoch: 44744, Training Loss: 0.00683
Epoch: 44744, Training Loss: 0.00640
Epoch: 44744, Training Loss: 0.00780
Epoch: 44744, Training Loss: 0.00606
Epoch: 44745, Training Loss: 0.00683
Epoch: 44745, Training Loss: 0.00640
Epoch: 44745, Training Loss: 0.00780
Epoch: 44745, Training Loss: 0.00606
Epoch: 44746, Training Loss: 0.00683
Epoch: 44746, Training Loss: 0.00640
Epoch: 44746, Training Loss: 0.00780
Epoch: 44746, Training Loss: 0.00606
Epoch: 44747, Training Loss: 0.00683
Epoch: 44747, Training Loss: 0.00640
Epoch: 44747, Training Loss: 0.00780
Epoch: 44747, Training Loss: 0.00606
Epoch: 44748, Training Loss: 0.00683
Epoch: 44748, Training Loss: 0.00640
Epoch: 44748, Training Loss: 0.00780
Epoch: 44748, Training Loss: 0.00606
Epoch: 44749, Training Loss: 0.00683
Epoch: 44749, Training Loss: 0.00640
Epoch: 44749, Training Loss: 0.00780
Epoch: 44749, Training Loss: 0.00606
Epoch: 44750, Training Loss: 0.00683
Epoch: 44750, Training Loss: 0.00640
Epoch: 44750, Training Loss: 0.00780
Epoch: 44750, Training Loss: 0.00606
Epoch: 44751, Training Loss: 0.00683
Epoch: 44751, Training Loss: 0.00640
Epoch: 44751, Training Loss: 0.00780
Epoch: 44751, Training Loss: 0.00606
Epoch: 44752, Training Loss: 0.00683
Epoch: 44752, Training Loss: 0.00640
Epoch: 44752, Training Loss: 0.00780
Epoch: 44752, Training Loss: 0.00606
Epoch: 44753, Training Loss: 0.00683
Epoch: 44753, Training Loss: 0.00640
Epoch: 44753, Training Loss: 0.00780
Epoch: 44753, Training Loss: 0.00606
Epoch: 44754, Training Loss: 0.00683
Epoch: 44754, Training Loss: 0.00640
Epoch: 44754, Training Loss: 0.00780
Epoch: 44754, Training Loss: 0.00606
Epoch: 44755, Training Loss: 0.00683
Epoch: 44755, Training Loss: 0.00640
Epoch: 44755, Training Loss: 0.00780
Epoch: 44755, Training Loss: 0.00606
Epoch: 44756, Training Loss: 0.00683
Epoch: 44756, Training Loss: 0.00640
Epoch: 44756, Training Loss: 0.00780
Epoch: 44756, Training Loss: 0.00606
Epoch: 44757, Training Loss: 0.00683
Epoch: 44757, Training Loss: 0.00640
Epoch: 44757, Training Loss: 0.00780
Epoch: 44757, Training Loss: 0.00606
Epoch: 44758, Training Loss: 0.00683
Epoch: 44758, Training Loss: 0.00640
Epoch: 44758, Training Loss: 0.00780
Epoch: 44758, Training Loss: 0.00606
Epoch: 44759, Training Loss: 0.00683
Epoch: 44759, Training Loss: 0.00640
Epoch: 44759, Training Loss: 0.00780
Epoch: 44759, Training Loss: 0.00606
Epoch: 44760, Training Loss: 0.00683
Epoch: 44760, Training Loss: 0.00640
Epoch: 44760, Training Loss: 0.00780
Epoch: 44760, Training Loss: 0.00606
Epoch: 44761, Training Loss: 0.00683
Epoch: 44761, Training Loss: 0.00640
Epoch: 44761, Training Loss: 0.00780
Epoch: 44761, Training Loss: 0.00606
Epoch: 44762, Training Loss: 0.00683
Epoch: 44762, Training Loss: 0.00640
Epoch: 44762, Training Loss: 0.00780
Epoch: 44762, Training Loss: 0.00605
Epoch: 44763, Training Loss: 0.00683
Epoch: 44763, Training Loss: 0.00640
Epoch: 44763, Training Loss: 0.00780
Epoch: 44763, Training Loss: 0.00605
Epoch: 44764, Training Loss: 0.00683
Epoch: 44764, Training Loss: 0.00640
Epoch: 44764, Training Loss: 0.00780
Epoch: 44764, Training Loss: 0.00605
Epoch: 44765, Training Loss: 0.00683
Epoch: 44765, Training Loss: 0.00640
Epoch: 44765, Training Loss: 0.00780
Epoch: 44765, Training Loss: 0.00605
Epoch: 44766, Training Loss: 0.00683
Epoch: 44766, Training Loss: 0.00640
Epoch: 44766, Training Loss: 0.00780
Epoch: 44766, Training Loss: 0.00605
Epoch: 44767, Training Loss: 0.00683
Epoch: 44767, Training Loss: 0.00640
Epoch: 44767, Training Loss: 0.00780
Epoch: 44767, Training Loss: 0.00605
Epoch: 44768, Training Loss: 0.00683
Epoch: 44768, Training Loss: 0.00640
Epoch: 44768, Training Loss: 0.00780
Epoch: 44768, Training Loss: 0.00605
Epoch: 44769, Training Loss: 0.00683
Epoch: 44769, Training Loss: 0.00640
Epoch: 44769, Training Loss: 0.00780
Epoch: 44769, Training Loss: 0.00605
Epoch: 44770, Training Loss: 0.00683
Epoch: 44770, Training Loss: 0.00640
Epoch: 44770, Training Loss: 0.00780
Epoch: 44770, Training Loss: 0.00605
Epoch: 44771, Training Loss: 0.00683
Epoch: 44771, Training Loss: 0.00639
Epoch: 44771, Training Loss: 0.00780
Epoch: 44771, Training Loss: 0.00605
Epoch: 44772, Training Loss: 0.00683
Epoch: 44772, Training Loss: 0.00639
Epoch: 44772, Training Loss: 0.00780
Epoch: 44772, Training Loss: 0.00605
Epoch: 44773, Training Loss: 0.00683
Epoch: 44773, Training Loss: 0.00639
Epoch: 44773, Training Loss: 0.00780
Epoch: 44773, Training Loss: 0.00605
Epoch: 44774, Training Loss: 0.00683
Epoch: 44774, Training Loss: 0.00639
Epoch: 44774, Training Loss: 0.00780
Epoch: 44774, Training Loss: 0.00605
Epoch: 44775, Training Loss: 0.00683
Epoch: 44775, Training Loss: 0.00639
Epoch: 44775, Training Loss: 0.00780
Epoch: 44775, Training Loss: 0.00605
Epoch: 44776, Training Loss: 0.00683
Epoch: 44776, Training Loss: 0.00639
Epoch: 44776, Training Loss: 0.00780
Epoch: 44776, Training Loss: 0.00605
Epoch: 44777, Training Loss: 0.00683
Epoch: 44777, Training Loss: 0.00639
Epoch: 44777, Training Loss: 0.00780
Epoch: 44777, Training Loss: 0.00605
Epoch: 44778, Training Loss: 0.00683
Epoch: 44778, Training Loss: 0.00639
Epoch: 44778, Training Loss: 0.00780
Epoch: 44778, Training Loss: 0.00605
Epoch: 44779, Training Loss: 0.00683
Epoch: 44779, Training Loss: 0.00639
Epoch: 44779, Training Loss: 0.00780
Epoch: 44779, Training Loss: 0.00605
Epoch: 44780, Training Loss: 0.00683
Epoch: 44780, Training Loss: 0.00639
Epoch: 44780, Training Loss: 0.00780
Epoch: 44780, Training Loss: 0.00605
Epoch: 44781, Training Loss: 0.00683
Epoch: 44781, Training Loss: 0.00639
Epoch: 44781, Training Loss: 0.00780
Epoch: 44781, Training Loss: 0.00605
Epoch: 44782, Training Loss: 0.00683
Epoch: 44782, Training Loss: 0.00639
Epoch: 44782, Training Loss: 0.00780
Epoch: 44782, Training Loss: 0.00605
Epoch: 44783, Training Loss: 0.00683
Epoch: 44783, Training Loss: 0.00639
Epoch: 44783, Training Loss: 0.00780
Epoch: 44783, Training Loss: 0.00605
Epoch: 44784, Training Loss: 0.00683
Epoch: 44784, Training Loss: 0.00639
Epoch: 44784, Training Loss: 0.00780
Epoch: 44784, Training Loss: 0.00605
Epoch: 44785, Training Loss: 0.00683
Epoch: 44785, Training Loss: 0.00639
Epoch: 44785, Training Loss: 0.00780
Epoch: 44785, Training Loss: 0.00605
Epoch: 44786, Training Loss: 0.00683
Epoch: 44786, Training Loss: 0.00639
Epoch: 44786, Training Loss: 0.00780
Epoch: 44786, Training Loss: 0.00605
Epoch: 44787, Training Loss: 0.00683
Epoch: 44787, Training Loss: 0.00639
Epoch: 44787, Training Loss: 0.00780
Epoch: 44787, Training Loss: 0.00605
Epoch: 44788, Training Loss: 0.00683
Epoch: 44788, Training Loss: 0.00639
Epoch: 44788, Training Loss: 0.00780
Epoch: 44788, Training Loss: 0.00605
Epoch: 44789, Training Loss: 0.00683
Epoch: 44789, Training Loss: 0.00639
Epoch: 44789, Training Loss: 0.00780
Epoch: 44789, Training Loss: 0.00605
Epoch: 44790, Training Loss: 0.00683
Epoch: 44790, Training Loss: 0.00639
Epoch: 44790, Training Loss: 0.00780
Epoch: 44790, Training Loss: 0.00605
Epoch: 44791, Training Loss: 0.00683
Epoch: 44791, Training Loss: 0.00639
Epoch: 44791, Training Loss: 0.00780
Epoch: 44791, Training Loss: 0.00605
Epoch: 44792, Training Loss: 0.00683
Epoch: 44792, Training Loss: 0.00639
Epoch: 44792, Training Loss: 0.00780
Epoch: 44792, Training Loss: 0.00605
Epoch: 44793, Training Loss: 0.00683
Epoch: 44793, Training Loss: 0.00639
Epoch: 44793, Training Loss: 0.00780
Epoch: 44793, Training Loss: 0.00605
Epoch: 44794, Training Loss: 0.00683
Epoch: 44794, Training Loss: 0.00639
Epoch: 44794, Training Loss: 0.00780
Epoch: 44794, Training Loss: 0.00605
Epoch: 44795, Training Loss: 0.00683
Epoch: 44795, Training Loss: 0.00639
Epoch: 44795, Training Loss: 0.00780
Epoch: 44795, Training Loss: 0.00605
Epoch: 44796, Training Loss: 0.00683
Epoch: 44796, Training Loss: 0.00639
Epoch: 44796, Training Loss: 0.00780
Epoch: 44796, Training Loss: 0.00605
Epoch: 44797, Training Loss: 0.00683
Epoch: 44797, Training Loss: 0.00639
Epoch: 44797, Training Loss: 0.00780
Epoch: 44797, Training Loss: 0.00605
Epoch: 44798, Training Loss: 0.00683
Epoch: 44798, Training Loss: 0.00639
Epoch: 44798, Training Loss: 0.00780
Epoch: 44798, Training Loss: 0.00605
Epoch: 44799, Training Loss: 0.00683
Epoch: 44799, Training Loss: 0.00639
Epoch: 44799, Training Loss: 0.00780
Epoch: 44799, Training Loss: 0.00605
Epoch: 44800, Training Loss: 0.00682
Epoch: 44800, Training Loss: 0.00639
Epoch: 44800, Training Loss: 0.00780
Epoch: 44800, Training Loss: 0.00605
Epoch: 44801, Training Loss: 0.00682
Epoch: 44801, Training Loss: 0.00639
Epoch: 44801, Training Loss: 0.00780
Epoch: 44801, Training Loss: 0.00605
Epoch: 44802, Training Loss: 0.00682
Epoch: 44802, Training Loss: 0.00639
Epoch: 44802, Training Loss: 0.00780
Epoch: 44802, Training Loss: 0.00605
Epoch: 44803, Training Loss: 0.00682
Epoch: 44803, Training Loss: 0.00639
Epoch: 44803, Training Loss: 0.00780
Epoch: 44803, Training Loss: 0.00605
Epoch: 44804, Training Loss: 0.00682
Epoch: 44804, Training Loss: 0.00639
Epoch: 44804, Training Loss: 0.00780
Epoch: 44804, Training Loss: 0.00605
[... repetitive per-pattern log output elided: the losses for the four XOR patterns decrease only very slowly here, from roughly (0.00682, 0.00639, 0.00780, 0.00605) at epoch 44804 to (0.00681, 0.00637, 0.00778, 0.00604) at epoch 45047 ...]
Epoch: 45047, Training Loss: 0.00681
Epoch: 45047, Training Loss: 0.00637
Epoch: 45047, Training Loss: 0.00778
Epoch: 45047, Training Loss: 0.00604
Epoch: 45048, Training Loss: 0.00681
Epoch: 45048, Training Loss: 0.00637
Epoch: 45048, Training Loss: 0.00778
Epoch: 45048, Training Loss: 0.00604
Epoch: 45049, Training Loss: 0.00681
Epoch: 45049, Training Loss: 0.00637
Epoch: 45049, Training Loss: 0.00778
Epoch: 45049, Training Loss: 0.00603
Epoch: 45050, Training Loss: 0.00681
Epoch: 45050, Training Loss: 0.00637
Epoch: 45050, Training Loss: 0.00778
Epoch: 45050, Training Loss: 0.00603
Epoch: 45051, Training Loss: 0.00680
Epoch: 45051, Training Loss: 0.00637
Epoch: 45051, Training Loss: 0.00778
Epoch: 45051, Training Loss: 0.00603
Epoch: 45052, Training Loss: 0.00680
Epoch: 45052, Training Loss: 0.00637
Epoch: 45052, Training Loss: 0.00778
Epoch: 45052, Training Loss: 0.00603
Epoch: 45053, Training Loss: 0.00680
Epoch: 45053, Training Loss: 0.00637
Epoch: 45053, Training Loss: 0.00778
Epoch: 45053, Training Loss: 0.00603
Epoch: 45054, Training Loss: 0.00680
Epoch: 45054, Training Loss: 0.00637
Epoch: 45054, Training Loss: 0.00778
Epoch: 45054, Training Loss: 0.00603
Epoch: 45055, Training Loss: 0.00680
Epoch: 45055, Training Loss: 0.00637
Epoch: 45055, Training Loss: 0.00778
Epoch: 45055, Training Loss: 0.00603
Epoch: 45056, Training Loss: 0.00680
Epoch: 45056, Training Loss: 0.00637
Epoch: 45056, Training Loss: 0.00778
Epoch: 45056, Training Loss: 0.00603
Epoch: 45057, Training Loss: 0.00680
Epoch: 45057, Training Loss: 0.00637
Epoch: 45057, Training Loss: 0.00778
Epoch: 45057, Training Loss: 0.00603
Epoch: 45058, Training Loss: 0.00680
Epoch: 45058, Training Loss: 0.00637
Epoch: 45058, Training Loss: 0.00778
Epoch: 45058, Training Loss: 0.00603
Epoch: 45059, Training Loss: 0.00680
Epoch: 45059, Training Loss: 0.00637
Epoch: 45059, Training Loss: 0.00778
Epoch: 45059, Training Loss: 0.00603
Epoch: 45060, Training Loss: 0.00680
Epoch: 45060, Training Loss: 0.00637
Epoch: 45060, Training Loss: 0.00778
Epoch: 45060, Training Loss: 0.00603
Epoch: 45061, Training Loss: 0.00680
Epoch: 45061, Training Loss: 0.00637
Epoch: 45061, Training Loss: 0.00778
Epoch: 45061, Training Loss: 0.00603
Epoch: 45062, Training Loss: 0.00680
Epoch: 45062, Training Loss: 0.00637
Epoch: 45062, Training Loss: 0.00778
Epoch: 45062, Training Loss: 0.00603
Epoch: 45063, Training Loss: 0.00680
Epoch: 45063, Training Loss: 0.00637
Epoch: 45063, Training Loss: 0.00778
Epoch: 45063, Training Loss: 0.00603
Epoch: 45064, Training Loss: 0.00680
Epoch: 45064, Training Loss: 0.00637
Epoch: 45064, Training Loss: 0.00778
Epoch: 45064, Training Loss: 0.00603
Epoch: 45065, Training Loss: 0.00680
Epoch: 45065, Training Loss: 0.00637
Epoch: 45065, Training Loss: 0.00778
Epoch: 45065, Training Loss: 0.00603
Epoch: 45066, Training Loss: 0.00680
Epoch: 45066, Training Loss: 0.00637
Epoch: 45066, Training Loss: 0.00778
Epoch: 45066, Training Loss: 0.00603
Epoch: 45067, Training Loss: 0.00680
Epoch: 45067, Training Loss: 0.00637
Epoch: 45067, Training Loss: 0.00778
Epoch: 45067, Training Loss: 0.00603
Epoch: 45068, Training Loss: 0.00680
Epoch: 45068, Training Loss: 0.00637
Epoch: 45068, Training Loss: 0.00778
Epoch: 45068, Training Loss: 0.00603
Epoch: 45069, Training Loss: 0.00680
Epoch: 45069, Training Loss: 0.00637
Epoch: 45069, Training Loss: 0.00778
Epoch: 45069, Training Loss: 0.00603
Epoch: 45070, Training Loss: 0.00680
Epoch: 45070, Training Loss: 0.00637
Epoch: 45070, Training Loss: 0.00778
Epoch: 45070, Training Loss: 0.00603
Epoch: 45071, Training Loss: 0.00680
Epoch: 45071, Training Loss: 0.00637
Epoch: 45071, Training Loss: 0.00778
Epoch: 45071, Training Loss: 0.00603
Epoch: 45072, Training Loss: 0.00680
Epoch: 45072, Training Loss: 0.00637
Epoch: 45072, Training Loss: 0.00778
Epoch: 45072, Training Loss: 0.00603
Epoch: 45073, Training Loss: 0.00680
Epoch: 45073, Training Loss: 0.00637
Epoch: 45073, Training Loss: 0.00778
Epoch: 45073, Training Loss: 0.00603
Epoch: 45074, Training Loss: 0.00680
Epoch: 45074, Training Loss: 0.00637
Epoch: 45074, Training Loss: 0.00778
Epoch: 45074, Training Loss: 0.00603
Epoch: 45075, Training Loss: 0.00680
Epoch: 45075, Training Loss: 0.00637
Epoch: 45075, Training Loss: 0.00777
Epoch: 45075, Training Loss: 0.00603
Epoch: 45076, Training Loss: 0.00680
Epoch: 45076, Training Loss: 0.00637
Epoch: 45076, Training Loss: 0.00777
Epoch: 45076, Training Loss: 0.00603
Epoch: 45077, Training Loss: 0.00680
Epoch: 45077, Training Loss: 0.00637
Epoch: 45077, Training Loss: 0.00777
Epoch: 45077, Training Loss: 0.00603
Epoch: 45078, Training Loss: 0.00680
Epoch: 45078, Training Loss: 0.00637
Epoch: 45078, Training Loss: 0.00777
Epoch: 45078, Training Loss: 0.00603
Epoch: 45079, Training Loss: 0.00680
Epoch: 45079, Training Loss: 0.00637
Epoch: 45079, Training Loss: 0.00777
Epoch: 45079, Training Loss: 0.00603
Epoch: 45080, Training Loss: 0.00680
Epoch: 45080, Training Loss: 0.00637
Epoch: 45080, Training Loss: 0.00777
Epoch: 45080, Training Loss: 0.00603
Epoch: 45081, Training Loss: 0.00680
Epoch: 45081, Training Loss: 0.00637
Epoch: 45081, Training Loss: 0.00777
Epoch: 45081, Training Loss: 0.00603
Epoch: 45082, Training Loss: 0.00680
Epoch: 45082, Training Loss: 0.00637
Epoch: 45082, Training Loss: 0.00777
Epoch: 45082, Training Loss: 0.00603
Epoch: 45083, Training Loss: 0.00680
Epoch: 45083, Training Loss: 0.00637
Epoch: 45083, Training Loss: 0.00777
Epoch: 45083, Training Loss: 0.00603
Epoch: 45084, Training Loss: 0.00680
Epoch: 45084, Training Loss: 0.00637
Epoch: 45084, Training Loss: 0.00777
Epoch: 45084, Training Loss: 0.00603
Epoch: 45085, Training Loss: 0.00680
Epoch: 45085, Training Loss: 0.00637
Epoch: 45085, Training Loss: 0.00777
Epoch: 45085, Training Loss: 0.00603
Epoch: 45086, Training Loss: 0.00680
Epoch: 45086, Training Loss: 0.00637
Epoch: 45086, Training Loss: 0.00777
Epoch: 45086, Training Loss: 0.00603
Epoch: 45087, Training Loss: 0.00680
Epoch: 45087, Training Loss: 0.00637
Epoch: 45087, Training Loss: 0.00777
Epoch: 45087, Training Loss: 0.00603
Epoch: 45088, Training Loss: 0.00680
Epoch: 45088, Training Loss: 0.00637
Epoch: 45088, Training Loss: 0.00777
Epoch: 45088, Training Loss: 0.00603
Epoch: 45089, Training Loss: 0.00680
Epoch: 45089, Training Loss: 0.00637
Epoch: 45089, Training Loss: 0.00777
Epoch: 45089, Training Loss: 0.00603
Epoch: 45090, Training Loss: 0.00680
Epoch: 45090, Training Loss: 0.00637
Epoch: 45090, Training Loss: 0.00777
Epoch: 45090, Training Loss: 0.00603
Epoch: 45091, Training Loss: 0.00680
Epoch: 45091, Training Loss: 0.00637
Epoch: 45091, Training Loss: 0.00777
Epoch: 45091, Training Loss: 0.00603
Epoch: 45092, Training Loss: 0.00680
Epoch: 45092, Training Loss: 0.00637
Epoch: 45092, Training Loss: 0.00777
Epoch: 45092, Training Loss: 0.00603
Epoch: 45093, Training Loss: 0.00680
Epoch: 45093, Training Loss: 0.00637
Epoch: 45093, Training Loss: 0.00777
Epoch: 45093, Training Loss: 0.00603
Epoch: 45094, Training Loss: 0.00680
Epoch: 45094, Training Loss: 0.00637
Epoch: 45094, Training Loss: 0.00777
Epoch: 45094, Training Loss: 0.00603
Epoch: 45095, Training Loss: 0.00680
Epoch: 45095, Training Loss: 0.00637
Epoch: 45095, Training Loss: 0.00777
Epoch: 45095, Training Loss: 0.00603
Epoch: 45096, Training Loss: 0.00680
Epoch: 45096, Training Loss: 0.00637
Epoch: 45096, Training Loss: 0.00777
Epoch: 45096, Training Loss: 0.00603
Epoch: 45097, Training Loss: 0.00680
Epoch: 45097, Training Loss: 0.00637
Epoch: 45097, Training Loss: 0.00777
Epoch: 45097, Training Loss: 0.00603
Epoch: 45098, Training Loss: 0.00680
Epoch: 45098, Training Loss: 0.00637
Epoch: 45098, Training Loss: 0.00777
Epoch: 45098, Training Loss: 0.00603
Epoch: 45099, Training Loss: 0.00680
Epoch: 45099, Training Loss: 0.00637
Epoch: 45099, Training Loss: 0.00777
Epoch: 45099, Training Loss: 0.00603
Epoch: 45100, Training Loss: 0.00680
Epoch: 45100, Training Loss: 0.00637
Epoch: 45100, Training Loss: 0.00777
Epoch: 45100, Training Loss: 0.00603
Epoch: 45101, Training Loss: 0.00680
Epoch: 45101, Training Loss: 0.00637
Epoch: 45101, Training Loss: 0.00777
Epoch: 45101, Training Loss: 0.00603
Epoch: 45102, Training Loss: 0.00680
Epoch: 45102, Training Loss: 0.00637
Epoch: 45102, Training Loss: 0.00777
Epoch: 45102, Training Loss: 0.00603
Epoch: 45103, Training Loss: 0.00680
Epoch: 45103, Training Loss: 0.00637
Epoch: 45103, Training Loss: 0.00777
Epoch: 45103, Training Loss: 0.00603
Epoch: 45104, Training Loss: 0.00680
Epoch: 45104, Training Loss: 0.00637
Epoch: 45104, Training Loss: 0.00777
Epoch: 45104, Training Loss: 0.00603
Epoch: 45105, Training Loss: 0.00680
Epoch: 45105, Training Loss: 0.00637
Epoch: 45105, Training Loss: 0.00777
Epoch: 45105, Training Loss: 0.00603
Epoch: 45106, Training Loss: 0.00680
Epoch: 45106, Training Loss: 0.00637
Epoch: 45106, Training Loss: 0.00777
Epoch: 45106, Training Loss: 0.00603
Epoch: 45107, Training Loss: 0.00680
Epoch: 45107, Training Loss: 0.00637
Epoch: 45107, Training Loss: 0.00777
Epoch: 45107, Training Loss: 0.00603
Epoch: 45108, Training Loss: 0.00680
Epoch: 45108, Training Loss: 0.00637
Epoch: 45108, Training Loss: 0.00777
Epoch: 45108, Training Loss: 0.00603
Epoch: 45109, Training Loss: 0.00680
Epoch: 45109, Training Loss: 0.00637
Epoch: 45109, Training Loss: 0.00777
Epoch: 45109, Training Loss: 0.00603
Epoch: 45110, Training Loss: 0.00680
Epoch: 45110, Training Loss: 0.00637
Epoch: 45110, Training Loss: 0.00777
Epoch: 45110, Training Loss: 0.00603
Epoch: 45111, Training Loss: 0.00680
Epoch: 45111, Training Loss: 0.00637
Epoch: 45111, Training Loss: 0.00777
Epoch: 45111, Training Loss: 0.00603
Epoch: 45112, Training Loss: 0.00680
Epoch: 45112, Training Loss: 0.00637
Epoch: 45112, Training Loss: 0.00777
Epoch: 45112, Training Loss: 0.00603
Epoch: 45113, Training Loss: 0.00680
Epoch: 45113, Training Loss: 0.00637
Epoch: 45113, Training Loss: 0.00777
Epoch: 45113, Training Loss: 0.00603
Epoch: 45114, Training Loss: 0.00680
Epoch: 45114, Training Loss: 0.00637
Epoch: 45114, Training Loss: 0.00777
Epoch: 45114, Training Loss: 0.00603
Epoch: 45115, Training Loss: 0.00680
Epoch: 45115, Training Loss: 0.00637
Epoch: 45115, Training Loss: 0.00777
Epoch: 45115, Training Loss: 0.00603
Epoch: 45116, Training Loss: 0.00680
Epoch: 45116, Training Loss: 0.00637
Epoch: 45116, Training Loss: 0.00777
Epoch: 45116, Training Loss: 0.00603
Epoch: 45117, Training Loss: 0.00680
Epoch: 45117, Training Loss: 0.00637
Epoch: 45117, Training Loss: 0.00777
Epoch: 45117, Training Loss: 0.00603
Epoch: 45118, Training Loss: 0.00680
Epoch: 45118, Training Loss: 0.00637
Epoch: 45118, Training Loss: 0.00777
Epoch: 45118, Training Loss: 0.00603
Epoch: 45119, Training Loss: 0.00680
Epoch: 45119, Training Loss: 0.00637
Epoch: 45119, Training Loss: 0.00777
Epoch: 45119, Training Loss: 0.00603
Epoch: 45120, Training Loss: 0.00680
Epoch: 45120, Training Loss: 0.00637
Epoch: 45120, Training Loss: 0.00777
Epoch: 45120, Training Loss: 0.00603
Epoch: 45121, Training Loss: 0.00680
Epoch: 45121, Training Loss: 0.00637
Epoch: 45121, Training Loss: 0.00777
Epoch: 45121, Training Loss: 0.00603
Epoch: 45122, Training Loss: 0.00680
Epoch: 45122, Training Loss: 0.00637
Epoch: 45122, Training Loss: 0.00777
Epoch: 45122, Training Loss: 0.00603
Epoch: 45123, Training Loss: 0.00680
Epoch: 45123, Training Loss: 0.00637
Epoch: 45123, Training Loss: 0.00777
Epoch: 45123, Training Loss: 0.00603
Epoch: 45124, Training Loss: 0.00680
Epoch: 45124, Training Loss: 0.00637
Epoch: 45124, Training Loss: 0.00777
Epoch: 45124, Training Loss: 0.00603
Epoch: 45125, Training Loss: 0.00680
Epoch: 45125, Training Loss: 0.00637
Epoch: 45125, Training Loss: 0.00777
Epoch: 45125, Training Loss: 0.00603
Epoch: 45126, Training Loss: 0.00680
Epoch: 45126, Training Loss: 0.00637
Epoch: 45126, Training Loss: 0.00777
Epoch: 45126, Training Loss: 0.00603
Epoch: 45127, Training Loss: 0.00680
Epoch: 45127, Training Loss: 0.00637
Epoch: 45127, Training Loss: 0.00777
Epoch: 45127, Training Loss: 0.00603
Epoch: 45128, Training Loss: 0.00680
Epoch: 45128, Training Loss: 0.00637
Epoch: 45128, Training Loss: 0.00777
Epoch: 45128, Training Loss: 0.00603
Epoch: 45129, Training Loss: 0.00680
Epoch: 45129, Training Loss: 0.00637
Epoch: 45129, Training Loss: 0.00777
Epoch: 45129, Training Loss: 0.00603
Epoch: 45130, Training Loss: 0.00680
Epoch: 45130, Training Loss: 0.00637
Epoch: 45130, Training Loss: 0.00777
Epoch: 45130, Training Loss: 0.00603
Epoch: 45131, Training Loss: 0.00680
Epoch: 45131, Training Loss: 0.00637
Epoch: 45131, Training Loss: 0.00777
Epoch: 45131, Training Loss: 0.00603
Epoch: 45132, Training Loss: 0.00680
Epoch: 45132, Training Loss: 0.00637
Epoch: 45132, Training Loss: 0.00777
Epoch: 45132, Training Loss: 0.00603
Epoch: 45133, Training Loss: 0.00680
Epoch: 45133, Training Loss: 0.00637
Epoch: 45133, Training Loss: 0.00777
Epoch: 45133, Training Loss: 0.00603
Epoch: 45134, Training Loss: 0.00680
Epoch: 45134, Training Loss: 0.00637
Epoch: 45134, Training Loss: 0.00777
Epoch: 45134, Training Loss: 0.00603
Epoch: 45135, Training Loss: 0.00680
Epoch: 45135, Training Loss: 0.00637
Epoch: 45135, Training Loss: 0.00777
Epoch: 45135, Training Loss: 0.00603
Epoch: 45136, Training Loss: 0.00680
Epoch: 45136, Training Loss: 0.00637
Epoch: 45136, Training Loss: 0.00777
Epoch: 45136, Training Loss: 0.00603
Epoch: 45137, Training Loss: 0.00680
Epoch: 45137, Training Loss: 0.00637
Epoch: 45137, Training Loss: 0.00777
Epoch: 45137, Training Loss: 0.00603
Epoch: 45138, Training Loss: 0.00680
Epoch: 45138, Training Loss: 0.00637
Epoch: 45138, Training Loss: 0.00777
Epoch: 45138, Training Loss: 0.00603
Epoch: 45139, Training Loss: 0.00680
Epoch: 45139, Training Loss: 0.00637
Epoch: 45139, Training Loss: 0.00777
Epoch: 45139, Training Loss: 0.00603
Epoch: 45140, Training Loss: 0.00680
Epoch: 45140, Training Loss: 0.00637
Epoch: 45140, Training Loss: 0.00777
Epoch: 45140, Training Loss: 0.00603
Epoch: 45141, Training Loss: 0.00680
Epoch: 45141, Training Loss: 0.00637
Epoch: 45141, Training Loss: 0.00777
Epoch: 45141, Training Loss: 0.00603
Epoch: 45142, Training Loss: 0.00680
Epoch: 45142, Training Loss: 0.00637
Epoch: 45142, Training Loss: 0.00777
Epoch: 45142, Training Loss: 0.00603
Epoch: 45143, Training Loss: 0.00680
Epoch: 45143, Training Loss: 0.00637
Epoch: 45143, Training Loss: 0.00777
Epoch: 45143, Training Loss: 0.00603
Epoch: 45144, Training Loss: 0.00680
Epoch: 45144, Training Loss: 0.00637
Epoch: 45144, Training Loss: 0.00777
Epoch: 45144, Training Loss: 0.00603
Epoch: 45145, Training Loss: 0.00680
Epoch: 45145, Training Loss: 0.00637
Epoch: 45145, Training Loss: 0.00777
Epoch: 45145, Training Loss: 0.00603
Epoch: 45146, Training Loss: 0.00680
Epoch: 45146, Training Loss: 0.00637
Epoch: 45146, Training Loss: 0.00777
Epoch: 45146, Training Loss: 0.00603
Epoch: 45147, Training Loss: 0.00680
Epoch: 45147, Training Loss: 0.00637
Epoch: 45147, Training Loss: 0.00777
Epoch: 45147, Training Loss: 0.00603
Epoch: 45148, Training Loss: 0.00680
Epoch: 45148, Training Loss: 0.00637
Epoch: 45148, Training Loss: 0.00777
Epoch: 45148, Training Loss: 0.00603
Epoch: 45149, Training Loss: 0.00680
Epoch: 45149, Training Loss: 0.00637
Epoch: 45149, Training Loss: 0.00777
Epoch: 45149, Training Loss: 0.00603
Epoch: 45150, Training Loss: 0.00680
Epoch: 45150, Training Loss: 0.00637
Epoch: 45150, Training Loss: 0.00777
Epoch: 45150, Training Loss: 0.00603
Epoch: 45151, Training Loss: 0.00680
Epoch: 45151, Training Loss: 0.00637
Epoch: 45151, Training Loss: 0.00777
Epoch: 45151, Training Loss: 0.00603
Epoch: 45152, Training Loss: 0.00680
Epoch: 45152, Training Loss: 0.00637
Epoch: 45152, Training Loss: 0.00777
Epoch: 45152, Training Loss: 0.00603
Epoch: 45153, Training Loss: 0.00680
Epoch: 45153, Training Loss: 0.00637
Epoch: 45153, Training Loss: 0.00777
Epoch: 45153, Training Loss: 0.00603
Epoch: 45154, Training Loss: 0.00680
Epoch: 45154, Training Loss: 0.00637
Epoch: 45154, Training Loss: 0.00777
Epoch: 45154, Training Loss: 0.00603
Epoch: 45155, Training Loss: 0.00680
Epoch: 45155, Training Loss: 0.00637
Epoch: 45155, Training Loss: 0.00777
Epoch: 45155, Training Loss: 0.00603
Epoch: 45156, Training Loss: 0.00680
Epoch: 45156, Training Loss: 0.00637
Epoch: 45156, Training Loss: 0.00777
Epoch: 45156, Training Loss: 0.00603
Epoch: 45157, Training Loss: 0.00680
Epoch: 45157, Training Loss: 0.00637
Epoch: 45157, Training Loss: 0.00777
Epoch: 45157, Training Loss: 0.00603
Epoch: 45158, Training Loss: 0.00680
Epoch: 45158, Training Loss: 0.00637
Epoch: 45158, Training Loss: 0.00777
Epoch: 45158, Training Loss: 0.00603
Epoch: 45159, Training Loss: 0.00680
Epoch: 45159, Training Loss: 0.00637
Epoch: 45159, Training Loss: 0.00777
Epoch: 45159, Training Loss: 0.00603
Epoch: 45160, Training Loss: 0.00680
Epoch: 45160, Training Loss: 0.00637
Epoch: 45160, Training Loss: 0.00777
Epoch: 45160, Training Loss: 0.00603
Epoch: 45161, Training Loss: 0.00680
Epoch: 45161, Training Loss: 0.00637
Epoch: 45161, Training Loss: 0.00777
Epoch: 45161, Training Loss: 0.00603
Epoch: 45162, Training Loss: 0.00680
Epoch: 45162, Training Loss: 0.00637
Epoch: 45162, Training Loss: 0.00777
Epoch: 45162, Training Loss: 0.00603
Epoch: 45163, Training Loss: 0.00680
Epoch: 45163, Training Loss: 0.00637
Epoch: 45163, Training Loss: 0.00777
Epoch: 45163, Training Loss: 0.00603
Epoch: 45164, Training Loss: 0.00680
Epoch: 45164, Training Loss: 0.00637
Epoch: 45164, Training Loss: 0.00777
Epoch: 45164, Training Loss: 0.00603
Epoch: 45165, Training Loss: 0.00680
Epoch: 45165, Training Loss: 0.00637
Epoch: 45165, Training Loss: 0.00777
Epoch: 45165, Training Loss: 0.00603
Epoch: 45166, Training Loss: 0.00680
Epoch: 45166, Training Loss: 0.00637
Epoch: 45166, Training Loss: 0.00777
Epoch: 45166, Training Loss: 0.00603
Epoch: 45167, Training Loss: 0.00680
Epoch: 45167, Training Loss: 0.00637
Epoch: 45167, Training Loss: 0.00777
Epoch: 45167, Training Loss: 0.00603
Epoch: 45168, Training Loss: 0.00680
Epoch: 45168, Training Loss: 0.00637
Epoch: 45168, Training Loss: 0.00777
Epoch: 45168, Training Loss: 0.00603
Epoch: 45169, Training Loss: 0.00680
Epoch: 45169, Training Loss: 0.00637
Epoch: 45169, Training Loss: 0.00777
Epoch: 45169, Training Loss: 0.00603
Epoch: 45170, Training Loss: 0.00680
Epoch: 45170, Training Loss: 0.00637
Epoch: 45170, Training Loss: 0.00777
Epoch: 45170, Training Loss: 0.00603
Epoch: 45171, Training Loss: 0.00680
Epoch: 45171, Training Loss: 0.00637
Epoch: 45171, Training Loss: 0.00777
Epoch: 45171, Training Loss: 0.00603
Epoch: 45172, Training Loss: 0.00680
Epoch: 45172, Training Loss: 0.00637
Epoch: 45172, Training Loss: 0.00777
Epoch: 45172, Training Loss: 0.00603
Epoch: 45173, Training Loss: 0.00680
Epoch: 45173, Training Loss: 0.00637
Epoch: 45173, Training Loss: 0.00777
Epoch: 45173, Training Loss: 0.00603
Epoch: 45174, Training Loss: 0.00680
Epoch: 45174, Training Loss: 0.00637
Epoch: 45174, Training Loss: 0.00777
Epoch: 45174, Training Loss: 0.00603
Epoch: 45175, Training Loss: 0.00680
Epoch: 45175, Training Loss: 0.00637
Epoch: 45175, Training Loss: 0.00777
Epoch: 45175, Training Loss: 0.00603
Epoch: 45176, Training Loss: 0.00680
Epoch: 45176, Training Loss: 0.00637
Epoch: 45176, Training Loss: 0.00777
Epoch: 45176, Training Loss: 0.00603
Epoch: 45177, Training Loss: 0.00679
Epoch: 45177, Training Loss: 0.00637
Epoch: 45177, Training Loss: 0.00777
Epoch: 45177, Training Loss: 0.00603
Epoch: 45178, Training Loss: 0.00679
Epoch: 45178, Training Loss: 0.00637
Epoch: 45178, Training Loss: 0.00777
Epoch: 45178, Training Loss: 0.00603
Epoch: 45179, Training Loss: 0.00679
Epoch: 45179, Training Loss: 0.00637
Epoch: 45179, Training Loss: 0.00777
Epoch: 45179, Training Loss: 0.00603
Epoch: 45180, Training Loss: 0.00679
Epoch: 45180, Training Loss: 0.00636
Epoch: 45180, Training Loss: 0.00777
Epoch: 45180, Training Loss: 0.00603
Epoch: 45181, Training Loss: 0.00679
Epoch: 45181, Training Loss: 0.00636
Epoch: 45181, Training Loss: 0.00777
Epoch: 45181, Training Loss: 0.00603
Epoch: 45182, Training Loss: 0.00679
Epoch: 45182, Training Loss: 0.00636
Epoch: 45182, Training Loss: 0.00777
Epoch: 45182, Training Loss: 0.00603
Epoch: 45183, Training Loss: 0.00679
Epoch: 45183, Training Loss: 0.00636
Epoch: 45183, Training Loss: 0.00777
Epoch: 45183, Training Loss: 0.00603
Epoch: 45184, Training Loss: 0.00679
Epoch: 45184, Training Loss: 0.00636
Epoch: 45184, Training Loss: 0.00777
Epoch: 45184, Training Loss: 0.00603
Epoch: 45185, Training Loss: 0.00679
Epoch: 45185, Training Loss: 0.00636
Epoch: 45185, Training Loss: 0.00777
Epoch: 45185, Training Loss: 0.00603
Epoch: 45186, Training Loss: 0.00679
Epoch: 45186, Training Loss: 0.00636
Epoch: 45186, Training Loss: 0.00776
Epoch: 45186, Training Loss: 0.00603
Epoch: 45187, Training Loss: 0.00679
Epoch: 45187, Training Loss: 0.00636
Epoch: 45187, Training Loss: 0.00776
Epoch: 45187, Training Loss: 0.00603
Epoch: 45188, Training Loss: 0.00679
Epoch: 45188, Training Loss: 0.00636
Epoch: 45188, Training Loss: 0.00776
Epoch: 45188, Training Loss: 0.00603
Epoch: 45189, Training Loss: 0.00679
Epoch: 45189, Training Loss: 0.00636
Epoch: 45189, Training Loss: 0.00776
Epoch: 45189, Training Loss: 0.00603
Epoch: 45190, Training Loss: 0.00679
Epoch: 45190, Training Loss: 0.00636
Epoch: 45190, Training Loss: 0.00776
Epoch: 45190, Training Loss: 0.00603
Epoch: 45191, Training Loss: 0.00679
Epoch: 45191, Training Loss: 0.00636
Epoch: 45191, Training Loss: 0.00776
Epoch: 45191, Training Loss: 0.00603
Epoch: 45192, Training Loss: 0.00679
Epoch: 45192, Training Loss: 0.00636
Epoch: 45192, Training Loss: 0.00776
Epoch: 45192, Training Loss: 0.00603
Epoch: 45193, Training Loss: 0.00679
Epoch: 45193, Training Loss: 0.00636
Epoch: 45193, Training Loss: 0.00776
Epoch: 45193, Training Loss: 0.00602
Epoch: 45194, Training Loss: 0.00679
Epoch: 45194, Training Loss: 0.00636
Epoch: 45194, Training Loss: 0.00776
Epoch: 45194, Training Loss: 0.00602
Epoch: 45195, Training Loss: 0.00679
Epoch: 45195, Training Loss: 0.00636
Epoch: 45195, Training Loss: 0.00776
Epoch: 45195, Training Loss: 0.00602
Epoch: 45196, Training Loss: 0.00679
Epoch: 45196, Training Loss: 0.00636
Epoch: 45196, Training Loss: 0.00776
Epoch: 45196, Training Loss: 0.00602
Epoch: 45197, Training Loss: 0.00679
Epoch: 45197, Training Loss: 0.00636
Epoch: 45197, Training Loss: 0.00776
Epoch: 45197, Training Loss: 0.00602
Epoch: 45198, Training Loss: 0.00679
Epoch: 45198, Training Loss: 0.00636
Epoch: 45198, Training Loss: 0.00776
Epoch: 45198, Training Loss: 0.00602
Epoch: 45199, Training Loss: 0.00679
Epoch: 45199, Training Loss: 0.00636
Epoch: 45199, Training Loss: 0.00776
Epoch: 45199, Training Loss: 0.00602
Epoch: 45200, Training Loss: 0.00679
Epoch: 45200, Training Loss: 0.00636
Epoch: 45200, Training Loss: 0.00776
Epoch: 45200, Training Loss: 0.00602
Epoch: 45201, Training Loss: 0.00679
Epoch: 45201, Training Loss: 0.00636
Epoch: 45201, Training Loss: 0.00776
Epoch: 45201, Training Loss: 0.00602
Epoch: 45202, Training Loss: 0.00679
Epoch: 45202, Training Loss: 0.00636
Epoch: 45202, Training Loss: 0.00776
Epoch: 45202, Training Loss: 0.00602
Epoch: 45203, Training Loss: 0.00679
Epoch: 45203, Training Loss: 0.00636
Epoch: 45203, Training Loss: 0.00776
Epoch: 45203, Training Loss: 0.00602
Epoch: 45204, Training Loss: 0.00679
Epoch: 45204, Training Loss: 0.00636
Epoch: 45204, Training Loss: 0.00776
Epoch: 45204, Training Loss: 0.00602
Epoch: 45205, Training Loss: 0.00679
Epoch: 45205, Training Loss: 0.00636
Epoch: 45205, Training Loss: 0.00776
Epoch: 45205, Training Loss: 0.00602
Epoch: 45206, Training Loss: 0.00679
Epoch: 45206, Training Loss: 0.00636
Epoch: 45206, Training Loss: 0.00776
Epoch: 45206, Training Loss: 0.00602
Epoch: 45207, Training Loss: 0.00679
Epoch: 45207, Training Loss: 0.00636
Epoch: 45207, Training Loss: 0.00776
Epoch: 45207, Training Loss: 0.00602
Epoch: 45208, Training Loss: 0.00679
Epoch: 45208, Training Loss: 0.00636
Epoch: 45208, Training Loss: 0.00776
Epoch: 45208, Training Loss: 0.00602
Epoch: 45209, Training Loss: 0.00679
Epoch: 45209, Training Loss: 0.00636
Epoch: 45209, Training Loss: 0.00776
Epoch: 45209, Training Loss: 0.00602
Epoch: 45210, Training Loss: 0.00679
Epoch: 45210, Training Loss: 0.00636
Epoch: 45210, Training Loss: 0.00776
Epoch: 45210, Training Loss: 0.00602
Epoch: 45211, Training Loss: 0.00679
Epoch: 45211, Training Loss: 0.00636
Epoch: 45211, Training Loss: 0.00776
Epoch: 45211, Training Loss: 0.00602
Epoch: 45212, Training Loss: 0.00679
Epoch: 45212, Training Loss: 0.00636
Epoch: 45212, Training Loss: 0.00776
Epoch: 45212, Training Loss: 0.00602
Epoch: 45213, Training Loss: 0.00679
Epoch: 45213, Training Loss: 0.00636
Epoch: 45213, Training Loss: 0.00776
Epoch: 45213, Training Loss: 0.00602
Epoch: 45214, Training Loss: 0.00679
Epoch: 45214, Training Loss: 0.00636
Epoch: 45214, Training Loss: 0.00776
Epoch: 45214, Training Loss: 0.00602
Epoch: 45215, Training Loss: 0.00679
Epoch: 45215, Training Loss: 0.00636
Epoch: 45215, Training Loss: 0.00776
Epoch: 45215, Training Loss: 0.00602
Epoch: 45216, Training Loss: 0.00679
Epoch: 45216, Training Loss: 0.00636
Epoch: 45216, Training Loss: 0.00776
Epoch: 45216, Training Loss: 0.00602
Epoch: 45217, Training Loss: 0.00679
Epoch: 45217, Training Loss: 0.00636
Epoch: 45217, Training Loss: 0.00776
Epoch: 45217, Training Loss: 0.00602
Epoch: 45218, Training Loss: 0.00679
Epoch: 45218, Training Loss: 0.00636
Epoch: 45218, Training Loss: 0.00776
Epoch: 45218, Training Loss: 0.00602
Epoch: 45219, Training Loss: 0.00679
Epoch: 45219, Training Loss: 0.00636
Epoch: 45219, Training Loss: 0.00776
Epoch: 45219, Training Loss: 0.00602
Epoch: 45220, Training Loss: 0.00679
Epoch: 45220, Training Loss: 0.00636
Epoch: 45220, Training Loss: 0.00776
Epoch: 45220, Training Loss: 0.00602
Epoch: 45221, Training Loss: 0.00679
Epoch: 45221, Training Loss: 0.00636
Epoch: 45221, Training Loss: 0.00776
Epoch: 45221, Training Loss: 0.00602
Epoch: 45222, Training Loss: 0.00679
Epoch: 45222, Training Loss: 0.00636
Epoch: 45222, Training Loss: 0.00776
Epoch: 45222, Training Loss: 0.00602
Epoch: 45223, Training Loss: 0.00679
Epoch: 45223, Training Loss: 0.00636
Epoch: 45223, Training Loss: 0.00776
Epoch: 45223, Training Loss: 0.00602
Epoch: 45224, Training Loss: 0.00679
Epoch: 45224, Training Loss: 0.00636
Epoch: 45224, Training Loss: 0.00776
Epoch: 45224, Training Loss: 0.00602
Epoch: 45225, Training Loss: 0.00679
Epoch: 45225, Training Loss: 0.00636
Epoch: 45225, Training Loss: 0.00776
Epoch: 45225, Training Loss: 0.00602
Epoch: 45226, Training Loss: 0.00679
Epoch: 45226, Training Loss: 0.00636
Epoch: 45226, Training Loss: 0.00776
Epoch: 45226, Training Loss: 0.00602
Epoch: 45227, Training Loss: 0.00679
Epoch: 45227, Training Loss: 0.00636
Epoch: 45227, Training Loss: 0.00776
Epoch: 45227, Training Loss: 0.00602
Epoch: 45228, Training Loss: 0.00679
Epoch: 45228, Training Loss: 0.00636
Epoch: 45228, Training Loss: 0.00776
Epoch: 45228, Training Loss: 0.00602
Epoch: 45229, Training Loss: 0.00679
Epoch: 45229, Training Loss: 0.00636
Epoch: 45229, Training Loss: 0.00776
Epoch: 45229, Training Loss: 0.00602
Epoch: 45230, Training Loss: 0.00679
Epoch: 45230, Training Loss: 0.00636
Epoch: 45230, Training Loss: 0.00776
Epoch: 45230, Training Loss: 0.00602
Epoch: 45231, Training Loss: 0.00679
Epoch: 45231, Training Loss: 0.00636
Epoch: 45231, Training Loss: 0.00776
Epoch: 45231, Training Loss: 0.00602
Epoch: 45232, Training Loss: 0.00679
Epoch: 45232, Training Loss: 0.00636
Epoch: 45232, Training Loss: 0.00776
Epoch: 45232, Training Loss: 0.00602
Epoch: 45233, Training Loss: 0.00679
Epoch: 45233, Training Loss: 0.00636
Epoch: 45233, Training Loss: 0.00776
Epoch: 45233, Training Loss: 0.00602
Epoch: 45234, Training Loss: 0.00679
Epoch: 45234, Training Loss: 0.00636
Epoch: 45234, Training Loss: 0.00776
Epoch: 45234, Training Loss: 0.00602
Epoch: 45235, Training Loss: 0.00679
Epoch: 45235, Training Loss: 0.00636
Epoch: 45235, Training Loss: 0.00776
Epoch: 45235, Training Loss: 0.00602
Epoch: 45236, Training Loss: 0.00679
Epoch: 45236, Training Loss: 0.00636
Epoch: 45236, Training Loss: 0.00776
Epoch: 45237, Training Loss: 0.00679
Epoch: 45237, Training Loss: 0.00636
Epoch: 45237, Training Loss: 0.00776
Epoch: 45237, Training Loss: 0.00602
[... output truncated: one loss line is printed per training sample (four per epoch); over epochs 45237–45480 the four per-sample losses decrease slowly, 0.00679→0.00677, 0.00636→0.00634, 0.00776→0.00774, 0.00602→0.00601 ...]
Epoch: 45480, Training Loss: 0.00677
Epoch: 45480, Training Loss: 0.00634
Epoch: 45480, Training Loss: 0.00774
Epoch: 45480, Training Loss: 0.00601
Epoch: 45481, Training Loss: 0.00677
Epoch: 45481, Training Loss: 0.00634
Epoch: 45481, Training Loss: 0.00774
Epoch: 45481, Training Loss: 0.00601
Epoch: 45482, Training Loss: 0.00677
Epoch: 45482, Training Loss: 0.00634
Epoch: 45482, Training Loss: 0.00774
Epoch: 45482, Training Loss: 0.00601
Epoch: 45483, Training Loss: 0.00677
Epoch: 45483, Training Loss: 0.00634
Epoch: 45483, Training Loss: 0.00774
Epoch: 45483, Training Loss: 0.00601
Epoch: 45484, Training Loss: 0.00677
Epoch: 45484, Training Loss: 0.00634
Epoch: 45484, Training Loss: 0.00774
Epoch: 45484, Training Loss: 0.00600
Epoch: 45485, Training Loss: 0.00677
Epoch: 45485, Training Loss: 0.00634
Epoch: 45485, Training Loss: 0.00774
Epoch: 45485, Training Loss: 0.00600
Epoch: 45486, Training Loss: 0.00677
Epoch: 45486, Training Loss: 0.00634
Epoch: 45486, Training Loss: 0.00774
Epoch: 45486, Training Loss: 0.00600
Epoch: 45487, Training Loss: 0.00677
Epoch: 45487, Training Loss: 0.00634
Epoch: 45487, Training Loss: 0.00774
Epoch: 45487, Training Loss: 0.00600
Epoch: 45488, Training Loss: 0.00677
Epoch: 45488, Training Loss: 0.00634
Epoch: 45488, Training Loss: 0.00774
Epoch: 45488, Training Loss: 0.00600
Epoch: 45489, Training Loss: 0.00677
Epoch: 45489, Training Loss: 0.00634
Epoch: 45489, Training Loss: 0.00774
Epoch: 45489, Training Loss: 0.00600
Epoch: 45490, Training Loss: 0.00677
Epoch: 45490, Training Loss: 0.00634
Epoch: 45490, Training Loss: 0.00774
Epoch: 45490, Training Loss: 0.00600
Epoch: 45491, Training Loss: 0.00677
Epoch: 45491, Training Loss: 0.00634
Epoch: 45491, Training Loss: 0.00774
Epoch: 45491, Training Loss: 0.00600
Epoch: 45492, Training Loss: 0.00677
Epoch: 45492, Training Loss: 0.00634
Epoch: 45492, Training Loss: 0.00774
Epoch: 45492, Training Loss: 0.00600
Epoch: 45493, Training Loss: 0.00677
Epoch: 45493, Training Loss: 0.00634
Epoch: 45493, Training Loss: 0.00774
Epoch: 45493, Training Loss: 0.00600
Epoch: 45494, Training Loss: 0.00677
Epoch: 45494, Training Loss: 0.00634
Epoch: 45494, Training Loss: 0.00774
Epoch: 45494, Training Loss: 0.00600
Epoch: 45495, Training Loss: 0.00677
Epoch: 45495, Training Loss: 0.00634
Epoch: 45495, Training Loss: 0.00774
Epoch: 45495, Training Loss: 0.00600
Epoch: 45496, Training Loss: 0.00677
Epoch: 45496, Training Loss: 0.00634
Epoch: 45496, Training Loss: 0.00774
Epoch: 45496, Training Loss: 0.00600
Epoch: 45497, Training Loss: 0.00677
Epoch: 45497, Training Loss: 0.00634
Epoch: 45497, Training Loss: 0.00774
Epoch: 45497, Training Loss: 0.00600
Epoch: 45498, Training Loss: 0.00677
Epoch: 45498, Training Loss: 0.00634
Epoch: 45498, Training Loss: 0.00774
Epoch: 45498, Training Loss: 0.00600
Epoch: 45499, Training Loss: 0.00677
Epoch: 45499, Training Loss: 0.00634
Epoch: 45499, Training Loss: 0.00774
Epoch: 45499, Training Loss: 0.00600
Epoch: 45500, Training Loss: 0.00677
Epoch: 45500, Training Loss: 0.00634
Epoch: 45500, Training Loss: 0.00774
Epoch: 45500, Training Loss: 0.00600
Epoch: 45501, Training Loss: 0.00677
Epoch: 45501, Training Loss: 0.00634
Epoch: 45501, Training Loss: 0.00774
Epoch: 45501, Training Loss: 0.00600
Epoch: 45502, Training Loss: 0.00677
Epoch: 45502, Training Loss: 0.00634
Epoch: 45502, Training Loss: 0.00774
Epoch: 45502, Training Loss: 0.00600
Epoch: 45503, Training Loss: 0.00677
Epoch: 45503, Training Loss: 0.00634
Epoch: 45503, Training Loss: 0.00774
Epoch: 45503, Training Loss: 0.00600
Epoch: 45504, Training Loss: 0.00677
Epoch: 45504, Training Loss: 0.00634
Epoch: 45504, Training Loss: 0.00774
Epoch: 45504, Training Loss: 0.00600
Epoch: 45505, Training Loss: 0.00677
Epoch: 45505, Training Loss: 0.00634
Epoch: 45505, Training Loss: 0.00774
Epoch: 45505, Training Loss: 0.00600
Epoch: 45506, Training Loss: 0.00677
Epoch: 45506, Training Loss: 0.00634
Epoch: 45506, Training Loss: 0.00774
Epoch: 45506, Training Loss: 0.00600
Epoch: 45507, Training Loss: 0.00677
Epoch: 45507, Training Loss: 0.00634
Epoch: 45507, Training Loss: 0.00774
Epoch: 45507, Training Loss: 0.00600
Epoch: 45508, Training Loss: 0.00677
Epoch: 45508, Training Loss: 0.00634
Epoch: 45508, Training Loss: 0.00774
Epoch: 45508, Training Loss: 0.00600
Epoch: 45509, Training Loss: 0.00677
Epoch: 45509, Training Loss: 0.00634
Epoch: 45509, Training Loss: 0.00774
Epoch: 45509, Training Loss: 0.00600
Epoch: 45510, Training Loss: 0.00677
Epoch: 45510, Training Loss: 0.00634
Epoch: 45510, Training Loss: 0.00774
Epoch: 45510, Training Loss: 0.00600
Epoch: 45511, Training Loss: 0.00677
Epoch: 45511, Training Loss: 0.00634
Epoch: 45511, Training Loss: 0.00774
Epoch: 45511, Training Loss: 0.00600
Epoch: 45512, Training Loss: 0.00677
Epoch: 45512, Training Loss: 0.00634
Epoch: 45512, Training Loss: 0.00774
Epoch: 45512, Training Loss: 0.00600
Epoch: 45513, Training Loss: 0.00677
Epoch: 45513, Training Loss: 0.00634
Epoch: 45513, Training Loss: 0.00774
Epoch: 45513, Training Loss: 0.00600
Epoch: 45514, Training Loss: 0.00677
Epoch: 45514, Training Loss: 0.00634
Epoch: 45514, Training Loss: 0.00774
Epoch: 45514, Training Loss: 0.00600
Epoch: 45515, Training Loss: 0.00677
Epoch: 45515, Training Loss: 0.00634
Epoch: 45515, Training Loss: 0.00774
Epoch: 45515, Training Loss: 0.00600
Epoch: 45516, Training Loss: 0.00677
Epoch: 45516, Training Loss: 0.00634
Epoch: 45516, Training Loss: 0.00774
Epoch: 45516, Training Loss: 0.00600
Epoch: 45517, Training Loss: 0.00677
Epoch: 45517, Training Loss: 0.00634
Epoch: 45517, Training Loss: 0.00774
Epoch: 45517, Training Loss: 0.00600
Epoch: 45518, Training Loss: 0.00677
Epoch: 45518, Training Loss: 0.00634
Epoch: 45518, Training Loss: 0.00774
Epoch: 45518, Training Loss: 0.00600
Epoch: 45519, Training Loss: 0.00677
Epoch: 45519, Training Loss: 0.00634
Epoch: 45519, Training Loss: 0.00774
Epoch: 45519, Training Loss: 0.00600
Epoch: 45520, Training Loss: 0.00677
Epoch: 45520, Training Loss: 0.00634
Epoch: 45520, Training Loss: 0.00774
Epoch: 45520, Training Loss: 0.00600
Epoch: 45521, Training Loss: 0.00677
Epoch: 45521, Training Loss: 0.00634
Epoch: 45521, Training Loss: 0.00774
Epoch: 45521, Training Loss: 0.00600
Epoch: 45522, Training Loss: 0.00677
Epoch: 45522, Training Loss: 0.00634
Epoch: 45522, Training Loss: 0.00773
Epoch: 45522, Training Loss: 0.00600
Epoch: 45523, Training Loss: 0.00677
Epoch: 45523, Training Loss: 0.00634
Epoch: 45523, Training Loss: 0.00773
Epoch: 45523, Training Loss: 0.00600
Epoch: 45524, Training Loss: 0.00677
Epoch: 45524, Training Loss: 0.00634
Epoch: 45524, Training Loss: 0.00773
Epoch: 45524, Training Loss: 0.00600
Epoch: 45525, Training Loss: 0.00677
Epoch: 45525, Training Loss: 0.00634
Epoch: 45525, Training Loss: 0.00773
Epoch: 45525, Training Loss: 0.00600
Epoch: 45526, Training Loss: 0.00677
Epoch: 45526, Training Loss: 0.00634
Epoch: 45526, Training Loss: 0.00773
Epoch: 45526, Training Loss: 0.00600
Epoch: 45527, Training Loss: 0.00677
Epoch: 45527, Training Loss: 0.00634
Epoch: 45527, Training Loss: 0.00773
Epoch: 45527, Training Loss: 0.00600
Epoch: 45528, Training Loss: 0.00677
Epoch: 45528, Training Loss: 0.00634
Epoch: 45528, Training Loss: 0.00773
Epoch: 45528, Training Loss: 0.00600
Epoch: 45529, Training Loss: 0.00677
Epoch: 45529, Training Loss: 0.00634
Epoch: 45529, Training Loss: 0.00773
Epoch: 45529, Training Loss: 0.00600
Epoch: 45530, Training Loss: 0.00677
Epoch: 45530, Training Loss: 0.00634
Epoch: 45530, Training Loss: 0.00773
Epoch: 45530, Training Loss: 0.00600
Epoch: 45531, Training Loss: 0.00677
Epoch: 45531, Training Loss: 0.00634
Epoch: 45531, Training Loss: 0.00773
Epoch: 45531, Training Loss: 0.00600
Epoch: 45532, Training Loss: 0.00677
Epoch: 45532, Training Loss: 0.00634
Epoch: 45532, Training Loss: 0.00773
Epoch: 45532, Training Loss: 0.00600
Epoch: 45533, Training Loss: 0.00677
Epoch: 45533, Training Loss: 0.00634
Epoch: 45533, Training Loss: 0.00773
Epoch: 45533, Training Loss: 0.00600
Epoch: 45534, Training Loss: 0.00677
Epoch: 45534, Training Loss: 0.00634
Epoch: 45534, Training Loss: 0.00773
Epoch: 45534, Training Loss: 0.00600
Epoch: 45535, Training Loss: 0.00677
Epoch: 45535, Training Loss: 0.00634
Epoch: 45535, Training Loss: 0.00773
Epoch: 45535, Training Loss: 0.00600
Epoch: 45536, Training Loss: 0.00677
Epoch: 45536, Training Loss: 0.00634
Epoch: 45536, Training Loss: 0.00773
Epoch: 45536, Training Loss: 0.00600
Epoch: 45537, Training Loss: 0.00677
Epoch: 45537, Training Loss: 0.00634
Epoch: 45537, Training Loss: 0.00773
Epoch: 45537, Training Loss: 0.00600
Epoch: 45538, Training Loss: 0.00677
Epoch: 45538, Training Loss: 0.00634
Epoch: 45538, Training Loss: 0.00773
Epoch: 45538, Training Loss: 0.00600
Epoch: 45539, Training Loss: 0.00677
Epoch: 45539, Training Loss: 0.00634
Epoch: 45539, Training Loss: 0.00773
Epoch: 45539, Training Loss: 0.00600
Epoch: 45540, Training Loss: 0.00677
Epoch: 45540, Training Loss: 0.00634
Epoch: 45540, Training Loss: 0.00773
Epoch: 45540, Training Loss: 0.00600
Epoch: 45541, Training Loss: 0.00677
Epoch: 45541, Training Loss: 0.00634
Epoch: 45541, Training Loss: 0.00773
Epoch: 45541, Training Loss: 0.00600
Epoch: 45542, Training Loss: 0.00677
Epoch: 45542, Training Loss: 0.00634
Epoch: 45542, Training Loss: 0.00773
Epoch: 45542, Training Loss: 0.00600
Epoch: 45543, Training Loss: 0.00677
Epoch: 45543, Training Loss: 0.00634
Epoch: 45543, Training Loss: 0.00773
Epoch: 45543, Training Loss: 0.00600
Epoch: 45544, Training Loss: 0.00677
Epoch: 45544, Training Loss: 0.00634
Epoch: 45544, Training Loss: 0.00773
Epoch: 45544, Training Loss: 0.00600
Epoch: 45545, Training Loss: 0.00677
Epoch: 45545, Training Loss: 0.00634
Epoch: 45545, Training Loss: 0.00773
Epoch: 45545, Training Loss: 0.00600
Epoch: 45546, Training Loss: 0.00677
Epoch: 45546, Training Loss: 0.00634
Epoch: 45546, Training Loss: 0.00773
Epoch: 45546, Training Loss: 0.00600
Epoch: 45547, Training Loss: 0.00677
Epoch: 45547, Training Loss: 0.00634
Epoch: 45547, Training Loss: 0.00773
Epoch: 45547, Training Loss: 0.00600
Epoch: 45548, Training Loss: 0.00677
Epoch: 45548, Training Loss: 0.00634
Epoch: 45548, Training Loss: 0.00773
Epoch: 45548, Training Loss: 0.00600
Epoch: 45549, Training Loss: 0.00677
Epoch: 45549, Training Loss: 0.00634
Epoch: 45549, Training Loss: 0.00773
Epoch: 45549, Training Loss: 0.00600
Epoch: 45550, Training Loss: 0.00677
Epoch: 45550, Training Loss: 0.00634
Epoch: 45550, Training Loss: 0.00773
Epoch: 45550, Training Loss: 0.00600
Epoch: 45551, Training Loss: 0.00677
Epoch: 45551, Training Loss: 0.00634
Epoch: 45551, Training Loss: 0.00773
Epoch: 45551, Training Loss: 0.00600
Epoch: 45552, Training Loss: 0.00677
Epoch: 45552, Training Loss: 0.00634
Epoch: 45552, Training Loss: 0.00773
Epoch: 45552, Training Loss: 0.00600
Epoch: 45553, Training Loss: 0.00677
Epoch: 45553, Training Loss: 0.00634
Epoch: 45553, Training Loss: 0.00773
Epoch: 45553, Training Loss: 0.00600
Epoch: 45554, Training Loss: 0.00677
Epoch: 45554, Training Loss: 0.00634
Epoch: 45554, Training Loss: 0.00773
Epoch: 45554, Training Loss: 0.00600
Epoch: 45555, Training Loss: 0.00677
Epoch: 45555, Training Loss: 0.00634
Epoch: 45555, Training Loss: 0.00773
Epoch: 45555, Training Loss: 0.00600
Epoch: 45556, Training Loss: 0.00677
Epoch: 45556, Training Loss: 0.00634
Epoch: 45556, Training Loss: 0.00773
Epoch: 45556, Training Loss: 0.00600
Epoch: 45557, Training Loss: 0.00677
Epoch: 45557, Training Loss: 0.00634
Epoch: 45557, Training Loss: 0.00773
Epoch: 45557, Training Loss: 0.00600
Epoch: 45558, Training Loss: 0.00677
Epoch: 45558, Training Loss: 0.00634
Epoch: 45558, Training Loss: 0.00773
Epoch: 45558, Training Loss: 0.00600
Epoch: 45559, Training Loss: 0.00677
Epoch: 45559, Training Loss: 0.00634
Epoch: 45559, Training Loss: 0.00773
Epoch: 45559, Training Loss: 0.00600
Epoch: 45560, Training Loss: 0.00676
Epoch: 45560, Training Loss: 0.00634
Epoch: 45560, Training Loss: 0.00773
Epoch: 45560, Training Loss: 0.00600
Epoch: 45561, Training Loss: 0.00676
Epoch: 45561, Training Loss: 0.00634
Epoch: 45561, Training Loss: 0.00773
Epoch: 45561, Training Loss: 0.00600
Epoch: 45562, Training Loss: 0.00676
Epoch: 45562, Training Loss: 0.00634
Epoch: 45562, Training Loss: 0.00773
Epoch: 45562, Training Loss: 0.00600
Epoch: 45563, Training Loss: 0.00676
Epoch: 45563, Training Loss: 0.00634
Epoch: 45563, Training Loss: 0.00773
Epoch: 45563, Training Loss: 0.00600
Epoch: 45564, Training Loss: 0.00676
Epoch: 45564, Training Loss: 0.00634
Epoch: 45564, Training Loss: 0.00773
Epoch: 45564, Training Loss: 0.00600
Epoch: 45565, Training Loss: 0.00676
Epoch: 45565, Training Loss: 0.00634
Epoch: 45565, Training Loss: 0.00773
Epoch: 45565, Training Loss: 0.00600
Epoch: 45566, Training Loss: 0.00676
Epoch: 45566, Training Loss: 0.00634
Epoch: 45566, Training Loss: 0.00773
Epoch: 45566, Training Loss: 0.00600
Epoch: 45567, Training Loss: 0.00676
Epoch: 45567, Training Loss: 0.00634
Epoch: 45567, Training Loss: 0.00773
Epoch: 45567, Training Loss: 0.00600
Epoch: 45568, Training Loss: 0.00676
Epoch: 45568, Training Loss: 0.00634
Epoch: 45568, Training Loss: 0.00773
Epoch: 45568, Training Loss: 0.00600
Epoch: 45569, Training Loss: 0.00676
Epoch: 45569, Training Loss: 0.00634
Epoch: 45569, Training Loss: 0.00773
Epoch: 45569, Training Loss: 0.00600
Epoch: 45570, Training Loss: 0.00676
Epoch: 45570, Training Loss: 0.00634
Epoch: 45570, Training Loss: 0.00773
Epoch: 45570, Training Loss: 0.00600
Epoch: 45571, Training Loss: 0.00676
Epoch: 45571, Training Loss: 0.00634
Epoch: 45571, Training Loss: 0.00773
Epoch: 45571, Training Loss: 0.00600
Epoch: 45572, Training Loss: 0.00676
Epoch: 45572, Training Loss: 0.00634
Epoch: 45572, Training Loss: 0.00773
Epoch: 45572, Training Loss: 0.00600
Epoch: 45573, Training Loss: 0.00676
Epoch: 45573, Training Loss: 0.00634
Epoch: 45573, Training Loss: 0.00773
Epoch: 45573, Training Loss: 0.00600
Epoch: 45574, Training Loss: 0.00676
Epoch: 45574, Training Loss: 0.00634
Epoch: 45574, Training Loss: 0.00773
Epoch: 45574, Training Loss: 0.00600
Epoch: 45575, Training Loss: 0.00676
Epoch: 45575, Training Loss: 0.00634
Epoch: 45575, Training Loss: 0.00773
Epoch: 45575, Training Loss: 0.00600
Epoch: 45576, Training Loss: 0.00676
Epoch: 45576, Training Loss: 0.00634
Epoch: 45576, Training Loss: 0.00773
Epoch: 45576, Training Loss: 0.00600
Epoch: 45577, Training Loss: 0.00676
Epoch: 45577, Training Loss: 0.00634
Epoch: 45577, Training Loss: 0.00773
Epoch: 45577, Training Loss: 0.00600
Epoch: 45578, Training Loss: 0.00676
Epoch: 45578, Training Loss: 0.00634
Epoch: 45578, Training Loss: 0.00773
Epoch: 45578, Training Loss: 0.00600
Epoch: 45579, Training Loss: 0.00676
Epoch: 45579, Training Loss: 0.00634
Epoch: 45579, Training Loss: 0.00773
Epoch: 45579, Training Loss: 0.00600
Epoch: 45580, Training Loss: 0.00676
Epoch: 45580, Training Loss: 0.00634
Epoch: 45580, Training Loss: 0.00773
Epoch: 45580, Training Loss: 0.00600
Epoch: 45581, Training Loss: 0.00676
Epoch: 45581, Training Loss: 0.00634
Epoch: 45581, Training Loss: 0.00773
Epoch: 45581, Training Loss: 0.00600
Epoch: 45582, Training Loss: 0.00676
Epoch: 45582, Training Loss: 0.00634
Epoch: 45582, Training Loss: 0.00773
Epoch: 45582, Training Loss: 0.00600
Epoch: 45583, Training Loss: 0.00676
Epoch: 45583, Training Loss: 0.00634
Epoch: 45583, Training Loss: 0.00773
Epoch: 45583, Training Loss: 0.00600
Epoch: 45584, Training Loss: 0.00676
Epoch: 45584, Training Loss: 0.00634
Epoch: 45584, Training Loss: 0.00773
Epoch: 45584, Training Loss: 0.00600
Epoch: 45585, Training Loss: 0.00676
Epoch: 45585, Training Loss: 0.00634
Epoch: 45585, Training Loss: 0.00773
Epoch: 45585, Training Loss: 0.00600
Epoch: 45586, Training Loss: 0.00676
Epoch: 45586, Training Loss: 0.00634
Epoch: 45586, Training Loss: 0.00773
Epoch: 45586, Training Loss: 0.00600
Epoch: 45587, Training Loss: 0.00676
Epoch: 45587, Training Loss: 0.00634
Epoch: 45587, Training Loss: 0.00773
Epoch: 45587, Training Loss: 0.00600
Epoch: 45588, Training Loss: 0.00676
Epoch: 45588, Training Loss: 0.00634
Epoch: 45588, Training Loss: 0.00773
Epoch: 45588, Training Loss: 0.00600
Epoch: 45589, Training Loss: 0.00676
Epoch: 45589, Training Loss: 0.00634
Epoch: 45589, Training Loss: 0.00773
Epoch: 45589, Training Loss: 0.00600
Epoch: 45590, Training Loss: 0.00676
Epoch: 45590, Training Loss: 0.00634
Epoch: 45590, Training Loss: 0.00773
Epoch: 45590, Training Loss: 0.00600
Epoch: 45591, Training Loss: 0.00676
Epoch: 45591, Training Loss: 0.00634
Epoch: 45591, Training Loss: 0.00773
Epoch: 45591, Training Loss: 0.00600
Epoch: 45592, Training Loss: 0.00676
Epoch: 45592, Training Loss: 0.00634
Epoch: 45592, Training Loss: 0.00773
Epoch: 45592, Training Loss: 0.00600
Epoch: 45593, Training Loss: 0.00676
Epoch: 45593, Training Loss: 0.00634
Epoch: 45593, Training Loss: 0.00773
Epoch: 45593, Training Loss: 0.00600
Epoch: 45594, Training Loss: 0.00676
Epoch: 45594, Training Loss: 0.00633
Epoch: 45594, Training Loss: 0.00773
Epoch: 45594, Training Loss: 0.00600
Epoch: 45595, Training Loss: 0.00676
Epoch: 45595, Training Loss: 0.00633
Epoch: 45595, Training Loss: 0.00773
Epoch: 45595, Training Loss: 0.00600
Epoch: 45596, Training Loss: 0.00676
Epoch: 45596, Training Loss: 0.00633
Epoch: 45596, Training Loss: 0.00773
Epoch: 45596, Training Loss: 0.00600
Epoch: 45597, Training Loss: 0.00676
Epoch: 45597, Training Loss: 0.00633
Epoch: 45597, Training Loss: 0.00773
Epoch: 45597, Training Loss: 0.00600
Epoch: 45598, Training Loss: 0.00676
Epoch: 45598, Training Loss: 0.00633
Epoch: 45598, Training Loss: 0.00773
Epoch: 45598, Training Loss: 0.00600
Epoch: 45599, Training Loss: 0.00676
Epoch: 45599, Training Loss: 0.00633
Epoch: 45599, Training Loss: 0.00773
Epoch: 45599, Training Loss: 0.00600
Epoch: 45600, Training Loss: 0.00676
Epoch: 45600, Training Loss: 0.00633
Epoch: 45600, Training Loss: 0.00773
Epoch: 45600, Training Loss: 0.00600
Epoch: 45601, Training Loss: 0.00676
Epoch: 45601, Training Loss: 0.00633
Epoch: 45601, Training Loss: 0.00773
Epoch: 45601, Training Loss: 0.00600
Epoch: 45602, Training Loss: 0.00676
Epoch: 45602, Training Loss: 0.00633
Epoch: 45602, Training Loss: 0.00773
Epoch: 45602, Training Loss: 0.00600
Epoch: 45603, Training Loss: 0.00676
Epoch: 45603, Training Loss: 0.00633
Epoch: 45603, Training Loss: 0.00773
Epoch: 45603, Training Loss: 0.00600
Epoch: 45604, Training Loss: 0.00676
Epoch: 45604, Training Loss: 0.00633
Epoch: 45604, Training Loss: 0.00773
Epoch: 45604, Training Loss: 0.00600
Epoch: 45605, Training Loss: 0.00676
Epoch: 45605, Training Loss: 0.00633
Epoch: 45605, Training Loss: 0.00773
Epoch: 45605, Training Loss: 0.00600
Epoch: 45606, Training Loss: 0.00676
Epoch: 45606, Training Loss: 0.00633
Epoch: 45606, Training Loss: 0.00773
Epoch: 45606, Training Loss: 0.00600
Epoch: 45607, Training Loss: 0.00676
Epoch: 45607, Training Loss: 0.00633
Epoch: 45607, Training Loss: 0.00773
Epoch: 45607, Training Loss: 0.00600
Epoch: 45608, Training Loss: 0.00676
Epoch: 45608, Training Loss: 0.00633
Epoch: 45608, Training Loss: 0.00773
Epoch: 45608, Training Loss: 0.00600
Epoch: 45609, Training Loss: 0.00676
Epoch: 45609, Training Loss: 0.00633
Epoch: 45609, Training Loss: 0.00773
Epoch: 45609, Training Loss: 0.00600
Epoch: 45610, Training Loss: 0.00676
Epoch: 45610, Training Loss: 0.00633
Epoch: 45610, Training Loss: 0.00773
Epoch: 45610, Training Loss: 0.00600
Epoch: 45611, Training Loss: 0.00676
Epoch: 45611, Training Loss: 0.00633
Epoch: 45611, Training Loss: 0.00773
Epoch: 45611, Training Loss: 0.00600
Epoch: 45612, Training Loss: 0.00676
Epoch: 45612, Training Loss: 0.00633
Epoch: 45612, Training Loss: 0.00773
Epoch: 45612, Training Loss: 0.00600
Epoch: 45613, Training Loss: 0.00676
Epoch: 45613, Training Loss: 0.00633
Epoch: 45613, Training Loss: 0.00773
Epoch: 45613, Training Loss: 0.00600
Epoch: 45614, Training Loss: 0.00676
Epoch: 45614, Training Loss: 0.00633
Epoch: 45614, Training Loss: 0.00773
Epoch: 45614, Training Loss: 0.00600
Epoch: 45615, Training Loss: 0.00676
Epoch: 45615, Training Loss: 0.00633
Epoch: 45615, Training Loss: 0.00773
Epoch: 45615, Training Loss: 0.00600
Epoch: 45616, Training Loss: 0.00676
Epoch: 45616, Training Loss: 0.00633
Epoch: 45616, Training Loss: 0.00773
Epoch: 45616, Training Loss: 0.00600
Epoch: 45617, Training Loss: 0.00676
Epoch: 45617, Training Loss: 0.00633
Epoch: 45617, Training Loss: 0.00773
Epoch: 45617, Training Loss: 0.00600
Epoch: 45618, Training Loss: 0.00676
Epoch: 45618, Training Loss: 0.00633
Epoch: 45618, Training Loss: 0.00773
Epoch: 45618, Training Loss: 0.00600
Epoch: 45619, Training Loss: 0.00676
Epoch: 45619, Training Loss: 0.00633
Epoch: 45619, Training Loss: 0.00773
Epoch: 45619, Training Loss: 0.00600
Epoch: 45620, Training Loss: 0.00676
Epoch: 45620, Training Loss: 0.00633
Epoch: 45620, Training Loss: 0.00773
Epoch: 45620, Training Loss: 0.00600
Epoch: 45621, Training Loss: 0.00676
Epoch: 45621, Training Loss: 0.00633
Epoch: 45621, Training Loss: 0.00773
Epoch: 45621, Training Loss: 0.00600
Epoch: 45622, Training Loss: 0.00676
Epoch: 45622, Training Loss: 0.00633
Epoch: 45622, Training Loss: 0.00773
Epoch: 45622, Training Loss: 0.00600
Epoch: 45623, Training Loss: 0.00676
Epoch: 45623, Training Loss: 0.00633
Epoch: 45623, Training Loss: 0.00773
Epoch: 45623, Training Loss: 0.00600
Epoch: 45624, Training Loss: 0.00676
Epoch: 45624, Training Loss: 0.00633
Epoch: 45624, Training Loss: 0.00773
Epoch: 45624, Training Loss: 0.00600
Epoch: 45625, Training Loss: 0.00676
Epoch: 45625, Training Loss: 0.00633
Epoch: 45625, Training Loss: 0.00773
Epoch: 45625, Training Loss: 0.00600
Epoch: 45626, Training Loss: 0.00676
Epoch: 45626, Training Loss: 0.00633
Epoch: 45626, Training Loss: 0.00773
Epoch: 45626, Training Loss: 0.00600
Epoch: 45627, Training Loss: 0.00676
Epoch: 45627, Training Loss: 0.00633
Epoch: 45627, Training Loss: 0.00773
Epoch: 45627, Training Loss: 0.00600
Epoch: 45628, Training Loss: 0.00676
Epoch: 45628, Training Loss: 0.00633
Epoch: 45628, Training Loss: 0.00773
Epoch: 45628, Training Loss: 0.00600
Epoch: 45629, Training Loss: 0.00676
Epoch: 45629, Training Loss: 0.00633
Epoch: 45629, Training Loss: 0.00773
Epoch: 45629, Training Loss: 0.00600
Epoch: 45630, Training Loss: 0.00676
Epoch: 45630, Training Loss: 0.00633
Epoch: 45630, Training Loss: 0.00773
Epoch: 45630, Training Loss: 0.00600
Epoch: 45631, Training Loss: 0.00676
Epoch: 45631, Training Loss: 0.00633
Epoch: 45631, Training Loss: 0.00773
Epoch: 45631, Training Loss: 0.00599
Epoch: 45632, Training Loss: 0.00676
Epoch: 45632, Training Loss: 0.00633
Epoch: 45632, Training Loss: 0.00773
Epoch: 45632, Training Loss: 0.00599
Epoch: 45633, Training Loss: 0.00676
Epoch: 45633, Training Loss: 0.00633
Epoch: 45633, Training Loss: 0.00773
Epoch: 45633, Training Loss: 0.00599
Epoch: 45634, Training Loss: 0.00676
Epoch: 45634, Training Loss: 0.00633
Epoch: 45634, Training Loss: 0.00773
Epoch: 45634, Training Loss: 0.00599
Epoch: 45635, Training Loss: 0.00676
Epoch: 45635, Training Loss: 0.00633
Epoch: 45635, Training Loss: 0.00772
Epoch: 45635, Training Loss: 0.00599
Epoch: 45636, Training Loss: 0.00676
Epoch: 45636, Training Loss: 0.00633
Epoch: 45636, Training Loss: 0.00772
Epoch: 45636, Training Loss: 0.00599
Epoch: 45637, Training Loss: 0.00676
Epoch: 45637, Training Loss: 0.00633
Epoch: 45637, Training Loss: 0.00772
Epoch: 45637, Training Loss: 0.00599
Epoch: 45638, Training Loss: 0.00676
Epoch: 45638, Training Loss: 0.00633
Epoch: 45638, Training Loss: 0.00772
Epoch: 45638, Training Loss: 0.00599
Epoch: 45639, Training Loss: 0.00676
Epoch: 45639, Training Loss: 0.00633
Epoch: 45639, Training Loss: 0.00772
Epoch: 45639, Training Loss: 0.00599
Epoch: 45640, Training Loss: 0.00676
Epoch: 45640, Training Loss: 0.00633
Epoch: 45640, Training Loss: 0.00772
Epoch: 45640, Training Loss: 0.00599
Epoch: 45641, Training Loss: 0.00676
Epoch: 45641, Training Loss: 0.00633
Epoch: 45641, Training Loss: 0.00772
Epoch: 45641, Training Loss: 0.00599
Epoch: 45642, Training Loss: 0.00676
Epoch: 45642, Training Loss: 0.00633
Epoch: 45642, Training Loss: 0.00772
Epoch: 45642, Training Loss: 0.00599
Epoch: 45643, Training Loss: 0.00676
Epoch: 45643, Training Loss: 0.00633
Epoch: 45643, Training Loss: 0.00772
Epoch: 45643, Training Loss: 0.00599
Epoch: 45644, Training Loss: 0.00676
Epoch: 45644, Training Loss: 0.00633
Epoch: 45644, Training Loss: 0.00772
Epoch: 45644, Training Loss: 0.00599
Epoch: 45645, Training Loss: 0.00676
Epoch: 45645, Training Loss: 0.00633
Epoch: 45645, Training Loss: 0.00772
Epoch: 45645, Training Loss: 0.00599
Epoch: 45646, Training Loss: 0.00676
Epoch: 45646, Training Loss: 0.00633
Epoch: 45646, Training Loss: 0.00772
Epoch: 45646, Training Loss: 0.00599
Epoch: 45647, Training Loss: 0.00676
Epoch: 45647, Training Loss: 0.00633
Epoch: 45647, Training Loss: 0.00772
Epoch: 45647, Training Loss: 0.00599
Epoch: 45648, Training Loss: 0.00676
Epoch: 45648, Training Loss: 0.00633
Epoch: 45648, Training Loss: 0.00772
Epoch: 45648, Training Loss: 0.00599
Epoch: 45649, Training Loss: 0.00676
Epoch: 45649, Training Loss: 0.00633
Epoch: 45649, Training Loss: 0.00772
Epoch: 45649, Training Loss: 0.00599
Epoch: 45650, Training Loss: 0.00676
Epoch: 45650, Training Loss: 0.00633
Epoch: 45650, Training Loss: 0.00772
Epoch: 45650, Training Loss: 0.00599
Epoch: 45651, Training Loss: 0.00676
Epoch: 45651, Training Loss: 0.00633
Epoch: 45651, Training Loss: 0.00772
Epoch: 45651, Training Loss: 0.00599
Epoch: 45652, Training Loss: 0.00676
Epoch: 45652, Training Loss: 0.00633
Epoch: 45652, Training Loss: 0.00772
Epoch: 45652, Training Loss: 0.00599
Epoch: 45653, Training Loss: 0.00676
Epoch: 45653, Training Loss: 0.00633
Epoch: 45653, Training Loss: 0.00772
Epoch: 45653, Training Loss: 0.00599
Epoch: 45654, Training Loss: 0.00676
Epoch: 45654, Training Loss: 0.00633
Epoch: 45654, Training Loss: 0.00772
Epoch: 45654, Training Loss: 0.00599
Epoch: 45655, Training Loss: 0.00676
Epoch: 45655, Training Loss: 0.00633
Epoch: 45655, Training Loss: 0.00772
Epoch: 45655, Training Loss: 0.00599
Epoch: 45656, Training Loss: 0.00676
Epoch: 45656, Training Loss: 0.00633
Epoch: 45656, Training Loss: 0.00772
Epoch: 45656, Training Loss: 0.00599
Epoch: 45657, Training Loss: 0.00676
Epoch: 45657, Training Loss: 0.00633
Epoch: 45657, Training Loss: 0.00772
Epoch: 45657, Training Loss: 0.00599
Epoch: 45658, Training Loss: 0.00676
Epoch: 45658, Training Loss: 0.00633
Epoch: 45658, Training Loss: 0.00772
Epoch: 45658, Training Loss: 0.00599
Epoch: 45659, Training Loss: 0.00676
Epoch: 45659, Training Loss: 0.00633
Epoch: 45659, Training Loss: 0.00772
Epoch: 45659, Training Loss: 0.00599
Epoch: 45660, Training Loss: 0.00676
Epoch: 45660, Training Loss: 0.00633
Epoch: 45660, Training Loss: 0.00772
Epoch: 45660, Training Loss: 0.00599
Epoch: 45661, Training Loss: 0.00676
Epoch: 45661, Training Loss: 0.00633
Epoch: 45661, Training Loss: 0.00772
Epoch: 45661, Training Loss: 0.00599
Epoch: 45662, Training Loss: 0.00676
Epoch: 45662, Training Loss: 0.00633
Epoch: 45662, Training Loss: 0.00772
Epoch: 45662, Training Loss: 0.00599
Epoch: 45663, Training Loss: 0.00676
Epoch: 45663, Training Loss: 0.00633
Epoch: 45663, Training Loss: 0.00772
Epoch: 45663, Training Loss: 0.00599
Epoch: 45664, Training Loss: 0.00676
Epoch: 45664, Training Loss: 0.00633
Epoch: 45664, Training Loss: 0.00772
Epoch: 45664, Training Loss: 0.00599
Epoch: 45665, Training Loss: 0.00676
Epoch: 45665, Training Loss: 0.00633
Epoch: 45665, Training Loss: 0.00772
Epoch: 45665, Training Loss: 0.00599
Epoch: 45666, Training Loss: 0.00676
Epoch: 45666, Training Loss: 0.00633
Epoch: 45666, Training Loss: 0.00772
Epoch: 45666, Training Loss: 0.00599
Epoch: 45667, Training Loss: 0.00676
Epoch: 45667, Training Loss: 0.00633
Epoch: 45667, Training Loss: 0.00772
Epoch: 45667, Training Loss: 0.00599
Epoch: 45668, Training Loss: 0.00676
Epoch: 45668, Training Loss: 0.00633
Epoch: 45668, Training Loss: 0.00772
Epoch: 45668, Training Loss: 0.00599
Epoch: 45669, Training Loss: 0.00676
Epoch: 45669, Training Loss: 0.00633
Epoch: 45669, Training Loss: 0.00772
Epoch: 45669, Training Loss: 0.00599
Epoch: 45670, Training Loss: 0.00676
Epoch: 45670, Training Loss: 0.00633
Epoch: 45670, Training Loss: 0.00772
Epoch: 45670, Training Loss: 0.00599
...
[output truncated: one loss line is printed per XOR pattern per epoch. Over epochs 45669-45912 the four per-pattern losses decrease only in the fifth decimal place, from roughly (0.00676, 0.00633, 0.00772, 0.00599) to (0.00674, 0.00631, 0.00770, 0.00598), showing very slow convergence toward the 0.008 error target.]
Epoch: 45913, Training Loss: 0.00674
Epoch: 45913, Training Loss: 0.00631
Epoch: 45913, Training Loss: 0.00770
Epoch: 45913, Training Loss: 0.00598
Epoch: 45914, Training Loss: 0.00674
Epoch: 45914, Training Loss: 0.00631
Epoch: 45914, Training Loss: 0.00770
Epoch: 45914, Training Loss: 0.00598
Epoch: 45915, Training Loss: 0.00674
Epoch: 45915, Training Loss: 0.00631
Epoch: 45915, Training Loss: 0.00770
Epoch: 45915, Training Loss: 0.00598
Epoch: 45916, Training Loss: 0.00674
Epoch: 45916, Training Loss: 0.00631
Epoch: 45916, Training Loss: 0.00770
Epoch: 45916, Training Loss: 0.00598
Epoch: 45917, Training Loss: 0.00674
Epoch: 45917, Training Loss: 0.00631
Epoch: 45917, Training Loss: 0.00770
Epoch: 45917, Training Loss: 0.00598
Epoch: 45918, Training Loss: 0.00674
Epoch: 45918, Training Loss: 0.00631
Epoch: 45918, Training Loss: 0.00770
Epoch: 45918, Training Loss: 0.00598
Epoch: 45919, Training Loss: 0.00674
Epoch: 45919, Training Loss: 0.00631
Epoch: 45919, Training Loss: 0.00770
Epoch: 45919, Training Loss: 0.00598
Epoch: 45920, Training Loss: 0.00674
Epoch: 45920, Training Loss: 0.00631
Epoch: 45920, Training Loss: 0.00770
Epoch: 45920, Training Loss: 0.00598
Epoch: 45921, Training Loss: 0.00674
Epoch: 45921, Training Loss: 0.00631
Epoch: 45921, Training Loss: 0.00770
Epoch: 45921, Training Loss: 0.00598
Epoch: 45922, Training Loss: 0.00674
Epoch: 45922, Training Loss: 0.00631
Epoch: 45922, Training Loss: 0.00770
Epoch: 45922, Training Loss: 0.00598
Epoch: 45923, Training Loss: 0.00674
Epoch: 45923, Training Loss: 0.00631
Epoch: 45923, Training Loss: 0.00770
Epoch: 45923, Training Loss: 0.00598
Epoch: 45924, Training Loss: 0.00674
Epoch: 45924, Training Loss: 0.00631
Epoch: 45924, Training Loss: 0.00770
Epoch: 45924, Training Loss: 0.00598
Epoch: 45925, Training Loss: 0.00674
Epoch: 45925, Training Loss: 0.00631
Epoch: 45925, Training Loss: 0.00770
Epoch: 45925, Training Loss: 0.00598
Epoch: 45926, Training Loss: 0.00674
Epoch: 45926, Training Loss: 0.00631
Epoch: 45926, Training Loss: 0.00770
Epoch: 45926, Training Loss: 0.00597
Epoch: 45927, Training Loss: 0.00674
Epoch: 45927, Training Loss: 0.00631
Epoch: 45927, Training Loss: 0.00770
Epoch: 45927, Training Loss: 0.00597
Epoch: 45928, Training Loss: 0.00674
Epoch: 45928, Training Loss: 0.00631
Epoch: 45928, Training Loss: 0.00770
Epoch: 45928, Training Loss: 0.00597
Epoch: 45929, Training Loss: 0.00674
Epoch: 45929, Training Loss: 0.00631
Epoch: 45929, Training Loss: 0.00770
Epoch: 45929, Training Loss: 0.00597
Epoch: 45930, Training Loss: 0.00674
Epoch: 45930, Training Loss: 0.00631
Epoch: 45930, Training Loss: 0.00770
Epoch: 45930, Training Loss: 0.00597
Epoch: 45931, Training Loss: 0.00674
Epoch: 45931, Training Loss: 0.00631
Epoch: 45931, Training Loss: 0.00770
Epoch: 45931, Training Loss: 0.00597
Epoch: 45932, Training Loss: 0.00674
Epoch: 45932, Training Loss: 0.00631
Epoch: 45932, Training Loss: 0.00770
Epoch: 45932, Training Loss: 0.00597
Epoch: 45933, Training Loss: 0.00674
Epoch: 45933, Training Loss: 0.00631
Epoch: 45933, Training Loss: 0.00770
Epoch: 45933, Training Loss: 0.00597
Epoch: 45934, Training Loss: 0.00674
Epoch: 45934, Training Loss: 0.00631
Epoch: 45934, Training Loss: 0.00770
Epoch: 45934, Training Loss: 0.00597
Epoch: 45935, Training Loss: 0.00674
Epoch: 45935, Training Loss: 0.00631
Epoch: 45935, Training Loss: 0.00770
Epoch: 45935, Training Loss: 0.00597
Epoch: 45936, Training Loss: 0.00674
Epoch: 45936, Training Loss: 0.00631
Epoch: 45936, Training Loss: 0.00770
Epoch: 45936, Training Loss: 0.00597
Epoch: 45937, Training Loss: 0.00674
Epoch: 45937, Training Loss: 0.00631
Epoch: 45937, Training Loss: 0.00770
Epoch: 45937, Training Loss: 0.00597
Epoch: 45938, Training Loss: 0.00674
Epoch: 45938, Training Loss: 0.00631
Epoch: 45938, Training Loss: 0.00770
Epoch: 45938, Training Loss: 0.00597
Epoch: 45939, Training Loss: 0.00674
Epoch: 45939, Training Loss: 0.00631
Epoch: 45939, Training Loss: 0.00770
Epoch: 45939, Training Loss: 0.00597
Epoch: 45940, Training Loss: 0.00674
Epoch: 45940, Training Loss: 0.00631
Epoch: 45940, Training Loss: 0.00770
Epoch: 45940, Training Loss: 0.00597
Epoch: 45941, Training Loss: 0.00674
Epoch: 45941, Training Loss: 0.00631
Epoch: 45941, Training Loss: 0.00770
Epoch: 45941, Training Loss: 0.00597
Epoch: 45942, Training Loss: 0.00674
Epoch: 45942, Training Loss: 0.00631
Epoch: 45942, Training Loss: 0.00770
Epoch: 45942, Training Loss: 0.00597
Epoch: 45943, Training Loss: 0.00674
Epoch: 45943, Training Loss: 0.00631
Epoch: 45943, Training Loss: 0.00770
Epoch: 45943, Training Loss: 0.00597
Epoch: 45944, Training Loss: 0.00674
Epoch: 45944, Training Loss: 0.00631
Epoch: 45944, Training Loss: 0.00770
Epoch: 45944, Training Loss: 0.00597
Epoch: 45945, Training Loss: 0.00674
Epoch: 45945, Training Loss: 0.00631
Epoch: 45945, Training Loss: 0.00770
Epoch: 45945, Training Loss: 0.00597
Epoch: 45946, Training Loss: 0.00674
Epoch: 45946, Training Loss: 0.00631
Epoch: 45946, Training Loss: 0.00770
Epoch: 45946, Training Loss: 0.00597
Epoch: 45947, Training Loss: 0.00673
Epoch: 45947, Training Loss: 0.00631
Epoch: 45947, Training Loss: 0.00770
Epoch: 45947, Training Loss: 0.00597
Epoch: 45948, Training Loss: 0.00673
Epoch: 45948, Training Loss: 0.00631
Epoch: 45948, Training Loss: 0.00770
Epoch: 45948, Training Loss: 0.00597
Epoch: 45949, Training Loss: 0.00673
Epoch: 45949, Training Loss: 0.00631
Epoch: 45949, Training Loss: 0.00770
Epoch: 45949, Training Loss: 0.00597
Epoch: 45950, Training Loss: 0.00673
Epoch: 45950, Training Loss: 0.00631
Epoch: 45950, Training Loss: 0.00770
Epoch: 45950, Training Loss: 0.00597
Epoch: 45951, Training Loss: 0.00673
Epoch: 45951, Training Loss: 0.00631
Epoch: 45951, Training Loss: 0.00770
Epoch: 45951, Training Loss: 0.00597
Epoch: 45952, Training Loss: 0.00673
Epoch: 45952, Training Loss: 0.00631
Epoch: 45952, Training Loss: 0.00770
Epoch: 45952, Training Loss: 0.00597
Epoch: 45953, Training Loss: 0.00673
Epoch: 45953, Training Loss: 0.00631
Epoch: 45953, Training Loss: 0.00770
Epoch: 45953, Training Loss: 0.00597
Epoch: 45954, Training Loss: 0.00673
Epoch: 45954, Training Loss: 0.00631
Epoch: 45954, Training Loss: 0.00770
Epoch: 45954, Training Loss: 0.00597
Epoch: 45955, Training Loss: 0.00673
Epoch: 45955, Training Loss: 0.00631
Epoch: 45955, Training Loss: 0.00770
Epoch: 45955, Training Loss: 0.00597
Epoch: 45956, Training Loss: 0.00673
Epoch: 45956, Training Loss: 0.00631
Epoch: 45956, Training Loss: 0.00770
Epoch: 45956, Training Loss: 0.00597
Epoch: 45957, Training Loss: 0.00673
Epoch: 45957, Training Loss: 0.00631
Epoch: 45957, Training Loss: 0.00770
Epoch: 45957, Training Loss: 0.00597
Epoch: 45958, Training Loss: 0.00673
Epoch: 45958, Training Loss: 0.00631
Epoch: 45958, Training Loss: 0.00770
Epoch: 45958, Training Loss: 0.00597
Epoch: 45959, Training Loss: 0.00673
Epoch: 45959, Training Loss: 0.00631
Epoch: 45959, Training Loss: 0.00770
Epoch: 45959, Training Loss: 0.00597
Epoch: 45960, Training Loss: 0.00673
Epoch: 45960, Training Loss: 0.00631
Epoch: 45960, Training Loss: 0.00770
Epoch: 45960, Training Loss: 0.00597
Epoch: 45961, Training Loss: 0.00673
Epoch: 45961, Training Loss: 0.00631
Epoch: 45961, Training Loss: 0.00770
Epoch: 45961, Training Loss: 0.00597
Epoch: 45962, Training Loss: 0.00673
Epoch: 45962, Training Loss: 0.00631
Epoch: 45962, Training Loss: 0.00770
Epoch: 45962, Training Loss: 0.00597
Epoch: 45963, Training Loss: 0.00673
Epoch: 45963, Training Loss: 0.00631
Epoch: 45963, Training Loss: 0.00770
Epoch: 45963, Training Loss: 0.00597
Epoch: 45964, Training Loss: 0.00673
Epoch: 45964, Training Loss: 0.00631
Epoch: 45964, Training Loss: 0.00770
Epoch: 45964, Training Loss: 0.00597
Epoch: 45965, Training Loss: 0.00673
Epoch: 45965, Training Loss: 0.00631
Epoch: 45965, Training Loss: 0.00770
Epoch: 45965, Training Loss: 0.00597
Epoch: 45966, Training Loss: 0.00673
Epoch: 45966, Training Loss: 0.00631
Epoch: 45966, Training Loss: 0.00770
Epoch: 45966, Training Loss: 0.00597
Epoch: 45967, Training Loss: 0.00673
Epoch: 45967, Training Loss: 0.00631
Epoch: 45967, Training Loss: 0.00770
Epoch: 45967, Training Loss: 0.00597
Epoch: 45968, Training Loss: 0.00673
Epoch: 45968, Training Loss: 0.00631
Epoch: 45968, Training Loss: 0.00770
Epoch: 45968, Training Loss: 0.00597
Epoch: 45969, Training Loss: 0.00673
Epoch: 45969, Training Loss: 0.00631
Epoch: 45969, Training Loss: 0.00770
Epoch: 45969, Training Loss: 0.00597
Epoch: 45970, Training Loss: 0.00673
Epoch: 45970, Training Loss: 0.00631
Epoch: 45970, Training Loss: 0.00770
Epoch: 45970, Training Loss: 0.00597
Epoch: 45971, Training Loss: 0.00673
Epoch: 45971, Training Loss: 0.00631
Epoch: 45971, Training Loss: 0.00770
Epoch: 45971, Training Loss: 0.00597
Epoch: 45972, Training Loss: 0.00673
Epoch: 45972, Training Loss: 0.00631
Epoch: 45972, Training Loss: 0.00770
Epoch: 45972, Training Loss: 0.00597
Epoch: 45973, Training Loss: 0.00673
Epoch: 45973, Training Loss: 0.00631
Epoch: 45973, Training Loss: 0.00770
Epoch: 45973, Training Loss: 0.00597
Epoch: 45974, Training Loss: 0.00673
Epoch: 45974, Training Loss: 0.00631
Epoch: 45974, Training Loss: 0.00770
Epoch: 45974, Training Loss: 0.00597
Epoch: 45975, Training Loss: 0.00673
Epoch: 45975, Training Loss: 0.00631
Epoch: 45975, Training Loss: 0.00770
Epoch: 45975, Training Loss: 0.00597
Epoch: 45976, Training Loss: 0.00673
Epoch: 45976, Training Loss: 0.00631
Epoch: 45976, Training Loss: 0.00769
Epoch: 45976, Training Loss: 0.00597
Epoch: 45977, Training Loss: 0.00673
Epoch: 45977, Training Loss: 0.00631
Epoch: 45977, Training Loss: 0.00769
Epoch: 45977, Training Loss: 0.00597
Epoch: 45978, Training Loss: 0.00673
Epoch: 45978, Training Loss: 0.00631
Epoch: 45978, Training Loss: 0.00769
Epoch: 45978, Training Loss: 0.00597
Epoch: 45979, Training Loss: 0.00673
Epoch: 45979, Training Loss: 0.00631
Epoch: 45979, Training Loss: 0.00769
Epoch: 45979, Training Loss: 0.00597
Epoch: 45980, Training Loss: 0.00673
Epoch: 45980, Training Loss: 0.00631
Epoch: 45980, Training Loss: 0.00769
Epoch: 45980, Training Loss: 0.00597
Epoch: 45981, Training Loss: 0.00673
Epoch: 45981, Training Loss: 0.00631
Epoch: 45981, Training Loss: 0.00769
Epoch: 45981, Training Loss: 0.00597
Epoch: 45982, Training Loss: 0.00673
Epoch: 45982, Training Loss: 0.00631
Epoch: 45982, Training Loss: 0.00769
Epoch: 45982, Training Loss: 0.00597
Epoch: 45983, Training Loss: 0.00673
Epoch: 45983, Training Loss: 0.00631
Epoch: 45983, Training Loss: 0.00769
Epoch: 45983, Training Loss: 0.00597
Epoch: 45984, Training Loss: 0.00673
Epoch: 45984, Training Loss: 0.00631
Epoch: 45984, Training Loss: 0.00769
Epoch: 45984, Training Loss: 0.00597
Epoch: 45985, Training Loss: 0.00673
Epoch: 45985, Training Loss: 0.00631
Epoch: 45985, Training Loss: 0.00769
Epoch: 45985, Training Loss: 0.00597
Epoch: 45986, Training Loss: 0.00673
Epoch: 45986, Training Loss: 0.00631
Epoch: 45986, Training Loss: 0.00769
Epoch: 45986, Training Loss: 0.00597
Epoch: 45987, Training Loss: 0.00673
Epoch: 45987, Training Loss: 0.00631
Epoch: 45987, Training Loss: 0.00769
Epoch: 45987, Training Loss: 0.00597
Epoch: 45988, Training Loss: 0.00673
Epoch: 45988, Training Loss: 0.00631
Epoch: 45988, Training Loss: 0.00769
Epoch: 45988, Training Loss: 0.00597
Epoch: 45989, Training Loss: 0.00673
Epoch: 45989, Training Loss: 0.00631
Epoch: 45989, Training Loss: 0.00769
Epoch: 45989, Training Loss: 0.00597
Epoch: 45990, Training Loss: 0.00673
Epoch: 45990, Training Loss: 0.00631
Epoch: 45990, Training Loss: 0.00769
Epoch: 45990, Training Loss: 0.00597
Epoch: 45991, Training Loss: 0.00673
Epoch: 45991, Training Loss: 0.00631
Epoch: 45991, Training Loss: 0.00769
Epoch: 45991, Training Loss: 0.00597
Epoch: 45992, Training Loss: 0.00673
Epoch: 45992, Training Loss: 0.00631
Epoch: 45992, Training Loss: 0.00769
Epoch: 45992, Training Loss: 0.00597
Epoch: 45993, Training Loss: 0.00673
Epoch: 45993, Training Loss: 0.00631
Epoch: 45993, Training Loss: 0.00769
Epoch: 45993, Training Loss: 0.00597
Epoch: 45994, Training Loss: 0.00673
Epoch: 45994, Training Loss: 0.00631
Epoch: 45994, Training Loss: 0.00769
Epoch: 45994, Training Loss: 0.00597
Epoch: 45995, Training Loss: 0.00673
Epoch: 45995, Training Loss: 0.00631
Epoch: 45995, Training Loss: 0.00769
Epoch: 45995, Training Loss: 0.00597
Epoch: 45996, Training Loss: 0.00673
Epoch: 45996, Training Loss: 0.00631
Epoch: 45996, Training Loss: 0.00769
Epoch: 45996, Training Loss: 0.00597
Epoch: 45997, Training Loss: 0.00673
Epoch: 45997, Training Loss: 0.00631
Epoch: 45997, Training Loss: 0.00769
Epoch: 45997, Training Loss: 0.00597
Epoch: 45998, Training Loss: 0.00673
Epoch: 45998, Training Loss: 0.00631
Epoch: 45998, Training Loss: 0.00769
Epoch: 45998, Training Loss: 0.00597
Epoch: 45999, Training Loss: 0.00673
Epoch: 45999, Training Loss: 0.00631
Epoch: 45999, Training Loss: 0.00769
Epoch: 45999, Training Loss: 0.00597
Epoch: 46000, Training Loss: 0.00673
Epoch: 46000, Training Loss: 0.00631
Epoch: 46000, Training Loss: 0.00769
Epoch: 46000, Training Loss: 0.00597
Epoch: 46001, Training Loss: 0.00673
Epoch: 46001, Training Loss: 0.00631
Epoch: 46001, Training Loss: 0.00769
Epoch: 46001, Training Loss: 0.00597
Epoch: 46002, Training Loss: 0.00673
Epoch: 46002, Training Loss: 0.00631
Epoch: 46002, Training Loss: 0.00769
Epoch: 46002, Training Loss: 0.00597
Epoch: 46003, Training Loss: 0.00673
Epoch: 46003, Training Loss: 0.00631
Epoch: 46003, Training Loss: 0.00769
Epoch: 46003, Training Loss: 0.00597
Epoch: 46004, Training Loss: 0.00673
Epoch: 46004, Training Loss: 0.00631
Epoch: 46004, Training Loss: 0.00769
Epoch: 46004, Training Loss: 0.00597
Epoch: 46005, Training Loss: 0.00673
Epoch: 46005, Training Loss: 0.00631
Epoch: 46005, Training Loss: 0.00769
Epoch: 46005, Training Loss: 0.00597
Epoch: 46006, Training Loss: 0.00673
Epoch: 46006, Training Loss: 0.00631
Epoch: 46006, Training Loss: 0.00769
Epoch: 46006, Training Loss: 0.00597
Epoch: 46007, Training Loss: 0.00673
Epoch: 46007, Training Loss: 0.00631
Epoch: 46007, Training Loss: 0.00769
Epoch: 46007, Training Loss: 0.00597
Epoch: 46008, Training Loss: 0.00673
Epoch: 46008, Training Loss: 0.00631
Epoch: 46008, Training Loss: 0.00769
Epoch: 46008, Training Loss: 0.00597
Epoch: 46009, Training Loss: 0.00673
Epoch: 46009, Training Loss: 0.00631
Epoch: 46009, Training Loss: 0.00769
Epoch: 46009, Training Loss: 0.00597
Epoch: 46010, Training Loss: 0.00673
Epoch: 46010, Training Loss: 0.00631
Epoch: 46010, Training Loss: 0.00769
Epoch: 46010, Training Loss: 0.00597
Epoch: 46011, Training Loss: 0.00673
Epoch: 46011, Training Loss: 0.00631
Epoch: 46011, Training Loss: 0.00769
Epoch: 46011, Training Loss: 0.00597
Epoch: 46012, Training Loss: 0.00673
Epoch: 46012, Training Loss: 0.00631
Epoch: 46012, Training Loss: 0.00769
Epoch: 46012, Training Loss: 0.00597
Epoch: 46013, Training Loss: 0.00673
Epoch: 46013, Training Loss: 0.00631
Epoch: 46013, Training Loss: 0.00769
Epoch: 46013, Training Loss: 0.00597
Epoch: 46014, Training Loss: 0.00673
Epoch: 46014, Training Loss: 0.00630
Epoch: 46014, Training Loss: 0.00769
Epoch: 46014, Training Loss: 0.00597
Epoch: 46015, Training Loss: 0.00673
Epoch: 46015, Training Loss: 0.00630
Epoch: 46015, Training Loss: 0.00769
Epoch: 46015, Training Loss: 0.00597
Epoch: 46016, Training Loss: 0.00673
Epoch: 46016, Training Loss: 0.00630
Epoch: 46016, Training Loss: 0.00769
Epoch: 46016, Training Loss: 0.00597
Epoch: 46017, Training Loss: 0.00673
Epoch: 46017, Training Loss: 0.00630
Epoch: 46017, Training Loss: 0.00769
Epoch: 46017, Training Loss: 0.00597
Epoch: 46018, Training Loss: 0.00673
Epoch: 46018, Training Loss: 0.00630
Epoch: 46018, Training Loss: 0.00769
Epoch: 46018, Training Loss: 0.00597
Epoch: 46019, Training Loss: 0.00673
Epoch: 46019, Training Loss: 0.00630
Epoch: 46019, Training Loss: 0.00769
Epoch: 46019, Training Loss: 0.00597
Epoch: 46020, Training Loss: 0.00673
Epoch: 46020, Training Loss: 0.00630
Epoch: 46020, Training Loss: 0.00769
Epoch: 46020, Training Loss: 0.00597
Epoch: 46021, Training Loss: 0.00673
Epoch: 46021, Training Loss: 0.00630
Epoch: 46021, Training Loss: 0.00769
Epoch: 46021, Training Loss: 0.00597
Epoch: 46022, Training Loss: 0.00673
Epoch: 46022, Training Loss: 0.00630
Epoch: 46022, Training Loss: 0.00769
Epoch: 46022, Training Loss: 0.00597
Epoch: 46023, Training Loss: 0.00673
Epoch: 46023, Training Loss: 0.00630
Epoch: 46023, Training Loss: 0.00769
Epoch: 46023, Training Loss: 0.00597
Epoch: 46024, Training Loss: 0.00673
Epoch: 46024, Training Loss: 0.00630
Epoch: 46024, Training Loss: 0.00769
Epoch: 46024, Training Loss: 0.00597
Epoch: 46025, Training Loss: 0.00673
Epoch: 46025, Training Loss: 0.00630
Epoch: 46025, Training Loss: 0.00769
Epoch: 46025, Training Loss: 0.00597
Epoch: 46026, Training Loss: 0.00673
Epoch: 46026, Training Loss: 0.00630
Epoch: 46026, Training Loss: 0.00769
Epoch: 46026, Training Loss: 0.00597
Epoch: 46027, Training Loss: 0.00673
Epoch: 46027, Training Loss: 0.00630
Epoch: 46027, Training Loss: 0.00769
Epoch: 46027, Training Loss: 0.00597
Epoch: 46028, Training Loss: 0.00673
Epoch: 46028, Training Loss: 0.00630
Epoch: 46028, Training Loss: 0.00769
Epoch: 46028, Training Loss: 0.00597
Epoch: 46029, Training Loss: 0.00673
Epoch: 46029, Training Loss: 0.00630
Epoch: 46029, Training Loss: 0.00769
Epoch: 46029, Training Loss: 0.00597
Epoch: 46030, Training Loss: 0.00673
Epoch: 46030, Training Loss: 0.00630
Epoch: 46030, Training Loss: 0.00769
Epoch: 46030, Training Loss: 0.00597
Epoch: 46031, Training Loss: 0.00673
Epoch: 46031, Training Loss: 0.00630
Epoch: 46031, Training Loss: 0.00769
Epoch: 46031, Training Loss: 0.00597
Epoch: 46032, Training Loss: 0.00673
Epoch: 46032, Training Loss: 0.00630
Epoch: 46032, Training Loss: 0.00769
Epoch: 46032, Training Loss: 0.00597
Epoch: 46033, Training Loss: 0.00673
Epoch: 46033, Training Loss: 0.00630
Epoch: 46033, Training Loss: 0.00769
Epoch: 46033, Training Loss: 0.00597
Epoch: 46034, Training Loss: 0.00673
Epoch: 46034, Training Loss: 0.00630
Epoch: 46034, Training Loss: 0.00769
Epoch: 46034, Training Loss: 0.00597
Epoch: 46035, Training Loss: 0.00673
Epoch: 46035, Training Loss: 0.00630
Epoch: 46035, Training Loss: 0.00769
Epoch: 46035, Training Loss: 0.00597
Epoch: 46036, Training Loss: 0.00673
Epoch: 46036, Training Loss: 0.00630
Epoch: 46036, Training Loss: 0.00769
Epoch: 46036, Training Loss: 0.00597
Epoch: 46037, Training Loss: 0.00673
Epoch: 46037, Training Loss: 0.00630
Epoch: 46037, Training Loss: 0.00769
Epoch: 46037, Training Loss: 0.00597
Epoch: 46038, Training Loss: 0.00673
Epoch: 46038, Training Loss: 0.00630
Epoch: 46038, Training Loss: 0.00769
Epoch: 46038, Training Loss: 0.00597
Epoch: 46039, Training Loss: 0.00673
Epoch: 46039, Training Loss: 0.00630
Epoch: 46039, Training Loss: 0.00769
Epoch: 46039, Training Loss: 0.00597
Epoch: 46040, Training Loss: 0.00673
Epoch: 46040, Training Loss: 0.00630
Epoch: 46040, Training Loss: 0.00769
Epoch: 46040, Training Loss: 0.00597
Epoch: 46041, Training Loss: 0.00673
Epoch: 46041, Training Loss: 0.00630
Epoch: 46041, Training Loss: 0.00769
Epoch: 46041, Training Loss: 0.00597
Epoch: 46042, Training Loss: 0.00673
Epoch: 46042, Training Loss: 0.00630
Epoch: 46042, Training Loss: 0.00769
Epoch: 46042, Training Loss: 0.00597
Epoch: 46043, Training Loss: 0.00673
Epoch: 46043, Training Loss: 0.00630
Epoch: 46043, Training Loss: 0.00769
Epoch: 46043, Training Loss: 0.00597
Epoch: 46044, Training Loss: 0.00673
Epoch: 46044, Training Loss: 0.00630
Epoch: 46044, Training Loss: 0.00769
Epoch: 46044, Training Loss: 0.00597
Epoch: 46045, Training Loss: 0.00673
Epoch: 46045, Training Loss: 0.00630
Epoch: 46045, Training Loss: 0.00769
Epoch: 46045, Training Loss: 0.00597
Epoch: 46046, Training Loss: 0.00673
Epoch: 46046, Training Loss: 0.00630
Epoch: 46046, Training Loss: 0.00769
Epoch: 46046, Training Loss: 0.00597
Epoch: 46047, Training Loss: 0.00673
Epoch: 46047, Training Loss: 0.00630
Epoch: 46047, Training Loss: 0.00769
Epoch: 46047, Training Loss: 0.00597
Epoch: 46048, Training Loss: 0.00673
Epoch: 46048, Training Loss: 0.00630
Epoch: 46048, Training Loss: 0.00769
Epoch: 46048, Training Loss: 0.00597
Epoch: 46049, Training Loss: 0.00673
Epoch: 46049, Training Loss: 0.00630
Epoch: 46049, Training Loss: 0.00769
Epoch: 46049, Training Loss: 0.00597
Epoch: 46050, Training Loss: 0.00673
Epoch: 46050, Training Loss: 0.00630
Epoch: 46050, Training Loss: 0.00769
Epoch: 46050, Training Loss: 0.00597
Epoch: 46051, Training Loss: 0.00673
Epoch: 46051, Training Loss: 0.00630
Epoch: 46051, Training Loss: 0.00769
Epoch: 46051, Training Loss: 0.00597
Epoch: 46052, Training Loss: 0.00673
Epoch: 46052, Training Loss: 0.00630
Epoch: 46052, Training Loss: 0.00769
Epoch: 46052, Training Loss: 0.00597
Epoch: 46053, Training Loss: 0.00673
Epoch: 46053, Training Loss: 0.00630
Epoch: 46053, Training Loss: 0.00769
Epoch: 46053, Training Loss: 0.00597
Epoch: 46054, Training Loss: 0.00673
Epoch: 46054, Training Loss: 0.00630
Epoch: 46054, Training Loss: 0.00769
Epoch: 46054, Training Loss: 0.00597
Epoch: 46055, Training Loss: 0.00673
Epoch: 46055, Training Loss: 0.00630
Epoch: 46055, Training Loss: 0.00769
Epoch: 46055, Training Loss: 0.00597
Epoch: 46056, Training Loss: 0.00673
Epoch: 46056, Training Loss: 0.00630
Epoch: 46056, Training Loss: 0.00769
Epoch: 46056, Training Loss: 0.00597
Epoch: 46057, Training Loss: 0.00673
Epoch: 46057, Training Loss: 0.00630
Epoch: 46057, Training Loss: 0.00769
Epoch: 46057, Training Loss: 0.00597
Epoch: 46058, Training Loss: 0.00673
Epoch: 46058, Training Loss: 0.00630
Epoch: 46058, Training Loss: 0.00769
Epoch: 46058, Training Loss: 0.00597
Epoch: 46059, Training Loss: 0.00673
Epoch: 46059, Training Loss: 0.00630
Epoch: 46059, Training Loss: 0.00769
Epoch: 46059, Training Loss: 0.00597
Epoch: 46060, Training Loss: 0.00673
Epoch: 46060, Training Loss: 0.00630
Epoch: 46060, Training Loss: 0.00769
Epoch: 46060, Training Loss: 0.00597
Epoch: 46061, Training Loss: 0.00673
Epoch: 46061, Training Loss: 0.00630
Epoch: 46061, Training Loss: 0.00769
Epoch: 46061, Training Loss: 0.00597
Epoch: 46062, Training Loss: 0.00673
Epoch: 46062, Training Loss: 0.00630
Epoch: 46062, Training Loss: 0.00769
Epoch: 46062, Training Loss: 0.00597
Epoch: 46063, Training Loss: 0.00673
Epoch: 46063, Training Loss: 0.00630
Epoch: 46063, Training Loss: 0.00769
Epoch: 46063, Training Loss: 0.00597
Epoch: 46064, Training Loss: 0.00673
Epoch: 46064, Training Loss: 0.00630
Epoch: 46064, Training Loss: 0.00769
Epoch: 46064, Training Loss: 0.00597
Epoch: 46065, Training Loss: 0.00673
Epoch: 46065, Training Loss: 0.00630
Epoch: 46065, Training Loss: 0.00769
Epoch: 46065, Training Loss: 0.00597
Epoch: 46066, Training Loss: 0.00673
Epoch: 46066, Training Loss: 0.00630
Epoch: 46066, Training Loss: 0.00769
Epoch: 46066, Training Loss: 0.00597
Epoch: 46067, Training Loss: 0.00673
Epoch: 46067, Training Loss: 0.00630
Epoch: 46067, Training Loss: 0.00769
Epoch: 46067, Training Loss: 0.00597
Epoch: 46068, Training Loss: 0.00673
Epoch: 46068, Training Loss: 0.00630
Epoch: 46068, Training Loss: 0.00769
Epoch: 46068, Training Loss: 0.00597
Epoch: 46069, Training Loss: 0.00673
Epoch: 46069, Training Loss: 0.00630
Epoch: 46069, Training Loss: 0.00769
Epoch: 46069, Training Loss: 0.00597
Epoch: 46070, Training Loss: 0.00673
Epoch: 46070, Training Loss: 0.00630
Epoch: 46070, Training Loss: 0.00769
Epoch: 46070, Training Loss: 0.00597
Epoch: 46071, Training Loss: 0.00673
Epoch: 46071, Training Loss: 0.00630
Epoch: 46071, Training Loss: 0.00769
Epoch: 46071, Training Loss: 0.00597
Epoch: 46072, Training Loss: 0.00673
Epoch: 46072, Training Loss: 0.00630
Epoch: 46072, Training Loss: 0.00769
Epoch: 46072, Training Loss: 0.00597
Epoch: 46073, Training Loss: 0.00673
Epoch: 46073, Training Loss: 0.00630
Epoch: 46073, Training Loss: 0.00769
Epoch: 46073, Training Loss: 0.00597
Epoch: 46074, Training Loss: 0.00673
Epoch: 46074, Training Loss: 0.00630
Epoch: 46074, Training Loss: 0.00769
Epoch: 46074, Training Loss: 0.00597
Epoch: 46075, Training Loss: 0.00673
Epoch: 46075, Training Loss: 0.00630
Epoch: 46075, Training Loss: 0.00769
Epoch: 46075, Training Loss: 0.00596
Epoch: 46076, Training Loss: 0.00673
Epoch: 46076, Training Loss: 0.00630
Epoch: 46076, Training Loss: 0.00769
Epoch: 46076, Training Loss: 0.00596
Epoch: 46077, Training Loss: 0.00673
Epoch: 46077, Training Loss: 0.00630
Epoch: 46077, Training Loss: 0.00769
Epoch: 46077, Training Loss: 0.00596
Epoch: 46078, Training Loss: 0.00672
Epoch: 46078, Training Loss: 0.00630
Epoch: 46078, Training Loss: 0.00769
Epoch: 46078, Training Loss: 0.00596
Epoch: 46079, Training Loss: 0.00672
Epoch: 46079, Training Loss: 0.00630
Epoch: 46079, Training Loss: 0.00769
Epoch: 46079, Training Loss: 0.00596
Epoch: 46080, Training Loss: 0.00672
Epoch: 46080, Training Loss: 0.00630
Epoch: 46080, Training Loss: 0.00769
Epoch: 46080, Training Loss: 0.00596
Epoch: 46081, Training Loss: 0.00672
Epoch: 46081, Training Loss: 0.00630
Epoch: 46081, Training Loss: 0.00769
Epoch: 46081, Training Loss: 0.00596
Epoch: 46082, Training Loss: 0.00672
Epoch: 46082, Training Loss: 0.00630
Epoch: 46082, Training Loss: 0.00769
Epoch: 46082, Training Loss: 0.00596
Epoch: 46083, Training Loss: 0.00672
Epoch: 46083, Training Loss: 0.00630
Epoch: 46083, Training Loss: 0.00769
Epoch: 46083, Training Loss: 0.00596
Epoch: 46084, Training Loss: 0.00672
Epoch: 46084, Training Loss: 0.00630
Epoch: 46084, Training Loss: 0.00769
Epoch: 46084, Training Loss: 0.00596
Epoch: 46085, Training Loss: 0.00672
Epoch: 46085, Training Loss: 0.00630
Epoch: 46085, Training Loss: 0.00769
Epoch: 46085, Training Loss: 0.00596
Epoch: 46086, Training Loss: 0.00672
Epoch: 46086, Training Loss: 0.00630
Epoch: 46086, Training Loss: 0.00769
Epoch: 46086, Training Loss: 0.00596
Epoch: 46087, Training Loss: 0.00672
Epoch: 46087, Training Loss: 0.00630
Epoch: 46087, Training Loss: 0.00769
Epoch: 46087, Training Loss: 0.00596
Epoch: 46088, Training Loss: 0.00672
Epoch: 46088, Training Loss: 0.00630
Epoch: 46088, Training Loss: 0.00769
Epoch: 46088, Training Loss: 0.00596
Epoch: 46089, Training Loss: 0.00672
Epoch: 46089, Training Loss: 0.00630
Epoch: 46089, Training Loss: 0.00769
Epoch: 46089, Training Loss: 0.00596
Epoch: 46090, Training Loss: 0.00672
Epoch: 46090, Training Loss: 0.00630
Epoch: 46090, Training Loss: 0.00769
Epoch: 46090, Training Loss: 0.00596
Epoch: 46091, Training Loss: 0.00672
Epoch: 46091, Training Loss: 0.00630
Epoch: 46091, Training Loss: 0.00768
Epoch: 46091, Training Loss: 0.00596
Epoch: 46092, Training Loss: 0.00672
Epoch: 46092, Training Loss: 0.00630
Epoch: 46092, Training Loss: 0.00768
Epoch: 46092, Training Loss: 0.00596
Epoch: 46093, Training Loss: 0.00672
Epoch: 46093, Training Loss: 0.00630
Epoch: 46093, Training Loss: 0.00768
Epoch: 46093, Training Loss: 0.00596
Epoch: 46094, Training Loss: 0.00672
Epoch: 46094, Training Loss: 0.00630
Epoch: 46094, Training Loss: 0.00768
Epoch: 46094, Training Loss: 0.00596
Epoch: 46095, Training Loss: 0.00672
Epoch: 46095, Training Loss: 0.00630
Epoch: 46095, Training Loss: 0.00768
Epoch: 46095, Training Loss: 0.00596
Epoch: 46096, Training Loss: 0.00672
Epoch: 46096, Training Loss: 0.00630
Epoch: 46096, Training Loss: 0.00768
Epoch: 46096, Training Loss: 0.00596
Epoch: 46097, Training Loss: 0.00672
Epoch: 46097, Training Loss: 0.00630
Epoch: 46097, Training Loss: 0.00768
Epoch: 46097, Training Loss: 0.00596
Epoch: 46098, Training Loss: 0.00672
Epoch: 46098, Training Loss: 0.00630
Epoch: 46098, Training Loss: 0.00768
Epoch: 46098, Training Loss: 0.00596
Epoch: 46099, Training Loss: 0.00672
Epoch: 46099, Training Loss: 0.00630
Epoch: 46099, Training Loss: 0.00768
Epoch: 46099, Training Loss: 0.00596
Epoch: 46100, Training Loss: 0.00672
Epoch: 46100, Training Loss: 0.00630
Epoch: 46100, Training Loss: 0.00768
Epoch: 46100, Training Loss: 0.00596
Epoch: 46101, Training Loss: 0.00672
Epoch: 46101, Training Loss: 0.00630
Epoch: 46101, Training Loss: 0.00768
Epoch: 46101, Training Loss: 0.00596
[... ~970 lines of repetitive per-epoch output elided: each epoch prints the four per-pattern losses, which decrease only marginally between epochs 46101 and 46345, from 0.00672 / 0.00630 / 0.00768 / 0.00596 down to 0.00670 / 0.00628 / 0.00766 / 0.00595 ...]
Epoch: 46345, Training Loss: 0.00670
Epoch: 46345, Training Loss: 0.00628
Epoch: 46345, Training Loss: 0.00766
Epoch: 46345, Training Loss: 0.00595
Epoch: 46346, Training Loss: 0.00670
Epoch: 46346, Training Loss: 0.00628
Epoch: 46346, Training Loss: 0.00766
Epoch: 46346, Training Loss: 0.00595
Epoch: 46347, Training Loss: 0.00670
Epoch: 46347, Training Loss: 0.00628
Epoch: 46347, Training Loss: 0.00766
Epoch: 46347, Training Loss: 0.00595
Epoch: 46348, Training Loss: 0.00670
Epoch: 46348, Training Loss: 0.00628
Epoch: 46348, Training Loss: 0.00766
Epoch: 46348, Training Loss: 0.00595
Epoch: 46349, Training Loss: 0.00670
Epoch: 46349, Training Loss: 0.00628
Epoch: 46349, Training Loss: 0.00766
Epoch: 46349, Training Loss: 0.00595
Epoch: 46350, Training Loss: 0.00670
Epoch: 46350, Training Loss: 0.00628
Epoch: 46350, Training Loss: 0.00766
Epoch: 46350, Training Loss: 0.00595
Epoch: 46351, Training Loss: 0.00670
Epoch: 46351, Training Loss: 0.00628
Epoch: 46351, Training Loss: 0.00766
Epoch: 46351, Training Loss: 0.00595
Epoch: 46352, Training Loss: 0.00670
Epoch: 46352, Training Loss: 0.00628
Epoch: 46352, Training Loss: 0.00766
Epoch: 46352, Training Loss: 0.00595
Epoch: 46353, Training Loss: 0.00670
Epoch: 46353, Training Loss: 0.00628
Epoch: 46353, Training Loss: 0.00766
Epoch: 46353, Training Loss: 0.00595
Epoch: 46354, Training Loss: 0.00670
Epoch: 46354, Training Loss: 0.00628
Epoch: 46354, Training Loss: 0.00766
Epoch: 46354, Training Loss: 0.00595
Epoch: 46355, Training Loss: 0.00670
Epoch: 46355, Training Loss: 0.00628
Epoch: 46355, Training Loss: 0.00766
Epoch: 46355, Training Loss: 0.00595
Epoch: 46356, Training Loss: 0.00670
Epoch: 46356, Training Loss: 0.00628
Epoch: 46356, Training Loss: 0.00766
Epoch: 46356, Training Loss: 0.00595
Epoch: 46357, Training Loss: 0.00670
Epoch: 46357, Training Loss: 0.00628
Epoch: 46357, Training Loss: 0.00766
Epoch: 46357, Training Loss: 0.00595
Epoch: 46358, Training Loss: 0.00670
Epoch: 46358, Training Loss: 0.00628
Epoch: 46358, Training Loss: 0.00766
Epoch: 46358, Training Loss: 0.00595
Epoch: 46359, Training Loss: 0.00670
Epoch: 46359, Training Loss: 0.00628
Epoch: 46359, Training Loss: 0.00766
Epoch: 46359, Training Loss: 0.00595
Epoch: 46360, Training Loss: 0.00670
Epoch: 46360, Training Loss: 0.00628
Epoch: 46360, Training Loss: 0.00766
Epoch: 46360, Training Loss: 0.00595
Epoch: 46361, Training Loss: 0.00670
Epoch: 46361, Training Loss: 0.00628
Epoch: 46361, Training Loss: 0.00766
Epoch: 46361, Training Loss: 0.00595
Epoch: 46362, Training Loss: 0.00670
Epoch: 46362, Training Loss: 0.00628
Epoch: 46362, Training Loss: 0.00766
Epoch: 46362, Training Loss: 0.00595
Epoch: 46363, Training Loss: 0.00670
Epoch: 46363, Training Loss: 0.00628
Epoch: 46363, Training Loss: 0.00766
Epoch: 46363, Training Loss: 0.00595
Epoch: 46364, Training Loss: 0.00670
Epoch: 46364, Training Loss: 0.00628
Epoch: 46364, Training Loss: 0.00766
Epoch: 46364, Training Loss: 0.00595
Epoch: 46365, Training Loss: 0.00670
Epoch: 46365, Training Loss: 0.00628
Epoch: 46365, Training Loss: 0.00766
Epoch: 46365, Training Loss: 0.00595
Epoch: 46366, Training Loss: 0.00670
Epoch: 46366, Training Loss: 0.00628
Epoch: 46366, Training Loss: 0.00766
Epoch: 46366, Training Loss: 0.00595
Epoch: 46367, Training Loss: 0.00670
Epoch: 46367, Training Loss: 0.00628
Epoch: 46367, Training Loss: 0.00766
Epoch: 46367, Training Loss: 0.00595
Epoch: 46368, Training Loss: 0.00670
Epoch: 46368, Training Loss: 0.00628
Epoch: 46368, Training Loss: 0.00766
Epoch: 46368, Training Loss: 0.00595
Epoch: 46369, Training Loss: 0.00670
Epoch: 46369, Training Loss: 0.00628
Epoch: 46369, Training Loss: 0.00766
Epoch: 46369, Training Loss: 0.00595
Epoch: 46370, Training Loss: 0.00670
Epoch: 46370, Training Loss: 0.00628
Epoch: 46370, Training Loss: 0.00766
Epoch: 46370, Training Loss: 0.00595
Epoch: 46371, Training Loss: 0.00670
Epoch: 46371, Training Loss: 0.00628
Epoch: 46371, Training Loss: 0.00766
Epoch: 46371, Training Loss: 0.00595
Epoch: 46372, Training Loss: 0.00670
Epoch: 46372, Training Loss: 0.00628
Epoch: 46372, Training Loss: 0.00766
Epoch: 46372, Training Loss: 0.00595
Epoch: 46373, Training Loss: 0.00670
Epoch: 46373, Training Loss: 0.00628
Epoch: 46373, Training Loss: 0.00766
Epoch: 46373, Training Loss: 0.00595
Epoch: 46374, Training Loss: 0.00670
Epoch: 46374, Training Loss: 0.00628
Epoch: 46374, Training Loss: 0.00766
Epoch: 46374, Training Loss: 0.00595
Epoch: 46375, Training Loss: 0.00670
Epoch: 46375, Training Loss: 0.00628
Epoch: 46375, Training Loss: 0.00766
Epoch: 46375, Training Loss: 0.00594
Epoch: 46376, Training Loss: 0.00670
Epoch: 46376, Training Loss: 0.00628
Epoch: 46376, Training Loss: 0.00766
Epoch: 46376, Training Loss: 0.00594
Epoch: 46377, Training Loss: 0.00670
Epoch: 46377, Training Loss: 0.00628
Epoch: 46377, Training Loss: 0.00766
Epoch: 46377, Training Loss: 0.00594
Epoch: 46378, Training Loss: 0.00670
Epoch: 46378, Training Loss: 0.00628
Epoch: 46378, Training Loss: 0.00766
Epoch: 46378, Training Loss: 0.00594
Epoch: 46379, Training Loss: 0.00670
Epoch: 46379, Training Loss: 0.00628
Epoch: 46379, Training Loss: 0.00766
Epoch: 46379, Training Loss: 0.00594
Epoch: 46380, Training Loss: 0.00670
Epoch: 46380, Training Loss: 0.00628
Epoch: 46380, Training Loss: 0.00766
Epoch: 46380, Training Loss: 0.00594
Epoch: 46381, Training Loss: 0.00670
Epoch: 46381, Training Loss: 0.00628
Epoch: 46381, Training Loss: 0.00766
Epoch: 46381, Training Loss: 0.00594
Epoch: 46382, Training Loss: 0.00670
Epoch: 46382, Training Loss: 0.00628
Epoch: 46382, Training Loss: 0.00766
Epoch: 46382, Training Loss: 0.00594
Epoch: 46383, Training Loss: 0.00670
Epoch: 46383, Training Loss: 0.00628
Epoch: 46383, Training Loss: 0.00766
Epoch: 46383, Training Loss: 0.00594
Epoch: 46384, Training Loss: 0.00670
Epoch: 46384, Training Loss: 0.00628
Epoch: 46384, Training Loss: 0.00766
Epoch: 46384, Training Loss: 0.00594
Epoch: 46385, Training Loss: 0.00670
Epoch: 46385, Training Loss: 0.00628
Epoch: 46385, Training Loss: 0.00766
Epoch: 46385, Training Loss: 0.00594
Epoch: 46386, Training Loss: 0.00670
Epoch: 46386, Training Loss: 0.00628
Epoch: 46386, Training Loss: 0.00766
Epoch: 46386, Training Loss: 0.00594
Epoch: 46387, Training Loss: 0.00670
Epoch: 46387, Training Loss: 0.00628
Epoch: 46387, Training Loss: 0.00766
Epoch: 46387, Training Loss: 0.00594
Epoch: 46388, Training Loss: 0.00670
Epoch: 46388, Training Loss: 0.00628
Epoch: 46388, Training Loss: 0.00766
Epoch: 46388, Training Loss: 0.00594
Epoch: 46389, Training Loss: 0.00670
Epoch: 46389, Training Loss: 0.00628
Epoch: 46389, Training Loss: 0.00766
Epoch: 46389, Training Loss: 0.00594
Epoch: 46390, Training Loss: 0.00670
Epoch: 46390, Training Loss: 0.00628
Epoch: 46390, Training Loss: 0.00766
Epoch: 46390, Training Loss: 0.00594
Epoch: 46391, Training Loss: 0.00670
Epoch: 46391, Training Loss: 0.00628
Epoch: 46391, Training Loss: 0.00766
Epoch: 46391, Training Loss: 0.00594
Epoch: 46392, Training Loss: 0.00670
Epoch: 46392, Training Loss: 0.00628
Epoch: 46392, Training Loss: 0.00766
Epoch: 46392, Training Loss: 0.00594
Epoch: 46393, Training Loss: 0.00670
Epoch: 46393, Training Loss: 0.00628
Epoch: 46393, Training Loss: 0.00766
Epoch: 46393, Training Loss: 0.00594
Epoch: 46394, Training Loss: 0.00670
Epoch: 46394, Training Loss: 0.00628
Epoch: 46394, Training Loss: 0.00766
Epoch: 46394, Training Loss: 0.00594
Epoch: 46395, Training Loss: 0.00670
Epoch: 46395, Training Loss: 0.00628
Epoch: 46395, Training Loss: 0.00766
Epoch: 46395, Training Loss: 0.00594
Epoch: 46396, Training Loss: 0.00670
Epoch: 46396, Training Loss: 0.00628
Epoch: 46396, Training Loss: 0.00766
Epoch: 46396, Training Loss: 0.00594
Epoch: 46397, Training Loss: 0.00670
Epoch: 46397, Training Loss: 0.00628
Epoch: 46397, Training Loss: 0.00766
Epoch: 46397, Training Loss: 0.00594
Epoch: 46398, Training Loss: 0.00670
Epoch: 46398, Training Loss: 0.00628
Epoch: 46398, Training Loss: 0.00766
Epoch: 46398, Training Loss: 0.00594
Epoch: 46399, Training Loss: 0.00670
Epoch: 46399, Training Loss: 0.00628
Epoch: 46399, Training Loss: 0.00766
Epoch: 46399, Training Loss: 0.00594
Epoch: 46400, Training Loss: 0.00670
Epoch: 46400, Training Loss: 0.00628
Epoch: 46400, Training Loss: 0.00766
Epoch: 46400, Training Loss: 0.00594
Epoch: 46401, Training Loss: 0.00670
Epoch: 46401, Training Loss: 0.00628
Epoch: 46401, Training Loss: 0.00766
Epoch: 46401, Training Loss: 0.00594
Epoch: 46402, Training Loss: 0.00670
Epoch: 46402, Training Loss: 0.00628
Epoch: 46402, Training Loss: 0.00766
Epoch: 46402, Training Loss: 0.00594
Epoch: 46403, Training Loss: 0.00670
Epoch: 46403, Training Loss: 0.00628
Epoch: 46403, Training Loss: 0.00766
Epoch: 46403, Training Loss: 0.00594
Epoch: 46404, Training Loss: 0.00670
Epoch: 46404, Training Loss: 0.00628
Epoch: 46404, Training Loss: 0.00766
Epoch: 46404, Training Loss: 0.00594
Epoch: 46405, Training Loss: 0.00670
Epoch: 46405, Training Loss: 0.00628
Epoch: 46405, Training Loss: 0.00766
Epoch: 46405, Training Loss: 0.00594
Epoch: 46406, Training Loss: 0.00670
Epoch: 46406, Training Loss: 0.00628
Epoch: 46406, Training Loss: 0.00766
Epoch: 46406, Training Loss: 0.00594
Epoch: 46407, Training Loss: 0.00670
Epoch: 46407, Training Loss: 0.00628
Epoch: 46407, Training Loss: 0.00766
Epoch: 46407, Training Loss: 0.00594
Epoch: 46408, Training Loss: 0.00670
Epoch: 46408, Training Loss: 0.00628
Epoch: 46408, Training Loss: 0.00766
Epoch: 46408, Training Loss: 0.00594
Epoch: 46409, Training Loss: 0.00670
Epoch: 46409, Training Loss: 0.00628
Epoch: 46409, Training Loss: 0.00766
Epoch: 46409, Training Loss: 0.00594
Epoch: 46410, Training Loss: 0.00670
Epoch: 46410, Training Loss: 0.00628
Epoch: 46410, Training Loss: 0.00766
Epoch: 46410, Training Loss: 0.00594
Epoch: 46411, Training Loss: 0.00670
Epoch: 46411, Training Loss: 0.00628
Epoch: 46411, Training Loss: 0.00766
Epoch: 46411, Training Loss: 0.00594
Epoch: 46412, Training Loss: 0.00670
Epoch: 46412, Training Loss: 0.00628
Epoch: 46412, Training Loss: 0.00766
Epoch: 46412, Training Loss: 0.00594
Epoch: 46413, Training Loss: 0.00670
Epoch: 46413, Training Loss: 0.00628
Epoch: 46413, Training Loss: 0.00766
Epoch: 46413, Training Loss: 0.00594
Epoch: 46414, Training Loss: 0.00670
Epoch: 46414, Training Loss: 0.00628
Epoch: 46414, Training Loss: 0.00766
Epoch: 46414, Training Loss: 0.00594
Epoch: 46415, Training Loss: 0.00670
Epoch: 46415, Training Loss: 0.00628
Epoch: 46415, Training Loss: 0.00766
Epoch: 46415, Training Loss: 0.00594
Epoch: 46416, Training Loss: 0.00670
Epoch: 46416, Training Loss: 0.00628
Epoch: 46416, Training Loss: 0.00766
Epoch: 46416, Training Loss: 0.00594
Epoch: 46417, Training Loss: 0.00670
Epoch: 46417, Training Loss: 0.00628
Epoch: 46417, Training Loss: 0.00766
Epoch: 46417, Training Loss: 0.00594
Epoch: 46418, Training Loss: 0.00670
Epoch: 46418, Training Loss: 0.00628
Epoch: 46418, Training Loss: 0.00766
Epoch: 46418, Training Loss: 0.00594
Epoch: 46419, Training Loss: 0.00670
Epoch: 46419, Training Loss: 0.00628
Epoch: 46419, Training Loss: 0.00766
Epoch: 46419, Training Loss: 0.00594
Epoch: 46420, Training Loss: 0.00670
Epoch: 46420, Training Loss: 0.00628
Epoch: 46420, Training Loss: 0.00766
Epoch: 46420, Training Loss: 0.00594
Epoch: 46421, Training Loss: 0.00670
Epoch: 46421, Training Loss: 0.00628
Epoch: 46421, Training Loss: 0.00766
Epoch: 46421, Training Loss: 0.00594
Epoch: 46422, Training Loss: 0.00670
Epoch: 46422, Training Loss: 0.00628
Epoch: 46422, Training Loss: 0.00766
Epoch: 46422, Training Loss: 0.00594
Epoch: 46423, Training Loss: 0.00670
Epoch: 46423, Training Loss: 0.00628
Epoch: 46423, Training Loss: 0.00766
Epoch: 46423, Training Loss: 0.00594
Epoch: 46424, Training Loss: 0.00670
Epoch: 46424, Training Loss: 0.00628
Epoch: 46424, Training Loss: 0.00766
Epoch: 46424, Training Loss: 0.00594
Epoch: 46425, Training Loss: 0.00670
Epoch: 46425, Training Loss: 0.00628
Epoch: 46425, Training Loss: 0.00766
Epoch: 46425, Training Loss: 0.00594
Epoch: 46426, Training Loss: 0.00670
Epoch: 46426, Training Loss: 0.00628
Epoch: 46426, Training Loss: 0.00766
Epoch: 46426, Training Loss: 0.00594
Epoch: 46427, Training Loss: 0.00670
Epoch: 46427, Training Loss: 0.00628
Epoch: 46427, Training Loss: 0.00766
Epoch: 46427, Training Loss: 0.00594
Epoch: 46428, Training Loss: 0.00670
Epoch: 46428, Training Loss: 0.00628
Epoch: 46428, Training Loss: 0.00766
Epoch: 46428, Training Loss: 0.00594
Epoch: 46429, Training Loss: 0.00670
Epoch: 46429, Training Loss: 0.00628
Epoch: 46429, Training Loss: 0.00766
Epoch: 46429, Training Loss: 0.00594
Epoch: 46430, Training Loss: 0.00670
Epoch: 46430, Training Loss: 0.00628
Epoch: 46430, Training Loss: 0.00766
Epoch: 46430, Training Loss: 0.00594
Epoch: 46431, Training Loss: 0.00670
Epoch: 46431, Training Loss: 0.00628
Epoch: 46431, Training Loss: 0.00766
Epoch: 46431, Training Loss: 0.00594
Epoch: 46432, Training Loss: 0.00670
Epoch: 46432, Training Loss: 0.00628
Epoch: 46432, Training Loss: 0.00766
Epoch: 46432, Training Loss: 0.00594
Epoch: 46433, Training Loss: 0.00670
Epoch: 46433, Training Loss: 0.00628
Epoch: 46433, Training Loss: 0.00766
Epoch: 46433, Training Loss: 0.00594
Epoch: 46434, Training Loss: 0.00670
Epoch: 46434, Training Loss: 0.00628
Epoch: 46434, Training Loss: 0.00766
Epoch: 46434, Training Loss: 0.00594
Epoch: 46435, Training Loss: 0.00670
Epoch: 46435, Training Loss: 0.00628
Epoch: 46435, Training Loss: 0.00766
Epoch: 46435, Training Loss: 0.00594
Epoch: 46436, Training Loss: 0.00670
Epoch: 46436, Training Loss: 0.00628
Epoch: 46436, Training Loss: 0.00766
Epoch: 46436, Training Loss: 0.00594
Epoch: 46437, Training Loss: 0.00670
Epoch: 46437, Training Loss: 0.00628
Epoch: 46437, Training Loss: 0.00765
Epoch: 46437, Training Loss: 0.00594
Epoch: 46438, Training Loss: 0.00670
Epoch: 46438, Training Loss: 0.00628
Epoch: 46438, Training Loss: 0.00765
Epoch: 46438, Training Loss: 0.00594
Epoch: 46439, Training Loss: 0.00670
Epoch: 46439, Training Loss: 0.00628
Epoch: 46439, Training Loss: 0.00765
Epoch: 46439, Training Loss: 0.00594
Epoch: 46440, Training Loss: 0.00670
Epoch: 46440, Training Loss: 0.00627
Epoch: 46440, Training Loss: 0.00765
Epoch: 46440, Training Loss: 0.00594
Epoch: 46441, Training Loss: 0.00670
Epoch: 46441, Training Loss: 0.00627
Epoch: 46441, Training Loss: 0.00765
Epoch: 46441, Training Loss: 0.00594
Epoch: 46442, Training Loss: 0.00670
Epoch: 46442, Training Loss: 0.00627
Epoch: 46442, Training Loss: 0.00765
Epoch: 46442, Training Loss: 0.00594
Epoch: 46443, Training Loss: 0.00670
Epoch: 46443, Training Loss: 0.00627
Epoch: 46443, Training Loss: 0.00765
Epoch: 46443, Training Loss: 0.00594
Epoch: 46444, Training Loss: 0.00670
Epoch: 46444, Training Loss: 0.00627
Epoch: 46444, Training Loss: 0.00765
Epoch: 46444, Training Loss: 0.00594
Epoch: 46445, Training Loss: 0.00670
Epoch: 46445, Training Loss: 0.00627
Epoch: 46445, Training Loss: 0.00765
Epoch: 46445, Training Loss: 0.00594
Epoch: 46446, Training Loss: 0.00670
Epoch: 46446, Training Loss: 0.00627
Epoch: 46446, Training Loss: 0.00765
Epoch: 46446, Training Loss: 0.00594
Epoch: 46447, Training Loss: 0.00670
Epoch: 46447, Training Loss: 0.00627
Epoch: 46447, Training Loss: 0.00765
Epoch: 46447, Training Loss: 0.00594
Epoch: 46448, Training Loss: 0.00670
Epoch: 46448, Training Loss: 0.00627
Epoch: 46448, Training Loss: 0.00765
Epoch: 46448, Training Loss: 0.00594
Epoch: 46449, Training Loss: 0.00670
Epoch: 46449, Training Loss: 0.00627
Epoch: 46449, Training Loss: 0.00765
Epoch: 46449, Training Loss: 0.00594
Epoch: 46450, Training Loss: 0.00670
Epoch: 46450, Training Loss: 0.00627
Epoch: 46450, Training Loss: 0.00765
Epoch: 46450, Training Loss: 0.00594
Epoch: 46451, Training Loss: 0.00670
Epoch: 46451, Training Loss: 0.00627
Epoch: 46451, Training Loss: 0.00765
Epoch: 46451, Training Loss: 0.00594
Epoch: 46452, Training Loss: 0.00670
Epoch: 46452, Training Loss: 0.00627
Epoch: 46452, Training Loss: 0.00765
Epoch: 46452, Training Loss: 0.00594
Epoch: 46453, Training Loss: 0.00670
Epoch: 46453, Training Loss: 0.00627
Epoch: 46453, Training Loss: 0.00765
Epoch: 46453, Training Loss: 0.00594
Epoch: 46454, Training Loss: 0.00670
Epoch: 46454, Training Loss: 0.00627
Epoch: 46454, Training Loss: 0.00765
Epoch: 46454, Training Loss: 0.00594
Epoch: 46455, Training Loss: 0.00670
Epoch: 46455, Training Loss: 0.00627
Epoch: 46455, Training Loss: 0.00765
Epoch: 46455, Training Loss: 0.00594
Epoch: 46456, Training Loss: 0.00670
Epoch: 46456, Training Loss: 0.00627
Epoch: 46456, Training Loss: 0.00765
Epoch: 46456, Training Loss: 0.00594
Epoch: 46457, Training Loss: 0.00670
Epoch: 46457, Training Loss: 0.00627
Epoch: 46457, Training Loss: 0.00765
Epoch: 46457, Training Loss: 0.00594
Epoch: 46458, Training Loss: 0.00670
Epoch: 46458, Training Loss: 0.00627
Epoch: 46458, Training Loss: 0.00765
Epoch: 46458, Training Loss: 0.00594
Epoch: 46459, Training Loss: 0.00670
Epoch: 46459, Training Loss: 0.00627
Epoch: 46459, Training Loss: 0.00765
Epoch: 46459, Training Loss: 0.00594
Epoch: 46460, Training Loss: 0.00670
Epoch: 46460, Training Loss: 0.00627
Epoch: 46460, Training Loss: 0.00765
Epoch: 46460, Training Loss: 0.00594
Epoch: 46461, Training Loss: 0.00670
Epoch: 46461, Training Loss: 0.00627
Epoch: 46461, Training Loss: 0.00765
Epoch: 46461, Training Loss: 0.00594
Epoch: 46462, Training Loss: 0.00670
Epoch: 46462, Training Loss: 0.00627
Epoch: 46462, Training Loss: 0.00765
Epoch: 46462, Training Loss: 0.00594
Epoch: 46463, Training Loss: 0.00670
Epoch: 46463, Training Loss: 0.00627
Epoch: 46463, Training Loss: 0.00765
Epoch: 46463, Training Loss: 0.00594
Epoch: 46464, Training Loss: 0.00670
Epoch: 46464, Training Loss: 0.00627
Epoch: 46464, Training Loss: 0.00765
Epoch: 46464, Training Loss: 0.00594
Epoch: 46465, Training Loss: 0.00670
Epoch: 46465, Training Loss: 0.00627
Epoch: 46465, Training Loss: 0.00765
Epoch: 46465, Training Loss: 0.00594
Epoch: 46466, Training Loss: 0.00670
Epoch: 46466, Training Loss: 0.00627
Epoch: 46466, Training Loss: 0.00765
Epoch: 46466, Training Loss: 0.00594
Epoch: 46467, Training Loss: 0.00670
Epoch: 46467, Training Loss: 0.00627
Epoch: 46467, Training Loss: 0.00765
Epoch: 46467, Training Loss: 0.00594
Epoch: 46468, Training Loss: 0.00670
Epoch: 46468, Training Loss: 0.00627
Epoch: 46468, Training Loss: 0.00765
Epoch: 46468, Training Loss: 0.00594
Epoch: 46469, Training Loss: 0.00670
Epoch: 46469, Training Loss: 0.00627
Epoch: 46469, Training Loss: 0.00765
Epoch: 46469, Training Loss: 0.00594
Epoch: 46470, Training Loss: 0.00670
Epoch: 46470, Training Loss: 0.00627
Epoch: 46470, Training Loss: 0.00765
Epoch: 46470, Training Loss: 0.00594
Epoch: 46471, Training Loss: 0.00670
Epoch: 46471, Training Loss: 0.00627
Epoch: 46471, Training Loss: 0.00765
Epoch: 46471, Training Loss: 0.00594
Epoch: 46472, Training Loss: 0.00669
Epoch: 46472, Training Loss: 0.00627
Epoch: 46472, Training Loss: 0.00765
Epoch: 46472, Training Loss: 0.00594
Epoch: 46473, Training Loss: 0.00669
Epoch: 46473, Training Loss: 0.00627
Epoch: 46473, Training Loss: 0.00765
Epoch: 46473, Training Loss: 0.00594
Epoch: 46474, Training Loss: 0.00669
Epoch: 46474, Training Loss: 0.00627
Epoch: 46474, Training Loss: 0.00765
Epoch: 46474, Training Loss: 0.00594
Epoch: 46475, Training Loss: 0.00669
Epoch: 46475, Training Loss: 0.00627
Epoch: 46475, Training Loss: 0.00765
Epoch: 46475, Training Loss: 0.00594
Epoch: 46476, Training Loss: 0.00669
Epoch: 46476, Training Loss: 0.00627
Epoch: 46476, Training Loss: 0.00765
Epoch: 46476, Training Loss: 0.00594
Epoch: 46477, Training Loss: 0.00669
Epoch: 46477, Training Loss: 0.00627
Epoch: 46477, Training Loss: 0.00765
Epoch: 46477, Training Loss: 0.00594
Epoch: 46478, Training Loss: 0.00669
Epoch: 46478, Training Loss: 0.00627
Epoch: 46478, Training Loss: 0.00765
Epoch: 46478, Training Loss: 0.00594
Epoch: 46479, Training Loss: 0.00669
Epoch: 46479, Training Loss: 0.00627
Epoch: 46479, Training Loss: 0.00765
Epoch: 46479, Training Loss: 0.00594
Epoch: 46480, Training Loss: 0.00669
Epoch: 46480, Training Loss: 0.00627
Epoch: 46480, Training Loss: 0.00765
Epoch: 46480, Training Loss: 0.00594
Epoch: 46481, Training Loss: 0.00669
Epoch: 46481, Training Loss: 0.00627
Epoch: 46481, Training Loss: 0.00765
Epoch: 46481, Training Loss: 0.00594
Epoch: 46482, Training Loss: 0.00669
Epoch: 46482, Training Loss: 0.00627
Epoch: 46482, Training Loss: 0.00765
Epoch: 46482, Training Loss: 0.00594
Epoch: 46483, Training Loss: 0.00669
Epoch: 46483, Training Loss: 0.00627
Epoch: 46483, Training Loss: 0.00765
Epoch: 46483, Training Loss: 0.00594
Epoch: 46484, Training Loss: 0.00669
Epoch: 46484, Training Loss: 0.00627
Epoch: 46484, Training Loss: 0.00765
Epoch: 46484, Training Loss: 0.00594
Epoch: 46485, Training Loss: 0.00669
Epoch: 46485, Training Loss: 0.00627
Epoch: 46485, Training Loss: 0.00765
Epoch: 46485, Training Loss: 0.00594
Epoch: 46486, Training Loss: 0.00669
Epoch: 46486, Training Loss: 0.00627
Epoch: 46486, Training Loss: 0.00765
Epoch: 46486, Training Loss: 0.00594
Epoch: 46487, Training Loss: 0.00669
Epoch: 46487, Training Loss: 0.00627
Epoch: 46487, Training Loss: 0.00765
Epoch: 46487, Training Loss: 0.00594
Epoch: 46488, Training Loss: 0.00669
Epoch: 46488, Training Loss: 0.00627
Epoch: 46488, Training Loss: 0.00765
Epoch: 46488, Training Loss: 0.00594
Epoch: 46489, Training Loss: 0.00669
Epoch: 46489, Training Loss: 0.00627
Epoch: 46489, Training Loss: 0.00765
Epoch: 46489, Training Loss: 0.00594
Epoch: 46490, Training Loss: 0.00669
Epoch: 46490, Training Loss: 0.00627
Epoch: 46490, Training Loss: 0.00765
Epoch: 46490, Training Loss: 0.00594
Epoch: 46491, Training Loss: 0.00669
Epoch: 46491, Training Loss: 0.00627
Epoch: 46491, Training Loss: 0.00765
Epoch: 46491, Training Loss: 0.00594
Epoch: 46492, Training Loss: 0.00669
Epoch: 46492, Training Loss: 0.00627
Epoch: 46492, Training Loss: 0.00765
Epoch: 46492, Training Loss: 0.00594
Epoch: 46493, Training Loss: 0.00669
Epoch: 46493, Training Loss: 0.00627
Epoch: 46493, Training Loss: 0.00765
Epoch: 46493, Training Loss: 0.00594
Epoch: 46494, Training Loss: 0.00669
Epoch: 46494, Training Loss: 0.00627
Epoch: 46494, Training Loss: 0.00765
Epoch: 46494, Training Loss: 0.00594
Epoch: 46495, Training Loss: 0.00669
Epoch: 46495, Training Loss: 0.00627
Epoch: 46495, Training Loss: 0.00765
Epoch: 46495, Training Loss: 0.00594
Epoch: 46496, Training Loss: 0.00669
Epoch: 46496, Training Loss: 0.00627
Epoch: 46496, Training Loss: 0.00765
Epoch: 46496, Training Loss: 0.00594
Epoch: 46497, Training Loss: 0.00669
Epoch: 46497, Training Loss: 0.00627
Epoch: 46497, Training Loss: 0.00765
Epoch: 46497, Training Loss: 0.00594
Epoch: 46498, Training Loss: 0.00669
Epoch: 46498, Training Loss: 0.00627
Epoch: 46498, Training Loss: 0.00765
Epoch: 46498, Training Loss: 0.00594
Epoch: 46499, Training Loss: 0.00669
Epoch: 46499, Training Loss: 0.00627
Epoch: 46499, Training Loss: 0.00765
Epoch: 46499, Training Loss: 0.00594
Epoch: 46500, Training Loss: 0.00669
Epoch: 46500, Training Loss: 0.00627
Epoch: 46500, Training Loss: 0.00765
Epoch: 46500, Training Loss: 0.00594
Epoch: 46501, Training Loss: 0.00669
Epoch: 46501, Training Loss: 0.00627
Epoch: 46501, Training Loss: 0.00765
Epoch: 46501, Training Loss: 0.00594
Epoch: 46502, Training Loss: 0.00669
Epoch: 46502, Training Loss: 0.00627
Epoch: 46502, Training Loss: 0.00765
Epoch: 46502, Training Loss: 0.00594
Epoch: 46503, Training Loss: 0.00669
Epoch: 46503, Training Loss: 0.00627
Epoch: 46503, Training Loss: 0.00765
Epoch: 46503, Training Loss: 0.00594
Epoch: 46504, Training Loss: 0.00669
Epoch: 46504, Training Loss: 0.00627
Epoch: 46504, Training Loss: 0.00765
Epoch: 46504, Training Loss: 0.00594
Epoch: 46505, Training Loss: 0.00669
Epoch: 46505, Training Loss: 0.00627
Epoch: 46505, Training Loss: 0.00765
Epoch: 46505, Training Loss: 0.00594
Epoch: 46506, Training Loss: 0.00669
Epoch: 46506, Training Loss: 0.00627
Epoch: 46506, Training Loss: 0.00765
Epoch: 46506, Training Loss: 0.00594
Epoch: 46507, Training Loss: 0.00669
Epoch: 46507, Training Loss: 0.00627
Epoch: 46507, Training Loss: 0.00765
Epoch: 46507, Training Loss: 0.00594
Epoch: 46508, Training Loss: 0.00669
Epoch: 46508, Training Loss: 0.00627
Epoch: 46508, Training Loss: 0.00765
Epoch: 46508, Training Loss: 0.00594
Epoch: 46509, Training Loss: 0.00669
Epoch: 46509, Training Loss: 0.00627
Epoch: 46509, Training Loss: 0.00765
Epoch: 46509, Training Loss: 0.00594
Epoch: 46510, Training Loss: 0.00669
Epoch: 46510, Training Loss: 0.00627
Epoch: 46510, Training Loss: 0.00765
Epoch: 46510, Training Loss: 0.00594
Epoch: 46511, Training Loss: 0.00669
Epoch: 46511, Training Loss: 0.00627
Epoch: 46511, Training Loss: 0.00765
Epoch: 46511, Training Loss: 0.00594
Epoch: 46512, Training Loss: 0.00669
Epoch: 46512, Training Loss: 0.00627
Epoch: 46512, Training Loss: 0.00765
Epoch: 46512, Training Loss: 0.00594
Epoch: 46513, Training Loss: 0.00669
Epoch: 46513, Training Loss: 0.00627
Epoch: 46513, Training Loss: 0.00765
Epoch: 46513, Training Loss: 0.00594
Epoch: 46514, Training Loss: 0.00669
Epoch: 46514, Training Loss: 0.00627
Epoch: 46514, Training Loss: 0.00765
Epoch: 46514, Training Loss: 0.00594
Epoch: 46515, Training Loss: 0.00669
Epoch: 46515, Training Loss: 0.00627
Epoch: 46515, Training Loss: 0.00765
Epoch: 46515, Training Loss: 0.00594
Epoch: 46516, Training Loss: 0.00669
Epoch: 46516, Training Loss: 0.00627
Epoch: 46516, Training Loss: 0.00765
Epoch: 46516, Training Loss: 0.00594
Epoch: 46517, Training Loss: 0.00669
Epoch: 46517, Training Loss: 0.00627
Epoch: 46517, Training Loss: 0.00765
Epoch: 46517, Training Loss: 0.00594
Epoch: 46518, Training Loss: 0.00669
Epoch: 46518, Training Loss: 0.00627
Epoch: 46518, Training Loss: 0.00765
Epoch: 46518, Training Loss: 0.00594
Epoch: 46519, Training Loss: 0.00669
Epoch: 46519, Training Loss: 0.00627
Epoch: 46519, Training Loss: 0.00765
Epoch: 46519, Training Loss: 0.00594
Epoch: 46520, Training Loss: 0.00669
Epoch: 46520, Training Loss: 0.00627
Epoch: 46520, Training Loss: 0.00765
Epoch: 46520, Training Loss: 0.00594
Epoch: 46521, Training Loss: 0.00669
Epoch: 46521, Training Loss: 0.00627
Epoch: 46521, Training Loss: 0.00765
Epoch: 46521, Training Loss: 0.00594
Epoch: 46522, Training Loss: 0.00669
Epoch: 46522, Training Loss: 0.00627
Epoch: 46522, Training Loss: 0.00765
Epoch: 46522, Training Loss: 0.00594
Epoch: 46523, Training Loss: 0.00669
Epoch: 46523, Training Loss: 0.00627
Epoch: 46523, Training Loss: 0.00765
Epoch: 46523, Training Loss: 0.00594
Epoch: 46524, Training Loss: 0.00669
Epoch: 46524, Training Loss: 0.00627
Epoch: 46524, Training Loss: 0.00765
Epoch: 46524, Training Loss: 0.00594
Epoch: 46525, Training Loss: 0.00669
Epoch: 46525, Training Loss: 0.00627
Epoch: 46525, Training Loss: 0.00765
Epoch: 46525, Training Loss: 0.00594
Epoch: 46526, Training Loss: 0.00669
Epoch: 46526, Training Loss: 0.00627
Epoch: 46526, Training Loss: 0.00765
Epoch: 46526, Training Loss: 0.00593
Epoch: 46527, Training Loss: 0.00669
Epoch: 46527, Training Loss: 0.00627
Epoch: 46527, Training Loss: 0.00765
Epoch: 46527, Training Loss: 0.00593
Epoch: 46528, Training Loss: 0.00669
Epoch: 46528, Training Loss: 0.00627
Epoch: 46528, Training Loss: 0.00765
Epoch: 46528, Training Loss: 0.00593
Epoch: 46529, Training Loss: 0.00669
Epoch: 46529, Training Loss: 0.00627
Epoch: 46529, Training Loss: 0.00765
Epoch: 46529, Training Loss: 0.00593
Epoch: 46530, Training Loss: 0.00669
Epoch: 46530, Training Loss: 0.00627
Epoch: 46530, Training Loss: 0.00765
Epoch: 46530, Training Loss: 0.00593
Epoch: 46531, Training Loss: 0.00669
Epoch: 46531, Training Loss: 0.00627
Epoch: 46531, Training Loss: 0.00765
Epoch: 46531, Training Loss: 0.00593
Epoch: 46532, Training Loss: 0.00669
Epoch: 46532, Training Loss: 0.00627
Epoch: 46532, Training Loss: 0.00765
Epoch: 46532, Training Loss: 0.00593
Epoch: 46533, Training Loss: 0.00669
Epoch: 46533, Training Loss: 0.00627
Epoch: 46533, Training Loss: 0.00765
Epoch: 46533, Training Loss: 0.00593
Epoch: 46534, Training Loss: 0.00669
Epoch: 46534, Training Loss: 0.00627
Epoch: 46534, Training Loss: 0.00765
Epoch: 46534, Training Loss: 0.00593
... [per-pattern loss lines for epochs 46535-46776 omitted; the four losses shrink only in the fifth decimal place over this range] ...
Epoch: 46777, Training Loss: 0.00667
Epoch: 46777, Training Loss: 0.00625
Epoch: 46777, Training Loss: 0.00763
Epoch: 46777, Training Loss: 0.00592
Epoch: 46778, Training Loss: 0.00667
Epoch: 46778, Training Loss: 0.00625
Epoch: 46778, Training Loss: 0.00763
Epoch: 46778, Training Loss: 0.00592
Epoch: 46779, Training Loss: 0.00667
Epoch: 46779, Training Loss: 0.00625
Epoch: 46779, Training Loss: 0.00763
Epoch: 46779, Training Loss: 0.00592
Epoch: 46780, Training Loss: 0.00667
Epoch: 46780, Training Loss: 0.00625
Epoch: 46780, Training Loss: 0.00763
Epoch: 46780, Training Loss: 0.00592
Epoch: 46781, Training Loss: 0.00667
Epoch: 46781, Training Loss: 0.00625
Epoch: 46781, Training Loss: 0.00763
Epoch: 46781, Training Loss: 0.00592
Epoch: 46782, Training Loss: 0.00667
Epoch: 46782, Training Loss: 0.00625
Epoch: 46782, Training Loss: 0.00763
Epoch: 46782, Training Loss: 0.00592
Epoch: 46783, Training Loss: 0.00667
Epoch: 46783, Training Loss: 0.00625
Epoch: 46783, Training Loss: 0.00763
Epoch: 46783, Training Loss: 0.00592
Epoch: 46784, Training Loss: 0.00667
Epoch: 46784, Training Loss: 0.00625
Epoch: 46784, Training Loss: 0.00763
Epoch: 46784, Training Loss: 0.00592
Epoch: 46785, Training Loss: 0.00667
Epoch: 46785, Training Loss: 0.00625
Epoch: 46785, Training Loss: 0.00763
Epoch: 46785, Training Loss: 0.00592
Epoch: 46786, Training Loss: 0.00667
Epoch: 46786, Training Loss: 0.00625
Epoch: 46786, Training Loss: 0.00763
Epoch: 46786, Training Loss: 0.00592
Epoch: 46787, Training Loss: 0.00667
Epoch: 46787, Training Loss: 0.00625
Epoch: 46787, Training Loss: 0.00763
Epoch: 46787, Training Loss: 0.00592
Epoch: 46788, Training Loss: 0.00667
Epoch: 46788, Training Loss: 0.00625
Epoch: 46788, Training Loss: 0.00762
Epoch: 46788, Training Loss: 0.00592
Epoch: 46789, Training Loss: 0.00667
Epoch: 46789, Training Loss: 0.00625
Epoch: 46789, Training Loss: 0.00762
Epoch: 46789, Training Loss: 0.00592
Epoch: 46790, Training Loss: 0.00667
Epoch: 46790, Training Loss: 0.00625
Epoch: 46790, Training Loss: 0.00762
Epoch: 46790, Training Loss: 0.00592
Epoch: 46791, Training Loss: 0.00667
Epoch: 46791, Training Loss: 0.00625
Epoch: 46791, Training Loss: 0.00762
Epoch: 46791, Training Loss: 0.00592
Epoch: 46792, Training Loss: 0.00667
Epoch: 46792, Training Loss: 0.00625
Epoch: 46792, Training Loss: 0.00762
Epoch: 46792, Training Loss: 0.00592
Epoch: 46793, Training Loss: 0.00667
Epoch: 46793, Training Loss: 0.00625
Epoch: 46793, Training Loss: 0.00762
Epoch: 46793, Training Loss: 0.00592
Epoch: 46794, Training Loss: 0.00667
Epoch: 46794, Training Loss: 0.00625
Epoch: 46794, Training Loss: 0.00762
Epoch: 46794, Training Loss: 0.00592
Epoch: 46795, Training Loss: 0.00667
Epoch: 46795, Training Loss: 0.00625
Epoch: 46795, Training Loss: 0.00762
Epoch: 46795, Training Loss: 0.00592
Epoch: 46796, Training Loss: 0.00667
Epoch: 46796, Training Loss: 0.00625
Epoch: 46796, Training Loss: 0.00762
Epoch: 46796, Training Loss: 0.00592
Epoch: 46797, Training Loss: 0.00667
Epoch: 46797, Training Loss: 0.00625
Epoch: 46797, Training Loss: 0.00762
Epoch: 46797, Training Loss: 0.00592
Epoch: 46798, Training Loss: 0.00667
Epoch: 46798, Training Loss: 0.00625
Epoch: 46798, Training Loss: 0.00762
Epoch: 46798, Training Loss: 0.00592
Epoch: 46799, Training Loss: 0.00667
Epoch: 46799, Training Loss: 0.00625
Epoch: 46799, Training Loss: 0.00762
Epoch: 46799, Training Loss: 0.00592
Epoch: 46800, Training Loss: 0.00667
Epoch: 46800, Training Loss: 0.00625
Epoch: 46800, Training Loss: 0.00762
Epoch: 46800, Training Loss: 0.00592
Epoch: 46801, Training Loss: 0.00667
Epoch: 46801, Training Loss: 0.00625
Epoch: 46801, Training Loss: 0.00762
Epoch: 46801, Training Loss: 0.00592
Epoch: 46802, Training Loss: 0.00667
Epoch: 46802, Training Loss: 0.00625
Epoch: 46802, Training Loss: 0.00762
Epoch: 46802, Training Loss: 0.00592
Epoch: 46803, Training Loss: 0.00667
Epoch: 46803, Training Loss: 0.00625
Epoch: 46803, Training Loss: 0.00762
Epoch: 46803, Training Loss: 0.00592
Epoch: 46804, Training Loss: 0.00667
Epoch: 46804, Training Loss: 0.00625
Epoch: 46804, Training Loss: 0.00762
Epoch: 46804, Training Loss: 0.00592
Epoch: 46805, Training Loss: 0.00667
Epoch: 46805, Training Loss: 0.00625
Epoch: 46805, Training Loss: 0.00762
Epoch: 46805, Training Loss: 0.00592
Epoch: 46806, Training Loss: 0.00667
Epoch: 46806, Training Loss: 0.00625
Epoch: 46806, Training Loss: 0.00762
Epoch: 46806, Training Loss: 0.00592
Epoch: 46807, Training Loss: 0.00667
Epoch: 46807, Training Loss: 0.00625
Epoch: 46807, Training Loss: 0.00762
Epoch: 46807, Training Loss: 0.00592
Epoch: 46808, Training Loss: 0.00667
Epoch: 46808, Training Loss: 0.00625
Epoch: 46808, Training Loss: 0.00762
Epoch: 46808, Training Loss: 0.00592
Epoch: 46809, Training Loss: 0.00667
Epoch: 46809, Training Loss: 0.00625
Epoch: 46809, Training Loss: 0.00762
Epoch: 46809, Training Loss: 0.00592
Epoch: 46810, Training Loss: 0.00667
Epoch: 46810, Training Loss: 0.00625
Epoch: 46810, Training Loss: 0.00762
Epoch: 46810, Training Loss: 0.00592
Epoch: 46811, Training Loss: 0.00667
Epoch: 46811, Training Loss: 0.00625
Epoch: 46811, Training Loss: 0.00762
Epoch: 46811, Training Loss: 0.00592
Epoch: 46812, Training Loss: 0.00667
Epoch: 46812, Training Loss: 0.00625
Epoch: 46812, Training Loss: 0.00762
Epoch: 46812, Training Loss: 0.00592
Epoch: 46813, Training Loss: 0.00667
Epoch: 46813, Training Loss: 0.00625
Epoch: 46813, Training Loss: 0.00762
Epoch: 46813, Training Loss: 0.00592
Epoch: 46814, Training Loss: 0.00667
Epoch: 46814, Training Loss: 0.00625
Epoch: 46814, Training Loss: 0.00762
Epoch: 46814, Training Loss: 0.00592
Epoch: 46815, Training Loss: 0.00667
Epoch: 46815, Training Loss: 0.00625
Epoch: 46815, Training Loss: 0.00762
Epoch: 46815, Training Loss: 0.00592
Epoch: 46816, Training Loss: 0.00667
Epoch: 46816, Training Loss: 0.00625
Epoch: 46816, Training Loss: 0.00762
Epoch: 46816, Training Loss: 0.00592
Epoch: 46817, Training Loss: 0.00667
Epoch: 46817, Training Loss: 0.00625
Epoch: 46817, Training Loss: 0.00762
Epoch: 46817, Training Loss: 0.00592
Epoch: 46818, Training Loss: 0.00667
Epoch: 46818, Training Loss: 0.00625
Epoch: 46818, Training Loss: 0.00762
Epoch: 46818, Training Loss: 0.00592
Epoch: 46819, Training Loss: 0.00667
Epoch: 46819, Training Loss: 0.00625
Epoch: 46819, Training Loss: 0.00762
Epoch: 46819, Training Loss: 0.00592
Epoch: 46820, Training Loss: 0.00667
Epoch: 46820, Training Loss: 0.00625
Epoch: 46820, Training Loss: 0.00762
Epoch: 46820, Training Loss: 0.00592
Epoch: 46821, Training Loss: 0.00667
Epoch: 46821, Training Loss: 0.00625
Epoch: 46821, Training Loss: 0.00762
Epoch: 46821, Training Loss: 0.00592
Epoch: 46822, Training Loss: 0.00667
Epoch: 46822, Training Loss: 0.00625
Epoch: 46822, Training Loss: 0.00762
Epoch: 46822, Training Loss: 0.00592
Epoch: 46823, Training Loss: 0.00667
Epoch: 46823, Training Loss: 0.00625
Epoch: 46823, Training Loss: 0.00762
Epoch: 46823, Training Loss: 0.00592
Epoch: 46824, Training Loss: 0.00667
Epoch: 46824, Training Loss: 0.00625
Epoch: 46824, Training Loss: 0.00762
Epoch: 46824, Training Loss: 0.00592
Epoch: 46825, Training Loss: 0.00667
Epoch: 46825, Training Loss: 0.00625
Epoch: 46825, Training Loss: 0.00762
Epoch: 46825, Training Loss: 0.00592
Epoch: 46826, Training Loss: 0.00667
Epoch: 46826, Training Loss: 0.00625
Epoch: 46826, Training Loss: 0.00762
Epoch: 46826, Training Loss: 0.00592
Epoch: 46827, Training Loss: 0.00667
Epoch: 46827, Training Loss: 0.00625
Epoch: 46827, Training Loss: 0.00762
Epoch: 46827, Training Loss: 0.00592
Epoch: 46828, Training Loss: 0.00667
Epoch: 46828, Training Loss: 0.00625
Epoch: 46828, Training Loss: 0.00762
Epoch: 46828, Training Loss: 0.00592
Epoch: 46829, Training Loss: 0.00667
Epoch: 46829, Training Loss: 0.00625
Epoch: 46829, Training Loss: 0.00762
Epoch: 46829, Training Loss: 0.00592
Epoch: 46830, Training Loss: 0.00667
Epoch: 46830, Training Loss: 0.00625
Epoch: 46830, Training Loss: 0.00762
Epoch: 46830, Training Loss: 0.00591
Epoch: 46831, Training Loss: 0.00667
Epoch: 46831, Training Loss: 0.00625
Epoch: 46831, Training Loss: 0.00762
Epoch: 46831, Training Loss: 0.00591
Epoch: 46832, Training Loss: 0.00667
Epoch: 46832, Training Loss: 0.00625
Epoch: 46832, Training Loss: 0.00762
Epoch: 46832, Training Loss: 0.00591
Epoch: 46833, Training Loss: 0.00667
Epoch: 46833, Training Loss: 0.00625
Epoch: 46833, Training Loss: 0.00762
Epoch: 46833, Training Loss: 0.00591
Epoch: 46834, Training Loss: 0.00667
Epoch: 46834, Training Loss: 0.00625
Epoch: 46834, Training Loss: 0.00762
Epoch: 46834, Training Loss: 0.00591
Epoch: 46835, Training Loss: 0.00667
Epoch: 46835, Training Loss: 0.00625
Epoch: 46835, Training Loss: 0.00762
Epoch: 46835, Training Loss: 0.00591
Epoch: 46836, Training Loss: 0.00667
Epoch: 46836, Training Loss: 0.00625
Epoch: 46836, Training Loss: 0.00762
Epoch: 46836, Training Loss: 0.00591
Epoch: 46837, Training Loss: 0.00667
Epoch: 46837, Training Loss: 0.00625
Epoch: 46837, Training Loss: 0.00762
Epoch: 46837, Training Loss: 0.00591
Epoch: 46838, Training Loss: 0.00667
Epoch: 46838, Training Loss: 0.00625
Epoch: 46838, Training Loss: 0.00762
Epoch: 46838, Training Loss: 0.00591
Epoch: 46839, Training Loss: 0.00667
Epoch: 46839, Training Loss: 0.00625
Epoch: 46839, Training Loss: 0.00762
Epoch: 46839, Training Loss: 0.00591
Epoch: 46840, Training Loss: 0.00667
Epoch: 46840, Training Loss: 0.00625
Epoch: 46840, Training Loss: 0.00762
Epoch: 46840, Training Loss: 0.00591
Epoch: 46841, Training Loss: 0.00667
Epoch: 46841, Training Loss: 0.00625
Epoch: 46841, Training Loss: 0.00762
Epoch: 46841, Training Loss: 0.00591
Epoch: 46842, Training Loss: 0.00667
Epoch: 46842, Training Loss: 0.00625
Epoch: 46842, Training Loss: 0.00762
Epoch: 46842, Training Loss: 0.00591
Epoch: 46843, Training Loss: 0.00667
Epoch: 46843, Training Loss: 0.00625
Epoch: 46843, Training Loss: 0.00762
Epoch: 46843, Training Loss: 0.00591
Epoch: 46844, Training Loss: 0.00667
Epoch: 46844, Training Loss: 0.00625
Epoch: 46844, Training Loss: 0.00762
Epoch: 46844, Training Loss: 0.00591
Epoch: 46845, Training Loss: 0.00667
Epoch: 46845, Training Loss: 0.00625
Epoch: 46845, Training Loss: 0.00762
Epoch: 46845, Training Loss: 0.00591
Epoch: 46846, Training Loss: 0.00667
Epoch: 46846, Training Loss: 0.00625
Epoch: 46846, Training Loss: 0.00762
Epoch: 46846, Training Loss: 0.00591
Epoch: 46847, Training Loss: 0.00667
Epoch: 46847, Training Loss: 0.00625
Epoch: 46847, Training Loss: 0.00762
Epoch: 46847, Training Loss: 0.00591
Epoch: 46848, Training Loss: 0.00667
Epoch: 46848, Training Loss: 0.00625
Epoch: 46848, Training Loss: 0.00762
Epoch: 46848, Training Loss: 0.00591
Epoch: 46849, Training Loss: 0.00667
Epoch: 46849, Training Loss: 0.00625
Epoch: 46849, Training Loss: 0.00762
Epoch: 46849, Training Loss: 0.00591
Epoch: 46850, Training Loss: 0.00667
Epoch: 46850, Training Loss: 0.00625
Epoch: 46850, Training Loss: 0.00762
Epoch: 46850, Training Loss: 0.00591
Epoch: 46851, Training Loss: 0.00667
Epoch: 46851, Training Loss: 0.00625
Epoch: 46851, Training Loss: 0.00762
Epoch: 46851, Training Loss: 0.00591
Epoch: 46852, Training Loss: 0.00667
Epoch: 46852, Training Loss: 0.00625
Epoch: 46852, Training Loss: 0.00762
Epoch: 46852, Training Loss: 0.00591
Epoch: 46853, Training Loss: 0.00667
Epoch: 46853, Training Loss: 0.00625
Epoch: 46853, Training Loss: 0.00762
Epoch: 46853, Training Loss: 0.00591
Epoch: 46854, Training Loss: 0.00667
Epoch: 46854, Training Loss: 0.00625
Epoch: 46854, Training Loss: 0.00762
Epoch: 46854, Training Loss: 0.00591
Epoch: 46855, Training Loss: 0.00667
Epoch: 46855, Training Loss: 0.00625
Epoch: 46855, Training Loss: 0.00762
Epoch: 46855, Training Loss: 0.00591
Epoch: 46856, Training Loss: 0.00667
Epoch: 46856, Training Loss: 0.00625
Epoch: 46856, Training Loss: 0.00762
Epoch: 46856, Training Loss: 0.00591
Epoch: 46857, Training Loss: 0.00667
Epoch: 46857, Training Loss: 0.00625
Epoch: 46857, Training Loss: 0.00762
Epoch: 46857, Training Loss: 0.00591
Epoch: 46858, Training Loss: 0.00667
Epoch: 46858, Training Loss: 0.00625
Epoch: 46858, Training Loss: 0.00762
Epoch: 46858, Training Loss: 0.00591
Epoch: 46859, Training Loss: 0.00667
Epoch: 46859, Training Loss: 0.00625
Epoch: 46859, Training Loss: 0.00762
Epoch: 46859, Training Loss: 0.00591
Epoch: 46860, Training Loss: 0.00667
Epoch: 46860, Training Loss: 0.00625
Epoch: 46860, Training Loss: 0.00762
Epoch: 46860, Training Loss: 0.00591
Epoch: 46861, Training Loss: 0.00667
Epoch: 46861, Training Loss: 0.00625
Epoch: 46861, Training Loss: 0.00762
Epoch: 46861, Training Loss: 0.00591
Epoch: 46862, Training Loss: 0.00667
Epoch: 46862, Training Loss: 0.00625
Epoch: 46862, Training Loss: 0.00762
Epoch: 46862, Training Loss: 0.00591
Epoch: 46863, Training Loss: 0.00667
Epoch: 46863, Training Loss: 0.00625
Epoch: 46863, Training Loss: 0.00762
Epoch: 46863, Training Loss: 0.00591
Epoch: 46864, Training Loss: 0.00667
Epoch: 46864, Training Loss: 0.00625
Epoch: 46864, Training Loss: 0.00762
Epoch: 46864, Training Loss: 0.00591
Epoch: 46865, Training Loss: 0.00667
Epoch: 46865, Training Loss: 0.00625
Epoch: 46865, Training Loss: 0.00762
Epoch: 46865, Training Loss: 0.00591
Epoch: 46866, Training Loss: 0.00667
Epoch: 46866, Training Loss: 0.00625
Epoch: 46866, Training Loss: 0.00762
Epoch: 46866, Training Loss: 0.00591
Epoch: 46867, Training Loss: 0.00667
Epoch: 46867, Training Loss: 0.00625
Epoch: 46867, Training Loss: 0.00762
Epoch: 46867, Training Loss: 0.00591
Epoch: 46868, Training Loss: 0.00667
Epoch: 46868, Training Loss: 0.00625
Epoch: 46868, Training Loss: 0.00762
Epoch: 46868, Training Loss: 0.00591
Epoch: 46869, Training Loss: 0.00667
Epoch: 46869, Training Loss: 0.00625
Epoch: 46869, Training Loss: 0.00762
Epoch: 46869, Training Loss: 0.00591
Epoch: 46870, Training Loss: 0.00667
Epoch: 46870, Training Loss: 0.00625
Epoch: 46870, Training Loss: 0.00762
Epoch: 46870, Training Loss: 0.00591
Epoch: 46871, Training Loss: 0.00667
Epoch: 46871, Training Loss: 0.00625
Epoch: 46871, Training Loss: 0.00762
Epoch: 46871, Training Loss: 0.00591
Epoch: 46872, Training Loss: 0.00666
Epoch: 46872, Training Loss: 0.00624
Epoch: 46872, Training Loss: 0.00762
Epoch: 46872, Training Loss: 0.00591
Epoch: 46873, Training Loss: 0.00666
Epoch: 46873, Training Loss: 0.00624
Epoch: 46873, Training Loss: 0.00762
Epoch: 46873, Training Loss: 0.00591
Epoch: 46874, Training Loss: 0.00666
Epoch: 46874, Training Loss: 0.00624
Epoch: 46874, Training Loss: 0.00762
Epoch: 46874, Training Loss: 0.00591
Epoch: 46875, Training Loss: 0.00666
Epoch: 46875, Training Loss: 0.00624
Epoch: 46875, Training Loss: 0.00762
Epoch: 46875, Training Loss: 0.00591
Epoch: 46876, Training Loss: 0.00666
Epoch: 46876, Training Loss: 0.00624
Epoch: 46876, Training Loss: 0.00762
Epoch: 46876, Training Loss: 0.00591
Epoch: 46877, Training Loss: 0.00666
Epoch: 46877, Training Loss: 0.00624
Epoch: 46877, Training Loss: 0.00762
Epoch: 46877, Training Loss: 0.00591
Epoch: 46878, Training Loss: 0.00666
Epoch: 46878, Training Loss: 0.00624
Epoch: 46878, Training Loss: 0.00762
Epoch: 46878, Training Loss: 0.00591
Epoch: 46879, Training Loss: 0.00666
Epoch: 46879, Training Loss: 0.00624
Epoch: 46879, Training Loss: 0.00762
Epoch: 46879, Training Loss: 0.00591
Epoch: 46880, Training Loss: 0.00666
Epoch: 46880, Training Loss: 0.00624
Epoch: 46880, Training Loss: 0.00762
Epoch: 46880, Training Loss: 0.00591
Epoch: 46881, Training Loss: 0.00666
Epoch: 46881, Training Loss: 0.00624
Epoch: 46881, Training Loss: 0.00762
Epoch: 46881, Training Loss: 0.00591
Epoch: 46882, Training Loss: 0.00666
Epoch: 46882, Training Loss: 0.00624
Epoch: 46882, Training Loss: 0.00762
Epoch: 46882, Training Loss: 0.00591
Epoch: 46883, Training Loss: 0.00666
Epoch: 46883, Training Loss: 0.00624
Epoch: 46883, Training Loss: 0.00762
Epoch: 46883, Training Loss: 0.00591
Epoch: 46884, Training Loss: 0.00666
Epoch: 46884, Training Loss: 0.00624
Epoch: 46884, Training Loss: 0.00762
Epoch: 46884, Training Loss: 0.00591
Epoch: 46885, Training Loss: 0.00666
Epoch: 46885, Training Loss: 0.00624
Epoch: 46885, Training Loss: 0.00762
Epoch: 46885, Training Loss: 0.00591
Epoch: 46886, Training Loss: 0.00666
Epoch: 46886, Training Loss: 0.00624
Epoch: 46886, Training Loss: 0.00762
Epoch: 46886, Training Loss: 0.00591
Epoch: 46887, Training Loss: 0.00666
Epoch: 46887, Training Loss: 0.00624
Epoch: 46887, Training Loss: 0.00762
Epoch: 46887, Training Loss: 0.00591
Epoch: 46888, Training Loss: 0.00666
Epoch: 46888, Training Loss: 0.00624
Epoch: 46888, Training Loss: 0.00762
Epoch: 46888, Training Loss: 0.00591
Epoch: 46889, Training Loss: 0.00666
Epoch: 46889, Training Loss: 0.00624
Epoch: 46889, Training Loss: 0.00762
Epoch: 46889, Training Loss: 0.00591
Epoch: 46890, Training Loss: 0.00666
Epoch: 46890, Training Loss: 0.00624
Epoch: 46890, Training Loss: 0.00762
Epoch: 46890, Training Loss: 0.00591
Epoch: 46891, Training Loss: 0.00666
Epoch: 46891, Training Loss: 0.00624
Epoch: 46891, Training Loss: 0.00762
Epoch: 46891, Training Loss: 0.00591
Epoch: 46892, Training Loss: 0.00666
Epoch: 46892, Training Loss: 0.00624
Epoch: 46892, Training Loss: 0.00762
Epoch: 46892, Training Loss: 0.00591
Epoch: 46893, Training Loss: 0.00666
Epoch: 46893, Training Loss: 0.00624
Epoch: 46893, Training Loss: 0.00762
Epoch: 46893, Training Loss: 0.00591
Epoch: 46894, Training Loss: 0.00666
Epoch: 46894, Training Loss: 0.00624
Epoch: 46894, Training Loss: 0.00762
Epoch: 46894, Training Loss: 0.00591
Epoch: 46895, Training Loss: 0.00666
Epoch: 46895, Training Loss: 0.00624
Epoch: 46895, Training Loss: 0.00762
Epoch: 46895, Training Loss: 0.00591
Epoch: 46896, Training Loss: 0.00666
Epoch: 46896, Training Loss: 0.00624
Epoch: 46896, Training Loss: 0.00762
Epoch: 46896, Training Loss: 0.00591
Epoch: 46897, Training Loss: 0.00666
Epoch: 46897, Training Loss: 0.00624
Epoch: 46897, Training Loss: 0.00762
Epoch: 46897, Training Loss: 0.00591
Epoch: 46898, Training Loss: 0.00666
Epoch: 46898, Training Loss: 0.00624
Epoch: 46898, Training Loss: 0.00762
Epoch: 46898, Training Loss: 0.00591
Epoch: 46899, Training Loss: 0.00666
Epoch: 46899, Training Loss: 0.00624
Epoch: 46899, Training Loss: 0.00762
Epoch: 46899, Training Loss: 0.00591
Epoch: 46900, Training Loss: 0.00666
Epoch: 46900, Training Loss: 0.00624
Epoch: 46900, Training Loss: 0.00762
Epoch: 46900, Training Loss: 0.00591
Epoch: 46901, Training Loss: 0.00666
Epoch: 46901, Training Loss: 0.00624
Epoch: 46901, Training Loss: 0.00762
Epoch: 46901, Training Loss: 0.00591
Epoch: 46902, Training Loss: 0.00666
Epoch: 46902, Training Loss: 0.00624
Epoch: 46902, Training Loss: 0.00762
Epoch: 46902, Training Loss: 0.00591
Epoch: 46903, Training Loss: 0.00666
Epoch: 46903, Training Loss: 0.00624
Epoch: 46903, Training Loss: 0.00762
Epoch: 46903, Training Loss: 0.00591
Epoch: 46904, Training Loss: 0.00666
Epoch: 46904, Training Loss: 0.00624
Epoch: 46904, Training Loss: 0.00762
Epoch: 46904, Training Loss: 0.00591
Epoch: 46905, Training Loss: 0.00666
Epoch: 46905, Training Loss: 0.00624
Epoch: 46905, Training Loss: 0.00761
Epoch: 46905, Training Loss: 0.00591
Epoch: 46906, Training Loss: 0.00666
Epoch: 46906, Training Loss: 0.00624
Epoch: 46906, Training Loss: 0.00761
Epoch: 46906, Training Loss: 0.00591
Epoch: 46907, Training Loss: 0.00666
Epoch: 46907, Training Loss: 0.00624
Epoch: 46907, Training Loss: 0.00761
Epoch: 46907, Training Loss: 0.00591
Epoch: 46908, Training Loss: 0.00666
Epoch: 46908, Training Loss: 0.00624
Epoch: 46908, Training Loss: 0.00761
Epoch: 46908, Training Loss: 0.00591
Epoch: 46909, Training Loss: 0.00666
Epoch: 46909, Training Loss: 0.00624
Epoch: 46909, Training Loss: 0.00761
Epoch: 46909, Training Loss: 0.00591
Epoch: 46910, Training Loss: 0.00666
Epoch: 46910, Training Loss: 0.00624
Epoch: 46910, Training Loss: 0.00761
Epoch: 46910, Training Loss: 0.00591
Epoch: 46911, Training Loss: 0.00666
Epoch: 46911, Training Loss: 0.00624
Epoch: 46911, Training Loss: 0.00761
Epoch: 46911, Training Loss: 0.00591
Epoch: 46912, Training Loss: 0.00666
Epoch: 46912, Training Loss: 0.00624
Epoch: 46912, Training Loss: 0.00761
Epoch: 46912, Training Loss: 0.00591
Epoch: 46913, Training Loss: 0.00666
Epoch: 46913, Training Loss: 0.00624
Epoch: 46913, Training Loss: 0.00761
Epoch: 46913, Training Loss: 0.00591
Epoch: 46914, Training Loss: 0.00666
Epoch: 46914, Training Loss: 0.00624
Epoch: 46914, Training Loss: 0.00761
Epoch: 46914, Training Loss: 0.00591
Epoch: 46915, Training Loss: 0.00666
Epoch: 46915, Training Loss: 0.00624
Epoch: 46915, Training Loss: 0.00761
Epoch: 46915, Training Loss: 0.00591
Epoch: 46916, Training Loss: 0.00666
Epoch: 46916, Training Loss: 0.00624
Epoch: 46916, Training Loss: 0.00761
Epoch: 46916, Training Loss: 0.00591
Epoch: 46917, Training Loss: 0.00666
Epoch: 46917, Training Loss: 0.00624
Epoch: 46917, Training Loss: 0.00761
Epoch: 46917, Training Loss: 0.00591
Epoch: 46918, Training Loss: 0.00666
Epoch: 46918, Training Loss: 0.00624
Epoch: 46918, Training Loss: 0.00761
Epoch: 46918, Training Loss: 0.00591
Epoch: 46919, Training Loss: 0.00666
Epoch: 46919, Training Loss: 0.00624
Epoch: 46919, Training Loss: 0.00761
Epoch: 46919, Training Loss: 0.00591
Epoch: 46920, Training Loss: 0.00666
Epoch: 46920, Training Loss: 0.00624
Epoch: 46920, Training Loss: 0.00761
Epoch: 46920, Training Loss: 0.00591
Epoch: 46921, Training Loss: 0.00666
Epoch: 46921, Training Loss: 0.00624
Epoch: 46921, Training Loss: 0.00761
Epoch: 46921, Training Loss: 0.00591
Epoch: 46922, Training Loss: 0.00666
Epoch: 46922, Training Loss: 0.00624
Epoch: 46922, Training Loss: 0.00761
Epoch: 46922, Training Loss: 0.00591
Epoch: 46923, Training Loss: 0.00666
Epoch: 46923, Training Loss: 0.00624
Epoch: 46923, Training Loss: 0.00761
Epoch: 46923, Training Loss: 0.00591
Epoch: 46924, Training Loss: 0.00666
Epoch: 46924, Training Loss: 0.00624
Epoch: 46924, Training Loss: 0.00761
Epoch: 46924, Training Loss: 0.00591
Epoch: 46925, Training Loss: 0.00666
Epoch: 46925, Training Loss: 0.00624
Epoch: 46925, Training Loss: 0.00761
Epoch: 46925, Training Loss: 0.00591
Epoch: 46926, Training Loss: 0.00666
Epoch: 46926, Training Loss: 0.00624
Epoch: 46926, Training Loss: 0.00761
Epoch: 46926, Training Loss: 0.00591
Epoch: 46927, Training Loss: 0.00666
Epoch: 46927, Training Loss: 0.00624
Epoch: 46927, Training Loss: 0.00761
Epoch: 46927, Training Loss: 0.00591
Epoch: 46928, Training Loss: 0.00666
Epoch: 46928, Training Loss: 0.00624
Epoch: 46928, Training Loss: 0.00761
Epoch: 46928, Training Loss: 0.00591
Epoch: 46929, Training Loss: 0.00666
Epoch: 46929, Training Loss: 0.00624
Epoch: 46929, Training Loss: 0.00761
Epoch: 46929, Training Loss: 0.00591
Epoch: 46930, Training Loss: 0.00666
Epoch: 46930, Training Loss: 0.00624
Epoch: 46930, Training Loss: 0.00761
Epoch: 46930, Training Loss: 0.00591
Epoch: 46931, Training Loss: 0.00666
Epoch: 46931, Training Loss: 0.00624
Epoch: 46931, Training Loss: 0.00761
Epoch: 46931, Training Loss: 0.00591
Epoch: 46932, Training Loss: 0.00666
Epoch: 46932, Training Loss: 0.00624
Epoch: 46932, Training Loss: 0.00761
Epoch: 46932, Training Loss: 0.00591
Epoch: 46933, Training Loss: 0.00666
Epoch: 46933, Training Loss: 0.00624
Epoch: 46933, Training Loss: 0.00761
Epoch: 46933, Training Loss: 0.00591
Epoch: 46934, Training Loss: 0.00666
Epoch: 46934, Training Loss: 0.00624
Epoch: 46934, Training Loss: 0.00761
Epoch: 46934, Training Loss: 0.00591
Epoch: 46935, Training Loss: 0.00666
Epoch: 46935, Training Loss: 0.00624
Epoch: 46935, Training Loss: 0.00761
Epoch: 46935, Training Loss: 0.00591
Epoch: 46936, Training Loss: 0.00666
Epoch: 46936, Training Loss: 0.00624
Epoch: 46936, Training Loss: 0.00761
Epoch: 46936, Training Loss: 0.00591
Epoch: 46937, Training Loss: 0.00666
Epoch: 46937, Training Loss: 0.00624
Epoch: 46937, Training Loss: 0.00761
Epoch: 46937, Training Loss: 0.00591
Epoch: 46938, Training Loss: 0.00666
Epoch: 46938, Training Loss: 0.00624
Epoch: 46938, Training Loss: 0.00761
Epoch: 46938, Training Loss: 0.00591
Epoch: 46939, Training Loss: 0.00666
Epoch: 46939, Training Loss: 0.00624
Epoch: 46939, Training Loss: 0.00761
Epoch: 46939, Training Loss: 0.00591
Epoch: 46940, Training Loss: 0.00666
Epoch: 46940, Training Loss: 0.00624
Epoch: 46940, Training Loss: 0.00761
Epoch: 46940, Training Loss: 0.00591
Epoch: 46941, Training Loss: 0.00666
Epoch: 46941, Training Loss: 0.00624
Epoch: 46941, Training Loss: 0.00761
Epoch: 46941, Training Loss: 0.00591
Epoch: 46942, Training Loss: 0.00666
Epoch: 46942, Training Loss: 0.00624
Epoch: 46942, Training Loss: 0.00761
Epoch: 46942, Training Loss: 0.00591
Epoch: 46943, Training Loss: 0.00666
Epoch: 46943, Training Loss: 0.00624
Epoch: 46943, Training Loss: 0.00761
Epoch: 46943, Training Loss: 0.00591
Epoch: 46944, Training Loss: 0.00666
Epoch: 46944, Training Loss: 0.00624
Epoch: 46944, Training Loss: 0.00761
Epoch: 46944, Training Loss: 0.00591
Epoch: 46945, Training Loss: 0.00666
Epoch: 46945, Training Loss: 0.00624
Epoch: 46945, Training Loss: 0.00761
Epoch: 46945, Training Loss: 0.00591
Epoch: 46946, Training Loss: 0.00666
Epoch: 46946, Training Loss: 0.00624
Epoch: 46946, Training Loss: 0.00761
Epoch: 46946, Training Loss: 0.00591
Epoch: 46947, Training Loss: 0.00666
Epoch: 46947, Training Loss: 0.00624
Epoch: 46947, Training Loss: 0.00761
Epoch: 46947, Training Loss: 0.00591
Epoch: 46948, Training Loss: 0.00666
Epoch: 46948, Training Loss: 0.00624
Epoch: 46948, Training Loss: 0.00761
Epoch: 46948, Training Loss: 0.00591
Epoch: 46949, Training Loss: 0.00666
Epoch: 46949, Training Loss: 0.00624
Epoch: 46949, Training Loss: 0.00761
Epoch: 46949, Training Loss: 0.00591
Epoch: 46950, Training Loss: 0.00666
Epoch: 46950, Training Loss: 0.00624
Epoch: 46950, Training Loss: 0.00761
Epoch: 46950, Training Loss: 0.00591
Epoch: 46951, Training Loss: 0.00666
Epoch: 46951, Training Loss: 0.00624
Epoch: 46951, Training Loss: 0.00761
Epoch: 46951, Training Loss: 0.00591
Epoch: 46952, Training Loss: 0.00666
Epoch: 46952, Training Loss: 0.00624
Epoch: 46952, Training Loss: 0.00761
Epoch: 46952, Training Loss: 0.00591
Epoch: 46953, Training Loss: 0.00666
Epoch: 46953, Training Loss: 0.00624
Epoch: 46953, Training Loss: 0.00761
Epoch: 46953, Training Loss: 0.00591
Epoch: 46954, Training Loss: 0.00666
Epoch: 46954, Training Loss: 0.00624
Epoch: 46954, Training Loss: 0.00761
Epoch: 46954, Training Loss: 0.00591
Epoch: 46955, Training Loss: 0.00666
Epoch: 46955, Training Loss: 0.00624
Epoch: 46955, Training Loss: 0.00761
Epoch: 46955, Training Loss: 0.00591
Epoch: 46956, Training Loss: 0.00666
Epoch: 46956, Training Loss: 0.00624
Epoch: 46956, Training Loss: 0.00761
Epoch: 46956, Training Loss: 0.00591
Epoch: 46957, Training Loss: 0.00666
Epoch: 46957, Training Loss: 0.00624
Epoch: 46957, Training Loss: 0.00761
Epoch: 46957, Training Loss: 0.00591
Epoch: 46958, Training Loss: 0.00666
Epoch: 46958, Training Loss: 0.00624
Epoch: 46958, Training Loss: 0.00761
Epoch: 46958, Training Loss: 0.00591
Epoch: 46959, Training Loss: 0.00666
Epoch: 46959, Training Loss: 0.00624
Epoch: 46959, Training Loss: 0.00761
Epoch: 46959, Training Loss: 0.00591
Epoch: 46960, Training Loss: 0.00666
Epoch: 46960, Training Loss: 0.00624
Epoch: 46960, Training Loss: 0.00761
Epoch: 46960, Training Loss: 0.00591
Epoch: 46961, Training Loss: 0.00666
Epoch: 46961, Training Loss: 0.00624
Epoch: 46961, Training Loss: 0.00761
Epoch: 46961, Training Loss: 0.00591
Epoch: 46962, Training Loss: 0.00666
Epoch: 46962, Training Loss: 0.00624
Epoch: 46962, Training Loss: 0.00761
Epoch: 46962, Training Loss: 0.00591
Epoch: 46963, Training Loss: 0.00666
Epoch: 46963, Training Loss: 0.00624
Epoch: 46963, Training Loss: 0.00761
Epoch: 46963, Training Loss: 0.00591
Epoch: 46964, Training Loss: 0.00666
Epoch: 46964, Training Loss: 0.00624
Epoch: 46964, Training Loss: 0.00761
Epoch: 46964, Training Loss: 0.00591
Epoch: 46965, Training Loss: 0.00666
Epoch: 46965, Training Loss: 0.00624
Epoch: 46965, Training Loss: 0.00761
Epoch: 46965, Training Loss: 0.00591
Epoch: 46966, Training Loss: 0.00666
Epoch: 46966, Training Loss: 0.00624
Epoch: 46966, Training Loss: 0.00761
Epoch: 46966, Training Loss: 0.00591
Epoch: 46967, Training Loss: 0.00666
Epoch: 46967, Training Loss: 0.00624
Epoch: 46967, Training Loss: 0.00761
Epoch: 46967, Training Loss: 0.00591
... [output truncated: one loss line is printed per training sample (four per epoch); over epochs 46966-47210 the per-sample losses decrease very slowly, e.g. 0.00761 -> 0.00759 and 0.00591 -> 0.00589] ...
Epoch: 47209, Training Loss: 0.00664
Epoch: 47209, Training Loss: 0.00622
Epoch: 47209, Training Loss: 0.00759
Epoch: 47209, Training Loss: 0.00589
Epoch: 47210, Training Loss: 0.00664
Epoch: 47210, Training Loss: 0.00622
Epoch: 47210, Training Loss: 0.00759
Epoch: 47210, Training Loss: 0.00589
Epoch: 47211, Training Loss: 0.00664
Epoch: 47211, Training Loss: 0.00622
Epoch: 47211, Training Loss: 0.00759
Epoch: 47211, Training Loss: 0.00589
Epoch: 47212, Training Loss: 0.00664
Epoch: 47212, Training Loss: 0.00622
Epoch: 47212, Training Loss: 0.00759
Epoch: 47212, Training Loss: 0.00589
Epoch: 47213, Training Loss: 0.00664
Epoch: 47213, Training Loss: 0.00622
Epoch: 47213, Training Loss: 0.00759
Epoch: 47213, Training Loss: 0.00589
Epoch: 47214, Training Loss: 0.00664
Epoch: 47214, Training Loss: 0.00622
Epoch: 47214, Training Loss: 0.00759
Epoch: 47214, Training Loss: 0.00589
Epoch: 47215, Training Loss: 0.00664
Epoch: 47215, Training Loss: 0.00622
Epoch: 47215, Training Loss: 0.00759
Epoch: 47215, Training Loss: 0.00589
Epoch: 47216, Training Loss: 0.00664
Epoch: 47216, Training Loss: 0.00622
Epoch: 47216, Training Loss: 0.00759
Epoch: 47216, Training Loss: 0.00589
Epoch: 47217, Training Loss: 0.00664
Epoch: 47217, Training Loss: 0.00622
Epoch: 47217, Training Loss: 0.00759
Epoch: 47217, Training Loss: 0.00589
Epoch: 47218, Training Loss: 0.00664
Epoch: 47218, Training Loss: 0.00622
Epoch: 47218, Training Loss: 0.00759
Epoch: 47218, Training Loss: 0.00589
Epoch: 47219, Training Loss: 0.00664
Epoch: 47219, Training Loss: 0.00622
Epoch: 47219, Training Loss: 0.00759
Epoch: 47219, Training Loss: 0.00589
Epoch: 47220, Training Loss: 0.00664
Epoch: 47220, Training Loss: 0.00622
Epoch: 47220, Training Loss: 0.00759
Epoch: 47220, Training Loss: 0.00589
Epoch: 47221, Training Loss: 0.00664
Epoch: 47221, Training Loss: 0.00622
Epoch: 47221, Training Loss: 0.00759
Epoch: 47221, Training Loss: 0.00589
Epoch: 47222, Training Loss: 0.00664
Epoch: 47222, Training Loss: 0.00622
Epoch: 47222, Training Loss: 0.00759
Epoch: 47222, Training Loss: 0.00589
Epoch: 47223, Training Loss: 0.00664
Epoch: 47223, Training Loss: 0.00622
Epoch: 47223, Training Loss: 0.00759
Epoch: 47223, Training Loss: 0.00589
Epoch: 47224, Training Loss: 0.00664
Epoch: 47224, Training Loss: 0.00622
Epoch: 47224, Training Loss: 0.00759
Epoch: 47224, Training Loss: 0.00589
Epoch: 47225, Training Loss: 0.00664
Epoch: 47225, Training Loss: 0.00622
Epoch: 47225, Training Loss: 0.00759
Epoch: 47225, Training Loss: 0.00589
Epoch: 47226, Training Loss: 0.00664
Epoch: 47226, Training Loss: 0.00622
Epoch: 47226, Training Loss: 0.00759
Epoch: 47226, Training Loss: 0.00589
Epoch: 47227, Training Loss: 0.00664
Epoch: 47227, Training Loss: 0.00622
Epoch: 47227, Training Loss: 0.00759
Epoch: 47227, Training Loss: 0.00589
Epoch: 47228, Training Loss: 0.00664
Epoch: 47228, Training Loss: 0.00622
Epoch: 47228, Training Loss: 0.00759
Epoch: 47228, Training Loss: 0.00589
Epoch: 47229, Training Loss: 0.00664
Epoch: 47229, Training Loss: 0.00622
Epoch: 47229, Training Loss: 0.00759
Epoch: 47229, Training Loss: 0.00589
Epoch: 47230, Training Loss: 0.00664
Epoch: 47230, Training Loss: 0.00622
Epoch: 47230, Training Loss: 0.00759
Epoch: 47230, Training Loss: 0.00589
Epoch: 47231, Training Loss: 0.00664
Epoch: 47231, Training Loss: 0.00622
Epoch: 47231, Training Loss: 0.00759
Epoch: 47231, Training Loss: 0.00589
Epoch: 47232, Training Loss: 0.00664
Epoch: 47232, Training Loss: 0.00622
Epoch: 47232, Training Loss: 0.00759
Epoch: 47232, Training Loss: 0.00589
Epoch: 47233, Training Loss: 0.00664
Epoch: 47233, Training Loss: 0.00622
Epoch: 47233, Training Loss: 0.00759
Epoch: 47233, Training Loss: 0.00589
Epoch: 47234, Training Loss: 0.00664
Epoch: 47234, Training Loss: 0.00622
Epoch: 47234, Training Loss: 0.00759
Epoch: 47234, Training Loss: 0.00589
Epoch: 47235, Training Loss: 0.00664
Epoch: 47235, Training Loss: 0.00622
Epoch: 47235, Training Loss: 0.00759
Epoch: 47235, Training Loss: 0.00589
Epoch: 47236, Training Loss: 0.00664
Epoch: 47236, Training Loss: 0.00622
Epoch: 47236, Training Loss: 0.00759
Epoch: 47236, Training Loss: 0.00589
Epoch: 47237, Training Loss: 0.00664
Epoch: 47237, Training Loss: 0.00622
Epoch: 47237, Training Loss: 0.00759
Epoch: 47237, Training Loss: 0.00589
Epoch: 47238, Training Loss: 0.00664
Epoch: 47238, Training Loss: 0.00622
Epoch: 47238, Training Loss: 0.00759
Epoch: 47238, Training Loss: 0.00589
Epoch: 47239, Training Loss: 0.00664
Epoch: 47239, Training Loss: 0.00622
Epoch: 47239, Training Loss: 0.00759
Epoch: 47239, Training Loss: 0.00589
Epoch: 47240, Training Loss: 0.00664
Epoch: 47240, Training Loss: 0.00622
Epoch: 47240, Training Loss: 0.00759
Epoch: 47240, Training Loss: 0.00589
Epoch: 47241, Training Loss: 0.00664
Epoch: 47241, Training Loss: 0.00622
Epoch: 47241, Training Loss: 0.00759
Epoch: 47241, Training Loss: 0.00589
Epoch: 47242, Training Loss: 0.00664
Epoch: 47242, Training Loss: 0.00622
Epoch: 47242, Training Loss: 0.00759
Epoch: 47242, Training Loss: 0.00589
Epoch: 47243, Training Loss: 0.00664
Epoch: 47243, Training Loss: 0.00622
Epoch: 47243, Training Loss: 0.00759
Epoch: 47243, Training Loss: 0.00589
Epoch: 47244, Training Loss: 0.00664
Epoch: 47244, Training Loss: 0.00622
Epoch: 47244, Training Loss: 0.00759
Epoch: 47244, Training Loss: 0.00589
Epoch: 47245, Training Loss: 0.00664
Epoch: 47245, Training Loss: 0.00622
Epoch: 47245, Training Loss: 0.00759
Epoch: 47245, Training Loss: 0.00589
Epoch: 47246, Training Loss: 0.00664
Epoch: 47246, Training Loss: 0.00622
Epoch: 47246, Training Loss: 0.00759
Epoch: 47246, Training Loss: 0.00589
Epoch: 47247, Training Loss: 0.00664
Epoch: 47247, Training Loss: 0.00622
Epoch: 47247, Training Loss: 0.00759
Epoch: 47247, Training Loss: 0.00589
Epoch: 47248, Training Loss: 0.00664
Epoch: 47248, Training Loss: 0.00622
Epoch: 47248, Training Loss: 0.00759
Epoch: 47248, Training Loss: 0.00589
Epoch: 47249, Training Loss: 0.00664
Epoch: 47249, Training Loss: 0.00622
Epoch: 47249, Training Loss: 0.00759
Epoch: 47249, Training Loss: 0.00589
Epoch: 47250, Training Loss: 0.00664
Epoch: 47250, Training Loss: 0.00622
Epoch: 47250, Training Loss: 0.00759
Epoch: 47250, Training Loss: 0.00589
Epoch: 47251, Training Loss: 0.00664
Epoch: 47251, Training Loss: 0.00622
Epoch: 47251, Training Loss: 0.00759
Epoch: 47251, Training Loss: 0.00589
Epoch: 47252, Training Loss: 0.00664
Epoch: 47252, Training Loss: 0.00622
Epoch: 47252, Training Loss: 0.00759
Epoch: 47252, Training Loss: 0.00589
Epoch: 47253, Training Loss: 0.00664
Epoch: 47253, Training Loss: 0.00622
Epoch: 47253, Training Loss: 0.00759
Epoch: 47253, Training Loss: 0.00589
Epoch: 47254, Training Loss: 0.00664
Epoch: 47254, Training Loss: 0.00622
Epoch: 47254, Training Loss: 0.00759
Epoch: 47254, Training Loss: 0.00589
Epoch: 47255, Training Loss: 0.00664
Epoch: 47255, Training Loss: 0.00622
Epoch: 47255, Training Loss: 0.00759
Epoch: 47255, Training Loss: 0.00589
Epoch: 47256, Training Loss: 0.00664
Epoch: 47256, Training Loss: 0.00622
Epoch: 47256, Training Loss: 0.00759
Epoch: 47256, Training Loss: 0.00589
Epoch: 47257, Training Loss: 0.00664
Epoch: 47257, Training Loss: 0.00622
Epoch: 47257, Training Loss: 0.00759
Epoch: 47257, Training Loss: 0.00589
Epoch: 47258, Training Loss: 0.00664
Epoch: 47258, Training Loss: 0.00622
Epoch: 47258, Training Loss: 0.00759
Epoch: 47258, Training Loss: 0.00589
Epoch: 47259, Training Loss: 0.00664
Epoch: 47259, Training Loss: 0.00622
Epoch: 47259, Training Loss: 0.00759
Epoch: 47259, Training Loss: 0.00589
Epoch: 47260, Training Loss: 0.00664
Epoch: 47260, Training Loss: 0.00622
Epoch: 47260, Training Loss: 0.00759
Epoch: 47260, Training Loss: 0.00589
Epoch: 47261, Training Loss: 0.00664
Epoch: 47261, Training Loss: 0.00622
Epoch: 47261, Training Loss: 0.00758
Epoch: 47261, Training Loss: 0.00589
Epoch: 47262, Training Loss: 0.00664
Epoch: 47262, Training Loss: 0.00622
Epoch: 47262, Training Loss: 0.00758
Epoch: 47262, Training Loss: 0.00589
Epoch: 47263, Training Loss: 0.00664
Epoch: 47263, Training Loss: 0.00622
Epoch: 47263, Training Loss: 0.00758
Epoch: 47263, Training Loss: 0.00589
Epoch: 47264, Training Loss: 0.00664
Epoch: 47264, Training Loss: 0.00622
Epoch: 47264, Training Loss: 0.00758
Epoch: 47264, Training Loss: 0.00589
Epoch: 47265, Training Loss: 0.00664
Epoch: 47265, Training Loss: 0.00622
Epoch: 47265, Training Loss: 0.00758
Epoch: 47265, Training Loss: 0.00589
Epoch: 47266, Training Loss: 0.00664
Epoch: 47266, Training Loss: 0.00622
Epoch: 47266, Training Loss: 0.00758
Epoch: 47266, Training Loss: 0.00589
Epoch: 47267, Training Loss: 0.00664
Epoch: 47267, Training Loss: 0.00622
Epoch: 47267, Training Loss: 0.00758
Epoch: 47267, Training Loss: 0.00589
Epoch: 47268, Training Loss: 0.00664
Epoch: 47268, Training Loss: 0.00622
Epoch: 47268, Training Loss: 0.00758
Epoch: 47268, Training Loss: 0.00589
Epoch: 47269, Training Loss: 0.00664
Epoch: 47269, Training Loss: 0.00622
Epoch: 47269, Training Loss: 0.00758
Epoch: 47269, Training Loss: 0.00589
Epoch: 47270, Training Loss: 0.00664
Epoch: 47270, Training Loss: 0.00622
Epoch: 47270, Training Loss: 0.00758
Epoch: 47270, Training Loss: 0.00589
Epoch: 47271, Training Loss: 0.00664
Epoch: 47271, Training Loss: 0.00622
Epoch: 47271, Training Loss: 0.00758
Epoch: 47271, Training Loss: 0.00589
Epoch: 47272, Training Loss: 0.00664
Epoch: 47272, Training Loss: 0.00622
Epoch: 47272, Training Loss: 0.00758
Epoch: 47272, Training Loss: 0.00589
Epoch: 47273, Training Loss: 0.00664
Epoch: 47273, Training Loss: 0.00622
Epoch: 47273, Training Loss: 0.00758
Epoch: 47273, Training Loss: 0.00589
Epoch: 47274, Training Loss: 0.00664
Epoch: 47274, Training Loss: 0.00622
Epoch: 47274, Training Loss: 0.00758
Epoch: 47274, Training Loss: 0.00589
Epoch: 47275, Training Loss: 0.00664
Epoch: 47275, Training Loss: 0.00622
Epoch: 47275, Training Loss: 0.00758
Epoch: 47275, Training Loss: 0.00589
Epoch: 47276, Training Loss: 0.00664
Epoch: 47276, Training Loss: 0.00622
Epoch: 47276, Training Loss: 0.00758
Epoch: 47276, Training Loss: 0.00589
Epoch: 47277, Training Loss: 0.00663
Epoch: 47277, Training Loss: 0.00622
Epoch: 47277, Training Loss: 0.00758
Epoch: 47277, Training Loss: 0.00589
Epoch: 47278, Training Loss: 0.00663
Epoch: 47278, Training Loss: 0.00622
Epoch: 47278, Training Loss: 0.00758
Epoch: 47278, Training Loss: 0.00589
Epoch: 47279, Training Loss: 0.00663
Epoch: 47279, Training Loss: 0.00622
Epoch: 47279, Training Loss: 0.00758
Epoch: 47279, Training Loss: 0.00589
Epoch: 47280, Training Loss: 0.00663
Epoch: 47280, Training Loss: 0.00622
Epoch: 47280, Training Loss: 0.00758
Epoch: 47280, Training Loss: 0.00589
Epoch: 47281, Training Loss: 0.00663
Epoch: 47281, Training Loss: 0.00622
Epoch: 47281, Training Loss: 0.00758
Epoch: 47281, Training Loss: 0.00589
Epoch: 47282, Training Loss: 0.00663
Epoch: 47282, Training Loss: 0.00622
Epoch: 47282, Training Loss: 0.00758
Epoch: 47282, Training Loss: 0.00589
Epoch: 47283, Training Loss: 0.00663
Epoch: 47283, Training Loss: 0.00622
Epoch: 47283, Training Loss: 0.00758
Epoch: 47283, Training Loss: 0.00589
Epoch: 47284, Training Loss: 0.00663
Epoch: 47284, Training Loss: 0.00622
Epoch: 47284, Training Loss: 0.00758
Epoch: 47284, Training Loss: 0.00589
Epoch: 47285, Training Loss: 0.00663
Epoch: 47285, Training Loss: 0.00622
Epoch: 47285, Training Loss: 0.00758
Epoch: 47285, Training Loss: 0.00589
Epoch: 47286, Training Loss: 0.00663
Epoch: 47286, Training Loss: 0.00622
Epoch: 47286, Training Loss: 0.00758
Epoch: 47286, Training Loss: 0.00589
Epoch: 47287, Training Loss: 0.00663
Epoch: 47287, Training Loss: 0.00622
Epoch: 47287, Training Loss: 0.00758
Epoch: 47287, Training Loss: 0.00589
Epoch: 47288, Training Loss: 0.00663
Epoch: 47288, Training Loss: 0.00622
Epoch: 47288, Training Loss: 0.00758
Epoch: 47288, Training Loss: 0.00589
Epoch: 47289, Training Loss: 0.00663
Epoch: 47289, Training Loss: 0.00622
Epoch: 47289, Training Loss: 0.00758
Epoch: 47289, Training Loss: 0.00589
Epoch: 47290, Training Loss: 0.00663
Epoch: 47290, Training Loss: 0.00622
Epoch: 47290, Training Loss: 0.00758
Epoch: 47290, Training Loss: 0.00589
Epoch: 47291, Training Loss: 0.00663
Epoch: 47291, Training Loss: 0.00622
Epoch: 47291, Training Loss: 0.00758
Epoch: 47291, Training Loss: 0.00589
Epoch: 47292, Training Loss: 0.00663
Epoch: 47292, Training Loss: 0.00622
Epoch: 47292, Training Loss: 0.00758
Epoch: 47292, Training Loss: 0.00589
Epoch: 47293, Training Loss: 0.00663
Epoch: 47293, Training Loss: 0.00622
Epoch: 47293, Training Loss: 0.00758
Epoch: 47293, Training Loss: 0.00588
Epoch: 47294, Training Loss: 0.00663
Epoch: 47294, Training Loss: 0.00622
Epoch: 47294, Training Loss: 0.00758
Epoch: 47294, Training Loss: 0.00588
Epoch: 47295, Training Loss: 0.00663
Epoch: 47295, Training Loss: 0.00622
Epoch: 47295, Training Loss: 0.00758
Epoch: 47295, Training Loss: 0.00588
Epoch: 47296, Training Loss: 0.00663
Epoch: 47296, Training Loss: 0.00622
Epoch: 47296, Training Loss: 0.00758
Epoch: 47296, Training Loss: 0.00588
Epoch: 47297, Training Loss: 0.00663
Epoch: 47297, Training Loss: 0.00622
Epoch: 47297, Training Loss: 0.00758
Epoch: 47297, Training Loss: 0.00588
Epoch: 47298, Training Loss: 0.00663
Epoch: 47298, Training Loss: 0.00622
Epoch: 47298, Training Loss: 0.00758
Epoch: 47298, Training Loss: 0.00588
Epoch: 47299, Training Loss: 0.00663
Epoch: 47299, Training Loss: 0.00622
Epoch: 47299, Training Loss: 0.00758
Epoch: 47299, Training Loss: 0.00588
Epoch: 47300, Training Loss: 0.00663
Epoch: 47300, Training Loss: 0.00622
Epoch: 47300, Training Loss: 0.00758
Epoch: 47300, Training Loss: 0.00588
Epoch: 47301, Training Loss: 0.00663
Epoch: 47301, Training Loss: 0.00622
Epoch: 47301, Training Loss: 0.00758
Epoch: 47301, Training Loss: 0.00588
Epoch: 47302, Training Loss: 0.00663
Epoch: 47302, Training Loss: 0.00622
Epoch: 47302, Training Loss: 0.00758
Epoch: 47302, Training Loss: 0.00588
Epoch: 47303, Training Loss: 0.00663
Epoch: 47303, Training Loss: 0.00622
Epoch: 47303, Training Loss: 0.00758
Epoch: 47303, Training Loss: 0.00588
Epoch: 47304, Training Loss: 0.00663
Epoch: 47304, Training Loss: 0.00622
Epoch: 47304, Training Loss: 0.00758
Epoch: 47304, Training Loss: 0.00588
Epoch: 47305, Training Loss: 0.00663
Epoch: 47305, Training Loss: 0.00622
Epoch: 47305, Training Loss: 0.00758
Epoch: 47305, Training Loss: 0.00588
Epoch: 47306, Training Loss: 0.00663
Epoch: 47306, Training Loss: 0.00622
Epoch: 47306, Training Loss: 0.00758
Epoch: 47306, Training Loss: 0.00588
Epoch: 47307, Training Loss: 0.00663
Epoch: 47307, Training Loss: 0.00622
Epoch: 47307, Training Loss: 0.00758
Epoch: 47307, Training Loss: 0.00588
Epoch: 47308, Training Loss: 0.00663
Epoch: 47308, Training Loss: 0.00622
Epoch: 47308, Training Loss: 0.00758
Epoch: 47308, Training Loss: 0.00588
Epoch: 47309, Training Loss: 0.00663
Epoch: 47309, Training Loss: 0.00622
Epoch: 47309, Training Loss: 0.00758
Epoch: 47309, Training Loss: 0.00588
Epoch: 47310, Training Loss: 0.00663
Epoch: 47310, Training Loss: 0.00621
Epoch: 47310, Training Loss: 0.00758
Epoch: 47310, Training Loss: 0.00588
Epoch: 47311, Training Loss: 0.00663
Epoch: 47311, Training Loss: 0.00621
Epoch: 47311, Training Loss: 0.00758
Epoch: 47311, Training Loss: 0.00588
Epoch: 47312, Training Loss: 0.00663
Epoch: 47312, Training Loss: 0.00621
Epoch: 47312, Training Loss: 0.00758
Epoch: 47312, Training Loss: 0.00588
Epoch: 47313, Training Loss: 0.00663
Epoch: 47313, Training Loss: 0.00621
Epoch: 47313, Training Loss: 0.00758
Epoch: 47313, Training Loss: 0.00588
Epoch: 47314, Training Loss: 0.00663
Epoch: 47314, Training Loss: 0.00621
Epoch: 47314, Training Loss: 0.00758
Epoch: 47314, Training Loss: 0.00588
Epoch: 47315, Training Loss: 0.00663
Epoch: 47315, Training Loss: 0.00621
Epoch: 47315, Training Loss: 0.00758
Epoch: 47315, Training Loss: 0.00588
Epoch: 47316, Training Loss: 0.00663
Epoch: 47316, Training Loss: 0.00621
Epoch: 47316, Training Loss: 0.00758
Epoch: 47316, Training Loss: 0.00588
Epoch: 47317, Training Loss: 0.00663
Epoch: 47317, Training Loss: 0.00621
Epoch: 47317, Training Loss: 0.00758
Epoch: 47317, Training Loss: 0.00588
Epoch: 47318, Training Loss: 0.00663
Epoch: 47318, Training Loss: 0.00621
Epoch: 47318, Training Loss: 0.00758
Epoch: 47318, Training Loss: 0.00588
Epoch: 47319, Training Loss: 0.00663
Epoch: 47319, Training Loss: 0.00621
Epoch: 47319, Training Loss: 0.00758
Epoch: 47319, Training Loss: 0.00588
Epoch: 47320, Training Loss: 0.00663
Epoch: 47320, Training Loss: 0.00621
Epoch: 47320, Training Loss: 0.00758
Epoch: 47320, Training Loss: 0.00588
Epoch: 47321, Training Loss: 0.00663
Epoch: 47321, Training Loss: 0.00621
Epoch: 47321, Training Loss: 0.00758
Epoch: 47321, Training Loss: 0.00588
Epoch: 47322, Training Loss: 0.00663
Epoch: 47322, Training Loss: 0.00621
Epoch: 47322, Training Loss: 0.00758
Epoch: 47322, Training Loss: 0.00588
Epoch: 47323, Training Loss: 0.00663
Epoch: 47323, Training Loss: 0.00621
Epoch: 47323, Training Loss: 0.00758
Epoch: 47323, Training Loss: 0.00588
Epoch: 47324, Training Loss: 0.00663
Epoch: 47324, Training Loss: 0.00621
Epoch: 47324, Training Loss: 0.00758
Epoch: 47324, Training Loss: 0.00588
Epoch: 47325, Training Loss: 0.00663
Epoch: 47325, Training Loss: 0.00621
Epoch: 47325, Training Loss: 0.00758
Epoch: 47325, Training Loss: 0.00588
Epoch: 47326, Training Loss: 0.00663
Epoch: 47326, Training Loss: 0.00621
Epoch: 47326, Training Loss: 0.00758
Epoch: 47326, Training Loss: 0.00588
Epoch: 47327, Training Loss: 0.00663
Epoch: 47327, Training Loss: 0.00621
Epoch: 47327, Training Loss: 0.00758
Epoch: 47327, Training Loss: 0.00588
Epoch: 47328, Training Loss: 0.00663
Epoch: 47328, Training Loss: 0.00621
Epoch: 47328, Training Loss: 0.00758
Epoch: 47328, Training Loss: 0.00588
Epoch: 47329, Training Loss: 0.00663
Epoch: 47329, Training Loss: 0.00621
Epoch: 47329, Training Loss: 0.00758
Epoch: 47329, Training Loss: 0.00588
Epoch: 47330, Training Loss: 0.00663
Epoch: 47330, Training Loss: 0.00621
Epoch: 47330, Training Loss: 0.00758
Epoch: 47330, Training Loss: 0.00588
Epoch: 47331, Training Loss: 0.00663
Epoch: 47331, Training Loss: 0.00621
Epoch: 47331, Training Loss: 0.00758
Epoch: 47331, Training Loss: 0.00588
Epoch: 47332, Training Loss: 0.00663
Epoch: 47332, Training Loss: 0.00621
Epoch: 47332, Training Loss: 0.00758
Epoch: 47332, Training Loss: 0.00588
Epoch: 47333, Training Loss: 0.00663
Epoch: 47333, Training Loss: 0.00621
Epoch: 47333, Training Loss: 0.00758
Epoch: 47333, Training Loss: 0.00588
Epoch: 47334, Training Loss: 0.00663
Epoch: 47334, Training Loss: 0.00621
Epoch: 47334, Training Loss: 0.00758
Epoch: 47334, Training Loss: 0.00588
Epoch: 47335, Training Loss: 0.00663
Epoch: 47335, Training Loss: 0.00621
Epoch: 47335, Training Loss: 0.00758
Epoch: 47335, Training Loss: 0.00588
Epoch: 47336, Training Loss: 0.00663
Epoch: 47336, Training Loss: 0.00621
Epoch: 47336, Training Loss: 0.00758
Epoch: 47336, Training Loss: 0.00588
Epoch: 47337, Training Loss: 0.00663
Epoch: 47337, Training Loss: 0.00621
Epoch: 47337, Training Loss: 0.00758
Epoch: 47337, Training Loss: 0.00588
Epoch: 47338, Training Loss: 0.00663
Epoch: 47338, Training Loss: 0.00621
Epoch: 47338, Training Loss: 0.00758
Epoch: 47338, Training Loss: 0.00588
Epoch: 47339, Training Loss: 0.00663
Epoch: 47339, Training Loss: 0.00621
Epoch: 47339, Training Loss: 0.00758
Epoch: 47339, Training Loss: 0.00588
Epoch: 47340, Training Loss: 0.00663
Epoch: 47340, Training Loss: 0.00621
Epoch: 47340, Training Loss: 0.00758
Epoch: 47340, Training Loss: 0.00588
Epoch: 47341, Training Loss: 0.00663
Epoch: 47341, Training Loss: 0.00621
Epoch: 47341, Training Loss: 0.00758
Epoch: 47341, Training Loss: 0.00588
Epoch: 47342, Training Loss: 0.00663
Epoch: 47342, Training Loss: 0.00621
Epoch: 47342, Training Loss: 0.00758
Epoch: 47342, Training Loss: 0.00588
Epoch: 47343, Training Loss: 0.00663
Epoch: 47343, Training Loss: 0.00621
Epoch: 47343, Training Loss: 0.00758
Epoch: 47343, Training Loss: 0.00588
Epoch: 47344, Training Loss: 0.00663
Epoch: 47344, Training Loss: 0.00621
Epoch: 47344, Training Loss: 0.00758
Epoch: 47344, Training Loss: 0.00588
Epoch: 47345, Training Loss: 0.00663
Epoch: 47345, Training Loss: 0.00621
Epoch: 47345, Training Loss: 0.00758
Epoch: 47345, Training Loss: 0.00588
Epoch: 47346, Training Loss: 0.00663
Epoch: 47346, Training Loss: 0.00621
Epoch: 47346, Training Loss: 0.00758
Epoch: 47346, Training Loss: 0.00588
Epoch: 47347, Training Loss: 0.00663
Epoch: 47347, Training Loss: 0.00621
Epoch: 47347, Training Loss: 0.00758
Epoch: 47347, Training Loss: 0.00588
Epoch: 47348, Training Loss: 0.00663
Epoch: 47348, Training Loss: 0.00621
Epoch: 47348, Training Loss: 0.00758
Epoch: 47348, Training Loss: 0.00588
Epoch: 47349, Training Loss: 0.00663
Epoch: 47349, Training Loss: 0.00621
Epoch: 47349, Training Loss: 0.00758
Epoch: 47349, Training Loss: 0.00588
Epoch: 47350, Training Loss: 0.00663
Epoch: 47350, Training Loss: 0.00621
Epoch: 47350, Training Loss: 0.00758
Epoch: 47350, Training Loss: 0.00588
Epoch: 47351, Training Loss: 0.00663
Epoch: 47351, Training Loss: 0.00621
Epoch: 47351, Training Loss: 0.00758
Epoch: 47351, Training Loss: 0.00588
Epoch: 47352, Training Loss: 0.00663
Epoch: 47352, Training Loss: 0.00621
Epoch: 47352, Training Loss: 0.00758
Epoch: 47352, Training Loss: 0.00588
Epoch: 47353, Training Loss: 0.00663
Epoch: 47353, Training Loss: 0.00621
Epoch: 47353, Training Loss: 0.00758
Epoch: 47353, Training Loss: 0.00588
Epoch: 47354, Training Loss: 0.00663
Epoch: 47354, Training Loss: 0.00621
Epoch: 47354, Training Loss: 0.00758
Epoch: 47354, Training Loss: 0.00588
Epoch: 47355, Training Loss: 0.00663
Epoch: 47355, Training Loss: 0.00621
Epoch: 47355, Training Loss: 0.00758
Epoch: 47355, Training Loss: 0.00588
Epoch: 47356, Training Loss: 0.00663
Epoch: 47356, Training Loss: 0.00621
Epoch: 47356, Training Loss: 0.00758
Epoch: 47356, Training Loss: 0.00588
Epoch: 47357, Training Loss: 0.00663
Epoch: 47357, Training Loss: 0.00621
Epoch: 47357, Training Loss: 0.00758
Epoch: 47357, Training Loss: 0.00588
Epoch: 47358, Training Loss: 0.00663
Epoch: 47358, Training Loss: 0.00621
Epoch: 47358, Training Loss: 0.00758
Epoch: 47358, Training Loss: 0.00588
Epoch: 47359, Training Loss: 0.00663
Epoch: 47359, Training Loss: 0.00621
Epoch: 47359, Training Loss: 0.00758
Epoch: 47359, Training Loss: 0.00588
Epoch: 47360, Training Loss: 0.00663
Epoch: 47360, Training Loss: 0.00621
Epoch: 47360, Training Loss: 0.00758
Epoch: 47360, Training Loss: 0.00588
Epoch: 47361, Training Loss: 0.00663
Epoch: 47361, Training Loss: 0.00621
Epoch: 47361, Training Loss: 0.00758
Epoch: 47361, Training Loss: 0.00588
Epoch: 47362, Training Loss: 0.00663
Epoch: 47362, Training Loss: 0.00621
Epoch: 47362, Training Loss: 0.00758
Epoch: 47362, Training Loss: 0.00588
Epoch: 47363, Training Loss: 0.00663
Epoch: 47363, Training Loss: 0.00621
Epoch: 47363, Training Loss: 0.00758
Epoch: 47363, Training Loss: 0.00588
Epoch: 47364, Training Loss: 0.00663
Epoch: 47364, Training Loss: 0.00621
Epoch: 47364, Training Loss: 0.00758
Epoch: 47364, Training Loss: 0.00588
Epoch: 47365, Training Loss: 0.00663
Epoch: 47365, Training Loss: 0.00621
Epoch: 47365, Training Loss: 0.00758
Epoch: 47365, Training Loss: 0.00588
Epoch: 47366, Training Loss: 0.00663
Epoch: 47366, Training Loss: 0.00621
Epoch: 47366, Training Loss: 0.00758
Epoch: 47366, Training Loss: 0.00588
Epoch: 47367, Training Loss: 0.00663
Epoch: 47367, Training Loss: 0.00621
Epoch: 47367, Training Loss: 0.00758
Epoch: 47367, Training Loss: 0.00588
Epoch: 47368, Training Loss: 0.00663
Epoch: 47368, Training Loss: 0.00621
Epoch: 47368, Training Loss: 0.00758
Epoch: 47368, Training Loss: 0.00588
Epoch: 47369, Training Loss: 0.00663
Epoch: 47369, Training Loss: 0.00621
Epoch: 47369, Training Loss: 0.00758
Epoch: 47369, Training Loss: 0.00588
Epoch: 47370, Training Loss: 0.00663
Epoch: 47370, Training Loss: 0.00621
Epoch: 47370, Training Loss: 0.00758
Epoch: 47370, Training Loss: 0.00588
Epoch: 47371, Training Loss: 0.00663
Epoch: 47371, Training Loss: 0.00621
Epoch: 47371, Training Loss: 0.00758
Epoch: 47371, Training Loss: 0.00588
Epoch: 47372, Training Loss: 0.00663
Epoch: 47372, Training Loss: 0.00621
Epoch: 47372, Training Loss: 0.00758
Epoch: 47372, Training Loss: 0.00588
Epoch: 47373, Training Loss: 0.00663
Epoch: 47373, Training Loss: 0.00621
Epoch: 47373, Training Loss: 0.00758
Epoch: 47373, Training Loss: 0.00588
Epoch: 47374, Training Loss: 0.00663
Epoch: 47374, Training Loss: 0.00621
Epoch: 47374, Training Loss: 0.00758
Epoch: 47374, Training Loss: 0.00588
Epoch: 47375, Training Loss: 0.00663
Epoch: 47375, Training Loss: 0.00621
Epoch: 47375, Training Loss: 0.00758
Epoch: 47375, Training Loss: 0.00588
Epoch: 47376, Training Loss: 0.00663
Epoch: 47376, Training Loss: 0.00621
Epoch: 47376, Training Loss: 0.00758
Epoch: 47376, Training Loss: 0.00588
Epoch: 47377, Training Loss: 0.00663
Epoch: 47377, Training Loss: 0.00621
Epoch: 47377, Training Loss: 0.00758
Epoch: 47377, Training Loss: 0.00588
Epoch: 47378, Training Loss: 0.00663
Epoch: 47378, Training Loss: 0.00621
Epoch: 47378, Training Loss: 0.00758
Epoch: 47378, Training Loss: 0.00588
Epoch: 47379, Training Loss: 0.00663
Epoch: 47379, Training Loss: 0.00621
Epoch: 47379, Training Loss: 0.00758
Epoch: 47379, Training Loss: 0.00588
Epoch: 47380, Training Loss: 0.00663
Epoch: 47380, Training Loss: 0.00621
Epoch: 47380, Training Loss: 0.00758
Epoch: 47380, Training Loss: 0.00588
Epoch: 47381, Training Loss: 0.00663
Epoch: 47381, Training Loss: 0.00621
Epoch: 47381, Training Loss: 0.00757
Epoch: 47381, Training Loss: 0.00588
Epoch: 47382, Training Loss: 0.00663
Epoch: 47382, Training Loss: 0.00621
Epoch: 47382, Training Loss: 0.00757
Epoch: 47382, Training Loss: 0.00588
Epoch: 47383, Training Loss: 0.00663
Epoch: 47383, Training Loss: 0.00621
Epoch: 47383, Training Loss: 0.00757
Epoch: 47383, Training Loss: 0.00588
Epoch: 47384, Training Loss: 0.00663
Epoch: 47384, Training Loss: 0.00621
Epoch: 47384, Training Loss: 0.00757
Epoch: 47384, Training Loss: 0.00588
Epoch: 47385, Training Loss: 0.00663
Epoch: 47385, Training Loss: 0.00621
Epoch: 47385, Training Loss: 0.00757
Epoch: 47385, Training Loss: 0.00588
Epoch: 47386, Training Loss: 0.00663
Epoch: 47386, Training Loss: 0.00621
Epoch: 47386, Training Loss: 0.00757
Epoch: 47386, Training Loss: 0.00588
Epoch: 47387, Training Loss: 0.00663
Epoch: 47387, Training Loss: 0.00621
Epoch: 47387, Training Loss: 0.00757
Epoch: 47387, Training Loss: 0.00588
Epoch: 47388, Training Loss: 0.00663
Epoch: 47388, Training Loss: 0.00621
Epoch: 47388, Training Loss: 0.00757
Epoch: 47388, Training Loss: 0.00588
Epoch: 47389, Training Loss: 0.00663
Epoch: 47389, Training Loss: 0.00621
Epoch: 47389, Training Loss: 0.00757
Epoch: 47389, Training Loss: 0.00588
Epoch: 47390, Training Loss: 0.00663
Epoch: 47390, Training Loss: 0.00621
Epoch: 47390, Training Loss: 0.00757
Epoch: 47390, Training Loss: 0.00588
Epoch: 47391, Training Loss: 0.00663
Epoch: 47391, Training Loss: 0.00621
Epoch: 47391, Training Loss: 0.00757
Epoch: 47391, Training Loss: 0.00588
Epoch: 47392, Training Loss: 0.00663
Epoch: 47392, Training Loss: 0.00621
Epoch: 47392, Training Loss: 0.00757
Epoch: 47392, Training Loss: 0.00588
Epoch: 47393, Training Loss: 0.00663
Epoch: 47393, Training Loss: 0.00621
Epoch: 47393, Training Loss: 0.00757
Epoch: 47393, Training Loss: 0.00588
Epoch: 47394, Training Loss: 0.00663
Epoch: 47394, Training Loss: 0.00621
Epoch: 47394, Training Loss: 0.00757
Epoch: 47394, Training Loss: 0.00588
Epoch: 47395, Training Loss: 0.00663
Epoch: 47395, Training Loss: 0.00621
Epoch: 47395, Training Loss: 0.00757
Epoch: 47395, Training Loss: 0.00588
Epoch: 47396, Training Loss: 0.00663
Epoch: 47396, Training Loss: 0.00621
Epoch: 47396, Training Loss: 0.00757
Epoch: 47396, Training Loss: 0.00588
Epoch: 47397, Training Loss: 0.00663
Epoch: 47397, Training Loss: 0.00621
Epoch: 47397, Training Loss: 0.00757
Epoch: 47397, Training Loss: 0.00588
Epoch: 47398, Training Loss: 0.00663
Epoch: 47398, Training Loss: 0.00621
Epoch: 47398, Training Loss: 0.00757
Epoch: 47398, Training Loss: 0.00588
Epoch: 47399, Training Loss: 0.00663
Epoch: 47400, Training Loss: 0.00663
Epoch: 47400, Training Loss: 0.00621
Epoch: 47400, Training Loss: 0.00757
Epoch: 47400, Training Loss: 0.00588
... [log truncated: per-pattern losses for the four XOR inputs decrease only in the fourth decimal place over these epochs] ...
Epoch: 47642, Training Loss: 0.00661
Epoch: 47642, Training Loss: 0.00619
Epoch: 47642, Training Loss: 0.00755
Epoch: 47642, Training Loss: 0.00586
Epoch: 47643, Training Loss: 0.00661
Epoch: 47643, Training Loss: 0.00619
Epoch: 47643, Training Loss: 0.00755
Epoch: 47643, Training Loss: 0.00586
Epoch: 47644, Training Loss: 0.00661
Epoch: 47644, Training Loss: 0.00619
Epoch: 47644, Training Loss: 0.00755
Epoch: 47644, Training Loss: 0.00586
Epoch: 47645, Training Loss: 0.00661
Epoch: 47645, Training Loss: 0.00619
Epoch: 47645, Training Loss: 0.00755
Epoch: 47645, Training Loss: 0.00586
Epoch: 47646, Training Loss: 0.00661
Epoch: 47646, Training Loss: 0.00619
Epoch: 47646, Training Loss: 0.00755
Epoch: 47646, Training Loss: 0.00586
Epoch: 47647, Training Loss: 0.00661
Epoch: 47647, Training Loss: 0.00619
Epoch: 47647, Training Loss: 0.00755
Epoch: 47647, Training Loss: 0.00586
Epoch: 47648, Training Loss: 0.00661
Epoch: 47648, Training Loss: 0.00619
Epoch: 47648, Training Loss: 0.00755
Epoch: 47648, Training Loss: 0.00586
Epoch: 47649, Training Loss: 0.00661
Epoch: 47649, Training Loss: 0.00619
Epoch: 47649, Training Loss: 0.00755
Epoch: 47649, Training Loss: 0.00586
Epoch: 47650, Training Loss: 0.00661
Epoch: 47650, Training Loss: 0.00619
Epoch: 47650, Training Loss: 0.00755
Epoch: 47650, Training Loss: 0.00586
Epoch: 47651, Training Loss: 0.00661
Epoch: 47651, Training Loss: 0.00619
Epoch: 47651, Training Loss: 0.00755
Epoch: 47651, Training Loss: 0.00586
Epoch: 47652, Training Loss: 0.00661
Epoch: 47652, Training Loss: 0.00619
Epoch: 47652, Training Loss: 0.00755
Epoch: 47652, Training Loss: 0.00586
Epoch: 47653, Training Loss: 0.00661
Epoch: 47653, Training Loss: 0.00619
Epoch: 47653, Training Loss: 0.00755
Epoch: 47653, Training Loss: 0.00586
Epoch: 47654, Training Loss: 0.00661
Epoch: 47654, Training Loss: 0.00619
Epoch: 47654, Training Loss: 0.00755
Epoch: 47654, Training Loss: 0.00586
Epoch: 47655, Training Loss: 0.00661
Epoch: 47655, Training Loss: 0.00619
Epoch: 47655, Training Loss: 0.00755
Epoch: 47655, Training Loss: 0.00586
Epoch: 47656, Training Loss: 0.00661
Epoch: 47656, Training Loss: 0.00619
Epoch: 47656, Training Loss: 0.00755
Epoch: 47656, Training Loss: 0.00586
Epoch: 47657, Training Loss: 0.00661
Epoch: 47657, Training Loss: 0.00619
Epoch: 47657, Training Loss: 0.00755
Epoch: 47657, Training Loss: 0.00586
Epoch: 47658, Training Loss: 0.00661
Epoch: 47658, Training Loss: 0.00619
Epoch: 47658, Training Loss: 0.00755
Epoch: 47658, Training Loss: 0.00586
Epoch: 47659, Training Loss: 0.00661
Epoch: 47659, Training Loss: 0.00619
Epoch: 47659, Training Loss: 0.00755
Epoch: 47659, Training Loss: 0.00586
Epoch: 47660, Training Loss: 0.00661
Epoch: 47660, Training Loss: 0.00619
Epoch: 47660, Training Loss: 0.00755
Epoch: 47660, Training Loss: 0.00586
Epoch: 47661, Training Loss: 0.00661
Epoch: 47661, Training Loss: 0.00619
Epoch: 47661, Training Loss: 0.00755
Epoch: 47661, Training Loss: 0.00586
Epoch: 47662, Training Loss: 0.00661
Epoch: 47662, Training Loss: 0.00619
Epoch: 47662, Training Loss: 0.00755
Epoch: 47662, Training Loss: 0.00586
Epoch: 47663, Training Loss: 0.00661
Epoch: 47663, Training Loss: 0.00619
Epoch: 47663, Training Loss: 0.00755
Epoch: 47663, Training Loss: 0.00586
Epoch: 47664, Training Loss: 0.00661
Epoch: 47664, Training Loss: 0.00619
Epoch: 47664, Training Loss: 0.00755
Epoch: 47664, Training Loss: 0.00586
Epoch: 47665, Training Loss: 0.00661
Epoch: 47665, Training Loss: 0.00619
Epoch: 47665, Training Loss: 0.00755
Epoch: 47665, Training Loss: 0.00586
Epoch: 47666, Training Loss: 0.00661
Epoch: 47666, Training Loss: 0.00619
Epoch: 47666, Training Loss: 0.00755
Epoch: 47666, Training Loss: 0.00586
Epoch: 47667, Training Loss: 0.00661
Epoch: 47667, Training Loss: 0.00619
Epoch: 47667, Training Loss: 0.00755
Epoch: 47667, Training Loss: 0.00586
Epoch: 47668, Training Loss: 0.00661
Epoch: 47668, Training Loss: 0.00619
Epoch: 47668, Training Loss: 0.00755
Epoch: 47668, Training Loss: 0.00586
Epoch: 47669, Training Loss: 0.00661
Epoch: 47669, Training Loss: 0.00619
Epoch: 47669, Training Loss: 0.00755
Epoch: 47669, Training Loss: 0.00586
Epoch: 47670, Training Loss: 0.00661
Epoch: 47670, Training Loss: 0.00619
Epoch: 47670, Training Loss: 0.00755
Epoch: 47670, Training Loss: 0.00586
Epoch: 47671, Training Loss: 0.00661
Epoch: 47671, Training Loss: 0.00619
Epoch: 47671, Training Loss: 0.00755
Epoch: 47671, Training Loss: 0.00586
Epoch: 47672, Training Loss: 0.00661
Epoch: 47672, Training Loss: 0.00619
Epoch: 47672, Training Loss: 0.00755
Epoch: 47672, Training Loss: 0.00586
Epoch: 47673, Training Loss: 0.00661
Epoch: 47673, Training Loss: 0.00619
Epoch: 47673, Training Loss: 0.00755
Epoch: 47673, Training Loss: 0.00586
Epoch: 47674, Training Loss: 0.00661
Epoch: 47674, Training Loss: 0.00619
Epoch: 47674, Training Loss: 0.00755
Epoch: 47674, Training Loss: 0.00586
Epoch: 47675, Training Loss: 0.00661
Epoch: 47675, Training Loss: 0.00619
Epoch: 47675, Training Loss: 0.00755
Epoch: 47675, Training Loss: 0.00586
Epoch: 47676, Training Loss: 0.00661
Epoch: 47676, Training Loss: 0.00619
Epoch: 47676, Training Loss: 0.00755
Epoch: 47676, Training Loss: 0.00586
Epoch: 47677, Training Loss: 0.00661
Epoch: 47677, Training Loss: 0.00619
Epoch: 47677, Training Loss: 0.00755
Epoch: 47677, Training Loss: 0.00586
Epoch: 47678, Training Loss: 0.00661
Epoch: 47678, Training Loss: 0.00619
Epoch: 47678, Training Loss: 0.00755
Epoch: 47678, Training Loss: 0.00586
Epoch: 47679, Training Loss: 0.00661
Epoch: 47679, Training Loss: 0.00619
Epoch: 47679, Training Loss: 0.00755
Epoch: 47679, Training Loss: 0.00586
Epoch: 47680, Training Loss: 0.00661
Epoch: 47680, Training Loss: 0.00619
Epoch: 47680, Training Loss: 0.00755
Epoch: 47680, Training Loss: 0.00586
Epoch: 47681, Training Loss: 0.00661
Epoch: 47681, Training Loss: 0.00619
Epoch: 47681, Training Loss: 0.00755
Epoch: 47681, Training Loss: 0.00586
Epoch: 47682, Training Loss: 0.00661
Epoch: 47682, Training Loss: 0.00619
Epoch: 47682, Training Loss: 0.00755
Epoch: 47682, Training Loss: 0.00586
Epoch: 47683, Training Loss: 0.00661
Epoch: 47683, Training Loss: 0.00619
Epoch: 47683, Training Loss: 0.00755
Epoch: 47683, Training Loss: 0.00586
Epoch: 47684, Training Loss: 0.00661
Epoch: 47684, Training Loss: 0.00619
Epoch: 47684, Training Loss: 0.00755
Epoch: 47684, Training Loss: 0.00586
Epoch: 47685, Training Loss: 0.00661
Epoch: 47685, Training Loss: 0.00619
Epoch: 47685, Training Loss: 0.00755
Epoch: 47685, Training Loss: 0.00586
Epoch: 47686, Training Loss: 0.00661
Epoch: 47686, Training Loss: 0.00619
Epoch: 47686, Training Loss: 0.00755
Epoch: 47686, Training Loss: 0.00586
Epoch: 47687, Training Loss: 0.00661
Epoch: 47687, Training Loss: 0.00619
Epoch: 47687, Training Loss: 0.00755
Epoch: 47687, Training Loss: 0.00586
Epoch: 47688, Training Loss: 0.00660
Epoch: 47688, Training Loss: 0.00619
Epoch: 47688, Training Loss: 0.00755
Epoch: 47688, Training Loss: 0.00586
Epoch: 47689, Training Loss: 0.00660
Epoch: 47689, Training Loss: 0.00619
Epoch: 47689, Training Loss: 0.00755
Epoch: 47689, Training Loss: 0.00586
Epoch: 47690, Training Loss: 0.00660
Epoch: 47690, Training Loss: 0.00619
Epoch: 47690, Training Loss: 0.00755
Epoch: 47690, Training Loss: 0.00586
Epoch: 47691, Training Loss: 0.00660
Epoch: 47691, Training Loss: 0.00619
Epoch: 47691, Training Loss: 0.00755
Epoch: 47691, Training Loss: 0.00586
Epoch: 47692, Training Loss: 0.00660
Epoch: 47692, Training Loss: 0.00619
Epoch: 47692, Training Loss: 0.00755
Epoch: 47692, Training Loss: 0.00586
Epoch: 47693, Training Loss: 0.00660
Epoch: 47693, Training Loss: 0.00619
Epoch: 47693, Training Loss: 0.00755
Epoch: 47693, Training Loss: 0.00586
Epoch: 47694, Training Loss: 0.00660
Epoch: 47694, Training Loss: 0.00619
Epoch: 47694, Training Loss: 0.00755
Epoch: 47694, Training Loss: 0.00586
Epoch: 47695, Training Loss: 0.00660
Epoch: 47695, Training Loss: 0.00619
Epoch: 47695, Training Loss: 0.00755
Epoch: 47695, Training Loss: 0.00586
Epoch: 47696, Training Loss: 0.00660
Epoch: 47696, Training Loss: 0.00619
Epoch: 47696, Training Loss: 0.00755
Epoch: 47696, Training Loss: 0.00586
Epoch: 47697, Training Loss: 0.00660
Epoch: 47697, Training Loss: 0.00619
Epoch: 47697, Training Loss: 0.00755
Epoch: 47697, Training Loss: 0.00586
Epoch: 47698, Training Loss: 0.00660
Epoch: 47698, Training Loss: 0.00619
Epoch: 47698, Training Loss: 0.00755
Epoch: 47698, Training Loss: 0.00586
Epoch: 47699, Training Loss: 0.00660
Epoch: 47699, Training Loss: 0.00619
Epoch: 47699, Training Loss: 0.00755
Epoch: 47699, Training Loss: 0.00586
Epoch: 47700, Training Loss: 0.00660
Epoch: 47700, Training Loss: 0.00619
Epoch: 47700, Training Loss: 0.00755
Epoch: 47700, Training Loss: 0.00586
Epoch: 47701, Training Loss: 0.00660
Epoch: 47701, Training Loss: 0.00619
Epoch: 47701, Training Loss: 0.00755
Epoch: 47701, Training Loss: 0.00586
Epoch: 47702, Training Loss: 0.00660
Epoch: 47702, Training Loss: 0.00619
Epoch: 47702, Training Loss: 0.00755
Epoch: 47702, Training Loss: 0.00586
Epoch: 47703, Training Loss: 0.00660
Epoch: 47703, Training Loss: 0.00619
Epoch: 47703, Training Loss: 0.00755
Epoch: 47703, Training Loss: 0.00586
Epoch: 47704, Training Loss: 0.00660
Epoch: 47704, Training Loss: 0.00619
Epoch: 47704, Training Loss: 0.00755
Epoch: 47704, Training Loss: 0.00586
Epoch: 47705, Training Loss: 0.00660
Epoch: 47705, Training Loss: 0.00619
Epoch: 47705, Training Loss: 0.00755
Epoch: 47705, Training Loss: 0.00586
Epoch: 47706, Training Loss: 0.00660
Epoch: 47706, Training Loss: 0.00619
Epoch: 47706, Training Loss: 0.00755
Epoch: 47706, Training Loss: 0.00586
Epoch: 47707, Training Loss: 0.00660
Epoch: 47707, Training Loss: 0.00619
Epoch: 47707, Training Loss: 0.00755
Epoch: 47707, Training Loss: 0.00586
Epoch: 47708, Training Loss: 0.00660
Epoch: 47708, Training Loss: 0.00619
Epoch: 47708, Training Loss: 0.00755
Epoch: 47708, Training Loss: 0.00586
Epoch: 47709, Training Loss: 0.00660
Epoch: 47709, Training Loss: 0.00619
Epoch: 47709, Training Loss: 0.00755
Epoch: 47709, Training Loss: 0.00586
Epoch: 47710, Training Loss: 0.00660
Epoch: 47710, Training Loss: 0.00619
Epoch: 47710, Training Loss: 0.00755
Epoch: 47710, Training Loss: 0.00586
Epoch: 47711, Training Loss: 0.00660
Epoch: 47711, Training Loss: 0.00619
Epoch: 47711, Training Loss: 0.00755
Epoch: 47711, Training Loss: 0.00586
Epoch: 47712, Training Loss: 0.00660
Epoch: 47712, Training Loss: 0.00619
Epoch: 47712, Training Loss: 0.00755
Epoch: 47712, Training Loss: 0.00586
Epoch: 47713, Training Loss: 0.00660
Epoch: 47713, Training Loss: 0.00619
Epoch: 47713, Training Loss: 0.00755
Epoch: 47713, Training Loss: 0.00586
Epoch: 47714, Training Loss: 0.00660
Epoch: 47714, Training Loss: 0.00619
Epoch: 47714, Training Loss: 0.00755
Epoch: 47714, Training Loss: 0.00586
Epoch: 47715, Training Loss: 0.00660
Epoch: 47715, Training Loss: 0.00619
Epoch: 47715, Training Loss: 0.00755
Epoch: 47715, Training Loss: 0.00586
Epoch: 47716, Training Loss: 0.00660
Epoch: 47716, Training Loss: 0.00619
Epoch: 47716, Training Loss: 0.00755
Epoch: 47716, Training Loss: 0.00586
Epoch: 47717, Training Loss: 0.00660
Epoch: 47717, Training Loss: 0.00619
Epoch: 47717, Training Loss: 0.00755
Epoch: 47717, Training Loss: 0.00586
Epoch: 47718, Training Loss: 0.00660
Epoch: 47718, Training Loss: 0.00619
Epoch: 47718, Training Loss: 0.00755
Epoch: 47718, Training Loss: 0.00586
Epoch: 47719, Training Loss: 0.00660
Epoch: 47719, Training Loss: 0.00619
Epoch: 47719, Training Loss: 0.00755
Epoch: 47719, Training Loss: 0.00586
Epoch: 47720, Training Loss: 0.00660
Epoch: 47720, Training Loss: 0.00619
Epoch: 47720, Training Loss: 0.00755
Epoch: 47720, Training Loss: 0.00586
Epoch: 47721, Training Loss: 0.00660
Epoch: 47721, Training Loss: 0.00619
Epoch: 47721, Training Loss: 0.00755
Epoch: 47721, Training Loss: 0.00586
Epoch: 47722, Training Loss: 0.00660
Epoch: 47722, Training Loss: 0.00619
Epoch: 47722, Training Loss: 0.00755
Epoch: 47722, Training Loss: 0.00586
Epoch: 47723, Training Loss: 0.00660
Epoch: 47723, Training Loss: 0.00619
Epoch: 47723, Training Loss: 0.00755
Epoch: 47723, Training Loss: 0.00586
Epoch: 47724, Training Loss: 0.00660
Epoch: 47724, Training Loss: 0.00619
Epoch: 47724, Training Loss: 0.00755
Epoch: 47724, Training Loss: 0.00586
Epoch: 47725, Training Loss: 0.00660
Epoch: 47725, Training Loss: 0.00619
Epoch: 47725, Training Loss: 0.00755
Epoch: 47725, Training Loss: 0.00586
Epoch: 47726, Training Loss: 0.00660
Epoch: 47726, Training Loss: 0.00619
Epoch: 47726, Training Loss: 0.00755
Epoch: 47726, Training Loss: 0.00586
Epoch: 47727, Training Loss: 0.00660
Epoch: 47727, Training Loss: 0.00619
Epoch: 47727, Training Loss: 0.00755
Epoch: 47727, Training Loss: 0.00586
Epoch: 47728, Training Loss: 0.00660
Epoch: 47728, Training Loss: 0.00619
Epoch: 47728, Training Loss: 0.00755
Epoch: 47728, Training Loss: 0.00586
Epoch: 47729, Training Loss: 0.00660
Epoch: 47729, Training Loss: 0.00619
Epoch: 47729, Training Loss: 0.00755
Epoch: 47729, Training Loss: 0.00586
Epoch: 47730, Training Loss: 0.00660
Epoch: 47730, Training Loss: 0.00619
Epoch: 47730, Training Loss: 0.00755
Epoch: 47730, Training Loss: 0.00586
Epoch: 47731, Training Loss: 0.00660
Epoch: 47731, Training Loss: 0.00619
Epoch: 47731, Training Loss: 0.00755
Epoch: 47731, Training Loss: 0.00586
Epoch: 47732, Training Loss: 0.00660
Epoch: 47732, Training Loss: 0.00619
Epoch: 47732, Training Loss: 0.00755
Epoch: 47732, Training Loss: 0.00586
Epoch: 47733, Training Loss: 0.00660
Epoch: 47733, Training Loss: 0.00619
Epoch: 47733, Training Loss: 0.00755
Epoch: 47733, Training Loss: 0.00586
Epoch: 47734, Training Loss: 0.00660
Epoch: 47734, Training Loss: 0.00619
Epoch: 47734, Training Loss: 0.00755
Epoch: 47734, Training Loss: 0.00586
Epoch: 47735, Training Loss: 0.00660
Epoch: 47735, Training Loss: 0.00619
Epoch: 47735, Training Loss: 0.00755
Epoch: 47735, Training Loss: 0.00586
Epoch: 47736, Training Loss: 0.00660
Epoch: 47736, Training Loss: 0.00619
Epoch: 47736, Training Loss: 0.00755
Epoch: 47736, Training Loss: 0.00586
Epoch: 47737, Training Loss: 0.00660
Epoch: 47737, Training Loss: 0.00619
Epoch: 47737, Training Loss: 0.00755
Epoch: 47737, Training Loss: 0.00586
Epoch: 47738, Training Loss: 0.00660
Epoch: 47738, Training Loss: 0.00619
Epoch: 47738, Training Loss: 0.00755
Epoch: 47738, Training Loss: 0.00586
Epoch: 47739, Training Loss: 0.00660
Epoch: 47739, Training Loss: 0.00619
Epoch: 47739, Training Loss: 0.00755
Epoch: 47739, Training Loss: 0.00586
Epoch: 47740, Training Loss: 0.00660
Epoch: 47740, Training Loss: 0.00619
Epoch: 47740, Training Loss: 0.00755
Epoch: 47740, Training Loss: 0.00586
Epoch: 47741, Training Loss: 0.00660
Epoch: 47741, Training Loss: 0.00619
Epoch: 47741, Training Loss: 0.00755
Epoch: 47741, Training Loss: 0.00586
Epoch: 47742, Training Loss: 0.00660
Epoch: 47742, Training Loss: 0.00619
Epoch: 47742, Training Loss: 0.00755
Epoch: 47742, Training Loss: 0.00586
Epoch: 47743, Training Loss: 0.00660
Epoch: 47743, Training Loss: 0.00619
Epoch: 47743, Training Loss: 0.00754
Epoch: 47743, Training Loss: 0.00586
Epoch: 47744, Training Loss: 0.00660
Epoch: 47744, Training Loss: 0.00619
Epoch: 47744, Training Loss: 0.00754
Epoch: 47744, Training Loss: 0.00586
Epoch: 47745, Training Loss: 0.00660
Epoch: 47745, Training Loss: 0.00619
Epoch: 47745, Training Loss: 0.00754
Epoch: 47745, Training Loss: 0.00586
Epoch: 47746, Training Loss: 0.00660
Epoch: 47746, Training Loss: 0.00619
Epoch: 47746, Training Loss: 0.00754
Epoch: 47746, Training Loss: 0.00586
Epoch: 47747, Training Loss: 0.00660
Epoch: 47747, Training Loss: 0.00619
Epoch: 47747, Training Loss: 0.00754
Epoch: 47747, Training Loss: 0.00586
Epoch: 47748, Training Loss: 0.00660
Epoch: 47748, Training Loss: 0.00619
Epoch: 47748, Training Loss: 0.00754
Epoch: 47748, Training Loss: 0.00586
Epoch: 47749, Training Loss: 0.00660
Epoch: 47749, Training Loss: 0.00619
Epoch: 47749, Training Loss: 0.00754
Epoch: 47749, Training Loss: 0.00586
Epoch: 47750, Training Loss: 0.00660
Epoch: 47750, Training Loss: 0.00619
Epoch: 47750, Training Loss: 0.00754
Epoch: 47750, Training Loss: 0.00586
Epoch: 47751, Training Loss: 0.00660
Epoch: 47751, Training Loss: 0.00619
Epoch: 47751, Training Loss: 0.00754
Epoch: 47751, Training Loss: 0.00586
Epoch: 47752, Training Loss: 0.00660
Epoch: 47752, Training Loss: 0.00619
Epoch: 47752, Training Loss: 0.00754
Epoch: 47752, Training Loss: 0.00586
Epoch: 47753, Training Loss: 0.00660
Epoch: 47753, Training Loss: 0.00619
Epoch: 47753, Training Loss: 0.00754
Epoch: 47753, Training Loss: 0.00586
Epoch: 47754, Training Loss: 0.00660
Epoch: 47754, Training Loss: 0.00619
Epoch: 47754, Training Loss: 0.00754
Epoch: 47754, Training Loss: 0.00586
Epoch: 47755, Training Loss: 0.00660
Epoch: 47755, Training Loss: 0.00618
Epoch: 47755, Training Loss: 0.00754
Epoch: 47755, Training Loss: 0.00586
Epoch: 47756, Training Loss: 0.00660
Epoch: 47756, Training Loss: 0.00618
Epoch: 47756, Training Loss: 0.00754
Epoch: 47756, Training Loss: 0.00586
Epoch: 47757, Training Loss: 0.00660
Epoch: 47757, Training Loss: 0.00618
Epoch: 47757, Training Loss: 0.00754
Epoch: 47757, Training Loss: 0.00586
Epoch: 47758, Training Loss: 0.00660
Epoch: 47758, Training Loss: 0.00618
Epoch: 47758, Training Loss: 0.00754
Epoch: 47758, Training Loss: 0.00586
Epoch: 47759, Training Loss: 0.00660
Epoch: 47759, Training Loss: 0.00618
Epoch: 47759, Training Loss: 0.00754
Epoch: 47759, Training Loss: 0.00586
Epoch: 47760, Training Loss: 0.00660
Epoch: 47760, Training Loss: 0.00618
Epoch: 47760, Training Loss: 0.00754
Epoch: 47760, Training Loss: 0.00586
Epoch: 47761, Training Loss: 0.00660
Epoch: 47761, Training Loss: 0.00618
Epoch: 47761, Training Loss: 0.00754
Epoch: 47761, Training Loss: 0.00586
Epoch: 47762, Training Loss: 0.00660
Epoch: 47762, Training Loss: 0.00618
Epoch: 47762, Training Loss: 0.00754
Epoch: 47762, Training Loss: 0.00585
Epoch: 47763, Training Loss: 0.00660
Epoch: 47763, Training Loss: 0.00618
Epoch: 47763, Training Loss: 0.00754
Epoch: 47763, Training Loss: 0.00585
Epoch: 47764, Training Loss: 0.00660
Epoch: 47764, Training Loss: 0.00618
Epoch: 47764, Training Loss: 0.00754
Epoch: 47764, Training Loss: 0.00585
Epoch: 47765, Training Loss: 0.00660
Epoch: 47765, Training Loss: 0.00618
Epoch: 47765, Training Loss: 0.00754
Epoch: 47765, Training Loss: 0.00585
Epoch: 47766, Training Loss: 0.00660
Epoch: 47766, Training Loss: 0.00618
Epoch: 47766, Training Loss: 0.00754
Epoch: 47766, Training Loss: 0.00585
Epoch: 47767, Training Loss: 0.00660
Epoch: 47767, Training Loss: 0.00618
Epoch: 47767, Training Loss: 0.00754
Epoch: 47767, Training Loss: 0.00585
Epoch: 47768, Training Loss: 0.00660
Epoch: 47768, Training Loss: 0.00618
Epoch: 47768, Training Loss: 0.00754
Epoch: 47768, Training Loss: 0.00585
Epoch: 47769, Training Loss: 0.00660
Epoch: 47769, Training Loss: 0.00618
Epoch: 47769, Training Loss: 0.00754
Epoch: 47769, Training Loss: 0.00585
Epoch: 47770, Training Loss: 0.00660
Epoch: 47770, Training Loss: 0.00618
Epoch: 47770, Training Loss: 0.00754
Epoch: 47770, Training Loss: 0.00585
Epoch: 47771, Training Loss: 0.00660
Epoch: 47771, Training Loss: 0.00618
Epoch: 47771, Training Loss: 0.00754
Epoch: 47771, Training Loss: 0.00585
Epoch: 47772, Training Loss: 0.00660
Epoch: 47772, Training Loss: 0.00618
Epoch: 47772, Training Loss: 0.00754
Epoch: 47772, Training Loss: 0.00585
Epoch: 47773, Training Loss: 0.00660
Epoch: 47773, Training Loss: 0.00618
Epoch: 47773, Training Loss: 0.00754
Epoch: 47773, Training Loss: 0.00585
Epoch: 47774, Training Loss: 0.00660
Epoch: 47774, Training Loss: 0.00618
Epoch: 47774, Training Loss: 0.00754
Epoch: 47774, Training Loss: 0.00585
Epoch: 47775, Training Loss: 0.00660
Epoch: 47775, Training Loss: 0.00618
Epoch: 47775, Training Loss: 0.00754
Epoch: 47775, Training Loss: 0.00585
Epoch: 47776, Training Loss: 0.00660
Epoch: 47776, Training Loss: 0.00618
Epoch: 47776, Training Loss: 0.00754
Epoch: 47776, Training Loss: 0.00585
Epoch: 47777, Training Loss: 0.00660
Epoch: 47777, Training Loss: 0.00618
Epoch: 47777, Training Loss: 0.00754
Epoch: 47777, Training Loss: 0.00585
Epoch: 47778, Training Loss: 0.00660
Epoch: 47778, Training Loss: 0.00618
Epoch: 47778, Training Loss: 0.00754
Epoch: 47778, Training Loss: 0.00585
Epoch: 47779, Training Loss: 0.00660
Epoch: 47779, Training Loss: 0.00618
Epoch: 47779, Training Loss: 0.00754
Epoch: 47779, Training Loss: 0.00585
Epoch: 47780, Training Loss: 0.00660
Epoch: 47780, Training Loss: 0.00618
Epoch: 47780, Training Loss: 0.00754
Epoch: 47780, Training Loss: 0.00585
Epoch: 47781, Training Loss: 0.00660
Epoch: 47781, Training Loss: 0.00618
Epoch: 47781, Training Loss: 0.00754
Epoch: 47781, Training Loss: 0.00585
Epoch: 47782, Training Loss: 0.00660
Epoch: 47782, Training Loss: 0.00618
Epoch: 47782, Training Loss: 0.00754
Epoch: 47782, Training Loss: 0.00585
Epoch: 47783, Training Loss: 0.00660
Epoch: 47783, Training Loss: 0.00618
Epoch: 47783, Training Loss: 0.00754
Epoch: 47783, Training Loss: 0.00585
Epoch: 47784, Training Loss: 0.00660
Epoch: 47784, Training Loss: 0.00618
Epoch: 47784, Training Loss: 0.00754
Epoch: 47784, Training Loss: 0.00585
Epoch: 47785, Training Loss: 0.00660
Epoch: 47785, Training Loss: 0.00618
Epoch: 47785, Training Loss: 0.00754
Epoch: 47785, Training Loss: 0.00585
Epoch: 47786, Training Loss: 0.00660
Epoch: 47786, Training Loss: 0.00618
Epoch: 47786, Training Loss: 0.00754
Epoch: 47786, Training Loss: 0.00585
Epoch: 47787, Training Loss: 0.00660
Epoch: 47787, Training Loss: 0.00618
Epoch: 47787, Training Loss: 0.00754
Epoch: 47787, Training Loss: 0.00585
Epoch: 47788, Training Loss: 0.00660
Epoch: 47788, Training Loss: 0.00618
Epoch: 47788, Training Loss: 0.00754
Epoch: 47788, Training Loss: 0.00585
Epoch: 47789, Training Loss: 0.00660
Epoch: 47789, Training Loss: 0.00618
Epoch: 47789, Training Loss: 0.00754
Epoch: 47789, Training Loss: 0.00585
Epoch: 47790, Training Loss: 0.00660
Epoch: 47790, Training Loss: 0.00618
Epoch: 47790, Training Loss: 0.00754
Epoch: 47790, Training Loss: 0.00585
Epoch: 47791, Training Loss: 0.00660
Epoch: 47791, Training Loss: 0.00618
Epoch: 47791, Training Loss: 0.00754
Epoch: 47791, Training Loss: 0.00585
Epoch: 47792, Training Loss: 0.00660
Epoch: 47792, Training Loss: 0.00618
Epoch: 47792, Training Loss: 0.00754
Epoch: 47792, Training Loss: 0.00585
Epoch: 47793, Training Loss: 0.00660
Epoch: 47793, Training Loss: 0.00618
Epoch: 47793, Training Loss: 0.00754
Epoch: 47793, Training Loss: 0.00585
Epoch: 47794, Training Loss: 0.00660
Epoch: 47794, Training Loss: 0.00618
Epoch: 47794, Training Loss: 0.00754
Epoch: 47794, Training Loss: 0.00585
Epoch: 47795, Training Loss: 0.00660
Epoch: 47795, Training Loss: 0.00618
Epoch: 47795, Training Loss: 0.00754
Epoch: 47795, Training Loss: 0.00585
Epoch: 47796, Training Loss: 0.00660
Epoch: 47796, Training Loss: 0.00618
Epoch: 47796, Training Loss: 0.00754
Epoch: 47796, Training Loss: 0.00585
Epoch: 47797, Training Loss: 0.00660
Epoch: 47797, Training Loss: 0.00618
Epoch: 47797, Training Loss: 0.00754
Epoch: 47797, Training Loss: 0.00585
Epoch: 47798, Training Loss: 0.00660
Epoch: 47798, Training Loss: 0.00618
Epoch: 47798, Training Loss: 0.00754
Epoch: 47798, Training Loss: 0.00585
Epoch: 47799, Training Loss: 0.00660
Epoch: 47799, Training Loss: 0.00618
Epoch: 47799, Training Loss: 0.00754
Epoch: 47799, Training Loss: 0.00585
Epoch: 47800, Training Loss: 0.00660
Epoch: 47800, Training Loss: 0.00618
Epoch: 47800, Training Loss: 0.00754
Epoch: 47800, Training Loss: 0.00585
Epoch: 47801, Training Loss: 0.00660
Epoch: 47801, Training Loss: 0.00618
Epoch: 47801, Training Loss: 0.00754
Epoch: 47801, Training Loss: 0.00585
Epoch: 47802, Training Loss: 0.00660
Epoch: 47802, Training Loss: 0.00618
Epoch: 47802, Training Loss: 0.00754
Epoch: 47802, Training Loss: 0.00585
Epoch: 47803, Training Loss: 0.00660
Epoch: 47803, Training Loss: 0.00618
Epoch: 47803, Training Loss: 0.00754
Epoch: 47803, Training Loss: 0.00585
Epoch: 47804, Training Loss: 0.00660
Epoch: 47804, Training Loss: 0.00618
Epoch: 47804, Training Loss: 0.00754
Epoch: 47804, Training Loss: 0.00585
Epoch: 47805, Training Loss: 0.00660
Epoch: 47805, Training Loss: 0.00618
Epoch: 47805, Training Loss: 0.00754
Epoch: 47805, Training Loss: 0.00585
Epoch: 47806, Training Loss: 0.00660
Epoch: 47806, Training Loss: 0.00618
Epoch: 47806, Training Loss: 0.00754
Epoch: 47806, Training Loss: 0.00585
Epoch: 47807, Training Loss: 0.00660
Epoch: 47807, Training Loss: 0.00618
Epoch: 47807, Training Loss: 0.00754
Epoch: 47807, Training Loss: 0.00585
Epoch: 47808, Training Loss: 0.00660
Epoch: 47808, Training Loss: 0.00618
Epoch: 47808, Training Loss: 0.00754
Epoch: 47808, Training Loss: 0.00585
Epoch: 47809, Training Loss: 0.00660
Epoch: 47809, Training Loss: 0.00618
Epoch: 47809, Training Loss: 0.00754
Epoch: 47809, Training Loss: 0.00585
Epoch: 47810, Training Loss: 0.00660
Epoch: 47810, Training Loss: 0.00618
Epoch: 47810, Training Loss: 0.00754
Epoch: 47810, Training Loss: 0.00585
Epoch: 47811, Training Loss: 0.00660
Epoch: 47811, Training Loss: 0.00618
Epoch: 47811, Training Loss: 0.00754
Epoch: 47811, Training Loss: 0.00585
Epoch: 47812, Training Loss: 0.00660
Epoch: 47812, Training Loss: 0.00618
Epoch: 47812, Training Loss: 0.00754
Epoch: 47812, Training Loss: 0.00585
Epoch: 47813, Training Loss: 0.00660
Epoch: 47813, Training Loss: 0.00618
Epoch: 47813, Training Loss: 0.00754
Epoch: 47813, Training Loss: 0.00585
Epoch: 47814, Training Loss: 0.00660
Epoch: 47814, Training Loss: 0.00618
Epoch: 47814, Training Loss: 0.00754
Epoch: 47814, Training Loss: 0.00585
Epoch: 47815, Training Loss: 0.00660
Epoch: 47815, Training Loss: 0.00618
Epoch: 47815, Training Loss: 0.00754
Epoch: 47815, Training Loss: 0.00585
Epoch: 47816, Training Loss: 0.00660
Epoch: 47816, Training Loss: 0.00618
Epoch: 47816, Training Loss: 0.00754
Epoch: 47816, Training Loss: 0.00585
Epoch: 47817, Training Loss: 0.00660
Epoch: 47817, Training Loss: 0.00618
Epoch: 47817, Training Loss: 0.00754
Epoch: 47817, Training Loss: 0.00585
Epoch: 47818, Training Loss: 0.00660
Epoch: 47818, Training Loss: 0.00618
Epoch: 47818, Training Loss: 0.00754
Epoch: 47818, Training Loss: 0.00585
Epoch: 47819, Training Loss: 0.00660
Epoch: 47819, Training Loss: 0.00618
Epoch: 47819, Training Loss: 0.00754
Epoch: 47819, Training Loss: 0.00585
Epoch: 47820, Training Loss: 0.00660
Epoch: 47820, Training Loss: 0.00618
Epoch: 47820, Training Loss: 0.00754
Epoch: 47820, Training Loss: 0.00585
Epoch: 47821, Training Loss: 0.00660
Epoch: 47821, Training Loss: 0.00618
Epoch: 47821, Training Loss: 0.00754
Epoch: 47821, Training Loss: 0.00585
Epoch: 47822, Training Loss: 0.00660
Epoch: 47822, Training Loss: 0.00618
Epoch: 47822, Training Loss: 0.00754
Epoch: 47822, Training Loss: 0.00585
Epoch: 47823, Training Loss: 0.00660
Epoch: 47823, Training Loss: 0.00618
Epoch: 47823, Training Loss: 0.00754
Epoch: 47823, Training Loss: 0.00585
Epoch: 47824, Training Loss: 0.00660
Epoch: 47824, Training Loss: 0.00618
Epoch: 47824, Training Loss: 0.00754
Epoch: 47824, Training Loss: 0.00585
Epoch: 47825, Training Loss: 0.00660
Epoch: 47825, Training Loss: 0.00618
Epoch: 47825, Training Loss: 0.00754
Epoch: 47825, Training Loss: 0.00585
Epoch: 47826, Training Loss: 0.00659
Epoch: 47826, Training Loss: 0.00618
Epoch: 47826, Training Loss: 0.00754
Epoch: 47826, Training Loss: 0.00585
Epoch: 47827, Training Loss: 0.00659
Epoch: 47827, Training Loss: 0.00618
Epoch: 47827, Training Loss: 0.00754
Epoch: 47827, Training Loss: 0.00585
Epoch: 47828, Training Loss: 0.00659
Epoch: 47828, Training Loss: 0.00618
Epoch: 47828, Training Loss: 0.00754
Epoch: 47828, Training Loss: 0.00585
Epoch: 47829, Training Loss: 0.00659
Epoch: 47829, Training Loss: 0.00618
Epoch: 47829, Training Loss: 0.00754
Epoch: 47829, Training Loss: 0.00585
Epoch: 47830, Training Loss: 0.00659
Epoch: 47830, Training Loss: 0.00618
Epoch: 47830, Training Loss: 0.00754
Epoch: 47830, Training Loss: 0.00585
Epoch: 47831, Training Loss: 0.00659
Epoch: 47831, Training Loss: 0.00618
Epoch: 47831, Training Loss: 0.00754
Epoch: 47831, Training Loss: 0.00585
... (per-sample loss output for epochs 47831-48075 elided: the four losses per epoch remain nearly constant, around 0.00658, 0.00617, 0.00752, and 0.00584, decreasing only in the last digit) ...
Epoch: 48075, Training Loss: 0.00616
Epoch: 48075, Training Loss: 0.00752
Epoch: 48075, Training Loss: 0.00584
Epoch: 48076, Training Loss: 0.00658
Epoch: 48076, Training Loss: 0.00616
Epoch: 48076, Training Loss: 0.00752
Epoch: 48076, Training Loss: 0.00584
Epoch: 48077, Training Loss: 0.00658
Epoch: 48077, Training Loss: 0.00616
Epoch: 48077, Training Loss: 0.00752
Epoch: 48077, Training Loss: 0.00584
Epoch: 48078, Training Loss: 0.00658
Epoch: 48078, Training Loss: 0.00616
Epoch: 48078, Training Loss: 0.00752
Epoch: 48078, Training Loss: 0.00584
Epoch: 48079, Training Loss: 0.00658
Epoch: 48079, Training Loss: 0.00616
Epoch: 48079, Training Loss: 0.00752
Epoch: 48079, Training Loss: 0.00583
Epoch: 48080, Training Loss: 0.00658
Epoch: 48080, Training Loss: 0.00616
Epoch: 48080, Training Loss: 0.00752
Epoch: 48080, Training Loss: 0.00583
Epoch: 48081, Training Loss: 0.00658
Epoch: 48081, Training Loss: 0.00616
Epoch: 48081, Training Loss: 0.00752
Epoch: 48081, Training Loss: 0.00583
Epoch: 48082, Training Loss: 0.00658
Epoch: 48082, Training Loss: 0.00616
Epoch: 48082, Training Loss: 0.00752
Epoch: 48082, Training Loss: 0.00583
Epoch: 48083, Training Loss: 0.00658
Epoch: 48083, Training Loss: 0.00616
Epoch: 48083, Training Loss: 0.00752
Epoch: 48083, Training Loss: 0.00583
Epoch: 48084, Training Loss: 0.00658
Epoch: 48084, Training Loss: 0.00616
Epoch: 48084, Training Loss: 0.00752
Epoch: 48084, Training Loss: 0.00583
Epoch: 48085, Training Loss: 0.00658
Epoch: 48085, Training Loss: 0.00616
Epoch: 48085, Training Loss: 0.00752
Epoch: 48085, Training Loss: 0.00583
Epoch: 48086, Training Loss: 0.00658
Epoch: 48086, Training Loss: 0.00616
Epoch: 48086, Training Loss: 0.00752
Epoch: 48086, Training Loss: 0.00583
Epoch: 48087, Training Loss: 0.00658
Epoch: 48087, Training Loss: 0.00616
Epoch: 48087, Training Loss: 0.00752
Epoch: 48087, Training Loss: 0.00583
Epoch: 48088, Training Loss: 0.00658
Epoch: 48088, Training Loss: 0.00616
Epoch: 48088, Training Loss: 0.00752
Epoch: 48088, Training Loss: 0.00583
Epoch: 48089, Training Loss: 0.00658
Epoch: 48089, Training Loss: 0.00616
Epoch: 48089, Training Loss: 0.00752
Epoch: 48089, Training Loss: 0.00583
Epoch: 48090, Training Loss: 0.00658
Epoch: 48090, Training Loss: 0.00616
Epoch: 48090, Training Loss: 0.00752
Epoch: 48090, Training Loss: 0.00583
Epoch: 48091, Training Loss: 0.00658
Epoch: 48091, Training Loss: 0.00616
Epoch: 48091, Training Loss: 0.00752
Epoch: 48091, Training Loss: 0.00583
Epoch: 48092, Training Loss: 0.00658
Epoch: 48092, Training Loss: 0.00616
Epoch: 48092, Training Loss: 0.00752
Epoch: 48092, Training Loss: 0.00583
Epoch: 48093, Training Loss: 0.00658
Epoch: 48093, Training Loss: 0.00616
Epoch: 48093, Training Loss: 0.00752
Epoch: 48093, Training Loss: 0.00583
Epoch: 48094, Training Loss: 0.00658
Epoch: 48094, Training Loss: 0.00616
Epoch: 48094, Training Loss: 0.00752
Epoch: 48094, Training Loss: 0.00583
Epoch: 48095, Training Loss: 0.00658
Epoch: 48095, Training Loss: 0.00616
Epoch: 48095, Training Loss: 0.00752
Epoch: 48095, Training Loss: 0.00583
Epoch: 48096, Training Loss: 0.00658
Epoch: 48096, Training Loss: 0.00616
Epoch: 48096, Training Loss: 0.00752
Epoch: 48096, Training Loss: 0.00583
Epoch: 48097, Training Loss: 0.00658
Epoch: 48097, Training Loss: 0.00616
Epoch: 48097, Training Loss: 0.00752
Epoch: 48097, Training Loss: 0.00583
Epoch: 48098, Training Loss: 0.00658
Epoch: 48098, Training Loss: 0.00616
Epoch: 48098, Training Loss: 0.00752
Epoch: 48098, Training Loss: 0.00583
Epoch: 48099, Training Loss: 0.00658
Epoch: 48099, Training Loss: 0.00616
Epoch: 48099, Training Loss: 0.00752
Epoch: 48099, Training Loss: 0.00583
Epoch: 48100, Training Loss: 0.00658
Epoch: 48100, Training Loss: 0.00616
Epoch: 48100, Training Loss: 0.00752
Epoch: 48100, Training Loss: 0.00583
Epoch: 48101, Training Loss: 0.00658
Epoch: 48101, Training Loss: 0.00616
Epoch: 48101, Training Loss: 0.00752
Epoch: 48101, Training Loss: 0.00583
Epoch: 48102, Training Loss: 0.00658
Epoch: 48102, Training Loss: 0.00616
Epoch: 48102, Training Loss: 0.00752
Epoch: 48102, Training Loss: 0.00583
Epoch: 48103, Training Loss: 0.00658
Epoch: 48103, Training Loss: 0.00616
Epoch: 48103, Training Loss: 0.00752
Epoch: 48103, Training Loss: 0.00583
Epoch: 48104, Training Loss: 0.00657
Epoch: 48104, Training Loss: 0.00616
Epoch: 48104, Training Loss: 0.00752
Epoch: 48104, Training Loss: 0.00583
Epoch: 48105, Training Loss: 0.00657
Epoch: 48105, Training Loss: 0.00616
Epoch: 48105, Training Loss: 0.00752
Epoch: 48105, Training Loss: 0.00583
Epoch: 48106, Training Loss: 0.00657
Epoch: 48106, Training Loss: 0.00616
Epoch: 48106, Training Loss: 0.00752
Epoch: 48106, Training Loss: 0.00583
Epoch: 48107, Training Loss: 0.00657
Epoch: 48107, Training Loss: 0.00616
Epoch: 48107, Training Loss: 0.00752
Epoch: 48107, Training Loss: 0.00583
Epoch: 48108, Training Loss: 0.00657
Epoch: 48108, Training Loss: 0.00616
Epoch: 48108, Training Loss: 0.00752
Epoch: 48108, Training Loss: 0.00583
Epoch: 48109, Training Loss: 0.00657
Epoch: 48109, Training Loss: 0.00616
Epoch: 48109, Training Loss: 0.00751
Epoch: 48109, Training Loss: 0.00583
Epoch: 48110, Training Loss: 0.00657
Epoch: 48110, Training Loss: 0.00616
Epoch: 48110, Training Loss: 0.00751
Epoch: 48110, Training Loss: 0.00583
Epoch: 48111, Training Loss: 0.00657
Epoch: 48111, Training Loss: 0.00616
Epoch: 48111, Training Loss: 0.00751
Epoch: 48111, Training Loss: 0.00583
Epoch: 48112, Training Loss: 0.00657
Epoch: 48112, Training Loss: 0.00616
Epoch: 48112, Training Loss: 0.00751
Epoch: 48112, Training Loss: 0.00583
Epoch: 48113, Training Loss: 0.00657
Epoch: 48113, Training Loss: 0.00616
Epoch: 48113, Training Loss: 0.00751
Epoch: 48113, Training Loss: 0.00583
Epoch: 48114, Training Loss: 0.00657
Epoch: 48114, Training Loss: 0.00616
Epoch: 48114, Training Loss: 0.00751
Epoch: 48114, Training Loss: 0.00583
Epoch: 48115, Training Loss: 0.00657
Epoch: 48115, Training Loss: 0.00616
Epoch: 48115, Training Loss: 0.00751
Epoch: 48115, Training Loss: 0.00583
Epoch: 48116, Training Loss: 0.00657
Epoch: 48116, Training Loss: 0.00616
Epoch: 48116, Training Loss: 0.00751
Epoch: 48116, Training Loss: 0.00583
Epoch: 48117, Training Loss: 0.00657
Epoch: 48117, Training Loss: 0.00616
Epoch: 48117, Training Loss: 0.00751
Epoch: 48117, Training Loss: 0.00583
Epoch: 48118, Training Loss: 0.00657
Epoch: 48118, Training Loss: 0.00616
Epoch: 48118, Training Loss: 0.00751
Epoch: 48118, Training Loss: 0.00583
Epoch: 48119, Training Loss: 0.00657
Epoch: 48119, Training Loss: 0.00616
Epoch: 48119, Training Loss: 0.00751
Epoch: 48119, Training Loss: 0.00583
Epoch: 48120, Training Loss: 0.00657
Epoch: 48120, Training Loss: 0.00616
Epoch: 48120, Training Loss: 0.00751
Epoch: 48120, Training Loss: 0.00583
Epoch: 48121, Training Loss: 0.00657
Epoch: 48121, Training Loss: 0.00616
Epoch: 48121, Training Loss: 0.00751
Epoch: 48121, Training Loss: 0.00583
Epoch: 48122, Training Loss: 0.00657
Epoch: 48122, Training Loss: 0.00616
Epoch: 48122, Training Loss: 0.00751
Epoch: 48122, Training Loss: 0.00583
Epoch: 48123, Training Loss: 0.00657
Epoch: 48123, Training Loss: 0.00616
Epoch: 48123, Training Loss: 0.00751
Epoch: 48123, Training Loss: 0.00583
Epoch: 48124, Training Loss: 0.00657
Epoch: 48124, Training Loss: 0.00616
Epoch: 48124, Training Loss: 0.00751
Epoch: 48124, Training Loss: 0.00583
Epoch: 48125, Training Loss: 0.00657
Epoch: 48125, Training Loss: 0.00616
Epoch: 48125, Training Loss: 0.00751
Epoch: 48125, Training Loss: 0.00583
Epoch: 48126, Training Loss: 0.00657
Epoch: 48126, Training Loss: 0.00616
Epoch: 48126, Training Loss: 0.00751
Epoch: 48126, Training Loss: 0.00583
Epoch: 48127, Training Loss: 0.00657
Epoch: 48127, Training Loss: 0.00616
Epoch: 48127, Training Loss: 0.00751
Epoch: 48127, Training Loss: 0.00583
Epoch: 48128, Training Loss: 0.00657
Epoch: 48128, Training Loss: 0.00616
Epoch: 48128, Training Loss: 0.00751
Epoch: 48128, Training Loss: 0.00583
Epoch: 48129, Training Loss: 0.00657
Epoch: 48129, Training Loss: 0.00616
Epoch: 48129, Training Loss: 0.00751
Epoch: 48129, Training Loss: 0.00583
Epoch: 48130, Training Loss: 0.00657
Epoch: 48130, Training Loss: 0.00616
Epoch: 48130, Training Loss: 0.00751
Epoch: 48130, Training Loss: 0.00583
Epoch: 48131, Training Loss: 0.00657
Epoch: 48131, Training Loss: 0.00616
Epoch: 48131, Training Loss: 0.00751
Epoch: 48131, Training Loss: 0.00583
Epoch: 48132, Training Loss: 0.00657
Epoch: 48132, Training Loss: 0.00616
Epoch: 48132, Training Loss: 0.00751
Epoch: 48132, Training Loss: 0.00583
Epoch: 48133, Training Loss: 0.00657
Epoch: 48133, Training Loss: 0.00616
Epoch: 48133, Training Loss: 0.00751
Epoch: 48133, Training Loss: 0.00583
Epoch: 48134, Training Loss: 0.00657
Epoch: 48134, Training Loss: 0.00616
Epoch: 48134, Training Loss: 0.00751
Epoch: 48134, Training Loss: 0.00583
Epoch: 48135, Training Loss: 0.00657
Epoch: 48135, Training Loss: 0.00616
Epoch: 48135, Training Loss: 0.00751
Epoch: 48135, Training Loss: 0.00583
Epoch: 48136, Training Loss: 0.00657
Epoch: 48136, Training Loss: 0.00616
Epoch: 48136, Training Loss: 0.00751
Epoch: 48136, Training Loss: 0.00583
Epoch: 48137, Training Loss: 0.00657
Epoch: 48137, Training Loss: 0.00616
Epoch: 48137, Training Loss: 0.00751
Epoch: 48137, Training Loss: 0.00583
Epoch: 48138, Training Loss: 0.00657
Epoch: 48138, Training Loss: 0.00616
Epoch: 48138, Training Loss: 0.00751
Epoch: 48138, Training Loss: 0.00583
Epoch: 48139, Training Loss: 0.00657
Epoch: 48139, Training Loss: 0.00616
Epoch: 48139, Training Loss: 0.00751
Epoch: 48139, Training Loss: 0.00583
Epoch: 48140, Training Loss: 0.00657
Epoch: 48140, Training Loss: 0.00616
Epoch: 48140, Training Loss: 0.00751
Epoch: 48140, Training Loss: 0.00583
Epoch: 48141, Training Loss: 0.00657
Epoch: 48141, Training Loss: 0.00616
Epoch: 48141, Training Loss: 0.00751
Epoch: 48141, Training Loss: 0.00583
Epoch: 48142, Training Loss: 0.00657
Epoch: 48142, Training Loss: 0.00616
Epoch: 48142, Training Loss: 0.00751
Epoch: 48142, Training Loss: 0.00583
Epoch: 48143, Training Loss: 0.00657
Epoch: 48143, Training Loss: 0.00616
Epoch: 48143, Training Loss: 0.00751
Epoch: 48143, Training Loss: 0.00583
Epoch: 48144, Training Loss: 0.00657
Epoch: 48144, Training Loss: 0.00616
Epoch: 48144, Training Loss: 0.00751
Epoch: 48144, Training Loss: 0.00583
Epoch: 48145, Training Loss: 0.00657
Epoch: 48145, Training Loss: 0.00616
Epoch: 48145, Training Loss: 0.00751
Epoch: 48145, Training Loss: 0.00583
Epoch: 48146, Training Loss: 0.00657
Epoch: 48146, Training Loss: 0.00616
Epoch: 48146, Training Loss: 0.00751
Epoch: 48146, Training Loss: 0.00583
Epoch: 48147, Training Loss: 0.00657
Epoch: 48147, Training Loss: 0.00616
Epoch: 48147, Training Loss: 0.00751
Epoch: 48147, Training Loss: 0.00583
Epoch: 48148, Training Loss: 0.00657
Epoch: 48148, Training Loss: 0.00616
Epoch: 48148, Training Loss: 0.00751
Epoch: 48148, Training Loss: 0.00583
Epoch: 48149, Training Loss: 0.00657
Epoch: 48149, Training Loss: 0.00616
Epoch: 48149, Training Loss: 0.00751
Epoch: 48149, Training Loss: 0.00583
Epoch: 48150, Training Loss: 0.00657
Epoch: 48150, Training Loss: 0.00616
Epoch: 48150, Training Loss: 0.00751
Epoch: 48150, Training Loss: 0.00583
Epoch: 48151, Training Loss: 0.00657
Epoch: 48151, Training Loss: 0.00616
Epoch: 48151, Training Loss: 0.00751
Epoch: 48151, Training Loss: 0.00583
Epoch: 48152, Training Loss: 0.00657
Epoch: 48152, Training Loss: 0.00616
Epoch: 48152, Training Loss: 0.00751
Epoch: 48152, Training Loss: 0.00583
Epoch: 48153, Training Loss: 0.00657
Epoch: 48153, Training Loss: 0.00616
Epoch: 48153, Training Loss: 0.00751
Epoch: 48153, Training Loss: 0.00583
Epoch: 48154, Training Loss: 0.00657
Epoch: 48154, Training Loss: 0.00616
Epoch: 48154, Training Loss: 0.00751
Epoch: 48154, Training Loss: 0.00583
Epoch: 48155, Training Loss: 0.00657
Epoch: 48155, Training Loss: 0.00616
Epoch: 48155, Training Loss: 0.00751
Epoch: 48155, Training Loss: 0.00583
Epoch: 48156, Training Loss: 0.00657
Epoch: 48156, Training Loss: 0.00616
Epoch: 48156, Training Loss: 0.00751
Epoch: 48156, Training Loss: 0.00583
Epoch: 48157, Training Loss: 0.00657
Epoch: 48157, Training Loss: 0.00616
Epoch: 48157, Training Loss: 0.00751
Epoch: 48157, Training Loss: 0.00583
Epoch: 48158, Training Loss: 0.00657
Epoch: 48158, Training Loss: 0.00616
Epoch: 48158, Training Loss: 0.00751
Epoch: 48158, Training Loss: 0.00583
Epoch: 48159, Training Loss: 0.00657
Epoch: 48159, Training Loss: 0.00616
Epoch: 48159, Training Loss: 0.00751
Epoch: 48159, Training Loss: 0.00583
Epoch: 48160, Training Loss: 0.00657
Epoch: 48160, Training Loss: 0.00616
Epoch: 48160, Training Loss: 0.00751
Epoch: 48160, Training Loss: 0.00583
Epoch: 48161, Training Loss: 0.00657
Epoch: 48161, Training Loss: 0.00616
Epoch: 48161, Training Loss: 0.00751
Epoch: 48161, Training Loss: 0.00583
Epoch: 48162, Training Loss: 0.00657
Epoch: 48162, Training Loss: 0.00616
Epoch: 48162, Training Loss: 0.00751
Epoch: 48162, Training Loss: 0.00583
Epoch: 48163, Training Loss: 0.00657
Epoch: 48163, Training Loss: 0.00616
Epoch: 48163, Training Loss: 0.00751
Epoch: 48163, Training Loss: 0.00583
Epoch: 48164, Training Loss: 0.00657
Epoch: 48164, Training Loss: 0.00616
Epoch: 48164, Training Loss: 0.00751
Epoch: 48164, Training Loss: 0.00583
Epoch: 48165, Training Loss: 0.00657
Epoch: 48165, Training Loss: 0.00616
Epoch: 48165, Training Loss: 0.00751
Epoch: 48165, Training Loss: 0.00583
Epoch: 48166, Training Loss: 0.00657
Epoch: 48166, Training Loss: 0.00616
Epoch: 48166, Training Loss: 0.00751
Epoch: 48166, Training Loss: 0.00583
Epoch: 48167, Training Loss: 0.00657
Epoch: 48167, Training Loss: 0.00616
Epoch: 48167, Training Loss: 0.00751
Epoch: 48167, Training Loss: 0.00583
Epoch: 48168, Training Loss: 0.00657
Epoch: 48168, Training Loss: 0.00616
Epoch: 48168, Training Loss: 0.00751
Epoch: 48168, Training Loss: 0.00583
Epoch: 48169, Training Loss: 0.00657
Epoch: 48169, Training Loss: 0.00616
Epoch: 48169, Training Loss: 0.00751
Epoch: 48169, Training Loss: 0.00583
Epoch: 48170, Training Loss: 0.00657
Epoch: 48170, Training Loss: 0.00616
Epoch: 48170, Training Loss: 0.00751
Epoch: 48170, Training Loss: 0.00583
Epoch: 48171, Training Loss: 0.00657
Epoch: 48171, Training Loss: 0.00616
Epoch: 48171, Training Loss: 0.00751
Epoch: 48171, Training Loss: 0.00583
Epoch: 48172, Training Loss: 0.00657
Epoch: 48172, Training Loss: 0.00616
Epoch: 48172, Training Loss: 0.00751
Epoch: 48172, Training Loss: 0.00583
Epoch: 48173, Training Loss: 0.00657
Epoch: 48173, Training Loss: 0.00616
Epoch: 48173, Training Loss: 0.00751
Epoch: 48173, Training Loss: 0.00583
Epoch: 48174, Training Loss: 0.00657
Epoch: 48174, Training Loss: 0.00616
Epoch: 48174, Training Loss: 0.00751
Epoch: 48174, Training Loss: 0.00583
Epoch: 48175, Training Loss: 0.00657
Epoch: 48175, Training Loss: 0.00616
Epoch: 48175, Training Loss: 0.00751
Epoch: 48175, Training Loss: 0.00583
Epoch: 48176, Training Loss: 0.00657
Epoch: 48176, Training Loss: 0.00616
Epoch: 48176, Training Loss: 0.00751
Epoch: 48176, Training Loss: 0.00583
Epoch: 48177, Training Loss: 0.00657
Epoch: 48177, Training Loss: 0.00616
Epoch: 48177, Training Loss: 0.00751
Epoch: 48177, Training Loss: 0.00583
Epoch: 48178, Training Loss: 0.00657
Epoch: 48178, Training Loss: 0.00616
Epoch: 48178, Training Loss: 0.00751
Epoch: 48178, Training Loss: 0.00583
Epoch: 48179, Training Loss: 0.00657
Epoch: 48179, Training Loss: 0.00616
Epoch: 48179, Training Loss: 0.00751
Epoch: 48179, Training Loss: 0.00583
Epoch: 48180, Training Loss: 0.00657
Epoch: 48180, Training Loss: 0.00616
Epoch: 48180, Training Loss: 0.00751
Epoch: 48180, Training Loss: 0.00583
Epoch: 48181, Training Loss: 0.00657
Epoch: 48181, Training Loss: 0.00616
Epoch: 48181, Training Loss: 0.00751
Epoch: 48181, Training Loss: 0.00583
Epoch: 48182, Training Loss: 0.00657
Epoch: 48182, Training Loss: 0.00616
Epoch: 48182, Training Loss: 0.00751
Epoch: 48182, Training Loss: 0.00583
Epoch: 48183, Training Loss: 0.00657
Epoch: 48183, Training Loss: 0.00616
Epoch: 48183, Training Loss: 0.00751
Epoch: 48183, Training Loss: 0.00583
Epoch: 48184, Training Loss: 0.00657
Epoch: 48184, Training Loss: 0.00616
Epoch: 48184, Training Loss: 0.00751
Epoch: 48184, Training Loss: 0.00583
Epoch: 48185, Training Loss: 0.00657
Epoch: 48185, Training Loss: 0.00616
Epoch: 48185, Training Loss: 0.00751
Epoch: 48185, Training Loss: 0.00583
Epoch: 48186, Training Loss: 0.00657
Epoch: 48186, Training Loss: 0.00616
Epoch: 48186, Training Loss: 0.00751
Epoch: 48186, Training Loss: 0.00583
Epoch: 48187, Training Loss: 0.00657
Epoch: 48187, Training Loss: 0.00616
Epoch: 48187, Training Loss: 0.00751
Epoch: 48187, Training Loss: 0.00583
Epoch: 48188, Training Loss: 0.00657
Epoch: 48188, Training Loss: 0.00616
Epoch: 48188, Training Loss: 0.00751
Epoch: 48188, Training Loss: 0.00583
Epoch: 48189, Training Loss: 0.00657
Epoch: 48189, Training Loss: 0.00616
Epoch: 48189, Training Loss: 0.00751
Epoch: 48189, Training Loss: 0.00583
Epoch: 48190, Training Loss: 0.00657
Epoch: 48190, Training Loss: 0.00616
Epoch: 48190, Training Loss: 0.00751
Epoch: 48190, Training Loss: 0.00583
Epoch: 48191, Training Loss: 0.00657
Epoch: 48191, Training Loss: 0.00616
Epoch: 48191, Training Loss: 0.00751
Epoch: 48191, Training Loss: 0.00583
Epoch: 48192, Training Loss: 0.00657
Epoch: 48192, Training Loss: 0.00616
Epoch: 48192, Training Loss: 0.00751
Epoch: 48192, Training Loss: 0.00583
Epoch: 48193, Training Loss: 0.00657
Epoch: 48193, Training Loss: 0.00616
Epoch: 48193, Training Loss: 0.00751
Epoch: 48193, Training Loss: 0.00583
Epoch: 48194, Training Loss: 0.00657
Epoch: 48194, Training Loss: 0.00616
Epoch: 48194, Training Loss: 0.00751
Epoch: 48194, Training Loss: 0.00583
Epoch: 48195, Training Loss: 0.00657
Epoch: 48195, Training Loss: 0.00616
Epoch: 48195, Training Loss: 0.00751
Epoch: 48195, Training Loss: 0.00583
Epoch: 48196, Training Loss: 0.00657
Epoch: 48196, Training Loss: 0.00616
Epoch: 48196, Training Loss: 0.00751
Epoch: 48196, Training Loss: 0.00583
Epoch: 48197, Training Loss: 0.00657
Epoch: 48197, Training Loss: 0.00616
Epoch: 48197, Training Loss: 0.00751
Epoch: 48197, Training Loss: 0.00583
Epoch: 48198, Training Loss: 0.00657
Epoch: 48198, Training Loss: 0.00616
Epoch: 48198, Training Loss: 0.00751
Epoch: 48198, Training Loss: 0.00583
Epoch: 48199, Training Loss: 0.00657
Epoch: 48199, Training Loss: 0.00616
Epoch: 48199, Training Loss: 0.00751
Epoch: 48199, Training Loss: 0.00583
Epoch: 48200, Training Loss: 0.00657
Epoch: 48200, Training Loss: 0.00616
Epoch: 48200, Training Loss: 0.00751
Epoch: 48200, Training Loss: 0.00583
Epoch: 48201, Training Loss: 0.00657
Epoch: 48201, Training Loss: 0.00616
Epoch: 48201, Training Loss: 0.00751
Epoch: 48201, Training Loss: 0.00583
Epoch: 48202, Training Loss: 0.00657
Epoch: 48202, Training Loss: 0.00616
Epoch: 48202, Training Loss: 0.00751
Epoch: 48202, Training Loss: 0.00583
Epoch: 48203, Training Loss: 0.00657
Epoch: 48203, Training Loss: 0.00616
Epoch: 48203, Training Loss: 0.00751
Epoch: 48203, Training Loss: 0.00583
Epoch: 48204, Training Loss: 0.00657
Epoch: 48204, Training Loss: 0.00616
Epoch: 48204, Training Loss: 0.00751
Epoch: 48204, Training Loss: 0.00583
Epoch: 48205, Training Loss: 0.00657
Epoch: 48205, Training Loss: 0.00616
Epoch: 48205, Training Loss: 0.00751
Epoch: 48205, Training Loss: 0.00583
Epoch: 48206, Training Loss: 0.00657
Epoch: 48206, Training Loss: 0.00615
Epoch: 48206, Training Loss: 0.00751
Epoch: 48206, Training Loss: 0.00583
Epoch: 48207, Training Loss: 0.00657
Epoch: 48207, Training Loss: 0.00615
Epoch: 48207, Training Loss: 0.00751
Epoch: 48207, Training Loss: 0.00583
Epoch: 48208, Training Loss: 0.00657
Epoch: 48208, Training Loss: 0.00615
Epoch: 48208, Training Loss: 0.00751
Epoch: 48208, Training Loss: 0.00583
Epoch: 48209, Training Loss: 0.00657
Epoch: 48209, Training Loss: 0.00615
Epoch: 48209, Training Loss: 0.00751
Epoch: 48209, Training Loss: 0.00583
Epoch: 48210, Training Loss: 0.00657
Epoch: 48210, Training Loss: 0.00615
Epoch: 48210, Training Loss: 0.00751
Epoch: 48210, Training Loss: 0.00583
Epoch: 48211, Training Loss: 0.00657
Epoch: 48211, Training Loss: 0.00615
Epoch: 48211, Training Loss: 0.00751
Epoch: 48211, Training Loss: 0.00583
Epoch: 48212, Training Loss: 0.00657
Epoch: 48212, Training Loss: 0.00615
Epoch: 48212, Training Loss: 0.00751
Epoch: 48212, Training Loss: 0.00583
Epoch: 48213, Training Loss: 0.00657
Epoch: 48213, Training Loss: 0.00615
Epoch: 48213, Training Loss: 0.00751
Epoch: 48213, Training Loss: 0.00583
Epoch: 48214, Training Loss: 0.00657
Epoch: 48214, Training Loss: 0.00615
Epoch: 48214, Training Loss: 0.00751
Epoch: 48214, Training Loss: 0.00583
Epoch: 48215, Training Loss: 0.00657
Epoch: 48215, Training Loss: 0.00615
Epoch: 48215, Training Loss: 0.00751
Epoch: 48215, Training Loss: 0.00583
Epoch: 48216, Training Loss: 0.00657
Epoch: 48216, Training Loss: 0.00615
Epoch: 48216, Training Loss: 0.00751
Epoch: 48216, Training Loss: 0.00583
Epoch: 48217, Training Loss: 0.00657
Epoch: 48217, Training Loss: 0.00615
Epoch: 48217, Training Loss: 0.00751
Epoch: 48217, Training Loss: 0.00583
Epoch: 48218, Training Loss: 0.00657
Epoch: 48218, Training Loss: 0.00615
Epoch: 48218, Training Loss: 0.00751
Epoch: 48218, Training Loss: 0.00583
Epoch: 48219, Training Loss: 0.00657
Epoch: 48219, Training Loss: 0.00615
Epoch: 48219, Training Loss: 0.00751
Epoch: 48219, Training Loss: 0.00583
Epoch: 48220, Training Loss: 0.00657
Epoch: 48220, Training Loss: 0.00615
Epoch: 48220, Training Loss: 0.00751
Epoch: 48220, Training Loss: 0.00583
Epoch: 48221, Training Loss: 0.00657
Epoch: 48221, Training Loss: 0.00615
Epoch: 48221, Training Loss: 0.00751
Epoch: 48221, Training Loss: 0.00583
Epoch: 48222, Training Loss: 0.00657
Epoch: 48222, Training Loss: 0.00615
Epoch: 48222, Training Loss: 0.00751
Epoch: 48222, Training Loss: 0.00583
Epoch: 48223, Training Loss: 0.00657
Epoch: 48223, Training Loss: 0.00615
Epoch: 48223, Training Loss: 0.00751
Epoch: 48223, Training Loss: 0.00583
Epoch: 48224, Training Loss: 0.00657
Epoch: 48224, Training Loss: 0.00615
Epoch: 48224, Training Loss: 0.00751
Epoch: 48224, Training Loss: 0.00583
Epoch: 48225, Training Loss: 0.00657
Epoch: 48225, Training Loss: 0.00615
Epoch: 48225, Training Loss: 0.00751
Epoch: 48225, Training Loss: 0.00583
Epoch: 48226, Training Loss: 0.00657
Epoch: 48226, Training Loss: 0.00615
Epoch: 48226, Training Loss: 0.00751
Epoch: 48226, Training Loss: 0.00583
Epoch: 48227, Training Loss: 0.00657
Epoch: 48227, Training Loss: 0.00615
Epoch: 48227, Training Loss: 0.00751
Epoch: 48227, Training Loss: 0.00583
Epoch: 48228, Training Loss: 0.00657
Epoch: 48228, Training Loss: 0.00615
Epoch: 48228, Training Loss: 0.00751
Epoch: 48228, Training Loss: 0.00583
Epoch: 48229, Training Loss: 0.00657
Epoch: 48229, Training Loss: 0.00615
Epoch: 48229, Training Loss: 0.00751
Epoch: 48229, Training Loss: 0.00583
Epoch: 48230, Training Loss: 0.00657
Epoch: 48230, Training Loss: 0.00615
Epoch: 48230, Training Loss: 0.00751
Epoch: 48230, Training Loss: 0.00583
Epoch: 48231, Training Loss: 0.00657
Epoch: 48231, Training Loss: 0.00615
Epoch: 48231, Training Loss: 0.00751
Epoch: 48231, Training Loss: 0.00583
Epoch: 48232, Training Loss: 0.00657
Epoch: 48232, Training Loss: 0.00615
Epoch: 48232, Training Loss: 0.00750
Epoch: 48232, Training Loss: 0.00583
Epoch: 48233, Training Loss: 0.00657
Epoch: 48233, Training Loss: 0.00615
Epoch: 48233, Training Loss: 0.00750
Epoch: 48233, Training Loss: 0.00583
Epoch: 48234, Training Loss: 0.00657
Epoch: 48234, Training Loss: 0.00615
Epoch: 48234, Training Loss: 0.00750
Epoch: 48234, Training Loss: 0.00583
Epoch: 48235, Training Loss: 0.00657
Epoch: 48235, Training Loss: 0.00615
Epoch: 48235, Training Loss: 0.00750
Epoch: 48235, Training Loss: 0.00583
Epoch: 48236, Training Loss: 0.00657
Epoch: 48236, Training Loss: 0.00615
Epoch: 48236, Training Loss: 0.00750
Epoch: 48236, Training Loss: 0.00583
Epoch: 48237, Training Loss: 0.00657
Epoch: 48237, Training Loss: 0.00615
Epoch: 48237, Training Loss: 0.00750
Epoch: 48237, Training Loss: 0.00583
Epoch: 48238, Training Loss: 0.00657
Epoch: 48238, Training Loss: 0.00615
Epoch: 48238, Training Loss: 0.00750
Epoch: 48238, Training Loss: 0.00583
Epoch: 48239, Training Loss: 0.00657
Epoch: 48239, Training Loss: 0.00615
Epoch: 48239, Training Loss: 0.00750
Epoch: 48239, Training Loss: 0.00582
Epoch: 48240, Training Loss: 0.00657
Epoch: 48240, Training Loss: 0.00615
Epoch: 48240, Training Loss: 0.00750
Epoch: 48240, Training Loss: 0.00582
Epoch: 48241, Training Loss: 0.00657
Epoch: 48241, Training Loss: 0.00615
Epoch: 48241, Training Loss: 0.00750
Epoch: 48241, Training Loss: 0.00582
Epoch: 48242, Training Loss: 0.00657
Epoch: 48242, Training Loss: 0.00615
Epoch: 48242, Training Loss: 0.00750
Epoch: 48242, Training Loss: 0.00582
Epoch: 48243, Training Loss: 0.00657
Epoch: 48243, Training Loss: 0.00615
Epoch: 48243, Training Loss: 0.00750
Epoch: 48243, Training Loss: 0.00582
Epoch: 48244, Training Loss: 0.00656
Epoch: 48244, Training Loss: 0.00615
Epoch: 48244, Training Loss: 0.00750
Epoch: 48244, Training Loss: 0.00582
Epoch: 48245, Training Loss: 0.00656
Epoch: 48245, Training Loss: 0.00615
Epoch: 48245, Training Loss: 0.00750
Epoch: 48245, Training Loss: 0.00582
Epoch: 48246, Training Loss: 0.00656
Epoch: 48246, Training Loss: 0.00615
Epoch: 48246, Training Loss: 0.00750
Epoch: 48246, Training Loss: 0.00582
Epoch: 48247, Training Loss: 0.00656
Epoch: 48247, Training Loss: 0.00615
Epoch: 48247, Training Loss: 0.00750
Epoch: 48247, Training Loss: 0.00582
Epoch: 48248, Training Loss: 0.00656
Epoch: 48248, Training Loss: 0.00615
Epoch: 48248, Training Loss: 0.00750
Epoch: 48248, Training Loss: 0.00582
Epoch: 48249, Training Loss: 0.00656
Epoch: 48249, Training Loss: 0.00615
Epoch: 48249, Training Loss: 0.00750
Epoch: 48249, Training Loss: 0.00582
Epoch: 48250, Training Loss: 0.00656
Epoch: 48250, Training Loss: 0.00615
Epoch: 48250, Training Loss: 0.00750
Epoch: 48250, Training Loss: 0.00582
Epoch: 48251, Training Loss: 0.00656
Epoch: 48251, Training Loss: 0.00615
Epoch: 48251, Training Loss: 0.00750
Epoch: 48251, Training Loss: 0.00582
Epoch: 48252, Training Loss: 0.00656
Epoch: 48252, Training Loss: 0.00615
Epoch: 48252, Training Loss: 0.00750
Epoch: 48252, Training Loss: 0.00582
Epoch: 48253, Training Loss: 0.00656
Epoch: 48253, Training Loss: 0.00615
Epoch: 48253, Training Loss: 0.00750
Epoch: 48253, Training Loss: 0.00582
Epoch: 48254, Training Loss: 0.00656
Epoch: 48254, Training Loss: 0.00615
Epoch: 48254, Training Loss: 0.00750
Epoch: 48254, Training Loss: 0.00582
Epoch: 48255, Training Loss: 0.00656
Epoch: 48255, Training Loss: 0.00615
Epoch: 48255, Training Loss: 0.00750
Epoch: 48255, Training Loss: 0.00582
Epoch: 48256, Training Loss: 0.00656
Epoch: 48256, Training Loss: 0.00615
Epoch: 48256, Training Loss: 0.00750
Epoch: 48256, Training Loss: 0.00582
Epoch: 48257, Training Loss: 0.00656
Epoch: 48257, Training Loss: 0.00615
Epoch: 48257, Training Loss: 0.00750
Epoch: 48257, Training Loss: 0.00582
Epoch: 48258, Training Loss: 0.00656
Epoch: 48258, Training Loss: 0.00615
Epoch: 48258, Training Loss: 0.00750
Epoch: 48258, Training Loss: 0.00582
Epoch: 48259, Training Loss: 0.00656
Epoch: 48259, Training Loss: 0.00615
Epoch: 48259, Training Loss: 0.00750
Epoch: 48259, Training Loss: 0.00582
Epoch: 48260, Training Loss: 0.00656
Epoch: 48260, Training Loss: 0.00615
Epoch: 48260, Training Loss: 0.00750
Epoch: 48260, Training Loss: 0.00582
Epoch: 48261, Training Loss: 0.00656
Epoch: 48261, Training Loss: 0.00615
Epoch: 48261, Training Loss: 0.00750
Epoch: 48261, Training Loss: 0.00582
Epoch: 48262, Training Loss: 0.00656
Epoch: 48262, Training Loss: 0.00615
Epoch: 48262, Training Loss: 0.00750
Epoch: 48262, Training Loss: 0.00582
Epoch: 48263, Training Loss: 0.00656
Epoch: 48263, Training Loss: 0.00615
Epoch: 48263, Training Loss: 0.00750
Epoch: 48263, Training Loss: 0.00582
Epoch: 48264, Training Loss: 0.00656
Epoch: 48264, Training Loss: 0.00615
Epoch: 48264, Training Loss: 0.00750
Epoch: 48264, Training Loss: 0.00582
...
Epoch: 48507, Training Loss: 0.00655
Epoch: 48507, Training Loss: 0.00614
Epoch: 48507, Training Loss: 0.00748
Epoch: 48507, Training Loss: 0.00581
[output truncated: one loss line per XOR pattern per epoch; over epochs 48264-48507 the four per-pattern losses decrease only slowly, from about (0.00656, 0.00615, 0.00750, 0.00582) to (0.00655, 0.00614, 0.00748, 0.00581)]
Epoch: 48508, Training Loss: 0.00655
Epoch: 48508, Training Loss: 0.00614
Epoch: 48508, Training Loss: 0.00748
Epoch: 48508, Training Loss: 0.00581
Epoch: 48509, Training Loss: 0.00655
Epoch: 48509, Training Loss: 0.00614
Epoch: 48509, Training Loss: 0.00748
Epoch: 48509, Training Loss: 0.00581
Epoch: 48510, Training Loss: 0.00655
Epoch: 48510, Training Loss: 0.00613
Epoch: 48510, Training Loss: 0.00748
Epoch: 48510, Training Loss: 0.00581
Epoch: 48511, Training Loss: 0.00655
Epoch: 48511, Training Loss: 0.00613
Epoch: 48511, Training Loss: 0.00748
Epoch: 48511, Training Loss: 0.00581
Epoch: 48512, Training Loss: 0.00655
Epoch: 48512, Training Loss: 0.00613
Epoch: 48512, Training Loss: 0.00748
Epoch: 48512, Training Loss: 0.00581
Epoch: 48513, Training Loss: 0.00655
Epoch: 48513, Training Loss: 0.00613
Epoch: 48513, Training Loss: 0.00748
Epoch: 48513, Training Loss: 0.00581
Epoch: 48514, Training Loss: 0.00655
Epoch: 48514, Training Loss: 0.00613
Epoch: 48514, Training Loss: 0.00748
Epoch: 48514, Training Loss: 0.00581
Epoch: 48515, Training Loss: 0.00655
Epoch: 48515, Training Loss: 0.00613
Epoch: 48515, Training Loss: 0.00748
Epoch: 48515, Training Loss: 0.00581
Epoch: 48516, Training Loss: 0.00655
Epoch: 48516, Training Loss: 0.00613
Epoch: 48516, Training Loss: 0.00748
Epoch: 48516, Training Loss: 0.00581
Epoch: 48517, Training Loss: 0.00655
Epoch: 48517, Training Loss: 0.00613
Epoch: 48517, Training Loss: 0.00748
Epoch: 48517, Training Loss: 0.00581
Epoch: 48518, Training Loss: 0.00655
Epoch: 48518, Training Loss: 0.00613
Epoch: 48518, Training Loss: 0.00748
Epoch: 48518, Training Loss: 0.00581
Epoch: 48519, Training Loss: 0.00655
Epoch: 48519, Training Loss: 0.00613
Epoch: 48519, Training Loss: 0.00748
Epoch: 48519, Training Loss: 0.00581
Epoch: 48520, Training Loss: 0.00655
Epoch: 48520, Training Loss: 0.00613
Epoch: 48520, Training Loss: 0.00748
Epoch: 48520, Training Loss: 0.00581
Epoch: 48521, Training Loss: 0.00655
Epoch: 48521, Training Loss: 0.00613
Epoch: 48521, Training Loss: 0.00748
Epoch: 48521, Training Loss: 0.00581
Epoch: 48522, Training Loss: 0.00655
Epoch: 48522, Training Loss: 0.00613
Epoch: 48522, Training Loss: 0.00748
Epoch: 48522, Training Loss: 0.00581
Epoch: 48523, Training Loss: 0.00655
Epoch: 48523, Training Loss: 0.00613
Epoch: 48523, Training Loss: 0.00748
Epoch: 48523, Training Loss: 0.00581
Epoch: 48524, Training Loss: 0.00655
Epoch: 48524, Training Loss: 0.00613
Epoch: 48524, Training Loss: 0.00748
Epoch: 48524, Training Loss: 0.00581
Epoch: 48525, Training Loss: 0.00655
Epoch: 48525, Training Loss: 0.00613
Epoch: 48525, Training Loss: 0.00748
Epoch: 48525, Training Loss: 0.00581
Epoch: 48526, Training Loss: 0.00654
Epoch: 48526, Training Loss: 0.00613
Epoch: 48526, Training Loss: 0.00748
Epoch: 48526, Training Loss: 0.00581
Epoch: 48527, Training Loss: 0.00654
Epoch: 48527, Training Loss: 0.00613
Epoch: 48527, Training Loss: 0.00748
Epoch: 48527, Training Loss: 0.00581
Epoch: 48528, Training Loss: 0.00654
Epoch: 48528, Training Loss: 0.00613
Epoch: 48528, Training Loss: 0.00748
Epoch: 48528, Training Loss: 0.00581
Epoch: 48529, Training Loss: 0.00654
Epoch: 48529, Training Loss: 0.00613
Epoch: 48529, Training Loss: 0.00748
Epoch: 48529, Training Loss: 0.00581
Epoch: 48530, Training Loss: 0.00654
Epoch: 48530, Training Loss: 0.00613
Epoch: 48530, Training Loss: 0.00748
Epoch: 48530, Training Loss: 0.00581
Epoch: 48531, Training Loss: 0.00654
Epoch: 48531, Training Loss: 0.00613
Epoch: 48531, Training Loss: 0.00748
Epoch: 48531, Training Loss: 0.00581
Epoch: 48532, Training Loss: 0.00654
Epoch: 48532, Training Loss: 0.00613
Epoch: 48532, Training Loss: 0.00748
Epoch: 48532, Training Loss: 0.00581
Epoch: 48533, Training Loss: 0.00654
Epoch: 48533, Training Loss: 0.00613
Epoch: 48533, Training Loss: 0.00748
Epoch: 48533, Training Loss: 0.00581
Epoch: 48534, Training Loss: 0.00654
Epoch: 48534, Training Loss: 0.00613
Epoch: 48534, Training Loss: 0.00748
Epoch: 48534, Training Loss: 0.00581
Epoch: 48535, Training Loss: 0.00654
Epoch: 48535, Training Loss: 0.00613
Epoch: 48535, Training Loss: 0.00748
Epoch: 48535, Training Loss: 0.00581
Epoch: 48536, Training Loss: 0.00654
Epoch: 48536, Training Loss: 0.00613
Epoch: 48536, Training Loss: 0.00748
Epoch: 48536, Training Loss: 0.00581
Epoch: 48537, Training Loss: 0.00654
Epoch: 48537, Training Loss: 0.00613
Epoch: 48537, Training Loss: 0.00748
Epoch: 48537, Training Loss: 0.00581
Epoch: 48538, Training Loss: 0.00654
Epoch: 48538, Training Loss: 0.00613
Epoch: 48538, Training Loss: 0.00748
Epoch: 48538, Training Loss: 0.00581
Epoch: 48539, Training Loss: 0.00654
Epoch: 48539, Training Loss: 0.00613
Epoch: 48539, Training Loss: 0.00748
Epoch: 48539, Training Loss: 0.00581
Epoch: 48540, Training Loss: 0.00654
Epoch: 48540, Training Loss: 0.00613
Epoch: 48540, Training Loss: 0.00748
Epoch: 48540, Training Loss: 0.00581
Epoch: 48541, Training Loss: 0.00654
Epoch: 48541, Training Loss: 0.00613
Epoch: 48541, Training Loss: 0.00748
Epoch: 48541, Training Loss: 0.00581
Epoch: 48542, Training Loss: 0.00654
Epoch: 48542, Training Loss: 0.00613
Epoch: 48542, Training Loss: 0.00748
Epoch: 48542, Training Loss: 0.00581
Epoch: 48543, Training Loss: 0.00654
Epoch: 48543, Training Loss: 0.00613
Epoch: 48543, Training Loss: 0.00748
Epoch: 48543, Training Loss: 0.00581
Epoch: 48544, Training Loss: 0.00654
Epoch: 48544, Training Loss: 0.00613
Epoch: 48544, Training Loss: 0.00748
Epoch: 48544, Training Loss: 0.00581
Epoch: 48545, Training Loss: 0.00654
Epoch: 48545, Training Loss: 0.00613
Epoch: 48545, Training Loss: 0.00748
Epoch: 48545, Training Loss: 0.00581
Epoch: 48546, Training Loss: 0.00654
Epoch: 48546, Training Loss: 0.00613
Epoch: 48546, Training Loss: 0.00748
Epoch: 48546, Training Loss: 0.00581
Epoch: 48547, Training Loss: 0.00654
Epoch: 48547, Training Loss: 0.00613
Epoch: 48547, Training Loss: 0.00748
Epoch: 48547, Training Loss: 0.00581
Epoch: 48548, Training Loss: 0.00654
Epoch: 48548, Training Loss: 0.00613
Epoch: 48548, Training Loss: 0.00748
Epoch: 48548, Training Loss: 0.00581
Epoch: 48549, Training Loss: 0.00654
Epoch: 48549, Training Loss: 0.00613
Epoch: 48549, Training Loss: 0.00748
Epoch: 48549, Training Loss: 0.00581
Epoch: 48550, Training Loss: 0.00654
Epoch: 48550, Training Loss: 0.00613
Epoch: 48550, Training Loss: 0.00748
Epoch: 48550, Training Loss: 0.00581
Epoch: 48551, Training Loss: 0.00654
Epoch: 48551, Training Loss: 0.00613
Epoch: 48551, Training Loss: 0.00748
Epoch: 48551, Training Loss: 0.00581
Epoch: 48552, Training Loss: 0.00654
Epoch: 48552, Training Loss: 0.00613
Epoch: 48552, Training Loss: 0.00748
Epoch: 48552, Training Loss: 0.00581
Epoch: 48553, Training Loss: 0.00654
Epoch: 48553, Training Loss: 0.00613
Epoch: 48553, Training Loss: 0.00748
Epoch: 48553, Training Loss: 0.00581
Epoch: 48554, Training Loss: 0.00654
Epoch: 48554, Training Loss: 0.00613
Epoch: 48554, Training Loss: 0.00748
Epoch: 48554, Training Loss: 0.00581
Epoch: 48555, Training Loss: 0.00654
Epoch: 48555, Training Loss: 0.00613
Epoch: 48555, Training Loss: 0.00748
Epoch: 48555, Training Loss: 0.00581
Epoch: 48556, Training Loss: 0.00654
Epoch: 48556, Training Loss: 0.00613
Epoch: 48556, Training Loss: 0.00748
Epoch: 48556, Training Loss: 0.00581
Epoch: 48557, Training Loss: 0.00654
Epoch: 48557, Training Loss: 0.00613
Epoch: 48557, Training Loss: 0.00748
Epoch: 48557, Training Loss: 0.00581
Epoch: 48558, Training Loss: 0.00654
Epoch: 48558, Training Loss: 0.00613
Epoch: 48558, Training Loss: 0.00748
Epoch: 48558, Training Loss: 0.00581
Epoch: 48559, Training Loss: 0.00654
Epoch: 48559, Training Loss: 0.00613
Epoch: 48559, Training Loss: 0.00748
Epoch: 48559, Training Loss: 0.00581
Epoch: 48560, Training Loss: 0.00654
Epoch: 48560, Training Loss: 0.00613
Epoch: 48560, Training Loss: 0.00748
Epoch: 48560, Training Loss: 0.00581
Epoch: 48561, Training Loss: 0.00654
Epoch: 48561, Training Loss: 0.00613
Epoch: 48561, Training Loss: 0.00748
Epoch: 48561, Training Loss: 0.00580
Epoch: 48562, Training Loss: 0.00654
Epoch: 48562, Training Loss: 0.00613
Epoch: 48562, Training Loss: 0.00748
Epoch: 48562, Training Loss: 0.00580
Epoch: 48563, Training Loss: 0.00654
Epoch: 48563, Training Loss: 0.00613
Epoch: 48563, Training Loss: 0.00748
Epoch: 48563, Training Loss: 0.00580
Epoch: 48564, Training Loss: 0.00654
Epoch: 48564, Training Loss: 0.00613
Epoch: 48564, Training Loss: 0.00748
Epoch: 48564, Training Loss: 0.00580
Epoch: 48565, Training Loss: 0.00654
Epoch: 48565, Training Loss: 0.00613
Epoch: 48565, Training Loss: 0.00748
Epoch: 48565, Training Loss: 0.00580
Epoch: 48566, Training Loss: 0.00654
Epoch: 48566, Training Loss: 0.00613
Epoch: 48566, Training Loss: 0.00748
Epoch: 48566, Training Loss: 0.00580
Epoch: 48567, Training Loss: 0.00654
Epoch: 48567, Training Loss: 0.00613
Epoch: 48567, Training Loss: 0.00748
Epoch: 48567, Training Loss: 0.00580
Epoch: 48568, Training Loss: 0.00654
Epoch: 48568, Training Loss: 0.00613
Epoch: 48568, Training Loss: 0.00748
Epoch: 48568, Training Loss: 0.00580
Epoch: 48569, Training Loss: 0.00654
Epoch: 48569, Training Loss: 0.00613
Epoch: 48569, Training Loss: 0.00748
Epoch: 48569, Training Loss: 0.00580
Epoch: 48570, Training Loss: 0.00654
Epoch: 48570, Training Loss: 0.00613
Epoch: 48570, Training Loss: 0.00748
Epoch: 48570, Training Loss: 0.00580
Epoch: 48571, Training Loss: 0.00654
Epoch: 48571, Training Loss: 0.00613
Epoch: 48571, Training Loss: 0.00748
Epoch: 48571, Training Loss: 0.00580
Epoch: 48572, Training Loss: 0.00654
Epoch: 48572, Training Loss: 0.00613
Epoch: 48572, Training Loss: 0.00748
Epoch: 48572, Training Loss: 0.00580
Epoch: 48573, Training Loss: 0.00654
Epoch: 48573, Training Loss: 0.00613
Epoch: 48573, Training Loss: 0.00748
Epoch: 48573, Training Loss: 0.00580
Epoch: 48574, Training Loss: 0.00654
Epoch: 48574, Training Loss: 0.00613
Epoch: 48574, Training Loss: 0.00748
Epoch: 48574, Training Loss: 0.00580
Epoch: 48575, Training Loss: 0.00654
Epoch: 48575, Training Loss: 0.00613
Epoch: 48575, Training Loss: 0.00748
Epoch: 48575, Training Loss: 0.00580
Epoch: 48576, Training Loss: 0.00654
Epoch: 48576, Training Loss: 0.00613
Epoch: 48576, Training Loss: 0.00748
Epoch: 48576, Training Loss: 0.00580
Epoch: 48577, Training Loss: 0.00654
Epoch: 48577, Training Loss: 0.00613
Epoch: 48577, Training Loss: 0.00748
Epoch: 48577, Training Loss: 0.00580
Epoch: 48578, Training Loss: 0.00654
Epoch: 48578, Training Loss: 0.00613
Epoch: 48578, Training Loss: 0.00748
Epoch: 48578, Training Loss: 0.00580
Epoch: 48579, Training Loss: 0.00654
Epoch: 48579, Training Loss: 0.00613
Epoch: 48579, Training Loss: 0.00748
Epoch: 48579, Training Loss: 0.00580
Epoch: 48580, Training Loss: 0.00654
Epoch: 48580, Training Loss: 0.00613
Epoch: 48580, Training Loss: 0.00748
Epoch: 48580, Training Loss: 0.00580
Epoch: 48581, Training Loss: 0.00654
Epoch: 48581, Training Loss: 0.00613
Epoch: 48581, Training Loss: 0.00748
Epoch: 48581, Training Loss: 0.00580
Epoch: 48582, Training Loss: 0.00654
Epoch: 48582, Training Loss: 0.00613
Epoch: 48582, Training Loss: 0.00748
Epoch: 48582, Training Loss: 0.00580
Epoch: 48583, Training Loss: 0.00654
Epoch: 48583, Training Loss: 0.00613
Epoch: 48583, Training Loss: 0.00748
Epoch: 48583, Training Loss: 0.00580
Epoch: 48584, Training Loss: 0.00654
Epoch: 48584, Training Loss: 0.00613
Epoch: 48584, Training Loss: 0.00748
Epoch: 48584, Training Loss: 0.00580
Epoch: 48585, Training Loss: 0.00654
Epoch: 48585, Training Loss: 0.00613
Epoch: 48585, Training Loss: 0.00748
Epoch: 48585, Training Loss: 0.00580
Epoch: 48586, Training Loss: 0.00654
Epoch: 48586, Training Loss: 0.00613
Epoch: 48586, Training Loss: 0.00748
Epoch: 48586, Training Loss: 0.00580
Epoch: 48587, Training Loss: 0.00654
Epoch: 48587, Training Loss: 0.00613
Epoch: 48587, Training Loss: 0.00748
Epoch: 48587, Training Loss: 0.00580
Epoch: 48588, Training Loss: 0.00654
Epoch: 48588, Training Loss: 0.00613
Epoch: 48588, Training Loss: 0.00748
Epoch: 48588, Training Loss: 0.00580
Epoch: 48589, Training Loss: 0.00654
Epoch: 48589, Training Loss: 0.00613
Epoch: 48589, Training Loss: 0.00748
Epoch: 48589, Training Loss: 0.00580
Epoch: 48590, Training Loss: 0.00654
Epoch: 48590, Training Loss: 0.00613
Epoch: 48590, Training Loss: 0.00748
Epoch: 48590, Training Loss: 0.00580
Epoch: 48591, Training Loss: 0.00654
Epoch: 48591, Training Loss: 0.00613
Epoch: 48591, Training Loss: 0.00748
Epoch: 48591, Training Loss: 0.00580
Epoch: 48592, Training Loss: 0.00654
Epoch: 48592, Training Loss: 0.00613
Epoch: 48592, Training Loss: 0.00748
Epoch: 48592, Training Loss: 0.00580
Epoch: 48593, Training Loss: 0.00654
Epoch: 48593, Training Loss: 0.00613
Epoch: 48593, Training Loss: 0.00748
Epoch: 48593, Training Loss: 0.00580
Epoch: 48594, Training Loss: 0.00654
Epoch: 48594, Training Loss: 0.00613
Epoch: 48594, Training Loss: 0.00748
Epoch: 48594, Training Loss: 0.00580
Epoch: 48595, Training Loss: 0.00654
Epoch: 48595, Training Loss: 0.00613
Epoch: 48595, Training Loss: 0.00748
Epoch: 48595, Training Loss: 0.00580
Epoch: 48596, Training Loss: 0.00654
Epoch: 48596, Training Loss: 0.00613
Epoch: 48596, Training Loss: 0.00748
Epoch: 48596, Training Loss: 0.00580
Epoch: 48597, Training Loss: 0.00654
Epoch: 48597, Training Loss: 0.00613
Epoch: 48597, Training Loss: 0.00748
Epoch: 48597, Training Loss: 0.00580
Epoch: 48598, Training Loss: 0.00654
Epoch: 48598, Training Loss: 0.00613
Epoch: 48598, Training Loss: 0.00748
Epoch: 48598, Training Loss: 0.00580
Epoch: 48599, Training Loss: 0.00654
Epoch: 48599, Training Loss: 0.00613
Epoch: 48599, Training Loss: 0.00748
Epoch: 48599, Training Loss: 0.00580
Epoch: 48600, Training Loss: 0.00654
Epoch: 48600, Training Loss: 0.00613
Epoch: 48600, Training Loss: 0.00748
Epoch: 48600, Training Loss: 0.00580
Epoch: 48601, Training Loss: 0.00654
Epoch: 48601, Training Loss: 0.00613
Epoch: 48601, Training Loss: 0.00748
Epoch: 48601, Training Loss: 0.00580
Epoch: 48602, Training Loss: 0.00654
Epoch: 48602, Training Loss: 0.00613
Epoch: 48602, Training Loss: 0.00748
Epoch: 48602, Training Loss: 0.00580
Epoch: 48603, Training Loss: 0.00654
Epoch: 48603, Training Loss: 0.00613
Epoch: 48603, Training Loss: 0.00747
Epoch: 48603, Training Loss: 0.00580
Epoch: 48604, Training Loss: 0.00654
Epoch: 48604, Training Loss: 0.00613
Epoch: 48604, Training Loss: 0.00747
Epoch: 48604, Training Loss: 0.00580
Epoch: 48605, Training Loss: 0.00654
Epoch: 48605, Training Loss: 0.00613
Epoch: 48605, Training Loss: 0.00747
Epoch: 48605, Training Loss: 0.00580
Epoch: 48606, Training Loss: 0.00654
Epoch: 48606, Training Loss: 0.00613
Epoch: 48606, Training Loss: 0.00747
Epoch: 48606, Training Loss: 0.00580
Epoch: 48607, Training Loss: 0.00654
Epoch: 48607, Training Loss: 0.00613
Epoch: 48607, Training Loss: 0.00747
Epoch: 48607, Training Loss: 0.00580
Epoch: 48608, Training Loss: 0.00654
Epoch: 48608, Training Loss: 0.00613
Epoch: 48608, Training Loss: 0.00747
Epoch: 48608, Training Loss: 0.00580
Epoch: 48609, Training Loss: 0.00654
Epoch: 48609, Training Loss: 0.00613
Epoch: 48609, Training Loss: 0.00747
Epoch: 48609, Training Loss: 0.00580
Epoch: 48610, Training Loss: 0.00654
Epoch: 48610, Training Loss: 0.00613
Epoch: 48610, Training Loss: 0.00747
Epoch: 48610, Training Loss: 0.00580
Epoch: 48611, Training Loss: 0.00654
Epoch: 48611, Training Loss: 0.00613
Epoch: 48611, Training Loss: 0.00747
Epoch: 48611, Training Loss: 0.00580
Epoch: 48612, Training Loss: 0.00654
Epoch: 48612, Training Loss: 0.00613
Epoch: 48612, Training Loss: 0.00747
Epoch: 48612, Training Loss: 0.00580
Epoch: 48613, Training Loss: 0.00654
Epoch: 48613, Training Loss: 0.00613
Epoch: 48613, Training Loss: 0.00747
Epoch: 48613, Training Loss: 0.00580
Epoch: 48614, Training Loss: 0.00654
Epoch: 48614, Training Loss: 0.00613
Epoch: 48614, Training Loss: 0.00747
Epoch: 48614, Training Loss: 0.00580
Epoch: 48615, Training Loss: 0.00654
Epoch: 48615, Training Loss: 0.00613
Epoch: 48615, Training Loss: 0.00747
Epoch: 48615, Training Loss: 0.00580
Epoch: 48616, Training Loss: 0.00654
Epoch: 48616, Training Loss: 0.00613
Epoch: 48616, Training Loss: 0.00747
Epoch: 48616, Training Loss: 0.00580
Epoch: 48617, Training Loss: 0.00654
Epoch: 48617, Training Loss: 0.00613
Epoch: 48617, Training Loss: 0.00747
Epoch: 48617, Training Loss: 0.00580
Epoch: 48618, Training Loss: 0.00654
Epoch: 48618, Training Loss: 0.00613
Epoch: 48618, Training Loss: 0.00747
Epoch: 48618, Training Loss: 0.00580
Epoch: 48619, Training Loss: 0.00654
Epoch: 48619, Training Loss: 0.00613
Epoch: 48619, Training Loss: 0.00747
Epoch: 48619, Training Loss: 0.00580
Epoch: 48620, Training Loss: 0.00654
Epoch: 48620, Training Loss: 0.00613
Epoch: 48620, Training Loss: 0.00747
Epoch: 48620, Training Loss: 0.00580
Epoch: 48621, Training Loss: 0.00654
Epoch: 48621, Training Loss: 0.00613
Epoch: 48621, Training Loss: 0.00747
Epoch: 48621, Training Loss: 0.00580
Epoch: 48622, Training Loss: 0.00654
Epoch: 48622, Training Loss: 0.00613
Epoch: 48622, Training Loss: 0.00747
Epoch: 48622, Training Loss: 0.00580
Epoch: 48623, Training Loss: 0.00654
Epoch: 48623, Training Loss: 0.00613
Epoch: 48623, Training Loss: 0.00747
Epoch: 48623, Training Loss: 0.00580
Epoch: 48624, Training Loss: 0.00654
Epoch: 48624, Training Loss: 0.00613
Epoch: 48624, Training Loss: 0.00747
Epoch: 48624, Training Loss: 0.00580
Epoch: 48625, Training Loss: 0.00654
Epoch: 48625, Training Loss: 0.00613
Epoch: 48625, Training Loss: 0.00747
Epoch: 48625, Training Loss: 0.00580
Epoch: 48626, Training Loss: 0.00654
Epoch: 48626, Training Loss: 0.00613
Epoch: 48626, Training Loss: 0.00747
Epoch: 48626, Training Loss: 0.00580
Epoch: 48627, Training Loss: 0.00654
Epoch: 48627, Training Loss: 0.00613
Epoch: 48627, Training Loss: 0.00747
Epoch: 48627, Training Loss: 0.00580
Epoch: 48628, Training Loss: 0.00654
Epoch: 48628, Training Loss: 0.00613
Epoch: 48628, Training Loss: 0.00747
Epoch: 48628, Training Loss: 0.00580
Epoch: 48629, Training Loss: 0.00654
Epoch: 48629, Training Loss: 0.00613
Epoch: 48629, Training Loss: 0.00747
Epoch: 48629, Training Loss: 0.00580
Epoch: 48630, Training Loss: 0.00654
Epoch: 48630, Training Loss: 0.00613
Epoch: 48630, Training Loss: 0.00747
Epoch: 48630, Training Loss: 0.00580
Epoch: 48631, Training Loss: 0.00654
Epoch: 48631, Training Loss: 0.00613
Epoch: 48631, Training Loss: 0.00747
Epoch: 48631, Training Loss: 0.00580
Epoch: 48632, Training Loss: 0.00654
Epoch: 48632, Training Loss: 0.00613
Epoch: 48632, Training Loss: 0.00747
Epoch: 48632, Training Loss: 0.00580
Epoch: 48633, Training Loss: 0.00654
Epoch: 48633, Training Loss: 0.00613
Epoch: 48633, Training Loss: 0.00747
Epoch: 48633, Training Loss: 0.00580
Epoch: 48634, Training Loss: 0.00654
Epoch: 48634, Training Loss: 0.00613
Epoch: 48634, Training Loss: 0.00747
Epoch: 48634, Training Loss: 0.00580
Epoch: 48635, Training Loss: 0.00654
Epoch: 48635, Training Loss: 0.00613
Epoch: 48635, Training Loss: 0.00747
Epoch: 48635, Training Loss: 0.00580
Epoch: 48636, Training Loss: 0.00654
Epoch: 48636, Training Loss: 0.00613
Epoch: 48636, Training Loss: 0.00747
Epoch: 48636, Training Loss: 0.00580
Epoch: 48637, Training Loss: 0.00654
Epoch: 48637, Training Loss: 0.00613
Epoch: 48637, Training Loss: 0.00747
Epoch: 48637, Training Loss: 0.00580
Epoch: 48638, Training Loss: 0.00654
Epoch: 48638, Training Loss: 0.00613
Epoch: 48638, Training Loss: 0.00747
Epoch: 48638, Training Loss: 0.00580
Epoch: 48639, Training Loss: 0.00654
Epoch: 48639, Training Loss: 0.00613
Epoch: 48639, Training Loss: 0.00747
Epoch: 48639, Training Loss: 0.00580
Epoch: 48640, Training Loss: 0.00654
Epoch: 48640, Training Loss: 0.00613
Epoch: 48640, Training Loss: 0.00747
Epoch: 48640, Training Loss: 0.00580
Epoch: 48641, Training Loss: 0.00654
Epoch: 48641, Training Loss: 0.00613
Epoch: 48641, Training Loss: 0.00747
Epoch: 48641, Training Loss: 0.00580
Epoch: 48642, Training Loss: 0.00654
Epoch: 48642, Training Loss: 0.00613
Epoch: 48642, Training Loss: 0.00747
Epoch: 48642, Training Loss: 0.00580
Epoch: 48643, Training Loss: 0.00654
Epoch: 48643, Training Loss: 0.00613
Epoch: 48643, Training Loss: 0.00747
Epoch: 48643, Training Loss: 0.00580
Epoch: 48644, Training Loss: 0.00654
Epoch: 48644, Training Loss: 0.00613
Epoch: 48644, Training Loss: 0.00747
Epoch: 48644, Training Loss: 0.00580
Epoch: 48645, Training Loss: 0.00654
Epoch: 48645, Training Loss: 0.00613
Epoch: 48645, Training Loss: 0.00747
Epoch: 48645, Training Loss: 0.00580
Epoch: 48646, Training Loss: 0.00654
Epoch: 48646, Training Loss: 0.00613
Epoch: 48646, Training Loss: 0.00747
Epoch: 48646, Training Loss: 0.00580
Epoch: 48647, Training Loss: 0.00654
Epoch: 48647, Training Loss: 0.00613
Epoch: 48647, Training Loss: 0.00747
Epoch: 48647, Training Loss: 0.00580
Epoch: 48648, Training Loss: 0.00654
Epoch: 48648, Training Loss: 0.00613
Epoch: 48648, Training Loss: 0.00747
Epoch: 48648, Training Loss: 0.00580
Epoch: 48649, Training Loss: 0.00654
Epoch: 48649, Training Loss: 0.00613
Epoch: 48649, Training Loss: 0.00747
Epoch: 48649, Training Loss: 0.00580
Epoch: 48650, Training Loss: 0.00654
Epoch: 48650, Training Loss: 0.00613
Epoch: 48650, Training Loss: 0.00747
Epoch: 48650, Training Loss: 0.00580
Epoch: 48651, Training Loss: 0.00654
Epoch: 48651, Training Loss: 0.00613
Epoch: 48651, Training Loss: 0.00747
Epoch: 48651, Training Loss: 0.00580
Epoch: 48652, Training Loss: 0.00654
Epoch: 48652, Training Loss: 0.00613
Epoch: 48652, Training Loss: 0.00747
Epoch: 48652, Training Loss: 0.00580
Epoch: 48653, Training Loss: 0.00654
Epoch: 48653, Training Loss: 0.00613
Epoch: 48653, Training Loss: 0.00747
Epoch: 48653, Training Loss: 0.00580
Epoch: 48654, Training Loss: 0.00654
Epoch: 48654, Training Loss: 0.00613
Epoch: 48654, Training Loss: 0.00747
Epoch: 48654, Training Loss: 0.00580
Epoch: 48655, Training Loss: 0.00654
Epoch: 48655, Training Loss: 0.00613
Epoch: 48655, Training Loss: 0.00747
Epoch: 48655, Training Loss: 0.00580
Epoch: 48656, Training Loss: 0.00654
Epoch: 48656, Training Loss: 0.00613
Epoch: 48656, Training Loss: 0.00747
Epoch: 48656, Training Loss: 0.00580
Epoch: 48657, Training Loss: 0.00654
Epoch: 48657, Training Loss: 0.00613
Epoch: 48657, Training Loss: 0.00747
Epoch: 48657, Training Loss: 0.00580
Epoch: 48658, Training Loss: 0.00654
Epoch: 48658, Training Loss: 0.00613
Epoch: 48658, Training Loss: 0.00747
Epoch: 48658, Training Loss: 0.00580
Epoch: 48659, Training Loss: 0.00654
Epoch: 48659, Training Loss: 0.00613
Epoch: 48659, Training Loss: 0.00747
Epoch: 48659, Training Loss: 0.00580
Epoch: 48660, Training Loss: 0.00654
Epoch: 48660, Training Loss: 0.00613
Epoch: 48660, Training Loss: 0.00747
Epoch: 48660, Training Loss: 0.00580
Epoch: 48661, Training Loss: 0.00654
Epoch: 48661, Training Loss: 0.00613
Epoch: 48661, Training Loss: 0.00747
Epoch: 48661, Training Loss: 0.00580
Epoch: 48662, Training Loss: 0.00654
Epoch: 48662, Training Loss: 0.00613
Epoch: 48662, Training Loss: 0.00747
Epoch: 48662, Training Loss: 0.00580
Epoch: 48663, Training Loss: 0.00654
Epoch: 48663, Training Loss: 0.00613
Epoch: 48663, Training Loss: 0.00747
Epoch: 48663, Training Loss: 0.00580
Epoch: 48664, Training Loss: 0.00654
Epoch: 48664, Training Loss: 0.00612
Epoch: 48664, Training Loss: 0.00747
Epoch: 48664, Training Loss: 0.00580
Epoch: 48665, Training Loss: 0.00654
Epoch: 48665, Training Loss: 0.00612
Epoch: 48665, Training Loss: 0.00747
Epoch: 48665, Training Loss: 0.00580
Epoch: 48666, Training Loss: 0.00654
Epoch: 48666, Training Loss: 0.00612
Epoch: 48666, Training Loss: 0.00747
Epoch: 48666, Training Loss: 0.00580
Epoch: 48667, Training Loss: 0.00653
Epoch: 48667, Training Loss: 0.00612
Epoch: 48667, Training Loss: 0.00747
Epoch: 48667, Training Loss: 0.00580
Epoch: 48668, Training Loss: 0.00653
Epoch: 48668, Training Loss: 0.00612
Epoch: 48668, Training Loss: 0.00747
Epoch: 48668, Training Loss: 0.00580
Epoch: 48669, Training Loss: 0.00653
Epoch: 48669, Training Loss: 0.00612
Epoch: 48669, Training Loss: 0.00747
Epoch: 48669, Training Loss: 0.00580
Epoch: 48670, Training Loss: 0.00653
Epoch: 48670, Training Loss: 0.00612
Epoch: 48670, Training Loss: 0.00747
Epoch: 48670, Training Loss: 0.00580
Epoch: 48671, Training Loss: 0.00653
Epoch: 48671, Training Loss: 0.00612
Epoch: 48671, Training Loss: 0.00747
Epoch: 48671, Training Loss: 0.00580
Epoch: 48672, Training Loss: 0.00653
Epoch: 48672, Training Loss: 0.00612
Epoch: 48672, Training Loss: 0.00747
Epoch: 48672, Training Loss: 0.00580
Epoch: 48673, Training Loss: 0.00653
Epoch: 48673, Training Loss: 0.00612
Epoch: 48673, Training Loss: 0.00747
Epoch: 48673, Training Loss: 0.00580
Epoch: 48674, Training Loss: 0.00653
Epoch: 48674, Training Loss: 0.00612
Epoch: 48674, Training Loss: 0.00747
Epoch: 48674, Training Loss: 0.00580
Epoch: 48675, Training Loss: 0.00653
Epoch: 48675, Training Loss: 0.00612
Epoch: 48675, Training Loss: 0.00747
Epoch: 48675, Training Loss: 0.00580
Epoch: 48676, Training Loss: 0.00653
Epoch: 48676, Training Loss: 0.00612
Epoch: 48676, Training Loss: 0.00747
Epoch: 48676, Training Loss: 0.00580
Epoch: 48677, Training Loss: 0.00653
Epoch: 48677, Training Loss: 0.00612
Epoch: 48677, Training Loss: 0.00747
Epoch: 48677, Training Loss: 0.00580
Epoch: 48678, Training Loss: 0.00653
Epoch: 48678, Training Loss: 0.00612
Epoch: 48678, Training Loss: 0.00747
Epoch: 48678, Training Loss: 0.00580
Epoch: 48679, Training Loss: 0.00653
Epoch: 48679, Training Loss: 0.00612
Epoch: 48679, Training Loss: 0.00747
Epoch: 48679, Training Loss: 0.00580
Epoch: 48680, Training Loss: 0.00653
Epoch: 48680, Training Loss: 0.00612
Epoch: 48680, Training Loss: 0.00747
Epoch: 48680, Training Loss: 0.00580
Epoch: 48681, Training Loss: 0.00653
Epoch: 48681, Training Loss: 0.00612
Epoch: 48681, Training Loss: 0.00747
Epoch: 48681, Training Loss: 0.00580
Epoch: 48682, Training Loss: 0.00653
Epoch: 48682, Training Loss: 0.00612
Epoch: 48682, Training Loss: 0.00747
Epoch: 48682, Training Loss: 0.00580
Epoch: 48683, Training Loss: 0.00653
Epoch: 48683, Training Loss: 0.00612
Epoch: 48683, Training Loss: 0.00747
Epoch: 48683, Training Loss: 0.00580
Epoch: 48684, Training Loss: 0.00653
Epoch: 48684, Training Loss: 0.00612
Epoch: 48684, Training Loss: 0.00747
Epoch: 48684, Training Loss: 0.00580
Epoch: 48685, Training Loss: 0.00653
Epoch: 48685, Training Loss: 0.00612
Epoch: 48685, Training Loss: 0.00747
Epoch: 48685, Training Loss: 0.00580
Epoch: 48686, Training Loss: 0.00653
Epoch: 48686, Training Loss: 0.00612
Epoch: 48686, Training Loss: 0.00747
Epoch: 48686, Training Loss: 0.00580
Epoch: 48687, Training Loss: 0.00653
Epoch: 48687, Training Loss: 0.00612
Epoch: 48687, Training Loss: 0.00747
Epoch: 48687, Training Loss: 0.00580
Epoch: 48688, Training Loss: 0.00653
Epoch: 48688, Training Loss: 0.00612
Epoch: 48688, Training Loss: 0.00747
Epoch: 48688, Training Loss: 0.00580
Epoch: 48689, Training Loss: 0.00653
Epoch: 48689, Training Loss: 0.00612
Epoch: 48689, Training Loss: 0.00747
Epoch: 48689, Training Loss: 0.00580
Epoch: 48690, Training Loss: 0.00653
Epoch: 48690, Training Loss: 0.00612
Epoch: 48690, Training Loss: 0.00747
Epoch: 48690, Training Loss: 0.00580
Epoch: 48691, Training Loss: 0.00653
Epoch: 48691, Training Loss: 0.00612
Epoch: 48691, Training Loss: 0.00747
Epoch: 48691, Training Loss: 0.00580
Epoch: 48692, Training Loss: 0.00653
Epoch: 48692, Training Loss: 0.00612
Epoch: 48692, Training Loss: 0.00747
Epoch: 48692, Training Loss: 0.00580
Epoch: 48693, Training Loss: 0.00653
Epoch: 48693, Training Loss: 0.00612
Epoch: 48693, Training Loss: 0.00747
Epoch: 48693, Training Loss: 0.00580
Epoch: 48694, Training Loss: 0.00653
Epoch: 48694, Training Loss: 0.00612
Epoch: 48694, Training Loss: 0.00747
Epoch: 48694, Training Loss: 0.00580
Epoch: 48695, Training Loss: 0.00653
Epoch: 48695, Training Loss: 0.00612
Epoch: 48695, Training Loss: 0.00747
Epoch: 48695, Training Loss: 0.00580
Epoch: 48696, Training Loss: 0.00653
Epoch: 48696, Training Loss: 0.00612
Epoch: 48696, Training Loss: 0.00747
[Training log truncated: epochs 48696–48940, with four per-pattern losses printed each epoch. Over this span the losses decrease only marginally, from roughly (0.00653, 0.00612, 0.00747, 0.00580) to (0.00652, 0.00611, 0.00745, 0.00578), showing the slow convergence of the basic quadratic-cost network near the end of training.]
Epoch: 48940, Training Loss: 0.00745
Epoch: 48940, Training Loss: 0.00578
Epoch: 48941, Training Loss: 0.00652
Epoch: 48941, Training Loss: 0.00611
Epoch: 48941, Training Loss: 0.00745
Epoch: 48941, Training Loss: 0.00578
Epoch: 48942, Training Loss: 0.00652
Epoch: 48942, Training Loss: 0.00611
Epoch: 48942, Training Loss: 0.00745
Epoch: 48942, Training Loss: 0.00578
Epoch: 48943, Training Loss: 0.00652
Epoch: 48943, Training Loss: 0.00611
Epoch: 48943, Training Loss: 0.00745
Epoch: 48943, Training Loss: 0.00578
Epoch: 48944, Training Loss: 0.00652
Epoch: 48944, Training Loss: 0.00611
Epoch: 48944, Training Loss: 0.00745
Epoch: 48944, Training Loss: 0.00578
Epoch: 48945, Training Loss: 0.00652
Epoch: 48945, Training Loss: 0.00611
Epoch: 48945, Training Loss: 0.00745
Epoch: 48945, Training Loss: 0.00578
Epoch: 48946, Training Loss: 0.00652
Epoch: 48946, Training Loss: 0.00611
Epoch: 48946, Training Loss: 0.00745
Epoch: 48946, Training Loss: 0.00578
Epoch: 48947, Training Loss: 0.00652
Epoch: 48947, Training Loss: 0.00611
Epoch: 48947, Training Loss: 0.00745
Epoch: 48947, Training Loss: 0.00578
Epoch: 48948, Training Loss: 0.00652
Epoch: 48948, Training Loss: 0.00611
Epoch: 48948, Training Loss: 0.00745
Epoch: 48948, Training Loss: 0.00578
Epoch: 48949, Training Loss: 0.00652
Epoch: 48949, Training Loss: 0.00611
Epoch: 48949, Training Loss: 0.00745
Epoch: 48949, Training Loss: 0.00578
Epoch: 48950, Training Loss: 0.00652
Epoch: 48950, Training Loss: 0.00611
Epoch: 48950, Training Loss: 0.00745
Epoch: 48950, Training Loss: 0.00578
Epoch: 48951, Training Loss: 0.00652
Epoch: 48951, Training Loss: 0.00611
Epoch: 48951, Training Loss: 0.00745
Epoch: 48951, Training Loss: 0.00578
Epoch: 48952, Training Loss: 0.00652
Epoch: 48952, Training Loss: 0.00611
Epoch: 48952, Training Loss: 0.00745
Epoch: 48952, Training Loss: 0.00578
Epoch: 48953, Training Loss: 0.00651
Epoch: 48953, Training Loss: 0.00611
Epoch: 48953, Training Loss: 0.00745
Epoch: 48953, Training Loss: 0.00578
Epoch: 48954, Training Loss: 0.00651
Epoch: 48954, Training Loss: 0.00611
Epoch: 48954, Training Loss: 0.00745
Epoch: 48954, Training Loss: 0.00578
Epoch: 48955, Training Loss: 0.00651
Epoch: 48955, Training Loss: 0.00611
Epoch: 48955, Training Loss: 0.00745
Epoch: 48955, Training Loss: 0.00578
Epoch: 48956, Training Loss: 0.00651
Epoch: 48956, Training Loss: 0.00611
Epoch: 48956, Training Loss: 0.00745
Epoch: 48956, Training Loss: 0.00578
Epoch: 48957, Training Loss: 0.00651
Epoch: 48957, Training Loss: 0.00611
Epoch: 48957, Training Loss: 0.00745
Epoch: 48957, Training Loss: 0.00578
Epoch: 48958, Training Loss: 0.00651
Epoch: 48958, Training Loss: 0.00611
Epoch: 48958, Training Loss: 0.00745
Epoch: 48958, Training Loss: 0.00578
Epoch: 48959, Training Loss: 0.00651
Epoch: 48959, Training Loss: 0.00611
Epoch: 48959, Training Loss: 0.00745
Epoch: 48959, Training Loss: 0.00578
Epoch: 48960, Training Loss: 0.00651
Epoch: 48960, Training Loss: 0.00611
Epoch: 48960, Training Loss: 0.00745
Epoch: 48960, Training Loss: 0.00578
Epoch: 48961, Training Loss: 0.00651
Epoch: 48961, Training Loss: 0.00611
Epoch: 48961, Training Loss: 0.00745
Epoch: 48961, Training Loss: 0.00578
Epoch: 48962, Training Loss: 0.00651
Epoch: 48962, Training Loss: 0.00611
Epoch: 48962, Training Loss: 0.00745
Epoch: 48962, Training Loss: 0.00578
Epoch: 48963, Training Loss: 0.00651
Epoch: 48963, Training Loss: 0.00611
Epoch: 48963, Training Loss: 0.00745
Epoch: 48963, Training Loss: 0.00578
Epoch: 48964, Training Loss: 0.00651
Epoch: 48964, Training Loss: 0.00611
Epoch: 48964, Training Loss: 0.00745
Epoch: 48964, Training Loss: 0.00578
Epoch: 48965, Training Loss: 0.00651
Epoch: 48965, Training Loss: 0.00611
Epoch: 48965, Training Loss: 0.00745
Epoch: 48965, Training Loss: 0.00578
Epoch: 48966, Training Loss: 0.00651
Epoch: 48966, Training Loss: 0.00611
Epoch: 48966, Training Loss: 0.00745
Epoch: 48966, Training Loss: 0.00578
Epoch: 48967, Training Loss: 0.00651
Epoch: 48967, Training Loss: 0.00611
Epoch: 48967, Training Loss: 0.00745
Epoch: 48967, Training Loss: 0.00578
Epoch: 48968, Training Loss: 0.00651
Epoch: 48968, Training Loss: 0.00611
Epoch: 48968, Training Loss: 0.00745
Epoch: 48968, Training Loss: 0.00578
Epoch: 48969, Training Loss: 0.00651
Epoch: 48969, Training Loss: 0.00611
Epoch: 48969, Training Loss: 0.00745
Epoch: 48969, Training Loss: 0.00578
Epoch: 48970, Training Loss: 0.00651
Epoch: 48970, Training Loss: 0.00611
Epoch: 48970, Training Loss: 0.00745
Epoch: 48970, Training Loss: 0.00578
Epoch: 48971, Training Loss: 0.00651
Epoch: 48971, Training Loss: 0.00611
Epoch: 48971, Training Loss: 0.00745
Epoch: 48971, Training Loss: 0.00578
Epoch: 48972, Training Loss: 0.00651
Epoch: 48972, Training Loss: 0.00610
Epoch: 48972, Training Loss: 0.00745
Epoch: 48972, Training Loss: 0.00578
Epoch: 48973, Training Loss: 0.00651
Epoch: 48973, Training Loss: 0.00610
Epoch: 48973, Training Loss: 0.00745
Epoch: 48973, Training Loss: 0.00578
Epoch: 48974, Training Loss: 0.00651
Epoch: 48974, Training Loss: 0.00610
Epoch: 48974, Training Loss: 0.00745
Epoch: 48974, Training Loss: 0.00578
Epoch: 48975, Training Loss: 0.00651
Epoch: 48975, Training Loss: 0.00610
Epoch: 48975, Training Loss: 0.00745
Epoch: 48975, Training Loss: 0.00578
Epoch: 48976, Training Loss: 0.00651
Epoch: 48976, Training Loss: 0.00610
Epoch: 48976, Training Loss: 0.00745
Epoch: 48976, Training Loss: 0.00578
Epoch: 48977, Training Loss: 0.00651
Epoch: 48977, Training Loss: 0.00610
Epoch: 48977, Training Loss: 0.00745
Epoch: 48977, Training Loss: 0.00578
Epoch: 48978, Training Loss: 0.00651
Epoch: 48978, Training Loss: 0.00610
Epoch: 48978, Training Loss: 0.00745
Epoch: 48978, Training Loss: 0.00578
Epoch: 48979, Training Loss: 0.00651
Epoch: 48979, Training Loss: 0.00610
Epoch: 48979, Training Loss: 0.00745
Epoch: 48979, Training Loss: 0.00578
Epoch: 48980, Training Loss: 0.00651
Epoch: 48980, Training Loss: 0.00610
Epoch: 48980, Training Loss: 0.00744
Epoch: 48980, Training Loss: 0.00578
Epoch: 48981, Training Loss: 0.00651
Epoch: 48981, Training Loss: 0.00610
Epoch: 48981, Training Loss: 0.00744
Epoch: 48981, Training Loss: 0.00578
Epoch: 48982, Training Loss: 0.00651
Epoch: 48982, Training Loss: 0.00610
Epoch: 48982, Training Loss: 0.00744
Epoch: 48982, Training Loss: 0.00578
Epoch: 48983, Training Loss: 0.00651
Epoch: 48983, Training Loss: 0.00610
Epoch: 48983, Training Loss: 0.00744
Epoch: 48983, Training Loss: 0.00578
Epoch: 48984, Training Loss: 0.00651
Epoch: 48984, Training Loss: 0.00610
Epoch: 48984, Training Loss: 0.00744
Epoch: 48984, Training Loss: 0.00578
Epoch: 48985, Training Loss: 0.00651
Epoch: 48985, Training Loss: 0.00610
Epoch: 48985, Training Loss: 0.00744
Epoch: 48985, Training Loss: 0.00578
Epoch: 48986, Training Loss: 0.00651
Epoch: 48986, Training Loss: 0.00610
Epoch: 48986, Training Loss: 0.00744
Epoch: 48986, Training Loss: 0.00578
Epoch: 48987, Training Loss: 0.00651
Epoch: 48987, Training Loss: 0.00610
Epoch: 48987, Training Loss: 0.00744
Epoch: 48987, Training Loss: 0.00578
Epoch: 48988, Training Loss: 0.00651
Epoch: 48988, Training Loss: 0.00610
Epoch: 48988, Training Loss: 0.00744
Epoch: 48988, Training Loss: 0.00578
Epoch: 48989, Training Loss: 0.00651
Epoch: 48989, Training Loss: 0.00610
Epoch: 48989, Training Loss: 0.00744
Epoch: 48989, Training Loss: 0.00578
Epoch: 48990, Training Loss: 0.00651
Epoch: 48990, Training Loss: 0.00610
Epoch: 48990, Training Loss: 0.00744
Epoch: 48990, Training Loss: 0.00578
Epoch: 48991, Training Loss: 0.00651
Epoch: 48991, Training Loss: 0.00610
Epoch: 48991, Training Loss: 0.00744
Epoch: 48991, Training Loss: 0.00578
Epoch: 48992, Training Loss: 0.00651
Epoch: 48992, Training Loss: 0.00610
Epoch: 48992, Training Loss: 0.00744
Epoch: 48992, Training Loss: 0.00578
Epoch: 48993, Training Loss: 0.00651
Epoch: 48993, Training Loss: 0.00610
Epoch: 48993, Training Loss: 0.00744
Epoch: 48993, Training Loss: 0.00578
Epoch: 48994, Training Loss: 0.00651
Epoch: 48994, Training Loss: 0.00610
Epoch: 48994, Training Loss: 0.00744
Epoch: 48994, Training Loss: 0.00578
Epoch: 48995, Training Loss: 0.00651
Epoch: 48995, Training Loss: 0.00610
Epoch: 48995, Training Loss: 0.00744
Epoch: 48995, Training Loss: 0.00578
Epoch: 48996, Training Loss: 0.00651
Epoch: 48996, Training Loss: 0.00610
Epoch: 48996, Training Loss: 0.00744
Epoch: 48996, Training Loss: 0.00578
Epoch: 48997, Training Loss: 0.00651
Epoch: 48997, Training Loss: 0.00610
Epoch: 48997, Training Loss: 0.00744
Epoch: 48997, Training Loss: 0.00578
Epoch: 48998, Training Loss: 0.00651
Epoch: 48998, Training Loss: 0.00610
Epoch: 48998, Training Loss: 0.00744
Epoch: 48998, Training Loss: 0.00578
Epoch: 48999, Training Loss: 0.00651
Epoch: 48999, Training Loss: 0.00610
Epoch: 48999, Training Loss: 0.00744
Epoch: 48999, Training Loss: 0.00578
Epoch: 49000, Training Loss: 0.00651
Epoch: 49000, Training Loss: 0.00610
Epoch: 49000, Training Loss: 0.00744
Epoch: 49000, Training Loss: 0.00578
Epoch: 49001, Training Loss: 0.00651
Epoch: 49001, Training Loss: 0.00610
Epoch: 49001, Training Loss: 0.00744
Epoch: 49001, Training Loss: 0.00578
Epoch: 49002, Training Loss: 0.00651
Epoch: 49002, Training Loss: 0.00610
Epoch: 49002, Training Loss: 0.00744
Epoch: 49002, Training Loss: 0.00578
Epoch: 49003, Training Loss: 0.00651
Epoch: 49003, Training Loss: 0.00610
Epoch: 49003, Training Loss: 0.00744
Epoch: 49003, Training Loss: 0.00578
Epoch: 49004, Training Loss: 0.00651
Epoch: 49004, Training Loss: 0.00610
Epoch: 49004, Training Loss: 0.00744
Epoch: 49004, Training Loss: 0.00578
Epoch: 49005, Training Loss: 0.00651
Epoch: 49005, Training Loss: 0.00610
Epoch: 49005, Training Loss: 0.00744
Epoch: 49005, Training Loss: 0.00578
Epoch: 49006, Training Loss: 0.00651
Epoch: 49006, Training Loss: 0.00610
Epoch: 49006, Training Loss: 0.00744
Epoch: 49006, Training Loss: 0.00578
Epoch: 49007, Training Loss: 0.00651
Epoch: 49007, Training Loss: 0.00610
Epoch: 49007, Training Loss: 0.00744
Epoch: 49007, Training Loss: 0.00578
Epoch: 49008, Training Loss: 0.00651
Epoch: 49008, Training Loss: 0.00610
Epoch: 49008, Training Loss: 0.00744
Epoch: 49008, Training Loss: 0.00578
Epoch: 49009, Training Loss: 0.00651
Epoch: 49009, Training Loss: 0.00610
Epoch: 49009, Training Loss: 0.00744
Epoch: 49009, Training Loss: 0.00578
Epoch: 49010, Training Loss: 0.00651
Epoch: 49010, Training Loss: 0.00610
Epoch: 49010, Training Loss: 0.00744
Epoch: 49010, Training Loss: 0.00578
Epoch: 49011, Training Loss: 0.00651
Epoch: 49011, Training Loss: 0.00610
Epoch: 49011, Training Loss: 0.00744
Epoch: 49011, Training Loss: 0.00578
Epoch: 49012, Training Loss: 0.00651
Epoch: 49012, Training Loss: 0.00610
Epoch: 49012, Training Loss: 0.00744
Epoch: 49012, Training Loss: 0.00578
Epoch: 49013, Training Loss: 0.00651
Epoch: 49013, Training Loss: 0.00610
Epoch: 49013, Training Loss: 0.00744
Epoch: 49013, Training Loss: 0.00578
Epoch: 49014, Training Loss: 0.00651
Epoch: 49014, Training Loss: 0.00610
Epoch: 49014, Training Loss: 0.00744
Epoch: 49014, Training Loss: 0.00578
Epoch: 49015, Training Loss: 0.00651
Epoch: 49015, Training Loss: 0.00610
Epoch: 49015, Training Loss: 0.00744
Epoch: 49015, Training Loss: 0.00578
Epoch: 49016, Training Loss: 0.00651
Epoch: 49016, Training Loss: 0.00610
Epoch: 49016, Training Loss: 0.00744
Epoch: 49016, Training Loss: 0.00578
Epoch: 49017, Training Loss: 0.00651
Epoch: 49017, Training Loss: 0.00610
Epoch: 49017, Training Loss: 0.00744
Epoch: 49017, Training Loss: 0.00578
Epoch: 49018, Training Loss: 0.00651
Epoch: 49018, Training Loss: 0.00610
Epoch: 49018, Training Loss: 0.00744
Epoch: 49018, Training Loss: 0.00578
Epoch: 49019, Training Loss: 0.00651
Epoch: 49019, Training Loss: 0.00610
Epoch: 49019, Training Loss: 0.00744
Epoch: 49019, Training Loss: 0.00578
Epoch: 49020, Training Loss: 0.00651
Epoch: 49020, Training Loss: 0.00610
Epoch: 49020, Training Loss: 0.00744
Epoch: 49020, Training Loss: 0.00578
Epoch: 49021, Training Loss: 0.00651
Epoch: 49021, Training Loss: 0.00610
Epoch: 49021, Training Loss: 0.00744
Epoch: 49021, Training Loss: 0.00578
Epoch: 49022, Training Loss: 0.00651
Epoch: 49022, Training Loss: 0.00610
Epoch: 49022, Training Loss: 0.00744
Epoch: 49022, Training Loss: 0.00578
Epoch: 49023, Training Loss: 0.00651
Epoch: 49023, Training Loss: 0.00610
Epoch: 49023, Training Loss: 0.00744
Epoch: 49023, Training Loss: 0.00578
Epoch: 49024, Training Loss: 0.00651
Epoch: 49024, Training Loss: 0.00610
Epoch: 49024, Training Loss: 0.00744
Epoch: 49024, Training Loss: 0.00578
Epoch: 49025, Training Loss: 0.00651
Epoch: 49025, Training Loss: 0.00610
Epoch: 49025, Training Loss: 0.00744
Epoch: 49025, Training Loss: 0.00578
Epoch: 49026, Training Loss: 0.00651
Epoch: 49026, Training Loss: 0.00610
Epoch: 49026, Training Loss: 0.00744
Epoch: 49026, Training Loss: 0.00578
Epoch: 49027, Training Loss: 0.00651
Epoch: 49027, Training Loss: 0.00610
Epoch: 49027, Training Loss: 0.00744
Epoch: 49027, Training Loss: 0.00578
Epoch: 49028, Training Loss: 0.00651
Epoch: 49028, Training Loss: 0.00610
Epoch: 49028, Training Loss: 0.00744
Epoch: 49028, Training Loss: 0.00578
Epoch: 49029, Training Loss: 0.00651
Epoch: 49029, Training Loss: 0.00610
Epoch: 49029, Training Loss: 0.00744
Epoch: 49029, Training Loss: 0.00578
Epoch: 49030, Training Loss: 0.00651
Epoch: 49030, Training Loss: 0.00610
Epoch: 49030, Training Loss: 0.00744
Epoch: 49030, Training Loss: 0.00578
Epoch: 49031, Training Loss: 0.00651
Epoch: 49031, Training Loss: 0.00610
Epoch: 49031, Training Loss: 0.00744
Epoch: 49031, Training Loss: 0.00578
Epoch: 49032, Training Loss: 0.00651
Epoch: 49032, Training Loss: 0.00610
Epoch: 49032, Training Loss: 0.00744
Epoch: 49032, Training Loss: 0.00578
Epoch: 49033, Training Loss: 0.00651
Epoch: 49033, Training Loss: 0.00610
Epoch: 49033, Training Loss: 0.00744
Epoch: 49033, Training Loss: 0.00578
Epoch: 49034, Training Loss: 0.00651
Epoch: 49034, Training Loss: 0.00610
Epoch: 49034, Training Loss: 0.00744
Epoch: 49034, Training Loss: 0.00578
Epoch: 49035, Training Loss: 0.00651
Epoch: 49035, Training Loss: 0.00610
Epoch: 49035, Training Loss: 0.00744
Epoch: 49035, Training Loss: 0.00578
Epoch: 49036, Training Loss: 0.00651
Epoch: 49036, Training Loss: 0.00610
Epoch: 49036, Training Loss: 0.00744
Epoch: 49036, Training Loss: 0.00578
Epoch: 49037, Training Loss: 0.00651
Epoch: 49037, Training Loss: 0.00610
Epoch: 49037, Training Loss: 0.00744
Epoch: 49037, Training Loss: 0.00578
Epoch: 49038, Training Loss: 0.00651
Epoch: 49038, Training Loss: 0.00610
Epoch: 49038, Training Loss: 0.00744
Epoch: 49038, Training Loss: 0.00578
Epoch: 49039, Training Loss: 0.00651
Epoch: 49039, Training Loss: 0.00610
Epoch: 49039, Training Loss: 0.00744
Epoch: 49039, Training Loss: 0.00578
Epoch: 49040, Training Loss: 0.00651
Epoch: 49040, Training Loss: 0.00610
Epoch: 49040, Training Loss: 0.00744
Epoch: 49040, Training Loss: 0.00578
Epoch: 49041, Training Loss: 0.00651
Epoch: 49041, Training Loss: 0.00610
Epoch: 49041, Training Loss: 0.00744
Epoch: 49041, Training Loss: 0.00578
Epoch: 49042, Training Loss: 0.00651
Epoch: 49042, Training Loss: 0.00610
Epoch: 49042, Training Loss: 0.00744
Epoch: 49042, Training Loss: 0.00578
Epoch: 49043, Training Loss: 0.00651
Epoch: 49043, Training Loss: 0.00610
Epoch: 49043, Training Loss: 0.00744
Epoch: 49043, Training Loss: 0.00578
Epoch: 49044, Training Loss: 0.00651
Epoch: 49044, Training Loss: 0.00610
Epoch: 49044, Training Loss: 0.00744
Epoch: 49044, Training Loss: 0.00578
Epoch: 49045, Training Loss: 0.00651
Epoch: 49045, Training Loss: 0.00610
Epoch: 49045, Training Loss: 0.00744
Epoch: 49045, Training Loss: 0.00578
Epoch: 49046, Training Loss: 0.00651
Epoch: 49046, Training Loss: 0.00610
Epoch: 49046, Training Loss: 0.00744
Epoch: 49046, Training Loss: 0.00578
Epoch: 49047, Training Loss: 0.00651
Epoch: 49047, Training Loss: 0.00610
Epoch: 49047, Training Loss: 0.00744
Epoch: 49047, Training Loss: 0.00578
Epoch: 49048, Training Loss: 0.00651
Epoch: 49048, Training Loss: 0.00610
Epoch: 49048, Training Loss: 0.00744
Epoch: 49048, Training Loss: 0.00578
Epoch: 49049, Training Loss: 0.00651
Epoch: 49049, Training Loss: 0.00610
Epoch: 49049, Training Loss: 0.00744
Epoch: 49049, Training Loss: 0.00578
Epoch: 49050, Training Loss: 0.00651
Epoch: 49050, Training Loss: 0.00610
Epoch: 49050, Training Loss: 0.00744
Epoch: 49050, Training Loss: 0.00577
Epoch: 49051, Training Loss: 0.00651
Epoch: 49051, Training Loss: 0.00610
Epoch: 49051, Training Loss: 0.00744
Epoch: 49051, Training Loss: 0.00577
Epoch: 49052, Training Loss: 0.00651
Epoch: 49052, Training Loss: 0.00610
Epoch: 49052, Training Loss: 0.00744
Epoch: 49052, Training Loss: 0.00577
Epoch: 49053, Training Loss: 0.00651
Epoch: 49053, Training Loss: 0.00610
Epoch: 49053, Training Loss: 0.00744
Epoch: 49053, Training Loss: 0.00577
Epoch: 49054, Training Loss: 0.00651
Epoch: 49054, Training Loss: 0.00610
Epoch: 49054, Training Loss: 0.00744
Epoch: 49054, Training Loss: 0.00577
Epoch: 49055, Training Loss: 0.00651
Epoch: 49055, Training Loss: 0.00610
Epoch: 49055, Training Loss: 0.00744
Epoch: 49055, Training Loss: 0.00577
Epoch: 49056, Training Loss: 0.00651
Epoch: 49056, Training Loss: 0.00610
Epoch: 49056, Training Loss: 0.00744
Epoch: 49056, Training Loss: 0.00577
Epoch: 49057, Training Loss: 0.00651
Epoch: 49057, Training Loss: 0.00610
Epoch: 49057, Training Loss: 0.00744
Epoch: 49057, Training Loss: 0.00577
Epoch: 49058, Training Loss: 0.00651
Epoch: 49058, Training Loss: 0.00610
Epoch: 49058, Training Loss: 0.00744
Epoch: 49058, Training Loss: 0.00577
Epoch: 49059, Training Loss: 0.00651
Epoch: 49059, Training Loss: 0.00610
Epoch: 49059, Training Loss: 0.00744
Epoch: 49059, Training Loss: 0.00577
Epoch: 49060, Training Loss: 0.00651
Epoch: 49060, Training Loss: 0.00610
Epoch: 49060, Training Loss: 0.00744
Epoch: 49060, Training Loss: 0.00577
Epoch: 49061, Training Loss: 0.00651
Epoch: 49061, Training Loss: 0.00610
Epoch: 49061, Training Loss: 0.00744
Epoch: 49061, Training Loss: 0.00577
Epoch: 49062, Training Loss: 0.00651
Epoch: 49062, Training Loss: 0.00610
Epoch: 49062, Training Loss: 0.00744
Epoch: 49062, Training Loss: 0.00577
Epoch: 49063, Training Loss: 0.00651
Epoch: 49063, Training Loss: 0.00610
Epoch: 49063, Training Loss: 0.00744
Epoch: 49063, Training Loss: 0.00577
Epoch: 49064, Training Loss: 0.00651
Epoch: 49064, Training Loss: 0.00610
Epoch: 49064, Training Loss: 0.00744
Epoch: 49064, Training Loss: 0.00577
Epoch: 49065, Training Loss: 0.00651
Epoch: 49065, Training Loss: 0.00610
Epoch: 49065, Training Loss: 0.00744
Epoch: 49065, Training Loss: 0.00577
Epoch: 49066, Training Loss: 0.00651
Epoch: 49066, Training Loss: 0.00610
Epoch: 49066, Training Loss: 0.00744
Epoch: 49066, Training Loss: 0.00577
Epoch: 49067, Training Loss: 0.00651
Epoch: 49067, Training Loss: 0.00610
Epoch: 49067, Training Loss: 0.00744
Epoch: 49067, Training Loss: 0.00577
Epoch: 49068, Training Loss: 0.00651
Epoch: 49068, Training Loss: 0.00610
Epoch: 49068, Training Loss: 0.00744
Epoch: 49068, Training Loss: 0.00577
Epoch: 49069, Training Loss: 0.00651
Epoch: 49069, Training Loss: 0.00610
Epoch: 49069, Training Loss: 0.00744
Epoch: 49069, Training Loss: 0.00577
Epoch: 49070, Training Loss: 0.00651
Epoch: 49070, Training Loss: 0.00610
Epoch: 49070, Training Loss: 0.00744
Epoch: 49070, Training Loss: 0.00577
Epoch: 49071, Training Loss: 0.00651
Epoch: 49071, Training Loss: 0.00610
Epoch: 49071, Training Loss: 0.00744
Epoch: 49071, Training Loss: 0.00577
Epoch: 49072, Training Loss: 0.00651
Epoch: 49072, Training Loss: 0.00610
Epoch: 49072, Training Loss: 0.00744
Epoch: 49072, Training Loss: 0.00577
Epoch: 49073, Training Loss: 0.00651
Epoch: 49073, Training Loss: 0.00610
Epoch: 49073, Training Loss: 0.00744
Epoch: 49073, Training Loss: 0.00577
Epoch: 49074, Training Loss: 0.00651
Epoch: 49074, Training Loss: 0.00610
Epoch: 49074, Training Loss: 0.00744
Epoch: 49074, Training Loss: 0.00577
Epoch: 49075, Training Loss: 0.00651
Epoch: 49075, Training Loss: 0.00610
Epoch: 49075, Training Loss: 0.00744
Epoch: 49075, Training Loss: 0.00577
Epoch: 49076, Training Loss: 0.00651
Epoch: 49076, Training Loss: 0.00610
Epoch: 49076, Training Loss: 0.00744
Epoch: 49076, Training Loss: 0.00577
Epoch: 49077, Training Loss: 0.00651
Epoch: 49077, Training Loss: 0.00610
Epoch: 49077, Training Loss: 0.00744
Epoch: 49077, Training Loss: 0.00577
Epoch: 49078, Training Loss: 0.00651
Epoch: 49078, Training Loss: 0.00610
Epoch: 49078, Training Loss: 0.00744
Epoch: 49078, Training Loss: 0.00577
Epoch: 49079, Training Loss: 0.00651
Epoch: 49079, Training Loss: 0.00610
Epoch: 49079, Training Loss: 0.00744
Epoch: 49079, Training Loss: 0.00577
Epoch: 49080, Training Loss: 0.00651
Epoch: 49080, Training Loss: 0.00610
Epoch: 49080, Training Loss: 0.00744
Epoch: 49080, Training Loss: 0.00577
Epoch: 49081, Training Loss: 0.00651
Epoch: 49081, Training Loss: 0.00610
Epoch: 49081, Training Loss: 0.00744
Epoch: 49081, Training Loss: 0.00577
Epoch: 49082, Training Loss: 0.00651
Epoch: 49082, Training Loss: 0.00610
Epoch: 49082, Training Loss: 0.00744
Epoch: 49082, Training Loss: 0.00577
Epoch: 49083, Training Loss: 0.00651
Epoch: 49083, Training Loss: 0.00610
Epoch: 49083, Training Loss: 0.00744
Epoch: 49083, Training Loss: 0.00577
Epoch: 49084, Training Loss: 0.00651
Epoch: 49084, Training Loss: 0.00610
Epoch: 49084, Training Loss: 0.00744
Epoch: 49084, Training Loss: 0.00577
Epoch: 49085, Training Loss: 0.00651
Epoch: 49085, Training Loss: 0.00610
Epoch: 49085, Training Loss: 0.00744
Epoch: 49085, Training Loss: 0.00577
Epoch: 49086, Training Loss: 0.00651
Epoch: 49086, Training Loss: 0.00610
Epoch: 49086, Training Loss: 0.00744
Epoch: 49086, Training Loss: 0.00577
Epoch: 49087, Training Loss: 0.00651
Epoch: 49087, Training Loss: 0.00610
Epoch: 49087, Training Loss: 0.00744
Epoch: 49087, Training Loss: 0.00577
Epoch: 49088, Training Loss: 0.00651
Epoch: 49088, Training Loss: 0.00610
Epoch: 49088, Training Loss: 0.00744
Epoch: 49088, Training Loss: 0.00577
Epoch: 49089, Training Loss: 0.00651
Epoch: 49089, Training Loss: 0.00610
Epoch: 49089, Training Loss: 0.00744
Epoch: 49089, Training Loss: 0.00577
Epoch: 49090, Training Loss: 0.00651
Epoch: 49090, Training Loss: 0.00610
Epoch: 49090, Training Loss: 0.00744
Epoch: 49090, Training Loss: 0.00577
Epoch: 49091, Training Loss: 0.00651
Epoch: 49091, Training Loss: 0.00610
Epoch: 49091, Training Loss: 0.00744
Epoch: 49091, Training Loss: 0.00577
Epoch: 49092, Training Loss: 0.00651
Epoch: 49092, Training Loss: 0.00610
Epoch: 49092, Training Loss: 0.00744
Epoch: 49092, Training Loss: 0.00577
Epoch: 49093, Training Loss: 0.00651
Epoch: 49093, Training Loss: 0.00610
Epoch: 49093, Training Loss: 0.00744
Epoch: 49093, Training Loss: 0.00577
Epoch: 49094, Training Loss: 0.00651
Epoch: 49094, Training Loss: 0.00610
Epoch: 49094, Training Loss: 0.00744
Epoch: 49094, Training Loss: 0.00577
Epoch: 49095, Training Loss: 0.00651
Epoch: 49095, Training Loss: 0.00610
Epoch: 49095, Training Loss: 0.00744
Epoch: 49095, Training Loss: 0.00577
Epoch: 49096, Training Loss: 0.00651
Epoch: 49096, Training Loss: 0.00610
Epoch: 49096, Training Loss: 0.00744
Epoch: 49096, Training Loss: 0.00577
Epoch: 49097, Training Loss: 0.00650
Epoch: 49097, Training Loss: 0.00610
Epoch: 49097, Training Loss: 0.00744
Epoch: 49097, Training Loss: 0.00577
Epoch: 49098, Training Loss: 0.00650
Epoch: 49098, Training Loss: 0.00610
Epoch: 49098, Training Loss: 0.00744
Epoch: 49098, Training Loss: 0.00577
Epoch: 49099, Training Loss: 0.00650
Epoch: 49099, Training Loss: 0.00610
Epoch: 49099, Training Loss: 0.00744
Epoch: 49099, Training Loss: 0.00577
Epoch: 49100, Training Loss: 0.00650
Epoch: 49100, Training Loss: 0.00610
Epoch: 49100, Training Loss: 0.00744
Epoch: 49100, Training Loss: 0.00577
Epoch: 49101, Training Loss: 0.00650
Epoch: 49101, Training Loss: 0.00610
Epoch: 49101, Training Loss: 0.00744
Epoch: 49101, Training Loss: 0.00577
Epoch: 49102, Training Loss: 0.00650
Epoch: 49102, Training Loss: 0.00610
Epoch: 49102, Training Loss: 0.00744
Epoch: 49102, Training Loss: 0.00577
Epoch: 49103, Training Loss: 0.00650
Epoch: 49103, Training Loss: 0.00610
Epoch: 49103, Training Loss: 0.00744
Epoch: 49103, Training Loss: 0.00577
Epoch: 49104, Training Loss: 0.00650
Epoch: 49104, Training Loss: 0.00610
Epoch: 49104, Training Loss: 0.00744
Epoch: 49104, Training Loss: 0.00577
Epoch: 49105, Training Loss: 0.00650
Epoch: 49105, Training Loss: 0.00610
Epoch: 49105, Training Loss: 0.00744
Epoch: 49105, Training Loss: 0.00577
Epoch: 49106, Training Loss: 0.00650
Epoch: 49106, Training Loss: 0.00610
Epoch: 49106, Training Loss: 0.00743
Epoch: 49106, Training Loss: 0.00577
Epoch: 49107, Training Loss: 0.00650
Epoch: 49107, Training Loss: 0.00610
Epoch: 49107, Training Loss: 0.00743
Epoch: 49107, Training Loss: 0.00577
Epoch: 49108, Training Loss: 0.00650
Epoch: 49108, Training Loss: 0.00610
Epoch: 49108, Training Loss: 0.00743
Epoch: 49108, Training Loss: 0.00577
Epoch: 49109, Training Loss: 0.00650
Epoch: 49109, Training Loss: 0.00610
Epoch: 49109, Training Loss: 0.00743
Epoch: 49109, Training Loss: 0.00577
Epoch: 49110, Training Loss: 0.00650
Epoch: 49110, Training Loss: 0.00610
Epoch: 49110, Training Loss: 0.00743
Epoch: 49110, Training Loss: 0.00577
Epoch: 49111, Training Loss: 0.00650
Epoch: 49111, Training Loss: 0.00610
Epoch: 49111, Training Loss: 0.00743
Epoch: 49111, Training Loss: 0.00577
Epoch: 49112, Training Loss: 0.00650
Epoch: 49112, Training Loss: 0.00610
Epoch: 49112, Training Loss: 0.00743
Epoch: 49112, Training Loss: 0.00577
Epoch: 49113, Training Loss: 0.00650
Epoch: 49113, Training Loss: 0.00610
Epoch: 49113, Training Loss: 0.00743
Epoch: 49113, Training Loss: 0.00577
Epoch: 49114, Training Loss: 0.00650
Epoch: 49114, Training Loss: 0.00610
Epoch: 49114, Training Loss: 0.00743
Epoch: 49114, Training Loss: 0.00577
Epoch: 49115, Training Loss: 0.00650
Epoch: 49115, Training Loss: 0.00610
Epoch: 49115, Training Loss: 0.00743
Epoch: 49115, Training Loss: 0.00577
Epoch: 49116, Training Loss: 0.00650
Epoch: 49116, Training Loss: 0.00610
Epoch: 49116, Training Loss: 0.00743
Epoch: 49116, Training Loss: 0.00577
Epoch: 49117, Training Loss: 0.00650
Epoch: 49117, Training Loss: 0.00610
Epoch: 49117, Training Loss: 0.00743
Epoch: 49117, Training Loss: 0.00577
Epoch: 49118, Training Loss: 0.00650
Epoch: 49118, Training Loss: 0.00610
Epoch: 49118, Training Loss: 0.00743
Epoch: 49118, Training Loss: 0.00577
Epoch: 49119, Training Loss: 0.00650
Epoch: 49119, Training Loss: 0.00610
Epoch: 49119, Training Loss: 0.00743
Epoch: 49119, Training Loss: 0.00577
Epoch: 49120, Training Loss: 0.00650
Epoch: 49120, Training Loss: 0.00610
Epoch: 49120, Training Loss: 0.00743
Epoch: 49120, Training Loss: 0.00577
Epoch: 49121, Training Loss: 0.00650
Epoch: 49121, Training Loss: 0.00610
Epoch: 49121, Training Loss: 0.00743
Epoch: 49121, Training Loss: 0.00577
Epoch: 49122, Training Loss: 0.00650
Epoch: 49122, Training Loss: 0.00610
Epoch: 49122, Training Loss: 0.00743
Epoch: 49122, Training Loss: 0.00577
Epoch: 49123, Training Loss: 0.00650
Epoch: 49123, Training Loss: 0.00610
Epoch: 49123, Training Loss: 0.00743
Epoch: 49123, Training Loss: 0.00577
Epoch: 49124, Training Loss: 0.00650
Epoch: 49124, Training Loss: 0.00610
Epoch: 49124, Training Loss: 0.00743
Epoch: 49124, Training Loss: 0.00577
Epoch: 49125, Training Loss: 0.00650
Epoch: 49125, Training Loss: 0.00610
Epoch: 49125, Training Loss: 0.00743
Epoch: 49125, Training Loss: 0.00577
Epoch: 49126, Training Loss: 0.00650
Epoch: 49126, Training Loss: 0.00610
Epoch: 49126, Training Loss: 0.00743
Epoch: 49126, Training Loss: 0.00577
Epoch: 49127, Training Loss: 0.00650
Epoch: 49127, Training Loss: 0.00610
Epoch: 49127, Training Loss: 0.00743
Epoch: 49127, Training Loss: 0.00577
Epoch: 49128, Training Loss: 0.00650
Epoch: 49128, Training Loss: 0.00609
Epoch: 49128, Training Loss: 0.00743
Epoch: 49128, Training Loss: 0.00577
Epoch: 49129, Training Loss: 0.00650
Epoch: 49129, Training Loss: 0.00609
Epoch: 49129, Training Loss: 0.00743
Epoch: 49129, Training Loss: 0.00577
...
Epoch: 49372, Training Loss: 0.00649
Epoch: 49372, Training Loss: 0.00608
Epoch: 49372, Training Loss: 0.00741
Epoch: 49372, Training Loss: 0.00576
[output truncated: four per-pattern losses are printed every epoch; over epochs 49129-49372 they decrease only in the fourth decimal place, inching toward the 0.008 error target]
Epoch: 49373, Training Loss: 0.00649
Epoch: 49373, Training Loss: 0.00608
Epoch: 49373, Training Loss: 0.00741
Epoch: 49373, Training Loss: 0.00576
Epoch: 49374, Training Loss: 0.00649
Epoch: 49374, Training Loss: 0.00608
Epoch: 49374, Training Loss: 0.00741
Epoch: 49374, Training Loss: 0.00576
Epoch: 49375, Training Loss: 0.00649
Epoch: 49375, Training Loss: 0.00608
Epoch: 49375, Training Loss: 0.00741
Epoch: 49375, Training Loss: 0.00576
Epoch: 49376, Training Loss: 0.00649
Epoch: 49376, Training Loss: 0.00608
Epoch: 49376, Training Loss: 0.00741
Epoch: 49376, Training Loss: 0.00576
Epoch: 49377, Training Loss: 0.00649
Epoch: 49377, Training Loss: 0.00608
Epoch: 49377, Training Loss: 0.00741
Epoch: 49377, Training Loss: 0.00576
Epoch: 49378, Training Loss: 0.00649
Epoch: 49378, Training Loss: 0.00608
Epoch: 49378, Training Loss: 0.00741
Epoch: 49378, Training Loss: 0.00576
Epoch: 49379, Training Loss: 0.00649
Epoch: 49379, Training Loss: 0.00608
Epoch: 49379, Training Loss: 0.00741
Epoch: 49379, Training Loss: 0.00576
Epoch: 49380, Training Loss: 0.00649
Epoch: 49380, Training Loss: 0.00608
Epoch: 49380, Training Loss: 0.00741
Epoch: 49380, Training Loss: 0.00575
Epoch: 49381, Training Loss: 0.00649
Epoch: 49381, Training Loss: 0.00608
Epoch: 49381, Training Loss: 0.00741
Epoch: 49381, Training Loss: 0.00575
Epoch: 49382, Training Loss: 0.00649
Epoch: 49382, Training Loss: 0.00608
Epoch: 49382, Training Loss: 0.00741
Epoch: 49382, Training Loss: 0.00575
Epoch: 49383, Training Loss: 0.00649
Epoch: 49383, Training Loss: 0.00608
Epoch: 49383, Training Loss: 0.00741
Epoch: 49383, Training Loss: 0.00575
Epoch: 49384, Training Loss: 0.00649
Epoch: 49384, Training Loss: 0.00608
Epoch: 49384, Training Loss: 0.00741
Epoch: 49384, Training Loss: 0.00575
Epoch: 49385, Training Loss: 0.00649
Epoch: 49385, Training Loss: 0.00608
Epoch: 49385, Training Loss: 0.00741
Epoch: 49385, Training Loss: 0.00575
Epoch: 49386, Training Loss: 0.00648
Epoch: 49386, Training Loss: 0.00608
Epoch: 49386, Training Loss: 0.00741
Epoch: 49386, Training Loss: 0.00575
Epoch: 49387, Training Loss: 0.00648
Epoch: 49387, Training Loss: 0.00608
Epoch: 49387, Training Loss: 0.00741
Epoch: 49387, Training Loss: 0.00575
Epoch: 49388, Training Loss: 0.00648
Epoch: 49388, Training Loss: 0.00608
Epoch: 49388, Training Loss: 0.00741
Epoch: 49388, Training Loss: 0.00575
Epoch: 49389, Training Loss: 0.00648
Epoch: 49389, Training Loss: 0.00608
Epoch: 49389, Training Loss: 0.00741
Epoch: 49389, Training Loss: 0.00575
Epoch: 49390, Training Loss: 0.00648
Epoch: 49390, Training Loss: 0.00608
Epoch: 49390, Training Loss: 0.00741
Epoch: 49390, Training Loss: 0.00575
Epoch: 49391, Training Loss: 0.00648
Epoch: 49391, Training Loss: 0.00608
Epoch: 49391, Training Loss: 0.00741
Epoch: 49391, Training Loss: 0.00575
Epoch: 49392, Training Loss: 0.00648
Epoch: 49392, Training Loss: 0.00608
Epoch: 49392, Training Loss: 0.00741
Epoch: 49392, Training Loss: 0.00575
Epoch: 49393, Training Loss: 0.00648
Epoch: 49393, Training Loss: 0.00608
Epoch: 49393, Training Loss: 0.00741
Epoch: 49393, Training Loss: 0.00575
Epoch: 49394, Training Loss: 0.00648
Epoch: 49394, Training Loss: 0.00608
Epoch: 49394, Training Loss: 0.00741
Epoch: 49394, Training Loss: 0.00575
Epoch: 49395, Training Loss: 0.00648
Epoch: 49395, Training Loss: 0.00608
Epoch: 49395, Training Loss: 0.00741
Epoch: 49395, Training Loss: 0.00575
Epoch: 49396, Training Loss: 0.00648
Epoch: 49396, Training Loss: 0.00608
Epoch: 49396, Training Loss: 0.00741
Epoch: 49396, Training Loss: 0.00575
Epoch: 49397, Training Loss: 0.00648
Epoch: 49397, Training Loss: 0.00608
Epoch: 49397, Training Loss: 0.00741
Epoch: 49397, Training Loss: 0.00575
Epoch: 49398, Training Loss: 0.00648
Epoch: 49398, Training Loss: 0.00608
Epoch: 49398, Training Loss: 0.00741
Epoch: 49398, Training Loss: 0.00575
Epoch: 49399, Training Loss: 0.00648
Epoch: 49399, Training Loss: 0.00608
Epoch: 49399, Training Loss: 0.00741
Epoch: 49399, Training Loss: 0.00575
Epoch: 49400, Training Loss: 0.00648
Epoch: 49400, Training Loss: 0.00608
Epoch: 49400, Training Loss: 0.00741
Epoch: 49400, Training Loss: 0.00575
Epoch: 49401, Training Loss: 0.00648
Epoch: 49401, Training Loss: 0.00608
Epoch: 49401, Training Loss: 0.00741
Epoch: 49401, Training Loss: 0.00575
Epoch: 49402, Training Loss: 0.00648
Epoch: 49402, Training Loss: 0.00608
Epoch: 49402, Training Loss: 0.00741
Epoch: 49402, Training Loss: 0.00575
Epoch: 49403, Training Loss: 0.00648
Epoch: 49403, Training Loss: 0.00608
Epoch: 49403, Training Loss: 0.00741
Epoch: 49403, Training Loss: 0.00575
Epoch: 49404, Training Loss: 0.00648
Epoch: 49404, Training Loss: 0.00608
Epoch: 49404, Training Loss: 0.00741
Epoch: 49404, Training Loss: 0.00575
Epoch: 49405, Training Loss: 0.00648
Epoch: 49405, Training Loss: 0.00608
Epoch: 49405, Training Loss: 0.00741
Epoch: 49405, Training Loss: 0.00575
Epoch: 49406, Training Loss: 0.00648
Epoch: 49406, Training Loss: 0.00608
Epoch: 49406, Training Loss: 0.00741
Epoch: 49406, Training Loss: 0.00575
Epoch: 49407, Training Loss: 0.00648
Epoch: 49407, Training Loss: 0.00608
Epoch: 49407, Training Loss: 0.00741
Epoch: 49407, Training Loss: 0.00575
Epoch: 49408, Training Loss: 0.00648
Epoch: 49408, Training Loss: 0.00608
Epoch: 49408, Training Loss: 0.00741
Epoch: 49408, Training Loss: 0.00575
Epoch: 49409, Training Loss: 0.00648
Epoch: 49409, Training Loss: 0.00608
Epoch: 49409, Training Loss: 0.00741
Epoch: 49409, Training Loss: 0.00575
Epoch: 49410, Training Loss: 0.00648
Epoch: 49410, Training Loss: 0.00608
Epoch: 49410, Training Loss: 0.00741
Epoch: 49410, Training Loss: 0.00575
Epoch: 49411, Training Loss: 0.00648
Epoch: 49411, Training Loss: 0.00608
Epoch: 49411, Training Loss: 0.00741
Epoch: 49411, Training Loss: 0.00575
Epoch: 49412, Training Loss: 0.00648
Epoch: 49412, Training Loss: 0.00608
Epoch: 49412, Training Loss: 0.00741
Epoch: 49412, Training Loss: 0.00575
Epoch: 49413, Training Loss: 0.00648
Epoch: 49413, Training Loss: 0.00608
Epoch: 49413, Training Loss: 0.00741
Epoch: 49413, Training Loss: 0.00575
Epoch: 49414, Training Loss: 0.00648
Epoch: 49414, Training Loss: 0.00608
Epoch: 49414, Training Loss: 0.00741
Epoch: 49414, Training Loss: 0.00575
Epoch: 49415, Training Loss: 0.00648
Epoch: 49415, Training Loss: 0.00608
Epoch: 49415, Training Loss: 0.00741
Epoch: 49415, Training Loss: 0.00575
Epoch: 49416, Training Loss: 0.00648
Epoch: 49416, Training Loss: 0.00608
Epoch: 49416, Training Loss: 0.00741
Epoch: 49416, Training Loss: 0.00575
Epoch: 49417, Training Loss: 0.00648
Epoch: 49417, Training Loss: 0.00608
Epoch: 49417, Training Loss: 0.00741
Epoch: 49417, Training Loss: 0.00575
Epoch: 49418, Training Loss: 0.00648
Epoch: 49418, Training Loss: 0.00608
Epoch: 49418, Training Loss: 0.00741
Epoch: 49418, Training Loss: 0.00575
Epoch: 49419, Training Loss: 0.00648
Epoch: 49419, Training Loss: 0.00608
Epoch: 49419, Training Loss: 0.00741
Epoch: 49419, Training Loss: 0.00575
Epoch: 49420, Training Loss: 0.00648
Epoch: 49420, Training Loss: 0.00608
Epoch: 49420, Training Loss: 0.00741
Epoch: 49420, Training Loss: 0.00575
Epoch: 49421, Training Loss: 0.00648
Epoch: 49421, Training Loss: 0.00608
Epoch: 49421, Training Loss: 0.00741
Epoch: 49421, Training Loss: 0.00575
Epoch: 49422, Training Loss: 0.00648
Epoch: 49422, Training Loss: 0.00608
Epoch: 49422, Training Loss: 0.00741
Epoch: 49422, Training Loss: 0.00575
Epoch: 49423, Training Loss: 0.00648
Epoch: 49423, Training Loss: 0.00608
Epoch: 49423, Training Loss: 0.00741
Epoch: 49423, Training Loss: 0.00575
Epoch: 49424, Training Loss: 0.00648
Epoch: 49424, Training Loss: 0.00608
Epoch: 49424, Training Loss: 0.00741
Epoch: 49424, Training Loss: 0.00575
Epoch: 49425, Training Loss: 0.00648
Epoch: 49425, Training Loss: 0.00608
Epoch: 49425, Training Loss: 0.00741
Epoch: 49425, Training Loss: 0.00575
Epoch: 49426, Training Loss: 0.00648
Epoch: 49426, Training Loss: 0.00608
Epoch: 49426, Training Loss: 0.00741
Epoch: 49426, Training Loss: 0.00575
Epoch: 49427, Training Loss: 0.00648
Epoch: 49427, Training Loss: 0.00608
Epoch: 49427, Training Loss: 0.00741
Epoch: 49427, Training Loss: 0.00575
Epoch: 49428, Training Loss: 0.00648
Epoch: 49428, Training Loss: 0.00608
Epoch: 49428, Training Loss: 0.00741
Epoch: 49428, Training Loss: 0.00575
Epoch: 49429, Training Loss: 0.00648
Epoch: 49429, Training Loss: 0.00608
Epoch: 49429, Training Loss: 0.00741
Epoch: 49429, Training Loss: 0.00575
Epoch: 49430, Training Loss: 0.00648
Epoch: 49430, Training Loss: 0.00608
Epoch: 49430, Training Loss: 0.00741
Epoch: 49430, Training Loss: 0.00575
Epoch: 49431, Training Loss: 0.00648
Epoch: 49431, Training Loss: 0.00608
Epoch: 49431, Training Loss: 0.00741
Epoch: 49431, Training Loss: 0.00575
Epoch: 49432, Training Loss: 0.00648
Epoch: 49432, Training Loss: 0.00608
Epoch: 49432, Training Loss: 0.00741
Epoch: 49432, Training Loss: 0.00575
Epoch: 49433, Training Loss: 0.00648
Epoch: 49433, Training Loss: 0.00608
Epoch: 49433, Training Loss: 0.00741
Epoch: 49433, Training Loss: 0.00575
Epoch: 49434, Training Loss: 0.00648
Epoch: 49434, Training Loss: 0.00608
Epoch: 49434, Training Loss: 0.00741
Epoch: 49434, Training Loss: 0.00575
Epoch: 49435, Training Loss: 0.00648
Epoch: 49435, Training Loss: 0.00608
Epoch: 49435, Training Loss: 0.00741
Epoch: 49435, Training Loss: 0.00575
Epoch: 49436, Training Loss: 0.00648
Epoch: 49436, Training Loss: 0.00608
Epoch: 49436, Training Loss: 0.00741
Epoch: 49436, Training Loss: 0.00575
Epoch: 49437, Training Loss: 0.00648
Epoch: 49437, Training Loss: 0.00608
Epoch: 49437, Training Loss: 0.00741
Epoch: 49437, Training Loss: 0.00575
Epoch: 49438, Training Loss: 0.00648
Epoch: 49438, Training Loss: 0.00608
Epoch: 49438, Training Loss: 0.00741
Epoch: 49438, Training Loss: 0.00575
Epoch: 49439, Training Loss: 0.00648
Epoch: 49439, Training Loss: 0.00608
Epoch: 49439, Training Loss: 0.00741
Epoch: 49439, Training Loss: 0.00575
Epoch: 49440, Training Loss: 0.00648
Epoch: 49440, Training Loss: 0.00608
Epoch: 49440, Training Loss: 0.00741
Epoch: 49440, Training Loss: 0.00575
Epoch: 49441, Training Loss: 0.00648
Epoch: 49441, Training Loss: 0.00607
Epoch: 49441, Training Loss: 0.00741
Epoch: 49441, Training Loss: 0.00575
Epoch: 49442, Training Loss: 0.00648
Epoch: 49442, Training Loss: 0.00607
Epoch: 49442, Training Loss: 0.00741
Epoch: 49442, Training Loss: 0.00575
Epoch: 49443, Training Loss: 0.00648
Epoch: 49443, Training Loss: 0.00607
Epoch: 49443, Training Loss: 0.00741
Epoch: 49443, Training Loss: 0.00575
Epoch: 49444, Training Loss: 0.00648
Epoch: 49444, Training Loss: 0.00607
Epoch: 49444, Training Loss: 0.00741
Epoch: 49444, Training Loss: 0.00575
Epoch: 49445, Training Loss: 0.00648
Epoch: 49445, Training Loss: 0.00607
Epoch: 49445, Training Loss: 0.00741
Epoch: 49445, Training Loss: 0.00575
Epoch: 49446, Training Loss: 0.00648
Epoch: 49446, Training Loss: 0.00607
Epoch: 49446, Training Loss: 0.00741
Epoch: 49446, Training Loss: 0.00575
Epoch: 49447, Training Loss: 0.00648
Epoch: 49447, Training Loss: 0.00607
Epoch: 49447, Training Loss: 0.00741
Epoch: 49447, Training Loss: 0.00575
Epoch: 49448, Training Loss: 0.00648
Epoch: 49448, Training Loss: 0.00607
Epoch: 49448, Training Loss: 0.00741
Epoch: 49448, Training Loss: 0.00575
Epoch: 49449, Training Loss: 0.00648
Epoch: 49449, Training Loss: 0.00607
Epoch: 49449, Training Loss: 0.00741
Epoch: 49449, Training Loss: 0.00575
Epoch: 49450, Training Loss: 0.00648
Epoch: 49450, Training Loss: 0.00607
Epoch: 49450, Training Loss: 0.00741
Epoch: 49450, Training Loss: 0.00575
Epoch: 49451, Training Loss: 0.00648
Epoch: 49451, Training Loss: 0.00607
Epoch: 49451, Training Loss: 0.00741
Epoch: 49451, Training Loss: 0.00575
Epoch: 49452, Training Loss: 0.00648
Epoch: 49452, Training Loss: 0.00607
Epoch: 49452, Training Loss: 0.00741
Epoch: 49452, Training Loss: 0.00575
Epoch: 49453, Training Loss: 0.00648
Epoch: 49453, Training Loss: 0.00607
Epoch: 49453, Training Loss: 0.00741
Epoch: 49453, Training Loss: 0.00575
Epoch: 49454, Training Loss: 0.00648
Epoch: 49454, Training Loss: 0.00607
Epoch: 49454, Training Loss: 0.00741
Epoch: 49454, Training Loss: 0.00575
Epoch: 49455, Training Loss: 0.00648
Epoch: 49455, Training Loss: 0.00607
Epoch: 49455, Training Loss: 0.00741
Epoch: 49455, Training Loss: 0.00575
Epoch: 49456, Training Loss: 0.00648
Epoch: 49456, Training Loss: 0.00607
Epoch: 49456, Training Loss: 0.00741
Epoch: 49456, Training Loss: 0.00575
Epoch: 49457, Training Loss: 0.00648
Epoch: 49457, Training Loss: 0.00607
Epoch: 49457, Training Loss: 0.00741
Epoch: 49457, Training Loss: 0.00575
Epoch: 49458, Training Loss: 0.00648
Epoch: 49458, Training Loss: 0.00607
Epoch: 49458, Training Loss: 0.00741
Epoch: 49458, Training Loss: 0.00575
Epoch: 49459, Training Loss: 0.00648
Epoch: 49459, Training Loss: 0.00607
Epoch: 49459, Training Loss: 0.00741
Epoch: 49459, Training Loss: 0.00575
Epoch: 49460, Training Loss: 0.00648
Epoch: 49460, Training Loss: 0.00607
Epoch: 49460, Training Loss: 0.00741
Epoch: 49460, Training Loss: 0.00575
Epoch: 49461, Training Loss: 0.00648
Epoch: 49461, Training Loss: 0.00607
Epoch: 49461, Training Loss: 0.00741
Epoch: 49461, Training Loss: 0.00575
Epoch: 49462, Training Loss: 0.00648
Epoch: 49462, Training Loss: 0.00607
Epoch: 49462, Training Loss: 0.00741
Epoch: 49462, Training Loss: 0.00575
Epoch: 49463, Training Loss: 0.00648
Epoch: 49463, Training Loss: 0.00607
Epoch: 49463, Training Loss: 0.00741
Epoch: 49463, Training Loss: 0.00575
Epoch: 49464, Training Loss: 0.00648
Epoch: 49464, Training Loss: 0.00607
Epoch: 49464, Training Loss: 0.00741
Epoch: 49464, Training Loss: 0.00575
Epoch: 49465, Training Loss: 0.00648
Epoch: 49465, Training Loss: 0.00607
Epoch: 49465, Training Loss: 0.00741
Epoch: 49465, Training Loss: 0.00575
Epoch: 49466, Training Loss: 0.00648
Epoch: 49466, Training Loss: 0.00607
Epoch: 49466, Training Loss: 0.00741
Epoch: 49466, Training Loss: 0.00575
Epoch: 49467, Training Loss: 0.00648
Epoch: 49467, Training Loss: 0.00607
Epoch: 49467, Training Loss: 0.00741
Epoch: 49467, Training Loss: 0.00575
Epoch: 49468, Training Loss: 0.00648
Epoch: 49468, Training Loss: 0.00607
Epoch: 49468, Training Loss: 0.00741
Epoch: 49468, Training Loss: 0.00575
Epoch: 49469, Training Loss: 0.00648
Epoch: 49469, Training Loss: 0.00607
Epoch: 49469, Training Loss: 0.00741
Epoch: 49469, Training Loss: 0.00575
Epoch: 49470, Training Loss: 0.00648
Epoch: 49470, Training Loss: 0.00607
Epoch: 49470, Training Loss: 0.00741
Epoch: 49470, Training Loss: 0.00575
Epoch: 49471, Training Loss: 0.00648
Epoch: 49471, Training Loss: 0.00607
Epoch: 49471, Training Loss: 0.00741
Epoch: 49471, Training Loss: 0.00575
Epoch: 49472, Training Loss: 0.00648
Epoch: 49472, Training Loss: 0.00607
Epoch: 49472, Training Loss: 0.00741
Epoch: 49472, Training Loss: 0.00575
Epoch: 49473, Training Loss: 0.00648
Epoch: 49473, Training Loss: 0.00607
Epoch: 49473, Training Loss: 0.00741
Epoch: 49473, Training Loss: 0.00575
Epoch: 49474, Training Loss: 0.00648
Epoch: 49474, Training Loss: 0.00607
Epoch: 49474, Training Loss: 0.00741
Epoch: 49474, Training Loss: 0.00575
Epoch: 49475, Training Loss: 0.00648
Epoch: 49475, Training Loss: 0.00607
Epoch: 49475, Training Loss: 0.00741
Epoch: 49475, Training Loss: 0.00575
Epoch: 49476, Training Loss: 0.00648
Epoch: 49476, Training Loss: 0.00607
Epoch: 49476, Training Loss: 0.00741
Epoch: 49476, Training Loss: 0.00575
Epoch: 49477, Training Loss: 0.00648
Epoch: 49477, Training Loss: 0.00607
Epoch: 49477, Training Loss: 0.00741
Epoch: 49477, Training Loss: 0.00575
Epoch: 49478, Training Loss: 0.00648
Epoch: 49478, Training Loss: 0.00607
Epoch: 49478, Training Loss: 0.00741
Epoch: 49478, Training Loss: 0.00575
Epoch: 49479, Training Loss: 0.00648
Epoch: 49479, Training Loss: 0.00607
Epoch: 49479, Training Loss: 0.00741
Epoch: 49479, Training Loss: 0.00575
Epoch: 49480, Training Loss: 0.00648
Epoch: 49480, Training Loss: 0.00607
Epoch: 49480, Training Loss: 0.00741
Epoch: 49480, Training Loss: 0.00575
Epoch: 49481, Training Loss: 0.00648
Epoch: 49481, Training Loss: 0.00607
Epoch: 49481, Training Loss: 0.00741
Epoch: 49481, Training Loss: 0.00575
Epoch: 49482, Training Loss: 0.00648
Epoch: 49482, Training Loss: 0.00607
Epoch: 49482, Training Loss: 0.00741
Epoch: 49482, Training Loss: 0.00575
Epoch: 49483, Training Loss: 0.00648
Epoch: 49483, Training Loss: 0.00607
Epoch: 49483, Training Loss: 0.00741
Epoch: 49483, Training Loss: 0.00575
Epoch: 49484, Training Loss: 0.00648
Epoch: 49484, Training Loss: 0.00607
Epoch: 49484, Training Loss: 0.00741
Epoch: 49484, Training Loss: 0.00575
Epoch: 49485, Training Loss: 0.00648
Epoch: 49485, Training Loss: 0.00607
Epoch: 49485, Training Loss: 0.00741
Epoch: 49485, Training Loss: 0.00575
Epoch: 49486, Training Loss: 0.00648
Epoch: 49486, Training Loss: 0.00607
Epoch: 49486, Training Loss: 0.00741
Epoch: 49486, Training Loss: 0.00575
Epoch: 49487, Training Loss: 0.00648
Epoch: 49487, Training Loss: 0.00607
Epoch: 49487, Training Loss: 0.00741
Epoch: 49487, Training Loss: 0.00575
Epoch: 49488, Training Loss: 0.00648
Epoch: 49488, Training Loss: 0.00607
Epoch: 49488, Training Loss: 0.00740
Epoch: 49488, Training Loss: 0.00575
Epoch: 49489, Training Loss: 0.00648
Epoch: 49489, Training Loss: 0.00607
Epoch: 49489, Training Loss: 0.00740
Epoch: 49489, Training Loss: 0.00575
Epoch: 49490, Training Loss: 0.00648
Epoch: 49490, Training Loss: 0.00607
Epoch: 49490, Training Loss: 0.00740
Epoch: 49490, Training Loss: 0.00575
Epoch: 49491, Training Loss: 0.00648
Epoch: 49491, Training Loss: 0.00607
Epoch: 49491, Training Loss: 0.00740
Epoch: 49491, Training Loss: 0.00575
Epoch: 49492, Training Loss: 0.00648
Epoch: 49492, Training Loss: 0.00607
Epoch: 49492, Training Loss: 0.00740
Epoch: 49492, Training Loss: 0.00575
Epoch: 49493, Training Loss: 0.00648
Epoch: 49493, Training Loss: 0.00607
Epoch: 49493, Training Loss: 0.00740
Epoch: 49493, Training Loss: 0.00575
Epoch: 49494, Training Loss: 0.00648
Epoch: 49494, Training Loss: 0.00607
Epoch: 49494, Training Loss: 0.00740
Epoch: 49494, Training Loss: 0.00575
Epoch: 49495, Training Loss: 0.00648
Epoch: 49495, Training Loss: 0.00607
Epoch: 49495, Training Loss: 0.00740
Epoch: 49495, Training Loss: 0.00575
Epoch: 49496, Training Loss: 0.00648
Epoch: 49496, Training Loss: 0.00607
Epoch: 49496, Training Loss: 0.00740
Epoch: 49496, Training Loss: 0.00575
Epoch: 49497, Training Loss: 0.00648
Epoch: 49497, Training Loss: 0.00607
Epoch: 49497, Training Loss: 0.00740
Epoch: 49497, Training Loss: 0.00575
Epoch: 49498, Training Loss: 0.00648
Epoch: 49498, Training Loss: 0.00607
Epoch: 49498, Training Loss: 0.00740
Epoch: 49498, Training Loss: 0.00575
Epoch: 49499, Training Loss: 0.00648
Epoch: 49499, Training Loss: 0.00607
Epoch: 49499, Training Loss: 0.00740
Epoch: 49499, Training Loss: 0.00575
Epoch: 49500, Training Loss: 0.00648
Epoch: 49500, Training Loss: 0.00607
Epoch: 49500, Training Loss: 0.00740
Epoch: 49500, Training Loss: 0.00575
Epoch: 49501, Training Loss: 0.00648
Epoch: 49501, Training Loss: 0.00607
Epoch: 49501, Training Loss: 0.00740
Epoch: 49501, Training Loss: 0.00575
Epoch: 49502, Training Loss: 0.00648
Epoch: 49502, Training Loss: 0.00607
Epoch: 49502, Training Loss: 0.00740
Epoch: 49502, Training Loss: 0.00575
Epoch: 49503, Training Loss: 0.00648
Epoch: 49503, Training Loss: 0.00607
Epoch: 49503, Training Loss: 0.00740
Epoch: 49503, Training Loss: 0.00575
Epoch: 49504, Training Loss: 0.00648
Epoch: 49504, Training Loss: 0.00607
Epoch: 49504, Training Loss: 0.00740
Epoch: 49504, Training Loss: 0.00575
Epoch: 49505, Training Loss: 0.00648
Epoch: 49505, Training Loss: 0.00607
Epoch: 49505, Training Loss: 0.00740
Epoch: 49505, Training Loss: 0.00575
Epoch: 49506, Training Loss: 0.00648
Epoch: 49506, Training Loss: 0.00607
Epoch: 49506, Training Loss: 0.00740
Epoch: 49506, Training Loss: 0.00575
Epoch: 49507, Training Loss: 0.00648
Epoch: 49507, Training Loss: 0.00607
Epoch: 49507, Training Loss: 0.00740
Epoch: 49507, Training Loss: 0.00575
Epoch: 49508, Training Loss: 0.00648
Epoch: 49508, Training Loss: 0.00607
Epoch: 49508, Training Loss: 0.00740
Epoch: 49508, Training Loss: 0.00575
Epoch: 49509, Training Loss: 0.00648
Epoch: 49509, Training Loss: 0.00607
Epoch: 49509, Training Loss: 0.00740
Epoch: 49509, Training Loss: 0.00575
Epoch: 49510, Training Loss: 0.00648
Epoch: 49510, Training Loss: 0.00607
Epoch: 49510, Training Loss: 0.00740
Epoch: 49510, Training Loss: 0.00575
Epoch: 49511, Training Loss: 0.00648
Epoch: 49511, Training Loss: 0.00607
Epoch: 49511, Training Loss: 0.00740
Epoch: 49511, Training Loss: 0.00575
Epoch: 49512, Training Loss: 0.00648
Epoch: 49512, Training Loss: 0.00607
Epoch: 49512, Training Loss: 0.00740
Epoch: 49512, Training Loss: 0.00575
Epoch: 49513, Training Loss: 0.00648
Epoch: 49513, Training Loss: 0.00607
Epoch: 49513, Training Loss: 0.00740
Epoch: 49513, Training Loss: 0.00575
Epoch: 49514, Training Loss: 0.00648
Epoch: 49514, Training Loss: 0.00607
Epoch: 49514, Training Loss: 0.00740
Epoch: 49514, Training Loss: 0.00575
Epoch: 49515, Training Loss: 0.00648
Epoch: 49515, Training Loss: 0.00607
Epoch: 49515, Training Loss: 0.00740
Epoch: 49515, Training Loss: 0.00575
Epoch: 49516, Training Loss: 0.00648
Epoch: 49516, Training Loss: 0.00607
Epoch: 49516, Training Loss: 0.00740
Epoch: 49516, Training Loss: 0.00575
Epoch: 49517, Training Loss: 0.00648
Epoch: 49517, Training Loss: 0.00607
Epoch: 49517, Training Loss: 0.00740
Epoch: 49517, Training Loss: 0.00575
Epoch: 49518, Training Loss: 0.00648
Epoch: 49518, Training Loss: 0.00607
Epoch: 49518, Training Loss: 0.00740
Epoch: 49518, Training Loss: 0.00575
Epoch: 49519, Training Loss: 0.00648
Epoch: 49519, Training Loss: 0.00607
Epoch: 49519, Training Loss: 0.00740
Epoch: 49519, Training Loss: 0.00575
Epoch: 49520, Training Loss: 0.00648
Epoch: 49520, Training Loss: 0.00607
Epoch: 49520, Training Loss: 0.00740
Epoch: 49520, Training Loss: 0.00575
Epoch: 49521, Training Loss: 0.00648
Epoch: 49521, Training Loss: 0.00607
Epoch: 49521, Training Loss: 0.00740
Epoch: 49521, Training Loss: 0.00575
Epoch: 49522, Training Loss: 0.00648
Epoch: 49522, Training Loss: 0.00607
Epoch: 49522, Training Loss: 0.00740
Epoch: 49522, Training Loss: 0.00575
Epoch: 49523, Training Loss: 0.00648
Epoch: 49523, Training Loss: 0.00607
Epoch: 49523, Training Loss: 0.00740
Epoch: 49523, Training Loss: 0.00575
Epoch: 49524, Training Loss: 0.00648
Epoch: 49524, Training Loss: 0.00607
Epoch: 49524, Training Loss: 0.00740
Epoch: 49524, Training Loss: 0.00575
Epoch: 49525, Training Loss: 0.00648
Epoch: 49525, Training Loss: 0.00607
Epoch: 49525, Training Loss: 0.00740
Epoch: 49525, Training Loss: 0.00575
Epoch: 49526, Training Loss: 0.00648
Epoch: 49526, Training Loss: 0.00607
Epoch: 49526, Training Loss: 0.00740
Epoch: 49526, Training Loss: 0.00575
Epoch: 49527, Training Loss: 0.00648
Epoch: 49527, Training Loss: 0.00607
Epoch: 49527, Training Loss: 0.00740
Epoch: 49527, Training Loss: 0.00575
Epoch: 49528, Training Loss: 0.00648
Epoch: 49528, Training Loss: 0.00607
Epoch: 49528, Training Loss: 0.00740
Epoch: 49528, Training Loss: 0.00575
Epoch: 49529, Training Loss: 0.00648
Epoch: 49529, Training Loss: 0.00607
Epoch: 49529, Training Loss: 0.00740
Epoch: 49529, Training Loss: 0.00575
Epoch: 49530, Training Loss: 0.00648
Epoch: 49530, Training Loss: 0.00607
Epoch: 49530, Training Loss: 0.00740
Epoch: 49530, Training Loss: 0.00575
Epoch: 49531, Training Loss: 0.00648
Epoch: 49531, Training Loss: 0.00607
Epoch: 49531, Training Loss: 0.00740
Epoch: 49531, Training Loss: 0.00575
Epoch: 49532, Training Loss: 0.00647
Epoch: 49532, Training Loss: 0.00607
Epoch: 49532, Training Loss: 0.00740
Epoch: 49532, Training Loss: 0.00575
Epoch: 49533, Training Loss: 0.00647
Epoch: 49533, Training Loss: 0.00607
Epoch: 49533, Training Loss: 0.00740
Epoch: 49533, Training Loss: 0.00575
Epoch: 49534, Training Loss: 0.00647
Epoch: 49534, Training Loss: 0.00607
Epoch: 49534, Training Loss: 0.00740
Epoch: 49534, Training Loss: 0.00575
Epoch: 49535, Training Loss: 0.00647
Epoch: 49535, Training Loss: 0.00607
Epoch: 49535, Training Loss: 0.00740
Epoch: 49535, Training Loss: 0.00575
Epoch: 49536, Training Loss: 0.00647
Epoch: 49536, Training Loss: 0.00607
Epoch: 49536, Training Loss: 0.00740
Epoch: 49536, Training Loss: 0.00575
Epoch: 49537, Training Loss: 0.00647
Epoch: 49537, Training Loss: 0.00607
Epoch: 49537, Training Loss: 0.00740
Epoch: 49537, Training Loss: 0.00575
Epoch: 49538, Training Loss: 0.00647
Epoch: 49538, Training Loss: 0.00607
Epoch: 49538, Training Loss: 0.00740
Epoch: 49538, Training Loss: 0.00575
Epoch: 49539, Training Loss: 0.00647
Epoch: 49539, Training Loss: 0.00607
Epoch: 49539, Training Loss: 0.00740
Epoch: 49539, Training Loss: 0.00575
Epoch: 49540, Training Loss: 0.00647
Epoch: 49540, Training Loss: 0.00607
Epoch: 49540, Training Loss: 0.00740
Epoch: 49540, Training Loss: 0.00575
Epoch: 49541, Training Loss: 0.00647
Epoch: 49541, Training Loss: 0.00607
Epoch: 49541, Training Loss: 0.00740
Epoch: 49541, Training Loss: 0.00575
Epoch: 49542, Training Loss: 0.00647
Epoch: 49542, Training Loss: 0.00607
Epoch: 49542, Training Loss: 0.00740
Epoch: 49542, Training Loss: 0.00575
Epoch: 49543, Training Loss: 0.00647
Epoch: 49543, Training Loss: 0.00607
Epoch: 49543, Training Loss: 0.00740
Epoch: 49543, Training Loss: 0.00575
Epoch: 49544, Training Loss: 0.00647
Epoch: 49544, Training Loss: 0.00607
Epoch: 49544, Training Loss: 0.00740
Epoch: 49544, Training Loss: 0.00575
Epoch: 49545, Training Loss: 0.00647
Epoch: 49545, Training Loss: 0.00607
Epoch: 49545, Training Loss: 0.00740
Epoch: 49545, Training Loss: 0.00575
Epoch: 49546, Training Loss: 0.00647
Epoch: 49546, Training Loss: 0.00607
Epoch: 49546, Training Loss: 0.00740
Epoch: 49546, Training Loss: 0.00574
Epoch: 49547, Training Loss: 0.00647
Epoch: 49547, Training Loss: 0.00607
Epoch: 49547, Training Loss: 0.00740
Epoch: 49547, Training Loss: 0.00574
Epoch: 49548, Training Loss: 0.00647
Epoch: 49548, Training Loss: 0.00607
Epoch: 49548, Training Loss: 0.00740
Epoch: 49548, Training Loss: 0.00574
Epoch: 49549, Training Loss: 0.00647
Epoch: 49549, Training Loss: 0.00607
Epoch: 49549, Training Loss: 0.00740
Epoch: 49549, Training Loss: 0.00574
Epoch: 49550, Training Loss: 0.00647
Epoch: 49550, Training Loss: 0.00607
Epoch: 49550, Training Loss: 0.00740
Epoch: 49550, Training Loss: 0.00574
Epoch: 49551, Training Loss: 0.00647
Epoch: 49551, Training Loss: 0.00607
Epoch: 49551, Training Loss: 0.00740
Epoch: 49551, Training Loss: 0.00574
Epoch: 49552, Training Loss: 0.00647
Epoch: 49552, Training Loss: 0.00607
Epoch: 49552, Training Loss: 0.00740
Epoch: 49552, Training Loss: 0.00574
Epoch: 49553, Training Loss: 0.00647
Epoch: 49553, Training Loss: 0.00607
Epoch: 49553, Training Loss: 0.00740
Epoch: 49553, Training Loss: 0.00574
Epoch: 49554, Training Loss: 0.00647
Epoch: 49554, Training Loss: 0.00607
Epoch: 49554, Training Loss: 0.00740
Epoch: 49554, Training Loss: 0.00574
Epoch: 49555, Training Loss: 0.00647
Epoch: 49555, Training Loss: 0.00607
Epoch: 49555, Training Loss: 0.00740
Epoch: 49555, Training Loss: 0.00574
Epoch: 49556, Training Loss: 0.00647
Epoch: 49556, Training Loss: 0.00607
Epoch: 49556, Training Loss: 0.00740
Epoch: 49556, Training Loss: 0.00574
Epoch: 49557, Training Loss: 0.00647
Epoch: 49557, Training Loss: 0.00607
Epoch: 49557, Training Loss: 0.00740
Epoch: 49557, Training Loss: 0.00574
Epoch: 49558, Training Loss: 0.00647
Epoch: 49558, Training Loss: 0.00607
Epoch: 49558, Training Loss: 0.00740
Epoch: 49558, Training Loss: 0.00574
Epoch: 49559, Training Loss: 0.00647
Epoch: 49559, Training Loss: 0.00607
Epoch: 49559, Training Loss: 0.00740
Epoch: 49559, Training Loss: 0.00574
Epoch: 49560, Training Loss: 0.00647
Epoch: 49560, Training Loss: 0.00607
Epoch: 49560, Training Loss: 0.00740
Epoch: 49560, Training Loss: 0.00574
Epoch: 49561, Training Loss: 0.00647
Epoch: 49561, Training Loss: 0.00607
Epoch: 49561, Training Loss: 0.00740
Epoch: 49561, Training Loss: 0.00647
Epoch: 49561, Training Loss: 0.00607
Epoch: 49561, Training Loss: 0.00740
Epoch: 49561, Training Loss: 0.00574
...
Epoch: 49805, Training Loss: 0.00646
Epoch: 49805, Training Loss: 0.00605
Epoch: 49805, Training Loss: 0.00738
Epoch: 49805, Training Loss: 0.00573
[output truncated: the loss is printed once per XOR pattern each epoch; across epochs 49561-49805 the four per-pattern losses decrease very slowly, from about 0.00647 / 0.00607 / 0.00740 / 0.00574 to 0.00646 / 0.00605 / 0.00738 / 0.00573]
Epoch: 49805, Training Loss: 0.00738
Epoch: 49805, Training Loss: 0.00573
Epoch: 49806, Training Loss: 0.00646
Epoch: 49806, Training Loss: 0.00605
Epoch: 49806, Training Loss: 0.00738
Epoch: 49806, Training Loss: 0.00573
Epoch: 49807, Training Loss: 0.00646
Epoch: 49807, Training Loss: 0.00605
Epoch: 49807, Training Loss: 0.00738
Epoch: 49807, Training Loss: 0.00573
Epoch: 49808, Training Loss: 0.00646
Epoch: 49808, Training Loss: 0.00605
Epoch: 49808, Training Loss: 0.00738
Epoch: 49808, Training Loss: 0.00573
Epoch: 49809, Training Loss: 0.00646
Epoch: 49809, Training Loss: 0.00605
Epoch: 49809, Training Loss: 0.00738
Epoch: 49809, Training Loss: 0.00573
Epoch: 49810, Training Loss: 0.00646
Epoch: 49810, Training Loss: 0.00605
Epoch: 49810, Training Loss: 0.00738
Epoch: 49810, Training Loss: 0.00573
Epoch: 49811, Training Loss: 0.00646
Epoch: 49811, Training Loss: 0.00605
Epoch: 49811, Training Loss: 0.00738
Epoch: 49811, Training Loss: 0.00573
Epoch: 49812, Training Loss: 0.00646
Epoch: 49812, Training Loss: 0.00605
Epoch: 49812, Training Loss: 0.00738
Epoch: 49812, Training Loss: 0.00573
Epoch: 49813, Training Loss: 0.00646
Epoch: 49813, Training Loss: 0.00605
Epoch: 49813, Training Loss: 0.00738
Epoch: 49813, Training Loss: 0.00573
Epoch: 49814, Training Loss: 0.00646
Epoch: 49814, Training Loss: 0.00605
Epoch: 49814, Training Loss: 0.00738
Epoch: 49814, Training Loss: 0.00573
Epoch: 49815, Training Loss: 0.00646
Epoch: 49815, Training Loss: 0.00605
Epoch: 49815, Training Loss: 0.00738
Epoch: 49815, Training Loss: 0.00573
Epoch: 49816, Training Loss: 0.00646
Epoch: 49816, Training Loss: 0.00605
Epoch: 49816, Training Loss: 0.00738
Epoch: 49816, Training Loss: 0.00573
Epoch: 49817, Training Loss: 0.00646
Epoch: 49817, Training Loss: 0.00605
Epoch: 49817, Training Loss: 0.00738
Epoch: 49817, Training Loss: 0.00573
Epoch: 49818, Training Loss: 0.00646
Epoch: 49818, Training Loss: 0.00605
Epoch: 49818, Training Loss: 0.00738
Epoch: 49818, Training Loss: 0.00573
Epoch: 49819, Training Loss: 0.00646
Epoch: 49819, Training Loss: 0.00605
Epoch: 49819, Training Loss: 0.00738
Epoch: 49819, Training Loss: 0.00573
Epoch: 49820, Training Loss: 0.00646
Epoch: 49820, Training Loss: 0.00605
Epoch: 49820, Training Loss: 0.00738
Epoch: 49820, Training Loss: 0.00573
Epoch: 49821, Training Loss: 0.00646
Epoch: 49821, Training Loss: 0.00605
Epoch: 49821, Training Loss: 0.00738
Epoch: 49821, Training Loss: 0.00573
Epoch: 49822, Training Loss: 0.00646
Epoch: 49822, Training Loss: 0.00605
Epoch: 49822, Training Loss: 0.00738
Epoch: 49822, Training Loss: 0.00573
Epoch: 49823, Training Loss: 0.00646
Epoch: 49823, Training Loss: 0.00605
Epoch: 49823, Training Loss: 0.00738
Epoch: 49823, Training Loss: 0.00573
Epoch: 49824, Training Loss: 0.00646
Epoch: 49824, Training Loss: 0.00605
Epoch: 49824, Training Loss: 0.00738
Epoch: 49824, Training Loss: 0.00573
Epoch: 49825, Training Loss: 0.00646
Epoch: 49825, Training Loss: 0.00605
Epoch: 49825, Training Loss: 0.00738
Epoch: 49825, Training Loss: 0.00573
Epoch: 49826, Training Loss: 0.00645
Epoch: 49826, Training Loss: 0.00605
Epoch: 49826, Training Loss: 0.00738
Epoch: 49826, Training Loss: 0.00573
Epoch: 49827, Training Loss: 0.00645
Epoch: 49827, Training Loss: 0.00605
Epoch: 49827, Training Loss: 0.00738
Epoch: 49827, Training Loss: 0.00573
Epoch: 49828, Training Loss: 0.00645
Epoch: 49828, Training Loss: 0.00605
Epoch: 49828, Training Loss: 0.00738
Epoch: 49828, Training Loss: 0.00573
Epoch: 49829, Training Loss: 0.00645
Epoch: 49829, Training Loss: 0.00605
Epoch: 49829, Training Loss: 0.00738
Epoch: 49829, Training Loss: 0.00573
Epoch: 49830, Training Loss: 0.00645
Epoch: 49830, Training Loss: 0.00605
Epoch: 49830, Training Loss: 0.00738
Epoch: 49830, Training Loss: 0.00573
Epoch: 49831, Training Loss: 0.00645
Epoch: 49831, Training Loss: 0.00605
Epoch: 49831, Training Loss: 0.00738
Epoch: 49831, Training Loss: 0.00573
Epoch: 49832, Training Loss: 0.00645
Epoch: 49832, Training Loss: 0.00605
Epoch: 49832, Training Loss: 0.00738
Epoch: 49832, Training Loss: 0.00573
Epoch: 49833, Training Loss: 0.00645
Epoch: 49833, Training Loss: 0.00605
Epoch: 49833, Training Loss: 0.00738
Epoch: 49833, Training Loss: 0.00573
Epoch: 49834, Training Loss: 0.00645
Epoch: 49834, Training Loss: 0.00605
Epoch: 49834, Training Loss: 0.00738
Epoch: 49834, Training Loss: 0.00573
Epoch: 49835, Training Loss: 0.00645
Epoch: 49835, Training Loss: 0.00605
Epoch: 49835, Training Loss: 0.00738
Epoch: 49835, Training Loss: 0.00573
Epoch: 49836, Training Loss: 0.00645
Epoch: 49836, Training Loss: 0.00605
Epoch: 49836, Training Loss: 0.00738
Epoch: 49836, Training Loss: 0.00573
Epoch: 49837, Training Loss: 0.00645
Epoch: 49837, Training Loss: 0.00605
Epoch: 49837, Training Loss: 0.00738
Epoch: 49837, Training Loss: 0.00573
Epoch: 49838, Training Loss: 0.00645
Epoch: 49838, Training Loss: 0.00605
Epoch: 49838, Training Loss: 0.00738
Epoch: 49838, Training Loss: 0.00573
Epoch: 49839, Training Loss: 0.00645
Epoch: 49839, Training Loss: 0.00605
Epoch: 49839, Training Loss: 0.00738
Epoch: 49839, Training Loss: 0.00573
Epoch: 49840, Training Loss: 0.00645
Epoch: 49840, Training Loss: 0.00605
Epoch: 49840, Training Loss: 0.00738
Epoch: 49840, Training Loss: 0.00573
Epoch: 49841, Training Loss: 0.00645
Epoch: 49841, Training Loss: 0.00605
Epoch: 49841, Training Loss: 0.00738
Epoch: 49841, Training Loss: 0.00573
Epoch: 49842, Training Loss: 0.00645
Epoch: 49842, Training Loss: 0.00605
Epoch: 49842, Training Loss: 0.00738
Epoch: 49842, Training Loss: 0.00573
Epoch: 49843, Training Loss: 0.00645
Epoch: 49843, Training Loss: 0.00605
Epoch: 49843, Training Loss: 0.00738
Epoch: 49843, Training Loss: 0.00573
Epoch: 49844, Training Loss: 0.00645
Epoch: 49844, Training Loss: 0.00605
Epoch: 49844, Training Loss: 0.00738
Epoch: 49844, Training Loss: 0.00573
Epoch: 49845, Training Loss: 0.00645
Epoch: 49845, Training Loss: 0.00605
Epoch: 49845, Training Loss: 0.00738
Epoch: 49845, Training Loss: 0.00573
Epoch: 49846, Training Loss: 0.00645
Epoch: 49846, Training Loss: 0.00605
Epoch: 49846, Training Loss: 0.00738
Epoch: 49846, Training Loss: 0.00573
Epoch: 49847, Training Loss: 0.00645
Epoch: 49847, Training Loss: 0.00605
Epoch: 49847, Training Loss: 0.00738
Epoch: 49847, Training Loss: 0.00573
Epoch: 49848, Training Loss: 0.00645
Epoch: 49848, Training Loss: 0.00605
Epoch: 49848, Training Loss: 0.00738
Epoch: 49848, Training Loss: 0.00573
Epoch: 49849, Training Loss: 0.00645
Epoch: 49849, Training Loss: 0.00605
Epoch: 49849, Training Loss: 0.00738
Epoch: 49849, Training Loss: 0.00573
Epoch: 49850, Training Loss: 0.00645
Epoch: 49850, Training Loss: 0.00605
Epoch: 49850, Training Loss: 0.00738
Epoch: 49850, Training Loss: 0.00573
Epoch: 49851, Training Loss: 0.00645
Epoch: 49851, Training Loss: 0.00605
Epoch: 49851, Training Loss: 0.00738
Epoch: 49851, Training Loss: 0.00573
Epoch: 49852, Training Loss: 0.00645
Epoch: 49852, Training Loss: 0.00605
Epoch: 49852, Training Loss: 0.00738
Epoch: 49852, Training Loss: 0.00573
Epoch: 49853, Training Loss: 0.00645
Epoch: 49853, Training Loss: 0.00605
Epoch: 49853, Training Loss: 0.00738
Epoch: 49853, Training Loss: 0.00573
Epoch: 49854, Training Loss: 0.00645
Epoch: 49854, Training Loss: 0.00605
Epoch: 49854, Training Loss: 0.00738
Epoch: 49854, Training Loss: 0.00573
Epoch: 49855, Training Loss: 0.00645
Epoch: 49855, Training Loss: 0.00605
Epoch: 49855, Training Loss: 0.00738
Epoch: 49855, Training Loss: 0.00573
Epoch: 49856, Training Loss: 0.00645
Epoch: 49856, Training Loss: 0.00605
Epoch: 49856, Training Loss: 0.00738
Epoch: 49856, Training Loss: 0.00573
Epoch: 49857, Training Loss: 0.00645
Epoch: 49857, Training Loss: 0.00605
Epoch: 49857, Training Loss: 0.00738
Epoch: 49857, Training Loss: 0.00573
Epoch: 49858, Training Loss: 0.00645
Epoch: 49858, Training Loss: 0.00605
Epoch: 49858, Training Loss: 0.00738
Epoch: 49858, Training Loss: 0.00573
Epoch: 49859, Training Loss: 0.00645
Epoch: 49859, Training Loss: 0.00605
Epoch: 49859, Training Loss: 0.00738
Epoch: 49859, Training Loss: 0.00573
Epoch: 49860, Training Loss: 0.00645
Epoch: 49860, Training Loss: 0.00605
Epoch: 49860, Training Loss: 0.00738
Epoch: 49860, Training Loss: 0.00573
Epoch: 49861, Training Loss: 0.00645
Epoch: 49861, Training Loss: 0.00605
Epoch: 49861, Training Loss: 0.00738
Epoch: 49861, Training Loss: 0.00573
Epoch: 49862, Training Loss: 0.00645
Epoch: 49862, Training Loss: 0.00605
Epoch: 49862, Training Loss: 0.00738
Epoch: 49862, Training Loss: 0.00573
Epoch: 49863, Training Loss: 0.00645
Epoch: 49863, Training Loss: 0.00605
Epoch: 49863, Training Loss: 0.00738
Epoch: 49863, Training Loss: 0.00573
Epoch: 49864, Training Loss: 0.00645
Epoch: 49864, Training Loss: 0.00605
Epoch: 49864, Training Loss: 0.00738
Epoch: 49864, Training Loss: 0.00573
Epoch: 49865, Training Loss: 0.00645
Epoch: 49865, Training Loss: 0.00605
Epoch: 49865, Training Loss: 0.00738
Epoch: 49865, Training Loss: 0.00573
Epoch: 49866, Training Loss: 0.00645
Epoch: 49866, Training Loss: 0.00605
Epoch: 49866, Training Loss: 0.00738
Epoch: 49866, Training Loss: 0.00573
Epoch: 49867, Training Loss: 0.00645
Epoch: 49867, Training Loss: 0.00605
Epoch: 49867, Training Loss: 0.00738
Epoch: 49867, Training Loss: 0.00573
Epoch: 49868, Training Loss: 0.00645
Epoch: 49868, Training Loss: 0.00605
Epoch: 49868, Training Loss: 0.00738
Epoch: 49868, Training Loss: 0.00573
Epoch: 49869, Training Loss: 0.00645
Epoch: 49869, Training Loss: 0.00605
Epoch: 49869, Training Loss: 0.00738
Epoch: 49869, Training Loss: 0.00573
Epoch: 49870, Training Loss: 0.00645
Epoch: 49870, Training Loss: 0.00605
Epoch: 49870, Training Loss: 0.00738
Epoch: 49870, Training Loss: 0.00573
Epoch: 49871, Training Loss: 0.00645
Epoch: 49871, Training Loss: 0.00605
Epoch: 49871, Training Loss: 0.00738
Epoch: 49871, Training Loss: 0.00573
Epoch: 49872, Training Loss: 0.00645
Epoch: 49872, Training Loss: 0.00605
Epoch: 49872, Training Loss: 0.00738
Epoch: 49872, Training Loss: 0.00573
Epoch: 49873, Training Loss: 0.00645
Epoch: 49873, Training Loss: 0.00605
Epoch: 49873, Training Loss: 0.00738
Epoch: 49873, Training Loss: 0.00573
Epoch: 49874, Training Loss: 0.00645
Epoch: 49874, Training Loss: 0.00605
Epoch: 49874, Training Loss: 0.00738
Epoch: 49874, Training Loss: 0.00573
Epoch: 49875, Training Loss: 0.00645
Epoch: 49875, Training Loss: 0.00605
Epoch: 49875, Training Loss: 0.00737
Epoch: 49875, Training Loss: 0.00573
Epoch: 49876, Training Loss: 0.00645
Epoch: 49876, Training Loss: 0.00605
Epoch: 49876, Training Loss: 0.00737
Epoch: 49876, Training Loss: 0.00573
Epoch: 49877, Training Loss: 0.00645
Epoch: 49877, Training Loss: 0.00605
Epoch: 49877, Training Loss: 0.00737
Epoch: 49877, Training Loss: 0.00573
Epoch: 49878, Training Loss: 0.00645
Epoch: 49878, Training Loss: 0.00605
Epoch: 49878, Training Loss: 0.00737
Epoch: 49878, Training Loss: 0.00573
Epoch: 49879, Training Loss: 0.00645
Epoch: 49879, Training Loss: 0.00605
Epoch: 49879, Training Loss: 0.00737
Epoch: 49879, Training Loss: 0.00573
Epoch: 49880, Training Loss: 0.00645
Epoch: 49880, Training Loss: 0.00605
Epoch: 49880, Training Loss: 0.00737
Epoch: 49880, Training Loss: 0.00573
Epoch: 49881, Training Loss: 0.00645
Epoch: 49881, Training Loss: 0.00605
Epoch: 49881, Training Loss: 0.00737
Epoch: 49881, Training Loss: 0.00573
Epoch: 49882, Training Loss: 0.00645
Epoch: 49882, Training Loss: 0.00605
Epoch: 49882, Training Loss: 0.00737
Epoch: 49882, Training Loss: 0.00572
Epoch: 49883, Training Loss: 0.00645
Epoch: 49883, Training Loss: 0.00605
Epoch: 49883, Training Loss: 0.00737
Epoch: 49883, Training Loss: 0.00572
Epoch: 49884, Training Loss: 0.00645
Epoch: 49884, Training Loss: 0.00605
Epoch: 49884, Training Loss: 0.00737
Epoch: 49884, Training Loss: 0.00572
Epoch: 49885, Training Loss: 0.00645
Epoch: 49885, Training Loss: 0.00605
Epoch: 49885, Training Loss: 0.00737
Epoch: 49885, Training Loss: 0.00572
Epoch: 49886, Training Loss: 0.00645
Epoch: 49886, Training Loss: 0.00605
Epoch: 49886, Training Loss: 0.00737
Epoch: 49886, Training Loss: 0.00572
Epoch: 49887, Training Loss: 0.00645
Epoch: 49887, Training Loss: 0.00605
Epoch: 49887, Training Loss: 0.00737
Epoch: 49887, Training Loss: 0.00572
Epoch: 49888, Training Loss: 0.00645
Epoch: 49888, Training Loss: 0.00605
Epoch: 49888, Training Loss: 0.00737
Epoch: 49888, Training Loss: 0.00572
Epoch: 49889, Training Loss: 0.00645
Epoch: 49889, Training Loss: 0.00605
Epoch: 49889, Training Loss: 0.00737
Epoch: 49889, Training Loss: 0.00572
Epoch: 49890, Training Loss: 0.00645
Epoch: 49890, Training Loss: 0.00605
Epoch: 49890, Training Loss: 0.00737
Epoch: 49890, Training Loss: 0.00572
Epoch: 49891, Training Loss: 0.00645
Epoch: 49891, Training Loss: 0.00605
Epoch: 49891, Training Loss: 0.00737
Epoch: 49891, Training Loss: 0.00572
Epoch: 49892, Training Loss: 0.00645
Epoch: 49892, Training Loss: 0.00605
Epoch: 49892, Training Loss: 0.00737
Epoch: 49892, Training Loss: 0.00572
Epoch: 49893, Training Loss: 0.00645
Epoch: 49893, Training Loss: 0.00605
Epoch: 49893, Training Loss: 0.00737
Epoch: 49893, Training Loss: 0.00572
Epoch: 49894, Training Loss: 0.00645
Epoch: 49894, Training Loss: 0.00605
Epoch: 49894, Training Loss: 0.00737
Epoch: 49894, Training Loss: 0.00572
Epoch: 49895, Training Loss: 0.00645
Epoch: 49895, Training Loss: 0.00605
Epoch: 49895, Training Loss: 0.00737
Epoch: 49895, Training Loss: 0.00572
Epoch: 49896, Training Loss: 0.00645
Epoch: 49896, Training Loss: 0.00605
Epoch: 49896, Training Loss: 0.00737
Epoch: 49896, Training Loss: 0.00572
Epoch: 49897, Training Loss: 0.00645
Epoch: 49897, Training Loss: 0.00605
Epoch: 49897, Training Loss: 0.00737
Epoch: 49897, Training Loss: 0.00572
Epoch: 49898, Training Loss: 0.00645
Epoch: 49898, Training Loss: 0.00605
Epoch: 49898, Training Loss: 0.00737
Epoch: 49898, Training Loss: 0.00572
Epoch: 49899, Training Loss: 0.00645
Epoch: 49899, Training Loss: 0.00605
Epoch: 49899, Training Loss: 0.00737
Epoch: 49899, Training Loss: 0.00572
Epoch: 49900, Training Loss: 0.00645
Epoch: 49900, Training Loss: 0.00605
Epoch: 49900, Training Loss: 0.00737
Epoch: 49900, Training Loss: 0.00572
Epoch: 49901, Training Loss: 0.00645
Epoch: 49901, Training Loss: 0.00605
Epoch: 49901, Training Loss: 0.00737
Epoch: 49901, Training Loss: 0.00572
Epoch: 49902, Training Loss: 0.00645
Epoch: 49902, Training Loss: 0.00605
Epoch: 49902, Training Loss: 0.00737
Epoch: 49902, Training Loss: 0.00572
Epoch: 49903, Training Loss: 0.00645
Epoch: 49903, Training Loss: 0.00605
Epoch: 49903, Training Loss: 0.00737
Epoch: 49903, Training Loss: 0.00572
Epoch: 49904, Training Loss: 0.00645
Epoch: 49904, Training Loss: 0.00605
Epoch: 49904, Training Loss: 0.00737
Epoch: 49904, Training Loss: 0.00572
Epoch: 49905, Training Loss: 0.00645
Epoch: 49905, Training Loss: 0.00605
Epoch: 49905, Training Loss: 0.00737
Epoch: 49905, Training Loss: 0.00572
Epoch: 49906, Training Loss: 0.00645
Epoch: 49906, Training Loss: 0.00605
Epoch: 49906, Training Loss: 0.00737
Epoch: 49906, Training Loss: 0.00572
Epoch: 49907, Training Loss: 0.00645
Epoch: 49907, Training Loss: 0.00605
Epoch: 49907, Training Loss: 0.00737
Epoch: 49907, Training Loss: 0.00572
Epoch: 49908, Training Loss: 0.00645
Epoch: 49908, Training Loss: 0.00605
Epoch: 49908, Training Loss: 0.00737
Epoch: 49908, Training Loss: 0.00572
Epoch: 49909, Training Loss: 0.00645
Epoch: 49909, Training Loss: 0.00605
Epoch: 49909, Training Loss: 0.00737
Epoch: 49909, Training Loss: 0.00572
Epoch: 49910, Training Loss: 0.00645
Epoch: 49910, Training Loss: 0.00605
Epoch: 49910, Training Loss: 0.00737
Epoch: 49910, Training Loss: 0.00572
Epoch: 49911, Training Loss: 0.00645
Epoch: 49911, Training Loss: 0.00605
Epoch: 49911, Training Loss: 0.00737
Epoch: 49911, Training Loss: 0.00572
Epoch: 49912, Training Loss: 0.00645
Epoch: 49912, Training Loss: 0.00605
Epoch: 49912, Training Loss: 0.00737
Epoch: 49912, Training Loss: 0.00572
Epoch: 49913, Training Loss: 0.00645
Epoch: 49913, Training Loss: 0.00605
Epoch: 49913, Training Loss: 0.00737
Epoch: 49913, Training Loss: 0.00572
Epoch: 49914, Training Loss: 0.00645
Epoch: 49914, Training Loss: 0.00605
Epoch: 49914, Training Loss: 0.00737
Epoch: 49914, Training Loss: 0.00572
Epoch: 49915, Training Loss: 0.00645
Epoch: 49915, Training Loss: 0.00605
Epoch: 49915, Training Loss: 0.00737
Epoch: 49915, Training Loss: 0.00572
Epoch: 49916, Training Loss: 0.00645
Epoch: 49916, Training Loss: 0.00605
Epoch: 49916, Training Loss: 0.00737
Epoch: 49916, Training Loss: 0.00572
Epoch: 49917, Training Loss: 0.00645
Epoch: 49917, Training Loss: 0.00604
Epoch: 49917, Training Loss: 0.00737
Epoch: 49917, Training Loss: 0.00572
Epoch: 49918, Training Loss: 0.00645
Epoch: 49918, Training Loss: 0.00604
Epoch: 49918, Training Loss: 0.00737
Epoch: 49918, Training Loss: 0.00572
Epoch: 49919, Training Loss: 0.00645
Epoch: 49919, Training Loss: 0.00604
Epoch: 49919, Training Loss: 0.00737
Epoch: 49919, Training Loss: 0.00572
Epoch: 49920, Training Loss: 0.00645
Epoch: 49920, Training Loss: 0.00604
Epoch: 49920, Training Loss: 0.00737
Epoch: 49920, Training Loss: 0.00572
Epoch: 49921, Training Loss: 0.00645
Epoch: 49921, Training Loss: 0.00604
Epoch: 49921, Training Loss: 0.00737
Epoch: 49921, Training Loss: 0.00572
Epoch: 49922, Training Loss: 0.00645
Epoch: 49922, Training Loss: 0.00604
Epoch: 49922, Training Loss: 0.00737
Epoch: 49922, Training Loss: 0.00572
Epoch: 49923, Training Loss: 0.00645
Epoch: 49923, Training Loss: 0.00604
Epoch: 49923, Training Loss: 0.00737
Epoch: 49923, Training Loss: 0.00572
Epoch: 49924, Training Loss: 0.00645
Epoch: 49924, Training Loss: 0.00604
Epoch: 49924, Training Loss: 0.00737
Epoch: 49924, Training Loss: 0.00572
Epoch: 49925, Training Loss: 0.00645
Epoch: 49925, Training Loss: 0.00604
Epoch: 49925, Training Loss: 0.00737
Epoch: 49925, Training Loss: 0.00572
Epoch: 49926, Training Loss: 0.00645
Epoch: 49926, Training Loss: 0.00604
Epoch: 49926, Training Loss: 0.00737
Epoch: 49926, Training Loss: 0.00572
Epoch: 49927, Training Loss: 0.00645
Epoch: 49927, Training Loss: 0.00604
Epoch: 49927, Training Loss: 0.00737
Epoch: 49927, Training Loss: 0.00572
Epoch: 49928, Training Loss: 0.00645
Epoch: 49928, Training Loss: 0.00604
Epoch: 49928, Training Loss: 0.00737
Epoch: 49928, Training Loss: 0.00572
Epoch: 49929, Training Loss: 0.00645
Epoch: 49929, Training Loss: 0.00604
Epoch: 49929, Training Loss: 0.00737
Epoch: 49929, Training Loss: 0.00572
Epoch: 49930, Training Loss: 0.00645
Epoch: 49930, Training Loss: 0.00604
Epoch: 49930, Training Loss: 0.00737
Epoch: 49930, Training Loss: 0.00572
Epoch: 49931, Training Loss: 0.00645
Epoch: 49931, Training Loss: 0.00604
Epoch: 49931, Training Loss: 0.00737
Epoch: 49931, Training Loss: 0.00572
Epoch: 49932, Training Loss: 0.00645
Epoch: 49932, Training Loss: 0.00604
Epoch: 49932, Training Loss: 0.00737
Epoch: 49932, Training Loss: 0.00572
Epoch: 49933, Training Loss: 0.00645
Epoch: 49933, Training Loss: 0.00604
Epoch: 49933, Training Loss: 0.00737
Epoch: 49933, Training Loss: 0.00572
Epoch: 49934, Training Loss: 0.00645
Epoch: 49934, Training Loss: 0.00604
Epoch: 49934, Training Loss: 0.00737
Epoch: 49934, Training Loss: 0.00572
Epoch: 49935, Training Loss: 0.00645
Epoch: 49935, Training Loss: 0.00604
Epoch: 49935, Training Loss: 0.00737
Epoch: 49935, Training Loss: 0.00572
Epoch: 49936, Training Loss: 0.00645
Epoch: 49936, Training Loss: 0.00604
Epoch: 49936, Training Loss: 0.00737
Epoch: 49936, Training Loss: 0.00572
Epoch: 49937, Training Loss: 0.00645
Epoch: 49937, Training Loss: 0.00604
Epoch: 49937, Training Loss: 0.00737
Epoch: 49937, Training Loss: 0.00572
Epoch: 49938, Training Loss: 0.00645
Epoch: 49938, Training Loss: 0.00604
Epoch: 49938, Training Loss: 0.00737
Epoch: 49938, Training Loss: 0.00572
Epoch: 49939, Training Loss: 0.00645
Epoch: 49939, Training Loss: 0.00604
Epoch: 49939, Training Loss: 0.00737
Epoch: 49939, Training Loss: 0.00572
Epoch: 49940, Training Loss: 0.00645
Epoch: 49940, Training Loss: 0.00604
Epoch: 49940, Training Loss: 0.00737
Epoch: 49940, Training Loss: 0.00572
Epoch: 49941, Training Loss: 0.00645
Epoch: 49941, Training Loss: 0.00604
Epoch: 49941, Training Loss: 0.00737
Epoch: 49941, Training Loss: 0.00572
Epoch: 49942, Training Loss: 0.00645
Epoch: 49942, Training Loss: 0.00604
Epoch: 49942, Training Loss: 0.00737
Epoch: 49942, Training Loss: 0.00572
Epoch: 49943, Training Loss: 0.00645
Epoch: 49943, Training Loss: 0.00604
Epoch: 49943, Training Loss: 0.00737
Epoch: 49943, Training Loss: 0.00572
Epoch: 49944, Training Loss: 0.00645
Epoch: 49944, Training Loss: 0.00604
Epoch: 49944, Training Loss: 0.00737
Epoch: 49944, Training Loss: 0.00572
Epoch: 49945, Training Loss: 0.00645
Epoch: 49945, Training Loss: 0.00604
Epoch: 49945, Training Loss: 0.00737
Epoch: 49945, Training Loss: 0.00572
Epoch: 49946, Training Loss: 0.00645
Epoch: 49946, Training Loss: 0.00604
Epoch: 49946, Training Loss: 0.00737
Epoch: 49946, Training Loss: 0.00572
Epoch: 49947, Training Loss: 0.00645
Epoch: 49947, Training Loss: 0.00604
Epoch: 49947, Training Loss: 0.00737
Epoch: 49947, Training Loss: 0.00572
Epoch: 49948, Training Loss: 0.00645
Epoch: 49948, Training Loss: 0.00604
Epoch: 49948, Training Loss: 0.00737
Epoch: 49948, Training Loss: 0.00572
Epoch: 49949, Training Loss: 0.00645
Epoch: 49949, Training Loss: 0.00604
Epoch: 49949, Training Loss: 0.00737
Epoch: 49949, Training Loss: 0.00572
Epoch: 49950, Training Loss: 0.00645
Epoch: 49950, Training Loss: 0.00604
Epoch: 49950, Training Loss: 0.00737
Epoch: 49950, Training Loss: 0.00572
Epoch: 49951, Training Loss: 0.00645
Epoch: 49951, Training Loss: 0.00604
Epoch: 49951, Training Loss: 0.00737
Epoch: 49951, Training Loss: 0.00572
Epoch: 49952, Training Loss: 0.00645
Epoch: 49952, Training Loss: 0.00604
Epoch: 49952, Training Loss: 0.00737
Epoch: 49952, Training Loss: 0.00572
Epoch: 49953, Training Loss: 0.00645
Epoch: 49953, Training Loss: 0.00604
Epoch: 49953, Training Loss: 0.00737
Epoch: 49953, Training Loss: 0.00572
Epoch: 49954, Training Loss: 0.00645
Epoch: 49954, Training Loss: 0.00604
Epoch: 49954, Training Loss: 0.00737
Epoch: 49954, Training Loss: 0.00572
Epoch: 49955, Training Loss: 0.00645
Epoch: 49955, Training Loss: 0.00604
Epoch: 49955, Training Loss: 0.00737
Epoch: 49955, Training Loss: 0.00572
Epoch: 49956, Training Loss: 0.00645
Epoch: 49956, Training Loss: 0.00604
Epoch: 49956, Training Loss: 0.00737
Epoch: 49956, Training Loss: 0.00572
Epoch: 49957, Training Loss: 0.00645
Epoch: 49957, Training Loss: 0.00604
Epoch: 49957, Training Loss: 0.00737
Epoch: 49957, Training Loss: 0.00572
Epoch: 49958, Training Loss: 0.00645
Epoch: 49958, Training Loss: 0.00604
Epoch: 49958, Training Loss: 0.00737
Epoch: 49958, Training Loss: 0.00572
Epoch: 49959, Training Loss: 0.00645
Epoch: 49959, Training Loss: 0.00604
Epoch: 49959, Training Loss: 0.00737
Epoch: 49959, Training Loss: 0.00572
Epoch: 49960, Training Loss: 0.00645
Epoch: 49960, Training Loss: 0.00604
Epoch: 49960, Training Loss: 0.00737
Epoch: 49960, Training Loss: 0.00572
Epoch: 49961, Training Loss: 0.00645
Epoch: 49961, Training Loss: 0.00604
Epoch: 49961, Training Loss: 0.00737
Epoch: 49961, Training Loss: 0.00572
Epoch: 49962, Training Loss: 0.00645
Epoch: 49962, Training Loss: 0.00604
Epoch: 49962, Training Loss: 0.00737
Epoch: 49962, Training Loss: 0.00572
Epoch: 49963, Training Loss: 0.00645
Epoch: 49963, Training Loss: 0.00604
Epoch: 49963, Training Loss: 0.00737
Epoch: 49963, Training Loss: 0.00572
Epoch: 49964, Training Loss: 0.00645
Epoch: 49964, Training Loss: 0.00604
Epoch: 49964, Training Loss: 0.00737
Epoch: 49964, Training Loss: 0.00572
Epoch: 49965, Training Loss: 0.00645
Epoch: 49965, Training Loss: 0.00604
Epoch: 49965, Training Loss: 0.00737
Epoch: 49965, Training Loss: 0.00572
Epoch: 49966, Training Loss: 0.00645
Epoch: 49966, Training Loss: 0.00604
Epoch: 49966, Training Loss: 0.00737
Epoch: 49966, Training Loss: 0.00572
Epoch: 49967, Training Loss: 0.00645
Epoch: 49967, Training Loss: 0.00604
Epoch: 49967, Training Loss: 0.00737
Epoch: 49967, Training Loss: 0.00572
Epoch: 49968, Training Loss: 0.00645
Epoch: 49968, Training Loss: 0.00604
Epoch: 49968, Training Loss: 0.00737
Epoch: 49968, Training Loss: 0.00572
Epoch: 49969, Training Loss: 0.00645
Epoch: 49969, Training Loss: 0.00604
Epoch: 49969, Training Loss: 0.00737
Epoch: 49969, Training Loss: 0.00572
Epoch: 49970, Training Loss: 0.00645
Epoch: 49970, Training Loss: 0.00604
Epoch: 49970, Training Loss: 0.00737
Epoch: 49970, Training Loss: 0.00572
Epoch: 49971, Training Loss: 0.00645
Epoch: 49971, Training Loss: 0.00604
Epoch: 49971, Training Loss: 0.00737
Epoch: 49971, Training Loss: 0.00572
Epoch: 49972, Training Loss: 0.00645
Epoch: 49972, Training Loss: 0.00604
Epoch: 49972, Training Loss: 0.00737
Epoch: 49972, Training Loss: 0.00572
Epoch: 49973, Training Loss: 0.00645
Epoch: 49973, Training Loss: 0.00604
Epoch: 49973, Training Loss: 0.00737
Epoch: 49973, Training Loss: 0.00572
Epoch: 49974, Training Loss: 0.00644
Epoch: 49974, Training Loss: 0.00604
Epoch: 49974, Training Loss: 0.00737
Epoch: 49974, Training Loss: 0.00572
Epoch: 49975, Training Loss: 0.00644
Epoch: 49975, Training Loss: 0.00604
Epoch: 49975, Training Loss: 0.00737
Epoch: 49975, Training Loss: 0.00572
Epoch: 49976, Training Loss: 0.00644
Epoch: 49976, Training Loss: 0.00604
Epoch: 49976, Training Loss: 0.00737
Epoch: 49976, Training Loss: 0.00572
Epoch: 49977, Training Loss: 0.00644
Epoch: 49977, Training Loss: 0.00604
Epoch: 49977, Training Loss: 0.00737
Epoch: 49977, Training Loss: 0.00572
Epoch: 49978, Training Loss: 0.00644
Epoch: 49978, Training Loss: 0.00604
Epoch: 49978, Training Loss: 0.00737
Epoch: 49978, Training Loss: 0.00572
Epoch: 49979, Training Loss: 0.00644
Epoch: 49979, Training Loss: 0.00604
Epoch: 49979, Training Loss: 0.00737
Epoch: 49979, Training Loss: 0.00572
Epoch: 49980, Training Loss: 0.00644
Epoch: 49980, Training Loss: 0.00604
Epoch: 49980, Training Loss: 0.00737
Epoch: 49980, Training Loss: 0.00572
Epoch: 49981, Training Loss: 0.00644
Epoch: 49981, Training Loss: 0.00604
Epoch: 49981, Training Loss: 0.00737
Epoch: 49981, Training Loss: 0.00572
Epoch: 49982, Training Loss: 0.00644
Epoch: 49982, Training Loss: 0.00604
Epoch: 49982, Training Loss: 0.00737
Epoch: 49982, Training Loss: 0.00572
Epoch: 49983, Training Loss: 0.00644
Epoch: 49983, Training Loss: 0.00604
Epoch: 49983, Training Loss: 0.00737
Epoch: 49983, Training Loss: 0.00572
Epoch: 49984, Training Loss: 0.00644
Epoch: 49984, Training Loss: 0.00604
Epoch: 49984, Training Loss: 0.00737
Epoch: 49984, Training Loss: 0.00572
Epoch: 49985, Training Loss: 0.00644
Epoch: 49985, Training Loss: 0.00604
Epoch: 49985, Training Loss: 0.00737
Epoch: 49985, Training Loss: 0.00572
Epoch: 49986, Training Loss: 0.00644
Epoch: 49986, Training Loss: 0.00604
Epoch: 49986, Training Loss: 0.00737
Epoch: 49986, Training Loss: 0.00572
Epoch: 49987, Training Loss: 0.00644
Epoch: 49987, Training Loss: 0.00604
Epoch: 49987, Training Loss: 0.00737
Epoch: 49987, Training Loss: 0.00572
Epoch: 49988, Training Loss: 0.00644
Epoch: 49988, Training Loss: 0.00604
Epoch: 49988, Training Loss: 0.00737
Epoch: 49988, Training Loss: 0.00572
Epoch: 49989, Training Loss: 0.00644
Epoch: 49989, Training Loss: 0.00604
Epoch: 49989, Training Loss: 0.00737
Epoch: 49989, Training Loss: 0.00572
Epoch: 49990, Training Loss: 0.00644
Epoch: 49990, Training Loss: 0.00604
Epoch: 49990, Training Loss: 0.00737
Epoch: 49990, Training Loss: 0.00572
Epoch: 49991, Training Loss: 0.00644
Epoch: 49991, Training Loss: 0.00604
Epoch: 49991, Training Loss: 0.00737
Epoch: 49991, Training Loss: 0.00572
Epoch: 49992, Training Loss: 0.00644
Epoch: 49992, Training Loss: 0.00604
Epoch: 49992, Training Loss: 0.00737
Epoch: 49992, Training Loss: 0.00572
Epoch: 49993, Training Loss: 0.00644
Epoch: 49993, Training Loss: 0.00604
Epoch: 49993, Training Loss: 0.00737
Epoch: 49993, Training Loss: 0.00572
Epoch: 49994, Training Loss: 0.00644
Epoch: 49994, Training Loss: 0.00604
Epoch: 49994, Training Loss: 0.00737
Epoch: 49994, Training Loss: 0.00572
Epoch: 49995, Training Loss: 0.00644
Epoch: 49995, Training Loss: 0.00604
Epoch: 49995, Training Loss: 0.00737
Epoch: 49995, Training Loss: 0.00572
Epoch: 49996, Training Loss: 0.00644
Epoch: 49996, Training Loss: 0.00604
Epoch: 49996, Training Loss: 0.00737
Epoch: 49996, Training Loss: 0.00572
Epoch: 49997, Training Loss: 0.00644
Epoch: 49997, Training Loss: 0.00604
Epoch: 49997, Training Loss: 0.00737
Epoch: 49997, Training Loss: 0.00572
Epoch: 49998, Training Loss: 0.00644
Epoch: 49998, Training Loss: 0.00604
Epoch: 49998, Training Loss: 0.00737
Epoch: 49998, Training Loss: 0.00572
Epoch: 49999, Training Loss: 0.00644
Epoch: 49999, Training Loss: 0.00604
Epoch: 49999, Training Loss: 0.00737
Epoch: 49999, Training Loss: 0.00572

Evaluate the network


In [32]:
def eval_network(input_value):
    """Test the trained network and return its scalar output."""
    network = Network([2,2,1])
    # Load the trained parameters (globals produced by the training run above).
    network.set_values(weights, biases)
    output_value = network.feedforward(input_value)
    return output_value[0][0]
    
def test():
    """A few tests to make sure the network performs properly."""
    print("0 XOR 0 = 0?: ", eval_network(np.array([0,0])[:,None]))
    print("0 XOR 1 = 1?: ", eval_network(np.array([0,1])[:,None]))
    print("1 XOR 0 = 1?: ", eval_network(np.array([1,0])[:,None]))
    print("1 XOR 1 = 0?: ", eval_network(np.array([1,1])[:,None]))   
    
test()


0 XOR 0 = 0?:  0.00706920635592
0 XOR 1 = 1?:  0.993918426743
1 XOR 0 = 1?:  0.993910528596
1 XOR 1 = 0?:  0.00636980548488
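
The raw sigmoid outputs above are close to, but not exactly, 0 and 1. To read them as hard XOR decisions, one can threshold at 0.5. A minimal sketch (the sample values are copied from the output above):

```python
def to_bit(y, threshold=0.5):
    """Map a sigmoid output in (0, 1) to a hard 0/1 decision."""
    return int(y >= threshold)

# The four network outputs printed above, in truth-table order.
outputs = [0.00706920635592, 0.993918426743, 0.993910528596, 0.00636980548488]
print([to_bit(y) for y in outputs])  # [0, 1, 1, 0] -- matches the XOR truth table
```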

Analyze the results

Let us plot the training loss across all epochs below.


In [83]:
plt.figure(1)
train_losses = np.array(train_losses)
plt.plot(epoch_n, train_losses)
plt.xlabel("Epoch")
plt.ylabel("Loss")
plt.title("Training Losses")
plt.show()


We can see that as training proceeds, the loss fluctuates but trends steadily downward.
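
The log prints one loss per training pattern, so the curve above contains four interleaved traces. Averaging each group of four gives a single per-epoch loss, which smooths the pattern-to-pattern fluctuation. A minimal sketch, using sample values taken from the log above (the actual `train_losses` array would be used instead):

```python
import numpy as np

# Four per-pattern losses per epoch, as in the log above (three sample epochs).
losses = np.array([0.00646, 0.00605, 0.00738, 0.00573] * 3)
per_epoch = losses.reshape(-1, 4).mean(axis=1)  # one averaged loss per epoch
print(per_epoch)
```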

Other methods

Weight Initialization

This part uses a different weight-initialization scheme: uniform initialization (the basic method above draws weights from a Gaussian distribution).
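
The difference from the basic method is only in `__init__`: `np.random.randn` draws from a standard Gaussian, while `np.random.rand` draws from U(0, 1), here additionally scaled by the fan-in. A minimal side-by-side sketch (the seeded generator is only for reproducibility of the illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
sizes = [2, 2, 1]

# Basic method: standard Gaussian N(0, 1), unbounded.
gauss_w = [rng.standard_normal((i, j)) for i, j in zip(sizes[1:], sizes[:-1])]

# This part: uniform U(0, 1) divided by the fan-in j, so each weight lies in [0, 1/j].
unif_w = [rng.random((i, j)) / j for i, j in zip(sizes[1:], sizes[:-1])]

for w, j in zip(unif_w, sizes[:-1]):
    assert 0.0 <= w.min() and w.max() <= 1.0 / j
```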


In [33]:
class Network2(object):
    
    def __init__(self, sizes):
        """The list "sizes" contains the number of neurons in respective layers.""" 
        self.sizes = sizes
        self.number_layers = len(sizes)
        self.biases = [np.random.rand(i, 1) for i in sizes[1:]]
        self.weights = [np.random.rand(i, j) / j for i,j in zip(sizes[1:],sizes[:-1])] # Uniform in [0, 1), scaled by the fan-in j
        
    def feedforward(self, input_):
        """Return the output of the neural network."""
        activation = input_
        for bias, weight in zip(self.biases, self.weights):
            output = np.dot(weight, activation) + bias
            activation = sigmoid(output)
        return activation
    
    def set_values(self, weights, biases):
        self.weights = weights
        self.biases = biases
    
    def backprop(self, training_data, desired_output, lr_rate):
        """Update parameters"""
        # Store gradients of weights and biases for update
        grad_weights = [np.zeros(w.shape) for w in self.weights] # per-layer zero arrays (zeros_like on a ragged list of arrays is unreliable)
        grad_biases = [np.zeros(b.shape) for b in self.biases]
        
        # Store outputs and activations for backprop
        outputs = []
        activation = training_data
        activations = [activation] 
        for b, w in zip(self.biases, self.weights):
            output = np.dot(w, activation) + b
            outputs.append(output)
            activation = sigmoid(output)
            activations.append(activation)
        
        # Compute the gradients of the last layer
        error = activations[-1] - desired_output # Reuse the stored forward pass instead of recomputing it
        delta = error * sigmoid_deriv(outputs[-1])
        grad_biases[-1] = delta
        grad_weights[-1] = np.dot(delta, activations[-2].transpose())
        
        # Compute gradients of remaining layers
        for layer in range(2, self.number_layers):
            delta = np.dot(self.weights[-layer+1].transpose(), delta) * sigmoid_deriv(outputs[-layer])
            grad_biases[-layer] = delta
            grad_weights[-layer] = np.dot(delta, activations[-layer-1].transpose())
        
        # Update weights and biases
        self.weights = [w-lr_rate*grad_w for w, grad_w in zip(self.weights, grad_weights)]
        self.biases = [b-lr_rate*grad_b for b, grad_b in zip(self.biases, grad_biases)]
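A quick way to gain confidence in the analytic gradients used above is a central finite-difference check. A minimal sketch for the sigmoid derivative, restating the `sigmoid`/`sigmoid_deriv` definitions assumed earlier in the notebook so the cell is self-contained:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sigmoid_deriv(x):
    s = sigmoid(x)
    return s * (1.0 - s)

# Central difference approximates f'(x) with O(h^2) error.
x, h = 0.7, 1e-5
numeric = (sigmoid(x + h) - sigmoid(x - h)) / (2 * h)
print(abs(numeric - sigmoid_deriv(x)) < 1e-8)  # True
```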

In [36]:
# Set training epochs and learning rate
epochs2 = 46000
lr_rate = 0.6

In [85]:
# Train the network with the weight initialization of uniform distribution.
def train2(training_data, desired_output, epochs2, lr_rate):
    losses = [] # Store losses for plotting
    epoch_n = [] # Store epochs for plotting
    network = Network2([2,2,1]) # Uniformly initialized network
    for epoch in range(epochs2):
        for x, y in zip(training_data, desired_output):
            x = x[:, None] # Convert row vector to column vector
            network.backprop(x, y, lr_rate)
            # Printing out the training progress
            train_loss = abs(y - network.feedforward(x))
            if epoch % 500 == 0:
                losses.append(train_loss[0][0]) 
                epoch_n.append(epoch)
            print("Epoch: {}, Training Loss: {:.5f}".format(epoch, train_loss[0][0]))
    return network.weights, network.biases, losses, epoch_n
    
weights2, biases2, train_losses2, epoch_n2 = train2(training_data, desired_output, epochs2, lr_rate)


Epoch: 0, Training Loss: 0.74461
Epoch: 0, Training Loss: 0.23633
Epoch: 0, Training Loss: 0.22796
Epoch: 0, Training Loss: 0.74753
Epoch: 1, Training Loss: 0.70226
Epoch: 1, Training Loss: 0.27628
Epoch: 1, Training Loss: 0.26366
Epoch: 1, Training Loss: 0.70507
Epoch: 2, Training Loss: 0.66001
Epoch: 2, Training Loss: 0.31599
Epoch: 2, Training Loss: 0.29866
Epoch: 2, Training Loss: 0.66433
Epoch: 3, Training Loss: 0.62107
Epoch: 3, Training Loss: 0.35260
Epoch: 3, Training Loss: 0.33068
Epoch: 3, Training Loss: 0.62804
... (output truncated: the four per-pattern losses settle near 0.46-0.50 within the first ~20 epochs and then change only in the fourth decimal place)
Epoch: 221, Training Loss: 0.47063
Epoch: 221, Training Loss: 0.49875
Epoch: 221, Training Loss: 0.46865
Epoch: 221, Training Loss: 0.49732
Epoch: 222, Training Loss: 0.47063
Epoch: 222, Training Loss: 0.49875
Epoch: 222, Training Loss: 0.46869
Epoch: 222, Training Loss: 0.49735
Epoch: 223, Training Loss: 0.47064
Epoch: 223, Training Loss: 0.49874
Epoch: 223, Training Loss: 0.46872
Epoch: 223, Training Loss: 0.49737
Epoch: 224, Training Loss: 0.47065
Epoch: 224, Training Loss: 0.49873
Epoch: 224, Training Loss: 0.46876
Epoch: 224, Training Loss: 0.49740
Epoch: 225, Training Loss: 0.47066
Epoch: 225, Training Loss: 0.49873
Epoch: 225, Training Loss: 0.46880
Epoch: 225, Training Loss: 0.49743
Epoch: 226, Training Loss: 0.47067
Epoch: 226, Training Loss: 0.49872
Epoch: 226, Training Loss: 0.46883
Epoch: 226, Training Loss: 0.49745
Epoch: 227, Training Loss: 0.47068
Epoch: 227, Training Loss: 0.49871
Epoch: 227, Training Loss: 0.46887
Epoch: 227, Training Loss: 0.49748
Epoch: 228, Training Loss: 0.47069
Epoch: 228, Training Loss: 0.49871
Epoch: 228, Training Loss: 0.46891
Epoch: 228, Training Loss: 0.49750
Epoch: 229, Training Loss: 0.47070
Epoch: 229, Training Loss: 0.49870
Epoch: 229, Training Loss: 0.46894
Epoch: 229, Training Loss: 0.49753
Epoch: 230, Training Loss: 0.47071
Epoch: 230, Training Loss: 0.49869
Epoch: 230, Training Loss: 0.46898
Epoch: 230, Training Loss: 0.49756
Epoch: 231, Training Loss: 0.47072
Epoch: 231, Training Loss: 0.49869
Epoch: 231, Training Loss: 0.46901
Epoch: 231, Training Loss: 0.49758
Epoch: 232, Training Loss: 0.47073
Epoch: 232, Training Loss: 0.49868
Epoch: 232, Training Loss: 0.46905
Epoch: 232, Training Loss: 0.49761
Epoch: 233, Training Loss: 0.47074
Epoch: 233, Training Loss: 0.49867
Epoch: 233, Training Loss: 0.46908
Epoch: 233, Training Loss: 0.49763
Epoch: 234, Training Loss: 0.47075
Epoch: 234, Training Loss: 0.49866
Epoch: 234, Training Loss: 0.46912
Epoch: 234, Training Loss: 0.49766
Epoch: 235, Training Loss: 0.47076
Epoch: 235, Training Loss: 0.49866
Epoch: 235, Training Loss: 0.46915
Epoch: 235, Training Loss: 0.49769
Epoch: 236, Training Loss: 0.47077
Epoch: 236, Training Loss: 0.49865
Epoch: 236, Training Loss: 0.46919
Epoch: 236, Training Loss: 0.49771
Epoch: 237, Training Loss: 0.47078
Epoch: 237, Training Loss: 0.49864
Epoch: 237, Training Loss: 0.46923
Epoch: 237, Training Loss: 0.49774
Epoch: 238, Training Loss: 0.47079
Epoch: 238, Training Loss: 0.49864
Epoch: 238, Training Loss: 0.46926
Epoch: 238, Training Loss: 0.49776
Epoch: 239, Training Loss: 0.47079
Epoch: 239, Training Loss: 0.49863
Epoch: 239, Training Loss: 0.46930
Epoch: 239, Training Loss: 0.49779
Epoch: 240, Training Loss: 0.47080
Epoch: 240, Training Loss: 0.49862
Epoch: 240, Training Loss: 0.46933
Epoch: 240, Training Loss: 0.49782
Epoch: 241, Training Loss: 0.47081
Epoch: 241, Training Loss: 0.49861
Epoch: 241, Training Loss: 0.46937
Epoch: 241, Training Loss: 0.49784
Epoch: 242, Training Loss: 0.47082
Epoch: 242, Training Loss: 0.49861
Epoch: 242, Training Loss: 0.46940
Epoch: 242, Training Loss: 0.49787
Epoch: 243, Training Loss: 0.47083
Epoch: 243, Training Loss: 0.49860
Epoch: 243, Training Loss: 0.46944
Epoch: 243, Training Loss: 0.49789
Epoch: 244, Training Loss: 0.47084
Epoch: 244, Training Loss: 0.49859
Epoch: 244, Training Loss: 0.46947
Epoch: 244, Training Loss: 0.49792
Epoch: 245, Training Loss: 0.47085
Epoch: 245, Training Loss: 0.49858
Epoch: 245, Training Loss: 0.46951
Epoch: 245, Training Loss: 0.49795
Epoch: 246, Training Loss: 0.47086
Epoch: 246, Training Loss: 0.49858
Epoch: 246, Training Loss: 0.46954
Epoch: 246, Training Loss: 0.49797
Epoch: 247, Training Loss: 0.47087
Epoch: 247, Training Loss: 0.49857
Epoch: 247, Training Loss: 0.46957
Epoch: 247, Training Loss: 0.49800
Epoch: 248, Training Loss: 0.47088
Epoch: 248, Training Loss: 0.49856
Epoch: 248, Training Loss: 0.46961
Epoch: 248, Training Loss: 0.49802
Epoch: 249, Training Loss: 0.47089
Epoch: 249, Training Loss: 0.49855
Epoch: 249, Training Loss: 0.46964
Epoch: 249, Training Loss: 0.49805
Epoch: 250, Training Loss: 0.47089
Epoch: 250, Training Loss: 0.49855
Epoch: 250, Training Loss: 0.46968
Epoch: 250, Training Loss: 0.49808
Epoch: 251, Training Loss: 0.47090
Epoch: 251, Training Loss: 0.49854
Epoch: 251, Training Loss: 0.46971
Epoch: 251, Training Loss: 0.49810
Epoch: 252, Training Loss: 0.47091
Epoch: 252, Training Loss: 0.49853
Epoch: 252, Training Loss: 0.46975
Epoch: 252, Training Loss: 0.49813
Epoch: 253, Training Loss: 0.47092
Epoch: 253, Training Loss: 0.49852
Epoch: 253, Training Loss: 0.46978
Epoch: 253, Training Loss: 0.49815
Epoch: 254, Training Loss: 0.47093
Epoch: 254, Training Loss: 0.49852
Epoch: 254, Training Loss: 0.46981
Epoch: 254, Training Loss: 0.49818
Epoch: 255, Training Loss: 0.47094
Epoch: 255, Training Loss: 0.49851
Epoch: 255, Training Loss: 0.46985
Epoch: 255, Training Loss: 0.49821
Epoch: 256, Training Loss: 0.47095
Epoch: 256, Training Loss: 0.49850
Epoch: 256, Training Loss: 0.46988
Epoch: 256, Training Loss: 0.49823
Epoch: 257, Training Loss: 0.47096
Epoch: 257, Training Loss: 0.49849
Epoch: 257, Training Loss: 0.46992
Epoch: 257, Training Loss: 0.49826
Epoch: 258, Training Loss: 0.47096
Epoch: 258, Training Loss: 0.49849
Epoch: 258, Training Loss: 0.46995
Epoch: 258, Training Loss: 0.49828
Epoch: 259, Training Loss: 0.47097
Epoch: 259, Training Loss: 0.49848
Epoch: 259, Training Loss: 0.46998
Epoch: 259, Training Loss: 0.49831
Epoch: 260, Training Loss: 0.47098
Epoch: 260, Training Loss: 0.49847
Epoch: 260, Training Loss: 0.47002
Epoch: 260, Training Loss: 0.49834
Epoch: 261, Training Loss: 0.47099
Epoch: 261, Training Loss: 0.49846
Epoch: 261, Training Loss: 0.47005
Epoch: 261, Training Loss: 0.49836
Epoch: 262, Training Loss: 0.47100
Epoch: 262, Training Loss: 0.49846
Epoch: 262, Training Loss: 0.47008
Epoch: 262, Training Loss: 0.49839
Epoch: 263, Training Loss: 0.47101
Epoch: 263, Training Loss: 0.49845
Epoch: 263, Training Loss: 0.47012
Epoch: 263, Training Loss: 0.49841
Epoch: 264, Training Loss: 0.47102
Epoch: 264, Training Loss: 0.49844
Epoch: 264, Training Loss: 0.47015
Epoch: 264, Training Loss: 0.49844
Epoch: 265, Training Loss: 0.47102
Epoch: 265, Training Loss: 0.49843
Epoch: 265, Training Loss: 0.47018
Epoch: 265, Training Loss: 0.49847
Epoch: 266, Training Loss: 0.47103
Epoch: 266, Training Loss: 0.49843
Epoch: 266, Training Loss: 0.47022
Epoch: 266, Training Loss: 0.49849
Epoch: 267, Training Loss: 0.47104
Epoch: 267, Training Loss: 0.49842
Epoch: 267, Training Loss: 0.47025
Epoch: 267, Training Loss: 0.49852
Epoch: 268, Training Loss: 0.47105
Epoch: 268, Training Loss: 0.49841
Epoch: 268, Training Loss: 0.47028
Epoch: 268, Training Loss: 0.49854
Epoch: 269, Training Loss: 0.47106
Epoch: 269, Training Loss: 0.49840
Epoch: 269, Training Loss: 0.47032
Epoch: 269, Training Loss: 0.49857
Epoch: 270, Training Loss: 0.47107
Epoch: 270, Training Loss: 0.49840
Epoch: 270, Training Loss: 0.47035
Epoch: 270, Training Loss: 0.49860
Epoch: 271, Training Loss: 0.47107
Epoch: 271, Training Loss: 0.49839
Epoch: 271, Training Loss: 0.47038
Epoch: 271, Training Loss: 0.49862
Epoch: 272, Training Loss: 0.47108
Epoch: 272, Training Loss: 0.49838
Epoch: 272, Training Loss: 0.47041
Epoch: 272, Training Loss: 0.49865
Epoch: 273, Training Loss: 0.47109
Epoch: 273, Training Loss: 0.49837
Epoch: 273, Training Loss: 0.47045
Epoch: 273, Training Loss: 0.49867
Epoch: 274, Training Loss: 0.47110
Epoch: 274, Training Loss: 0.49836
Epoch: 274, Training Loss: 0.47048
Epoch: 274, Training Loss: 0.49870
Epoch: 275, Training Loss: 0.47111
Epoch: 275, Training Loss: 0.49836
Epoch: 275, Training Loss: 0.47051
Epoch: 275, Training Loss: 0.49872
Epoch: 276, Training Loss: 0.47111
Epoch: 276, Training Loss: 0.49835
Epoch: 276, Training Loss: 0.47054
Epoch: 276, Training Loss: 0.49875
Epoch: 277, Training Loss: 0.47112
Epoch: 277, Training Loss: 0.49834
Epoch: 277, Training Loss: 0.47058
Epoch: 277, Training Loss: 0.49877
Epoch: 278, Training Loss: 0.47113
Epoch: 278, Training Loss: 0.49833
Epoch: 278, Training Loss: 0.47061
Epoch: 278, Training Loss: 0.49880
Epoch: 279, Training Loss: 0.47114
Epoch: 279, Training Loss: 0.49833
Epoch: 279, Training Loss: 0.47064
Epoch: 279, Training Loss: 0.49883
Epoch: 280, Training Loss: 0.47115
Epoch: 280, Training Loss: 0.49832
Epoch: 280, Training Loss: 0.47067
Epoch: 280, Training Loss: 0.49885
Epoch: 281, Training Loss: 0.47115
Epoch: 281, Training Loss: 0.49831
Epoch: 281, Training Loss: 0.47070
Epoch: 281, Training Loss: 0.49888
Epoch: 282, Training Loss: 0.47116
Epoch: 282, Training Loss: 0.49830
Epoch: 282, Training Loss: 0.47074
Epoch: 282, Training Loss: 0.49890
Epoch: 283, Training Loss: 0.47117
Epoch: 283, Training Loss: 0.49830
Epoch: 283, Training Loss: 0.47077
Epoch: 283, Training Loss: 0.49893
Epoch: 284, Training Loss: 0.47118
Epoch: 284, Training Loss: 0.49829
Epoch: 284, Training Loss: 0.47080
Epoch: 284, Training Loss: 0.49895
Epoch: 285, Training Loss: 0.47119
Epoch: 285, Training Loss: 0.49828
Epoch: 285, Training Loss: 0.47083
Epoch: 285, Training Loss: 0.49898
Epoch: 286, Training Loss: 0.47119
Epoch: 286, Training Loss: 0.49827
Epoch: 286, Training Loss: 0.47086
Epoch: 286, Training Loss: 0.49900
Epoch: 287, Training Loss: 0.47120
Epoch: 287, Training Loss: 0.49826
Epoch: 287, Training Loss: 0.47089
Epoch: 287, Training Loss: 0.49903
Epoch: 288, Training Loss: 0.47121
Epoch: 288, Training Loss: 0.49826
Epoch: 288, Training Loss: 0.47093
Epoch: 288, Training Loss: 0.49906
Epoch: 289, Training Loss: 0.47122
Epoch: 289, Training Loss: 0.49825
Epoch: 289, Training Loss: 0.47096
Epoch: 289, Training Loss: 0.49908
Epoch: 290, Training Loss: 0.47122
Epoch: 290, Training Loss: 0.49824
Epoch: 290, Training Loss: 0.47099
Epoch: 290, Training Loss: 0.49911
Epoch: 291, Training Loss: 0.47123
Epoch: 291, Training Loss: 0.49823
Epoch: 291, Training Loss: 0.47102
Epoch: 291, Training Loss: 0.49913
Epoch: 292, Training Loss: 0.47124
Epoch: 292, Training Loss: 0.49823
Epoch: 292, Training Loss: 0.47105
Epoch: 292, Training Loss: 0.49916
Epoch: 293, Training Loss: 0.47125
Epoch: 293, Training Loss: 0.49822
Epoch: 293, Training Loss: 0.47108
Epoch: 293, Training Loss: 0.49918
Epoch: 294, Training Loss: 0.47125
Epoch: 294, Training Loss: 0.49821
Epoch: 294, Training Loss: 0.47111
Epoch: 294, Training Loss: 0.49921
Epoch: 295, Training Loss: 0.47126
Epoch: 295, Training Loss: 0.49820
Epoch: 295, Training Loss: 0.47114
Epoch: 295, Training Loss: 0.49923
Epoch: 296, Training Loss: 0.47127
Epoch: 296, Training Loss: 0.49820
Epoch: 296, Training Loss: 0.47117
Epoch: 296, Training Loss: 0.49926
Epoch: 297, Training Loss: 0.47128
Epoch: 297, Training Loss: 0.49819
Epoch: 297, Training Loss: 0.47120
Epoch: 297, Training Loss: 0.49928
Epoch: 298, Training Loss: 0.47128
Epoch: 298, Training Loss: 0.49818
Epoch: 298, Training Loss: 0.47123
Epoch: 298, Training Loss: 0.49931
Epoch: 299, Training Loss: 0.47129
Epoch: 299, Training Loss: 0.49817
Epoch: 299, Training Loss: 0.47126
Epoch: 299, Training Loss: 0.49933
Epoch: 300, Training Loss: 0.47130
Epoch: 300, Training Loss: 0.49817
Epoch: 300, Training Loss: 0.47129
Epoch: 300, Training Loss: 0.49936
Epoch: 301, Training Loss: 0.47131
Epoch: 301, Training Loss: 0.49816
Epoch: 301, Training Loss: 0.47132
Epoch: 301, Training Loss: 0.49938
Epoch: 302, Training Loss: 0.47131
Epoch: 302, Training Loss: 0.49815
Epoch: 302, Training Loss: 0.47135
Epoch: 302, Training Loss: 0.49941
Epoch: 303, Training Loss: 0.47132
Epoch: 303, Training Loss: 0.49814
Epoch: 303, Training Loss: 0.47139
Epoch: 303, Training Loss: 0.49943
Epoch: 304, Training Loss: 0.47133
Epoch: 304, Training Loss: 0.49814
Epoch: 304, Training Loss: 0.47141
Epoch: 304, Training Loss: 0.49945
Epoch: 305, Training Loss: 0.47133
Epoch: 305, Training Loss: 0.49813
Epoch: 305, Training Loss: 0.47144
Epoch: 305, Training Loss: 0.49948
Epoch: 306, Training Loss: 0.47134
Epoch: 306, Training Loss: 0.49812
Epoch: 306, Training Loss: 0.47147
Epoch: 306, Training Loss: 0.49950
Epoch: 307, Training Loss: 0.47135
Epoch: 307, Training Loss: 0.49812
Epoch: 307, Training Loss: 0.47150
Epoch: 307, Training Loss: 0.49953
Epoch: 308, Training Loss: 0.47136
Epoch: 308, Training Loss: 0.49811
Epoch: 308, Training Loss: 0.47153
Epoch: 308, Training Loss: 0.49955
Epoch: 309, Training Loss: 0.47136
Epoch: 309, Training Loss: 0.49810
Epoch: 309, Training Loss: 0.47156
Epoch: 309, Training Loss: 0.49958
Epoch: 310, Training Loss: 0.47137
Epoch: 310, Training Loss: 0.49809
Epoch: 310, Training Loss: 0.47159
Epoch: 310, Training Loss: 0.49960
Epoch: 311, Training Loss: 0.47138
Epoch: 311, Training Loss: 0.49809
Epoch: 311, Training Loss: 0.47162
Epoch: 311, Training Loss: 0.49963
Epoch: 312, Training Loss: 0.47138
Epoch: 312, Training Loss: 0.49808
Epoch: 312, Training Loss: 0.47165
Epoch: 312, Training Loss: 0.49965
Epoch: 313, Training Loss: 0.47139
Epoch: 313, Training Loss: 0.49807
Epoch: 313, Training Loss: 0.47168
Epoch: 313, Training Loss: 0.49967
Epoch: 314, Training Loss: 0.47140
Epoch: 314, Training Loss: 0.49807
Epoch: 314, Training Loss: 0.47171
Epoch: 314, Training Loss: 0.49970
Epoch: 315, Training Loss: 0.47140
Epoch: 315, Training Loss: 0.49806
Epoch: 315, Training Loss: 0.47174
Epoch: 315, Training Loss: 0.49972
Epoch: 316, Training Loss: 0.47141
Epoch: 316, Training Loss: 0.49805
Epoch: 316, Training Loss: 0.47177
Epoch: 316, Training Loss: 0.49975
Epoch: 317, Training Loss: 0.47142
Epoch: 317, Training Loss: 0.49804
Epoch: 317, Training Loss: 0.47180
Epoch: 317, Training Loss: 0.49977
Epoch: 318, Training Loss: 0.47143
Epoch: 318, Training Loss: 0.49804
Epoch: 318, Training Loss: 0.47183
Epoch: 318, Training Loss: 0.49979
Epoch: 319, Training Loss: 0.47143
Epoch: 319, Training Loss: 0.49803
Epoch: 319, Training Loss: 0.47185
Epoch: 319, Training Loss: 0.49982
Epoch: 320, Training Loss: 0.47144
Epoch: 320, Training Loss: 0.49802
Epoch: 320, Training Loss: 0.47188
Epoch: 320, Training Loss: 0.49984
Epoch: 321, Training Loss: 0.47145
Epoch: 321, Training Loss: 0.49802
Epoch: 321, Training Loss: 0.47191
Epoch: 321, Training Loss: 0.49987
Epoch: 322, Training Loss: 0.47145
Epoch: 322, Training Loss: 0.49801
Epoch: 322, Training Loss: 0.47194
Epoch: 322, Training Loss: 0.49989
Epoch: 323, Training Loss: 0.47146
Epoch: 323, Training Loss: 0.49800
Epoch: 323, Training Loss: 0.47197
Epoch: 323, Training Loss: 0.49991
Epoch: 324, Training Loss: 0.47147
Epoch: 324, Training Loss: 0.49800
Epoch: 324, Training Loss: 0.47200
Epoch: 324, Training Loss: 0.49994
Epoch: 325, Training Loss: 0.47147
Epoch: 325, Training Loss: 0.49799
Epoch: 325, Training Loss: 0.47202
Epoch: 325, Training Loss: 0.49996
Epoch: 326, Training Loss: 0.47148
Epoch: 326, Training Loss: 0.49798
Epoch: 326, Training Loss: 0.47205
Epoch: 326, Training Loss: 0.49998
Epoch: 327, Training Loss: 0.47148
Epoch: 327, Training Loss: 0.49798
Epoch: 327, Training Loss: 0.47208
Epoch: 327, Training Loss: 0.50001
Epoch: 328, Training Loss: 0.47149
Epoch: 328, Training Loss: 0.49797
Epoch: 328, Training Loss: 0.47211
Epoch: 328, Training Loss: 0.50003
Epoch: 329, Training Loss: 0.47150
Epoch: 329, Training Loss: 0.49796
Epoch: 329, Training Loss: 0.47214
Epoch: 329, Training Loss: 0.50005
Epoch: 330, Training Loss: 0.47150
Epoch: 330, Training Loss: 0.49796
Epoch: 330, Training Loss: 0.47216
Epoch: 330, Training Loss: 0.50008
Epoch: 331, Training Loss: 0.47151
Epoch: 331, Training Loss: 0.49795
Epoch: 331, Training Loss: 0.47219
Epoch: 331, Training Loss: 0.50010
Epoch: 332, Training Loss: 0.47152
Epoch: 332, Training Loss: 0.49794
Epoch: 332, Training Loss: 0.47222
Epoch: 332, Training Loss: 0.50012
Epoch: 333, Training Loss: 0.47152
Epoch: 333, Training Loss: 0.49794
Epoch: 333, Training Loss: 0.47225
Epoch: 333, Training Loss: 0.50015
Epoch: 334, Training Loss: 0.47153
Epoch: 334, Training Loss: 0.49793
Epoch: 334, Training Loss: 0.47227
Epoch: 334, Training Loss: 0.50017
Epoch: 335, Training Loss: 0.47154
Epoch: 335, Training Loss: 0.49792
Epoch: 335, Training Loss: 0.47230
Epoch: 335, Training Loss: 0.50019
Epoch: 336, Training Loss: 0.47154
Epoch: 336, Training Loss: 0.49792
Epoch: 336, Training Loss: 0.47233
Epoch: 336, Training Loss: 0.50021
Epoch: 337, Training Loss: 0.47155
Epoch: 337, Training Loss: 0.49791
Epoch: 337, Training Loss: 0.47236
Epoch: 337, Training Loss: 0.50024
Epoch: 338, Training Loss: 0.47155
Epoch: 338, Training Loss: 0.49790
Epoch: 338, Training Loss: 0.47238
Epoch: 338, Training Loss: 0.50026
Epoch: 339, Training Loss: 0.47156
Epoch: 339, Training Loss: 0.49790
Epoch: 339, Training Loss: 0.47241
Epoch: 339, Training Loss: 0.50028
Epoch: 340, Training Loss: 0.47157
Epoch: 340, Training Loss: 0.49789
Epoch: 340, Training Loss: 0.47244
Epoch: 340, Training Loss: 0.50031
Epoch: 341, Training Loss: 0.47157
Epoch: 341, Training Loss: 0.49789
Epoch: 341, Training Loss: 0.47246
Epoch: 341, Training Loss: 0.50033
Epoch: 342, Training Loss: 0.47158
Epoch: 342, Training Loss: 0.49788
Epoch: 342, Training Loss: 0.47249
Epoch: 342, Training Loss: 0.50035
Epoch: 343, Training Loss: 0.47159
Epoch: 343, Training Loss: 0.49787
Epoch: 343, Training Loss: 0.47252
Epoch: 343, Training Loss: 0.50037
Epoch: 344, Training Loss: 0.47159
Epoch: 344, Training Loss: 0.49787
Epoch: 344, Training Loss: 0.47254
Epoch: 344, Training Loss: 0.50039
Epoch: 345, Training Loss: 0.47160
Epoch: 345, Training Loss: 0.49786
Epoch: 345, Training Loss: 0.47257
Epoch: 345, Training Loss: 0.50042
Epoch: 346, Training Loss: 0.47160
Epoch: 346, Training Loss: 0.49786
Epoch: 346, Training Loss: 0.47259
Epoch: 346, Training Loss: 0.50044
Epoch: 347, Training Loss: 0.47161
Epoch: 347, Training Loss: 0.49785
Epoch: 347, Training Loss: 0.47262
Epoch: 347, Training Loss: 0.50046
Epoch: 348, Training Loss: 0.47162
Epoch: 348, Training Loss: 0.49784
Epoch: 348, Training Loss: 0.47265
Epoch: 348, Training Loss: 0.50048
Epoch: 349, Training Loss: 0.47162
Epoch: 349, Training Loss: 0.49784
Epoch: 349, Training Loss: 0.47267
Epoch: 349, Training Loss: 0.50050
Epoch: 350, Training Loss: 0.47163
Epoch: 350, Training Loss: 0.49783
Epoch: 350, Training Loss: 0.47270
Epoch: 350, Training Loss: 0.50053
Epoch: 351, Training Loss: 0.47163
Epoch: 351, Training Loss: 0.49783
Epoch: 351, Training Loss: 0.47273
Epoch: 351, Training Loss: 0.50055
Epoch: 352, Training Loss: 0.47164
Epoch: 352, Training Loss: 0.49782
Epoch: 352, Training Loss: 0.47275
Epoch: 352, Training Loss: 0.50057
Epoch: 353, Training Loss: 0.47165
Epoch: 353, Training Loss: 0.49782
Epoch: 353, Training Loss: 0.47278
Epoch: 353, Training Loss: 0.50059
Epoch: 354, Training Loss: 0.47165
Epoch: 354, Training Loss: 0.49781
Epoch: 354, Training Loss: 0.47280
Epoch: 354, Training Loss: 0.50061
Epoch: 355, Training Loss: 0.47166
Epoch: 355, Training Loss: 0.49780
Epoch: 355, Training Loss: 0.47283
Epoch: 355, Training Loss: 0.50063
Epoch: 356, Training Loss: 0.47166
Epoch: 356, Training Loss: 0.49780
Epoch: 356, Training Loss: 0.47285
Epoch: 356, Training Loss: 0.50066
Epoch: 357, Training Loss: 0.47167
Epoch: 357, Training Loss: 0.49779
Epoch: 357, Training Loss: 0.47288
Epoch: 357, Training Loss: 0.50068
Epoch: 358, Training Loss: 0.47168
Epoch: 358, Training Loss: 0.49779
Epoch: 358, Training Loss: 0.47290
Epoch: 358, Training Loss: 0.50070
Epoch: 359, Training Loss: 0.47168
Epoch: 359, Training Loss: 0.49778
Epoch: 359, Training Loss: 0.47293
Epoch: 359, Training Loss: 0.50072
Epoch: 360, Training Loss: 0.47169
Epoch: 360, Training Loss: 0.49778
Epoch: 360, Training Loss: 0.47295
Epoch: 360, Training Loss: 0.50074
Epoch: 361, Training Loss: 0.47169
Epoch: 361, Training Loss: 0.49777
Epoch: 361, Training Loss: 0.47298
Epoch: 361, Training Loss: 0.50076
Epoch: 362, Training Loss: 0.47170
Epoch: 362, Training Loss: 0.49777
Epoch: 362, Training Loss: 0.47300
Epoch: 362, Training Loss: 0.50078
Epoch: 363, Training Loss: 0.47170
Epoch: 363, Training Loss: 0.49776
Epoch: 363, Training Loss: 0.47303
Epoch: 363, Training Loss: 0.50080
Epoch: 364, Training Loss: 0.47171
Epoch: 364, Training Loss: 0.49776
Epoch: 364, Training Loss: 0.47305
Epoch: 364, Training Loss: 0.50082
Epoch: 365, Training Loss: 0.47172
Epoch: 365, Training Loss: 0.49775
Epoch: 365, Training Loss: 0.47308
Epoch: 365, Training Loss: 0.50084
Epoch: 366, Training Loss: 0.47172
Epoch: 366, Training Loss: 0.49775
Epoch: 366, Training Loss: 0.47310
Epoch: 366, Training Loss: 0.50086
Epoch: 367, Training Loss: 0.47173
Epoch: 367, Training Loss: 0.49774
Epoch: 367, Training Loss: 0.47313
Epoch: 367, Training Loss: 0.50088
Epoch: 368, Training Loss: 0.47173
Epoch: 368, Training Loss: 0.49774
Epoch: 368, Training Loss: 0.47315
Epoch: 368, Training Loss: 0.50090
Epoch: 369, Training Loss: 0.47174
Epoch: 369, Training Loss: 0.49773
Epoch: 369, Training Loss: 0.47318
Epoch: 369, Training Loss: 0.50092
Epoch: 370, Training Loss: 0.47174
Epoch: 370, Training Loss: 0.49773
Epoch: 370, Training Loss: 0.47320
Epoch: 370, Training Loss: 0.50094
Epoch: 371, Training Loss: 0.47175
Epoch: 371, Training Loss: 0.49772
Epoch: 371, Training Loss: 0.47322
Epoch: 371, Training Loss: 0.50096
Epoch: 372, Training Loss: 0.47175
Epoch: 372, Training Loss: 0.49772
Epoch: 372, Training Loss: 0.47325
Epoch: 372, Training Loss: 0.50098
Epoch: 373, Training Loss: 0.47176
Epoch: 373, Training Loss: 0.49771
Epoch: 373, Training Loss: 0.47327
Epoch: 373, Training Loss: 0.50100
Epoch: 374, Training Loss: 0.47177
Epoch: 374, Training Loss: 0.49771
Epoch: 374, Training Loss: 0.47330
Epoch: 374, Training Loss: 0.50102
Epoch: 375, Training Loss: 0.47177
Epoch: 375, Training Loss: 0.49770
Epoch: 375, Training Loss: 0.47332
Epoch: 375, Training Loss: 0.50104
Epoch: 376, Training Loss: 0.47178
Epoch: 376, Training Loss: 0.49770
Epoch: 376, Training Loss: 0.47334
Epoch: 376, Training Loss: 0.50106
Epoch: 377, Training Loss: 0.47178
Epoch: 377, Training Loss: 0.49770
Epoch: 377, Training Loss: 0.47337
Epoch: 377, Training Loss: 0.50108
Epoch: 378, Training Loss: 0.47179
Epoch: 378, Training Loss: 0.49769
Epoch: 378, Training Loss: 0.47339
Epoch: 378, Training Loss: 0.50110
Epoch: 379, Training Loss: 0.47179
Epoch: 379, Training Loss: 0.49769
Epoch: 379, Training Loss: 0.47342
Epoch: 379, Training Loss: 0.50112
Epoch: 380, Training Loss: 0.47180
Epoch: 380, Training Loss: 0.49768
Epoch: 380, Training Loss: 0.47344
Epoch: 380, Training Loss: 0.50114
Epoch: 381, Training Loss: 0.47180
Epoch: 381, Training Loss: 0.49768
Epoch: 381, Training Loss: 0.47346
Epoch: 381, Training Loss: 0.50116
Epoch: 382, Training Loss: 0.47181
Epoch: 382, Training Loss: 0.49767
Epoch: 382, Training Loss: 0.47349
Epoch: 382, Training Loss: 0.50118
Epoch: 383, Training Loss: 0.47181
Epoch: 383, Training Loss: 0.49767
Epoch: 383, Training Loss: 0.47351
Epoch: 383, Training Loss: 0.50120
Epoch: 384, Training Loss: 0.47182
Epoch: 384, Training Loss: 0.49767
Epoch: 384, Training Loss: 0.47353
Epoch: 384, Training Loss: 0.50122
Epoch: 385, Training Loss: 0.47183
Epoch: 385, Training Loss: 0.49766
Epoch: 385, Training Loss: 0.47355
Epoch: 385, Training Loss: 0.50124
Epoch: 386, Training Loss: 0.47183
Epoch: 386, Training Loss: 0.49766
Epoch: 386, Training Loss: 0.47358
Epoch: 386, Training Loss: 0.50125
Epoch: 387, Training Loss: 0.47184
Epoch: 387, Training Loss: 0.49765
Epoch: 387, Training Loss: 0.47360
Epoch: 387, Training Loss: 0.50127
Epoch: 388, Training Loss: 0.47184
Epoch: 388, Training Loss: 0.49765
Epoch: 388, Training Loss: 0.47362
Epoch: 388, Training Loss: 0.50129
Epoch: 389, Training Loss: 0.47185
Epoch: 389, Training Loss: 0.49765
Epoch: 389, Training Loss: 0.47365
Epoch: 389, Training Loss: 0.50131
Epoch: 390, Training Loss: 0.47185
Epoch: 390, Training Loss: 0.49764
Epoch: 390, Training Loss: 0.47367
Epoch: 390, Training Loss: 0.50133
Epoch: 391, Training Loss: 0.47186
Epoch: 391, Training Loss: 0.49764
Epoch: 391, Training Loss: 0.47369
Epoch: 391, Training Loss: 0.50135
Epoch: 392, Training Loss: 0.47186
Epoch: 392, Training Loss: 0.49764
Epoch: 392, Training Loss: 0.47371
Epoch: 392, Training Loss: 0.50137
Epoch: 393, Training Loss: 0.47187
Epoch: 393, Training Loss: 0.49763
Epoch: 393, Training Loss: 0.47374
Epoch: 393, Training Loss: 0.50138
Epoch: 394, Training Loss: 0.47187
Epoch: 394, Training Loss: 0.49763
Epoch: 394, Training Loss: 0.47376
Epoch: 394, Training Loss: 0.50140
Epoch: 395, Training Loss: 0.47188
Epoch: 395, Training Loss: 0.49762
Epoch: 395, Training Loss: 0.47378
Epoch: 395, Training Loss: 0.50142
Epoch: 396, Training Loss: 0.47188
Epoch: 396, Training Loss: 0.49762
Epoch: 396, Training Loss: 0.47380
Epoch: 396, Training Loss: 0.50144
Epoch: 397, Training Loss: 0.47189
Epoch: 397, Training Loss: 0.49762
Epoch: 397, Training Loss: 0.47382
Epoch: 397, Training Loss: 0.50145
Epoch: 398, Training Loss: 0.47189
Epoch: 398, Training Loss: 0.49761
Epoch: 398, Training Loss: 0.47385
Epoch: 398, Training Loss: 0.50147
Epoch: 399, Training Loss: 0.47190
Epoch: 399, Training Loss: 0.49761
Epoch: 399, Training Loss: 0.47387
Epoch: 399, Training Loss: 0.50149
Epoch: 400, Training Loss: 0.47191
Epoch: 400, Training Loss: 0.49761
Epoch: 400, Training Loss: 0.47389
Epoch: 400, Training Loss: 0.50151
Epoch: 401, Training Loss: 0.47191
Epoch: 401, Training Loss: 0.49761
Epoch: 401, Training Loss: 0.47391
Epoch: 401, Training Loss: 0.50152
Epoch: 402, Training Loss: 0.47192
Epoch: 402, Training Loss: 0.49760
Epoch: 402, Training Loss: 0.47393
Epoch: 402, Training Loss: 0.50154
Epoch: 403, Training Loss: 0.47192
Epoch: 403, Training Loss: 0.49760
Epoch: 403, Training Loss: 0.47395
Epoch: 403, Training Loss: 0.50156
Epoch: 404, Training Loss: 0.47193
Epoch: 404, Training Loss: 0.49760
Epoch: 404, Training Loss: 0.47398
Epoch: 404, Training Loss: 0.50158
Epoch: 405, Training Loss: 0.47193
Epoch: 405, Training Loss: 0.49759
Epoch: 405, Training Loss: 0.47400
Epoch: 405, Training Loss: 0.50159
Epoch: 406, Training Loss: 0.47194
Epoch: 406, Training Loss: 0.49759
Epoch: 406, Training Loss: 0.47402
Epoch: 406, Training Loss: 0.50161
Epoch: 407, Training Loss: 0.47194
Epoch: 407, Training Loss: 0.49759
Epoch: 407, Training Loss: 0.47404
Epoch: 407, Training Loss: 0.50163
Epoch: 408, Training Loss: 0.47195
Epoch: 408, Training Loss: 0.49758
Epoch: 408, Training Loss: 0.47406
Epoch: 408, Training Loss: 0.50164
Epoch: 409, Training Loss: 0.47195
Epoch: 409, Training Loss: 0.49758
Epoch: 409, Training Loss: 0.47408
Epoch: 409, Training Loss: 0.50166
Epoch: 410, Training Loss: 0.47196
Epoch: 410, Training Loss: 0.49758
Epoch: 410, Training Loss: 0.47410
Epoch: 410, Training Loss: 0.50168
Epoch: 411, Training Loss: 0.47196
Epoch: 411, Training Loss: 0.49758
Epoch: 411, Training Loss: 0.47412
Epoch: 411, Training Loss: 0.50169
Epoch: 412, Training Loss: 0.47197
Epoch: 412, Training Loss: 0.49757
Epoch: 412, Training Loss: 0.47415
Epoch: 412, Training Loss: 0.50171
Epoch: 413, Training Loss: 0.47197
Epoch: 413, Training Loss: 0.49757
Epoch: 413, Training Loss: 0.47417
Epoch: 413, Training Loss: 0.50172
Epoch: 414, Training Loss: 0.47198
Epoch: 414, Training Loss: 0.49757
Epoch: 414, Training Loss: 0.47419
Epoch: 414, Training Loss: 0.50174
Epoch: 415, Training Loss: 0.47198
Epoch: 415, Training Loss: 0.49757
Epoch: 415, Training Loss: 0.47421
Epoch: 415, Training Loss: 0.50176
Epoch: 416, Training Loss: 0.47199
Epoch: 416, Training Loss: 0.49756
Epoch: 416, Training Loss: 0.47423
Epoch: 416, Training Loss: 0.50177
Epoch: 417, Training Loss: 0.47199
Epoch: 417, Training Loss: 0.49756
Epoch: 417, Training Loss: 0.47425
Epoch: 417, Training Loss: 0.50179
Epoch: 418, Training Loss: 0.47200
Epoch: 418, Training Loss: 0.49756
Epoch: 418, Training Loss: 0.47427
Epoch: 418, Training Loss: 0.50180
Epoch: 419, Training Loss: 0.47200
Epoch: 419, Training Loss: 0.49756
Epoch: 419, Training Loss: 0.47429
Epoch: 419, Training Loss: 0.50182
Epoch: 420, Training Loss: 0.47201
Epoch: 420, Training Loss: 0.49756
Epoch: 420, Training Loss: 0.47431
Epoch: 420, Training Loss: 0.50184
Epoch: 421, Training Loss: 0.47201
Epoch: 421, Training Loss: 0.49755
Epoch: 421, Training Loss: 0.47433
Epoch: 421, Training Loss: 0.50185
Epoch: 422, Training Loss: 0.47202
Epoch: 422, Training Loss: 0.49755
[... per-epoch training log truncated: the loop prints one loss per XOR input pattern each epoch; from epoch 422 through epoch 679 the four losses stay essentially flat near 0.472, 0.498, 0.477, and 0.503 — the network is stuck on a plateau far above the 0.008 error target ...]
Epoch: 679, Training Loss: 0.50375
Epoch: 680, Training Loss: 0.47261
Epoch: 680, Training Loss: 0.49783
Epoch: 680, Training Loss: 0.47810
Epoch: 680, Training Loss: 0.50376
Epoch: 681, Training Loss: 0.47261
Epoch: 681, Training Loss: 0.49783
Epoch: 681, Training Loss: 0.47811
Epoch: 681, Training Loss: 0.50377
Epoch: 682, Training Loss: 0.47260
Epoch: 682, Training Loss: 0.49783
Epoch: 682, Training Loss: 0.47813
Epoch: 682, Training Loss: 0.50377
Epoch: 683, Training Loss: 0.47260
Epoch: 683, Training Loss: 0.49783
Epoch: 683, Training Loss: 0.47814
Epoch: 683, Training Loss: 0.50378
Epoch: 684, Training Loss: 0.47260
Epoch: 684, Training Loss: 0.49783
Epoch: 684, Training Loss: 0.47815
Epoch: 684, Training Loss: 0.50379
Epoch: 685, Training Loss: 0.47259
Epoch: 685, Training Loss: 0.49783
Epoch: 685, Training Loss: 0.47816
Epoch: 685, Training Loss: 0.50380
Epoch: 686, Training Loss: 0.47259
Epoch: 686, Training Loss: 0.49783
Epoch: 686, Training Loss: 0.47818
Epoch: 686, Training Loss: 0.50380
Epoch: 687, Training Loss: 0.47258
Epoch: 687, Training Loss: 0.49783
Epoch: 687, Training Loss: 0.47819
Epoch: 687, Training Loss: 0.50381
Epoch: 688, Training Loss: 0.47257
Epoch: 688, Training Loss: 0.49783
Epoch: 688, Training Loss: 0.47820
Epoch: 688, Training Loss: 0.50382
Epoch: 689, Training Loss: 0.47257
Epoch: 689, Training Loss: 0.49783
Epoch: 689, Training Loss: 0.47821
Epoch: 689, Training Loss: 0.50382
Epoch: 690, Training Loss: 0.47256
Epoch: 690, Training Loss: 0.49783
Epoch: 690, Training Loss: 0.47823
Epoch: 690, Training Loss: 0.50383
Epoch: 691, Training Loss: 0.47256
Epoch: 691, Training Loss: 0.49783
Epoch: 691, Training Loss: 0.47824
Epoch: 691, Training Loss: 0.50384
Epoch: 692, Training Loss: 0.47255
Epoch: 692, Training Loss: 0.49783
Epoch: 692, Training Loss: 0.47825
Epoch: 692, Training Loss: 0.50385
Epoch: 693, Training Loss: 0.47255
Epoch: 693, Training Loss: 0.49783
Epoch: 693, Training Loss: 0.47826
Epoch: 693, Training Loss: 0.50386
Epoch: 694, Training Loss: 0.47254
Epoch: 694, Training Loss: 0.49783
Epoch: 694, Training Loss: 0.47828
Epoch: 694, Training Loss: 0.50386
Epoch: 695, Training Loss: 0.47254
Epoch: 695, Training Loss: 0.49782
Epoch: 695, Training Loss: 0.47829
Epoch: 695, Training Loss: 0.50387
Epoch: 696, Training Loss: 0.47253
Epoch: 696, Training Loss: 0.49782
Epoch: 696, Training Loss: 0.47830
Epoch: 696, Training Loss: 0.50388
Epoch: 697, Training Loss: 0.47252
Epoch: 697, Training Loss: 0.49782
Epoch: 697, Training Loss: 0.47831
Epoch: 697, Training Loss: 0.50389
Epoch: 698, Training Loss: 0.47252
Epoch: 698, Training Loss: 0.49782
Epoch: 698, Training Loss: 0.47833
Epoch: 698, Training Loss: 0.50390
Epoch: 699, Training Loss: 0.47251
Epoch: 699, Training Loss: 0.49782
Epoch: 699, Training Loss: 0.47834
Epoch: 699, Training Loss: 0.50390
Epoch: 700, Training Loss: 0.47250
Epoch: 700, Training Loss: 0.49782
Epoch: 700, Training Loss: 0.47835
Epoch: 700, Training Loss: 0.50391
Epoch: 701, Training Loss: 0.47250
Epoch: 701, Training Loss: 0.49782
Epoch: 701, Training Loss: 0.47837
Epoch: 701, Training Loss: 0.50392
Epoch: 702, Training Loss: 0.47249
Epoch: 702, Training Loss: 0.49782
Epoch: 702, Training Loss: 0.47838
Epoch: 702, Training Loss: 0.50393
Epoch: 703, Training Loss: 0.47248
Epoch: 703, Training Loss: 0.49782
Epoch: 703, Training Loss: 0.47839
Epoch: 703, Training Loss: 0.50394
Epoch: 704, Training Loss: 0.47248
Epoch: 704, Training Loss: 0.49782
Epoch: 704, Training Loss: 0.47840
Epoch: 704, Training Loss: 0.50395
Epoch: 705, Training Loss: 0.47247
Epoch: 705, Training Loss: 0.49781
Epoch: 705, Training Loss: 0.47842
Epoch: 705, Training Loss: 0.50395
Epoch: 706, Training Loss: 0.47246
Epoch: 706, Training Loss: 0.49781
Epoch: 706, Training Loss: 0.47843
Epoch: 706, Training Loss: 0.50396
Epoch: 707, Training Loss: 0.47246
Epoch: 707, Training Loss: 0.49781
Epoch: 707, Training Loss: 0.47844
Epoch: 707, Training Loss: 0.50397
Epoch: 708, Training Loss: 0.47245
Epoch: 708, Training Loss: 0.49781
Epoch: 708, Training Loss: 0.47845
Epoch: 708, Training Loss: 0.50398
Epoch: 709, Training Loss: 0.47244
Epoch: 709, Training Loss: 0.49781
Epoch: 709, Training Loss: 0.47847
Epoch: 709, Training Loss: 0.50399
Epoch: 710, Training Loss: 0.47243
Epoch: 710, Training Loss: 0.49781
Epoch: 710, Training Loss: 0.47848
Epoch: 710, Training Loss: 0.50400
Epoch: 711, Training Loss: 0.47242
Epoch: 711, Training Loss: 0.49780
Epoch: 711, Training Loss: 0.47849
Epoch: 711, Training Loss: 0.50401
Epoch: 712, Training Loss: 0.47242
Epoch: 712, Training Loss: 0.49780
Epoch: 712, Training Loss: 0.47850
Epoch: 712, Training Loss: 0.50402
Epoch: 713, Training Loss: 0.47241
Epoch: 713, Training Loss: 0.49780
Epoch: 713, Training Loss: 0.47852
Epoch: 713, Training Loss: 0.50403
Epoch: 714, Training Loss: 0.47240
Epoch: 714, Training Loss: 0.49780
Epoch: 714, Training Loss: 0.47853
Epoch: 714, Training Loss: 0.50404
Epoch: 715, Training Loss: 0.47239
Epoch: 715, Training Loss: 0.49780
Epoch: 715, Training Loss: 0.47854
Epoch: 715, Training Loss: 0.50405
Epoch: 716, Training Loss: 0.47238
Epoch: 716, Training Loss: 0.49780
Epoch: 716, Training Loss: 0.47856
Epoch: 716, Training Loss: 0.50406
Epoch: 717, Training Loss: 0.47238
Epoch: 717, Training Loss: 0.49779
Epoch: 717, Training Loss: 0.47857
Epoch: 717, Training Loss: 0.50407
Epoch: 718, Training Loss: 0.47237
Epoch: 718, Training Loss: 0.49779
Epoch: 718, Training Loss: 0.47858
Epoch: 718, Training Loss: 0.50408
Epoch: 719, Training Loss: 0.47236
Epoch: 719, Training Loss: 0.49779
Epoch: 719, Training Loss: 0.47859
Epoch: 719, Training Loss: 0.50409
Epoch: 720, Training Loss: 0.47235
Epoch: 720, Training Loss: 0.49779
Epoch: 720, Training Loss: 0.47861
Epoch: 720, Training Loss: 0.50410
Epoch: 721, Training Loss: 0.47234
Epoch: 721, Training Loss: 0.49779
Epoch: 721, Training Loss: 0.47862
Epoch: 721, Training Loss: 0.50411
Epoch: 722, Training Loss: 0.47233
Epoch: 722, Training Loss: 0.49778
Epoch: 722, Training Loss: 0.47863
Epoch: 722, Training Loss: 0.50412
Epoch: 723, Training Loss: 0.47232
Epoch: 723, Training Loss: 0.49778
Epoch: 723, Training Loss: 0.47864
Epoch: 723, Training Loss: 0.50413
Epoch: 724, Training Loss: 0.47231
Epoch: 724, Training Loss: 0.49778
Epoch: 724, Training Loss: 0.47866
Epoch: 724, Training Loss: 0.50414
Epoch: 725, Training Loss: 0.47230
Epoch: 725, Training Loss: 0.49778
Epoch: 725, Training Loss: 0.47867
Epoch: 725, Training Loss: 0.50415
Epoch: 726, Training Loss: 0.47229
Epoch: 726, Training Loss: 0.49777
Epoch: 726, Training Loss: 0.47868
Epoch: 726, Training Loss: 0.50416
Epoch: 727, Training Loss: 0.47228
Epoch: 727, Training Loss: 0.49777
Epoch: 727, Training Loss: 0.47870
Epoch: 727, Training Loss: 0.50417
Epoch: 728, Training Loss: 0.47227
Epoch: 728, Training Loss: 0.49777
Epoch: 728, Training Loss: 0.47871
Epoch: 728, Training Loss: 0.50418
Epoch: 729, Training Loss: 0.47226
Epoch: 729, Training Loss: 0.49777
Epoch: 729, Training Loss: 0.47872
Epoch: 729, Training Loss: 0.50419
Epoch: 730, Training Loss: 0.47225
Epoch: 730, Training Loss: 0.49776
Epoch: 730, Training Loss: 0.47873
Epoch: 730, Training Loss: 0.50420
Epoch: 731, Training Loss: 0.47224
Epoch: 731, Training Loss: 0.49776
Epoch: 731, Training Loss: 0.47875
Epoch: 731, Training Loss: 0.50421
Epoch: 732, Training Loss: 0.47223
Epoch: 732, Training Loss: 0.49776
Epoch: 732, Training Loss: 0.47876
Epoch: 732, Training Loss: 0.50423
Epoch: 733, Training Loss: 0.47222
Epoch: 733, Training Loss: 0.49776
Epoch: 733, Training Loss: 0.47877
Epoch: 733, Training Loss: 0.50424
Epoch: 734, Training Loss: 0.47221
Epoch: 734, Training Loss: 0.49775
Epoch: 734, Training Loss: 0.47878
Epoch: 734, Training Loss: 0.50425
Epoch: 735, Training Loss: 0.47220
Epoch: 735, Training Loss: 0.49775
Epoch: 735, Training Loss: 0.47880
Epoch: 735, Training Loss: 0.50426
Epoch: 736, Training Loss: 0.47219
Epoch: 736, Training Loss: 0.49775
Epoch: 736, Training Loss: 0.47881
Epoch: 736, Training Loss: 0.50427
Epoch: 737, Training Loss: 0.47218
Epoch: 737, Training Loss: 0.49774
Epoch: 737, Training Loss: 0.47882
Epoch: 737, Training Loss: 0.50428
Epoch: 738, Training Loss: 0.47217
Epoch: 738, Training Loss: 0.49774
Epoch: 738, Training Loss: 0.47884
Epoch: 738, Training Loss: 0.50430
Epoch: 739, Training Loss: 0.47216
Epoch: 739, Training Loss: 0.49774
Epoch: 739, Training Loss: 0.47885
Epoch: 739, Training Loss: 0.50431
Epoch: 740, Training Loss: 0.47214
Epoch: 740, Training Loss: 0.49774
Epoch: 740, Training Loss: 0.47886
Epoch: 740, Training Loss: 0.50432
Epoch: 741, Training Loss: 0.47213
Epoch: 741, Training Loss: 0.49773
Epoch: 741, Training Loss: 0.47887
Epoch: 741, Training Loss: 0.50433
Epoch: 742, Training Loss: 0.47212
Epoch: 742, Training Loss: 0.49773
Epoch: 742, Training Loss: 0.47889
Epoch: 742, Training Loss: 0.50434
Epoch: 743, Training Loss: 0.47211
Epoch: 743, Training Loss: 0.49773
Epoch: 743, Training Loss: 0.47890
Epoch: 743, Training Loss: 0.50436
Epoch: 744, Training Loss: 0.47210
Epoch: 744, Training Loss: 0.49772
Epoch: 744, Training Loss: 0.47891
Epoch: 744, Training Loss: 0.50437
Epoch: 745, Training Loss: 0.47208
Epoch: 745, Training Loss: 0.49772
Epoch: 745, Training Loss: 0.47893
Epoch: 745, Training Loss: 0.50438
Epoch: 746, Training Loss: 0.47207
Epoch: 746, Training Loss: 0.49772
Epoch: 746, Training Loss: 0.47894
Epoch: 746, Training Loss: 0.50439
Epoch: 747, Training Loss: 0.47206
Epoch: 747, Training Loss: 0.49771
Epoch: 747, Training Loss: 0.47895
Epoch: 747, Training Loss: 0.50441
Epoch: 748, Training Loss: 0.47205
Epoch: 748, Training Loss: 0.49771
Epoch: 748, Training Loss: 0.47897
Epoch: 748, Training Loss: 0.50442
Epoch: 749, Training Loss: 0.47203
Epoch: 749, Training Loss: 0.49770
Epoch: 749, Training Loss: 0.47898
Epoch: 749, Training Loss: 0.50443
Epoch: 750, Training Loss: 0.47202
Epoch: 750, Training Loss: 0.49770
Epoch: 750, Training Loss: 0.47899
Epoch: 750, Training Loss: 0.50445
Epoch: 751, Training Loss: 0.47201
Epoch: 751, Training Loss: 0.49770
Epoch: 751, Training Loss: 0.47900
Epoch: 751, Training Loss: 0.50446
Epoch: 752, Training Loss: 0.47199
Epoch: 752, Training Loss: 0.49769
Epoch: 752, Training Loss: 0.47902
Epoch: 752, Training Loss: 0.50447
Epoch: 753, Training Loss: 0.47198
Epoch: 753, Training Loss: 0.49769
Epoch: 753, Training Loss: 0.47903
Epoch: 753, Training Loss: 0.50449
Epoch: 754, Training Loss: 0.47197
Epoch: 754, Training Loss: 0.49769
Epoch: 754, Training Loss: 0.47904
Epoch: 754, Training Loss: 0.50450
Epoch: 755, Training Loss: 0.47195
Epoch: 755, Training Loss: 0.49768
Epoch: 755, Training Loss: 0.47906
Epoch: 755, Training Loss: 0.50452
Epoch: 756, Training Loss: 0.47194
Epoch: 756, Training Loss: 0.49768
Epoch: 756, Training Loss: 0.47907
Epoch: 756, Training Loss: 0.50453
Epoch: 757, Training Loss: 0.47193
Epoch: 757, Training Loss: 0.49767
Epoch: 757, Training Loss: 0.47908
Epoch: 757, Training Loss: 0.50454
Epoch: 758, Training Loss: 0.47191
Epoch: 758, Training Loss: 0.49767
Epoch: 758, Training Loss: 0.47909
Epoch: 758, Training Loss: 0.50456
Epoch: 759, Training Loss: 0.47190
Epoch: 759, Training Loss: 0.49767
Epoch: 759, Training Loss: 0.47911
Epoch: 759, Training Loss: 0.50457
Epoch: 760, Training Loss: 0.47188
Epoch: 760, Training Loss: 0.49766
Epoch: 760, Training Loss: 0.47912
Epoch: 760, Training Loss: 0.50459
Epoch: 761, Training Loss: 0.47187
Epoch: 761, Training Loss: 0.49766
Epoch: 761, Training Loss: 0.47913
Epoch: 761, Training Loss: 0.50460
Epoch: 762, Training Loss: 0.47185
Epoch: 762, Training Loss: 0.49765
Epoch: 762, Training Loss: 0.47915
Epoch: 762, Training Loss: 0.50462
Epoch: 763, Training Loss: 0.47184
Epoch: 763, Training Loss: 0.49765
Epoch: 763, Training Loss: 0.47916
Epoch: 763, Training Loss: 0.50463
Epoch: 764, Training Loss: 0.47182
Epoch: 764, Training Loss: 0.49764
Epoch: 764, Training Loss: 0.47917
Epoch: 764, Training Loss: 0.50465
Epoch: 765, Training Loss: 0.47181
Epoch: 765, Training Loss: 0.49764
Epoch: 765, Training Loss: 0.47919
Epoch: 765, Training Loss: 0.50466
Epoch: 766, Training Loss: 0.47179
Epoch: 766, Training Loss: 0.49763
Epoch: 766, Training Loss: 0.47920
Epoch: 766, Training Loss: 0.50468
Epoch: 767, Training Loss: 0.47178
Epoch: 767, Training Loss: 0.49763
Epoch: 767, Training Loss: 0.47921
Epoch: 767, Training Loss: 0.50469
Epoch: 768, Training Loss: 0.47176
Epoch: 768, Training Loss: 0.49762
Epoch: 768, Training Loss: 0.47922
Epoch: 768, Training Loss: 0.50471
Epoch: 769, Training Loss: 0.47174
Epoch: 769, Training Loss: 0.49762
Epoch: 769, Training Loss: 0.47924
Epoch: 769, Training Loss: 0.50472
Epoch: 770, Training Loss: 0.47173
Epoch: 770, Training Loss: 0.49761
Epoch: 770, Training Loss: 0.47925
Epoch: 770, Training Loss: 0.50474
Epoch: 771, Training Loss: 0.47171
Epoch: 771, Training Loss: 0.49761
Epoch: 771, Training Loss: 0.47926
Epoch: 771, Training Loss: 0.50476
Epoch: 772, Training Loss: 0.47170
Epoch: 772, Training Loss: 0.49760
Epoch: 772, Training Loss: 0.47928
Epoch: 772, Training Loss: 0.50477
Epoch: 773, Training Loss: 0.47168
Epoch: 773, Training Loss: 0.49760
Epoch: 773, Training Loss: 0.47929
Epoch: 773, Training Loss: 0.50479
Epoch: 774, Training Loss: 0.47166
Epoch: 774, Training Loss: 0.49759
Epoch: 774, Training Loss: 0.47930
Epoch: 774, Training Loss: 0.50480
Epoch: 775, Training Loss: 0.47165
Epoch: 775, Training Loss: 0.49759
Epoch: 775, Training Loss: 0.47932
Epoch: 775, Training Loss: 0.50482
Epoch: 776, Training Loss: 0.47163
Epoch: 776, Training Loss: 0.49758
Epoch: 776, Training Loss: 0.47933
Epoch: 776, Training Loss: 0.50484
Epoch: 777, Training Loss: 0.47161
Epoch: 777, Training Loss: 0.49758
Epoch: 777, Training Loss: 0.47934
Epoch: 777, Training Loss: 0.50485
Epoch: 778, Training Loss: 0.47159
Epoch: 778, Training Loss: 0.49757
Epoch: 778, Training Loss: 0.47936
Epoch: 778, Training Loss: 0.50487
Epoch: 779, Training Loss: 0.47158
Epoch: 779, Training Loss: 0.49757
Epoch: 779, Training Loss: 0.47937
Epoch: 779, Training Loss: 0.50489
Epoch: 780, Training Loss: 0.47156
Epoch: 780, Training Loss: 0.49756
Epoch: 780, Training Loss: 0.47938
Epoch: 780, Training Loss: 0.50491
Epoch: 781, Training Loss: 0.47154
Epoch: 781, Training Loss: 0.49756
Epoch: 781, Training Loss: 0.47939
Epoch: 781, Training Loss: 0.50492
Epoch: 782, Training Loss: 0.47152
Epoch: 782, Training Loss: 0.49755
Epoch: 782, Training Loss: 0.47941
Epoch: 782, Training Loss: 0.50494
Epoch: 783, Training Loss: 0.47150
Epoch: 783, Training Loss: 0.49755
Epoch: 783, Training Loss: 0.47942
Epoch: 783, Training Loss: 0.50496
Epoch: 784, Training Loss: 0.47148
Epoch: 784, Training Loss: 0.49754
Epoch: 784, Training Loss: 0.47943
Epoch: 784, Training Loss: 0.50498
Epoch: 785, Training Loss: 0.47147
Epoch: 785, Training Loss: 0.49753
Epoch: 785, Training Loss: 0.47945
Epoch: 785, Training Loss: 0.50499
Epoch: 786, Training Loss: 0.47145
Epoch: 786, Training Loss: 0.49753
Epoch: 786, Training Loss: 0.47946
Epoch: 786, Training Loss: 0.50501
Epoch: 787, Training Loss: 0.47143
Epoch: 787, Training Loss: 0.49752
Epoch: 787, Training Loss: 0.47947
Epoch: 787, Training Loss: 0.50503
Epoch: 788, Training Loss: 0.47141
Epoch: 788, Training Loss: 0.49752
Epoch: 788, Training Loss: 0.47949
Epoch: 788, Training Loss: 0.50505
Epoch: 789, Training Loss: 0.47139
Epoch: 789, Training Loss: 0.49751
Epoch: 789, Training Loss: 0.47950
Epoch: 789, Training Loss: 0.50507
Epoch: 790, Training Loss: 0.47137
Epoch: 790, Training Loss: 0.49750
Epoch: 790, Training Loss: 0.47951
Epoch: 790, Training Loss: 0.50509
Epoch: 791, Training Loss: 0.47135
Epoch: 791, Training Loss: 0.49750
Epoch: 791, Training Loss: 0.47953
Epoch: 791, Training Loss: 0.50511
Epoch: 792, Training Loss: 0.47133
Epoch: 792, Training Loss: 0.49749
Epoch: 792, Training Loss: 0.47954
Epoch: 792, Training Loss: 0.50512
Epoch: 793, Training Loss: 0.47131
Epoch: 793, Training Loss: 0.49749
Epoch: 793, Training Loss: 0.47955
Epoch: 793, Training Loss: 0.50514
Epoch: 794, Training Loss: 0.47129
Epoch: 794, Training Loss: 0.49748
Epoch: 794, Training Loss: 0.47957
Epoch: 794, Training Loss: 0.50516
Epoch: 795, Training Loss: 0.47127
Epoch: 795, Training Loss: 0.49747
Epoch: 795, Training Loss: 0.47958
Epoch: 795, Training Loss: 0.50518
Epoch: 796, Training Loss: 0.47125
Epoch: 796, Training Loss: 0.49747
Epoch: 796, Training Loss: 0.47959
Epoch: 796, Training Loss: 0.50520
Epoch: 797, Training Loss: 0.47123
Epoch: 797, Training Loss: 0.49746
Epoch: 797, Training Loss: 0.47961
Epoch: 797, Training Loss: 0.50522
Epoch: 798, Training Loss: 0.47120
Epoch: 798, Training Loss: 0.49745
Epoch: 798, Training Loss: 0.47962
Epoch: 798, Training Loss: 0.50524
Epoch: 799, Training Loss: 0.47118
Epoch: 799, Training Loss: 0.49745
Epoch: 799, Training Loss: 0.47963
Epoch: 799, Training Loss: 0.50526
Epoch: 800, Training Loss: 0.47116
Epoch: 800, Training Loss: 0.49744
Epoch: 800, Training Loss: 0.47964
Epoch: 800, Training Loss: 0.50528
Epoch: 801, Training Loss: 0.47114
Epoch: 801, Training Loss: 0.49743
Epoch: 801, Training Loss: 0.47966
Epoch: 801, Training Loss: 0.50530
Epoch: 802, Training Loss: 0.47112
Epoch: 802, Training Loss: 0.49743
Epoch: 802, Training Loss: 0.47967
Epoch: 802, Training Loss: 0.50532
Epoch: 803, Training Loss: 0.47109
Epoch: 803, Training Loss: 0.49742
Epoch: 803, Training Loss: 0.47968
Epoch: 803, Training Loss: 0.50534
Epoch: 804, Training Loss: 0.47107
Epoch: 804, Training Loss: 0.49741
Epoch: 804, Training Loss: 0.47970
Epoch: 804, Training Loss: 0.50537
Epoch: 805, Training Loss: 0.47105
Epoch: 805, Training Loss: 0.49741
Epoch: 805, Training Loss: 0.47971
Epoch: 805, Training Loss: 0.50539
Epoch: 806, Training Loss: 0.47103
Epoch: 806, Training Loss: 0.49740
Epoch: 806, Training Loss: 0.47972
Epoch: 806, Training Loss: 0.50541
Epoch: 807, Training Loss: 0.47100
Epoch: 807, Training Loss: 0.49739
Epoch: 807, Training Loss: 0.47974
Epoch: 807, Training Loss: 0.50543
Epoch: 808, Training Loss: 0.47098
Epoch: 808, Training Loss: 0.49738
Epoch: 808, Training Loss: 0.47975
Epoch: 808, Training Loss: 0.50545
Epoch: 809, Training Loss: 0.47096
Epoch: 809, Training Loss: 0.49738
Epoch: 809, Training Loss: 0.47976
Epoch: 809, Training Loss: 0.50547
Epoch: 810, Training Loss: 0.47093
Epoch: 810, Training Loss: 0.49737
Epoch: 810, Training Loss: 0.47978
Epoch: 810, Training Loss: 0.50550
Epoch: 811, Training Loss: 0.47091
Epoch: 811, Training Loss: 0.49736
Epoch: 811, Training Loss: 0.47979
Epoch: 811, Training Loss: 0.50552
Epoch: 812, Training Loss: 0.47089
Epoch: 812, Training Loss: 0.49735
Epoch: 812, Training Loss: 0.47980
Epoch: 812, Training Loss: 0.50554
Epoch: 813, Training Loss: 0.47086
Epoch: 813, Training Loss: 0.49735
Epoch: 813, Training Loss: 0.47982
Epoch: 813, Training Loss: 0.50556
Epoch: 814, Training Loss: 0.47084
Epoch: 814, Training Loss: 0.49734
Epoch: 814, Training Loss: 0.47983
Epoch: 814, Training Loss: 0.50559
Epoch: 815, Training Loss: 0.47081
Epoch: 815, Training Loss: 0.49733
Epoch: 815, Training Loss: 0.47984
Epoch: 815, Training Loss: 0.50561
Epoch: 816, Training Loss: 0.47079
Epoch: 816, Training Loss: 0.49732
Epoch: 816, Training Loss: 0.47986
Epoch: 816, Training Loss: 0.50563
Epoch: 817, Training Loss: 0.47076
Epoch: 817, Training Loss: 0.49731
Epoch: 817, Training Loss: 0.47987
Epoch: 817, Training Loss: 0.50565
Epoch: 818, Training Loss: 0.47074
Epoch: 818, Training Loss: 0.49731
Epoch: 818, Training Loss: 0.47988
Epoch: 818, Training Loss: 0.50568
Epoch: 819, Training Loss: 0.47071
Epoch: 819, Training Loss: 0.49730
Epoch: 819, Training Loss: 0.47989
Epoch: 819, Training Loss: 0.50570
Epoch: 820, Training Loss: 0.47068
Epoch: 820, Training Loss: 0.49729
Epoch: 820, Training Loss: 0.47991
Epoch: 820, Training Loss: 0.50573
Epoch: 821, Training Loss: 0.47066
Epoch: 821, Training Loss: 0.49728
Epoch: 821, Training Loss: 0.47992
Epoch: 821, Training Loss: 0.50575
Epoch: 822, Training Loss: 0.47063
Epoch: 822, Training Loss: 0.49727
Epoch: 822, Training Loss: 0.47993
Epoch: 822, Training Loss: 0.50577
Epoch: 823, Training Loss: 0.47060
Epoch: 823, Training Loss: 0.49727
Epoch: 823, Training Loss: 0.47995
Epoch: 823, Training Loss: 0.50580
Epoch: 824, Training Loss: 0.47058
Epoch: 824, Training Loss: 0.49726
Epoch: 824, Training Loss: 0.47996
Epoch: 824, Training Loss: 0.50582
Epoch: 825, Training Loss: 0.47055
Epoch: 825, Training Loss: 0.49725
Epoch: 825, Training Loss: 0.47997
Epoch: 825, Training Loss: 0.50585
Epoch: 826, Training Loss: 0.47052
Epoch: 826, Training Loss: 0.49724
Epoch: 826, Training Loss: 0.47999
Epoch: 826, Training Loss: 0.50587
Epoch: 827, Training Loss: 0.47050
Epoch: 827, Training Loss: 0.49723
Epoch: 827, Training Loss: 0.48000
Epoch: 827, Training Loss: 0.50590
Epoch: 828, Training Loss: 0.47047
Epoch: 828, Training Loss: 0.49722
Epoch: 828, Training Loss: 0.48001
Epoch: 828, Training Loss: 0.50592
Epoch: 829, Training Loss: 0.47044
Epoch: 829, Training Loss: 0.49721
Epoch: 829, Training Loss: 0.48003
Epoch: 829, Training Loss: 0.50595
Epoch: 830, Training Loss: 0.47041
Epoch: 830, Training Loss: 0.49721
Epoch: 830, Training Loss: 0.48004
Epoch: 830, Training Loss: 0.50597
Epoch: 831, Training Loss: 0.47038
Epoch: 831, Training Loss: 0.49720
Epoch: 831, Training Loss: 0.48005
Epoch: 831, Training Loss: 0.50600
Epoch: 832, Training Loss: 0.47035
Epoch: 832, Training Loss: 0.49719
Epoch: 832, Training Loss: 0.48006
Epoch: 832, Training Loss: 0.50603
Epoch: 833, Training Loss: 0.47033
Epoch: 833, Training Loss: 0.49718
Epoch: 833, Training Loss: 0.48008
Epoch: 833, Training Loss: 0.50605
Epoch: 834, Training Loss: 0.47030
Epoch: 834, Training Loss: 0.49717
Epoch: 834, Training Loss: 0.48009
Epoch: 834, Training Loss: 0.50608
Epoch: 835, Training Loss: 0.47027
Epoch: 835, Training Loss: 0.49716
Epoch: 835, Training Loss: 0.48010
Epoch: 835, Training Loss: 0.50611
Epoch: 836, Training Loss: 0.47024
Epoch: 836, Training Loss: 0.49715
Epoch: 836, Training Loss: 0.48012
Epoch: 836, Training Loss: 0.50613
Epoch: 837, Training Loss: 0.47021
Epoch: 837, Training Loss: 0.49714
Epoch: 837, Training Loss: 0.48013
Epoch: 837, Training Loss: 0.50616
Epoch: 838, Training Loss: 0.47018
Epoch: 838, Training Loss: 0.49713
Epoch: 838, Training Loss: 0.48014
Epoch: 838, Training Loss: 0.50619
Epoch: 839, Training Loss: 0.47015
Epoch: 839, Training Loss: 0.49712
Epoch: 839, Training Loss: 0.48016
Epoch: 839, Training Loss: 0.50622
Epoch: 840, Training Loss: 0.47012
Epoch: 840, Training Loss: 0.49711
Epoch: 840, Training Loss: 0.48017
Epoch: 840, Training Loss: 0.50624
Epoch: 841, Training Loss: 0.47008
Epoch: 841, Training Loss: 0.49710
Epoch: 841, Training Loss: 0.48018
Epoch: 841, Training Loss: 0.50627
Epoch: 842, Training Loss: 0.47005
Epoch: 842, Training Loss: 0.49709
Epoch: 842, Training Loss: 0.48019
Epoch: 842, Training Loss: 0.50630
Epoch: 843, Training Loss: 0.47002
Epoch: 843, Training Loss: 0.49708
Epoch: 843, Training Loss: 0.48021
Epoch: 843, Training Loss: 0.50633
Epoch: 844, Training Loss: 0.46999
Epoch: 844, Training Loss: 0.49707
Epoch: 844, Training Loss: 0.48022
Epoch: 844, Training Loss: 0.50636
Epoch: 845, Training Loss: 0.46996
Epoch: 845, Training Loss: 0.49706
Epoch: 845, Training Loss: 0.48023
Epoch: 845, Training Loss: 0.50639
Epoch: 846, Training Loss: 0.46993
Epoch: 846, Training Loss: 0.49705
Epoch: 846, Training Loss: 0.48025
Epoch: 846, Training Loss: 0.50641
Epoch: 847, Training Loss: 0.46989
Epoch: 847, Training Loss: 0.49704
Epoch: 847, Training Loss: 0.48026
Epoch: 847, Training Loss: 0.50644
Epoch: 848, Training Loss: 0.46986
Epoch: 848, Training Loss: 0.49703
Epoch: 848, Training Loss: 0.48027
Epoch: 848, Training Loss: 0.50647
Epoch: 849, Training Loss: 0.46983
Epoch: 849, Training Loss: 0.49702
Epoch: 849, Training Loss: 0.48028
Epoch: 849, Training Loss: 0.50650
Epoch: 850, Training Loss: 0.46979
Epoch: 850, Training Loss: 0.49701
Epoch: 850, Training Loss: 0.48030
Epoch: 850, Training Loss: 0.50653
Epoch: 851, Training Loss: 0.46976
Epoch: 851, Training Loss: 0.49700
Epoch: 851, Training Loss: 0.48031
Epoch: 851, Training Loss: 0.50656
Epoch: 852, Training Loss: 0.46973
Epoch: 852, Training Loss: 0.49699
Epoch: 852, Training Loss: 0.48032
Epoch: 852, Training Loss: 0.50659
Epoch: 853, Training Loss: 0.46969
Epoch: 853, Training Loss: 0.49698
Epoch: 853, Training Loss: 0.48034
Epoch: 853, Training Loss: 0.50662
Epoch: 854, Training Loss: 0.46966
Epoch: 854, Training Loss: 0.49697
Epoch: 854, Training Loss: 0.48035
Epoch: 854, Training Loss: 0.50665
Epoch: 855, Training Loss: 0.46962
Epoch: 855, Training Loss: 0.49696
Epoch: 855, Training Loss: 0.48036
Epoch: 855, Training Loss: 0.50669
Epoch: 856, Training Loss: 0.46959
Epoch: 856, Training Loss: 0.49695
Epoch: 856, Training Loss: 0.48037
Epoch: 856, Training Loss: 0.50672
Epoch: 857, Training Loss: 0.46955
Epoch: 857, Training Loss: 0.49694
Epoch: 857, Training Loss: 0.48039
Epoch: 857, Training Loss: 0.50675
Epoch: 858, Training Loss: 0.46952
Epoch: 858, Training Loss: 0.49693
Epoch: 858, Training Loss: 0.48040
Epoch: 858, Training Loss: 0.50678
Epoch: 859, Training Loss: 0.46948
Epoch: 859, Training Loss: 0.49692
Epoch: 859, Training Loss: 0.48041
Epoch: 859, Training Loss: 0.50681
Epoch: 860, Training Loss: 0.46944
Epoch: 860, Training Loss: 0.49691
Epoch: 860, Training Loss: 0.48042
Epoch: 860, Training Loss: 0.50684
Epoch: 861, Training Loss: 0.46941
Epoch: 861, Training Loss: 0.49690
Epoch: 861, Training Loss: 0.48044
Epoch: 861, Training Loss: 0.50688
Epoch: 862, Training Loss: 0.46937
Epoch: 862, Training Loss: 0.49688
Epoch: 862, Training Loss: 0.48045
Epoch: 862, Training Loss: 0.50691
Epoch: 863, Training Loss: 0.46933
Epoch: 863, Training Loss: 0.49687
Epoch: 863, Training Loss: 0.48046
Epoch: 863, Training Loss: 0.50694
Epoch: 864, Training Loss: 0.46930
Epoch: 864, Training Loss: 0.49686
Epoch: 864, Training Loss: 0.48047
Epoch: 864, Training Loss: 0.50698
Epoch: 865, Training Loss: 0.46926
Epoch: 865, Training Loss: 0.49685
Epoch: 865, Training Loss: 0.48049
Epoch: 865, Training Loss: 0.50701
Epoch: 866, Training Loss: 0.46922
Epoch: 866, Training Loss: 0.49684
Epoch: 866, Training Loss: 0.48050
Epoch: 866, Training Loss: 0.50704
Epoch: 867, Training Loss: 0.46918
Epoch: 867, Training Loss: 0.49683
Epoch: 867, Training Loss: 0.48051
Epoch: 867, Training Loss: 0.50708
Epoch: 868, Training Loss: 0.46914
Epoch: 868, Training Loss: 0.49681
Epoch: 868, Training Loss: 0.48052
Epoch: 868, Training Loss: 0.50711
Epoch: 869, Training Loss: 0.46910
Epoch: 869, Training Loss: 0.49680
Epoch: 869, Training Loss: 0.48054
Epoch: 869, Training Loss: 0.50715
Epoch: 870, Training Loss: 0.46907
Epoch: 870, Training Loss: 0.49679
Epoch: 870, Training Loss: 0.48055
Epoch: 870, Training Loss: 0.50718
Epoch: 871, Training Loss: 0.46903
Epoch: 871, Training Loss: 0.49678
Epoch: 871, Training Loss: 0.48056
Epoch: 871, Training Loss: 0.50722
Epoch: 872, Training Loss: 0.46899
Epoch: 872, Training Loss: 0.49677
Epoch: 872, Training Loss: 0.48057
Epoch: 872, Training Loss: 0.50725
Epoch: 873, Training Loss: 0.46895
Epoch: 873, Training Loss: 0.49675
Epoch: 873, Training Loss: 0.48059
Epoch: 873, Training Loss: 0.50729
Epoch: 874, Training Loss: 0.46890
Epoch: 874, Training Loss: 0.49674
Epoch: 874, Training Loss: 0.48060
Epoch: 874, Training Loss: 0.50732
Epoch: 875, Training Loss: 0.46886
Epoch: 875, Training Loss: 0.49673
Epoch: 875, Training Loss: 0.48061
Epoch: 875, Training Loss: 0.50736
Epoch: 876, Training Loss: 0.46882
Epoch: 876, Training Loss: 0.49672
Epoch: 876, Training Loss: 0.48062
Epoch: 876, Training Loss: 0.50739
Epoch: 877, Training Loss: 0.46878
Epoch: 877, Training Loss: 0.49671
Epoch: 877, Training Loss: 0.48063
Epoch: 877, Training Loss: 0.50743
Epoch: 878, Training Loss: 0.46874
Epoch: 878, Training Loss: 0.49669
Epoch: 878, Training Loss: 0.48065
Epoch: 878, Training Loss: 0.50747
Epoch: 879, Training Loss: 0.46870
Epoch: 879, Training Loss: 0.49668
Epoch: 879, Training Loss: 0.48066
Epoch: 879, Training Loss: 0.50750
Epoch: 880, Training Loss: 0.46865
Epoch: 880, Training Loss: 0.49667
Epoch: 880, Training Loss: 0.48067
Epoch: 880, Training Loss: 0.50754
... [output truncated: one loss line per training sample, four per epoch, for epochs 880 through 1133; the four per-sample losses change only slowly, drifting from roughly 0.47/0.50/0.48/0.51 to roughly 0.43/0.49/0.48/0.53] ...
Epoch: 1133, Training Loss: 0.43414
Epoch: 1133, Training Loss: 0.48703
Epoch: 1133, Training Loss: 0.47673
Epoch: 1133, Training Loss: 0.53350
Epoch: 1134, Training Loss: 0.43382
Epoch: 1134, Training Loss: 0.48693
Epoch: 1134, Training Loss: 0.47664
Epoch: 1134, Training Loss: 0.53371
Epoch: 1135, Training Loss: 0.43349
Epoch: 1135, Training Loss: 0.48683
Epoch: 1135, Training Loss: 0.47655
Epoch: 1135, Training Loss: 0.53393
Epoch: 1136, Training Loss: 0.43316
Epoch: 1136, Training Loss: 0.48673
Epoch: 1136, Training Loss: 0.47646
Epoch: 1136, Training Loss: 0.53415
Epoch: 1137, Training Loss: 0.43283
Epoch: 1137, Training Loss: 0.48663
Epoch: 1137, Training Loss: 0.47637
Epoch: 1137, Training Loss: 0.53437
Epoch: 1138, Training Loss: 0.43249
Epoch: 1138, Training Loss: 0.48653
Epoch: 1138, Training Loss: 0.47627
Epoch: 1138, Training Loss: 0.53459
Epoch: 1139, Training Loss: 0.43216
Epoch: 1139, Training Loss: 0.48642
Epoch: 1139, Training Loss: 0.47617
Epoch: 1139, Training Loss: 0.53481
Epoch: 1140, Training Loss: 0.43182
Epoch: 1140, Training Loss: 0.48631
Epoch: 1140, Training Loss: 0.47608
Epoch: 1140, Training Loss: 0.53504
Epoch: 1141, Training Loss: 0.43147
Epoch: 1141, Training Loss: 0.48621
Epoch: 1141, Training Loss: 0.47598
Epoch: 1141, Training Loss: 0.53527
Epoch: 1142, Training Loss: 0.43113
Epoch: 1142, Training Loss: 0.48610
Epoch: 1142, Training Loss: 0.47588
Epoch: 1142, Training Loss: 0.53549
Epoch: 1143, Training Loss: 0.43078
Epoch: 1143, Training Loss: 0.48599
Epoch: 1143, Training Loss: 0.47577
Epoch: 1143, Training Loss: 0.53572
Epoch: 1144, Training Loss: 0.43043
Epoch: 1144, Training Loss: 0.48588
Epoch: 1144, Training Loss: 0.47567
Epoch: 1144, Training Loss: 0.53595
Epoch: 1145, Training Loss: 0.43007
Epoch: 1145, Training Loss: 0.48577
Epoch: 1145, Training Loss: 0.47557
Epoch: 1145, Training Loss: 0.53619
Epoch: 1146, Training Loss: 0.42971
Epoch: 1146, Training Loss: 0.48565
Epoch: 1146, Training Loss: 0.47546
Epoch: 1146, Training Loss: 0.53642
Epoch: 1147, Training Loss: 0.42935
Epoch: 1147, Training Loss: 0.48554
Epoch: 1147, Training Loss: 0.47535
Epoch: 1147, Training Loss: 0.53665
Epoch: 1148, Training Loss: 0.42899
Epoch: 1148, Training Loss: 0.48542
Epoch: 1148, Training Loss: 0.47524
Epoch: 1148, Training Loss: 0.53689
Epoch: 1149, Training Loss: 0.42862
Epoch: 1149, Training Loss: 0.48530
Epoch: 1149, Training Loss: 0.47513
Epoch: 1149, Training Loss: 0.53713
Epoch: 1150, Training Loss: 0.42825
Epoch: 1150, Training Loss: 0.48519
Epoch: 1150, Training Loss: 0.47502
Epoch: 1150, Training Loss: 0.53737
Epoch: 1151, Training Loss: 0.42788
Epoch: 1151, Training Loss: 0.48507
Epoch: 1151, Training Loss: 0.47491
Epoch: 1151, Training Loss: 0.53761
Epoch: 1152, Training Loss: 0.42750
Epoch: 1152, Training Loss: 0.48495
Epoch: 1152, Training Loss: 0.47479
Epoch: 1152, Training Loss: 0.53786
Epoch: 1153, Training Loss: 0.42712
Epoch: 1153, Training Loss: 0.48482
Epoch: 1153, Training Loss: 0.47468
Epoch: 1153, Training Loss: 0.53810
Epoch: 1154, Training Loss: 0.42674
Epoch: 1154, Training Loss: 0.48470
Epoch: 1154, Training Loss: 0.47456
Epoch: 1154, Training Loss: 0.53835
Epoch: 1155, Training Loss: 0.42635
Epoch: 1155, Training Loss: 0.48457
Epoch: 1155, Training Loss: 0.47444
Epoch: 1155, Training Loss: 0.53860
Epoch: 1156, Training Loss: 0.42596
Epoch: 1156, Training Loss: 0.48445
Epoch: 1156, Training Loss: 0.47432
Epoch: 1156, Training Loss: 0.53885
Epoch: 1157, Training Loss: 0.42557
Epoch: 1157, Training Loss: 0.48432
Epoch: 1157, Training Loss: 0.47420
Epoch: 1157, Training Loss: 0.53910
Epoch: 1158, Training Loss: 0.42517
Epoch: 1158, Training Loss: 0.48419
Epoch: 1158, Training Loss: 0.47407
Epoch: 1158, Training Loss: 0.53935
Epoch: 1159, Training Loss: 0.42477
Epoch: 1159, Training Loss: 0.48406
Epoch: 1159, Training Loss: 0.47395
Epoch: 1159, Training Loss: 0.53961
Epoch: 1160, Training Loss: 0.42437
Epoch: 1160, Training Loss: 0.48393
Epoch: 1160, Training Loss: 0.47382
Epoch: 1160, Training Loss: 0.53986
Epoch: 1161, Training Loss: 0.42397
Epoch: 1161, Training Loss: 0.48379
Epoch: 1161, Training Loss: 0.47369
Epoch: 1161, Training Loss: 0.54012
Epoch: 1162, Training Loss: 0.42356
Epoch: 1162, Training Loss: 0.48366
Epoch: 1162, Training Loss: 0.47356
Epoch: 1162, Training Loss: 0.54038
Epoch: 1163, Training Loss: 0.42314
Epoch: 1163, Training Loss: 0.48352
Epoch: 1163, Training Loss: 0.47343
Epoch: 1163, Training Loss: 0.54065
Epoch: 1164, Training Loss: 0.42273
Epoch: 1164, Training Loss: 0.48338
Epoch: 1164, Training Loss: 0.47329
Epoch: 1164, Training Loss: 0.54091
Epoch: 1165, Training Loss: 0.42231
Epoch: 1165, Training Loss: 0.48324
Epoch: 1165, Training Loss: 0.47316
Epoch: 1165, Training Loss: 0.54118
Epoch: 1166, Training Loss: 0.42188
Epoch: 1166, Training Loss: 0.48310
Epoch: 1166, Training Loss: 0.47302
Epoch: 1166, Training Loss: 0.54144
Epoch: 1167, Training Loss: 0.42145
Epoch: 1167, Training Loss: 0.48296
Epoch: 1167, Training Loss: 0.47288
Epoch: 1167, Training Loss: 0.54171
Epoch: 1168, Training Loss: 0.42102
Epoch: 1168, Training Loss: 0.48281
Epoch: 1168, Training Loss: 0.47274
Epoch: 1168, Training Loss: 0.54198
Epoch: 1169, Training Loss: 0.42059
Epoch: 1169, Training Loss: 0.48267
Epoch: 1169, Training Loss: 0.47260
Epoch: 1169, Training Loss: 0.54226
Epoch: 1170, Training Loss: 0.42015
Epoch: 1170, Training Loss: 0.48252
Epoch: 1170, Training Loss: 0.47245
Epoch: 1170, Training Loss: 0.54253
Epoch: 1171, Training Loss: 0.41971
Epoch: 1171, Training Loss: 0.48237
Epoch: 1171, Training Loss: 0.47231
Epoch: 1171, Training Loss: 0.54281
Epoch: 1172, Training Loss: 0.41926
Epoch: 1172, Training Loss: 0.48222
Epoch: 1172, Training Loss: 0.47216
Epoch: 1172, Training Loss: 0.54309
Epoch: 1173, Training Loss: 0.41881
Epoch: 1173, Training Loss: 0.48207
Epoch: 1173, Training Loss: 0.47201
Epoch: 1173, Training Loss: 0.54337
Epoch: 1174, Training Loss: 0.41836
Epoch: 1174, Training Loss: 0.48191
Epoch: 1174, Training Loss: 0.47186
Epoch: 1174, Training Loss: 0.54365
Epoch: 1175, Training Loss: 0.41790
Epoch: 1175, Training Loss: 0.48175
Epoch: 1175, Training Loss: 0.47170
Epoch: 1175, Training Loss: 0.54394
Epoch: 1176, Training Loss: 0.41744
Epoch: 1176, Training Loss: 0.48160
Epoch: 1176, Training Loss: 0.47155
Epoch: 1176, Training Loss: 0.54422
Epoch: 1177, Training Loss: 0.41698
Epoch: 1177, Training Loss: 0.48144
Epoch: 1177, Training Loss: 0.47139
Epoch: 1177, Training Loss: 0.54451
Epoch: 1178, Training Loss: 0.41651
Epoch: 1178, Training Loss: 0.48127
Epoch: 1178, Training Loss: 0.47123
Epoch: 1178, Training Loss: 0.54480
Epoch: 1179, Training Loss: 0.41604
Epoch: 1179, Training Loss: 0.48111
Epoch: 1179, Training Loss: 0.47107
Epoch: 1179, Training Loss: 0.54509
Epoch: 1180, Training Loss: 0.41556
Epoch: 1180, Training Loss: 0.48095
Epoch: 1180, Training Loss: 0.47091
Epoch: 1180, Training Loss: 0.54539
Epoch: 1181, Training Loss: 0.41508
Epoch: 1181, Training Loss: 0.48078
Epoch: 1181, Training Loss: 0.47074
Epoch: 1181, Training Loss: 0.54568
Epoch: 1182, Training Loss: 0.41459
Epoch: 1182, Training Loss: 0.48061
Epoch: 1182, Training Loss: 0.47058
Epoch: 1182, Training Loss: 0.54598
Epoch: 1183, Training Loss: 0.41410
Epoch: 1183, Training Loss: 0.48044
Epoch: 1183, Training Loss: 0.47041
Epoch: 1183, Training Loss: 0.54628
Epoch: 1184, Training Loss: 0.41361
Epoch: 1184, Training Loss: 0.48026
Epoch: 1184, Training Loss: 0.47024
Epoch: 1184, Training Loss: 0.54659
Epoch: 1185, Training Loss: 0.41311
Epoch: 1185, Training Loss: 0.48009
Epoch: 1185, Training Loss: 0.47006
Epoch: 1185, Training Loss: 0.54689
Epoch: 1186, Training Loss: 0.41261
Epoch: 1186, Training Loss: 0.47991
Epoch: 1186, Training Loss: 0.46989
Epoch: 1186, Training Loss: 0.54720
Epoch: 1187, Training Loss: 0.41211
Epoch: 1187, Training Loss: 0.47973
Epoch: 1187, Training Loss: 0.46971
Epoch: 1187, Training Loss: 0.54751
Epoch: 1188, Training Loss: 0.41160
Epoch: 1188, Training Loss: 0.47955
Epoch: 1188, Training Loss: 0.46953
Epoch: 1188, Training Loss: 0.54782
Epoch: 1189, Training Loss: 0.41108
Epoch: 1189, Training Loss: 0.47937
Epoch: 1189, Training Loss: 0.46935
Epoch: 1189, Training Loss: 0.54813
Epoch: 1190, Training Loss: 0.41056
Epoch: 1190, Training Loss: 0.47918
Epoch: 1190, Training Loss: 0.46916
Epoch: 1190, Training Loss: 0.54844
Epoch: 1191, Training Loss: 0.41004
Epoch: 1191, Training Loss: 0.47900
Epoch: 1191, Training Loss: 0.46898
Epoch: 1191, Training Loss: 0.54876
Epoch: 1192, Training Loss: 0.40951
Epoch: 1192, Training Loss: 0.47881
Epoch: 1192, Training Loss: 0.46879
Epoch: 1192, Training Loss: 0.54908
Epoch: 1193, Training Loss: 0.40898
Epoch: 1193, Training Loss: 0.47862
Epoch: 1193, Training Loss: 0.46860
Epoch: 1193, Training Loss: 0.54940
Epoch: 1194, Training Loss: 0.40845
Epoch: 1194, Training Loss: 0.47842
Epoch: 1194, Training Loss: 0.46841
Epoch: 1194, Training Loss: 0.54973
Epoch: 1195, Training Loss: 0.40791
Epoch: 1195, Training Loss: 0.47823
Epoch: 1195, Training Loss: 0.46821
Epoch: 1195, Training Loss: 0.55005
Epoch: 1196, Training Loss: 0.40736
Epoch: 1196, Training Loss: 0.47803
Epoch: 1196, Training Loss: 0.46801
Epoch: 1196, Training Loss: 0.55038
Epoch: 1197, Training Loss: 0.40681
Epoch: 1197, Training Loss: 0.47783
Epoch: 1197, Training Loss: 0.46781
Epoch: 1197, Training Loss: 0.55071
Epoch: 1198, Training Loss: 0.40626
Epoch: 1198, Training Loss: 0.47763
Epoch: 1198, Training Loss: 0.46761
Epoch: 1198, Training Loss: 0.55104
Epoch: 1199, Training Loss: 0.40570
Epoch: 1199, Training Loss: 0.47742
Epoch: 1199, Training Loss: 0.46741
Epoch: 1199, Training Loss: 0.55138
Epoch: 1200, Training Loss: 0.40514
Epoch: 1200, Training Loss: 0.47722
Epoch: 1200, Training Loss: 0.46720
Epoch: 1200, Training Loss: 0.55172
Epoch: 1201, Training Loss: 0.40457
Epoch: 1201, Training Loss: 0.47701
Epoch: 1201, Training Loss: 0.46699
Epoch: 1201, Training Loss: 0.55206
Epoch: 1202, Training Loss: 0.40400
Epoch: 1202, Training Loss: 0.47680
Epoch: 1202, Training Loss: 0.46678
Epoch: 1202, Training Loss: 0.55240
Epoch: 1203, Training Loss: 0.40342
Epoch: 1203, Training Loss: 0.47658
Epoch: 1203, Training Loss: 0.46657
Epoch: 1203, Training Loss: 0.55274
Epoch: 1204, Training Loss: 0.40284
Epoch: 1204, Training Loss: 0.47637
Epoch: 1204, Training Loss: 0.46635
Epoch: 1204, Training Loss: 0.55309
Epoch: 1205, Training Loss: 0.40225
Epoch: 1205, Training Loss: 0.47615
Epoch: 1205, Training Loss: 0.46613
Epoch: 1205, Training Loss: 0.55344
Epoch: 1206, Training Loss: 0.40166
Epoch: 1206, Training Loss: 0.47593
Epoch: 1206, Training Loss: 0.46591
Epoch: 1206, Training Loss: 0.55379
Epoch: 1207, Training Loss: 0.40106
Epoch: 1207, Training Loss: 0.47570
Epoch: 1207, Training Loss: 0.46569
Epoch: 1207, Training Loss: 0.55415
Epoch: 1208, Training Loss: 0.40046
Epoch: 1208, Training Loss: 0.47548
Epoch: 1208, Training Loss: 0.46546
Epoch: 1208, Training Loss: 0.55450
Epoch: 1209, Training Loss: 0.39986
Epoch: 1209, Training Loss: 0.47525
Epoch: 1209, Training Loss: 0.46523
Epoch: 1209, Training Loss: 0.55486
Epoch: 1210, Training Loss: 0.39924
Epoch: 1210, Training Loss: 0.47502
Epoch: 1210, Training Loss: 0.46500
Epoch: 1210, Training Loss: 0.55522
Epoch: 1211, Training Loss: 0.39863
Epoch: 1211, Training Loss: 0.47478
Epoch: 1211, Training Loss: 0.46476
Epoch: 1211, Training Loss: 0.55559
Epoch: 1212, Training Loss: 0.39801
Epoch: 1212, Training Loss: 0.47455
Epoch: 1212, Training Loss: 0.46453
Epoch: 1212, Training Loss: 0.55595
Epoch: 1213, Training Loss: 0.39738
Epoch: 1213, Training Loss: 0.47431
Epoch: 1213, Training Loss: 0.46429
Epoch: 1213, Training Loss: 0.55632
Epoch: 1214, Training Loss: 0.39675
Epoch: 1214, Training Loss: 0.47407
Epoch: 1214, Training Loss: 0.46404
Epoch: 1214, Training Loss: 0.55669
Epoch: 1215, Training Loss: 0.39612
Epoch: 1215, Training Loss: 0.47382
Epoch: 1215, Training Loss: 0.46380
Epoch: 1215, Training Loss: 0.55706
Epoch: 1216, Training Loss: 0.39548
Epoch: 1216, Training Loss: 0.47358
Epoch: 1216, Training Loss: 0.46355
Epoch: 1216, Training Loss: 0.55744
Epoch: 1217, Training Loss: 0.39483
Epoch: 1217, Training Loss: 0.47333
Epoch: 1217, Training Loss: 0.46330
Epoch: 1217, Training Loss: 0.55782
Epoch: 1218, Training Loss: 0.39418
Epoch: 1218, Training Loss: 0.47307
Epoch: 1218, Training Loss: 0.46305
Epoch: 1218, Training Loss: 0.55820
Epoch: 1219, Training Loss: 0.39352
Epoch: 1219, Training Loss: 0.47282
Epoch: 1219, Training Loss: 0.46279
Epoch: 1219, Training Loss: 0.55858
Epoch: 1220, Training Loss: 0.39286
Epoch: 1220, Training Loss: 0.47256
Epoch: 1220, Training Loss: 0.46253
Epoch: 1220, Training Loss: 0.55897
Epoch: 1221, Training Loss: 0.39219
Epoch: 1221, Training Loss: 0.47230
Epoch: 1221, Training Loss: 0.46227
Epoch: 1221, Training Loss: 0.55936
Epoch: 1222, Training Loss: 0.39152
Epoch: 1222, Training Loss: 0.47204
Epoch: 1222, Training Loss: 0.46201
Epoch: 1222, Training Loss: 0.55975
Epoch: 1223, Training Loss: 0.39085
Epoch: 1223, Training Loss: 0.47177
Epoch: 1223, Training Loss: 0.46174
Epoch: 1223, Training Loss: 0.56014
Epoch: 1224, Training Loss: 0.39016
Epoch: 1224, Training Loss: 0.47150
Epoch: 1224, Training Loss: 0.46147
Epoch: 1224, Training Loss: 0.56054
Epoch: 1225, Training Loss: 0.38948
Epoch: 1225, Training Loss: 0.47123
Epoch: 1225, Training Loss: 0.46120
Epoch: 1225, Training Loss: 0.56094
Epoch: 1226, Training Loss: 0.38879
Epoch: 1226, Training Loss: 0.47096
Epoch: 1226, Training Loss: 0.46092
Epoch: 1226, Training Loss: 0.56134
Epoch: 1227, Training Loss: 0.38809
Epoch: 1227, Training Loss: 0.47068
Epoch: 1227, Training Loss: 0.46064
Epoch: 1227, Training Loss: 0.56175
Epoch: 1228, Training Loss: 0.38738
Epoch: 1228, Training Loss: 0.47040
Epoch: 1228, Training Loss: 0.46036
Epoch: 1228, Training Loss: 0.56215
Epoch: 1229, Training Loss: 0.38668
Epoch: 1229, Training Loss: 0.47011
Epoch: 1229, Training Loss: 0.46007
Epoch: 1229, Training Loss: 0.56256
Epoch: 1230, Training Loss: 0.38596
Epoch: 1230, Training Loss: 0.46983
Epoch: 1230, Training Loss: 0.45979
Epoch: 1230, Training Loss: 0.56297
Epoch: 1231, Training Loss: 0.38524
Epoch: 1231, Training Loss: 0.46954
Epoch: 1231, Training Loss: 0.45950
Epoch: 1231, Training Loss: 0.56339
Epoch: 1232, Training Loss: 0.38452
Epoch: 1232, Training Loss: 0.46924
Epoch: 1232, Training Loss: 0.45920
Epoch: 1232, Training Loss: 0.56381
Epoch: 1233, Training Loss: 0.38379
Epoch: 1233, Training Loss: 0.46895
Epoch: 1233, Training Loss: 0.45890
Epoch: 1233, Training Loss: 0.56423
Epoch: 1234, Training Loss: 0.38306
Epoch: 1234, Training Loss: 0.46865
Epoch: 1234, Training Loss: 0.45860
Epoch: 1234, Training Loss: 0.56465
Epoch: 1235, Training Loss: 0.38232
Epoch: 1235, Training Loss: 0.46835
Epoch: 1235, Training Loss: 0.45830
Epoch: 1235, Training Loss: 0.56507
Epoch: 1236, Training Loss: 0.38157
Epoch: 1236, Training Loss: 0.46804
Epoch: 1236, Training Loss: 0.45800
Epoch: 1236, Training Loss: 0.56550
Epoch: 1237, Training Loss: 0.38082
Epoch: 1237, Training Loss: 0.46773
Epoch: 1237, Training Loss: 0.45769
Epoch: 1237, Training Loss: 0.56593
Epoch: 1238, Training Loss: 0.38006
Epoch: 1238, Training Loss: 0.46742
Epoch: 1238, Training Loss: 0.45737
Epoch: 1238, Training Loss: 0.56637
Epoch: 1239, Training Loss: 0.37930
Epoch: 1239, Training Loss: 0.46711
Epoch: 1239, Training Loss: 0.45706
Epoch: 1239, Training Loss: 0.56680
Epoch: 1240, Training Loss: 0.37854
Epoch: 1240, Training Loss: 0.46679
Epoch: 1240, Training Loss: 0.45674
Epoch: 1240, Training Loss: 0.56724
Epoch: 1241, Training Loss: 0.37776
Epoch: 1241, Training Loss: 0.46647
Epoch: 1241, Training Loss: 0.45642
Epoch: 1241, Training Loss: 0.56768
Epoch: 1242, Training Loss: 0.37699
Epoch: 1242, Training Loss: 0.46615
Epoch: 1242, Training Loss: 0.45609
Epoch: 1242, Training Loss: 0.56812
Epoch: 1243, Training Loss: 0.37620
Epoch: 1243, Training Loss: 0.46582
Epoch: 1243, Training Loss: 0.45576
Epoch: 1243, Training Loss: 0.56857
Epoch: 1244, Training Loss: 0.37542
Epoch: 1244, Training Loss: 0.46549
Epoch: 1244, Training Loss: 0.45543
Epoch: 1244, Training Loss: 0.56902
Epoch: 1245, Training Loss: 0.37462
Epoch: 1245, Training Loss: 0.46515
Epoch: 1245, Training Loss: 0.45510
Epoch: 1245, Training Loss: 0.56947
Epoch: 1246, Training Loss: 0.37382
Epoch: 1246, Training Loss: 0.46482
Epoch: 1246, Training Loss: 0.45476
Epoch: 1246, Training Loss: 0.56993
Epoch: 1247, Training Loss: 0.37302
Epoch: 1247, Training Loss: 0.46448
Epoch: 1247, Training Loss: 0.45442
Epoch: 1247, Training Loss: 0.57038
Epoch: 1248, Training Loss: 0.37221
Epoch: 1248, Training Loss: 0.46413
Epoch: 1248, Training Loss: 0.45407
Epoch: 1248, Training Loss: 0.57084
Epoch: 1249, Training Loss: 0.37140
Epoch: 1249, Training Loss: 0.46379
Epoch: 1249, Training Loss: 0.45372
Epoch: 1249, Training Loss: 0.57130
Epoch: 1250, Training Loss: 0.37058
Epoch: 1250, Training Loss: 0.46344
Epoch: 1250, Training Loss: 0.45337
Epoch: 1250, Training Loss: 0.57177
Epoch: 1251, Training Loss: 0.36975
Epoch: 1251, Training Loss: 0.46308
Epoch: 1251, Training Loss: 0.45302
Epoch: 1251, Training Loss: 0.57223
Epoch: 1252, Training Loss: 0.36892
Epoch: 1252, Training Loss: 0.46273
Epoch: 1252, Training Loss: 0.45266
Epoch: 1252, Training Loss: 0.57270
Epoch: 1253, Training Loss: 0.36809
Epoch: 1253, Training Loss: 0.46237
Epoch: 1253, Training Loss: 0.45230
Epoch: 1253, Training Loss: 0.57318
Epoch: 1254, Training Loss: 0.36725
Epoch: 1254, Training Loss: 0.46200
Epoch: 1254, Training Loss: 0.45193
Epoch: 1254, Training Loss: 0.57365
Epoch: 1255, Training Loss: 0.36640
Epoch: 1255, Training Loss: 0.46164
Epoch: 1255, Training Loss: 0.45157
Epoch: 1255, Training Loss: 0.57413
Epoch: 1256, Training Loss: 0.36555
Epoch: 1256, Training Loss: 0.46126
Epoch: 1256, Training Loss: 0.45120
Epoch: 1256, Training Loss: 0.57461
Epoch: 1257, Training Loss: 0.36470
Epoch: 1257, Training Loss: 0.46089
Epoch: 1257, Training Loss: 0.45082
Epoch: 1257, Training Loss: 0.57509
Epoch: 1258, Training Loss: 0.36384
Epoch: 1258, Training Loss: 0.46051
Epoch: 1258, Training Loss: 0.45044
Epoch: 1258, Training Loss: 0.57557
Epoch: 1259, Training Loss: 0.36297
Epoch: 1259, Training Loss: 0.46013
Epoch: 1259, Training Loss: 0.45006
Epoch: 1259, Training Loss: 0.57606
Epoch: 1260, Training Loss: 0.36210
Epoch: 1260, Training Loss: 0.45975
Epoch: 1260, Training Loss: 0.44968
Epoch: 1260, Training Loss: 0.57655
Epoch: 1261, Training Loss: 0.36122
Epoch: 1261, Training Loss: 0.45936
Epoch: 1261, Training Loss: 0.44929
Epoch: 1261, Training Loss: 0.57704
Epoch: 1262, Training Loss: 0.36034
Epoch: 1262, Training Loss: 0.45897
Epoch: 1262, Training Loss: 0.44890
Epoch: 1262, Training Loss: 0.57754
Epoch: 1263, Training Loss: 0.35946
Epoch: 1263, Training Loss: 0.45858
Epoch: 1263, Training Loss: 0.44851
Epoch: 1263, Training Loss: 0.57803
Epoch: 1264, Training Loss: 0.35857
Epoch: 1264, Training Loss: 0.45818
Epoch: 1264, Training Loss: 0.44811
Epoch: 1264, Training Loss: 0.57853
Epoch: 1265, Training Loss: 0.35767
Epoch: 1265, Training Loss: 0.45778
Epoch: 1265, Training Loss: 0.44771
Epoch: 1265, Training Loss: 0.57903
Epoch: 1266, Training Loss: 0.35677
Epoch: 1266, Training Loss: 0.45738
Epoch: 1266, Training Loss: 0.44730
Epoch: 1266, Training Loss: 0.57953
Epoch: 1267, Training Loss: 0.35587
Epoch: 1267, Training Loss: 0.45697
Epoch: 1267, Training Loss: 0.44690
Epoch: 1267, Training Loss: 0.58004
Epoch: 1268, Training Loss: 0.35496
Epoch: 1268, Training Loss: 0.45656
Epoch: 1268, Training Loss: 0.44649
Epoch: 1268, Training Loss: 0.58055
Epoch: 1269, Training Loss: 0.35405
Epoch: 1269, Training Loss: 0.45614
Epoch: 1269, Training Loss: 0.44607
Epoch: 1269, Training Loss: 0.58105
Epoch: 1270, Training Loss: 0.35313
Epoch: 1270, Training Loss: 0.45573
Epoch: 1270, Training Loss: 0.44566
Epoch: 1270, Training Loss: 0.58157
Epoch: 1271, Training Loss: 0.35221
Epoch: 1271, Training Loss: 0.45531
Epoch: 1271, Training Loss: 0.44524
Epoch: 1271, Training Loss: 0.58208
Epoch: 1272, Training Loss: 0.35128
Epoch: 1272, Training Loss: 0.45488
Epoch: 1272, Training Loss: 0.44481
Epoch: 1272, Training Loss: 0.58259
Epoch: 1273, Training Loss: 0.35035
Epoch: 1273, Training Loss: 0.45446
Epoch: 1273, Training Loss: 0.44439
Epoch: 1273, Training Loss: 0.58311
Epoch: 1274, Training Loss: 0.34941
Epoch: 1274, Training Loss: 0.45403
Epoch: 1274, Training Loss: 0.44396
Epoch: 1274, Training Loss: 0.58363
Epoch: 1275, Training Loss: 0.34847
Epoch: 1275, Training Loss: 0.45359
Epoch: 1275, Training Loss: 0.44352
Epoch: 1275, Training Loss: 0.58415
Epoch: 1276, Training Loss: 0.34753
Epoch: 1276, Training Loss: 0.45316
Epoch: 1276, Training Loss: 0.44309
Epoch: 1276, Training Loss: 0.58467
Epoch: 1277, Training Loss: 0.34658
Epoch: 1277, Training Loss: 0.45272
Epoch: 1277, Training Loss: 0.44265
Epoch: 1277, Training Loss: 0.58520
Epoch: 1278, Training Loss: 0.34563
Epoch: 1278, Training Loss: 0.45227
Epoch: 1278, Training Loss: 0.44221
Epoch: 1278, Training Loss: 0.58573
Epoch: 1279, Training Loss: 0.34468
Epoch: 1279, Training Loss: 0.45183
Epoch: 1279, Training Loss: 0.44176
Epoch: 1279, Training Loss: 0.58625
Epoch: 1280, Training Loss: 0.34372
Epoch: 1280, Training Loss: 0.45138
Epoch: 1280, Training Loss: 0.44131
Epoch: 1280, Training Loss: 0.58678
Epoch: 1281, Training Loss: 0.34275
Epoch: 1281, Training Loss: 0.45093
Epoch: 1281, Training Loss: 0.44086
Epoch: 1281, Training Loss: 0.58731
Epoch: 1282, Training Loss: 0.34179
Epoch: 1282, Training Loss: 0.45047
Epoch: 1282, Training Loss: 0.44041
Epoch: 1282, Training Loss: 0.58785
Epoch: 1283, Training Loss: 0.34082
Epoch: 1283, Training Loss: 0.45001
Epoch: 1283, Training Loss: 0.43995
Epoch: 1283, Training Loss: 0.58838
Epoch: 1284, Training Loss: 0.33984
Epoch: 1284, Training Loss: 0.44955
Epoch: 1284, Training Loss: 0.43949
Epoch: 1284, Training Loss: 0.58892
Epoch: 1285, Training Loss: 0.33886
Epoch: 1285, Training Loss: 0.44909
Epoch: 1285, Training Loss: 0.43903
Epoch: 1285, Training Loss: 0.58945
Epoch: 1286, Training Loss: 0.33788
Epoch: 1286, Training Loss: 0.44862
Epoch: 1286, Training Loss: 0.43857
Epoch: 1286, Training Loss: 0.58999
Epoch: 1287, Training Loss: 0.33690
Epoch: 1287, Training Loss: 0.44815
Epoch: 1287, Training Loss: 0.43810
Epoch: 1287, Training Loss: 0.59053
Epoch: 1288, Training Loss: 0.33591
Epoch: 1288, Training Loss: 0.44768
Epoch: 1288, Training Loss: 0.43763
Epoch: 1288, Training Loss: 0.59107
Epoch: 1289, Training Loss: 0.33492
Epoch: 1289, Training Loss: 0.44720
Epoch: 1289, Training Loss: 0.43715
Epoch: 1289, Training Loss: 0.59161
Epoch: 1290, Training Loss: 0.33393
Epoch: 1290, Training Loss: 0.44672
Epoch: 1290, Training Loss: 0.43668
Epoch: 1290, Training Loss: 0.59216
Epoch: 1291, Training Loss: 0.33293
Epoch: 1291, Training Loss: 0.44624
Epoch: 1291, Training Loss: 0.43620
Epoch: 1291, Training Loss: 0.59270
Epoch: 1292, Training Loss: 0.33193
Epoch: 1292, Training Loss: 0.44576
Epoch: 1292, Training Loss: 0.43572
Epoch: 1292, Training Loss: 0.59324
Epoch: 1293, Training Loss: 0.33093
Epoch: 1293, Training Loss: 0.44527
Epoch: 1293, Training Loss: 0.43523
Epoch: 1293, Training Loss: 0.59379
Epoch: 1294, Training Loss: 0.32993
Epoch: 1294, Training Loss: 0.44478
Epoch: 1294, Training Loss: 0.43475
Epoch: 1294, Training Loss: 0.59434
Epoch: 1295, Training Loss: 0.32892
Epoch: 1295, Training Loss: 0.44429
Epoch: 1295, Training Loss: 0.43426
Epoch: 1295, Training Loss: 0.59488
Epoch: 1296, Training Loss: 0.32791
Epoch: 1296, Training Loss: 0.44379
Epoch: 1296, Training Loss: 0.43377
Epoch: 1296, Training Loss: 0.59543
Epoch: 1297, Training Loss: 0.32690
Epoch: 1297, Training Loss: 0.44330
Epoch: 1297, Training Loss: 0.43327
Epoch: 1297, Training Loss: 0.59598
Epoch: 1298, Training Loss: 0.32588
Epoch: 1298, Training Loss: 0.44280
Epoch: 1298, Training Loss: 0.43278
Epoch: 1298, Training Loss: 0.59653
Epoch: 1299, Training Loss: 0.32487
Epoch: 1299, Training Loss: 0.44230
Epoch: 1299, Training Loss: 0.43228
Epoch: 1299, Training Loss: 0.59708
Epoch: 1300, Training Loss: 0.32385
Epoch: 1300, Training Loss: 0.44179
Epoch: 1300, Training Loss: 0.43178
Epoch: 1300, Training Loss: 0.59763
Epoch: 1301, Training Loss: 0.32283
Epoch: 1301, Training Loss: 0.44129
Epoch: 1301, Training Loss: 0.43128
Epoch: 1301, Training Loss: 0.59817
Epoch: 1302, Training Loss: 0.32181
Epoch: 1302, Training Loss: 0.44078
Epoch: 1302, Training Loss: 0.43077
Epoch: 1302, Training Loss: 0.59872
Epoch: 1303, Training Loss: 0.32078
Epoch: 1303, Training Loss: 0.44027
Epoch: 1303, Training Loss: 0.43027
Epoch: 1303, Training Loss: 0.59927
Epoch: 1304, Training Loss: 0.31976
Epoch: 1304, Training Loss: 0.43976
Epoch: 1304, Training Loss: 0.42976
Epoch: 1304, Training Loss: 0.59983
Epoch: 1305, Training Loss: 0.31873
Epoch: 1305, Training Loss: 0.43924
Epoch: 1305, Training Loss: 0.42925
Epoch: 1305, Training Loss: 0.60038
Epoch: 1306, Training Loss: 0.31770
Epoch: 1306, Training Loss: 0.43873
Epoch: 1306, Training Loss: 0.42874
Epoch: 1306, Training Loss: 0.60093
Epoch: 1307, Training Loss: 0.31667
Epoch: 1307, Training Loss: 0.43821
Epoch: 1307, Training Loss: 0.42822
Epoch: 1307, Training Loss: 0.60148
Epoch: 1308, Training Loss: 0.31564
Epoch: 1308, Training Loss: 0.43769
Epoch: 1308, Training Loss: 0.42771
Epoch: 1308, Training Loss: 0.60203
Epoch: 1309, Training Loss: 0.31461
Epoch: 1309, Training Loss: 0.43717
Epoch: 1309, Training Loss: 0.42719
Epoch: 1309, Training Loss: 0.60257
Epoch: 1310, Training Loss: 0.31357
Epoch: 1310, Training Loss: 0.43665
Epoch: 1310, Training Loss: 0.42667
Epoch: 1310, Training Loss: 0.60312
Epoch: 1311, Training Loss: 0.31254
Epoch: 1311, Training Loss: 0.43612
Epoch: 1311, Training Loss: 0.42615
Epoch: 1311, Training Loss: 0.60367
Epoch: 1312, Training Loss: 0.31150
Epoch: 1312, Training Loss: 0.43560
Epoch: 1312, Training Loss: 0.42563
Epoch: 1312, Training Loss: 0.60422
Epoch: 1313, Training Loss: 0.31047
Epoch: 1313, Training Loss: 0.43507
Epoch: 1313, Training Loss: 0.42511
Epoch: 1313, Training Loss: 0.60477
Epoch: 1314, Training Loss: 0.30943
Epoch: 1314, Training Loss: 0.43454
Epoch: 1314, Training Loss: 0.42459
Epoch: 1314, Training Loss: 0.60531
Epoch: 1315, Training Loss: 0.30839
Epoch: 1315, Training Loss: 0.43401
Epoch: 1315, Training Loss: 0.42406
Epoch: 1315, Training Loss: 0.60586
Epoch: 1316, Training Loss: 0.30736
Epoch: 1316, Training Loss: 0.43348
Epoch: 1316, Training Loss: 0.42353
Epoch: 1316, Training Loss: 0.60641
Epoch: 1317, Training Loss: 0.30632
Epoch: 1317, Training Loss: 0.43294
Epoch: 1317, Training Loss: 0.42301
Epoch: 1317, Training Loss: 0.60695
Epoch: 1318, Training Loss: 0.30528
Epoch: 1318, Training Loss: 0.43241
Epoch: 1318, Training Loss: 0.42248
Epoch: 1318, Training Loss: 0.60749
Epoch: 1319, Training Loss: 0.30424
Epoch: 1319, Training Loss: 0.43188
Epoch: 1319, Training Loss: 0.42195
Epoch: 1319, Training Loss: 0.60804
Epoch: 1320, Training Loss: 0.30321
Epoch: 1320, Training Loss: 0.43134
Epoch: 1320, Training Loss: 0.42142
Epoch: 1320, Training Loss: 0.60858
Epoch: 1321, Training Loss: 0.30217
Epoch: 1321, Training Loss: 0.43080
Epoch: 1321, Training Loss: 0.42089
Epoch: 1321, Training Loss: 0.60912
Epoch: 1322, Training Loss: 0.30113
Epoch: 1322, Training Loss: 0.43027
Epoch: 1322, Training Loss: 0.42036
Epoch: 1322, Training Loss: 0.60966
Epoch: 1323, Training Loss: 0.30009
Epoch: 1323, Training Loss: 0.42973
Epoch: 1323, Training Loss: 0.41982
Epoch: 1323, Training Loss: 0.61019
Epoch: 1324, Training Loss: 0.29906
Epoch: 1324, Training Loss: 0.42919
Epoch: 1324, Training Loss: 0.41929
Epoch: 1324, Training Loss: 0.61073
Epoch: 1325, Training Loss: 0.29802
Epoch: 1325, Training Loss: 0.42865
Epoch: 1325, Training Loss: 0.41876
Epoch: 1325, Training Loss: 0.61126
Epoch: 1326, Training Loss: 0.29698
Epoch: 1326, Training Loss: 0.42811
Epoch: 1326, Training Loss: 0.41822
Epoch: 1326, Training Loss: 0.61180
Epoch: 1327, Training Loss: 0.29595
Epoch: 1327, Training Loss: 0.42757
Epoch: 1327, Training Loss: 0.41769
Epoch: 1327, Training Loss: 0.61233
[... output truncated: epochs 1328–1576 omitted. Each epoch prints one loss per XOR input pattern; at epoch 1328 the four values are 0.29491, 0.42703, 0.41715, 0.61286 ...]
Epoch: 1577, Training Loss: 0.13876
Epoch: 1577, Training Loss: 0.35199
Epoch: 1577, Training Loss: 0.34266
Epoch: 1577, Training Loss: 0.66323

Over this span three of the four per-pattern losses decrease slowly (0.295 → 0.139, 0.427 → 0.352, 0.417 → 0.343) while the fourth rises (0.613 → 0.663); all remain far above the 0.008 error target at epoch 1577.
Epoch: 1578, Training Loss: 0.13847
Epoch: 1578, Training Loss: 0.35189
Epoch: 1578, Training Loss: 0.34256
Epoch: 1578, Training Loss: 0.66325
Epoch: 1579, Training Loss: 0.13818
Epoch: 1579, Training Loss: 0.35180
Epoch: 1579, Training Loss: 0.34246
Epoch: 1579, Training Loss: 0.66326
Epoch: 1580, Training Loss: 0.13790
Epoch: 1580, Training Loss: 0.35170
Epoch: 1580, Training Loss: 0.34236
Epoch: 1580, Training Loss: 0.66328
Epoch: 1581, Training Loss: 0.13761
Epoch: 1581, Training Loss: 0.35160
Epoch: 1581, Training Loss: 0.34227
Epoch: 1581, Training Loss: 0.66329
Epoch: 1582, Training Loss: 0.13733
Epoch: 1582, Training Loss: 0.35151
Epoch: 1582, Training Loss: 0.34217
Epoch: 1582, Training Loss: 0.66331
Epoch: 1583, Training Loss: 0.13705
Epoch: 1583, Training Loss: 0.35141
Epoch: 1583, Training Loss: 0.34207
Epoch: 1583, Training Loss: 0.66332
Epoch: 1584, Training Loss: 0.13677
Epoch: 1584, Training Loss: 0.35132
Epoch: 1584, Training Loss: 0.34198
Epoch: 1584, Training Loss: 0.66333
Epoch: 1585, Training Loss: 0.13649
Epoch: 1585, Training Loss: 0.35123
Epoch: 1585, Training Loss: 0.34188
Epoch: 1585, Training Loss: 0.66334
Epoch: 1586, Training Loss: 0.13621
Epoch: 1586, Training Loss: 0.35113
Epoch: 1586, Training Loss: 0.34179
Epoch: 1586, Training Loss: 0.66336
Epoch: 1587, Training Loss: 0.13593
Epoch: 1587, Training Loss: 0.35104
Epoch: 1587, Training Loss: 0.34169
Epoch: 1587, Training Loss: 0.66337
Epoch: 1588, Training Loss: 0.13566
Epoch: 1588, Training Loss: 0.35095
Epoch: 1588, Training Loss: 0.34160
Epoch: 1588, Training Loss: 0.66338
Epoch: 1589, Training Loss: 0.13538
Epoch: 1589, Training Loss: 0.35086
Epoch: 1589, Training Loss: 0.34151
Epoch: 1589, Training Loss: 0.66339
Epoch: 1590, Training Loss: 0.13511
Epoch: 1590, Training Loss: 0.35077
Epoch: 1590, Training Loss: 0.34141
Epoch: 1590, Training Loss: 0.66340
Epoch: 1591, Training Loss: 0.13484
Epoch: 1591, Training Loss: 0.35068
Epoch: 1591, Training Loss: 0.34132
Epoch: 1591, Training Loss: 0.66341
Epoch: 1592, Training Loss: 0.13457
Epoch: 1592, Training Loss: 0.35059
Epoch: 1592, Training Loss: 0.34123
Epoch: 1592, Training Loss: 0.66342
Epoch: 1593, Training Loss: 0.13430
Epoch: 1593, Training Loss: 0.35050
Epoch: 1593, Training Loss: 0.34114
Epoch: 1593, Training Loss: 0.66343
Epoch: 1594, Training Loss: 0.13404
Epoch: 1594, Training Loss: 0.35042
Epoch: 1594, Training Loss: 0.34105
Epoch: 1594, Training Loss: 0.66344
Epoch: 1595, Training Loss: 0.13377
Epoch: 1595, Training Loss: 0.35033
Epoch: 1595, Training Loss: 0.34096
Epoch: 1595, Training Loss: 0.66345
Epoch: 1596, Training Loss: 0.13351
Epoch: 1596, Training Loss: 0.35024
Epoch: 1596, Training Loss: 0.34088
Epoch: 1596, Training Loss: 0.66346
Epoch: 1597, Training Loss: 0.13324
Epoch: 1597, Training Loss: 0.35016
Epoch: 1597, Training Loss: 0.34079
Epoch: 1597, Training Loss: 0.66346
Epoch: 1598, Training Loss: 0.13298
Epoch: 1598, Training Loss: 0.35007
Epoch: 1598, Training Loss: 0.34070
Epoch: 1598, Training Loss: 0.66347
Epoch: 1599, Training Loss: 0.13272
Epoch: 1599, Training Loss: 0.34999
Epoch: 1599, Training Loss: 0.34062
Epoch: 1599, Training Loss: 0.66348
Epoch: 1600, Training Loss: 0.13246
Epoch: 1600, Training Loss: 0.34990
Epoch: 1600, Training Loss: 0.34053
Epoch: 1600, Training Loss: 0.66349
Epoch: 1601, Training Loss: 0.13220
Epoch: 1601, Training Loss: 0.34982
Epoch: 1601, Training Loss: 0.34044
Epoch: 1601, Training Loss: 0.66349
Epoch: 1602, Training Loss: 0.13195
Epoch: 1602, Training Loss: 0.34974
Epoch: 1602, Training Loss: 0.34036
Epoch: 1602, Training Loss: 0.66350
Epoch: 1603, Training Loss: 0.13169
Epoch: 1603, Training Loss: 0.34966
Epoch: 1603, Training Loss: 0.34028
Epoch: 1603, Training Loss: 0.66351
Epoch: 1604, Training Loss: 0.13144
Epoch: 1604, Training Loss: 0.34957
Epoch: 1604, Training Loss: 0.34019
Epoch: 1604, Training Loss: 0.66351
Epoch: 1605, Training Loss: 0.13119
Epoch: 1605, Training Loss: 0.34949
Epoch: 1605, Training Loss: 0.34011
Epoch: 1605, Training Loss: 0.66352
Epoch: 1606, Training Loss: 0.13093
Epoch: 1606, Training Loss: 0.34941
Epoch: 1606, Training Loss: 0.34003
Epoch: 1606, Training Loss: 0.66353
Epoch: 1607, Training Loss: 0.13068
Epoch: 1607, Training Loss: 0.34933
Epoch: 1607, Training Loss: 0.33994
Epoch: 1607, Training Loss: 0.66353
Epoch: 1608, Training Loss: 0.13043
Epoch: 1608, Training Loss: 0.34925
Epoch: 1608, Training Loss: 0.33986
Epoch: 1608, Training Loss: 0.66354
Epoch: 1609, Training Loss: 0.13019
Epoch: 1609, Training Loss: 0.34917
Epoch: 1609, Training Loss: 0.33978
Epoch: 1609, Training Loss: 0.66354
Epoch: 1610, Training Loss: 0.12994
Epoch: 1610, Training Loss: 0.34910
Epoch: 1610, Training Loss: 0.33970
Epoch: 1610, Training Loss: 0.66354
Epoch: 1611, Training Loss: 0.12969
Epoch: 1611, Training Loss: 0.34902
Epoch: 1611, Training Loss: 0.33962
Epoch: 1611, Training Loss: 0.66355
Epoch: 1612, Training Loss: 0.12945
Epoch: 1612, Training Loss: 0.34894
Epoch: 1612, Training Loss: 0.33954
Epoch: 1612, Training Loss: 0.66355
Epoch: 1613, Training Loss: 0.12921
Epoch: 1613, Training Loss: 0.34886
Epoch: 1613, Training Loss: 0.33946
Epoch: 1613, Training Loss: 0.66356
Epoch: 1614, Training Loss: 0.12896
Epoch: 1614, Training Loss: 0.34879
Epoch: 1614, Training Loss: 0.33939
Epoch: 1614, Training Loss: 0.66356
Epoch: 1615, Training Loss: 0.12872
Epoch: 1615, Training Loss: 0.34871
Epoch: 1615, Training Loss: 0.33931
Epoch: 1615, Training Loss: 0.66356
Epoch: 1616, Training Loss: 0.12848
Epoch: 1616, Training Loss: 0.34864
Epoch: 1616, Training Loss: 0.33923
Epoch: 1616, Training Loss: 0.66357
Epoch: 1617, Training Loss: 0.12824
Epoch: 1617, Training Loss: 0.34856
Epoch: 1617, Training Loss: 0.33915
Epoch: 1617, Training Loss: 0.66357
Epoch: 1618, Training Loss: 0.12801
Epoch: 1618, Training Loss: 0.34849
Epoch: 1618, Training Loss: 0.33908
Epoch: 1618, Training Loss: 0.66357
Epoch: 1619, Training Loss: 0.12777
Epoch: 1619, Training Loss: 0.34842
Epoch: 1619, Training Loss: 0.33900
Epoch: 1619, Training Loss: 0.66357
Epoch: 1620, Training Loss: 0.12753
Epoch: 1620, Training Loss: 0.34834
Epoch: 1620, Training Loss: 0.33893
Epoch: 1620, Training Loss: 0.66358
Epoch: 1621, Training Loss: 0.12730
Epoch: 1621, Training Loss: 0.34827
Epoch: 1621, Training Loss: 0.33885
Epoch: 1621, Training Loss: 0.66358
Epoch: 1622, Training Loss: 0.12707
Epoch: 1622, Training Loss: 0.34820
Epoch: 1622, Training Loss: 0.33878
Epoch: 1622, Training Loss: 0.66358
Epoch: 1623, Training Loss: 0.12683
Epoch: 1623, Training Loss: 0.34813
Epoch: 1623, Training Loss: 0.33870
Epoch: 1623, Training Loss: 0.66358
Epoch: 1624, Training Loss: 0.12660
Epoch: 1624, Training Loss: 0.34805
Epoch: 1624, Training Loss: 0.33863
Epoch: 1624, Training Loss: 0.66358
Epoch: 1625, Training Loss: 0.12637
Epoch: 1625, Training Loss: 0.34798
Epoch: 1625, Training Loss: 0.33856
Epoch: 1625, Training Loss: 0.66358
Epoch: 1626, Training Loss: 0.12614
Epoch: 1626, Training Loss: 0.34791
Epoch: 1626, Training Loss: 0.33849
Epoch: 1626, Training Loss: 0.66358
Epoch: 1627, Training Loss: 0.12592
Epoch: 1627, Training Loss: 0.34784
Epoch: 1627, Training Loss: 0.33841
Epoch: 1627, Training Loss: 0.66358
Epoch: 1628, Training Loss: 0.12569
Epoch: 1628, Training Loss: 0.34777
Epoch: 1628, Training Loss: 0.33834
Epoch: 1628, Training Loss: 0.66358
Epoch: 1629, Training Loss: 0.12546
Epoch: 1629, Training Loss: 0.34771
Epoch: 1629, Training Loss: 0.33827
Epoch: 1629, Training Loss: 0.66358
Epoch: 1630, Training Loss: 0.12524
Epoch: 1630, Training Loss: 0.34764
Epoch: 1630, Training Loss: 0.33820
Epoch: 1630, Training Loss: 0.66358
Epoch: 1631, Training Loss: 0.12502
Epoch: 1631, Training Loss: 0.34757
Epoch: 1631, Training Loss: 0.33813
Epoch: 1631, Training Loss: 0.66358
Epoch: 1632, Training Loss: 0.12479
Epoch: 1632, Training Loss: 0.34750
Epoch: 1632, Training Loss: 0.33806
Epoch: 1632, Training Loss: 0.66358
Epoch: 1633, Training Loss: 0.12457
Epoch: 1633, Training Loss: 0.34743
Epoch: 1633, Training Loss: 0.33799
Epoch: 1633, Training Loss: 0.66358
Epoch: 1634, Training Loss: 0.12435
Epoch: 1634, Training Loss: 0.34737
Epoch: 1634, Training Loss: 0.33792
Epoch: 1634, Training Loss: 0.66358
Epoch: 1635, Training Loss: 0.12413
Epoch: 1635, Training Loss: 0.34730
Epoch: 1635, Training Loss: 0.33785
Epoch: 1635, Training Loss: 0.66358
Epoch: 1636, Training Loss: 0.12391
Epoch: 1636, Training Loss: 0.34724
Epoch: 1636, Training Loss: 0.33779
Epoch: 1636, Training Loss: 0.66358
Epoch: 1637, Training Loss: 0.12369
Epoch: 1637, Training Loss: 0.34717
Epoch: 1637, Training Loss: 0.33772
Epoch: 1637, Training Loss: 0.66357
Epoch: 1638, Training Loss: 0.12348
Epoch: 1638, Training Loss: 0.34711
Epoch: 1638, Training Loss: 0.33765
Epoch: 1638, Training Loss: 0.66357
Epoch: 1639, Training Loss: 0.12326
Epoch: 1639, Training Loss: 0.34704
Epoch: 1639, Training Loss: 0.33758
Epoch: 1639, Training Loss: 0.66357
Epoch: 1640, Training Loss: 0.12305
Epoch: 1640, Training Loss: 0.34698
Epoch: 1640, Training Loss: 0.33752
Epoch: 1640, Training Loss: 0.66357
Epoch: 1641, Training Loss: 0.12283
Epoch: 1641, Training Loss: 0.34691
Epoch: 1641, Training Loss: 0.33745
Epoch: 1641, Training Loss: 0.66357
Epoch: 1642, Training Loss: 0.12262
Epoch: 1642, Training Loss: 0.34685
Epoch: 1642, Training Loss: 0.33739
Epoch: 1642, Training Loss: 0.66356
Epoch: 1643, Training Loss: 0.12241
Epoch: 1643, Training Loss: 0.34679
Epoch: 1643, Training Loss: 0.33732
Epoch: 1643, Training Loss: 0.66356
Epoch: 1644, Training Loss: 0.12220
Epoch: 1644, Training Loss: 0.34672
Epoch: 1644, Training Loss: 0.33726
Epoch: 1644, Training Loss: 0.66356
Epoch: 1645, Training Loss: 0.12199
Epoch: 1645, Training Loss: 0.34666
Epoch: 1645, Training Loss: 0.33719
Epoch: 1645, Training Loss: 0.66355
Epoch: 1646, Training Loss: 0.12178
Epoch: 1646, Training Loss: 0.34660
Epoch: 1646, Training Loss: 0.33713
Epoch: 1646, Training Loss: 0.66355
Epoch: 1647, Training Loss: 0.12157
Epoch: 1647, Training Loss: 0.34654
Epoch: 1647, Training Loss: 0.33706
Epoch: 1647, Training Loss: 0.66355
Epoch: 1648, Training Loss: 0.12136
Epoch: 1648, Training Loss: 0.34648
Epoch: 1648, Training Loss: 0.33700
Epoch: 1648, Training Loss: 0.66354
Epoch: 1649, Training Loss: 0.12116
Epoch: 1649, Training Loss: 0.34642
Epoch: 1649, Training Loss: 0.33694
Epoch: 1649, Training Loss: 0.66354
Epoch: 1650, Training Loss: 0.12095
Epoch: 1650, Training Loss: 0.34636
Epoch: 1650, Training Loss: 0.33688
Epoch: 1650, Training Loss: 0.66354
Epoch: 1651, Training Loss: 0.12075
Epoch: 1651, Training Loss: 0.34630
Epoch: 1651, Training Loss: 0.33681
Epoch: 1651, Training Loss: 0.66353
Epoch: 1652, Training Loss: 0.12054
Epoch: 1652, Training Loss: 0.34624
Epoch: 1652, Training Loss: 0.33675
Epoch: 1652, Training Loss: 0.66353
Epoch: 1653, Training Loss: 0.12034
Epoch: 1653, Training Loss: 0.34618
Epoch: 1653, Training Loss: 0.33669
Epoch: 1653, Training Loss: 0.66352
Epoch: 1654, Training Loss: 0.12014
Epoch: 1654, Training Loss: 0.34612
Epoch: 1654, Training Loss: 0.33663
Epoch: 1654, Training Loss: 0.66352
Epoch: 1655, Training Loss: 0.11994
Epoch: 1655, Training Loss: 0.34606
Epoch: 1655, Training Loss: 0.33657
Epoch: 1655, Training Loss: 0.66351
Epoch: 1656, Training Loss: 0.11974
Epoch: 1656, Training Loss: 0.34600
Epoch: 1656, Training Loss: 0.33651
Epoch: 1656, Training Loss: 0.66351
Epoch: 1657, Training Loss: 0.11954
Epoch: 1657, Training Loss: 0.34595
Epoch: 1657, Training Loss: 0.33645
Epoch: 1657, Training Loss: 0.66350
Epoch: 1658, Training Loss: 0.11934
Epoch: 1658, Training Loss: 0.34589
Epoch: 1658, Training Loss: 0.33639
Epoch: 1658, Training Loss: 0.66350
Epoch: 1659, Training Loss: 0.11914
Epoch: 1659, Training Loss: 0.34583
Epoch: 1659, Training Loss: 0.33633
Epoch: 1659, Training Loss: 0.66349
Epoch: 1660, Training Loss: 0.11895
Epoch: 1660, Training Loss: 0.34577
Epoch: 1660, Training Loss: 0.33627
Epoch: 1660, Training Loss: 0.66349
Epoch: 1661, Training Loss: 0.11875
Epoch: 1661, Training Loss: 0.34572
Epoch: 1661, Training Loss: 0.33621
Epoch: 1661, Training Loss: 0.66348
Epoch: 1662, Training Loss: 0.11856
Epoch: 1662, Training Loss: 0.34566
Epoch: 1662, Training Loss: 0.33615
Epoch: 1662, Training Loss: 0.66348
Epoch: 1663, Training Loss: 0.11836
Epoch: 1663, Training Loss: 0.34561
Epoch: 1663, Training Loss: 0.33610
Epoch: 1663, Training Loss: 0.66347
Epoch: 1664, Training Loss: 0.11817
Epoch: 1664, Training Loss: 0.34555
Epoch: 1664, Training Loss: 0.33604
Epoch: 1664, Training Loss: 0.66346
Epoch: 1665, Training Loss: 0.11798
Epoch: 1665, Training Loss: 0.34550
Epoch: 1665, Training Loss: 0.33598
Epoch: 1665, Training Loss: 0.66346
Epoch: 1666, Training Loss: 0.11778
Epoch: 1666, Training Loss: 0.34544
Epoch: 1666, Training Loss: 0.33593
Epoch: 1666, Training Loss: 0.66345
Epoch: 1667, Training Loss: 0.11759
Epoch: 1667, Training Loss: 0.34539
Epoch: 1667, Training Loss: 0.33587
Epoch: 1667, Training Loss: 0.66344
Epoch: 1668, Training Loss: 0.11740
Epoch: 1668, Training Loss: 0.34533
Epoch: 1668, Training Loss: 0.33581
Epoch: 1668, Training Loss: 0.66344
Epoch: 1669, Training Loss: 0.11721
Epoch: 1669, Training Loss: 0.34528
Epoch: 1669, Training Loss: 0.33576
Epoch: 1669, Training Loss: 0.66343
Epoch: 1670, Training Loss: 0.11703
Epoch: 1670, Training Loss: 0.34523
Epoch: 1670, Training Loss: 0.33570
Epoch: 1670, Training Loss: 0.66342
Epoch: 1671, Training Loss: 0.11684
Epoch: 1671, Training Loss: 0.34517
Epoch: 1671, Training Loss: 0.33565
Epoch: 1671, Training Loss: 0.66342
Epoch: 1672, Training Loss: 0.11665
Epoch: 1672, Training Loss: 0.34512
Epoch: 1672, Training Loss: 0.33559
Epoch: 1672, Training Loss: 0.66341
Epoch: 1673, Training Loss: 0.11647
Epoch: 1673, Training Loss: 0.34507
Epoch: 1673, Training Loss: 0.33554
Epoch: 1673, Training Loss: 0.66340
Epoch: 1674, Training Loss: 0.11628
Epoch: 1674, Training Loss: 0.34502
Epoch: 1674, Training Loss: 0.33548
Epoch: 1674, Training Loss: 0.66340
Epoch: 1675, Training Loss: 0.11610
Epoch: 1675, Training Loss: 0.34497
Epoch: 1675, Training Loss: 0.33543
Epoch: 1675, Training Loss: 0.66339
Epoch: 1676, Training Loss: 0.11591
Epoch: 1676, Training Loss: 0.34491
Epoch: 1676, Training Loss: 0.33537
Epoch: 1676, Training Loss: 0.66338
Epoch: 1677, Training Loss: 0.11573
Epoch: 1677, Training Loss: 0.34486
Epoch: 1677, Training Loss: 0.33532
Epoch: 1677, Training Loss: 0.66337
Epoch: 1678, Training Loss: 0.11555
Epoch: 1678, Training Loss: 0.34481
Epoch: 1678, Training Loss: 0.33527
Epoch: 1678, Training Loss: 0.66336
Epoch: 1679, Training Loss: 0.11537
Epoch: 1679, Training Loss: 0.34476
Epoch: 1679, Training Loss: 0.33521
Epoch: 1679, Training Loss: 0.66336
Epoch: 1680, Training Loss: 0.11519
Epoch: 1680, Training Loss: 0.34471
Epoch: 1680, Training Loss: 0.33516
Epoch: 1680, Training Loss: 0.66335
Epoch: 1681, Training Loss: 0.11501
Epoch: 1681, Training Loss: 0.34466
Epoch: 1681, Training Loss: 0.33511
Epoch: 1681, Training Loss: 0.66334
Epoch: 1682, Training Loss: 0.11483
Epoch: 1682, Training Loss: 0.34461
Epoch: 1682, Training Loss: 0.33506
Epoch: 1682, Training Loss: 0.66333
Epoch: 1683, Training Loss: 0.11465
Epoch: 1683, Training Loss: 0.34456
Epoch: 1683, Training Loss: 0.33501
Epoch: 1683, Training Loss: 0.66332
Epoch: 1684, Training Loss: 0.11447
Epoch: 1684, Training Loss: 0.34451
Epoch: 1684, Training Loss: 0.33495
Epoch: 1684, Training Loss: 0.66331
Epoch: 1685, Training Loss: 0.11430
Epoch: 1685, Training Loss: 0.34446
Epoch: 1685, Training Loss: 0.33490
Epoch: 1685, Training Loss: 0.66330
Epoch: 1686, Training Loss: 0.11412
Epoch: 1686, Training Loss: 0.34442
Epoch: 1686, Training Loss: 0.33485
Epoch: 1686, Training Loss: 0.66330
Epoch: 1687, Training Loss: 0.11394
Epoch: 1687, Training Loss: 0.34437
Epoch: 1687, Training Loss: 0.33480
Epoch: 1687, Training Loss: 0.66329
Epoch: 1688, Training Loss: 0.11377
Epoch: 1688, Training Loss: 0.34432
Epoch: 1688, Training Loss: 0.33475
Epoch: 1688, Training Loss: 0.66328
Epoch: 1689, Training Loss: 0.11360
Epoch: 1689, Training Loss: 0.34427
Epoch: 1689, Training Loss: 0.33470
Epoch: 1689, Training Loss: 0.66327
Epoch: 1690, Training Loss: 0.11342
Epoch: 1690, Training Loss: 0.34422
Epoch: 1690, Training Loss: 0.33465
Epoch: 1690, Training Loss: 0.66326
Epoch: 1691, Training Loss: 0.11325
Epoch: 1691, Training Loss: 0.34418
Epoch: 1691, Training Loss: 0.33460
Epoch: 1691, Training Loss: 0.66325
Epoch: 1692, Training Loss: 0.11308
Epoch: 1692, Training Loss: 0.34413
Epoch: 1692, Training Loss: 0.33455
Epoch: 1692, Training Loss: 0.66324
Epoch: 1693, Training Loss: 0.11291
Epoch: 1693, Training Loss: 0.34408
Epoch: 1693, Training Loss: 0.33450
Epoch: 1693, Training Loss: 0.66323
Epoch: 1694, Training Loss: 0.11274
Epoch: 1694, Training Loss: 0.34404
Epoch: 1694, Training Loss: 0.33445
Epoch: 1694, Training Loss: 0.66322
Epoch: 1695, Training Loss: 0.11257
Epoch: 1695, Training Loss: 0.34399
Epoch: 1695, Training Loss: 0.33441
Epoch: 1695, Training Loss: 0.66321
Epoch: 1696, Training Loss: 0.11240
Epoch: 1696, Training Loss: 0.34395
Epoch: 1696, Training Loss: 0.33436
Epoch: 1696, Training Loss: 0.66320
Epoch: 1697, Training Loss: 0.11223
Epoch: 1697, Training Loss: 0.34390
Epoch: 1697, Training Loss: 0.33431
Epoch: 1697, Training Loss: 0.66319
Epoch: 1698, Training Loss: 0.11206
Epoch: 1698, Training Loss: 0.34386
Epoch: 1698, Training Loss: 0.33426
Epoch: 1698, Training Loss: 0.66318
Epoch: 1699, Training Loss: 0.11190
Epoch: 1699, Training Loss: 0.34381
Epoch: 1699, Training Loss: 0.33421
Epoch: 1699, Training Loss: 0.66317
Epoch: 1700, Training Loss: 0.11173
Epoch: 1700, Training Loss: 0.34377
Epoch: 1700, Training Loss: 0.33417
Epoch: 1700, Training Loss: 0.66316
Epoch: 1701, Training Loss: 0.11156
Epoch: 1701, Training Loss: 0.34372
Epoch: 1701, Training Loss: 0.33412
Epoch: 1701, Training Loss: 0.66315
Epoch: 1702, Training Loss: 0.11140
Epoch: 1702, Training Loss: 0.34368
Epoch: 1702, Training Loss: 0.33407
Epoch: 1702, Training Loss: 0.66314
Epoch: 1703, Training Loss: 0.11123
Epoch: 1703, Training Loss: 0.34363
Epoch: 1703, Training Loss: 0.33403
Epoch: 1703, Training Loss: 0.66312
Epoch: 1704, Training Loss: 0.11107
Epoch: 1704, Training Loss: 0.34359
Epoch: 1704, Training Loss: 0.33398
Epoch: 1704, Training Loss: 0.66311
Epoch: 1705, Training Loss: 0.11091
Epoch: 1705, Training Loss: 0.34355
Epoch: 1705, Training Loss: 0.33393
Epoch: 1705, Training Loss: 0.66310
Epoch: 1706, Training Loss: 0.11075
Epoch: 1706, Training Loss: 0.34350
Epoch: 1706, Training Loss: 0.33389
Epoch: 1706, Training Loss: 0.66309
Epoch: 1707, Training Loss: 0.11058
Epoch: 1707, Training Loss: 0.34346
Epoch: 1707, Training Loss: 0.33384
Epoch: 1707, Training Loss: 0.66308
Epoch: 1708, Training Loss: 0.11042
Epoch: 1708, Training Loss: 0.34342
Epoch: 1708, Training Loss: 0.33380
Epoch: 1708, Training Loss: 0.66307
Epoch: 1709, Training Loss: 0.11026
Epoch: 1709, Training Loss: 0.34337
Epoch: 1709, Training Loss: 0.33375
Epoch: 1709, Training Loss: 0.66306
Epoch: 1710, Training Loss: 0.11010
Epoch: 1710, Training Loss: 0.34333
Epoch: 1710, Training Loss: 0.33371
Epoch: 1710, Training Loss: 0.66305
Epoch: 1711, Training Loss: 0.10994
Epoch: 1711, Training Loss: 0.34329
Epoch: 1711, Training Loss: 0.33366
Epoch: 1711, Training Loss: 0.66303
Epoch: 1712, Training Loss: 0.10979
Epoch: 1712, Training Loss: 0.34325
Epoch: 1712, Training Loss: 0.33362
Epoch: 1712, Training Loss: 0.66302
Epoch: 1713, Training Loss: 0.10963
Epoch: 1713, Training Loss: 0.34321
Epoch: 1713, Training Loss: 0.33358
Epoch: 1713, Training Loss: 0.66301
Epoch: 1714, Training Loss: 0.10947
Epoch: 1714, Training Loss: 0.34317
Epoch: 1714, Training Loss: 0.33353
Epoch: 1714, Training Loss: 0.66300
Epoch: 1715, Training Loss: 0.10931
Epoch: 1715, Training Loss: 0.34312
Epoch: 1715, Training Loss: 0.33349
Epoch: 1715, Training Loss: 0.66299
Epoch: 1716, Training Loss: 0.10916
Epoch: 1716, Training Loss: 0.34308
Epoch: 1716, Training Loss: 0.33344
Epoch: 1716, Training Loss: 0.66297
Epoch: 1717, Training Loss: 0.10900
Epoch: 1717, Training Loss: 0.34304
Epoch: 1717, Training Loss: 0.33340
Epoch: 1717, Training Loss: 0.66296
Epoch: 1718, Training Loss: 0.10885
Epoch: 1718, Training Loss: 0.34300
Epoch: 1718, Training Loss: 0.33336
Epoch: 1718, Training Loss: 0.66295
Epoch: 1719, Training Loss: 0.10869
Epoch: 1719, Training Loss: 0.34296
Epoch: 1719, Training Loss: 0.33331
Epoch: 1719, Training Loss: 0.66294
Epoch: 1720, Training Loss: 0.10854
Epoch: 1720, Training Loss: 0.34292
Epoch: 1720, Training Loss: 0.33327
Epoch: 1720, Training Loss: 0.66292
Epoch: 1721, Training Loss: 0.10839
Epoch: 1721, Training Loss: 0.34288
Epoch: 1721, Training Loss: 0.33323
Epoch: 1721, Training Loss: 0.66291
Epoch: 1722, Training Loss: 0.10824
Epoch: 1722, Training Loss: 0.34284
Epoch: 1722, Training Loss: 0.33319
Epoch: 1722, Training Loss: 0.66290
Epoch: 1723, Training Loss: 0.10808
Epoch: 1723, Training Loss: 0.34280
Epoch: 1723, Training Loss: 0.33315
Epoch: 1723, Training Loss: 0.66288
Epoch: 1724, Training Loss: 0.10793
Epoch: 1724, Training Loss: 0.34276
Epoch: 1724, Training Loss: 0.33310
Epoch: 1724, Training Loss: 0.66287
Epoch: 1725, Training Loss: 0.10778
Epoch: 1725, Training Loss: 0.34273
Epoch: 1725, Training Loss: 0.33306
Epoch: 1725, Training Loss: 0.66286
Epoch: 1726, Training Loss: 0.10763
Epoch: 1726, Training Loss: 0.34269
Epoch: 1726, Training Loss: 0.33302
Epoch: 1726, Training Loss: 0.66285
Epoch: 1727, Training Loss: 0.10748
Epoch: 1727, Training Loss: 0.34265
Epoch: 1727, Training Loss: 0.33298
Epoch: 1727, Training Loss: 0.66283
Epoch: 1728, Training Loss: 0.10733
Epoch: 1728, Training Loss: 0.34261
Epoch: 1728, Training Loss: 0.33294
Epoch: 1728, Training Loss: 0.66282
Epoch: 1729, Training Loss: 0.10719
Epoch: 1729, Training Loss: 0.34257
Epoch: 1729, Training Loss: 0.33290
Epoch: 1729, Training Loss: 0.66281
Epoch: 1730, Training Loss: 0.10704
Epoch: 1730, Training Loss: 0.34253
Epoch: 1730, Training Loss: 0.33286
Epoch: 1730, Training Loss: 0.66279
Epoch: 1731, Training Loss: 0.10689
Epoch: 1731, Training Loss: 0.34250
Epoch: 1731, Training Loss: 0.33282
Epoch: 1731, Training Loss: 0.66278
Epoch: 1732, Training Loss: 0.10674
Epoch: 1732, Training Loss: 0.34246
Epoch: 1732, Training Loss: 0.33278
Epoch: 1732, Training Loss: 0.66276
Epoch: 1733, Training Loss: 0.10660
Epoch: 1733, Training Loss: 0.34242
Epoch: 1733, Training Loss: 0.33274
Epoch: 1733, Training Loss: 0.66275
Epoch: 1734, Training Loss: 0.10645
Epoch: 1734, Training Loss: 0.34238
Epoch: 1734, Training Loss: 0.33270
Epoch: 1734, Training Loss: 0.66274
Epoch: 1735, Training Loss: 0.10631
Epoch: 1735, Training Loss: 0.34235
Epoch: 1735, Training Loss: 0.33266
Epoch: 1735, Training Loss: 0.66272
Epoch: 1736, Training Loss: 0.10616
Epoch: 1736, Training Loss: 0.34231
Epoch: 1736, Training Loss: 0.33262
Epoch: 1736, Training Loss: 0.66271
Epoch: 1737, Training Loss: 0.10602
Epoch: 1737, Training Loss: 0.34227
Epoch: 1737, Training Loss: 0.33258
Epoch: 1737, Training Loss: 0.66269
Epoch: 1738, Training Loss: 0.10588
Epoch: 1738, Training Loss: 0.34224
Epoch: 1738, Training Loss: 0.33254
Epoch: 1738, Training Loss: 0.66268
Epoch: 1739, Training Loss: 0.10574
Epoch: 1739, Training Loss: 0.34220
Epoch: 1739, Training Loss: 0.33250
Epoch: 1739, Training Loss: 0.66267
Epoch: 1740, Training Loss: 0.10559
Epoch: 1740, Training Loss: 0.34217
Epoch: 1740, Training Loss: 0.33246
Epoch: 1740, Training Loss: 0.66265
Epoch: 1741, Training Loss: 0.10545
Epoch: 1741, Training Loss: 0.34213
Epoch: 1741, Training Loss: 0.33242
Epoch: 1741, Training Loss: 0.66264
Epoch: 1742, Training Loss: 0.10531
Epoch: 1742, Training Loss: 0.34209
Epoch: 1742, Training Loss: 0.33238
Epoch: 1742, Training Loss: 0.66262
Epoch: 1743, Training Loss: 0.10517
Epoch: 1743, Training Loss: 0.34206
Epoch: 1743, Training Loss: 0.33235
Epoch: 1743, Training Loss: 0.66261
Epoch: 1744, Training Loss: 0.10503
Epoch: 1744, Training Loss: 0.34202
Epoch: 1744, Training Loss: 0.33231
Epoch: 1744, Training Loss: 0.66259
Epoch: 1745, Training Loss: 0.10489
Epoch: 1745, Training Loss: 0.34199
Epoch: 1745, Training Loss: 0.33227
Epoch: 1745, Training Loss: 0.66258
Epoch: 1746, Training Loss: 0.10475
Epoch: 1746, Training Loss: 0.34195
Epoch: 1746, Training Loss: 0.33223
Epoch: 1746, Training Loss: 0.66256
Epoch: 1747, Training Loss: 0.10461
Epoch: 1747, Training Loss: 0.34192
Epoch: 1747, Training Loss: 0.33219
Epoch: 1747, Training Loss: 0.66255
Epoch: 1748, Training Loss: 0.10448
Epoch: 1748, Training Loss: 0.34189
Epoch: 1748, Training Loss: 0.33216
Epoch: 1748, Training Loss: 0.66253
Epoch: 1749, Training Loss: 0.10434
Epoch: 1749, Training Loss: 0.34185
Epoch: 1749, Training Loss: 0.33212
Epoch: 1749, Training Loss: 0.66252
Epoch: 1750, Training Loss: 0.10420
Epoch: 1750, Training Loss: 0.34182
Epoch: 1750, Training Loss: 0.33208
Epoch: 1750, Training Loss: 0.66250
Epoch: 1751, Training Loss: 0.10407
Epoch: 1751, Training Loss: 0.34178
Epoch: 1751, Training Loss: 0.33205
Epoch: 1751, Training Loss: 0.66248
Epoch: 1752, Training Loss: 0.10393
Epoch: 1752, Training Loss: 0.34175
Epoch: 1752, Training Loss: 0.33201
Epoch: 1752, Training Loss: 0.66247
Epoch: 1753, Training Loss: 0.10380
Epoch: 1753, Training Loss: 0.34172
Epoch: 1753, Training Loss: 0.33197
Epoch: 1753, Training Loss: 0.66245
Epoch: 1754, Training Loss: 0.10366
Epoch: 1754, Training Loss: 0.34168
Epoch: 1754, Training Loss: 0.33194
Epoch: 1754, Training Loss: 0.66244
Epoch: 1755, Training Loss: 0.10353
Epoch: 1755, Training Loss: 0.34165
Epoch: 1755, Training Loss: 0.33190
Epoch: 1755, Training Loss: 0.66242
Epoch: 1756, Training Loss: 0.10339
Epoch: 1756, Training Loss: 0.34162
Epoch: 1756, Training Loss: 0.33187
Epoch: 1756, Training Loss: 0.66241
Epoch: 1757, Training Loss: 0.10326
Epoch: 1757, Training Loss: 0.34158
Epoch: 1757, Training Loss: 0.33183
Epoch: 1757, Training Loss: 0.66239
Epoch: 1758, Training Loss: 0.10313
Epoch: 1758, Training Loss: 0.34155
Epoch: 1758, Training Loss: 0.33179
Epoch: 1758, Training Loss: 0.66237
Epoch: 1759, Training Loss: 0.10300
Epoch: 1759, Training Loss: 0.34152
Epoch: 1759, Training Loss: 0.33176
Epoch: 1759, Training Loss: 0.66236
Epoch: 1760, Training Loss: 0.10286
Epoch: 1760, Training Loss: 0.34149
Epoch: 1760, Training Loss: 0.33172
Epoch: 1760, Training Loss: 0.66234
Epoch: 1761, Training Loss: 0.10273
Epoch: 1761, Training Loss: 0.34145
Epoch: 1761, Training Loss: 0.33169
Epoch: 1761, Training Loss: 0.66232
Epoch: 1762, Training Loss: 0.10260
Epoch: 1762, Training Loss: 0.34142
Epoch: 1762, Training Loss: 0.33165
Epoch: 1762, Training Loss: 0.66231
Epoch: 1763, Training Loss: 0.10247
Epoch: 1763, Training Loss: 0.34139
Epoch: 1763, Training Loss: 0.33162
Epoch: 1763, Training Loss: 0.66229
Epoch: 1764, Training Loss: 0.10234
Epoch: 1764, Training Loss: 0.34136
Epoch: 1764, Training Loss: 0.33158
Epoch: 1764, Training Loss: 0.66227
Epoch: 1765, Training Loss: 0.10221
Epoch: 1765, Training Loss: 0.34133
Epoch: 1765, Training Loss: 0.33155
Epoch: 1765, Training Loss: 0.66226
Epoch: 1766, Training Loss: 0.10209
Epoch: 1766, Training Loss: 0.34130
Epoch: 1766, Training Loss: 0.33151
Epoch: 1766, Training Loss: 0.66224
Epoch: 1767, Training Loss: 0.10196
Epoch: 1767, Training Loss: 0.34126
Epoch: 1767, Training Loss: 0.33148
Epoch: 1767, Training Loss: 0.66222
Epoch: 1768, Training Loss: 0.10183
Epoch: 1768, Training Loss: 0.34123
Epoch: 1768, Training Loss: 0.33145
Epoch: 1768, Training Loss: 0.66221
Epoch: 1769, Training Loss: 0.10170
Epoch: 1769, Training Loss: 0.34120
Epoch: 1769, Training Loss: 0.33141
Epoch: 1769, Training Loss: 0.66219
Epoch: 1770, Training Loss: 0.10158
Epoch: 1770, Training Loss: 0.34117
Epoch: 1770, Training Loss: 0.33138
Epoch: 1770, Training Loss: 0.66217
Epoch: 1771, Training Loss: 0.10145
Epoch: 1771, Training Loss: 0.34114
Epoch: 1771, Training Loss: 0.33134
Epoch: 1771, Training Loss: 0.66215
Epoch: 1772, Training Loss: 0.10132
Epoch: 1773, Training Loss: 0.10120
Epoch: 1773, Training Loss: 0.34108
Epoch: 1773, Training Loss: 0.33128
Epoch: 1773, Training Loss: 0.66212
[... log truncated: epochs 1774-2021 omitted. Each epoch prints one loss per XOR sample; over these ~250 epochs the four per-sample losses change only slightly (roughly 0.101 -> 0.082, 0.341 -> 0.337, 0.331 -> 0.325, 0.662 -> 0.651), showing very slow convergence in this stretch of training ...]
Epoch: 2022, Training Loss: 0.08219
Epoch: 2022, Training Loss: 0.33673
Epoch: 2022, Training Loss: 0.32442
Epoch: 2022, Training Loss: 0.65116
Epoch: 2023, Training Loss: 0.08217
Epoch: 2023, Training Loss: 0.33672
Epoch: 2023, Training Loss: 0.32437
Epoch: 2023, Training Loss: 0.65105
Epoch: 2024, Training Loss: 0.08214
Epoch: 2024, Training Loss: 0.33670
Epoch: 2024, Training Loss: 0.32433
Epoch: 2024, Training Loss: 0.65094
Epoch: 2025, Training Loss: 0.08212
Epoch: 2025, Training Loss: 0.33669
Epoch: 2025, Training Loss: 0.32428
Epoch: 2025, Training Loss: 0.65083
Epoch: 2026, Training Loss: 0.08210
Epoch: 2026, Training Loss: 0.33668
Epoch: 2026, Training Loss: 0.32424
Epoch: 2026, Training Loss: 0.65071
Epoch: 2027, Training Loss: 0.08208
Epoch: 2027, Training Loss: 0.33667
Epoch: 2027, Training Loss: 0.32419
Epoch: 2027, Training Loss: 0.65060
Epoch: 2028, Training Loss: 0.08206
Epoch: 2028, Training Loss: 0.33665
Epoch: 2028, Training Loss: 0.32414
Epoch: 2028, Training Loss: 0.65048
Epoch: 2029, Training Loss: 0.08205
Epoch: 2029, Training Loss: 0.33664
Epoch: 2029, Training Loss: 0.32410
Epoch: 2029, Training Loss: 0.65036
Epoch: 2030, Training Loss: 0.08203
Epoch: 2030, Training Loss: 0.33663
Epoch: 2030, Training Loss: 0.32405
Epoch: 2030, Training Loss: 0.65024
Epoch: 2031, Training Loss: 0.08201
Epoch: 2031, Training Loss: 0.33662
Epoch: 2031, Training Loss: 0.32400
Epoch: 2031, Training Loss: 0.65011
Epoch: 2032, Training Loss: 0.08200
Epoch: 2032, Training Loss: 0.33660
Epoch: 2032, Training Loss: 0.32395
Epoch: 2032, Training Loss: 0.64999
Epoch: 2033, Training Loss: 0.08198
Epoch: 2033, Training Loss: 0.33659
Epoch: 2033, Training Loss: 0.32390
Epoch: 2033, Training Loss: 0.64986
Epoch: 2034, Training Loss: 0.08197
Epoch: 2034, Training Loss: 0.33658
Epoch: 2034, Training Loss: 0.32384
Epoch: 2034, Training Loss: 0.64974
Epoch: 2035, Training Loss: 0.08195
Epoch: 2035, Training Loss: 0.33656
Epoch: 2035, Training Loss: 0.32379
Epoch: 2035, Training Loss: 0.64961
Epoch: 2036, Training Loss: 0.08194
Epoch: 2036, Training Loss: 0.33655
Epoch: 2036, Training Loss: 0.32374
Epoch: 2036, Training Loss: 0.64948
Epoch: 2037, Training Loss: 0.08193
Epoch: 2037, Training Loss: 0.33653
Epoch: 2037, Training Loss: 0.32368
Epoch: 2037, Training Loss: 0.64935
Epoch: 2038, Training Loss: 0.08192
Epoch: 2038, Training Loss: 0.33652
Epoch: 2038, Training Loss: 0.32363
Epoch: 2038, Training Loss: 0.64921
Epoch: 2039, Training Loss: 0.08191
Epoch: 2039, Training Loss: 0.33651
Epoch: 2039, Training Loss: 0.32357
Epoch: 2039, Training Loss: 0.64908
Epoch: 2040, Training Loss: 0.08190
Epoch: 2040, Training Loss: 0.33649
Epoch: 2040, Training Loss: 0.32352
Epoch: 2040, Training Loss: 0.64894
Epoch: 2041, Training Loss: 0.08189
Epoch: 2041, Training Loss: 0.33648
Epoch: 2041, Training Loss: 0.32346
Epoch: 2041, Training Loss: 0.64880
Epoch: 2042, Training Loss: 0.08188
Epoch: 2042, Training Loss: 0.33646
Epoch: 2042, Training Loss: 0.32340
Epoch: 2042, Training Loss: 0.64866
Epoch: 2043, Training Loss: 0.08188
Epoch: 2043, Training Loss: 0.33645
Epoch: 2043, Training Loss: 0.32334
Epoch: 2043, Training Loss: 0.64852
Epoch: 2044, Training Loss: 0.08187
Epoch: 2044, Training Loss: 0.33643
Epoch: 2044, Training Loss: 0.32328
Epoch: 2044, Training Loss: 0.64837
Epoch: 2045, Training Loss: 0.08187
Epoch: 2045, Training Loss: 0.33642
Epoch: 2045, Training Loss: 0.32322
Epoch: 2045, Training Loss: 0.64822
Epoch: 2046, Training Loss: 0.08186
Epoch: 2046, Training Loss: 0.33640
Epoch: 2046, Training Loss: 0.32315
Epoch: 2046, Training Loss: 0.64808
Epoch: 2047, Training Loss: 0.08186
Epoch: 2047, Training Loss: 0.33638
Epoch: 2047, Training Loss: 0.32309
Epoch: 2047, Training Loss: 0.64792
Epoch: 2048, Training Loss: 0.08186
Epoch: 2048, Training Loss: 0.33637
Epoch: 2048, Training Loss: 0.32302
Epoch: 2048, Training Loss: 0.64777
Epoch: 2049, Training Loss: 0.08186
Epoch: 2049, Training Loss: 0.33635
Epoch: 2049, Training Loss: 0.32296
Epoch: 2049, Training Loss: 0.64762
Epoch: 2050, Training Loss: 0.08186
Epoch: 2050, Training Loss: 0.33633
Epoch: 2050, Training Loss: 0.32289
Epoch: 2050, Training Loss: 0.64746
Epoch: 2051, Training Loss: 0.08186
Epoch: 2051, Training Loss: 0.33632
Epoch: 2051, Training Loss: 0.32282
Epoch: 2051, Training Loss: 0.64730
Epoch: 2052, Training Loss: 0.08186
Epoch: 2052, Training Loss: 0.33630
Epoch: 2052, Training Loss: 0.32275
Epoch: 2052, Training Loss: 0.64714
Epoch: 2053, Training Loss: 0.08186
Epoch: 2053, Training Loss: 0.33628
Epoch: 2053, Training Loss: 0.32268
Epoch: 2053, Training Loss: 0.64698
Epoch: 2054, Training Loss: 0.08186
Epoch: 2054, Training Loss: 0.33626
Epoch: 2054, Training Loss: 0.32261
Epoch: 2054, Training Loss: 0.64681
Epoch: 2055, Training Loss: 0.08187
Epoch: 2055, Training Loss: 0.33624
Epoch: 2055, Training Loss: 0.32253
Epoch: 2055, Training Loss: 0.64664
Epoch: 2056, Training Loss: 0.08187
Epoch: 2056, Training Loss: 0.33622
Epoch: 2056, Training Loss: 0.32246
Epoch: 2056, Training Loss: 0.64647
Epoch: 2057, Training Loss: 0.08188
Epoch: 2057, Training Loss: 0.33620
Epoch: 2057, Training Loss: 0.32238
Epoch: 2057, Training Loss: 0.64630
Epoch: 2058, Training Loss: 0.08189
Epoch: 2058, Training Loss: 0.33618
Epoch: 2058, Training Loss: 0.32230
Epoch: 2058, Training Loss: 0.64612
Epoch: 2059, Training Loss: 0.08190
Epoch: 2059, Training Loss: 0.33616
Epoch: 2059, Training Loss: 0.32222
Epoch: 2059, Training Loss: 0.64594
Epoch: 2060, Training Loss: 0.08191
Epoch: 2060, Training Loss: 0.33614
Epoch: 2060, Training Loss: 0.32214
Epoch: 2060, Training Loss: 0.64576
Epoch: 2061, Training Loss: 0.08192
Epoch: 2061, Training Loss: 0.33612
Epoch: 2061, Training Loss: 0.32205
Epoch: 2061, Training Loss: 0.64558
Epoch: 2062, Training Loss: 0.08193
Epoch: 2062, Training Loss: 0.33610
Epoch: 2062, Training Loss: 0.32197
Epoch: 2062, Training Loss: 0.64540
Epoch: 2063, Training Loss: 0.08194
Epoch: 2063, Training Loss: 0.33608
Epoch: 2063, Training Loss: 0.32188
Epoch: 2063, Training Loss: 0.64521
Epoch: 2064, Training Loss: 0.08196
Epoch: 2064, Training Loss: 0.33605
Epoch: 2064, Training Loss: 0.32179
Epoch: 2064, Training Loss: 0.64502
Epoch: 2065, Training Loss: 0.08198
Epoch: 2065, Training Loss: 0.33603
Epoch: 2065, Training Loss: 0.32170
Epoch: 2065, Training Loss: 0.64482
Epoch: 2066, Training Loss: 0.08199
Epoch: 2066, Training Loss: 0.33601
Epoch: 2066, Training Loss: 0.32161
Epoch: 2066, Training Loss: 0.64463
Epoch: 2067, Training Loss: 0.08201
Epoch: 2067, Training Loss: 0.33598
Epoch: 2067, Training Loss: 0.32152
Epoch: 2067, Training Loss: 0.64443
Epoch: 2068, Training Loss: 0.08203
Epoch: 2068, Training Loss: 0.33596
Epoch: 2068, Training Loss: 0.32142
Epoch: 2068, Training Loss: 0.64422
Epoch: 2069, Training Loss: 0.08205
Epoch: 2069, Training Loss: 0.33593
Epoch: 2069, Training Loss: 0.32132
Epoch: 2069, Training Loss: 0.64402
Epoch: 2070, Training Loss: 0.08207
Epoch: 2070, Training Loss: 0.33591
Epoch: 2070, Training Loss: 0.32122
Epoch: 2070, Training Loss: 0.64381
Epoch: 2071, Training Loss: 0.08210
Epoch: 2071, Training Loss: 0.33588
Epoch: 2071, Training Loss: 0.32112
Epoch: 2071, Training Loss: 0.64360
Epoch: 2072, Training Loss: 0.08212
Epoch: 2072, Training Loss: 0.33585
Epoch: 2072, Training Loss: 0.32101
Epoch: 2072, Training Loss: 0.64338
Epoch: 2073, Training Loss: 0.08215
Epoch: 2073, Training Loss: 0.33582
Epoch: 2073, Training Loss: 0.32091
Epoch: 2073, Training Loss: 0.64316
Epoch: 2074, Training Loss: 0.08217
Epoch: 2074, Training Loss: 0.33579
Epoch: 2074, Training Loss: 0.32080
Epoch: 2074, Training Loss: 0.64294
Epoch: 2075, Training Loss: 0.08220
Epoch: 2075, Training Loss: 0.33576
Epoch: 2075, Training Loss: 0.32069
Epoch: 2075, Training Loss: 0.64272
Epoch: 2076, Training Loss: 0.08223
Epoch: 2076, Training Loss: 0.33573
Epoch: 2076, Training Loss: 0.32057
Epoch: 2076, Training Loss: 0.64249
Epoch: 2077, Training Loss: 0.08226
Epoch: 2077, Training Loss: 0.33570
Epoch: 2077, Training Loss: 0.32045
Epoch: 2077, Training Loss: 0.64226
Epoch: 2078, Training Loss: 0.08230
Epoch: 2078, Training Loss: 0.33567
Epoch: 2078, Training Loss: 0.32034
Epoch: 2078, Training Loss: 0.64202
Epoch: 2079, Training Loss: 0.08233
Epoch: 2079, Training Loss: 0.33563
Epoch: 2079, Training Loss: 0.32021
Epoch: 2079, Training Loss: 0.64178
Epoch: 2080, Training Loss: 0.08236
Epoch: 2080, Training Loss: 0.33560
Epoch: 2080, Training Loss: 0.32009
Epoch: 2080, Training Loss: 0.64154
Epoch: 2081, Training Loss: 0.08240
Epoch: 2081, Training Loss: 0.33556
Epoch: 2081, Training Loss: 0.31996
Epoch: 2081, Training Loss: 0.64130
Epoch: 2082, Training Loss: 0.08244
Epoch: 2082, Training Loss: 0.33553
Epoch: 2082, Training Loss: 0.31983
Epoch: 2082, Training Loss: 0.64105
Epoch: 2083, Training Loss: 0.08248
Epoch: 2083, Training Loss: 0.33549
Epoch: 2083, Training Loss: 0.31970
Epoch: 2083, Training Loss: 0.64079
Epoch: 2084, Training Loss: 0.08252
Epoch: 2084, Training Loss: 0.33545
Epoch: 2084, Training Loss: 0.31956
Epoch: 2084, Training Loss: 0.64053
Epoch: 2085, Training Loss: 0.08256
Epoch: 2085, Training Loss: 0.33541
Epoch: 2085, Training Loss: 0.31942
Epoch: 2085, Training Loss: 0.64027
Epoch: 2086, Training Loss: 0.08261
Epoch: 2086, Training Loss: 0.33537
Epoch: 2086, Training Loss: 0.31928
Epoch: 2086, Training Loss: 0.64001
Epoch: 2087, Training Loss: 0.08266
Epoch: 2087, Training Loss: 0.33533
Epoch: 2087, Training Loss: 0.31914
Epoch: 2087, Training Loss: 0.63974
Epoch: 2088, Training Loss: 0.08270
Epoch: 2088, Training Loss: 0.33529
Epoch: 2088, Training Loss: 0.31899
Epoch: 2088, Training Loss: 0.63946
Epoch: 2089, Training Loss: 0.08275
Epoch: 2089, Training Loss: 0.33524
Epoch: 2089, Training Loss: 0.31884
Epoch: 2089, Training Loss: 0.63918
Epoch: 2090, Training Loss: 0.08280
Epoch: 2090, Training Loss: 0.33520
Epoch: 2090, Training Loss: 0.31868
Epoch: 2090, Training Loss: 0.63890
Epoch: 2091, Training Loss: 0.08286
Epoch: 2091, Training Loss: 0.33515
Epoch: 2091, Training Loss: 0.31852
Epoch: 2091, Training Loss: 0.63861
Epoch: 2092, Training Loss: 0.08291
Epoch: 2092, Training Loss: 0.33510
Epoch: 2092, Training Loss: 0.31836
Epoch: 2092, Training Loss: 0.63832
Epoch: 2093, Training Loss: 0.08297
Epoch: 2093, Training Loss: 0.33505
Epoch: 2093, Training Loss: 0.31819
Epoch: 2093, Training Loss: 0.63802
Epoch: 2094, Training Loss: 0.08302
Epoch: 2094, Training Loss: 0.33500
Epoch: 2094, Training Loss: 0.31802
Epoch: 2094, Training Loss: 0.63772
Epoch: 2095, Training Loss: 0.08308
Epoch: 2095, Training Loss: 0.33495
Epoch: 2095, Training Loss: 0.31785
Epoch: 2095, Training Loss: 0.63741
Epoch: 2096, Training Loss: 0.08314
Epoch: 2096, Training Loss: 0.33490
Epoch: 2096, Training Loss: 0.31767
Epoch: 2096, Training Loss: 0.63710
Epoch: 2097, Training Loss: 0.08321
Epoch: 2097, Training Loss: 0.33484
Epoch: 2097, Training Loss: 0.31749
Epoch: 2097, Training Loss: 0.63678
Epoch: 2098, Training Loss: 0.08327
Epoch: 2098, Training Loss: 0.33478
Epoch: 2098, Training Loss: 0.31730
Epoch: 2098, Training Loss: 0.63646
Epoch: 2099, Training Loss: 0.08334
Epoch: 2099, Training Loss: 0.33473
Epoch: 2099, Training Loss: 0.31711
Epoch: 2099, Training Loss: 0.63613
Epoch: 2100, Training Loss: 0.08341
Epoch: 2100, Training Loss: 0.33467
Epoch: 2100, Training Loss: 0.31692
Epoch: 2100, Training Loss: 0.63580
Epoch: 2101, Training Loss: 0.08348
Epoch: 2101, Training Loss: 0.33460
Epoch: 2101, Training Loss: 0.31672
Epoch: 2101, Training Loss: 0.63546
Epoch: 2102, Training Loss: 0.08355
Epoch: 2102, Training Loss: 0.33454
Epoch: 2102, Training Loss: 0.31651
Epoch: 2102, Training Loss: 0.63512
Epoch: 2103, Training Loss: 0.08362
Epoch: 2103, Training Loss: 0.33447
Epoch: 2103, Training Loss: 0.31631
Epoch: 2103, Training Loss: 0.63477
Epoch: 2104, Training Loss: 0.08370
Epoch: 2104, Training Loss: 0.33440
Epoch: 2104, Training Loss: 0.31609
Epoch: 2104, Training Loss: 0.63442
Epoch: 2105, Training Loss: 0.08377
Epoch: 2105, Training Loss: 0.33433
Epoch: 2105, Training Loss: 0.31587
Epoch: 2105, Training Loss: 0.63406
Epoch: 2106, Training Loss: 0.08385
Epoch: 2106, Training Loss: 0.33426
Epoch: 2106, Training Loss: 0.31565
Epoch: 2106, Training Loss: 0.63369
Epoch: 2107, Training Loss: 0.08393
Epoch: 2107, Training Loss: 0.33419
Epoch: 2107, Training Loss: 0.31542
Epoch: 2107, Training Loss: 0.63332
Epoch: 2108, Training Loss: 0.08402
Epoch: 2108, Training Loss: 0.33411
Epoch: 2108, Training Loss: 0.31519
Epoch: 2108, Training Loss: 0.63294
Epoch: 2109, Training Loss: 0.08410
Epoch: 2109, Training Loss: 0.33403
Epoch: 2109, Training Loss: 0.31495
Epoch: 2109, Training Loss: 0.63255
Epoch: 2110, Training Loss: 0.08419
Epoch: 2110, Training Loss: 0.33395
Epoch: 2110, Training Loss: 0.31471
Epoch: 2110, Training Loss: 0.63216
Epoch: 2111, Training Loss: 0.08428
Epoch: 2111, Training Loss: 0.33387
Epoch: 2111, Training Loss: 0.31446
Epoch: 2111, Training Loss: 0.63177
Epoch: 2112, Training Loss: 0.08437
Epoch: 2112, Training Loss: 0.33379
Epoch: 2112, Training Loss: 0.31421
Epoch: 2112, Training Loss: 0.63136
Epoch: 2113, Training Loss: 0.08446
Epoch: 2113, Training Loss: 0.33370
Epoch: 2113, Training Loss: 0.31395
Epoch: 2113, Training Loss: 0.63095
Epoch: 2114, Training Loss: 0.08455
Epoch: 2114, Training Loss: 0.33361
Epoch: 2114, Training Loss: 0.31368
Epoch: 2114, Training Loss: 0.63053
Epoch: 2115, Training Loss: 0.08465
Epoch: 2115, Training Loss: 0.33351
Epoch: 2115, Training Loss: 0.31341
Epoch: 2115, Training Loss: 0.63011
Epoch: 2116, Training Loss: 0.08475
Epoch: 2116, Training Loss: 0.33342
Epoch: 2116, Training Loss: 0.31314
Epoch: 2116, Training Loss: 0.62968
Epoch: 2117, Training Loss: 0.08485
Epoch: 2117, Training Loss: 0.33332
Epoch: 2117, Training Loss: 0.31286
Epoch: 2117, Training Loss: 0.62924
Epoch: 2118, Training Loss: 0.08495
Epoch: 2118, Training Loss: 0.33322
Epoch: 2118, Training Loss: 0.31257
Epoch: 2118, Training Loss: 0.62879
Epoch: 2119, Training Loss: 0.08505
Epoch: 2119, Training Loss: 0.33312
Epoch: 2119, Training Loss: 0.31228
Epoch: 2119, Training Loss: 0.62834
Epoch: 2120, Training Loss: 0.08516
Epoch: 2120, Training Loss: 0.33301
Epoch: 2120, Training Loss: 0.31198
Epoch: 2120, Training Loss: 0.62788
Epoch: 2121, Training Loss: 0.08527
Epoch: 2121, Training Loss: 0.33290
Epoch: 2121, Training Loss: 0.31167
Epoch: 2121, Training Loss: 0.62741
Epoch: 2122, Training Loss: 0.08538
Epoch: 2122, Training Loss: 0.33279
Epoch: 2122, Training Loss: 0.31136
Epoch: 2122, Training Loss: 0.62693
Epoch: 2123, Training Loss: 0.08549
Epoch: 2123, Training Loss: 0.33268
Epoch: 2123, Training Loss: 0.31104
Epoch: 2123, Training Loss: 0.62645
Epoch: 2124, Training Loss: 0.08560
Epoch: 2124, Training Loss: 0.33256
Epoch: 2124, Training Loss: 0.31072
Epoch: 2124, Training Loss: 0.62595
Epoch: 2125, Training Loss: 0.08571
Epoch: 2125, Training Loss: 0.33244
Epoch: 2125, Training Loss: 0.31039
Epoch: 2125, Training Loss: 0.62545
Epoch: 2126, Training Loss: 0.08583
Epoch: 2126, Training Loss: 0.33231
Epoch: 2126, Training Loss: 0.31006
Epoch: 2126, Training Loss: 0.62494
Epoch: 2127, Training Loss: 0.08595
Epoch: 2127, Training Loss: 0.33219
Epoch: 2127, Training Loss: 0.30972
Epoch: 2127, Training Loss: 0.62442
Epoch: 2128, Training Loss: 0.08607
Epoch: 2128, Training Loss: 0.33206
Epoch: 2128, Training Loss: 0.30937
Epoch: 2128, Training Loss: 0.62390
Epoch: 2129, Training Loss: 0.08619
Epoch: 2129, Training Loss: 0.33192
Epoch: 2129, Training Loss: 0.30902
Epoch: 2129, Training Loss: 0.62336
Epoch: 2130, Training Loss: 0.08631
Epoch: 2130, Training Loss: 0.33179
Epoch: 2130, Training Loss: 0.30866
Epoch: 2130, Training Loss: 0.62282
Epoch: 2131, Training Loss: 0.08644
Epoch: 2131, Training Loss: 0.33165
Epoch: 2131, Training Loss: 0.30830
Epoch: 2131, Training Loss: 0.62226
Epoch: 2132, Training Loss: 0.08656
Epoch: 2132, Training Loss: 0.33150
Epoch: 2132, Training Loss: 0.30793
Epoch: 2132, Training Loss: 0.62170
Epoch: 2133, Training Loss: 0.08669
Epoch: 2133, Training Loss: 0.33136
Epoch: 2133, Training Loss: 0.30756
Epoch: 2133, Training Loss: 0.62113
Epoch: 2134, Training Loss: 0.08682
Epoch: 2134, Training Loss: 0.33120
Epoch: 2134, Training Loss: 0.30718
Epoch: 2134, Training Loss: 0.62055
Epoch: 2135, Training Loss: 0.08695
Epoch: 2135, Training Loss: 0.33105
Epoch: 2135, Training Loss: 0.30679
Epoch: 2135, Training Loss: 0.61996
Epoch: 2136, Training Loss: 0.08708
Epoch: 2136, Training Loss: 0.33089
Epoch: 2136, Training Loss: 0.30640
Epoch: 2136, Training Loss: 0.61936
Epoch: 2137, Training Loss: 0.08722
Epoch: 2137, Training Loss: 0.33073
Epoch: 2137, Training Loss: 0.30600
Epoch: 2137, Training Loss: 0.61875
Epoch: 2138, Training Loss: 0.08735
Epoch: 2138, Training Loss: 0.33057
Epoch: 2138, Training Loss: 0.30560
Epoch: 2138, Training Loss: 0.61812
Epoch: 2139, Training Loss: 0.08749
Epoch: 2139, Training Loss: 0.33040
Epoch: 2139, Training Loss: 0.30519
Epoch: 2139, Training Loss: 0.61749
Epoch: 2140, Training Loss: 0.08762
Epoch: 2140, Training Loss: 0.33023
Epoch: 2140, Training Loss: 0.30478
Epoch: 2140, Training Loss: 0.61685
Epoch: 2141, Training Loss: 0.08776
Epoch: 2141, Training Loss: 0.33006
Epoch: 2141, Training Loss: 0.30437
Epoch: 2141, Training Loss: 0.61620
Epoch: 2142, Training Loss: 0.08790
Epoch: 2142, Training Loss: 0.32988
Epoch: 2142, Training Loss: 0.30395
Epoch: 2142, Training Loss: 0.61553
Epoch: 2143, Training Loss: 0.08804
Epoch: 2143, Training Loss: 0.32970
Epoch: 2143, Training Loss: 0.30352
Epoch: 2143, Training Loss: 0.61486
Epoch: 2144, Training Loss: 0.08818
Epoch: 2144, Training Loss: 0.32951
Epoch: 2144, Training Loss: 0.30310
Epoch: 2144, Training Loss: 0.61417
Epoch: 2145, Training Loss: 0.08833
Epoch: 2145, Training Loss: 0.32932
Epoch: 2145, Training Loss: 0.30266
Epoch: 2145, Training Loss: 0.61347
Epoch: 2146, Training Loss: 0.08847
Epoch: 2146, Training Loss: 0.32913
Epoch: 2146, Training Loss: 0.30223
Epoch: 2146, Training Loss: 0.61276
Epoch: 2147, Training Loss: 0.08862
Epoch: 2147, Training Loss: 0.32893
Epoch: 2147, Training Loss: 0.30179
Epoch: 2147, Training Loss: 0.61204
Epoch: 2148, Training Loss: 0.08876
Epoch: 2148, Training Loss: 0.32873
Epoch: 2148, Training Loss: 0.30135
Epoch: 2148, Training Loss: 0.61131
Epoch: 2149, Training Loss: 0.08891
Epoch: 2149, Training Loss: 0.32853
Epoch: 2149, Training Loss: 0.30090
Epoch: 2149, Training Loss: 0.61057
Epoch: 2150, Training Loss: 0.08905
Epoch: 2150, Training Loss: 0.32832
Epoch: 2150, Training Loss: 0.30046
Epoch: 2150, Training Loss: 0.60981
Epoch: 2151, Training Loss: 0.08920
Epoch: 2151, Training Loss: 0.32811
Epoch: 2151, Training Loss: 0.30001
Epoch: 2151, Training Loss: 0.60904
Epoch: 2152, Training Loss: 0.08935
Epoch: 2152, Training Loss: 0.32789
Epoch: 2152, Training Loss: 0.29956
Epoch: 2152, Training Loss: 0.60826
Epoch: 2153, Training Loss: 0.08950
Epoch: 2153, Training Loss: 0.32767
Epoch: 2153, Training Loss: 0.29910
Epoch: 2153, Training Loss: 0.60746
Epoch: 2154, Training Loss: 0.08965
Epoch: 2154, Training Loss: 0.32745
Epoch: 2154, Training Loss: 0.29865
Epoch: 2154, Training Loss: 0.60665
Epoch: 2155, Training Loss: 0.08980
Epoch: 2155, Training Loss: 0.32723
Epoch: 2155, Training Loss: 0.29819
Epoch: 2155, Training Loss: 0.60583
Epoch: 2156, Training Loss: 0.08995
Epoch: 2156, Training Loss: 0.32699
Epoch: 2156, Training Loss: 0.29774
Epoch: 2156, Training Loss: 0.60499
Epoch: 2157, Training Loss: 0.09011
Epoch: 2157, Training Loss: 0.32676
Epoch: 2157, Training Loss: 0.29728
Epoch: 2157, Training Loss: 0.60415
Epoch: 2158, Training Loss: 0.09026
Epoch: 2158, Training Loss: 0.32652
Epoch: 2158, Training Loss: 0.29682
Epoch: 2158, Training Loss: 0.60328
Epoch: 2159, Training Loss: 0.09041
Epoch: 2159, Training Loss: 0.32628
Epoch: 2159, Training Loss: 0.29636
Epoch: 2159, Training Loss: 0.60241
Epoch: 2160, Training Loss: 0.09057
Epoch: 2160, Training Loss: 0.32603
Epoch: 2160, Training Loss: 0.29591
Epoch: 2160, Training Loss: 0.60152
Epoch: 2161, Training Loss: 0.09072
Epoch: 2161, Training Loss: 0.32578
Epoch: 2161, Training Loss: 0.29545
Epoch: 2161, Training Loss: 0.60061
Epoch: 2162, Training Loss: 0.09088
Epoch: 2162, Training Loss: 0.32552
Epoch: 2162, Training Loss: 0.29499
Epoch: 2162, Training Loss: 0.59969
Epoch: 2163, Training Loss: 0.09103
Epoch: 2163, Training Loss: 0.32526
Epoch: 2163, Training Loss: 0.29454
Epoch: 2163, Training Loss: 0.59876
Epoch: 2164, Training Loss: 0.09119
Epoch: 2164, Training Loss: 0.32500
Epoch: 2164, Training Loss: 0.29409
Epoch: 2164, Training Loss: 0.59781
Epoch: 2165, Training Loss: 0.09134
Epoch: 2165, Training Loss: 0.32473
Epoch: 2165, Training Loss: 0.29363
Epoch: 2165, Training Loss: 0.59685
Epoch: 2166, Training Loss: 0.09150
Epoch: 2166, Training Loss: 0.32446
Epoch: 2166, Training Loss: 0.29318
Epoch: 2166, Training Loss: 0.59587
Epoch: 2167, Training Loss: 0.09166
Epoch: 2167, Training Loss: 0.32418
Epoch: 2167, Training Loss: 0.29273
Epoch: 2167, Training Loss: 0.59488
Epoch: 2168, Training Loss: 0.09182
Epoch: 2168, Training Loss: 0.32390
Epoch: 2168, Training Loss: 0.29229
Epoch: 2168, Training Loss: 0.59387
Epoch: 2169, Training Loss: 0.09198
Epoch: 2169, Training Loss: 0.32361
Epoch: 2169, Training Loss: 0.29184
Epoch: 2169, Training Loss: 0.59284
Epoch: 2170, Training Loss: 0.09214
Epoch: 2170, Training Loss: 0.32331
Epoch: 2170, Training Loss: 0.29140
Epoch: 2170, Training Loss: 0.59181
Epoch: 2171, Training Loss: 0.09230
Epoch: 2171, Training Loss: 0.32301
Epoch: 2171, Training Loss: 0.29096
Epoch: 2171, Training Loss: 0.59075
Epoch: 2172, Training Loss: 0.09246
Epoch: 2172, Training Loss: 0.32271
Epoch: 2172, Training Loss: 0.29053
Epoch: 2172, Training Loss: 0.58968
Epoch: 2173, Training Loss: 0.09263
Epoch: 2173, Training Loss: 0.32240
Epoch: 2173, Training Loss: 0.29009
Epoch: 2173, Training Loss: 0.58859
Epoch: 2174, Training Loss: 0.09279
Epoch: 2174, Training Loss: 0.32208
Epoch: 2174, Training Loss: 0.28966
Epoch: 2174, Training Loss: 0.58749
Epoch: 2175, Training Loss: 0.09296
Epoch: 2175, Training Loss: 0.32176
Epoch: 2175, Training Loss: 0.28924
Epoch: 2175, Training Loss: 0.58637
Epoch: 2176, Training Loss: 0.09312
Epoch: 2176, Training Loss: 0.32143
Epoch: 2176, Training Loss: 0.28881
Epoch: 2176, Training Loss: 0.58524
Epoch: 2177, Training Loss: 0.09329
Epoch: 2177, Training Loss: 0.32110
Epoch: 2177, Training Loss: 0.28839
Epoch: 2177, Training Loss: 0.58409
Epoch: 2178, Training Loss: 0.09346
Epoch: 2178, Training Loss: 0.32076
Epoch: 2178, Training Loss: 0.28797
Epoch: 2178, Training Loss: 0.58292
Epoch: 2179, Training Loss: 0.09363
Epoch: 2179, Training Loss: 0.32041
Epoch: 2179, Training Loss: 0.28756
Epoch: 2179, Training Loss: 0.58174
Epoch: 2180, Training Loss: 0.09380
Epoch: 2180, Training Loss: 0.32006
Epoch: 2180, Training Loss: 0.28715
Epoch: 2180, Training Loss: 0.58054
Epoch: 2181, Training Loss: 0.09397
Epoch: 2181, Training Loss: 0.31969
Epoch: 2181, Training Loss: 0.28674
Epoch: 2181, Training Loss: 0.57933
Epoch: 2182, Training Loss: 0.09415
Epoch: 2182, Training Loss: 0.31933
Epoch: 2182, Training Loss: 0.28634
Epoch: 2182, Training Loss: 0.57810
Epoch: 2183, Training Loss: 0.09432
Epoch: 2183, Training Loss: 0.31895
Epoch: 2183, Training Loss: 0.28594
Epoch: 2183, Training Loss: 0.57685
Epoch: 2184, Training Loss: 0.09450
Epoch: 2184, Training Loss: 0.31856
Epoch: 2184, Training Loss: 0.28554
Epoch: 2184, Training Loss: 0.57559
Epoch: 2185, Training Loss: 0.09468
Epoch: 2185, Training Loss: 0.31817
Epoch: 2185, Training Loss: 0.28515
Epoch: 2185, Training Loss: 0.57431
Epoch: 2186, Training Loss: 0.09486
Epoch: 2186, Training Loss: 0.31777
Epoch: 2186, Training Loss: 0.28476
Epoch: 2186, Training Loss: 0.57301
Epoch: 2187, Training Loss: 0.09504
Epoch: 2187, Training Loss: 0.31736
Epoch: 2187, Training Loss: 0.28437
Epoch: 2187, Training Loss: 0.57170
Epoch: 2188, Training Loss: 0.09522
Epoch: 2188, Training Loss: 0.31695
Epoch: 2188, Training Loss: 0.28399
Epoch: 2188, Training Loss: 0.57037
Epoch: 2189, Training Loss: 0.09541
Epoch: 2189, Training Loss: 0.31652
Epoch: 2189, Training Loss: 0.28360
Epoch: 2189, Training Loss: 0.56903
Epoch: 2190, Training Loss: 0.09560
Epoch: 2190, Training Loss: 0.31609
Epoch: 2190, Training Loss: 0.28323
Epoch: 2190, Training Loss: 0.56767
Epoch: 2191, Training Loss: 0.09579
Epoch: 2191, Training Loss: 0.31564
Epoch: 2191, Training Loss: 0.28285
Epoch: 2191, Training Loss: 0.56629
Epoch: 2192, Training Loss: 0.09598
Epoch: 2192, Training Loss: 0.31519
Epoch: 2192, Training Loss: 0.28248
Epoch: 2192, Training Loss: 0.56490
Epoch: 2193, Training Loss: 0.09617
Epoch: 2193, Training Loss: 0.31473
Epoch: 2193, Training Loss: 0.28211
Epoch: 2193, Training Loss: 0.56349
Epoch: 2194, Training Loss: 0.09636
Epoch: 2194, Training Loss: 0.31425
Epoch: 2194, Training Loss: 0.28174
Epoch: 2194, Training Loss: 0.56206
Epoch: 2195, Training Loss: 0.09656
Epoch: 2195, Training Loss: 0.31377
Epoch: 2195, Training Loss: 0.28137
Epoch: 2195, Training Loss: 0.56062
Epoch: 2196, Training Loss: 0.09676
Epoch: 2196, Training Loss: 0.31328
Epoch: 2196, Training Loss: 0.28101
Epoch: 2196, Training Loss: 0.55916
Epoch: 2197, Training Loss: 0.09696
Epoch: 2197, Training Loss: 0.31278
Epoch: 2197, Training Loss: 0.28065
Epoch: 2197, Training Loss: 0.55769
Epoch: 2198, Training Loss: 0.09716
Epoch: 2198, Training Loss: 0.31227
Epoch: 2198, Training Loss: 0.28029
Epoch: 2198, Training Loss: 0.55620
Epoch: 2199, Training Loss: 0.09736
Epoch: 2199, Training Loss: 0.31175
Epoch: 2199, Training Loss: 0.27993
Epoch: 2199, Training Loss: 0.55470
Epoch: 2200, Training Loss: 0.09757
Epoch: 2200, Training Loss: 0.31122
Epoch: 2200, Training Loss: 0.27957
Epoch: 2200, Training Loss: 0.55318
Epoch: 2201, Training Loss: 0.09777
Epoch: 2201, Training Loss: 0.31067
Epoch: 2201, Training Loss: 0.27922
Epoch: 2201, Training Loss: 0.55164
Epoch: 2202, Training Loss: 0.09798
Epoch: 2202, Training Loss: 0.31012
Epoch: 2202, Training Loss: 0.27886
Epoch: 2202, Training Loss: 0.55009
Epoch: 2203, Training Loss: 0.09819
Epoch: 2203, Training Loss: 0.30956
Epoch: 2203, Training Loss: 0.27850
Epoch: 2203, Training Loss: 0.54852
Epoch: 2204, Training Loss: 0.09841
Epoch: 2204, Training Loss: 0.30898
Epoch: 2204, Training Loss: 0.27815
Epoch: 2204, Training Loss: 0.54694
Epoch: 2205, Training Loss: 0.09862
Epoch: 2205, Training Loss: 0.30840
Epoch: 2205, Training Loss: 0.27780
Epoch: 2205, Training Loss: 0.54534
Epoch: 2206, Training Loss: 0.09884
Epoch: 2206, Training Loss: 0.30780
Epoch: 2206, Training Loss: 0.27744
Epoch: 2206, Training Loss: 0.54373
Epoch: 2207, Training Loss: 0.09905
Epoch: 2207, Training Loss: 0.30720
Epoch: 2207, Training Loss: 0.27709
Epoch: 2207, Training Loss: 0.54210
Epoch: 2208, Training Loss: 0.09927
Epoch: 2208, Training Loss: 0.30658
Epoch: 2208, Training Loss: 0.27673
Epoch: 2208, Training Loss: 0.54046
Epoch: 2209, Training Loss: 0.09949
Epoch: 2209, Training Loss: 0.30596
Epoch: 2209, Training Loss: 0.27638
Epoch: 2209, Training Loss: 0.53880
Epoch: 2210, Training Loss: 0.09972
Epoch: 2210, Training Loss: 0.30532
Epoch: 2210, Training Loss: 0.27602
Epoch: 2210, Training Loss: 0.53713
Epoch: 2211, Training Loss: 0.09994
Epoch: 2211, Training Loss: 0.30467
Epoch: 2211, Training Loss: 0.27566
Epoch: 2211, Training Loss: 0.53544
Epoch: 2212, Training Loss: 0.10016
Epoch: 2212, Training Loss: 0.30401
Epoch: 2212, Training Loss: 0.27530
Epoch: 2212, Training Loss: 0.53374
Epoch: 2213, Training Loss: 0.10039
Epoch: 2213, Training Loss: 0.30334
Epoch: 2213, Training Loss: 0.27494
Epoch: 2213, Training Loss: 0.53202
Epoch: 2214, Training Loss: 0.10062
Epoch: 2214, Training Loss: 0.30266
Epoch: 2214, Training Loss: 0.27458
Epoch: 2214, Training Loss: 0.53029
Epoch: 2215, Training Loss: 0.10084
Epoch: 2215, Training Loss: 0.30197
Epoch: 2215, Training Loss: 0.27422
Epoch: 2215, Training Loss: 0.52855
Epoch: 2216, Training Loss: 0.10107
Epoch: 2216, Training Loss: 0.30127
Epoch: 2216, Training Loss: 0.27385
Epoch: 2216, Training Loss: 0.52679
Epoch: 2217, Training Loss: 0.10130
Epoch: 2217, Training Loss: 0.30056
Epoch: 2217, Training Loss: 0.27348
Epoch: 2217, Training Loss: 0.52502
[... log truncated for readability: epochs 2218-2465 omitted; each epoch reports the same four per-pattern losses in this format, gradually decreasing over the range ...]
Epoch: 2466, Training Loss: 0.08841
Epoch: 2466, Training Loss: 0.12523
Epoch: 2466, Training Loss: 0.12555
Epoch: 2466, Training Loss: 0.18003
Epoch: 2467, Training Loss: 0.08826
Epoch: 2467, Training Loss: 0.12491
Epoch: 2467, Training Loss: 0.12522
Epoch: 2467, Training Loss: 0.17950
Epoch: 2468, Training Loss: 0.08812
Epoch: 2468, Training Loss: 0.12458
Epoch: 2468, Training Loss: 0.12490
Epoch: 2468, Training Loss: 0.17897
Epoch: 2469, Training Loss: 0.08797
Epoch: 2469, Training Loss: 0.12426
Epoch: 2469, Training Loss: 0.12458
Epoch: 2469, Training Loss: 0.17845
Epoch: 2470, Training Loss: 0.08782
Epoch: 2470, Training Loss: 0.12394
Epoch: 2470, Training Loss: 0.12426
Epoch: 2470, Training Loss: 0.17793
Epoch: 2471, Training Loss: 0.08768
Epoch: 2471, Training Loss: 0.12362
Epoch: 2471, Training Loss: 0.12394
Epoch: 2471, Training Loss: 0.17741
Epoch: 2472, Training Loss: 0.08753
Epoch: 2472, Training Loss: 0.12331
Epoch: 2472, Training Loss: 0.12362
Epoch: 2472, Training Loss: 0.17690
Epoch: 2473, Training Loss: 0.08739
Epoch: 2473, Training Loss: 0.12299
Epoch: 2473, Training Loss: 0.12331
Epoch: 2473, Training Loss: 0.17639
Epoch: 2474, Training Loss: 0.08724
Epoch: 2474, Training Loss: 0.12268
Epoch: 2474, Training Loss: 0.12299
Epoch: 2474, Training Loss: 0.17588
Epoch: 2475, Training Loss: 0.08710
Epoch: 2475, Training Loss: 0.12237
Epoch: 2475, Training Loss: 0.12268
Epoch: 2475, Training Loss: 0.17538
Epoch: 2476, Training Loss: 0.08696
Epoch: 2476, Training Loss: 0.12206
Epoch: 2476, Training Loss: 0.12237
Epoch: 2476, Training Loss: 0.17488
Epoch: 2477, Training Loss: 0.08682
Epoch: 2477, Training Loss: 0.12176
Epoch: 2477, Training Loss: 0.12207
Epoch: 2477, Training Loss: 0.17438
Epoch: 2478, Training Loss: 0.08667
Epoch: 2478, Training Loss: 0.12145
Epoch: 2478, Training Loss: 0.12176
Epoch: 2478, Training Loss: 0.17388
Epoch: 2479, Training Loss: 0.08653
Epoch: 2479, Training Loss: 0.12115
Epoch: 2479, Training Loss: 0.12146
Epoch: 2479, Training Loss: 0.17339
Epoch: 2480, Training Loss: 0.08639
Epoch: 2480, Training Loss: 0.12085
Epoch: 2480, Training Loss: 0.12116
Epoch: 2480, Training Loss: 0.17291
Epoch: 2481, Training Loss: 0.08625
Epoch: 2481, Training Loss: 0.12055
Epoch: 2481, Training Loss: 0.12086
Epoch: 2481, Training Loss: 0.17242
Epoch: 2482, Training Loss: 0.08611
Epoch: 2482, Training Loss: 0.12025
Epoch: 2482, Training Loss: 0.12056
Epoch: 2482, Training Loss: 0.17194
Epoch: 2483, Training Loss: 0.08597
Epoch: 2483, Training Loss: 0.11995
Epoch: 2483, Training Loss: 0.12026
Epoch: 2483, Training Loss: 0.17146
Epoch: 2484, Training Loss: 0.08583
Epoch: 2484, Training Loss: 0.11966
Epoch: 2484, Training Loss: 0.11997
Epoch: 2484, Training Loss: 0.17099
Epoch: 2485, Training Loss: 0.08569
Epoch: 2485, Training Loss: 0.11937
Epoch: 2485, Training Loss: 0.11967
Epoch: 2485, Training Loss: 0.17052
Epoch: 2486, Training Loss: 0.08555
Epoch: 2486, Training Loss: 0.11908
Epoch: 2486, Training Loss: 0.11938
Epoch: 2486, Training Loss: 0.17005
Epoch: 2487, Training Loss: 0.08542
Epoch: 2487, Training Loss: 0.11879
Epoch: 2487, Training Loss: 0.11909
Epoch: 2487, Training Loss: 0.16958
Epoch: 2488, Training Loss: 0.08528
Epoch: 2488, Training Loss: 0.11850
Epoch: 2488, Training Loss: 0.11881
Epoch: 2488, Training Loss: 0.16912
Epoch: 2489, Training Loss: 0.08514
Epoch: 2489, Training Loss: 0.11822
Epoch: 2489, Training Loss: 0.11852
Epoch: 2489, Training Loss: 0.16866
Epoch: 2490, Training Loss: 0.08501
Epoch: 2490, Training Loss: 0.11793
Epoch: 2490, Training Loss: 0.11824
Epoch: 2490, Training Loss: 0.16820
Epoch: 2491, Training Loss: 0.08487
Epoch: 2491, Training Loss: 0.11765
Epoch: 2491, Training Loss: 0.11795
Epoch: 2491, Training Loss: 0.16775
Epoch: 2492, Training Loss: 0.08474
Epoch: 2492, Training Loss: 0.11737
Epoch: 2492, Training Loss: 0.11767
Epoch: 2492, Training Loss: 0.16730
Epoch: 2493, Training Loss: 0.08460
Epoch: 2493, Training Loss: 0.11709
Epoch: 2493, Training Loss: 0.11739
Epoch: 2493, Training Loss: 0.16685
Epoch: 2494, Training Loss: 0.08447
Epoch: 2494, Training Loss: 0.11681
Epoch: 2494, Training Loss: 0.11712
Epoch: 2494, Training Loss: 0.16641
Epoch: 2495, Training Loss: 0.08433
Epoch: 2495, Training Loss: 0.11654
Epoch: 2495, Training Loss: 0.11684
Epoch: 2495, Training Loss: 0.16596
Epoch: 2496, Training Loss: 0.08420
Epoch: 2496, Training Loss: 0.11627
Epoch: 2496, Training Loss: 0.11657
Epoch: 2496, Training Loss: 0.16552
Epoch: 2497, Training Loss: 0.08407
Epoch: 2497, Training Loss: 0.11599
Epoch: 2497, Training Loss: 0.11629
Epoch: 2497, Training Loss: 0.16509
Epoch: 2498, Training Loss: 0.08393
Epoch: 2498, Training Loss: 0.11572
Epoch: 2498, Training Loss: 0.11602
Epoch: 2498, Training Loss: 0.16465
Epoch: 2499, Training Loss: 0.08380
Epoch: 2499, Training Loss: 0.11545
Epoch: 2499, Training Loss: 0.11575
Epoch: 2499, Training Loss: 0.16422
Epoch: 2500, Training Loss: 0.08367
Epoch: 2500, Training Loss: 0.11519
Epoch: 2500, Training Loss: 0.11548
Epoch: 2500, Training Loss: 0.16379
Epoch: 2501, Training Loss: 0.08354
Epoch: 2501, Training Loss: 0.11492
Epoch: 2501, Training Loss: 0.11522
Epoch: 2501, Training Loss: 0.16337
Epoch: 2502, Training Loss: 0.08341
Epoch: 2502, Training Loss: 0.11465
Epoch: 2502, Training Loss: 0.11495
Epoch: 2502, Training Loss: 0.16294
Epoch: 2503, Training Loss: 0.08328
Epoch: 2503, Training Loss: 0.11439
Epoch: 2503, Training Loss: 0.11469
Epoch: 2503, Training Loss: 0.16252
Epoch: 2504, Training Loss: 0.08315
Epoch: 2504, Training Loss: 0.11413
Epoch: 2504, Training Loss: 0.11443
Epoch: 2504, Training Loss: 0.16210
Epoch: 2505, Training Loss: 0.08302
Epoch: 2505, Training Loss: 0.11387
Epoch: 2505, Training Loss: 0.11416
Epoch: 2505, Training Loss: 0.16169
Epoch: 2506, Training Loss: 0.08289
Epoch: 2506, Training Loss: 0.11361
Epoch: 2506, Training Loss: 0.11390
Epoch: 2506, Training Loss: 0.16127
Epoch: 2507, Training Loss: 0.08277
Epoch: 2507, Training Loss: 0.11335
Epoch: 2507, Training Loss: 0.11365
Epoch: 2507, Training Loss: 0.16086
Epoch: 2508, Training Loss: 0.08264
Epoch: 2508, Training Loss: 0.11310
Epoch: 2508, Training Loss: 0.11339
Epoch: 2508, Training Loss: 0.16045
Epoch: 2509, Training Loss: 0.08251
Epoch: 2509, Training Loss: 0.11284
Epoch: 2509, Training Loss: 0.11314
Epoch: 2509, Training Loss: 0.16005
Epoch: 2510, Training Loss: 0.08238
Epoch: 2510, Training Loss: 0.11259
Epoch: 2510, Training Loss: 0.11288
Epoch: 2510, Training Loss: 0.15965
Epoch: 2511, Training Loss: 0.08226
Epoch: 2511, Training Loss: 0.11234
Epoch: 2511, Training Loss: 0.11263
Epoch: 2511, Training Loss: 0.15924
Epoch: 2512, Training Loss: 0.08213
Epoch: 2512, Training Loss: 0.11209
Epoch: 2512, Training Loss: 0.11238
Epoch: 2512, Training Loss: 0.15885
Epoch: 2513, Training Loss: 0.08201
Epoch: 2513, Training Loss: 0.11184
Epoch: 2513, Training Loss: 0.11213
Epoch: 2513, Training Loss: 0.15845
Epoch: 2514, Training Loss: 0.08188
Epoch: 2514, Training Loss: 0.11159
Epoch: 2514, Training Loss: 0.11188
Epoch: 2514, Training Loss: 0.15806
Epoch: 2515, Training Loss: 0.08176
Epoch: 2515, Training Loss: 0.11135
Epoch: 2515, Training Loss: 0.11163
Epoch: 2515, Training Loss: 0.15766
Epoch: 2516, Training Loss: 0.08163
Epoch: 2516, Training Loss: 0.11110
Epoch: 2516, Training Loss: 0.11139
Epoch: 2516, Training Loss: 0.15728
Epoch: 2517, Training Loss: 0.08151
Epoch: 2517, Training Loss: 0.11086
Epoch: 2517, Training Loss: 0.11115
Epoch: 2517, Training Loss: 0.15689
Epoch: 2518, Training Loss: 0.08139
Epoch: 2518, Training Loss: 0.11062
Epoch: 2518, Training Loss: 0.11090
Epoch: 2518, Training Loss: 0.15650
Epoch: 2519, Training Loss: 0.08126
Epoch: 2519, Training Loss: 0.11037
Epoch: 2519, Training Loss: 0.11066
Epoch: 2519, Training Loss: 0.15612
Epoch: 2520, Training Loss: 0.08114
Epoch: 2520, Training Loss: 0.11014
Epoch: 2520, Training Loss: 0.11042
Epoch: 2520, Training Loss: 0.15574
Epoch: 2521, Training Loss: 0.08102
Epoch: 2521, Training Loss: 0.10990
Epoch: 2521, Training Loss: 0.11018
Epoch: 2521, Training Loss: 0.15536
Epoch: 2522, Training Loss: 0.08090
Epoch: 2522, Training Loss: 0.10966
Epoch: 2522, Training Loss: 0.10995
Epoch: 2522, Training Loss: 0.15499
Epoch: 2523, Training Loss: 0.08078
Epoch: 2523, Training Loss: 0.10942
Epoch: 2523, Training Loss: 0.10971
Epoch: 2523, Training Loss: 0.15461
Epoch: 2524, Training Loss: 0.08066
Epoch: 2524, Training Loss: 0.10919
Epoch: 2524, Training Loss: 0.10947
Epoch: 2524, Training Loss: 0.15424
Epoch: 2525, Training Loss: 0.08054
Epoch: 2525, Training Loss: 0.10896
Epoch: 2525, Training Loss: 0.10924
Epoch: 2525, Training Loss: 0.15387
Epoch: 2526, Training Loss: 0.08042
Epoch: 2526, Training Loss: 0.10873
Epoch: 2526, Training Loss: 0.10901
Epoch: 2526, Training Loss: 0.15351
Epoch: 2527, Training Loss: 0.08030
Epoch: 2527, Training Loss: 0.10849
Epoch: 2527, Training Loss: 0.10878
Epoch: 2527, Training Loss: 0.15314
Epoch: 2528, Training Loss: 0.08018
Epoch: 2528, Training Loss: 0.10827
Epoch: 2528, Training Loss: 0.10855
Epoch: 2528, Training Loss: 0.15278
Epoch: 2529, Training Loss: 0.08006
Epoch: 2529, Training Loss: 0.10804
Epoch: 2529, Training Loss: 0.10832
Epoch: 2529, Training Loss: 0.15242
Epoch: 2530, Training Loss: 0.07994
Epoch: 2530, Training Loss: 0.10781
Epoch: 2530, Training Loss: 0.10809
Epoch: 2530, Training Loss: 0.15206
Epoch: 2531, Training Loss: 0.07982
Epoch: 2531, Training Loss: 0.10758
Epoch: 2531, Training Loss: 0.10786
Epoch: 2531, Training Loss: 0.15170
Epoch: 2532, Training Loss: 0.07971
Epoch: 2532, Training Loss: 0.10736
Epoch: 2532, Training Loss: 0.10764
Epoch: 2532, Training Loss: 0.15135
Epoch: 2533, Training Loss: 0.07959
Epoch: 2533, Training Loss: 0.10714
Epoch: 2533, Training Loss: 0.10742
Epoch: 2533, Training Loss: 0.15100
Epoch: 2534, Training Loss: 0.07947
Epoch: 2534, Training Loss: 0.10691
Epoch: 2534, Training Loss: 0.10719
Epoch: 2534, Training Loss: 0.15065
Epoch: 2535, Training Loss: 0.07936
Epoch: 2535, Training Loss: 0.10669
Epoch: 2535, Training Loss: 0.10697
Epoch: 2535, Training Loss: 0.15030
Epoch: 2536, Training Loss: 0.07924
Epoch: 2536, Training Loss: 0.10647
Epoch: 2536, Training Loss: 0.10675
Epoch: 2536, Training Loss: 0.14995
Epoch: 2537, Training Loss: 0.07913
Epoch: 2537, Training Loss: 0.10625
Epoch: 2537, Training Loss: 0.10653
Epoch: 2537, Training Loss: 0.14961
Epoch: 2538, Training Loss: 0.07901
Epoch: 2538, Training Loss: 0.10604
Epoch: 2538, Training Loss: 0.10631
Epoch: 2538, Training Loss: 0.14926
Epoch: 2539, Training Loss: 0.07890
Epoch: 2539, Training Loss: 0.10582
Epoch: 2539, Training Loss: 0.10610
Epoch: 2539, Training Loss: 0.14892
Epoch: 2540, Training Loss: 0.07878
Epoch: 2540, Training Loss: 0.10561
Epoch: 2540, Training Loss: 0.10588
Epoch: 2540, Training Loss: 0.14858
Epoch: 2541, Training Loss: 0.07867
Epoch: 2541, Training Loss: 0.10539
Epoch: 2541, Training Loss: 0.10567
Epoch: 2541, Training Loss: 0.14825
Epoch: 2542, Training Loss: 0.07856
Epoch: 2542, Training Loss: 0.10518
Epoch: 2542, Training Loss: 0.10545
Epoch: 2542, Training Loss: 0.14791
Epoch: 2543, Training Loss: 0.07845
Epoch: 2543, Training Loss: 0.10497
Epoch: 2543, Training Loss: 0.10524
Epoch: 2543, Training Loss: 0.14758
Epoch: 2544, Training Loss: 0.07833
Epoch: 2544, Training Loss: 0.10476
Epoch: 2544, Training Loss: 0.10503
Epoch: 2544, Training Loss: 0.14724
Epoch: 2545, Training Loss: 0.07822
Epoch: 2545, Training Loss: 0.10455
Epoch: 2545, Training Loss: 0.10482
Epoch: 2545, Training Loss: 0.14691
Epoch: 2546, Training Loss: 0.07811
Epoch: 2546, Training Loss: 0.10434
Epoch: 2546, Training Loss: 0.10461
Epoch: 2546, Training Loss: 0.14659
Epoch: 2547, Training Loss: 0.07800
Epoch: 2547, Training Loss: 0.10413
Epoch: 2547, Training Loss: 0.10440
Epoch: 2547, Training Loss: 0.14626
Epoch: 2548, Training Loss: 0.07789
Epoch: 2548, Training Loss: 0.10392
Epoch: 2548, Training Loss: 0.10419
Epoch: 2548, Training Loss: 0.14594
Epoch: 2549, Training Loss: 0.07778
Epoch: 2549, Training Loss: 0.10372
Epoch: 2549, Training Loss: 0.10399
Epoch: 2549, Training Loss: 0.14561
Epoch: 2550, Training Loss: 0.07767
Epoch: 2550, Training Loss: 0.10351
Epoch: 2550, Training Loss: 0.10378
Epoch: 2550, Training Loss: 0.14529
Epoch: 2551, Training Loss: 0.07756
Epoch: 2551, Training Loss: 0.10331
Epoch: 2551, Training Loss: 0.10358
Epoch: 2551, Training Loss: 0.14497
Epoch: 2552, Training Loss: 0.07745
Epoch: 2552, Training Loss: 0.10311
Epoch: 2552, Training Loss: 0.10337
Epoch: 2552, Training Loss: 0.14466
Epoch: 2553, Training Loss: 0.07734
Epoch: 2553, Training Loss: 0.10290
Epoch: 2553, Training Loss: 0.10317
Epoch: 2553, Training Loss: 0.14434
Epoch: 2554, Training Loss: 0.07723
Epoch: 2554, Training Loss: 0.10270
Epoch: 2554, Training Loss: 0.10297
Epoch: 2554, Training Loss: 0.14403
Epoch: 2555, Training Loss: 0.07713
Epoch: 2555, Training Loss: 0.10250
Epoch: 2555, Training Loss: 0.10277
Epoch: 2555, Training Loss: 0.14371
Epoch: 2556, Training Loss: 0.07702
Epoch: 2556, Training Loss: 0.10231
Epoch: 2556, Training Loss: 0.10257
Epoch: 2556, Training Loss: 0.14340
Epoch: 2557, Training Loss: 0.07691
Epoch: 2557, Training Loss: 0.10211
Epoch: 2557, Training Loss: 0.10237
Epoch: 2557, Training Loss: 0.14309
Epoch: 2558, Training Loss: 0.07680
Epoch: 2558, Training Loss: 0.10191
Epoch: 2558, Training Loss: 0.10218
Epoch: 2558, Training Loss: 0.14279
Epoch: 2559, Training Loss: 0.07670
Epoch: 2559, Training Loss: 0.10172
Epoch: 2559, Training Loss: 0.10198
Epoch: 2559, Training Loss: 0.14248
Epoch: 2560, Training Loss: 0.07659
Epoch: 2560, Training Loss: 0.10152
Epoch: 2560, Training Loss: 0.10178
Epoch: 2560, Training Loss: 0.14218
Epoch: 2561, Training Loss: 0.07649
Epoch: 2561, Training Loss: 0.10133
Epoch: 2561, Training Loss: 0.10159
Epoch: 2561, Training Loss: 0.14187
Epoch: 2562, Training Loss: 0.07638
Epoch: 2562, Training Loss: 0.10113
Epoch: 2562, Training Loss: 0.10140
Epoch: 2562, Training Loss: 0.14157
Epoch: 2563, Training Loss: 0.07628
Epoch: 2563, Training Loss: 0.10094
Epoch: 2563, Training Loss: 0.10120
Epoch: 2563, Training Loss: 0.14127
Epoch: 2564, Training Loss: 0.07617
Epoch: 2564, Training Loss: 0.10075
Epoch: 2564, Training Loss: 0.10101
Epoch: 2564, Training Loss: 0.14097
Epoch: 2565, Training Loss: 0.07607
Epoch: 2565, Training Loss: 0.10056
Epoch: 2565, Training Loss: 0.10082
Epoch: 2565, Training Loss: 0.14068
Epoch: 2566, Training Loss: 0.07596
Epoch: 2566, Training Loss: 0.10037
Epoch: 2566, Training Loss: 0.10063
Epoch: 2566, Training Loss: 0.14038
Epoch: 2567, Training Loss: 0.07586
Epoch: 2567, Training Loss: 0.10018
Epoch: 2567, Training Loss: 0.10044
Epoch: 2567, Training Loss: 0.14009
Epoch: 2568, Training Loss: 0.07576
Epoch: 2568, Training Loss: 0.10000
Epoch: 2568, Training Loss: 0.10026
Epoch: 2568, Training Loss: 0.13980
Epoch: 2569, Training Loss: 0.07565
Epoch: 2569, Training Loss: 0.09981
Epoch: 2569, Training Loss: 0.10007
Epoch: 2569, Training Loss: 0.13951
Epoch: 2570, Training Loss: 0.07555
Epoch: 2570, Training Loss: 0.09962
Epoch: 2570, Training Loss: 0.09988
Epoch: 2570, Training Loss: 0.13922
Epoch: 2571, Training Loss: 0.07545
Epoch: 2571, Training Loss: 0.09944
Epoch: 2571, Training Loss: 0.09970
Epoch: 2571, Training Loss: 0.13893
Epoch: 2572, Training Loss: 0.07535
Epoch: 2572, Training Loss: 0.09926
Epoch: 2572, Training Loss: 0.09951
Epoch: 2572, Training Loss: 0.13865
Epoch: 2573, Training Loss: 0.07525
Epoch: 2573, Training Loss: 0.09907
Epoch: 2573, Training Loss: 0.09933
Epoch: 2573, Training Loss: 0.13836
Epoch: 2574, Training Loss: 0.07514
Epoch: 2574, Training Loss: 0.09889
Epoch: 2574, Training Loss: 0.09915
Epoch: 2574, Training Loss: 0.13808
Epoch: 2575, Training Loss: 0.07504
Epoch: 2575, Training Loss: 0.09871
Epoch: 2575, Training Loss: 0.09897
Epoch: 2575, Training Loss: 0.13780
Epoch: 2576, Training Loss: 0.07494
Epoch: 2576, Training Loss: 0.09853
Epoch: 2576, Training Loss: 0.09879
Epoch: 2576, Training Loss: 0.13752
Epoch: 2577, Training Loss: 0.07484
Epoch: 2577, Training Loss: 0.09835
Epoch: 2577, Training Loss: 0.09861
Epoch: 2577, Training Loss: 0.13724
Epoch: 2578, Training Loss: 0.07474
Epoch: 2578, Training Loss: 0.09817
Epoch: 2578, Training Loss: 0.09843
Epoch: 2578, Training Loss: 0.13696
Epoch: 2579, Training Loss: 0.07464
Epoch: 2579, Training Loss: 0.09799
Epoch: 2579, Training Loss: 0.09825
Epoch: 2579, Training Loss: 0.13668
Epoch: 2580, Training Loss: 0.07455
Epoch: 2580, Training Loss: 0.09782
Epoch: 2580, Training Loss: 0.09807
Epoch: 2580, Training Loss: 0.13641
Epoch: 2581, Training Loss: 0.07445
Epoch: 2581, Training Loss: 0.09764
Epoch: 2581, Training Loss: 0.09789
Epoch: 2581, Training Loss: 0.13614
Epoch: 2582, Training Loss: 0.07435
Epoch: 2582, Training Loss: 0.09747
Epoch: 2582, Training Loss: 0.09772
Epoch: 2582, Training Loss: 0.13586
Epoch: 2583, Training Loss: 0.07425
Epoch: 2583, Training Loss: 0.09729
Epoch: 2583, Training Loss: 0.09754
Epoch: 2583, Training Loss: 0.13559
Epoch: 2584, Training Loss: 0.07415
Epoch: 2584, Training Loss: 0.09712
Epoch: 2584, Training Loss: 0.09737
Epoch: 2584, Training Loss: 0.13532
Epoch: 2585, Training Loss: 0.07406
Epoch: 2585, Training Loss: 0.09695
Epoch: 2585, Training Loss: 0.09720
Epoch: 2585, Training Loss: 0.13506
Epoch: 2586, Training Loss: 0.07396
Epoch: 2586, Training Loss: 0.09677
Epoch: 2586, Training Loss: 0.09702
Epoch: 2586, Training Loss: 0.13479
Epoch: 2587, Training Loss: 0.07386
Epoch: 2587, Training Loss: 0.09660
Epoch: 2587, Training Loss: 0.09685
Epoch: 2587, Training Loss: 0.13453
Epoch: 2588, Training Loss: 0.07377
Epoch: 2588, Training Loss: 0.09643
Epoch: 2588, Training Loss: 0.09668
Epoch: 2588, Training Loss: 0.13426
Epoch: 2589, Training Loss: 0.07367
Epoch: 2589, Training Loss: 0.09626
Epoch: 2589, Training Loss: 0.09651
Epoch: 2589, Training Loss: 0.13400
Epoch: 2590, Training Loss: 0.07357
Epoch: 2590, Training Loss: 0.09609
Epoch: 2590, Training Loss: 0.09634
Epoch: 2590, Training Loss: 0.13374
Epoch: 2591, Training Loss: 0.07348
Epoch: 2591, Training Loss: 0.09592
Epoch: 2591, Training Loss: 0.09617
Epoch: 2591, Training Loss: 0.13348
Epoch: 2592, Training Loss: 0.07338
Epoch: 2592, Training Loss: 0.09576
Epoch: 2592, Training Loss: 0.09601
Epoch: 2592, Training Loss: 0.13322
Epoch: 2593, Training Loss: 0.07329
Epoch: 2593, Training Loss: 0.09559
Epoch: 2593, Training Loss: 0.09584
Epoch: 2593, Training Loss: 0.13296
Epoch: 2594, Training Loss: 0.07320
Epoch: 2594, Training Loss: 0.09542
Epoch: 2594, Training Loss: 0.09567
Epoch: 2594, Training Loss: 0.13270
Epoch: 2595, Training Loss: 0.07310
Epoch: 2595, Training Loss: 0.09526
Epoch: 2595, Training Loss: 0.09551
Epoch: 2595, Training Loss: 0.13245
Epoch: 2596, Training Loss: 0.07301
Epoch: 2596, Training Loss: 0.09510
Epoch: 2596, Training Loss: 0.09534
Epoch: 2596, Training Loss: 0.13220
Epoch: 2597, Training Loss: 0.07291
Epoch: 2597, Training Loss: 0.09493
Epoch: 2597, Training Loss: 0.09518
Epoch: 2597, Training Loss: 0.13194
Epoch: 2598, Training Loss: 0.07282
Epoch: 2598, Training Loss: 0.09477
Epoch: 2598, Training Loss: 0.09501
Epoch: 2598, Training Loss: 0.13169
Epoch: 2599, Training Loss: 0.07273
Epoch: 2599, Training Loss: 0.09461
Epoch: 2599, Training Loss: 0.09485
Epoch: 2599, Training Loss: 0.13144
Epoch: 2600, Training Loss: 0.07264
Epoch: 2600, Training Loss: 0.09445
Epoch: 2600, Training Loss: 0.09469
Epoch: 2600, Training Loss: 0.13119
Epoch: 2601, Training Loss: 0.07254
Epoch: 2601, Training Loss: 0.09428
Epoch: 2601, Training Loss: 0.09453
Epoch: 2601, Training Loss: 0.13094
Epoch: 2602, Training Loss: 0.07245
Epoch: 2602, Training Loss: 0.09412
Epoch: 2602, Training Loss: 0.09437
Epoch: 2602, Training Loss: 0.13070
Epoch: 2603, Training Loss: 0.07236
Epoch: 2603, Training Loss: 0.09397
Epoch: 2603, Training Loss: 0.09421
Epoch: 2603, Training Loss: 0.13045
Epoch: 2604, Training Loss: 0.07227
Epoch: 2604, Training Loss: 0.09381
Epoch: 2604, Training Loss: 0.09405
Epoch: 2604, Training Loss: 0.13021
Epoch: 2605, Training Loss: 0.07218
Epoch: 2605, Training Loss: 0.09365
Epoch: 2605, Training Loss: 0.09389
Epoch: 2605, Training Loss: 0.12996
Epoch: 2606, Training Loss: 0.07209
Epoch: 2606, Training Loss: 0.09349
Epoch: 2606, Training Loss: 0.09373
Epoch: 2606, Training Loss: 0.12972
Epoch: 2607, Training Loss: 0.07200
Epoch: 2607, Training Loss: 0.09333
Epoch: 2607, Training Loss: 0.09358
Epoch: 2607, Training Loss: 0.12948
Epoch: 2608, Training Loss: 0.07191
Epoch: 2608, Training Loss: 0.09318
Epoch: 2608, Training Loss: 0.09342
Epoch: 2608, Training Loss: 0.12924
Epoch: 2609, Training Loss: 0.07182
Epoch: 2609, Training Loss: 0.09302
Epoch: 2609, Training Loss: 0.09326
Epoch: 2609, Training Loss: 0.12900
Epoch: 2610, Training Loss: 0.07173
Epoch: 2610, Training Loss: 0.09287
Epoch: 2610, Training Loss: 0.09311
Epoch: 2610, Training Loss: 0.12877
Epoch: 2611, Training Loss: 0.07164
Epoch: 2611, Training Loss: 0.09272
Epoch: 2611, Training Loss: 0.09295
Epoch: 2611, Training Loss: 0.12853
Epoch: 2612, Training Loss: 0.07155
Epoch: 2612, Training Loss: 0.09256
Epoch: 2612, Training Loss: 0.09280
Epoch: 2612, Training Loss: 0.12829
Epoch: 2613, Training Loss: 0.07146
Epoch: 2613, Training Loss: 0.09241
Epoch: 2613, Training Loss: 0.09265
Epoch: 2613, Training Loss: 0.12806
Epoch: 2614, Training Loss: 0.07137
Epoch: 2614, Training Loss: 0.09226
Epoch: 2614, Training Loss: 0.09250
Epoch: 2614, Training Loss: 0.12783
Epoch: 2615, Training Loss: 0.07129
Epoch: 2615, Training Loss: 0.09211
Epoch: 2615, Training Loss: 0.09234
Epoch: 2615, Training Loss: 0.12759
Epoch: 2616, Training Loss: 0.07120
Epoch: 2616, Training Loss: 0.09196
Epoch: 2616, Training Loss: 0.09219
Epoch: 2616, Training Loss: 0.12736
Epoch: 2617, Training Loss: 0.07111
Epoch: 2617, Training Loss: 0.09181
Epoch: 2617, Training Loss: 0.09204
Epoch: 2617, Training Loss: 0.12713
Epoch: 2618, Training Loss: 0.07102
Epoch: 2618, Training Loss: 0.09166
Epoch: 2618, Training Loss: 0.09189
Epoch: 2618, Training Loss: 0.12690
Epoch: 2619, Training Loss: 0.07094
Epoch: 2619, Training Loss: 0.09151
Epoch: 2619, Training Loss: 0.09174
Epoch: 2619, Training Loss: 0.12668
Epoch: 2620, Training Loss: 0.07085
Epoch: 2620, Training Loss: 0.09136
Epoch: 2620, Training Loss: 0.09160
Epoch: 2620, Training Loss: 0.12645
Epoch: 2621, Training Loss: 0.07076
Epoch: 2621, Training Loss: 0.09121
Epoch: 2621, Training Loss: 0.09145
Epoch: 2621, Training Loss: 0.12622
Epoch: 2622, Training Loss: 0.07068
Epoch: 2622, Training Loss: 0.09107
Epoch: 2622, Training Loss: 0.09130
Epoch: 2622, Training Loss: 0.12600
Epoch: 2623, Training Loss: 0.07059
Epoch: 2623, Training Loss: 0.09092
Epoch: 2623, Training Loss: 0.09115
Epoch: 2623, Training Loss: 0.12577
Epoch: 2624, Training Loss: 0.07051
Epoch: 2624, Training Loss: 0.09077
Epoch: 2624, Training Loss: 0.09101
Epoch: 2624, Training Loss: 0.12555
Epoch: 2625, Training Loss: 0.07042
Epoch: 2625, Training Loss: 0.09063
Epoch: 2625, Training Loss: 0.09086
Epoch: 2625, Training Loss: 0.12533
Epoch: 2626, Training Loss: 0.07034
Epoch: 2626, Training Loss: 0.09049
Epoch: 2626, Training Loss: 0.09072
Epoch: 2626, Training Loss: 0.12511
Epoch: 2627, Training Loss: 0.07025
Epoch: 2627, Training Loss: 0.09034
Epoch: 2627, Training Loss: 0.09057
Epoch: 2627, Training Loss: 0.12489
Epoch: 2628, Training Loss: 0.07017
Epoch: 2628, Training Loss: 0.09020
Epoch: 2628, Training Loss: 0.09043
Epoch: 2628, Training Loss: 0.12467
Epoch: 2629, Training Loss: 0.07008
Epoch: 2629, Training Loss: 0.09006
Epoch: 2629, Training Loss: 0.09029
Epoch: 2629, Training Loss: 0.12445
Epoch: 2630, Training Loss: 0.07000
Epoch: 2630, Training Loss: 0.08991
Epoch: 2630, Training Loss: 0.09014
Epoch: 2630, Training Loss: 0.12423
Epoch: 2631, Training Loss: 0.06992
Epoch: 2631, Training Loss: 0.08977
Epoch: 2631, Training Loss: 0.09000
Epoch: 2631, Training Loss: 0.12402
Epoch: 2632, Training Loss: 0.06983
Epoch: 2632, Training Loss: 0.08963
Epoch: 2632, Training Loss: 0.08986
Epoch: 2632, Training Loss: 0.12380
Epoch: 2633, Training Loss: 0.06975
Epoch: 2633, Training Loss: 0.08949
Epoch: 2633, Training Loss: 0.08972
Epoch: 2633, Training Loss: 0.12359
Epoch: 2634, Training Loss: 0.06967
Epoch: 2634, Training Loss: 0.08935
Epoch: 2634, Training Loss: 0.08958
Epoch: 2634, Training Loss: 0.12337
Epoch: 2635, Training Loss: 0.06958
Epoch: 2635, Training Loss: 0.08921
Epoch: 2635, Training Loss: 0.08944
Epoch: 2635, Training Loss: 0.12316
Epoch: 2636, Training Loss: 0.06950
Epoch: 2636, Training Loss: 0.08908
Epoch: 2636, Training Loss: 0.08930
Epoch: 2636, Training Loss: 0.12295
Epoch: 2637, Training Loss: 0.06942
Epoch: 2637, Training Loss: 0.08894
Epoch: 2637, Training Loss: 0.08916
Epoch: 2637, Training Loss: 0.12274
Epoch: 2638, Training Loss: 0.06934
Epoch: 2638, Training Loss: 0.08880
Epoch: 2638, Training Loss: 0.08903
Epoch: 2638, Training Loss: 0.12253
Epoch: 2639, Training Loss: 0.06926
Epoch: 2639, Training Loss: 0.08866
Epoch: 2639, Training Loss: 0.08889
Epoch: 2639, Training Loss: 0.12232
Epoch: 2640, Training Loss: 0.06918
Epoch: 2640, Training Loss: 0.08853
Epoch: 2640, Training Loss: 0.08875
Epoch: 2640, Training Loss: 0.12211
Epoch: 2641, Training Loss: 0.06909
Epoch: 2641, Training Loss: 0.08839
Epoch: 2641, Training Loss: 0.08862
Epoch: 2641, Training Loss: 0.12191
Epoch: 2642, Training Loss: 0.06901
Epoch: 2642, Training Loss: 0.08826
Epoch: 2642, Training Loss: 0.08848
Epoch: 2642, Training Loss: 0.12170
Epoch: 2643, Training Loss: 0.06893
Epoch: 2643, Training Loss: 0.08812
Epoch: 2643, Training Loss: 0.08835
Epoch: 2643, Training Loss: 0.12150
Epoch: 2644, Training Loss: 0.06885
Epoch: 2644, Training Loss: 0.08799
Epoch: 2644, Training Loss: 0.08821
Epoch: 2644, Training Loss: 0.12129
Epoch: 2645, Training Loss: 0.06877
Epoch: 2645, Training Loss: 0.08785
Epoch: 2645, Training Loss: 0.08808
Epoch: 2645, Training Loss: 0.12109
Epoch: 2646, Training Loss: 0.06869
Epoch: 2646, Training Loss: 0.08772
Epoch: 2646, Training Loss: 0.08795
Epoch: 2646, Training Loss: 0.12089
Epoch: 2647, Training Loss: 0.06861
Epoch: 2647, Training Loss: 0.08759
Epoch: 2647, Training Loss: 0.08781
Epoch: 2647, Training Loss: 0.12069
Epoch: 2648, Training Loss: 0.06853
Epoch: 2648, Training Loss: 0.08746
Epoch: 2648, Training Loss: 0.08768
Epoch: 2648, Training Loss: 0.12048
Epoch: 2649, Training Loss: 0.06846
Epoch: 2649, Training Loss: 0.08733
Epoch: 2649, Training Loss: 0.08755
Epoch: 2649, Training Loss: 0.12028
Epoch: 2650, Training Loss: 0.06838
Epoch: 2650, Training Loss: 0.08719
Epoch: 2650, Training Loss: 0.08742
Epoch: 2650, Training Loss: 0.12009
Epoch: 2651, Training Loss: 0.06830
Epoch: 2651, Training Loss: 0.08706
Epoch: 2651, Training Loss: 0.08729
Epoch: 2651, Training Loss: 0.11989
Epoch: 2652, Training Loss: 0.06822
Epoch: 2652, Training Loss: 0.08693
Epoch: 2652, Training Loss: 0.08716
Epoch: 2652, Training Loss: 0.11969
Epoch: 2653, Training Loss: 0.06814
Epoch: 2653, Training Loss: 0.08681
Epoch: 2653, Training Loss: 0.08703
Epoch: 2653, Training Loss: 0.11949
Epoch: 2654, Training Loss: 0.06806
Epoch: 2654, Training Loss: 0.08668
Epoch: 2654, Training Loss: 0.08690
Epoch: 2654, Training Loss: 0.11930
Epoch: 2655, Training Loss: 0.06799
Epoch: 2655, Training Loss: 0.08655
Epoch: 2655, Training Loss: 0.08677
Epoch: 2655, Training Loss: 0.11910
Epoch: 2656, Training Loss: 0.06791
Epoch: 2656, Training Loss: 0.08642
Epoch: 2656, Training Loss: 0.08664
Epoch: 2656, Training Loss: 0.11891
Epoch: 2657, Training Loss: 0.06783
Epoch: 2657, Training Loss: 0.08629
Epoch: 2657, Training Loss: 0.08651
Epoch: 2657, Training Loss: 0.11872
Epoch: 2658, Training Loss: 0.06776
Epoch: 2658, Training Loss: 0.08617
Epoch: 2658, Training Loss: 0.08639
Epoch: 2658, Training Loss: 0.11852
Epoch: 2659, Training Loss: 0.06768
Epoch: 2659, Training Loss: 0.08604
Epoch: 2659, Training Loss: 0.08626
Epoch: 2659, Training Loss: 0.11833
Epoch: 2660, Training Loss: 0.06760
Epoch: 2660, Training Loss: 0.08591
Epoch: 2660, Training Loss: 0.08613
Epoch: 2660, Training Loss: 0.11814
Epoch: 2661, Training Loss: 0.06753
Epoch: 2661, Training Loss: 0.08579
Epoch: 2661, Training Loss: 0.08601
Epoch: 2661, Training Loss: 0.11795
[... output truncated: epochs 2661–2911 print the loss for each of the four XOR training patterns; the per-sample losses decrease steadily from roughly 0.067–0.118 at epoch 2661 to roughly 0.054–0.087 at epoch 2911 ...]
Epoch: 2911, Training Loss: 0.05375
Epoch: 2911, Training Loss: 0.06460
Epoch: 2911, Training Loss: 0.06476
Epoch: 2911, Training Loss: 0.08647
Epoch: 2912, Training Loss: 0.05371
Epoch: 2912, Training Loss: 0.06454
Epoch: 2912, Training Loss: 0.06470
Epoch: 2912, Training Loss: 0.08638
Epoch: 2913, Training Loss: 0.05367
Epoch: 2913, Training Loss: 0.06449
Epoch: 2913, Training Loss: 0.06464
Epoch: 2913, Training Loss: 0.08630
Epoch: 2914, Training Loss: 0.05363
Epoch: 2914, Training Loss: 0.06443
Epoch: 2914, Training Loss: 0.06458
Epoch: 2914, Training Loss: 0.08621
Epoch: 2915, Training Loss: 0.05359
Epoch: 2915, Training Loss: 0.06437
Epoch: 2915, Training Loss: 0.06453
Epoch: 2915, Training Loss: 0.08613
Epoch: 2916, Training Loss: 0.05355
Epoch: 2916, Training Loss: 0.06431
Epoch: 2916, Training Loss: 0.06447
Epoch: 2916, Training Loss: 0.08605
Epoch: 2917, Training Loss: 0.05351
Epoch: 2917, Training Loss: 0.06426
Epoch: 2917, Training Loss: 0.06441
Epoch: 2917, Training Loss: 0.08596
Epoch: 2918, Training Loss: 0.05348
Epoch: 2918, Training Loss: 0.06420
Epoch: 2918, Training Loss: 0.06435
Epoch: 2918, Training Loss: 0.08588
Epoch: 2919, Training Loss: 0.05344
Epoch: 2919, Training Loss: 0.06414
Epoch: 2919, Training Loss: 0.06430
Epoch: 2919, Training Loss: 0.08580
Epoch: 2920, Training Loss: 0.05340
Epoch: 2920, Training Loss: 0.06409
Epoch: 2920, Training Loss: 0.06424
Epoch: 2920, Training Loss: 0.08572
Epoch: 2921, Training Loss: 0.05336
Epoch: 2921, Training Loss: 0.06403
Epoch: 2921, Training Loss: 0.06418
Epoch: 2921, Training Loss: 0.08563
Epoch: 2922, Training Loss: 0.05332
Epoch: 2922, Training Loss: 0.06397
Epoch: 2922, Training Loss: 0.06413
Epoch: 2922, Training Loss: 0.08555
Epoch: 2923, Training Loss: 0.05328
Epoch: 2923, Training Loss: 0.06392
Epoch: 2923, Training Loss: 0.06407
Epoch: 2923, Training Loss: 0.08547
Epoch: 2924, Training Loss: 0.05324
Epoch: 2924, Training Loss: 0.06386
Epoch: 2924, Training Loss: 0.06401
Epoch: 2924, Training Loss: 0.08539
Epoch: 2925, Training Loss: 0.05320
Epoch: 2925, Training Loss: 0.06381
Epoch: 2925, Training Loss: 0.06396
Epoch: 2925, Training Loss: 0.08531
Epoch: 2926, Training Loss: 0.05316
Epoch: 2926, Training Loss: 0.06375
Epoch: 2926, Training Loss: 0.06390
Epoch: 2926, Training Loss: 0.08523
Epoch: 2927, Training Loss: 0.05312
Epoch: 2927, Training Loss: 0.06369
Epoch: 2927, Training Loss: 0.06385
Epoch: 2927, Training Loss: 0.08515
Epoch: 2928, Training Loss: 0.05308
Epoch: 2928, Training Loss: 0.06364
Epoch: 2928, Training Loss: 0.06379
Epoch: 2928, Training Loss: 0.08506
Epoch: 2929, Training Loss: 0.05304
Epoch: 2929, Training Loss: 0.06358
Epoch: 2929, Training Loss: 0.06373
Epoch: 2929, Training Loss: 0.08498
Epoch: 2930, Training Loss: 0.05301
Epoch: 2930, Training Loss: 0.06353
Epoch: 2930, Training Loss: 0.06368
Epoch: 2930, Training Loss: 0.08490
Epoch: 2931, Training Loss: 0.05297
Epoch: 2931, Training Loss: 0.06347
Epoch: 2931, Training Loss: 0.06362
Epoch: 2931, Training Loss: 0.08482
Epoch: 2932, Training Loss: 0.05293
Epoch: 2932, Training Loss: 0.06342
Epoch: 2932, Training Loss: 0.06357
Epoch: 2932, Training Loss: 0.08474
Epoch: 2933, Training Loss: 0.05289
Epoch: 2933, Training Loss: 0.06336
Epoch: 2933, Training Loss: 0.06351
Epoch: 2933, Training Loss: 0.08466
Epoch: 2934, Training Loss: 0.05285
Epoch: 2934, Training Loss: 0.06331
Epoch: 2934, Training Loss: 0.06346
Epoch: 2934, Training Loss: 0.08458
Epoch: 2935, Training Loss: 0.05281
Epoch: 2935, Training Loss: 0.06325
Epoch: 2935, Training Loss: 0.06340
Epoch: 2935, Training Loss: 0.08451
Epoch: 2936, Training Loss: 0.05278
Epoch: 2936, Training Loss: 0.06320
Epoch: 2936, Training Loss: 0.06335
Epoch: 2936, Training Loss: 0.08443
Epoch: 2937, Training Loss: 0.05274
Epoch: 2937, Training Loss: 0.06314
Epoch: 2937, Training Loss: 0.06329
Epoch: 2937, Training Loss: 0.08435
Epoch: 2938, Training Loss: 0.05270
Epoch: 2938, Training Loss: 0.06309
Epoch: 2938, Training Loss: 0.06324
Epoch: 2938, Training Loss: 0.08427
Epoch: 2939, Training Loss: 0.05266
Epoch: 2939, Training Loss: 0.06304
Epoch: 2939, Training Loss: 0.06318
Epoch: 2939, Training Loss: 0.08419
Epoch: 2940, Training Loss: 0.05262
Epoch: 2940, Training Loss: 0.06298
Epoch: 2940, Training Loss: 0.06313
Epoch: 2940, Training Loss: 0.08411
Epoch: 2941, Training Loss: 0.05259
Epoch: 2941, Training Loss: 0.06293
Epoch: 2941, Training Loss: 0.06308
Epoch: 2941, Training Loss: 0.08403
Epoch: 2942, Training Loss: 0.05255
Epoch: 2942, Training Loss: 0.06287
Epoch: 2942, Training Loss: 0.06302
Epoch: 2942, Training Loss: 0.08396
Epoch: 2943, Training Loss: 0.05251
Epoch: 2943, Training Loss: 0.06282
Epoch: 2943, Training Loss: 0.06297
Epoch: 2943, Training Loss: 0.08388
Epoch: 2944, Training Loss: 0.05247
Epoch: 2944, Training Loss: 0.06277
Epoch: 2944, Training Loss: 0.06292
Epoch: 2944, Training Loss: 0.08380
Epoch: 2945, Training Loss: 0.05244
Epoch: 2945, Training Loss: 0.06271
Epoch: 2945, Training Loss: 0.06286
Epoch: 2945, Training Loss: 0.08372
Epoch: 2946, Training Loss: 0.05240
Epoch: 2946, Training Loss: 0.06266
Epoch: 2946, Training Loss: 0.06281
Epoch: 2946, Training Loss: 0.08365
Epoch: 2947, Training Loss: 0.05236
Epoch: 2947, Training Loss: 0.06261
Epoch: 2947, Training Loss: 0.06275
Epoch: 2947, Training Loss: 0.08357
Epoch: 2948, Training Loss: 0.05232
Epoch: 2948, Training Loss: 0.06255
Epoch: 2948, Training Loss: 0.06270
Epoch: 2948, Training Loss: 0.08349
Epoch: 2949, Training Loss: 0.05229
Epoch: 2949, Training Loss: 0.06250
Epoch: 2949, Training Loss: 0.06265
Epoch: 2949, Training Loss: 0.08342
Epoch: 2950, Training Loss: 0.05225
Epoch: 2950, Training Loss: 0.06245
Epoch: 2950, Training Loss: 0.06260
Epoch: 2950, Training Loss: 0.08334
Epoch: 2951, Training Loss: 0.05221
Epoch: 2951, Training Loss: 0.06239
Epoch: 2951, Training Loss: 0.06254
Epoch: 2951, Training Loss: 0.08326
Epoch: 2952, Training Loss: 0.05218
Epoch: 2952, Training Loss: 0.06234
Epoch: 2952, Training Loss: 0.06249
Epoch: 2952, Training Loss: 0.08319
Epoch: 2953, Training Loss: 0.05214
Epoch: 2953, Training Loss: 0.06229
Epoch: 2953, Training Loss: 0.06244
Epoch: 2953, Training Loss: 0.08311
Epoch: 2954, Training Loss: 0.05210
Epoch: 2954, Training Loss: 0.06224
Epoch: 2954, Training Loss: 0.06238
Epoch: 2954, Training Loss: 0.08304
Epoch: 2955, Training Loss: 0.05207
Epoch: 2955, Training Loss: 0.06219
Epoch: 2955, Training Loss: 0.06233
Epoch: 2955, Training Loss: 0.08296
Epoch: 2956, Training Loss: 0.05203
Epoch: 2956, Training Loss: 0.06213
Epoch: 2956, Training Loss: 0.06228
Epoch: 2956, Training Loss: 0.08288
Epoch: 2957, Training Loss: 0.05199
Epoch: 2957, Training Loss: 0.06208
Epoch: 2957, Training Loss: 0.06223
Epoch: 2957, Training Loss: 0.08281
Epoch: 2958, Training Loss: 0.05196
Epoch: 2958, Training Loss: 0.06203
Epoch: 2958, Training Loss: 0.06218
Epoch: 2958, Training Loss: 0.08273
Epoch: 2959, Training Loss: 0.05192
Epoch: 2959, Training Loss: 0.06198
Epoch: 2959, Training Loss: 0.06212
Epoch: 2959, Training Loss: 0.08266
Epoch: 2960, Training Loss: 0.05188
Epoch: 2960, Training Loss: 0.06193
Epoch: 2960, Training Loss: 0.06207
Epoch: 2960, Training Loss: 0.08259
Epoch: 2961, Training Loss: 0.05185
Epoch: 2961, Training Loss: 0.06187
Epoch: 2961, Training Loss: 0.06202
Epoch: 2961, Training Loss: 0.08251
Epoch: 2962, Training Loss: 0.05181
Epoch: 2962, Training Loss: 0.06182
Epoch: 2962, Training Loss: 0.06197
Epoch: 2962, Training Loss: 0.08244
Epoch: 2963, Training Loss: 0.05178
Epoch: 2963, Training Loss: 0.06177
Epoch: 2963, Training Loss: 0.06192
Epoch: 2963, Training Loss: 0.08236
Epoch: 2964, Training Loss: 0.05174
Epoch: 2964, Training Loss: 0.06172
Epoch: 2964, Training Loss: 0.06187
Epoch: 2964, Training Loss: 0.08229
Epoch: 2965, Training Loss: 0.05170
Epoch: 2965, Training Loss: 0.06167
Epoch: 2965, Training Loss: 0.06182
Epoch: 2965, Training Loss: 0.08222
Epoch: 2966, Training Loss: 0.05167
Epoch: 2966, Training Loss: 0.06162
Epoch: 2966, Training Loss: 0.06176
Epoch: 2966, Training Loss: 0.08214
Epoch: 2967, Training Loss: 0.05163
Epoch: 2967, Training Loss: 0.06157
Epoch: 2967, Training Loss: 0.06171
Epoch: 2967, Training Loss: 0.08207
Epoch: 2968, Training Loss: 0.05160
Epoch: 2968, Training Loss: 0.06152
Epoch: 2968, Training Loss: 0.06166
Epoch: 2968, Training Loss: 0.08200
Epoch: 2969, Training Loss: 0.05156
Epoch: 2969, Training Loss: 0.06147
Epoch: 2969, Training Loss: 0.06161
Epoch: 2969, Training Loss: 0.08192
Epoch: 2970, Training Loss: 0.05152
Epoch: 2970, Training Loss: 0.06142
Epoch: 2970, Training Loss: 0.06156
Epoch: 2970, Training Loss: 0.08185
Epoch: 2971, Training Loss: 0.05149
Epoch: 2971, Training Loss: 0.06137
Epoch: 2971, Training Loss: 0.06151
Epoch: 2971, Training Loss: 0.08178
Epoch: 2972, Training Loss: 0.05145
Epoch: 2972, Training Loss: 0.06132
Epoch: 2972, Training Loss: 0.06146
Epoch: 2972, Training Loss: 0.08170
Epoch: 2973, Training Loss: 0.05142
Epoch: 2973, Training Loss: 0.06127
Epoch: 2973, Training Loss: 0.06141
Epoch: 2973, Training Loss: 0.08163
Epoch: 2974, Training Loss: 0.05138
Epoch: 2974, Training Loss: 0.06122
Epoch: 2974, Training Loss: 0.06136
Epoch: 2974, Training Loss: 0.08156
Epoch: 2975, Training Loss: 0.05135
Epoch: 2975, Training Loss: 0.06117
Epoch: 2975, Training Loss: 0.06131
Epoch: 2975, Training Loss: 0.08149
Epoch: 2976, Training Loss: 0.05131
Epoch: 2976, Training Loss: 0.06112
Epoch: 2976, Training Loss: 0.06126
Epoch: 2976, Training Loss: 0.08142
Epoch: 2977, Training Loss: 0.05128
Epoch: 2977, Training Loss: 0.06107
Epoch: 2977, Training Loss: 0.06121
Epoch: 2977, Training Loss: 0.08135
Epoch: 2978, Training Loss: 0.05124
Epoch: 2978, Training Loss: 0.06102
Epoch: 2978, Training Loss: 0.06116
Epoch: 2978, Training Loss: 0.08127
Epoch: 2979, Training Loss: 0.05121
Epoch: 2979, Training Loss: 0.06097
Epoch: 2979, Training Loss: 0.06111
Epoch: 2979, Training Loss: 0.08120
Epoch: 2980, Training Loss: 0.05117
Epoch: 2980, Training Loss: 0.06092
Epoch: 2980, Training Loss: 0.06106
Epoch: 2980, Training Loss: 0.08113
Epoch: 2981, Training Loss: 0.05114
Epoch: 2981, Training Loss: 0.06087
Epoch: 2981, Training Loss: 0.06101
Epoch: 2981, Training Loss: 0.08106
Epoch: 2982, Training Loss: 0.05110
Epoch: 2982, Training Loss: 0.06082
Epoch: 2982, Training Loss: 0.06096
Epoch: 2982, Training Loss: 0.08099
Epoch: 2983, Training Loss: 0.05107
Epoch: 2983, Training Loss: 0.06077
Epoch: 2983, Training Loss: 0.06091
Epoch: 2983, Training Loss: 0.08092
Epoch: 2984, Training Loss: 0.05103
Epoch: 2984, Training Loss: 0.06072
Epoch: 2984, Training Loss: 0.06087
Epoch: 2984, Training Loss: 0.08085
Epoch: 2985, Training Loss: 0.05100
Epoch: 2985, Training Loss: 0.06067
Epoch: 2985, Training Loss: 0.06082
Epoch: 2985, Training Loss: 0.08078
Epoch: 2986, Training Loss: 0.05096
Epoch: 2986, Training Loss: 0.06063
Epoch: 2986, Training Loss: 0.06077
Epoch: 2986, Training Loss: 0.08071
Epoch: 2987, Training Loss: 0.05093
Epoch: 2987, Training Loss: 0.06058
Epoch: 2987, Training Loss: 0.06072
Epoch: 2987, Training Loss: 0.08064
Epoch: 2988, Training Loss: 0.05090
Epoch: 2988, Training Loss: 0.06053
Epoch: 2988, Training Loss: 0.06067
Epoch: 2988, Training Loss: 0.08057
Epoch: 2989, Training Loss: 0.05086
Epoch: 2989, Training Loss: 0.06048
Epoch: 2989, Training Loss: 0.06062
Epoch: 2989, Training Loss: 0.08050
Epoch: 2990, Training Loss: 0.05083
Epoch: 2990, Training Loss: 0.06043
Epoch: 2990, Training Loss: 0.06057
Epoch: 2990, Training Loss: 0.08043
Epoch: 2991, Training Loss: 0.05079
Epoch: 2991, Training Loss: 0.06038
Epoch: 2991, Training Loss: 0.06053
Epoch: 2991, Training Loss: 0.08036
Epoch: 2992, Training Loss: 0.05076
Epoch: 2992, Training Loss: 0.06034
Epoch: 2992, Training Loss: 0.06048
Epoch: 2992, Training Loss: 0.08029
Epoch: 2993, Training Loss: 0.05072
Epoch: 2993, Training Loss: 0.06029
Epoch: 2993, Training Loss: 0.06043
Epoch: 2993, Training Loss: 0.08022
Epoch: 2994, Training Loss: 0.05069
Epoch: 2994, Training Loss: 0.06024
Epoch: 2994, Training Loss: 0.06038
Epoch: 2994, Training Loss: 0.08015
Epoch: 2995, Training Loss: 0.05066
Epoch: 2995, Training Loss: 0.06019
Epoch: 2995, Training Loss: 0.06033
Epoch: 2995, Training Loss: 0.08008
Epoch: 2996, Training Loss: 0.05062
Epoch: 2996, Training Loss: 0.06015
Epoch: 2996, Training Loss: 0.06029
Epoch: 2996, Training Loss: 0.08002
Epoch: 2997, Training Loss: 0.05059
Epoch: 2997, Training Loss: 0.06010
Epoch: 2997, Training Loss: 0.06024
Epoch: 2997, Training Loss: 0.07995
Epoch: 2998, Training Loss: 0.05056
Epoch: 2998, Training Loss: 0.06005
Epoch: 2998, Training Loss: 0.06019
Epoch: 2998, Training Loss: 0.07988
Epoch: 2999, Training Loss: 0.05052
Epoch: 2999, Training Loss: 0.06000
Epoch: 2999, Training Loss: 0.06014
Epoch: 2999, Training Loss: 0.07981
Epoch: 3000, Training Loss: 0.05049
Epoch: 3000, Training Loss: 0.05996
Epoch: 3000, Training Loss: 0.06010
Epoch: 3000, Training Loss: 0.07974
Epoch: 3001, Training Loss: 0.05045
Epoch: 3001, Training Loss: 0.05991
Epoch: 3001, Training Loss: 0.06005
Epoch: 3001, Training Loss: 0.07968
Epoch: 3002, Training Loss: 0.05042
Epoch: 3002, Training Loss: 0.05986
Epoch: 3002, Training Loss: 0.06000
Epoch: 3002, Training Loss: 0.07961
Epoch: 3003, Training Loss: 0.05039
Epoch: 3003, Training Loss: 0.05981
Epoch: 3003, Training Loss: 0.05996
Epoch: 3003, Training Loss: 0.07954
Epoch: 3004, Training Loss: 0.05035
Epoch: 3004, Training Loss: 0.05977
Epoch: 3004, Training Loss: 0.05991
Epoch: 3004, Training Loss: 0.07947
Epoch: 3005, Training Loss: 0.05032
Epoch: 3005, Training Loss: 0.05972
Epoch: 3005, Training Loss: 0.05986
Epoch: 3005, Training Loss: 0.07941
Epoch: 3006, Training Loss: 0.05029
Epoch: 3006, Training Loss: 0.05968
Epoch: 3006, Training Loss: 0.05981
Epoch: 3006, Training Loss: 0.07934
Epoch: 3007, Training Loss: 0.05025
Epoch: 3007, Training Loss: 0.05963
Epoch: 3007, Training Loss: 0.05977
Epoch: 3007, Training Loss: 0.07927
Epoch: 3008, Training Loss: 0.05022
Epoch: 3008, Training Loss: 0.05958
Epoch: 3008, Training Loss: 0.05972
Epoch: 3008, Training Loss: 0.07921
Epoch: 3009, Training Loss: 0.05019
Epoch: 3009, Training Loss: 0.05954
Epoch: 3009, Training Loss: 0.05968
Epoch: 3009, Training Loss: 0.07914
Epoch: 3010, Training Loss: 0.05016
Epoch: 3010, Training Loss: 0.05949
Epoch: 3010, Training Loss: 0.05963
Epoch: 3010, Training Loss: 0.07907
Epoch: 3011, Training Loss: 0.05012
Epoch: 3011, Training Loss: 0.05944
Epoch: 3011, Training Loss: 0.05958
Epoch: 3011, Training Loss: 0.07901
Epoch: 3012, Training Loss: 0.05009
Epoch: 3012, Training Loss: 0.05940
Epoch: 3012, Training Loss: 0.05954
Epoch: 3012, Training Loss: 0.07894
Epoch: 3013, Training Loss: 0.05006
Epoch: 3013, Training Loss: 0.05935
Epoch: 3013, Training Loss: 0.05949
Epoch: 3013, Training Loss: 0.07888
Epoch: 3014, Training Loss: 0.05002
Epoch: 3014, Training Loss: 0.05931
Epoch: 3014, Training Loss: 0.05944
Epoch: 3014, Training Loss: 0.07881
Epoch: 3015, Training Loss: 0.04999
Epoch: 3015, Training Loss: 0.05926
Epoch: 3015, Training Loss: 0.05940
Epoch: 3015, Training Loss: 0.07874
Epoch: 3016, Training Loss: 0.04996
Epoch: 3016, Training Loss: 0.05922
Epoch: 3016, Training Loss: 0.05935
Epoch: 3016, Training Loss: 0.07868
Epoch: 3017, Training Loss: 0.04993
Epoch: 3017, Training Loss: 0.05917
Epoch: 3017, Training Loss: 0.05931
Epoch: 3017, Training Loss: 0.07861
Epoch: 3018, Training Loss: 0.04989
Epoch: 3018, Training Loss: 0.05912
Epoch: 3018, Training Loss: 0.05926
Epoch: 3018, Training Loss: 0.07855
Epoch: 3019, Training Loss: 0.04986
Epoch: 3019, Training Loss: 0.05908
Epoch: 3019, Training Loss: 0.05922
Epoch: 3019, Training Loss: 0.07848
Epoch: 3020, Training Loss: 0.04983
Epoch: 3020, Training Loss: 0.05903
Epoch: 3020, Training Loss: 0.05917
Epoch: 3020, Training Loss: 0.07842
Epoch: 3021, Training Loss: 0.04980
Epoch: 3021, Training Loss: 0.05899
Epoch: 3021, Training Loss: 0.05913
Epoch: 3021, Training Loss: 0.07835
Epoch: 3022, Training Loss: 0.04977
Epoch: 3022, Training Loss: 0.05894
Epoch: 3022, Training Loss: 0.05908
Epoch: 3022, Training Loss: 0.07829
Epoch: 3023, Training Loss: 0.04973
Epoch: 3023, Training Loss: 0.05890
Epoch: 3023, Training Loss: 0.05904
Epoch: 3023, Training Loss: 0.07822
Epoch: 3024, Training Loss: 0.04970
Epoch: 3024, Training Loss: 0.05885
Epoch: 3024, Training Loss: 0.05899
Epoch: 3024, Training Loss: 0.07816
Epoch: 3025, Training Loss: 0.04967
Epoch: 3025, Training Loss: 0.05881
Epoch: 3025, Training Loss: 0.05895
Epoch: 3025, Training Loss: 0.07810
Epoch: 3026, Training Loss: 0.04964
Epoch: 3026, Training Loss: 0.05876
Epoch: 3026, Training Loss: 0.05890
Epoch: 3026, Training Loss: 0.07803
Epoch: 3027, Training Loss: 0.04960
Epoch: 3027, Training Loss: 0.05872
Epoch: 3027, Training Loss: 0.05886
Epoch: 3027, Training Loss: 0.07797
Epoch: 3028, Training Loss: 0.04957
Epoch: 3028, Training Loss: 0.05868
Epoch: 3028, Training Loss: 0.05881
Epoch: 3028, Training Loss: 0.07790
Epoch: 3029, Training Loss: 0.04954
Epoch: 3029, Training Loss: 0.05863
Epoch: 3029, Training Loss: 0.05877
Epoch: 3029, Training Loss: 0.07784
Epoch: 3030, Training Loss: 0.04951
Epoch: 3030, Training Loss: 0.05859
Epoch: 3030, Training Loss: 0.05872
Epoch: 3030, Training Loss: 0.07778
Epoch: 3031, Training Loss: 0.04948
Epoch: 3031, Training Loss: 0.05854
Epoch: 3031, Training Loss: 0.05868
Epoch: 3031, Training Loss: 0.07771
Epoch: 3032, Training Loss: 0.04945
Epoch: 3032, Training Loss: 0.05850
Epoch: 3032, Training Loss: 0.05864
Epoch: 3032, Training Loss: 0.07765
Epoch: 3033, Training Loss: 0.04941
Epoch: 3033, Training Loss: 0.05846
Epoch: 3033, Training Loss: 0.05859
Epoch: 3033, Training Loss: 0.07759
Epoch: 3034, Training Loss: 0.04938
Epoch: 3034, Training Loss: 0.05841
Epoch: 3034, Training Loss: 0.05855
Epoch: 3034, Training Loss: 0.07753
Epoch: 3035, Training Loss: 0.04935
Epoch: 3035, Training Loss: 0.05837
Epoch: 3035, Training Loss: 0.05850
Epoch: 3035, Training Loss: 0.07746
Epoch: 3036, Training Loss: 0.04932
Epoch: 3036, Training Loss: 0.05832
Epoch: 3036, Training Loss: 0.05846
Epoch: 3036, Training Loss: 0.07740
Epoch: 3037, Training Loss: 0.04929
Epoch: 3037, Training Loss: 0.05828
Epoch: 3037, Training Loss: 0.05842
Epoch: 3037, Training Loss: 0.07734
Epoch: 3038, Training Loss: 0.04926
Epoch: 3038, Training Loss: 0.05824
Epoch: 3038, Training Loss: 0.05837
Epoch: 3038, Training Loss: 0.07728
Epoch: 3039, Training Loss: 0.04923
Epoch: 3039, Training Loss: 0.05819
Epoch: 3039, Training Loss: 0.05833
Epoch: 3039, Training Loss: 0.07721
Epoch: 3040, Training Loss: 0.04920
Epoch: 3040, Training Loss: 0.05815
Epoch: 3040, Training Loss: 0.05829
Epoch: 3040, Training Loss: 0.07715
Epoch: 3041, Training Loss: 0.04916
Epoch: 3041, Training Loss: 0.05811
Epoch: 3041, Training Loss: 0.05824
Epoch: 3041, Training Loss: 0.07709
Epoch: 3042, Training Loss: 0.04913
Epoch: 3042, Training Loss: 0.05806
Epoch: 3042, Training Loss: 0.05820
Epoch: 3042, Training Loss: 0.07703
Epoch: 3043, Training Loss: 0.04910
Epoch: 3043, Training Loss: 0.05802
Epoch: 3043, Training Loss: 0.05816
Epoch: 3043, Training Loss: 0.07697
Epoch: 3044, Training Loss: 0.04907
Epoch: 3044, Training Loss: 0.05798
Epoch: 3044, Training Loss: 0.05811
Epoch: 3044, Training Loss: 0.07690
Epoch: 3045, Training Loss: 0.04904
Epoch: 3045, Training Loss: 0.05793
Epoch: 3045, Training Loss: 0.05807
Epoch: 3045, Training Loss: 0.07684
Epoch: 3046, Training Loss: 0.04901
Epoch: 3046, Training Loss: 0.05789
Epoch: 3046, Training Loss: 0.05803
Epoch: 3046, Training Loss: 0.07678
Epoch: 3047, Training Loss: 0.04898
Epoch: 3047, Training Loss: 0.05785
Epoch: 3047, Training Loss: 0.05798
Epoch: 3047, Training Loss: 0.07672
Epoch: 3048, Training Loss: 0.04895
Epoch: 3048, Training Loss: 0.05781
Epoch: 3048, Training Loss: 0.05794
Epoch: 3048, Training Loss: 0.07666
Epoch: 3049, Training Loss: 0.04892
Epoch: 3049, Training Loss: 0.05776
Epoch: 3049, Training Loss: 0.05790
Epoch: 3049, Training Loss: 0.07660
Epoch: 3050, Training Loss: 0.04889
Epoch: 3050, Training Loss: 0.05772
Epoch: 3050, Training Loss: 0.05786
Epoch: 3050, Training Loss: 0.07654
Epoch: 3051, Training Loss: 0.04886
Epoch: 3051, Training Loss: 0.05768
Epoch: 3051, Training Loss: 0.05781
Epoch: 3051, Training Loss: 0.07648
Epoch: 3052, Training Loss: 0.04883
Epoch: 3052, Training Loss: 0.05764
Epoch: 3052, Training Loss: 0.05777
Epoch: 3052, Training Loss: 0.07642
Epoch: 3053, Training Loss: 0.04879
Epoch: 3053, Training Loss: 0.05760
Epoch: 3053, Training Loss: 0.05773
Epoch: 3053, Training Loss: 0.07636
Epoch: 3054, Training Loss: 0.04876
Epoch: 3054, Training Loss: 0.05755
Epoch: 3054, Training Loss: 0.05769
Epoch: 3054, Training Loss: 0.07630
Epoch: 3055, Training Loss: 0.04873
Epoch: 3055, Training Loss: 0.05751
Epoch: 3055, Training Loss: 0.05764
Epoch: 3055, Training Loss: 0.07624
Epoch: 3056, Training Loss: 0.04870
Epoch: 3056, Training Loss: 0.05747
Epoch: 3056, Training Loss: 0.05760
Epoch: 3056, Training Loss: 0.07618
Epoch: 3057, Training Loss: 0.04867
Epoch: 3057, Training Loss: 0.05743
Epoch: 3057, Training Loss: 0.05756
Epoch: 3057, Training Loss: 0.07612
Epoch: 3058, Training Loss: 0.04864
Epoch: 3058, Training Loss: 0.05739
Epoch: 3058, Training Loss: 0.05752
Epoch: 3058, Training Loss: 0.07606
Epoch: 3059, Training Loss: 0.04861
Epoch: 3059, Training Loss: 0.05734
Epoch: 3059, Training Loss: 0.05748
Epoch: 3059, Training Loss: 0.07600
Epoch: 3060, Training Loss: 0.04858
Epoch: 3060, Training Loss: 0.05730
Epoch: 3060, Training Loss: 0.05744
Epoch: 3060, Training Loss: 0.07594
Epoch: 3061, Training Loss: 0.04855
Epoch: 3061, Training Loss: 0.05726
Epoch: 3061, Training Loss: 0.05739
Epoch: 3061, Training Loss: 0.07588
Epoch: 3062, Training Loss: 0.04852
Epoch: 3062, Training Loss: 0.05722
Epoch: 3062, Training Loss: 0.05735
Epoch: 3062, Training Loss: 0.07582
Epoch: 3063, Training Loss: 0.04849
Epoch: 3063, Training Loss: 0.05718
Epoch: 3063, Training Loss: 0.05731
Epoch: 3063, Training Loss: 0.07576
Epoch: 3064, Training Loss: 0.04846
Epoch: 3064, Training Loss: 0.05714
Epoch: 3064, Training Loss: 0.05727
Epoch: 3064, Training Loss: 0.07570
Epoch: 3065, Training Loss: 0.04843
Epoch: 3065, Training Loss: 0.05710
Epoch: 3065, Training Loss: 0.05723
Epoch: 3065, Training Loss: 0.07564
Epoch: 3066, Training Loss: 0.04840
Epoch: 3066, Training Loss: 0.05705
Epoch: 3066, Training Loss: 0.05719
Epoch: 3066, Training Loss: 0.07558
Epoch: 3067, Training Loss: 0.04837
Epoch: 3067, Training Loss: 0.05701
Epoch: 3067, Training Loss: 0.05715
Epoch: 3067, Training Loss: 0.07552
Epoch: 3068, Training Loss: 0.04834
Epoch: 3068, Training Loss: 0.05697
Epoch: 3068, Training Loss: 0.05710
Epoch: 3068, Training Loss: 0.07547
Epoch: 3069, Training Loss: 0.04832
Epoch: 3069, Training Loss: 0.05693
Epoch: 3069, Training Loss: 0.05706
Epoch: 3069, Training Loss: 0.07541
Epoch: 3070, Training Loss: 0.04829
Epoch: 3070, Training Loss: 0.05689
Epoch: 3070, Training Loss: 0.05702
Epoch: 3070, Training Loss: 0.07535
Epoch: 3071, Training Loss: 0.04826
Epoch: 3071, Training Loss: 0.05685
Epoch: 3071, Training Loss: 0.05698
Epoch: 3071, Training Loss: 0.07529
Epoch: 3072, Training Loss: 0.04823
Epoch: 3072, Training Loss: 0.05681
Epoch: 3072, Training Loss: 0.05694
Epoch: 3072, Training Loss: 0.07523
Epoch: 3073, Training Loss: 0.04820
Epoch: 3073, Training Loss: 0.05677
Epoch: 3073, Training Loss: 0.05690
Epoch: 3073, Training Loss: 0.07518
Epoch: 3074, Training Loss: 0.04817
Epoch: 3074, Training Loss: 0.05673
Epoch: 3074, Training Loss: 0.05686
Epoch: 3074, Training Loss: 0.07512
Epoch: 3075, Training Loss: 0.04814
Epoch: 3075, Training Loss: 0.05669
Epoch: 3075, Training Loss: 0.05682
Epoch: 3075, Training Loss: 0.07506
Epoch: 3076, Training Loss: 0.04811
Epoch: 3076, Training Loss: 0.05665
Epoch: 3076, Training Loss: 0.05678
Epoch: 3076, Training Loss: 0.07500
Epoch: 3077, Training Loss: 0.04808
Epoch: 3077, Training Loss: 0.05661
Epoch: 3077, Training Loss: 0.05674
Epoch: 3077, Training Loss: 0.07495
Epoch: 3078, Training Loss: 0.04805
Epoch: 3078, Training Loss: 0.05657
Epoch: 3078, Training Loss: 0.05670
Epoch: 3078, Training Loss: 0.07489
Epoch: 3079, Training Loss: 0.04802
Epoch: 3079, Training Loss: 0.05653
Epoch: 3079, Training Loss: 0.05666
Epoch: 3079, Training Loss: 0.07483
Epoch: 3080, Training Loss: 0.04799
Epoch: 3080, Training Loss: 0.05649
Epoch: 3080, Training Loss: 0.05662
Epoch: 3080, Training Loss: 0.07477
Epoch: 3081, Training Loss: 0.04796
Epoch: 3081, Training Loss: 0.05645
Epoch: 3081, Training Loss: 0.05658
Epoch: 3081, Training Loss: 0.07472
Epoch: 3082, Training Loss: 0.04794
Epoch: 3082, Training Loss: 0.05641
Epoch: 3082, Training Loss: 0.05654
Epoch: 3082, Training Loss: 0.07466
Epoch: 3083, Training Loss: 0.04791
Epoch: 3083, Training Loss: 0.05637
Epoch: 3083, Training Loss: 0.05650
Epoch: 3083, Training Loss: 0.07460
Epoch: 3084, Training Loss: 0.04788
Epoch: 3084, Training Loss: 0.05633
Epoch: 3084, Training Loss: 0.05646
Epoch: 3084, Training Loss: 0.07455
Epoch: 3085, Training Loss: 0.04785
Epoch: 3085, Training Loss: 0.05629
Epoch: 3085, Training Loss: 0.05642
Epoch: 3085, Training Loss: 0.07449
Epoch: 3086, Training Loss: 0.04782
Epoch: 3086, Training Loss: 0.05625
Epoch: 3086, Training Loss: 0.05638
Epoch: 3086, Training Loss: 0.07443
Epoch: 3087, Training Loss: 0.04779
Epoch: 3087, Training Loss: 0.05621
Epoch: 3087, Training Loss: 0.05634
Epoch: 3087, Training Loss: 0.07438
Epoch: 3088, Training Loss: 0.04776
Epoch: 3088, Training Loss: 0.05617
Epoch: 3088, Training Loss: 0.05630
Epoch: 3088, Training Loss: 0.07432
Epoch: 3089, Training Loss: 0.04773
Epoch: 3089, Training Loss: 0.05613
Epoch: 3089, Training Loss: 0.05626
Epoch: 3089, Training Loss: 0.07427
Epoch: 3090, Training Loss: 0.04771
Epoch: 3090, Training Loss: 0.05609
Epoch: 3090, Training Loss: 0.05622
Epoch: 3090, Training Loss: 0.07421
Epoch: 3091, Training Loss: 0.04768
Epoch: 3091, Training Loss: 0.05605
Epoch: 3091, Training Loss: 0.05618
Epoch: 3091, Training Loss: 0.07415
Epoch: 3092, Training Loss: 0.04765
Epoch: 3092, Training Loss: 0.05601
Epoch: 3092, Training Loss: 0.05614
Epoch: 3092, Training Loss: 0.07410
Epoch: 3093, Training Loss: 0.04762
Epoch: 3093, Training Loss: 0.05598
Epoch: 3093, Training Loss: 0.05610
Epoch: 3093, Training Loss: 0.07404
Epoch: 3094, Training Loss: 0.04759
Epoch: 3094, Training Loss: 0.05594
Epoch: 3094, Training Loss: 0.05607
Epoch: 3094, Training Loss: 0.07399
Epoch: 3095, Training Loss: 0.04756
Epoch: 3095, Training Loss: 0.05590
Epoch: 3095, Training Loss: 0.05603
Epoch: 3095, Training Loss: 0.07393
Epoch: 3096, Training Loss: 0.04754
Epoch: 3096, Training Loss: 0.05586
Epoch: 3096, Training Loss: 0.05599
Epoch: 3096, Training Loss: 0.07388
Epoch: 3097, Training Loss: 0.04751
Epoch: 3097, Training Loss: 0.05582
Epoch: 3097, Training Loss: 0.05595
Epoch: 3097, Training Loss: 0.07382
Epoch: 3098, Training Loss: 0.04748
Epoch: 3098, Training Loss: 0.05578
Epoch: 3098, Training Loss: 0.05591
Epoch: 3098, Training Loss: 0.07377
Epoch: 3099, Training Loss: 0.04745
Epoch: 3099, Training Loss: 0.05574
Epoch: 3099, Training Loss: 0.05587
Epoch: 3099, Training Loss: 0.07371
Epoch: 3100, Training Loss: 0.04742
Epoch: 3100, Training Loss: 0.05571
Epoch: 3100, Training Loss: 0.05583
Epoch: 3100, Training Loss: 0.07366
Epoch: 3101, Training Loss: 0.04739
Epoch: 3101, Training Loss: 0.05567
Epoch: 3101, Training Loss: 0.05580
Epoch: 3101, Training Loss: 0.07360
Epoch: 3102, Training Loss: 0.04737
Epoch: 3102, Training Loss: 0.05563
Epoch: 3102, Training Loss: 0.05576
Epoch: 3102, Training Loss: 0.07355
Epoch: 3103, Training Loss: 0.04734
Epoch: 3103, Training Loss: 0.05559
Epoch: 3103, Training Loss: 0.05572
Epoch: 3103, Training Loss: 0.07349
Epoch: 3104, Training Loss: 0.04731
Epoch: 3104, Training Loss: 0.05555
Epoch: 3104, Training Loss: 0.05568
Epoch: 3104, Training Loss: 0.07344
Epoch: 3105, Training Loss: 0.04728
Epoch: 3105, Training Loss: 0.05551
Epoch: 3105, Training Loss: 0.05564
... (training log truncated: epochs 3105-3355; the four per-sample training losses decrease steadily, from roughly 0.047 / 0.055 / 0.056 / 0.073 at epoch 3105 to roughly 0.042 / 0.048 / 0.048 / 0.063 at epoch 3355) ...
Epoch: 3356, Training Loss: 0.04152
Epoch: 3356, Training Loss: 0.04779
Epoch: 3356, Training Loss: 0.04790
Epoch: 3356, Training Loss: 0.06249
Epoch: 3357, Training Loss: 0.04150
Epoch: 3357, Training Loss: 0.04777
Epoch: 3357, Training Loss: 0.04788
Epoch: 3357, Training Loss: 0.06245
Epoch: 3358, Training Loss: 0.04148
Epoch: 3358, Training Loss: 0.04775
Epoch: 3358, Training Loss: 0.04785
Epoch: 3358, Training Loss: 0.06242
Epoch: 3359, Training Loss: 0.04146
Epoch: 3359, Training Loss: 0.04772
Epoch: 3359, Training Loss: 0.04783
Epoch: 3359, Training Loss: 0.06238
Epoch: 3360, Training Loss: 0.04144
Epoch: 3360, Training Loss: 0.04770
Epoch: 3360, Training Loss: 0.04780
Epoch: 3360, Training Loss: 0.06235
Epoch: 3361, Training Loss: 0.04142
Epoch: 3361, Training Loss: 0.04767
Epoch: 3361, Training Loss: 0.04778
Epoch: 3361, Training Loss: 0.06232
Epoch: 3362, Training Loss: 0.04140
Epoch: 3362, Training Loss: 0.04765
Epoch: 3362, Training Loss: 0.04775
Epoch: 3362, Training Loss: 0.06228
Epoch: 3363, Training Loss: 0.04138
Epoch: 3363, Training Loss: 0.04762
Epoch: 3363, Training Loss: 0.04773
Epoch: 3363, Training Loss: 0.06225
Epoch: 3364, Training Loss: 0.04136
Epoch: 3364, Training Loss: 0.04760
Epoch: 3364, Training Loss: 0.04770
Epoch: 3364, Training Loss: 0.06221
Epoch: 3365, Training Loss: 0.04135
Epoch: 3365, Training Loss: 0.04757
Epoch: 3365, Training Loss: 0.04768
Epoch: 3365, Training Loss: 0.06218
Epoch: 3366, Training Loss: 0.04133
Epoch: 3366, Training Loss: 0.04755
Epoch: 3366, Training Loss: 0.04765
Epoch: 3366, Training Loss: 0.06214
Epoch: 3367, Training Loss: 0.04131
Epoch: 3367, Training Loss: 0.04752
Epoch: 3367, Training Loss: 0.04763
Epoch: 3367, Training Loss: 0.06211
Epoch: 3368, Training Loss: 0.04129
Epoch: 3368, Training Loss: 0.04750
Epoch: 3368, Training Loss: 0.04760
Epoch: 3368, Training Loss: 0.06207
Epoch: 3369, Training Loss: 0.04127
Epoch: 3369, Training Loss: 0.04747
Epoch: 3369, Training Loss: 0.04758
Epoch: 3369, Training Loss: 0.06204
Epoch: 3370, Training Loss: 0.04125
Epoch: 3370, Training Loss: 0.04745
Epoch: 3370, Training Loss: 0.04755
Epoch: 3370, Training Loss: 0.06201
Epoch: 3371, Training Loss: 0.04123
Epoch: 3371, Training Loss: 0.04742
Epoch: 3371, Training Loss: 0.04753
Epoch: 3371, Training Loss: 0.06197
Epoch: 3372, Training Loss: 0.04121
Epoch: 3372, Training Loss: 0.04740
Epoch: 3372, Training Loss: 0.04750
Epoch: 3372, Training Loss: 0.06194
Epoch: 3373, Training Loss: 0.04120
Epoch: 3373, Training Loss: 0.04738
Epoch: 3373, Training Loss: 0.04748
Epoch: 3373, Training Loss: 0.06190
Epoch: 3374, Training Loss: 0.04118
Epoch: 3374, Training Loss: 0.04735
Epoch: 3374, Training Loss: 0.04746
Epoch: 3374, Training Loss: 0.06187
Epoch: 3375, Training Loss: 0.04116
Epoch: 3375, Training Loss: 0.04733
Epoch: 3375, Training Loss: 0.04743
Epoch: 3375, Training Loss: 0.06184
Epoch: 3376, Training Loss: 0.04114
Epoch: 3376, Training Loss: 0.04730
Epoch: 3376, Training Loss: 0.04741
Epoch: 3376, Training Loss: 0.06180
Epoch: 3377, Training Loss: 0.04112
Epoch: 3377, Training Loss: 0.04728
Epoch: 3377, Training Loss: 0.04738
Epoch: 3377, Training Loss: 0.06177
Epoch: 3378, Training Loss: 0.04110
Epoch: 3378, Training Loss: 0.04725
Epoch: 3378, Training Loss: 0.04736
Epoch: 3378, Training Loss: 0.06173
Epoch: 3379, Training Loss: 0.04108
Epoch: 3379, Training Loss: 0.04723
Epoch: 3379, Training Loss: 0.04733
Epoch: 3379, Training Loss: 0.06170
Epoch: 3380, Training Loss: 0.04107
Epoch: 3380, Training Loss: 0.04721
Epoch: 3380, Training Loss: 0.04731
Epoch: 3380, Training Loss: 0.06167
Epoch: 3381, Training Loss: 0.04105
Epoch: 3381, Training Loss: 0.04718
Epoch: 3381, Training Loss: 0.04729
Epoch: 3381, Training Loss: 0.06163
Epoch: 3382, Training Loss: 0.04103
Epoch: 3382, Training Loss: 0.04716
Epoch: 3382, Training Loss: 0.04726
Epoch: 3382, Training Loss: 0.06160
Epoch: 3383, Training Loss: 0.04101
Epoch: 3383, Training Loss: 0.04713
Epoch: 3383, Training Loss: 0.04724
Epoch: 3383, Training Loss: 0.06157
Epoch: 3384, Training Loss: 0.04099
Epoch: 3384, Training Loss: 0.04711
Epoch: 3384, Training Loss: 0.04721
Epoch: 3384, Training Loss: 0.06153
Epoch: 3385, Training Loss: 0.04097
Epoch: 3385, Training Loss: 0.04709
Epoch: 3385, Training Loss: 0.04719
Epoch: 3385, Training Loss: 0.06150
Epoch: 3386, Training Loss: 0.04095
Epoch: 3386, Training Loss: 0.04706
Epoch: 3386, Training Loss: 0.04717
Epoch: 3386, Training Loss: 0.06147
Epoch: 3387, Training Loss: 0.04094
Epoch: 3387, Training Loss: 0.04704
Epoch: 3387, Training Loss: 0.04714
Epoch: 3387, Training Loss: 0.06143
Epoch: 3388, Training Loss: 0.04092
Epoch: 3388, Training Loss: 0.04701
Epoch: 3388, Training Loss: 0.04712
Epoch: 3388, Training Loss: 0.06140
Epoch: 3389, Training Loss: 0.04090
Epoch: 3389, Training Loss: 0.04699
Epoch: 3389, Training Loss: 0.04709
Epoch: 3389, Training Loss: 0.06137
Epoch: 3390, Training Loss: 0.04088
Epoch: 3390, Training Loss: 0.04697
Epoch: 3390, Training Loss: 0.04707
Epoch: 3390, Training Loss: 0.06133
Epoch: 3391, Training Loss: 0.04086
Epoch: 3391, Training Loss: 0.04694
Epoch: 3391, Training Loss: 0.04705
Epoch: 3391, Training Loss: 0.06130
Epoch: 3392, Training Loss: 0.04085
Epoch: 3392, Training Loss: 0.04692
Epoch: 3392, Training Loss: 0.04702
Epoch: 3392, Training Loss: 0.06127
Epoch: 3393, Training Loss: 0.04083
Epoch: 3393, Training Loss: 0.04690
Epoch: 3393, Training Loss: 0.04700
Epoch: 3393, Training Loss: 0.06123
Epoch: 3394, Training Loss: 0.04081
Epoch: 3394, Training Loss: 0.04687
Epoch: 3394, Training Loss: 0.04697
Epoch: 3394, Training Loss: 0.06120
Epoch: 3395, Training Loss: 0.04079
Epoch: 3395, Training Loss: 0.04685
Epoch: 3395, Training Loss: 0.04695
Epoch: 3395, Training Loss: 0.06117
Epoch: 3396, Training Loss: 0.04077
Epoch: 3396, Training Loss: 0.04682
Epoch: 3396, Training Loss: 0.04693
Epoch: 3396, Training Loss: 0.06113
Epoch: 3397, Training Loss: 0.04075
Epoch: 3397, Training Loss: 0.04680
Epoch: 3397, Training Loss: 0.04690
Epoch: 3397, Training Loss: 0.06110
Epoch: 3398, Training Loss: 0.04074
Epoch: 3398, Training Loss: 0.04678
Epoch: 3398, Training Loss: 0.04688
Epoch: 3398, Training Loss: 0.06107
Epoch: 3399, Training Loss: 0.04072
Epoch: 3399, Training Loss: 0.04675
Epoch: 3399, Training Loss: 0.04686
Epoch: 3399, Training Loss: 0.06104
Epoch: 3400, Training Loss: 0.04070
Epoch: 3400, Training Loss: 0.04673
Epoch: 3400, Training Loss: 0.04683
Epoch: 3400, Training Loss: 0.06100
Epoch: 3401, Training Loss: 0.04068
Epoch: 3401, Training Loss: 0.04671
Epoch: 3401, Training Loss: 0.04681
Epoch: 3401, Training Loss: 0.06097
Epoch: 3402, Training Loss: 0.04066
Epoch: 3402, Training Loss: 0.04668
Epoch: 3402, Training Loss: 0.04679
Epoch: 3402, Training Loss: 0.06094
Epoch: 3403, Training Loss: 0.04065
Epoch: 3403, Training Loss: 0.04666
Epoch: 3403, Training Loss: 0.04676
Epoch: 3403, Training Loss: 0.06091
Epoch: 3404, Training Loss: 0.04063
Epoch: 3404, Training Loss: 0.04664
Epoch: 3404, Training Loss: 0.04674
Epoch: 3404, Training Loss: 0.06087
Epoch: 3405, Training Loss: 0.04061
Epoch: 3405, Training Loss: 0.04661
Epoch: 3405, Training Loss: 0.04672
Epoch: 3405, Training Loss: 0.06084
Epoch: 3406, Training Loss: 0.04059
Epoch: 3406, Training Loss: 0.04659
Epoch: 3406, Training Loss: 0.04669
Epoch: 3406, Training Loss: 0.06081
Epoch: 3407, Training Loss: 0.04058
Epoch: 3407, Training Loss: 0.04657
Epoch: 3407, Training Loss: 0.04667
Epoch: 3407, Training Loss: 0.06078
Epoch: 3408, Training Loss: 0.04056
Epoch: 3408, Training Loss: 0.04654
Epoch: 3408, Training Loss: 0.04665
Epoch: 3408, Training Loss: 0.06074
Epoch: 3409, Training Loss: 0.04054
Epoch: 3409, Training Loss: 0.04652
Epoch: 3409, Training Loss: 0.04662
Epoch: 3409, Training Loss: 0.06071
Epoch: 3410, Training Loss: 0.04052
Epoch: 3410, Training Loss: 0.04650
Epoch: 3410, Training Loss: 0.04660
Epoch: 3410, Training Loss: 0.06068
Epoch: 3411, Training Loss: 0.04050
Epoch: 3411, Training Loss: 0.04647
Epoch: 3411, Training Loss: 0.04658
Epoch: 3411, Training Loss: 0.06065
Epoch: 3412, Training Loss: 0.04049
Epoch: 3412, Training Loss: 0.04645
Epoch: 3412, Training Loss: 0.04655
Epoch: 3412, Training Loss: 0.06061
Epoch: 3413, Training Loss: 0.04047
Epoch: 3413, Training Loss: 0.04643
Epoch: 3413, Training Loss: 0.04653
Epoch: 3413, Training Loss: 0.06058
Epoch: 3414, Training Loss: 0.04045
Epoch: 3414, Training Loss: 0.04641
Epoch: 3414, Training Loss: 0.04651
Epoch: 3414, Training Loss: 0.06055
Epoch: 3415, Training Loss: 0.04043
Epoch: 3415, Training Loss: 0.04638
Epoch: 3415, Training Loss: 0.04648
Epoch: 3415, Training Loss: 0.06052
Epoch: 3416, Training Loss: 0.04042
Epoch: 3416, Training Loss: 0.04636
Epoch: 3416, Training Loss: 0.04646
Epoch: 3416, Training Loss: 0.06049
Epoch: 3417, Training Loss: 0.04040
Epoch: 3417, Training Loss: 0.04634
Epoch: 3417, Training Loss: 0.04644
Epoch: 3417, Training Loss: 0.06045
Epoch: 3418, Training Loss: 0.04038
Epoch: 3418, Training Loss: 0.04631
Epoch: 3418, Training Loss: 0.04642
Epoch: 3418, Training Loss: 0.06042
Epoch: 3419, Training Loss: 0.04036
Epoch: 3419, Training Loss: 0.04629
Epoch: 3419, Training Loss: 0.04639
Epoch: 3419, Training Loss: 0.06039
Epoch: 3420, Training Loss: 0.04035
Epoch: 3420, Training Loss: 0.04627
Epoch: 3420, Training Loss: 0.04637
Epoch: 3420, Training Loss: 0.06036
Epoch: 3421, Training Loss: 0.04033
Epoch: 3421, Training Loss: 0.04625
Epoch: 3421, Training Loss: 0.04635
Epoch: 3421, Training Loss: 0.06033
Epoch: 3422, Training Loss: 0.04031
Epoch: 3422, Training Loss: 0.04622
Epoch: 3422, Training Loss: 0.04632
Epoch: 3422, Training Loss: 0.06030
Epoch: 3423, Training Loss: 0.04029
Epoch: 3423, Training Loss: 0.04620
Epoch: 3423, Training Loss: 0.04630
Epoch: 3423, Training Loss: 0.06026
Epoch: 3424, Training Loss: 0.04028
Epoch: 3424, Training Loss: 0.04618
Epoch: 3424, Training Loss: 0.04628
Epoch: 3424, Training Loss: 0.06023
Epoch: 3425, Training Loss: 0.04026
Epoch: 3425, Training Loss: 0.04616
Epoch: 3425, Training Loss: 0.04626
Epoch: 3425, Training Loss: 0.06020
Epoch: 3426, Training Loss: 0.04024
Epoch: 3426, Training Loss: 0.04613
Epoch: 3426, Training Loss: 0.04623
Epoch: 3426, Training Loss: 0.06017
Epoch: 3427, Training Loss: 0.04022
Epoch: 3427, Training Loss: 0.04611
Epoch: 3427, Training Loss: 0.04621
Epoch: 3427, Training Loss: 0.06014
Epoch: 3428, Training Loss: 0.04021
Epoch: 3428, Training Loss: 0.04609
Epoch: 3428, Training Loss: 0.04619
Epoch: 3428, Training Loss: 0.06011
Epoch: 3429, Training Loss: 0.04019
Epoch: 3429, Training Loss: 0.04606
Epoch: 3429, Training Loss: 0.04617
Epoch: 3429, Training Loss: 0.06008
Epoch: 3430, Training Loss: 0.04017
Epoch: 3430, Training Loss: 0.04604
Epoch: 3430, Training Loss: 0.04614
Epoch: 3430, Training Loss: 0.06004
Epoch: 3431, Training Loss: 0.04015
Epoch: 3431, Training Loss: 0.04602
Epoch: 3431, Training Loss: 0.04612
Epoch: 3431, Training Loss: 0.06001
Epoch: 3432, Training Loss: 0.04014
Epoch: 3432, Training Loss: 0.04600
Epoch: 3432, Training Loss: 0.04610
Epoch: 3432, Training Loss: 0.05998
Epoch: 3433, Training Loss: 0.04012
Epoch: 3433, Training Loss: 0.04598
Epoch: 3433, Training Loss: 0.04608
Epoch: 3433, Training Loss: 0.05995
Epoch: 3434, Training Loss: 0.04010
Epoch: 3434, Training Loss: 0.04595
Epoch: 3434, Training Loss: 0.04605
Epoch: 3434, Training Loss: 0.05992
Epoch: 3435, Training Loss: 0.04008
Epoch: 3435, Training Loss: 0.04593
Epoch: 3435, Training Loss: 0.04603
Epoch: 3435, Training Loss: 0.05989
Epoch: 3436, Training Loss: 0.04007
Epoch: 3436, Training Loss: 0.04591
Epoch: 3436, Training Loss: 0.04601
Epoch: 3436, Training Loss: 0.05986
Epoch: 3437, Training Loss: 0.04005
Epoch: 3437, Training Loss: 0.04589
Epoch: 3437, Training Loss: 0.04599
Epoch: 3437, Training Loss: 0.05983
Epoch: 3438, Training Loss: 0.04003
Epoch: 3438, Training Loss: 0.04586
Epoch: 3438, Training Loss: 0.04596
Epoch: 3438, Training Loss: 0.05980
Epoch: 3439, Training Loss: 0.04002
Epoch: 3439, Training Loss: 0.04584
Epoch: 3439, Training Loss: 0.04594
Epoch: 3439, Training Loss: 0.05976
Epoch: 3440, Training Loss: 0.04000
Epoch: 3440, Training Loss: 0.04582
Epoch: 3440, Training Loss: 0.04592
Epoch: 3440, Training Loss: 0.05973
Epoch: 3441, Training Loss: 0.03998
Epoch: 3441, Training Loss: 0.04580
Epoch: 3441, Training Loss: 0.04590
Epoch: 3441, Training Loss: 0.05970
Epoch: 3442, Training Loss: 0.03996
Epoch: 3442, Training Loss: 0.04577
Epoch: 3442, Training Loss: 0.04587
Epoch: 3442, Training Loss: 0.05967
Epoch: 3443, Training Loss: 0.03995
Epoch: 3443, Training Loss: 0.04575
Epoch: 3443, Training Loss: 0.04585
Epoch: 3443, Training Loss: 0.05964
Epoch: 3444, Training Loss: 0.03993
Epoch: 3444, Training Loss: 0.04573
Epoch: 3444, Training Loss: 0.04583
Epoch: 3444, Training Loss: 0.05961
Epoch: 3445, Training Loss: 0.03991
Epoch: 3445, Training Loss: 0.04571
Epoch: 3445, Training Loss: 0.04581
Epoch: 3445, Training Loss: 0.05958
Epoch: 3446, Training Loss: 0.03990
Epoch: 3446, Training Loss: 0.04569
Epoch: 3446, Training Loss: 0.04579
Epoch: 3446, Training Loss: 0.05955
Epoch: 3447, Training Loss: 0.03988
Epoch: 3447, Training Loss: 0.04566
Epoch: 3447, Training Loss: 0.04576
Epoch: 3447, Training Loss: 0.05952
Epoch: 3448, Training Loss: 0.03986
Epoch: 3448, Training Loss: 0.04564
Epoch: 3448, Training Loss: 0.04574
Epoch: 3448, Training Loss: 0.05949
Epoch: 3449, Training Loss: 0.03985
Epoch: 3449, Training Loss: 0.04562
Epoch: 3449, Training Loss: 0.04572
Epoch: 3449, Training Loss: 0.05946
Epoch: 3450, Training Loss: 0.03983
Epoch: 3450, Training Loss: 0.04560
Epoch: 3450, Training Loss: 0.04570
Epoch: 3450, Training Loss: 0.05943
Epoch: 3451, Training Loss: 0.03981
Epoch: 3451, Training Loss: 0.04558
Epoch: 3451, Training Loss: 0.04568
Epoch: 3451, Training Loss: 0.05940
Epoch: 3452, Training Loss: 0.03980
Epoch: 3452, Training Loss: 0.04556
Epoch: 3452, Training Loss: 0.04565
Epoch: 3452, Training Loss: 0.05937
Epoch: 3453, Training Loss: 0.03978
Epoch: 3453, Training Loss: 0.04553
Epoch: 3453, Training Loss: 0.04563
Epoch: 3453, Training Loss: 0.05934
Epoch: 3454, Training Loss: 0.03976
Epoch: 3454, Training Loss: 0.04551
Epoch: 3454, Training Loss: 0.04561
Epoch: 3454, Training Loss: 0.05931
Epoch: 3455, Training Loss: 0.03974
Epoch: 3455, Training Loss: 0.04549
Epoch: 3455, Training Loss: 0.04559
Epoch: 3455, Training Loss: 0.05928
Epoch: 3456, Training Loss: 0.03973
Epoch: 3456, Training Loss: 0.04547
Epoch: 3456, Training Loss: 0.04557
Epoch: 3456, Training Loss: 0.05925
Epoch: 3457, Training Loss: 0.03971
Epoch: 3457, Training Loss: 0.04545
Epoch: 3457, Training Loss: 0.04555
Epoch: 3457, Training Loss: 0.05922
Epoch: 3458, Training Loss: 0.03969
Epoch: 3458, Training Loss: 0.04543
Epoch: 3458, Training Loss: 0.04552
Epoch: 3458, Training Loss: 0.05919
Epoch: 3459, Training Loss: 0.03968
Epoch: 3459, Training Loss: 0.04540
Epoch: 3459, Training Loss: 0.04550
Epoch: 3459, Training Loss: 0.05916
Epoch: 3460, Training Loss: 0.03966
Epoch: 3460, Training Loss: 0.04538
Epoch: 3460, Training Loss: 0.04548
Epoch: 3460, Training Loss: 0.05913
Epoch: 3461, Training Loss: 0.03964
Epoch: 3461, Training Loss: 0.04536
Epoch: 3461, Training Loss: 0.04546
Epoch: 3461, Training Loss: 0.05910
Epoch: 3462, Training Loss: 0.03963
Epoch: 3462, Training Loss: 0.04534
Epoch: 3462, Training Loss: 0.04544
Epoch: 3462, Training Loss: 0.05907
Epoch: 3463, Training Loss: 0.03961
Epoch: 3463, Training Loss: 0.04532
Epoch: 3463, Training Loss: 0.04542
Epoch: 3463, Training Loss: 0.05904
Epoch: 3464, Training Loss: 0.03959
Epoch: 3464, Training Loss: 0.04530
Epoch: 3464, Training Loss: 0.04539
Epoch: 3464, Training Loss: 0.05901
Epoch: 3465, Training Loss: 0.03958
Epoch: 3465, Training Loss: 0.04527
Epoch: 3465, Training Loss: 0.04537
Epoch: 3465, Training Loss: 0.05898
Epoch: 3466, Training Loss: 0.03956
Epoch: 3466, Training Loss: 0.04525
Epoch: 3466, Training Loss: 0.04535
Epoch: 3466, Training Loss: 0.05895
Epoch: 3467, Training Loss: 0.03954
Epoch: 3467, Training Loss: 0.04523
Epoch: 3467, Training Loss: 0.04533
Epoch: 3467, Training Loss: 0.05892
Epoch: 3468, Training Loss: 0.03953
Epoch: 3468, Training Loss: 0.04521
Epoch: 3468, Training Loss: 0.04531
Epoch: 3468, Training Loss: 0.05889
Epoch: 3469, Training Loss: 0.03951
Epoch: 3469, Training Loss: 0.04519
Epoch: 3469, Training Loss: 0.04529
Epoch: 3469, Training Loss: 0.05886
Epoch: 3470, Training Loss: 0.03950
Epoch: 3470, Training Loss: 0.04517
Epoch: 3470, Training Loss: 0.04527
Epoch: 3470, Training Loss: 0.05883
Epoch: 3471, Training Loss: 0.03948
Epoch: 3471, Training Loss: 0.04515
Epoch: 3471, Training Loss: 0.04524
Epoch: 3471, Training Loss: 0.05880
Epoch: 3472, Training Loss: 0.03946
Epoch: 3472, Training Loss: 0.04513
Epoch: 3472, Training Loss: 0.04522
Epoch: 3472, Training Loss: 0.05877
Epoch: 3473, Training Loss: 0.03945
Epoch: 3473, Training Loss: 0.04510
Epoch: 3473, Training Loss: 0.04520
Epoch: 3473, Training Loss: 0.05874
Epoch: 3474, Training Loss: 0.03943
Epoch: 3474, Training Loss: 0.04508
Epoch: 3474, Training Loss: 0.04518
Epoch: 3474, Training Loss: 0.05871
Epoch: 3475, Training Loss: 0.03941
Epoch: 3475, Training Loss: 0.04506
Epoch: 3475, Training Loss: 0.04516
Epoch: 3475, Training Loss: 0.05868
Epoch: 3476, Training Loss: 0.03940
Epoch: 3476, Training Loss: 0.04504
Epoch: 3476, Training Loss: 0.04514
Epoch: 3476, Training Loss: 0.05865
Epoch: 3477, Training Loss: 0.03938
Epoch: 3477, Training Loss: 0.04502
Epoch: 3477, Training Loss: 0.04512
Epoch: 3477, Training Loss: 0.05862
Epoch: 3478, Training Loss: 0.03936
Epoch: 3478, Training Loss: 0.04500
Epoch: 3478, Training Loss: 0.04510
Epoch: 3478, Training Loss: 0.05859
Epoch: 3479, Training Loss: 0.03935
Epoch: 3479, Training Loss: 0.04498
Epoch: 3479, Training Loss: 0.04507
Epoch: 3479, Training Loss: 0.05856
Epoch: 3480, Training Loss: 0.03933
Epoch: 3480, Training Loss: 0.04496
Epoch: 3480, Training Loss: 0.04505
Epoch: 3480, Training Loss: 0.05853
Epoch: 3481, Training Loss: 0.03931
Epoch: 3481, Training Loss: 0.04494
Epoch: 3481, Training Loss: 0.04503
Epoch: 3481, Training Loss: 0.05850
Epoch: 3482, Training Loss: 0.03930
Epoch: 3482, Training Loss: 0.04491
Epoch: 3482, Training Loss: 0.04501
Epoch: 3482, Training Loss: 0.05848
Epoch: 3483, Training Loss: 0.03928
Epoch: 3483, Training Loss: 0.04489
Epoch: 3483, Training Loss: 0.04499
Epoch: 3483, Training Loss: 0.05845
Epoch: 3484, Training Loss: 0.03927
Epoch: 3484, Training Loss: 0.04487
Epoch: 3484, Training Loss: 0.04497
Epoch: 3484, Training Loss: 0.05842
Epoch: 3485, Training Loss: 0.03925
Epoch: 3485, Training Loss: 0.04485
Epoch: 3485, Training Loss: 0.04495
Epoch: 3485, Training Loss: 0.05839
Epoch: 3486, Training Loss: 0.03923
Epoch: 3486, Training Loss: 0.04483
Epoch: 3486, Training Loss: 0.04493
Epoch: 3486, Training Loss: 0.05836
Epoch: 3487, Training Loss: 0.03922
Epoch: 3487, Training Loss: 0.04481
Epoch: 3487, Training Loss: 0.04491
Epoch: 3487, Training Loss: 0.05833
Epoch: 3488, Training Loss: 0.03920
Epoch: 3488, Training Loss: 0.04479
Epoch: 3488, Training Loss: 0.04489
Epoch: 3488, Training Loss: 0.05830
Epoch: 3489, Training Loss: 0.03919
Epoch: 3489, Training Loss: 0.04477
Epoch: 3489, Training Loss: 0.04487
Epoch: 3489, Training Loss: 0.05827
Epoch: 3490, Training Loss: 0.03917
Epoch: 3490, Training Loss: 0.04475
Epoch: 3490, Training Loss: 0.04484
Epoch: 3490, Training Loss: 0.05824
Epoch: 3491, Training Loss: 0.03915
Epoch: 3491, Training Loss: 0.04473
Epoch: 3491, Training Loss: 0.04482
Epoch: 3491, Training Loss: 0.05822
Epoch: 3492, Training Loss: 0.03914
Epoch: 3492, Training Loss: 0.04471
Epoch: 3492, Training Loss: 0.04480
Epoch: 3492, Training Loss: 0.05819
Epoch: 3493, Training Loss: 0.03912
Epoch: 3493, Training Loss: 0.04469
Epoch: 3493, Training Loss: 0.04478
Epoch: 3493, Training Loss: 0.05816
Epoch: 3494, Training Loss: 0.03911
Epoch: 3494, Training Loss: 0.04466
Epoch: 3494, Training Loss: 0.04476
Epoch: 3494, Training Loss: 0.05813
Epoch: 3495, Training Loss: 0.03909
Epoch: 3495, Training Loss: 0.04464
Epoch: 3495, Training Loss: 0.04474
Epoch: 3495, Training Loss: 0.05810
Epoch: 3496, Training Loss: 0.03907
Epoch: 3496, Training Loss: 0.04462
Epoch: 3496, Training Loss: 0.04472
Epoch: 3496, Training Loss: 0.05807
Epoch: 3497, Training Loss: 0.03906
Epoch: 3497, Training Loss: 0.04460
Epoch: 3497, Training Loss: 0.04470
Epoch: 3497, Training Loss: 0.05804
Epoch: 3498, Training Loss: 0.03904
Epoch: 3498, Training Loss: 0.04458
Epoch: 3498, Training Loss: 0.04468
Epoch: 3498, Training Loss: 0.05802
Epoch: 3499, Training Loss: 0.03903
Epoch: 3499, Training Loss: 0.04456
Epoch: 3499, Training Loss: 0.04466
Epoch: 3499, Training Loss: 0.05799
Epoch: 3500, Training Loss: 0.03901
Epoch: 3500, Training Loss: 0.04454
Epoch: 3500, Training Loss: 0.04464
Epoch: 3500, Training Loss: 0.05796
Epoch: 3501, Training Loss: 0.03899
Epoch: 3501, Training Loss: 0.04452
Epoch: 3501, Training Loss: 0.04462
Epoch: 3501, Training Loss: 0.05793
Epoch: 3502, Training Loss: 0.03898
Epoch: 3502, Training Loss: 0.04450
Epoch: 3502, Training Loss: 0.04460
Epoch: 3502, Training Loss: 0.05790
Epoch: 3503, Training Loss: 0.03896
Epoch: 3503, Training Loss: 0.04448
Epoch: 3503, Training Loss: 0.04458
Epoch: 3503, Training Loss: 0.05787
Epoch: 3504, Training Loss: 0.03895
Epoch: 3504, Training Loss: 0.04446
Epoch: 3504, Training Loss: 0.04456
Epoch: 3504, Training Loss: 0.05785
Epoch: 3505, Training Loss: 0.03893
Epoch: 3505, Training Loss: 0.04444
Epoch: 3505, Training Loss: 0.04454
Epoch: 3505, Training Loss: 0.05782
Epoch: 3506, Training Loss: 0.03891
Epoch: 3506, Training Loss: 0.04442
Epoch: 3506, Training Loss: 0.04452
Epoch: 3506, Training Loss: 0.05779
Epoch: 3507, Training Loss: 0.03890
Epoch: 3507, Training Loss: 0.04440
Epoch: 3507, Training Loss: 0.04450
Epoch: 3507, Training Loss: 0.05776
Epoch: 3508, Training Loss: 0.03888
Epoch: 3508, Training Loss: 0.04438
Epoch: 3508, Training Loss: 0.04447
Epoch: 3508, Training Loss: 0.05773
Epoch: 3509, Training Loss: 0.03887
Epoch: 3509, Training Loss: 0.04436
Epoch: 3509, Training Loss: 0.04445
Epoch: 3509, Training Loss: 0.05770
Epoch: 3510, Training Loss: 0.03885
Epoch: 3510, Training Loss: 0.04434
Epoch: 3510, Training Loss: 0.04443
Epoch: 3510, Training Loss: 0.05768
Epoch: 3511, Training Loss: 0.03884
Epoch: 3511, Training Loss: 0.04432
Epoch: 3511, Training Loss: 0.04441
Epoch: 3511, Training Loss: 0.05765
Epoch: 3512, Training Loss: 0.03882
Epoch: 3512, Training Loss: 0.04430
Epoch: 3512, Training Loss: 0.04439
Epoch: 3512, Training Loss: 0.05762
Epoch: 3513, Training Loss: 0.03880
Epoch: 3513, Training Loss: 0.04428
Epoch: 3513, Training Loss: 0.04437
Epoch: 3513, Training Loss: 0.05759
Epoch: 3514, Training Loss: 0.03879
Epoch: 3514, Training Loss: 0.04426
Epoch: 3514, Training Loss: 0.04435
Epoch: 3514, Training Loss: 0.05757
Epoch: 3515, Training Loss: 0.03877
Epoch: 3515, Training Loss: 0.04424
Epoch: 3515, Training Loss: 0.04433
Epoch: 3515, Training Loss: 0.05754
Epoch: 3516, Training Loss: 0.03876
Epoch: 3516, Training Loss: 0.04422
Epoch: 3516, Training Loss: 0.04431
Epoch: 3516, Training Loss: 0.05751
Epoch: 3517, Training Loss: 0.03874
Epoch: 3517, Training Loss: 0.04420
Epoch: 3517, Training Loss: 0.04429
Epoch: 3517, Training Loss: 0.05748
Epoch: 3518, Training Loss: 0.03873
Epoch: 3518, Training Loss: 0.04418
Epoch: 3518, Training Loss: 0.04427
Epoch: 3518, Training Loss: 0.05745
Epoch: 3519, Training Loss: 0.03871
Epoch: 3519, Training Loss: 0.04416
Epoch: 3519, Training Loss: 0.04425
Epoch: 3519, Training Loss: 0.05743
Epoch: 3520, Training Loss: 0.03869
Epoch: 3520, Training Loss: 0.04414
Epoch: 3520, Training Loss: 0.04423
Epoch: 3520, Training Loss: 0.05740
Epoch: 3521, Training Loss: 0.03868
Epoch: 3521, Training Loss: 0.04412
Epoch: 3521, Training Loss: 0.04421
Epoch: 3521, Training Loss: 0.05737
Epoch: 3522, Training Loss: 0.03866
Epoch: 3522, Training Loss: 0.04410
Epoch: 3522, Training Loss: 0.04419
Epoch: 3522, Training Loss: 0.05734
Epoch: 3523, Training Loss: 0.03865
Epoch: 3523, Training Loss: 0.04408
Epoch: 3523, Training Loss: 0.04417
Epoch: 3523, Training Loss: 0.05732
Epoch: 3524, Training Loss: 0.03863
Epoch: 3524, Training Loss: 0.04406
Epoch: 3524, Training Loss: 0.04415
Epoch: 3524, Training Loss: 0.05729
Epoch: 3525, Training Loss: 0.03862
Epoch: 3525, Training Loss: 0.04404
Epoch: 3525, Training Loss: 0.04413
Epoch: 3525, Training Loss: 0.05726
Epoch: 3526, Training Loss: 0.03860
Epoch: 3526, Training Loss: 0.04402
Epoch: 3526, Training Loss: 0.04411
Epoch: 3526, Training Loss: 0.05723
Epoch: 3527, Training Loss: 0.03859
Epoch: 3527, Training Loss: 0.04400
Epoch: 3527, Training Loss: 0.04409
Epoch: 3527, Training Loss: 0.05721
Epoch: 3528, Training Loss: 0.03857
Epoch: 3528, Training Loss: 0.04398
Epoch: 3528, Training Loss: 0.04407
Epoch: 3528, Training Loss: 0.05718
Epoch: 3529, Training Loss: 0.03856
Epoch: 3529, Training Loss: 0.04396
Epoch: 3529, Training Loss: 0.04405
Epoch: 3529, Training Loss: 0.05715
Epoch: 3530, Training Loss: 0.03854
Epoch: 3530, Training Loss: 0.04394
Epoch: 3530, Training Loss: 0.04403
Epoch: 3530, Training Loss: 0.05712
Epoch: 3531, Training Loss: 0.03853
Epoch: 3531, Training Loss: 0.04392
Epoch: 3531, Training Loss: 0.04401
Epoch: 3531, Training Loss: 0.05710
Epoch: 3532, Training Loss: 0.03851
Epoch: 3532, Training Loss: 0.04390
Epoch: 3532, Training Loss: 0.04399
Epoch: 3532, Training Loss: 0.05707
Epoch: 3533, Training Loss: 0.03849
Epoch: 3533, Training Loss: 0.04388
Epoch: 3533, Training Loss: 0.04398
Epoch: 3533, Training Loss: 0.05704
Epoch: 3534, Training Loss: 0.03848
Epoch: 3534, Training Loss: 0.04386
Epoch: 3534, Training Loss: 0.04396
Epoch: 3534, Training Loss: 0.05702
Epoch: 3535, Training Loss: 0.03846
Epoch: 3535, Training Loss: 0.04384
Epoch: 3535, Training Loss: 0.04394
Epoch: 3535, Training Loss: 0.05699
Epoch: 3536, Training Loss: 0.03845
Epoch: 3536, Training Loss: 0.04382
Epoch: 3536, Training Loss: 0.04392
Epoch: 3536, Training Loss: 0.05696
Epoch: 3537, Training Loss: 0.03843
Epoch: 3537, Training Loss: 0.04380
Epoch: 3537, Training Loss: 0.04390
Epoch: 3537, Training Loss: 0.05693
Epoch: 3538, Training Loss: 0.03842
Epoch: 3538, Training Loss: 0.04378
Epoch: 3538, Training Loss: 0.04388
Epoch: 3538, Training Loss: 0.05691
Epoch: 3539, Training Loss: 0.03840
Epoch: 3539, Training Loss: 0.04376
Epoch: 3539, Training Loss: 0.04386
Epoch: 3539, Training Loss: 0.05688
Epoch: 3540, Training Loss: 0.03839
Epoch: 3540, Training Loss: 0.04374
Epoch: 3540, Training Loss: 0.04384
Epoch: 3540, Training Loss: 0.05685
Epoch: 3541, Training Loss: 0.03837
Epoch: 3541, Training Loss: 0.04372
Epoch: 3541, Training Loss: 0.04382
Epoch: 3541, Training Loss: 0.05683
Epoch: 3542, Training Loss: 0.03836
Epoch: 3542, Training Loss: 0.04371
Epoch: 3542, Training Loss: 0.04380
Epoch: 3542, Training Loss: 0.05680
Epoch: 3543, Training Loss: 0.03834
Epoch: 3543, Training Loss: 0.04369
Epoch: 3543, Training Loss: 0.04378
Epoch: 3543, Training Loss: 0.05677
Epoch: 3544, Training Loss: 0.03833
Epoch: 3544, Training Loss: 0.04367
Epoch: 3544, Training Loss: 0.04376
Epoch: 3544, Training Loss: 0.05675
Epoch: 3545, Training Loss: 0.03831
Epoch: 3545, Training Loss: 0.04365
Epoch: 3545, Training Loss: 0.04374
Epoch: 3545, Training Loss: 0.05672
Epoch: 3546, Training Loss: 0.03830
Epoch: 3546, Training Loss: 0.04363
Epoch: 3546, Training Loss: 0.04372
Epoch: 3546, Training Loss: 0.05669
Epoch: 3547, Training Loss: 0.03828
Epoch: 3547, Training Loss: 0.04361
Epoch: 3547, Training Loss: 0.04370
Epoch: 3547, Training Loss: 0.05667
Epoch: 3548, Training Loss: 0.03827
Epoch: 3548, Training Loss: 0.04359
Epoch: 3548, Training Loss: 0.04368
Epoch: 3548, Training Loss: 0.05664
Epoch: 3549, Training Loss: 0.03825
Epoch: 3549, Training Loss: 0.04357
Epoch: 3549, Training Loss: 0.04366
Epoch: 3549, Training Loss: 0.05661
Epoch: 3550, Training Loss: 0.03824
Epoch: 3550, Training Loss: 0.04355
Epoch: 3550, Training Loss: 0.04364
Epoch: 3550, Training Loss: 0.05659
Epoch: 3551, Training Loss: 0.03822
Epoch: 3551, Training Loss: 0.04353
Epoch: 3551, Training Loss: 0.04363
Epoch: 3551, Training Loss: 0.05656
...
[output truncated: epochs 3552–3798 omitted; the four per-pattern losses decrease steadily, from roughly 0.038–0.057 at epoch 3551 to roughly 0.035–0.051 at epoch 3799]
...
Epoch: 3799, Training Loss: 0.03496
Epoch: 3799, Training Loss: 0.03941
Epoch: 3799, Training Loss: 0.03949
Epoch: 3799, Training Loss: 0.05089
Epoch: 3800, Training Loss: 0.03495
Epoch: 3800, Training Loss: 0.03939
Epoch: 3800, Training Loss: 0.03948
Epoch: 3800, Training Loss: 0.05087
Epoch: 3801, Training Loss: 0.03494
Epoch: 3801, Training Loss: 0.03938
Epoch: 3801, Training Loss: 0.03946
Epoch: 3801, Training Loss: 0.05085
Epoch: 3802, Training Loss: 0.03493
Epoch: 3802, Training Loss: 0.03936
Epoch: 3802, Training Loss: 0.03945
Epoch: 3802, Training Loss: 0.05083
Epoch: 3803, Training Loss: 0.03492
Epoch: 3803, Training Loss: 0.03935
Epoch: 3803, Training Loss: 0.03943
Epoch: 3803, Training Loss: 0.05081
Epoch: 3804, Training Loss: 0.03490
Epoch: 3804, Training Loss: 0.03934
Epoch: 3804, Training Loss: 0.03942
Epoch: 3804, Training Loss: 0.05079
Epoch: 3805, Training Loss: 0.03489
Epoch: 3805, Training Loss: 0.03932
Epoch: 3805, Training Loss: 0.03940
Epoch: 3805, Training Loss: 0.05077
Epoch: 3806, Training Loss: 0.03488
Epoch: 3806, Training Loss: 0.03931
Epoch: 3806, Training Loss: 0.03939
Epoch: 3806, Training Loss: 0.05075
Epoch: 3807, Training Loss: 0.03487
Epoch: 3807, Training Loss: 0.03929
Epoch: 3807, Training Loss: 0.03937
Epoch: 3807, Training Loss: 0.05073
Epoch: 3808, Training Loss: 0.03486
Epoch: 3808, Training Loss: 0.03928
Epoch: 3808, Training Loss: 0.03936
Epoch: 3808, Training Loss: 0.05071
Epoch: 3809, Training Loss: 0.03485
Epoch: 3809, Training Loss: 0.03926
Epoch: 3809, Training Loss: 0.03935
Epoch: 3809, Training Loss: 0.05069
Epoch: 3810, Training Loss: 0.03484
Epoch: 3810, Training Loss: 0.03925
Epoch: 3810, Training Loss: 0.03933
Epoch: 3810, Training Loss: 0.05067
Epoch: 3811, Training Loss: 0.03482
Epoch: 3811, Training Loss: 0.03923
Epoch: 3811, Training Loss: 0.03932
Epoch: 3811, Training Loss: 0.05065
Epoch: 3812, Training Loss: 0.03481
Epoch: 3812, Training Loss: 0.03922
Epoch: 3812, Training Loss: 0.03930
Epoch: 3812, Training Loss: 0.05063
Epoch: 3813, Training Loss: 0.03480
Epoch: 3813, Training Loss: 0.03921
Epoch: 3813, Training Loss: 0.03929
Epoch: 3813, Training Loss: 0.05061
Epoch: 3814, Training Loss: 0.03479
Epoch: 3814, Training Loss: 0.03919
Epoch: 3814, Training Loss: 0.03927
Epoch: 3814, Training Loss: 0.05059
Epoch: 3815, Training Loss: 0.03478
Epoch: 3815, Training Loss: 0.03918
Epoch: 3815, Training Loss: 0.03926
Epoch: 3815, Training Loss: 0.05057
Epoch: 3816, Training Loss: 0.03477
Epoch: 3816, Training Loss: 0.03916
Epoch: 3816, Training Loss: 0.03925
Epoch: 3816, Training Loss: 0.05055
Epoch: 3817, Training Loss: 0.03476
Epoch: 3817, Training Loss: 0.03915
Epoch: 3817, Training Loss: 0.03923
Epoch: 3817, Training Loss: 0.05053
Epoch: 3818, Training Loss: 0.03474
Epoch: 3818, Training Loss: 0.03914
Epoch: 3818, Training Loss: 0.03922
Epoch: 3818, Training Loss: 0.05051
Epoch: 3819, Training Loss: 0.03473
Epoch: 3819, Training Loss: 0.03912
Epoch: 3819, Training Loss: 0.03920
Epoch: 3819, Training Loss: 0.05049
Epoch: 3820, Training Loss: 0.03472
Epoch: 3820, Training Loss: 0.03911
Epoch: 3820, Training Loss: 0.03919
Epoch: 3820, Training Loss: 0.05048
Epoch: 3821, Training Loss: 0.03471
Epoch: 3821, Training Loss: 0.03909
Epoch: 3821, Training Loss: 0.03917
Epoch: 3821, Training Loss: 0.05046
Epoch: 3822, Training Loss: 0.03470
Epoch: 3822, Training Loss: 0.03908
Epoch: 3822, Training Loss: 0.03916
Epoch: 3822, Training Loss: 0.05044
Epoch: 3823, Training Loss: 0.03469
Epoch: 3823, Training Loss: 0.03906
Epoch: 3823, Training Loss: 0.03915
Epoch: 3823, Training Loss: 0.05042
Epoch: 3824, Training Loss: 0.03468
Epoch: 3824, Training Loss: 0.03905
Epoch: 3824, Training Loss: 0.03913
Epoch: 3824, Training Loss: 0.05040
Epoch: 3825, Training Loss: 0.03467
Epoch: 3825, Training Loss: 0.03904
Epoch: 3825, Training Loss: 0.03912
Epoch: 3825, Training Loss: 0.05038
Epoch: 3826, Training Loss: 0.03465
Epoch: 3826, Training Loss: 0.03902
Epoch: 3826, Training Loss: 0.03910
Epoch: 3826, Training Loss: 0.05036
Epoch: 3827, Training Loss: 0.03464
Epoch: 3827, Training Loss: 0.03901
Epoch: 3827, Training Loss: 0.03909
Epoch: 3827, Training Loss: 0.05034
Epoch: 3828, Training Loss: 0.03463
Epoch: 3828, Training Loss: 0.03899
Epoch: 3828, Training Loss: 0.03908
Epoch: 3828, Training Loss: 0.05032
Epoch: 3829, Training Loss: 0.03462
Epoch: 3829, Training Loss: 0.03898
Epoch: 3829, Training Loss: 0.03906
Epoch: 3829, Training Loss: 0.05030
Epoch: 3830, Training Loss: 0.03461
Epoch: 3830, Training Loss: 0.03897
Epoch: 3830, Training Loss: 0.03905
Epoch: 3830, Training Loss: 0.05028
Epoch: 3831, Training Loss: 0.03460
Epoch: 3831, Training Loss: 0.03895
Epoch: 3831, Training Loss: 0.03903
Epoch: 3831, Training Loss: 0.05026
Epoch: 3832, Training Loss: 0.03459
Epoch: 3832, Training Loss: 0.03894
Epoch: 3832, Training Loss: 0.03902
Epoch: 3832, Training Loss: 0.05024
Epoch: 3833, Training Loss: 0.03458
Epoch: 3833, Training Loss: 0.03892
Epoch: 3833, Training Loss: 0.03901
Epoch: 3833, Training Loss: 0.05023
Epoch: 3834, Training Loss: 0.03456
Epoch: 3834, Training Loss: 0.03891
Epoch: 3834, Training Loss: 0.03899
Epoch: 3834, Training Loss: 0.05021
Epoch: 3835, Training Loss: 0.03455
Epoch: 3835, Training Loss: 0.03890
Epoch: 3835, Training Loss: 0.03898
Epoch: 3835, Training Loss: 0.05019
Epoch: 3836, Training Loss: 0.03454
Epoch: 3836, Training Loss: 0.03888
Epoch: 3836, Training Loss: 0.03896
Epoch: 3836, Training Loss: 0.05017
Epoch: 3837, Training Loss: 0.03453
Epoch: 3837, Training Loss: 0.03887
Epoch: 3837, Training Loss: 0.03895
Epoch: 3837, Training Loss: 0.05015
Epoch: 3838, Training Loss: 0.03452
Epoch: 3838, Training Loss: 0.03885
Epoch: 3838, Training Loss: 0.03894
Epoch: 3838, Training Loss: 0.05013
Epoch: 3839, Training Loss: 0.03451
Epoch: 3839, Training Loss: 0.03884
Epoch: 3839, Training Loss: 0.03892
Epoch: 3839, Training Loss: 0.05011
Epoch: 3840, Training Loss: 0.03450
Epoch: 3840, Training Loss: 0.03883
Epoch: 3840, Training Loss: 0.03891
Epoch: 3840, Training Loss: 0.05009
Epoch: 3841, Training Loss: 0.03449
Epoch: 3841, Training Loss: 0.03881
Epoch: 3841, Training Loss: 0.03889
Epoch: 3841, Training Loss: 0.05007
Epoch: 3842, Training Loss: 0.03448
Epoch: 3842, Training Loss: 0.03880
Epoch: 3842, Training Loss: 0.03888
Epoch: 3842, Training Loss: 0.05005
Epoch: 3843, Training Loss: 0.03446
Epoch: 3843, Training Loss: 0.03879
Epoch: 3843, Training Loss: 0.03887
Epoch: 3843, Training Loss: 0.05004
Epoch: 3844, Training Loss: 0.03445
Epoch: 3844, Training Loss: 0.03877
Epoch: 3844, Training Loss: 0.03885
Epoch: 3844, Training Loss: 0.05002
Epoch: 3845, Training Loss: 0.03444
Epoch: 3845, Training Loss: 0.03876
Epoch: 3845, Training Loss: 0.03884
Epoch: 3845, Training Loss: 0.05000
Epoch: 3846, Training Loss: 0.03443
Epoch: 3846, Training Loss: 0.03874
Epoch: 3846, Training Loss: 0.03883
Epoch: 3846, Training Loss: 0.04998
Epoch: 3847, Training Loss: 0.03442
Epoch: 3847, Training Loss: 0.03873
Epoch: 3847, Training Loss: 0.03881
Epoch: 3847, Training Loss: 0.04996
Epoch: 3848, Training Loss: 0.03441
Epoch: 3848, Training Loss: 0.03872
Epoch: 3848, Training Loss: 0.03880
Epoch: 3848, Training Loss: 0.04994
Epoch: 3849, Training Loss: 0.03440
Epoch: 3849, Training Loss: 0.03870
Epoch: 3849, Training Loss: 0.03878
Epoch: 3849, Training Loss: 0.04992
Epoch: 3850, Training Loss: 0.03439
Epoch: 3850, Training Loss: 0.03869
Epoch: 3850, Training Loss: 0.03877
Epoch: 3850, Training Loss: 0.04990
Epoch: 3851, Training Loss: 0.03438
Epoch: 3851, Training Loss: 0.03868
Epoch: 3851, Training Loss: 0.03876
Epoch: 3851, Training Loss: 0.04989
Epoch: 3852, Training Loss: 0.03437
Epoch: 3852, Training Loss: 0.03866
Epoch: 3852, Training Loss: 0.03874
Epoch: 3852, Training Loss: 0.04987
Epoch: 3853, Training Loss: 0.03435
Epoch: 3853, Training Loss: 0.03865
Epoch: 3853, Training Loss: 0.03873
Epoch: 3853, Training Loss: 0.04985
Epoch: 3854, Training Loss: 0.03434
Epoch: 3854, Training Loss: 0.03863
Epoch: 3854, Training Loss: 0.03872
Epoch: 3854, Training Loss: 0.04983
Epoch: 3855, Training Loss: 0.03433
Epoch: 3855, Training Loss: 0.03862
Epoch: 3855, Training Loss: 0.03870
Epoch: 3855, Training Loss: 0.04981
Epoch: 3856, Training Loss: 0.03432
Epoch: 3856, Training Loss: 0.03861
Epoch: 3856, Training Loss: 0.03869
Epoch: 3856, Training Loss: 0.04979
Epoch: 3857, Training Loss: 0.03431
Epoch: 3857, Training Loss: 0.03859
Epoch: 3857, Training Loss: 0.03867
Epoch: 3857, Training Loss: 0.04977
Epoch: 3858, Training Loss: 0.03430
Epoch: 3858, Training Loss: 0.03858
Epoch: 3858, Training Loss: 0.03866
Epoch: 3858, Training Loss: 0.04976
Epoch: 3859, Training Loss: 0.03429
Epoch: 3859, Training Loss: 0.03857
Epoch: 3859, Training Loss: 0.03865
Epoch: 3859, Training Loss: 0.04974
Epoch: 3860, Training Loss: 0.03428
Epoch: 3860, Training Loss: 0.03855
Epoch: 3860, Training Loss: 0.03863
Epoch: 3860, Training Loss: 0.04972
Epoch: 3861, Training Loss: 0.03427
Epoch: 3861, Training Loss: 0.03854
Epoch: 3861, Training Loss: 0.03862
Epoch: 3861, Training Loss: 0.04970
Epoch: 3862, Training Loss: 0.03426
Epoch: 3862, Training Loss: 0.03853
Epoch: 3862, Training Loss: 0.03861
Epoch: 3862, Training Loss: 0.04968
Epoch: 3863, Training Loss: 0.03424
Epoch: 3863, Training Loss: 0.03851
Epoch: 3863, Training Loss: 0.03859
Epoch: 3863, Training Loss: 0.04966
Epoch: 3864, Training Loss: 0.03423
Epoch: 3864, Training Loss: 0.03850
Epoch: 3864, Training Loss: 0.03858
Epoch: 3864, Training Loss: 0.04964
Epoch: 3865, Training Loss: 0.03422
Epoch: 3865, Training Loss: 0.03849
Epoch: 3865, Training Loss: 0.03857
Epoch: 3865, Training Loss: 0.04963
Epoch: 3866, Training Loss: 0.03421
Epoch: 3866, Training Loss: 0.03847
Epoch: 3866, Training Loss: 0.03855
Epoch: 3866, Training Loss: 0.04961
Epoch: 3867, Training Loss: 0.03420
Epoch: 3867, Training Loss: 0.03846
Epoch: 3867, Training Loss: 0.03854
Epoch: 3867, Training Loss: 0.04959
Epoch: 3868, Training Loss: 0.03419
Epoch: 3868, Training Loss: 0.03844
Epoch: 3868, Training Loss: 0.03852
Epoch: 3868, Training Loss: 0.04957
Epoch: 3869, Training Loss: 0.03418
Epoch: 3869, Training Loss: 0.03843
Epoch: 3869, Training Loss: 0.03851
Epoch: 3869, Training Loss: 0.04955
Epoch: 3870, Training Loss: 0.03417
Epoch: 3870, Training Loss: 0.03842
Epoch: 3870, Training Loss: 0.03850
Epoch: 3870, Training Loss: 0.04953
Epoch: 3871, Training Loss: 0.03416
Epoch: 3871, Training Loss: 0.03840
Epoch: 3871, Training Loss: 0.03848
Epoch: 3871, Training Loss: 0.04952
Epoch: 3872, Training Loss: 0.03415
Epoch: 3872, Training Loss: 0.03839
Epoch: 3872, Training Loss: 0.03847
Epoch: 3872, Training Loss: 0.04950
Epoch: 3873, Training Loss: 0.03414
Epoch: 3873, Training Loss: 0.03838
Epoch: 3873, Training Loss: 0.03846
Epoch: 3873, Training Loss: 0.04948
Epoch: 3874, Training Loss: 0.03413
Epoch: 3874, Training Loss: 0.03836
Epoch: 3874, Training Loss: 0.03844
Epoch: 3874, Training Loss: 0.04946
Epoch: 3875, Training Loss: 0.03412
Epoch: 3875, Training Loss: 0.03835
Epoch: 3875, Training Loss: 0.03843
Epoch: 3875, Training Loss: 0.04944
Epoch: 3876, Training Loss: 0.03410
Epoch: 3876, Training Loss: 0.03834
Epoch: 3876, Training Loss: 0.03842
Epoch: 3876, Training Loss: 0.04942
Epoch: 3877, Training Loss: 0.03409
Epoch: 3877, Training Loss: 0.03832
Epoch: 3877, Training Loss: 0.03840
Epoch: 3877, Training Loss: 0.04941
Epoch: 3878, Training Loss: 0.03408
Epoch: 3878, Training Loss: 0.03831
Epoch: 3878, Training Loss: 0.03839
Epoch: 3878, Training Loss: 0.04939
Epoch: 3879, Training Loss: 0.03407
Epoch: 3879, Training Loss: 0.03830
Epoch: 3879, Training Loss: 0.03838
Epoch: 3879, Training Loss: 0.04937
Epoch: 3880, Training Loss: 0.03406
Epoch: 3880, Training Loss: 0.03828
Epoch: 3880, Training Loss: 0.03836
Epoch: 3880, Training Loss: 0.04935
Epoch: 3881, Training Loss: 0.03405
Epoch: 3881, Training Loss: 0.03827
Epoch: 3881, Training Loss: 0.03835
Epoch: 3881, Training Loss: 0.04933
Epoch: 3882, Training Loss: 0.03404
Epoch: 3882, Training Loss: 0.03826
Epoch: 3882, Training Loss: 0.03834
Epoch: 3882, Training Loss: 0.04931
Epoch: 3883, Training Loss: 0.03403
Epoch: 3883, Training Loss: 0.03824
Epoch: 3883, Training Loss: 0.03832
Epoch: 3883, Training Loss: 0.04930
Epoch: 3884, Training Loss: 0.03402
Epoch: 3884, Training Loss: 0.03823
Epoch: 3884, Training Loss: 0.03831
Epoch: 3884, Training Loss: 0.04928
Epoch: 3885, Training Loss: 0.03401
Epoch: 3885, Training Loss: 0.03822
Epoch: 3885, Training Loss: 0.03830
Epoch: 3885, Training Loss: 0.04926
Epoch: 3886, Training Loss: 0.03400
Epoch: 3886, Training Loss: 0.03820
Epoch: 3886, Training Loss: 0.03828
Epoch: 3886, Training Loss: 0.04924
Epoch: 3887, Training Loss: 0.03399
Epoch: 3887, Training Loss: 0.03819
Epoch: 3887, Training Loss: 0.03827
Epoch: 3887, Training Loss: 0.04922
Epoch: 3888, Training Loss: 0.03398
Epoch: 3888, Training Loss: 0.03818
Epoch: 3888, Training Loss: 0.03826
Epoch: 3888, Training Loss: 0.04921
Epoch: 3889, Training Loss: 0.03397
Epoch: 3889, Training Loss: 0.03816
Epoch: 3889, Training Loss: 0.03824
Epoch: 3889, Training Loss: 0.04919
Epoch: 3890, Training Loss: 0.03395
Epoch: 3890, Training Loss: 0.03815
Epoch: 3890, Training Loss: 0.03823
Epoch: 3890, Training Loss: 0.04917
Epoch: 3891, Training Loss: 0.03394
Epoch: 3891, Training Loss: 0.03814
Epoch: 3891, Training Loss: 0.03822
Epoch: 3891, Training Loss: 0.04915
Epoch: 3892, Training Loss: 0.03393
Epoch: 3892, Training Loss: 0.03813
Epoch: 3892, Training Loss: 0.03820
Epoch: 3892, Training Loss: 0.04913
Epoch: 3893, Training Loss: 0.03392
Epoch: 3893, Training Loss: 0.03811
Epoch: 3893, Training Loss: 0.03819
Epoch: 3893, Training Loss: 0.04912
Epoch: 3894, Training Loss: 0.03391
Epoch: 3894, Training Loss: 0.03810
Epoch: 3894, Training Loss: 0.03818
Epoch: 3894, Training Loss: 0.04910
Epoch: 3895, Training Loss: 0.03390
Epoch: 3895, Training Loss: 0.03809
Epoch: 3895, Training Loss: 0.03816
Epoch: 3895, Training Loss: 0.04908
Epoch: 3896, Training Loss: 0.03389
Epoch: 3896, Training Loss: 0.03807
Epoch: 3896, Training Loss: 0.03815
Epoch: 3896, Training Loss: 0.04906
Epoch: 3897, Training Loss: 0.03388
Epoch: 3897, Training Loss: 0.03806
Epoch: 3897, Training Loss: 0.03814
Epoch: 3897, Training Loss: 0.04905
Epoch: 3898, Training Loss: 0.03387
Epoch: 3898, Training Loss: 0.03805
Epoch: 3898, Training Loss: 0.03813
Epoch: 3898, Training Loss: 0.04903
Epoch: 3899, Training Loss: 0.03386
Epoch: 3899, Training Loss: 0.03803
Epoch: 3899, Training Loss: 0.03811
Epoch: 3899, Training Loss: 0.04901
Epoch: 3900, Training Loss: 0.03385
Epoch: 3900, Training Loss: 0.03802
Epoch: 3900, Training Loss: 0.03810
Epoch: 3900, Training Loss: 0.04899
Epoch: 3901, Training Loss: 0.03384
Epoch: 3901, Training Loss: 0.03801
Epoch: 3901, Training Loss: 0.03809
Epoch: 3901, Training Loss: 0.04897
Epoch: 3902, Training Loss: 0.03383
Epoch: 3902, Training Loss: 0.03799
Epoch: 3902, Training Loss: 0.03807
Epoch: 3902, Training Loss: 0.04896
Epoch: 3903, Training Loss: 0.03382
Epoch: 3903, Training Loss: 0.03798
Epoch: 3903, Training Loss: 0.03806
Epoch: 3903, Training Loss: 0.04894
Epoch: 3904, Training Loss: 0.03381
Epoch: 3904, Training Loss: 0.03797
Epoch: 3904, Training Loss: 0.03805
Epoch: 3904, Training Loss: 0.04892
Epoch: 3905, Training Loss: 0.03380
Epoch: 3905, Training Loss: 0.03796
Epoch: 3905, Training Loss: 0.03803
Epoch: 3905, Training Loss: 0.04890
Epoch: 3906, Training Loss: 0.03379
Epoch: 3906, Training Loss: 0.03794
Epoch: 3906, Training Loss: 0.03802
Epoch: 3906, Training Loss: 0.04889
Epoch: 3907, Training Loss: 0.03378
Epoch: 3907, Training Loss: 0.03793
Epoch: 3907, Training Loss: 0.03801
Epoch: 3907, Training Loss: 0.04887
Epoch: 3908, Training Loss: 0.03377
Epoch: 3908, Training Loss: 0.03792
Epoch: 3908, Training Loss: 0.03800
Epoch: 3908, Training Loss: 0.04885
Epoch: 3909, Training Loss: 0.03376
Epoch: 3909, Training Loss: 0.03790
Epoch: 3909, Training Loss: 0.03798
Epoch: 3909, Training Loss: 0.04883
Epoch: 3910, Training Loss: 0.03374
Epoch: 3910, Training Loss: 0.03789
Epoch: 3910, Training Loss: 0.03797
Epoch: 3910, Training Loss: 0.04881
Epoch: 3911, Training Loss: 0.03373
Epoch: 3911, Training Loss: 0.03788
Epoch: 3911, Training Loss: 0.03796
Epoch: 3911, Training Loss: 0.04880
Epoch: 3912, Training Loss: 0.03372
Epoch: 3912, Training Loss: 0.03786
Epoch: 3912, Training Loss: 0.03794
Epoch: 3912, Training Loss: 0.04878
Epoch: 3913, Training Loss: 0.03371
Epoch: 3913, Training Loss: 0.03785
Epoch: 3913, Training Loss: 0.03793
Epoch: 3913, Training Loss: 0.04876
Epoch: 3914, Training Loss: 0.03370
Epoch: 3914, Training Loss: 0.03784
Epoch: 3914, Training Loss: 0.03792
Epoch: 3914, Training Loss: 0.04874
Epoch: 3915, Training Loss: 0.03369
Epoch: 3915, Training Loss: 0.03783
Epoch: 3915, Training Loss: 0.03790
Epoch: 3915, Training Loss: 0.04873
Epoch: 3916, Training Loss: 0.03368
Epoch: 3916, Training Loss: 0.03781
Epoch: 3916, Training Loss: 0.03789
Epoch: 3916, Training Loss: 0.04871
Epoch: 3917, Training Loss: 0.03367
Epoch: 3917, Training Loss: 0.03780
Epoch: 3917, Training Loss: 0.03788
Epoch: 3917, Training Loss: 0.04869
Epoch: 3918, Training Loss: 0.03366
Epoch: 3918, Training Loss: 0.03779
Epoch: 3918, Training Loss: 0.03787
Epoch: 3918, Training Loss: 0.04867
Epoch: 3919, Training Loss: 0.03365
Epoch: 3919, Training Loss: 0.03778
Epoch: 3919, Training Loss: 0.03785
Epoch: 3919, Training Loss: 0.04866
Epoch: 3920, Training Loss: 0.03364
Epoch: 3920, Training Loss: 0.03776
Epoch: 3920, Training Loss: 0.03784
Epoch: 3920, Training Loss: 0.04864
Epoch: 3921, Training Loss: 0.03363
Epoch: 3921, Training Loss: 0.03775
Epoch: 3921, Training Loss: 0.03783
Epoch: 3921, Training Loss: 0.04862
Epoch: 3922, Training Loss: 0.03362
Epoch: 3922, Training Loss: 0.03774
Epoch: 3922, Training Loss: 0.03781
Epoch: 3922, Training Loss: 0.04860
Epoch: 3923, Training Loss: 0.03361
Epoch: 3923, Training Loss: 0.03772
Epoch: 3923, Training Loss: 0.03780
Epoch: 3923, Training Loss: 0.04859
Epoch: 3924, Training Loss: 0.03360
Epoch: 3924, Training Loss: 0.03771
Epoch: 3924, Training Loss: 0.03779
Epoch: 3924, Training Loss: 0.04857
Epoch: 3925, Training Loss: 0.03359
Epoch: 3925, Training Loss: 0.03770
Epoch: 3925, Training Loss: 0.03778
Epoch: 3925, Training Loss: 0.04855
Epoch: 3926, Training Loss: 0.03358
Epoch: 3926, Training Loss: 0.03769
Epoch: 3926, Training Loss: 0.03776
Epoch: 3926, Training Loss: 0.04854
Epoch: 3927, Training Loss: 0.03357
Epoch: 3927, Training Loss: 0.03767
Epoch: 3927, Training Loss: 0.03775
Epoch: 3927, Training Loss: 0.04852
Epoch: 3928, Training Loss: 0.03356
Epoch: 3928, Training Loss: 0.03766
Epoch: 3928, Training Loss: 0.03774
Epoch: 3928, Training Loss: 0.04850
Epoch: 3929, Training Loss: 0.03355
Epoch: 3929, Training Loss: 0.03765
Epoch: 3929, Training Loss: 0.03773
Epoch: 3929, Training Loss: 0.04848
Epoch: 3930, Training Loss: 0.03354
Epoch: 3930, Training Loss: 0.03763
Epoch: 3930, Training Loss: 0.03771
Epoch: 3930, Training Loss: 0.04847
Epoch: 3931, Training Loss: 0.03353
Epoch: 3931, Training Loss: 0.03762
Epoch: 3931, Training Loss: 0.03770
Epoch: 3931, Training Loss: 0.04845
Epoch: 3932, Training Loss: 0.03352
Epoch: 3932, Training Loss: 0.03761
Epoch: 3932, Training Loss: 0.03769
Epoch: 3932, Training Loss: 0.04843
Epoch: 3933, Training Loss: 0.03351
Epoch: 3933, Training Loss: 0.03760
Epoch: 3933, Training Loss: 0.03767
Epoch: 3933, Training Loss: 0.04841
Epoch: 3934, Training Loss: 0.03350
Epoch: 3934, Training Loss: 0.03758
Epoch: 3934, Training Loss: 0.03766
Epoch: 3934, Training Loss: 0.04840
Epoch: 3935, Training Loss: 0.03349
Epoch: 3935, Training Loss: 0.03757
Epoch: 3935, Training Loss: 0.03765
Epoch: 3935, Training Loss: 0.04838
Epoch: 3936, Training Loss: 0.03348
Epoch: 3936, Training Loss: 0.03756
Epoch: 3936, Training Loss: 0.03764
Epoch: 3936, Training Loss: 0.04836
Epoch: 3937, Training Loss: 0.03347
Epoch: 3937, Training Loss: 0.03755
Epoch: 3937, Training Loss: 0.03762
Epoch: 3937, Training Loss: 0.04835
Epoch: 3938, Training Loss: 0.03346
Epoch: 3938, Training Loss: 0.03753
Epoch: 3938, Training Loss: 0.03761
Epoch: 3938, Training Loss: 0.04833
Epoch: 3939, Training Loss: 0.03345
Epoch: 3939, Training Loss: 0.03752
Epoch: 3939, Training Loss: 0.03760
Epoch: 3939, Training Loss: 0.04831
Epoch: 3940, Training Loss: 0.03344
Epoch: 3940, Training Loss: 0.03751
Epoch: 3940, Training Loss: 0.03759
Epoch: 3940, Training Loss: 0.04829
Epoch: 3941, Training Loss: 0.03343
Epoch: 3941, Training Loss: 0.03750
Epoch: 3941, Training Loss: 0.03757
Epoch: 3941, Training Loss: 0.04828
Epoch: 3942, Training Loss: 0.03342
Epoch: 3942, Training Loss: 0.03748
Epoch: 3942, Training Loss: 0.03756
Epoch: 3942, Training Loss: 0.04826
Epoch: 3943, Training Loss: 0.03341
Epoch: 3943, Training Loss: 0.03747
Epoch: 3943, Training Loss: 0.03755
Epoch: 3943, Training Loss: 0.04824
Epoch: 3944, Training Loss: 0.03340
Epoch: 3944, Training Loss: 0.03746
Epoch: 3944, Training Loss: 0.03754
Epoch: 3944, Training Loss: 0.04823
Epoch: 3945, Training Loss: 0.03339
Epoch: 3945, Training Loss: 0.03745
Epoch: 3945, Training Loss: 0.03752
Epoch: 3945, Training Loss: 0.04821
Epoch: 3946, Training Loss: 0.03338
Epoch: 3946, Training Loss: 0.03743
Epoch: 3946, Training Loss: 0.03751
Epoch: 3946, Training Loss: 0.04819
Epoch: 3947, Training Loss: 0.03337
Epoch: 3947, Training Loss: 0.03742
Epoch: 3947, Training Loss: 0.03750
Epoch: 3947, Training Loss: 0.04818
Epoch: 3948, Training Loss: 0.03336
Epoch: 3948, Training Loss: 0.03741
Epoch: 3948, Training Loss: 0.03749
Epoch: 3948, Training Loss: 0.04816
Epoch: 3949, Training Loss: 0.03335
Epoch: 3949, Training Loss: 0.03740
Epoch: 3949, Training Loss: 0.03747
Epoch: 3949, Training Loss: 0.04814
Epoch: 3950, Training Loss: 0.03334
Epoch: 3950, Training Loss: 0.03738
Epoch: 3950, Training Loss: 0.03746
Epoch: 3950, Training Loss: 0.04812
Epoch: 3951, Training Loss: 0.03333
Epoch: 3951, Training Loss: 0.03737
Epoch: 3951, Training Loss: 0.03745
Epoch: 3951, Training Loss: 0.04811
Epoch: 3952, Training Loss: 0.03332
Epoch: 3952, Training Loss: 0.03736
Epoch: 3952, Training Loss: 0.03744
Epoch: 3952, Training Loss: 0.04809
Epoch: 3953, Training Loss: 0.03331
Epoch: 3953, Training Loss: 0.03735
Epoch: 3953, Training Loss: 0.03742
Epoch: 3953, Training Loss: 0.04807
Epoch: 3954, Training Loss: 0.03330
Epoch: 3954, Training Loss: 0.03733
Epoch: 3954, Training Loss: 0.03741
Epoch: 3954, Training Loss: 0.04806
Epoch: 3955, Training Loss: 0.03329
Epoch: 3955, Training Loss: 0.03732
Epoch: 3955, Training Loss: 0.03740
Epoch: 3955, Training Loss: 0.04804
Epoch: 3956, Training Loss: 0.03328
Epoch: 3956, Training Loss: 0.03731
Epoch: 3956, Training Loss: 0.03739
Epoch: 3956, Training Loss: 0.04802
Epoch: 3957, Training Loss: 0.03327
Epoch: 3957, Training Loss: 0.03730
Epoch: 3957, Training Loss: 0.03737
Epoch: 3957, Training Loss: 0.04801
Epoch: 3958, Training Loss: 0.03326
Epoch: 3958, Training Loss: 0.03728
Epoch: 3958, Training Loss: 0.03736
Epoch: 3958, Training Loss: 0.04799
Epoch: 3959, Training Loss: 0.03325
Epoch: 3959, Training Loss: 0.03727
Epoch: 3959, Training Loss: 0.03735
Epoch: 3959, Training Loss: 0.04797
Epoch: 3960, Training Loss: 0.03324
Epoch: 3960, Training Loss: 0.03726
Epoch: 3960, Training Loss: 0.03734
Epoch: 3960, Training Loss: 0.04796
Epoch: 3961, Training Loss: 0.03323
Epoch: 3961, Training Loss: 0.03725
Epoch: 3961, Training Loss: 0.03732
Epoch: 3961, Training Loss: 0.04794
Epoch: 3962, Training Loss: 0.03322
Epoch: 3962, Training Loss: 0.03724
Epoch: 3962, Training Loss: 0.03731
Epoch: 3962, Training Loss: 0.04792
Epoch: 3963, Training Loss: 0.03321
Epoch: 3963, Training Loss: 0.03722
Epoch: 3963, Training Loss: 0.03730
Epoch: 3963, Training Loss: 0.04791
Epoch: 3964, Training Loss: 0.03320
Epoch: 3964, Training Loss: 0.03721
Epoch: 3964, Training Loss: 0.03729
Epoch: 3964, Training Loss: 0.04789
Epoch: 3965, Training Loss: 0.03319
Epoch: 3965, Training Loss: 0.03720
Epoch: 3965, Training Loss: 0.03728
Epoch: 3965, Training Loss: 0.04787
Epoch: 3966, Training Loss: 0.03318
Epoch: 3966, Training Loss: 0.03719
Epoch: 3966, Training Loss: 0.03726
Epoch: 3966, Training Loss: 0.04786
Epoch: 3967, Training Loss: 0.03317
Epoch: 3967, Training Loss: 0.03717
Epoch: 3967, Training Loss: 0.03725
Epoch: 3967, Training Loss: 0.04784
Epoch: 3968, Training Loss: 0.03316
Epoch: 3968, Training Loss: 0.03716
Epoch: 3968, Training Loss: 0.03724
Epoch: 3968, Training Loss: 0.04782
Epoch: 3969, Training Loss: 0.03315
Epoch: 3969, Training Loss: 0.03715
Epoch: 3969, Training Loss: 0.03723
Epoch: 3969, Training Loss: 0.04781
Epoch: 3970, Training Loss: 0.03314
Epoch: 3970, Training Loss: 0.03714
Epoch: 3970, Training Loss: 0.03721
Epoch: 3970, Training Loss: 0.04779
Epoch: 3971, Training Loss: 0.03313
Epoch: 3971, Training Loss: 0.03713
Epoch: 3971, Training Loss: 0.03720
Epoch: 3971, Training Loss: 0.04777
Epoch: 3972, Training Loss: 0.03312
Epoch: 3972, Training Loss: 0.03711
Epoch: 3972, Training Loss: 0.03719
Epoch: 3972, Training Loss: 0.04776
Epoch: 3973, Training Loss: 0.03311
Epoch: 3973, Training Loss: 0.03710
Epoch: 3973, Training Loss: 0.03718
Epoch: 3973, Training Loss: 0.04774
Epoch: 3974, Training Loss: 0.03310
Epoch: 3974, Training Loss: 0.03709
Epoch: 3974, Training Loss: 0.03716
Epoch: 3974, Training Loss: 0.04772
Epoch: 3975, Training Loss: 0.03309
Epoch: 3975, Training Loss: 0.03708
Epoch: 3975, Training Loss: 0.03715
Epoch: 3975, Training Loss: 0.04771
Epoch: 3976, Training Loss: 0.03308
Epoch: 3976, Training Loss: 0.03706
Epoch: 3976, Training Loss: 0.03714
Epoch: 3976, Training Loss: 0.04769
Epoch: 3977, Training Loss: 0.03307
Epoch: 3977, Training Loss: 0.03705
Epoch: 3977, Training Loss: 0.03713
Epoch: 3977, Training Loss: 0.04767
Epoch: 3978, Training Loss: 0.03306
Epoch: 3978, Training Loss: 0.03704
Epoch: 3978, Training Loss: 0.03712
Epoch: 3978, Training Loss: 0.04766
Epoch: 3979, Training Loss: 0.03305
Epoch: 3979, Training Loss: 0.03703
Epoch: 3979, Training Loss: 0.03710
Epoch: 3979, Training Loss: 0.04764
Epoch: 3980, Training Loss: 0.03304
Epoch: 3980, Training Loss: 0.03702
Epoch: 3980, Training Loss: 0.03709
Epoch: 3980, Training Loss: 0.04762
Epoch: 3981, Training Loss: 0.03303
Epoch: 3981, Training Loss: 0.03700
Epoch: 3981, Training Loss: 0.03708
Epoch: 3981, Training Loss: 0.04761
Epoch: 3982, Training Loss: 0.03302
Epoch: 3982, Training Loss: 0.03699
Epoch: 3982, Training Loss: 0.03707
Epoch: 3982, Training Loss: 0.04759
Epoch: 3983, Training Loss: 0.03301
Epoch: 3983, Training Loss: 0.03698
Epoch: 3983, Training Loss: 0.03706
Epoch: 3983, Training Loss: 0.04758
Epoch: 3984, Training Loss: 0.03300
Epoch: 3984, Training Loss: 0.03697
Epoch: 3984, Training Loss: 0.03704
Epoch: 3984, Training Loss: 0.04756
Epoch: 3985, Training Loss: 0.03299
Epoch: 3985, Training Loss: 0.03696
Epoch: 3985, Training Loss: 0.03703
Epoch: 3985, Training Loss: 0.04754
Epoch: 3986, Training Loss: 0.03298
Epoch: 3986, Training Loss: 0.03694
Epoch: 3986, Training Loss: 0.03702
Epoch: 3986, Training Loss: 0.04753
Epoch: 3987, Training Loss: 0.03297
Epoch: 3987, Training Loss: 0.03693
Epoch: 3987, Training Loss: 0.03701
Epoch: 3987, Training Loss: 0.04751
Epoch: 3988, Training Loss: 0.03296
Epoch: 3988, Training Loss: 0.03692
Epoch: 3988, Training Loss: 0.03700
Epoch: 3988, Training Loss: 0.04749
Epoch: 3989, Training Loss: 0.03295
Epoch: 3989, Training Loss: 0.03691
Epoch: 3989, Training Loss: 0.03698
Epoch: 3989, Training Loss: 0.04748
Epoch: 3990, Training Loss: 0.03294
Epoch: 3990, Training Loss: 0.03690
Epoch: 3990, Training Loss: 0.03697
Epoch: 3990, Training Loss: 0.04746
Epoch: 3991, Training Loss: 0.03293
Epoch: 3991, Training Loss: 0.03688
Epoch: 3991, Training Loss: 0.03696
Epoch: 3991, Training Loss: 0.04744
Epoch: 3992, Training Loss: 0.03292
Epoch: 3992, Training Loss: 0.03687
Epoch: 3992, Training Loss: 0.03695
Epoch: 3992, Training Loss: 0.04743
Epoch: 3993, Training Loss: 0.03291
Epoch: 3993, Training Loss: 0.03686
Epoch: 3993, Training Loss: 0.03694
Epoch: 3993, Training Loss: 0.04741
Epoch: 3994, Training Loss: 0.03290
Epoch: 3994, Training Loss: 0.03685
Epoch: 3994, Training Loss: 0.03692
Epoch: 3994, Training Loss: 0.04740
[... per-pattern training-loss output for epochs 3995-4244 omitted: the four losses printed each epoch (one per XOR input pattern) decrease steadily from about (0.0329, 0.0368, 0.0369, 0.0474) at epoch 3995 to about (0.0307, 0.0342, 0.0342, 0.0438) at epoch 4244 ...]
Epoch: 4245, Training Loss: 0.03071
Epoch: 4245, Training Loss: 0.03416
Epoch: 4245, Training Loss: 0.03423
Epoch: 4245, Training Loss: 0.04376
Epoch: 4246, Training Loss: 0.03070
Epoch: 4246, Training Loss: 0.03415
Epoch: 4246, Training Loss: 0.03422
Epoch: 4246, Training Loss: 0.04375
Epoch: 4247, Training Loss: 0.03069
Epoch: 4247, Training Loss: 0.03414
Epoch: 4247, Training Loss: 0.03421
Epoch: 4247, Training Loss: 0.04373
Epoch: 4248, Training Loss: 0.03069
Epoch: 4248, Training Loss: 0.03413
Epoch: 4248, Training Loss: 0.03420
Epoch: 4248, Training Loss: 0.04372
Epoch: 4249, Training Loss: 0.03068
Epoch: 4249, Training Loss: 0.03412
Epoch: 4249, Training Loss: 0.03419
Epoch: 4249, Training Loss: 0.04371
Epoch: 4250, Training Loss: 0.03067
Epoch: 4250, Training Loss: 0.03411
Epoch: 4250, Training Loss: 0.03418
Epoch: 4250, Training Loss: 0.04369
Epoch: 4251, Training Loss: 0.03066
Epoch: 4251, Training Loss: 0.03410
Epoch: 4251, Training Loss: 0.03417
Epoch: 4251, Training Loss: 0.04368
Epoch: 4252, Training Loss: 0.03065
Epoch: 4252, Training Loss: 0.03409
Epoch: 4252, Training Loss: 0.03416
Epoch: 4252, Training Loss: 0.04367
Epoch: 4253, Training Loss: 0.03065
Epoch: 4253, Training Loss: 0.03409
Epoch: 4253, Training Loss: 0.03415
Epoch: 4253, Training Loss: 0.04366
Epoch: 4254, Training Loss: 0.03064
Epoch: 4254, Training Loss: 0.03408
Epoch: 4254, Training Loss: 0.03414
Epoch: 4254, Training Loss: 0.04364
Epoch: 4255, Training Loss: 0.03063
Epoch: 4255, Training Loss: 0.03407
Epoch: 4255, Training Loss: 0.03413
Epoch: 4255, Training Loss: 0.04363
Epoch: 4256, Training Loss: 0.03062
Epoch: 4256, Training Loss: 0.03406
Epoch: 4256, Training Loss: 0.03412
Epoch: 4256, Training Loss: 0.04362
Epoch: 4257, Training Loss: 0.03062
Epoch: 4257, Training Loss: 0.03405
Epoch: 4257, Training Loss: 0.03412
Epoch: 4257, Training Loss: 0.04360
Epoch: 4258, Training Loss: 0.03061
Epoch: 4258, Training Loss: 0.03404
Epoch: 4258, Training Loss: 0.03411
Epoch: 4258, Training Loss: 0.04359
Epoch: 4259, Training Loss: 0.03060
Epoch: 4259, Training Loss: 0.03403
Epoch: 4259, Training Loss: 0.03410
Epoch: 4259, Training Loss: 0.04358
Epoch: 4260, Training Loss: 0.03059
Epoch: 4260, Training Loss: 0.03402
Epoch: 4260, Training Loss: 0.03409
Epoch: 4260, Training Loss: 0.04357
Epoch: 4261, Training Loss: 0.03058
Epoch: 4261, Training Loss: 0.03401
Epoch: 4261, Training Loss: 0.03408
Epoch: 4261, Training Loss: 0.04355
Epoch: 4262, Training Loss: 0.03058
Epoch: 4262, Training Loss: 0.03400
Epoch: 4262, Training Loss: 0.03407
Epoch: 4262, Training Loss: 0.04354
Epoch: 4263, Training Loss: 0.03057
Epoch: 4263, Training Loss: 0.03399
Epoch: 4263, Training Loss: 0.03406
Epoch: 4263, Training Loss: 0.04353
Epoch: 4264, Training Loss: 0.03056
Epoch: 4264, Training Loss: 0.03398
Epoch: 4264, Training Loss: 0.03405
Epoch: 4264, Training Loss: 0.04352
Epoch: 4265, Training Loss: 0.03055
Epoch: 4265, Training Loss: 0.03397
Epoch: 4265, Training Loss: 0.03404
Epoch: 4265, Training Loss: 0.04350
Epoch: 4266, Training Loss: 0.03054
Epoch: 4266, Training Loss: 0.03396
Epoch: 4266, Training Loss: 0.03403
Epoch: 4266, Training Loss: 0.04349
Epoch: 4267, Training Loss: 0.03054
Epoch: 4267, Training Loss: 0.03395
Epoch: 4267, Training Loss: 0.03402
Epoch: 4267, Training Loss: 0.04348
Epoch: 4268, Training Loss: 0.03053
Epoch: 4268, Training Loss: 0.03394
Epoch: 4268, Training Loss: 0.03401
Epoch: 4268, Training Loss: 0.04346
Epoch: 4269, Training Loss: 0.03052
Epoch: 4269, Training Loss: 0.03393
Epoch: 4269, Training Loss: 0.03400
Epoch: 4269, Training Loss: 0.04345
Epoch: 4270, Training Loss: 0.03051
Epoch: 4270, Training Loss: 0.03392
Epoch: 4270, Training Loss: 0.03399
Epoch: 4270, Training Loss: 0.04344
Epoch: 4271, Training Loss: 0.03051
Epoch: 4271, Training Loss: 0.03391
Epoch: 4271, Training Loss: 0.03398
Epoch: 4271, Training Loss: 0.04343
Epoch: 4272, Training Loss: 0.03050
Epoch: 4272, Training Loss: 0.03391
Epoch: 4272, Training Loss: 0.03397
Epoch: 4272, Training Loss: 0.04341
Epoch: 4273, Training Loss: 0.03049
Epoch: 4273, Training Loss: 0.03390
Epoch: 4273, Training Loss: 0.03396
Epoch: 4273, Training Loss: 0.04340
Epoch: 4274, Training Loss: 0.03048
Epoch: 4274, Training Loss: 0.03389
Epoch: 4274, Training Loss: 0.03395
Epoch: 4274, Training Loss: 0.04339
Epoch: 4275, Training Loss: 0.03048
Epoch: 4275, Training Loss: 0.03388
Epoch: 4275, Training Loss: 0.03394
Epoch: 4275, Training Loss: 0.04338
Epoch: 4276, Training Loss: 0.03047
Epoch: 4276, Training Loss: 0.03387
Epoch: 4276, Training Loss: 0.03394
Epoch: 4276, Training Loss: 0.04336
Epoch: 4277, Training Loss: 0.03046
Epoch: 4277, Training Loss: 0.03386
Epoch: 4277, Training Loss: 0.03393
Epoch: 4277, Training Loss: 0.04335
Epoch: 4278, Training Loss: 0.03045
Epoch: 4278, Training Loss: 0.03385
Epoch: 4278, Training Loss: 0.03392
Epoch: 4278, Training Loss: 0.04334
Epoch: 4279, Training Loss: 0.03044
Epoch: 4279, Training Loss: 0.03384
Epoch: 4279, Training Loss: 0.03391
Epoch: 4279, Training Loss: 0.04332
Epoch: 4280, Training Loss: 0.03044
Epoch: 4280, Training Loss: 0.03383
Epoch: 4280, Training Loss: 0.03390
Epoch: 4280, Training Loss: 0.04331
Epoch: 4281, Training Loss: 0.03043
Epoch: 4281, Training Loss: 0.03382
Epoch: 4281, Training Loss: 0.03389
Epoch: 4281, Training Loss: 0.04330
Epoch: 4282, Training Loss: 0.03042
Epoch: 4282, Training Loss: 0.03381
Epoch: 4282, Training Loss: 0.03388
Epoch: 4282, Training Loss: 0.04329
Epoch: 4283, Training Loss: 0.03041
Epoch: 4283, Training Loss: 0.03380
Epoch: 4283, Training Loss: 0.03387
Epoch: 4283, Training Loss: 0.04327
Epoch: 4284, Training Loss: 0.03041
Epoch: 4284, Training Loss: 0.03379
Epoch: 4284, Training Loss: 0.03386
Epoch: 4284, Training Loss: 0.04326
Epoch: 4285, Training Loss: 0.03040
Epoch: 4285, Training Loss: 0.03378
Epoch: 4285, Training Loss: 0.03385
Epoch: 4285, Training Loss: 0.04325
Epoch: 4286, Training Loss: 0.03039
Epoch: 4286, Training Loss: 0.03377
Epoch: 4286, Training Loss: 0.03384
Epoch: 4286, Training Loss: 0.04324
Epoch: 4287, Training Loss: 0.03038
Epoch: 4287, Training Loss: 0.03376
Epoch: 4287, Training Loss: 0.03383
Epoch: 4287, Training Loss: 0.04322
Epoch: 4288, Training Loss: 0.03038
Epoch: 4288, Training Loss: 0.03376
Epoch: 4288, Training Loss: 0.03382
Epoch: 4288, Training Loss: 0.04321
Epoch: 4289, Training Loss: 0.03037
Epoch: 4289, Training Loss: 0.03375
Epoch: 4289, Training Loss: 0.03381
Epoch: 4289, Training Loss: 0.04320
Epoch: 4290, Training Loss: 0.03036
Epoch: 4290, Training Loss: 0.03374
Epoch: 4290, Training Loss: 0.03380
Epoch: 4290, Training Loss: 0.04319
Epoch: 4291, Training Loss: 0.03035
Epoch: 4291, Training Loss: 0.03373
Epoch: 4291, Training Loss: 0.03380
Epoch: 4291, Training Loss: 0.04317
Epoch: 4292, Training Loss: 0.03034
Epoch: 4292, Training Loss: 0.03372
Epoch: 4292, Training Loss: 0.03379
Epoch: 4292, Training Loss: 0.04316
Epoch: 4293, Training Loss: 0.03034
Epoch: 4293, Training Loss: 0.03371
Epoch: 4293, Training Loss: 0.03378
Epoch: 4293, Training Loss: 0.04315
Epoch: 4294, Training Loss: 0.03033
Epoch: 4294, Training Loss: 0.03370
Epoch: 4294, Training Loss: 0.03377
Epoch: 4294, Training Loss: 0.04314
Epoch: 4295, Training Loss: 0.03032
Epoch: 4295, Training Loss: 0.03369
Epoch: 4295, Training Loss: 0.03376
Epoch: 4295, Training Loss: 0.04312
Epoch: 4296, Training Loss: 0.03031
Epoch: 4296, Training Loss: 0.03368
Epoch: 4296, Training Loss: 0.03375
Epoch: 4296, Training Loss: 0.04311
Epoch: 4297, Training Loss: 0.03031
Epoch: 4297, Training Loss: 0.03367
Epoch: 4297, Training Loss: 0.03374
Epoch: 4297, Training Loss: 0.04310
Epoch: 4298, Training Loss: 0.03030
Epoch: 4298, Training Loss: 0.03366
Epoch: 4298, Training Loss: 0.03373
Epoch: 4298, Training Loss: 0.04309
Epoch: 4299, Training Loss: 0.03029
Epoch: 4299, Training Loss: 0.03365
Epoch: 4299, Training Loss: 0.03372
Epoch: 4299, Training Loss: 0.04307
Epoch: 4300, Training Loss: 0.03028
Epoch: 4300, Training Loss: 0.03364
Epoch: 4300, Training Loss: 0.03371
Epoch: 4300, Training Loss: 0.04306
Epoch: 4301, Training Loss: 0.03028
Epoch: 4301, Training Loss: 0.03364
Epoch: 4301, Training Loss: 0.03370
Epoch: 4301, Training Loss: 0.04305
Epoch: 4302, Training Loss: 0.03027
Epoch: 4302, Training Loss: 0.03363
Epoch: 4302, Training Loss: 0.03369
Epoch: 4302, Training Loss: 0.04304
Epoch: 4303, Training Loss: 0.03026
Epoch: 4303, Training Loss: 0.03362
Epoch: 4303, Training Loss: 0.03368
Epoch: 4303, Training Loss: 0.04303
Epoch: 4304, Training Loss: 0.03025
Epoch: 4304, Training Loss: 0.03361
Epoch: 4304, Training Loss: 0.03368
Epoch: 4304, Training Loss: 0.04301
Epoch: 4305, Training Loss: 0.03025
Epoch: 4305, Training Loss: 0.03360
Epoch: 4305, Training Loss: 0.03367
Epoch: 4305, Training Loss: 0.04300
Epoch: 4306, Training Loss: 0.03024
Epoch: 4306, Training Loss: 0.03359
Epoch: 4306, Training Loss: 0.03366
Epoch: 4306, Training Loss: 0.04299
Epoch: 4307, Training Loss: 0.03023
Epoch: 4307, Training Loss: 0.03358
Epoch: 4307, Training Loss: 0.03365
Epoch: 4307, Training Loss: 0.04298
Epoch: 4308, Training Loss: 0.03022
Epoch: 4308, Training Loss: 0.03357
Epoch: 4308, Training Loss: 0.03364
Epoch: 4308, Training Loss: 0.04296
Epoch: 4309, Training Loss: 0.03022
Epoch: 4309, Training Loss: 0.03356
Epoch: 4309, Training Loss: 0.03363
Epoch: 4309, Training Loss: 0.04295
Epoch: 4310, Training Loss: 0.03021
Epoch: 4310, Training Loss: 0.03355
Epoch: 4310, Training Loss: 0.03362
Epoch: 4310, Training Loss: 0.04294
Epoch: 4311, Training Loss: 0.03020
Epoch: 4311, Training Loss: 0.03354
Epoch: 4311, Training Loss: 0.03361
Epoch: 4311, Training Loss: 0.04293
Epoch: 4312, Training Loss: 0.03019
Epoch: 4312, Training Loss: 0.03353
Epoch: 4312, Training Loss: 0.03360
Epoch: 4312, Training Loss: 0.04291
Epoch: 4313, Training Loss: 0.03019
Epoch: 4313, Training Loss: 0.03353
Epoch: 4313, Training Loss: 0.03359
Epoch: 4313, Training Loss: 0.04290
Epoch: 4314, Training Loss: 0.03018
Epoch: 4314, Training Loss: 0.03352
Epoch: 4314, Training Loss: 0.03358
Epoch: 4314, Training Loss: 0.04289
Epoch: 4315, Training Loss: 0.03017
Epoch: 4315, Training Loss: 0.03351
Epoch: 4315, Training Loss: 0.03357
Epoch: 4315, Training Loss: 0.04288
Epoch: 4316, Training Loss: 0.03016
Epoch: 4316, Training Loss: 0.03350
Epoch: 4316, Training Loss: 0.03357
Epoch: 4316, Training Loss: 0.04287
Epoch: 4317, Training Loss: 0.03016
Epoch: 4317, Training Loss: 0.03349
Epoch: 4317, Training Loss: 0.03356
Epoch: 4317, Training Loss: 0.04285
Epoch: 4318, Training Loss: 0.03015
Epoch: 4318, Training Loss: 0.03348
Epoch: 4318, Training Loss: 0.03355
Epoch: 4318, Training Loss: 0.04284
Epoch: 4319, Training Loss: 0.03014
Epoch: 4319, Training Loss: 0.03347
Epoch: 4319, Training Loss: 0.03354
Epoch: 4319, Training Loss: 0.04283
Epoch: 4320, Training Loss: 0.03013
Epoch: 4320, Training Loss: 0.03346
Epoch: 4320, Training Loss: 0.03353
Epoch: 4320, Training Loss: 0.04282
Epoch: 4321, Training Loss: 0.03013
Epoch: 4321, Training Loss: 0.03345
Epoch: 4321, Training Loss: 0.03352
Epoch: 4321, Training Loss: 0.04280
Epoch: 4322, Training Loss: 0.03012
Epoch: 4322, Training Loss: 0.03344
Epoch: 4322, Training Loss: 0.03351
Epoch: 4322, Training Loss: 0.04279
Epoch: 4323, Training Loss: 0.03011
Epoch: 4323, Training Loss: 0.03344
Epoch: 4323, Training Loss: 0.03350
Epoch: 4323, Training Loss: 0.04278
Epoch: 4324, Training Loss: 0.03010
Epoch: 4324, Training Loss: 0.03343
Epoch: 4324, Training Loss: 0.03349
Epoch: 4324, Training Loss: 0.04277
Epoch: 4325, Training Loss: 0.03010
Epoch: 4325, Training Loss: 0.03342
Epoch: 4325, Training Loss: 0.03348
Epoch: 4325, Training Loss: 0.04276
Epoch: 4326, Training Loss: 0.03009
Epoch: 4326, Training Loss: 0.03341
Epoch: 4326, Training Loss: 0.03347
Epoch: 4326, Training Loss: 0.04274
Epoch: 4327, Training Loss: 0.03008
Epoch: 4327, Training Loss: 0.03340
Epoch: 4327, Training Loss: 0.03347
Epoch: 4327, Training Loss: 0.04273
Epoch: 4328, Training Loss: 0.03007
Epoch: 4328, Training Loss: 0.03339
Epoch: 4328, Training Loss: 0.03346
Epoch: 4328, Training Loss: 0.04272
Epoch: 4329, Training Loss: 0.03007
Epoch: 4329, Training Loss: 0.03338
Epoch: 4329, Training Loss: 0.03345
Epoch: 4329, Training Loss: 0.04271
Epoch: 4330, Training Loss: 0.03006
Epoch: 4330, Training Loss: 0.03337
Epoch: 4330, Training Loss: 0.03344
Epoch: 4330, Training Loss: 0.04270
Epoch: 4331, Training Loss: 0.03005
Epoch: 4331, Training Loss: 0.03336
Epoch: 4331, Training Loss: 0.03343
Epoch: 4331, Training Loss: 0.04268
Epoch: 4332, Training Loss: 0.03004
Epoch: 4332, Training Loss: 0.03335
Epoch: 4332, Training Loss: 0.03342
Epoch: 4332, Training Loss: 0.04267
Epoch: 4333, Training Loss: 0.03004
Epoch: 4333, Training Loss: 0.03335
Epoch: 4333, Training Loss: 0.03341
Epoch: 4333, Training Loss: 0.04266
Epoch: 4334, Training Loss: 0.03003
Epoch: 4334, Training Loss: 0.03334
Epoch: 4334, Training Loss: 0.03340
Epoch: 4334, Training Loss: 0.04265
Epoch: 4335, Training Loss: 0.03002
Epoch: 4335, Training Loss: 0.03333
Epoch: 4335, Training Loss: 0.03339
Epoch: 4335, Training Loss: 0.04263
Epoch: 4336, Training Loss: 0.03001
Epoch: 4336, Training Loss: 0.03332
Epoch: 4336, Training Loss: 0.03338
Epoch: 4336, Training Loss: 0.04262
Epoch: 4337, Training Loss: 0.03001
Epoch: 4337, Training Loss: 0.03331
Epoch: 4337, Training Loss: 0.03338
Epoch: 4337, Training Loss: 0.04261
Epoch: 4338, Training Loss: 0.03000
Epoch: 4338, Training Loss: 0.03330
Epoch: 4338, Training Loss: 0.03337
Epoch: 4338, Training Loss: 0.04260
Epoch: 4339, Training Loss: 0.02999
Epoch: 4339, Training Loss: 0.03329
Epoch: 4339, Training Loss: 0.03336
Epoch: 4339, Training Loss: 0.04259
Epoch: 4340, Training Loss: 0.02998
Epoch: 4340, Training Loss: 0.03328
Epoch: 4340, Training Loss: 0.03335
Epoch: 4340, Training Loss: 0.04257
Epoch: 4341, Training Loss: 0.02998
Epoch: 4341, Training Loss: 0.03327
Epoch: 4341, Training Loss: 0.03334
Epoch: 4341, Training Loss: 0.04256
Epoch: 4342, Training Loss: 0.02997
Epoch: 4342, Training Loss: 0.03326
Epoch: 4342, Training Loss: 0.03333
Epoch: 4342, Training Loss: 0.04255
Epoch: 4343, Training Loss: 0.02996
Epoch: 4343, Training Loss: 0.03326
Epoch: 4343, Training Loss: 0.03332
Epoch: 4343, Training Loss: 0.04254
Epoch: 4344, Training Loss: 0.02996
Epoch: 4344, Training Loss: 0.03325
Epoch: 4344, Training Loss: 0.03331
Epoch: 4344, Training Loss: 0.04253
Epoch: 4345, Training Loss: 0.02995
Epoch: 4345, Training Loss: 0.03324
Epoch: 4345, Training Loss: 0.03330
Epoch: 4345, Training Loss: 0.04251
Epoch: 4346, Training Loss: 0.02994
Epoch: 4346, Training Loss: 0.03323
Epoch: 4346, Training Loss: 0.03330
Epoch: 4346, Training Loss: 0.04250
Epoch: 4347, Training Loss: 0.02993
Epoch: 4347, Training Loss: 0.03322
Epoch: 4347, Training Loss: 0.03329
Epoch: 4347, Training Loss: 0.04249
Epoch: 4348, Training Loss: 0.02993
Epoch: 4348, Training Loss: 0.03321
Epoch: 4348, Training Loss: 0.03328
Epoch: 4348, Training Loss: 0.04248
Epoch: 4349, Training Loss: 0.02992
Epoch: 4349, Training Loss: 0.03320
Epoch: 4349, Training Loss: 0.03327
Epoch: 4349, Training Loss: 0.04247
Epoch: 4350, Training Loss: 0.02991
Epoch: 4350, Training Loss: 0.03319
Epoch: 4350, Training Loss: 0.03326
Epoch: 4350, Training Loss: 0.04246
Epoch: 4351, Training Loss: 0.02990
Epoch: 4351, Training Loss: 0.03318
Epoch: 4351, Training Loss: 0.03325
Epoch: 4351, Training Loss: 0.04244
Epoch: 4352, Training Loss: 0.02990
Epoch: 4352, Training Loss: 0.03318
Epoch: 4352, Training Loss: 0.03324
Epoch: 4352, Training Loss: 0.04243
Epoch: 4353, Training Loss: 0.02989
Epoch: 4353, Training Loss: 0.03317
Epoch: 4353, Training Loss: 0.03323
Epoch: 4353, Training Loss: 0.04242
Epoch: 4354, Training Loss: 0.02988
Epoch: 4354, Training Loss: 0.03316
Epoch: 4354, Training Loss: 0.03322
Epoch: 4354, Training Loss: 0.04241
Epoch: 4355, Training Loss: 0.02988
Epoch: 4355, Training Loss: 0.03315
Epoch: 4355, Training Loss: 0.03322
Epoch: 4355, Training Loss: 0.04240
Epoch: 4356, Training Loss: 0.02987
Epoch: 4356, Training Loss: 0.03314
Epoch: 4356, Training Loss: 0.03321
Epoch: 4356, Training Loss: 0.04238
Epoch: 4357, Training Loss: 0.02986
Epoch: 4357, Training Loss: 0.03313
Epoch: 4357, Training Loss: 0.03320
Epoch: 4357, Training Loss: 0.04237
Epoch: 4358, Training Loss: 0.02985
Epoch: 4358, Training Loss: 0.03312
Epoch: 4358, Training Loss: 0.03319
Epoch: 4358, Training Loss: 0.04236
Epoch: 4359, Training Loss: 0.02985
Epoch: 4359, Training Loss: 0.03311
Epoch: 4359, Training Loss: 0.03318
Epoch: 4359, Training Loss: 0.04235
Epoch: 4360, Training Loss: 0.02984
Epoch: 4360, Training Loss: 0.03311
Epoch: 4360, Training Loss: 0.03317
Epoch: 4360, Training Loss: 0.04234
Epoch: 4361, Training Loss: 0.02983
Epoch: 4361, Training Loss: 0.03310
Epoch: 4361, Training Loss: 0.03316
Epoch: 4361, Training Loss: 0.04232
Epoch: 4362, Training Loss: 0.02982
Epoch: 4362, Training Loss: 0.03309
Epoch: 4362, Training Loss: 0.03315
Epoch: 4362, Training Loss: 0.04231
Epoch: 4363, Training Loss: 0.02982
Epoch: 4363, Training Loss: 0.03308
Epoch: 4363, Training Loss: 0.03314
Epoch: 4363, Training Loss: 0.04230
Epoch: 4364, Training Loss: 0.02981
Epoch: 4364, Training Loss: 0.03307
Epoch: 4364, Training Loss: 0.03314
Epoch: 4364, Training Loss: 0.04229
Epoch: 4365, Training Loss: 0.02980
Epoch: 4365, Training Loss: 0.03306
Epoch: 4365, Training Loss: 0.03313
Epoch: 4365, Training Loss: 0.04228
Epoch: 4366, Training Loss: 0.02980
Epoch: 4366, Training Loss: 0.03305
Epoch: 4366, Training Loss: 0.03312
Epoch: 4366, Training Loss: 0.04227
Epoch: 4367, Training Loss: 0.02979
Epoch: 4367, Training Loss: 0.03304
Epoch: 4367, Training Loss: 0.03311
Epoch: 4367, Training Loss: 0.04225
Epoch: 4368, Training Loss: 0.02978
Epoch: 4368, Training Loss: 0.03304
Epoch: 4368, Training Loss: 0.03310
Epoch: 4368, Training Loss: 0.04224
Epoch: 4369, Training Loss: 0.02977
Epoch: 4369, Training Loss: 0.03303
Epoch: 4369, Training Loss: 0.03309
Epoch: 4369, Training Loss: 0.04223
Epoch: 4370, Training Loss: 0.02977
Epoch: 4370, Training Loss: 0.03302
Epoch: 4370, Training Loss: 0.03308
Epoch: 4370, Training Loss: 0.04222
Epoch: 4371, Training Loss: 0.02976
Epoch: 4371, Training Loss: 0.03301
Epoch: 4371, Training Loss: 0.03307
Epoch: 4371, Training Loss: 0.04221
Epoch: 4372, Training Loss: 0.02975
Epoch: 4372, Training Loss: 0.03300
Epoch: 4372, Training Loss: 0.03307
Epoch: 4372, Training Loss: 0.04220
Epoch: 4373, Training Loss: 0.02974
Epoch: 4373, Training Loss: 0.03299
Epoch: 4373, Training Loss: 0.03306
Epoch: 4373, Training Loss: 0.04218
Epoch: 4374, Training Loss: 0.02974
Epoch: 4374, Training Loss: 0.03298
Epoch: 4374, Training Loss: 0.03305
Epoch: 4374, Training Loss: 0.04217
Epoch: 4375, Training Loss: 0.02973
Epoch: 4375, Training Loss: 0.03297
Epoch: 4375, Training Loss: 0.03304
Epoch: 4375, Training Loss: 0.04216
Epoch: 4376, Training Loss: 0.02972
Epoch: 4376, Training Loss: 0.03297
Epoch: 4376, Training Loss: 0.03303
Epoch: 4376, Training Loss: 0.04215
Epoch: 4377, Training Loss: 0.02972
Epoch: 4377, Training Loss: 0.03296
Epoch: 4377, Training Loss: 0.03302
Epoch: 4377, Training Loss: 0.04214
Epoch: 4378, Training Loss: 0.02971
Epoch: 4378, Training Loss: 0.03295
Epoch: 4378, Training Loss: 0.03301
Epoch: 4378, Training Loss: 0.04213
Epoch: 4379, Training Loss: 0.02970
Epoch: 4379, Training Loss: 0.03294
Epoch: 4379, Training Loss: 0.03301
Epoch: 4379, Training Loss: 0.04211
Epoch: 4380, Training Loss: 0.02969
Epoch: 4380, Training Loss: 0.03293
Epoch: 4380, Training Loss: 0.03300
Epoch: 4380, Training Loss: 0.04210
Epoch: 4381, Training Loss: 0.02969
Epoch: 4381, Training Loss: 0.03292
Epoch: 4381, Training Loss: 0.03299
Epoch: 4381, Training Loss: 0.04209
Epoch: 4382, Training Loss: 0.02968
Epoch: 4382, Training Loss: 0.03291
Epoch: 4382, Training Loss: 0.03298
Epoch: 4382, Training Loss: 0.04208
Epoch: 4383, Training Loss: 0.02967
Epoch: 4383, Training Loss: 0.03291
Epoch: 4383, Training Loss: 0.03297
Epoch: 4383, Training Loss: 0.04207
Epoch: 4384, Training Loss: 0.02967
Epoch: 4384, Training Loss: 0.03290
Epoch: 4384, Training Loss: 0.03296
Epoch: 4384, Training Loss: 0.04206
Epoch: 4385, Training Loss: 0.02966
Epoch: 4385, Training Loss: 0.03289
Epoch: 4385, Training Loss: 0.03295
Epoch: 4385, Training Loss: 0.04204
Epoch: 4386, Training Loss: 0.02965
Epoch: 4386, Training Loss: 0.03288
Epoch: 4386, Training Loss: 0.03294
Epoch: 4386, Training Loss: 0.04203
Epoch: 4387, Training Loss: 0.02964
Epoch: 4387, Training Loss: 0.03287
Epoch: 4387, Training Loss: 0.03294
Epoch: 4387, Training Loss: 0.04202
Epoch: 4388, Training Loss: 0.02964
Epoch: 4388, Training Loss: 0.03286
Epoch: 4388, Training Loss: 0.03293
Epoch: 4388, Training Loss: 0.04201
Epoch: 4389, Training Loss: 0.02963
Epoch: 4389, Training Loss: 0.03285
Epoch: 4389, Training Loss: 0.03292
Epoch: 4389, Training Loss: 0.04200
Epoch: 4390, Training Loss: 0.02962
Epoch: 4390, Training Loss: 0.03285
Epoch: 4390, Training Loss: 0.03291
Epoch: 4390, Training Loss: 0.04199
Epoch: 4391, Training Loss: 0.02962
Epoch: 4391, Training Loss: 0.03284
Epoch: 4391, Training Loss: 0.03290
Epoch: 4391, Training Loss: 0.04198
Epoch: 4392, Training Loss: 0.02961
Epoch: 4392, Training Loss: 0.03283
Epoch: 4392, Training Loss: 0.03289
Epoch: 4392, Training Loss: 0.04196
Epoch: 4393, Training Loss: 0.02960
Epoch: 4393, Training Loss: 0.03282
Epoch: 4393, Training Loss: 0.03288
Epoch: 4393, Training Loss: 0.04195
Epoch: 4394, Training Loss: 0.02959
Epoch: 4394, Training Loss: 0.03281
Epoch: 4394, Training Loss: 0.03288
Epoch: 4394, Training Loss: 0.04194
Epoch: 4395, Training Loss: 0.02959
Epoch: 4395, Training Loss: 0.03280
Epoch: 4395, Training Loss: 0.03287
Epoch: 4395, Training Loss: 0.04193
Epoch: 4396, Training Loss: 0.02958
Epoch: 4396, Training Loss: 0.03279
Epoch: 4396, Training Loss: 0.03286
Epoch: 4396, Training Loss: 0.04192
Epoch: 4397, Training Loss: 0.02957
Epoch: 4397, Training Loss: 0.03279
Epoch: 4397, Training Loss: 0.03285
Epoch: 4397, Training Loss: 0.04191
Epoch: 4398, Training Loss: 0.02957
Epoch: 4398, Training Loss: 0.03278
Epoch: 4398, Training Loss: 0.03284
Epoch: 4398, Training Loss: 0.04189
Epoch: 4399, Training Loss: 0.02956
Epoch: 4399, Training Loss: 0.03277
Epoch: 4399, Training Loss: 0.03283
Epoch: 4399, Training Loss: 0.04188
Epoch: 4400, Training Loss: 0.02955
Epoch: 4400, Training Loss: 0.03276
Epoch: 4400, Training Loss: 0.03282
Epoch: 4400, Training Loss: 0.04187
Epoch: 4401, Training Loss: 0.02955
Epoch: 4401, Training Loss: 0.03275
Epoch: 4401, Training Loss: 0.03282
Epoch: 4401, Training Loss: 0.04186
Epoch: 4402, Training Loss: 0.02954
Epoch: 4402, Training Loss: 0.03274
Epoch: 4402, Training Loss: 0.03281
Epoch: 4402, Training Loss: 0.04185
Epoch: 4403, Training Loss: 0.02953
Epoch: 4403, Training Loss: 0.03273
Epoch: 4403, Training Loss: 0.03280
Epoch: 4403, Training Loss: 0.04184
Epoch: 4404, Training Loss: 0.02952
Epoch: 4404, Training Loss: 0.03273
Epoch: 4404, Training Loss: 0.03279
Epoch: 4404, Training Loss: 0.04183
Epoch: 4405, Training Loss: 0.02952
Epoch: 4405, Training Loss: 0.03272
Epoch: 4405, Training Loss: 0.03278
Epoch: 4405, Training Loss: 0.04181
Epoch: 4406, Training Loss: 0.02951
Epoch: 4406, Training Loss: 0.03271
Epoch: 4406, Training Loss: 0.03277
Epoch: 4406, Training Loss: 0.04180
Epoch: 4407, Training Loss: 0.02950
Epoch: 4407, Training Loss: 0.03270
Epoch: 4407, Training Loss: 0.03276
Epoch: 4407, Training Loss: 0.04179
Epoch: 4408, Training Loss: 0.02950
Epoch: 4408, Training Loss: 0.03269
Epoch: 4408, Training Loss: 0.03276
Epoch: 4408, Training Loss: 0.04178
Epoch: 4409, Training Loss: 0.02949
Epoch: 4409, Training Loss: 0.03268
Epoch: 4409, Training Loss: 0.03275
Epoch: 4409, Training Loss: 0.04177
Epoch: 4410, Training Loss: 0.02948
Epoch: 4410, Training Loss: 0.03267
Epoch: 4410, Training Loss: 0.03274
Epoch: 4410, Training Loss: 0.04176
Epoch: 4411, Training Loss: 0.02948
Epoch: 4411, Training Loss: 0.03267
Epoch: 4411, Training Loss: 0.03273
Epoch: 4411, Training Loss: 0.04175
Epoch: 4412, Training Loss: 0.02947
Epoch: 4412, Training Loss: 0.03266
Epoch: 4412, Training Loss: 0.03272
Epoch: 4412, Training Loss: 0.04174
Epoch: 4413, Training Loss: 0.02946
Epoch: 4413, Training Loss: 0.03265
Epoch: 4413, Training Loss: 0.03271
Epoch: 4413, Training Loss: 0.04172
Epoch: 4414, Training Loss: 0.02945
Epoch: 4414, Training Loss: 0.03264
Epoch: 4414, Training Loss: 0.03271
Epoch: 4414, Training Loss: 0.04171
Epoch: 4415, Training Loss: 0.02945
Epoch: 4415, Training Loss: 0.03263
Epoch: 4415, Training Loss: 0.03270
Epoch: 4415, Training Loss: 0.04170
Epoch: 4416, Training Loss: 0.02944
Epoch: 4416, Training Loss: 0.03262
Epoch: 4416, Training Loss: 0.03269
Epoch: 4416, Training Loss: 0.04169
Epoch: 4417, Training Loss: 0.02943
Epoch: 4417, Training Loss: 0.03262
Epoch: 4417, Training Loss: 0.03268
Epoch: 4417, Training Loss: 0.04168
Epoch: 4418, Training Loss: 0.02943
Epoch: 4418, Training Loss: 0.03261
Epoch: 4418, Training Loss: 0.03267
Epoch: 4418, Training Loss: 0.04167
Epoch: 4419, Training Loss: 0.02942
Epoch: 4419, Training Loss: 0.03260
Epoch: 4419, Training Loss: 0.03266
Epoch: 4419, Training Loss: 0.04166
Epoch: 4420, Training Loss: 0.02941
Epoch: 4420, Training Loss: 0.03259
Epoch: 4420, Training Loss: 0.03265
Epoch: 4420, Training Loss: 0.04164
Epoch: 4421, Training Loss: 0.02941
Epoch: 4421, Training Loss: 0.03258
Epoch: 4421, Training Loss: 0.03265
Epoch: 4421, Training Loss: 0.04163
Epoch: 4422, Training Loss: 0.02940
Epoch: 4422, Training Loss: 0.03257
Epoch: 4422, Training Loss: 0.03264
Epoch: 4422, Training Loss: 0.04162
Epoch: 4423, Training Loss: 0.02939
Epoch: 4423, Training Loss: 0.03257
Epoch: 4423, Training Loss: 0.03263
Epoch: 4423, Training Loss: 0.04161
Epoch: 4424, Training Loss: 0.02938
Epoch: 4424, Training Loss: 0.03256
Epoch: 4424, Training Loss: 0.03262
Epoch: 4424, Training Loss: 0.04160
Epoch: 4425, Training Loss: 0.02938
Epoch: 4425, Training Loss: 0.03255
Epoch: 4425, Training Loss: 0.03261
Epoch: 4425, Training Loss: 0.04159
Epoch: 4426, Training Loss: 0.02937
Epoch: 4426, Training Loss: 0.03254
Epoch: 4426, Training Loss: 0.03260
Epoch: 4426, Training Loss: 0.04158
Epoch: 4427, Training Loss: 0.02936
Epoch: 4427, Training Loss: 0.03253
Epoch: 4427, Training Loss: 0.03260
Epoch: 4427, Training Loss: 0.04157
Epoch: 4428, Training Loss: 0.02936
Epoch: 4428, Training Loss: 0.03252
Epoch: 4428, Training Loss: 0.03259
Epoch: 4428, Training Loss: 0.04155
Epoch: 4429, Training Loss: 0.02935
Epoch: 4429, Training Loss: 0.03252
Epoch: 4429, Training Loss: 0.03258
Epoch: 4429, Training Loss: 0.04154
Epoch: 4430, Training Loss: 0.02934
Epoch: 4430, Training Loss: 0.03251
Epoch: 4430, Training Loss: 0.03257
Epoch: 4430, Training Loss: 0.04153
Epoch: 4431, Training Loss: 0.02934
Epoch: 4431, Training Loss: 0.03250
Epoch: 4431, Training Loss: 0.03256
Epoch: 4431, Training Loss: 0.04152
Epoch: 4432, Training Loss: 0.02933
Epoch: 4432, Training Loss: 0.03249
Epoch: 4432, Training Loss: 0.03255
Epoch: 4432, Training Loss: 0.04151
Epoch: 4433, Training Loss: 0.02932
Epoch: 4433, Training Loss: 0.03248
Epoch: 4433, Training Loss: 0.03255
Epoch: 4433, Training Loss: 0.04150
Epoch: 4434, Training Loss: 0.02932
Epoch: 4434, Training Loss: 0.03247
Epoch: 4434, Training Loss: 0.03254
Epoch: 4434, Training Loss: 0.04149
Epoch: 4435, Training Loss: 0.02931
Epoch: 4435, Training Loss: 0.03247
Epoch: 4435, Training Loss: 0.03253
Epoch: 4435, Training Loss: 0.04148
Epoch: 4436, Training Loss: 0.02930
Epoch: 4436, Training Loss: 0.03246
Epoch: 4436, Training Loss: 0.03252
Epoch: 4436, Training Loss: 0.04147
Epoch: 4437, Training Loss: 0.02929
Epoch: 4437, Training Loss: 0.03245
Epoch: 4437, Training Loss: 0.03251
Epoch: 4437, Training Loss: 0.04145
Epoch: 4438, Training Loss: 0.02929
Epoch: 4438, Training Loss: 0.03244
Epoch: 4438, Training Loss: 0.03250
Epoch: 4438, Training Loss: 0.04144
Epoch: 4439, Training Loss: 0.02928
Epoch: 4439, Training Loss: 0.03243
    ... [verbose training log elided: the loss for each of the four XOR patterns is printed every epoch; across epochs 4439-4689 the four losses decrease smoothly from about 0.029 / 0.032 / 0.033 / 0.041 to about 0.028 / 0.031 / 0.031 / 0.039] ...
Epoch: 4689, Training Loss: 0.03053
Epoch: 4689, Training Loss: 0.03059
Epoch: 4689, Training Loss: 0.03889
Epoch: 4690, Training Loss: 0.02769
Epoch: 4690, Training Loss: 0.03053
Epoch: 4690, Training Loss: 0.03059
Epoch: 4690, Training Loss: 0.03888
Epoch: 4691, Training Loss: 0.02769
Epoch: 4691, Training Loss: 0.03052
Epoch: 4691, Training Loss: 0.03058
Epoch: 4691, Training Loss: 0.03887
Epoch: 4692, Training Loss: 0.02768
Epoch: 4692, Training Loss: 0.03051
Epoch: 4692, Training Loss: 0.03057
Epoch: 4692, Training Loss: 0.03886
Epoch: 4693, Training Loss: 0.02768
Epoch: 4693, Training Loss: 0.03051
Epoch: 4693, Training Loss: 0.03057
Epoch: 4693, Training Loss: 0.03886
Epoch: 4694, Training Loss: 0.02767
Epoch: 4694, Training Loss: 0.03050
Epoch: 4694, Training Loss: 0.03056
Epoch: 4694, Training Loss: 0.03885
Epoch: 4695, Training Loss: 0.02766
Epoch: 4695, Training Loss: 0.03049
Epoch: 4695, Training Loss: 0.03055
Epoch: 4695, Training Loss: 0.03884
Epoch: 4696, Training Loss: 0.02766
Epoch: 4696, Training Loss: 0.03049
Epoch: 4696, Training Loss: 0.03054
Epoch: 4696, Training Loss: 0.03883
Epoch: 4697, Training Loss: 0.02765
Epoch: 4697, Training Loss: 0.03048
Epoch: 4697, Training Loss: 0.03054
Epoch: 4697, Training Loss: 0.03882
Epoch: 4698, Training Loss: 0.02765
Epoch: 4698, Training Loss: 0.03047
Epoch: 4698, Training Loss: 0.03053
Epoch: 4698, Training Loss: 0.03881
Epoch: 4699, Training Loss: 0.02764
Epoch: 4699, Training Loss: 0.03047
Epoch: 4699, Training Loss: 0.03052
Epoch: 4699, Training Loss: 0.03880
Epoch: 4700, Training Loss: 0.02763
Epoch: 4700, Training Loss: 0.03046
Epoch: 4700, Training Loss: 0.03052
Epoch: 4700, Training Loss: 0.03879
Epoch: 4701, Training Loss: 0.02763
Epoch: 4701, Training Loss: 0.03045
Epoch: 4701, Training Loss: 0.03051
Epoch: 4701, Training Loss: 0.03878
Epoch: 4702, Training Loss: 0.02762
Epoch: 4702, Training Loss: 0.03044
Epoch: 4702, Training Loss: 0.03050
Epoch: 4702, Training Loss: 0.03877
Epoch: 4703, Training Loss: 0.02762
Epoch: 4703, Training Loss: 0.03044
Epoch: 4703, Training Loss: 0.03050
Epoch: 4703, Training Loss: 0.03876
Epoch: 4704, Training Loss: 0.02761
Epoch: 4704, Training Loss: 0.03043
Epoch: 4704, Training Loss: 0.03049
Epoch: 4704, Training Loss: 0.03875
Epoch: 4705, Training Loss: 0.02761
Epoch: 4705, Training Loss: 0.03042
Epoch: 4705, Training Loss: 0.03048
Epoch: 4705, Training Loss: 0.03874
Epoch: 4706, Training Loss: 0.02760
Epoch: 4706, Training Loss: 0.03042
Epoch: 4706, Training Loss: 0.03048
Epoch: 4706, Training Loss: 0.03874
Epoch: 4707, Training Loss: 0.02759
Epoch: 4707, Training Loss: 0.03041
Epoch: 4707, Training Loss: 0.03047
Epoch: 4707, Training Loss: 0.03873
Epoch: 4708, Training Loss: 0.02759
Epoch: 4708, Training Loss: 0.03040
Epoch: 4708, Training Loss: 0.03046
Epoch: 4708, Training Loss: 0.03872
Epoch: 4709, Training Loss: 0.02758
Epoch: 4709, Training Loss: 0.03040
Epoch: 4709, Training Loss: 0.03045
Epoch: 4709, Training Loss: 0.03871
Epoch: 4710, Training Loss: 0.02758
Epoch: 4710, Training Loss: 0.03039
Epoch: 4710, Training Loss: 0.03045
Epoch: 4710, Training Loss: 0.03870
Epoch: 4711, Training Loss: 0.02757
Epoch: 4711, Training Loss: 0.03038
Epoch: 4711, Training Loss: 0.03044
Epoch: 4711, Training Loss: 0.03869
Epoch: 4712, Training Loss: 0.02757
Epoch: 4712, Training Loss: 0.03038
Epoch: 4712, Training Loss: 0.03043
Epoch: 4712, Training Loss: 0.03868
Epoch: 4713, Training Loss: 0.02756
Epoch: 4713, Training Loss: 0.03037
Epoch: 4713, Training Loss: 0.03043
Epoch: 4713, Training Loss: 0.03867
Epoch: 4714, Training Loss: 0.02755
Epoch: 4714, Training Loss: 0.03036
Epoch: 4714, Training Loss: 0.03042
Epoch: 4714, Training Loss: 0.03866
Epoch: 4715, Training Loss: 0.02755
Epoch: 4715, Training Loss: 0.03036
Epoch: 4715, Training Loss: 0.03041
Epoch: 4715, Training Loss: 0.03865
Epoch: 4716, Training Loss: 0.02754
Epoch: 4716, Training Loss: 0.03035
Epoch: 4716, Training Loss: 0.03041
Epoch: 4716, Training Loss: 0.03864
Epoch: 4717, Training Loss: 0.02754
Epoch: 4717, Training Loss: 0.03034
Epoch: 4717, Training Loss: 0.03040
Epoch: 4717, Training Loss: 0.03863
Epoch: 4718, Training Loss: 0.02753
Epoch: 4718, Training Loss: 0.03033
Epoch: 4718, Training Loss: 0.03039
Epoch: 4718, Training Loss: 0.03863
Epoch: 4719, Training Loss: 0.02753
Epoch: 4719, Training Loss: 0.03033
Epoch: 4719, Training Loss: 0.03039
Epoch: 4719, Training Loss: 0.03862
Epoch: 4720, Training Loss: 0.02752
Epoch: 4720, Training Loss: 0.03032
Epoch: 4720, Training Loss: 0.03038
Epoch: 4720, Training Loss: 0.03861
Epoch: 4721, Training Loss: 0.02751
Epoch: 4721, Training Loss: 0.03031
Epoch: 4721, Training Loss: 0.03037
Epoch: 4721, Training Loss: 0.03860
Epoch: 4722, Training Loss: 0.02751
Epoch: 4722, Training Loss: 0.03031
Epoch: 4722, Training Loss: 0.03037
Epoch: 4722, Training Loss: 0.03859
Epoch: 4723, Training Loss: 0.02750
Epoch: 4723, Training Loss: 0.03030
Epoch: 4723, Training Loss: 0.03036
Epoch: 4723, Training Loss: 0.03858
Epoch: 4724, Training Loss: 0.02750
Epoch: 4724, Training Loss: 0.03029
Epoch: 4724, Training Loss: 0.03035
Epoch: 4724, Training Loss: 0.03857
Epoch: 4725, Training Loss: 0.02749
Epoch: 4725, Training Loss: 0.03029
Epoch: 4725, Training Loss: 0.03035
Epoch: 4725, Training Loss: 0.03856
Epoch: 4726, Training Loss: 0.02749
Epoch: 4726, Training Loss: 0.03028
Epoch: 4726, Training Loss: 0.03034
Epoch: 4726, Training Loss: 0.03855
Epoch: 4727, Training Loss: 0.02748
Epoch: 4727, Training Loss: 0.03027
Epoch: 4727, Training Loss: 0.03033
Epoch: 4727, Training Loss: 0.03854
Epoch: 4728, Training Loss: 0.02747
Epoch: 4728, Training Loss: 0.03027
Epoch: 4728, Training Loss: 0.03032
Epoch: 4728, Training Loss: 0.03853
Epoch: 4729, Training Loss: 0.02747
Epoch: 4729, Training Loss: 0.03026
Epoch: 4729, Training Loss: 0.03032
Epoch: 4729, Training Loss: 0.03853
Epoch: 4730, Training Loss: 0.02746
Epoch: 4730, Training Loss: 0.03025
Epoch: 4730, Training Loss: 0.03031
Epoch: 4730, Training Loss: 0.03852
Epoch: 4731, Training Loss: 0.02746
Epoch: 4731, Training Loss: 0.03025
Epoch: 4731, Training Loss: 0.03030
Epoch: 4731, Training Loss: 0.03851
Epoch: 4732, Training Loss: 0.02745
Epoch: 4732, Training Loss: 0.03024
Epoch: 4732, Training Loss: 0.03030
Epoch: 4732, Training Loss: 0.03850
Epoch: 4733, Training Loss: 0.02745
Epoch: 4733, Training Loss: 0.03023
Epoch: 4733, Training Loss: 0.03029
Epoch: 4733, Training Loss: 0.03849
Epoch: 4734, Training Loss: 0.02744
Epoch: 4734, Training Loss: 0.03023
Epoch: 4734, Training Loss: 0.03028
Epoch: 4734, Training Loss: 0.03848
Epoch: 4735, Training Loss: 0.02743
Epoch: 4735, Training Loss: 0.03022
Epoch: 4735, Training Loss: 0.03028
Epoch: 4735, Training Loss: 0.03847
Epoch: 4736, Training Loss: 0.02743
Epoch: 4736, Training Loss: 0.03021
Epoch: 4736, Training Loss: 0.03027
Epoch: 4736, Training Loss: 0.03846
Epoch: 4737, Training Loss: 0.02742
Epoch: 4737, Training Loss: 0.03021
Epoch: 4737, Training Loss: 0.03026
Epoch: 4737, Training Loss: 0.03845
Epoch: 4738, Training Loss: 0.02742
Epoch: 4738, Training Loss: 0.03020
Epoch: 4738, Training Loss: 0.03026
Epoch: 4738, Training Loss: 0.03844
Epoch: 4739, Training Loss: 0.02741
Epoch: 4739, Training Loss: 0.03019
Epoch: 4739, Training Loss: 0.03025
Epoch: 4739, Training Loss: 0.03844
Epoch: 4740, Training Loss: 0.02741
Epoch: 4740, Training Loss: 0.03019
Epoch: 4740, Training Loss: 0.03024
Epoch: 4740, Training Loss: 0.03843
Epoch: 4741, Training Loss: 0.02740
Epoch: 4741, Training Loss: 0.03018
Epoch: 4741, Training Loss: 0.03024
Epoch: 4741, Training Loss: 0.03842
Epoch: 4742, Training Loss: 0.02739
Epoch: 4742, Training Loss: 0.03017
Epoch: 4742, Training Loss: 0.03023
Epoch: 4742, Training Loss: 0.03841
Epoch: 4743, Training Loss: 0.02739
Epoch: 4743, Training Loss: 0.03017
Epoch: 4743, Training Loss: 0.03022
Epoch: 4743, Training Loss: 0.03840
Epoch: 4744, Training Loss: 0.02738
Epoch: 4744, Training Loss: 0.03016
Epoch: 4744, Training Loss: 0.03022
Epoch: 4744, Training Loss: 0.03839
Epoch: 4745, Training Loss: 0.02738
Epoch: 4745, Training Loss: 0.03015
Epoch: 4745, Training Loss: 0.03021
Epoch: 4745, Training Loss: 0.03838
Epoch: 4746, Training Loss: 0.02737
Epoch: 4746, Training Loss: 0.03014
Epoch: 4746, Training Loss: 0.03020
Epoch: 4746, Training Loss: 0.03837
Epoch: 4747, Training Loss: 0.02737
Epoch: 4747, Training Loss: 0.03014
Epoch: 4747, Training Loss: 0.03020
Epoch: 4747, Training Loss: 0.03836
Epoch: 4748, Training Loss: 0.02736
Epoch: 4748, Training Loss: 0.03013
Epoch: 4748, Training Loss: 0.03019
Epoch: 4748, Training Loss: 0.03835
Epoch: 4749, Training Loss: 0.02736
Epoch: 4749, Training Loss: 0.03012
Epoch: 4749, Training Loss: 0.03018
Epoch: 4749, Training Loss: 0.03835
Epoch: 4750, Training Loss: 0.02735
Epoch: 4750, Training Loss: 0.03012
Epoch: 4750, Training Loss: 0.03018
Epoch: 4750, Training Loss: 0.03834
Epoch: 4751, Training Loss: 0.02734
Epoch: 4751, Training Loss: 0.03011
Epoch: 4751, Training Loss: 0.03017
Epoch: 4751, Training Loss: 0.03833
Epoch: 4752, Training Loss: 0.02734
Epoch: 4752, Training Loss: 0.03010
Epoch: 4752, Training Loss: 0.03016
Epoch: 4752, Training Loss: 0.03832
Epoch: 4753, Training Loss: 0.02733
Epoch: 4753, Training Loss: 0.03010
Epoch: 4753, Training Loss: 0.03016
Epoch: 4753, Training Loss: 0.03831
Epoch: 4754, Training Loss: 0.02733
Epoch: 4754, Training Loss: 0.03009
Epoch: 4754, Training Loss: 0.03015
Epoch: 4754, Training Loss: 0.03830
Epoch: 4755, Training Loss: 0.02732
Epoch: 4755, Training Loss: 0.03008
Epoch: 4755, Training Loss: 0.03014
Epoch: 4755, Training Loss: 0.03829
Epoch: 4756, Training Loss: 0.02732
Epoch: 4756, Training Loss: 0.03008
Epoch: 4756, Training Loss: 0.03014
Epoch: 4756, Training Loss: 0.03828
Epoch: 4757, Training Loss: 0.02731
Epoch: 4757, Training Loss: 0.03007
Epoch: 4757, Training Loss: 0.03013
Epoch: 4757, Training Loss: 0.03827
Epoch: 4758, Training Loss: 0.02730
Epoch: 4758, Training Loss: 0.03006
Epoch: 4758, Training Loss: 0.03012
Epoch: 4758, Training Loss: 0.03827
Epoch: 4759, Training Loss: 0.02730
Epoch: 4759, Training Loss: 0.03006
Epoch: 4759, Training Loss: 0.03012
Epoch: 4759, Training Loss: 0.03826
Epoch: 4760, Training Loss: 0.02729
Epoch: 4760, Training Loss: 0.03005
Epoch: 4760, Training Loss: 0.03011
Epoch: 4760, Training Loss: 0.03825
Epoch: 4761, Training Loss: 0.02729
Epoch: 4761, Training Loss: 0.03004
Epoch: 4761, Training Loss: 0.03010
Epoch: 4761, Training Loss: 0.03824
Epoch: 4762, Training Loss: 0.02728
Epoch: 4762, Training Loss: 0.03004
Epoch: 4762, Training Loss: 0.03010
Epoch: 4762, Training Loss: 0.03823
Epoch: 4763, Training Loss: 0.02728
Epoch: 4763, Training Loss: 0.03003
Epoch: 4763, Training Loss: 0.03009
Epoch: 4763, Training Loss: 0.03822
Epoch: 4764, Training Loss: 0.02727
Epoch: 4764, Training Loss: 0.03002
Epoch: 4764, Training Loss: 0.03008
Epoch: 4764, Training Loss: 0.03821
Epoch: 4765, Training Loss: 0.02727
Epoch: 4765, Training Loss: 0.03002
Epoch: 4765, Training Loss: 0.03008
Epoch: 4765, Training Loss: 0.03820
Epoch: 4766, Training Loss: 0.02726
Epoch: 4766, Training Loss: 0.03001
Epoch: 4766, Training Loss: 0.03007
Epoch: 4766, Training Loss: 0.03819
Epoch: 4767, Training Loss: 0.02725
Epoch: 4767, Training Loss: 0.03000
Epoch: 4767, Training Loss: 0.03006
Epoch: 4767, Training Loss: 0.03819
Epoch: 4768, Training Loss: 0.02725
Epoch: 4768, Training Loss: 0.03000
Epoch: 4768, Training Loss: 0.03006
Epoch: 4768, Training Loss: 0.03818
Epoch: 4769, Training Loss: 0.02724
Epoch: 4769, Training Loss: 0.02999
Epoch: 4769, Training Loss: 0.03005
Epoch: 4769, Training Loss: 0.03817
Epoch: 4770, Training Loss: 0.02724
Epoch: 4770, Training Loss: 0.02999
Epoch: 4770, Training Loss: 0.03004
Epoch: 4770, Training Loss: 0.03816
Epoch: 4771, Training Loss: 0.02723
Epoch: 4771, Training Loss: 0.02998
Epoch: 4771, Training Loss: 0.03004
Epoch: 4771, Training Loss: 0.03815
Epoch: 4772, Training Loss: 0.02723
Epoch: 4772, Training Loss: 0.02997
Epoch: 4772, Training Loss: 0.03003
Epoch: 4772, Training Loss: 0.03814
Epoch: 4773, Training Loss: 0.02722
Epoch: 4773, Training Loss: 0.02997
Epoch: 4773, Training Loss: 0.03002
Epoch: 4773, Training Loss: 0.03813
Epoch: 4774, Training Loss: 0.02722
Epoch: 4774, Training Loss: 0.02996
Epoch: 4774, Training Loss: 0.03002
Epoch: 4774, Training Loss: 0.03812
Epoch: 4775, Training Loss: 0.02721
Epoch: 4775, Training Loss: 0.02995
Epoch: 4775, Training Loss: 0.03001
Epoch: 4775, Training Loss: 0.03812
Epoch: 4776, Training Loss: 0.02720
Epoch: 4776, Training Loss: 0.02995
Epoch: 4776, Training Loss: 0.03000
Epoch: 4776, Training Loss: 0.03811
Epoch: 4777, Training Loss: 0.02720
Epoch: 4777, Training Loss: 0.02994
Epoch: 4777, Training Loss: 0.03000
Epoch: 4777, Training Loss: 0.03810
Epoch: 4778, Training Loss: 0.02719
Epoch: 4778, Training Loss: 0.02993
Epoch: 4778, Training Loss: 0.02999
Epoch: 4778, Training Loss: 0.03809
Epoch: 4779, Training Loss: 0.02719
Epoch: 4779, Training Loss: 0.02993
Epoch: 4779, Training Loss: 0.02998
Epoch: 4779, Training Loss: 0.03808
Epoch: 4780, Training Loss: 0.02718
Epoch: 4780, Training Loss: 0.02992
Epoch: 4780, Training Loss: 0.02998
Epoch: 4780, Training Loss: 0.03807
Epoch: 4781, Training Loss: 0.02718
Epoch: 4781, Training Loss: 0.02991
Epoch: 4781, Training Loss: 0.02997
Epoch: 4781, Training Loss: 0.03806
Epoch: 4782, Training Loss: 0.02717
Epoch: 4782, Training Loss: 0.02991
Epoch: 4782, Training Loss: 0.02996
Epoch: 4782, Training Loss: 0.03805
Epoch: 4783, Training Loss: 0.02717
Epoch: 4783, Training Loss: 0.02990
Epoch: 4783, Training Loss: 0.02996
Epoch: 4783, Training Loss: 0.03805
Epoch: 4784, Training Loss: 0.02716
Epoch: 4784, Training Loss: 0.02989
Epoch: 4784, Training Loss: 0.02995
Epoch: 4784, Training Loss: 0.03804
Epoch: 4785, Training Loss: 0.02715
Epoch: 4785, Training Loss: 0.02989
Epoch: 4785, Training Loss: 0.02994
Epoch: 4785, Training Loss: 0.03803
Epoch: 4786, Training Loss: 0.02715
Epoch: 4786, Training Loss: 0.02988
Epoch: 4786, Training Loss: 0.02994
Epoch: 4786, Training Loss: 0.03802
Epoch: 4787, Training Loss: 0.02714
Epoch: 4787, Training Loss: 0.02987
Epoch: 4787, Training Loss: 0.02993
Epoch: 4787, Training Loss: 0.03801
Epoch: 4788, Training Loss: 0.02714
Epoch: 4788, Training Loss: 0.02987
Epoch: 4788, Training Loss: 0.02992
Epoch: 4788, Training Loss: 0.03800
Epoch: 4789, Training Loss: 0.02713
Epoch: 4789, Training Loss: 0.02986
Epoch: 4789, Training Loss: 0.02992
Epoch: 4789, Training Loss: 0.03799
Epoch: 4790, Training Loss: 0.02713
Epoch: 4790, Training Loss: 0.02985
Epoch: 4790, Training Loss: 0.02991
Epoch: 4790, Training Loss: 0.03798
Epoch: 4791, Training Loss: 0.02712
Epoch: 4791, Training Loss: 0.02985
Epoch: 4791, Training Loss: 0.02990
Epoch: 4791, Training Loss: 0.03798
Epoch: 4792, Training Loss: 0.02712
Epoch: 4792, Training Loss: 0.02984
Epoch: 4792, Training Loss: 0.02990
Epoch: 4792, Training Loss: 0.03797
Epoch: 4793, Training Loss: 0.02711
Epoch: 4793, Training Loss: 0.02983
Epoch: 4793, Training Loss: 0.02989
Epoch: 4793, Training Loss: 0.03796
Epoch: 4794, Training Loss: 0.02711
Epoch: 4794, Training Loss: 0.02983
Epoch: 4794, Training Loss: 0.02988
Epoch: 4794, Training Loss: 0.03795
Epoch: 4795, Training Loss: 0.02710
Epoch: 4795, Training Loss: 0.02982
Epoch: 4795, Training Loss: 0.02988
Epoch: 4795, Training Loss: 0.03794
Epoch: 4796, Training Loss: 0.02709
Epoch: 4796, Training Loss: 0.02981
Epoch: 4796, Training Loss: 0.02987
Epoch: 4796, Training Loss: 0.03793
Epoch: 4797, Training Loss: 0.02709
Epoch: 4797, Training Loss: 0.02981
Epoch: 4797, Training Loss: 0.02987
Epoch: 4797, Training Loss: 0.03792
Epoch: 4798, Training Loss: 0.02708
Epoch: 4798, Training Loss: 0.02980
Epoch: 4798, Training Loss: 0.02986
Epoch: 4798, Training Loss: 0.03792
Epoch: 4799, Training Loss: 0.02708
Epoch: 4799, Training Loss: 0.02980
Epoch: 4799, Training Loss: 0.02985
Epoch: 4799, Training Loss: 0.03791
Epoch: 4800, Training Loss: 0.02707
Epoch: 4800, Training Loss: 0.02979
Epoch: 4800, Training Loss: 0.02985
Epoch: 4800, Training Loss: 0.03790
Epoch: 4801, Training Loss: 0.02707
Epoch: 4801, Training Loss: 0.02978
Epoch: 4801, Training Loss: 0.02984
Epoch: 4801, Training Loss: 0.03789
Epoch: 4802, Training Loss: 0.02706
Epoch: 4802, Training Loss: 0.02978
Epoch: 4802, Training Loss: 0.02983
Epoch: 4802, Training Loss: 0.03788
Epoch: 4803, Training Loss: 0.02706
Epoch: 4803, Training Loss: 0.02977
Epoch: 4803, Training Loss: 0.02983
Epoch: 4803, Training Loss: 0.03787
Epoch: 4804, Training Loss: 0.02705
Epoch: 4804, Training Loss: 0.02976
Epoch: 4804, Training Loss: 0.02982
Epoch: 4804, Training Loss: 0.03786
Epoch: 4805, Training Loss: 0.02705
Epoch: 4805, Training Loss: 0.02976
Epoch: 4805, Training Loss: 0.02981
Epoch: 4805, Training Loss: 0.03785
Epoch: 4806, Training Loss: 0.02704
Epoch: 4806, Training Loss: 0.02975
Epoch: 4806, Training Loss: 0.02981
Epoch: 4806, Training Loss: 0.03785
Epoch: 4807, Training Loss: 0.02703
Epoch: 4807, Training Loss: 0.02974
Epoch: 4807, Training Loss: 0.02980
Epoch: 4807, Training Loss: 0.03784
Epoch: 4808, Training Loss: 0.02703
Epoch: 4808, Training Loss: 0.02974
Epoch: 4808, Training Loss: 0.02979
Epoch: 4808, Training Loss: 0.03783
Epoch: 4809, Training Loss: 0.02702
Epoch: 4809, Training Loss: 0.02973
Epoch: 4809, Training Loss: 0.02979
Epoch: 4809, Training Loss: 0.03782
Epoch: 4810, Training Loss: 0.02702
Epoch: 4810, Training Loss: 0.02972
Epoch: 4810, Training Loss: 0.02978
Epoch: 4810, Training Loss: 0.03781
Epoch: 4811, Training Loss: 0.02701
Epoch: 4811, Training Loss: 0.02972
Epoch: 4811, Training Loss: 0.02977
Epoch: 4811, Training Loss: 0.03780
Epoch: 4812, Training Loss: 0.02701
Epoch: 4812, Training Loss: 0.02971
Epoch: 4812, Training Loss: 0.02977
Epoch: 4812, Training Loss: 0.03779
Epoch: 4813, Training Loss: 0.02700
Epoch: 4813, Training Loss: 0.02970
Epoch: 4813, Training Loss: 0.02976
Epoch: 4813, Training Loss: 0.03779
Epoch: 4814, Training Loss: 0.02700
Epoch: 4814, Training Loss: 0.02970
Epoch: 4814, Training Loss: 0.02976
Epoch: 4814, Training Loss: 0.03778
Epoch: 4815, Training Loss: 0.02699
Epoch: 4815, Training Loss: 0.02969
Epoch: 4815, Training Loss: 0.02975
Epoch: 4815, Training Loss: 0.03777
Epoch: 4816, Training Loss: 0.02699
Epoch: 4816, Training Loss: 0.02969
Epoch: 4816, Training Loss: 0.02974
Epoch: 4816, Training Loss: 0.03776
Epoch: 4817, Training Loss: 0.02698
Epoch: 4817, Training Loss: 0.02968
Epoch: 4817, Training Loss: 0.02974
Epoch: 4817, Training Loss: 0.03775
Epoch: 4818, Training Loss: 0.02697
Epoch: 4818, Training Loss: 0.02967
Epoch: 4818, Training Loss: 0.02973
Epoch: 4818, Training Loss: 0.03774
Epoch: 4819, Training Loss: 0.02697
Epoch: 4819, Training Loss: 0.02967
Epoch: 4819, Training Loss: 0.02972
Epoch: 4819, Training Loss: 0.03773
Epoch: 4820, Training Loss: 0.02696
Epoch: 4820, Training Loss: 0.02966
Epoch: 4820, Training Loss: 0.02972
Epoch: 4820, Training Loss: 0.03773
Epoch: 4821, Training Loss: 0.02696
Epoch: 4821, Training Loss: 0.02965
Epoch: 4821, Training Loss: 0.02971
Epoch: 4821, Training Loss: 0.03772
Epoch: 4822, Training Loss: 0.02695
Epoch: 4822, Training Loss: 0.02965
Epoch: 4822, Training Loss: 0.02970
Epoch: 4822, Training Loss: 0.03771
Epoch: 4823, Training Loss: 0.02695
Epoch: 4823, Training Loss: 0.02964
Epoch: 4823, Training Loss: 0.02970
Epoch: 4823, Training Loss: 0.03770
Epoch: 4824, Training Loss: 0.02694
Epoch: 4824, Training Loss: 0.02963
Epoch: 4824, Training Loss: 0.02969
Epoch: 4824, Training Loss: 0.03769
Epoch: 4825, Training Loss: 0.02694
Epoch: 4825, Training Loss: 0.02963
Epoch: 4825, Training Loss: 0.02968
Epoch: 4825, Training Loss: 0.03768
Epoch: 4826, Training Loss: 0.02693
Epoch: 4826, Training Loss: 0.02962
Epoch: 4826, Training Loss: 0.02968
Epoch: 4826, Training Loss: 0.03768
Epoch: 4827, Training Loss: 0.02693
Epoch: 4827, Training Loss: 0.02962
Epoch: 4827, Training Loss: 0.02967
Epoch: 4827, Training Loss: 0.03767
Epoch: 4828, Training Loss: 0.02692
Epoch: 4828, Training Loss: 0.02961
Epoch: 4828, Training Loss: 0.02967
Epoch: 4828, Training Loss: 0.03766
Epoch: 4829, Training Loss: 0.02692
Epoch: 4829, Training Loss: 0.02960
Epoch: 4829, Training Loss: 0.02966
Epoch: 4829, Training Loss: 0.03765
Epoch: 4830, Training Loss: 0.02691
Epoch: 4830, Training Loss: 0.02960
Epoch: 4830, Training Loss: 0.02965
Epoch: 4830, Training Loss: 0.03764
Epoch: 4831, Training Loss: 0.02691
Epoch: 4831, Training Loss: 0.02959
Epoch: 4831, Training Loss: 0.02965
Epoch: 4831, Training Loss: 0.03763
Epoch: 4832, Training Loss: 0.02690
Epoch: 4832, Training Loss: 0.02958
Epoch: 4832, Training Loss: 0.02964
Epoch: 4832, Training Loss: 0.03762
Epoch: 4833, Training Loss: 0.02689
Epoch: 4833, Training Loss: 0.02958
Epoch: 4833, Training Loss: 0.02963
Epoch: 4833, Training Loss: 0.03762
Epoch: 4834, Training Loss: 0.02689
Epoch: 4834, Training Loss: 0.02957
Epoch: 4834, Training Loss: 0.02963
Epoch: 4834, Training Loss: 0.03761
Epoch: 4835, Training Loss: 0.02688
Epoch: 4835, Training Loss: 0.02956
Epoch: 4835, Training Loss: 0.02962
Epoch: 4835, Training Loss: 0.03760
Epoch: 4836, Training Loss: 0.02688
Epoch: 4836, Training Loss: 0.02956
Epoch: 4836, Training Loss: 0.02961
Epoch: 4836, Training Loss: 0.03759
Epoch: 4837, Training Loss: 0.02687
Epoch: 4837, Training Loss: 0.02955
Epoch: 4837, Training Loss: 0.02961
Epoch: 4837, Training Loss: 0.03758
Epoch: 4838, Training Loss: 0.02687
Epoch: 4838, Training Loss: 0.02955
Epoch: 4838, Training Loss: 0.02960
Epoch: 4838, Training Loss: 0.03757
Epoch: 4839, Training Loss: 0.02686
Epoch: 4839, Training Loss: 0.02954
Epoch: 4839, Training Loss: 0.02960
Epoch: 4839, Training Loss: 0.03757
Epoch: 4840, Training Loss: 0.02686
Epoch: 4840, Training Loss: 0.02953
Epoch: 4840, Training Loss: 0.02959
Epoch: 4840, Training Loss: 0.03756
Epoch: 4841, Training Loss: 0.02685
Epoch: 4841, Training Loss: 0.02953
Epoch: 4841, Training Loss: 0.02958
Epoch: 4841, Training Loss: 0.03755
Epoch: 4842, Training Loss: 0.02685
Epoch: 4842, Training Loss: 0.02952
Epoch: 4842, Training Loss: 0.02958
Epoch: 4842, Training Loss: 0.03754
Epoch: 4843, Training Loss: 0.02684
Epoch: 4843, Training Loss: 0.02951
Epoch: 4843, Training Loss: 0.02957
Epoch: 4843, Training Loss: 0.03753
Epoch: 4844, Training Loss: 0.02684
Epoch: 4844, Training Loss: 0.02951
Epoch: 4844, Training Loss: 0.02956
Epoch: 4844, Training Loss: 0.03752
Epoch: 4845, Training Loss: 0.02683
Epoch: 4845, Training Loss: 0.02950
Epoch: 4845, Training Loss: 0.02956
Epoch: 4845, Training Loss: 0.03751
Epoch: 4846, Training Loss: 0.02683
Epoch: 4846, Training Loss: 0.02949
Epoch: 4846, Training Loss: 0.02955
Epoch: 4846, Training Loss: 0.03751
Epoch: 4847, Training Loss: 0.02682
Epoch: 4847, Training Loss: 0.02949
Epoch: 4847, Training Loss: 0.02954
Epoch: 4847, Training Loss: 0.03750
Epoch: 4848, Training Loss: 0.02681
Epoch: 4848, Training Loss: 0.02948
Epoch: 4848, Training Loss: 0.02954
Epoch: 4848, Training Loss: 0.03749
Epoch: 4849, Training Loss: 0.02681
Epoch: 4849, Training Loss: 0.02948
Epoch: 4849, Training Loss: 0.02953
Epoch: 4849, Training Loss: 0.03748
Epoch: 4850, Training Loss: 0.02680
Epoch: 4850, Training Loss: 0.02947
Epoch: 4850, Training Loss: 0.02953
Epoch: 4850, Training Loss: 0.03747
Epoch: 4851, Training Loss: 0.02680
Epoch: 4851, Training Loss: 0.02946
Epoch: 4851, Training Loss: 0.02952
Epoch: 4851, Training Loss: 0.03746
Epoch: 4852, Training Loss: 0.02679
Epoch: 4852, Training Loss: 0.02946
Epoch: 4852, Training Loss: 0.02951
Epoch: 4852, Training Loss: 0.03746
Epoch: 4853, Training Loss: 0.02679
Epoch: 4853, Training Loss: 0.02945
Epoch: 4853, Training Loss: 0.02951
Epoch: 4853, Training Loss: 0.03745
Epoch: 4854, Training Loss: 0.02678
Epoch: 4854, Training Loss: 0.02944
Epoch: 4854, Training Loss: 0.02950
Epoch: 4854, Training Loss: 0.03744
Epoch: 4855, Training Loss: 0.02678
Epoch: 4855, Training Loss: 0.02944
Epoch: 4855, Training Loss: 0.02949
Epoch: 4855, Training Loss: 0.03743
Epoch: 4856, Training Loss: 0.02677
Epoch: 4856, Training Loss: 0.02943
Epoch: 4856, Training Loss: 0.02949
Epoch: 4856, Training Loss: 0.03742
Epoch: 4857, Training Loss: 0.02677
Epoch: 4857, Training Loss: 0.02943
Epoch: 4857, Training Loss: 0.02948
Epoch: 4857, Training Loss: 0.03741
Epoch: 4858, Training Loss: 0.02676
Epoch: 4858, Training Loss: 0.02942
Epoch: 4858, Training Loss: 0.02948
Epoch: 4858, Training Loss: 0.03741
Epoch: 4859, Training Loss: 0.02676
Epoch: 4859, Training Loss: 0.02941
Epoch: 4859, Training Loss: 0.02947
Epoch: 4859, Training Loss: 0.03740
Epoch: 4860, Training Loss: 0.02675
Epoch: 4860, Training Loss: 0.02941
Epoch: 4860, Training Loss: 0.02946
Epoch: 4860, Training Loss: 0.03739
Epoch: 4861, Training Loss: 0.02675
Epoch: 4861, Training Loss: 0.02940
Epoch: 4861, Training Loss: 0.02946
Epoch: 4861, Training Loss: 0.03738
Epoch: 4862, Training Loss: 0.02674
Epoch: 4862, Training Loss: 0.02939
Epoch: 4862, Training Loss: 0.02945
Epoch: 4862, Training Loss: 0.03737
Epoch: 4863, Training Loss: 0.02674
Epoch: 4863, Training Loss: 0.02939
Epoch: 4863, Training Loss: 0.02944
Epoch: 4863, Training Loss: 0.03736
Epoch: 4864, Training Loss: 0.02673
Epoch: 4864, Training Loss: 0.02938
Epoch: 4864, Training Loss: 0.02944
Epoch: 4864, Training Loss: 0.03736
Epoch: 4865, Training Loss: 0.02672
Epoch: 4865, Training Loss: 0.02938
Epoch: 4865, Training Loss: 0.02943
Epoch: 4865, Training Loss: 0.03735
Epoch: 4866, Training Loss: 0.02672
Epoch: 4866, Training Loss: 0.02937
Epoch: 4866, Training Loss: 0.02943
Epoch: 4866, Training Loss: 0.03734
Epoch: 4867, Training Loss: 0.02671
Epoch: 4867, Training Loss: 0.02936
Epoch: 4867, Training Loss: 0.02942
Epoch: 4867, Training Loss: 0.03733
Epoch: 4868, Training Loss: 0.02671
Epoch: 4868, Training Loss: 0.02936
Epoch: 4868, Training Loss: 0.02941
Epoch: 4868, Training Loss: 0.03732
Epoch: 4869, Training Loss: 0.02670
Epoch: 4869, Training Loss: 0.02935
Epoch: 4869, Training Loss: 0.02941
Epoch: 4869, Training Loss: 0.03731
Epoch: 4870, Training Loss: 0.02670
Epoch: 4870, Training Loss: 0.02934
Epoch: 4870, Training Loss: 0.02940
Epoch: 4870, Training Loss: 0.03731
Epoch: 4871, Training Loss: 0.02669
Epoch: 4871, Training Loss: 0.02934
Epoch: 4871, Training Loss: 0.02939
Epoch: 4871, Training Loss: 0.03730
Epoch: 4872, Training Loss: 0.02669
Epoch: 4872, Training Loss: 0.02933
Epoch: 4872, Training Loss: 0.02939
Epoch: 4872, Training Loss: 0.03729
Epoch: 4873, Training Loss: 0.02668
Epoch: 4873, Training Loss: 0.02933
Epoch: 4873, Training Loss: 0.02938
Epoch: 4873, Training Loss: 0.03728
Epoch: 4874, Training Loss: 0.02668
Epoch: 4874, Training Loss: 0.02932
Epoch: 4874, Training Loss: 0.02938
Epoch: 4874, Training Loss: 0.03727
Epoch: 4875, Training Loss: 0.02667
Epoch: 4875, Training Loss: 0.02931
Epoch: 4875, Training Loss: 0.02937
Epoch: 4875, Training Loss: 0.03727
Epoch: 4876, Training Loss: 0.02667
Epoch: 4876, Training Loss: 0.02931
Epoch: 4876, Training Loss: 0.02936
Epoch: 4876, Training Loss: 0.03726
Epoch: 4877, Training Loss: 0.02666
Epoch: 4877, Training Loss: 0.02930
Epoch: 4877, Training Loss: 0.02936
Epoch: 4877, Training Loss: 0.03725
Epoch: 4878, Training Loss: 0.02666
Epoch: 4878, Training Loss: 0.02929
Epoch: 4878, Training Loss: 0.02935
Epoch: 4878, Training Loss: 0.03724
Epoch: 4879, Training Loss: 0.02665
Epoch: 4879, Training Loss: 0.02929
Epoch: 4879, Training Loss: 0.02934
Epoch: 4879, Training Loss: 0.03723
Epoch: 4880, Training Loss: 0.02665
Epoch: 4880, Training Loss: 0.02928
Epoch: 4880, Training Loss: 0.02934
Epoch: 4880, Training Loss: 0.03722
Epoch: 4881, Training Loss: 0.02664
Epoch: 4881, Training Loss: 0.02928
Epoch: 4881, Training Loss: 0.02933
Epoch: 4881, Training Loss: 0.03722
Epoch: 4882, Training Loss: 0.02664
Epoch: 4882, Training Loss: 0.02927
Epoch: 4882, Training Loss: 0.02933
Epoch: 4882, Training Loss: 0.03721
Epoch: 4883, Training Loss: 0.02663
Epoch: 4883, Training Loss: 0.02926
Epoch: 4883, Training Loss: 0.02932
Epoch: 4883, Training Loss: 0.03720
... (per-sample loss printed for each of the four XOR patterns, epochs 4883-5133, decreasing slowly) ...
Epoch: 5133, Training Loss: 0.03530
Epoch: 5134, Training Loss: 0.02541
Epoch: 5134, Training Loss: 0.02783
Epoch: 5134, Training Loss: 0.02788
Epoch: 5134, Training Loss: 0.03529
Epoch: 5135, Training Loss: 0.02541
Epoch: 5135, Training Loss: 0.02782
Epoch: 5135, Training Loss: 0.02787
Epoch: 5135, Training Loss: 0.03528
Epoch: 5136, Training Loss: 0.02540
Epoch: 5136, Training Loss: 0.02782
Epoch: 5136, Training Loss: 0.02787
Epoch: 5136, Training Loss: 0.03528
Epoch: 5137, Training Loss: 0.02540
Epoch: 5137, Training Loss: 0.02781
Epoch: 5137, Training Loss: 0.02786
Epoch: 5137, Training Loss: 0.03527
Epoch: 5138, Training Loss: 0.02540
Epoch: 5138, Training Loss: 0.02780
Epoch: 5138, Training Loss: 0.02786
Epoch: 5138, Training Loss: 0.03526
Epoch: 5139, Training Loss: 0.02539
Epoch: 5139, Training Loss: 0.02780
Epoch: 5139, Training Loss: 0.02785
Epoch: 5139, Training Loss: 0.03526
Epoch: 5140, Training Loss: 0.02539
Epoch: 5140, Training Loss: 0.02779
Epoch: 5140, Training Loss: 0.02785
Epoch: 5140, Training Loss: 0.03525
Epoch: 5141, Training Loss: 0.02538
Epoch: 5141, Training Loss: 0.02779
Epoch: 5141, Training Loss: 0.02784
Epoch: 5141, Training Loss: 0.03524
Epoch: 5142, Training Loss: 0.02538
Epoch: 5142, Training Loss: 0.02778
Epoch: 5142, Training Loss: 0.02784
Epoch: 5142, Training Loss: 0.03524
Epoch: 5143, Training Loss: 0.02537
Epoch: 5143, Training Loss: 0.02778
Epoch: 5143, Training Loss: 0.02783
Epoch: 5143, Training Loss: 0.03523
Epoch: 5144, Training Loss: 0.02537
Epoch: 5144, Training Loss: 0.02777
Epoch: 5144, Training Loss: 0.02783
Epoch: 5144, Training Loss: 0.03522
Epoch: 5145, Training Loss: 0.02536
Epoch: 5145, Training Loss: 0.02777
Epoch: 5145, Training Loss: 0.02782
Epoch: 5145, Training Loss: 0.03521
Epoch: 5146, Training Loss: 0.02536
Epoch: 5146, Training Loss: 0.02776
Epoch: 5146, Training Loss: 0.02781
Epoch: 5146, Training Loss: 0.03521
Epoch: 5147, Training Loss: 0.02535
Epoch: 5147, Training Loss: 0.02776
Epoch: 5147, Training Loss: 0.02781
Epoch: 5147, Training Loss: 0.03520
Epoch: 5148, Training Loss: 0.02535
Epoch: 5148, Training Loss: 0.02775
Epoch: 5148, Training Loss: 0.02780
Epoch: 5148, Training Loss: 0.03519
Epoch: 5149, Training Loss: 0.02535
Epoch: 5149, Training Loss: 0.02775
Epoch: 5149, Training Loss: 0.02780
Epoch: 5149, Training Loss: 0.03519
Epoch: 5150, Training Loss: 0.02534
Epoch: 5150, Training Loss: 0.02774
Epoch: 5150, Training Loss: 0.02779
Epoch: 5150, Training Loss: 0.03518
Epoch: 5151, Training Loss: 0.02534
Epoch: 5151, Training Loss: 0.02774
Epoch: 5151, Training Loss: 0.02779
Epoch: 5151, Training Loss: 0.03517
Epoch: 5152, Training Loss: 0.02533
Epoch: 5152, Training Loss: 0.02773
Epoch: 5152, Training Loss: 0.02778
Epoch: 5152, Training Loss: 0.03517
Epoch: 5153, Training Loss: 0.02533
Epoch: 5153, Training Loss: 0.02773
Epoch: 5153, Training Loss: 0.02778
Epoch: 5153, Training Loss: 0.03516
Epoch: 5154, Training Loss: 0.02532
Epoch: 5154, Training Loss: 0.02772
Epoch: 5154, Training Loss: 0.02777
Epoch: 5154, Training Loss: 0.03515
Epoch: 5155, Training Loss: 0.02532
Epoch: 5155, Training Loss: 0.02772
Epoch: 5155, Training Loss: 0.02777
Epoch: 5155, Training Loss: 0.03514
Epoch: 5156, Training Loss: 0.02531
Epoch: 5156, Training Loss: 0.02771
Epoch: 5156, Training Loss: 0.02776
Epoch: 5156, Training Loss: 0.03514
Epoch: 5157, Training Loss: 0.02531
Epoch: 5157, Training Loss: 0.02770
Epoch: 5157, Training Loss: 0.02776
Epoch: 5157, Training Loss: 0.03513
Epoch: 5158, Training Loss: 0.02531
Epoch: 5158, Training Loss: 0.02770
Epoch: 5158, Training Loss: 0.02775
Epoch: 5158, Training Loss: 0.03512
Epoch: 5159, Training Loss: 0.02530
Epoch: 5159, Training Loss: 0.02769
Epoch: 5159, Training Loss: 0.02775
Epoch: 5159, Training Loss: 0.03512
Epoch: 5160, Training Loss: 0.02530
Epoch: 5160, Training Loss: 0.02769
Epoch: 5160, Training Loss: 0.02774
Epoch: 5160, Training Loss: 0.03511
Epoch: 5161, Training Loss: 0.02529
Epoch: 5161, Training Loss: 0.02768
Epoch: 5161, Training Loss: 0.02774
Epoch: 5161, Training Loss: 0.03510
Epoch: 5162, Training Loss: 0.02529
Epoch: 5162, Training Loss: 0.02768
Epoch: 5162, Training Loss: 0.02773
Epoch: 5162, Training Loss: 0.03510
Epoch: 5163, Training Loss: 0.02528
Epoch: 5163, Training Loss: 0.02767
Epoch: 5163, Training Loss: 0.02772
Epoch: 5163, Training Loss: 0.03509
Epoch: 5164, Training Loss: 0.02528
Epoch: 5164, Training Loss: 0.02767
Epoch: 5164, Training Loss: 0.02772
Epoch: 5164, Training Loss: 0.03508
Epoch: 5165, Training Loss: 0.02527
Epoch: 5165, Training Loss: 0.02766
Epoch: 5165, Training Loss: 0.02771
Epoch: 5165, Training Loss: 0.03507
Epoch: 5166, Training Loss: 0.02527
Epoch: 5166, Training Loss: 0.02766
Epoch: 5166, Training Loss: 0.02771
Epoch: 5166, Training Loss: 0.03507
Epoch: 5167, Training Loss: 0.02527
Epoch: 5167, Training Loss: 0.02765
Epoch: 5167, Training Loss: 0.02770
Epoch: 5167, Training Loss: 0.03506
Epoch: 5168, Training Loss: 0.02526
Epoch: 5168, Training Loss: 0.02765
Epoch: 5168, Training Loss: 0.02770
Epoch: 5168, Training Loss: 0.03505
Epoch: 5169, Training Loss: 0.02526
Epoch: 5169, Training Loss: 0.02764
Epoch: 5169, Training Loss: 0.02769
Epoch: 5169, Training Loss: 0.03505
Epoch: 5170, Training Loss: 0.02525
Epoch: 5170, Training Loss: 0.02764
Epoch: 5170, Training Loss: 0.02769
Epoch: 5170, Training Loss: 0.03504
Epoch: 5171, Training Loss: 0.02525
Epoch: 5171, Training Loss: 0.02763
Epoch: 5171, Training Loss: 0.02768
Epoch: 5171, Training Loss: 0.03503
Epoch: 5172, Training Loss: 0.02524
Epoch: 5172, Training Loss: 0.02763
Epoch: 5172, Training Loss: 0.02768
Epoch: 5172, Training Loss: 0.03503
Epoch: 5173, Training Loss: 0.02524
Epoch: 5173, Training Loss: 0.02762
Epoch: 5173, Training Loss: 0.02767
Epoch: 5173, Training Loss: 0.03502
Epoch: 5174, Training Loss: 0.02523
Epoch: 5174, Training Loss: 0.02762
Epoch: 5174, Training Loss: 0.02767
Epoch: 5174, Training Loss: 0.03501
Epoch: 5175, Training Loss: 0.02523
Epoch: 5175, Training Loss: 0.02761
Epoch: 5175, Training Loss: 0.02766
Epoch: 5175, Training Loss: 0.03501
Epoch: 5176, Training Loss: 0.02523
Epoch: 5176, Training Loss: 0.02761
Epoch: 5176, Training Loss: 0.02766
Epoch: 5176, Training Loss: 0.03500
Epoch: 5177, Training Loss: 0.02522
Epoch: 5177, Training Loss: 0.02760
Epoch: 5177, Training Loss: 0.02765
Epoch: 5177, Training Loss: 0.03499
Epoch: 5178, Training Loss: 0.02522
Epoch: 5178, Training Loss: 0.02759
Epoch: 5178, Training Loss: 0.02765
Epoch: 5178, Training Loss: 0.03499
Epoch: 5179, Training Loss: 0.02521
Epoch: 5179, Training Loss: 0.02759
Epoch: 5179, Training Loss: 0.02764
Epoch: 5179, Training Loss: 0.03498
Epoch: 5180, Training Loss: 0.02521
Epoch: 5180, Training Loss: 0.02758
Epoch: 5180, Training Loss: 0.02764
Epoch: 5180, Training Loss: 0.03497
Epoch: 5181, Training Loss: 0.02520
Epoch: 5181, Training Loss: 0.02758
Epoch: 5181, Training Loss: 0.02763
Epoch: 5181, Training Loss: 0.03496
Epoch: 5182, Training Loss: 0.02520
Epoch: 5182, Training Loss: 0.02757
Epoch: 5182, Training Loss: 0.02763
Epoch: 5182, Training Loss: 0.03496
Epoch: 5183, Training Loss: 0.02519
Epoch: 5183, Training Loss: 0.02757
Epoch: 5183, Training Loss: 0.02762
Epoch: 5183, Training Loss: 0.03495
Epoch: 5184, Training Loss: 0.02519
Epoch: 5184, Training Loss: 0.02756
Epoch: 5184, Training Loss: 0.02762
Epoch: 5184, Training Loss: 0.03494
Epoch: 5185, Training Loss: 0.02519
Epoch: 5185, Training Loss: 0.02756
Epoch: 5185, Training Loss: 0.02761
Epoch: 5185, Training Loss: 0.03494
Epoch: 5186, Training Loss: 0.02518
Epoch: 5186, Training Loss: 0.02755
Epoch: 5186, Training Loss: 0.02760
Epoch: 5186, Training Loss: 0.03493
Epoch: 5187, Training Loss: 0.02518
Epoch: 5187, Training Loss: 0.02755
Epoch: 5187, Training Loss: 0.02760
Epoch: 5187, Training Loss: 0.03492
Epoch: 5188, Training Loss: 0.02517
Epoch: 5188, Training Loss: 0.02754
Epoch: 5188, Training Loss: 0.02759
Epoch: 5188, Training Loss: 0.03492
Epoch: 5189, Training Loss: 0.02517
Epoch: 5189, Training Loss: 0.02754
Epoch: 5189, Training Loss: 0.02759
Epoch: 5189, Training Loss: 0.03491
Epoch: 5190, Training Loss: 0.02516
Epoch: 5190, Training Loss: 0.02753
Epoch: 5190, Training Loss: 0.02758
Epoch: 5190, Training Loss: 0.03490
Epoch: 5191, Training Loss: 0.02516
Epoch: 5191, Training Loss: 0.02753
Epoch: 5191, Training Loss: 0.02758
Epoch: 5191, Training Loss: 0.03490
Epoch: 5192, Training Loss: 0.02515
Epoch: 5192, Training Loss: 0.02752
Epoch: 5192, Training Loss: 0.02757
Epoch: 5192, Training Loss: 0.03489
Epoch: 5193, Training Loss: 0.02515
Epoch: 5193, Training Loss: 0.02752
Epoch: 5193, Training Loss: 0.02757
Epoch: 5193, Training Loss: 0.03488
Epoch: 5194, Training Loss: 0.02515
Epoch: 5194, Training Loss: 0.02751
Epoch: 5194, Training Loss: 0.02756
Epoch: 5194, Training Loss: 0.03488
Epoch: 5195, Training Loss: 0.02514
Epoch: 5195, Training Loss: 0.02751
Epoch: 5195, Training Loss: 0.02756
Epoch: 5195, Training Loss: 0.03487
Epoch: 5196, Training Loss: 0.02514
Epoch: 5196, Training Loss: 0.02750
Epoch: 5196, Training Loss: 0.02755
Epoch: 5196, Training Loss: 0.03486
Epoch: 5197, Training Loss: 0.02513
Epoch: 5197, Training Loss: 0.02750
Epoch: 5197, Training Loss: 0.02755
Epoch: 5197, Training Loss: 0.03485
Epoch: 5198, Training Loss: 0.02513
Epoch: 5198, Training Loss: 0.02749
Epoch: 5198, Training Loss: 0.02754
Epoch: 5198, Training Loss: 0.03485
Epoch: 5199, Training Loss: 0.02512
Epoch: 5199, Training Loss: 0.02749
Epoch: 5199, Training Loss: 0.02754
Epoch: 5199, Training Loss: 0.03484
Epoch: 5200, Training Loss: 0.02512
Epoch: 5200, Training Loss: 0.02748
Epoch: 5200, Training Loss: 0.02753
Epoch: 5200, Training Loss: 0.03483
Epoch: 5201, Training Loss: 0.02512
Epoch: 5201, Training Loss: 0.02748
Epoch: 5201, Training Loss: 0.02753
Epoch: 5201, Training Loss: 0.03483
Epoch: 5202, Training Loss: 0.02511
Epoch: 5202, Training Loss: 0.02747
Epoch: 5202, Training Loss: 0.02752
Epoch: 5202, Training Loss: 0.03482
Epoch: 5203, Training Loss: 0.02511
Epoch: 5203, Training Loss: 0.02747
Epoch: 5203, Training Loss: 0.02752
Epoch: 5203, Training Loss: 0.03481
Epoch: 5204, Training Loss: 0.02510
Epoch: 5204, Training Loss: 0.02746
Epoch: 5204, Training Loss: 0.02751
Epoch: 5204, Training Loss: 0.03481
Epoch: 5205, Training Loss: 0.02510
Epoch: 5205, Training Loss: 0.02746
Epoch: 5205, Training Loss: 0.02751
Epoch: 5205, Training Loss: 0.03480
Epoch: 5206, Training Loss: 0.02509
Epoch: 5206, Training Loss: 0.02745
Epoch: 5206, Training Loss: 0.02750
Epoch: 5206, Training Loss: 0.03479
Epoch: 5207, Training Loss: 0.02509
Epoch: 5207, Training Loss: 0.02744
Epoch: 5207, Training Loss: 0.02750
Epoch: 5207, Training Loss: 0.03479
Epoch: 5208, Training Loss: 0.02508
Epoch: 5208, Training Loss: 0.02744
Epoch: 5208, Training Loss: 0.02749
Epoch: 5208, Training Loss: 0.03478
Epoch: 5209, Training Loss: 0.02508
Epoch: 5209, Training Loss: 0.02743
Epoch: 5209, Training Loss: 0.02749
Epoch: 5209, Training Loss: 0.03477
Epoch: 5210, Training Loss: 0.02508
Epoch: 5210, Training Loss: 0.02743
Epoch: 5210, Training Loss: 0.02748
Epoch: 5210, Training Loss: 0.03477
Epoch: 5211, Training Loss: 0.02507
Epoch: 5211, Training Loss: 0.02742
Epoch: 5211, Training Loss: 0.02748
Epoch: 5211, Training Loss: 0.03476
Epoch: 5212, Training Loss: 0.02507
Epoch: 5212, Training Loss: 0.02742
Epoch: 5212, Training Loss: 0.02747
Epoch: 5212, Training Loss: 0.03475
Epoch: 5213, Training Loss: 0.02506
Epoch: 5213, Training Loss: 0.02741
Epoch: 5213, Training Loss: 0.02747
Epoch: 5213, Training Loss: 0.03475
Epoch: 5214, Training Loss: 0.02506
Epoch: 5214, Training Loss: 0.02741
Epoch: 5214, Training Loss: 0.02746
Epoch: 5214, Training Loss: 0.03474
Epoch: 5215, Training Loss: 0.02505
Epoch: 5215, Training Loss: 0.02740
Epoch: 5215, Training Loss: 0.02746
Epoch: 5215, Training Loss: 0.03473
Epoch: 5216, Training Loss: 0.02505
Epoch: 5216, Training Loss: 0.02740
Epoch: 5216, Training Loss: 0.02745
Epoch: 5216, Training Loss: 0.03473
Epoch: 5217, Training Loss: 0.02505
Epoch: 5217, Training Loss: 0.02739
Epoch: 5217, Training Loss: 0.02745
Epoch: 5217, Training Loss: 0.03472
Epoch: 5218, Training Loss: 0.02504
Epoch: 5218, Training Loss: 0.02739
Epoch: 5218, Training Loss: 0.02744
Epoch: 5218, Training Loss: 0.03471
Epoch: 5219, Training Loss: 0.02504
Epoch: 5219, Training Loss: 0.02738
Epoch: 5219, Training Loss: 0.02743
Epoch: 5219, Training Loss: 0.03471
Epoch: 5220, Training Loss: 0.02503
Epoch: 5220, Training Loss: 0.02738
Epoch: 5220, Training Loss: 0.02743
Epoch: 5220, Training Loss: 0.03470
Epoch: 5221, Training Loss: 0.02503
Epoch: 5221, Training Loss: 0.02737
Epoch: 5221, Training Loss: 0.02742
Epoch: 5221, Training Loss: 0.03469
Epoch: 5222, Training Loss: 0.02502
Epoch: 5222, Training Loss: 0.02737
Epoch: 5222, Training Loss: 0.02742
Epoch: 5222, Training Loss: 0.03469
Epoch: 5223, Training Loss: 0.02502
Epoch: 5223, Training Loss: 0.02736
Epoch: 5223, Training Loss: 0.02741
Epoch: 5223, Training Loss: 0.03468
Epoch: 5224, Training Loss: 0.02502
Epoch: 5224, Training Loss: 0.02736
Epoch: 5224, Training Loss: 0.02741
Epoch: 5224, Training Loss: 0.03467
Epoch: 5225, Training Loss: 0.02501
Epoch: 5225, Training Loss: 0.02735
Epoch: 5225, Training Loss: 0.02740
Epoch: 5225, Training Loss: 0.03467
Epoch: 5226, Training Loss: 0.02501
Epoch: 5226, Training Loss: 0.02735
Epoch: 5226, Training Loss: 0.02740
Epoch: 5226, Training Loss: 0.03466
Epoch: 5227, Training Loss: 0.02500
Epoch: 5227, Training Loss: 0.02734
Epoch: 5227, Training Loss: 0.02739
Epoch: 5227, Training Loss: 0.03465
Epoch: 5228, Training Loss: 0.02500
Epoch: 5228, Training Loss: 0.02734
Epoch: 5228, Training Loss: 0.02739
Epoch: 5228, Training Loss: 0.03465
Epoch: 5229, Training Loss: 0.02499
Epoch: 5229, Training Loss: 0.02733
Epoch: 5229, Training Loss: 0.02738
Epoch: 5229, Training Loss: 0.03464
Epoch: 5230, Training Loss: 0.02499
Epoch: 5230, Training Loss: 0.02733
Epoch: 5230, Training Loss: 0.02738
Epoch: 5230, Training Loss: 0.03463
Epoch: 5231, Training Loss: 0.02498
Epoch: 5231, Training Loss: 0.02732
Epoch: 5231, Training Loss: 0.02737
Epoch: 5231, Training Loss: 0.03463
Epoch: 5232, Training Loss: 0.02498
Epoch: 5232, Training Loss: 0.02732
Epoch: 5232, Training Loss: 0.02737
Epoch: 5232, Training Loss: 0.03462
Epoch: 5233, Training Loss: 0.02498
Epoch: 5233, Training Loss: 0.02731
Epoch: 5233, Training Loss: 0.02736
Epoch: 5233, Training Loss: 0.03461
Epoch: 5234, Training Loss: 0.02497
Epoch: 5234, Training Loss: 0.02731
Epoch: 5234, Training Loss: 0.02736
Epoch: 5234, Training Loss: 0.03461
Epoch: 5235, Training Loss: 0.02497
Epoch: 5235, Training Loss: 0.02730
Epoch: 5235, Training Loss: 0.02735
Epoch: 5235, Training Loss: 0.03460
Epoch: 5236, Training Loss: 0.02496
Epoch: 5236, Training Loss: 0.02730
Epoch: 5236, Training Loss: 0.02735
Epoch: 5236, Training Loss: 0.03459
Epoch: 5237, Training Loss: 0.02496
Epoch: 5237, Training Loss: 0.02729
Epoch: 5237, Training Loss: 0.02734
Epoch: 5237, Training Loss: 0.03459
Epoch: 5238, Training Loss: 0.02495
Epoch: 5238, Training Loss: 0.02729
Epoch: 5238, Training Loss: 0.02734
Epoch: 5238, Training Loss: 0.03458
Epoch: 5239, Training Loss: 0.02495
Epoch: 5239, Training Loss: 0.02728
Epoch: 5239, Training Loss: 0.02733
Epoch: 5239, Training Loss: 0.03457
Epoch: 5240, Training Loss: 0.02495
Epoch: 5240, Training Loss: 0.02728
Epoch: 5240, Training Loss: 0.02733
Epoch: 5240, Training Loss: 0.03457
Epoch: 5241, Training Loss: 0.02494
Epoch: 5241, Training Loss: 0.02727
Epoch: 5241, Training Loss: 0.02732
Epoch: 5241, Training Loss: 0.03456
Epoch: 5242, Training Loss: 0.02494
Epoch: 5242, Training Loss: 0.02727
Epoch: 5242, Training Loss: 0.02732
Epoch: 5242, Training Loss: 0.03455
Epoch: 5243, Training Loss: 0.02493
Epoch: 5243, Training Loss: 0.02726
Epoch: 5243, Training Loss: 0.02731
Epoch: 5243, Training Loss: 0.03455
Epoch: 5244, Training Loss: 0.02493
Epoch: 5244, Training Loss: 0.02726
Epoch: 5244, Training Loss: 0.02731
Epoch: 5244, Training Loss: 0.03454
Epoch: 5245, Training Loss: 0.02492
Epoch: 5245, Training Loss: 0.02725
Epoch: 5245, Training Loss: 0.02730
Epoch: 5245, Training Loss: 0.03453
Epoch: 5246, Training Loss: 0.02492
Epoch: 5246, Training Loss: 0.02725
Epoch: 5246, Training Loss: 0.02730
Epoch: 5246, Training Loss: 0.03453
Epoch: 5247, Training Loss: 0.02492
Epoch: 5247, Training Loss: 0.02724
Epoch: 5247, Training Loss: 0.02729
Epoch: 5247, Training Loss: 0.03452
Epoch: 5248, Training Loss: 0.02491
Epoch: 5248, Training Loss: 0.02724
Epoch: 5248, Training Loss: 0.02729
Epoch: 5248, Training Loss: 0.03451
Epoch: 5249, Training Loss: 0.02491
Epoch: 5249, Training Loss: 0.02723
Epoch: 5249, Training Loss: 0.02728
Epoch: 5249, Training Loss: 0.03451
Epoch: 5250, Training Loss: 0.02490
Epoch: 5250, Training Loss: 0.02723
Epoch: 5250, Training Loss: 0.02728
Epoch: 5250, Training Loss: 0.03450
Epoch: 5251, Training Loss: 0.02490
Epoch: 5251, Training Loss: 0.02722
Epoch: 5251, Training Loss: 0.02727
Epoch: 5251, Training Loss: 0.03449
Epoch: 5252, Training Loss: 0.02489
Epoch: 5252, Training Loss: 0.02722
Epoch: 5252, Training Loss: 0.02727
Epoch: 5252, Training Loss: 0.03449
Epoch: 5253, Training Loss: 0.02489
Epoch: 5253, Training Loss: 0.02721
Epoch: 5253, Training Loss: 0.02726
Epoch: 5253, Training Loss: 0.03448
Epoch: 5254, Training Loss: 0.02489
Epoch: 5254, Training Loss: 0.02721
Epoch: 5254, Training Loss: 0.02726
Epoch: 5254, Training Loss: 0.03447
Epoch: 5255, Training Loss: 0.02488
Epoch: 5255, Training Loss: 0.02720
Epoch: 5255, Training Loss: 0.02725
Epoch: 5255, Training Loss: 0.03447
Epoch: 5256, Training Loss: 0.02488
Epoch: 5256, Training Loss: 0.02720
Epoch: 5256, Training Loss: 0.02725
Epoch: 5256, Training Loss: 0.03446
Epoch: 5257, Training Loss: 0.02487
Epoch: 5257, Training Loss: 0.02719
Epoch: 5257, Training Loss: 0.02724
Epoch: 5257, Training Loss: 0.03445
Epoch: 5258, Training Loss: 0.02487
Epoch: 5258, Training Loss: 0.02719
Epoch: 5258, Training Loss: 0.02724
Epoch: 5258, Training Loss: 0.03445
Epoch: 5259, Training Loss: 0.02487
Epoch: 5259, Training Loss: 0.02718
Epoch: 5259, Training Loss: 0.02723
Epoch: 5259, Training Loss: 0.03444
Epoch: 5260, Training Loss: 0.02486
Epoch: 5260, Training Loss: 0.02718
Epoch: 5260, Training Loss: 0.02723
Epoch: 5260, Training Loss: 0.03443
Epoch: 5261, Training Loss: 0.02486
Epoch: 5261, Training Loss: 0.02717
Epoch: 5261, Training Loss: 0.02722
Epoch: 5261, Training Loss: 0.03443
Epoch: 5262, Training Loss: 0.02485
Epoch: 5262, Training Loss: 0.02717
Epoch: 5262, Training Loss: 0.02722
Epoch: 5262, Training Loss: 0.03442
Epoch: 5263, Training Loss: 0.02485
Epoch: 5263, Training Loss: 0.02716
Epoch: 5263, Training Loss: 0.02721
Epoch: 5263, Training Loss: 0.03441
Epoch: 5264, Training Loss: 0.02484
Epoch: 5264, Training Loss: 0.02716
Epoch: 5264, Training Loss: 0.02721
Epoch: 5264, Training Loss: 0.03441
Epoch: 5265, Training Loss: 0.02484
Epoch: 5265, Training Loss: 0.02715
Epoch: 5265, Training Loss: 0.02720
Epoch: 5265, Training Loss: 0.03440
Epoch: 5266, Training Loss: 0.02484
Epoch: 5266, Training Loss: 0.02715
Epoch: 5266, Training Loss: 0.02720
Epoch: 5266, Training Loss: 0.03439
Epoch: 5267, Training Loss: 0.02483
Epoch: 5267, Training Loss: 0.02714
Epoch: 5267, Training Loss: 0.02719
Epoch: 5267, Training Loss: 0.03439
Epoch: 5268, Training Loss: 0.02483
Epoch: 5268, Training Loss: 0.02714
Epoch: 5268, Training Loss: 0.02719
Epoch: 5268, Training Loss: 0.03438
Epoch: 5269, Training Loss: 0.02482
Epoch: 5269, Training Loss: 0.02713
Epoch: 5269, Training Loss: 0.02718
Epoch: 5269, Training Loss: 0.03437
Epoch: 5270, Training Loss: 0.02482
Epoch: 5270, Training Loss: 0.02713
Epoch: 5270, Training Loss: 0.02718
Epoch: 5270, Training Loss: 0.03437
Epoch: 5271, Training Loss: 0.02481
Epoch: 5271, Training Loss: 0.02712
Epoch: 5271, Training Loss: 0.02717
Epoch: 5271, Training Loss: 0.03436
Epoch: 5272, Training Loss: 0.02481
Epoch: 5272, Training Loss: 0.02712
Epoch: 5272, Training Loss: 0.02717
Epoch: 5272, Training Loss: 0.03435
Epoch: 5273, Training Loss: 0.02481
Epoch: 5273, Training Loss: 0.02711
Epoch: 5273, Training Loss: 0.02716
Epoch: 5273, Training Loss: 0.03435
Epoch: 5274, Training Loss: 0.02480
Epoch: 5274, Training Loss: 0.02711
Epoch: 5274, Training Loss: 0.02716
Epoch: 5274, Training Loss: 0.03434
Epoch: 5275, Training Loss: 0.02480
Epoch: 5275, Training Loss: 0.02710
Epoch: 5275, Training Loss: 0.02715
Epoch: 5275, Training Loss: 0.03433
Epoch: 5276, Training Loss: 0.02479
Epoch: 5276, Training Loss: 0.02710
Epoch: 5276, Training Loss: 0.02715
Epoch: 5276, Training Loss: 0.03433
Epoch: 5277, Training Loss: 0.02479
Epoch: 5277, Training Loss: 0.02709
Epoch: 5277, Training Loss: 0.02714
Epoch: 5277, Training Loss: 0.03432
Epoch: 5278, Training Loss: 0.02478
Epoch: 5278, Training Loss: 0.02709
Epoch: 5278, Training Loss: 0.02714
Epoch: 5278, Training Loss: 0.03432
Epoch: 5279, Training Loss: 0.02478
Epoch: 5279, Training Loss: 0.02708
Epoch: 5279, Training Loss: 0.02713
Epoch: 5279, Training Loss: 0.03431
Epoch: 5280, Training Loss: 0.02478
Epoch: 5280, Training Loss: 0.02708
Epoch: 5280, Training Loss: 0.02713
Epoch: 5280, Training Loss: 0.03430
Epoch: 5281, Training Loss: 0.02477
Epoch: 5281, Training Loss: 0.02707
Epoch: 5281, Training Loss: 0.02712
Epoch: 5281, Training Loss: 0.03430
Epoch: 5282, Training Loss: 0.02477
Epoch: 5282, Training Loss: 0.02707
Epoch: 5282, Training Loss: 0.02712
Epoch: 5282, Training Loss: 0.03429
Epoch: 5283, Training Loss: 0.02476
Epoch: 5283, Training Loss: 0.02706
Epoch: 5283, Training Loss: 0.02711
Epoch: 5283, Training Loss: 0.03428
Epoch: 5284, Training Loss: 0.02476
Epoch: 5284, Training Loss: 0.02706
Epoch: 5284, Training Loss: 0.02711
Epoch: 5284, Training Loss: 0.03428
Epoch: 5285, Training Loss: 0.02476
Epoch: 5285, Training Loss: 0.02705
Epoch: 5285, Training Loss: 0.02710
Epoch: 5285, Training Loss: 0.03427
Epoch: 5286, Training Loss: 0.02475
Epoch: 5286, Training Loss: 0.02705
Epoch: 5286, Training Loss: 0.02710
Epoch: 5286, Training Loss: 0.03426
Epoch: 5287, Training Loss: 0.02475
Epoch: 5287, Training Loss: 0.02704
Epoch: 5287, Training Loss: 0.02709
Epoch: 5287, Training Loss: 0.03426
Epoch: 5288, Training Loss: 0.02474
Epoch: 5288, Training Loss: 0.02704
Epoch: 5288, Training Loss: 0.02709
Epoch: 5288, Training Loss: 0.03425
Epoch: 5289, Training Loss: 0.02474
Epoch: 5289, Training Loss: 0.02703
Epoch: 5289, Training Loss: 0.02708
Epoch: 5289, Training Loss: 0.03424
Epoch: 5290, Training Loss: 0.02473
Epoch: 5290, Training Loss: 0.02703
Epoch: 5290, Training Loss: 0.02708
Epoch: 5290, Training Loss: 0.03424
Epoch: 5291, Training Loss: 0.02473
Epoch: 5291, Training Loss: 0.02702
Epoch: 5291, Training Loss: 0.02707
Epoch: 5291, Training Loss: 0.03423
Epoch: 5292, Training Loss: 0.02473
Epoch: 5292, Training Loss: 0.02702
Epoch: 5292, Training Loss: 0.02707
Epoch: 5292, Training Loss: 0.03422
Epoch: 5293, Training Loss: 0.02472
Epoch: 5293, Training Loss: 0.02701
Epoch: 5293, Training Loss: 0.02706
Epoch: 5293, Training Loss: 0.03422
Epoch: 5294, Training Loss: 0.02472
Epoch: 5294, Training Loss: 0.02701
Epoch: 5294, Training Loss: 0.02706
Epoch: 5294, Training Loss: 0.03421
Epoch: 5295, Training Loss: 0.02471
Epoch: 5295, Training Loss: 0.02700
Epoch: 5295, Training Loss: 0.02705
Epoch: 5295, Training Loss: 0.03421
Epoch: 5296, Training Loss: 0.02471
Epoch: 5296, Training Loss: 0.02700
Epoch: 5296, Training Loss: 0.02705
Epoch: 5296, Training Loss: 0.03420
Epoch: 5297, Training Loss: 0.02471
Epoch: 5297, Training Loss: 0.02699
Epoch: 5297, Training Loss: 0.02705
Epoch: 5297, Training Loss: 0.03419
Epoch: 5298, Training Loss: 0.02470
Epoch: 5298, Training Loss: 0.02699
Epoch: 5298, Training Loss: 0.02704
Epoch: 5298, Training Loss: 0.03419
Epoch: 5299, Training Loss: 0.02470
Epoch: 5299, Training Loss: 0.02699
Epoch: 5299, Training Loss: 0.02704
Epoch: 5299, Training Loss: 0.03418
Epoch: 5300, Training Loss: 0.02469
Epoch: 5300, Training Loss: 0.02698
Epoch: 5300, Training Loss: 0.02703
Epoch: 5300, Training Loss: 0.03417
Epoch: 5301, Training Loss: 0.02469
Epoch: 5301, Training Loss: 0.02698
Epoch: 5301, Training Loss: 0.02703
Epoch: 5301, Training Loss: 0.03417
Epoch: 5302, Training Loss: 0.02468
Epoch: 5302, Training Loss: 0.02697
Epoch: 5302, Training Loss: 0.02702
Epoch: 5302, Training Loss: 0.03416
Epoch: 5303, Training Loss: 0.02468
Epoch: 5303, Training Loss: 0.02697
Epoch: 5303, Training Loss: 0.02702
Epoch: 5303, Training Loss: 0.03415
Epoch: 5304, Training Loss: 0.02468
Epoch: 5304, Training Loss: 0.02696
Epoch: 5304, Training Loss: 0.02701
Epoch: 5304, Training Loss: 0.03415
Epoch: 5305, Training Loss: 0.02467
Epoch: 5305, Training Loss: 0.02696
Epoch: 5305, Training Loss: 0.02701
Epoch: 5305, Training Loss: 0.03414
Epoch: 5306, Training Loss: 0.02467
Epoch: 5306, Training Loss: 0.02695
Epoch: 5306, Training Loss: 0.02700
Epoch: 5306, Training Loss: 0.03413
Epoch: 5307, Training Loss: 0.02466
Epoch: 5307, Training Loss: 0.02695
Epoch: 5307, Training Loss: 0.02700
Epoch: 5307, Training Loss: 0.03413
Epoch: 5308, Training Loss: 0.02466
Epoch: 5308, Training Loss: 0.02694
Epoch: 5308, Training Loss: 0.02699
Epoch: 5308, Training Loss: 0.03412
Epoch: 5309, Training Loss: 0.02466
Epoch: 5309, Training Loss: 0.02694
Epoch: 5309, Training Loss: 0.02699
Epoch: 5309, Training Loss: 0.03412
Epoch: 5310, Training Loss: 0.02465
Epoch: 5310, Training Loss: 0.02693
Epoch: 5310, Training Loss: 0.02698
Epoch: 5310, Training Loss: 0.03411
Epoch: 5311, Training Loss: 0.02465
Epoch: 5311, Training Loss: 0.02693
Epoch: 5311, Training Loss: 0.02698
Epoch: 5311, Training Loss: 0.03410
Epoch: 5312, Training Loss: 0.02464
Epoch: 5312, Training Loss: 0.02692
Epoch: 5312, Training Loss: 0.02697
Epoch: 5312, Training Loss: 0.03410
Epoch: 5313, Training Loss: 0.02464
Epoch: 5313, Training Loss: 0.02692
Epoch: 5313, Training Loss: 0.02697
Epoch: 5313, Training Loss: 0.03409
Epoch: 5314, Training Loss: 0.02463
Epoch: 5314, Training Loss: 0.02691
Epoch: 5314, Training Loss: 0.02696
Epoch: 5314, Training Loss: 0.03408
Epoch: 5315, Training Loss: 0.02463
Epoch: 5315, Training Loss: 0.02691
Epoch: 5315, Training Loss: 0.02696
Epoch: 5315, Training Loss: 0.03408
Epoch: 5316, Training Loss: 0.02463
Epoch: 5316, Training Loss: 0.02690
Epoch: 5316, Training Loss: 0.02695
Epoch: 5316, Training Loss: 0.03407
Epoch: 5317, Training Loss: 0.02462
Epoch: 5317, Training Loss: 0.02690
Epoch: 5317, Training Loss: 0.02695
Epoch: 5317, Training Loss: 0.03406
Epoch: 5318, Training Loss: 0.02462
Epoch: 5318, Training Loss: 0.02689
Epoch: 5318, Training Loss: 0.02694
Epoch: 5318, Training Loss: 0.03406
Epoch: 5319, Training Loss: 0.02461
Epoch: 5319, Training Loss: 0.02689
Epoch: 5319, Training Loss: 0.02694
Epoch: 5319, Training Loss: 0.03405
Epoch: 5320, Training Loss: 0.02461
Epoch: 5320, Training Loss: 0.02688
Epoch: 5320, Training Loss: 0.02693
Epoch: 5320, Training Loss: 0.03404
Epoch: 5321, Training Loss: 0.02461
Epoch: 5321, Training Loss: 0.02688
Epoch: 5321, Training Loss: 0.02693
Epoch: 5321, Training Loss: 0.03404
Epoch: 5322, Training Loss: 0.02460
Epoch: 5322, Training Loss: 0.02687
Epoch: 5322, Training Loss: 0.02692
Epoch: 5322, Training Loss: 0.03403
Epoch: 5323, Training Loss: 0.02460
Epoch: 5323, Training Loss: 0.02687
Epoch: 5323, Training Loss: 0.02692
Epoch: 5323, Training Loss: 0.03403
Epoch: 5324, Training Loss: 0.02459
Epoch: 5324, Training Loss: 0.02686
Epoch: 5324, Training Loss: 0.02691
Epoch: 5324, Training Loss: 0.03402
Epoch: 5325, Training Loss: 0.02459
Epoch: 5325, Training Loss: 0.02686
Epoch: 5325, Training Loss: 0.02691
Epoch: 5325, Training Loss: 0.03401
Epoch: 5326, Training Loss: 0.02459
Epoch: 5326, Training Loss: 0.02685
Epoch: 5326, Training Loss: 0.02690
Epoch: 5326, Training Loss: 0.03401
Epoch: 5327, Training Loss: 0.02458
Epoch: 5327, Training Loss: 0.02685
Epoch: 5327, Training Loss: 0.02690
Epoch: 5327, Training Loss: 0.03400
Epoch: 5328, Training Loss: 0.02458
Epoch: 5328, Training Loss: 0.02684
Epoch: 5328, Training Loss: 0.02689
Epoch: 5328, Training Loss: 0.03399
[... four per-sample losses are printed each epoch; repetitive output for epochs 5329–5576 elided. Over this range each loss decreases very slowly (e.g. 0.02457 → 0.02361 and 0.03399 → 0.03251), illustrating the slow convergence of the quadratic cost with sigmoid units ...]
Epoch: 5577, Training Loss: 0.02361
Epoch: 5577, Training Loss: 0.02572
Epoch: 5577, Training Loss: 0.02577
Epoch: 5577, Training Loss: 0.03251
Epoch: 5578, Training Loss: 0.02361
Epoch: 5578, Training Loss: 0.02572
Epoch: 5578, Training Loss: 0.02576
Epoch: 5578, Training Loss: 0.03251
Epoch: 5579, Training Loss: 0.02361
Epoch: 5579, Training Loss: 0.02571
Epoch: 5579, Training Loss: 0.02576
Epoch: 5579, Training Loss: 0.03250
Epoch: 5580, Training Loss: 0.02360
Epoch: 5580, Training Loss: 0.02571
Epoch: 5580, Training Loss: 0.02575
Epoch: 5580, Training Loss: 0.03250
Epoch: 5581, Training Loss: 0.02360
Epoch: 5581, Training Loss: 0.02570
Epoch: 5581, Training Loss: 0.02575
Epoch: 5581, Training Loss: 0.03249
Epoch: 5582, Training Loss: 0.02360
Epoch: 5582, Training Loss: 0.02570
Epoch: 5582, Training Loss: 0.02575
Epoch: 5582, Training Loss: 0.03248
Epoch: 5583, Training Loss: 0.02359
Epoch: 5583, Training Loss: 0.02570
Epoch: 5583, Training Loss: 0.02574
Epoch: 5583, Training Loss: 0.03248
Epoch: 5584, Training Loss: 0.02359
Epoch: 5584, Training Loss: 0.02569
Epoch: 5584, Training Loss: 0.02574
Epoch: 5584, Training Loss: 0.03247
Epoch: 5585, Training Loss: 0.02358
Epoch: 5585, Training Loss: 0.02569
Epoch: 5585, Training Loss: 0.02573
Epoch: 5585, Training Loss: 0.03247
Epoch: 5586, Training Loss: 0.02358
Epoch: 5586, Training Loss: 0.02568
Epoch: 5586, Training Loss: 0.02573
Epoch: 5586, Training Loss: 0.03246
Epoch: 5587, Training Loss: 0.02358
Epoch: 5587, Training Loss: 0.02568
Epoch: 5587, Training Loss: 0.02573
Epoch: 5587, Training Loss: 0.03246
Epoch: 5588, Training Loss: 0.02357
Epoch: 5588, Training Loss: 0.02567
Epoch: 5588, Training Loss: 0.02572
Epoch: 5588, Training Loss: 0.03245
Epoch: 5589, Training Loss: 0.02357
Epoch: 5589, Training Loss: 0.02567
Epoch: 5589, Training Loss: 0.02572
Epoch: 5589, Training Loss: 0.03245
Epoch: 5590, Training Loss: 0.02357
Epoch: 5590, Training Loss: 0.02567
Epoch: 5590, Training Loss: 0.02571
Epoch: 5590, Training Loss: 0.03244
Epoch: 5591, Training Loss: 0.02356
Epoch: 5591, Training Loss: 0.02566
Epoch: 5591, Training Loss: 0.02571
Epoch: 5591, Training Loss: 0.03243
Epoch: 5592, Training Loss: 0.02356
Epoch: 5592, Training Loss: 0.02566
Epoch: 5592, Training Loss: 0.02570
Epoch: 5592, Training Loss: 0.03243
Epoch: 5593, Training Loss: 0.02356
Epoch: 5593, Training Loss: 0.02565
Epoch: 5593, Training Loss: 0.02570
Epoch: 5593, Training Loss: 0.03242
Epoch: 5594, Training Loss: 0.02355
Epoch: 5594, Training Loss: 0.02565
Epoch: 5594, Training Loss: 0.02570
Epoch: 5594, Training Loss: 0.03242
Epoch: 5595, Training Loss: 0.02355
Epoch: 5595, Training Loss: 0.02564
Epoch: 5595, Training Loss: 0.02569
Epoch: 5595, Training Loss: 0.03241
Epoch: 5596, Training Loss: 0.02354
Epoch: 5596, Training Loss: 0.02564
Epoch: 5596, Training Loss: 0.02569
Epoch: 5596, Training Loss: 0.03241
Epoch: 5597, Training Loss: 0.02354
Epoch: 5597, Training Loss: 0.02564
Epoch: 5597, Training Loss: 0.02568
Epoch: 5597, Training Loss: 0.03240
Epoch: 5598, Training Loss: 0.02354
Epoch: 5598, Training Loss: 0.02563
Epoch: 5598, Training Loss: 0.02568
Epoch: 5598, Training Loss: 0.03240
Epoch: 5599, Training Loss: 0.02353
Epoch: 5599, Training Loss: 0.02563
Epoch: 5599, Training Loss: 0.02567
Epoch: 5599, Training Loss: 0.03239
Epoch: 5600, Training Loss: 0.02353
Epoch: 5600, Training Loss: 0.02562
Epoch: 5600, Training Loss: 0.02567
Epoch: 5600, Training Loss: 0.03238
Epoch: 5601, Training Loss: 0.02353
Epoch: 5601, Training Loss: 0.02562
Epoch: 5601, Training Loss: 0.02567
Epoch: 5601, Training Loss: 0.03238
Epoch: 5602, Training Loss: 0.02352
Epoch: 5602, Training Loss: 0.02561
Epoch: 5602, Training Loss: 0.02566
Epoch: 5602, Training Loss: 0.03237
Epoch: 5603, Training Loss: 0.02352
Epoch: 5603, Training Loss: 0.02561
Epoch: 5603, Training Loss: 0.02566
Epoch: 5603, Training Loss: 0.03237
Epoch: 5604, Training Loss: 0.02352
Epoch: 5604, Training Loss: 0.02561
Epoch: 5604, Training Loss: 0.02565
Epoch: 5604, Training Loss: 0.03236
Epoch: 5605, Training Loss: 0.02351
Epoch: 5605, Training Loss: 0.02560
Epoch: 5605, Training Loss: 0.02565
Epoch: 5605, Training Loss: 0.03236
Epoch: 5606, Training Loss: 0.02351
Epoch: 5606, Training Loss: 0.02560
Epoch: 5606, Training Loss: 0.02564
Epoch: 5606, Training Loss: 0.03235
Epoch: 5607, Training Loss: 0.02351
Epoch: 5607, Training Loss: 0.02559
Epoch: 5607, Training Loss: 0.02564
Epoch: 5607, Training Loss: 0.03235
Epoch: 5608, Training Loss: 0.02350
Epoch: 5608, Training Loss: 0.02559
Epoch: 5608, Training Loss: 0.02564
Epoch: 5608, Training Loss: 0.03234
Epoch: 5609, Training Loss: 0.02350
Epoch: 5609, Training Loss: 0.02559
Epoch: 5609, Training Loss: 0.02563
Epoch: 5609, Training Loss: 0.03233
Epoch: 5610, Training Loss: 0.02349
Epoch: 5610, Training Loss: 0.02558
Epoch: 5610, Training Loss: 0.02563
Epoch: 5610, Training Loss: 0.03233
Epoch: 5611, Training Loss: 0.02349
Epoch: 5611, Training Loss: 0.02558
Epoch: 5611, Training Loss: 0.02562
Epoch: 5611, Training Loss: 0.03232
Epoch: 5612, Training Loss: 0.02349
Epoch: 5612, Training Loss: 0.02557
Epoch: 5612, Training Loss: 0.02562
Epoch: 5612, Training Loss: 0.03232
Epoch: 5613, Training Loss: 0.02348
Epoch: 5613, Training Loss: 0.02557
Epoch: 5613, Training Loss: 0.02562
Epoch: 5613, Training Loss: 0.03231
Epoch: 5614, Training Loss: 0.02348
Epoch: 5614, Training Loss: 0.02556
Epoch: 5614, Training Loss: 0.02561
Epoch: 5614, Training Loss: 0.03231
Epoch: 5615, Training Loss: 0.02348
Epoch: 5615, Training Loss: 0.02556
Epoch: 5615, Training Loss: 0.02561
Epoch: 5615, Training Loss: 0.03230
Epoch: 5616, Training Loss: 0.02347
Epoch: 5616, Training Loss: 0.02556
Epoch: 5616, Training Loss: 0.02560
Epoch: 5616, Training Loss: 0.03230
Epoch: 5617, Training Loss: 0.02347
Epoch: 5617, Training Loss: 0.02555
Epoch: 5617, Training Loss: 0.02560
Epoch: 5617, Training Loss: 0.03229
Epoch: 5618, Training Loss: 0.02347
Epoch: 5618, Training Loss: 0.02555
Epoch: 5618, Training Loss: 0.02559
Epoch: 5618, Training Loss: 0.03229
Epoch: 5619, Training Loss: 0.02346
Epoch: 5619, Training Loss: 0.02554
Epoch: 5619, Training Loss: 0.02559
Epoch: 5619, Training Loss: 0.03228
Epoch: 5620, Training Loss: 0.02346
Epoch: 5620, Training Loss: 0.02554
Epoch: 5620, Training Loss: 0.02559
Epoch: 5620, Training Loss: 0.03227
Epoch: 5621, Training Loss: 0.02345
Epoch: 5621, Training Loss: 0.02554
Epoch: 5621, Training Loss: 0.02558
Epoch: 5621, Training Loss: 0.03227
Epoch: 5622, Training Loss: 0.02345
Epoch: 5622, Training Loss: 0.02553
Epoch: 5622, Training Loss: 0.02558
Epoch: 5622, Training Loss: 0.03226
Epoch: 5623, Training Loss: 0.02345
Epoch: 5623, Training Loss: 0.02553
Epoch: 5623, Training Loss: 0.02557
Epoch: 5623, Training Loss: 0.03226
Epoch: 5624, Training Loss: 0.02344
Epoch: 5624, Training Loss: 0.02552
Epoch: 5624, Training Loss: 0.02557
Epoch: 5624, Training Loss: 0.03225
Epoch: 5625, Training Loss: 0.02344
Epoch: 5625, Training Loss: 0.02552
Epoch: 5625, Training Loss: 0.02557
Epoch: 5625, Training Loss: 0.03225
Epoch: 5626, Training Loss: 0.02344
Epoch: 5626, Training Loss: 0.02551
Epoch: 5626, Training Loss: 0.02556
Epoch: 5626, Training Loss: 0.03224
Epoch: 5627, Training Loss: 0.02343
Epoch: 5627, Training Loss: 0.02551
Epoch: 5627, Training Loss: 0.02556
Epoch: 5627, Training Loss: 0.03224
Epoch: 5628, Training Loss: 0.02343
Epoch: 5628, Training Loss: 0.02551
Epoch: 5628, Training Loss: 0.02555
Epoch: 5628, Training Loss: 0.03223
Epoch: 5629, Training Loss: 0.02343
Epoch: 5629, Training Loss: 0.02550
Epoch: 5629, Training Loss: 0.02555
Epoch: 5629, Training Loss: 0.03223
Epoch: 5630, Training Loss: 0.02342
Epoch: 5630, Training Loss: 0.02550
Epoch: 5630, Training Loss: 0.02554
Epoch: 5630, Training Loss: 0.03222
Epoch: 5631, Training Loss: 0.02342
Epoch: 5631, Training Loss: 0.02549
Epoch: 5631, Training Loss: 0.02554
Epoch: 5631, Training Loss: 0.03221
Epoch: 5632, Training Loss: 0.02342
Epoch: 5632, Training Loss: 0.02549
Epoch: 5632, Training Loss: 0.02554
Epoch: 5632, Training Loss: 0.03221
Epoch: 5633, Training Loss: 0.02341
Epoch: 5633, Training Loss: 0.02549
Epoch: 5633, Training Loss: 0.02553
Epoch: 5633, Training Loss: 0.03220
Epoch: 5634, Training Loss: 0.02341
Epoch: 5634, Training Loss: 0.02548
Epoch: 5634, Training Loss: 0.02553
Epoch: 5634, Training Loss: 0.03220
Epoch: 5635, Training Loss: 0.02340
Epoch: 5635, Training Loss: 0.02548
Epoch: 5635, Training Loss: 0.02552
Epoch: 5635, Training Loss: 0.03219
Epoch: 5636, Training Loss: 0.02340
Epoch: 5636, Training Loss: 0.02547
Epoch: 5636, Training Loss: 0.02552
Epoch: 5636, Training Loss: 0.03219
Epoch: 5637, Training Loss: 0.02340
Epoch: 5637, Training Loss: 0.02547
Epoch: 5637, Training Loss: 0.02552
Epoch: 5637, Training Loss: 0.03218
Epoch: 5638, Training Loss: 0.02339
Epoch: 5638, Training Loss: 0.02547
Epoch: 5638, Training Loss: 0.02551
Epoch: 5638, Training Loss: 0.03218
Epoch: 5639, Training Loss: 0.02339
Epoch: 5639, Training Loss: 0.02546
Epoch: 5639, Training Loss: 0.02551
Epoch: 5639, Training Loss: 0.03217
Epoch: 5640, Training Loss: 0.02339
Epoch: 5640, Training Loss: 0.02546
Epoch: 5640, Training Loss: 0.02550
Epoch: 5640, Training Loss: 0.03217
Epoch: 5641, Training Loss: 0.02338
Epoch: 5641, Training Loss: 0.02545
Epoch: 5641, Training Loss: 0.02550
Epoch: 5641, Training Loss: 0.03216
Epoch: 5642, Training Loss: 0.02338
Epoch: 5642, Training Loss: 0.02545
Epoch: 5642, Training Loss: 0.02550
Epoch: 5642, Training Loss: 0.03215
Epoch: 5643, Training Loss: 0.02338
Epoch: 5643, Training Loss: 0.02544
Epoch: 5643, Training Loss: 0.02549
Epoch: 5643, Training Loss: 0.03215
Epoch: 5644, Training Loss: 0.02337
Epoch: 5644, Training Loss: 0.02544
Epoch: 5644, Training Loss: 0.02549
Epoch: 5644, Training Loss: 0.03214
Epoch: 5645, Training Loss: 0.02337
Epoch: 5645, Training Loss: 0.02544
Epoch: 5645, Training Loss: 0.02548
Epoch: 5645, Training Loss: 0.03214
Epoch: 5646, Training Loss: 0.02337
Epoch: 5646, Training Loss: 0.02543
Epoch: 5646, Training Loss: 0.02548
Epoch: 5646, Training Loss: 0.03213
Epoch: 5647, Training Loss: 0.02336
Epoch: 5647, Training Loss: 0.02543
Epoch: 5647, Training Loss: 0.02547
Epoch: 5647, Training Loss: 0.03213
Epoch: 5648, Training Loss: 0.02336
Epoch: 5648, Training Loss: 0.02542
Epoch: 5648, Training Loss: 0.02547
Epoch: 5648, Training Loss: 0.03212
Epoch: 5649, Training Loss: 0.02336
Epoch: 5649, Training Loss: 0.02542
Epoch: 5649, Training Loss: 0.02547
Epoch: 5649, Training Loss: 0.03212
Epoch: 5650, Training Loss: 0.02335
Epoch: 5650, Training Loss: 0.02542
Epoch: 5650, Training Loss: 0.02546
Epoch: 5650, Training Loss: 0.03211
Epoch: 5651, Training Loss: 0.02335
Epoch: 5651, Training Loss: 0.02541
Epoch: 5651, Training Loss: 0.02546
Epoch: 5651, Training Loss: 0.03211
Epoch: 5652, Training Loss: 0.02334
Epoch: 5652, Training Loss: 0.02541
Epoch: 5652, Training Loss: 0.02545
Epoch: 5652, Training Loss: 0.03210
Epoch: 5653, Training Loss: 0.02334
Epoch: 5653, Training Loss: 0.02540
Epoch: 5653, Training Loss: 0.02545
Epoch: 5653, Training Loss: 0.03210
Epoch: 5654, Training Loss: 0.02334
Epoch: 5654, Training Loss: 0.02540
Epoch: 5654, Training Loss: 0.02545
Epoch: 5654, Training Loss: 0.03209
Epoch: 5655, Training Loss: 0.02333
Epoch: 5655, Training Loss: 0.02540
Epoch: 5655, Training Loss: 0.02544
Epoch: 5655, Training Loss: 0.03208
Epoch: 5656, Training Loss: 0.02333
Epoch: 5656, Training Loss: 0.02539
Epoch: 5656, Training Loss: 0.02544
Epoch: 5656, Training Loss: 0.03208
Epoch: 5657, Training Loss: 0.02333
Epoch: 5657, Training Loss: 0.02539
Epoch: 5657, Training Loss: 0.02543
Epoch: 5657, Training Loss: 0.03207
Epoch: 5658, Training Loss: 0.02332
Epoch: 5658, Training Loss: 0.02538
Epoch: 5658, Training Loss: 0.02543
Epoch: 5658, Training Loss: 0.03207
Epoch: 5659, Training Loss: 0.02332
Epoch: 5659, Training Loss: 0.02538
Epoch: 5659, Training Loss: 0.02543
Epoch: 5659, Training Loss: 0.03206
Epoch: 5660, Training Loss: 0.02332
Epoch: 5660, Training Loss: 0.02537
Epoch: 5660, Training Loss: 0.02542
Epoch: 5660, Training Loss: 0.03206
Epoch: 5661, Training Loss: 0.02331
Epoch: 5661, Training Loss: 0.02537
Epoch: 5661, Training Loss: 0.02542
Epoch: 5661, Training Loss: 0.03205
Epoch: 5662, Training Loss: 0.02331
Epoch: 5662, Training Loss: 0.02537
Epoch: 5662, Training Loss: 0.02541
Epoch: 5662, Training Loss: 0.03205
Epoch: 5663, Training Loss: 0.02331
Epoch: 5663, Training Loss: 0.02536
Epoch: 5663, Training Loss: 0.02541
Epoch: 5663, Training Loss: 0.03204
Epoch: 5664, Training Loss: 0.02330
Epoch: 5664, Training Loss: 0.02536
Epoch: 5664, Training Loss: 0.02540
Epoch: 5664, Training Loss: 0.03204
Epoch: 5665, Training Loss: 0.02330
Epoch: 5665, Training Loss: 0.02535
Epoch: 5665, Training Loss: 0.02540
Epoch: 5665, Training Loss: 0.03203
Epoch: 5666, Training Loss: 0.02330
Epoch: 5666, Training Loss: 0.02535
Epoch: 5666, Training Loss: 0.02540
Epoch: 5666, Training Loss: 0.03203
Epoch: 5667, Training Loss: 0.02329
Epoch: 5667, Training Loss: 0.02535
Epoch: 5667, Training Loss: 0.02539
Epoch: 5667, Training Loss: 0.03202
Epoch: 5668, Training Loss: 0.02329
Epoch: 5668, Training Loss: 0.02534
Epoch: 5668, Training Loss: 0.02539
Epoch: 5668, Training Loss: 0.03201
Epoch: 5669, Training Loss: 0.02329
Epoch: 5669, Training Loss: 0.02534
Epoch: 5669, Training Loss: 0.02538
Epoch: 5669, Training Loss: 0.03201
Epoch: 5670, Training Loss: 0.02328
Epoch: 5670, Training Loss: 0.02533
Epoch: 5670, Training Loss: 0.02538
Epoch: 5670, Training Loss: 0.03200
Epoch: 5671, Training Loss: 0.02328
Epoch: 5671, Training Loss: 0.02533
Epoch: 5671, Training Loss: 0.02538
Epoch: 5671, Training Loss: 0.03200
Epoch: 5672, Training Loss: 0.02327
Epoch: 5672, Training Loss: 0.02533
Epoch: 5672, Training Loss: 0.02537
Epoch: 5672, Training Loss: 0.03199
Epoch: 5673, Training Loss: 0.02327
Epoch: 5673, Training Loss: 0.02532
Epoch: 5673, Training Loss: 0.02537
Epoch: 5673, Training Loss: 0.03199
Epoch: 5674, Training Loss: 0.02327
Epoch: 5674, Training Loss: 0.02532
Epoch: 5674, Training Loss: 0.02536
Epoch: 5674, Training Loss: 0.03198
Epoch: 5675, Training Loss: 0.02326
Epoch: 5675, Training Loss: 0.02531
Epoch: 5675, Training Loss: 0.02536
Epoch: 5675, Training Loss: 0.03198
Epoch: 5676, Training Loss: 0.02326
Epoch: 5676, Training Loss: 0.02531
Epoch: 5676, Training Loss: 0.02536
Epoch: 5676, Training Loss: 0.03197
Epoch: 5677, Training Loss: 0.02326
Epoch: 5677, Training Loss: 0.02531
Epoch: 5677, Training Loss: 0.02535
Epoch: 5677, Training Loss: 0.03197
Epoch: 5678, Training Loss: 0.02325
Epoch: 5678, Training Loss: 0.02530
Epoch: 5678, Training Loss: 0.02535
Epoch: 5678, Training Loss: 0.03196
Epoch: 5679, Training Loss: 0.02325
Epoch: 5679, Training Loss: 0.02530
Epoch: 5679, Training Loss: 0.02534
Epoch: 5679, Training Loss: 0.03196
Epoch: 5680, Training Loss: 0.02325
Epoch: 5680, Training Loss: 0.02529
Epoch: 5680, Training Loss: 0.02534
Epoch: 5680, Training Loss: 0.03195
Epoch: 5681, Training Loss: 0.02324
Epoch: 5681, Training Loss: 0.02529
Epoch: 5681, Training Loss: 0.02534
Epoch: 5681, Training Loss: 0.03195
Epoch: 5682, Training Loss: 0.02324
Epoch: 5682, Training Loss: 0.02529
Epoch: 5682, Training Loss: 0.02533
Epoch: 5682, Training Loss: 0.03194
Epoch: 5683, Training Loss: 0.02324
Epoch: 5683, Training Loss: 0.02528
Epoch: 5683, Training Loss: 0.02533
Epoch: 5683, Training Loss: 0.03193
Epoch: 5684, Training Loss: 0.02323
Epoch: 5684, Training Loss: 0.02528
Epoch: 5684, Training Loss: 0.02532
Epoch: 5684, Training Loss: 0.03193
Epoch: 5685, Training Loss: 0.02323
Epoch: 5685, Training Loss: 0.02527
Epoch: 5685, Training Loss: 0.02532
Epoch: 5685, Training Loss: 0.03192
Epoch: 5686, Training Loss: 0.02323
Epoch: 5686, Training Loss: 0.02527
Epoch: 5686, Training Loss: 0.02532
Epoch: 5686, Training Loss: 0.03192
Epoch: 5687, Training Loss: 0.02322
Epoch: 5687, Training Loss: 0.02527
Epoch: 5687, Training Loss: 0.02531
Epoch: 5687, Training Loss: 0.03191
Epoch: 5688, Training Loss: 0.02322
Epoch: 5688, Training Loss: 0.02526
Epoch: 5688, Training Loss: 0.02531
Epoch: 5688, Training Loss: 0.03191
Epoch: 5689, Training Loss: 0.02322
Epoch: 5689, Training Loss: 0.02526
Epoch: 5689, Training Loss: 0.02530
Epoch: 5689, Training Loss: 0.03190
Epoch: 5690, Training Loss: 0.02321
Epoch: 5690, Training Loss: 0.02525
Epoch: 5690, Training Loss: 0.02530
Epoch: 5690, Training Loss: 0.03190
Epoch: 5691, Training Loss: 0.02321
Epoch: 5691, Training Loss: 0.02525
Epoch: 5691, Training Loss: 0.02530
Epoch: 5691, Training Loss: 0.03189
Epoch: 5692, Training Loss: 0.02320
Epoch: 5692, Training Loss: 0.02525
Epoch: 5692, Training Loss: 0.02529
Epoch: 5692, Training Loss: 0.03189
Epoch: 5693, Training Loss: 0.02320
Epoch: 5693, Training Loss: 0.02524
Epoch: 5693, Training Loss: 0.02529
Epoch: 5693, Training Loss: 0.03188
Epoch: 5694, Training Loss: 0.02320
Epoch: 5694, Training Loss: 0.02524
Epoch: 5694, Training Loss: 0.02528
Epoch: 5694, Training Loss: 0.03188
Epoch: 5695, Training Loss: 0.02319
Epoch: 5695, Training Loss: 0.02523
Epoch: 5695, Training Loss: 0.02528
Epoch: 5695, Training Loss: 0.03187
Epoch: 5696, Training Loss: 0.02319
Epoch: 5696, Training Loss: 0.02523
Epoch: 5696, Training Loss: 0.02527
Epoch: 5696, Training Loss: 0.03187
Epoch: 5697, Training Loss: 0.02319
Epoch: 5697, Training Loss: 0.02523
Epoch: 5697, Training Loss: 0.02527
Epoch: 5697, Training Loss: 0.03186
Epoch: 5698, Training Loss: 0.02318
Epoch: 5698, Training Loss: 0.02522
Epoch: 5698, Training Loss: 0.02527
Epoch: 5698, Training Loss: 0.03186
Epoch: 5699, Training Loss: 0.02318
Epoch: 5699, Training Loss: 0.02522
Epoch: 5699, Training Loss: 0.02526
Epoch: 5699, Training Loss: 0.03185
Epoch: 5700, Training Loss: 0.02318
Epoch: 5700, Training Loss: 0.02521
Epoch: 5700, Training Loss: 0.02526
Epoch: 5700, Training Loss: 0.03184
Epoch: 5701, Training Loss: 0.02317
Epoch: 5701, Training Loss: 0.02521
Epoch: 5701, Training Loss: 0.02525
Epoch: 5701, Training Loss: 0.03184
Epoch: 5702, Training Loss: 0.02317
Epoch: 5702, Training Loss: 0.02521
Epoch: 5702, Training Loss: 0.02525
Epoch: 5702, Training Loss: 0.03183
Epoch: 5703, Training Loss: 0.02317
Epoch: 5703, Training Loss: 0.02520
Epoch: 5703, Training Loss: 0.02525
Epoch: 5703, Training Loss: 0.03183
Epoch: 5704, Training Loss: 0.02316
Epoch: 5704, Training Loss: 0.02520
Epoch: 5704, Training Loss: 0.02524
Epoch: 5704, Training Loss: 0.03182
Epoch: 5705, Training Loss: 0.02316
Epoch: 5705, Training Loss: 0.02519
Epoch: 5705, Training Loss: 0.02524
Epoch: 5705, Training Loss: 0.03182
Epoch: 5706, Training Loss: 0.02316
Epoch: 5706, Training Loss: 0.02519
Epoch: 5706, Training Loss: 0.02523
Epoch: 5706, Training Loss: 0.03181
Epoch: 5707, Training Loss: 0.02315
Epoch: 5707, Training Loss: 0.02519
Epoch: 5707, Training Loss: 0.02523
Epoch: 5707, Training Loss: 0.03181
Epoch: 5708, Training Loss: 0.02315
Epoch: 5708, Training Loss: 0.02518
Epoch: 5708, Training Loss: 0.02523
Epoch: 5708, Training Loss: 0.03180
Epoch: 5709, Training Loss: 0.02315
Epoch: 5709, Training Loss: 0.02518
Epoch: 5709, Training Loss: 0.02522
Epoch: 5709, Training Loss: 0.03180
Epoch: 5710, Training Loss: 0.02314
Epoch: 5710, Training Loss: 0.02517
Epoch: 5710, Training Loss: 0.02522
Epoch: 5710, Training Loss: 0.03179
Epoch: 5711, Training Loss: 0.02314
Epoch: 5711, Training Loss: 0.02517
Epoch: 5711, Training Loss: 0.02521
Epoch: 5711, Training Loss: 0.03179
Epoch: 5712, Training Loss: 0.02314
Epoch: 5712, Training Loss: 0.02517
Epoch: 5712, Training Loss: 0.02521
Epoch: 5712, Training Loss: 0.03178
Epoch: 5713, Training Loss: 0.02313
Epoch: 5713, Training Loss: 0.02516
Epoch: 5713, Training Loss: 0.02521
Epoch: 5713, Training Loss: 0.03178
Epoch: 5714, Training Loss: 0.02313
Epoch: 5714, Training Loss: 0.02516
Epoch: 5714, Training Loss: 0.02520
Epoch: 5714, Training Loss: 0.03177
Epoch: 5715, Training Loss: 0.02313
Epoch: 5715, Training Loss: 0.02515
Epoch: 5715, Training Loss: 0.02520
Epoch: 5715, Training Loss: 0.03177
Epoch: 5716, Training Loss: 0.02312
Epoch: 5716, Training Loss: 0.02515
Epoch: 5716, Training Loss: 0.02519
Epoch: 5716, Training Loss: 0.03176
Epoch: 5717, Training Loss: 0.02312
Epoch: 5717, Training Loss: 0.02515
Epoch: 5717, Training Loss: 0.02519
Epoch: 5717, Training Loss: 0.03176
Epoch: 5718, Training Loss: 0.02312
Epoch: 5718, Training Loss: 0.02514
Epoch: 5718, Training Loss: 0.02519
Epoch: 5718, Training Loss: 0.03175
Epoch: 5719, Training Loss: 0.02311
Epoch: 5719, Training Loss: 0.02514
Epoch: 5719, Training Loss: 0.02518
Epoch: 5719, Training Loss: 0.03175
Epoch: 5720, Training Loss: 0.02311
Epoch: 5720, Training Loss: 0.02513
Epoch: 5720, Training Loss: 0.02518
Epoch: 5720, Training Loss: 0.03174
Epoch: 5721, Training Loss: 0.02311
Epoch: 5721, Training Loss: 0.02513
Epoch: 5721, Training Loss: 0.02517
Epoch: 5721, Training Loss: 0.03173
Epoch: 5722, Training Loss: 0.02310
Epoch: 5722, Training Loss: 0.02513
Epoch: 5722, Training Loss: 0.02517
Epoch: 5722, Training Loss: 0.03173
Epoch: 5723, Training Loss: 0.02310
Epoch: 5723, Training Loss: 0.02512
Epoch: 5723, Training Loss: 0.02517
Epoch: 5723, Training Loss: 0.03172
Epoch: 5724, Training Loss: 0.02309
Epoch: 5724, Training Loss: 0.02512
Epoch: 5724, Training Loss: 0.02516
Epoch: 5724, Training Loss: 0.03172
Epoch: 5725, Training Loss: 0.02309
Epoch: 5725, Training Loss: 0.02511
Epoch: 5725, Training Loss: 0.02516
Epoch: 5725, Training Loss: 0.03171
Epoch: 5726, Training Loss: 0.02309
Epoch: 5726, Training Loss: 0.02511
Epoch: 5726, Training Loss: 0.02516
Epoch: 5726, Training Loss: 0.03171
Epoch: 5727, Training Loss: 0.02308
Epoch: 5727, Training Loss: 0.02511
Epoch: 5727, Training Loss: 0.02515
Epoch: 5727, Training Loss: 0.03170
Epoch: 5728, Training Loss: 0.02308
Epoch: 5728, Training Loss: 0.02510
Epoch: 5728, Training Loss: 0.02515
Epoch: 5728, Training Loss: 0.03170
Epoch: 5729, Training Loss: 0.02308
Epoch: 5729, Training Loss: 0.02510
Epoch: 5729, Training Loss: 0.02514
Epoch: 5729, Training Loss: 0.03169
Epoch: 5730, Training Loss: 0.02307
Epoch: 5730, Training Loss: 0.02509
Epoch: 5730, Training Loss: 0.02514
Epoch: 5730, Training Loss: 0.03169
Epoch: 5731, Training Loss: 0.02307
Epoch: 5731, Training Loss: 0.02509
Epoch: 5731, Training Loss: 0.02514
Epoch: 5731, Training Loss: 0.03168
Epoch: 5732, Training Loss: 0.02307
Epoch: 5732, Training Loss: 0.02509
Epoch: 5732, Training Loss: 0.02513
Epoch: 5732, Training Loss: 0.03168
Epoch: 5733, Training Loss: 0.02306
Epoch: 5733, Training Loss: 0.02508
Epoch: 5733, Training Loss: 0.02513
Epoch: 5733, Training Loss: 0.03167
Epoch: 5734, Training Loss: 0.02306
Epoch: 5734, Training Loss: 0.02508
Epoch: 5734, Training Loss: 0.02512
Epoch: 5734, Training Loss: 0.03167
Epoch: 5735, Training Loss: 0.02306
Epoch: 5735, Training Loss: 0.02507
Epoch: 5735, Training Loss: 0.02512
Epoch: 5735, Training Loss: 0.03166
Epoch: 5736, Training Loss: 0.02305
Epoch: 5736, Training Loss: 0.02507
Epoch: 5736, Training Loss: 0.02512
Epoch: 5736, Training Loss: 0.03166
Epoch: 5737, Training Loss: 0.02305
Epoch: 5737, Training Loss: 0.02507
Epoch: 5737, Training Loss: 0.02511
Epoch: 5737, Training Loss: 0.03165
Epoch: 5738, Training Loss: 0.02305
Epoch: 5738, Training Loss: 0.02506
Epoch: 5738, Training Loss: 0.02511
Epoch: 5738, Training Loss: 0.03165
Epoch: 5739, Training Loss: 0.02304
Epoch: 5739, Training Loss: 0.02506
Epoch: 5739, Training Loss: 0.02510
Epoch: 5739, Training Loss: 0.03164
Epoch: 5740, Training Loss: 0.02304
Epoch: 5740, Training Loss: 0.02505
Epoch: 5740, Training Loss: 0.02510
Epoch: 5740, Training Loss: 0.03164
Epoch: 5741, Training Loss: 0.02304
Epoch: 5741, Training Loss: 0.02505
Epoch: 5741, Training Loss: 0.02510
Epoch: 5741, Training Loss: 0.03163
Epoch: 5742, Training Loss: 0.02303
Epoch: 5742, Training Loss: 0.02505
Epoch: 5742, Training Loss: 0.02509
Epoch: 5742, Training Loss: 0.03163
Epoch: 5743, Training Loss: 0.02303
Epoch: 5743, Training Loss: 0.02504
Epoch: 5743, Training Loss: 0.02509
Epoch: 5743, Training Loss: 0.03162
Epoch: 5744, Training Loss: 0.02303
Epoch: 5744, Training Loss: 0.02504
Epoch: 5744, Training Loss: 0.02508
Epoch: 5744, Training Loss: 0.03162
Epoch: 5745, Training Loss: 0.02302
Epoch: 5745, Training Loss: 0.02503
Epoch: 5745, Training Loss: 0.02508
Epoch: 5745, Training Loss: 0.03161
Epoch: 5746, Training Loss: 0.02302
Epoch: 5746, Training Loss: 0.02503
Epoch: 5746, Training Loss: 0.02508
Epoch: 5746, Training Loss: 0.03161
Epoch: 5747, Training Loss: 0.02302
Epoch: 5747, Training Loss: 0.02503
Epoch: 5747, Training Loss: 0.02507
Epoch: 5747, Training Loss: 0.03160
Epoch: 5748, Training Loss: 0.02301
Epoch: 5748, Training Loss: 0.02502
Epoch: 5748, Training Loss: 0.02507
Epoch: 5748, Training Loss: 0.03159
Epoch: 5749, Training Loss: 0.02301
Epoch: 5749, Training Loss: 0.02502
Epoch: 5749, Training Loss: 0.02506
Epoch: 5749, Training Loss: 0.03159
Epoch: 5750, Training Loss: 0.02301
Epoch: 5750, Training Loss: 0.02501
Epoch: 5750, Training Loss: 0.02506
Epoch: 5750, Training Loss: 0.03158
Epoch: 5751, Training Loss: 0.02300
Epoch: 5751, Training Loss: 0.02501
Epoch: 5751, Training Loss: 0.02506
Epoch: 5751, Training Loss: 0.03158
Epoch: 5752, Training Loss: 0.02300
Epoch: 5752, Training Loss: 0.02501
Epoch: 5752, Training Loss: 0.02505
Epoch: 5752, Training Loss: 0.03157
Epoch: 5753, Training Loss: 0.02300
Epoch: 5753, Training Loss: 0.02500
Epoch: 5753, Training Loss: 0.02505
Epoch: 5753, Training Loss: 0.03157
Epoch: 5754, Training Loss: 0.02299
Epoch: 5754, Training Loss: 0.02500
Epoch: 5754, Training Loss: 0.02504
Epoch: 5754, Training Loss: 0.03156
Epoch: 5755, Training Loss: 0.02299
Epoch: 5755, Training Loss: 0.02500
Epoch: 5755, Training Loss: 0.02504
Epoch: 5755, Training Loss: 0.03156
Epoch: 5756, Training Loss: 0.02299
Epoch: 5756, Training Loss: 0.02499
Epoch: 5756, Training Loss: 0.02504
Epoch: 5756, Training Loss: 0.03155
Epoch: 5757, Training Loss: 0.02298
Epoch: 5757, Training Loss: 0.02499
Epoch: 5757, Training Loss: 0.02503
Epoch: 5757, Training Loss: 0.03155
Epoch: 5758, Training Loss: 0.02298
Epoch: 5758, Training Loss: 0.02498
Epoch: 5758, Training Loss: 0.02503
Epoch: 5758, Training Loss: 0.03154
Epoch: 5759, Training Loss: 0.02298
Epoch: 5759, Training Loss: 0.02498
Epoch: 5759, Training Loss: 0.02503
Epoch: 5759, Training Loss: 0.03154
Epoch: 5760, Training Loss: 0.02297
Epoch: 5760, Training Loss: 0.02498
Epoch: 5760, Training Loss: 0.02502
Epoch: 5760, Training Loss: 0.03153
Epoch: 5761, Training Loss: 0.02297
Epoch: 5761, Training Loss: 0.02497
Epoch: 5761, Training Loss: 0.02502
Epoch: 5761, Training Loss: 0.03153
Epoch: 5762, Training Loss: 0.02297
Epoch: 5762, Training Loss: 0.02497
Epoch: 5762, Training Loss: 0.02501
Epoch: 5762, Training Loss: 0.03152
Epoch: 5763, Training Loss: 0.02296
Epoch: 5763, Training Loss: 0.02496
Epoch: 5763, Training Loss: 0.02501
Epoch: 5763, Training Loss: 0.03152
Epoch: 5764, Training Loss: 0.02296
Epoch: 5764, Training Loss: 0.02496
Epoch: 5764, Training Loss: 0.02501
Epoch: 5764, Training Loss: 0.03151
Epoch: 5765, Training Loss: 0.02296
Epoch: 5765, Training Loss: 0.02496
Epoch: 5765, Training Loss: 0.02500
Epoch: 5765, Training Loss: 0.03151
Epoch: 5766, Training Loss: 0.02295
Epoch: 5766, Training Loss: 0.02495
Epoch: 5766, Training Loss: 0.02500
Epoch: 5766, Training Loss: 0.03150
Epoch: 5767, Training Loss: 0.02295
Epoch: 5767, Training Loss: 0.02495
Epoch: 5767, Training Loss: 0.02499
Epoch: 5767, Training Loss: 0.03150
Epoch: 5768, Training Loss: 0.02295
Epoch: 5768, Training Loss: 0.02494
Epoch: 5768, Training Loss: 0.02499
Epoch: 5768, Training Loss: 0.03149
Epoch: 5769, Training Loss: 0.02294
Epoch: 5769, Training Loss: 0.02494
Epoch: 5769, Training Loss: 0.02499
Epoch: 5769, Training Loss: 0.03149
Epoch: 5770, Training Loss: 0.02294
Epoch: 5770, Training Loss: 0.02494
Epoch: 5770, Training Loss: 0.02498
Epoch: 5770, Training Loss: 0.03148
Epoch: 5771, Training Loss: 0.02294
Epoch: 5771, Training Loss: 0.02493
Epoch: 5771, Training Loss: 0.02498
Epoch: 5771, Training Loss: 0.03148
Epoch: 5772, Training Loss: 0.02293
Epoch: 5772, Training Loss: 0.02493
Epoch: 5772, Training Loss: 0.02497
Epoch: 5772, Training Loss: 0.03147
...
Epoch: 6022, Training Loss: 0.03027
[log truncated: four loss values are printed per epoch, presumably one per XOR training pattern; between epochs 5772 and 6022 they decrease slowly, from roughly 0.0229–0.0315 down to 0.0221–0.0303, still well above the 0.008 target]
Epoch: 6023, Training Loss: 0.02214
Epoch: 6023, Training Loss: 0.02401
Epoch: 6023, Training Loss: 0.02405
Epoch: 6023, Training Loss: 0.03026
Epoch: 6024, Training Loss: 0.02213
Epoch: 6024, Training Loss: 0.02400
Epoch: 6024, Training Loss: 0.02405
Epoch: 6024, Training Loss: 0.03026
Epoch: 6025, Training Loss: 0.02213
Epoch: 6025, Training Loss: 0.02400
Epoch: 6025, Training Loss: 0.02404
Epoch: 6025, Training Loss: 0.03025
Epoch: 6026, Training Loss: 0.02213
Epoch: 6026, Training Loss: 0.02400
Epoch: 6026, Training Loss: 0.02404
Epoch: 6026, Training Loss: 0.03025
Epoch: 6027, Training Loss: 0.02212
Epoch: 6027, Training Loss: 0.02399
Epoch: 6027, Training Loss: 0.02404
Epoch: 6027, Training Loss: 0.03025
Epoch: 6028, Training Loss: 0.02212
Epoch: 6028, Training Loss: 0.02399
Epoch: 6028, Training Loss: 0.02403
Epoch: 6028, Training Loss: 0.03024
Epoch: 6029, Training Loss: 0.02212
Epoch: 6029, Training Loss: 0.02399
Epoch: 6029, Training Loss: 0.02403
Epoch: 6029, Training Loss: 0.03024
Epoch: 6030, Training Loss: 0.02211
Epoch: 6030, Training Loss: 0.02398
Epoch: 6030, Training Loss: 0.02403
Epoch: 6030, Training Loss: 0.03023
Epoch: 6031, Training Loss: 0.02211
Epoch: 6031, Training Loss: 0.02398
Epoch: 6031, Training Loss: 0.02402
Epoch: 6031, Training Loss: 0.03023
Epoch: 6032, Training Loss: 0.02211
Epoch: 6032, Training Loss: 0.02398
Epoch: 6032, Training Loss: 0.02402
Epoch: 6032, Training Loss: 0.03022
Epoch: 6033, Training Loss: 0.02211
Epoch: 6033, Training Loss: 0.02397
Epoch: 6033, Training Loss: 0.02402
Epoch: 6033, Training Loss: 0.03022
Epoch: 6034, Training Loss: 0.02210
Epoch: 6034, Training Loss: 0.02397
Epoch: 6034, Training Loss: 0.02401
Epoch: 6034, Training Loss: 0.03021
Epoch: 6035, Training Loss: 0.02210
Epoch: 6035, Training Loss: 0.02397
Epoch: 6035, Training Loss: 0.02401
Epoch: 6035, Training Loss: 0.03021
Epoch: 6036, Training Loss: 0.02210
Epoch: 6036, Training Loss: 0.02396
Epoch: 6036, Training Loss: 0.02401
Epoch: 6036, Training Loss: 0.03020
Epoch: 6037, Training Loss: 0.02209
Epoch: 6037, Training Loss: 0.02396
Epoch: 6037, Training Loss: 0.02400
Epoch: 6037, Training Loss: 0.03020
Epoch: 6038, Training Loss: 0.02209
Epoch: 6038, Training Loss: 0.02396
Epoch: 6038, Training Loss: 0.02400
Epoch: 6038, Training Loss: 0.03020
Epoch: 6039, Training Loss: 0.02209
Epoch: 6039, Training Loss: 0.02395
Epoch: 6039, Training Loss: 0.02399
Epoch: 6039, Training Loss: 0.03019
Epoch: 6040, Training Loss: 0.02208
Epoch: 6040, Training Loss: 0.02395
Epoch: 6040, Training Loss: 0.02399
Epoch: 6040, Training Loss: 0.03019
Epoch: 6041, Training Loss: 0.02208
Epoch: 6041, Training Loss: 0.02395
Epoch: 6041, Training Loss: 0.02399
Epoch: 6041, Training Loss: 0.03018
Epoch: 6042, Training Loss: 0.02208
Epoch: 6042, Training Loss: 0.02394
Epoch: 6042, Training Loss: 0.02398
Epoch: 6042, Training Loss: 0.03018
Epoch: 6043, Training Loss: 0.02208
Epoch: 6043, Training Loss: 0.02394
Epoch: 6043, Training Loss: 0.02398
Epoch: 6043, Training Loss: 0.03017
Epoch: 6044, Training Loss: 0.02207
Epoch: 6044, Training Loss: 0.02393
Epoch: 6044, Training Loss: 0.02398
Epoch: 6044, Training Loss: 0.03017
Epoch: 6045, Training Loss: 0.02207
Epoch: 6045, Training Loss: 0.02393
Epoch: 6045, Training Loss: 0.02397
Epoch: 6045, Training Loss: 0.03016
Epoch: 6046, Training Loss: 0.02207
Epoch: 6046, Training Loss: 0.02393
Epoch: 6046, Training Loss: 0.02397
Epoch: 6046, Training Loss: 0.03016
Epoch: 6047, Training Loss: 0.02206
Epoch: 6047, Training Loss: 0.02392
Epoch: 6047, Training Loss: 0.02397
Epoch: 6047, Training Loss: 0.03016
Epoch: 6048, Training Loss: 0.02206
Epoch: 6048, Training Loss: 0.02392
Epoch: 6048, Training Loss: 0.02396
Epoch: 6048, Training Loss: 0.03015
Epoch: 6049, Training Loss: 0.02206
Epoch: 6049, Training Loss: 0.02392
Epoch: 6049, Training Loss: 0.02396
Epoch: 6049, Training Loss: 0.03015
Epoch: 6050, Training Loss: 0.02205
Epoch: 6050, Training Loss: 0.02391
Epoch: 6050, Training Loss: 0.02396
Epoch: 6050, Training Loss: 0.03014
Epoch: 6051, Training Loss: 0.02205
Epoch: 6051, Training Loss: 0.02391
Epoch: 6051, Training Loss: 0.02395
Epoch: 6051, Training Loss: 0.03014
Epoch: 6052, Training Loss: 0.02205
Epoch: 6052, Training Loss: 0.02391
Epoch: 6052, Training Loss: 0.02395
Epoch: 6052, Training Loss: 0.03013
Epoch: 6053, Training Loss: 0.02205
Epoch: 6053, Training Loss: 0.02390
Epoch: 6053, Training Loss: 0.02395
Epoch: 6053, Training Loss: 0.03013
Epoch: 6054, Training Loss: 0.02204
Epoch: 6054, Training Loss: 0.02390
Epoch: 6054, Training Loss: 0.02394
Epoch: 6054, Training Loss: 0.03012
Epoch: 6055, Training Loss: 0.02204
Epoch: 6055, Training Loss: 0.02390
Epoch: 6055, Training Loss: 0.02394
Epoch: 6055, Training Loss: 0.03012
Epoch: 6056, Training Loss: 0.02204
Epoch: 6056, Training Loss: 0.02389
Epoch: 6056, Training Loss: 0.02394
Epoch: 6056, Training Loss: 0.03011
Epoch: 6057, Training Loss: 0.02203
Epoch: 6057, Training Loss: 0.02389
Epoch: 6057, Training Loss: 0.02393
Epoch: 6057, Training Loss: 0.03011
Epoch: 6058, Training Loss: 0.02203
Epoch: 6058, Training Loss: 0.02389
Epoch: 6058, Training Loss: 0.02393
Epoch: 6058, Training Loss: 0.03011
Epoch: 6059, Training Loss: 0.02203
Epoch: 6059, Training Loss: 0.02388
Epoch: 6059, Training Loss: 0.02393
Epoch: 6059, Training Loss: 0.03010
Epoch: 6060, Training Loss: 0.02202
Epoch: 6060, Training Loss: 0.02388
Epoch: 6060, Training Loss: 0.02392
Epoch: 6060, Training Loss: 0.03010
Epoch: 6061, Training Loss: 0.02202
Epoch: 6061, Training Loss: 0.02388
Epoch: 6061, Training Loss: 0.02392
Epoch: 6061, Training Loss: 0.03009
Epoch: 6062, Training Loss: 0.02202
Epoch: 6062, Training Loss: 0.02387
Epoch: 6062, Training Loss: 0.02392
Epoch: 6062, Training Loss: 0.03009
Epoch: 6063, Training Loss: 0.02202
Epoch: 6063, Training Loss: 0.02387
Epoch: 6063, Training Loss: 0.02391
Epoch: 6063, Training Loss: 0.03008
Epoch: 6064, Training Loss: 0.02201
Epoch: 6064, Training Loss: 0.02387
Epoch: 6064, Training Loss: 0.02391
Epoch: 6064, Training Loss: 0.03008
Epoch: 6065, Training Loss: 0.02201
Epoch: 6065, Training Loss: 0.02386
Epoch: 6065, Training Loss: 0.02391
Epoch: 6065, Training Loss: 0.03007
Epoch: 6066, Training Loss: 0.02201
Epoch: 6066, Training Loss: 0.02386
Epoch: 6066, Training Loss: 0.02390
Epoch: 6066, Training Loss: 0.03007
Epoch: 6067, Training Loss: 0.02200
Epoch: 6067, Training Loss: 0.02386
Epoch: 6067, Training Loss: 0.02390
Epoch: 6067, Training Loss: 0.03007
Epoch: 6068, Training Loss: 0.02200
Epoch: 6068, Training Loss: 0.02385
Epoch: 6068, Training Loss: 0.02389
Epoch: 6068, Training Loss: 0.03006
Epoch: 6069, Training Loss: 0.02200
Epoch: 6069, Training Loss: 0.02385
Epoch: 6069, Training Loss: 0.02389
Epoch: 6069, Training Loss: 0.03006
Epoch: 6070, Training Loss: 0.02200
Epoch: 6070, Training Loss: 0.02385
Epoch: 6070, Training Loss: 0.02389
Epoch: 6070, Training Loss: 0.03005
Epoch: 6071, Training Loss: 0.02199
Epoch: 6071, Training Loss: 0.02384
Epoch: 6071, Training Loss: 0.02388
Epoch: 6071, Training Loss: 0.03005
Epoch: 6072, Training Loss: 0.02199
Epoch: 6072, Training Loss: 0.02384
Epoch: 6072, Training Loss: 0.02388
Epoch: 6072, Training Loss: 0.03004
Epoch: 6073, Training Loss: 0.02199
Epoch: 6073, Training Loss: 0.02384
Epoch: 6073, Training Loss: 0.02388
Epoch: 6073, Training Loss: 0.03004
Epoch: 6074, Training Loss: 0.02198
Epoch: 6074, Training Loss: 0.02383
Epoch: 6074, Training Loss: 0.02387
Epoch: 6074, Training Loss: 0.03003
Epoch: 6075, Training Loss: 0.02198
Epoch: 6075, Training Loss: 0.02383
Epoch: 6075, Training Loss: 0.02387
Epoch: 6075, Training Loss: 0.03003
Epoch: 6076, Training Loss: 0.02198
Epoch: 6076, Training Loss: 0.02383
Epoch: 6076, Training Loss: 0.02387
Epoch: 6076, Training Loss: 0.03003
Epoch: 6077, Training Loss: 0.02197
Epoch: 6077, Training Loss: 0.02382
Epoch: 6077, Training Loss: 0.02386
Epoch: 6077, Training Loss: 0.03002
Epoch: 6078, Training Loss: 0.02197
Epoch: 6078, Training Loss: 0.02382
Epoch: 6078, Training Loss: 0.02386
Epoch: 6078, Training Loss: 0.03002
Epoch: 6079, Training Loss: 0.02197
Epoch: 6079, Training Loss: 0.02381
Epoch: 6079, Training Loss: 0.02386
Epoch: 6079, Training Loss: 0.03001
Epoch: 6080, Training Loss: 0.02197
Epoch: 6080, Training Loss: 0.02381
Epoch: 6080, Training Loss: 0.02385
Epoch: 6080, Training Loss: 0.03001
Epoch: 6081, Training Loss: 0.02196
Epoch: 6081, Training Loss: 0.02381
Epoch: 6081, Training Loss: 0.02385
Epoch: 6081, Training Loss: 0.03000
Epoch: 6082, Training Loss: 0.02196
Epoch: 6082, Training Loss: 0.02380
Epoch: 6082, Training Loss: 0.02385
Epoch: 6082, Training Loss: 0.03000
Epoch: 6083, Training Loss: 0.02196
Epoch: 6083, Training Loss: 0.02380
Epoch: 6083, Training Loss: 0.02384
Epoch: 6083, Training Loss: 0.02999
Epoch: 6084, Training Loss: 0.02195
Epoch: 6084, Training Loss: 0.02380
Epoch: 6084, Training Loss: 0.02384
Epoch: 6084, Training Loss: 0.02999
Epoch: 6085, Training Loss: 0.02195
Epoch: 6085, Training Loss: 0.02379
Epoch: 6085, Training Loss: 0.02384
Epoch: 6085, Training Loss: 0.02999
Epoch: 6086, Training Loss: 0.02195
Epoch: 6086, Training Loss: 0.02379
Epoch: 6086, Training Loss: 0.02383
Epoch: 6086, Training Loss: 0.02998
Epoch: 6087, Training Loss: 0.02194
Epoch: 6087, Training Loss: 0.02379
Epoch: 6087, Training Loss: 0.02383
Epoch: 6087, Training Loss: 0.02998
Epoch: 6088, Training Loss: 0.02194
Epoch: 6088, Training Loss: 0.02378
Epoch: 6088, Training Loss: 0.02383
Epoch: 6088, Training Loss: 0.02997
Epoch: 6089, Training Loss: 0.02194
Epoch: 6089, Training Loss: 0.02378
Epoch: 6089, Training Loss: 0.02382
Epoch: 6089, Training Loss: 0.02997
Epoch: 6090, Training Loss: 0.02194
Epoch: 6090, Training Loss: 0.02378
Epoch: 6090, Training Loss: 0.02382
Epoch: 6090, Training Loss: 0.02996
Epoch: 6091, Training Loss: 0.02193
Epoch: 6091, Training Loss: 0.02377
Epoch: 6091, Training Loss: 0.02382
Epoch: 6091, Training Loss: 0.02996
Epoch: 6092, Training Loss: 0.02193
Epoch: 6092, Training Loss: 0.02377
Epoch: 6092, Training Loss: 0.02381
Epoch: 6092, Training Loss: 0.02995
Epoch: 6093, Training Loss: 0.02193
Epoch: 6093, Training Loss: 0.02377
Epoch: 6093, Training Loss: 0.02381
Epoch: 6093, Training Loss: 0.02995
Epoch: 6094, Training Loss: 0.02192
Epoch: 6094, Training Loss: 0.02376
Epoch: 6094, Training Loss: 0.02381
Epoch: 6094, Training Loss: 0.02995
Epoch: 6095, Training Loss: 0.02192
Epoch: 6095, Training Loss: 0.02376
Epoch: 6095, Training Loss: 0.02380
Epoch: 6095, Training Loss: 0.02994
Epoch: 6096, Training Loss: 0.02192
Epoch: 6096, Training Loss: 0.02376
Epoch: 6096, Training Loss: 0.02380
Epoch: 6096, Training Loss: 0.02994
Epoch: 6097, Training Loss: 0.02192
Epoch: 6097, Training Loss: 0.02375
Epoch: 6097, Training Loss: 0.02380
Epoch: 6097, Training Loss: 0.02993
Epoch: 6098, Training Loss: 0.02191
Epoch: 6098, Training Loss: 0.02375
Epoch: 6098, Training Loss: 0.02379
Epoch: 6098, Training Loss: 0.02993
Epoch: 6099, Training Loss: 0.02191
Epoch: 6099, Training Loss: 0.02375
Epoch: 6099, Training Loss: 0.02379
Epoch: 6099, Training Loss: 0.02992
Epoch: 6100, Training Loss: 0.02191
Epoch: 6100, Training Loss: 0.02374
Epoch: 6100, Training Loss: 0.02379
Epoch: 6100, Training Loss: 0.02992
Epoch: 6101, Training Loss: 0.02190
Epoch: 6101, Training Loss: 0.02374
Epoch: 6101, Training Loss: 0.02378
Epoch: 6101, Training Loss: 0.02991
Epoch: 6102, Training Loss: 0.02190
Epoch: 6102, Training Loss: 0.02374
Epoch: 6102, Training Loss: 0.02378
Epoch: 6102, Training Loss: 0.02991
Epoch: 6103, Training Loss: 0.02190
Epoch: 6103, Training Loss: 0.02373
Epoch: 6103, Training Loss: 0.02378
Epoch: 6103, Training Loss: 0.02991
Epoch: 6104, Training Loss: 0.02190
Epoch: 6104, Training Loss: 0.02373
Epoch: 6104, Training Loss: 0.02377
Epoch: 6104, Training Loss: 0.02990
Epoch: 6105, Training Loss: 0.02189
Epoch: 6105, Training Loss: 0.02373
Epoch: 6105, Training Loss: 0.02377
Epoch: 6105, Training Loss: 0.02990
Epoch: 6106, Training Loss: 0.02189
Epoch: 6106, Training Loss: 0.02372
Epoch: 6106, Training Loss: 0.02377
Epoch: 6106, Training Loss: 0.02989
Epoch: 6107, Training Loss: 0.02189
Epoch: 6107, Training Loss: 0.02372
Epoch: 6107, Training Loss: 0.02376
Epoch: 6107, Training Loss: 0.02989
Epoch: 6108, Training Loss: 0.02188
Epoch: 6108, Training Loss: 0.02372
Epoch: 6108, Training Loss: 0.02376
Epoch: 6108, Training Loss: 0.02988
Epoch: 6109, Training Loss: 0.02188
Epoch: 6109, Training Loss: 0.02371
Epoch: 6109, Training Loss: 0.02376
Epoch: 6109, Training Loss: 0.02988
Epoch: 6110, Training Loss: 0.02188
Epoch: 6110, Training Loss: 0.02371
Epoch: 6110, Training Loss: 0.02375
Epoch: 6110, Training Loss: 0.02987
Epoch: 6111, Training Loss: 0.02187
Epoch: 6111, Training Loss: 0.02371
Epoch: 6111, Training Loss: 0.02375
Epoch: 6111, Training Loss: 0.02987
Epoch: 6112, Training Loss: 0.02187
Epoch: 6112, Training Loss: 0.02370
Epoch: 6112, Training Loss: 0.02375
Epoch: 6112, Training Loss: 0.02987
Epoch: 6113, Training Loss: 0.02187
Epoch: 6113, Training Loss: 0.02370
Epoch: 6113, Training Loss: 0.02374
Epoch: 6113, Training Loss: 0.02986
Epoch: 6114, Training Loss: 0.02187
Epoch: 6114, Training Loss: 0.02370
Epoch: 6114, Training Loss: 0.02374
Epoch: 6114, Training Loss: 0.02986
Epoch: 6115, Training Loss: 0.02186
Epoch: 6115, Training Loss: 0.02369
Epoch: 6115, Training Loss: 0.02374
Epoch: 6115, Training Loss: 0.02985
Epoch: 6116, Training Loss: 0.02186
Epoch: 6116, Training Loss: 0.02369
Epoch: 6116, Training Loss: 0.02373
Epoch: 6116, Training Loss: 0.02985
Epoch: 6117, Training Loss: 0.02186
Epoch: 6117, Training Loss: 0.02369
Epoch: 6117, Training Loss: 0.02373
Epoch: 6117, Training Loss: 0.02984
Epoch: 6118, Training Loss: 0.02185
Epoch: 6118, Training Loss: 0.02368
Epoch: 6118, Training Loss: 0.02373
Epoch: 6118, Training Loss: 0.02984
Epoch: 6119, Training Loss: 0.02185
Epoch: 6119, Training Loss: 0.02368
Epoch: 6119, Training Loss: 0.02372
Epoch: 6119, Training Loss: 0.02984
Epoch: 6120, Training Loss: 0.02185
Epoch: 6120, Training Loss: 0.02368
Epoch: 6120, Training Loss: 0.02372
Epoch: 6120, Training Loss: 0.02983
Epoch: 6121, Training Loss: 0.02185
Epoch: 6121, Training Loss: 0.02367
Epoch: 6121, Training Loss: 0.02372
Epoch: 6121, Training Loss: 0.02983
Epoch: 6122, Training Loss: 0.02184
Epoch: 6122, Training Loss: 0.02367
Epoch: 6122, Training Loss: 0.02371
Epoch: 6122, Training Loss: 0.02982
Epoch: 6123, Training Loss: 0.02184
Epoch: 6123, Training Loss: 0.02367
Epoch: 6123, Training Loss: 0.02371
Epoch: 6123, Training Loss: 0.02982
Epoch: 6124, Training Loss: 0.02184
Epoch: 6124, Training Loss: 0.02366
Epoch: 6124, Training Loss: 0.02371
Epoch: 6124, Training Loss: 0.02981
Epoch: 6125, Training Loss: 0.02183
Epoch: 6125, Training Loss: 0.02366
Epoch: 6125, Training Loss: 0.02370
Epoch: 6125, Training Loss: 0.02981
Epoch: 6126, Training Loss: 0.02183
Epoch: 6126, Training Loss: 0.02366
Epoch: 6126, Training Loss: 0.02370
Epoch: 6126, Training Loss: 0.02980
Epoch: 6127, Training Loss: 0.02183
Epoch: 6127, Training Loss: 0.02365
Epoch: 6127, Training Loss: 0.02370
Epoch: 6127, Training Loss: 0.02980
Epoch: 6128, Training Loss: 0.02183
Epoch: 6128, Training Loss: 0.02365
Epoch: 6128, Training Loss: 0.02369
Epoch: 6128, Training Loss: 0.02980
Epoch: 6129, Training Loss: 0.02182
Epoch: 6129, Training Loss: 0.02365
Epoch: 6129, Training Loss: 0.02369
Epoch: 6129, Training Loss: 0.02979
Epoch: 6130, Training Loss: 0.02182
Epoch: 6130, Training Loss: 0.02364
Epoch: 6130, Training Loss: 0.02369
Epoch: 6130, Training Loss: 0.02979
Epoch: 6131, Training Loss: 0.02182
Epoch: 6131, Training Loss: 0.02364
Epoch: 6131, Training Loss: 0.02368
Epoch: 6131, Training Loss: 0.02978
Epoch: 6132, Training Loss: 0.02181
Epoch: 6132, Training Loss: 0.02364
Epoch: 6132, Training Loss: 0.02368
Epoch: 6132, Training Loss: 0.02978
Epoch: 6133, Training Loss: 0.02181
Epoch: 6133, Training Loss: 0.02363
Epoch: 6133, Training Loss: 0.02368
Epoch: 6133, Training Loss: 0.02977
Epoch: 6134, Training Loss: 0.02181
Epoch: 6134, Training Loss: 0.02363
Epoch: 6134, Training Loss: 0.02367
Epoch: 6134, Training Loss: 0.02977
Epoch: 6135, Training Loss: 0.02181
Epoch: 6135, Training Loss: 0.02363
Epoch: 6135, Training Loss: 0.02367
Epoch: 6135, Training Loss: 0.02977
Epoch: 6136, Training Loss: 0.02180
Epoch: 6136, Training Loss: 0.02362
Epoch: 6136, Training Loss: 0.02367
Epoch: 6136, Training Loss: 0.02976
Epoch: 6137, Training Loss: 0.02180
Epoch: 6137, Training Loss: 0.02362
Epoch: 6137, Training Loss: 0.02366
Epoch: 6137, Training Loss: 0.02976
Epoch: 6138, Training Loss: 0.02180
Epoch: 6138, Training Loss: 0.02362
Epoch: 6138, Training Loss: 0.02366
Epoch: 6138, Training Loss: 0.02975
Epoch: 6139, Training Loss: 0.02179
Epoch: 6139, Training Loss: 0.02361
Epoch: 6139, Training Loss: 0.02366
Epoch: 6139, Training Loss: 0.02975
Epoch: 6140, Training Loss: 0.02179
Epoch: 6140, Training Loss: 0.02361
Epoch: 6140, Training Loss: 0.02365
Epoch: 6140, Training Loss: 0.02974
Epoch: 6141, Training Loss: 0.02179
Epoch: 6141, Training Loss: 0.02361
Epoch: 6141, Training Loss: 0.02365
Epoch: 6141, Training Loss: 0.02974
Epoch: 6142, Training Loss: 0.02179
Epoch: 6142, Training Loss: 0.02360
Epoch: 6142, Training Loss: 0.02365
Epoch: 6142, Training Loss: 0.02974
Epoch: 6143, Training Loss: 0.02178
Epoch: 6143, Training Loss: 0.02360
Epoch: 6143, Training Loss: 0.02364
Epoch: 6143, Training Loss: 0.02973
Epoch: 6144, Training Loss: 0.02178
Epoch: 6144, Training Loss: 0.02360
Epoch: 6144, Training Loss: 0.02364
Epoch: 6144, Training Loss: 0.02973
Epoch: 6145, Training Loss: 0.02178
Epoch: 6145, Training Loss: 0.02359
Epoch: 6145, Training Loss: 0.02364
Epoch: 6145, Training Loss: 0.02972
Epoch: 6146, Training Loss: 0.02177
Epoch: 6146, Training Loss: 0.02359
Epoch: 6146, Training Loss: 0.02363
Epoch: 6146, Training Loss: 0.02972
Epoch: 6147, Training Loss: 0.02177
Epoch: 6147, Training Loss: 0.02359
Epoch: 6147, Training Loss: 0.02363
Epoch: 6147, Training Loss: 0.02971
Epoch: 6148, Training Loss: 0.02177
Epoch: 6148, Training Loss: 0.02358
Epoch: 6148, Training Loss: 0.02363
Epoch: 6148, Training Loss: 0.02971
Epoch: 6149, Training Loss: 0.02176
Epoch: 6149, Training Loss: 0.02358
Epoch: 6149, Training Loss: 0.02362
Epoch: 6149, Training Loss: 0.02971
Epoch: 6150, Training Loss: 0.02176
Epoch: 6150, Training Loss: 0.02358
Epoch: 6150, Training Loss: 0.02362
Epoch: 6150, Training Loss: 0.02970
Epoch: 6151, Training Loss: 0.02176
Epoch: 6151, Training Loss: 0.02357
Epoch: 6151, Training Loss: 0.02362
Epoch: 6151, Training Loss: 0.02970
Epoch: 6152, Training Loss: 0.02176
Epoch: 6152, Training Loss: 0.02357
Epoch: 6152, Training Loss: 0.02361
Epoch: 6152, Training Loss: 0.02969
Epoch: 6153, Training Loss: 0.02175
Epoch: 6153, Training Loss: 0.02357
Epoch: 6153, Training Loss: 0.02361
Epoch: 6153, Training Loss: 0.02969
Epoch: 6154, Training Loss: 0.02175
Epoch: 6154, Training Loss: 0.02356
Epoch: 6154, Training Loss: 0.02361
Epoch: 6154, Training Loss: 0.02968
Epoch: 6155, Training Loss: 0.02175
Epoch: 6155, Training Loss: 0.02356
Epoch: 6155, Training Loss: 0.02360
Epoch: 6155, Training Loss: 0.02968
Epoch: 6156, Training Loss: 0.02174
Epoch: 6156, Training Loss: 0.02356
Epoch: 6156, Training Loss: 0.02360
Epoch: 6156, Training Loss: 0.02967
Epoch: 6157, Training Loss: 0.02174
Epoch: 6157, Training Loss: 0.02355
Epoch: 6157, Training Loss: 0.02360
Epoch: 6157, Training Loss: 0.02967
Epoch: 6158, Training Loss: 0.02174
Epoch: 6158, Training Loss: 0.02355
Epoch: 6158, Training Loss: 0.02359
Epoch: 6158, Training Loss: 0.02967
Epoch: 6159, Training Loss: 0.02174
Epoch: 6159, Training Loss: 0.02355
Epoch: 6159, Training Loss: 0.02359
Epoch: 6159, Training Loss: 0.02966
Epoch: 6160, Training Loss: 0.02173
Epoch: 6160, Training Loss: 0.02354
Epoch: 6160, Training Loss: 0.02359
Epoch: 6160, Training Loss: 0.02966
Epoch: 6161, Training Loss: 0.02173
Epoch: 6161, Training Loss: 0.02354
Epoch: 6161, Training Loss: 0.02358
Epoch: 6161, Training Loss: 0.02965
Epoch: 6162, Training Loss: 0.02173
Epoch: 6162, Training Loss: 0.02354
Epoch: 6162, Training Loss: 0.02358
Epoch: 6162, Training Loss: 0.02965
Epoch: 6163, Training Loss: 0.02172
Epoch: 6163, Training Loss: 0.02353
Epoch: 6163, Training Loss: 0.02358
Epoch: 6163, Training Loss: 0.02964
Epoch: 6164, Training Loss: 0.02172
Epoch: 6164, Training Loss: 0.02353
Epoch: 6164, Training Loss: 0.02357
Epoch: 6164, Training Loss: 0.02964
Epoch: 6165, Training Loss: 0.02172
Epoch: 6165, Training Loss: 0.02353
Epoch: 6165, Training Loss: 0.02357
Epoch: 6165, Training Loss: 0.02964
Epoch: 6166, Training Loss: 0.02172
Epoch: 6166, Training Loss: 0.02352
Epoch: 6166, Training Loss: 0.02357
Epoch: 6166, Training Loss: 0.02963
Epoch: 6167, Training Loss: 0.02171
Epoch: 6167, Training Loss: 0.02352
Epoch: 6167, Training Loss: 0.02356
Epoch: 6167, Training Loss: 0.02963
Epoch: 6168, Training Loss: 0.02171
Epoch: 6168, Training Loss: 0.02352
Epoch: 6168, Training Loss: 0.02356
Epoch: 6168, Training Loss: 0.02962
Epoch: 6169, Training Loss: 0.02171
Epoch: 6169, Training Loss: 0.02351
Epoch: 6169, Training Loss: 0.02356
Epoch: 6169, Training Loss: 0.02962
Epoch: 6170, Training Loss: 0.02170
Epoch: 6170, Training Loss: 0.02351
Epoch: 6170, Training Loss: 0.02355
Epoch: 6170, Training Loss: 0.02961
Epoch: 6171, Training Loss: 0.02170
Epoch: 6171, Training Loss: 0.02351
Epoch: 6171, Training Loss: 0.02355
Epoch: 6171, Training Loss: 0.02961
Epoch: 6172, Training Loss: 0.02170
Epoch: 6172, Training Loss: 0.02350
Epoch: 6172, Training Loss: 0.02355
Epoch: 6172, Training Loss: 0.02961
Epoch: 6173, Training Loss: 0.02170
Epoch: 6173, Training Loss: 0.02350
Epoch: 6173, Training Loss: 0.02354
Epoch: 6173, Training Loss: 0.02960
Epoch: 6174, Training Loss: 0.02169
Epoch: 6174, Training Loss: 0.02350
Epoch: 6174, Training Loss: 0.02354
Epoch: 6174, Training Loss: 0.02960
Epoch: 6175, Training Loss: 0.02169
Epoch: 6175, Training Loss: 0.02350
Epoch: 6175, Training Loss: 0.02354
Epoch: 6175, Training Loss: 0.02959
Epoch: 6176, Training Loss: 0.02169
Epoch: 6176, Training Loss: 0.02349
Epoch: 6176, Training Loss: 0.02353
Epoch: 6176, Training Loss: 0.02959
Epoch: 6177, Training Loss: 0.02169
Epoch: 6177, Training Loss: 0.02349
Epoch: 6177, Training Loss: 0.02353
Epoch: 6177, Training Loss: 0.02958
Epoch: 6178, Training Loss: 0.02168
Epoch: 6178, Training Loss: 0.02349
Epoch: 6178, Training Loss: 0.02353
Epoch: 6178, Training Loss: 0.02958
Epoch: 6179, Training Loss: 0.02168
Epoch: 6179, Training Loss: 0.02348
Epoch: 6179, Training Loss: 0.02352
Epoch: 6179, Training Loss: 0.02958
Epoch: 6180, Training Loss: 0.02168
Epoch: 6180, Training Loss: 0.02348
Epoch: 6180, Training Loss: 0.02352
Epoch: 6180, Training Loss: 0.02957
Epoch: 6181, Training Loss: 0.02167
Epoch: 6181, Training Loss: 0.02348
Epoch: 6181, Training Loss: 0.02352
Epoch: 6181, Training Loss: 0.02957
Epoch: 6182, Training Loss: 0.02167
Epoch: 6182, Training Loss: 0.02347
Epoch: 6182, Training Loss: 0.02351
Epoch: 6182, Training Loss: 0.02956
Epoch: 6183, Training Loss: 0.02167
Epoch: 6183, Training Loss: 0.02347
Epoch: 6183, Training Loss: 0.02351
Epoch: 6183, Training Loss: 0.02956
Epoch: 6184, Training Loss: 0.02167
Epoch: 6184, Training Loss: 0.02347
Epoch: 6184, Training Loss: 0.02351
Epoch: 6184, Training Loss: 0.02956
Epoch: 6185, Training Loss: 0.02166
Epoch: 6185, Training Loss: 0.02346
Epoch: 6185, Training Loss: 0.02350
Epoch: 6185, Training Loss: 0.02955
Epoch: 6186, Training Loss: 0.02166
Epoch: 6186, Training Loss: 0.02346
Epoch: 6186, Training Loss: 0.02350
Epoch: 6186, Training Loss: 0.02955
Epoch: 6187, Training Loss: 0.02166
Epoch: 6187, Training Loss: 0.02346
Epoch: 6187, Training Loss: 0.02350
Epoch: 6187, Training Loss: 0.02954
Epoch: 6188, Training Loss: 0.02165
Epoch: 6188, Training Loss: 0.02345
Epoch: 6188, Training Loss: 0.02349
Epoch: 6188, Training Loss: 0.02954
Epoch: 6189, Training Loss: 0.02165
Epoch: 6189, Training Loss: 0.02345
Epoch: 6189, Training Loss: 0.02349
Epoch: 6189, Training Loss: 0.02953
Epoch: 6190, Training Loss: 0.02165
Epoch: 6190, Training Loss: 0.02345
Epoch: 6190, Training Loss: 0.02349
Epoch: 6190, Training Loss: 0.02953
Epoch: 6191, Training Loss: 0.02165
Epoch: 6191, Training Loss: 0.02344
Epoch: 6191, Training Loss: 0.02348
Epoch: 6191, Training Loss: 0.02953
Epoch: 6192, Training Loss: 0.02164
Epoch: 6192, Training Loss: 0.02344
Epoch: 6192, Training Loss: 0.02348
Epoch: 6192, Training Loss: 0.02952
Epoch: 6193, Training Loss: 0.02164
Epoch: 6193, Training Loss: 0.02344
Epoch: 6193, Training Loss: 0.02348
Epoch: 6193, Training Loss: 0.02952
Epoch: 6194, Training Loss: 0.02164
Epoch: 6194, Training Loss: 0.02343
Epoch: 6194, Training Loss: 0.02347
Epoch: 6194, Training Loss: 0.02951
Epoch: 6195, Training Loss: 0.02163
Epoch: 6195, Training Loss: 0.02343
Epoch: 6195, Training Loss: 0.02347
Epoch: 6195, Training Loss: 0.02951
Epoch: 6196, Training Loss: 0.02163
Epoch: 6196, Training Loss: 0.02343
Epoch: 6196, Training Loss: 0.02347
Epoch: 6196, Training Loss: 0.02950
Epoch: 6197, Training Loss: 0.02163
Epoch: 6197, Training Loss: 0.02342
Epoch: 6197, Training Loss: 0.02346
Epoch: 6197, Training Loss: 0.02950
Epoch: 6198, Training Loss: 0.02163
Epoch: 6198, Training Loss: 0.02342
Epoch: 6198, Training Loss: 0.02346
Epoch: 6198, Training Loss: 0.02950
Epoch: 6199, Training Loss: 0.02162
Epoch: 6199, Training Loss: 0.02342
Epoch: 6199, Training Loss: 0.02346
Epoch: 6199, Training Loss: 0.02949
Epoch: 6200, Training Loss: 0.02162
Epoch: 6200, Training Loss: 0.02341
Epoch: 6200, Training Loss: 0.02346
Epoch: 6200, Training Loss: 0.02949
Epoch: 6201, Training Loss: 0.02162
Epoch: 6201, Training Loss: 0.02341
Epoch: 6201, Training Loss: 0.02345
Epoch: 6201, Training Loss: 0.02948
Epoch: 6202, Training Loss: 0.02161
Epoch: 6202, Training Loss: 0.02341
Epoch: 6202, Training Loss: 0.02345
Epoch: 6202, Training Loss: 0.02948
Epoch: 6203, Training Loss: 0.02161
Epoch: 6203, Training Loss: 0.02340
Epoch: 6203, Training Loss: 0.02345
Epoch: 6203, Training Loss: 0.02947
Epoch: 6204, Training Loss: 0.02161
Epoch: 6204, Training Loss: 0.02340
Epoch: 6204, Training Loss: 0.02344
Epoch: 6204, Training Loss: 0.02947
Epoch: 6205, Training Loss: 0.02161
Epoch: 6205, Training Loss: 0.02340
Epoch: 6205, Training Loss: 0.02344
Epoch: 6205, Training Loss: 0.02947
Epoch: 6206, Training Loss: 0.02160
Epoch: 6206, Training Loss: 0.02339
Epoch: 6206, Training Loss: 0.02344
Epoch: 6206, Training Loss: 0.02946
Epoch: 6207, Training Loss: 0.02160
Epoch: 6207, Training Loss: 0.02339
Epoch: 6207, Training Loss: 0.02343
Epoch: 6207, Training Loss: 0.02946
Epoch: 6208, Training Loss: 0.02160
Epoch: 6208, Training Loss: 0.02339
Epoch: 6208, Training Loss: 0.02343
Epoch: 6208, Training Loss: 0.02945
Epoch: 6209, Training Loss: 0.02159
Epoch: 6209, Training Loss: 0.02338
Epoch: 6209, Training Loss: 0.02343
Epoch: 6209, Training Loss: 0.02945
Epoch: 6210, Training Loss: 0.02159
Epoch: 6210, Training Loss: 0.02338
Epoch: 6210, Training Loss: 0.02342
Epoch: 6210, Training Loss: 0.02945
Epoch: 6211, Training Loss: 0.02159
Epoch: 6211, Training Loss: 0.02338
Epoch: 6211, Training Loss: 0.02342
Epoch: 6211, Training Loss: 0.02944
Epoch: 6212, Training Loss: 0.02159
Epoch: 6212, Training Loss: 0.02338
Epoch: 6212, Training Loss: 0.02342
Epoch: 6212, Training Loss: 0.02944
Epoch: 6213, Training Loss: 0.02158
Epoch: 6213, Training Loss: 0.02337
Epoch: 6213, Training Loss: 0.02341
Epoch: 6213, Training Loss: 0.02943
Epoch: 6214, Training Loss: 0.02158
Epoch: 6214, Training Loss: 0.02337
Epoch: 6214, Training Loss: 0.02341
Epoch: 6214, Training Loss: 0.02943
Epoch: 6215, Training Loss: 0.02158
Epoch: 6215, Training Loss: 0.02337
Epoch: 6215, Training Loss: 0.02341
Epoch: 6215, Training Loss: 0.02942
Epoch: 6216, Training Loss: 0.02158
Epoch: 6216, Training Loss: 0.02336
Epoch: 6216, Training Loss: 0.02340
Epoch: 6216, Training Loss: 0.02942
Epoch: 6217, Training Loss: 0.02157
[Training log condensed: epochs 6217–6467. Each epoch prints the loss for the four XOR patterns; the per-pattern losses decrease slowly and monotonically from ≈(0.0216, 0.0234, 0.0234, 0.0294) at epoch 6217 to ≈(0.0209, 0.0226, 0.0226, 0.0284) at epoch 6467.]
Epoch: 6467, Training Loss: 0.02263
Epoch: 6467, Training Loss: 0.02842
Epoch: 6468, Training Loss: 0.02090
Epoch: 6468, Training Loss: 0.02259
Epoch: 6468, Training Loss: 0.02263
Epoch: 6468, Training Loss: 0.02841
Epoch: 6469, Training Loss: 0.02090
Epoch: 6469, Training Loss: 0.02259
Epoch: 6469, Training Loss: 0.02263
Epoch: 6469, Training Loss: 0.02841
Epoch: 6470, Training Loss: 0.02090
Epoch: 6470, Training Loss: 0.02258
Epoch: 6470, Training Loss: 0.02262
Epoch: 6470, Training Loss: 0.02841
Epoch: 6471, Training Loss: 0.02090
Epoch: 6471, Training Loss: 0.02258
Epoch: 6471, Training Loss: 0.02262
Epoch: 6471, Training Loss: 0.02840
Epoch: 6472, Training Loss: 0.02089
Epoch: 6472, Training Loss: 0.02258
Epoch: 6472, Training Loss: 0.02262
Epoch: 6472, Training Loss: 0.02840
Epoch: 6473, Training Loss: 0.02089
Epoch: 6473, Training Loss: 0.02258
Epoch: 6473, Training Loss: 0.02262
Epoch: 6473, Training Loss: 0.02839
Epoch: 6474, Training Loss: 0.02089
Epoch: 6474, Training Loss: 0.02257
Epoch: 6474, Training Loss: 0.02261
Epoch: 6474, Training Loss: 0.02839
Epoch: 6475, Training Loss: 0.02089
Epoch: 6475, Training Loss: 0.02257
Epoch: 6475, Training Loss: 0.02261
Epoch: 6475, Training Loss: 0.02839
Epoch: 6476, Training Loss: 0.02088
Epoch: 6476, Training Loss: 0.02257
Epoch: 6476, Training Loss: 0.02261
Epoch: 6476, Training Loss: 0.02838
Epoch: 6477, Training Loss: 0.02088
Epoch: 6477, Training Loss: 0.02256
Epoch: 6477, Training Loss: 0.02260
Epoch: 6477, Training Loss: 0.02838
Epoch: 6478, Training Loss: 0.02088
Epoch: 6478, Training Loss: 0.02256
Epoch: 6478, Training Loss: 0.02260
Epoch: 6478, Training Loss: 0.02838
Epoch: 6479, Training Loss: 0.02088
Epoch: 6479, Training Loss: 0.02256
Epoch: 6479, Training Loss: 0.02260
Epoch: 6479, Training Loss: 0.02837
Epoch: 6480, Training Loss: 0.02087
Epoch: 6480, Training Loss: 0.02256
Epoch: 6480, Training Loss: 0.02260
Epoch: 6480, Training Loss: 0.02837
Epoch: 6481, Training Loss: 0.02087
Epoch: 6481, Training Loss: 0.02255
Epoch: 6481, Training Loss: 0.02259
Epoch: 6481, Training Loss: 0.02836
Epoch: 6482, Training Loss: 0.02087
Epoch: 6482, Training Loss: 0.02255
Epoch: 6482, Training Loss: 0.02259
Epoch: 6482, Training Loss: 0.02836
Epoch: 6483, Training Loss: 0.02086
Epoch: 6483, Training Loss: 0.02255
Epoch: 6483, Training Loss: 0.02259
Epoch: 6483, Training Loss: 0.02836
Epoch: 6484, Training Loss: 0.02086
Epoch: 6484, Training Loss: 0.02254
Epoch: 6484, Training Loss: 0.02258
Epoch: 6484, Training Loss: 0.02835
Epoch: 6485, Training Loss: 0.02086
Epoch: 6485, Training Loss: 0.02254
Epoch: 6485, Training Loss: 0.02258
Epoch: 6485, Training Loss: 0.02835
Epoch: 6486, Training Loss: 0.02086
Epoch: 6486, Training Loss: 0.02254
Epoch: 6486, Training Loss: 0.02258
Epoch: 6486, Training Loss: 0.02835
Epoch: 6487, Training Loss: 0.02085
Epoch: 6487, Training Loss: 0.02254
Epoch: 6487, Training Loss: 0.02257
Epoch: 6487, Training Loss: 0.02834
Epoch: 6488, Training Loss: 0.02085
Epoch: 6488, Training Loss: 0.02253
Epoch: 6488, Training Loss: 0.02257
Epoch: 6488, Training Loss: 0.02834
Epoch: 6489, Training Loss: 0.02085
Epoch: 6489, Training Loss: 0.02253
Epoch: 6489, Training Loss: 0.02257
Epoch: 6489, Training Loss: 0.02833
Epoch: 6490, Training Loss: 0.02085
Epoch: 6490, Training Loss: 0.02253
Epoch: 6490, Training Loss: 0.02257
Epoch: 6490, Training Loss: 0.02833
Epoch: 6491, Training Loss: 0.02084
Epoch: 6491, Training Loss: 0.02252
Epoch: 6491, Training Loss: 0.02256
Epoch: 6491, Training Loss: 0.02833
Epoch: 6492, Training Loss: 0.02084
Epoch: 6492, Training Loss: 0.02252
Epoch: 6492, Training Loss: 0.02256
Epoch: 6492, Training Loss: 0.02832
Epoch: 6493, Training Loss: 0.02084
Epoch: 6493, Training Loss: 0.02252
Epoch: 6493, Training Loss: 0.02256
Epoch: 6493, Training Loss: 0.02832
Epoch: 6494, Training Loss: 0.02084
Epoch: 6494, Training Loss: 0.02252
Epoch: 6494, Training Loss: 0.02255
Epoch: 6494, Training Loss: 0.02832
Epoch: 6495, Training Loss: 0.02083
Epoch: 6495, Training Loss: 0.02251
Epoch: 6495, Training Loss: 0.02255
Epoch: 6495, Training Loss: 0.02831
Epoch: 6496, Training Loss: 0.02083
Epoch: 6496, Training Loss: 0.02251
Epoch: 6496, Training Loss: 0.02255
Epoch: 6496, Training Loss: 0.02831
Epoch: 6497, Training Loss: 0.02083
Epoch: 6497, Training Loss: 0.02251
Epoch: 6497, Training Loss: 0.02255
Epoch: 6497, Training Loss: 0.02830
Epoch: 6498, Training Loss: 0.02083
Epoch: 6498, Training Loss: 0.02250
Epoch: 6498, Training Loss: 0.02254
Epoch: 6498, Training Loss: 0.02830
Epoch: 6499, Training Loss: 0.02082
Epoch: 6499, Training Loss: 0.02250
Epoch: 6499, Training Loss: 0.02254
Epoch: 6499, Training Loss: 0.02830
Epoch: 6500, Training Loss: 0.02082
Epoch: 6500, Training Loss: 0.02250
Epoch: 6500, Training Loss: 0.02254
Epoch: 6500, Training Loss: 0.02829
Epoch: 6501, Training Loss: 0.02082
Epoch: 6501, Training Loss: 0.02250
Epoch: 6501, Training Loss: 0.02253
Epoch: 6501, Training Loss: 0.02829
Epoch: 6502, Training Loss: 0.02082
Epoch: 6502, Training Loss: 0.02249
Epoch: 6502, Training Loss: 0.02253
Epoch: 6502, Training Loss: 0.02829
Epoch: 6503, Training Loss: 0.02081
Epoch: 6503, Training Loss: 0.02249
Epoch: 6503, Training Loss: 0.02253
Epoch: 6503, Training Loss: 0.02828
Epoch: 6504, Training Loss: 0.02081
Epoch: 6504, Training Loss: 0.02249
Epoch: 6504, Training Loss: 0.02253
Epoch: 6504, Training Loss: 0.02828
Epoch: 6505, Training Loss: 0.02081
Epoch: 6505, Training Loss: 0.02248
Epoch: 6505, Training Loss: 0.02252
Epoch: 6505, Training Loss: 0.02827
Epoch: 6506, Training Loss: 0.02081
Epoch: 6506, Training Loss: 0.02248
Epoch: 6506, Training Loss: 0.02252
Epoch: 6506, Training Loss: 0.02827
Epoch: 6507, Training Loss: 0.02080
Epoch: 6507, Training Loss: 0.02248
Epoch: 6507, Training Loss: 0.02252
Epoch: 6507, Training Loss: 0.02827
Epoch: 6508, Training Loss: 0.02080
Epoch: 6508, Training Loss: 0.02247
Epoch: 6508, Training Loss: 0.02251
Epoch: 6508, Training Loss: 0.02826
Epoch: 6509, Training Loss: 0.02080
Epoch: 6509, Training Loss: 0.02247
Epoch: 6509, Training Loss: 0.02251
Epoch: 6509, Training Loss: 0.02826
Epoch: 6510, Training Loss: 0.02080
Epoch: 6510, Training Loss: 0.02247
Epoch: 6510, Training Loss: 0.02251
Epoch: 6510, Training Loss: 0.02826
Epoch: 6511, Training Loss: 0.02079
Epoch: 6511, Training Loss: 0.02247
Epoch: 6511, Training Loss: 0.02251
Epoch: 6511, Training Loss: 0.02825
Epoch: 6512, Training Loss: 0.02079
Epoch: 6512, Training Loss: 0.02246
Epoch: 6512, Training Loss: 0.02250
Epoch: 6512, Training Loss: 0.02825
Epoch: 6513, Training Loss: 0.02079
Epoch: 6513, Training Loss: 0.02246
Epoch: 6513, Training Loss: 0.02250
Epoch: 6513, Training Loss: 0.02824
Epoch: 6514, Training Loss: 0.02079
Epoch: 6514, Training Loss: 0.02246
Epoch: 6514, Training Loss: 0.02250
Epoch: 6514, Training Loss: 0.02824
Epoch: 6515, Training Loss: 0.02078
Epoch: 6515, Training Loss: 0.02245
Epoch: 6515, Training Loss: 0.02249
Epoch: 6515, Training Loss: 0.02824
Epoch: 6516, Training Loss: 0.02078
Epoch: 6516, Training Loss: 0.02245
Epoch: 6516, Training Loss: 0.02249
Epoch: 6516, Training Loss: 0.02823
Epoch: 6517, Training Loss: 0.02078
Epoch: 6517, Training Loss: 0.02245
Epoch: 6517, Training Loss: 0.02249
Epoch: 6517, Training Loss: 0.02823
Epoch: 6518, Training Loss: 0.02078
Epoch: 6518, Training Loss: 0.02245
Epoch: 6518, Training Loss: 0.02249
Epoch: 6518, Training Loss: 0.02823
Epoch: 6519, Training Loss: 0.02077
Epoch: 6519, Training Loss: 0.02244
Epoch: 6519, Training Loss: 0.02248
Epoch: 6519, Training Loss: 0.02822
Epoch: 6520, Training Loss: 0.02077
Epoch: 6520, Training Loss: 0.02244
Epoch: 6520, Training Loss: 0.02248
Epoch: 6520, Training Loss: 0.02822
Epoch: 6521, Training Loss: 0.02077
Epoch: 6521, Training Loss: 0.02244
Epoch: 6521, Training Loss: 0.02248
Epoch: 6521, Training Loss: 0.02821
Epoch: 6522, Training Loss: 0.02077
Epoch: 6522, Training Loss: 0.02243
Epoch: 6522, Training Loss: 0.02247
Epoch: 6522, Training Loss: 0.02821
Epoch: 6523, Training Loss: 0.02076
Epoch: 6523, Training Loss: 0.02243
Epoch: 6523, Training Loss: 0.02247
Epoch: 6523, Training Loss: 0.02821
Epoch: 6524, Training Loss: 0.02076
Epoch: 6524, Training Loss: 0.02243
Epoch: 6524, Training Loss: 0.02247
Epoch: 6524, Training Loss: 0.02820
Epoch: 6525, Training Loss: 0.02076
Epoch: 6525, Training Loss: 0.02243
Epoch: 6525, Training Loss: 0.02247
Epoch: 6525, Training Loss: 0.02820
Epoch: 6526, Training Loss: 0.02076
Epoch: 6526, Training Loss: 0.02242
Epoch: 6526, Training Loss: 0.02246
Epoch: 6526, Training Loss: 0.02820
Epoch: 6527, Training Loss: 0.02075
Epoch: 6527, Training Loss: 0.02242
Epoch: 6527, Training Loss: 0.02246
Epoch: 6527, Training Loss: 0.02819
Epoch: 6528, Training Loss: 0.02075
Epoch: 6528, Training Loss: 0.02242
Epoch: 6528, Training Loss: 0.02246
Epoch: 6528, Training Loss: 0.02819
Epoch: 6529, Training Loss: 0.02075
Epoch: 6529, Training Loss: 0.02241
Epoch: 6529, Training Loss: 0.02245
Epoch: 6529, Training Loss: 0.02818
Epoch: 6530, Training Loss: 0.02075
Epoch: 6530, Training Loss: 0.02241
Epoch: 6530, Training Loss: 0.02245
Epoch: 6530, Training Loss: 0.02818
Epoch: 6531, Training Loss: 0.02074
Epoch: 6531, Training Loss: 0.02241
Epoch: 6531, Training Loss: 0.02245
Epoch: 6531, Training Loss: 0.02818
Epoch: 6532, Training Loss: 0.02074
Epoch: 6532, Training Loss: 0.02241
Epoch: 6532, Training Loss: 0.02245
Epoch: 6532, Training Loss: 0.02817
Epoch: 6533, Training Loss: 0.02074
Epoch: 6533, Training Loss: 0.02240
Epoch: 6533, Training Loss: 0.02244
Epoch: 6533, Training Loss: 0.02817
Epoch: 6534, Training Loss: 0.02074
Epoch: 6534, Training Loss: 0.02240
Epoch: 6534, Training Loss: 0.02244
Epoch: 6534, Training Loss: 0.02817
Epoch: 6535, Training Loss: 0.02073
Epoch: 6535, Training Loss: 0.02240
Epoch: 6535, Training Loss: 0.02244
Epoch: 6535, Training Loss: 0.02816
Epoch: 6536, Training Loss: 0.02073
Epoch: 6536, Training Loss: 0.02239
Epoch: 6536, Training Loss: 0.02243
Epoch: 6536, Training Loss: 0.02816
Epoch: 6537, Training Loss: 0.02073
Epoch: 6537, Training Loss: 0.02239
Epoch: 6537, Training Loss: 0.02243
Epoch: 6537, Training Loss: 0.02815
Epoch: 6538, Training Loss: 0.02073
Epoch: 6538, Training Loss: 0.02239
Epoch: 6538, Training Loss: 0.02243
Epoch: 6538, Training Loss: 0.02815
Epoch: 6539, Training Loss: 0.02072
Epoch: 6539, Training Loss: 0.02239
Epoch: 6539, Training Loss: 0.02243
Epoch: 6539, Training Loss: 0.02815
Epoch: 6540, Training Loss: 0.02072
Epoch: 6540, Training Loss: 0.02238
Epoch: 6540, Training Loss: 0.02242
Epoch: 6540, Training Loss: 0.02814
Epoch: 6541, Training Loss: 0.02072
Epoch: 6541, Training Loss: 0.02238
Epoch: 6541, Training Loss: 0.02242
Epoch: 6541, Training Loss: 0.02814
Epoch: 6542, Training Loss: 0.02072
Epoch: 6542, Training Loss: 0.02238
Epoch: 6542, Training Loss: 0.02242
Epoch: 6542, Training Loss: 0.02814
Epoch: 6543, Training Loss: 0.02071
Epoch: 6543, Training Loss: 0.02238
Epoch: 6543, Training Loss: 0.02241
Epoch: 6543, Training Loss: 0.02813
Epoch: 6544, Training Loss: 0.02071
Epoch: 6544, Training Loss: 0.02237
Epoch: 6544, Training Loss: 0.02241
Epoch: 6544, Training Loss: 0.02813
Epoch: 6545, Training Loss: 0.02071
Epoch: 6545, Training Loss: 0.02237
Epoch: 6545, Training Loss: 0.02241
Epoch: 6545, Training Loss: 0.02813
Epoch: 6546, Training Loss: 0.02071
Epoch: 6546, Training Loss: 0.02237
Epoch: 6546, Training Loss: 0.02241
Epoch: 6546, Training Loss: 0.02812
Epoch: 6547, Training Loss: 0.02070
Epoch: 6547, Training Loss: 0.02236
Epoch: 6547, Training Loss: 0.02240
Epoch: 6547, Training Loss: 0.02812
Epoch: 6548, Training Loss: 0.02070
Epoch: 6548, Training Loss: 0.02236
Epoch: 6548, Training Loss: 0.02240
Epoch: 6548, Training Loss: 0.02811
Epoch: 6549, Training Loss: 0.02070
Epoch: 6549, Training Loss: 0.02236
Epoch: 6549, Training Loss: 0.02240
Epoch: 6549, Training Loss: 0.02811
Epoch: 6550, Training Loss: 0.02070
Epoch: 6550, Training Loss: 0.02236
Epoch: 6550, Training Loss: 0.02239
Epoch: 6550, Training Loss: 0.02811
Epoch: 6551, Training Loss: 0.02069
Epoch: 6551, Training Loss: 0.02235
Epoch: 6551, Training Loss: 0.02239
Epoch: 6551, Training Loss: 0.02810
Epoch: 6552, Training Loss: 0.02069
Epoch: 6552, Training Loss: 0.02235
Epoch: 6552, Training Loss: 0.02239
Epoch: 6552, Training Loss: 0.02810
Epoch: 6553, Training Loss: 0.02069
Epoch: 6553, Training Loss: 0.02235
Epoch: 6553, Training Loss: 0.02239
Epoch: 6553, Training Loss: 0.02810
Epoch: 6554, Training Loss: 0.02069
Epoch: 6554, Training Loss: 0.02234
Epoch: 6554, Training Loss: 0.02238
Epoch: 6554, Training Loss: 0.02809
Epoch: 6555, Training Loss: 0.02068
Epoch: 6555, Training Loss: 0.02234
Epoch: 6555, Training Loss: 0.02238
Epoch: 6555, Training Loss: 0.02809
Epoch: 6556, Training Loss: 0.02068
Epoch: 6556, Training Loss: 0.02234
Epoch: 6556, Training Loss: 0.02238
Epoch: 6556, Training Loss: 0.02808
Epoch: 6557, Training Loss: 0.02068
Epoch: 6557, Training Loss: 0.02234
Epoch: 6557, Training Loss: 0.02237
Epoch: 6557, Training Loss: 0.02808
Epoch: 6558, Training Loss: 0.02068
Epoch: 6558, Training Loss: 0.02233
Epoch: 6558, Training Loss: 0.02237
Epoch: 6558, Training Loss: 0.02808
Epoch: 6559, Training Loss: 0.02068
Epoch: 6559, Training Loss: 0.02233
Epoch: 6559, Training Loss: 0.02237
Epoch: 6559, Training Loss: 0.02807
Epoch: 6560, Training Loss: 0.02067
Epoch: 6560, Training Loss: 0.02233
Epoch: 6560, Training Loss: 0.02237
Epoch: 6560, Training Loss: 0.02807
Epoch: 6561, Training Loss: 0.02067
Epoch: 6561, Training Loss: 0.02232
Epoch: 6561, Training Loss: 0.02236
Epoch: 6561, Training Loss: 0.02807
Epoch: 6562, Training Loss: 0.02067
Epoch: 6562, Training Loss: 0.02232
Epoch: 6562, Training Loss: 0.02236
Epoch: 6562, Training Loss: 0.02806
Epoch: 6563, Training Loss: 0.02067
Epoch: 6563, Training Loss: 0.02232
Epoch: 6563, Training Loss: 0.02236
Epoch: 6563, Training Loss: 0.02806
Epoch: 6564, Training Loss: 0.02066
Epoch: 6564, Training Loss: 0.02232
Epoch: 6564, Training Loss: 0.02235
Epoch: 6564, Training Loss: 0.02806
Epoch: 6565, Training Loss: 0.02066
Epoch: 6565, Training Loss: 0.02231
Epoch: 6565, Training Loss: 0.02235
Epoch: 6565, Training Loss: 0.02805
Epoch: 6566, Training Loss: 0.02066
Epoch: 6566, Training Loss: 0.02231
Epoch: 6566, Training Loss: 0.02235
Epoch: 6566, Training Loss: 0.02805
Epoch: 6567, Training Loss: 0.02066
Epoch: 6567, Training Loss: 0.02231
Epoch: 6567, Training Loss: 0.02235
Epoch: 6567, Training Loss: 0.02804
Epoch: 6568, Training Loss: 0.02065
Epoch: 6568, Training Loss: 0.02230
Epoch: 6568, Training Loss: 0.02234
Epoch: 6568, Training Loss: 0.02804
Epoch: 6569, Training Loss: 0.02065
Epoch: 6569, Training Loss: 0.02230
Epoch: 6569, Training Loss: 0.02234
Epoch: 6569, Training Loss: 0.02804
Epoch: 6570, Training Loss: 0.02065
Epoch: 6570, Training Loss: 0.02230
Epoch: 6570, Training Loss: 0.02234
Epoch: 6570, Training Loss: 0.02803
Epoch: 6571, Training Loss: 0.02065
Epoch: 6571, Training Loss: 0.02230
Epoch: 6571, Training Loss: 0.02233
Epoch: 6571, Training Loss: 0.02803
Epoch: 6572, Training Loss: 0.02064
Epoch: 6572, Training Loss: 0.02229
Epoch: 6572, Training Loss: 0.02233
Epoch: 6572, Training Loss: 0.02803
Epoch: 6573, Training Loss: 0.02064
Epoch: 6573, Training Loss: 0.02229
Epoch: 6573, Training Loss: 0.02233
Epoch: 6573, Training Loss: 0.02802
Epoch: 6574, Training Loss: 0.02064
Epoch: 6574, Training Loss: 0.02229
Epoch: 6574, Training Loss: 0.02233
Epoch: 6574, Training Loss: 0.02802
Epoch: 6575, Training Loss: 0.02064
Epoch: 6575, Training Loss: 0.02228
Epoch: 6575, Training Loss: 0.02232
Epoch: 6575, Training Loss: 0.02801
Epoch: 6576, Training Loss: 0.02063
Epoch: 6576, Training Loss: 0.02228
Epoch: 6576, Training Loss: 0.02232
Epoch: 6576, Training Loss: 0.02801
Epoch: 6577, Training Loss: 0.02063
Epoch: 6577, Training Loss: 0.02228
Epoch: 6577, Training Loss: 0.02232
Epoch: 6577, Training Loss: 0.02801
Epoch: 6578, Training Loss: 0.02063
Epoch: 6578, Training Loss: 0.02228
Epoch: 6578, Training Loss: 0.02232
Epoch: 6578, Training Loss: 0.02800
Epoch: 6579, Training Loss: 0.02063
Epoch: 6579, Training Loss: 0.02227
Epoch: 6579, Training Loss: 0.02231
Epoch: 6579, Training Loss: 0.02800
Epoch: 6580, Training Loss: 0.02062
Epoch: 6580, Training Loss: 0.02227
Epoch: 6580, Training Loss: 0.02231
Epoch: 6580, Training Loss: 0.02800
Epoch: 6581, Training Loss: 0.02062
Epoch: 6581, Training Loss: 0.02227
Epoch: 6581, Training Loss: 0.02231
Epoch: 6581, Training Loss: 0.02799
Epoch: 6582, Training Loss: 0.02062
Epoch: 6582, Training Loss: 0.02227
Epoch: 6582, Training Loss: 0.02230
Epoch: 6582, Training Loss: 0.02799
Epoch: 6583, Training Loss: 0.02062
Epoch: 6583, Training Loss: 0.02226
Epoch: 6583, Training Loss: 0.02230
Epoch: 6583, Training Loss: 0.02799
Epoch: 6584, Training Loss: 0.02061
Epoch: 6584, Training Loss: 0.02226
Epoch: 6584, Training Loss: 0.02230
Epoch: 6584, Training Loss: 0.02798
Epoch: 6585, Training Loss: 0.02061
Epoch: 6585, Training Loss: 0.02226
Epoch: 6585, Training Loss: 0.02230
Epoch: 6585, Training Loss: 0.02798
Epoch: 6586, Training Loss: 0.02061
Epoch: 6586, Training Loss: 0.02225
Epoch: 6586, Training Loss: 0.02229
Epoch: 6586, Training Loss: 0.02797
Epoch: 6587, Training Loss: 0.02061
Epoch: 6587, Training Loss: 0.02225
Epoch: 6587, Training Loss: 0.02229
Epoch: 6587, Training Loss: 0.02797
Epoch: 6588, Training Loss: 0.02060
Epoch: 6588, Training Loss: 0.02225
Epoch: 6588, Training Loss: 0.02229
Epoch: 6588, Training Loss: 0.02797
Epoch: 6589, Training Loss: 0.02060
Epoch: 6589, Training Loss: 0.02225
Epoch: 6589, Training Loss: 0.02228
Epoch: 6589, Training Loss: 0.02796
Epoch: 6590, Training Loss: 0.02060
Epoch: 6590, Training Loss: 0.02224
Epoch: 6590, Training Loss: 0.02228
Epoch: 6590, Training Loss: 0.02796
Epoch: 6591, Training Loss: 0.02060
Epoch: 6591, Training Loss: 0.02224
Epoch: 6591, Training Loss: 0.02228
Epoch: 6591, Training Loss: 0.02796
Epoch: 6592, Training Loss: 0.02059
Epoch: 6592, Training Loss: 0.02224
Epoch: 6592, Training Loss: 0.02228
Epoch: 6592, Training Loss: 0.02795
Epoch: 6593, Training Loss: 0.02059
Epoch: 6593, Training Loss: 0.02223
Epoch: 6593, Training Loss: 0.02227
Epoch: 6593, Training Loss: 0.02795
Epoch: 6594, Training Loss: 0.02059
Epoch: 6594, Training Loss: 0.02223
Epoch: 6594, Training Loss: 0.02227
Epoch: 6594, Training Loss: 0.02795
Epoch: 6595, Training Loss: 0.02059
Epoch: 6595, Training Loss: 0.02223
Epoch: 6595, Training Loss: 0.02227
Epoch: 6595, Training Loss: 0.02794
Epoch: 6596, Training Loss: 0.02058
Epoch: 6596, Training Loss: 0.02223
Epoch: 6596, Training Loss: 0.02226
Epoch: 6596, Training Loss: 0.02794
Epoch: 6597, Training Loss: 0.02058
Epoch: 6597, Training Loss: 0.02222
Epoch: 6597, Training Loss: 0.02226
Epoch: 6597, Training Loss: 0.02793
Epoch: 6598, Training Loss: 0.02058
Epoch: 6598, Training Loss: 0.02222
Epoch: 6598, Training Loss: 0.02226
Epoch: 6598, Training Loss: 0.02793
Epoch: 6599, Training Loss: 0.02058
Epoch: 6599, Training Loss: 0.02222
Epoch: 6599, Training Loss: 0.02226
Epoch: 6599, Training Loss: 0.02793
Epoch: 6600, Training Loss: 0.02057
Epoch: 6600, Training Loss: 0.02222
Epoch: 6600, Training Loss: 0.02225
Epoch: 6600, Training Loss: 0.02792
Epoch: 6601, Training Loss: 0.02057
Epoch: 6601, Training Loss: 0.02221
Epoch: 6601, Training Loss: 0.02225
Epoch: 6601, Training Loss: 0.02792
Epoch: 6602, Training Loss: 0.02057
Epoch: 6602, Training Loss: 0.02221
Epoch: 6602, Training Loss: 0.02225
Epoch: 6602, Training Loss: 0.02792
Epoch: 6603, Training Loss: 0.02057
Epoch: 6603, Training Loss: 0.02221
Epoch: 6603, Training Loss: 0.02225
Epoch: 6603, Training Loss: 0.02791
Epoch: 6604, Training Loss: 0.02057
Epoch: 6604, Training Loss: 0.02220
Epoch: 6604, Training Loss: 0.02224
Epoch: 6604, Training Loss: 0.02791
Epoch: 6605, Training Loss: 0.02056
Epoch: 6605, Training Loss: 0.02220
Epoch: 6605, Training Loss: 0.02224
Epoch: 6605, Training Loss: 0.02791
Epoch: 6606, Training Loss: 0.02056
Epoch: 6606, Training Loss: 0.02220
Epoch: 6606, Training Loss: 0.02224
Epoch: 6606, Training Loss: 0.02790
Epoch: 6607, Training Loss: 0.02056
Epoch: 6607, Training Loss: 0.02220
Epoch: 6607, Training Loss: 0.02223
Epoch: 6607, Training Loss: 0.02790
Epoch: 6608, Training Loss: 0.02056
Epoch: 6608, Training Loss: 0.02219
Epoch: 6608, Training Loss: 0.02223
Epoch: 6608, Training Loss: 0.02790
Epoch: 6609, Training Loss: 0.02055
Epoch: 6609, Training Loss: 0.02219
Epoch: 6609, Training Loss: 0.02223
Epoch: 6609, Training Loss: 0.02789
Epoch: 6610, Training Loss: 0.02055
Epoch: 6610, Training Loss: 0.02219
Epoch: 6610, Training Loss: 0.02223
Epoch: 6610, Training Loss: 0.02789
Epoch: 6611, Training Loss: 0.02055
Epoch: 6611, Training Loss: 0.02218
Epoch: 6611, Training Loss: 0.02222
Epoch: 6611, Training Loss: 0.02788
Epoch: 6612, Training Loss: 0.02055
Epoch: 6612, Training Loss: 0.02218
Epoch: 6612, Training Loss: 0.02222
Epoch: 6612, Training Loss: 0.02788
Epoch: 6613, Training Loss: 0.02054
Epoch: 6613, Training Loss: 0.02218
Epoch: 6613, Training Loss: 0.02222
Epoch: 6613, Training Loss: 0.02788
Epoch: 6614, Training Loss: 0.02054
Epoch: 6614, Training Loss: 0.02218
Epoch: 6614, Training Loss: 0.02221
Epoch: 6614, Training Loss: 0.02787
Epoch: 6615, Training Loss: 0.02054
Epoch: 6615, Training Loss: 0.02217
Epoch: 6615, Training Loss: 0.02221
Epoch: 6615, Training Loss: 0.02787
Epoch: 6616, Training Loss: 0.02054
Epoch: 6616, Training Loss: 0.02217
Epoch: 6616, Training Loss: 0.02221
Epoch: 6616, Training Loss: 0.02787
Epoch: 6617, Training Loss: 0.02053
Epoch: 6617, Training Loss: 0.02217
Epoch: 6617, Training Loss: 0.02221
Epoch: 6617, Training Loss: 0.02786
Epoch: 6618, Training Loss: 0.02053
Epoch: 6618, Training Loss: 0.02217
Epoch: 6618, Training Loss: 0.02220
Epoch: 6618, Training Loss: 0.02786
Epoch: 6619, Training Loss: 0.02053
Epoch: 6619, Training Loss: 0.02216
Epoch: 6619, Training Loss: 0.02220
Epoch: 6619, Training Loss: 0.02786
Epoch: 6620, Training Loss: 0.02053
Epoch: 6620, Training Loss: 0.02216
Epoch: 6620, Training Loss: 0.02220
Epoch: 6620, Training Loss: 0.02785
Epoch: 6621, Training Loss: 0.02052
Epoch: 6621, Training Loss: 0.02216
Epoch: 6621, Training Loss: 0.02220
Epoch: 6621, Training Loss: 0.02785
Epoch: 6622, Training Loss: 0.02052
Epoch: 6622, Training Loss: 0.02215
Epoch: 6622, Training Loss: 0.02219
Epoch: 6622, Training Loss: 0.02784
Epoch: 6623, Training Loss: 0.02052
Epoch: 6623, Training Loss: 0.02215
Epoch: 6623, Training Loss: 0.02219
Epoch: 6623, Training Loss: 0.02784
Epoch: 6624, Training Loss: 0.02052
Epoch: 6624, Training Loss: 0.02215
Epoch: 6624, Training Loss: 0.02219
Epoch: 6624, Training Loss: 0.02784
Epoch: 6625, Training Loss: 0.02051
Epoch: 6625, Training Loss: 0.02215
Epoch: 6625, Training Loss: 0.02218
Epoch: 6625, Training Loss: 0.02783
Epoch: 6626, Training Loss: 0.02051
Epoch: 6626, Training Loss: 0.02214
Epoch: 6626, Training Loss: 0.02218
Epoch: 6626, Training Loss: 0.02783
Epoch: 6627, Training Loss: 0.02051
Epoch: 6627, Training Loss: 0.02214
Epoch: 6627, Training Loss: 0.02218
Epoch: 6627, Training Loss: 0.02783
Epoch: 6628, Training Loss: 0.02051
Epoch: 6628, Training Loss: 0.02214
Epoch: 6628, Training Loss: 0.02218
Epoch: 6628, Training Loss: 0.02782
Epoch: 6629, Training Loss: 0.02050
Epoch: 6629, Training Loss: 0.02213
Epoch: 6629, Training Loss: 0.02217
Epoch: 6629, Training Loss: 0.02782
Epoch: 6630, Training Loss: 0.02050
Epoch: 6630, Training Loss: 0.02213
Epoch: 6630, Training Loss: 0.02217
Epoch: 6630, Training Loss: 0.02782
Epoch: 6631, Training Loss: 0.02050
Epoch: 6631, Training Loss: 0.02213
Epoch: 6631, Training Loss: 0.02217
Epoch: 6631, Training Loss: 0.02781
Epoch: 6632, Training Loss: 0.02050
Epoch: 6632, Training Loss: 0.02213
Epoch: 6632, Training Loss: 0.02217
Epoch: 6632, Training Loss: 0.02781
Epoch: 6633, Training Loss: 0.02050
Epoch: 6633, Training Loss: 0.02212
Epoch: 6633, Training Loss: 0.02216
Epoch: 6633, Training Loss: 0.02781
Epoch: 6634, Training Loss: 0.02049
Epoch: 6634, Training Loss: 0.02212
Epoch: 6634, Training Loss: 0.02216
Epoch: 6634, Training Loss: 0.02780
Epoch: 6635, Training Loss: 0.02049
Epoch: 6635, Training Loss: 0.02212
Epoch: 6635, Training Loss: 0.02216
Epoch: 6635, Training Loss: 0.02780
Epoch: 6636, Training Loss: 0.02049
Epoch: 6636, Training Loss: 0.02212
Epoch: 6636, Training Loss: 0.02215
Epoch: 6636, Training Loss: 0.02779
Epoch: 6637, Training Loss: 0.02049
Epoch: 6637, Training Loss: 0.02211
Epoch: 6637, Training Loss: 0.02215
Epoch: 6637, Training Loss: 0.02779
Epoch: 6638, Training Loss: 0.02048
Epoch: 6638, Training Loss: 0.02211
Epoch: 6638, Training Loss: 0.02215
Epoch: 6638, Training Loss: 0.02779
Epoch: 6639, Training Loss: 0.02048
Epoch: 6639, Training Loss: 0.02211
Epoch: 6639, Training Loss: 0.02215
Epoch: 6639, Training Loss: 0.02778
Epoch: 6640, Training Loss: 0.02048
Epoch: 6640, Training Loss: 0.02210
Epoch: 6640, Training Loss: 0.02214
Epoch: 6640, Training Loss: 0.02778
Epoch: 6641, Training Loss: 0.02048
Epoch: 6641, Training Loss: 0.02210
Epoch: 6641, Training Loss: 0.02214
Epoch: 6641, Training Loss: 0.02778
Epoch: 6642, Training Loss: 0.02047
Epoch: 6642, Training Loss: 0.02210
Epoch: 6642, Training Loss: 0.02214
Epoch: 6642, Training Loss: 0.02777
Epoch: 6643, Training Loss: 0.02047
Epoch: 6643, Training Loss: 0.02210
Epoch: 6643, Training Loss: 0.02213
Epoch: 6643, Training Loss: 0.02777
Epoch: 6644, Training Loss: 0.02047
Epoch: 6644, Training Loss: 0.02209
Epoch: 6644, Training Loss: 0.02213
Epoch: 6644, Training Loss: 0.02777
Epoch: 6645, Training Loss: 0.02047
Epoch: 6645, Training Loss: 0.02209
Epoch: 6645, Training Loss: 0.02213
Epoch: 6645, Training Loss: 0.02776
Epoch: 6646, Training Loss: 0.02046
Epoch: 6646, Training Loss: 0.02209
Epoch: 6646, Training Loss: 0.02213
Epoch: 6646, Training Loss: 0.02776
Epoch: 6647, Training Loss: 0.02046
Epoch: 6647, Training Loss: 0.02209
Epoch: 6647, Training Loss: 0.02212
Epoch: 6647, Training Loss: 0.02776
Epoch: 6648, Training Loss: 0.02046
Epoch: 6648, Training Loss: 0.02208
Epoch: 6648, Training Loss: 0.02212
Epoch: 6648, Training Loss: 0.02775
Epoch: 6649, Training Loss: 0.02046
Epoch: 6649, Training Loss: 0.02208
Epoch: 6649, Training Loss: 0.02212
Epoch: 6649, Training Loss: 0.02775
Epoch: 6650, Training Loss: 0.02045
Epoch: 6650, Training Loss: 0.02208
Epoch: 6650, Training Loss: 0.02212
Epoch: 6650, Training Loss: 0.02774
Epoch: 6651, Training Loss: 0.02045
Epoch: 6651, Training Loss: 0.02207
Epoch: 6651, Training Loss: 0.02211
Epoch: 6651, Training Loss: 0.02774
Epoch: 6652, Training Loss: 0.02045
Epoch: 6652, Training Loss: 0.02207
Epoch: 6652, Training Loss: 0.02211
Epoch: 6652, Training Loss: 0.02774
Epoch: 6653, Training Loss: 0.02045
Epoch: 6653, Training Loss: 0.02207
Epoch: 6653, Training Loss: 0.02211
Epoch: 6653, Training Loss: 0.02773
Epoch: 6654, Training Loss: 0.02044
Epoch: 6654, Training Loss: 0.02207
Epoch: 6654, Training Loss: 0.02210
Epoch: 6654, Training Loss: 0.02773
Epoch: 6655, Training Loss: 0.02044
Epoch: 6655, Training Loss: 0.02206
Epoch: 6655, Training Loss: 0.02210
Epoch: 6655, Training Loss: 0.02773
Epoch: 6656, Training Loss: 0.02044
Epoch: 6656, Training Loss: 0.02206
Epoch: 6656, Training Loss: 0.02210
Epoch: 6656, Training Loss: 0.02772
Epoch: 6657, Training Loss: 0.02044
Epoch: 6657, Training Loss: 0.02206
Epoch: 6657, Training Loss: 0.02210
Epoch: 6657, Training Loss: 0.02772
Epoch: 6658, Training Loss: 0.02044
Epoch: 6658, Training Loss: 0.02206
Epoch: 6658, Training Loss: 0.02209
Epoch: 6658, Training Loss: 0.02772
Epoch: 6659, Training Loss: 0.02043
Epoch: 6659, Training Loss: 0.02205
Epoch: 6659, Training Loss: 0.02209
Epoch: 6659, Training Loss: 0.02771
Epoch: 6660, Training Loss: 0.02043
Epoch: 6660, Training Loss: 0.02205
Epoch: 6660, Training Loss: 0.02209
Epoch: 6660, Training Loss: 0.02771
Epoch: 6661, Training Loss: 0.02043
Epoch: 6661, Training Loss: 0.02205
Epoch: 6661, Training Loss: 0.02209
Epoch: 6662, Training Loss: 0.02043
Epoch: 6662, Training Loss: 0.02204
Epoch: 6662, Training Loss: 0.02208
Epoch: 6662, Training Loss: 0.02770
[... repetitive training log truncated: one loss is printed per XOR pattern per epoch; from epoch 6662 to epoch 6911 the four per-pattern losses decline slowly, roughly 0.0204 → 0.0199, 0.0220 → 0.0214, 0.0221 → 0.0214, and 0.0277 → 0.0269 ...]
Epoch: 6911, Training Loss: 0.01986
Epoch: 6911, Training Loss: 0.02140
Epoch: 6911, Training Loss: 0.02143
Epoch: 6911, Training Loss: 0.02686
Epoch: 6912, Training Loss: 0.01986
Epoch: 6912, Training Loss: 0.02139
Epoch: 6912, Training Loss: 0.02143
Epoch: 6912, Training Loss: 0.02686
Epoch: 6913, Training Loss: 0.01985
Epoch: 6913, Training Loss: 0.02139
Epoch: 6913, Training Loss: 0.02143
Epoch: 6913, Training Loss: 0.02685
Epoch: 6914, Training Loss: 0.01985
Epoch: 6914, Training Loss: 0.02139
Epoch: 6914, Training Loss: 0.02143
Epoch: 6914, Training Loss: 0.02685
Epoch: 6915, Training Loss: 0.01985
Epoch: 6915, Training Loss: 0.02139
Epoch: 6915, Training Loss: 0.02142
Epoch: 6915, Training Loss: 0.02685
Epoch: 6916, Training Loss: 0.01985
Epoch: 6916, Training Loss: 0.02138
Epoch: 6916, Training Loss: 0.02142
Epoch: 6916, Training Loss: 0.02684
Epoch: 6917, Training Loss: 0.01984
Epoch: 6917, Training Loss: 0.02138
Epoch: 6917, Training Loss: 0.02142
Epoch: 6917, Training Loss: 0.02684
Epoch: 6918, Training Loss: 0.01984
Epoch: 6918, Training Loss: 0.02138
Epoch: 6918, Training Loss: 0.02142
Epoch: 6918, Training Loss: 0.02684
Epoch: 6919, Training Loss: 0.01984
Epoch: 6919, Training Loss: 0.02138
Epoch: 6919, Training Loss: 0.02141
Epoch: 6919, Training Loss: 0.02683
Epoch: 6920, Training Loss: 0.01984
Epoch: 6920, Training Loss: 0.02137
Epoch: 6920, Training Loss: 0.02141
Epoch: 6920, Training Loss: 0.02683
Epoch: 6921, Training Loss: 0.01984
Epoch: 6921, Training Loss: 0.02137
Epoch: 6921, Training Loss: 0.02141
Epoch: 6921, Training Loss: 0.02683
Epoch: 6922, Training Loss: 0.01983
Epoch: 6922, Training Loss: 0.02137
Epoch: 6922, Training Loss: 0.02141
Epoch: 6922, Training Loss: 0.02683
Epoch: 6923, Training Loss: 0.01983
Epoch: 6923, Training Loss: 0.02137
Epoch: 6923, Training Loss: 0.02140
Epoch: 6923, Training Loss: 0.02682
Epoch: 6924, Training Loss: 0.01983
Epoch: 6924, Training Loss: 0.02136
Epoch: 6924, Training Loss: 0.02140
Epoch: 6924, Training Loss: 0.02682
Epoch: 6925, Training Loss: 0.01983
Epoch: 6925, Training Loss: 0.02136
Epoch: 6925, Training Loss: 0.02140
Epoch: 6925, Training Loss: 0.02682
Epoch: 6926, Training Loss: 0.01982
Epoch: 6926, Training Loss: 0.02136
Epoch: 6926, Training Loss: 0.02140
Epoch: 6926, Training Loss: 0.02681
Epoch: 6927, Training Loss: 0.01982
Epoch: 6927, Training Loss: 0.02136
Epoch: 6927, Training Loss: 0.02139
Epoch: 6927, Training Loss: 0.02681
Epoch: 6928, Training Loss: 0.01982
Epoch: 6928, Training Loss: 0.02135
Epoch: 6928, Training Loss: 0.02139
Epoch: 6928, Training Loss: 0.02681
Epoch: 6929, Training Loss: 0.01982
Epoch: 6929, Training Loss: 0.02135
Epoch: 6929, Training Loss: 0.02139
Epoch: 6929, Training Loss: 0.02680
Epoch: 6930, Training Loss: 0.01982
Epoch: 6930, Training Loss: 0.02135
Epoch: 6930, Training Loss: 0.02139
Epoch: 6930, Training Loss: 0.02680
Epoch: 6931, Training Loss: 0.01981
Epoch: 6931, Training Loss: 0.02135
Epoch: 6931, Training Loss: 0.02138
Epoch: 6931, Training Loss: 0.02680
Epoch: 6932, Training Loss: 0.01981
Epoch: 6932, Training Loss: 0.02135
Epoch: 6932, Training Loss: 0.02138
Epoch: 6932, Training Loss: 0.02679
Epoch: 6933, Training Loss: 0.01981
Epoch: 6933, Training Loss: 0.02134
Epoch: 6933, Training Loss: 0.02138
Epoch: 6933, Training Loss: 0.02679
Epoch: 6934, Training Loss: 0.01981
Epoch: 6934, Training Loss: 0.02134
Epoch: 6934, Training Loss: 0.02138
Epoch: 6934, Training Loss: 0.02679
Epoch: 6935, Training Loss: 0.01981
Epoch: 6935, Training Loss: 0.02134
Epoch: 6935, Training Loss: 0.02137
Epoch: 6935, Training Loss: 0.02678
Epoch: 6936, Training Loss: 0.01980
Epoch: 6936, Training Loss: 0.02134
Epoch: 6936, Training Loss: 0.02137
Epoch: 6936, Training Loss: 0.02678
Epoch: 6937, Training Loss: 0.01980
Epoch: 6937, Training Loss: 0.02133
Epoch: 6937, Training Loss: 0.02137
Epoch: 6937, Training Loss: 0.02678
Epoch: 6938, Training Loss: 0.01980
Epoch: 6938, Training Loss: 0.02133
Epoch: 6938, Training Loss: 0.02137
Epoch: 6938, Training Loss: 0.02677
Epoch: 6939, Training Loss: 0.01980
Epoch: 6939, Training Loss: 0.02133
Epoch: 6939, Training Loss: 0.02136
Epoch: 6939, Training Loss: 0.02677
Epoch: 6940, Training Loss: 0.01979
Epoch: 6940, Training Loss: 0.02133
Epoch: 6940, Training Loss: 0.02136
Epoch: 6940, Training Loss: 0.02677
Epoch: 6941, Training Loss: 0.01979
Epoch: 6941, Training Loss: 0.02132
Epoch: 6941, Training Loss: 0.02136
Epoch: 6941, Training Loss: 0.02676
Epoch: 6942, Training Loss: 0.01979
Epoch: 6942, Training Loss: 0.02132
Epoch: 6942, Training Loss: 0.02136
Epoch: 6942, Training Loss: 0.02676
Epoch: 6943, Training Loss: 0.01979
Epoch: 6943, Training Loss: 0.02132
Epoch: 6943, Training Loss: 0.02135
Epoch: 6943, Training Loss: 0.02676
Epoch: 6944, Training Loss: 0.01979
Epoch: 6944, Training Loss: 0.02132
Epoch: 6944, Training Loss: 0.02135
Epoch: 6944, Training Loss: 0.02675
Epoch: 6945, Training Loss: 0.01978
Epoch: 6945, Training Loss: 0.02131
Epoch: 6945, Training Loss: 0.02135
Epoch: 6945, Training Loss: 0.02675
Epoch: 6946, Training Loss: 0.01978
Epoch: 6946, Training Loss: 0.02131
Epoch: 6946, Training Loss: 0.02135
Epoch: 6946, Training Loss: 0.02675
Epoch: 6947, Training Loss: 0.01978
Epoch: 6947, Training Loss: 0.02131
Epoch: 6947, Training Loss: 0.02134
Epoch: 6947, Training Loss: 0.02674
Epoch: 6948, Training Loss: 0.01978
Epoch: 6948, Training Loss: 0.02131
Epoch: 6948, Training Loss: 0.02134
Epoch: 6948, Training Loss: 0.02674
Epoch: 6949, Training Loss: 0.01978
Epoch: 6949, Training Loss: 0.02130
Epoch: 6949, Training Loss: 0.02134
Epoch: 6949, Training Loss: 0.02674
Epoch: 6950, Training Loss: 0.01977
Epoch: 6950, Training Loss: 0.02130
Epoch: 6950, Training Loss: 0.02134
Epoch: 6950, Training Loss: 0.02674
Epoch: 6951, Training Loss: 0.01977
Epoch: 6951, Training Loss: 0.02130
Epoch: 6951, Training Loss: 0.02133
Epoch: 6951, Training Loss: 0.02673
Epoch: 6952, Training Loss: 0.01977
Epoch: 6952, Training Loss: 0.02130
Epoch: 6952, Training Loss: 0.02133
Epoch: 6952, Training Loss: 0.02673
Epoch: 6953, Training Loss: 0.01977
Epoch: 6953, Training Loss: 0.02129
Epoch: 6953, Training Loss: 0.02133
Epoch: 6953, Training Loss: 0.02673
Epoch: 6954, Training Loss: 0.01976
Epoch: 6954, Training Loss: 0.02129
Epoch: 6954, Training Loss: 0.02133
Epoch: 6954, Training Loss: 0.02672
Epoch: 6955, Training Loss: 0.01976
Epoch: 6955, Training Loss: 0.02129
Epoch: 6955, Training Loss: 0.02132
Epoch: 6955, Training Loss: 0.02672
Epoch: 6956, Training Loss: 0.01976
Epoch: 6956, Training Loss: 0.02129
Epoch: 6956, Training Loss: 0.02132
Epoch: 6956, Training Loss: 0.02672
Epoch: 6957, Training Loss: 0.01976
Epoch: 6957, Training Loss: 0.02128
Epoch: 6957, Training Loss: 0.02132
Epoch: 6957, Training Loss: 0.02671
Epoch: 6958, Training Loss: 0.01976
Epoch: 6958, Training Loss: 0.02128
Epoch: 6958, Training Loss: 0.02132
Epoch: 6958, Training Loss: 0.02671
Epoch: 6959, Training Loss: 0.01975
Epoch: 6959, Training Loss: 0.02128
Epoch: 6959, Training Loss: 0.02131
Epoch: 6959, Training Loss: 0.02671
Epoch: 6960, Training Loss: 0.01975
Epoch: 6960, Training Loss: 0.02128
Epoch: 6960, Training Loss: 0.02131
Epoch: 6960, Training Loss: 0.02670
Epoch: 6961, Training Loss: 0.01975
Epoch: 6961, Training Loss: 0.02127
Epoch: 6961, Training Loss: 0.02131
Epoch: 6961, Training Loss: 0.02670
Epoch: 6962, Training Loss: 0.01975
Epoch: 6962, Training Loss: 0.02127
Epoch: 6962, Training Loss: 0.02131
Epoch: 6962, Training Loss: 0.02670
Epoch: 6963, Training Loss: 0.01974
Epoch: 6963, Training Loss: 0.02127
Epoch: 6963, Training Loss: 0.02131
Epoch: 6963, Training Loss: 0.02669
Epoch: 6964, Training Loss: 0.01974
Epoch: 6964, Training Loss: 0.02127
Epoch: 6964, Training Loss: 0.02130
Epoch: 6964, Training Loss: 0.02669
Epoch: 6965, Training Loss: 0.01974
Epoch: 6965, Training Loss: 0.02126
Epoch: 6965, Training Loss: 0.02130
Epoch: 6965, Training Loss: 0.02669
Epoch: 6966, Training Loss: 0.01974
Epoch: 6966, Training Loss: 0.02126
Epoch: 6966, Training Loss: 0.02130
Epoch: 6966, Training Loss: 0.02668
Epoch: 6967, Training Loss: 0.01974
Epoch: 6967, Training Loss: 0.02126
Epoch: 6967, Training Loss: 0.02130
Epoch: 6967, Training Loss: 0.02668
Epoch: 6968, Training Loss: 0.01973
Epoch: 6968, Training Loss: 0.02126
Epoch: 6968, Training Loss: 0.02129
Epoch: 6968, Training Loss: 0.02668
Epoch: 6969, Training Loss: 0.01973
Epoch: 6969, Training Loss: 0.02125
Epoch: 6969, Training Loss: 0.02129
Epoch: 6969, Training Loss: 0.02667
Epoch: 6970, Training Loss: 0.01973
Epoch: 6970, Training Loss: 0.02125
Epoch: 6970, Training Loss: 0.02129
Epoch: 6970, Training Loss: 0.02667
Epoch: 6971, Training Loss: 0.01973
Epoch: 6971, Training Loss: 0.02125
Epoch: 6971, Training Loss: 0.02129
Epoch: 6971, Training Loss: 0.02667
Epoch: 6972, Training Loss: 0.01973
Epoch: 6972, Training Loss: 0.02125
Epoch: 6972, Training Loss: 0.02128
Epoch: 6972, Training Loss: 0.02667
Epoch: 6973, Training Loss: 0.01972
Epoch: 6973, Training Loss: 0.02124
Epoch: 6973, Training Loss: 0.02128
Epoch: 6973, Training Loss: 0.02666
Epoch: 6974, Training Loss: 0.01972
Epoch: 6974, Training Loss: 0.02124
Epoch: 6974, Training Loss: 0.02128
Epoch: 6974, Training Loss: 0.02666
Epoch: 6975, Training Loss: 0.01972
Epoch: 6975, Training Loss: 0.02124
Epoch: 6975, Training Loss: 0.02128
Epoch: 6975, Training Loss: 0.02666
Epoch: 6976, Training Loss: 0.01972
Epoch: 6976, Training Loss: 0.02124
Epoch: 6976, Training Loss: 0.02127
Epoch: 6976, Training Loss: 0.02665
Epoch: 6977, Training Loss: 0.01971
Epoch: 6977, Training Loss: 0.02123
Epoch: 6977, Training Loss: 0.02127
Epoch: 6977, Training Loss: 0.02665
Epoch: 6978, Training Loss: 0.01971
Epoch: 6978, Training Loss: 0.02123
Epoch: 6978, Training Loss: 0.02127
Epoch: 6978, Training Loss: 0.02665
Epoch: 6979, Training Loss: 0.01971
Epoch: 6979, Training Loss: 0.02123
Epoch: 6979, Training Loss: 0.02127
Epoch: 6979, Training Loss: 0.02664
Epoch: 6980, Training Loss: 0.01971
Epoch: 6980, Training Loss: 0.02123
Epoch: 6980, Training Loss: 0.02126
Epoch: 6980, Training Loss: 0.02664
Epoch: 6981, Training Loss: 0.01971
Epoch: 6981, Training Loss: 0.02122
Epoch: 6981, Training Loss: 0.02126
Epoch: 6981, Training Loss: 0.02664
Epoch: 6982, Training Loss: 0.01970
Epoch: 6982, Training Loss: 0.02122
Epoch: 6982, Training Loss: 0.02126
Epoch: 6982, Training Loss: 0.02663
Epoch: 6983, Training Loss: 0.01970
Epoch: 6983, Training Loss: 0.02122
Epoch: 6983, Training Loss: 0.02126
Epoch: 6983, Training Loss: 0.02663
Epoch: 6984, Training Loss: 0.01970
Epoch: 6984, Training Loss: 0.02122
Epoch: 6984, Training Loss: 0.02125
Epoch: 6984, Training Loss: 0.02663
Epoch: 6985, Training Loss: 0.01970
Epoch: 6985, Training Loss: 0.02122
Epoch: 6985, Training Loss: 0.02125
Epoch: 6985, Training Loss: 0.02662
Epoch: 6986, Training Loss: 0.01970
Epoch: 6986, Training Loss: 0.02121
Epoch: 6986, Training Loss: 0.02125
Epoch: 6986, Training Loss: 0.02662
Epoch: 6987, Training Loss: 0.01969
Epoch: 6987, Training Loss: 0.02121
Epoch: 6987, Training Loss: 0.02125
Epoch: 6987, Training Loss: 0.02662
Epoch: 6988, Training Loss: 0.01969
Epoch: 6988, Training Loss: 0.02121
Epoch: 6988, Training Loss: 0.02124
Epoch: 6988, Training Loss: 0.02661
Epoch: 6989, Training Loss: 0.01969
Epoch: 6989, Training Loss: 0.02121
Epoch: 6989, Training Loss: 0.02124
Epoch: 6989, Training Loss: 0.02661
Epoch: 6990, Training Loss: 0.01969
Epoch: 6990, Training Loss: 0.02120
Epoch: 6990, Training Loss: 0.02124
Epoch: 6990, Training Loss: 0.02661
Epoch: 6991, Training Loss: 0.01968
Epoch: 6991, Training Loss: 0.02120
Epoch: 6991, Training Loss: 0.02124
Epoch: 6991, Training Loss: 0.02661
Epoch: 6992, Training Loss: 0.01968
Epoch: 6992, Training Loss: 0.02120
Epoch: 6992, Training Loss: 0.02123
Epoch: 6992, Training Loss: 0.02660
Epoch: 6993, Training Loss: 0.01968
Epoch: 6993, Training Loss: 0.02120
Epoch: 6993, Training Loss: 0.02123
Epoch: 6993, Training Loss: 0.02660
Epoch: 6994, Training Loss: 0.01968
Epoch: 6994, Training Loss: 0.02119
Epoch: 6994, Training Loss: 0.02123
Epoch: 6994, Training Loss: 0.02660
Epoch: 6995, Training Loss: 0.01968
Epoch: 6995, Training Loss: 0.02119
Epoch: 6995, Training Loss: 0.02123
Epoch: 6995, Training Loss: 0.02659
Epoch: 6996, Training Loss: 0.01967
Epoch: 6996, Training Loss: 0.02119
Epoch: 6996, Training Loss: 0.02122
Epoch: 6996, Training Loss: 0.02659
Epoch: 6997, Training Loss: 0.01967
Epoch: 6997, Training Loss: 0.02119
Epoch: 6997, Training Loss: 0.02122
Epoch: 6997, Training Loss: 0.02659
Epoch: 6998, Training Loss: 0.01967
Epoch: 6998, Training Loss: 0.02118
Epoch: 6998, Training Loss: 0.02122
Epoch: 6998, Training Loss: 0.02658
Epoch: 6999, Training Loss: 0.01967
Epoch: 6999, Training Loss: 0.02118
Epoch: 6999, Training Loss: 0.02122
Epoch: 6999, Training Loss: 0.02658
Epoch: 7000, Training Loss: 0.01967
Epoch: 7000, Training Loss: 0.02118
Epoch: 7000, Training Loss: 0.02121
Epoch: 7000, Training Loss: 0.02658
Epoch: 7001, Training Loss: 0.01966
Epoch: 7001, Training Loss: 0.02118
Epoch: 7001, Training Loss: 0.02121
Epoch: 7001, Training Loss: 0.02657
Epoch: 7002, Training Loss: 0.01966
Epoch: 7002, Training Loss: 0.02117
Epoch: 7002, Training Loss: 0.02121
Epoch: 7002, Training Loss: 0.02657
Epoch: 7003, Training Loss: 0.01966
Epoch: 7003, Training Loss: 0.02117
Epoch: 7003, Training Loss: 0.02121
Epoch: 7003, Training Loss: 0.02657
Epoch: 7004, Training Loss: 0.01966
Epoch: 7004, Training Loss: 0.02117
Epoch: 7004, Training Loss: 0.02121
Epoch: 7004, Training Loss: 0.02656
Epoch: 7005, Training Loss: 0.01966
Epoch: 7005, Training Loss: 0.02117
Epoch: 7005, Training Loss: 0.02120
Epoch: 7005, Training Loss: 0.02656
Epoch: 7006, Training Loss: 0.01965
Epoch: 7006, Training Loss: 0.02116
Epoch: 7006, Training Loss: 0.02120
Epoch: 7006, Training Loss: 0.02656
Epoch: 7007, Training Loss: 0.01965
Epoch: 7007, Training Loss: 0.02116
Epoch: 7007, Training Loss: 0.02120
Epoch: 7007, Training Loss: 0.02656
Epoch: 7008, Training Loss: 0.01965
Epoch: 7008, Training Loss: 0.02116
Epoch: 7008, Training Loss: 0.02120
Epoch: 7008, Training Loss: 0.02655
Epoch: 7009, Training Loss: 0.01965
Epoch: 7009, Training Loss: 0.02116
Epoch: 7009, Training Loss: 0.02119
Epoch: 7009, Training Loss: 0.02655
Epoch: 7010, Training Loss: 0.01964
Epoch: 7010, Training Loss: 0.02115
Epoch: 7010, Training Loss: 0.02119
Epoch: 7010, Training Loss: 0.02655
Epoch: 7011, Training Loss: 0.01964
Epoch: 7011, Training Loss: 0.02115
Epoch: 7011, Training Loss: 0.02119
Epoch: 7011, Training Loss: 0.02654
Epoch: 7012, Training Loss: 0.01964
Epoch: 7012, Training Loss: 0.02115
Epoch: 7012, Training Loss: 0.02119
Epoch: 7012, Training Loss: 0.02654
Epoch: 7013, Training Loss: 0.01964
Epoch: 7013, Training Loss: 0.02115
Epoch: 7013, Training Loss: 0.02118
Epoch: 7013, Training Loss: 0.02654
Epoch: 7014, Training Loss: 0.01964
Epoch: 7014, Training Loss: 0.02114
Epoch: 7014, Training Loss: 0.02118
Epoch: 7014, Training Loss: 0.02653
Epoch: 7015, Training Loss: 0.01963
Epoch: 7015, Training Loss: 0.02114
Epoch: 7015, Training Loss: 0.02118
Epoch: 7015, Training Loss: 0.02653
Epoch: 7016, Training Loss: 0.01963
Epoch: 7016, Training Loss: 0.02114
Epoch: 7016, Training Loss: 0.02118
Epoch: 7016, Training Loss: 0.02653
Epoch: 7017, Training Loss: 0.01963
Epoch: 7017, Training Loss: 0.02114
Epoch: 7017, Training Loss: 0.02117
Epoch: 7017, Training Loss: 0.02652
Epoch: 7018, Training Loss: 0.01963
Epoch: 7018, Training Loss: 0.02114
Epoch: 7018, Training Loss: 0.02117
Epoch: 7018, Training Loss: 0.02652
Epoch: 7019, Training Loss: 0.01963
Epoch: 7019, Training Loss: 0.02113
Epoch: 7019, Training Loss: 0.02117
Epoch: 7019, Training Loss: 0.02652
Epoch: 7020, Training Loss: 0.01962
Epoch: 7020, Training Loss: 0.02113
Epoch: 7020, Training Loss: 0.02117
Epoch: 7020, Training Loss: 0.02651
Epoch: 7021, Training Loss: 0.01962
Epoch: 7021, Training Loss: 0.02113
Epoch: 7021, Training Loss: 0.02116
Epoch: 7021, Training Loss: 0.02651
Epoch: 7022, Training Loss: 0.01962
Epoch: 7022, Training Loss: 0.02113
Epoch: 7022, Training Loss: 0.02116
Epoch: 7022, Training Loss: 0.02651
Epoch: 7023, Training Loss: 0.01962
Epoch: 7023, Training Loss: 0.02112
Epoch: 7023, Training Loss: 0.02116
Epoch: 7023, Training Loss: 0.02651
Epoch: 7024, Training Loss: 0.01961
Epoch: 7024, Training Loss: 0.02112
Epoch: 7024, Training Loss: 0.02116
Epoch: 7024, Training Loss: 0.02650
Epoch: 7025, Training Loss: 0.01961
Epoch: 7025, Training Loss: 0.02112
Epoch: 7025, Training Loss: 0.02115
Epoch: 7025, Training Loss: 0.02650
Epoch: 7026, Training Loss: 0.01961
Epoch: 7026, Training Loss: 0.02112
Epoch: 7026, Training Loss: 0.02115
Epoch: 7026, Training Loss: 0.02650
Epoch: 7027, Training Loss: 0.01961
Epoch: 7027, Training Loss: 0.02111
Epoch: 7027, Training Loss: 0.02115
Epoch: 7027, Training Loss: 0.02649
Epoch: 7028, Training Loss: 0.01961
Epoch: 7028, Training Loss: 0.02111
Epoch: 7028, Training Loss: 0.02115
Epoch: 7028, Training Loss: 0.02649
Epoch: 7029, Training Loss: 0.01960
Epoch: 7029, Training Loss: 0.02111
Epoch: 7029, Training Loss: 0.02115
Epoch: 7029, Training Loss: 0.02649
Epoch: 7030, Training Loss: 0.01960
Epoch: 7030, Training Loss: 0.02111
Epoch: 7030, Training Loss: 0.02114
Epoch: 7030, Training Loss: 0.02648
Epoch: 7031, Training Loss: 0.01960
Epoch: 7031, Training Loss: 0.02110
Epoch: 7031, Training Loss: 0.02114
Epoch: 7031, Training Loss: 0.02648
Epoch: 7032, Training Loss: 0.01960
Epoch: 7032, Training Loss: 0.02110
Epoch: 7032, Training Loss: 0.02114
Epoch: 7032, Training Loss: 0.02648
Epoch: 7033, Training Loss: 0.01960
Epoch: 7033, Training Loss: 0.02110
Epoch: 7033, Training Loss: 0.02114
Epoch: 7033, Training Loss: 0.02647
Epoch: 7034, Training Loss: 0.01959
Epoch: 7034, Training Loss: 0.02110
Epoch: 7034, Training Loss: 0.02113
Epoch: 7034, Training Loss: 0.02647
Epoch: 7035, Training Loss: 0.01959
Epoch: 7035, Training Loss: 0.02109
Epoch: 7035, Training Loss: 0.02113
Epoch: 7035, Training Loss: 0.02647
Epoch: 7036, Training Loss: 0.01959
Epoch: 7036, Training Loss: 0.02109
Epoch: 7036, Training Loss: 0.02113
Epoch: 7036, Training Loss: 0.02646
Epoch: 7037, Training Loss: 0.01959
Epoch: 7037, Training Loss: 0.02109
Epoch: 7037, Training Loss: 0.02113
Epoch: 7037, Training Loss: 0.02646
Epoch: 7038, Training Loss: 0.01959
Epoch: 7038, Training Loss: 0.02109
Epoch: 7038, Training Loss: 0.02112
Epoch: 7038, Training Loss: 0.02646
Epoch: 7039, Training Loss: 0.01958
Epoch: 7039, Training Loss: 0.02109
Epoch: 7039, Training Loss: 0.02112
Epoch: 7039, Training Loss: 0.02646
Epoch: 7040, Training Loss: 0.01958
Epoch: 7040, Training Loss: 0.02108
Epoch: 7040, Training Loss: 0.02112
Epoch: 7040, Training Loss: 0.02645
Epoch: 7041, Training Loss: 0.01958
Epoch: 7041, Training Loss: 0.02108
Epoch: 7041, Training Loss: 0.02112
Epoch: 7041, Training Loss: 0.02645
Epoch: 7042, Training Loss: 0.01958
Epoch: 7042, Training Loss: 0.02108
Epoch: 7042, Training Loss: 0.02111
Epoch: 7042, Training Loss: 0.02645
Epoch: 7043, Training Loss: 0.01957
Epoch: 7043, Training Loss: 0.02108
Epoch: 7043, Training Loss: 0.02111
Epoch: 7043, Training Loss: 0.02644
Epoch: 7044, Training Loss: 0.01957
Epoch: 7044, Training Loss: 0.02107
Epoch: 7044, Training Loss: 0.02111
Epoch: 7044, Training Loss: 0.02644
Epoch: 7045, Training Loss: 0.01957
Epoch: 7045, Training Loss: 0.02107
Epoch: 7045, Training Loss: 0.02111
Epoch: 7045, Training Loss: 0.02644
Epoch: 7046, Training Loss: 0.01957
Epoch: 7046, Training Loss: 0.02107
Epoch: 7046, Training Loss: 0.02110
Epoch: 7046, Training Loss: 0.02643
Epoch: 7047, Training Loss: 0.01957
Epoch: 7047, Training Loss: 0.02107
Epoch: 7047, Training Loss: 0.02110
Epoch: 7047, Training Loss: 0.02643
Epoch: 7048, Training Loss: 0.01956
Epoch: 7048, Training Loss: 0.02106
Epoch: 7048, Training Loss: 0.02110
Epoch: 7048, Training Loss: 0.02643
Epoch: 7049, Training Loss: 0.01956
Epoch: 7049, Training Loss: 0.02106
Epoch: 7049, Training Loss: 0.02110
Epoch: 7049, Training Loss: 0.02642
Epoch: 7050, Training Loss: 0.01956
Epoch: 7050, Training Loss: 0.02106
Epoch: 7050, Training Loss: 0.02109
Epoch: 7050, Training Loss: 0.02642
Epoch: 7051, Training Loss: 0.01956
Epoch: 7051, Training Loss: 0.02106
Epoch: 7051, Training Loss: 0.02109
Epoch: 7051, Training Loss: 0.02642
Epoch: 7052, Training Loss: 0.01956
Epoch: 7052, Training Loss: 0.02105
Epoch: 7052, Training Loss: 0.02109
Epoch: 7052, Training Loss: 0.02642
Epoch: 7053, Training Loss: 0.01955
Epoch: 7053, Training Loss: 0.02105
Epoch: 7053, Training Loss: 0.02109
Epoch: 7053, Training Loss: 0.02641
Epoch: 7054, Training Loss: 0.01955
Epoch: 7054, Training Loss: 0.02105
Epoch: 7054, Training Loss: 0.02109
Epoch: 7054, Training Loss: 0.02641
Epoch: 7055, Training Loss: 0.01955
Epoch: 7055, Training Loss: 0.02105
Epoch: 7055, Training Loss: 0.02108
Epoch: 7055, Training Loss: 0.02641
Epoch: 7056, Training Loss: 0.01955
Epoch: 7056, Training Loss: 0.02104
Epoch: 7056, Training Loss: 0.02108
Epoch: 7056, Training Loss: 0.02640
Epoch: 7057, Training Loss: 0.01955
Epoch: 7057, Training Loss: 0.02104
Epoch: 7057, Training Loss: 0.02108
Epoch: 7057, Training Loss: 0.02640
Epoch: 7058, Training Loss: 0.01954
Epoch: 7058, Training Loss: 0.02104
Epoch: 7058, Training Loss: 0.02108
Epoch: 7058, Training Loss: 0.02640
Epoch: 7059, Training Loss: 0.01954
Epoch: 7059, Training Loss: 0.02104
Epoch: 7059, Training Loss: 0.02107
Epoch: 7059, Training Loss: 0.02639
Epoch: 7060, Training Loss: 0.01954
Epoch: 7060, Training Loss: 0.02104
Epoch: 7060, Training Loss: 0.02107
Epoch: 7060, Training Loss: 0.02639
Epoch: 7061, Training Loss: 0.01954
Epoch: 7061, Training Loss: 0.02103
Epoch: 7061, Training Loss: 0.02107
Epoch: 7061, Training Loss: 0.02639
Epoch: 7062, Training Loss: 0.01954
Epoch: 7062, Training Loss: 0.02103
Epoch: 7062, Training Loss: 0.02107
Epoch: 7062, Training Loss: 0.02638
Epoch: 7063, Training Loss: 0.01953
Epoch: 7063, Training Loss: 0.02103
Epoch: 7063, Training Loss: 0.02106
Epoch: 7063, Training Loss: 0.02638
Epoch: 7064, Training Loss: 0.01953
Epoch: 7064, Training Loss: 0.02103
Epoch: 7064, Training Loss: 0.02106
Epoch: 7064, Training Loss: 0.02638
Epoch: 7065, Training Loss: 0.01953
Epoch: 7065, Training Loss: 0.02102
Epoch: 7065, Training Loss: 0.02106
Epoch: 7065, Training Loss: 0.02638
Epoch: 7066, Training Loss: 0.01953
Epoch: 7066, Training Loss: 0.02102
Epoch: 7066, Training Loss: 0.02106
Epoch: 7066, Training Loss: 0.02637
Epoch: 7067, Training Loss: 0.01952
Epoch: 7067, Training Loss: 0.02102
Epoch: 7067, Training Loss: 0.02105
Epoch: 7067, Training Loss: 0.02637
Epoch: 7068, Training Loss: 0.01952
Epoch: 7068, Training Loss: 0.02102
Epoch: 7068, Training Loss: 0.02105
Epoch: 7068, Training Loss: 0.02637
Epoch: 7069, Training Loss: 0.01952
Epoch: 7069, Training Loss: 0.02101
Epoch: 7069, Training Loss: 0.02105
Epoch: 7069, Training Loss: 0.02636
Epoch: 7070, Training Loss: 0.01952
Epoch: 7070, Training Loss: 0.02101
Epoch: 7070, Training Loss: 0.02105
Epoch: 7070, Training Loss: 0.02636
Epoch: 7071, Training Loss: 0.01952
Epoch: 7071, Training Loss: 0.02101
Epoch: 7071, Training Loss: 0.02104
Epoch: 7071, Training Loss: 0.02636
Epoch: 7072, Training Loss: 0.01951
Epoch: 7072, Training Loss: 0.02101
Epoch: 7072, Training Loss: 0.02104
Epoch: 7072, Training Loss: 0.02635
Epoch: 7073, Training Loss: 0.01951
Epoch: 7073, Training Loss: 0.02100
Epoch: 7073, Training Loss: 0.02104
Epoch: 7073, Training Loss: 0.02635
Epoch: 7074, Training Loss: 0.01951
Epoch: 7074, Training Loss: 0.02100
Epoch: 7074, Training Loss: 0.02104
Epoch: 7074, Training Loss: 0.02635
Epoch: 7075, Training Loss: 0.01951
Epoch: 7075, Training Loss: 0.02100
Epoch: 7075, Training Loss: 0.02104
Epoch: 7075, Training Loss: 0.02634
Epoch: 7076, Training Loss: 0.01951
Epoch: 7076, Training Loss: 0.02100
Epoch: 7076, Training Loss: 0.02103
Epoch: 7076, Training Loss: 0.02634
Epoch: 7077, Training Loss: 0.01950
Epoch: 7077, Training Loss: 0.02099
Epoch: 7077, Training Loss: 0.02103
Epoch: 7077, Training Loss: 0.02634
Epoch: 7078, Training Loss: 0.01950
Epoch: 7078, Training Loss: 0.02099
Epoch: 7078, Training Loss: 0.02103
Epoch: 7078, Training Loss: 0.02634
Epoch: 7079, Training Loss: 0.01950
Epoch: 7079, Training Loss: 0.02099
Epoch: 7079, Training Loss: 0.02103
Epoch: 7079, Training Loss: 0.02633
Epoch: 7080, Training Loss: 0.01950
Epoch: 7080, Training Loss: 0.02099
Epoch: 7080, Training Loss: 0.02102
Epoch: 7080, Training Loss: 0.02633
Epoch: 7081, Training Loss: 0.01950
Epoch: 7081, Training Loss: 0.02099
Epoch: 7081, Training Loss: 0.02102
Epoch: 7081, Training Loss: 0.02633
Epoch: 7082, Training Loss: 0.01949
Epoch: 7082, Training Loss: 0.02098
Epoch: 7082, Training Loss: 0.02102
Epoch: 7082, Training Loss: 0.02632
Epoch: 7083, Training Loss: 0.01949
Epoch: 7083, Training Loss: 0.02098
Epoch: 7083, Training Loss: 0.02102
Epoch: 7083, Training Loss: 0.02632
Epoch: 7084, Training Loss: 0.01949
Epoch: 7084, Training Loss: 0.02098
Epoch: 7084, Training Loss: 0.02101
Epoch: 7084, Training Loss: 0.02632
Epoch: 7085, Training Loss: 0.01949
Epoch: 7085, Training Loss: 0.02098
Epoch: 7085, Training Loss: 0.02101
Epoch: 7085, Training Loss: 0.02631
Epoch: 7086, Training Loss: 0.01949
Epoch: 7086, Training Loss: 0.02097
Epoch: 7086, Training Loss: 0.02101
Epoch: 7086, Training Loss: 0.02631
Epoch: 7087, Training Loss: 0.01948
Epoch: 7087, Training Loss: 0.02097
Epoch: 7087, Training Loss: 0.02101
Epoch: 7087, Training Loss: 0.02631
Epoch: 7088, Training Loss: 0.01948
Epoch: 7088, Training Loss: 0.02097
Epoch: 7088, Training Loss: 0.02100
Epoch: 7088, Training Loss: 0.02631
Epoch: 7089, Training Loss: 0.01948
Epoch: 7089, Training Loss: 0.02097
Epoch: 7089, Training Loss: 0.02100
Epoch: 7089, Training Loss: 0.02630
Epoch: 7090, Training Loss: 0.01948
Epoch: 7090, Training Loss: 0.02096
Epoch: 7090, Training Loss: 0.02100
Epoch: 7090, Training Loss: 0.02630
Epoch: 7091, Training Loss: 0.01948
Epoch: 7091, Training Loss: 0.02096
Epoch: 7091, Training Loss: 0.02100
Epoch: 7091, Training Loss: 0.02630
Epoch: 7092, Training Loss: 0.01947
Epoch: 7092, Training Loss: 0.02096
Epoch: 7092, Training Loss: 0.02100
Epoch: 7092, Training Loss: 0.02629
Epoch: 7093, Training Loss: 0.01947
Epoch: 7093, Training Loss: 0.02096
Epoch: 7093, Training Loss: 0.02099
Epoch: 7093, Training Loss: 0.02629
Epoch: 7094, Training Loss: 0.01947
Epoch: 7094, Training Loss: 0.02095
Epoch: 7094, Training Loss: 0.02099
Epoch: 7094, Training Loss: 0.02629
Epoch: 7095, Training Loss: 0.01947
Epoch: 7095, Training Loss: 0.02095
Epoch: 7095, Training Loss: 0.02099
Epoch: 7095, Training Loss: 0.02628
Epoch: 7096, Training Loss: 0.01946
Epoch: 7096, Training Loss: 0.02095
Epoch: 7096, Training Loss: 0.02099
Epoch: 7096, Training Loss: 0.02628
Epoch: 7097, Training Loss: 0.01946
Epoch: 7097, Training Loss: 0.02095
Epoch: 7097, Training Loss: 0.02098
Epoch: 7097, Training Loss: 0.02628
Epoch: 7098, Training Loss: 0.01946
Epoch: 7098, Training Loss: 0.02095
Epoch: 7098, Training Loss: 0.02098
Epoch: 7098, Training Loss: 0.02627
Epoch: 7099, Training Loss: 0.01946
Epoch: 7099, Training Loss: 0.02094
Epoch: 7099, Training Loss: 0.02098
Epoch: 7099, Training Loss: 0.02627
Epoch: 7100, Training Loss: 0.01946
Epoch: 7100, Training Loss: 0.02094
Epoch: 7100, Training Loss: 0.02098
Epoch: 7100, Training Loss: 0.02627
Epoch: 7101, Training Loss: 0.01945
Epoch: 7101, Training Loss: 0.02094
Epoch: 7101, Training Loss: 0.02097
Epoch: 7101, Training Loss: 0.02627
Epoch: 7102, Training Loss: 0.01945
Epoch: 7102, Training Loss: 0.02094
Epoch: 7102, Training Loss: 0.02097
Epoch: 7102, Training Loss: 0.02626
Epoch: 7103, Training Loss: 0.01945
Epoch: 7103, Training Loss: 0.02093
Epoch: 7103, Training Loss: 0.02097
Epoch: 7103, Training Loss: 0.02626
Epoch: 7104, Training Loss: 0.01945
Epoch: 7104, Training Loss: 0.02093
Epoch: 7104, Training Loss: 0.02097
Epoch: 7104, Training Loss: 0.02626
Epoch: 7105, Training Loss: 0.01945
Epoch: 7105, Training Loss: 0.02093
Epoch: 7105, Training Loss: 0.02096
Epoch: 7105, Training Loss: 0.02625
Epoch: 7106, Training Loss: 0.01944
Epoch: 7106, Training Loss: 0.02093
Epoch: 7106, Training Loss: 0.02096
Epoch: 7106, Training Loss: 0.02625
Epoch: 7107, Training Loss: 0.01944
Epoch: 7107, Training Loss: 0.02092
Epoch: 7107, Training Loss: 0.02096
Epoch: 7107, Training Loss: 0.02625
Epoch: 7108, Training Loss: 0.01944
Epoch: 7108, Training Loss: 0.02092
Epoch: 7108, Training Loss: 0.02096
Epoch: 7108, Training Loss: 0.02624
Epoch: 7109, Training Loss: 0.01944
Epoch: 7109, Training Loss: 0.02092
Epoch: 7109, Training Loss: 0.02096
Epoch: 7109, Training Loss: 0.02624
Epoch: 7110, Training Loss: 0.01944
Epoch: 7110, Training Loss: 0.02092
Epoch: 7110, Training Loss: 0.02095
Epoch: 7110, Training Loss: 0.02624
Epoch: 7111, Training Loss: 0.01943
Epoch: 7111, Training Loss: 0.02092
Epoch: 7111, Training Loss: 0.02095
Epoch: 7111, Training Loss: 0.02624
Epoch: 7112, Training Loss: 0.01943
Epoch: 7112, Training Loss: 0.02091
Epoch: 7112, Training Loss: 0.02095
Epoch: 7112, Training Loss: 0.02623
Epoch: 7113, Training Loss: 0.01943
Epoch: 7113, Training Loss: 0.02091
Epoch: 7113, Training Loss: 0.02095
Epoch: 7113, Training Loss: 0.02623
Epoch: 7114, Training Loss: 0.01943
Epoch: 7114, Training Loss: 0.02091
Epoch: 7114, Training Loss: 0.02094
Epoch: 7114, Training Loss: 0.02623
Epoch: 7115, Training Loss: 0.01943
Epoch: 7115, Training Loss: 0.02091
Epoch: 7115, Training Loss: 0.02094
Epoch: 7115, Training Loss: 0.02622
Epoch: 7116, Training Loss: 0.01942
Epoch: 7116, Training Loss: 0.02090
Epoch: 7116, Training Loss: 0.02094
Epoch: 7116, Training Loss: 0.02622
Epoch: 7117, Training Loss: 0.01942
Epoch: 7117, Training Loss: 0.02090
Epoch: 7117, Training Loss: 0.02094
Epoch: 7117, Training Loss: 0.02622
Epoch: 7118, Training Loss: 0.01942
Epoch: 7118, Training Loss: 0.02090
Epoch: 7118, Training Loss: 0.02093
Epoch: 7118, Training Loss: 0.02621
Epoch: 7119, Training Loss: 0.01942
Epoch: 7119, Training Loss: 0.02090
Epoch: 7119, Training Loss: 0.02093
Epoch: 7119, Training Loss: 0.02621
Epoch: 7120, Training Loss: 0.01942
Epoch: 7120, Training Loss: 0.02089
Epoch: 7120, Training Loss: 0.02093
Epoch: 7120, Training Loss: 0.02621
Epoch: 7121, Training Loss: 0.01941
Epoch: 7121, Training Loss: 0.02089
Epoch: 7121, Training Loss: 0.02093
Epoch: 7121, Training Loss: 0.02621
Epoch: 7122, Training Loss: 0.01941
Epoch: 7122, Training Loss: 0.02089
Epoch: 7122, Training Loss: 0.02093
Epoch: 7122, Training Loss: 0.02620
Epoch: 7123, Training Loss: 0.01941
Epoch: 7123, Training Loss: 0.02089
Epoch: 7123, Training Loss: 0.02092
Epoch: 7123, Training Loss: 0.02620
Epoch: 7124, Training Loss: 0.01941
Epoch: 7124, Training Loss: 0.02088
Epoch: 7124, Training Loss: 0.02092
Epoch: 7124, Training Loss: 0.02620
Epoch: 7125, Training Loss: 0.01941
Epoch: 7125, Training Loss: 0.02088
Epoch: 7125, Training Loss: 0.02092
Epoch: 7125, Training Loss: 0.02619
Epoch: 7126, Training Loss: 0.01940
Epoch: 7126, Training Loss: 0.02088
Epoch: 7126, Training Loss: 0.02092
Epoch: 7126, Training Loss: 0.02619
Epoch: 7127, Training Loss: 0.01940
Epoch: 7127, Training Loss: 0.02088
Epoch: 7127, Training Loss: 0.02091
Epoch: 7127, Training Loss: 0.02619
Epoch: 7128, Training Loss: 0.01940
Epoch: 7128, Training Loss: 0.02088
Epoch: 7128, Training Loss: 0.02091
Epoch: 7128, Training Loss: 0.02618
Epoch: 7129, Training Loss: 0.01940
Epoch: 7129, Training Loss: 0.02087
Epoch: 7129, Training Loss: 0.02091
Epoch: 7129, Training Loss: 0.02618
Epoch: 7130, Training Loss: 0.01939
Epoch: 7130, Training Loss: 0.02087
Epoch: 7130, Training Loss: 0.02091
Epoch: 7130, Training Loss: 0.02618
Epoch: 7131, Training Loss: 0.01939
Epoch: 7131, Training Loss: 0.02087
Epoch: 7131, Training Loss: 0.02090
Epoch: 7131, Training Loss: 0.02618
Epoch: 7132, Training Loss: 0.01939
Epoch: 7132, Training Loss: 0.02087
Epoch: 7132, Training Loss: 0.02090
Epoch: 7132, Training Loss: 0.02617
Epoch: 7133, Training Loss: 0.01939
Epoch: 7133, Training Loss: 0.02086
Epoch: 7133, Training Loss: 0.02090
Epoch: 7133, Training Loss: 0.02617
Epoch: 7134, Training Loss: 0.01939
Epoch: 7134, Training Loss: 0.02086
Epoch: 7134, Training Loss: 0.02090
Epoch: 7134, Training Loss: 0.02617
Epoch: 7135, Training Loss: 0.01938
Epoch: 7135, Training Loss: 0.02086
Epoch: 7135, Training Loss: 0.02089
Epoch: 7135, Training Loss: 0.02616
Epoch: 7136, Training Loss: 0.01938
Epoch: 7136, Training Loss: 0.02086
Epoch: 7136, Training Loss: 0.02089
Epoch: 7136, Training Loss: 0.02616
Epoch: 7137, Training Loss: 0.01938
Epoch: 7137, Training Loss: 0.02085
Epoch: 7137, Training Loss: 0.02089
Epoch: 7137, Training Loss: 0.02616
Epoch: 7138, Training Loss: 0.01938
Epoch: 7138, Training Loss: 0.02085
Epoch: 7138, Training Loss: 0.02089
Epoch: 7138, Training Loss: 0.02615
Epoch: 7139, Training Loss: 0.01938
Epoch: 7139, Training Loss: 0.02085
Epoch: 7139, Training Loss: 0.02089
Epoch: 7139, Training Loss: 0.02615
Epoch: 7140, Training Loss: 0.01937
Epoch: 7140, Training Loss: 0.02085
Epoch: 7140, Training Loss: 0.02088
Epoch: 7140, Training Loss: 0.02615
Epoch: 7141, Training Loss: 0.01937
Epoch: 7141, Training Loss: 0.02085
Epoch: 7141, Training Loss: 0.02088
Epoch: 7141, Training Loss: 0.02615
Epoch: 7142, Training Loss: 0.01937
Epoch: 7142, Training Loss: 0.02084
Epoch: 7142, Training Loss: 0.02088
Epoch: 7142, Training Loss: 0.02614
Epoch: 7143, Training Loss: 0.01937
Epoch: 7143, Training Loss: 0.02084
Epoch: 7143, Training Loss: 0.02088
Epoch: 7143, Training Loss: 0.02614
Epoch: 7144, Training Loss: 0.01937
Epoch: 7144, Training Loss: 0.02084
Epoch: 7144, Training Loss: 0.02087
Epoch: 7144, Training Loss: 0.02614
Epoch: 7145, Training Loss: 0.01936
Epoch: 7145, Training Loss: 0.02084
Epoch: 7145, Training Loss: 0.02087
Epoch: 7145, Training Loss: 0.02613
Epoch: 7146, Training Loss: 0.01936
Epoch: 7146, Training Loss: 0.02083
Epoch: 7146, Training Loss: 0.02087
Epoch: 7146, Training Loss: 0.02613
Epoch: 7147, Training Loss: 0.01936
Epoch: 7147, Training Loss: 0.02083
Epoch: 7147, Training Loss: 0.02087
Epoch: 7147, Training Loss: 0.02613
Epoch: 7148, Training Loss: 0.01936
Epoch: 7148, Training Loss: 0.02083
Epoch: 7148, Training Loss: 0.02086
Epoch: 7148, Training Loss: 0.02612
Epoch: 7149, Training Loss: 0.01936
Epoch: 7149, Training Loss: 0.02083
Epoch: 7149, Training Loss: 0.02086
Epoch: 7149, Training Loss: 0.02612
Epoch: 7150, Training Loss: 0.01935
Epoch: 7150, Training Loss: 0.02082
Epoch: 7150, Training Loss: 0.02086
Epoch: 7150, Training Loss: 0.02612
Epoch: 7151, Training Loss: 0.01935
Epoch: 7151, Training Loss: 0.02082
Epoch: 7151, Training Loss: 0.02086
Epoch: 7151, Training Loss: 0.02612
Epoch: 7152, Training Loss: 0.01935
Epoch: 7152, Training Loss: 0.02082
Epoch: 7152, Training Loss: 0.02086
Epoch: 7152, Training Loss: 0.02611
Epoch: 7153, Training Loss: 0.01935
Epoch: 7153, Training Loss: 0.02082
Epoch: 7153, Training Loss: 0.02085
Epoch: 7153, Training Loss: 0.02611
Epoch: 7154, Training Loss: 0.01935
Epoch: 7154, Training Loss: 0.02082
Epoch: 7154, Training Loss: 0.02085
Epoch: 7154, Training Loss: 0.02611
Epoch: 7155, Training Loss: 0.01934
Epoch: 7155, Training Loss: 0.02081
Epoch: 7155, Training Loss: 0.02085
Epoch: 7155, Training Loss: 0.02610
Epoch: 7156, Training Loss: 0.01934
Epoch: 7156, Training Loss: 0.02081
Epoch: 7156, Training Loss: 0.02085
Epoch: 7156, Training Loss: 0.02610
Epoch: 7157, Training Loss: 0.01934
Epoch: 7157, Training Loss: 0.02081
Epoch: 7157, Training Loss: 0.02084
Epoch: 7157, Training Loss: 0.02610
Epoch: 7158, Training Loss: 0.01934
Epoch: 7158, Training Loss: 0.02081
Epoch: 7158, Training Loss: 0.02084
Epoch: 7158, Training Loss: 0.02609
Epoch: 7159, Training Loss: 0.01934
Epoch: 7159, Training Loss: 0.02080
Epoch: 7159, Training Loss: 0.02084
Epoch: 7159, Training Loss: 0.02609
Epoch: 7160, Training Loss: 0.01933
Epoch: 7160, Training Loss: 0.02080
Epoch: 7160, Training Loss: 0.02084
Epoch: 7160, Training Loss: 0.02609
Epoch: 7161, Training Loss: 0.01933
Epoch: 7161, Training Loss: 0.02080
Epoch: 7161, Training Loss: 0.02083
Epoch: 7161, Training Loss: 0.02609
Epoch: 7162, Training Loss: 0.01933
Epoch: 7162, Training Loss: 0.02080
Epoch: 7162, Training Loss: 0.02083
Epoch: 7162, Training Loss: 0.02608
Epoch: 7163, Training Loss: 0.01933
Epoch: 7163, Training Loss: 0.02079
Epoch: 7163, Training Loss: 0.02083
Epoch: 7163, Training Loss: 0.02608
Epoch: 7164, Training Loss: 0.01933
Epoch: 7164, Training Loss: 0.02079
Epoch: 7164, Training Loss: 0.02083
Epoch: 7164, Training Loss: 0.02608
Epoch: 7165, Training Loss: 0.01932
Epoch: 7165, Training Loss: 0.02079
Epoch: 7165, Training Loss: 0.02083
Epoch: 7165, Training Loss: 0.02607
Epoch: 7166, Training Loss: 0.01932
Epoch: 7166, Training Loss: 0.02079
Epoch: 7166, Training Loss: 0.02082
Epoch: 7166, Training Loss: 0.02607
Epoch: 7167, Training Loss: 0.01932
Epoch: 7167, Training Loss: 0.02079
Epoch: 7167, Training Loss: 0.02082
Epoch: 7167, Training Loss: 0.02607
Epoch: 7168, Training Loss: 0.01932
Epoch: 7168, Training Loss: 0.02078
Epoch: 7168, Training Loss: 0.02082
Epoch: 7168, Training Loss: 0.02606
Epoch: 7169, Training Loss: 0.01932
Epoch: 7169, Training Loss: 0.02078
Epoch: 7169, Training Loss: 0.02082
Epoch: 7169, Training Loss: 0.02606
Epoch: 7170, Training Loss: 0.01931
Epoch: 7170, Training Loss: 0.02078
Epoch: 7170, Training Loss: 0.02081
Epoch: 7170, Training Loss: 0.02606
Epoch: 7171, Training Loss: 0.01931
Epoch: 7171, Training Loss: 0.02078
Epoch: 7171, Training Loss: 0.02081
Epoch: 7171, Training Loss: 0.02606
Epoch: 7172, Training Loss: 0.01931
Epoch: 7172, Training Loss: 0.02077
Epoch: 7172, Training Loss: 0.02081
Epoch: 7172, Training Loss: 0.02605
Epoch: 7173, Training Loss: 0.01931
Epoch: 7173, Training Loss: 0.02077
Epoch: 7173, Training Loss: 0.02081
Epoch: 7173, Training Loss: 0.02605
Epoch: 7174, Training Loss: 0.01931
Epoch: 7174, Training Loss: 0.02077
Epoch: 7174, Training Loss: 0.02081
Epoch: 7174, Training Loss: 0.02605
Epoch: 7175, Training Loss: 0.01930
Epoch: 7175, Training Loss: 0.02077
Epoch: 7175, Training Loss: 0.02080
Epoch: 7175, Training Loss: 0.02604
Epoch: 7176, Training Loss: 0.01930
Epoch: 7176, Training Loss: 0.02077
Epoch: 7176, Training Loss: 0.02080
Epoch: 7176, Training Loss: 0.02604
Epoch: 7177, Training Loss: 0.01930
Epoch: 7177, Training Loss: 0.02076
Epoch: 7177, Training Loss: 0.02080
Epoch: 7177, Training Loss: 0.02604
Epoch: 7178, Training Loss: 0.01930
Epoch: 7178, Training Loss: 0.02076
Epoch: 7178, Training Loss: 0.02080
Epoch: 7178, Training Loss: 0.02604
Epoch: 7179, Training Loss: 0.01930
Epoch: 7179, Training Loss: 0.02076
Epoch: 7179, Training Loss: 0.02079
Epoch: 7179, Training Loss: 0.02603
Epoch: 7180, Training Loss: 0.01929
Epoch: 7180, Training Loss: 0.02076
Epoch: 7180, Training Loss: 0.02079
Epoch: 7180, Training Loss: 0.02603
Epoch: 7181, Training Loss: 0.01929
Epoch: 7181, Training Loss: 0.02075
Epoch: 7181, Training Loss: 0.02079
Epoch: 7181, Training Loss: 0.02603
Epoch: 7182, Training Loss: 0.01929
Epoch: 7182, Training Loss: 0.02075
Epoch: 7182, Training Loss: 0.02079
Epoch: 7182, Training Loss: 0.02602
Epoch: 7183, Training Loss: 0.01929
Epoch: 7183, Training Loss: 0.02075
Epoch: 7183, Training Loss: 0.02078
Epoch: 7183, Training Loss: 0.02602
Epoch: 7184, Training Loss: 0.01929
Epoch: 7184, Training Loss: 0.02075
Epoch: 7184, Training Loss: 0.02078
Epoch: 7184, Training Loss: 0.02602
Epoch: 7185, Training Loss: 0.01928
Epoch: 7185, Training Loss: 0.02074
Epoch: 7185, Training Loss: 0.02078
Epoch: 7185, Training Loss: 0.02601
Epoch: 7186, Training Loss: 0.01928
Epoch: 7186, Training Loss: 0.02074
Epoch: 7186, Training Loss: 0.02078
Epoch: 7186, Training Loss: 0.02601
Epoch: 7187, Training Loss: 0.01928
Epoch: 7187, Training Loss: 0.02074
Epoch: 7187, Training Loss: 0.02078
Epoch: 7187, Training Loss: 0.02601
Epoch: 7188, Training Loss: 0.01928
Epoch: 7188, Training Loss: 0.02074
Epoch: 7188, Training Loss: 0.02077
Epoch: 7188, Training Loss: 0.02601
Epoch: 7189, Training Loss: 0.01928
Epoch: 7189, Training Loss: 0.02074
Epoch: 7189, Training Loss: 0.02077
Epoch: 7189, Training Loss: 0.02600
Epoch: 7190, Training Loss: 0.01927
Epoch: 7190, Training Loss: 0.02073
Epoch: 7190, Training Loss: 0.02077
Epoch: 7190, Training Loss: 0.02600
Epoch: 7191, Training Loss: 0.01927
Epoch: 7191, Training Loss: 0.02073
Epoch: 7191, Training Loss: 0.02077
Epoch: 7191, Training Loss: 0.02600
Epoch: 7192, Training Loss: 0.01927
Epoch: 7192, Training Loss: 0.02073
Epoch: 7192, Training Loss: 0.02076
Epoch: 7192, Training Loss: 0.02599
Epoch: 7193, Training Loss: 0.01927
Epoch: 7193, Training Loss: 0.02073
Epoch: 7193, Training Loss: 0.02076
Epoch: 7193, Training Loss: 0.02599
Epoch: 7194, Training Loss: 0.01927
Epoch: 7194, Training Loss: 0.02072
Epoch: 7194, Training Loss: 0.02076
Epoch: 7194, Training Loss: 0.02599
Epoch: 7195, Training Loss: 0.01926
Epoch: 7195, Training Loss: 0.02072
Epoch: 7195, Training Loss: 0.02076
Epoch: 7195, Training Loss: 0.02598
Epoch: 7196, Training Loss: 0.01926
Epoch: 7196, Training Loss: 0.02072
Epoch: 7196, Training Loss: 0.02075
Epoch: 7196, Training Loss: 0.02598
Epoch: 7197, Training Loss: 0.01926
Epoch: 7197, Training Loss: 0.02072
Epoch: 7197, Training Loss: 0.02075
Epoch: 7197, Training Loss: 0.02598
Epoch: 7198, Training Loss: 0.01926
Epoch: 7198, Training Loss: 0.02072
Epoch: 7198, Training Loss: 0.02075
Epoch: 7198, Training Loss: 0.02598
Epoch: 7199, Training Loss: 0.01926
Epoch: 7199, Training Loss: 0.02071
Epoch: 7199, Training Loss: 0.02075
Epoch: 7199, Training Loss: 0.02597
Epoch: 7200, Training Loss: 0.01925
Epoch: 7200, Training Loss: 0.02071
Epoch: 7200, Training Loss: 0.02075
Epoch: 7200, Training Loss: 0.02597
Epoch: 7201, Training Loss: 0.01925
Epoch: 7201, Training Loss: 0.02071
Epoch: 7201, Training Loss: 0.02074
Epoch: 7201, Training Loss: 0.02597
Epoch: 7202, Training Loss: 0.01925
Epoch: 7202, Training Loss: 0.02071
Epoch: 7202, Training Loss: 0.02074
Epoch: 7202, Training Loss: 0.02596
Epoch: 7203, Training Loss: 0.01925
Epoch: 7203, Training Loss: 0.02070
Epoch: 7203, Training Loss: 0.02074
Epoch: 7203, Training Loss: 0.02596
Epoch: 7204, Training Loss: 0.01925
Epoch: 7204, Training Loss: 0.02070
Epoch: 7204, Training Loss: 0.02074
Epoch: 7204, Training Loss: 0.02596
Epoch: 7205, Training Loss: 0.01924
Epoch: 7205, Training Loss: 0.02070
Epoch: 7205, Training Loss: 0.02073
Epoch: 7205, Training Loss: 0.02596
Epoch: 7206, Training Loss: 0.01924
Epoch: 7206, Training Loss: 0.02070
Epoch: 7206, Training Loss: 0.02073
Epoch: 7206, Training Loss: 0.02595
Epoch: 7207, Training Loss: 0.01924
Epoch: 7207, Training Loss: 0.02069
Epoch: 7207, Training Loss: 0.02073
Epoch: 7207, Training Loss: 0.02595
Epoch: 7208, Training Loss: 0.01924
Epoch: 7208, Training Loss: 0.02069
Epoch: 7208, Training Loss: 0.02073
Epoch: 7208, Training Loss: 0.02595
Epoch: 7209, Training Loss: 0.01924
Epoch: 7209, Training Loss: 0.02069
Epoch: 7209, Training Loss: 0.02073
Epoch: 7209, Training Loss: 0.02594
Epoch: 7210, Training Loss: 0.01923
Epoch: 7210, Training Loss: 0.02069
Epoch: 7210, Training Loss: 0.02072
Epoch: 7210, Training Loss: 0.02594
Epoch: 7211, Training Loss: 0.01923
Epoch: 7211, Training Loss: 0.02069
Epoch: 7211, Training Loss: 0.02072
Epoch: 7211, Training Loss: 0.02594
Epoch: 7212, Training Loss: 0.01923
Epoch: 7212, Training Loss: 0.02068
Epoch: 7212, Training Loss: 0.02072
Epoch: 7212, Training Loss: 0.02594
Epoch: 7213, Training Loss: 0.01923
Epoch: 7213, Training Loss: 0.02068
Epoch: 7213, Training Loss: 0.02072
Epoch: 7213, Training Loss: 0.02593
Epoch: 7214, Training Loss: 0.01923
Epoch: 7214, Training Loss: 0.02068
Epoch: 7214, Training Loss: 0.02071
Epoch: 7214, Training Loss: 0.02593
Epoch: 7215, Training Loss: 0.01922
Epoch: 7215, Training Loss: 0.02068
Epoch: 7215, Training Loss: 0.02071
Epoch: 7215, Training Loss: 0.02593
Epoch: 7216, Training Loss: 0.01922
Epoch: 7216, Training Loss: 0.02067
Epoch: 7216, Training Loss: 0.02071
Epoch: 7216, Training Loss: 0.02592
Epoch: 7217, Training Loss: 0.01922
Epoch: 7217, Training Loss: 0.02067
Epoch: 7217, Training Loss: 0.02071
Epoch: 7217, Training Loss: 0.02592
Epoch: 7218, Training Loss: 0.01922
Epoch: 7218, Training Loss: 0.02067
Epoch: 7218, Training Loss: 0.02071
Epoch: 7218, Training Loss: 0.02592
Epoch: 7219, Training Loss: 0.01922
Epoch: 7219, Training Loss: 0.02067
Epoch: 7219, Training Loss: 0.02070
Epoch: 7219, Training Loss: 0.02591
Epoch: 7220, Training Loss: 0.01921
Epoch: 7220, Training Loss: 0.02067
Epoch: 7220, Training Loss: 0.02070
Epoch: 7220, Training Loss: 0.02591
Epoch: 7221, Training Loss: 0.01921
Epoch: 7221, Training Loss: 0.02066
Epoch: 7221, Training Loss: 0.02070
Epoch: 7221, Training Loss: 0.02591
Epoch: 7222, Training Loss: 0.01921
Epoch: 7222, Training Loss: 0.02066
Epoch: 7222, Training Loss: 0.02070
Epoch: 7222, Training Loss: 0.02591
Epoch: 7223, Training Loss: 0.01921
Epoch: 7223, Training Loss: 0.02066
Epoch: 7223, Training Loss: 0.02069
Epoch: 7223, Training Loss: 0.02590
Epoch: 7224, Training Loss: 0.01921
Epoch: 7224, Training Loss: 0.02066
Epoch: 7224, Training Loss: 0.02069
Epoch: 7224, Training Loss: 0.02590
Epoch: 7225, Training Loss: 0.01920
Epoch: 7225, Training Loss: 0.02065
Epoch: 7225, Training Loss: 0.02069
Epoch: 7225, Training Loss: 0.02590
Epoch: 7226, Training Loss: 0.01920
Epoch: 7226, Training Loss: 0.02065
Epoch: 7226, Training Loss: 0.02069
Epoch: 7226, Training Loss: 0.02589
Epoch: 7227, Training Loss: 0.01920
Epoch: 7227, Training Loss: 0.02065
Epoch: 7227, Training Loss: 0.02068
Epoch: 7227, Training Loss: 0.02589
Epoch: 7228, Training Loss: 0.01920
Epoch: 7228, Training Loss: 0.02065
Epoch: 7228, Training Loss: 0.02068
Epoch: 7228, Training Loss: 0.02589
Epoch: 7229, Training Loss: 0.01920
Epoch: 7229, Training Loss: 0.02065
Epoch: 7229, Training Loss: 0.02068
Epoch: 7229, Training Loss: 0.02589
Epoch: 7230, Training Loss: 0.01919
Epoch: 7230, Training Loss: 0.02064
Epoch: 7230, Training Loss: 0.02068
Epoch: 7230, Training Loss: 0.02588
Epoch: 7231, Training Loss: 0.01919
Epoch: 7231, Training Loss: 0.02064
Epoch: 7231, Training Loss: 0.02068
Epoch: 7231, Training Loss: 0.02588
Epoch: 7232, Training Loss: 0.01919
Epoch: 7232, Training Loss: 0.02064
Epoch: 7232, Training Loss: 0.02067
Epoch: 7232, Training Loss: 0.02588
Epoch: 7233, Training Loss: 0.01919
Epoch: 7233, Training Loss: 0.02064
Epoch: 7233, Training Loss: 0.02067
Epoch: 7233, Training Loss: 0.02587
Epoch: 7234, Training Loss: 0.01919
Epoch: 7234, Training Loss: 0.02063
Epoch: 7234, Training Loss: 0.02067
Epoch: 7234, Training Loss: 0.02587
Epoch: 7235, Training Loss: 0.01918
Epoch: 7235, Training Loss: 0.02063
Epoch: 7235, Training Loss: 0.02067
Epoch: 7235, Training Loss: 0.02587
Epoch: 7236, Training Loss: 0.01918
Epoch: 7236, Training Loss: 0.02063
Epoch: 7236, Training Loss: 0.02066
Epoch: 7236, Training Loss: 0.02587
Epoch: 7237, Training Loss: 0.01918
Epoch: 7237, Training Loss: 0.02063
Epoch: 7237, Training Loss: 0.02066
Epoch: 7237, Training Loss: 0.02586
Epoch: 7238, Training Loss: 0.01918
Epoch: 7238, Training Loss: 0.02063
Epoch: 7238, Training Loss: 0.02066
Epoch: 7238, Training Loss: 0.02586
Epoch: 7239, Training Loss: 0.01918
Epoch: 7239, Training Loss: 0.02062
Epoch: 7239, Training Loss: 0.02066
Epoch: 7239, Training Loss: 0.02586
Epoch: 7240, Training Loss: 0.01917
Epoch: 7240, Training Loss: 0.02062
Epoch: 7240, Training Loss: 0.02066
Epoch: 7240, Training Loss: 0.02585
Epoch: 7241, Training Loss: 0.01917
Epoch: 7241, Training Loss: 0.02062
Epoch: 7241, Training Loss: 0.02065
Epoch: 7241, Training Loss: 0.02585
Epoch: 7242, Training Loss: 0.01917
Epoch: 7242, Training Loss: 0.02062
Epoch: 7242, Training Loss: 0.02065
Epoch: 7242, Training Loss: 0.02585
Epoch: 7243, Training Loss: 0.01917
Epoch: 7243, Training Loss: 0.02061
Epoch: 7243, Training Loss: 0.02065
Epoch: 7243, Training Loss: 0.02584
Epoch: 7244, Training Loss: 0.01917
Epoch: 7244, Training Loss: 0.02061
Epoch: 7244, Training Loss: 0.02065
Epoch: 7244, Training Loss: 0.02584
Epoch: 7245, Training Loss: 0.01916
Epoch: 7245, Training Loss: 0.02061
Epoch: 7245, Training Loss: 0.02064
Epoch: 7245, Training Loss: 0.02584
Epoch: 7246, Training Loss: 0.01916
Epoch: 7246, Training Loss: 0.02061
Epoch: 7246, Training Loss: 0.02064
Epoch: 7246, Training Loss: 0.02584
Epoch: 7247, Training Loss: 0.01916
Epoch: 7247, Training Loss: 0.02060
Epoch: 7247, Training Loss: 0.02064
Epoch: 7247, Training Loss: 0.02583
Epoch: 7248, Training Loss: 0.01916
Epoch: 7248, Training Loss: 0.02060
Epoch: 7248, Training Loss: 0.02064
Epoch: 7248, Training Loss: 0.02583
Epoch: 7249, Training Loss: 0.01916
Epoch: 7249, Training Loss: 0.02060
Epoch: 7249, Training Loss: 0.02064
Epoch: 7249, Training Loss: 0.02583
Epoch: 7250, Training Loss: 0.01915
Epoch: 7250, Training Loss: 0.02060
Epoch: 7250, Training Loss: 0.02063
Epoch: 7250, Training Loss: 0.02582
Epoch: 7251, Training Loss: 0.01915
Epoch: 7251, Training Loss: 0.02060
Epoch: 7251, Training Loss: 0.02063
Epoch: 7251, Training Loss: 0.02582
Epoch: 7252, Training Loss: 0.01915
Epoch: 7252, Training Loss: 0.02059
Epoch: 7252, Training Loss: 0.02063
Epoch: 7252, Training Loss: 0.02582
Epoch: 7253, Training Loss: 0.01915
Epoch: 7253, Training Loss: 0.02059
Epoch: 7253, Training Loss: 0.02063
Epoch: 7253, Training Loss: 0.02582
Epoch: 7254, Training Loss: 0.01915
Epoch: 7254, Training Loss: 0.02059
Epoch: 7254, Training Loss: 0.02062
Epoch: 7254, Training Loss: 0.02581
Epoch: 7255, Training Loss: 0.01914
Epoch: 7255, Training Loss: 0.02059
Epoch: 7255, Training Loss: 0.02062
Epoch: 7255, Training Loss: 0.02581
Epoch: 7256, Training Loss: 0.01914
Epoch: 7256, Training Loss: 0.02058
Epoch: 7256, Training Loss: 0.02062
Epoch: 7256, Training Loss: 0.02581
Epoch: 7257, Training Loss: 0.01914
Epoch: 7257, Training Loss: 0.02058
Epoch: 7257, Training Loss: 0.02062
Epoch: 7257, Training Loss: 0.02580
Epoch: 7258, Training Loss: 0.01914
Epoch: 7258, Training Loss: 0.02058
Epoch: 7258, Training Loss: 0.02062
Epoch: 7258, Training Loss: 0.02580
Epoch: 7259, Training Loss: 0.01914
Epoch: 7259, Training Loss: 0.02058
Epoch: 7259, Training Loss: 0.02061
Epoch: 7259, Training Loss: 0.02580
Epoch: 7260, Training Loss: 0.01913
Epoch: 7260, Training Loss: 0.02058
Epoch: 7260, Training Loss: 0.02061
Epoch: 7260, Training Loss: 0.02580
Epoch: 7261, Training Loss: 0.01913
Epoch: 7261, Training Loss: 0.02057
Epoch: 7261, Training Loss: 0.02061
Epoch: 7261, Training Loss: 0.02579
Epoch: 7262, Training Loss: 0.01913
Epoch: 7262, Training Loss: 0.02057
Epoch: 7262, Training Loss: 0.02061
Epoch: 7262, Training Loss: 0.02579
Epoch: 7263, Training Loss: 0.01913
Epoch: 7263, Training Loss: 0.02057
Epoch: 7263, Training Loss: 0.02060
Epoch: 7263, Training Loss: 0.02579
Epoch: 7264, Training Loss: 0.01913
Epoch: 7264, Training Loss: 0.02057
Epoch: 7264, Training Loss: 0.02060
Epoch: 7264, Training Loss: 0.02578
Epoch: 7265, Training Loss: 0.01913
Epoch: 7265, Training Loss: 0.02056
Epoch: 7265, Training Loss: 0.02060
Epoch: 7265, Training Loss: 0.02578
Epoch: 7266, Training Loss: 0.01912
Epoch: 7266, Training Loss: 0.02056
Epoch: 7266, Training Loss: 0.02060
Epoch: 7266, Training Loss: 0.02578
Epoch: 7267, Training Loss: 0.01912
Epoch: 7267, Training Loss: 0.02056
Epoch: 7267, Training Loss: 0.02060
Epoch: 7267, Training Loss: 0.02578
Epoch: 7268, Training Loss: 0.01912
Epoch: 7268, Training Loss: 0.02056
Epoch: 7268, Training Loss: 0.02059
Epoch: 7268, Training Loss: 0.02577
Epoch: 7269, Training Loss: 0.01912
Epoch: 7269, Training Loss: 0.02056
Epoch: 7269, Training Loss: 0.02059
Epoch: 7269, Training Loss: 0.02577
Epoch: 7270, Training Loss: 0.01912
Epoch: 7270, Training Loss: 0.02055
Epoch: 7270, Training Loss: 0.02059
Epoch: 7270, Training Loss: 0.02577
Epoch: 7271, Training Loss: 0.01911
Epoch: 7271, Training Loss: 0.02055
Epoch: 7271, Training Loss: 0.02059
Epoch: 7271, Training Loss: 0.02576
Epoch: 7272, Training Loss: 0.01911
Epoch: 7272, Training Loss: 0.02055
Epoch: 7272, Training Loss: 0.02058
Epoch: 7272, Training Loss: 0.02576
Epoch: 7273, Training Loss: 0.01911
Epoch: 7273, Training Loss: 0.02055
Epoch: 7273, Training Loss: 0.02058
Epoch: 7273, Training Loss: 0.02576
Epoch: 7274, Training Loss: 0.01911
Epoch: 7274, Training Loss: 0.02054
Epoch: 7274, Training Loss: 0.02058
Epoch: 7274, Training Loss: 0.02576
Epoch: 7275, Training Loss: 0.01911
Epoch: 7275, Training Loss: 0.02054
Epoch: 7275, Training Loss: 0.02058
Epoch: 7275, Training Loss: 0.02575
Epoch: 7276, Training Loss: 0.01910
Epoch: 7276, Training Loss: 0.02054
Epoch: 7276, Training Loss: 0.02058
Epoch: 7276, Training Loss: 0.02575
Epoch: 7277, Training Loss: 0.01910
Epoch: 7277, Training Loss: 0.02054
Epoch: 7277, Training Loss: 0.02057
Epoch: 7277, Training Loss: 0.02575
Epoch: 7278, Training Loss: 0.01910
Epoch: 7278, Training Loss: 0.02054
Epoch: 7278, Training Loss: 0.02057
Epoch: 7278, Training Loss: 0.02574
Epoch: 7279, Training Loss: 0.01910
Epoch: 7279, Training Loss: 0.02053
Epoch: 7279, Training Loss: 0.02057
Epoch: 7279, Training Loss: 0.02574
Epoch: 7280, Training Loss: 0.01910
Epoch: 7280, Training Loss: 0.02053
Epoch: 7280, Training Loss: 0.02057
Epoch: 7280, Training Loss: 0.02574
Epoch: 7281, Training Loss: 0.01909
Epoch: 7281, Training Loss: 0.02053
Epoch: 7281, Training Loss: 0.02056
Epoch: 7281, Training Loss: 0.02574
Epoch: 7282, Training Loss: 0.01909
Epoch: 7282, Training Loss: 0.02053
Epoch: 7282, Training Loss: 0.02056
Epoch: 7282, Training Loss: 0.02573
Epoch: 7283, Training Loss: 0.01909
Epoch: 7283, Training Loss: 0.02053
Epoch: 7283, Training Loss: 0.02056
Epoch: 7283, Training Loss: 0.02573
Epoch: 7284, Training Loss: 0.01909
Epoch: 7284, Training Loss: 0.02052
Epoch: 7284, Training Loss: 0.02056
Epoch: 7284, Training Loss: 0.02573
Epoch: 7285, Training Loss: 0.01909
Epoch: 7285, Training Loss: 0.02052
Epoch: 7285, Training Loss: 0.02056
Epoch: 7285, Training Loss: 0.02572
Epoch: 7286, Training Loss: 0.01908
Epoch: 7286, Training Loss: 0.02052
Epoch: 7286, Training Loss: 0.02055
Epoch: 7286, Training Loss: 0.02572
Epoch: 7287, Training Loss: 0.01908
Epoch: 7287, Training Loss: 0.02052
Epoch: 7287, Training Loss: 0.02055
Epoch: 7287, Training Loss: 0.02572
Epoch: 7288, Training Loss: 0.01908
Epoch: 7288, Training Loss: 0.02051
Epoch: 7288, Training Loss: 0.02055
Epoch: 7288, Training Loss: 0.02572
Epoch: 7289, Training Loss: 0.01908
Epoch: 7289, Training Loss: 0.02051
Epoch: 7289, Training Loss: 0.02055
Epoch: 7289, Training Loss: 0.02571
Epoch: 7290, Training Loss: 0.01908
Epoch: 7290, Training Loss: 0.02051
Epoch: 7290, Training Loss: 0.02054
Epoch: 7290, Training Loss: 0.02571
Epoch: 7291, Training Loss: 0.01907
Epoch: 7291, Training Loss: 0.02051
Epoch: 7291, Training Loss: 0.02054
Epoch: 7291, Training Loss: 0.02571
Epoch: 7292, Training Loss: 0.01907
Epoch: 7292, Training Loss: 0.02051
Epoch: 7292, Training Loss: 0.02054
Epoch: 7292, Training Loss: 0.02570
Epoch: 7293, Training Loss: 0.01907
Epoch: 7293, Training Loss: 0.02050
Epoch: 7293, Training Loss: 0.02054
Epoch: 7293, Training Loss: 0.02570
Epoch: 7294, Training Loss: 0.01907
Epoch: 7294, Training Loss: 0.02050
Epoch: 7294, Training Loss: 0.02054
Epoch: 7294, Training Loss: 0.02570
Epoch: 7295, Training Loss: 0.01907
Epoch: 7295, Training Loss: 0.02050
Epoch: 7295, Training Loss: 0.02053
Epoch: 7295, Training Loss: 0.02570
Epoch: 7296, Training Loss: 0.01906
Epoch: 7296, Training Loss: 0.02050
Epoch: 7296, Training Loss: 0.02053
Epoch: 7296, Training Loss: 0.02569
Epoch: 7297, Training Loss: 0.01906
Epoch: 7297, Training Loss: 0.02049
Epoch: 7297, Training Loss: 0.02053
Epoch: 7297, Training Loss: 0.02569
Epoch: 7298, Training Loss: 0.01906
Epoch: 7298, Training Loss: 0.02049
Epoch: 7298, Training Loss: 0.02053
Epoch: 7298, Training Loss: 0.02569
Epoch: 7299, Training Loss: 0.01906
Epoch: 7299, Training Loss: 0.02049
Epoch: 7299, Training Loss: 0.02052
Epoch: 7299, Training Loss: 0.02568
Epoch: 7300, Training Loss: 0.01906
Epoch: 7300, Training Loss: 0.02049
Epoch: 7300, Training Loss: 0.02052
Epoch: 7300, Training Loss: 0.02568
Epoch: 7301, Training Loss: 0.01905
Epoch: 7301, Training Loss: 0.02049
Epoch: 7301, Training Loss: 0.02052
Epoch: 7301, Training Loss: 0.02568
Epoch: 7302, Training Loss: 0.01905
Epoch: 7302, Training Loss: 0.02048
Epoch: 7302, Training Loss: 0.02052
Epoch: 7302, Training Loss: 0.02568
Epoch: 7303, Training Loss: 0.01905
Epoch: 7303, Training Loss: 0.02048
Epoch: 7303, Training Loss: 0.02052
Epoch: 7303, Training Loss: 0.02567
Epoch: 7304, Training Loss: 0.01905
Epoch: 7304, Training Loss: 0.02048
Epoch: 7304, Training Loss: 0.02051
Epoch: 7304, Training Loss: 0.02567
Epoch: 7305, Training Loss: 0.01905
Epoch: 7305, Training Loss: 0.02048
Epoch: 7305, Training Loss: 0.02051
Epoch: 7305, Training Loss: 0.02567
Epoch: 7306, Training Loss: 0.01905
Epoch: 7306, Training Loss: 0.02047
Epoch: 7306, Training Loss: 0.02051
Epoch: 7306, Training Loss: 0.02566
Epoch: 7307, Training Loss: 0.01904
Epoch: 7307, Training Loss: 0.02047
Epoch: 7307, Training Loss: 0.02051
Epoch: 7307, Training Loss: 0.02566
Epoch: 7308, Training Loss: 0.01904
Epoch: 7308, Training Loss: 0.02047
Epoch: 7308, Training Loss: 0.02050
Epoch: 7308, Training Loss: 0.02566
Epoch: 7309, Training Loss: 0.01904
Epoch: 7309, Training Loss: 0.02047
Epoch: 7309, Training Loss: 0.02050
Epoch: 7309, Training Loss: 0.02566
Epoch: 7310, Training Loss: 0.01904
Epoch: 7310, Training Loss: 0.02047
Epoch: 7310, Training Loss: 0.02050
Epoch: 7310, Training Loss: 0.02565
Epoch: 7311, Training Loss: 0.01904
Epoch: 7311, Training Loss: 0.02046
Epoch: 7311, Training Loss: 0.02050
Epoch: 7311, Training Loss: 0.02565
Epoch: 7312, Training Loss: 0.01903
Epoch: 7312, Training Loss: 0.02046
Epoch: 7312, Training Loss: 0.02050
Epoch: 7312, Training Loss: 0.02565
Epoch: 7313, Training Loss: 0.01903
Epoch: 7313, Training Loss: 0.02046
Epoch: 7313, Training Loss: 0.02049
Epoch: 7313, Training Loss: 0.02564
Epoch: 7314, Training Loss: 0.01903
Epoch: 7314, Training Loss: 0.02046
Epoch: 7314, Training Loss: 0.02049
Epoch: 7314, Training Loss: 0.02564
Epoch: 7315, Training Loss: 0.01903
Epoch: 7315, Training Loss: 0.02045
Epoch: 7315, Training Loss: 0.02049
Epoch: 7315, Training Loss: 0.02564
Epoch: 7316, Training Loss: 0.01903
Epoch: 7316, Training Loss: 0.02045
Epoch: 7316, Training Loss: 0.02049
Epoch: 7316, Training Loss: 0.02564
Epoch: 7317, Training Loss: 0.01902
Epoch: 7317, Training Loss: 0.02045
Epoch: 7317, Training Loss: 0.02048
Epoch: 7317, Training Loss: 0.02563
Epoch: 7318, Training Loss: 0.01902
Epoch: 7318, Training Loss: 0.02045
Epoch: 7318, Training Loss: 0.02048
Epoch: 7318, Training Loss: 0.02563
Epoch: 7319, Training Loss: 0.01902
Epoch: 7319, Training Loss: 0.02045
Epoch: 7319, Training Loss: 0.02048
Epoch: 7319, Training Loss: 0.02563
Epoch: 7320, Training Loss: 0.01902
Epoch: 7320, Training Loss: 0.02044
Epoch: 7320, Training Loss: 0.02048
Epoch: 7320, Training Loss: 0.02562
Epoch: 7321, Training Loss: 0.01902
Epoch: 7321, Training Loss: 0.02044
Epoch: 7321, Training Loss: 0.02048
Epoch: 7321, Training Loss: 0.02562
Epoch: 7322, Training Loss: 0.01901
Epoch: 7322, Training Loss: 0.02044
Epoch: 7322, Training Loss: 0.02047
Epoch: 7322, Training Loss: 0.02562
Epoch: 7323, Training Loss: 0.01901
Epoch: 7323, Training Loss: 0.02044
Epoch: 7323, Training Loss: 0.02047
Epoch: 7323, Training Loss: 0.02562
Epoch: 7324, Training Loss: 0.01901
Epoch: 7324, Training Loss: 0.02044
Epoch: 7324, Training Loss: 0.02047
Epoch: 7324, Training Loss: 0.02561
Epoch: 7325, Training Loss: 0.01901
Epoch: 7325, Training Loss: 0.02043
Epoch: 7325, Training Loss: 0.02047
Epoch: 7325, Training Loss: 0.02561
Epoch: 7326, Training Loss: 0.01901
Epoch: 7326, Training Loss: 0.02043
Epoch: 7326, Training Loss: 0.02047
Epoch: 7326, Training Loss: 0.02561
Epoch: 7327, Training Loss: 0.01900
Epoch: 7327, Training Loss: 0.02043
Epoch: 7327, Training Loss: 0.02046
Epoch: 7327, Training Loss: 0.02560
Epoch: 7328, Training Loss: 0.01900
Epoch: 7328, Training Loss: 0.02043
[Training log truncated — epochs 7328 through 7578. Four per-sample losses (one per XOR pattern) are printed each epoch; over this span they decrease very slowly, from roughly 0.01900 / 0.02042 / 0.02046 / 0.02560 at epoch 7328 to roughly 0.01854 / 0.01990 / 0.01994 / 0.02495 at epoch 7578.]
Epoch: 7578, Training Loss: 0.02492
Epoch: 7579, Training Loss: 0.01854
Epoch: 7579, Training Loss: 0.01990
Epoch: 7579, Training Loss: 0.01993
Epoch: 7579, Training Loss: 0.02492
Epoch: 7580, Training Loss: 0.01854
Epoch: 7580, Training Loss: 0.01990
Epoch: 7580, Training Loss: 0.01993
Epoch: 7580, Training Loss: 0.02492
Epoch: 7581, Training Loss: 0.01853
Epoch: 7581, Training Loss: 0.01990
Epoch: 7581, Training Loss: 0.01993
Epoch: 7581, Training Loss: 0.02492
Epoch: 7582, Training Loss: 0.01853
Epoch: 7582, Training Loss: 0.01989
Epoch: 7582, Training Loss: 0.01993
Epoch: 7582, Training Loss: 0.02491
Epoch: 7583, Training Loss: 0.01853
Epoch: 7583, Training Loss: 0.01989
Epoch: 7583, Training Loss: 0.01993
Epoch: 7583, Training Loss: 0.02491
Epoch: 7584, Training Loss: 0.01853
Epoch: 7584, Training Loss: 0.01989
Epoch: 7584, Training Loss: 0.01992
Epoch: 7584, Training Loss: 0.02491
Epoch: 7585, Training Loss: 0.01853
Epoch: 7585, Training Loss: 0.01989
Epoch: 7585, Training Loss: 0.01992
Epoch: 7585, Training Loss: 0.02491
Epoch: 7586, Training Loss: 0.01853
Epoch: 7586, Training Loss: 0.01989
Epoch: 7586, Training Loss: 0.01992
Epoch: 7586, Training Loss: 0.02490
Epoch: 7587, Training Loss: 0.01852
Epoch: 7587, Training Loss: 0.01988
Epoch: 7587, Training Loss: 0.01992
Epoch: 7587, Training Loss: 0.02490
Epoch: 7588, Training Loss: 0.01852
Epoch: 7588, Training Loss: 0.01988
Epoch: 7588, Training Loss: 0.01992
Epoch: 7588, Training Loss: 0.02490
Epoch: 7589, Training Loss: 0.01852
Epoch: 7589, Training Loss: 0.01988
Epoch: 7589, Training Loss: 0.01991
Epoch: 7589, Training Loss: 0.02490
Epoch: 7590, Training Loss: 0.01852
Epoch: 7590, Training Loss: 0.01988
Epoch: 7590, Training Loss: 0.01991
Epoch: 7590, Training Loss: 0.02489
Epoch: 7591, Training Loss: 0.01852
Epoch: 7591, Training Loss: 0.01988
Epoch: 7591, Training Loss: 0.01991
Epoch: 7591, Training Loss: 0.02489
Epoch: 7592, Training Loss: 0.01851
Epoch: 7592, Training Loss: 0.01987
Epoch: 7592, Training Loss: 0.01991
Epoch: 7592, Training Loss: 0.02489
Epoch: 7593, Training Loss: 0.01851
Epoch: 7593, Training Loss: 0.01987
Epoch: 7593, Training Loss: 0.01991
Epoch: 7593, Training Loss: 0.02489
Epoch: 7594, Training Loss: 0.01851
Epoch: 7594, Training Loss: 0.01987
Epoch: 7594, Training Loss: 0.01990
Epoch: 7594, Training Loss: 0.02488
Epoch: 7595, Training Loss: 0.01851
Epoch: 7595, Training Loss: 0.01987
Epoch: 7595, Training Loss: 0.01990
Epoch: 7595, Training Loss: 0.02488
Epoch: 7596, Training Loss: 0.01851
Epoch: 7596, Training Loss: 0.01987
Epoch: 7596, Training Loss: 0.01990
Epoch: 7596, Training Loss: 0.02488
Epoch: 7597, Training Loss: 0.01851
Epoch: 7597, Training Loss: 0.01986
Epoch: 7597, Training Loss: 0.01990
Epoch: 7597, Training Loss: 0.02488
Epoch: 7598, Training Loss: 0.01850
Epoch: 7598, Training Loss: 0.01986
Epoch: 7598, Training Loss: 0.01990
Epoch: 7598, Training Loss: 0.02487
Epoch: 7599, Training Loss: 0.01850
Epoch: 7599, Training Loss: 0.01986
Epoch: 7599, Training Loss: 0.01989
Epoch: 7599, Training Loss: 0.02487
Epoch: 7600, Training Loss: 0.01850
Epoch: 7600, Training Loss: 0.01986
Epoch: 7600, Training Loss: 0.01989
Epoch: 7600, Training Loss: 0.02487
Epoch: 7601, Training Loss: 0.01850
Epoch: 7601, Training Loss: 0.01986
Epoch: 7601, Training Loss: 0.01989
Epoch: 7601, Training Loss: 0.02486
Epoch: 7602, Training Loss: 0.01850
Epoch: 7602, Training Loss: 0.01985
Epoch: 7602, Training Loss: 0.01989
Epoch: 7602, Training Loss: 0.02486
Epoch: 7603, Training Loss: 0.01850
Epoch: 7603, Training Loss: 0.01985
Epoch: 7603, Training Loss: 0.01989
Epoch: 7603, Training Loss: 0.02486
Epoch: 7604, Training Loss: 0.01849
Epoch: 7604, Training Loss: 0.01985
Epoch: 7604, Training Loss: 0.01988
Epoch: 7604, Training Loss: 0.02486
Epoch: 7605, Training Loss: 0.01849
Epoch: 7605, Training Loss: 0.01985
Epoch: 7605, Training Loss: 0.01988
Epoch: 7605, Training Loss: 0.02485
Epoch: 7606, Training Loss: 0.01849
Epoch: 7606, Training Loss: 0.01985
Epoch: 7606, Training Loss: 0.01988
Epoch: 7606, Training Loss: 0.02485
Epoch: 7607, Training Loss: 0.01849
Epoch: 7607, Training Loss: 0.01984
Epoch: 7607, Training Loss: 0.01988
Epoch: 7607, Training Loss: 0.02485
Epoch: 7608, Training Loss: 0.01849
Epoch: 7608, Training Loss: 0.01984
Epoch: 7608, Training Loss: 0.01988
Epoch: 7608, Training Loss: 0.02485
Epoch: 7609, Training Loss: 0.01848
Epoch: 7609, Training Loss: 0.01984
Epoch: 7609, Training Loss: 0.01987
Epoch: 7609, Training Loss: 0.02484
Epoch: 7610, Training Loss: 0.01848
Epoch: 7610, Training Loss: 0.01984
Epoch: 7610, Training Loss: 0.01987
Epoch: 7610, Training Loss: 0.02484
Epoch: 7611, Training Loss: 0.01848
Epoch: 7611, Training Loss: 0.01984
Epoch: 7611, Training Loss: 0.01987
Epoch: 7611, Training Loss: 0.02484
Epoch: 7612, Training Loss: 0.01848
Epoch: 7612, Training Loss: 0.01983
Epoch: 7612, Training Loss: 0.01987
Epoch: 7612, Training Loss: 0.02484
Epoch: 7613, Training Loss: 0.01848
Epoch: 7613, Training Loss: 0.01983
Epoch: 7613, Training Loss: 0.01987
Epoch: 7613, Training Loss: 0.02483
Epoch: 7614, Training Loss: 0.01848
Epoch: 7614, Training Loss: 0.01983
Epoch: 7614, Training Loss: 0.01986
Epoch: 7614, Training Loss: 0.02483
Epoch: 7615, Training Loss: 0.01847
Epoch: 7615, Training Loss: 0.01983
Epoch: 7615, Training Loss: 0.01986
Epoch: 7615, Training Loss: 0.02483
Epoch: 7616, Training Loss: 0.01847
Epoch: 7616, Training Loss: 0.01983
Epoch: 7616, Training Loss: 0.01986
Epoch: 7616, Training Loss: 0.02483
Epoch: 7617, Training Loss: 0.01847
Epoch: 7617, Training Loss: 0.01982
Epoch: 7617, Training Loss: 0.01986
Epoch: 7617, Training Loss: 0.02482
Epoch: 7618, Training Loss: 0.01847
Epoch: 7618, Training Loss: 0.01982
Epoch: 7618, Training Loss: 0.01986
Epoch: 7618, Training Loss: 0.02482
Epoch: 7619, Training Loss: 0.01847
Epoch: 7619, Training Loss: 0.01982
Epoch: 7619, Training Loss: 0.01985
Epoch: 7619, Training Loss: 0.02482
Epoch: 7620, Training Loss: 0.01847
Epoch: 7620, Training Loss: 0.01982
Epoch: 7620, Training Loss: 0.01985
Epoch: 7620, Training Loss: 0.02482
Epoch: 7621, Training Loss: 0.01846
Epoch: 7621, Training Loss: 0.01982
Epoch: 7621, Training Loss: 0.01985
Epoch: 7621, Training Loss: 0.02481
Epoch: 7622, Training Loss: 0.01846
Epoch: 7622, Training Loss: 0.01981
Epoch: 7622, Training Loss: 0.01985
Epoch: 7622, Training Loss: 0.02481
Epoch: 7623, Training Loss: 0.01846
Epoch: 7623, Training Loss: 0.01981
Epoch: 7623, Training Loss: 0.01985
Epoch: 7623, Training Loss: 0.02481
Epoch: 7624, Training Loss: 0.01846
Epoch: 7624, Training Loss: 0.01981
Epoch: 7624, Training Loss: 0.01984
Epoch: 7624, Training Loss: 0.02481
Epoch: 7625, Training Loss: 0.01846
Epoch: 7625, Training Loss: 0.01981
Epoch: 7625, Training Loss: 0.01984
Epoch: 7625, Training Loss: 0.02480
Epoch: 7626, Training Loss: 0.01845
Epoch: 7626, Training Loss: 0.01981
Epoch: 7626, Training Loss: 0.01984
Epoch: 7626, Training Loss: 0.02480
Epoch: 7627, Training Loss: 0.01845
Epoch: 7627, Training Loss: 0.01980
Epoch: 7627, Training Loss: 0.01984
Epoch: 7627, Training Loss: 0.02480
Epoch: 7628, Training Loss: 0.01845
Epoch: 7628, Training Loss: 0.01980
Epoch: 7628, Training Loss: 0.01984
Epoch: 7628, Training Loss: 0.02480
Epoch: 7629, Training Loss: 0.01845
Epoch: 7629, Training Loss: 0.01980
Epoch: 7629, Training Loss: 0.01983
Epoch: 7629, Training Loss: 0.02479
Epoch: 7630, Training Loss: 0.01845
Epoch: 7630, Training Loss: 0.01980
Epoch: 7630, Training Loss: 0.01983
Epoch: 7630, Training Loss: 0.02479
Epoch: 7631, Training Loss: 0.01845
Epoch: 7631, Training Loss: 0.01980
Epoch: 7631, Training Loss: 0.01983
Epoch: 7631, Training Loss: 0.02479
Epoch: 7632, Training Loss: 0.01844
Epoch: 7632, Training Loss: 0.01979
Epoch: 7632, Training Loss: 0.01983
Epoch: 7632, Training Loss: 0.02478
Epoch: 7633, Training Loss: 0.01844
Epoch: 7633, Training Loss: 0.01979
Epoch: 7633, Training Loss: 0.01983
Epoch: 7633, Training Loss: 0.02478
Epoch: 7634, Training Loss: 0.01844
Epoch: 7634, Training Loss: 0.01979
Epoch: 7634, Training Loss: 0.01982
Epoch: 7634, Training Loss: 0.02478
Epoch: 7635, Training Loss: 0.01844
Epoch: 7635, Training Loss: 0.01979
Epoch: 7635, Training Loss: 0.01982
Epoch: 7635, Training Loss: 0.02478
Epoch: 7636, Training Loss: 0.01844
Epoch: 7636, Training Loss: 0.01979
Epoch: 7636, Training Loss: 0.01982
Epoch: 7636, Training Loss: 0.02477
Epoch: 7637, Training Loss: 0.01844
Epoch: 7637, Training Loss: 0.01978
Epoch: 7637, Training Loss: 0.01982
Epoch: 7637, Training Loss: 0.02477
Epoch: 7638, Training Loss: 0.01843
Epoch: 7638, Training Loss: 0.01978
Epoch: 7638, Training Loss: 0.01982
Epoch: 7638, Training Loss: 0.02477
Epoch: 7639, Training Loss: 0.01843
Epoch: 7639, Training Loss: 0.01978
Epoch: 7639, Training Loss: 0.01981
Epoch: 7639, Training Loss: 0.02477
Epoch: 7640, Training Loss: 0.01843
Epoch: 7640, Training Loss: 0.01978
Epoch: 7640, Training Loss: 0.01981
Epoch: 7640, Training Loss: 0.02476
Epoch: 7641, Training Loss: 0.01843
Epoch: 7641, Training Loss: 0.01978
Epoch: 7641, Training Loss: 0.01981
Epoch: 7641, Training Loss: 0.02476
Epoch: 7642, Training Loss: 0.01843
Epoch: 7642, Training Loss: 0.01977
Epoch: 7642, Training Loss: 0.01981
Epoch: 7642, Training Loss: 0.02476
Epoch: 7643, Training Loss: 0.01842
Epoch: 7643, Training Loss: 0.01977
Epoch: 7643, Training Loss: 0.01981
Epoch: 7643, Training Loss: 0.02476
Epoch: 7644, Training Loss: 0.01842
Epoch: 7644, Training Loss: 0.01977
Epoch: 7644, Training Loss: 0.01980
Epoch: 7644, Training Loss: 0.02475
Epoch: 7645, Training Loss: 0.01842
Epoch: 7645, Training Loss: 0.01977
Epoch: 7645, Training Loss: 0.01980
Epoch: 7645, Training Loss: 0.02475
Epoch: 7646, Training Loss: 0.01842
Epoch: 7646, Training Loss: 0.01977
Epoch: 7646, Training Loss: 0.01980
Epoch: 7646, Training Loss: 0.02475
Epoch: 7647, Training Loss: 0.01842
Epoch: 7647, Training Loss: 0.01976
Epoch: 7647, Training Loss: 0.01980
Epoch: 7647, Training Loss: 0.02475
Epoch: 7648, Training Loss: 0.01842
Epoch: 7648, Training Loss: 0.01976
Epoch: 7648, Training Loss: 0.01980
Epoch: 7648, Training Loss: 0.02474
Epoch: 7649, Training Loss: 0.01841
Epoch: 7649, Training Loss: 0.01976
Epoch: 7649, Training Loss: 0.01979
Epoch: 7649, Training Loss: 0.02474
Epoch: 7650, Training Loss: 0.01841
Epoch: 7650, Training Loss: 0.01976
Epoch: 7650, Training Loss: 0.01979
Epoch: 7650, Training Loss: 0.02474
Epoch: 7651, Training Loss: 0.01841
Epoch: 7651, Training Loss: 0.01976
Epoch: 7651, Training Loss: 0.01979
Epoch: 7651, Training Loss: 0.02474
Epoch: 7652, Training Loss: 0.01841
Epoch: 7652, Training Loss: 0.01975
Epoch: 7652, Training Loss: 0.01979
Epoch: 7652, Training Loss: 0.02473
Epoch: 7653, Training Loss: 0.01841
Epoch: 7653, Training Loss: 0.01975
Epoch: 7653, Training Loss: 0.01979
Epoch: 7653, Training Loss: 0.02473
Epoch: 7654, Training Loss: 0.01841
Epoch: 7654, Training Loss: 0.01975
Epoch: 7654, Training Loss: 0.01978
Epoch: 7654, Training Loss: 0.02473
Epoch: 7655, Training Loss: 0.01840
Epoch: 7655, Training Loss: 0.01975
Epoch: 7655, Training Loss: 0.01978
Epoch: 7655, Training Loss: 0.02473
Epoch: 7656, Training Loss: 0.01840
Epoch: 7656, Training Loss: 0.01975
Epoch: 7656, Training Loss: 0.01978
Epoch: 7656, Training Loss: 0.02472
Epoch: 7657, Training Loss: 0.01840
Epoch: 7657, Training Loss: 0.01974
Epoch: 7657, Training Loss: 0.01978
Epoch: 7657, Training Loss: 0.02472
Epoch: 7658, Training Loss: 0.01840
Epoch: 7658, Training Loss: 0.01974
Epoch: 7658, Training Loss: 0.01978
Epoch: 7658, Training Loss: 0.02472
Epoch: 7659, Training Loss: 0.01840
Epoch: 7659, Training Loss: 0.01974
Epoch: 7659, Training Loss: 0.01977
Epoch: 7659, Training Loss: 0.02472
Epoch: 7660, Training Loss: 0.01839
Epoch: 7660, Training Loss: 0.01974
Epoch: 7660, Training Loss: 0.01977
Epoch: 7660, Training Loss: 0.02471
Epoch: 7661, Training Loss: 0.01839
Epoch: 7661, Training Loss: 0.01974
Epoch: 7661, Training Loss: 0.01977
Epoch: 7661, Training Loss: 0.02471
Epoch: 7662, Training Loss: 0.01839
Epoch: 7662, Training Loss: 0.01973
Epoch: 7662, Training Loss: 0.01977
Epoch: 7662, Training Loss: 0.02471
Epoch: 7663, Training Loss: 0.01839
Epoch: 7663, Training Loss: 0.01973
Epoch: 7663, Training Loss: 0.01977
Epoch: 7663, Training Loss: 0.02471
Epoch: 7664, Training Loss: 0.01839
Epoch: 7664, Training Loss: 0.01973
Epoch: 7664, Training Loss: 0.01976
Epoch: 7664, Training Loss: 0.02470
Epoch: 7665, Training Loss: 0.01839
Epoch: 7665, Training Loss: 0.01973
Epoch: 7665, Training Loss: 0.01976
Epoch: 7665, Training Loss: 0.02470
Epoch: 7666, Training Loss: 0.01838
Epoch: 7666, Training Loss: 0.01973
Epoch: 7666, Training Loss: 0.01976
Epoch: 7666, Training Loss: 0.02470
Epoch: 7667, Training Loss: 0.01838
Epoch: 7667, Training Loss: 0.01973
Epoch: 7667, Training Loss: 0.01976
Epoch: 7667, Training Loss: 0.02470
Epoch: 7668, Training Loss: 0.01838
Epoch: 7668, Training Loss: 0.01972
Epoch: 7668, Training Loss: 0.01976
Epoch: 7668, Training Loss: 0.02469
Epoch: 7669, Training Loss: 0.01838
Epoch: 7669, Training Loss: 0.01972
Epoch: 7669, Training Loss: 0.01975
Epoch: 7669, Training Loss: 0.02469
Epoch: 7670, Training Loss: 0.01838
Epoch: 7670, Training Loss: 0.01972
Epoch: 7670, Training Loss: 0.01975
Epoch: 7670, Training Loss: 0.02469
Epoch: 7671, Training Loss: 0.01838
Epoch: 7671, Training Loss: 0.01972
Epoch: 7671, Training Loss: 0.01975
Epoch: 7671, Training Loss: 0.02469
Epoch: 7672, Training Loss: 0.01837
Epoch: 7672, Training Loss: 0.01972
Epoch: 7672, Training Loss: 0.01975
Epoch: 7672, Training Loss: 0.02468
Epoch: 7673, Training Loss: 0.01837
Epoch: 7673, Training Loss: 0.01971
Epoch: 7673, Training Loss: 0.01975
Epoch: 7673, Training Loss: 0.02468
Epoch: 7674, Training Loss: 0.01837
Epoch: 7674, Training Loss: 0.01971
Epoch: 7674, Training Loss: 0.01974
Epoch: 7674, Training Loss: 0.02468
Epoch: 7675, Training Loss: 0.01837
Epoch: 7675, Training Loss: 0.01971
Epoch: 7675, Training Loss: 0.01974
Epoch: 7675, Training Loss: 0.02468
Epoch: 7676, Training Loss: 0.01837
Epoch: 7676, Training Loss: 0.01971
Epoch: 7676, Training Loss: 0.01974
Epoch: 7676, Training Loss: 0.02467
Epoch: 7677, Training Loss: 0.01837
Epoch: 7677, Training Loss: 0.01971
Epoch: 7677, Training Loss: 0.01974
Epoch: 7677, Training Loss: 0.02467
Epoch: 7678, Training Loss: 0.01836
Epoch: 7678, Training Loss: 0.01970
Epoch: 7678, Training Loss: 0.01974
Epoch: 7678, Training Loss: 0.02467
Epoch: 7679, Training Loss: 0.01836
Epoch: 7679, Training Loss: 0.01970
Epoch: 7679, Training Loss: 0.01973
Epoch: 7679, Training Loss: 0.02467
Epoch: 7680, Training Loss: 0.01836
Epoch: 7680, Training Loss: 0.01970
Epoch: 7680, Training Loss: 0.01973
Epoch: 7680, Training Loss: 0.02466
Epoch: 7681, Training Loss: 0.01836
Epoch: 7681, Training Loss: 0.01970
Epoch: 7681, Training Loss: 0.01973
Epoch: 7681, Training Loss: 0.02466
Epoch: 7682, Training Loss: 0.01836
Epoch: 7682, Training Loss: 0.01970
Epoch: 7682, Training Loss: 0.01973
Epoch: 7682, Training Loss: 0.02466
Epoch: 7683, Training Loss: 0.01835
Epoch: 7683, Training Loss: 0.01969
Epoch: 7683, Training Loss: 0.01973
Epoch: 7683, Training Loss: 0.02465
Epoch: 7684, Training Loss: 0.01835
Epoch: 7684, Training Loss: 0.01969
Epoch: 7684, Training Loss: 0.01972
Epoch: 7684, Training Loss: 0.02465
Epoch: 7685, Training Loss: 0.01835
Epoch: 7685, Training Loss: 0.01969
Epoch: 7685, Training Loss: 0.01972
Epoch: 7685, Training Loss: 0.02465
Epoch: 7686, Training Loss: 0.01835
Epoch: 7686, Training Loss: 0.01969
Epoch: 7686, Training Loss: 0.01972
Epoch: 7686, Training Loss: 0.02465
Epoch: 7687, Training Loss: 0.01835
Epoch: 7687, Training Loss: 0.01969
Epoch: 7687, Training Loss: 0.01972
Epoch: 7687, Training Loss: 0.02464
Epoch: 7688, Training Loss: 0.01835
Epoch: 7688, Training Loss: 0.01968
Epoch: 7688, Training Loss: 0.01972
Epoch: 7688, Training Loss: 0.02464
Epoch: 7689, Training Loss: 0.01834
Epoch: 7689, Training Loss: 0.01968
Epoch: 7689, Training Loss: 0.01971
Epoch: 7689, Training Loss: 0.02464
Epoch: 7690, Training Loss: 0.01834
Epoch: 7690, Training Loss: 0.01968
Epoch: 7690, Training Loss: 0.01971
Epoch: 7690, Training Loss: 0.02464
Epoch: 7691, Training Loss: 0.01834
Epoch: 7691, Training Loss: 0.01968
Epoch: 7691, Training Loss: 0.01971
Epoch: 7691, Training Loss: 0.02463
Epoch: 7692, Training Loss: 0.01834
Epoch: 7692, Training Loss: 0.01968
Epoch: 7692, Training Loss: 0.01971
Epoch: 7692, Training Loss: 0.02463
Epoch: 7693, Training Loss: 0.01834
Epoch: 7693, Training Loss: 0.01967
Epoch: 7693, Training Loss: 0.01971
Epoch: 7693, Training Loss: 0.02463
Epoch: 7694, Training Loss: 0.01834
Epoch: 7694, Training Loss: 0.01967
Epoch: 7694, Training Loss: 0.01971
Epoch: 7694, Training Loss: 0.02463
Epoch: 7695, Training Loss: 0.01833
Epoch: 7695, Training Loss: 0.01967
Epoch: 7695, Training Loss: 0.01970
Epoch: 7695, Training Loss: 0.02462
Epoch: 7696, Training Loss: 0.01833
Epoch: 7696, Training Loss: 0.01967
Epoch: 7696, Training Loss: 0.01970
Epoch: 7696, Training Loss: 0.02462
Epoch: 7697, Training Loss: 0.01833
Epoch: 7697, Training Loss: 0.01967
Epoch: 7697, Training Loss: 0.01970
Epoch: 7697, Training Loss: 0.02462
Epoch: 7698, Training Loss: 0.01833
Epoch: 7698, Training Loss: 0.01966
Epoch: 7698, Training Loss: 0.01970
Epoch: 7698, Training Loss: 0.02462
Epoch: 7699, Training Loss: 0.01833
Epoch: 7699, Training Loss: 0.01966
Epoch: 7699, Training Loss: 0.01970
Epoch: 7699, Training Loss: 0.02461
Epoch: 7700, Training Loss: 0.01833
Epoch: 7700, Training Loss: 0.01966
Epoch: 7700, Training Loss: 0.01969
Epoch: 7700, Training Loss: 0.02461
Epoch: 7701, Training Loss: 0.01832
Epoch: 7701, Training Loss: 0.01966
Epoch: 7701, Training Loss: 0.01969
Epoch: 7701, Training Loss: 0.02461
Epoch: 7702, Training Loss: 0.01832
Epoch: 7702, Training Loss: 0.01966
Epoch: 7702, Training Loss: 0.01969
Epoch: 7702, Training Loss: 0.02461
Epoch: 7703, Training Loss: 0.01832
Epoch: 7703, Training Loss: 0.01965
Epoch: 7703, Training Loss: 0.01969
Epoch: 7703, Training Loss: 0.02460
Epoch: 7704, Training Loss: 0.01832
Epoch: 7704, Training Loss: 0.01965
Epoch: 7704, Training Loss: 0.01969
Epoch: 7704, Training Loss: 0.02460
Epoch: 7705, Training Loss: 0.01832
Epoch: 7705, Training Loss: 0.01965
Epoch: 7705, Training Loss: 0.01968
Epoch: 7705, Training Loss: 0.02460
Epoch: 7706, Training Loss: 0.01832
Epoch: 7706, Training Loss: 0.01965
Epoch: 7706, Training Loss: 0.01968
Epoch: 7706, Training Loss: 0.02460
Epoch: 7707, Training Loss: 0.01831
Epoch: 7707, Training Loss: 0.01965
Epoch: 7707, Training Loss: 0.01968
Epoch: 7707, Training Loss: 0.02459
Epoch: 7708, Training Loss: 0.01831
Epoch: 7708, Training Loss: 0.01964
Epoch: 7708, Training Loss: 0.01968
Epoch: 7708, Training Loss: 0.02459
Epoch: 7709, Training Loss: 0.01831
Epoch: 7709, Training Loss: 0.01964
Epoch: 7709, Training Loss: 0.01968
Epoch: 7709, Training Loss: 0.02459
Epoch: 7710, Training Loss: 0.01831
Epoch: 7710, Training Loss: 0.01964
Epoch: 7710, Training Loss: 0.01967
Epoch: 7710, Training Loss: 0.02459
Epoch: 7711, Training Loss: 0.01831
Epoch: 7711, Training Loss: 0.01964
Epoch: 7711, Training Loss: 0.01967
Epoch: 7711, Training Loss: 0.02458
Epoch: 7712, Training Loss: 0.01830
Epoch: 7712, Training Loss: 0.01964
Epoch: 7712, Training Loss: 0.01967
Epoch: 7712, Training Loss: 0.02458
Epoch: 7713, Training Loss: 0.01830
Epoch: 7713, Training Loss: 0.01964
Epoch: 7713, Training Loss: 0.01967
Epoch: 7713, Training Loss: 0.02458
Epoch: 7714, Training Loss: 0.01830
Epoch: 7714, Training Loss: 0.01963
Epoch: 7714, Training Loss: 0.01967
Epoch: 7714, Training Loss: 0.02458
Epoch: 7715, Training Loss: 0.01830
Epoch: 7715, Training Loss: 0.01963
Epoch: 7715, Training Loss: 0.01966
Epoch: 7715, Training Loss: 0.02457
Epoch: 7716, Training Loss: 0.01830
Epoch: 7716, Training Loss: 0.01963
Epoch: 7716, Training Loss: 0.01966
Epoch: 7716, Training Loss: 0.02457
Epoch: 7717, Training Loss: 0.01830
Epoch: 7717, Training Loss: 0.01963
Epoch: 7717, Training Loss: 0.01966
Epoch: 7717, Training Loss: 0.02457
Epoch: 7718, Training Loss: 0.01829
Epoch: 7718, Training Loss: 0.01963
Epoch: 7718, Training Loss: 0.01966
Epoch: 7718, Training Loss: 0.02457
Epoch: 7719, Training Loss: 0.01829
Epoch: 7719, Training Loss: 0.01962
Epoch: 7719, Training Loss: 0.01966
Epoch: 7719, Training Loss: 0.02456
Epoch: 7720, Training Loss: 0.01829
Epoch: 7720, Training Loss: 0.01962
Epoch: 7720, Training Loss: 0.01965
Epoch: 7720, Training Loss: 0.02456
Epoch: 7721, Training Loss: 0.01829
Epoch: 7721, Training Loss: 0.01962
Epoch: 7721, Training Loss: 0.01965
Epoch: 7721, Training Loss: 0.02456
Epoch: 7722, Training Loss: 0.01829
Epoch: 7722, Training Loss: 0.01962
Epoch: 7722, Training Loss: 0.01965
Epoch: 7722, Training Loss: 0.02456
Epoch: 7723, Training Loss: 0.01829
Epoch: 7723, Training Loss: 0.01962
Epoch: 7723, Training Loss: 0.01965
Epoch: 7723, Training Loss: 0.02455
Epoch: 7724, Training Loss: 0.01828
Epoch: 7724, Training Loss: 0.01961
Epoch: 7724, Training Loss: 0.01965
Epoch: 7724, Training Loss: 0.02455
Epoch: 7725, Training Loss: 0.01828
Epoch: 7725, Training Loss: 0.01961
Epoch: 7725, Training Loss: 0.01964
Epoch: 7725, Training Loss: 0.02455
Epoch: 7726, Training Loss: 0.01828
Epoch: 7726, Training Loss: 0.01961
Epoch: 7726, Training Loss: 0.01964
Epoch: 7726, Training Loss: 0.02455
Epoch: 7727, Training Loss: 0.01828
Epoch: 7727, Training Loss: 0.01961
Epoch: 7727, Training Loss: 0.01964
Epoch: 7727, Training Loss: 0.02454
Epoch: 7728, Training Loss: 0.01828
Epoch: 7728, Training Loss: 0.01961
Epoch: 7728, Training Loss: 0.01964
Epoch: 7728, Training Loss: 0.02454
Epoch: 7729, Training Loss: 0.01828
Epoch: 7729, Training Loss: 0.01960
Epoch: 7729, Training Loss: 0.01964
Epoch: 7729, Training Loss: 0.02454
Epoch: 7730, Training Loss: 0.01827
Epoch: 7730, Training Loss: 0.01960
Epoch: 7730, Training Loss: 0.01963
Epoch: 7730, Training Loss: 0.02454
Epoch: 7731, Training Loss: 0.01827
Epoch: 7731, Training Loss: 0.01960
Epoch: 7731, Training Loss: 0.01963
Epoch: 7731, Training Loss: 0.02453
Epoch: 7732, Training Loss: 0.01827
Epoch: 7732, Training Loss: 0.01960
Epoch: 7732, Training Loss: 0.01963
Epoch: 7732, Training Loss: 0.02453
Epoch: 7733, Training Loss: 0.01827
Epoch: 7733, Training Loss: 0.01960
Epoch: 7733, Training Loss: 0.01963
Epoch: 7733, Training Loss: 0.02453
Epoch: 7734, Training Loss: 0.01827
Epoch: 7734, Training Loss: 0.01959
Epoch: 7734, Training Loss: 0.01963
Epoch: 7734, Training Loss: 0.02453
Epoch: 7735, Training Loss: 0.01827
Epoch: 7735, Training Loss: 0.01959
Epoch: 7735, Training Loss: 0.01963
Epoch: 7735, Training Loss: 0.02452
Epoch: 7736, Training Loss: 0.01826
Epoch: 7736, Training Loss: 0.01959
Epoch: 7736, Training Loss: 0.01962
Epoch: 7736, Training Loss: 0.02452
Epoch: 7737, Training Loss: 0.01826
Epoch: 7737, Training Loss: 0.01959
Epoch: 7737, Training Loss: 0.01962
Epoch: 7737, Training Loss: 0.02452
Epoch: 7738, Training Loss: 0.01826
Epoch: 7738, Training Loss: 0.01959
Epoch: 7738, Training Loss: 0.01962
Epoch: 7738, Training Loss: 0.02452
Epoch: 7739, Training Loss: 0.01826
Epoch: 7739, Training Loss: 0.01959
Epoch: 7739, Training Loss: 0.01962
Epoch: 7739, Training Loss: 0.02451
Epoch: 7740, Training Loss: 0.01826
Epoch: 7740, Training Loss: 0.01958
Epoch: 7740, Training Loss: 0.01962
Epoch: 7740, Training Loss: 0.02451
Epoch: 7741, Training Loss: 0.01826
Epoch: 7741, Training Loss: 0.01958
Epoch: 7741, Training Loss: 0.01961
Epoch: 7741, Training Loss: 0.02451
Epoch: 7742, Training Loss: 0.01825
Epoch: 7742, Training Loss: 0.01958
Epoch: 7742, Training Loss: 0.01961
Epoch: 7742, Training Loss: 0.02451
Epoch: 7743, Training Loss: 0.01825
Epoch: 7743, Training Loss: 0.01958
Epoch: 7743, Training Loss: 0.01961
Epoch: 7743, Training Loss: 0.02450
Epoch: 7744, Training Loss: 0.01825
Epoch: 7744, Training Loss: 0.01958
Epoch: 7744, Training Loss: 0.01961
Epoch: 7744, Training Loss: 0.02450
Epoch: 7745, Training Loss: 0.01825
Epoch: 7745, Training Loss: 0.01957
Epoch: 7745, Training Loss: 0.01961
Epoch: 7745, Training Loss: 0.02450
Epoch: 7746, Training Loss: 0.01825
Epoch: 7746, Training Loss: 0.01957
Epoch: 7746, Training Loss: 0.01960
Epoch: 7746, Training Loss: 0.02450
Epoch: 7747, Training Loss: 0.01824
Epoch: 7747, Training Loss: 0.01957
Epoch: 7747, Training Loss: 0.01960
Epoch: 7747, Training Loss: 0.02449
Epoch: 7748, Training Loss: 0.01824
Epoch: 7748, Training Loss: 0.01957
Epoch: 7748, Training Loss: 0.01960
Epoch: 7748, Training Loss: 0.02449
Epoch: 7749, Training Loss: 0.01824
Epoch: 7749, Training Loss: 0.01957
Epoch: 7749, Training Loss: 0.01960
Epoch: 7749, Training Loss: 0.02449
Epoch: 7750, Training Loss: 0.01824
Epoch: 7750, Training Loss: 0.01956
Epoch: 7750, Training Loss: 0.01960
Epoch: 7750, Training Loss: 0.02449
Epoch: 7751, Training Loss: 0.01824
Epoch: 7751, Training Loss: 0.01956
Epoch: 7751, Training Loss: 0.01959
Epoch: 7751, Training Loss: 0.02448
Epoch: 7752, Training Loss: 0.01824
Epoch: 7752, Training Loss: 0.01956
Epoch: 7752, Training Loss: 0.01959
Epoch: 7752, Training Loss: 0.02448
Epoch: 7753, Training Loss: 0.01823
Epoch: 7753, Training Loss: 0.01956
Epoch: 7753, Training Loss: 0.01959
Epoch: 7753, Training Loss: 0.02448
Epoch: 7754, Training Loss: 0.01823
Epoch: 7754, Training Loss: 0.01956
Epoch: 7754, Training Loss: 0.01959
Epoch: 7754, Training Loss: 0.02448
Epoch: 7755, Training Loss: 0.01823
Epoch: 7755, Training Loss: 0.01955
Epoch: 7755, Training Loss: 0.01959
Epoch: 7755, Training Loss: 0.02447
Epoch: 7756, Training Loss: 0.01823
Epoch: 7756, Training Loss: 0.01955
Epoch: 7756, Training Loss: 0.01958
Epoch: 7756, Training Loss: 0.02447
Epoch: 7757, Training Loss: 0.01823
Epoch: 7757, Training Loss: 0.01955
Epoch: 7757, Training Loss: 0.01958
Epoch: 7757, Training Loss: 0.02447
Epoch: 7758, Training Loss: 0.01823
Epoch: 7758, Training Loss: 0.01955
Epoch: 7758, Training Loss: 0.01958
Epoch: 7758, Training Loss: 0.02447
Epoch: 7759, Training Loss: 0.01822
Epoch: 7759, Training Loss: 0.01955
Epoch: 7759, Training Loss: 0.01958
Epoch: 7759, Training Loss: 0.02446
Epoch: 7760, Training Loss: 0.01822
Epoch: 7760, Training Loss: 0.01954
Epoch: 7760, Training Loss: 0.01958
Epoch: 7760, Training Loss: 0.02446
Epoch: 7761, Training Loss: 0.01822
Epoch: 7761, Training Loss: 0.01954
Epoch: 7761, Training Loss: 0.01958
Epoch: 7761, Training Loss: 0.02446
Epoch: 7762, Training Loss: 0.01822
Epoch: 7762, Training Loss: 0.01954
Epoch: 7762, Training Loss: 0.01957
Epoch: 7762, Training Loss: 0.02446
Epoch: 7763, Training Loss: 0.01822
Epoch: 7763, Training Loss: 0.01954
Epoch: 7763, Training Loss: 0.01957
Epoch: 7763, Training Loss: 0.02446
Epoch: 7764, Training Loss: 0.01822
Epoch: 7764, Training Loss: 0.01954
Epoch: 7764, Training Loss: 0.01957
Epoch: 7764, Training Loss: 0.02445
Epoch: 7765, Training Loss: 0.01821
Epoch: 7765, Training Loss: 0.01954
Epoch: 7765, Training Loss: 0.01957
Epoch: 7765, Training Loss: 0.02445
Epoch: 7766, Training Loss: 0.01821
Epoch: 7766, Training Loss: 0.01953
Epoch: 7766, Training Loss: 0.01957
Epoch: 7766, Training Loss: 0.02445
Epoch: 7767, Training Loss: 0.01821
Epoch: 7767, Training Loss: 0.01953
Epoch: 7767, Training Loss: 0.01956
Epoch: 7767, Training Loss: 0.02445
Epoch: 7768, Training Loss: 0.01821
Epoch: 7768, Training Loss: 0.01953
Epoch: 7768, Training Loss: 0.01956
Epoch: 7768, Training Loss: 0.02444
Epoch: 7769, Training Loss: 0.01821
Epoch: 7769, Training Loss: 0.01953
Epoch: 7769, Training Loss: 0.01956
Epoch: 7769, Training Loss: 0.02444
Epoch: 7770, Training Loss: 0.01821
Epoch: 7770, Training Loss: 0.01953
Epoch: 7770, Training Loss: 0.01956
Epoch: 7770, Training Loss: 0.02444
Epoch: 7771, Training Loss: 0.01820
Epoch: 7771, Training Loss: 0.01952
Epoch: 7771, Training Loss: 0.01956
Epoch: 7771, Training Loss: 0.02444
Epoch: 7772, Training Loss: 0.01820
Epoch: 7772, Training Loss: 0.01952
Epoch: 7772, Training Loss: 0.01955
Epoch: 7772, Training Loss: 0.02443
Epoch: 7773, Training Loss: 0.01820
Epoch: 7773, Training Loss: 0.01952
Epoch: 7773, Training Loss: 0.01955
Epoch: 7773, Training Loss: 0.02443
[... output condensed: the four per-sample losses are printed every epoch and decrease slowly, from (0.01820, 0.01952, 0.01955, 0.02443) at epoch 7773 to (0.01779, 0.01906, 0.01909, 0.02384) at epoch 8022 ...]
Epoch: 8023, Training Loss: 0.01779
Epoch: 8023, Training Loss: 0.01906
Epoch: 8023, Training Loss: 0.01909
Epoch: 8023, Training Loss: 0.02384
Epoch: 8024, Training Loss: 0.01779
Epoch: 8024, Training Loss: 0.01906
Epoch: 8024, Training Loss: 0.01909
Epoch: 8024, Training Loss: 0.02383
Epoch: 8025, Training Loss: 0.01779
Epoch: 8025, Training Loss: 0.01906
Epoch: 8025, Training Loss: 0.01909
Epoch: 8025, Training Loss: 0.02383
Epoch: 8026, Training Loss: 0.01779
Epoch: 8026, Training Loss: 0.01905
Epoch: 8026, Training Loss: 0.01909
Epoch: 8026, Training Loss: 0.02383
Epoch: 8027, Training Loss: 0.01779
Epoch: 8027, Training Loss: 0.01905
Epoch: 8027, Training Loss: 0.01908
Epoch: 8027, Training Loss: 0.02383
Epoch: 8028, Training Loss: 0.01778
Epoch: 8028, Training Loss: 0.01905
Epoch: 8028, Training Loss: 0.01908
Epoch: 8028, Training Loss: 0.02382
Epoch: 8029, Training Loss: 0.01778
Epoch: 8029, Training Loss: 0.01905
Epoch: 8029, Training Loss: 0.01908
Epoch: 8029, Training Loss: 0.02382
Epoch: 8030, Training Loss: 0.01778
Epoch: 8030, Training Loss: 0.01905
Epoch: 8030, Training Loss: 0.01908
Epoch: 8030, Training Loss: 0.02382
Epoch: 8031, Training Loss: 0.01778
Epoch: 8031, Training Loss: 0.01905
Epoch: 8031, Training Loss: 0.01908
Epoch: 8031, Training Loss: 0.02382
Epoch: 8032, Training Loss: 0.01778
Epoch: 8032, Training Loss: 0.01904
Epoch: 8032, Training Loss: 0.01907
Epoch: 8032, Training Loss: 0.02382
Epoch: 8033, Training Loss: 0.01778
Epoch: 8033, Training Loss: 0.01904
Epoch: 8033, Training Loss: 0.01907
Epoch: 8033, Training Loss: 0.02381
Epoch: 8034, Training Loss: 0.01777
Epoch: 8034, Training Loss: 0.01904
Epoch: 8034, Training Loss: 0.01907
Epoch: 8034, Training Loss: 0.02381
Epoch: 8035, Training Loss: 0.01777
Epoch: 8035, Training Loss: 0.01904
Epoch: 8035, Training Loss: 0.01907
Epoch: 8035, Training Loss: 0.02381
Epoch: 8036, Training Loss: 0.01777
Epoch: 8036, Training Loss: 0.01904
Epoch: 8036, Training Loss: 0.01907
Epoch: 8036, Training Loss: 0.02381
Epoch: 8037, Training Loss: 0.01777
Epoch: 8037, Training Loss: 0.01903
Epoch: 8037, Training Loss: 0.01907
Epoch: 8037, Training Loss: 0.02380
Epoch: 8038, Training Loss: 0.01777
Epoch: 8038, Training Loss: 0.01903
Epoch: 8038, Training Loss: 0.01906
Epoch: 8038, Training Loss: 0.02380
Epoch: 8039, Training Loss: 0.01777
Epoch: 8039, Training Loss: 0.01903
Epoch: 8039, Training Loss: 0.01906
Epoch: 8039, Training Loss: 0.02380
Epoch: 8040, Training Loss: 0.01777
Epoch: 8040, Training Loss: 0.01903
Epoch: 8040, Training Loss: 0.01906
Epoch: 8040, Training Loss: 0.02380
Epoch: 8041, Training Loss: 0.01776
Epoch: 8041, Training Loss: 0.01903
Epoch: 8041, Training Loss: 0.01906
Epoch: 8041, Training Loss: 0.02380
Epoch: 8042, Training Loss: 0.01776
Epoch: 8042, Training Loss: 0.01903
Epoch: 8042, Training Loss: 0.01906
Epoch: 8042, Training Loss: 0.02379
Epoch: 8043, Training Loss: 0.01776
Epoch: 8043, Training Loss: 0.01902
Epoch: 8043, Training Loss: 0.01906
Epoch: 8043, Training Loss: 0.02379
Epoch: 8044, Training Loss: 0.01776
Epoch: 8044, Training Loss: 0.01902
Epoch: 8044, Training Loss: 0.01905
Epoch: 8044, Training Loss: 0.02379
Epoch: 8045, Training Loss: 0.01776
Epoch: 8045, Training Loss: 0.01902
Epoch: 8045, Training Loss: 0.01905
Epoch: 8045, Training Loss: 0.02379
Epoch: 8046, Training Loss: 0.01776
Epoch: 8046, Training Loss: 0.01902
Epoch: 8046, Training Loss: 0.01905
Epoch: 8046, Training Loss: 0.02378
Epoch: 8047, Training Loss: 0.01775
Epoch: 8047, Training Loss: 0.01902
Epoch: 8047, Training Loss: 0.01905
Epoch: 8047, Training Loss: 0.02378
Epoch: 8048, Training Loss: 0.01775
Epoch: 8048, Training Loss: 0.01901
Epoch: 8048, Training Loss: 0.01905
Epoch: 8048, Training Loss: 0.02378
Epoch: 8049, Training Loss: 0.01775
Epoch: 8049, Training Loss: 0.01901
Epoch: 8049, Training Loss: 0.01904
Epoch: 8049, Training Loss: 0.02378
Epoch: 8050, Training Loss: 0.01775
Epoch: 8050, Training Loss: 0.01901
Epoch: 8050, Training Loss: 0.01904
Epoch: 8050, Training Loss: 0.02377
Epoch: 8051, Training Loss: 0.01775
Epoch: 8051, Training Loss: 0.01901
Epoch: 8051, Training Loss: 0.01904
Epoch: 8051, Training Loss: 0.02377
Epoch: 8052, Training Loss: 0.01775
Epoch: 8052, Training Loss: 0.01901
Epoch: 8052, Training Loss: 0.01904
Epoch: 8052, Training Loss: 0.02377
Epoch: 8053, Training Loss: 0.01774
Epoch: 8053, Training Loss: 0.01901
Epoch: 8053, Training Loss: 0.01904
Epoch: 8053, Training Loss: 0.02377
Epoch: 8054, Training Loss: 0.01774
Epoch: 8054, Training Loss: 0.01900
Epoch: 8054, Training Loss: 0.01904
Epoch: 8054, Training Loss: 0.02377
Epoch: 8055, Training Loss: 0.01774
Epoch: 8055, Training Loss: 0.01900
Epoch: 8055, Training Loss: 0.01903
Epoch: 8055, Training Loss: 0.02376
Epoch: 8056, Training Loss: 0.01774
Epoch: 8056, Training Loss: 0.01900
Epoch: 8056, Training Loss: 0.01903
Epoch: 8056, Training Loss: 0.02376
Epoch: 8057, Training Loss: 0.01774
Epoch: 8057, Training Loss: 0.01900
Epoch: 8057, Training Loss: 0.01903
Epoch: 8057, Training Loss: 0.02376
Epoch: 8058, Training Loss: 0.01774
Epoch: 8058, Training Loss: 0.01900
Epoch: 8058, Training Loss: 0.01903
Epoch: 8058, Training Loss: 0.02376
Epoch: 8059, Training Loss: 0.01774
Epoch: 8059, Training Loss: 0.01900
Epoch: 8059, Training Loss: 0.01903
Epoch: 8059, Training Loss: 0.02375
Epoch: 8060, Training Loss: 0.01773
Epoch: 8060, Training Loss: 0.01899
Epoch: 8060, Training Loss: 0.01903
Epoch: 8060, Training Loss: 0.02375
Epoch: 8061, Training Loss: 0.01773
Epoch: 8061, Training Loss: 0.01899
Epoch: 8061, Training Loss: 0.01902
Epoch: 8061, Training Loss: 0.02375
Epoch: 8062, Training Loss: 0.01773
Epoch: 8062, Training Loss: 0.01899
Epoch: 8062, Training Loss: 0.01902
Epoch: 8062, Training Loss: 0.02375
Epoch: 8063, Training Loss: 0.01773
Epoch: 8063, Training Loss: 0.01899
Epoch: 8063, Training Loss: 0.01902
Epoch: 8063, Training Loss: 0.02375
Epoch: 8064, Training Loss: 0.01773
Epoch: 8064, Training Loss: 0.01899
Epoch: 8064, Training Loss: 0.01902
Epoch: 8064, Training Loss: 0.02374
Epoch: 8065, Training Loss: 0.01773
Epoch: 8065, Training Loss: 0.01898
Epoch: 8065, Training Loss: 0.01902
Epoch: 8065, Training Loss: 0.02374
Epoch: 8066, Training Loss: 0.01772
Epoch: 8066, Training Loss: 0.01898
Epoch: 8066, Training Loss: 0.01901
Epoch: 8066, Training Loss: 0.02374
Epoch: 8067, Training Loss: 0.01772
Epoch: 8067, Training Loss: 0.01898
Epoch: 8067, Training Loss: 0.01901
Epoch: 8067, Training Loss: 0.02374
Epoch: 8068, Training Loss: 0.01772
Epoch: 8068, Training Loss: 0.01898
Epoch: 8068, Training Loss: 0.01901
Epoch: 8068, Training Loss: 0.02373
Epoch: 8069, Training Loss: 0.01772
Epoch: 8069, Training Loss: 0.01898
Epoch: 8069, Training Loss: 0.01901
Epoch: 8069, Training Loss: 0.02373
Epoch: 8070, Training Loss: 0.01772
Epoch: 8070, Training Loss: 0.01898
Epoch: 8070, Training Loss: 0.01901
Epoch: 8070, Training Loss: 0.02373
Epoch: 8071, Training Loss: 0.01772
Epoch: 8071, Training Loss: 0.01897
Epoch: 8071, Training Loss: 0.01901
Epoch: 8071, Training Loss: 0.02373
Epoch: 8072, Training Loss: 0.01771
Epoch: 8072, Training Loss: 0.01897
Epoch: 8072, Training Loss: 0.01900
Epoch: 8072, Training Loss: 0.02372
Epoch: 8073, Training Loss: 0.01771
Epoch: 8073, Training Loss: 0.01897
Epoch: 8073, Training Loss: 0.01900
Epoch: 8073, Training Loss: 0.02372
Epoch: 8074, Training Loss: 0.01771
Epoch: 8074, Training Loss: 0.01897
Epoch: 8074, Training Loss: 0.01900
Epoch: 8074, Training Loss: 0.02372
Epoch: 8075, Training Loss: 0.01771
Epoch: 8075, Training Loss: 0.01897
Epoch: 8075, Training Loss: 0.01900
Epoch: 8075, Training Loss: 0.02372
Epoch: 8076, Training Loss: 0.01771
Epoch: 8076, Training Loss: 0.01897
Epoch: 8076, Training Loss: 0.01900
Epoch: 8076, Training Loss: 0.02372
Epoch: 8077, Training Loss: 0.01771
Epoch: 8077, Training Loss: 0.01896
Epoch: 8077, Training Loss: 0.01900
Epoch: 8077, Training Loss: 0.02371
Epoch: 8078, Training Loss: 0.01771
Epoch: 8078, Training Loss: 0.01896
Epoch: 8078, Training Loss: 0.01899
Epoch: 8078, Training Loss: 0.02371
Epoch: 8079, Training Loss: 0.01770
Epoch: 8079, Training Loss: 0.01896
Epoch: 8079, Training Loss: 0.01899
Epoch: 8079, Training Loss: 0.02371
Epoch: 8080, Training Loss: 0.01770
Epoch: 8080, Training Loss: 0.01896
Epoch: 8080, Training Loss: 0.01899
Epoch: 8080, Training Loss: 0.02371
Epoch: 8081, Training Loss: 0.01770
Epoch: 8081, Training Loss: 0.01896
Epoch: 8081, Training Loss: 0.01899
Epoch: 8081, Training Loss: 0.02370
Epoch: 8082, Training Loss: 0.01770
Epoch: 8082, Training Loss: 0.01896
Epoch: 8082, Training Loss: 0.01899
Epoch: 8082, Training Loss: 0.02370
Epoch: 8083, Training Loss: 0.01770
Epoch: 8083, Training Loss: 0.01895
Epoch: 8083, Training Loss: 0.01898
Epoch: 8083, Training Loss: 0.02370
Epoch: 8084, Training Loss: 0.01770
Epoch: 8084, Training Loss: 0.01895
Epoch: 8084, Training Loss: 0.01898
Epoch: 8084, Training Loss: 0.02370
Epoch: 8085, Training Loss: 0.01769
Epoch: 8085, Training Loss: 0.01895
Epoch: 8085, Training Loss: 0.01898
Epoch: 8085, Training Loss: 0.02370
Epoch: 8086, Training Loss: 0.01769
Epoch: 8086, Training Loss: 0.01895
Epoch: 8086, Training Loss: 0.01898
Epoch: 8086, Training Loss: 0.02369
Epoch: 8087, Training Loss: 0.01769
Epoch: 8087, Training Loss: 0.01895
Epoch: 8087, Training Loss: 0.01898
Epoch: 8087, Training Loss: 0.02369
Epoch: 8088, Training Loss: 0.01769
Epoch: 8088, Training Loss: 0.01894
Epoch: 8088, Training Loss: 0.01898
Epoch: 8088, Training Loss: 0.02369
Epoch: 8089, Training Loss: 0.01769
Epoch: 8089, Training Loss: 0.01894
Epoch: 8089, Training Loss: 0.01897
Epoch: 8089, Training Loss: 0.02369
Epoch: 8090, Training Loss: 0.01769
Epoch: 8090, Training Loss: 0.01894
Epoch: 8090, Training Loss: 0.01897
Epoch: 8090, Training Loss: 0.02368
Epoch: 8091, Training Loss: 0.01769
Epoch: 8091, Training Loss: 0.01894
Epoch: 8091, Training Loss: 0.01897
Epoch: 8091, Training Loss: 0.02368
Epoch: 8092, Training Loss: 0.01768
Epoch: 8092, Training Loss: 0.01894
Epoch: 8092, Training Loss: 0.01897
Epoch: 8092, Training Loss: 0.02368
Epoch: 8093, Training Loss: 0.01768
Epoch: 8093, Training Loss: 0.01894
Epoch: 8093, Training Loss: 0.01897
Epoch: 8093, Training Loss: 0.02368
Epoch: 8094, Training Loss: 0.01768
Epoch: 8094, Training Loss: 0.01893
Epoch: 8094, Training Loss: 0.01897
Epoch: 8094, Training Loss: 0.02368
Epoch: 8095, Training Loss: 0.01768
Epoch: 8095, Training Loss: 0.01893
Epoch: 8095, Training Loss: 0.01896
Epoch: 8095, Training Loss: 0.02367
Epoch: 8096, Training Loss: 0.01768
Epoch: 8096, Training Loss: 0.01893
Epoch: 8096, Training Loss: 0.01896
Epoch: 8096, Training Loss: 0.02367
Epoch: 8097, Training Loss: 0.01768
Epoch: 8097, Training Loss: 0.01893
Epoch: 8097, Training Loss: 0.01896
Epoch: 8097, Training Loss: 0.02367
Epoch: 8098, Training Loss: 0.01767
Epoch: 8098, Training Loss: 0.01893
Epoch: 8098, Training Loss: 0.01896
Epoch: 8098, Training Loss: 0.02367
Epoch: 8099, Training Loss: 0.01767
Epoch: 8099, Training Loss: 0.01893
Epoch: 8099, Training Loss: 0.01896
Epoch: 8099, Training Loss: 0.02366
Epoch: 8100, Training Loss: 0.01767
Epoch: 8100, Training Loss: 0.01892
Epoch: 8100, Training Loss: 0.01895
Epoch: 8100, Training Loss: 0.02366
Epoch: 8101, Training Loss: 0.01767
Epoch: 8101, Training Loss: 0.01892
Epoch: 8101, Training Loss: 0.01895
Epoch: 8101, Training Loss: 0.02366
Epoch: 8102, Training Loss: 0.01767
Epoch: 8102, Training Loss: 0.01892
Epoch: 8102, Training Loss: 0.01895
Epoch: 8102, Training Loss: 0.02366
Epoch: 8103, Training Loss: 0.01767
Epoch: 8103, Training Loss: 0.01892
Epoch: 8103, Training Loss: 0.01895
Epoch: 8103, Training Loss: 0.02365
Epoch: 8104, Training Loss: 0.01767
Epoch: 8104, Training Loss: 0.01892
Epoch: 8104, Training Loss: 0.01895
Epoch: 8104, Training Loss: 0.02365
Epoch: 8105, Training Loss: 0.01766
Epoch: 8105, Training Loss: 0.01892
Epoch: 8105, Training Loss: 0.01895
Epoch: 8105, Training Loss: 0.02365
Epoch: 8106, Training Loss: 0.01766
Epoch: 8106, Training Loss: 0.01891
Epoch: 8106, Training Loss: 0.01894
Epoch: 8106, Training Loss: 0.02365
Epoch: 8107, Training Loss: 0.01766
Epoch: 8107, Training Loss: 0.01891
Epoch: 8107, Training Loss: 0.01894
Epoch: 8107, Training Loss: 0.02365
Epoch: 8108, Training Loss: 0.01766
Epoch: 8108, Training Loss: 0.01891
Epoch: 8108, Training Loss: 0.01894
Epoch: 8108, Training Loss: 0.02364
Epoch: 8109, Training Loss: 0.01766
Epoch: 8109, Training Loss: 0.01891
Epoch: 8109, Training Loss: 0.01894
Epoch: 8109, Training Loss: 0.02364
Epoch: 8110, Training Loss: 0.01766
Epoch: 8110, Training Loss: 0.01891
Epoch: 8110, Training Loss: 0.01894
Epoch: 8110, Training Loss: 0.02364
Epoch: 8111, Training Loss: 0.01765
Epoch: 8111, Training Loss: 0.01890
Epoch: 8111, Training Loss: 0.01894
Epoch: 8111, Training Loss: 0.02364
Epoch: 8112, Training Loss: 0.01765
Epoch: 8112, Training Loss: 0.01890
Epoch: 8112, Training Loss: 0.01893
Epoch: 8112, Training Loss: 0.02363
Epoch: 8113, Training Loss: 0.01765
Epoch: 8113, Training Loss: 0.01890
Epoch: 8113, Training Loss: 0.01893
Epoch: 8113, Training Loss: 0.02363
Epoch: 8114, Training Loss: 0.01765
Epoch: 8114, Training Loss: 0.01890
Epoch: 8114, Training Loss: 0.01893
Epoch: 8114, Training Loss: 0.02363
Epoch: 8115, Training Loss: 0.01765
Epoch: 8115, Training Loss: 0.01890
Epoch: 8115, Training Loss: 0.01893
Epoch: 8115, Training Loss: 0.02363
Epoch: 8116, Training Loss: 0.01765
Epoch: 8116, Training Loss: 0.01890
Epoch: 8116, Training Loss: 0.01893
Epoch: 8116, Training Loss: 0.02363
Epoch: 8117, Training Loss: 0.01765
Epoch: 8117, Training Loss: 0.01889
Epoch: 8117, Training Loss: 0.01893
Epoch: 8117, Training Loss: 0.02362
Epoch: 8118, Training Loss: 0.01764
Epoch: 8118, Training Loss: 0.01889
Epoch: 8118, Training Loss: 0.01892
Epoch: 8118, Training Loss: 0.02362
Epoch: 8119, Training Loss: 0.01764
Epoch: 8119, Training Loss: 0.01889
Epoch: 8119, Training Loss: 0.01892
Epoch: 8119, Training Loss: 0.02362
Epoch: 8120, Training Loss: 0.01764
Epoch: 8120, Training Loss: 0.01889
Epoch: 8120, Training Loss: 0.01892
Epoch: 8120, Training Loss: 0.02362
Epoch: 8121, Training Loss: 0.01764
Epoch: 8121, Training Loss: 0.01889
Epoch: 8121, Training Loss: 0.01892
Epoch: 8121, Training Loss: 0.02361
Epoch: 8122, Training Loss: 0.01764
Epoch: 8122, Training Loss: 0.01889
Epoch: 8122, Training Loss: 0.01892
Epoch: 8122, Training Loss: 0.02361
Epoch: 8123, Training Loss: 0.01764
Epoch: 8123, Training Loss: 0.01888
Epoch: 8123, Training Loss: 0.01891
Epoch: 8123, Training Loss: 0.02361
Epoch: 8124, Training Loss: 0.01763
Epoch: 8124, Training Loss: 0.01888
Epoch: 8124, Training Loss: 0.01891
Epoch: 8124, Training Loss: 0.02361
Epoch: 8125, Training Loss: 0.01763
Epoch: 8125, Training Loss: 0.01888
Epoch: 8125, Training Loss: 0.01891
Epoch: 8125, Training Loss: 0.02361
Epoch: 8126, Training Loss: 0.01763
Epoch: 8126, Training Loss: 0.01888
Epoch: 8126, Training Loss: 0.01891
Epoch: 8126, Training Loss: 0.02360
Epoch: 8127, Training Loss: 0.01763
Epoch: 8127, Training Loss: 0.01888
Epoch: 8127, Training Loss: 0.01891
Epoch: 8127, Training Loss: 0.02360
Epoch: 8128, Training Loss: 0.01763
Epoch: 8128, Training Loss: 0.01888
Epoch: 8128, Training Loss: 0.01891
Epoch: 8128, Training Loss: 0.02360
Epoch: 8129, Training Loss: 0.01763
Epoch: 8129, Training Loss: 0.01887
Epoch: 8129, Training Loss: 0.01890
Epoch: 8129, Training Loss: 0.02360
Epoch: 8130, Training Loss: 0.01762
Epoch: 8130, Training Loss: 0.01887
Epoch: 8130, Training Loss: 0.01890
Epoch: 8130, Training Loss: 0.02359
Epoch: 8131, Training Loss: 0.01762
Epoch: 8131, Training Loss: 0.01887
Epoch: 8131, Training Loss: 0.01890
Epoch: 8131, Training Loss: 0.02359
Epoch: 8132, Training Loss: 0.01762
Epoch: 8132, Training Loss: 0.01887
Epoch: 8132, Training Loss: 0.01890
Epoch: 8132, Training Loss: 0.02359
Epoch: 8133, Training Loss: 0.01762
Epoch: 8133, Training Loss: 0.01887
Epoch: 8133, Training Loss: 0.01890
Epoch: 8133, Training Loss: 0.02359
Epoch: 8134, Training Loss: 0.01762
Epoch: 8134, Training Loss: 0.01886
Epoch: 8134, Training Loss: 0.01890
Epoch: 8134, Training Loss: 0.02359
Epoch: 8135, Training Loss: 0.01762
Epoch: 8135, Training Loss: 0.01886
Epoch: 8135, Training Loss: 0.01889
Epoch: 8135, Training Loss: 0.02358
Epoch: 8136, Training Loss: 0.01762
Epoch: 8136, Training Loss: 0.01886
Epoch: 8136, Training Loss: 0.01889
Epoch: 8136, Training Loss: 0.02358
Epoch: 8137, Training Loss: 0.01761
Epoch: 8137, Training Loss: 0.01886
Epoch: 8137, Training Loss: 0.01889
Epoch: 8137, Training Loss: 0.02358
Epoch: 8138, Training Loss: 0.01761
Epoch: 8138, Training Loss: 0.01886
Epoch: 8138, Training Loss: 0.01889
Epoch: 8138, Training Loss: 0.02358
Epoch: 8139, Training Loss: 0.01761
Epoch: 8139, Training Loss: 0.01886
Epoch: 8139, Training Loss: 0.01889
Epoch: 8139, Training Loss: 0.02357
Epoch: 8140, Training Loss: 0.01761
Epoch: 8140, Training Loss: 0.01885
Epoch: 8140, Training Loss: 0.01889
Epoch: 8140, Training Loss: 0.02357
Epoch: 8141, Training Loss: 0.01761
Epoch: 8141, Training Loss: 0.01885
Epoch: 8141, Training Loss: 0.01888
Epoch: 8141, Training Loss: 0.02357
Epoch: 8142, Training Loss: 0.01761
Epoch: 8142, Training Loss: 0.01885
Epoch: 8142, Training Loss: 0.01888
Epoch: 8142, Training Loss: 0.02357
Epoch: 8143, Training Loss: 0.01761
Epoch: 8143, Training Loss: 0.01885
Epoch: 8143, Training Loss: 0.01888
Epoch: 8143, Training Loss: 0.02357
Epoch: 8144, Training Loss: 0.01760
Epoch: 8144, Training Loss: 0.01885
Epoch: 8144, Training Loss: 0.01888
Epoch: 8144, Training Loss: 0.02356
Epoch: 8145, Training Loss: 0.01760
Epoch: 8145, Training Loss: 0.01885
Epoch: 8145, Training Loss: 0.01888
Epoch: 8145, Training Loss: 0.02356
Epoch: 8146, Training Loss: 0.01760
Epoch: 8146, Training Loss: 0.01884
Epoch: 8146, Training Loss: 0.01887
Epoch: 8146, Training Loss: 0.02356
Epoch: 8147, Training Loss: 0.01760
Epoch: 8147, Training Loss: 0.01884
Epoch: 8147, Training Loss: 0.01887
Epoch: 8147, Training Loss: 0.02356
Epoch: 8148, Training Loss: 0.01760
Epoch: 8148, Training Loss: 0.01884
Epoch: 8148, Training Loss: 0.01887
Epoch: 8148, Training Loss: 0.02355
Epoch: 8149, Training Loss: 0.01760
Epoch: 8149, Training Loss: 0.01884
Epoch: 8149, Training Loss: 0.01887
Epoch: 8149, Training Loss: 0.02355
Epoch: 8150, Training Loss: 0.01759
Epoch: 8150, Training Loss: 0.01884
Epoch: 8150, Training Loss: 0.01887
Epoch: 8150, Training Loss: 0.02355
Epoch: 8151, Training Loss: 0.01759
Epoch: 8151, Training Loss: 0.01884
Epoch: 8151, Training Loss: 0.01887
Epoch: 8151, Training Loss: 0.02355
Epoch: 8152, Training Loss: 0.01759
Epoch: 8152, Training Loss: 0.01883
Epoch: 8152, Training Loss: 0.01886
Epoch: 8152, Training Loss: 0.02355
Epoch: 8153, Training Loss: 0.01759
Epoch: 8153, Training Loss: 0.01883
Epoch: 8153, Training Loss: 0.01886
Epoch: 8153, Training Loss: 0.02354
Epoch: 8154, Training Loss: 0.01759
Epoch: 8154, Training Loss: 0.01883
Epoch: 8154, Training Loss: 0.01886
Epoch: 8154, Training Loss: 0.02354
Epoch: 8155, Training Loss: 0.01759
Epoch: 8155, Training Loss: 0.01883
Epoch: 8155, Training Loss: 0.01886
Epoch: 8155, Training Loss: 0.02354
Epoch: 8156, Training Loss: 0.01759
Epoch: 8156, Training Loss: 0.01883
Epoch: 8156, Training Loss: 0.01886
Epoch: 8156, Training Loss: 0.02354
Epoch: 8157, Training Loss: 0.01758
Epoch: 8157, Training Loss: 0.01883
Epoch: 8157, Training Loss: 0.01886
Epoch: 8157, Training Loss: 0.02353
Epoch: 8158, Training Loss: 0.01758
Epoch: 8158, Training Loss: 0.01882
Epoch: 8158, Training Loss: 0.01885
Epoch: 8158, Training Loss: 0.02353
Epoch: 8159, Training Loss: 0.01758
Epoch: 8159, Training Loss: 0.01882
Epoch: 8159, Training Loss: 0.01885
Epoch: 8159, Training Loss: 0.02353
Epoch: 8160, Training Loss: 0.01758
Epoch: 8160, Training Loss: 0.01882
Epoch: 8160, Training Loss: 0.01885
Epoch: 8160, Training Loss: 0.02353
Epoch: 8161, Training Loss: 0.01758
Epoch: 8161, Training Loss: 0.01882
Epoch: 8161, Training Loss: 0.01885
Epoch: 8161, Training Loss: 0.02353
Epoch: 8162, Training Loss: 0.01758
Epoch: 8162, Training Loss: 0.01882
Epoch: 8162, Training Loss: 0.01885
Epoch: 8162, Training Loss: 0.02352
Epoch: 8163, Training Loss: 0.01757
Epoch: 8163, Training Loss: 0.01881
Epoch: 8163, Training Loss: 0.01885
Epoch: 8163, Training Loss: 0.02352
Epoch: 8164, Training Loss: 0.01757
Epoch: 8164, Training Loss: 0.01881
Epoch: 8164, Training Loss: 0.01884
Epoch: 8164, Training Loss: 0.02352
Epoch: 8165, Training Loss: 0.01757
Epoch: 8165, Training Loss: 0.01881
Epoch: 8165, Training Loss: 0.01884
Epoch: 8165, Training Loss: 0.02352
Epoch: 8166, Training Loss: 0.01757
Epoch: 8166, Training Loss: 0.01881
Epoch: 8166, Training Loss: 0.01884
Epoch: 8166, Training Loss: 0.02351
Epoch: 8167, Training Loss: 0.01757
Epoch: 8167, Training Loss: 0.01881
Epoch: 8167, Training Loss: 0.01884
Epoch: 8167, Training Loss: 0.02351
Epoch: 8168, Training Loss: 0.01757
Epoch: 8168, Training Loss: 0.01881
Epoch: 8168, Training Loss: 0.01884
Epoch: 8168, Training Loss: 0.02351
Epoch: 8169, Training Loss: 0.01757
Epoch: 8169, Training Loss: 0.01880
Epoch: 8169, Training Loss: 0.01884
Epoch: 8169, Training Loss: 0.02351
Epoch: 8170, Training Loss: 0.01756
Epoch: 8170, Training Loss: 0.01880
Epoch: 8170, Training Loss: 0.01883
Epoch: 8170, Training Loss: 0.02351
Epoch: 8171, Training Loss: 0.01756
Epoch: 8171, Training Loss: 0.01880
Epoch: 8171, Training Loss: 0.01883
Epoch: 8171, Training Loss: 0.02350
Epoch: 8172, Training Loss: 0.01756
Epoch: 8172, Training Loss: 0.01880
Epoch: 8172, Training Loss: 0.01883
Epoch: 8172, Training Loss: 0.02350
Epoch: 8173, Training Loss: 0.01756
Epoch: 8173, Training Loss: 0.01880
Epoch: 8173, Training Loss: 0.01883
Epoch: 8173, Training Loss: 0.02350
Epoch: 8174, Training Loss: 0.01756
Epoch: 8174, Training Loss: 0.01880
Epoch: 8174, Training Loss: 0.01883
Epoch: 8174, Training Loss: 0.02350
Epoch: 8175, Training Loss: 0.01756
Epoch: 8175, Training Loss: 0.01879
Epoch: 8175, Training Loss: 0.01883
Epoch: 8175, Training Loss: 0.02349
Epoch: 8176, Training Loss: 0.01755
Epoch: 8176, Training Loss: 0.01879
Epoch: 8176, Training Loss: 0.01882
Epoch: 8176, Training Loss: 0.02349
Epoch: 8177, Training Loss: 0.01755
Epoch: 8177, Training Loss: 0.01879
Epoch: 8177, Training Loss: 0.01882
Epoch: 8177, Training Loss: 0.02349
Epoch: 8178, Training Loss: 0.01755
Epoch: 8178, Training Loss: 0.01879
Epoch: 8178, Training Loss: 0.01882
Epoch: 8178, Training Loss: 0.02349
Epoch: 8179, Training Loss: 0.01755
Epoch: 8179, Training Loss: 0.01879
Epoch: 8179, Training Loss: 0.01882
Epoch: 8179, Training Loss: 0.02349
Epoch: 8180, Training Loss: 0.01755
Epoch: 8180, Training Loss: 0.01879
Epoch: 8180, Training Loss: 0.01882
Epoch: 8180, Training Loss: 0.02348
Epoch: 8181, Training Loss: 0.01755
Epoch: 8181, Training Loss: 0.01878
Epoch: 8181, Training Loss: 0.01881
Epoch: 8181, Training Loss: 0.02348
Epoch: 8182, Training Loss: 0.01755
Epoch: 8182, Training Loss: 0.01878
Epoch: 8182, Training Loss: 0.01881
Epoch: 8182, Training Loss: 0.02348
Epoch: 8183, Training Loss: 0.01754
Epoch: 8183, Training Loss: 0.01878
Epoch: 8183, Training Loss: 0.01881
Epoch: 8183, Training Loss: 0.02348
Epoch: 8184, Training Loss: 0.01754
Epoch: 8184, Training Loss: 0.01878
Epoch: 8184, Training Loss: 0.01881
Epoch: 8184, Training Loss: 0.02348
Epoch: 8185, Training Loss: 0.01754
Epoch: 8185, Training Loss: 0.01878
Epoch: 8185, Training Loss: 0.01881
Epoch: 8185, Training Loss: 0.02347
Epoch: 8186, Training Loss: 0.01754
Epoch: 8186, Training Loss: 0.01878
Epoch: 8186, Training Loss: 0.01881
Epoch: 8186, Training Loss: 0.02347
Epoch: 8187, Training Loss: 0.01754
Epoch: 8187, Training Loss: 0.01877
Epoch: 8187, Training Loss: 0.01880
Epoch: 8187, Training Loss: 0.02347
Epoch: 8188, Training Loss: 0.01754
Epoch: 8188, Training Loss: 0.01877
Epoch: 8188, Training Loss: 0.01880
Epoch: 8188, Training Loss: 0.02347
Epoch: 8189, Training Loss: 0.01753
Epoch: 8189, Training Loss: 0.01877
Epoch: 8189, Training Loss: 0.01880
Epoch: 8189, Training Loss: 0.02346
Epoch: 8190, Training Loss: 0.01753
Epoch: 8190, Training Loss: 0.01877
Epoch: 8190, Training Loss: 0.01880
Epoch: 8190, Training Loss: 0.02346
Epoch: 8191, Training Loss: 0.01753
Epoch: 8191, Training Loss: 0.01877
Epoch: 8191, Training Loss: 0.01880
Epoch: 8191, Training Loss: 0.02346
Epoch: 8192, Training Loss: 0.01753
Epoch: 8192, Training Loss: 0.01877
Epoch: 8192, Training Loss: 0.01880
Epoch: 8192, Training Loss: 0.02346
Epoch: 8193, Training Loss: 0.01753
Epoch: 8193, Training Loss: 0.01876
Epoch: 8193, Training Loss: 0.01879
Epoch: 8193, Training Loss: 0.02346
Epoch: 8194, Training Loss: 0.01753
Epoch: 8194, Training Loss: 0.01876
Epoch: 8194, Training Loss: 0.01879
Epoch: 8194, Training Loss: 0.02345
Epoch: 8195, Training Loss: 0.01753
Epoch: 8195, Training Loss: 0.01876
Epoch: 8195, Training Loss: 0.01879
Epoch: 8195, Training Loss: 0.02345
Epoch: 8196, Training Loss: 0.01752
Epoch: 8196, Training Loss: 0.01876
Epoch: 8196, Training Loss: 0.01879
Epoch: 8196, Training Loss: 0.02345
Epoch: 8197, Training Loss: 0.01752
Epoch: 8197, Training Loss: 0.01876
Epoch: 8197, Training Loss: 0.01879
Epoch: 8197, Training Loss: 0.02345
Epoch: 8198, Training Loss: 0.01752
Epoch: 8198, Training Loss: 0.01876
Epoch: 8198, Training Loss: 0.01879
Epoch: 8198, Training Loss: 0.02344
Epoch: 8199, Training Loss: 0.01752
Epoch: 8199, Training Loss: 0.01875
Epoch: 8199, Training Loss: 0.01878
Epoch: 8199, Training Loss: 0.02344
Epoch: 8200, Training Loss: 0.01752
Epoch: 8200, Training Loss: 0.01875
Epoch: 8200, Training Loss: 0.01878
Epoch: 8200, Training Loss: 0.02344
Epoch: 8201, Training Loss: 0.01752
Epoch: 8201, Training Loss: 0.01875
Epoch: 8201, Training Loss: 0.01878
Epoch: 8201, Training Loss: 0.02344
Epoch: 8202, Training Loss: 0.01752
Epoch: 8202, Training Loss: 0.01875
Epoch: 8202, Training Loss: 0.01878
Epoch: 8202, Training Loss: 0.02344
Epoch: 8203, Training Loss: 0.01751
Epoch: 8203, Training Loss: 0.01875
Epoch: 8203, Training Loss: 0.01878
Epoch: 8203, Training Loss: 0.02343
Epoch: 8204, Training Loss: 0.01751
Epoch: 8204, Training Loss: 0.01874
Epoch: 8204, Training Loss: 0.01878
Epoch: 8204, Training Loss: 0.02343
Epoch: 8205, Training Loss: 0.01751
Epoch: 8205, Training Loss: 0.01874
Epoch: 8205, Training Loss: 0.01877
Epoch: 8205, Training Loss: 0.02343
Epoch: 8206, Training Loss: 0.01751
Epoch: 8206, Training Loss: 0.01874
Epoch: 8206, Training Loss: 0.01877
Epoch: 8206, Training Loss: 0.02343
Epoch: 8207, Training Loss: 0.01751
Epoch: 8207, Training Loss: 0.01874
Epoch: 8207, Training Loss: 0.01877
Epoch: 8207, Training Loss: 0.02342
Epoch: 8208, Training Loss: 0.01751
Epoch: 8208, Training Loss: 0.01874
Epoch: 8208, Training Loss: 0.01877
Epoch: 8208, Training Loss: 0.02342
Epoch: 8209, Training Loss: 0.01750
Epoch: 8209, Training Loss: 0.01874
Epoch: 8209, Training Loss: 0.01877
Epoch: 8209, Training Loss: 0.02342
Epoch: 8210, Training Loss: 0.01750
Epoch: 8210, Training Loss: 0.01873
Epoch: 8210, Training Loss: 0.01877
Epoch: 8210, Training Loss: 0.02342
Epoch: 8211, Training Loss: 0.01750
Epoch: 8211, Training Loss: 0.01873
Epoch: 8211, Training Loss: 0.01876
Epoch: 8211, Training Loss: 0.02342
Epoch: 8212, Training Loss: 0.01750
Epoch: 8212, Training Loss: 0.01873
Epoch: 8212, Training Loss: 0.01876
Epoch: 8212, Training Loss: 0.02341
Epoch: 8213, Training Loss: 0.01750
Epoch: 8213, Training Loss: 0.01873
Epoch: 8213, Training Loss: 0.01876
Epoch: 8213, Training Loss: 0.02341
Epoch: 8214, Training Loss: 0.01750
Epoch: 8214, Training Loss: 0.01873
Epoch: 8214, Training Loss: 0.01876
Epoch: 8214, Training Loss: 0.02341
Epoch: 8215, Training Loss: 0.01750
Epoch: 8215, Training Loss: 0.01873
Epoch: 8215, Training Loss: 0.01876
Epoch: 8215, Training Loss: 0.02341
Epoch: 8216, Training Loss: 0.01749
Epoch: 8216, Training Loss: 0.01872
Epoch: 8216, Training Loss: 0.01876
Epoch: 8216, Training Loss: 0.02341
Epoch: 8217, Training Loss: 0.01749
Epoch: 8217, Training Loss: 0.01872
Epoch: 8217, Training Loss: 0.01875
Epoch: 8217, Training Loss: 0.02340
Epoch: 8218, Training Loss: 0.01749
Epoch: 8218, Training Loss: 0.01872
Epoch: 8218, Training Loss: 0.01875
Epoch: 8218, Training Loss: 0.02340
...
Epoch: 8467, Training Loss: 0.01831
Epoch: 8467, Training Loss: 0.01834
Epoch: 8467, Training Loss: 0.02288
Epoch: 8468, Training Loss: 0.01713
Epoch: 8468, Training Loss: 0.01831
Epoch: 8468, Training Loss: 0.01834
Epoch: 8468, Training Loss: 0.02287
Epoch: 8469, Training Loss: 0.01713
Epoch: 8469, Training Loss: 0.01831
Epoch: 8469, Training Loss: 0.01834
Epoch: 8469, Training Loss: 0.02287
Epoch: 8470, Training Loss: 0.01712
Epoch: 8470, Training Loss: 0.01831
Epoch: 8470, Training Loss: 0.01834
Epoch: 8470, Training Loss: 0.02287
Epoch: 8471, Training Loss: 0.01712
Epoch: 8471, Training Loss: 0.01831
Epoch: 8471, Training Loss: 0.01834
Epoch: 8471, Training Loss: 0.02287
Epoch: 8472, Training Loss: 0.01712
Epoch: 8472, Training Loss: 0.01831
Epoch: 8472, Training Loss: 0.01834
Epoch: 8472, Training Loss: 0.02287
Epoch: 8473, Training Loss: 0.01712
Epoch: 8473, Training Loss: 0.01830
Epoch: 8473, Training Loss: 0.01833
Epoch: 8473, Training Loss: 0.02286
Epoch: 8474, Training Loss: 0.01712
Epoch: 8474, Training Loss: 0.01830
Epoch: 8474, Training Loss: 0.01833
Epoch: 8474, Training Loss: 0.02286
Epoch: 8475, Training Loss: 0.01712
Epoch: 8475, Training Loss: 0.01830
Epoch: 8475, Training Loss: 0.01833
Epoch: 8475, Training Loss: 0.02286
Epoch: 8476, Training Loss: 0.01712
Epoch: 8476, Training Loss: 0.01830
Epoch: 8476, Training Loss: 0.01833
Epoch: 8476, Training Loss: 0.02286
Epoch: 8477, Training Loss: 0.01711
Epoch: 8477, Training Loss: 0.01830
Epoch: 8477, Training Loss: 0.01833
Epoch: 8477, Training Loss: 0.02286
Epoch: 8478, Training Loss: 0.01711
Epoch: 8478, Training Loss: 0.01830
Epoch: 8478, Training Loss: 0.01833
Epoch: 8478, Training Loss: 0.02285
Epoch: 8479, Training Loss: 0.01711
Epoch: 8479, Training Loss: 0.01830
Epoch: 8479, Training Loss: 0.01832
Epoch: 8479, Training Loss: 0.02285
Epoch: 8480, Training Loss: 0.01711
Epoch: 8480, Training Loss: 0.01829
Epoch: 8480, Training Loss: 0.01832
Epoch: 8480, Training Loss: 0.02285
Epoch: 8481, Training Loss: 0.01711
Epoch: 8481, Training Loss: 0.01829
Epoch: 8481, Training Loss: 0.01832
Epoch: 8481, Training Loss: 0.02285
Epoch: 8482, Training Loss: 0.01711
Epoch: 8482, Training Loss: 0.01829
Epoch: 8482, Training Loss: 0.01832
Epoch: 8482, Training Loss: 0.02285
Epoch: 8483, Training Loss: 0.01711
Epoch: 8483, Training Loss: 0.01829
Epoch: 8483, Training Loss: 0.01832
Epoch: 8483, Training Loss: 0.02284
Epoch: 8484, Training Loss: 0.01710
Epoch: 8484, Training Loss: 0.01829
Epoch: 8484, Training Loss: 0.01832
Epoch: 8484, Training Loss: 0.02284
Epoch: 8485, Training Loss: 0.01710
Epoch: 8485, Training Loss: 0.01829
Epoch: 8485, Training Loss: 0.01832
Epoch: 8485, Training Loss: 0.02284
Epoch: 8486, Training Loss: 0.01710
Epoch: 8486, Training Loss: 0.01828
Epoch: 8486, Training Loss: 0.01831
Epoch: 8486, Training Loss: 0.02284
Epoch: 8487, Training Loss: 0.01710
Epoch: 8487, Training Loss: 0.01828
Epoch: 8487, Training Loss: 0.01831
Epoch: 8487, Training Loss: 0.02284
Epoch: 8488, Training Loss: 0.01710
Epoch: 8488, Training Loss: 0.01828
Epoch: 8488, Training Loss: 0.01831
Epoch: 8488, Training Loss: 0.02283
Epoch: 8489, Training Loss: 0.01710
Epoch: 8489, Training Loss: 0.01828
Epoch: 8489, Training Loss: 0.01831
Epoch: 8489, Training Loss: 0.02283
Epoch: 8490, Training Loss: 0.01710
Epoch: 8490, Training Loss: 0.01828
Epoch: 8490, Training Loss: 0.01831
Epoch: 8490, Training Loss: 0.02283
Epoch: 8491, Training Loss: 0.01709
Epoch: 8491, Training Loss: 0.01828
Epoch: 8491, Training Loss: 0.01831
Epoch: 8491, Training Loss: 0.02283
Epoch: 8492, Training Loss: 0.01709
Epoch: 8492, Training Loss: 0.01827
Epoch: 8492, Training Loss: 0.01830
Epoch: 8492, Training Loss: 0.02283
Epoch: 8493, Training Loss: 0.01709
Epoch: 8493, Training Loss: 0.01827
Epoch: 8493, Training Loss: 0.01830
Epoch: 8493, Training Loss: 0.02282
Epoch: 8494, Training Loss: 0.01709
Epoch: 8494, Training Loss: 0.01827
Epoch: 8494, Training Loss: 0.01830
Epoch: 8494, Training Loss: 0.02282
Epoch: 8495, Training Loss: 0.01709
Epoch: 8495, Training Loss: 0.01827
Epoch: 8495, Training Loss: 0.01830
Epoch: 8495, Training Loss: 0.02282
Epoch: 8496, Training Loss: 0.01709
Epoch: 8496, Training Loss: 0.01827
Epoch: 8496, Training Loss: 0.01830
Epoch: 8496, Training Loss: 0.02282
Epoch: 8497, Training Loss: 0.01709
Epoch: 8497, Training Loss: 0.01827
Epoch: 8497, Training Loss: 0.01830
Epoch: 8497, Training Loss: 0.02282
Epoch: 8498, Training Loss: 0.01708
Epoch: 8498, Training Loss: 0.01827
Epoch: 8498, Training Loss: 0.01829
Epoch: 8498, Training Loss: 0.02281
Epoch: 8499, Training Loss: 0.01708
Epoch: 8499, Training Loss: 0.01826
Epoch: 8499, Training Loss: 0.01829
Epoch: 8499, Training Loss: 0.02281
Epoch: 8500, Training Loss: 0.01708
Epoch: 8500, Training Loss: 0.01826
Epoch: 8500, Training Loss: 0.01829
Epoch: 8500, Training Loss: 0.02281
Epoch: 8501, Training Loss: 0.01708
Epoch: 8501, Training Loss: 0.01826
Epoch: 8501, Training Loss: 0.01829
Epoch: 8501, Training Loss: 0.02281
Epoch: 8502, Training Loss: 0.01708
Epoch: 8502, Training Loss: 0.01826
Epoch: 8502, Training Loss: 0.01829
Epoch: 8502, Training Loss: 0.02281
Epoch: 8503, Training Loss: 0.01708
Epoch: 8503, Training Loss: 0.01826
Epoch: 8503, Training Loss: 0.01829
Epoch: 8503, Training Loss: 0.02280
Epoch: 8504, Training Loss: 0.01708
Epoch: 8504, Training Loss: 0.01826
Epoch: 8504, Training Loss: 0.01829
Epoch: 8504, Training Loss: 0.02280
Epoch: 8505, Training Loss: 0.01707
Epoch: 8505, Training Loss: 0.01825
Epoch: 8505, Training Loss: 0.01828
Epoch: 8505, Training Loss: 0.02280
Epoch: 8506, Training Loss: 0.01707
Epoch: 8506, Training Loss: 0.01825
Epoch: 8506, Training Loss: 0.01828
Epoch: 8506, Training Loss: 0.02280
Epoch: 8507, Training Loss: 0.01707
Epoch: 8507, Training Loss: 0.01825
Epoch: 8507, Training Loss: 0.01828
Epoch: 8507, Training Loss: 0.02280
Epoch: 8508, Training Loss: 0.01707
Epoch: 8508, Training Loss: 0.01825
Epoch: 8508, Training Loss: 0.01828
Epoch: 8508, Training Loss: 0.02279
Epoch: 8509, Training Loss: 0.01707
Epoch: 8509, Training Loss: 0.01825
Epoch: 8509, Training Loss: 0.01828
Epoch: 8509, Training Loss: 0.02279
Epoch: 8510, Training Loss: 0.01707
Epoch: 8510, Training Loss: 0.01825
Epoch: 8510, Training Loss: 0.01828
Epoch: 8510, Training Loss: 0.02279
Epoch: 8511, Training Loss: 0.01707
Epoch: 8511, Training Loss: 0.01824
Epoch: 8511, Training Loss: 0.01827
Epoch: 8511, Training Loss: 0.02279
Epoch: 8512, Training Loss: 0.01706
Epoch: 8512, Training Loss: 0.01824
Epoch: 8512, Training Loss: 0.01827
Epoch: 8512, Training Loss: 0.02279
Epoch: 8513, Training Loss: 0.01706
Epoch: 8513, Training Loss: 0.01824
Epoch: 8513, Training Loss: 0.01827
Epoch: 8513, Training Loss: 0.02278
Epoch: 8514, Training Loss: 0.01706
Epoch: 8514, Training Loss: 0.01824
Epoch: 8514, Training Loss: 0.01827
Epoch: 8514, Training Loss: 0.02278
Epoch: 8515, Training Loss: 0.01706
Epoch: 8515, Training Loss: 0.01824
Epoch: 8515, Training Loss: 0.01827
Epoch: 8515, Training Loss: 0.02278
Epoch: 8516, Training Loss: 0.01706
Epoch: 8516, Training Loss: 0.01824
Epoch: 8516, Training Loss: 0.01827
Epoch: 8516, Training Loss: 0.02278
Epoch: 8517, Training Loss: 0.01706
Epoch: 8517, Training Loss: 0.01824
Epoch: 8517, Training Loss: 0.01826
Epoch: 8517, Training Loss: 0.02278
Epoch: 8518, Training Loss: 0.01706
Epoch: 8518, Training Loss: 0.01823
Epoch: 8518, Training Loss: 0.01826
Epoch: 8518, Training Loss: 0.02277
Epoch: 8519, Training Loss: 0.01706
Epoch: 8519, Training Loss: 0.01823
Epoch: 8519, Training Loss: 0.01826
Epoch: 8519, Training Loss: 0.02277
Epoch: 8520, Training Loss: 0.01705
Epoch: 8520, Training Loss: 0.01823
Epoch: 8520, Training Loss: 0.01826
Epoch: 8520, Training Loss: 0.02277
Epoch: 8521, Training Loss: 0.01705
Epoch: 8521, Training Loss: 0.01823
Epoch: 8521, Training Loss: 0.01826
Epoch: 8521, Training Loss: 0.02277
Epoch: 8522, Training Loss: 0.01705
Epoch: 8522, Training Loss: 0.01823
Epoch: 8522, Training Loss: 0.01826
Epoch: 8522, Training Loss: 0.02277
Epoch: 8523, Training Loss: 0.01705
Epoch: 8523, Training Loss: 0.01823
Epoch: 8523, Training Loss: 0.01826
Epoch: 8523, Training Loss: 0.02276
Epoch: 8524, Training Loss: 0.01705
Epoch: 8524, Training Loss: 0.01822
Epoch: 8524, Training Loss: 0.01825
Epoch: 8524, Training Loss: 0.02276
Epoch: 8525, Training Loss: 0.01705
Epoch: 8525, Training Loss: 0.01822
Epoch: 8525, Training Loss: 0.01825
Epoch: 8525, Training Loss: 0.02276
Epoch: 8526, Training Loss: 0.01705
Epoch: 8526, Training Loss: 0.01822
Epoch: 8526, Training Loss: 0.01825
Epoch: 8526, Training Loss: 0.02276
Epoch: 8527, Training Loss: 0.01704
Epoch: 8527, Training Loss: 0.01822
Epoch: 8527, Training Loss: 0.01825
Epoch: 8527, Training Loss: 0.02276
Epoch: 8528, Training Loss: 0.01704
Epoch: 8528, Training Loss: 0.01822
Epoch: 8528, Training Loss: 0.01825
Epoch: 8528, Training Loss: 0.02275
Epoch: 8529, Training Loss: 0.01704
Epoch: 8529, Training Loss: 0.01822
Epoch: 8529, Training Loss: 0.01825
Epoch: 8529, Training Loss: 0.02275
Epoch: 8530, Training Loss: 0.01704
Epoch: 8530, Training Loss: 0.01821
Epoch: 8530, Training Loss: 0.01824
Epoch: 8530, Training Loss: 0.02275
Epoch: 8531, Training Loss: 0.01704
Epoch: 8531, Training Loss: 0.01821
Epoch: 8531, Training Loss: 0.01824
Epoch: 8531, Training Loss: 0.02275
Epoch: 8532, Training Loss: 0.01704
Epoch: 8532, Training Loss: 0.01821
Epoch: 8532, Training Loss: 0.01824
Epoch: 8532, Training Loss: 0.02275
Epoch: 8533, Training Loss: 0.01704
Epoch: 8533, Training Loss: 0.01821
Epoch: 8533, Training Loss: 0.01824
Epoch: 8533, Training Loss: 0.02274
Epoch: 8534, Training Loss: 0.01703
Epoch: 8534, Training Loss: 0.01821
Epoch: 8534, Training Loss: 0.01824
Epoch: 8534, Training Loss: 0.02274
Epoch: 8535, Training Loss: 0.01703
Epoch: 8535, Training Loss: 0.01821
Epoch: 8535, Training Loss: 0.01824
Epoch: 8535, Training Loss: 0.02274
Epoch: 8536, Training Loss: 0.01703
Epoch: 8536, Training Loss: 0.01821
Epoch: 8536, Training Loss: 0.01824
Epoch: 8536, Training Loss: 0.02274
Epoch: 8537, Training Loss: 0.01703
Epoch: 8537, Training Loss: 0.01820
Epoch: 8537, Training Loss: 0.01823
Epoch: 8537, Training Loss: 0.02274
Epoch: 8538, Training Loss: 0.01703
Epoch: 8538, Training Loss: 0.01820
Epoch: 8538, Training Loss: 0.01823
Epoch: 8538, Training Loss: 0.02273
Epoch: 8539, Training Loss: 0.01703
Epoch: 8539, Training Loss: 0.01820
Epoch: 8539, Training Loss: 0.01823
Epoch: 8539, Training Loss: 0.02273
Epoch: 8540, Training Loss: 0.01703
Epoch: 8540, Training Loss: 0.01820
Epoch: 8540, Training Loss: 0.01823
Epoch: 8540, Training Loss: 0.02273
Epoch: 8541, Training Loss: 0.01702
Epoch: 8541, Training Loss: 0.01820
Epoch: 8541, Training Loss: 0.01823
Epoch: 8541, Training Loss: 0.02273
Epoch: 8542, Training Loss: 0.01702
Epoch: 8542, Training Loss: 0.01820
Epoch: 8542, Training Loss: 0.01823
Epoch: 8542, Training Loss: 0.02273
Epoch: 8543, Training Loss: 0.01702
Epoch: 8543, Training Loss: 0.01819
Epoch: 8543, Training Loss: 0.01822
Epoch: 8543, Training Loss: 0.02272
Epoch: 8544, Training Loss: 0.01702
Epoch: 8544, Training Loss: 0.01819
Epoch: 8544, Training Loss: 0.01822
Epoch: 8544, Training Loss: 0.02272
Epoch: 8545, Training Loss: 0.01702
Epoch: 8545, Training Loss: 0.01819
Epoch: 8545, Training Loss: 0.01822
Epoch: 8545, Training Loss: 0.02272
Epoch: 8546, Training Loss: 0.01702
Epoch: 8546, Training Loss: 0.01819
Epoch: 8546, Training Loss: 0.01822
Epoch: 8546, Training Loss: 0.02272
Epoch: 8547, Training Loss: 0.01702
Epoch: 8547, Training Loss: 0.01819
Epoch: 8547, Training Loss: 0.01822
Epoch: 8547, Training Loss: 0.02272
Epoch: 8548, Training Loss: 0.01701
Epoch: 8548, Training Loss: 0.01819
Epoch: 8548, Training Loss: 0.01822
Epoch: 8548, Training Loss: 0.02271
Epoch: 8549, Training Loss: 0.01701
Epoch: 8549, Training Loss: 0.01819
Epoch: 8549, Training Loss: 0.01821
Epoch: 8549, Training Loss: 0.02271
Epoch: 8550, Training Loss: 0.01701
Epoch: 8550, Training Loss: 0.01818
Epoch: 8550, Training Loss: 0.01821
Epoch: 8550, Training Loss: 0.02271
Epoch: 8551, Training Loss: 0.01701
Epoch: 8551, Training Loss: 0.01818
Epoch: 8551, Training Loss: 0.01821
Epoch: 8551, Training Loss: 0.02271
Epoch: 8552, Training Loss: 0.01701
Epoch: 8552, Training Loss: 0.01818
Epoch: 8552, Training Loss: 0.01821
Epoch: 8552, Training Loss: 0.02271
Epoch: 8553, Training Loss: 0.01701
Epoch: 8553, Training Loss: 0.01818
Epoch: 8553, Training Loss: 0.01821
Epoch: 8553, Training Loss: 0.02270
Epoch: 8554, Training Loss: 0.01701
Epoch: 8554, Training Loss: 0.01818
Epoch: 8554, Training Loss: 0.01821
Epoch: 8554, Training Loss: 0.02270
Epoch: 8555, Training Loss: 0.01701
Epoch: 8555, Training Loss: 0.01818
Epoch: 8555, Training Loss: 0.01821
Epoch: 8555, Training Loss: 0.02270
Epoch: 8556, Training Loss: 0.01700
Epoch: 8556, Training Loss: 0.01817
Epoch: 8556, Training Loss: 0.01820
Epoch: 8556, Training Loss: 0.02270
Epoch: 8557, Training Loss: 0.01700
Epoch: 8557, Training Loss: 0.01817
Epoch: 8557, Training Loss: 0.01820
Epoch: 8557, Training Loss: 0.02270
Epoch: 8558, Training Loss: 0.01700
Epoch: 8558, Training Loss: 0.01817
Epoch: 8558, Training Loss: 0.01820
Epoch: 8558, Training Loss: 0.02269
Epoch: 8559, Training Loss: 0.01700
Epoch: 8559, Training Loss: 0.01817
Epoch: 8559, Training Loss: 0.01820
Epoch: 8559, Training Loss: 0.02269
Epoch: 8560, Training Loss: 0.01700
Epoch: 8560, Training Loss: 0.01817
Epoch: 8560, Training Loss: 0.01820
Epoch: 8560, Training Loss: 0.02269
Epoch: 8561, Training Loss: 0.01700
Epoch: 8561, Training Loss: 0.01817
Epoch: 8561, Training Loss: 0.01820
Epoch: 8561, Training Loss: 0.02269
Epoch: 8562, Training Loss: 0.01700
Epoch: 8562, Training Loss: 0.01817
Epoch: 8562, Training Loss: 0.01819
Epoch: 8562, Training Loss: 0.02269
Epoch: 8563, Training Loss: 0.01699
Epoch: 8563, Training Loss: 0.01816
Epoch: 8563, Training Loss: 0.01819
Epoch: 8563, Training Loss: 0.02268
Epoch: 8564, Training Loss: 0.01699
Epoch: 8564, Training Loss: 0.01816
Epoch: 8564, Training Loss: 0.01819
Epoch: 8564, Training Loss: 0.02268
Epoch: 8565, Training Loss: 0.01699
Epoch: 8565, Training Loss: 0.01816
Epoch: 8565, Training Loss: 0.01819
Epoch: 8565, Training Loss: 0.02268
Epoch: 8566, Training Loss: 0.01699
Epoch: 8566, Training Loss: 0.01816
Epoch: 8566, Training Loss: 0.01819
Epoch: 8566, Training Loss: 0.02268
Epoch: 8567, Training Loss: 0.01699
Epoch: 8567, Training Loss: 0.01816
Epoch: 8567, Training Loss: 0.01819
Epoch: 8567, Training Loss: 0.02268
Epoch: 8568, Training Loss: 0.01699
Epoch: 8568, Training Loss: 0.01816
Epoch: 8568, Training Loss: 0.01819
Epoch: 8568, Training Loss: 0.02267
Epoch: 8569, Training Loss: 0.01699
Epoch: 8569, Training Loss: 0.01815
Epoch: 8569, Training Loss: 0.01818
Epoch: 8569, Training Loss: 0.02267
Epoch: 8570, Training Loss: 0.01698
Epoch: 8570, Training Loss: 0.01815
Epoch: 8570, Training Loss: 0.01818
Epoch: 8570, Training Loss: 0.02267
Epoch: 8571, Training Loss: 0.01698
Epoch: 8571, Training Loss: 0.01815
Epoch: 8571, Training Loss: 0.01818
Epoch: 8571, Training Loss: 0.02267
Epoch: 8572, Training Loss: 0.01698
Epoch: 8572, Training Loss: 0.01815
Epoch: 8572, Training Loss: 0.01818
Epoch: 8572, Training Loss: 0.02267
Epoch: 8573, Training Loss: 0.01698
Epoch: 8573, Training Loss: 0.01815
Epoch: 8573, Training Loss: 0.01818
Epoch: 8573, Training Loss: 0.02266
Epoch: 8574, Training Loss: 0.01698
Epoch: 8574, Training Loss: 0.01815
Epoch: 8574, Training Loss: 0.01818
Epoch: 8574, Training Loss: 0.02266
Epoch: 8575, Training Loss: 0.01698
Epoch: 8575, Training Loss: 0.01815
Epoch: 8575, Training Loss: 0.01817
Epoch: 8575, Training Loss: 0.02266
Epoch: 8576, Training Loss: 0.01698
Epoch: 8576, Training Loss: 0.01814
Epoch: 8576, Training Loss: 0.01817
Epoch: 8576, Training Loss: 0.02266
Epoch: 8577, Training Loss: 0.01697
Epoch: 8577, Training Loss: 0.01814
Epoch: 8577, Training Loss: 0.01817
Epoch: 8577, Training Loss: 0.02266
Epoch: 8578, Training Loss: 0.01697
Epoch: 8578, Training Loss: 0.01814
Epoch: 8578, Training Loss: 0.01817
Epoch: 8578, Training Loss: 0.02265
Epoch: 8579, Training Loss: 0.01697
Epoch: 8579, Training Loss: 0.01814
Epoch: 8579, Training Loss: 0.01817
Epoch: 8579, Training Loss: 0.02265
Epoch: 8580, Training Loss: 0.01697
Epoch: 8580, Training Loss: 0.01814
Epoch: 8580, Training Loss: 0.01817
Epoch: 8580, Training Loss: 0.02265
Epoch: 8581, Training Loss: 0.01697
Epoch: 8581, Training Loss: 0.01814
Epoch: 8581, Training Loss: 0.01817
Epoch: 8581, Training Loss: 0.02265
Epoch: 8582, Training Loss: 0.01697
Epoch: 8582, Training Loss: 0.01813
Epoch: 8582, Training Loss: 0.01816
Epoch: 8582, Training Loss: 0.02265
Epoch: 8583, Training Loss: 0.01697
Epoch: 8583, Training Loss: 0.01813
Epoch: 8583, Training Loss: 0.01816
Epoch: 8583, Training Loss: 0.02264
Epoch: 8584, Training Loss: 0.01697
Epoch: 8584, Training Loss: 0.01813
Epoch: 8584, Training Loss: 0.01816
Epoch: 8584, Training Loss: 0.02264
Epoch: 8585, Training Loss: 0.01696
Epoch: 8585, Training Loss: 0.01813
Epoch: 8585, Training Loss: 0.01816
Epoch: 8585, Training Loss: 0.02264
Epoch: 8586, Training Loss: 0.01696
Epoch: 8586, Training Loss: 0.01813
Epoch: 8586, Training Loss: 0.01816
Epoch: 8586, Training Loss: 0.02264
Epoch: 8587, Training Loss: 0.01696
Epoch: 8587, Training Loss: 0.01813
Epoch: 8587, Training Loss: 0.01816
Epoch: 8587, Training Loss: 0.02264
Epoch: 8588, Training Loss: 0.01696
Epoch: 8588, Training Loss: 0.01813
Epoch: 8588, Training Loss: 0.01815
Epoch: 8588, Training Loss: 0.02263
Epoch: 8589, Training Loss: 0.01696
Epoch: 8589, Training Loss: 0.01812
Epoch: 8589, Training Loss: 0.01815
Epoch: 8589, Training Loss: 0.02263
Epoch: 8590, Training Loss: 0.01696
Epoch: 8590, Training Loss: 0.01812
Epoch: 8590, Training Loss: 0.01815
Epoch: 8590, Training Loss: 0.02263
Epoch: 8591, Training Loss: 0.01696
Epoch: 8591, Training Loss: 0.01812
Epoch: 8591, Training Loss: 0.01815
Epoch: 8591, Training Loss: 0.02263
Epoch: 8592, Training Loss: 0.01695
Epoch: 8592, Training Loss: 0.01812
Epoch: 8592, Training Loss: 0.01815
Epoch: 8592, Training Loss: 0.02263
Epoch: 8593, Training Loss: 0.01695
Epoch: 8593, Training Loss: 0.01812
Epoch: 8593, Training Loss: 0.01815
Epoch: 8593, Training Loss: 0.02262
Epoch: 8594, Training Loss: 0.01695
Epoch: 8594, Training Loss: 0.01812
Epoch: 8594, Training Loss: 0.01815
Epoch: 8594, Training Loss: 0.02262
Epoch: 8595, Training Loss: 0.01695
Epoch: 8595, Training Loss: 0.01811
Epoch: 8595, Training Loss: 0.01814
Epoch: 8595, Training Loss: 0.02262
Epoch: 8596, Training Loss: 0.01695
Epoch: 8596, Training Loss: 0.01811
Epoch: 8596, Training Loss: 0.01814
Epoch: 8596, Training Loss: 0.02262
Epoch: 8597, Training Loss: 0.01695
Epoch: 8597, Training Loss: 0.01811
Epoch: 8597, Training Loss: 0.01814
Epoch: 8597, Training Loss: 0.02262
Epoch: 8598, Training Loss: 0.01695
Epoch: 8598, Training Loss: 0.01811
Epoch: 8598, Training Loss: 0.01814
Epoch: 8598, Training Loss: 0.02261
Epoch: 8599, Training Loss: 0.01694
Epoch: 8599, Training Loss: 0.01811
Epoch: 8599, Training Loss: 0.01814
Epoch: 8599, Training Loss: 0.02261
Epoch: 8600, Training Loss: 0.01694
Epoch: 8600, Training Loss: 0.01811
Epoch: 8600, Training Loss: 0.01814
Epoch: 8600, Training Loss: 0.02261
Epoch: 8601, Training Loss: 0.01694
Epoch: 8601, Training Loss: 0.01811
Epoch: 8601, Training Loss: 0.01813
Epoch: 8601, Training Loss: 0.02261
Epoch: 8602, Training Loss: 0.01694
Epoch: 8602, Training Loss: 0.01810
Epoch: 8602, Training Loss: 0.01813
Epoch: 8602, Training Loss: 0.02261
Epoch: 8603, Training Loss: 0.01694
Epoch: 8603, Training Loss: 0.01810
Epoch: 8603, Training Loss: 0.01813
Epoch: 8603, Training Loss: 0.02260
Epoch: 8604, Training Loss: 0.01694
Epoch: 8604, Training Loss: 0.01810
Epoch: 8604, Training Loss: 0.01813
Epoch: 8604, Training Loss: 0.02260
Epoch: 8605, Training Loss: 0.01694
Epoch: 8605, Training Loss: 0.01810
Epoch: 8605, Training Loss: 0.01813
Epoch: 8605, Training Loss: 0.02260
Epoch: 8606, Training Loss: 0.01693
Epoch: 8606, Training Loss: 0.01810
Epoch: 8606, Training Loss: 0.01813
Epoch: 8606, Training Loss: 0.02260
Epoch: 8607, Training Loss: 0.01693
Epoch: 8607, Training Loss: 0.01810
Epoch: 8607, Training Loss: 0.01813
Epoch: 8607, Training Loss: 0.02260
Epoch: 8608, Training Loss: 0.01693
Epoch: 8608, Training Loss: 0.01809
Epoch: 8608, Training Loss: 0.01812
Epoch: 8608, Training Loss: 0.02259
Epoch: 8609, Training Loss: 0.01693
Epoch: 8609, Training Loss: 0.01809
Epoch: 8609, Training Loss: 0.01812
Epoch: 8609, Training Loss: 0.02259
Epoch: 8610, Training Loss: 0.01693
Epoch: 8610, Training Loss: 0.01809
Epoch: 8610, Training Loss: 0.01812
Epoch: 8610, Training Loss: 0.02259
Epoch: 8611, Training Loss: 0.01693
Epoch: 8611, Training Loss: 0.01809
Epoch: 8611, Training Loss: 0.01812
Epoch: 8611, Training Loss: 0.02259
Epoch: 8612, Training Loss: 0.01693
Epoch: 8612, Training Loss: 0.01809
Epoch: 8612, Training Loss: 0.01812
Epoch: 8612, Training Loss: 0.02259
Epoch: 8613, Training Loss: 0.01693
Epoch: 8613, Training Loss: 0.01809
Epoch: 8613, Training Loss: 0.01812
Epoch: 8613, Training Loss: 0.02259
Epoch: 8614, Training Loss: 0.01692
Epoch: 8614, Training Loss: 0.01809
Epoch: 8614, Training Loss: 0.01811
Epoch: 8614, Training Loss: 0.02258
Epoch: 8615, Training Loss: 0.01692
Epoch: 8615, Training Loss: 0.01808
Epoch: 8615, Training Loss: 0.01811
Epoch: 8615, Training Loss: 0.02258
Epoch: 8616, Training Loss: 0.01692
Epoch: 8616, Training Loss: 0.01808
Epoch: 8616, Training Loss: 0.01811
Epoch: 8616, Training Loss: 0.02258
Epoch: 8617, Training Loss: 0.01692
Epoch: 8617, Training Loss: 0.01808
Epoch: 8617, Training Loss: 0.01811
Epoch: 8617, Training Loss: 0.02258
Epoch: 8618, Training Loss: 0.01692
Epoch: 8618, Training Loss: 0.01808
Epoch: 8618, Training Loss: 0.01811
Epoch: 8618, Training Loss: 0.02258
Epoch: 8619, Training Loss: 0.01692
Epoch: 8619, Training Loss: 0.01808
Epoch: 8619, Training Loss: 0.01811
Epoch: 8619, Training Loss: 0.02257
Epoch: 8620, Training Loss: 0.01692
Epoch: 8620, Training Loss: 0.01808
Epoch: 8620, Training Loss: 0.01811
Epoch: 8620, Training Loss: 0.02257
Epoch: 8621, Training Loss: 0.01691
Epoch: 8621, Training Loss: 0.01807
Epoch: 8621, Training Loss: 0.01810
Epoch: 8621, Training Loss: 0.02257
Epoch: 8622, Training Loss: 0.01691
Epoch: 8622, Training Loss: 0.01807
Epoch: 8622, Training Loss: 0.01810
Epoch: 8622, Training Loss: 0.02257
Epoch: 8623, Training Loss: 0.01691
Epoch: 8623, Training Loss: 0.01807
Epoch: 8623, Training Loss: 0.01810
Epoch: 8623, Training Loss: 0.02257
Epoch: 8624, Training Loss: 0.01691
Epoch: 8624, Training Loss: 0.01807
Epoch: 8624, Training Loss: 0.01810
Epoch: 8624, Training Loss: 0.02256
Epoch: 8625, Training Loss: 0.01691
Epoch: 8625, Training Loss: 0.01807
Epoch: 8625, Training Loss: 0.01810
Epoch: 8625, Training Loss: 0.02256
Epoch: 8626, Training Loss: 0.01691
Epoch: 8626, Training Loss: 0.01807
Epoch: 8626, Training Loss: 0.01810
Epoch: 8626, Training Loss: 0.02256
Epoch: 8627, Training Loss: 0.01691
Epoch: 8627, Training Loss: 0.01807
Epoch: 8627, Training Loss: 0.01809
Epoch: 8627, Training Loss: 0.02256
Epoch: 8628, Training Loss: 0.01690
Epoch: 8628, Training Loss: 0.01806
Epoch: 8628, Training Loss: 0.01809
Epoch: 8628, Training Loss: 0.02256
Epoch: 8629, Training Loss: 0.01690
Epoch: 8629, Training Loss: 0.01806
Epoch: 8629, Training Loss: 0.01809
Epoch: 8629, Training Loss: 0.02255
Epoch: 8630, Training Loss: 0.01690
Epoch: 8630, Training Loss: 0.01806
Epoch: 8630, Training Loss: 0.01809
Epoch: 8630, Training Loss: 0.02255
Epoch: 8631, Training Loss: 0.01690
Epoch: 8631, Training Loss: 0.01806
Epoch: 8631, Training Loss: 0.01809
Epoch: 8631, Training Loss: 0.02255
Epoch: 8632, Training Loss: 0.01690
Epoch: 8632, Training Loss: 0.01806
Epoch: 8632, Training Loss: 0.01809
Epoch: 8632, Training Loss: 0.02255
Epoch: 8633, Training Loss: 0.01690
Epoch: 8633, Training Loss: 0.01806
Epoch: 8633, Training Loss: 0.01809
Epoch: 8633, Training Loss: 0.02255
Epoch: 8634, Training Loss: 0.01690
Epoch: 8634, Training Loss: 0.01805
Epoch: 8634, Training Loss: 0.01808
Epoch: 8634, Training Loss: 0.02254
Epoch: 8635, Training Loss: 0.01690
Epoch: 8635, Training Loss: 0.01805
Epoch: 8635, Training Loss: 0.01808
Epoch: 8635, Training Loss: 0.02254
Epoch: 8636, Training Loss: 0.01689
Epoch: 8636, Training Loss: 0.01805
Epoch: 8636, Training Loss: 0.01808
Epoch: 8636, Training Loss: 0.02254
Epoch: 8637, Training Loss: 0.01689
Epoch: 8637, Training Loss: 0.01805
Epoch: 8637, Training Loss: 0.01808
Epoch: 8637, Training Loss: 0.02254
Epoch: 8638, Training Loss: 0.01689
Epoch: 8638, Training Loss: 0.01805
Epoch: 8638, Training Loss: 0.01808
Epoch: 8638, Training Loss: 0.02254
Epoch: 8639, Training Loss: 0.01689
Epoch: 8639, Training Loss: 0.01805
Epoch: 8639, Training Loss: 0.01808
Epoch: 8639, Training Loss: 0.02253
Epoch: 8640, Training Loss: 0.01689
Epoch: 8640, Training Loss: 0.01805
Epoch: 8640, Training Loss: 0.01807
Epoch: 8640, Training Loss: 0.02253
Epoch: 8641, Training Loss: 0.01689
Epoch: 8641, Training Loss: 0.01804
Epoch: 8641, Training Loss: 0.01807
Epoch: 8641, Training Loss: 0.02253
Epoch: 8642, Training Loss: 0.01689
Epoch: 8642, Training Loss: 0.01804
Epoch: 8642, Training Loss: 0.01807
Epoch: 8642, Training Loss: 0.02253
Epoch: 8643, Training Loss: 0.01688
Epoch: 8643, Training Loss: 0.01804
Epoch: 8643, Training Loss: 0.01807
Epoch: 8643, Training Loss: 0.02253
Epoch: 8644, Training Loss: 0.01688
Epoch: 8644, Training Loss: 0.01804
Epoch: 8644, Training Loss: 0.01807
Epoch: 8644, Training Loss: 0.02252
Epoch: 8645, Training Loss: 0.01688
Epoch: 8645, Training Loss: 0.01804
Epoch: 8645, Training Loss: 0.01807
Epoch: 8645, Training Loss: 0.02252
Epoch: 8646, Training Loss: 0.01688
Epoch: 8646, Training Loss: 0.01804
Epoch: 8646, Training Loss: 0.01807
Epoch: 8646, Training Loss: 0.02252
Epoch: 8647, Training Loss: 0.01688
Epoch: 8647, Training Loss: 0.01804
Epoch: 8647, Training Loss: 0.01806
Epoch: 8647, Training Loss: 0.02252
Epoch: 8648, Training Loss: 0.01688
Epoch: 8648, Training Loss: 0.01803
Epoch: 8648, Training Loss: 0.01806
Epoch: 8648, Training Loss: 0.02252
Epoch: 8649, Training Loss: 0.01688
Epoch: 8649, Training Loss: 0.01803
Epoch: 8649, Training Loss: 0.01806
Epoch: 8649, Training Loss: 0.02251
Epoch: 8650, Training Loss: 0.01688
Epoch: 8650, Training Loss: 0.01803
Epoch: 8650, Training Loss: 0.01806
Epoch: 8650, Training Loss: 0.02251
Epoch: 8651, Training Loss: 0.01687
Epoch: 8651, Training Loss: 0.01803
Epoch: 8651, Training Loss: 0.01806
Epoch: 8651, Training Loss: 0.02251
Epoch: 8652, Training Loss: 0.01687
Epoch: 8652, Training Loss: 0.01803
Epoch: 8652, Training Loss: 0.01806
Epoch: 8652, Training Loss: 0.02251
Epoch: 8653, Training Loss: 0.01687
Epoch: 8653, Training Loss: 0.01803
Epoch: 8653, Training Loss: 0.01806
Epoch: 8653, Training Loss: 0.02251
Epoch: 8654, Training Loss: 0.01687
Epoch: 8654, Training Loss: 0.01802
Epoch: 8654, Training Loss: 0.01805
Epoch: 8654, Training Loss: 0.02250
Epoch: 8655, Training Loss: 0.01687
Epoch: 8655, Training Loss: 0.01802
Epoch: 8655, Training Loss: 0.01805
Epoch: 8655, Training Loss: 0.02250
Epoch: 8656, Training Loss: 0.01687
Epoch: 8656, Training Loss: 0.01802
Epoch: 8656, Training Loss: 0.01805
Epoch: 8656, Training Loss: 0.02250
Epoch: 8657, Training Loss: 0.01687
Epoch: 8657, Training Loss: 0.01802
Epoch: 8657, Training Loss: 0.01805
Epoch: 8657, Training Loss: 0.02250
Epoch: 8658, Training Loss: 0.01686
Epoch: 8658, Training Loss: 0.01802
Epoch: 8658, Training Loss: 0.01805
Epoch: 8658, Training Loss: 0.02250
Epoch: 8659, Training Loss: 0.01686
Epoch: 8659, Training Loss: 0.01802
Epoch: 8659, Training Loss: 0.01805
Epoch: 8659, Training Loss: 0.02250
Epoch: 8660, Training Loss: 0.01686
Epoch: 8660, Training Loss: 0.01802
Epoch: 8660, Training Loss: 0.01804
Epoch: 8660, Training Loss: 0.02249
Epoch: 8661, Training Loss: 0.01686
Epoch: 8661, Training Loss: 0.01801
Epoch: 8661, Training Loss: 0.01804
Epoch: 8661, Training Loss: 0.02249
Epoch: 8662, Training Loss: 0.01686
Epoch: 8662, Training Loss: 0.01801
Epoch: 8662, Training Loss: 0.01804
Epoch: 8662, Training Loss: 0.02249
... (per-pattern loss lines for epochs 8663-8910 omitted; each of the four values decreases monotonically, by roughly 0.00001 per epoch) ...
Epoch: 8911, Training Loss: 0.01653
Epoch: 8911, Training Loss: 0.01765
Epoch: 8911, Training Loss: 0.01768
Epoch: 8911, Training Loss: 0.02202
Epoch: 8912, Training Loss: 0.01653
Epoch: 8912, Training Loss: 0.01765
Epoch: 8912, Training Loss: 0.01767
Epoch: 8912, Training Loss: 0.02202
Epoch: 8913, Training Loss: 0.01653
Epoch: 8913, Training Loss: 0.01764
Epoch: 8913, Training Loss: 0.01767
Epoch: 8913, Training Loss: 0.02202
Epoch: 8914, Training Loss: 0.01653
Epoch: 8914, Training Loss: 0.01764
Epoch: 8914, Training Loss: 0.01767
Epoch: 8914, Training Loss: 0.02202
Epoch: 8915, Training Loss: 0.01653
Epoch: 8915, Training Loss: 0.01764
Epoch: 8915, Training Loss: 0.01767
Epoch: 8915, Training Loss: 0.02201
Epoch: 8916, Training Loss: 0.01653
Epoch: 8916, Training Loss: 0.01764
Epoch: 8916, Training Loss: 0.01767
Epoch: 8916, Training Loss: 0.02201
Epoch: 8917, Training Loss: 0.01652
Epoch: 8917, Training Loss: 0.01764
Epoch: 8917, Training Loss: 0.01767
Epoch: 8917, Training Loss: 0.02201
Epoch: 8918, Training Loss: 0.01652
Epoch: 8918, Training Loss: 0.01764
Epoch: 8918, Training Loss: 0.01767
Epoch: 8918, Training Loss: 0.02201
Epoch: 8919, Training Loss: 0.01652
Epoch: 8919, Training Loss: 0.01764
Epoch: 8919, Training Loss: 0.01766
Epoch: 8919, Training Loss: 0.02201
Epoch: 8920, Training Loss: 0.01652
Epoch: 8920, Training Loss: 0.01763
Epoch: 8920, Training Loss: 0.01766
Epoch: 8920, Training Loss: 0.02200
Epoch: 8921, Training Loss: 0.01652
Epoch: 8921, Training Loss: 0.01763
Epoch: 8921, Training Loss: 0.01766
Epoch: 8921, Training Loss: 0.02200
Epoch: 8922, Training Loss: 0.01652
Epoch: 8922, Training Loss: 0.01763
Epoch: 8922, Training Loss: 0.01766
Epoch: 8922, Training Loss: 0.02200
Epoch: 8923, Training Loss: 0.01652
Epoch: 8923, Training Loss: 0.01763
Epoch: 8923, Training Loss: 0.01766
Epoch: 8923, Training Loss: 0.02200
Epoch: 8924, Training Loss: 0.01652
Epoch: 8924, Training Loss: 0.01763
Epoch: 8924, Training Loss: 0.01766
Epoch: 8924, Training Loss: 0.02200
Epoch: 8925, Training Loss: 0.01651
Epoch: 8925, Training Loss: 0.01763
Epoch: 8925, Training Loss: 0.01766
Epoch: 8925, Training Loss: 0.02200
Epoch: 8926, Training Loss: 0.01651
Epoch: 8926, Training Loss: 0.01763
Epoch: 8926, Training Loss: 0.01765
Epoch: 8926, Training Loss: 0.02199
Epoch: 8927, Training Loss: 0.01651
Epoch: 8927, Training Loss: 0.01763
Epoch: 8927, Training Loss: 0.01765
Epoch: 8927, Training Loss: 0.02199
Epoch: 8928, Training Loss: 0.01651
Epoch: 8928, Training Loss: 0.01762
Epoch: 8928, Training Loss: 0.01765
Epoch: 8928, Training Loss: 0.02199
Epoch: 8929, Training Loss: 0.01651
Epoch: 8929, Training Loss: 0.01762
Epoch: 8929, Training Loss: 0.01765
Epoch: 8929, Training Loss: 0.02199
Epoch: 8930, Training Loss: 0.01651
Epoch: 8930, Training Loss: 0.01762
Epoch: 8930, Training Loss: 0.01765
Epoch: 8930, Training Loss: 0.02199
Epoch: 8931, Training Loss: 0.01651
Epoch: 8931, Training Loss: 0.01762
Epoch: 8931, Training Loss: 0.01765
Epoch: 8931, Training Loss: 0.02198
Epoch: 8932, Training Loss: 0.01651
Epoch: 8932, Training Loss: 0.01762
Epoch: 8932, Training Loss: 0.01765
Epoch: 8932, Training Loss: 0.02198
Epoch: 8933, Training Loss: 0.01650
Epoch: 8933, Training Loss: 0.01762
Epoch: 8933, Training Loss: 0.01764
Epoch: 8933, Training Loss: 0.02198
Epoch: 8934, Training Loss: 0.01650
Epoch: 8934, Training Loss: 0.01762
Epoch: 8934, Training Loss: 0.01764
Epoch: 8934, Training Loss: 0.02198
Epoch: 8935, Training Loss: 0.01650
Epoch: 8935, Training Loss: 0.01761
Epoch: 8935, Training Loss: 0.01764
Epoch: 8935, Training Loss: 0.02198
Epoch: 8936, Training Loss: 0.01650
Epoch: 8936, Training Loss: 0.01761
Epoch: 8936, Training Loss: 0.01764
Epoch: 8936, Training Loss: 0.02198
Epoch: 8937, Training Loss: 0.01650
Epoch: 8937, Training Loss: 0.01761
Epoch: 8937, Training Loss: 0.01764
Epoch: 8937, Training Loss: 0.02197
Epoch: 8938, Training Loss: 0.01650
Epoch: 8938, Training Loss: 0.01761
Epoch: 8938, Training Loss: 0.01764
Epoch: 8938, Training Loss: 0.02197
Epoch: 8939, Training Loss: 0.01650
Epoch: 8939, Training Loss: 0.01761
Epoch: 8939, Training Loss: 0.01764
Epoch: 8939, Training Loss: 0.02197
Epoch: 8940, Training Loss: 0.01650
Epoch: 8940, Training Loss: 0.01761
Epoch: 8940, Training Loss: 0.01763
Epoch: 8940, Training Loss: 0.02197
Epoch: 8941, Training Loss: 0.01649
Epoch: 8941, Training Loss: 0.01761
Epoch: 8941, Training Loss: 0.01763
Epoch: 8941, Training Loss: 0.02197
Epoch: 8942, Training Loss: 0.01649
Epoch: 8942, Training Loss: 0.01760
Epoch: 8942, Training Loss: 0.01763
Epoch: 8942, Training Loss: 0.02196
Epoch: 8943, Training Loss: 0.01649
Epoch: 8943, Training Loss: 0.01760
Epoch: 8943, Training Loss: 0.01763
Epoch: 8943, Training Loss: 0.02196
Epoch: 8944, Training Loss: 0.01649
Epoch: 8944, Training Loss: 0.01760
Epoch: 8944, Training Loss: 0.01763
Epoch: 8944, Training Loss: 0.02196
Epoch: 8945, Training Loss: 0.01649
Epoch: 8945, Training Loss: 0.01760
Epoch: 8945, Training Loss: 0.01763
Epoch: 8945, Training Loss: 0.02196
Epoch: 8946, Training Loss: 0.01649
Epoch: 8946, Training Loss: 0.01760
Epoch: 8946, Training Loss: 0.01763
Epoch: 8946, Training Loss: 0.02196
Epoch: 8947, Training Loss: 0.01649
Epoch: 8947, Training Loss: 0.01760
Epoch: 8947, Training Loss: 0.01762
Epoch: 8947, Training Loss: 0.02196
Epoch: 8948, Training Loss: 0.01649
Epoch: 8948, Training Loss: 0.01760
Epoch: 8948, Training Loss: 0.01762
Epoch: 8948, Training Loss: 0.02195
Epoch: 8949, Training Loss: 0.01648
Epoch: 8949, Training Loss: 0.01759
Epoch: 8949, Training Loss: 0.01762
Epoch: 8949, Training Loss: 0.02195
Epoch: 8950, Training Loss: 0.01648
Epoch: 8950, Training Loss: 0.01759
Epoch: 8950, Training Loss: 0.01762
Epoch: 8950, Training Loss: 0.02195
Epoch: 8951, Training Loss: 0.01648
Epoch: 8951, Training Loss: 0.01759
Epoch: 8951, Training Loss: 0.01762
Epoch: 8951, Training Loss: 0.02195
Epoch: 8952, Training Loss: 0.01648
Epoch: 8952, Training Loss: 0.01759
Epoch: 8952, Training Loss: 0.01762
Epoch: 8952, Training Loss: 0.02195
Epoch: 8953, Training Loss: 0.01648
Epoch: 8953, Training Loss: 0.01759
Epoch: 8953, Training Loss: 0.01762
Epoch: 8953, Training Loss: 0.02194
Epoch: 8954, Training Loss: 0.01648
Epoch: 8954, Training Loss: 0.01759
Epoch: 8954, Training Loss: 0.01762
Epoch: 8954, Training Loss: 0.02194
Epoch: 8955, Training Loss: 0.01648
Epoch: 8955, Training Loss: 0.01759
Epoch: 8955, Training Loss: 0.01761
Epoch: 8955, Training Loss: 0.02194
Epoch: 8956, Training Loss: 0.01648
Epoch: 8956, Training Loss: 0.01758
Epoch: 8956, Training Loss: 0.01761
Epoch: 8956, Training Loss: 0.02194
Epoch: 8957, Training Loss: 0.01647
Epoch: 8957, Training Loss: 0.01758
Epoch: 8957, Training Loss: 0.01761
Epoch: 8957, Training Loss: 0.02194
Epoch: 8958, Training Loss: 0.01647
Epoch: 8958, Training Loss: 0.01758
Epoch: 8958, Training Loss: 0.01761
Epoch: 8958, Training Loss: 0.02194
Epoch: 8959, Training Loss: 0.01647
Epoch: 8959, Training Loss: 0.01758
Epoch: 8959, Training Loss: 0.01761
Epoch: 8959, Training Loss: 0.02193
Epoch: 8960, Training Loss: 0.01647
Epoch: 8960, Training Loss: 0.01758
Epoch: 8960, Training Loss: 0.01761
Epoch: 8960, Training Loss: 0.02193
Epoch: 8961, Training Loss: 0.01647
Epoch: 8961, Training Loss: 0.01758
Epoch: 8961, Training Loss: 0.01761
Epoch: 8961, Training Loss: 0.02193
Epoch: 8962, Training Loss: 0.01647
Epoch: 8962, Training Loss: 0.01758
Epoch: 8962, Training Loss: 0.01760
Epoch: 8962, Training Loss: 0.02193
Epoch: 8963, Training Loss: 0.01647
Epoch: 8963, Training Loss: 0.01757
Epoch: 8963, Training Loss: 0.01760
Epoch: 8963, Training Loss: 0.02193
Epoch: 8964, Training Loss: 0.01647
Epoch: 8964, Training Loss: 0.01757
Epoch: 8964, Training Loss: 0.01760
Epoch: 8964, Training Loss: 0.02193
Epoch: 8965, Training Loss: 0.01646
Epoch: 8965, Training Loss: 0.01757
Epoch: 8965, Training Loss: 0.01760
Epoch: 8965, Training Loss: 0.02192
Epoch: 8966, Training Loss: 0.01646
Epoch: 8966, Training Loss: 0.01757
Epoch: 8966, Training Loss: 0.01760
Epoch: 8966, Training Loss: 0.02192
Epoch: 8967, Training Loss: 0.01646
Epoch: 8967, Training Loss: 0.01757
Epoch: 8967, Training Loss: 0.01760
Epoch: 8967, Training Loss: 0.02192
Epoch: 8968, Training Loss: 0.01646
Epoch: 8968, Training Loss: 0.01757
Epoch: 8968, Training Loss: 0.01760
Epoch: 8968, Training Loss: 0.02192
Epoch: 8969, Training Loss: 0.01646
Epoch: 8969, Training Loss: 0.01757
Epoch: 8969, Training Loss: 0.01759
Epoch: 8969, Training Loss: 0.02192
Epoch: 8970, Training Loss: 0.01646
Epoch: 8970, Training Loss: 0.01756
Epoch: 8970, Training Loss: 0.01759
Epoch: 8970, Training Loss: 0.02191
Epoch: 8971, Training Loss: 0.01646
Epoch: 8971, Training Loss: 0.01756
Epoch: 8971, Training Loss: 0.01759
Epoch: 8971, Training Loss: 0.02191
Epoch: 8972, Training Loss: 0.01646
Epoch: 8972, Training Loss: 0.01756
Epoch: 8972, Training Loss: 0.01759
Epoch: 8972, Training Loss: 0.02191
Epoch: 8973, Training Loss: 0.01645
Epoch: 8973, Training Loss: 0.01756
Epoch: 8973, Training Loss: 0.01759
Epoch: 8973, Training Loss: 0.02191
Epoch: 8974, Training Loss: 0.01645
Epoch: 8974, Training Loss: 0.01756
Epoch: 8974, Training Loss: 0.01759
Epoch: 8974, Training Loss: 0.02191
Epoch: 8975, Training Loss: 0.01645
Epoch: 8975, Training Loss: 0.01756
Epoch: 8975, Training Loss: 0.01759
Epoch: 8975, Training Loss: 0.02191
Epoch: 8976, Training Loss: 0.01645
Epoch: 8976, Training Loss: 0.01756
Epoch: 8976, Training Loss: 0.01758
Epoch: 8976, Training Loss: 0.02190
Epoch: 8977, Training Loss: 0.01645
Epoch: 8977, Training Loss: 0.01755
Epoch: 8977, Training Loss: 0.01758
Epoch: 8977, Training Loss: 0.02190
Epoch: 8978, Training Loss: 0.01645
Epoch: 8978, Training Loss: 0.01755
Epoch: 8978, Training Loss: 0.01758
Epoch: 8978, Training Loss: 0.02190
Epoch: 8979, Training Loss: 0.01645
Epoch: 8979, Training Loss: 0.01755
Epoch: 8979, Training Loss: 0.01758
Epoch: 8979, Training Loss: 0.02190
Epoch: 8980, Training Loss: 0.01645
Epoch: 8980, Training Loss: 0.01755
Epoch: 8980, Training Loss: 0.01758
Epoch: 8980, Training Loss: 0.02190
Epoch: 8981, Training Loss: 0.01644
Epoch: 8981, Training Loss: 0.01755
Epoch: 8981, Training Loss: 0.01758
Epoch: 8981, Training Loss: 0.02189
Epoch: 8982, Training Loss: 0.01644
Epoch: 8982, Training Loss: 0.01755
Epoch: 8982, Training Loss: 0.01758
Epoch: 8982, Training Loss: 0.02189
Epoch: 8983, Training Loss: 0.01644
Epoch: 8983, Training Loss: 0.01755
Epoch: 8983, Training Loss: 0.01757
Epoch: 8983, Training Loss: 0.02189
Epoch: 8984, Training Loss: 0.01644
Epoch: 8984, Training Loss: 0.01754
Epoch: 8984, Training Loss: 0.01757
Epoch: 8984, Training Loss: 0.02189
Epoch: 8985, Training Loss: 0.01644
Epoch: 8985, Training Loss: 0.01754
Epoch: 8985, Training Loss: 0.01757
Epoch: 8985, Training Loss: 0.02189
Epoch: 8986, Training Loss: 0.01644
Epoch: 8986, Training Loss: 0.01754
Epoch: 8986, Training Loss: 0.01757
Epoch: 8986, Training Loss: 0.02189
Epoch: 8987, Training Loss: 0.01644
Epoch: 8987, Training Loss: 0.01754
Epoch: 8987, Training Loss: 0.01757
Epoch: 8987, Training Loss: 0.02188
Epoch: 8988, Training Loss: 0.01644
Epoch: 8988, Training Loss: 0.01754
Epoch: 8988, Training Loss: 0.01757
Epoch: 8988, Training Loss: 0.02188
Epoch: 8989, Training Loss: 0.01643
Epoch: 8989, Training Loss: 0.01754
Epoch: 8989, Training Loss: 0.01757
Epoch: 8989, Training Loss: 0.02188
Epoch: 8990, Training Loss: 0.01643
Epoch: 8990, Training Loss: 0.01754
Epoch: 8990, Training Loss: 0.01756
Epoch: 8990, Training Loss: 0.02188
Epoch: 8991, Training Loss: 0.01643
Epoch: 8991, Training Loss: 0.01754
Epoch: 8991, Training Loss: 0.01756
Epoch: 8991, Training Loss: 0.02188
Epoch: 8992, Training Loss: 0.01643
Epoch: 8992, Training Loss: 0.01753
Epoch: 8992, Training Loss: 0.01756
Epoch: 8992, Training Loss: 0.02187
Epoch: 8993, Training Loss: 0.01643
Epoch: 8993, Training Loss: 0.01753
Epoch: 8993, Training Loss: 0.01756
Epoch: 8993, Training Loss: 0.02187
Epoch: 8994, Training Loss: 0.01643
Epoch: 8994, Training Loss: 0.01753
Epoch: 8994, Training Loss: 0.01756
Epoch: 8994, Training Loss: 0.02187
Epoch: 8995, Training Loss: 0.01643
Epoch: 8995, Training Loss: 0.01753
Epoch: 8995, Training Loss: 0.01756
Epoch: 8995, Training Loss: 0.02187
Epoch: 8996, Training Loss: 0.01643
Epoch: 8996, Training Loss: 0.01753
Epoch: 8996, Training Loss: 0.01756
Epoch: 8996, Training Loss: 0.02187
Epoch: 8997, Training Loss: 0.01642
Epoch: 8997, Training Loss: 0.01753
Epoch: 8997, Training Loss: 0.01755
Epoch: 8997, Training Loss: 0.02187
Epoch: 8998, Training Loss: 0.01642
Epoch: 8998, Training Loss: 0.01753
Epoch: 8998, Training Loss: 0.01755
Epoch: 8998, Training Loss: 0.02186
Epoch: 8999, Training Loss: 0.01642
Epoch: 8999, Training Loss: 0.01752
Epoch: 8999, Training Loss: 0.01755
Epoch: 8999, Training Loss: 0.02186
Epoch: 9000, Training Loss: 0.01642
Epoch: 9000, Training Loss: 0.01752
Epoch: 9000, Training Loss: 0.01755
Epoch: 9000, Training Loss: 0.02186
Epoch: 9001, Training Loss: 0.01642
Epoch: 9001, Training Loss: 0.01752
Epoch: 9001, Training Loss: 0.01755
Epoch: 9001, Training Loss: 0.02186
Epoch: 9002, Training Loss: 0.01642
Epoch: 9002, Training Loss: 0.01752
Epoch: 9002, Training Loss: 0.01755
Epoch: 9002, Training Loss: 0.02186
Epoch: 9003, Training Loss: 0.01642
Epoch: 9003, Training Loss: 0.01752
Epoch: 9003, Training Loss: 0.01755
Epoch: 9003, Training Loss: 0.02186
Epoch: 9004, Training Loss: 0.01642
Epoch: 9004, Training Loss: 0.01752
Epoch: 9004, Training Loss: 0.01754
Epoch: 9004, Training Loss: 0.02185
Epoch: 9005, Training Loss: 0.01641
Epoch: 9005, Training Loss: 0.01752
Epoch: 9005, Training Loss: 0.01754
Epoch: 9005, Training Loss: 0.02185
Epoch: 9006, Training Loss: 0.01641
Epoch: 9006, Training Loss: 0.01751
Epoch: 9006, Training Loss: 0.01754
Epoch: 9006, Training Loss: 0.02185
Epoch: 9007, Training Loss: 0.01641
Epoch: 9007, Training Loss: 0.01751
Epoch: 9007, Training Loss: 0.01754
Epoch: 9007, Training Loss: 0.02185
Epoch: 9008, Training Loss: 0.01641
Epoch: 9008, Training Loss: 0.01751
Epoch: 9008, Training Loss: 0.01754
Epoch: 9008, Training Loss: 0.02185
Epoch: 9009, Training Loss: 0.01641
Epoch: 9009, Training Loss: 0.01751
Epoch: 9009, Training Loss: 0.01754
Epoch: 9009, Training Loss: 0.02184
Epoch: 9010, Training Loss: 0.01641
Epoch: 9010, Training Loss: 0.01751
Epoch: 9010, Training Loss: 0.01754
Epoch: 9010, Training Loss: 0.02184
Epoch: 9011, Training Loss: 0.01641
Epoch: 9011, Training Loss: 0.01751
Epoch: 9011, Training Loss: 0.01754
Epoch: 9011, Training Loss: 0.02184
Epoch: 9012, Training Loss: 0.01641
Epoch: 9012, Training Loss: 0.01751
Epoch: 9012, Training Loss: 0.01753
Epoch: 9012, Training Loss: 0.02184
Epoch: 9013, Training Loss: 0.01640
Epoch: 9013, Training Loss: 0.01750
Epoch: 9013, Training Loss: 0.01753
Epoch: 9013, Training Loss: 0.02184
Epoch: 9014, Training Loss: 0.01640
Epoch: 9014, Training Loss: 0.01750
Epoch: 9014, Training Loss: 0.01753
Epoch: 9014, Training Loss: 0.02184
Epoch: 9015, Training Loss: 0.01640
Epoch: 9015, Training Loss: 0.01750
Epoch: 9015, Training Loss: 0.01753
Epoch: 9015, Training Loss: 0.02183
Epoch: 9016, Training Loss: 0.01640
Epoch: 9016, Training Loss: 0.01750
Epoch: 9016, Training Loss: 0.01753
Epoch: 9016, Training Loss: 0.02183
Epoch: 9017, Training Loss: 0.01640
Epoch: 9017, Training Loss: 0.01750
Epoch: 9017, Training Loss: 0.01753
Epoch: 9017, Training Loss: 0.02183
Epoch: 9018, Training Loss: 0.01640
Epoch: 9018, Training Loss: 0.01750
Epoch: 9018, Training Loss: 0.01753
Epoch: 9018, Training Loss: 0.02183
Epoch: 9019, Training Loss: 0.01640
Epoch: 9019, Training Loss: 0.01750
Epoch: 9019, Training Loss: 0.01752
Epoch: 9019, Training Loss: 0.02183
Epoch: 9020, Training Loss: 0.01640
Epoch: 9020, Training Loss: 0.01749
Epoch: 9020, Training Loss: 0.01752
Epoch: 9020, Training Loss: 0.02182
Epoch: 9021, Training Loss: 0.01639
Epoch: 9021, Training Loss: 0.01749
Epoch: 9021, Training Loss: 0.01752
Epoch: 9021, Training Loss: 0.02182
Epoch: 9022, Training Loss: 0.01639
Epoch: 9022, Training Loss: 0.01749
Epoch: 9022, Training Loss: 0.01752
Epoch: 9022, Training Loss: 0.02182
Epoch: 9023, Training Loss: 0.01639
Epoch: 9023, Training Loss: 0.01749
Epoch: 9023, Training Loss: 0.01752
Epoch: 9023, Training Loss: 0.02182
Epoch: 9024, Training Loss: 0.01639
Epoch: 9024, Training Loss: 0.01749
Epoch: 9024, Training Loss: 0.01752
Epoch: 9024, Training Loss: 0.02182
Epoch: 9025, Training Loss: 0.01639
Epoch: 9025, Training Loss: 0.01749
Epoch: 9025, Training Loss: 0.01752
Epoch: 9025, Training Loss: 0.02182
Epoch: 9026, Training Loss: 0.01639
Epoch: 9026, Training Loss: 0.01749
Epoch: 9026, Training Loss: 0.01751
Epoch: 9026, Training Loss: 0.02181
Epoch: 9027, Training Loss: 0.01639
Epoch: 9027, Training Loss: 0.01748
Epoch: 9027, Training Loss: 0.01751
Epoch: 9027, Training Loss: 0.02181
Epoch: 9028, Training Loss: 0.01639
Epoch: 9028, Training Loss: 0.01748
Epoch: 9028, Training Loss: 0.01751
Epoch: 9028, Training Loss: 0.02181
Epoch: 9029, Training Loss: 0.01638
Epoch: 9029, Training Loss: 0.01748
Epoch: 9029, Training Loss: 0.01751
Epoch: 9029, Training Loss: 0.02181
Epoch: 9030, Training Loss: 0.01638
Epoch: 9030, Training Loss: 0.01748
Epoch: 9030, Training Loss: 0.01751
Epoch: 9030, Training Loss: 0.02181
Epoch: 9031, Training Loss: 0.01638
Epoch: 9031, Training Loss: 0.01748
Epoch: 9031, Training Loss: 0.01751
Epoch: 9031, Training Loss: 0.02181
Epoch: 9032, Training Loss: 0.01638
Epoch: 9032, Training Loss: 0.01748
Epoch: 9032, Training Loss: 0.01751
Epoch: 9032, Training Loss: 0.02180
Epoch: 9033, Training Loss: 0.01638
Epoch: 9033, Training Loss: 0.01748
Epoch: 9033, Training Loss: 0.01750
Epoch: 9033, Training Loss: 0.02180
Epoch: 9034, Training Loss: 0.01638
Epoch: 9034, Training Loss: 0.01748
Epoch: 9034, Training Loss: 0.01750
Epoch: 9034, Training Loss: 0.02180
Epoch: 9035, Training Loss: 0.01638
Epoch: 9035, Training Loss: 0.01747
Epoch: 9035, Training Loss: 0.01750
Epoch: 9035, Training Loss: 0.02180
Epoch: 9036, Training Loss: 0.01638
Epoch: 9036, Training Loss: 0.01747
Epoch: 9036, Training Loss: 0.01750
Epoch: 9036, Training Loss: 0.02180
Epoch: 9037, Training Loss: 0.01637
Epoch: 9037, Training Loss: 0.01747
Epoch: 9037, Training Loss: 0.01750
Epoch: 9037, Training Loss: 0.02179
Epoch: 9038, Training Loss: 0.01637
Epoch: 9038, Training Loss: 0.01747
Epoch: 9038, Training Loss: 0.01750
Epoch: 9038, Training Loss: 0.02179
Epoch: 9039, Training Loss: 0.01637
Epoch: 9039, Training Loss: 0.01747
Epoch: 9039, Training Loss: 0.01750
Epoch: 9039, Training Loss: 0.02179
Epoch: 9040, Training Loss: 0.01637
Epoch: 9040, Training Loss: 0.01747
Epoch: 9040, Training Loss: 0.01749
Epoch: 9040, Training Loss: 0.02179
Epoch: 9041, Training Loss: 0.01637
Epoch: 9041, Training Loss: 0.01747
Epoch: 9041, Training Loss: 0.01749
Epoch: 9041, Training Loss: 0.02179
Epoch: 9042, Training Loss: 0.01637
Epoch: 9042, Training Loss: 0.01746
Epoch: 9042, Training Loss: 0.01749
Epoch: 9042, Training Loss: 0.02179
Epoch: 9043, Training Loss: 0.01637
Epoch: 9043, Training Loss: 0.01746
Epoch: 9043, Training Loss: 0.01749
Epoch: 9043, Training Loss: 0.02178
Epoch: 9044, Training Loss: 0.01637
Epoch: 9044, Training Loss: 0.01746
Epoch: 9044, Training Loss: 0.01749
Epoch: 9044, Training Loss: 0.02178
Epoch: 9045, Training Loss: 0.01636
Epoch: 9045, Training Loss: 0.01746
Epoch: 9045, Training Loss: 0.01749
Epoch: 9045, Training Loss: 0.02178
Epoch: 9046, Training Loss: 0.01636
Epoch: 9046, Training Loss: 0.01746
Epoch: 9046, Training Loss: 0.01749
Epoch: 9046, Training Loss: 0.02178
Epoch: 9047, Training Loss: 0.01636
Epoch: 9047, Training Loss: 0.01746
Epoch: 9047, Training Loss: 0.01749
Epoch: 9047, Training Loss: 0.02178
Epoch: 9048, Training Loss: 0.01636
Epoch: 9048, Training Loss: 0.01746
Epoch: 9048, Training Loss: 0.01748
Epoch: 9048, Training Loss: 0.02178
Epoch: 9049, Training Loss: 0.01636
Epoch: 9049, Training Loss: 0.01745
Epoch: 9049, Training Loss: 0.01748
Epoch: 9049, Training Loss: 0.02177
Epoch: 9050, Training Loss: 0.01636
Epoch: 9050, Training Loss: 0.01745
Epoch: 9050, Training Loss: 0.01748
Epoch: 9050, Training Loss: 0.02177
Epoch: 9051, Training Loss: 0.01636
Epoch: 9051, Training Loss: 0.01745
Epoch: 9051, Training Loss: 0.01748
Epoch: 9051, Training Loss: 0.02177
Epoch: 9052, Training Loss: 0.01636
Epoch: 9052, Training Loss: 0.01745
Epoch: 9052, Training Loss: 0.01748
Epoch: 9052, Training Loss: 0.02177
Epoch: 9053, Training Loss: 0.01635
Epoch: 9053, Training Loss: 0.01745
Epoch: 9053, Training Loss: 0.01748
Epoch: 9053, Training Loss: 0.02177
Epoch: 9054, Training Loss: 0.01635
Epoch: 9054, Training Loss: 0.01745
Epoch: 9054, Training Loss: 0.01748
Epoch: 9054, Training Loss: 0.02176
Epoch: 9055, Training Loss: 0.01635
Epoch: 9055, Training Loss: 0.01745
Epoch: 9055, Training Loss: 0.01747
Epoch: 9055, Training Loss: 0.02176
Epoch: 9056, Training Loss: 0.01635
Epoch: 9056, Training Loss: 0.01744
Epoch: 9056, Training Loss: 0.01747
Epoch: 9056, Training Loss: 0.02176
Epoch: 9057, Training Loss: 0.01635
Epoch: 9057, Training Loss: 0.01744
Epoch: 9057, Training Loss: 0.01747
Epoch: 9057, Training Loss: 0.02176
Epoch: 9058, Training Loss: 0.01635
Epoch: 9058, Training Loss: 0.01744
Epoch: 9058, Training Loss: 0.01747
Epoch: 9058, Training Loss: 0.02176
Epoch: 9059, Training Loss: 0.01635
Epoch: 9059, Training Loss: 0.01744
Epoch: 9059, Training Loss: 0.01747
Epoch: 9059, Training Loss: 0.02176
Epoch: 9060, Training Loss: 0.01635
Epoch: 9060, Training Loss: 0.01744
Epoch: 9060, Training Loss: 0.01747
Epoch: 9060, Training Loss: 0.02175
Epoch: 9061, Training Loss: 0.01634
Epoch: 9061, Training Loss: 0.01744
Epoch: 9061, Training Loss: 0.01747
Epoch: 9061, Training Loss: 0.02175
Epoch: 9062, Training Loss: 0.01634
Epoch: 9062, Training Loss: 0.01744
Epoch: 9062, Training Loss: 0.01746
Epoch: 9062, Training Loss: 0.02175
Epoch: 9063, Training Loss: 0.01634
Epoch: 9063, Training Loss: 0.01744
Epoch: 9063, Training Loss: 0.01746
Epoch: 9063, Training Loss: 0.02175
Epoch: 9064, Training Loss: 0.01634
Epoch: 9064, Training Loss: 0.01743
Epoch: 9064, Training Loss: 0.01746
Epoch: 9064, Training Loss: 0.02175
Epoch: 9065, Training Loss: 0.01634
Epoch: 9065, Training Loss: 0.01743
Epoch: 9065, Training Loss: 0.01746
Epoch: 9065, Training Loss: 0.02175
Epoch: 9066, Training Loss: 0.01634
Epoch: 9066, Training Loss: 0.01743
Epoch: 9066, Training Loss: 0.01746
Epoch: 9066, Training Loss: 0.02174
Epoch: 9067, Training Loss: 0.01634
Epoch: 9067, Training Loss: 0.01743
Epoch: 9067, Training Loss: 0.01746
Epoch: 9067, Training Loss: 0.02174
Epoch: 9068, Training Loss: 0.01634
Epoch: 9068, Training Loss: 0.01743
Epoch: 9068, Training Loss: 0.01746
Epoch: 9068, Training Loss: 0.02174
Epoch: 9069, Training Loss: 0.01634
Epoch: 9069, Training Loss: 0.01743
Epoch: 9069, Training Loss: 0.01745
Epoch: 9069, Training Loss: 0.02174
Epoch: 9070, Training Loss: 0.01633
Epoch: 9070, Training Loss: 0.01743
Epoch: 9070, Training Loss: 0.01745
Epoch: 9070, Training Loss: 0.02174
Epoch: 9071, Training Loss: 0.01633
Epoch: 9071, Training Loss: 0.01742
Epoch: 9071, Training Loss: 0.01745
Epoch: 9071, Training Loss: 0.02173
Epoch: 9072, Training Loss: 0.01633
Epoch: 9072, Training Loss: 0.01742
Epoch: 9072, Training Loss: 0.01745
Epoch: 9072, Training Loss: 0.02173
Epoch: 9073, Training Loss: 0.01633
Epoch: 9073, Training Loss: 0.01742
Epoch: 9073, Training Loss: 0.01745
Epoch: 9073, Training Loss: 0.02173
Epoch: 9074, Training Loss: 0.01633
Epoch: 9074, Training Loss: 0.01742
Epoch: 9074, Training Loss: 0.01745
Epoch: 9074, Training Loss: 0.02173
Epoch: 9075, Training Loss: 0.01633
Epoch: 9075, Training Loss: 0.01742
Epoch: 9075, Training Loss: 0.01745
Epoch: 9075, Training Loss: 0.02173
Epoch: 9076, Training Loss: 0.01633
Epoch: 9076, Training Loss: 0.01742
Epoch: 9076, Training Loss: 0.01745
Epoch: 9076, Training Loss: 0.02173
Epoch: 9077, Training Loss: 0.01633
Epoch: 9077, Training Loss: 0.01742
Epoch: 9077, Training Loss: 0.01744
Epoch: 9077, Training Loss: 0.02172
Epoch: 9078, Training Loss: 0.01632
Epoch: 9078, Training Loss: 0.01741
Epoch: 9078, Training Loss: 0.01744
Epoch: 9078, Training Loss: 0.02172
Epoch: 9079, Training Loss: 0.01632
Epoch: 9079, Training Loss: 0.01741
Epoch: 9079, Training Loss: 0.01744
Epoch: 9079, Training Loss: 0.02172
Epoch: 9080, Training Loss: 0.01632
Epoch: 9080, Training Loss: 0.01741
Epoch: 9080, Training Loss: 0.01744
Epoch: 9080, Training Loss: 0.02172
Epoch: 9081, Training Loss: 0.01632
Epoch: 9081, Training Loss: 0.01741
Epoch: 9081, Training Loss: 0.01744
Epoch: 9081, Training Loss: 0.02172
Epoch: 9082, Training Loss: 0.01632
Epoch: 9082, Training Loss: 0.01741
Epoch: 9082, Training Loss: 0.01744
Epoch: 9082, Training Loss: 0.02172
Epoch: 9083, Training Loss: 0.01632
Epoch: 9083, Training Loss: 0.01741
Epoch: 9083, Training Loss: 0.01744
Epoch: 9083, Training Loss: 0.02171
Epoch: 9084, Training Loss: 0.01632
Epoch: 9084, Training Loss: 0.01741
Epoch: 9084, Training Loss: 0.01743
Epoch: 9084, Training Loss: 0.02171
Epoch: 9085, Training Loss: 0.01632
Epoch: 9085, Training Loss: 0.01741
Epoch: 9085, Training Loss: 0.01743
Epoch: 9085, Training Loss: 0.02171
Epoch: 9086, Training Loss: 0.01631
Epoch: 9086, Training Loss: 0.01740
Epoch: 9086, Training Loss: 0.01743
Epoch: 9086, Training Loss: 0.02171
Epoch: 9087, Training Loss: 0.01631
Epoch: 9087, Training Loss: 0.01740
Epoch: 9087, Training Loss: 0.01743
Epoch: 9087, Training Loss: 0.02171
Epoch: 9088, Training Loss: 0.01631
Epoch: 9088, Training Loss: 0.01740
Epoch: 9088, Training Loss: 0.01743
Epoch: 9088, Training Loss: 0.02170
Epoch: 9089, Training Loss: 0.01631
Epoch: 9089, Training Loss: 0.01740
Epoch: 9089, Training Loss: 0.01743
Epoch: 9089, Training Loss: 0.02170
Epoch: 9090, Training Loss: 0.01631
Epoch: 9090, Training Loss: 0.01740
Epoch: 9090, Training Loss: 0.01743
Epoch: 9090, Training Loss: 0.02170
Epoch: 9091, Training Loss: 0.01631
Epoch: 9091, Training Loss: 0.01740
Epoch: 9091, Training Loss: 0.01742
Epoch: 9091, Training Loss: 0.02170
Epoch: 9092, Training Loss: 0.01631
Epoch: 9092, Training Loss: 0.01740
Epoch: 9092, Training Loss: 0.01742
Epoch: 9092, Training Loss: 0.02170
Epoch: 9093, Training Loss: 0.01631
Epoch: 9093, Training Loss: 0.01739
Epoch: 9093, Training Loss: 0.01742
Epoch: 9093, Training Loss: 0.02170
Epoch: 9094, Training Loss: 0.01630
Epoch: 9094, Training Loss: 0.01739
Epoch: 9094, Training Loss: 0.01742
Epoch: 9094, Training Loss: 0.02169
Epoch: 9095, Training Loss: 0.01630
Epoch: 9095, Training Loss: 0.01739
Epoch: 9095, Training Loss: 0.01742
Epoch: 9095, Training Loss: 0.02169
Epoch: 9096, Training Loss: 0.01630
Epoch: 9096, Training Loss: 0.01739
Epoch: 9096, Training Loss: 0.01742
Epoch: 9096, Training Loss: 0.02169
Epoch: 9097, Training Loss: 0.01630
Epoch: 9097, Training Loss: 0.01739
Epoch: 9097, Training Loss: 0.01742
Epoch: 9097, Training Loss: 0.02169
Epoch: 9098, Training Loss: 0.01630
Epoch: 9098, Training Loss: 0.01739
Epoch: 9098, Training Loss: 0.01742
Epoch: 9098, Training Loss: 0.02169
Epoch: 9099, Training Loss: 0.01630
Epoch: 9099, Training Loss: 0.01739
Epoch: 9099, Training Loss: 0.01741
Epoch: 9099, Training Loss: 0.02169
Epoch: 9100, Training Loss: 0.01630
Epoch: 9100, Training Loss: 0.01738
Epoch: 9100, Training Loss: 0.01741
Epoch: 9100, Training Loss: 0.02168
Epoch: 9101, Training Loss: 0.01630
Epoch: 9101, Training Loss: 0.01738
Epoch: 9101, Training Loss: 0.01741
Epoch: 9101, Training Loss: 0.02168
Epoch: 9102, Training Loss: 0.01629
Epoch: 9102, Training Loss: 0.01738
Epoch: 9102, Training Loss: 0.01741
Epoch: 9102, Training Loss: 0.02168
Epoch: 9103, Training Loss: 0.01629
Epoch: 9103, Training Loss: 0.01738
Epoch: 9103, Training Loss: 0.01741
Epoch: 9103, Training Loss: 0.02168
Epoch: 9104, Training Loss: 0.01629
Epoch: 9104, Training Loss: 0.01738
Epoch: 9104, Training Loss: 0.01741
Epoch: 9104, Training Loss: 0.02168
Epoch: 9105, Training Loss: 0.01629
Epoch: 9105, Training Loss: 0.01738
Epoch: 9105, Training Loss: 0.01741
Epoch: 9105, Training Loss: 0.02168
Epoch: 9106, Training Loss: 0.01629
Epoch: 9106, Training Loss: 0.01738
Epoch: 9106, Training Loss: 0.01740
...
Epoch: 9356, Training Loss: 0.01707
(repetitive output condensed: the loop prints one loss line per training sample per epoch; between epochs 9106 and 9356 the four per-sample losses decrease very slowly, from roughly 0.0163 / 0.0174 / 0.0174 / 0.0217 to about 0.0160 / 0.0171 / 0.0171 / 0.0213)
Epoch: 9356, Training Loss: 0.02125
Epoch: 9357, Training Loss: 0.01599
Epoch: 9357, Training Loss: 0.01704
Epoch: 9357, Training Loss: 0.01707
Epoch: 9357, Training Loss: 0.02125
Epoch: 9358, Training Loss: 0.01599
Epoch: 9358, Training Loss: 0.01704
Epoch: 9358, Training Loss: 0.01707
Epoch: 9358, Training Loss: 0.02125
Epoch: 9359, Training Loss: 0.01599
Epoch: 9359, Training Loss: 0.01704
Epoch: 9359, Training Loss: 0.01707
Epoch: 9359, Training Loss: 0.02125
Epoch: 9360, Training Loss: 0.01599
Epoch: 9360, Training Loss: 0.01704
Epoch: 9360, Training Loss: 0.01707
Epoch: 9360, Training Loss: 0.02124
Epoch: 9361, Training Loss: 0.01599
Epoch: 9361, Training Loss: 0.01704
Epoch: 9361, Training Loss: 0.01707
Epoch: 9361, Training Loss: 0.02124
Epoch: 9362, Training Loss: 0.01599
Epoch: 9362, Training Loss: 0.01704
Epoch: 9362, Training Loss: 0.01707
Epoch: 9362, Training Loss: 0.02124
Epoch: 9363, Training Loss: 0.01599
Epoch: 9363, Training Loss: 0.01704
Epoch: 9363, Training Loss: 0.01706
Epoch: 9363, Training Loss: 0.02124
Epoch: 9364, Training Loss: 0.01598
Epoch: 9364, Training Loss: 0.01704
Epoch: 9364, Training Loss: 0.01706
Epoch: 9364, Training Loss: 0.02124
Epoch: 9365, Training Loss: 0.01598
Epoch: 9365, Training Loss: 0.01703
Epoch: 9365, Training Loss: 0.01706
Epoch: 9365, Training Loss: 0.02124
Epoch: 9366, Training Loss: 0.01598
Epoch: 9366, Training Loss: 0.01703
Epoch: 9366, Training Loss: 0.01706
Epoch: 9366, Training Loss: 0.02123
Epoch: 9367, Training Loss: 0.01598
Epoch: 9367, Training Loss: 0.01703
Epoch: 9367, Training Loss: 0.01706
Epoch: 9367, Training Loss: 0.02123
Epoch: 9368, Training Loss: 0.01598
Epoch: 9368, Training Loss: 0.01703
Epoch: 9368, Training Loss: 0.01706
Epoch: 9368, Training Loss: 0.02123
Epoch: 9369, Training Loss: 0.01598
Epoch: 9369, Training Loss: 0.01703
Epoch: 9369, Training Loss: 0.01706
Epoch: 9369, Training Loss: 0.02123
Epoch: 9370, Training Loss: 0.01598
Epoch: 9370, Training Loss: 0.01703
Epoch: 9370, Training Loss: 0.01706
Epoch: 9370, Training Loss: 0.02123
Epoch: 9371, Training Loss: 0.01598
Epoch: 9371, Training Loss: 0.01703
Epoch: 9371, Training Loss: 0.01705
Epoch: 9371, Training Loss: 0.02123
Epoch: 9372, Training Loss: 0.01597
Epoch: 9372, Training Loss: 0.01703
Epoch: 9372, Training Loss: 0.01705
Epoch: 9372, Training Loss: 0.02122
Epoch: 9373, Training Loss: 0.01597
Epoch: 9373, Training Loss: 0.01702
Epoch: 9373, Training Loss: 0.01705
Epoch: 9373, Training Loss: 0.02122
Epoch: 9374, Training Loss: 0.01597
Epoch: 9374, Training Loss: 0.01702
Epoch: 9374, Training Loss: 0.01705
Epoch: 9374, Training Loss: 0.02122
Epoch: 9375, Training Loss: 0.01597
Epoch: 9375, Training Loss: 0.01702
Epoch: 9375, Training Loss: 0.01705
Epoch: 9375, Training Loss: 0.02122
Epoch: 9376, Training Loss: 0.01597
Epoch: 9376, Training Loss: 0.01702
Epoch: 9376, Training Loss: 0.01705
Epoch: 9376, Training Loss: 0.02122
Epoch: 9377, Training Loss: 0.01597
Epoch: 9377, Training Loss: 0.01702
Epoch: 9377, Training Loss: 0.01705
Epoch: 9377, Training Loss: 0.02122
Epoch: 9378, Training Loss: 0.01597
Epoch: 9378, Training Loss: 0.01702
Epoch: 9378, Training Loss: 0.01704
Epoch: 9378, Training Loss: 0.02121
Epoch: 9379, Training Loss: 0.01597
Epoch: 9379, Training Loss: 0.01702
Epoch: 9379, Training Loss: 0.01704
Epoch: 9379, Training Loss: 0.02121
Epoch: 9380, Training Loss: 0.01597
Epoch: 9380, Training Loss: 0.01702
Epoch: 9380, Training Loss: 0.01704
Epoch: 9380, Training Loss: 0.02121
Epoch: 9381, Training Loss: 0.01596
Epoch: 9381, Training Loss: 0.01701
Epoch: 9381, Training Loss: 0.01704
Epoch: 9381, Training Loss: 0.02121
Epoch: 9382, Training Loss: 0.01596
Epoch: 9382, Training Loss: 0.01701
Epoch: 9382, Training Loss: 0.01704
Epoch: 9382, Training Loss: 0.02121
Epoch: 9383, Training Loss: 0.01596
Epoch: 9383, Training Loss: 0.01701
Epoch: 9383, Training Loss: 0.01704
Epoch: 9383, Training Loss: 0.02121
Epoch: 9384, Training Loss: 0.01596
Epoch: 9384, Training Loss: 0.01701
Epoch: 9384, Training Loss: 0.01704
Epoch: 9384, Training Loss: 0.02120
Epoch: 9385, Training Loss: 0.01596
Epoch: 9385, Training Loss: 0.01701
Epoch: 9385, Training Loss: 0.01704
Epoch: 9385, Training Loss: 0.02120
Epoch: 9386, Training Loss: 0.01596
Epoch: 9386, Training Loss: 0.01701
Epoch: 9386, Training Loss: 0.01703
Epoch: 9386, Training Loss: 0.02120
Epoch: 9387, Training Loss: 0.01596
Epoch: 9387, Training Loss: 0.01701
Epoch: 9387, Training Loss: 0.01703
Epoch: 9387, Training Loss: 0.02120
Epoch: 9388, Training Loss: 0.01596
Epoch: 9388, Training Loss: 0.01701
Epoch: 9388, Training Loss: 0.01703
Epoch: 9388, Training Loss: 0.02120
Epoch: 9389, Training Loss: 0.01596
Epoch: 9389, Training Loss: 0.01700
Epoch: 9389, Training Loss: 0.01703
Epoch: 9389, Training Loss: 0.02120
Epoch: 9390, Training Loss: 0.01595
Epoch: 9390, Training Loss: 0.01700
Epoch: 9390, Training Loss: 0.01703
Epoch: 9390, Training Loss: 0.02119
Epoch: 9391, Training Loss: 0.01595
Epoch: 9391, Training Loss: 0.01700
Epoch: 9391, Training Loss: 0.01703
Epoch: 9391, Training Loss: 0.02119
Epoch: 9392, Training Loss: 0.01595
Epoch: 9392, Training Loss: 0.01700
Epoch: 9392, Training Loss: 0.01703
Epoch: 9392, Training Loss: 0.02119
Epoch: 9393, Training Loss: 0.01595
Epoch: 9393, Training Loss: 0.01700
Epoch: 9393, Training Loss: 0.01703
Epoch: 9393, Training Loss: 0.02119
Epoch: 9394, Training Loss: 0.01595
Epoch: 9394, Training Loss: 0.01700
Epoch: 9394, Training Loss: 0.01702
Epoch: 9394, Training Loss: 0.02119
Epoch: 9395, Training Loss: 0.01595
Epoch: 9395, Training Loss: 0.01700
Epoch: 9395, Training Loss: 0.01702
Epoch: 9395, Training Loss: 0.02119
Epoch: 9396, Training Loss: 0.01595
Epoch: 9396, Training Loss: 0.01700
Epoch: 9396, Training Loss: 0.01702
Epoch: 9396, Training Loss: 0.02118
Epoch: 9397, Training Loss: 0.01595
Epoch: 9397, Training Loss: 0.01699
Epoch: 9397, Training Loss: 0.01702
Epoch: 9397, Training Loss: 0.02118
Epoch: 9398, Training Loss: 0.01595
Epoch: 9398, Training Loss: 0.01699
Epoch: 9398, Training Loss: 0.01702
Epoch: 9398, Training Loss: 0.02118
Epoch: 9399, Training Loss: 0.01594
Epoch: 9399, Training Loss: 0.01699
Epoch: 9399, Training Loss: 0.01702
Epoch: 9399, Training Loss: 0.02118
Epoch: 9400, Training Loss: 0.01594
Epoch: 9400, Training Loss: 0.01699
Epoch: 9400, Training Loss: 0.01702
Epoch: 9400, Training Loss: 0.02118
Epoch: 9401, Training Loss: 0.01594
Epoch: 9401, Training Loss: 0.01699
Epoch: 9401, Training Loss: 0.01702
Epoch: 9401, Training Loss: 0.02118
Epoch: 9402, Training Loss: 0.01594
Epoch: 9402, Training Loss: 0.01699
Epoch: 9402, Training Loss: 0.01701
Epoch: 9402, Training Loss: 0.02117
Epoch: 9403, Training Loss: 0.01594
Epoch: 9403, Training Loss: 0.01699
Epoch: 9403, Training Loss: 0.01701
Epoch: 9403, Training Loss: 0.02117
Epoch: 9404, Training Loss: 0.01594
Epoch: 9404, Training Loss: 0.01698
Epoch: 9404, Training Loss: 0.01701
Epoch: 9404, Training Loss: 0.02117
Epoch: 9405, Training Loss: 0.01594
Epoch: 9405, Training Loss: 0.01698
Epoch: 9405, Training Loss: 0.01701
Epoch: 9405, Training Loss: 0.02117
Epoch: 9406, Training Loss: 0.01594
Epoch: 9406, Training Loss: 0.01698
Epoch: 9406, Training Loss: 0.01701
Epoch: 9406, Training Loss: 0.02117
Epoch: 9407, Training Loss: 0.01593
Epoch: 9407, Training Loss: 0.01698
Epoch: 9407, Training Loss: 0.01701
Epoch: 9407, Training Loss: 0.02117
Epoch: 9408, Training Loss: 0.01593
Epoch: 9408, Training Loss: 0.01698
Epoch: 9408, Training Loss: 0.01701
Epoch: 9408, Training Loss: 0.02117
Epoch: 9409, Training Loss: 0.01593
Epoch: 9409, Training Loss: 0.01698
Epoch: 9409, Training Loss: 0.01701
Epoch: 9409, Training Loss: 0.02116
Epoch: 9410, Training Loss: 0.01593
Epoch: 9410, Training Loss: 0.01698
Epoch: 9410, Training Loss: 0.01700
Epoch: 9410, Training Loss: 0.02116
Epoch: 9411, Training Loss: 0.01593
Epoch: 9411, Training Loss: 0.01698
Epoch: 9411, Training Loss: 0.01700
Epoch: 9411, Training Loss: 0.02116
Epoch: 9412, Training Loss: 0.01593
Epoch: 9412, Training Loss: 0.01697
Epoch: 9412, Training Loss: 0.01700
Epoch: 9412, Training Loss: 0.02116
Epoch: 9413, Training Loss: 0.01593
Epoch: 9413, Training Loss: 0.01697
Epoch: 9413, Training Loss: 0.01700
Epoch: 9413, Training Loss: 0.02116
Epoch: 9414, Training Loss: 0.01593
Epoch: 9414, Training Loss: 0.01697
Epoch: 9414, Training Loss: 0.01700
Epoch: 9414, Training Loss: 0.02116
Epoch: 9415, Training Loss: 0.01593
Epoch: 9415, Training Loss: 0.01697
Epoch: 9415, Training Loss: 0.01700
Epoch: 9415, Training Loss: 0.02115
Epoch: 9416, Training Loss: 0.01592
Epoch: 9416, Training Loss: 0.01697
Epoch: 9416, Training Loss: 0.01700
Epoch: 9416, Training Loss: 0.02115
Epoch: 9417, Training Loss: 0.01592
Epoch: 9417, Training Loss: 0.01697
Epoch: 9417, Training Loss: 0.01700
Epoch: 9417, Training Loss: 0.02115
Epoch: 9418, Training Loss: 0.01592
Epoch: 9418, Training Loss: 0.01697
Epoch: 9418, Training Loss: 0.01699
Epoch: 9418, Training Loss: 0.02115
Epoch: 9419, Training Loss: 0.01592
Epoch: 9419, Training Loss: 0.01697
Epoch: 9419, Training Loss: 0.01699
Epoch: 9419, Training Loss: 0.02115
Epoch: 9420, Training Loss: 0.01592
Epoch: 9420, Training Loss: 0.01696
Epoch: 9420, Training Loss: 0.01699
Epoch: 9420, Training Loss: 0.02115
Epoch: 9421, Training Loss: 0.01592
Epoch: 9421, Training Loss: 0.01696
Epoch: 9421, Training Loss: 0.01699
Epoch: 9421, Training Loss: 0.02114
Epoch: 9422, Training Loss: 0.01592
Epoch: 9422, Training Loss: 0.01696
Epoch: 9422, Training Loss: 0.01699
Epoch: 9422, Training Loss: 0.02114
Epoch: 9423, Training Loss: 0.01592
Epoch: 9423, Training Loss: 0.01696
Epoch: 9423, Training Loss: 0.01699
Epoch: 9423, Training Loss: 0.02114
Epoch: 9424, Training Loss: 0.01592
Epoch: 9424, Training Loss: 0.01696
Epoch: 9424, Training Loss: 0.01699
Epoch: 9424, Training Loss: 0.02114
Epoch: 9425, Training Loss: 0.01591
Epoch: 9425, Training Loss: 0.01696
Epoch: 9425, Training Loss: 0.01698
Epoch: 9425, Training Loss: 0.02114
Epoch: 9426, Training Loss: 0.01591
Epoch: 9426, Training Loss: 0.01696
Epoch: 9426, Training Loss: 0.01698
Epoch: 9426, Training Loss: 0.02114
Epoch: 9427, Training Loss: 0.01591
Epoch: 9427, Training Loss: 0.01696
Epoch: 9427, Training Loss: 0.01698
Epoch: 9427, Training Loss: 0.02113
Epoch: 9428, Training Loss: 0.01591
Epoch: 9428, Training Loss: 0.01695
Epoch: 9428, Training Loss: 0.01698
Epoch: 9428, Training Loss: 0.02113
Epoch: 9429, Training Loss: 0.01591
Epoch: 9429, Training Loss: 0.01695
Epoch: 9429, Training Loss: 0.01698
Epoch: 9429, Training Loss: 0.02113
Epoch: 9430, Training Loss: 0.01591
Epoch: 9430, Training Loss: 0.01695
Epoch: 9430, Training Loss: 0.01698
Epoch: 9430, Training Loss: 0.02113
Epoch: 9431, Training Loss: 0.01591
Epoch: 9431, Training Loss: 0.01695
Epoch: 9431, Training Loss: 0.01698
Epoch: 9431, Training Loss: 0.02113
Epoch: 9432, Training Loss: 0.01591
Epoch: 9432, Training Loss: 0.01695
Epoch: 9432, Training Loss: 0.01698
Epoch: 9432, Training Loss: 0.02113
Epoch: 9433, Training Loss: 0.01591
Epoch: 9433, Training Loss: 0.01695
Epoch: 9433, Training Loss: 0.01697
Epoch: 9433, Training Loss: 0.02112
Epoch: 9434, Training Loss: 0.01590
Epoch: 9434, Training Loss: 0.01695
Epoch: 9434, Training Loss: 0.01697
Epoch: 9434, Training Loss: 0.02112
Epoch: 9435, Training Loss: 0.01590
Epoch: 9435, Training Loss: 0.01695
Epoch: 9435, Training Loss: 0.01697
Epoch: 9435, Training Loss: 0.02112
Epoch: 9436, Training Loss: 0.01590
Epoch: 9436, Training Loss: 0.01694
Epoch: 9436, Training Loss: 0.01697
Epoch: 9436, Training Loss: 0.02112
Epoch: 9437, Training Loss: 0.01590
Epoch: 9437, Training Loss: 0.01694
Epoch: 9437, Training Loss: 0.01697
Epoch: 9437, Training Loss: 0.02112
Epoch: 9438, Training Loss: 0.01590
Epoch: 9438, Training Loss: 0.01694
Epoch: 9438, Training Loss: 0.01697
Epoch: 9438, Training Loss: 0.02112
Epoch: 9439, Training Loss: 0.01590
Epoch: 9439, Training Loss: 0.01694
Epoch: 9439, Training Loss: 0.01697
Epoch: 9439, Training Loss: 0.02111
Epoch: 9440, Training Loss: 0.01590
Epoch: 9440, Training Loss: 0.01694
Epoch: 9440, Training Loss: 0.01697
Epoch: 9440, Training Loss: 0.02111
Epoch: 9441, Training Loss: 0.01590
Epoch: 9441, Training Loss: 0.01694
Epoch: 9441, Training Loss: 0.01696
Epoch: 9441, Training Loss: 0.02111
Epoch: 9442, Training Loss: 0.01589
Epoch: 9442, Training Loss: 0.01694
Epoch: 9442, Training Loss: 0.01696
Epoch: 9442, Training Loss: 0.02111
Epoch: 9443, Training Loss: 0.01589
Epoch: 9443, Training Loss: 0.01694
Epoch: 9443, Training Loss: 0.01696
Epoch: 9443, Training Loss: 0.02111
Epoch: 9444, Training Loss: 0.01589
Epoch: 9444, Training Loss: 0.01693
Epoch: 9444, Training Loss: 0.01696
Epoch: 9444, Training Loss: 0.02111
Epoch: 9445, Training Loss: 0.01589
Epoch: 9445, Training Loss: 0.01693
Epoch: 9445, Training Loss: 0.01696
Epoch: 9445, Training Loss: 0.02111
Epoch: 9446, Training Loss: 0.01589
Epoch: 9446, Training Loss: 0.01693
Epoch: 9446, Training Loss: 0.01696
Epoch: 9446, Training Loss: 0.02110
Epoch: 9447, Training Loss: 0.01589
Epoch: 9447, Training Loss: 0.01693
Epoch: 9447, Training Loss: 0.01696
Epoch: 9447, Training Loss: 0.02110
Epoch: 9448, Training Loss: 0.01589
Epoch: 9448, Training Loss: 0.01693
Epoch: 9448, Training Loss: 0.01696
Epoch: 9448, Training Loss: 0.02110
Epoch: 9449, Training Loss: 0.01589
Epoch: 9449, Training Loss: 0.01693
Epoch: 9449, Training Loss: 0.01695
Epoch: 9449, Training Loss: 0.02110
Epoch: 9450, Training Loss: 0.01589
Epoch: 9450, Training Loss: 0.01693
Epoch: 9450, Training Loss: 0.01695
Epoch: 9450, Training Loss: 0.02110
Epoch: 9451, Training Loss: 0.01588
Epoch: 9451, Training Loss: 0.01693
Epoch: 9451, Training Loss: 0.01695
Epoch: 9451, Training Loss: 0.02110
Epoch: 9452, Training Loss: 0.01588
Epoch: 9452, Training Loss: 0.01692
Epoch: 9452, Training Loss: 0.01695
Epoch: 9452, Training Loss: 0.02109
Epoch: 9453, Training Loss: 0.01588
Epoch: 9453, Training Loss: 0.01692
Epoch: 9453, Training Loss: 0.01695
Epoch: 9453, Training Loss: 0.02109
Epoch: 9454, Training Loss: 0.01588
Epoch: 9454, Training Loss: 0.01692
Epoch: 9454, Training Loss: 0.01695
Epoch: 9454, Training Loss: 0.02109
Epoch: 9455, Training Loss: 0.01588
Epoch: 9455, Training Loss: 0.01692
Epoch: 9455, Training Loss: 0.01695
Epoch: 9455, Training Loss: 0.02109
Epoch: 9456, Training Loss: 0.01588
Epoch: 9456, Training Loss: 0.01692
Epoch: 9456, Training Loss: 0.01695
Epoch: 9456, Training Loss: 0.02109
Epoch: 9457, Training Loss: 0.01588
Epoch: 9457, Training Loss: 0.01692
Epoch: 9457, Training Loss: 0.01694
Epoch: 9457, Training Loss: 0.02109
Epoch: 9458, Training Loss: 0.01588
Epoch: 9458, Training Loss: 0.01692
Epoch: 9458, Training Loss: 0.01694
Epoch: 9458, Training Loss: 0.02108
Epoch: 9459, Training Loss: 0.01588
Epoch: 9459, Training Loss: 0.01692
Epoch: 9459, Training Loss: 0.01694
Epoch: 9459, Training Loss: 0.02108
Epoch: 9460, Training Loss: 0.01587
Epoch: 9460, Training Loss: 0.01691
Epoch: 9460, Training Loss: 0.01694
Epoch: 9460, Training Loss: 0.02108
Epoch: 9461, Training Loss: 0.01587
Epoch: 9461, Training Loss: 0.01691
Epoch: 9461, Training Loss: 0.01694
Epoch: 9461, Training Loss: 0.02108
Epoch: 9462, Training Loss: 0.01587
Epoch: 9462, Training Loss: 0.01691
Epoch: 9462, Training Loss: 0.01694
Epoch: 9462, Training Loss: 0.02108
Epoch: 9463, Training Loss: 0.01587
Epoch: 9463, Training Loss: 0.01691
Epoch: 9463, Training Loss: 0.01694
Epoch: 9463, Training Loss: 0.02108
Epoch: 9464, Training Loss: 0.01587
Epoch: 9464, Training Loss: 0.01691
Epoch: 9464, Training Loss: 0.01694
Epoch: 9464, Training Loss: 0.02107
Epoch: 9465, Training Loss: 0.01587
Epoch: 9465, Training Loss: 0.01691
Epoch: 9465, Training Loss: 0.01693
Epoch: 9465, Training Loss: 0.02107
Epoch: 9466, Training Loss: 0.01587
Epoch: 9466, Training Loss: 0.01691
Epoch: 9466, Training Loss: 0.01693
Epoch: 9466, Training Loss: 0.02107
Epoch: 9467, Training Loss: 0.01587
Epoch: 9467, Training Loss: 0.01691
Epoch: 9467, Training Loss: 0.01693
Epoch: 9467, Training Loss: 0.02107
Epoch: 9468, Training Loss: 0.01587
Epoch: 9468, Training Loss: 0.01690
Epoch: 9468, Training Loss: 0.01693
Epoch: 9468, Training Loss: 0.02107
Epoch: 9469, Training Loss: 0.01586
Epoch: 9469, Training Loss: 0.01690
Epoch: 9469, Training Loss: 0.01693
Epoch: 9469, Training Loss: 0.02107
Epoch: 9470, Training Loss: 0.01586
Epoch: 9470, Training Loss: 0.01690
Epoch: 9470, Training Loss: 0.01693
Epoch: 9470, Training Loss: 0.02106
Epoch: 9471, Training Loss: 0.01586
Epoch: 9471, Training Loss: 0.01690
Epoch: 9471, Training Loss: 0.01693
Epoch: 9471, Training Loss: 0.02106
Epoch: 9472, Training Loss: 0.01586
Epoch: 9472, Training Loss: 0.01690
Epoch: 9472, Training Loss: 0.01693
Epoch: 9472, Training Loss: 0.02106
Epoch: 9473, Training Loss: 0.01586
Epoch: 9473, Training Loss: 0.01690
Epoch: 9473, Training Loss: 0.01692
Epoch: 9473, Training Loss: 0.02106
Epoch: 9474, Training Loss: 0.01586
Epoch: 9474, Training Loss: 0.01690
Epoch: 9474, Training Loss: 0.01692
Epoch: 9474, Training Loss: 0.02106
Epoch: 9475, Training Loss: 0.01586
Epoch: 9475, Training Loss: 0.01690
Epoch: 9475, Training Loss: 0.01692
Epoch: 9475, Training Loss: 0.02106
Epoch: 9476, Training Loss: 0.01586
Epoch: 9476, Training Loss: 0.01689
Epoch: 9476, Training Loss: 0.01692
Epoch: 9476, Training Loss: 0.02106
Epoch: 9477, Training Loss: 0.01586
Epoch: 9477, Training Loss: 0.01689
Epoch: 9477, Training Loss: 0.01692
Epoch: 9477, Training Loss: 0.02105
Epoch: 9478, Training Loss: 0.01585
Epoch: 9478, Training Loss: 0.01689
Epoch: 9478, Training Loss: 0.01692
Epoch: 9478, Training Loss: 0.02105
Epoch: 9479, Training Loss: 0.01585
Epoch: 9479, Training Loss: 0.01689
Epoch: 9479, Training Loss: 0.01692
Epoch: 9479, Training Loss: 0.02105
Epoch: 9480, Training Loss: 0.01585
Epoch: 9480, Training Loss: 0.01689
Epoch: 9480, Training Loss: 0.01692
Epoch: 9480, Training Loss: 0.02105
Epoch: 9481, Training Loss: 0.01585
Epoch: 9481, Training Loss: 0.01689
Epoch: 9481, Training Loss: 0.01691
Epoch: 9481, Training Loss: 0.02105
Epoch: 9482, Training Loss: 0.01585
Epoch: 9482, Training Loss: 0.01689
Epoch: 9482, Training Loss: 0.01691
Epoch: 9482, Training Loss: 0.02105
Epoch: 9483, Training Loss: 0.01585
Epoch: 9483, Training Loss: 0.01689
Epoch: 9483, Training Loss: 0.01691
Epoch: 9483, Training Loss: 0.02104
Epoch: 9484, Training Loss: 0.01585
Epoch: 9484, Training Loss: 0.01688
Epoch: 9484, Training Loss: 0.01691
Epoch: 9484, Training Loss: 0.02104
Epoch: 9485, Training Loss: 0.01585
Epoch: 9485, Training Loss: 0.01688
Epoch: 9485, Training Loss: 0.01691
Epoch: 9485, Training Loss: 0.02104
Epoch: 9486, Training Loss: 0.01585
Epoch: 9486, Training Loss: 0.01688
Epoch: 9486, Training Loss: 0.01691
Epoch: 9486, Training Loss: 0.02104
Epoch: 9487, Training Loss: 0.01584
Epoch: 9487, Training Loss: 0.01688
Epoch: 9487, Training Loss: 0.01691
Epoch: 9487, Training Loss: 0.02104
Epoch: 9488, Training Loss: 0.01584
Epoch: 9488, Training Loss: 0.01688
Epoch: 9488, Training Loss: 0.01691
Epoch: 9488, Training Loss: 0.02104
Epoch: 9489, Training Loss: 0.01584
Epoch: 9489, Training Loss: 0.01688
Epoch: 9489, Training Loss: 0.01690
Epoch: 9489, Training Loss: 0.02103
Epoch: 9490, Training Loss: 0.01584
Epoch: 9490, Training Loss: 0.01688
Epoch: 9490, Training Loss: 0.01690
Epoch: 9490, Training Loss: 0.02103
Epoch: 9491, Training Loss: 0.01584
Epoch: 9491, Training Loss: 0.01688
Epoch: 9491, Training Loss: 0.01690
Epoch: 9491, Training Loss: 0.02103
Epoch: 9492, Training Loss: 0.01584
Epoch: 9492, Training Loss: 0.01687
Epoch: 9492, Training Loss: 0.01690
Epoch: 9492, Training Loss: 0.02103
Epoch: 9493, Training Loss: 0.01584
Epoch: 9493, Training Loss: 0.01687
Epoch: 9493, Training Loss: 0.01690
Epoch: 9493, Training Loss: 0.02103
Epoch: 9494, Training Loss: 0.01584
Epoch: 9494, Training Loss: 0.01687
Epoch: 9494, Training Loss: 0.01690
Epoch: 9494, Training Loss: 0.02103
Epoch: 9495, Training Loss: 0.01584
Epoch: 9495, Training Loss: 0.01687
Epoch: 9495, Training Loss: 0.01690
Epoch: 9495, Training Loss: 0.02102
Epoch: 9496, Training Loss: 0.01583
Epoch: 9496, Training Loss: 0.01687
Epoch: 9496, Training Loss: 0.01690
Epoch: 9496, Training Loss: 0.02102
Epoch: 9497, Training Loss: 0.01583
Epoch: 9497, Training Loss: 0.01687
Epoch: 9497, Training Loss: 0.01689
Epoch: 9497, Training Loss: 0.02102
Epoch: 9498, Training Loss: 0.01583
Epoch: 9498, Training Loss: 0.01687
Epoch: 9498, Training Loss: 0.01689
Epoch: 9498, Training Loss: 0.02102
Epoch: 9499, Training Loss: 0.01583
Epoch: 9499, Training Loss: 0.01687
Epoch: 9499, Training Loss: 0.01689
Epoch: 9499, Training Loss: 0.02102
Epoch: 9500, Training Loss: 0.01583
Epoch: 9500, Training Loss: 0.01686
Epoch: 9500, Training Loss: 0.01689
Epoch: 9500, Training Loss: 0.02102
Epoch: 9501, Training Loss: 0.01583
Epoch: 9501, Training Loss: 0.01686
Epoch: 9501, Training Loss: 0.01689
Epoch: 9501, Training Loss: 0.02102
Epoch: 9502, Training Loss: 0.01583
Epoch: 9502, Training Loss: 0.01686
Epoch: 9502, Training Loss: 0.01689
Epoch: 9502, Training Loss: 0.02101
Epoch: 9503, Training Loss: 0.01583
Epoch: 9503, Training Loss: 0.01686
Epoch: 9503, Training Loss: 0.01689
Epoch: 9503, Training Loss: 0.02101
Epoch: 9504, Training Loss: 0.01583
Epoch: 9504, Training Loss: 0.01686
Epoch: 9504, Training Loss: 0.01689
Epoch: 9504, Training Loss: 0.02101
Epoch: 9505, Training Loss: 0.01582
Epoch: 9505, Training Loss: 0.01686
Epoch: 9505, Training Loss: 0.01688
Epoch: 9505, Training Loss: 0.02101
Epoch: 9506, Training Loss: 0.01582
Epoch: 9506, Training Loss: 0.01686
Epoch: 9506, Training Loss: 0.01688
Epoch: 9506, Training Loss: 0.02101
Epoch: 9507, Training Loss: 0.01582
Epoch: 9507, Training Loss: 0.01686
Epoch: 9507, Training Loss: 0.01688
Epoch: 9507, Training Loss: 0.02101
Epoch: 9508, Training Loss: 0.01582
Epoch: 9508, Training Loss: 0.01685
Epoch: 9508, Training Loss: 0.01688
Epoch: 9508, Training Loss: 0.02100
Epoch: 9509, Training Loss: 0.01582
Epoch: 9509, Training Loss: 0.01685
Epoch: 9509, Training Loss: 0.01688
Epoch: 9509, Training Loss: 0.02100
Epoch: 9510, Training Loss: 0.01582
Epoch: 9510, Training Loss: 0.01685
Epoch: 9510, Training Loss: 0.01688
Epoch: 9510, Training Loss: 0.02100
Epoch: 9511, Training Loss: 0.01582
Epoch: 9511, Training Loss: 0.01685
Epoch: 9511, Training Loss: 0.01688
Epoch: 9511, Training Loss: 0.02100
Epoch: 9512, Training Loss: 0.01582
Epoch: 9512, Training Loss: 0.01685
Epoch: 9512, Training Loss: 0.01688
Epoch: 9512, Training Loss: 0.02100
Epoch: 9513, Training Loss: 0.01582
Epoch: 9513, Training Loss: 0.01685
Epoch: 9513, Training Loss: 0.01687
Epoch: 9513, Training Loss: 0.02100
Epoch: 9514, Training Loss: 0.01581
Epoch: 9514, Training Loss: 0.01685
Epoch: 9514, Training Loss: 0.01687
Epoch: 9514, Training Loss: 0.02099
Epoch: 9515, Training Loss: 0.01581
Epoch: 9515, Training Loss: 0.01685
Epoch: 9515, Training Loss: 0.01687
Epoch: 9515, Training Loss: 0.02099
Epoch: 9516, Training Loss: 0.01581
Epoch: 9516, Training Loss: 0.01684
Epoch: 9516, Training Loss: 0.01687
Epoch: 9516, Training Loss: 0.02099
Epoch: 9517, Training Loss: 0.01581
Epoch: 9517, Training Loss: 0.01684
Epoch: 9517, Training Loss: 0.01687
Epoch: 9517, Training Loss: 0.02099
Epoch: 9518, Training Loss: 0.01581
Epoch: 9518, Training Loss: 0.01684
Epoch: 9518, Training Loss: 0.01687
Epoch: 9518, Training Loss: 0.02099
Epoch: 9519, Training Loss: 0.01581
Epoch: 9519, Training Loss: 0.01684
Epoch: 9519, Training Loss: 0.01687
Epoch: 9519, Training Loss: 0.02099
Epoch: 9520, Training Loss: 0.01581
Epoch: 9520, Training Loss: 0.01684
Epoch: 9520, Training Loss: 0.01687
Epoch: 9520, Training Loss: 0.02099
Epoch: 9521, Training Loss: 0.01581
Epoch: 9521, Training Loss: 0.01684
Epoch: 9521, Training Loss: 0.01686
Epoch: 9521, Training Loss: 0.02098
Epoch: 9522, Training Loss: 0.01581
Epoch: 9522, Training Loss: 0.01684
Epoch: 9522, Training Loss: 0.01686
Epoch: 9522, Training Loss: 0.02098
Epoch: 9523, Training Loss: 0.01580
Epoch: 9523, Training Loss: 0.01684
Epoch: 9523, Training Loss: 0.01686
Epoch: 9523, Training Loss: 0.02098
Epoch: 9524, Training Loss: 0.01580
Epoch: 9524, Training Loss: 0.01683
Epoch: 9524, Training Loss: 0.01686
Epoch: 9524, Training Loss: 0.02098
Epoch: 9525, Training Loss: 0.01580
Epoch: 9525, Training Loss: 0.01683
Epoch: 9525, Training Loss: 0.01686
Epoch: 9525, Training Loss: 0.02098
Epoch: 9526, Training Loss: 0.01580
Epoch: 9526, Training Loss: 0.01683
Epoch: 9526, Training Loss: 0.01686
Epoch: 9526, Training Loss: 0.02098
Epoch: 9527, Training Loss: 0.01580
Epoch: 9527, Training Loss: 0.01683
Epoch: 9527, Training Loss: 0.01686
Epoch: 9527, Training Loss: 0.02097
Epoch: 9528, Training Loss: 0.01580
Epoch: 9528, Training Loss: 0.01683
Epoch: 9528, Training Loss: 0.01686
Epoch: 9528, Training Loss: 0.02097
Epoch: 9529, Training Loss: 0.01580
Epoch: 9529, Training Loss: 0.01683
Epoch: 9529, Training Loss: 0.01685
Epoch: 9529, Training Loss: 0.02097
Epoch: 9530, Training Loss: 0.01580
Epoch: 9530, Training Loss: 0.01683
Epoch: 9530, Training Loss: 0.01685
Epoch: 9530, Training Loss: 0.02097
Epoch: 9531, Training Loss: 0.01580
Epoch: 9531, Training Loss: 0.01683
Epoch: 9531, Training Loss: 0.01685
Epoch: 9531, Training Loss: 0.02097
Epoch: 9532, Training Loss: 0.01579
Epoch: 9532, Training Loss: 0.01682
Epoch: 9532, Training Loss: 0.01685
Epoch: 9532, Training Loss: 0.02097
Epoch: 9533, Training Loss: 0.01579
Epoch: 9533, Training Loss: 0.01682
Epoch: 9533, Training Loss: 0.01685
Epoch: 9533, Training Loss: 0.02096
Epoch: 9534, Training Loss: 0.01579
Epoch: 9534, Training Loss: 0.01682
Epoch: 9534, Training Loss: 0.01685
Epoch: 9534, Training Loss: 0.02096
Epoch: 9535, Training Loss: 0.01579
Epoch: 9535, Training Loss: 0.01682
Epoch: 9535, Training Loss: 0.01685
Epoch: 9535, Training Loss: 0.02096
Epoch: 9536, Training Loss: 0.01579
Epoch: 9536, Training Loss: 0.01682
Epoch: 9536, Training Loss: 0.01685
Epoch: 9536, Training Loss: 0.02096
Epoch: 9537, Training Loss: 0.01579
Epoch: 9537, Training Loss: 0.01682
Epoch: 9537, Training Loss: 0.01684
Epoch: 9537, Training Loss: 0.02096
Epoch: 9538, Training Loss: 0.01579
Epoch: 9538, Training Loss: 0.01682
Epoch: 9538, Training Loss: 0.01684
Epoch: 9538, Training Loss: 0.02096
Epoch: 9539, Training Loss: 0.01579
Epoch: 9539, Training Loss: 0.01682
Epoch: 9539, Training Loss: 0.01684
Epoch: 9539, Training Loss: 0.02096
Epoch: 9540, Training Loss: 0.01579
Epoch: 9540, Training Loss: 0.01681
Epoch: 9540, Training Loss: 0.01684
Epoch: 9540, Training Loss: 0.02095
Epoch: 9541, Training Loss: 0.01578
Epoch: 9541, Training Loss: 0.01681
Epoch: 9541, Training Loss: 0.01684
Epoch: 9541, Training Loss: 0.02095
Epoch: 9542, Training Loss: 0.01578
Epoch: 9542, Training Loss: 0.01681
Epoch: 9542, Training Loss: 0.01684
Epoch: 9542, Training Loss: 0.02095
Epoch: 9543, Training Loss: 0.01578
Epoch: 9543, Training Loss: 0.01681
Epoch: 9543, Training Loss: 0.01684
Epoch: 9543, Training Loss: 0.02095
Epoch: 9544, Training Loss: 0.01578
Epoch: 9544, Training Loss: 0.01681
Epoch: 9544, Training Loss: 0.01684
Epoch: 9544, Training Loss: 0.02095
Epoch: 9545, Training Loss: 0.01578
Epoch: 9545, Training Loss: 0.01681
Epoch: 9545, Training Loss: 0.01683
Epoch: 9545, Training Loss: 0.02095
Epoch: 9546, Training Loss: 0.01578
Epoch: 9546, Training Loss: 0.01681
Epoch: 9546, Training Loss: 0.01683
Epoch: 9546, Training Loss: 0.02094
Epoch: 9547, Training Loss: 0.01578
Epoch: 9547, Training Loss: 0.01681
Epoch: 9547, Training Loss: 0.01683
Epoch: 9547, Training Loss: 0.02094
Epoch: 9548, Training Loss: 0.01578
Epoch: 9548, Training Loss: 0.01680
Epoch: 9548, Training Loss: 0.01683
Epoch: 9548, Training Loss: 0.02094
Epoch: 9549, Training Loss: 0.01578
Epoch: 9549, Training Loss: 0.01680
Epoch: 9549, Training Loss: 0.01683
Epoch: 9549, Training Loss: 0.02094
Epoch: 9550, Training Loss: 0.01577
Epoch: 9550, Training Loss: 0.01680
Epoch: 9550, Training Loss: 0.01683
Epoch: 9550, Training Loss: 0.02094
Epoch: 9551, Training Loss: 0.01577
Epoch: 9551, Training Loss: 0.01680
Epoch: 9551, Training Loss: 0.01683
Epoch: 9551, Training Loss: 0.02094
...
[Epochs 9552-9799 omitted: the four per-sample losses decrease slowly and monotonically each epoch]
...
Epoch: 9800, Training Loss: 0.01550
Epoch: 9800, Training Loss: 0.01650
Epoch: 9800, Training Loss: 0.01653
Epoch: 9800, Training Loss: 0.02055
Epoch: 9801, Training Loss: 0.01550
Epoch: 9801, Training Loss: 0.01650
Epoch: 9801, Training Loss: 0.01653
Epoch: 9801, Training Loss: 0.02055
Epoch: 9802, Training Loss: 0.01550
Epoch: 9802, Training Loss: 0.01650
Epoch: 9802, Training Loss: 0.01653
Epoch: 9802, Training Loss: 0.02055
Epoch: 9803, Training Loss: 0.01550
Epoch: 9803, Training Loss: 0.01650
Epoch: 9803, Training Loss: 0.01652
Epoch: 9803, Training Loss: 0.02055
Epoch: 9804, Training Loss: 0.01550
Epoch: 9804, Training Loss: 0.01650
Epoch: 9804, Training Loss: 0.01652
Epoch: 9804, Training Loss: 0.02055
Epoch: 9805, Training Loss: 0.01550
Epoch: 9805, Training Loss: 0.01650
Epoch: 9805, Training Loss: 0.01652
Epoch: 9805, Training Loss: 0.02055
Epoch: 9806, Training Loss: 0.01550
Epoch: 9806, Training Loss: 0.01650
Epoch: 9806, Training Loss: 0.01652
Epoch: 9806, Training Loss: 0.02055
Epoch: 9807, Training Loss: 0.01550
Epoch: 9807, Training Loss: 0.01649
Epoch: 9807, Training Loss: 0.01652
Epoch: 9807, Training Loss: 0.02054
Epoch: 9808, Training Loss: 0.01550
Epoch: 9808, Training Loss: 0.01649
Epoch: 9808, Training Loss: 0.01652
Epoch: 9808, Training Loss: 0.02054
Epoch: 9809, Training Loss: 0.01549
Epoch: 9809, Training Loss: 0.01649
Epoch: 9809, Training Loss: 0.01652
Epoch: 9809, Training Loss: 0.02054
Epoch: 9810, Training Loss: 0.01549
Epoch: 9810, Training Loss: 0.01649
Epoch: 9810, Training Loss: 0.01652
Epoch: 9810, Training Loss: 0.02054
Epoch: 9811, Training Loss: 0.01549
Epoch: 9811, Training Loss: 0.01649
Epoch: 9811, Training Loss: 0.01652
Epoch: 9811, Training Loss: 0.02054
Epoch: 9812, Training Loss: 0.01549
Epoch: 9812, Training Loss: 0.01649
Epoch: 9812, Training Loss: 0.01651
Epoch: 9812, Training Loss: 0.02054
Epoch: 9813, Training Loss: 0.01549
Epoch: 9813, Training Loss: 0.01649
Epoch: 9813, Training Loss: 0.01651
Epoch: 9813, Training Loss: 0.02053
Epoch: 9814, Training Loss: 0.01549
Epoch: 9814, Training Loss: 0.01649
Epoch: 9814, Training Loss: 0.01651
Epoch: 9814, Training Loss: 0.02053
Epoch: 9815, Training Loss: 0.01549
Epoch: 9815, Training Loss: 0.01648
Epoch: 9815, Training Loss: 0.01651
Epoch: 9815, Training Loss: 0.02053
Epoch: 9816, Training Loss: 0.01549
Epoch: 9816, Training Loss: 0.01648
Epoch: 9816, Training Loss: 0.01651
Epoch: 9816, Training Loss: 0.02053
Epoch: 9817, Training Loss: 0.01549
Epoch: 9817, Training Loss: 0.01648
Epoch: 9817, Training Loss: 0.01651
Epoch: 9817, Training Loss: 0.02053
Epoch: 9818, Training Loss: 0.01549
Epoch: 9818, Training Loss: 0.01648
Epoch: 9818, Training Loss: 0.01651
Epoch: 9818, Training Loss: 0.02053
Epoch: 9819, Training Loss: 0.01548
Epoch: 9819, Training Loss: 0.01648
Epoch: 9819, Training Loss: 0.01651
Epoch: 9819, Training Loss: 0.02053
Epoch: 9820, Training Loss: 0.01548
Epoch: 9820, Training Loss: 0.01648
Epoch: 9820, Training Loss: 0.01650
Epoch: 9820, Training Loss: 0.02052
Epoch: 9821, Training Loss: 0.01548
Epoch: 9821, Training Loss: 0.01648
Epoch: 9821, Training Loss: 0.01650
Epoch: 9821, Training Loss: 0.02052
Epoch: 9822, Training Loss: 0.01548
Epoch: 9822, Training Loss: 0.01648
Epoch: 9822, Training Loss: 0.01650
Epoch: 9822, Training Loss: 0.02052
Epoch: 9823, Training Loss: 0.01548
Epoch: 9823, Training Loss: 0.01648
Epoch: 9823, Training Loss: 0.01650
Epoch: 9823, Training Loss: 0.02052
Epoch: 9824, Training Loss: 0.01548
Epoch: 9824, Training Loss: 0.01647
Epoch: 9824, Training Loss: 0.01650
Epoch: 9824, Training Loss: 0.02052
Epoch: 9825, Training Loss: 0.01548
Epoch: 9825, Training Loss: 0.01647
Epoch: 9825, Training Loss: 0.01650
Epoch: 9825, Training Loss: 0.02052
Epoch: 9826, Training Loss: 0.01548
Epoch: 9826, Training Loss: 0.01647
Epoch: 9826, Training Loss: 0.01650
Epoch: 9826, Training Loss: 0.02052
Epoch: 9827, Training Loss: 0.01548
Epoch: 9827, Training Loss: 0.01647
Epoch: 9827, Training Loss: 0.01650
Epoch: 9827, Training Loss: 0.02051
Epoch: 9828, Training Loss: 0.01547
Epoch: 9828, Training Loss: 0.01647
Epoch: 9828, Training Loss: 0.01650
Epoch: 9828, Training Loss: 0.02051
Epoch: 9829, Training Loss: 0.01547
Epoch: 9829, Training Loss: 0.01647
Epoch: 9829, Training Loss: 0.01649
Epoch: 9829, Training Loss: 0.02051
Epoch: 9830, Training Loss: 0.01547
Epoch: 9830, Training Loss: 0.01647
Epoch: 9830, Training Loss: 0.01649
Epoch: 9830, Training Loss: 0.02051
Epoch: 9831, Training Loss: 0.01547
Epoch: 9831, Training Loss: 0.01647
Epoch: 9831, Training Loss: 0.01649
Epoch: 9831, Training Loss: 0.02051
Epoch: 9832, Training Loss: 0.01547
Epoch: 9832, Training Loss: 0.01646
Epoch: 9832, Training Loss: 0.01649
Epoch: 9832, Training Loss: 0.02051
Epoch: 9833, Training Loss: 0.01547
Epoch: 9833, Training Loss: 0.01646
Epoch: 9833, Training Loss: 0.01649
Epoch: 9833, Training Loss: 0.02050
Epoch: 9834, Training Loss: 0.01547
Epoch: 9834, Training Loss: 0.01646
Epoch: 9834, Training Loss: 0.01649
Epoch: 9834, Training Loss: 0.02050
Epoch: 9835, Training Loss: 0.01547
Epoch: 9835, Training Loss: 0.01646
Epoch: 9835, Training Loss: 0.01649
Epoch: 9835, Training Loss: 0.02050
Epoch: 9836, Training Loss: 0.01547
Epoch: 9836, Training Loss: 0.01646
Epoch: 9836, Training Loss: 0.01649
Epoch: 9836, Training Loss: 0.02050
Epoch: 9837, Training Loss: 0.01547
Epoch: 9837, Training Loss: 0.01646
Epoch: 9837, Training Loss: 0.01648
Epoch: 9837, Training Loss: 0.02050
Epoch: 9838, Training Loss: 0.01546
Epoch: 9838, Training Loss: 0.01646
Epoch: 9838, Training Loss: 0.01648
Epoch: 9838, Training Loss: 0.02050
Epoch: 9839, Training Loss: 0.01546
Epoch: 9839, Training Loss: 0.01646
Epoch: 9839, Training Loss: 0.01648
Epoch: 9839, Training Loss: 0.02050
Epoch: 9840, Training Loss: 0.01546
Epoch: 9840, Training Loss: 0.01646
Epoch: 9840, Training Loss: 0.01648
Epoch: 9840, Training Loss: 0.02049
Epoch: 9841, Training Loss: 0.01546
Epoch: 9841, Training Loss: 0.01645
Epoch: 9841, Training Loss: 0.01648
Epoch: 9841, Training Loss: 0.02049
Epoch: 9842, Training Loss: 0.01546
Epoch: 9842, Training Loss: 0.01645
Epoch: 9842, Training Loss: 0.01648
Epoch: 9842, Training Loss: 0.02049
Epoch: 9843, Training Loss: 0.01546
Epoch: 9843, Training Loss: 0.01645
Epoch: 9843, Training Loss: 0.01648
Epoch: 9843, Training Loss: 0.02049
Epoch: 9844, Training Loss: 0.01546
Epoch: 9844, Training Loss: 0.01645
Epoch: 9844, Training Loss: 0.01648
Epoch: 9844, Training Loss: 0.02049
Epoch: 9845, Training Loss: 0.01546
Epoch: 9845, Training Loss: 0.01645
Epoch: 9845, Training Loss: 0.01648
Epoch: 9845, Training Loss: 0.02049
Epoch: 9846, Training Loss: 0.01546
Epoch: 9846, Training Loss: 0.01645
Epoch: 9846, Training Loss: 0.01647
Epoch: 9846, Training Loss: 0.02049
Epoch: 9847, Training Loss: 0.01545
Epoch: 9847, Training Loss: 0.01645
Epoch: 9847, Training Loss: 0.01647
Epoch: 9847, Training Loss: 0.02048
Epoch: 9848, Training Loss: 0.01545
Epoch: 9848, Training Loss: 0.01645
Epoch: 9848, Training Loss: 0.01647
Epoch: 9848, Training Loss: 0.02048
Epoch: 9849, Training Loss: 0.01545
Epoch: 9849, Training Loss: 0.01645
Epoch: 9849, Training Loss: 0.01647
Epoch: 9849, Training Loss: 0.02048
Epoch: 9850, Training Loss: 0.01545
Epoch: 9850, Training Loss: 0.01644
Epoch: 9850, Training Loss: 0.01647
Epoch: 9850, Training Loss: 0.02048
Epoch: 9851, Training Loss: 0.01545
Epoch: 9851, Training Loss: 0.01644
Epoch: 9851, Training Loss: 0.01647
Epoch: 9851, Training Loss: 0.02048
Epoch: 9852, Training Loss: 0.01545
Epoch: 9852, Training Loss: 0.01644
Epoch: 9852, Training Loss: 0.01647
Epoch: 9852, Training Loss: 0.02048
Epoch: 9853, Training Loss: 0.01545
Epoch: 9853, Training Loss: 0.01644
Epoch: 9853, Training Loss: 0.01647
Epoch: 9853, Training Loss: 0.02048
Epoch: 9854, Training Loss: 0.01545
Epoch: 9854, Training Loss: 0.01644
Epoch: 9854, Training Loss: 0.01646
Epoch: 9854, Training Loss: 0.02047
Epoch: 9855, Training Loss: 0.01545
Epoch: 9855, Training Loss: 0.01644
Epoch: 9855, Training Loss: 0.01646
Epoch: 9855, Training Loss: 0.02047
Epoch: 9856, Training Loss: 0.01545
Epoch: 9856, Training Loss: 0.01644
Epoch: 9856, Training Loss: 0.01646
Epoch: 9856, Training Loss: 0.02047
Epoch: 9857, Training Loss: 0.01544
Epoch: 9857, Training Loss: 0.01644
Epoch: 9857, Training Loss: 0.01646
Epoch: 9857, Training Loss: 0.02047
Epoch: 9858, Training Loss: 0.01544
Epoch: 9858, Training Loss: 0.01643
Epoch: 9858, Training Loss: 0.01646
Epoch: 9858, Training Loss: 0.02047
Epoch: 9859, Training Loss: 0.01544
Epoch: 9859, Training Loss: 0.01643
Epoch: 9859, Training Loss: 0.01646
Epoch: 9859, Training Loss: 0.02047
Epoch: 9860, Training Loss: 0.01544
Epoch: 9860, Training Loss: 0.01643
Epoch: 9860, Training Loss: 0.01646
Epoch: 9860, Training Loss: 0.02046
Epoch: 9861, Training Loss: 0.01544
Epoch: 9861, Training Loss: 0.01643
Epoch: 9861, Training Loss: 0.01646
Epoch: 9861, Training Loss: 0.02046
Epoch: 9862, Training Loss: 0.01544
Epoch: 9862, Training Loss: 0.01643
Epoch: 9862, Training Loss: 0.01646
Epoch: 9862, Training Loss: 0.02046
Epoch: 9863, Training Loss: 0.01544
Epoch: 9863, Training Loss: 0.01643
Epoch: 9863, Training Loss: 0.01645
Epoch: 9863, Training Loss: 0.02046
Epoch: 9864, Training Loss: 0.01544
Epoch: 9864, Training Loss: 0.01643
Epoch: 9864, Training Loss: 0.01645
Epoch: 9864, Training Loss: 0.02046
Epoch: 9865, Training Loss: 0.01544
Epoch: 9865, Training Loss: 0.01643
Epoch: 9865, Training Loss: 0.01645
Epoch: 9865, Training Loss: 0.02046
Epoch: 9866, Training Loss: 0.01544
Epoch: 9866, Training Loss: 0.01643
Epoch: 9866, Training Loss: 0.01645
Epoch: 9866, Training Loss: 0.02046
Epoch: 9867, Training Loss: 0.01543
Epoch: 9867, Training Loss: 0.01642
Epoch: 9867, Training Loss: 0.01645
Epoch: 9867, Training Loss: 0.02045
Epoch: 9868, Training Loss: 0.01543
Epoch: 9868, Training Loss: 0.01642
Epoch: 9868, Training Loss: 0.01645
Epoch: 9868, Training Loss: 0.02045
Epoch: 9869, Training Loss: 0.01543
Epoch: 9869, Training Loss: 0.01642
Epoch: 9869, Training Loss: 0.01645
Epoch: 9869, Training Loss: 0.02045
Epoch: 9870, Training Loss: 0.01543
Epoch: 9870, Training Loss: 0.01642
Epoch: 9870, Training Loss: 0.01645
Epoch: 9870, Training Loss: 0.02045
Epoch: 9871, Training Loss: 0.01543
Epoch: 9871, Training Loss: 0.01642
Epoch: 9871, Training Loss: 0.01645
Epoch: 9871, Training Loss: 0.02045
Epoch: 9872, Training Loss: 0.01543
Epoch: 9872, Training Loss: 0.01642
Epoch: 9872, Training Loss: 0.01644
Epoch: 9872, Training Loss: 0.02045
Epoch: 9873, Training Loss: 0.01543
Epoch: 9873, Training Loss: 0.01642
Epoch: 9873, Training Loss: 0.01644
Epoch: 9873, Training Loss: 0.02045
Epoch: 9874, Training Loss: 0.01543
Epoch: 9874, Training Loss: 0.01642
Epoch: 9874, Training Loss: 0.01644
Epoch: 9874, Training Loss: 0.02044
Epoch: 9875, Training Loss: 0.01543
Epoch: 9875, Training Loss: 0.01642
Epoch: 9875, Training Loss: 0.01644
Epoch: 9875, Training Loss: 0.02044
Epoch: 9876, Training Loss: 0.01542
Epoch: 9876, Training Loss: 0.01641
Epoch: 9876, Training Loss: 0.01644
Epoch: 9876, Training Loss: 0.02044
Epoch: 9877, Training Loss: 0.01542
Epoch: 9877, Training Loss: 0.01641
Epoch: 9877, Training Loss: 0.01644
Epoch: 9877, Training Loss: 0.02044
Epoch: 9878, Training Loss: 0.01542
Epoch: 9878, Training Loss: 0.01641
Epoch: 9878, Training Loss: 0.01644
Epoch: 9878, Training Loss: 0.02044
Epoch: 9879, Training Loss: 0.01542
Epoch: 9879, Training Loss: 0.01641
Epoch: 9879, Training Loss: 0.01644
Epoch: 9879, Training Loss: 0.02044
Epoch: 9880, Training Loss: 0.01542
Epoch: 9880, Training Loss: 0.01641
Epoch: 9880, Training Loss: 0.01643
Epoch: 9880, Training Loss: 0.02044
Epoch: 9881, Training Loss: 0.01542
Epoch: 9881, Training Loss: 0.01641
Epoch: 9881, Training Loss: 0.01643
Epoch: 9881, Training Loss: 0.02043
Epoch: 9882, Training Loss: 0.01542
Epoch: 9882, Training Loss: 0.01641
Epoch: 9882, Training Loss: 0.01643
Epoch: 9882, Training Loss: 0.02043
Epoch: 9883, Training Loss: 0.01542
Epoch: 9883, Training Loss: 0.01641
Epoch: 9883, Training Loss: 0.01643
Epoch: 9883, Training Loss: 0.02043
Epoch: 9884, Training Loss: 0.01542
Epoch: 9884, Training Loss: 0.01640
Epoch: 9884, Training Loss: 0.01643
Epoch: 9884, Training Loss: 0.02043
Epoch: 9885, Training Loss: 0.01542
Epoch: 9885, Training Loss: 0.01640
Epoch: 9885, Training Loss: 0.01643
Epoch: 9885, Training Loss: 0.02043
Epoch: 9886, Training Loss: 0.01541
Epoch: 9886, Training Loss: 0.01640
Epoch: 9886, Training Loss: 0.01643
Epoch: 9886, Training Loss: 0.02043
Epoch: 9887, Training Loss: 0.01541
Epoch: 9887, Training Loss: 0.01640
Epoch: 9887, Training Loss: 0.01643
Epoch: 9887, Training Loss: 0.02043
Epoch: 9888, Training Loss: 0.01541
Epoch: 9888, Training Loss: 0.01640
Epoch: 9888, Training Loss: 0.01643
Epoch: 9888, Training Loss: 0.02042
Epoch: 9889, Training Loss: 0.01541
Epoch: 9889, Training Loss: 0.01640
Epoch: 9889, Training Loss: 0.01642
Epoch: 9889, Training Loss: 0.02042
Epoch: 9890, Training Loss: 0.01541
Epoch: 9890, Training Loss: 0.01640
Epoch: 9890, Training Loss: 0.01642
Epoch: 9890, Training Loss: 0.02042
Epoch: 9891, Training Loss: 0.01541
Epoch: 9891, Training Loss: 0.01640
Epoch: 9891, Training Loss: 0.01642
Epoch: 9891, Training Loss: 0.02042
Epoch: 9892, Training Loss: 0.01541
Epoch: 9892, Training Loss: 0.01640
Epoch: 9892, Training Loss: 0.01642
Epoch: 9892, Training Loss: 0.02042
Epoch: 9893, Training Loss: 0.01541
Epoch: 9893, Training Loss: 0.01639
Epoch: 9893, Training Loss: 0.01642
Epoch: 9893, Training Loss: 0.02042
Epoch: 9894, Training Loss: 0.01541
Epoch: 9894, Training Loss: 0.01639
Epoch: 9894, Training Loss: 0.01642
Epoch: 9894, Training Loss: 0.02041
Epoch: 9895, Training Loss: 0.01540
Epoch: 9895, Training Loss: 0.01639
Epoch: 9895, Training Loss: 0.01642
Epoch: 9895, Training Loss: 0.02041
Epoch: 9896, Training Loss: 0.01540
Epoch: 9896, Training Loss: 0.01639
Epoch: 9896, Training Loss: 0.01642
Epoch: 9896, Training Loss: 0.02041
Epoch: 9897, Training Loss: 0.01540
Epoch: 9897, Training Loss: 0.01639
Epoch: 9897, Training Loss: 0.01642
Epoch: 9897, Training Loss: 0.02041
Epoch: 9898, Training Loss: 0.01540
Epoch: 9898, Training Loss: 0.01639
Epoch: 9898, Training Loss: 0.01641
Epoch: 9898, Training Loss: 0.02041
Epoch: 9899, Training Loss: 0.01540
Epoch: 9899, Training Loss: 0.01639
Epoch: 9899, Training Loss: 0.01641
Epoch: 9899, Training Loss: 0.02041
Epoch: 9900, Training Loss: 0.01540
Epoch: 9900, Training Loss: 0.01639
Epoch: 9900, Training Loss: 0.01641
Epoch: 9900, Training Loss: 0.02041
Epoch: 9901, Training Loss: 0.01540
Epoch: 9901, Training Loss: 0.01639
Epoch: 9901, Training Loss: 0.01641
Epoch: 9901, Training Loss: 0.02040
Epoch: 9902, Training Loss: 0.01540
Epoch: 9902, Training Loss: 0.01638
Epoch: 9902, Training Loss: 0.01641
Epoch: 9902, Training Loss: 0.02040
Epoch: 9903, Training Loss: 0.01540
Epoch: 9903, Training Loss: 0.01638
Epoch: 9903, Training Loss: 0.01641
Epoch: 9903, Training Loss: 0.02040
Epoch: 9904, Training Loss: 0.01540
Epoch: 9904, Training Loss: 0.01638
Epoch: 9904, Training Loss: 0.01641
Epoch: 9904, Training Loss: 0.02040
Epoch: 9905, Training Loss: 0.01539
Epoch: 9905, Training Loss: 0.01638
Epoch: 9905, Training Loss: 0.01641
Epoch: 9905, Training Loss: 0.02040
Epoch: 9906, Training Loss: 0.01539
Epoch: 9906, Training Loss: 0.01638
Epoch: 9906, Training Loss: 0.01641
Epoch: 9906, Training Loss: 0.02040
Epoch: 9907, Training Loss: 0.01539
Epoch: 9907, Training Loss: 0.01638
Epoch: 9907, Training Loss: 0.01640
Epoch: 9907, Training Loss: 0.02040
Epoch: 9908, Training Loss: 0.01539
Epoch: 9908, Training Loss: 0.01638
Epoch: 9908, Training Loss: 0.01640
Epoch: 9908, Training Loss: 0.02039
Epoch: 9909, Training Loss: 0.01539
Epoch: 9909, Training Loss: 0.01638
Epoch: 9909, Training Loss: 0.01640
Epoch: 9909, Training Loss: 0.02039
Epoch: 9910, Training Loss: 0.01539
Epoch: 9910, Training Loss: 0.01637
Epoch: 9910, Training Loss: 0.01640
Epoch: 9910, Training Loss: 0.02039
Epoch: 9911, Training Loss: 0.01539
Epoch: 9911, Training Loss: 0.01637
Epoch: 9911, Training Loss: 0.01640
Epoch: 9911, Training Loss: 0.02039
Epoch: 9912, Training Loss: 0.01539
Epoch: 9912, Training Loss: 0.01637
Epoch: 9912, Training Loss: 0.01640
Epoch: 9912, Training Loss: 0.02039
Epoch: 9913, Training Loss: 0.01539
Epoch: 9913, Training Loss: 0.01637
Epoch: 9913, Training Loss: 0.01640
Epoch: 9913, Training Loss: 0.02039
Epoch: 9914, Training Loss: 0.01539
Epoch: 9914, Training Loss: 0.01637
Epoch: 9914, Training Loss: 0.01640
Epoch: 9914, Training Loss: 0.02039
Epoch: 9915, Training Loss: 0.01538
Epoch: 9915, Training Loss: 0.01637
Epoch: 9915, Training Loss: 0.01639
Epoch: 9915, Training Loss: 0.02038
Epoch: 9916, Training Loss: 0.01538
Epoch: 9916, Training Loss: 0.01637
Epoch: 9916, Training Loss: 0.01639
Epoch: 9916, Training Loss: 0.02038
Epoch: 9917, Training Loss: 0.01538
Epoch: 9917, Training Loss: 0.01637
Epoch: 9917, Training Loss: 0.01639
Epoch: 9917, Training Loss: 0.02038
Epoch: 9918, Training Loss: 0.01538
Epoch: 9918, Training Loss: 0.01637
Epoch: 9918, Training Loss: 0.01639
Epoch: 9918, Training Loss: 0.02038
Epoch: 9919, Training Loss: 0.01538
Epoch: 9919, Training Loss: 0.01636
Epoch: 9919, Training Loss: 0.01639
Epoch: 9919, Training Loss: 0.02038
Epoch: 9920, Training Loss: 0.01538
Epoch: 9920, Training Loss: 0.01636
Epoch: 9920, Training Loss: 0.01639
Epoch: 9920, Training Loss: 0.02038
Epoch: 9921, Training Loss: 0.01538
Epoch: 9921, Training Loss: 0.01636
Epoch: 9921, Training Loss: 0.01639
Epoch: 9921, Training Loss: 0.02038
Epoch: 9922, Training Loss: 0.01538
Epoch: 9922, Training Loss: 0.01636
Epoch: 9922, Training Loss: 0.01639
Epoch: 9922, Training Loss: 0.02037
Epoch: 9923, Training Loss: 0.01538
Epoch: 9923, Training Loss: 0.01636
Epoch: 9923, Training Loss: 0.01639
Epoch: 9923, Training Loss: 0.02037
Epoch: 9924, Training Loss: 0.01538
Epoch: 9924, Training Loss: 0.01636
Epoch: 9924, Training Loss: 0.01638
Epoch: 9924, Training Loss: 0.02037
Epoch: 9925, Training Loss: 0.01537
Epoch: 9925, Training Loss: 0.01636
Epoch: 9925, Training Loss: 0.01638
Epoch: 9925, Training Loss: 0.02037
Epoch: 9926, Training Loss: 0.01537
Epoch: 9926, Training Loss: 0.01636
Epoch: 9926, Training Loss: 0.01638
Epoch: 9926, Training Loss: 0.02037
Epoch: 9927, Training Loss: 0.01537
Epoch: 9927, Training Loss: 0.01636
Epoch: 9927, Training Loss: 0.01638
Epoch: 9927, Training Loss: 0.02037
Epoch: 9928, Training Loss: 0.01537
Epoch: 9928, Training Loss: 0.01635
Epoch: 9928, Training Loss: 0.01638
Epoch: 9928, Training Loss: 0.02037
Epoch: 9929, Training Loss: 0.01537
Epoch: 9929, Training Loss: 0.01635
Epoch: 9929, Training Loss: 0.01638
Epoch: 9929, Training Loss: 0.02036
Epoch: 9930, Training Loss: 0.01537
Epoch: 9930, Training Loss: 0.01635
Epoch: 9930, Training Loss: 0.01638
Epoch: 9930, Training Loss: 0.02036
Epoch: 9931, Training Loss: 0.01537
Epoch: 9931, Training Loss: 0.01635
Epoch: 9931, Training Loss: 0.01638
Epoch: 9931, Training Loss: 0.02036
Epoch: 9932, Training Loss: 0.01537
Epoch: 9932, Training Loss: 0.01635
Epoch: 9932, Training Loss: 0.01638
Epoch: 9932, Training Loss: 0.02036
Epoch: 9933, Training Loss: 0.01537
Epoch: 9933, Training Loss: 0.01635
Epoch: 9933, Training Loss: 0.01637
Epoch: 9933, Training Loss: 0.02036
Epoch: 9934, Training Loss: 0.01536
Epoch: 9934, Training Loss: 0.01635
Epoch: 9934, Training Loss: 0.01637
Epoch: 9934, Training Loss: 0.02036
Epoch: 9935, Training Loss: 0.01536
Epoch: 9935, Training Loss: 0.01635
Epoch: 9935, Training Loss: 0.01637
Epoch: 9935, Training Loss: 0.02036
Epoch: 9936, Training Loss: 0.01536
Epoch: 9936, Training Loss: 0.01635
Epoch: 9936, Training Loss: 0.01637
Epoch: 9936, Training Loss: 0.02035
Epoch: 9937, Training Loss: 0.01536
Epoch: 9937, Training Loss: 0.01634
Epoch: 9937, Training Loss: 0.01637
Epoch: 9937, Training Loss: 0.02035
Epoch: 9938, Training Loss: 0.01536
Epoch: 9938, Training Loss: 0.01634
Epoch: 9938, Training Loss: 0.01637
Epoch: 9938, Training Loss: 0.02035
Epoch: 9939, Training Loss: 0.01536
Epoch: 9939, Training Loss: 0.01634
Epoch: 9939, Training Loss: 0.01637
Epoch: 9939, Training Loss: 0.02035
Epoch: 9940, Training Loss: 0.01536
Epoch: 9940, Training Loss: 0.01634
Epoch: 9940, Training Loss: 0.01637
Epoch: 9940, Training Loss: 0.02035
Epoch: 9941, Training Loss: 0.01536
Epoch: 9941, Training Loss: 0.01634
Epoch: 9941, Training Loss: 0.01637
Epoch: 9941, Training Loss: 0.02035
Epoch: 9942, Training Loss: 0.01536
Epoch: 9942, Training Loss: 0.01634
Epoch: 9942, Training Loss: 0.01636
Epoch: 9942, Training Loss: 0.02034
Epoch: 9943, Training Loss: 0.01536
Epoch: 9943, Training Loss: 0.01634
Epoch: 9943, Training Loss: 0.01636
Epoch: 9943, Training Loss: 0.02034
Epoch: 9944, Training Loss: 0.01535
Epoch: 9944, Training Loss: 0.01634
Epoch: 9944, Training Loss: 0.01636
Epoch: 9944, Training Loss: 0.02034
Epoch: 9945, Training Loss: 0.01535
Epoch: 9945, Training Loss: 0.01634
Epoch: 9945, Training Loss: 0.01636
Epoch: 9945, Training Loss: 0.02034
Epoch: 9946, Training Loss: 0.01535
Epoch: 9946, Training Loss: 0.01633
Epoch: 9946, Training Loss: 0.01636
Epoch: 9946, Training Loss: 0.02034
Epoch: 9947, Training Loss: 0.01535
Epoch: 9947, Training Loss: 0.01633
Epoch: 9947, Training Loss: 0.01636
Epoch: 9947, Training Loss: 0.02034
Epoch: 9948, Training Loss: 0.01535
Epoch: 9948, Training Loss: 0.01633
Epoch: 9948, Training Loss: 0.01636
Epoch: 9948, Training Loss: 0.02034
Epoch: 9949, Training Loss: 0.01535
Epoch: 9949, Training Loss: 0.01633
Epoch: 9949, Training Loss: 0.01636
Epoch: 9949, Training Loss: 0.02033
Epoch: 9950, Training Loss: 0.01535
Epoch: 9950, Training Loss: 0.01633
Epoch: 9950, Training Loss: 0.01635
Epoch: 9950, Training Loss: 0.02033
Epoch: 9951, Training Loss: 0.01535
Epoch: 9951, Training Loss: 0.01633
Epoch: 9951, Training Loss: 0.01635
Epoch: 9951, Training Loss: 0.02033
Epoch: 9952, Training Loss: 0.01535
Epoch: 9952, Training Loss: 0.01633
Epoch: 9952, Training Loss: 0.01635
Epoch: 9952, Training Loss: 0.02033
Epoch: 9953, Training Loss: 0.01535
Epoch: 9953, Training Loss: 0.01633
Epoch: 9953, Training Loss: 0.01635
Epoch: 9953, Training Loss: 0.02033
Epoch: 9954, Training Loss: 0.01534
Epoch: 9954, Training Loss: 0.01632
Epoch: 9954, Training Loss: 0.01635
Epoch: 9954, Training Loss: 0.02033
Epoch: 9955, Training Loss: 0.01534
Epoch: 9955, Training Loss: 0.01632
Epoch: 9955, Training Loss: 0.01635
Epoch: 9955, Training Loss: 0.02033
Epoch: 9956, Training Loss: 0.01534
Epoch: 9956, Training Loss: 0.01632
Epoch: 9956, Training Loss: 0.01635
Epoch: 9956, Training Loss: 0.02032
Epoch: 9957, Training Loss: 0.01534
Epoch: 9957, Training Loss: 0.01632
Epoch: 9957, Training Loss: 0.01635
Epoch: 9957, Training Loss: 0.02032
Epoch: 9958, Training Loss: 0.01534
Epoch: 9958, Training Loss: 0.01632
Epoch: 9958, Training Loss: 0.01635
Epoch: 9958, Training Loss: 0.02032
Epoch: 9959, Training Loss: 0.01534
Epoch: 9959, Training Loss: 0.01632
Epoch: 9959, Training Loss: 0.01634
Epoch: 9959, Training Loss: 0.02032
Epoch: 9960, Training Loss: 0.01534
Epoch: 9960, Training Loss: 0.01632
Epoch: 9960, Training Loss: 0.01634
Epoch: 9960, Training Loss: 0.02032
Epoch: 9961, Training Loss: 0.01534
Epoch: 9961, Training Loss: 0.01632
Epoch: 9961, Training Loss: 0.01634
Epoch: 9961, Training Loss: 0.02032
Epoch: 9962, Training Loss: 0.01534
Epoch: 9962, Training Loss: 0.01632
Epoch: 9962, Training Loss: 0.01634
Epoch: 9962, Training Loss: 0.02032
Epoch: 9963, Training Loss: 0.01534
Epoch: 9963, Training Loss: 0.01631
Epoch: 9963, Training Loss: 0.01634
Epoch: 9963, Training Loss: 0.02031
Epoch: 9964, Training Loss: 0.01533
Epoch: 9964, Training Loss: 0.01631
Epoch: 9964, Training Loss: 0.01634
Epoch: 9964, Training Loss: 0.02031
Epoch: 9965, Training Loss: 0.01533
Epoch: 9965, Training Loss: 0.01631
Epoch: 9965, Training Loss: 0.01634
Epoch: 9965, Training Loss: 0.02031
Epoch: 9966, Training Loss: 0.01533
Epoch: 9966, Training Loss: 0.01631
Epoch: 9966, Training Loss: 0.01634
Epoch: 9966, Training Loss: 0.02031
Epoch: 9967, Training Loss: 0.01533
Epoch: 9967, Training Loss: 0.01631
Epoch: 9967, Training Loss: 0.01634
Epoch: 9967, Training Loss: 0.02031
Epoch: 9968, Training Loss: 0.01533
Epoch: 9968, Training Loss: 0.01631
Epoch: 9968, Training Loss: 0.01633
Epoch: 9968, Training Loss: 0.02031
Epoch: 9969, Training Loss: 0.01533
Epoch: 9969, Training Loss: 0.01631
Epoch: 9969, Training Loss: 0.01633
Epoch: 9969, Training Loss: 0.02031
Epoch: 9970, Training Loss: 0.01533
Epoch: 9970, Training Loss: 0.01631
Epoch: 9970, Training Loss: 0.01633
Epoch: 9970, Training Loss: 0.02030
Epoch: 9971, Training Loss: 0.01533
Epoch: 9971, Training Loss: 0.01631
Epoch: 9971, Training Loss: 0.01633
Epoch: 9971, Training Loss: 0.02030
Epoch: 9972, Training Loss: 0.01533
Epoch: 9972, Training Loss: 0.01630
Epoch: 9972, Training Loss: 0.01633
Epoch: 9972, Training Loss: 0.02030
Epoch: 9973, Training Loss: 0.01533
Epoch: 9973, Training Loss: 0.01630
Epoch: 9973, Training Loss: 0.01633
Epoch: 9973, Training Loss: 0.02030
Epoch: 9974, Training Loss: 0.01532
Epoch: 9974, Training Loss: 0.01630
Epoch: 9974, Training Loss: 0.01633
Epoch: 9974, Training Loss: 0.02030
Epoch: 9975, Training Loss: 0.01532
Epoch: 9975, Training Loss: 0.01630
Epoch: 9975, Training Loss: 0.01633
Epoch: 9975, Training Loss: 0.02030
Epoch: 9976, Training Loss: 0.01532
Epoch: 9976, Training Loss: 0.01630
Epoch: 9976, Training Loss: 0.01633
Epoch: 9976, Training Loss: 0.02030
Epoch: 9977, Training Loss: 0.01532
Epoch: 9977, Training Loss: 0.01630
Epoch: 9977, Training Loss: 0.01632
Epoch: 9977, Training Loss: 0.02029
Epoch: 9978, Training Loss: 0.01532
Epoch: 9978, Training Loss: 0.01630
Epoch: 9978, Training Loss: 0.01632
Epoch: 9978, Training Loss: 0.02029
Epoch: 9979, Training Loss: 0.01532
Epoch: 9979, Training Loss: 0.01630
Epoch: 9979, Training Loss: 0.01632
Epoch: 9979, Training Loss: 0.02029
Epoch: 9980, Training Loss: 0.01532
Epoch: 9980, Training Loss: 0.01630
Epoch: 9980, Training Loss: 0.01632
Epoch: 9980, Training Loss: 0.02029
Epoch: 9981, Training Loss: 0.01532
Epoch: 9981, Training Loss: 0.01629
Epoch: 9981, Training Loss: 0.01632
Epoch: 9981, Training Loss: 0.02029
Epoch: 9982, Training Loss: 0.01532
Epoch: 9982, Training Loss: 0.01629
Epoch: 9982, Training Loss: 0.01632
Epoch: 9982, Training Loss: 0.02029
Epoch: 9983, Training Loss: 0.01531
Epoch: 9983, Training Loss: 0.01629
Epoch: 9983, Training Loss: 0.01632
Epoch: 9983, Training Loss: 0.02029
Epoch: 9984, Training Loss: 0.01531
Epoch: 9984, Training Loss: 0.01629
Epoch: 9984, Training Loss: 0.01632
Epoch: 9984, Training Loss: 0.02028
Epoch: 9985, Training Loss: 0.01531
Epoch: 9985, Training Loss: 0.01629
Epoch: 9985, Training Loss: 0.01632
Epoch: 9985, Training Loss: 0.02028
Epoch: 9986, Training Loss: 0.01531
Epoch: 9986, Training Loss: 0.01629
Epoch: 9986, Training Loss: 0.01631
Epoch: 9986, Training Loss: 0.02028
Epoch: 9987, Training Loss: 0.01531
Epoch: 9987, Training Loss: 0.01629
Epoch: 9987, Training Loss: 0.01631
Epoch: 9987, Training Loss: 0.02028
Epoch: 9988, Training Loss: 0.01531
Epoch: 9988, Training Loss: 0.01629
Epoch: 9988, Training Loss: 0.01631
Epoch: 9988, Training Loss: 0.02028
Epoch: 9989, Training Loss: 0.01531
Epoch: 9989, Training Loss: 0.01629
Epoch: 9989, Training Loss: 0.01631
Epoch: 9989, Training Loss: 0.02028
Epoch: 9990, Training Loss: 0.01531
Epoch: 9990, Training Loss: 0.01628
Epoch: 9990, Training Loss: 0.01631
Epoch: 9990, Training Loss: 0.02028
Epoch: 9991, Training Loss: 0.01531
Epoch: 9991, Training Loss: 0.01628
Epoch: 9991, Training Loss: 0.01631
Epoch: 9991, Training Loss: 0.02027
Epoch: 9992, Training Loss: 0.01531
Epoch: 9992, Training Loss: 0.01628
Epoch: 9992, Training Loss: 0.01631
Epoch: 9992, Training Loss: 0.02027
Epoch: 9993, Training Loss: 0.01530
Epoch: 9993, Training Loss: 0.01628
Epoch: 9993, Training Loss: 0.01631
Epoch: 9993, Training Loss: 0.02027
Epoch: 9994, Training Loss: 0.01530
Epoch: 9994, Training Loss: 0.01628
Epoch: 9994, Training Loss: 0.01631
Epoch: 9994, Training Loss: 0.02027
Epoch: 9995, Training Loss: 0.01530
Epoch: 9995, Training Loss: 0.01628
Epoch: 9995, Training Loss: 0.01530
Epoch: 9995, Training Loss: 0.01628
Epoch: 9995, Training Loss: 0.01630
Epoch: 9995, Training Loss: 0.02027
[... output truncated: the loop prints four per-pattern losses per epoch; from epoch 9995 to epoch 10238 they decrease only slightly, ending at 0.01506 / 0.01601 / 0.01604 / 0.01993 ...]
Epoch: 10239, Training Loss: 0.01601
Epoch: 10239, Training Loss: 0.01604
Epoch: 10239, Training Loss: 0.01993
Epoch: 10240, Training Loss: 0.01506
Epoch: 10240, Training Loss: 0.01601
Epoch: 10240, Training Loss: 0.01603
Epoch: 10240, Training Loss: 0.01993
Epoch: 10241, Training Loss: 0.01506
Epoch: 10241, Training Loss: 0.01601
Epoch: 10241, Training Loss: 0.01603
Epoch: 10241, Training Loss: 0.01992
Epoch: 10242, Training Loss: 0.01506
Epoch: 10242, Training Loss: 0.01601
Epoch: 10242, Training Loss: 0.01603
Epoch: 10242, Training Loss: 0.01992
Epoch: 10243, Training Loss: 0.01506
Epoch: 10243, Training Loss: 0.01601
Epoch: 10243, Training Loss: 0.01603
Epoch: 10243, Training Loss: 0.01992
Epoch: 10244, Training Loss: 0.01506
Epoch: 10244, Training Loss: 0.01601
Epoch: 10244, Training Loss: 0.01603
Epoch: 10244, Training Loss: 0.01992
Epoch: 10245, Training Loss: 0.01506
Epoch: 10245, Training Loss: 0.01600
Epoch: 10245, Training Loss: 0.01603
Epoch: 10245, Training Loss: 0.01992
Epoch: 10246, Training Loss: 0.01505
Epoch: 10246, Training Loss: 0.01600
Epoch: 10246, Training Loss: 0.01603
Epoch: 10246, Training Loss: 0.01992
Epoch: 10247, Training Loss: 0.01505
Epoch: 10247, Training Loss: 0.01600
Epoch: 10247, Training Loss: 0.01603
Epoch: 10247, Training Loss: 0.01992
Epoch: 10248, Training Loss: 0.01505
Epoch: 10248, Training Loss: 0.01600
Epoch: 10248, Training Loss: 0.01603
Epoch: 10248, Training Loss: 0.01991
Epoch: 10249, Training Loss: 0.01505
Epoch: 10249, Training Loss: 0.01600
Epoch: 10249, Training Loss: 0.01603
Epoch: 10249, Training Loss: 0.01991
Epoch: 10250, Training Loss: 0.01505
Epoch: 10250, Training Loss: 0.01600
Epoch: 10250, Training Loss: 0.01602
Epoch: 10250, Training Loss: 0.01991
Epoch: 10251, Training Loss: 0.01505
Epoch: 10251, Training Loss: 0.01600
Epoch: 10251, Training Loss: 0.01602
Epoch: 10251, Training Loss: 0.01991
Epoch: 10252, Training Loss: 0.01505
Epoch: 10252, Training Loss: 0.01600
Epoch: 10252, Training Loss: 0.01602
Epoch: 10252, Training Loss: 0.01991
Epoch: 10253, Training Loss: 0.01505
Epoch: 10253, Training Loss: 0.01600
Epoch: 10253, Training Loss: 0.01602
Epoch: 10253, Training Loss: 0.01991
Epoch: 10254, Training Loss: 0.01505
Epoch: 10254, Training Loss: 0.01600
Epoch: 10254, Training Loss: 0.01602
Epoch: 10254, Training Loss: 0.01991
Epoch: 10255, Training Loss: 0.01505
Epoch: 10255, Training Loss: 0.01599
Epoch: 10255, Training Loss: 0.01602
Epoch: 10255, Training Loss: 0.01991
Epoch: 10256, Training Loss: 0.01504
Epoch: 10256, Training Loss: 0.01599
Epoch: 10256, Training Loss: 0.01602
Epoch: 10256, Training Loss: 0.01990
Epoch: 10257, Training Loss: 0.01504
Epoch: 10257, Training Loss: 0.01599
Epoch: 10257, Training Loss: 0.01602
Epoch: 10257, Training Loss: 0.01990
Epoch: 10258, Training Loss: 0.01504
Epoch: 10258, Training Loss: 0.01599
Epoch: 10258, Training Loss: 0.01602
Epoch: 10258, Training Loss: 0.01990
Epoch: 10259, Training Loss: 0.01504
Epoch: 10259, Training Loss: 0.01599
Epoch: 10259, Training Loss: 0.01601
Epoch: 10259, Training Loss: 0.01990
Epoch: 10260, Training Loss: 0.01504
Epoch: 10260, Training Loss: 0.01599
Epoch: 10260, Training Loss: 0.01601
Epoch: 10260, Training Loss: 0.01990
Epoch: 10261, Training Loss: 0.01504
Epoch: 10261, Training Loss: 0.01599
Epoch: 10261, Training Loss: 0.01601
Epoch: 10261, Training Loss: 0.01990
Epoch: 10262, Training Loss: 0.01504
Epoch: 10262, Training Loss: 0.01599
Epoch: 10262, Training Loss: 0.01601
Epoch: 10262, Training Loss: 0.01990
Epoch: 10263, Training Loss: 0.01504
Epoch: 10263, Training Loss: 0.01599
Epoch: 10263, Training Loss: 0.01601
Epoch: 10263, Training Loss: 0.01989
Epoch: 10264, Training Loss: 0.01504
Epoch: 10264, Training Loss: 0.01598
Epoch: 10264, Training Loss: 0.01601
Epoch: 10264, Training Loss: 0.01989
Epoch: 10265, Training Loss: 0.01504
Epoch: 10265, Training Loss: 0.01598
Epoch: 10265, Training Loss: 0.01601
Epoch: 10265, Training Loss: 0.01989
Epoch: 10266, Training Loss: 0.01504
Epoch: 10266, Training Loss: 0.01598
Epoch: 10266, Training Loss: 0.01601
Epoch: 10266, Training Loss: 0.01989
Epoch: 10267, Training Loss: 0.01503
Epoch: 10267, Training Loss: 0.01598
Epoch: 10267, Training Loss: 0.01601
Epoch: 10267, Training Loss: 0.01989
Epoch: 10268, Training Loss: 0.01503
Epoch: 10268, Training Loss: 0.01598
Epoch: 10268, Training Loss: 0.01600
Epoch: 10268, Training Loss: 0.01989
Epoch: 10269, Training Loss: 0.01503
Epoch: 10269, Training Loss: 0.01598
Epoch: 10269, Training Loss: 0.01600
Epoch: 10269, Training Loss: 0.01989
Epoch: 10270, Training Loss: 0.01503
Epoch: 10270, Training Loss: 0.01598
Epoch: 10270, Training Loss: 0.01600
Epoch: 10270, Training Loss: 0.01988
Epoch: 10271, Training Loss: 0.01503
Epoch: 10271, Training Loss: 0.01598
Epoch: 10271, Training Loss: 0.01600
Epoch: 10271, Training Loss: 0.01988
Epoch: 10272, Training Loss: 0.01503
Epoch: 10272, Training Loss: 0.01598
Epoch: 10272, Training Loss: 0.01600
Epoch: 10272, Training Loss: 0.01988
Epoch: 10273, Training Loss: 0.01503
Epoch: 10273, Training Loss: 0.01597
Epoch: 10273, Training Loss: 0.01600
Epoch: 10273, Training Loss: 0.01988
Epoch: 10274, Training Loss: 0.01503
Epoch: 10274, Training Loss: 0.01597
Epoch: 10274, Training Loss: 0.01600
Epoch: 10274, Training Loss: 0.01988
Epoch: 10275, Training Loss: 0.01503
Epoch: 10275, Training Loss: 0.01597
Epoch: 10275, Training Loss: 0.01600
Epoch: 10275, Training Loss: 0.01988
Epoch: 10276, Training Loss: 0.01503
Epoch: 10276, Training Loss: 0.01597
Epoch: 10276, Training Loss: 0.01600
Epoch: 10276, Training Loss: 0.01988
Epoch: 10277, Training Loss: 0.01502
Epoch: 10277, Training Loss: 0.01597
Epoch: 10277, Training Loss: 0.01600
Epoch: 10277, Training Loss: 0.01988
Epoch: 10278, Training Loss: 0.01502
Epoch: 10278, Training Loss: 0.01597
Epoch: 10278, Training Loss: 0.01599
Epoch: 10278, Training Loss: 0.01987
Epoch: 10279, Training Loss: 0.01502
Epoch: 10279, Training Loss: 0.01597
Epoch: 10279, Training Loss: 0.01599
Epoch: 10279, Training Loss: 0.01987
Epoch: 10280, Training Loss: 0.01502
Epoch: 10280, Training Loss: 0.01597
Epoch: 10280, Training Loss: 0.01599
Epoch: 10280, Training Loss: 0.01987
Epoch: 10281, Training Loss: 0.01502
Epoch: 10281, Training Loss: 0.01597
Epoch: 10281, Training Loss: 0.01599
Epoch: 10281, Training Loss: 0.01987
Epoch: 10282, Training Loss: 0.01502
Epoch: 10282, Training Loss: 0.01597
Epoch: 10282, Training Loss: 0.01599
Epoch: 10282, Training Loss: 0.01987
Epoch: 10283, Training Loss: 0.01502
Epoch: 10283, Training Loss: 0.01596
Epoch: 10283, Training Loss: 0.01599
Epoch: 10283, Training Loss: 0.01987
Epoch: 10284, Training Loss: 0.01502
Epoch: 10284, Training Loss: 0.01596
Epoch: 10284, Training Loss: 0.01599
Epoch: 10284, Training Loss: 0.01987
Epoch: 10285, Training Loss: 0.01502
Epoch: 10285, Training Loss: 0.01596
Epoch: 10285, Training Loss: 0.01599
Epoch: 10285, Training Loss: 0.01986
Epoch: 10286, Training Loss: 0.01502
Epoch: 10286, Training Loss: 0.01596
Epoch: 10286, Training Loss: 0.01599
Epoch: 10286, Training Loss: 0.01986
Epoch: 10287, Training Loss: 0.01502
Epoch: 10287, Training Loss: 0.01596
Epoch: 10287, Training Loss: 0.01598
Epoch: 10287, Training Loss: 0.01986
Epoch: 10288, Training Loss: 0.01501
Epoch: 10288, Training Loss: 0.01596
Epoch: 10288, Training Loss: 0.01598
Epoch: 10288, Training Loss: 0.01986
Epoch: 10289, Training Loss: 0.01501
Epoch: 10289, Training Loss: 0.01596
Epoch: 10289, Training Loss: 0.01598
Epoch: 10289, Training Loss: 0.01986
Epoch: 10290, Training Loss: 0.01501
Epoch: 10290, Training Loss: 0.01596
Epoch: 10290, Training Loss: 0.01598
Epoch: 10290, Training Loss: 0.01986
Epoch: 10291, Training Loss: 0.01501
Epoch: 10291, Training Loss: 0.01596
Epoch: 10291, Training Loss: 0.01598
Epoch: 10291, Training Loss: 0.01986
Epoch: 10292, Training Loss: 0.01501
Epoch: 10292, Training Loss: 0.01595
Epoch: 10292, Training Loss: 0.01598
Epoch: 10292, Training Loss: 0.01985
Epoch: 10293, Training Loss: 0.01501
Epoch: 10293, Training Loss: 0.01595
Epoch: 10293, Training Loss: 0.01598
Epoch: 10293, Training Loss: 0.01985
Epoch: 10294, Training Loss: 0.01501
Epoch: 10294, Training Loss: 0.01595
Epoch: 10294, Training Loss: 0.01598
Epoch: 10294, Training Loss: 0.01985
Epoch: 10295, Training Loss: 0.01501
Epoch: 10295, Training Loss: 0.01595
Epoch: 10295, Training Loss: 0.01598
Epoch: 10295, Training Loss: 0.01985
Epoch: 10296, Training Loss: 0.01501
Epoch: 10296, Training Loss: 0.01595
Epoch: 10296, Training Loss: 0.01598
Epoch: 10296, Training Loss: 0.01985
Epoch: 10297, Training Loss: 0.01501
Epoch: 10297, Training Loss: 0.01595
Epoch: 10297, Training Loss: 0.01597
Epoch: 10297, Training Loss: 0.01985
Epoch: 10298, Training Loss: 0.01500
Epoch: 10298, Training Loss: 0.01595
Epoch: 10298, Training Loss: 0.01597
Epoch: 10298, Training Loss: 0.01985
Epoch: 10299, Training Loss: 0.01500
Epoch: 10299, Training Loss: 0.01595
Epoch: 10299, Training Loss: 0.01597
Epoch: 10299, Training Loss: 0.01985
Epoch: 10300, Training Loss: 0.01500
Epoch: 10300, Training Loss: 0.01595
Epoch: 10300, Training Loss: 0.01597
Epoch: 10300, Training Loss: 0.01984
Epoch: 10301, Training Loss: 0.01500
Epoch: 10301, Training Loss: 0.01595
Epoch: 10301, Training Loss: 0.01597
Epoch: 10301, Training Loss: 0.01984
Epoch: 10302, Training Loss: 0.01500
Epoch: 10302, Training Loss: 0.01594
Epoch: 10302, Training Loss: 0.01597
Epoch: 10302, Training Loss: 0.01984
Epoch: 10303, Training Loss: 0.01500
Epoch: 10303, Training Loss: 0.01594
Epoch: 10303, Training Loss: 0.01597
Epoch: 10303, Training Loss: 0.01984
Epoch: 10304, Training Loss: 0.01500
Epoch: 10304, Training Loss: 0.01594
Epoch: 10304, Training Loss: 0.01597
Epoch: 10304, Training Loss: 0.01984
Epoch: 10305, Training Loss: 0.01500
Epoch: 10305, Training Loss: 0.01594
Epoch: 10305, Training Loss: 0.01597
Epoch: 10305, Training Loss: 0.01984
Epoch: 10306, Training Loss: 0.01500
Epoch: 10306, Training Loss: 0.01594
Epoch: 10306, Training Loss: 0.01596
Epoch: 10306, Training Loss: 0.01984
Epoch: 10307, Training Loss: 0.01500
Epoch: 10307, Training Loss: 0.01594
Epoch: 10307, Training Loss: 0.01596
Epoch: 10307, Training Loss: 0.01983
Epoch: 10308, Training Loss: 0.01500
Epoch: 10308, Training Loss: 0.01594
Epoch: 10308, Training Loss: 0.01596
Epoch: 10308, Training Loss: 0.01983
Epoch: 10309, Training Loss: 0.01499
Epoch: 10309, Training Loss: 0.01594
Epoch: 10309, Training Loss: 0.01596
Epoch: 10309, Training Loss: 0.01983
Epoch: 10310, Training Loss: 0.01499
Epoch: 10310, Training Loss: 0.01594
Epoch: 10310, Training Loss: 0.01596
Epoch: 10310, Training Loss: 0.01983
Epoch: 10311, Training Loss: 0.01499
Epoch: 10311, Training Loss: 0.01593
Epoch: 10311, Training Loss: 0.01596
Epoch: 10311, Training Loss: 0.01983
Epoch: 10312, Training Loss: 0.01499
Epoch: 10312, Training Loss: 0.01593
Epoch: 10312, Training Loss: 0.01596
Epoch: 10312, Training Loss: 0.01983
Epoch: 10313, Training Loss: 0.01499
Epoch: 10313, Training Loss: 0.01593
Epoch: 10313, Training Loss: 0.01596
Epoch: 10313, Training Loss: 0.01983
Epoch: 10314, Training Loss: 0.01499
Epoch: 10314, Training Loss: 0.01593
Epoch: 10314, Training Loss: 0.01596
Epoch: 10314, Training Loss: 0.01983
Epoch: 10315, Training Loss: 0.01499
Epoch: 10315, Training Loss: 0.01593
Epoch: 10315, Training Loss: 0.01595
Epoch: 10315, Training Loss: 0.01982
Epoch: 10316, Training Loss: 0.01499
Epoch: 10316, Training Loss: 0.01593
Epoch: 10316, Training Loss: 0.01595
Epoch: 10316, Training Loss: 0.01982
Epoch: 10317, Training Loss: 0.01499
Epoch: 10317, Training Loss: 0.01593
Epoch: 10317, Training Loss: 0.01595
Epoch: 10317, Training Loss: 0.01982
Epoch: 10318, Training Loss: 0.01499
Epoch: 10318, Training Loss: 0.01593
Epoch: 10318, Training Loss: 0.01595
Epoch: 10318, Training Loss: 0.01982
Epoch: 10319, Training Loss: 0.01498
Epoch: 10319, Training Loss: 0.01593
Epoch: 10319, Training Loss: 0.01595
Epoch: 10319, Training Loss: 0.01982
Epoch: 10320, Training Loss: 0.01498
Epoch: 10320, Training Loss: 0.01593
Epoch: 10320, Training Loss: 0.01595
Epoch: 10320, Training Loss: 0.01982
Epoch: 10321, Training Loss: 0.01498
Epoch: 10321, Training Loss: 0.01592
Epoch: 10321, Training Loss: 0.01595
Epoch: 10321, Training Loss: 0.01982
Epoch: 10322, Training Loss: 0.01498
Epoch: 10322, Training Loss: 0.01592
Epoch: 10322, Training Loss: 0.01595
Epoch: 10322, Training Loss: 0.01981
Epoch: 10323, Training Loss: 0.01498
Epoch: 10323, Training Loss: 0.01592
Epoch: 10323, Training Loss: 0.01595
Epoch: 10323, Training Loss: 0.01981
Epoch: 10324, Training Loss: 0.01498
Epoch: 10324, Training Loss: 0.01592
Epoch: 10324, Training Loss: 0.01595
Epoch: 10324, Training Loss: 0.01981
Epoch: 10325, Training Loss: 0.01498
Epoch: 10325, Training Loss: 0.01592
Epoch: 10325, Training Loss: 0.01594
Epoch: 10325, Training Loss: 0.01981
Epoch: 10326, Training Loss: 0.01498
Epoch: 10326, Training Loss: 0.01592
Epoch: 10326, Training Loss: 0.01594
Epoch: 10326, Training Loss: 0.01981
Epoch: 10327, Training Loss: 0.01498
Epoch: 10327, Training Loss: 0.01592
Epoch: 10327, Training Loss: 0.01594
Epoch: 10327, Training Loss: 0.01981
Epoch: 10328, Training Loss: 0.01498
Epoch: 10328, Training Loss: 0.01592
Epoch: 10328, Training Loss: 0.01594
Epoch: 10328, Training Loss: 0.01981
Epoch: 10329, Training Loss: 0.01498
Epoch: 10329, Training Loss: 0.01592
Epoch: 10329, Training Loss: 0.01594
Epoch: 10329, Training Loss: 0.01981
Epoch: 10330, Training Loss: 0.01497
Epoch: 10330, Training Loss: 0.01591
Epoch: 10330, Training Loss: 0.01594
Epoch: 10330, Training Loss: 0.01980
Epoch: 10331, Training Loss: 0.01497
Epoch: 10331, Training Loss: 0.01591
Epoch: 10331, Training Loss: 0.01594
Epoch: 10331, Training Loss: 0.01980
Epoch: 10332, Training Loss: 0.01497
Epoch: 10332, Training Loss: 0.01591
Epoch: 10332, Training Loss: 0.01594
Epoch: 10332, Training Loss: 0.01980
Epoch: 10333, Training Loss: 0.01497
Epoch: 10333, Training Loss: 0.01591
Epoch: 10333, Training Loss: 0.01594
Epoch: 10333, Training Loss: 0.01980
Epoch: 10334, Training Loss: 0.01497
Epoch: 10334, Training Loss: 0.01591
Epoch: 10334, Training Loss: 0.01593
Epoch: 10334, Training Loss: 0.01980
Epoch: 10335, Training Loss: 0.01497
Epoch: 10335, Training Loss: 0.01591
Epoch: 10335, Training Loss: 0.01593
Epoch: 10335, Training Loss: 0.01980
Epoch: 10336, Training Loss: 0.01497
Epoch: 10336, Training Loss: 0.01591
Epoch: 10336, Training Loss: 0.01593
Epoch: 10336, Training Loss: 0.01980
Epoch: 10337, Training Loss: 0.01497
Epoch: 10337, Training Loss: 0.01591
Epoch: 10337, Training Loss: 0.01593
Epoch: 10337, Training Loss: 0.01979
Epoch: 10338, Training Loss: 0.01497
Epoch: 10338, Training Loss: 0.01591
Epoch: 10338, Training Loss: 0.01593
Epoch: 10338, Training Loss: 0.01979
Epoch: 10339, Training Loss: 0.01497
Epoch: 10339, Training Loss: 0.01591
Epoch: 10339, Training Loss: 0.01593
Epoch: 10339, Training Loss: 0.01979
Epoch: 10340, Training Loss: 0.01496
Epoch: 10340, Training Loss: 0.01590
Epoch: 10340, Training Loss: 0.01593
Epoch: 10340, Training Loss: 0.01979
Epoch: 10341, Training Loss: 0.01496
Epoch: 10341, Training Loss: 0.01590
Epoch: 10341, Training Loss: 0.01593
Epoch: 10341, Training Loss: 0.01979
Epoch: 10342, Training Loss: 0.01496
Epoch: 10342, Training Loss: 0.01590
Epoch: 10342, Training Loss: 0.01593
Epoch: 10342, Training Loss: 0.01979
Epoch: 10343, Training Loss: 0.01496
Epoch: 10343, Training Loss: 0.01590
Epoch: 10343, Training Loss: 0.01593
Epoch: 10343, Training Loss: 0.01979
Epoch: 10344, Training Loss: 0.01496
Epoch: 10344, Training Loss: 0.01590
Epoch: 10344, Training Loss: 0.01592
Epoch: 10344, Training Loss: 0.01978
Epoch: 10345, Training Loss: 0.01496
Epoch: 10345, Training Loss: 0.01590
Epoch: 10345, Training Loss: 0.01592
Epoch: 10345, Training Loss: 0.01978
Epoch: 10346, Training Loss: 0.01496
Epoch: 10346, Training Loss: 0.01590
Epoch: 10346, Training Loss: 0.01592
Epoch: 10346, Training Loss: 0.01978
Epoch: 10347, Training Loss: 0.01496
Epoch: 10347, Training Loss: 0.01590
Epoch: 10347, Training Loss: 0.01592
Epoch: 10347, Training Loss: 0.01978
Epoch: 10348, Training Loss: 0.01496
Epoch: 10348, Training Loss: 0.01590
Epoch: 10348, Training Loss: 0.01592
Epoch: 10348, Training Loss: 0.01978
Epoch: 10349, Training Loss: 0.01496
Epoch: 10349, Training Loss: 0.01589
Epoch: 10349, Training Loss: 0.01592
Epoch: 10349, Training Loss: 0.01978
Epoch: 10350, Training Loss: 0.01496
Epoch: 10350, Training Loss: 0.01589
Epoch: 10350, Training Loss: 0.01592
Epoch: 10350, Training Loss: 0.01978
Epoch: 10351, Training Loss: 0.01495
Epoch: 10351, Training Loss: 0.01589
Epoch: 10351, Training Loss: 0.01592
Epoch: 10351, Training Loss: 0.01978
Epoch: 10352, Training Loss: 0.01495
Epoch: 10352, Training Loss: 0.01589
Epoch: 10352, Training Loss: 0.01592
Epoch: 10352, Training Loss: 0.01977
Epoch: 10353, Training Loss: 0.01495
Epoch: 10353, Training Loss: 0.01589
Epoch: 10353, Training Loss: 0.01591
Epoch: 10353, Training Loss: 0.01977
Epoch: 10354, Training Loss: 0.01495
Epoch: 10354, Training Loss: 0.01589
Epoch: 10354, Training Loss: 0.01591
Epoch: 10354, Training Loss: 0.01977
Epoch: 10355, Training Loss: 0.01495
Epoch: 10355, Training Loss: 0.01589
Epoch: 10355, Training Loss: 0.01591
Epoch: 10355, Training Loss: 0.01977
Epoch: 10356, Training Loss: 0.01495
Epoch: 10356, Training Loss: 0.01589
Epoch: 10356, Training Loss: 0.01591
Epoch: 10356, Training Loss: 0.01977
Epoch: 10357, Training Loss: 0.01495
Epoch: 10357, Training Loss: 0.01589
Epoch: 10357, Training Loss: 0.01591
Epoch: 10357, Training Loss: 0.01977
Epoch: 10358, Training Loss: 0.01495
Epoch: 10358, Training Loss: 0.01589
Epoch: 10358, Training Loss: 0.01591
Epoch: 10358, Training Loss: 0.01977
Epoch: 10359, Training Loss: 0.01495
Epoch: 10359, Training Loss: 0.01588
Epoch: 10359, Training Loss: 0.01591
Epoch: 10359, Training Loss: 0.01976
Epoch: 10360, Training Loss: 0.01495
Epoch: 10360, Training Loss: 0.01588
Epoch: 10360, Training Loss: 0.01591
Epoch: 10360, Training Loss: 0.01976
Epoch: 10361, Training Loss: 0.01494
Epoch: 10361, Training Loss: 0.01588
Epoch: 10361, Training Loss: 0.01591
Epoch: 10361, Training Loss: 0.01976
Epoch: 10362, Training Loss: 0.01494
Epoch: 10362, Training Loss: 0.01588
Epoch: 10362, Training Loss: 0.01591
Epoch: 10362, Training Loss: 0.01976
Epoch: 10363, Training Loss: 0.01494
Epoch: 10363, Training Loss: 0.01588
Epoch: 10363, Training Loss: 0.01590
Epoch: 10363, Training Loss: 0.01976
Epoch: 10364, Training Loss: 0.01494
Epoch: 10364, Training Loss: 0.01588
Epoch: 10364, Training Loss: 0.01590
Epoch: 10364, Training Loss: 0.01976
Epoch: 10365, Training Loss: 0.01494
Epoch: 10365, Training Loss: 0.01588
Epoch: 10365, Training Loss: 0.01590
Epoch: 10365, Training Loss: 0.01976
Epoch: 10366, Training Loss: 0.01494
Epoch: 10366, Training Loss: 0.01588
Epoch: 10366, Training Loss: 0.01590
Epoch: 10366, Training Loss: 0.01976
Epoch: 10367, Training Loss: 0.01494
Epoch: 10367, Training Loss: 0.01588
Epoch: 10367, Training Loss: 0.01590
Epoch: 10367, Training Loss: 0.01975
Epoch: 10368, Training Loss: 0.01494
Epoch: 10368, Training Loss: 0.01587
Epoch: 10368, Training Loss: 0.01590
Epoch: 10368, Training Loss: 0.01975
Epoch: 10369, Training Loss: 0.01494
Epoch: 10369, Training Loss: 0.01587
Epoch: 10369, Training Loss: 0.01590
Epoch: 10369, Training Loss: 0.01975
Epoch: 10370, Training Loss: 0.01494
Epoch: 10370, Training Loss: 0.01587
Epoch: 10370, Training Loss: 0.01590
Epoch: 10370, Training Loss: 0.01975
Epoch: 10371, Training Loss: 0.01494
Epoch: 10371, Training Loss: 0.01587
Epoch: 10371, Training Loss: 0.01590
Epoch: 10371, Training Loss: 0.01975
Epoch: 10372, Training Loss: 0.01493
Epoch: 10372, Training Loss: 0.01587
Epoch: 10372, Training Loss: 0.01590
Epoch: 10372, Training Loss: 0.01975
Epoch: 10373, Training Loss: 0.01493
Epoch: 10373, Training Loss: 0.01587
Epoch: 10373, Training Loss: 0.01589
Epoch: 10373, Training Loss: 0.01975
Epoch: 10374, Training Loss: 0.01493
Epoch: 10374, Training Loss: 0.01587
Epoch: 10374, Training Loss: 0.01589
Epoch: 10374, Training Loss: 0.01974
Epoch: 10375, Training Loss: 0.01493
Epoch: 10375, Training Loss: 0.01587
Epoch: 10375, Training Loss: 0.01589
Epoch: 10375, Training Loss: 0.01974
Epoch: 10376, Training Loss: 0.01493
Epoch: 10376, Training Loss: 0.01587
Epoch: 10376, Training Loss: 0.01589
Epoch: 10376, Training Loss: 0.01974
Epoch: 10377, Training Loss: 0.01493
Epoch: 10377, Training Loss: 0.01587
Epoch: 10377, Training Loss: 0.01589
Epoch: 10377, Training Loss: 0.01974
Epoch: 10378, Training Loss: 0.01493
Epoch: 10378, Training Loss: 0.01586
Epoch: 10378, Training Loss: 0.01589
Epoch: 10378, Training Loss: 0.01974
Epoch: 10379, Training Loss: 0.01493
Epoch: 10379, Training Loss: 0.01586
Epoch: 10379, Training Loss: 0.01589
Epoch: 10379, Training Loss: 0.01974
Epoch: 10380, Training Loss: 0.01493
Epoch: 10380, Training Loss: 0.01586
Epoch: 10380, Training Loss: 0.01589
Epoch: 10380, Training Loss: 0.01974
Epoch: 10381, Training Loss: 0.01493
Epoch: 10381, Training Loss: 0.01586
Epoch: 10381, Training Loss: 0.01589
Epoch: 10381, Training Loss: 0.01974
Epoch: 10382, Training Loss: 0.01493
Epoch: 10382, Training Loss: 0.01586
Epoch: 10382, Training Loss: 0.01588
Epoch: 10382, Training Loss: 0.01973
Epoch: 10383, Training Loss: 0.01492
Epoch: 10383, Training Loss: 0.01586
Epoch: 10383, Training Loss: 0.01588
Epoch: 10383, Training Loss: 0.01973
Epoch: 10384, Training Loss: 0.01492
Epoch: 10384, Training Loss: 0.01586
Epoch: 10384, Training Loss: 0.01588
Epoch: 10384, Training Loss: 0.01973
Epoch: 10385, Training Loss: 0.01492
Epoch: 10385, Training Loss: 0.01586
Epoch: 10385, Training Loss: 0.01588
Epoch: 10385, Training Loss: 0.01973
Epoch: 10386, Training Loss: 0.01492
Epoch: 10386, Training Loss: 0.01586
Epoch: 10386, Training Loss: 0.01588
Epoch: 10386, Training Loss: 0.01973
Epoch: 10387, Training Loss: 0.01492
Epoch: 10387, Training Loss: 0.01585
Epoch: 10387, Training Loss: 0.01588
Epoch: 10387, Training Loss: 0.01973
Epoch: 10388, Training Loss: 0.01492
Epoch: 10388, Training Loss: 0.01585
Epoch: 10388, Training Loss: 0.01588
Epoch: 10388, Training Loss: 0.01973
Epoch: 10389, Training Loss: 0.01492
Epoch: 10389, Training Loss: 0.01585
Epoch: 10389, Training Loss: 0.01588
Epoch: 10389, Training Loss: 0.01972
Epoch: 10390, Training Loss: 0.01492
Epoch: 10390, Training Loss: 0.01585
Epoch: 10390, Training Loss: 0.01588
Epoch: 10390, Training Loss: 0.01972
Epoch: 10391, Training Loss: 0.01492
Epoch: 10391, Training Loss: 0.01585
Epoch: 10391, Training Loss: 0.01588
Epoch: 10391, Training Loss: 0.01972
Epoch: 10392, Training Loss: 0.01492
Epoch: 10392, Training Loss: 0.01585
Epoch: 10392, Training Loss: 0.01587
Epoch: 10392, Training Loss: 0.01972
Epoch: 10393, Training Loss: 0.01491
Epoch: 10393, Training Loss: 0.01585
Epoch: 10393, Training Loss: 0.01587
Epoch: 10393, Training Loss: 0.01972
Epoch: 10394, Training Loss: 0.01491
Epoch: 10394, Training Loss: 0.01585
Epoch: 10394, Training Loss: 0.01587
Epoch: 10394, Training Loss: 0.01972
Epoch: 10395, Training Loss: 0.01491
Epoch: 10395, Training Loss: 0.01585
Epoch: 10395, Training Loss: 0.01587
Epoch: 10395, Training Loss: 0.01972
Epoch: 10396, Training Loss: 0.01491
Epoch: 10396, Training Loss: 0.01585
Epoch: 10396, Training Loss: 0.01587
Epoch: 10396, Training Loss: 0.01972
Epoch: 10397, Training Loss: 0.01491
Epoch: 10397, Training Loss: 0.01584
Epoch: 10397, Training Loss: 0.01587
Epoch: 10397, Training Loss: 0.01971
Epoch: 10398, Training Loss: 0.01491
Epoch: 10398, Training Loss: 0.01584
Epoch: 10398, Training Loss: 0.01587
Epoch: 10398, Training Loss: 0.01971
Epoch: 10399, Training Loss: 0.01491
Epoch: 10399, Training Loss: 0.01584
Epoch: 10399, Training Loss: 0.01587
Epoch: 10399, Training Loss: 0.01971
Epoch: 10400, Training Loss: 0.01491
Epoch: 10400, Training Loss: 0.01584
Epoch: 10400, Training Loss: 0.01587
Epoch: 10400, Training Loss: 0.01971
Epoch: 10401, Training Loss: 0.01491
Epoch: 10401, Training Loss: 0.01584
Epoch: 10401, Training Loss: 0.01586
Epoch: 10401, Training Loss: 0.01971
Epoch: 10402, Training Loss: 0.01491
Epoch: 10402, Training Loss: 0.01584
Epoch: 10402, Training Loss: 0.01586
Epoch: 10402, Training Loss: 0.01971
Epoch: 10403, Training Loss: 0.01491
Epoch: 10403, Training Loss: 0.01584
Epoch: 10403, Training Loss: 0.01586
Epoch: 10403, Training Loss: 0.01971
Epoch: 10404, Training Loss: 0.01490
Epoch: 10404, Training Loss: 0.01584
Epoch: 10404, Training Loss: 0.01586
Epoch: 10404, Training Loss: 0.01971
Epoch: 10405, Training Loss: 0.01490
Epoch: 10405, Training Loss: 0.01584
Epoch: 10405, Training Loss: 0.01586
Epoch: 10405, Training Loss: 0.01970
Epoch: 10406, Training Loss: 0.01490
Epoch: 10406, Training Loss: 0.01584
Epoch: 10406, Training Loss: 0.01586
Epoch: 10406, Training Loss: 0.01970
Epoch: 10407, Training Loss: 0.01490
Epoch: 10407, Training Loss: 0.01583
Epoch: 10407, Training Loss: 0.01586
Epoch: 10407, Training Loss: 0.01970
Epoch: 10408, Training Loss: 0.01490
Epoch: 10408, Training Loss: 0.01583
Epoch: 10408, Training Loss: 0.01586
Epoch: 10408, Training Loss: 0.01970
Epoch: 10409, Training Loss: 0.01490
Epoch: 10409, Training Loss: 0.01583
Epoch: 10409, Training Loss: 0.01586
Epoch: 10409, Training Loss: 0.01970
Epoch: 10410, Training Loss: 0.01490
Epoch: 10410, Training Loss: 0.01583
Epoch: 10410, Training Loss: 0.01586
Epoch: 10410, Training Loss: 0.01970
Epoch: 10411, Training Loss: 0.01490
Epoch: 10411, Training Loss: 0.01583
Epoch: 10411, Training Loss: 0.01585
Epoch: 10411, Training Loss: 0.01970
Epoch: 10412, Training Loss: 0.01490
Epoch: 10412, Training Loss: 0.01583
Epoch: 10412, Training Loss: 0.01585
Epoch: 10412, Training Loss: 0.01969
Epoch: 10413, Training Loss: 0.01490
Epoch: 10413, Training Loss: 0.01583
Epoch: 10413, Training Loss: 0.01585
Epoch: 10413, Training Loss: 0.01969
Epoch: 10414, Training Loss: 0.01490
Epoch: 10414, Training Loss: 0.01583
Epoch: 10414, Training Loss: 0.01585
Epoch: 10414, Training Loss: 0.01969
Epoch: 10415, Training Loss: 0.01489
Epoch: 10415, Training Loss: 0.01583
Epoch: 10415, Training Loss: 0.01585
Epoch: 10415, Training Loss: 0.01969
Epoch: 10416, Training Loss: 0.01489
Epoch: 10416, Training Loss: 0.01582
Epoch: 10416, Training Loss: 0.01585
Epoch: 10416, Training Loss: 0.01969
Epoch: 10417, Training Loss: 0.01489
Epoch: 10417, Training Loss: 0.01582
Epoch: 10417, Training Loss: 0.01585
Epoch: 10417, Training Loss: 0.01969
Epoch: 10418, Training Loss: 0.01489
Epoch: 10418, Training Loss: 0.01582
Epoch: 10418, Training Loss: 0.01585
Epoch: 10418, Training Loss: 0.01969
Epoch: 10419, Training Loss: 0.01489
Epoch: 10419, Training Loss: 0.01582
Epoch: 10419, Training Loss: 0.01585
Epoch: 10419, Training Loss: 0.01969
Epoch: 10420, Training Loss: 0.01489
Epoch: 10420, Training Loss: 0.01582
Epoch: 10420, Training Loss: 0.01585
Epoch: 10420, Training Loss: 0.01968
Epoch: 10421, Training Loss: 0.01489
Epoch: 10421, Training Loss: 0.01582
Epoch: 10421, Training Loss: 0.01584
Epoch: 10421, Training Loss: 0.01968
Epoch: 10422, Training Loss: 0.01489
Epoch: 10422, Training Loss: 0.01582
Epoch: 10422, Training Loss: 0.01584
Epoch: 10422, Training Loss: 0.01968
Epoch: 10423, Training Loss: 0.01489
Epoch: 10423, Training Loss: 0.01582
Epoch: 10423, Training Loss: 0.01584
Epoch: 10423, Training Loss: 0.01968
Epoch: 10424, Training Loss: 0.01489
Epoch: 10424, Training Loss: 0.01582
Epoch: 10424, Training Loss: 0.01584
Epoch: 10424, Training Loss: 0.01968
Epoch: 10425, Training Loss: 0.01488
Epoch: 10425, Training Loss: 0.01582
Epoch: 10425, Training Loss: 0.01584
Epoch: 10425, Training Loss: 0.01968
Epoch: 10426, Training Loss: 0.01488
Epoch: 10426, Training Loss: 0.01581
Epoch: 10426, Training Loss: 0.01584
Epoch: 10426, Training Loss: 0.01968
Epoch: 10427, Training Loss: 0.01488
Epoch: 10427, Training Loss: 0.01581
Epoch: 10427, Training Loss: 0.01584
Epoch: 10427, Training Loss: 0.01967
Epoch: 10428, Training Loss: 0.01488
Epoch: 10428, Training Loss: 0.01581
Epoch: 10428, Training Loss: 0.01584
Epoch: 10428, Training Loss: 0.01967
...
Epoch: 10671, Training Loss: 0.01466
Epoch: 10671, Training Loss: 0.01557
Epoch: 10671, Training Loss: 0.01559
Epoch: 10671, Training Loss: 0.01936
Epoch: 10672, Training Loss: 0.01466
Epoch: 10672, Training Loss: 0.01557
Epoch: 10672, Training Loss: 0.01559
Epoch: 10672, Training Loss: 0.01936
Epoch: 10673, Training Loss: 0.01466
Epoch: 10673, Training Loss: 0.01557
Epoch: 10673, Training Loss: 0.01559
Epoch: 10673, Training Loss: 0.01936
Epoch: 10674, Training Loss: 0.01466
Epoch: 10674, Training Loss: 0.01556
Epoch: 10674, Training Loss: 0.01559
Epoch: 10674, Training Loss: 0.01936
Epoch: 10675, Training Loss: 0.01466
Epoch: 10675, Training Loss: 0.01556
Epoch: 10675, Training Loss: 0.01559
Epoch: 10675, Training Loss: 0.01936
Epoch: 10676, Training Loss: 0.01466
Epoch: 10676, Training Loss: 0.01556
Epoch: 10676, Training Loss: 0.01559
Epoch: 10676, Training Loss: 0.01935
Epoch: 10677, Training Loss: 0.01466
Epoch: 10677, Training Loss: 0.01556
Epoch: 10677, Training Loss: 0.01559
Epoch: 10677, Training Loss: 0.01935
Epoch: 10678, Training Loss: 0.01465
Epoch: 10678, Training Loss: 0.01556
Epoch: 10678, Training Loss: 0.01558
Epoch: 10678, Training Loss: 0.01935
Epoch: 10679, Training Loss: 0.01465
Epoch: 10679, Training Loss: 0.01556
Epoch: 10679, Training Loss: 0.01558
Epoch: 10679, Training Loss: 0.01935
Epoch: 10680, Training Loss: 0.01465
Epoch: 10680, Training Loss: 0.01556
Epoch: 10680, Training Loss: 0.01558
Epoch: 10680, Training Loss: 0.01935
Epoch: 10681, Training Loss: 0.01465
Epoch: 10681, Training Loss: 0.01556
Epoch: 10681, Training Loss: 0.01558
Epoch: 10681, Training Loss: 0.01935
Epoch: 10682, Training Loss: 0.01465
Epoch: 10682, Training Loss: 0.01556
Epoch: 10682, Training Loss: 0.01558
Epoch: 10682, Training Loss: 0.01935
Epoch: 10683, Training Loss: 0.01465
Epoch: 10683, Training Loss: 0.01556
Epoch: 10683, Training Loss: 0.01558
Epoch: 10683, Training Loss: 0.01935
Epoch: 10684, Training Loss: 0.01465
Epoch: 10684, Training Loss: 0.01555
Epoch: 10684, Training Loss: 0.01558
Epoch: 10684, Training Loss: 0.01934
Epoch: 10685, Training Loss: 0.01465
Epoch: 10685, Training Loss: 0.01555
Epoch: 10685, Training Loss: 0.01558
Epoch: 10685, Training Loss: 0.01934
Epoch: 10686, Training Loss: 0.01465
Epoch: 10686, Training Loss: 0.01555
Epoch: 10686, Training Loss: 0.01558
Epoch: 10686, Training Loss: 0.01934
Epoch: 10687, Training Loss: 0.01465
Epoch: 10687, Training Loss: 0.01555
Epoch: 10687, Training Loss: 0.01558
Epoch: 10687, Training Loss: 0.01934
Epoch: 10688, Training Loss: 0.01465
Epoch: 10688, Training Loss: 0.01555
Epoch: 10688, Training Loss: 0.01557
Epoch: 10688, Training Loss: 0.01934
Epoch: 10689, Training Loss: 0.01464
Epoch: 10689, Training Loss: 0.01555
Epoch: 10689, Training Loss: 0.01557
Epoch: 10689, Training Loss: 0.01934
Epoch: 10690, Training Loss: 0.01464
Epoch: 10690, Training Loss: 0.01555
Epoch: 10690, Training Loss: 0.01557
Epoch: 10690, Training Loss: 0.01934
Epoch: 10691, Training Loss: 0.01464
Epoch: 10691, Training Loss: 0.01555
Epoch: 10691, Training Loss: 0.01557
Epoch: 10691, Training Loss: 0.01934
Epoch: 10692, Training Loss: 0.01464
Epoch: 10692, Training Loss: 0.01555
Epoch: 10692, Training Loss: 0.01557
Epoch: 10692, Training Loss: 0.01933
Epoch: 10693, Training Loss: 0.01464
Epoch: 10693, Training Loss: 0.01555
Epoch: 10693, Training Loss: 0.01557
Epoch: 10693, Training Loss: 0.01933
Epoch: 10694, Training Loss: 0.01464
Epoch: 10694, Training Loss: 0.01554
Epoch: 10694, Training Loss: 0.01557
Epoch: 10694, Training Loss: 0.01933
Epoch: 10695, Training Loss: 0.01464
Epoch: 10695, Training Loss: 0.01554
Epoch: 10695, Training Loss: 0.01557
Epoch: 10695, Training Loss: 0.01933
Epoch: 10696, Training Loss: 0.01464
Epoch: 10696, Training Loss: 0.01554
Epoch: 10696, Training Loss: 0.01557
Epoch: 10696, Training Loss: 0.01933
Epoch: 10697, Training Loss: 0.01464
Epoch: 10697, Training Loss: 0.01554
Epoch: 10697, Training Loss: 0.01557
Epoch: 10697, Training Loss: 0.01933
Epoch: 10698, Training Loss: 0.01464
Epoch: 10698, Training Loss: 0.01554
Epoch: 10698, Training Loss: 0.01556
Epoch: 10698, Training Loss: 0.01933
Epoch: 10699, Training Loss: 0.01464
Epoch: 10699, Training Loss: 0.01554
Epoch: 10699, Training Loss: 0.01556
Epoch: 10699, Training Loss: 0.01933
Epoch: 10700, Training Loss: 0.01463
Epoch: 10700, Training Loss: 0.01554
Epoch: 10700, Training Loss: 0.01556
Epoch: 10700, Training Loss: 0.01932
Epoch: 10701, Training Loss: 0.01463
Epoch: 10701, Training Loss: 0.01554
Epoch: 10701, Training Loss: 0.01556
Epoch: 10701, Training Loss: 0.01932
Epoch: 10702, Training Loss: 0.01463
Epoch: 10702, Training Loss: 0.01554
Epoch: 10702, Training Loss: 0.01556
Epoch: 10702, Training Loss: 0.01932
Epoch: 10703, Training Loss: 0.01463
Epoch: 10703, Training Loss: 0.01554
Epoch: 10703, Training Loss: 0.01556
Epoch: 10703, Training Loss: 0.01932
Epoch: 10704, Training Loss: 0.01463
Epoch: 10704, Training Loss: 0.01553
Epoch: 10704, Training Loss: 0.01556
Epoch: 10704, Training Loss: 0.01932
Epoch: 10705, Training Loss: 0.01463
Epoch: 10705, Training Loss: 0.01553
Epoch: 10705, Training Loss: 0.01556
Epoch: 10705, Training Loss: 0.01932
Epoch: 10706, Training Loss: 0.01463
Epoch: 10706, Training Loss: 0.01553
Epoch: 10706, Training Loss: 0.01556
Epoch: 10706, Training Loss: 0.01932
Epoch: 10707, Training Loss: 0.01463
Epoch: 10707, Training Loss: 0.01553
Epoch: 10707, Training Loss: 0.01556
Epoch: 10707, Training Loss: 0.01932
Epoch: 10708, Training Loss: 0.01463
Epoch: 10708, Training Loss: 0.01553
Epoch: 10708, Training Loss: 0.01555
Epoch: 10708, Training Loss: 0.01931
Epoch: 10709, Training Loss: 0.01463
Epoch: 10709, Training Loss: 0.01553
Epoch: 10709, Training Loss: 0.01555
Epoch: 10709, Training Loss: 0.01931
Epoch: 10710, Training Loss: 0.01463
Epoch: 10710, Training Loss: 0.01553
Epoch: 10710, Training Loss: 0.01555
Epoch: 10710, Training Loss: 0.01931
Epoch: 10711, Training Loss: 0.01462
Epoch: 10711, Training Loss: 0.01553
Epoch: 10711, Training Loss: 0.01555
Epoch: 10711, Training Loss: 0.01931
Epoch: 10712, Training Loss: 0.01462
Epoch: 10712, Training Loss: 0.01553
Epoch: 10712, Training Loss: 0.01555
Epoch: 10712, Training Loss: 0.01931
Epoch: 10713, Training Loss: 0.01462
Epoch: 10713, Training Loss: 0.01553
Epoch: 10713, Training Loss: 0.01555
Epoch: 10713, Training Loss: 0.01931
Epoch: 10714, Training Loss: 0.01462
Epoch: 10714, Training Loss: 0.01553
Epoch: 10714, Training Loss: 0.01555
Epoch: 10714, Training Loss: 0.01931
Epoch: 10715, Training Loss: 0.01462
Epoch: 10715, Training Loss: 0.01552
Epoch: 10715, Training Loss: 0.01555
Epoch: 10715, Training Loss: 0.01931
Epoch: 10716, Training Loss: 0.01462
Epoch: 10716, Training Loss: 0.01552
Epoch: 10716, Training Loss: 0.01555
Epoch: 10716, Training Loss: 0.01930
Epoch: 10717, Training Loss: 0.01462
Epoch: 10717, Training Loss: 0.01552
Epoch: 10717, Training Loss: 0.01555
Epoch: 10717, Training Loss: 0.01930
Epoch: 10718, Training Loss: 0.01462
Epoch: 10718, Training Loss: 0.01552
Epoch: 10718, Training Loss: 0.01554
Epoch: 10718, Training Loss: 0.01930
Epoch: 10719, Training Loss: 0.01462
Epoch: 10719, Training Loss: 0.01552
Epoch: 10719, Training Loss: 0.01554
Epoch: 10719, Training Loss: 0.01930
Epoch: 10720, Training Loss: 0.01462
Epoch: 10720, Training Loss: 0.01552
Epoch: 10720, Training Loss: 0.01554
Epoch: 10720, Training Loss: 0.01930
Epoch: 10721, Training Loss: 0.01462
Epoch: 10721, Training Loss: 0.01552
Epoch: 10721, Training Loss: 0.01554
Epoch: 10721, Training Loss: 0.01930
Epoch: 10722, Training Loss: 0.01462
Epoch: 10722, Training Loss: 0.01552
Epoch: 10722, Training Loss: 0.01554
Epoch: 10722, Training Loss: 0.01930
Epoch: 10723, Training Loss: 0.01461
Epoch: 10723, Training Loss: 0.01552
Epoch: 10723, Training Loss: 0.01554
Epoch: 10723, Training Loss: 0.01930
Epoch: 10724, Training Loss: 0.01461
Epoch: 10724, Training Loss: 0.01552
Epoch: 10724, Training Loss: 0.01554
Epoch: 10724, Training Loss: 0.01929
Epoch: 10725, Training Loss: 0.01461
Epoch: 10725, Training Loss: 0.01551
Epoch: 10725, Training Loss: 0.01554
Epoch: 10725, Training Loss: 0.01929
Epoch: 10726, Training Loss: 0.01461
Epoch: 10726, Training Loss: 0.01551
Epoch: 10726, Training Loss: 0.01554
Epoch: 10726, Training Loss: 0.01929
Epoch: 10727, Training Loss: 0.01461
Epoch: 10727, Training Loss: 0.01551
Epoch: 10727, Training Loss: 0.01554
Epoch: 10727, Training Loss: 0.01929
Epoch: 10728, Training Loss: 0.01461
Epoch: 10728, Training Loss: 0.01551
Epoch: 10728, Training Loss: 0.01554
Epoch: 10728, Training Loss: 0.01929
Epoch: 10729, Training Loss: 0.01461
Epoch: 10729, Training Loss: 0.01551
Epoch: 10729, Training Loss: 0.01553
Epoch: 10729, Training Loss: 0.01929
Epoch: 10730, Training Loss: 0.01461
Epoch: 10730, Training Loss: 0.01551
Epoch: 10730, Training Loss: 0.01553
Epoch: 10730, Training Loss: 0.01929
Epoch: 10731, Training Loss: 0.01461
Epoch: 10731, Training Loss: 0.01551
Epoch: 10731, Training Loss: 0.01553
Epoch: 10731, Training Loss: 0.01929
Epoch: 10732, Training Loss: 0.01461
Epoch: 10732, Training Loss: 0.01551
Epoch: 10732, Training Loss: 0.01553
Epoch: 10732, Training Loss: 0.01928
Epoch: 10733, Training Loss: 0.01461
Epoch: 10733, Training Loss: 0.01551
Epoch: 10733, Training Loss: 0.01553
Epoch: 10733, Training Loss: 0.01928
Epoch: 10734, Training Loss: 0.01460
Epoch: 10734, Training Loss: 0.01551
Epoch: 10734, Training Loss: 0.01553
Epoch: 10734, Training Loss: 0.01928
Epoch: 10735, Training Loss: 0.01460
Epoch: 10735, Training Loss: 0.01550
Epoch: 10735, Training Loss: 0.01553
Epoch: 10735, Training Loss: 0.01928
Epoch: 10736, Training Loss: 0.01460
Epoch: 10736, Training Loss: 0.01550
Epoch: 10736, Training Loss: 0.01553
Epoch: 10736, Training Loss: 0.01928
Epoch: 10737, Training Loss: 0.01460
Epoch: 10737, Training Loss: 0.01550
Epoch: 10737, Training Loss: 0.01553
Epoch: 10737, Training Loss: 0.01928
Epoch: 10738, Training Loss: 0.01460
Epoch: 10738, Training Loss: 0.01550
Epoch: 10738, Training Loss: 0.01553
Epoch: 10738, Training Loss: 0.01928
Epoch: 10739, Training Loss: 0.01460
Epoch: 10739, Training Loss: 0.01550
Epoch: 10739, Training Loss: 0.01552
Epoch: 10739, Training Loss: 0.01928
Epoch: 10740, Training Loss: 0.01460
Epoch: 10740, Training Loss: 0.01550
Epoch: 10740, Training Loss: 0.01552
Epoch: 10740, Training Loss: 0.01927
Epoch: 10741, Training Loss: 0.01460
Epoch: 10741, Training Loss: 0.01550
Epoch: 10741, Training Loss: 0.01552
Epoch: 10741, Training Loss: 0.01927
Epoch: 10742, Training Loss: 0.01460
Epoch: 10742, Training Loss: 0.01550
Epoch: 10742, Training Loss: 0.01552
Epoch: 10742, Training Loss: 0.01927
Epoch: 10743, Training Loss: 0.01460
Epoch: 10743, Training Loss: 0.01550
Epoch: 10743, Training Loss: 0.01552
Epoch: 10743, Training Loss: 0.01927
Epoch: 10744, Training Loss: 0.01460
Epoch: 10744, Training Loss: 0.01550
Epoch: 10744, Training Loss: 0.01552
Epoch: 10744, Training Loss: 0.01927
Epoch: 10745, Training Loss: 0.01459
Epoch: 10745, Training Loss: 0.01549
Epoch: 10745, Training Loss: 0.01552
Epoch: 10745, Training Loss: 0.01927
Epoch: 10746, Training Loss: 0.01459
Epoch: 10746, Training Loss: 0.01549
Epoch: 10746, Training Loss: 0.01552
Epoch: 10746, Training Loss: 0.01927
Epoch: 10747, Training Loss: 0.01459
Epoch: 10747, Training Loss: 0.01549
Epoch: 10747, Training Loss: 0.01552
Epoch: 10747, Training Loss: 0.01927
Epoch: 10748, Training Loss: 0.01459
Epoch: 10748, Training Loss: 0.01549
Epoch: 10748, Training Loss: 0.01552
Epoch: 10748, Training Loss: 0.01926
Epoch: 10749, Training Loss: 0.01459
Epoch: 10749, Training Loss: 0.01549
Epoch: 10749, Training Loss: 0.01551
Epoch: 10749, Training Loss: 0.01926
Epoch: 10750, Training Loss: 0.01459
Epoch: 10750, Training Loss: 0.01549
Epoch: 10750, Training Loss: 0.01551
Epoch: 10750, Training Loss: 0.01926
Epoch: 10751, Training Loss: 0.01459
Epoch: 10751, Training Loss: 0.01549
Epoch: 10751, Training Loss: 0.01551
Epoch: 10751, Training Loss: 0.01926
Epoch: 10752, Training Loss: 0.01459
Epoch: 10752, Training Loss: 0.01549
Epoch: 10752, Training Loss: 0.01551
Epoch: 10752, Training Loss: 0.01926
Epoch: 10753, Training Loss: 0.01459
Epoch: 10753, Training Loss: 0.01549
Epoch: 10753, Training Loss: 0.01551
Epoch: 10753, Training Loss: 0.01926
Epoch: 10754, Training Loss: 0.01459
Epoch: 10754, Training Loss: 0.01549
Epoch: 10754, Training Loss: 0.01551
Epoch: 10754, Training Loss: 0.01926
Epoch: 10755, Training Loss: 0.01459
Epoch: 10755, Training Loss: 0.01549
Epoch: 10755, Training Loss: 0.01551
Epoch: 10755, Training Loss: 0.01926
Epoch: 10756, Training Loss: 0.01459
Epoch: 10756, Training Loss: 0.01548
Epoch: 10756, Training Loss: 0.01551
Epoch: 10756, Training Loss: 0.01926
Epoch: 10757, Training Loss: 0.01458
Epoch: 10757, Training Loss: 0.01548
Epoch: 10757, Training Loss: 0.01551
Epoch: 10757, Training Loss: 0.01925
Epoch: 10758, Training Loss: 0.01458
Epoch: 10758, Training Loss: 0.01548
Epoch: 10758, Training Loss: 0.01551
Epoch: 10758, Training Loss: 0.01925
Epoch: 10759, Training Loss: 0.01458
Epoch: 10759, Training Loss: 0.01548
Epoch: 10759, Training Loss: 0.01550
Epoch: 10759, Training Loss: 0.01925
Epoch: 10760, Training Loss: 0.01458
Epoch: 10760, Training Loss: 0.01548
Epoch: 10760, Training Loss: 0.01550
Epoch: 10760, Training Loss: 0.01925
Epoch: 10761, Training Loss: 0.01458
Epoch: 10761, Training Loss: 0.01548
Epoch: 10761, Training Loss: 0.01550
Epoch: 10761, Training Loss: 0.01925
Epoch: 10762, Training Loss: 0.01458
Epoch: 10762, Training Loss: 0.01548
Epoch: 10762, Training Loss: 0.01550
Epoch: 10762, Training Loss: 0.01925
Epoch: 10763, Training Loss: 0.01458
Epoch: 10763, Training Loss: 0.01548
Epoch: 10763, Training Loss: 0.01550
Epoch: 10763, Training Loss: 0.01925
Epoch: 10764, Training Loss: 0.01458
Epoch: 10764, Training Loss: 0.01548
Epoch: 10764, Training Loss: 0.01550
Epoch: 10764, Training Loss: 0.01925
Epoch: 10765, Training Loss: 0.01458
Epoch: 10765, Training Loss: 0.01548
Epoch: 10765, Training Loss: 0.01550
Epoch: 10765, Training Loss: 0.01924
Epoch: 10766, Training Loss: 0.01458
Epoch: 10766, Training Loss: 0.01547
Epoch: 10766, Training Loss: 0.01550
Epoch: 10766, Training Loss: 0.01924
Epoch: 10767, Training Loss: 0.01458
Epoch: 10767, Training Loss: 0.01547
Epoch: 10767, Training Loss: 0.01550
Epoch: 10767, Training Loss: 0.01924
Epoch: 10768, Training Loss: 0.01457
Epoch: 10768, Training Loss: 0.01547
Epoch: 10768, Training Loss: 0.01550
Epoch: 10768, Training Loss: 0.01924
Epoch: 10769, Training Loss: 0.01457
Epoch: 10769, Training Loss: 0.01547
Epoch: 10769, Training Loss: 0.01550
Epoch: 10769, Training Loss: 0.01924
Epoch: 10770, Training Loss: 0.01457
Epoch: 10770, Training Loss: 0.01547
Epoch: 10770, Training Loss: 0.01549
Epoch: 10770, Training Loss: 0.01924
Epoch: 10771, Training Loss: 0.01457
Epoch: 10771, Training Loss: 0.01547
Epoch: 10771, Training Loss: 0.01549
Epoch: 10771, Training Loss: 0.01924
Epoch: 10772, Training Loss: 0.01457
Epoch: 10772, Training Loss: 0.01547
Epoch: 10772, Training Loss: 0.01549
Epoch: 10772, Training Loss: 0.01924
Epoch: 10773, Training Loss: 0.01457
Epoch: 10773, Training Loss: 0.01547
Epoch: 10773, Training Loss: 0.01549
Epoch: 10773, Training Loss: 0.01923
Epoch: 10774, Training Loss: 0.01457
Epoch: 10774, Training Loss: 0.01547
Epoch: 10774, Training Loss: 0.01549
Epoch: 10774, Training Loss: 0.01923
Epoch: 10775, Training Loss: 0.01457
Epoch: 10775, Training Loss: 0.01547
Epoch: 10775, Training Loss: 0.01549
Epoch: 10775, Training Loss: 0.01923
Epoch: 10776, Training Loss: 0.01457
Epoch: 10776, Training Loss: 0.01546
Epoch: 10776, Training Loss: 0.01549
Epoch: 10776, Training Loss: 0.01923
Epoch: 10777, Training Loss: 0.01457
Epoch: 10777, Training Loss: 0.01546
Epoch: 10777, Training Loss: 0.01549
Epoch: 10777, Training Loss: 0.01923
Epoch: 10778, Training Loss: 0.01457
Epoch: 10778, Training Loss: 0.01546
Epoch: 10778, Training Loss: 0.01549
Epoch: 10778, Training Loss: 0.01923
Epoch: 10779, Training Loss: 0.01456
Epoch: 10779, Training Loss: 0.01546
Epoch: 10779, Training Loss: 0.01549
Epoch: 10779, Training Loss: 0.01923
Epoch: 10780, Training Loss: 0.01456
Epoch: 10780, Training Loss: 0.01546
Epoch: 10780, Training Loss: 0.01548
Epoch: 10780, Training Loss: 0.01923
Epoch: 10781, Training Loss: 0.01456
Epoch: 10781, Training Loss: 0.01546
Epoch: 10781, Training Loss: 0.01548
Epoch: 10781, Training Loss: 0.01922
Epoch: 10782, Training Loss: 0.01456
Epoch: 10782, Training Loss: 0.01546
Epoch: 10782, Training Loss: 0.01548
Epoch: 10782, Training Loss: 0.01922
Epoch: 10783, Training Loss: 0.01456
Epoch: 10783, Training Loss: 0.01546
Epoch: 10783, Training Loss: 0.01548
Epoch: 10783, Training Loss: 0.01922
Epoch: 10784, Training Loss: 0.01456
Epoch: 10784, Training Loss: 0.01546
Epoch: 10784, Training Loss: 0.01548
Epoch: 10784, Training Loss: 0.01922
Epoch: 10785, Training Loss: 0.01456
Epoch: 10785, Training Loss: 0.01546
Epoch: 10785, Training Loss: 0.01548
Epoch: 10785, Training Loss: 0.01922
Epoch: 10786, Training Loss: 0.01456
Epoch: 10786, Training Loss: 0.01546
Epoch: 10786, Training Loss: 0.01548
Epoch: 10786, Training Loss: 0.01922
Epoch: 10787, Training Loss: 0.01456
Epoch: 10787, Training Loss: 0.01545
Epoch: 10787, Training Loss: 0.01548
Epoch: 10787, Training Loss: 0.01922
Epoch: 10788, Training Loss: 0.01456
Epoch: 10788, Training Loss: 0.01545
Epoch: 10788, Training Loss: 0.01548
Epoch: 10788, Training Loss: 0.01922
Epoch: 10789, Training Loss: 0.01456
Epoch: 10789, Training Loss: 0.01545
Epoch: 10789, Training Loss: 0.01548
Epoch: 10789, Training Loss: 0.01921
Epoch: 10790, Training Loss: 0.01456
Epoch: 10790, Training Loss: 0.01545
Epoch: 10790, Training Loss: 0.01547
Epoch: 10790, Training Loss: 0.01921
Epoch: 10791, Training Loss: 0.01455
Epoch: 10791, Training Loss: 0.01545
Epoch: 10791, Training Loss: 0.01547
Epoch: 10791, Training Loss: 0.01921
Epoch: 10792, Training Loss: 0.01455
Epoch: 10792, Training Loss: 0.01545
Epoch: 10792, Training Loss: 0.01547
Epoch: 10792, Training Loss: 0.01921
Epoch: 10793, Training Loss: 0.01455
Epoch: 10793, Training Loss: 0.01545
Epoch: 10793, Training Loss: 0.01547
Epoch: 10793, Training Loss: 0.01921
Epoch: 10794, Training Loss: 0.01455
Epoch: 10794, Training Loss: 0.01545
Epoch: 10794, Training Loss: 0.01547
Epoch: 10794, Training Loss: 0.01921
Epoch: 10795, Training Loss: 0.01455
Epoch: 10795, Training Loss: 0.01545
Epoch: 10795, Training Loss: 0.01547
Epoch: 10795, Training Loss: 0.01921
Epoch: 10796, Training Loss: 0.01455
Epoch: 10796, Training Loss: 0.01545
Epoch: 10796, Training Loss: 0.01547
Epoch: 10796, Training Loss: 0.01921
Epoch: 10797, Training Loss: 0.01455
Epoch: 10797, Training Loss: 0.01544
Epoch: 10797, Training Loss: 0.01547
Epoch: 10797, Training Loss: 0.01920
Epoch: 10798, Training Loss: 0.01455
Epoch: 10798, Training Loss: 0.01544
Epoch: 10798, Training Loss: 0.01547
Epoch: 10798, Training Loss: 0.01920
Epoch: 10799, Training Loss: 0.01455
Epoch: 10799, Training Loss: 0.01544
Epoch: 10799, Training Loss: 0.01547
Epoch: 10799, Training Loss: 0.01920
Epoch: 10800, Training Loss: 0.01455
Epoch: 10800, Training Loss: 0.01544
Epoch: 10800, Training Loss: 0.01547
Epoch: 10800, Training Loss: 0.01920
Epoch: 10801, Training Loss: 0.01455
Epoch: 10801, Training Loss: 0.01544
Epoch: 10801, Training Loss: 0.01546
Epoch: 10801, Training Loss: 0.01920
Epoch: 10802, Training Loss: 0.01454
Epoch: 10802, Training Loss: 0.01544
Epoch: 10802, Training Loss: 0.01546
Epoch: 10802, Training Loss: 0.01920
Epoch: 10803, Training Loss: 0.01454
Epoch: 10803, Training Loss: 0.01544
Epoch: 10803, Training Loss: 0.01546
Epoch: 10803, Training Loss: 0.01920
Epoch: 10804, Training Loss: 0.01454
Epoch: 10804, Training Loss: 0.01544
Epoch: 10804, Training Loss: 0.01546
Epoch: 10804, Training Loss: 0.01920
Epoch: 10805, Training Loss: 0.01454
Epoch: 10805, Training Loss: 0.01544
Epoch: 10805, Training Loss: 0.01546
Epoch: 10805, Training Loss: 0.01919
Epoch: 10806, Training Loss: 0.01454
Epoch: 10806, Training Loss: 0.01544
Epoch: 10806, Training Loss: 0.01546
Epoch: 10806, Training Loss: 0.01919
Epoch: 10807, Training Loss: 0.01454
Epoch: 10807, Training Loss: 0.01543
Epoch: 10807, Training Loss: 0.01546
Epoch: 10807, Training Loss: 0.01919
Epoch: 10808, Training Loss: 0.01454
Epoch: 10808, Training Loss: 0.01543
Epoch: 10808, Training Loss: 0.01546
Epoch: 10808, Training Loss: 0.01919
Epoch: 10809, Training Loss: 0.01454
Epoch: 10809, Training Loss: 0.01543
Epoch: 10809, Training Loss: 0.01546
Epoch: 10809, Training Loss: 0.01919
Epoch: 10810, Training Loss: 0.01454
Epoch: 10810, Training Loss: 0.01543
Epoch: 10810, Training Loss: 0.01546
Epoch: 10810, Training Loss: 0.01919
Epoch: 10811, Training Loss: 0.01454
Epoch: 10811, Training Loss: 0.01543
Epoch: 10811, Training Loss: 0.01545
Epoch: 10811, Training Loss: 0.01919
Epoch: 10812, Training Loss: 0.01454
Epoch: 10812, Training Loss: 0.01543
Epoch: 10812, Training Loss: 0.01545
Epoch: 10812, Training Loss: 0.01919
Epoch: 10813, Training Loss: 0.01454
Epoch: 10813, Training Loss: 0.01543
Epoch: 10813, Training Loss: 0.01545
Epoch: 10813, Training Loss: 0.01918
Epoch: 10814, Training Loss: 0.01453
Epoch: 10814, Training Loss: 0.01543
Epoch: 10814, Training Loss: 0.01545
Epoch: 10814, Training Loss: 0.01918
Epoch: 10815, Training Loss: 0.01453
Epoch: 10815, Training Loss: 0.01543
Epoch: 10815, Training Loss: 0.01545
Epoch: 10815, Training Loss: 0.01918
Epoch: 10816, Training Loss: 0.01453
Epoch: 10816, Training Loss: 0.01543
Epoch: 10816, Training Loss: 0.01545
Epoch: 10816, Training Loss: 0.01918
Epoch: 10817, Training Loss: 0.01453
Epoch: 10817, Training Loss: 0.01543
Epoch: 10817, Training Loss: 0.01545
Epoch: 10817, Training Loss: 0.01918
Epoch: 10818, Training Loss: 0.01453
Epoch: 10818, Training Loss: 0.01542
Epoch: 10818, Training Loss: 0.01545
Epoch: 10818, Training Loss: 0.01918
Epoch: 10819, Training Loss: 0.01453
Epoch: 10819, Training Loss: 0.01542
Epoch: 10819, Training Loss: 0.01545
Epoch: 10819, Training Loss: 0.01918
Epoch: 10820, Training Loss: 0.01453
Epoch: 10820, Training Loss: 0.01542
Epoch: 10820, Training Loss: 0.01545
Epoch: 10820, Training Loss: 0.01918
Epoch: 10821, Training Loss: 0.01453
Epoch: 10821, Training Loss: 0.01542
Epoch: 10821, Training Loss: 0.01545
Epoch: 10821, Training Loss: 0.01918
Epoch: 10822, Training Loss: 0.01453
Epoch: 10822, Training Loss: 0.01542
Epoch: 10822, Training Loss: 0.01544
Epoch: 10822, Training Loss: 0.01917
Epoch: 10823, Training Loss: 0.01453
Epoch: 10823, Training Loss: 0.01542
Epoch: 10823, Training Loss: 0.01544
Epoch: 10823, Training Loss: 0.01917
Epoch: 10824, Training Loss: 0.01453
Epoch: 10824, Training Loss: 0.01542
Epoch: 10824, Training Loss: 0.01544
Epoch: 10824, Training Loss: 0.01917
Epoch: 10825, Training Loss: 0.01452
Epoch: 10825, Training Loss: 0.01542
Epoch: 10825, Training Loss: 0.01544
Epoch: 10825, Training Loss: 0.01917
Epoch: 10826, Training Loss: 0.01452
Epoch: 10826, Training Loss: 0.01542
Epoch: 10826, Training Loss: 0.01544
Epoch: 10826, Training Loss: 0.01917
Epoch: 10827, Training Loss: 0.01452
Epoch: 10827, Training Loss: 0.01542
Epoch: 10827, Training Loss: 0.01544
Epoch: 10827, Training Loss: 0.01917
Epoch: 10828, Training Loss: 0.01452
Epoch: 10828, Training Loss: 0.01541
Epoch: 10828, Training Loss: 0.01544
Epoch: 10828, Training Loss: 0.01917
Epoch: 10829, Training Loss: 0.01452
Epoch: 10829, Training Loss: 0.01541
Epoch: 10829, Training Loss: 0.01544
Epoch: 10829, Training Loss: 0.01917
Epoch: 10830, Training Loss: 0.01452
Epoch: 10830, Training Loss: 0.01541
Epoch: 10830, Training Loss: 0.01544
Epoch: 10830, Training Loss: 0.01916
Epoch: 10831, Training Loss: 0.01452
Epoch: 10831, Training Loss: 0.01541
Epoch: 10831, Training Loss: 0.01544
Epoch: 10831, Training Loss: 0.01916
Epoch: 10832, Training Loss: 0.01452
Epoch: 10832, Training Loss: 0.01541
Epoch: 10832, Training Loss: 0.01543
Epoch: 10832, Training Loss: 0.01916
Epoch: 10833, Training Loss: 0.01452
Epoch: 10833, Training Loss: 0.01541
Epoch: 10833, Training Loss: 0.01543
Epoch: 10833, Training Loss: 0.01916
Epoch: 10834, Training Loss: 0.01452
Epoch: 10834, Training Loss: 0.01541
Epoch: 10834, Training Loss: 0.01543
Epoch: 10834, Training Loss: 0.01916
Epoch: 10835, Training Loss: 0.01452
Epoch: 10835, Training Loss: 0.01541
Epoch: 10835, Training Loss: 0.01543
Epoch: 10835, Training Loss: 0.01916
Epoch: 10836, Training Loss: 0.01452
Epoch: 10836, Training Loss: 0.01541
Epoch: 10836, Training Loss: 0.01543
Epoch: 10836, Training Loss: 0.01916
Epoch: 10837, Training Loss: 0.01451
Epoch: 10837, Training Loss: 0.01541
Epoch: 10837, Training Loss: 0.01543
Epoch: 10837, Training Loss: 0.01916
Epoch: 10838, Training Loss: 0.01451
Epoch: 10838, Training Loss: 0.01541
Epoch: 10838, Training Loss: 0.01543
Epoch: 10838, Training Loss: 0.01915
Epoch: 10839, Training Loss: 0.01451
Epoch: 10839, Training Loss: 0.01540
Epoch: 10839, Training Loss: 0.01543
Epoch: 10839, Training Loss: 0.01915
Epoch: 10840, Training Loss: 0.01451
Epoch: 10840, Training Loss: 0.01540
Epoch: 10840, Training Loss: 0.01543
Epoch: 10840, Training Loss: 0.01915
Epoch: 10841, Training Loss: 0.01451
Epoch: 10841, Training Loss: 0.01540
Epoch: 10841, Training Loss: 0.01543
Epoch: 10841, Training Loss: 0.01915
Epoch: 10842, Training Loss: 0.01451
Epoch: 10842, Training Loss: 0.01540
Epoch: 10842, Training Loss: 0.01542
Epoch: 10842, Training Loss: 0.01915
Epoch: 10843, Training Loss: 0.01451
Epoch: 10843, Training Loss: 0.01540
Epoch: 10843, Training Loss: 0.01542
Epoch: 10843, Training Loss: 0.01915
Epoch: 10844, Training Loss: 0.01451
Epoch: 10844, Training Loss: 0.01540
Epoch: 10844, Training Loss: 0.01542
Epoch: 10844, Training Loss: 0.01915
Epoch: 10845, Training Loss: 0.01451
Epoch: 10845, Training Loss: 0.01540
Epoch: 10845, Training Loss: 0.01542
Epoch: 10845, Training Loss: 0.01915
Epoch: 10846, Training Loss: 0.01451
Epoch: 10846, Training Loss: 0.01540
Epoch: 10846, Training Loss: 0.01542
Epoch: 10846, Training Loss: 0.01914
Epoch: 10847, Training Loss: 0.01451
Epoch: 10847, Training Loss: 0.01540
Epoch: 10847, Training Loss: 0.01542
Epoch: 10847, Training Loss: 0.01914
Epoch: 10848, Training Loss: 0.01450
Epoch: 10848, Training Loss: 0.01540
Epoch: 10848, Training Loss: 0.01542
Epoch: 10848, Training Loss: 0.01914
Epoch: 10849, Training Loss: 0.01450
Epoch: 10849, Training Loss: 0.01539
Epoch: 10849, Training Loss: 0.01542
Epoch: 10849, Training Loss: 0.01914
Epoch: 10850, Training Loss: 0.01450
Epoch: 10850, Training Loss: 0.01539
Epoch: 10850, Training Loss: 0.01542
Epoch: 10850, Training Loss: 0.01914
Epoch: 10851, Training Loss: 0.01450
Epoch: 10851, Training Loss: 0.01539
Epoch: 10851, Training Loss: 0.01542
Epoch: 10851, Training Loss: 0.01914
Epoch: 10852, Training Loss: 0.01450
Epoch: 10852, Training Loss: 0.01539
Epoch: 10852, Training Loss: 0.01542
Epoch: 10852, Training Loss: 0.01914
Epoch: 10853, Training Loss: 0.01450
Epoch: 10853, Training Loss: 0.01539
Epoch: 10853, Training Loss: 0.01541
Epoch: 10853, Training Loss: 0.01914
Epoch: 10854, Training Loss: 0.01450
Epoch: 10854, Training Loss: 0.01539
Epoch: 10854, Training Loss: 0.01541
Epoch: 10854, Training Loss: 0.01913
Epoch: 10855, Training Loss: 0.01450
Epoch: 10855, Training Loss: 0.01539
Epoch: 10855, Training Loss: 0.01541
Epoch: 10855, Training Loss: 0.01913
Epoch: 10856, Training Loss: 0.01450
Epoch: 10856, Training Loss: 0.01539
Epoch: 10856, Training Loss: 0.01541
Epoch: 10856, Training Loss: 0.01913
Epoch: 10857, Training Loss: 0.01450
Epoch: 10857, Training Loss: 0.01539
Epoch: 10857, Training Loss: 0.01541
Epoch: 10857, Training Loss: 0.01913
Epoch: 10858, Training Loss: 0.01450
Epoch: 10858, Training Loss: 0.01539
Epoch: 10858, Training Loss: 0.01541
Epoch: 10858, Training Loss: 0.01913
Epoch: 10859, Training Loss: 0.01450
Epoch: 10859, Training Loss: 0.01539
Epoch: 10859, Training Loss: 0.01541
Epoch: 10859, Training Loss: 0.01913
Epoch: 10860, Training Loss: 0.01449
Epoch: 10860, Training Loss: 0.01538
Epoch: 10860, Training Loss: 0.01541
Epoch: 10860, Training Loss: 0.01913
[... output truncated: one loss is printed per training sample (four lines per epoch); across epochs 10860-11104 the four per-sample losses decrease slowly, from roughly 0.0145/0.0154/0.0154/0.0191 to roughly 0.0143/0.0152/0.0152/0.0188 ...]
Epoch: 11104, Training Loss: 0.01429
Epoch: 11104, Training Loss: 0.01516
Epoch: 11104, Training Loss: 0.01518
Epoch: 11104, Training Loss: 0.01884
Epoch: 11105, Training Loss: 0.01429
Epoch: 11105, Training Loss: 0.01516
Epoch: 11105, Training Loss: 0.01518
Epoch: 11105, Training Loss: 0.01884
Epoch: 11106, Training Loss: 0.01429
Epoch: 11106, Training Loss: 0.01516
Epoch: 11106, Training Loss: 0.01518
Epoch: 11106, Training Loss: 0.01884
Epoch: 11107, Training Loss: 0.01429
Epoch: 11107, Training Loss: 0.01515
Epoch: 11107, Training Loss: 0.01518
Epoch: 11107, Training Loss: 0.01883
Epoch: 11108, Training Loss: 0.01429
Epoch: 11108, Training Loss: 0.01515
Epoch: 11108, Training Loss: 0.01518
Epoch: 11108, Training Loss: 0.01883
Epoch: 11109, Training Loss: 0.01428
Epoch: 11109, Training Loss: 0.01515
Epoch: 11109, Training Loss: 0.01518
Epoch: 11109, Training Loss: 0.01883
Epoch: 11110, Training Loss: 0.01428
Epoch: 11110, Training Loss: 0.01515
Epoch: 11110, Training Loss: 0.01517
Epoch: 11110, Training Loss: 0.01883
Epoch: 11111, Training Loss: 0.01428
Epoch: 11111, Training Loss: 0.01515
Epoch: 11111, Training Loss: 0.01517
Epoch: 11111, Training Loss: 0.01883
Epoch: 11112, Training Loss: 0.01428
Epoch: 11112, Training Loss: 0.01515
Epoch: 11112, Training Loss: 0.01517
Epoch: 11112, Training Loss: 0.01883
Epoch: 11113, Training Loss: 0.01428
Epoch: 11113, Training Loss: 0.01515
Epoch: 11113, Training Loss: 0.01517
Epoch: 11113, Training Loss: 0.01883
Epoch: 11114, Training Loss: 0.01428
Epoch: 11114, Training Loss: 0.01515
Epoch: 11114, Training Loss: 0.01517
Epoch: 11114, Training Loss: 0.01883
Epoch: 11115, Training Loss: 0.01428
Epoch: 11115, Training Loss: 0.01515
Epoch: 11115, Training Loss: 0.01517
Epoch: 11115, Training Loss: 0.01883
Epoch: 11116, Training Loss: 0.01428
Epoch: 11116, Training Loss: 0.01515
Epoch: 11116, Training Loss: 0.01517
Epoch: 11116, Training Loss: 0.01882
Epoch: 11117, Training Loss: 0.01428
Epoch: 11117, Training Loss: 0.01515
Epoch: 11117, Training Loss: 0.01517
Epoch: 11117, Training Loss: 0.01882
Epoch: 11118, Training Loss: 0.01428
Epoch: 11118, Training Loss: 0.01514
Epoch: 11118, Training Loss: 0.01517
Epoch: 11118, Training Loss: 0.01882
Epoch: 11119, Training Loss: 0.01428
Epoch: 11119, Training Loss: 0.01514
Epoch: 11119, Training Loss: 0.01517
Epoch: 11119, Training Loss: 0.01882
Epoch: 11120, Training Loss: 0.01428
Epoch: 11120, Training Loss: 0.01514
Epoch: 11120, Training Loss: 0.01517
Epoch: 11120, Training Loss: 0.01882
Epoch: 11121, Training Loss: 0.01427
Epoch: 11121, Training Loss: 0.01514
Epoch: 11121, Training Loss: 0.01516
Epoch: 11121, Training Loss: 0.01882
Epoch: 11122, Training Loss: 0.01427
Epoch: 11122, Training Loss: 0.01514
Epoch: 11122, Training Loss: 0.01516
Epoch: 11122, Training Loss: 0.01882
Epoch: 11123, Training Loss: 0.01427
Epoch: 11123, Training Loss: 0.01514
Epoch: 11123, Training Loss: 0.01516
Epoch: 11123, Training Loss: 0.01882
Epoch: 11124, Training Loss: 0.01427
Epoch: 11124, Training Loss: 0.01514
Epoch: 11124, Training Loss: 0.01516
Epoch: 11124, Training Loss: 0.01881
Epoch: 11125, Training Loss: 0.01427
Epoch: 11125, Training Loss: 0.01514
Epoch: 11125, Training Loss: 0.01516
Epoch: 11125, Training Loss: 0.01881
Epoch: 11126, Training Loss: 0.01427
Epoch: 11126, Training Loss: 0.01514
Epoch: 11126, Training Loss: 0.01516
Epoch: 11126, Training Loss: 0.01881
Epoch: 11127, Training Loss: 0.01427
Epoch: 11127, Training Loss: 0.01514
Epoch: 11127, Training Loss: 0.01516
Epoch: 11127, Training Loss: 0.01881
Epoch: 11128, Training Loss: 0.01427
Epoch: 11128, Training Loss: 0.01514
Epoch: 11128, Training Loss: 0.01516
Epoch: 11128, Training Loss: 0.01881
Epoch: 11129, Training Loss: 0.01427
Epoch: 11129, Training Loss: 0.01513
Epoch: 11129, Training Loss: 0.01516
Epoch: 11129, Training Loss: 0.01881
Epoch: 11130, Training Loss: 0.01427
Epoch: 11130, Training Loss: 0.01513
Epoch: 11130, Training Loss: 0.01516
Epoch: 11130, Training Loss: 0.01881
Epoch: 11131, Training Loss: 0.01427
Epoch: 11131, Training Loss: 0.01513
Epoch: 11131, Training Loss: 0.01516
Epoch: 11131, Training Loss: 0.01881
Epoch: 11132, Training Loss: 0.01427
Epoch: 11132, Training Loss: 0.01513
Epoch: 11132, Training Loss: 0.01515
Epoch: 11132, Training Loss: 0.01881
Epoch: 11133, Training Loss: 0.01426
Epoch: 11133, Training Loss: 0.01513
Epoch: 11133, Training Loss: 0.01515
Epoch: 11133, Training Loss: 0.01880
Epoch: 11134, Training Loss: 0.01426
Epoch: 11134, Training Loss: 0.01513
Epoch: 11134, Training Loss: 0.01515
Epoch: 11134, Training Loss: 0.01880
Epoch: 11135, Training Loss: 0.01426
Epoch: 11135, Training Loss: 0.01513
Epoch: 11135, Training Loss: 0.01515
Epoch: 11135, Training Loss: 0.01880
Epoch: 11136, Training Loss: 0.01426
Epoch: 11136, Training Loss: 0.01513
Epoch: 11136, Training Loss: 0.01515
Epoch: 11136, Training Loss: 0.01880
Epoch: 11137, Training Loss: 0.01426
Epoch: 11137, Training Loss: 0.01513
Epoch: 11137, Training Loss: 0.01515
Epoch: 11137, Training Loss: 0.01880
Epoch: 11138, Training Loss: 0.01426
Epoch: 11138, Training Loss: 0.01513
Epoch: 11138, Training Loss: 0.01515
Epoch: 11138, Training Loss: 0.01880
Epoch: 11139, Training Loss: 0.01426
Epoch: 11139, Training Loss: 0.01513
Epoch: 11139, Training Loss: 0.01515
Epoch: 11139, Training Loss: 0.01880
Epoch: 11140, Training Loss: 0.01426
Epoch: 11140, Training Loss: 0.01512
Epoch: 11140, Training Loss: 0.01515
Epoch: 11140, Training Loss: 0.01880
Epoch: 11141, Training Loss: 0.01426
Epoch: 11141, Training Loss: 0.01512
Epoch: 11141, Training Loss: 0.01515
Epoch: 11141, Training Loss: 0.01880
Epoch: 11142, Training Loss: 0.01426
Epoch: 11142, Training Loss: 0.01512
Epoch: 11142, Training Loss: 0.01515
Epoch: 11142, Training Loss: 0.01879
Epoch: 11143, Training Loss: 0.01426
Epoch: 11143, Training Loss: 0.01512
Epoch: 11143, Training Loss: 0.01514
Epoch: 11143, Training Loss: 0.01879
Epoch: 11144, Training Loss: 0.01426
Epoch: 11144, Training Loss: 0.01512
Epoch: 11144, Training Loss: 0.01514
Epoch: 11144, Training Loss: 0.01879
Epoch: 11145, Training Loss: 0.01425
Epoch: 11145, Training Loss: 0.01512
Epoch: 11145, Training Loss: 0.01514
Epoch: 11145, Training Loss: 0.01879
Epoch: 11146, Training Loss: 0.01425
Epoch: 11146, Training Loss: 0.01512
Epoch: 11146, Training Loss: 0.01514
Epoch: 11146, Training Loss: 0.01879
Epoch: 11147, Training Loss: 0.01425
Epoch: 11147, Training Loss: 0.01512
Epoch: 11147, Training Loss: 0.01514
Epoch: 11147, Training Loss: 0.01879
Epoch: 11148, Training Loss: 0.01425
Epoch: 11148, Training Loss: 0.01512
Epoch: 11148, Training Loss: 0.01514
Epoch: 11148, Training Loss: 0.01879
Epoch: 11149, Training Loss: 0.01425
Epoch: 11149, Training Loss: 0.01512
Epoch: 11149, Training Loss: 0.01514
Epoch: 11149, Training Loss: 0.01879
Epoch: 11150, Training Loss: 0.01425
Epoch: 11150, Training Loss: 0.01512
Epoch: 11150, Training Loss: 0.01514
Epoch: 11150, Training Loss: 0.01878
Epoch: 11151, Training Loss: 0.01425
Epoch: 11151, Training Loss: 0.01511
Epoch: 11151, Training Loss: 0.01514
Epoch: 11151, Training Loss: 0.01878
Epoch: 11152, Training Loss: 0.01425
Epoch: 11152, Training Loss: 0.01511
Epoch: 11152, Training Loss: 0.01514
Epoch: 11152, Training Loss: 0.01878
Epoch: 11153, Training Loss: 0.01425
Epoch: 11153, Training Loss: 0.01511
Epoch: 11153, Training Loss: 0.01514
Epoch: 11153, Training Loss: 0.01878
Epoch: 11154, Training Loss: 0.01425
Epoch: 11154, Training Loss: 0.01511
Epoch: 11154, Training Loss: 0.01513
Epoch: 11154, Training Loss: 0.01878
Epoch: 11155, Training Loss: 0.01425
Epoch: 11155, Training Loss: 0.01511
Epoch: 11155, Training Loss: 0.01513
Epoch: 11155, Training Loss: 0.01878
Epoch: 11156, Training Loss: 0.01425
Epoch: 11156, Training Loss: 0.01511
Epoch: 11156, Training Loss: 0.01513
Epoch: 11156, Training Loss: 0.01878
Epoch: 11157, Training Loss: 0.01425
Epoch: 11157, Training Loss: 0.01511
Epoch: 11157, Training Loss: 0.01513
Epoch: 11157, Training Loss: 0.01878
Epoch: 11158, Training Loss: 0.01424
Epoch: 11158, Training Loss: 0.01511
Epoch: 11158, Training Loss: 0.01513
Epoch: 11158, Training Loss: 0.01878
Epoch: 11159, Training Loss: 0.01424
Epoch: 11159, Training Loss: 0.01511
Epoch: 11159, Training Loss: 0.01513
Epoch: 11159, Training Loss: 0.01877
Epoch: 11160, Training Loss: 0.01424
Epoch: 11160, Training Loss: 0.01511
Epoch: 11160, Training Loss: 0.01513
Epoch: 11160, Training Loss: 0.01877
Epoch: 11161, Training Loss: 0.01424
Epoch: 11161, Training Loss: 0.01511
Epoch: 11161, Training Loss: 0.01513
Epoch: 11161, Training Loss: 0.01877
Epoch: 11162, Training Loss: 0.01424
Epoch: 11162, Training Loss: 0.01510
Epoch: 11162, Training Loss: 0.01513
Epoch: 11162, Training Loss: 0.01877
Epoch: 11163, Training Loss: 0.01424
Epoch: 11163, Training Loss: 0.01510
Epoch: 11163, Training Loss: 0.01513
Epoch: 11163, Training Loss: 0.01877
Epoch: 11164, Training Loss: 0.01424
Epoch: 11164, Training Loss: 0.01510
Epoch: 11164, Training Loss: 0.01513
Epoch: 11164, Training Loss: 0.01877
Epoch: 11165, Training Loss: 0.01424
Epoch: 11165, Training Loss: 0.01510
Epoch: 11165, Training Loss: 0.01512
Epoch: 11165, Training Loss: 0.01877
Epoch: 11166, Training Loss: 0.01424
Epoch: 11166, Training Loss: 0.01510
Epoch: 11166, Training Loss: 0.01512
Epoch: 11166, Training Loss: 0.01877
Epoch: 11167, Training Loss: 0.01424
Epoch: 11167, Training Loss: 0.01510
Epoch: 11167, Training Loss: 0.01512
Epoch: 11167, Training Loss: 0.01877
Epoch: 11168, Training Loss: 0.01424
Epoch: 11168, Training Loss: 0.01510
Epoch: 11168, Training Loss: 0.01512
Epoch: 11168, Training Loss: 0.01876
Epoch: 11169, Training Loss: 0.01424
Epoch: 11169, Training Loss: 0.01510
Epoch: 11169, Training Loss: 0.01512
Epoch: 11169, Training Loss: 0.01876
Epoch: 11170, Training Loss: 0.01423
Epoch: 11170, Training Loss: 0.01510
Epoch: 11170, Training Loss: 0.01512
Epoch: 11170, Training Loss: 0.01876
Epoch: 11171, Training Loss: 0.01423
Epoch: 11171, Training Loss: 0.01510
Epoch: 11171, Training Loss: 0.01512
Epoch: 11171, Training Loss: 0.01876
Epoch: 11172, Training Loss: 0.01423
Epoch: 11172, Training Loss: 0.01510
Epoch: 11172, Training Loss: 0.01512
Epoch: 11172, Training Loss: 0.01876
Epoch: 11173, Training Loss: 0.01423
Epoch: 11173, Training Loss: 0.01509
Epoch: 11173, Training Loss: 0.01512
Epoch: 11173, Training Loss: 0.01876
Epoch: 11174, Training Loss: 0.01423
Epoch: 11174, Training Loss: 0.01509
Epoch: 11174, Training Loss: 0.01512
Epoch: 11174, Training Loss: 0.01876
Epoch: 11175, Training Loss: 0.01423
Epoch: 11175, Training Loss: 0.01509
Epoch: 11175, Training Loss: 0.01512
Epoch: 11175, Training Loss: 0.01876
Epoch: 11176, Training Loss: 0.01423
Epoch: 11176, Training Loss: 0.01509
Epoch: 11176, Training Loss: 0.01511
Epoch: 11176, Training Loss: 0.01876
Epoch: 11177, Training Loss: 0.01423
Epoch: 11177, Training Loss: 0.01509
Epoch: 11177, Training Loss: 0.01511
Epoch: 11177, Training Loss: 0.01875
Epoch: 11178, Training Loss: 0.01423
Epoch: 11178, Training Loss: 0.01509
Epoch: 11178, Training Loss: 0.01511
Epoch: 11178, Training Loss: 0.01875
Epoch: 11179, Training Loss: 0.01423
Epoch: 11179, Training Loss: 0.01509
Epoch: 11179, Training Loss: 0.01511
Epoch: 11179, Training Loss: 0.01875
Epoch: 11180, Training Loss: 0.01423
Epoch: 11180, Training Loss: 0.01509
Epoch: 11180, Training Loss: 0.01511
Epoch: 11180, Training Loss: 0.01875
Epoch: 11181, Training Loss: 0.01423
Epoch: 11181, Training Loss: 0.01509
Epoch: 11181, Training Loss: 0.01511
Epoch: 11181, Training Loss: 0.01875
Epoch: 11182, Training Loss: 0.01422
Epoch: 11182, Training Loss: 0.01509
Epoch: 11182, Training Loss: 0.01511
Epoch: 11182, Training Loss: 0.01875
Epoch: 11183, Training Loss: 0.01422
Epoch: 11183, Training Loss: 0.01509
Epoch: 11183, Training Loss: 0.01511
Epoch: 11183, Training Loss: 0.01875
Epoch: 11184, Training Loss: 0.01422
Epoch: 11184, Training Loss: 0.01508
Epoch: 11184, Training Loss: 0.01511
Epoch: 11184, Training Loss: 0.01875
Epoch: 11185, Training Loss: 0.01422
Epoch: 11185, Training Loss: 0.01508
Epoch: 11185, Training Loss: 0.01511
Epoch: 11185, Training Loss: 0.01874
Epoch: 11186, Training Loss: 0.01422
Epoch: 11186, Training Loss: 0.01508
Epoch: 11186, Training Loss: 0.01511
Epoch: 11186, Training Loss: 0.01874
Epoch: 11187, Training Loss: 0.01422
Epoch: 11187, Training Loss: 0.01508
Epoch: 11187, Training Loss: 0.01510
Epoch: 11187, Training Loss: 0.01874
Epoch: 11188, Training Loss: 0.01422
Epoch: 11188, Training Loss: 0.01508
Epoch: 11188, Training Loss: 0.01510
Epoch: 11188, Training Loss: 0.01874
Epoch: 11189, Training Loss: 0.01422
Epoch: 11189, Training Loss: 0.01508
Epoch: 11189, Training Loss: 0.01510
Epoch: 11189, Training Loss: 0.01874
Epoch: 11190, Training Loss: 0.01422
Epoch: 11190, Training Loss: 0.01508
Epoch: 11190, Training Loss: 0.01510
Epoch: 11190, Training Loss: 0.01874
Epoch: 11191, Training Loss: 0.01422
Epoch: 11191, Training Loss: 0.01508
Epoch: 11191, Training Loss: 0.01510
Epoch: 11191, Training Loss: 0.01874
Epoch: 11192, Training Loss: 0.01422
Epoch: 11192, Training Loss: 0.01508
Epoch: 11192, Training Loss: 0.01510
Epoch: 11192, Training Loss: 0.01874
Epoch: 11193, Training Loss: 0.01422
Epoch: 11193, Training Loss: 0.01508
Epoch: 11193, Training Loss: 0.01510
Epoch: 11193, Training Loss: 0.01874
Epoch: 11194, Training Loss: 0.01421
Epoch: 11194, Training Loss: 0.01508
Epoch: 11194, Training Loss: 0.01510
Epoch: 11194, Training Loss: 0.01873
Epoch: 11195, Training Loss: 0.01421
Epoch: 11195, Training Loss: 0.01507
Epoch: 11195, Training Loss: 0.01510
Epoch: 11195, Training Loss: 0.01873
Epoch: 11196, Training Loss: 0.01421
Epoch: 11196, Training Loss: 0.01507
Epoch: 11196, Training Loss: 0.01510
Epoch: 11196, Training Loss: 0.01873
Epoch: 11197, Training Loss: 0.01421
Epoch: 11197, Training Loss: 0.01507
Epoch: 11197, Training Loss: 0.01510
Epoch: 11197, Training Loss: 0.01873
Epoch: 11198, Training Loss: 0.01421
Epoch: 11198, Training Loss: 0.01507
Epoch: 11198, Training Loss: 0.01509
Epoch: 11198, Training Loss: 0.01873
Epoch: 11199, Training Loss: 0.01421
Epoch: 11199, Training Loss: 0.01507
Epoch: 11199, Training Loss: 0.01509
Epoch: 11199, Training Loss: 0.01873
Epoch: 11200, Training Loss: 0.01421
Epoch: 11200, Training Loss: 0.01507
Epoch: 11200, Training Loss: 0.01509
Epoch: 11200, Training Loss: 0.01873
Epoch: 11201, Training Loss: 0.01421
Epoch: 11201, Training Loss: 0.01507
Epoch: 11201, Training Loss: 0.01509
Epoch: 11201, Training Loss: 0.01873
Epoch: 11202, Training Loss: 0.01421
Epoch: 11202, Training Loss: 0.01507
Epoch: 11202, Training Loss: 0.01509
Epoch: 11202, Training Loss: 0.01873
Epoch: 11203, Training Loss: 0.01421
Epoch: 11203, Training Loss: 0.01507
Epoch: 11203, Training Loss: 0.01509
Epoch: 11203, Training Loss: 0.01872
Epoch: 11204, Training Loss: 0.01421
Epoch: 11204, Training Loss: 0.01507
Epoch: 11204, Training Loss: 0.01509
Epoch: 11204, Training Loss: 0.01872
Epoch: 11205, Training Loss: 0.01421
Epoch: 11205, Training Loss: 0.01507
Epoch: 11205, Training Loss: 0.01509
Epoch: 11205, Training Loss: 0.01872
Epoch: 11206, Training Loss: 0.01421
Epoch: 11206, Training Loss: 0.01506
Epoch: 11206, Training Loss: 0.01509
Epoch: 11206, Training Loss: 0.01872
Epoch: 11207, Training Loss: 0.01420
Epoch: 11207, Training Loss: 0.01506
Epoch: 11207, Training Loss: 0.01509
Epoch: 11207, Training Loss: 0.01872
Epoch: 11208, Training Loss: 0.01420
Epoch: 11208, Training Loss: 0.01506
Epoch: 11208, Training Loss: 0.01509
Epoch: 11208, Training Loss: 0.01872
Epoch: 11209, Training Loss: 0.01420
Epoch: 11209, Training Loss: 0.01506
Epoch: 11209, Training Loss: 0.01508
Epoch: 11209, Training Loss: 0.01872
Epoch: 11210, Training Loss: 0.01420
Epoch: 11210, Training Loss: 0.01506
Epoch: 11210, Training Loss: 0.01508
Epoch: 11210, Training Loss: 0.01872
Epoch: 11211, Training Loss: 0.01420
Epoch: 11211, Training Loss: 0.01506
Epoch: 11211, Training Loss: 0.01508
Epoch: 11211, Training Loss: 0.01872
Epoch: 11212, Training Loss: 0.01420
Epoch: 11212, Training Loss: 0.01506
Epoch: 11212, Training Loss: 0.01508
Epoch: 11212, Training Loss: 0.01871
Epoch: 11213, Training Loss: 0.01420
Epoch: 11213, Training Loss: 0.01506
Epoch: 11213, Training Loss: 0.01508
Epoch: 11213, Training Loss: 0.01871
Epoch: 11214, Training Loss: 0.01420
Epoch: 11214, Training Loss: 0.01506
Epoch: 11214, Training Loss: 0.01508
Epoch: 11214, Training Loss: 0.01871
Epoch: 11215, Training Loss: 0.01420
Epoch: 11215, Training Loss: 0.01506
Epoch: 11215, Training Loss: 0.01508
Epoch: 11215, Training Loss: 0.01871
Epoch: 11216, Training Loss: 0.01420
Epoch: 11216, Training Loss: 0.01506
Epoch: 11216, Training Loss: 0.01508
Epoch: 11216, Training Loss: 0.01871
Epoch: 11217, Training Loss: 0.01420
Epoch: 11217, Training Loss: 0.01505
Epoch: 11217, Training Loss: 0.01508
Epoch: 11217, Training Loss: 0.01871
Epoch: 11218, Training Loss: 0.01420
Epoch: 11218, Training Loss: 0.01505
Epoch: 11218, Training Loss: 0.01508
Epoch: 11218, Training Loss: 0.01871
Epoch: 11219, Training Loss: 0.01419
Epoch: 11219, Training Loss: 0.01505
Epoch: 11219, Training Loss: 0.01508
Epoch: 11219, Training Loss: 0.01871
Epoch: 11220, Training Loss: 0.01419
Epoch: 11220, Training Loss: 0.01505
Epoch: 11220, Training Loss: 0.01507
Epoch: 11220, Training Loss: 0.01870
Epoch: 11221, Training Loss: 0.01419
Epoch: 11221, Training Loss: 0.01505
Epoch: 11221, Training Loss: 0.01507
Epoch: 11221, Training Loss: 0.01870
Epoch: 11222, Training Loss: 0.01419
Epoch: 11222, Training Loss: 0.01505
Epoch: 11222, Training Loss: 0.01507
Epoch: 11222, Training Loss: 0.01870
Epoch: 11223, Training Loss: 0.01419
Epoch: 11223, Training Loss: 0.01505
Epoch: 11223, Training Loss: 0.01507
Epoch: 11223, Training Loss: 0.01870
Epoch: 11224, Training Loss: 0.01419
Epoch: 11224, Training Loss: 0.01505
Epoch: 11224, Training Loss: 0.01507
Epoch: 11224, Training Loss: 0.01870
Epoch: 11225, Training Loss: 0.01419
Epoch: 11225, Training Loss: 0.01505
Epoch: 11225, Training Loss: 0.01507
Epoch: 11225, Training Loss: 0.01870
Epoch: 11226, Training Loss: 0.01419
Epoch: 11226, Training Loss: 0.01505
Epoch: 11226, Training Loss: 0.01507
Epoch: 11226, Training Loss: 0.01870
Epoch: 11227, Training Loss: 0.01419
Epoch: 11227, Training Loss: 0.01505
Epoch: 11227, Training Loss: 0.01507
Epoch: 11227, Training Loss: 0.01870
Epoch: 11228, Training Loss: 0.01419
Epoch: 11228, Training Loss: 0.01505
Epoch: 11228, Training Loss: 0.01507
Epoch: 11228, Training Loss: 0.01870
Epoch: 11229, Training Loss: 0.01419
Epoch: 11229, Training Loss: 0.01504
Epoch: 11229, Training Loss: 0.01507
Epoch: 11229, Training Loss: 0.01869
Epoch: 11230, Training Loss: 0.01419
Epoch: 11230, Training Loss: 0.01504
Epoch: 11230, Training Loss: 0.01507
Epoch: 11230, Training Loss: 0.01869
Epoch: 11231, Training Loss: 0.01418
Epoch: 11231, Training Loss: 0.01504
Epoch: 11231, Training Loss: 0.01507
Epoch: 11231, Training Loss: 0.01869
Epoch: 11232, Training Loss: 0.01418
Epoch: 11232, Training Loss: 0.01504
Epoch: 11232, Training Loss: 0.01506
Epoch: 11232, Training Loss: 0.01869
Epoch: 11233, Training Loss: 0.01418
Epoch: 11233, Training Loss: 0.01504
Epoch: 11233, Training Loss: 0.01506
Epoch: 11233, Training Loss: 0.01869
Epoch: 11234, Training Loss: 0.01418
Epoch: 11234, Training Loss: 0.01504
Epoch: 11234, Training Loss: 0.01506
Epoch: 11234, Training Loss: 0.01869
Epoch: 11235, Training Loss: 0.01418
Epoch: 11235, Training Loss: 0.01504
Epoch: 11235, Training Loss: 0.01506
Epoch: 11235, Training Loss: 0.01869
Epoch: 11236, Training Loss: 0.01418
Epoch: 11236, Training Loss: 0.01504
Epoch: 11236, Training Loss: 0.01506
Epoch: 11236, Training Loss: 0.01869
Epoch: 11237, Training Loss: 0.01418
Epoch: 11237, Training Loss: 0.01504
Epoch: 11237, Training Loss: 0.01506
Epoch: 11237, Training Loss: 0.01869
Epoch: 11238, Training Loss: 0.01418
Epoch: 11238, Training Loss: 0.01504
Epoch: 11238, Training Loss: 0.01506
Epoch: 11238, Training Loss: 0.01868
Epoch: 11239, Training Loss: 0.01418
Epoch: 11239, Training Loss: 0.01504
Epoch: 11239, Training Loss: 0.01506
Epoch: 11239, Training Loss: 0.01868
Epoch: 11240, Training Loss: 0.01418
Epoch: 11240, Training Loss: 0.01503
Epoch: 11240, Training Loss: 0.01506
Epoch: 11240, Training Loss: 0.01868
Epoch: 11241, Training Loss: 0.01418
Epoch: 11241, Training Loss: 0.01503
Epoch: 11241, Training Loss: 0.01506
Epoch: 11241, Training Loss: 0.01868
Epoch: 11242, Training Loss: 0.01418
Epoch: 11242, Training Loss: 0.01503
Epoch: 11242, Training Loss: 0.01506
Epoch: 11242, Training Loss: 0.01868
Epoch: 11243, Training Loss: 0.01418
Epoch: 11243, Training Loss: 0.01503
Epoch: 11243, Training Loss: 0.01505
Epoch: 11243, Training Loss: 0.01868
Epoch: 11244, Training Loss: 0.01417
Epoch: 11244, Training Loss: 0.01503
Epoch: 11244, Training Loss: 0.01505
Epoch: 11244, Training Loss: 0.01868
Epoch: 11245, Training Loss: 0.01417
Epoch: 11245, Training Loss: 0.01503
Epoch: 11245, Training Loss: 0.01505
Epoch: 11245, Training Loss: 0.01868
Epoch: 11246, Training Loss: 0.01417
Epoch: 11246, Training Loss: 0.01503
Epoch: 11246, Training Loss: 0.01505
Epoch: 11246, Training Loss: 0.01868
Epoch: 11247, Training Loss: 0.01417
Epoch: 11247, Training Loss: 0.01503
Epoch: 11247, Training Loss: 0.01505
Epoch: 11247, Training Loss: 0.01867
Epoch: 11248, Training Loss: 0.01417
Epoch: 11248, Training Loss: 0.01503
Epoch: 11248, Training Loss: 0.01505
Epoch: 11248, Training Loss: 0.01867
Epoch: 11249, Training Loss: 0.01417
Epoch: 11249, Training Loss: 0.01503
Epoch: 11249, Training Loss: 0.01505
Epoch: 11249, Training Loss: 0.01867
Epoch: 11250, Training Loss: 0.01417
Epoch: 11250, Training Loss: 0.01503
Epoch: 11250, Training Loss: 0.01505
Epoch: 11250, Training Loss: 0.01867
Epoch: 11251, Training Loss: 0.01417
Epoch: 11251, Training Loss: 0.01502
Epoch: 11251, Training Loss: 0.01505
Epoch: 11251, Training Loss: 0.01867
Epoch: 11252, Training Loss: 0.01417
Epoch: 11252, Training Loss: 0.01502
Epoch: 11252, Training Loss: 0.01505
Epoch: 11252, Training Loss: 0.01867
Epoch: 11253, Training Loss: 0.01417
Epoch: 11253, Training Loss: 0.01502
Epoch: 11253, Training Loss: 0.01505
Epoch: 11253, Training Loss: 0.01867
Epoch: 11254, Training Loss: 0.01417
Epoch: 11254, Training Loss: 0.01502
Epoch: 11254, Training Loss: 0.01504
Epoch: 11254, Training Loss: 0.01867
Epoch: 11255, Training Loss: 0.01417
Epoch: 11255, Training Loss: 0.01502
Epoch: 11255, Training Loss: 0.01504
Epoch: 11255, Training Loss: 0.01867
Epoch: 11256, Training Loss: 0.01416
Epoch: 11256, Training Loss: 0.01502
Epoch: 11256, Training Loss: 0.01504
Epoch: 11256, Training Loss: 0.01866
Epoch: 11257, Training Loss: 0.01416
Epoch: 11257, Training Loss: 0.01502
Epoch: 11257, Training Loss: 0.01504
Epoch: 11257, Training Loss: 0.01866
Epoch: 11258, Training Loss: 0.01416
Epoch: 11258, Training Loss: 0.01502
Epoch: 11258, Training Loss: 0.01504
Epoch: 11258, Training Loss: 0.01866
Epoch: 11259, Training Loss: 0.01416
Epoch: 11259, Training Loss: 0.01502
Epoch: 11259, Training Loss: 0.01504
Epoch: 11259, Training Loss: 0.01866
Epoch: 11260, Training Loss: 0.01416
Epoch: 11260, Training Loss: 0.01502
Epoch: 11260, Training Loss: 0.01504
Epoch: 11260, Training Loss: 0.01866
Epoch: 11261, Training Loss: 0.01416
Epoch: 11261, Training Loss: 0.01502
Epoch: 11261, Training Loss: 0.01504
Epoch: 11261, Training Loss: 0.01866
Epoch: 11262, Training Loss: 0.01416
Epoch: 11262, Training Loss: 0.01501
Epoch: 11262, Training Loss: 0.01504
Epoch: 11262, Training Loss: 0.01866
Epoch: 11263, Training Loss: 0.01416
Epoch: 11263, Training Loss: 0.01501
Epoch: 11263, Training Loss: 0.01504
Epoch: 11263, Training Loss: 0.01866
Epoch: 11264, Training Loss: 0.01416
Epoch: 11264, Training Loss: 0.01501
Epoch: 11264, Training Loss: 0.01504
Epoch: 11264, Training Loss: 0.01866
Epoch: 11265, Training Loss: 0.01416
Epoch: 11265, Training Loss: 0.01501
Epoch: 11265, Training Loss: 0.01503
Epoch: 11265, Training Loss: 0.01865
Epoch: 11266, Training Loss: 0.01416
Epoch: 11266, Training Loss: 0.01501
Epoch: 11266, Training Loss: 0.01503
Epoch: 11266, Training Loss: 0.01865
Epoch: 11267, Training Loss: 0.01416
Epoch: 11267, Training Loss: 0.01501
Epoch: 11267, Training Loss: 0.01503
Epoch: 11267, Training Loss: 0.01865
Epoch: 11268, Training Loss: 0.01415
Epoch: 11268, Training Loss: 0.01501
Epoch: 11268, Training Loss: 0.01503
Epoch: 11268, Training Loss: 0.01865
Epoch: 11269, Training Loss: 0.01415
Epoch: 11269, Training Loss: 0.01501
Epoch: 11269, Training Loss: 0.01503
Epoch: 11269, Training Loss: 0.01865
Epoch: 11270, Training Loss: 0.01415
Epoch: 11270, Training Loss: 0.01501
Epoch: 11270, Training Loss: 0.01503
Epoch: 11270, Training Loss: 0.01865
Epoch: 11271, Training Loss: 0.01415
Epoch: 11271, Training Loss: 0.01501
Epoch: 11271, Training Loss: 0.01503
Epoch: 11271, Training Loss: 0.01865
Epoch: 11272, Training Loss: 0.01415
Epoch: 11272, Training Loss: 0.01501
Epoch: 11272, Training Loss: 0.01503
Epoch: 11272, Training Loss: 0.01865
Epoch: 11273, Training Loss: 0.01415
Epoch: 11273, Training Loss: 0.01501
Epoch: 11273, Training Loss: 0.01503
Epoch: 11273, Training Loss: 0.01864
Epoch: 11274, Training Loss: 0.01415
Epoch: 11274, Training Loss: 0.01500
Epoch: 11274, Training Loss: 0.01503
Epoch: 11274, Training Loss: 0.01864
Epoch: 11275, Training Loss: 0.01415
Epoch: 11275, Training Loss: 0.01500
Epoch: 11275, Training Loss: 0.01503
Epoch: 11275, Training Loss: 0.01864
Epoch: 11276, Training Loss: 0.01415
Epoch: 11276, Training Loss: 0.01500
Epoch: 11276, Training Loss: 0.01503
Epoch: 11276, Training Loss: 0.01864
Epoch: 11277, Training Loss: 0.01415
Epoch: 11277, Training Loss: 0.01500
Epoch: 11277, Training Loss: 0.01502
Epoch: 11277, Training Loss: 0.01864
Epoch: 11278, Training Loss: 0.01415
Epoch: 11278, Training Loss: 0.01500
Epoch: 11278, Training Loss: 0.01502
Epoch: 11278, Training Loss: 0.01864
Epoch: 11279, Training Loss: 0.01415
Epoch: 11279, Training Loss: 0.01500
Epoch: 11279, Training Loss: 0.01502
Epoch: 11279, Training Loss: 0.01864
Epoch: 11280, Training Loss: 0.01415
Epoch: 11280, Training Loss: 0.01500
Epoch: 11280, Training Loss: 0.01502
Epoch: 11280, Training Loss: 0.01864
Epoch: 11281, Training Loss: 0.01414
Epoch: 11281, Training Loss: 0.01500
Epoch: 11281, Training Loss: 0.01502
Epoch: 11281, Training Loss: 0.01864
Epoch: 11282, Training Loss: 0.01414
Epoch: 11282, Training Loss: 0.01500
Epoch: 11282, Training Loss: 0.01502
Epoch: 11282, Training Loss: 0.01863
Epoch: 11283, Training Loss: 0.01414
Epoch: 11283, Training Loss: 0.01500
Epoch: 11283, Training Loss: 0.01502
Epoch: 11283, Training Loss: 0.01863
Epoch: 11284, Training Loss: 0.01414
Epoch: 11284, Training Loss: 0.01500
Epoch: 11284, Training Loss: 0.01502
Epoch: 11284, Training Loss: 0.01863
Epoch: 11285, Training Loss: 0.01414
Epoch: 11285, Training Loss: 0.01499
Epoch: 11285, Training Loss: 0.01502
Epoch: 11285, Training Loss: 0.01863
Epoch: 11286, Training Loss: 0.01414
Epoch: 11286, Training Loss: 0.01499
Epoch: 11286, Training Loss: 0.01502
Epoch: 11286, Training Loss: 0.01863
Epoch: 11287, Training Loss: 0.01414
Epoch: 11287, Training Loss: 0.01499
Epoch: 11287, Training Loss: 0.01502
Epoch: 11287, Training Loss: 0.01863
Epoch: 11288, Training Loss: 0.01414
Epoch: 11288, Training Loss: 0.01499
Epoch: 11288, Training Loss: 0.01501
Epoch: 11288, Training Loss: 0.01863
Epoch: 11289, Training Loss: 0.01414
Epoch: 11289, Training Loss: 0.01499
Epoch: 11289, Training Loss: 0.01501
Epoch: 11289, Training Loss: 0.01863
Epoch: 11290, Training Loss: 0.01414
Epoch: 11290, Training Loss: 0.01499
Epoch: 11290, Training Loss: 0.01501
Epoch: 11290, Training Loss: 0.01863
Epoch: 11291, Training Loss: 0.01414
Epoch: 11291, Training Loss: 0.01499
Epoch: 11291, Training Loss: 0.01501
Epoch: 11291, Training Loss: 0.01862
Epoch: 11292, Training Loss: 0.01414
Epoch: 11292, Training Loss: 0.01499
Epoch: 11292, Training Loss: 0.01501
Epoch: 11292, Training Loss: 0.01862
[Output truncated: the four per-sample training losses decrease only gradually over epochs 11293–11536, from roughly (0.01413, 0.01499, 0.01501, 0.01862) to roughly (0.01395, 0.01478, 0.01480, 0.01836).]
Epoch: 11536, Training Loss: 0.01836
Epoch: 11537, Training Loss: 0.01394
Epoch: 11537, Training Loss: 0.01478
Epoch: 11537, Training Loss: 0.01480
Epoch: 11537, Training Loss: 0.01835
Epoch: 11538, Training Loss: 0.01394
Epoch: 11538, Training Loss: 0.01478
Epoch: 11538, Training Loss: 0.01480
Epoch: 11538, Training Loss: 0.01835
Epoch: 11539, Training Loss: 0.01394
Epoch: 11539, Training Loss: 0.01477
Epoch: 11539, Training Loss: 0.01480
Epoch: 11539, Training Loss: 0.01835
Epoch: 11540, Training Loss: 0.01394
Epoch: 11540, Training Loss: 0.01477
Epoch: 11540, Training Loss: 0.01480
Epoch: 11540, Training Loss: 0.01835
Epoch: 11541, Training Loss: 0.01394
Epoch: 11541, Training Loss: 0.01477
Epoch: 11541, Training Loss: 0.01480
Epoch: 11541, Training Loss: 0.01835
Epoch: 11542, Training Loss: 0.01394
Epoch: 11542, Training Loss: 0.01477
Epoch: 11542, Training Loss: 0.01479
Epoch: 11542, Training Loss: 0.01835
Epoch: 11543, Training Loss: 0.01394
Epoch: 11543, Training Loss: 0.01477
Epoch: 11543, Training Loss: 0.01479
Epoch: 11543, Training Loss: 0.01835
Epoch: 11544, Training Loss: 0.01394
Epoch: 11544, Training Loss: 0.01477
Epoch: 11544, Training Loss: 0.01479
Epoch: 11544, Training Loss: 0.01835
Epoch: 11545, Training Loss: 0.01394
Epoch: 11545, Training Loss: 0.01477
Epoch: 11545, Training Loss: 0.01479
Epoch: 11545, Training Loss: 0.01835
Epoch: 11546, Training Loss: 0.01394
Epoch: 11546, Training Loss: 0.01477
Epoch: 11546, Training Loss: 0.01479
Epoch: 11546, Training Loss: 0.01834
Epoch: 11547, Training Loss: 0.01394
Epoch: 11547, Training Loss: 0.01477
Epoch: 11547, Training Loss: 0.01479
Epoch: 11547, Training Loss: 0.01834
Epoch: 11548, Training Loss: 0.01394
Epoch: 11548, Training Loss: 0.01477
Epoch: 11548, Training Loss: 0.01479
Epoch: 11548, Training Loss: 0.01834
Epoch: 11549, Training Loss: 0.01393
Epoch: 11549, Training Loss: 0.01477
Epoch: 11549, Training Loss: 0.01479
Epoch: 11549, Training Loss: 0.01834
Epoch: 11550, Training Loss: 0.01393
Epoch: 11550, Training Loss: 0.01477
Epoch: 11550, Training Loss: 0.01479
Epoch: 11550, Training Loss: 0.01834
Epoch: 11551, Training Loss: 0.01393
Epoch: 11551, Training Loss: 0.01476
Epoch: 11551, Training Loss: 0.01479
Epoch: 11551, Training Loss: 0.01834
Epoch: 11552, Training Loss: 0.01393
Epoch: 11552, Training Loss: 0.01476
Epoch: 11552, Training Loss: 0.01479
Epoch: 11552, Training Loss: 0.01834
Epoch: 11553, Training Loss: 0.01393
Epoch: 11553, Training Loss: 0.01476
Epoch: 11553, Training Loss: 0.01479
Epoch: 11553, Training Loss: 0.01834
Epoch: 11554, Training Loss: 0.01393
Epoch: 11554, Training Loss: 0.01476
Epoch: 11554, Training Loss: 0.01478
Epoch: 11554, Training Loss: 0.01834
Epoch: 11555, Training Loss: 0.01393
Epoch: 11555, Training Loss: 0.01476
Epoch: 11555, Training Loss: 0.01478
Epoch: 11555, Training Loss: 0.01833
Epoch: 11556, Training Loss: 0.01393
Epoch: 11556, Training Loss: 0.01476
Epoch: 11556, Training Loss: 0.01478
Epoch: 11556, Training Loss: 0.01833
Epoch: 11557, Training Loss: 0.01393
Epoch: 11557, Training Loss: 0.01476
Epoch: 11557, Training Loss: 0.01478
Epoch: 11557, Training Loss: 0.01833
Epoch: 11558, Training Loss: 0.01393
Epoch: 11558, Training Loss: 0.01476
Epoch: 11558, Training Loss: 0.01478
Epoch: 11558, Training Loss: 0.01833
Epoch: 11559, Training Loss: 0.01393
Epoch: 11559, Training Loss: 0.01476
Epoch: 11559, Training Loss: 0.01478
Epoch: 11559, Training Loss: 0.01833
Epoch: 11560, Training Loss: 0.01393
Epoch: 11560, Training Loss: 0.01476
Epoch: 11560, Training Loss: 0.01478
Epoch: 11560, Training Loss: 0.01833
Epoch: 11561, Training Loss: 0.01393
Epoch: 11561, Training Loss: 0.01476
Epoch: 11561, Training Loss: 0.01478
Epoch: 11561, Training Loss: 0.01833
Epoch: 11562, Training Loss: 0.01392
Epoch: 11562, Training Loss: 0.01476
Epoch: 11562, Training Loss: 0.01478
Epoch: 11562, Training Loss: 0.01833
Epoch: 11563, Training Loss: 0.01392
Epoch: 11563, Training Loss: 0.01475
Epoch: 11563, Training Loss: 0.01478
Epoch: 11563, Training Loss: 0.01833
Epoch: 11564, Training Loss: 0.01392
Epoch: 11564, Training Loss: 0.01475
Epoch: 11564, Training Loss: 0.01478
Epoch: 11564, Training Loss: 0.01833
Epoch: 11565, Training Loss: 0.01392
Epoch: 11565, Training Loss: 0.01475
Epoch: 11565, Training Loss: 0.01477
Epoch: 11565, Training Loss: 0.01832
Epoch: 11566, Training Loss: 0.01392
Epoch: 11566, Training Loss: 0.01475
Epoch: 11566, Training Loss: 0.01477
Epoch: 11566, Training Loss: 0.01832
Epoch: 11567, Training Loss: 0.01392
Epoch: 11567, Training Loss: 0.01475
Epoch: 11567, Training Loss: 0.01477
Epoch: 11567, Training Loss: 0.01832
Epoch: 11568, Training Loss: 0.01392
Epoch: 11568, Training Loss: 0.01475
Epoch: 11568, Training Loss: 0.01477
Epoch: 11568, Training Loss: 0.01832
Epoch: 11569, Training Loss: 0.01392
Epoch: 11569, Training Loss: 0.01475
Epoch: 11569, Training Loss: 0.01477
Epoch: 11569, Training Loss: 0.01832
Epoch: 11570, Training Loss: 0.01392
Epoch: 11570, Training Loss: 0.01475
Epoch: 11570, Training Loss: 0.01477
Epoch: 11570, Training Loss: 0.01832
Epoch: 11571, Training Loss: 0.01392
Epoch: 11571, Training Loss: 0.01475
Epoch: 11571, Training Loss: 0.01477
Epoch: 11571, Training Loss: 0.01832
Epoch: 11572, Training Loss: 0.01392
Epoch: 11572, Training Loss: 0.01475
Epoch: 11572, Training Loss: 0.01477
Epoch: 11572, Training Loss: 0.01832
Epoch: 11573, Training Loss: 0.01392
Epoch: 11573, Training Loss: 0.01475
Epoch: 11573, Training Loss: 0.01477
Epoch: 11573, Training Loss: 0.01832
Epoch: 11574, Training Loss: 0.01392
Epoch: 11574, Training Loss: 0.01475
Epoch: 11574, Training Loss: 0.01477
Epoch: 11574, Training Loss: 0.01831
Epoch: 11575, Training Loss: 0.01391
Epoch: 11575, Training Loss: 0.01474
Epoch: 11575, Training Loss: 0.01477
Epoch: 11575, Training Loss: 0.01831
Epoch: 11576, Training Loss: 0.01391
Epoch: 11576, Training Loss: 0.01474
Epoch: 11576, Training Loss: 0.01477
Epoch: 11576, Training Loss: 0.01831
Epoch: 11577, Training Loss: 0.01391
Epoch: 11577, Training Loss: 0.01474
Epoch: 11577, Training Loss: 0.01476
Epoch: 11577, Training Loss: 0.01831
Epoch: 11578, Training Loss: 0.01391
Epoch: 11578, Training Loss: 0.01474
Epoch: 11578, Training Loss: 0.01476
Epoch: 11578, Training Loss: 0.01831
Epoch: 11579, Training Loss: 0.01391
Epoch: 11579, Training Loss: 0.01474
Epoch: 11579, Training Loss: 0.01476
Epoch: 11579, Training Loss: 0.01831
Epoch: 11580, Training Loss: 0.01391
Epoch: 11580, Training Loss: 0.01474
Epoch: 11580, Training Loss: 0.01476
Epoch: 11580, Training Loss: 0.01831
Epoch: 11581, Training Loss: 0.01391
Epoch: 11581, Training Loss: 0.01474
Epoch: 11581, Training Loss: 0.01476
Epoch: 11581, Training Loss: 0.01831
Epoch: 11582, Training Loss: 0.01391
Epoch: 11582, Training Loss: 0.01474
Epoch: 11582, Training Loss: 0.01476
Epoch: 11582, Training Loss: 0.01831
Epoch: 11583, Training Loss: 0.01391
Epoch: 11583, Training Loss: 0.01474
Epoch: 11583, Training Loss: 0.01476
Epoch: 11583, Training Loss: 0.01831
Epoch: 11584, Training Loss: 0.01391
Epoch: 11584, Training Loss: 0.01474
Epoch: 11584, Training Loss: 0.01476
Epoch: 11584, Training Loss: 0.01830
Epoch: 11585, Training Loss: 0.01391
Epoch: 11585, Training Loss: 0.01474
Epoch: 11585, Training Loss: 0.01476
Epoch: 11585, Training Loss: 0.01830
Epoch: 11586, Training Loss: 0.01391
Epoch: 11586, Training Loss: 0.01474
Epoch: 11586, Training Loss: 0.01476
Epoch: 11586, Training Loss: 0.01830
Epoch: 11587, Training Loss: 0.01391
Epoch: 11587, Training Loss: 0.01473
Epoch: 11587, Training Loss: 0.01476
Epoch: 11587, Training Loss: 0.01830
Epoch: 11588, Training Loss: 0.01390
Epoch: 11588, Training Loss: 0.01473
Epoch: 11588, Training Loss: 0.01476
Epoch: 11588, Training Loss: 0.01830
Epoch: 11589, Training Loss: 0.01390
Epoch: 11589, Training Loss: 0.01473
Epoch: 11589, Training Loss: 0.01475
Epoch: 11589, Training Loss: 0.01830
Epoch: 11590, Training Loss: 0.01390
Epoch: 11590, Training Loss: 0.01473
Epoch: 11590, Training Loss: 0.01475
Epoch: 11590, Training Loss: 0.01830
Epoch: 11591, Training Loss: 0.01390
Epoch: 11591, Training Loss: 0.01473
Epoch: 11591, Training Loss: 0.01475
Epoch: 11591, Training Loss: 0.01830
Epoch: 11592, Training Loss: 0.01390
Epoch: 11592, Training Loss: 0.01473
Epoch: 11592, Training Loss: 0.01475
Epoch: 11592, Training Loss: 0.01830
Epoch: 11593, Training Loss: 0.01390
Epoch: 11593, Training Loss: 0.01473
Epoch: 11593, Training Loss: 0.01475
Epoch: 11593, Training Loss: 0.01829
Epoch: 11594, Training Loss: 0.01390
Epoch: 11594, Training Loss: 0.01473
Epoch: 11594, Training Loss: 0.01475
Epoch: 11594, Training Loss: 0.01829
Epoch: 11595, Training Loss: 0.01390
Epoch: 11595, Training Loss: 0.01473
Epoch: 11595, Training Loss: 0.01475
Epoch: 11595, Training Loss: 0.01829
Epoch: 11596, Training Loss: 0.01390
Epoch: 11596, Training Loss: 0.01473
Epoch: 11596, Training Loss: 0.01475
Epoch: 11596, Training Loss: 0.01829
Epoch: 11597, Training Loss: 0.01390
Epoch: 11597, Training Loss: 0.01473
Epoch: 11597, Training Loss: 0.01475
Epoch: 11597, Training Loss: 0.01829
Epoch: 11598, Training Loss: 0.01390
Epoch: 11598, Training Loss: 0.01473
Epoch: 11598, Training Loss: 0.01475
Epoch: 11598, Training Loss: 0.01829
Epoch: 11599, Training Loss: 0.01390
Epoch: 11599, Training Loss: 0.01472
Epoch: 11599, Training Loss: 0.01475
Epoch: 11599, Training Loss: 0.01829
Epoch: 11600, Training Loss: 0.01390
Epoch: 11600, Training Loss: 0.01472
Epoch: 11600, Training Loss: 0.01475
Epoch: 11600, Training Loss: 0.01829
Epoch: 11601, Training Loss: 0.01389
Epoch: 11601, Training Loss: 0.01472
Epoch: 11601, Training Loss: 0.01474
Epoch: 11601, Training Loss: 0.01829
Epoch: 11602, Training Loss: 0.01389
Epoch: 11602, Training Loss: 0.01472
Epoch: 11602, Training Loss: 0.01474
Epoch: 11602, Training Loss: 0.01828
Epoch: 11603, Training Loss: 0.01389
Epoch: 11603, Training Loss: 0.01472
Epoch: 11603, Training Loss: 0.01474
Epoch: 11603, Training Loss: 0.01828
Epoch: 11604, Training Loss: 0.01389
Epoch: 11604, Training Loss: 0.01472
Epoch: 11604, Training Loss: 0.01474
Epoch: 11604, Training Loss: 0.01828
Epoch: 11605, Training Loss: 0.01389
Epoch: 11605, Training Loss: 0.01472
Epoch: 11605, Training Loss: 0.01474
Epoch: 11605, Training Loss: 0.01828
Epoch: 11606, Training Loss: 0.01389
Epoch: 11606, Training Loss: 0.01472
Epoch: 11606, Training Loss: 0.01474
Epoch: 11606, Training Loss: 0.01828
Epoch: 11607, Training Loss: 0.01389
Epoch: 11607, Training Loss: 0.01472
Epoch: 11607, Training Loss: 0.01474
Epoch: 11607, Training Loss: 0.01828
Epoch: 11608, Training Loss: 0.01389
Epoch: 11608, Training Loss: 0.01472
Epoch: 11608, Training Loss: 0.01474
Epoch: 11608, Training Loss: 0.01828
Epoch: 11609, Training Loss: 0.01389
Epoch: 11609, Training Loss: 0.01472
Epoch: 11609, Training Loss: 0.01474
Epoch: 11609, Training Loss: 0.01828
Epoch: 11610, Training Loss: 0.01389
Epoch: 11610, Training Loss: 0.01472
Epoch: 11610, Training Loss: 0.01474
Epoch: 11610, Training Loss: 0.01828
Epoch: 11611, Training Loss: 0.01389
Epoch: 11611, Training Loss: 0.01471
Epoch: 11611, Training Loss: 0.01474
Epoch: 11611, Training Loss: 0.01828
Epoch: 11612, Training Loss: 0.01389
Epoch: 11612, Training Loss: 0.01471
Epoch: 11612, Training Loss: 0.01474
Epoch: 11612, Training Loss: 0.01827
Epoch: 11613, Training Loss: 0.01389
Epoch: 11613, Training Loss: 0.01471
Epoch: 11613, Training Loss: 0.01473
Epoch: 11613, Training Loss: 0.01827
Epoch: 11614, Training Loss: 0.01388
Epoch: 11614, Training Loss: 0.01471
Epoch: 11614, Training Loss: 0.01473
Epoch: 11614, Training Loss: 0.01827
Epoch: 11615, Training Loss: 0.01388
Epoch: 11615, Training Loss: 0.01471
Epoch: 11615, Training Loss: 0.01473
Epoch: 11615, Training Loss: 0.01827
Epoch: 11616, Training Loss: 0.01388
Epoch: 11616, Training Loss: 0.01471
Epoch: 11616, Training Loss: 0.01473
Epoch: 11616, Training Loss: 0.01827
Epoch: 11617, Training Loss: 0.01388
Epoch: 11617, Training Loss: 0.01471
Epoch: 11617, Training Loss: 0.01473
Epoch: 11617, Training Loss: 0.01827
Epoch: 11618, Training Loss: 0.01388
Epoch: 11618, Training Loss: 0.01471
Epoch: 11618, Training Loss: 0.01473
Epoch: 11618, Training Loss: 0.01827
Epoch: 11619, Training Loss: 0.01388
Epoch: 11619, Training Loss: 0.01471
Epoch: 11619, Training Loss: 0.01473
Epoch: 11619, Training Loss: 0.01827
Epoch: 11620, Training Loss: 0.01388
Epoch: 11620, Training Loss: 0.01471
Epoch: 11620, Training Loss: 0.01473
Epoch: 11620, Training Loss: 0.01827
Epoch: 11621, Training Loss: 0.01388
Epoch: 11621, Training Loss: 0.01471
Epoch: 11621, Training Loss: 0.01473
Epoch: 11621, Training Loss: 0.01826
Epoch: 11622, Training Loss: 0.01388
Epoch: 11622, Training Loss: 0.01471
Epoch: 11622, Training Loss: 0.01473
Epoch: 11622, Training Loss: 0.01826
Epoch: 11623, Training Loss: 0.01388
Epoch: 11623, Training Loss: 0.01470
Epoch: 11623, Training Loss: 0.01473
Epoch: 11623, Training Loss: 0.01826
Epoch: 11624, Training Loss: 0.01388
Epoch: 11624, Training Loss: 0.01470
Epoch: 11624, Training Loss: 0.01473
Epoch: 11624, Training Loss: 0.01826
Epoch: 11625, Training Loss: 0.01388
Epoch: 11625, Training Loss: 0.01470
Epoch: 11625, Training Loss: 0.01472
Epoch: 11625, Training Loss: 0.01826
Epoch: 11626, Training Loss: 0.01388
Epoch: 11626, Training Loss: 0.01470
Epoch: 11626, Training Loss: 0.01472
Epoch: 11626, Training Loss: 0.01826
Epoch: 11627, Training Loss: 0.01387
Epoch: 11627, Training Loss: 0.01470
Epoch: 11627, Training Loss: 0.01472
Epoch: 11627, Training Loss: 0.01826
Epoch: 11628, Training Loss: 0.01387
Epoch: 11628, Training Loss: 0.01470
Epoch: 11628, Training Loss: 0.01472
Epoch: 11628, Training Loss: 0.01826
Epoch: 11629, Training Loss: 0.01387
Epoch: 11629, Training Loss: 0.01470
Epoch: 11629, Training Loss: 0.01472
Epoch: 11629, Training Loss: 0.01826
Epoch: 11630, Training Loss: 0.01387
Epoch: 11630, Training Loss: 0.01470
Epoch: 11630, Training Loss: 0.01472
Epoch: 11630, Training Loss: 0.01826
Epoch: 11631, Training Loss: 0.01387
Epoch: 11631, Training Loss: 0.01470
Epoch: 11631, Training Loss: 0.01472
Epoch: 11631, Training Loss: 0.01825
Epoch: 11632, Training Loss: 0.01387
Epoch: 11632, Training Loss: 0.01470
Epoch: 11632, Training Loss: 0.01472
Epoch: 11632, Training Loss: 0.01825
Epoch: 11633, Training Loss: 0.01387
Epoch: 11633, Training Loss: 0.01470
Epoch: 11633, Training Loss: 0.01472
Epoch: 11633, Training Loss: 0.01825
Epoch: 11634, Training Loss: 0.01387
Epoch: 11634, Training Loss: 0.01470
Epoch: 11634, Training Loss: 0.01472
Epoch: 11634, Training Loss: 0.01825
Epoch: 11635, Training Loss: 0.01387
Epoch: 11635, Training Loss: 0.01469
Epoch: 11635, Training Loss: 0.01472
Epoch: 11635, Training Loss: 0.01825
Epoch: 11636, Training Loss: 0.01387
Epoch: 11636, Training Loss: 0.01469
Epoch: 11636, Training Loss: 0.01472
Epoch: 11636, Training Loss: 0.01825
Epoch: 11637, Training Loss: 0.01387
Epoch: 11637, Training Loss: 0.01469
Epoch: 11637, Training Loss: 0.01471
Epoch: 11637, Training Loss: 0.01825
Epoch: 11638, Training Loss: 0.01387
Epoch: 11638, Training Loss: 0.01469
Epoch: 11638, Training Loss: 0.01471
Epoch: 11638, Training Loss: 0.01825
Epoch: 11639, Training Loss: 0.01387
Epoch: 11639, Training Loss: 0.01469
Epoch: 11639, Training Loss: 0.01471
Epoch: 11639, Training Loss: 0.01825
Epoch: 11640, Training Loss: 0.01386
Epoch: 11640, Training Loss: 0.01469
Epoch: 11640, Training Loss: 0.01471
Epoch: 11640, Training Loss: 0.01824
Epoch: 11641, Training Loss: 0.01386
Epoch: 11641, Training Loss: 0.01469
Epoch: 11641, Training Loss: 0.01471
Epoch: 11641, Training Loss: 0.01824
Epoch: 11642, Training Loss: 0.01386
Epoch: 11642, Training Loss: 0.01469
Epoch: 11642, Training Loss: 0.01471
Epoch: 11642, Training Loss: 0.01824
Epoch: 11643, Training Loss: 0.01386
Epoch: 11643, Training Loss: 0.01469
Epoch: 11643, Training Loss: 0.01471
Epoch: 11643, Training Loss: 0.01824
Epoch: 11644, Training Loss: 0.01386
Epoch: 11644, Training Loss: 0.01469
Epoch: 11644, Training Loss: 0.01471
Epoch: 11644, Training Loss: 0.01824
Epoch: 11645, Training Loss: 0.01386
Epoch: 11645, Training Loss: 0.01469
Epoch: 11645, Training Loss: 0.01471
Epoch: 11645, Training Loss: 0.01824
Epoch: 11646, Training Loss: 0.01386
Epoch: 11646, Training Loss: 0.01469
Epoch: 11646, Training Loss: 0.01471
Epoch: 11646, Training Loss: 0.01824
Epoch: 11647, Training Loss: 0.01386
Epoch: 11647, Training Loss: 0.01468
Epoch: 11647, Training Loss: 0.01471
Epoch: 11647, Training Loss: 0.01824
Epoch: 11648, Training Loss: 0.01386
Epoch: 11648, Training Loss: 0.01468
Epoch: 11648, Training Loss: 0.01471
Epoch: 11648, Training Loss: 0.01824
Epoch: 11649, Training Loss: 0.01386
Epoch: 11649, Training Loss: 0.01468
Epoch: 11649, Training Loss: 0.01470
Epoch: 11649, Training Loss: 0.01823
Epoch: 11650, Training Loss: 0.01386
Epoch: 11650, Training Loss: 0.01468
Epoch: 11650, Training Loss: 0.01470
Epoch: 11650, Training Loss: 0.01823
Epoch: 11651, Training Loss: 0.01386
Epoch: 11651, Training Loss: 0.01468
Epoch: 11651, Training Loss: 0.01470
Epoch: 11651, Training Loss: 0.01823
Epoch: 11652, Training Loss: 0.01386
Epoch: 11652, Training Loss: 0.01468
Epoch: 11652, Training Loss: 0.01470
Epoch: 11652, Training Loss: 0.01823
Epoch: 11653, Training Loss: 0.01386
Epoch: 11653, Training Loss: 0.01468
Epoch: 11653, Training Loss: 0.01470
Epoch: 11653, Training Loss: 0.01823
Epoch: 11654, Training Loss: 0.01385
Epoch: 11654, Training Loss: 0.01468
Epoch: 11654, Training Loss: 0.01470
Epoch: 11654, Training Loss: 0.01823
Epoch: 11655, Training Loss: 0.01385
Epoch: 11655, Training Loss: 0.01468
Epoch: 11655, Training Loss: 0.01470
Epoch: 11655, Training Loss: 0.01823
Epoch: 11656, Training Loss: 0.01385
Epoch: 11656, Training Loss: 0.01468
Epoch: 11656, Training Loss: 0.01470
Epoch: 11656, Training Loss: 0.01823
Epoch: 11657, Training Loss: 0.01385
Epoch: 11657, Training Loss: 0.01468
Epoch: 11657, Training Loss: 0.01470
Epoch: 11657, Training Loss: 0.01823
Epoch: 11658, Training Loss: 0.01385
Epoch: 11658, Training Loss: 0.01468
Epoch: 11658, Training Loss: 0.01470
Epoch: 11658, Training Loss: 0.01823
Epoch: 11659, Training Loss: 0.01385
Epoch: 11659, Training Loss: 0.01467
Epoch: 11659, Training Loss: 0.01470
Epoch: 11659, Training Loss: 0.01822
Epoch: 11660, Training Loss: 0.01385
Epoch: 11660, Training Loss: 0.01467
Epoch: 11660, Training Loss: 0.01470
Epoch: 11660, Training Loss: 0.01822
Epoch: 11661, Training Loss: 0.01385
Epoch: 11661, Training Loss: 0.01467
Epoch: 11661, Training Loss: 0.01469
Epoch: 11661, Training Loss: 0.01822
Epoch: 11662, Training Loss: 0.01385
Epoch: 11662, Training Loss: 0.01467
Epoch: 11662, Training Loss: 0.01469
Epoch: 11662, Training Loss: 0.01822
Epoch: 11663, Training Loss: 0.01385
Epoch: 11663, Training Loss: 0.01467
Epoch: 11663, Training Loss: 0.01469
Epoch: 11663, Training Loss: 0.01822
Epoch: 11664, Training Loss: 0.01385
Epoch: 11664, Training Loss: 0.01467
Epoch: 11664, Training Loss: 0.01469
Epoch: 11664, Training Loss: 0.01822
Epoch: 11665, Training Loss: 0.01385
Epoch: 11665, Training Loss: 0.01467
Epoch: 11665, Training Loss: 0.01469
Epoch: 11665, Training Loss: 0.01822
Epoch: 11666, Training Loss: 0.01385
Epoch: 11666, Training Loss: 0.01467
Epoch: 11666, Training Loss: 0.01469
Epoch: 11666, Training Loss: 0.01822
Epoch: 11667, Training Loss: 0.01384
Epoch: 11667, Training Loss: 0.01467
Epoch: 11667, Training Loss: 0.01469
Epoch: 11667, Training Loss: 0.01822
Epoch: 11668, Training Loss: 0.01384
Epoch: 11668, Training Loss: 0.01467
Epoch: 11668, Training Loss: 0.01469
Epoch: 11668, Training Loss: 0.01821
Epoch: 11669, Training Loss: 0.01384
Epoch: 11669, Training Loss: 0.01467
Epoch: 11669, Training Loss: 0.01469
Epoch: 11669, Training Loss: 0.01821
Epoch: 11670, Training Loss: 0.01384
Epoch: 11670, Training Loss: 0.01467
Epoch: 11670, Training Loss: 0.01469
Epoch: 11670, Training Loss: 0.01821
Epoch: 11671, Training Loss: 0.01384
Epoch: 11671, Training Loss: 0.01466
Epoch: 11671, Training Loss: 0.01469
Epoch: 11671, Training Loss: 0.01821
Epoch: 11672, Training Loss: 0.01384
Epoch: 11672, Training Loss: 0.01466
Epoch: 11672, Training Loss: 0.01469
Epoch: 11672, Training Loss: 0.01821
Epoch: 11673, Training Loss: 0.01384
Epoch: 11673, Training Loss: 0.01466
Epoch: 11673, Training Loss: 0.01468
Epoch: 11673, Training Loss: 0.01821
Epoch: 11674, Training Loss: 0.01384
Epoch: 11674, Training Loss: 0.01466
Epoch: 11674, Training Loss: 0.01468
Epoch: 11674, Training Loss: 0.01821
Epoch: 11675, Training Loss: 0.01384
Epoch: 11675, Training Loss: 0.01466
Epoch: 11675, Training Loss: 0.01468
Epoch: 11675, Training Loss: 0.01821
Epoch: 11676, Training Loss: 0.01384
Epoch: 11676, Training Loss: 0.01466
Epoch: 11676, Training Loss: 0.01468
Epoch: 11676, Training Loss: 0.01821
Epoch: 11677, Training Loss: 0.01384
Epoch: 11677, Training Loss: 0.01466
Epoch: 11677, Training Loss: 0.01468
Epoch: 11677, Training Loss: 0.01821
Epoch: 11678, Training Loss: 0.01384
Epoch: 11678, Training Loss: 0.01466
Epoch: 11678, Training Loss: 0.01468
Epoch: 11678, Training Loss: 0.01820
Epoch: 11679, Training Loss: 0.01384
Epoch: 11679, Training Loss: 0.01466
Epoch: 11679, Training Loss: 0.01468
Epoch: 11679, Training Loss: 0.01820
Epoch: 11680, Training Loss: 0.01383
Epoch: 11680, Training Loss: 0.01466
Epoch: 11680, Training Loss: 0.01468
Epoch: 11680, Training Loss: 0.01820
Epoch: 11681, Training Loss: 0.01383
Epoch: 11681, Training Loss: 0.01466
Epoch: 11681, Training Loss: 0.01468
Epoch: 11681, Training Loss: 0.01820
Epoch: 11682, Training Loss: 0.01383
Epoch: 11682, Training Loss: 0.01466
Epoch: 11682, Training Loss: 0.01468
Epoch: 11682, Training Loss: 0.01820
Epoch: 11683, Training Loss: 0.01383
Epoch: 11683, Training Loss: 0.01465
Epoch: 11683, Training Loss: 0.01468
Epoch: 11683, Training Loss: 0.01820
Epoch: 11684, Training Loss: 0.01383
Epoch: 11684, Training Loss: 0.01465
Epoch: 11684, Training Loss: 0.01468
Epoch: 11684, Training Loss: 0.01820
Epoch: 11685, Training Loss: 0.01383
Epoch: 11685, Training Loss: 0.01465
Epoch: 11685, Training Loss: 0.01467
Epoch: 11685, Training Loss: 0.01820
Epoch: 11686, Training Loss: 0.01383
Epoch: 11686, Training Loss: 0.01465
Epoch: 11686, Training Loss: 0.01467
Epoch: 11686, Training Loss: 0.01820
Epoch: 11687, Training Loss: 0.01383
Epoch: 11687, Training Loss: 0.01465
Epoch: 11687, Training Loss: 0.01467
Epoch: 11687, Training Loss: 0.01819
Epoch: 11688, Training Loss: 0.01383
Epoch: 11688, Training Loss: 0.01465
Epoch: 11688, Training Loss: 0.01467
Epoch: 11688, Training Loss: 0.01819
Epoch: 11689, Training Loss: 0.01383
Epoch: 11689, Training Loss: 0.01465
Epoch: 11689, Training Loss: 0.01467
Epoch: 11689, Training Loss: 0.01819
Epoch: 11690, Training Loss: 0.01383
Epoch: 11690, Training Loss: 0.01465
Epoch: 11690, Training Loss: 0.01467
Epoch: 11690, Training Loss: 0.01819
Epoch: 11691, Training Loss: 0.01383
Epoch: 11691, Training Loss: 0.01465
Epoch: 11691, Training Loss: 0.01467
Epoch: 11691, Training Loss: 0.01819
Epoch: 11692, Training Loss: 0.01383
Epoch: 11692, Training Loss: 0.01465
Epoch: 11692, Training Loss: 0.01467
Epoch: 11692, Training Loss: 0.01819
Epoch: 11693, Training Loss: 0.01383
Epoch: 11693, Training Loss: 0.01465
Epoch: 11693, Training Loss: 0.01467
Epoch: 11693, Training Loss: 0.01819
Epoch: 11694, Training Loss: 0.01382
Epoch: 11694, Training Loss: 0.01465
Epoch: 11694, Training Loss: 0.01467
Epoch: 11694, Training Loss: 0.01819
Epoch: 11695, Training Loss: 0.01382
Epoch: 11695, Training Loss: 0.01464
Epoch: 11695, Training Loss: 0.01467
Epoch: 11695, Training Loss: 0.01819
Epoch: 11696, Training Loss: 0.01382
Epoch: 11696, Training Loss: 0.01464
Epoch: 11696, Training Loss: 0.01467
Epoch: 11696, Training Loss: 0.01819
Epoch: 11697, Training Loss: 0.01382
Epoch: 11697, Training Loss: 0.01464
Epoch: 11697, Training Loss: 0.01466
Epoch: 11697, Training Loss: 0.01818
Epoch: 11698, Training Loss: 0.01382
Epoch: 11698, Training Loss: 0.01464
Epoch: 11698, Training Loss: 0.01466
Epoch: 11698, Training Loss: 0.01818
Epoch: 11699, Training Loss: 0.01382
Epoch: 11699, Training Loss: 0.01464
Epoch: 11699, Training Loss: 0.01466
Epoch: 11699, Training Loss: 0.01818
Epoch: 11700, Training Loss: 0.01382
Epoch: 11700, Training Loss: 0.01464
Epoch: 11700, Training Loss: 0.01466
Epoch: 11700, Training Loss: 0.01818
Epoch: 11701, Training Loss: 0.01382
Epoch: 11701, Training Loss: 0.01464
Epoch: 11701, Training Loss: 0.01466
Epoch: 11701, Training Loss: 0.01818
Epoch: 11702, Training Loss: 0.01382
Epoch: 11702, Training Loss: 0.01464
Epoch: 11702, Training Loss: 0.01466
Epoch: 11702, Training Loss: 0.01818
Epoch: 11703, Training Loss: 0.01382
Epoch: 11703, Training Loss: 0.01464
Epoch: 11703, Training Loss: 0.01466
Epoch: 11703, Training Loss: 0.01818
Epoch: 11704, Training Loss: 0.01382
Epoch: 11704, Training Loss: 0.01464
Epoch: 11704, Training Loss: 0.01466
Epoch: 11704, Training Loss: 0.01818
Epoch: 11705, Training Loss: 0.01382
Epoch: 11705, Training Loss: 0.01464
Epoch: 11705, Training Loss: 0.01466
Epoch: 11705, Training Loss: 0.01818
Epoch: 11706, Training Loss: 0.01382
Epoch: 11706, Training Loss: 0.01464
Epoch: 11706, Training Loss: 0.01466
Epoch: 11706, Training Loss: 0.01818
Epoch: 11707, Training Loss: 0.01381
Epoch: 11707, Training Loss: 0.01463
Epoch: 11707, Training Loss: 0.01466
Epoch: 11707, Training Loss: 0.01817
Epoch: 11708, Training Loss: 0.01381
Epoch: 11708, Training Loss: 0.01463
Epoch: 11708, Training Loss: 0.01466
Epoch: 11708, Training Loss: 0.01817
Epoch: 11709, Training Loss: 0.01381
Epoch: 11709, Training Loss: 0.01463
Epoch: 11709, Training Loss: 0.01465
Epoch: 11709, Training Loss: 0.01817
Epoch: 11710, Training Loss: 0.01381
Epoch: 11710, Training Loss: 0.01463
Epoch: 11710, Training Loss: 0.01465
Epoch: 11710, Training Loss: 0.01817
Epoch: 11711, Training Loss: 0.01381
Epoch: 11711, Training Loss: 0.01463
Epoch: 11711, Training Loss: 0.01465
Epoch: 11711, Training Loss: 0.01817
Epoch: 11712, Training Loss: 0.01381
Epoch: 11712, Training Loss: 0.01463
Epoch: 11712, Training Loss: 0.01465
Epoch: 11712, Training Loss: 0.01817
Epoch: 11713, Training Loss: 0.01381
Epoch: 11713, Training Loss: 0.01463
Epoch: 11713, Training Loss: 0.01465
Epoch: 11713, Training Loss: 0.01817
Epoch: 11714, Training Loss: 0.01381
Epoch: 11714, Training Loss: 0.01463
Epoch: 11714, Training Loss: 0.01465
Epoch: 11714, Training Loss: 0.01817
Epoch: 11715, Training Loss: 0.01381
Epoch: 11715, Training Loss: 0.01463
Epoch: 11715, Training Loss: 0.01465
Epoch: 11715, Training Loss: 0.01817
Epoch: 11716, Training Loss: 0.01381
Epoch: 11716, Training Loss: 0.01463
Epoch: 11716, Training Loss: 0.01465
Epoch: 11716, Training Loss: 0.01816
Epoch: 11717, Training Loss: 0.01381
Epoch: 11717, Training Loss: 0.01463
Epoch: 11717, Training Loss: 0.01465
Epoch: 11717, Training Loss: 0.01816
Epoch: 11718, Training Loss: 0.01381
Epoch: 11718, Training Loss: 0.01463
Epoch: 11718, Training Loss: 0.01465
Epoch: 11718, Training Loss: 0.01816
Epoch: 11719, Training Loss: 0.01381
Epoch: 11719, Training Loss: 0.01462
Epoch: 11719, Training Loss: 0.01465
Epoch: 11719, Training Loss: 0.01816
Epoch: 11720, Training Loss: 0.01380
Epoch: 11720, Training Loss: 0.01462
Epoch: 11720, Training Loss: 0.01465
Epoch: 11720, Training Loss: 0.01816
Epoch: 11721, Training Loss: 0.01380
Epoch: 11721, Training Loss: 0.01462
Epoch: 11721, Training Loss: 0.01464
Epoch: 11721, Training Loss: 0.01816
Epoch: 11722, Training Loss: 0.01380
Epoch: 11722, Training Loss: 0.01462
Epoch: 11722, Training Loss: 0.01464
Epoch: 11722, Training Loss: 0.01816
Epoch: 11723, Training Loss: 0.01380
Epoch: 11723, Training Loss: 0.01462
Epoch: 11723, Training Loss: 0.01464
Epoch: 11723, Training Loss: 0.01816
Epoch: 11724, Training Loss: 0.01380
Epoch: 11724, Training Loss: 0.01462
Epoch: 11724, Training Loss: 0.01464
Epoch: 11724, Training Loss: 0.01816
Epoch: 11725, Training Loss: 0.01380
Epoch: 11725, Training Loss: 0.01462
Epoch: 11725, Training Loss: 0.01464
... [repetitive log output omitted: each epoch prints four losses, one per XOR pattern; over epochs 11725–11969 they decrease slowly from about 0.0138 / 0.0146 / 0.0146 / 0.0182 to about 0.0136 / 0.0144 / 0.0145 / 0.0179] ...
Epoch: 11969, Training Loss: 0.01364
Epoch: 11969, Training Loss: 0.01442
Epoch: 11969, Training Loss: 0.01445
Epoch: 11969, Training Loss: 0.01791
Epoch: 11970, Training Loss: 0.01362
Epoch: 11970, Training Loss: 0.01442
Epoch: 11970, Training Loss: 0.01444
Epoch: 11970, Training Loss: 0.01790
Epoch: 11971, Training Loss: 0.01362
Epoch: 11971, Training Loss: 0.01442
Epoch: 11971, Training Loss: 0.01444
Epoch: 11971, Training Loss: 0.01790
Epoch: 11972, Training Loss: 0.01362
Epoch: 11972, Training Loss: 0.01442
Epoch: 11972, Training Loss: 0.01444
Epoch: 11972, Training Loss: 0.01790
Epoch: 11973, Training Loss: 0.01362
Epoch: 11973, Training Loss: 0.01442
Epoch: 11973, Training Loss: 0.01444
Epoch: 11973, Training Loss: 0.01790
Epoch: 11974, Training Loss: 0.01362
Epoch: 11974, Training Loss: 0.01442
Epoch: 11974, Training Loss: 0.01444
Epoch: 11974, Training Loss: 0.01790
Epoch: 11975, Training Loss: 0.01362
Epoch: 11975, Training Loss: 0.01442
Epoch: 11975, Training Loss: 0.01444
Epoch: 11975, Training Loss: 0.01790
Epoch: 11976, Training Loss: 0.01362
Epoch: 11976, Training Loss: 0.01442
Epoch: 11976, Training Loss: 0.01444
Epoch: 11976, Training Loss: 0.01790
Epoch: 11977, Training Loss: 0.01362
Epoch: 11977, Training Loss: 0.01442
Epoch: 11977, Training Loss: 0.01444
Epoch: 11977, Training Loss: 0.01790
Epoch: 11978, Training Loss: 0.01362
Epoch: 11978, Training Loss: 0.01442
Epoch: 11978, Training Loss: 0.01444
Epoch: 11978, Training Loss: 0.01790
Epoch: 11979, Training Loss: 0.01362
Epoch: 11979, Training Loss: 0.01442
Epoch: 11979, Training Loss: 0.01444
Epoch: 11979, Training Loss: 0.01790
Epoch: 11980, Training Loss: 0.01361
Epoch: 11980, Training Loss: 0.01441
Epoch: 11980, Training Loss: 0.01444
Epoch: 11980, Training Loss: 0.01789
Epoch: 11981, Training Loss: 0.01361
Epoch: 11981, Training Loss: 0.01441
Epoch: 11981, Training Loss: 0.01444
Epoch: 11981, Training Loss: 0.01789
Epoch: 11982, Training Loss: 0.01361
Epoch: 11982, Training Loss: 0.01441
Epoch: 11982, Training Loss: 0.01443
Epoch: 11982, Training Loss: 0.01789
Epoch: 11983, Training Loss: 0.01361
Epoch: 11983, Training Loss: 0.01441
Epoch: 11983, Training Loss: 0.01443
Epoch: 11983, Training Loss: 0.01789
Epoch: 11984, Training Loss: 0.01361
Epoch: 11984, Training Loss: 0.01441
Epoch: 11984, Training Loss: 0.01443
Epoch: 11984, Training Loss: 0.01789
Epoch: 11985, Training Loss: 0.01361
Epoch: 11985, Training Loss: 0.01441
Epoch: 11985, Training Loss: 0.01443
Epoch: 11985, Training Loss: 0.01789
Epoch: 11986, Training Loss: 0.01361
Epoch: 11986, Training Loss: 0.01441
Epoch: 11986, Training Loss: 0.01443
Epoch: 11986, Training Loss: 0.01789
Epoch: 11987, Training Loss: 0.01361
Epoch: 11987, Training Loss: 0.01441
Epoch: 11987, Training Loss: 0.01443
Epoch: 11987, Training Loss: 0.01789
Epoch: 11988, Training Loss: 0.01361
Epoch: 11988, Training Loss: 0.01441
Epoch: 11988, Training Loss: 0.01443
Epoch: 11988, Training Loss: 0.01789
Epoch: 11989, Training Loss: 0.01361
Epoch: 11989, Training Loss: 0.01441
Epoch: 11989, Training Loss: 0.01443
Epoch: 11989, Training Loss: 0.01789
Epoch: 11990, Training Loss: 0.01361
Epoch: 11990, Training Loss: 0.01441
Epoch: 11990, Training Loss: 0.01443
Epoch: 11990, Training Loss: 0.01788
Epoch: 11991, Training Loss: 0.01361
Epoch: 11991, Training Loss: 0.01441
Epoch: 11991, Training Loss: 0.01443
Epoch: 11991, Training Loss: 0.01788
Epoch: 11992, Training Loss: 0.01361
Epoch: 11992, Training Loss: 0.01441
Epoch: 11992, Training Loss: 0.01443
Epoch: 11992, Training Loss: 0.01788
Epoch: 11993, Training Loss: 0.01361
Epoch: 11993, Training Loss: 0.01440
Epoch: 11993, Training Loss: 0.01443
Epoch: 11993, Training Loss: 0.01788
Epoch: 11994, Training Loss: 0.01360
Epoch: 11994, Training Loss: 0.01440
Epoch: 11994, Training Loss: 0.01443
Epoch: 11994, Training Loss: 0.01788
Epoch: 11995, Training Loss: 0.01360
Epoch: 11995, Training Loss: 0.01440
Epoch: 11995, Training Loss: 0.01442
Epoch: 11995, Training Loss: 0.01788
Epoch: 11996, Training Loss: 0.01360
Epoch: 11996, Training Loss: 0.01440
Epoch: 11996, Training Loss: 0.01442
Epoch: 11996, Training Loss: 0.01788
Epoch: 11997, Training Loss: 0.01360
Epoch: 11997, Training Loss: 0.01440
Epoch: 11997, Training Loss: 0.01442
Epoch: 11997, Training Loss: 0.01788
Epoch: 11998, Training Loss: 0.01360
Epoch: 11998, Training Loss: 0.01440
Epoch: 11998, Training Loss: 0.01442
Epoch: 11998, Training Loss: 0.01788
Epoch: 11999, Training Loss: 0.01360
Epoch: 11999, Training Loss: 0.01440
Epoch: 11999, Training Loss: 0.01442
Epoch: 11999, Training Loss: 0.01788
Epoch: 12000, Training Loss: 0.01360
Epoch: 12000, Training Loss: 0.01440
Epoch: 12000, Training Loss: 0.01442
Epoch: 12000, Training Loss: 0.01787
Epoch: 12001, Training Loss: 0.01360
Epoch: 12001, Training Loss: 0.01440
Epoch: 12001, Training Loss: 0.01442
Epoch: 12001, Training Loss: 0.01787
Epoch: 12002, Training Loss: 0.01360
Epoch: 12002, Training Loss: 0.01440
Epoch: 12002, Training Loss: 0.01442
Epoch: 12002, Training Loss: 0.01787
Epoch: 12003, Training Loss: 0.01360
Epoch: 12003, Training Loss: 0.01440
Epoch: 12003, Training Loss: 0.01442
Epoch: 12003, Training Loss: 0.01787
Epoch: 12004, Training Loss: 0.01360
Epoch: 12004, Training Loss: 0.01440
Epoch: 12004, Training Loss: 0.01442
Epoch: 12004, Training Loss: 0.01787
Epoch: 12005, Training Loss: 0.01360
Epoch: 12005, Training Loss: 0.01440
Epoch: 12005, Training Loss: 0.01442
Epoch: 12005, Training Loss: 0.01787
Epoch: 12006, Training Loss: 0.01360
Epoch: 12006, Training Loss: 0.01439
Epoch: 12006, Training Loss: 0.01442
Epoch: 12006, Training Loss: 0.01787
Epoch: 12007, Training Loss: 0.01360
Epoch: 12007, Training Loss: 0.01439
Epoch: 12007, Training Loss: 0.01442
Epoch: 12007, Training Loss: 0.01787
Epoch: 12008, Training Loss: 0.01359
Epoch: 12008, Training Loss: 0.01439
Epoch: 12008, Training Loss: 0.01441
Epoch: 12008, Training Loss: 0.01787
Epoch: 12009, Training Loss: 0.01359
Epoch: 12009, Training Loss: 0.01439
Epoch: 12009, Training Loss: 0.01441
Epoch: 12009, Training Loss: 0.01787
Epoch: 12010, Training Loss: 0.01359
Epoch: 12010, Training Loss: 0.01439
Epoch: 12010, Training Loss: 0.01441
Epoch: 12010, Training Loss: 0.01787
Epoch: 12011, Training Loss: 0.01359
Epoch: 12011, Training Loss: 0.01439
Epoch: 12011, Training Loss: 0.01441
Epoch: 12011, Training Loss: 0.01786
Epoch: 12012, Training Loss: 0.01359
Epoch: 12012, Training Loss: 0.01439
Epoch: 12012, Training Loss: 0.01441
Epoch: 12012, Training Loss: 0.01786
Epoch: 12013, Training Loss: 0.01359
Epoch: 12013, Training Loss: 0.01439
Epoch: 12013, Training Loss: 0.01441
Epoch: 12013, Training Loss: 0.01786
Epoch: 12014, Training Loss: 0.01359
Epoch: 12014, Training Loss: 0.01439
Epoch: 12014, Training Loss: 0.01441
Epoch: 12014, Training Loss: 0.01786
Epoch: 12015, Training Loss: 0.01359
Epoch: 12015, Training Loss: 0.01439
Epoch: 12015, Training Loss: 0.01441
Epoch: 12015, Training Loss: 0.01786
Epoch: 12016, Training Loss: 0.01359
Epoch: 12016, Training Loss: 0.01439
Epoch: 12016, Training Loss: 0.01441
Epoch: 12016, Training Loss: 0.01786
Epoch: 12017, Training Loss: 0.01359
Epoch: 12017, Training Loss: 0.01439
Epoch: 12017, Training Loss: 0.01441
Epoch: 12017, Training Loss: 0.01786
Epoch: 12018, Training Loss: 0.01359
Epoch: 12018, Training Loss: 0.01439
Epoch: 12018, Training Loss: 0.01441
Epoch: 12018, Training Loss: 0.01786
Epoch: 12019, Training Loss: 0.01359
Epoch: 12019, Training Loss: 0.01438
Epoch: 12019, Training Loss: 0.01441
Epoch: 12019, Training Loss: 0.01786
Epoch: 12020, Training Loss: 0.01359
Epoch: 12020, Training Loss: 0.01438
Epoch: 12020, Training Loss: 0.01440
Epoch: 12020, Training Loss: 0.01786
Epoch: 12021, Training Loss: 0.01359
Epoch: 12021, Training Loss: 0.01438
Epoch: 12021, Training Loss: 0.01440
Epoch: 12021, Training Loss: 0.01785
Epoch: 12022, Training Loss: 0.01358
Epoch: 12022, Training Loss: 0.01438
Epoch: 12022, Training Loss: 0.01440
Epoch: 12022, Training Loss: 0.01785
Epoch: 12023, Training Loss: 0.01358
Epoch: 12023, Training Loss: 0.01438
Epoch: 12023, Training Loss: 0.01440
Epoch: 12023, Training Loss: 0.01785
Epoch: 12024, Training Loss: 0.01358
Epoch: 12024, Training Loss: 0.01438
Epoch: 12024, Training Loss: 0.01440
Epoch: 12024, Training Loss: 0.01785
Epoch: 12025, Training Loss: 0.01358
Epoch: 12025, Training Loss: 0.01438
Epoch: 12025, Training Loss: 0.01440
Epoch: 12025, Training Loss: 0.01785
Epoch: 12026, Training Loss: 0.01358
Epoch: 12026, Training Loss: 0.01438
Epoch: 12026, Training Loss: 0.01440
Epoch: 12026, Training Loss: 0.01785
Epoch: 12027, Training Loss: 0.01358
Epoch: 12027, Training Loss: 0.01438
Epoch: 12027, Training Loss: 0.01440
Epoch: 12027, Training Loss: 0.01785
Epoch: 12028, Training Loss: 0.01358
Epoch: 12028, Training Loss: 0.01438
Epoch: 12028, Training Loss: 0.01440
Epoch: 12028, Training Loss: 0.01785
Epoch: 12029, Training Loss: 0.01358
Epoch: 12029, Training Loss: 0.01438
Epoch: 12029, Training Loss: 0.01440
Epoch: 12029, Training Loss: 0.01785
Epoch: 12030, Training Loss: 0.01358
Epoch: 12030, Training Loss: 0.01438
Epoch: 12030, Training Loss: 0.01440
Epoch: 12030, Training Loss: 0.01785
Epoch: 12031, Training Loss: 0.01358
Epoch: 12031, Training Loss: 0.01437
Epoch: 12031, Training Loss: 0.01440
Epoch: 12031, Training Loss: 0.01784
Epoch: 12032, Training Loss: 0.01358
Epoch: 12032, Training Loss: 0.01437
Epoch: 12032, Training Loss: 0.01440
Epoch: 12032, Training Loss: 0.01784
Epoch: 12033, Training Loss: 0.01358
Epoch: 12033, Training Loss: 0.01437
Epoch: 12033, Training Loss: 0.01439
Epoch: 12033, Training Loss: 0.01784
Epoch: 12034, Training Loss: 0.01358
Epoch: 12034, Training Loss: 0.01437
Epoch: 12034, Training Loss: 0.01439
Epoch: 12034, Training Loss: 0.01784
Epoch: 12035, Training Loss: 0.01358
Epoch: 12035, Training Loss: 0.01437
Epoch: 12035, Training Loss: 0.01439
Epoch: 12035, Training Loss: 0.01784
Epoch: 12036, Training Loss: 0.01357
Epoch: 12036, Training Loss: 0.01437
Epoch: 12036, Training Loss: 0.01439
Epoch: 12036, Training Loss: 0.01784
Epoch: 12037, Training Loss: 0.01357
Epoch: 12037, Training Loss: 0.01437
Epoch: 12037, Training Loss: 0.01439
Epoch: 12037, Training Loss: 0.01784
Epoch: 12038, Training Loss: 0.01357
Epoch: 12038, Training Loss: 0.01437
Epoch: 12038, Training Loss: 0.01439
Epoch: 12038, Training Loss: 0.01784
Epoch: 12039, Training Loss: 0.01357
Epoch: 12039, Training Loss: 0.01437
Epoch: 12039, Training Loss: 0.01439
Epoch: 12039, Training Loss: 0.01784
Epoch: 12040, Training Loss: 0.01357
Epoch: 12040, Training Loss: 0.01437
Epoch: 12040, Training Loss: 0.01439
Epoch: 12040, Training Loss: 0.01784
Epoch: 12041, Training Loss: 0.01357
Epoch: 12041, Training Loss: 0.01437
Epoch: 12041, Training Loss: 0.01439
Epoch: 12041, Training Loss: 0.01783
Epoch: 12042, Training Loss: 0.01357
Epoch: 12042, Training Loss: 0.01437
Epoch: 12042, Training Loss: 0.01439
Epoch: 12042, Training Loss: 0.01783
Epoch: 12043, Training Loss: 0.01357
Epoch: 12043, Training Loss: 0.01437
Epoch: 12043, Training Loss: 0.01439
Epoch: 12043, Training Loss: 0.01783
Epoch: 12044, Training Loss: 0.01357
Epoch: 12044, Training Loss: 0.01436
Epoch: 12044, Training Loss: 0.01439
Epoch: 12044, Training Loss: 0.01783
Epoch: 12045, Training Loss: 0.01357
Epoch: 12045, Training Loss: 0.01436
Epoch: 12045, Training Loss: 0.01439
Epoch: 12045, Training Loss: 0.01783
Epoch: 12046, Training Loss: 0.01357
Epoch: 12046, Training Loss: 0.01436
Epoch: 12046, Training Loss: 0.01438
Epoch: 12046, Training Loss: 0.01783
Epoch: 12047, Training Loss: 0.01357
Epoch: 12047, Training Loss: 0.01436
Epoch: 12047, Training Loss: 0.01438
Epoch: 12047, Training Loss: 0.01783
Epoch: 12048, Training Loss: 0.01357
Epoch: 12048, Training Loss: 0.01436
Epoch: 12048, Training Loss: 0.01438
Epoch: 12048, Training Loss: 0.01783
Epoch: 12049, Training Loss: 0.01357
Epoch: 12049, Training Loss: 0.01436
Epoch: 12049, Training Loss: 0.01438
Epoch: 12049, Training Loss: 0.01783
Epoch: 12050, Training Loss: 0.01356
Epoch: 12050, Training Loss: 0.01436
Epoch: 12050, Training Loss: 0.01438
Epoch: 12050, Training Loss: 0.01783
Epoch: 12051, Training Loss: 0.01356
Epoch: 12051, Training Loss: 0.01436
Epoch: 12051, Training Loss: 0.01438
Epoch: 12051, Training Loss: 0.01782
Epoch: 12052, Training Loss: 0.01356
Epoch: 12052, Training Loss: 0.01436
Epoch: 12052, Training Loss: 0.01438
Epoch: 12052, Training Loss: 0.01782
Epoch: 12053, Training Loss: 0.01356
Epoch: 12053, Training Loss: 0.01436
Epoch: 12053, Training Loss: 0.01438
Epoch: 12053, Training Loss: 0.01782
Epoch: 12054, Training Loss: 0.01356
Epoch: 12054, Training Loss: 0.01436
Epoch: 12054, Training Loss: 0.01438
Epoch: 12054, Training Loss: 0.01782
Epoch: 12055, Training Loss: 0.01356
Epoch: 12055, Training Loss: 0.01436
Epoch: 12055, Training Loss: 0.01438
Epoch: 12055, Training Loss: 0.01782
Epoch: 12056, Training Loss: 0.01356
Epoch: 12056, Training Loss: 0.01436
Epoch: 12056, Training Loss: 0.01438
Epoch: 12056, Training Loss: 0.01782
Epoch: 12057, Training Loss: 0.01356
Epoch: 12057, Training Loss: 0.01435
Epoch: 12057, Training Loss: 0.01438
Epoch: 12057, Training Loss: 0.01782
Epoch: 12058, Training Loss: 0.01356
Epoch: 12058, Training Loss: 0.01435
Epoch: 12058, Training Loss: 0.01438
Epoch: 12058, Training Loss: 0.01782
Epoch: 12059, Training Loss: 0.01356
Epoch: 12059, Training Loss: 0.01435
Epoch: 12059, Training Loss: 0.01437
Epoch: 12059, Training Loss: 0.01782
Epoch: 12060, Training Loss: 0.01356
Epoch: 12060, Training Loss: 0.01435
Epoch: 12060, Training Loss: 0.01437
Epoch: 12060, Training Loss: 0.01782
Epoch: 12061, Training Loss: 0.01356
Epoch: 12061, Training Loss: 0.01435
Epoch: 12061, Training Loss: 0.01437
Epoch: 12061, Training Loss: 0.01781
Epoch: 12062, Training Loss: 0.01356
Epoch: 12062, Training Loss: 0.01435
Epoch: 12062, Training Loss: 0.01437
Epoch: 12062, Training Loss: 0.01781
Epoch: 12063, Training Loss: 0.01356
Epoch: 12063, Training Loss: 0.01435
Epoch: 12063, Training Loss: 0.01437
Epoch: 12063, Training Loss: 0.01781
Epoch: 12064, Training Loss: 0.01355
Epoch: 12064, Training Loss: 0.01435
Epoch: 12064, Training Loss: 0.01437
Epoch: 12064, Training Loss: 0.01781
Epoch: 12065, Training Loss: 0.01355
Epoch: 12065, Training Loss: 0.01435
Epoch: 12065, Training Loss: 0.01437
Epoch: 12065, Training Loss: 0.01781
Epoch: 12066, Training Loss: 0.01355
Epoch: 12066, Training Loss: 0.01435
Epoch: 12066, Training Loss: 0.01437
Epoch: 12066, Training Loss: 0.01781
Epoch: 12067, Training Loss: 0.01355
Epoch: 12067, Training Loss: 0.01435
Epoch: 12067, Training Loss: 0.01437
Epoch: 12067, Training Loss: 0.01781
Epoch: 12068, Training Loss: 0.01355
Epoch: 12068, Training Loss: 0.01435
Epoch: 12068, Training Loss: 0.01437
Epoch: 12068, Training Loss: 0.01781
Epoch: 12069, Training Loss: 0.01355
Epoch: 12069, Training Loss: 0.01435
Epoch: 12069, Training Loss: 0.01437
Epoch: 12069, Training Loss: 0.01781
Epoch: 12070, Training Loss: 0.01355
Epoch: 12070, Training Loss: 0.01434
Epoch: 12070, Training Loss: 0.01437
Epoch: 12070, Training Loss: 0.01781
Epoch: 12071, Training Loss: 0.01355
Epoch: 12071, Training Loss: 0.01434
Epoch: 12071, Training Loss: 0.01437
Epoch: 12071, Training Loss: 0.01780
Epoch: 12072, Training Loss: 0.01355
Epoch: 12072, Training Loss: 0.01434
Epoch: 12072, Training Loss: 0.01436
Epoch: 12072, Training Loss: 0.01780
Epoch: 12073, Training Loss: 0.01355
Epoch: 12073, Training Loss: 0.01434
Epoch: 12073, Training Loss: 0.01436
Epoch: 12073, Training Loss: 0.01780
Epoch: 12074, Training Loss: 0.01355
Epoch: 12074, Training Loss: 0.01434
Epoch: 12074, Training Loss: 0.01436
Epoch: 12074, Training Loss: 0.01780
Epoch: 12075, Training Loss: 0.01355
Epoch: 12075, Training Loss: 0.01434
Epoch: 12075, Training Loss: 0.01436
Epoch: 12075, Training Loss: 0.01780
Epoch: 12076, Training Loss: 0.01355
Epoch: 12076, Training Loss: 0.01434
Epoch: 12076, Training Loss: 0.01436
Epoch: 12076, Training Loss: 0.01780
Epoch: 12077, Training Loss: 0.01355
Epoch: 12077, Training Loss: 0.01434
Epoch: 12077, Training Loss: 0.01436
Epoch: 12077, Training Loss: 0.01780
Epoch: 12078, Training Loss: 0.01355
Epoch: 12078, Training Loss: 0.01434
Epoch: 12078, Training Loss: 0.01436
Epoch: 12078, Training Loss: 0.01780
Epoch: 12079, Training Loss: 0.01354
Epoch: 12079, Training Loss: 0.01434
Epoch: 12079, Training Loss: 0.01436
Epoch: 12079, Training Loss: 0.01780
Epoch: 12080, Training Loss: 0.01354
Epoch: 12080, Training Loss: 0.01434
Epoch: 12080, Training Loss: 0.01436
Epoch: 12080, Training Loss: 0.01780
Epoch: 12081, Training Loss: 0.01354
Epoch: 12081, Training Loss: 0.01434
Epoch: 12081, Training Loss: 0.01436
Epoch: 12081, Training Loss: 0.01779
Epoch: 12082, Training Loss: 0.01354
Epoch: 12082, Training Loss: 0.01434
Epoch: 12082, Training Loss: 0.01436
Epoch: 12082, Training Loss: 0.01779
Epoch: 12083, Training Loss: 0.01354
Epoch: 12083, Training Loss: 0.01433
Epoch: 12083, Training Loss: 0.01436
Epoch: 12083, Training Loss: 0.01779
Epoch: 12084, Training Loss: 0.01354
Epoch: 12084, Training Loss: 0.01433
Epoch: 12084, Training Loss: 0.01435
Epoch: 12084, Training Loss: 0.01779
Epoch: 12085, Training Loss: 0.01354
Epoch: 12085, Training Loss: 0.01433
Epoch: 12085, Training Loss: 0.01435
Epoch: 12085, Training Loss: 0.01779
Epoch: 12086, Training Loss: 0.01354
Epoch: 12086, Training Loss: 0.01433
Epoch: 12086, Training Loss: 0.01435
Epoch: 12086, Training Loss: 0.01779
Epoch: 12087, Training Loss: 0.01354
Epoch: 12087, Training Loss: 0.01433
Epoch: 12087, Training Loss: 0.01435
Epoch: 12087, Training Loss: 0.01779
Epoch: 12088, Training Loss: 0.01354
Epoch: 12088, Training Loss: 0.01433
Epoch: 12088, Training Loss: 0.01435
Epoch: 12088, Training Loss: 0.01779
Epoch: 12089, Training Loss: 0.01354
Epoch: 12089, Training Loss: 0.01433
Epoch: 12089, Training Loss: 0.01435
Epoch: 12089, Training Loss: 0.01779
Epoch: 12090, Training Loss: 0.01354
Epoch: 12090, Training Loss: 0.01433
Epoch: 12090, Training Loss: 0.01435
Epoch: 12090, Training Loss: 0.01779
Epoch: 12091, Training Loss: 0.01354
Epoch: 12091, Training Loss: 0.01433
Epoch: 12091, Training Loss: 0.01435
Epoch: 12091, Training Loss: 0.01778
Epoch: 12092, Training Loss: 0.01354
Epoch: 12092, Training Loss: 0.01433
Epoch: 12092, Training Loss: 0.01435
Epoch: 12092, Training Loss: 0.01778
Epoch: 12093, Training Loss: 0.01353
Epoch: 12093, Training Loss: 0.01433
Epoch: 12093, Training Loss: 0.01435
Epoch: 12093, Training Loss: 0.01778
Epoch: 12094, Training Loss: 0.01353
Epoch: 12094, Training Loss: 0.01433
Epoch: 12094, Training Loss: 0.01435
Epoch: 12094, Training Loss: 0.01778
Epoch: 12095, Training Loss: 0.01353
Epoch: 12095, Training Loss: 0.01433
Epoch: 12095, Training Loss: 0.01435
Epoch: 12095, Training Loss: 0.01778
Epoch: 12096, Training Loss: 0.01353
Epoch: 12096, Training Loss: 0.01432
Epoch: 12096, Training Loss: 0.01435
Epoch: 12096, Training Loss: 0.01778
Epoch: 12097, Training Loss: 0.01353
Epoch: 12097, Training Loss: 0.01432
Epoch: 12097, Training Loss: 0.01434
Epoch: 12097, Training Loss: 0.01778
Epoch: 12098, Training Loss: 0.01353
Epoch: 12098, Training Loss: 0.01432
Epoch: 12098, Training Loss: 0.01434
Epoch: 12098, Training Loss: 0.01778
Epoch: 12099, Training Loss: 0.01353
Epoch: 12099, Training Loss: 0.01432
Epoch: 12099, Training Loss: 0.01434
Epoch: 12099, Training Loss: 0.01778
Epoch: 12100, Training Loss: 0.01353
Epoch: 12100, Training Loss: 0.01432
Epoch: 12100, Training Loss: 0.01434
Epoch: 12100, Training Loss: 0.01778
Epoch: 12101, Training Loss: 0.01353
Epoch: 12101, Training Loss: 0.01432
Epoch: 12101, Training Loss: 0.01434
Epoch: 12101, Training Loss: 0.01778
Epoch: 12102, Training Loss: 0.01353
Epoch: 12102, Training Loss: 0.01432
Epoch: 12102, Training Loss: 0.01434
Epoch: 12102, Training Loss: 0.01777
Epoch: 12103, Training Loss: 0.01353
Epoch: 12103, Training Loss: 0.01432
Epoch: 12103, Training Loss: 0.01434
Epoch: 12103, Training Loss: 0.01777
Epoch: 12104, Training Loss: 0.01353
Epoch: 12104, Training Loss: 0.01432
Epoch: 12104, Training Loss: 0.01434
Epoch: 12104, Training Loss: 0.01777
Epoch: 12105, Training Loss: 0.01353
Epoch: 12105, Training Loss: 0.01432
Epoch: 12105, Training Loss: 0.01434
Epoch: 12105, Training Loss: 0.01777
Epoch: 12106, Training Loss: 0.01353
Epoch: 12106, Training Loss: 0.01432
Epoch: 12106, Training Loss: 0.01434
Epoch: 12106, Training Loss: 0.01777
Epoch: 12107, Training Loss: 0.01352
Epoch: 12107, Training Loss: 0.01432
Epoch: 12107, Training Loss: 0.01434
Epoch: 12107, Training Loss: 0.01777
Epoch: 12108, Training Loss: 0.01352
Epoch: 12108, Training Loss: 0.01432
Epoch: 12108, Training Loss: 0.01434
Epoch: 12108, Training Loss: 0.01777
Epoch: 12109, Training Loss: 0.01352
Epoch: 12109, Training Loss: 0.01431
Epoch: 12109, Training Loss: 0.01434
Epoch: 12109, Training Loss: 0.01777
Epoch: 12110, Training Loss: 0.01352
Epoch: 12110, Training Loss: 0.01431
Epoch: 12110, Training Loss: 0.01433
Epoch: 12110, Training Loss: 0.01777
Epoch: 12111, Training Loss: 0.01352
Epoch: 12111, Training Loss: 0.01431
Epoch: 12111, Training Loss: 0.01433
Epoch: 12111, Training Loss: 0.01777
Epoch: 12112, Training Loss: 0.01352
Epoch: 12112, Training Loss: 0.01431
Epoch: 12112, Training Loss: 0.01433
Epoch: 12112, Training Loss: 0.01776
Epoch: 12113, Training Loss: 0.01352
Epoch: 12113, Training Loss: 0.01431
Epoch: 12113, Training Loss: 0.01433
Epoch: 12113, Training Loss: 0.01776
Epoch: 12114, Training Loss: 0.01352
Epoch: 12114, Training Loss: 0.01431
Epoch: 12114, Training Loss: 0.01433
Epoch: 12114, Training Loss: 0.01776
Epoch: 12115, Training Loss: 0.01352
Epoch: 12115, Training Loss: 0.01431
Epoch: 12115, Training Loss: 0.01433
Epoch: 12115, Training Loss: 0.01776
Epoch: 12116, Training Loss: 0.01352
Epoch: 12116, Training Loss: 0.01431
Epoch: 12116, Training Loss: 0.01433
Epoch: 12116, Training Loss: 0.01776
Epoch: 12117, Training Loss: 0.01352
Epoch: 12117, Training Loss: 0.01431
Epoch: 12117, Training Loss: 0.01433
Epoch: 12117, Training Loss: 0.01776
Epoch: 12118, Training Loss: 0.01352
Epoch: 12118, Training Loss: 0.01431
Epoch: 12118, Training Loss: 0.01433
Epoch: 12118, Training Loss: 0.01776
Epoch: 12119, Training Loss: 0.01352
Epoch: 12119, Training Loss: 0.01431
Epoch: 12119, Training Loss: 0.01433
Epoch: 12119, Training Loss: 0.01776
Epoch: 12120, Training Loss: 0.01352
Epoch: 12120, Training Loss: 0.01431
Epoch: 12120, Training Loss: 0.01433
Epoch: 12120, Training Loss: 0.01776
Epoch: 12121, Training Loss: 0.01351
Epoch: 12121, Training Loss: 0.01431
Epoch: 12121, Training Loss: 0.01433
Epoch: 12121, Training Loss: 0.01776
Epoch: 12122, Training Loss: 0.01351
Epoch: 12122, Training Loss: 0.01430
Epoch: 12122, Training Loss: 0.01433
Epoch: 12122, Training Loss: 0.01775
Epoch: 12123, Training Loss: 0.01351
Epoch: 12123, Training Loss: 0.01430
Epoch: 12123, Training Loss: 0.01432
Epoch: 12123, Training Loss: 0.01775
Epoch: 12124, Training Loss: 0.01351
Epoch: 12124, Training Loss: 0.01430
Epoch: 12124, Training Loss: 0.01432
Epoch: 12124, Training Loss: 0.01775
Epoch: 12125, Training Loss: 0.01351
Epoch: 12125, Training Loss: 0.01430
Epoch: 12125, Training Loss: 0.01432
Epoch: 12125, Training Loss: 0.01775
Epoch: 12126, Training Loss: 0.01351
Epoch: 12126, Training Loss: 0.01430
Epoch: 12126, Training Loss: 0.01432
Epoch: 12126, Training Loss: 0.01775
Epoch: 12127, Training Loss: 0.01351
Epoch: 12127, Training Loss: 0.01430
Epoch: 12127, Training Loss: 0.01432
Epoch: 12127, Training Loss: 0.01775
Epoch: 12128, Training Loss: 0.01351
Epoch: 12128, Training Loss: 0.01430
Epoch: 12128, Training Loss: 0.01432
Epoch: 12128, Training Loss: 0.01775
Epoch: 12129, Training Loss: 0.01351
Epoch: 12129, Training Loss: 0.01430
Epoch: 12129, Training Loss: 0.01432
Epoch: 12129, Training Loss: 0.01775
Epoch: 12130, Training Loss: 0.01351
Epoch: 12130, Training Loss: 0.01430
Epoch: 12130, Training Loss: 0.01432
Epoch: 12130, Training Loss: 0.01775
Epoch: 12131, Training Loss: 0.01351
Epoch: 12131, Training Loss: 0.01430
Epoch: 12131, Training Loss: 0.01432
Epoch: 12131, Training Loss: 0.01775
Epoch: 12132, Training Loss: 0.01351
Epoch: 12132, Training Loss: 0.01430
Epoch: 12132, Training Loss: 0.01432
Epoch: 12132, Training Loss: 0.01774
Epoch: 12133, Training Loss: 0.01351
Epoch: 12133, Training Loss: 0.01430
Epoch: 12133, Training Loss: 0.01432
Epoch: 12133, Training Loss: 0.01774
Epoch: 12134, Training Loss: 0.01351
Epoch: 12134, Training Loss: 0.01430
Epoch: 12134, Training Loss: 0.01432
Epoch: 12134, Training Loss: 0.01774
Epoch: 12135, Training Loss: 0.01351
Epoch: 12135, Training Loss: 0.01429
Epoch: 12135, Training Loss: 0.01432
Epoch: 12135, Training Loss: 0.01774
Epoch: 12136, Training Loss: 0.01350
Epoch: 12136, Training Loss: 0.01429
Epoch: 12136, Training Loss: 0.01431
Epoch: 12136, Training Loss: 0.01774
Epoch: 12137, Training Loss: 0.01350
Epoch: 12137, Training Loss: 0.01429
Epoch: 12137, Training Loss: 0.01431
Epoch: 12137, Training Loss: 0.01774
Epoch: 12138, Training Loss: 0.01350
Epoch: 12138, Training Loss: 0.01429
Epoch: 12138, Training Loss: 0.01431
Epoch: 12138, Training Loss: 0.01774
Epoch: 12139, Training Loss: 0.01350
Epoch: 12139, Training Loss: 0.01429
Epoch: 12139, Training Loss: 0.01431
Epoch: 12139, Training Loss: 0.01774
Epoch: 12140, Training Loss: 0.01350
Epoch: 12140, Training Loss: 0.01429
Epoch: 12140, Training Loss: 0.01431
Epoch: 12140, Training Loss: 0.01774
Epoch: 12141, Training Loss: 0.01350
Epoch: 12141, Training Loss: 0.01429
Epoch: 12141, Training Loss: 0.01431
Epoch: 12141, Training Loss: 0.01774
Epoch: 12142, Training Loss: 0.01350
Epoch: 12142, Training Loss: 0.01429
Epoch: 12142, Training Loss: 0.01431
Epoch: 12142, Training Loss: 0.01774
Epoch: 12143, Training Loss: 0.01350
Epoch: 12143, Training Loss: 0.01429
Epoch: 12143, Training Loss: 0.01431
Epoch: 12143, Training Loss: 0.01773
Epoch: 12144, Training Loss: 0.01350
Epoch: 12144, Training Loss: 0.01429
Epoch: 12144, Training Loss: 0.01431
Epoch: 12144, Training Loss: 0.01773
Epoch: 12145, Training Loss: 0.01350
Epoch: 12145, Training Loss: 0.01429
Epoch: 12145, Training Loss: 0.01431
Epoch: 12145, Training Loss: 0.01773
Epoch: 12146, Training Loss: 0.01350
Epoch: 12146, Training Loss: 0.01429
Epoch: 12146, Training Loss: 0.01431
Epoch: 12146, Training Loss: 0.01773
Epoch: 12147, Training Loss: 0.01350
Epoch: 12147, Training Loss: 0.01429
Epoch: 12147, Training Loss: 0.01431
Epoch: 12147, Training Loss: 0.01773
Epoch: 12148, Training Loss: 0.01350
Epoch: 12148, Training Loss: 0.01428
Epoch: 12148, Training Loss: 0.01431
Epoch: 12148, Training Loss: 0.01773
Epoch: 12149, Training Loss: 0.01350
Epoch: 12149, Training Loss: 0.01428
Epoch: 12149, Training Loss: 0.01430
Epoch: 12149, Training Loss: 0.01773
Epoch: 12150, Training Loss: 0.01349
Epoch: 12150, Training Loss: 0.01428
Epoch: 12150, Training Loss: 0.01430
Epoch: 12150, Training Loss: 0.01773
Epoch: 12151, Training Loss: 0.01349
Epoch: 12151, Training Loss: 0.01428
Epoch: 12151, Training Loss: 0.01430
Epoch: 12151, Training Loss: 0.01773
Epoch: 12152, Training Loss: 0.01349
Epoch: 12152, Training Loss: 0.01428
Epoch: 12152, Training Loss: 0.01430
Epoch: 12152, Training Loss: 0.01773
Epoch: 12153, Training Loss: 0.01349
Epoch: 12153, Training Loss: 0.01428
Epoch: 12153, Training Loss: 0.01430
Epoch: 12153, Training Loss: 0.01772
Epoch: 12154, Training Loss: 0.01349
Epoch: 12154, Training Loss: 0.01428
Epoch: 12154, Training Loss: 0.01430
Epoch: 12154, Training Loss: 0.01772
Epoch: 12155, Training Loss: 0.01349
Epoch: 12155, Training Loss: 0.01428
Epoch: 12155, Training Loss: 0.01430
Epoch: 12155, Training Loss: 0.01772
Epoch: 12156, Training Loss: 0.01349
Epoch: 12156, Training Loss: 0.01428
Epoch: 12156, Training Loss: 0.01430
Epoch: 12156, Training Loss: 0.01772
Epoch: 12157, Training Loss: 0.01349
Epoch: 12157, Training Loss: 0.01428
Epoch: 12157, Training Loss: 0.01430
Epoch: 12157, Training Loss: 0.01772
Epoch: 12158, Training Loss: 0.01349
Epoch: 12158, Training Loss: 0.01428
Epoch: 12158, Training Loss: 0.01430
Epoch: 12158, Training Loss: 0.01772
[... per-pattern losses for epochs 12159-12400 elided: the four XOR patterns' losses decrease very slowly, by roughly 0.0002 each over these ~240 epochs ...]
Epoch: 12401, Training Loss: 0.01332
Epoch: 12401, Training Loss: 0.01409
Epoch: 12401, Training Loss: 0.01411
Epoch: 12401, Training Loss: 0.01749
Epoch: 12402, Training Loss: 0.01332
Epoch: 12402, Training Loss: 0.01409
Epoch: 12402, Training Loss: 0.01411
Epoch: 12402, Training Loss: 0.01749
Epoch: 12403, Training Loss: 0.01332
Epoch: 12403, Training Loss: 0.01409
Epoch: 12403, Training Loss: 0.01411
Epoch: 12403, Training Loss: 0.01749
Epoch: 12404, Training Loss: 0.01332
Epoch: 12404, Training Loss: 0.01409
Epoch: 12404, Training Loss: 0.01411
Epoch: 12404, Training Loss: 0.01749
Epoch: 12405, Training Loss: 0.01332
Epoch: 12405, Training Loss: 0.01409
Epoch: 12405, Training Loss: 0.01411
Epoch: 12405, Training Loss: 0.01748
Epoch: 12406, Training Loss: 0.01332
Epoch: 12406, Training Loss: 0.01409
Epoch: 12406, Training Loss: 0.01411
Epoch: 12406, Training Loss: 0.01748
Epoch: 12407, Training Loss: 0.01332
Epoch: 12407, Training Loss: 0.01409
Epoch: 12407, Training Loss: 0.01411
Epoch: 12407, Training Loss: 0.01748
Epoch: 12408, Training Loss: 0.01332
Epoch: 12408, Training Loss: 0.01409
Epoch: 12408, Training Loss: 0.01411
Epoch: 12408, Training Loss: 0.01748
Epoch: 12409, Training Loss: 0.01332
Epoch: 12409, Training Loss: 0.01409
Epoch: 12409, Training Loss: 0.01411
Epoch: 12409, Training Loss: 0.01748
Epoch: 12410, Training Loss: 0.01332
Epoch: 12410, Training Loss: 0.01409
Epoch: 12410, Training Loss: 0.01411
Epoch: 12410, Training Loss: 0.01748
Epoch: 12411, Training Loss: 0.01332
Epoch: 12411, Training Loss: 0.01409
Epoch: 12411, Training Loss: 0.01411
Epoch: 12411, Training Loss: 0.01748
Epoch: 12412, Training Loss: 0.01332
Epoch: 12412, Training Loss: 0.01409
Epoch: 12412, Training Loss: 0.01411
Epoch: 12412, Training Loss: 0.01748
Epoch: 12413, Training Loss: 0.01331
Epoch: 12413, Training Loss: 0.01409
Epoch: 12413, Training Loss: 0.01411
Epoch: 12413, Training Loss: 0.01748
Epoch: 12414, Training Loss: 0.01331
Epoch: 12414, Training Loss: 0.01408
Epoch: 12414, Training Loss: 0.01411
Epoch: 12414, Training Loss: 0.01748
Epoch: 12415, Training Loss: 0.01331
Epoch: 12415, Training Loss: 0.01408
Epoch: 12415, Training Loss: 0.01410
Epoch: 12415, Training Loss: 0.01747
Epoch: 12416, Training Loss: 0.01331
Epoch: 12416, Training Loss: 0.01408
Epoch: 12416, Training Loss: 0.01410
Epoch: 12416, Training Loss: 0.01747
Epoch: 12417, Training Loss: 0.01331
Epoch: 12417, Training Loss: 0.01408
Epoch: 12417, Training Loss: 0.01410
Epoch: 12417, Training Loss: 0.01747
Epoch: 12418, Training Loss: 0.01331
Epoch: 12418, Training Loss: 0.01408
Epoch: 12418, Training Loss: 0.01410
Epoch: 12418, Training Loss: 0.01747
Epoch: 12419, Training Loss: 0.01331
Epoch: 12419, Training Loss: 0.01408
Epoch: 12419, Training Loss: 0.01410
Epoch: 12419, Training Loss: 0.01747
Epoch: 12420, Training Loss: 0.01331
Epoch: 12420, Training Loss: 0.01408
Epoch: 12420, Training Loss: 0.01410
Epoch: 12420, Training Loss: 0.01747
Epoch: 12421, Training Loss: 0.01331
Epoch: 12421, Training Loss: 0.01408
Epoch: 12421, Training Loss: 0.01410
Epoch: 12421, Training Loss: 0.01747
Epoch: 12422, Training Loss: 0.01331
Epoch: 12422, Training Loss: 0.01408
Epoch: 12422, Training Loss: 0.01410
Epoch: 12422, Training Loss: 0.01747
Epoch: 12423, Training Loss: 0.01331
Epoch: 12423, Training Loss: 0.01408
Epoch: 12423, Training Loss: 0.01410
Epoch: 12423, Training Loss: 0.01747
Epoch: 12424, Training Loss: 0.01331
Epoch: 12424, Training Loss: 0.01408
Epoch: 12424, Training Loss: 0.01410
Epoch: 12424, Training Loss: 0.01747
Epoch: 12425, Training Loss: 0.01331
Epoch: 12425, Training Loss: 0.01408
Epoch: 12425, Training Loss: 0.01410
Epoch: 12425, Training Loss: 0.01747
Epoch: 12426, Training Loss: 0.01331
Epoch: 12426, Training Loss: 0.01408
Epoch: 12426, Training Loss: 0.01410
Epoch: 12426, Training Loss: 0.01746
Epoch: 12427, Training Loss: 0.01331
Epoch: 12427, Training Loss: 0.01408
Epoch: 12427, Training Loss: 0.01410
Epoch: 12427, Training Loss: 0.01746
Epoch: 12428, Training Loss: 0.01330
Epoch: 12428, Training Loss: 0.01407
Epoch: 12428, Training Loss: 0.01410
Epoch: 12428, Training Loss: 0.01746
Epoch: 12429, Training Loss: 0.01330
Epoch: 12429, Training Loss: 0.01407
Epoch: 12429, Training Loss: 0.01409
Epoch: 12429, Training Loss: 0.01746
Epoch: 12430, Training Loss: 0.01330
Epoch: 12430, Training Loss: 0.01407
Epoch: 12430, Training Loss: 0.01409
Epoch: 12430, Training Loss: 0.01746
Epoch: 12431, Training Loss: 0.01330
Epoch: 12431, Training Loss: 0.01407
Epoch: 12431, Training Loss: 0.01409
Epoch: 12431, Training Loss: 0.01746
Epoch: 12432, Training Loss: 0.01330
Epoch: 12432, Training Loss: 0.01407
Epoch: 12432, Training Loss: 0.01409
Epoch: 12432, Training Loss: 0.01746
Epoch: 12433, Training Loss: 0.01330
Epoch: 12433, Training Loss: 0.01407
Epoch: 12433, Training Loss: 0.01409
Epoch: 12433, Training Loss: 0.01746
Epoch: 12434, Training Loss: 0.01330
Epoch: 12434, Training Loss: 0.01407
Epoch: 12434, Training Loss: 0.01409
Epoch: 12434, Training Loss: 0.01746
Epoch: 12435, Training Loss: 0.01330
Epoch: 12435, Training Loss: 0.01407
Epoch: 12435, Training Loss: 0.01409
Epoch: 12435, Training Loss: 0.01746
Epoch: 12436, Training Loss: 0.01330
Epoch: 12436, Training Loss: 0.01407
Epoch: 12436, Training Loss: 0.01409
Epoch: 12436, Training Loss: 0.01746
Epoch: 12437, Training Loss: 0.01330
Epoch: 12437, Training Loss: 0.01407
Epoch: 12437, Training Loss: 0.01409
Epoch: 12437, Training Loss: 0.01745
Epoch: 12438, Training Loss: 0.01330
Epoch: 12438, Training Loss: 0.01407
Epoch: 12438, Training Loss: 0.01409
Epoch: 12438, Training Loss: 0.01745
Epoch: 12439, Training Loss: 0.01330
Epoch: 12439, Training Loss: 0.01407
Epoch: 12439, Training Loss: 0.01409
Epoch: 12439, Training Loss: 0.01745
Epoch: 12440, Training Loss: 0.01330
Epoch: 12440, Training Loss: 0.01407
Epoch: 12440, Training Loss: 0.01409
Epoch: 12440, Training Loss: 0.01745
Epoch: 12441, Training Loss: 0.01330
Epoch: 12441, Training Loss: 0.01406
Epoch: 12441, Training Loss: 0.01409
Epoch: 12441, Training Loss: 0.01745
Epoch: 12442, Training Loss: 0.01330
Epoch: 12442, Training Loss: 0.01406
Epoch: 12442, Training Loss: 0.01408
Epoch: 12442, Training Loss: 0.01745
Epoch: 12443, Training Loss: 0.01329
Epoch: 12443, Training Loss: 0.01406
Epoch: 12443, Training Loss: 0.01408
Epoch: 12443, Training Loss: 0.01745
Epoch: 12444, Training Loss: 0.01329
Epoch: 12444, Training Loss: 0.01406
Epoch: 12444, Training Loss: 0.01408
Epoch: 12444, Training Loss: 0.01745
Epoch: 12445, Training Loss: 0.01329
Epoch: 12445, Training Loss: 0.01406
Epoch: 12445, Training Loss: 0.01408
Epoch: 12445, Training Loss: 0.01745
Epoch: 12446, Training Loss: 0.01329
Epoch: 12446, Training Loss: 0.01406
Epoch: 12446, Training Loss: 0.01408
Epoch: 12446, Training Loss: 0.01745
Epoch: 12447, Training Loss: 0.01329
Epoch: 12447, Training Loss: 0.01406
Epoch: 12447, Training Loss: 0.01408
Epoch: 12447, Training Loss: 0.01745
Epoch: 12448, Training Loss: 0.01329
Epoch: 12448, Training Loss: 0.01406
Epoch: 12448, Training Loss: 0.01408
Epoch: 12448, Training Loss: 0.01744
Epoch: 12449, Training Loss: 0.01329
Epoch: 12449, Training Loss: 0.01406
Epoch: 12449, Training Loss: 0.01408
Epoch: 12449, Training Loss: 0.01744
Epoch: 12450, Training Loss: 0.01329
Epoch: 12450, Training Loss: 0.01406
Epoch: 12450, Training Loss: 0.01408
Epoch: 12450, Training Loss: 0.01744
Epoch: 12451, Training Loss: 0.01329
Epoch: 12451, Training Loss: 0.01406
Epoch: 12451, Training Loss: 0.01408
Epoch: 12451, Training Loss: 0.01744
Epoch: 12452, Training Loss: 0.01329
Epoch: 12452, Training Loss: 0.01406
Epoch: 12452, Training Loss: 0.01408
Epoch: 12452, Training Loss: 0.01744
Epoch: 12453, Training Loss: 0.01329
Epoch: 12453, Training Loss: 0.01406
Epoch: 12453, Training Loss: 0.01408
Epoch: 12453, Training Loss: 0.01744
Epoch: 12454, Training Loss: 0.01329
Epoch: 12454, Training Loss: 0.01406
Epoch: 12454, Training Loss: 0.01408
Epoch: 12454, Training Loss: 0.01744
Epoch: 12455, Training Loss: 0.01329
Epoch: 12455, Training Loss: 0.01405
Epoch: 12455, Training Loss: 0.01408
Epoch: 12455, Training Loss: 0.01744
Epoch: 12456, Training Loss: 0.01329
Epoch: 12456, Training Loss: 0.01405
Epoch: 12456, Training Loss: 0.01407
Epoch: 12456, Training Loss: 0.01744
Epoch: 12457, Training Loss: 0.01329
Epoch: 12457, Training Loss: 0.01405
Epoch: 12457, Training Loss: 0.01407
Epoch: 12457, Training Loss: 0.01744
Epoch: 12458, Training Loss: 0.01328
Epoch: 12458, Training Loss: 0.01405
Epoch: 12458, Training Loss: 0.01407
Epoch: 12458, Training Loss: 0.01743
Epoch: 12459, Training Loss: 0.01328
Epoch: 12459, Training Loss: 0.01405
Epoch: 12459, Training Loss: 0.01407
Epoch: 12459, Training Loss: 0.01743
Epoch: 12460, Training Loss: 0.01328
Epoch: 12460, Training Loss: 0.01405
Epoch: 12460, Training Loss: 0.01407
Epoch: 12460, Training Loss: 0.01743
Epoch: 12461, Training Loss: 0.01328
Epoch: 12461, Training Loss: 0.01405
Epoch: 12461, Training Loss: 0.01407
Epoch: 12461, Training Loss: 0.01743
Epoch: 12462, Training Loss: 0.01328
Epoch: 12462, Training Loss: 0.01405
Epoch: 12462, Training Loss: 0.01407
Epoch: 12462, Training Loss: 0.01743
Epoch: 12463, Training Loss: 0.01328
Epoch: 12463, Training Loss: 0.01405
Epoch: 12463, Training Loss: 0.01407
Epoch: 12463, Training Loss: 0.01743
Epoch: 12464, Training Loss: 0.01328
Epoch: 12464, Training Loss: 0.01405
Epoch: 12464, Training Loss: 0.01407
Epoch: 12464, Training Loss: 0.01743
Epoch: 12465, Training Loss: 0.01328
Epoch: 12465, Training Loss: 0.01405
Epoch: 12465, Training Loss: 0.01407
Epoch: 12465, Training Loss: 0.01743
Epoch: 12466, Training Loss: 0.01328
Epoch: 12466, Training Loss: 0.01405
Epoch: 12466, Training Loss: 0.01407
Epoch: 12466, Training Loss: 0.01743
Epoch: 12467, Training Loss: 0.01328
Epoch: 12467, Training Loss: 0.01405
Epoch: 12467, Training Loss: 0.01407
Epoch: 12467, Training Loss: 0.01743
Epoch: 12468, Training Loss: 0.01328
Epoch: 12468, Training Loss: 0.01405
Epoch: 12468, Training Loss: 0.01407
Epoch: 12468, Training Loss: 0.01743
Epoch: 12469, Training Loss: 0.01328
Epoch: 12469, Training Loss: 0.01404
Epoch: 12469, Training Loss: 0.01407
Epoch: 12469, Training Loss: 0.01742
Epoch: 12470, Training Loss: 0.01328
Epoch: 12470, Training Loss: 0.01404
Epoch: 12470, Training Loss: 0.01406
Epoch: 12470, Training Loss: 0.01742
Epoch: 12471, Training Loss: 0.01328
Epoch: 12471, Training Loss: 0.01404
Epoch: 12471, Training Loss: 0.01406
Epoch: 12471, Training Loss: 0.01742
Epoch: 12472, Training Loss: 0.01328
Epoch: 12472, Training Loss: 0.01404
Epoch: 12472, Training Loss: 0.01406
Epoch: 12472, Training Loss: 0.01742
Epoch: 12473, Training Loss: 0.01327
Epoch: 12473, Training Loss: 0.01404
Epoch: 12473, Training Loss: 0.01406
Epoch: 12473, Training Loss: 0.01742
Epoch: 12474, Training Loss: 0.01327
Epoch: 12474, Training Loss: 0.01404
Epoch: 12474, Training Loss: 0.01406
Epoch: 12474, Training Loss: 0.01742
Epoch: 12475, Training Loss: 0.01327
Epoch: 12475, Training Loss: 0.01404
Epoch: 12475, Training Loss: 0.01406
Epoch: 12475, Training Loss: 0.01742
Epoch: 12476, Training Loss: 0.01327
Epoch: 12476, Training Loss: 0.01404
Epoch: 12476, Training Loss: 0.01406
Epoch: 12476, Training Loss: 0.01742
Epoch: 12477, Training Loss: 0.01327
Epoch: 12477, Training Loss: 0.01404
Epoch: 12477, Training Loss: 0.01406
Epoch: 12477, Training Loss: 0.01742
Epoch: 12478, Training Loss: 0.01327
Epoch: 12478, Training Loss: 0.01404
Epoch: 12478, Training Loss: 0.01406
Epoch: 12478, Training Loss: 0.01742
Epoch: 12479, Training Loss: 0.01327
Epoch: 12479, Training Loss: 0.01404
Epoch: 12479, Training Loss: 0.01406
Epoch: 12479, Training Loss: 0.01742
Epoch: 12480, Training Loss: 0.01327
Epoch: 12480, Training Loss: 0.01404
Epoch: 12480, Training Loss: 0.01406
Epoch: 12480, Training Loss: 0.01741
Epoch: 12481, Training Loss: 0.01327
Epoch: 12481, Training Loss: 0.01404
Epoch: 12481, Training Loss: 0.01406
Epoch: 12481, Training Loss: 0.01741
Epoch: 12482, Training Loss: 0.01327
Epoch: 12482, Training Loss: 0.01403
Epoch: 12482, Training Loss: 0.01406
Epoch: 12482, Training Loss: 0.01741
Epoch: 12483, Training Loss: 0.01327
Epoch: 12483, Training Loss: 0.01403
Epoch: 12483, Training Loss: 0.01405
Epoch: 12483, Training Loss: 0.01741
Epoch: 12484, Training Loss: 0.01327
Epoch: 12484, Training Loss: 0.01403
Epoch: 12484, Training Loss: 0.01405
Epoch: 12484, Training Loss: 0.01741
Epoch: 12485, Training Loss: 0.01327
Epoch: 12485, Training Loss: 0.01403
Epoch: 12485, Training Loss: 0.01405
Epoch: 12485, Training Loss: 0.01741
Epoch: 12486, Training Loss: 0.01327
Epoch: 12486, Training Loss: 0.01403
Epoch: 12486, Training Loss: 0.01405
Epoch: 12486, Training Loss: 0.01741
Epoch: 12487, Training Loss: 0.01327
Epoch: 12487, Training Loss: 0.01403
Epoch: 12487, Training Loss: 0.01405
Epoch: 12487, Training Loss: 0.01741
Epoch: 12488, Training Loss: 0.01326
Epoch: 12488, Training Loss: 0.01403
Epoch: 12488, Training Loss: 0.01405
Epoch: 12488, Training Loss: 0.01741
Epoch: 12489, Training Loss: 0.01326
Epoch: 12489, Training Loss: 0.01403
Epoch: 12489, Training Loss: 0.01405
Epoch: 12489, Training Loss: 0.01741
Epoch: 12490, Training Loss: 0.01326
Epoch: 12490, Training Loss: 0.01403
Epoch: 12490, Training Loss: 0.01405
Epoch: 12490, Training Loss: 0.01741
Epoch: 12491, Training Loss: 0.01326
Epoch: 12491, Training Loss: 0.01403
Epoch: 12491, Training Loss: 0.01405
Epoch: 12491, Training Loss: 0.01740
Epoch: 12492, Training Loss: 0.01326
Epoch: 12492, Training Loss: 0.01403
Epoch: 12492, Training Loss: 0.01405
Epoch: 12492, Training Loss: 0.01740
Epoch: 12493, Training Loss: 0.01326
Epoch: 12493, Training Loss: 0.01403
Epoch: 12493, Training Loss: 0.01405
Epoch: 12493, Training Loss: 0.01740
Epoch: 12494, Training Loss: 0.01326
Epoch: 12494, Training Loss: 0.01403
Epoch: 12494, Training Loss: 0.01405
Epoch: 12494, Training Loss: 0.01740
Epoch: 12495, Training Loss: 0.01326
Epoch: 12495, Training Loss: 0.01403
Epoch: 12495, Training Loss: 0.01405
Epoch: 12495, Training Loss: 0.01740
Epoch: 12496, Training Loss: 0.01326
Epoch: 12496, Training Loss: 0.01402
Epoch: 12496, Training Loss: 0.01405
Epoch: 12496, Training Loss: 0.01740
Epoch: 12497, Training Loss: 0.01326
Epoch: 12497, Training Loss: 0.01402
Epoch: 12497, Training Loss: 0.01404
Epoch: 12497, Training Loss: 0.01740
Epoch: 12498, Training Loss: 0.01326
Epoch: 12498, Training Loss: 0.01402
Epoch: 12498, Training Loss: 0.01404
Epoch: 12498, Training Loss: 0.01740
Epoch: 12499, Training Loss: 0.01326
Epoch: 12499, Training Loss: 0.01402
Epoch: 12499, Training Loss: 0.01404
Epoch: 12499, Training Loss: 0.01740
Epoch: 12500, Training Loss: 0.01326
Epoch: 12500, Training Loss: 0.01402
Epoch: 12500, Training Loss: 0.01404
Epoch: 12500, Training Loss: 0.01740
Epoch: 12501, Training Loss: 0.01326
Epoch: 12501, Training Loss: 0.01402
Epoch: 12501, Training Loss: 0.01404
Epoch: 12501, Training Loss: 0.01740
Epoch: 12502, Training Loss: 0.01326
Epoch: 12502, Training Loss: 0.01402
Epoch: 12502, Training Loss: 0.01404
Epoch: 12502, Training Loss: 0.01739
Epoch: 12503, Training Loss: 0.01325
Epoch: 12503, Training Loss: 0.01402
Epoch: 12503, Training Loss: 0.01404
Epoch: 12503, Training Loss: 0.01739
Epoch: 12504, Training Loss: 0.01325
Epoch: 12504, Training Loss: 0.01402
Epoch: 12504, Training Loss: 0.01404
Epoch: 12504, Training Loss: 0.01739
Epoch: 12505, Training Loss: 0.01325
Epoch: 12505, Training Loss: 0.01402
Epoch: 12505, Training Loss: 0.01404
Epoch: 12505, Training Loss: 0.01739
Epoch: 12506, Training Loss: 0.01325
Epoch: 12506, Training Loss: 0.01402
Epoch: 12506, Training Loss: 0.01404
Epoch: 12506, Training Loss: 0.01739
Epoch: 12507, Training Loss: 0.01325
Epoch: 12507, Training Loss: 0.01402
Epoch: 12507, Training Loss: 0.01404
Epoch: 12507, Training Loss: 0.01739
Epoch: 12508, Training Loss: 0.01325
Epoch: 12508, Training Loss: 0.01402
Epoch: 12508, Training Loss: 0.01404
Epoch: 12508, Training Loss: 0.01739
Epoch: 12509, Training Loss: 0.01325
Epoch: 12509, Training Loss: 0.01402
Epoch: 12509, Training Loss: 0.01404
Epoch: 12509, Training Loss: 0.01739
Epoch: 12510, Training Loss: 0.01325
Epoch: 12510, Training Loss: 0.01401
Epoch: 12510, Training Loss: 0.01404
Epoch: 12510, Training Loss: 0.01739
Epoch: 12511, Training Loss: 0.01325
Epoch: 12511, Training Loss: 0.01401
Epoch: 12511, Training Loss: 0.01403
Epoch: 12511, Training Loss: 0.01739
Epoch: 12512, Training Loss: 0.01325
Epoch: 12512, Training Loss: 0.01401
Epoch: 12512, Training Loss: 0.01403
Epoch: 12512, Training Loss: 0.01739
Epoch: 12513, Training Loss: 0.01325
Epoch: 12513, Training Loss: 0.01401
Epoch: 12513, Training Loss: 0.01403
Epoch: 12513, Training Loss: 0.01738
Epoch: 12514, Training Loss: 0.01325
Epoch: 12514, Training Loss: 0.01401
Epoch: 12514, Training Loss: 0.01403
Epoch: 12514, Training Loss: 0.01738
Epoch: 12515, Training Loss: 0.01325
Epoch: 12515, Training Loss: 0.01401
Epoch: 12515, Training Loss: 0.01403
Epoch: 12515, Training Loss: 0.01738
Epoch: 12516, Training Loss: 0.01325
Epoch: 12516, Training Loss: 0.01401
Epoch: 12516, Training Loss: 0.01403
Epoch: 12516, Training Loss: 0.01738
Epoch: 12517, Training Loss: 0.01325
Epoch: 12517, Training Loss: 0.01401
Epoch: 12517, Training Loss: 0.01403
Epoch: 12517, Training Loss: 0.01738
Epoch: 12518, Training Loss: 0.01324
Epoch: 12518, Training Loss: 0.01401
Epoch: 12518, Training Loss: 0.01403
Epoch: 12518, Training Loss: 0.01738
Epoch: 12519, Training Loss: 0.01324
Epoch: 12519, Training Loss: 0.01401
Epoch: 12519, Training Loss: 0.01403
Epoch: 12519, Training Loss: 0.01738
Epoch: 12520, Training Loss: 0.01324
Epoch: 12520, Training Loss: 0.01401
Epoch: 12520, Training Loss: 0.01403
Epoch: 12520, Training Loss: 0.01738
Epoch: 12521, Training Loss: 0.01324
Epoch: 12521, Training Loss: 0.01401
Epoch: 12521, Training Loss: 0.01403
Epoch: 12521, Training Loss: 0.01738
Epoch: 12522, Training Loss: 0.01324
Epoch: 12522, Training Loss: 0.01401
Epoch: 12522, Training Loss: 0.01403
Epoch: 12522, Training Loss: 0.01738
Epoch: 12523, Training Loss: 0.01324
Epoch: 12523, Training Loss: 0.01401
Epoch: 12523, Training Loss: 0.01403
Epoch: 12523, Training Loss: 0.01738
Epoch: 12524, Training Loss: 0.01324
Epoch: 12524, Training Loss: 0.01400
Epoch: 12524, Training Loss: 0.01402
Epoch: 12524, Training Loss: 0.01737
Epoch: 12525, Training Loss: 0.01324
Epoch: 12525, Training Loss: 0.01400
Epoch: 12525, Training Loss: 0.01402
Epoch: 12525, Training Loss: 0.01737
Epoch: 12526, Training Loss: 0.01324
Epoch: 12526, Training Loss: 0.01400
Epoch: 12526, Training Loss: 0.01402
Epoch: 12526, Training Loss: 0.01737
Epoch: 12527, Training Loss: 0.01324
Epoch: 12527, Training Loss: 0.01400
Epoch: 12527, Training Loss: 0.01402
Epoch: 12527, Training Loss: 0.01737
Epoch: 12528, Training Loss: 0.01324
Epoch: 12528, Training Loss: 0.01400
Epoch: 12528, Training Loss: 0.01402
Epoch: 12528, Training Loss: 0.01737
Epoch: 12529, Training Loss: 0.01324
Epoch: 12529, Training Loss: 0.01400
Epoch: 12529, Training Loss: 0.01402
Epoch: 12529, Training Loss: 0.01737
Epoch: 12530, Training Loss: 0.01324
Epoch: 12530, Training Loss: 0.01400
Epoch: 12530, Training Loss: 0.01402
Epoch: 12530, Training Loss: 0.01737
Epoch: 12531, Training Loss: 0.01324
Epoch: 12531, Training Loss: 0.01400
Epoch: 12531, Training Loss: 0.01402
Epoch: 12531, Training Loss: 0.01737
Epoch: 12532, Training Loss: 0.01324
Epoch: 12532, Training Loss: 0.01400
Epoch: 12532, Training Loss: 0.01402
Epoch: 12532, Training Loss: 0.01737
Epoch: 12533, Training Loss: 0.01323
Epoch: 12533, Training Loss: 0.01400
Epoch: 12533, Training Loss: 0.01402
Epoch: 12533, Training Loss: 0.01737
Epoch: 12534, Training Loss: 0.01323
Epoch: 12534, Training Loss: 0.01400
Epoch: 12534, Training Loss: 0.01402
Epoch: 12534, Training Loss: 0.01736
Epoch: 12535, Training Loss: 0.01323
Epoch: 12535, Training Loss: 0.01400
Epoch: 12535, Training Loss: 0.01402
Epoch: 12535, Training Loss: 0.01736
Epoch: 12536, Training Loss: 0.01323
Epoch: 12536, Training Loss: 0.01400
Epoch: 12536, Training Loss: 0.01402
Epoch: 12536, Training Loss: 0.01736
Epoch: 12537, Training Loss: 0.01323
Epoch: 12537, Training Loss: 0.01400
Epoch: 12537, Training Loss: 0.01402
Epoch: 12537, Training Loss: 0.01736
Epoch: 12538, Training Loss: 0.01323
Epoch: 12538, Training Loss: 0.01399
Epoch: 12538, Training Loss: 0.01401
Epoch: 12538, Training Loss: 0.01736
Epoch: 12539, Training Loss: 0.01323
Epoch: 12539, Training Loss: 0.01399
Epoch: 12539, Training Loss: 0.01401
Epoch: 12539, Training Loss: 0.01736
Epoch: 12540, Training Loss: 0.01323
Epoch: 12540, Training Loss: 0.01399
Epoch: 12540, Training Loss: 0.01401
Epoch: 12540, Training Loss: 0.01736
Epoch: 12541, Training Loss: 0.01323
Epoch: 12541, Training Loss: 0.01399
Epoch: 12541, Training Loss: 0.01401
Epoch: 12541, Training Loss: 0.01736
Epoch: 12542, Training Loss: 0.01323
Epoch: 12542, Training Loss: 0.01399
Epoch: 12542, Training Loss: 0.01401
Epoch: 12542, Training Loss: 0.01736
Epoch: 12543, Training Loss: 0.01323
Epoch: 12543, Training Loss: 0.01399
Epoch: 12543, Training Loss: 0.01401
Epoch: 12543, Training Loss: 0.01736
Epoch: 12544, Training Loss: 0.01323
Epoch: 12544, Training Loss: 0.01399
Epoch: 12544, Training Loss: 0.01401
Epoch: 12544, Training Loss: 0.01736
Epoch: 12545, Training Loss: 0.01323
Epoch: 12545, Training Loss: 0.01399
Epoch: 12545, Training Loss: 0.01401
Epoch: 12545, Training Loss: 0.01735
Epoch: 12546, Training Loss: 0.01323
Epoch: 12546, Training Loss: 0.01399
Epoch: 12546, Training Loss: 0.01401
Epoch: 12546, Training Loss: 0.01735
Epoch: 12547, Training Loss: 0.01323
Epoch: 12547, Training Loss: 0.01399
Epoch: 12547, Training Loss: 0.01401
Epoch: 12547, Training Loss: 0.01735
Epoch: 12548, Training Loss: 0.01323
Epoch: 12548, Training Loss: 0.01399
Epoch: 12548, Training Loss: 0.01401
Epoch: 12548, Training Loss: 0.01735
Epoch: 12549, Training Loss: 0.01322
Epoch: 12549, Training Loss: 0.01399
Epoch: 12549, Training Loss: 0.01401
Epoch: 12549, Training Loss: 0.01735
Epoch: 12550, Training Loss: 0.01322
Epoch: 12550, Training Loss: 0.01399
Epoch: 12550, Training Loss: 0.01401
Epoch: 12550, Training Loss: 0.01735
Epoch: 12551, Training Loss: 0.01322
Epoch: 12551, Training Loss: 0.01398
Epoch: 12551, Training Loss: 0.01401
Epoch: 12551, Training Loss: 0.01735
Epoch: 12552, Training Loss: 0.01322
Epoch: 12552, Training Loss: 0.01398
Epoch: 12552, Training Loss: 0.01400
Epoch: 12552, Training Loss: 0.01735
Epoch: 12553, Training Loss: 0.01322
Epoch: 12553, Training Loss: 0.01398
Epoch: 12553, Training Loss: 0.01400
Epoch: 12553, Training Loss: 0.01735
Epoch: 12554, Training Loss: 0.01322
Epoch: 12554, Training Loss: 0.01398
Epoch: 12554, Training Loss: 0.01400
Epoch: 12554, Training Loss: 0.01735
Epoch: 12555, Training Loss: 0.01322
Epoch: 12555, Training Loss: 0.01398
Epoch: 12555, Training Loss: 0.01400
Epoch: 12555, Training Loss: 0.01735
Epoch: 12556, Training Loss: 0.01322
Epoch: 12556, Training Loss: 0.01398
Epoch: 12556, Training Loss: 0.01400
Epoch: 12556, Training Loss: 0.01734
Epoch: 12557, Training Loss: 0.01322
Epoch: 12557, Training Loss: 0.01398
Epoch: 12557, Training Loss: 0.01400
Epoch: 12557, Training Loss: 0.01734
Epoch: 12558, Training Loss: 0.01322
Epoch: 12558, Training Loss: 0.01398
Epoch: 12558, Training Loss: 0.01400
Epoch: 12558, Training Loss: 0.01734
Epoch: 12559, Training Loss: 0.01322
Epoch: 12559, Training Loss: 0.01398
Epoch: 12559, Training Loss: 0.01400
Epoch: 12559, Training Loss: 0.01734
Epoch: 12560, Training Loss: 0.01322
Epoch: 12560, Training Loss: 0.01398
Epoch: 12560, Training Loss: 0.01400
Epoch: 12560, Training Loss: 0.01734
Epoch: 12561, Training Loss: 0.01322
Epoch: 12561, Training Loss: 0.01398
Epoch: 12561, Training Loss: 0.01400
Epoch: 12561, Training Loss: 0.01734
Epoch: 12562, Training Loss: 0.01322
Epoch: 12562, Training Loss: 0.01398
Epoch: 12562, Training Loss: 0.01400
Epoch: 12562, Training Loss: 0.01734
Epoch: 12563, Training Loss: 0.01322
Epoch: 12563, Training Loss: 0.01398
Epoch: 12563, Training Loss: 0.01400
Epoch: 12563, Training Loss: 0.01734
Epoch: 12564, Training Loss: 0.01321
Epoch: 12564, Training Loss: 0.01398
Epoch: 12564, Training Loss: 0.01400
Epoch: 12564, Training Loss: 0.01734
Epoch: 12565, Training Loss: 0.01321
Epoch: 12565, Training Loss: 0.01397
Epoch: 12565, Training Loss: 0.01400
Epoch: 12565, Training Loss: 0.01734
Epoch: 12566, Training Loss: 0.01321
Epoch: 12566, Training Loss: 0.01397
Epoch: 12566, Training Loss: 0.01399
Epoch: 12566, Training Loss: 0.01734
Epoch: 12567, Training Loss: 0.01321
Epoch: 12567, Training Loss: 0.01397
Epoch: 12567, Training Loss: 0.01399
Epoch: 12567, Training Loss: 0.01733
Epoch: 12568, Training Loss: 0.01321
Epoch: 12568, Training Loss: 0.01397
Epoch: 12568, Training Loss: 0.01399
Epoch: 12568, Training Loss: 0.01733
Epoch: 12569, Training Loss: 0.01321
Epoch: 12569, Training Loss: 0.01397
Epoch: 12569, Training Loss: 0.01399
Epoch: 12569, Training Loss: 0.01733
Epoch: 12570, Training Loss: 0.01321
Epoch: 12570, Training Loss: 0.01397
Epoch: 12570, Training Loss: 0.01399
Epoch: 12570, Training Loss: 0.01733
Epoch: 12571, Training Loss: 0.01321
Epoch: 12571, Training Loss: 0.01397
Epoch: 12571, Training Loss: 0.01399
Epoch: 12571, Training Loss: 0.01733
Epoch: 12572, Training Loss: 0.01321
Epoch: 12572, Training Loss: 0.01397
Epoch: 12572, Training Loss: 0.01399
Epoch: 12572, Training Loss: 0.01733
Epoch: 12573, Training Loss: 0.01321
Epoch: 12573, Training Loss: 0.01397
Epoch: 12573, Training Loss: 0.01399
Epoch: 12573, Training Loss: 0.01733
Epoch: 12574, Training Loss: 0.01321
Epoch: 12574, Training Loss: 0.01397
Epoch: 12574, Training Loss: 0.01399
Epoch: 12574, Training Loss: 0.01733
Epoch: 12575, Training Loss: 0.01321
Epoch: 12575, Training Loss: 0.01397
Epoch: 12575, Training Loss: 0.01399
Epoch: 12575, Training Loss: 0.01733
Epoch: 12576, Training Loss: 0.01321
Epoch: 12576, Training Loss: 0.01397
Epoch: 12576, Training Loss: 0.01399
Epoch: 12576, Training Loss: 0.01733
Epoch: 12577, Training Loss: 0.01321
Epoch: 12577, Training Loss: 0.01397
Epoch: 12577, Training Loss: 0.01399
Epoch: 12577, Training Loss: 0.01733
Epoch: 12578, Training Loss: 0.01321
Epoch: 12578, Training Loss: 0.01397
Epoch: 12578, Training Loss: 0.01399
Epoch: 12578, Training Loss: 0.01732
Epoch: 12579, Training Loss: 0.01320
Epoch: 12579, Training Loss: 0.01396
Epoch: 12579, Training Loss: 0.01399
Epoch: 12579, Training Loss: 0.01732
Epoch: 12580, Training Loss: 0.01320
Epoch: 12580, Training Loss: 0.01396
Epoch: 12580, Training Loss: 0.01398
Epoch: 12580, Training Loss: 0.01732
Epoch: 12581, Training Loss: 0.01320
Epoch: 12581, Training Loss: 0.01396
Epoch: 12581, Training Loss: 0.01398
Epoch: 12581, Training Loss: 0.01732
Epoch: 12582, Training Loss: 0.01320
Epoch: 12582, Training Loss: 0.01396
Epoch: 12582, Training Loss: 0.01398
Epoch: 12582, Training Loss: 0.01732
Epoch: 12583, Training Loss: 0.01320
Epoch: 12583, Training Loss: 0.01396
Epoch: 12583, Training Loss: 0.01398
Epoch: 12583, Training Loss: 0.01732
Epoch: 12584, Training Loss: 0.01320
Epoch: 12584, Training Loss: 0.01396
Epoch: 12584, Training Loss: 0.01398
Epoch: 12584, Training Loss: 0.01732
Epoch: 12585, Training Loss: 0.01320
Epoch: 12585, Training Loss: 0.01396
Epoch: 12585, Training Loss: 0.01398
Epoch: 12585, Training Loss: 0.01732
Epoch: 12586, Training Loss: 0.01320
Epoch: 12586, Training Loss: 0.01396
Epoch: 12586, Training Loss: 0.01398
Epoch: 12586, Training Loss: 0.01732
Epoch: 12587, Training Loss: 0.01320
Epoch: 12587, Training Loss: 0.01396
Epoch: 12587, Training Loss: 0.01398
Epoch: 12587, Training Loss: 0.01732
Epoch: 12588, Training Loss: 0.01320
Epoch: 12588, Training Loss: 0.01396
Epoch: 12588, Training Loss: 0.01398
Epoch: 12588, Training Loss: 0.01732
Epoch: 12589, Training Loss: 0.01320
Epoch: 12589, Training Loss: 0.01396
Epoch: 12589, Training Loss: 0.01398
Epoch: 12589, Training Loss: 0.01731
Epoch: 12590, Training Loss: 0.01320
Epoch: 12590, Training Loss: 0.01396
Epoch: 12590, Training Loss: 0.01398
Epoch: 12590, Training Loss: 0.01731
...
Epoch: 12834, Training Loss: 0.01304
[output truncated: each epoch prints four loss values, one per XOR training pattern; across epochs 12590–12834 they decrease very slowly, from roughly (0.0132, 0.0140, 0.0140, 0.0173) to (0.0130, 0.0138, 0.0138, 0.0171)]
Epoch: 12834, Training Loss: 0.01379
Epoch: 12834, Training Loss: 0.01381
Epoch: 12834, Training Loss: 0.01710
Epoch: 12835, Training Loss: 0.01304
Epoch: 12835, Training Loss: 0.01378
Epoch: 12835, Training Loss: 0.01380
Epoch: 12835, Training Loss: 0.01710
Epoch: 12836, Training Loss: 0.01304
Epoch: 12836, Training Loss: 0.01378
Epoch: 12836, Training Loss: 0.01380
Epoch: 12836, Training Loss: 0.01709
Epoch: 12837, Training Loss: 0.01304
Epoch: 12837, Training Loss: 0.01378
Epoch: 12837, Training Loss: 0.01380
Epoch: 12837, Training Loss: 0.01709
Epoch: 12838, Training Loss: 0.01304
Epoch: 12838, Training Loss: 0.01378
Epoch: 12838, Training Loss: 0.01380
Epoch: 12838, Training Loss: 0.01709
Epoch: 12839, Training Loss: 0.01304
Epoch: 12839, Training Loss: 0.01378
Epoch: 12839, Training Loss: 0.01380
Epoch: 12839, Training Loss: 0.01709
Epoch: 12840, Training Loss: 0.01304
Epoch: 12840, Training Loss: 0.01378
Epoch: 12840, Training Loss: 0.01380
Epoch: 12840, Training Loss: 0.01709
Epoch: 12841, Training Loss: 0.01304
Epoch: 12841, Training Loss: 0.01378
Epoch: 12841, Training Loss: 0.01380
Epoch: 12841, Training Loss: 0.01709
Epoch: 12842, Training Loss: 0.01304
Epoch: 12842, Training Loss: 0.01378
Epoch: 12842, Training Loss: 0.01380
Epoch: 12842, Training Loss: 0.01709
Epoch: 12843, Training Loss: 0.01304
Epoch: 12843, Training Loss: 0.01378
Epoch: 12843, Training Loss: 0.01380
Epoch: 12843, Training Loss: 0.01709
Epoch: 12844, Training Loss: 0.01303
Epoch: 12844, Training Loss: 0.01378
Epoch: 12844, Training Loss: 0.01380
Epoch: 12844, Training Loss: 0.01709
Epoch: 12845, Training Loss: 0.01303
Epoch: 12845, Training Loss: 0.01378
Epoch: 12845, Training Loss: 0.01380
Epoch: 12845, Training Loss: 0.01709
Epoch: 12846, Training Loss: 0.01303
Epoch: 12846, Training Loss: 0.01378
Epoch: 12846, Training Loss: 0.01380
Epoch: 12846, Training Loss: 0.01709
Epoch: 12847, Training Loss: 0.01303
Epoch: 12847, Training Loss: 0.01378
Epoch: 12847, Training Loss: 0.01380
Epoch: 12847, Training Loss: 0.01708
Epoch: 12848, Training Loss: 0.01303
Epoch: 12848, Training Loss: 0.01378
Epoch: 12848, Training Loss: 0.01380
Epoch: 12848, Training Loss: 0.01708
Epoch: 12849, Training Loss: 0.01303
Epoch: 12849, Training Loss: 0.01377
Epoch: 12849, Training Loss: 0.01380
Epoch: 12849, Training Loss: 0.01708
Epoch: 12850, Training Loss: 0.01303
Epoch: 12850, Training Loss: 0.01377
Epoch: 12850, Training Loss: 0.01379
Epoch: 12850, Training Loss: 0.01708
Epoch: 12851, Training Loss: 0.01303
Epoch: 12851, Training Loss: 0.01377
Epoch: 12851, Training Loss: 0.01379
Epoch: 12851, Training Loss: 0.01708
Epoch: 12852, Training Loss: 0.01303
Epoch: 12852, Training Loss: 0.01377
Epoch: 12852, Training Loss: 0.01379
Epoch: 12852, Training Loss: 0.01708
Epoch: 12853, Training Loss: 0.01303
Epoch: 12853, Training Loss: 0.01377
Epoch: 12853, Training Loss: 0.01379
Epoch: 12853, Training Loss: 0.01708
Epoch: 12854, Training Loss: 0.01303
Epoch: 12854, Training Loss: 0.01377
Epoch: 12854, Training Loss: 0.01379
Epoch: 12854, Training Loss: 0.01708
Epoch: 12855, Training Loss: 0.01303
Epoch: 12855, Training Loss: 0.01377
Epoch: 12855, Training Loss: 0.01379
Epoch: 12855, Training Loss: 0.01708
Epoch: 12856, Training Loss: 0.01303
Epoch: 12856, Training Loss: 0.01377
Epoch: 12856, Training Loss: 0.01379
Epoch: 12856, Training Loss: 0.01708
Epoch: 12857, Training Loss: 0.01303
Epoch: 12857, Training Loss: 0.01377
Epoch: 12857, Training Loss: 0.01379
Epoch: 12857, Training Loss: 0.01708
Epoch: 12858, Training Loss: 0.01303
Epoch: 12858, Training Loss: 0.01377
Epoch: 12858, Training Loss: 0.01379
Epoch: 12858, Training Loss: 0.01708
Epoch: 12859, Training Loss: 0.01303
Epoch: 12859, Training Loss: 0.01377
Epoch: 12859, Training Loss: 0.01379
Epoch: 12859, Training Loss: 0.01707
Epoch: 12860, Training Loss: 0.01302
Epoch: 12860, Training Loss: 0.01377
Epoch: 12860, Training Loss: 0.01379
Epoch: 12860, Training Loss: 0.01707
Epoch: 12861, Training Loss: 0.01302
Epoch: 12861, Training Loss: 0.01377
Epoch: 12861, Training Loss: 0.01379
Epoch: 12861, Training Loss: 0.01707
Epoch: 12862, Training Loss: 0.01302
Epoch: 12862, Training Loss: 0.01377
Epoch: 12862, Training Loss: 0.01379
Epoch: 12862, Training Loss: 0.01707
Epoch: 12863, Training Loss: 0.01302
Epoch: 12863, Training Loss: 0.01377
Epoch: 12863, Training Loss: 0.01379
Epoch: 12863, Training Loss: 0.01707
Epoch: 12864, Training Loss: 0.01302
Epoch: 12864, Training Loss: 0.01376
Epoch: 12864, Training Loss: 0.01378
Epoch: 12864, Training Loss: 0.01707
Epoch: 12865, Training Loss: 0.01302
Epoch: 12865, Training Loss: 0.01376
Epoch: 12865, Training Loss: 0.01378
Epoch: 12865, Training Loss: 0.01707
Epoch: 12866, Training Loss: 0.01302
Epoch: 12866, Training Loss: 0.01376
Epoch: 12866, Training Loss: 0.01378
Epoch: 12866, Training Loss: 0.01707
Epoch: 12867, Training Loss: 0.01302
Epoch: 12867, Training Loss: 0.01376
Epoch: 12867, Training Loss: 0.01378
Epoch: 12867, Training Loss: 0.01707
Epoch: 12868, Training Loss: 0.01302
Epoch: 12868, Training Loss: 0.01376
Epoch: 12868, Training Loss: 0.01378
Epoch: 12868, Training Loss: 0.01707
Epoch: 12869, Training Loss: 0.01302
Epoch: 12869, Training Loss: 0.01376
Epoch: 12869, Training Loss: 0.01378
Epoch: 12869, Training Loss: 0.01707
Epoch: 12870, Training Loss: 0.01302
Epoch: 12870, Training Loss: 0.01376
Epoch: 12870, Training Loss: 0.01378
Epoch: 12870, Training Loss: 0.01706
Epoch: 12871, Training Loss: 0.01302
Epoch: 12871, Training Loss: 0.01376
Epoch: 12871, Training Loss: 0.01378
Epoch: 12871, Training Loss: 0.01706
Epoch: 12872, Training Loss: 0.01302
Epoch: 12872, Training Loss: 0.01376
Epoch: 12872, Training Loss: 0.01378
Epoch: 12872, Training Loss: 0.01706
Epoch: 12873, Training Loss: 0.01302
Epoch: 12873, Training Loss: 0.01376
Epoch: 12873, Training Loss: 0.01378
Epoch: 12873, Training Loss: 0.01706
Epoch: 12874, Training Loss: 0.01302
Epoch: 12874, Training Loss: 0.01376
Epoch: 12874, Training Loss: 0.01378
Epoch: 12874, Training Loss: 0.01706
Epoch: 12875, Training Loss: 0.01302
Epoch: 12875, Training Loss: 0.01376
Epoch: 12875, Training Loss: 0.01378
Epoch: 12875, Training Loss: 0.01706
Epoch: 12876, Training Loss: 0.01301
Epoch: 12876, Training Loss: 0.01376
Epoch: 12876, Training Loss: 0.01378
Epoch: 12876, Training Loss: 0.01706
Epoch: 12877, Training Loss: 0.01301
Epoch: 12877, Training Loss: 0.01376
Epoch: 12877, Training Loss: 0.01378
Epoch: 12877, Training Loss: 0.01706
Epoch: 12878, Training Loss: 0.01301
Epoch: 12878, Training Loss: 0.01376
Epoch: 12878, Training Loss: 0.01378
Epoch: 12878, Training Loss: 0.01706
Epoch: 12879, Training Loss: 0.01301
Epoch: 12879, Training Loss: 0.01375
Epoch: 12879, Training Loss: 0.01377
Epoch: 12879, Training Loss: 0.01706
Epoch: 12880, Training Loss: 0.01301
Epoch: 12880, Training Loss: 0.01375
Epoch: 12880, Training Loss: 0.01377
Epoch: 12880, Training Loss: 0.01706
Epoch: 12881, Training Loss: 0.01301
Epoch: 12881, Training Loss: 0.01375
Epoch: 12881, Training Loss: 0.01377
Epoch: 12881, Training Loss: 0.01706
Epoch: 12882, Training Loss: 0.01301
Epoch: 12882, Training Loss: 0.01375
Epoch: 12882, Training Loss: 0.01377
Epoch: 12882, Training Loss: 0.01705
Epoch: 12883, Training Loss: 0.01301
Epoch: 12883, Training Loss: 0.01375
Epoch: 12883, Training Loss: 0.01377
Epoch: 12883, Training Loss: 0.01705
Epoch: 12884, Training Loss: 0.01301
Epoch: 12884, Training Loss: 0.01375
Epoch: 12884, Training Loss: 0.01377
Epoch: 12884, Training Loss: 0.01705
Epoch: 12885, Training Loss: 0.01301
Epoch: 12885, Training Loss: 0.01375
Epoch: 12885, Training Loss: 0.01377
Epoch: 12885, Training Loss: 0.01705
Epoch: 12886, Training Loss: 0.01301
Epoch: 12886, Training Loss: 0.01375
Epoch: 12886, Training Loss: 0.01377
Epoch: 12886, Training Loss: 0.01705
Epoch: 12887, Training Loss: 0.01301
Epoch: 12887, Training Loss: 0.01375
Epoch: 12887, Training Loss: 0.01377
Epoch: 12887, Training Loss: 0.01705
Epoch: 12888, Training Loss: 0.01301
Epoch: 12888, Training Loss: 0.01375
Epoch: 12888, Training Loss: 0.01377
Epoch: 12888, Training Loss: 0.01705
Epoch: 12889, Training Loss: 0.01301
Epoch: 12889, Training Loss: 0.01375
Epoch: 12889, Training Loss: 0.01377
Epoch: 12889, Training Loss: 0.01705
Epoch: 12890, Training Loss: 0.01301
Epoch: 12890, Training Loss: 0.01375
Epoch: 12890, Training Loss: 0.01377
Epoch: 12890, Training Loss: 0.01705
Epoch: 12891, Training Loss: 0.01301
Epoch: 12891, Training Loss: 0.01375
Epoch: 12891, Training Loss: 0.01377
Epoch: 12891, Training Loss: 0.01705
Epoch: 12892, Training Loss: 0.01300
Epoch: 12892, Training Loss: 0.01375
Epoch: 12892, Training Loss: 0.01377
Epoch: 12892, Training Loss: 0.01705
Epoch: 12893, Training Loss: 0.01300
Epoch: 12893, Training Loss: 0.01374
Epoch: 12893, Training Loss: 0.01376
Epoch: 12893, Training Loss: 0.01704
Epoch: 12894, Training Loss: 0.01300
Epoch: 12894, Training Loss: 0.01374
Epoch: 12894, Training Loss: 0.01376
Epoch: 12894, Training Loss: 0.01704
Epoch: 12895, Training Loss: 0.01300
Epoch: 12895, Training Loss: 0.01374
Epoch: 12895, Training Loss: 0.01376
Epoch: 12895, Training Loss: 0.01704
Epoch: 12896, Training Loss: 0.01300
Epoch: 12896, Training Loss: 0.01374
Epoch: 12896, Training Loss: 0.01376
Epoch: 12896, Training Loss: 0.01704
Epoch: 12897, Training Loss: 0.01300
Epoch: 12897, Training Loss: 0.01374
Epoch: 12897, Training Loss: 0.01376
Epoch: 12897, Training Loss: 0.01704
Epoch: 12898, Training Loss: 0.01300
Epoch: 12898, Training Loss: 0.01374
Epoch: 12898, Training Loss: 0.01376
Epoch: 12898, Training Loss: 0.01704
Epoch: 12899, Training Loss: 0.01300
Epoch: 12899, Training Loss: 0.01374
Epoch: 12899, Training Loss: 0.01376
Epoch: 12899, Training Loss: 0.01704
Epoch: 12900, Training Loss: 0.01300
Epoch: 12900, Training Loss: 0.01374
Epoch: 12900, Training Loss: 0.01376
Epoch: 12900, Training Loss: 0.01704
Epoch: 12901, Training Loss: 0.01300
Epoch: 12901, Training Loss: 0.01374
Epoch: 12901, Training Loss: 0.01376
Epoch: 12901, Training Loss: 0.01704
Epoch: 12902, Training Loss: 0.01300
Epoch: 12902, Training Loss: 0.01374
Epoch: 12902, Training Loss: 0.01376
Epoch: 12902, Training Loss: 0.01704
Epoch: 12903, Training Loss: 0.01300
Epoch: 12903, Training Loss: 0.01374
Epoch: 12903, Training Loss: 0.01376
Epoch: 12903, Training Loss: 0.01704
Epoch: 12904, Training Loss: 0.01300
Epoch: 12904, Training Loss: 0.01374
Epoch: 12904, Training Loss: 0.01376
Epoch: 12904, Training Loss: 0.01704
Epoch: 12905, Training Loss: 0.01300
Epoch: 12905, Training Loss: 0.01374
Epoch: 12905, Training Loss: 0.01376
Epoch: 12905, Training Loss: 0.01703
Epoch: 12906, Training Loss: 0.01300
Epoch: 12906, Training Loss: 0.01374
Epoch: 12906, Training Loss: 0.01376
Epoch: 12906, Training Loss: 0.01703
Epoch: 12907, Training Loss: 0.01300
Epoch: 12907, Training Loss: 0.01374
Epoch: 12907, Training Loss: 0.01376
Epoch: 12907, Training Loss: 0.01703
Epoch: 12908, Training Loss: 0.01299
Epoch: 12908, Training Loss: 0.01373
Epoch: 12908, Training Loss: 0.01375
Epoch: 12908, Training Loss: 0.01703
Epoch: 12909, Training Loss: 0.01299
Epoch: 12909, Training Loss: 0.01373
Epoch: 12909, Training Loss: 0.01375
Epoch: 12909, Training Loss: 0.01703
Epoch: 12910, Training Loss: 0.01299
Epoch: 12910, Training Loss: 0.01373
Epoch: 12910, Training Loss: 0.01375
Epoch: 12910, Training Loss: 0.01703
Epoch: 12911, Training Loss: 0.01299
Epoch: 12911, Training Loss: 0.01373
Epoch: 12911, Training Loss: 0.01375
Epoch: 12911, Training Loss: 0.01703
Epoch: 12912, Training Loss: 0.01299
Epoch: 12912, Training Loss: 0.01373
Epoch: 12912, Training Loss: 0.01375
Epoch: 12912, Training Loss: 0.01703
Epoch: 12913, Training Loss: 0.01299
Epoch: 12913, Training Loss: 0.01373
Epoch: 12913, Training Loss: 0.01375
Epoch: 12913, Training Loss: 0.01703
Epoch: 12914, Training Loss: 0.01299
Epoch: 12914, Training Loss: 0.01373
Epoch: 12914, Training Loss: 0.01375
Epoch: 12914, Training Loss: 0.01703
Epoch: 12915, Training Loss: 0.01299
Epoch: 12915, Training Loss: 0.01373
Epoch: 12915, Training Loss: 0.01375
Epoch: 12915, Training Loss: 0.01703
Epoch: 12916, Training Loss: 0.01299
Epoch: 12916, Training Loss: 0.01373
Epoch: 12916, Training Loss: 0.01375
Epoch: 12916, Training Loss: 0.01703
Epoch: 12917, Training Loss: 0.01299
Epoch: 12917, Training Loss: 0.01373
Epoch: 12917, Training Loss: 0.01375
Epoch: 12917, Training Loss: 0.01702
Epoch: 12918, Training Loss: 0.01299
Epoch: 12918, Training Loss: 0.01373
Epoch: 12918, Training Loss: 0.01375
Epoch: 12918, Training Loss: 0.01702
Epoch: 12919, Training Loss: 0.01299
Epoch: 12919, Training Loss: 0.01373
Epoch: 12919, Training Loss: 0.01375
Epoch: 12919, Training Loss: 0.01702
Epoch: 12920, Training Loss: 0.01299
Epoch: 12920, Training Loss: 0.01373
Epoch: 12920, Training Loss: 0.01375
Epoch: 12920, Training Loss: 0.01702
Epoch: 12921, Training Loss: 0.01299
Epoch: 12921, Training Loss: 0.01373
Epoch: 12921, Training Loss: 0.01375
Epoch: 12921, Training Loss: 0.01702
Epoch: 12922, Training Loss: 0.01299
Epoch: 12922, Training Loss: 0.01372
Epoch: 12922, Training Loss: 0.01374
Epoch: 12922, Training Loss: 0.01702
Epoch: 12923, Training Loss: 0.01299
Epoch: 12923, Training Loss: 0.01372
Epoch: 12923, Training Loss: 0.01374
Epoch: 12923, Training Loss: 0.01702
Epoch: 12924, Training Loss: 0.01298
Epoch: 12924, Training Loss: 0.01372
Epoch: 12924, Training Loss: 0.01374
Epoch: 12924, Training Loss: 0.01702
Epoch: 12925, Training Loss: 0.01298
Epoch: 12925, Training Loss: 0.01372
Epoch: 12925, Training Loss: 0.01374
Epoch: 12925, Training Loss: 0.01702
Epoch: 12926, Training Loss: 0.01298
Epoch: 12926, Training Loss: 0.01372
Epoch: 12926, Training Loss: 0.01374
Epoch: 12926, Training Loss: 0.01702
Epoch: 12927, Training Loss: 0.01298
Epoch: 12927, Training Loss: 0.01372
Epoch: 12927, Training Loss: 0.01374
Epoch: 12927, Training Loss: 0.01702
Epoch: 12928, Training Loss: 0.01298
Epoch: 12928, Training Loss: 0.01372
Epoch: 12928, Training Loss: 0.01374
Epoch: 12928, Training Loss: 0.01701
Epoch: 12929, Training Loss: 0.01298
Epoch: 12929, Training Loss: 0.01372
Epoch: 12929, Training Loss: 0.01374
Epoch: 12929, Training Loss: 0.01701
Epoch: 12930, Training Loss: 0.01298
Epoch: 12930, Training Loss: 0.01372
Epoch: 12930, Training Loss: 0.01374
Epoch: 12930, Training Loss: 0.01701
Epoch: 12931, Training Loss: 0.01298
Epoch: 12931, Training Loss: 0.01372
Epoch: 12931, Training Loss: 0.01374
Epoch: 12931, Training Loss: 0.01701
Epoch: 12932, Training Loss: 0.01298
Epoch: 12932, Training Loss: 0.01372
Epoch: 12932, Training Loss: 0.01374
Epoch: 12932, Training Loss: 0.01701
Epoch: 12933, Training Loss: 0.01298
Epoch: 12933, Training Loss: 0.01372
Epoch: 12933, Training Loss: 0.01374
Epoch: 12933, Training Loss: 0.01701
Epoch: 12934, Training Loss: 0.01298
Epoch: 12934, Training Loss: 0.01372
Epoch: 12934, Training Loss: 0.01374
Epoch: 12934, Training Loss: 0.01701
Epoch: 12935, Training Loss: 0.01298
Epoch: 12935, Training Loss: 0.01372
Epoch: 12935, Training Loss: 0.01374
Epoch: 12935, Training Loss: 0.01701
Epoch: 12936, Training Loss: 0.01298
Epoch: 12936, Training Loss: 0.01372
Epoch: 12936, Training Loss: 0.01374
Epoch: 12936, Training Loss: 0.01701
Epoch: 12937, Training Loss: 0.01298
Epoch: 12937, Training Loss: 0.01371
Epoch: 12937, Training Loss: 0.01373
Epoch: 12937, Training Loss: 0.01701
Epoch: 12938, Training Loss: 0.01298
Epoch: 12938, Training Loss: 0.01371
Epoch: 12938, Training Loss: 0.01373
Epoch: 12938, Training Loss: 0.01701
Epoch: 12939, Training Loss: 0.01298
Epoch: 12939, Training Loss: 0.01371
Epoch: 12939, Training Loss: 0.01373
Epoch: 12939, Training Loss: 0.01701
Epoch: 12940, Training Loss: 0.01297
Epoch: 12940, Training Loss: 0.01371
Epoch: 12940, Training Loss: 0.01373
Epoch: 12940, Training Loss: 0.01700
Epoch: 12941, Training Loss: 0.01297
Epoch: 12941, Training Loss: 0.01371
Epoch: 12941, Training Loss: 0.01373
Epoch: 12941, Training Loss: 0.01700
Epoch: 12942, Training Loss: 0.01297
Epoch: 12942, Training Loss: 0.01371
Epoch: 12942, Training Loss: 0.01373
Epoch: 12942, Training Loss: 0.01700
Epoch: 12943, Training Loss: 0.01297
Epoch: 12943, Training Loss: 0.01371
Epoch: 12943, Training Loss: 0.01373
Epoch: 12943, Training Loss: 0.01700
Epoch: 12944, Training Loss: 0.01297
Epoch: 12944, Training Loss: 0.01371
Epoch: 12944, Training Loss: 0.01373
Epoch: 12944, Training Loss: 0.01700
Epoch: 12945, Training Loss: 0.01297
Epoch: 12945, Training Loss: 0.01371
Epoch: 12945, Training Loss: 0.01373
Epoch: 12945, Training Loss: 0.01700
Epoch: 12946, Training Loss: 0.01297
Epoch: 12946, Training Loss: 0.01371
Epoch: 12946, Training Loss: 0.01373
Epoch: 12946, Training Loss: 0.01700
Epoch: 12947, Training Loss: 0.01297
Epoch: 12947, Training Loss: 0.01371
Epoch: 12947, Training Loss: 0.01373
Epoch: 12947, Training Loss: 0.01700
Epoch: 12948, Training Loss: 0.01297
Epoch: 12948, Training Loss: 0.01371
Epoch: 12948, Training Loss: 0.01373
Epoch: 12948, Training Loss: 0.01700
Epoch: 12949, Training Loss: 0.01297
Epoch: 12949, Training Loss: 0.01371
Epoch: 12949, Training Loss: 0.01373
Epoch: 12949, Training Loss: 0.01700
Epoch: 12950, Training Loss: 0.01297
Epoch: 12950, Training Loss: 0.01371
Epoch: 12950, Training Loss: 0.01373
Epoch: 12950, Training Loss: 0.01700
Epoch: 12951, Training Loss: 0.01297
Epoch: 12951, Training Loss: 0.01371
Epoch: 12951, Training Loss: 0.01373
Epoch: 12951, Training Loss: 0.01699
Epoch: 12952, Training Loss: 0.01297
Epoch: 12952, Training Loss: 0.01370
Epoch: 12952, Training Loss: 0.01372
Epoch: 12952, Training Loss: 0.01699
Epoch: 12953, Training Loss: 0.01297
Epoch: 12953, Training Loss: 0.01370
Epoch: 12953, Training Loss: 0.01372
Epoch: 12953, Training Loss: 0.01699
Epoch: 12954, Training Loss: 0.01297
Epoch: 12954, Training Loss: 0.01370
Epoch: 12954, Training Loss: 0.01372
Epoch: 12954, Training Loss: 0.01699
Epoch: 12955, Training Loss: 0.01297
Epoch: 12955, Training Loss: 0.01370
Epoch: 12955, Training Loss: 0.01372
Epoch: 12955, Training Loss: 0.01699
Epoch: 12956, Training Loss: 0.01296
Epoch: 12956, Training Loss: 0.01370
Epoch: 12956, Training Loss: 0.01372
Epoch: 12956, Training Loss: 0.01699
Epoch: 12957, Training Loss: 0.01296
Epoch: 12957, Training Loss: 0.01370
Epoch: 12957, Training Loss: 0.01372
Epoch: 12957, Training Loss: 0.01699
Epoch: 12958, Training Loss: 0.01296
Epoch: 12958, Training Loss: 0.01370
Epoch: 12958, Training Loss: 0.01372
Epoch: 12958, Training Loss: 0.01699
Epoch: 12959, Training Loss: 0.01296
Epoch: 12959, Training Loss: 0.01370
Epoch: 12959, Training Loss: 0.01372
Epoch: 12959, Training Loss: 0.01699
Epoch: 12960, Training Loss: 0.01296
Epoch: 12960, Training Loss: 0.01370
Epoch: 12960, Training Loss: 0.01372
Epoch: 12960, Training Loss: 0.01699
Epoch: 12961, Training Loss: 0.01296
Epoch: 12961, Training Loss: 0.01370
Epoch: 12961, Training Loss: 0.01372
Epoch: 12961, Training Loss: 0.01699
Epoch: 12962, Training Loss: 0.01296
Epoch: 12962, Training Loss: 0.01370
Epoch: 12962, Training Loss: 0.01372
Epoch: 12962, Training Loss: 0.01699
Epoch: 12963, Training Loss: 0.01296
Epoch: 12963, Training Loss: 0.01370
Epoch: 12963, Training Loss: 0.01372
Epoch: 12963, Training Loss: 0.01698
Epoch: 12964, Training Loss: 0.01296
Epoch: 12964, Training Loss: 0.01370
Epoch: 12964, Training Loss: 0.01372
Epoch: 12964, Training Loss: 0.01698
Epoch: 12965, Training Loss: 0.01296
Epoch: 12965, Training Loss: 0.01370
Epoch: 12965, Training Loss: 0.01372
Epoch: 12965, Training Loss: 0.01698
Epoch: 12966, Training Loss: 0.01296
Epoch: 12966, Training Loss: 0.01370
Epoch: 12966, Training Loss: 0.01371
Epoch: 12966, Training Loss: 0.01698
Epoch: 12967, Training Loss: 0.01296
Epoch: 12967, Training Loss: 0.01369
Epoch: 12967, Training Loss: 0.01371
Epoch: 12967, Training Loss: 0.01698
Epoch: 12968, Training Loss: 0.01296
Epoch: 12968, Training Loss: 0.01369
Epoch: 12968, Training Loss: 0.01371
Epoch: 12968, Training Loss: 0.01698
Epoch: 12969, Training Loss: 0.01296
Epoch: 12969, Training Loss: 0.01369
Epoch: 12969, Training Loss: 0.01371
Epoch: 12969, Training Loss: 0.01698
Epoch: 12970, Training Loss: 0.01296
Epoch: 12970, Training Loss: 0.01369
Epoch: 12970, Training Loss: 0.01371
Epoch: 12970, Training Loss: 0.01698
Epoch: 12971, Training Loss: 0.01296
Epoch: 12971, Training Loss: 0.01369
Epoch: 12971, Training Loss: 0.01371
Epoch: 12971, Training Loss: 0.01698
Epoch: 12972, Training Loss: 0.01295
Epoch: 12972, Training Loss: 0.01369
Epoch: 12972, Training Loss: 0.01371
Epoch: 12972, Training Loss: 0.01698
Epoch: 12973, Training Loss: 0.01295
Epoch: 12973, Training Loss: 0.01369
Epoch: 12973, Training Loss: 0.01371
Epoch: 12973, Training Loss: 0.01698
Epoch: 12974, Training Loss: 0.01295
Epoch: 12974, Training Loss: 0.01369
Epoch: 12974, Training Loss: 0.01371
Epoch: 12974, Training Loss: 0.01698
Epoch: 12975, Training Loss: 0.01295
Epoch: 12975, Training Loss: 0.01369
Epoch: 12975, Training Loss: 0.01371
Epoch: 12975, Training Loss: 0.01697
Epoch: 12976, Training Loss: 0.01295
Epoch: 12976, Training Loss: 0.01369
Epoch: 12976, Training Loss: 0.01371
Epoch: 12976, Training Loss: 0.01697
Epoch: 12977, Training Loss: 0.01295
Epoch: 12977, Training Loss: 0.01369
Epoch: 12977, Training Loss: 0.01371
Epoch: 12977, Training Loss: 0.01697
Epoch: 12978, Training Loss: 0.01295
Epoch: 12978, Training Loss: 0.01369
Epoch: 12978, Training Loss: 0.01371
Epoch: 12978, Training Loss: 0.01697
Epoch: 12979, Training Loss: 0.01295
Epoch: 12979, Training Loss: 0.01369
Epoch: 12979, Training Loss: 0.01371
Epoch: 12979, Training Loss: 0.01697
Epoch: 12980, Training Loss: 0.01295
Epoch: 12980, Training Loss: 0.01369
Epoch: 12980, Training Loss: 0.01371
Epoch: 12980, Training Loss: 0.01697
Epoch: 12981, Training Loss: 0.01295
Epoch: 12981, Training Loss: 0.01368
Epoch: 12981, Training Loss: 0.01370
Epoch: 12981, Training Loss: 0.01697
Epoch: 12982, Training Loss: 0.01295
Epoch: 12982, Training Loss: 0.01368
Epoch: 12982, Training Loss: 0.01370
Epoch: 12982, Training Loss: 0.01697
Epoch: 12983, Training Loss: 0.01295
Epoch: 12983, Training Loss: 0.01368
Epoch: 12983, Training Loss: 0.01370
Epoch: 12983, Training Loss: 0.01697
Epoch: 12984, Training Loss: 0.01295
Epoch: 12984, Training Loss: 0.01368
Epoch: 12984, Training Loss: 0.01370
Epoch: 12984, Training Loss: 0.01697
Epoch: 12985, Training Loss: 0.01295
Epoch: 12985, Training Loss: 0.01368
Epoch: 12985, Training Loss: 0.01370
Epoch: 12985, Training Loss: 0.01697
Epoch: 12986, Training Loss: 0.01295
Epoch: 12986, Training Loss: 0.01368
Epoch: 12986, Training Loss: 0.01370
Epoch: 12986, Training Loss: 0.01696
Epoch: 12987, Training Loss: 0.01295
Epoch: 12987, Training Loss: 0.01368
Epoch: 12987, Training Loss: 0.01370
Epoch: 12987, Training Loss: 0.01696
Epoch: 12988, Training Loss: 0.01294
Epoch: 12988, Training Loss: 0.01368
Epoch: 12988, Training Loss: 0.01370
Epoch: 12988, Training Loss: 0.01696
Epoch: 12989, Training Loss: 0.01294
Epoch: 12989, Training Loss: 0.01368
Epoch: 12989, Training Loss: 0.01370
Epoch: 12989, Training Loss: 0.01696
Epoch: 12990, Training Loss: 0.01294
Epoch: 12990, Training Loss: 0.01368
Epoch: 12990, Training Loss: 0.01370
Epoch: 12990, Training Loss: 0.01696
Epoch: 12991, Training Loss: 0.01294
Epoch: 12991, Training Loss: 0.01368
Epoch: 12991, Training Loss: 0.01370
Epoch: 12991, Training Loss: 0.01696
Epoch: 12992, Training Loss: 0.01294
Epoch: 12992, Training Loss: 0.01368
Epoch: 12992, Training Loss: 0.01370
Epoch: 12992, Training Loss: 0.01696
Epoch: 12993, Training Loss: 0.01294
Epoch: 12993, Training Loss: 0.01368
Epoch: 12993, Training Loss: 0.01370
Epoch: 12993, Training Loss: 0.01696
Epoch: 12994, Training Loss: 0.01294
Epoch: 12994, Training Loss: 0.01368
Epoch: 12994, Training Loss: 0.01370
Epoch: 12994, Training Loss: 0.01696
Epoch: 12995, Training Loss: 0.01294
Epoch: 12995, Training Loss: 0.01368
Epoch: 12995, Training Loss: 0.01370
Epoch: 12995, Training Loss: 0.01696
Epoch: 12996, Training Loss: 0.01294
Epoch: 12996, Training Loss: 0.01367
Epoch: 12996, Training Loss: 0.01369
Epoch: 12996, Training Loss: 0.01696
Epoch: 12997, Training Loss: 0.01294
Epoch: 12997, Training Loss: 0.01367
Epoch: 12997, Training Loss: 0.01369
Epoch: 12997, Training Loss: 0.01696
Epoch: 12998, Training Loss: 0.01294
Epoch: 12998, Training Loss: 0.01367
Epoch: 12998, Training Loss: 0.01369
Epoch: 12998, Training Loss: 0.01695
Epoch: 12999, Training Loss: 0.01294
Epoch: 12999, Training Loss: 0.01367
Epoch: 12999, Training Loss: 0.01369
Epoch: 12999, Training Loss: 0.01695
Epoch: 13000, Training Loss: 0.01294
Epoch: 13000, Training Loss: 0.01367
Epoch: 13000, Training Loss: 0.01369
Epoch: 13000, Training Loss: 0.01695
Epoch: 13001, Training Loss: 0.01294
Epoch: 13001, Training Loss: 0.01367
Epoch: 13001, Training Loss: 0.01369
Epoch: 13001, Training Loss: 0.01695
Epoch: 13002, Training Loss: 0.01294
Epoch: 13002, Training Loss: 0.01367
Epoch: 13002, Training Loss: 0.01369
Epoch: 13002, Training Loss: 0.01695
Epoch: 13003, Training Loss: 0.01294
Epoch: 13003, Training Loss: 0.01367
Epoch: 13003, Training Loss: 0.01369
Epoch: 13003, Training Loss: 0.01695
Epoch: 13004, Training Loss: 0.01294
Epoch: 13004, Training Loss: 0.01367
Epoch: 13004, Training Loss: 0.01369
Epoch: 13004, Training Loss: 0.01695
Epoch: 13005, Training Loss: 0.01293
Epoch: 13005, Training Loss: 0.01367
Epoch: 13005, Training Loss: 0.01369
Epoch: 13005, Training Loss: 0.01695
Epoch: 13006, Training Loss: 0.01293
Epoch: 13006, Training Loss: 0.01367
Epoch: 13006, Training Loss: 0.01369
Epoch: 13006, Training Loss: 0.01695
Epoch: 13007, Training Loss: 0.01293
Epoch: 13007, Training Loss: 0.01367
Epoch: 13007, Training Loss: 0.01369
Epoch: 13007, Training Loss: 0.01695
Epoch: 13008, Training Loss: 0.01293
Epoch: 13008, Training Loss: 0.01367
Epoch: 13008, Training Loss: 0.01369
Epoch: 13008, Training Loss: 0.01695
Epoch: 13009, Training Loss: 0.01293
Epoch: 13009, Training Loss: 0.01367
Epoch: 13009, Training Loss: 0.01369
Epoch: 13009, Training Loss: 0.01695
Epoch: 13010, Training Loss: 0.01293
Epoch: 13010, Training Loss: 0.01367
Epoch: 13010, Training Loss: 0.01369
Epoch: 13010, Training Loss: 0.01694
Epoch: 13011, Training Loss: 0.01293
Epoch: 13011, Training Loss: 0.01366
Epoch: 13011, Training Loss: 0.01368
Epoch: 13011, Training Loss: 0.01694
Epoch: 13012, Training Loss: 0.01293
Epoch: 13012, Training Loss: 0.01366
Epoch: 13012, Training Loss: 0.01368
Epoch: 13012, Training Loss: 0.01694
Epoch: 13013, Training Loss: 0.01293
Epoch: 13013, Training Loss: 0.01366
Epoch: 13013, Training Loss: 0.01368
Epoch: 13013, Training Loss: 0.01694
Epoch: 13014, Training Loss: 0.01293
Epoch: 13014, Training Loss: 0.01366
Epoch: 13014, Training Loss: 0.01368
Epoch: 13014, Training Loss: 0.01694
Epoch: 13015, Training Loss: 0.01293
Epoch: 13015, Training Loss: 0.01366
Epoch: 13015, Training Loss: 0.01368
Epoch: 13015, Training Loss: 0.01694
Epoch: 13016, Training Loss: 0.01293
Epoch: 13016, Training Loss: 0.01366
Epoch: 13016, Training Loss: 0.01368
Epoch: 13016, Training Loss: 0.01694
Epoch: 13017, Training Loss: 0.01293
Epoch: 13017, Training Loss: 0.01366
Epoch: 13017, Training Loss: 0.01368
Epoch: 13017, Training Loss: 0.01694
Epoch: 13018, Training Loss: 0.01293
Epoch: 13018, Training Loss: 0.01366
Epoch: 13018, Training Loss: 0.01368
Epoch: 13018, Training Loss: 0.01694
Epoch: 13019, Training Loss: 0.01293
Epoch: 13019, Training Loss: 0.01366
Epoch: 13019, Training Loss: 0.01368
Epoch: 13019, Training Loss: 0.01694
Epoch: 13020, Training Loss: 0.01293
Epoch: 13020, Training Loss: 0.01366
Epoch: 13020, Training Loss: 0.01368
Epoch: 13020, Training Loss: 0.01694
Epoch: 13021, Training Loss: 0.01292
Epoch: 13021, Training Loss: 0.01366
Epoch: 13021, Training Loss: 0.01368
Epoch: 13021, Training Loss: 0.01693
Epoch: 13022, Training Loss: 0.01292
Epoch: 13022, Training Loss: 0.01366
Epoch: 13022, Training Loss: 0.01368
Epoch: 13022, Training Loss: 0.01693
Epoch: 13023, Training Loss: 0.01292
Epoch: 13023, Training Loss: 0.01366
Epoch: 13023, Training Loss: 0.01368
Epoch: 13023, Training Loss: 0.01693
...
[output truncated: per-sample losses printed for every epoch; over epochs 13023-13266 the four losses decrease very slowly, from (0.01292, 0.01366, 0.01368, 0.01693) to roughly (0.01278, 0.01350, 0.01352, 0.01673)]
...
Epoch: 13266, Training Loss: 0.01278
Epoch: 13266, Training Loss: 0.01350
Epoch: 13266, Training Loss: 0.01352
Epoch: 13266, Training Loss: 0.01673
Epoch: 13267, Training Loss: 0.01278
Epoch: 13267, Training Loss: 0.01350
Epoch: 13267, Training Loss: 0.01351
Epoch: 13267, Training Loss: 0.01673
Epoch: 13268, Training Loss: 0.01278
Epoch: 13268, Training Loss: 0.01349
Epoch: 13268, Training Loss: 0.01351
Epoch: 13268, Training Loss: 0.01673
Epoch: 13269, Training Loss: 0.01277
Epoch: 13269, Training Loss: 0.01349
Epoch: 13269, Training Loss: 0.01351
Epoch: 13269, Training Loss: 0.01673
Epoch: 13270, Training Loss: 0.01277
Epoch: 13270, Training Loss: 0.01349
Epoch: 13270, Training Loss: 0.01351
Epoch: 13270, Training Loss: 0.01673
Epoch: 13271, Training Loss: 0.01277
Epoch: 13271, Training Loss: 0.01349
Epoch: 13271, Training Loss: 0.01351
Epoch: 13271, Training Loss: 0.01673
Epoch: 13272, Training Loss: 0.01277
Epoch: 13272, Training Loss: 0.01349
Epoch: 13272, Training Loss: 0.01351
Epoch: 13272, Training Loss: 0.01673
Epoch: 13273, Training Loss: 0.01277
Epoch: 13273, Training Loss: 0.01349
Epoch: 13273, Training Loss: 0.01351
Epoch: 13273, Training Loss: 0.01672
Epoch: 13274, Training Loss: 0.01277
Epoch: 13274, Training Loss: 0.01349
Epoch: 13274, Training Loss: 0.01351
Epoch: 13274, Training Loss: 0.01672
Epoch: 13275, Training Loss: 0.01277
Epoch: 13275, Training Loss: 0.01349
Epoch: 13275, Training Loss: 0.01351
Epoch: 13275, Training Loss: 0.01672
Epoch: 13276, Training Loss: 0.01277
Epoch: 13276, Training Loss: 0.01349
Epoch: 13276, Training Loss: 0.01351
Epoch: 13276, Training Loss: 0.01672
Epoch: 13277, Training Loss: 0.01277
Epoch: 13277, Training Loss: 0.01349
Epoch: 13277, Training Loss: 0.01351
Epoch: 13277, Training Loss: 0.01672
Epoch: 13278, Training Loss: 0.01277
Epoch: 13278, Training Loss: 0.01349
Epoch: 13278, Training Loss: 0.01351
Epoch: 13278, Training Loss: 0.01672
Epoch: 13279, Training Loss: 0.01277
Epoch: 13279, Training Loss: 0.01349
Epoch: 13279, Training Loss: 0.01351
Epoch: 13279, Training Loss: 0.01672
Epoch: 13280, Training Loss: 0.01277
Epoch: 13280, Training Loss: 0.01349
Epoch: 13280, Training Loss: 0.01351
Epoch: 13280, Training Loss: 0.01672
Epoch: 13281, Training Loss: 0.01277
Epoch: 13281, Training Loss: 0.01349
Epoch: 13281, Training Loss: 0.01351
Epoch: 13281, Training Loss: 0.01672
Epoch: 13282, Training Loss: 0.01277
Epoch: 13282, Training Loss: 0.01349
Epoch: 13282, Training Loss: 0.01351
Epoch: 13282, Training Loss: 0.01672
Epoch: 13283, Training Loss: 0.01277
Epoch: 13283, Training Loss: 0.01349
Epoch: 13283, Training Loss: 0.01350
Epoch: 13283, Training Loss: 0.01672
Epoch: 13284, Training Loss: 0.01277
Epoch: 13284, Training Loss: 0.01348
Epoch: 13284, Training Loss: 0.01350
Epoch: 13284, Training Loss: 0.01672
Epoch: 13285, Training Loss: 0.01277
Epoch: 13285, Training Loss: 0.01348
Epoch: 13285, Training Loss: 0.01350
Epoch: 13285, Training Loss: 0.01671
Epoch: 13286, Training Loss: 0.01276
Epoch: 13286, Training Loss: 0.01348
Epoch: 13286, Training Loss: 0.01350
Epoch: 13286, Training Loss: 0.01671
Epoch: 13287, Training Loss: 0.01276
Epoch: 13287, Training Loss: 0.01348
Epoch: 13287, Training Loss: 0.01350
Epoch: 13287, Training Loss: 0.01671
Epoch: 13288, Training Loss: 0.01276
Epoch: 13288, Training Loss: 0.01348
Epoch: 13288, Training Loss: 0.01350
Epoch: 13288, Training Loss: 0.01671
Epoch: 13289, Training Loss: 0.01276
Epoch: 13289, Training Loss: 0.01348
Epoch: 13289, Training Loss: 0.01350
Epoch: 13289, Training Loss: 0.01671
Epoch: 13290, Training Loss: 0.01276
Epoch: 13290, Training Loss: 0.01348
Epoch: 13290, Training Loss: 0.01350
Epoch: 13290, Training Loss: 0.01671
Epoch: 13291, Training Loss: 0.01276
Epoch: 13291, Training Loss: 0.01348
Epoch: 13291, Training Loss: 0.01350
Epoch: 13291, Training Loss: 0.01671
Epoch: 13292, Training Loss: 0.01276
Epoch: 13292, Training Loss: 0.01348
Epoch: 13292, Training Loss: 0.01350
Epoch: 13292, Training Loss: 0.01671
Epoch: 13293, Training Loss: 0.01276
Epoch: 13293, Training Loss: 0.01348
Epoch: 13293, Training Loss: 0.01350
Epoch: 13293, Training Loss: 0.01671
Epoch: 13294, Training Loss: 0.01276
Epoch: 13294, Training Loss: 0.01348
Epoch: 13294, Training Loss: 0.01350
Epoch: 13294, Training Loss: 0.01671
Epoch: 13295, Training Loss: 0.01276
Epoch: 13295, Training Loss: 0.01348
Epoch: 13295, Training Loss: 0.01350
Epoch: 13295, Training Loss: 0.01671
Epoch: 13296, Training Loss: 0.01276
Epoch: 13296, Training Loss: 0.01348
Epoch: 13296, Training Loss: 0.01350
Epoch: 13296, Training Loss: 0.01671
Epoch: 13297, Training Loss: 0.01276
Epoch: 13297, Training Loss: 0.01348
Epoch: 13297, Training Loss: 0.01350
Epoch: 13297, Training Loss: 0.01670
Epoch: 13298, Training Loss: 0.01276
Epoch: 13298, Training Loss: 0.01348
Epoch: 13298, Training Loss: 0.01349
Epoch: 13298, Training Loss: 0.01670
Epoch: 13299, Training Loss: 0.01276
Epoch: 13299, Training Loss: 0.01347
Epoch: 13299, Training Loss: 0.01349
Epoch: 13299, Training Loss: 0.01670
Epoch: 13300, Training Loss: 0.01276
Epoch: 13300, Training Loss: 0.01347
Epoch: 13300, Training Loss: 0.01349
Epoch: 13300, Training Loss: 0.01670
Epoch: 13301, Training Loss: 0.01276
Epoch: 13301, Training Loss: 0.01347
Epoch: 13301, Training Loss: 0.01349
Epoch: 13301, Training Loss: 0.01670
Epoch: 13302, Training Loss: 0.01276
Epoch: 13302, Training Loss: 0.01347
Epoch: 13302, Training Loss: 0.01349
Epoch: 13302, Training Loss: 0.01670
Epoch: 13303, Training Loss: 0.01275
Epoch: 13303, Training Loss: 0.01347
Epoch: 13303, Training Loss: 0.01349
Epoch: 13303, Training Loss: 0.01670
Epoch: 13304, Training Loss: 0.01275
Epoch: 13304, Training Loss: 0.01347
Epoch: 13304, Training Loss: 0.01349
Epoch: 13304, Training Loss: 0.01670
Epoch: 13305, Training Loss: 0.01275
Epoch: 13305, Training Loss: 0.01347
Epoch: 13305, Training Loss: 0.01349
Epoch: 13305, Training Loss: 0.01670
Epoch: 13306, Training Loss: 0.01275
Epoch: 13306, Training Loss: 0.01347
Epoch: 13306, Training Loss: 0.01349
Epoch: 13306, Training Loss: 0.01670
Epoch: 13307, Training Loss: 0.01275
Epoch: 13307, Training Loss: 0.01347
Epoch: 13307, Training Loss: 0.01349
Epoch: 13307, Training Loss: 0.01670
Epoch: 13308, Training Loss: 0.01275
Epoch: 13308, Training Loss: 0.01347
Epoch: 13308, Training Loss: 0.01349
Epoch: 13308, Training Loss: 0.01670
Epoch: 13309, Training Loss: 0.01275
Epoch: 13309, Training Loss: 0.01347
Epoch: 13309, Training Loss: 0.01349
Epoch: 13309, Training Loss: 0.01669
Epoch: 13310, Training Loss: 0.01275
Epoch: 13310, Training Loss: 0.01347
Epoch: 13310, Training Loss: 0.01349
Epoch: 13310, Training Loss: 0.01669
Epoch: 13311, Training Loss: 0.01275
Epoch: 13311, Training Loss: 0.01347
Epoch: 13311, Training Loss: 0.01349
Epoch: 13311, Training Loss: 0.01669
Epoch: 13312, Training Loss: 0.01275
Epoch: 13312, Training Loss: 0.01347
Epoch: 13312, Training Loss: 0.01349
Epoch: 13312, Training Loss: 0.01669
Epoch: 13313, Training Loss: 0.01275
Epoch: 13313, Training Loss: 0.01347
Epoch: 13313, Training Loss: 0.01349
Epoch: 13313, Training Loss: 0.01669
Epoch: 13314, Training Loss: 0.01275
Epoch: 13314, Training Loss: 0.01347
Epoch: 13314, Training Loss: 0.01348
Epoch: 13314, Training Loss: 0.01669
Epoch: 13315, Training Loss: 0.01275
Epoch: 13315, Training Loss: 0.01346
Epoch: 13315, Training Loss: 0.01348
Epoch: 13315, Training Loss: 0.01669
Epoch: 13316, Training Loss: 0.01275
Epoch: 13316, Training Loss: 0.01346
Epoch: 13316, Training Loss: 0.01348
Epoch: 13316, Training Loss: 0.01669
Epoch: 13317, Training Loss: 0.01275
Epoch: 13317, Training Loss: 0.01346
Epoch: 13317, Training Loss: 0.01348
Epoch: 13317, Training Loss: 0.01669
Epoch: 13318, Training Loss: 0.01275
Epoch: 13318, Training Loss: 0.01346
Epoch: 13318, Training Loss: 0.01348
Epoch: 13318, Training Loss: 0.01669
Epoch: 13319, Training Loss: 0.01275
Epoch: 13319, Training Loss: 0.01346
Epoch: 13319, Training Loss: 0.01348
Epoch: 13319, Training Loss: 0.01669
Epoch: 13320, Training Loss: 0.01274
Epoch: 13320, Training Loss: 0.01346
Epoch: 13320, Training Loss: 0.01348
Epoch: 13320, Training Loss: 0.01669
Epoch: 13321, Training Loss: 0.01274
Epoch: 13321, Training Loss: 0.01346
Epoch: 13321, Training Loss: 0.01348
Epoch: 13321, Training Loss: 0.01669
Epoch: 13322, Training Loss: 0.01274
Epoch: 13322, Training Loss: 0.01346
Epoch: 13322, Training Loss: 0.01348
Epoch: 13322, Training Loss: 0.01668
Epoch: 13323, Training Loss: 0.01274
Epoch: 13323, Training Loss: 0.01346
Epoch: 13323, Training Loss: 0.01348
Epoch: 13323, Training Loss: 0.01668
Epoch: 13324, Training Loss: 0.01274
Epoch: 13324, Training Loss: 0.01346
Epoch: 13324, Training Loss: 0.01348
Epoch: 13324, Training Loss: 0.01668
Epoch: 13325, Training Loss: 0.01274
Epoch: 13325, Training Loss: 0.01346
Epoch: 13325, Training Loss: 0.01348
Epoch: 13325, Training Loss: 0.01668
Epoch: 13326, Training Loss: 0.01274
Epoch: 13326, Training Loss: 0.01346
Epoch: 13326, Training Loss: 0.01348
Epoch: 13326, Training Loss: 0.01668
Epoch: 13327, Training Loss: 0.01274
Epoch: 13327, Training Loss: 0.01346
Epoch: 13327, Training Loss: 0.01348
Epoch: 13327, Training Loss: 0.01668
Epoch: 13328, Training Loss: 0.01274
Epoch: 13328, Training Loss: 0.01346
Epoch: 13328, Training Loss: 0.01348
Epoch: 13328, Training Loss: 0.01668
Epoch: 13329, Training Loss: 0.01274
Epoch: 13329, Training Loss: 0.01346
Epoch: 13329, Training Loss: 0.01347
Epoch: 13329, Training Loss: 0.01668
Epoch: 13330, Training Loss: 0.01274
Epoch: 13330, Training Loss: 0.01345
Epoch: 13330, Training Loss: 0.01347
Epoch: 13330, Training Loss: 0.01668
Epoch: 13331, Training Loss: 0.01274
Epoch: 13331, Training Loss: 0.01345
Epoch: 13331, Training Loss: 0.01347
Epoch: 13331, Training Loss: 0.01668
Epoch: 13332, Training Loss: 0.01274
Epoch: 13332, Training Loss: 0.01345
Epoch: 13332, Training Loss: 0.01347
Epoch: 13332, Training Loss: 0.01668
Epoch: 13333, Training Loss: 0.01274
Epoch: 13333, Training Loss: 0.01345
Epoch: 13333, Training Loss: 0.01347
Epoch: 13333, Training Loss: 0.01668
Epoch: 13334, Training Loss: 0.01274
Epoch: 13334, Training Loss: 0.01345
Epoch: 13334, Training Loss: 0.01347
Epoch: 13334, Training Loss: 0.01667
Epoch: 13335, Training Loss: 0.01274
Epoch: 13335, Training Loss: 0.01345
Epoch: 13335, Training Loss: 0.01347
Epoch: 13335, Training Loss: 0.01667
Epoch: 13336, Training Loss: 0.01274
Epoch: 13336, Training Loss: 0.01345
Epoch: 13336, Training Loss: 0.01347
Epoch: 13336, Training Loss: 0.01667
Epoch: 13337, Training Loss: 0.01273
Epoch: 13337, Training Loss: 0.01345
Epoch: 13337, Training Loss: 0.01347
Epoch: 13337, Training Loss: 0.01667
Epoch: 13338, Training Loss: 0.01273
Epoch: 13338, Training Loss: 0.01345
Epoch: 13338, Training Loss: 0.01347
Epoch: 13338, Training Loss: 0.01667
Epoch: 13339, Training Loss: 0.01273
Epoch: 13339, Training Loss: 0.01345
Epoch: 13339, Training Loss: 0.01347
Epoch: 13339, Training Loss: 0.01667
Epoch: 13340, Training Loss: 0.01273
Epoch: 13340, Training Loss: 0.01345
Epoch: 13340, Training Loss: 0.01347
Epoch: 13340, Training Loss: 0.01667
Epoch: 13341, Training Loss: 0.01273
Epoch: 13341, Training Loss: 0.01345
Epoch: 13341, Training Loss: 0.01347
Epoch: 13341, Training Loss: 0.01667
Epoch: 13342, Training Loss: 0.01273
Epoch: 13342, Training Loss: 0.01345
Epoch: 13342, Training Loss: 0.01347
Epoch: 13342, Training Loss: 0.01667
Epoch: 13343, Training Loss: 0.01273
Epoch: 13343, Training Loss: 0.01345
Epoch: 13343, Training Loss: 0.01347
Epoch: 13343, Training Loss: 0.01667
Epoch: 13344, Training Loss: 0.01273
Epoch: 13344, Training Loss: 0.01345
Epoch: 13344, Training Loss: 0.01347
Epoch: 13344, Training Loss: 0.01667
Epoch: 13345, Training Loss: 0.01273
Epoch: 13345, Training Loss: 0.01345
Epoch: 13345, Training Loss: 0.01346
Epoch: 13345, Training Loss: 0.01667
Epoch: 13346, Training Loss: 0.01273
Epoch: 13346, Training Loss: 0.01344
Epoch: 13346, Training Loss: 0.01346
Epoch: 13346, Training Loss: 0.01666
Epoch: 13347, Training Loss: 0.01273
Epoch: 13347, Training Loss: 0.01344
Epoch: 13347, Training Loss: 0.01346
Epoch: 13347, Training Loss: 0.01666
Epoch: 13348, Training Loss: 0.01273
Epoch: 13348, Training Loss: 0.01344
Epoch: 13348, Training Loss: 0.01346
Epoch: 13348, Training Loss: 0.01666
Epoch: 13349, Training Loss: 0.01273
Epoch: 13349, Training Loss: 0.01344
Epoch: 13349, Training Loss: 0.01346
Epoch: 13349, Training Loss: 0.01666
Epoch: 13350, Training Loss: 0.01273
Epoch: 13350, Training Loss: 0.01344
Epoch: 13350, Training Loss: 0.01346
Epoch: 13350, Training Loss: 0.01666
Epoch: 13351, Training Loss: 0.01273
Epoch: 13351, Training Loss: 0.01344
Epoch: 13351, Training Loss: 0.01346
Epoch: 13351, Training Loss: 0.01666
Epoch: 13352, Training Loss: 0.01273
Epoch: 13352, Training Loss: 0.01344
Epoch: 13352, Training Loss: 0.01346
Epoch: 13352, Training Loss: 0.01666
Epoch: 13353, Training Loss: 0.01273
Epoch: 13353, Training Loss: 0.01344
Epoch: 13353, Training Loss: 0.01346
Epoch: 13353, Training Loss: 0.01666
Epoch: 13354, Training Loss: 0.01272
Epoch: 13354, Training Loss: 0.01344
Epoch: 13354, Training Loss: 0.01346
Epoch: 13354, Training Loss: 0.01666
Epoch: 13355, Training Loss: 0.01272
Epoch: 13355, Training Loss: 0.01344
Epoch: 13355, Training Loss: 0.01346
Epoch: 13355, Training Loss: 0.01666
Epoch: 13356, Training Loss: 0.01272
Epoch: 13356, Training Loss: 0.01344
Epoch: 13356, Training Loss: 0.01346
Epoch: 13356, Training Loss: 0.01666
Epoch: 13357, Training Loss: 0.01272
Epoch: 13357, Training Loss: 0.01344
Epoch: 13357, Training Loss: 0.01346
Epoch: 13357, Training Loss: 0.01666
Epoch: 13358, Training Loss: 0.01272
Epoch: 13358, Training Loss: 0.01344
Epoch: 13358, Training Loss: 0.01346
Epoch: 13358, Training Loss: 0.01665
Epoch: 13359, Training Loss: 0.01272
Epoch: 13359, Training Loss: 0.01344
Epoch: 13359, Training Loss: 0.01346
Epoch: 13359, Training Loss: 0.01665
Epoch: 13360, Training Loss: 0.01272
Epoch: 13360, Training Loss: 0.01344
Epoch: 13360, Training Loss: 0.01345
Epoch: 13360, Training Loss: 0.01665
Epoch: 13361, Training Loss: 0.01272
Epoch: 13361, Training Loss: 0.01343
Epoch: 13361, Training Loss: 0.01345
Epoch: 13361, Training Loss: 0.01665
Epoch: 13362, Training Loss: 0.01272
Epoch: 13362, Training Loss: 0.01343
Epoch: 13362, Training Loss: 0.01345
Epoch: 13362, Training Loss: 0.01665
Epoch: 13363, Training Loss: 0.01272
Epoch: 13363, Training Loss: 0.01343
Epoch: 13363, Training Loss: 0.01345
Epoch: 13363, Training Loss: 0.01665
Epoch: 13364, Training Loss: 0.01272
Epoch: 13364, Training Loss: 0.01343
Epoch: 13364, Training Loss: 0.01345
Epoch: 13364, Training Loss: 0.01665
Epoch: 13365, Training Loss: 0.01272
Epoch: 13365, Training Loss: 0.01343
Epoch: 13365, Training Loss: 0.01345
Epoch: 13365, Training Loss: 0.01665
Epoch: 13366, Training Loss: 0.01272
Epoch: 13366, Training Loss: 0.01343
Epoch: 13366, Training Loss: 0.01345
Epoch: 13366, Training Loss: 0.01665
Epoch: 13367, Training Loss: 0.01272
Epoch: 13367, Training Loss: 0.01343
Epoch: 13367, Training Loss: 0.01345
Epoch: 13367, Training Loss: 0.01665
Epoch: 13368, Training Loss: 0.01272
Epoch: 13368, Training Loss: 0.01343
Epoch: 13368, Training Loss: 0.01345
Epoch: 13368, Training Loss: 0.01665
Epoch: 13369, Training Loss: 0.01272
Epoch: 13369, Training Loss: 0.01343
Epoch: 13369, Training Loss: 0.01345
Epoch: 13369, Training Loss: 0.01665
Epoch: 13370, Training Loss: 0.01272
Epoch: 13370, Training Loss: 0.01343
Epoch: 13370, Training Loss: 0.01345
Epoch: 13370, Training Loss: 0.01665
Epoch: 13371, Training Loss: 0.01271
Epoch: 13371, Training Loss: 0.01343
Epoch: 13371, Training Loss: 0.01345
Epoch: 13371, Training Loss: 0.01664
Epoch: 13372, Training Loss: 0.01271
Epoch: 13372, Training Loss: 0.01343
Epoch: 13372, Training Loss: 0.01345
Epoch: 13372, Training Loss: 0.01664
Epoch: 13373, Training Loss: 0.01271
Epoch: 13373, Training Loss: 0.01343
Epoch: 13373, Training Loss: 0.01345
Epoch: 13373, Training Loss: 0.01664
Epoch: 13374, Training Loss: 0.01271
Epoch: 13374, Training Loss: 0.01343
Epoch: 13374, Training Loss: 0.01345
Epoch: 13374, Training Loss: 0.01664
Epoch: 13375, Training Loss: 0.01271
Epoch: 13375, Training Loss: 0.01343
Epoch: 13375, Training Loss: 0.01345
Epoch: 13375, Training Loss: 0.01664
Epoch: 13376, Training Loss: 0.01271
Epoch: 13376, Training Loss: 0.01343
Epoch: 13376, Training Loss: 0.01344
Epoch: 13376, Training Loss: 0.01664
Epoch: 13377, Training Loss: 0.01271
Epoch: 13377, Training Loss: 0.01342
Epoch: 13377, Training Loss: 0.01344
Epoch: 13377, Training Loss: 0.01664
Epoch: 13378, Training Loss: 0.01271
Epoch: 13378, Training Loss: 0.01342
Epoch: 13378, Training Loss: 0.01344
Epoch: 13378, Training Loss: 0.01664
Epoch: 13379, Training Loss: 0.01271
Epoch: 13379, Training Loss: 0.01342
Epoch: 13379, Training Loss: 0.01344
Epoch: 13379, Training Loss: 0.01664
Epoch: 13380, Training Loss: 0.01271
Epoch: 13380, Training Loss: 0.01342
Epoch: 13380, Training Loss: 0.01344
Epoch: 13380, Training Loss: 0.01664
Epoch: 13381, Training Loss: 0.01271
Epoch: 13381, Training Loss: 0.01342
Epoch: 13381, Training Loss: 0.01344
Epoch: 13381, Training Loss: 0.01664
Epoch: 13382, Training Loss: 0.01271
Epoch: 13382, Training Loss: 0.01342
Epoch: 13382, Training Loss: 0.01344
Epoch: 13382, Training Loss: 0.01664
Epoch: 13383, Training Loss: 0.01271
Epoch: 13383, Training Loss: 0.01342
Epoch: 13383, Training Loss: 0.01344
Epoch: 13383, Training Loss: 0.01663
Epoch: 13384, Training Loss: 0.01271
Epoch: 13384, Training Loss: 0.01342
Epoch: 13384, Training Loss: 0.01344
Epoch: 13384, Training Loss: 0.01663
Epoch: 13385, Training Loss: 0.01271
Epoch: 13385, Training Loss: 0.01342
Epoch: 13385, Training Loss: 0.01344
Epoch: 13385, Training Loss: 0.01663
Epoch: 13386, Training Loss: 0.01271
Epoch: 13386, Training Loss: 0.01342
Epoch: 13386, Training Loss: 0.01344
Epoch: 13386, Training Loss: 0.01663
Epoch: 13387, Training Loss: 0.01271
Epoch: 13387, Training Loss: 0.01342
Epoch: 13387, Training Loss: 0.01344
Epoch: 13387, Training Loss: 0.01663
Epoch: 13388, Training Loss: 0.01270
Epoch: 13388, Training Loss: 0.01342
Epoch: 13388, Training Loss: 0.01344
Epoch: 13388, Training Loss: 0.01663
Epoch: 13389, Training Loss: 0.01270
Epoch: 13389, Training Loss: 0.01342
Epoch: 13389, Training Loss: 0.01344
Epoch: 13389, Training Loss: 0.01663
Epoch: 13390, Training Loss: 0.01270
Epoch: 13390, Training Loss: 0.01342
Epoch: 13390, Training Loss: 0.01344
Epoch: 13390, Training Loss: 0.01663
Epoch: 13391, Training Loss: 0.01270
Epoch: 13391, Training Loss: 0.01342
Epoch: 13391, Training Loss: 0.01343
Epoch: 13391, Training Loss: 0.01663
Epoch: 13392, Training Loss: 0.01270
Epoch: 13392, Training Loss: 0.01341
Epoch: 13392, Training Loss: 0.01343
Epoch: 13392, Training Loss: 0.01663
Epoch: 13393, Training Loss: 0.01270
Epoch: 13393, Training Loss: 0.01341
Epoch: 13393, Training Loss: 0.01343
Epoch: 13393, Training Loss: 0.01663
Epoch: 13394, Training Loss: 0.01270
Epoch: 13394, Training Loss: 0.01341
Epoch: 13394, Training Loss: 0.01343
Epoch: 13394, Training Loss: 0.01663
Epoch: 13395, Training Loss: 0.01270
Epoch: 13395, Training Loss: 0.01341
Epoch: 13395, Training Loss: 0.01343
Epoch: 13395, Training Loss: 0.01662
Epoch: 13396, Training Loss: 0.01270
Epoch: 13396, Training Loss: 0.01341
Epoch: 13396, Training Loss: 0.01343
Epoch: 13396, Training Loss: 0.01662
Epoch: 13397, Training Loss: 0.01270
Epoch: 13397, Training Loss: 0.01341
Epoch: 13397, Training Loss: 0.01343
Epoch: 13397, Training Loss: 0.01662
Epoch: 13398, Training Loss: 0.01270
Epoch: 13398, Training Loss: 0.01341
Epoch: 13398, Training Loss: 0.01343
Epoch: 13398, Training Loss: 0.01662
Epoch: 13399, Training Loss: 0.01270
Epoch: 13399, Training Loss: 0.01341
Epoch: 13399, Training Loss: 0.01343
Epoch: 13399, Training Loss: 0.01662
Epoch: 13400, Training Loss: 0.01270
Epoch: 13400, Training Loss: 0.01341
Epoch: 13400, Training Loss: 0.01343
Epoch: 13400, Training Loss: 0.01662
Epoch: 13401, Training Loss: 0.01270
Epoch: 13401, Training Loss: 0.01341
Epoch: 13401, Training Loss: 0.01343
Epoch: 13401, Training Loss: 0.01662
Epoch: 13402, Training Loss: 0.01270
Epoch: 13402, Training Loss: 0.01341
Epoch: 13402, Training Loss: 0.01343
Epoch: 13402, Training Loss: 0.01662
Epoch: 13403, Training Loss: 0.01270
Epoch: 13403, Training Loss: 0.01341
Epoch: 13403, Training Loss: 0.01343
Epoch: 13403, Training Loss: 0.01662
Epoch: 13404, Training Loss: 0.01270
Epoch: 13404, Training Loss: 0.01341
Epoch: 13404, Training Loss: 0.01343
Epoch: 13404, Training Loss: 0.01662
Epoch: 13405, Training Loss: 0.01270
Epoch: 13405, Training Loss: 0.01341
Epoch: 13405, Training Loss: 0.01343
Epoch: 13405, Training Loss: 0.01662
Epoch: 13406, Training Loss: 0.01269
Epoch: 13406, Training Loss: 0.01341
Epoch: 13406, Training Loss: 0.01343
Epoch: 13406, Training Loss: 0.01662
Epoch: 13407, Training Loss: 0.01269
Epoch: 13407, Training Loss: 0.01341
Epoch: 13407, Training Loss: 0.01342
Epoch: 13407, Training Loss: 0.01662
Epoch: 13408, Training Loss: 0.01269
Epoch: 13408, Training Loss: 0.01340
Epoch: 13408, Training Loss: 0.01342
Epoch: 13408, Training Loss: 0.01661
Epoch: 13409, Training Loss: 0.01269
Epoch: 13409, Training Loss: 0.01340
Epoch: 13409, Training Loss: 0.01342
Epoch: 13409, Training Loss: 0.01661
Epoch: 13410, Training Loss: 0.01269
Epoch: 13410, Training Loss: 0.01340
Epoch: 13410, Training Loss: 0.01342
Epoch: 13410, Training Loss: 0.01661
Epoch: 13411, Training Loss: 0.01269
Epoch: 13411, Training Loss: 0.01340
Epoch: 13411, Training Loss: 0.01342
Epoch: 13411, Training Loss: 0.01661
Epoch: 13412, Training Loss: 0.01269
Epoch: 13412, Training Loss: 0.01340
Epoch: 13412, Training Loss: 0.01342
Epoch: 13412, Training Loss: 0.01661
Epoch: 13413, Training Loss: 0.01269
Epoch: 13413, Training Loss: 0.01340
Epoch: 13413, Training Loss: 0.01342
Epoch: 13413, Training Loss: 0.01661
Epoch: 13414, Training Loss: 0.01269
Epoch: 13414, Training Loss: 0.01340
Epoch: 13414, Training Loss: 0.01342
Epoch: 13414, Training Loss: 0.01661
Epoch: 13415, Training Loss: 0.01269
Epoch: 13415, Training Loss: 0.01340
Epoch: 13415, Training Loss: 0.01342
Epoch: 13415, Training Loss: 0.01661
Epoch: 13416, Training Loss: 0.01269
Epoch: 13416, Training Loss: 0.01340
Epoch: 13416, Training Loss: 0.01342
Epoch: 13416, Training Loss: 0.01661
Epoch: 13417, Training Loss: 0.01269
Epoch: 13417, Training Loss: 0.01340
Epoch: 13417, Training Loss: 0.01342
Epoch: 13417, Training Loss: 0.01661
Epoch: 13418, Training Loss: 0.01269
Epoch: 13418, Training Loss: 0.01340
Epoch: 13418, Training Loss: 0.01342
Epoch: 13418, Training Loss: 0.01661
Epoch: 13419, Training Loss: 0.01269
Epoch: 13419, Training Loss: 0.01340
Epoch: 13419, Training Loss: 0.01342
Epoch: 13419, Training Loss: 0.01661
Epoch: 13420, Training Loss: 0.01269
Epoch: 13420, Training Loss: 0.01340
Epoch: 13420, Training Loss: 0.01342
Epoch: 13420, Training Loss: 0.01660
Epoch: 13421, Training Loss: 0.01269
Epoch: 13421, Training Loss: 0.01340
Epoch: 13421, Training Loss: 0.01342
Epoch: 13421, Training Loss: 0.01660
Epoch: 13422, Training Loss: 0.01269
Epoch: 13422, Training Loss: 0.01340
Epoch: 13422, Training Loss: 0.01342
Epoch: 13422, Training Loss: 0.01660
Epoch: 13423, Training Loss: 0.01268
Epoch: 13423, Training Loss: 0.01340
Epoch: 13423, Training Loss: 0.01341
Epoch: 13423, Training Loss: 0.01660
Epoch: 13424, Training Loss: 0.01268
Epoch: 13424, Training Loss: 0.01339
Epoch: 13424, Training Loss: 0.01341
Epoch: 13424, Training Loss: 0.01660
Epoch: 13425, Training Loss: 0.01268
Epoch: 13425, Training Loss: 0.01339
Epoch: 13425, Training Loss: 0.01341
Epoch: 13425, Training Loss: 0.01660
Epoch: 13426, Training Loss: 0.01268
Epoch: 13426, Training Loss: 0.01339
Epoch: 13426, Training Loss: 0.01341
Epoch: 13426, Training Loss: 0.01660
Epoch: 13427, Training Loss: 0.01268
Epoch: 13427, Training Loss: 0.01339
Epoch: 13427, Training Loss: 0.01341
Epoch: 13427, Training Loss: 0.01660
Epoch: 13428, Training Loss: 0.01268
Epoch: 13428, Training Loss: 0.01339
Epoch: 13428, Training Loss: 0.01341
Epoch: 13428, Training Loss: 0.01660
Epoch: 13429, Training Loss: 0.01268
Epoch: 13429, Training Loss: 0.01339
Epoch: 13429, Training Loss: 0.01341
Epoch: 13429, Training Loss: 0.01660
Epoch: 13430, Training Loss: 0.01268
Epoch: 13430, Training Loss: 0.01339
Epoch: 13430, Training Loss: 0.01341
Epoch: 13430, Training Loss: 0.01660
Epoch: 13431, Training Loss: 0.01268
Epoch: 13431, Training Loss: 0.01339
Epoch: 13431, Training Loss: 0.01341
Epoch: 13431, Training Loss: 0.01660
Epoch: 13432, Training Loss: 0.01268
Epoch: 13432, Training Loss: 0.01339
Epoch: 13432, Training Loss: 0.01341
Epoch: 13432, Training Loss: 0.01660
Epoch: 13433, Training Loss: 0.01268
Epoch: 13433, Training Loss: 0.01339
Epoch: 13433, Training Loss: 0.01341
Epoch: 13433, Training Loss: 0.01659
Epoch: 13434, Training Loss: 0.01268
Epoch: 13434, Training Loss: 0.01339
Epoch: 13434, Training Loss: 0.01341
Epoch: 13434, Training Loss: 0.01659
Epoch: 13435, Training Loss: 0.01268
Epoch: 13435, Training Loss: 0.01339
Epoch: 13435, Training Loss: 0.01341
Epoch: 13435, Training Loss: 0.01659
Epoch: 13436, Training Loss: 0.01268
Epoch: 13436, Training Loss: 0.01339
Epoch: 13436, Training Loss: 0.01341
Epoch: 13436, Training Loss: 0.01659
Epoch: 13437, Training Loss: 0.01268
Epoch: 13437, Training Loss: 0.01339
Epoch: 13437, Training Loss: 0.01341
Epoch: 13437, Training Loss: 0.01659
Epoch: 13438, Training Loss: 0.01268
Epoch: 13438, Training Loss: 0.01339
Epoch: 13438, Training Loss: 0.01341
Epoch: 13438, Training Loss: 0.01659
Epoch: 13439, Training Loss: 0.01268
Epoch: 13439, Training Loss: 0.01339
Epoch: 13439, Training Loss: 0.01340
Epoch: 13439, Training Loss: 0.01659
Epoch: 13440, Training Loss: 0.01267
Epoch: 13440, Training Loss: 0.01338
Epoch: 13440, Training Loss: 0.01340
Epoch: 13440, Training Loss: 0.01659
Epoch: 13441, Training Loss: 0.01267
Epoch: 13441, Training Loss: 0.01338
Epoch: 13441, Training Loss: 0.01340
Epoch: 13441, Training Loss: 0.01659
Epoch: 13442, Training Loss: 0.01267
Epoch: 13442, Training Loss: 0.01338
Epoch: 13442, Training Loss: 0.01340
Epoch: 13442, Training Loss: 0.01659
Epoch: 13443, Training Loss: 0.01267
Epoch: 13443, Training Loss: 0.01338
Epoch: 13443, Training Loss: 0.01340
Epoch: 13443, Training Loss: 0.01659
Epoch: 13444, Training Loss: 0.01267
Epoch: 13444, Training Loss: 0.01338
Epoch: 13444, Training Loss: 0.01340
Epoch: 13444, Training Loss: 0.01659
Epoch: 13445, Training Loss: 0.01267
Epoch: 13445, Training Loss: 0.01338
Epoch: 13445, Training Loss: 0.01340
Epoch: 13445, Training Loss: 0.01658
Epoch: 13446, Training Loss: 0.01267
Epoch: 13446, Training Loss: 0.01338
Epoch: 13446, Training Loss: 0.01340
Epoch: 13446, Training Loss: 0.01658
Epoch: 13447, Training Loss: 0.01267
Epoch: 13447, Training Loss: 0.01338
Epoch: 13447, Training Loss: 0.01340
Epoch: 13447, Training Loss: 0.01658
Epoch: 13448, Training Loss: 0.01267
Epoch: 13448, Training Loss: 0.01338
Epoch: 13448, Training Loss: 0.01340
Epoch: 13448, Training Loss: 0.01658
Epoch: 13449, Training Loss: 0.01267
Epoch: 13449, Training Loss: 0.01338
Epoch: 13449, Training Loss: 0.01340
Epoch: 13449, Training Loss: 0.01658
Epoch: 13450, Training Loss: 0.01267
Epoch: 13450, Training Loss: 0.01338
Epoch: 13450, Training Loss: 0.01340
Epoch: 13450, Training Loss: 0.01658
Epoch: 13451, Training Loss: 0.01267
Epoch: 13451, Training Loss: 0.01338
Epoch: 13451, Training Loss: 0.01340
Epoch: 13451, Training Loss: 0.01658
Epoch: 13452, Training Loss: 0.01267
Epoch: 13452, Training Loss: 0.01338
Epoch: 13452, Training Loss: 0.01340
Epoch: 13452, Training Loss: 0.01658
Epoch: 13453, Training Loss: 0.01267
Epoch: 13453, Training Loss: 0.01338
Epoch: 13453, Training Loss: 0.01340
Epoch: 13453, Training Loss: 0.01658
Epoch: 13454, Training Loss: 0.01267
Epoch: 13454, Training Loss: 0.01338
Epoch: 13454, Training Loss: 0.01339
Epoch: 13454, Training Loss: 0.01658
Epoch: 13455, Training Loss: 0.01267
Epoch: 13455, Training Loss: 0.01337
[... training log truncated: epochs 13455–13699, four per-pattern losses printed per epoch. The losses decrease very slowly, from roughly (0.01267, 0.01337, 0.01339, 0.01658) at epoch 13456 to roughly (0.01253, 0.01322, 0.01324, 0.01639) at epoch 13698 — a change of only ~0.0002 over about 240 epochs, illustrating the slow convergence of sigmoid units with the quadratic cost once the outputs saturate ...]
Epoch: 13699, Training Loss: 0.01322
Epoch: 13699, Training Loss: 0.01324
Epoch: 13699, Training Loss: 0.01638
Epoch: 13700, Training Loss: 0.01253
Epoch: 13700, Training Loss: 0.01322
Epoch: 13700, Training Loss: 0.01324
Epoch: 13700, Training Loss: 0.01638
Epoch: 13701, Training Loss: 0.01253
Epoch: 13701, Training Loss: 0.01322
Epoch: 13701, Training Loss: 0.01324
Epoch: 13701, Training Loss: 0.01638
Epoch: 13702, Training Loss: 0.01253
Epoch: 13702, Training Loss: 0.01322
Epoch: 13702, Training Loss: 0.01324
Epoch: 13702, Training Loss: 0.01638
Epoch: 13703, Training Loss: 0.01252
Epoch: 13703, Training Loss: 0.01322
Epoch: 13703, Training Loss: 0.01324
Epoch: 13703, Training Loss: 0.01638
Epoch: 13704, Training Loss: 0.01252
Epoch: 13704, Training Loss: 0.01322
Epoch: 13704, Training Loss: 0.01324
Epoch: 13704, Training Loss: 0.01638
Epoch: 13705, Training Loss: 0.01252
Epoch: 13705, Training Loss: 0.01322
Epoch: 13705, Training Loss: 0.01324
Epoch: 13705, Training Loss: 0.01638
Epoch: 13706, Training Loss: 0.01252
Epoch: 13706, Training Loss: 0.01322
Epoch: 13706, Training Loss: 0.01324
Epoch: 13706, Training Loss: 0.01638
Epoch: 13707, Training Loss: 0.01252
Epoch: 13707, Training Loss: 0.01322
Epoch: 13707, Training Loss: 0.01324
Epoch: 13707, Training Loss: 0.01638
Epoch: 13708, Training Loss: 0.01252
Epoch: 13708, Training Loss: 0.01322
Epoch: 13708, Training Loss: 0.01324
Epoch: 13708, Training Loss: 0.01638
Epoch: 13709, Training Loss: 0.01252
Epoch: 13709, Training Loss: 0.01322
Epoch: 13709, Training Loss: 0.01324
Epoch: 13709, Training Loss: 0.01638
Epoch: 13710, Training Loss: 0.01252
Epoch: 13710, Training Loss: 0.01322
Epoch: 13710, Training Loss: 0.01324
Epoch: 13710, Training Loss: 0.01638
Epoch: 13711, Training Loss: 0.01252
Epoch: 13711, Training Loss: 0.01322
Epoch: 13711, Training Loss: 0.01323
Epoch: 13711, Training Loss: 0.01638
Epoch: 13712, Training Loss: 0.01252
Epoch: 13712, Training Loss: 0.01322
Epoch: 13712, Training Loss: 0.01323
Epoch: 13712, Training Loss: 0.01637
Epoch: 13713, Training Loss: 0.01252
Epoch: 13713, Training Loss: 0.01321
Epoch: 13713, Training Loss: 0.01323
Epoch: 13713, Training Loss: 0.01637
Epoch: 13714, Training Loss: 0.01252
Epoch: 13714, Training Loss: 0.01321
Epoch: 13714, Training Loss: 0.01323
Epoch: 13714, Training Loss: 0.01637
Epoch: 13715, Training Loss: 0.01252
Epoch: 13715, Training Loss: 0.01321
Epoch: 13715, Training Loss: 0.01323
Epoch: 13715, Training Loss: 0.01637
Epoch: 13716, Training Loss: 0.01252
Epoch: 13716, Training Loss: 0.01321
Epoch: 13716, Training Loss: 0.01323
Epoch: 13716, Training Loss: 0.01637
Epoch: 13717, Training Loss: 0.01252
Epoch: 13717, Training Loss: 0.01321
Epoch: 13717, Training Loss: 0.01323
Epoch: 13717, Training Loss: 0.01637
Epoch: 13718, Training Loss: 0.01252
Epoch: 13718, Training Loss: 0.01321
Epoch: 13718, Training Loss: 0.01323
Epoch: 13718, Training Loss: 0.01637
Epoch: 13719, Training Loss: 0.01252
Epoch: 13719, Training Loss: 0.01321
Epoch: 13719, Training Loss: 0.01323
Epoch: 13719, Training Loss: 0.01637
Epoch: 13720, Training Loss: 0.01252
Epoch: 13720, Training Loss: 0.01321
Epoch: 13720, Training Loss: 0.01323
Epoch: 13720, Training Loss: 0.01637
Epoch: 13721, Training Loss: 0.01251
Epoch: 13721, Training Loss: 0.01321
Epoch: 13721, Training Loss: 0.01323
Epoch: 13721, Training Loss: 0.01637
Epoch: 13722, Training Loss: 0.01251
Epoch: 13722, Training Loss: 0.01321
Epoch: 13722, Training Loss: 0.01323
Epoch: 13722, Training Loss: 0.01637
Epoch: 13723, Training Loss: 0.01251
Epoch: 13723, Training Loss: 0.01321
Epoch: 13723, Training Loss: 0.01323
Epoch: 13723, Training Loss: 0.01637
Epoch: 13724, Training Loss: 0.01251
Epoch: 13724, Training Loss: 0.01321
Epoch: 13724, Training Loss: 0.01323
Epoch: 13724, Training Loss: 0.01637
Epoch: 13725, Training Loss: 0.01251
Epoch: 13725, Training Loss: 0.01321
Epoch: 13725, Training Loss: 0.01323
Epoch: 13725, Training Loss: 0.01636
Epoch: 13726, Training Loss: 0.01251
Epoch: 13726, Training Loss: 0.01321
Epoch: 13726, Training Loss: 0.01323
Epoch: 13726, Training Loss: 0.01636
Epoch: 13727, Training Loss: 0.01251
Epoch: 13727, Training Loss: 0.01321
Epoch: 13727, Training Loss: 0.01323
Epoch: 13727, Training Loss: 0.01636
Epoch: 13728, Training Loss: 0.01251
Epoch: 13728, Training Loss: 0.01321
Epoch: 13728, Training Loss: 0.01322
Epoch: 13728, Training Loss: 0.01636
Epoch: 13729, Training Loss: 0.01251
Epoch: 13729, Training Loss: 0.01320
Epoch: 13729, Training Loss: 0.01322
Epoch: 13729, Training Loss: 0.01636
Epoch: 13730, Training Loss: 0.01251
Epoch: 13730, Training Loss: 0.01320
Epoch: 13730, Training Loss: 0.01322
Epoch: 13730, Training Loss: 0.01636
Epoch: 13731, Training Loss: 0.01251
Epoch: 13731, Training Loss: 0.01320
Epoch: 13731, Training Loss: 0.01322
Epoch: 13731, Training Loss: 0.01636
Epoch: 13732, Training Loss: 0.01251
Epoch: 13732, Training Loss: 0.01320
Epoch: 13732, Training Loss: 0.01322
Epoch: 13732, Training Loss: 0.01636
Epoch: 13733, Training Loss: 0.01251
Epoch: 13733, Training Loss: 0.01320
Epoch: 13733, Training Loss: 0.01322
Epoch: 13733, Training Loss: 0.01636
Epoch: 13734, Training Loss: 0.01251
Epoch: 13734, Training Loss: 0.01320
Epoch: 13734, Training Loss: 0.01322
Epoch: 13734, Training Loss: 0.01636
Epoch: 13735, Training Loss: 0.01251
Epoch: 13735, Training Loss: 0.01320
Epoch: 13735, Training Loss: 0.01322
Epoch: 13735, Training Loss: 0.01636
Epoch: 13736, Training Loss: 0.01251
Epoch: 13736, Training Loss: 0.01320
Epoch: 13736, Training Loss: 0.01322
Epoch: 13736, Training Loss: 0.01636
Epoch: 13737, Training Loss: 0.01251
Epoch: 13737, Training Loss: 0.01320
Epoch: 13737, Training Loss: 0.01322
Epoch: 13737, Training Loss: 0.01636
Epoch: 13738, Training Loss: 0.01251
Epoch: 13738, Training Loss: 0.01320
Epoch: 13738, Training Loss: 0.01322
Epoch: 13738, Training Loss: 0.01635
Epoch: 13739, Training Loss: 0.01250
Epoch: 13739, Training Loss: 0.01320
Epoch: 13739, Training Loss: 0.01322
Epoch: 13739, Training Loss: 0.01635
Epoch: 13740, Training Loss: 0.01250
Epoch: 13740, Training Loss: 0.01320
Epoch: 13740, Training Loss: 0.01322
Epoch: 13740, Training Loss: 0.01635
Epoch: 13741, Training Loss: 0.01250
Epoch: 13741, Training Loss: 0.01320
Epoch: 13741, Training Loss: 0.01322
Epoch: 13741, Training Loss: 0.01635
Epoch: 13742, Training Loss: 0.01250
Epoch: 13742, Training Loss: 0.01320
Epoch: 13742, Training Loss: 0.01322
Epoch: 13742, Training Loss: 0.01635
Epoch: 13743, Training Loss: 0.01250
Epoch: 13743, Training Loss: 0.01320
Epoch: 13743, Training Loss: 0.01322
Epoch: 13743, Training Loss: 0.01635
Epoch: 13744, Training Loss: 0.01250
Epoch: 13744, Training Loss: 0.01320
Epoch: 13744, Training Loss: 0.01321
Epoch: 13744, Training Loss: 0.01635
Epoch: 13745, Training Loss: 0.01250
Epoch: 13745, Training Loss: 0.01320
Epoch: 13745, Training Loss: 0.01321
Epoch: 13745, Training Loss: 0.01635
Epoch: 13746, Training Loss: 0.01250
Epoch: 13746, Training Loss: 0.01319
Epoch: 13746, Training Loss: 0.01321
Epoch: 13746, Training Loss: 0.01635
Epoch: 13747, Training Loss: 0.01250
Epoch: 13747, Training Loss: 0.01319
Epoch: 13747, Training Loss: 0.01321
Epoch: 13747, Training Loss: 0.01635
Epoch: 13748, Training Loss: 0.01250
Epoch: 13748, Training Loss: 0.01319
Epoch: 13748, Training Loss: 0.01321
Epoch: 13748, Training Loss: 0.01635
Epoch: 13749, Training Loss: 0.01250
Epoch: 13749, Training Loss: 0.01319
Epoch: 13749, Training Loss: 0.01321
Epoch: 13749, Training Loss: 0.01635
Epoch: 13750, Training Loss: 0.01250
Epoch: 13750, Training Loss: 0.01319
Epoch: 13750, Training Loss: 0.01321
Epoch: 13750, Training Loss: 0.01635
Epoch: 13751, Training Loss: 0.01250
Epoch: 13751, Training Loss: 0.01319
Epoch: 13751, Training Loss: 0.01321
Epoch: 13751, Training Loss: 0.01634
Epoch: 13752, Training Loss: 0.01250
Epoch: 13752, Training Loss: 0.01319
Epoch: 13752, Training Loss: 0.01321
Epoch: 13752, Training Loss: 0.01634
Epoch: 13753, Training Loss: 0.01250
Epoch: 13753, Training Loss: 0.01319
Epoch: 13753, Training Loss: 0.01321
Epoch: 13753, Training Loss: 0.01634
Epoch: 13754, Training Loss: 0.01250
Epoch: 13754, Training Loss: 0.01319
Epoch: 13754, Training Loss: 0.01321
Epoch: 13754, Training Loss: 0.01634
Epoch: 13755, Training Loss: 0.01250
Epoch: 13755, Training Loss: 0.01319
Epoch: 13755, Training Loss: 0.01321
Epoch: 13755, Training Loss: 0.01634
Epoch: 13756, Training Loss: 0.01250
Epoch: 13756, Training Loss: 0.01319
Epoch: 13756, Training Loss: 0.01321
Epoch: 13756, Training Loss: 0.01634
Epoch: 13757, Training Loss: 0.01249
Epoch: 13757, Training Loss: 0.01319
Epoch: 13757, Training Loss: 0.01321
Epoch: 13757, Training Loss: 0.01634
Epoch: 13758, Training Loss: 0.01249
Epoch: 13758, Training Loss: 0.01319
Epoch: 13758, Training Loss: 0.01321
Epoch: 13758, Training Loss: 0.01634
Epoch: 13759, Training Loss: 0.01249
Epoch: 13759, Training Loss: 0.01319
Epoch: 13759, Training Loss: 0.01321
Epoch: 13759, Training Loss: 0.01634
Epoch: 13760, Training Loss: 0.01249
Epoch: 13760, Training Loss: 0.01319
Epoch: 13760, Training Loss: 0.01321
Epoch: 13760, Training Loss: 0.01634
Epoch: 13761, Training Loss: 0.01249
Epoch: 13761, Training Loss: 0.01319
Epoch: 13761, Training Loss: 0.01320
Epoch: 13761, Training Loss: 0.01634
Epoch: 13762, Training Loss: 0.01249
Epoch: 13762, Training Loss: 0.01318
Epoch: 13762, Training Loss: 0.01320
Epoch: 13762, Training Loss: 0.01634
Epoch: 13763, Training Loss: 0.01249
Epoch: 13763, Training Loss: 0.01318
Epoch: 13763, Training Loss: 0.01320
Epoch: 13763, Training Loss: 0.01634
Epoch: 13764, Training Loss: 0.01249
Epoch: 13764, Training Loss: 0.01318
Epoch: 13764, Training Loss: 0.01320
Epoch: 13764, Training Loss: 0.01633
Epoch: 13765, Training Loss: 0.01249
Epoch: 13765, Training Loss: 0.01318
Epoch: 13765, Training Loss: 0.01320
Epoch: 13765, Training Loss: 0.01633
Epoch: 13766, Training Loss: 0.01249
Epoch: 13766, Training Loss: 0.01318
Epoch: 13766, Training Loss: 0.01320
Epoch: 13766, Training Loss: 0.01633
Epoch: 13767, Training Loss: 0.01249
Epoch: 13767, Training Loss: 0.01318
Epoch: 13767, Training Loss: 0.01320
Epoch: 13767, Training Loss: 0.01633
Epoch: 13768, Training Loss: 0.01249
Epoch: 13768, Training Loss: 0.01318
Epoch: 13768, Training Loss: 0.01320
Epoch: 13768, Training Loss: 0.01633
Epoch: 13769, Training Loss: 0.01249
Epoch: 13769, Training Loss: 0.01318
Epoch: 13769, Training Loss: 0.01320
Epoch: 13769, Training Loss: 0.01633
Epoch: 13770, Training Loss: 0.01249
Epoch: 13770, Training Loss: 0.01318
Epoch: 13770, Training Loss: 0.01320
Epoch: 13770, Training Loss: 0.01633
Epoch: 13771, Training Loss: 0.01249
Epoch: 13771, Training Loss: 0.01318
Epoch: 13771, Training Loss: 0.01320
Epoch: 13771, Training Loss: 0.01633
Epoch: 13772, Training Loss: 0.01249
Epoch: 13772, Training Loss: 0.01318
Epoch: 13772, Training Loss: 0.01320
Epoch: 13772, Training Loss: 0.01633
Epoch: 13773, Training Loss: 0.01249
Epoch: 13773, Training Loss: 0.01318
Epoch: 13773, Training Loss: 0.01320
Epoch: 13773, Training Loss: 0.01633
Epoch: 13774, Training Loss: 0.01249
Epoch: 13774, Training Loss: 0.01318
Epoch: 13774, Training Loss: 0.01320
Epoch: 13774, Training Loss: 0.01633
Epoch: 13775, Training Loss: 0.01248
Epoch: 13775, Training Loss: 0.01318
Epoch: 13775, Training Loss: 0.01320
Epoch: 13775, Training Loss: 0.01633
Epoch: 13776, Training Loss: 0.01248
Epoch: 13776, Training Loss: 0.01318
Epoch: 13776, Training Loss: 0.01320
Epoch: 13776, Training Loss: 0.01633
Epoch: 13777, Training Loss: 0.01248
Epoch: 13777, Training Loss: 0.01318
Epoch: 13777, Training Loss: 0.01319
Epoch: 13777, Training Loss: 0.01632
Epoch: 13778, Training Loss: 0.01248
Epoch: 13778, Training Loss: 0.01318
Epoch: 13778, Training Loss: 0.01319
Epoch: 13778, Training Loss: 0.01632
Epoch: 13779, Training Loss: 0.01248
Epoch: 13779, Training Loss: 0.01317
Epoch: 13779, Training Loss: 0.01319
Epoch: 13779, Training Loss: 0.01632
Epoch: 13780, Training Loss: 0.01248
Epoch: 13780, Training Loss: 0.01317
Epoch: 13780, Training Loss: 0.01319
Epoch: 13780, Training Loss: 0.01632
Epoch: 13781, Training Loss: 0.01248
Epoch: 13781, Training Loss: 0.01317
Epoch: 13781, Training Loss: 0.01319
Epoch: 13781, Training Loss: 0.01632
Epoch: 13782, Training Loss: 0.01248
Epoch: 13782, Training Loss: 0.01317
Epoch: 13782, Training Loss: 0.01319
Epoch: 13782, Training Loss: 0.01632
Epoch: 13783, Training Loss: 0.01248
Epoch: 13783, Training Loss: 0.01317
Epoch: 13783, Training Loss: 0.01319
Epoch: 13783, Training Loss: 0.01632
Epoch: 13784, Training Loss: 0.01248
Epoch: 13784, Training Loss: 0.01317
Epoch: 13784, Training Loss: 0.01319
Epoch: 13784, Training Loss: 0.01632
Epoch: 13785, Training Loss: 0.01248
Epoch: 13785, Training Loss: 0.01317
Epoch: 13785, Training Loss: 0.01319
Epoch: 13785, Training Loss: 0.01632
Epoch: 13786, Training Loss: 0.01248
Epoch: 13786, Training Loss: 0.01317
Epoch: 13786, Training Loss: 0.01319
Epoch: 13786, Training Loss: 0.01632
Epoch: 13787, Training Loss: 0.01248
Epoch: 13787, Training Loss: 0.01317
Epoch: 13787, Training Loss: 0.01319
Epoch: 13787, Training Loss: 0.01632
Epoch: 13788, Training Loss: 0.01248
Epoch: 13788, Training Loss: 0.01317
Epoch: 13788, Training Loss: 0.01319
Epoch: 13788, Training Loss: 0.01632
Epoch: 13789, Training Loss: 0.01248
Epoch: 13789, Training Loss: 0.01317
Epoch: 13789, Training Loss: 0.01319
Epoch: 13789, Training Loss: 0.01632
Epoch: 13790, Training Loss: 0.01248
Epoch: 13790, Training Loss: 0.01317
Epoch: 13790, Training Loss: 0.01319
Epoch: 13790, Training Loss: 0.01631
Epoch: 13791, Training Loss: 0.01248
Epoch: 13791, Training Loss: 0.01317
Epoch: 13791, Training Loss: 0.01319
Epoch: 13791, Training Loss: 0.01631
Epoch: 13792, Training Loss: 0.01248
Epoch: 13792, Training Loss: 0.01317
Epoch: 13792, Training Loss: 0.01319
Epoch: 13792, Training Loss: 0.01631
Epoch: 13793, Training Loss: 0.01247
Epoch: 13793, Training Loss: 0.01317
Epoch: 13793, Training Loss: 0.01319
Epoch: 13793, Training Loss: 0.01631
Epoch: 13794, Training Loss: 0.01247
Epoch: 13794, Training Loss: 0.01317
Epoch: 13794, Training Loss: 0.01318
Epoch: 13794, Training Loss: 0.01631
Epoch: 13795, Training Loss: 0.01247
Epoch: 13795, Training Loss: 0.01316
Epoch: 13795, Training Loss: 0.01318
Epoch: 13795, Training Loss: 0.01631
Epoch: 13796, Training Loss: 0.01247
Epoch: 13796, Training Loss: 0.01316
Epoch: 13796, Training Loss: 0.01318
Epoch: 13796, Training Loss: 0.01631
Epoch: 13797, Training Loss: 0.01247
Epoch: 13797, Training Loss: 0.01316
Epoch: 13797, Training Loss: 0.01318
Epoch: 13797, Training Loss: 0.01631
Epoch: 13798, Training Loss: 0.01247
Epoch: 13798, Training Loss: 0.01316
Epoch: 13798, Training Loss: 0.01318
Epoch: 13798, Training Loss: 0.01631
Epoch: 13799, Training Loss: 0.01247
Epoch: 13799, Training Loss: 0.01316
Epoch: 13799, Training Loss: 0.01318
Epoch: 13799, Training Loss: 0.01631
Epoch: 13800, Training Loss: 0.01247
Epoch: 13800, Training Loss: 0.01316
Epoch: 13800, Training Loss: 0.01318
Epoch: 13800, Training Loss: 0.01631
Epoch: 13801, Training Loss: 0.01247
Epoch: 13801, Training Loss: 0.01316
Epoch: 13801, Training Loss: 0.01318
Epoch: 13801, Training Loss: 0.01631
Epoch: 13802, Training Loss: 0.01247
Epoch: 13802, Training Loss: 0.01316
Epoch: 13802, Training Loss: 0.01318
Epoch: 13802, Training Loss: 0.01631
Epoch: 13803, Training Loss: 0.01247
Epoch: 13803, Training Loss: 0.01316
Epoch: 13803, Training Loss: 0.01318
Epoch: 13803, Training Loss: 0.01631
Epoch: 13804, Training Loss: 0.01247
Epoch: 13804, Training Loss: 0.01316
Epoch: 13804, Training Loss: 0.01318
Epoch: 13804, Training Loss: 0.01630
Epoch: 13805, Training Loss: 0.01247
Epoch: 13805, Training Loss: 0.01316
Epoch: 13805, Training Loss: 0.01318
Epoch: 13805, Training Loss: 0.01630
Epoch: 13806, Training Loss: 0.01247
Epoch: 13806, Training Loss: 0.01316
Epoch: 13806, Training Loss: 0.01318
Epoch: 13806, Training Loss: 0.01630
Epoch: 13807, Training Loss: 0.01247
Epoch: 13807, Training Loss: 0.01316
Epoch: 13807, Training Loss: 0.01318
Epoch: 13807, Training Loss: 0.01630
Epoch: 13808, Training Loss: 0.01247
Epoch: 13808, Training Loss: 0.01316
Epoch: 13808, Training Loss: 0.01318
Epoch: 13808, Training Loss: 0.01630
Epoch: 13809, Training Loss: 0.01247
Epoch: 13809, Training Loss: 0.01316
Epoch: 13809, Training Loss: 0.01318
Epoch: 13809, Training Loss: 0.01630
Epoch: 13810, Training Loss: 0.01247
Epoch: 13810, Training Loss: 0.01316
Epoch: 13810, Training Loss: 0.01317
Epoch: 13810, Training Loss: 0.01630
Epoch: 13811, Training Loss: 0.01246
Epoch: 13811, Training Loss: 0.01316
Epoch: 13811, Training Loss: 0.01317
Epoch: 13811, Training Loss: 0.01630
Epoch: 13812, Training Loss: 0.01246
Epoch: 13812, Training Loss: 0.01315
Epoch: 13812, Training Loss: 0.01317
Epoch: 13812, Training Loss: 0.01630
Epoch: 13813, Training Loss: 0.01246
Epoch: 13813, Training Loss: 0.01315
Epoch: 13813, Training Loss: 0.01317
Epoch: 13813, Training Loss: 0.01630
Epoch: 13814, Training Loss: 0.01246
Epoch: 13814, Training Loss: 0.01315
Epoch: 13814, Training Loss: 0.01317
Epoch: 13814, Training Loss: 0.01630
Epoch: 13815, Training Loss: 0.01246
Epoch: 13815, Training Loss: 0.01315
Epoch: 13815, Training Loss: 0.01317
Epoch: 13815, Training Loss: 0.01630
Epoch: 13816, Training Loss: 0.01246
Epoch: 13816, Training Loss: 0.01315
Epoch: 13816, Training Loss: 0.01317
Epoch: 13816, Training Loss: 0.01630
Epoch: 13817, Training Loss: 0.01246
Epoch: 13817, Training Loss: 0.01315
Epoch: 13817, Training Loss: 0.01317
Epoch: 13817, Training Loss: 0.01629
Epoch: 13818, Training Loss: 0.01246
Epoch: 13818, Training Loss: 0.01315
Epoch: 13818, Training Loss: 0.01317
Epoch: 13818, Training Loss: 0.01629
Epoch: 13819, Training Loss: 0.01246
Epoch: 13819, Training Loss: 0.01315
Epoch: 13819, Training Loss: 0.01317
Epoch: 13819, Training Loss: 0.01629
Epoch: 13820, Training Loss: 0.01246
Epoch: 13820, Training Loss: 0.01315
Epoch: 13820, Training Loss: 0.01317
Epoch: 13820, Training Loss: 0.01629
Epoch: 13821, Training Loss: 0.01246
Epoch: 13821, Training Loss: 0.01315
Epoch: 13821, Training Loss: 0.01317
Epoch: 13821, Training Loss: 0.01629
Epoch: 13822, Training Loss: 0.01246
Epoch: 13822, Training Loss: 0.01315
Epoch: 13822, Training Loss: 0.01317
Epoch: 13822, Training Loss: 0.01629
Epoch: 13823, Training Loss: 0.01246
Epoch: 13823, Training Loss: 0.01315
Epoch: 13823, Training Loss: 0.01317
Epoch: 13823, Training Loss: 0.01629
Epoch: 13824, Training Loss: 0.01246
Epoch: 13824, Training Loss: 0.01315
Epoch: 13824, Training Loss: 0.01317
Epoch: 13824, Training Loss: 0.01629
Epoch: 13825, Training Loss: 0.01246
Epoch: 13825, Training Loss: 0.01315
Epoch: 13825, Training Loss: 0.01317
Epoch: 13825, Training Loss: 0.01629
Epoch: 13826, Training Loss: 0.01246
Epoch: 13826, Training Loss: 0.01315
Epoch: 13826, Training Loss: 0.01317
Epoch: 13826, Training Loss: 0.01629
Epoch: 13827, Training Loss: 0.01246
Epoch: 13827, Training Loss: 0.01315
Epoch: 13827, Training Loss: 0.01316
Epoch: 13827, Training Loss: 0.01629
Epoch: 13828, Training Loss: 0.01246
Epoch: 13828, Training Loss: 0.01315
Epoch: 13828, Training Loss: 0.01316
Epoch: 13828, Training Loss: 0.01629
Epoch: 13829, Training Loss: 0.01246
Epoch: 13829, Training Loss: 0.01314
Epoch: 13829, Training Loss: 0.01316
Epoch: 13829, Training Loss: 0.01629
Epoch: 13830, Training Loss: 0.01245
Epoch: 13830, Training Loss: 0.01314
Epoch: 13830, Training Loss: 0.01316
Epoch: 13830, Training Loss: 0.01628
Epoch: 13831, Training Loss: 0.01245
Epoch: 13831, Training Loss: 0.01314
Epoch: 13831, Training Loss: 0.01316
Epoch: 13831, Training Loss: 0.01628
Epoch: 13832, Training Loss: 0.01245
Epoch: 13832, Training Loss: 0.01314
Epoch: 13832, Training Loss: 0.01316
Epoch: 13832, Training Loss: 0.01628
Epoch: 13833, Training Loss: 0.01245
Epoch: 13833, Training Loss: 0.01314
Epoch: 13833, Training Loss: 0.01316
Epoch: 13833, Training Loss: 0.01628
Epoch: 13834, Training Loss: 0.01245
Epoch: 13834, Training Loss: 0.01314
Epoch: 13834, Training Loss: 0.01316
Epoch: 13834, Training Loss: 0.01628
Epoch: 13835, Training Loss: 0.01245
Epoch: 13835, Training Loss: 0.01314
Epoch: 13835, Training Loss: 0.01316
Epoch: 13835, Training Loss: 0.01628
Epoch: 13836, Training Loss: 0.01245
Epoch: 13836, Training Loss: 0.01314
Epoch: 13836, Training Loss: 0.01316
Epoch: 13836, Training Loss: 0.01628
Epoch: 13837, Training Loss: 0.01245
Epoch: 13837, Training Loss: 0.01314
Epoch: 13837, Training Loss: 0.01316
Epoch: 13837, Training Loss: 0.01628
Epoch: 13838, Training Loss: 0.01245
Epoch: 13838, Training Loss: 0.01314
Epoch: 13838, Training Loss: 0.01316
Epoch: 13838, Training Loss: 0.01628
Epoch: 13839, Training Loss: 0.01245
Epoch: 13839, Training Loss: 0.01314
Epoch: 13839, Training Loss: 0.01316
Epoch: 13839, Training Loss: 0.01628
Epoch: 13840, Training Loss: 0.01245
Epoch: 13840, Training Loss: 0.01314
Epoch: 13840, Training Loss: 0.01316
Epoch: 13840, Training Loss: 0.01628
Epoch: 13841, Training Loss: 0.01245
Epoch: 13841, Training Loss: 0.01314
Epoch: 13841, Training Loss: 0.01316
Epoch: 13841, Training Loss: 0.01628
Epoch: 13842, Training Loss: 0.01245
Epoch: 13842, Training Loss: 0.01314
Epoch: 13842, Training Loss: 0.01316
Epoch: 13842, Training Loss: 0.01628
Epoch: 13843, Training Loss: 0.01245
Epoch: 13843, Training Loss: 0.01314
Epoch: 13843, Training Loss: 0.01315
Epoch: 13843, Training Loss: 0.01627
Epoch: 13844, Training Loss: 0.01245
Epoch: 13844, Training Loss: 0.01314
Epoch: 13844, Training Loss: 0.01315
Epoch: 13844, Training Loss: 0.01627
Epoch: 13845, Training Loss: 0.01245
Epoch: 13845, Training Loss: 0.01313
Epoch: 13845, Training Loss: 0.01315
Epoch: 13845, Training Loss: 0.01627
Epoch: 13846, Training Loss: 0.01245
Epoch: 13846, Training Loss: 0.01313
Epoch: 13846, Training Loss: 0.01315
Epoch: 13846, Training Loss: 0.01627
Epoch: 13847, Training Loss: 0.01245
Epoch: 13847, Training Loss: 0.01313
Epoch: 13847, Training Loss: 0.01315
Epoch: 13847, Training Loss: 0.01627
Epoch: 13848, Training Loss: 0.01244
Epoch: 13848, Training Loss: 0.01313
Epoch: 13848, Training Loss: 0.01315
Epoch: 13848, Training Loss: 0.01627
Epoch: 13849, Training Loss: 0.01244
Epoch: 13849, Training Loss: 0.01313
Epoch: 13849, Training Loss: 0.01315
Epoch: 13849, Training Loss: 0.01627
Epoch: 13850, Training Loss: 0.01244
Epoch: 13850, Training Loss: 0.01313
Epoch: 13850, Training Loss: 0.01315
Epoch: 13850, Training Loss: 0.01627
Epoch: 13851, Training Loss: 0.01244
Epoch: 13851, Training Loss: 0.01313
Epoch: 13851, Training Loss: 0.01315
Epoch: 13851, Training Loss: 0.01627
Epoch: 13852, Training Loss: 0.01244
Epoch: 13852, Training Loss: 0.01313
Epoch: 13852, Training Loss: 0.01315
Epoch: 13852, Training Loss: 0.01627
Epoch: 13853, Training Loss: 0.01244
Epoch: 13853, Training Loss: 0.01313
Epoch: 13853, Training Loss: 0.01315
Epoch: 13853, Training Loss: 0.01627
Epoch: 13854, Training Loss: 0.01244
Epoch: 13854, Training Loss: 0.01313
Epoch: 13854, Training Loss: 0.01315
Epoch: 13854, Training Loss: 0.01627
Epoch: 13855, Training Loss: 0.01244
Epoch: 13855, Training Loss: 0.01313
Epoch: 13855, Training Loss: 0.01315
Epoch: 13855, Training Loss: 0.01627
Epoch: 13856, Training Loss: 0.01244
Epoch: 13856, Training Loss: 0.01313
Epoch: 13856, Training Loss: 0.01315
Epoch: 13856, Training Loss: 0.01626
Epoch: 13857, Training Loss: 0.01244
Epoch: 13857, Training Loss: 0.01313
Epoch: 13857, Training Loss: 0.01315
Epoch: 13857, Training Loss: 0.01626
Epoch: 13858, Training Loss: 0.01244
Epoch: 13858, Training Loss: 0.01313
Epoch: 13858, Training Loss: 0.01315
Epoch: 13858, Training Loss: 0.01626
Epoch: 13859, Training Loss: 0.01244
Epoch: 13859, Training Loss: 0.01313
Epoch: 13859, Training Loss: 0.01315
Epoch: 13859, Training Loss: 0.01626
Epoch: 13860, Training Loss: 0.01244
Epoch: 13860, Training Loss: 0.01313
Epoch: 13860, Training Loss: 0.01314
Epoch: 13860, Training Loss: 0.01626
Epoch: 13861, Training Loss: 0.01244
Epoch: 13861, Training Loss: 0.01313
Epoch: 13861, Training Loss: 0.01314
Epoch: 13861, Training Loss: 0.01626
Epoch: 13862, Training Loss: 0.01244
Epoch: 13862, Training Loss: 0.01312
Epoch: 13862, Training Loss: 0.01314
Epoch: 13862, Training Loss: 0.01626
Epoch: 13863, Training Loss: 0.01244
Epoch: 13863, Training Loss: 0.01312
Epoch: 13863, Training Loss: 0.01314
Epoch: 13863, Training Loss: 0.01626
Epoch: 13864, Training Loss: 0.01244
Epoch: 13864, Training Loss: 0.01312
Epoch: 13864, Training Loss: 0.01314
Epoch: 13864, Training Loss: 0.01626
Epoch: 13865, Training Loss: 0.01244
Epoch: 13865, Training Loss: 0.01312
Epoch: 13865, Training Loss: 0.01314
Epoch: 13865, Training Loss: 0.01626
Epoch: 13866, Training Loss: 0.01243
Epoch: 13866, Training Loss: 0.01312
Epoch: 13866, Training Loss: 0.01314
Epoch: 13866, Training Loss: 0.01626
Epoch: 13867, Training Loss: 0.01243
Epoch: 13867, Training Loss: 0.01312
Epoch: 13867, Training Loss: 0.01314
Epoch: 13867, Training Loss: 0.01626
Epoch: 13868, Training Loss: 0.01243
Epoch: 13868, Training Loss: 0.01312
Epoch: 13868, Training Loss: 0.01314
Epoch: 13868, Training Loss: 0.01626
Epoch: 13869, Training Loss: 0.01243
Epoch: 13869, Training Loss: 0.01312
Epoch: 13869, Training Loss: 0.01314
Epoch: 13869, Training Loss: 0.01625
Epoch: 13870, Training Loss: 0.01243
Epoch: 13870, Training Loss: 0.01312
Epoch: 13870, Training Loss: 0.01314
Epoch: 13870, Training Loss: 0.01625
Epoch: 13871, Training Loss: 0.01243
Epoch: 13871, Training Loss: 0.01312
Epoch: 13871, Training Loss: 0.01314
Epoch: 13871, Training Loss: 0.01625
Epoch: 13872, Training Loss: 0.01243
Epoch: 13872, Training Loss: 0.01312
Epoch: 13872, Training Loss: 0.01314
Epoch: 13872, Training Loss: 0.01625
Epoch: 13873, Training Loss: 0.01243
Epoch: 13873, Training Loss: 0.01312
Epoch: 13873, Training Loss: 0.01314
Epoch: 13873, Training Loss: 0.01625
Epoch: 13874, Training Loss: 0.01243
Epoch: 13874, Training Loss: 0.01312
Epoch: 13874, Training Loss: 0.01314
Epoch: 13874, Training Loss: 0.01625
Epoch: 13875, Training Loss: 0.01243
Epoch: 13875, Training Loss: 0.01312
Epoch: 13875, Training Loss: 0.01314
Epoch: 13875, Training Loss: 0.01625
Epoch: 13876, Training Loss: 0.01243
Epoch: 13876, Training Loss: 0.01312
Epoch: 13876, Training Loss: 0.01314
Epoch: 13876, Training Loss: 0.01625
Epoch: 13877, Training Loss: 0.01243
Epoch: 13877, Training Loss: 0.01312
Epoch: 13877, Training Loss: 0.01313
Epoch: 13877, Training Loss: 0.01625
Epoch: 13878, Training Loss: 0.01243
Epoch: 13878, Training Loss: 0.01312
Epoch: 13878, Training Loss: 0.01313
Epoch: 13878, Training Loss: 0.01625
Epoch: 13879, Training Loss: 0.01243
Epoch: 13879, Training Loss: 0.01311
Epoch: 13879, Training Loss: 0.01313
Epoch: 13879, Training Loss: 0.01625
Epoch: 13880, Training Loss: 0.01243
Epoch: 13880, Training Loss: 0.01311
Epoch: 13880, Training Loss: 0.01313
Epoch: 13880, Training Loss: 0.01625
Epoch: 13881, Training Loss: 0.01243
Epoch: 13881, Training Loss: 0.01311
Epoch: 13881, Training Loss: 0.01313
Epoch: 13881, Training Loss: 0.01625
Epoch: 13882, Training Loss: 0.01243
Epoch: 13882, Training Loss: 0.01311
Epoch: 13882, Training Loss: 0.01313
Epoch: 13882, Training Loss: 0.01625
Epoch: 13883, Training Loss: 0.01243
Epoch: 13883, Training Loss: 0.01311
Epoch: 13883, Training Loss: 0.01313
Epoch: 13883, Training Loss: 0.01624
Epoch: 13884, Training Loss: 0.01242
Epoch: 13884, Training Loss: 0.01311
Epoch: 13884, Training Loss: 0.01313
Epoch: 13884, Training Loss: 0.01624
Epoch: 13885, Training Loss: 0.01242
Epoch: 13885, Training Loss: 0.01311
Epoch: 13885, Training Loss: 0.01313
Epoch: 13885, Training Loss: 0.01624
Epoch: 13886, Training Loss: 0.01242
Epoch: 13886, Training Loss: 0.01311
Epoch: 13886, Training Loss: 0.01313
Epoch: 13886, Training Loss: 0.01624
Epoch: 13887, Training Loss: 0.01242
Epoch: 13887, Training Loss: 0.01311
Epoch: 13887, Training Loss: 0.01313
Epoch: 13887, Training Loss: 0.01624
Epoch: 13888, Training Loss: 0.01242
Epoch: 13888, Training Loss: 0.01311
Epoch: 13888, Training Loss: 0.01313
Epoch: 13888, Training Loss: 0.01624
[... repetitive output omitted: four per-pattern losses are printed every epoch, decreasing only slightly between epochs 13888 and 14130, from roughly (0.01242, 0.01311, 0.01313, 0.01624) to (0.01232, 0.01300, 0.01301, 0.01610) ...]
Epoch: 14131, Training Loss: 0.01229
Epoch: 14131, Training Loss: 0.01297
Epoch: 14131, Training Loss: 0.01299
Epoch: 14131, Training Loss: 0.01606
Epoch: 14132, Training Loss: 0.01229
Epoch: 14132, Training Loss: 0.01297
Epoch: 14132, Training Loss: 0.01298
Epoch: 14132, Training Loss: 0.01606
Epoch: 14133, Training Loss: 0.01229
Epoch: 14133, Training Loss: 0.01297
Epoch: 14133, Training Loss: 0.01298
Epoch: 14133, Training Loss: 0.01606
Epoch: 14134, Training Loss: 0.01229
Epoch: 14134, Training Loss: 0.01296
Epoch: 14134, Training Loss: 0.01298
Epoch: 14134, Training Loss: 0.01606
Epoch: 14135, Training Loss: 0.01229
Epoch: 14135, Training Loss: 0.01296
Epoch: 14135, Training Loss: 0.01298
Epoch: 14135, Training Loss: 0.01606
Epoch: 14136, Training Loss: 0.01229
Epoch: 14136, Training Loss: 0.01296
Epoch: 14136, Training Loss: 0.01298
Epoch: 14136, Training Loss: 0.01606
Epoch: 14137, Training Loss: 0.01229
Epoch: 14137, Training Loss: 0.01296
Epoch: 14137, Training Loss: 0.01298
Epoch: 14137, Training Loss: 0.01606
Epoch: 14138, Training Loss: 0.01229
Epoch: 14138, Training Loss: 0.01296
Epoch: 14138, Training Loss: 0.01298
Epoch: 14138, Training Loss: 0.01606
Epoch: 14139, Training Loss: 0.01229
Epoch: 14139, Training Loss: 0.01296
Epoch: 14139, Training Loss: 0.01298
Epoch: 14139, Training Loss: 0.01605
Epoch: 14140, Training Loss: 0.01229
Epoch: 14140, Training Loss: 0.01296
Epoch: 14140, Training Loss: 0.01298
Epoch: 14140, Training Loss: 0.01605
Epoch: 14141, Training Loss: 0.01229
Epoch: 14141, Training Loss: 0.01296
Epoch: 14141, Training Loss: 0.01298
Epoch: 14141, Training Loss: 0.01605
Epoch: 14142, Training Loss: 0.01229
Epoch: 14142, Training Loss: 0.01296
Epoch: 14142, Training Loss: 0.01298
Epoch: 14142, Training Loss: 0.01605
Epoch: 14143, Training Loss: 0.01229
Epoch: 14143, Training Loss: 0.01296
Epoch: 14143, Training Loss: 0.01298
Epoch: 14143, Training Loss: 0.01605
Epoch: 14144, Training Loss: 0.01229
Epoch: 14144, Training Loss: 0.01296
Epoch: 14144, Training Loss: 0.01298
Epoch: 14144, Training Loss: 0.01605
Epoch: 14145, Training Loss: 0.01228
Epoch: 14145, Training Loss: 0.01296
Epoch: 14145, Training Loss: 0.01298
Epoch: 14145, Training Loss: 0.01605
Epoch: 14146, Training Loss: 0.01228
Epoch: 14146, Training Loss: 0.01296
Epoch: 14146, Training Loss: 0.01298
Epoch: 14146, Training Loss: 0.01605
Epoch: 14147, Training Loss: 0.01228
Epoch: 14147, Training Loss: 0.01296
Epoch: 14147, Training Loss: 0.01298
Epoch: 14147, Training Loss: 0.01605
Epoch: 14148, Training Loss: 0.01228
Epoch: 14148, Training Loss: 0.01296
Epoch: 14148, Training Loss: 0.01298
Epoch: 14148, Training Loss: 0.01605
Epoch: 14149, Training Loss: 0.01228
Epoch: 14149, Training Loss: 0.01296
Epoch: 14149, Training Loss: 0.01297
Epoch: 14149, Training Loss: 0.01605
Epoch: 14150, Training Loss: 0.01228
Epoch: 14150, Training Loss: 0.01296
Epoch: 14150, Training Loss: 0.01297
Epoch: 14150, Training Loss: 0.01605
Epoch: 14151, Training Loss: 0.01228
Epoch: 14151, Training Loss: 0.01296
Epoch: 14151, Training Loss: 0.01297
Epoch: 14151, Training Loss: 0.01605
Epoch: 14152, Training Loss: 0.01228
Epoch: 14152, Training Loss: 0.01295
Epoch: 14152, Training Loss: 0.01297
Epoch: 14152, Training Loss: 0.01605
Epoch: 14153, Training Loss: 0.01228
Epoch: 14153, Training Loss: 0.01295
Epoch: 14153, Training Loss: 0.01297
Epoch: 14153, Training Loss: 0.01604
Epoch: 14154, Training Loss: 0.01228
Epoch: 14154, Training Loss: 0.01295
Epoch: 14154, Training Loss: 0.01297
Epoch: 14154, Training Loss: 0.01604
Epoch: 14155, Training Loss: 0.01228
Epoch: 14155, Training Loss: 0.01295
Epoch: 14155, Training Loss: 0.01297
Epoch: 14155, Training Loss: 0.01604
Epoch: 14156, Training Loss: 0.01228
Epoch: 14156, Training Loss: 0.01295
Epoch: 14156, Training Loss: 0.01297
Epoch: 14156, Training Loss: 0.01604
Epoch: 14157, Training Loss: 0.01228
Epoch: 14157, Training Loss: 0.01295
Epoch: 14157, Training Loss: 0.01297
Epoch: 14157, Training Loss: 0.01604
Epoch: 14158, Training Loss: 0.01228
Epoch: 14158, Training Loss: 0.01295
Epoch: 14158, Training Loss: 0.01297
Epoch: 14158, Training Loss: 0.01604
Epoch: 14159, Training Loss: 0.01228
Epoch: 14159, Training Loss: 0.01295
Epoch: 14159, Training Loss: 0.01297
Epoch: 14159, Training Loss: 0.01604
Epoch: 14160, Training Loss: 0.01228
Epoch: 14160, Training Loss: 0.01295
Epoch: 14160, Training Loss: 0.01297
Epoch: 14160, Training Loss: 0.01604
Epoch: 14161, Training Loss: 0.01228
Epoch: 14161, Training Loss: 0.01295
Epoch: 14161, Training Loss: 0.01297
Epoch: 14161, Training Loss: 0.01604
Epoch: 14162, Training Loss: 0.01228
Epoch: 14162, Training Loss: 0.01295
Epoch: 14162, Training Loss: 0.01297
Epoch: 14162, Training Loss: 0.01604
Epoch: 14163, Training Loss: 0.01228
Epoch: 14163, Training Loss: 0.01295
Epoch: 14163, Training Loss: 0.01297
Epoch: 14163, Training Loss: 0.01604
Epoch: 14164, Training Loss: 0.01227
Epoch: 14164, Training Loss: 0.01295
Epoch: 14164, Training Loss: 0.01297
Epoch: 14164, Training Loss: 0.01604
Epoch: 14165, Training Loss: 0.01227
Epoch: 14165, Training Loss: 0.01295
Epoch: 14165, Training Loss: 0.01297
Epoch: 14165, Training Loss: 0.01604
Epoch: 14166, Training Loss: 0.01227
Epoch: 14166, Training Loss: 0.01295
Epoch: 14166, Training Loss: 0.01296
Epoch: 14166, Training Loss: 0.01603
Epoch: 14167, Training Loss: 0.01227
Epoch: 14167, Training Loss: 0.01295
Epoch: 14167, Training Loss: 0.01296
Epoch: 14167, Training Loss: 0.01603
Epoch: 14168, Training Loss: 0.01227
Epoch: 14168, Training Loss: 0.01295
Epoch: 14168, Training Loss: 0.01296
Epoch: 14168, Training Loss: 0.01603
Epoch: 14169, Training Loss: 0.01227
Epoch: 14169, Training Loss: 0.01294
Epoch: 14169, Training Loss: 0.01296
Epoch: 14169, Training Loss: 0.01603
Epoch: 14170, Training Loss: 0.01227
Epoch: 14170, Training Loss: 0.01294
Epoch: 14170, Training Loss: 0.01296
Epoch: 14170, Training Loss: 0.01603
Epoch: 14171, Training Loss: 0.01227
Epoch: 14171, Training Loss: 0.01294
Epoch: 14171, Training Loss: 0.01296
Epoch: 14171, Training Loss: 0.01603
Epoch: 14172, Training Loss: 0.01227
Epoch: 14172, Training Loss: 0.01294
Epoch: 14172, Training Loss: 0.01296
Epoch: 14172, Training Loss: 0.01603
Epoch: 14173, Training Loss: 0.01227
Epoch: 14173, Training Loss: 0.01294
Epoch: 14173, Training Loss: 0.01296
Epoch: 14173, Training Loss: 0.01603
Epoch: 14174, Training Loss: 0.01227
Epoch: 14174, Training Loss: 0.01294
Epoch: 14174, Training Loss: 0.01296
Epoch: 14174, Training Loss: 0.01603
Epoch: 14175, Training Loss: 0.01227
Epoch: 14175, Training Loss: 0.01294
Epoch: 14175, Training Loss: 0.01296
Epoch: 14175, Training Loss: 0.01603
Epoch: 14176, Training Loss: 0.01227
Epoch: 14176, Training Loss: 0.01294
Epoch: 14176, Training Loss: 0.01296
Epoch: 14176, Training Loss: 0.01603
Epoch: 14177, Training Loss: 0.01227
Epoch: 14177, Training Loss: 0.01294
Epoch: 14177, Training Loss: 0.01296
Epoch: 14177, Training Loss: 0.01603
Epoch: 14178, Training Loss: 0.01227
Epoch: 14178, Training Loss: 0.01294
Epoch: 14178, Training Loss: 0.01296
Epoch: 14178, Training Loss: 0.01603
Epoch: 14179, Training Loss: 0.01227
Epoch: 14179, Training Loss: 0.01294
Epoch: 14179, Training Loss: 0.01296
Epoch: 14179, Training Loss: 0.01603
Epoch: 14180, Training Loss: 0.01227
Epoch: 14180, Training Loss: 0.01294
Epoch: 14180, Training Loss: 0.01296
Epoch: 14180, Training Loss: 0.01602
Epoch: 14181, Training Loss: 0.01227
Epoch: 14181, Training Loss: 0.01294
Epoch: 14181, Training Loss: 0.01296
Epoch: 14181, Training Loss: 0.01602
Epoch: 14182, Training Loss: 0.01227
Epoch: 14182, Training Loss: 0.01294
Epoch: 14182, Training Loss: 0.01296
Epoch: 14182, Training Loss: 0.01602
Epoch: 14183, Training Loss: 0.01226
Epoch: 14183, Training Loss: 0.01294
Epoch: 14183, Training Loss: 0.01296
Epoch: 14183, Training Loss: 0.01602
Epoch: 14184, Training Loss: 0.01226
Epoch: 14184, Training Loss: 0.01294
Epoch: 14184, Training Loss: 0.01295
Epoch: 14184, Training Loss: 0.01602
Epoch: 14185, Training Loss: 0.01226
Epoch: 14185, Training Loss: 0.01294
Epoch: 14185, Training Loss: 0.01295
Epoch: 14185, Training Loss: 0.01602
Epoch: 14186, Training Loss: 0.01226
Epoch: 14186, Training Loss: 0.01294
Epoch: 14186, Training Loss: 0.01295
Epoch: 14186, Training Loss: 0.01602
Epoch: 14187, Training Loss: 0.01226
Epoch: 14187, Training Loss: 0.01293
Epoch: 14187, Training Loss: 0.01295
Epoch: 14187, Training Loss: 0.01602
Epoch: 14188, Training Loss: 0.01226
Epoch: 14188, Training Loss: 0.01293
Epoch: 14188, Training Loss: 0.01295
Epoch: 14188, Training Loss: 0.01602
Epoch: 14189, Training Loss: 0.01226
Epoch: 14189, Training Loss: 0.01293
Epoch: 14189, Training Loss: 0.01295
Epoch: 14189, Training Loss: 0.01602
Epoch: 14190, Training Loss: 0.01226
Epoch: 14190, Training Loss: 0.01293
Epoch: 14190, Training Loss: 0.01295
Epoch: 14190, Training Loss: 0.01602
Epoch: 14191, Training Loss: 0.01226
Epoch: 14191, Training Loss: 0.01293
Epoch: 14191, Training Loss: 0.01295
Epoch: 14191, Training Loss: 0.01602
Epoch: 14192, Training Loss: 0.01226
Epoch: 14192, Training Loss: 0.01293
Epoch: 14192, Training Loss: 0.01295
Epoch: 14192, Training Loss: 0.01602
Epoch: 14193, Training Loss: 0.01226
Epoch: 14193, Training Loss: 0.01293
Epoch: 14193, Training Loss: 0.01295
Epoch: 14193, Training Loss: 0.01602
Epoch: 14194, Training Loss: 0.01226
Epoch: 14194, Training Loss: 0.01293
Epoch: 14194, Training Loss: 0.01295
Epoch: 14194, Training Loss: 0.01601
Epoch: 14195, Training Loss: 0.01226
Epoch: 14195, Training Loss: 0.01293
Epoch: 14195, Training Loss: 0.01295
Epoch: 14195, Training Loss: 0.01601
Epoch: 14196, Training Loss: 0.01226
Epoch: 14196, Training Loss: 0.01293
Epoch: 14196, Training Loss: 0.01295
Epoch: 14196, Training Loss: 0.01601
Epoch: 14197, Training Loss: 0.01226
Epoch: 14197, Training Loss: 0.01293
Epoch: 14197, Training Loss: 0.01295
Epoch: 14197, Training Loss: 0.01601
Epoch: 14198, Training Loss: 0.01226
Epoch: 14198, Training Loss: 0.01293
Epoch: 14198, Training Loss: 0.01295
Epoch: 14198, Training Loss: 0.01601
Epoch: 14199, Training Loss: 0.01226
Epoch: 14199, Training Loss: 0.01293
Epoch: 14199, Training Loss: 0.01295
Epoch: 14199, Training Loss: 0.01601
Epoch: 14200, Training Loss: 0.01226
Epoch: 14200, Training Loss: 0.01293
Epoch: 14200, Training Loss: 0.01295
Epoch: 14200, Training Loss: 0.01601
Epoch: 14201, Training Loss: 0.01226
Epoch: 14201, Training Loss: 0.01293
Epoch: 14201, Training Loss: 0.01294
Epoch: 14201, Training Loss: 0.01601
Epoch: 14202, Training Loss: 0.01225
Epoch: 14202, Training Loss: 0.01293
Epoch: 14202, Training Loss: 0.01294
Epoch: 14202, Training Loss: 0.01601
Epoch: 14203, Training Loss: 0.01225
Epoch: 14203, Training Loss: 0.01293
Epoch: 14203, Training Loss: 0.01294
Epoch: 14203, Training Loss: 0.01601
Epoch: 14204, Training Loss: 0.01225
Epoch: 14204, Training Loss: 0.01292
Epoch: 14204, Training Loss: 0.01294
Epoch: 14204, Training Loss: 0.01601
Epoch: 14205, Training Loss: 0.01225
Epoch: 14205, Training Loss: 0.01292
Epoch: 14205, Training Loss: 0.01294
Epoch: 14205, Training Loss: 0.01601
Epoch: 14206, Training Loss: 0.01225
Epoch: 14206, Training Loss: 0.01292
Epoch: 14206, Training Loss: 0.01294
Epoch: 14206, Training Loss: 0.01601
Epoch: 14207, Training Loss: 0.01225
Epoch: 14207, Training Loss: 0.01292
Epoch: 14207, Training Loss: 0.01294
Epoch: 14207, Training Loss: 0.01601
Epoch: 14208, Training Loss: 0.01225
Epoch: 14208, Training Loss: 0.01292
Epoch: 14208, Training Loss: 0.01294
Epoch: 14208, Training Loss: 0.01600
Epoch: 14209, Training Loss: 0.01225
Epoch: 14209, Training Loss: 0.01292
Epoch: 14209, Training Loss: 0.01294
Epoch: 14209, Training Loss: 0.01600
Epoch: 14210, Training Loss: 0.01225
Epoch: 14210, Training Loss: 0.01292
Epoch: 14210, Training Loss: 0.01294
Epoch: 14210, Training Loss: 0.01600
Epoch: 14211, Training Loss: 0.01225
Epoch: 14211, Training Loss: 0.01292
Epoch: 14211, Training Loss: 0.01294
Epoch: 14211, Training Loss: 0.01600
Epoch: 14212, Training Loss: 0.01225
Epoch: 14212, Training Loss: 0.01292
Epoch: 14212, Training Loss: 0.01294
Epoch: 14212, Training Loss: 0.01600
Epoch: 14213, Training Loss: 0.01225
Epoch: 14213, Training Loss: 0.01292
Epoch: 14213, Training Loss: 0.01294
Epoch: 14213, Training Loss: 0.01600
Epoch: 14214, Training Loss: 0.01225
Epoch: 14214, Training Loss: 0.01292
Epoch: 14214, Training Loss: 0.01294
Epoch: 14214, Training Loss: 0.01600
Epoch: 14215, Training Loss: 0.01225
Epoch: 14215, Training Loss: 0.01292
Epoch: 14215, Training Loss: 0.01294
Epoch: 14215, Training Loss: 0.01600
Epoch: 14216, Training Loss: 0.01225
Epoch: 14216, Training Loss: 0.01292
Epoch: 14216, Training Loss: 0.01294
Epoch: 14216, Training Loss: 0.01600
Epoch: 14217, Training Loss: 0.01225
Epoch: 14217, Training Loss: 0.01292
Epoch: 14217, Training Loss: 0.01294
Epoch: 14217, Training Loss: 0.01600
Epoch: 14218, Training Loss: 0.01225
Epoch: 14218, Training Loss: 0.01292
Epoch: 14218, Training Loss: 0.01294
Epoch: 14218, Training Loss: 0.01600
Epoch: 14219, Training Loss: 0.01225
Epoch: 14219, Training Loss: 0.01292
Epoch: 14219, Training Loss: 0.01293
Epoch: 14219, Training Loss: 0.01600
Epoch: 14220, Training Loss: 0.01225
Epoch: 14220, Training Loss: 0.01292
Epoch: 14220, Training Loss: 0.01293
Epoch: 14220, Training Loss: 0.01600
Epoch: 14221, Training Loss: 0.01224
Epoch: 14221, Training Loss: 0.01291
Epoch: 14221, Training Loss: 0.01293
Epoch: 14221, Training Loss: 0.01600
Epoch: 14222, Training Loss: 0.01224
Epoch: 14222, Training Loss: 0.01291
Epoch: 14222, Training Loss: 0.01293
Epoch: 14222, Training Loss: 0.01599
Epoch: 14223, Training Loss: 0.01224
Epoch: 14223, Training Loss: 0.01291
Epoch: 14223, Training Loss: 0.01293
Epoch: 14223, Training Loss: 0.01599
Epoch: 14224, Training Loss: 0.01224
Epoch: 14224, Training Loss: 0.01291
Epoch: 14224, Training Loss: 0.01293
Epoch: 14224, Training Loss: 0.01599
Epoch: 14225, Training Loss: 0.01224
Epoch: 14225, Training Loss: 0.01291
Epoch: 14225, Training Loss: 0.01293
Epoch: 14225, Training Loss: 0.01599
Epoch: 14226, Training Loss: 0.01224
Epoch: 14226, Training Loss: 0.01291
Epoch: 14226, Training Loss: 0.01293
Epoch: 14226, Training Loss: 0.01599
Epoch: 14227, Training Loss: 0.01224
Epoch: 14227, Training Loss: 0.01291
Epoch: 14227, Training Loss: 0.01293
Epoch: 14227, Training Loss: 0.01599
Epoch: 14228, Training Loss: 0.01224
Epoch: 14228, Training Loss: 0.01291
Epoch: 14228, Training Loss: 0.01293
Epoch: 14228, Training Loss: 0.01599
Epoch: 14229, Training Loss: 0.01224
Epoch: 14229, Training Loss: 0.01291
Epoch: 14229, Training Loss: 0.01293
Epoch: 14229, Training Loss: 0.01599
Epoch: 14230, Training Loss: 0.01224
Epoch: 14230, Training Loss: 0.01291
Epoch: 14230, Training Loss: 0.01293
Epoch: 14230, Training Loss: 0.01599
Epoch: 14231, Training Loss: 0.01224
Epoch: 14231, Training Loss: 0.01291
Epoch: 14231, Training Loss: 0.01293
Epoch: 14231, Training Loss: 0.01599
Epoch: 14232, Training Loss: 0.01224
Epoch: 14232, Training Loss: 0.01291
Epoch: 14232, Training Loss: 0.01293
Epoch: 14232, Training Loss: 0.01599
Epoch: 14233, Training Loss: 0.01224
Epoch: 14233, Training Loss: 0.01291
Epoch: 14233, Training Loss: 0.01293
Epoch: 14233, Training Loss: 0.01599
Epoch: 14234, Training Loss: 0.01224
Epoch: 14234, Training Loss: 0.01291
Epoch: 14234, Training Loss: 0.01293
Epoch: 14234, Training Loss: 0.01599
Epoch: 14235, Training Loss: 0.01224
Epoch: 14235, Training Loss: 0.01291
Epoch: 14235, Training Loss: 0.01293
Epoch: 14235, Training Loss: 0.01599
Epoch: 14236, Training Loss: 0.01224
Epoch: 14236, Training Loss: 0.01291
Epoch: 14236, Training Loss: 0.01292
Epoch: 14236, Training Loss: 0.01598
Epoch: 14237, Training Loss: 0.01224
Epoch: 14237, Training Loss: 0.01291
Epoch: 14237, Training Loss: 0.01292
Epoch: 14237, Training Loss: 0.01598
Epoch: 14238, Training Loss: 0.01224
Epoch: 14238, Training Loss: 0.01291
Epoch: 14238, Training Loss: 0.01292
Epoch: 14238, Training Loss: 0.01598
Epoch: 14239, Training Loss: 0.01224
Epoch: 14239, Training Loss: 0.01290
Epoch: 14239, Training Loss: 0.01292
Epoch: 14239, Training Loss: 0.01598
Epoch: 14240, Training Loss: 0.01223
Epoch: 14240, Training Loss: 0.01290
Epoch: 14240, Training Loss: 0.01292
Epoch: 14240, Training Loss: 0.01598
Epoch: 14241, Training Loss: 0.01223
Epoch: 14241, Training Loss: 0.01290
Epoch: 14241, Training Loss: 0.01292
Epoch: 14241, Training Loss: 0.01598
Epoch: 14242, Training Loss: 0.01223
Epoch: 14242, Training Loss: 0.01290
Epoch: 14242, Training Loss: 0.01292
Epoch: 14242, Training Loss: 0.01598
Epoch: 14243, Training Loss: 0.01223
Epoch: 14243, Training Loss: 0.01290
Epoch: 14243, Training Loss: 0.01292
Epoch: 14243, Training Loss: 0.01598
Epoch: 14244, Training Loss: 0.01223
Epoch: 14244, Training Loss: 0.01290
Epoch: 14244, Training Loss: 0.01292
Epoch: 14244, Training Loss: 0.01598
Epoch: 14245, Training Loss: 0.01223
Epoch: 14245, Training Loss: 0.01290
Epoch: 14245, Training Loss: 0.01292
Epoch: 14245, Training Loss: 0.01598
Epoch: 14246, Training Loss: 0.01223
Epoch: 14246, Training Loss: 0.01290
Epoch: 14246, Training Loss: 0.01292
Epoch: 14246, Training Loss: 0.01598
Epoch: 14247, Training Loss: 0.01223
Epoch: 14247, Training Loss: 0.01290
Epoch: 14247, Training Loss: 0.01292
Epoch: 14247, Training Loss: 0.01598
Epoch: 14248, Training Loss: 0.01223
Epoch: 14248, Training Loss: 0.01290
Epoch: 14248, Training Loss: 0.01292
Epoch: 14248, Training Loss: 0.01598
Epoch: 14249, Training Loss: 0.01223
Epoch: 14249, Training Loss: 0.01290
Epoch: 14249, Training Loss: 0.01292
Epoch: 14249, Training Loss: 0.01598
Epoch: 14250, Training Loss: 0.01223
Epoch: 14250, Training Loss: 0.01290
Epoch: 14250, Training Loss: 0.01292
Epoch: 14250, Training Loss: 0.01597
Epoch: 14251, Training Loss: 0.01223
Epoch: 14251, Training Loss: 0.01290
Epoch: 14251, Training Loss: 0.01292
Epoch: 14251, Training Loss: 0.01597
Epoch: 14252, Training Loss: 0.01223
Epoch: 14252, Training Loss: 0.01290
Epoch: 14252, Training Loss: 0.01292
Epoch: 14252, Training Loss: 0.01597
Epoch: 14253, Training Loss: 0.01223
Epoch: 14253, Training Loss: 0.01290
Epoch: 14253, Training Loss: 0.01292
Epoch: 14253, Training Loss: 0.01597
Epoch: 14254, Training Loss: 0.01223
Epoch: 14254, Training Loss: 0.01290
Epoch: 14254, Training Loss: 0.01291
Epoch: 14254, Training Loss: 0.01597
Epoch: 14255, Training Loss: 0.01223
Epoch: 14255, Training Loss: 0.01290
Epoch: 14255, Training Loss: 0.01291
Epoch: 14255, Training Loss: 0.01597
Epoch: 14256, Training Loss: 0.01223
Epoch: 14256, Training Loss: 0.01290
Epoch: 14256, Training Loss: 0.01291
Epoch: 14256, Training Loss: 0.01597
Epoch: 14257, Training Loss: 0.01223
Epoch: 14257, Training Loss: 0.01289
Epoch: 14257, Training Loss: 0.01291
Epoch: 14257, Training Loss: 0.01597
Epoch: 14258, Training Loss: 0.01223
Epoch: 14258, Training Loss: 0.01289
Epoch: 14258, Training Loss: 0.01291
Epoch: 14258, Training Loss: 0.01597
Epoch: 14259, Training Loss: 0.01222
Epoch: 14259, Training Loss: 0.01289
Epoch: 14259, Training Loss: 0.01291
Epoch: 14259, Training Loss: 0.01597
Epoch: 14260, Training Loss: 0.01222
Epoch: 14260, Training Loss: 0.01289
Epoch: 14260, Training Loss: 0.01291
Epoch: 14260, Training Loss: 0.01597
Epoch: 14261, Training Loss: 0.01222
Epoch: 14261, Training Loss: 0.01289
Epoch: 14261, Training Loss: 0.01291
Epoch: 14261, Training Loss: 0.01597
Epoch: 14262, Training Loss: 0.01222
Epoch: 14262, Training Loss: 0.01289
Epoch: 14262, Training Loss: 0.01291
Epoch: 14262, Training Loss: 0.01597
Epoch: 14263, Training Loss: 0.01222
Epoch: 14263, Training Loss: 0.01289
Epoch: 14263, Training Loss: 0.01291
Epoch: 14263, Training Loss: 0.01596
Epoch: 14264, Training Loss: 0.01222
Epoch: 14264, Training Loss: 0.01289
Epoch: 14264, Training Loss: 0.01291
Epoch: 14264, Training Loss: 0.01596
Epoch: 14265, Training Loss: 0.01222
Epoch: 14265, Training Loss: 0.01289
Epoch: 14265, Training Loss: 0.01291
Epoch: 14265, Training Loss: 0.01596
Epoch: 14266, Training Loss: 0.01222
Epoch: 14266, Training Loss: 0.01289
Epoch: 14266, Training Loss: 0.01291
Epoch: 14266, Training Loss: 0.01596
Epoch: 14267, Training Loss: 0.01222
Epoch: 14267, Training Loss: 0.01289
Epoch: 14267, Training Loss: 0.01291
Epoch: 14267, Training Loss: 0.01596
Epoch: 14268, Training Loss: 0.01222
Epoch: 14268, Training Loss: 0.01289
Epoch: 14268, Training Loss: 0.01291
Epoch: 14268, Training Loss: 0.01596
Epoch: 14269, Training Loss: 0.01222
Epoch: 14269, Training Loss: 0.01289
Epoch: 14269, Training Loss: 0.01291
Epoch: 14269, Training Loss: 0.01596
Epoch: 14270, Training Loss: 0.01222
Epoch: 14270, Training Loss: 0.01289
Epoch: 14270, Training Loss: 0.01291
Epoch: 14270, Training Loss: 0.01596
Epoch: 14271, Training Loss: 0.01222
Epoch: 14271, Training Loss: 0.01289
Epoch: 14271, Training Loss: 0.01290
Epoch: 14271, Training Loss: 0.01596
Epoch: 14272, Training Loss: 0.01222
Epoch: 14272, Training Loss: 0.01289
Epoch: 14272, Training Loss: 0.01290
Epoch: 14272, Training Loss: 0.01596
Epoch: 14273, Training Loss: 0.01222
Epoch: 14273, Training Loss: 0.01289
Epoch: 14273, Training Loss: 0.01290
Epoch: 14273, Training Loss: 0.01596
Epoch: 14274, Training Loss: 0.01222
Epoch: 14274, Training Loss: 0.01288
Epoch: 14274, Training Loss: 0.01290
Epoch: 14274, Training Loss: 0.01596
Epoch: 14275, Training Loss: 0.01222
Epoch: 14275, Training Loss: 0.01288
Epoch: 14275, Training Loss: 0.01290
Epoch: 14275, Training Loss: 0.01596
Epoch: 14276, Training Loss: 0.01222
Epoch: 14276, Training Loss: 0.01288
Epoch: 14276, Training Loss: 0.01290
Epoch: 14276, Training Loss: 0.01596
Epoch: 14277, Training Loss: 0.01222
Epoch: 14277, Training Loss: 0.01288
Epoch: 14277, Training Loss: 0.01290
Epoch: 14277, Training Loss: 0.01595
Epoch: 14278, Training Loss: 0.01222
Epoch: 14278, Training Loss: 0.01288
Epoch: 14278, Training Loss: 0.01290
Epoch: 14278, Training Loss: 0.01595
Epoch: 14279, Training Loss: 0.01221
Epoch: 14279, Training Loss: 0.01288
Epoch: 14279, Training Loss: 0.01290
Epoch: 14279, Training Loss: 0.01595
Epoch: 14280, Training Loss: 0.01221
Epoch: 14280, Training Loss: 0.01288
Epoch: 14280, Training Loss: 0.01290
Epoch: 14280, Training Loss: 0.01595
Epoch: 14281, Training Loss: 0.01221
Epoch: 14281, Training Loss: 0.01288
Epoch: 14281, Training Loss: 0.01290
Epoch: 14281, Training Loss: 0.01595
Epoch: 14282, Training Loss: 0.01221
Epoch: 14282, Training Loss: 0.01288
Epoch: 14282, Training Loss: 0.01290
Epoch: 14282, Training Loss: 0.01595
Epoch: 14283, Training Loss: 0.01221
Epoch: 14283, Training Loss: 0.01288
Epoch: 14283, Training Loss: 0.01290
Epoch: 14283, Training Loss: 0.01595
Epoch: 14284, Training Loss: 0.01221
Epoch: 14284, Training Loss: 0.01288
Epoch: 14284, Training Loss: 0.01290
Epoch: 14284, Training Loss: 0.01595
Epoch: 14285, Training Loss: 0.01221
Epoch: 14285, Training Loss: 0.01288
Epoch: 14285, Training Loss: 0.01290
Epoch: 14285, Training Loss: 0.01595
Epoch: 14286, Training Loss: 0.01221
Epoch: 14286, Training Loss: 0.01288
Epoch: 14286, Training Loss: 0.01290
Epoch: 14286, Training Loss: 0.01595
Epoch: 14287, Training Loss: 0.01221
Epoch: 14287, Training Loss: 0.01288
Epoch: 14287, Training Loss: 0.01290
Epoch: 14287, Training Loss: 0.01595
Epoch: 14288, Training Loss: 0.01221
Epoch: 14288, Training Loss: 0.01288
Epoch: 14288, Training Loss: 0.01290
Epoch: 14288, Training Loss: 0.01595
Epoch: 14289, Training Loss: 0.01221
Epoch: 14289, Training Loss: 0.01288
Epoch: 14289, Training Loss: 0.01289
Epoch: 14289, Training Loss: 0.01595
Epoch: 14290, Training Loss: 0.01221
Epoch: 14290, Training Loss: 0.01288
Epoch: 14290, Training Loss: 0.01289
Epoch: 14290, Training Loss: 0.01595
Epoch: 14291, Training Loss: 0.01221
Epoch: 14291, Training Loss: 0.01288
Epoch: 14291, Training Loss: 0.01289
Epoch: 14291, Training Loss: 0.01594
Epoch: 14292, Training Loss: 0.01221
Epoch: 14292, Training Loss: 0.01287
Epoch: 14292, Training Loss: 0.01289
Epoch: 14292, Training Loss: 0.01594
Epoch: 14293, Training Loss: 0.01221
Epoch: 14293, Training Loss: 0.01287
Epoch: 14293, Training Loss: 0.01289
Epoch: 14293, Training Loss: 0.01594
Epoch: 14294, Training Loss: 0.01221
Epoch: 14294, Training Loss: 0.01287
Epoch: 14294, Training Loss: 0.01289
Epoch: 14294, Training Loss: 0.01594
Epoch: 14295, Training Loss: 0.01221
Epoch: 14295, Training Loss: 0.01287
Epoch: 14295, Training Loss: 0.01289
Epoch: 14295, Training Loss: 0.01594
Epoch: 14296, Training Loss: 0.01221
Epoch: 14296, Training Loss: 0.01287
Epoch: 14296, Training Loss: 0.01289
Epoch: 14296, Training Loss: 0.01594
Epoch: 14297, Training Loss: 0.01221
Epoch: 14297, Training Loss: 0.01287
Epoch: 14297, Training Loss: 0.01289
Epoch: 14297, Training Loss: 0.01594
Epoch: 14298, Training Loss: 0.01220
Epoch: 14298, Training Loss: 0.01287
Epoch: 14298, Training Loss: 0.01289
Epoch: 14298, Training Loss: 0.01594
Epoch: 14299, Training Loss: 0.01220
Epoch: 14299, Training Loss: 0.01287
Epoch: 14299, Training Loss: 0.01289
Epoch: 14299, Training Loss: 0.01594
Epoch: 14300, Training Loss: 0.01220
Epoch: 14300, Training Loss: 0.01287
Epoch: 14300, Training Loss: 0.01289
Epoch: 14300, Training Loss: 0.01594
Epoch: 14301, Training Loss: 0.01220
Epoch: 14301, Training Loss: 0.01287
Epoch: 14301, Training Loss: 0.01289
Epoch: 14301, Training Loss: 0.01594
Epoch: 14302, Training Loss: 0.01220
Epoch: 14302, Training Loss: 0.01287
Epoch: 14302, Training Loss: 0.01289
Epoch: 14302, Training Loss: 0.01594
Epoch: 14303, Training Loss: 0.01220
Epoch: 14303, Training Loss: 0.01287
Epoch: 14303, Training Loss: 0.01289
Epoch: 14303, Training Loss: 0.01594
Epoch: 14304, Training Loss: 0.01220
Epoch: 14304, Training Loss: 0.01287
Epoch: 14304, Training Loss: 0.01289
Epoch: 14304, Training Loss: 0.01594
Epoch: 14305, Training Loss: 0.01220
Epoch: 14305, Training Loss: 0.01287
Epoch: 14305, Training Loss: 0.01289
Epoch: 14305, Training Loss: 0.01593
Epoch: 14306, Training Loss: 0.01220
Epoch: 14306, Training Loss: 0.01287
Epoch: 14306, Training Loss: 0.01289
Epoch: 14306, Training Loss: 0.01593
Epoch: 14307, Training Loss: 0.01220
Epoch: 14307, Training Loss: 0.01287
Epoch: 14307, Training Loss: 0.01288
Epoch: 14307, Training Loss: 0.01593
Epoch: 14308, Training Loss: 0.01220
Epoch: 14308, Training Loss: 0.01287
Epoch: 14308, Training Loss: 0.01288
Epoch: 14308, Training Loss: 0.01593
Epoch: 14309, Training Loss: 0.01220
Epoch: 14309, Training Loss: 0.01287
Epoch: 14309, Training Loss: 0.01288
Epoch: 14309, Training Loss: 0.01593
Epoch: 14310, Training Loss: 0.01220
Epoch: 14310, Training Loss: 0.01286
Epoch: 14310, Training Loss: 0.01288
Epoch: 14310, Training Loss: 0.01593
Epoch: 14311, Training Loss: 0.01220
Epoch: 14311, Training Loss: 0.01286
Epoch: 14311, Training Loss: 0.01288
Epoch: 14311, Training Loss: 0.01593
Epoch: 14312, Training Loss: 0.01220
Epoch: 14312, Training Loss: 0.01286
Epoch: 14312, Training Loss: 0.01288
Epoch: 14312, Training Loss: 0.01593
Epoch: 14313, Training Loss: 0.01220
Epoch: 14313, Training Loss: 0.01286
Epoch: 14313, Training Loss: 0.01288
Epoch: 14313, Training Loss: 0.01593
Epoch: 14314, Training Loss: 0.01220
Epoch: 14314, Training Loss: 0.01286
Epoch: 14314, Training Loss: 0.01288
Epoch: 14314, Training Loss: 0.01593
Epoch: 14315, Training Loss: 0.01220
Epoch: 14315, Training Loss: 0.01286
Epoch: 14315, Training Loss: 0.01288
Epoch: 14315, Training Loss: 0.01593
Epoch: 14316, Training Loss: 0.01220
Epoch: 14316, Training Loss: 0.01286
Epoch: 14316, Training Loss: 0.01288
Epoch: 14316, Training Loss: 0.01593
Epoch: 14317, Training Loss: 0.01219
Epoch: 14317, Training Loss: 0.01286
Epoch: 14317, Training Loss: 0.01288
Epoch: 14317, Training Loss: 0.01593
Epoch: 14318, Training Loss: 0.01219
Epoch: 14318, Training Loss: 0.01286
Epoch: 14318, Training Loss: 0.01288
Epoch: 14318, Training Loss: 0.01593
Epoch: 14319, Training Loss: 0.01219
Epoch: 14319, Training Loss: 0.01286
Epoch: 14319, Training Loss: 0.01288
Epoch: 14319, Training Loss: 0.01592
Epoch: 14320, Training Loss: 0.01219
Epoch: 14320, Training Loss: 0.01286
Epoch: 14320, Training Loss: 0.01288
Epoch: 14320, Training Loss: 0.01592
Epoch: 14321, Training Loss: 0.01219
Epoch: 14321, Training Loss: 0.01286
Epoch: 14321, Training Loss: 0.01288
Epoch: 14321, Training Loss: 0.01592
...
Epoch: 14564, Training Loss: 0.01210
[output truncated: four loss lines per epoch, one for each of the four XOR training samples; over epochs 14320-14564 the per-sample losses decrease very slowly, from about (0.01219, 0.01286, 0.01288, 0.01592) to about (0.01210, 0.01275, 0.01277, 0.01575)]
Epoch: 14564, Training Loss: 0.01272
Epoch: 14564, Training Loss: 0.01274
Epoch: 14564, Training Loss: 0.01575
Epoch: 14565, Training Loss: 0.01207
Epoch: 14565, Training Loss: 0.01272
Epoch: 14565, Training Loss: 0.01274
Epoch: 14565, Training Loss: 0.01575
Epoch: 14566, Training Loss: 0.01207
Epoch: 14566, Training Loss: 0.01272
Epoch: 14566, Training Loss: 0.01274
Epoch: 14566, Training Loss: 0.01575
Epoch: 14567, Training Loss: 0.01207
Epoch: 14567, Training Loss: 0.01272
Epoch: 14567, Training Loss: 0.01274
Epoch: 14567, Training Loss: 0.01575
Epoch: 14568, Training Loss: 0.01207
Epoch: 14568, Training Loss: 0.01272
Epoch: 14568, Training Loss: 0.01274
Epoch: 14568, Training Loss: 0.01575
Epoch: 14569, Training Loss: 0.01207
Epoch: 14569, Training Loss: 0.01272
Epoch: 14569, Training Loss: 0.01274
Epoch: 14569, Training Loss: 0.01575
Epoch: 14570, Training Loss: 0.01207
Epoch: 14570, Training Loss: 0.01272
Epoch: 14570, Training Loss: 0.01274
Epoch: 14570, Training Loss: 0.01575
Epoch: 14571, Training Loss: 0.01207
Epoch: 14571, Training Loss: 0.01272
Epoch: 14571, Training Loss: 0.01274
Epoch: 14571, Training Loss: 0.01575
Epoch: 14572, Training Loss: 0.01207
Epoch: 14572, Training Loss: 0.01272
Epoch: 14572, Training Loss: 0.01274
Epoch: 14572, Training Loss: 0.01575
Epoch: 14573, Training Loss: 0.01206
Epoch: 14573, Training Loss: 0.01272
Epoch: 14573, Training Loss: 0.01274
Epoch: 14573, Training Loss: 0.01575
Epoch: 14574, Training Loss: 0.01206
Epoch: 14574, Training Loss: 0.01272
Epoch: 14574, Training Loss: 0.01274
Epoch: 14574, Training Loss: 0.01575
Epoch: 14575, Training Loss: 0.01206
Epoch: 14575, Training Loss: 0.01272
Epoch: 14575, Training Loss: 0.01274
Epoch: 14575, Training Loss: 0.01575
Epoch: 14576, Training Loss: 0.01206
Epoch: 14576, Training Loss: 0.01272
Epoch: 14576, Training Loss: 0.01273
Epoch: 14576, Training Loss: 0.01575
Epoch: 14577, Training Loss: 0.01206
Epoch: 14577, Training Loss: 0.01272
Epoch: 14577, Training Loss: 0.01273
Epoch: 14577, Training Loss: 0.01574
Epoch: 14578, Training Loss: 0.01206
Epoch: 14578, Training Loss: 0.01272
Epoch: 14578, Training Loss: 0.01273
Epoch: 14578, Training Loss: 0.01574
Epoch: 14579, Training Loss: 0.01206
Epoch: 14579, Training Loss: 0.01272
Epoch: 14579, Training Loss: 0.01273
Epoch: 14579, Training Loss: 0.01574
Epoch: 14580, Training Loss: 0.01206
Epoch: 14580, Training Loss: 0.01271
Epoch: 14580, Training Loss: 0.01273
Epoch: 14580, Training Loss: 0.01574
Epoch: 14581, Training Loss: 0.01206
Epoch: 14581, Training Loss: 0.01271
Epoch: 14581, Training Loss: 0.01273
Epoch: 14581, Training Loss: 0.01574
Epoch: 14582, Training Loss: 0.01206
Epoch: 14582, Training Loss: 0.01271
Epoch: 14582, Training Loss: 0.01273
Epoch: 14582, Training Loss: 0.01574
Epoch: 14583, Training Loss: 0.01206
Epoch: 14583, Training Loss: 0.01271
Epoch: 14583, Training Loss: 0.01273
Epoch: 14583, Training Loss: 0.01574
Epoch: 14584, Training Loss: 0.01206
Epoch: 14584, Training Loss: 0.01271
Epoch: 14584, Training Loss: 0.01273
Epoch: 14584, Training Loss: 0.01574
Epoch: 14585, Training Loss: 0.01206
Epoch: 14585, Training Loss: 0.01271
Epoch: 14585, Training Loss: 0.01273
Epoch: 14585, Training Loss: 0.01574
Epoch: 14586, Training Loss: 0.01206
Epoch: 14586, Training Loss: 0.01271
Epoch: 14586, Training Loss: 0.01273
Epoch: 14586, Training Loss: 0.01574
Epoch: 14587, Training Loss: 0.01206
Epoch: 14587, Training Loss: 0.01271
Epoch: 14587, Training Loss: 0.01273
Epoch: 14587, Training Loss: 0.01574
Epoch: 14588, Training Loss: 0.01206
Epoch: 14588, Training Loss: 0.01271
Epoch: 14588, Training Loss: 0.01273
Epoch: 14588, Training Loss: 0.01574
Epoch: 14589, Training Loss: 0.01206
Epoch: 14589, Training Loss: 0.01271
Epoch: 14589, Training Loss: 0.01273
Epoch: 14589, Training Loss: 0.01574
Epoch: 14590, Training Loss: 0.01206
Epoch: 14590, Training Loss: 0.01271
Epoch: 14590, Training Loss: 0.01273
Epoch: 14590, Training Loss: 0.01574
Epoch: 14591, Training Loss: 0.01206
Epoch: 14591, Training Loss: 0.01271
Epoch: 14591, Training Loss: 0.01273
Epoch: 14591, Training Loss: 0.01573
Epoch: 14592, Training Loss: 0.01206
Epoch: 14592, Training Loss: 0.01271
Epoch: 14592, Training Loss: 0.01273
Epoch: 14592, Training Loss: 0.01573
Epoch: 14593, Training Loss: 0.01205
Epoch: 14593, Training Loss: 0.01271
Epoch: 14593, Training Loss: 0.01273
Epoch: 14593, Training Loss: 0.01573
Epoch: 14594, Training Loss: 0.01205
Epoch: 14594, Training Loss: 0.01271
Epoch: 14594, Training Loss: 0.01273
Epoch: 14594, Training Loss: 0.01573
Epoch: 14595, Training Loss: 0.01205
Epoch: 14595, Training Loss: 0.01271
Epoch: 14595, Training Loss: 0.01272
Epoch: 14595, Training Loss: 0.01573
Epoch: 14596, Training Loss: 0.01205
Epoch: 14596, Training Loss: 0.01271
Epoch: 14596, Training Loss: 0.01272
Epoch: 14596, Training Loss: 0.01573
Epoch: 14597, Training Loss: 0.01205
Epoch: 14597, Training Loss: 0.01271
Epoch: 14597, Training Loss: 0.01272
Epoch: 14597, Training Loss: 0.01573
Epoch: 14598, Training Loss: 0.01205
Epoch: 14598, Training Loss: 0.01271
Epoch: 14598, Training Loss: 0.01272
Epoch: 14598, Training Loss: 0.01573
Epoch: 14599, Training Loss: 0.01205
Epoch: 14599, Training Loss: 0.01270
Epoch: 14599, Training Loss: 0.01272
Epoch: 14599, Training Loss: 0.01573
Epoch: 14600, Training Loss: 0.01205
Epoch: 14600, Training Loss: 0.01270
Epoch: 14600, Training Loss: 0.01272
Epoch: 14600, Training Loss: 0.01573
Epoch: 14601, Training Loss: 0.01205
Epoch: 14601, Training Loss: 0.01270
Epoch: 14601, Training Loss: 0.01272
Epoch: 14601, Training Loss: 0.01573
Epoch: 14602, Training Loss: 0.01205
Epoch: 14602, Training Loss: 0.01270
Epoch: 14602, Training Loss: 0.01272
Epoch: 14602, Training Loss: 0.01573
Epoch: 14603, Training Loss: 0.01205
Epoch: 14603, Training Loss: 0.01270
Epoch: 14603, Training Loss: 0.01272
Epoch: 14603, Training Loss: 0.01573
Epoch: 14604, Training Loss: 0.01205
Epoch: 14604, Training Loss: 0.01270
Epoch: 14604, Training Loss: 0.01272
Epoch: 14604, Training Loss: 0.01573
Epoch: 14605, Training Loss: 0.01205
Epoch: 14605, Training Loss: 0.01270
Epoch: 14605, Training Loss: 0.01272
Epoch: 14605, Training Loss: 0.01573
Epoch: 14606, Training Loss: 0.01205
Epoch: 14606, Training Loss: 0.01270
Epoch: 14606, Training Loss: 0.01272
Epoch: 14606, Training Loss: 0.01572
Epoch: 14607, Training Loss: 0.01205
Epoch: 14607, Training Loss: 0.01270
Epoch: 14607, Training Loss: 0.01272
Epoch: 14607, Training Loss: 0.01572
Epoch: 14608, Training Loss: 0.01205
Epoch: 14608, Training Loss: 0.01270
Epoch: 14608, Training Loss: 0.01272
Epoch: 14608, Training Loss: 0.01572
Epoch: 14609, Training Loss: 0.01205
Epoch: 14609, Training Loss: 0.01270
Epoch: 14609, Training Loss: 0.01272
Epoch: 14609, Training Loss: 0.01572
Epoch: 14610, Training Loss: 0.01205
Epoch: 14610, Training Loss: 0.01270
Epoch: 14610, Training Loss: 0.01272
Epoch: 14610, Training Loss: 0.01572
Epoch: 14611, Training Loss: 0.01205
Epoch: 14611, Training Loss: 0.01270
Epoch: 14611, Training Loss: 0.01272
Epoch: 14611, Training Loss: 0.01572
Epoch: 14612, Training Loss: 0.01205
Epoch: 14612, Training Loss: 0.01270
Epoch: 14612, Training Loss: 0.01272
Epoch: 14612, Training Loss: 0.01572
Epoch: 14613, Training Loss: 0.01204
Epoch: 14613, Training Loss: 0.01270
Epoch: 14613, Training Loss: 0.01271
Epoch: 14613, Training Loss: 0.01572
Epoch: 14614, Training Loss: 0.01204
Epoch: 14614, Training Loss: 0.01270
Epoch: 14614, Training Loss: 0.01271
Epoch: 14614, Training Loss: 0.01572
Epoch: 14615, Training Loss: 0.01204
Epoch: 14615, Training Loss: 0.01270
Epoch: 14615, Training Loss: 0.01271
Epoch: 14615, Training Loss: 0.01572
Epoch: 14616, Training Loss: 0.01204
Epoch: 14616, Training Loss: 0.01270
Epoch: 14616, Training Loss: 0.01271
Epoch: 14616, Training Loss: 0.01572
Epoch: 14617, Training Loss: 0.01204
Epoch: 14617, Training Loss: 0.01269
Epoch: 14617, Training Loss: 0.01271
Epoch: 14617, Training Loss: 0.01572
Epoch: 14618, Training Loss: 0.01204
Epoch: 14618, Training Loss: 0.01269
Epoch: 14618, Training Loss: 0.01271
Epoch: 14618, Training Loss: 0.01572
Epoch: 14619, Training Loss: 0.01204
Epoch: 14619, Training Loss: 0.01269
Epoch: 14619, Training Loss: 0.01271
Epoch: 14619, Training Loss: 0.01572
Epoch: 14620, Training Loss: 0.01204
Epoch: 14620, Training Loss: 0.01269
Epoch: 14620, Training Loss: 0.01271
Epoch: 14620, Training Loss: 0.01571
Epoch: 14621, Training Loss: 0.01204
Epoch: 14621, Training Loss: 0.01269
Epoch: 14621, Training Loss: 0.01271
Epoch: 14621, Training Loss: 0.01571
Epoch: 14622, Training Loss: 0.01204
Epoch: 14622, Training Loss: 0.01269
Epoch: 14622, Training Loss: 0.01271
Epoch: 14622, Training Loss: 0.01571
Epoch: 14623, Training Loss: 0.01204
Epoch: 14623, Training Loss: 0.01269
Epoch: 14623, Training Loss: 0.01271
Epoch: 14623, Training Loss: 0.01571
Epoch: 14624, Training Loss: 0.01204
Epoch: 14624, Training Loss: 0.01269
Epoch: 14624, Training Loss: 0.01271
Epoch: 14624, Training Loss: 0.01571
Epoch: 14625, Training Loss: 0.01204
Epoch: 14625, Training Loss: 0.01269
Epoch: 14625, Training Loss: 0.01271
Epoch: 14625, Training Loss: 0.01571
Epoch: 14626, Training Loss: 0.01204
Epoch: 14626, Training Loss: 0.01269
Epoch: 14626, Training Loss: 0.01271
Epoch: 14626, Training Loss: 0.01571
Epoch: 14627, Training Loss: 0.01204
Epoch: 14627, Training Loss: 0.01269
Epoch: 14627, Training Loss: 0.01271
Epoch: 14627, Training Loss: 0.01571
Epoch: 14628, Training Loss: 0.01204
Epoch: 14628, Training Loss: 0.01269
Epoch: 14628, Training Loss: 0.01271
Epoch: 14628, Training Loss: 0.01571
Epoch: 14629, Training Loss: 0.01204
Epoch: 14629, Training Loss: 0.01269
Epoch: 14629, Training Loss: 0.01271
Epoch: 14629, Training Loss: 0.01571
Epoch: 14630, Training Loss: 0.01204
Epoch: 14630, Training Loss: 0.01269
Epoch: 14630, Training Loss: 0.01271
Epoch: 14630, Training Loss: 0.01571
Epoch: 14631, Training Loss: 0.01204
Epoch: 14631, Training Loss: 0.01269
Epoch: 14631, Training Loss: 0.01271
Epoch: 14631, Training Loss: 0.01571
Epoch: 14632, Training Loss: 0.01204
Epoch: 14632, Training Loss: 0.01269
Epoch: 14632, Training Loss: 0.01270
Epoch: 14632, Training Loss: 0.01571
Epoch: 14633, Training Loss: 0.01203
Epoch: 14633, Training Loss: 0.01269
Epoch: 14633, Training Loss: 0.01270
Epoch: 14633, Training Loss: 0.01571
Epoch: 14634, Training Loss: 0.01203
Epoch: 14634, Training Loss: 0.01269
Epoch: 14634, Training Loss: 0.01270
Epoch: 14634, Training Loss: 0.01571
Epoch: 14635, Training Loss: 0.01203
Epoch: 14635, Training Loss: 0.01268
Epoch: 14635, Training Loss: 0.01270
Epoch: 14635, Training Loss: 0.01570
Epoch: 14636, Training Loss: 0.01203
Epoch: 14636, Training Loss: 0.01268
Epoch: 14636, Training Loss: 0.01270
Epoch: 14636, Training Loss: 0.01570
Epoch: 14637, Training Loss: 0.01203
Epoch: 14637, Training Loss: 0.01268
Epoch: 14637, Training Loss: 0.01270
Epoch: 14637, Training Loss: 0.01570
Epoch: 14638, Training Loss: 0.01203
Epoch: 14638, Training Loss: 0.01268
Epoch: 14638, Training Loss: 0.01270
Epoch: 14638, Training Loss: 0.01570
Epoch: 14639, Training Loss: 0.01203
Epoch: 14639, Training Loss: 0.01268
Epoch: 14639, Training Loss: 0.01270
Epoch: 14639, Training Loss: 0.01570
Epoch: 14640, Training Loss: 0.01203
Epoch: 14640, Training Loss: 0.01268
Epoch: 14640, Training Loss: 0.01270
Epoch: 14640, Training Loss: 0.01570
Epoch: 14641, Training Loss: 0.01203
Epoch: 14641, Training Loss: 0.01268
Epoch: 14641, Training Loss: 0.01270
Epoch: 14641, Training Loss: 0.01570
Epoch: 14642, Training Loss: 0.01203
Epoch: 14642, Training Loss: 0.01268
Epoch: 14642, Training Loss: 0.01270
Epoch: 14642, Training Loss: 0.01570
Epoch: 14643, Training Loss: 0.01203
Epoch: 14643, Training Loss: 0.01268
Epoch: 14643, Training Loss: 0.01270
Epoch: 14643, Training Loss: 0.01570
Epoch: 14644, Training Loss: 0.01203
Epoch: 14644, Training Loss: 0.01268
Epoch: 14644, Training Loss: 0.01270
Epoch: 14644, Training Loss: 0.01570
Epoch: 14645, Training Loss: 0.01203
Epoch: 14645, Training Loss: 0.01268
Epoch: 14645, Training Loss: 0.01270
Epoch: 14645, Training Loss: 0.01570
Epoch: 14646, Training Loss: 0.01203
Epoch: 14646, Training Loss: 0.01268
Epoch: 14646, Training Loss: 0.01270
Epoch: 14646, Training Loss: 0.01570
Epoch: 14647, Training Loss: 0.01203
Epoch: 14647, Training Loss: 0.01268
Epoch: 14647, Training Loss: 0.01270
Epoch: 14647, Training Loss: 0.01570
Epoch: 14648, Training Loss: 0.01203
Epoch: 14648, Training Loss: 0.01268
Epoch: 14648, Training Loss: 0.01270
Epoch: 14648, Training Loss: 0.01570
Epoch: 14649, Training Loss: 0.01203
Epoch: 14649, Training Loss: 0.01268
Epoch: 14649, Training Loss: 0.01270
Epoch: 14649, Training Loss: 0.01570
Epoch: 14650, Training Loss: 0.01203
Epoch: 14650, Training Loss: 0.01268
Epoch: 14650, Training Loss: 0.01269
Epoch: 14650, Training Loss: 0.01569
Epoch: 14651, Training Loss: 0.01203
Epoch: 14651, Training Loss: 0.01268
Epoch: 14651, Training Loss: 0.01269
Epoch: 14651, Training Loss: 0.01569
Epoch: 14652, Training Loss: 0.01203
Epoch: 14652, Training Loss: 0.01268
Epoch: 14652, Training Loss: 0.01269
Epoch: 14652, Training Loss: 0.01569
Epoch: 14653, Training Loss: 0.01202
Epoch: 14653, Training Loss: 0.01268
Epoch: 14653, Training Loss: 0.01269
Epoch: 14653, Training Loss: 0.01569
Epoch: 14654, Training Loss: 0.01202
Epoch: 14654, Training Loss: 0.01267
Epoch: 14654, Training Loss: 0.01269
Epoch: 14654, Training Loss: 0.01569
Epoch: 14655, Training Loss: 0.01202
Epoch: 14655, Training Loss: 0.01267
Epoch: 14655, Training Loss: 0.01269
Epoch: 14655, Training Loss: 0.01569
Epoch: 14656, Training Loss: 0.01202
Epoch: 14656, Training Loss: 0.01267
Epoch: 14656, Training Loss: 0.01269
Epoch: 14656, Training Loss: 0.01569
Epoch: 14657, Training Loss: 0.01202
Epoch: 14657, Training Loss: 0.01267
Epoch: 14657, Training Loss: 0.01269
Epoch: 14657, Training Loss: 0.01569
Epoch: 14658, Training Loss: 0.01202
Epoch: 14658, Training Loss: 0.01267
Epoch: 14658, Training Loss: 0.01269
Epoch: 14658, Training Loss: 0.01569
Epoch: 14659, Training Loss: 0.01202
Epoch: 14659, Training Loss: 0.01267
Epoch: 14659, Training Loss: 0.01269
Epoch: 14659, Training Loss: 0.01569
Epoch: 14660, Training Loss: 0.01202
Epoch: 14660, Training Loss: 0.01267
Epoch: 14660, Training Loss: 0.01269
Epoch: 14660, Training Loss: 0.01569
Epoch: 14661, Training Loss: 0.01202
Epoch: 14661, Training Loss: 0.01267
Epoch: 14661, Training Loss: 0.01269
Epoch: 14661, Training Loss: 0.01569
Epoch: 14662, Training Loss: 0.01202
Epoch: 14662, Training Loss: 0.01267
Epoch: 14662, Training Loss: 0.01269
Epoch: 14662, Training Loss: 0.01569
Epoch: 14663, Training Loss: 0.01202
Epoch: 14663, Training Loss: 0.01267
Epoch: 14663, Training Loss: 0.01269
Epoch: 14663, Training Loss: 0.01569
Epoch: 14664, Training Loss: 0.01202
Epoch: 14664, Training Loss: 0.01267
Epoch: 14664, Training Loss: 0.01269
Epoch: 14664, Training Loss: 0.01568
Epoch: 14665, Training Loss: 0.01202
Epoch: 14665, Training Loss: 0.01267
Epoch: 14665, Training Loss: 0.01269
Epoch: 14665, Training Loss: 0.01568
Epoch: 14666, Training Loss: 0.01202
Epoch: 14666, Training Loss: 0.01267
Epoch: 14666, Training Loss: 0.01269
Epoch: 14666, Training Loss: 0.01568
Epoch: 14667, Training Loss: 0.01202
Epoch: 14667, Training Loss: 0.01267
Epoch: 14667, Training Loss: 0.01269
Epoch: 14667, Training Loss: 0.01568
Epoch: 14668, Training Loss: 0.01202
Epoch: 14668, Training Loss: 0.01267
Epoch: 14668, Training Loss: 0.01269
Epoch: 14668, Training Loss: 0.01568
Epoch: 14669, Training Loss: 0.01202
Epoch: 14669, Training Loss: 0.01267
Epoch: 14669, Training Loss: 0.01268
Epoch: 14669, Training Loss: 0.01568
Epoch: 14670, Training Loss: 0.01202
Epoch: 14670, Training Loss: 0.01267
Epoch: 14670, Training Loss: 0.01268
Epoch: 14670, Training Loss: 0.01568
Epoch: 14671, Training Loss: 0.01202
Epoch: 14671, Training Loss: 0.01267
Epoch: 14671, Training Loss: 0.01268
Epoch: 14671, Training Loss: 0.01568
Epoch: 14672, Training Loss: 0.01202
Epoch: 14672, Training Loss: 0.01266
Epoch: 14672, Training Loss: 0.01268
Epoch: 14672, Training Loss: 0.01568
Epoch: 14673, Training Loss: 0.01201
Epoch: 14673, Training Loss: 0.01266
Epoch: 14673, Training Loss: 0.01268
Epoch: 14673, Training Loss: 0.01568
Epoch: 14674, Training Loss: 0.01201
Epoch: 14674, Training Loss: 0.01266
Epoch: 14674, Training Loss: 0.01268
Epoch: 14674, Training Loss: 0.01568
Epoch: 14675, Training Loss: 0.01201
Epoch: 14675, Training Loss: 0.01266
Epoch: 14675, Training Loss: 0.01268
Epoch: 14675, Training Loss: 0.01568
Epoch: 14676, Training Loss: 0.01201
Epoch: 14676, Training Loss: 0.01266
Epoch: 14676, Training Loss: 0.01268
Epoch: 14676, Training Loss: 0.01568
Epoch: 14677, Training Loss: 0.01201
Epoch: 14677, Training Loss: 0.01266
Epoch: 14677, Training Loss: 0.01268
Epoch: 14677, Training Loss: 0.01568
Epoch: 14678, Training Loss: 0.01201
Epoch: 14678, Training Loss: 0.01266
Epoch: 14678, Training Loss: 0.01268
Epoch: 14678, Training Loss: 0.01568
Epoch: 14679, Training Loss: 0.01201
Epoch: 14679, Training Loss: 0.01266
Epoch: 14679, Training Loss: 0.01268
Epoch: 14679, Training Loss: 0.01567
Epoch: 14680, Training Loss: 0.01201
Epoch: 14680, Training Loss: 0.01266
Epoch: 14680, Training Loss: 0.01268
Epoch: 14680, Training Loss: 0.01567
Epoch: 14681, Training Loss: 0.01201
Epoch: 14681, Training Loss: 0.01266
Epoch: 14681, Training Loss: 0.01268
Epoch: 14681, Training Loss: 0.01567
Epoch: 14682, Training Loss: 0.01201
Epoch: 14682, Training Loss: 0.01266
Epoch: 14682, Training Loss: 0.01268
Epoch: 14682, Training Loss: 0.01567
Epoch: 14683, Training Loss: 0.01201
Epoch: 14683, Training Loss: 0.01266
Epoch: 14683, Training Loss: 0.01268
Epoch: 14683, Training Loss: 0.01567
Epoch: 14684, Training Loss: 0.01201
Epoch: 14684, Training Loss: 0.01266
Epoch: 14684, Training Loss: 0.01268
Epoch: 14684, Training Loss: 0.01567
Epoch: 14685, Training Loss: 0.01201
Epoch: 14685, Training Loss: 0.01266
Epoch: 14685, Training Loss: 0.01268
Epoch: 14685, Training Loss: 0.01567
Epoch: 14686, Training Loss: 0.01201
Epoch: 14686, Training Loss: 0.01266
Epoch: 14686, Training Loss: 0.01268
Epoch: 14686, Training Loss: 0.01567
Epoch: 14687, Training Loss: 0.01201
Epoch: 14687, Training Loss: 0.01266
Epoch: 14687, Training Loss: 0.01267
Epoch: 14687, Training Loss: 0.01567
Epoch: 14688, Training Loss: 0.01201
Epoch: 14688, Training Loss: 0.01266
Epoch: 14688, Training Loss: 0.01267
Epoch: 14688, Training Loss: 0.01567
Epoch: 14689, Training Loss: 0.01201
Epoch: 14689, Training Loss: 0.01266
Epoch: 14689, Training Loss: 0.01267
Epoch: 14689, Training Loss: 0.01567
Epoch: 14690, Training Loss: 0.01201
Epoch: 14690, Training Loss: 0.01266
Epoch: 14690, Training Loss: 0.01267
Epoch: 14690, Training Loss: 0.01567
Epoch: 14691, Training Loss: 0.01201
Epoch: 14691, Training Loss: 0.01265
Epoch: 14691, Training Loss: 0.01267
Epoch: 14691, Training Loss: 0.01567
Epoch: 14692, Training Loss: 0.01201
Epoch: 14692, Training Loss: 0.01265
Epoch: 14692, Training Loss: 0.01267
Epoch: 14692, Training Loss: 0.01567
Epoch: 14693, Training Loss: 0.01201
Epoch: 14693, Training Loss: 0.01265
Epoch: 14693, Training Loss: 0.01267
Epoch: 14693, Training Loss: 0.01567
Epoch: 14694, Training Loss: 0.01200
Epoch: 14694, Training Loss: 0.01265
Epoch: 14694, Training Loss: 0.01267
Epoch: 14694, Training Loss: 0.01566
Epoch: 14695, Training Loss: 0.01200
Epoch: 14695, Training Loss: 0.01265
Epoch: 14695, Training Loss: 0.01267
Epoch: 14695, Training Loss: 0.01566
Epoch: 14696, Training Loss: 0.01200
Epoch: 14696, Training Loss: 0.01265
Epoch: 14696, Training Loss: 0.01267
Epoch: 14696, Training Loss: 0.01566
Epoch: 14697, Training Loss: 0.01200
Epoch: 14697, Training Loss: 0.01265
Epoch: 14697, Training Loss: 0.01267
Epoch: 14697, Training Loss: 0.01566
Epoch: 14698, Training Loss: 0.01200
Epoch: 14698, Training Loss: 0.01265
Epoch: 14698, Training Loss: 0.01267
Epoch: 14698, Training Loss: 0.01566
Epoch: 14699, Training Loss: 0.01200
Epoch: 14699, Training Loss: 0.01265
Epoch: 14699, Training Loss: 0.01267
Epoch: 14699, Training Loss: 0.01566
Epoch: 14700, Training Loss: 0.01200
Epoch: 14700, Training Loss: 0.01265
Epoch: 14700, Training Loss: 0.01267
Epoch: 14700, Training Loss: 0.01566
Epoch: 14701, Training Loss: 0.01200
Epoch: 14701, Training Loss: 0.01265
Epoch: 14701, Training Loss: 0.01267
Epoch: 14701, Training Loss: 0.01566
Epoch: 14702, Training Loss: 0.01200
Epoch: 14702, Training Loss: 0.01265
Epoch: 14702, Training Loss: 0.01267
Epoch: 14702, Training Loss: 0.01566
Epoch: 14703, Training Loss: 0.01200
Epoch: 14703, Training Loss: 0.01265
Epoch: 14703, Training Loss: 0.01267
Epoch: 14703, Training Loss: 0.01566
Epoch: 14704, Training Loss: 0.01200
Epoch: 14704, Training Loss: 0.01265
Epoch: 14704, Training Loss: 0.01267
Epoch: 14704, Training Loss: 0.01566
Epoch: 14705, Training Loss: 0.01200
Epoch: 14705, Training Loss: 0.01265
Epoch: 14705, Training Loss: 0.01267
Epoch: 14705, Training Loss: 0.01566
Epoch: 14706, Training Loss: 0.01200
Epoch: 14706, Training Loss: 0.01265
Epoch: 14706, Training Loss: 0.01266
Epoch: 14706, Training Loss: 0.01566
Epoch: 14707, Training Loss: 0.01200
Epoch: 14707, Training Loss: 0.01265
Epoch: 14707, Training Loss: 0.01266
Epoch: 14707, Training Loss: 0.01566
Epoch: 14708, Training Loss: 0.01200
Epoch: 14708, Training Loss: 0.01265
Epoch: 14708, Training Loss: 0.01266
Epoch: 14708, Training Loss: 0.01565
Epoch: 14709, Training Loss: 0.01200
Epoch: 14709, Training Loss: 0.01265
Epoch: 14709, Training Loss: 0.01266
Epoch: 14709, Training Loss: 0.01565
Epoch: 14710, Training Loss: 0.01200
Epoch: 14710, Training Loss: 0.01264
Epoch: 14710, Training Loss: 0.01266
Epoch: 14710, Training Loss: 0.01565
Epoch: 14711, Training Loss: 0.01200
Epoch: 14711, Training Loss: 0.01264
Epoch: 14711, Training Loss: 0.01266
Epoch: 14711, Training Loss: 0.01565
Epoch: 14712, Training Loss: 0.01200
Epoch: 14712, Training Loss: 0.01264
Epoch: 14712, Training Loss: 0.01266
Epoch: 14712, Training Loss: 0.01565
Epoch: 14713, Training Loss: 0.01200
Epoch: 14713, Training Loss: 0.01264
Epoch: 14713, Training Loss: 0.01266
Epoch: 14713, Training Loss: 0.01565
Epoch: 14714, Training Loss: 0.01199
Epoch: 14714, Training Loss: 0.01264
Epoch: 14714, Training Loss: 0.01266
Epoch: 14714, Training Loss: 0.01565
Epoch: 14715, Training Loss: 0.01199
Epoch: 14715, Training Loss: 0.01264
Epoch: 14715, Training Loss: 0.01266
Epoch: 14715, Training Loss: 0.01565
Epoch: 14716, Training Loss: 0.01199
Epoch: 14716, Training Loss: 0.01264
Epoch: 14716, Training Loss: 0.01266
Epoch: 14716, Training Loss: 0.01565
Epoch: 14717, Training Loss: 0.01199
Epoch: 14717, Training Loss: 0.01264
Epoch: 14717, Training Loss: 0.01266
Epoch: 14717, Training Loss: 0.01565
Epoch: 14718, Training Loss: 0.01199
Epoch: 14718, Training Loss: 0.01264
Epoch: 14718, Training Loss: 0.01266
Epoch: 14718, Training Loss: 0.01565
Epoch: 14719, Training Loss: 0.01199
Epoch: 14719, Training Loss: 0.01264
Epoch: 14719, Training Loss: 0.01266
Epoch: 14719, Training Loss: 0.01565
Epoch: 14720, Training Loss: 0.01199
Epoch: 14720, Training Loss: 0.01264
Epoch: 14720, Training Loss: 0.01266
Epoch: 14720, Training Loss: 0.01565
Epoch: 14721, Training Loss: 0.01199
Epoch: 14721, Training Loss: 0.01264
Epoch: 14721, Training Loss: 0.01266
Epoch: 14721, Training Loss: 0.01565
Epoch: 14722, Training Loss: 0.01199
Epoch: 14722, Training Loss: 0.01264
Epoch: 14722, Training Loss: 0.01266
Epoch: 14722, Training Loss: 0.01565
Epoch: 14723, Training Loss: 0.01199
Epoch: 14723, Training Loss: 0.01264
Epoch: 14723, Training Loss: 0.01266
Epoch: 14723, Training Loss: 0.01564
Epoch: 14724, Training Loss: 0.01199
Epoch: 14724, Training Loss: 0.01264
Epoch: 14724, Training Loss: 0.01265
Epoch: 14724, Training Loss: 0.01564
Epoch: 14725, Training Loss: 0.01199
Epoch: 14725, Training Loss: 0.01264
Epoch: 14725, Training Loss: 0.01265
Epoch: 14725, Training Loss: 0.01564
Epoch: 14726, Training Loss: 0.01199
Epoch: 14726, Training Loss: 0.01264
Epoch: 14726, Training Loss: 0.01265
Epoch: 14726, Training Loss: 0.01564
Epoch: 14727, Training Loss: 0.01199
Epoch: 14727, Training Loss: 0.01264
Epoch: 14727, Training Loss: 0.01265
Epoch: 14727, Training Loss: 0.01564
Epoch: 14728, Training Loss: 0.01199
Epoch: 14728, Training Loss: 0.01263
Epoch: 14728, Training Loss: 0.01265
Epoch: 14728, Training Loss: 0.01564
Epoch: 14729, Training Loss: 0.01199
Epoch: 14729, Training Loss: 0.01263
Epoch: 14729, Training Loss: 0.01265
Epoch: 14729, Training Loss: 0.01564
Epoch: 14730, Training Loss: 0.01199
Epoch: 14730, Training Loss: 0.01263
Epoch: 14730, Training Loss: 0.01265
Epoch: 14730, Training Loss: 0.01564
Epoch: 14731, Training Loss: 0.01199
Epoch: 14731, Training Loss: 0.01263
Epoch: 14731, Training Loss: 0.01265
Epoch: 14731, Training Loss: 0.01564
Epoch: 14732, Training Loss: 0.01199
Epoch: 14732, Training Loss: 0.01263
Epoch: 14732, Training Loss: 0.01265
Epoch: 14732, Training Loss: 0.01564
Epoch: 14733, Training Loss: 0.01199
Epoch: 14733, Training Loss: 0.01263
Epoch: 14733, Training Loss: 0.01265
Epoch: 14733, Training Loss: 0.01564
Epoch: 14734, Training Loss: 0.01198
Epoch: 14734, Training Loss: 0.01263
Epoch: 14734, Training Loss: 0.01265
Epoch: 14734, Training Loss: 0.01564
Epoch: 14735, Training Loss: 0.01198
Epoch: 14735, Training Loss: 0.01263
Epoch: 14735, Training Loss: 0.01265
Epoch: 14735, Training Loss: 0.01564
Epoch: 14736, Training Loss: 0.01198
Epoch: 14736, Training Loss: 0.01263
Epoch: 14736, Training Loss: 0.01265
Epoch: 14736, Training Loss: 0.01564
Epoch: 14737, Training Loss: 0.01198
Epoch: 14737, Training Loss: 0.01263
Epoch: 14737, Training Loss: 0.01265
Epoch: 14737, Training Loss: 0.01564
Epoch: 14738, Training Loss: 0.01198
Epoch: 14738, Training Loss: 0.01263
Epoch: 14738, Training Loss: 0.01265
Epoch: 14738, Training Loss: 0.01563
Epoch: 14739, Training Loss: 0.01198
Epoch: 14739, Training Loss: 0.01263
Epoch: 14739, Training Loss: 0.01265
Epoch: 14739, Training Loss: 0.01563
Epoch: 14740, Training Loss: 0.01198
Epoch: 14740, Training Loss: 0.01263
Epoch: 14740, Training Loss: 0.01265
Epoch: 14740, Training Loss: 0.01563
Epoch: 14741, Training Loss: 0.01198
Epoch: 14741, Training Loss: 0.01263
Epoch: 14741, Training Loss: 0.01265
Epoch: 14741, Training Loss: 0.01563
Epoch: 14742, Training Loss: 0.01198
Epoch: 14742, Training Loss: 0.01263
Epoch: 14742, Training Loss: 0.01265
Epoch: 14742, Training Loss: 0.01563
Epoch: 14743, Training Loss: 0.01198
Epoch: 14743, Training Loss: 0.01263
Epoch: 14743, Training Loss: 0.01264
Epoch: 14743, Training Loss: 0.01563
Epoch: 14744, Training Loss: 0.01198
Epoch: 14744, Training Loss: 0.01263
Epoch: 14744, Training Loss: 0.01264
Epoch: 14744, Training Loss: 0.01563
Epoch: 14745, Training Loss: 0.01198
Epoch: 14745, Training Loss: 0.01263
Epoch: 14745, Training Loss: 0.01264
Epoch: 14745, Training Loss: 0.01563
Epoch: 14746, Training Loss: 0.01198
Epoch: 14746, Training Loss: 0.01263
Epoch: 14746, Training Loss: 0.01264
Epoch: 14746, Training Loss: 0.01563
Epoch: 14747, Training Loss: 0.01198
Epoch: 14747, Training Loss: 0.01262
Epoch: 14747, Training Loss: 0.01264
Epoch: 14747, Training Loss: 0.01563
Epoch: 14748, Training Loss: 0.01198
Epoch: 14748, Training Loss: 0.01262
Epoch: 14748, Training Loss: 0.01264
Epoch: 14748, Training Loss: 0.01563
Epoch: 14749, Training Loss: 0.01198
Epoch: 14749, Training Loss: 0.01262
Epoch: 14749, Training Loss: 0.01264
Epoch: 14749, Training Loss: 0.01563
Epoch: 14750, Training Loss: 0.01198
Epoch: 14750, Training Loss: 0.01262
Epoch: 14750, Training Loss: 0.01264
Epoch: 14750, Training Loss: 0.01563
Epoch: 14751, Training Loss: 0.01198
Epoch: 14751, Training Loss: 0.01262
Epoch: 14751, Training Loss: 0.01264
Epoch: 14751, Training Loss: 0.01563
Epoch: 14752, Training Loss: 0.01198
Epoch: 14752, Training Loss: 0.01262
Epoch: 14752, Training Loss: 0.01264
Epoch: 14752, Training Loss: 0.01563
Epoch: 14753, Training Loss: 0.01198
Epoch: 14753, Training Loss: 0.01262
Epoch: 14753, Training Loss: 0.01264
Epoch: 14753, Training Loss: 0.01562
...
Epoch: 14996, Training Loss: 0.01186
Epoch: 14996, Training Loss: 0.01249
Epoch: 14996, Training Loss: 0.01251
Epoch: 14996, Training Loss: 0.01546
Epoch: 14997, Training Loss: 0.01186
Epoch: 14997, Training Loss: 0.01249
Epoch: 14997, Training Loss: 0.01251
Epoch: 14997, Training Loss: 0.01546
Epoch: 14998, Training Loss: 0.01186
Epoch: 14998, Training Loss: 0.01249
Epoch: 14998, Training Loss: 0.01251
Epoch: 14998, Training Loss: 0.01546
Epoch: 14999, Training Loss: 0.01186
Epoch: 14999, Training Loss: 0.01249
Epoch: 14999, Training Loss: 0.01251
Epoch: 14999, Training Loss: 0.01546
Epoch: 15000, Training Loss: 0.01186
Epoch: 15000, Training Loss: 0.01249
Epoch: 15000, Training Loss: 0.01251
Epoch: 15000, Training Loss: 0.01546
Epoch: 15001, Training Loss: 0.01186
Epoch: 15001, Training Loss: 0.01249
Epoch: 15001, Training Loss: 0.01251
Epoch: 15001, Training Loss: 0.01546
Epoch: 15002, Training Loss: 0.01186
Epoch: 15002, Training Loss: 0.01249
Epoch: 15002, Training Loss: 0.01251
Epoch: 15002, Training Loss: 0.01546
Epoch: 15003, Training Loss: 0.01185
Epoch: 15003, Training Loss: 0.01249
Epoch: 15003, Training Loss: 0.01251
Epoch: 15003, Training Loss: 0.01546
Epoch: 15004, Training Loss: 0.01185
Epoch: 15004, Training Loss: 0.01249
Epoch: 15004, Training Loss: 0.01251
Epoch: 15004, Training Loss: 0.01546
Epoch: 15005, Training Loss: 0.01185
Epoch: 15005, Training Loss: 0.01249
Epoch: 15005, Training Loss: 0.01251
Epoch: 15005, Training Loss: 0.01546
Epoch: 15006, Training Loss: 0.01185
Epoch: 15006, Training Loss: 0.01249
Epoch: 15006, Training Loss: 0.01251
Epoch: 15006, Training Loss: 0.01546
Epoch: 15007, Training Loss: 0.01185
Epoch: 15007, Training Loss: 0.01249
Epoch: 15007, Training Loss: 0.01251
Epoch: 15007, Training Loss: 0.01546
Epoch: 15008, Training Loss: 0.01185
Epoch: 15008, Training Loss: 0.01249
Epoch: 15008, Training Loss: 0.01251
Epoch: 15008, Training Loss: 0.01546
Epoch: 15009, Training Loss: 0.01185
Epoch: 15009, Training Loss: 0.01249
Epoch: 15009, Training Loss: 0.01250
Epoch: 15009, Training Loss: 0.01545
Epoch: 15010, Training Loss: 0.01185
Epoch: 15010, Training Loss: 0.01249
Epoch: 15010, Training Loss: 0.01250
Epoch: 15010, Training Loss: 0.01545
Epoch: 15011, Training Loss: 0.01185
Epoch: 15011, Training Loss: 0.01249
Epoch: 15011, Training Loss: 0.01250
Epoch: 15011, Training Loss: 0.01545
Epoch: 15012, Training Loss: 0.01185
Epoch: 15012, Training Loss: 0.01249
Epoch: 15012, Training Loss: 0.01250
Epoch: 15012, Training Loss: 0.01545
Epoch: 15013, Training Loss: 0.01185
Epoch: 15013, Training Loss: 0.01249
Epoch: 15013, Training Loss: 0.01250
Epoch: 15013, Training Loss: 0.01545
Epoch: 15014, Training Loss: 0.01185
Epoch: 15014, Training Loss: 0.01248
Epoch: 15014, Training Loss: 0.01250
Epoch: 15014, Training Loss: 0.01545
Epoch: 15015, Training Loss: 0.01185
Epoch: 15015, Training Loss: 0.01248
Epoch: 15015, Training Loss: 0.01250
Epoch: 15015, Training Loss: 0.01545
Epoch: 15016, Training Loss: 0.01185
Epoch: 15016, Training Loss: 0.01248
Epoch: 15016, Training Loss: 0.01250
Epoch: 15016, Training Loss: 0.01545
Epoch: 15017, Training Loss: 0.01185
Epoch: 15017, Training Loss: 0.01248
Epoch: 15017, Training Loss: 0.01250
Epoch: 15017, Training Loss: 0.01545
Epoch: 15018, Training Loss: 0.01185
Epoch: 15018, Training Loss: 0.01248
Epoch: 15018, Training Loss: 0.01250
Epoch: 15018, Training Loss: 0.01545
Epoch: 15019, Training Loss: 0.01185
Epoch: 15019, Training Loss: 0.01248
Epoch: 15019, Training Loss: 0.01250
Epoch: 15019, Training Loss: 0.01545
Epoch: 15020, Training Loss: 0.01185
Epoch: 15020, Training Loss: 0.01248
Epoch: 15020, Training Loss: 0.01250
Epoch: 15020, Training Loss: 0.01545
Epoch: 15021, Training Loss: 0.01185
Epoch: 15021, Training Loss: 0.01248
Epoch: 15021, Training Loss: 0.01250
Epoch: 15021, Training Loss: 0.01545
Epoch: 15022, Training Loss: 0.01185
Epoch: 15022, Training Loss: 0.01248
Epoch: 15022, Training Loss: 0.01250
Epoch: 15022, Training Loss: 0.01545
Epoch: 15023, Training Loss: 0.01185
Epoch: 15023, Training Loss: 0.01248
Epoch: 15023, Training Loss: 0.01250
Epoch: 15023, Training Loss: 0.01545
Epoch: 15024, Training Loss: 0.01184
Epoch: 15024, Training Loss: 0.01248
Epoch: 15024, Training Loss: 0.01250
Epoch: 15024, Training Loss: 0.01545
Epoch: 15025, Training Loss: 0.01184
Epoch: 15025, Training Loss: 0.01248
Epoch: 15025, Training Loss: 0.01250
Epoch: 15025, Training Loss: 0.01544
Epoch: 15026, Training Loss: 0.01184
Epoch: 15026, Training Loss: 0.01248
Epoch: 15026, Training Loss: 0.01250
Epoch: 15026, Training Loss: 0.01544
Epoch: 15027, Training Loss: 0.01184
Epoch: 15027, Training Loss: 0.01248
Epoch: 15027, Training Loss: 0.01250
Epoch: 15027, Training Loss: 0.01544
Epoch: 15028, Training Loss: 0.01184
Epoch: 15028, Training Loss: 0.01248
Epoch: 15028, Training Loss: 0.01249
Epoch: 15028, Training Loss: 0.01544
Epoch: 15029, Training Loss: 0.01184
Epoch: 15029, Training Loss: 0.01248
Epoch: 15029, Training Loss: 0.01249
Epoch: 15029, Training Loss: 0.01544
Epoch: 15030, Training Loss: 0.01184
Epoch: 15030, Training Loss: 0.01248
Epoch: 15030, Training Loss: 0.01249
Epoch: 15030, Training Loss: 0.01544
Epoch: 15031, Training Loss: 0.01184
Epoch: 15031, Training Loss: 0.01248
Epoch: 15031, Training Loss: 0.01249
Epoch: 15031, Training Loss: 0.01544
Epoch: 15032, Training Loss: 0.01184
Epoch: 15032, Training Loss: 0.01248
Epoch: 15032, Training Loss: 0.01249
Epoch: 15032, Training Loss: 0.01544
Epoch: 15033, Training Loss: 0.01184
Epoch: 15033, Training Loss: 0.01247
Epoch: 15033, Training Loss: 0.01249
Epoch: 15033, Training Loss: 0.01544
Epoch: 15034, Training Loss: 0.01184
Epoch: 15034, Training Loss: 0.01247
Epoch: 15034, Training Loss: 0.01249
Epoch: 15034, Training Loss: 0.01544
Epoch: 15035, Training Loss: 0.01184
Epoch: 15035, Training Loss: 0.01247
Epoch: 15035, Training Loss: 0.01249
Epoch: 15035, Training Loss: 0.01544
Epoch: 15036, Training Loss: 0.01184
Epoch: 15036, Training Loss: 0.01247
Epoch: 15036, Training Loss: 0.01249
Epoch: 15036, Training Loss: 0.01544
Epoch: 15037, Training Loss: 0.01184
Epoch: 15037, Training Loss: 0.01247
Epoch: 15037, Training Loss: 0.01249
Epoch: 15037, Training Loss: 0.01544
Epoch: 15038, Training Loss: 0.01184
Epoch: 15038, Training Loss: 0.01247
Epoch: 15038, Training Loss: 0.01249
Epoch: 15038, Training Loss: 0.01544
Epoch: 15039, Training Loss: 0.01184
Epoch: 15039, Training Loss: 0.01247
Epoch: 15039, Training Loss: 0.01249
Epoch: 15039, Training Loss: 0.01544
Epoch: 15040, Training Loss: 0.01184
Epoch: 15040, Training Loss: 0.01247
Epoch: 15040, Training Loss: 0.01249
Epoch: 15040, Training Loss: 0.01543
Epoch: 15041, Training Loss: 0.01184
Epoch: 15041, Training Loss: 0.01247
Epoch: 15041, Training Loss: 0.01249
Epoch: 15041, Training Loss: 0.01543
Epoch: 15042, Training Loss: 0.01184
Epoch: 15042, Training Loss: 0.01247
Epoch: 15042, Training Loss: 0.01249
Epoch: 15042, Training Loss: 0.01543
Epoch: 15043, Training Loss: 0.01184
Epoch: 15043, Training Loss: 0.01247
Epoch: 15043, Training Loss: 0.01249
Epoch: 15043, Training Loss: 0.01543
Epoch: 15044, Training Loss: 0.01184
Epoch: 15044, Training Loss: 0.01247
Epoch: 15044, Training Loss: 0.01249
Epoch: 15044, Training Loss: 0.01543
Epoch: 15045, Training Loss: 0.01184
Epoch: 15045, Training Loss: 0.01247
Epoch: 15045, Training Loss: 0.01249
Epoch: 15045, Training Loss: 0.01543
Epoch: 15046, Training Loss: 0.01183
Epoch: 15046, Training Loss: 0.01247
Epoch: 15046, Training Loss: 0.01249
Epoch: 15046, Training Loss: 0.01543
Epoch: 15047, Training Loss: 0.01183
Epoch: 15047, Training Loss: 0.01247
Epoch: 15047, Training Loss: 0.01249
Epoch: 15047, Training Loss: 0.01543
Epoch: 15048, Training Loss: 0.01183
Epoch: 15048, Training Loss: 0.01247
Epoch: 15048, Training Loss: 0.01248
Epoch: 15048, Training Loss: 0.01543
Epoch: 15049, Training Loss: 0.01183
Epoch: 15049, Training Loss: 0.01247
Epoch: 15049, Training Loss: 0.01248
Epoch: 15049, Training Loss: 0.01543
Epoch: 15050, Training Loss: 0.01183
Epoch: 15050, Training Loss: 0.01247
Epoch: 15050, Training Loss: 0.01248
Epoch: 15050, Training Loss: 0.01543
Epoch: 15051, Training Loss: 0.01183
Epoch: 15051, Training Loss: 0.01247
Epoch: 15051, Training Loss: 0.01248
Epoch: 15051, Training Loss: 0.01543
Epoch: 15052, Training Loss: 0.01183
Epoch: 15052, Training Loss: 0.01246
Epoch: 15052, Training Loss: 0.01248
Epoch: 15052, Training Loss: 0.01543
Epoch: 15053, Training Loss: 0.01183
Epoch: 15053, Training Loss: 0.01246
Epoch: 15053, Training Loss: 0.01248
Epoch: 15053, Training Loss: 0.01543
Epoch: 15054, Training Loss: 0.01183
Epoch: 15054, Training Loss: 0.01246
Epoch: 15054, Training Loss: 0.01248
Epoch: 15054, Training Loss: 0.01543
Epoch: 15055, Training Loss: 0.01183
Epoch: 15055, Training Loss: 0.01246
Epoch: 15055, Training Loss: 0.01248
Epoch: 15055, Training Loss: 0.01543
Epoch: 15056, Training Loss: 0.01183
Epoch: 15056, Training Loss: 0.01246
Epoch: 15056, Training Loss: 0.01248
Epoch: 15056, Training Loss: 0.01542
Epoch: 15057, Training Loss: 0.01183
Epoch: 15057, Training Loss: 0.01246
Epoch: 15057, Training Loss: 0.01248
Epoch: 15057, Training Loss: 0.01542
Epoch: 15058, Training Loss: 0.01183
Epoch: 15058, Training Loss: 0.01246
Epoch: 15058, Training Loss: 0.01248
Epoch: 15058, Training Loss: 0.01542
Epoch: 15059, Training Loss: 0.01183
Epoch: 15059, Training Loss: 0.01246
Epoch: 15059, Training Loss: 0.01248
Epoch: 15059, Training Loss: 0.01542
Epoch: 15060, Training Loss: 0.01183
Epoch: 15060, Training Loss: 0.01246
Epoch: 15060, Training Loss: 0.01248
Epoch: 15060, Training Loss: 0.01542
Epoch: 15061, Training Loss: 0.01183
Epoch: 15061, Training Loss: 0.01246
Epoch: 15061, Training Loss: 0.01248
Epoch: 15061, Training Loss: 0.01542
Epoch: 15062, Training Loss: 0.01183
Epoch: 15062, Training Loss: 0.01246
Epoch: 15062, Training Loss: 0.01248
Epoch: 15062, Training Loss: 0.01542
Epoch: 15063, Training Loss: 0.01183
Epoch: 15063, Training Loss: 0.01246
Epoch: 15063, Training Loss: 0.01248
Epoch: 15063, Training Loss: 0.01542
Epoch: 15064, Training Loss: 0.01183
Epoch: 15064, Training Loss: 0.01246
Epoch: 15064, Training Loss: 0.01248
Epoch: 15064, Training Loss: 0.01542
Epoch: 15065, Training Loss: 0.01183
Epoch: 15065, Training Loss: 0.01246
Epoch: 15065, Training Loss: 0.01248
Epoch: 15065, Training Loss: 0.01542
Epoch: 15066, Training Loss: 0.01183
Epoch: 15066, Training Loss: 0.01246
Epoch: 15066, Training Loss: 0.01248
Epoch: 15066, Training Loss: 0.01542
Epoch: 15067, Training Loss: 0.01182
Epoch: 15067, Training Loss: 0.01246
Epoch: 15067, Training Loss: 0.01247
Epoch: 15067, Training Loss: 0.01542
Epoch: 15068, Training Loss: 0.01182
Epoch: 15068, Training Loss: 0.01246
Epoch: 15068, Training Loss: 0.01247
Epoch: 15068, Training Loss: 0.01542
Epoch: 15069, Training Loss: 0.01182
Epoch: 15069, Training Loss: 0.01246
Epoch: 15069, Training Loss: 0.01247
Epoch: 15069, Training Loss: 0.01542
Epoch: 15070, Training Loss: 0.01182
Epoch: 15070, Training Loss: 0.01246
Epoch: 15070, Training Loss: 0.01247
Epoch: 15070, Training Loss: 0.01542
Epoch: 15071, Training Loss: 0.01182
Epoch: 15071, Training Loss: 0.01246
Epoch: 15071, Training Loss: 0.01247
Epoch: 15071, Training Loss: 0.01541
Epoch: 15072, Training Loss: 0.01182
Epoch: 15072, Training Loss: 0.01245
Epoch: 15072, Training Loss: 0.01247
Epoch: 15072, Training Loss: 0.01541
Epoch: 15073, Training Loss: 0.01182
Epoch: 15073, Training Loss: 0.01245
Epoch: 15073, Training Loss: 0.01247
Epoch: 15073, Training Loss: 0.01541
Epoch: 15074, Training Loss: 0.01182
Epoch: 15074, Training Loss: 0.01245
Epoch: 15074, Training Loss: 0.01247
Epoch: 15074, Training Loss: 0.01541
Epoch: 15075, Training Loss: 0.01182
Epoch: 15075, Training Loss: 0.01245
Epoch: 15075, Training Loss: 0.01247
Epoch: 15075, Training Loss: 0.01541
Epoch: 15076, Training Loss: 0.01182
Epoch: 15076, Training Loss: 0.01245
Epoch: 15076, Training Loss: 0.01247
Epoch: 15076, Training Loss: 0.01541
Epoch: 15077, Training Loss: 0.01182
Epoch: 15077, Training Loss: 0.01245
Epoch: 15077, Training Loss: 0.01247
Epoch: 15077, Training Loss: 0.01541
Epoch: 15078, Training Loss: 0.01182
Epoch: 15078, Training Loss: 0.01245
Epoch: 15078, Training Loss: 0.01247
Epoch: 15078, Training Loss: 0.01541
Epoch: 15079, Training Loss: 0.01182
Epoch: 15079, Training Loss: 0.01245
Epoch: 15079, Training Loss: 0.01247
Epoch: 15079, Training Loss: 0.01541
Epoch: 15080, Training Loss: 0.01182
Epoch: 15080, Training Loss: 0.01245
Epoch: 15080, Training Loss: 0.01247
Epoch: 15080, Training Loss: 0.01541
Epoch: 15081, Training Loss: 0.01182
Epoch: 15081, Training Loss: 0.01245
Epoch: 15081, Training Loss: 0.01247
Epoch: 15081, Training Loss: 0.01541
Epoch: 15082, Training Loss: 0.01182
Epoch: 15082, Training Loss: 0.01245
Epoch: 15082, Training Loss: 0.01247
Epoch: 15082, Training Loss: 0.01541
Epoch: 15083, Training Loss: 0.01182
Epoch: 15083, Training Loss: 0.01245
Epoch: 15083, Training Loss: 0.01247
Epoch: 15083, Training Loss: 0.01541
Epoch: 15084, Training Loss: 0.01182
Epoch: 15084, Training Loss: 0.01245
Epoch: 15084, Training Loss: 0.01247
Epoch: 15084, Training Loss: 0.01541
Epoch: 15085, Training Loss: 0.01182
Epoch: 15085, Training Loss: 0.01245
Epoch: 15085, Training Loss: 0.01247
Epoch: 15085, Training Loss: 0.01541
Epoch: 15086, Training Loss: 0.01182
Epoch: 15086, Training Loss: 0.01245
Epoch: 15086, Training Loss: 0.01246
Epoch: 15086, Training Loss: 0.01540
Epoch: 15087, Training Loss: 0.01182
Epoch: 15087, Training Loss: 0.01245
Epoch: 15087, Training Loss: 0.01246
Epoch: 15087, Training Loss: 0.01540
Epoch: 15088, Training Loss: 0.01181
Epoch: 15088, Training Loss: 0.01245
Epoch: 15088, Training Loss: 0.01246
Epoch: 15088, Training Loss: 0.01540
Epoch: 15089, Training Loss: 0.01181
Epoch: 15089, Training Loss: 0.01245
Epoch: 15089, Training Loss: 0.01246
Epoch: 15089, Training Loss: 0.01540
Epoch: 15090, Training Loss: 0.01181
Epoch: 15090, Training Loss: 0.01245
Epoch: 15090, Training Loss: 0.01246
Epoch: 15090, Training Loss: 0.01540
Epoch: 15091, Training Loss: 0.01181
Epoch: 15091, Training Loss: 0.01244
Epoch: 15091, Training Loss: 0.01246
Epoch: 15091, Training Loss: 0.01540
Epoch: 15092, Training Loss: 0.01181
Epoch: 15092, Training Loss: 0.01244
Epoch: 15092, Training Loss: 0.01246
Epoch: 15092, Training Loss: 0.01540
Epoch: 15093, Training Loss: 0.01181
Epoch: 15093, Training Loss: 0.01244
Epoch: 15093, Training Loss: 0.01246
Epoch: 15093, Training Loss: 0.01540
Epoch: 15094, Training Loss: 0.01181
Epoch: 15094, Training Loss: 0.01244
Epoch: 15094, Training Loss: 0.01246
Epoch: 15094, Training Loss: 0.01540
Epoch: 15095, Training Loss: 0.01181
Epoch: 15095, Training Loss: 0.01244
Epoch: 15095, Training Loss: 0.01246
Epoch: 15095, Training Loss: 0.01540
Epoch: 15096, Training Loss: 0.01181
Epoch: 15096, Training Loss: 0.01244
Epoch: 15096, Training Loss: 0.01246
Epoch: 15096, Training Loss: 0.01540
Epoch: 15097, Training Loss: 0.01181
Epoch: 15097, Training Loss: 0.01244
Epoch: 15097, Training Loss: 0.01246
Epoch: 15097, Training Loss: 0.01540
Epoch: 15098, Training Loss: 0.01181
Epoch: 15098, Training Loss: 0.01244
Epoch: 15098, Training Loss: 0.01246
Epoch: 15098, Training Loss: 0.01540
Epoch: 15099, Training Loss: 0.01181
Epoch: 15099, Training Loss: 0.01244
Epoch: 15099, Training Loss: 0.01246
Epoch: 15099, Training Loss: 0.01540
Epoch: 15100, Training Loss: 0.01181
Epoch: 15100, Training Loss: 0.01244
Epoch: 15100, Training Loss: 0.01246
Epoch: 15100, Training Loss: 0.01540
Epoch: 15101, Training Loss: 0.01181
Epoch: 15101, Training Loss: 0.01244
Epoch: 15101, Training Loss: 0.01246
Epoch: 15101, Training Loss: 0.01540
Epoch: 15102, Training Loss: 0.01181
Epoch: 15102, Training Loss: 0.01244
Epoch: 15102, Training Loss: 0.01246
Epoch: 15102, Training Loss: 0.01539
Epoch: 15103, Training Loss: 0.01181
Epoch: 15103, Training Loss: 0.01244
Epoch: 15103, Training Loss: 0.01246
Epoch: 15103, Training Loss: 0.01539
Epoch: 15104, Training Loss: 0.01181
Epoch: 15104, Training Loss: 0.01244
Epoch: 15104, Training Loss: 0.01246
Epoch: 15104, Training Loss: 0.01539
Epoch: 15105, Training Loss: 0.01181
Epoch: 15105, Training Loss: 0.01244
Epoch: 15105, Training Loss: 0.01246
Epoch: 15105, Training Loss: 0.01539
Epoch: 15106, Training Loss: 0.01181
Epoch: 15106, Training Loss: 0.01244
Epoch: 15106, Training Loss: 0.01245
Epoch: 15106, Training Loss: 0.01539
Epoch: 15107, Training Loss: 0.01181
Epoch: 15107, Training Loss: 0.01244
Epoch: 15107, Training Loss: 0.01245
Epoch: 15107, Training Loss: 0.01539
Epoch: 15108, Training Loss: 0.01181
Epoch: 15108, Training Loss: 0.01244
Epoch: 15108, Training Loss: 0.01245
Epoch: 15108, Training Loss: 0.01539
Epoch: 15109, Training Loss: 0.01180
Epoch: 15109, Training Loss: 0.01244
Epoch: 15109, Training Loss: 0.01245
Epoch: 15109, Training Loss: 0.01539
Epoch: 15110, Training Loss: 0.01180
Epoch: 15110, Training Loss: 0.01244
Epoch: 15110, Training Loss: 0.01245
Epoch: 15110, Training Loss: 0.01539
Epoch: 15111, Training Loss: 0.01180
Epoch: 15111, Training Loss: 0.01243
Epoch: 15111, Training Loss: 0.01245
Epoch: 15111, Training Loss: 0.01539
Epoch: 15112, Training Loss: 0.01180
Epoch: 15112, Training Loss: 0.01243
Epoch: 15112, Training Loss: 0.01245
Epoch: 15112, Training Loss: 0.01539
Epoch: 15113, Training Loss: 0.01180
Epoch: 15113, Training Loss: 0.01243
Epoch: 15113, Training Loss: 0.01245
Epoch: 15113, Training Loss: 0.01539
Epoch: 15114, Training Loss: 0.01180
Epoch: 15114, Training Loss: 0.01243
Epoch: 15114, Training Loss: 0.01245
Epoch: 15114, Training Loss: 0.01539
Epoch: 15115, Training Loss: 0.01180
Epoch: 15115, Training Loss: 0.01243
Epoch: 15115, Training Loss: 0.01245
Epoch: 15115, Training Loss: 0.01539
Epoch: 15116, Training Loss: 0.01180
Epoch: 15116, Training Loss: 0.01243
Epoch: 15116, Training Loss: 0.01245
Epoch: 15116, Training Loss: 0.01539
Epoch: 15117, Training Loss: 0.01180
Epoch: 15117, Training Loss: 0.01243
Epoch: 15117, Training Loss: 0.01245
Epoch: 15117, Training Loss: 0.01538
Epoch: 15118, Training Loss: 0.01180
Epoch: 15118, Training Loss: 0.01243
Epoch: 15118, Training Loss: 0.01245
Epoch: 15118, Training Loss: 0.01538
Epoch: 15119, Training Loss: 0.01180
Epoch: 15119, Training Loss: 0.01243
Epoch: 15119, Training Loss: 0.01245
Epoch: 15119, Training Loss: 0.01538
Epoch: 15120, Training Loss: 0.01180
Epoch: 15120, Training Loss: 0.01243
Epoch: 15120, Training Loss: 0.01245
Epoch: 15120, Training Loss: 0.01538
Epoch: 15121, Training Loss: 0.01180
Epoch: 15121, Training Loss: 0.01243
Epoch: 15121, Training Loss: 0.01245
Epoch: 15121, Training Loss: 0.01538
Epoch: 15122, Training Loss: 0.01180
Epoch: 15122, Training Loss: 0.01243
Epoch: 15122, Training Loss: 0.01245
Epoch: 15122, Training Loss: 0.01538
Epoch: 15123, Training Loss: 0.01180
Epoch: 15123, Training Loss: 0.01243
Epoch: 15123, Training Loss: 0.01245
Epoch: 15123, Training Loss: 0.01538
Epoch: 15124, Training Loss: 0.01180
Epoch: 15124, Training Loss: 0.01243
Epoch: 15124, Training Loss: 0.01245
Epoch: 15124, Training Loss: 0.01538
Epoch: 15125, Training Loss: 0.01180
Epoch: 15125, Training Loss: 0.01243
Epoch: 15125, Training Loss: 0.01245
Epoch: 15125, Training Loss: 0.01538
Epoch: 15126, Training Loss: 0.01180
Epoch: 15126, Training Loss: 0.01243
Epoch: 15126, Training Loss: 0.01244
Epoch: 15126, Training Loss: 0.01538
Epoch: 15127, Training Loss: 0.01180
Epoch: 15127, Training Loss: 0.01243
Epoch: 15127, Training Loss: 0.01244
Epoch: 15127, Training Loss: 0.01538
Epoch: 15128, Training Loss: 0.01180
Epoch: 15128, Training Loss: 0.01243
Epoch: 15128, Training Loss: 0.01244
Epoch: 15128, Training Loss: 0.01538
Epoch: 15129, Training Loss: 0.01180
Epoch: 15129, Training Loss: 0.01243
Epoch: 15129, Training Loss: 0.01244
Epoch: 15129, Training Loss: 0.01538
Epoch: 15130, Training Loss: 0.01180
Epoch: 15130, Training Loss: 0.01243
Epoch: 15130, Training Loss: 0.01244
Epoch: 15130, Training Loss: 0.01538
Epoch: 15131, Training Loss: 0.01179
Epoch: 15131, Training Loss: 0.01242
Epoch: 15131, Training Loss: 0.01244
Epoch: 15131, Training Loss: 0.01538
Epoch: 15132, Training Loss: 0.01179
Epoch: 15132, Training Loss: 0.01242
Epoch: 15132, Training Loss: 0.01244
Epoch: 15132, Training Loss: 0.01538
Epoch: 15133, Training Loss: 0.01179
Epoch: 15133, Training Loss: 0.01242
Epoch: 15133, Training Loss: 0.01244
Epoch: 15133, Training Loss: 0.01537
Epoch: 15134, Training Loss: 0.01179
Epoch: 15134, Training Loss: 0.01242
Epoch: 15134, Training Loss: 0.01244
Epoch: 15134, Training Loss: 0.01537
Epoch: 15135, Training Loss: 0.01179
Epoch: 15135, Training Loss: 0.01242
Epoch: 15135, Training Loss: 0.01244
Epoch: 15135, Training Loss: 0.01537
Epoch: 15136, Training Loss: 0.01179
Epoch: 15136, Training Loss: 0.01242
Epoch: 15136, Training Loss: 0.01244
Epoch: 15136, Training Loss: 0.01537
Epoch: 15137, Training Loss: 0.01179
Epoch: 15137, Training Loss: 0.01242
Epoch: 15137, Training Loss: 0.01244
Epoch: 15137, Training Loss: 0.01537
Epoch: 15138, Training Loss: 0.01179
Epoch: 15138, Training Loss: 0.01242
Epoch: 15138, Training Loss: 0.01244
Epoch: 15138, Training Loss: 0.01537
Epoch: 15139, Training Loss: 0.01179
Epoch: 15139, Training Loss: 0.01242
Epoch: 15139, Training Loss: 0.01244
Epoch: 15139, Training Loss: 0.01537
Epoch: 15140, Training Loss: 0.01179
Epoch: 15140, Training Loss: 0.01242
Epoch: 15140, Training Loss: 0.01244
Epoch: 15140, Training Loss: 0.01537
Epoch: 15141, Training Loss: 0.01179
Epoch: 15141, Training Loss: 0.01242
Epoch: 15141, Training Loss: 0.01244
Epoch: 15141, Training Loss: 0.01537
Epoch: 15142, Training Loss: 0.01179
Epoch: 15142, Training Loss: 0.01242
Epoch: 15142, Training Loss: 0.01244
Epoch: 15142, Training Loss: 0.01537
Epoch: 15143, Training Loss: 0.01179
Epoch: 15143, Training Loss: 0.01242
Epoch: 15143, Training Loss: 0.01244
Epoch: 15143, Training Loss: 0.01537
Epoch: 15144, Training Loss: 0.01179
Epoch: 15144, Training Loss: 0.01242
Epoch: 15144, Training Loss: 0.01244
Epoch: 15144, Training Loss: 0.01537
Epoch: 15145, Training Loss: 0.01179
Epoch: 15145, Training Loss: 0.01242
Epoch: 15145, Training Loss: 0.01243
Epoch: 15145, Training Loss: 0.01537
Epoch: 15146, Training Loss: 0.01179
Epoch: 15146, Training Loss: 0.01242
Epoch: 15146, Training Loss: 0.01243
Epoch: 15146, Training Loss: 0.01537
Epoch: 15147, Training Loss: 0.01179
Epoch: 15147, Training Loss: 0.01242
Epoch: 15147, Training Loss: 0.01243
Epoch: 15147, Training Loss: 0.01537
Epoch: 15148, Training Loss: 0.01179
Epoch: 15148, Training Loss: 0.01242
Epoch: 15148, Training Loss: 0.01243
Epoch: 15148, Training Loss: 0.01537
Epoch: 15149, Training Loss: 0.01179
Epoch: 15149, Training Loss: 0.01242
Epoch: 15149, Training Loss: 0.01243
Epoch: 15149, Training Loss: 0.01536
Epoch: 15150, Training Loss: 0.01179
Epoch: 15150, Training Loss: 0.01241
Epoch: 15150, Training Loss: 0.01243
Epoch: 15150, Training Loss: 0.01536
Epoch: 15151, Training Loss: 0.01179
Epoch: 15151, Training Loss: 0.01241
Epoch: 15151, Training Loss: 0.01243
Epoch: 15151, Training Loss: 0.01536
Epoch: 15152, Training Loss: 0.01178
Epoch: 15152, Training Loss: 0.01241
Epoch: 15152, Training Loss: 0.01243
Epoch: 15152, Training Loss: 0.01536
Epoch: 15153, Training Loss: 0.01178
Epoch: 15153, Training Loss: 0.01241
Epoch: 15153, Training Loss: 0.01243
Epoch: 15153, Training Loss: 0.01536
Epoch: 15154, Training Loss: 0.01178
Epoch: 15154, Training Loss: 0.01241
Epoch: 15154, Training Loss: 0.01243
Epoch: 15154, Training Loss: 0.01536
Epoch: 15155, Training Loss: 0.01178
Epoch: 15155, Training Loss: 0.01241
Epoch: 15155, Training Loss: 0.01243
Epoch: 15155, Training Loss: 0.01536
Epoch: 15156, Training Loss: 0.01178
Epoch: 15156, Training Loss: 0.01241
Epoch: 15156, Training Loss: 0.01243
Epoch: 15156, Training Loss: 0.01536
Epoch: 15157, Training Loss: 0.01178
Epoch: 15157, Training Loss: 0.01241
Epoch: 15157, Training Loss: 0.01243
Epoch: 15157, Training Loss: 0.01536
Epoch: 15158, Training Loss: 0.01178
Epoch: 15158, Training Loss: 0.01241
Epoch: 15158, Training Loss: 0.01243
Epoch: 15158, Training Loss: 0.01536
Epoch: 15159, Training Loss: 0.01178
Epoch: 15159, Training Loss: 0.01241
Epoch: 15159, Training Loss: 0.01243
Epoch: 15159, Training Loss: 0.01536
Epoch: 15160, Training Loss: 0.01178
Epoch: 15160, Training Loss: 0.01241
Epoch: 15160, Training Loss: 0.01243
Epoch: 15160, Training Loss: 0.01536
Epoch: 15161, Training Loss: 0.01178
Epoch: 15161, Training Loss: 0.01241
Epoch: 15161, Training Loss: 0.01243
Epoch: 15161, Training Loss: 0.01536
Epoch: 15162, Training Loss: 0.01178
Epoch: 15162, Training Loss: 0.01241
Epoch: 15162, Training Loss: 0.01243
Epoch: 15162, Training Loss: 0.01536
Epoch: 15163, Training Loss: 0.01178
Epoch: 15163, Training Loss: 0.01241
Epoch: 15163, Training Loss: 0.01243
Epoch: 15163, Training Loss: 0.01536
Epoch: 15164, Training Loss: 0.01178
Epoch: 15164, Training Loss: 0.01241
Epoch: 15164, Training Loss: 0.01243
Epoch: 15164, Training Loss: 0.01535
Epoch: 15165, Training Loss: 0.01178
Epoch: 15165, Training Loss: 0.01241
Epoch: 15165, Training Loss: 0.01242
Epoch: 15165, Training Loss: 0.01535
Epoch: 15166, Training Loss: 0.01178
Epoch: 15166, Training Loss: 0.01241
Epoch: 15166, Training Loss: 0.01242
Epoch: 15166, Training Loss: 0.01535
Epoch: 15167, Training Loss: 0.01178
Epoch: 15167, Training Loss: 0.01241
Epoch: 15167, Training Loss: 0.01242
Epoch: 15167, Training Loss: 0.01535
Epoch: 15168, Training Loss: 0.01178
Epoch: 15168, Training Loss: 0.01241
Epoch: 15168, Training Loss: 0.01242
Epoch: 15168, Training Loss: 0.01535
Epoch: 15169, Training Loss: 0.01178
Epoch: 15169, Training Loss: 0.01241
Epoch: 15169, Training Loss: 0.01242
Epoch: 15169, Training Loss: 0.01535
Epoch: 15170, Training Loss: 0.01178
Epoch: 15170, Training Loss: 0.01240
Epoch: 15170, Training Loss: 0.01242
Epoch: 15170, Training Loss: 0.01535
Epoch: 15171, Training Loss: 0.01178
Epoch: 15171, Training Loss: 0.01240
Epoch: 15171, Training Loss: 0.01242
Epoch: 15171, Training Loss: 0.01535
Epoch: 15172, Training Loss: 0.01178
Epoch: 15172, Training Loss: 0.01240
Epoch: 15172, Training Loss: 0.01242
Epoch: 15172, Training Loss: 0.01535
Epoch: 15173, Training Loss: 0.01177
Epoch: 15173, Training Loss: 0.01240
Epoch: 15173, Training Loss: 0.01242
Epoch: 15173, Training Loss: 0.01535
Epoch: 15174, Training Loss: 0.01177
Epoch: 15174, Training Loss: 0.01240
Epoch: 15174, Training Loss: 0.01242
Epoch: 15174, Training Loss: 0.01535
Epoch: 15175, Training Loss: 0.01177
Epoch: 15175, Training Loss: 0.01240
Epoch: 15175, Training Loss: 0.01242
Epoch: 15175, Training Loss: 0.01535
Epoch: 15176, Training Loss: 0.01177
Epoch: 15176, Training Loss: 0.01240
Epoch: 15176, Training Loss: 0.01242
Epoch: 15176, Training Loss: 0.01535
Epoch: 15177, Training Loss: 0.01177
Epoch: 15177, Training Loss: 0.01240
Epoch: 15177, Training Loss: 0.01242
Epoch: 15177, Training Loss: 0.01535
Epoch: 15178, Training Loss: 0.01177
Epoch: 15178, Training Loss: 0.01240
Epoch: 15178, Training Loss: 0.01242
Epoch: 15178, Training Loss: 0.01535
Epoch: 15179, Training Loss: 0.01177
Epoch: 15179, Training Loss: 0.01240
Epoch: 15179, Training Loss: 0.01242
Epoch: 15179, Training Loss: 0.01535
Epoch: 15180, Training Loss: 0.01177
Epoch: 15180, Training Loss: 0.01240
Epoch: 15180, Training Loss: 0.01242
Epoch: 15180, Training Loss: 0.01534
Epoch: 15181, Training Loss: 0.01177
Epoch: 15181, Training Loss: 0.01240
Epoch: 15181, Training Loss: 0.01242
Epoch: 15181, Training Loss: 0.01534
Epoch: 15182, Training Loss: 0.01177
Epoch: 15182, Training Loss: 0.01240
Epoch: 15182, Training Loss: 0.01242
Epoch: 15182, Training Loss: 0.01534
Epoch: 15183, Training Loss: 0.01177
Epoch: 15183, Training Loss: 0.01240
Epoch: 15183, Training Loss: 0.01242
Epoch: 15183, Training Loss: 0.01534
Epoch: 15184, Training Loss: 0.01177
Epoch: 15184, Training Loss: 0.01240
Epoch: 15184, Training Loss: 0.01241
Epoch: 15184, Training Loss: 0.01534
Epoch: 15185, Training Loss: 0.01177
Epoch: 15185, Training Loss: 0.01240
Epoch: 15185, Training Loss: 0.01241
Epoch: 15185, Training Loss: 0.01534
... (repetitive per-pattern log truncated: over epochs 15185-15429 the four pattern losses decrease only slightly, from about 0.01177 / 0.01240 / 0.01241 / 0.01534 to about 0.01166 / 0.01228 / 0.01229 / 0.01519) ...
Epoch: 15429, Training Loss: 0.01166
Epoch: 15429, Training Loss: 0.01228
Epoch: 15429, Training Loss: 0.01229
Epoch: 15429, Training Loss: 0.01519
Epoch: 15430, Training Loss: 0.01166
Epoch: 15430, Training Loss: 0.01227
Epoch: 15430, Training Loss: 0.01229
Epoch: 15430, Training Loss: 0.01519
Epoch: 15431, Training Loss: 0.01166
Epoch: 15431, Training Loss: 0.01227
Epoch: 15431, Training Loss: 0.01229
Epoch: 15431, Training Loss: 0.01519
Epoch: 15432, Training Loss: 0.01166
Epoch: 15432, Training Loss: 0.01227
Epoch: 15432, Training Loss: 0.01229
Epoch: 15432, Training Loss: 0.01519
Epoch: 15433, Training Loss: 0.01166
Epoch: 15433, Training Loss: 0.01227
Epoch: 15433, Training Loss: 0.01229
Epoch: 15433, Training Loss: 0.01519
Epoch: 15434, Training Loss: 0.01166
Epoch: 15434, Training Loss: 0.01227
Epoch: 15434, Training Loss: 0.01229
Epoch: 15434, Training Loss: 0.01518
Epoch: 15435, Training Loss: 0.01165
Epoch: 15435, Training Loss: 0.01227
Epoch: 15435, Training Loss: 0.01229
Epoch: 15435, Training Loss: 0.01518
Epoch: 15436, Training Loss: 0.01165
Epoch: 15436, Training Loss: 0.01227
Epoch: 15436, Training Loss: 0.01229
Epoch: 15436, Training Loss: 0.01518
Epoch: 15437, Training Loss: 0.01165
Epoch: 15437, Training Loss: 0.01227
Epoch: 15437, Training Loss: 0.01229
Epoch: 15437, Training Loss: 0.01518
Epoch: 15438, Training Loss: 0.01165
Epoch: 15438, Training Loss: 0.01227
Epoch: 15438, Training Loss: 0.01229
Epoch: 15438, Training Loss: 0.01518
Epoch: 15439, Training Loss: 0.01165
Epoch: 15439, Training Loss: 0.01227
Epoch: 15439, Training Loss: 0.01229
Epoch: 15439, Training Loss: 0.01518
Epoch: 15440, Training Loss: 0.01165
Epoch: 15440, Training Loss: 0.01227
Epoch: 15440, Training Loss: 0.01229
Epoch: 15440, Training Loss: 0.01518
Epoch: 15441, Training Loss: 0.01165
Epoch: 15441, Training Loss: 0.01227
Epoch: 15441, Training Loss: 0.01229
Epoch: 15441, Training Loss: 0.01518
Epoch: 15442, Training Loss: 0.01165
Epoch: 15442, Training Loss: 0.01227
Epoch: 15442, Training Loss: 0.01229
Epoch: 15442, Training Loss: 0.01518
Epoch: 15443, Training Loss: 0.01165
Epoch: 15443, Training Loss: 0.01227
Epoch: 15443, Training Loss: 0.01229
Epoch: 15443, Training Loss: 0.01518
Epoch: 15444, Training Loss: 0.01165
Epoch: 15444, Training Loss: 0.01227
Epoch: 15444, Training Loss: 0.01229
Epoch: 15444, Training Loss: 0.01518
Epoch: 15445, Training Loss: 0.01165
Epoch: 15445, Training Loss: 0.01227
Epoch: 15445, Training Loss: 0.01228
Epoch: 15445, Training Loss: 0.01518
Epoch: 15446, Training Loss: 0.01165
Epoch: 15446, Training Loss: 0.01227
Epoch: 15446, Training Loss: 0.01228
Epoch: 15446, Training Loss: 0.01518
Epoch: 15447, Training Loss: 0.01165
Epoch: 15447, Training Loss: 0.01227
Epoch: 15447, Training Loss: 0.01228
Epoch: 15447, Training Loss: 0.01518
Epoch: 15448, Training Loss: 0.01165
Epoch: 15448, Training Loss: 0.01227
Epoch: 15448, Training Loss: 0.01228
Epoch: 15448, Training Loss: 0.01518
Epoch: 15449, Training Loss: 0.01165
Epoch: 15449, Training Loss: 0.01227
Epoch: 15449, Training Loss: 0.01228
Epoch: 15449, Training Loss: 0.01518
Epoch: 15450, Training Loss: 0.01165
Epoch: 15450, Training Loss: 0.01227
Epoch: 15450, Training Loss: 0.01228
Epoch: 15450, Training Loss: 0.01517
Epoch: 15451, Training Loss: 0.01165
Epoch: 15451, Training Loss: 0.01226
Epoch: 15451, Training Loss: 0.01228
Epoch: 15451, Training Loss: 0.01517
Epoch: 15452, Training Loss: 0.01165
Epoch: 15452, Training Loss: 0.01226
Epoch: 15452, Training Loss: 0.01228
Epoch: 15452, Training Loss: 0.01517
Epoch: 15453, Training Loss: 0.01165
Epoch: 15453, Training Loss: 0.01226
Epoch: 15453, Training Loss: 0.01228
Epoch: 15453, Training Loss: 0.01517
Epoch: 15454, Training Loss: 0.01165
Epoch: 15454, Training Loss: 0.01226
Epoch: 15454, Training Loss: 0.01228
Epoch: 15454, Training Loss: 0.01517
Epoch: 15455, Training Loss: 0.01165
Epoch: 15455, Training Loss: 0.01226
Epoch: 15455, Training Loss: 0.01228
Epoch: 15455, Training Loss: 0.01517
Epoch: 15456, Training Loss: 0.01165
Epoch: 15456, Training Loss: 0.01226
Epoch: 15456, Training Loss: 0.01228
Epoch: 15456, Training Loss: 0.01517
Epoch: 15457, Training Loss: 0.01164
Epoch: 15457, Training Loss: 0.01226
Epoch: 15457, Training Loss: 0.01228
Epoch: 15457, Training Loss: 0.01517
Epoch: 15458, Training Loss: 0.01164
Epoch: 15458, Training Loss: 0.01226
Epoch: 15458, Training Loss: 0.01228
Epoch: 15458, Training Loss: 0.01517
Epoch: 15459, Training Loss: 0.01164
Epoch: 15459, Training Loss: 0.01226
Epoch: 15459, Training Loss: 0.01228
Epoch: 15459, Training Loss: 0.01517
Epoch: 15460, Training Loss: 0.01164
Epoch: 15460, Training Loss: 0.01226
Epoch: 15460, Training Loss: 0.01228
Epoch: 15460, Training Loss: 0.01517
Epoch: 15461, Training Loss: 0.01164
Epoch: 15461, Training Loss: 0.01226
Epoch: 15461, Training Loss: 0.01228
Epoch: 15461, Training Loss: 0.01517
Epoch: 15462, Training Loss: 0.01164
Epoch: 15462, Training Loss: 0.01226
Epoch: 15462, Training Loss: 0.01228
Epoch: 15462, Training Loss: 0.01517
Epoch: 15463, Training Loss: 0.01164
Epoch: 15463, Training Loss: 0.01226
Epoch: 15463, Training Loss: 0.01228
Epoch: 15463, Training Loss: 0.01517
Epoch: 15464, Training Loss: 0.01164
Epoch: 15464, Training Loss: 0.01226
Epoch: 15464, Training Loss: 0.01228
Epoch: 15464, Training Loss: 0.01517
Epoch: 15465, Training Loss: 0.01164
Epoch: 15465, Training Loss: 0.01226
Epoch: 15465, Training Loss: 0.01227
Epoch: 15465, Training Loss: 0.01517
Epoch: 15466, Training Loss: 0.01164
Epoch: 15466, Training Loss: 0.01226
Epoch: 15466, Training Loss: 0.01227
Epoch: 15466, Training Loss: 0.01517
Epoch: 15467, Training Loss: 0.01164
Epoch: 15467, Training Loss: 0.01226
Epoch: 15467, Training Loss: 0.01227
Epoch: 15467, Training Loss: 0.01516
Epoch: 15468, Training Loss: 0.01164
Epoch: 15468, Training Loss: 0.01226
Epoch: 15468, Training Loss: 0.01227
Epoch: 15468, Training Loss: 0.01516
Epoch: 15469, Training Loss: 0.01164
Epoch: 15469, Training Loss: 0.01226
Epoch: 15469, Training Loss: 0.01227
Epoch: 15469, Training Loss: 0.01516
Epoch: 15470, Training Loss: 0.01164
Epoch: 15470, Training Loss: 0.01226
Epoch: 15470, Training Loss: 0.01227
Epoch: 15470, Training Loss: 0.01516
Epoch: 15471, Training Loss: 0.01164
Epoch: 15471, Training Loss: 0.01225
Epoch: 15471, Training Loss: 0.01227
Epoch: 15471, Training Loss: 0.01516
Epoch: 15472, Training Loss: 0.01164
Epoch: 15472, Training Loss: 0.01225
Epoch: 15472, Training Loss: 0.01227
Epoch: 15472, Training Loss: 0.01516
Epoch: 15473, Training Loss: 0.01164
Epoch: 15473, Training Loss: 0.01225
Epoch: 15473, Training Loss: 0.01227
Epoch: 15473, Training Loss: 0.01516
Epoch: 15474, Training Loss: 0.01164
Epoch: 15474, Training Loss: 0.01225
Epoch: 15474, Training Loss: 0.01227
Epoch: 15474, Training Loss: 0.01516
Epoch: 15475, Training Loss: 0.01164
Epoch: 15475, Training Loss: 0.01225
Epoch: 15475, Training Loss: 0.01227
Epoch: 15475, Training Loss: 0.01516
Epoch: 15476, Training Loss: 0.01164
Epoch: 15476, Training Loss: 0.01225
Epoch: 15476, Training Loss: 0.01227
Epoch: 15476, Training Loss: 0.01516
Epoch: 15477, Training Loss: 0.01164
Epoch: 15477, Training Loss: 0.01225
Epoch: 15477, Training Loss: 0.01227
Epoch: 15477, Training Loss: 0.01516
Epoch: 15478, Training Loss: 0.01164
Epoch: 15478, Training Loss: 0.01225
Epoch: 15478, Training Loss: 0.01227
Epoch: 15478, Training Loss: 0.01516
Epoch: 15479, Training Loss: 0.01163
Epoch: 15479, Training Loss: 0.01225
Epoch: 15479, Training Loss: 0.01227
Epoch: 15479, Training Loss: 0.01516
Epoch: 15480, Training Loss: 0.01163
Epoch: 15480, Training Loss: 0.01225
Epoch: 15480, Training Loss: 0.01227
Epoch: 15480, Training Loss: 0.01516
Epoch: 15481, Training Loss: 0.01163
Epoch: 15481, Training Loss: 0.01225
Epoch: 15481, Training Loss: 0.01227
Epoch: 15481, Training Loss: 0.01516
Epoch: 15482, Training Loss: 0.01163
Epoch: 15482, Training Loss: 0.01225
Epoch: 15482, Training Loss: 0.01227
Epoch: 15482, Training Loss: 0.01516
Epoch: 15483, Training Loss: 0.01163
Epoch: 15483, Training Loss: 0.01225
Epoch: 15483, Training Loss: 0.01227
Epoch: 15483, Training Loss: 0.01515
Epoch: 15484, Training Loss: 0.01163
Epoch: 15484, Training Loss: 0.01225
Epoch: 15484, Training Loss: 0.01227
Epoch: 15484, Training Loss: 0.01515
Epoch: 15485, Training Loss: 0.01163
Epoch: 15485, Training Loss: 0.01225
Epoch: 15485, Training Loss: 0.01227
Epoch: 15485, Training Loss: 0.01515
Epoch: 15486, Training Loss: 0.01163
Epoch: 15486, Training Loss: 0.01225
Epoch: 15486, Training Loss: 0.01226
Epoch: 15486, Training Loss: 0.01515
Epoch: 15487, Training Loss: 0.01163
Epoch: 15487, Training Loss: 0.01225
Epoch: 15487, Training Loss: 0.01226
Epoch: 15487, Training Loss: 0.01515
Epoch: 15488, Training Loss: 0.01163
Epoch: 15488, Training Loss: 0.01225
Epoch: 15488, Training Loss: 0.01226
Epoch: 15488, Training Loss: 0.01515
Epoch: 15489, Training Loss: 0.01163
Epoch: 15489, Training Loss: 0.01225
Epoch: 15489, Training Loss: 0.01226
Epoch: 15489, Training Loss: 0.01515
Epoch: 15490, Training Loss: 0.01163
Epoch: 15490, Training Loss: 0.01225
Epoch: 15490, Training Loss: 0.01226
Epoch: 15490, Training Loss: 0.01515
Epoch: 15491, Training Loss: 0.01163
Epoch: 15491, Training Loss: 0.01225
Epoch: 15491, Training Loss: 0.01226
Epoch: 15491, Training Loss: 0.01515
Epoch: 15492, Training Loss: 0.01163
Epoch: 15492, Training Loss: 0.01224
Epoch: 15492, Training Loss: 0.01226
Epoch: 15492, Training Loss: 0.01515
Epoch: 15493, Training Loss: 0.01163
Epoch: 15493, Training Loss: 0.01224
Epoch: 15493, Training Loss: 0.01226
Epoch: 15493, Training Loss: 0.01515
Epoch: 15494, Training Loss: 0.01163
Epoch: 15494, Training Loss: 0.01224
Epoch: 15494, Training Loss: 0.01226
Epoch: 15494, Training Loss: 0.01515
Epoch: 15495, Training Loss: 0.01163
Epoch: 15495, Training Loss: 0.01224
Epoch: 15495, Training Loss: 0.01226
Epoch: 15495, Training Loss: 0.01515
Epoch: 15496, Training Loss: 0.01163
Epoch: 15496, Training Loss: 0.01224
Epoch: 15496, Training Loss: 0.01226
Epoch: 15496, Training Loss: 0.01515
Epoch: 15497, Training Loss: 0.01163
Epoch: 15497, Training Loss: 0.01224
Epoch: 15497, Training Loss: 0.01226
Epoch: 15497, Training Loss: 0.01515
Epoch: 15498, Training Loss: 0.01163
Epoch: 15498, Training Loss: 0.01224
Epoch: 15498, Training Loss: 0.01226
Epoch: 15498, Training Loss: 0.01515
Epoch: 15499, Training Loss: 0.01163
Epoch: 15499, Training Loss: 0.01224
Epoch: 15499, Training Loss: 0.01226
Epoch: 15499, Training Loss: 0.01514
Epoch: 15500, Training Loss: 0.01163
Epoch: 15500, Training Loss: 0.01224
Epoch: 15500, Training Loss: 0.01226
Epoch: 15500, Training Loss: 0.01514
Epoch: 15501, Training Loss: 0.01163
Epoch: 15501, Training Loss: 0.01224
Epoch: 15501, Training Loss: 0.01226
Epoch: 15501, Training Loss: 0.01514
Epoch: 15502, Training Loss: 0.01162
Epoch: 15502, Training Loss: 0.01224
Epoch: 15502, Training Loss: 0.01226
Epoch: 15502, Training Loss: 0.01514
Epoch: 15503, Training Loss: 0.01162
Epoch: 15503, Training Loss: 0.01224
Epoch: 15503, Training Loss: 0.01226
Epoch: 15503, Training Loss: 0.01514
Epoch: 15504, Training Loss: 0.01162
Epoch: 15504, Training Loss: 0.01224
Epoch: 15504, Training Loss: 0.01226
Epoch: 15504, Training Loss: 0.01514
Epoch: 15505, Training Loss: 0.01162
Epoch: 15505, Training Loss: 0.01224
Epoch: 15505, Training Loss: 0.01226
Epoch: 15505, Training Loss: 0.01514
Epoch: 15506, Training Loss: 0.01162
Epoch: 15506, Training Loss: 0.01224
Epoch: 15506, Training Loss: 0.01225
Epoch: 15506, Training Loss: 0.01514
Epoch: 15507, Training Loss: 0.01162
Epoch: 15507, Training Loss: 0.01224
Epoch: 15507, Training Loss: 0.01225
Epoch: 15507, Training Loss: 0.01514
Epoch: 15508, Training Loss: 0.01162
Epoch: 15508, Training Loss: 0.01224
Epoch: 15508, Training Loss: 0.01225
Epoch: 15508, Training Loss: 0.01514
Epoch: 15509, Training Loss: 0.01162
Epoch: 15509, Training Loss: 0.01224
Epoch: 15509, Training Loss: 0.01225
Epoch: 15509, Training Loss: 0.01514
Epoch: 15510, Training Loss: 0.01162
Epoch: 15510, Training Loss: 0.01224
Epoch: 15510, Training Loss: 0.01225
Epoch: 15510, Training Loss: 0.01514
Epoch: 15511, Training Loss: 0.01162
Epoch: 15511, Training Loss: 0.01224
Epoch: 15511, Training Loss: 0.01225
Epoch: 15511, Training Loss: 0.01514
Epoch: 15512, Training Loss: 0.01162
Epoch: 15512, Training Loss: 0.01223
Epoch: 15512, Training Loss: 0.01225
Epoch: 15512, Training Loss: 0.01514
Epoch: 15513, Training Loss: 0.01162
Epoch: 15513, Training Loss: 0.01223
Epoch: 15513, Training Loss: 0.01225
Epoch: 15513, Training Loss: 0.01514
Epoch: 15514, Training Loss: 0.01162
Epoch: 15514, Training Loss: 0.01223
Epoch: 15514, Training Loss: 0.01225
Epoch: 15514, Training Loss: 0.01514
Epoch: 15515, Training Loss: 0.01162
Epoch: 15515, Training Loss: 0.01223
Epoch: 15515, Training Loss: 0.01225
Epoch: 15515, Training Loss: 0.01513
Epoch: 15516, Training Loss: 0.01162
Epoch: 15516, Training Loss: 0.01223
Epoch: 15516, Training Loss: 0.01225
Epoch: 15516, Training Loss: 0.01513
Epoch: 15517, Training Loss: 0.01162
Epoch: 15517, Training Loss: 0.01223
Epoch: 15517, Training Loss: 0.01225
Epoch: 15517, Training Loss: 0.01513
Epoch: 15518, Training Loss: 0.01162
Epoch: 15518, Training Loss: 0.01223
Epoch: 15518, Training Loss: 0.01225
Epoch: 15518, Training Loss: 0.01513
Epoch: 15519, Training Loss: 0.01162
Epoch: 15519, Training Loss: 0.01223
Epoch: 15519, Training Loss: 0.01225
Epoch: 15519, Training Loss: 0.01513
Epoch: 15520, Training Loss: 0.01162
Epoch: 15520, Training Loss: 0.01223
Epoch: 15520, Training Loss: 0.01225
Epoch: 15520, Training Loss: 0.01513
Epoch: 15521, Training Loss: 0.01162
Epoch: 15521, Training Loss: 0.01223
Epoch: 15521, Training Loss: 0.01225
Epoch: 15521, Training Loss: 0.01513
Epoch: 15522, Training Loss: 0.01162
Epoch: 15522, Training Loss: 0.01223
Epoch: 15522, Training Loss: 0.01225
Epoch: 15522, Training Loss: 0.01513
Epoch: 15523, Training Loss: 0.01162
Epoch: 15523, Training Loss: 0.01223
Epoch: 15523, Training Loss: 0.01225
Epoch: 15523, Training Loss: 0.01513
Epoch: 15524, Training Loss: 0.01161
Epoch: 15524, Training Loss: 0.01223
Epoch: 15524, Training Loss: 0.01225
Epoch: 15524, Training Loss: 0.01513
Epoch: 15525, Training Loss: 0.01161
Epoch: 15525, Training Loss: 0.01223
Epoch: 15525, Training Loss: 0.01225
Epoch: 15525, Training Loss: 0.01513
Epoch: 15526, Training Loss: 0.01161
Epoch: 15526, Training Loss: 0.01223
Epoch: 15526, Training Loss: 0.01225
Epoch: 15526, Training Loss: 0.01513
Epoch: 15527, Training Loss: 0.01161
Epoch: 15527, Training Loss: 0.01223
Epoch: 15527, Training Loss: 0.01224
Epoch: 15527, Training Loss: 0.01513
Epoch: 15528, Training Loss: 0.01161
Epoch: 15528, Training Loss: 0.01223
Epoch: 15528, Training Loss: 0.01224
Epoch: 15528, Training Loss: 0.01513
Epoch: 15529, Training Loss: 0.01161
Epoch: 15529, Training Loss: 0.01223
Epoch: 15529, Training Loss: 0.01224
Epoch: 15529, Training Loss: 0.01513
Epoch: 15530, Training Loss: 0.01161
Epoch: 15530, Training Loss: 0.01223
Epoch: 15530, Training Loss: 0.01224
Epoch: 15530, Training Loss: 0.01513
Epoch: 15531, Training Loss: 0.01161
Epoch: 15531, Training Loss: 0.01223
Epoch: 15531, Training Loss: 0.01224
Epoch: 15531, Training Loss: 0.01513
Epoch: 15532, Training Loss: 0.01161
Epoch: 15532, Training Loss: 0.01223
Epoch: 15532, Training Loss: 0.01224
Epoch: 15532, Training Loss: 0.01512
Epoch: 15533, Training Loss: 0.01161
Epoch: 15533, Training Loss: 0.01222
Epoch: 15533, Training Loss: 0.01224
Epoch: 15533, Training Loss: 0.01512
Epoch: 15534, Training Loss: 0.01161
Epoch: 15534, Training Loss: 0.01222
Epoch: 15534, Training Loss: 0.01224
Epoch: 15534, Training Loss: 0.01512
Epoch: 15535, Training Loss: 0.01161
Epoch: 15535, Training Loss: 0.01222
Epoch: 15535, Training Loss: 0.01224
Epoch: 15535, Training Loss: 0.01512
Epoch: 15536, Training Loss: 0.01161
Epoch: 15536, Training Loss: 0.01222
Epoch: 15536, Training Loss: 0.01224
Epoch: 15536, Training Loss: 0.01512
Epoch: 15537, Training Loss: 0.01161
Epoch: 15537, Training Loss: 0.01222
Epoch: 15537, Training Loss: 0.01224
Epoch: 15537, Training Loss: 0.01512
Epoch: 15538, Training Loss: 0.01161
Epoch: 15538, Training Loss: 0.01222
Epoch: 15538, Training Loss: 0.01224
Epoch: 15538, Training Loss: 0.01512
Epoch: 15539, Training Loss: 0.01161
Epoch: 15539, Training Loss: 0.01222
Epoch: 15539, Training Loss: 0.01224
Epoch: 15539, Training Loss: 0.01512
Epoch: 15540, Training Loss: 0.01161
Epoch: 15540, Training Loss: 0.01222
Epoch: 15540, Training Loss: 0.01224
Epoch: 15540, Training Loss: 0.01512
Epoch: 15541, Training Loss: 0.01161
Epoch: 15541, Training Loss: 0.01222
Epoch: 15541, Training Loss: 0.01224
Epoch: 15541, Training Loss: 0.01512
Epoch: 15542, Training Loss: 0.01161
Epoch: 15542, Training Loss: 0.01222
Epoch: 15542, Training Loss: 0.01224
Epoch: 15542, Training Loss: 0.01512
Epoch: 15543, Training Loss: 0.01161
Epoch: 15543, Training Loss: 0.01222
Epoch: 15543, Training Loss: 0.01224
Epoch: 15543, Training Loss: 0.01512
Epoch: 15544, Training Loss: 0.01161
Epoch: 15544, Training Loss: 0.01222
Epoch: 15544, Training Loss: 0.01224
Epoch: 15544, Training Loss: 0.01512
Epoch: 15545, Training Loss: 0.01161
Epoch: 15545, Training Loss: 0.01222
Epoch: 15545, Training Loss: 0.01224
Epoch: 15545, Training Loss: 0.01512
Epoch: 15546, Training Loss: 0.01160
Epoch: 15546, Training Loss: 0.01222
Epoch: 15546, Training Loss: 0.01224
Epoch: 15546, Training Loss: 0.01512
Epoch: 15547, Training Loss: 0.01160
Epoch: 15547, Training Loss: 0.01222
Epoch: 15547, Training Loss: 0.01223
Epoch: 15547, Training Loss: 0.01512
Epoch: 15548, Training Loss: 0.01160
Epoch: 15548, Training Loss: 0.01222
Epoch: 15548, Training Loss: 0.01223
Epoch: 15548, Training Loss: 0.01511
Epoch: 15549, Training Loss: 0.01160
Epoch: 15549, Training Loss: 0.01222
Epoch: 15549, Training Loss: 0.01223
Epoch: 15549, Training Loss: 0.01511
Epoch: 15550, Training Loss: 0.01160
Epoch: 15550, Training Loss: 0.01222
Epoch: 15550, Training Loss: 0.01223
Epoch: 15550, Training Loss: 0.01511
Epoch: 15551, Training Loss: 0.01160
Epoch: 15551, Training Loss: 0.01222
Epoch: 15551, Training Loss: 0.01223
Epoch: 15551, Training Loss: 0.01511
Epoch: 15552, Training Loss: 0.01160
Epoch: 15552, Training Loss: 0.01222
Epoch: 15552, Training Loss: 0.01223
Epoch: 15552, Training Loss: 0.01511
Epoch: 15553, Training Loss: 0.01160
Epoch: 15553, Training Loss: 0.01221
Epoch: 15553, Training Loss: 0.01223
Epoch: 15553, Training Loss: 0.01511
Epoch: 15554, Training Loss: 0.01160
Epoch: 15554, Training Loss: 0.01221
Epoch: 15554, Training Loss: 0.01223
Epoch: 15554, Training Loss: 0.01511
Epoch: 15555, Training Loss: 0.01160
Epoch: 15555, Training Loss: 0.01221
Epoch: 15555, Training Loss: 0.01223
Epoch: 15555, Training Loss: 0.01511
Epoch: 15556, Training Loss: 0.01160
Epoch: 15556, Training Loss: 0.01221
Epoch: 15556, Training Loss: 0.01223
Epoch: 15556, Training Loss: 0.01511
Epoch: 15557, Training Loss: 0.01160
Epoch: 15557, Training Loss: 0.01221
Epoch: 15557, Training Loss: 0.01223
Epoch: 15557, Training Loss: 0.01511
Epoch: 15558, Training Loss: 0.01160
Epoch: 15558, Training Loss: 0.01221
Epoch: 15558, Training Loss: 0.01223
Epoch: 15558, Training Loss: 0.01511
Epoch: 15559, Training Loss: 0.01160
Epoch: 15559, Training Loss: 0.01221
Epoch: 15559, Training Loss: 0.01223
Epoch: 15559, Training Loss: 0.01511
Epoch: 15560, Training Loss: 0.01160
Epoch: 15560, Training Loss: 0.01221
Epoch: 15560, Training Loss: 0.01223
Epoch: 15560, Training Loss: 0.01511
Epoch: 15561, Training Loss: 0.01160
Epoch: 15561, Training Loss: 0.01221
Epoch: 15561, Training Loss: 0.01223
Epoch: 15561, Training Loss: 0.01511
Epoch: 15562, Training Loss: 0.01160
Epoch: 15562, Training Loss: 0.01221
Epoch: 15562, Training Loss: 0.01223
Epoch: 15562, Training Loss: 0.01511
Epoch: 15563, Training Loss: 0.01160
Epoch: 15563, Training Loss: 0.01221
Epoch: 15563, Training Loss: 0.01223
Epoch: 15563, Training Loss: 0.01511
Epoch: 15564, Training Loss: 0.01160
Epoch: 15564, Training Loss: 0.01221
Epoch: 15564, Training Loss: 0.01223
Epoch: 15564, Training Loss: 0.01510
Epoch: 15565, Training Loss: 0.01160
Epoch: 15565, Training Loss: 0.01221
Epoch: 15565, Training Loss: 0.01223
Epoch: 15565, Training Loss: 0.01510
Epoch: 15566, Training Loss: 0.01160
Epoch: 15566, Training Loss: 0.01221
Epoch: 15566, Training Loss: 0.01223
Epoch: 15566, Training Loss: 0.01510
Epoch: 15567, Training Loss: 0.01160
Epoch: 15567, Training Loss: 0.01221
Epoch: 15567, Training Loss: 0.01223
Epoch: 15567, Training Loss: 0.01510
Epoch: 15568, Training Loss: 0.01160
Epoch: 15568, Training Loss: 0.01221
Epoch: 15568, Training Loss: 0.01222
Epoch: 15568, Training Loss: 0.01510
Epoch: 15569, Training Loss: 0.01159
Epoch: 15569, Training Loss: 0.01221
Epoch: 15569, Training Loss: 0.01222
Epoch: 15569, Training Loss: 0.01510
Epoch: 15570, Training Loss: 0.01159
Epoch: 15570, Training Loss: 0.01221
Epoch: 15570, Training Loss: 0.01222
Epoch: 15570, Training Loss: 0.01510
Epoch: 15571, Training Loss: 0.01159
Epoch: 15571, Training Loss: 0.01221
Epoch: 15571, Training Loss: 0.01222
Epoch: 15571, Training Loss: 0.01510
Epoch: 15572, Training Loss: 0.01159
Epoch: 15572, Training Loss: 0.01221
Epoch: 15572, Training Loss: 0.01222
Epoch: 15572, Training Loss: 0.01510
Epoch: 15573, Training Loss: 0.01159
Epoch: 15573, Training Loss: 0.01221
Epoch: 15573, Training Loss: 0.01222
Epoch: 15573, Training Loss: 0.01510
Epoch: 15574, Training Loss: 0.01159
Epoch: 15574, Training Loss: 0.01220
Epoch: 15574, Training Loss: 0.01222
Epoch: 15574, Training Loss: 0.01510
Epoch: 15575, Training Loss: 0.01159
Epoch: 15575, Training Loss: 0.01220
Epoch: 15575, Training Loss: 0.01222
Epoch: 15575, Training Loss: 0.01510
Epoch: 15576, Training Loss: 0.01159
Epoch: 15576, Training Loss: 0.01220
Epoch: 15576, Training Loss: 0.01222
Epoch: 15576, Training Loss: 0.01510
Epoch: 15577, Training Loss: 0.01159
Epoch: 15577, Training Loss: 0.01220
Epoch: 15577, Training Loss: 0.01222
Epoch: 15577, Training Loss: 0.01510
Epoch: 15578, Training Loss: 0.01159
Epoch: 15578, Training Loss: 0.01220
Epoch: 15578, Training Loss: 0.01222
Epoch: 15578, Training Loss: 0.01510
Epoch: 15579, Training Loss: 0.01159
Epoch: 15579, Training Loss: 0.01220
Epoch: 15579, Training Loss: 0.01222
Epoch: 15579, Training Loss: 0.01510
Epoch: 15580, Training Loss: 0.01159
Epoch: 15580, Training Loss: 0.01220
Epoch: 15580, Training Loss: 0.01222
Epoch: 15580, Training Loss: 0.01510
Epoch: 15581, Training Loss: 0.01159
Epoch: 15581, Training Loss: 0.01220
Epoch: 15581, Training Loss: 0.01222
Epoch: 15581, Training Loss: 0.01509
Epoch: 15582, Training Loss: 0.01159
Epoch: 15582, Training Loss: 0.01220
Epoch: 15582, Training Loss: 0.01222
Epoch: 15582, Training Loss: 0.01509
Epoch: 15583, Training Loss: 0.01159
Epoch: 15583, Training Loss: 0.01220
Epoch: 15583, Training Loss: 0.01222
Epoch: 15583, Training Loss: 0.01509
Epoch: 15584, Training Loss: 0.01159
Epoch: 15584, Training Loss: 0.01220
Epoch: 15584, Training Loss: 0.01222
Epoch: 15584, Training Loss: 0.01509
Epoch: 15585, Training Loss: 0.01159
Epoch: 15585, Training Loss: 0.01220
Epoch: 15585, Training Loss: 0.01222
Epoch: 15585, Training Loss: 0.01509
Epoch: 15586, Training Loss: 0.01159
Epoch: 15586, Training Loss: 0.01220
Epoch: 15586, Training Loss: 0.01222
Epoch: 15586, Training Loss: 0.01509
Epoch: 15587, Training Loss: 0.01159
Epoch: 15587, Training Loss: 0.01220
Epoch: 15587, Training Loss: 0.01222
Epoch: 15587, Training Loss: 0.01509
Epoch: 15588, Training Loss: 0.01159
Epoch: 15588, Training Loss: 0.01220
Epoch: 15588, Training Loss: 0.01221
Epoch: 15588, Training Loss: 0.01509
Epoch: 15589, Training Loss: 0.01159
Epoch: 15589, Training Loss: 0.01220
Epoch: 15589, Training Loss: 0.01221
Epoch: 15589, Training Loss: 0.01509
Epoch: 15590, Training Loss: 0.01159
Epoch: 15590, Training Loss: 0.01220
Epoch: 15590, Training Loss: 0.01221
Epoch: 15590, Training Loss: 0.01509
Epoch: 15591, Training Loss: 0.01158
Epoch: 15591, Training Loss: 0.01220
Epoch: 15591, Training Loss: 0.01221
Epoch: 15591, Training Loss: 0.01509
Epoch: 15592, Training Loss: 0.01158
Epoch: 15592, Training Loss: 0.01220
Epoch: 15592, Training Loss: 0.01221
Epoch: 15592, Training Loss: 0.01509
Epoch: 15593, Training Loss: 0.01158
Epoch: 15593, Training Loss: 0.01220
Epoch: 15593, Training Loss: 0.01221
Epoch: 15593, Training Loss: 0.01509
Epoch: 15594, Training Loss: 0.01158
Epoch: 15594, Training Loss: 0.01220
Epoch: 15594, Training Loss: 0.01221
Epoch: 15594, Training Loss: 0.01509
Epoch: 15595, Training Loss: 0.01158
Epoch: 15595, Training Loss: 0.01219
Epoch: 15595, Training Loss: 0.01221
Epoch: 15595, Training Loss: 0.01509
Epoch: 15596, Training Loss: 0.01158
Epoch: 15596, Training Loss: 0.01219
Epoch: 15596, Training Loss: 0.01221
Epoch: 15596, Training Loss: 0.01509
Epoch: 15597, Training Loss: 0.01158
Epoch: 15597, Training Loss: 0.01219
Epoch: 15597, Training Loss: 0.01221
Epoch: 15597, Training Loss: 0.01508
Epoch: 15598, Training Loss: 0.01158
Epoch: 15598, Training Loss: 0.01219
Epoch: 15598, Training Loss: 0.01221
Epoch: 15598, Training Loss: 0.01508
Epoch: 15599, Training Loss: 0.01158
Epoch: 15599, Training Loss: 0.01219
Epoch: 15599, Training Loss: 0.01221
Epoch: 15599, Training Loss: 0.01508
Epoch: 15600, Training Loss: 0.01158
Epoch: 15600, Training Loss: 0.01219
Epoch: 15600, Training Loss: 0.01221
Epoch: 15600, Training Loss: 0.01508
Epoch: 15601, Training Loss: 0.01158
Epoch: 15601, Training Loss: 0.01219
Epoch: 15601, Training Loss: 0.01221
Epoch: 15601, Training Loss: 0.01508
Epoch: 15602, Training Loss: 0.01158
Epoch: 15602, Training Loss: 0.01219
Epoch: 15602, Training Loss: 0.01221
Epoch: 15602, Training Loss: 0.01508
Epoch: 15603, Training Loss: 0.01158
Epoch: 15603, Training Loss: 0.01219
Epoch: 15603, Training Loss: 0.01221
Epoch: 15603, Training Loss: 0.01508
Epoch: 15604, Training Loss: 0.01158
Epoch: 15604, Training Loss: 0.01219
Epoch: 15604, Training Loss: 0.01221
Epoch: 15604, Training Loss: 0.01508
Epoch: 15605, Training Loss: 0.01158
Epoch: 15605, Training Loss: 0.01219
Epoch: 15605, Training Loss: 0.01221
Epoch: 15605, Training Loss: 0.01508
Epoch: 15606, Training Loss: 0.01158
Epoch: 15606, Training Loss: 0.01219
Epoch: 15606, Training Loss: 0.01221
Epoch: 15606, Training Loss: 0.01508
Epoch: 15607, Training Loss: 0.01158
Epoch: 15607, Training Loss: 0.01219
Epoch: 15607, Training Loss: 0.01221
Epoch: 15607, Training Loss: 0.01508
Epoch: 15608, Training Loss: 0.01158
Epoch: 15608, Training Loss: 0.01219
Epoch: 15608, Training Loss: 0.01221
Epoch: 15608, Training Loss: 0.01508
Epoch: 15609, Training Loss: 0.01158
Epoch: 15609, Training Loss: 0.01219
Epoch: 15609, Training Loss: 0.01220
Epoch: 15609, Training Loss: 0.01508
Epoch: 15610, Training Loss: 0.01158
Epoch: 15610, Training Loss: 0.01219
Epoch: 15610, Training Loss: 0.01220
Epoch: 15610, Training Loss: 0.01508
Epoch: 15611, Training Loss: 0.01158
Epoch: 15611, Training Loss: 0.01219
Epoch: 15611, Training Loss: 0.01220
Epoch: 15611, Training Loss: 0.01508
Epoch: 15612, Training Loss: 0.01158
Epoch: 15612, Training Loss: 0.01219
Epoch: 15612, Training Loss: 0.01220
Epoch: 15612, Training Loss: 0.01508
Epoch: 15613, Training Loss: 0.01158
Epoch: 15613, Training Loss: 0.01219
Epoch: 15613, Training Loss: 0.01220
Epoch: 15613, Training Loss: 0.01508
Epoch: 15614, Training Loss: 0.01157
Epoch: 15614, Training Loss: 0.01219
Epoch: 15614, Training Loss: 0.01220
Epoch: 15614, Training Loss: 0.01507
Epoch: 15615, Training Loss: 0.01157
Epoch: 15615, Training Loss: 0.01218
Epoch: 15615, Training Loss: 0.01220
Epoch: 15615, Training Loss: 0.01507
Epoch: 15616, Training Loss: 0.01157
Epoch: 15616, Training Loss: 0.01218
Epoch: 15616, Training Loss: 0.01220
Epoch: 15616, Training Loss: 0.01507
Epoch: 15617, Training Loss: 0.01157
Epoch: 15617, Training Loss: 0.01218
Epoch: 15617, Training Loss: 0.01220
Epoch: 15617, Training Loss: 0.01507
Epoch: 15618, Training Loss: 0.01157
Epoch: 15618, Training Loss: 0.01218
Epoch: 15618, Training Loss: 0.01220
Epoch: 15618, Training Loss: 0.01507
... [output truncated: the four per-sample losses are printed each epoch and decrease very slowly, from about (0.01157, 0.01218, 0.01220, 0.01507) at epoch 15618 to about (0.01147, 0.01207, 0.01209, 0.01493) near epoch 15860] ...
Epoch: 15861, Training Loss: 0.01147
Epoch: 15861, Training Loss: 0.01207
Epoch: 15861, Training Loss: 0.01209
Epoch: 15861, Training Loss: 0.01493
Epoch: 15862, Training Loss: 0.01147
Epoch: 15862, Training Loss: 0.01207
Epoch: 15862, Training Loss: 0.01208
Epoch: 15862, Training Loss: 0.01493
Epoch: 15863, Training Loss: 0.01147
Epoch: 15863, Training Loss: 0.01207
Epoch: 15863, Training Loss: 0.01208
Epoch: 15863, Training Loss: 0.01493
Epoch: 15864, Training Loss: 0.01147
Epoch: 15864, Training Loss: 0.01207
Epoch: 15864, Training Loss: 0.01208
Epoch: 15864, Training Loss: 0.01493
Epoch: 15865, Training Loss: 0.01147
Epoch: 15865, Training Loss: 0.01207
Epoch: 15865, Training Loss: 0.01208
Epoch: 15865, Training Loss: 0.01492
Epoch: 15866, Training Loss: 0.01146
Epoch: 15866, Training Loss: 0.01207
Epoch: 15866, Training Loss: 0.01208
Epoch: 15866, Training Loss: 0.01492
Epoch: 15867, Training Loss: 0.01146
Epoch: 15867, Training Loss: 0.01207
Epoch: 15867, Training Loss: 0.01208
Epoch: 15867, Training Loss: 0.01492
Epoch: 15868, Training Loss: 0.01146
Epoch: 15868, Training Loss: 0.01207
Epoch: 15868, Training Loss: 0.01208
Epoch: 15868, Training Loss: 0.01492
Epoch: 15869, Training Loss: 0.01146
Epoch: 15869, Training Loss: 0.01206
Epoch: 15869, Training Loss: 0.01208
Epoch: 15869, Training Loss: 0.01492
Epoch: 15870, Training Loss: 0.01146
Epoch: 15870, Training Loss: 0.01206
Epoch: 15870, Training Loss: 0.01208
Epoch: 15870, Training Loss: 0.01492
Epoch: 15871, Training Loss: 0.01146
Epoch: 15871, Training Loss: 0.01206
Epoch: 15871, Training Loss: 0.01208
Epoch: 15871, Training Loss: 0.01492
Epoch: 15872, Training Loss: 0.01146
Epoch: 15872, Training Loss: 0.01206
Epoch: 15872, Training Loss: 0.01208
Epoch: 15872, Training Loss: 0.01492
Epoch: 15873, Training Loss: 0.01146
Epoch: 15873, Training Loss: 0.01206
Epoch: 15873, Training Loss: 0.01208
Epoch: 15873, Training Loss: 0.01492
Epoch: 15874, Training Loss: 0.01146
Epoch: 15874, Training Loss: 0.01206
Epoch: 15874, Training Loss: 0.01208
Epoch: 15874, Training Loss: 0.01492
Epoch: 15875, Training Loss: 0.01146
Epoch: 15875, Training Loss: 0.01206
Epoch: 15875, Training Loss: 0.01208
Epoch: 15875, Training Loss: 0.01492
Epoch: 15876, Training Loss: 0.01146
Epoch: 15876, Training Loss: 0.01206
Epoch: 15876, Training Loss: 0.01208
Epoch: 15876, Training Loss: 0.01492
Epoch: 15877, Training Loss: 0.01146
Epoch: 15877, Training Loss: 0.01206
Epoch: 15877, Training Loss: 0.01208
Epoch: 15877, Training Loss: 0.01492
Epoch: 15878, Training Loss: 0.01146
Epoch: 15878, Training Loss: 0.01206
Epoch: 15878, Training Loss: 0.01208
Epoch: 15878, Training Loss: 0.01492
Epoch: 15879, Training Loss: 0.01146
Epoch: 15879, Training Loss: 0.01206
Epoch: 15879, Training Loss: 0.01208
Epoch: 15879, Training Loss: 0.01492
Epoch: 15880, Training Loss: 0.01146
Epoch: 15880, Training Loss: 0.01206
Epoch: 15880, Training Loss: 0.01208
Epoch: 15880, Training Loss: 0.01492
Epoch: 15881, Training Loss: 0.01146
Epoch: 15881, Training Loss: 0.01206
Epoch: 15881, Training Loss: 0.01208
Epoch: 15881, Training Loss: 0.01492
Epoch: 15882, Training Loss: 0.01146
Epoch: 15882, Training Loss: 0.01206
Epoch: 15882, Training Loss: 0.01208
Epoch: 15882, Training Loss: 0.01491
Epoch: 15883, Training Loss: 0.01146
Epoch: 15883, Training Loss: 0.01206
Epoch: 15883, Training Loss: 0.01207
Epoch: 15883, Training Loss: 0.01491
Epoch: 15884, Training Loss: 0.01146
Epoch: 15884, Training Loss: 0.01206
Epoch: 15884, Training Loss: 0.01207
Epoch: 15884, Training Loss: 0.01491
Epoch: 15885, Training Loss: 0.01146
Epoch: 15885, Training Loss: 0.01206
Epoch: 15885, Training Loss: 0.01207
Epoch: 15885, Training Loss: 0.01491
Epoch: 15886, Training Loss: 0.01146
Epoch: 15886, Training Loss: 0.01206
Epoch: 15886, Training Loss: 0.01207
Epoch: 15886, Training Loss: 0.01491
Epoch: 15887, Training Loss: 0.01146
Epoch: 15887, Training Loss: 0.01206
Epoch: 15887, Training Loss: 0.01207
Epoch: 15887, Training Loss: 0.01491
Epoch: 15888, Training Loss: 0.01146
Epoch: 15888, Training Loss: 0.01206
Epoch: 15888, Training Loss: 0.01207
Epoch: 15888, Training Loss: 0.01491
Epoch: 15889, Training Loss: 0.01145
Epoch: 15889, Training Loss: 0.01206
Epoch: 15889, Training Loss: 0.01207
Epoch: 15889, Training Loss: 0.01491
Epoch: 15890, Training Loss: 0.01145
Epoch: 15890, Training Loss: 0.01205
Epoch: 15890, Training Loss: 0.01207
Epoch: 15890, Training Loss: 0.01491
Epoch: 15891, Training Loss: 0.01145
Epoch: 15891, Training Loss: 0.01205
Epoch: 15891, Training Loss: 0.01207
Epoch: 15891, Training Loss: 0.01491
Epoch: 15892, Training Loss: 0.01145
Epoch: 15892, Training Loss: 0.01205
Epoch: 15892, Training Loss: 0.01207
Epoch: 15892, Training Loss: 0.01491
Epoch: 15893, Training Loss: 0.01145
Epoch: 15893, Training Loss: 0.01205
Epoch: 15893, Training Loss: 0.01207
Epoch: 15893, Training Loss: 0.01491
Epoch: 15894, Training Loss: 0.01145
Epoch: 15894, Training Loss: 0.01205
Epoch: 15894, Training Loss: 0.01207
Epoch: 15894, Training Loss: 0.01491
Epoch: 15895, Training Loss: 0.01145
Epoch: 15895, Training Loss: 0.01205
Epoch: 15895, Training Loss: 0.01207
Epoch: 15895, Training Loss: 0.01491
Epoch: 15896, Training Loss: 0.01145
Epoch: 15896, Training Loss: 0.01205
Epoch: 15896, Training Loss: 0.01207
Epoch: 15896, Training Loss: 0.01491
Epoch: 15897, Training Loss: 0.01145
Epoch: 15897, Training Loss: 0.01205
Epoch: 15897, Training Loss: 0.01207
Epoch: 15897, Training Loss: 0.01491
Epoch: 15898, Training Loss: 0.01145
Epoch: 15898, Training Loss: 0.01205
Epoch: 15898, Training Loss: 0.01207
Epoch: 15898, Training Loss: 0.01491
Epoch: 15899, Training Loss: 0.01145
Epoch: 15899, Training Loss: 0.01205
Epoch: 15899, Training Loss: 0.01207
Epoch: 15899, Training Loss: 0.01490
Epoch: 15900, Training Loss: 0.01145
Epoch: 15900, Training Loss: 0.01205
Epoch: 15900, Training Loss: 0.01207
Epoch: 15900, Training Loss: 0.01490
Epoch: 15901, Training Loss: 0.01145
Epoch: 15901, Training Loss: 0.01205
Epoch: 15901, Training Loss: 0.01207
Epoch: 15901, Training Loss: 0.01490
Epoch: 15902, Training Loss: 0.01145
Epoch: 15902, Training Loss: 0.01205
Epoch: 15902, Training Loss: 0.01207
Epoch: 15902, Training Loss: 0.01490
Epoch: 15903, Training Loss: 0.01145
Epoch: 15903, Training Loss: 0.01205
Epoch: 15903, Training Loss: 0.01207
Epoch: 15903, Training Loss: 0.01490
Epoch: 15904, Training Loss: 0.01145
Epoch: 15904, Training Loss: 0.01205
Epoch: 15904, Training Loss: 0.01206
Epoch: 15904, Training Loss: 0.01490
Epoch: 15905, Training Loss: 0.01145
Epoch: 15905, Training Loss: 0.01205
Epoch: 15905, Training Loss: 0.01206
Epoch: 15905, Training Loss: 0.01490
Epoch: 15906, Training Loss: 0.01145
Epoch: 15906, Training Loss: 0.01205
Epoch: 15906, Training Loss: 0.01206
Epoch: 15906, Training Loss: 0.01490
Epoch: 15907, Training Loss: 0.01145
Epoch: 15907, Training Loss: 0.01205
Epoch: 15907, Training Loss: 0.01206
Epoch: 15907, Training Loss: 0.01490
Epoch: 15908, Training Loss: 0.01145
Epoch: 15908, Training Loss: 0.01205
Epoch: 15908, Training Loss: 0.01206
Epoch: 15908, Training Loss: 0.01490
Epoch: 15909, Training Loss: 0.01145
Epoch: 15909, Training Loss: 0.01205
Epoch: 15909, Training Loss: 0.01206
Epoch: 15909, Training Loss: 0.01490
Epoch: 15910, Training Loss: 0.01145
Epoch: 15910, Training Loss: 0.01205
Epoch: 15910, Training Loss: 0.01206
Epoch: 15910, Training Loss: 0.01490
Epoch: 15911, Training Loss: 0.01145
Epoch: 15911, Training Loss: 0.01205
Epoch: 15911, Training Loss: 0.01206
Epoch: 15911, Training Loss: 0.01490
Epoch: 15912, Training Loss: 0.01145
Epoch: 15912, Training Loss: 0.01204
Epoch: 15912, Training Loss: 0.01206
Epoch: 15912, Training Loss: 0.01490
Epoch: 15913, Training Loss: 0.01144
Epoch: 15913, Training Loss: 0.01204
Epoch: 15913, Training Loss: 0.01206
Epoch: 15913, Training Loss: 0.01490
Epoch: 15914, Training Loss: 0.01144
Epoch: 15914, Training Loss: 0.01204
Epoch: 15914, Training Loss: 0.01206
Epoch: 15914, Training Loss: 0.01490
Epoch: 15915, Training Loss: 0.01144
Epoch: 15915, Training Loss: 0.01204
Epoch: 15915, Training Loss: 0.01206
Epoch: 15915, Training Loss: 0.01490
Epoch: 15916, Training Loss: 0.01144
Epoch: 15916, Training Loss: 0.01204
Epoch: 15916, Training Loss: 0.01206
Epoch: 15916, Training Loss: 0.01489
Epoch: 15917, Training Loss: 0.01144
Epoch: 15917, Training Loss: 0.01204
Epoch: 15917, Training Loss: 0.01206
Epoch: 15917, Training Loss: 0.01489
Epoch: 15918, Training Loss: 0.01144
Epoch: 15918, Training Loss: 0.01204
Epoch: 15918, Training Loss: 0.01206
Epoch: 15918, Training Loss: 0.01489
Epoch: 15919, Training Loss: 0.01144
Epoch: 15919, Training Loss: 0.01204
Epoch: 15919, Training Loss: 0.01206
Epoch: 15919, Training Loss: 0.01489
Epoch: 15920, Training Loss: 0.01144
Epoch: 15920, Training Loss: 0.01204
Epoch: 15920, Training Loss: 0.01206
Epoch: 15920, Training Loss: 0.01489
Epoch: 15921, Training Loss: 0.01144
Epoch: 15921, Training Loss: 0.01204
Epoch: 15921, Training Loss: 0.01206
Epoch: 15921, Training Loss: 0.01489
Epoch: 15922, Training Loss: 0.01144
Epoch: 15922, Training Loss: 0.01204
Epoch: 15922, Training Loss: 0.01206
Epoch: 15922, Training Loss: 0.01489
Epoch: 15923, Training Loss: 0.01144
Epoch: 15923, Training Loss: 0.01204
Epoch: 15923, Training Loss: 0.01206
Epoch: 15923, Training Loss: 0.01489
Epoch: 15924, Training Loss: 0.01144
Epoch: 15924, Training Loss: 0.01204
Epoch: 15924, Training Loss: 0.01206
Epoch: 15924, Training Loss: 0.01489
Epoch: 15925, Training Loss: 0.01144
Epoch: 15925, Training Loss: 0.01204
Epoch: 15925, Training Loss: 0.01206
Epoch: 15925, Training Loss: 0.01489
Epoch: 15926, Training Loss: 0.01144
Epoch: 15926, Training Loss: 0.01204
Epoch: 15926, Training Loss: 0.01205
Epoch: 15926, Training Loss: 0.01489
Epoch: 15927, Training Loss: 0.01144
Epoch: 15927, Training Loss: 0.01204
Epoch: 15927, Training Loss: 0.01205
Epoch: 15927, Training Loss: 0.01489
Epoch: 15928, Training Loss: 0.01144
Epoch: 15928, Training Loss: 0.01204
Epoch: 15928, Training Loss: 0.01205
Epoch: 15928, Training Loss: 0.01489
Epoch: 15929, Training Loss: 0.01144
Epoch: 15929, Training Loss: 0.01204
Epoch: 15929, Training Loss: 0.01205
Epoch: 15929, Training Loss: 0.01489
Epoch: 15930, Training Loss: 0.01144
Epoch: 15930, Training Loss: 0.01204
Epoch: 15930, Training Loss: 0.01205
Epoch: 15930, Training Loss: 0.01489
Epoch: 15931, Training Loss: 0.01144
Epoch: 15931, Training Loss: 0.01204
Epoch: 15931, Training Loss: 0.01205
Epoch: 15931, Training Loss: 0.01489
Epoch: 15932, Training Loss: 0.01144
Epoch: 15932, Training Loss: 0.01204
Epoch: 15932, Training Loss: 0.01205
Epoch: 15932, Training Loss: 0.01489
Epoch: 15933, Training Loss: 0.01144
Epoch: 15933, Training Loss: 0.01203
Epoch: 15933, Training Loss: 0.01205
Epoch: 15933, Training Loss: 0.01488
Epoch: 15934, Training Loss: 0.01144
Epoch: 15934, Training Loss: 0.01203
Epoch: 15934, Training Loss: 0.01205
Epoch: 15934, Training Loss: 0.01488
Epoch: 15935, Training Loss: 0.01144
Epoch: 15935, Training Loss: 0.01203
Epoch: 15935, Training Loss: 0.01205
Epoch: 15935, Training Loss: 0.01488
Epoch: 15936, Training Loss: 0.01143
Epoch: 15936, Training Loss: 0.01203
Epoch: 15936, Training Loss: 0.01205
Epoch: 15936, Training Loss: 0.01488
Epoch: 15937, Training Loss: 0.01143
Epoch: 15937, Training Loss: 0.01203
Epoch: 15937, Training Loss: 0.01205
Epoch: 15937, Training Loss: 0.01488
Epoch: 15938, Training Loss: 0.01143
Epoch: 15938, Training Loss: 0.01203
Epoch: 15938, Training Loss: 0.01205
Epoch: 15938, Training Loss: 0.01488
Epoch: 15939, Training Loss: 0.01143
Epoch: 15939, Training Loss: 0.01203
Epoch: 15939, Training Loss: 0.01205
Epoch: 15939, Training Loss: 0.01488
Epoch: 15940, Training Loss: 0.01143
Epoch: 15940, Training Loss: 0.01203
Epoch: 15940, Training Loss: 0.01205
Epoch: 15940, Training Loss: 0.01488
Epoch: 15941, Training Loss: 0.01143
Epoch: 15941, Training Loss: 0.01203
Epoch: 15941, Training Loss: 0.01205
Epoch: 15941, Training Loss: 0.01488
Epoch: 15942, Training Loss: 0.01143
Epoch: 15942, Training Loss: 0.01203
Epoch: 15942, Training Loss: 0.01205
Epoch: 15942, Training Loss: 0.01488
Epoch: 15943, Training Loss: 0.01143
Epoch: 15943, Training Loss: 0.01203
Epoch: 15943, Training Loss: 0.01205
Epoch: 15943, Training Loss: 0.01488
Epoch: 15944, Training Loss: 0.01143
Epoch: 15944, Training Loss: 0.01203
Epoch: 15944, Training Loss: 0.01205
Epoch: 15944, Training Loss: 0.01488
Epoch: 15945, Training Loss: 0.01143
Epoch: 15945, Training Loss: 0.01203
Epoch: 15945, Training Loss: 0.01205
Epoch: 15945, Training Loss: 0.01488
Epoch: 15946, Training Loss: 0.01143
Epoch: 15946, Training Loss: 0.01203
Epoch: 15946, Training Loss: 0.01205
Epoch: 15946, Training Loss: 0.01488
Epoch: 15947, Training Loss: 0.01143
Epoch: 15947, Training Loss: 0.01203
Epoch: 15947, Training Loss: 0.01204
Epoch: 15947, Training Loss: 0.01488
Epoch: 15948, Training Loss: 0.01143
Epoch: 15948, Training Loss: 0.01203
Epoch: 15948, Training Loss: 0.01204
Epoch: 15948, Training Loss: 0.01488
Epoch: 15949, Training Loss: 0.01143
Epoch: 15949, Training Loss: 0.01203
Epoch: 15949, Training Loss: 0.01204
Epoch: 15949, Training Loss: 0.01488
Epoch: 15950, Training Loss: 0.01143
Epoch: 15950, Training Loss: 0.01203
Epoch: 15950, Training Loss: 0.01204
Epoch: 15950, Training Loss: 0.01487
Epoch: 15951, Training Loss: 0.01143
Epoch: 15951, Training Loss: 0.01203
Epoch: 15951, Training Loss: 0.01204
Epoch: 15951, Training Loss: 0.01487
Epoch: 15952, Training Loss: 0.01143
Epoch: 15952, Training Loss: 0.01203
Epoch: 15952, Training Loss: 0.01204
Epoch: 15952, Training Loss: 0.01487
Epoch: 15953, Training Loss: 0.01143
Epoch: 15953, Training Loss: 0.01203
Epoch: 15953, Training Loss: 0.01204
Epoch: 15953, Training Loss: 0.01487
Epoch: 15954, Training Loss: 0.01143
Epoch: 15954, Training Loss: 0.01203
Epoch: 15954, Training Loss: 0.01204
Epoch: 15954, Training Loss: 0.01487
Epoch: 15955, Training Loss: 0.01143
Epoch: 15955, Training Loss: 0.01202
Epoch: 15955, Training Loss: 0.01204
Epoch: 15955, Training Loss: 0.01487
Epoch: 15956, Training Loss: 0.01143
Epoch: 15956, Training Loss: 0.01202
Epoch: 15956, Training Loss: 0.01204
Epoch: 15956, Training Loss: 0.01487
Epoch: 15957, Training Loss: 0.01143
Epoch: 15957, Training Loss: 0.01202
Epoch: 15957, Training Loss: 0.01204
Epoch: 15957, Training Loss: 0.01487
Epoch: 15958, Training Loss: 0.01143
Epoch: 15958, Training Loss: 0.01202
Epoch: 15958, Training Loss: 0.01204
Epoch: 15958, Training Loss: 0.01487
Epoch: 15959, Training Loss: 0.01142
Epoch: 15959, Training Loss: 0.01202
Epoch: 15959, Training Loss: 0.01204
Epoch: 15959, Training Loss: 0.01487
Epoch: 15960, Training Loss: 0.01142
Epoch: 15960, Training Loss: 0.01202
Epoch: 15960, Training Loss: 0.01204
Epoch: 15960, Training Loss: 0.01487
Epoch: 15961, Training Loss: 0.01142
Epoch: 15961, Training Loss: 0.01202
Epoch: 15961, Training Loss: 0.01204
Epoch: 15961, Training Loss: 0.01487
Epoch: 15962, Training Loss: 0.01142
Epoch: 15962, Training Loss: 0.01202
Epoch: 15962, Training Loss: 0.01204
Epoch: 15962, Training Loss: 0.01487
Epoch: 15963, Training Loss: 0.01142
Epoch: 15963, Training Loss: 0.01202
Epoch: 15963, Training Loss: 0.01204
Epoch: 15963, Training Loss: 0.01487
Epoch: 15964, Training Loss: 0.01142
Epoch: 15964, Training Loss: 0.01202
Epoch: 15964, Training Loss: 0.01204
Epoch: 15964, Training Loss: 0.01487
Epoch: 15965, Training Loss: 0.01142
Epoch: 15965, Training Loss: 0.01202
Epoch: 15965, Training Loss: 0.01204
Epoch: 15965, Training Loss: 0.01487
Epoch: 15966, Training Loss: 0.01142
Epoch: 15966, Training Loss: 0.01202
Epoch: 15966, Training Loss: 0.01204
Epoch: 15966, Training Loss: 0.01487
Epoch: 15967, Training Loss: 0.01142
Epoch: 15967, Training Loss: 0.01202
Epoch: 15967, Training Loss: 0.01204
Epoch: 15967, Training Loss: 0.01486
Epoch: 15968, Training Loss: 0.01142
Epoch: 15968, Training Loss: 0.01202
Epoch: 15968, Training Loss: 0.01204
Epoch: 15968, Training Loss: 0.01486
Epoch: 15969, Training Loss: 0.01142
Epoch: 15969, Training Loss: 0.01202
Epoch: 15969, Training Loss: 0.01203
Epoch: 15969, Training Loss: 0.01486
Epoch: 15970, Training Loss: 0.01142
Epoch: 15970, Training Loss: 0.01202
Epoch: 15970, Training Loss: 0.01203
Epoch: 15970, Training Loss: 0.01486
Epoch: 15971, Training Loss: 0.01142
Epoch: 15971, Training Loss: 0.01202
Epoch: 15971, Training Loss: 0.01203
Epoch: 15971, Training Loss: 0.01486
Epoch: 15972, Training Loss: 0.01142
Epoch: 15972, Training Loss: 0.01202
Epoch: 15972, Training Loss: 0.01203
Epoch: 15972, Training Loss: 0.01486
Epoch: 15973, Training Loss: 0.01142
Epoch: 15973, Training Loss: 0.01202
Epoch: 15973, Training Loss: 0.01203
Epoch: 15973, Training Loss: 0.01486
Epoch: 15974, Training Loss: 0.01142
Epoch: 15974, Training Loss: 0.01202
Epoch: 15974, Training Loss: 0.01203
Epoch: 15974, Training Loss: 0.01486
Epoch: 15975, Training Loss: 0.01142
Epoch: 15975, Training Loss: 0.01202
Epoch: 15975, Training Loss: 0.01203
Epoch: 15975, Training Loss: 0.01486
Epoch: 15976, Training Loss: 0.01142
Epoch: 15976, Training Loss: 0.01201
Epoch: 15976, Training Loss: 0.01203
Epoch: 15976, Training Loss: 0.01486
Epoch: 15977, Training Loss: 0.01142
Epoch: 15977, Training Loss: 0.01201
Epoch: 15977, Training Loss: 0.01203
Epoch: 15977, Training Loss: 0.01486
Epoch: 15978, Training Loss: 0.01142
Epoch: 15978, Training Loss: 0.01201
Epoch: 15978, Training Loss: 0.01203
Epoch: 15978, Training Loss: 0.01486
Epoch: 15979, Training Loss: 0.01142
Epoch: 15979, Training Loss: 0.01201
Epoch: 15979, Training Loss: 0.01203
Epoch: 15979, Training Loss: 0.01486
Epoch: 15980, Training Loss: 0.01142
Epoch: 15980, Training Loss: 0.01201
Epoch: 15980, Training Loss: 0.01203
Epoch: 15980, Training Loss: 0.01486
Epoch: 15981, Training Loss: 0.01142
Epoch: 15981, Training Loss: 0.01201
Epoch: 15981, Training Loss: 0.01203
Epoch: 15981, Training Loss: 0.01486
Epoch: 15982, Training Loss: 0.01142
Epoch: 15982, Training Loss: 0.01201
Epoch: 15982, Training Loss: 0.01203
Epoch: 15982, Training Loss: 0.01486
Epoch: 15983, Training Loss: 0.01141
Epoch: 15983, Training Loss: 0.01201
Epoch: 15983, Training Loss: 0.01203
Epoch: 15983, Training Loss: 0.01486
Epoch: 15984, Training Loss: 0.01141
Epoch: 15984, Training Loss: 0.01201
Epoch: 15984, Training Loss: 0.01203
Epoch: 15984, Training Loss: 0.01485
Epoch: 15985, Training Loss: 0.01141
Epoch: 15985, Training Loss: 0.01201
Epoch: 15985, Training Loss: 0.01203
Epoch: 15985, Training Loss: 0.01485
Epoch: 15986, Training Loss: 0.01141
Epoch: 15986, Training Loss: 0.01201
Epoch: 15986, Training Loss: 0.01203
Epoch: 15986, Training Loss: 0.01485
Epoch: 15987, Training Loss: 0.01141
Epoch: 15987, Training Loss: 0.01201
Epoch: 15987, Training Loss: 0.01203
Epoch: 15987, Training Loss: 0.01485
Epoch: 15988, Training Loss: 0.01141
Epoch: 15988, Training Loss: 0.01201
Epoch: 15988, Training Loss: 0.01203
Epoch: 15988, Training Loss: 0.01485
Epoch: 15989, Training Loss: 0.01141
Epoch: 15989, Training Loss: 0.01201
Epoch: 15989, Training Loss: 0.01203
Epoch: 15989, Training Loss: 0.01485
Epoch: 15990, Training Loss: 0.01141
Epoch: 15990, Training Loss: 0.01201
Epoch: 15990, Training Loss: 0.01203
Epoch: 15990, Training Loss: 0.01485
Epoch: 15991, Training Loss: 0.01141
Epoch: 15991, Training Loss: 0.01201
Epoch: 15991, Training Loss: 0.01202
Epoch: 15991, Training Loss: 0.01485
Epoch: 15992, Training Loss: 0.01141
Epoch: 15992, Training Loss: 0.01201
Epoch: 15992, Training Loss: 0.01202
Epoch: 15992, Training Loss: 0.01485
Epoch: 15993, Training Loss: 0.01141
Epoch: 15993, Training Loss: 0.01201
Epoch: 15993, Training Loss: 0.01202
Epoch: 15993, Training Loss: 0.01485
Epoch: 15994, Training Loss: 0.01141
Epoch: 15994, Training Loss: 0.01201
Epoch: 15994, Training Loss: 0.01202
Epoch: 15994, Training Loss: 0.01485
Epoch: 15995, Training Loss: 0.01141
Epoch: 15995, Training Loss: 0.01201
Epoch: 15995, Training Loss: 0.01202
Epoch: 15995, Training Loss: 0.01485
Epoch: 15996, Training Loss: 0.01141
Epoch: 15996, Training Loss: 0.01201
Epoch: 15996, Training Loss: 0.01202
Epoch: 15996, Training Loss: 0.01485
Epoch: 15997, Training Loss: 0.01141
Epoch: 15997, Training Loss: 0.01201
Epoch: 15997, Training Loss: 0.01202
Epoch: 15997, Training Loss: 0.01485
Epoch: 15998, Training Loss: 0.01141
Epoch: 15998, Training Loss: 0.01200
Epoch: 15998, Training Loss: 0.01202
Epoch: 15998, Training Loss: 0.01485
Epoch: 15999, Training Loss: 0.01141
Epoch: 15999, Training Loss: 0.01200
Epoch: 15999, Training Loss: 0.01202
Epoch: 15999, Training Loss: 0.01485
Epoch: 16000, Training Loss: 0.01141
Epoch: 16000, Training Loss: 0.01200
Epoch: 16000, Training Loss: 0.01202
Epoch: 16000, Training Loss: 0.01485
Epoch: 16001, Training Loss: 0.01141
Epoch: 16001, Training Loss: 0.01200
Epoch: 16001, Training Loss: 0.01202
Epoch: 16001, Training Loss: 0.01485
Epoch: 16002, Training Loss: 0.01141
Epoch: 16002, Training Loss: 0.01200
Epoch: 16002, Training Loss: 0.01202
Epoch: 16002, Training Loss: 0.01484
Epoch: 16003, Training Loss: 0.01141
Epoch: 16003, Training Loss: 0.01200
Epoch: 16003, Training Loss: 0.01202
Epoch: 16003, Training Loss: 0.01484
Epoch: 16004, Training Loss: 0.01141
Epoch: 16004, Training Loss: 0.01200
Epoch: 16004, Training Loss: 0.01202
Epoch: 16004, Training Loss: 0.01484
Epoch: 16005, Training Loss: 0.01141
Epoch: 16005, Training Loss: 0.01200
Epoch: 16005, Training Loss: 0.01202
Epoch: 16005, Training Loss: 0.01484
Epoch: 16006, Training Loss: 0.01141
Epoch: 16006, Training Loss: 0.01200
Epoch: 16006, Training Loss: 0.01202
Epoch: 16006, Training Loss: 0.01484
Epoch: 16007, Training Loss: 0.01140
Epoch: 16007, Training Loss: 0.01200
Epoch: 16007, Training Loss: 0.01202
Epoch: 16007, Training Loss: 0.01484
Epoch: 16008, Training Loss: 0.01140
Epoch: 16008, Training Loss: 0.01200
Epoch: 16008, Training Loss: 0.01202
Epoch: 16008, Training Loss: 0.01484
Epoch: 16009, Training Loss: 0.01140
Epoch: 16009, Training Loss: 0.01200
Epoch: 16009, Training Loss: 0.01202
Epoch: 16009, Training Loss: 0.01484
Epoch: 16010, Training Loss: 0.01140
Epoch: 16010, Training Loss: 0.01200
Epoch: 16010, Training Loss: 0.01202
Epoch: 16010, Training Loss: 0.01484
Epoch: 16011, Training Loss: 0.01140
Epoch: 16011, Training Loss: 0.01200
Epoch: 16011, Training Loss: 0.01202
Epoch: 16011, Training Loss: 0.01484
Epoch: 16012, Training Loss: 0.01140
Epoch: 16012, Training Loss: 0.01200
Epoch: 16012, Training Loss: 0.01201
Epoch: 16012, Training Loss: 0.01484
Epoch: 16013, Training Loss: 0.01140
Epoch: 16013, Training Loss: 0.01200
Epoch: 16013, Training Loss: 0.01201
Epoch: 16013, Training Loss: 0.01484
Epoch: 16014, Training Loss: 0.01140
Epoch: 16014, Training Loss: 0.01200
Epoch: 16014, Training Loss: 0.01201
Epoch: 16014, Training Loss: 0.01484
Epoch: 16015, Training Loss: 0.01140
Epoch: 16015, Training Loss: 0.01200
Epoch: 16015, Training Loss: 0.01201
Epoch: 16015, Training Loss: 0.01484
Epoch: 16016, Training Loss: 0.01140
Epoch: 16016, Training Loss: 0.01200
Epoch: 16016, Training Loss: 0.01201
Epoch: 16016, Training Loss: 0.01484
Epoch: 16017, Training Loss: 0.01140
Epoch: 16017, Training Loss: 0.01200
Epoch: 16017, Training Loss: 0.01201
Epoch: 16017, Training Loss: 0.01484
Epoch: 16018, Training Loss: 0.01140
Epoch: 16018, Training Loss: 0.01200
Epoch: 16018, Training Loss: 0.01201
Epoch: 16018, Training Loss: 0.01484
Epoch: 16019, Training Loss: 0.01140
Epoch: 16019, Training Loss: 0.01200
Epoch: 16019, Training Loss: 0.01201
Epoch: 16019, Training Loss: 0.01483
Epoch: 16020, Training Loss: 0.01140
Epoch: 16020, Training Loss: 0.01199
Epoch: 16020, Training Loss: 0.01201
Epoch: 16020, Training Loss: 0.01483
Epoch: 16021, Training Loss: 0.01140
Epoch: 16021, Training Loss: 0.01199
Epoch: 16021, Training Loss: 0.01201
Epoch: 16021, Training Loss: 0.01483
Epoch: 16022, Training Loss: 0.01140
Epoch: 16022, Training Loss: 0.01199
Epoch: 16022, Training Loss: 0.01201
Epoch: 16022, Training Loss: 0.01483
Epoch: 16023, Training Loss: 0.01140
Epoch: 16023, Training Loss: 0.01199
Epoch: 16023, Training Loss: 0.01201
Epoch: 16023, Training Loss: 0.01483
Epoch: 16024, Training Loss: 0.01140
Epoch: 16024, Training Loss: 0.01199
Epoch: 16024, Training Loss: 0.01201
Epoch: 16024, Training Loss: 0.01483
Epoch: 16025, Training Loss: 0.01140
Epoch: 16025, Training Loss: 0.01199
Epoch: 16025, Training Loss: 0.01201
Epoch: 16025, Training Loss: 0.01483
Epoch: 16026, Training Loss: 0.01140
Epoch: 16026, Training Loss: 0.01199
Epoch: 16026, Training Loss: 0.01201
Epoch: 16026, Training Loss: 0.01483
Epoch: 16027, Training Loss: 0.01140
Epoch: 16027, Training Loss: 0.01199
Epoch: 16027, Training Loss: 0.01201
Epoch: 16027, Training Loss: 0.01483
Epoch: 16028, Training Loss: 0.01140
Epoch: 16028, Training Loss: 0.01199
Epoch: 16028, Training Loss: 0.01201
Epoch: 16028, Training Loss: 0.01483
Epoch: 16029, Training Loss: 0.01140
Epoch: 16029, Training Loss: 0.01199
Epoch: 16029, Training Loss: 0.01201
Epoch: 16029, Training Loss: 0.01483
Epoch: 16030, Training Loss: 0.01139
Epoch: 16030, Training Loss: 0.01199
Epoch: 16030, Training Loss: 0.01201
Epoch: 16030, Training Loss: 0.01483
Epoch: 16031, Training Loss: 0.01139
Epoch: 16031, Training Loss: 0.01199
Epoch: 16031, Training Loss: 0.01201
Epoch: 16031, Training Loss: 0.01483
Epoch: 16032, Training Loss: 0.01139
Epoch: 16032, Training Loss: 0.01199
Epoch: 16032, Training Loss: 0.01201
Epoch: 16032, Training Loss: 0.01483
Epoch: 16033, Training Loss: 0.01139
Epoch: 16033, Training Loss: 0.01199
Epoch: 16033, Training Loss: 0.01201
Epoch: 16033, Training Loss: 0.01483
Epoch: 16034, Training Loss: 0.01139
Epoch: 16034, Training Loss: 0.01199
Epoch: 16034, Training Loss: 0.01200
Epoch: 16034, Training Loss: 0.01483
Epoch: 16035, Training Loss: 0.01139
Epoch: 16035, Training Loss: 0.01199
Epoch: 16035, Training Loss: 0.01200
Epoch: 16035, Training Loss: 0.01483
Epoch: 16036, Training Loss: 0.01139
Epoch: 16036, Training Loss: 0.01199
Epoch: 16036, Training Loss: 0.01200
Epoch: 16036, Training Loss: 0.01482
Epoch: 16037, Training Loss: 0.01139
Epoch: 16037, Training Loss: 0.01199
Epoch: 16037, Training Loss: 0.01200
Epoch: 16037, Training Loss: 0.01482
Epoch: 16038, Training Loss: 0.01139
Epoch: 16038, Training Loss: 0.01199
Epoch: 16038, Training Loss: 0.01200
Epoch: 16038, Training Loss: 0.01482
Epoch: 16039, Training Loss: 0.01139
Epoch: 16039, Training Loss: 0.01199
Epoch: 16039, Training Loss: 0.01200
Epoch: 16039, Training Loss: 0.01482
Epoch: 16040, Training Loss: 0.01139
Epoch: 16040, Training Loss: 0.01199
Epoch: 16040, Training Loss: 0.01200
Epoch: 16040, Training Loss: 0.01482
Epoch: 16041, Training Loss: 0.01139
Epoch: 16041, Training Loss: 0.01198
Epoch: 16041, Training Loss: 0.01200
Epoch: 16041, Training Loss: 0.01482
Epoch: 16042, Training Loss: 0.01139
Epoch: 16042, Training Loss: 0.01198
Epoch: 16042, Training Loss: 0.01200
Epoch: 16042, Training Loss: 0.01482
Epoch: 16043, Training Loss: 0.01139
Epoch: 16043, Training Loss: 0.01198
Epoch: 16043, Training Loss: 0.01200
Epoch: 16043, Training Loss: 0.01482
Epoch: 16044, Training Loss: 0.01139
Epoch: 16044, Training Loss: 0.01198
Epoch: 16044, Training Loss: 0.01200
Epoch: 16044, Training Loss: 0.01482
Epoch: 16045, Training Loss: 0.01139
Epoch: 16045, Training Loss: 0.01198
Epoch: 16045, Training Loss: 0.01200
Epoch: 16045, Training Loss: 0.01482
Epoch: 16046, Training Loss: 0.01139
Epoch: 16046, Training Loss: 0.01198
Epoch: 16046, Training Loss: 0.01200
Epoch: 16046, Training Loss: 0.01482
Epoch: 16047, Training Loss: 0.01139
Epoch: 16047, Training Loss: 0.01198
Epoch: 16047, Training Loss: 0.01200
Epoch: 16047, Training Loss: 0.01482
Epoch: 16048, Training Loss: 0.01139
Epoch: 16048, Training Loss: 0.01198
Epoch: 16048, Training Loss: 0.01200
Epoch: 16048, Training Loss: 0.01482
Epoch: 16049, Training Loss: 0.01139
Epoch: 16049, Training Loss: 0.01198
Epoch: 16049, Training Loss: 0.01200
Epoch: 16049, Training Loss: 0.01482
Epoch: 16050, Training Loss: 0.01139
Epoch: 16050, Training Loss: 0.01198
[... training log truncated: epochs 16050–16294, one loss line per XOR training pattern per epoch. The four per-pattern losses decrease very slowly over this span, from about 0.01139 / 0.01198 / 0.01200 / 0.01482 at epoch 16050 to about 0.01128 / 0.01187 / 0.01189 / 0.01468 at epoch 16294 ...]
Epoch: 16294, Training Loss: 0.01187
Epoch: 16294, Training Loss: 0.01189
Epoch: 16294, Training Loss: 0.01468
Epoch: 16295, Training Loss: 0.01128
Epoch: 16295, Training Loss: 0.01187
Epoch: 16295, Training Loss: 0.01189
Epoch: 16295, Training Loss: 0.01468
Epoch: 16296, Training Loss: 0.01128
Epoch: 16296, Training Loss: 0.01187
Epoch: 16296, Training Loss: 0.01189
Epoch: 16296, Training Loss: 0.01468
Epoch: 16297, Training Loss: 0.01128
Epoch: 16297, Training Loss: 0.01187
Epoch: 16297, Training Loss: 0.01189
Epoch: 16297, Training Loss: 0.01468
Epoch: 16298, Training Loss: 0.01128
Epoch: 16298, Training Loss: 0.01187
Epoch: 16298, Training Loss: 0.01189
Epoch: 16298, Training Loss: 0.01468
Epoch: 16299, Training Loss: 0.01128
Epoch: 16299, Training Loss: 0.01187
Epoch: 16299, Training Loss: 0.01188
Epoch: 16299, Training Loss: 0.01468
Epoch: 16300, Training Loss: 0.01128
Epoch: 16300, Training Loss: 0.01187
Epoch: 16300, Training Loss: 0.01188
Epoch: 16300, Training Loss: 0.01467
Epoch: 16301, Training Loss: 0.01128
Epoch: 16301, Training Loss: 0.01187
Epoch: 16301, Training Loss: 0.01188
Epoch: 16301, Training Loss: 0.01467
Epoch: 16302, Training Loss: 0.01128
Epoch: 16302, Training Loss: 0.01187
Epoch: 16302, Training Loss: 0.01188
Epoch: 16302, Training Loss: 0.01467
Epoch: 16303, Training Loss: 0.01128
Epoch: 16303, Training Loss: 0.01187
Epoch: 16303, Training Loss: 0.01188
Epoch: 16303, Training Loss: 0.01467
Epoch: 16304, Training Loss: 0.01128
Epoch: 16304, Training Loss: 0.01187
Epoch: 16304, Training Loss: 0.01188
Epoch: 16304, Training Loss: 0.01467
Epoch: 16305, Training Loss: 0.01128
Epoch: 16305, Training Loss: 0.01187
Epoch: 16305, Training Loss: 0.01188
Epoch: 16305, Training Loss: 0.01467
Epoch: 16306, Training Loss: 0.01128
Epoch: 16306, Training Loss: 0.01187
Epoch: 16306, Training Loss: 0.01188
Epoch: 16306, Training Loss: 0.01467
Epoch: 16307, Training Loss: 0.01128
Epoch: 16307, Training Loss: 0.01186
Epoch: 16307, Training Loss: 0.01188
Epoch: 16307, Training Loss: 0.01467
Epoch: 16308, Training Loss: 0.01128
Epoch: 16308, Training Loss: 0.01186
Epoch: 16308, Training Loss: 0.01188
Epoch: 16308, Training Loss: 0.01467
Epoch: 16309, Training Loss: 0.01128
Epoch: 16309, Training Loss: 0.01186
Epoch: 16309, Training Loss: 0.01188
Epoch: 16309, Training Loss: 0.01467
Epoch: 16310, Training Loss: 0.01128
Epoch: 16310, Training Loss: 0.01186
Epoch: 16310, Training Loss: 0.01188
Epoch: 16310, Training Loss: 0.01467
Epoch: 16311, Training Loss: 0.01128
Epoch: 16311, Training Loss: 0.01186
Epoch: 16311, Training Loss: 0.01188
Epoch: 16311, Training Loss: 0.01467
Epoch: 16312, Training Loss: 0.01128
Epoch: 16312, Training Loss: 0.01186
Epoch: 16312, Training Loss: 0.01188
Epoch: 16312, Training Loss: 0.01467
Epoch: 16313, Training Loss: 0.01128
Epoch: 16313, Training Loss: 0.01186
Epoch: 16313, Training Loss: 0.01188
Epoch: 16313, Training Loss: 0.01467
Epoch: 16314, Training Loss: 0.01128
Epoch: 16314, Training Loss: 0.01186
Epoch: 16314, Training Loss: 0.01188
Epoch: 16314, Training Loss: 0.01467
Epoch: 16315, Training Loss: 0.01128
Epoch: 16315, Training Loss: 0.01186
Epoch: 16315, Training Loss: 0.01188
Epoch: 16315, Training Loss: 0.01467
Epoch: 16316, Training Loss: 0.01128
Epoch: 16316, Training Loss: 0.01186
Epoch: 16316, Training Loss: 0.01188
Epoch: 16316, Training Loss: 0.01467
Epoch: 16317, Training Loss: 0.01128
Epoch: 16317, Training Loss: 0.01186
Epoch: 16317, Training Loss: 0.01188
Epoch: 16317, Training Loss: 0.01467
Epoch: 16318, Training Loss: 0.01128
Epoch: 16318, Training Loss: 0.01186
Epoch: 16318, Training Loss: 0.01188
Epoch: 16318, Training Loss: 0.01466
Epoch: 16319, Training Loss: 0.01127
Epoch: 16319, Training Loss: 0.01186
Epoch: 16319, Training Loss: 0.01188
Epoch: 16319, Training Loss: 0.01466
Epoch: 16320, Training Loss: 0.01127
Epoch: 16320, Training Loss: 0.01186
Epoch: 16320, Training Loss: 0.01188
Epoch: 16320, Training Loss: 0.01466
Epoch: 16321, Training Loss: 0.01127
Epoch: 16321, Training Loss: 0.01186
Epoch: 16321, Training Loss: 0.01188
Epoch: 16321, Training Loss: 0.01466
Epoch: 16322, Training Loss: 0.01127
Epoch: 16322, Training Loss: 0.01186
Epoch: 16322, Training Loss: 0.01187
Epoch: 16322, Training Loss: 0.01466
Epoch: 16323, Training Loss: 0.01127
Epoch: 16323, Training Loss: 0.01186
Epoch: 16323, Training Loss: 0.01187
Epoch: 16323, Training Loss: 0.01466
Epoch: 16324, Training Loss: 0.01127
Epoch: 16324, Training Loss: 0.01186
Epoch: 16324, Training Loss: 0.01187
Epoch: 16324, Training Loss: 0.01466
Epoch: 16325, Training Loss: 0.01127
Epoch: 16325, Training Loss: 0.01186
Epoch: 16325, Training Loss: 0.01187
Epoch: 16325, Training Loss: 0.01466
Epoch: 16326, Training Loss: 0.01127
Epoch: 16326, Training Loss: 0.01186
Epoch: 16326, Training Loss: 0.01187
Epoch: 16326, Training Loss: 0.01466
Epoch: 16327, Training Loss: 0.01127
Epoch: 16327, Training Loss: 0.01186
Epoch: 16327, Training Loss: 0.01187
Epoch: 16327, Training Loss: 0.01466
Epoch: 16328, Training Loss: 0.01127
Epoch: 16328, Training Loss: 0.01186
Epoch: 16328, Training Loss: 0.01187
Epoch: 16328, Training Loss: 0.01466
Epoch: 16329, Training Loss: 0.01127
Epoch: 16329, Training Loss: 0.01186
Epoch: 16329, Training Loss: 0.01187
Epoch: 16329, Training Loss: 0.01466
Epoch: 16330, Training Loss: 0.01127
Epoch: 16330, Training Loss: 0.01185
Epoch: 16330, Training Loss: 0.01187
Epoch: 16330, Training Loss: 0.01466
Epoch: 16331, Training Loss: 0.01127
Epoch: 16331, Training Loss: 0.01185
Epoch: 16331, Training Loss: 0.01187
Epoch: 16331, Training Loss: 0.01466
Epoch: 16332, Training Loss: 0.01127
Epoch: 16332, Training Loss: 0.01185
Epoch: 16332, Training Loss: 0.01187
Epoch: 16332, Training Loss: 0.01466
Epoch: 16333, Training Loss: 0.01127
Epoch: 16333, Training Loss: 0.01185
Epoch: 16333, Training Loss: 0.01187
Epoch: 16333, Training Loss: 0.01466
Epoch: 16334, Training Loss: 0.01127
Epoch: 16334, Training Loss: 0.01185
Epoch: 16334, Training Loss: 0.01187
Epoch: 16334, Training Loss: 0.01466
Epoch: 16335, Training Loss: 0.01127
Epoch: 16335, Training Loss: 0.01185
Epoch: 16335, Training Loss: 0.01187
Epoch: 16335, Training Loss: 0.01466
Epoch: 16336, Training Loss: 0.01127
Epoch: 16336, Training Loss: 0.01185
Epoch: 16336, Training Loss: 0.01187
Epoch: 16336, Training Loss: 0.01465
Epoch: 16337, Training Loss: 0.01127
Epoch: 16337, Training Loss: 0.01185
Epoch: 16337, Training Loss: 0.01187
Epoch: 16337, Training Loss: 0.01465
Epoch: 16338, Training Loss: 0.01127
Epoch: 16338, Training Loss: 0.01185
Epoch: 16338, Training Loss: 0.01187
Epoch: 16338, Training Loss: 0.01465
Epoch: 16339, Training Loss: 0.01127
Epoch: 16339, Training Loss: 0.01185
Epoch: 16339, Training Loss: 0.01187
Epoch: 16339, Training Loss: 0.01465
Epoch: 16340, Training Loss: 0.01127
Epoch: 16340, Training Loss: 0.01185
Epoch: 16340, Training Loss: 0.01187
Epoch: 16340, Training Loss: 0.01465
Epoch: 16341, Training Loss: 0.01127
Epoch: 16341, Training Loss: 0.01185
Epoch: 16341, Training Loss: 0.01187
Epoch: 16341, Training Loss: 0.01465
Epoch: 16342, Training Loss: 0.01127
Epoch: 16342, Training Loss: 0.01185
Epoch: 16342, Training Loss: 0.01187
Epoch: 16342, Training Loss: 0.01465
Epoch: 16343, Training Loss: 0.01126
Epoch: 16343, Training Loss: 0.01185
Epoch: 16343, Training Loss: 0.01187
Epoch: 16343, Training Loss: 0.01465
Epoch: 16344, Training Loss: 0.01126
Epoch: 16344, Training Loss: 0.01185
Epoch: 16344, Training Loss: 0.01186
Epoch: 16344, Training Loss: 0.01465
Epoch: 16345, Training Loss: 0.01126
Epoch: 16345, Training Loss: 0.01185
Epoch: 16345, Training Loss: 0.01186
Epoch: 16345, Training Loss: 0.01465
Epoch: 16346, Training Loss: 0.01126
Epoch: 16346, Training Loss: 0.01185
Epoch: 16346, Training Loss: 0.01186
Epoch: 16346, Training Loss: 0.01465
Epoch: 16347, Training Loss: 0.01126
Epoch: 16347, Training Loss: 0.01185
Epoch: 16347, Training Loss: 0.01186
Epoch: 16347, Training Loss: 0.01465
Epoch: 16348, Training Loss: 0.01126
Epoch: 16348, Training Loss: 0.01185
Epoch: 16348, Training Loss: 0.01186
Epoch: 16348, Training Loss: 0.01465
Epoch: 16349, Training Loss: 0.01126
Epoch: 16349, Training Loss: 0.01185
Epoch: 16349, Training Loss: 0.01186
Epoch: 16349, Training Loss: 0.01465
Epoch: 16350, Training Loss: 0.01126
Epoch: 16350, Training Loss: 0.01185
Epoch: 16350, Training Loss: 0.01186
Epoch: 16350, Training Loss: 0.01465
Epoch: 16351, Training Loss: 0.01126
Epoch: 16351, Training Loss: 0.01185
Epoch: 16351, Training Loss: 0.01186
Epoch: 16351, Training Loss: 0.01465
Epoch: 16352, Training Loss: 0.01126
Epoch: 16352, Training Loss: 0.01184
Epoch: 16352, Training Loss: 0.01186
Epoch: 16352, Training Loss: 0.01465
Epoch: 16353, Training Loss: 0.01126
Epoch: 16353, Training Loss: 0.01184
Epoch: 16353, Training Loss: 0.01186
Epoch: 16353, Training Loss: 0.01464
Epoch: 16354, Training Loss: 0.01126
Epoch: 16354, Training Loss: 0.01184
Epoch: 16354, Training Loss: 0.01186
Epoch: 16354, Training Loss: 0.01464
Epoch: 16355, Training Loss: 0.01126
Epoch: 16355, Training Loss: 0.01184
Epoch: 16355, Training Loss: 0.01186
Epoch: 16355, Training Loss: 0.01464
Epoch: 16356, Training Loss: 0.01126
Epoch: 16356, Training Loss: 0.01184
Epoch: 16356, Training Loss: 0.01186
Epoch: 16356, Training Loss: 0.01464
Epoch: 16357, Training Loss: 0.01126
Epoch: 16357, Training Loss: 0.01184
Epoch: 16357, Training Loss: 0.01186
Epoch: 16357, Training Loss: 0.01464
Epoch: 16358, Training Loss: 0.01126
Epoch: 16358, Training Loss: 0.01184
Epoch: 16358, Training Loss: 0.01186
Epoch: 16358, Training Loss: 0.01464
Epoch: 16359, Training Loss: 0.01126
Epoch: 16359, Training Loss: 0.01184
Epoch: 16359, Training Loss: 0.01186
Epoch: 16359, Training Loss: 0.01464
Epoch: 16360, Training Loss: 0.01126
Epoch: 16360, Training Loss: 0.01184
Epoch: 16360, Training Loss: 0.01186
Epoch: 16360, Training Loss: 0.01464
Epoch: 16361, Training Loss: 0.01126
Epoch: 16361, Training Loss: 0.01184
Epoch: 16361, Training Loss: 0.01186
Epoch: 16361, Training Loss: 0.01464
Epoch: 16362, Training Loss: 0.01126
Epoch: 16362, Training Loss: 0.01184
Epoch: 16362, Training Loss: 0.01186
Epoch: 16362, Training Loss: 0.01464
Epoch: 16363, Training Loss: 0.01126
Epoch: 16363, Training Loss: 0.01184
Epoch: 16363, Training Loss: 0.01186
Epoch: 16363, Training Loss: 0.01464
Epoch: 16364, Training Loss: 0.01126
Epoch: 16364, Training Loss: 0.01184
Epoch: 16364, Training Loss: 0.01186
Epoch: 16364, Training Loss: 0.01464
Epoch: 16365, Training Loss: 0.01126
Epoch: 16365, Training Loss: 0.01184
Epoch: 16365, Training Loss: 0.01186
Epoch: 16365, Training Loss: 0.01464
Epoch: 16366, Training Loss: 0.01126
Epoch: 16366, Training Loss: 0.01184
Epoch: 16366, Training Loss: 0.01186
Epoch: 16366, Training Loss: 0.01464
Epoch: 16367, Training Loss: 0.01126
Epoch: 16367, Training Loss: 0.01184
Epoch: 16367, Training Loss: 0.01185
Epoch: 16367, Training Loss: 0.01464
Epoch: 16368, Training Loss: 0.01125
Epoch: 16368, Training Loss: 0.01184
Epoch: 16368, Training Loss: 0.01185
Epoch: 16368, Training Loss: 0.01464
Epoch: 16369, Training Loss: 0.01125
Epoch: 16369, Training Loss: 0.01184
Epoch: 16369, Training Loss: 0.01185
Epoch: 16369, Training Loss: 0.01464
Epoch: 16370, Training Loss: 0.01125
Epoch: 16370, Training Loss: 0.01184
Epoch: 16370, Training Loss: 0.01185
Epoch: 16370, Training Loss: 0.01464
Epoch: 16371, Training Loss: 0.01125
Epoch: 16371, Training Loss: 0.01184
Epoch: 16371, Training Loss: 0.01185
Epoch: 16371, Training Loss: 0.01463
Epoch: 16372, Training Loss: 0.01125
Epoch: 16372, Training Loss: 0.01184
Epoch: 16372, Training Loss: 0.01185
Epoch: 16372, Training Loss: 0.01463
Epoch: 16373, Training Loss: 0.01125
Epoch: 16373, Training Loss: 0.01184
Epoch: 16373, Training Loss: 0.01185
Epoch: 16373, Training Loss: 0.01463
Epoch: 16374, Training Loss: 0.01125
Epoch: 16374, Training Loss: 0.01184
Epoch: 16374, Training Loss: 0.01185
Epoch: 16374, Training Loss: 0.01463
Epoch: 16375, Training Loss: 0.01125
Epoch: 16375, Training Loss: 0.01183
Epoch: 16375, Training Loss: 0.01185
Epoch: 16375, Training Loss: 0.01463
Epoch: 16376, Training Loss: 0.01125
Epoch: 16376, Training Loss: 0.01183
Epoch: 16376, Training Loss: 0.01185
Epoch: 16376, Training Loss: 0.01463
Epoch: 16377, Training Loss: 0.01125
Epoch: 16377, Training Loss: 0.01183
Epoch: 16377, Training Loss: 0.01185
Epoch: 16377, Training Loss: 0.01463
Epoch: 16378, Training Loss: 0.01125
Epoch: 16378, Training Loss: 0.01183
Epoch: 16378, Training Loss: 0.01185
Epoch: 16378, Training Loss: 0.01463
Epoch: 16379, Training Loss: 0.01125
Epoch: 16379, Training Loss: 0.01183
Epoch: 16379, Training Loss: 0.01185
Epoch: 16379, Training Loss: 0.01463
Epoch: 16380, Training Loss: 0.01125
Epoch: 16380, Training Loss: 0.01183
Epoch: 16380, Training Loss: 0.01185
Epoch: 16380, Training Loss: 0.01463
Epoch: 16381, Training Loss: 0.01125
Epoch: 16381, Training Loss: 0.01183
Epoch: 16381, Training Loss: 0.01185
Epoch: 16381, Training Loss: 0.01463
Epoch: 16382, Training Loss: 0.01125
Epoch: 16382, Training Loss: 0.01183
Epoch: 16382, Training Loss: 0.01185
Epoch: 16382, Training Loss: 0.01463
Epoch: 16383, Training Loss: 0.01125
Epoch: 16383, Training Loss: 0.01183
Epoch: 16383, Training Loss: 0.01185
Epoch: 16383, Training Loss: 0.01463
Epoch: 16384, Training Loss: 0.01125
Epoch: 16384, Training Loss: 0.01183
Epoch: 16384, Training Loss: 0.01185
Epoch: 16384, Training Loss: 0.01463
Epoch: 16385, Training Loss: 0.01125
Epoch: 16385, Training Loss: 0.01183
Epoch: 16385, Training Loss: 0.01185
Epoch: 16385, Training Loss: 0.01463
Epoch: 16386, Training Loss: 0.01125
Epoch: 16386, Training Loss: 0.01183
Epoch: 16386, Training Loss: 0.01185
Epoch: 16386, Training Loss: 0.01463
Epoch: 16387, Training Loss: 0.01125
Epoch: 16387, Training Loss: 0.01183
Epoch: 16387, Training Loss: 0.01185
Epoch: 16387, Training Loss: 0.01463
Epoch: 16388, Training Loss: 0.01125
Epoch: 16388, Training Loss: 0.01183
Epoch: 16388, Training Loss: 0.01185
Epoch: 16388, Training Loss: 0.01463
Epoch: 16389, Training Loss: 0.01125
Epoch: 16389, Training Loss: 0.01183
Epoch: 16389, Training Loss: 0.01184
Epoch: 16389, Training Loss: 0.01462
Epoch: 16390, Training Loss: 0.01125
Epoch: 16390, Training Loss: 0.01183
Epoch: 16390, Training Loss: 0.01184
Epoch: 16390, Training Loss: 0.01462
Epoch: 16391, Training Loss: 0.01125
Epoch: 16391, Training Loss: 0.01183
Epoch: 16391, Training Loss: 0.01184
Epoch: 16391, Training Loss: 0.01462
Epoch: 16392, Training Loss: 0.01124
Epoch: 16392, Training Loss: 0.01183
Epoch: 16392, Training Loss: 0.01184
Epoch: 16392, Training Loss: 0.01462
Epoch: 16393, Training Loss: 0.01124
Epoch: 16393, Training Loss: 0.01183
Epoch: 16393, Training Loss: 0.01184
Epoch: 16393, Training Loss: 0.01462
Epoch: 16394, Training Loss: 0.01124
Epoch: 16394, Training Loss: 0.01183
Epoch: 16394, Training Loss: 0.01184
Epoch: 16394, Training Loss: 0.01462
Epoch: 16395, Training Loss: 0.01124
Epoch: 16395, Training Loss: 0.01183
Epoch: 16395, Training Loss: 0.01184
Epoch: 16395, Training Loss: 0.01462
Epoch: 16396, Training Loss: 0.01124
Epoch: 16396, Training Loss: 0.01183
Epoch: 16396, Training Loss: 0.01184
Epoch: 16396, Training Loss: 0.01462
Epoch: 16397, Training Loss: 0.01124
Epoch: 16397, Training Loss: 0.01183
Epoch: 16397, Training Loss: 0.01184
Epoch: 16397, Training Loss: 0.01462
Epoch: 16398, Training Loss: 0.01124
Epoch: 16398, Training Loss: 0.01182
Epoch: 16398, Training Loss: 0.01184
Epoch: 16398, Training Loss: 0.01462
Epoch: 16399, Training Loss: 0.01124
Epoch: 16399, Training Loss: 0.01182
Epoch: 16399, Training Loss: 0.01184
Epoch: 16399, Training Loss: 0.01462
Epoch: 16400, Training Loss: 0.01124
Epoch: 16400, Training Loss: 0.01182
Epoch: 16400, Training Loss: 0.01184
Epoch: 16400, Training Loss: 0.01462
Epoch: 16401, Training Loss: 0.01124
Epoch: 16401, Training Loss: 0.01182
Epoch: 16401, Training Loss: 0.01184
Epoch: 16401, Training Loss: 0.01462
Epoch: 16402, Training Loss: 0.01124
Epoch: 16402, Training Loss: 0.01182
Epoch: 16402, Training Loss: 0.01184
Epoch: 16402, Training Loss: 0.01462
Epoch: 16403, Training Loss: 0.01124
Epoch: 16403, Training Loss: 0.01182
Epoch: 16403, Training Loss: 0.01184
Epoch: 16403, Training Loss: 0.01462
Epoch: 16404, Training Loss: 0.01124
Epoch: 16404, Training Loss: 0.01182
Epoch: 16404, Training Loss: 0.01184
Epoch: 16404, Training Loss: 0.01462
Epoch: 16405, Training Loss: 0.01124
Epoch: 16405, Training Loss: 0.01182
Epoch: 16405, Training Loss: 0.01184
Epoch: 16405, Training Loss: 0.01462
Epoch: 16406, Training Loss: 0.01124
Epoch: 16406, Training Loss: 0.01182
Epoch: 16406, Training Loss: 0.01184
Epoch: 16406, Training Loss: 0.01462
Epoch: 16407, Training Loss: 0.01124
Epoch: 16407, Training Loss: 0.01182
Epoch: 16407, Training Loss: 0.01184
Epoch: 16407, Training Loss: 0.01461
Epoch: 16408, Training Loss: 0.01124
Epoch: 16408, Training Loss: 0.01182
Epoch: 16408, Training Loss: 0.01184
Epoch: 16408, Training Loss: 0.01461
Epoch: 16409, Training Loss: 0.01124
Epoch: 16409, Training Loss: 0.01182
Epoch: 16409, Training Loss: 0.01184
Epoch: 16409, Training Loss: 0.01461
Epoch: 16410, Training Loss: 0.01124
Epoch: 16410, Training Loss: 0.01182
Epoch: 16410, Training Loss: 0.01184
Epoch: 16410, Training Loss: 0.01461
Epoch: 16411, Training Loss: 0.01124
Epoch: 16411, Training Loss: 0.01182
Epoch: 16411, Training Loss: 0.01184
Epoch: 16411, Training Loss: 0.01461
Epoch: 16412, Training Loss: 0.01124
Epoch: 16412, Training Loss: 0.01182
Epoch: 16412, Training Loss: 0.01183
Epoch: 16412, Training Loss: 0.01461
Epoch: 16413, Training Loss: 0.01124
Epoch: 16413, Training Loss: 0.01182
Epoch: 16413, Training Loss: 0.01183
Epoch: 16413, Training Loss: 0.01461
Epoch: 16414, Training Loss: 0.01124
Epoch: 16414, Training Loss: 0.01182
Epoch: 16414, Training Loss: 0.01183
Epoch: 16414, Training Loss: 0.01461
Epoch: 16415, Training Loss: 0.01124
Epoch: 16415, Training Loss: 0.01182
Epoch: 16415, Training Loss: 0.01183
Epoch: 16415, Training Loss: 0.01461
Epoch: 16416, Training Loss: 0.01124
Epoch: 16416, Training Loss: 0.01182
Epoch: 16416, Training Loss: 0.01183
Epoch: 16416, Training Loss: 0.01461
Epoch: 16417, Training Loss: 0.01123
Epoch: 16417, Training Loss: 0.01182
Epoch: 16417, Training Loss: 0.01183
Epoch: 16417, Training Loss: 0.01461
Epoch: 16418, Training Loss: 0.01123
Epoch: 16418, Training Loss: 0.01182
Epoch: 16418, Training Loss: 0.01183
Epoch: 16418, Training Loss: 0.01461
Epoch: 16419, Training Loss: 0.01123
Epoch: 16419, Training Loss: 0.01182
Epoch: 16419, Training Loss: 0.01183
Epoch: 16419, Training Loss: 0.01461
Epoch: 16420, Training Loss: 0.01123
Epoch: 16420, Training Loss: 0.01181
Epoch: 16420, Training Loss: 0.01183
Epoch: 16420, Training Loss: 0.01461
Epoch: 16421, Training Loss: 0.01123
Epoch: 16421, Training Loss: 0.01181
Epoch: 16421, Training Loss: 0.01183
Epoch: 16421, Training Loss: 0.01461
Epoch: 16422, Training Loss: 0.01123
Epoch: 16422, Training Loss: 0.01181
Epoch: 16422, Training Loss: 0.01183
Epoch: 16422, Training Loss: 0.01461
Epoch: 16423, Training Loss: 0.01123
Epoch: 16423, Training Loss: 0.01181
Epoch: 16423, Training Loss: 0.01183
Epoch: 16423, Training Loss: 0.01461
Epoch: 16424, Training Loss: 0.01123
Epoch: 16424, Training Loss: 0.01181
Epoch: 16424, Training Loss: 0.01183
Epoch: 16424, Training Loss: 0.01461
Epoch: 16425, Training Loss: 0.01123
Epoch: 16425, Training Loss: 0.01181
Epoch: 16425, Training Loss: 0.01183
Epoch: 16425, Training Loss: 0.01460
Epoch: 16426, Training Loss: 0.01123
Epoch: 16426, Training Loss: 0.01181
Epoch: 16426, Training Loss: 0.01183
Epoch: 16426, Training Loss: 0.01460
Epoch: 16427, Training Loss: 0.01123
Epoch: 16427, Training Loss: 0.01181
Epoch: 16427, Training Loss: 0.01183
Epoch: 16427, Training Loss: 0.01460
Epoch: 16428, Training Loss: 0.01123
Epoch: 16428, Training Loss: 0.01181
Epoch: 16428, Training Loss: 0.01183
Epoch: 16428, Training Loss: 0.01460
Epoch: 16429, Training Loss: 0.01123
Epoch: 16429, Training Loss: 0.01181
Epoch: 16429, Training Loss: 0.01183
Epoch: 16429, Training Loss: 0.01460
Epoch: 16430, Training Loss: 0.01123
Epoch: 16430, Training Loss: 0.01181
Epoch: 16430, Training Loss: 0.01183
Epoch: 16430, Training Loss: 0.01460
Epoch: 16431, Training Loss: 0.01123
Epoch: 16431, Training Loss: 0.01181
Epoch: 16431, Training Loss: 0.01183
Epoch: 16431, Training Loss: 0.01460
Epoch: 16432, Training Loss: 0.01123
Epoch: 16432, Training Loss: 0.01181
Epoch: 16432, Training Loss: 0.01183
Epoch: 16432, Training Loss: 0.01460
Epoch: 16433, Training Loss: 0.01123
Epoch: 16433, Training Loss: 0.01181
Epoch: 16433, Training Loss: 0.01183
Epoch: 16433, Training Loss: 0.01460
Epoch: 16434, Training Loss: 0.01123
Epoch: 16434, Training Loss: 0.01181
Epoch: 16434, Training Loss: 0.01183
Epoch: 16434, Training Loss: 0.01460
Epoch: 16435, Training Loss: 0.01123
Epoch: 16435, Training Loss: 0.01181
Epoch: 16435, Training Loss: 0.01182
Epoch: 16435, Training Loss: 0.01460
Epoch: 16436, Training Loss: 0.01123
Epoch: 16436, Training Loss: 0.01181
Epoch: 16436, Training Loss: 0.01182
Epoch: 16436, Training Loss: 0.01460
Epoch: 16437, Training Loss: 0.01123
Epoch: 16437, Training Loss: 0.01181
Epoch: 16437, Training Loss: 0.01182
Epoch: 16437, Training Loss: 0.01460
Epoch: 16438, Training Loss: 0.01123
Epoch: 16438, Training Loss: 0.01181
Epoch: 16438, Training Loss: 0.01182
Epoch: 16438, Training Loss: 0.01460
Epoch: 16439, Training Loss: 0.01123
Epoch: 16439, Training Loss: 0.01181
Epoch: 16439, Training Loss: 0.01182
Epoch: 16439, Training Loss: 0.01460
Epoch: 16440, Training Loss: 0.01123
Epoch: 16440, Training Loss: 0.01181
Epoch: 16440, Training Loss: 0.01182
Epoch: 16440, Training Loss: 0.01460
Epoch: 16441, Training Loss: 0.01123
Epoch: 16441, Training Loss: 0.01181
Epoch: 16441, Training Loss: 0.01182
Epoch: 16441, Training Loss: 0.01460
Epoch: 16442, Training Loss: 0.01122
Epoch: 16442, Training Loss: 0.01181
Epoch: 16442, Training Loss: 0.01182
Epoch: 16442, Training Loss: 0.01460
Epoch: 16443, Training Loss: 0.01122
Epoch: 16443, Training Loss: 0.01180
Epoch: 16443, Training Loss: 0.01182
Epoch: 16443, Training Loss: 0.01460
Epoch: 16444, Training Loss: 0.01122
Epoch: 16444, Training Loss: 0.01180
Epoch: 16444, Training Loss: 0.01182
Epoch: 16444, Training Loss: 0.01459
Epoch: 16445, Training Loss: 0.01122
Epoch: 16445, Training Loss: 0.01180
Epoch: 16445, Training Loss: 0.01182
Epoch: 16445, Training Loss: 0.01459
Epoch: 16446, Training Loss: 0.01122
Epoch: 16446, Training Loss: 0.01180
Epoch: 16446, Training Loss: 0.01182
Epoch: 16446, Training Loss: 0.01459
Epoch: 16447, Training Loss: 0.01122
Epoch: 16447, Training Loss: 0.01180
Epoch: 16447, Training Loss: 0.01182
Epoch: 16447, Training Loss: 0.01459
Epoch: 16448, Training Loss: 0.01122
Epoch: 16448, Training Loss: 0.01180
Epoch: 16448, Training Loss: 0.01182
Epoch: 16448, Training Loss: 0.01459
Epoch: 16449, Training Loss: 0.01122
Epoch: 16449, Training Loss: 0.01180
Epoch: 16449, Training Loss: 0.01182
Epoch: 16449, Training Loss: 0.01459
Epoch: 16450, Training Loss: 0.01122
Epoch: 16450, Training Loss: 0.01180
Epoch: 16450, Training Loss: 0.01182
Epoch: 16450, Training Loss: 0.01459
Epoch: 16451, Training Loss: 0.01122
Epoch: 16451, Training Loss: 0.01180
Epoch: 16451, Training Loss: 0.01182
Epoch: 16451, Training Loss: 0.01459
Epoch: 16452, Training Loss: 0.01122
Epoch: 16452, Training Loss: 0.01180
Epoch: 16452, Training Loss: 0.01182
Epoch: 16452, Training Loss: 0.01459
Epoch: 16453, Training Loss: 0.01122
Epoch: 16453, Training Loss: 0.01180
Epoch: 16453, Training Loss: 0.01182
Epoch: 16453, Training Loss: 0.01459
Epoch: 16454, Training Loss: 0.01122
Epoch: 16454, Training Loss: 0.01180
Epoch: 16454, Training Loss: 0.01182
Epoch: 16454, Training Loss: 0.01459
Epoch: 16455, Training Loss: 0.01122
Epoch: 16455, Training Loss: 0.01180
Epoch: 16455, Training Loss: 0.01182
Epoch: 16455, Training Loss: 0.01459
Epoch: 16456, Training Loss: 0.01122
Epoch: 16456, Training Loss: 0.01180
Epoch: 16456, Training Loss: 0.01182
Epoch: 16456, Training Loss: 0.01459
Epoch: 16457, Training Loss: 0.01122
Epoch: 16457, Training Loss: 0.01180
Epoch: 16457, Training Loss: 0.01181
Epoch: 16457, Training Loss: 0.01459
Epoch: 16458, Training Loss: 0.01122
Epoch: 16458, Training Loss: 0.01180
Epoch: 16458, Training Loss: 0.01181
Epoch: 16458, Training Loss: 0.01459
Epoch: 16459, Training Loss: 0.01122
Epoch: 16459, Training Loss: 0.01180
Epoch: 16459, Training Loss: 0.01181
Epoch: 16459, Training Loss: 0.01459
Epoch: 16460, Training Loss: 0.01122
Epoch: 16460, Training Loss: 0.01180
Epoch: 16460, Training Loss: 0.01181
Epoch: 16460, Training Loss: 0.01459
Epoch: 16461, Training Loss: 0.01122
Epoch: 16461, Training Loss: 0.01180
Epoch: 16461, Training Loss: 0.01181
Epoch: 16461, Training Loss: 0.01459
Epoch: 16462, Training Loss: 0.01122
Epoch: 16462, Training Loss: 0.01180
Epoch: 16462, Training Loss: 0.01181
Epoch: 16462, Training Loss: 0.01458
Epoch: 16463, Training Loss: 0.01122
Epoch: 16463, Training Loss: 0.01180
Epoch: 16463, Training Loss: 0.01181
Epoch: 16463, Training Loss: 0.01458
Epoch: 16464, Training Loss: 0.01122
Epoch: 16464, Training Loss: 0.01180
Epoch: 16464, Training Loss: 0.01181
Epoch: 16464, Training Loss: 0.01458
Epoch: 16465, Training Loss: 0.01122
Epoch: 16465, Training Loss: 0.01180
Epoch: 16465, Training Loss: 0.01181
Epoch: 16465, Training Loss: 0.01458
Epoch: 16466, Training Loss: 0.01121
Epoch: 16466, Training Loss: 0.01179
Epoch: 16466, Training Loss: 0.01181
Epoch: 16466, Training Loss: 0.01458
Epoch: 16467, Training Loss: 0.01121
Epoch: 16467, Training Loss: 0.01179
Epoch: 16467, Training Loss: 0.01181
Epoch: 16467, Training Loss: 0.01458
Epoch: 16468, Training Loss: 0.01121
Epoch: 16468, Training Loss: 0.01179
Epoch: 16468, Training Loss: 0.01181
Epoch: 16468, Training Loss: 0.01458
Epoch: 16469, Training Loss: 0.01121
Epoch: 16469, Training Loss: 0.01179
Epoch: 16469, Training Loss: 0.01181
Epoch: 16469, Training Loss: 0.01458
Epoch: 16470, Training Loss: 0.01121
Epoch: 16470, Training Loss: 0.01179
Epoch: 16470, Training Loss: 0.01181
Epoch: 16470, Training Loss: 0.01458
Epoch: 16471, Training Loss: 0.01121
Epoch: 16471, Training Loss: 0.01179
Epoch: 16471, Training Loss: 0.01181
Epoch: 16471, Training Loss: 0.01458
Epoch: 16472, Training Loss: 0.01121
Epoch: 16472, Training Loss: 0.01179
Epoch: 16472, Training Loss: 0.01181
Epoch: 16472, Training Loss: 0.01458
Epoch: 16473, Training Loss: 0.01121
Epoch: 16473, Training Loss: 0.01179
Epoch: 16473, Training Loss: 0.01181
Epoch: 16473, Training Loss: 0.01458
Epoch: 16474, Training Loss: 0.01121
Epoch: 16474, Training Loss: 0.01179
Epoch: 16474, Training Loss: 0.01181
Epoch: 16474, Training Loss: 0.01458
Epoch: 16475, Training Loss: 0.01121
Epoch: 16475, Training Loss: 0.01179
Epoch: 16475, Training Loss: 0.01181
Epoch: 16475, Training Loss: 0.01458
Epoch: 16476, Training Loss: 0.01121
Epoch: 16476, Training Loss: 0.01179
Epoch: 16476, Training Loss: 0.01181
Epoch: 16476, Training Loss: 0.01458
Epoch: 16477, Training Loss: 0.01121
Epoch: 16477, Training Loss: 0.01179
Epoch: 16477, Training Loss: 0.01181
Epoch: 16477, Training Loss: 0.01458
Epoch: 16478, Training Loss: 0.01121
Epoch: 16478, Training Loss: 0.01179
Epoch: 16478, Training Loss: 0.01181
Epoch: 16478, Training Loss: 0.01458
Epoch: 16479, Training Loss: 0.01121
Epoch: 16479, Training Loss: 0.01179
Epoch: 16479, Training Loss: 0.01181
Epoch: 16479, Training Loss: 0.01458
Epoch: 16480, Training Loss: 0.01121
Epoch: 16480, Training Loss: 0.01179
Epoch: 16480, Training Loss: 0.01180
Epoch: 16480, Training Loss: 0.01457
Epoch: 16481, Training Loss: 0.01121
Epoch: 16481, Training Loss: 0.01179
Epoch: 16481, Training Loss: 0.01180
Epoch: 16481, Training Loss: 0.01457
Epoch: 16482, Training Loss: 0.01121
Epoch: 16482, Training Loss: 0.01179
Epoch: 16482, Training Loss: 0.01180
Epoch: 16482, Training Loss: 0.01457
Epoch: 16483, Training Loss: 0.01121
Epoch: 16483, Training Loss: 0.01179
Epoch: 16483, Training Loss: 0.01180
Epoch: 16483, Training Loss: 0.01457
[... output truncated: the loss for each of the four XOR patterns is printed every epoch and decreases very slowly over epochs 16483-16726 ...]
Epoch: 16726, Training Loss: 0.01111
Epoch: 16726, Training Loss: 0.01168
Epoch: 16726, Training Loss: 0.01170
Epoch: 16726, Training Loss: 0.01444
Epoch: 16727, Training Loss: 0.01111
Epoch: 16727, Training Loss: 0.01168
Epoch: 16727, Training Loss: 0.01170
Epoch: 16727, Training Loss: 0.01444
Epoch: 16728, Training Loss: 0.01111
Epoch: 16728, Training Loss: 0.01168
Epoch: 16728, Training Loss: 0.01170
Epoch: 16728, Training Loss: 0.01444
Epoch: 16729, Training Loss: 0.01111
Epoch: 16729, Training Loss: 0.01168
Epoch: 16729, Training Loss: 0.01170
Epoch: 16729, Training Loss: 0.01444
Epoch: 16730, Training Loss: 0.01111
Epoch: 16730, Training Loss: 0.01168
Epoch: 16730, Training Loss: 0.01170
Epoch: 16730, Training Loss: 0.01444
Epoch: 16731, Training Loss: 0.01111
Epoch: 16731, Training Loss: 0.01168
Epoch: 16731, Training Loss: 0.01170
Epoch: 16731, Training Loss: 0.01444
Epoch: 16732, Training Loss: 0.01111
Epoch: 16732, Training Loss: 0.01168
Epoch: 16732, Training Loss: 0.01170
Epoch: 16732, Training Loss: 0.01444
Epoch: 16733, Training Loss: 0.01111
Epoch: 16733, Training Loss: 0.01168
Epoch: 16733, Training Loss: 0.01170
Epoch: 16733, Training Loss: 0.01444
Epoch: 16734, Training Loss: 0.01111
Epoch: 16734, Training Loss: 0.01168
Epoch: 16734, Training Loss: 0.01170
Epoch: 16734, Training Loss: 0.01444
Epoch: 16735, Training Loss: 0.01111
Epoch: 16735, Training Loss: 0.01168
Epoch: 16735, Training Loss: 0.01169
Epoch: 16735, Training Loss: 0.01444
Epoch: 16736, Training Loss: 0.01111
Epoch: 16736, Training Loss: 0.01168
Epoch: 16736, Training Loss: 0.01169
Epoch: 16736, Training Loss: 0.01444
Epoch: 16737, Training Loss: 0.01111
Epoch: 16737, Training Loss: 0.01168
Epoch: 16737, Training Loss: 0.01169
Epoch: 16737, Training Loss: 0.01444
Epoch: 16738, Training Loss: 0.01111
Epoch: 16738, Training Loss: 0.01168
Epoch: 16738, Training Loss: 0.01169
Epoch: 16738, Training Loss: 0.01443
Epoch: 16739, Training Loss: 0.01111
Epoch: 16739, Training Loss: 0.01168
Epoch: 16739, Training Loss: 0.01169
Epoch: 16739, Training Loss: 0.01443
Epoch: 16740, Training Loss: 0.01111
Epoch: 16740, Training Loss: 0.01168
Epoch: 16740, Training Loss: 0.01169
Epoch: 16740, Training Loss: 0.01443
Epoch: 16741, Training Loss: 0.01111
Epoch: 16741, Training Loss: 0.01168
Epoch: 16741, Training Loss: 0.01169
Epoch: 16741, Training Loss: 0.01443
Epoch: 16742, Training Loss: 0.01111
Epoch: 16742, Training Loss: 0.01168
Epoch: 16742, Training Loss: 0.01169
Epoch: 16742, Training Loss: 0.01443
Epoch: 16743, Training Loss: 0.01110
Epoch: 16743, Training Loss: 0.01168
Epoch: 16743, Training Loss: 0.01169
Epoch: 16743, Training Loss: 0.01443
Epoch: 16744, Training Loss: 0.01110
Epoch: 16744, Training Loss: 0.01168
Epoch: 16744, Training Loss: 0.01169
Epoch: 16744, Training Loss: 0.01443
Epoch: 16745, Training Loss: 0.01110
Epoch: 16745, Training Loss: 0.01167
Epoch: 16745, Training Loss: 0.01169
Epoch: 16745, Training Loss: 0.01443
Epoch: 16746, Training Loss: 0.01110
Epoch: 16746, Training Loss: 0.01167
Epoch: 16746, Training Loss: 0.01169
Epoch: 16746, Training Loss: 0.01443
Epoch: 16747, Training Loss: 0.01110
Epoch: 16747, Training Loss: 0.01167
Epoch: 16747, Training Loss: 0.01169
Epoch: 16747, Training Loss: 0.01443
Epoch: 16748, Training Loss: 0.01110
Epoch: 16748, Training Loss: 0.01167
Epoch: 16748, Training Loss: 0.01169
Epoch: 16748, Training Loss: 0.01443
Epoch: 16749, Training Loss: 0.01110
Epoch: 16749, Training Loss: 0.01167
Epoch: 16749, Training Loss: 0.01169
Epoch: 16749, Training Loss: 0.01443
Epoch: 16750, Training Loss: 0.01110
Epoch: 16750, Training Loss: 0.01167
Epoch: 16750, Training Loss: 0.01169
Epoch: 16750, Training Loss: 0.01443
Epoch: 16751, Training Loss: 0.01110
Epoch: 16751, Training Loss: 0.01167
Epoch: 16751, Training Loss: 0.01169
Epoch: 16751, Training Loss: 0.01443
Epoch: 16752, Training Loss: 0.01110
Epoch: 16752, Training Loss: 0.01167
Epoch: 16752, Training Loss: 0.01169
Epoch: 16752, Training Loss: 0.01443
Epoch: 16753, Training Loss: 0.01110
Epoch: 16753, Training Loss: 0.01167
Epoch: 16753, Training Loss: 0.01169
Epoch: 16753, Training Loss: 0.01443
Epoch: 16754, Training Loss: 0.01110
Epoch: 16754, Training Loss: 0.01167
Epoch: 16754, Training Loss: 0.01169
Epoch: 16754, Training Loss: 0.01443
Epoch: 16755, Training Loss: 0.01110
Epoch: 16755, Training Loss: 0.01167
Epoch: 16755, Training Loss: 0.01169
Epoch: 16755, Training Loss: 0.01443
Epoch: 16756, Training Loss: 0.01110
Epoch: 16756, Training Loss: 0.01167
Epoch: 16756, Training Loss: 0.01169
Epoch: 16756, Training Loss: 0.01443
Epoch: 16757, Training Loss: 0.01110
Epoch: 16757, Training Loss: 0.01167
Epoch: 16757, Training Loss: 0.01169
Epoch: 16757, Training Loss: 0.01442
Epoch: 16758, Training Loss: 0.01110
Epoch: 16758, Training Loss: 0.01167
Epoch: 16758, Training Loss: 0.01169
Epoch: 16758, Training Loss: 0.01442
Epoch: 16759, Training Loss: 0.01110
Epoch: 16759, Training Loss: 0.01167
Epoch: 16759, Training Loss: 0.01168
Epoch: 16759, Training Loss: 0.01442
Epoch: 16760, Training Loss: 0.01110
Epoch: 16760, Training Loss: 0.01167
Epoch: 16760, Training Loss: 0.01168
Epoch: 16760, Training Loss: 0.01442
Epoch: 16761, Training Loss: 0.01110
Epoch: 16761, Training Loss: 0.01167
Epoch: 16761, Training Loss: 0.01168
Epoch: 16761, Training Loss: 0.01442
Epoch: 16762, Training Loss: 0.01110
Epoch: 16762, Training Loss: 0.01167
Epoch: 16762, Training Loss: 0.01168
Epoch: 16762, Training Loss: 0.01442
Epoch: 16763, Training Loss: 0.01110
Epoch: 16763, Training Loss: 0.01167
Epoch: 16763, Training Loss: 0.01168
Epoch: 16763, Training Loss: 0.01442
Epoch: 16764, Training Loss: 0.01110
Epoch: 16764, Training Loss: 0.01167
Epoch: 16764, Training Loss: 0.01168
Epoch: 16764, Training Loss: 0.01442
Epoch: 16765, Training Loss: 0.01110
Epoch: 16765, Training Loss: 0.01167
Epoch: 16765, Training Loss: 0.01168
Epoch: 16765, Training Loss: 0.01442
Epoch: 16766, Training Loss: 0.01110
Epoch: 16766, Training Loss: 0.01167
Epoch: 16766, Training Loss: 0.01168
Epoch: 16766, Training Loss: 0.01442
Epoch: 16767, Training Loss: 0.01110
Epoch: 16767, Training Loss: 0.01167
Epoch: 16767, Training Loss: 0.01168
Epoch: 16767, Training Loss: 0.01442
Epoch: 16768, Training Loss: 0.01110
Epoch: 16768, Training Loss: 0.01166
Epoch: 16768, Training Loss: 0.01168
Epoch: 16768, Training Loss: 0.01442
Epoch: 16769, Training Loss: 0.01109
Epoch: 16769, Training Loss: 0.01166
Epoch: 16769, Training Loss: 0.01168
Epoch: 16769, Training Loss: 0.01442
Epoch: 16770, Training Loss: 0.01109
Epoch: 16770, Training Loss: 0.01166
Epoch: 16770, Training Loss: 0.01168
Epoch: 16770, Training Loss: 0.01442
Epoch: 16771, Training Loss: 0.01109
Epoch: 16771, Training Loss: 0.01166
Epoch: 16771, Training Loss: 0.01168
Epoch: 16771, Training Loss: 0.01442
Epoch: 16772, Training Loss: 0.01109
Epoch: 16772, Training Loss: 0.01166
Epoch: 16772, Training Loss: 0.01168
Epoch: 16772, Training Loss: 0.01442
Epoch: 16773, Training Loss: 0.01109
Epoch: 16773, Training Loss: 0.01166
Epoch: 16773, Training Loss: 0.01168
Epoch: 16773, Training Loss: 0.01442
Epoch: 16774, Training Loss: 0.01109
Epoch: 16774, Training Loss: 0.01166
Epoch: 16774, Training Loss: 0.01168
Epoch: 16774, Training Loss: 0.01442
Epoch: 16775, Training Loss: 0.01109
Epoch: 16775, Training Loss: 0.01166
Epoch: 16775, Training Loss: 0.01168
Epoch: 16775, Training Loss: 0.01442
Epoch: 16776, Training Loss: 0.01109
Epoch: 16776, Training Loss: 0.01166
Epoch: 16776, Training Loss: 0.01168
Epoch: 16776, Training Loss: 0.01441
Epoch: 16777, Training Loss: 0.01109
Epoch: 16777, Training Loss: 0.01166
Epoch: 16777, Training Loss: 0.01168
Epoch: 16777, Training Loss: 0.01441
Epoch: 16778, Training Loss: 0.01109
Epoch: 16778, Training Loss: 0.01166
Epoch: 16778, Training Loss: 0.01168
Epoch: 16778, Training Loss: 0.01441
Epoch: 16779, Training Loss: 0.01109
Epoch: 16779, Training Loss: 0.01166
Epoch: 16779, Training Loss: 0.01168
Epoch: 16779, Training Loss: 0.01441
Epoch: 16780, Training Loss: 0.01109
Epoch: 16780, Training Loss: 0.01166
Epoch: 16780, Training Loss: 0.01168
Epoch: 16780, Training Loss: 0.01441
Epoch: 16781, Training Loss: 0.01109
Epoch: 16781, Training Loss: 0.01166
Epoch: 16781, Training Loss: 0.01168
Epoch: 16781, Training Loss: 0.01441
Epoch: 16782, Training Loss: 0.01109
Epoch: 16782, Training Loss: 0.01166
Epoch: 16782, Training Loss: 0.01167
Epoch: 16782, Training Loss: 0.01441
Epoch: 16783, Training Loss: 0.01109
Epoch: 16783, Training Loss: 0.01166
Epoch: 16783, Training Loss: 0.01167
Epoch: 16783, Training Loss: 0.01441
Epoch: 16784, Training Loss: 0.01109
Epoch: 16784, Training Loss: 0.01166
Epoch: 16784, Training Loss: 0.01167
Epoch: 16784, Training Loss: 0.01441
Epoch: 16785, Training Loss: 0.01109
Epoch: 16785, Training Loss: 0.01166
Epoch: 16785, Training Loss: 0.01167
Epoch: 16785, Training Loss: 0.01441
Epoch: 16786, Training Loss: 0.01109
Epoch: 16786, Training Loss: 0.01166
Epoch: 16786, Training Loss: 0.01167
Epoch: 16786, Training Loss: 0.01441
Epoch: 16787, Training Loss: 0.01109
Epoch: 16787, Training Loss: 0.01166
Epoch: 16787, Training Loss: 0.01167
Epoch: 16787, Training Loss: 0.01441
Epoch: 16788, Training Loss: 0.01109
Epoch: 16788, Training Loss: 0.01166
Epoch: 16788, Training Loss: 0.01167
Epoch: 16788, Training Loss: 0.01441
Epoch: 16789, Training Loss: 0.01109
Epoch: 16789, Training Loss: 0.01166
Epoch: 16789, Training Loss: 0.01167
Epoch: 16789, Training Loss: 0.01441
Epoch: 16790, Training Loss: 0.01109
Epoch: 16790, Training Loss: 0.01166
Epoch: 16790, Training Loss: 0.01167
Epoch: 16790, Training Loss: 0.01441
Epoch: 16791, Training Loss: 0.01109
Epoch: 16791, Training Loss: 0.01166
Epoch: 16791, Training Loss: 0.01167
Epoch: 16791, Training Loss: 0.01441
Epoch: 16792, Training Loss: 0.01109
Epoch: 16792, Training Loss: 0.01165
Epoch: 16792, Training Loss: 0.01167
Epoch: 16792, Training Loss: 0.01441
Epoch: 16793, Training Loss: 0.01109
Epoch: 16793, Training Loss: 0.01165
Epoch: 16793, Training Loss: 0.01167
Epoch: 16793, Training Loss: 0.01441
Epoch: 16794, Training Loss: 0.01108
Epoch: 16794, Training Loss: 0.01165
Epoch: 16794, Training Loss: 0.01167
Epoch: 16794, Training Loss: 0.01441
Epoch: 16795, Training Loss: 0.01108
Epoch: 16795, Training Loss: 0.01165
Epoch: 16795, Training Loss: 0.01167
Epoch: 16795, Training Loss: 0.01440
Epoch: 16796, Training Loss: 0.01108
Epoch: 16796, Training Loss: 0.01165
Epoch: 16796, Training Loss: 0.01167
Epoch: 16796, Training Loss: 0.01440
Epoch: 16797, Training Loss: 0.01108
Epoch: 16797, Training Loss: 0.01165
Epoch: 16797, Training Loss: 0.01167
Epoch: 16797, Training Loss: 0.01440
Epoch: 16798, Training Loss: 0.01108
Epoch: 16798, Training Loss: 0.01165
Epoch: 16798, Training Loss: 0.01167
Epoch: 16798, Training Loss: 0.01440
Epoch: 16799, Training Loss: 0.01108
Epoch: 16799, Training Loss: 0.01165
Epoch: 16799, Training Loss: 0.01167
Epoch: 16799, Training Loss: 0.01440
Epoch: 16800, Training Loss: 0.01108
Epoch: 16800, Training Loss: 0.01165
Epoch: 16800, Training Loss: 0.01167
Epoch: 16800, Training Loss: 0.01440
Epoch: 16801, Training Loss: 0.01108
Epoch: 16801, Training Loss: 0.01165
Epoch: 16801, Training Loss: 0.01167
Epoch: 16801, Training Loss: 0.01440
Epoch: 16802, Training Loss: 0.01108
Epoch: 16802, Training Loss: 0.01165
Epoch: 16802, Training Loss: 0.01167
Epoch: 16802, Training Loss: 0.01440
Epoch: 16803, Training Loss: 0.01108
Epoch: 16803, Training Loss: 0.01165
Epoch: 16803, Training Loss: 0.01167
Epoch: 16803, Training Loss: 0.01440
Epoch: 16804, Training Loss: 0.01108
Epoch: 16804, Training Loss: 0.01165
Epoch: 16804, Training Loss: 0.01167
Epoch: 16804, Training Loss: 0.01440
Epoch: 16805, Training Loss: 0.01108
Epoch: 16805, Training Loss: 0.01165
Epoch: 16805, Training Loss: 0.01167
Epoch: 16805, Training Loss: 0.01440
Epoch: 16806, Training Loss: 0.01108
Epoch: 16806, Training Loss: 0.01165
Epoch: 16806, Training Loss: 0.01166
Epoch: 16806, Training Loss: 0.01440
Epoch: 16807, Training Loss: 0.01108
Epoch: 16807, Training Loss: 0.01165
Epoch: 16807, Training Loss: 0.01166
Epoch: 16807, Training Loss: 0.01440
Epoch: 16808, Training Loss: 0.01108
Epoch: 16808, Training Loss: 0.01165
Epoch: 16808, Training Loss: 0.01166
Epoch: 16808, Training Loss: 0.01440
Epoch: 16809, Training Loss: 0.01108
Epoch: 16809, Training Loss: 0.01165
Epoch: 16809, Training Loss: 0.01166
Epoch: 16809, Training Loss: 0.01440
Epoch: 16810, Training Loss: 0.01108
Epoch: 16810, Training Loss: 0.01165
Epoch: 16810, Training Loss: 0.01166
Epoch: 16810, Training Loss: 0.01440
Epoch: 16811, Training Loss: 0.01108
Epoch: 16811, Training Loss: 0.01165
Epoch: 16811, Training Loss: 0.01166
Epoch: 16811, Training Loss: 0.01440
Epoch: 16812, Training Loss: 0.01108
Epoch: 16812, Training Loss: 0.01165
Epoch: 16812, Training Loss: 0.01166
Epoch: 16812, Training Loss: 0.01440
Epoch: 16813, Training Loss: 0.01108
Epoch: 16813, Training Loss: 0.01165
Epoch: 16813, Training Loss: 0.01166
Epoch: 16813, Training Loss: 0.01439
Epoch: 16814, Training Loss: 0.01108
Epoch: 16814, Training Loss: 0.01165
Epoch: 16814, Training Loss: 0.01166
Epoch: 16814, Training Loss: 0.01439
Epoch: 16815, Training Loss: 0.01108
Epoch: 16815, Training Loss: 0.01165
Epoch: 16815, Training Loss: 0.01166
Epoch: 16815, Training Loss: 0.01439
Epoch: 16816, Training Loss: 0.01108
Epoch: 16816, Training Loss: 0.01164
Epoch: 16816, Training Loss: 0.01166
Epoch: 16816, Training Loss: 0.01439
Epoch: 16817, Training Loss: 0.01108
Epoch: 16817, Training Loss: 0.01164
Epoch: 16817, Training Loss: 0.01166
Epoch: 16817, Training Loss: 0.01439
Epoch: 16818, Training Loss: 0.01108
Epoch: 16818, Training Loss: 0.01164
Epoch: 16818, Training Loss: 0.01166
Epoch: 16818, Training Loss: 0.01439
Epoch: 16819, Training Loss: 0.01108
Epoch: 16819, Training Loss: 0.01164
Epoch: 16819, Training Loss: 0.01166
Epoch: 16819, Training Loss: 0.01439
Epoch: 16820, Training Loss: 0.01107
Epoch: 16820, Training Loss: 0.01164
Epoch: 16820, Training Loss: 0.01166
Epoch: 16820, Training Loss: 0.01439
Epoch: 16821, Training Loss: 0.01107
Epoch: 16821, Training Loss: 0.01164
Epoch: 16821, Training Loss: 0.01166
Epoch: 16821, Training Loss: 0.01439
Epoch: 16822, Training Loss: 0.01107
Epoch: 16822, Training Loss: 0.01164
Epoch: 16822, Training Loss: 0.01166
Epoch: 16822, Training Loss: 0.01439
Epoch: 16823, Training Loss: 0.01107
Epoch: 16823, Training Loss: 0.01164
Epoch: 16823, Training Loss: 0.01166
Epoch: 16823, Training Loss: 0.01439
Epoch: 16824, Training Loss: 0.01107
Epoch: 16824, Training Loss: 0.01164
Epoch: 16824, Training Loss: 0.01166
Epoch: 16824, Training Loss: 0.01439
Epoch: 16825, Training Loss: 0.01107
Epoch: 16825, Training Loss: 0.01164
Epoch: 16825, Training Loss: 0.01166
Epoch: 16825, Training Loss: 0.01439
Epoch: 16826, Training Loss: 0.01107
Epoch: 16826, Training Loss: 0.01164
Epoch: 16826, Training Loss: 0.01166
Epoch: 16826, Training Loss: 0.01439
Epoch: 16827, Training Loss: 0.01107
Epoch: 16827, Training Loss: 0.01164
Epoch: 16827, Training Loss: 0.01166
Epoch: 16827, Training Loss: 0.01439
Epoch: 16828, Training Loss: 0.01107
Epoch: 16828, Training Loss: 0.01164
Epoch: 16828, Training Loss: 0.01166
Epoch: 16828, Training Loss: 0.01439
Epoch: 16829, Training Loss: 0.01107
Epoch: 16829, Training Loss: 0.01164
Epoch: 16829, Training Loss: 0.01166
Epoch: 16829, Training Loss: 0.01439
Epoch: 16830, Training Loss: 0.01107
Epoch: 16830, Training Loss: 0.01164
Epoch: 16830, Training Loss: 0.01165
Epoch: 16830, Training Loss: 0.01439
Epoch: 16831, Training Loss: 0.01107
Epoch: 16831, Training Loss: 0.01164
Epoch: 16831, Training Loss: 0.01165
Epoch: 16831, Training Loss: 0.01439
Epoch: 16832, Training Loss: 0.01107
Epoch: 16832, Training Loss: 0.01164
Epoch: 16832, Training Loss: 0.01165
Epoch: 16832, Training Loss: 0.01438
Epoch: 16833, Training Loss: 0.01107
Epoch: 16833, Training Loss: 0.01164
Epoch: 16833, Training Loss: 0.01165
Epoch: 16833, Training Loss: 0.01438
Epoch: 16834, Training Loss: 0.01107
Epoch: 16834, Training Loss: 0.01164
Epoch: 16834, Training Loss: 0.01165
Epoch: 16834, Training Loss: 0.01438
Epoch: 16835, Training Loss: 0.01107
Epoch: 16835, Training Loss: 0.01164
Epoch: 16835, Training Loss: 0.01165
Epoch: 16835, Training Loss: 0.01438
Epoch: 16836, Training Loss: 0.01107
Epoch: 16836, Training Loss: 0.01164
Epoch: 16836, Training Loss: 0.01165
Epoch: 16836, Training Loss: 0.01438
Epoch: 16837, Training Loss: 0.01107
Epoch: 16837, Training Loss: 0.01164
Epoch: 16837, Training Loss: 0.01165
Epoch: 16837, Training Loss: 0.01438
Epoch: 16838, Training Loss: 0.01107
Epoch: 16838, Training Loss: 0.01164
Epoch: 16838, Training Loss: 0.01165
Epoch: 16838, Training Loss: 0.01438
Epoch: 16839, Training Loss: 0.01107
Epoch: 16839, Training Loss: 0.01163
Epoch: 16839, Training Loss: 0.01165
Epoch: 16839, Training Loss: 0.01438
Epoch: 16840, Training Loss: 0.01107
Epoch: 16840, Training Loss: 0.01163
Epoch: 16840, Training Loss: 0.01165
Epoch: 16840, Training Loss: 0.01438
Epoch: 16841, Training Loss: 0.01107
Epoch: 16841, Training Loss: 0.01163
Epoch: 16841, Training Loss: 0.01165
Epoch: 16841, Training Loss: 0.01438
Epoch: 16842, Training Loss: 0.01107
Epoch: 16842, Training Loss: 0.01163
Epoch: 16842, Training Loss: 0.01165
Epoch: 16842, Training Loss: 0.01438
Epoch: 16843, Training Loss: 0.01107
Epoch: 16843, Training Loss: 0.01163
Epoch: 16843, Training Loss: 0.01165
Epoch: 16843, Training Loss: 0.01438
Epoch: 16844, Training Loss: 0.01107
Epoch: 16844, Training Loss: 0.01163
Epoch: 16844, Training Loss: 0.01165
Epoch: 16844, Training Loss: 0.01438
Epoch: 16845, Training Loss: 0.01107
Epoch: 16845, Training Loss: 0.01163
Epoch: 16845, Training Loss: 0.01165
Epoch: 16845, Training Loss: 0.01438
Epoch: 16846, Training Loss: 0.01106
Epoch: 16846, Training Loss: 0.01163
Epoch: 16846, Training Loss: 0.01165
Epoch: 16846, Training Loss: 0.01438
Epoch: 16847, Training Loss: 0.01106
Epoch: 16847, Training Loss: 0.01163
Epoch: 16847, Training Loss: 0.01165
Epoch: 16847, Training Loss: 0.01438
Epoch: 16848, Training Loss: 0.01106
Epoch: 16848, Training Loss: 0.01163
Epoch: 16848, Training Loss: 0.01165
Epoch: 16848, Training Loss: 0.01438
Epoch: 16849, Training Loss: 0.01106
Epoch: 16849, Training Loss: 0.01163
Epoch: 16849, Training Loss: 0.01165
Epoch: 16849, Training Loss: 0.01438
Epoch: 16850, Training Loss: 0.01106
Epoch: 16850, Training Loss: 0.01163
Epoch: 16850, Training Loss: 0.01165
Epoch: 16850, Training Loss: 0.01438
Epoch: 16851, Training Loss: 0.01106
Epoch: 16851, Training Loss: 0.01163
Epoch: 16851, Training Loss: 0.01165
Epoch: 16851, Training Loss: 0.01437
Epoch: 16852, Training Loss: 0.01106
Epoch: 16852, Training Loss: 0.01163
Epoch: 16852, Training Loss: 0.01165
Epoch: 16852, Training Loss: 0.01437
Epoch: 16853, Training Loss: 0.01106
Epoch: 16853, Training Loss: 0.01163
Epoch: 16853, Training Loss: 0.01164
Epoch: 16853, Training Loss: 0.01437
Epoch: 16854, Training Loss: 0.01106
Epoch: 16854, Training Loss: 0.01163
Epoch: 16854, Training Loss: 0.01164
Epoch: 16854, Training Loss: 0.01437
Epoch: 16855, Training Loss: 0.01106
Epoch: 16855, Training Loss: 0.01163
Epoch: 16855, Training Loss: 0.01164
Epoch: 16855, Training Loss: 0.01437
Epoch: 16856, Training Loss: 0.01106
Epoch: 16856, Training Loss: 0.01163
Epoch: 16856, Training Loss: 0.01164
Epoch: 16856, Training Loss: 0.01437
Epoch: 16857, Training Loss: 0.01106
Epoch: 16857, Training Loss: 0.01163
Epoch: 16857, Training Loss: 0.01164
Epoch: 16857, Training Loss: 0.01437
Epoch: 16858, Training Loss: 0.01106
Epoch: 16858, Training Loss: 0.01163
Epoch: 16858, Training Loss: 0.01164
Epoch: 16858, Training Loss: 0.01437
Epoch: 16859, Training Loss: 0.01106
Epoch: 16859, Training Loss: 0.01163
Epoch: 16859, Training Loss: 0.01164
Epoch: 16859, Training Loss: 0.01437
Epoch: 16860, Training Loss: 0.01106
Epoch: 16860, Training Loss: 0.01163
Epoch: 16860, Training Loss: 0.01164
Epoch: 16860, Training Loss: 0.01437
Epoch: 16861, Training Loss: 0.01106
Epoch: 16861, Training Loss: 0.01163
Epoch: 16861, Training Loss: 0.01164
Epoch: 16861, Training Loss: 0.01437
Epoch: 16862, Training Loss: 0.01106
Epoch: 16862, Training Loss: 0.01163
Epoch: 16862, Training Loss: 0.01164
Epoch: 16862, Training Loss: 0.01437
Epoch: 16863, Training Loss: 0.01106
Epoch: 16863, Training Loss: 0.01162
Epoch: 16863, Training Loss: 0.01164
Epoch: 16863, Training Loss: 0.01437
Epoch: 16864, Training Loss: 0.01106
Epoch: 16864, Training Loss: 0.01162
Epoch: 16864, Training Loss: 0.01164
Epoch: 16864, Training Loss: 0.01437
Epoch: 16865, Training Loss: 0.01106
Epoch: 16865, Training Loss: 0.01162
Epoch: 16865, Training Loss: 0.01164
Epoch: 16865, Training Loss: 0.01437
Epoch: 16866, Training Loss: 0.01106
Epoch: 16866, Training Loss: 0.01162
Epoch: 16866, Training Loss: 0.01164
Epoch: 16866, Training Loss: 0.01437
Epoch: 16867, Training Loss: 0.01106
Epoch: 16867, Training Loss: 0.01162
Epoch: 16867, Training Loss: 0.01164
Epoch: 16867, Training Loss: 0.01437
Epoch: 16868, Training Loss: 0.01106
Epoch: 16868, Training Loss: 0.01162
Epoch: 16868, Training Loss: 0.01164
Epoch: 16868, Training Loss: 0.01437
Epoch: 16869, Training Loss: 0.01106
Epoch: 16869, Training Loss: 0.01162
Epoch: 16869, Training Loss: 0.01164
Epoch: 16869, Training Loss: 0.01437
Epoch: 16870, Training Loss: 0.01106
Epoch: 16870, Training Loss: 0.01162
Epoch: 16870, Training Loss: 0.01164
Epoch: 16870, Training Loss: 0.01436
Epoch: 16871, Training Loss: 0.01106
Epoch: 16871, Training Loss: 0.01162
Epoch: 16871, Training Loss: 0.01164
Epoch: 16871, Training Loss: 0.01436
Epoch: 16872, Training Loss: 0.01105
Epoch: 16872, Training Loss: 0.01162
Epoch: 16872, Training Loss: 0.01164
Epoch: 16872, Training Loss: 0.01436
Epoch: 16873, Training Loss: 0.01105
Epoch: 16873, Training Loss: 0.01162
Epoch: 16873, Training Loss: 0.01164
Epoch: 16873, Training Loss: 0.01436
Epoch: 16874, Training Loss: 0.01105
Epoch: 16874, Training Loss: 0.01162
Epoch: 16874, Training Loss: 0.01164
Epoch: 16874, Training Loss: 0.01436
Epoch: 16875, Training Loss: 0.01105
Epoch: 16875, Training Loss: 0.01162
Epoch: 16875, Training Loss: 0.01164
Epoch: 16875, Training Loss: 0.01436
Epoch: 16876, Training Loss: 0.01105
Epoch: 16876, Training Loss: 0.01162
Epoch: 16876, Training Loss: 0.01164
Epoch: 16876, Training Loss: 0.01436
Epoch: 16877, Training Loss: 0.01105
Epoch: 16877, Training Loss: 0.01162
Epoch: 16877, Training Loss: 0.01163
Epoch: 16877, Training Loss: 0.01436
Epoch: 16878, Training Loss: 0.01105
Epoch: 16878, Training Loss: 0.01162
Epoch: 16878, Training Loss: 0.01163
Epoch: 16878, Training Loss: 0.01436
Epoch: 16879, Training Loss: 0.01105
Epoch: 16879, Training Loss: 0.01162
Epoch: 16879, Training Loss: 0.01163
Epoch: 16879, Training Loss: 0.01436
Epoch: 16880, Training Loss: 0.01105
Epoch: 16880, Training Loss: 0.01162
Epoch: 16880, Training Loss: 0.01163
Epoch: 16880, Training Loss: 0.01436
Epoch: 16881, Training Loss: 0.01105
Epoch: 16881, Training Loss: 0.01162
Epoch: 16881, Training Loss: 0.01163
Epoch: 16881, Training Loss: 0.01436
Epoch: 16882, Training Loss: 0.01105
Epoch: 16882, Training Loss: 0.01162
Epoch: 16882, Training Loss: 0.01163
Epoch: 16882, Training Loss: 0.01436
Epoch: 16883, Training Loss: 0.01105
Epoch: 16883, Training Loss: 0.01162
Epoch: 16883, Training Loss: 0.01163
Epoch: 16883, Training Loss: 0.01436
Epoch: 16884, Training Loss: 0.01105
Epoch: 16884, Training Loss: 0.01162
Epoch: 16884, Training Loss: 0.01163
Epoch: 16884, Training Loss: 0.01436
Epoch: 16885, Training Loss: 0.01105
Epoch: 16885, Training Loss: 0.01162
Epoch: 16885, Training Loss: 0.01163
Epoch: 16885, Training Loss: 0.01436
Epoch: 16886, Training Loss: 0.01105
Epoch: 16886, Training Loss: 0.01162
Epoch: 16886, Training Loss: 0.01163
Epoch: 16886, Training Loss: 0.01436
Epoch: 16887, Training Loss: 0.01105
Epoch: 16887, Training Loss: 0.01161
Epoch: 16887, Training Loss: 0.01163
Epoch: 16887, Training Loss: 0.01436
Epoch: 16888, Training Loss: 0.01105
Epoch: 16888, Training Loss: 0.01161
Epoch: 16888, Training Loss: 0.01163
Epoch: 16888, Training Loss: 0.01436
Epoch: 16889, Training Loss: 0.01105
Epoch: 16889, Training Loss: 0.01161
Epoch: 16889, Training Loss: 0.01163
Epoch: 16889, Training Loss: 0.01435
Epoch: 16890, Training Loss: 0.01105
Epoch: 16890, Training Loss: 0.01161
Epoch: 16890, Training Loss: 0.01163
Epoch: 16890, Training Loss: 0.01435
Epoch: 16891, Training Loss: 0.01105
Epoch: 16891, Training Loss: 0.01161
Epoch: 16891, Training Loss: 0.01163
Epoch: 16891, Training Loss: 0.01435
Epoch: 16892, Training Loss: 0.01105
Epoch: 16892, Training Loss: 0.01161
Epoch: 16892, Training Loss: 0.01163
Epoch: 16892, Training Loss: 0.01435
Epoch: 16893, Training Loss: 0.01105
Epoch: 16893, Training Loss: 0.01161
Epoch: 16893, Training Loss: 0.01163
Epoch: 16893, Training Loss: 0.01435
Epoch: 16894, Training Loss: 0.01105
Epoch: 16894, Training Loss: 0.01161
Epoch: 16894, Training Loss: 0.01163
Epoch: 16894, Training Loss: 0.01435
Epoch: 16895, Training Loss: 0.01105
Epoch: 16895, Training Loss: 0.01161
Epoch: 16895, Training Loss: 0.01163
Epoch: 16895, Training Loss: 0.01435
Epoch: 16896, Training Loss: 0.01105
Epoch: 16896, Training Loss: 0.01161
Epoch: 16896, Training Loss: 0.01163
Epoch: 16896, Training Loss: 0.01435
Epoch: 16897, Training Loss: 0.01105
Epoch: 16897, Training Loss: 0.01161
Epoch: 16897, Training Loss: 0.01163
Epoch: 16897, Training Loss: 0.01435
Epoch: 16898, Training Loss: 0.01104
Epoch: 16898, Training Loss: 0.01161
Epoch: 16898, Training Loss: 0.01163
Epoch: 16898, Training Loss: 0.01435
Epoch: 16899, Training Loss: 0.01104
Epoch: 16899, Training Loss: 0.01161
Epoch: 16899, Training Loss: 0.01163
Epoch: 16899, Training Loss: 0.01435
Epoch: 16900, Training Loss: 0.01104
Epoch: 16900, Training Loss: 0.01161
Epoch: 16900, Training Loss: 0.01163
Epoch: 16900, Training Loss: 0.01435
Epoch: 16901, Training Loss: 0.01104
Epoch: 16901, Training Loss: 0.01161
Epoch: 16901, Training Loss: 0.01162
Epoch: 16901, Training Loss: 0.01435
Epoch: 16902, Training Loss: 0.01104
Epoch: 16902, Training Loss: 0.01161
Epoch: 16902, Training Loss: 0.01162
Epoch: 16902, Training Loss: 0.01435
Epoch: 16903, Training Loss: 0.01104
Epoch: 16903, Training Loss: 0.01161
Epoch: 16903, Training Loss: 0.01162
Epoch: 16903, Training Loss: 0.01435
Epoch: 16904, Training Loss: 0.01104
Epoch: 16904, Training Loss: 0.01161
Epoch: 16904, Training Loss: 0.01162
Epoch: 16904, Training Loss: 0.01435
Epoch: 16905, Training Loss: 0.01104
Epoch: 16905, Training Loss: 0.01161
Epoch: 16905, Training Loss: 0.01162
Epoch: 16905, Training Loss: 0.01435
Epoch: 16906, Training Loss: 0.01104
Epoch: 16906, Training Loss: 0.01161
Epoch: 16906, Training Loss: 0.01162
Epoch: 16906, Training Loss: 0.01435
Epoch: 16907, Training Loss: 0.01104
Epoch: 16907, Training Loss: 0.01161
Epoch: 16907, Training Loss: 0.01162
Epoch: 16907, Training Loss: 0.01435
Epoch: 16908, Training Loss: 0.01104
Epoch: 16908, Training Loss: 0.01161
Epoch: 16908, Training Loss: 0.01162
Epoch: 16908, Training Loss: 0.01434
Epoch: 16909, Training Loss: 0.01104
Epoch: 16909, Training Loss: 0.01161
Epoch: 16909, Training Loss: 0.01162
Epoch: 16909, Training Loss: 0.01434
Epoch: 16910, Training Loss: 0.01104
Epoch: 16910, Training Loss: 0.01161
Epoch: 16910, Training Loss: 0.01162
Epoch: 16910, Training Loss: 0.01434
Epoch: 16911, Training Loss: 0.01104
Epoch: 16911, Training Loss: 0.01160
Epoch: 16911, Training Loss: 0.01162
Epoch: 16911, Training Loss: 0.01434
Epoch: 16912, Training Loss: 0.01104
Epoch: 16912, Training Loss: 0.01160
Epoch: 16912, Training Loss: 0.01162
Epoch: 16912, Training Loss: 0.01434
Epoch: 16913, Training Loss: 0.01104
Epoch: 16913, Training Loss: 0.01160
Epoch: 16913, Training Loss: 0.01162
Epoch: 16913, Training Loss: 0.01434
Epoch: 16914, Training Loss: 0.01104
Epoch: 16914, Training Loss: 0.01160
Epoch: 16914, Training Loss: 0.01162
Epoch: 16914, Training Loss: 0.01434
Epoch: 16915, Training Loss: 0.01104
Epoch: 16915, Training Loss: 0.01160
Epoch: 16915, Training Loss: 0.01104
Epoch: 16915, Training Loss: 0.01160
Epoch: 16915, Training Loss: 0.01162
Epoch: 16915, Training Loss: 0.01434
...
Epoch: 17159, Training Loss: 0.01095
[output truncated: the four per-pattern losses are printed every epoch; over epochs 16915-17159 they decrease only very slowly (roughly 0.01104/0.01160/0.01162/0.01434 down to 0.01095/0.01150/0.01152/0.01422), illustrating the slow convergence of the basic quadratic-cost, sigmoid network near the end of training]
Epoch: 17159, Training Loss: 0.01150
Epoch: 17159, Training Loss: 0.01152
Epoch: 17159, Training Loss: 0.01421
Epoch: 17160, Training Loss: 0.01095
Epoch: 17160, Training Loss: 0.01150
Epoch: 17160, Training Loss: 0.01152
Epoch: 17160, Training Loss: 0.01421
Epoch: 17161, Training Loss: 0.01094
Epoch: 17161, Training Loss: 0.01150
Epoch: 17161, Training Loss: 0.01152
Epoch: 17161, Training Loss: 0.01421
Epoch: 17162, Training Loss: 0.01094
Epoch: 17162, Training Loss: 0.01150
Epoch: 17162, Training Loss: 0.01152
Epoch: 17162, Training Loss: 0.01421
Epoch: 17163, Training Loss: 0.01094
Epoch: 17163, Training Loss: 0.01150
Epoch: 17163, Training Loss: 0.01152
Epoch: 17163, Training Loss: 0.01421
Epoch: 17164, Training Loss: 0.01094
Epoch: 17164, Training Loss: 0.01150
Epoch: 17164, Training Loss: 0.01152
Epoch: 17164, Training Loss: 0.01421
Epoch: 17165, Training Loss: 0.01094
Epoch: 17165, Training Loss: 0.01150
Epoch: 17165, Training Loss: 0.01152
Epoch: 17165, Training Loss: 0.01421
Epoch: 17166, Training Loss: 0.01094
Epoch: 17166, Training Loss: 0.01150
Epoch: 17166, Training Loss: 0.01152
Epoch: 17166, Training Loss: 0.01421
Epoch: 17167, Training Loss: 0.01094
Epoch: 17167, Training Loss: 0.01150
Epoch: 17167, Training Loss: 0.01152
Epoch: 17167, Training Loss: 0.01421
Epoch: 17168, Training Loss: 0.01094
Epoch: 17168, Training Loss: 0.01150
Epoch: 17168, Training Loss: 0.01151
Epoch: 17168, Training Loss: 0.01421
Epoch: 17169, Training Loss: 0.01094
Epoch: 17169, Training Loss: 0.01150
Epoch: 17169, Training Loss: 0.01151
Epoch: 17169, Training Loss: 0.01421
Epoch: 17170, Training Loss: 0.01094
Epoch: 17170, Training Loss: 0.01150
Epoch: 17170, Training Loss: 0.01151
Epoch: 17170, Training Loss: 0.01421
Epoch: 17171, Training Loss: 0.01094
Epoch: 17171, Training Loss: 0.01150
Epoch: 17171, Training Loss: 0.01151
Epoch: 17171, Training Loss: 0.01421
Epoch: 17172, Training Loss: 0.01094
Epoch: 17172, Training Loss: 0.01150
Epoch: 17172, Training Loss: 0.01151
Epoch: 17172, Training Loss: 0.01421
Epoch: 17173, Training Loss: 0.01094
Epoch: 17173, Training Loss: 0.01150
Epoch: 17173, Training Loss: 0.01151
Epoch: 17173, Training Loss: 0.01421
Epoch: 17174, Training Loss: 0.01094
Epoch: 17174, Training Loss: 0.01150
Epoch: 17174, Training Loss: 0.01151
Epoch: 17174, Training Loss: 0.01421
Epoch: 17175, Training Loss: 0.01094
Epoch: 17175, Training Loss: 0.01150
Epoch: 17175, Training Loss: 0.01151
Epoch: 17175, Training Loss: 0.01421
Epoch: 17176, Training Loss: 0.01094
Epoch: 17176, Training Loss: 0.01150
Epoch: 17176, Training Loss: 0.01151
Epoch: 17176, Training Loss: 0.01421
Epoch: 17177, Training Loss: 0.01094
Epoch: 17177, Training Loss: 0.01150
Epoch: 17177, Training Loss: 0.01151
Epoch: 17177, Training Loss: 0.01421
Epoch: 17178, Training Loss: 0.01094
Epoch: 17178, Training Loss: 0.01150
Epoch: 17178, Training Loss: 0.01151
Epoch: 17178, Training Loss: 0.01421
Epoch: 17179, Training Loss: 0.01094
Epoch: 17179, Training Loss: 0.01149
Epoch: 17179, Training Loss: 0.01151
Epoch: 17179, Training Loss: 0.01420
Epoch: 17180, Training Loss: 0.01094
Epoch: 17180, Training Loss: 0.01149
Epoch: 17180, Training Loss: 0.01151
Epoch: 17180, Training Loss: 0.01420
Epoch: 17181, Training Loss: 0.01094
Epoch: 17181, Training Loss: 0.01149
Epoch: 17181, Training Loss: 0.01151
Epoch: 17181, Training Loss: 0.01420
Epoch: 17182, Training Loss: 0.01094
Epoch: 17182, Training Loss: 0.01149
Epoch: 17182, Training Loss: 0.01151
Epoch: 17182, Training Loss: 0.01420
Epoch: 17183, Training Loss: 0.01094
Epoch: 17183, Training Loss: 0.01149
Epoch: 17183, Training Loss: 0.01151
Epoch: 17183, Training Loss: 0.01420
Epoch: 17184, Training Loss: 0.01094
Epoch: 17184, Training Loss: 0.01149
Epoch: 17184, Training Loss: 0.01151
Epoch: 17184, Training Loss: 0.01420
Epoch: 17185, Training Loss: 0.01094
Epoch: 17185, Training Loss: 0.01149
Epoch: 17185, Training Loss: 0.01151
Epoch: 17185, Training Loss: 0.01420
Epoch: 17186, Training Loss: 0.01094
Epoch: 17186, Training Loss: 0.01149
Epoch: 17186, Training Loss: 0.01151
Epoch: 17186, Training Loss: 0.01420
Epoch: 17187, Training Loss: 0.01094
Epoch: 17187, Training Loss: 0.01149
Epoch: 17187, Training Loss: 0.01151
Epoch: 17187, Training Loss: 0.01420
Epoch: 17188, Training Loss: 0.01093
Epoch: 17188, Training Loss: 0.01149
Epoch: 17188, Training Loss: 0.01151
Epoch: 17188, Training Loss: 0.01420
Epoch: 17189, Training Loss: 0.01093
Epoch: 17189, Training Loss: 0.01149
Epoch: 17189, Training Loss: 0.01151
Epoch: 17189, Training Loss: 0.01420
Epoch: 17190, Training Loss: 0.01093
Epoch: 17190, Training Loss: 0.01149
Epoch: 17190, Training Loss: 0.01151
Epoch: 17190, Training Loss: 0.01420
Epoch: 17191, Training Loss: 0.01093
Epoch: 17191, Training Loss: 0.01149
Epoch: 17191, Training Loss: 0.01151
Epoch: 17191, Training Loss: 0.01420
Epoch: 17192, Training Loss: 0.01093
Epoch: 17192, Training Loss: 0.01149
Epoch: 17192, Training Loss: 0.01151
Epoch: 17192, Training Loss: 0.01420
Epoch: 17193, Training Loss: 0.01093
Epoch: 17193, Training Loss: 0.01149
Epoch: 17193, Training Loss: 0.01150
Epoch: 17193, Training Loss: 0.01420
Epoch: 17194, Training Loss: 0.01093
Epoch: 17194, Training Loss: 0.01149
Epoch: 17194, Training Loss: 0.01150
Epoch: 17194, Training Loss: 0.01420
Epoch: 17195, Training Loss: 0.01093
Epoch: 17195, Training Loss: 0.01149
Epoch: 17195, Training Loss: 0.01150
Epoch: 17195, Training Loss: 0.01420
Epoch: 17196, Training Loss: 0.01093
Epoch: 17196, Training Loss: 0.01149
Epoch: 17196, Training Loss: 0.01150
Epoch: 17196, Training Loss: 0.01420
Epoch: 17197, Training Loss: 0.01093
Epoch: 17197, Training Loss: 0.01149
Epoch: 17197, Training Loss: 0.01150
Epoch: 17197, Training Loss: 0.01420
Epoch: 17198, Training Loss: 0.01093
Epoch: 17198, Training Loss: 0.01149
Epoch: 17198, Training Loss: 0.01150
Epoch: 17198, Training Loss: 0.01420
Epoch: 17199, Training Loss: 0.01093
Epoch: 17199, Training Loss: 0.01149
Epoch: 17199, Training Loss: 0.01150
Epoch: 17199, Training Loss: 0.01419
Epoch: 17200, Training Loss: 0.01093
Epoch: 17200, Training Loss: 0.01149
Epoch: 17200, Training Loss: 0.01150
Epoch: 17200, Training Loss: 0.01419
Epoch: 17201, Training Loss: 0.01093
Epoch: 17201, Training Loss: 0.01149
Epoch: 17201, Training Loss: 0.01150
Epoch: 17201, Training Loss: 0.01419
Epoch: 17202, Training Loss: 0.01093
Epoch: 17202, Training Loss: 0.01149
Epoch: 17202, Training Loss: 0.01150
Epoch: 17202, Training Loss: 0.01419
Epoch: 17203, Training Loss: 0.01093
Epoch: 17203, Training Loss: 0.01148
Epoch: 17203, Training Loss: 0.01150
Epoch: 17203, Training Loss: 0.01419
Epoch: 17204, Training Loss: 0.01093
Epoch: 17204, Training Loss: 0.01148
Epoch: 17204, Training Loss: 0.01150
Epoch: 17204, Training Loss: 0.01419
Epoch: 17205, Training Loss: 0.01093
Epoch: 17205, Training Loss: 0.01148
Epoch: 17205, Training Loss: 0.01150
Epoch: 17205, Training Loss: 0.01419
Epoch: 17206, Training Loss: 0.01093
Epoch: 17206, Training Loss: 0.01148
Epoch: 17206, Training Loss: 0.01150
Epoch: 17206, Training Loss: 0.01419
Epoch: 17207, Training Loss: 0.01093
Epoch: 17207, Training Loss: 0.01148
Epoch: 17207, Training Loss: 0.01150
Epoch: 17207, Training Loss: 0.01419
Epoch: 17208, Training Loss: 0.01093
Epoch: 17208, Training Loss: 0.01148
Epoch: 17208, Training Loss: 0.01150
Epoch: 17208, Training Loss: 0.01419
Epoch: 17209, Training Loss: 0.01093
Epoch: 17209, Training Loss: 0.01148
Epoch: 17209, Training Loss: 0.01150
Epoch: 17209, Training Loss: 0.01419
Epoch: 17210, Training Loss: 0.01093
Epoch: 17210, Training Loss: 0.01148
Epoch: 17210, Training Loss: 0.01150
Epoch: 17210, Training Loss: 0.01419
Epoch: 17211, Training Loss: 0.01093
Epoch: 17211, Training Loss: 0.01148
Epoch: 17211, Training Loss: 0.01150
Epoch: 17211, Training Loss: 0.01419
Epoch: 17212, Training Loss: 0.01093
Epoch: 17212, Training Loss: 0.01148
Epoch: 17212, Training Loss: 0.01150
Epoch: 17212, Training Loss: 0.01419
Epoch: 17213, Training Loss: 0.01093
Epoch: 17213, Training Loss: 0.01148
Epoch: 17213, Training Loss: 0.01150
Epoch: 17213, Training Loss: 0.01419
Epoch: 17214, Training Loss: 0.01092
Epoch: 17214, Training Loss: 0.01148
Epoch: 17214, Training Loss: 0.01150
Epoch: 17214, Training Loss: 0.01419
Epoch: 17215, Training Loss: 0.01092
Epoch: 17215, Training Loss: 0.01148
Epoch: 17215, Training Loss: 0.01150
Epoch: 17215, Training Loss: 0.01419
Epoch: 17216, Training Loss: 0.01092
Epoch: 17216, Training Loss: 0.01148
Epoch: 17216, Training Loss: 0.01150
Epoch: 17216, Training Loss: 0.01419
Epoch: 17217, Training Loss: 0.01092
Epoch: 17217, Training Loss: 0.01148
Epoch: 17217, Training Loss: 0.01149
Epoch: 17217, Training Loss: 0.01419
Epoch: 17218, Training Loss: 0.01092
Epoch: 17218, Training Loss: 0.01148
Epoch: 17218, Training Loss: 0.01149
Epoch: 17218, Training Loss: 0.01418
Epoch: 17219, Training Loss: 0.01092
Epoch: 17219, Training Loss: 0.01148
Epoch: 17219, Training Loss: 0.01149
Epoch: 17219, Training Loss: 0.01418
Epoch: 17220, Training Loss: 0.01092
Epoch: 17220, Training Loss: 0.01148
Epoch: 17220, Training Loss: 0.01149
Epoch: 17220, Training Loss: 0.01418
Epoch: 17221, Training Loss: 0.01092
Epoch: 17221, Training Loss: 0.01148
Epoch: 17221, Training Loss: 0.01149
Epoch: 17221, Training Loss: 0.01418
Epoch: 17222, Training Loss: 0.01092
Epoch: 17222, Training Loss: 0.01148
Epoch: 17222, Training Loss: 0.01149
Epoch: 17222, Training Loss: 0.01418
Epoch: 17223, Training Loss: 0.01092
Epoch: 17223, Training Loss: 0.01148
Epoch: 17223, Training Loss: 0.01149
Epoch: 17223, Training Loss: 0.01418
Epoch: 17224, Training Loss: 0.01092
Epoch: 17224, Training Loss: 0.01148
Epoch: 17224, Training Loss: 0.01149
Epoch: 17224, Training Loss: 0.01418
Epoch: 17225, Training Loss: 0.01092
Epoch: 17225, Training Loss: 0.01148
Epoch: 17225, Training Loss: 0.01149
Epoch: 17225, Training Loss: 0.01418
Epoch: 17226, Training Loss: 0.01092
Epoch: 17226, Training Loss: 0.01148
Epoch: 17226, Training Loss: 0.01149
Epoch: 17226, Training Loss: 0.01418
Epoch: 17227, Training Loss: 0.01092
Epoch: 17227, Training Loss: 0.01148
Epoch: 17227, Training Loss: 0.01149
Epoch: 17227, Training Loss: 0.01418
Epoch: 17228, Training Loss: 0.01092
Epoch: 17228, Training Loss: 0.01147
Epoch: 17228, Training Loss: 0.01149
Epoch: 17228, Training Loss: 0.01418
Epoch: 17229, Training Loss: 0.01092
Epoch: 17229, Training Loss: 0.01147
Epoch: 17229, Training Loss: 0.01149
Epoch: 17229, Training Loss: 0.01418
Epoch: 17230, Training Loss: 0.01092
Epoch: 17230, Training Loss: 0.01147
Epoch: 17230, Training Loss: 0.01149
Epoch: 17230, Training Loss: 0.01418
Epoch: 17231, Training Loss: 0.01092
Epoch: 17231, Training Loss: 0.01147
Epoch: 17231, Training Loss: 0.01149
Epoch: 17231, Training Loss: 0.01418
Epoch: 17232, Training Loss: 0.01092
Epoch: 17232, Training Loss: 0.01147
Epoch: 17232, Training Loss: 0.01149
Epoch: 17232, Training Loss: 0.01418
Epoch: 17233, Training Loss: 0.01092
Epoch: 17233, Training Loss: 0.01147
Epoch: 17233, Training Loss: 0.01149
Epoch: 17233, Training Loss: 0.01418
Epoch: 17234, Training Loss: 0.01092
Epoch: 17234, Training Loss: 0.01147
Epoch: 17234, Training Loss: 0.01149
Epoch: 17234, Training Loss: 0.01418
Epoch: 17235, Training Loss: 0.01092
Epoch: 17235, Training Loss: 0.01147
Epoch: 17235, Training Loss: 0.01149
Epoch: 17235, Training Loss: 0.01418
Epoch: 17236, Training Loss: 0.01092
Epoch: 17236, Training Loss: 0.01147
Epoch: 17236, Training Loss: 0.01149
Epoch: 17236, Training Loss: 0.01418
Epoch: 17237, Training Loss: 0.01092
Epoch: 17237, Training Loss: 0.01147
Epoch: 17237, Training Loss: 0.01149
Epoch: 17237, Training Loss: 0.01418
Epoch: 17238, Training Loss: 0.01092
Epoch: 17238, Training Loss: 0.01147
Epoch: 17238, Training Loss: 0.01149
Epoch: 17238, Training Loss: 0.01417
Epoch: 17239, Training Loss: 0.01092
Epoch: 17239, Training Loss: 0.01147
Epoch: 17239, Training Loss: 0.01149
Epoch: 17239, Training Loss: 0.01417
Epoch: 17240, Training Loss: 0.01092
Epoch: 17240, Training Loss: 0.01147
Epoch: 17240, Training Loss: 0.01149
Epoch: 17240, Training Loss: 0.01417
Epoch: 17241, Training Loss: 0.01091
Epoch: 17241, Training Loss: 0.01147
Epoch: 17241, Training Loss: 0.01149
Epoch: 17241, Training Loss: 0.01417
Epoch: 17242, Training Loss: 0.01091
Epoch: 17242, Training Loss: 0.01147
Epoch: 17242, Training Loss: 0.01148
Epoch: 17242, Training Loss: 0.01417
Epoch: 17243, Training Loss: 0.01091
Epoch: 17243, Training Loss: 0.01147
Epoch: 17243, Training Loss: 0.01148
Epoch: 17243, Training Loss: 0.01417
Epoch: 17244, Training Loss: 0.01091
Epoch: 17244, Training Loss: 0.01147
Epoch: 17244, Training Loss: 0.01148
Epoch: 17244, Training Loss: 0.01417
Epoch: 17245, Training Loss: 0.01091
Epoch: 17245, Training Loss: 0.01147
Epoch: 17245, Training Loss: 0.01148
Epoch: 17245, Training Loss: 0.01417
Epoch: 17246, Training Loss: 0.01091
Epoch: 17246, Training Loss: 0.01147
Epoch: 17246, Training Loss: 0.01148
Epoch: 17246, Training Loss: 0.01417
Epoch: 17247, Training Loss: 0.01091
Epoch: 17247, Training Loss: 0.01147
Epoch: 17247, Training Loss: 0.01148
Epoch: 17247, Training Loss: 0.01417
Epoch: 17248, Training Loss: 0.01091
Epoch: 17248, Training Loss: 0.01147
Epoch: 17248, Training Loss: 0.01148
Epoch: 17248, Training Loss: 0.01417
Epoch: 17249, Training Loss: 0.01091
Epoch: 17249, Training Loss: 0.01147
Epoch: 17249, Training Loss: 0.01148
Epoch: 17249, Training Loss: 0.01417
Epoch: 17250, Training Loss: 0.01091
Epoch: 17250, Training Loss: 0.01147
Epoch: 17250, Training Loss: 0.01148
Epoch: 17250, Training Loss: 0.01417
Epoch: 17251, Training Loss: 0.01091
Epoch: 17251, Training Loss: 0.01147
Epoch: 17251, Training Loss: 0.01148
Epoch: 17251, Training Loss: 0.01417
Epoch: 17252, Training Loss: 0.01091
Epoch: 17252, Training Loss: 0.01147
Epoch: 17252, Training Loss: 0.01148
Epoch: 17252, Training Loss: 0.01417
Epoch: 17253, Training Loss: 0.01091
Epoch: 17253, Training Loss: 0.01146
Epoch: 17253, Training Loss: 0.01148
Epoch: 17253, Training Loss: 0.01417
Epoch: 17254, Training Loss: 0.01091
Epoch: 17254, Training Loss: 0.01146
Epoch: 17254, Training Loss: 0.01148
Epoch: 17254, Training Loss: 0.01417
Epoch: 17255, Training Loss: 0.01091
Epoch: 17255, Training Loss: 0.01146
Epoch: 17255, Training Loss: 0.01148
Epoch: 17255, Training Loss: 0.01417
Epoch: 17256, Training Loss: 0.01091
Epoch: 17256, Training Loss: 0.01146
Epoch: 17256, Training Loss: 0.01148
Epoch: 17256, Training Loss: 0.01417
Epoch: 17257, Training Loss: 0.01091
Epoch: 17257, Training Loss: 0.01146
Epoch: 17257, Training Loss: 0.01148
Epoch: 17257, Training Loss: 0.01417
Epoch: 17258, Training Loss: 0.01091
Epoch: 17258, Training Loss: 0.01146
Epoch: 17258, Training Loss: 0.01148
Epoch: 17258, Training Loss: 0.01416
Epoch: 17259, Training Loss: 0.01091
Epoch: 17259, Training Loss: 0.01146
Epoch: 17259, Training Loss: 0.01148
Epoch: 17259, Training Loss: 0.01416
Epoch: 17260, Training Loss: 0.01091
Epoch: 17260, Training Loss: 0.01146
Epoch: 17260, Training Loss: 0.01148
Epoch: 17260, Training Loss: 0.01416
Epoch: 17261, Training Loss: 0.01091
Epoch: 17261, Training Loss: 0.01146
Epoch: 17261, Training Loss: 0.01148
Epoch: 17261, Training Loss: 0.01416
Epoch: 17262, Training Loss: 0.01091
Epoch: 17262, Training Loss: 0.01146
Epoch: 17262, Training Loss: 0.01148
Epoch: 17262, Training Loss: 0.01416
Epoch: 17263, Training Loss: 0.01091
Epoch: 17263, Training Loss: 0.01146
Epoch: 17263, Training Loss: 0.01148
Epoch: 17263, Training Loss: 0.01416
Epoch: 17264, Training Loss: 0.01091
Epoch: 17264, Training Loss: 0.01146
Epoch: 17264, Training Loss: 0.01148
Epoch: 17264, Training Loss: 0.01416
Epoch: 17265, Training Loss: 0.01091
Epoch: 17265, Training Loss: 0.01146
Epoch: 17265, Training Loss: 0.01148
Epoch: 17265, Training Loss: 0.01416
Epoch: 17266, Training Loss: 0.01091
Epoch: 17266, Training Loss: 0.01146
Epoch: 17266, Training Loss: 0.01148
Epoch: 17266, Training Loss: 0.01416
Epoch: 17267, Training Loss: 0.01091
Epoch: 17267, Training Loss: 0.01146
Epoch: 17267, Training Loss: 0.01147
Epoch: 17267, Training Loss: 0.01416
Epoch: 17268, Training Loss: 0.01090
Epoch: 17268, Training Loss: 0.01146
Epoch: 17268, Training Loss: 0.01147
Epoch: 17268, Training Loss: 0.01416
Epoch: 17269, Training Loss: 0.01090
Epoch: 17269, Training Loss: 0.01146
Epoch: 17269, Training Loss: 0.01147
Epoch: 17269, Training Loss: 0.01416
Epoch: 17270, Training Loss: 0.01090
Epoch: 17270, Training Loss: 0.01146
Epoch: 17270, Training Loss: 0.01147
Epoch: 17270, Training Loss: 0.01416
Epoch: 17271, Training Loss: 0.01090
Epoch: 17271, Training Loss: 0.01146
Epoch: 17271, Training Loss: 0.01147
Epoch: 17271, Training Loss: 0.01416
Epoch: 17272, Training Loss: 0.01090
Epoch: 17272, Training Loss: 0.01146
Epoch: 17272, Training Loss: 0.01147
Epoch: 17272, Training Loss: 0.01416
Epoch: 17273, Training Loss: 0.01090
Epoch: 17273, Training Loss: 0.01146
Epoch: 17273, Training Loss: 0.01147
Epoch: 17273, Training Loss: 0.01416
Epoch: 17274, Training Loss: 0.01090
Epoch: 17274, Training Loss: 0.01146
Epoch: 17274, Training Loss: 0.01147
Epoch: 17274, Training Loss: 0.01416
Epoch: 17275, Training Loss: 0.01090
Epoch: 17275, Training Loss: 0.01146
Epoch: 17275, Training Loss: 0.01147
Epoch: 17275, Training Loss: 0.01416
Epoch: 17276, Training Loss: 0.01090
Epoch: 17276, Training Loss: 0.01146
Epoch: 17276, Training Loss: 0.01147
Epoch: 17276, Training Loss: 0.01416
Epoch: 17277, Training Loss: 0.01090
Epoch: 17277, Training Loss: 0.01146
Epoch: 17277, Training Loss: 0.01147
Epoch: 17277, Training Loss: 0.01416
Epoch: 17278, Training Loss: 0.01090
Epoch: 17278, Training Loss: 0.01145
Epoch: 17278, Training Loss: 0.01147
Epoch: 17278, Training Loss: 0.01415
Epoch: 17279, Training Loss: 0.01090
Epoch: 17279, Training Loss: 0.01145
Epoch: 17279, Training Loss: 0.01147
Epoch: 17279, Training Loss: 0.01415
Epoch: 17280, Training Loss: 0.01090
Epoch: 17280, Training Loss: 0.01145
Epoch: 17280, Training Loss: 0.01147
Epoch: 17280, Training Loss: 0.01415
Epoch: 17281, Training Loss: 0.01090
Epoch: 17281, Training Loss: 0.01145
Epoch: 17281, Training Loss: 0.01147
Epoch: 17281, Training Loss: 0.01415
Epoch: 17282, Training Loss: 0.01090
Epoch: 17282, Training Loss: 0.01145
Epoch: 17282, Training Loss: 0.01147
Epoch: 17282, Training Loss: 0.01415
Epoch: 17283, Training Loss: 0.01090
Epoch: 17283, Training Loss: 0.01145
Epoch: 17283, Training Loss: 0.01147
Epoch: 17283, Training Loss: 0.01415
Epoch: 17284, Training Loss: 0.01090
Epoch: 17284, Training Loss: 0.01145
Epoch: 17284, Training Loss: 0.01147
Epoch: 17284, Training Loss: 0.01415
Epoch: 17285, Training Loss: 0.01090
Epoch: 17285, Training Loss: 0.01145
Epoch: 17285, Training Loss: 0.01147
Epoch: 17285, Training Loss: 0.01415
Epoch: 17286, Training Loss: 0.01090
Epoch: 17286, Training Loss: 0.01145
Epoch: 17286, Training Loss: 0.01147
Epoch: 17286, Training Loss: 0.01415
Epoch: 17287, Training Loss: 0.01090
Epoch: 17287, Training Loss: 0.01145
Epoch: 17287, Training Loss: 0.01147
Epoch: 17287, Training Loss: 0.01415
Epoch: 17288, Training Loss: 0.01090
Epoch: 17288, Training Loss: 0.01145
Epoch: 17288, Training Loss: 0.01147
Epoch: 17288, Training Loss: 0.01415
Epoch: 17289, Training Loss: 0.01090
Epoch: 17289, Training Loss: 0.01145
Epoch: 17289, Training Loss: 0.01147
Epoch: 17289, Training Loss: 0.01415
Epoch: 17290, Training Loss: 0.01090
Epoch: 17290, Training Loss: 0.01145
Epoch: 17290, Training Loss: 0.01147
Epoch: 17290, Training Loss: 0.01415
Epoch: 17291, Training Loss: 0.01090
Epoch: 17291, Training Loss: 0.01145
Epoch: 17291, Training Loss: 0.01147
Epoch: 17291, Training Loss: 0.01415
Epoch: 17292, Training Loss: 0.01090
Epoch: 17292, Training Loss: 0.01145
Epoch: 17292, Training Loss: 0.01146
Epoch: 17292, Training Loss: 0.01415
Epoch: 17293, Training Loss: 0.01090
Epoch: 17293, Training Loss: 0.01145
Epoch: 17293, Training Loss: 0.01146
Epoch: 17293, Training Loss: 0.01415
Epoch: 17294, Training Loss: 0.01090
Epoch: 17294, Training Loss: 0.01145
Epoch: 17294, Training Loss: 0.01146
Epoch: 17294, Training Loss: 0.01415
Epoch: 17295, Training Loss: 0.01089
Epoch: 17295, Training Loss: 0.01145
Epoch: 17295, Training Loss: 0.01146
Epoch: 17295, Training Loss: 0.01415
Epoch: 17296, Training Loss: 0.01089
Epoch: 17296, Training Loss: 0.01145
Epoch: 17296, Training Loss: 0.01146
Epoch: 17296, Training Loss: 0.01415
Epoch: 17297, Training Loss: 0.01089
Epoch: 17297, Training Loss: 0.01145
Epoch: 17297, Training Loss: 0.01146
Epoch: 17297, Training Loss: 0.01414
Epoch: 17298, Training Loss: 0.01089
Epoch: 17298, Training Loss: 0.01145
Epoch: 17298, Training Loss: 0.01146
Epoch: 17298, Training Loss: 0.01414
Epoch: 17299, Training Loss: 0.01089
Epoch: 17299, Training Loss: 0.01145
Epoch: 17299, Training Loss: 0.01146
Epoch: 17299, Training Loss: 0.01414
Epoch: 17300, Training Loss: 0.01089
Epoch: 17300, Training Loss: 0.01145
Epoch: 17300, Training Loss: 0.01146
Epoch: 17300, Training Loss: 0.01414
Epoch: 17301, Training Loss: 0.01089
Epoch: 17301, Training Loss: 0.01145
Epoch: 17301, Training Loss: 0.01146
Epoch: 17301, Training Loss: 0.01414
Epoch: 17302, Training Loss: 0.01089
Epoch: 17302, Training Loss: 0.01145
Epoch: 17302, Training Loss: 0.01146
Epoch: 17302, Training Loss: 0.01414
Epoch: 17303, Training Loss: 0.01089
Epoch: 17303, Training Loss: 0.01144
Epoch: 17303, Training Loss: 0.01146
Epoch: 17303, Training Loss: 0.01414
Epoch: 17304, Training Loss: 0.01089
Epoch: 17304, Training Loss: 0.01144
Epoch: 17304, Training Loss: 0.01146
Epoch: 17304, Training Loss: 0.01414
Epoch: 17305, Training Loss: 0.01089
Epoch: 17305, Training Loss: 0.01144
Epoch: 17305, Training Loss: 0.01146
Epoch: 17305, Training Loss: 0.01414
Epoch: 17306, Training Loss: 0.01089
Epoch: 17306, Training Loss: 0.01144
Epoch: 17306, Training Loss: 0.01146
Epoch: 17306, Training Loss: 0.01414
Epoch: 17307, Training Loss: 0.01089
Epoch: 17307, Training Loss: 0.01144
Epoch: 17307, Training Loss: 0.01146
Epoch: 17307, Training Loss: 0.01414
Epoch: 17308, Training Loss: 0.01089
Epoch: 17308, Training Loss: 0.01144
Epoch: 17308, Training Loss: 0.01146
Epoch: 17308, Training Loss: 0.01414
Epoch: 17309, Training Loss: 0.01089
Epoch: 17309, Training Loss: 0.01144
Epoch: 17309, Training Loss: 0.01146
Epoch: 17309, Training Loss: 0.01414
Epoch: 17310, Training Loss: 0.01089
Epoch: 17310, Training Loss: 0.01144
Epoch: 17310, Training Loss: 0.01146
Epoch: 17310, Training Loss: 0.01414
Epoch: 17311, Training Loss: 0.01089
Epoch: 17311, Training Loss: 0.01144
Epoch: 17311, Training Loss: 0.01146
Epoch: 17311, Training Loss: 0.01414
Epoch: 17312, Training Loss: 0.01089
Epoch: 17312, Training Loss: 0.01144
Epoch: 17312, Training Loss: 0.01146
Epoch: 17312, Training Loss: 0.01414
Epoch: 17313, Training Loss: 0.01089
Epoch: 17313, Training Loss: 0.01144
Epoch: 17313, Training Loss: 0.01146
Epoch: 17313, Training Loss: 0.01414
Epoch: 17314, Training Loss: 0.01089
Epoch: 17314, Training Loss: 0.01144
Epoch: 17314, Training Loss: 0.01146
Epoch: 17314, Training Loss: 0.01414
Epoch: 17315, Training Loss: 0.01089
Epoch: 17315, Training Loss: 0.01144
Epoch: 17315, Training Loss: 0.01146
Epoch: 17315, Training Loss: 0.01414
Epoch: 17316, Training Loss: 0.01089
Epoch: 17316, Training Loss: 0.01144
Epoch: 17316, Training Loss: 0.01146
Epoch: 17316, Training Loss: 0.01414
Epoch: 17317, Training Loss: 0.01089
Epoch: 17317, Training Loss: 0.01144
Epoch: 17317, Training Loss: 0.01145
Epoch: 17317, Training Loss: 0.01413
Epoch: 17318, Training Loss: 0.01089
Epoch: 17318, Training Loss: 0.01144
Epoch: 17318, Training Loss: 0.01145
Epoch: 17318, Training Loss: 0.01413
Epoch: 17319, Training Loss: 0.01089
Epoch: 17319, Training Loss: 0.01144
Epoch: 17319, Training Loss: 0.01145
Epoch: 17319, Training Loss: 0.01413
Epoch: 17320, Training Loss: 0.01089
Epoch: 17320, Training Loss: 0.01144
Epoch: 17320, Training Loss: 0.01145
Epoch: 17320, Training Loss: 0.01413
Epoch: 17321, Training Loss: 0.01089
Epoch: 17321, Training Loss: 0.01144
Epoch: 17321, Training Loss: 0.01145
Epoch: 17321, Training Loss: 0.01413
Epoch: 17322, Training Loss: 0.01088
Epoch: 17322, Training Loss: 0.01144
Epoch: 17322, Training Loss: 0.01145
Epoch: 17322, Training Loss: 0.01413
Epoch: 17323, Training Loss: 0.01088
Epoch: 17323, Training Loss: 0.01144
Epoch: 17323, Training Loss: 0.01145
Epoch: 17323, Training Loss: 0.01413
Epoch: 17324, Training Loss: 0.01088
Epoch: 17324, Training Loss: 0.01144
Epoch: 17324, Training Loss: 0.01145
Epoch: 17324, Training Loss: 0.01413
Epoch: 17325, Training Loss: 0.01088
Epoch: 17325, Training Loss: 0.01144
Epoch: 17325, Training Loss: 0.01145
Epoch: 17325, Training Loss: 0.01413
Epoch: 17326, Training Loss: 0.01088
Epoch: 17326, Training Loss: 0.01144
Epoch: 17326, Training Loss: 0.01145
Epoch: 17326, Training Loss: 0.01413
Epoch: 17327, Training Loss: 0.01088
Epoch: 17327, Training Loss: 0.01144
Epoch: 17327, Training Loss: 0.01145
Epoch: 17327, Training Loss: 0.01413
Epoch: 17328, Training Loss: 0.01088
Epoch: 17328, Training Loss: 0.01143
Epoch: 17328, Training Loss: 0.01145
Epoch: 17328, Training Loss: 0.01413
Epoch: 17329, Training Loss: 0.01088
Epoch: 17329, Training Loss: 0.01143
Epoch: 17329, Training Loss: 0.01145
Epoch: 17329, Training Loss: 0.01413
Epoch: 17330, Training Loss: 0.01088
Epoch: 17330, Training Loss: 0.01143
Epoch: 17330, Training Loss: 0.01145
Epoch: 17330, Training Loss: 0.01413
Epoch: 17331, Training Loss: 0.01088
Epoch: 17331, Training Loss: 0.01143
Epoch: 17331, Training Loss: 0.01145
Epoch: 17331, Training Loss: 0.01413
Epoch: 17332, Training Loss: 0.01088
Epoch: 17332, Training Loss: 0.01143
Epoch: 17332, Training Loss: 0.01145
Epoch: 17332, Training Loss: 0.01413
Epoch: 17333, Training Loss: 0.01088
Epoch: 17333, Training Loss: 0.01143
Epoch: 17333, Training Loss: 0.01145
Epoch: 17333, Training Loss: 0.01413
Epoch: 17334, Training Loss: 0.01088
Epoch: 17334, Training Loss: 0.01143
Epoch: 17334, Training Loss: 0.01145
Epoch: 17334, Training Loss: 0.01413
Epoch: 17335, Training Loss: 0.01088
Epoch: 17335, Training Loss: 0.01143
Epoch: 17335, Training Loss: 0.01145
Epoch: 17335, Training Loss: 0.01413
Epoch: 17336, Training Loss: 0.01088
Epoch: 17336, Training Loss: 0.01143
Epoch: 17336, Training Loss: 0.01145
Epoch: 17336, Training Loss: 0.01413
Epoch: 17337, Training Loss: 0.01088
Epoch: 17337, Training Loss: 0.01143
Epoch: 17337, Training Loss: 0.01145
Epoch: 17337, Training Loss: 0.01412
Epoch: 17338, Training Loss: 0.01088
Epoch: 17338, Training Loss: 0.01143
Epoch: 17338, Training Loss: 0.01145
Epoch: 17338, Training Loss: 0.01412
Epoch: 17339, Training Loss: 0.01088
Epoch: 17339, Training Loss: 0.01143
Epoch: 17339, Training Loss: 0.01145
Epoch: 17339, Training Loss: 0.01412
Epoch: 17340, Training Loss: 0.01088
Epoch: 17340, Training Loss: 0.01143
Epoch: 17340, Training Loss: 0.01145
Epoch: 17340, Training Loss: 0.01412
Epoch: 17341, Training Loss: 0.01088
Epoch: 17341, Training Loss: 0.01143
Epoch: 17341, Training Loss: 0.01145
Epoch: 17341, Training Loss: 0.01412
Epoch: 17342, Training Loss: 0.01088
Epoch: 17342, Training Loss: 0.01143
Epoch: 17342, Training Loss: 0.01144
Epoch: 17342, Training Loss: 0.01412
Epoch: 17343, Training Loss: 0.01088
Epoch: 17343, Training Loss: 0.01143
Epoch: 17343, Training Loss: 0.01144
Epoch: 17343, Training Loss: 0.01412
Epoch: 17344, Training Loss: 0.01088
Epoch: 17344, Training Loss: 0.01143
Epoch: 17344, Training Loss: 0.01144
Epoch: 17344, Training Loss: 0.01412
Epoch: 17345, Training Loss: 0.01088
Epoch: 17345, Training Loss: 0.01143
Epoch: 17345, Training Loss: 0.01144
Epoch: 17345, Training Loss: 0.01412
Epoch: 17346, Training Loss: 0.01088
Epoch: 17346, Training Loss: 0.01143
Epoch: 17346, Training Loss: 0.01144
Epoch: 17346, Training Loss: 0.01412
Epoch: 17347, Training Loss: 0.01088
Epoch: 17347, Training Loss: 0.01143
Epoch: 17347, Training Loss: 0.01144
Epoch: 17347, Training Loss: 0.01412
Epoch: 17348, Training Loss: 0.01088
Epoch: 17348, Training Loss: 0.01143
Epoch: 17348, Training Loss: 0.01144
Epoch: 17348, Training Loss: 0.01412
... [output truncated: one loss line per XOR sample per epoch; the four per-sample losses decrease very slowly over epochs 17348–17591] ...
Epoch: 17591, Training Loss: 0.01079
Epoch: 17591, Training Loss: 0.01133
Epoch: 17591, Training Loss: 0.01135
Epoch: 17591, Training Loss: 0.01400
Epoch: 17592, Training Loss: 0.01079
Epoch: 17592, Training Loss: 0.01133
Epoch: 17592, Training Loss: 0.01135
Epoch: 17592, Training Loss: 0.01400
Epoch: 17593, Training Loss: 0.01079
Epoch: 17593, Training Loss: 0.01133
Epoch: 17593, Training Loss: 0.01135
Epoch: 17593, Training Loss: 0.01400
Epoch: 17594, Training Loss: 0.01079
Epoch: 17594, Training Loss: 0.01133
Epoch: 17594, Training Loss: 0.01135
Epoch: 17594, Training Loss: 0.01400
Epoch: 17595, Training Loss: 0.01079
Epoch: 17595, Training Loss: 0.01133
Epoch: 17595, Training Loss: 0.01134
Epoch: 17595, Training Loss: 0.01400
Epoch: 17596, Training Loss: 0.01079
Epoch: 17596, Training Loss: 0.01133
Epoch: 17596, Training Loss: 0.01134
Epoch: 17596, Training Loss: 0.01400
Epoch: 17597, Training Loss: 0.01078
Epoch: 17597, Training Loss: 0.01133
Epoch: 17597, Training Loss: 0.01134
Epoch: 17597, Training Loss: 0.01400
Epoch: 17598, Training Loss: 0.01078
Epoch: 17598, Training Loss: 0.01133
Epoch: 17598, Training Loss: 0.01134
Epoch: 17598, Training Loss: 0.01400
Epoch: 17599, Training Loss: 0.01078
Epoch: 17599, Training Loss: 0.01133
Epoch: 17599, Training Loss: 0.01134
Epoch: 17599, Training Loss: 0.01400
Epoch: 17600, Training Loss: 0.01078
Epoch: 17600, Training Loss: 0.01133
Epoch: 17600, Training Loss: 0.01134
Epoch: 17600, Training Loss: 0.01399
Epoch: 17601, Training Loss: 0.01078
Epoch: 17601, Training Loss: 0.01133
Epoch: 17601, Training Loss: 0.01134
Epoch: 17601, Training Loss: 0.01399
Epoch: 17602, Training Loss: 0.01078
Epoch: 17602, Training Loss: 0.01133
Epoch: 17602, Training Loss: 0.01134
Epoch: 17602, Training Loss: 0.01399
Epoch: 17603, Training Loss: 0.01078
Epoch: 17603, Training Loss: 0.01133
Epoch: 17603, Training Loss: 0.01134
Epoch: 17603, Training Loss: 0.01399
Epoch: 17604, Training Loss: 0.01078
Epoch: 17604, Training Loss: 0.01133
Epoch: 17604, Training Loss: 0.01134
Epoch: 17604, Training Loss: 0.01399
Epoch: 17605, Training Loss: 0.01078
Epoch: 17605, Training Loss: 0.01133
Epoch: 17605, Training Loss: 0.01134
Epoch: 17605, Training Loss: 0.01399
Epoch: 17606, Training Loss: 0.01078
Epoch: 17606, Training Loss: 0.01133
Epoch: 17606, Training Loss: 0.01134
Epoch: 17606, Training Loss: 0.01399
Epoch: 17607, Training Loss: 0.01078
Epoch: 17607, Training Loss: 0.01132
Epoch: 17607, Training Loss: 0.01134
Epoch: 17607, Training Loss: 0.01399
Epoch: 17608, Training Loss: 0.01078
Epoch: 17608, Training Loss: 0.01132
Epoch: 17608, Training Loss: 0.01134
Epoch: 17608, Training Loss: 0.01399
Epoch: 17609, Training Loss: 0.01078
Epoch: 17609, Training Loss: 0.01132
Epoch: 17609, Training Loss: 0.01134
Epoch: 17609, Training Loss: 0.01399
Epoch: 17610, Training Loss: 0.01078
Epoch: 17610, Training Loss: 0.01132
Epoch: 17610, Training Loss: 0.01134
Epoch: 17610, Training Loss: 0.01399
Epoch: 17611, Training Loss: 0.01078
Epoch: 17611, Training Loss: 0.01132
Epoch: 17611, Training Loss: 0.01134
Epoch: 17611, Training Loss: 0.01399
Epoch: 17612, Training Loss: 0.01078
Epoch: 17612, Training Loss: 0.01132
Epoch: 17612, Training Loss: 0.01134
Epoch: 17612, Training Loss: 0.01399
Epoch: 17613, Training Loss: 0.01078
Epoch: 17613, Training Loss: 0.01132
Epoch: 17613, Training Loss: 0.01134
Epoch: 17613, Training Loss: 0.01399
Epoch: 17614, Training Loss: 0.01078
Epoch: 17614, Training Loss: 0.01132
Epoch: 17614, Training Loss: 0.01134
Epoch: 17614, Training Loss: 0.01399
Epoch: 17615, Training Loss: 0.01078
Epoch: 17615, Training Loss: 0.01132
Epoch: 17615, Training Loss: 0.01134
Epoch: 17615, Training Loss: 0.01399
Epoch: 17616, Training Loss: 0.01078
Epoch: 17616, Training Loss: 0.01132
Epoch: 17616, Training Loss: 0.01134
Epoch: 17616, Training Loss: 0.01399
Epoch: 17617, Training Loss: 0.01078
Epoch: 17617, Training Loss: 0.01132
Epoch: 17617, Training Loss: 0.01134
Epoch: 17617, Training Loss: 0.01399
Epoch: 17618, Training Loss: 0.01078
Epoch: 17618, Training Loss: 0.01132
Epoch: 17618, Training Loss: 0.01134
Epoch: 17618, Training Loss: 0.01399
Epoch: 17619, Training Loss: 0.01078
Epoch: 17619, Training Loss: 0.01132
Epoch: 17619, Training Loss: 0.01134
Epoch: 17619, Training Loss: 0.01399
Epoch: 17620, Training Loss: 0.01078
Epoch: 17620, Training Loss: 0.01132
Epoch: 17620, Training Loss: 0.01134
Epoch: 17620, Training Loss: 0.01399
Epoch: 17621, Training Loss: 0.01078
Epoch: 17621, Training Loss: 0.01132
Epoch: 17621, Training Loss: 0.01133
Epoch: 17621, Training Loss: 0.01398
Epoch: 17622, Training Loss: 0.01078
Epoch: 17622, Training Loss: 0.01132
Epoch: 17622, Training Loss: 0.01133
Epoch: 17622, Training Loss: 0.01398
Epoch: 17623, Training Loss: 0.01078
Epoch: 17623, Training Loss: 0.01132
Epoch: 17623, Training Loss: 0.01133
Epoch: 17623, Training Loss: 0.01398
Epoch: 17624, Training Loss: 0.01078
Epoch: 17624, Training Loss: 0.01132
Epoch: 17624, Training Loss: 0.01133
Epoch: 17624, Training Loss: 0.01398
Epoch: 17625, Training Loss: 0.01077
Epoch: 17625, Training Loss: 0.01132
Epoch: 17625, Training Loss: 0.01133
Epoch: 17625, Training Loss: 0.01398
Epoch: 17626, Training Loss: 0.01077
Epoch: 17626, Training Loss: 0.01132
Epoch: 17626, Training Loss: 0.01133
Epoch: 17626, Training Loss: 0.01398
Epoch: 17627, Training Loss: 0.01077
Epoch: 17627, Training Loss: 0.01132
Epoch: 17627, Training Loss: 0.01133
Epoch: 17627, Training Loss: 0.01398
Epoch: 17628, Training Loss: 0.01077
Epoch: 17628, Training Loss: 0.01132
Epoch: 17628, Training Loss: 0.01133
Epoch: 17628, Training Loss: 0.01398
Epoch: 17629, Training Loss: 0.01077
Epoch: 17629, Training Loss: 0.01132
Epoch: 17629, Training Loss: 0.01133
Epoch: 17629, Training Loss: 0.01398
Epoch: 17630, Training Loss: 0.01077
Epoch: 17630, Training Loss: 0.01132
Epoch: 17630, Training Loss: 0.01133
Epoch: 17630, Training Loss: 0.01398
Epoch: 17631, Training Loss: 0.01077
Epoch: 17631, Training Loss: 0.01132
Epoch: 17631, Training Loss: 0.01133
Epoch: 17631, Training Loss: 0.01398
Epoch: 17632, Training Loss: 0.01077
Epoch: 17632, Training Loss: 0.01132
Epoch: 17632, Training Loss: 0.01133
Epoch: 17632, Training Loss: 0.01398
Epoch: 17633, Training Loss: 0.01077
Epoch: 17633, Training Loss: 0.01131
Epoch: 17633, Training Loss: 0.01133
Epoch: 17633, Training Loss: 0.01398
Epoch: 17634, Training Loss: 0.01077
Epoch: 17634, Training Loss: 0.01131
Epoch: 17634, Training Loss: 0.01133
Epoch: 17634, Training Loss: 0.01398
Epoch: 17635, Training Loss: 0.01077
Epoch: 17635, Training Loss: 0.01131
Epoch: 17635, Training Loss: 0.01133
Epoch: 17635, Training Loss: 0.01398
Epoch: 17636, Training Loss: 0.01077
Epoch: 17636, Training Loss: 0.01131
Epoch: 17636, Training Loss: 0.01133
Epoch: 17636, Training Loss: 0.01398
Epoch: 17637, Training Loss: 0.01077
Epoch: 17637, Training Loss: 0.01131
Epoch: 17637, Training Loss: 0.01133
Epoch: 17637, Training Loss: 0.01398
Epoch: 17638, Training Loss: 0.01077
Epoch: 17638, Training Loss: 0.01131
Epoch: 17638, Training Loss: 0.01133
Epoch: 17638, Training Loss: 0.01398
Epoch: 17639, Training Loss: 0.01077
Epoch: 17639, Training Loss: 0.01131
Epoch: 17639, Training Loss: 0.01133
Epoch: 17639, Training Loss: 0.01398
Epoch: 17640, Training Loss: 0.01077
Epoch: 17640, Training Loss: 0.01131
Epoch: 17640, Training Loss: 0.01133
Epoch: 17640, Training Loss: 0.01398
Epoch: 17641, Training Loss: 0.01077
Epoch: 17641, Training Loss: 0.01131
Epoch: 17641, Training Loss: 0.01133
Epoch: 17641, Training Loss: 0.01397
Epoch: 17642, Training Loss: 0.01077
Epoch: 17642, Training Loss: 0.01131
Epoch: 17642, Training Loss: 0.01133
Epoch: 17642, Training Loss: 0.01397
Epoch: 17643, Training Loss: 0.01077
Epoch: 17643, Training Loss: 0.01131
Epoch: 17643, Training Loss: 0.01133
Epoch: 17643, Training Loss: 0.01397
Epoch: 17644, Training Loss: 0.01077
Epoch: 17644, Training Loss: 0.01131
Epoch: 17644, Training Loss: 0.01133
Epoch: 17644, Training Loss: 0.01397
Epoch: 17645, Training Loss: 0.01077
Epoch: 17645, Training Loss: 0.01131
Epoch: 17645, Training Loss: 0.01133
Epoch: 17645, Training Loss: 0.01397
Epoch: 17646, Training Loss: 0.01077
Epoch: 17646, Training Loss: 0.01131
Epoch: 17646, Training Loss: 0.01133
Epoch: 17646, Training Loss: 0.01397
Epoch: 17647, Training Loss: 0.01077
Epoch: 17647, Training Loss: 0.01131
Epoch: 17647, Training Loss: 0.01132
Epoch: 17647, Training Loss: 0.01397
Epoch: 17648, Training Loss: 0.01077
Epoch: 17648, Training Loss: 0.01131
Epoch: 17648, Training Loss: 0.01132
Epoch: 17648, Training Loss: 0.01397
Epoch: 17649, Training Loss: 0.01077
Epoch: 17649, Training Loss: 0.01131
Epoch: 17649, Training Loss: 0.01132
Epoch: 17649, Training Loss: 0.01397
Epoch: 17650, Training Loss: 0.01077
Epoch: 17650, Training Loss: 0.01131
Epoch: 17650, Training Loss: 0.01132
Epoch: 17650, Training Loss: 0.01397
Epoch: 17651, Training Loss: 0.01077
Epoch: 17651, Training Loss: 0.01131
Epoch: 17651, Training Loss: 0.01132
Epoch: 17651, Training Loss: 0.01397
Epoch: 17652, Training Loss: 0.01077
Epoch: 17652, Training Loss: 0.01131
Epoch: 17652, Training Loss: 0.01132
Epoch: 17652, Training Loss: 0.01397
Epoch: 17653, Training Loss: 0.01076
Epoch: 17653, Training Loss: 0.01131
Epoch: 17653, Training Loss: 0.01132
Epoch: 17653, Training Loss: 0.01397
Epoch: 17654, Training Loss: 0.01076
Epoch: 17654, Training Loss: 0.01131
Epoch: 17654, Training Loss: 0.01132
Epoch: 17654, Training Loss: 0.01397
Epoch: 17655, Training Loss: 0.01076
Epoch: 17655, Training Loss: 0.01131
Epoch: 17655, Training Loss: 0.01132
Epoch: 17655, Training Loss: 0.01397
Epoch: 17656, Training Loss: 0.01076
Epoch: 17656, Training Loss: 0.01131
Epoch: 17656, Training Loss: 0.01132
Epoch: 17656, Training Loss: 0.01397
Epoch: 17657, Training Loss: 0.01076
Epoch: 17657, Training Loss: 0.01131
Epoch: 17657, Training Loss: 0.01132
Epoch: 17657, Training Loss: 0.01397
Epoch: 17658, Training Loss: 0.01076
Epoch: 17658, Training Loss: 0.01131
Epoch: 17658, Training Loss: 0.01132
Epoch: 17658, Training Loss: 0.01397
Epoch: 17659, Training Loss: 0.01076
Epoch: 17659, Training Loss: 0.01130
Epoch: 17659, Training Loss: 0.01132
Epoch: 17659, Training Loss: 0.01397
Epoch: 17660, Training Loss: 0.01076
Epoch: 17660, Training Loss: 0.01130
Epoch: 17660, Training Loss: 0.01132
Epoch: 17660, Training Loss: 0.01397
Epoch: 17661, Training Loss: 0.01076
Epoch: 17661, Training Loss: 0.01130
Epoch: 17661, Training Loss: 0.01132
Epoch: 17661, Training Loss: 0.01397
Epoch: 17662, Training Loss: 0.01076
Epoch: 17662, Training Loss: 0.01130
Epoch: 17662, Training Loss: 0.01132
Epoch: 17662, Training Loss: 0.01396
Epoch: 17663, Training Loss: 0.01076
Epoch: 17663, Training Loss: 0.01130
Epoch: 17663, Training Loss: 0.01132
Epoch: 17663, Training Loss: 0.01396
Epoch: 17664, Training Loss: 0.01076
Epoch: 17664, Training Loss: 0.01130
Epoch: 17664, Training Loss: 0.01132
Epoch: 17664, Training Loss: 0.01396
Epoch: 17665, Training Loss: 0.01076
Epoch: 17665, Training Loss: 0.01130
Epoch: 17665, Training Loss: 0.01132
Epoch: 17665, Training Loss: 0.01396
Epoch: 17666, Training Loss: 0.01076
Epoch: 17666, Training Loss: 0.01130
Epoch: 17666, Training Loss: 0.01132
Epoch: 17666, Training Loss: 0.01396
Epoch: 17667, Training Loss: 0.01076
Epoch: 17667, Training Loss: 0.01130
Epoch: 17667, Training Loss: 0.01132
Epoch: 17667, Training Loss: 0.01396
Epoch: 17668, Training Loss: 0.01076
Epoch: 17668, Training Loss: 0.01130
Epoch: 17668, Training Loss: 0.01132
Epoch: 17668, Training Loss: 0.01396
Epoch: 17669, Training Loss: 0.01076
Epoch: 17669, Training Loss: 0.01130
Epoch: 17669, Training Loss: 0.01132
Epoch: 17669, Training Loss: 0.01396
Epoch: 17670, Training Loss: 0.01076
Epoch: 17670, Training Loss: 0.01130
Epoch: 17670, Training Loss: 0.01132
Epoch: 17670, Training Loss: 0.01396
Epoch: 17671, Training Loss: 0.01076
Epoch: 17671, Training Loss: 0.01130
Epoch: 17671, Training Loss: 0.01132
Epoch: 17671, Training Loss: 0.01396
Epoch: 17672, Training Loss: 0.01076
Epoch: 17672, Training Loss: 0.01130
Epoch: 17672, Training Loss: 0.01132
Epoch: 17672, Training Loss: 0.01396
Epoch: 17673, Training Loss: 0.01076
Epoch: 17673, Training Loss: 0.01130
Epoch: 17673, Training Loss: 0.01131
Epoch: 17673, Training Loss: 0.01396
Epoch: 17674, Training Loss: 0.01076
Epoch: 17674, Training Loss: 0.01130
Epoch: 17674, Training Loss: 0.01131
Epoch: 17674, Training Loss: 0.01396
Epoch: 17675, Training Loss: 0.01076
Epoch: 17675, Training Loss: 0.01130
Epoch: 17675, Training Loss: 0.01131
Epoch: 17675, Training Loss: 0.01396
Epoch: 17676, Training Loss: 0.01076
Epoch: 17676, Training Loss: 0.01130
Epoch: 17676, Training Loss: 0.01131
Epoch: 17676, Training Loss: 0.01396
Epoch: 17677, Training Loss: 0.01076
Epoch: 17677, Training Loss: 0.01130
Epoch: 17677, Training Loss: 0.01131
Epoch: 17677, Training Loss: 0.01396
Epoch: 17678, Training Loss: 0.01076
Epoch: 17678, Training Loss: 0.01130
Epoch: 17678, Training Loss: 0.01131
Epoch: 17678, Training Loss: 0.01396
Epoch: 17679, Training Loss: 0.01076
Epoch: 17679, Training Loss: 0.01130
Epoch: 17679, Training Loss: 0.01131
Epoch: 17679, Training Loss: 0.01396
Epoch: 17680, Training Loss: 0.01076
Epoch: 17680, Training Loss: 0.01130
Epoch: 17680, Training Loss: 0.01131
Epoch: 17680, Training Loss: 0.01396
Epoch: 17681, Training Loss: 0.01075
Epoch: 17681, Training Loss: 0.01130
Epoch: 17681, Training Loss: 0.01131
Epoch: 17681, Training Loss: 0.01396
Epoch: 17682, Training Loss: 0.01075
Epoch: 17682, Training Loss: 0.01130
Epoch: 17682, Training Loss: 0.01131
Epoch: 17682, Training Loss: 0.01395
Epoch: 17683, Training Loss: 0.01075
Epoch: 17683, Training Loss: 0.01130
Epoch: 17683, Training Loss: 0.01131
Epoch: 17683, Training Loss: 0.01395
Epoch: 17684, Training Loss: 0.01075
Epoch: 17684, Training Loss: 0.01130
Epoch: 17684, Training Loss: 0.01131
Epoch: 17684, Training Loss: 0.01395
Epoch: 17685, Training Loss: 0.01075
Epoch: 17685, Training Loss: 0.01129
Epoch: 17685, Training Loss: 0.01131
Epoch: 17685, Training Loss: 0.01395
Epoch: 17686, Training Loss: 0.01075
Epoch: 17686, Training Loss: 0.01129
Epoch: 17686, Training Loss: 0.01131
Epoch: 17686, Training Loss: 0.01395
Epoch: 17687, Training Loss: 0.01075
Epoch: 17687, Training Loss: 0.01129
Epoch: 17687, Training Loss: 0.01131
Epoch: 17687, Training Loss: 0.01395
Epoch: 17688, Training Loss: 0.01075
Epoch: 17688, Training Loss: 0.01129
Epoch: 17688, Training Loss: 0.01131
Epoch: 17688, Training Loss: 0.01395
Epoch: 17689, Training Loss: 0.01075
Epoch: 17689, Training Loss: 0.01129
Epoch: 17689, Training Loss: 0.01131
Epoch: 17689, Training Loss: 0.01395
Epoch: 17690, Training Loss: 0.01075
Epoch: 17690, Training Loss: 0.01129
Epoch: 17690, Training Loss: 0.01131
Epoch: 17690, Training Loss: 0.01395
Epoch: 17691, Training Loss: 0.01075
Epoch: 17691, Training Loss: 0.01129
Epoch: 17691, Training Loss: 0.01131
Epoch: 17691, Training Loss: 0.01395
Epoch: 17692, Training Loss: 0.01075
Epoch: 17692, Training Loss: 0.01129
Epoch: 17692, Training Loss: 0.01131
Epoch: 17692, Training Loss: 0.01395
Epoch: 17693, Training Loss: 0.01075
Epoch: 17693, Training Loss: 0.01129
Epoch: 17693, Training Loss: 0.01131
Epoch: 17693, Training Loss: 0.01395
Epoch: 17694, Training Loss: 0.01075
Epoch: 17694, Training Loss: 0.01129
Epoch: 17694, Training Loss: 0.01131
Epoch: 17694, Training Loss: 0.01395
Epoch: 17695, Training Loss: 0.01075
Epoch: 17695, Training Loss: 0.01129
Epoch: 17695, Training Loss: 0.01131
Epoch: 17695, Training Loss: 0.01395
Epoch: 17696, Training Loss: 0.01075
Epoch: 17696, Training Loss: 0.01129
Epoch: 17696, Training Loss: 0.01131
Epoch: 17696, Training Loss: 0.01395
Epoch: 17697, Training Loss: 0.01075
Epoch: 17697, Training Loss: 0.01129
Epoch: 17697, Training Loss: 0.01131
Epoch: 17697, Training Loss: 0.01395
Epoch: 17698, Training Loss: 0.01075
Epoch: 17698, Training Loss: 0.01129
Epoch: 17698, Training Loss: 0.01131
Epoch: 17698, Training Loss: 0.01395
Epoch: 17699, Training Loss: 0.01075
Epoch: 17699, Training Loss: 0.01129
Epoch: 17699, Training Loss: 0.01130
Epoch: 17699, Training Loss: 0.01395
Epoch: 17700, Training Loss: 0.01075
Epoch: 17700, Training Loss: 0.01129
Epoch: 17700, Training Loss: 0.01130
Epoch: 17700, Training Loss: 0.01395
Epoch: 17701, Training Loss: 0.01075
Epoch: 17701, Training Loss: 0.01129
Epoch: 17701, Training Loss: 0.01130
Epoch: 17701, Training Loss: 0.01395
Epoch: 17702, Training Loss: 0.01075
Epoch: 17702, Training Loss: 0.01129
Epoch: 17702, Training Loss: 0.01130
Epoch: 17702, Training Loss: 0.01395
Epoch: 17703, Training Loss: 0.01075
Epoch: 17703, Training Loss: 0.01129
Epoch: 17703, Training Loss: 0.01130
Epoch: 17703, Training Loss: 0.01394
Epoch: 17704, Training Loss: 0.01075
Epoch: 17704, Training Loss: 0.01129
Epoch: 17704, Training Loss: 0.01130
Epoch: 17704, Training Loss: 0.01394
Epoch: 17705, Training Loss: 0.01075
Epoch: 17705, Training Loss: 0.01129
Epoch: 17705, Training Loss: 0.01130
Epoch: 17705, Training Loss: 0.01394
Epoch: 17706, Training Loss: 0.01075
Epoch: 17706, Training Loss: 0.01129
Epoch: 17706, Training Loss: 0.01130
Epoch: 17706, Training Loss: 0.01394
Epoch: 17707, Training Loss: 0.01075
Epoch: 17707, Training Loss: 0.01129
Epoch: 17707, Training Loss: 0.01130
Epoch: 17707, Training Loss: 0.01394
Epoch: 17708, Training Loss: 0.01075
Epoch: 17708, Training Loss: 0.01129
Epoch: 17708, Training Loss: 0.01130
Epoch: 17708, Training Loss: 0.01394
Epoch: 17709, Training Loss: 0.01074
Epoch: 17709, Training Loss: 0.01129
Epoch: 17709, Training Loss: 0.01130
Epoch: 17709, Training Loss: 0.01394
Epoch: 17710, Training Loss: 0.01074
Epoch: 17710, Training Loss: 0.01129
Epoch: 17710, Training Loss: 0.01130
Epoch: 17710, Training Loss: 0.01394
Epoch: 17711, Training Loss: 0.01074
Epoch: 17711, Training Loss: 0.01128
Epoch: 17711, Training Loss: 0.01130
Epoch: 17711, Training Loss: 0.01394
Epoch: 17712, Training Loss: 0.01074
Epoch: 17712, Training Loss: 0.01128
Epoch: 17712, Training Loss: 0.01130
Epoch: 17712, Training Loss: 0.01394
Epoch: 17713, Training Loss: 0.01074
Epoch: 17713, Training Loss: 0.01128
Epoch: 17713, Training Loss: 0.01130
Epoch: 17713, Training Loss: 0.01394
Epoch: 17714, Training Loss: 0.01074
Epoch: 17714, Training Loss: 0.01128
Epoch: 17714, Training Loss: 0.01130
Epoch: 17714, Training Loss: 0.01394
Epoch: 17715, Training Loss: 0.01074
Epoch: 17715, Training Loss: 0.01128
Epoch: 17715, Training Loss: 0.01130
Epoch: 17715, Training Loss: 0.01394
Epoch: 17716, Training Loss: 0.01074
Epoch: 17716, Training Loss: 0.01128
Epoch: 17716, Training Loss: 0.01130
Epoch: 17716, Training Loss: 0.01394
Epoch: 17717, Training Loss: 0.01074
Epoch: 17717, Training Loss: 0.01128
Epoch: 17717, Training Loss: 0.01130
Epoch: 17717, Training Loss: 0.01394
Epoch: 17718, Training Loss: 0.01074
Epoch: 17718, Training Loss: 0.01128
Epoch: 17718, Training Loss: 0.01130
Epoch: 17718, Training Loss: 0.01394
Epoch: 17719, Training Loss: 0.01074
Epoch: 17719, Training Loss: 0.01128
Epoch: 17719, Training Loss: 0.01130
Epoch: 17719, Training Loss: 0.01394
Epoch: 17720, Training Loss: 0.01074
Epoch: 17720, Training Loss: 0.01128
Epoch: 17720, Training Loss: 0.01130
Epoch: 17720, Training Loss: 0.01394
Epoch: 17721, Training Loss: 0.01074
Epoch: 17721, Training Loss: 0.01128
Epoch: 17721, Training Loss: 0.01130
Epoch: 17721, Training Loss: 0.01394
Epoch: 17722, Training Loss: 0.01074
Epoch: 17722, Training Loss: 0.01128
Epoch: 17722, Training Loss: 0.01130
Epoch: 17722, Training Loss: 0.01394
Epoch: 17723, Training Loss: 0.01074
Epoch: 17723, Training Loss: 0.01128
Epoch: 17723, Training Loss: 0.01130
Epoch: 17723, Training Loss: 0.01394
Epoch: 17724, Training Loss: 0.01074
Epoch: 17724, Training Loss: 0.01128
Epoch: 17724, Training Loss: 0.01130
Epoch: 17724, Training Loss: 0.01393
Epoch: 17725, Training Loss: 0.01074
Epoch: 17725, Training Loss: 0.01128
Epoch: 17725, Training Loss: 0.01129
Epoch: 17725, Training Loss: 0.01393
Epoch: 17726, Training Loss: 0.01074
Epoch: 17726, Training Loss: 0.01128
Epoch: 17726, Training Loss: 0.01129
Epoch: 17726, Training Loss: 0.01393
Epoch: 17727, Training Loss: 0.01074
Epoch: 17727, Training Loss: 0.01128
Epoch: 17727, Training Loss: 0.01129
Epoch: 17727, Training Loss: 0.01393
Epoch: 17728, Training Loss: 0.01074
Epoch: 17728, Training Loss: 0.01128
Epoch: 17728, Training Loss: 0.01129
Epoch: 17728, Training Loss: 0.01393
Epoch: 17729, Training Loss: 0.01074
Epoch: 17729, Training Loss: 0.01128
Epoch: 17729, Training Loss: 0.01129
Epoch: 17729, Training Loss: 0.01393
Epoch: 17730, Training Loss: 0.01074
Epoch: 17730, Training Loss: 0.01128
Epoch: 17730, Training Loss: 0.01129
Epoch: 17730, Training Loss: 0.01393
Epoch: 17731, Training Loss: 0.01074
Epoch: 17731, Training Loss: 0.01128
Epoch: 17731, Training Loss: 0.01129
Epoch: 17731, Training Loss: 0.01393
Epoch: 17732, Training Loss: 0.01074
Epoch: 17732, Training Loss: 0.01128
Epoch: 17732, Training Loss: 0.01129
Epoch: 17732, Training Loss: 0.01393
Epoch: 17733, Training Loss: 0.01074
Epoch: 17733, Training Loss: 0.01128
Epoch: 17733, Training Loss: 0.01129
Epoch: 17733, Training Loss: 0.01393
Epoch: 17734, Training Loss: 0.01074
Epoch: 17734, Training Loss: 0.01128
Epoch: 17734, Training Loss: 0.01129
Epoch: 17734, Training Loss: 0.01393
Epoch: 17735, Training Loss: 0.01074
Epoch: 17735, Training Loss: 0.01128
Epoch: 17735, Training Loss: 0.01129
Epoch: 17735, Training Loss: 0.01393
Epoch: 17736, Training Loss: 0.01074
Epoch: 17736, Training Loss: 0.01128
Epoch: 17736, Training Loss: 0.01129
Epoch: 17736, Training Loss: 0.01393
Epoch: 17737, Training Loss: 0.01073
Epoch: 17737, Training Loss: 0.01127
Epoch: 17737, Training Loss: 0.01129
Epoch: 17737, Training Loss: 0.01393
Epoch: 17738, Training Loss: 0.01073
Epoch: 17738, Training Loss: 0.01127
Epoch: 17738, Training Loss: 0.01129
Epoch: 17738, Training Loss: 0.01393
Epoch: 17739, Training Loss: 0.01073
Epoch: 17739, Training Loss: 0.01127
Epoch: 17739, Training Loss: 0.01129
Epoch: 17739, Training Loss: 0.01393
Epoch: 17740, Training Loss: 0.01073
Epoch: 17740, Training Loss: 0.01127
Epoch: 17740, Training Loss: 0.01129
Epoch: 17740, Training Loss: 0.01393
Epoch: 17741, Training Loss: 0.01073
Epoch: 17741, Training Loss: 0.01127
Epoch: 17741, Training Loss: 0.01129
Epoch: 17741, Training Loss: 0.01393
Epoch: 17742, Training Loss: 0.01073
Epoch: 17742, Training Loss: 0.01127
Epoch: 17742, Training Loss: 0.01129
Epoch: 17742, Training Loss: 0.01393
Epoch: 17743, Training Loss: 0.01073
Epoch: 17743, Training Loss: 0.01127
Epoch: 17743, Training Loss: 0.01129
Epoch: 17743, Training Loss: 0.01393
Epoch: 17744, Training Loss: 0.01073
Epoch: 17744, Training Loss: 0.01127
Epoch: 17744, Training Loss: 0.01129
Epoch: 17744, Training Loss: 0.01393
Epoch: 17745, Training Loss: 0.01073
Epoch: 17745, Training Loss: 0.01127
Epoch: 17745, Training Loss: 0.01129
Epoch: 17745, Training Loss: 0.01392
Epoch: 17746, Training Loss: 0.01073
Epoch: 17746, Training Loss: 0.01127
Epoch: 17746, Training Loss: 0.01129
Epoch: 17746, Training Loss: 0.01392
Epoch: 17747, Training Loss: 0.01073
Epoch: 17747, Training Loss: 0.01127
Epoch: 17747, Training Loss: 0.01129
Epoch: 17747, Training Loss: 0.01392
Epoch: 17748, Training Loss: 0.01073
Epoch: 17748, Training Loss: 0.01127
Epoch: 17748, Training Loss: 0.01129
Epoch: 17748, Training Loss: 0.01392
Epoch: 17749, Training Loss: 0.01073
Epoch: 17749, Training Loss: 0.01127
Epoch: 17749, Training Loss: 0.01129
Epoch: 17749, Training Loss: 0.01392
Epoch: 17750, Training Loss: 0.01073
Epoch: 17750, Training Loss: 0.01127
Epoch: 17750, Training Loss: 0.01129
Epoch: 17750, Training Loss: 0.01392
Epoch: 17751, Training Loss: 0.01073
Epoch: 17751, Training Loss: 0.01127
Epoch: 17751, Training Loss: 0.01128
Epoch: 17751, Training Loss: 0.01392
Epoch: 17752, Training Loss: 0.01073
Epoch: 17752, Training Loss: 0.01127
Epoch: 17752, Training Loss: 0.01128
Epoch: 17752, Training Loss: 0.01392
Epoch: 17753, Training Loss: 0.01073
Epoch: 17753, Training Loss: 0.01127
Epoch: 17753, Training Loss: 0.01128
Epoch: 17753, Training Loss: 0.01392
Epoch: 17754, Training Loss: 0.01073
Epoch: 17754, Training Loss: 0.01127
Epoch: 17754, Training Loss: 0.01128
Epoch: 17754, Training Loss: 0.01392
Epoch: 17755, Training Loss: 0.01073
Epoch: 17755, Training Loss: 0.01127
Epoch: 17755, Training Loss: 0.01128
Epoch: 17755, Training Loss: 0.01392
Epoch: 17756, Training Loss: 0.01073
Epoch: 17756, Training Loss: 0.01127
Epoch: 17756, Training Loss: 0.01128
Epoch: 17756, Training Loss: 0.01392
Epoch: 17757, Training Loss: 0.01073
Epoch: 17757, Training Loss: 0.01127
Epoch: 17757, Training Loss: 0.01128
Epoch: 17757, Training Loss: 0.01392
Epoch: 17758, Training Loss: 0.01073
Epoch: 17758, Training Loss: 0.01127
Epoch: 17758, Training Loss: 0.01128
Epoch: 17758, Training Loss: 0.01392
Epoch: 17759, Training Loss: 0.01073
Epoch: 17759, Training Loss: 0.01127
Epoch: 17759, Training Loss: 0.01128
Epoch: 17759, Training Loss: 0.01392
Epoch: 17760, Training Loss: 0.01073
Epoch: 17760, Training Loss: 0.01127
Epoch: 17760, Training Loss: 0.01128
Epoch: 17760, Training Loss: 0.01392
Epoch: 17761, Training Loss: 0.01073
Epoch: 17761, Training Loss: 0.01127
Epoch: 17761, Training Loss: 0.01128
Epoch: 17761, Training Loss: 0.01392
Epoch: 17762, Training Loss: 0.01073
Epoch: 17762, Training Loss: 0.01127
Epoch: 17762, Training Loss: 0.01128
Epoch: 17762, Training Loss: 0.01392
Epoch: 17763, Training Loss: 0.01073
Epoch: 17763, Training Loss: 0.01126
Epoch: 17763, Training Loss: 0.01128
Epoch: 17763, Training Loss: 0.01392
Epoch: 17764, Training Loss: 0.01073
Epoch: 17764, Training Loss: 0.01126
Epoch: 17764, Training Loss: 0.01128
Epoch: 17764, Training Loss: 0.01392
Epoch: 17765, Training Loss: 0.01073
Epoch: 17765, Training Loss: 0.01126
Epoch: 17765, Training Loss: 0.01128
Epoch: 17765, Training Loss: 0.01391
Epoch: 17766, Training Loss: 0.01072
Epoch: 17766, Training Loss: 0.01126
Epoch: 17766, Training Loss: 0.01128
Epoch: 17766, Training Loss: 0.01391
Epoch: 17767, Training Loss: 0.01072
Epoch: 17767, Training Loss: 0.01126
Epoch: 17767, Training Loss: 0.01128
Epoch: 17767, Training Loss: 0.01391
Epoch: 17768, Training Loss: 0.01072
Epoch: 17768, Training Loss: 0.01126
Epoch: 17768, Training Loss: 0.01128
Epoch: 17768, Training Loss: 0.01391
Epoch: 17769, Training Loss: 0.01072
Epoch: 17769, Training Loss: 0.01126
Epoch: 17769, Training Loss: 0.01128
Epoch: 17769, Training Loss: 0.01391
Epoch: 17770, Training Loss: 0.01072
Epoch: 17770, Training Loss: 0.01126
Epoch: 17770, Training Loss: 0.01128
Epoch: 17770, Training Loss: 0.01391
Epoch: 17771, Training Loss: 0.01072
Epoch: 17771, Training Loss: 0.01126
Epoch: 17771, Training Loss: 0.01128
Epoch: 17771, Training Loss: 0.01391
Epoch: 17772, Training Loss: 0.01072
Epoch: 17772, Training Loss: 0.01126
Epoch: 17772, Training Loss: 0.01128
Epoch: 17772, Training Loss: 0.01391
Epoch: 17773, Training Loss: 0.01072
Epoch: 17773, Training Loss: 0.01126
Epoch: 17773, Training Loss: 0.01128
Epoch: 17773, Training Loss: 0.01391
Epoch: 17774, Training Loss: 0.01072
Epoch: 17774, Training Loss: 0.01126
Epoch: 17774, Training Loss: 0.01128
Epoch: 17774, Training Loss: 0.01391
Epoch: 17775, Training Loss: 0.01072
Epoch: 17775, Training Loss: 0.01126
Epoch: 17775, Training Loss: 0.01128
Epoch: 17775, Training Loss: 0.01391
Epoch: 17776, Training Loss: 0.01072
Epoch: 17776, Training Loss: 0.01126
Epoch: 17776, Training Loss: 0.01128
Epoch: 17776, Training Loss: 0.01391
Epoch: 17777, Training Loss: 0.01072
Epoch: 17777, Training Loss: 0.01126
Epoch: 17777, Training Loss: 0.01127
Epoch: 17777, Training Loss: 0.01391
Epoch: 17778, Training Loss: 0.01072
Epoch: 17778, Training Loss: 0.01126
Epoch: 17778, Training Loss: 0.01127
Epoch: 17778, Training Loss: 0.01391
Epoch: 17779, Training Loss: 0.01072
Epoch: 17779, Training Loss: 0.01126
Epoch: 17779, Training Loss: 0.01127
Epoch: 17779, Training Loss: 0.01391
Epoch: 17780, Training Loss: 0.01072
Epoch: 17780, Training Loss: 0.01126
Epoch: 17780, Training Loss: 0.01127
Epoch: 17780, Training Loss: 0.01391

[... output truncated: the per-sample loss is printed four times per epoch; over epochs 17780-18024 the four losses decrease only very slowly, from about (0.01072, 0.01126, 0.01127, 0.01391) to about (0.01064, 0.01118, 0.01118, 0.01380) ...]

Epoch: 18024, Training Loss: 0.01063
Epoch: 18024, Training Loss: 0.01117
Epoch: 18024, Training Loss: 0.01118
Epoch: 18024, Training Loss: 0.01379
Epoch: 18025, Training Loss: 0.01063
Epoch: 18025, Training Loss: 0.01117
Epoch: 18025, Training Loss: 0.01118
Epoch: 18025, Training Loss: 0.01379
Epoch: 18026, Training Loss: 0.01063
Epoch: 18026, Training Loss: 0.01117
Epoch: 18026, Training Loss: 0.01118
Epoch: 18026, Training Loss: 0.01379
Epoch: 18027, Training Loss: 0.01063
Epoch: 18027, Training Loss: 0.01117
Epoch: 18027, Training Loss: 0.01118
Epoch: 18027, Training Loss: 0.01379
Epoch: 18028, Training Loss: 0.01063
Epoch: 18028, Training Loss: 0.01117
Epoch: 18028, Training Loss: 0.01118
Epoch: 18028, Training Loss: 0.01379
Epoch: 18029, Training Loss: 0.01063
Epoch: 18029, Training Loss: 0.01116
Epoch: 18029, Training Loss: 0.01118
Epoch: 18029, Training Loss: 0.01379
Epoch: 18030, Training Loss: 0.01063
Epoch: 18030, Training Loss: 0.01116
Epoch: 18030, Training Loss: 0.01118
Epoch: 18030, Training Loss: 0.01379
Epoch: 18031, Training Loss: 0.01063
Epoch: 18031, Training Loss: 0.01116
Epoch: 18031, Training Loss: 0.01118
Epoch: 18031, Training Loss: 0.01379
Epoch: 18032, Training Loss: 0.01063
Epoch: 18032, Training Loss: 0.01116
Epoch: 18032, Training Loss: 0.01118
Epoch: 18032, Training Loss: 0.01379
Epoch: 18033, Training Loss: 0.01063
Epoch: 18033, Training Loss: 0.01116
Epoch: 18033, Training Loss: 0.01118
Epoch: 18033, Training Loss: 0.01379
Epoch: 18034, Training Loss: 0.01063
Epoch: 18034, Training Loss: 0.01116
Epoch: 18034, Training Loss: 0.01118
Epoch: 18034, Training Loss: 0.01379
Epoch: 18035, Training Loss: 0.01063
Epoch: 18035, Training Loss: 0.01116
Epoch: 18035, Training Loss: 0.01118
Epoch: 18035, Training Loss: 0.01379
Epoch: 18036, Training Loss: 0.01063
Epoch: 18036, Training Loss: 0.01116
Epoch: 18036, Training Loss: 0.01118
Epoch: 18036, Training Loss: 0.01379
Epoch: 18037, Training Loss: 0.01063
Epoch: 18037, Training Loss: 0.01116
Epoch: 18037, Training Loss: 0.01118
Epoch: 18037, Training Loss: 0.01379
Epoch: 18038, Training Loss: 0.01063
Epoch: 18038, Training Loss: 0.01116
Epoch: 18038, Training Loss: 0.01118
Epoch: 18038, Training Loss: 0.01379
Epoch: 18039, Training Loss: 0.01063
Epoch: 18039, Training Loss: 0.01116
Epoch: 18039, Training Loss: 0.01118
Epoch: 18039, Training Loss: 0.01379
Epoch: 18040, Training Loss: 0.01063
Epoch: 18040, Training Loss: 0.01116
Epoch: 18040, Training Loss: 0.01118
Epoch: 18040, Training Loss: 0.01378
Epoch: 18041, Training Loss: 0.01063
Epoch: 18041, Training Loss: 0.01116
Epoch: 18041, Training Loss: 0.01118
Epoch: 18041, Training Loss: 0.01378
Epoch: 18042, Training Loss: 0.01063
Epoch: 18042, Training Loss: 0.01116
Epoch: 18042, Training Loss: 0.01117
Epoch: 18042, Training Loss: 0.01378
Epoch: 18043, Training Loss: 0.01063
Epoch: 18043, Training Loss: 0.01116
Epoch: 18043, Training Loss: 0.01117
Epoch: 18043, Training Loss: 0.01378
Epoch: 18044, Training Loss: 0.01063
Epoch: 18044, Training Loss: 0.01116
Epoch: 18044, Training Loss: 0.01117
Epoch: 18044, Training Loss: 0.01378
Epoch: 18045, Training Loss: 0.01063
Epoch: 18045, Training Loss: 0.01116
Epoch: 18045, Training Loss: 0.01117
Epoch: 18045, Training Loss: 0.01378
Epoch: 18046, Training Loss: 0.01063
Epoch: 18046, Training Loss: 0.01116
Epoch: 18046, Training Loss: 0.01117
Epoch: 18046, Training Loss: 0.01378
Epoch: 18047, Training Loss: 0.01063
Epoch: 18047, Training Loss: 0.01116
Epoch: 18047, Training Loss: 0.01117
Epoch: 18047, Training Loss: 0.01378
Epoch: 18048, Training Loss: 0.01063
Epoch: 18048, Training Loss: 0.01116
Epoch: 18048, Training Loss: 0.01117
Epoch: 18048, Training Loss: 0.01378
Epoch: 18049, Training Loss: 0.01063
Epoch: 18049, Training Loss: 0.01116
Epoch: 18049, Training Loss: 0.01117
Epoch: 18049, Training Loss: 0.01378
Epoch: 18050, Training Loss: 0.01063
Epoch: 18050, Training Loss: 0.01116
Epoch: 18050, Training Loss: 0.01117
Epoch: 18050, Training Loss: 0.01378
Epoch: 18051, Training Loss: 0.01063
Epoch: 18051, Training Loss: 0.01116
Epoch: 18051, Training Loss: 0.01117
Epoch: 18051, Training Loss: 0.01378
Epoch: 18052, Training Loss: 0.01063
Epoch: 18052, Training Loss: 0.01116
Epoch: 18052, Training Loss: 0.01117
Epoch: 18052, Training Loss: 0.01378
Epoch: 18053, Training Loss: 0.01062
Epoch: 18053, Training Loss: 0.01116
Epoch: 18053, Training Loss: 0.01117
Epoch: 18053, Training Loss: 0.01378
Epoch: 18054, Training Loss: 0.01062
Epoch: 18054, Training Loss: 0.01116
Epoch: 18054, Training Loss: 0.01117
Epoch: 18054, Training Loss: 0.01378
Epoch: 18055, Training Loss: 0.01062
Epoch: 18055, Training Loss: 0.01115
Epoch: 18055, Training Loss: 0.01117
Epoch: 18055, Training Loss: 0.01378
Epoch: 18056, Training Loss: 0.01062
Epoch: 18056, Training Loss: 0.01115
Epoch: 18056, Training Loss: 0.01117
Epoch: 18056, Training Loss: 0.01378
Epoch: 18057, Training Loss: 0.01062
Epoch: 18057, Training Loss: 0.01115
Epoch: 18057, Training Loss: 0.01117
Epoch: 18057, Training Loss: 0.01378
Epoch: 18058, Training Loss: 0.01062
Epoch: 18058, Training Loss: 0.01115
Epoch: 18058, Training Loss: 0.01117
Epoch: 18058, Training Loss: 0.01378
Epoch: 18059, Training Loss: 0.01062
Epoch: 18059, Training Loss: 0.01115
Epoch: 18059, Training Loss: 0.01117
Epoch: 18059, Training Loss: 0.01378
Epoch: 18060, Training Loss: 0.01062
Epoch: 18060, Training Loss: 0.01115
Epoch: 18060, Training Loss: 0.01117
Epoch: 18060, Training Loss: 0.01378
Epoch: 18061, Training Loss: 0.01062
Epoch: 18061, Training Loss: 0.01115
Epoch: 18061, Training Loss: 0.01117
Epoch: 18061, Training Loss: 0.01377
Epoch: 18062, Training Loss: 0.01062
Epoch: 18062, Training Loss: 0.01115
Epoch: 18062, Training Loss: 0.01117
Epoch: 18062, Training Loss: 0.01377
Epoch: 18063, Training Loss: 0.01062
Epoch: 18063, Training Loss: 0.01115
Epoch: 18063, Training Loss: 0.01117
Epoch: 18063, Training Loss: 0.01377
Epoch: 18064, Training Loss: 0.01062
Epoch: 18064, Training Loss: 0.01115
Epoch: 18064, Training Loss: 0.01117
Epoch: 18064, Training Loss: 0.01377
Epoch: 18065, Training Loss: 0.01062
Epoch: 18065, Training Loss: 0.01115
Epoch: 18065, Training Loss: 0.01117
Epoch: 18065, Training Loss: 0.01377
Epoch: 18066, Training Loss: 0.01062
Epoch: 18066, Training Loss: 0.01115
Epoch: 18066, Training Loss: 0.01117
Epoch: 18066, Training Loss: 0.01377
Epoch: 18067, Training Loss: 0.01062
Epoch: 18067, Training Loss: 0.01115
Epoch: 18067, Training Loss: 0.01117
Epoch: 18067, Training Loss: 0.01377
Epoch: 18068, Training Loss: 0.01062
Epoch: 18068, Training Loss: 0.01115
Epoch: 18068, Training Loss: 0.01117
Epoch: 18068, Training Loss: 0.01377
Epoch: 18069, Training Loss: 0.01062
Epoch: 18069, Training Loss: 0.01115
Epoch: 18069, Training Loss: 0.01116
Epoch: 18069, Training Loss: 0.01377
Epoch: 18070, Training Loss: 0.01062
Epoch: 18070, Training Loss: 0.01115
Epoch: 18070, Training Loss: 0.01116
Epoch: 18070, Training Loss: 0.01377
Epoch: 18071, Training Loss: 0.01062
Epoch: 18071, Training Loss: 0.01115
Epoch: 18071, Training Loss: 0.01116
Epoch: 18071, Training Loss: 0.01377
Epoch: 18072, Training Loss: 0.01062
Epoch: 18072, Training Loss: 0.01115
Epoch: 18072, Training Loss: 0.01116
Epoch: 18072, Training Loss: 0.01377
Epoch: 18073, Training Loss: 0.01062
Epoch: 18073, Training Loss: 0.01115
Epoch: 18073, Training Loss: 0.01116
Epoch: 18073, Training Loss: 0.01377
Epoch: 18074, Training Loss: 0.01062
Epoch: 18074, Training Loss: 0.01115
Epoch: 18074, Training Loss: 0.01116
Epoch: 18074, Training Loss: 0.01377
Epoch: 18075, Training Loss: 0.01062
Epoch: 18075, Training Loss: 0.01115
Epoch: 18075, Training Loss: 0.01116
Epoch: 18075, Training Loss: 0.01377
Epoch: 18076, Training Loss: 0.01062
Epoch: 18076, Training Loss: 0.01115
Epoch: 18076, Training Loss: 0.01116
Epoch: 18076, Training Loss: 0.01377
Epoch: 18077, Training Loss: 0.01062
Epoch: 18077, Training Loss: 0.01115
Epoch: 18077, Training Loss: 0.01116
Epoch: 18077, Training Loss: 0.01377
Epoch: 18078, Training Loss: 0.01062
Epoch: 18078, Training Loss: 0.01115
Epoch: 18078, Training Loss: 0.01116
Epoch: 18078, Training Loss: 0.01377
Epoch: 18079, Training Loss: 0.01062
Epoch: 18079, Training Loss: 0.01115
Epoch: 18079, Training Loss: 0.01116
Epoch: 18079, Training Loss: 0.01377
Epoch: 18080, Training Loss: 0.01062
Epoch: 18080, Training Loss: 0.01115
Epoch: 18080, Training Loss: 0.01116
Epoch: 18080, Training Loss: 0.01377
Epoch: 18081, Training Loss: 0.01062
Epoch: 18081, Training Loss: 0.01115
Epoch: 18081, Training Loss: 0.01116
Epoch: 18081, Training Loss: 0.01377
Epoch: 18082, Training Loss: 0.01061
Epoch: 18082, Training Loss: 0.01114
Epoch: 18082, Training Loss: 0.01116
Epoch: 18082, Training Loss: 0.01377
Epoch: 18083, Training Loss: 0.01061
Epoch: 18083, Training Loss: 0.01114
Epoch: 18083, Training Loss: 0.01116
Epoch: 18083, Training Loss: 0.01376
Epoch: 18084, Training Loss: 0.01061
Epoch: 18084, Training Loss: 0.01114
Epoch: 18084, Training Loss: 0.01116
Epoch: 18084, Training Loss: 0.01376
Epoch: 18085, Training Loss: 0.01061
Epoch: 18085, Training Loss: 0.01114
Epoch: 18085, Training Loss: 0.01116
Epoch: 18085, Training Loss: 0.01376
Epoch: 18086, Training Loss: 0.01061
Epoch: 18086, Training Loss: 0.01114
Epoch: 18086, Training Loss: 0.01116
Epoch: 18086, Training Loss: 0.01376
Epoch: 18087, Training Loss: 0.01061
Epoch: 18087, Training Loss: 0.01114
Epoch: 18087, Training Loss: 0.01116
Epoch: 18087, Training Loss: 0.01376
Epoch: 18088, Training Loss: 0.01061
Epoch: 18088, Training Loss: 0.01114
Epoch: 18088, Training Loss: 0.01116
Epoch: 18088, Training Loss: 0.01376
Epoch: 18089, Training Loss: 0.01061
Epoch: 18089, Training Loss: 0.01114
Epoch: 18089, Training Loss: 0.01116
Epoch: 18089, Training Loss: 0.01376
Epoch: 18090, Training Loss: 0.01061
Epoch: 18090, Training Loss: 0.01114
Epoch: 18090, Training Loss: 0.01116
Epoch: 18090, Training Loss: 0.01376
Epoch: 18091, Training Loss: 0.01061
Epoch: 18091, Training Loss: 0.01114
Epoch: 18091, Training Loss: 0.01116
Epoch: 18091, Training Loss: 0.01376
Epoch: 18092, Training Loss: 0.01061
Epoch: 18092, Training Loss: 0.01114
Epoch: 18092, Training Loss: 0.01116
Epoch: 18092, Training Loss: 0.01376
Epoch: 18093, Training Loss: 0.01061
Epoch: 18093, Training Loss: 0.01114
Epoch: 18093, Training Loss: 0.01116
Epoch: 18093, Training Loss: 0.01376
Epoch: 18094, Training Loss: 0.01061
Epoch: 18094, Training Loss: 0.01114
Epoch: 18094, Training Loss: 0.01116
Epoch: 18094, Training Loss: 0.01376
Epoch: 18095, Training Loss: 0.01061
Epoch: 18095, Training Loss: 0.01114
Epoch: 18095, Training Loss: 0.01116
Epoch: 18095, Training Loss: 0.01376
Epoch: 18096, Training Loss: 0.01061
Epoch: 18096, Training Loss: 0.01114
Epoch: 18096, Training Loss: 0.01115
Epoch: 18096, Training Loss: 0.01376
Epoch: 18097, Training Loss: 0.01061
Epoch: 18097, Training Loss: 0.01114
Epoch: 18097, Training Loss: 0.01115
Epoch: 18097, Training Loss: 0.01376
Epoch: 18098, Training Loss: 0.01061
Epoch: 18098, Training Loss: 0.01114
Epoch: 18098, Training Loss: 0.01115
Epoch: 18098, Training Loss: 0.01376
Epoch: 18099, Training Loss: 0.01061
Epoch: 18099, Training Loss: 0.01114
Epoch: 18099, Training Loss: 0.01115
Epoch: 18099, Training Loss: 0.01376
Epoch: 18100, Training Loss: 0.01061
Epoch: 18100, Training Loss: 0.01114
Epoch: 18100, Training Loss: 0.01115
Epoch: 18100, Training Loss: 0.01376
Epoch: 18101, Training Loss: 0.01061
Epoch: 18101, Training Loss: 0.01114
Epoch: 18101, Training Loss: 0.01115
Epoch: 18101, Training Loss: 0.01376
Epoch: 18102, Training Loss: 0.01061
Epoch: 18102, Training Loss: 0.01114
Epoch: 18102, Training Loss: 0.01115
Epoch: 18102, Training Loss: 0.01376
Epoch: 18103, Training Loss: 0.01061
Epoch: 18103, Training Loss: 0.01114
Epoch: 18103, Training Loss: 0.01115
Epoch: 18103, Training Loss: 0.01376
Epoch: 18104, Training Loss: 0.01061
Epoch: 18104, Training Loss: 0.01114
Epoch: 18104, Training Loss: 0.01115
Epoch: 18104, Training Loss: 0.01375
Epoch: 18105, Training Loss: 0.01061
Epoch: 18105, Training Loss: 0.01114
Epoch: 18105, Training Loss: 0.01115
Epoch: 18105, Training Loss: 0.01375
Epoch: 18106, Training Loss: 0.01061
Epoch: 18106, Training Loss: 0.01114
Epoch: 18106, Training Loss: 0.01115
Epoch: 18106, Training Loss: 0.01375
Epoch: 18107, Training Loss: 0.01061
Epoch: 18107, Training Loss: 0.01114
Epoch: 18107, Training Loss: 0.01115
Epoch: 18107, Training Loss: 0.01375
Epoch: 18108, Training Loss: 0.01061
Epoch: 18108, Training Loss: 0.01114
Epoch: 18108, Training Loss: 0.01115
Epoch: 18108, Training Loss: 0.01375
Epoch: 18109, Training Loss: 0.01061
Epoch: 18109, Training Loss: 0.01113
Epoch: 18109, Training Loss: 0.01115
Epoch: 18109, Training Loss: 0.01375
Epoch: 18110, Training Loss: 0.01061
Epoch: 18110, Training Loss: 0.01113
Epoch: 18110, Training Loss: 0.01115
Epoch: 18110, Training Loss: 0.01375
Epoch: 18111, Training Loss: 0.01060
Epoch: 18111, Training Loss: 0.01113
Epoch: 18111, Training Loss: 0.01115
Epoch: 18111, Training Loss: 0.01375
Epoch: 18112, Training Loss: 0.01060
Epoch: 18112, Training Loss: 0.01113
Epoch: 18112, Training Loss: 0.01115
Epoch: 18112, Training Loss: 0.01375
Epoch: 18113, Training Loss: 0.01060
Epoch: 18113, Training Loss: 0.01113
Epoch: 18113, Training Loss: 0.01115
Epoch: 18113, Training Loss: 0.01375
Epoch: 18114, Training Loss: 0.01060
Epoch: 18114, Training Loss: 0.01113
Epoch: 18114, Training Loss: 0.01115
Epoch: 18114, Training Loss: 0.01375
Epoch: 18115, Training Loss: 0.01060
Epoch: 18115, Training Loss: 0.01113
Epoch: 18115, Training Loss: 0.01115
Epoch: 18115, Training Loss: 0.01375
Epoch: 18116, Training Loss: 0.01060
Epoch: 18116, Training Loss: 0.01113
Epoch: 18116, Training Loss: 0.01115
Epoch: 18116, Training Loss: 0.01375
Epoch: 18117, Training Loss: 0.01060
Epoch: 18117, Training Loss: 0.01113
Epoch: 18117, Training Loss: 0.01115
Epoch: 18117, Training Loss: 0.01375
Epoch: 18118, Training Loss: 0.01060
Epoch: 18118, Training Loss: 0.01113
Epoch: 18118, Training Loss: 0.01115
Epoch: 18118, Training Loss: 0.01375
Epoch: 18119, Training Loss: 0.01060
Epoch: 18119, Training Loss: 0.01113
Epoch: 18119, Training Loss: 0.01115
Epoch: 18119, Training Loss: 0.01375
Epoch: 18120, Training Loss: 0.01060
Epoch: 18120, Training Loss: 0.01113
Epoch: 18120, Training Loss: 0.01115
Epoch: 18120, Training Loss: 0.01375
Epoch: 18121, Training Loss: 0.01060
Epoch: 18121, Training Loss: 0.01113
Epoch: 18121, Training Loss: 0.01115
Epoch: 18121, Training Loss: 0.01375
Epoch: 18122, Training Loss: 0.01060
Epoch: 18122, Training Loss: 0.01113
Epoch: 18122, Training Loss: 0.01115
Epoch: 18122, Training Loss: 0.01375
Epoch: 18123, Training Loss: 0.01060
Epoch: 18123, Training Loss: 0.01113
Epoch: 18123, Training Loss: 0.01114
Epoch: 18123, Training Loss: 0.01375
Epoch: 18124, Training Loss: 0.01060
Epoch: 18124, Training Loss: 0.01113
Epoch: 18124, Training Loss: 0.01114
Epoch: 18124, Training Loss: 0.01375
Epoch: 18125, Training Loss: 0.01060
Epoch: 18125, Training Loss: 0.01113
Epoch: 18125, Training Loss: 0.01114
Epoch: 18125, Training Loss: 0.01375
Epoch: 18126, Training Loss: 0.01060
Epoch: 18126, Training Loss: 0.01113
Epoch: 18126, Training Loss: 0.01114
Epoch: 18126, Training Loss: 0.01374
Epoch: 18127, Training Loss: 0.01060
Epoch: 18127, Training Loss: 0.01113
Epoch: 18127, Training Loss: 0.01114
Epoch: 18127, Training Loss: 0.01374
Epoch: 18128, Training Loss: 0.01060
Epoch: 18128, Training Loss: 0.01113
Epoch: 18128, Training Loss: 0.01114
Epoch: 18128, Training Loss: 0.01374
Epoch: 18129, Training Loss: 0.01060
Epoch: 18129, Training Loss: 0.01113
Epoch: 18129, Training Loss: 0.01114
Epoch: 18129, Training Loss: 0.01374
Epoch: 18130, Training Loss: 0.01060
Epoch: 18130, Training Loss: 0.01113
Epoch: 18130, Training Loss: 0.01114
Epoch: 18130, Training Loss: 0.01374
Epoch: 18131, Training Loss: 0.01060
Epoch: 18131, Training Loss: 0.01113
Epoch: 18131, Training Loss: 0.01114
Epoch: 18131, Training Loss: 0.01374
Epoch: 18132, Training Loss: 0.01060
Epoch: 18132, Training Loss: 0.01113
Epoch: 18132, Training Loss: 0.01114
Epoch: 18132, Training Loss: 0.01374
Epoch: 18133, Training Loss: 0.01060
Epoch: 18133, Training Loss: 0.01113
Epoch: 18133, Training Loss: 0.01114
Epoch: 18133, Training Loss: 0.01374
Epoch: 18134, Training Loss: 0.01060
Epoch: 18134, Training Loss: 0.01113
Epoch: 18134, Training Loss: 0.01114
Epoch: 18134, Training Loss: 0.01374
Epoch: 18135, Training Loss: 0.01060
Epoch: 18135, Training Loss: 0.01113
Epoch: 18135, Training Loss: 0.01114
Epoch: 18135, Training Loss: 0.01374
Epoch: 18136, Training Loss: 0.01060
Epoch: 18136, Training Loss: 0.01113
Epoch: 18136, Training Loss: 0.01114
Epoch: 18136, Training Loss: 0.01374
Epoch: 18137, Training Loss: 0.01060
Epoch: 18137, Training Loss: 0.01112
Epoch: 18137, Training Loss: 0.01114
Epoch: 18137, Training Loss: 0.01374
Epoch: 18138, Training Loss: 0.01060
Epoch: 18138, Training Loss: 0.01112
Epoch: 18138, Training Loss: 0.01114
Epoch: 18138, Training Loss: 0.01374
Epoch: 18139, Training Loss: 0.01060
Epoch: 18139, Training Loss: 0.01112
Epoch: 18139, Training Loss: 0.01114
Epoch: 18139, Training Loss: 0.01374
Epoch: 18140, Training Loss: 0.01060
Epoch: 18140, Training Loss: 0.01112
Epoch: 18140, Training Loss: 0.01114
Epoch: 18140, Training Loss: 0.01374
Epoch: 18141, Training Loss: 0.01059
Epoch: 18141, Training Loss: 0.01112
Epoch: 18141, Training Loss: 0.01114
Epoch: 18141, Training Loss: 0.01374
Epoch: 18142, Training Loss: 0.01059
Epoch: 18142, Training Loss: 0.01112
Epoch: 18142, Training Loss: 0.01114
Epoch: 18142, Training Loss: 0.01374
Epoch: 18143, Training Loss: 0.01059
Epoch: 18143, Training Loss: 0.01112
Epoch: 18143, Training Loss: 0.01114
Epoch: 18143, Training Loss: 0.01374
Epoch: 18144, Training Loss: 0.01059
Epoch: 18144, Training Loss: 0.01112
Epoch: 18144, Training Loss: 0.01114
Epoch: 18144, Training Loss: 0.01374
Epoch: 18145, Training Loss: 0.01059
Epoch: 18145, Training Loss: 0.01112
Epoch: 18145, Training Loss: 0.01114
Epoch: 18145, Training Loss: 0.01374
Epoch: 18146, Training Loss: 0.01059
Epoch: 18146, Training Loss: 0.01112
Epoch: 18146, Training Loss: 0.01114
Epoch: 18146, Training Loss: 0.01374
Epoch: 18147, Training Loss: 0.01059
Epoch: 18147, Training Loss: 0.01112
Epoch: 18147, Training Loss: 0.01114
Epoch: 18147, Training Loss: 0.01374
Epoch: 18148, Training Loss: 0.01059
Epoch: 18148, Training Loss: 0.01112
Epoch: 18148, Training Loss: 0.01114
Epoch: 18148, Training Loss: 0.01373
Epoch: 18149, Training Loss: 0.01059
Epoch: 18149, Training Loss: 0.01112
Epoch: 18149, Training Loss: 0.01114
Epoch: 18149, Training Loss: 0.01373
Epoch: 18150, Training Loss: 0.01059
Epoch: 18150, Training Loss: 0.01112
Epoch: 18150, Training Loss: 0.01113
Epoch: 18150, Training Loss: 0.01373
Epoch: 18151, Training Loss: 0.01059
Epoch: 18151, Training Loss: 0.01112
Epoch: 18151, Training Loss: 0.01113
Epoch: 18151, Training Loss: 0.01373
Epoch: 18152, Training Loss: 0.01059
Epoch: 18152, Training Loss: 0.01112
Epoch: 18152, Training Loss: 0.01113
Epoch: 18152, Training Loss: 0.01373
Epoch: 18153, Training Loss: 0.01059
Epoch: 18153, Training Loss: 0.01112
Epoch: 18153, Training Loss: 0.01113
Epoch: 18153, Training Loss: 0.01373
Epoch: 18154, Training Loss: 0.01059
Epoch: 18154, Training Loss: 0.01112
Epoch: 18154, Training Loss: 0.01113
Epoch: 18154, Training Loss: 0.01373
Epoch: 18155, Training Loss: 0.01059
Epoch: 18155, Training Loss: 0.01112
Epoch: 18155, Training Loss: 0.01113
Epoch: 18155, Training Loss: 0.01373
Epoch: 18156, Training Loss: 0.01059
Epoch: 18156, Training Loss: 0.01112
Epoch: 18156, Training Loss: 0.01113
Epoch: 18156, Training Loss: 0.01373
Epoch: 18157, Training Loss: 0.01059
Epoch: 18157, Training Loss: 0.01112
Epoch: 18157, Training Loss: 0.01113
Epoch: 18157, Training Loss: 0.01373
Epoch: 18158, Training Loss: 0.01059
Epoch: 18158, Training Loss: 0.01112
Epoch: 18158, Training Loss: 0.01113
Epoch: 18158, Training Loss: 0.01373
Epoch: 18159, Training Loss: 0.01059
Epoch: 18159, Training Loss: 0.01112
Epoch: 18159, Training Loss: 0.01113
Epoch: 18159, Training Loss: 0.01373
Epoch: 18160, Training Loss: 0.01059
Epoch: 18160, Training Loss: 0.01112
Epoch: 18160, Training Loss: 0.01113
Epoch: 18160, Training Loss: 0.01373
Epoch: 18161, Training Loss: 0.01059
Epoch: 18161, Training Loss: 0.01112
Epoch: 18161, Training Loss: 0.01113
Epoch: 18161, Training Loss: 0.01373
Epoch: 18162, Training Loss: 0.01059
Epoch: 18162, Training Loss: 0.01112
Epoch: 18162, Training Loss: 0.01113
Epoch: 18162, Training Loss: 0.01373
Epoch: 18163, Training Loss: 0.01059
Epoch: 18163, Training Loss: 0.01112
Epoch: 18163, Training Loss: 0.01113
Epoch: 18163, Training Loss: 0.01373
Epoch: 18164, Training Loss: 0.01059
Epoch: 18164, Training Loss: 0.01111
Epoch: 18164, Training Loss: 0.01113
Epoch: 18164, Training Loss: 0.01373
Epoch: 18165, Training Loss: 0.01059
Epoch: 18165, Training Loss: 0.01111
Epoch: 18165, Training Loss: 0.01113
Epoch: 18165, Training Loss: 0.01373
Epoch: 18166, Training Loss: 0.01059
Epoch: 18166, Training Loss: 0.01111
Epoch: 18166, Training Loss: 0.01113
Epoch: 18166, Training Loss: 0.01373
Epoch: 18167, Training Loss: 0.01059
Epoch: 18167, Training Loss: 0.01111
Epoch: 18167, Training Loss: 0.01113
Epoch: 18167, Training Loss: 0.01373
Epoch: 18168, Training Loss: 0.01059
Epoch: 18168, Training Loss: 0.01111
Epoch: 18168, Training Loss: 0.01113
Epoch: 18168, Training Loss: 0.01373
Epoch: 18169, Training Loss: 0.01059
Epoch: 18169, Training Loss: 0.01111
Epoch: 18169, Training Loss: 0.01113
Epoch: 18169, Training Loss: 0.01372
Epoch: 18170, Training Loss: 0.01058
Epoch: 18170, Training Loss: 0.01111
Epoch: 18170, Training Loss: 0.01113
Epoch: 18170, Training Loss: 0.01372
Epoch: 18171, Training Loss: 0.01058
Epoch: 18171, Training Loss: 0.01111
Epoch: 18171, Training Loss: 0.01113
Epoch: 18171, Training Loss: 0.01372
Epoch: 18172, Training Loss: 0.01058
Epoch: 18172, Training Loss: 0.01111
Epoch: 18172, Training Loss: 0.01113
Epoch: 18172, Training Loss: 0.01372
Epoch: 18173, Training Loss: 0.01058
Epoch: 18173, Training Loss: 0.01111
Epoch: 18173, Training Loss: 0.01113
Epoch: 18173, Training Loss: 0.01372
Epoch: 18174, Training Loss: 0.01058
Epoch: 18174, Training Loss: 0.01111
Epoch: 18174, Training Loss: 0.01113
Epoch: 18174, Training Loss: 0.01372
Epoch: 18175, Training Loss: 0.01058
Epoch: 18175, Training Loss: 0.01111
Epoch: 18175, Training Loss: 0.01113
Epoch: 18175, Training Loss: 0.01372
Epoch: 18176, Training Loss: 0.01058
Epoch: 18176, Training Loss: 0.01111
Epoch: 18176, Training Loss: 0.01113
Epoch: 18176, Training Loss: 0.01372
Epoch: 18177, Training Loss: 0.01058
Epoch: 18177, Training Loss: 0.01111
Epoch: 18177, Training Loss: 0.01112
Epoch: 18177, Training Loss: 0.01372
Epoch: 18178, Training Loss: 0.01058
Epoch: 18178, Training Loss: 0.01111
Epoch: 18178, Training Loss: 0.01112
Epoch: 18178, Training Loss: 0.01372
Epoch: 18179, Training Loss: 0.01058
Epoch: 18179, Training Loss: 0.01111
Epoch: 18179, Training Loss: 0.01112
Epoch: 18179, Training Loss: 0.01372
Epoch: 18180, Training Loss: 0.01058
Epoch: 18180, Training Loss: 0.01111
Epoch: 18180, Training Loss: 0.01112
Epoch: 18180, Training Loss: 0.01372
Epoch: 18181, Training Loss: 0.01058
Epoch: 18181, Training Loss: 0.01111
Epoch: 18181, Training Loss: 0.01112
Epoch: 18181, Training Loss: 0.01372
Epoch: 18182, Training Loss: 0.01058
Epoch: 18182, Training Loss: 0.01111
Epoch: 18182, Training Loss: 0.01112
Epoch: 18182, Training Loss: 0.01372
Epoch: 18183, Training Loss: 0.01058
Epoch: 18183, Training Loss: 0.01111
Epoch: 18183, Training Loss: 0.01112
Epoch: 18183, Training Loss: 0.01372
Epoch: 18184, Training Loss: 0.01058
Epoch: 18184, Training Loss: 0.01111
Epoch: 18184, Training Loss: 0.01112
Epoch: 18184, Training Loss: 0.01372
Epoch: 18185, Training Loss: 0.01058
Epoch: 18185, Training Loss: 0.01111
Epoch: 18185, Training Loss: 0.01112
Epoch: 18185, Training Loss: 0.01372
Epoch: 18186, Training Loss: 0.01058
Epoch: 18186, Training Loss: 0.01111
Epoch: 18186, Training Loss: 0.01112
Epoch: 18186, Training Loss: 0.01372
Epoch: 18187, Training Loss: 0.01058
Epoch: 18187, Training Loss: 0.01111
Epoch: 18187, Training Loss: 0.01112
Epoch: 18187, Training Loss: 0.01372
Epoch: 18188, Training Loss: 0.01058
Epoch: 18188, Training Loss: 0.01111
Epoch: 18188, Training Loss: 0.01112
Epoch: 18188, Training Loss: 0.01372
Epoch: 18189, Training Loss: 0.01058
Epoch: 18189, Training Loss: 0.01111
Epoch: 18189, Training Loss: 0.01112
Epoch: 18189, Training Loss: 0.01372
Epoch: 18190, Training Loss: 0.01058
Epoch: 18190, Training Loss: 0.01111
Epoch: 18190, Training Loss: 0.01112
Epoch: 18190, Training Loss: 0.01372
Epoch: 18191, Training Loss: 0.01058
Epoch: 18191, Training Loss: 0.01110
Epoch: 18191, Training Loss: 0.01112
Epoch: 18191, Training Loss: 0.01371
Epoch: 18192, Training Loss: 0.01058
Epoch: 18192, Training Loss: 0.01110
Epoch: 18192, Training Loss: 0.01112
Epoch: 18192, Training Loss: 0.01371
Epoch: 18193, Training Loss: 0.01058
Epoch: 18193, Training Loss: 0.01110
Epoch: 18193, Training Loss: 0.01112
Epoch: 18193, Training Loss: 0.01371
Epoch: 18194, Training Loss: 0.01058
Epoch: 18194, Training Loss: 0.01110
Epoch: 18194, Training Loss: 0.01112
Epoch: 18194, Training Loss: 0.01371
Epoch: 18195, Training Loss: 0.01058
Epoch: 18195, Training Loss: 0.01110
Epoch: 18195, Training Loss: 0.01112
Epoch: 18195, Training Loss: 0.01371
Epoch: 18196, Training Loss: 0.01058
Epoch: 18196, Training Loss: 0.01110
Epoch: 18196, Training Loss: 0.01112
Epoch: 18196, Training Loss: 0.01371
Epoch: 18197, Training Loss: 0.01058
Epoch: 18197, Training Loss: 0.01110
Epoch: 18197, Training Loss: 0.01112
Epoch: 18197, Training Loss: 0.01371
Epoch: 18198, Training Loss: 0.01058
Epoch: 18198, Training Loss: 0.01110
Epoch: 18198, Training Loss: 0.01112
Epoch: 18198, Training Loss: 0.01371
Epoch: 18199, Training Loss: 0.01058
Epoch: 18199, Training Loss: 0.01110
Epoch: 18199, Training Loss: 0.01112
Epoch: 18199, Training Loss: 0.01371
Epoch: 18200, Training Loss: 0.01057
Epoch: 18200, Training Loss: 0.01110
Epoch: 18200, Training Loss: 0.01112
Epoch: 18200, Training Loss: 0.01371
Epoch: 18201, Training Loss: 0.01057
Epoch: 18201, Training Loss: 0.01110
Epoch: 18201, Training Loss: 0.01112
Epoch: 18201, Training Loss: 0.01371
Epoch: 18202, Training Loss: 0.01057
Epoch: 18202, Training Loss: 0.01110
Epoch: 18202, Training Loss: 0.01112
Epoch: 18202, Training Loss: 0.01371
Epoch: 18203, Training Loss: 0.01057
Epoch: 18203, Training Loss: 0.01110
Epoch: 18203, Training Loss: 0.01112
Epoch: 18203, Training Loss: 0.01371
Epoch: 18204, Training Loss: 0.01057
Epoch: 18204, Training Loss: 0.01110
Epoch: 18204, Training Loss: 0.01111
Epoch: 18204, Training Loss: 0.01371
Epoch: 18205, Training Loss: 0.01057
Epoch: 18205, Training Loss: 0.01110
Epoch: 18205, Training Loss: 0.01111
Epoch: 18205, Training Loss: 0.01371
Epoch: 18206, Training Loss: 0.01057
Epoch: 18206, Training Loss: 0.01110
Epoch: 18206, Training Loss: 0.01111
Epoch: 18206, Training Loss: 0.01371
Epoch: 18207, Training Loss: 0.01057
Epoch: 18207, Training Loss: 0.01110
Epoch: 18207, Training Loss: 0.01111
Epoch: 18207, Training Loss: 0.01371
Epoch: 18208, Training Loss: 0.01057
Epoch: 18208, Training Loss: 0.01110
Epoch: 18208, Training Loss: 0.01111
Epoch: 18208, Training Loss: 0.01371
Epoch: 18209, Training Loss: 0.01057
Epoch: 18209, Training Loss: 0.01110
Epoch: 18209, Training Loss: 0.01111
Epoch: 18209, Training Loss: 0.01371
Epoch: 18210, Training Loss: 0.01057
Epoch: 18210, Training Loss: 0.01110
Epoch: 18210, Training Loss: 0.01111
Epoch: 18210, Training Loss: 0.01371
Epoch: 18211, Training Loss: 0.01057
Epoch: 18211, Training Loss: 0.01110
Epoch: 18211, Training Loss: 0.01111
Epoch: 18211, Training Loss: 0.01371
Epoch: 18212, Training Loss: 0.01057
Epoch: 18212, Training Loss: 0.01110
Epoch: 18212, Training Loss: 0.01111
Epoch: 18212, Training Loss: 0.01371
Epoch: 18456, Training Loss: 0.01359
Epoch: 18457, Training Loss: 0.01049
Epoch: 18457, Training Loss: 0.01101
Epoch: 18457, Training Loss: 0.01102
Epoch: 18457, Training Loss: 0.01359
Epoch: 18458, Training Loss: 0.01049
Epoch: 18458, Training Loss: 0.01101
Epoch: 18458, Training Loss: 0.01102
Epoch: 18458, Training Loss: 0.01359
Epoch: 18459, Training Loss: 0.01049
Epoch: 18459, Training Loss: 0.01101
Epoch: 18459, Training Loss: 0.01102
Epoch: 18459, Training Loss: 0.01359
Epoch: 18460, Training Loss: 0.01049
Epoch: 18460, Training Loss: 0.01101
Epoch: 18460, Training Loss: 0.01102
Epoch: 18460, Training Loss: 0.01359
Epoch: 18461, Training Loss: 0.01049
Epoch: 18461, Training Loss: 0.01101
Epoch: 18461, Training Loss: 0.01102
Epoch: 18461, Training Loss: 0.01359
Epoch: 18462, Training Loss: 0.01049
Epoch: 18462, Training Loss: 0.01101
Epoch: 18462, Training Loss: 0.01102
Epoch: 18462, Training Loss: 0.01359
Epoch: 18463, Training Loss: 0.01049
Epoch: 18463, Training Loss: 0.01101
Epoch: 18463, Training Loss: 0.01102
Epoch: 18463, Training Loss: 0.01359
Epoch: 18464, Training Loss: 0.01049
Epoch: 18464, Training Loss: 0.01101
Epoch: 18464, Training Loss: 0.01102
Epoch: 18464, Training Loss: 0.01359
Epoch: 18465, Training Loss: 0.01049
Epoch: 18465, Training Loss: 0.01101
Epoch: 18465, Training Loss: 0.01102
Epoch: 18465, Training Loss: 0.01359
Epoch: 18466, Training Loss: 0.01049
Epoch: 18466, Training Loss: 0.01101
Epoch: 18466, Training Loss: 0.01102
Epoch: 18466, Training Loss: 0.01359
Epoch: 18467, Training Loss: 0.01049
Epoch: 18467, Training Loss: 0.01101
Epoch: 18467, Training Loss: 0.01102
Epoch: 18467, Training Loss: 0.01359
Epoch: 18468, Training Loss: 0.01049
Epoch: 18468, Training Loss: 0.01100
Epoch: 18468, Training Loss: 0.01102
Epoch: 18468, Training Loss: 0.01359
Epoch: 18469, Training Loss: 0.01048
Epoch: 18469, Training Loss: 0.01100
Epoch: 18469, Training Loss: 0.01102
Epoch: 18469, Training Loss: 0.01359
Epoch: 18470, Training Loss: 0.01048
Epoch: 18470, Training Loss: 0.01100
Epoch: 18470, Training Loss: 0.01102
Epoch: 18470, Training Loss: 0.01359
Epoch: 18471, Training Loss: 0.01048
Epoch: 18471, Training Loss: 0.01100
Epoch: 18471, Training Loss: 0.01102
Epoch: 18471, Training Loss: 0.01359
Epoch: 18472, Training Loss: 0.01048
Epoch: 18472, Training Loss: 0.01100
Epoch: 18472, Training Loss: 0.01102
Epoch: 18472, Training Loss: 0.01359
Epoch: 18473, Training Loss: 0.01048
Epoch: 18473, Training Loss: 0.01100
Epoch: 18473, Training Loss: 0.01102
Epoch: 18473, Training Loss: 0.01359
Epoch: 18474, Training Loss: 0.01048
Epoch: 18474, Training Loss: 0.01100
Epoch: 18474, Training Loss: 0.01102
Epoch: 18474, Training Loss: 0.01359
Epoch: 18475, Training Loss: 0.01048
Epoch: 18475, Training Loss: 0.01100
Epoch: 18475, Training Loss: 0.01102
Epoch: 18475, Training Loss: 0.01359
Epoch: 18476, Training Loss: 0.01048
Epoch: 18476, Training Loss: 0.01100
Epoch: 18476, Training Loss: 0.01102
Epoch: 18476, Training Loss: 0.01359
Epoch: 18477, Training Loss: 0.01048
Epoch: 18477, Training Loss: 0.01100
Epoch: 18477, Training Loss: 0.01102
Epoch: 18477, Training Loss: 0.01359
Epoch: 18478, Training Loss: 0.01048
Epoch: 18478, Training Loss: 0.01100
Epoch: 18478, Training Loss: 0.01102
Epoch: 18478, Training Loss: 0.01358
Epoch: 18479, Training Loss: 0.01048
Epoch: 18479, Training Loss: 0.01100
Epoch: 18479, Training Loss: 0.01102
Epoch: 18479, Training Loss: 0.01358
Epoch: 18480, Training Loss: 0.01048
Epoch: 18480, Training Loss: 0.01100
Epoch: 18480, Training Loss: 0.01102
Epoch: 18480, Training Loss: 0.01358
Epoch: 18481, Training Loss: 0.01048
Epoch: 18481, Training Loss: 0.01100
Epoch: 18481, Training Loss: 0.01101
Epoch: 18481, Training Loss: 0.01358
Epoch: 18482, Training Loss: 0.01048
Epoch: 18482, Training Loss: 0.01100
Epoch: 18482, Training Loss: 0.01101
Epoch: 18482, Training Loss: 0.01358
Epoch: 18483, Training Loss: 0.01048
Epoch: 18483, Training Loss: 0.01100
Epoch: 18483, Training Loss: 0.01101
Epoch: 18483, Training Loss: 0.01358
Epoch: 18484, Training Loss: 0.01048
Epoch: 18484, Training Loss: 0.01100
Epoch: 18484, Training Loss: 0.01101
Epoch: 18484, Training Loss: 0.01358
Epoch: 18485, Training Loss: 0.01048
Epoch: 18485, Training Loss: 0.01100
Epoch: 18485, Training Loss: 0.01101
Epoch: 18485, Training Loss: 0.01358
Epoch: 18486, Training Loss: 0.01048
Epoch: 18486, Training Loss: 0.01100
Epoch: 18486, Training Loss: 0.01101
Epoch: 18486, Training Loss: 0.01358
Epoch: 18487, Training Loss: 0.01048
Epoch: 18487, Training Loss: 0.01100
Epoch: 18487, Training Loss: 0.01101
Epoch: 18487, Training Loss: 0.01358
Epoch: 18488, Training Loss: 0.01048
Epoch: 18488, Training Loss: 0.01100
Epoch: 18488, Training Loss: 0.01101
Epoch: 18488, Training Loss: 0.01358
Epoch: 18489, Training Loss: 0.01048
Epoch: 18489, Training Loss: 0.01100
Epoch: 18489, Training Loss: 0.01101
Epoch: 18489, Training Loss: 0.01358
Epoch: 18490, Training Loss: 0.01048
Epoch: 18490, Training Loss: 0.01100
Epoch: 18490, Training Loss: 0.01101
Epoch: 18490, Training Loss: 0.01358
Epoch: 18491, Training Loss: 0.01048
Epoch: 18491, Training Loss: 0.01100
Epoch: 18491, Training Loss: 0.01101
Epoch: 18491, Training Loss: 0.01358
Epoch: 18492, Training Loss: 0.01048
Epoch: 18492, Training Loss: 0.01100
Epoch: 18492, Training Loss: 0.01101
Epoch: 18492, Training Loss: 0.01358
Epoch: 18493, Training Loss: 0.01048
Epoch: 18493, Training Loss: 0.01100
Epoch: 18493, Training Loss: 0.01101
Epoch: 18493, Training Loss: 0.01358
Epoch: 18494, Training Loss: 0.01048
Epoch: 18494, Training Loss: 0.01100
Epoch: 18494, Training Loss: 0.01101
Epoch: 18494, Training Loss: 0.01358
Epoch: 18495, Training Loss: 0.01048
Epoch: 18495, Training Loss: 0.01100
Epoch: 18495, Training Loss: 0.01101
Epoch: 18495, Training Loss: 0.01358
Epoch: 18496, Training Loss: 0.01048
Epoch: 18496, Training Loss: 0.01099
Epoch: 18496, Training Loss: 0.01101
Epoch: 18496, Training Loss: 0.01358
Epoch: 18497, Training Loss: 0.01048
Epoch: 18497, Training Loss: 0.01099
Epoch: 18497, Training Loss: 0.01101
Epoch: 18497, Training Loss: 0.01358
Epoch: 18498, Training Loss: 0.01048
Epoch: 18498, Training Loss: 0.01099
Epoch: 18498, Training Loss: 0.01101
Epoch: 18498, Training Loss: 0.01358
Epoch: 18499, Training Loss: 0.01047
Epoch: 18499, Training Loss: 0.01099
Epoch: 18499, Training Loss: 0.01101
Epoch: 18499, Training Loss: 0.01358
Epoch: 18500, Training Loss: 0.01047
Epoch: 18500, Training Loss: 0.01099
Epoch: 18500, Training Loss: 0.01101
Epoch: 18500, Training Loss: 0.01357
Epoch: 18501, Training Loss: 0.01047
Epoch: 18501, Training Loss: 0.01099
Epoch: 18501, Training Loss: 0.01101
Epoch: 18501, Training Loss: 0.01357
Epoch: 18502, Training Loss: 0.01047
Epoch: 18502, Training Loss: 0.01099
Epoch: 18502, Training Loss: 0.01101
Epoch: 18502, Training Loss: 0.01357
Epoch: 18503, Training Loss: 0.01047
Epoch: 18503, Training Loss: 0.01099
Epoch: 18503, Training Loss: 0.01101
Epoch: 18503, Training Loss: 0.01357
Epoch: 18504, Training Loss: 0.01047
Epoch: 18504, Training Loss: 0.01099
Epoch: 18504, Training Loss: 0.01101
Epoch: 18504, Training Loss: 0.01357
Epoch: 18505, Training Loss: 0.01047
Epoch: 18505, Training Loss: 0.01099
Epoch: 18505, Training Loss: 0.01101
Epoch: 18505, Training Loss: 0.01357
Epoch: 18506, Training Loss: 0.01047
Epoch: 18506, Training Loss: 0.01099
Epoch: 18506, Training Loss: 0.01101
Epoch: 18506, Training Loss: 0.01357
Epoch: 18507, Training Loss: 0.01047
Epoch: 18507, Training Loss: 0.01099
Epoch: 18507, Training Loss: 0.01101
Epoch: 18507, Training Loss: 0.01357
Epoch: 18508, Training Loss: 0.01047
Epoch: 18508, Training Loss: 0.01099
Epoch: 18508, Training Loss: 0.01101
Epoch: 18508, Training Loss: 0.01357
Epoch: 18509, Training Loss: 0.01047
Epoch: 18509, Training Loss: 0.01099
Epoch: 18509, Training Loss: 0.01100
Epoch: 18509, Training Loss: 0.01357
Epoch: 18510, Training Loss: 0.01047
Epoch: 18510, Training Loss: 0.01099
Epoch: 18510, Training Loss: 0.01100
Epoch: 18510, Training Loss: 0.01357
Epoch: 18511, Training Loss: 0.01047
Epoch: 18511, Training Loss: 0.01099
Epoch: 18511, Training Loss: 0.01100
Epoch: 18511, Training Loss: 0.01357
Epoch: 18512, Training Loss: 0.01047
Epoch: 18512, Training Loss: 0.01099
Epoch: 18512, Training Loss: 0.01100
Epoch: 18512, Training Loss: 0.01357
Epoch: 18513, Training Loss: 0.01047
Epoch: 18513, Training Loss: 0.01099
Epoch: 18513, Training Loss: 0.01100
Epoch: 18513, Training Loss: 0.01357
Epoch: 18514, Training Loss: 0.01047
Epoch: 18514, Training Loss: 0.01099
Epoch: 18514, Training Loss: 0.01100
Epoch: 18514, Training Loss: 0.01357
Epoch: 18515, Training Loss: 0.01047
Epoch: 18515, Training Loss: 0.01099
Epoch: 18515, Training Loss: 0.01100
Epoch: 18515, Training Loss: 0.01357
Epoch: 18516, Training Loss: 0.01047
Epoch: 18516, Training Loss: 0.01099
Epoch: 18516, Training Loss: 0.01100
Epoch: 18516, Training Loss: 0.01357
Epoch: 18517, Training Loss: 0.01047
Epoch: 18517, Training Loss: 0.01099
Epoch: 18517, Training Loss: 0.01100
Epoch: 18517, Training Loss: 0.01357
Epoch: 18518, Training Loss: 0.01047
Epoch: 18518, Training Loss: 0.01099
Epoch: 18518, Training Loss: 0.01100
Epoch: 18518, Training Loss: 0.01357
Epoch: 18519, Training Loss: 0.01047
Epoch: 18519, Training Loss: 0.01099
Epoch: 18519, Training Loss: 0.01100
Epoch: 18519, Training Loss: 0.01357
Epoch: 18520, Training Loss: 0.01047
Epoch: 18520, Training Loss: 0.01099
Epoch: 18520, Training Loss: 0.01100
Epoch: 18520, Training Loss: 0.01357
Epoch: 18521, Training Loss: 0.01047
Epoch: 18521, Training Loss: 0.01099
Epoch: 18521, Training Loss: 0.01100
Epoch: 18521, Training Loss: 0.01357
Epoch: 18522, Training Loss: 0.01047
Epoch: 18522, Training Loss: 0.01099
Epoch: 18522, Training Loss: 0.01100
Epoch: 18522, Training Loss: 0.01356
Epoch: 18523, Training Loss: 0.01047
Epoch: 18523, Training Loss: 0.01099
Epoch: 18523, Training Loss: 0.01100
Epoch: 18523, Training Loss: 0.01356
Epoch: 18524, Training Loss: 0.01047
Epoch: 18524, Training Loss: 0.01098
Epoch: 18524, Training Loss: 0.01100
Epoch: 18524, Training Loss: 0.01356
Epoch: 18525, Training Loss: 0.01047
Epoch: 18525, Training Loss: 0.01098
Epoch: 18525, Training Loss: 0.01100
Epoch: 18525, Training Loss: 0.01356
Epoch: 18526, Training Loss: 0.01047
Epoch: 18526, Training Loss: 0.01098
Epoch: 18526, Training Loss: 0.01100
Epoch: 18526, Training Loss: 0.01356
Epoch: 18527, Training Loss: 0.01047
Epoch: 18527, Training Loss: 0.01098
Epoch: 18527, Training Loss: 0.01100
Epoch: 18527, Training Loss: 0.01356
Epoch: 18528, Training Loss: 0.01047
Epoch: 18528, Training Loss: 0.01098
Epoch: 18528, Training Loss: 0.01100
Epoch: 18528, Training Loss: 0.01356
Epoch: 18529, Training Loss: 0.01047
Epoch: 18529, Training Loss: 0.01098
Epoch: 18529, Training Loss: 0.01100
Epoch: 18529, Training Loss: 0.01356
Epoch: 18530, Training Loss: 0.01046
Epoch: 18530, Training Loss: 0.01098
Epoch: 18530, Training Loss: 0.01100
Epoch: 18530, Training Loss: 0.01356
Epoch: 18531, Training Loss: 0.01046
Epoch: 18531, Training Loss: 0.01098
Epoch: 18531, Training Loss: 0.01100
Epoch: 18531, Training Loss: 0.01356
Epoch: 18532, Training Loss: 0.01046
Epoch: 18532, Training Loss: 0.01098
Epoch: 18532, Training Loss: 0.01100
Epoch: 18532, Training Loss: 0.01356
Epoch: 18533, Training Loss: 0.01046
Epoch: 18533, Training Loss: 0.01098
Epoch: 18533, Training Loss: 0.01100
Epoch: 18533, Training Loss: 0.01356
Epoch: 18534, Training Loss: 0.01046
Epoch: 18534, Training Loss: 0.01098
Epoch: 18534, Training Loss: 0.01100
Epoch: 18534, Training Loss: 0.01356
Epoch: 18535, Training Loss: 0.01046
Epoch: 18535, Training Loss: 0.01098
Epoch: 18535, Training Loss: 0.01100
Epoch: 18535, Training Loss: 0.01356
Epoch: 18536, Training Loss: 0.01046
Epoch: 18536, Training Loss: 0.01098
Epoch: 18536, Training Loss: 0.01100
Epoch: 18536, Training Loss: 0.01356
Epoch: 18537, Training Loss: 0.01046
Epoch: 18537, Training Loss: 0.01098
Epoch: 18537, Training Loss: 0.01099
Epoch: 18537, Training Loss: 0.01356
Epoch: 18538, Training Loss: 0.01046
Epoch: 18538, Training Loss: 0.01098
Epoch: 18538, Training Loss: 0.01099
Epoch: 18538, Training Loss: 0.01356
Epoch: 18539, Training Loss: 0.01046
Epoch: 18539, Training Loss: 0.01098
Epoch: 18539, Training Loss: 0.01099
Epoch: 18539, Training Loss: 0.01356
Epoch: 18540, Training Loss: 0.01046
Epoch: 18540, Training Loss: 0.01098
Epoch: 18540, Training Loss: 0.01099
Epoch: 18540, Training Loss: 0.01356
Epoch: 18541, Training Loss: 0.01046
Epoch: 18541, Training Loss: 0.01098
Epoch: 18541, Training Loss: 0.01099
Epoch: 18541, Training Loss: 0.01356
Epoch: 18542, Training Loss: 0.01046
Epoch: 18542, Training Loss: 0.01098
Epoch: 18542, Training Loss: 0.01099
Epoch: 18542, Training Loss: 0.01356
Epoch: 18543, Training Loss: 0.01046
Epoch: 18543, Training Loss: 0.01098
Epoch: 18543, Training Loss: 0.01099
Epoch: 18543, Training Loss: 0.01356
Epoch: 18544, Training Loss: 0.01046
Epoch: 18544, Training Loss: 0.01098
Epoch: 18544, Training Loss: 0.01099
Epoch: 18544, Training Loss: 0.01356
Epoch: 18545, Training Loss: 0.01046
Epoch: 18545, Training Loss: 0.01098
Epoch: 18545, Training Loss: 0.01099
Epoch: 18545, Training Loss: 0.01355
Epoch: 18546, Training Loss: 0.01046
Epoch: 18546, Training Loss: 0.01098
Epoch: 18546, Training Loss: 0.01099
Epoch: 18546, Training Loss: 0.01355
Epoch: 18547, Training Loss: 0.01046
Epoch: 18547, Training Loss: 0.01098
Epoch: 18547, Training Loss: 0.01099
Epoch: 18547, Training Loss: 0.01355
Epoch: 18548, Training Loss: 0.01046
Epoch: 18548, Training Loss: 0.01098
Epoch: 18548, Training Loss: 0.01099
Epoch: 18548, Training Loss: 0.01355
Epoch: 18549, Training Loss: 0.01046
Epoch: 18549, Training Loss: 0.01098
Epoch: 18549, Training Loss: 0.01099
Epoch: 18549, Training Loss: 0.01355
Epoch: 18550, Training Loss: 0.01046
Epoch: 18550, Training Loss: 0.01098
Epoch: 18550, Training Loss: 0.01099
Epoch: 18550, Training Loss: 0.01355
Epoch: 18551, Training Loss: 0.01046
Epoch: 18551, Training Loss: 0.01098
Epoch: 18551, Training Loss: 0.01099
Epoch: 18551, Training Loss: 0.01355
Epoch: 18552, Training Loss: 0.01046
Epoch: 18552, Training Loss: 0.01097
Epoch: 18552, Training Loss: 0.01099
Epoch: 18552, Training Loss: 0.01355
Epoch: 18553, Training Loss: 0.01046
Epoch: 18553, Training Loss: 0.01097
Epoch: 18553, Training Loss: 0.01099
Epoch: 18553, Training Loss: 0.01355
Epoch: 18554, Training Loss: 0.01046
Epoch: 18554, Training Loss: 0.01097
Epoch: 18554, Training Loss: 0.01099
Epoch: 18554, Training Loss: 0.01355
Epoch: 18555, Training Loss: 0.01046
Epoch: 18555, Training Loss: 0.01097
Epoch: 18555, Training Loss: 0.01099
Epoch: 18555, Training Loss: 0.01355
Epoch: 18556, Training Loss: 0.01046
Epoch: 18556, Training Loss: 0.01097
Epoch: 18556, Training Loss: 0.01099
Epoch: 18556, Training Loss: 0.01355
Epoch: 18557, Training Loss: 0.01046
Epoch: 18557, Training Loss: 0.01097
Epoch: 18557, Training Loss: 0.01099
Epoch: 18557, Training Loss: 0.01355
Epoch: 18558, Training Loss: 0.01046
Epoch: 18558, Training Loss: 0.01097
Epoch: 18558, Training Loss: 0.01099
Epoch: 18558, Training Loss: 0.01355
Epoch: 18559, Training Loss: 0.01046
Epoch: 18559, Training Loss: 0.01097
Epoch: 18559, Training Loss: 0.01099
Epoch: 18559, Training Loss: 0.01355
Epoch: 18560, Training Loss: 0.01045
Epoch: 18560, Training Loss: 0.01097
Epoch: 18560, Training Loss: 0.01099
Epoch: 18560, Training Loss: 0.01355
Epoch: 18561, Training Loss: 0.01045
Epoch: 18561, Training Loss: 0.01097
Epoch: 18561, Training Loss: 0.01099
Epoch: 18561, Training Loss: 0.01355
Epoch: 18562, Training Loss: 0.01045
Epoch: 18562, Training Loss: 0.01097
Epoch: 18562, Training Loss: 0.01099
Epoch: 18562, Training Loss: 0.01355
Epoch: 18563, Training Loss: 0.01045
Epoch: 18563, Training Loss: 0.01097
Epoch: 18563, Training Loss: 0.01099
Epoch: 18563, Training Loss: 0.01355
Epoch: 18564, Training Loss: 0.01045
Epoch: 18564, Training Loss: 0.01097
Epoch: 18564, Training Loss: 0.01099
Epoch: 18564, Training Loss: 0.01355
Epoch: 18565, Training Loss: 0.01045
Epoch: 18565, Training Loss: 0.01097
Epoch: 18565, Training Loss: 0.01098
Epoch: 18565, Training Loss: 0.01355
Epoch: 18566, Training Loss: 0.01045
Epoch: 18566, Training Loss: 0.01097
Epoch: 18566, Training Loss: 0.01098
Epoch: 18566, Training Loss: 0.01355
Epoch: 18567, Training Loss: 0.01045
Epoch: 18567, Training Loss: 0.01097
Epoch: 18567, Training Loss: 0.01098
Epoch: 18567, Training Loss: 0.01354
Epoch: 18568, Training Loss: 0.01045
Epoch: 18568, Training Loss: 0.01097
Epoch: 18568, Training Loss: 0.01098
Epoch: 18568, Training Loss: 0.01354
Epoch: 18569, Training Loss: 0.01045
Epoch: 18569, Training Loss: 0.01097
Epoch: 18569, Training Loss: 0.01098
Epoch: 18569, Training Loss: 0.01354
Epoch: 18570, Training Loss: 0.01045
Epoch: 18570, Training Loss: 0.01097
Epoch: 18570, Training Loss: 0.01098
Epoch: 18570, Training Loss: 0.01354
Epoch: 18571, Training Loss: 0.01045
Epoch: 18571, Training Loss: 0.01097
Epoch: 18571, Training Loss: 0.01098
Epoch: 18571, Training Loss: 0.01354
Epoch: 18572, Training Loss: 0.01045
Epoch: 18572, Training Loss: 0.01097
Epoch: 18572, Training Loss: 0.01098
Epoch: 18572, Training Loss: 0.01354
Epoch: 18573, Training Loss: 0.01045
Epoch: 18573, Training Loss: 0.01097
Epoch: 18573, Training Loss: 0.01098
Epoch: 18573, Training Loss: 0.01354
Epoch: 18574, Training Loss: 0.01045
Epoch: 18574, Training Loss: 0.01097
Epoch: 18574, Training Loss: 0.01098
Epoch: 18574, Training Loss: 0.01354
Epoch: 18575, Training Loss: 0.01045
Epoch: 18575, Training Loss: 0.01097
Epoch: 18575, Training Loss: 0.01098
Epoch: 18575, Training Loss: 0.01354
Epoch: 18576, Training Loss: 0.01045
Epoch: 18576, Training Loss: 0.01097
Epoch: 18576, Training Loss: 0.01098
Epoch: 18576, Training Loss: 0.01354
Epoch: 18577, Training Loss: 0.01045
Epoch: 18577, Training Loss: 0.01097
Epoch: 18577, Training Loss: 0.01098
Epoch: 18577, Training Loss: 0.01354
Epoch: 18578, Training Loss: 0.01045
Epoch: 18578, Training Loss: 0.01097
Epoch: 18578, Training Loss: 0.01098
Epoch: 18578, Training Loss: 0.01354
Epoch: 18579, Training Loss: 0.01045
Epoch: 18579, Training Loss: 0.01097
Epoch: 18579, Training Loss: 0.01098
Epoch: 18579, Training Loss: 0.01354
Epoch: 18580, Training Loss: 0.01045
Epoch: 18580, Training Loss: 0.01096
Epoch: 18580, Training Loss: 0.01098
Epoch: 18580, Training Loss: 0.01354
Epoch: 18581, Training Loss: 0.01045
Epoch: 18581, Training Loss: 0.01096
Epoch: 18581, Training Loss: 0.01098
Epoch: 18581, Training Loss: 0.01354
Epoch: 18582, Training Loss: 0.01045
Epoch: 18582, Training Loss: 0.01096
Epoch: 18582, Training Loss: 0.01098
Epoch: 18582, Training Loss: 0.01354
Epoch: 18583, Training Loss: 0.01045
Epoch: 18583, Training Loss: 0.01096
Epoch: 18583, Training Loss: 0.01098
Epoch: 18583, Training Loss: 0.01354
Epoch: 18584, Training Loss: 0.01045
Epoch: 18584, Training Loss: 0.01096
Epoch: 18584, Training Loss: 0.01098
Epoch: 18584, Training Loss: 0.01354
Epoch: 18585, Training Loss: 0.01045
Epoch: 18585, Training Loss: 0.01096
Epoch: 18585, Training Loss: 0.01098
Epoch: 18585, Training Loss: 0.01354
Epoch: 18586, Training Loss: 0.01045
Epoch: 18586, Training Loss: 0.01096
Epoch: 18586, Training Loss: 0.01098
Epoch: 18586, Training Loss: 0.01354
Epoch: 18587, Training Loss: 0.01045
Epoch: 18587, Training Loss: 0.01096
Epoch: 18587, Training Loss: 0.01098
Epoch: 18587, Training Loss: 0.01354
Epoch: 18588, Training Loss: 0.01045
Epoch: 18588, Training Loss: 0.01096
Epoch: 18588, Training Loss: 0.01098
Epoch: 18588, Training Loss: 0.01354
Epoch: 18589, Training Loss: 0.01045
Epoch: 18589, Training Loss: 0.01096
Epoch: 18589, Training Loss: 0.01098
Epoch: 18589, Training Loss: 0.01354
Epoch: 18590, Training Loss: 0.01045
Epoch: 18590, Training Loss: 0.01096
Epoch: 18590, Training Loss: 0.01098
Epoch: 18590, Training Loss: 0.01353
Epoch: 18591, Training Loss: 0.01044
Epoch: 18591, Training Loss: 0.01096
Epoch: 18591, Training Loss: 0.01098
Epoch: 18591, Training Loss: 0.01353
Epoch: 18592, Training Loss: 0.01044
Epoch: 18592, Training Loss: 0.01096
Epoch: 18592, Training Loss: 0.01098
Epoch: 18592, Training Loss: 0.01353
Epoch: 18593, Training Loss: 0.01044
Epoch: 18593, Training Loss: 0.01096
Epoch: 18593, Training Loss: 0.01098
Epoch: 18593, Training Loss: 0.01353
Epoch: 18594, Training Loss: 0.01044
Epoch: 18594, Training Loss: 0.01096
Epoch: 18594, Training Loss: 0.01097
Epoch: 18594, Training Loss: 0.01353
Epoch: 18595, Training Loss: 0.01044
Epoch: 18595, Training Loss: 0.01096
Epoch: 18595, Training Loss: 0.01097
Epoch: 18595, Training Loss: 0.01353
Epoch: 18596, Training Loss: 0.01044
Epoch: 18596, Training Loss: 0.01096
Epoch: 18596, Training Loss: 0.01097
Epoch: 18596, Training Loss: 0.01353
Epoch: 18597, Training Loss: 0.01044
Epoch: 18597, Training Loss: 0.01096
Epoch: 18597, Training Loss: 0.01097
Epoch: 18597, Training Loss: 0.01353
Epoch: 18598, Training Loss: 0.01044
Epoch: 18598, Training Loss: 0.01096
Epoch: 18598, Training Loss: 0.01097
Epoch: 18598, Training Loss: 0.01353
Epoch: 18599, Training Loss: 0.01044
Epoch: 18599, Training Loss: 0.01096
Epoch: 18599, Training Loss: 0.01097
Epoch: 18599, Training Loss: 0.01353
Epoch: 18600, Training Loss: 0.01044
Epoch: 18600, Training Loss: 0.01096
Epoch: 18600, Training Loss: 0.01097
Epoch: 18600, Training Loss: 0.01353
Epoch: 18601, Training Loss: 0.01044
Epoch: 18601, Training Loss: 0.01096
Epoch: 18601, Training Loss: 0.01097
Epoch: 18601, Training Loss: 0.01353
Epoch: 18602, Training Loss: 0.01044
Epoch: 18602, Training Loss: 0.01096
Epoch: 18602, Training Loss: 0.01097
Epoch: 18602, Training Loss: 0.01353
Epoch: 18603, Training Loss: 0.01044
Epoch: 18603, Training Loss: 0.01096
Epoch: 18603, Training Loss: 0.01097
Epoch: 18603, Training Loss: 0.01353
Epoch: 18604, Training Loss: 0.01044
Epoch: 18604, Training Loss: 0.01096
Epoch: 18604, Training Loss: 0.01097
Epoch: 18604, Training Loss: 0.01353
Epoch: 18605, Training Loss: 0.01044
Epoch: 18605, Training Loss: 0.01096
Epoch: 18605, Training Loss: 0.01097
Epoch: 18605, Training Loss: 0.01353
Epoch: 18606, Training Loss: 0.01044
Epoch: 18606, Training Loss: 0.01096
Epoch: 18606, Training Loss: 0.01097
Epoch: 18606, Training Loss: 0.01353
Epoch: 18607, Training Loss: 0.01044
Epoch: 18607, Training Loss: 0.01096
Epoch: 18607, Training Loss: 0.01097
Epoch: 18607, Training Loss: 0.01353
Epoch: 18608, Training Loss: 0.01044
Epoch: 18608, Training Loss: 0.01096
Epoch: 18608, Training Loss: 0.01097
Epoch: 18608, Training Loss: 0.01353
Epoch: 18609, Training Loss: 0.01044
Epoch: 18609, Training Loss: 0.01095
Epoch: 18609, Training Loss: 0.01097
Epoch: 18609, Training Loss: 0.01353
Epoch: 18610, Training Loss: 0.01044
Epoch: 18610, Training Loss: 0.01095
Epoch: 18610, Training Loss: 0.01097
Epoch: 18610, Training Loss: 0.01353
Epoch: 18611, Training Loss: 0.01044
Epoch: 18611, Training Loss: 0.01095
Epoch: 18611, Training Loss: 0.01097
Epoch: 18611, Training Loss: 0.01353
Epoch: 18612, Training Loss: 0.01044
Epoch: 18612, Training Loss: 0.01095
Epoch: 18612, Training Loss: 0.01097
Epoch: 18612, Training Loss: 0.01353
Epoch: 18613, Training Loss: 0.01044
Epoch: 18613, Training Loss: 0.01095
Epoch: 18613, Training Loss: 0.01097
Epoch: 18613, Training Loss: 0.01352
Epoch: 18614, Training Loss: 0.01044
Epoch: 18614, Training Loss: 0.01095
Epoch: 18614, Training Loss: 0.01097
Epoch: 18614, Training Loss: 0.01352
Epoch: 18615, Training Loss: 0.01044
Epoch: 18615, Training Loss: 0.01095
Epoch: 18615, Training Loss: 0.01097
Epoch: 18615, Training Loss: 0.01352
Epoch: 18616, Training Loss: 0.01044
Epoch: 18616, Training Loss: 0.01095
Epoch: 18616, Training Loss: 0.01097
Epoch: 18616, Training Loss: 0.01352
Epoch: 18617, Training Loss: 0.01044
Epoch: 18617, Training Loss: 0.01095
Epoch: 18617, Training Loss: 0.01097
Epoch: 18617, Training Loss: 0.01352
Epoch: 18618, Training Loss: 0.01044
Epoch: 18618, Training Loss: 0.01095
Epoch: 18618, Training Loss: 0.01097
Epoch: 18618, Training Loss: 0.01352
Epoch: 18619, Training Loss: 0.01044
Epoch: 18619, Training Loss: 0.01095
Epoch: 18619, Training Loss: 0.01097
Epoch: 18619, Training Loss: 0.01352
Epoch: 18620, Training Loss: 0.01044
Epoch: 18620, Training Loss: 0.01095
Epoch: 18620, Training Loss: 0.01097
Epoch: 18620, Training Loss: 0.01352
Epoch: 18621, Training Loss: 0.01043
Epoch: 18621, Training Loss: 0.01095
Epoch: 18621, Training Loss: 0.01097
Epoch: 18621, Training Loss: 0.01352
Epoch: 18622, Training Loss: 0.01043
Epoch: 18622, Training Loss: 0.01095
Epoch: 18622, Training Loss: 0.01096
Epoch: 18622, Training Loss: 0.01352
Epoch: 18623, Training Loss: 0.01043
Epoch: 18623, Training Loss: 0.01095
Epoch: 18623, Training Loss: 0.01096
Epoch: 18623, Training Loss: 0.01352
Epoch: 18624, Training Loss: 0.01043
Epoch: 18624, Training Loss: 0.01095
Epoch: 18624, Training Loss: 0.01096
Epoch: 18624, Training Loss: 0.01352
Epoch: 18625, Training Loss: 0.01043
Epoch: 18625, Training Loss: 0.01095
Epoch: 18625, Training Loss: 0.01096
Epoch: 18625, Training Loss: 0.01352
Epoch: 18626, Training Loss: 0.01043
Epoch: 18626, Training Loss: 0.01095
Epoch: 18626, Training Loss: 0.01096
Epoch: 18626, Training Loss: 0.01352
Epoch: 18627, Training Loss: 0.01043
Epoch: 18627, Training Loss: 0.01095
Epoch: 18627, Training Loss: 0.01096
Epoch: 18627, Training Loss: 0.01352
Epoch: 18628, Training Loss: 0.01043
Epoch: 18628, Training Loss: 0.01095
Epoch: 18628, Training Loss: 0.01096
Epoch: 18628, Training Loss: 0.01352
Epoch: 18629, Training Loss: 0.01043
Epoch: 18629, Training Loss: 0.01095
Epoch: 18629, Training Loss: 0.01096
Epoch: 18629, Training Loss: 0.01352
Epoch: 18630, Training Loss: 0.01043
Epoch: 18630, Training Loss: 0.01095
Epoch: 18630, Training Loss: 0.01096
Epoch: 18630, Training Loss: 0.01352
Epoch: 18631, Training Loss: 0.01043
Epoch: 18631, Training Loss: 0.01095
Epoch: 18631, Training Loss: 0.01096
Epoch: 18631, Training Loss: 0.01352
Epoch: 18632, Training Loss: 0.01043
Epoch: 18632, Training Loss: 0.01095
Epoch: 18632, Training Loss: 0.01096
Epoch: 18632, Training Loss: 0.01352
Epoch: 18633, Training Loss: 0.01043
Epoch: 18633, Training Loss: 0.01095
Epoch: 18633, Training Loss: 0.01096
Epoch: 18633, Training Loss: 0.01352
Epoch: 18634, Training Loss: 0.01043
Epoch: 18634, Training Loss: 0.01095
Epoch: 18634, Training Loss: 0.01096
Epoch: 18634, Training Loss: 0.01352
Epoch: 18635, Training Loss: 0.01043
Epoch: 18635, Training Loss: 0.01095
Epoch: 18635, Training Loss: 0.01096
Epoch: 18635, Training Loss: 0.01351
Epoch: 18636, Training Loss: 0.01043
Epoch: 18636, Training Loss: 0.01095
Epoch: 18636, Training Loss: 0.01096
Epoch: 18636, Training Loss: 0.01351
Epoch: 18637, Training Loss: 0.01043
Epoch: 18637, Training Loss: 0.01094
Epoch: 18637, Training Loss: 0.01096
Epoch: 18637, Training Loss: 0.01351
Epoch: 18638, Training Loss: 0.01043
Epoch: 18638, Training Loss: 0.01094
Epoch: 18638, Training Loss: 0.01096
Epoch: 18638, Training Loss: 0.01351
Epoch: 18639, Training Loss: 0.01043
Epoch: 18639, Training Loss: 0.01094
Epoch: 18639, Training Loss: 0.01096
Epoch: 18639, Training Loss: 0.01351
Epoch: 18640, Training Loss: 0.01043
Epoch: 18640, Training Loss: 0.01094
Epoch: 18640, Training Loss: 0.01096
Epoch: 18640, Training Loss: 0.01351
Epoch: 18641, Training Loss: 0.01043
Epoch: 18641, Training Loss: 0.01094
Epoch: 18641, Training Loss: 0.01096
Epoch: 18641, Training Loss: 0.01351
Epoch: 18642, Training Loss: 0.01043
Epoch: 18642, Training Loss: 0.01094
Epoch: 18642, Training Loss: 0.01096
Epoch: 18642, Training Loss: 0.01351
Epoch: 18643, Training Loss: 0.01043
Epoch: 18643, Training Loss: 0.01094
Epoch: 18643, Training Loss: 0.01096
Epoch: 18643, Training Loss: 0.01351
Epoch: 18644, Training Loss: 0.01043
Epoch: 18644, Training Loss: 0.01094
Epoch: 18644, Training Loss: 0.01096
Epoch: 18644, Training Loss: 0.01351
Epoch: 18645, Training Loss: 0.01043
Epoch: 18645, Training Loss: 0.01094
[Training log elided: epochs 18645–18889, four per-sample losses printed per epoch, decreasing very slowly from roughly (0.01043, 0.01094, 0.01096, 0.01351) at epoch 18645 to roughly (0.01035, 0.01086, 0.01087, 0.01340) at epoch 18888.]
Epoch: 18889, Training Loss: 0.01086
Epoch: 18889, Training Loss: 0.01087
Epoch: 18889, Training Loss: 0.01340
Epoch: 18890, Training Loss: 0.01035
Epoch: 18890, Training Loss: 0.01086
Epoch: 18890, Training Loss: 0.01087
Epoch: 18890, Training Loss: 0.01340
Epoch: 18891, Training Loss: 0.01035
Epoch: 18891, Training Loss: 0.01086
Epoch: 18891, Training Loss: 0.01087
Epoch: 18891, Training Loss: 0.01340
Epoch: 18892, Training Loss: 0.01035
Epoch: 18892, Training Loss: 0.01086
Epoch: 18892, Training Loss: 0.01087
Epoch: 18892, Training Loss: 0.01340
Epoch: 18893, Training Loss: 0.01035
Epoch: 18893, Training Loss: 0.01086
Epoch: 18893, Training Loss: 0.01087
Epoch: 18893, Training Loss: 0.01340
Epoch: 18894, Training Loss: 0.01035
Epoch: 18894, Training Loss: 0.01086
Epoch: 18894, Training Loss: 0.01087
Epoch: 18894, Training Loss: 0.01340
Epoch: 18895, Training Loss: 0.01035
Epoch: 18895, Training Loss: 0.01086
Epoch: 18895, Training Loss: 0.01087
Epoch: 18895, Training Loss: 0.01340
Epoch: 18896, Training Loss: 0.01035
Epoch: 18896, Training Loss: 0.01086
Epoch: 18896, Training Loss: 0.01087
Epoch: 18896, Training Loss: 0.01340
Epoch: 18897, Training Loss: 0.01035
Epoch: 18897, Training Loss: 0.01085
Epoch: 18897, Training Loss: 0.01087
Epoch: 18897, Training Loss: 0.01340
Epoch: 18898, Training Loss: 0.01035
Epoch: 18898, Training Loss: 0.01085
Epoch: 18898, Training Loss: 0.01087
Epoch: 18898, Training Loss: 0.01340
Epoch: 18899, Training Loss: 0.01035
Epoch: 18899, Training Loss: 0.01085
Epoch: 18899, Training Loss: 0.01087
Epoch: 18899, Training Loss: 0.01340
Epoch: 18900, Training Loss: 0.01035
Epoch: 18900, Training Loss: 0.01085
Epoch: 18900, Training Loss: 0.01087
Epoch: 18900, Training Loss: 0.01340
Epoch: 18901, Training Loss: 0.01035
Epoch: 18901, Training Loss: 0.01085
Epoch: 18901, Training Loss: 0.01087
Epoch: 18901, Training Loss: 0.01340
Epoch: 18902, Training Loss: 0.01034
Epoch: 18902, Training Loss: 0.01085
Epoch: 18902, Training Loss: 0.01087
Epoch: 18902, Training Loss: 0.01340
Epoch: 18903, Training Loss: 0.01034
Epoch: 18903, Training Loss: 0.01085
Epoch: 18903, Training Loss: 0.01087
Epoch: 18903, Training Loss: 0.01340
Epoch: 18904, Training Loss: 0.01034
Epoch: 18904, Training Loss: 0.01085
Epoch: 18904, Training Loss: 0.01087
Epoch: 18904, Training Loss: 0.01340
Epoch: 18905, Training Loss: 0.01034
Epoch: 18905, Training Loss: 0.01085
Epoch: 18905, Training Loss: 0.01087
Epoch: 18905, Training Loss: 0.01340
Epoch: 18906, Training Loss: 0.01034
Epoch: 18906, Training Loss: 0.01085
Epoch: 18906, Training Loss: 0.01087
Epoch: 18906, Training Loss: 0.01340
Epoch: 18907, Training Loss: 0.01034
Epoch: 18907, Training Loss: 0.01085
Epoch: 18907, Training Loss: 0.01087
Epoch: 18907, Training Loss: 0.01340
Epoch: 18908, Training Loss: 0.01034
Epoch: 18908, Training Loss: 0.01085
Epoch: 18908, Training Loss: 0.01087
Epoch: 18908, Training Loss: 0.01340
Epoch: 18909, Training Loss: 0.01034
Epoch: 18909, Training Loss: 0.01085
Epoch: 18909, Training Loss: 0.01087
Epoch: 18909, Training Loss: 0.01340
Epoch: 18910, Training Loss: 0.01034
Epoch: 18910, Training Loss: 0.01085
Epoch: 18910, Training Loss: 0.01086
Epoch: 18910, Training Loss: 0.01340
Epoch: 18911, Training Loss: 0.01034
Epoch: 18911, Training Loss: 0.01085
Epoch: 18911, Training Loss: 0.01086
Epoch: 18911, Training Loss: 0.01339
Epoch: 18912, Training Loss: 0.01034
Epoch: 18912, Training Loss: 0.01085
Epoch: 18912, Training Loss: 0.01086
Epoch: 18912, Training Loss: 0.01339
Epoch: 18913, Training Loss: 0.01034
Epoch: 18913, Training Loss: 0.01085
Epoch: 18913, Training Loss: 0.01086
Epoch: 18913, Training Loss: 0.01339
Epoch: 18914, Training Loss: 0.01034
Epoch: 18914, Training Loss: 0.01085
Epoch: 18914, Training Loss: 0.01086
Epoch: 18914, Training Loss: 0.01339
Epoch: 18915, Training Loss: 0.01034
Epoch: 18915, Training Loss: 0.01085
Epoch: 18915, Training Loss: 0.01086
Epoch: 18915, Training Loss: 0.01339
Epoch: 18916, Training Loss: 0.01034
Epoch: 18916, Training Loss: 0.01085
Epoch: 18916, Training Loss: 0.01086
Epoch: 18916, Training Loss: 0.01339
Epoch: 18917, Training Loss: 0.01034
Epoch: 18917, Training Loss: 0.01085
Epoch: 18917, Training Loss: 0.01086
Epoch: 18917, Training Loss: 0.01339
Epoch: 18918, Training Loss: 0.01034
Epoch: 18918, Training Loss: 0.01085
Epoch: 18918, Training Loss: 0.01086
Epoch: 18918, Training Loss: 0.01339
Epoch: 18919, Training Loss: 0.01034
Epoch: 18919, Training Loss: 0.01085
Epoch: 18919, Training Loss: 0.01086
Epoch: 18919, Training Loss: 0.01339
Epoch: 18920, Training Loss: 0.01034
Epoch: 18920, Training Loss: 0.01085
Epoch: 18920, Training Loss: 0.01086
Epoch: 18920, Training Loss: 0.01339
Epoch: 18921, Training Loss: 0.01034
Epoch: 18921, Training Loss: 0.01085
Epoch: 18921, Training Loss: 0.01086
Epoch: 18921, Training Loss: 0.01339
Epoch: 18922, Training Loss: 0.01034
Epoch: 18922, Training Loss: 0.01085
Epoch: 18922, Training Loss: 0.01086
Epoch: 18922, Training Loss: 0.01339
Epoch: 18923, Training Loss: 0.01034
Epoch: 18923, Training Loss: 0.01085
Epoch: 18923, Training Loss: 0.01086
Epoch: 18923, Training Loss: 0.01339
Epoch: 18924, Training Loss: 0.01034
Epoch: 18924, Training Loss: 0.01085
Epoch: 18924, Training Loss: 0.01086
Epoch: 18924, Training Loss: 0.01339
Epoch: 18925, Training Loss: 0.01034
Epoch: 18925, Training Loss: 0.01085
Epoch: 18925, Training Loss: 0.01086
Epoch: 18925, Training Loss: 0.01339
Epoch: 18926, Training Loss: 0.01034
Epoch: 18926, Training Loss: 0.01084
Epoch: 18926, Training Loss: 0.01086
Epoch: 18926, Training Loss: 0.01339
Epoch: 18927, Training Loss: 0.01034
Epoch: 18927, Training Loss: 0.01084
Epoch: 18927, Training Loss: 0.01086
Epoch: 18927, Training Loss: 0.01339
Epoch: 18928, Training Loss: 0.01034
Epoch: 18928, Training Loss: 0.01084
Epoch: 18928, Training Loss: 0.01086
Epoch: 18928, Training Loss: 0.01339
Epoch: 18929, Training Loss: 0.01034
Epoch: 18929, Training Loss: 0.01084
Epoch: 18929, Training Loss: 0.01086
Epoch: 18929, Training Loss: 0.01339
Epoch: 18930, Training Loss: 0.01034
Epoch: 18930, Training Loss: 0.01084
Epoch: 18930, Training Loss: 0.01086
Epoch: 18930, Training Loss: 0.01339
Epoch: 18931, Training Loss: 0.01034
Epoch: 18931, Training Loss: 0.01084
Epoch: 18931, Training Loss: 0.01086
Epoch: 18931, Training Loss: 0.01339
Epoch: 18932, Training Loss: 0.01034
Epoch: 18932, Training Loss: 0.01084
Epoch: 18932, Training Loss: 0.01086
Epoch: 18932, Training Loss: 0.01339
Epoch: 18933, Training Loss: 0.01033
Epoch: 18933, Training Loss: 0.01084
Epoch: 18933, Training Loss: 0.01086
Epoch: 18933, Training Loss: 0.01339
Epoch: 18934, Training Loss: 0.01033
Epoch: 18934, Training Loss: 0.01084
Epoch: 18934, Training Loss: 0.01086
Epoch: 18934, Training Loss: 0.01338
Epoch: 18935, Training Loss: 0.01033
Epoch: 18935, Training Loss: 0.01084
Epoch: 18935, Training Loss: 0.01086
Epoch: 18935, Training Loss: 0.01338
Epoch: 18936, Training Loss: 0.01033
Epoch: 18936, Training Loss: 0.01084
Epoch: 18936, Training Loss: 0.01086
Epoch: 18936, Training Loss: 0.01338
Epoch: 18937, Training Loss: 0.01033
Epoch: 18937, Training Loss: 0.01084
Epoch: 18937, Training Loss: 0.01086
Epoch: 18937, Training Loss: 0.01338
Epoch: 18938, Training Loss: 0.01033
Epoch: 18938, Training Loss: 0.01084
Epoch: 18938, Training Loss: 0.01086
Epoch: 18938, Training Loss: 0.01338
Epoch: 18939, Training Loss: 0.01033
Epoch: 18939, Training Loss: 0.01084
Epoch: 18939, Training Loss: 0.01085
Epoch: 18939, Training Loss: 0.01338
Epoch: 18940, Training Loss: 0.01033
Epoch: 18940, Training Loss: 0.01084
Epoch: 18940, Training Loss: 0.01085
Epoch: 18940, Training Loss: 0.01338
Epoch: 18941, Training Loss: 0.01033
Epoch: 18941, Training Loss: 0.01084
Epoch: 18941, Training Loss: 0.01085
Epoch: 18941, Training Loss: 0.01338
Epoch: 18942, Training Loss: 0.01033
Epoch: 18942, Training Loss: 0.01084
Epoch: 18942, Training Loss: 0.01085
Epoch: 18942, Training Loss: 0.01338
Epoch: 18943, Training Loss: 0.01033
Epoch: 18943, Training Loss: 0.01084
Epoch: 18943, Training Loss: 0.01085
Epoch: 18943, Training Loss: 0.01338
Epoch: 18944, Training Loss: 0.01033
Epoch: 18944, Training Loss: 0.01084
Epoch: 18944, Training Loss: 0.01085
Epoch: 18944, Training Loss: 0.01338
Epoch: 18945, Training Loss: 0.01033
Epoch: 18945, Training Loss: 0.01084
Epoch: 18945, Training Loss: 0.01085
Epoch: 18945, Training Loss: 0.01338
Epoch: 18946, Training Loss: 0.01033
Epoch: 18946, Training Loss: 0.01084
Epoch: 18946, Training Loss: 0.01085
Epoch: 18946, Training Loss: 0.01338
Epoch: 18947, Training Loss: 0.01033
Epoch: 18947, Training Loss: 0.01084
Epoch: 18947, Training Loss: 0.01085
Epoch: 18947, Training Loss: 0.01338
Epoch: 18948, Training Loss: 0.01033
Epoch: 18948, Training Loss: 0.01084
Epoch: 18948, Training Loss: 0.01085
Epoch: 18948, Training Loss: 0.01338
Epoch: 18949, Training Loss: 0.01033
Epoch: 18949, Training Loss: 0.01084
Epoch: 18949, Training Loss: 0.01085
Epoch: 18949, Training Loss: 0.01338
Epoch: 18950, Training Loss: 0.01033
Epoch: 18950, Training Loss: 0.01084
Epoch: 18950, Training Loss: 0.01085
Epoch: 18950, Training Loss: 0.01338
Epoch: 18951, Training Loss: 0.01033
Epoch: 18951, Training Loss: 0.01084
Epoch: 18951, Training Loss: 0.01085
Epoch: 18951, Training Loss: 0.01338
Epoch: 18952, Training Loss: 0.01033
Epoch: 18952, Training Loss: 0.01084
Epoch: 18952, Training Loss: 0.01085
Epoch: 18952, Training Loss: 0.01338
Epoch: 18953, Training Loss: 0.01033
Epoch: 18953, Training Loss: 0.01084
Epoch: 18953, Training Loss: 0.01085
Epoch: 18953, Training Loss: 0.01338
Epoch: 18954, Training Loss: 0.01033
Epoch: 18954, Training Loss: 0.01084
Epoch: 18954, Training Loss: 0.01085
Epoch: 18954, Training Loss: 0.01338
Epoch: 18955, Training Loss: 0.01033
Epoch: 18955, Training Loss: 0.01083
Epoch: 18955, Training Loss: 0.01085
Epoch: 18955, Training Loss: 0.01338
Epoch: 18956, Training Loss: 0.01033
Epoch: 18956, Training Loss: 0.01083
Epoch: 18956, Training Loss: 0.01085
Epoch: 18956, Training Loss: 0.01338
Epoch: 18957, Training Loss: 0.01033
Epoch: 18957, Training Loss: 0.01083
Epoch: 18957, Training Loss: 0.01085
Epoch: 18957, Training Loss: 0.01338
Epoch: 18958, Training Loss: 0.01033
Epoch: 18958, Training Loss: 0.01083
Epoch: 18958, Training Loss: 0.01085
Epoch: 18958, Training Loss: 0.01337
Epoch: 18959, Training Loss: 0.01033
Epoch: 18959, Training Loss: 0.01083
Epoch: 18959, Training Loss: 0.01085
Epoch: 18959, Training Loss: 0.01337
Epoch: 18960, Training Loss: 0.01033
Epoch: 18960, Training Loss: 0.01083
Epoch: 18960, Training Loss: 0.01085
Epoch: 18960, Training Loss: 0.01337
Epoch: 18961, Training Loss: 0.01033
Epoch: 18961, Training Loss: 0.01083
Epoch: 18961, Training Loss: 0.01085
Epoch: 18961, Training Loss: 0.01337
Epoch: 18962, Training Loss: 0.01033
Epoch: 18962, Training Loss: 0.01083
Epoch: 18962, Training Loss: 0.01085
Epoch: 18962, Training Loss: 0.01337
Epoch: 18963, Training Loss: 0.01033
Epoch: 18963, Training Loss: 0.01083
Epoch: 18963, Training Loss: 0.01085
Epoch: 18963, Training Loss: 0.01337
Epoch: 18964, Training Loss: 0.01033
Epoch: 18964, Training Loss: 0.01083
Epoch: 18964, Training Loss: 0.01085
Epoch: 18964, Training Loss: 0.01337
Epoch: 18965, Training Loss: 0.01032
Epoch: 18965, Training Loss: 0.01083
Epoch: 18965, Training Loss: 0.01085
Epoch: 18965, Training Loss: 0.01337
Epoch: 18966, Training Loss: 0.01032
Epoch: 18966, Training Loss: 0.01083
Epoch: 18966, Training Loss: 0.01085
Epoch: 18966, Training Loss: 0.01337
Epoch: 18967, Training Loss: 0.01032
Epoch: 18967, Training Loss: 0.01083
Epoch: 18967, Training Loss: 0.01085
Epoch: 18967, Training Loss: 0.01337
Epoch: 18968, Training Loss: 0.01032
Epoch: 18968, Training Loss: 0.01083
Epoch: 18968, Training Loss: 0.01084
Epoch: 18968, Training Loss: 0.01337
Epoch: 18969, Training Loss: 0.01032
Epoch: 18969, Training Loss: 0.01083
Epoch: 18969, Training Loss: 0.01084
Epoch: 18969, Training Loss: 0.01337
Epoch: 18970, Training Loss: 0.01032
Epoch: 18970, Training Loss: 0.01083
Epoch: 18970, Training Loss: 0.01084
Epoch: 18970, Training Loss: 0.01337
Epoch: 18971, Training Loss: 0.01032
Epoch: 18971, Training Loss: 0.01083
Epoch: 18971, Training Loss: 0.01084
Epoch: 18971, Training Loss: 0.01337
Epoch: 18972, Training Loss: 0.01032
Epoch: 18972, Training Loss: 0.01083
Epoch: 18972, Training Loss: 0.01084
Epoch: 18972, Training Loss: 0.01337
Epoch: 18973, Training Loss: 0.01032
Epoch: 18973, Training Loss: 0.01083
Epoch: 18973, Training Loss: 0.01084
Epoch: 18973, Training Loss: 0.01337
Epoch: 18974, Training Loss: 0.01032
Epoch: 18974, Training Loss: 0.01083
Epoch: 18974, Training Loss: 0.01084
Epoch: 18974, Training Loss: 0.01337
Epoch: 18975, Training Loss: 0.01032
Epoch: 18975, Training Loss: 0.01083
Epoch: 18975, Training Loss: 0.01084
Epoch: 18975, Training Loss: 0.01337
Epoch: 18976, Training Loss: 0.01032
Epoch: 18976, Training Loss: 0.01083
Epoch: 18976, Training Loss: 0.01084
Epoch: 18976, Training Loss: 0.01337
Epoch: 18977, Training Loss: 0.01032
Epoch: 18977, Training Loss: 0.01083
Epoch: 18977, Training Loss: 0.01084
Epoch: 18977, Training Loss: 0.01337
Epoch: 18978, Training Loss: 0.01032
Epoch: 18978, Training Loss: 0.01083
Epoch: 18978, Training Loss: 0.01084
Epoch: 18978, Training Loss: 0.01337
Epoch: 18979, Training Loss: 0.01032
Epoch: 18979, Training Loss: 0.01083
Epoch: 18979, Training Loss: 0.01084
Epoch: 18979, Training Loss: 0.01337
Epoch: 18980, Training Loss: 0.01032
Epoch: 18980, Training Loss: 0.01083
Epoch: 18980, Training Loss: 0.01084
Epoch: 18980, Training Loss: 0.01337
Epoch: 18981, Training Loss: 0.01032
Epoch: 18981, Training Loss: 0.01083
Epoch: 18981, Training Loss: 0.01084
Epoch: 18981, Training Loss: 0.01336
Epoch: 18982, Training Loss: 0.01032
Epoch: 18982, Training Loss: 0.01083
Epoch: 18982, Training Loss: 0.01084
Epoch: 18982, Training Loss: 0.01336
Epoch: 18983, Training Loss: 0.01032
Epoch: 18983, Training Loss: 0.01083
Epoch: 18983, Training Loss: 0.01084
Epoch: 18983, Training Loss: 0.01336
Epoch: 18984, Training Loss: 0.01032
Epoch: 18984, Training Loss: 0.01083
Epoch: 18984, Training Loss: 0.01084
Epoch: 18984, Training Loss: 0.01336
Epoch: 18985, Training Loss: 0.01032
Epoch: 18985, Training Loss: 0.01082
Epoch: 18985, Training Loss: 0.01084
Epoch: 18985, Training Loss: 0.01336
Epoch: 18986, Training Loss: 0.01032
Epoch: 18986, Training Loss: 0.01082
Epoch: 18986, Training Loss: 0.01084
Epoch: 18986, Training Loss: 0.01336
Epoch: 18987, Training Loss: 0.01032
Epoch: 18987, Training Loss: 0.01082
Epoch: 18987, Training Loss: 0.01084
Epoch: 18987, Training Loss: 0.01336
Epoch: 18988, Training Loss: 0.01032
Epoch: 18988, Training Loss: 0.01082
Epoch: 18988, Training Loss: 0.01084
Epoch: 18988, Training Loss: 0.01336
Epoch: 18989, Training Loss: 0.01032
Epoch: 18989, Training Loss: 0.01082
Epoch: 18989, Training Loss: 0.01084
Epoch: 18989, Training Loss: 0.01336
Epoch: 18990, Training Loss: 0.01032
Epoch: 18990, Training Loss: 0.01082
Epoch: 18990, Training Loss: 0.01084
Epoch: 18990, Training Loss: 0.01336
Epoch: 18991, Training Loss: 0.01032
Epoch: 18991, Training Loss: 0.01082
Epoch: 18991, Training Loss: 0.01084
Epoch: 18991, Training Loss: 0.01336
Epoch: 18992, Training Loss: 0.01032
Epoch: 18992, Training Loss: 0.01082
Epoch: 18992, Training Loss: 0.01084
Epoch: 18992, Training Loss: 0.01336
Epoch: 18993, Training Loss: 0.01032
Epoch: 18993, Training Loss: 0.01082
Epoch: 18993, Training Loss: 0.01084
Epoch: 18993, Training Loss: 0.01336
Epoch: 18994, Training Loss: 0.01032
Epoch: 18994, Training Loss: 0.01082
Epoch: 18994, Training Loss: 0.01084
Epoch: 18994, Training Loss: 0.01336
Epoch: 18995, Training Loss: 0.01032
Epoch: 18995, Training Loss: 0.01082
Epoch: 18995, Training Loss: 0.01084
Epoch: 18995, Training Loss: 0.01336
Epoch: 18996, Training Loss: 0.01032
Epoch: 18996, Training Loss: 0.01082
Epoch: 18996, Training Loss: 0.01084
Epoch: 18996, Training Loss: 0.01336
Epoch: 18997, Training Loss: 0.01031
Epoch: 18997, Training Loss: 0.01082
Epoch: 18997, Training Loss: 0.01084
Epoch: 18997, Training Loss: 0.01336
Epoch: 18998, Training Loss: 0.01031
Epoch: 18998, Training Loss: 0.01082
Epoch: 18998, Training Loss: 0.01083
Epoch: 18998, Training Loss: 0.01336
Epoch: 18999, Training Loss: 0.01031
Epoch: 18999, Training Loss: 0.01082
Epoch: 18999, Training Loss: 0.01083
Epoch: 18999, Training Loss: 0.01336
Epoch: 19000, Training Loss: 0.01031
Epoch: 19000, Training Loss: 0.01082
Epoch: 19000, Training Loss: 0.01083
Epoch: 19000, Training Loss: 0.01336
Epoch: 19001, Training Loss: 0.01031
Epoch: 19001, Training Loss: 0.01082
Epoch: 19001, Training Loss: 0.01083
Epoch: 19001, Training Loss: 0.01336
Epoch: 19002, Training Loss: 0.01031
Epoch: 19002, Training Loss: 0.01082
Epoch: 19002, Training Loss: 0.01083
Epoch: 19002, Training Loss: 0.01336
Epoch: 19003, Training Loss: 0.01031
Epoch: 19003, Training Loss: 0.01082
Epoch: 19003, Training Loss: 0.01083
Epoch: 19003, Training Loss: 0.01336
Epoch: 19004, Training Loss: 0.01031
Epoch: 19004, Training Loss: 0.01082
Epoch: 19004, Training Loss: 0.01083
Epoch: 19004, Training Loss: 0.01336
Epoch: 19005, Training Loss: 0.01031
Epoch: 19005, Training Loss: 0.01082
Epoch: 19005, Training Loss: 0.01083
Epoch: 19005, Training Loss: 0.01335
Epoch: 19006, Training Loss: 0.01031
Epoch: 19006, Training Loss: 0.01082
Epoch: 19006, Training Loss: 0.01083
Epoch: 19006, Training Loss: 0.01335
Epoch: 19007, Training Loss: 0.01031
Epoch: 19007, Training Loss: 0.01082
Epoch: 19007, Training Loss: 0.01083
Epoch: 19007, Training Loss: 0.01335
Epoch: 19008, Training Loss: 0.01031
Epoch: 19008, Training Loss: 0.01082
Epoch: 19008, Training Loss: 0.01083
Epoch: 19008, Training Loss: 0.01335
Epoch: 19009, Training Loss: 0.01031
Epoch: 19009, Training Loss: 0.01082
Epoch: 19009, Training Loss: 0.01083
Epoch: 19009, Training Loss: 0.01335
Epoch: 19010, Training Loss: 0.01031
Epoch: 19010, Training Loss: 0.01082
Epoch: 19010, Training Loss: 0.01083
Epoch: 19010, Training Loss: 0.01335
Epoch: 19011, Training Loss: 0.01031
Epoch: 19011, Training Loss: 0.01082
Epoch: 19011, Training Loss: 0.01083
Epoch: 19011, Training Loss: 0.01335
Epoch: 19012, Training Loss: 0.01031
Epoch: 19012, Training Loss: 0.01082
Epoch: 19012, Training Loss: 0.01083
Epoch: 19012, Training Loss: 0.01335
Epoch: 19013, Training Loss: 0.01031
Epoch: 19013, Training Loss: 0.01082
Epoch: 19013, Training Loss: 0.01083
Epoch: 19013, Training Loss: 0.01335
Epoch: 19014, Training Loss: 0.01031
Epoch: 19014, Training Loss: 0.01081
Epoch: 19014, Training Loss: 0.01083
Epoch: 19014, Training Loss: 0.01335
Epoch: 19015, Training Loss: 0.01031
Epoch: 19015, Training Loss: 0.01081
Epoch: 19015, Training Loss: 0.01083
Epoch: 19015, Training Loss: 0.01335
Epoch: 19016, Training Loss: 0.01031
Epoch: 19016, Training Loss: 0.01081
Epoch: 19016, Training Loss: 0.01083
Epoch: 19016, Training Loss: 0.01335
Epoch: 19017, Training Loss: 0.01031
Epoch: 19017, Training Loss: 0.01081
Epoch: 19017, Training Loss: 0.01083
Epoch: 19017, Training Loss: 0.01335
Epoch: 19018, Training Loss: 0.01031
Epoch: 19018, Training Loss: 0.01081
Epoch: 19018, Training Loss: 0.01083
Epoch: 19018, Training Loss: 0.01335
Epoch: 19019, Training Loss: 0.01031
Epoch: 19019, Training Loss: 0.01081
Epoch: 19019, Training Loss: 0.01083
Epoch: 19019, Training Loss: 0.01335
Epoch: 19020, Training Loss: 0.01031
Epoch: 19020, Training Loss: 0.01081
Epoch: 19020, Training Loss: 0.01083
Epoch: 19020, Training Loss: 0.01335
Epoch: 19021, Training Loss: 0.01031
Epoch: 19021, Training Loss: 0.01081
Epoch: 19021, Training Loss: 0.01083
Epoch: 19021, Training Loss: 0.01335
Epoch: 19022, Training Loss: 0.01031
Epoch: 19022, Training Loss: 0.01081
Epoch: 19022, Training Loss: 0.01083
Epoch: 19022, Training Loss: 0.01335
Epoch: 19023, Training Loss: 0.01031
Epoch: 19023, Training Loss: 0.01081
Epoch: 19023, Training Loss: 0.01083
Epoch: 19023, Training Loss: 0.01335
Epoch: 19024, Training Loss: 0.01031
Epoch: 19024, Training Loss: 0.01081
Epoch: 19024, Training Loss: 0.01083
Epoch: 19024, Training Loss: 0.01335
Epoch: 19025, Training Loss: 0.01031
Epoch: 19025, Training Loss: 0.01081
Epoch: 19025, Training Loss: 0.01083
Epoch: 19025, Training Loss: 0.01335
Epoch: 19026, Training Loss: 0.01031
Epoch: 19026, Training Loss: 0.01081
Epoch: 19026, Training Loss: 0.01083
Epoch: 19026, Training Loss: 0.01335
Epoch: 19027, Training Loss: 0.01031
Epoch: 19027, Training Loss: 0.01081
Epoch: 19027, Training Loss: 0.01082
Epoch: 19027, Training Loss: 0.01335
Epoch: 19028, Training Loss: 0.01030
Epoch: 19028, Training Loss: 0.01081
Epoch: 19028, Training Loss: 0.01082
Epoch: 19028, Training Loss: 0.01334
Epoch: 19029, Training Loss: 0.01030
Epoch: 19029, Training Loss: 0.01081
Epoch: 19029, Training Loss: 0.01082
Epoch: 19029, Training Loss: 0.01334
Epoch: 19030, Training Loss: 0.01030
Epoch: 19030, Training Loss: 0.01081
Epoch: 19030, Training Loss: 0.01082
Epoch: 19030, Training Loss: 0.01334
Epoch: 19031, Training Loss: 0.01030
Epoch: 19031, Training Loss: 0.01081
Epoch: 19031, Training Loss: 0.01082
Epoch: 19031, Training Loss: 0.01334
Epoch: 19032, Training Loss: 0.01030
Epoch: 19032, Training Loss: 0.01081
Epoch: 19032, Training Loss: 0.01082
Epoch: 19032, Training Loss: 0.01334
Epoch: 19033, Training Loss: 0.01030
Epoch: 19033, Training Loss: 0.01081
Epoch: 19033, Training Loss: 0.01082
Epoch: 19033, Training Loss: 0.01334
Epoch: 19034, Training Loss: 0.01030
Epoch: 19034, Training Loss: 0.01081
Epoch: 19034, Training Loss: 0.01082
Epoch: 19034, Training Loss: 0.01334
Epoch: 19035, Training Loss: 0.01030
Epoch: 19035, Training Loss: 0.01081
Epoch: 19035, Training Loss: 0.01082
Epoch: 19035, Training Loss: 0.01334
Epoch: 19036, Training Loss: 0.01030
Epoch: 19036, Training Loss: 0.01081
Epoch: 19036, Training Loss: 0.01082
Epoch: 19036, Training Loss: 0.01334
Epoch: 19037, Training Loss: 0.01030
Epoch: 19037, Training Loss: 0.01081
Epoch: 19037, Training Loss: 0.01082
Epoch: 19037, Training Loss: 0.01334
Epoch: 19038, Training Loss: 0.01030
Epoch: 19038, Training Loss: 0.01081
Epoch: 19038, Training Loss: 0.01082
Epoch: 19038, Training Loss: 0.01334
Epoch: 19039, Training Loss: 0.01030
Epoch: 19039, Training Loss: 0.01081
Epoch: 19039, Training Loss: 0.01082
Epoch: 19039, Training Loss: 0.01334
Epoch: 19040, Training Loss: 0.01030
Epoch: 19040, Training Loss: 0.01081
Epoch: 19040, Training Loss: 0.01082
Epoch: 19040, Training Loss: 0.01334
Epoch: 19041, Training Loss: 0.01030
Epoch: 19041, Training Loss: 0.01081
Epoch: 19041, Training Loss: 0.01082
Epoch: 19041, Training Loss: 0.01334
Epoch: 19042, Training Loss: 0.01030
Epoch: 19042, Training Loss: 0.01081
Epoch: 19042, Training Loss: 0.01082
Epoch: 19042, Training Loss: 0.01334
Epoch: 19043, Training Loss: 0.01030
Epoch: 19043, Training Loss: 0.01081
Epoch: 19043, Training Loss: 0.01082
Epoch: 19043, Training Loss: 0.01334
Epoch: 19044, Training Loss: 0.01030
Epoch: 19044, Training Loss: 0.01080
Epoch: 19044, Training Loss: 0.01082
Epoch: 19044, Training Loss: 0.01334
Epoch: 19045, Training Loss: 0.01030
Epoch: 19045, Training Loss: 0.01080
Epoch: 19045, Training Loss: 0.01082
Epoch: 19045, Training Loss: 0.01334
Epoch: 19046, Training Loss: 0.01030
Epoch: 19046, Training Loss: 0.01080
Epoch: 19046, Training Loss: 0.01082
Epoch: 19046, Training Loss: 0.01334
Epoch: 19047, Training Loss: 0.01030
Epoch: 19047, Training Loss: 0.01080
Epoch: 19047, Training Loss: 0.01082
Epoch: 19047, Training Loss: 0.01334
Epoch: 19048, Training Loss: 0.01030
Epoch: 19048, Training Loss: 0.01080
Epoch: 19048, Training Loss: 0.01082
Epoch: 19048, Training Loss: 0.01334
Epoch: 19049, Training Loss: 0.01030
Epoch: 19049, Training Loss: 0.01080
Epoch: 19049, Training Loss: 0.01082
Epoch: 19049, Training Loss: 0.01334
Epoch: 19050, Training Loss: 0.01030
Epoch: 19050, Training Loss: 0.01080
Epoch: 19050, Training Loss: 0.01082
Epoch: 19050, Training Loss: 0.01334
Epoch: 19051, Training Loss: 0.01030
Epoch: 19051, Training Loss: 0.01080
Epoch: 19051, Training Loss: 0.01082
Epoch: 19051, Training Loss: 0.01334
Epoch: 19052, Training Loss: 0.01030
Epoch: 19052, Training Loss: 0.01080
Epoch: 19052, Training Loss: 0.01082
Epoch: 19052, Training Loss: 0.01333
Epoch: 19053, Training Loss: 0.01030
Epoch: 19053, Training Loss: 0.01080
Epoch: 19053, Training Loss: 0.01082
Epoch: 19053, Training Loss: 0.01333
Epoch: 19054, Training Loss: 0.01030
Epoch: 19054, Training Loss: 0.01080
Epoch: 19054, Training Loss: 0.01082
Epoch: 19054, Training Loss: 0.01333
Epoch: 19055, Training Loss: 0.01030
Epoch: 19055, Training Loss: 0.01080
Epoch: 19055, Training Loss: 0.01082
Epoch: 19055, Training Loss: 0.01333
Epoch: 19056, Training Loss: 0.01030
Epoch: 19056, Training Loss: 0.01080
Epoch: 19056, Training Loss: 0.01082
Epoch: 19056, Training Loss: 0.01333
Epoch: 19057, Training Loss: 0.01030
Epoch: 19057, Training Loss: 0.01080
Epoch: 19057, Training Loss: 0.01081
Epoch: 19057, Training Loss: 0.01333
Epoch: 19058, Training Loss: 0.01030
Epoch: 19058, Training Loss: 0.01080
Epoch: 19058, Training Loss: 0.01081
Epoch: 19058, Training Loss: 0.01333
Epoch: 19059, Training Loss: 0.01030
Epoch: 19059, Training Loss: 0.01080
Epoch: 19059, Training Loss: 0.01081
Epoch: 19059, Training Loss: 0.01333
Epoch: 19060, Training Loss: 0.01029
Epoch: 19060, Training Loss: 0.01080
Epoch: 19060, Training Loss: 0.01081
Epoch: 19060, Training Loss: 0.01333
Epoch: 19061, Training Loss: 0.01029
Epoch: 19061, Training Loss: 0.01080
Epoch: 19061, Training Loss: 0.01081
Epoch: 19061, Training Loss: 0.01333
Epoch: 19062, Training Loss: 0.01029
Epoch: 19062, Training Loss: 0.01080
Epoch: 19062, Training Loss: 0.01081
Epoch: 19062, Training Loss: 0.01333
Epoch: 19063, Training Loss: 0.01029
Epoch: 19063, Training Loss: 0.01080
Epoch: 19063, Training Loss: 0.01081
Epoch: 19063, Training Loss: 0.01333
Epoch: 19064, Training Loss: 0.01029
Epoch: 19064, Training Loss: 0.01080
Epoch: 19064, Training Loss: 0.01081
Epoch: 19064, Training Loss: 0.01333
Epoch: 19065, Training Loss: 0.01029
Epoch: 19065, Training Loss: 0.01080
Epoch: 19065, Training Loss: 0.01081
Epoch: 19065, Training Loss: 0.01333
Epoch: 19066, Training Loss: 0.01029
Epoch: 19066, Training Loss: 0.01080
Epoch: 19066, Training Loss: 0.01081
Epoch: 19066, Training Loss: 0.01333
Epoch: 19067, Training Loss: 0.01029
Epoch: 19067, Training Loss: 0.01080
Epoch: 19067, Training Loss: 0.01081
Epoch: 19067, Training Loss: 0.01333
Epoch: 19068, Training Loss: 0.01029
Epoch: 19068, Training Loss: 0.01080
Epoch: 19068, Training Loss: 0.01081
Epoch: 19068, Training Loss: 0.01333
Epoch: 19069, Training Loss: 0.01029
Epoch: 19069, Training Loss: 0.01080
Epoch: 19069, Training Loss: 0.01081
Epoch: 19069, Training Loss: 0.01333
Epoch: 19070, Training Loss: 0.01029
Epoch: 19070, Training Loss: 0.01080
Epoch: 19070, Training Loss: 0.01081
Epoch: 19070, Training Loss: 0.01333
Epoch: 19071, Training Loss: 0.01029
Epoch: 19071, Training Loss: 0.01080
Epoch: 19071, Training Loss: 0.01081
Epoch: 19071, Training Loss: 0.01333
Epoch: 19072, Training Loss: 0.01029
Epoch: 19072, Training Loss: 0.01080
Epoch: 19072, Training Loss: 0.01081
Epoch: 19072, Training Loss: 0.01333
Epoch: 19073, Training Loss: 0.01029
Epoch: 19073, Training Loss: 0.01080
Epoch: 19073, Training Loss: 0.01081
Epoch: 19073, Training Loss: 0.01333
Epoch: 19074, Training Loss: 0.01029
Epoch: 19074, Training Loss: 0.01079
Epoch: 19074, Training Loss: 0.01081
Epoch: 19074, Training Loss: 0.01333
Epoch: 19075, Training Loss: 0.01029
Epoch: 19075, Training Loss: 0.01079
Epoch: 19075, Training Loss: 0.01081
Epoch: 19075, Training Loss: 0.01332
Epoch: 19076, Training Loss: 0.01029
Epoch: 19076, Training Loss: 0.01079
Epoch: 19076, Training Loss: 0.01081
Epoch: 19076, Training Loss: 0.01332
Epoch: 19077, Training Loss: 0.01029
Epoch: 19077, Training Loss: 0.01079
Epoch: 19077, Training Loss: 0.01081
Epoch: 19077, Training Loss: 0.01332
Epoch: 19078, Training Loss: 0.01029
Epoch: 19078, Training Loss: 0.01079
Epoch: 19078, Training Loss: 0.01081
Epoch: 19078, Training Loss: 0.01332
[... repetitive per-epoch output elided: four per-pattern losses printed each epoch, decreasing very slowly through epochs 19078-19321 ...]
Epoch: 19321, Training Loss: 0.01021
Epoch: 19321, Training Loss: 0.01071
Epoch: 19321, Training Loss: 0.01073
Epoch: 19321, Training Loss: 0.01322
Epoch: 19322, Training Loss: 0.01021
Epoch: 19322, Training Loss: 0.01071
Epoch: 19322, Training Loss: 0.01073
Epoch: 19322, Training Loss: 0.01322
Epoch: 19323, Training Loss: 0.01021
Epoch: 19323, Training Loss: 0.01071
Epoch: 19323, Training Loss: 0.01073
Epoch: 19323, Training Loss: 0.01322
Epoch: 19324, Training Loss: 0.01021
Epoch: 19324, Training Loss: 0.01071
Epoch: 19324, Training Loss: 0.01073
Epoch: 19324, Training Loss: 0.01322
Epoch: 19325, Training Loss: 0.01021
Epoch: 19325, Training Loss: 0.01071
Epoch: 19325, Training Loss: 0.01073
Epoch: 19325, Training Loss: 0.01322
Epoch: 19326, Training Loss: 0.01021
Epoch: 19326, Training Loss: 0.01071
Epoch: 19326, Training Loss: 0.01072
Epoch: 19326, Training Loss: 0.01322
Epoch: 19327, Training Loss: 0.01021
Epoch: 19327, Training Loss: 0.01071
Epoch: 19327, Training Loss: 0.01072
Epoch: 19327, Training Loss: 0.01322
Epoch: 19328, Training Loss: 0.01021
Epoch: 19328, Training Loss: 0.01071
Epoch: 19328, Training Loss: 0.01072
Epoch: 19328, Training Loss: 0.01322
Epoch: 19329, Training Loss: 0.01021
Epoch: 19329, Training Loss: 0.01071
Epoch: 19329, Training Loss: 0.01072
Epoch: 19329, Training Loss: 0.01322
Epoch: 19330, Training Loss: 0.01021
Epoch: 19330, Training Loss: 0.01071
Epoch: 19330, Training Loss: 0.01072
Epoch: 19330, Training Loss: 0.01322
Epoch: 19331, Training Loss: 0.01021
Epoch: 19331, Training Loss: 0.01071
Epoch: 19331, Training Loss: 0.01072
Epoch: 19331, Training Loss: 0.01322
Epoch: 19332, Training Loss: 0.01021
Epoch: 19332, Training Loss: 0.01071
Epoch: 19332, Training Loss: 0.01072
Epoch: 19332, Training Loss: 0.01322
Epoch: 19333, Training Loss: 0.01021
Epoch: 19333, Training Loss: 0.01071
Epoch: 19333, Training Loss: 0.01072
Epoch: 19333, Training Loss: 0.01322
Epoch: 19334, Training Loss: 0.01021
Epoch: 19334, Training Loss: 0.01071
Epoch: 19334, Training Loss: 0.01072
Epoch: 19334, Training Loss: 0.01322
Epoch: 19335, Training Loss: 0.01021
Epoch: 19335, Training Loss: 0.01071
Epoch: 19335, Training Loss: 0.01072
Epoch: 19335, Training Loss: 0.01322
Epoch: 19336, Training Loss: 0.01021
Epoch: 19336, Training Loss: 0.01071
Epoch: 19336, Training Loss: 0.01072
Epoch: 19336, Training Loss: 0.01322
Epoch: 19337, Training Loss: 0.01021
Epoch: 19337, Training Loss: 0.01071
Epoch: 19337, Training Loss: 0.01072
Epoch: 19337, Training Loss: 0.01322
Epoch: 19338, Training Loss: 0.01021
Epoch: 19338, Training Loss: 0.01071
Epoch: 19338, Training Loss: 0.01072
Epoch: 19338, Training Loss: 0.01322
Epoch: 19339, Training Loss: 0.01021
Epoch: 19339, Training Loss: 0.01071
Epoch: 19339, Training Loss: 0.01072
Epoch: 19339, Training Loss: 0.01321
Epoch: 19340, Training Loss: 0.01021
Epoch: 19340, Training Loss: 0.01071
Epoch: 19340, Training Loss: 0.01072
Epoch: 19340, Training Loss: 0.01321
Epoch: 19341, Training Loss: 0.01021
Epoch: 19341, Training Loss: 0.01071
Epoch: 19341, Training Loss: 0.01072
Epoch: 19341, Training Loss: 0.01321
Epoch: 19342, Training Loss: 0.01021
Epoch: 19342, Training Loss: 0.01071
Epoch: 19342, Training Loss: 0.01072
Epoch: 19342, Training Loss: 0.01321
Epoch: 19343, Training Loss: 0.01021
Epoch: 19343, Training Loss: 0.01071
Epoch: 19343, Training Loss: 0.01072
Epoch: 19343, Training Loss: 0.01321
Epoch: 19344, Training Loss: 0.01021
Epoch: 19344, Training Loss: 0.01070
Epoch: 19344, Training Loss: 0.01072
Epoch: 19344, Training Loss: 0.01321
Epoch: 19345, Training Loss: 0.01021
Epoch: 19345, Training Loss: 0.01070
Epoch: 19345, Training Loss: 0.01072
Epoch: 19345, Training Loss: 0.01321
Epoch: 19346, Training Loss: 0.01021
Epoch: 19346, Training Loss: 0.01070
Epoch: 19346, Training Loss: 0.01072
Epoch: 19346, Training Loss: 0.01321
Epoch: 19347, Training Loss: 0.01021
Epoch: 19347, Training Loss: 0.01070
Epoch: 19347, Training Loss: 0.01072
Epoch: 19347, Training Loss: 0.01321
Epoch: 19348, Training Loss: 0.01021
Epoch: 19348, Training Loss: 0.01070
Epoch: 19348, Training Loss: 0.01072
Epoch: 19348, Training Loss: 0.01321
Epoch: 19349, Training Loss: 0.01021
Epoch: 19349, Training Loss: 0.01070
Epoch: 19349, Training Loss: 0.01072
Epoch: 19349, Training Loss: 0.01321
Epoch: 19350, Training Loss: 0.01021
Epoch: 19350, Training Loss: 0.01070
Epoch: 19350, Training Loss: 0.01072
Epoch: 19350, Training Loss: 0.01321
Epoch: 19351, Training Loss: 0.01021
Epoch: 19351, Training Loss: 0.01070
Epoch: 19351, Training Loss: 0.01072
Epoch: 19351, Training Loss: 0.01321
Epoch: 19352, Training Loss: 0.01020
Epoch: 19352, Training Loss: 0.01070
Epoch: 19352, Training Loss: 0.01072
Epoch: 19352, Training Loss: 0.01321
Epoch: 19353, Training Loss: 0.01020
Epoch: 19353, Training Loss: 0.01070
Epoch: 19353, Training Loss: 0.01072
Epoch: 19353, Training Loss: 0.01321
Epoch: 19354, Training Loss: 0.01020
Epoch: 19354, Training Loss: 0.01070
Epoch: 19354, Training Loss: 0.01072
Epoch: 19354, Training Loss: 0.01321
Epoch: 19355, Training Loss: 0.01020
Epoch: 19355, Training Loss: 0.01070
Epoch: 19355, Training Loss: 0.01072
Epoch: 19355, Training Loss: 0.01321
Epoch: 19356, Training Loss: 0.01020
Epoch: 19356, Training Loss: 0.01070
Epoch: 19356, Training Loss: 0.01072
Epoch: 19356, Training Loss: 0.01321
Epoch: 19357, Training Loss: 0.01020
Epoch: 19357, Training Loss: 0.01070
Epoch: 19357, Training Loss: 0.01071
Epoch: 19357, Training Loss: 0.01321
Epoch: 19358, Training Loss: 0.01020
Epoch: 19358, Training Loss: 0.01070
Epoch: 19358, Training Loss: 0.01071
Epoch: 19358, Training Loss: 0.01321
Epoch: 19359, Training Loss: 0.01020
Epoch: 19359, Training Loss: 0.01070
Epoch: 19359, Training Loss: 0.01071
Epoch: 19359, Training Loss: 0.01321
Epoch: 19360, Training Loss: 0.01020
Epoch: 19360, Training Loss: 0.01070
Epoch: 19360, Training Loss: 0.01071
Epoch: 19360, Training Loss: 0.01321
Epoch: 19361, Training Loss: 0.01020
Epoch: 19361, Training Loss: 0.01070
Epoch: 19361, Training Loss: 0.01071
Epoch: 19361, Training Loss: 0.01321
Epoch: 19362, Training Loss: 0.01020
Epoch: 19362, Training Loss: 0.01070
Epoch: 19362, Training Loss: 0.01071
Epoch: 19362, Training Loss: 0.01321
Epoch: 19363, Training Loss: 0.01020
Epoch: 19363, Training Loss: 0.01070
Epoch: 19363, Training Loss: 0.01071
Epoch: 19363, Training Loss: 0.01320
Epoch: 19364, Training Loss: 0.01020
Epoch: 19364, Training Loss: 0.01070
Epoch: 19364, Training Loss: 0.01071
Epoch: 19364, Training Loss: 0.01320
Epoch: 19365, Training Loss: 0.01020
Epoch: 19365, Training Loss: 0.01070
Epoch: 19365, Training Loss: 0.01071
Epoch: 19365, Training Loss: 0.01320
Epoch: 19366, Training Loss: 0.01020
Epoch: 19366, Training Loss: 0.01070
Epoch: 19366, Training Loss: 0.01071
Epoch: 19366, Training Loss: 0.01320
Epoch: 19367, Training Loss: 0.01020
Epoch: 19367, Training Loss: 0.01070
Epoch: 19367, Training Loss: 0.01071
Epoch: 19367, Training Loss: 0.01320
Epoch: 19368, Training Loss: 0.01020
Epoch: 19368, Training Loss: 0.01070
Epoch: 19368, Training Loss: 0.01071
Epoch: 19368, Training Loss: 0.01320
Epoch: 19369, Training Loss: 0.01020
Epoch: 19369, Training Loss: 0.01070
Epoch: 19369, Training Loss: 0.01071
Epoch: 19369, Training Loss: 0.01320
Epoch: 19370, Training Loss: 0.01020
Epoch: 19370, Training Loss: 0.01070
Epoch: 19370, Training Loss: 0.01071
Epoch: 19370, Training Loss: 0.01320
Epoch: 19371, Training Loss: 0.01020
Epoch: 19371, Training Loss: 0.01070
Epoch: 19371, Training Loss: 0.01071
Epoch: 19371, Training Loss: 0.01320
Epoch: 19372, Training Loss: 0.01020
Epoch: 19372, Training Loss: 0.01070
Epoch: 19372, Training Loss: 0.01071
Epoch: 19372, Training Loss: 0.01320
Epoch: 19373, Training Loss: 0.01020
Epoch: 19373, Training Loss: 0.01070
Epoch: 19373, Training Loss: 0.01071
Epoch: 19373, Training Loss: 0.01320
Epoch: 19374, Training Loss: 0.01020
Epoch: 19374, Training Loss: 0.01069
Epoch: 19374, Training Loss: 0.01071
Epoch: 19374, Training Loss: 0.01320
Epoch: 19375, Training Loss: 0.01020
Epoch: 19375, Training Loss: 0.01069
Epoch: 19375, Training Loss: 0.01071
Epoch: 19375, Training Loss: 0.01320
Epoch: 19376, Training Loss: 0.01020
Epoch: 19376, Training Loss: 0.01069
Epoch: 19376, Training Loss: 0.01071
Epoch: 19376, Training Loss: 0.01320
Epoch: 19377, Training Loss: 0.01020
Epoch: 19377, Training Loss: 0.01069
Epoch: 19377, Training Loss: 0.01071
Epoch: 19377, Training Loss: 0.01320
Epoch: 19378, Training Loss: 0.01020
Epoch: 19378, Training Loss: 0.01069
Epoch: 19378, Training Loss: 0.01071
Epoch: 19378, Training Loss: 0.01320
Epoch: 19379, Training Loss: 0.01020
Epoch: 19379, Training Loss: 0.01069
Epoch: 19379, Training Loss: 0.01071
Epoch: 19379, Training Loss: 0.01320
Epoch: 19380, Training Loss: 0.01020
Epoch: 19380, Training Loss: 0.01069
Epoch: 19380, Training Loss: 0.01071
Epoch: 19380, Training Loss: 0.01320
Epoch: 19381, Training Loss: 0.01020
Epoch: 19381, Training Loss: 0.01069
Epoch: 19381, Training Loss: 0.01071
Epoch: 19381, Training Loss: 0.01320
Epoch: 19382, Training Loss: 0.01020
Epoch: 19382, Training Loss: 0.01069
Epoch: 19382, Training Loss: 0.01071
Epoch: 19382, Training Loss: 0.01320
Epoch: 19383, Training Loss: 0.01020
Epoch: 19383, Training Loss: 0.01069
Epoch: 19383, Training Loss: 0.01071
Epoch: 19383, Training Loss: 0.01320
Epoch: 19384, Training Loss: 0.01020
Epoch: 19384, Training Loss: 0.01069
Epoch: 19384, Training Loss: 0.01071
Epoch: 19384, Training Loss: 0.01320
Epoch: 19385, Training Loss: 0.01019
Epoch: 19385, Training Loss: 0.01069
Epoch: 19385, Training Loss: 0.01071
Epoch: 19385, Training Loss: 0.01320
Epoch: 19386, Training Loss: 0.01019
Epoch: 19386, Training Loss: 0.01069
Epoch: 19386, Training Loss: 0.01071
Epoch: 19386, Training Loss: 0.01320
Epoch: 19387, Training Loss: 0.01019
Epoch: 19387, Training Loss: 0.01069
Epoch: 19387, Training Loss: 0.01070
Epoch: 19387, Training Loss: 0.01319
Epoch: 19388, Training Loss: 0.01019
Epoch: 19388, Training Loss: 0.01069
Epoch: 19388, Training Loss: 0.01070
Epoch: 19388, Training Loss: 0.01319
Epoch: 19389, Training Loss: 0.01019
Epoch: 19389, Training Loss: 0.01069
Epoch: 19389, Training Loss: 0.01070
Epoch: 19389, Training Loss: 0.01319
Epoch: 19390, Training Loss: 0.01019
Epoch: 19390, Training Loss: 0.01069
Epoch: 19390, Training Loss: 0.01070
Epoch: 19390, Training Loss: 0.01319
Epoch: 19391, Training Loss: 0.01019
Epoch: 19391, Training Loss: 0.01069
Epoch: 19391, Training Loss: 0.01070
Epoch: 19391, Training Loss: 0.01319
Epoch: 19392, Training Loss: 0.01019
Epoch: 19392, Training Loss: 0.01069
Epoch: 19392, Training Loss: 0.01070
Epoch: 19392, Training Loss: 0.01319
Epoch: 19393, Training Loss: 0.01019
Epoch: 19393, Training Loss: 0.01069
Epoch: 19393, Training Loss: 0.01070
Epoch: 19393, Training Loss: 0.01319
Epoch: 19394, Training Loss: 0.01019
Epoch: 19394, Training Loss: 0.01069
Epoch: 19394, Training Loss: 0.01070
Epoch: 19394, Training Loss: 0.01319
Epoch: 19395, Training Loss: 0.01019
Epoch: 19395, Training Loss: 0.01069
Epoch: 19395, Training Loss: 0.01070
Epoch: 19395, Training Loss: 0.01319
Epoch: 19396, Training Loss: 0.01019
Epoch: 19396, Training Loss: 0.01069
Epoch: 19396, Training Loss: 0.01070
Epoch: 19396, Training Loss: 0.01319
Epoch: 19397, Training Loss: 0.01019
Epoch: 19397, Training Loss: 0.01069
Epoch: 19397, Training Loss: 0.01070
Epoch: 19397, Training Loss: 0.01319
Epoch: 19398, Training Loss: 0.01019
Epoch: 19398, Training Loss: 0.01069
Epoch: 19398, Training Loss: 0.01070
Epoch: 19398, Training Loss: 0.01319
Epoch: 19399, Training Loss: 0.01019
Epoch: 19399, Training Loss: 0.01069
Epoch: 19399, Training Loss: 0.01070
Epoch: 19399, Training Loss: 0.01319
Epoch: 19400, Training Loss: 0.01019
Epoch: 19400, Training Loss: 0.01069
Epoch: 19400, Training Loss: 0.01070
Epoch: 19400, Training Loss: 0.01319
Epoch: 19401, Training Loss: 0.01019
Epoch: 19401, Training Loss: 0.01069
Epoch: 19401, Training Loss: 0.01070
Epoch: 19401, Training Loss: 0.01319
Epoch: 19402, Training Loss: 0.01019
Epoch: 19402, Training Loss: 0.01069
Epoch: 19402, Training Loss: 0.01070
Epoch: 19402, Training Loss: 0.01319
Epoch: 19403, Training Loss: 0.01019
Epoch: 19403, Training Loss: 0.01069
Epoch: 19403, Training Loss: 0.01070
Epoch: 19403, Training Loss: 0.01319
Epoch: 19404, Training Loss: 0.01019
Epoch: 19404, Training Loss: 0.01069
Epoch: 19404, Training Loss: 0.01070
Epoch: 19404, Training Loss: 0.01319
Epoch: 19405, Training Loss: 0.01019
Epoch: 19405, Training Loss: 0.01068
Epoch: 19405, Training Loss: 0.01070
Epoch: 19405, Training Loss: 0.01319
Epoch: 19406, Training Loss: 0.01019
Epoch: 19406, Training Loss: 0.01068
Epoch: 19406, Training Loss: 0.01070
Epoch: 19406, Training Loss: 0.01319
Epoch: 19407, Training Loss: 0.01019
Epoch: 19407, Training Loss: 0.01068
Epoch: 19407, Training Loss: 0.01070
Epoch: 19407, Training Loss: 0.01319
Epoch: 19408, Training Loss: 0.01019
Epoch: 19408, Training Loss: 0.01068
Epoch: 19408, Training Loss: 0.01070
Epoch: 19408, Training Loss: 0.01319
Epoch: 19409, Training Loss: 0.01019
Epoch: 19409, Training Loss: 0.01068
Epoch: 19409, Training Loss: 0.01070
Epoch: 19409, Training Loss: 0.01319
Epoch: 19410, Training Loss: 0.01019
Epoch: 19410, Training Loss: 0.01068
Epoch: 19410, Training Loss: 0.01070
Epoch: 19410, Training Loss: 0.01319
Epoch: 19411, Training Loss: 0.01019
Epoch: 19411, Training Loss: 0.01068
Epoch: 19411, Training Loss: 0.01070
Epoch: 19411, Training Loss: 0.01319
Epoch: 19412, Training Loss: 0.01019
Epoch: 19412, Training Loss: 0.01068
Epoch: 19412, Training Loss: 0.01070
Epoch: 19412, Training Loss: 0.01318
Epoch: 19413, Training Loss: 0.01019
Epoch: 19413, Training Loss: 0.01068
Epoch: 19413, Training Loss: 0.01070
Epoch: 19413, Training Loss: 0.01318
Epoch: 19414, Training Loss: 0.01019
Epoch: 19414, Training Loss: 0.01068
Epoch: 19414, Training Loss: 0.01070
Epoch: 19414, Training Loss: 0.01318
Epoch: 19415, Training Loss: 0.01019
Epoch: 19415, Training Loss: 0.01068
Epoch: 19415, Training Loss: 0.01070
Epoch: 19415, Training Loss: 0.01318
Epoch: 19416, Training Loss: 0.01019
Epoch: 19416, Training Loss: 0.01068
Epoch: 19416, Training Loss: 0.01070
Epoch: 19416, Training Loss: 0.01318
Epoch: 19417, Training Loss: 0.01019
Epoch: 19417, Training Loss: 0.01068
Epoch: 19417, Training Loss: 0.01069
Epoch: 19417, Training Loss: 0.01318
Epoch: 19418, Training Loss: 0.01018
Epoch: 19418, Training Loss: 0.01068
Epoch: 19418, Training Loss: 0.01069
Epoch: 19418, Training Loss: 0.01318
Epoch: 19419, Training Loss: 0.01018
Epoch: 19419, Training Loss: 0.01068
Epoch: 19419, Training Loss: 0.01069
Epoch: 19419, Training Loss: 0.01318
Epoch: 19420, Training Loss: 0.01018
Epoch: 19420, Training Loss: 0.01068
Epoch: 19420, Training Loss: 0.01069
Epoch: 19420, Training Loss: 0.01318
Epoch: 19421, Training Loss: 0.01018
Epoch: 19421, Training Loss: 0.01068
Epoch: 19421, Training Loss: 0.01069
Epoch: 19421, Training Loss: 0.01318
Epoch: 19422, Training Loss: 0.01018
Epoch: 19422, Training Loss: 0.01068
Epoch: 19422, Training Loss: 0.01069
Epoch: 19422, Training Loss: 0.01318
Epoch: 19423, Training Loss: 0.01018
Epoch: 19423, Training Loss: 0.01068
Epoch: 19423, Training Loss: 0.01069
Epoch: 19423, Training Loss: 0.01318
Epoch: 19424, Training Loss: 0.01018
Epoch: 19424, Training Loss: 0.01068
Epoch: 19424, Training Loss: 0.01069
Epoch: 19424, Training Loss: 0.01318
Epoch: 19425, Training Loss: 0.01018
Epoch: 19425, Training Loss: 0.01068
Epoch: 19425, Training Loss: 0.01069
Epoch: 19425, Training Loss: 0.01318
Epoch: 19426, Training Loss: 0.01018
Epoch: 19426, Training Loss: 0.01068
Epoch: 19426, Training Loss: 0.01069
Epoch: 19426, Training Loss: 0.01318
Epoch: 19427, Training Loss: 0.01018
Epoch: 19427, Training Loss: 0.01068
Epoch: 19427, Training Loss: 0.01069
Epoch: 19427, Training Loss: 0.01318
Epoch: 19428, Training Loss: 0.01018
Epoch: 19428, Training Loss: 0.01068
Epoch: 19428, Training Loss: 0.01069
Epoch: 19428, Training Loss: 0.01318
Epoch: 19429, Training Loss: 0.01018
Epoch: 19429, Training Loss: 0.01068
Epoch: 19429, Training Loss: 0.01069
Epoch: 19429, Training Loss: 0.01318
Epoch: 19430, Training Loss: 0.01018
Epoch: 19430, Training Loss: 0.01068
Epoch: 19430, Training Loss: 0.01069
Epoch: 19430, Training Loss: 0.01318
Epoch: 19431, Training Loss: 0.01018
Epoch: 19431, Training Loss: 0.01068
Epoch: 19431, Training Loss: 0.01069
Epoch: 19431, Training Loss: 0.01318
Epoch: 19432, Training Loss: 0.01018
Epoch: 19432, Training Loss: 0.01068
Epoch: 19432, Training Loss: 0.01069
Epoch: 19432, Training Loss: 0.01318
Epoch: 19433, Training Loss: 0.01018
Epoch: 19433, Training Loss: 0.01068
Epoch: 19433, Training Loss: 0.01069
Epoch: 19433, Training Loss: 0.01318
Epoch: 19434, Training Loss: 0.01018
Epoch: 19434, Training Loss: 0.01068
Epoch: 19434, Training Loss: 0.01069
Epoch: 19434, Training Loss: 0.01318
Epoch: 19435, Training Loss: 0.01018
Epoch: 19435, Training Loss: 0.01068
Epoch: 19435, Training Loss: 0.01069
Epoch: 19435, Training Loss: 0.01318
Epoch: 19436, Training Loss: 0.01018
Epoch: 19436, Training Loss: 0.01067
Epoch: 19436, Training Loss: 0.01069
Epoch: 19436, Training Loss: 0.01317
Epoch: 19437, Training Loss: 0.01018
Epoch: 19437, Training Loss: 0.01067
Epoch: 19437, Training Loss: 0.01069
Epoch: 19437, Training Loss: 0.01317
Epoch: 19438, Training Loss: 0.01018
Epoch: 19438, Training Loss: 0.01067
Epoch: 19438, Training Loss: 0.01069
Epoch: 19438, Training Loss: 0.01317
Epoch: 19439, Training Loss: 0.01018
Epoch: 19439, Training Loss: 0.01067
Epoch: 19439, Training Loss: 0.01069
Epoch: 19439, Training Loss: 0.01317
Epoch: 19440, Training Loss: 0.01018
Epoch: 19440, Training Loss: 0.01067
Epoch: 19440, Training Loss: 0.01069
Epoch: 19440, Training Loss: 0.01317
Epoch: 19441, Training Loss: 0.01018
Epoch: 19441, Training Loss: 0.01067
Epoch: 19441, Training Loss: 0.01069
Epoch: 19441, Training Loss: 0.01317
Epoch: 19442, Training Loss: 0.01018
Epoch: 19442, Training Loss: 0.01067
Epoch: 19442, Training Loss: 0.01069
Epoch: 19442, Training Loss: 0.01317
Epoch: 19443, Training Loss: 0.01018
Epoch: 19443, Training Loss: 0.01067
Epoch: 19443, Training Loss: 0.01069
Epoch: 19443, Training Loss: 0.01317
Epoch: 19444, Training Loss: 0.01018
Epoch: 19444, Training Loss: 0.01067
Epoch: 19444, Training Loss: 0.01069
Epoch: 19444, Training Loss: 0.01317
Epoch: 19445, Training Loss: 0.01018
Epoch: 19445, Training Loss: 0.01067
Epoch: 19445, Training Loss: 0.01069
Epoch: 19445, Training Loss: 0.01317
Epoch: 19446, Training Loss: 0.01018
Epoch: 19446, Training Loss: 0.01067
Epoch: 19446, Training Loss: 0.01069
Epoch: 19446, Training Loss: 0.01317
Epoch: 19447, Training Loss: 0.01018
Epoch: 19447, Training Loss: 0.01067
Epoch: 19447, Training Loss: 0.01069
Epoch: 19447, Training Loss: 0.01317
Epoch: 19448, Training Loss: 0.01018
Epoch: 19448, Training Loss: 0.01067
Epoch: 19448, Training Loss: 0.01068
Epoch: 19448, Training Loss: 0.01317
Epoch: 19449, Training Loss: 0.01018
Epoch: 19449, Training Loss: 0.01067
Epoch: 19449, Training Loss: 0.01068
Epoch: 19449, Training Loss: 0.01317
Epoch: 19450, Training Loss: 0.01018
Epoch: 19450, Training Loss: 0.01067
Epoch: 19450, Training Loss: 0.01068
Epoch: 19450, Training Loss: 0.01317
Epoch: 19451, Training Loss: 0.01017
Epoch: 19451, Training Loss: 0.01067
Epoch: 19451, Training Loss: 0.01068
Epoch: 19451, Training Loss: 0.01317
Epoch: 19452, Training Loss: 0.01017
Epoch: 19452, Training Loss: 0.01067
Epoch: 19452, Training Loss: 0.01068
Epoch: 19452, Training Loss: 0.01317
Epoch: 19453, Training Loss: 0.01017
Epoch: 19453, Training Loss: 0.01067
Epoch: 19453, Training Loss: 0.01068
Epoch: 19453, Training Loss: 0.01317
Epoch: 19454, Training Loss: 0.01017
Epoch: 19454, Training Loss: 0.01067
Epoch: 19454, Training Loss: 0.01068
Epoch: 19454, Training Loss: 0.01317
Epoch: 19455, Training Loss: 0.01017
Epoch: 19455, Training Loss: 0.01067
Epoch: 19455, Training Loss: 0.01068
Epoch: 19455, Training Loss: 0.01317
Epoch: 19456, Training Loss: 0.01017
Epoch: 19456, Training Loss: 0.01067
Epoch: 19456, Training Loss: 0.01068
Epoch: 19456, Training Loss: 0.01317
Epoch: 19457, Training Loss: 0.01017
Epoch: 19457, Training Loss: 0.01067
Epoch: 19457, Training Loss: 0.01068
Epoch: 19457, Training Loss: 0.01317
Epoch: 19458, Training Loss: 0.01017
Epoch: 19458, Training Loss: 0.01067
Epoch: 19458, Training Loss: 0.01068
Epoch: 19458, Training Loss: 0.01317
Epoch: 19459, Training Loss: 0.01017
Epoch: 19459, Training Loss: 0.01067
Epoch: 19459, Training Loss: 0.01068
Epoch: 19459, Training Loss: 0.01317
Epoch: 19460, Training Loss: 0.01017
Epoch: 19460, Training Loss: 0.01067
Epoch: 19460, Training Loss: 0.01068
Epoch: 19460, Training Loss: 0.01317
Epoch: 19461, Training Loss: 0.01017
Epoch: 19461, Training Loss: 0.01067
Epoch: 19461, Training Loss: 0.01068
Epoch: 19461, Training Loss: 0.01316
Epoch: 19462, Training Loss: 0.01017
Epoch: 19462, Training Loss: 0.01067
Epoch: 19462, Training Loss: 0.01068
Epoch: 19462, Training Loss: 0.01316
Epoch: 19463, Training Loss: 0.01017
Epoch: 19463, Training Loss: 0.01067
Epoch: 19463, Training Loss: 0.01068
Epoch: 19463, Training Loss: 0.01316
Epoch: 19464, Training Loss: 0.01017
Epoch: 19464, Training Loss: 0.01067
Epoch: 19464, Training Loss: 0.01068
Epoch: 19464, Training Loss: 0.01316
Epoch: 19465, Training Loss: 0.01017
Epoch: 19465, Training Loss: 0.01067
Epoch: 19465, Training Loss: 0.01068
Epoch: 19465, Training Loss: 0.01316
Epoch: 19466, Training Loss: 0.01017
Epoch: 19466, Training Loss: 0.01066
Epoch: 19466, Training Loss: 0.01068
Epoch: 19466, Training Loss: 0.01316
Epoch: 19467, Training Loss: 0.01017
Epoch: 19467, Training Loss: 0.01066
Epoch: 19467, Training Loss: 0.01068
Epoch: 19467, Training Loss: 0.01316
Epoch: 19468, Training Loss: 0.01017
Epoch: 19468, Training Loss: 0.01066
Epoch: 19468, Training Loss: 0.01068
Epoch: 19468, Training Loss: 0.01316
Epoch: 19469, Training Loss: 0.01017
Epoch: 19469, Training Loss: 0.01066
Epoch: 19469, Training Loss: 0.01068
Epoch: 19469, Training Loss: 0.01316
Epoch: 19470, Training Loss: 0.01017
Epoch: 19470, Training Loss: 0.01066
Epoch: 19470, Training Loss: 0.01068
Epoch: 19470, Training Loss: 0.01316
Epoch: 19471, Training Loss: 0.01017
Epoch: 19471, Training Loss: 0.01066
Epoch: 19471, Training Loss: 0.01068
Epoch: 19471, Training Loss: 0.01316
Epoch: 19472, Training Loss: 0.01017
Epoch: 19472, Training Loss: 0.01066
Epoch: 19472, Training Loss: 0.01068
Epoch: 19472, Training Loss: 0.01316
Epoch: 19473, Training Loss: 0.01017
Epoch: 19473, Training Loss: 0.01066
Epoch: 19473, Training Loss: 0.01068
Epoch: 19473, Training Loss: 0.01316
Epoch: 19474, Training Loss: 0.01017
Epoch: 19474, Training Loss: 0.01066
Epoch: 19474, Training Loss: 0.01068
Epoch: 19474, Training Loss: 0.01316
Epoch: 19475, Training Loss: 0.01017
Epoch: 19475, Training Loss: 0.01066
Epoch: 19475, Training Loss: 0.01068
Epoch: 19475, Training Loss: 0.01316
Epoch: 19476, Training Loss: 0.01017
Epoch: 19476, Training Loss: 0.01066
Epoch: 19476, Training Loss: 0.01068
Epoch: 19476, Training Loss: 0.01316
Epoch: 19477, Training Loss: 0.01017
Epoch: 19477, Training Loss: 0.01066
Epoch: 19477, Training Loss: 0.01068
Epoch: 19477, Training Loss: 0.01316
Epoch: 19478, Training Loss: 0.01017
Epoch: 19478, Training Loss: 0.01066
Epoch: 19478, Training Loss: 0.01068
Epoch: 19478, Training Loss: 0.01316
Epoch: 19479, Training Loss: 0.01017
Epoch: 19479, Training Loss: 0.01066
Epoch: 19479, Training Loss: 0.01067
Epoch: 19479, Training Loss: 0.01316
Epoch: 19480, Training Loss: 0.01017
Epoch: 19480, Training Loss: 0.01066
Epoch: 19480, Training Loss: 0.01067
Epoch: 19480, Training Loss: 0.01316
Epoch: 19481, Training Loss: 0.01017
Epoch: 19481, Training Loss: 0.01066
Epoch: 19481, Training Loss: 0.01067
Epoch: 19481, Training Loss: 0.01316
Epoch: 19482, Training Loss: 0.01017
Epoch: 19482, Training Loss: 0.01066
Epoch: 19482, Training Loss: 0.01067
Epoch: 19482, Training Loss: 0.01316
Epoch: 19483, Training Loss: 0.01017
Epoch: 19483, Training Loss: 0.01066
Epoch: 19483, Training Loss: 0.01067
Epoch: 19483, Training Loss: 0.01316
Epoch: 19484, Training Loss: 0.01016
Epoch: 19484, Training Loss: 0.01066
Epoch: 19484, Training Loss: 0.01067
Epoch: 19484, Training Loss: 0.01316
Epoch: 19485, Training Loss: 0.01016
Epoch: 19485, Training Loss: 0.01066
Epoch: 19485, Training Loss: 0.01067
Epoch: 19485, Training Loss: 0.01315
Epoch: 19486, Training Loss: 0.01016
Epoch: 19486, Training Loss: 0.01066
Epoch: 19486, Training Loss: 0.01067
Epoch: 19486, Training Loss: 0.01315
Epoch: 19487, Training Loss: 0.01016
Epoch: 19487, Training Loss: 0.01066
Epoch: 19487, Training Loss: 0.01067
Epoch: 19487, Training Loss: 0.01315
Epoch: 19488, Training Loss: 0.01016
Epoch: 19488, Training Loss: 0.01066
Epoch: 19488, Training Loss: 0.01067
Epoch: 19488, Training Loss: 0.01315
Epoch: 19489, Training Loss: 0.01016
Epoch: 19489, Training Loss: 0.01066
Epoch: 19489, Training Loss: 0.01067
Epoch: 19489, Training Loss: 0.01315
Epoch: 19490, Training Loss: 0.01016
Epoch: 19490, Training Loss: 0.01066
Epoch: 19490, Training Loss: 0.01067
Epoch: 19490, Training Loss: 0.01315
Epoch: 19491, Training Loss: 0.01016
Epoch: 19491, Training Loss: 0.01066
Epoch: 19491, Training Loss: 0.01067
Epoch: 19491, Training Loss: 0.01315
Epoch: 19492, Training Loss: 0.01016
Epoch: 19492, Training Loss: 0.01066
Epoch: 19492, Training Loss: 0.01067
Epoch: 19492, Training Loss: 0.01315
Epoch: 19493, Training Loss: 0.01016
Epoch: 19493, Training Loss: 0.01066
Epoch: 19493, Training Loss: 0.01067
Epoch: 19493, Training Loss: 0.01315
Epoch: 19494, Training Loss: 0.01016
Epoch: 19494, Training Loss: 0.01066
Epoch: 19494, Training Loss: 0.01067
Epoch: 19494, Training Loss: 0.01315
Epoch: 19495, Training Loss: 0.01016
Epoch: 19495, Training Loss: 0.01066
Epoch: 19495, Training Loss: 0.01067
Epoch: 19495, Training Loss: 0.01315
Epoch: 19496, Training Loss: 0.01016
Epoch: 19496, Training Loss: 0.01066
Epoch: 19496, Training Loss: 0.01067
Epoch: 19496, Training Loss: 0.01315
Epoch: 19497, Training Loss: 0.01016
Epoch: 19497, Training Loss: 0.01065
Epoch: 19497, Training Loss: 0.01067
Epoch: 19497, Training Loss: 0.01315
Epoch: 19498, Training Loss: 0.01016
Epoch: 19498, Training Loss: 0.01065
Epoch: 19498, Training Loss: 0.01067
Epoch: 19498, Training Loss: 0.01315
Epoch: 19499, Training Loss: 0.01016
Epoch: 19499, Training Loss: 0.01065
Epoch: 19499, Training Loss: 0.01067
Epoch: 19499, Training Loss: 0.01315
Epoch: 19500, Training Loss: 0.01016
Epoch: 19500, Training Loss: 0.01065
Epoch: 19500, Training Loss: 0.01067
Epoch: 19500, Training Loss: 0.01315
Epoch: 19501, Training Loss: 0.01016
Epoch: 19501, Training Loss: 0.01065
Epoch: 19501, Training Loss: 0.01067
Epoch: 19501, Training Loss: 0.01315
Epoch: 19502, Training Loss: 0.01016
Epoch: 19502, Training Loss: 0.01065
Epoch: 19502, Training Loss: 0.01067
Epoch: 19502, Training Loss: 0.01315
Epoch: 19503, Training Loss: 0.01016
Epoch: 19503, Training Loss: 0.01065
Epoch: 19503, Training Loss: 0.01067
Epoch: 19503, Training Loss: 0.01315
Epoch: 19504, Training Loss: 0.01016
Epoch: 19504, Training Loss: 0.01065
Epoch: 19504, Training Loss: 0.01067
Epoch: 19504, Training Loss: 0.01315
Epoch: 19505, Training Loss: 0.01016
Epoch: 19505, Training Loss: 0.01065
Epoch: 19505, Training Loss: 0.01067
Epoch: 19505, Training Loss: 0.01315
Epoch: 19506, Training Loss: 0.01016
Epoch: 19506, Training Loss: 0.01065
Epoch: 19506, Training Loss: 0.01067
Epoch: 19506, Training Loss: 0.01315
Epoch: 19507, Training Loss: 0.01016
Epoch: 19507, Training Loss: 0.01065
Epoch: 19507, Training Loss: 0.01067
Epoch: 19507, Training Loss: 0.01315
Epoch: 19508, Training Loss: 0.01016
Epoch: 19508, Training Loss: 0.01065
Epoch: 19508, Training Loss: 0.01067
Epoch: 19508, Training Loss: 0.01315
Epoch: 19509, Training Loss: 0.01016
Epoch: 19509, Training Loss: 0.01065
Epoch: 19509, Training Loss: 0.01067
Epoch: 19509, Training Loss: 0.01315
Epoch: 19510, Training Loss: 0.01016
Epoch: 19510, Training Loss: 0.01065
[... verbose per-sample loss output elided: over epochs 19510-19754 the four per-pattern training losses decrease slowly, from approximately 0.01016/0.01066/0.01066/0.01314 to approximately 0.01008/0.01057/0.01059/0.01305 ...]
Epoch: 19754, Training Loss: 0.01057
Epoch: 19754, Training Loss: 0.01059
Epoch: 19754, Training Loss: 0.01305
Epoch: 19755, Training Loss: 0.01008
Epoch: 19755, Training Loss: 0.01057
Epoch: 19755, Training Loss: 0.01059
Epoch: 19755, Training Loss: 0.01305
Epoch: 19756, Training Loss: 0.01008
Epoch: 19756, Training Loss: 0.01057
Epoch: 19756, Training Loss: 0.01059
Epoch: 19756, Training Loss: 0.01305
Epoch: 19757, Training Loss: 0.01008
Epoch: 19757, Training Loss: 0.01057
Epoch: 19757, Training Loss: 0.01059
Epoch: 19757, Training Loss: 0.01305
Epoch: 19758, Training Loss: 0.01008
Epoch: 19758, Training Loss: 0.01057
Epoch: 19758, Training Loss: 0.01059
Epoch: 19758, Training Loss: 0.01305
Epoch: 19759, Training Loss: 0.01008
Epoch: 19759, Training Loss: 0.01057
Epoch: 19759, Training Loss: 0.01058
Epoch: 19759, Training Loss: 0.01304
Epoch: 19760, Training Loss: 0.01008
Epoch: 19760, Training Loss: 0.01057
Epoch: 19760, Training Loss: 0.01058
Epoch: 19760, Training Loss: 0.01304
Epoch: 19761, Training Loss: 0.01008
Epoch: 19761, Training Loss: 0.01057
Epoch: 19761, Training Loss: 0.01058
Epoch: 19761, Training Loss: 0.01304
Epoch: 19762, Training Loss: 0.01008
Epoch: 19762, Training Loss: 0.01057
Epoch: 19762, Training Loss: 0.01058
Epoch: 19762, Training Loss: 0.01304
Epoch: 19763, Training Loss: 0.01008
Epoch: 19763, Training Loss: 0.01057
Epoch: 19763, Training Loss: 0.01058
Epoch: 19763, Training Loss: 0.01304
Epoch: 19764, Training Loss: 0.01008
Epoch: 19764, Training Loss: 0.01057
Epoch: 19764, Training Loss: 0.01058
Epoch: 19764, Training Loss: 0.01304
Epoch: 19765, Training Loss: 0.01008
Epoch: 19765, Training Loss: 0.01057
Epoch: 19765, Training Loss: 0.01058
Epoch: 19765, Training Loss: 0.01304
Epoch: 19766, Training Loss: 0.01008
Epoch: 19766, Training Loss: 0.01057
Epoch: 19766, Training Loss: 0.01058
Epoch: 19766, Training Loss: 0.01304
Epoch: 19767, Training Loss: 0.01008
Epoch: 19767, Training Loss: 0.01057
Epoch: 19767, Training Loss: 0.01058
Epoch: 19767, Training Loss: 0.01304
Epoch: 19768, Training Loss: 0.01008
Epoch: 19768, Training Loss: 0.01057
Epoch: 19768, Training Loss: 0.01058
Epoch: 19768, Training Loss: 0.01304
Epoch: 19769, Training Loss: 0.01008
Epoch: 19769, Training Loss: 0.01057
Epoch: 19769, Training Loss: 0.01058
Epoch: 19769, Training Loss: 0.01304
Epoch: 19770, Training Loss: 0.01008
Epoch: 19770, Training Loss: 0.01057
Epoch: 19770, Training Loss: 0.01058
Epoch: 19770, Training Loss: 0.01304
Epoch: 19771, Training Loss: 0.01008
Epoch: 19771, Training Loss: 0.01057
Epoch: 19771, Training Loss: 0.01058
Epoch: 19771, Training Loss: 0.01304
Epoch: 19772, Training Loss: 0.01008
Epoch: 19772, Training Loss: 0.01057
Epoch: 19772, Training Loss: 0.01058
Epoch: 19772, Training Loss: 0.01304
Epoch: 19773, Training Loss: 0.01008
Epoch: 19773, Training Loss: 0.01057
Epoch: 19773, Training Loss: 0.01058
Epoch: 19773, Training Loss: 0.01304
Epoch: 19774, Training Loss: 0.01008
Epoch: 19774, Training Loss: 0.01057
Epoch: 19774, Training Loss: 0.01058
Epoch: 19774, Training Loss: 0.01304
Epoch: 19775, Training Loss: 0.01008
Epoch: 19775, Training Loss: 0.01057
Epoch: 19775, Training Loss: 0.01058
Epoch: 19775, Training Loss: 0.01304
Epoch: 19776, Training Loss: 0.01008
Epoch: 19776, Training Loss: 0.01057
Epoch: 19776, Training Loss: 0.01058
Epoch: 19776, Training Loss: 0.01304
Epoch: 19777, Training Loss: 0.01008
Epoch: 19777, Training Loss: 0.01057
Epoch: 19777, Training Loss: 0.01058
Epoch: 19777, Training Loss: 0.01304
Epoch: 19778, Training Loss: 0.01008
Epoch: 19778, Training Loss: 0.01056
Epoch: 19778, Training Loss: 0.01058
Epoch: 19778, Training Loss: 0.01304
Epoch: 19779, Training Loss: 0.01008
Epoch: 19779, Training Loss: 0.01056
Epoch: 19779, Training Loss: 0.01058
Epoch: 19779, Training Loss: 0.01304
Epoch: 19780, Training Loss: 0.01008
Epoch: 19780, Training Loss: 0.01056
Epoch: 19780, Training Loss: 0.01058
Epoch: 19780, Training Loss: 0.01304
Epoch: 19781, Training Loss: 0.01008
Epoch: 19781, Training Loss: 0.01056
Epoch: 19781, Training Loss: 0.01058
Epoch: 19781, Training Loss: 0.01304
Epoch: 19782, Training Loss: 0.01008
Epoch: 19782, Training Loss: 0.01056
Epoch: 19782, Training Loss: 0.01058
Epoch: 19782, Training Loss: 0.01304
Epoch: 19783, Training Loss: 0.01008
Epoch: 19783, Training Loss: 0.01056
Epoch: 19783, Training Loss: 0.01058
Epoch: 19783, Training Loss: 0.01304
Epoch: 19784, Training Loss: 0.01008
Epoch: 19784, Training Loss: 0.01056
Epoch: 19784, Training Loss: 0.01058
Epoch: 19784, Training Loss: 0.01303
Epoch: 19785, Training Loss: 0.01008
Epoch: 19785, Training Loss: 0.01056
Epoch: 19785, Training Loss: 0.01058
Epoch: 19785, Training Loss: 0.01303
Epoch: 19786, Training Loss: 0.01008
Epoch: 19786, Training Loss: 0.01056
Epoch: 19786, Training Loss: 0.01058
Epoch: 19786, Training Loss: 0.01303
Epoch: 19787, Training Loss: 0.01007
Epoch: 19787, Training Loss: 0.01056
Epoch: 19787, Training Loss: 0.01058
Epoch: 19787, Training Loss: 0.01303
Epoch: 19788, Training Loss: 0.01007
Epoch: 19788, Training Loss: 0.01056
Epoch: 19788, Training Loss: 0.01058
Epoch: 19788, Training Loss: 0.01303
Epoch: 19789, Training Loss: 0.01007
Epoch: 19789, Training Loss: 0.01056
Epoch: 19789, Training Loss: 0.01058
Epoch: 19789, Training Loss: 0.01303
Epoch: 19790, Training Loss: 0.01007
Epoch: 19790, Training Loss: 0.01056
Epoch: 19790, Training Loss: 0.01057
Epoch: 19790, Training Loss: 0.01303
Epoch: 19791, Training Loss: 0.01007
Epoch: 19791, Training Loss: 0.01056
Epoch: 19791, Training Loss: 0.01057
Epoch: 19791, Training Loss: 0.01303
Epoch: 19792, Training Loss: 0.01007
Epoch: 19792, Training Loss: 0.01056
Epoch: 19792, Training Loss: 0.01057
Epoch: 19792, Training Loss: 0.01303
Epoch: 19793, Training Loss: 0.01007
Epoch: 19793, Training Loss: 0.01056
Epoch: 19793, Training Loss: 0.01057
Epoch: 19793, Training Loss: 0.01303
Epoch: 19794, Training Loss: 0.01007
Epoch: 19794, Training Loss: 0.01056
Epoch: 19794, Training Loss: 0.01057
Epoch: 19794, Training Loss: 0.01303
Epoch: 19795, Training Loss: 0.01007
Epoch: 19795, Training Loss: 0.01056
Epoch: 19795, Training Loss: 0.01057
Epoch: 19795, Training Loss: 0.01303
Epoch: 19796, Training Loss: 0.01007
Epoch: 19796, Training Loss: 0.01056
Epoch: 19796, Training Loss: 0.01057
Epoch: 19796, Training Loss: 0.01303
Epoch: 19797, Training Loss: 0.01007
Epoch: 19797, Training Loss: 0.01056
Epoch: 19797, Training Loss: 0.01057
Epoch: 19797, Training Loss: 0.01303
Epoch: 19798, Training Loss: 0.01007
Epoch: 19798, Training Loss: 0.01056
Epoch: 19798, Training Loss: 0.01057
Epoch: 19798, Training Loss: 0.01303
Epoch: 19799, Training Loss: 0.01007
Epoch: 19799, Training Loss: 0.01056
Epoch: 19799, Training Loss: 0.01057
Epoch: 19799, Training Loss: 0.01303
Epoch: 19800, Training Loss: 0.01007
Epoch: 19800, Training Loss: 0.01056
Epoch: 19800, Training Loss: 0.01057
Epoch: 19800, Training Loss: 0.01303
Epoch: 19801, Training Loss: 0.01007
Epoch: 19801, Training Loss: 0.01056
Epoch: 19801, Training Loss: 0.01057
Epoch: 19801, Training Loss: 0.01303
Epoch: 19802, Training Loss: 0.01007
Epoch: 19802, Training Loss: 0.01056
Epoch: 19802, Training Loss: 0.01057
Epoch: 19802, Training Loss: 0.01303
Epoch: 19803, Training Loss: 0.01007
Epoch: 19803, Training Loss: 0.01056
Epoch: 19803, Training Loss: 0.01057
Epoch: 19803, Training Loss: 0.01303
Epoch: 19804, Training Loss: 0.01007
Epoch: 19804, Training Loss: 0.01056
Epoch: 19804, Training Loss: 0.01057
Epoch: 19804, Training Loss: 0.01303
Epoch: 19805, Training Loss: 0.01007
Epoch: 19805, Training Loss: 0.01056
Epoch: 19805, Training Loss: 0.01057
Epoch: 19805, Training Loss: 0.01303
Epoch: 19806, Training Loss: 0.01007
Epoch: 19806, Training Loss: 0.01056
Epoch: 19806, Training Loss: 0.01057
Epoch: 19806, Training Loss: 0.01303
Epoch: 19807, Training Loss: 0.01007
Epoch: 19807, Training Loss: 0.01056
Epoch: 19807, Training Loss: 0.01057
Epoch: 19807, Training Loss: 0.01303
Epoch: 19808, Training Loss: 0.01007
Epoch: 19808, Training Loss: 0.01056
Epoch: 19808, Training Loss: 0.01057
Epoch: 19808, Training Loss: 0.01303
Epoch: 19809, Training Loss: 0.01007
Epoch: 19809, Training Loss: 0.01056
Epoch: 19809, Training Loss: 0.01057
Epoch: 19809, Training Loss: 0.01302
Epoch: 19810, Training Loss: 0.01007
Epoch: 19810, Training Loss: 0.01055
Epoch: 19810, Training Loss: 0.01057
Epoch: 19810, Training Loss: 0.01302
Epoch: 19811, Training Loss: 0.01007
Epoch: 19811, Training Loss: 0.01055
Epoch: 19811, Training Loss: 0.01057
Epoch: 19811, Training Loss: 0.01302
Epoch: 19812, Training Loss: 0.01007
Epoch: 19812, Training Loss: 0.01055
Epoch: 19812, Training Loss: 0.01057
Epoch: 19812, Training Loss: 0.01302
Epoch: 19813, Training Loss: 0.01007
Epoch: 19813, Training Loss: 0.01055
Epoch: 19813, Training Loss: 0.01057
Epoch: 19813, Training Loss: 0.01302
Epoch: 19814, Training Loss: 0.01007
Epoch: 19814, Training Loss: 0.01055
Epoch: 19814, Training Loss: 0.01057
Epoch: 19814, Training Loss: 0.01302
Epoch: 19815, Training Loss: 0.01007
Epoch: 19815, Training Loss: 0.01055
Epoch: 19815, Training Loss: 0.01057
Epoch: 19815, Training Loss: 0.01302
Epoch: 19816, Training Loss: 0.01007
Epoch: 19816, Training Loss: 0.01055
Epoch: 19816, Training Loss: 0.01057
Epoch: 19816, Training Loss: 0.01302
Epoch: 19817, Training Loss: 0.01007
Epoch: 19817, Training Loss: 0.01055
Epoch: 19817, Training Loss: 0.01057
Epoch: 19817, Training Loss: 0.01302
Epoch: 19818, Training Loss: 0.01007
Epoch: 19818, Training Loss: 0.01055
Epoch: 19818, Training Loss: 0.01057
Epoch: 19818, Training Loss: 0.01302
Epoch: 19819, Training Loss: 0.01007
Epoch: 19819, Training Loss: 0.01055
Epoch: 19819, Training Loss: 0.01057
Epoch: 19819, Training Loss: 0.01302
Epoch: 19820, Training Loss: 0.01007
Epoch: 19820, Training Loss: 0.01055
Epoch: 19820, Training Loss: 0.01057
Epoch: 19820, Training Loss: 0.01302
Epoch: 19821, Training Loss: 0.01006
Epoch: 19821, Training Loss: 0.01055
Epoch: 19821, Training Loss: 0.01057
Epoch: 19821, Training Loss: 0.01302
Epoch: 19822, Training Loss: 0.01006
Epoch: 19822, Training Loss: 0.01055
Epoch: 19822, Training Loss: 0.01056
Epoch: 19822, Training Loss: 0.01302
Epoch: 19823, Training Loss: 0.01006
Epoch: 19823, Training Loss: 0.01055
Epoch: 19823, Training Loss: 0.01056
Epoch: 19823, Training Loss: 0.01302
Epoch: 19824, Training Loss: 0.01006
Epoch: 19824, Training Loss: 0.01055
Epoch: 19824, Training Loss: 0.01056
Epoch: 19824, Training Loss: 0.01302
Epoch: 19825, Training Loss: 0.01006
Epoch: 19825, Training Loss: 0.01055
Epoch: 19825, Training Loss: 0.01056
Epoch: 19825, Training Loss: 0.01302
Epoch: 19826, Training Loss: 0.01006
Epoch: 19826, Training Loss: 0.01055
Epoch: 19826, Training Loss: 0.01056
Epoch: 19826, Training Loss: 0.01302
Epoch: 19827, Training Loss: 0.01006
Epoch: 19827, Training Loss: 0.01055
Epoch: 19827, Training Loss: 0.01056
Epoch: 19827, Training Loss: 0.01302
Epoch: 19828, Training Loss: 0.01006
Epoch: 19828, Training Loss: 0.01055
Epoch: 19828, Training Loss: 0.01056
Epoch: 19828, Training Loss: 0.01302
Epoch: 19829, Training Loss: 0.01006
Epoch: 19829, Training Loss: 0.01055
Epoch: 19829, Training Loss: 0.01056
Epoch: 19829, Training Loss: 0.01302
Epoch: 19830, Training Loss: 0.01006
Epoch: 19830, Training Loss: 0.01055
Epoch: 19830, Training Loss: 0.01056
Epoch: 19830, Training Loss: 0.01302
Epoch: 19831, Training Loss: 0.01006
Epoch: 19831, Training Loss: 0.01055
Epoch: 19831, Training Loss: 0.01056
Epoch: 19831, Training Loss: 0.01302
Epoch: 19832, Training Loss: 0.01006
Epoch: 19832, Training Loss: 0.01055
Epoch: 19832, Training Loss: 0.01056
Epoch: 19832, Training Loss: 0.01302
Epoch: 19833, Training Loss: 0.01006
Epoch: 19833, Training Loss: 0.01055
Epoch: 19833, Training Loss: 0.01056
Epoch: 19833, Training Loss: 0.01302
Epoch: 19834, Training Loss: 0.01006
Epoch: 19834, Training Loss: 0.01055
Epoch: 19834, Training Loss: 0.01056
Epoch: 19834, Training Loss: 0.01301
Epoch: 19835, Training Loss: 0.01006
Epoch: 19835, Training Loss: 0.01055
Epoch: 19835, Training Loss: 0.01056
Epoch: 19835, Training Loss: 0.01301
Epoch: 19836, Training Loss: 0.01006
Epoch: 19836, Training Loss: 0.01055
Epoch: 19836, Training Loss: 0.01056
Epoch: 19836, Training Loss: 0.01301
Epoch: 19837, Training Loss: 0.01006
Epoch: 19837, Training Loss: 0.01055
Epoch: 19837, Training Loss: 0.01056
Epoch: 19837, Training Loss: 0.01301
Epoch: 19838, Training Loss: 0.01006
Epoch: 19838, Training Loss: 0.01055
Epoch: 19838, Training Loss: 0.01056
Epoch: 19838, Training Loss: 0.01301
Epoch: 19839, Training Loss: 0.01006
Epoch: 19839, Training Loss: 0.01055
Epoch: 19839, Training Loss: 0.01056
Epoch: 19839, Training Loss: 0.01301
Epoch: 19840, Training Loss: 0.01006
Epoch: 19840, Training Loss: 0.01055
Epoch: 19840, Training Loss: 0.01056
Epoch: 19840, Training Loss: 0.01301
Epoch: 19841, Training Loss: 0.01006
Epoch: 19841, Training Loss: 0.01054
Epoch: 19841, Training Loss: 0.01056
Epoch: 19841, Training Loss: 0.01301
Epoch: 19842, Training Loss: 0.01006
Epoch: 19842, Training Loss: 0.01054
Epoch: 19842, Training Loss: 0.01056
Epoch: 19842, Training Loss: 0.01301
Epoch: 19843, Training Loss: 0.01006
Epoch: 19843, Training Loss: 0.01054
Epoch: 19843, Training Loss: 0.01056
Epoch: 19843, Training Loss: 0.01301
Epoch: 19844, Training Loss: 0.01006
Epoch: 19844, Training Loss: 0.01054
Epoch: 19844, Training Loss: 0.01056
Epoch: 19844, Training Loss: 0.01301
Epoch: 19845, Training Loss: 0.01006
Epoch: 19845, Training Loss: 0.01054
Epoch: 19845, Training Loss: 0.01056
Epoch: 19845, Training Loss: 0.01301
Epoch: 19846, Training Loss: 0.01006
Epoch: 19846, Training Loss: 0.01054
Epoch: 19846, Training Loss: 0.01056
Epoch: 19846, Training Loss: 0.01301
Epoch: 19847, Training Loss: 0.01006
Epoch: 19847, Training Loss: 0.01054
Epoch: 19847, Training Loss: 0.01056
Epoch: 19847, Training Loss: 0.01301
Epoch: 19848, Training Loss: 0.01006
Epoch: 19848, Training Loss: 0.01054
Epoch: 19848, Training Loss: 0.01056
Epoch: 19848, Training Loss: 0.01301
Epoch: 19849, Training Loss: 0.01006
Epoch: 19849, Training Loss: 0.01054
Epoch: 19849, Training Loss: 0.01056
Epoch: 19849, Training Loss: 0.01301
Epoch: 19850, Training Loss: 0.01006
Epoch: 19850, Training Loss: 0.01054
Epoch: 19850, Training Loss: 0.01056
Epoch: 19850, Training Loss: 0.01301
Epoch: 19851, Training Loss: 0.01006
Epoch: 19851, Training Loss: 0.01054
Epoch: 19851, Training Loss: 0.01056
Epoch: 19851, Training Loss: 0.01301
Epoch: 19852, Training Loss: 0.01006
Epoch: 19852, Training Loss: 0.01054
Epoch: 19852, Training Loss: 0.01056
Epoch: 19852, Training Loss: 0.01301
Epoch: 19853, Training Loss: 0.01006
Epoch: 19853, Training Loss: 0.01054
Epoch: 19853, Training Loss: 0.01056
Epoch: 19853, Training Loss: 0.01301
Epoch: 19854, Training Loss: 0.01006
Epoch: 19854, Training Loss: 0.01054
Epoch: 19854, Training Loss: 0.01055
Epoch: 19854, Training Loss: 0.01301
Epoch: 19855, Training Loss: 0.01005
Epoch: 19855, Training Loss: 0.01054
Epoch: 19855, Training Loss: 0.01055
Epoch: 19855, Training Loss: 0.01301
Epoch: 19856, Training Loss: 0.01005
Epoch: 19856, Training Loss: 0.01054
Epoch: 19856, Training Loss: 0.01055
Epoch: 19856, Training Loss: 0.01301
Epoch: 19857, Training Loss: 0.01005
Epoch: 19857, Training Loss: 0.01054
Epoch: 19857, Training Loss: 0.01055
Epoch: 19857, Training Loss: 0.01301
Epoch: 19858, Training Loss: 0.01005
Epoch: 19858, Training Loss: 0.01054
Epoch: 19858, Training Loss: 0.01055
Epoch: 19858, Training Loss: 0.01301
Epoch: 19859, Training Loss: 0.01005
Epoch: 19859, Training Loss: 0.01054
Epoch: 19859, Training Loss: 0.01055
Epoch: 19859, Training Loss: 0.01301
Epoch: 19860, Training Loss: 0.01005
Epoch: 19860, Training Loss: 0.01054
Epoch: 19860, Training Loss: 0.01055
Epoch: 19860, Training Loss: 0.01300
Epoch: 19861, Training Loss: 0.01005
Epoch: 19861, Training Loss: 0.01054
Epoch: 19861, Training Loss: 0.01055
Epoch: 19861, Training Loss: 0.01300
Epoch: 19862, Training Loss: 0.01005
Epoch: 19862, Training Loss: 0.01054
Epoch: 19862, Training Loss: 0.01055
Epoch: 19862, Training Loss: 0.01300
Epoch: 19863, Training Loss: 0.01005
Epoch: 19863, Training Loss: 0.01054
Epoch: 19863, Training Loss: 0.01055
Epoch: 19863, Training Loss: 0.01300
Epoch: 19864, Training Loss: 0.01005
Epoch: 19864, Training Loss: 0.01054
Epoch: 19864, Training Loss: 0.01055
Epoch: 19864, Training Loss: 0.01300
Epoch: 19865, Training Loss: 0.01005
Epoch: 19865, Training Loss: 0.01054
Epoch: 19865, Training Loss: 0.01055
Epoch: 19865, Training Loss: 0.01300
Epoch: 19866, Training Loss: 0.01005
Epoch: 19866, Training Loss: 0.01054
Epoch: 19866, Training Loss: 0.01055
Epoch: 19866, Training Loss: 0.01300
Epoch: 19867, Training Loss: 0.01005
Epoch: 19867, Training Loss: 0.01054
Epoch: 19867, Training Loss: 0.01055
Epoch: 19867, Training Loss: 0.01300
Epoch: 19868, Training Loss: 0.01005
Epoch: 19868, Training Loss: 0.01054
Epoch: 19868, Training Loss: 0.01055
Epoch: 19868, Training Loss: 0.01300
Epoch: 19869, Training Loss: 0.01005
Epoch: 19869, Training Loss: 0.01054
Epoch: 19869, Training Loss: 0.01055
Epoch: 19869, Training Loss: 0.01300
Epoch: 19870, Training Loss: 0.01005
Epoch: 19870, Training Loss: 0.01054
Epoch: 19870, Training Loss: 0.01055
Epoch: 19870, Training Loss: 0.01300
Epoch: 19871, Training Loss: 0.01005
Epoch: 19871, Training Loss: 0.01054
Epoch: 19871, Training Loss: 0.01055
Epoch: 19871, Training Loss: 0.01300
Epoch: 19872, Training Loss: 0.01005
Epoch: 19872, Training Loss: 0.01054
Epoch: 19872, Training Loss: 0.01055
Epoch: 19872, Training Loss: 0.01300
Epoch: 19873, Training Loss: 0.01005
Epoch: 19873, Training Loss: 0.01053
Epoch: 19873, Training Loss: 0.01055
Epoch: 19873, Training Loss: 0.01300
Epoch: 19874, Training Loss: 0.01005
Epoch: 19874, Training Loss: 0.01053
Epoch: 19874, Training Loss: 0.01055
Epoch: 19874, Training Loss: 0.01300
Epoch: 19875, Training Loss: 0.01005
Epoch: 19875, Training Loss: 0.01053
Epoch: 19875, Training Loss: 0.01055
Epoch: 19875, Training Loss: 0.01300
Epoch: 19876, Training Loss: 0.01005
Epoch: 19876, Training Loss: 0.01053
Epoch: 19876, Training Loss: 0.01055
Epoch: 19876, Training Loss: 0.01300
Epoch: 19877, Training Loss: 0.01005
Epoch: 19877, Training Loss: 0.01053
Epoch: 19877, Training Loss: 0.01055
Epoch: 19877, Training Loss: 0.01300
Epoch: 19878, Training Loss: 0.01005
Epoch: 19878, Training Loss: 0.01053
Epoch: 19878, Training Loss: 0.01055
Epoch: 19878, Training Loss: 0.01300
Epoch: 19879, Training Loss: 0.01005
Epoch: 19879, Training Loss: 0.01053
Epoch: 19879, Training Loss: 0.01055
Epoch: 19879, Training Loss: 0.01300
Epoch: 19880, Training Loss: 0.01005
Epoch: 19880, Training Loss: 0.01053
Epoch: 19880, Training Loss: 0.01055
Epoch: 19880, Training Loss: 0.01300
Epoch: 19881, Training Loss: 0.01005
Epoch: 19881, Training Loss: 0.01053
Epoch: 19881, Training Loss: 0.01055
Epoch: 19881, Training Loss: 0.01300
Epoch: 19882, Training Loss: 0.01005
Epoch: 19882, Training Loss: 0.01053
Epoch: 19882, Training Loss: 0.01055
Epoch: 19882, Training Loss: 0.01300
Epoch: 19883, Training Loss: 0.01005
Epoch: 19883, Training Loss: 0.01053
Epoch: 19883, Training Loss: 0.01055
Epoch: 19883, Training Loss: 0.01300
Epoch: 19884, Training Loss: 0.01005
Epoch: 19884, Training Loss: 0.01053
Epoch: 19884, Training Loss: 0.01055
Epoch: 19884, Training Loss: 0.01300
Epoch: 19885, Training Loss: 0.01005
Epoch: 19885, Training Loss: 0.01053
Epoch: 19885, Training Loss: 0.01054
Epoch: 19885, Training Loss: 0.01299
Epoch: 19886, Training Loss: 0.01005
Epoch: 19886, Training Loss: 0.01053
Epoch: 19886, Training Loss: 0.01054
Epoch: 19886, Training Loss: 0.01299
Epoch: 19887, Training Loss: 0.01005
Epoch: 19887, Training Loss: 0.01053
Epoch: 19887, Training Loss: 0.01054
Epoch: 19887, Training Loss: 0.01299
Epoch: 19888, Training Loss: 0.01005
Epoch: 19888, Training Loss: 0.01053
Epoch: 19888, Training Loss: 0.01054
Epoch: 19888, Training Loss: 0.01299
Epoch: 19889, Training Loss: 0.01005
Epoch: 19889, Training Loss: 0.01053
Epoch: 19889, Training Loss: 0.01054
Epoch: 19889, Training Loss: 0.01299
Epoch: 19890, Training Loss: 0.01004
Epoch: 19890, Training Loss: 0.01053
Epoch: 19890, Training Loss: 0.01054
Epoch: 19890, Training Loss: 0.01299
Epoch: 19891, Training Loss: 0.01004
Epoch: 19891, Training Loss: 0.01053
Epoch: 19891, Training Loss: 0.01054
Epoch: 19891, Training Loss: 0.01299
Epoch: 19892, Training Loss: 0.01004
Epoch: 19892, Training Loss: 0.01053
Epoch: 19892, Training Loss: 0.01054
Epoch: 19892, Training Loss: 0.01299
Epoch: 19893, Training Loss: 0.01004
Epoch: 19893, Training Loss: 0.01053
Epoch: 19893, Training Loss: 0.01054
Epoch: 19893, Training Loss: 0.01299
Epoch: 19894, Training Loss: 0.01004
Epoch: 19894, Training Loss: 0.01053
Epoch: 19894, Training Loss: 0.01054
Epoch: 19894, Training Loss: 0.01299
Epoch: 19895, Training Loss: 0.01004
Epoch: 19895, Training Loss: 0.01053
Epoch: 19895, Training Loss: 0.01054
Epoch: 19895, Training Loss: 0.01299
Epoch: 19896, Training Loss: 0.01004
Epoch: 19896, Training Loss: 0.01053
Epoch: 19896, Training Loss: 0.01054
Epoch: 19896, Training Loss: 0.01299
Epoch: 19897, Training Loss: 0.01004
Epoch: 19897, Training Loss: 0.01053
Epoch: 19897, Training Loss: 0.01054
Epoch: 19897, Training Loss: 0.01299
Epoch: 19898, Training Loss: 0.01004
Epoch: 19898, Training Loss: 0.01053
Epoch: 19898, Training Loss: 0.01054
Epoch: 19898, Training Loss: 0.01299
Epoch: 19899, Training Loss: 0.01004
Epoch: 19899, Training Loss: 0.01053
Epoch: 19899, Training Loss: 0.01054
Epoch: 19899, Training Loss: 0.01299
Epoch: 19900, Training Loss: 0.01004
Epoch: 19900, Training Loss: 0.01053
Epoch: 19900, Training Loss: 0.01054
Epoch: 19900, Training Loss: 0.01299
Epoch: 19901, Training Loss: 0.01004
Epoch: 19901, Training Loss: 0.01053
Epoch: 19901, Training Loss: 0.01054
Epoch: 19901, Training Loss: 0.01299
Epoch: 19902, Training Loss: 0.01004
Epoch: 19902, Training Loss: 0.01053
Epoch: 19902, Training Loss: 0.01054
Epoch: 19902, Training Loss: 0.01299
Epoch: 19903, Training Loss: 0.01004
Epoch: 19903, Training Loss: 0.01053
Epoch: 19903, Training Loss: 0.01054
Epoch: 19903, Training Loss: 0.01299
Epoch: 19904, Training Loss: 0.01004
Epoch: 19904, Training Loss: 0.01053
Epoch: 19904, Training Loss: 0.01054
Epoch: 19904, Training Loss: 0.01299
Epoch: 19905, Training Loss: 0.01004
Epoch: 19905, Training Loss: 0.01052
Epoch: 19905, Training Loss: 0.01054
Epoch: 19905, Training Loss: 0.01299
Epoch: 19906, Training Loss: 0.01004
Epoch: 19906, Training Loss: 0.01052
Epoch: 19906, Training Loss: 0.01054
Epoch: 19906, Training Loss: 0.01299
Epoch: 19907, Training Loss: 0.01004
Epoch: 19907, Training Loss: 0.01052
Epoch: 19907, Training Loss: 0.01054
Epoch: 19907, Training Loss: 0.01299
Epoch: 19908, Training Loss: 0.01004
Epoch: 19908, Training Loss: 0.01052
Epoch: 19908, Training Loss: 0.01054
Epoch: 19908, Training Loss: 0.01299
Epoch: 19909, Training Loss: 0.01004
Epoch: 19909, Training Loss: 0.01052
Epoch: 19909, Training Loss: 0.01054
Epoch: 19909, Training Loss: 0.01299
Epoch: 19910, Training Loss: 0.01004
Epoch: 19910, Training Loss: 0.01052
Epoch: 19910, Training Loss: 0.01054
Epoch: 19910, Training Loss: 0.01299
Epoch: 19911, Training Loss: 0.01004
Epoch: 19911, Training Loss: 0.01052
Epoch: 19911, Training Loss: 0.01054
Epoch: 19911, Training Loss: 0.01298
Epoch: 19912, Training Loss: 0.01004
Epoch: 19912, Training Loss: 0.01052
Epoch: 19912, Training Loss: 0.01054
Epoch: 19912, Training Loss: 0.01298
Epoch: 19913, Training Loss: 0.01004
Epoch: 19913, Training Loss: 0.01052
Epoch: 19913, Training Loss: 0.01054
Epoch: 19913, Training Loss: 0.01298
Epoch: 19914, Training Loss: 0.01004
Epoch: 19914, Training Loss: 0.01052
Epoch: 19914, Training Loss: 0.01054
Epoch: 19914, Training Loss: 0.01298
Epoch: 19915, Training Loss: 0.01004
Epoch: 19915, Training Loss: 0.01052
Epoch: 19915, Training Loss: 0.01054
Epoch: 19915, Training Loss: 0.01298
Epoch: 19916, Training Loss: 0.01004
Epoch: 19916, Training Loss: 0.01052
Epoch: 19916, Training Loss: 0.01054
Epoch: 19916, Training Loss: 0.01298
Epoch: 19917, Training Loss: 0.01004
Epoch: 19917, Training Loss: 0.01052
Epoch: 19917, Training Loss: 0.01053
Epoch: 19917, Training Loss: 0.01298
Epoch: 19918, Training Loss: 0.01004
Epoch: 19918, Training Loss: 0.01052
Epoch: 19918, Training Loss: 0.01053
Epoch: 19918, Training Loss: 0.01298
Epoch: 19919, Training Loss: 0.01004
Epoch: 19919, Training Loss: 0.01052
Epoch: 19919, Training Loss: 0.01053
Epoch: 19919, Training Loss: 0.01298
Epoch: 19920, Training Loss: 0.01004
Epoch: 19920, Training Loss: 0.01052
Epoch: 19920, Training Loss: 0.01053
Epoch: 19920, Training Loss: 0.01298
Epoch: 19921, Training Loss: 0.01004
Epoch: 19921, Training Loss: 0.01052
Epoch: 19921, Training Loss: 0.01053
Epoch: 19921, Training Loss: 0.01298
Epoch: 19922, Training Loss: 0.01004
Epoch: 19922, Training Loss: 0.01052
Epoch: 19922, Training Loss: 0.01053
Epoch: 19922, Training Loss: 0.01298
Epoch: 19923, Training Loss: 0.01004
Epoch: 19923, Training Loss: 0.01052
Epoch: 19923, Training Loss: 0.01053
Epoch: 19923, Training Loss: 0.01298
Epoch: 19924, Training Loss: 0.01003
Epoch: 19924, Training Loss: 0.01052
Epoch: 19924, Training Loss: 0.01053
Epoch: 19924, Training Loss: 0.01298
Epoch: 19925, Training Loss: 0.01003
Epoch: 19925, Training Loss: 0.01052
Epoch: 19925, Training Loss: 0.01053
Epoch: 19925, Training Loss: 0.01298
Epoch: 19926, Training Loss: 0.01003
Epoch: 19926, Training Loss: 0.01052
Epoch: 19926, Training Loss: 0.01053
Epoch: 19926, Training Loss: 0.01298
Epoch: 19927, Training Loss: 0.01003
Epoch: 19927, Training Loss: 0.01052
Epoch: 19927, Training Loss: 0.01053
Epoch: 19927, Training Loss: 0.01298
Epoch: 19928, Training Loss: 0.01003
Epoch: 19928, Training Loss: 0.01052
Epoch: 19928, Training Loss: 0.01053
Epoch: 19928, Training Loss: 0.01298
Epoch: 19929, Training Loss: 0.01003
Epoch: 19929, Training Loss: 0.01052
Epoch: 19929, Training Loss: 0.01053
Epoch: 19929, Training Loss: 0.01298
Epoch: 19930, Training Loss: 0.01003
Epoch: 19930, Training Loss: 0.01052
Epoch: 19930, Training Loss: 0.01053
Epoch: 19930, Training Loss: 0.01298
Epoch: 19931, Training Loss: 0.01003
Epoch: 19931, Training Loss: 0.01052
Epoch: 19931, Training Loss: 0.01053
Epoch: 19931, Training Loss: 0.01298
Epoch: 19932, Training Loss: 0.01003
Epoch: 19932, Training Loss: 0.01052
Epoch: 19932, Training Loss: 0.01053
Epoch: 19932, Training Loss: 0.01298
Epoch: 19933, Training Loss: 0.01003
Epoch: 19933, Training Loss: 0.01052
Epoch: 19933, Training Loss: 0.01053
Epoch: 19933, Training Loss: 0.01298
Epoch: 19934, Training Loss: 0.01003
Epoch: 19934, Training Loss: 0.01052
Epoch: 19934, Training Loss: 0.01053
Epoch: 19934, Training Loss: 0.01298
Epoch: 19935, Training Loss: 0.01003
Epoch: 19935, Training Loss: 0.01052
Epoch: 19935, Training Loss: 0.01053
Epoch: 19935, Training Loss: 0.01298
Epoch: 19936, Training Loss: 0.01003
Epoch: 19936, Training Loss: 0.01052
Epoch: 19936, Training Loss: 0.01053
Epoch: 19936, Training Loss: 0.01297
Epoch: 19937, Training Loss: 0.01003
Epoch: 19937, Training Loss: 0.01051
Epoch: 19937, Training Loss: 0.01053
Epoch: 19937, Training Loss: 0.01297
Epoch: 19938, Training Loss: 0.01003
Epoch: 19938, Training Loss: 0.01051
Epoch: 19938, Training Loss: 0.01053
Epoch: 19938, Training Loss: 0.01297
Epoch: 19939, Training Loss: 0.01003
Epoch: 19939, Training Loss: 0.01051
Epoch: 19939, Training Loss: 0.01053
Epoch: 19939, Training Loss: 0.01297
Epoch: 19940, Training Loss: 0.01003
Epoch: 19940, Training Loss: 0.01051
Epoch: 19940, Training Loss: 0.01053
Epoch: 19940, Training Loss: 0.01297
Epoch: 19941, Training Loss: 0.01003
Epoch: 19941, Training Loss: 0.01051
Epoch: 19941, Training Loss: 0.01053
Epoch: 19941, Training Loss: 0.01297
Epoch: 19942, Training Loss: 0.01003
Epoch: 19942, Training Loss: 0.01051
Epoch: 19942, Training Loss: 0.01053
Epoch: 19942, Training Loss: 0.01297
Epoch: 19943, Training Loss: 0.01003
Epoch: 19943, Training Loss: 0.01051
Epoch: 19943, Training Loss: 0.01053
Epoch: 19943, Training Loss: 0.01297
...
Epoch: 20186, Training Loss: 0.00996
Epoch: 20186, Training Loss: 0.01044
Epoch: 20186, Training Loss: 0.01045

[output condensed: the cell prints four loss values per epoch, one per XOR pattern. Over epochs 19943-20186 they drift down only very slowly, from roughly (0.01003, 0.01051, 0.01053, 0.01297) to (0.00996, 0.01044, 0.01045, 0.01288) -- all still above the 0.008 target at epoch 20186.]
Epoch: 20186, Training Loss: 0.01288
Epoch: 20187, Training Loss: 0.00996
Epoch: 20187, Training Loss: 0.01044
Epoch: 20187, Training Loss: 0.01045
Epoch: 20187, Training Loss: 0.01288
Epoch: 20188, Training Loss: 0.00996
Epoch: 20188, Training Loss: 0.01044
Epoch: 20188, Training Loss: 0.01045
Epoch: 20188, Training Loss: 0.01288
Epoch: 20189, Training Loss: 0.00996
Epoch: 20189, Training Loss: 0.01044
Epoch: 20189, Training Loss: 0.01045
Epoch: 20189, Training Loss: 0.01288
Epoch: 20190, Training Loss: 0.00996
Epoch: 20190, Training Loss: 0.01044
Epoch: 20190, Training Loss: 0.01045
Epoch: 20190, Training Loss: 0.01288
Epoch: 20191, Training Loss: 0.00996
Epoch: 20191, Training Loss: 0.01044
Epoch: 20191, Training Loss: 0.01045
Epoch: 20191, Training Loss: 0.01288
Epoch: 20192, Training Loss: 0.00996
Epoch: 20192, Training Loss: 0.01044
Epoch: 20192, Training Loss: 0.01045
Epoch: 20192, Training Loss: 0.01288
Epoch: 20193, Training Loss: 0.00996
Epoch: 20193, Training Loss: 0.01044
Epoch: 20193, Training Loss: 0.01045
Epoch: 20193, Training Loss: 0.01288
Epoch: 20194, Training Loss: 0.00996
Epoch: 20194, Training Loss: 0.01044
Epoch: 20194, Training Loss: 0.01045
Epoch: 20194, Training Loss: 0.01288
Epoch: 20195, Training Loss: 0.00996
Epoch: 20195, Training Loss: 0.01044
Epoch: 20195, Training Loss: 0.01045
Epoch: 20195, Training Loss: 0.01287
Epoch: 20196, Training Loss: 0.00996
Epoch: 20196, Training Loss: 0.01044
Epoch: 20196, Training Loss: 0.01045
Epoch: 20196, Training Loss: 0.01287
Epoch: 20197, Training Loss: 0.00996
Epoch: 20197, Training Loss: 0.01043
Epoch: 20197, Training Loss: 0.01045
Epoch: 20197, Training Loss: 0.01287
Epoch: 20198, Training Loss: 0.00996
Epoch: 20198, Training Loss: 0.01043
Epoch: 20198, Training Loss: 0.01045
Epoch: 20198, Training Loss: 0.01287
Epoch: 20199, Training Loss: 0.00996
Epoch: 20199, Training Loss: 0.01043
Epoch: 20199, Training Loss: 0.01045
Epoch: 20199, Training Loss: 0.01287
Epoch: 20200, Training Loss: 0.00996
Epoch: 20200, Training Loss: 0.01043
Epoch: 20200, Training Loss: 0.01045
Epoch: 20200, Training Loss: 0.01287
Epoch: 20201, Training Loss: 0.00996
Epoch: 20201, Training Loss: 0.01043
Epoch: 20201, Training Loss: 0.01045
Epoch: 20201, Training Loss: 0.01287
Epoch: 20202, Training Loss: 0.00996
Epoch: 20202, Training Loss: 0.01043
Epoch: 20202, Training Loss: 0.01045
Epoch: 20202, Training Loss: 0.01287
Epoch: 20203, Training Loss: 0.00996
Epoch: 20203, Training Loss: 0.01043
Epoch: 20203, Training Loss: 0.01045
Epoch: 20203, Training Loss: 0.01287
Epoch: 20204, Training Loss: 0.00995
Epoch: 20204, Training Loss: 0.01043
Epoch: 20204, Training Loss: 0.01045
Epoch: 20204, Training Loss: 0.01287
Epoch: 20205, Training Loss: 0.00995
Epoch: 20205, Training Loss: 0.01043
Epoch: 20205, Training Loss: 0.01045
Epoch: 20205, Training Loss: 0.01287
Epoch: 20206, Training Loss: 0.00995
Epoch: 20206, Training Loss: 0.01043
Epoch: 20206, Training Loss: 0.01045
Epoch: 20206, Training Loss: 0.01287
Epoch: 20207, Training Loss: 0.00995
Epoch: 20207, Training Loss: 0.01043
Epoch: 20207, Training Loss: 0.01045
Epoch: 20207, Training Loss: 0.01287
Epoch: 20208, Training Loss: 0.00995
Epoch: 20208, Training Loss: 0.01043
Epoch: 20208, Training Loss: 0.01045
Epoch: 20208, Training Loss: 0.01287
Epoch: 20209, Training Loss: 0.00995
Epoch: 20209, Training Loss: 0.01043
Epoch: 20209, Training Loss: 0.01044
Epoch: 20209, Training Loss: 0.01287
Epoch: 20210, Training Loss: 0.00995
Epoch: 20210, Training Loss: 0.01043
Epoch: 20210, Training Loss: 0.01044
Epoch: 20210, Training Loss: 0.01287
Epoch: 20211, Training Loss: 0.00995
Epoch: 20211, Training Loss: 0.01043
Epoch: 20211, Training Loss: 0.01044
Epoch: 20211, Training Loss: 0.01287
Epoch: 20212, Training Loss: 0.00995
Epoch: 20212, Training Loss: 0.01043
Epoch: 20212, Training Loss: 0.01044
Epoch: 20212, Training Loss: 0.01287
Epoch: 20213, Training Loss: 0.00995
Epoch: 20213, Training Loss: 0.01043
Epoch: 20213, Training Loss: 0.01044
Epoch: 20213, Training Loss: 0.01287
Epoch: 20214, Training Loss: 0.00995
Epoch: 20214, Training Loss: 0.01043
Epoch: 20214, Training Loss: 0.01044
Epoch: 20214, Training Loss: 0.01287
Epoch: 20215, Training Loss: 0.00995
Epoch: 20215, Training Loss: 0.01043
Epoch: 20215, Training Loss: 0.01044
Epoch: 20215, Training Loss: 0.01287
Epoch: 20216, Training Loss: 0.00995
Epoch: 20216, Training Loss: 0.01043
Epoch: 20216, Training Loss: 0.01044
Epoch: 20216, Training Loss: 0.01287
Epoch: 20217, Training Loss: 0.00995
Epoch: 20217, Training Loss: 0.01043
Epoch: 20217, Training Loss: 0.01044
Epoch: 20217, Training Loss: 0.01287
Epoch: 20218, Training Loss: 0.00995
Epoch: 20218, Training Loss: 0.01043
Epoch: 20218, Training Loss: 0.01044
Epoch: 20218, Training Loss: 0.01287
Epoch: 20219, Training Loss: 0.00995
Epoch: 20219, Training Loss: 0.01043
Epoch: 20219, Training Loss: 0.01044
Epoch: 20219, Training Loss: 0.01287
Epoch: 20220, Training Loss: 0.00995
Epoch: 20220, Training Loss: 0.01043
Epoch: 20220, Training Loss: 0.01044
Epoch: 20220, Training Loss: 0.01287
Epoch: 20221, Training Loss: 0.00995
Epoch: 20221, Training Loss: 0.01043
Epoch: 20221, Training Loss: 0.01044
Epoch: 20221, Training Loss: 0.01286
Epoch: 20222, Training Loss: 0.00995
Epoch: 20222, Training Loss: 0.01043
Epoch: 20222, Training Loss: 0.01044
Epoch: 20222, Training Loss: 0.01286
Epoch: 20223, Training Loss: 0.00995
Epoch: 20223, Training Loss: 0.01043
Epoch: 20223, Training Loss: 0.01044
Epoch: 20223, Training Loss: 0.01286
Epoch: 20224, Training Loss: 0.00995
Epoch: 20224, Training Loss: 0.01043
Epoch: 20224, Training Loss: 0.01044
Epoch: 20224, Training Loss: 0.01286
Epoch: 20225, Training Loss: 0.00995
Epoch: 20225, Training Loss: 0.01043
Epoch: 20225, Training Loss: 0.01044
Epoch: 20225, Training Loss: 0.01286
Epoch: 20226, Training Loss: 0.00995
Epoch: 20226, Training Loss: 0.01043
Epoch: 20226, Training Loss: 0.01044
Epoch: 20226, Training Loss: 0.01286
Epoch: 20227, Training Loss: 0.00995
Epoch: 20227, Training Loss: 0.01043
Epoch: 20227, Training Loss: 0.01044
Epoch: 20227, Training Loss: 0.01286
Epoch: 20228, Training Loss: 0.00995
Epoch: 20228, Training Loss: 0.01043
Epoch: 20228, Training Loss: 0.01044
Epoch: 20228, Training Loss: 0.01286
Epoch: 20229, Training Loss: 0.00995
Epoch: 20229, Training Loss: 0.01042
Epoch: 20229, Training Loss: 0.01044
Epoch: 20229, Training Loss: 0.01286
Epoch: 20230, Training Loss: 0.00995
Epoch: 20230, Training Loss: 0.01042
Epoch: 20230, Training Loss: 0.01044
Epoch: 20230, Training Loss: 0.01286
Epoch: 20231, Training Loss: 0.00995
Epoch: 20231, Training Loss: 0.01042
Epoch: 20231, Training Loss: 0.01044
Epoch: 20231, Training Loss: 0.01286
Epoch: 20232, Training Loss: 0.00995
Epoch: 20232, Training Loss: 0.01042
Epoch: 20232, Training Loss: 0.01044
Epoch: 20232, Training Loss: 0.01286
Epoch: 20233, Training Loss: 0.00995
Epoch: 20233, Training Loss: 0.01042
Epoch: 20233, Training Loss: 0.01044
Epoch: 20233, Training Loss: 0.01286
Epoch: 20234, Training Loss: 0.00995
Epoch: 20234, Training Loss: 0.01042
Epoch: 20234, Training Loss: 0.01044
Epoch: 20234, Training Loss: 0.01286
Epoch: 20235, Training Loss: 0.00995
Epoch: 20235, Training Loss: 0.01042
Epoch: 20235, Training Loss: 0.01044
Epoch: 20235, Training Loss: 0.01286
Epoch: 20236, Training Loss: 0.00995
Epoch: 20236, Training Loss: 0.01042
Epoch: 20236, Training Loss: 0.01044
Epoch: 20236, Training Loss: 0.01286
Epoch: 20237, Training Loss: 0.00995
Epoch: 20237, Training Loss: 0.01042
Epoch: 20237, Training Loss: 0.01044
Epoch: 20237, Training Loss: 0.01286
Epoch: 20238, Training Loss: 0.00995
Epoch: 20238, Training Loss: 0.01042
Epoch: 20238, Training Loss: 0.01044
Epoch: 20238, Training Loss: 0.01286
Epoch: 20239, Training Loss: 0.00994
Epoch: 20239, Training Loss: 0.01042
Epoch: 20239, Training Loss: 0.01044
Epoch: 20239, Training Loss: 0.01286
Epoch: 20240, Training Loss: 0.00994
Epoch: 20240, Training Loss: 0.01042
Epoch: 20240, Training Loss: 0.01044
Epoch: 20240, Training Loss: 0.01286
Epoch: 20241, Training Loss: 0.00994
Epoch: 20241, Training Loss: 0.01042
Epoch: 20241, Training Loss: 0.01043
Epoch: 20241, Training Loss: 0.01286
Epoch: 20242, Training Loss: 0.00994
Epoch: 20242, Training Loss: 0.01042
Epoch: 20242, Training Loss: 0.01043
Epoch: 20242, Training Loss: 0.01286
Epoch: 20243, Training Loss: 0.00994
Epoch: 20243, Training Loss: 0.01042
Epoch: 20243, Training Loss: 0.01043
Epoch: 20243, Training Loss: 0.01286
Epoch: 20244, Training Loss: 0.00994
Epoch: 20244, Training Loss: 0.01042
Epoch: 20244, Training Loss: 0.01043
Epoch: 20244, Training Loss: 0.01286
Epoch: 20245, Training Loss: 0.00994
Epoch: 20245, Training Loss: 0.01042
Epoch: 20245, Training Loss: 0.01043
Epoch: 20245, Training Loss: 0.01286
Epoch: 20246, Training Loss: 0.00994
Epoch: 20246, Training Loss: 0.01042
Epoch: 20246, Training Loss: 0.01043
Epoch: 20246, Training Loss: 0.01286
Epoch: 20247, Training Loss: 0.00994
Epoch: 20247, Training Loss: 0.01042
Epoch: 20247, Training Loss: 0.01043
Epoch: 20247, Training Loss: 0.01285
Epoch: 20248, Training Loss: 0.00994
Epoch: 20248, Training Loss: 0.01042
Epoch: 20248, Training Loss: 0.01043
Epoch: 20248, Training Loss: 0.01285
Epoch: 20249, Training Loss: 0.00994
Epoch: 20249, Training Loss: 0.01042
Epoch: 20249, Training Loss: 0.01043
Epoch: 20249, Training Loss: 0.01285
Epoch: 20250, Training Loss: 0.00994
Epoch: 20250, Training Loss: 0.01042
Epoch: 20250, Training Loss: 0.01043
Epoch: 20250, Training Loss: 0.01285
Epoch: 20251, Training Loss: 0.00994
Epoch: 20251, Training Loss: 0.01042
Epoch: 20251, Training Loss: 0.01043
Epoch: 20251, Training Loss: 0.01285
Epoch: 20252, Training Loss: 0.00994
Epoch: 20252, Training Loss: 0.01042
Epoch: 20252, Training Loss: 0.01043
Epoch: 20252, Training Loss: 0.01285
Epoch: 20253, Training Loss: 0.00994
Epoch: 20253, Training Loss: 0.01042
Epoch: 20253, Training Loss: 0.01043
Epoch: 20253, Training Loss: 0.01285
Epoch: 20254, Training Loss: 0.00994
Epoch: 20254, Training Loss: 0.01042
Epoch: 20254, Training Loss: 0.01043
Epoch: 20254, Training Loss: 0.01285
Epoch: 20255, Training Loss: 0.00994
Epoch: 20255, Training Loss: 0.01042
Epoch: 20255, Training Loss: 0.01043
Epoch: 20255, Training Loss: 0.01285
Epoch: 20256, Training Loss: 0.00994
Epoch: 20256, Training Loss: 0.01042
Epoch: 20256, Training Loss: 0.01043
Epoch: 20256, Training Loss: 0.01285
Epoch: 20257, Training Loss: 0.00994
Epoch: 20257, Training Loss: 0.01042
Epoch: 20257, Training Loss: 0.01043
Epoch: 20257, Training Loss: 0.01285
Epoch: 20258, Training Loss: 0.00994
Epoch: 20258, Training Loss: 0.01042
Epoch: 20258, Training Loss: 0.01043
Epoch: 20258, Training Loss: 0.01285
Epoch: 20259, Training Loss: 0.00994
Epoch: 20259, Training Loss: 0.01042
Epoch: 20259, Training Loss: 0.01043
Epoch: 20259, Training Loss: 0.01285
Epoch: 20260, Training Loss: 0.00994
Epoch: 20260, Training Loss: 0.01042
Epoch: 20260, Training Loss: 0.01043
Epoch: 20260, Training Loss: 0.01285
Epoch: 20261, Training Loss: 0.00994
Epoch: 20261, Training Loss: 0.01042
Epoch: 20261, Training Loss: 0.01043
Epoch: 20261, Training Loss: 0.01285
Epoch: 20262, Training Loss: 0.00994
Epoch: 20262, Training Loss: 0.01041
Epoch: 20262, Training Loss: 0.01043
Epoch: 20262, Training Loss: 0.01285
Epoch: 20263, Training Loss: 0.00994
Epoch: 20263, Training Loss: 0.01041
Epoch: 20263, Training Loss: 0.01043
Epoch: 20263, Training Loss: 0.01285
Epoch: 20264, Training Loss: 0.00994
Epoch: 20264, Training Loss: 0.01041
Epoch: 20264, Training Loss: 0.01043
Epoch: 20264, Training Loss: 0.01285
Epoch: 20265, Training Loss: 0.00994
Epoch: 20265, Training Loss: 0.01041
Epoch: 20265, Training Loss: 0.01043
Epoch: 20265, Training Loss: 0.01285
Epoch: 20266, Training Loss: 0.00994
Epoch: 20266, Training Loss: 0.01041
Epoch: 20266, Training Loss: 0.01043
Epoch: 20266, Training Loss: 0.01285
Epoch: 20267, Training Loss: 0.00994
Epoch: 20267, Training Loss: 0.01041
Epoch: 20267, Training Loss: 0.01043
Epoch: 20267, Training Loss: 0.01285
Epoch: 20268, Training Loss: 0.00994
Epoch: 20268, Training Loss: 0.01041
Epoch: 20268, Training Loss: 0.01043
Epoch: 20268, Training Loss: 0.01285
Epoch: 20269, Training Loss: 0.00994
Epoch: 20269, Training Loss: 0.01041
Epoch: 20269, Training Loss: 0.01043
Epoch: 20269, Training Loss: 0.01285
Epoch: 20270, Training Loss: 0.00994
Epoch: 20270, Training Loss: 0.01041
Epoch: 20270, Training Loss: 0.01043
Epoch: 20270, Training Loss: 0.01285
Epoch: 20271, Training Loss: 0.00994
Epoch: 20271, Training Loss: 0.01041
Epoch: 20271, Training Loss: 0.01043
Epoch: 20271, Training Loss: 0.01285
Epoch: 20272, Training Loss: 0.00994
Epoch: 20272, Training Loss: 0.01041
Epoch: 20272, Training Loss: 0.01043
Epoch: 20272, Training Loss: 0.01285
Epoch: 20273, Training Loss: 0.00994
Epoch: 20273, Training Loss: 0.01041
Epoch: 20273, Training Loss: 0.01043
Epoch: 20273, Training Loss: 0.01285
Epoch: 20274, Training Loss: 0.00994
Epoch: 20274, Training Loss: 0.01041
Epoch: 20274, Training Loss: 0.01042
Epoch: 20274, Training Loss: 0.01284
Epoch: 20275, Training Loss: 0.00993
Epoch: 20275, Training Loss: 0.01041
Epoch: 20275, Training Loss: 0.01042
Epoch: 20275, Training Loss: 0.01284
Epoch: 20276, Training Loss: 0.00993
Epoch: 20276, Training Loss: 0.01041
Epoch: 20276, Training Loss: 0.01042
Epoch: 20276, Training Loss: 0.01284
Epoch: 20277, Training Loss: 0.00993
Epoch: 20277, Training Loss: 0.01041
Epoch: 20277, Training Loss: 0.01042
Epoch: 20277, Training Loss: 0.01284
Epoch: 20278, Training Loss: 0.00993
Epoch: 20278, Training Loss: 0.01041
Epoch: 20278, Training Loss: 0.01042
Epoch: 20278, Training Loss: 0.01284
Epoch: 20279, Training Loss: 0.00993
Epoch: 20279, Training Loss: 0.01041
Epoch: 20279, Training Loss: 0.01042
Epoch: 20279, Training Loss: 0.01284
Epoch: 20280, Training Loss: 0.00993
Epoch: 20280, Training Loss: 0.01041
Epoch: 20280, Training Loss: 0.01042
Epoch: 20280, Training Loss: 0.01284
Epoch: 20281, Training Loss: 0.00993
Epoch: 20281, Training Loss: 0.01041
Epoch: 20281, Training Loss: 0.01042
Epoch: 20281, Training Loss: 0.01284
Epoch: 20282, Training Loss: 0.00993
Epoch: 20282, Training Loss: 0.01041
Epoch: 20282, Training Loss: 0.01042
Epoch: 20282, Training Loss: 0.01284
Epoch: 20283, Training Loss: 0.00993
Epoch: 20283, Training Loss: 0.01041
Epoch: 20283, Training Loss: 0.01042
Epoch: 20283, Training Loss: 0.01284
Epoch: 20284, Training Loss: 0.00993
Epoch: 20284, Training Loss: 0.01041
Epoch: 20284, Training Loss: 0.01042
Epoch: 20284, Training Loss: 0.01284
Epoch: 20285, Training Loss: 0.00993
Epoch: 20285, Training Loss: 0.01041
Epoch: 20285, Training Loss: 0.01042
Epoch: 20285, Training Loss: 0.01284
Epoch: 20286, Training Loss: 0.00993
Epoch: 20286, Training Loss: 0.01041
Epoch: 20286, Training Loss: 0.01042
Epoch: 20286, Training Loss: 0.01284
Epoch: 20287, Training Loss: 0.00993
Epoch: 20287, Training Loss: 0.01041
Epoch: 20287, Training Loss: 0.01042
Epoch: 20287, Training Loss: 0.01284
Epoch: 20288, Training Loss: 0.00993
Epoch: 20288, Training Loss: 0.01041
Epoch: 20288, Training Loss: 0.01042
Epoch: 20288, Training Loss: 0.01284
Epoch: 20289, Training Loss: 0.00993
Epoch: 20289, Training Loss: 0.01041
Epoch: 20289, Training Loss: 0.01042
Epoch: 20289, Training Loss: 0.01284
Epoch: 20290, Training Loss: 0.00993
Epoch: 20290, Training Loss: 0.01041
Epoch: 20290, Training Loss: 0.01042
Epoch: 20290, Training Loss: 0.01284
Epoch: 20291, Training Loss: 0.00993
Epoch: 20291, Training Loss: 0.01041
Epoch: 20291, Training Loss: 0.01042
Epoch: 20291, Training Loss: 0.01284
Epoch: 20292, Training Loss: 0.00993
Epoch: 20292, Training Loss: 0.01041
Epoch: 20292, Training Loss: 0.01042
Epoch: 20292, Training Loss: 0.01284
Epoch: 20293, Training Loss: 0.00993
Epoch: 20293, Training Loss: 0.01041
Epoch: 20293, Training Loss: 0.01042
Epoch: 20293, Training Loss: 0.01284
Epoch: 20294, Training Loss: 0.00993
Epoch: 20294, Training Loss: 0.01041
Epoch: 20294, Training Loss: 0.01042
Epoch: 20294, Training Loss: 0.01284
Epoch: 20295, Training Loss: 0.00993
Epoch: 20295, Training Loss: 0.01040
Epoch: 20295, Training Loss: 0.01042
Epoch: 20295, Training Loss: 0.01284
Epoch: 20296, Training Loss: 0.00993
Epoch: 20296, Training Loss: 0.01040
Epoch: 20296, Training Loss: 0.01042
Epoch: 20296, Training Loss: 0.01284
Epoch: 20297, Training Loss: 0.00993
Epoch: 20297, Training Loss: 0.01040
Epoch: 20297, Training Loss: 0.01042
Epoch: 20297, Training Loss: 0.01284
Epoch: 20298, Training Loss: 0.00993
Epoch: 20298, Training Loss: 0.01040
Epoch: 20298, Training Loss: 0.01042
Epoch: 20298, Training Loss: 0.01284
Epoch: 20299, Training Loss: 0.00993
Epoch: 20299, Training Loss: 0.01040
Epoch: 20299, Training Loss: 0.01042
Epoch: 20299, Training Loss: 0.01284
Epoch: 20300, Training Loss: 0.00993
Epoch: 20300, Training Loss: 0.01040
Epoch: 20300, Training Loss: 0.01042
Epoch: 20300, Training Loss: 0.01283
Epoch: 20301, Training Loss: 0.00993
Epoch: 20301, Training Loss: 0.01040
Epoch: 20301, Training Loss: 0.01042
Epoch: 20301, Training Loss: 0.01283
Epoch: 20302, Training Loss: 0.00993
Epoch: 20302, Training Loss: 0.01040
Epoch: 20302, Training Loss: 0.01042
Epoch: 20302, Training Loss: 0.01283
Epoch: 20303, Training Loss: 0.00993
Epoch: 20303, Training Loss: 0.01040
Epoch: 20303, Training Loss: 0.01042
Epoch: 20303, Training Loss: 0.01283
Epoch: 20304, Training Loss: 0.00993
Epoch: 20304, Training Loss: 0.01040
Epoch: 20304, Training Loss: 0.01042
Epoch: 20304, Training Loss: 0.01283
Epoch: 20305, Training Loss: 0.00993
Epoch: 20305, Training Loss: 0.01040
Epoch: 20305, Training Loss: 0.01042
Epoch: 20305, Training Loss: 0.01283
Epoch: 20306, Training Loss: 0.00993
Epoch: 20306, Training Loss: 0.01040
Epoch: 20306, Training Loss: 0.01042
Epoch: 20306, Training Loss: 0.01283
Epoch: 20307, Training Loss: 0.00993
Epoch: 20307, Training Loss: 0.01040
Epoch: 20307, Training Loss: 0.01041
Epoch: 20307, Training Loss: 0.01283
Epoch: 20308, Training Loss: 0.00993
Epoch: 20308, Training Loss: 0.01040
Epoch: 20308, Training Loss: 0.01041
Epoch: 20308, Training Loss: 0.01283
Epoch: 20309, Training Loss: 0.00993
Epoch: 20309, Training Loss: 0.01040
Epoch: 20309, Training Loss: 0.01041
Epoch: 20309, Training Loss: 0.01283
Epoch: 20310, Training Loss: 0.00992
Epoch: 20310, Training Loss: 0.01040
Epoch: 20310, Training Loss: 0.01041
Epoch: 20310, Training Loss: 0.01283
Epoch: 20311, Training Loss: 0.00992
Epoch: 20311, Training Loss: 0.01040
Epoch: 20311, Training Loss: 0.01041
Epoch: 20311, Training Loss: 0.01283
Epoch: 20312, Training Loss: 0.00992
Epoch: 20312, Training Loss: 0.01040
Epoch: 20312, Training Loss: 0.01041
Epoch: 20312, Training Loss: 0.01283
Epoch: 20313, Training Loss: 0.00992
Epoch: 20313, Training Loss: 0.01040
Epoch: 20313, Training Loss: 0.01041
Epoch: 20313, Training Loss: 0.01283
Epoch: 20314, Training Loss: 0.00992
Epoch: 20314, Training Loss: 0.01040
Epoch: 20314, Training Loss: 0.01041
Epoch: 20314, Training Loss: 0.01283
Epoch: 20315, Training Loss: 0.00992
Epoch: 20315, Training Loss: 0.01040
Epoch: 20315, Training Loss: 0.01041
Epoch: 20315, Training Loss: 0.01283
Epoch: 20316, Training Loss: 0.00992
Epoch: 20316, Training Loss: 0.01040
Epoch: 20316, Training Loss: 0.01041
Epoch: 20316, Training Loss: 0.01283
Epoch: 20317, Training Loss: 0.00992
Epoch: 20317, Training Loss: 0.01040
Epoch: 20317, Training Loss: 0.01041
Epoch: 20317, Training Loss: 0.01283
Epoch: 20318, Training Loss: 0.00992
Epoch: 20318, Training Loss: 0.01040
Epoch: 20318, Training Loss: 0.01041
Epoch: 20318, Training Loss: 0.01283
Epoch: 20319, Training Loss: 0.00992
Epoch: 20319, Training Loss: 0.01040
Epoch: 20319, Training Loss: 0.01041
Epoch: 20319, Training Loss: 0.01283
Epoch: 20320, Training Loss: 0.00992
Epoch: 20320, Training Loss: 0.01040
Epoch: 20320, Training Loss: 0.01041
Epoch: 20320, Training Loss: 0.01283
Epoch: 20321, Training Loss: 0.00992
Epoch: 20321, Training Loss: 0.01040
Epoch: 20321, Training Loss: 0.01041
Epoch: 20321, Training Loss: 0.01283
Epoch: 20322, Training Loss: 0.00992
Epoch: 20322, Training Loss: 0.01040
Epoch: 20322, Training Loss: 0.01041
Epoch: 20322, Training Loss: 0.01283
Epoch: 20323, Training Loss: 0.00992
Epoch: 20323, Training Loss: 0.01040
Epoch: 20323, Training Loss: 0.01041
Epoch: 20323, Training Loss: 0.01283
Epoch: 20324, Training Loss: 0.00992
Epoch: 20324, Training Loss: 0.01040
Epoch: 20324, Training Loss: 0.01041
Epoch: 20324, Training Loss: 0.01283
Epoch: 20325, Training Loss: 0.00992
Epoch: 20325, Training Loss: 0.01040
Epoch: 20325, Training Loss: 0.01041
Epoch: 20325, Training Loss: 0.01283
Epoch: 20326, Training Loss: 0.00992
Epoch: 20326, Training Loss: 0.01040
Epoch: 20326, Training Loss: 0.01041
Epoch: 20326, Training Loss: 0.01282
Epoch: 20327, Training Loss: 0.00992
Epoch: 20327, Training Loss: 0.01040
Epoch: 20327, Training Loss: 0.01041
Epoch: 20327, Training Loss: 0.01282
Epoch: 20328, Training Loss: 0.00992
Epoch: 20328, Training Loss: 0.01039
Epoch: 20328, Training Loss: 0.01041
Epoch: 20328, Training Loss: 0.01282
Epoch: 20329, Training Loss: 0.00992
Epoch: 20329, Training Loss: 0.01039
Epoch: 20329, Training Loss: 0.01041
Epoch: 20329, Training Loss: 0.01282
Epoch: 20330, Training Loss: 0.00992
Epoch: 20330, Training Loss: 0.01039
Epoch: 20330, Training Loss: 0.01041
Epoch: 20330, Training Loss: 0.01282
Epoch: 20331, Training Loss: 0.00992
Epoch: 20331, Training Loss: 0.01039
Epoch: 20331, Training Loss: 0.01041
Epoch: 20331, Training Loss: 0.01282
Epoch: 20332, Training Loss: 0.00992
Epoch: 20332, Training Loss: 0.01039
Epoch: 20332, Training Loss: 0.01041
Epoch: 20332, Training Loss: 0.01282
Epoch: 20333, Training Loss: 0.00992
Epoch: 20333, Training Loss: 0.01039
Epoch: 20333, Training Loss: 0.01041
Epoch: 20333, Training Loss: 0.01282
Epoch: 20334, Training Loss: 0.00992
Epoch: 20334, Training Loss: 0.01039
Epoch: 20334, Training Loss: 0.01041
Epoch: 20334, Training Loss: 0.01282
Epoch: 20335, Training Loss: 0.00992
Epoch: 20335, Training Loss: 0.01039
Epoch: 20335, Training Loss: 0.01041
Epoch: 20335, Training Loss: 0.01282
Epoch: 20336, Training Loss: 0.00992
Epoch: 20336, Training Loss: 0.01039
Epoch: 20336, Training Loss: 0.01041
Epoch: 20336, Training Loss: 0.01282
Epoch: 20337, Training Loss: 0.00992
Epoch: 20337, Training Loss: 0.01039
Epoch: 20337, Training Loss: 0.01041
Epoch: 20337, Training Loss: 0.01282
Epoch: 20338, Training Loss: 0.00992
Epoch: 20338, Training Loss: 0.01039
Epoch: 20338, Training Loss: 0.01041
Epoch: 20338, Training Loss: 0.01282
Epoch: 20339, Training Loss: 0.00992
Epoch: 20339, Training Loss: 0.01039
Epoch: 20339, Training Loss: 0.01041
Epoch: 20339, Training Loss: 0.01282
Epoch: 20340, Training Loss: 0.00992
Epoch: 20340, Training Loss: 0.01039
Epoch: 20340, Training Loss: 0.01040
Epoch: 20340, Training Loss: 0.01282
Epoch: 20341, Training Loss: 0.00992
Epoch: 20341, Training Loss: 0.01039
Epoch: 20341, Training Loss: 0.01040
Epoch: 20341, Training Loss: 0.01282
Epoch: 20342, Training Loss: 0.00992
Epoch: 20342, Training Loss: 0.01039
Epoch: 20342, Training Loss: 0.01040
Epoch: 20342, Training Loss: 0.01282
Epoch: 20343, Training Loss: 0.00992
Epoch: 20343, Training Loss: 0.01039
Epoch: 20343, Training Loss: 0.01040
Epoch: 20343, Training Loss: 0.01282
Epoch: 20344, Training Loss: 0.00992
Epoch: 20344, Training Loss: 0.01039
Epoch: 20344, Training Loss: 0.01040
Epoch: 20344, Training Loss: 0.01282
Epoch: 20345, Training Loss: 0.00992
Epoch: 20345, Training Loss: 0.01039
Epoch: 20345, Training Loss: 0.01040
Epoch: 20345, Training Loss: 0.01282
Epoch: 20346, Training Loss: 0.00991
Epoch: 20346, Training Loss: 0.01039
Epoch: 20346, Training Loss: 0.01040
Epoch: 20346, Training Loss: 0.01282
Epoch: 20347, Training Loss: 0.00991
Epoch: 20347, Training Loss: 0.01039
Epoch: 20347, Training Loss: 0.01040
Epoch: 20347, Training Loss: 0.01282
Epoch: 20348, Training Loss: 0.00991
Epoch: 20348, Training Loss: 0.01039
Epoch: 20348, Training Loss: 0.01040
Epoch: 20348, Training Loss: 0.01282
Epoch: 20349, Training Loss: 0.00991
Epoch: 20349, Training Loss: 0.01039
Epoch: 20349, Training Loss: 0.01040
Epoch: 20349, Training Loss: 0.01282
Epoch: 20350, Training Loss: 0.00991
Epoch: 20350, Training Loss: 0.01039
Epoch: 20350, Training Loss: 0.01040
Epoch: 20350, Training Loss: 0.01282
Epoch: 20351, Training Loss: 0.00991
Epoch: 20351, Training Loss: 0.01039
Epoch: 20351, Training Loss: 0.01040
Epoch: 20351, Training Loss: 0.01282
Epoch: 20352, Training Loss: 0.00991
Epoch: 20352, Training Loss: 0.01039
Epoch: 20352, Training Loss: 0.01040
Epoch: 20352, Training Loss: 0.01282
Epoch: 20353, Training Loss: 0.00991
Epoch: 20353, Training Loss: 0.01039
Epoch: 20353, Training Loss: 0.01040
Epoch: 20353, Training Loss: 0.01281
Epoch: 20354, Training Loss: 0.00991
Epoch: 20354, Training Loss: 0.01039
Epoch: 20354, Training Loss: 0.01040
Epoch: 20354, Training Loss: 0.01281
Epoch: 20355, Training Loss: 0.00991
Epoch: 20355, Training Loss: 0.01039
Epoch: 20355, Training Loss: 0.01040
Epoch: 20355, Training Loss: 0.01281
Epoch: 20356, Training Loss: 0.00991
Epoch: 20356, Training Loss: 0.01039
Epoch: 20356, Training Loss: 0.01040
Epoch: 20356, Training Loss: 0.01281
Epoch: 20357, Training Loss: 0.00991
Epoch: 20357, Training Loss: 0.01039
Epoch: 20357, Training Loss: 0.01040
Epoch: 20357, Training Loss: 0.01281
Epoch: 20358, Training Loss: 0.00991
Epoch: 20358, Training Loss: 0.01039
Epoch: 20358, Training Loss: 0.01040
Epoch: 20358, Training Loss: 0.01281
Epoch: 20359, Training Loss: 0.00991
Epoch: 20359, Training Loss: 0.01039
Epoch: 20359, Training Loss: 0.01040
Epoch: 20359, Training Loss: 0.01281
Epoch: 20360, Training Loss: 0.00991
Epoch: 20360, Training Loss: 0.01039
Epoch: 20360, Training Loss: 0.01040
Epoch: 20360, Training Loss: 0.01281
Epoch: 20361, Training Loss: 0.00991
Epoch: 20361, Training Loss: 0.01039
Epoch: 20361, Training Loss: 0.01040
Epoch: 20361, Training Loss: 0.01281
Epoch: 20362, Training Loss: 0.00991
Epoch: 20362, Training Loss: 0.01038
Epoch: 20362, Training Loss: 0.01040
Epoch: 20362, Training Loss: 0.01281
Epoch: 20363, Training Loss: 0.00991
Epoch: 20363, Training Loss: 0.01038
Epoch: 20363, Training Loss: 0.01040
Epoch: 20363, Training Loss: 0.01281
Epoch: 20364, Training Loss: 0.00991
Epoch: 20364, Training Loss: 0.01038
Epoch: 20364, Training Loss: 0.01040
Epoch: 20364, Training Loss: 0.01281
Epoch: 20365, Training Loss: 0.00991
Epoch: 20365, Training Loss: 0.01038
Epoch: 20365, Training Loss: 0.01040
Epoch: 20365, Training Loss: 0.01281
Epoch: 20366, Training Loss: 0.00991
Epoch: 20366, Training Loss: 0.01038
Epoch: 20366, Training Loss: 0.01040
Epoch: 20366, Training Loss: 0.01281
Epoch: 20367, Training Loss: 0.00991
Epoch: 20367, Training Loss: 0.01038
Epoch: 20367, Training Loss: 0.01040
Epoch: 20367, Training Loss: 0.01281
Epoch: 20368, Training Loss: 0.00991
Epoch: 20368, Training Loss: 0.01038
Epoch: 20368, Training Loss: 0.01040
Epoch: 20368, Training Loss: 0.01281
Epoch: 20369, Training Loss: 0.00991
Epoch: 20369, Training Loss: 0.01038
Epoch: 20369, Training Loss: 0.01040
Epoch: 20369, Training Loss: 0.01281
Epoch: 20370, Training Loss: 0.00991
Epoch: 20370, Training Loss: 0.01038
Epoch: 20370, Training Loss: 0.01040
Epoch: 20370, Training Loss: 0.01281
Epoch: 20371, Training Loss: 0.00991
Epoch: 20371, Training Loss: 0.01038
Epoch: 20371, Training Loss: 0.01040
Epoch: 20371, Training Loss: 0.01281
Epoch: 20372, Training Loss: 0.00991
Epoch: 20372, Training Loss: 0.01038
Epoch: 20372, Training Loss: 0.01040
Epoch: 20372, Training Loss: 0.01281
Epoch: 20373, Training Loss: 0.00991
Epoch: 20373, Training Loss: 0.01038
Epoch: 20373, Training Loss: 0.01040
Epoch: 20373, Training Loss: 0.01281
Epoch: 20374, Training Loss: 0.00991
Epoch: 20374, Training Loss: 0.01038
Epoch: 20374, Training Loss: 0.01039
Epoch: 20374, Training Loss: 0.01281
Epoch: 20375, Training Loss: 0.00991
Epoch: 20375, Training Loss: 0.01038
Epoch: 20375, Training Loss: 0.00991
Epoch: 20375, Training Loss: 0.01038
Epoch: 20375, Training Loss: 0.01039
Epoch: 20375, Training Loss: 0.01281
...
Epoch: 20619, Training Loss: 0.00984
[output truncated: four loss values are printed per epoch, one for each XOR input pattern; between epochs 20375 and 20619 the per-pattern losses decrease only slowly, from roughly 0.0099–0.0128 down to roughly 0.0098–0.0127]
Epoch: 20619, Training Loss: 0.01031
Epoch: 20619, Training Loss: 0.01032
Epoch: 20619, Training Loss: 0.01272
Epoch: 20620, Training Loss: 0.00984
Epoch: 20620, Training Loss: 0.01031
Epoch: 20620, Training Loss: 0.01032
Epoch: 20620, Training Loss: 0.01272
Epoch: 20621, Training Loss: 0.00984
Epoch: 20621, Training Loss: 0.01031
Epoch: 20621, Training Loss: 0.01032
Epoch: 20621, Training Loss: 0.01271
Epoch: 20622, Training Loss: 0.00984
Epoch: 20622, Training Loss: 0.01031
Epoch: 20622, Training Loss: 0.01032
Epoch: 20622, Training Loss: 0.01271
Epoch: 20623, Training Loss: 0.00984
Epoch: 20623, Training Loss: 0.01031
Epoch: 20623, Training Loss: 0.01032
Epoch: 20623, Training Loss: 0.01271
Epoch: 20624, Training Loss: 0.00984
Epoch: 20624, Training Loss: 0.01031
Epoch: 20624, Training Loss: 0.01032
Epoch: 20624, Training Loss: 0.01271
Epoch: 20625, Training Loss: 0.00984
Epoch: 20625, Training Loss: 0.01031
Epoch: 20625, Training Loss: 0.01032
Epoch: 20625, Training Loss: 0.01271
Epoch: 20626, Training Loss: 0.00984
Epoch: 20626, Training Loss: 0.01031
Epoch: 20626, Training Loss: 0.01032
Epoch: 20626, Training Loss: 0.01271
Epoch: 20627, Training Loss: 0.00984
Epoch: 20627, Training Loss: 0.01031
Epoch: 20627, Training Loss: 0.01032
Epoch: 20627, Training Loss: 0.01271
Epoch: 20628, Training Loss: 0.00984
Epoch: 20628, Training Loss: 0.01031
Epoch: 20628, Training Loss: 0.01032
Epoch: 20628, Training Loss: 0.01271
Epoch: 20629, Training Loss: 0.00984
Epoch: 20629, Training Loss: 0.01031
Epoch: 20629, Training Loss: 0.01032
Epoch: 20629, Training Loss: 0.01271
Epoch: 20630, Training Loss: 0.00984
Epoch: 20630, Training Loss: 0.01031
Epoch: 20630, Training Loss: 0.01032
Epoch: 20630, Training Loss: 0.01271
Epoch: 20631, Training Loss: 0.00984
Epoch: 20631, Training Loss: 0.01030
Epoch: 20631, Training Loss: 0.01032
Epoch: 20631, Training Loss: 0.01271
Epoch: 20632, Training Loss: 0.00984
Epoch: 20632, Training Loss: 0.01030
Epoch: 20632, Training Loss: 0.01032
Epoch: 20632, Training Loss: 0.01271
Epoch: 20633, Training Loss: 0.00984
Epoch: 20633, Training Loss: 0.01030
Epoch: 20633, Training Loss: 0.01032
Epoch: 20633, Training Loss: 0.01271
Epoch: 20634, Training Loss: 0.00984
Epoch: 20634, Training Loss: 0.01030
Epoch: 20634, Training Loss: 0.01032
Epoch: 20634, Training Loss: 0.01271
Epoch: 20635, Training Loss: 0.00983
Epoch: 20635, Training Loss: 0.01030
Epoch: 20635, Training Loss: 0.01032
Epoch: 20635, Training Loss: 0.01271
Epoch: 20636, Training Loss: 0.00983
Epoch: 20636, Training Loss: 0.01030
Epoch: 20636, Training Loss: 0.01032
Epoch: 20636, Training Loss: 0.01271
Epoch: 20637, Training Loss: 0.00983
Epoch: 20637, Training Loss: 0.01030
Epoch: 20637, Training Loss: 0.01032
Epoch: 20637, Training Loss: 0.01271
Epoch: 20638, Training Loss: 0.00983
Epoch: 20638, Training Loss: 0.01030
Epoch: 20638, Training Loss: 0.01032
Epoch: 20638, Training Loss: 0.01271
Epoch: 20639, Training Loss: 0.00983
Epoch: 20639, Training Loss: 0.01030
Epoch: 20639, Training Loss: 0.01032
Epoch: 20639, Training Loss: 0.01271
Epoch: 20640, Training Loss: 0.00983
Epoch: 20640, Training Loss: 0.01030
Epoch: 20640, Training Loss: 0.01032
Epoch: 20640, Training Loss: 0.01271
Epoch: 20641, Training Loss: 0.00983
Epoch: 20641, Training Loss: 0.01030
Epoch: 20641, Training Loss: 0.01032
Epoch: 20641, Training Loss: 0.01271
Epoch: 20642, Training Loss: 0.00983
Epoch: 20642, Training Loss: 0.01030
Epoch: 20642, Training Loss: 0.01031
Epoch: 20642, Training Loss: 0.01271
Epoch: 20643, Training Loss: 0.00983
Epoch: 20643, Training Loss: 0.01030
Epoch: 20643, Training Loss: 0.01031
Epoch: 20643, Training Loss: 0.01271
Epoch: 20644, Training Loss: 0.00983
Epoch: 20644, Training Loss: 0.01030
Epoch: 20644, Training Loss: 0.01031
Epoch: 20644, Training Loss: 0.01271
Epoch: 20645, Training Loss: 0.00983
Epoch: 20645, Training Loss: 0.01030
Epoch: 20645, Training Loss: 0.01031
Epoch: 20645, Training Loss: 0.01271
Epoch: 20646, Training Loss: 0.00983
Epoch: 20646, Training Loss: 0.01030
Epoch: 20646, Training Loss: 0.01031
Epoch: 20646, Training Loss: 0.01271
Epoch: 20647, Training Loss: 0.00983
Epoch: 20647, Training Loss: 0.01030
Epoch: 20647, Training Loss: 0.01031
Epoch: 20647, Training Loss: 0.01271
Epoch: 20648, Training Loss: 0.00983
Epoch: 20648, Training Loss: 0.01030
Epoch: 20648, Training Loss: 0.01031
Epoch: 20648, Training Loss: 0.01270
Epoch: 20649, Training Loss: 0.00983
Epoch: 20649, Training Loss: 0.01030
Epoch: 20649, Training Loss: 0.01031
Epoch: 20649, Training Loss: 0.01270
Epoch: 20650, Training Loss: 0.00983
Epoch: 20650, Training Loss: 0.01030
Epoch: 20650, Training Loss: 0.01031
Epoch: 20650, Training Loss: 0.01270
Epoch: 20651, Training Loss: 0.00983
Epoch: 20651, Training Loss: 0.01030
Epoch: 20651, Training Loss: 0.01031
Epoch: 20651, Training Loss: 0.01270
Epoch: 20652, Training Loss: 0.00983
Epoch: 20652, Training Loss: 0.01030
Epoch: 20652, Training Loss: 0.01031
Epoch: 20652, Training Loss: 0.01270
Epoch: 20653, Training Loss: 0.00983
Epoch: 20653, Training Loss: 0.01030
Epoch: 20653, Training Loss: 0.01031
Epoch: 20653, Training Loss: 0.01270
Epoch: 20654, Training Loss: 0.00983
Epoch: 20654, Training Loss: 0.01030
Epoch: 20654, Training Loss: 0.01031
Epoch: 20654, Training Loss: 0.01270
Epoch: 20655, Training Loss: 0.00983
Epoch: 20655, Training Loss: 0.01030
Epoch: 20655, Training Loss: 0.01031
Epoch: 20655, Training Loss: 0.01270
Epoch: 20656, Training Loss: 0.00983
Epoch: 20656, Training Loss: 0.01030
Epoch: 20656, Training Loss: 0.01031
Epoch: 20656, Training Loss: 0.01270
Epoch: 20657, Training Loss: 0.00983
Epoch: 20657, Training Loss: 0.01030
Epoch: 20657, Training Loss: 0.01031
Epoch: 20657, Training Loss: 0.01270
Epoch: 20658, Training Loss: 0.00983
Epoch: 20658, Training Loss: 0.01030
Epoch: 20658, Training Loss: 0.01031
Epoch: 20658, Training Loss: 0.01270
Epoch: 20659, Training Loss: 0.00983
Epoch: 20659, Training Loss: 0.01030
Epoch: 20659, Training Loss: 0.01031
Epoch: 20659, Training Loss: 0.01270
Epoch: 20660, Training Loss: 0.00983
Epoch: 20660, Training Loss: 0.01030
Epoch: 20660, Training Loss: 0.01031
Epoch: 20660, Training Loss: 0.01270
Epoch: 20661, Training Loss: 0.00983
Epoch: 20661, Training Loss: 0.01030
Epoch: 20661, Training Loss: 0.01031
Epoch: 20661, Training Loss: 0.01270
Epoch: 20662, Training Loss: 0.00983
Epoch: 20662, Training Loss: 0.01030
Epoch: 20662, Training Loss: 0.01031
Epoch: 20662, Training Loss: 0.01270
Epoch: 20663, Training Loss: 0.00983
Epoch: 20663, Training Loss: 0.01030
Epoch: 20663, Training Loss: 0.01031
Epoch: 20663, Training Loss: 0.01270
Epoch: 20664, Training Loss: 0.00983
Epoch: 20664, Training Loss: 0.01030
Epoch: 20664, Training Loss: 0.01031
Epoch: 20664, Training Loss: 0.01270
Epoch: 20665, Training Loss: 0.00983
Epoch: 20665, Training Loss: 0.01029
Epoch: 20665, Training Loss: 0.01031
Epoch: 20665, Training Loss: 0.01270
Epoch: 20666, Training Loss: 0.00983
Epoch: 20666, Training Loss: 0.01029
Epoch: 20666, Training Loss: 0.01031
Epoch: 20666, Training Loss: 0.01270
Epoch: 20667, Training Loss: 0.00983
Epoch: 20667, Training Loss: 0.01029
Epoch: 20667, Training Loss: 0.01031
Epoch: 20667, Training Loss: 0.01270
Epoch: 20668, Training Loss: 0.00983
Epoch: 20668, Training Loss: 0.01029
Epoch: 20668, Training Loss: 0.01031
Epoch: 20668, Training Loss: 0.01270
Epoch: 20669, Training Loss: 0.00983
Epoch: 20669, Training Loss: 0.01029
Epoch: 20669, Training Loss: 0.01031
Epoch: 20669, Training Loss: 0.01270
Epoch: 20670, Training Loss: 0.00983
Epoch: 20670, Training Loss: 0.01029
Epoch: 20670, Training Loss: 0.01031
Epoch: 20670, Training Loss: 0.01270
Epoch: 20671, Training Loss: 0.00983
Epoch: 20671, Training Loss: 0.01029
Epoch: 20671, Training Loss: 0.01031
Epoch: 20671, Training Loss: 0.01270
Epoch: 20672, Training Loss: 0.00982
Epoch: 20672, Training Loss: 0.01029
Epoch: 20672, Training Loss: 0.01031
Epoch: 20672, Training Loss: 0.01270
Epoch: 20673, Training Loss: 0.00982
Epoch: 20673, Training Loss: 0.01029
Epoch: 20673, Training Loss: 0.01031
Epoch: 20673, Training Loss: 0.01270
Epoch: 20674, Training Loss: 0.00982
Epoch: 20674, Training Loss: 0.01029
Epoch: 20674, Training Loss: 0.01031
Epoch: 20674, Training Loss: 0.01270
Epoch: 20675, Training Loss: 0.00982
Epoch: 20675, Training Loss: 0.01029
Epoch: 20675, Training Loss: 0.01031
Epoch: 20675, Training Loss: 0.01270
Epoch: 20676, Training Loss: 0.00982
Epoch: 20676, Training Loss: 0.01029
Epoch: 20676, Training Loss: 0.01030
Epoch: 20676, Training Loss: 0.01269
Epoch: 20677, Training Loss: 0.00982
Epoch: 20677, Training Loss: 0.01029
Epoch: 20677, Training Loss: 0.01030
Epoch: 20677, Training Loss: 0.01269
Epoch: 20678, Training Loss: 0.00982
Epoch: 20678, Training Loss: 0.01029
Epoch: 20678, Training Loss: 0.01030
Epoch: 20678, Training Loss: 0.01269
Epoch: 20679, Training Loss: 0.00982
Epoch: 20679, Training Loss: 0.01029
Epoch: 20679, Training Loss: 0.01030
Epoch: 20679, Training Loss: 0.01269
Epoch: 20680, Training Loss: 0.00982
Epoch: 20680, Training Loss: 0.01029
Epoch: 20680, Training Loss: 0.01030
Epoch: 20680, Training Loss: 0.01269
Epoch: 20681, Training Loss: 0.00982
Epoch: 20681, Training Loss: 0.01029
Epoch: 20681, Training Loss: 0.01030
Epoch: 20681, Training Loss: 0.01269
Epoch: 20682, Training Loss: 0.00982
Epoch: 20682, Training Loss: 0.01029
Epoch: 20682, Training Loss: 0.01030
Epoch: 20682, Training Loss: 0.01269
Epoch: 20683, Training Loss: 0.00982
Epoch: 20683, Training Loss: 0.01029
Epoch: 20683, Training Loss: 0.01030
Epoch: 20683, Training Loss: 0.01269
Epoch: 20684, Training Loss: 0.00982
Epoch: 20684, Training Loss: 0.01029
Epoch: 20684, Training Loss: 0.01030
Epoch: 20684, Training Loss: 0.01269
Epoch: 20685, Training Loss: 0.00982
Epoch: 20685, Training Loss: 0.01029
Epoch: 20685, Training Loss: 0.01030
Epoch: 20685, Training Loss: 0.01269
Epoch: 20686, Training Loss: 0.00982
Epoch: 20686, Training Loss: 0.01029
Epoch: 20686, Training Loss: 0.01030
Epoch: 20686, Training Loss: 0.01269
Epoch: 20687, Training Loss: 0.00982
Epoch: 20687, Training Loss: 0.01029
Epoch: 20687, Training Loss: 0.01030
Epoch: 20687, Training Loss: 0.01269
Epoch: 20688, Training Loss: 0.00982
Epoch: 20688, Training Loss: 0.01029
Epoch: 20688, Training Loss: 0.01030
Epoch: 20688, Training Loss: 0.01269
Epoch: 20689, Training Loss: 0.00982
Epoch: 20689, Training Loss: 0.01029
Epoch: 20689, Training Loss: 0.01030
Epoch: 20689, Training Loss: 0.01269
Epoch: 20690, Training Loss: 0.00982
Epoch: 20690, Training Loss: 0.01029
Epoch: 20690, Training Loss: 0.01030
Epoch: 20690, Training Loss: 0.01269
Epoch: 20691, Training Loss: 0.00982
Epoch: 20691, Training Loss: 0.01029
Epoch: 20691, Training Loss: 0.01030
Epoch: 20691, Training Loss: 0.01269
Epoch: 20692, Training Loss: 0.00982
Epoch: 20692, Training Loss: 0.01029
Epoch: 20692, Training Loss: 0.01030
Epoch: 20692, Training Loss: 0.01269
Epoch: 20693, Training Loss: 0.00982
Epoch: 20693, Training Loss: 0.01029
Epoch: 20693, Training Loss: 0.01030
Epoch: 20693, Training Loss: 0.01269
Epoch: 20694, Training Loss: 0.00982
Epoch: 20694, Training Loss: 0.01029
Epoch: 20694, Training Loss: 0.01030
Epoch: 20694, Training Loss: 0.01269
Epoch: 20695, Training Loss: 0.00982
Epoch: 20695, Training Loss: 0.01029
Epoch: 20695, Training Loss: 0.01030
Epoch: 20695, Training Loss: 0.01269
Epoch: 20696, Training Loss: 0.00982
Epoch: 20696, Training Loss: 0.01029
Epoch: 20696, Training Loss: 0.01030
Epoch: 20696, Training Loss: 0.01269
Epoch: 20697, Training Loss: 0.00982
Epoch: 20697, Training Loss: 0.01029
Epoch: 20697, Training Loss: 0.01030
Epoch: 20697, Training Loss: 0.01269
Epoch: 20698, Training Loss: 0.00982
Epoch: 20698, Training Loss: 0.01029
Epoch: 20698, Training Loss: 0.01030
Epoch: 20698, Training Loss: 0.01269
Epoch: 20699, Training Loss: 0.00982
Epoch: 20699, Training Loss: 0.01028
Epoch: 20699, Training Loss: 0.01030
Epoch: 20699, Training Loss: 0.01269
Epoch: 20700, Training Loss: 0.00982
Epoch: 20700, Training Loss: 0.01028
Epoch: 20700, Training Loss: 0.01030
Epoch: 20700, Training Loss: 0.01269
Epoch: 20701, Training Loss: 0.00982
Epoch: 20701, Training Loss: 0.01028
Epoch: 20701, Training Loss: 0.01030
Epoch: 20701, Training Loss: 0.01269
Epoch: 20702, Training Loss: 0.00982
Epoch: 20702, Training Loss: 0.01028
Epoch: 20702, Training Loss: 0.01030
Epoch: 20702, Training Loss: 0.01269
Epoch: 20703, Training Loss: 0.00982
Epoch: 20703, Training Loss: 0.01028
Epoch: 20703, Training Loss: 0.01030
Epoch: 20703, Training Loss: 0.01268
Epoch: 20704, Training Loss: 0.00982
Epoch: 20704, Training Loss: 0.01028
Epoch: 20704, Training Loss: 0.01030
Epoch: 20704, Training Loss: 0.01268
Epoch: 20705, Training Loss: 0.00982
Epoch: 20705, Training Loss: 0.01028
Epoch: 20705, Training Loss: 0.01030
Epoch: 20705, Training Loss: 0.01268
Epoch: 20706, Training Loss: 0.00982
Epoch: 20706, Training Loss: 0.01028
Epoch: 20706, Training Loss: 0.01030
Epoch: 20706, Training Loss: 0.01268
Epoch: 20707, Training Loss: 0.00982
Epoch: 20707, Training Loss: 0.01028
Epoch: 20707, Training Loss: 0.01030
Epoch: 20707, Training Loss: 0.01268
Epoch: 20708, Training Loss: 0.00982
Epoch: 20708, Training Loss: 0.01028
Epoch: 20708, Training Loss: 0.01030
Epoch: 20708, Training Loss: 0.01268
Epoch: 20709, Training Loss: 0.00981
Epoch: 20709, Training Loss: 0.01028
Epoch: 20709, Training Loss: 0.01030
Epoch: 20709, Training Loss: 0.01268
Epoch: 20710, Training Loss: 0.00981
Epoch: 20710, Training Loss: 0.01028
Epoch: 20710, Training Loss: 0.01030
Epoch: 20710, Training Loss: 0.01268
Epoch: 20711, Training Loss: 0.00981
Epoch: 20711, Training Loss: 0.01028
Epoch: 20711, Training Loss: 0.01029
Epoch: 20711, Training Loss: 0.01268
Epoch: 20712, Training Loss: 0.00981
Epoch: 20712, Training Loss: 0.01028
Epoch: 20712, Training Loss: 0.01029
Epoch: 20712, Training Loss: 0.01268
Epoch: 20713, Training Loss: 0.00981
Epoch: 20713, Training Loss: 0.01028
Epoch: 20713, Training Loss: 0.01029
Epoch: 20713, Training Loss: 0.01268
Epoch: 20714, Training Loss: 0.00981
Epoch: 20714, Training Loss: 0.01028
Epoch: 20714, Training Loss: 0.01029
Epoch: 20714, Training Loss: 0.01268
Epoch: 20715, Training Loss: 0.00981
Epoch: 20715, Training Loss: 0.01028
Epoch: 20715, Training Loss: 0.01029
Epoch: 20715, Training Loss: 0.01268
Epoch: 20716, Training Loss: 0.00981
Epoch: 20716, Training Loss: 0.01028
Epoch: 20716, Training Loss: 0.01029
Epoch: 20716, Training Loss: 0.01268
Epoch: 20717, Training Loss: 0.00981
Epoch: 20717, Training Loss: 0.01028
Epoch: 20717, Training Loss: 0.01029
Epoch: 20717, Training Loss: 0.01268
Epoch: 20718, Training Loss: 0.00981
Epoch: 20718, Training Loss: 0.01028
Epoch: 20718, Training Loss: 0.01029
Epoch: 20718, Training Loss: 0.01268
Epoch: 20719, Training Loss: 0.00981
Epoch: 20719, Training Loss: 0.01028
Epoch: 20719, Training Loss: 0.01029
Epoch: 20719, Training Loss: 0.01268
Epoch: 20720, Training Loss: 0.00981
Epoch: 20720, Training Loss: 0.01028
Epoch: 20720, Training Loss: 0.01029
Epoch: 20720, Training Loss: 0.01268
Epoch: 20721, Training Loss: 0.00981
Epoch: 20721, Training Loss: 0.01028
Epoch: 20721, Training Loss: 0.01029
Epoch: 20721, Training Loss: 0.01268
Epoch: 20722, Training Loss: 0.00981
Epoch: 20722, Training Loss: 0.01028
Epoch: 20722, Training Loss: 0.01029
Epoch: 20722, Training Loss: 0.01268
Epoch: 20723, Training Loss: 0.00981
Epoch: 20723, Training Loss: 0.01028
Epoch: 20723, Training Loss: 0.01029
Epoch: 20723, Training Loss: 0.01268
Epoch: 20724, Training Loss: 0.00981
Epoch: 20724, Training Loss: 0.01028
Epoch: 20724, Training Loss: 0.01029
Epoch: 20724, Training Loss: 0.01268
Epoch: 20725, Training Loss: 0.00981
Epoch: 20725, Training Loss: 0.01028
Epoch: 20725, Training Loss: 0.01029
Epoch: 20725, Training Loss: 0.01268
Epoch: 20726, Training Loss: 0.00981
Epoch: 20726, Training Loss: 0.01028
Epoch: 20726, Training Loss: 0.01029
Epoch: 20726, Training Loss: 0.01268
Epoch: 20727, Training Loss: 0.00981
Epoch: 20727, Training Loss: 0.01028
Epoch: 20727, Training Loss: 0.01029
Epoch: 20727, Training Loss: 0.01268
Epoch: 20728, Training Loss: 0.00981
Epoch: 20728, Training Loss: 0.01028
Epoch: 20728, Training Loss: 0.01029
Epoch: 20728, Training Loss: 0.01268
Epoch: 20729, Training Loss: 0.00981
Epoch: 20729, Training Loss: 0.01028
Epoch: 20729, Training Loss: 0.01029
Epoch: 20729, Training Loss: 0.01268
Epoch: 20730, Training Loss: 0.00981
Epoch: 20730, Training Loss: 0.01028
Epoch: 20730, Training Loss: 0.01029
Epoch: 20730, Training Loss: 0.01267
Epoch: 20731, Training Loss: 0.00981
Epoch: 20731, Training Loss: 0.01028
Epoch: 20731, Training Loss: 0.01029
Epoch: 20731, Training Loss: 0.01267
Epoch: 20732, Training Loss: 0.00981
Epoch: 20732, Training Loss: 0.01028
Epoch: 20732, Training Loss: 0.01029
Epoch: 20732, Training Loss: 0.01267
Epoch: 20733, Training Loss: 0.00981
Epoch: 20733, Training Loss: 0.01027
Epoch: 20733, Training Loss: 0.01029
Epoch: 20733, Training Loss: 0.01267
Epoch: 20734, Training Loss: 0.00981
Epoch: 20734, Training Loss: 0.01027
Epoch: 20734, Training Loss: 0.01029
Epoch: 20734, Training Loss: 0.01267
Epoch: 20735, Training Loss: 0.00981
Epoch: 20735, Training Loss: 0.01027
Epoch: 20735, Training Loss: 0.01029
Epoch: 20735, Training Loss: 0.01267
Epoch: 20736, Training Loss: 0.00981
Epoch: 20736, Training Loss: 0.01027
Epoch: 20736, Training Loss: 0.01029
Epoch: 20736, Training Loss: 0.01267
Epoch: 20737, Training Loss: 0.00981
Epoch: 20737, Training Loss: 0.01027
Epoch: 20737, Training Loss: 0.01029
Epoch: 20737, Training Loss: 0.01267
Epoch: 20738, Training Loss: 0.00981
Epoch: 20738, Training Loss: 0.01027
Epoch: 20738, Training Loss: 0.01029
Epoch: 20738, Training Loss: 0.01267
Epoch: 20739, Training Loss: 0.00981
Epoch: 20739, Training Loss: 0.01027
Epoch: 20739, Training Loss: 0.01029
Epoch: 20739, Training Loss: 0.01267
Epoch: 20740, Training Loss: 0.00981
Epoch: 20740, Training Loss: 0.01027
Epoch: 20740, Training Loss: 0.01029
Epoch: 20740, Training Loss: 0.01267
Epoch: 20741, Training Loss: 0.00981
Epoch: 20741, Training Loss: 0.01027
Epoch: 20741, Training Loss: 0.01029
Epoch: 20741, Training Loss: 0.01267
Epoch: 20742, Training Loss: 0.00981
Epoch: 20742, Training Loss: 0.01027
Epoch: 20742, Training Loss: 0.01029
Epoch: 20742, Training Loss: 0.01267
Epoch: 20743, Training Loss: 0.00981
Epoch: 20743, Training Loss: 0.01027
Epoch: 20743, Training Loss: 0.01029
Epoch: 20743, Training Loss: 0.01267
Epoch: 20744, Training Loss: 0.00981
Epoch: 20744, Training Loss: 0.01027
Epoch: 20744, Training Loss: 0.01029
Epoch: 20744, Training Loss: 0.01267
Epoch: 20745, Training Loss: 0.00981
Epoch: 20745, Training Loss: 0.01027
Epoch: 20745, Training Loss: 0.01028
Epoch: 20745, Training Loss: 0.01267
Epoch: 20746, Training Loss: 0.00980
Epoch: 20746, Training Loss: 0.01027
Epoch: 20746, Training Loss: 0.01028
Epoch: 20746, Training Loss: 0.01267
Epoch: 20747, Training Loss: 0.00980
Epoch: 20747, Training Loss: 0.01027
Epoch: 20747, Training Loss: 0.01028
Epoch: 20747, Training Loss: 0.01267
Epoch: 20748, Training Loss: 0.00980
Epoch: 20748, Training Loss: 0.01027
Epoch: 20748, Training Loss: 0.01028
Epoch: 20748, Training Loss: 0.01267
Epoch: 20749, Training Loss: 0.00980
Epoch: 20749, Training Loss: 0.01027
Epoch: 20749, Training Loss: 0.01028
Epoch: 20749, Training Loss: 0.01267
Epoch: 20750, Training Loss: 0.00980
Epoch: 20750, Training Loss: 0.01027
Epoch: 20750, Training Loss: 0.01028
Epoch: 20750, Training Loss: 0.01267
Epoch: 20751, Training Loss: 0.00980
Epoch: 20751, Training Loss: 0.01027
Epoch: 20751, Training Loss: 0.01028
Epoch: 20751, Training Loss: 0.01267
Epoch: 20752, Training Loss: 0.00980
Epoch: 20752, Training Loss: 0.01027
Epoch: 20752, Training Loss: 0.01028
Epoch: 20752, Training Loss: 0.01267
Epoch: 20753, Training Loss: 0.00980
Epoch: 20753, Training Loss: 0.01027
Epoch: 20753, Training Loss: 0.01028
Epoch: 20753, Training Loss: 0.01267
Epoch: 20754, Training Loss: 0.00980
Epoch: 20754, Training Loss: 0.01027
Epoch: 20754, Training Loss: 0.01028
Epoch: 20754, Training Loss: 0.01267
Epoch: 20755, Training Loss: 0.00980
Epoch: 20755, Training Loss: 0.01027
Epoch: 20755, Training Loss: 0.01028
Epoch: 20755, Training Loss: 0.01267
Epoch: 20756, Training Loss: 0.00980
Epoch: 20756, Training Loss: 0.01027
Epoch: 20756, Training Loss: 0.01028
Epoch: 20756, Training Loss: 0.01267
Epoch: 20757, Training Loss: 0.00980
Epoch: 20757, Training Loss: 0.01027
Epoch: 20757, Training Loss: 0.01028
Epoch: 20757, Training Loss: 0.01267
Epoch: 20758, Training Loss: 0.00980
Epoch: 20758, Training Loss: 0.01027
Epoch: 20758, Training Loss: 0.01028
Epoch: 20758, Training Loss: 0.01266
Epoch: 20759, Training Loss: 0.00980
Epoch: 20759, Training Loss: 0.01027
Epoch: 20759, Training Loss: 0.01028
Epoch: 20759, Training Loss: 0.01266
Epoch: 20760, Training Loss: 0.00980
Epoch: 20760, Training Loss: 0.01027
Epoch: 20760, Training Loss: 0.01028
Epoch: 20760, Training Loss: 0.01266
Epoch: 20761, Training Loss: 0.00980
Epoch: 20761, Training Loss: 0.01027
Epoch: 20761, Training Loss: 0.01028
Epoch: 20761, Training Loss: 0.01266
Epoch: 20762, Training Loss: 0.00980
Epoch: 20762, Training Loss: 0.01027
Epoch: 20762, Training Loss: 0.01028
Epoch: 20762, Training Loss: 0.01266
Epoch: 20763, Training Loss: 0.00980
Epoch: 20763, Training Loss: 0.01027
Epoch: 20763, Training Loss: 0.01028
Epoch: 20763, Training Loss: 0.01266
Epoch: 20764, Training Loss: 0.00980
Epoch: 20764, Training Loss: 0.01027
Epoch: 20764, Training Loss: 0.01028
Epoch: 20764, Training Loss: 0.01266
Epoch: 20765, Training Loss: 0.00980
Epoch: 20765, Training Loss: 0.01027
Epoch: 20765, Training Loss: 0.01028
Epoch: 20765, Training Loss: 0.01266
Epoch: 20766, Training Loss: 0.00980
Epoch: 20766, Training Loss: 0.01027
Epoch: 20766, Training Loss: 0.01028
Epoch: 20766, Training Loss: 0.01266
Epoch: 20767, Training Loss: 0.00980
Epoch: 20767, Training Loss: 0.01027
Epoch: 20767, Training Loss: 0.01028
Epoch: 20767, Training Loss: 0.01266
Epoch: 20768, Training Loss: 0.00980
Epoch: 20768, Training Loss: 0.01026
Epoch: 20768, Training Loss: 0.01028
Epoch: 20768, Training Loss: 0.01266
Epoch: 20769, Training Loss: 0.00980
Epoch: 20769, Training Loss: 0.01026
Epoch: 20769, Training Loss: 0.01028
Epoch: 20769, Training Loss: 0.01266
Epoch: 20770, Training Loss: 0.00980
Epoch: 20770, Training Loss: 0.01026
Epoch: 20770, Training Loss: 0.01028
Epoch: 20770, Training Loss: 0.01266
Epoch: 20771, Training Loss: 0.00980
Epoch: 20771, Training Loss: 0.01026
Epoch: 20771, Training Loss: 0.01028
Epoch: 20771, Training Loss: 0.01266
Epoch: 20772, Training Loss: 0.00980
Epoch: 20772, Training Loss: 0.01026
Epoch: 20772, Training Loss: 0.01028
Epoch: 20772, Training Loss: 0.01266
Epoch: 20773, Training Loss: 0.00980
Epoch: 20773, Training Loss: 0.01026
Epoch: 20773, Training Loss: 0.01028
Epoch: 20773, Training Loss: 0.01266
Epoch: 20774, Training Loss: 0.00980
Epoch: 20774, Training Loss: 0.01026
Epoch: 20774, Training Loss: 0.01028
Epoch: 20774, Training Loss: 0.01266
Epoch: 20775, Training Loss: 0.00980
Epoch: 20775, Training Loss: 0.01026
Epoch: 20775, Training Loss: 0.01028
Epoch: 20775, Training Loss: 0.01266
Epoch: 20776, Training Loss: 0.00980
Epoch: 20776, Training Loss: 0.01026
Epoch: 20776, Training Loss: 0.01028
Epoch: 20776, Training Loss: 0.01266
Epoch: 20777, Training Loss: 0.00980
Epoch: 20777, Training Loss: 0.01026
Epoch: 20777, Training Loss: 0.01028
Epoch: 20777, Training Loss: 0.01266
Epoch: 20778, Training Loss: 0.00980
Epoch: 20778, Training Loss: 0.01026
Epoch: 20778, Training Loss: 0.01028
Epoch: 20778, Training Loss: 0.01266
Epoch: 20779, Training Loss: 0.00980
Epoch: 20779, Training Loss: 0.01026
Epoch: 20779, Training Loss: 0.01027
Epoch: 20779, Training Loss: 0.01266
Epoch: 20780, Training Loss: 0.00980
Epoch: 20780, Training Loss: 0.01026
Epoch: 20780, Training Loss: 0.01027
Epoch: 20780, Training Loss: 0.01266
Epoch: 20781, Training Loss: 0.00980
Epoch: 20781, Training Loss: 0.01026
Epoch: 20781, Training Loss: 0.01027
Epoch: 20781, Training Loss: 0.01266
Epoch: 20782, Training Loss: 0.00980
Epoch: 20782, Training Loss: 0.01026
Epoch: 20782, Training Loss: 0.01027
Epoch: 20782, Training Loss: 0.01266
Epoch: 20783, Training Loss: 0.00979
Epoch: 20783, Training Loss: 0.01026
Epoch: 20783, Training Loss: 0.01027
Epoch: 20783, Training Loss: 0.01266
Epoch: 20784, Training Loss: 0.00979
Epoch: 20784, Training Loss: 0.01026
Epoch: 20784, Training Loss: 0.01027
Epoch: 20784, Training Loss: 0.01266
Epoch: 20785, Training Loss: 0.00979
Epoch: 20785, Training Loss: 0.01026
Epoch: 20785, Training Loss: 0.01027
Epoch: 20785, Training Loss: 0.01265
Epoch: 20786, Training Loss: 0.00979
Epoch: 20786, Training Loss: 0.01026
Epoch: 20786, Training Loss: 0.01027
Epoch: 20786, Training Loss: 0.01265
Epoch: 20787, Training Loss: 0.00979
Epoch: 20787, Training Loss: 0.01026
Epoch: 20787, Training Loss: 0.01027
Epoch: 20787, Training Loss: 0.01265
Epoch: 20788, Training Loss: 0.00979
Epoch: 20788, Training Loss: 0.01026
Epoch: 20788, Training Loss: 0.01027
Epoch: 20788, Training Loss: 0.01265
Epoch: 20789, Training Loss: 0.00979
Epoch: 20789, Training Loss: 0.01026
Epoch: 20789, Training Loss: 0.01027
Epoch: 20789, Training Loss: 0.01265
Epoch: 20790, Training Loss: 0.00979
Epoch: 20790, Training Loss: 0.01026
Epoch: 20790, Training Loss: 0.01027
Epoch: 20790, Training Loss: 0.01265
Epoch: 20791, Training Loss: 0.00979
Epoch: 20791, Training Loss: 0.01026
Epoch: 20791, Training Loss: 0.01027
Epoch: 20791, Training Loss: 0.01265
Epoch: 20792, Training Loss: 0.00979
Epoch: 20792, Training Loss: 0.01026
Epoch: 20792, Training Loss: 0.01027
Epoch: 20792, Training Loss: 0.01265
Epoch: 20793, Training Loss: 0.00979
Epoch: 20793, Training Loss: 0.01026
Epoch: 20793, Training Loss: 0.01027
Epoch: 20793, Training Loss: 0.01265
Epoch: 20794, Training Loss: 0.00979
Epoch: 20794, Training Loss: 0.01026
Epoch: 20794, Training Loss: 0.01027
Epoch: 20794, Training Loss: 0.01265
Epoch: 20795, Training Loss: 0.00979
Epoch: 20795, Training Loss: 0.01026
Epoch: 20795, Training Loss: 0.01027
Epoch: 20795, Training Loss: 0.01265
Epoch: 20796, Training Loss: 0.00979
Epoch: 20796, Training Loss: 0.01026
Epoch: 20796, Training Loss: 0.01027
Epoch: 20796, Training Loss: 0.01265
Epoch: 20797, Training Loss: 0.00979
Epoch: 20797, Training Loss: 0.01026
Epoch: 20797, Training Loss: 0.01027
Epoch: 20797, Training Loss: 0.01265
Epoch: 20798, Training Loss: 0.00979
Epoch: 20798, Training Loss: 0.01026
Epoch: 20798, Training Loss: 0.01027
Epoch: 20798, Training Loss: 0.01265
Epoch: 20799, Training Loss: 0.00979
Epoch: 20799, Training Loss: 0.01026
Epoch: 20799, Training Loss: 0.01027
Epoch: 20799, Training Loss: 0.01265
Epoch: 20800, Training Loss: 0.00979
Epoch: 20800, Training Loss: 0.01026
Epoch: 20800, Training Loss: 0.01027
Epoch: 20800, Training Loss: 0.01265
Epoch: 20801, Training Loss: 0.00979
Epoch: 20801, Training Loss: 0.01026
Epoch: 20801, Training Loss: 0.01027
Epoch: 20801, Training Loss: 0.01265
Epoch: 20802, Training Loss: 0.00979
Epoch: 20802, Training Loss: 0.01025
Epoch: 20802, Training Loss: 0.01027
Epoch: 20802, Training Loss: 0.01265
Epoch: 20803, Training Loss: 0.00979
Epoch: 20803, Training Loss: 0.01025
Epoch: 20803, Training Loss: 0.01027
Epoch: 20803, Training Loss: 0.01265
Epoch: 20804, Training Loss: 0.00979
Epoch: 20804, Training Loss: 0.01025
Epoch: 20804, Training Loss: 0.01027
Epoch: 20804, Training Loss: 0.01265
Epoch: 20805, Training Loss: 0.00979
Epoch: 20805, Training Loss: 0.01025
Epoch: 20805, Training Loss: 0.01027
Epoch: 20805, Training Loss: 0.01265
Epoch: 20806, Training Loss: 0.00979
Epoch: 20806, Training Loss: 0.01025
Epoch: 20806, Training Loss: 0.01027
Epoch: 20806, Training Loss: 0.01265
Epoch: 20807, Training Loss: 0.00979
Epoch: 20807, Training Loss: 0.01025
Epoch: 20807, Training Loss: 0.01027
Epoch: 20807, Training Loss: 0.01265
Epoch: 20808, Training Loss: 0.00979
Epoch: 20808, Training Loss: 0.01025
Epoch: 20808, Training Loss: 0.01027
Epoch: 20808, Training Loss: 0.01265
[... ~240 epochs of near-identical output elided: the four per-pattern losses decrease monotonically from 0.00979 / 0.01025 / 0.01027 / 0.01265 at epoch 20809 to 0.00972 / 0.01018 / 0.01020 / 0.01256 at epoch 21050 ...]
Epoch: 21051, Training Loss: 0.00972
Epoch: 21051, Training Loss: 0.01018
Epoch: 21051, Training Loss: 0.01020
Epoch: 21051, Training Loss: 0.01256
Epoch: 21052, Training Loss: 0.00972
Epoch: 21052, Training Loss: 0.01018
Epoch: 21052, Training Loss: 0.01020
Epoch: 21052, Training Loss: 0.01256
Epoch: 21053, Training Loss: 0.00972
Epoch: 21053, Training Loss: 0.01018
Epoch: 21053, Training Loss: 0.01020
Epoch: 21053, Training Loss: 0.01256
Epoch: 21054, Training Loss: 0.00972
Epoch: 21054, Training Loss: 0.01018
Epoch: 21054, Training Loss: 0.01020
Epoch: 21054, Training Loss: 0.01256
Epoch: 21055, Training Loss: 0.00972
Epoch: 21055, Training Loss: 0.01018
Epoch: 21055, Training Loss: 0.01020
Epoch: 21055, Training Loss: 0.01256
Epoch: 21056, Training Loss: 0.00972
Epoch: 21056, Training Loss: 0.01018
Epoch: 21056, Training Loss: 0.01020
Epoch: 21056, Training Loss: 0.01256
Epoch: 21057, Training Loss: 0.00972
Epoch: 21057, Training Loss: 0.01018
Epoch: 21057, Training Loss: 0.01019
Epoch: 21057, Training Loss: 0.01256
Epoch: 21058, Training Loss: 0.00972
Epoch: 21058, Training Loss: 0.01018
Epoch: 21058, Training Loss: 0.01019
Epoch: 21058, Training Loss: 0.01256
Epoch: 21059, Training Loss: 0.00972
Epoch: 21059, Training Loss: 0.01018
Epoch: 21059, Training Loss: 0.01019
Epoch: 21059, Training Loss: 0.01256
Epoch: 21060, Training Loss: 0.00972
Epoch: 21060, Training Loss: 0.01018
Epoch: 21060, Training Loss: 0.01019
Epoch: 21060, Training Loss: 0.01256
Epoch: 21061, Training Loss: 0.00972
Epoch: 21061, Training Loss: 0.01018
Epoch: 21061, Training Loss: 0.01019
Epoch: 21061, Training Loss: 0.01256
Epoch: 21062, Training Loss: 0.00972
Epoch: 21062, Training Loss: 0.01018
Epoch: 21062, Training Loss: 0.01019
Epoch: 21062, Training Loss: 0.01256
Epoch: 21063, Training Loss: 0.00972
Epoch: 21063, Training Loss: 0.01018
Epoch: 21063, Training Loss: 0.01019
Epoch: 21063, Training Loss: 0.01255
Epoch: 21064, Training Loss: 0.00972
Epoch: 21064, Training Loss: 0.01018
Epoch: 21064, Training Loss: 0.01019
Epoch: 21064, Training Loss: 0.01255
Epoch: 21065, Training Loss: 0.00972
Epoch: 21065, Training Loss: 0.01018
Epoch: 21065, Training Loss: 0.01019
Epoch: 21065, Training Loss: 0.01255
Epoch: 21066, Training Loss: 0.00972
Epoch: 21066, Training Loss: 0.01018
Epoch: 21066, Training Loss: 0.01019
Epoch: 21066, Training Loss: 0.01255
Epoch: 21067, Training Loss: 0.00972
Epoch: 21067, Training Loss: 0.01018
Epoch: 21067, Training Loss: 0.01019
Epoch: 21067, Training Loss: 0.01255
Epoch: 21068, Training Loss: 0.00972
Epoch: 21068, Training Loss: 0.01018
Epoch: 21068, Training Loss: 0.01019
Epoch: 21068, Training Loss: 0.01255
Epoch: 21069, Training Loss: 0.00972
Epoch: 21069, Training Loss: 0.01018
Epoch: 21069, Training Loss: 0.01019
Epoch: 21069, Training Loss: 0.01255
Epoch: 21070, Training Loss: 0.00972
Epoch: 21070, Training Loss: 0.01018
Epoch: 21070, Training Loss: 0.01019
Epoch: 21070, Training Loss: 0.01255
Epoch: 21071, Training Loss: 0.00972
Epoch: 21071, Training Loss: 0.01018
Epoch: 21071, Training Loss: 0.01019
Epoch: 21071, Training Loss: 0.01255
Epoch: 21072, Training Loss: 0.00972
Epoch: 21072, Training Loss: 0.01018
Epoch: 21072, Training Loss: 0.01019
Epoch: 21072, Training Loss: 0.01255
Epoch: 21073, Training Loss: 0.00972
Epoch: 21073, Training Loss: 0.01018
Epoch: 21073, Training Loss: 0.01019
Epoch: 21073, Training Loss: 0.01255
Epoch: 21074, Training Loss: 0.00972
Epoch: 21074, Training Loss: 0.01018
Epoch: 21074, Training Loss: 0.01019
Epoch: 21074, Training Loss: 0.01255
Epoch: 21075, Training Loss: 0.00972
Epoch: 21075, Training Loss: 0.01018
Epoch: 21075, Training Loss: 0.01019
Epoch: 21075, Training Loss: 0.01255
Epoch: 21076, Training Loss: 0.00972
Epoch: 21076, Training Loss: 0.01018
Epoch: 21076, Training Loss: 0.01019
Epoch: 21076, Training Loss: 0.01255
Epoch: 21077, Training Loss: 0.00972
Epoch: 21077, Training Loss: 0.01018
Epoch: 21077, Training Loss: 0.01019
Epoch: 21077, Training Loss: 0.01255
Epoch: 21078, Training Loss: 0.00972
Epoch: 21078, Training Loss: 0.01018
Epoch: 21078, Training Loss: 0.01019
Epoch: 21078, Training Loss: 0.01255
Epoch: 21079, Training Loss: 0.00972
Epoch: 21079, Training Loss: 0.01018
Epoch: 21079, Training Loss: 0.01019
Epoch: 21079, Training Loss: 0.01255
Epoch: 21080, Training Loss: 0.00972
Epoch: 21080, Training Loss: 0.01018
Epoch: 21080, Training Loss: 0.01019
Epoch: 21080, Training Loss: 0.01255
Epoch: 21081, Training Loss: 0.00972
Epoch: 21081, Training Loss: 0.01017
Epoch: 21081, Training Loss: 0.01019
Epoch: 21081, Training Loss: 0.01255
Epoch: 21082, Training Loss: 0.00972
Epoch: 21082, Training Loss: 0.01017
Epoch: 21082, Training Loss: 0.01019
Epoch: 21082, Training Loss: 0.01255
Epoch: 21083, Training Loss: 0.00971
Epoch: 21083, Training Loss: 0.01017
Epoch: 21083, Training Loss: 0.01019
Epoch: 21083, Training Loss: 0.01255
Epoch: 21084, Training Loss: 0.00971
Epoch: 21084, Training Loss: 0.01017
Epoch: 21084, Training Loss: 0.01019
Epoch: 21084, Training Loss: 0.01255
Epoch: 21085, Training Loss: 0.00971
Epoch: 21085, Training Loss: 0.01017
Epoch: 21085, Training Loss: 0.01019
Epoch: 21085, Training Loss: 0.01255
Epoch: 21086, Training Loss: 0.00971
Epoch: 21086, Training Loss: 0.01017
Epoch: 21086, Training Loss: 0.01019
Epoch: 21086, Training Loss: 0.01255
Epoch: 21087, Training Loss: 0.00971
Epoch: 21087, Training Loss: 0.01017
Epoch: 21087, Training Loss: 0.01019
Epoch: 21087, Training Loss: 0.01255
Epoch: 21088, Training Loss: 0.00971
Epoch: 21088, Training Loss: 0.01017
Epoch: 21088, Training Loss: 0.01019
Epoch: 21088, Training Loss: 0.01255
Epoch: 21089, Training Loss: 0.00971
Epoch: 21089, Training Loss: 0.01017
Epoch: 21089, Training Loss: 0.01019
Epoch: 21089, Training Loss: 0.01255
Epoch: 21090, Training Loss: 0.00971
Epoch: 21090, Training Loss: 0.01017
Epoch: 21090, Training Loss: 0.01019
Epoch: 21090, Training Loss: 0.01255
Epoch: 21091, Training Loss: 0.00971
Epoch: 21091, Training Loss: 0.01017
Epoch: 21091, Training Loss: 0.01019
Epoch: 21091, Training Loss: 0.01255
Epoch: 21092, Training Loss: 0.00971
Epoch: 21092, Training Loss: 0.01017
Epoch: 21092, Training Loss: 0.01019
Epoch: 21092, Training Loss: 0.01254
Epoch: 21093, Training Loss: 0.00971
Epoch: 21093, Training Loss: 0.01017
Epoch: 21093, Training Loss: 0.01018
Epoch: 21093, Training Loss: 0.01254
Epoch: 21094, Training Loss: 0.00971
Epoch: 21094, Training Loss: 0.01017
Epoch: 21094, Training Loss: 0.01018
Epoch: 21094, Training Loss: 0.01254
Epoch: 21095, Training Loss: 0.00971
Epoch: 21095, Training Loss: 0.01017
Epoch: 21095, Training Loss: 0.01018
Epoch: 21095, Training Loss: 0.01254
Epoch: 21096, Training Loss: 0.00971
Epoch: 21096, Training Loss: 0.01017
Epoch: 21096, Training Loss: 0.01018
Epoch: 21096, Training Loss: 0.01254
Epoch: 21097, Training Loss: 0.00971
Epoch: 21097, Training Loss: 0.01017
Epoch: 21097, Training Loss: 0.01018
Epoch: 21097, Training Loss: 0.01254
Epoch: 21098, Training Loss: 0.00971
Epoch: 21098, Training Loss: 0.01017
Epoch: 21098, Training Loss: 0.01018
Epoch: 21098, Training Loss: 0.01254
Epoch: 21099, Training Loss: 0.00971
Epoch: 21099, Training Loss: 0.01017
Epoch: 21099, Training Loss: 0.01018
Epoch: 21099, Training Loss: 0.01254
Epoch: 21100, Training Loss: 0.00971
Epoch: 21100, Training Loss: 0.01017
Epoch: 21100, Training Loss: 0.01018
Epoch: 21100, Training Loss: 0.01254
Epoch: 21101, Training Loss: 0.00971
Epoch: 21101, Training Loss: 0.01017
Epoch: 21101, Training Loss: 0.01018
Epoch: 21101, Training Loss: 0.01254
Epoch: 21102, Training Loss: 0.00971
Epoch: 21102, Training Loss: 0.01017
Epoch: 21102, Training Loss: 0.01018
Epoch: 21102, Training Loss: 0.01254
Epoch: 21103, Training Loss: 0.00971
Epoch: 21103, Training Loss: 0.01017
Epoch: 21103, Training Loss: 0.01018
Epoch: 21103, Training Loss: 0.01254
Epoch: 21104, Training Loss: 0.00971
Epoch: 21104, Training Loss: 0.01017
Epoch: 21104, Training Loss: 0.01018
Epoch: 21104, Training Loss: 0.01254
Epoch: 21105, Training Loss: 0.00971
Epoch: 21105, Training Loss: 0.01017
Epoch: 21105, Training Loss: 0.01018
Epoch: 21105, Training Loss: 0.01254
Epoch: 21106, Training Loss: 0.00971
Epoch: 21106, Training Loss: 0.01017
Epoch: 21106, Training Loss: 0.01018
Epoch: 21106, Training Loss: 0.01254
Epoch: 21107, Training Loss: 0.00971
Epoch: 21107, Training Loss: 0.01017
Epoch: 21107, Training Loss: 0.01018
Epoch: 21107, Training Loss: 0.01254
Epoch: 21108, Training Loss: 0.00971
Epoch: 21108, Training Loss: 0.01017
Epoch: 21108, Training Loss: 0.01018
Epoch: 21108, Training Loss: 0.01254
Epoch: 21109, Training Loss: 0.00971
Epoch: 21109, Training Loss: 0.01017
Epoch: 21109, Training Loss: 0.01018
Epoch: 21109, Training Loss: 0.01254
Epoch: 21110, Training Loss: 0.00971
Epoch: 21110, Training Loss: 0.01017
Epoch: 21110, Training Loss: 0.01018
Epoch: 21110, Training Loss: 0.01254
Epoch: 21111, Training Loss: 0.00971
Epoch: 21111, Training Loss: 0.01017
Epoch: 21111, Training Loss: 0.01018
Epoch: 21111, Training Loss: 0.01254
Epoch: 21112, Training Loss: 0.00971
Epoch: 21112, Training Loss: 0.01017
Epoch: 21112, Training Loss: 0.01018
Epoch: 21112, Training Loss: 0.01254
Epoch: 21113, Training Loss: 0.00971
Epoch: 21113, Training Loss: 0.01017
Epoch: 21113, Training Loss: 0.01018
Epoch: 21113, Training Loss: 0.01254
Epoch: 21114, Training Loss: 0.00971
Epoch: 21114, Training Loss: 0.01017
Epoch: 21114, Training Loss: 0.01018
Epoch: 21114, Training Loss: 0.01254
Epoch: 21115, Training Loss: 0.00971
Epoch: 21115, Training Loss: 0.01017
Epoch: 21115, Training Loss: 0.01018
Epoch: 21115, Training Loss: 0.01254
Epoch: 21116, Training Loss: 0.00971
Epoch: 21116, Training Loss: 0.01017
Epoch: 21116, Training Loss: 0.01018
Epoch: 21116, Training Loss: 0.01254
Epoch: 21117, Training Loss: 0.00971
Epoch: 21117, Training Loss: 0.01016
Epoch: 21117, Training Loss: 0.01018
Epoch: 21117, Training Loss: 0.01254
Epoch: 21118, Training Loss: 0.00971
Epoch: 21118, Training Loss: 0.01016
Epoch: 21118, Training Loss: 0.01018
Epoch: 21118, Training Loss: 0.01254
Epoch: 21119, Training Loss: 0.00971
Epoch: 21119, Training Loss: 0.01016
Epoch: 21119, Training Loss: 0.01018
Epoch: 21119, Training Loss: 0.01254
Epoch: 21120, Training Loss: 0.00971
Epoch: 21120, Training Loss: 0.01016
Epoch: 21120, Training Loss: 0.01018
Epoch: 21120, Training Loss: 0.01253
Epoch: 21121, Training Loss: 0.00970
Epoch: 21121, Training Loss: 0.01016
Epoch: 21121, Training Loss: 0.01018
Epoch: 21121, Training Loss: 0.01253
Epoch: 21122, Training Loss: 0.00970
Epoch: 21122, Training Loss: 0.01016
Epoch: 21122, Training Loss: 0.01018
Epoch: 21122, Training Loss: 0.01253
Epoch: 21123, Training Loss: 0.00970
Epoch: 21123, Training Loss: 0.01016
Epoch: 21123, Training Loss: 0.01018
Epoch: 21123, Training Loss: 0.01253
Epoch: 21124, Training Loss: 0.00970
Epoch: 21124, Training Loss: 0.01016
Epoch: 21124, Training Loss: 0.01018
Epoch: 21124, Training Loss: 0.01253
Epoch: 21125, Training Loss: 0.00970
Epoch: 21125, Training Loss: 0.01016
Epoch: 21125, Training Loss: 0.01018
Epoch: 21125, Training Loss: 0.01253
Epoch: 21126, Training Loss: 0.00970
Epoch: 21126, Training Loss: 0.01016
Epoch: 21126, Training Loss: 0.01018
Epoch: 21126, Training Loss: 0.01253
Epoch: 21127, Training Loss: 0.00970
Epoch: 21127, Training Loss: 0.01016
Epoch: 21127, Training Loss: 0.01018
Epoch: 21127, Training Loss: 0.01253
Epoch: 21128, Training Loss: 0.00970
Epoch: 21128, Training Loss: 0.01016
Epoch: 21128, Training Loss: 0.01017
Epoch: 21128, Training Loss: 0.01253
Epoch: 21129, Training Loss: 0.00970
Epoch: 21129, Training Loss: 0.01016
Epoch: 21129, Training Loss: 0.01017
Epoch: 21129, Training Loss: 0.01253
Epoch: 21130, Training Loss: 0.00970
Epoch: 21130, Training Loss: 0.01016
Epoch: 21130, Training Loss: 0.01017
Epoch: 21130, Training Loss: 0.01253
Epoch: 21131, Training Loss: 0.00970
Epoch: 21131, Training Loss: 0.01016
Epoch: 21131, Training Loss: 0.01017
Epoch: 21131, Training Loss: 0.01253
Epoch: 21132, Training Loss: 0.00970
Epoch: 21132, Training Loss: 0.01016
Epoch: 21132, Training Loss: 0.01017
Epoch: 21132, Training Loss: 0.01253
Epoch: 21133, Training Loss: 0.00970
Epoch: 21133, Training Loss: 0.01016
Epoch: 21133, Training Loss: 0.01017
Epoch: 21133, Training Loss: 0.01253
Epoch: 21134, Training Loss: 0.00970
Epoch: 21134, Training Loss: 0.01016
Epoch: 21134, Training Loss: 0.01017
Epoch: 21134, Training Loss: 0.01253
Epoch: 21135, Training Loss: 0.00970
Epoch: 21135, Training Loss: 0.01016
Epoch: 21135, Training Loss: 0.01017
Epoch: 21135, Training Loss: 0.01253
Epoch: 21136, Training Loss: 0.00970
Epoch: 21136, Training Loss: 0.01016
Epoch: 21136, Training Loss: 0.01017
Epoch: 21136, Training Loss: 0.01253
Epoch: 21137, Training Loss: 0.00970
Epoch: 21137, Training Loss: 0.01016
Epoch: 21137, Training Loss: 0.01017
Epoch: 21137, Training Loss: 0.01253
Epoch: 21138, Training Loss: 0.00970
Epoch: 21138, Training Loss: 0.01016
Epoch: 21138, Training Loss: 0.01017
Epoch: 21138, Training Loss: 0.01253
Epoch: 21139, Training Loss: 0.00970
Epoch: 21139, Training Loss: 0.01016
Epoch: 21139, Training Loss: 0.01017
Epoch: 21139, Training Loss: 0.01253
Epoch: 21140, Training Loss: 0.00970
Epoch: 21140, Training Loss: 0.01016
Epoch: 21140, Training Loss: 0.01017
Epoch: 21140, Training Loss: 0.01253
Epoch: 21141, Training Loss: 0.00970
Epoch: 21141, Training Loss: 0.01016
Epoch: 21141, Training Loss: 0.01017
Epoch: 21141, Training Loss: 0.01253
Epoch: 21142, Training Loss: 0.00970
Epoch: 21142, Training Loss: 0.01016
Epoch: 21142, Training Loss: 0.01017
Epoch: 21142, Training Loss: 0.01253
Epoch: 21143, Training Loss: 0.00970
Epoch: 21143, Training Loss: 0.01016
Epoch: 21143, Training Loss: 0.01017
Epoch: 21143, Training Loss: 0.01253
Epoch: 21144, Training Loss: 0.00970
Epoch: 21144, Training Loss: 0.01016
Epoch: 21144, Training Loss: 0.01017
Epoch: 21144, Training Loss: 0.01253
Epoch: 21145, Training Loss: 0.00970
Epoch: 21145, Training Loss: 0.01016
Epoch: 21145, Training Loss: 0.01017
Epoch: 21145, Training Loss: 0.01253
Epoch: 21146, Training Loss: 0.00970
Epoch: 21146, Training Loss: 0.01016
Epoch: 21146, Training Loss: 0.01017
Epoch: 21146, Training Loss: 0.01253
Epoch: 21147, Training Loss: 0.00970
Epoch: 21147, Training Loss: 0.01016
Epoch: 21147, Training Loss: 0.01017
Epoch: 21147, Training Loss: 0.01253
Epoch: 21148, Training Loss: 0.00970
Epoch: 21148, Training Loss: 0.01016
Epoch: 21148, Training Loss: 0.01017
Epoch: 21148, Training Loss: 0.01252
Epoch: 21149, Training Loss: 0.00970
Epoch: 21149, Training Loss: 0.01016
Epoch: 21149, Training Loss: 0.01017
Epoch: 21149, Training Loss: 0.01252
Epoch: 21150, Training Loss: 0.00970
Epoch: 21150, Training Loss: 0.01016
Epoch: 21150, Training Loss: 0.01017
Epoch: 21150, Training Loss: 0.01252
Epoch: 21151, Training Loss: 0.00970
Epoch: 21151, Training Loss: 0.01016
Epoch: 21151, Training Loss: 0.01017
Epoch: 21151, Training Loss: 0.01252
Epoch: 21152, Training Loss: 0.00970
Epoch: 21152, Training Loss: 0.01015
Epoch: 21152, Training Loss: 0.01017
Epoch: 21152, Training Loss: 0.01252
Epoch: 21153, Training Loss: 0.00970
Epoch: 21153, Training Loss: 0.01015
Epoch: 21153, Training Loss: 0.01017
Epoch: 21153, Training Loss: 0.01252
Epoch: 21154, Training Loss: 0.00970
Epoch: 21154, Training Loss: 0.01015
Epoch: 21154, Training Loss: 0.01017
Epoch: 21154, Training Loss: 0.01252
Epoch: 21155, Training Loss: 0.00970
Epoch: 21155, Training Loss: 0.01015
Epoch: 21155, Training Loss: 0.01017
Epoch: 21155, Training Loss: 0.01252
Epoch: 21156, Training Loss: 0.00970
Epoch: 21156, Training Loss: 0.01015
Epoch: 21156, Training Loss: 0.01017
Epoch: 21156, Training Loss: 0.01252
Epoch: 21157, Training Loss: 0.00970
Epoch: 21157, Training Loss: 0.01015
Epoch: 21157, Training Loss: 0.01017
Epoch: 21157, Training Loss: 0.01252
Epoch: 21158, Training Loss: 0.00970
Epoch: 21158, Training Loss: 0.01015
Epoch: 21158, Training Loss: 0.01017
Epoch: 21158, Training Loss: 0.01252
Epoch: 21159, Training Loss: 0.00969
Epoch: 21159, Training Loss: 0.01015
Epoch: 21159, Training Loss: 0.01017
Epoch: 21159, Training Loss: 0.01252
Epoch: 21160, Training Loss: 0.00969
Epoch: 21160, Training Loss: 0.01015
Epoch: 21160, Training Loss: 0.01017
Epoch: 21160, Training Loss: 0.01252
Epoch: 21161, Training Loss: 0.00969
Epoch: 21161, Training Loss: 0.01015
Epoch: 21161, Training Loss: 0.01017
Epoch: 21161, Training Loss: 0.01252
Epoch: 21162, Training Loss: 0.00969
Epoch: 21162, Training Loss: 0.01015
Epoch: 21162, Training Loss: 0.01017
Epoch: 21162, Training Loss: 0.01252
Epoch: 21163, Training Loss: 0.00969
Epoch: 21163, Training Loss: 0.01015
Epoch: 21163, Training Loss: 0.01016
Epoch: 21163, Training Loss: 0.01252
Epoch: 21164, Training Loss: 0.00969
Epoch: 21164, Training Loss: 0.01015
Epoch: 21164, Training Loss: 0.01016
Epoch: 21164, Training Loss: 0.01252
Epoch: 21165, Training Loss: 0.00969
Epoch: 21165, Training Loss: 0.01015
Epoch: 21165, Training Loss: 0.01016
Epoch: 21165, Training Loss: 0.01252
Epoch: 21166, Training Loss: 0.00969
Epoch: 21166, Training Loss: 0.01015
Epoch: 21166, Training Loss: 0.01016
Epoch: 21166, Training Loss: 0.01252
Epoch: 21167, Training Loss: 0.00969
Epoch: 21167, Training Loss: 0.01015
Epoch: 21167, Training Loss: 0.01016
Epoch: 21167, Training Loss: 0.01252
Epoch: 21168, Training Loss: 0.00969
Epoch: 21168, Training Loss: 0.01015
Epoch: 21168, Training Loss: 0.01016
Epoch: 21168, Training Loss: 0.01252
Epoch: 21169, Training Loss: 0.00969
Epoch: 21169, Training Loss: 0.01015
Epoch: 21169, Training Loss: 0.01016
Epoch: 21169, Training Loss: 0.01252
Epoch: 21170, Training Loss: 0.00969
Epoch: 21170, Training Loss: 0.01015
Epoch: 21170, Training Loss: 0.01016
Epoch: 21170, Training Loss: 0.01252
Epoch: 21171, Training Loss: 0.00969
Epoch: 21171, Training Loss: 0.01015
Epoch: 21171, Training Loss: 0.01016
Epoch: 21171, Training Loss: 0.01252
Epoch: 21172, Training Loss: 0.00969
Epoch: 21172, Training Loss: 0.01015
Epoch: 21172, Training Loss: 0.01016
Epoch: 21172, Training Loss: 0.01252
Epoch: 21173, Training Loss: 0.00969
Epoch: 21173, Training Loss: 0.01015
Epoch: 21173, Training Loss: 0.01016
Epoch: 21173, Training Loss: 0.01252
Epoch: 21174, Training Loss: 0.00969
Epoch: 21174, Training Loss: 0.01015
Epoch: 21174, Training Loss: 0.01016
Epoch: 21174, Training Loss: 0.01252
Epoch: 21175, Training Loss: 0.00969
Epoch: 21175, Training Loss: 0.01015
Epoch: 21175, Training Loss: 0.01016
Epoch: 21175, Training Loss: 0.01252
Epoch: 21176, Training Loss: 0.00969
Epoch: 21176, Training Loss: 0.01015
Epoch: 21176, Training Loss: 0.01016
Epoch: 21176, Training Loss: 0.01252
Epoch: 21177, Training Loss: 0.00969
Epoch: 21177, Training Loss: 0.01015
Epoch: 21177, Training Loss: 0.01016
Epoch: 21177, Training Loss: 0.01251
Epoch: 21178, Training Loss: 0.00969
Epoch: 21178, Training Loss: 0.01015
Epoch: 21178, Training Loss: 0.01016
Epoch: 21178, Training Loss: 0.01251
Epoch: 21179, Training Loss: 0.00969
Epoch: 21179, Training Loss: 0.01015
Epoch: 21179, Training Loss: 0.01016
Epoch: 21179, Training Loss: 0.01251
Epoch: 21180, Training Loss: 0.00969
Epoch: 21180, Training Loss: 0.01015
Epoch: 21180, Training Loss: 0.01016
Epoch: 21180, Training Loss: 0.01251
Epoch: 21181, Training Loss: 0.00969
Epoch: 21181, Training Loss: 0.01015
Epoch: 21181, Training Loss: 0.01016
Epoch: 21181, Training Loss: 0.01251
Epoch: 21182, Training Loss: 0.00969
Epoch: 21182, Training Loss: 0.01015
Epoch: 21182, Training Loss: 0.01016
Epoch: 21182, Training Loss: 0.01251
Epoch: 21183, Training Loss: 0.00969
Epoch: 21183, Training Loss: 0.01015
Epoch: 21183, Training Loss: 0.01016
Epoch: 21183, Training Loss: 0.01251
Epoch: 21184, Training Loss: 0.00969
Epoch: 21184, Training Loss: 0.01015
Epoch: 21184, Training Loss: 0.01016
Epoch: 21184, Training Loss: 0.01251
Epoch: 21185, Training Loss: 0.00969
Epoch: 21185, Training Loss: 0.01015
Epoch: 21185, Training Loss: 0.01016
Epoch: 21185, Training Loss: 0.01251
Epoch: 21186, Training Loss: 0.00969
Epoch: 21186, Training Loss: 0.01015
Epoch: 21186, Training Loss: 0.01016
Epoch: 21186, Training Loss: 0.01251
Epoch: 21187, Training Loss: 0.00969
Epoch: 21187, Training Loss: 0.01015
Epoch: 21187, Training Loss: 0.01016
Epoch: 21187, Training Loss: 0.01251
Epoch: 21188, Training Loss: 0.00969
Epoch: 21188, Training Loss: 0.01014
Epoch: 21188, Training Loss: 0.01016
Epoch: 21188, Training Loss: 0.01251
Epoch: 21189, Training Loss: 0.00969
Epoch: 21189, Training Loss: 0.01014
Epoch: 21189, Training Loss: 0.01016
Epoch: 21189, Training Loss: 0.01251
Epoch: 21190, Training Loss: 0.00969
Epoch: 21190, Training Loss: 0.01014
Epoch: 21190, Training Loss: 0.01016
Epoch: 21190, Training Loss: 0.01251
Epoch: 21191, Training Loss: 0.00969
Epoch: 21191, Training Loss: 0.01014
Epoch: 21191, Training Loss: 0.01016
Epoch: 21191, Training Loss: 0.01251
Epoch: 21192, Training Loss: 0.00969
Epoch: 21192, Training Loss: 0.01014
Epoch: 21192, Training Loss: 0.01016
Epoch: 21192, Training Loss: 0.01251
Epoch: 21193, Training Loss: 0.00969
Epoch: 21193, Training Loss: 0.01014
Epoch: 21193, Training Loss: 0.01016
Epoch: 21193, Training Loss: 0.01251
Epoch: 21194, Training Loss: 0.00969
Epoch: 21194, Training Loss: 0.01014
Epoch: 21194, Training Loss: 0.01016
Epoch: 21194, Training Loss: 0.01251
Epoch: 21195, Training Loss: 0.00969
Epoch: 21195, Training Loss: 0.01014
Epoch: 21195, Training Loss: 0.01016
Epoch: 21195, Training Loss: 0.01251
Epoch: 21196, Training Loss: 0.00969
Epoch: 21196, Training Loss: 0.01014
Epoch: 21196, Training Loss: 0.01016
Epoch: 21196, Training Loss: 0.01251
Epoch: 21197, Training Loss: 0.00969
Epoch: 21197, Training Loss: 0.01014
Epoch: 21197, Training Loss: 0.01016
Epoch: 21197, Training Loss: 0.01251
Epoch: 21198, Training Loss: 0.00968
Epoch: 21198, Training Loss: 0.01014
Epoch: 21198, Training Loss: 0.01016
Epoch: 21198, Training Loss: 0.01251
Epoch: 21199, Training Loss: 0.00968
Epoch: 21199, Training Loss: 0.01014
Epoch: 21199, Training Loss: 0.01015
Epoch: 21199, Training Loss: 0.01251
Epoch: 21200, Training Loss: 0.00968
Epoch: 21200, Training Loss: 0.01014
Epoch: 21200, Training Loss: 0.01015
Epoch: 21200, Training Loss: 0.01251
Epoch: 21201, Training Loss: 0.00968
Epoch: 21201, Training Loss: 0.01014
Epoch: 21201, Training Loss: 0.01015
Epoch: 21201, Training Loss: 0.01251
Epoch: 21202, Training Loss: 0.00968
Epoch: 21202, Training Loss: 0.01014
Epoch: 21202, Training Loss: 0.01015
Epoch: 21202, Training Loss: 0.01251
Epoch: 21203, Training Loss: 0.00968
Epoch: 21203, Training Loss: 0.01014
Epoch: 21203, Training Loss: 0.01015
Epoch: 21203, Training Loss: 0.01251
Epoch: 21204, Training Loss: 0.00968
Epoch: 21204, Training Loss: 0.01014
Epoch: 21204, Training Loss: 0.01015
Epoch: 21204, Training Loss: 0.01251
Epoch: 21205, Training Loss: 0.00968
Epoch: 21205, Training Loss: 0.01014
Epoch: 21205, Training Loss: 0.01015
Epoch: 21205, Training Loss: 0.01250
Epoch: 21206, Training Loss: 0.00968
Epoch: 21206, Training Loss: 0.01014
Epoch: 21206, Training Loss: 0.01015
Epoch: 21206, Training Loss: 0.01250
Epoch: 21207, Training Loss: 0.00968
Epoch: 21207, Training Loss: 0.01014
Epoch: 21207, Training Loss: 0.01015
Epoch: 21207, Training Loss: 0.01250
Epoch: 21208, Training Loss: 0.00968
Epoch: 21208, Training Loss: 0.01014
Epoch: 21208, Training Loss: 0.01015
Epoch: 21208, Training Loss: 0.01250
Epoch: 21209, Training Loss: 0.00968
Epoch: 21209, Training Loss: 0.01014
Epoch: 21209, Training Loss: 0.01015
Epoch: 21209, Training Loss: 0.01250
Epoch: 21210, Training Loss: 0.00968
Epoch: 21210, Training Loss: 0.01014
Epoch: 21210, Training Loss: 0.01015
Epoch: 21210, Training Loss: 0.01250
Epoch: 21211, Training Loss: 0.00968
Epoch: 21211, Training Loss: 0.01014
Epoch: 21211, Training Loss: 0.01015
Epoch: 21211, Training Loss: 0.01250
Epoch: 21212, Training Loss: 0.00968
Epoch: 21212, Training Loss: 0.01014
Epoch: 21212, Training Loss: 0.01015
Epoch: 21212, Training Loss: 0.01250
Epoch: 21213, Training Loss: 0.00968
Epoch: 21213, Training Loss: 0.01014
Epoch: 21213, Training Loss: 0.01015
Epoch: 21213, Training Loss: 0.01250
Epoch: 21214, Training Loss: 0.00968
Epoch: 21214, Training Loss: 0.01014
Epoch: 21214, Training Loss: 0.01015
Epoch: 21214, Training Loss: 0.01250
Epoch: 21215, Training Loss: 0.00968
Epoch: 21215, Training Loss: 0.01014
Epoch: 21215, Training Loss: 0.01015
Epoch: 21215, Training Loss: 0.01250
Epoch: 21216, Training Loss: 0.00968
Epoch: 21216, Training Loss: 0.01014
Epoch: 21216, Training Loss: 0.01015
Epoch: 21216, Training Loss: 0.01250
Epoch: 21217, Training Loss: 0.00968
Epoch: 21217, Training Loss: 0.01014
Epoch: 21217, Training Loss: 0.01015
Epoch: 21217, Training Loss: 0.01250
Epoch: 21218, Training Loss: 0.00968
Epoch: 21218, Training Loss: 0.01014
Epoch: 21218, Training Loss: 0.01015
Epoch: 21218, Training Loss: 0.01250
Epoch: 21219, Training Loss: 0.00968
Epoch: 21219, Training Loss: 0.01014
Epoch: 21219, Training Loss: 0.01015
Epoch: 21219, Training Loss: 0.01250
Epoch: 21220, Training Loss: 0.00968
Epoch: 21220, Training Loss: 0.01014
Epoch: 21220, Training Loss: 0.01015
Epoch: 21220, Training Loss: 0.01250
Epoch: 21221, Training Loss: 0.00968
Epoch: 21221, Training Loss: 0.01014
Epoch: 21221, Training Loss: 0.01015
Epoch: 21221, Training Loss: 0.01250
Epoch: 21222, Training Loss: 0.00968
Epoch: 21222, Training Loss: 0.01014
Epoch: 21222, Training Loss: 0.01015
Epoch: 21222, Training Loss: 0.01250
Epoch: 21223, Training Loss: 0.00968
Epoch: 21223, Training Loss: 0.01013
Epoch: 21223, Training Loss: 0.01015
Epoch: 21223, Training Loss: 0.01250
Epoch: 21224, Training Loss: 0.00968
Epoch: 21224, Training Loss: 0.01013
Epoch: 21224, Training Loss: 0.01015
Epoch: 21224, Training Loss: 0.01250
Epoch: 21225, Training Loss: 0.00968
Epoch: 21225, Training Loss: 0.01013
Epoch: 21225, Training Loss: 0.01015
Epoch: 21225, Training Loss: 0.01250
Epoch: 21226, Training Loss: 0.00968
Epoch: 21226, Training Loss: 0.01013
Epoch: 21226, Training Loss: 0.01015
Epoch: 21226, Training Loss: 0.01250
Epoch: 21227, Training Loss: 0.00968
Epoch: 21227, Training Loss: 0.01013
Epoch: 21227, Training Loss: 0.01015
Epoch: 21227, Training Loss: 0.01250
Epoch: 21228, Training Loss: 0.00968
Epoch: 21228, Training Loss: 0.01013
Epoch: 21228, Training Loss: 0.01015
Epoch: 21228, Training Loss: 0.01250
Epoch: 21229, Training Loss: 0.00968
Epoch: 21229, Training Loss: 0.01013
Epoch: 21229, Training Loss: 0.01015
Epoch: 21229, Training Loss: 0.01250
Epoch: 21230, Training Loss: 0.00968
Epoch: 21230, Training Loss: 0.01013
Epoch: 21230, Training Loss: 0.01015
Epoch: 21230, Training Loss: 0.01250
Epoch: 21231, Training Loss: 0.00968
Epoch: 21231, Training Loss: 0.01013
Epoch: 21231, Training Loss: 0.01015
Epoch: 21231, Training Loss: 0.01250
Epoch: 21232, Training Loss: 0.00968
Epoch: 21232, Training Loss: 0.01013
Epoch: 21232, Training Loss: 0.01015
Epoch: 21232, Training Loss: 0.01250
Epoch: 21233, Training Loss: 0.00968
Epoch: 21233, Training Loss: 0.01013
Epoch: 21233, Training Loss: 0.01015
Epoch: 21233, Training Loss: 0.01250
Epoch: 21234, Training Loss: 0.00968
Epoch: 21234, Training Loss: 0.01013
Epoch: 21234, Training Loss: 0.01014
Epoch: 21234, Training Loss: 0.01249
Epoch: 21235, Training Loss: 0.00968
Epoch: 21235, Training Loss: 0.01013
Epoch: 21235, Training Loss: 0.01014
Epoch: 21235, Training Loss: 0.01249
Epoch: 21236, Training Loss: 0.00967
Epoch: 21236, Training Loss: 0.01013
Epoch: 21236, Training Loss: 0.01014
Epoch: 21236, Training Loss: 0.01249
Epoch: 21237, Training Loss: 0.00967
Epoch: 21237, Training Loss: 0.01013
Epoch: 21237, Training Loss: 0.01014
Epoch: 21237, Training Loss: 0.01249
Epoch: 21238, Training Loss: 0.00967
Epoch: 21238, Training Loss: 0.01013
Epoch: 21238, Training Loss: 0.01014
Epoch: 21238, Training Loss: 0.01249
Epoch: 21239, Training Loss: 0.00967
Epoch: 21239, Training Loss: 0.01013
Epoch: 21239, Training Loss: 0.01014
Epoch: 21239, Training Loss: 0.01249
Epoch: 21240, Training Loss: 0.00967
Epoch: 21240, Training Loss: 0.01013
Epoch: 21240, Training Loss: 0.01014
[... output truncated: the loop prints four per-sample losses every epoch. Between epochs 21240 and 21484 the four losses drift down only slightly, from roughly (0.00967, 0.01013, 0.01014, 0.01249) to (0.00961, 0.01006, 0.01008, 0.01241) — still above the 0.008 target, so convergence in this phase is very slow ...]
Epoch: 21484, Training Loss: 0.01006
Epoch: 21484, Training Loss: 0.01008
Epoch: 21484, Training Loss: 0.01241
Epoch: 21485, Training Loss: 0.00961
Epoch: 21485, Training Loss: 0.01006
Epoch: 21485, Training Loss: 0.01008
Epoch: 21485, Training Loss: 0.01241
Epoch: 21486, Training Loss: 0.00961
Epoch: 21486, Training Loss: 0.01006
Epoch: 21486, Training Loss: 0.01008
Epoch: 21486, Training Loss: 0.01241
Epoch: 21487, Training Loss: 0.00961
Epoch: 21487, Training Loss: 0.01006
Epoch: 21487, Training Loss: 0.01007
Epoch: 21487, Training Loss: 0.01241
Epoch: 21488, Training Loss: 0.00961
Epoch: 21488, Training Loss: 0.01006
Epoch: 21488, Training Loss: 0.01007
Epoch: 21488, Training Loss: 0.01241
Epoch: 21489, Training Loss: 0.00961
Epoch: 21489, Training Loss: 0.01006
Epoch: 21489, Training Loss: 0.01007
Epoch: 21489, Training Loss: 0.01241
Epoch: 21490, Training Loss: 0.00961
Epoch: 21490, Training Loss: 0.01006
Epoch: 21490, Training Loss: 0.01007
Epoch: 21490, Training Loss: 0.01241
Epoch: 21491, Training Loss: 0.00961
Epoch: 21491, Training Loss: 0.01006
Epoch: 21491, Training Loss: 0.01007
Epoch: 21491, Training Loss: 0.01241
Epoch: 21492, Training Loss: 0.00961
Epoch: 21492, Training Loss: 0.01006
Epoch: 21492, Training Loss: 0.01007
Epoch: 21492, Training Loss: 0.01241
Epoch: 21493, Training Loss: 0.00961
Epoch: 21493, Training Loss: 0.01006
Epoch: 21493, Training Loss: 0.01007
Epoch: 21493, Training Loss: 0.01240
Epoch: 21494, Training Loss: 0.00961
Epoch: 21494, Training Loss: 0.01006
Epoch: 21494, Training Loss: 0.01007
Epoch: 21494, Training Loss: 0.01240
Epoch: 21495, Training Loss: 0.00961
Epoch: 21495, Training Loss: 0.01006
Epoch: 21495, Training Loss: 0.01007
Epoch: 21495, Training Loss: 0.01240
Epoch: 21496, Training Loss: 0.00961
Epoch: 21496, Training Loss: 0.01006
Epoch: 21496, Training Loss: 0.01007
Epoch: 21496, Training Loss: 0.01240
Epoch: 21497, Training Loss: 0.00961
Epoch: 21497, Training Loss: 0.01006
Epoch: 21497, Training Loss: 0.01007
Epoch: 21497, Training Loss: 0.01240
Epoch: 21498, Training Loss: 0.00961
Epoch: 21498, Training Loss: 0.01006
Epoch: 21498, Training Loss: 0.01007
Epoch: 21498, Training Loss: 0.01240
Epoch: 21499, Training Loss: 0.00961
Epoch: 21499, Training Loss: 0.01006
Epoch: 21499, Training Loss: 0.01007
Epoch: 21499, Training Loss: 0.01240
Epoch: 21500, Training Loss: 0.00961
Epoch: 21500, Training Loss: 0.01006
Epoch: 21500, Training Loss: 0.01007
Epoch: 21500, Training Loss: 0.01240
Epoch: 21501, Training Loss: 0.00961
Epoch: 21501, Training Loss: 0.01006
Epoch: 21501, Training Loss: 0.01007
Epoch: 21501, Training Loss: 0.01240
Epoch: 21502, Training Loss: 0.00961
Epoch: 21502, Training Loss: 0.01006
Epoch: 21502, Training Loss: 0.01007
Epoch: 21502, Training Loss: 0.01240
Epoch: 21503, Training Loss: 0.00961
Epoch: 21503, Training Loss: 0.01006
Epoch: 21503, Training Loss: 0.01007
Epoch: 21503, Training Loss: 0.01240
Epoch: 21504, Training Loss: 0.00961
Epoch: 21504, Training Loss: 0.01006
Epoch: 21504, Training Loss: 0.01007
Epoch: 21504, Training Loss: 0.01240
Epoch: 21505, Training Loss: 0.00961
Epoch: 21505, Training Loss: 0.01006
Epoch: 21505, Training Loss: 0.01007
Epoch: 21505, Training Loss: 0.01240
Epoch: 21506, Training Loss: 0.00961
Epoch: 21506, Training Loss: 0.01006
Epoch: 21506, Training Loss: 0.01007
Epoch: 21506, Training Loss: 0.01240
Epoch: 21507, Training Loss: 0.00961
Epoch: 21507, Training Loss: 0.01006
Epoch: 21507, Training Loss: 0.01007
Epoch: 21507, Training Loss: 0.01240
Epoch: 21508, Training Loss: 0.00960
Epoch: 21508, Training Loss: 0.01006
Epoch: 21508, Training Loss: 0.01007
Epoch: 21508, Training Loss: 0.01240
Epoch: 21509, Training Loss: 0.00960
Epoch: 21509, Training Loss: 0.01006
Epoch: 21509, Training Loss: 0.01007
Epoch: 21509, Training Loss: 0.01240
Epoch: 21510, Training Loss: 0.00960
Epoch: 21510, Training Loss: 0.01006
Epoch: 21510, Training Loss: 0.01007
Epoch: 21510, Training Loss: 0.01240
Epoch: 21511, Training Loss: 0.00960
Epoch: 21511, Training Loss: 0.01006
Epoch: 21511, Training Loss: 0.01007
Epoch: 21511, Training Loss: 0.01240
Epoch: 21512, Training Loss: 0.00960
Epoch: 21512, Training Loss: 0.01006
Epoch: 21512, Training Loss: 0.01007
Epoch: 21512, Training Loss: 0.01240
Epoch: 21513, Training Loss: 0.00960
Epoch: 21513, Training Loss: 0.01005
Epoch: 21513, Training Loss: 0.01007
Epoch: 21513, Training Loss: 0.01240
Epoch: 21514, Training Loss: 0.00960
Epoch: 21514, Training Loss: 0.01005
Epoch: 21514, Training Loss: 0.01007
Epoch: 21514, Training Loss: 0.01240
Epoch: 21515, Training Loss: 0.00960
Epoch: 21515, Training Loss: 0.01005
Epoch: 21515, Training Loss: 0.01007
Epoch: 21515, Training Loss: 0.01240
Epoch: 21516, Training Loss: 0.00960
Epoch: 21516, Training Loss: 0.01005
Epoch: 21516, Training Loss: 0.01007
Epoch: 21516, Training Loss: 0.01240
Epoch: 21517, Training Loss: 0.00960
Epoch: 21517, Training Loss: 0.01005
Epoch: 21517, Training Loss: 0.01007
Epoch: 21517, Training Loss: 0.01240
Epoch: 21518, Training Loss: 0.00960
Epoch: 21518, Training Loss: 0.01005
Epoch: 21518, Training Loss: 0.01007
Epoch: 21518, Training Loss: 0.01240
Epoch: 21519, Training Loss: 0.00960
Epoch: 21519, Training Loss: 0.01005
Epoch: 21519, Training Loss: 0.01007
Epoch: 21519, Training Loss: 0.01240
Epoch: 21520, Training Loss: 0.00960
Epoch: 21520, Training Loss: 0.01005
Epoch: 21520, Training Loss: 0.01007
Epoch: 21520, Training Loss: 0.01240
Epoch: 21521, Training Loss: 0.00960
Epoch: 21521, Training Loss: 0.01005
Epoch: 21521, Training Loss: 0.01007
Epoch: 21521, Training Loss: 0.01240
Epoch: 21522, Training Loss: 0.00960
Epoch: 21522, Training Loss: 0.01005
Epoch: 21522, Training Loss: 0.01007
Epoch: 21522, Training Loss: 0.01240
Epoch: 21523, Training Loss: 0.00960
Epoch: 21523, Training Loss: 0.01005
Epoch: 21523, Training Loss: 0.01006
Epoch: 21523, Training Loss: 0.01239
Epoch: 21524, Training Loss: 0.00960
Epoch: 21524, Training Loss: 0.01005
Epoch: 21524, Training Loss: 0.01006
Epoch: 21524, Training Loss: 0.01239
Epoch: 21525, Training Loss: 0.00960
Epoch: 21525, Training Loss: 0.01005
Epoch: 21525, Training Loss: 0.01006
Epoch: 21525, Training Loss: 0.01239
Epoch: 21526, Training Loss: 0.00960
Epoch: 21526, Training Loss: 0.01005
Epoch: 21526, Training Loss: 0.01006
Epoch: 21526, Training Loss: 0.01239
Epoch: 21527, Training Loss: 0.00960
Epoch: 21527, Training Loss: 0.01005
Epoch: 21527, Training Loss: 0.01006
Epoch: 21527, Training Loss: 0.01239
Epoch: 21528, Training Loss: 0.00960
Epoch: 21528, Training Loss: 0.01005
Epoch: 21528, Training Loss: 0.01006
Epoch: 21528, Training Loss: 0.01239
Epoch: 21529, Training Loss: 0.00960
Epoch: 21529, Training Loss: 0.01005
Epoch: 21529, Training Loss: 0.01006
Epoch: 21529, Training Loss: 0.01239
Epoch: 21530, Training Loss: 0.00960
Epoch: 21530, Training Loss: 0.01005
Epoch: 21530, Training Loss: 0.01006
Epoch: 21530, Training Loss: 0.01239
Epoch: 21531, Training Loss: 0.00960
Epoch: 21531, Training Loss: 0.01005
Epoch: 21531, Training Loss: 0.01006
Epoch: 21531, Training Loss: 0.01239
Epoch: 21532, Training Loss: 0.00960
Epoch: 21532, Training Loss: 0.01005
Epoch: 21532, Training Loss: 0.01006
Epoch: 21532, Training Loss: 0.01239
Epoch: 21533, Training Loss: 0.00960
Epoch: 21533, Training Loss: 0.01005
Epoch: 21533, Training Loss: 0.01006
Epoch: 21533, Training Loss: 0.01239
Epoch: 21534, Training Loss: 0.00960
Epoch: 21534, Training Loss: 0.01005
Epoch: 21534, Training Loss: 0.01006
Epoch: 21534, Training Loss: 0.01239
Epoch: 21535, Training Loss: 0.00960
Epoch: 21535, Training Loss: 0.01005
Epoch: 21535, Training Loss: 0.01006
Epoch: 21535, Training Loss: 0.01239
Epoch: 21536, Training Loss: 0.00960
Epoch: 21536, Training Loss: 0.01005
Epoch: 21536, Training Loss: 0.01006
Epoch: 21536, Training Loss: 0.01239
Epoch: 21537, Training Loss: 0.00960
Epoch: 21537, Training Loss: 0.01005
Epoch: 21537, Training Loss: 0.01006
Epoch: 21537, Training Loss: 0.01239
Epoch: 21538, Training Loss: 0.00960
Epoch: 21538, Training Loss: 0.01005
Epoch: 21538, Training Loss: 0.01006
Epoch: 21538, Training Loss: 0.01239
Epoch: 21539, Training Loss: 0.00960
Epoch: 21539, Training Loss: 0.01005
Epoch: 21539, Training Loss: 0.01006
Epoch: 21539, Training Loss: 0.01239
Epoch: 21540, Training Loss: 0.00960
Epoch: 21540, Training Loss: 0.01005
Epoch: 21540, Training Loss: 0.01006
Epoch: 21540, Training Loss: 0.01239
Epoch: 21541, Training Loss: 0.00960
Epoch: 21541, Training Loss: 0.01005
Epoch: 21541, Training Loss: 0.01006
Epoch: 21541, Training Loss: 0.01239
Epoch: 21542, Training Loss: 0.00960
Epoch: 21542, Training Loss: 0.01005
Epoch: 21542, Training Loss: 0.01006
Epoch: 21542, Training Loss: 0.01239
Epoch: 21543, Training Loss: 0.00960
Epoch: 21543, Training Loss: 0.01005
Epoch: 21543, Training Loss: 0.01006
Epoch: 21543, Training Loss: 0.01239
Epoch: 21544, Training Loss: 0.00960
Epoch: 21544, Training Loss: 0.01005
Epoch: 21544, Training Loss: 0.01006
Epoch: 21544, Training Loss: 0.01239
Epoch: 21545, Training Loss: 0.00960
Epoch: 21545, Training Loss: 0.01005
Epoch: 21545, Training Loss: 0.01006
Epoch: 21545, Training Loss: 0.01239
Epoch: 21546, Training Loss: 0.00960
Epoch: 21546, Training Loss: 0.01005
Epoch: 21546, Training Loss: 0.01006
Epoch: 21546, Training Loss: 0.01239
Epoch: 21547, Training Loss: 0.00960
Epoch: 21547, Training Loss: 0.01005
Epoch: 21547, Training Loss: 0.01006
Epoch: 21547, Training Loss: 0.01239
Epoch: 21548, Training Loss: 0.00959
Epoch: 21548, Training Loss: 0.01005
Epoch: 21548, Training Loss: 0.01006
Epoch: 21548, Training Loss: 0.01239
Epoch: 21549, Training Loss: 0.00959
Epoch: 21549, Training Loss: 0.01004
Epoch: 21549, Training Loss: 0.01006
Epoch: 21549, Training Loss: 0.01239
Epoch: 21550, Training Loss: 0.00959
Epoch: 21550, Training Loss: 0.01004
Epoch: 21550, Training Loss: 0.01006
Epoch: 21550, Training Loss: 0.01239
Epoch: 21551, Training Loss: 0.00959
Epoch: 21551, Training Loss: 0.01004
Epoch: 21551, Training Loss: 0.01006
Epoch: 21551, Training Loss: 0.01239
Epoch: 21552, Training Loss: 0.00959
Epoch: 21552, Training Loss: 0.01004
Epoch: 21552, Training Loss: 0.01006
Epoch: 21552, Training Loss: 0.01238
Epoch: 21553, Training Loss: 0.00959
Epoch: 21553, Training Loss: 0.01004
Epoch: 21553, Training Loss: 0.01006
Epoch: 21553, Training Loss: 0.01238
Epoch: 21554, Training Loss: 0.00959
Epoch: 21554, Training Loss: 0.01004
Epoch: 21554, Training Loss: 0.01006
Epoch: 21554, Training Loss: 0.01238
Epoch: 21555, Training Loss: 0.00959
Epoch: 21555, Training Loss: 0.01004
Epoch: 21555, Training Loss: 0.01006
Epoch: 21555, Training Loss: 0.01238
Epoch: 21556, Training Loss: 0.00959
Epoch: 21556, Training Loss: 0.01004
Epoch: 21556, Training Loss: 0.01006
Epoch: 21556, Training Loss: 0.01238
Epoch: 21557, Training Loss: 0.00959
Epoch: 21557, Training Loss: 0.01004
Epoch: 21557, Training Loss: 0.01006
Epoch: 21557, Training Loss: 0.01238
Epoch: 21558, Training Loss: 0.00959
Epoch: 21558, Training Loss: 0.01004
Epoch: 21558, Training Loss: 0.01006
Epoch: 21558, Training Loss: 0.01238
Epoch: 21559, Training Loss: 0.00959
Epoch: 21559, Training Loss: 0.01004
Epoch: 21559, Training Loss: 0.01006
Epoch: 21559, Training Loss: 0.01238
Epoch: 21560, Training Loss: 0.00959
Epoch: 21560, Training Loss: 0.01004
Epoch: 21560, Training Loss: 0.01005
Epoch: 21560, Training Loss: 0.01238
Epoch: 21561, Training Loss: 0.00959
Epoch: 21561, Training Loss: 0.01004
Epoch: 21561, Training Loss: 0.01005
Epoch: 21561, Training Loss: 0.01238
Epoch: 21562, Training Loss: 0.00959
Epoch: 21562, Training Loss: 0.01004
Epoch: 21562, Training Loss: 0.01005
Epoch: 21562, Training Loss: 0.01238
Epoch: 21563, Training Loss: 0.00959
Epoch: 21563, Training Loss: 0.01004
Epoch: 21563, Training Loss: 0.01005
Epoch: 21563, Training Loss: 0.01238
Epoch: 21564, Training Loss: 0.00959
Epoch: 21564, Training Loss: 0.01004
Epoch: 21564, Training Loss: 0.01005
Epoch: 21564, Training Loss: 0.01238
Epoch: 21565, Training Loss: 0.00959
Epoch: 21565, Training Loss: 0.01004
Epoch: 21565, Training Loss: 0.01005
Epoch: 21565, Training Loss: 0.01238
Epoch: 21566, Training Loss: 0.00959
Epoch: 21566, Training Loss: 0.01004
Epoch: 21566, Training Loss: 0.01005
Epoch: 21566, Training Loss: 0.01238
Epoch: 21567, Training Loss: 0.00959
Epoch: 21567, Training Loss: 0.01004
Epoch: 21567, Training Loss: 0.01005
Epoch: 21567, Training Loss: 0.01238
Epoch: 21568, Training Loss: 0.00959
Epoch: 21568, Training Loss: 0.01004
Epoch: 21568, Training Loss: 0.01005
Epoch: 21568, Training Loss: 0.01238
Epoch: 21569, Training Loss: 0.00959
Epoch: 21569, Training Loss: 0.01004
Epoch: 21569, Training Loss: 0.01005
Epoch: 21569, Training Loss: 0.01238
Epoch: 21570, Training Loss: 0.00959
Epoch: 21570, Training Loss: 0.01004
Epoch: 21570, Training Loss: 0.01005
Epoch: 21570, Training Loss: 0.01238
Epoch: 21571, Training Loss: 0.00959
Epoch: 21571, Training Loss: 0.01004
Epoch: 21571, Training Loss: 0.01005
Epoch: 21571, Training Loss: 0.01238
Epoch: 21572, Training Loss: 0.00959
Epoch: 21572, Training Loss: 0.01004
Epoch: 21572, Training Loss: 0.01005
Epoch: 21572, Training Loss: 0.01238
Epoch: 21573, Training Loss: 0.00959
Epoch: 21573, Training Loss: 0.01004
Epoch: 21573, Training Loss: 0.01005
Epoch: 21573, Training Loss: 0.01238
Epoch: 21574, Training Loss: 0.00959
Epoch: 21574, Training Loss: 0.01004
Epoch: 21574, Training Loss: 0.01005
Epoch: 21574, Training Loss: 0.01238
Epoch: 21575, Training Loss: 0.00959
Epoch: 21575, Training Loss: 0.01004
Epoch: 21575, Training Loss: 0.01005
Epoch: 21575, Training Loss: 0.01238
Epoch: 21576, Training Loss: 0.00959
Epoch: 21576, Training Loss: 0.01004
Epoch: 21576, Training Loss: 0.01005
Epoch: 21576, Training Loss: 0.01238
Epoch: 21577, Training Loss: 0.00959
Epoch: 21577, Training Loss: 0.01004
Epoch: 21577, Training Loss: 0.01005
Epoch: 21577, Training Loss: 0.01238
Epoch: 21578, Training Loss: 0.00959
Epoch: 21578, Training Loss: 0.01004
Epoch: 21578, Training Loss: 0.01005
Epoch: 21578, Training Loss: 0.01238
Epoch: 21579, Training Loss: 0.00959
Epoch: 21579, Training Loss: 0.01004
Epoch: 21579, Training Loss: 0.01005
Epoch: 21579, Training Loss: 0.01238
Epoch: 21580, Training Loss: 0.00959
Epoch: 21580, Training Loss: 0.01004
Epoch: 21580, Training Loss: 0.01005
Epoch: 21580, Training Loss: 0.01238
Epoch: 21581, Training Loss: 0.00959
Epoch: 21581, Training Loss: 0.01004
Epoch: 21581, Training Loss: 0.01005
Epoch: 21581, Training Loss: 0.01237
Epoch: 21582, Training Loss: 0.00959
Epoch: 21582, Training Loss: 0.01004
Epoch: 21582, Training Loss: 0.01005
Epoch: 21582, Training Loss: 0.01237
Epoch: 21583, Training Loss: 0.00959
Epoch: 21583, Training Loss: 0.01004
Epoch: 21583, Training Loss: 0.01005
Epoch: 21583, Training Loss: 0.01237
Epoch: 21584, Training Loss: 0.00959
Epoch: 21584, Training Loss: 0.01004
Epoch: 21584, Training Loss: 0.01005
Epoch: 21584, Training Loss: 0.01237
Epoch: 21585, Training Loss: 0.00959
Epoch: 21585, Training Loss: 0.01004
Epoch: 21585, Training Loss: 0.01005
Epoch: 21585, Training Loss: 0.01237
Epoch: 21586, Training Loss: 0.00959
Epoch: 21586, Training Loss: 0.01003
Epoch: 21586, Training Loss: 0.01005
Epoch: 21586, Training Loss: 0.01237
Epoch: 21587, Training Loss: 0.00958
Epoch: 21587, Training Loss: 0.01003
Epoch: 21587, Training Loss: 0.01005
Epoch: 21587, Training Loss: 0.01237
Epoch: 21588, Training Loss: 0.00958
Epoch: 21588, Training Loss: 0.01003
Epoch: 21588, Training Loss: 0.01005
Epoch: 21588, Training Loss: 0.01237
Epoch: 21589, Training Loss: 0.00958
Epoch: 21589, Training Loss: 0.01003
Epoch: 21589, Training Loss: 0.01005
Epoch: 21589, Training Loss: 0.01237
Epoch: 21590, Training Loss: 0.00958
Epoch: 21590, Training Loss: 0.01003
Epoch: 21590, Training Loss: 0.01005
Epoch: 21590, Training Loss: 0.01237
Epoch: 21591, Training Loss: 0.00958
Epoch: 21591, Training Loss: 0.01003
Epoch: 21591, Training Loss: 0.01005
Epoch: 21591, Training Loss: 0.01237
Epoch: 21592, Training Loss: 0.00958
Epoch: 21592, Training Loss: 0.01003
Epoch: 21592, Training Loss: 0.01005
Epoch: 21592, Training Loss: 0.01237
Epoch: 21593, Training Loss: 0.00958
Epoch: 21593, Training Loss: 0.01003
Epoch: 21593, Training Loss: 0.01005
Epoch: 21593, Training Loss: 0.01237
Epoch: 21594, Training Loss: 0.00958
Epoch: 21594, Training Loss: 0.01003
Epoch: 21594, Training Loss: 0.01005
Epoch: 21594, Training Loss: 0.01237
Epoch: 21595, Training Loss: 0.00958
Epoch: 21595, Training Loss: 0.01003
Epoch: 21595, Training Loss: 0.01005
Epoch: 21595, Training Loss: 0.01237
Epoch: 21596, Training Loss: 0.00958
Epoch: 21596, Training Loss: 0.01003
Epoch: 21596, Training Loss: 0.01005
Epoch: 21596, Training Loss: 0.01237
Epoch: 21597, Training Loss: 0.00958
Epoch: 21597, Training Loss: 0.01003
Epoch: 21597, Training Loss: 0.01004
Epoch: 21597, Training Loss: 0.01237
Epoch: 21598, Training Loss: 0.00958
Epoch: 21598, Training Loss: 0.01003
Epoch: 21598, Training Loss: 0.01004
Epoch: 21598, Training Loss: 0.01237
Epoch: 21599, Training Loss: 0.00958
Epoch: 21599, Training Loss: 0.01003
Epoch: 21599, Training Loss: 0.01004
Epoch: 21599, Training Loss: 0.01237
Epoch: 21600, Training Loss: 0.00958
Epoch: 21600, Training Loss: 0.01003
Epoch: 21600, Training Loss: 0.01004
Epoch: 21600, Training Loss: 0.01237
Epoch: 21601, Training Loss: 0.00958
Epoch: 21601, Training Loss: 0.01003
Epoch: 21601, Training Loss: 0.01004
Epoch: 21601, Training Loss: 0.01237
Epoch: 21602, Training Loss: 0.00958
Epoch: 21602, Training Loss: 0.01003
Epoch: 21602, Training Loss: 0.01004
Epoch: 21602, Training Loss: 0.01237
Epoch: 21603, Training Loss: 0.00958
Epoch: 21603, Training Loss: 0.01003
Epoch: 21603, Training Loss: 0.01004
Epoch: 21603, Training Loss: 0.01237
Epoch: 21604, Training Loss: 0.00958
Epoch: 21604, Training Loss: 0.01003
Epoch: 21604, Training Loss: 0.01004
Epoch: 21604, Training Loss: 0.01237
Epoch: 21605, Training Loss: 0.00958
Epoch: 21605, Training Loss: 0.01003
Epoch: 21605, Training Loss: 0.01004
Epoch: 21605, Training Loss: 0.01237
Epoch: 21606, Training Loss: 0.00958
Epoch: 21606, Training Loss: 0.01003
Epoch: 21606, Training Loss: 0.01004
Epoch: 21606, Training Loss: 0.01237
Epoch: 21607, Training Loss: 0.00958
Epoch: 21607, Training Loss: 0.01003
Epoch: 21607, Training Loss: 0.01004
Epoch: 21607, Training Loss: 0.01237
Epoch: 21608, Training Loss: 0.00958
Epoch: 21608, Training Loss: 0.01003
Epoch: 21608, Training Loss: 0.01004
Epoch: 21608, Training Loss: 0.01237
Epoch: 21609, Training Loss: 0.00958
Epoch: 21609, Training Loss: 0.01003
Epoch: 21609, Training Loss: 0.01004
Epoch: 21609, Training Loss: 0.01237
Epoch: 21610, Training Loss: 0.00958
Epoch: 21610, Training Loss: 0.01003
Epoch: 21610, Training Loss: 0.01004
Epoch: 21610, Training Loss: 0.01237
Epoch: 21611, Training Loss: 0.00958
Epoch: 21611, Training Loss: 0.01003
Epoch: 21611, Training Loss: 0.01004
Epoch: 21611, Training Loss: 0.01236
Epoch: 21612, Training Loss: 0.00958
Epoch: 21612, Training Loss: 0.01003
Epoch: 21612, Training Loss: 0.01004
Epoch: 21612, Training Loss: 0.01236
Epoch: 21613, Training Loss: 0.00958
Epoch: 21613, Training Loss: 0.01003
Epoch: 21613, Training Loss: 0.01004
Epoch: 21613, Training Loss: 0.01236
Epoch: 21614, Training Loss: 0.00958
Epoch: 21614, Training Loss: 0.01003
Epoch: 21614, Training Loss: 0.01004
Epoch: 21614, Training Loss: 0.01236
Epoch: 21615, Training Loss: 0.00958
Epoch: 21615, Training Loss: 0.01003
Epoch: 21615, Training Loss: 0.01004
Epoch: 21615, Training Loss: 0.01236
Epoch: 21616, Training Loss: 0.00958
Epoch: 21616, Training Loss: 0.01003
Epoch: 21616, Training Loss: 0.01004
Epoch: 21616, Training Loss: 0.01236
Epoch: 21617, Training Loss: 0.00958
Epoch: 21617, Training Loss: 0.01003
Epoch: 21617, Training Loss: 0.01004
Epoch: 21617, Training Loss: 0.01236
Epoch: 21618, Training Loss: 0.00958
Epoch: 21618, Training Loss: 0.01003
Epoch: 21618, Training Loss: 0.01004
Epoch: 21618, Training Loss: 0.01236
Epoch: 21619, Training Loss: 0.00958
Epoch: 21619, Training Loss: 0.01003
Epoch: 21619, Training Loss: 0.01004
Epoch: 21619, Training Loss: 0.01236
Epoch: 21620, Training Loss: 0.00958
Epoch: 21620, Training Loss: 0.01003
Epoch: 21620, Training Loss: 0.01004
Epoch: 21620, Training Loss: 0.01236
Epoch: 21621, Training Loss: 0.00958
Epoch: 21621, Training Loss: 0.01003
Epoch: 21621, Training Loss: 0.01004
Epoch: 21621, Training Loss: 0.01236
Epoch: 21622, Training Loss: 0.00958
Epoch: 21622, Training Loss: 0.01003
Epoch: 21622, Training Loss: 0.01004
Epoch: 21622, Training Loss: 0.01236
Epoch: 21623, Training Loss: 0.00958
Epoch: 21623, Training Loss: 0.01002
Epoch: 21623, Training Loss: 0.01004
Epoch: 21623, Training Loss: 0.01236
Epoch: 21624, Training Loss: 0.00958
Epoch: 21624, Training Loss: 0.01002
Epoch: 21624, Training Loss: 0.01004
Epoch: 21624, Training Loss: 0.01236
Epoch: 21625, Training Loss: 0.00958
Epoch: 21625, Training Loss: 0.01002
Epoch: 21625, Training Loss: 0.01004
Epoch: 21625, Training Loss: 0.01236
Epoch: 21626, Training Loss: 0.00958
Epoch: 21626, Training Loss: 0.01002
Epoch: 21626, Training Loss: 0.01004
Epoch: 21626, Training Loss: 0.01236
Epoch: 21627, Training Loss: 0.00957
Epoch: 21627, Training Loss: 0.01002
Epoch: 21627, Training Loss: 0.01004
Epoch: 21627, Training Loss: 0.01236
Epoch: 21628, Training Loss: 0.00957
Epoch: 21628, Training Loss: 0.01002
Epoch: 21628, Training Loss: 0.01004
Epoch: 21628, Training Loss: 0.01236
Epoch: 21629, Training Loss: 0.00957
Epoch: 21629, Training Loss: 0.01002
Epoch: 21629, Training Loss: 0.01004
Epoch: 21629, Training Loss: 0.01236
Epoch: 21630, Training Loss: 0.00957
Epoch: 21630, Training Loss: 0.01002
Epoch: 21630, Training Loss: 0.01004
Epoch: 21630, Training Loss: 0.01236
Epoch: 21631, Training Loss: 0.00957
Epoch: 21631, Training Loss: 0.01002
Epoch: 21631, Training Loss: 0.01004
Epoch: 21631, Training Loss: 0.01236
Epoch: 21632, Training Loss: 0.00957
Epoch: 21632, Training Loss: 0.01002
Epoch: 21632, Training Loss: 0.01004
Epoch: 21632, Training Loss: 0.01236
Epoch: 21633, Training Loss: 0.00957
Epoch: 21633, Training Loss: 0.01002
Epoch: 21633, Training Loss: 0.01003
Epoch: 21633, Training Loss: 0.01236
Epoch: 21634, Training Loss: 0.00957
Epoch: 21634, Training Loss: 0.01002
Epoch: 21634, Training Loss: 0.01003
Epoch: 21634, Training Loss: 0.01236
Epoch: 21635, Training Loss: 0.00957
Epoch: 21635, Training Loss: 0.01002
Epoch: 21635, Training Loss: 0.01003
Epoch: 21635, Training Loss: 0.01236
Epoch: 21636, Training Loss: 0.00957
Epoch: 21636, Training Loss: 0.01002
Epoch: 21636, Training Loss: 0.01003
Epoch: 21636, Training Loss: 0.01236
Epoch: 21637, Training Loss: 0.00957
Epoch: 21637, Training Loss: 0.01002
Epoch: 21637, Training Loss: 0.01003
Epoch: 21637, Training Loss: 0.01236
Epoch: 21638, Training Loss: 0.00957
Epoch: 21638, Training Loss: 0.01002
Epoch: 21638, Training Loss: 0.01003
Epoch: 21638, Training Loss: 0.01236
Epoch: 21639, Training Loss: 0.00957
Epoch: 21639, Training Loss: 0.01002
Epoch: 21639, Training Loss: 0.01003
Epoch: 21639, Training Loss: 0.01236
Epoch: 21640, Training Loss: 0.00957
Epoch: 21640, Training Loss: 0.01002
Epoch: 21640, Training Loss: 0.01003
Epoch: 21640, Training Loss: 0.01235
Epoch: 21641, Training Loss: 0.00957
Epoch: 21641, Training Loss: 0.01002
Epoch: 21641, Training Loss: 0.01003
Epoch: 21641, Training Loss: 0.01235
Epoch: 21642, Training Loss: 0.00957
Epoch: 21642, Training Loss: 0.01002
Epoch: 21642, Training Loss: 0.01003
Epoch: 21642, Training Loss: 0.01235
Epoch: 21643, Training Loss: 0.00957
Epoch: 21643, Training Loss: 0.01002
Epoch: 21643, Training Loss: 0.01003
Epoch: 21643, Training Loss: 0.01235
Epoch: 21644, Training Loss: 0.00957
Epoch: 21644, Training Loss: 0.01002
Epoch: 21644, Training Loss: 0.01003
Epoch: 21644, Training Loss: 0.01235
Epoch: 21645, Training Loss: 0.00957
Epoch: 21645, Training Loss: 0.01002
Epoch: 21645, Training Loss: 0.01003
Epoch: 21645, Training Loss: 0.01235
Epoch: 21646, Training Loss: 0.00957
Epoch: 21646, Training Loss: 0.01002
Epoch: 21646, Training Loss: 0.01003
Epoch: 21646, Training Loss: 0.01235
Epoch: 21647, Training Loss: 0.00957
Epoch: 21647, Training Loss: 0.01002
Epoch: 21647, Training Loss: 0.01003
Epoch: 21647, Training Loss: 0.01235
Epoch: 21648, Training Loss: 0.00957
Epoch: 21648, Training Loss: 0.01002
Epoch: 21648, Training Loss: 0.01003
Epoch: 21648, Training Loss: 0.01235
Epoch: 21649, Training Loss: 0.00957
Epoch: 21649, Training Loss: 0.01002
Epoch: 21649, Training Loss: 0.01003
Epoch: 21649, Training Loss: 0.01235
Epoch: 21650, Training Loss: 0.00957
Epoch: 21650, Training Loss: 0.01002
Epoch: 21650, Training Loss: 0.01003
Epoch: 21650, Training Loss: 0.01235
Epoch: 21651, Training Loss: 0.00957
Epoch: 21651, Training Loss: 0.01002
Epoch: 21651, Training Loss: 0.01003
Epoch: 21651, Training Loss: 0.01235
Epoch: 21652, Training Loss: 0.00957
Epoch: 21652, Training Loss: 0.01002
Epoch: 21652, Training Loss: 0.01003
Epoch: 21652, Training Loss: 0.01235
Epoch: 21653, Training Loss: 0.00957
Epoch: 21653, Training Loss: 0.01002
Epoch: 21653, Training Loss: 0.01003
Epoch: 21653, Training Loss: 0.01235
Epoch: 21654, Training Loss: 0.00957
Epoch: 21654, Training Loss: 0.01002
Epoch: 21654, Training Loss: 0.01003
Epoch: 21654, Training Loss: 0.01235
Epoch: 21655, Training Loss: 0.00957
Epoch: 21655, Training Loss: 0.01002
Epoch: 21655, Training Loss: 0.01003
Epoch: 21655, Training Loss: 0.01235
Epoch: 21656, Training Loss: 0.00957
Epoch: 21656, Training Loss: 0.01002
Epoch: 21656, Training Loss: 0.01003
Epoch: 21656, Training Loss: 0.01235
Epoch: 21657, Training Loss: 0.00957
Epoch: 21657, Training Loss: 0.01002
Epoch: 21657, Training Loss: 0.01003
Epoch: 21657, Training Loss: 0.01235
Epoch: 21658, Training Loss: 0.00957
Epoch: 21658, Training Loss: 0.01002
Epoch: 21658, Training Loss: 0.01003
Epoch: 21658, Training Loss: 0.01235
Epoch: 21659, Training Loss: 0.00957
Epoch: 21659, Training Loss: 0.01002
Epoch: 21659, Training Loss: 0.01003
Epoch: 21659, Training Loss: 0.01235
Epoch: 21660, Training Loss: 0.00957
Epoch: 21660, Training Loss: 0.01001
Epoch: 21660, Training Loss: 0.01003
Epoch: 21660, Training Loss: 0.01235
Epoch: 21661, Training Loss: 0.00957
Epoch: 21661, Training Loss: 0.01001
Epoch: 21661, Training Loss: 0.01003
Epoch: 21661, Training Loss: 0.01235
Epoch: 21662, Training Loss: 0.00957
Epoch: 21662, Training Loss: 0.01001
Epoch: 21662, Training Loss: 0.01003
Epoch: 21662, Training Loss: 0.01235
Epoch: 21663, Training Loss: 0.00957
Epoch: 21663, Training Loss: 0.01001
Epoch: 21663, Training Loss: 0.01003
Epoch: 21663, Training Loss: 0.01235
Epoch: 21664, Training Loss: 0.00957
Epoch: 21664, Training Loss: 0.01001
Epoch: 21664, Training Loss: 0.01003
Epoch: 21664, Training Loss: 0.01235
Epoch: 21665, Training Loss: 0.00957
Epoch: 21665, Training Loss: 0.01001
Epoch: 21665, Training Loss: 0.01003
Epoch: 21665, Training Loss: 0.01235
Epoch: 21666, Training Loss: 0.00957
Epoch: 21666, Training Loss: 0.01001
Epoch: 21666, Training Loss: 0.01003
Epoch: 21666, Training Loss: 0.01235
Epoch: 21667, Training Loss: 0.00956
Epoch: 21667, Training Loss: 0.01001
Epoch: 21667, Training Loss: 0.01003
Epoch: 21667, Training Loss: 0.01235
Epoch: 21668, Training Loss: 0.00956
Epoch: 21668, Training Loss: 0.01001
Epoch: 21668, Training Loss: 0.01003
Epoch: 21668, Training Loss: 0.01235
Epoch: 21669, Training Loss: 0.00956
Epoch: 21669, Training Loss: 0.01001
Epoch: 21669, Training Loss: 0.01003
Epoch: 21669, Training Loss: 0.01235
Epoch: 21670, Training Loss: 0.00956
Epoch: 21670, Training Loss: 0.01001
Epoch: 21670, Training Loss: 0.01002
Epoch: 21670, Training Loss: 0.01234
Epoch: 21671, Training Loss: 0.00956
Epoch: 21671, Training Loss: 0.01001
Epoch: 21671, Training Loss: 0.01002
Epoch: 21671, Training Loss: 0.01234
Epoch: 21672, Training Loss: 0.00956
Epoch: 21672, Training Loss: 0.01001
Epoch: 21672, Training Loss: 0.01002
Epoch: 21672, Training Loss: 0.01234
Epoch: 21673, Training Loss: 0.00956
Epoch: 21673, Training Loss: 0.01001
Epoch: 21673, Training Loss: 0.01002
Epoch: 21673, Training Loss: 0.01234
    ... (per-epoch output truncated: the four per-sample losses are printed
    every epoch and shrink only marginally, from about 0.01234 at epoch 21673
    to about 0.01226 at epoch 21915, still above the 0.008 target) ...
Epoch: 21916, Training Loss: 0.00950
Epoch: 21916, Training Loss: 0.00995
Epoch: 21916, Training Loss: 0.00996
Epoch: 21916, Training Loss: 0.01226
Epoch: 21917, Training Loss: 0.00950
Epoch: 21917, Training Loss: 0.00995
Epoch: 21917, Training Loss: 0.00996
Epoch: 21917, Training Loss: 0.01226
Epoch: 21918, Training Loss: 0.00950
Epoch: 21918, Training Loss: 0.00995
Epoch: 21918, Training Loss: 0.00996
Epoch: 21918, Training Loss: 0.01226
Epoch: 21919, Training Loss: 0.00950
Epoch: 21919, Training Loss: 0.00995
Epoch: 21919, Training Loss: 0.00996
Epoch: 21919, Training Loss: 0.01226
Epoch: 21920, Training Loss: 0.00950
Epoch: 21920, Training Loss: 0.00995
Epoch: 21920, Training Loss: 0.00996
Epoch: 21920, Training Loss: 0.01226
Epoch: 21921, Training Loss: 0.00950
Epoch: 21921, Training Loss: 0.00994
Epoch: 21921, Training Loss: 0.00996
Epoch: 21921, Training Loss: 0.01226
Epoch: 21922, Training Loss: 0.00950
Epoch: 21922, Training Loss: 0.00994
Epoch: 21922, Training Loss: 0.00996
Epoch: 21922, Training Loss: 0.01226
Epoch: 21923, Training Loss: 0.00950
Epoch: 21923, Training Loss: 0.00994
Epoch: 21923, Training Loss: 0.00996
Epoch: 21923, Training Loss: 0.01226
Epoch: 21924, Training Loss: 0.00950
Epoch: 21924, Training Loss: 0.00994
Epoch: 21924, Training Loss: 0.00996
Epoch: 21924, Training Loss: 0.01226
Epoch: 21925, Training Loss: 0.00950
Epoch: 21925, Training Loss: 0.00994
Epoch: 21925, Training Loss: 0.00996
Epoch: 21925, Training Loss: 0.01226
Epoch: 21926, Training Loss: 0.00950
Epoch: 21926, Training Loss: 0.00994
Epoch: 21926, Training Loss: 0.00996
Epoch: 21926, Training Loss: 0.01226
Epoch: 21927, Training Loss: 0.00950
Epoch: 21927, Training Loss: 0.00994
Epoch: 21927, Training Loss: 0.00996
Epoch: 21927, Training Loss: 0.01226
Epoch: 21928, Training Loss: 0.00950
Epoch: 21928, Training Loss: 0.00994
Epoch: 21928, Training Loss: 0.00996
Epoch: 21928, Training Loss: 0.01226
Epoch: 21929, Training Loss: 0.00950
Epoch: 21929, Training Loss: 0.00994
Epoch: 21929, Training Loss: 0.00996
Epoch: 21929, Training Loss: 0.01226
Epoch: 21930, Training Loss: 0.00950
Epoch: 21930, Training Loss: 0.00994
Epoch: 21930, Training Loss: 0.00996
Epoch: 21930, Training Loss: 0.01226
Epoch: 21931, Training Loss: 0.00950
Epoch: 21931, Training Loss: 0.00994
Epoch: 21931, Training Loss: 0.00996
Epoch: 21931, Training Loss: 0.01226
Epoch: 21932, Training Loss: 0.00950
Epoch: 21932, Training Loss: 0.00994
Epoch: 21932, Training Loss: 0.00995
Epoch: 21932, Training Loss: 0.01226
Epoch: 21933, Training Loss: 0.00950
Epoch: 21933, Training Loss: 0.00994
Epoch: 21933, Training Loss: 0.00995
Epoch: 21933, Training Loss: 0.01226
Epoch: 21934, Training Loss: 0.00950
Epoch: 21934, Training Loss: 0.00994
Epoch: 21934, Training Loss: 0.00995
Epoch: 21934, Training Loss: 0.01226
Epoch: 21935, Training Loss: 0.00950
Epoch: 21935, Training Loss: 0.00994
Epoch: 21935, Training Loss: 0.00995
Epoch: 21935, Training Loss: 0.01226
Epoch: 21936, Training Loss: 0.00950
Epoch: 21936, Training Loss: 0.00994
Epoch: 21936, Training Loss: 0.00995
Epoch: 21936, Training Loss: 0.01226
Epoch: 21937, Training Loss: 0.00950
Epoch: 21937, Training Loss: 0.00994
Epoch: 21937, Training Loss: 0.00995
Epoch: 21937, Training Loss: 0.01226
Epoch: 21938, Training Loss: 0.00950
Epoch: 21938, Training Loss: 0.00994
Epoch: 21938, Training Loss: 0.00995
Epoch: 21938, Training Loss: 0.01226
Epoch: 21939, Training Loss: 0.00950
Epoch: 21939, Training Loss: 0.00994
Epoch: 21939, Training Loss: 0.00995
Epoch: 21939, Training Loss: 0.01225
Epoch: 21940, Training Loss: 0.00950
Epoch: 21940, Training Loss: 0.00994
Epoch: 21940, Training Loss: 0.00995
Epoch: 21940, Training Loss: 0.01225
Epoch: 21941, Training Loss: 0.00950
Epoch: 21941, Training Loss: 0.00994
Epoch: 21941, Training Loss: 0.00995
Epoch: 21941, Training Loss: 0.01225
Epoch: 21942, Training Loss: 0.00950
Epoch: 21942, Training Loss: 0.00994
Epoch: 21942, Training Loss: 0.00995
Epoch: 21942, Training Loss: 0.01225
Epoch: 21943, Training Loss: 0.00950
Epoch: 21943, Training Loss: 0.00994
Epoch: 21943, Training Loss: 0.00995
Epoch: 21943, Training Loss: 0.01225
Epoch: 21944, Training Loss: 0.00950
Epoch: 21944, Training Loss: 0.00994
Epoch: 21944, Training Loss: 0.00995
Epoch: 21944, Training Loss: 0.01225
Epoch: 21945, Training Loss: 0.00950
Epoch: 21945, Training Loss: 0.00994
Epoch: 21945, Training Loss: 0.00995
Epoch: 21945, Training Loss: 0.01225
Epoch: 21946, Training Loss: 0.00950
Epoch: 21946, Training Loss: 0.00994
Epoch: 21946, Training Loss: 0.00995
Epoch: 21946, Training Loss: 0.01225
Epoch: 21947, Training Loss: 0.00950
Epoch: 21947, Training Loss: 0.00994
Epoch: 21947, Training Loss: 0.00995
Epoch: 21947, Training Loss: 0.01225
Epoch: 21948, Training Loss: 0.00949
Epoch: 21948, Training Loss: 0.00994
Epoch: 21948, Training Loss: 0.00995
Epoch: 21948, Training Loss: 0.01225
Epoch: 21949, Training Loss: 0.00949
Epoch: 21949, Training Loss: 0.00994
Epoch: 21949, Training Loss: 0.00995
Epoch: 21949, Training Loss: 0.01225
Epoch: 21950, Training Loss: 0.00949
Epoch: 21950, Training Loss: 0.00994
Epoch: 21950, Training Loss: 0.00995
Epoch: 21950, Training Loss: 0.01225
Epoch: 21951, Training Loss: 0.00949
Epoch: 21951, Training Loss: 0.00994
Epoch: 21951, Training Loss: 0.00995
Epoch: 21951, Training Loss: 0.01225
Epoch: 21952, Training Loss: 0.00949
Epoch: 21952, Training Loss: 0.00994
Epoch: 21952, Training Loss: 0.00995
Epoch: 21952, Training Loss: 0.01225
Epoch: 21953, Training Loss: 0.00949
Epoch: 21953, Training Loss: 0.00994
Epoch: 21953, Training Loss: 0.00995
Epoch: 21953, Training Loss: 0.01225
Epoch: 21954, Training Loss: 0.00949
Epoch: 21954, Training Loss: 0.00994
Epoch: 21954, Training Loss: 0.00995
Epoch: 21954, Training Loss: 0.01225
Epoch: 21955, Training Loss: 0.00949
Epoch: 21955, Training Loss: 0.00994
Epoch: 21955, Training Loss: 0.00995
Epoch: 21955, Training Loss: 0.01225
Epoch: 21956, Training Loss: 0.00949
Epoch: 21956, Training Loss: 0.00994
Epoch: 21956, Training Loss: 0.00995
Epoch: 21956, Training Loss: 0.01225
Epoch: 21957, Training Loss: 0.00949
Epoch: 21957, Training Loss: 0.00994
Epoch: 21957, Training Loss: 0.00995
Epoch: 21957, Training Loss: 0.01225
Epoch: 21958, Training Loss: 0.00949
Epoch: 21958, Training Loss: 0.00994
Epoch: 21958, Training Loss: 0.00995
Epoch: 21958, Training Loss: 0.01225
Epoch: 21959, Training Loss: 0.00949
Epoch: 21959, Training Loss: 0.00993
Epoch: 21959, Training Loss: 0.00995
Epoch: 21959, Training Loss: 0.01225
Epoch: 21960, Training Loss: 0.00949
Epoch: 21960, Training Loss: 0.00993
Epoch: 21960, Training Loss: 0.00995
Epoch: 21960, Training Loss: 0.01225
Epoch: 21961, Training Loss: 0.00949
Epoch: 21961, Training Loss: 0.00993
Epoch: 21961, Training Loss: 0.00995
Epoch: 21961, Training Loss: 0.01225
Epoch: 21962, Training Loss: 0.00949
Epoch: 21962, Training Loss: 0.00993
Epoch: 21962, Training Loss: 0.00995
Epoch: 21962, Training Loss: 0.01225
Epoch: 21963, Training Loss: 0.00949
Epoch: 21963, Training Loss: 0.00993
Epoch: 21963, Training Loss: 0.00995
Epoch: 21963, Training Loss: 0.01225
Epoch: 21964, Training Loss: 0.00949
Epoch: 21964, Training Loss: 0.00993
Epoch: 21964, Training Loss: 0.00995
Epoch: 21964, Training Loss: 0.01225
Epoch: 21965, Training Loss: 0.00949
Epoch: 21965, Training Loss: 0.00993
Epoch: 21965, Training Loss: 0.00995
Epoch: 21965, Training Loss: 0.01225
Epoch: 21966, Training Loss: 0.00949
Epoch: 21966, Training Loss: 0.00993
Epoch: 21966, Training Loss: 0.00995
Epoch: 21966, Training Loss: 0.01225
Epoch: 21967, Training Loss: 0.00949
Epoch: 21967, Training Loss: 0.00993
Epoch: 21967, Training Loss: 0.00995
Epoch: 21967, Training Loss: 0.01225
Epoch: 21968, Training Loss: 0.00949
Epoch: 21968, Training Loss: 0.00993
Epoch: 21968, Training Loss: 0.00995
Epoch: 21968, Training Loss: 0.01225
Epoch: 21969, Training Loss: 0.00949
Epoch: 21969, Training Loss: 0.00993
Epoch: 21969, Training Loss: 0.00995
Epoch: 21969, Training Loss: 0.01224
Epoch: 21970, Training Loss: 0.00949
Epoch: 21970, Training Loss: 0.00993
Epoch: 21970, Training Loss: 0.00994
Epoch: 21970, Training Loss: 0.01224
Epoch: 21971, Training Loss: 0.00949
Epoch: 21971, Training Loss: 0.00993
Epoch: 21971, Training Loss: 0.00994
Epoch: 21971, Training Loss: 0.01224
Epoch: 21972, Training Loss: 0.00949
Epoch: 21972, Training Loss: 0.00993
Epoch: 21972, Training Loss: 0.00994
Epoch: 21972, Training Loss: 0.01224
Epoch: 21973, Training Loss: 0.00949
Epoch: 21973, Training Loss: 0.00993
Epoch: 21973, Training Loss: 0.00994
Epoch: 21973, Training Loss: 0.01224
Epoch: 21974, Training Loss: 0.00949
Epoch: 21974, Training Loss: 0.00993
Epoch: 21974, Training Loss: 0.00994
Epoch: 21974, Training Loss: 0.01224
Epoch: 21975, Training Loss: 0.00949
Epoch: 21975, Training Loss: 0.00993
Epoch: 21975, Training Loss: 0.00994
Epoch: 21975, Training Loss: 0.01224
Epoch: 21976, Training Loss: 0.00949
Epoch: 21976, Training Loss: 0.00993
Epoch: 21976, Training Loss: 0.00994
Epoch: 21976, Training Loss: 0.01224
Epoch: 21977, Training Loss: 0.00949
Epoch: 21977, Training Loss: 0.00993
Epoch: 21977, Training Loss: 0.00994
Epoch: 21977, Training Loss: 0.01224
Epoch: 21978, Training Loss: 0.00949
Epoch: 21978, Training Loss: 0.00993
Epoch: 21978, Training Loss: 0.00994
Epoch: 21978, Training Loss: 0.01224
Epoch: 21979, Training Loss: 0.00949
Epoch: 21979, Training Loss: 0.00993
Epoch: 21979, Training Loss: 0.00994
Epoch: 21979, Training Loss: 0.01224
Epoch: 21980, Training Loss: 0.00949
Epoch: 21980, Training Loss: 0.00993
Epoch: 21980, Training Loss: 0.00994
Epoch: 21980, Training Loss: 0.01224
Epoch: 21981, Training Loss: 0.00949
Epoch: 21981, Training Loss: 0.00993
Epoch: 21981, Training Loss: 0.00994
Epoch: 21981, Training Loss: 0.01224
Epoch: 21982, Training Loss: 0.00949
Epoch: 21982, Training Loss: 0.00993
Epoch: 21982, Training Loss: 0.00994
Epoch: 21982, Training Loss: 0.01224
Epoch: 21983, Training Loss: 0.00949
Epoch: 21983, Training Loss: 0.00993
Epoch: 21983, Training Loss: 0.00994
Epoch: 21983, Training Loss: 0.01224
Epoch: 21984, Training Loss: 0.00949
Epoch: 21984, Training Loss: 0.00993
Epoch: 21984, Training Loss: 0.00994
Epoch: 21984, Training Loss: 0.01224
Epoch: 21985, Training Loss: 0.00949
Epoch: 21985, Training Loss: 0.00993
Epoch: 21985, Training Loss: 0.00994
Epoch: 21985, Training Loss: 0.01224
Epoch: 21986, Training Loss: 0.00949
Epoch: 21986, Training Loss: 0.00993
Epoch: 21986, Training Loss: 0.00994
Epoch: 21986, Training Loss: 0.01224
Epoch: 21987, Training Loss: 0.00949
Epoch: 21987, Training Loss: 0.00993
Epoch: 21987, Training Loss: 0.00994
Epoch: 21987, Training Loss: 0.01224
Epoch: 21988, Training Loss: 0.00949
Epoch: 21988, Training Loss: 0.00993
Epoch: 21988, Training Loss: 0.00994
Epoch: 21988, Training Loss: 0.01224
Epoch: 21989, Training Loss: 0.00948
Epoch: 21989, Training Loss: 0.00993
Epoch: 21989, Training Loss: 0.00994
Epoch: 21989, Training Loss: 0.01224
Epoch: 21990, Training Loss: 0.00948
Epoch: 21990, Training Loss: 0.00993
Epoch: 21990, Training Loss: 0.00994
Epoch: 21990, Training Loss: 0.01224
Epoch: 21991, Training Loss: 0.00948
Epoch: 21991, Training Loss: 0.00993
Epoch: 21991, Training Loss: 0.00994
Epoch: 21991, Training Loss: 0.01224
Epoch: 21992, Training Loss: 0.00948
Epoch: 21992, Training Loss: 0.00993
Epoch: 21992, Training Loss: 0.00994
Epoch: 21992, Training Loss: 0.01224
Epoch: 21993, Training Loss: 0.00948
Epoch: 21993, Training Loss: 0.00993
Epoch: 21993, Training Loss: 0.00994
Epoch: 21993, Training Loss: 0.01224
Epoch: 21994, Training Loss: 0.00948
Epoch: 21994, Training Loss: 0.00993
Epoch: 21994, Training Loss: 0.00994
Epoch: 21994, Training Loss: 0.01224
Epoch: 21995, Training Loss: 0.00948
Epoch: 21995, Training Loss: 0.00993
Epoch: 21995, Training Loss: 0.00994
Epoch: 21995, Training Loss: 0.01224
Epoch: 21996, Training Loss: 0.00948
Epoch: 21996, Training Loss: 0.00993
Epoch: 21996, Training Loss: 0.00994
Epoch: 21996, Training Loss: 0.01224
Epoch: 21997, Training Loss: 0.00948
Epoch: 21997, Training Loss: 0.00992
Epoch: 21997, Training Loss: 0.00994
Epoch: 21997, Training Loss: 0.01224
Epoch: 21998, Training Loss: 0.00948
Epoch: 21998, Training Loss: 0.00992
Epoch: 21998, Training Loss: 0.00994
Epoch: 21998, Training Loss: 0.01224
Epoch: 21999, Training Loss: 0.00948
Epoch: 21999, Training Loss: 0.00992
Epoch: 21999, Training Loss: 0.00994
Epoch: 21999, Training Loss: 0.01223
Epoch: 22000, Training Loss: 0.00948
Epoch: 22000, Training Loss: 0.00992
Epoch: 22000, Training Loss: 0.00994
Epoch: 22000, Training Loss: 0.01223
Epoch: 22001, Training Loss: 0.00948
Epoch: 22001, Training Loss: 0.00992
Epoch: 22001, Training Loss: 0.00994
Epoch: 22001, Training Loss: 0.01223
Epoch: 22002, Training Loss: 0.00948
Epoch: 22002, Training Loss: 0.00992
Epoch: 22002, Training Loss: 0.00994
Epoch: 22002, Training Loss: 0.01223
Epoch: 22003, Training Loss: 0.00948
Epoch: 22003, Training Loss: 0.00992
Epoch: 22003, Training Loss: 0.00994
Epoch: 22003, Training Loss: 0.01223
Epoch: 22004, Training Loss: 0.00948
Epoch: 22004, Training Loss: 0.00992
Epoch: 22004, Training Loss: 0.00994
Epoch: 22004, Training Loss: 0.01223
Epoch: 22005, Training Loss: 0.00948
Epoch: 22005, Training Loss: 0.00992
Epoch: 22005, Training Loss: 0.00994
Epoch: 22005, Training Loss: 0.01223
Epoch: 22006, Training Loss: 0.00948
Epoch: 22006, Training Loss: 0.00992
Epoch: 22006, Training Loss: 0.00994
Epoch: 22006, Training Loss: 0.01223
Epoch: 22007, Training Loss: 0.00948
Epoch: 22007, Training Loss: 0.00992
Epoch: 22007, Training Loss: 0.00994
Epoch: 22007, Training Loss: 0.01223
Epoch: 22008, Training Loss: 0.00948
Epoch: 22008, Training Loss: 0.00992
Epoch: 22008, Training Loss: 0.00993
Epoch: 22008, Training Loss: 0.01223
Epoch: 22009, Training Loss: 0.00948
Epoch: 22009, Training Loss: 0.00992
Epoch: 22009, Training Loss: 0.00993
Epoch: 22009, Training Loss: 0.01223
Epoch: 22010, Training Loss: 0.00948
Epoch: 22010, Training Loss: 0.00992
Epoch: 22010, Training Loss: 0.00993
Epoch: 22010, Training Loss: 0.01223
Epoch: 22011, Training Loss: 0.00948
Epoch: 22011, Training Loss: 0.00992
Epoch: 22011, Training Loss: 0.00993
Epoch: 22011, Training Loss: 0.01223
Epoch: 22012, Training Loss: 0.00948
Epoch: 22012, Training Loss: 0.00992
Epoch: 22012, Training Loss: 0.00993
Epoch: 22012, Training Loss: 0.01223
Epoch: 22013, Training Loss: 0.00948
Epoch: 22013, Training Loss: 0.00992
Epoch: 22013, Training Loss: 0.00993
Epoch: 22013, Training Loss: 0.01223
Epoch: 22014, Training Loss: 0.00948
Epoch: 22014, Training Loss: 0.00992
Epoch: 22014, Training Loss: 0.00993
Epoch: 22014, Training Loss: 0.01223
Epoch: 22015, Training Loss: 0.00948
Epoch: 22015, Training Loss: 0.00992
Epoch: 22015, Training Loss: 0.00993
Epoch: 22015, Training Loss: 0.01223
Epoch: 22016, Training Loss: 0.00948
Epoch: 22016, Training Loss: 0.00992
Epoch: 22016, Training Loss: 0.00993
Epoch: 22016, Training Loss: 0.01223
Epoch: 22017, Training Loss: 0.00948
Epoch: 22017, Training Loss: 0.00992
Epoch: 22017, Training Loss: 0.00993
Epoch: 22017, Training Loss: 0.01223
Epoch: 22018, Training Loss: 0.00948
Epoch: 22018, Training Loss: 0.00992
Epoch: 22018, Training Loss: 0.00993
Epoch: 22018, Training Loss: 0.01223
Epoch: 22019, Training Loss: 0.00948
Epoch: 22019, Training Loss: 0.00992
Epoch: 22019, Training Loss: 0.00993
Epoch: 22019, Training Loss: 0.01223
Epoch: 22020, Training Loss: 0.00948
Epoch: 22020, Training Loss: 0.00992
Epoch: 22020, Training Loss: 0.00993
Epoch: 22020, Training Loss: 0.01223
Epoch: 22021, Training Loss: 0.00948
Epoch: 22021, Training Loss: 0.00992
Epoch: 22021, Training Loss: 0.00993
Epoch: 22021, Training Loss: 0.01223
Epoch: 22022, Training Loss: 0.00948
Epoch: 22022, Training Loss: 0.00992
Epoch: 22022, Training Loss: 0.00993
Epoch: 22022, Training Loss: 0.01223
Epoch: 22023, Training Loss: 0.00948
Epoch: 22023, Training Loss: 0.00992
Epoch: 22023, Training Loss: 0.00993
Epoch: 22023, Training Loss: 0.01223
Epoch: 22024, Training Loss: 0.00948
Epoch: 22024, Training Loss: 0.00992
Epoch: 22024, Training Loss: 0.00993
Epoch: 22024, Training Loss: 0.01223
Epoch: 22025, Training Loss: 0.00948
Epoch: 22025, Training Loss: 0.00992
Epoch: 22025, Training Loss: 0.00993
Epoch: 22025, Training Loss: 0.01223
Epoch: 22026, Training Loss: 0.00948
Epoch: 22026, Training Loss: 0.00992
Epoch: 22026, Training Loss: 0.00993
Epoch: 22026, Training Loss: 0.01223
Epoch: 22027, Training Loss: 0.00948
Epoch: 22027, Training Loss: 0.00992
Epoch: 22027, Training Loss: 0.00993
Epoch: 22027, Training Loss: 0.01223
Epoch: 22028, Training Loss: 0.00948
Epoch: 22028, Training Loss: 0.00992
Epoch: 22028, Training Loss: 0.00993
Epoch: 22028, Training Loss: 0.01223
Epoch: 22029, Training Loss: 0.00948
Epoch: 22029, Training Loss: 0.00992
Epoch: 22029, Training Loss: 0.00993
Epoch: 22029, Training Loss: 0.01223
Epoch: 22030, Training Loss: 0.00947
Epoch: 22030, Training Loss: 0.00992
Epoch: 22030, Training Loss: 0.00993
Epoch: 22030, Training Loss: 0.01222
Epoch: 22031, Training Loss: 0.00947
Epoch: 22031, Training Loss: 0.00992
Epoch: 22031, Training Loss: 0.00993
Epoch: 22031, Training Loss: 0.01222
Epoch: 22032, Training Loss: 0.00947
Epoch: 22032, Training Loss: 0.00992
Epoch: 22032, Training Loss: 0.00993
Epoch: 22032, Training Loss: 0.01222
Epoch: 22033, Training Loss: 0.00947
Epoch: 22033, Training Loss: 0.00992
Epoch: 22033, Training Loss: 0.00993
Epoch: 22033, Training Loss: 0.01222
Epoch: 22034, Training Loss: 0.00947
Epoch: 22034, Training Loss: 0.00992
Epoch: 22034, Training Loss: 0.00993
Epoch: 22034, Training Loss: 0.01222
Epoch: 22035, Training Loss: 0.00947
Epoch: 22035, Training Loss: 0.00991
Epoch: 22035, Training Loss: 0.00993
Epoch: 22035, Training Loss: 0.01222
Epoch: 22036, Training Loss: 0.00947
Epoch: 22036, Training Loss: 0.00991
Epoch: 22036, Training Loss: 0.00993
Epoch: 22036, Training Loss: 0.01222
Epoch: 22037, Training Loss: 0.00947
Epoch: 22037, Training Loss: 0.00991
Epoch: 22037, Training Loss: 0.00993
Epoch: 22037, Training Loss: 0.01222
Epoch: 22038, Training Loss: 0.00947
Epoch: 22038, Training Loss: 0.00991
Epoch: 22038, Training Loss: 0.00993
Epoch: 22038, Training Loss: 0.01222
Epoch: 22039, Training Loss: 0.00947
Epoch: 22039, Training Loss: 0.00991
Epoch: 22039, Training Loss: 0.00993
Epoch: 22039, Training Loss: 0.01222
Epoch: 22040, Training Loss: 0.00947
Epoch: 22040, Training Loss: 0.00991
Epoch: 22040, Training Loss: 0.00993
Epoch: 22040, Training Loss: 0.01222
Epoch: 22041, Training Loss: 0.00947
Epoch: 22041, Training Loss: 0.00991
Epoch: 22041, Training Loss: 0.00993
Epoch: 22041, Training Loss: 0.01222
Epoch: 22042, Training Loss: 0.00947
Epoch: 22042, Training Loss: 0.00991
Epoch: 22042, Training Loss: 0.00993
Epoch: 22042, Training Loss: 0.01222
Epoch: 22043, Training Loss: 0.00947
Epoch: 22043, Training Loss: 0.00991
Epoch: 22043, Training Loss: 0.00993
Epoch: 22043, Training Loss: 0.01222
Epoch: 22044, Training Loss: 0.00947
Epoch: 22044, Training Loss: 0.00991
Epoch: 22044, Training Loss: 0.00993
Epoch: 22044, Training Loss: 0.01222
Epoch: 22045, Training Loss: 0.00947
Epoch: 22045, Training Loss: 0.00991
Epoch: 22045, Training Loss: 0.00993
Epoch: 22045, Training Loss: 0.01222
Epoch: 22046, Training Loss: 0.00947
Epoch: 22046, Training Loss: 0.00991
Epoch: 22046, Training Loss: 0.00992
Epoch: 22046, Training Loss: 0.01222
Epoch: 22047, Training Loss: 0.00947
Epoch: 22047, Training Loss: 0.00991
Epoch: 22047, Training Loss: 0.00992
Epoch: 22047, Training Loss: 0.01222
Epoch: 22048, Training Loss: 0.00947
Epoch: 22048, Training Loss: 0.00991
Epoch: 22048, Training Loss: 0.00992
Epoch: 22048, Training Loss: 0.01222
Epoch: 22049, Training Loss: 0.00947
Epoch: 22049, Training Loss: 0.00991
Epoch: 22049, Training Loss: 0.00992
Epoch: 22049, Training Loss: 0.01222
Epoch: 22050, Training Loss: 0.00947
Epoch: 22050, Training Loss: 0.00991
Epoch: 22050, Training Loss: 0.00992
Epoch: 22050, Training Loss: 0.01222
Epoch: 22051, Training Loss: 0.00947
Epoch: 22051, Training Loss: 0.00991
Epoch: 22051, Training Loss: 0.00992
Epoch: 22051, Training Loss: 0.01222
Epoch: 22052, Training Loss: 0.00947
Epoch: 22052, Training Loss: 0.00991
Epoch: 22052, Training Loss: 0.00992
Epoch: 22052, Training Loss: 0.01222
Epoch: 22053, Training Loss: 0.00947
Epoch: 22053, Training Loss: 0.00991
Epoch: 22053, Training Loss: 0.00992
Epoch: 22053, Training Loss: 0.01222
Epoch: 22054, Training Loss: 0.00947
Epoch: 22054, Training Loss: 0.00991
Epoch: 22054, Training Loss: 0.00992
Epoch: 22054, Training Loss: 0.01222
Epoch: 22055, Training Loss: 0.00947
Epoch: 22055, Training Loss: 0.00991
Epoch: 22055, Training Loss: 0.00992
Epoch: 22055, Training Loss: 0.01222
Epoch: 22056, Training Loss: 0.00947
Epoch: 22056, Training Loss: 0.00991
Epoch: 22056, Training Loss: 0.00992
Epoch: 22056, Training Loss: 0.01222
Epoch: 22057, Training Loss: 0.00947
Epoch: 22057, Training Loss: 0.00991
Epoch: 22057, Training Loss: 0.00992
Epoch: 22057, Training Loss: 0.01222
Epoch: 22058, Training Loss: 0.00947
Epoch: 22058, Training Loss: 0.00991
Epoch: 22058, Training Loss: 0.00992
Epoch: 22058, Training Loss: 0.01222
Epoch: 22059, Training Loss: 0.00947
Epoch: 22059, Training Loss: 0.00991
Epoch: 22059, Training Loss: 0.00992
Epoch: 22059, Training Loss: 0.01222
Epoch: 22060, Training Loss: 0.00947
Epoch: 22060, Training Loss: 0.00991
Epoch: 22060, Training Loss: 0.00992
Epoch: 22060, Training Loss: 0.01221
Epoch: 22061, Training Loss: 0.00947
Epoch: 22061, Training Loss: 0.00991
Epoch: 22061, Training Loss: 0.00992
Epoch: 22061, Training Loss: 0.01221
Epoch: 22062, Training Loss: 0.00947
Epoch: 22062, Training Loss: 0.00991
Epoch: 22062, Training Loss: 0.00992
Epoch: 22062, Training Loss: 0.01221
Epoch: 22063, Training Loss: 0.00947
Epoch: 22063, Training Loss: 0.00991
Epoch: 22063, Training Loss: 0.00992
Epoch: 22063, Training Loss: 0.01221
Epoch: 22064, Training Loss: 0.00947
Epoch: 22064, Training Loss: 0.00991
Epoch: 22064, Training Loss: 0.00992
Epoch: 22064, Training Loss: 0.01221
Epoch: 22065, Training Loss: 0.00947
Epoch: 22065, Training Loss: 0.00991
Epoch: 22065, Training Loss: 0.00992
Epoch: 22065, Training Loss: 0.01221
Epoch: 22066, Training Loss: 0.00947
Epoch: 22066, Training Loss: 0.00991
Epoch: 22066, Training Loss: 0.00992
Epoch: 22066, Training Loss: 0.01221
Epoch: 22067, Training Loss: 0.00947
Epoch: 22067, Training Loss: 0.00991
Epoch: 22067, Training Loss: 0.00992
Epoch: 22067, Training Loss: 0.01221
Epoch: 22068, Training Loss: 0.00947
Epoch: 22068, Training Loss: 0.00991
Epoch: 22068, Training Loss: 0.00992
Epoch: 22068, Training Loss: 0.01221
Epoch: 22069, Training Loss: 0.00947
Epoch: 22069, Training Loss: 0.00991
Epoch: 22069, Training Loss: 0.00992
Epoch: 22069, Training Loss: 0.01221
Epoch: 22070, Training Loss: 0.00947
Epoch: 22070, Training Loss: 0.00991
Epoch: 22070, Training Loss: 0.00992
Epoch: 22070, Training Loss: 0.01221
Epoch: 22071, Training Loss: 0.00946
Epoch: 22071, Training Loss: 0.00991
Epoch: 22071, Training Loss: 0.00992
Epoch: 22071, Training Loss: 0.01221
Epoch: 22072, Training Loss: 0.00946
Epoch: 22072, Training Loss: 0.00991
Epoch: 22072, Training Loss: 0.00992
Epoch: 22072, Training Loss: 0.01221
Epoch: 22073, Training Loss: 0.00946
Epoch: 22073, Training Loss: 0.00990
Epoch: 22073, Training Loss: 0.00992
Epoch: 22073, Training Loss: 0.01221
Epoch: 22074, Training Loss: 0.00946
Epoch: 22074, Training Loss: 0.00990
Epoch: 22074, Training Loss: 0.00992
Epoch: 22074, Training Loss: 0.01221
Epoch: 22075, Training Loss: 0.00946
Epoch: 22075, Training Loss: 0.00990
Epoch: 22075, Training Loss: 0.00992
Epoch: 22075, Training Loss: 0.01221
Epoch: 22076, Training Loss: 0.00946
Epoch: 22076, Training Loss: 0.00990
Epoch: 22076, Training Loss: 0.00992
Epoch: 22076, Training Loss: 0.01221
Epoch: 22077, Training Loss: 0.00946
Epoch: 22077, Training Loss: 0.00990
Epoch: 22077, Training Loss: 0.00992
Epoch: 22077, Training Loss: 0.01221
Epoch: 22078, Training Loss: 0.00946
Epoch: 22078, Training Loss: 0.00990
Epoch: 22078, Training Loss: 0.00992
Epoch: 22078, Training Loss: 0.01221
Epoch: 22079, Training Loss: 0.00946
Epoch: 22079, Training Loss: 0.00990
Epoch: 22079, Training Loss: 0.00992
Epoch: 22079, Training Loss: 0.01221
Epoch: 22080, Training Loss: 0.00946
Epoch: 22080, Training Loss: 0.00990
Epoch: 22080, Training Loss: 0.00992
Epoch: 22080, Training Loss: 0.01221
Epoch: 22081, Training Loss: 0.00946
Epoch: 22081, Training Loss: 0.00990
Epoch: 22081, Training Loss: 0.00992
Epoch: 22081, Training Loss: 0.01221
Epoch: 22082, Training Loss: 0.00946
Epoch: 22082, Training Loss: 0.00990
Epoch: 22082, Training Loss: 0.00992
Epoch: 22082, Training Loss: 0.01221
Epoch: 22083, Training Loss: 0.00946
Epoch: 22083, Training Loss: 0.00990
Epoch: 22083, Training Loss: 0.00992
Epoch: 22083, Training Loss: 0.01221
Epoch: 22084, Training Loss: 0.00946
Epoch: 22084, Training Loss: 0.00990
Epoch: 22084, Training Loss: 0.00991
Epoch: 22084, Training Loss: 0.01221
Epoch: 22085, Training Loss: 0.00946
Epoch: 22085, Training Loss: 0.00990
Epoch: 22085, Training Loss: 0.00991
Epoch: 22085, Training Loss: 0.01221
Epoch: 22086, Training Loss: 0.00946
Epoch: 22086, Training Loss: 0.00990
Epoch: 22086, Training Loss: 0.00991
Epoch: 22086, Training Loss: 0.01221
Epoch: 22087, Training Loss: 0.00946
Epoch: 22087, Training Loss: 0.00990
Epoch: 22087, Training Loss: 0.00991
Epoch: 22087, Training Loss: 0.01221
Epoch: 22088, Training Loss: 0.00946
Epoch: 22088, Training Loss: 0.00990
Epoch: 22088, Training Loss: 0.00991
Epoch: 22088, Training Loss: 0.01221
Epoch: 22089, Training Loss: 0.00946
Epoch: 22089, Training Loss: 0.00990
Epoch: 22089, Training Loss: 0.00991
Epoch: 22089, Training Loss: 0.01221
Epoch: 22090, Training Loss: 0.00946
Epoch: 22090, Training Loss: 0.00990
Epoch: 22090, Training Loss: 0.00991
Epoch: 22090, Training Loss: 0.01221
Epoch: 22091, Training Loss: 0.00946
Epoch: 22091, Training Loss: 0.00990
Epoch: 22091, Training Loss: 0.00991
Epoch: 22091, Training Loss: 0.01220
Epoch: 22092, Training Loss: 0.00946
Epoch: 22092, Training Loss: 0.00990
Epoch: 22092, Training Loss: 0.00991
Epoch: 22092, Training Loss: 0.01220
Epoch: 22093, Training Loss: 0.00946
Epoch: 22093, Training Loss: 0.00990
Epoch: 22093, Training Loss: 0.00991
Epoch: 22093, Training Loss: 0.01220
Epoch: 22094, Training Loss: 0.00946
Epoch: 22094, Training Loss: 0.00990
Epoch: 22094, Training Loss: 0.00991
Epoch: 22094, Training Loss: 0.01220
Epoch: 22095, Training Loss: 0.00946
Epoch: 22095, Training Loss: 0.00990
Epoch: 22095, Training Loss: 0.00991
Epoch: 22095, Training Loss: 0.01220
Epoch: 22096, Training Loss: 0.00946
Epoch: 22096, Training Loss: 0.00990
Epoch: 22096, Training Loss: 0.00991
Epoch: 22096, Training Loss: 0.01220
Epoch: 22097, Training Loss: 0.00946
Epoch: 22097, Training Loss: 0.00990
Epoch: 22097, Training Loss: 0.00991
Epoch: 22097, Training Loss: 0.01220
Epoch: 22098, Training Loss: 0.00946
Epoch: 22098, Training Loss: 0.00990
Epoch: 22098, Training Loss: 0.00991
Epoch: 22098, Training Loss: 0.01220
Epoch: 22099, Training Loss: 0.00946
Epoch: 22099, Training Loss: 0.00990
Epoch: 22099, Training Loss: 0.00991
Epoch: 22099, Training Loss: 0.01220
Epoch: 22100, Training Loss: 0.00946
Epoch: 22100, Training Loss: 0.00990
Epoch: 22100, Training Loss: 0.00991
Epoch: 22100, Training Loss: 0.01220
Epoch: 22101, Training Loss: 0.00946
Epoch: 22101, Training Loss: 0.00990
Epoch: 22101, Training Loss: 0.00991
Epoch: 22101, Training Loss: 0.01220
Epoch: 22102, Training Loss: 0.00946
Epoch: 22102, Training Loss: 0.00990
Epoch: 22102, Training Loss: 0.00991
Epoch: 22102, Training Loss: 0.01220
Epoch: 22103, Training Loss: 0.00946
Epoch: 22103, Training Loss: 0.00990
Epoch: 22103, Training Loss: 0.00991
Epoch: 22103, Training Loss: 0.01220
Epoch: 22104, Training Loss: 0.00946
Epoch: 22104, Training Loss: 0.00990
Epoch: 22104, Training Loss: 0.00991
Epoch: 22104, Training Loss: 0.01220
Epoch: 22105, Training Loss: 0.00946
Epoch: 22105, Training Loss: 0.00990
Epoch: 22105, Training Loss: 0.00991
Epoch: 22105, Training Loss: 0.01220
[... output truncated: epochs 22106-22348 each print four per-pattern losses, declining slowly from 0.00946 / 0.00990 / 0.00991 / 0.01220 to roughly 0.00940 / 0.00983 / 0.00985 / 0.01212 — still above the 0.008 error target at this point ...]
Epoch: 22349, Training Loss: 0.00940
Epoch: 22349, Training Loss: 0.00983
Epoch: 22349, Training Loss: 0.00985
Epoch: 22349, Training Loss: 0.01212
Epoch: 22350, Training Loss: 0.00940
Epoch: 22350, Training Loss: 0.00983
Epoch: 22350, Training Loss: 0.00985
Epoch: 22350, Training Loss: 0.01212
Epoch: 22351, Training Loss: 0.00940
Epoch: 22351, Training Loss: 0.00983
Epoch: 22351, Training Loss: 0.00985
Epoch: 22351, Training Loss: 0.01212
Epoch: 22352, Training Loss: 0.00940
Epoch: 22352, Training Loss: 0.00983
Epoch: 22352, Training Loss: 0.00985
Epoch: 22352, Training Loss: 0.01212
Epoch: 22353, Training Loss: 0.00940
Epoch: 22353, Training Loss: 0.00983
Epoch: 22353, Training Loss: 0.00985
Epoch: 22353, Training Loss: 0.01212
Epoch: 22354, Training Loss: 0.00940
Epoch: 22354, Training Loss: 0.00983
Epoch: 22354, Training Loss: 0.00984
Epoch: 22354, Training Loss: 0.01212
Epoch: 22355, Training Loss: 0.00940
Epoch: 22355, Training Loss: 0.00983
Epoch: 22355, Training Loss: 0.00984
Epoch: 22355, Training Loss: 0.01212
Epoch: 22356, Training Loss: 0.00940
Epoch: 22356, Training Loss: 0.00983
Epoch: 22356, Training Loss: 0.00984
Epoch: 22356, Training Loss: 0.01212
Epoch: 22357, Training Loss: 0.00940
Epoch: 22357, Training Loss: 0.00983
Epoch: 22357, Training Loss: 0.00984
Epoch: 22357, Training Loss: 0.01212
Epoch: 22358, Training Loss: 0.00940
Epoch: 22358, Training Loss: 0.00983
Epoch: 22358, Training Loss: 0.00984
Epoch: 22358, Training Loss: 0.01212
Epoch: 22359, Training Loss: 0.00940
Epoch: 22359, Training Loss: 0.00983
Epoch: 22359, Training Loss: 0.00984
Epoch: 22359, Training Loss: 0.01212
Epoch: 22360, Training Loss: 0.00940
Epoch: 22360, Training Loss: 0.00983
Epoch: 22360, Training Loss: 0.00984
Epoch: 22360, Training Loss: 0.01212
Epoch: 22361, Training Loss: 0.00939
Epoch: 22361, Training Loss: 0.00983
Epoch: 22361, Training Loss: 0.00984
Epoch: 22361, Training Loss: 0.01212
Epoch: 22362, Training Loss: 0.00939
Epoch: 22362, Training Loss: 0.00983
Epoch: 22362, Training Loss: 0.00984
Epoch: 22362, Training Loss: 0.01212
Epoch: 22363, Training Loss: 0.00939
Epoch: 22363, Training Loss: 0.00983
Epoch: 22363, Training Loss: 0.00984
Epoch: 22363, Training Loss: 0.01212
Epoch: 22364, Training Loss: 0.00939
Epoch: 22364, Training Loss: 0.00983
Epoch: 22364, Training Loss: 0.00984
Epoch: 22364, Training Loss: 0.01212
Epoch: 22365, Training Loss: 0.00939
Epoch: 22365, Training Loss: 0.00983
Epoch: 22365, Training Loss: 0.00984
Epoch: 22365, Training Loss: 0.01212
Epoch: 22366, Training Loss: 0.00939
Epoch: 22366, Training Loss: 0.00983
Epoch: 22366, Training Loss: 0.00984
Epoch: 22366, Training Loss: 0.01212
Epoch: 22367, Training Loss: 0.00939
Epoch: 22367, Training Loss: 0.00983
Epoch: 22367, Training Loss: 0.00984
Epoch: 22367, Training Loss: 0.01212
Epoch: 22368, Training Loss: 0.00939
Epoch: 22368, Training Loss: 0.00983
Epoch: 22368, Training Loss: 0.00984
Epoch: 22368, Training Loss: 0.01212
Epoch: 22369, Training Loss: 0.00939
Epoch: 22369, Training Loss: 0.00983
Epoch: 22369, Training Loss: 0.00984
Epoch: 22369, Training Loss: 0.01211
Epoch: 22370, Training Loss: 0.00939
Epoch: 22370, Training Loss: 0.00983
Epoch: 22370, Training Loss: 0.00984
Epoch: 22370, Training Loss: 0.01211
Epoch: 22371, Training Loss: 0.00939
Epoch: 22371, Training Loss: 0.00983
Epoch: 22371, Training Loss: 0.00984
Epoch: 22371, Training Loss: 0.01211
Epoch: 22372, Training Loss: 0.00939
Epoch: 22372, Training Loss: 0.00983
Epoch: 22372, Training Loss: 0.00984
Epoch: 22372, Training Loss: 0.01211
Epoch: 22373, Training Loss: 0.00939
Epoch: 22373, Training Loss: 0.00983
Epoch: 22373, Training Loss: 0.00984
Epoch: 22373, Training Loss: 0.01211
Epoch: 22374, Training Loss: 0.00939
Epoch: 22374, Training Loss: 0.00983
Epoch: 22374, Training Loss: 0.00984
Epoch: 22374, Training Loss: 0.01211
Epoch: 22375, Training Loss: 0.00939
Epoch: 22375, Training Loss: 0.00983
Epoch: 22375, Training Loss: 0.00984
Epoch: 22375, Training Loss: 0.01211
Epoch: 22376, Training Loss: 0.00939
Epoch: 22376, Training Loss: 0.00983
Epoch: 22376, Training Loss: 0.00984
Epoch: 22376, Training Loss: 0.01211
Epoch: 22377, Training Loss: 0.00939
Epoch: 22377, Training Loss: 0.00983
Epoch: 22377, Training Loss: 0.00984
Epoch: 22377, Training Loss: 0.01211
Epoch: 22378, Training Loss: 0.00939
Epoch: 22378, Training Loss: 0.00983
Epoch: 22378, Training Loss: 0.00984
Epoch: 22378, Training Loss: 0.01211
Epoch: 22379, Training Loss: 0.00939
Epoch: 22379, Training Loss: 0.00983
Epoch: 22379, Training Loss: 0.00984
Epoch: 22379, Training Loss: 0.01211
Epoch: 22380, Training Loss: 0.00939
Epoch: 22380, Training Loss: 0.00983
Epoch: 22380, Training Loss: 0.00984
Epoch: 22380, Training Loss: 0.01211
Epoch: 22381, Training Loss: 0.00939
Epoch: 22381, Training Loss: 0.00983
Epoch: 22381, Training Loss: 0.00984
Epoch: 22381, Training Loss: 0.01211
Epoch: 22382, Training Loss: 0.00939
Epoch: 22382, Training Loss: 0.00983
Epoch: 22382, Training Loss: 0.00984
Epoch: 22382, Training Loss: 0.01211
Epoch: 22383, Training Loss: 0.00939
Epoch: 22383, Training Loss: 0.00982
Epoch: 22383, Training Loss: 0.00984
Epoch: 22383, Training Loss: 0.01211
Epoch: 22384, Training Loss: 0.00939
Epoch: 22384, Training Loss: 0.00982
Epoch: 22384, Training Loss: 0.00984
Epoch: 22384, Training Loss: 0.01211
Epoch: 22385, Training Loss: 0.00939
Epoch: 22385, Training Loss: 0.00982
Epoch: 22385, Training Loss: 0.00984
Epoch: 22385, Training Loss: 0.01211
Epoch: 22386, Training Loss: 0.00939
Epoch: 22386, Training Loss: 0.00982
Epoch: 22386, Training Loss: 0.00984
Epoch: 22386, Training Loss: 0.01211
Epoch: 22387, Training Loss: 0.00939
Epoch: 22387, Training Loss: 0.00982
Epoch: 22387, Training Loss: 0.00984
Epoch: 22387, Training Loss: 0.01211
Epoch: 22388, Training Loss: 0.00939
Epoch: 22388, Training Loss: 0.00982
Epoch: 22388, Training Loss: 0.00984
Epoch: 22388, Training Loss: 0.01211
Epoch: 22389, Training Loss: 0.00939
Epoch: 22389, Training Loss: 0.00982
Epoch: 22389, Training Loss: 0.00984
Epoch: 22389, Training Loss: 0.01211
Epoch: 22390, Training Loss: 0.00939
Epoch: 22390, Training Loss: 0.00982
Epoch: 22390, Training Loss: 0.00984
Epoch: 22390, Training Loss: 0.01211
Epoch: 22391, Training Loss: 0.00939
Epoch: 22391, Training Loss: 0.00982
Epoch: 22391, Training Loss: 0.00984
Epoch: 22391, Training Loss: 0.01211
Epoch: 22392, Training Loss: 0.00939
Epoch: 22392, Training Loss: 0.00982
Epoch: 22392, Training Loss: 0.00984
Epoch: 22392, Training Loss: 0.01211
Epoch: 22393, Training Loss: 0.00939
Epoch: 22393, Training Loss: 0.00982
Epoch: 22393, Training Loss: 0.00983
Epoch: 22393, Training Loss: 0.01211
Epoch: 22394, Training Loss: 0.00939
Epoch: 22394, Training Loss: 0.00982
Epoch: 22394, Training Loss: 0.00983
Epoch: 22394, Training Loss: 0.01211
Epoch: 22395, Training Loss: 0.00939
Epoch: 22395, Training Loss: 0.00982
Epoch: 22395, Training Loss: 0.00983
Epoch: 22395, Training Loss: 0.01211
Epoch: 22396, Training Loss: 0.00939
Epoch: 22396, Training Loss: 0.00982
Epoch: 22396, Training Loss: 0.00983
Epoch: 22396, Training Loss: 0.01211
Epoch: 22397, Training Loss: 0.00939
Epoch: 22397, Training Loss: 0.00982
Epoch: 22397, Training Loss: 0.00983
Epoch: 22397, Training Loss: 0.01211
Epoch: 22398, Training Loss: 0.00939
Epoch: 22398, Training Loss: 0.00982
Epoch: 22398, Training Loss: 0.00983
Epoch: 22398, Training Loss: 0.01211
Epoch: 22399, Training Loss: 0.00939
Epoch: 22399, Training Loss: 0.00982
Epoch: 22399, Training Loss: 0.00983
Epoch: 22399, Training Loss: 0.01211
Epoch: 22400, Training Loss: 0.00939
Epoch: 22400, Training Loss: 0.00982
Epoch: 22400, Training Loss: 0.00983
Epoch: 22400, Training Loss: 0.01210
Epoch: 22401, Training Loss: 0.00939
Epoch: 22401, Training Loss: 0.00982
Epoch: 22401, Training Loss: 0.00983
Epoch: 22401, Training Loss: 0.01210
Epoch: 22402, Training Loss: 0.00939
Epoch: 22402, Training Loss: 0.00982
Epoch: 22402, Training Loss: 0.00983
Epoch: 22402, Training Loss: 0.01210
Epoch: 22403, Training Loss: 0.00938
Epoch: 22403, Training Loss: 0.00982
Epoch: 22403, Training Loss: 0.00983
Epoch: 22403, Training Loss: 0.01210
Epoch: 22404, Training Loss: 0.00938
Epoch: 22404, Training Loss: 0.00982
Epoch: 22404, Training Loss: 0.00983
Epoch: 22404, Training Loss: 0.01210
Epoch: 22405, Training Loss: 0.00938
Epoch: 22405, Training Loss: 0.00982
Epoch: 22405, Training Loss: 0.00983
Epoch: 22405, Training Loss: 0.01210
Epoch: 22406, Training Loss: 0.00938
Epoch: 22406, Training Loss: 0.00982
Epoch: 22406, Training Loss: 0.00983
Epoch: 22406, Training Loss: 0.01210
Epoch: 22407, Training Loss: 0.00938
Epoch: 22407, Training Loss: 0.00982
Epoch: 22407, Training Loss: 0.00983
Epoch: 22407, Training Loss: 0.01210
Epoch: 22408, Training Loss: 0.00938
Epoch: 22408, Training Loss: 0.00982
Epoch: 22408, Training Loss: 0.00983
Epoch: 22408, Training Loss: 0.01210
Epoch: 22409, Training Loss: 0.00938
Epoch: 22409, Training Loss: 0.00982
Epoch: 22409, Training Loss: 0.00983
Epoch: 22409, Training Loss: 0.01210
Epoch: 22410, Training Loss: 0.00938
Epoch: 22410, Training Loss: 0.00982
Epoch: 22410, Training Loss: 0.00983
Epoch: 22410, Training Loss: 0.01210
Epoch: 22411, Training Loss: 0.00938
Epoch: 22411, Training Loss: 0.00982
Epoch: 22411, Training Loss: 0.00983
Epoch: 22411, Training Loss: 0.01210
Epoch: 22412, Training Loss: 0.00938
Epoch: 22412, Training Loss: 0.00982
Epoch: 22412, Training Loss: 0.00983
Epoch: 22412, Training Loss: 0.01210
Epoch: 22413, Training Loss: 0.00938
Epoch: 22413, Training Loss: 0.00982
Epoch: 22413, Training Loss: 0.00983
Epoch: 22413, Training Loss: 0.01210
Epoch: 22414, Training Loss: 0.00938
Epoch: 22414, Training Loss: 0.00982
Epoch: 22414, Training Loss: 0.00983
Epoch: 22414, Training Loss: 0.01210
Epoch: 22415, Training Loss: 0.00938
Epoch: 22415, Training Loss: 0.00982
Epoch: 22415, Training Loss: 0.00983
Epoch: 22415, Training Loss: 0.01210
Epoch: 22416, Training Loss: 0.00938
Epoch: 22416, Training Loss: 0.00982
Epoch: 22416, Training Loss: 0.00983
Epoch: 22416, Training Loss: 0.01210
Epoch: 22417, Training Loss: 0.00938
Epoch: 22417, Training Loss: 0.00982
Epoch: 22417, Training Loss: 0.00983
Epoch: 22417, Training Loss: 0.01210
Epoch: 22418, Training Loss: 0.00938
Epoch: 22418, Training Loss: 0.00982
Epoch: 22418, Training Loss: 0.00983
Epoch: 22418, Training Loss: 0.01210
Epoch: 22419, Training Loss: 0.00938
Epoch: 22419, Training Loss: 0.00982
Epoch: 22419, Training Loss: 0.00983
Epoch: 22419, Training Loss: 0.01210
Epoch: 22420, Training Loss: 0.00938
Epoch: 22420, Training Loss: 0.00982
Epoch: 22420, Training Loss: 0.00983
Epoch: 22420, Training Loss: 0.01210
Epoch: 22421, Training Loss: 0.00938
Epoch: 22421, Training Loss: 0.00982
Epoch: 22421, Training Loss: 0.00983
Epoch: 22421, Training Loss: 0.01210
Epoch: 22422, Training Loss: 0.00938
Epoch: 22422, Training Loss: 0.00981
Epoch: 22422, Training Loss: 0.00983
Epoch: 22422, Training Loss: 0.01210
Epoch: 22423, Training Loss: 0.00938
Epoch: 22423, Training Loss: 0.00981
Epoch: 22423, Training Loss: 0.00983
Epoch: 22423, Training Loss: 0.01210
Epoch: 22424, Training Loss: 0.00938
Epoch: 22424, Training Loss: 0.00981
Epoch: 22424, Training Loss: 0.00983
Epoch: 22424, Training Loss: 0.01210
Epoch: 22425, Training Loss: 0.00938
Epoch: 22425, Training Loss: 0.00981
Epoch: 22425, Training Loss: 0.00983
Epoch: 22425, Training Loss: 0.01210
Epoch: 22426, Training Loss: 0.00938
Epoch: 22426, Training Loss: 0.00981
Epoch: 22426, Training Loss: 0.00983
Epoch: 22426, Training Loss: 0.01210
Epoch: 22427, Training Loss: 0.00938
Epoch: 22427, Training Loss: 0.00981
Epoch: 22427, Training Loss: 0.00983
Epoch: 22427, Training Loss: 0.01210
Epoch: 22428, Training Loss: 0.00938
Epoch: 22428, Training Loss: 0.00981
Epoch: 22428, Training Loss: 0.00983
Epoch: 22428, Training Loss: 0.01210
Epoch: 22429, Training Loss: 0.00938
Epoch: 22429, Training Loss: 0.00981
Epoch: 22429, Training Loss: 0.00983
Epoch: 22429, Training Loss: 0.01210
Epoch: 22430, Training Loss: 0.00938
Epoch: 22430, Training Loss: 0.00981
Epoch: 22430, Training Loss: 0.00983
Epoch: 22430, Training Loss: 0.01210
Epoch: 22431, Training Loss: 0.00938
Epoch: 22431, Training Loss: 0.00981
Epoch: 22431, Training Loss: 0.00983
Epoch: 22431, Training Loss: 0.01210
Epoch: 22432, Training Loss: 0.00938
Epoch: 22432, Training Loss: 0.00981
Epoch: 22432, Training Loss: 0.00982
Epoch: 22432, Training Loss: 0.01209
Epoch: 22433, Training Loss: 0.00938
Epoch: 22433, Training Loss: 0.00981
Epoch: 22433, Training Loss: 0.00982
Epoch: 22433, Training Loss: 0.01209
Epoch: 22434, Training Loss: 0.00938
Epoch: 22434, Training Loss: 0.00981
Epoch: 22434, Training Loss: 0.00982
Epoch: 22434, Training Loss: 0.01209
Epoch: 22435, Training Loss: 0.00938
Epoch: 22435, Training Loss: 0.00981
Epoch: 22435, Training Loss: 0.00982
Epoch: 22435, Training Loss: 0.01209
Epoch: 22436, Training Loss: 0.00938
Epoch: 22436, Training Loss: 0.00981
Epoch: 22436, Training Loss: 0.00982
Epoch: 22436, Training Loss: 0.01209
Epoch: 22437, Training Loss: 0.00938
Epoch: 22437, Training Loss: 0.00981
Epoch: 22437, Training Loss: 0.00982
Epoch: 22437, Training Loss: 0.01209
Epoch: 22438, Training Loss: 0.00938
Epoch: 22438, Training Loss: 0.00981
Epoch: 22438, Training Loss: 0.00982
Epoch: 22438, Training Loss: 0.01209
Epoch: 22439, Training Loss: 0.00938
Epoch: 22439, Training Loss: 0.00981
Epoch: 22439, Training Loss: 0.00982
Epoch: 22439, Training Loss: 0.01209
Epoch: 22440, Training Loss: 0.00938
Epoch: 22440, Training Loss: 0.00981
Epoch: 22440, Training Loss: 0.00982
Epoch: 22440, Training Loss: 0.01209
Epoch: 22441, Training Loss: 0.00938
Epoch: 22441, Training Loss: 0.00981
Epoch: 22441, Training Loss: 0.00982
Epoch: 22441, Training Loss: 0.01209
Epoch: 22442, Training Loss: 0.00938
Epoch: 22442, Training Loss: 0.00981
Epoch: 22442, Training Loss: 0.00982
Epoch: 22442, Training Loss: 0.01209
Epoch: 22443, Training Loss: 0.00938
Epoch: 22443, Training Loss: 0.00981
Epoch: 22443, Training Loss: 0.00982
Epoch: 22443, Training Loss: 0.01209
Epoch: 22444, Training Loss: 0.00938
Epoch: 22444, Training Loss: 0.00981
Epoch: 22444, Training Loss: 0.00982
Epoch: 22444, Training Loss: 0.01209
Epoch: 22445, Training Loss: 0.00938
Epoch: 22445, Training Loss: 0.00981
Epoch: 22445, Training Loss: 0.00982
Epoch: 22445, Training Loss: 0.01209
Epoch: 22446, Training Loss: 0.00937
Epoch: 22446, Training Loss: 0.00981
Epoch: 22446, Training Loss: 0.00982
Epoch: 22446, Training Loss: 0.01209
Epoch: 22447, Training Loss: 0.00937
Epoch: 22447, Training Loss: 0.00981
Epoch: 22447, Training Loss: 0.00982
Epoch: 22447, Training Loss: 0.01209
Epoch: 22448, Training Loss: 0.00937
Epoch: 22448, Training Loss: 0.00981
Epoch: 22448, Training Loss: 0.00982
Epoch: 22448, Training Loss: 0.01209
Epoch: 22449, Training Loss: 0.00937
Epoch: 22449, Training Loss: 0.00981
Epoch: 22449, Training Loss: 0.00982
Epoch: 22449, Training Loss: 0.01209
Epoch: 22450, Training Loss: 0.00937
Epoch: 22450, Training Loss: 0.00981
Epoch: 22450, Training Loss: 0.00982
Epoch: 22450, Training Loss: 0.01209
Epoch: 22451, Training Loss: 0.00937
Epoch: 22451, Training Loss: 0.00981
Epoch: 22451, Training Loss: 0.00982
Epoch: 22451, Training Loss: 0.01209
Epoch: 22452, Training Loss: 0.00937
Epoch: 22452, Training Loss: 0.00981
Epoch: 22452, Training Loss: 0.00982
Epoch: 22452, Training Loss: 0.01209
Epoch: 22453, Training Loss: 0.00937
Epoch: 22453, Training Loss: 0.00981
Epoch: 22453, Training Loss: 0.00982
Epoch: 22453, Training Loss: 0.01209
Epoch: 22454, Training Loss: 0.00937
Epoch: 22454, Training Loss: 0.00981
Epoch: 22454, Training Loss: 0.00982
Epoch: 22454, Training Loss: 0.01209
Epoch: 22455, Training Loss: 0.00937
Epoch: 22455, Training Loss: 0.00981
Epoch: 22455, Training Loss: 0.00982
Epoch: 22455, Training Loss: 0.01209
Epoch: 22456, Training Loss: 0.00937
Epoch: 22456, Training Loss: 0.00981
Epoch: 22456, Training Loss: 0.00982
Epoch: 22456, Training Loss: 0.01209
Epoch: 22457, Training Loss: 0.00937
Epoch: 22457, Training Loss: 0.00981
Epoch: 22457, Training Loss: 0.00982
Epoch: 22457, Training Loss: 0.01209
Epoch: 22458, Training Loss: 0.00937
Epoch: 22458, Training Loss: 0.00981
Epoch: 22458, Training Loss: 0.00982
Epoch: 22458, Training Loss: 0.01209
Epoch: 22459, Training Loss: 0.00937
Epoch: 22459, Training Loss: 0.00981
Epoch: 22459, Training Loss: 0.00982
Epoch: 22459, Training Loss: 0.01209
Epoch: 22460, Training Loss: 0.00937
Epoch: 22460, Training Loss: 0.00981
Epoch: 22460, Training Loss: 0.00982
Epoch: 22460, Training Loss: 0.01209
Epoch: 22461, Training Loss: 0.00937
Epoch: 22461, Training Loss: 0.00980
Epoch: 22461, Training Loss: 0.00982
Epoch: 22461, Training Loss: 0.01209
Epoch: 22462, Training Loss: 0.00937
Epoch: 22462, Training Loss: 0.00980
Epoch: 22462, Training Loss: 0.00982
Epoch: 22462, Training Loss: 0.01209
Epoch: 22463, Training Loss: 0.00937
Epoch: 22463, Training Loss: 0.00980
Epoch: 22463, Training Loss: 0.00982
Epoch: 22463, Training Loss: 0.01208
Epoch: 22464, Training Loss: 0.00937
Epoch: 22464, Training Loss: 0.00980
Epoch: 22464, Training Loss: 0.00982
Epoch: 22464, Training Loss: 0.01208
Epoch: 22465, Training Loss: 0.00937
Epoch: 22465, Training Loss: 0.00980
Epoch: 22465, Training Loss: 0.00982
Epoch: 22465, Training Loss: 0.01208
Epoch: 22466, Training Loss: 0.00937
Epoch: 22466, Training Loss: 0.00980
Epoch: 22466, Training Loss: 0.00982
Epoch: 22466, Training Loss: 0.01208
Epoch: 22467, Training Loss: 0.00937
Epoch: 22467, Training Loss: 0.00980
Epoch: 22467, Training Loss: 0.00982
Epoch: 22467, Training Loss: 0.01208
Epoch: 22468, Training Loss: 0.00937
Epoch: 22468, Training Loss: 0.00980
Epoch: 22468, Training Loss: 0.00982
Epoch: 22468, Training Loss: 0.01208
Epoch: 22469, Training Loss: 0.00937
Epoch: 22469, Training Loss: 0.00980
Epoch: 22469, Training Loss: 0.00982
Epoch: 22469, Training Loss: 0.01208
Epoch: 22470, Training Loss: 0.00937
Epoch: 22470, Training Loss: 0.00980
Epoch: 22470, Training Loss: 0.00982
Epoch: 22470, Training Loss: 0.01208
Epoch: 22471, Training Loss: 0.00937
Epoch: 22471, Training Loss: 0.00980
Epoch: 22471, Training Loss: 0.00981
Epoch: 22471, Training Loss: 0.01208
Epoch: 22472, Training Loss: 0.00937
Epoch: 22472, Training Loss: 0.00980
Epoch: 22472, Training Loss: 0.00981
Epoch: 22472, Training Loss: 0.01208
Epoch: 22473, Training Loss: 0.00937
Epoch: 22473, Training Loss: 0.00980
Epoch: 22473, Training Loss: 0.00981
Epoch: 22473, Training Loss: 0.01208
Epoch: 22474, Training Loss: 0.00937
Epoch: 22474, Training Loss: 0.00980
Epoch: 22474, Training Loss: 0.00981
Epoch: 22474, Training Loss: 0.01208
Epoch: 22475, Training Loss: 0.00937
Epoch: 22475, Training Loss: 0.00980
Epoch: 22475, Training Loss: 0.00981
Epoch: 22475, Training Loss: 0.01208
Epoch: 22476, Training Loss: 0.00937
Epoch: 22476, Training Loss: 0.00980
Epoch: 22476, Training Loss: 0.00981
Epoch: 22476, Training Loss: 0.01208
Epoch: 22477, Training Loss: 0.00937
Epoch: 22477, Training Loss: 0.00980
Epoch: 22477, Training Loss: 0.00981
Epoch: 22477, Training Loss: 0.01208
Epoch: 22478, Training Loss: 0.00937
Epoch: 22478, Training Loss: 0.00980
Epoch: 22478, Training Loss: 0.00981
Epoch: 22478, Training Loss: 0.01208
Epoch: 22479, Training Loss: 0.00937
Epoch: 22479, Training Loss: 0.00980
Epoch: 22479, Training Loss: 0.00981
Epoch: 22479, Training Loss: 0.01208
Epoch: 22480, Training Loss: 0.00937
Epoch: 22480, Training Loss: 0.00980
Epoch: 22480, Training Loss: 0.00981
Epoch: 22480, Training Loss: 0.01208
Epoch: 22481, Training Loss: 0.00937
Epoch: 22481, Training Loss: 0.00980
Epoch: 22481, Training Loss: 0.00981
Epoch: 22481, Training Loss: 0.01208
Epoch: 22482, Training Loss: 0.00937
Epoch: 22482, Training Loss: 0.00980
Epoch: 22482, Training Loss: 0.00981
Epoch: 22482, Training Loss: 0.01208
Epoch: 22483, Training Loss: 0.00937
Epoch: 22483, Training Loss: 0.00980
Epoch: 22483, Training Loss: 0.00981
Epoch: 22483, Training Loss: 0.01208
Epoch: 22484, Training Loss: 0.00937
Epoch: 22484, Training Loss: 0.00980
Epoch: 22484, Training Loss: 0.00981
Epoch: 22484, Training Loss: 0.01208
Epoch: 22485, Training Loss: 0.00937
Epoch: 22485, Training Loss: 0.00980
Epoch: 22485, Training Loss: 0.00981
Epoch: 22485, Training Loss: 0.01208
Epoch: 22486, Training Loss: 0.00937
Epoch: 22486, Training Loss: 0.00980
Epoch: 22486, Training Loss: 0.00981
Epoch: 22486, Training Loss: 0.01208
Epoch: 22487, Training Loss: 0.00937
Epoch: 22487, Training Loss: 0.00980
Epoch: 22487, Training Loss: 0.00981
Epoch: 22487, Training Loss: 0.01208
Epoch: 22488, Training Loss: 0.00936
Epoch: 22488, Training Loss: 0.00980
Epoch: 22488, Training Loss: 0.00981
Epoch: 22488, Training Loss: 0.01208
Epoch: 22489, Training Loss: 0.00936
Epoch: 22489, Training Loss: 0.00980
Epoch: 22489, Training Loss: 0.00981
Epoch: 22489, Training Loss: 0.01208
Epoch: 22490, Training Loss: 0.00936
Epoch: 22490, Training Loss: 0.00980
Epoch: 22490, Training Loss: 0.00981
Epoch: 22490, Training Loss: 0.01208
Epoch: 22491, Training Loss: 0.00936
Epoch: 22491, Training Loss: 0.00980
Epoch: 22491, Training Loss: 0.00981
Epoch: 22491, Training Loss: 0.01208
Epoch: 22492, Training Loss: 0.00936
Epoch: 22492, Training Loss: 0.00980
Epoch: 22492, Training Loss: 0.00981
Epoch: 22492, Training Loss: 0.01208
Epoch: 22493, Training Loss: 0.00936
Epoch: 22493, Training Loss: 0.00980
Epoch: 22493, Training Loss: 0.00981
Epoch: 22493, Training Loss: 0.01208
Epoch: 22494, Training Loss: 0.00936
Epoch: 22494, Training Loss: 0.00980
Epoch: 22494, Training Loss: 0.00981
Epoch: 22494, Training Loss: 0.01208
Epoch: 22495, Training Loss: 0.00936
Epoch: 22495, Training Loss: 0.00980
Epoch: 22495, Training Loss: 0.00981
Epoch: 22495, Training Loss: 0.01207
Epoch: 22496, Training Loss: 0.00936
Epoch: 22496, Training Loss: 0.00980
Epoch: 22496, Training Loss: 0.00981
Epoch: 22496, Training Loss: 0.01207
Epoch: 22497, Training Loss: 0.00936
Epoch: 22497, Training Loss: 0.00980
Epoch: 22497, Training Loss: 0.00981
Epoch: 22497, Training Loss: 0.01207
Epoch: 22498, Training Loss: 0.00936
Epoch: 22498, Training Loss: 0.00980
Epoch: 22498, Training Loss: 0.00981
Epoch: 22498, Training Loss: 0.01207
Epoch: 22499, Training Loss: 0.00936
Epoch: 22499, Training Loss: 0.00980
Epoch: 22499, Training Loss: 0.00981
Epoch: 22499, Training Loss: 0.01207
Epoch: 22500, Training Loss: 0.00936
Epoch: 22500, Training Loss: 0.00980
Epoch: 22500, Training Loss: 0.00981
Epoch: 22500, Training Loss: 0.01207
Epoch: 22501, Training Loss: 0.00936
Epoch: 22501, Training Loss: 0.00979
Epoch: 22501, Training Loss: 0.00981
Epoch: 22501, Training Loss: 0.01207
Epoch: 22502, Training Loss: 0.00936
Epoch: 22502, Training Loss: 0.00979
Epoch: 22502, Training Loss: 0.00981
Epoch: 22502, Training Loss: 0.01207
Epoch: 22503, Training Loss: 0.00936
Epoch: 22503, Training Loss: 0.00979
Epoch: 22503, Training Loss: 0.00981
Epoch: 22503, Training Loss: 0.01207
Epoch: 22504, Training Loss: 0.00936
Epoch: 22504, Training Loss: 0.00979
Epoch: 22504, Training Loss: 0.00981
Epoch: 22504, Training Loss: 0.01207
Epoch: 22505, Training Loss: 0.00936
Epoch: 22505, Training Loss: 0.00979
Epoch: 22505, Training Loss: 0.00981
Epoch: 22505, Training Loss: 0.01207
Epoch: 22506, Training Loss: 0.00936
Epoch: 22506, Training Loss: 0.00979
Epoch: 22506, Training Loss: 0.00981
Epoch: 22506, Training Loss: 0.01207
Epoch: 22507, Training Loss: 0.00936
Epoch: 22507, Training Loss: 0.00979
Epoch: 22507, Training Loss: 0.00981
Epoch: 22507, Training Loss: 0.01207
Epoch: 22508, Training Loss: 0.00936
Epoch: 22508, Training Loss: 0.00979
Epoch: 22508, Training Loss: 0.00981
Epoch: 22508, Training Loss: 0.01207
Epoch: 22509, Training Loss: 0.00936
Epoch: 22509, Training Loss: 0.00979
Epoch: 22509, Training Loss: 0.00981
Epoch: 22509, Training Loss: 0.01207
Epoch: 22510, Training Loss: 0.00936
Epoch: 22510, Training Loss: 0.00979
Epoch: 22510, Training Loss: 0.00981
Epoch: 22510, Training Loss: 0.01207
Epoch: 22511, Training Loss: 0.00936
Epoch: 22511, Training Loss: 0.00979
Epoch: 22511, Training Loss: 0.00980
Epoch: 22511, Training Loss: 0.01207
Epoch: 22512, Training Loss: 0.00936
Epoch: 22512, Training Loss: 0.00979
Epoch: 22512, Training Loss: 0.00980
Epoch: 22512, Training Loss: 0.01207
Epoch: 22513, Training Loss: 0.00936
Epoch: 22513, Training Loss: 0.00979
Epoch: 22513, Training Loss: 0.00980
Epoch: 22513, Training Loss: 0.01207
Epoch: 22514, Training Loss: 0.00936
Epoch: 22514, Training Loss: 0.00979
Epoch: 22514, Training Loss: 0.00980
Epoch: 22514, Training Loss: 0.01207
Epoch: 22515, Training Loss: 0.00936
Epoch: 22515, Training Loss: 0.00979
Epoch: 22515, Training Loss: 0.00980
Epoch: 22515, Training Loss: 0.01207
Epoch: 22516, Training Loss: 0.00936
Epoch: 22516, Training Loss: 0.00979
Epoch: 22516, Training Loss: 0.00980
Epoch: 22516, Training Loss: 0.01207
Epoch: 22517, Training Loss: 0.00936
Epoch: 22517, Training Loss: 0.00979
Epoch: 22517, Training Loss: 0.00980
Epoch: 22517, Training Loss: 0.01207
Epoch: 22518, Training Loss: 0.00936
Epoch: 22518, Training Loss: 0.00979
Epoch: 22518, Training Loss: 0.00980
Epoch: 22518, Training Loss: 0.01207
Epoch: 22519, Training Loss: 0.00936
Epoch: 22519, Training Loss: 0.00979
Epoch: 22519, Training Loss: 0.00980
Epoch: 22519, Training Loss: 0.01207
Epoch: 22520, Training Loss: 0.00936
Epoch: 22520, Training Loss: 0.00979
Epoch: 22520, Training Loss: 0.00980
Epoch: 22520, Training Loss: 0.01207
Epoch: 22521, Training Loss: 0.00936
Epoch: 22521, Training Loss: 0.00979
Epoch: 22521, Training Loss: 0.00980
Epoch: 22521, Training Loss: 0.01207
Epoch: 22522, Training Loss: 0.00936
Epoch: 22522, Training Loss: 0.00979
Epoch: 22522, Training Loss: 0.00980
Epoch: 22522, Training Loss: 0.01207
Epoch: 22523, Training Loss: 0.00936
Epoch: 22523, Training Loss: 0.00979
Epoch: 22523, Training Loss: 0.00980
Epoch: 22523, Training Loss: 0.01207
Epoch: 22524, Training Loss: 0.00936
Epoch: 22524, Training Loss: 0.00979
Epoch: 22524, Training Loss: 0.00980
Epoch: 22524, Training Loss: 0.01207
Epoch: 22525, Training Loss: 0.00936
Epoch: 22525, Training Loss: 0.00979
Epoch: 22525, Training Loss: 0.00980
Epoch: 22525, Training Loss: 0.01207
Epoch: 22526, Training Loss: 0.00936
Epoch: 22526, Training Loss: 0.00979
Epoch: 22526, Training Loss: 0.00980
Epoch: 22526, Training Loss: 0.01206
Epoch: 22527, Training Loss: 0.00936
Epoch: 22527, Training Loss: 0.00979
Epoch: 22527, Training Loss: 0.00980
Epoch: 22527, Training Loss: 0.01206
Epoch: 22528, Training Loss: 0.00936
Epoch: 22528, Training Loss: 0.00979
Epoch: 22528, Training Loss: 0.00980
Epoch: 22528, Training Loss: 0.01206
Epoch: 22529, Training Loss: 0.00936
Epoch: 22529, Training Loss: 0.00979
Epoch: 22529, Training Loss: 0.00980
Epoch: 22529, Training Loss: 0.01206
Epoch: 22530, Training Loss: 0.00935
Epoch: 22530, Training Loss: 0.00979
Epoch: 22530, Training Loss: 0.00980
Epoch: 22530, Training Loss: 0.01206
Epoch: 22531, Training Loss: 0.00935
Epoch: 22531, Training Loss: 0.00979
Epoch: 22531, Training Loss: 0.00980
Epoch: 22531, Training Loss: 0.01206
Epoch: 22532, Training Loss: 0.00935
Epoch: 22532, Training Loss: 0.00979
Epoch: 22532, Training Loss: 0.00980
Epoch: 22532, Training Loss: 0.01206
Epoch: 22533, Training Loss: 0.00935
Epoch: 22533, Training Loss: 0.00979
Epoch: 22533, Training Loss: 0.00980
Epoch: 22533, Training Loss: 0.01206
Epoch: 22534, Training Loss: 0.00935
Epoch: 22534, Training Loss: 0.00979
Epoch: 22534, Training Loss: 0.00980
Epoch: 22534, Training Loss: 0.01206
Epoch: 22535, Training Loss: 0.00935
Epoch: 22535, Training Loss: 0.00979
Epoch: 22535, Training Loss: 0.00980
Epoch: 22535, Training Loss: 0.01206
Epoch: 22536, Training Loss: 0.00935
Epoch: 22536, Training Loss: 0.00979
Epoch: 22536, Training Loss: 0.00980
Epoch: 22536, Training Loss: 0.01206
Epoch: 22537, Training Loss: 0.00935
Epoch: 22537, Training Loss: 0.00979
Epoch: 22537, Training Loss: 0.00980
Epoch: 22537, Training Loss: 0.01206
Epoch: 22538, Training Loss: 0.00935
Epoch: 22538, Training Loss: 0.00979
Epoch: 22538, Training Loss: 0.00980
Epoch: 22538, Training Loss: 0.01206
... (repetitive per-sample loss output for epochs 22539-22780 omitted; the four per-sample losses decrease very slowly, from 0.00935/0.00979/0.00980/0.01206 at epoch 22538 to about 0.00930/0.00973/0.00974/0.01199 at epoch 22781) ...
Epoch: 22781, Training Loss: 0.00930
Epoch: 22781, Training Loss: 0.00972
Epoch: 22781, Training Loss: 0.00974
Epoch: 22781, Training Loss: 0.01199
Epoch: 22782, Training Loss: 0.00930
Epoch: 22782, Training Loss: 0.00972
Epoch: 22782, Training Loss: 0.00974
Epoch: 22782, Training Loss: 0.01198
Epoch: 22783, Training Loss: 0.00930
Epoch: 22783, Training Loss: 0.00972
Epoch: 22783, Training Loss: 0.00974
Epoch: 22783, Training Loss: 0.01198
Epoch: 22784, Training Loss: 0.00930
Epoch: 22784, Training Loss: 0.00972
Epoch: 22784, Training Loss: 0.00974
Epoch: 22784, Training Loss: 0.01198
Epoch: 22785, Training Loss: 0.00930
Epoch: 22785, Training Loss: 0.00972
Epoch: 22785, Training Loss: 0.00974
Epoch: 22785, Training Loss: 0.01198
Epoch: 22786, Training Loss: 0.00930
Epoch: 22786, Training Loss: 0.00972
Epoch: 22786, Training Loss: 0.00974
Epoch: 22786, Training Loss: 0.01198
Epoch: 22787, Training Loss: 0.00930
Epoch: 22787, Training Loss: 0.00972
Epoch: 22787, Training Loss: 0.00974
Epoch: 22787, Training Loss: 0.01198
Epoch: 22788, Training Loss: 0.00929
Epoch: 22788, Training Loss: 0.00972
Epoch: 22788, Training Loss: 0.00974
Epoch: 22788, Training Loss: 0.01198
Epoch: 22789, Training Loss: 0.00929
Epoch: 22789, Training Loss: 0.00972
Epoch: 22789, Training Loss: 0.00974
Epoch: 22789, Training Loss: 0.01198
Epoch: 22790, Training Loss: 0.00929
Epoch: 22790, Training Loss: 0.00972
Epoch: 22790, Training Loss: 0.00973
Epoch: 22790, Training Loss: 0.01198
Epoch: 22791, Training Loss: 0.00929
Epoch: 22791, Training Loss: 0.00972
Epoch: 22791, Training Loss: 0.00973
Epoch: 22791, Training Loss: 0.01198
Epoch: 22792, Training Loss: 0.00929
Epoch: 22792, Training Loss: 0.00972
Epoch: 22792, Training Loss: 0.00973
Epoch: 22792, Training Loss: 0.01198
Epoch: 22793, Training Loss: 0.00929
Epoch: 22793, Training Loss: 0.00972
Epoch: 22793, Training Loss: 0.00973
Epoch: 22793, Training Loss: 0.01198
Epoch: 22794, Training Loss: 0.00929
Epoch: 22794, Training Loss: 0.00972
Epoch: 22794, Training Loss: 0.00973
Epoch: 22794, Training Loss: 0.01198
Epoch: 22795, Training Loss: 0.00929
Epoch: 22795, Training Loss: 0.00972
Epoch: 22795, Training Loss: 0.00973
Epoch: 22795, Training Loss: 0.01198
Epoch: 22796, Training Loss: 0.00929
Epoch: 22796, Training Loss: 0.00972
Epoch: 22796, Training Loss: 0.00973
Epoch: 22796, Training Loss: 0.01198
Epoch: 22797, Training Loss: 0.00929
Epoch: 22797, Training Loss: 0.00972
Epoch: 22797, Training Loss: 0.00973
Epoch: 22797, Training Loss: 0.01198
Epoch: 22798, Training Loss: 0.00929
Epoch: 22798, Training Loss: 0.00972
Epoch: 22798, Training Loss: 0.00973
Epoch: 22798, Training Loss: 0.01198
Epoch: 22799, Training Loss: 0.00929
Epoch: 22799, Training Loss: 0.00972
Epoch: 22799, Training Loss: 0.00973
Epoch: 22799, Training Loss: 0.01198
Epoch: 22800, Training Loss: 0.00929
Epoch: 22800, Training Loss: 0.00972
Epoch: 22800, Training Loss: 0.00973
Epoch: 22800, Training Loss: 0.01198
Epoch: 22801, Training Loss: 0.00929
Epoch: 22801, Training Loss: 0.00972
Epoch: 22801, Training Loss: 0.00973
Epoch: 22801, Training Loss: 0.01198
Epoch: 22802, Training Loss: 0.00929
Epoch: 22802, Training Loss: 0.00972
Epoch: 22802, Training Loss: 0.00973
Epoch: 22802, Training Loss: 0.01198
Epoch: 22803, Training Loss: 0.00929
Epoch: 22803, Training Loss: 0.00972
Epoch: 22803, Training Loss: 0.00973
Epoch: 22803, Training Loss: 0.01198
Epoch: 22804, Training Loss: 0.00929
Epoch: 22804, Training Loss: 0.00972
Epoch: 22804, Training Loss: 0.00973
Epoch: 22804, Training Loss: 0.01198
Epoch: 22805, Training Loss: 0.00929
Epoch: 22805, Training Loss: 0.00972
Epoch: 22805, Training Loss: 0.00973
Epoch: 22805, Training Loss: 0.01198
Epoch: 22806, Training Loss: 0.00929
Epoch: 22806, Training Loss: 0.00972
Epoch: 22806, Training Loss: 0.00973
Epoch: 22806, Training Loss: 0.01198
Epoch: 22807, Training Loss: 0.00929
Epoch: 22807, Training Loss: 0.00972
Epoch: 22807, Training Loss: 0.00973
Epoch: 22807, Training Loss: 0.01198
Epoch: 22808, Training Loss: 0.00929
Epoch: 22808, Training Loss: 0.00972
Epoch: 22808, Training Loss: 0.00973
Epoch: 22808, Training Loss: 0.01198
Epoch: 22809, Training Loss: 0.00929
Epoch: 22809, Training Loss: 0.00972
Epoch: 22809, Training Loss: 0.00973
Epoch: 22809, Training Loss: 0.01198
Epoch: 22810, Training Loss: 0.00929
Epoch: 22810, Training Loss: 0.00972
Epoch: 22810, Training Loss: 0.00973
Epoch: 22810, Training Loss: 0.01198
Epoch: 22811, Training Loss: 0.00929
Epoch: 22811, Training Loss: 0.00972
Epoch: 22811, Training Loss: 0.00973
Epoch: 22811, Training Loss: 0.01198
Epoch: 22812, Training Loss: 0.00929
Epoch: 22812, Training Loss: 0.00972
Epoch: 22812, Training Loss: 0.00973
Epoch: 22812, Training Loss: 0.01198
Epoch: 22813, Training Loss: 0.00929
Epoch: 22813, Training Loss: 0.00972
Epoch: 22813, Training Loss: 0.00973
Epoch: 22813, Training Loss: 0.01198
Epoch: 22814, Training Loss: 0.00929
Epoch: 22814, Training Loss: 0.00972
Epoch: 22814, Training Loss: 0.00973
Epoch: 22814, Training Loss: 0.01197
Epoch: 22815, Training Loss: 0.00929
Epoch: 22815, Training Loss: 0.00972
Epoch: 22815, Training Loss: 0.00973
Epoch: 22815, Training Loss: 0.01197
Epoch: 22816, Training Loss: 0.00929
Epoch: 22816, Training Loss: 0.00972
Epoch: 22816, Training Loss: 0.00973
Epoch: 22816, Training Loss: 0.01197
Epoch: 22817, Training Loss: 0.00929
Epoch: 22817, Training Loss: 0.00972
Epoch: 22817, Training Loss: 0.00973
Epoch: 22817, Training Loss: 0.01197
Epoch: 22818, Training Loss: 0.00929
Epoch: 22818, Training Loss: 0.00972
Epoch: 22818, Training Loss: 0.00973
Epoch: 22818, Training Loss: 0.01197
Epoch: 22819, Training Loss: 0.00929
Epoch: 22819, Training Loss: 0.00972
Epoch: 22819, Training Loss: 0.00973
Epoch: 22819, Training Loss: 0.01197
Epoch: 22820, Training Loss: 0.00929
Epoch: 22820, Training Loss: 0.00972
Epoch: 22820, Training Loss: 0.00973
Epoch: 22820, Training Loss: 0.01197
Epoch: 22821, Training Loss: 0.00929
Epoch: 22821, Training Loss: 0.00971
Epoch: 22821, Training Loss: 0.00973
Epoch: 22821, Training Loss: 0.01197
Epoch: 22822, Training Loss: 0.00929
Epoch: 22822, Training Loss: 0.00971
Epoch: 22822, Training Loss: 0.00973
Epoch: 22822, Training Loss: 0.01197
Epoch: 22823, Training Loss: 0.00929
Epoch: 22823, Training Loss: 0.00971
Epoch: 22823, Training Loss: 0.00973
Epoch: 22823, Training Loss: 0.01197
Epoch: 22824, Training Loss: 0.00929
Epoch: 22824, Training Loss: 0.00971
Epoch: 22824, Training Loss: 0.00973
Epoch: 22824, Training Loss: 0.01197
Epoch: 22825, Training Loss: 0.00929
Epoch: 22825, Training Loss: 0.00971
Epoch: 22825, Training Loss: 0.00973
Epoch: 22825, Training Loss: 0.01197
Epoch: 22826, Training Loss: 0.00929
Epoch: 22826, Training Loss: 0.00971
Epoch: 22826, Training Loss: 0.00973
Epoch: 22826, Training Loss: 0.01197
Epoch: 22827, Training Loss: 0.00929
Epoch: 22827, Training Loss: 0.00971
Epoch: 22827, Training Loss: 0.00973
Epoch: 22827, Training Loss: 0.01197
Epoch: 22828, Training Loss: 0.00929
Epoch: 22828, Training Loss: 0.00971
Epoch: 22828, Training Loss: 0.00973
Epoch: 22828, Training Loss: 0.01197
Epoch: 22829, Training Loss: 0.00929
Epoch: 22829, Training Loss: 0.00971
Epoch: 22829, Training Loss: 0.00973
Epoch: 22829, Training Loss: 0.01197
Epoch: 22830, Training Loss: 0.00929
Epoch: 22830, Training Loss: 0.00971
Epoch: 22830, Training Loss: 0.00972
Epoch: 22830, Training Loss: 0.01197
Epoch: 22831, Training Loss: 0.00928
Epoch: 22831, Training Loss: 0.00971
Epoch: 22831, Training Loss: 0.00972
Epoch: 22831, Training Loss: 0.01197
Epoch: 22832, Training Loss: 0.00928
Epoch: 22832, Training Loss: 0.00971
Epoch: 22832, Training Loss: 0.00972
Epoch: 22832, Training Loss: 0.01197
Epoch: 22833, Training Loss: 0.00928
Epoch: 22833, Training Loss: 0.00971
Epoch: 22833, Training Loss: 0.00972
Epoch: 22833, Training Loss: 0.01197
Epoch: 22834, Training Loss: 0.00928
Epoch: 22834, Training Loss: 0.00971
Epoch: 22834, Training Loss: 0.00972
Epoch: 22834, Training Loss: 0.01197
Epoch: 22835, Training Loss: 0.00928
Epoch: 22835, Training Loss: 0.00971
Epoch: 22835, Training Loss: 0.00972
Epoch: 22835, Training Loss: 0.01197
Epoch: 22836, Training Loss: 0.00928
Epoch: 22836, Training Loss: 0.00971
Epoch: 22836, Training Loss: 0.00972
Epoch: 22836, Training Loss: 0.01197
Epoch: 22837, Training Loss: 0.00928
Epoch: 22837, Training Loss: 0.00971
Epoch: 22837, Training Loss: 0.00972
Epoch: 22837, Training Loss: 0.01197
Epoch: 22838, Training Loss: 0.00928
Epoch: 22838, Training Loss: 0.00971
Epoch: 22838, Training Loss: 0.00972
Epoch: 22838, Training Loss: 0.01197
Epoch: 22839, Training Loss: 0.00928
Epoch: 22839, Training Loss: 0.00971
Epoch: 22839, Training Loss: 0.00972
Epoch: 22839, Training Loss: 0.01197
Epoch: 22840, Training Loss: 0.00928
Epoch: 22840, Training Loss: 0.00971
Epoch: 22840, Training Loss: 0.00972
Epoch: 22840, Training Loss: 0.01197
Epoch: 22841, Training Loss: 0.00928
Epoch: 22841, Training Loss: 0.00971
Epoch: 22841, Training Loss: 0.00972
Epoch: 22841, Training Loss: 0.01197
Epoch: 22842, Training Loss: 0.00928
Epoch: 22842, Training Loss: 0.00971
Epoch: 22842, Training Loss: 0.00972
Epoch: 22842, Training Loss: 0.01197
Epoch: 22843, Training Loss: 0.00928
Epoch: 22843, Training Loss: 0.00971
Epoch: 22843, Training Loss: 0.00972
Epoch: 22843, Training Loss: 0.01197
Epoch: 22844, Training Loss: 0.00928
Epoch: 22844, Training Loss: 0.00971
Epoch: 22844, Training Loss: 0.00972
Epoch: 22844, Training Loss: 0.01197
Epoch: 22845, Training Loss: 0.00928
Epoch: 22845, Training Loss: 0.00971
Epoch: 22845, Training Loss: 0.00972
Epoch: 22845, Training Loss: 0.01197
Epoch: 22846, Training Loss: 0.00928
Epoch: 22846, Training Loss: 0.00971
Epoch: 22846, Training Loss: 0.00972
Epoch: 22846, Training Loss: 0.01197
Epoch: 22847, Training Loss: 0.00928
Epoch: 22847, Training Loss: 0.00971
Epoch: 22847, Training Loss: 0.00972
Epoch: 22847, Training Loss: 0.01196
Epoch: 22848, Training Loss: 0.00928
Epoch: 22848, Training Loss: 0.00971
Epoch: 22848, Training Loss: 0.00972
Epoch: 22848, Training Loss: 0.01196
Epoch: 22849, Training Loss: 0.00928
Epoch: 22849, Training Loss: 0.00971
Epoch: 22849, Training Loss: 0.00972
Epoch: 22849, Training Loss: 0.01196
Epoch: 22850, Training Loss: 0.00928
Epoch: 22850, Training Loss: 0.00971
Epoch: 22850, Training Loss: 0.00972
Epoch: 22850, Training Loss: 0.01196
Epoch: 22851, Training Loss: 0.00928
Epoch: 22851, Training Loss: 0.00971
Epoch: 22851, Training Loss: 0.00972
Epoch: 22851, Training Loss: 0.01196
Epoch: 22852, Training Loss: 0.00928
Epoch: 22852, Training Loss: 0.00971
Epoch: 22852, Training Loss: 0.00972
Epoch: 22852, Training Loss: 0.01196
Epoch: 22853, Training Loss: 0.00928
Epoch: 22853, Training Loss: 0.00971
Epoch: 22853, Training Loss: 0.00972
Epoch: 22853, Training Loss: 0.01196
Epoch: 22854, Training Loss: 0.00928
Epoch: 22854, Training Loss: 0.00971
Epoch: 22854, Training Loss: 0.00972
Epoch: 22854, Training Loss: 0.01196
Epoch: 22855, Training Loss: 0.00928
Epoch: 22855, Training Loss: 0.00971
Epoch: 22855, Training Loss: 0.00972
Epoch: 22855, Training Loss: 0.01196
Epoch: 22856, Training Loss: 0.00928
Epoch: 22856, Training Loss: 0.00971
Epoch: 22856, Training Loss: 0.00972
Epoch: 22856, Training Loss: 0.01196
Epoch: 22857, Training Loss: 0.00928
Epoch: 22857, Training Loss: 0.00971
Epoch: 22857, Training Loss: 0.00972
Epoch: 22857, Training Loss: 0.01196
Epoch: 22858, Training Loss: 0.00928
Epoch: 22858, Training Loss: 0.00971
Epoch: 22858, Training Loss: 0.00972
Epoch: 22858, Training Loss: 0.01196
Epoch: 22859, Training Loss: 0.00928
Epoch: 22859, Training Loss: 0.00971
Epoch: 22859, Training Loss: 0.00972
Epoch: 22859, Training Loss: 0.01196
Epoch: 22860, Training Loss: 0.00928
Epoch: 22860, Training Loss: 0.00971
Epoch: 22860, Training Loss: 0.00972
Epoch: 22860, Training Loss: 0.01196
Epoch: 22861, Training Loss: 0.00928
Epoch: 22861, Training Loss: 0.00970
Epoch: 22861, Training Loss: 0.00972
Epoch: 22861, Training Loss: 0.01196
Epoch: 22862, Training Loss: 0.00928
Epoch: 22862, Training Loss: 0.00970
Epoch: 22862, Training Loss: 0.00972
Epoch: 22862, Training Loss: 0.01196
Epoch: 22863, Training Loss: 0.00928
Epoch: 22863, Training Loss: 0.00970
Epoch: 22863, Training Loss: 0.00972
Epoch: 22863, Training Loss: 0.01196
Epoch: 22864, Training Loss: 0.00928
Epoch: 22864, Training Loss: 0.00970
Epoch: 22864, Training Loss: 0.00972
Epoch: 22864, Training Loss: 0.01196
Epoch: 22865, Training Loss: 0.00928
Epoch: 22865, Training Loss: 0.00970
Epoch: 22865, Training Loss: 0.00972
Epoch: 22865, Training Loss: 0.01196
Epoch: 22866, Training Loss: 0.00928
Epoch: 22866, Training Loss: 0.00970
Epoch: 22866, Training Loss: 0.00972
Epoch: 22866, Training Loss: 0.01196
Epoch: 22867, Training Loss: 0.00928
Epoch: 22867, Training Loss: 0.00970
Epoch: 22867, Training Loss: 0.00972
Epoch: 22867, Training Loss: 0.01196
Epoch: 22868, Training Loss: 0.00928
Epoch: 22868, Training Loss: 0.00970
Epoch: 22868, Training Loss: 0.00972
Epoch: 22868, Training Loss: 0.01196
Epoch: 22869, Training Loss: 0.00928
Epoch: 22869, Training Loss: 0.00970
Epoch: 22869, Training Loss: 0.00972
Epoch: 22869, Training Loss: 0.01196
Epoch: 22870, Training Loss: 0.00928
Epoch: 22870, Training Loss: 0.00970
Epoch: 22870, Training Loss: 0.00972
Epoch: 22870, Training Loss: 0.01196
Epoch: 22871, Training Loss: 0.00928
Epoch: 22871, Training Loss: 0.00970
Epoch: 22871, Training Loss: 0.00971
Epoch: 22871, Training Loss: 0.01196
Epoch: 22872, Training Loss: 0.00928
Epoch: 22872, Training Loss: 0.00970
Epoch: 22872, Training Loss: 0.00971
Epoch: 22872, Training Loss: 0.01196
Epoch: 22873, Training Loss: 0.00928
Epoch: 22873, Training Loss: 0.00970
Epoch: 22873, Training Loss: 0.00971
Epoch: 22873, Training Loss: 0.01196
Epoch: 22874, Training Loss: 0.00928
Epoch: 22874, Training Loss: 0.00970
Epoch: 22874, Training Loss: 0.00971
Epoch: 22874, Training Loss: 0.01196
Epoch: 22875, Training Loss: 0.00927
Epoch: 22875, Training Loss: 0.00970
Epoch: 22875, Training Loss: 0.00971
Epoch: 22875, Training Loss: 0.01196
Epoch: 22876, Training Loss: 0.00927
Epoch: 22876, Training Loss: 0.00970
Epoch: 22876, Training Loss: 0.00971
Epoch: 22876, Training Loss: 0.01196
Epoch: 22877, Training Loss: 0.00927
Epoch: 22877, Training Loss: 0.00970
Epoch: 22877, Training Loss: 0.00971
Epoch: 22877, Training Loss: 0.01196
Epoch: 22878, Training Loss: 0.00927
Epoch: 22878, Training Loss: 0.00970
Epoch: 22878, Training Loss: 0.00971
Epoch: 22878, Training Loss: 0.01196
Epoch: 22879, Training Loss: 0.00927
Epoch: 22879, Training Loss: 0.00970
Epoch: 22879, Training Loss: 0.00971
Epoch: 22879, Training Loss: 0.01195
Epoch: 22880, Training Loss: 0.00927
Epoch: 22880, Training Loss: 0.00970
Epoch: 22880, Training Loss: 0.00971
Epoch: 22880, Training Loss: 0.01195
Epoch: 22881, Training Loss: 0.00927
Epoch: 22881, Training Loss: 0.00970
Epoch: 22881, Training Loss: 0.00971
Epoch: 22881, Training Loss: 0.01195
Epoch: 22882, Training Loss: 0.00927
Epoch: 22882, Training Loss: 0.00970
Epoch: 22882, Training Loss: 0.00971
Epoch: 22882, Training Loss: 0.01195
Epoch: 22883, Training Loss: 0.00927
Epoch: 22883, Training Loss: 0.00970
Epoch: 22883, Training Loss: 0.00971
Epoch: 22883, Training Loss: 0.01195
Epoch: 22884, Training Loss: 0.00927
Epoch: 22884, Training Loss: 0.00970
Epoch: 22884, Training Loss: 0.00971
Epoch: 22884, Training Loss: 0.01195
Epoch: 22885, Training Loss: 0.00927
Epoch: 22885, Training Loss: 0.00970
Epoch: 22885, Training Loss: 0.00971
Epoch: 22885, Training Loss: 0.01195
Epoch: 22886, Training Loss: 0.00927
Epoch: 22886, Training Loss: 0.00970
Epoch: 22886, Training Loss: 0.00971
Epoch: 22886, Training Loss: 0.01195
Epoch: 22887, Training Loss: 0.00927
Epoch: 22887, Training Loss: 0.00970
Epoch: 22887, Training Loss: 0.00971
Epoch: 22887, Training Loss: 0.01195
Epoch: 22888, Training Loss: 0.00927
Epoch: 22888, Training Loss: 0.00970
Epoch: 22888, Training Loss: 0.00971
Epoch: 22888, Training Loss: 0.01195
Epoch: 22889, Training Loss: 0.00927
Epoch: 22889, Training Loss: 0.00970
Epoch: 22889, Training Loss: 0.00971
Epoch: 22889, Training Loss: 0.01195
Epoch: 22890, Training Loss: 0.00927
Epoch: 22890, Training Loss: 0.00970
Epoch: 22890, Training Loss: 0.00971
Epoch: 22890, Training Loss: 0.01195
Epoch: 22891, Training Loss: 0.00927
Epoch: 22891, Training Loss: 0.00970
Epoch: 22891, Training Loss: 0.00971
Epoch: 22891, Training Loss: 0.01195
Epoch: 22892, Training Loss: 0.00927
Epoch: 22892, Training Loss: 0.00970
Epoch: 22892, Training Loss: 0.00971
Epoch: 22892, Training Loss: 0.01195
Epoch: 22893, Training Loss: 0.00927
Epoch: 22893, Training Loss: 0.00970
Epoch: 22893, Training Loss: 0.00971
Epoch: 22893, Training Loss: 0.01195
Epoch: 22894, Training Loss: 0.00927
Epoch: 22894, Training Loss: 0.00970
Epoch: 22894, Training Loss: 0.00971
Epoch: 22894, Training Loss: 0.01195
Epoch: 22895, Training Loss: 0.00927
Epoch: 22895, Training Loss: 0.00970
Epoch: 22895, Training Loss: 0.00971
Epoch: 22895, Training Loss: 0.01195
Epoch: 22896, Training Loss: 0.00927
Epoch: 22896, Training Loss: 0.00970
Epoch: 22896, Training Loss: 0.00971
Epoch: 22896, Training Loss: 0.01195
Epoch: 22897, Training Loss: 0.00927
Epoch: 22897, Training Loss: 0.00970
Epoch: 22897, Training Loss: 0.00971
Epoch: 22897, Training Loss: 0.01195
Epoch: 22898, Training Loss: 0.00927
Epoch: 22898, Training Loss: 0.00970
Epoch: 22898, Training Loss: 0.00971
Epoch: 22898, Training Loss: 0.01195
Epoch: 22899, Training Loss: 0.00927
Epoch: 22899, Training Loss: 0.00970
Epoch: 22899, Training Loss: 0.00971
Epoch: 22899, Training Loss: 0.01195
Epoch: 22900, Training Loss: 0.00927
Epoch: 22900, Training Loss: 0.00970
Epoch: 22900, Training Loss: 0.00971
Epoch: 22900, Training Loss: 0.01195
Epoch: 22901, Training Loss: 0.00927
Epoch: 22901, Training Loss: 0.00970
Epoch: 22901, Training Loss: 0.00971
Epoch: 22901, Training Loss: 0.01195
Epoch: 22902, Training Loss: 0.00927
Epoch: 22902, Training Loss: 0.00969
Epoch: 22902, Training Loss: 0.00971
Epoch: 22902, Training Loss: 0.01195
Epoch: 22903, Training Loss: 0.00927
Epoch: 22903, Training Loss: 0.00969
Epoch: 22903, Training Loss: 0.00971
Epoch: 22903, Training Loss: 0.01195
Epoch: 22904, Training Loss: 0.00927
Epoch: 22904, Training Loss: 0.00969
Epoch: 22904, Training Loss: 0.00971
Epoch: 22904, Training Loss: 0.01195
Epoch: 22905, Training Loss: 0.00927
Epoch: 22905, Training Loss: 0.00969
Epoch: 22905, Training Loss: 0.00971
Epoch: 22905, Training Loss: 0.01195
Epoch: 22906, Training Loss: 0.00927
Epoch: 22906, Training Loss: 0.00969
Epoch: 22906, Training Loss: 0.00971
Epoch: 22906, Training Loss: 0.01195
Epoch: 22907, Training Loss: 0.00927
Epoch: 22907, Training Loss: 0.00969
Epoch: 22907, Training Loss: 0.00971
Epoch: 22907, Training Loss: 0.01195
Epoch: 22908, Training Loss: 0.00927
Epoch: 22908, Training Loss: 0.00969
Epoch: 22908, Training Loss: 0.00971
Epoch: 22908, Training Loss: 0.01195
Epoch: 22909, Training Loss: 0.00927
Epoch: 22909, Training Loss: 0.00969
Epoch: 22909, Training Loss: 0.00971
Epoch: 22909, Training Loss: 0.01195
Epoch: 22910, Training Loss: 0.00927
Epoch: 22910, Training Loss: 0.00969
Epoch: 22910, Training Loss: 0.00971
Epoch: 22910, Training Loss: 0.01195
Epoch: 22911, Training Loss: 0.00927
Epoch: 22911, Training Loss: 0.00969
Epoch: 22911, Training Loss: 0.00970
Epoch: 22911, Training Loss: 0.01195
Epoch: 22912, Training Loss: 0.00927
Epoch: 22912, Training Loss: 0.00969
Epoch: 22912, Training Loss: 0.00970
Epoch: 22912, Training Loss: 0.01194
Epoch: 22913, Training Loss: 0.00927
Epoch: 22913, Training Loss: 0.00969
Epoch: 22913, Training Loss: 0.00970
Epoch: 22913, Training Loss: 0.01194
Epoch: 22914, Training Loss: 0.00927
Epoch: 22914, Training Loss: 0.00969
Epoch: 22914, Training Loss: 0.00970
Epoch: 22914, Training Loss: 0.01194
Epoch: 22915, Training Loss: 0.00927
Epoch: 22915, Training Loss: 0.00969
Epoch: 22915, Training Loss: 0.00970
Epoch: 22915, Training Loss: 0.01194
Epoch: 22916, Training Loss: 0.00927
Epoch: 22916, Training Loss: 0.00969
Epoch: 22916, Training Loss: 0.00970
Epoch: 22916, Training Loss: 0.01194
Epoch: 22917, Training Loss: 0.00927
Epoch: 22917, Training Loss: 0.00969
Epoch: 22917, Training Loss: 0.00970
Epoch: 22917, Training Loss: 0.01194
Epoch: 22918, Training Loss: 0.00927
Epoch: 22918, Training Loss: 0.00969
Epoch: 22918, Training Loss: 0.00970
Epoch: 22918, Training Loss: 0.01194
Epoch: 22919, Training Loss: 0.00926
Epoch: 22919, Training Loss: 0.00969
Epoch: 22919, Training Loss: 0.00970
Epoch: 22919, Training Loss: 0.01194
Epoch: 22920, Training Loss: 0.00926
Epoch: 22920, Training Loss: 0.00969
Epoch: 22920, Training Loss: 0.00970
Epoch: 22920, Training Loss: 0.01194
Epoch: 22921, Training Loss: 0.00926
Epoch: 22921, Training Loss: 0.00969
Epoch: 22921, Training Loss: 0.00970
Epoch: 22921, Training Loss: 0.01194
Epoch: 22922, Training Loss: 0.00926
Epoch: 22922, Training Loss: 0.00969
Epoch: 22922, Training Loss: 0.00970
Epoch: 22922, Training Loss: 0.01194
Epoch: 22923, Training Loss: 0.00926
Epoch: 22923, Training Loss: 0.00969
Epoch: 22923, Training Loss: 0.00970
Epoch: 22923, Training Loss: 0.01194
Epoch: 22924, Training Loss: 0.00926
Epoch: 22924, Training Loss: 0.00969
Epoch: 22924, Training Loss: 0.00970
Epoch: 22924, Training Loss: 0.01194
Epoch: 22925, Training Loss: 0.00926
Epoch: 22925, Training Loss: 0.00969
Epoch: 22925, Training Loss: 0.00970
Epoch: 22925, Training Loss: 0.01194
Epoch: 22926, Training Loss: 0.00926
Epoch: 22926, Training Loss: 0.00969
Epoch: 22926, Training Loss: 0.00970
Epoch: 22926, Training Loss: 0.01194
Epoch: 22927, Training Loss: 0.00926
Epoch: 22927, Training Loss: 0.00969
Epoch: 22927, Training Loss: 0.00970
Epoch: 22927, Training Loss: 0.01194
Epoch: 22928, Training Loss: 0.00926
Epoch: 22928, Training Loss: 0.00969
Epoch: 22928, Training Loss: 0.00970
Epoch: 22928, Training Loss: 0.01194
Epoch: 22929, Training Loss: 0.00926
Epoch: 22929, Training Loss: 0.00969
Epoch: 22929, Training Loss: 0.00970
Epoch: 22929, Training Loss: 0.01194
Epoch: 22930, Training Loss: 0.00926
Epoch: 22930, Training Loss: 0.00969
Epoch: 22930, Training Loss: 0.00970
Epoch: 22930, Training Loss: 0.01194
Epoch: 22931, Training Loss: 0.00926
Epoch: 22931, Training Loss: 0.00969
Epoch: 22931, Training Loss: 0.00970
Epoch: 22931, Training Loss: 0.01194
Epoch: 22932, Training Loss: 0.00926
Epoch: 22932, Training Loss: 0.00969
Epoch: 22932, Training Loss: 0.00970
Epoch: 22932, Training Loss: 0.01194
Epoch: 22933, Training Loss: 0.00926
Epoch: 22933, Training Loss: 0.00969
Epoch: 22933, Training Loss: 0.00970
Epoch: 22933, Training Loss: 0.01194
Epoch: 22934, Training Loss: 0.00926
Epoch: 22934, Training Loss: 0.00969
Epoch: 22934, Training Loss: 0.00970
Epoch: 22934, Training Loss: 0.01194
Epoch: 22935, Training Loss: 0.00926
Epoch: 22935, Training Loss: 0.00969
Epoch: 22935, Training Loss: 0.00970
Epoch: 22935, Training Loss: 0.01194
Epoch: 22936, Training Loss: 0.00926
Epoch: 22936, Training Loss: 0.00969
Epoch: 22936, Training Loss: 0.00970
Epoch: 22936, Training Loss: 0.01194
Epoch: 22937, Training Loss: 0.00926
Epoch: 22937, Training Loss: 0.00969
Epoch: 22937, Training Loss: 0.00970
Epoch: 22937, Training Loss: 0.01194
Epoch: 22938, Training Loss: 0.00926
Epoch: 22938, Training Loss: 0.00969
Epoch: 22938, Training Loss: 0.00970
Epoch: 22938, Training Loss: 0.01194
Epoch: 22939, Training Loss: 0.00926
Epoch: 22939, Training Loss: 0.00969
Epoch: 22939, Training Loss: 0.00970
Epoch: 22939, Training Loss: 0.01194
Epoch: 22940, Training Loss: 0.00926
Epoch: 22940, Training Loss: 0.00969
Epoch: 22940, Training Loss: 0.00970
Epoch: 22940, Training Loss: 0.01194
Epoch: 22941, Training Loss: 0.00926
Epoch: 22941, Training Loss: 0.00969
Epoch: 22941, Training Loss: 0.00970
Epoch: 22941, Training Loss: 0.01194
Epoch: 22942, Training Loss: 0.00926
Epoch: 22942, Training Loss: 0.00969
Epoch: 22942, Training Loss: 0.00970
Epoch: 22942, Training Loss: 0.01194
Epoch: 22943, Training Loss: 0.00926
Epoch: 22943, Training Loss: 0.00968
Epoch: 22943, Training Loss: 0.00970
Epoch: 22943, Training Loss: 0.01194
Epoch: 22944, Training Loss: 0.00926
Epoch: 22944, Training Loss: 0.00968
Epoch: 22944, Training Loss: 0.00970
Epoch: 22944, Training Loss: 0.01193
Epoch: 22945, Training Loss: 0.00926
Epoch: 22945, Training Loss: 0.00968
Epoch: 22945, Training Loss: 0.00970
Epoch: 22945, Training Loss: 0.01193
Epoch: 22946, Training Loss: 0.00926
Epoch: 22946, Training Loss: 0.00968
Epoch: 22946, Training Loss: 0.00970
Epoch: 22946, Training Loss: 0.01193
Epoch: 22947, Training Loss: 0.00926
Epoch: 22947, Training Loss: 0.00968
Epoch: 22947, Training Loss: 0.00970
Epoch: 22947, Training Loss: 0.01193
Epoch: 22948, Training Loss: 0.00926
Epoch: 22948, Training Loss: 0.00968
Epoch: 22948, Training Loss: 0.00970
Epoch: 22948, Training Loss: 0.01193
Epoch: 22949, Training Loss: 0.00926
Epoch: 22949, Training Loss: 0.00968
Epoch: 22949, Training Loss: 0.00970
Epoch: 22949, Training Loss: 0.01193
Epoch: 22950, Training Loss: 0.00926
Epoch: 22950, Training Loss: 0.00968
Epoch: 22950, Training Loss: 0.00970
Epoch: 22950, Training Loss: 0.01193
Epoch: 22951, Training Loss: 0.00926
Epoch: 22951, Training Loss: 0.00968
Epoch: 22951, Training Loss: 0.00970
Epoch: 22951, Training Loss: 0.01193
Epoch: 22952, Training Loss: 0.00926
Epoch: 22952, Training Loss: 0.00968
Epoch: 22952, Training Loss: 0.00969
Epoch: 22952, Training Loss: 0.01193
Epoch: 22953, Training Loss: 0.00926
Epoch: 22953, Training Loss: 0.00968
Epoch: 22953, Training Loss: 0.00969
Epoch: 22953, Training Loss: 0.01193
Epoch: 22954, Training Loss: 0.00926
Epoch: 22954, Training Loss: 0.00968
Epoch: 22954, Training Loss: 0.00969
Epoch: 22954, Training Loss: 0.01193
Epoch: 22955, Training Loss: 0.00926
Epoch: 22955, Training Loss: 0.00968
Epoch: 22955, Training Loss: 0.00969
Epoch: 22955, Training Loss: 0.01193
Epoch: 22956, Training Loss: 0.00926
Epoch: 22956, Training Loss: 0.00968
Epoch: 22956, Training Loss: 0.00969
Epoch: 22956, Training Loss: 0.01193
Epoch: 22957, Training Loss: 0.00926
Epoch: 22957, Training Loss: 0.00968
Epoch: 22957, Training Loss: 0.00969
Epoch: 22957, Training Loss: 0.01193
Epoch: 22958, Training Loss: 0.00926
Epoch: 22958, Training Loss: 0.00968
Epoch: 22958, Training Loss: 0.00969
Epoch: 22958, Training Loss: 0.01193
Epoch: 22959, Training Loss: 0.00926
Epoch: 22959, Training Loss: 0.00968
Epoch: 22959, Training Loss: 0.00969
Epoch: 22959, Training Loss: 0.01193
Epoch: 22960, Training Loss: 0.00926
Epoch: 22960, Training Loss: 0.00968
Epoch: 22960, Training Loss: 0.00969
Epoch: 22960, Training Loss: 0.01193
Epoch: 22961, Training Loss: 0.00926
Epoch: 22961, Training Loss: 0.00968
Epoch: 22961, Training Loss: 0.00969
Epoch: 22961, Training Loss: 0.01193
Epoch: 22962, Training Loss: 0.00925
Epoch: 22962, Training Loss: 0.00968
Epoch: 22962, Training Loss: 0.00969
Epoch: 22962, Training Loss: 0.01193
Epoch: 22963, Training Loss: 0.00925
Epoch: 22963, Training Loss: 0.00968
Epoch: 22963, Training Loss: 0.00969
Epoch: 22963, Training Loss: 0.01193
Epoch: 22964, Training Loss: 0.00925
Epoch: 22964, Training Loss: 0.00968
Epoch: 22964, Training Loss: 0.00969
Epoch: 22964, Training Loss: 0.01193
Epoch: 22965, Training Loss: 0.00925
Epoch: 22965, Training Loss: 0.00968
Epoch: 22965, Training Loss: 0.00969
Epoch: 22965, Training Loss: 0.01193
Epoch: 22966, Training Loss: 0.00925
Epoch: 22966, Training Loss: 0.00968
Epoch: 22966, Training Loss: 0.00969
Epoch: 22966, Training Loss: 0.01193
Epoch: 22967, Training Loss: 0.00925
Epoch: 22967, Training Loss: 0.00968
Epoch: 22967, Training Loss: 0.00969
Epoch: 22967, Training Loss: 0.01193
Epoch: 22968, Training Loss: 0.00925
Epoch: 22968, Training Loss: 0.00968
Epoch: 22968, Training Loss: 0.00969
Epoch: 22968, Training Loss: 0.01193
Epoch: 22969, Training Loss: 0.00925
Epoch: 22969, Training Loss: 0.00968
Epoch: 22969, Training Loss: 0.00969
Epoch: 22969, Training Loss: 0.01193
Epoch: 22970, Training Loss: 0.00925
Epoch: 22970, Training Loss: 0.00968
Epoch: 22970, Training Loss: 0.00925
Epoch: 22970, Training Loss: 0.00968
Epoch: 22970, Training Loss: 0.00969
Epoch: 22970, Training Loss: 0.01193
[... output truncated: four per-sample losses are printed every epoch; they decrease only marginally over this range, reaching 0.00920 / 0.00962 / 0.00963 / 0.01185 by epoch 23214 ...]
Epoch: 23214, Training Loss: 0.00920
Epoch: 23214, Training Loss: 0.00962
Epoch: 23214, Training Loss: 0.00963
Epoch: 23214, Training Loss: 0.01185
Epoch: 23215, Training Loss: 0.00920
Epoch: 23215, Training Loss: 0.00962
Epoch: 23215, Training Loss: 0.00963
Epoch: 23215, Training Loss: 0.01185
Epoch: 23216, Training Loss: 0.00920
Epoch: 23216, Training Loss: 0.00962
Epoch: 23216, Training Loss: 0.00963
Epoch: 23216, Training Loss: 0.01185
Epoch: 23217, Training Loss: 0.00920
Epoch: 23217, Training Loss: 0.00962
Epoch: 23217, Training Loss: 0.00963
Epoch: 23217, Training Loss: 0.01185
Epoch: 23218, Training Loss: 0.00920
Epoch: 23218, Training Loss: 0.00962
Epoch: 23218, Training Loss: 0.00963
Epoch: 23218, Training Loss: 0.01185
Epoch: 23219, Training Loss: 0.00920
Epoch: 23219, Training Loss: 0.00962
Epoch: 23219, Training Loss: 0.00963
Epoch: 23219, Training Loss: 0.01185
Epoch: 23220, Training Loss: 0.00920
Epoch: 23220, Training Loss: 0.00962
Epoch: 23220, Training Loss: 0.00963
Epoch: 23220, Training Loss: 0.01185
Epoch: 23221, Training Loss: 0.00920
Epoch: 23221, Training Loss: 0.00962
Epoch: 23221, Training Loss: 0.00963
Epoch: 23221, Training Loss: 0.01185
Epoch: 23222, Training Loss: 0.00920
Epoch: 23222, Training Loss: 0.00962
Epoch: 23222, Training Loss: 0.00963
Epoch: 23222, Training Loss: 0.01185
Epoch: 23223, Training Loss: 0.00920
Epoch: 23223, Training Loss: 0.00962
Epoch: 23223, Training Loss: 0.00963
Epoch: 23223, Training Loss: 0.01185
Epoch: 23224, Training Loss: 0.00920
Epoch: 23224, Training Loss: 0.00962
Epoch: 23224, Training Loss: 0.00963
Epoch: 23224, Training Loss: 0.01185
Epoch: 23225, Training Loss: 0.00920
Epoch: 23225, Training Loss: 0.00962
Epoch: 23225, Training Loss: 0.00963
Epoch: 23225, Training Loss: 0.01185
Epoch: 23226, Training Loss: 0.00920
Epoch: 23226, Training Loss: 0.00962
Epoch: 23226, Training Loss: 0.00963
Epoch: 23226, Training Loss: 0.01185
Epoch: 23227, Training Loss: 0.00920
Epoch: 23227, Training Loss: 0.00962
Epoch: 23227, Training Loss: 0.00963
Epoch: 23227, Training Loss: 0.01185
Epoch: 23228, Training Loss: 0.00919
Epoch: 23228, Training Loss: 0.00962
Epoch: 23228, Training Loss: 0.00963
Epoch: 23228, Training Loss: 0.01185
Epoch: 23229, Training Loss: 0.00919
Epoch: 23229, Training Loss: 0.00962
Epoch: 23229, Training Loss: 0.00963
Epoch: 23229, Training Loss: 0.01185
Epoch: 23230, Training Loss: 0.00919
Epoch: 23230, Training Loss: 0.00962
Epoch: 23230, Training Loss: 0.00963
Epoch: 23230, Training Loss: 0.01185
Epoch: 23231, Training Loss: 0.00919
Epoch: 23231, Training Loss: 0.00961
Epoch: 23231, Training Loss: 0.00963
Epoch: 23231, Training Loss: 0.01185
Epoch: 23232, Training Loss: 0.00919
Epoch: 23232, Training Loss: 0.00961
Epoch: 23232, Training Loss: 0.00963
Epoch: 23232, Training Loss: 0.01185
Epoch: 23233, Training Loss: 0.00919
Epoch: 23233, Training Loss: 0.00961
Epoch: 23233, Training Loss: 0.00963
Epoch: 23233, Training Loss: 0.01185
Epoch: 23234, Training Loss: 0.00919
Epoch: 23234, Training Loss: 0.00961
Epoch: 23234, Training Loss: 0.00963
Epoch: 23234, Training Loss: 0.01185
Epoch: 23235, Training Loss: 0.00919
Epoch: 23235, Training Loss: 0.00961
Epoch: 23235, Training Loss: 0.00963
Epoch: 23235, Training Loss: 0.01185
Epoch: 23236, Training Loss: 0.00919
Epoch: 23236, Training Loss: 0.00961
Epoch: 23236, Training Loss: 0.00963
Epoch: 23236, Training Loss: 0.01185
Epoch: 23237, Training Loss: 0.00919
Epoch: 23237, Training Loss: 0.00961
Epoch: 23237, Training Loss: 0.00963
Epoch: 23237, Training Loss: 0.01185
Epoch: 23238, Training Loss: 0.00919
Epoch: 23238, Training Loss: 0.00961
Epoch: 23238, Training Loss: 0.00963
Epoch: 23238, Training Loss: 0.01185
Epoch: 23239, Training Loss: 0.00919
Epoch: 23239, Training Loss: 0.00961
Epoch: 23239, Training Loss: 0.00963
Epoch: 23239, Training Loss: 0.01185
Epoch: 23240, Training Loss: 0.00919
Epoch: 23240, Training Loss: 0.00961
Epoch: 23240, Training Loss: 0.00963
Epoch: 23240, Training Loss: 0.01185
Epoch: 23241, Training Loss: 0.00919
Epoch: 23241, Training Loss: 0.00961
Epoch: 23241, Training Loss: 0.00962
Epoch: 23241, Training Loss: 0.01185
Epoch: 23242, Training Loss: 0.00919
Epoch: 23242, Training Loss: 0.00961
Epoch: 23242, Training Loss: 0.00962
Epoch: 23242, Training Loss: 0.01184
Epoch: 23243, Training Loss: 0.00919
Epoch: 23243, Training Loss: 0.00961
Epoch: 23243, Training Loss: 0.00962
Epoch: 23243, Training Loss: 0.01184
Epoch: 23244, Training Loss: 0.00919
Epoch: 23244, Training Loss: 0.00961
Epoch: 23244, Training Loss: 0.00962
Epoch: 23244, Training Loss: 0.01184
Epoch: 23245, Training Loss: 0.00919
Epoch: 23245, Training Loss: 0.00961
Epoch: 23245, Training Loss: 0.00962
Epoch: 23245, Training Loss: 0.01184
Epoch: 23246, Training Loss: 0.00919
Epoch: 23246, Training Loss: 0.00961
Epoch: 23246, Training Loss: 0.00962
Epoch: 23246, Training Loss: 0.01184
Epoch: 23247, Training Loss: 0.00919
Epoch: 23247, Training Loss: 0.00961
Epoch: 23247, Training Loss: 0.00962
Epoch: 23247, Training Loss: 0.01184
Epoch: 23248, Training Loss: 0.00919
Epoch: 23248, Training Loss: 0.00961
Epoch: 23248, Training Loss: 0.00962
Epoch: 23248, Training Loss: 0.01184
Epoch: 23249, Training Loss: 0.00919
Epoch: 23249, Training Loss: 0.00961
Epoch: 23249, Training Loss: 0.00962
Epoch: 23249, Training Loss: 0.01184
Epoch: 23250, Training Loss: 0.00919
Epoch: 23250, Training Loss: 0.00961
Epoch: 23250, Training Loss: 0.00962
Epoch: 23250, Training Loss: 0.01184
Epoch: 23251, Training Loss: 0.00919
Epoch: 23251, Training Loss: 0.00961
Epoch: 23251, Training Loss: 0.00962
Epoch: 23251, Training Loss: 0.01184
Epoch: 23252, Training Loss: 0.00919
Epoch: 23252, Training Loss: 0.00961
Epoch: 23252, Training Loss: 0.00962
Epoch: 23252, Training Loss: 0.01184
Epoch: 23253, Training Loss: 0.00919
Epoch: 23253, Training Loss: 0.00961
Epoch: 23253, Training Loss: 0.00962
Epoch: 23253, Training Loss: 0.01184
Epoch: 23254, Training Loss: 0.00919
Epoch: 23254, Training Loss: 0.00961
Epoch: 23254, Training Loss: 0.00962
Epoch: 23254, Training Loss: 0.01184
Epoch: 23255, Training Loss: 0.00919
Epoch: 23255, Training Loss: 0.00961
Epoch: 23255, Training Loss: 0.00962
Epoch: 23255, Training Loss: 0.01184
Epoch: 23256, Training Loss: 0.00919
Epoch: 23256, Training Loss: 0.00961
Epoch: 23256, Training Loss: 0.00962
Epoch: 23256, Training Loss: 0.01184
Epoch: 23257, Training Loss: 0.00919
Epoch: 23257, Training Loss: 0.00961
Epoch: 23257, Training Loss: 0.00962
Epoch: 23257, Training Loss: 0.01184
Epoch: 23258, Training Loss: 0.00919
Epoch: 23258, Training Loss: 0.00961
Epoch: 23258, Training Loss: 0.00962
Epoch: 23258, Training Loss: 0.01184
Epoch: 23259, Training Loss: 0.00919
Epoch: 23259, Training Loss: 0.00961
Epoch: 23259, Training Loss: 0.00962
Epoch: 23259, Training Loss: 0.01184
Epoch: 23260, Training Loss: 0.00919
Epoch: 23260, Training Loss: 0.00961
Epoch: 23260, Training Loss: 0.00962
Epoch: 23260, Training Loss: 0.01184
Epoch: 23261, Training Loss: 0.00919
Epoch: 23261, Training Loss: 0.00961
Epoch: 23261, Training Loss: 0.00962
Epoch: 23261, Training Loss: 0.01184
Epoch: 23262, Training Loss: 0.00919
Epoch: 23262, Training Loss: 0.00961
Epoch: 23262, Training Loss: 0.00962
Epoch: 23262, Training Loss: 0.01184
Epoch: 23263, Training Loss: 0.00919
Epoch: 23263, Training Loss: 0.00961
Epoch: 23263, Training Loss: 0.00962
Epoch: 23263, Training Loss: 0.01184
Epoch: 23264, Training Loss: 0.00919
Epoch: 23264, Training Loss: 0.00961
Epoch: 23264, Training Loss: 0.00962
Epoch: 23264, Training Loss: 0.01184
Epoch: 23265, Training Loss: 0.00919
Epoch: 23265, Training Loss: 0.00961
Epoch: 23265, Training Loss: 0.00962
Epoch: 23265, Training Loss: 0.01184
Epoch: 23266, Training Loss: 0.00919
Epoch: 23266, Training Loss: 0.00961
Epoch: 23266, Training Loss: 0.00962
Epoch: 23266, Training Loss: 0.01184
Epoch: 23267, Training Loss: 0.00919
Epoch: 23267, Training Loss: 0.00961
Epoch: 23267, Training Loss: 0.00962
Epoch: 23267, Training Loss: 0.01184
Epoch: 23268, Training Loss: 0.00919
Epoch: 23268, Training Loss: 0.00961
Epoch: 23268, Training Loss: 0.00962
Epoch: 23268, Training Loss: 0.01184
Epoch: 23269, Training Loss: 0.00919
Epoch: 23269, Training Loss: 0.00961
Epoch: 23269, Training Loss: 0.00962
Epoch: 23269, Training Loss: 0.01184
Epoch: 23270, Training Loss: 0.00919
Epoch: 23270, Training Loss: 0.00961
Epoch: 23270, Training Loss: 0.00962
Epoch: 23270, Training Loss: 0.01184
Epoch: 23271, Training Loss: 0.00919
Epoch: 23271, Training Loss: 0.00961
Epoch: 23271, Training Loss: 0.00962
Epoch: 23271, Training Loss: 0.01184
Epoch: 23272, Training Loss: 0.00919
Epoch: 23272, Training Loss: 0.00961
Epoch: 23272, Training Loss: 0.00962
Epoch: 23272, Training Loss: 0.01184
Epoch: 23273, Training Loss: 0.00918
Epoch: 23273, Training Loss: 0.00960
Epoch: 23273, Training Loss: 0.00962
Epoch: 23273, Training Loss: 0.01184
Epoch: 23274, Training Loss: 0.00918
Epoch: 23274, Training Loss: 0.00960
Epoch: 23274, Training Loss: 0.00962
Epoch: 23274, Training Loss: 0.01184
Epoch: 23275, Training Loss: 0.00918
Epoch: 23275, Training Loss: 0.00960
Epoch: 23275, Training Loss: 0.00962
Epoch: 23275, Training Loss: 0.01183
Epoch: 23276, Training Loss: 0.00918
Epoch: 23276, Training Loss: 0.00960
Epoch: 23276, Training Loss: 0.00962
Epoch: 23276, Training Loss: 0.01183
Epoch: 23277, Training Loss: 0.00918
Epoch: 23277, Training Loss: 0.00960
Epoch: 23277, Training Loss: 0.00962
Epoch: 23277, Training Loss: 0.01183
Epoch: 23278, Training Loss: 0.00918
Epoch: 23278, Training Loss: 0.00960
Epoch: 23278, Training Loss: 0.00962
Epoch: 23278, Training Loss: 0.01183
Epoch: 23279, Training Loss: 0.00918
Epoch: 23279, Training Loss: 0.00960
Epoch: 23279, Training Loss: 0.00962
Epoch: 23279, Training Loss: 0.01183
Epoch: 23280, Training Loss: 0.00918
Epoch: 23280, Training Loss: 0.00960
Epoch: 23280, Training Loss: 0.00962
Epoch: 23280, Training Loss: 0.01183
Epoch: 23281, Training Loss: 0.00918
Epoch: 23281, Training Loss: 0.00960
Epoch: 23281, Training Loss: 0.00962
Epoch: 23281, Training Loss: 0.01183
Epoch: 23282, Training Loss: 0.00918
Epoch: 23282, Training Loss: 0.00960
Epoch: 23282, Training Loss: 0.00961
Epoch: 23282, Training Loss: 0.01183
Epoch: 23283, Training Loss: 0.00918
Epoch: 23283, Training Loss: 0.00960
Epoch: 23283, Training Loss: 0.00961
Epoch: 23283, Training Loss: 0.01183
Epoch: 23284, Training Loss: 0.00918
Epoch: 23284, Training Loss: 0.00960
Epoch: 23284, Training Loss: 0.00961
Epoch: 23284, Training Loss: 0.01183
Epoch: 23285, Training Loss: 0.00918
Epoch: 23285, Training Loss: 0.00960
Epoch: 23285, Training Loss: 0.00961
Epoch: 23285, Training Loss: 0.01183
Epoch: 23286, Training Loss: 0.00918
Epoch: 23286, Training Loss: 0.00960
Epoch: 23286, Training Loss: 0.00961
Epoch: 23286, Training Loss: 0.01183
Epoch: 23287, Training Loss: 0.00918
Epoch: 23287, Training Loss: 0.00960
Epoch: 23287, Training Loss: 0.00961
Epoch: 23287, Training Loss: 0.01183
Epoch: 23288, Training Loss: 0.00918
Epoch: 23288, Training Loss: 0.00960
Epoch: 23288, Training Loss: 0.00961
Epoch: 23288, Training Loss: 0.01183
Epoch: 23289, Training Loss: 0.00918
Epoch: 23289, Training Loss: 0.00960
Epoch: 23289, Training Loss: 0.00961
Epoch: 23289, Training Loss: 0.01183
Epoch: 23290, Training Loss: 0.00918
Epoch: 23290, Training Loss: 0.00960
Epoch: 23290, Training Loss: 0.00961
Epoch: 23290, Training Loss: 0.01183
Epoch: 23291, Training Loss: 0.00918
Epoch: 23291, Training Loss: 0.00960
Epoch: 23291, Training Loss: 0.00961
Epoch: 23291, Training Loss: 0.01183
Epoch: 23292, Training Loss: 0.00918
Epoch: 23292, Training Loss: 0.00960
Epoch: 23292, Training Loss: 0.00961
Epoch: 23292, Training Loss: 0.01183
Epoch: 23293, Training Loss: 0.00918
Epoch: 23293, Training Loss: 0.00960
Epoch: 23293, Training Loss: 0.00961
Epoch: 23293, Training Loss: 0.01183
Epoch: 23294, Training Loss: 0.00918
Epoch: 23294, Training Loss: 0.00960
Epoch: 23294, Training Loss: 0.00961
Epoch: 23294, Training Loss: 0.01183
Epoch: 23295, Training Loss: 0.00918
Epoch: 23295, Training Loss: 0.00960
Epoch: 23295, Training Loss: 0.00961
Epoch: 23295, Training Loss: 0.01183
Epoch: 23296, Training Loss: 0.00918
Epoch: 23296, Training Loss: 0.00960
Epoch: 23296, Training Loss: 0.00961
Epoch: 23296, Training Loss: 0.01183
Epoch: 23297, Training Loss: 0.00918
Epoch: 23297, Training Loss: 0.00960
Epoch: 23297, Training Loss: 0.00961
Epoch: 23297, Training Loss: 0.01183
Epoch: 23298, Training Loss: 0.00918
Epoch: 23298, Training Loss: 0.00960
Epoch: 23298, Training Loss: 0.00961
Epoch: 23298, Training Loss: 0.01183
Epoch: 23299, Training Loss: 0.00918
Epoch: 23299, Training Loss: 0.00960
Epoch: 23299, Training Loss: 0.00961
Epoch: 23299, Training Loss: 0.01183
Epoch: 23300, Training Loss: 0.00918
Epoch: 23300, Training Loss: 0.00960
Epoch: 23300, Training Loss: 0.00961
Epoch: 23300, Training Loss: 0.01183
Epoch: 23301, Training Loss: 0.00918
Epoch: 23301, Training Loss: 0.00960
Epoch: 23301, Training Loss: 0.00961
Epoch: 23301, Training Loss: 0.01183
Epoch: 23302, Training Loss: 0.00918
Epoch: 23302, Training Loss: 0.00960
Epoch: 23302, Training Loss: 0.00961
Epoch: 23302, Training Loss: 0.01183
Epoch: 23303, Training Loss: 0.00918
Epoch: 23303, Training Loss: 0.00960
Epoch: 23303, Training Loss: 0.00961
Epoch: 23303, Training Loss: 0.01183
Epoch: 23304, Training Loss: 0.00918
Epoch: 23304, Training Loss: 0.00960
Epoch: 23304, Training Loss: 0.00961
Epoch: 23304, Training Loss: 0.01183
Epoch: 23305, Training Loss: 0.00918
Epoch: 23305, Training Loss: 0.00960
Epoch: 23305, Training Loss: 0.00961
Epoch: 23305, Training Loss: 0.01183
Epoch: 23306, Training Loss: 0.00918
Epoch: 23306, Training Loss: 0.00960
Epoch: 23306, Training Loss: 0.00961
Epoch: 23306, Training Loss: 0.01183
Epoch: 23307, Training Loss: 0.00918
Epoch: 23307, Training Loss: 0.00960
Epoch: 23307, Training Loss: 0.00961
Epoch: 23307, Training Loss: 0.01183
Epoch: 23308, Training Loss: 0.00918
Epoch: 23308, Training Loss: 0.00960
Epoch: 23308, Training Loss: 0.00961
Epoch: 23308, Training Loss: 0.01183
Epoch: 23309, Training Loss: 0.00918
Epoch: 23309, Training Loss: 0.00960
Epoch: 23309, Training Loss: 0.00961
Epoch: 23309, Training Loss: 0.01182
Epoch: 23310, Training Loss: 0.00918
Epoch: 23310, Training Loss: 0.00960
Epoch: 23310, Training Loss: 0.00961
Epoch: 23310, Training Loss: 0.01182
Epoch: 23311, Training Loss: 0.00918
Epoch: 23311, Training Loss: 0.00960
Epoch: 23311, Training Loss: 0.00961
Epoch: 23311, Training Loss: 0.01182
Epoch: 23312, Training Loss: 0.00918
Epoch: 23312, Training Loss: 0.00960
Epoch: 23312, Training Loss: 0.00961
Epoch: 23312, Training Loss: 0.01182
Epoch: 23313, Training Loss: 0.00918
Epoch: 23313, Training Loss: 0.00960
Epoch: 23313, Training Loss: 0.00961
Epoch: 23313, Training Loss: 0.01182
Epoch: 23314, Training Loss: 0.00918
Epoch: 23314, Training Loss: 0.00960
Epoch: 23314, Training Loss: 0.00961
Epoch: 23314, Training Loss: 0.01182
Epoch: 23315, Training Loss: 0.00918
Epoch: 23315, Training Loss: 0.00959
Epoch: 23315, Training Loss: 0.00961
Epoch: 23315, Training Loss: 0.01182
Epoch: 23316, Training Loss: 0.00918
Epoch: 23316, Training Loss: 0.00959
Epoch: 23316, Training Loss: 0.00961
Epoch: 23316, Training Loss: 0.01182
Epoch: 23317, Training Loss: 0.00918
Epoch: 23317, Training Loss: 0.00959
Epoch: 23317, Training Loss: 0.00961
Epoch: 23317, Training Loss: 0.01182
Epoch: 23318, Training Loss: 0.00917
Epoch: 23318, Training Loss: 0.00959
Epoch: 23318, Training Loss: 0.00961
Epoch: 23318, Training Loss: 0.01182
Epoch: 23319, Training Loss: 0.00917
Epoch: 23319, Training Loss: 0.00959
Epoch: 23319, Training Loss: 0.00961
Epoch: 23319, Training Loss: 0.01182
Epoch: 23320, Training Loss: 0.00917
Epoch: 23320, Training Loss: 0.00959
Epoch: 23320, Training Loss: 0.00961
Epoch: 23320, Training Loss: 0.01182
Epoch: 23321, Training Loss: 0.00917
Epoch: 23321, Training Loss: 0.00959
Epoch: 23321, Training Loss: 0.00961
Epoch: 23321, Training Loss: 0.01182
Epoch: 23322, Training Loss: 0.00917
Epoch: 23322, Training Loss: 0.00959
Epoch: 23322, Training Loss: 0.00961
Epoch: 23322, Training Loss: 0.01182
Epoch: 23323, Training Loss: 0.00917
Epoch: 23323, Training Loss: 0.00959
Epoch: 23323, Training Loss: 0.00961
Epoch: 23323, Training Loss: 0.01182
Epoch: 23324, Training Loss: 0.00917
Epoch: 23324, Training Loss: 0.00959
Epoch: 23324, Training Loss: 0.00960
Epoch: 23324, Training Loss: 0.01182
Epoch: 23325, Training Loss: 0.00917
Epoch: 23325, Training Loss: 0.00959
Epoch: 23325, Training Loss: 0.00960
Epoch: 23325, Training Loss: 0.01182
Epoch: 23326, Training Loss: 0.00917
Epoch: 23326, Training Loss: 0.00959
Epoch: 23326, Training Loss: 0.00960
Epoch: 23326, Training Loss: 0.01182
Epoch: 23327, Training Loss: 0.00917
Epoch: 23327, Training Loss: 0.00959
Epoch: 23327, Training Loss: 0.00960
Epoch: 23327, Training Loss: 0.01182
Epoch: 23328, Training Loss: 0.00917
Epoch: 23328, Training Loss: 0.00959
Epoch: 23328, Training Loss: 0.00960
Epoch: 23328, Training Loss: 0.01182
Epoch: 23329, Training Loss: 0.00917
Epoch: 23329, Training Loss: 0.00959
Epoch: 23329, Training Loss: 0.00960
Epoch: 23329, Training Loss: 0.01182
Epoch: 23330, Training Loss: 0.00917
Epoch: 23330, Training Loss: 0.00959
Epoch: 23330, Training Loss: 0.00960
Epoch: 23330, Training Loss: 0.01182
Epoch: 23331, Training Loss: 0.00917
Epoch: 23331, Training Loss: 0.00959
Epoch: 23331, Training Loss: 0.00960
Epoch: 23331, Training Loss: 0.01182
Epoch: 23332, Training Loss: 0.00917
Epoch: 23332, Training Loss: 0.00959
Epoch: 23332, Training Loss: 0.00960
Epoch: 23332, Training Loss: 0.01182
Epoch: 23333, Training Loss: 0.00917
Epoch: 23333, Training Loss: 0.00959
Epoch: 23333, Training Loss: 0.00960
Epoch: 23333, Training Loss: 0.01182
Epoch: 23334, Training Loss: 0.00917
Epoch: 23334, Training Loss: 0.00959
Epoch: 23334, Training Loss: 0.00960
Epoch: 23334, Training Loss: 0.01182
Epoch: 23335, Training Loss: 0.00917
Epoch: 23335, Training Loss: 0.00959
Epoch: 23335, Training Loss: 0.00960
Epoch: 23335, Training Loss: 0.01182
Epoch: 23336, Training Loss: 0.00917
Epoch: 23336, Training Loss: 0.00959
Epoch: 23336, Training Loss: 0.00960
Epoch: 23336, Training Loss: 0.01182
Epoch: 23337, Training Loss: 0.00917
Epoch: 23337, Training Loss: 0.00959
Epoch: 23337, Training Loss: 0.00960
Epoch: 23337, Training Loss: 0.01182
Epoch: 23338, Training Loss: 0.00917
Epoch: 23338, Training Loss: 0.00959
Epoch: 23338, Training Loss: 0.00960
Epoch: 23338, Training Loss: 0.01182
Epoch: 23339, Training Loss: 0.00917
Epoch: 23339, Training Loss: 0.00959
Epoch: 23339, Training Loss: 0.00960
Epoch: 23339, Training Loss: 0.01182
Epoch: 23340, Training Loss: 0.00917
Epoch: 23340, Training Loss: 0.00959
Epoch: 23340, Training Loss: 0.00960
Epoch: 23340, Training Loss: 0.01182
Epoch: 23341, Training Loss: 0.00917
Epoch: 23341, Training Loss: 0.00959
Epoch: 23341, Training Loss: 0.00960
Epoch: 23341, Training Loss: 0.01182
Epoch: 23342, Training Loss: 0.00917
Epoch: 23342, Training Loss: 0.00959
Epoch: 23342, Training Loss: 0.00960
Epoch: 23342, Training Loss: 0.01181
Epoch: 23343, Training Loss: 0.00917
Epoch: 23343, Training Loss: 0.00959
Epoch: 23343, Training Loss: 0.00960
Epoch: 23343, Training Loss: 0.01181
Epoch: 23344, Training Loss: 0.00917
Epoch: 23344, Training Loss: 0.00959
Epoch: 23344, Training Loss: 0.00960
Epoch: 23344, Training Loss: 0.01181
Epoch: 23345, Training Loss: 0.00917
Epoch: 23345, Training Loss: 0.00959
Epoch: 23345, Training Loss: 0.00960
Epoch: 23345, Training Loss: 0.01181
Epoch: 23346, Training Loss: 0.00917
Epoch: 23346, Training Loss: 0.00959
Epoch: 23346, Training Loss: 0.00960
Epoch: 23346, Training Loss: 0.01181
Epoch: 23347, Training Loss: 0.00917
Epoch: 23347, Training Loss: 0.00959
Epoch: 23347, Training Loss: 0.00960
Epoch: 23347, Training Loss: 0.01181
Epoch: 23348, Training Loss: 0.00917
Epoch: 23348, Training Loss: 0.00959
Epoch: 23348, Training Loss: 0.00960
Epoch: 23348, Training Loss: 0.01181
Epoch: 23349, Training Loss: 0.00917
Epoch: 23349, Training Loss: 0.00959
Epoch: 23349, Training Loss: 0.00960
Epoch: 23349, Training Loss: 0.01181
Epoch: 23350, Training Loss: 0.00917
Epoch: 23350, Training Loss: 0.00959
Epoch: 23350, Training Loss: 0.00960
Epoch: 23350, Training Loss: 0.01181
Epoch: 23351, Training Loss: 0.00917
Epoch: 23351, Training Loss: 0.00959
Epoch: 23351, Training Loss: 0.00960
Epoch: 23351, Training Loss: 0.01181
Epoch: 23352, Training Loss: 0.00917
Epoch: 23352, Training Loss: 0.00959
Epoch: 23352, Training Loss: 0.00960
Epoch: 23352, Training Loss: 0.01181
Epoch: 23353, Training Loss: 0.00917
Epoch: 23353, Training Loss: 0.00959
Epoch: 23353, Training Loss: 0.00960
Epoch: 23353, Training Loss: 0.01181
Epoch: 23354, Training Loss: 0.00917
Epoch: 23354, Training Loss: 0.00959
Epoch: 23354, Training Loss: 0.00960
Epoch: 23354, Training Loss: 0.01181
Epoch: 23355, Training Loss: 0.00917
Epoch: 23355, Training Loss: 0.00959
Epoch: 23355, Training Loss: 0.00960
Epoch: 23355, Training Loss: 0.01181
Epoch: 23356, Training Loss: 0.00917
Epoch: 23356, Training Loss: 0.00959
Epoch: 23356, Training Loss: 0.00960
Epoch: 23356, Training Loss: 0.01181
Epoch: 23357, Training Loss: 0.00917
Epoch: 23357, Training Loss: 0.00958
Epoch: 23357, Training Loss: 0.00960
Epoch: 23357, Training Loss: 0.01181
Epoch: 23358, Training Loss: 0.00917
Epoch: 23358, Training Loss: 0.00958
Epoch: 23358, Training Loss: 0.00960
Epoch: 23358, Training Loss: 0.01181
Epoch: 23359, Training Loss: 0.00917
Epoch: 23359, Training Loss: 0.00958
Epoch: 23359, Training Loss: 0.00960
Epoch: 23359, Training Loss: 0.01181
Epoch: 23360, Training Loss: 0.00917
Epoch: 23360, Training Loss: 0.00958
Epoch: 23360, Training Loss: 0.00960
Epoch: 23360, Training Loss: 0.01181
Epoch: 23361, Training Loss: 0.00917
Epoch: 23361, Training Loss: 0.00958
Epoch: 23361, Training Loss: 0.00960
Epoch: 23361, Training Loss: 0.01181
Epoch: 23362, Training Loss: 0.00917
Epoch: 23362, Training Loss: 0.00958
Epoch: 23362, Training Loss: 0.00960
Epoch: 23362, Training Loss: 0.01181
Epoch: 23363, Training Loss: 0.00916
Epoch: 23363, Training Loss: 0.00958
Epoch: 23363, Training Loss: 0.00960
Epoch: 23363, Training Loss: 0.01181
Epoch: 23364, Training Loss: 0.00916
Epoch: 23364, Training Loss: 0.00958
Epoch: 23364, Training Loss: 0.00960
Epoch: 23364, Training Loss: 0.01181
Epoch: 23365, Training Loss: 0.00916
Epoch: 23365, Training Loss: 0.00958
Epoch: 23365, Training Loss: 0.00960
Epoch: 23365, Training Loss: 0.01181
Epoch: 23366, Training Loss: 0.00916
Epoch: 23366, Training Loss: 0.00958
Epoch: 23366, Training Loss: 0.00959
Epoch: 23366, Training Loss: 0.01181
Epoch: 23367, Training Loss: 0.00916
Epoch: 23367, Training Loss: 0.00958
Epoch: 23367, Training Loss: 0.00959
Epoch: 23367, Training Loss: 0.01181
Epoch: 23368, Training Loss: 0.00916
Epoch: 23368, Training Loss: 0.00958
Epoch: 23368, Training Loss: 0.00959
Epoch: 23368, Training Loss: 0.01181
Epoch: 23369, Training Loss: 0.00916
Epoch: 23369, Training Loss: 0.00958
Epoch: 23369, Training Loss: 0.00959
Epoch: 23369, Training Loss: 0.01181
Epoch: 23370, Training Loss: 0.00916
Epoch: 23370, Training Loss: 0.00958
Epoch: 23370, Training Loss: 0.00959
Epoch: 23370, Training Loss: 0.01181
Epoch: 23371, Training Loss: 0.00916
Epoch: 23371, Training Loss: 0.00958
Epoch: 23371, Training Loss: 0.00959
Epoch: 23371, Training Loss: 0.01181
Epoch: 23372, Training Loss: 0.00916
Epoch: 23372, Training Loss: 0.00958
Epoch: 23372, Training Loss: 0.00959
Epoch: 23372, Training Loss: 0.01181
Epoch: 23373, Training Loss: 0.00916
Epoch: 23373, Training Loss: 0.00958
Epoch: 23373, Training Loss: 0.00959
Epoch: 23373, Training Loss: 0.01181
Epoch: 23374, Training Loss: 0.00916
Epoch: 23374, Training Loss: 0.00958
Epoch: 23374, Training Loss: 0.00959
Epoch: 23374, Training Loss: 0.01181
Epoch: 23375, Training Loss: 0.00916
Epoch: 23375, Training Loss: 0.00958
Epoch: 23375, Training Loss: 0.00959
Epoch: 23375, Training Loss: 0.01181
Epoch: 23376, Training Loss: 0.00916
Epoch: 23376, Training Loss: 0.00958
Epoch: 23376, Training Loss: 0.00959
Epoch: 23376, Training Loss: 0.01180
Epoch: 23377, Training Loss: 0.00916
Epoch: 23377, Training Loss: 0.00958
Epoch: 23377, Training Loss: 0.00959
Epoch: 23377, Training Loss: 0.01180
Epoch: 23378, Training Loss: 0.00916
Epoch: 23378, Training Loss: 0.00958
Epoch: 23378, Training Loss: 0.00959
Epoch: 23378, Training Loss: 0.01180
Epoch: 23379, Training Loss: 0.00916
Epoch: 23379, Training Loss: 0.00958
Epoch: 23379, Training Loss: 0.00959
Epoch: 23379, Training Loss: 0.01180
Epoch: 23380, Training Loss: 0.00916
Epoch: 23380, Training Loss: 0.00958
Epoch: 23380, Training Loss: 0.00959
Epoch: 23380, Training Loss: 0.01180
Epoch: 23381, Training Loss: 0.00916
Epoch: 23381, Training Loss: 0.00958
Epoch: 23381, Training Loss: 0.00959
Epoch: 23381, Training Loss: 0.01180
Epoch: 23382, Training Loss: 0.00916
Epoch: 23382, Training Loss: 0.00958
Epoch: 23382, Training Loss: 0.00959
Epoch: 23382, Training Loss: 0.01180
Epoch: 23383, Training Loss: 0.00916
Epoch: 23383, Training Loss: 0.00958
Epoch: 23383, Training Loss: 0.00959
Epoch: 23383, Training Loss: 0.01180
Epoch: 23384, Training Loss: 0.00916
Epoch: 23384, Training Loss: 0.00958
Epoch: 23384, Training Loss: 0.00959
Epoch: 23384, Training Loss: 0.01180
Epoch: 23385, Training Loss: 0.00916
Epoch: 23385, Training Loss: 0.00958
Epoch: 23385, Training Loss: 0.00959
Epoch: 23385, Training Loss: 0.01180
Epoch: 23386, Training Loss: 0.00916
Epoch: 23386, Training Loss: 0.00958
Epoch: 23386, Training Loss: 0.00959
Epoch: 23386, Training Loss: 0.01180
Epoch: 23387, Training Loss: 0.00916
Epoch: 23387, Training Loss: 0.00958
Epoch: 23387, Training Loss: 0.00959
Epoch: 23387, Training Loss: 0.01180
Epoch: 23388, Training Loss: 0.00916
Epoch: 23388, Training Loss: 0.00958
Epoch: 23388, Training Loss: 0.00959
Epoch: 23388, Training Loss: 0.01180
Epoch: 23389, Training Loss: 0.00916
Epoch: 23389, Training Loss: 0.00958
Epoch: 23389, Training Loss: 0.00959
Epoch: 23389, Training Loss: 0.01180
Epoch: 23390, Training Loss: 0.00916
Epoch: 23390, Training Loss: 0.00958
Epoch: 23390, Training Loss: 0.00959
Epoch: 23390, Training Loss: 0.01180
Epoch: 23391, Training Loss: 0.00916
Epoch: 23391, Training Loss: 0.00958
Epoch: 23391, Training Loss: 0.00959
Epoch: 23391, Training Loss: 0.01180
Epoch: 23392, Training Loss: 0.00916
Epoch: 23392, Training Loss: 0.00958
Epoch: 23392, Training Loss: 0.00959
Epoch: 23392, Training Loss: 0.01180
Epoch: 23393, Training Loss: 0.00916
Epoch: 23393, Training Loss: 0.00958
Epoch: 23393, Training Loss: 0.00959
Epoch: 23393, Training Loss: 0.01180
Epoch: 23394, Training Loss: 0.00916
Epoch: 23394, Training Loss: 0.00958
Epoch: 23394, Training Loss: 0.00959
Epoch: 23394, Training Loss: 0.01180
Epoch: 23395, Training Loss: 0.00916
Epoch: 23395, Training Loss: 0.00958
Epoch: 23395, Training Loss: 0.00959
Epoch: 23395, Training Loss: 0.01180
Epoch: 23396, Training Loss: 0.00916
Epoch: 23396, Training Loss: 0.00958
Epoch: 23396, Training Loss: 0.00959
Epoch: 23396, Training Loss: 0.01180
Epoch: 23397, Training Loss: 0.00916
Epoch: 23397, Training Loss: 0.00958
Epoch: 23397, Training Loss: 0.00959
Epoch: 23397, Training Loss: 0.01180
Epoch: 23398, Training Loss: 0.00916
Epoch: 23398, Training Loss: 0.00958
Epoch: 23398, Training Loss: 0.00959
Epoch: 23398, Training Loss: 0.01180
Epoch: 23399, Training Loss: 0.00916
Epoch: 23399, Training Loss: 0.00957
Epoch: 23399, Training Loss: 0.00959
Epoch: 23399, Training Loss: 0.01180
Epoch: 23400, Training Loss: 0.00916
Epoch: 23400, Training Loss: 0.00957
Epoch: 23400, Training Loss: 0.00959
Epoch: 23400, Training Loss: 0.01180
Epoch: 23401, Training Loss: 0.00916
Epoch: 23401, Training Loss: 0.00957
Epoch: 23401, Training Loss: 0.00959
Epoch: 23401, Training Loss: 0.01180
Epoch: 23402, Training Loss: 0.00916
Epoch: 23402, Training Loss: 0.00957
Epoch: 23402, Training Loss: 0.00959
Epoch: 23402, Training Loss: 0.01180
[... training log truncated: epochs 23403–23646, per-pattern losses ≈ 0.0091–0.0118, decreasing slowly toward the 0.008 target ...]
Epoch: 23646, Training Loss: 0.01173
Epoch: 23647, Training Loss: 0.00910
Epoch: 23647, Training Loss: 0.00952
Epoch: 23647, Training Loss: 0.00953
Epoch: 23647, Training Loss: 0.01173
Epoch: 23648, Training Loss: 0.00910
Epoch: 23648, Training Loss: 0.00952
Epoch: 23648, Training Loss: 0.00953
Epoch: 23648, Training Loss: 0.01172
Epoch: 23649, Training Loss: 0.00910
Epoch: 23649, Training Loss: 0.00952
Epoch: 23649, Training Loss: 0.00953
Epoch: 23649, Training Loss: 0.01172
Epoch: 23650, Training Loss: 0.00910
Epoch: 23650, Training Loss: 0.00952
Epoch: 23650, Training Loss: 0.00953
Epoch: 23650, Training Loss: 0.01172
Epoch: 23651, Training Loss: 0.00910
Epoch: 23651, Training Loss: 0.00952
Epoch: 23651, Training Loss: 0.00953
Epoch: 23651, Training Loss: 0.01172
Epoch: 23652, Training Loss: 0.00910
Epoch: 23652, Training Loss: 0.00952
Epoch: 23652, Training Loss: 0.00953
Epoch: 23652, Training Loss: 0.01172
Epoch: 23653, Training Loss: 0.00910
Epoch: 23653, Training Loss: 0.00952
Epoch: 23653, Training Loss: 0.00953
Epoch: 23653, Training Loss: 0.01172
Epoch: 23654, Training Loss: 0.00910
Epoch: 23654, Training Loss: 0.00952
Epoch: 23654, Training Loss: 0.00953
Epoch: 23654, Training Loss: 0.01172
Epoch: 23655, Training Loss: 0.00910
Epoch: 23655, Training Loss: 0.00951
Epoch: 23655, Training Loss: 0.00953
Epoch: 23655, Training Loss: 0.01172
Epoch: 23656, Training Loss: 0.00910
Epoch: 23656, Training Loss: 0.00951
Epoch: 23656, Training Loss: 0.00953
Epoch: 23656, Training Loss: 0.01172
Epoch: 23657, Training Loss: 0.00910
Epoch: 23657, Training Loss: 0.00951
Epoch: 23657, Training Loss: 0.00953
Epoch: 23657, Training Loss: 0.01172
Epoch: 23658, Training Loss: 0.00910
Epoch: 23658, Training Loss: 0.00951
Epoch: 23658, Training Loss: 0.00953
Epoch: 23658, Training Loss: 0.01172
Epoch: 23659, Training Loss: 0.00910
Epoch: 23659, Training Loss: 0.00951
Epoch: 23659, Training Loss: 0.00953
Epoch: 23659, Training Loss: 0.01172
Epoch: 23660, Training Loss: 0.00910
Epoch: 23660, Training Loss: 0.00951
Epoch: 23660, Training Loss: 0.00953
Epoch: 23660, Training Loss: 0.01172
Epoch: 23661, Training Loss: 0.00910
Epoch: 23661, Training Loss: 0.00951
Epoch: 23661, Training Loss: 0.00953
Epoch: 23661, Training Loss: 0.01172
Epoch: 23662, Training Loss: 0.00910
Epoch: 23662, Training Loss: 0.00951
Epoch: 23662, Training Loss: 0.00953
Epoch: 23662, Training Loss: 0.01172
Epoch: 23663, Training Loss: 0.00910
Epoch: 23663, Training Loss: 0.00951
Epoch: 23663, Training Loss: 0.00953
Epoch: 23663, Training Loss: 0.01172
Epoch: 23664, Training Loss: 0.00910
Epoch: 23664, Training Loss: 0.00951
Epoch: 23664, Training Loss: 0.00952
Epoch: 23664, Training Loss: 0.01172
Epoch: 23665, Training Loss: 0.00910
Epoch: 23665, Training Loss: 0.00951
Epoch: 23665, Training Loss: 0.00952
Epoch: 23665, Training Loss: 0.01172
Epoch: 23666, Training Loss: 0.00910
Epoch: 23666, Training Loss: 0.00951
Epoch: 23666, Training Loss: 0.00952
Epoch: 23666, Training Loss: 0.01172
Epoch: 23667, Training Loss: 0.00910
Epoch: 23667, Training Loss: 0.00951
Epoch: 23667, Training Loss: 0.00952
Epoch: 23667, Training Loss: 0.01172
Epoch: 23668, Training Loss: 0.00910
Epoch: 23668, Training Loss: 0.00951
Epoch: 23668, Training Loss: 0.00952
Epoch: 23668, Training Loss: 0.01172
Epoch: 23669, Training Loss: 0.00910
Epoch: 23669, Training Loss: 0.00951
Epoch: 23669, Training Loss: 0.00952
Epoch: 23669, Training Loss: 0.01172
Epoch: 23670, Training Loss: 0.00910
Epoch: 23670, Training Loss: 0.00951
Epoch: 23670, Training Loss: 0.00952
Epoch: 23670, Training Loss: 0.01172
Epoch: 23671, Training Loss: 0.00910
Epoch: 23671, Training Loss: 0.00951
Epoch: 23671, Training Loss: 0.00952
Epoch: 23671, Training Loss: 0.01172
Epoch: 23672, Training Loss: 0.00910
Epoch: 23672, Training Loss: 0.00951
Epoch: 23672, Training Loss: 0.00952
Epoch: 23672, Training Loss: 0.01172
Epoch: 23673, Training Loss: 0.00910
Epoch: 23673, Training Loss: 0.00951
Epoch: 23673, Training Loss: 0.00952
Epoch: 23673, Training Loss: 0.01172
Epoch: 23674, Training Loss: 0.00910
Epoch: 23674, Training Loss: 0.00951
Epoch: 23674, Training Loss: 0.00952
Epoch: 23674, Training Loss: 0.01172
Epoch: 23675, Training Loss: 0.00910
Epoch: 23675, Training Loss: 0.00951
Epoch: 23675, Training Loss: 0.00952
Epoch: 23675, Training Loss: 0.01172
Epoch: 23676, Training Loss: 0.00910
Epoch: 23676, Training Loss: 0.00951
Epoch: 23676, Training Loss: 0.00952
Epoch: 23676, Training Loss: 0.01172
Epoch: 23677, Training Loss: 0.00910
Epoch: 23677, Training Loss: 0.00951
Epoch: 23677, Training Loss: 0.00952
Epoch: 23677, Training Loss: 0.01172
Epoch: 23678, Training Loss: 0.00910
Epoch: 23678, Training Loss: 0.00951
Epoch: 23678, Training Loss: 0.00952
Epoch: 23678, Training Loss: 0.01172
Epoch: 23679, Training Loss: 0.00910
Epoch: 23679, Training Loss: 0.00951
Epoch: 23679, Training Loss: 0.00952
Epoch: 23679, Training Loss: 0.01172
Epoch: 23680, Training Loss: 0.00910
Epoch: 23680, Training Loss: 0.00951
Epoch: 23680, Training Loss: 0.00952
Epoch: 23680, Training Loss: 0.01172
Epoch: 23681, Training Loss: 0.00910
Epoch: 23681, Training Loss: 0.00951
Epoch: 23681, Training Loss: 0.00952
Epoch: 23681, Training Loss: 0.01172
Epoch: 23682, Training Loss: 0.00910
Epoch: 23682, Training Loss: 0.00951
Epoch: 23682, Training Loss: 0.00952
Epoch: 23682, Training Loss: 0.01172
Epoch: 23683, Training Loss: 0.00909
Epoch: 23683, Training Loss: 0.00951
Epoch: 23683, Training Loss: 0.00952
Epoch: 23683, Training Loss: 0.01171
Epoch: 23684, Training Loss: 0.00909
Epoch: 23684, Training Loss: 0.00951
Epoch: 23684, Training Loss: 0.00952
Epoch: 23684, Training Loss: 0.01171
Epoch: 23685, Training Loss: 0.00909
Epoch: 23685, Training Loss: 0.00951
Epoch: 23685, Training Loss: 0.00952
Epoch: 23685, Training Loss: 0.01171
Epoch: 23686, Training Loss: 0.00909
Epoch: 23686, Training Loss: 0.00951
Epoch: 23686, Training Loss: 0.00952
Epoch: 23686, Training Loss: 0.01171
Epoch: 23687, Training Loss: 0.00909
Epoch: 23687, Training Loss: 0.00951
Epoch: 23687, Training Loss: 0.00952
Epoch: 23687, Training Loss: 0.01171
Epoch: 23688, Training Loss: 0.00909
Epoch: 23688, Training Loss: 0.00951
Epoch: 23688, Training Loss: 0.00952
Epoch: 23688, Training Loss: 0.01171
Epoch: 23689, Training Loss: 0.00909
Epoch: 23689, Training Loss: 0.00951
Epoch: 23689, Training Loss: 0.00952
Epoch: 23689, Training Loss: 0.01171
Epoch: 23690, Training Loss: 0.00909
Epoch: 23690, Training Loss: 0.00951
Epoch: 23690, Training Loss: 0.00952
Epoch: 23690, Training Loss: 0.01171
Epoch: 23691, Training Loss: 0.00909
Epoch: 23691, Training Loss: 0.00951
Epoch: 23691, Training Loss: 0.00952
Epoch: 23691, Training Loss: 0.01171
Epoch: 23692, Training Loss: 0.00909
Epoch: 23692, Training Loss: 0.00951
Epoch: 23692, Training Loss: 0.00952
Epoch: 23692, Training Loss: 0.01171
Epoch: 23693, Training Loss: 0.00909
Epoch: 23693, Training Loss: 0.00951
Epoch: 23693, Training Loss: 0.00952
Epoch: 23693, Training Loss: 0.01171
Epoch: 23694, Training Loss: 0.00909
Epoch: 23694, Training Loss: 0.00951
Epoch: 23694, Training Loss: 0.00952
Epoch: 23694, Training Loss: 0.01171
Epoch: 23695, Training Loss: 0.00909
Epoch: 23695, Training Loss: 0.00951
Epoch: 23695, Training Loss: 0.00952
Epoch: 23695, Training Loss: 0.01171
Epoch: 23696, Training Loss: 0.00909
Epoch: 23696, Training Loss: 0.00951
Epoch: 23696, Training Loss: 0.00952
Epoch: 23696, Training Loss: 0.01171
Epoch: 23697, Training Loss: 0.00909
Epoch: 23697, Training Loss: 0.00951
Epoch: 23697, Training Loss: 0.00952
Epoch: 23697, Training Loss: 0.01171
Epoch: 23698, Training Loss: 0.00909
Epoch: 23698, Training Loss: 0.00950
Epoch: 23698, Training Loss: 0.00952
Epoch: 23698, Training Loss: 0.01171
Epoch: 23699, Training Loss: 0.00909
Epoch: 23699, Training Loss: 0.00950
Epoch: 23699, Training Loss: 0.00952
Epoch: 23699, Training Loss: 0.01171
Epoch: 23700, Training Loss: 0.00909
Epoch: 23700, Training Loss: 0.00950
Epoch: 23700, Training Loss: 0.00952
Epoch: 23700, Training Loss: 0.01171
Epoch: 23701, Training Loss: 0.00909
Epoch: 23701, Training Loss: 0.00950
Epoch: 23701, Training Loss: 0.00952
Epoch: 23701, Training Loss: 0.01171
Epoch: 23702, Training Loss: 0.00909
Epoch: 23702, Training Loss: 0.00950
Epoch: 23702, Training Loss: 0.00952
Epoch: 23702, Training Loss: 0.01171
Epoch: 23703, Training Loss: 0.00909
Epoch: 23703, Training Loss: 0.00950
Epoch: 23703, Training Loss: 0.00952
Epoch: 23703, Training Loss: 0.01171
Epoch: 23704, Training Loss: 0.00909
Epoch: 23704, Training Loss: 0.00950
Epoch: 23704, Training Loss: 0.00952
Epoch: 23704, Training Loss: 0.01171
Epoch: 23705, Training Loss: 0.00909
Epoch: 23705, Training Loss: 0.00950
Epoch: 23705, Training Loss: 0.00952
Epoch: 23705, Training Loss: 0.01171
Epoch: 23706, Training Loss: 0.00909
Epoch: 23706, Training Loss: 0.00950
Epoch: 23706, Training Loss: 0.00952
Epoch: 23706, Training Loss: 0.01171
Epoch: 23707, Training Loss: 0.00909
Epoch: 23707, Training Loss: 0.00950
Epoch: 23707, Training Loss: 0.00951
Epoch: 23707, Training Loss: 0.01171
Epoch: 23708, Training Loss: 0.00909
Epoch: 23708, Training Loss: 0.00950
Epoch: 23708, Training Loss: 0.00951
Epoch: 23708, Training Loss: 0.01171
Epoch: 23709, Training Loss: 0.00909
Epoch: 23709, Training Loss: 0.00950
Epoch: 23709, Training Loss: 0.00951
Epoch: 23709, Training Loss: 0.01171
Epoch: 23710, Training Loss: 0.00909
Epoch: 23710, Training Loss: 0.00950
Epoch: 23710, Training Loss: 0.00951
Epoch: 23710, Training Loss: 0.01171
Epoch: 23711, Training Loss: 0.00909
Epoch: 23711, Training Loss: 0.00950
Epoch: 23711, Training Loss: 0.00951
Epoch: 23711, Training Loss: 0.01171
Epoch: 23712, Training Loss: 0.00909
Epoch: 23712, Training Loss: 0.00950
Epoch: 23712, Training Loss: 0.00951
Epoch: 23712, Training Loss: 0.01171
Epoch: 23713, Training Loss: 0.00909
Epoch: 23713, Training Loss: 0.00950
Epoch: 23713, Training Loss: 0.00951
Epoch: 23713, Training Loss: 0.01171
Epoch: 23714, Training Loss: 0.00909
Epoch: 23714, Training Loss: 0.00950
Epoch: 23714, Training Loss: 0.00951
Epoch: 23714, Training Loss: 0.01171
Epoch: 23715, Training Loss: 0.00909
Epoch: 23715, Training Loss: 0.00950
Epoch: 23715, Training Loss: 0.00951
Epoch: 23715, Training Loss: 0.01171
Epoch: 23716, Training Loss: 0.00909
Epoch: 23716, Training Loss: 0.00950
Epoch: 23716, Training Loss: 0.00951
Epoch: 23716, Training Loss: 0.01171
Epoch: 23717, Training Loss: 0.00909
Epoch: 23717, Training Loss: 0.00950
Epoch: 23717, Training Loss: 0.00951
Epoch: 23717, Training Loss: 0.01170
Epoch: 23718, Training Loss: 0.00909
Epoch: 23718, Training Loss: 0.00950
Epoch: 23718, Training Loss: 0.00951
Epoch: 23718, Training Loss: 0.01170
Epoch: 23719, Training Loss: 0.00909
Epoch: 23719, Training Loss: 0.00950
Epoch: 23719, Training Loss: 0.00951
Epoch: 23719, Training Loss: 0.01170
Epoch: 23720, Training Loss: 0.00909
Epoch: 23720, Training Loss: 0.00950
Epoch: 23720, Training Loss: 0.00951
Epoch: 23720, Training Loss: 0.01170
Epoch: 23721, Training Loss: 0.00909
Epoch: 23721, Training Loss: 0.00950
Epoch: 23721, Training Loss: 0.00951
Epoch: 23721, Training Loss: 0.01170
Epoch: 23722, Training Loss: 0.00909
Epoch: 23722, Training Loss: 0.00950
Epoch: 23722, Training Loss: 0.00951
Epoch: 23722, Training Loss: 0.01170
Epoch: 23723, Training Loss: 0.00909
Epoch: 23723, Training Loss: 0.00950
Epoch: 23723, Training Loss: 0.00951
Epoch: 23723, Training Loss: 0.01170
Epoch: 23724, Training Loss: 0.00909
Epoch: 23724, Training Loss: 0.00950
Epoch: 23724, Training Loss: 0.00951
Epoch: 23724, Training Loss: 0.01170
Epoch: 23725, Training Loss: 0.00909
Epoch: 23725, Training Loss: 0.00950
Epoch: 23725, Training Loss: 0.00951
Epoch: 23725, Training Loss: 0.01170
Epoch: 23726, Training Loss: 0.00909
Epoch: 23726, Training Loss: 0.00950
Epoch: 23726, Training Loss: 0.00951
Epoch: 23726, Training Loss: 0.01170
Epoch: 23727, Training Loss: 0.00909
Epoch: 23727, Training Loss: 0.00950
Epoch: 23727, Training Loss: 0.00951
Epoch: 23727, Training Loss: 0.01170
Epoch: 23728, Training Loss: 0.00909
Epoch: 23728, Training Loss: 0.00950
Epoch: 23728, Training Loss: 0.00951
Epoch: 23728, Training Loss: 0.01170
Epoch: 23729, Training Loss: 0.00908
Epoch: 23729, Training Loss: 0.00950
Epoch: 23729, Training Loss: 0.00951
Epoch: 23729, Training Loss: 0.01170
Epoch: 23730, Training Loss: 0.00908
Epoch: 23730, Training Loss: 0.00950
Epoch: 23730, Training Loss: 0.00951
Epoch: 23730, Training Loss: 0.01170
Epoch: 23731, Training Loss: 0.00908
Epoch: 23731, Training Loss: 0.00950
Epoch: 23731, Training Loss: 0.00951
Epoch: 23731, Training Loss: 0.01170
Epoch: 23732, Training Loss: 0.00908
Epoch: 23732, Training Loss: 0.00950
Epoch: 23732, Training Loss: 0.00951
Epoch: 23732, Training Loss: 0.01170
Epoch: 23733, Training Loss: 0.00908
Epoch: 23733, Training Loss: 0.00950
Epoch: 23733, Training Loss: 0.00951
Epoch: 23733, Training Loss: 0.01170
Epoch: 23734, Training Loss: 0.00908
Epoch: 23734, Training Loss: 0.00950
Epoch: 23734, Training Loss: 0.00951
Epoch: 23734, Training Loss: 0.01170
Epoch: 23735, Training Loss: 0.00908
Epoch: 23735, Training Loss: 0.00950
Epoch: 23735, Training Loss: 0.00951
Epoch: 23735, Training Loss: 0.01170
Epoch: 23736, Training Loss: 0.00908
Epoch: 23736, Training Loss: 0.00950
Epoch: 23736, Training Loss: 0.00951
Epoch: 23736, Training Loss: 0.01170
Epoch: 23737, Training Loss: 0.00908
Epoch: 23737, Training Loss: 0.00950
Epoch: 23737, Training Loss: 0.00951
Epoch: 23737, Training Loss: 0.01170
Epoch: 23738, Training Loss: 0.00908
Epoch: 23738, Training Loss: 0.00950
Epoch: 23738, Training Loss: 0.00951
Epoch: 23738, Training Loss: 0.01170
Epoch: 23739, Training Loss: 0.00908
Epoch: 23739, Training Loss: 0.00950
Epoch: 23739, Training Loss: 0.00951
Epoch: 23739, Training Loss: 0.01170
Epoch: 23740, Training Loss: 0.00908
Epoch: 23740, Training Loss: 0.00950
Epoch: 23740, Training Loss: 0.00951
Epoch: 23740, Training Loss: 0.01170
Epoch: 23741, Training Loss: 0.00908
Epoch: 23741, Training Loss: 0.00949
Epoch: 23741, Training Loss: 0.00951
Epoch: 23741, Training Loss: 0.01170
Epoch: 23742, Training Loss: 0.00908
Epoch: 23742, Training Loss: 0.00949
Epoch: 23742, Training Loss: 0.00951
Epoch: 23742, Training Loss: 0.01170
Epoch: 23743, Training Loss: 0.00908
Epoch: 23743, Training Loss: 0.00949
Epoch: 23743, Training Loss: 0.00951
Epoch: 23743, Training Loss: 0.01170
Epoch: 23744, Training Loss: 0.00908
Epoch: 23744, Training Loss: 0.00949
Epoch: 23744, Training Loss: 0.00951
Epoch: 23744, Training Loss: 0.01170
Epoch: 23745, Training Loss: 0.00908
Epoch: 23745, Training Loss: 0.00949
Epoch: 23745, Training Loss: 0.00951
Epoch: 23745, Training Loss: 0.01170
Epoch: 23746, Training Loss: 0.00908
Epoch: 23746, Training Loss: 0.00949
Epoch: 23746, Training Loss: 0.00951
Epoch: 23746, Training Loss: 0.01170
Epoch: 23747, Training Loss: 0.00908
Epoch: 23747, Training Loss: 0.00949
Epoch: 23747, Training Loss: 0.00951
Epoch: 23747, Training Loss: 0.01170
Epoch: 23748, Training Loss: 0.00908
Epoch: 23748, Training Loss: 0.00949
Epoch: 23748, Training Loss: 0.00951
Epoch: 23748, Training Loss: 0.01170
Epoch: 23749, Training Loss: 0.00908
Epoch: 23749, Training Loss: 0.00949
Epoch: 23749, Training Loss: 0.00951
Epoch: 23749, Training Loss: 0.01170
Epoch: 23750, Training Loss: 0.00908
Epoch: 23750, Training Loss: 0.00949
Epoch: 23750, Training Loss: 0.00950
Epoch: 23750, Training Loss: 0.01170
Epoch: 23751, Training Loss: 0.00908
Epoch: 23751, Training Loss: 0.00949
Epoch: 23751, Training Loss: 0.00950
Epoch: 23751, Training Loss: 0.01170
Epoch: 23752, Training Loss: 0.00908
Epoch: 23752, Training Loss: 0.00949
Epoch: 23752, Training Loss: 0.00950
Epoch: 23752, Training Loss: 0.01169
Epoch: 23753, Training Loss: 0.00908
Epoch: 23753, Training Loss: 0.00949
Epoch: 23753, Training Loss: 0.00950
Epoch: 23753, Training Loss: 0.01169
Epoch: 23754, Training Loss: 0.00908
Epoch: 23754, Training Loss: 0.00949
Epoch: 23754, Training Loss: 0.00950
Epoch: 23754, Training Loss: 0.01169
Epoch: 23755, Training Loss: 0.00908
Epoch: 23755, Training Loss: 0.00949
Epoch: 23755, Training Loss: 0.00950
Epoch: 23755, Training Loss: 0.01169
Epoch: 23756, Training Loss: 0.00908
Epoch: 23756, Training Loss: 0.00949
Epoch: 23756, Training Loss: 0.00950
Epoch: 23756, Training Loss: 0.01169
Epoch: 23757, Training Loss: 0.00908
Epoch: 23757, Training Loss: 0.00949
Epoch: 23757, Training Loss: 0.00950
Epoch: 23757, Training Loss: 0.01169
Epoch: 23758, Training Loss: 0.00908
Epoch: 23758, Training Loss: 0.00949
Epoch: 23758, Training Loss: 0.00950
Epoch: 23758, Training Loss: 0.01169
Epoch: 23759, Training Loss: 0.00908
Epoch: 23759, Training Loss: 0.00949
Epoch: 23759, Training Loss: 0.00950
Epoch: 23759, Training Loss: 0.01169
Epoch: 23760, Training Loss: 0.00908
Epoch: 23760, Training Loss: 0.00949
Epoch: 23760, Training Loss: 0.00950
Epoch: 23760, Training Loss: 0.01169
Epoch: 23761, Training Loss: 0.00908
Epoch: 23761, Training Loss: 0.00949
Epoch: 23761, Training Loss: 0.00950
Epoch: 23761, Training Loss: 0.01169
Epoch: 23762, Training Loss: 0.00908
Epoch: 23762, Training Loss: 0.00949
Epoch: 23762, Training Loss: 0.00950
Epoch: 23762, Training Loss: 0.01169
Epoch: 23763, Training Loss: 0.00908
Epoch: 23763, Training Loss: 0.00949
Epoch: 23763, Training Loss: 0.00950
Epoch: 23763, Training Loss: 0.01169
Epoch: 23764, Training Loss: 0.00908
Epoch: 23764, Training Loss: 0.00949
Epoch: 23764, Training Loss: 0.00950
Epoch: 23764, Training Loss: 0.01169
Epoch: 23765, Training Loss: 0.00908
Epoch: 23765, Training Loss: 0.00949
Epoch: 23765, Training Loss: 0.00950
Epoch: 23765, Training Loss: 0.01169
Epoch: 23766, Training Loss: 0.00908
Epoch: 23766, Training Loss: 0.00949
Epoch: 23766, Training Loss: 0.00950
Epoch: 23766, Training Loss: 0.01169
Epoch: 23767, Training Loss: 0.00908
Epoch: 23767, Training Loss: 0.00949
Epoch: 23767, Training Loss: 0.00950
Epoch: 23767, Training Loss: 0.01169
Epoch: 23768, Training Loss: 0.00908
Epoch: 23768, Training Loss: 0.00949
Epoch: 23768, Training Loss: 0.00950
Epoch: 23768, Training Loss: 0.01169
Epoch: 23769, Training Loss: 0.00908
Epoch: 23769, Training Loss: 0.00949
Epoch: 23769, Training Loss: 0.00950
Epoch: 23769, Training Loss: 0.01169
Epoch: 23770, Training Loss: 0.00908
Epoch: 23770, Training Loss: 0.00949
Epoch: 23770, Training Loss: 0.00950
Epoch: 23770, Training Loss: 0.01169
Epoch: 23771, Training Loss: 0.00908
Epoch: 23771, Training Loss: 0.00949
Epoch: 23771, Training Loss: 0.00950
Epoch: 23771, Training Loss: 0.01169
Epoch: 23772, Training Loss: 0.00908
Epoch: 23772, Training Loss: 0.00949
Epoch: 23772, Training Loss: 0.00950
Epoch: 23772, Training Loss: 0.01169
Epoch: 23773, Training Loss: 0.00908
Epoch: 23773, Training Loss: 0.00949
Epoch: 23773, Training Loss: 0.00950
Epoch: 23773, Training Loss: 0.01169
Epoch: 23774, Training Loss: 0.00908
Epoch: 23774, Training Loss: 0.00949
Epoch: 23774, Training Loss: 0.00950
Epoch: 23774, Training Loss: 0.01169
Epoch: 23775, Training Loss: 0.00908
Epoch: 23775, Training Loss: 0.00949
Epoch: 23775, Training Loss: 0.00950
Epoch: 23775, Training Loss: 0.01169
Epoch: 23776, Training Loss: 0.00907
Epoch: 23776, Training Loss: 0.00949
Epoch: 23776, Training Loss: 0.00950
Epoch: 23776, Training Loss: 0.01169
Epoch: 23777, Training Loss: 0.00907
Epoch: 23777, Training Loss: 0.00949
Epoch: 23777, Training Loss: 0.00950
Epoch: 23777, Training Loss: 0.01169
Epoch: 23778, Training Loss: 0.00907
Epoch: 23778, Training Loss: 0.00949
Epoch: 23778, Training Loss: 0.00950
Epoch: 23778, Training Loss: 0.01169
Epoch: 23779, Training Loss: 0.00907
Epoch: 23779, Training Loss: 0.00949
Epoch: 23779, Training Loss: 0.00950
Epoch: 23779, Training Loss: 0.01169
Epoch: 23780, Training Loss: 0.00907
Epoch: 23780, Training Loss: 0.00949
Epoch: 23780, Training Loss: 0.00950
Epoch: 23780, Training Loss: 0.01169
Epoch: 23781, Training Loss: 0.00907
Epoch: 23781, Training Loss: 0.00949
Epoch: 23781, Training Loss: 0.00950
Epoch: 23781, Training Loss: 0.01169
Epoch: 23782, Training Loss: 0.00907
Epoch: 23782, Training Loss: 0.00949
Epoch: 23782, Training Loss: 0.00950
Epoch: 23782, Training Loss: 0.01169
Epoch: 23783, Training Loss: 0.00907
Epoch: 23783, Training Loss: 0.00949
Epoch: 23783, Training Loss: 0.00950
Epoch: 23783, Training Loss: 0.01169
Epoch: 23784, Training Loss: 0.00907
Epoch: 23784, Training Loss: 0.00949
Epoch: 23784, Training Loss: 0.00950
Epoch: 23784, Training Loss: 0.01169
Epoch: 23785, Training Loss: 0.00907
Epoch: 23785, Training Loss: 0.00948
Epoch: 23785, Training Loss: 0.00950
Epoch: 23785, Training Loss: 0.01169
Epoch: 23786, Training Loss: 0.00907
Epoch: 23786, Training Loss: 0.00948
Epoch: 23786, Training Loss: 0.00950
Epoch: 23786, Training Loss: 0.01169
Epoch: 23787, Training Loss: 0.00907
Epoch: 23787, Training Loss: 0.00948
Epoch: 23787, Training Loss: 0.00950
Epoch: 23787, Training Loss: 0.01168
Epoch: 23788, Training Loss: 0.00907
Epoch: 23788, Training Loss: 0.00948
Epoch: 23788, Training Loss: 0.00950
Epoch: 23788, Training Loss: 0.01168
Epoch: 23789, Training Loss: 0.00907
Epoch: 23789, Training Loss: 0.00948
Epoch: 23789, Training Loss: 0.00950
Epoch: 23789, Training Loss: 0.01168
Epoch: 23790, Training Loss: 0.00907
Epoch: 23790, Training Loss: 0.00948
Epoch: 23790, Training Loss: 0.00950
Epoch: 23790, Training Loss: 0.01168
Epoch: 23791, Training Loss: 0.00907
Epoch: 23791, Training Loss: 0.00948
Epoch: 23791, Training Loss: 0.00950
Epoch: 23791, Training Loss: 0.01168
Epoch: 23792, Training Loss: 0.00907
Epoch: 23792, Training Loss: 0.00948
Epoch: 23792, Training Loss: 0.00950
Epoch: 23792, Training Loss: 0.01168
Epoch: 23793, Training Loss: 0.00907
Epoch: 23793, Training Loss: 0.00948
Epoch: 23793, Training Loss: 0.00949
Epoch: 23793, Training Loss: 0.01168
Epoch: 23794, Training Loss: 0.00907
Epoch: 23794, Training Loss: 0.00948
Epoch: 23794, Training Loss: 0.00949
Epoch: 23794, Training Loss: 0.01168
Epoch: 23795, Training Loss: 0.00907
Epoch: 23795, Training Loss: 0.00948
Epoch: 23795, Training Loss: 0.00949
Epoch: 23795, Training Loss: 0.01168
Epoch: 23796, Training Loss: 0.00907
Epoch: 23796, Training Loss: 0.00948
Epoch: 23796, Training Loss: 0.00949
Epoch: 23796, Training Loss: 0.01168
Epoch: 23797, Training Loss: 0.00907
Epoch: 23797, Training Loss: 0.00948
Epoch: 23797, Training Loss: 0.00949
Epoch: 23797, Training Loss: 0.01168
Epoch: 23798, Training Loss: 0.00907
Epoch: 23798, Training Loss: 0.00948
Epoch: 23798, Training Loss: 0.00949
Epoch: 23798, Training Loss: 0.01168
Epoch: 23799, Training Loss: 0.00907
Epoch: 23799, Training Loss: 0.00948
Epoch: 23799, Training Loss: 0.00949
Epoch: 23799, Training Loss: 0.01168
Epoch: 23800, Training Loss: 0.00907
Epoch: 23800, Training Loss: 0.00948
Epoch: 23800, Training Loss: 0.00949
Epoch: 23800, Training Loss: 0.01168
Epoch: 23801, Training Loss: 0.00907
Epoch: 23801, Training Loss: 0.00948
Epoch: 23801, Training Loss: 0.00949
Epoch: 23801, Training Loss: 0.01168
Epoch: 23802, Training Loss: 0.00907
Epoch: 23802, Training Loss: 0.00948
Epoch: 23802, Training Loss: 0.00949
Epoch: 23802, Training Loss: 0.01168
Epoch: 23803, Training Loss: 0.00907
Epoch: 23803, Training Loss: 0.00948
Epoch: 23803, Training Loss: 0.00949
Epoch: 23803, Training Loss: 0.01168
Epoch: 23804, Training Loss: 0.00907
Epoch: 23804, Training Loss: 0.00948
Epoch: 23804, Training Loss: 0.00949
Epoch: 23804, Training Loss: 0.01168
Epoch: 23805, Training Loss: 0.00907
Epoch: 23805, Training Loss: 0.00948
Epoch: 23805, Training Loss: 0.00949
Epoch: 23805, Training Loss: 0.01168
Epoch: 23806, Training Loss: 0.00907
Epoch: 23806, Training Loss: 0.00948
Epoch: 23806, Training Loss: 0.00949
Epoch: 23806, Training Loss: 0.01168
Epoch: 23807, Training Loss: 0.00907
Epoch: 23807, Training Loss: 0.00948
Epoch: 23807, Training Loss: 0.00949
Epoch: 23807, Training Loss: 0.01168
Epoch: 23808, Training Loss: 0.00907
Epoch: 23808, Training Loss: 0.00948
Epoch: 23808, Training Loss: 0.00949
Epoch: 23808, Training Loss: 0.01168
Epoch: 23809, Training Loss: 0.00907
Epoch: 23809, Training Loss: 0.00948
Epoch: 23809, Training Loss: 0.00949
Epoch: 23809, Training Loss: 0.01168
Epoch: 23810, Training Loss: 0.00907
Epoch: 23810, Training Loss: 0.00948
Epoch: 23810, Training Loss: 0.00949
Epoch: 23810, Training Loss: 0.01168
Epoch: 23811, Training Loss: 0.00907
Epoch: 23811, Training Loss: 0.00948
Epoch: 23811, Training Loss: 0.00949
Epoch: 23811, Training Loss: 0.01168
Epoch: 23812, Training Loss: 0.00907
Epoch: 23812, Training Loss: 0.00948
Epoch: 23812, Training Loss: 0.00949
Epoch: 23812, Training Loss: 0.01168
Epoch: 23813, Training Loss: 0.00907
Epoch: 23813, Training Loss: 0.00948
Epoch: 23813, Training Loss: 0.00949
Epoch: 23813, Training Loss: 0.01168
Epoch: 23814, Training Loss: 0.00907
Epoch: 23814, Training Loss: 0.00948
Epoch: 23814, Training Loss: 0.00949
Epoch: 23814, Training Loss: 0.01168
Epoch: 23815, Training Loss: 0.00907
Epoch: 23815, Training Loss: 0.00948
Epoch: 23815, Training Loss: 0.00949
Epoch: 23815, Training Loss: 0.01168
Epoch: 23816, Training Loss: 0.00907
Epoch: 23816, Training Loss: 0.00948
Epoch: 23816, Training Loss: 0.00949
Epoch: 23816, Training Loss: 0.01168
Epoch: 23817, Training Loss: 0.00907
Epoch: 23817, Training Loss: 0.00948
Epoch: 23817, Training Loss: 0.00949
Epoch: 23817, Training Loss: 0.01168
Epoch: 23818, Training Loss: 0.00907
Epoch: 23818, Training Loss: 0.00948
Epoch: 23818, Training Loss: 0.00949
Epoch: 23818, Training Loss: 0.01168
Epoch: 23819, Training Loss: 0.00907
Epoch: 23819, Training Loss: 0.00948
Epoch: 23819, Training Loss: 0.00949
Epoch: 23819, Training Loss: 0.01168
Epoch: 23820, Training Loss: 0.00907
Epoch: 23820, Training Loss: 0.00948
Epoch: 23820, Training Loss: 0.00949
Epoch: 23820, Training Loss: 0.01168
Epoch: 23821, Training Loss: 0.00907
Epoch: 23821, Training Loss: 0.00948
Epoch: 23821, Training Loss: 0.00949
Epoch: 23821, Training Loss: 0.01168
Epoch: 23822, Training Loss: 0.00906
Epoch: 23822, Training Loss: 0.00948
Epoch: 23822, Training Loss: 0.00949
Epoch: 23822, Training Loss: 0.01167
Epoch: 23823, Training Loss: 0.00906
Epoch: 23823, Training Loss: 0.00948
Epoch: 23823, Training Loss: 0.00949
Epoch: 23823, Training Loss: 0.01167
Epoch: 23824, Training Loss: 0.00906
Epoch: 23824, Training Loss: 0.00948
Epoch: 23824, Training Loss: 0.00949
Epoch: 23824, Training Loss: 0.01167
Epoch: 23825, Training Loss: 0.00906
Epoch: 23825, Training Loss: 0.00948
Epoch: 23825, Training Loss: 0.00949
Epoch: 23825, Training Loss: 0.01167
Epoch: 23826, Training Loss: 0.00906
Epoch: 23826, Training Loss: 0.00948
Epoch: 23826, Training Loss: 0.00949
Epoch: 23826, Training Loss: 0.01167
Epoch: 23827, Training Loss: 0.00906
Epoch: 23827, Training Loss: 0.00948
Epoch: 23827, Training Loss: 0.00949
Epoch: 23827, Training Loss: 0.01167
Epoch: 23828, Training Loss: 0.00906
Epoch: 23828, Training Loss: 0.00947
Epoch: 23828, Training Loss: 0.00949
Epoch: 23828, Training Loss: 0.01167
Epoch: 23829, Training Loss: 0.00906
Epoch: 23829, Training Loss: 0.00947
Epoch: 23829, Training Loss: 0.00949
Epoch: 23829, Training Loss: 0.01167
Epoch: 23830, Training Loss: 0.00906
Epoch: 23830, Training Loss: 0.00947
Epoch: 23830, Training Loss: 0.00949
Epoch: 23830, Training Loss: 0.01167
Epoch: 23831, Training Loss: 0.00906
Epoch: 23831, Training Loss: 0.00947
Epoch: 23831, Training Loss: 0.00949
Epoch: 23831, Training Loss: 0.01167
Epoch: 23832, Training Loss: 0.00906
Epoch: 23832, Training Loss: 0.00947
Epoch: 23832, Training Loss: 0.00949
Epoch: 23832, Training Loss: 0.01167
Epoch: 23833, Training Loss: 0.00906
Epoch: 23833, Training Loss: 0.00947
Epoch: 23833, Training Loss: 0.00949
Epoch: 23833, Training Loss: 0.01167
Epoch: 23834, Training Loss: 0.00906
Epoch: 23834, Training Loss: 0.00947
Epoch: 23834, Training Loss: 0.00949
Epoch: 23834, Training Loss: 0.01167
Epoch: 23835, Training Loss: 0.00906
Epoch: 23835, Training Loss: 0.00947
Epoch: 23835, Training Loss: 0.00949
Epoch: 23835, Training Loss: 0.01167
Epoch: 23836, Training Loss: 0.00906
Epoch: 23836, Training Loss: 0.00947
Epoch: 23836, Training Loss: 0.00948
Epoch: 23836, Training Loss: 0.01167
...
Epoch: 24079, Training Loss: 0.00901
(output truncated: four per-sample losses are printed each epoch; between epochs 23835 and 24079 they decrease only slightly, from roughly 0.0117/0.0095 to 0.0116/0.0090, i.e. training has nearly plateaued above the 0.008 target)
Epoch: 24079, Training Loss: 0.00942
Epoch: 24079, Training Loss: 0.00943
Epoch: 24079, Training Loss: 0.01160
Epoch: 24080, Training Loss: 0.00901
Epoch: 24080, Training Loss: 0.00942
Epoch: 24080, Training Loss: 0.00943
Epoch: 24080, Training Loss: 0.01160
Epoch: 24081, Training Loss: 0.00901
Epoch: 24081, Training Loss: 0.00942
Epoch: 24081, Training Loss: 0.00943
Epoch: 24081, Training Loss: 0.01160
Epoch: 24082, Training Loss: 0.00901
Epoch: 24082, Training Loss: 0.00942
Epoch: 24082, Training Loss: 0.00943
Epoch: 24082, Training Loss: 0.01160
Epoch: 24083, Training Loss: 0.00901
Epoch: 24083, Training Loss: 0.00942
Epoch: 24083, Training Loss: 0.00943
Epoch: 24083, Training Loss: 0.01160
Epoch: 24084, Training Loss: 0.00901
Epoch: 24084, Training Loss: 0.00942
Epoch: 24084, Training Loss: 0.00943
Epoch: 24084, Training Loss: 0.01160
Epoch: 24085, Training Loss: 0.00901
Epoch: 24085, Training Loss: 0.00942
Epoch: 24085, Training Loss: 0.00943
Epoch: 24085, Training Loss: 0.01160
Epoch: 24086, Training Loss: 0.00901
Epoch: 24086, Training Loss: 0.00942
Epoch: 24086, Training Loss: 0.00943
Epoch: 24086, Training Loss: 0.01160
Epoch: 24087, Training Loss: 0.00901
Epoch: 24087, Training Loss: 0.00942
Epoch: 24087, Training Loss: 0.00943
Epoch: 24087, Training Loss: 0.01160
Epoch: 24088, Training Loss: 0.00901
Epoch: 24088, Training Loss: 0.00942
Epoch: 24088, Training Loss: 0.00943
Epoch: 24088, Training Loss: 0.01160
Epoch: 24089, Training Loss: 0.00901
Epoch: 24089, Training Loss: 0.00942
Epoch: 24089, Training Loss: 0.00943
Epoch: 24089, Training Loss: 0.01160
Epoch: 24090, Training Loss: 0.00901
Epoch: 24090, Training Loss: 0.00942
Epoch: 24090, Training Loss: 0.00943
Epoch: 24090, Training Loss: 0.01160
Epoch: 24091, Training Loss: 0.00901
Epoch: 24091, Training Loss: 0.00942
Epoch: 24091, Training Loss: 0.00943
Epoch: 24091, Training Loss: 0.01160
Epoch: 24092, Training Loss: 0.00901
Epoch: 24092, Training Loss: 0.00941
Epoch: 24092, Training Loss: 0.00943
Epoch: 24092, Training Loss: 0.01160
Epoch: 24093, Training Loss: 0.00901
Epoch: 24093, Training Loss: 0.00941
Epoch: 24093, Training Loss: 0.00943
Epoch: 24093, Training Loss: 0.01160
Epoch: 24094, Training Loss: 0.00901
Epoch: 24094, Training Loss: 0.00941
Epoch: 24094, Training Loss: 0.00943
Epoch: 24094, Training Loss: 0.01160
Epoch: 24095, Training Loss: 0.00901
Epoch: 24095, Training Loss: 0.00941
Epoch: 24095, Training Loss: 0.00943
Epoch: 24095, Training Loss: 0.01160
Epoch: 24096, Training Loss: 0.00901
Epoch: 24096, Training Loss: 0.00941
Epoch: 24096, Training Loss: 0.00943
Epoch: 24096, Training Loss: 0.01160
Epoch: 24097, Training Loss: 0.00901
Epoch: 24097, Training Loss: 0.00941
Epoch: 24097, Training Loss: 0.00943
Epoch: 24097, Training Loss: 0.01160
Epoch: 24098, Training Loss: 0.00901
Epoch: 24098, Training Loss: 0.00941
Epoch: 24098, Training Loss: 0.00943
Epoch: 24098, Training Loss: 0.01160
Epoch: 24099, Training Loss: 0.00901
Epoch: 24099, Training Loss: 0.00941
Epoch: 24099, Training Loss: 0.00943
Epoch: 24099, Training Loss: 0.01160
Epoch: 24100, Training Loss: 0.00901
Epoch: 24100, Training Loss: 0.00941
Epoch: 24100, Training Loss: 0.00942
Epoch: 24100, Training Loss: 0.01160
Epoch: 24101, Training Loss: 0.00901
Epoch: 24101, Training Loss: 0.00941
Epoch: 24101, Training Loss: 0.00942
Epoch: 24101, Training Loss: 0.01160
Epoch: 24102, Training Loss: 0.00901
Epoch: 24102, Training Loss: 0.00941
Epoch: 24102, Training Loss: 0.00942
Epoch: 24102, Training Loss: 0.01160
Epoch: 24103, Training Loss: 0.00901
Epoch: 24103, Training Loss: 0.00941
Epoch: 24103, Training Loss: 0.00942
Epoch: 24103, Training Loss: 0.01159
Epoch: 24104, Training Loss: 0.00901
Epoch: 24104, Training Loss: 0.00941
Epoch: 24104, Training Loss: 0.00942
Epoch: 24104, Training Loss: 0.01159
Epoch: 24105, Training Loss: 0.00900
Epoch: 24105, Training Loss: 0.00941
Epoch: 24105, Training Loss: 0.00942
Epoch: 24105, Training Loss: 0.01159
Epoch: 24106, Training Loss: 0.00900
Epoch: 24106, Training Loss: 0.00941
Epoch: 24106, Training Loss: 0.00942
Epoch: 24106, Training Loss: 0.01159
Epoch: 24107, Training Loss: 0.00900
Epoch: 24107, Training Loss: 0.00941
Epoch: 24107, Training Loss: 0.00942
Epoch: 24107, Training Loss: 0.01159
Epoch: 24108, Training Loss: 0.00900
Epoch: 24108, Training Loss: 0.00941
Epoch: 24108, Training Loss: 0.00942
Epoch: 24108, Training Loss: 0.01159
Epoch: 24109, Training Loss: 0.00900
Epoch: 24109, Training Loss: 0.00941
Epoch: 24109, Training Loss: 0.00942
Epoch: 24109, Training Loss: 0.01159
Epoch: 24110, Training Loss: 0.00900
Epoch: 24110, Training Loss: 0.00941
Epoch: 24110, Training Loss: 0.00942
Epoch: 24110, Training Loss: 0.01159
Epoch: 24111, Training Loss: 0.00900
Epoch: 24111, Training Loss: 0.00941
Epoch: 24111, Training Loss: 0.00942
Epoch: 24111, Training Loss: 0.01159
Epoch: 24112, Training Loss: 0.00900
Epoch: 24112, Training Loss: 0.00941
Epoch: 24112, Training Loss: 0.00942
Epoch: 24112, Training Loss: 0.01159
Epoch: 24113, Training Loss: 0.00900
Epoch: 24113, Training Loss: 0.00941
Epoch: 24113, Training Loss: 0.00942
Epoch: 24113, Training Loss: 0.01159
Epoch: 24114, Training Loss: 0.00900
Epoch: 24114, Training Loss: 0.00941
Epoch: 24114, Training Loss: 0.00942
Epoch: 24114, Training Loss: 0.01159
Epoch: 24115, Training Loss: 0.00900
Epoch: 24115, Training Loss: 0.00941
Epoch: 24115, Training Loss: 0.00942
Epoch: 24115, Training Loss: 0.01159
Epoch: 24116, Training Loss: 0.00900
Epoch: 24116, Training Loss: 0.00941
Epoch: 24116, Training Loss: 0.00942
Epoch: 24116, Training Loss: 0.01159
Epoch: 24117, Training Loss: 0.00900
Epoch: 24117, Training Loss: 0.00941
Epoch: 24117, Training Loss: 0.00942
Epoch: 24117, Training Loss: 0.01159
Epoch: 24118, Training Loss: 0.00900
Epoch: 24118, Training Loss: 0.00941
Epoch: 24118, Training Loss: 0.00942
Epoch: 24118, Training Loss: 0.01159
Epoch: 24119, Training Loss: 0.00900
Epoch: 24119, Training Loss: 0.00941
Epoch: 24119, Training Loss: 0.00942
Epoch: 24119, Training Loss: 0.01159
Epoch: 24120, Training Loss: 0.00900
Epoch: 24120, Training Loss: 0.00941
Epoch: 24120, Training Loss: 0.00942
Epoch: 24120, Training Loss: 0.01159
Epoch: 24121, Training Loss: 0.00900
Epoch: 24121, Training Loss: 0.00941
Epoch: 24121, Training Loss: 0.00942
Epoch: 24121, Training Loss: 0.01159
Epoch: 24122, Training Loss: 0.00900
Epoch: 24122, Training Loss: 0.00941
Epoch: 24122, Training Loss: 0.00942
Epoch: 24122, Training Loss: 0.01159
Epoch: 24123, Training Loss: 0.00900
Epoch: 24123, Training Loss: 0.00941
Epoch: 24123, Training Loss: 0.00942
Epoch: 24123, Training Loss: 0.01159
Epoch: 24124, Training Loss: 0.00900
Epoch: 24124, Training Loss: 0.00941
Epoch: 24124, Training Loss: 0.00942
Epoch: 24124, Training Loss: 0.01159
Epoch: 24125, Training Loss: 0.00900
Epoch: 24125, Training Loss: 0.00941
Epoch: 24125, Training Loss: 0.00942
Epoch: 24125, Training Loss: 0.01159
Epoch: 24126, Training Loss: 0.00900
Epoch: 24126, Training Loss: 0.00941
Epoch: 24126, Training Loss: 0.00942
Epoch: 24126, Training Loss: 0.01159
Epoch: 24127, Training Loss: 0.00900
Epoch: 24127, Training Loss: 0.00941
Epoch: 24127, Training Loss: 0.00942
Epoch: 24127, Training Loss: 0.01159
Epoch: 24128, Training Loss: 0.00900
Epoch: 24128, Training Loss: 0.00941
Epoch: 24128, Training Loss: 0.00942
Epoch: 24128, Training Loss: 0.01159
Epoch: 24129, Training Loss: 0.00900
Epoch: 24129, Training Loss: 0.00941
Epoch: 24129, Training Loss: 0.00942
Epoch: 24129, Training Loss: 0.01159
Epoch: 24130, Training Loss: 0.00900
Epoch: 24130, Training Loss: 0.00941
Epoch: 24130, Training Loss: 0.00942
Epoch: 24130, Training Loss: 0.01159
Epoch: 24131, Training Loss: 0.00900
Epoch: 24131, Training Loss: 0.00941
Epoch: 24131, Training Loss: 0.00942
Epoch: 24131, Training Loss: 0.01159
Epoch: 24132, Training Loss: 0.00900
Epoch: 24132, Training Loss: 0.00941
Epoch: 24132, Training Loss: 0.00942
Epoch: 24132, Training Loss: 0.01159
Epoch: 24133, Training Loss: 0.00900
Epoch: 24133, Training Loss: 0.00941
Epoch: 24133, Training Loss: 0.00942
Epoch: 24133, Training Loss: 0.01159
Epoch: 24134, Training Loss: 0.00900
Epoch: 24134, Training Loss: 0.00941
Epoch: 24134, Training Loss: 0.00942
Epoch: 24134, Training Loss: 0.01159
Epoch: 24135, Training Loss: 0.00900
Epoch: 24135, Training Loss: 0.00941
Epoch: 24135, Training Loss: 0.00942
Epoch: 24135, Training Loss: 0.01159
Epoch: 24136, Training Loss: 0.00900
Epoch: 24136, Training Loss: 0.00940
Epoch: 24136, Training Loss: 0.00942
Epoch: 24136, Training Loss: 0.01159
Epoch: 24137, Training Loss: 0.00900
Epoch: 24137, Training Loss: 0.00940
Epoch: 24137, Training Loss: 0.00942
Epoch: 24137, Training Loss: 0.01159
Epoch: 24138, Training Loss: 0.00900
Epoch: 24138, Training Loss: 0.00940
Epoch: 24138, Training Loss: 0.00942
Epoch: 24138, Training Loss: 0.01159
Epoch: 24139, Training Loss: 0.00900
Epoch: 24139, Training Loss: 0.00940
Epoch: 24139, Training Loss: 0.00942
Epoch: 24139, Training Loss: 0.01158
Epoch: 24140, Training Loss: 0.00900
Epoch: 24140, Training Loss: 0.00940
Epoch: 24140, Training Loss: 0.00942
Epoch: 24140, Training Loss: 0.01158
Epoch: 24141, Training Loss: 0.00900
Epoch: 24141, Training Loss: 0.00940
Epoch: 24141, Training Loss: 0.00942
Epoch: 24141, Training Loss: 0.01158
Epoch: 24142, Training Loss: 0.00900
Epoch: 24142, Training Loss: 0.00940
Epoch: 24142, Training Loss: 0.00942
Epoch: 24142, Training Loss: 0.01158
Epoch: 24143, Training Loss: 0.00900
Epoch: 24143, Training Loss: 0.00940
Epoch: 24143, Training Loss: 0.00942
Epoch: 24143, Training Loss: 0.01158
Epoch: 24144, Training Loss: 0.00900
Epoch: 24144, Training Loss: 0.00940
Epoch: 24144, Training Loss: 0.00942
Epoch: 24144, Training Loss: 0.01158
Epoch: 24145, Training Loss: 0.00900
Epoch: 24145, Training Loss: 0.00940
Epoch: 24145, Training Loss: 0.00941
Epoch: 24145, Training Loss: 0.01158
Epoch: 24146, Training Loss: 0.00900
Epoch: 24146, Training Loss: 0.00940
Epoch: 24146, Training Loss: 0.00941
Epoch: 24146, Training Loss: 0.01158
Epoch: 24147, Training Loss: 0.00900
Epoch: 24147, Training Loss: 0.00940
Epoch: 24147, Training Loss: 0.00941
Epoch: 24147, Training Loss: 0.01158
Epoch: 24148, Training Loss: 0.00900
Epoch: 24148, Training Loss: 0.00940
Epoch: 24148, Training Loss: 0.00941
Epoch: 24148, Training Loss: 0.01158
Epoch: 24149, Training Loss: 0.00900
Epoch: 24149, Training Loss: 0.00940
Epoch: 24149, Training Loss: 0.00941
Epoch: 24149, Training Loss: 0.01158
Epoch: 24150, Training Loss: 0.00900
Epoch: 24150, Training Loss: 0.00940
Epoch: 24150, Training Loss: 0.00941
Epoch: 24150, Training Loss: 0.01158
Epoch: 24151, Training Loss: 0.00900
Epoch: 24151, Training Loss: 0.00940
Epoch: 24151, Training Loss: 0.00941
Epoch: 24151, Training Loss: 0.01158
Epoch: 24152, Training Loss: 0.00900
Epoch: 24152, Training Loss: 0.00940
Epoch: 24152, Training Loss: 0.00941
Epoch: 24152, Training Loss: 0.01158
Epoch: 24153, Training Loss: 0.00899
Epoch: 24153, Training Loss: 0.00940
Epoch: 24153, Training Loss: 0.00941
Epoch: 24153, Training Loss: 0.01158
Epoch: 24154, Training Loss: 0.00899
Epoch: 24154, Training Loss: 0.00940
Epoch: 24154, Training Loss: 0.00941
Epoch: 24154, Training Loss: 0.01158
Epoch: 24155, Training Loss: 0.00899
Epoch: 24155, Training Loss: 0.00940
Epoch: 24155, Training Loss: 0.00941
Epoch: 24155, Training Loss: 0.01158
Epoch: 24156, Training Loss: 0.00899
Epoch: 24156, Training Loss: 0.00940
Epoch: 24156, Training Loss: 0.00941
Epoch: 24156, Training Loss: 0.01158
Epoch: 24157, Training Loss: 0.00899
Epoch: 24157, Training Loss: 0.00940
Epoch: 24157, Training Loss: 0.00941
Epoch: 24157, Training Loss: 0.01158
Epoch: 24158, Training Loss: 0.00899
Epoch: 24158, Training Loss: 0.00940
Epoch: 24158, Training Loss: 0.00941
Epoch: 24158, Training Loss: 0.01158
Epoch: 24159, Training Loss: 0.00899
Epoch: 24159, Training Loss: 0.00940
Epoch: 24159, Training Loss: 0.00941
Epoch: 24159, Training Loss: 0.01158
Epoch: 24160, Training Loss: 0.00899
Epoch: 24160, Training Loss: 0.00940
Epoch: 24160, Training Loss: 0.00941
Epoch: 24160, Training Loss: 0.01158
Epoch: 24161, Training Loss: 0.00899
Epoch: 24161, Training Loss: 0.00940
Epoch: 24161, Training Loss: 0.00941
Epoch: 24161, Training Loss: 0.01158
Epoch: 24162, Training Loss: 0.00899
Epoch: 24162, Training Loss: 0.00940
Epoch: 24162, Training Loss: 0.00941
Epoch: 24162, Training Loss: 0.01158
Epoch: 24163, Training Loss: 0.00899
Epoch: 24163, Training Loss: 0.00940
Epoch: 24163, Training Loss: 0.00941
Epoch: 24163, Training Loss: 0.01158
Epoch: 24164, Training Loss: 0.00899
Epoch: 24164, Training Loss: 0.00940
Epoch: 24164, Training Loss: 0.00941
Epoch: 24164, Training Loss: 0.01158
Epoch: 24165, Training Loss: 0.00899
Epoch: 24165, Training Loss: 0.00940
Epoch: 24165, Training Loss: 0.00941
Epoch: 24165, Training Loss: 0.01158
Epoch: 24166, Training Loss: 0.00899
Epoch: 24166, Training Loss: 0.00940
Epoch: 24166, Training Loss: 0.00941
Epoch: 24166, Training Loss: 0.01158
Epoch: 24167, Training Loss: 0.00899
Epoch: 24167, Training Loss: 0.00940
Epoch: 24167, Training Loss: 0.00941
Epoch: 24167, Training Loss: 0.01158
Epoch: 24168, Training Loss: 0.00899
Epoch: 24168, Training Loss: 0.00940
Epoch: 24168, Training Loss: 0.00941
Epoch: 24168, Training Loss: 0.01158
Epoch: 24169, Training Loss: 0.00899
Epoch: 24169, Training Loss: 0.00940
Epoch: 24169, Training Loss: 0.00941
Epoch: 24169, Training Loss: 0.01158
Epoch: 24170, Training Loss: 0.00899
Epoch: 24170, Training Loss: 0.00940
Epoch: 24170, Training Loss: 0.00941
Epoch: 24170, Training Loss: 0.01158
Epoch: 24171, Training Loss: 0.00899
Epoch: 24171, Training Loss: 0.00940
Epoch: 24171, Training Loss: 0.00941
Epoch: 24171, Training Loss: 0.01158
Epoch: 24172, Training Loss: 0.00899
Epoch: 24172, Training Loss: 0.00940
Epoch: 24172, Training Loss: 0.00941
Epoch: 24172, Training Loss: 0.01158
Epoch: 24173, Training Loss: 0.00899
Epoch: 24173, Training Loss: 0.00940
Epoch: 24173, Training Loss: 0.00941
Epoch: 24173, Training Loss: 0.01158
Epoch: 24174, Training Loss: 0.00899
Epoch: 24174, Training Loss: 0.00940
Epoch: 24174, Training Loss: 0.00941
Epoch: 24174, Training Loss: 0.01157
Epoch: 24175, Training Loss: 0.00899
Epoch: 24175, Training Loss: 0.00940
Epoch: 24175, Training Loss: 0.00941
Epoch: 24175, Training Loss: 0.01157
Epoch: 24176, Training Loss: 0.00899
Epoch: 24176, Training Loss: 0.00940
Epoch: 24176, Training Loss: 0.00941
Epoch: 24176, Training Loss: 0.01157
Epoch: 24177, Training Loss: 0.00899
Epoch: 24177, Training Loss: 0.00940
Epoch: 24177, Training Loss: 0.00941
Epoch: 24177, Training Loss: 0.01157
Epoch: 24178, Training Loss: 0.00899
Epoch: 24178, Training Loss: 0.00940
Epoch: 24178, Training Loss: 0.00941
Epoch: 24178, Training Loss: 0.01157
Epoch: 24179, Training Loss: 0.00899
Epoch: 24179, Training Loss: 0.00940
Epoch: 24179, Training Loss: 0.00941
Epoch: 24179, Training Loss: 0.01157
Epoch: 24180, Training Loss: 0.00899
Epoch: 24180, Training Loss: 0.00940
Epoch: 24180, Training Loss: 0.00941
Epoch: 24180, Training Loss: 0.01157
Epoch: 24181, Training Loss: 0.00899
Epoch: 24181, Training Loss: 0.00939
Epoch: 24181, Training Loss: 0.00941
Epoch: 24181, Training Loss: 0.01157
Epoch: 24182, Training Loss: 0.00899
Epoch: 24182, Training Loss: 0.00939
Epoch: 24182, Training Loss: 0.00941
Epoch: 24182, Training Loss: 0.01157
Epoch: 24183, Training Loss: 0.00899
Epoch: 24183, Training Loss: 0.00939
Epoch: 24183, Training Loss: 0.00941
Epoch: 24183, Training Loss: 0.01157
Epoch: 24184, Training Loss: 0.00899
Epoch: 24184, Training Loss: 0.00939
Epoch: 24184, Training Loss: 0.00941
Epoch: 24184, Training Loss: 0.01157
Epoch: 24185, Training Loss: 0.00899
Epoch: 24185, Training Loss: 0.00939
Epoch: 24185, Training Loss: 0.00941
Epoch: 24185, Training Loss: 0.01157
Epoch: 24186, Training Loss: 0.00899
Epoch: 24186, Training Loss: 0.00939
Epoch: 24186, Training Loss: 0.00941
Epoch: 24186, Training Loss: 0.01157
Epoch: 24187, Training Loss: 0.00899
Epoch: 24187, Training Loss: 0.00939
Epoch: 24187, Training Loss: 0.00941
Epoch: 24187, Training Loss: 0.01157
Epoch: 24188, Training Loss: 0.00899
Epoch: 24188, Training Loss: 0.00939
Epoch: 24188, Training Loss: 0.00941
Epoch: 24188, Training Loss: 0.01157
Epoch: 24189, Training Loss: 0.00899
Epoch: 24189, Training Loss: 0.00939
Epoch: 24189, Training Loss: 0.00940
Epoch: 24189, Training Loss: 0.01157
Epoch: 24190, Training Loss: 0.00899
Epoch: 24190, Training Loss: 0.00939
Epoch: 24190, Training Loss: 0.00940
Epoch: 24190, Training Loss: 0.01157
Epoch: 24191, Training Loss: 0.00899
Epoch: 24191, Training Loss: 0.00939
Epoch: 24191, Training Loss: 0.00940
Epoch: 24191, Training Loss: 0.01157
Epoch: 24192, Training Loss: 0.00899
Epoch: 24192, Training Loss: 0.00939
Epoch: 24192, Training Loss: 0.00940
Epoch: 24192, Training Loss: 0.01157
Epoch: 24193, Training Loss: 0.00899
Epoch: 24193, Training Loss: 0.00939
Epoch: 24193, Training Loss: 0.00940
Epoch: 24193, Training Loss: 0.01157
Epoch: 24194, Training Loss: 0.00899
Epoch: 24194, Training Loss: 0.00939
Epoch: 24194, Training Loss: 0.00940
Epoch: 24194, Training Loss: 0.01157
Epoch: 24195, Training Loss: 0.00899
Epoch: 24195, Training Loss: 0.00939
Epoch: 24195, Training Loss: 0.00940
Epoch: 24195, Training Loss: 0.01157
Epoch: 24196, Training Loss: 0.00899
Epoch: 24196, Training Loss: 0.00939
Epoch: 24196, Training Loss: 0.00940
Epoch: 24196, Training Loss: 0.01157
Epoch: 24197, Training Loss: 0.00899
Epoch: 24197, Training Loss: 0.00939
Epoch: 24197, Training Loss: 0.00940
Epoch: 24197, Training Loss: 0.01157
Epoch: 24198, Training Loss: 0.00899
Epoch: 24198, Training Loss: 0.00939
Epoch: 24198, Training Loss: 0.00940
Epoch: 24198, Training Loss: 0.01157
Epoch: 24199, Training Loss: 0.00899
Epoch: 24199, Training Loss: 0.00939
Epoch: 24199, Training Loss: 0.00940
Epoch: 24199, Training Loss: 0.01157
Epoch: 24200, Training Loss: 0.00899
Epoch: 24200, Training Loss: 0.00939
Epoch: 24200, Training Loss: 0.00940
Epoch: 24200, Training Loss: 0.01157
Epoch: 24201, Training Loss: 0.00898
Epoch: 24201, Training Loss: 0.00939
Epoch: 24201, Training Loss: 0.00940
Epoch: 24201, Training Loss: 0.01157
Epoch: 24202, Training Loss: 0.00898
Epoch: 24202, Training Loss: 0.00939
Epoch: 24202, Training Loss: 0.00940
Epoch: 24202, Training Loss: 0.01157
Epoch: 24203, Training Loss: 0.00898
Epoch: 24203, Training Loss: 0.00939
Epoch: 24203, Training Loss: 0.00940
Epoch: 24203, Training Loss: 0.01157
Epoch: 24204, Training Loss: 0.00898
Epoch: 24204, Training Loss: 0.00939
Epoch: 24204, Training Loss: 0.00940
Epoch: 24204, Training Loss: 0.01157
Epoch: 24205, Training Loss: 0.00898
Epoch: 24205, Training Loss: 0.00939
Epoch: 24205, Training Loss: 0.00940
Epoch: 24205, Training Loss: 0.01157
Epoch: 24206, Training Loss: 0.00898
Epoch: 24206, Training Loss: 0.00939
Epoch: 24206, Training Loss: 0.00940
Epoch: 24206, Training Loss: 0.01157
Epoch: 24207, Training Loss: 0.00898
Epoch: 24207, Training Loss: 0.00939
Epoch: 24207, Training Loss: 0.00940
Epoch: 24207, Training Loss: 0.01157
Epoch: 24208, Training Loss: 0.00898
Epoch: 24208, Training Loss: 0.00939
Epoch: 24208, Training Loss: 0.00940
Epoch: 24208, Training Loss: 0.01157
Epoch: 24209, Training Loss: 0.00898
Epoch: 24209, Training Loss: 0.00939
Epoch: 24209, Training Loss: 0.00940
Epoch: 24209, Training Loss: 0.01157
Epoch: 24210, Training Loss: 0.00898
Epoch: 24210, Training Loss: 0.00939
Epoch: 24210, Training Loss: 0.00940
Epoch: 24210, Training Loss: 0.01156
Epoch: 24211, Training Loss: 0.00898
Epoch: 24211, Training Loss: 0.00939
Epoch: 24211, Training Loss: 0.00940
Epoch: 24211, Training Loss: 0.01156
Epoch: 24212, Training Loss: 0.00898
Epoch: 24212, Training Loss: 0.00939
Epoch: 24212, Training Loss: 0.00940
Epoch: 24212, Training Loss: 0.01156
Epoch: 24213, Training Loss: 0.00898
Epoch: 24213, Training Loss: 0.00939
Epoch: 24213, Training Loss: 0.00940
Epoch: 24213, Training Loss: 0.01156
Epoch: 24214, Training Loss: 0.00898
Epoch: 24214, Training Loss: 0.00939
Epoch: 24214, Training Loss: 0.00940
Epoch: 24214, Training Loss: 0.01156
Epoch: 24215, Training Loss: 0.00898
Epoch: 24215, Training Loss: 0.00939
Epoch: 24215, Training Loss: 0.00940
Epoch: 24215, Training Loss: 0.01156
Epoch: 24216, Training Loss: 0.00898
Epoch: 24216, Training Loss: 0.00939
Epoch: 24216, Training Loss: 0.00940
Epoch: 24216, Training Loss: 0.01156
Epoch: 24217, Training Loss: 0.00898
Epoch: 24217, Training Loss: 0.00939
Epoch: 24217, Training Loss: 0.00940
Epoch: 24217, Training Loss: 0.01156
Epoch: 24218, Training Loss: 0.00898
Epoch: 24218, Training Loss: 0.00939
Epoch: 24218, Training Loss: 0.00940
Epoch: 24218, Training Loss: 0.01156
Epoch: 24219, Training Loss: 0.00898
Epoch: 24219, Training Loss: 0.00939
Epoch: 24219, Training Loss: 0.00940
Epoch: 24219, Training Loss: 0.01156
Epoch: 24220, Training Loss: 0.00898
Epoch: 24220, Training Loss: 0.00939
Epoch: 24220, Training Loss: 0.00940
Epoch: 24220, Training Loss: 0.01156
Epoch: 24221, Training Loss: 0.00898
Epoch: 24221, Training Loss: 0.00939
Epoch: 24221, Training Loss: 0.00940
Epoch: 24221, Training Loss: 0.01156
Epoch: 24222, Training Loss: 0.00898
Epoch: 24222, Training Loss: 0.00939
Epoch: 24222, Training Loss: 0.00940
Epoch: 24222, Training Loss: 0.01156
Epoch: 24223, Training Loss: 0.00898
Epoch: 24223, Training Loss: 0.00939
Epoch: 24223, Training Loss: 0.00940
Epoch: 24223, Training Loss: 0.01156
Epoch: 24224, Training Loss: 0.00898
Epoch: 24224, Training Loss: 0.00939
Epoch: 24224, Training Loss: 0.00940
Epoch: 24224, Training Loss: 0.01156
Epoch: 24225, Training Loss: 0.00898
Epoch: 24225, Training Loss: 0.00939
Epoch: 24225, Training Loss: 0.00940
Epoch: 24225, Training Loss: 0.01156
Epoch: 24226, Training Loss: 0.00898
Epoch: 24226, Training Loss: 0.00938
Epoch: 24226, Training Loss: 0.00940
Epoch: 24226, Training Loss: 0.01156
Epoch: 24227, Training Loss: 0.00898
Epoch: 24227, Training Loss: 0.00938
Epoch: 24227, Training Loss: 0.00940
Epoch: 24227, Training Loss: 0.01156
Epoch: 24228, Training Loss: 0.00898
Epoch: 24228, Training Loss: 0.00938
Epoch: 24228, Training Loss: 0.00940
Epoch: 24228, Training Loss: 0.01156
Epoch: 24229, Training Loss: 0.00898
Epoch: 24229, Training Loss: 0.00938
Epoch: 24229, Training Loss: 0.00940
Epoch: 24229, Training Loss: 0.01156
Epoch: 24230, Training Loss: 0.00898
Epoch: 24230, Training Loss: 0.00938
Epoch: 24230, Training Loss: 0.00940
Epoch: 24230, Training Loss: 0.01156
Epoch: 24231, Training Loss: 0.00898
Epoch: 24231, Training Loss: 0.00938
Epoch: 24231, Training Loss: 0.00940
Epoch: 24231, Training Loss: 0.01156
Epoch: 24232, Training Loss: 0.00898
Epoch: 24232, Training Loss: 0.00938
Epoch: 24232, Training Loss: 0.00940
Epoch: 24232, Training Loss: 0.01156
Epoch: 24233, Training Loss: 0.00898
Epoch: 24233, Training Loss: 0.00938
Epoch: 24233, Training Loss: 0.00940
Epoch: 24233, Training Loss: 0.01156
Epoch: 24234, Training Loss: 0.00898
Epoch: 24234, Training Loss: 0.00938
Epoch: 24234, Training Loss: 0.00939
Epoch: 24234, Training Loss: 0.01156
Epoch: 24235, Training Loss: 0.00898
Epoch: 24235, Training Loss: 0.00938
Epoch: 24235, Training Loss: 0.00939
Epoch: 24235, Training Loss: 0.01156
Epoch: 24236, Training Loss: 0.00898
Epoch: 24236, Training Loss: 0.00938
Epoch: 24236, Training Loss: 0.00939
Epoch: 24236, Training Loss: 0.01156
Epoch: 24237, Training Loss: 0.00898
Epoch: 24237, Training Loss: 0.00938
Epoch: 24237, Training Loss: 0.00939
Epoch: 24237, Training Loss: 0.01156
Epoch: 24238, Training Loss: 0.00898
Epoch: 24238, Training Loss: 0.00938
Epoch: 24238, Training Loss: 0.00939
Epoch: 24238, Training Loss: 0.01156
Epoch: 24239, Training Loss: 0.00898
Epoch: 24239, Training Loss: 0.00938
Epoch: 24239, Training Loss: 0.00939
Epoch: 24239, Training Loss: 0.01156
Epoch: 24240, Training Loss: 0.00898
Epoch: 24240, Training Loss: 0.00938
Epoch: 24240, Training Loss: 0.00939
Epoch: 24240, Training Loss: 0.01156
Epoch: 24241, Training Loss: 0.00898
Epoch: 24241, Training Loss: 0.00938
Epoch: 24241, Training Loss: 0.00939
Epoch: 24241, Training Loss: 0.01156
Epoch: 24242, Training Loss: 0.00898
Epoch: 24242, Training Loss: 0.00938
Epoch: 24242, Training Loss: 0.00939
Epoch: 24242, Training Loss: 0.01156
Epoch: 24243, Training Loss: 0.00898
Epoch: 24243, Training Loss: 0.00938
Epoch: 24243, Training Loss: 0.00939
Epoch: 24243, Training Loss: 0.01156
Epoch: 24244, Training Loss: 0.00898
Epoch: 24244, Training Loss: 0.00938
Epoch: 24244, Training Loss: 0.00939
Epoch: 24244, Training Loss: 0.01156
Epoch: 24245, Training Loss: 0.00898
Epoch: 24245, Training Loss: 0.00938
Epoch: 24245, Training Loss: 0.00939
Epoch: 24245, Training Loss: 0.01156
Epoch: 24246, Training Loss: 0.00898
Epoch: 24246, Training Loss: 0.00938
Epoch: 24246, Training Loss: 0.00939
Epoch: 24246, Training Loss: 0.01155
Epoch: 24247, Training Loss: 0.00898
Epoch: 24247, Training Loss: 0.00938
Epoch: 24247, Training Loss: 0.00939
Epoch: 24247, Training Loss: 0.01155
Epoch: 24248, Training Loss: 0.00898
Epoch: 24248, Training Loss: 0.00938
Epoch: 24248, Training Loss: 0.00939
Epoch: 24248, Training Loss: 0.01155
Epoch: 24249, Training Loss: 0.00897
Epoch: 24249, Training Loss: 0.00938
Epoch: 24249, Training Loss: 0.00939
Epoch: 24249, Training Loss: 0.01155
Epoch: 24250, Training Loss: 0.00897
Epoch: 24250, Training Loss: 0.00938
Epoch: 24250, Training Loss: 0.00939
Epoch: 24250, Training Loss: 0.01155
Epoch: 24251, Training Loss: 0.00897
Epoch: 24251, Training Loss: 0.00938
Epoch: 24251, Training Loss: 0.00939
Epoch: 24251, Training Loss: 0.01155
Epoch: 24252, Training Loss: 0.00897
Epoch: 24252, Training Loss: 0.00938
Epoch: 24252, Training Loss: 0.00939
Epoch: 24252, Training Loss: 0.01155
Epoch: 24253, Training Loss: 0.00897
Epoch: 24253, Training Loss: 0.00938
Epoch: 24253, Training Loss: 0.00939
Epoch: 24253, Training Loss: 0.01155
Epoch: 24254, Training Loss: 0.00897
Epoch: 24254, Training Loss: 0.00938
Epoch: 24254, Training Loss: 0.00939
Epoch: 24254, Training Loss: 0.01155
Epoch: 24255, Training Loss: 0.00897
Epoch: 24255, Training Loss: 0.00938
Epoch: 24255, Training Loss: 0.00939
Epoch: 24255, Training Loss: 0.01155
Epoch: 24256, Training Loss: 0.00897
Epoch: 24256, Training Loss: 0.00938
Epoch: 24256, Training Loss: 0.00939
Epoch: 24256, Training Loss: 0.01155
Epoch: 24257, Training Loss: 0.00897
Epoch: 24257, Training Loss: 0.00938
Epoch: 24257, Training Loss: 0.00939
Epoch: 24257, Training Loss: 0.01155
Epoch: 24258, Training Loss: 0.00897
Epoch: 24258, Training Loss: 0.00938
Epoch: 24258, Training Loss: 0.00939
Epoch: 24258, Training Loss: 0.01155
Epoch: 24259, Training Loss: 0.00897
Epoch: 24259, Training Loss: 0.00938
Epoch: 24259, Training Loss: 0.00939
Epoch: 24259, Training Loss: 0.01155
Epoch: 24260, Training Loss: 0.00897
Epoch: 24260, Training Loss: 0.00938
Epoch: 24260, Training Loss: 0.00939
Epoch: 24260, Training Loss: 0.01155
Epoch: 24261, Training Loss: 0.00897
Epoch: 24261, Training Loss: 0.00938
Epoch: 24261, Training Loss: 0.00939
Epoch: 24261, Training Loss: 0.01155
Epoch: 24262, Training Loss: 0.00897
Epoch: 24262, Training Loss: 0.00938
Epoch: 24262, Training Loss: 0.00939
Epoch: 24262, Training Loss: 0.01155
Epoch: 24263, Training Loss: 0.00897
Epoch: 24263, Training Loss: 0.00938
Epoch: 24263, Training Loss: 0.00939
Epoch: 24263, Training Loss: 0.01155
Epoch: 24264, Training Loss: 0.00897
Epoch: 24264, Training Loss: 0.00938
Epoch: 24264, Training Loss: 0.00939
Epoch: 24264, Training Loss: 0.01155
Epoch: 24265, Training Loss: 0.00897
Epoch: 24265, Training Loss: 0.00938
Epoch: 24265, Training Loss: 0.00939
Epoch: 24265, Training Loss: 0.01155
Epoch: 24266, Training Loss: 0.00897
Epoch: 24266, Training Loss: 0.00938
Epoch: 24266, Training Loss: 0.00939
Epoch: 24266, Training Loss: 0.01155
Epoch: 24267, Training Loss: 0.00897
Epoch: 24267, Training Loss: 0.00938
Epoch: 24267, Training Loss: 0.00939
Epoch: 24267, Training Loss: 0.01155
Epoch: 24268, Training Loss: 0.00897
Epoch: 24268, Training Loss: 0.00938
Epoch: 24268, Training Loss: 0.00939
Epoch: 24268, Training Loss: 0.01155
Epoch: 24269, Training Loss: 0.00897
Epoch: 24269, Training Loss: 0.00938
Epoch: 24269, Training Loss: 0.00939
Epoch: 24269, Training Loss: 0.01155
Epoch: 24270, Training Loss: 0.00897
Epoch: 24270, Training Loss: 0.00938
Epoch: 24270, Training Loss: 0.00939
Epoch: 24270, Training Loss: 0.01155
Epoch: 24271, Training Loss: 0.00897
Epoch: 24271, Training Loss: 0.00937
Epoch: 24271, Training Loss: 0.00939
Epoch: 24271, Training Loss: 0.01155
Epoch: 24272, Training Loss: 0.00897
Epoch: 24272, Training Loss: 0.00937
Epoch: 24272, Training Loss: 0.00939
Epoch: 24272, Training Loss: 0.01155
Epoch: 24273, Training Loss: 0.00897
Epoch: 24273, Training Loss: 0.00937
Epoch: 24273, Training Loss: 0.00939
Epoch: 24273, Training Loss: 0.01155
Epoch: 24274, Training Loss: 0.00897
Epoch: 24274, Training Loss: 0.00937
Epoch: 24274, Training Loss: 0.00939
Epoch: 24274, Training Loss: 0.01155
Epoch: 24275, Training Loss: 0.00897
Epoch: 24275, Training Loss: 0.00937
Epoch: 24275, Training Loss: 0.00939
Epoch: 24275, Training Loss: 0.01155
Epoch: 24276, Training Loss: 0.00897
Epoch: 24276, Training Loss: 0.00937
Epoch: 24276, Training Loss: 0.00939
Epoch: 24276, Training Loss: 0.01155
Epoch: 24277, Training Loss: 0.00897
Epoch: 24277, Training Loss: 0.00937
Epoch: 24277, Training Loss: 0.00939
Epoch: 24277, Training Loss: 0.01155
Epoch: 24278, Training Loss: 0.00897
Epoch: 24278, Training Loss: 0.00937
Epoch: 24278, Training Loss: 0.00939
Epoch: 24278, Training Loss: 0.01155
Epoch: 24279, Training Loss: 0.00897
Epoch: 24279, Training Loss: 0.00937
Epoch: 24279, Training Loss: 0.00938
Epoch: 24279, Training Loss: 0.01155
Epoch: 24280, Training Loss: 0.00897
Epoch: 24280, Training Loss: 0.00937
Epoch: 24280, Training Loss: 0.00938
Epoch: 24280, Training Loss: 0.01155
Epoch: 24281, Training Loss: 0.00897
Epoch: 24281, Training Loss: 0.00937
Epoch: 24281, Training Loss: 0.00938
Epoch: 24281, Training Loss: 0.01155
Epoch: 24282, Training Loss: 0.00897
Epoch: 24282, Training Loss: 0.00937
Epoch: 24282, Training Loss: 0.00938
Epoch: 24282, Training Loss: 0.01154
Epoch: 24283, Training Loss: 0.00897
Epoch: 24283, Training Loss: 0.00937
Epoch: 24283, Training Loss: 0.00938
Epoch: 24283, Training Loss: 0.01154
Epoch: 24284, Training Loss: 0.00897
Epoch: 24284, Training Loss: 0.00937
Epoch: 24284, Training Loss: 0.00938
Epoch: 24284, Training Loss: 0.01154
Epoch: 24285, Training Loss: 0.00897
Epoch: 24285, Training Loss: 0.00937
Epoch: 24285, Training Loss: 0.00938
Epoch: 24285, Training Loss: 0.01154
Epoch: 24286, Training Loss: 0.00897
Epoch: 24286, Training Loss: 0.00937
Epoch: 24286, Training Loss: 0.00938
Epoch: 24286, Training Loss: 0.01154
Epoch: 24287, Training Loss: 0.00897
Epoch: 24287, Training Loss: 0.00937
Epoch: 24287, Training Loss: 0.00938
Epoch: 24287, Training Loss: 0.01154
Epoch: 24288, Training Loss: 0.00897
Epoch: 24288, Training Loss: 0.00937
Epoch: 24288, Training Loss: 0.00938
Epoch: 24288, Training Loss: 0.01154
Epoch: 24289, Training Loss: 0.00897
Epoch: 24289, Training Loss: 0.00937
Epoch: 24289, Training Loss: 0.00938
Epoch: 24289, Training Loss: 0.01154
Epoch: 24290, Training Loss: 0.00897
Epoch: 24290, Training Loss: 0.00937
Epoch: 24290, Training Loss: 0.00938
Epoch: 24290, Training Loss: 0.01154
Epoch: 24291, Training Loss: 0.00897
Epoch: 24291, Training Loss: 0.00937
Epoch: 24291, Training Loss: 0.00938
Epoch: 24291, Training Loss: 0.01154
Epoch: 24292, Training Loss: 0.00897
Epoch: 24292, Training Loss: 0.00937
Epoch: 24292, Training Loss: 0.00938
Epoch: 24292, Training Loss: 0.01154
Epoch: 24293, Training Loss: 0.00897
Epoch: 24293, Training Loss: 0.00937
Epoch: 24293, Training Loss: 0.00938
Epoch: 24293, Training Loss: 0.01154
Epoch: 24294, Training Loss: 0.00897
Epoch: 24294, Training Loss: 0.00937
Epoch: 24294, Training Loss: 0.00938
Epoch: 24294, Training Loss: 0.01154
Epoch: 24295, Training Loss: 0.00897
Epoch: 24295, Training Loss: 0.00937
Epoch: 24295, Training Loss: 0.00938
Epoch: 24295, Training Loss: 0.01154
Epoch: 24296, Training Loss: 0.00897
Epoch: 24296, Training Loss: 0.00937
Epoch: 24296, Training Loss: 0.00938
Epoch: 24296, Training Loss: 0.01154
Epoch: 24297, Training Loss: 0.00896
Epoch: 24297, Training Loss: 0.00937
Epoch: 24297, Training Loss: 0.00938
Epoch: 24297, Training Loss: 0.01154
Epoch: 24298, Training Loss: 0.00896
Epoch: 24298, Training Loss: 0.00937
Epoch: 24298, Training Loss: 0.00938
Epoch: 24298, Training Loss: 0.01154
Epoch: 24299, Training Loss: 0.00896
Epoch: 24299, Training Loss: 0.00937
Epoch: 24299, Training Loss: 0.00938
Epoch: 24299, Training Loss: 0.01154
Epoch: 24300, Training Loss: 0.00896
Epoch: 24300, Training Loss: 0.00937
Epoch: 24300, Training Loss: 0.00938
Epoch: 24300, Training Loss: 0.01154
Epoch: 24301, Training Loss: 0.00896
Epoch: 24301, Training Loss: 0.00937
Epoch: 24301, Training Loss: 0.00938
Epoch: 24301, Training Loss: 0.01154
Epoch: 24302, Training Loss: 0.00896
Epoch: 24302, Training Loss: 0.00937
Epoch: 24302, Training Loss: 0.00938
Epoch: 24302, Training Loss: 0.01154
Epoch: 24303, Training Loss: 0.00896
Epoch: 24303, Training Loss: 0.00937
Epoch: 24303, Training Loss: 0.00938
Epoch: 24303, Training Loss: 0.01154
Epoch: 24304, Training Loss: 0.00896
Epoch: 24304, Training Loss: 0.00937
Epoch: 24304, Training Loss: 0.00938
Epoch: 24304, Training Loss: 0.01154
Epoch: 24305, Training Loss: 0.00896
Epoch: 24305, Training Loss: 0.00937
Epoch: 24305, Training Loss: 0.00938
Epoch: 24305, Training Loss: 0.01154
Epoch: 24306, Training Loss: 0.00896
Epoch: 24306, Training Loss: 0.00937
Epoch: 24306, Training Loss: 0.00938
Epoch: 24306, Training Loss: 0.01154
Epoch: 24307, Training Loss: 0.00896
Epoch: 24307, Training Loss: 0.00937
Epoch: 24307, Training Loss: 0.00938
Epoch: 24307, Training Loss: 0.01154
Epoch: 24308, Training Loss: 0.00896
Epoch: 24308, Training Loss: 0.00937
Epoch: 24308, Training Loss: 0.00938
Epoch: 24308, Training Loss: 0.01154
Epoch: 24309, Training Loss: 0.00896
Epoch: 24309, Training Loss: 0.00937
Epoch: 24309, Training Loss: 0.00938
Epoch: 24309, Training Loss: 0.01154
Epoch: 24310, Training Loss: 0.00896
Epoch: 24310, Training Loss: 0.00937
Epoch: 24310, Training Loss: 0.00938
Epoch: 24310, Training Loss: 0.01154
Epoch: 24311, Training Loss: 0.00896
Epoch: 24311, Training Loss: 0.00937
Epoch: 24311, Training Loss: 0.00938
Epoch: 24311, Training Loss: 0.01154
Epoch: 24312, Training Loss: 0.00896
Epoch: 24312, Training Loss: 0.00937
Epoch: 24312, Training Loss: 0.00938
Epoch: 24312, Training Loss: 0.01154
Epoch: 24313, Training Loss: 0.00896
Epoch: 24313, Training Loss: 0.00937
Epoch: 24313, Training Loss: 0.00938
Epoch: 24313, Training Loss: 0.01154
Epoch: 24314, Training Loss: 0.00896
Epoch: 24314, Training Loss: 0.00937
Epoch: 24314, Training Loss: 0.00938
Epoch: 24314, Training Loss: 0.01154
Epoch: 24315, Training Loss: 0.00896
Epoch: 24315, Training Loss: 0.00937
Epoch: 24315, Training Loss: 0.00938
Epoch: 24315, Training Loss: 0.01154
Epoch: 24316, Training Loss: 0.00896
Epoch: 24316, Training Loss: 0.00936
Epoch: 24316, Training Loss: 0.00938
Epoch: 24316, Training Loss: 0.01154
Epoch: 24317, Training Loss: 0.00896
Epoch: 24317, Training Loss: 0.00936
Epoch: 24317, Training Loss: 0.00938
Epoch: 24317, Training Loss: 0.01154
Epoch: 24318, Training Loss: 0.00896
Epoch: 24318, Training Loss: 0.00936
Epoch: 24318, Training Loss: 0.00938
Epoch: 24318, Training Loss: 0.01153
Epoch: 24319, Training Loss: 0.00896
Epoch: 24319, Training Loss: 0.00936
Epoch: 24319, Training Loss: 0.00938
Epoch: 24319, Training Loss: 0.01153
Epoch: 24320, Training Loss: 0.00896
Epoch: 24320, Training Loss: 0.00936
Epoch: 24320, Training Loss: 0.00938
Epoch: 24320, Training Loss: 0.01153
Epoch: 24321, Training Loss: 0.00896
Epoch: 24321, Training Loss: 0.00936
Epoch: 24321, Training Loss: 0.00938
Epoch: 24321, Training Loss: 0.01153
Epoch: 24322, Training Loss: 0.00896
Epoch: 24322, Training Loss: 0.00936
Epoch: 24322, Training Loss: 0.00938
Epoch: 24322, Training Loss: 0.01153
Epoch: 24323, Training Loss: 0.00896
Epoch: 24323, Training Loss: 0.00936
Epoch: 24323, Training Loss: 0.00938
Epoch: 24323, Training Loss: 0.01153
Epoch: 24324, Training Loss: 0.00896
Epoch: 24324, Training Loss: 0.00936
Epoch: 24324, Training Loss: 0.00937
Epoch: 24324, Training Loss: 0.01153
Epoch: 24325, Training Loss: 0.00896
Epoch: 24325, Training Loss: 0.00936
Epoch: 24325, Training Loss: 0.00937
Epoch: 24325, Training Loss: 0.01153
Epoch: 24326, Training Loss: 0.00896
Epoch: 24326, Training Loss: 0.00936
Epoch: 24326, Training Loss: 0.00937
Epoch: 24326, Training Loss: 0.01153
Epoch: 24327, Training Loss: 0.00896
Epoch: 24327, Training Loss: 0.00936
Epoch: 24327, Training Loss: 0.00937
Epoch: 24327, Training Loss: 0.01153
Epoch: 24328, Training Loss: 0.00896
Epoch: 24328, Training Loss: 0.00936
Epoch: 24328, Training Loss: 0.00937
Epoch: 24328, Training Loss: 0.01153
Epoch: 24329, Training Loss: 0.00896
Epoch: 24329, Training Loss: 0.00936
Epoch: 24329, Training Loss: 0.00937
Epoch: 24329, Training Loss: 0.01153
Epoch: 24330, Training Loss: 0.00896
Epoch: 24330, Training Loss: 0.00936
Epoch: 24330, Training Loss: 0.00937
Epoch: 24330, Training Loss: 0.01153
Epoch: 24331, Training Loss: 0.00896
Epoch: 24331, Training Loss: 0.00936
Epoch: 24331, Training Loss: 0.00937
Epoch: 24331, Training Loss: 0.01153
Epoch: 24332, Training Loss: 0.00896
Epoch: 24332, Training Loss: 0.00936
Epoch: 24332, Training Loss: 0.00937
Epoch: 24332, Training Loss: 0.01153
Epoch: 24333, Training Loss: 0.00896
Epoch: 24333, Training Loss: 0.00936
Epoch: 24333, Training Loss: 0.00937
Epoch: 24333, Training Loss: 0.01153
Epoch: 24334, Training Loss: 0.00896
Epoch: 24334, Training Loss: 0.00936
Epoch: 24334, Training Loss: 0.00937
Epoch: 24334, Training Loss: 0.01153
Epoch: 24335, Training Loss: 0.00896
Epoch: 24335, Training Loss: 0.00936
Epoch: 24335, Training Loss: 0.00937
Epoch: 24335, Training Loss: 0.01153
Epoch: 24336, Training Loss: 0.00896
Epoch: 24336, Training Loss: 0.00936
Epoch: 24336, Training Loss: 0.00937
Epoch: 24336, Training Loss: 0.01153
Epoch: 24337, Training Loss: 0.00896
Epoch: 24337, Training Loss: 0.00936
Epoch: 24337, Training Loss: 0.00937
Epoch: 24337, Training Loss: 0.01153
Epoch: 24338, Training Loss: 0.00896
Epoch: 24338, Training Loss: 0.00936
Epoch: 24338, Training Loss: 0.00937
Epoch: 24338, Training Loss: 0.01153
Epoch: 24339, Training Loss: 0.00896
Epoch: 24339, Training Loss: 0.00936
Epoch: 24339, Training Loss: 0.00937
Epoch: 24339, Training Loss: 0.01153
Epoch: 24340, Training Loss: 0.00896
Epoch: 24340, Training Loss: 0.00936
Epoch: 24340, Training Loss: 0.00937
Epoch: 24340, Training Loss: 0.01153
Epoch: 24341, Training Loss: 0.00896
Epoch: 24341, Training Loss: 0.00936
Epoch: 24341, Training Loss: 0.00937
Epoch: 24341, Training Loss: 0.01153
Epoch: 24342, Training Loss: 0.00896
Epoch: 24342, Training Loss: 0.00936
Epoch: 24342, Training Loss: 0.00937
Epoch: 24342, Training Loss: 0.01153
Epoch: 24343, Training Loss: 0.00896
Epoch: 24343, Training Loss: 0.00936
Epoch: 24343, Training Loss: 0.00937
Epoch: 24343, Training Loss: 0.01153
Epoch: 24344, Training Loss: 0.00896
Epoch: 24344, Training Loss: 0.00936
Epoch: 24344, Training Loss: 0.00937
Epoch: 24344, Training Loss: 0.01153
Epoch: 24345, Training Loss: 0.00895
Epoch: 24345, Training Loss: 0.00936
Epoch: 24345, Training Loss: 0.00937
Epoch: 24345, Training Loss: 0.01153
Epoch: 24346, Training Loss: 0.00895
Epoch: 24346, Training Loss: 0.00936
Epoch: 24346, Training Loss: 0.00937
Epoch: 24346, Training Loss: 0.01153
Epoch: 24347, Training Loss: 0.00895
Epoch: 24347, Training Loss: 0.00936
Epoch: 24347, Training Loss: 0.00937
Epoch: 24347, Training Loss: 0.01153
Epoch: 24348, Training Loss: 0.00895
Epoch: 24348, Training Loss: 0.00936
Epoch: 24348, Training Loss: 0.00937
Epoch: 24348, Training Loss: 0.01153
Epoch: 24349, Training Loss: 0.00895
Epoch: 24349, Training Loss: 0.00936
Epoch: 24349, Training Loss: 0.00937
Epoch: 24349, Training Loss: 0.01153
Epoch: 24350, Training Loss: 0.00895
Epoch: 24350, Training Loss: 0.00936
Epoch: 24350, Training Loss: 0.00937
Epoch: 24350, Training Loss: 0.01153
Epoch: 24351, Training Loss: 0.00895
Epoch: 24351, Training Loss: 0.00936
Epoch: 24351, Training Loss: 0.00937
Epoch: 24351, Training Loss: 0.01153
Epoch: 24352, Training Loss: 0.00895
Epoch: 24352, Training Loss: 0.00936
Epoch: 24352, Training Loss: 0.00937
Epoch: 24352, Training Loss: 0.01153
Epoch: 24353, Training Loss: 0.00895
Epoch: 24353, Training Loss: 0.00936
Epoch: 24353, Training Loss: 0.00937
Epoch: 24353, Training Loss: 0.01153
Epoch: 24354, Training Loss: 0.00895
Epoch: 24354, Training Loss: 0.00936
Epoch: 24354, Training Loss: 0.00937
Epoch: 24354, Training Loss: 0.01152
Epoch: 24355, Training Loss: 0.00895
Epoch: 24355, Training Loss: 0.00936
Epoch: 24355, Training Loss: 0.00937
Epoch: 24355, Training Loss: 0.01152
Epoch: 24356, Training Loss: 0.00895
Epoch: 24356, Training Loss: 0.00936
Epoch: 24356, Training Loss: 0.00937
Epoch: 24356, Training Loss: 0.01152
Epoch: 24357, Training Loss: 0.00895
Epoch: 24357, Training Loss: 0.00936
Epoch: 24357, Training Loss: 0.00937
Epoch: 24357, Training Loss: 0.01152
Epoch: 24358, Training Loss: 0.00895
Epoch: 24358, Training Loss: 0.00936
Epoch: 24358, Training Loss: 0.00937
Epoch: 24358, Training Loss: 0.01152
Epoch: 24359, Training Loss: 0.00895
Epoch: 24359, Training Loss: 0.00936
Epoch: 24359, Training Loss: 0.00937
Epoch: 24359, Training Loss: 0.01152
Epoch: 24360, Training Loss: 0.00895
Epoch: 24360, Training Loss: 0.00936
Epoch: 24360, Training Loss: 0.00937
Epoch: 24360, Training Loss: 0.01152
Epoch: 24361, Training Loss: 0.00895
Epoch: 24361, Training Loss: 0.00935
Epoch: 24361, Training Loss: 0.00937
Epoch: 24361, Training Loss: 0.01152
Epoch: 24362, Training Loss: 0.00895
Epoch: 24362, Training Loss: 0.00935
Epoch: 24362, Training Loss: 0.00937
Epoch: 24362, Training Loss: 0.01152
Epoch: 24363, Training Loss: 0.00895
Epoch: 24363, Training Loss: 0.00935
Epoch: 24363, Training Loss: 0.00937
Epoch: 24363, Training Loss: 0.01152
Epoch: 24364, Training Loss: 0.00895
Epoch: 24364, Training Loss: 0.00935
Epoch: 24364, Training Loss: 0.00937
Epoch: 24364, Training Loss: 0.01152
Epoch: 24365, Training Loss: 0.00895
Epoch: 24365, Training Loss: 0.00935
Epoch: 24365, Training Loss: 0.00937
Epoch: 24365, Training Loss: 0.01152
Epoch: 24366, Training Loss: 0.00895
Epoch: 24366, Training Loss: 0.00935
Epoch: 24366, Training Loss: 0.00937
Epoch: 24366, Training Loss: 0.01152
Epoch: 24367, Training Loss: 0.00895
Epoch: 24367, Training Loss: 0.00935
Epoch: 24367, Training Loss: 0.00937
Epoch: 24367, Training Loss: 0.01152
Epoch: 24368, Training Loss: 0.00895
Epoch: 24368, Training Loss: 0.00935
Epoch: 24368, Training Loss: 0.00937
Epoch: 24368, Training Loss: 0.01152
Epoch: 24369, Training Loss: 0.00895
Epoch: 24369, Training Loss: 0.00935
Epoch: 24369, Training Loss: 0.00936
Epoch: 24369, Training Loss: 0.01152
Epoch: 24370, Training Loss: 0.00895
Epoch: 24370, Training Loss: 0.00935
Epoch: 24370, Training Loss: 0.00936
Epoch: 24370, Training Loss: 0.01152
Epoch: 24371, Training Loss: 0.00895
Epoch: 24371, Training Loss: 0.00935
Epoch: 24371, Training Loss: 0.00936
Epoch: 24371, Training Loss: 0.01152
Epoch: 24372, Training Loss: 0.00895
Epoch: 24372, Training Loss: 0.00935
Epoch: 24372, Training Loss: 0.00936
Epoch: 24372, Training Loss: 0.01152
Epoch: 24373, Training Loss: 0.00895
Epoch: 24373, Training Loss: 0.00935
Epoch: 24373, Training Loss: 0.00936
Epoch: 24373, Training Loss: 0.01152
Epoch: 24374, Training Loss: 0.00895
Epoch: 24374, Training Loss: 0.00935
Epoch: 24374, Training Loss: 0.00936
Epoch: 24374, Training Loss: 0.01152
Epoch: 24375, Training Loss: 0.00895
Epoch: 24375, Training Loss: 0.00935
Epoch: 24375, Training Loss: 0.00936
Epoch: 24375, Training Loss: 0.01152
Epoch: 24376, Training Loss: 0.00895
Epoch: 24376, Training Loss: 0.00935
Epoch: 24376, Training Loss: 0.00936
Epoch: 24376, Training Loss: 0.01152
Epoch: 24377, Training Loss: 0.00895
Epoch: 24377, Training Loss: 0.00935
Epoch: 24377, Training Loss: 0.00936
Epoch: 24377, Training Loss: 0.01152
Epoch: 24378, Training Loss: 0.00895
Epoch: 24378, Training Loss: 0.00935
Epoch: 24378, Training Loss: 0.00936
Epoch: 24378, Training Loss: 0.01152
Epoch: 24379, Training Loss: 0.00895
Epoch: 24379, Training Loss: 0.00935
Epoch: 24379, Training Loss: 0.00936
Epoch: 24379, Training Loss: 0.01152
Epoch: 24380, Training Loss: 0.00895
Epoch: 24380, Training Loss: 0.00935
Epoch: 24380, Training Loss: 0.00936
Epoch: 24380, Training Loss: 0.01152
Epoch: 24381, Training Loss: 0.00895
Epoch: 24381, Training Loss: 0.00935
Epoch: 24381, Training Loss: 0.00936
Epoch: 24381, Training Loss: 0.01152
Epoch: 24382, Training Loss: 0.00895
Epoch: 24382, Training Loss: 0.00935
Epoch: 24382, Training Loss: 0.00936
Epoch: 24382, Training Loss: 0.01152
Epoch: 24383, Training Loss: 0.00895
Epoch: 24383, Training Loss: 0.00935
Epoch: 24383, Training Loss: 0.00936
Epoch: 24383, Training Loss: 0.01152
Epoch: 24384, Training Loss: 0.00895
Epoch: 24384, Training Loss: 0.00935
Epoch: 24384, Training Loss: 0.00936
Epoch: 24384, Training Loss: 0.01152
Epoch: 24385, Training Loss: 0.00895
Epoch: 24385, Training Loss: 0.00935
Epoch: 24385, Training Loss: 0.00936
Epoch: 24385, Training Loss: 0.01152
Epoch: 24386, Training Loss: 0.00895
Epoch: 24386, Training Loss: 0.00935
Epoch: 24386, Training Loss: 0.00936
Epoch: 24386, Training Loss: 0.01152
Epoch: 24387, Training Loss: 0.00895
Epoch: 24387, Training Loss: 0.00935
Epoch: 24387, Training Loss: 0.00936
Epoch: 24387, Training Loss: 0.01152
Epoch: 24388, Training Loss: 0.00895
Epoch: 24388, Training Loss: 0.00935
Epoch: 24388, Training Loss: 0.00936
Epoch: 24388, Training Loss: 0.01152
Epoch: 24389, Training Loss: 0.00895
Epoch: 24389, Training Loss: 0.00935
Epoch: 24389, Training Loss: 0.00936
Epoch: 24389, Training Loss: 0.01152
Epoch: 24390, Training Loss: 0.00895
Epoch: 24390, Training Loss: 0.00935
Epoch: 24390, Training Loss: 0.00936
Epoch: 24390, Training Loss: 0.01152
Epoch: 24391, Training Loss: 0.00895
Epoch: 24391, Training Loss: 0.00935
Epoch: 24391, Training Loss: 0.00936
Epoch: 24391, Training Loss: 0.01151
Epoch: 24392, Training Loss: 0.00895
Epoch: 24392, Training Loss: 0.00935
Epoch: 24392, Training Loss: 0.00936
Epoch: 24392, Training Loss: 0.01151
Epoch: 24393, Training Loss: 0.00895
Epoch: 24393, Training Loss: 0.00935
Epoch: 24393, Training Loss: 0.00936
Epoch: 24393, Training Loss: 0.01151
Epoch: 24394, Training Loss: 0.00894
Epoch: 24394, Training Loss: 0.00935
Epoch: 24394, Training Loss: 0.00936
Epoch: 24394, Training Loss: 0.01151
Epoch: 24395, Training Loss: 0.00894
Epoch: 24395, Training Loss: 0.00935
Epoch: 24395, Training Loss: 0.00936
Epoch: 24395, Training Loss: 0.01151
Epoch: 24396, Training Loss: 0.00894
Epoch: 24396, Training Loss: 0.00935
Epoch: 24396, Training Loss: 0.00936
Epoch: 24396, Training Loss: 0.01151
Epoch: 24397, Training Loss: 0.00894
Epoch: 24397, Training Loss: 0.00935
Epoch: 24397, Training Loss: 0.00936
Epoch: 24397, Training Loss: 0.01151
Epoch: 24398, Training Loss: 0.00894
Epoch: 24398, Training Loss: 0.00935
Epoch: 24398, Training Loss: 0.00936
Epoch: 24398, Training Loss: 0.01151
Epoch: 24399, Training Loss: 0.00894
Epoch: 24399, Training Loss: 0.00935
Epoch: 24399, Training Loss: 0.00936
Epoch: 24399, Training Loss: 0.01151
Epoch: 24400, Training Loss: 0.00894
Epoch: 24400, Training Loss: 0.00935
Epoch: 24400, Training Loss: 0.00936
Epoch: 24400, Training Loss: 0.01151
Epoch: 24401, Training Loss: 0.00894
Epoch: 24401, Training Loss: 0.00935
Epoch: 24401, Training Loss: 0.00936
Epoch: 24401, Training Loss: 0.01151
Epoch: 24402, Training Loss: 0.00894
Epoch: 24402, Training Loss: 0.00935
Epoch: 24402, Training Loss: 0.00936
Epoch: 24402, Training Loss: 0.01151
Epoch: 24403, Training Loss: 0.00894
Epoch: 24403, Training Loss: 0.00935
Epoch: 24403, Training Loss: 0.00936
Epoch: 24403, Training Loss: 0.01151
Epoch: 24404, Training Loss: 0.00894
Epoch: 24404, Training Loss: 0.00935
Epoch: 24404, Training Loss: 0.00936
Epoch: 24404, Training Loss: 0.01151
Epoch: 24405, Training Loss: 0.00894
Epoch: 24405, Training Loss: 0.00935
Epoch: 24405, Training Loss: 0.00936
Epoch: 24405, Training Loss: 0.01151
Epoch: 24406, Training Loss: 0.00894
Epoch: 24406, Training Loss: 0.00934
Epoch: 24406, Training Loss: 0.00936
Epoch: 24406, Training Loss: 0.01151
Epoch: 24407, Training Loss: 0.00894
Epoch: 24407, Training Loss: 0.00934
Epoch: 24407, Training Loss: 0.00936
Epoch: 24407, Training Loss: 0.01151
Epoch: 24408, Training Loss: 0.00894
Epoch: 24408, Training Loss: 0.00934
Epoch: 24408, Training Loss: 0.00936
Epoch: 24408, Training Loss: 0.01151
Epoch: 24409, Training Loss: 0.00894
Epoch: 24409, Training Loss: 0.00934
Epoch: 24409, Training Loss: 0.00936
Epoch: 24409, Training Loss: 0.01151
Epoch: 24410, Training Loss: 0.00894
Epoch: 24410, Training Loss: 0.00934
Epoch: 24410, Training Loss: 0.00936
Epoch: 24410, Training Loss: 0.01151
Epoch: 24411, Training Loss: 0.00894
Epoch: 24411, Training Loss: 0.00934
Epoch: 24411, Training Loss: 0.00936
Epoch: 24411, Training Loss: 0.01151
Epoch: 24412, Training Loss: 0.00894
Epoch: 24412, Training Loss: 0.00934
Epoch: 24412, Training Loss: 0.00936
Epoch: 24412, Training Loss: 0.01151
Epoch: 24413, Training Loss: 0.00894
Epoch: 24413, Training Loss: 0.00934
Epoch: 24413, Training Loss: 0.00936
Epoch: 24413, Training Loss: 0.01151
Epoch: 24414, Training Loss: 0.00894
Epoch: 24414, Training Loss: 0.00934
Epoch: 24414, Training Loss: 0.00935
Epoch: 24414, Training Loss: 0.01151
Epoch: 24415, Training Loss: 0.00894
Epoch: 24415, Training Loss: 0.00934
Epoch: 24415, Training Loss: 0.00935
Epoch: 24415, Training Loss: 0.01151
Epoch: 24416, Training Loss: 0.00894
Epoch: 24416, Training Loss: 0.00934
Epoch: 24416, Training Loss: 0.00935
Epoch: 24416, Training Loss: 0.01151
Epoch: 24417, Training Loss: 0.00894
Epoch: 24417, Training Loss: 0.00934
Epoch: 24417, Training Loss: 0.00935
Epoch: 24417, Training Loss: 0.01151
Epoch: 24418, Training Loss: 0.00894
Epoch: 24418, Training Loss: 0.00934
Epoch: 24418, Training Loss: 0.00935
Epoch: 24418, Training Loss: 0.01151
Epoch: 24419, Training Loss: 0.00894
Epoch: 24419, Training Loss: 0.00934
Epoch: 24419, Training Loss: 0.00935
Epoch: 24419, Training Loss: 0.01151
Epoch: 24420, Training Loss: 0.00894
Epoch: 24420, Training Loss: 0.00934
Epoch: 24420, Training Loss: 0.00935
Epoch: 24420, Training Loss: 0.01151
Epoch: 24421, Training Loss: 0.00894
Epoch: 24421, Training Loss: 0.00934
Epoch: 24421, Training Loss: 0.00935
Epoch: 24421, Training Loss: 0.01151
Epoch: 24422, Training Loss: 0.00894
Epoch: 24422, Training Loss: 0.00934
Epoch: 24422, Training Loss: 0.00935
Epoch: 24422, Training Loss: 0.01151
Epoch: 24423, Training Loss: 0.00894
Epoch: 24423, Training Loss: 0.00934
Epoch: 24423, Training Loss: 0.00935
Epoch: 24423, Training Loss: 0.01151
Epoch: 24424, Training Loss: 0.00894
Epoch: 24424, Training Loss: 0.00934
Epoch: 24424, Training Loss: 0.00935
Epoch: 24424, Training Loss: 0.01151
Epoch: 24425, Training Loss: 0.00894
Epoch: 24425, Training Loss: 0.00934
Epoch: 24425, Training Loss: 0.00935
Epoch: 24425, Training Loss: 0.01151
Epoch: 24426, Training Loss: 0.00894
Epoch: 24426, Training Loss: 0.00934
Epoch: 24426, Training Loss: 0.00935
Epoch: 24426, Training Loss: 0.01151
Epoch: 24427, Training Loss: 0.00894
Epoch: 24427, Training Loss: 0.00934
Epoch: 24427, Training Loss: 0.00935
Epoch: 24427, Training Loss: 0.01150
Epoch: 24428, Training Loss: 0.00894
Epoch: 24428, Training Loss: 0.00934
Epoch: 24428, Training Loss: 0.00935
Epoch: 24428, Training Loss: 0.01150
Epoch: 24429, Training Loss: 0.00894
Epoch: 24429, Training Loss: 0.00934
Epoch: 24429, Training Loss: 0.00935
Epoch: 24429, Training Loss: 0.01150
Epoch: 24430, Training Loss: 0.00894
Epoch: 24430, Training Loss: 0.00934
Epoch: 24430, Training Loss: 0.00935
Epoch: 24430, Training Loss: 0.01150
Epoch: 24431, Training Loss: 0.00894
Epoch: 24431, Training Loss: 0.00934
Epoch: 24431, Training Loss: 0.00935
Epoch: 24431, Training Loss: 0.01150
Epoch: 24432, Training Loss: 0.00894
Epoch: 24432, Training Loss: 0.00934
Epoch: 24432, Training Loss: 0.00935
Epoch: 24432, Training Loss: 0.01150
Epoch: 24433, Training Loss: 0.00894
Epoch: 24433, Training Loss: 0.00934
Epoch: 24433, Training Loss: 0.00935
Epoch: 24433, Training Loss: 0.01150
Epoch: 24434, Training Loss: 0.00894
Epoch: 24434, Training Loss: 0.00934
Epoch: 24434, Training Loss: 0.00935
Epoch: 24434, Training Loss: 0.01150
Epoch: 24435, Training Loss: 0.00894
Epoch: 24435, Training Loss: 0.00934
Epoch: 24435, Training Loss: 0.00935
Epoch: 24435, Training Loss: 0.01150
Epoch: 24436, Training Loss: 0.00894
Epoch: 24436, Training Loss: 0.00934
Epoch: 24436, Training Loss: 0.00935
Epoch: 24436, Training Loss: 0.01150
Epoch: 24437, Training Loss: 0.00894
Epoch: 24437, Training Loss: 0.00934
Epoch: 24437, Training Loss: 0.00935
Epoch: 24437, Training Loss: 0.01150
Epoch: 24438, Training Loss: 0.00894
Epoch: 24438, Training Loss: 0.00934
Epoch: 24438, Training Loss: 0.00935
Epoch: 24438, Training Loss: 0.01150
Epoch: 24439, Training Loss: 0.00894
Epoch: 24439, Training Loss: 0.00934
Epoch: 24439, Training Loss: 0.00935
Epoch: 24439, Training Loss: 0.01150
Epoch: 24440, Training Loss: 0.00894
Epoch: 24440, Training Loss: 0.00934
Epoch: 24440, Training Loss: 0.00935
Epoch: 24440, Training Loss: 0.01150
Epoch: 24441, Training Loss: 0.00894
Epoch: 24441, Training Loss: 0.00934
Epoch: 24441, Training Loss: 0.00935
Epoch: 24441, Training Loss: 0.01150
Epoch: 24442, Training Loss: 0.00893
Epoch: 24442, Training Loss: 0.00934
Epoch: 24442, Training Loss: 0.00935
Epoch: 24442, Training Loss: 0.01150
Epoch: 24443, Training Loss: 0.00893
Epoch: 24443, Training Loss: 0.00934
Epoch: 24443, Training Loss: 0.00935
Epoch: 24443, Training Loss: 0.01150
Epoch: 24444, Training Loss: 0.00893
Epoch: 24444, Training Loss: 0.00934
Epoch: 24444, Training Loss: 0.00935
Epoch: 24444, Training Loss: 0.01150
Epoch: 24445, Training Loss: 0.00893
Epoch: 24445, Training Loss: 0.00934
Epoch: 24445, Training Loss: 0.00935
Epoch: 24445, Training Loss: 0.01150
Epoch: 24446, Training Loss: 0.00893
Epoch: 24446, Training Loss: 0.00934
Epoch: 24446, Training Loss: 0.00935
Epoch: 24446, Training Loss: 0.01150
Epoch: 24447, Training Loss: 0.00893
Epoch: 24447, Training Loss: 0.00934
Epoch: 24447, Training Loss: 0.00935
Epoch: 24447, Training Loss: 0.01150
Epoch: 24448, Training Loss: 0.00893
Epoch: 24448, Training Loss: 0.00934
Epoch: 24448, Training Loss: 0.00935
Epoch: 24448, Training Loss: 0.01150
Epoch: 24449, Training Loss: 0.00893
Epoch: 24449, Training Loss: 0.00934
Epoch: 24449, Training Loss: 0.00935
Epoch: 24449, Training Loss: 0.01150
Epoch: 24450, Training Loss: 0.00893
Epoch: 24450, Training Loss: 0.00934
Epoch: 24450, Training Loss: 0.00935
Epoch: 24450, Training Loss: 0.01150
Epoch: 24451, Training Loss: 0.00893
Epoch: 24451, Training Loss: 0.00934
Epoch: 24451, Training Loss: 0.00935
Epoch: 24451, Training Loss: 0.01150
Epoch: 24452, Training Loss: 0.00893
Epoch: 24452, Training Loss: 0.00933
Epoch: 24452, Training Loss: 0.00935
Epoch: 24452, Training Loss: 0.01150
Epoch: 24453, Training Loss: 0.00893
Epoch: 24453, Training Loss: 0.00933
Epoch: 24453, Training Loss: 0.00935
Epoch: 24453, Training Loss: 0.01150
Epoch: 24454, Training Loss: 0.00893
Epoch: 24454, Training Loss: 0.00933
Epoch: 24454, Training Loss: 0.00935
Epoch: 24454, Training Loss: 0.01150
Epoch: 24455, Training Loss: 0.00893
Epoch: 24455, Training Loss: 0.00933
Epoch: 24455, Training Loss: 0.00935
Epoch: 24455, Training Loss: 0.01150
Epoch: 24456, Training Loss: 0.00893
Epoch: 24456, Training Loss: 0.00933
Epoch: 24456, Training Loss: 0.00935
Epoch: 24456, Training Loss: 0.01150
Epoch: 24457, Training Loss: 0.00893
Epoch: 24457, Training Loss: 0.00933
Epoch: 24457, Training Loss: 0.00935
Epoch: 24457, Training Loss: 0.01150
Epoch: 24458, Training Loss: 0.00893
Epoch: 24458, Training Loss: 0.00933
Epoch: 24458, Training Loss: 0.00935
Epoch: 24458, Training Loss: 0.01150
Epoch: 24459, Training Loss: 0.00893
Epoch: 24459, Training Loss: 0.00933
Epoch: 24459, Training Loss: 0.00934
Epoch: 24459, Training Loss: 0.01150
Epoch: 24460, Training Loss: 0.00893
Epoch: 24460, Training Loss: 0.00933
Epoch: 24460, Training Loss: 0.00934
Epoch: 24460, Training Loss: 0.01150
Epoch: 24461, Training Loss: 0.00893
Epoch: 24461, Training Loss: 0.00933
Epoch: 24461, Training Loss: 0.00934
Epoch: 24461, Training Loss: 0.01150
Epoch: 24462, Training Loss: 0.00893
Epoch: 24462, Training Loss: 0.00933
Epoch: 24462, Training Loss: 0.00934
Epoch: 24462, Training Loss: 0.01150
Epoch: 24463, Training Loss: 0.00893
Epoch: 24463, Training Loss: 0.00933
Epoch: 24463, Training Loss: 0.00934
Epoch: 24463, Training Loss: 0.01149
Epoch: 24464, Training Loss: 0.00893
Epoch: 24464, Training Loss: 0.00933
Epoch: 24464, Training Loss: 0.00934
Epoch: 24464, Training Loss: 0.01149
Epoch: 24465, Training Loss: 0.00893
Epoch: 24465, Training Loss: 0.00933
Epoch: 24465, Training Loss: 0.00934
Epoch: 24465, Training Loss: 0.01149
Epoch: 24466, Training Loss: 0.00893
Epoch: 24466, Training Loss: 0.00933
Epoch: 24466, Training Loss: 0.00934
Epoch: 24466, Training Loss: 0.01149
Epoch: 24467, Training Loss: 0.00893
Epoch: 24467, Training Loss: 0.00933
Epoch: 24467, Training Loss: 0.00934
Epoch: 24467, Training Loss: 0.01149
Epoch: 24468, Training Loss: 0.00893
Epoch: 24468, Training Loss: 0.00933
Epoch: 24468, Training Loss: 0.00934
Epoch: 24468, Training Loss: 0.01149
Epoch: 24469, Training Loss: 0.00893
Epoch: 24469, Training Loss: 0.00933
Epoch: 24469, Training Loss: 0.00934
Epoch: 24469, Training Loss: 0.01149
Epoch: 24470, Training Loss: 0.00893
Epoch: 24470, Training Loss: 0.00933
Epoch: 24470, Training Loss: 0.00934
Epoch: 24470, Training Loss: 0.01149
Epoch: 24471, Training Loss: 0.00893
Epoch: 24471, Training Loss: 0.00933
Epoch: 24471, Training Loss: 0.00934
Epoch: 24471, Training Loss: 0.01149
Epoch: 24472, Training Loss: 0.00893
Epoch: 24472, Training Loss: 0.00933
Epoch: 24472, Training Loss: 0.00934
Epoch: 24472, Training Loss: 0.01149
Epoch: 24473, Training Loss: 0.00893
Epoch: 24473, Training Loss: 0.00933
Epoch: 24473, Training Loss: 0.00934
Epoch: 24473, Training Loss: 0.01149
Epoch: 24474, Training Loss: 0.00893
Epoch: 24474, Training Loss: 0.00933
Epoch: 24474, Training Loss: 0.00934
Epoch: 24474, Training Loss: 0.01149
Epoch: 24475, Training Loss: 0.00893
Epoch: 24475, Training Loss: 0.00933
Epoch: 24475, Training Loss: 0.00934
Epoch: 24475, Training Loss: 0.01149
Epoch: 24476, Training Loss: 0.00893
Epoch: 24476, Training Loss: 0.00933
Epoch: 24476, Training Loss: 0.00934
Epoch: 24476, Training Loss: 0.01149
Epoch: 24477, Training Loss: 0.00893
Epoch: 24477, Training Loss: 0.00933
Epoch: 24477, Training Loss: 0.00934
Epoch: 24477, Training Loss: 0.01149
Epoch: 24478, Training Loss: 0.00893
Epoch: 24478, Training Loss: 0.00933
Epoch: 24478, Training Loss: 0.00934
Epoch: 24478, Training Loss: 0.01149
Epoch: 24479, Training Loss: 0.00893
Epoch: 24479, Training Loss: 0.00933
Epoch: 24479, Training Loss: 0.00934
Epoch: 24479, Training Loss: 0.01149
Epoch: 24480, Training Loss: 0.00893
Epoch: 24480, Training Loss: 0.00933
Epoch: 24480, Training Loss: 0.00934
Epoch: 24480, Training Loss: 0.01149
Epoch: 24481, Training Loss: 0.00893
Epoch: 24481, Training Loss: 0.00933
Epoch: 24481, Training Loss: 0.00934
Epoch: 24481, Training Loss: 0.01149
Epoch: 24482, Training Loss: 0.00893
Epoch: 24482, Training Loss: 0.00933
Epoch: 24482, Training Loss: 0.00934
Epoch: 24482, Training Loss: 0.01149
Epoch: 24483, Training Loss: 0.00893
Epoch: 24483, Training Loss: 0.00933
Epoch: 24483, Training Loss: 0.00934
Epoch: 24483, Training Loss: 0.01149
Epoch: 24484, Training Loss: 0.00893
Epoch: 24484, Training Loss: 0.00933
Epoch: 24484, Training Loss: 0.00934
...
[output elided: over epochs 24484-24727 the four per-sample losses decrease only marginally, from roughly 0.00893 / 0.00933 / 0.00934 / 0.01149 down to roughly 0.00888 / 0.00927 / 0.00929 / 0.01142]
...
Epoch: 24727, Training Loss: 0.00929
Epoch: 24727, Training Loss: 0.01142
Epoch: 24728, Training Loss: 0.00888
Epoch: 24728, Training Loss: 0.00927
Epoch: 24728, Training Loss: 0.00929
Epoch: 24728, Training Loss: 0.01142
Epoch: 24729, Training Loss: 0.00888
Epoch: 24729, Training Loss: 0.00927
Epoch: 24729, Training Loss: 0.00929
Epoch: 24729, Training Loss: 0.01142
Epoch: 24730, Training Loss: 0.00888
Epoch: 24730, Training Loss: 0.00927
Epoch: 24730, Training Loss: 0.00929
Epoch: 24730, Training Loss: 0.01142
Epoch: 24731, Training Loss: 0.00888
Epoch: 24731, Training Loss: 0.00927
Epoch: 24731, Training Loss: 0.00929
Epoch: 24731, Training Loss: 0.01142
Epoch: 24732, Training Loss: 0.00888
Epoch: 24732, Training Loss: 0.00927
Epoch: 24732, Training Loss: 0.00929
Epoch: 24732, Training Loss: 0.01142
Epoch: 24733, Training Loss: 0.00888
Epoch: 24733, Training Loss: 0.00927
Epoch: 24733, Training Loss: 0.00929
Epoch: 24733, Training Loss: 0.01142
Epoch: 24734, Training Loss: 0.00888
Epoch: 24734, Training Loss: 0.00927
Epoch: 24734, Training Loss: 0.00929
Epoch: 24734, Training Loss: 0.01142
Epoch: 24735, Training Loss: 0.00888
Epoch: 24735, Training Loss: 0.00927
Epoch: 24735, Training Loss: 0.00928
Epoch: 24735, Training Loss: 0.01142
Epoch: 24736, Training Loss: 0.00888
Epoch: 24736, Training Loss: 0.00927
Epoch: 24736, Training Loss: 0.00928
Epoch: 24736, Training Loss: 0.01142
Epoch: 24737, Training Loss: 0.00888
Epoch: 24737, Training Loss: 0.00927
Epoch: 24737, Training Loss: 0.00928
Epoch: 24737, Training Loss: 0.01142
Epoch: 24738, Training Loss: 0.00887
Epoch: 24738, Training Loss: 0.00927
Epoch: 24738, Training Loss: 0.00928
Epoch: 24738, Training Loss: 0.01142
Epoch: 24739, Training Loss: 0.00887
Epoch: 24739, Training Loss: 0.00927
Epoch: 24739, Training Loss: 0.00928
Epoch: 24739, Training Loss: 0.01142
Epoch: 24740, Training Loss: 0.00887
Epoch: 24740, Training Loss: 0.00927
Epoch: 24740, Training Loss: 0.00928
Epoch: 24740, Training Loss: 0.01142
Epoch: 24741, Training Loss: 0.00887
Epoch: 24741, Training Loss: 0.00927
Epoch: 24741, Training Loss: 0.00928
Epoch: 24741, Training Loss: 0.01142
Epoch: 24742, Training Loss: 0.00887
Epoch: 24742, Training Loss: 0.00927
Epoch: 24742, Training Loss: 0.00928
Epoch: 24742, Training Loss: 0.01142
Epoch: 24743, Training Loss: 0.00887
Epoch: 24743, Training Loss: 0.00927
Epoch: 24743, Training Loss: 0.00928
Epoch: 24743, Training Loss: 0.01142
Epoch: 24744, Training Loss: 0.00887
Epoch: 24744, Training Loss: 0.00927
Epoch: 24744, Training Loss: 0.00928
Epoch: 24744, Training Loss: 0.01142
Epoch: 24745, Training Loss: 0.00887
Epoch: 24745, Training Loss: 0.00927
Epoch: 24745, Training Loss: 0.00928
Epoch: 24745, Training Loss: 0.01142
Epoch: 24746, Training Loss: 0.00887
Epoch: 24746, Training Loss: 0.00927
Epoch: 24746, Training Loss: 0.00928
Epoch: 24746, Training Loss: 0.01142
Epoch: 24747, Training Loss: 0.00887
Epoch: 24747, Training Loss: 0.00927
Epoch: 24747, Training Loss: 0.00928
Epoch: 24747, Training Loss: 0.01142
Epoch: 24748, Training Loss: 0.00887
Epoch: 24748, Training Loss: 0.00927
Epoch: 24748, Training Loss: 0.00928
Epoch: 24748, Training Loss: 0.01142
Epoch: 24749, Training Loss: 0.00887
Epoch: 24749, Training Loss: 0.00927
Epoch: 24749, Training Loss: 0.00928
Epoch: 24749, Training Loss: 0.01142
Epoch: 24750, Training Loss: 0.00887
Epoch: 24750, Training Loss: 0.00927
Epoch: 24750, Training Loss: 0.00928
Epoch: 24750, Training Loss: 0.01142
Epoch: 24751, Training Loss: 0.00887
Epoch: 24751, Training Loss: 0.00927
Epoch: 24751, Training Loss: 0.00928
Epoch: 24751, Training Loss: 0.01142
Epoch: 24752, Training Loss: 0.00887
Epoch: 24752, Training Loss: 0.00927
Epoch: 24752, Training Loss: 0.00928
Epoch: 24752, Training Loss: 0.01142
Epoch: 24753, Training Loss: 0.00887
Epoch: 24753, Training Loss: 0.00927
Epoch: 24753, Training Loss: 0.00928
Epoch: 24753, Training Loss: 0.01142
Epoch: 24754, Training Loss: 0.00887
Epoch: 24754, Training Loss: 0.00927
Epoch: 24754, Training Loss: 0.00928
Epoch: 24754, Training Loss: 0.01142
Epoch: 24755, Training Loss: 0.00887
Epoch: 24755, Training Loss: 0.00927
Epoch: 24755, Training Loss: 0.00928
Epoch: 24755, Training Loss: 0.01142
Epoch: 24756, Training Loss: 0.00887
Epoch: 24756, Training Loss: 0.00927
Epoch: 24756, Training Loss: 0.00928
Epoch: 24756, Training Loss: 0.01142
Epoch: 24757, Training Loss: 0.00887
Epoch: 24757, Training Loss: 0.00927
Epoch: 24757, Training Loss: 0.00928
Epoch: 24757, Training Loss: 0.01142
Epoch: 24758, Training Loss: 0.00887
Epoch: 24758, Training Loss: 0.00927
Epoch: 24758, Training Loss: 0.00928
Epoch: 24758, Training Loss: 0.01141
Epoch: 24759, Training Loss: 0.00887
Epoch: 24759, Training Loss: 0.00927
Epoch: 24759, Training Loss: 0.00928
Epoch: 24759, Training Loss: 0.01141
Epoch: 24760, Training Loss: 0.00887
Epoch: 24760, Training Loss: 0.00927
Epoch: 24760, Training Loss: 0.00928
Epoch: 24760, Training Loss: 0.01141
Epoch: 24761, Training Loss: 0.00887
Epoch: 24761, Training Loss: 0.00927
Epoch: 24761, Training Loss: 0.00928
Epoch: 24761, Training Loss: 0.01141
Epoch: 24762, Training Loss: 0.00887
Epoch: 24762, Training Loss: 0.00927
Epoch: 24762, Training Loss: 0.00928
Epoch: 24762, Training Loss: 0.01141
Epoch: 24763, Training Loss: 0.00887
Epoch: 24763, Training Loss: 0.00927
Epoch: 24763, Training Loss: 0.00928
Epoch: 24763, Training Loss: 0.01141
Epoch: 24764, Training Loss: 0.00887
Epoch: 24764, Training Loss: 0.00927
Epoch: 24764, Training Loss: 0.00928
Epoch: 24764, Training Loss: 0.01141
Epoch: 24765, Training Loss: 0.00887
Epoch: 24765, Training Loss: 0.00927
Epoch: 24765, Training Loss: 0.00928
Epoch: 24765, Training Loss: 0.01141
Epoch: 24766, Training Loss: 0.00887
Epoch: 24766, Training Loss: 0.00927
Epoch: 24766, Training Loss: 0.00928
Epoch: 24766, Training Loss: 0.01141
Epoch: 24767, Training Loss: 0.00887
Epoch: 24767, Training Loss: 0.00927
Epoch: 24767, Training Loss: 0.00928
Epoch: 24767, Training Loss: 0.01141
Epoch: 24768, Training Loss: 0.00887
Epoch: 24768, Training Loss: 0.00927
Epoch: 24768, Training Loss: 0.00928
Epoch: 24768, Training Loss: 0.01141
Epoch: 24769, Training Loss: 0.00887
Epoch: 24769, Training Loss: 0.00927
Epoch: 24769, Training Loss: 0.00928
Epoch: 24769, Training Loss: 0.01141
Epoch: 24770, Training Loss: 0.00887
Epoch: 24770, Training Loss: 0.00927
Epoch: 24770, Training Loss: 0.00928
Epoch: 24770, Training Loss: 0.01141
Epoch: 24771, Training Loss: 0.00887
Epoch: 24771, Training Loss: 0.00927
Epoch: 24771, Training Loss: 0.00928
Epoch: 24771, Training Loss: 0.01141
Epoch: 24772, Training Loss: 0.00887
Epoch: 24772, Training Loss: 0.00927
Epoch: 24772, Training Loss: 0.00928
Epoch: 24772, Training Loss: 0.01141
Epoch: 24773, Training Loss: 0.00887
Epoch: 24773, Training Loss: 0.00927
Epoch: 24773, Training Loss: 0.00928
Epoch: 24773, Training Loss: 0.01141
Epoch: 24774, Training Loss: 0.00887
Epoch: 24774, Training Loss: 0.00926
Epoch: 24774, Training Loss: 0.00928
Epoch: 24774, Training Loss: 0.01141
Epoch: 24775, Training Loss: 0.00887
Epoch: 24775, Training Loss: 0.00926
Epoch: 24775, Training Loss: 0.00928
Epoch: 24775, Training Loss: 0.01141
Epoch: 24776, Training Loss: 0.00887
Epoch: 24776, Training Loss: 0.00926
Epoch: 24776, Training Loss: 0.00928
Epoch: 24776, Training Loss: 0.01141
Epoch: 24777, Training Loss: 0.00887
Epoch: 24777, Training Loss: 0.00926
Epoch: 24777, Training Loss: 0.00928
Epoch: 24777, Training Loss: 0.01141
Epoch: 24778, Training Loss: 0.00887
Epoch: 24778, Training Loss: 0.00926
Epoch: 24778, Training Loss: 0.00928
Epoch: 24778, Training Loss: 0.01141
Epoch: 24779, Training Loss: 0.00887
Epoch: 24779, Training Loss: 0.00926
Epoch: 24779, Training Loss: 0.00928
Epoch: 24779, Training Loss: 0.01141
Epoch: 24780, Training Loss: 0.00887
Epoch: 24780, Training Loss: 0.00926
Epoch: 24780, Training Loss: 0.00928
Epoch: 24780, Training Loss: 0.01141
Epoch: 24781, Training Loss: 0.00887
Epoch: 24781, Training Loss: 0.00926
Epoch: 24781, Training Loss: 0.00927
Epoch: 24781, Training Loss: 0.01141
Epoch: 24782, Training Loss: 0.00887
Epoch: 24782, Training Loss: 0.00926
Epoch: 24782, Training Loss: 0.00927
Epoch: 24782, Training Loss: 0.01141
Epoch: 24783, Training Loss: 0.00887
Epoch: 24783, Training Loss: 0.00926
Epoch: 24783, Training Loss: 0.00927
Epoch: 24783, Training Loss: 0.01141
Epoch: 24784, Training Loss: 0.00887
Epoch: 24784, Training Loss: 0.00926
Epoch: 24784, Training Loss: 0.00927
Epoch: 24784, Training Loss: 0.01141
Epoch: 24785, Training Loss: 0.00887
Epoch: 24785, Training Loss: 0.00926
Epoch: 24785, Training Loss: 0.00927
Epoch: 24785, Training Loss: 0.01141
Epoch: 24786, Training Loss: 0.00887
Epoch: 24786, Training Loss: 0.00926
Epoch: 24786, Training Loss: 0.00927
Epoch: 24786, Training Loss: 0.01141
Epoch: 24787, Training Loss: 0.00887
Epoch: 24787, Training Loss: 0.00926
Epoch: 24787, Training Loss: 0.00927
Epoch: 24787, Training Loss: 0.01141
Epoch: 24788, Training Loss: 0.00886
Epoch: 24788, Training Loss: 0.00926
Epoch: 24788, Training Loss: 0.00927
Epoch: 24788, Training Loss: 0.01141
Epoch: 24789, Training Loss: 0.00886
Epoch: 24789, Training Loss: 0.00926
Epoch: 24789, Training Loss: 0.00927
Epoch: 24789, Training Loss: 0.01141
Epoch: 24790, Training Loss: 0.00886
Epoch: 24790, Training Loss: 0.00926
Epoch: 24790, Training Loss: 0.00927
Epoch: 24790, Training Loss: 0.01141
Epoch: 24791, Training Loss: 0.00886
Epoch: 24791, Training Loss: 0.00926
Epoch: 24791, Training Loss: 0.00927
Epoch: 24791, Training Loss: 0.01141
Epoch: 24792, Training Loss: 0.00886
Epoch: 24792, Training Loss: 0.00926
Epoch: 24792, Training Loss: 0.00927
Epoch: 24792, Training Loss: 0.01141
Epoch: 24793, Training Loss: 0.00886
Epoch: 24793, Training Loss: 0.00926
Epoch: 24793, Training Loss: 0.00927
Epoch: 24793, Training Loss: 0.01141
Epoch: 24794, Training Loss: 0.00886
Epoch: 24794, Training Loss: 0.00926
Epoch: 24794, Training Loss: 0.00927
Epoch: 24794, Training Loss: 0.01141
Epoch: 24795, Training Loss: 0.00886
Epoch: 24795, Training Loss: 0.00926
Epoch: 24795, Training Loss: 0.00927
Epoch: 24795, Training Loss: 0.01140
Epoch: 24796, Training Loss: 0.00886
Epoch: 24796, Training Loss: 0.00926
Epoch: 24796, Training Loss: 0.00927
Epoch: 24796, Training Loss: 0.01140
Epoch: 24797, Training Loss: 0.00886
Epoch: 24797, Training Loss: 0.00926
Epoch: 24797, Training Loss: 0.00927
Epoch: 24797, Training Loss: 0.01140
Epoch: 24798, Training Loss: 0.00886
Epoch: 24798, Training Loss: 0.00926
Epoch: 24798, Training Loss: 0.00927
Epoch: 24798, Training Loss: 0.01140
Epoch: 24799, Training Loss: 0.00886
Epoch: 24799, Training Loss: 0.00926
Epoch: 24799, Training Loss: 0.00927
Epoch: 24799, Training Loss: 0.01140
Epoch: 24800, Training Loss: 0.00886
Epoch: 24800, Training Loss: 0.00926
Epoch: 24800, Training Loss: 0.00927
Epoch: 24800, Training Loss: 0.01140
Epoch: 24801, Training Loss: 0.00886
Epoch: 24801, Training Loss: 0.00926
Epoch: 24801, Training Loss: 0.00927
Epoch: 24801, Training Loss: 0.01140
Epoch: 24802, Training Loss: 0.00886
Epoch: 24802, Training Loss: 0.00926
Epoch: 24802, Training Loss: 0.00927
Epoch: 24802, Training Loss: 0.01140
Epoch: 24803, Training Loss: 0.00886
Epoch: 24803, Training Loss: 0.00926
Epoch: 24803, Training Loss: 0.00927
Epoch: 24803, Training Loss: 0.01140
Epoch: 24804, Training Loss: 0.00886
Epoch: 24804, Training Loss: 0.00926
Epoch: 24804, Training Loss: 0.00927
Epoch: 24804, Training Loss: 0.01140
Epoch: 24805, Training Loss: 0.00886
Epoch: 24805, Training Loss: 0.00926
Epoch: 24805, Training Loss: 0.00927
Epoch: 24805, Training Loss: 0.01140
Epoch: 24806, Training Loss: 0.00886
Epoch: 24806, Training Loss: 0.00926
Epoch: 24806, Training Loss: 0.00927
Epoch: 24806, Training Loss: 0.01140
Epoch: 24807, Training Loss: 0.00886
Epoch: 24807, Training Loss: 0.00926
Epoch: 24807, Training Loss: 0.00927
Epoch: 24807, Training Loss: 0.01140
Epoch: 24808, Training Loss: 0.00886
Epoch: 24808, Training Loss: 0.00926
Epoch: 24808, Training Loss: 0.00927
Epoch: 24808, Training Loss: 0.01140
Epoch: 24809, Training Loss: 0.00886
Epoch: 24809, Training Loss: 0.00926
Epoch: 24809, Training Loss: 0.00927
Epoch: 24809, Training Loss: 0.01140
Epoch: 24810, Training Loss: 0.00886
Epoch: 24810, Training Loss: 0.00926
Epoch: 24810, Training Loss: 0.00927
Epoch: 24810, Training Loss: 0.01140
Epoch: 24811, Training Loss: 0.00886
Epoch: 24811, Training Loss: 0.00926
Epoch: 24811, Training Loss: 0.00927
Epoch: 24811, Training Loss: 0.01140
Epoch: 24812, Training Loss: 0.00886
Epoch: 24812, Training Loss: 0.00926
Epoch: 24812, Training Loss: 0.00927
Epoch: 24812, Training Loss: 0.01140
Epoch: 24813, Training Loss: 0.00886
Epoch: 24813, Training Loss: 0.00926
Epoch: 24813, Training Loss: 0.00927
Epoch: 24813, Training Loss: 0.01140
Epoch: 24814, Training Loss: 0.00886
Epoch: 24814, Training Loss: 0.00926
Epoch: 24814, Training Loss: 0.00927
Epoch: 24814, Training Loss: 0.01140
Epoch: 24815, Training Loss: 0.00886
Epoch: 24815, Training Loss: 0.00926
Epoch: 24815, Training Loss: 0.00927
Epoch: 24815, Training Loss: 0.01140
Epoch: 24816, Training Loss: 0.00886
Epoch: 24816, Training Loss: 0.00926
Epoch: 24816, Training Loss: 0.00927
Epoch: 24816, Training Loss: 0.01140
Epoch: 24817, Training Loss: 0.00886
Epoch: 24817, Training Loss: 0.00926
Epoch: 24817, Training Loss: 0.00927
Epoch: 24817, Training Loss: 0.01140
Epoch: 24818, Training Loss: 0.00886
Epoch: 24818, Training Loss: 0.00926
Epoch: 24818, Training Loss: 0.00927
Epoch: 24818, Training Loss: 0.01140
Epoch: 24819, Training Loss: 0.00886
Epoch: 24819, Training Loss: 0.00926
Epoch: 24819, Training Loss: 0.00927
Epoch: 24819, Training Loss: 0.01140
Epoch: 24820, Training Loss: 0.00886
Epoch: 24820, Training Loss: 0.00925
Epoch: 24820, Training Loss: 0.00927
Epoch: 24820, Training Loss: 0.01140
Epoch: 24821, Training Loss: 0.00886
Epoch: 24821, Training Loss: 0.00925
Epoch: 24821, Training Loss: 0.00927
Epoch: 24821, Training Loss: 0.01140
Epoch: 24822, Training Loss: 0.00886
Epoch: 24822, Training Loss: 0.00925
Epoch: 24822, Training Loss: 0.00927
Epoch: 24822, Training Loss: 0.01140
Epoch: 24823, Training Loss: 0.00886
Epoch: 24823, Training Loss: 0.00925
Epoch: 24823, Training Loss: 0.00927
Epoch: 24823, Training Loss: 0.01140
Epoch: 24824, Training Loss: 0.00886
Epoch: 24824, Training Loss: 0.00925
Epoch: 24824, Training Loss: 0.00927
Epoch: 24824, Training Loss: 0.01140
Epoch: 24825, Training Loss: 0.00886
Epoch: 24825, Training Loss: 0.00925
Epoch: 24825, Training Loss: 0.00927
Epoch: 24825, Training Loss: 0.01140
Epoch: 24826, Training Loss: 0.00886
Epoch: 24826, Training Loss: 0.00925
Epoch: 24826, Training Loss: 0.00927
Epoch: 24826, Training Loss: 0.01140
Epoch: 24827, Training Loss: 0.00886
Epoch: 24827, Training Loss: 0.00925
Epoch: 24827, Training Loss: 0.00927
Epoch: 24827, Training Loss: 0.01140
Epoch: 24828, Training Loss: 0.00886
Epoch: 24828, Training Loss: 0.00925
Epoch: 24828, Training Loss: 0.00926
Epoch: 24828, Training Loss: 0.01140
Epoch: 24829, Training Loss: 0.00886
Epoch: 24829, Training Loss: 0.00925
Epoch: 24829, Training Loss: 0.00926
Epoch: 24829, Training Loss: 0.01140
Epoch: 24830, Training Loss: 0.00886
Epoch: 24830, Training Loss: 0.00925
Epoch: 24830, Training Loss: 0.00926
Epoch: 24830, Training Loss: 0.01140
Epoch: 24831, Training Loss: 0.00886
Epoch: 24831, Training Loss: 0.00925
Epoch: 24831, Training Loss: 0.00926
Epoch: 24831, Training Loss: 0.01140
Epoch: 24832, Training Loss: 0.00886
Epoch: 24832, Training Loss: 0.00925
Epoch: 24832, Training Loss: 0.00926
Epoch: 24832, Training Loss: 0.01140
Epoch: 24833, Training Loss: 0.00886
Epoch: 24833, Training Loss: 0.00925
Epoch: 24833, Training Loss: 0.00926
Epoch: 24833, Training Loss: 0.01139
Epoch: 24834, Training Loss: 0.00886
Epoch: 24834, Training Loss: 0.00925
Epoch: 24834, Training Loss: 0.00926
Epoch: 24834, Training Loss: 0.01139
Epoch: 24835, Training Loss: 0.00886
Epoch: 24835, Training Loss: 0.00925
Epoch: 24835, Training Loss: 0.00926
Epoch: 24835, Training Loss: 0.01139
Epoch: 24836, Training Loss: 0.00886
Epoch: 24836, Training Loss: 0.00925
Epoch: 24836, Training Loss: 0.00926
Epoch: 24836, Training Loss: 0.01139
Epoch: 24837, Training Loss: 0.00886
Epoch: 24837, Training Loss: 0.00925
Epoch: 24837, Training Loss: 0.00926
Epoch: 24837, Training Loss: 0.01139
Epoch: 24838, Training Loss: 0.00885
Epoch: 24838, Training Loss: 0.00925
Epoch: 24838, Training Loss: 0.00926
Epoch: 24838, Training Loss: 0.01139
Epoch: 24839, Training Loss: 0.00885
Epoch: 24839, Training Loss: 0.00925
Epoch: 24839, Training Loss: 0.00926
Epoch: 24839, Training Loss: 0.01139
Epoch: 24840, Training Loss: 0.00885
Epoch: 24840, Training Loss: 0.00925
Epoch: 24840, Training Loss: 0.00926
Epoch: 24840, Training Loss: 0.01139
Epoch: 24841, Training Loss: 0.00885
Epoch: 24841, Training Loss: 0.00925
Epoch: 24841, Training Loss: 0.00926
Epoch: 24841, Training Loss: 0.01139
Epoch: 24842, Training Loss: 0.00885
Epoch: 24842, Training Loss: 0.00925
Epoch: 24842, Training Loss: 0.00926
Epoch: 24842, Training Loss: 0.01139
Epoch: 24843, Training Loss: 0.00885
Epoch: 24843, Training Loss: 0.00925
Epoch: 24843, Training Loss: 0.00926
Epoch: 24843, Training Loss: 0.01139
Epoch: 24844, Training Loss: 0.00885
Epoch: 24844, Training Loss: 0.00925
Epoch: 24844, Training Loss: 0.00926
Epoch: 24844, Training Loss: 0.01139
Epoch: 24845, Training Loss: 0.00885
Epoch: 24845, Training Loss: 0.00925
Epoch: 24845, Training Loss: 0.00926
Epoch: 24845, Training Loss: 0.01139
Epoch: 24846, Training Loss: 0.00885
Epoch: 24846, Training Loss: 0.00925
Epoch: 24846, Training Loss: 0.00926
Epoch: 24846, Training Loss: 0.01139
Epoch: 24847, Training Loss: 0.00885
Epoch: 24847, Training Loss: 0.00925
Epoch: 24847, Training Loss: 0.00926
Epoch: 24847, Training Loss: 0.01139
Epoch: 24848, Training Loss: 0.00885
Epoch: 24848, Training Loss: 0.00925
Epoch: 24848, Training Loss: 0.00926
Epoch: 24848, Training Loss: 0.01139
Epoch: 24849, Training Loss: 0.00885
Epoch: 24849, Training Loss: 0.00925
Epoch: 24849, Training Loss: 0.00926
Epoch: 24849, Training Loss: 0.01139
Epoch: 24850, Training Loss: 0.00885
Epoch: 24850, Training Loss: 0.00925
Epoch: 24850, Training Loss: 0.00926
Epoch: 24850, Training Loss: 0.01139
Epoch: 24851, Training Loss: 0.00885
Epoch: 24851, Training Loss: 0.00925
Epoch: 24851, Training Loss: 0.00926
Epoch: 24851, Training Loss: 0.01139
Epoch: 24852, Training Loss: 0.00885
Epoch: 24852, Training Loss: 0.00925
Epoch: 24852, Training Loss: 0.00926
Epoch: 24852, Training Loss: 0.01139
Epoch: 24853, Training Loss: 0.00885
Epoch: 24853, Training Loss: 0.00925
Epoch: 24853, Training Loss: 0.00926
Epoch: 24853, Training Loss: 0.01139
Epoch: 24854, Training Loss: 0.00885
Epoch: 24854, Training Loss: 0.00925
Epoch: 24854, Training Loss: 0.00926
Epoch: 24854, Training Loss: 0.01139
Epoch: 24855, Training Loss: 0.00885
Epoch: 24855, Training Loss: 0.00925
Epoch: 24855, Training Loss: 0.00926
Epoch: 24855, Training Loss: 0.01139
Epoch: 24856, Training Loss: 0.00885
Epoch: 24856, Training Loss: 0.00925
Epoch: 24856, Training Loss: 0.00926
Epoch: 24856, Training Loss: 0.01139
Epoch: 24857, Training Loss: 0.00885
Epoch: 24857, Training Loss: 0.00925
Epoch: 24857, Training Loss: 0.00926
Epoch: 24857, Training Loss: 0.01139
Epoch: 24858, Training Loss: 0.00885
Epoch: 24858, Training Loss: 0.00925
Epoch: 24858, Training Loss: 0.00926
Epoch: 24858, Training Loss: 0.01139
Epoch: 24859, Training Loss: 0.00885
Epoch: 24859, Training Loss: 0.00925
Epoch: 24859, Training Loss: 0.00926
Epoch: 24859, Training Loss: 0.01139
Epoch: 24860, Training Loss: 0.00885
Epoch: 24860, Training Loss: 0.00925
Epoch: 24860, Training Loss: 0.00926
Epoch: 24860, Training Loss: 0.01139
Epoch: 24861, Training Loss: 0.00885
Epoch: 24861, Training Loss: 0.00925
Epoch: 24861, Training Loss: 0.00926
Epoch: 24861, Training Loss: 0.01139
Epoch: 24862, Training Loss: 0.00885
Epoch: 24862, Training Loss: 0.00925
Epoch: 24862, Training Loss: 0.00926
Epoch: 24862, Training Loss: 0.01139
Epoch: 24863, Training Loss: 0.00885
Epoch: 24863, Training Loss: 0.00925
Epoch: 24863, Training Loss: 0.00926
Epoch: 24863, Training Loss: 0.01139
Epoch: 24864, Training Loss: 0.00885
Epoch: 24864, Training Loss: 0.00925
Epoch: 24864, Training Loss: 0.00926
Epoch: 24864, Training Loss: 0.01139
Epoch: 24865, Training Loss: 0.00885
Epoch: 24865, Training Loss: 0.00925
Epoch: 24865, Training Loss: 0.00926
Epoch: 24865, Training Loss: 0.01139
Epoch: 24866, Training Loss: 0.00885
Epoch: 24866, Training Loss: 0.00925
Epoch: 24866, Training Loss: 0.00926
Epoch: 24866, Training Loss: 0.01139
Epoch: 24867, Training Loss: 0.00885
Epoch: 24867, Training Loss: 0.00924
Epoch: 24867, Training Loss: 0.00926
Epoch: 24867, Training Loss: 0.01139
Epoch: 24868, Training Loss: 0.00885
Epoch: 24868, Training Loss: 0.00924
Epoch: 24868, Training Loss: 0.00926
Epoch: 24868, Training Loss: 0.01139
Epoch: 24869, Training Loss: 0.00885
Epoch: 24869, Training Loss: 0.00924
Epoch: 24869, Training Loss: 0.00926
Epoch: 24869, Training Loss: 0.01139
Epoch: 24870, Training Loss: 0.00885
Epoch: 24870, Training Loss: 0.00924
Epoch: 24870, Training Loss: 0.00926
Epoch: 24870, Training Loss: 0.01138
Epoch: 24871, Training Loss: 0.00885
Epoch: 24871, Training Loss: 0.00924
Epoch: 24871, Training Loss: 0.00926
Epoch: 24871, Training Loss: 0.01138
Epoch: 24872, Training Loss: 0.00885
Epoch: 24872, Training Loss: 0.00924
Epoch: 24872, Training Loss: 0.00926
Epoch: 24872, Training Loss: 0.01138
Epoch: 24873, Training Loss: 0.00885
Epoch: 24873, Training Loss: 0.00924
Epoch: 24873, Training Loss: 0.00926
Epoch: 24873, Training Loss: 0.01138
Epoch: 24874, Training Loss: 0.00885
Epoch: 24874, Training Loss: 0.00924
Epoch: 24874, Training Loss: 0.00925
Epoch: 24874, Training Loss: 0.01138
Epoch: 24875, Training Loss: 0.00885
Epoch: 24875, Training Loss: 0.00924
Epoch: 24875, Training Loss: 0.00925
Epoch: 24875, Training Loss: 0.01138
Epoch: 24876, Training Loss: 0.00885
Epoch: 24876, Training Loss: 0.00924
Epoch: 24876, Training Loss: 0.00925
Epoch: 24876, Training Loss: 0.01138
Epoch: 24877, Training Loss: 0.00885
Epoch: 24877, Training Loss: 0.00924
Epoch: 24877, Training Loss: 0.00925
Epoch: 24877, Training Loss: 0.01138
Epoch: 24878, Training Loss: 0.00885
Epoch: 24878, Training Loss: 0.00924
Epoch: 24878, Training Loss: 0.00925
Epoch: 24878, Training Loss: 0.01138
Epoch: 24879, Training Loss: 0.00885
Epoch: 24879, Training Loss: 0.00924
Epoch: 24879, Training Loss: 0.00925
Epoch: 24879, Training Loss: 0.01138
Epoch: 24880, Training Loss: 0.00885
Epoch: 24880, Training Loss: 0.00924
Epoch: 24880, Training Loss: 0.00925
Epoch: 24880, Training Loss: 0.01138
Epoch: 24881, Training Loss: 0.00885
Epoch: 24881, Training Loss: 0.00924
Epoch: 24881, Training Loss: 0.00925
Epoch: 24881, Training Loss: 0.01138
Epoch: 24882, Training Loss: 0.00885
Epoch: 24882, Training Loss: 0.00924
Epoch: 24882, Training Loss: 0.00925
Epoch: 24882, Training Loss: 0.01138
Epoch: 24883, Training Loss: 0.00885
Epoch: 24883, Training Loss: 0.00924
Epoch: 24883, Training Loss: 0.00925
Epoch: 24883, Training Loss: 0.01138
Epoch: 24884, Training Loss: 0.00885
Epoch: 24884, Training Loss: 0.00924
Epoch: 24884, Training Loss: 0.00925
Epoch: 24884, Training Loss: 0.01138
Epoch: 24885, Training Loss: 0.00885
Epoch: 24885, Training Loss: 0.00924
Epoch: 24885, Training Loss: 0.00925
Epoch: 24885, Training Loss: 0.01138
Epoch: 24886, Training Loss: 0.00885
Epoch: 24886, Training Loss: 0.00924
Epoch: 24886, Training Loss: 0.00925
Epoch: 24886, Training Loss: 0.01138
Epoch: 24887, Training Loss: 0.00885
Epoch: 24887, Training Loss: 0.00924
Epoch: 24887, Training Loss: 0.00925
Epoch: 24887, Training Loss: 0.01138
Epoch: 24888, Training Loss: 0.00884
Epoch: 24888, Training Loss: 0.00924
Epoch: 24888, Training Loss: 0.00925
Epoch: 24888, Training Loss: 0.01138
Epoch: 24889, Training Loss: 0.00884
Epoch: 24889, Training Loss: 0.00924
Epoch: 24889, Training Loss: 0.00925
Epoch: 24889, Training Loss: 0.01138
Epoch: 24890, Training Loss: 0.00884
Epoch: 24890, Training Loss: 0.00924
Epoch: 24890, Training Loss: 0.00925
Epoch: 24890, Training Loss: 0.01138
Epoch: 24891, Training Loss: 0.00884
Epoch: 24891, Training Loss: 0.00924
Epoch: 24891, Training Loss: 0.00925
Epoch: 24891, Training Loss: 0.01138
Epoch: 24892, Training Loss: 0.00884
Epoch: 24892, Training Loss: 0.00924
Epoch: 24892, Training Loss: 0.00925
Epoch: 24892, Training Loss: 0.01138
Epoch: 24893, Training Loss: 0.00884
Epoch: 24893, Training Loss: 0.00924
Epoch: 24893, Training Loss: 0.00925
Epoch: 24893, Training Loss: 0.01138
Epoch: 24894, Training Loss: 0.00884
Epoch: 24894, Training Loss: 0.00924
Epoch: 24894, Training Loss: 0.00925
Epoch: 24894, Training Loss: 0.01138
Epoch: 24895, Training Loss: 0.00884
Epoch: 24895, Training Loss: 0.00924
Epoch: 24895, Training Loss: 0.00925
Epoch: 24895, Training Loss: 0.01138
Epoch: 24896, Training Loss: 0.00884
Epoch: 24896, Training Loss: 0.00924
Epoch: 24896, Training Loss: 0.00925
Epoch: 24896, Training Loss: 0.01138
Epoch: 24897, Training Loss: 0.00884
Epoch: 24897, Training Loss: 0.00924
Epoch: 24897, Training Loss: 0.00925
Epoch: 24897, Training Loss: 0.01138
Epoch: 24898, Training Loss: 0.00884
Epoch: 24898, Training Loss: 0.00924
Epoch: 24898, Training Loss: 0.00925
Epoch: 24898, Training Loss: 0.01138
Epoch: 24899, Training Loss: 0.00884
Epoch: 24899, Training Loss: 0.00924
Epoch: 24899, Training Loss: 0.00925
Epoch: 24899, Training Loss: 0.01138
Epoch: 24900, Training Loss: 0.00884
Epoch: 24900, Training Loss: 0.00924
Epoch: 24900, Training Loss: 0.00925
Epoch: 24900, Training Loss: 0.01138
Epoch: 24901, Training Loss: 0.00884
Epoch: 24901, Training Loss: 0.00924
Epoch: 24901, Training Loss: 0.00925
Epoch: 24901, Training Loss: 0.01138
Epoch: 24902, Training Loss: 0.00884
Epoch: 24902, Training Loss: 0.00924
Epoch: 24902, Training Loss: 0.00925
Epoch: 24902, Training Loss: 0.01138
Epoch: 24903, Training Loss: 0.00884
Epoch: 24903, Training Loss: 0.00924
Epoch: 24903, Training Loss: 0.00925
Epoch: 24903, Training Loss: 0.01138
Epoch: 24904, Training Loss: 0.00884
Epoch: 24904, Training Loss: 0.00924
Epoch: 24904, Training Loss: 0.00925
Epoch: 24904, Training Loss: 0.01138
Epoch: 24905, Training Loss: 0.00884
Epoch: 24905, Training Loss: 0.00924
Epoch: 24905, Training Loss: 0.00925
Epoch: 24905, Training Loss: 0.01138
Epoch: 24906, Training Loss: 0.00884
Epoch: 24906, Training Loss: 0.00924
Epoch: 24906, Training Loss: 0.00925
Epoch: 24906, Training Loss: 0.01138
Epoch: 24907, Training Loss: 0.00884
Epoch: 24907, Training Loss: 0.00924
Epoch: 24907, Training Loss: 0.00925
Epoch: 24907, Training Loss: 0.01138
Epoch: 24908, Training Loss: 0.00884
Epoch: 24908, Training Loss: 0.00924
Epoch: 24908, Training Loss: 0.00925
Epoch: 24908, Training Loss: 0.01137
Epoch: 24909, Training Loss: 0.00884
Epoch: 24909, Training Loss: 0.00924
Epoch: 24909, Training Loss: 0.00925
Epoch: 24909, Training Loss: 0.01137
Epoch: 24910, Training Loss: 0.00884
Epoch: 24910, Training Loss: 0.00924
Epoch: 24910, Training Loss: 0.00925
Epoch: 24910, Training Loss: 0.01137
Epoch: 24911, Training Loss: 0.00884
Epoch: 24911, Training Loss: 0.00924
Epoch: 24911, Training Loss: 0.00925
Epoch: 24911, Training Loss: 0.01137
Epoch: 24912, Training Loss: 0.00884
Epoch: 24912, Training Loss: 0.00924
Epoch: 24912, Training Loss: 0.00925
Epoch: 24912, Training Loss: 0.01137
Epoch: 24913, Training Loss: 0.00884
Epoch: 24913, Training Loss: 0.00924
Epoch: 24913, Training Loss: 0.00925
Epoch: 24913, Training Loss: 0.01137
Epoch: 24914, Training Loss: 0.00884
Epoch: 24914, Training Loss: 0.00923
Epoch: 24914, Training Loss: 0.00925
Epoch: 24914, Training Loss: 0.01137
Epoch: 24915, Training Loss: 0.00884
Epoch: 24915, Training Loss: 0.00923
Epoch: 24915, Training Loss: 0.00925
Epoch: 24915, Training Loss: 0.01137
Epoch: 24916, Training Loss: 0.00884
Epoch: 24916, Training Loss: 0.00923
Epoch: 24916, Training Loss: 0.00925
Epoch: 24917, Training Loss: 0.00884
Epoch: 24917, Training Loss: 0.00923
Epoch: 24917, Training Loss: 0.00925
Epoch: 24917, Training Loss: 0.01137
[... per-pattern loss output for epochs 24918-25159 omitted: the four XOR patterns' losses decrease only very slowly over these epochs, reaching roughly 0.00879 / 0.00918 / 0.00920 / 0.01131 by epoch 25160 ...]
Epoch: 25160, Training Loss: 0.00879
Epoch: 25160, Training Loss: 0.00918
Epoch: 25160, Training Loss: 0.00919
Epoch: 25160, Training Loss: 0.01131
Epoch: 25161, Training Loss: 0.00879
Epoch: 25161, Training Loss: 0.00918
Epoch: 25161, Training Loss: 0.00919
Epoch: 25161, Training Loss: 0.01131
Epoch: 25162, Training Loss: 0.00879
Epoch: 25162, Training Loss: 0.00918
Epoch: 25162, Training Loss: 0.00919
Epoch: 25162, Training Loss: 0.01131
Epoch: 25163, Training Loss: 0.00879
Epoch: 25163, Training Loss: 0.00918
Epoch: 25163, Training Loss: 0.00919
Epoch: 25163, Training Loss: 0.01131
Epoch: 25164, Training Loss: 0.00879
Epoch: 25164, Training Loss: 0.00918
Epoch: 25164, Training Loss: 0.00919
Epoch: 25164, Training Loss: 0.01131
Epoch: 25165, Training Loss: 0.00879
Epoch: 25165, Training Loss: 0.00918
Epoch: 25165, Training Loss: 0.00919
Epoch: 25165, Training Loss: 0.01131
Epoch: 25166, Training Loss: 0.00879
Epoch: 25166, Training Loss: 0.00918
Epoch: 25166, Training Loss: 0.00919
Epoch: 25166, Training Loss: 0.01131
Epoch: 25167, Training Loss: 0.00879
Epoch: 25167, Training Loss: 0.00918
Epoch: 25167, Training Loss: 0.00919
Epoch: 25167, Training Loss: 0.01131
Epoch: 25168, Training Loss: 0.00879
Epoch: 25168, Training Loss: 0.00918
Epoch: 25168, Training Loss: 0.00919
Epoch: 25168, Training Loss: 0.01131
Epoch: 25169, Training Loss: 0.00879
Epoch: 25169, Training Loss: 0.00918
Epoch: 25169, Training Loss: 0.00919
Epoch: 25169, Training Loss: 0.01131
Epoch: 25170, Training Loss: 0.00879
Epoch: 25170, Training Loss: 0.00918
Epoch: 25170, Training Loss: 0.00919
Epoch: 25170, Training Loss: 0.01131
Epoch: 25171, Training Loss: 0.00879
Epoch: 25171, Training Loss: 0.00918
Epoch: 25171, Training Loss: 0.00919
Epoch: 25171, Training Loss: 0.01131
Epoch: 25172, Training Loss: 0.00879
Epoch: 25172, Training Loss: 0.00918
Epoch: 25172, Training Loss: 0.00919
Epoch: 25172, Training Loss: 0.01131
Epoch: 25173, Training Loss: 0.00879
Epoch: 25173, Training Loss: 0.00918
Epoch: 25173, Training Loss: 0.00919
Epoch: 25173, Training Loss: 0.01130
Epoch: 25174, Training Loss: 0.00879
Epoch: 25174, Training Loss: 0.00918
Epoch: 25174, Training Loss: 0.00919
Epoch: 25174, Training Loss: 0.01130
Epoch: 25175, Training Loss: 0.00879
Epoch: 25175, Training Loss: 0.00918
Epoch: 25175, Training Loss: 0.00919
Epoch: 25175, Training Loss: 0.01130
Epoch: 25176, Training Loss: 0.00879
Epoch: 25176, Training Loss: 0.00918
Epoch: 25176, Training Loss: 0.00919
Epoch: 25176, Training Loss: 0.01130
Epoch: 25177, Training Loss: 0.00879
Epoch: 25177, Training Loss: 0.00918
Epoch: 25177, Training Loss: 0.00919
Epoch: 25177, Training Loss: 0.01130
Epoch: 25178, Training Loss: 0.00879
Epoch: 25178, Training Loss: 0.00918
Epoch: 25178, Training Loss: 0.00919
Epoch: 25178, Training Loss: 0.01130
Epoch: 25179, Training Loss: 0.00879
Epoch: 25179, Training Loss: 0.00918
Epoch: 25179, Training Loss: 0.00919
Epoch: 25179, Training Loss: 0.01130
Epoch: 25180, Training Loss: 0.00879
Epoch: 25180, Training Loss: 0.00918
Epoch: 25180, Training Loss: 0.00919
Epoch: 25180, Training Loss: 0.01130
Epoch: 25181, Training Loss: 0.00879
Epoch: 25181, Training Loss: 0.00918
Epoch: 25181, Training Loss: 0.00919
Epoch: 25181, Training Loss: 0.01130
Epoch: 25182, Training Loss: 0.00879
Epoch: 25182, Training Loss: 0.00918
Epoch: 25182, Training Loss: 0.00919
Epoch: 25182, Training Loss: 0.01130
Epoch: 25183, Training Loss: 0.00879
Epoch: 25183, Training Loss: 0.00918
Epoch: 25183, Training Loss: 0.00919
Epoch: 25183, Training Loss: 0.01130
Epoch: 25184, Training Loss: 0.00879
Epoch: 25184, Training Loss: 0.00918
Epoch: 25184, Training Loss: 0.00919
Epoch: 25184, Training Loss: 0.01130
Epoch: 25185, Training Loss: 0.00879
Epoch: 25185, Training Loss: 0.00918
Epoch: 25185, Training Loss: 0.00919
Epoch: 25185, Training Loss: 0.01130
Epoch: 25186, Training Loss: 0.00879
Epoch: 25186, Training Loss: 0.00918
Epoch: 25186, Training Loss: 0.00919
Epoch: 25186, Training Loss: 0.01130
Epoch: 25187, Training Loss: 0.00879
Epoch: 25187, Training Loss: 0.00918
Epoch: 25187, Training Loss: 0.00919
Epoch: 25187, Training Loss: 0.01130
Epoch: 25188, Training Loss: 0.00879
Epoch: 25188, Training Loss: 0.00918
Epoch: 25188, Training Loss: 0.00919
Epoch: 25188, Training Loss: 0.01130
Epoch: 25189, Training Loss: 0.00879
Epoch: 25189, Training Loss: 0.00918
Epoch: 25189, Training Loss: 0.00919
Epoch: 25189, Training Loss: 0.01130
Epoch: 25190, Training Loss: 0.00879
Epoch: 25190, Training Loss: 0.00918
Epoch: 25190, Training Loss: 0.00919
Epoch: 25190, Training Loss: 0.01130
Epoch: 25191, Training Loss: 0.00879
Epoch: 25191, Training Loss: 0.00918
Epoch: 25191, Training Loss: 0.00919
Epoch: 25191, Training Loss: 0.01130
Epoch: 25192, Training Loss: 0.00878
Epoch: 25192, Training Loss: 0.00918
Epoch: 25192, Training Loss: 0.00919
Epoch: 25192, Training Loss: 0.01130
Epoch: 25193, Training Loss: 0.00878
Epoch: 25193, Training Loss: 0.00918
Epoch: 25193, Training Loss: 0.00919
Epoch: 25193, Training Loss: 0.01130
Epoch: 25194, Training Loss: 0.00878
Epoch: 25194, Training Loss: 0.00918
Epoch: 25194, Training Loss: 0.00919
Epoch: 25194, Training Loss: 0.01130
Epoch: 25195, Training Loss: 0.00878
Epoch: 25195, Training Loss: 0.00918
Epoch: 25195, Training Loss: 0.00919
Epoch: 25195, Training Loss: 0.01130
Epoch: 25196, Training Loss: 0.00878
Epoch: 25196, Training Loss: 0.00918
Epoch: 25196, Training Loss: 0.00919
Epoch: 25196, Training Loss: 0.01130
Epoch: 25197, Training Loss: 0.00878
Epoch: 25197, Training Loss: 0.00918
Epoch: 25197, Training Loss: 0.00919
Epoch: 25197, Training Loss: 0.01130
Epoch: 25198, Training Loss: 0.00878
Epoch: 25198, Training Loss: 0.00918
Epoch: 25198, Training Loss: 0.00919
Epoch: 25198, Training Loss: 0.01130
Epoch: 25199, Training Loss: 0.00878
Epoch: 25199, Training Loss: 0.00917
Epoch: 25199, Training Loss: 0.00919
Epoch: 25199, Training Loss: 0.01130
Epoch: 25200, Training Loss: 0.00878
Epoch: 25200, Training Loss: 0.00917
Epoch: 25200, Training Loss: 0.00919
Epoch: 25200, Training Loss: 0.01130
Epoch: 25201, Training Loss: 0.00878
Epoch: 25201, Training Loss: 0.00917
Epoch: 25201, Training Loss: 0.00919
Epoch: 25201, Training Loss: 0.01130
Epoch: 25202, Training Loss: 0.00878
Epoch: 25202, Training Loss: 0.00917
Epoch: 25202, Training Loss: 0.00919
Epoch: 25202, Training Loss: 0.01130
Epoch: 25203, Training Loss: 0.00878
Epoch: 25203, Training Loss: 0.00917
Epoch: 25203, Training Loss: 0.00919
Epoch: 25203, Training Loss: 0.01130
Epoch: 25204, Training Loss: 0.00878
Epoch: 25204, Training Loss: 0.00917
Epoch: 25204, Training Loss: 0.00919
Epoch: 25204, Training Loss: 0.01130
Epoch: 25205, Training Loss: 0.00878
Epoch: 25205, Training Loss: 0.00917
Epoch: 25205, Training Loss: 0.00918
Epoch: 25205, Training Loss: 0.01130
Epoch: 25206, Training Loss: 0.00878
Epoch: 25206, Training Loss: 0.00917
Epoch: 25206, Training Loss: 0.00918
Epoch: 25206, Training Loss: 0.01130
Epoch: 25207, Training Loss: 0.00878
Epoch: 25207, Training Loss: 0.00917
Epoch: 25207, Training Loss: 0.00918
Epoch: 25207, Training Loss: 0.01130
Epoch: 25208, Training Loss: 0.00878
Epoch: 25208, Training Loss: 0.00917
Epoch: 25208, Training Loss: 0.00918
Epoch: 25208, Training Loss: 0.01130
Epoch: 25209, Training Loss: 0.00878
Epoch: 25209, Training Loss: 0.00917
Epoch: 25209, Training Loss: 0.00918
Epoch: 25209, Training Loss: 0.01130
Epoch: 25210, Training Loss: 0.00878
Epoch: 25210, Training Loss: 0.00917
Epoch: 25210, Training Loss: 0.00918
Epoch: 25210, Training Loss: 0.01130
Epoch: 25211, Training Loss: 0.00878
Epoch: 25211, Training Loss: 0.00917
Epoch: 25211, Training Loss: 0.00918
Epoch: 25211, Training Loss: 0.01130
Epoch: 25212, Training Loss: 0.00878
Epoch: 25212, Training Loss: 0.00917
Epoch: 25212, Training Loss: 0.00918
Epoch: 25212, Training Loss: 0.01129
Epoch: 25213, Training Loss: 0.00878
Epoch: 25213, Training Loss: 0.00917
Epoch: 25213, Training Loss: 0.00918
Epoch: 25213, Training Loss: 0.01129
Epoch: 25214, Training Loss: 0.00878
Epoch: 25214, Training Loss: 0.00917
Epoch: 25214, Training Loss: 0.00918
Epoch: 25214, Training Loss: 0.01129
Epoch: 25215, Training Loss: 0.00878
Epoch: 25215, Training Loss: 0.00917
Epoch: 25215, Training Loss: 0.00918
Epoch: 25215, Training Loss: 0.01129
Epoch: 25216, Training Loss: 0.00878
Epoch: 25216, Training Loss: 0.00917
Epoch: 25216, Training Loss: 0.00918
Epoch: 25216, Training Loss: 0.01129
Epoch: 25217, Training Loss: 0.00878
Epoch: 25217, Training Loss: 0.00917
Epoch: 25217, Training Loss: 0.00918
Epoch: 25217, Training Loss: 0.01129
Epoch: 25218, Training Loss: 0.00878
Epoch: 25218, Training Loss: 0.00917
Epoch: 25218, Training Loss: 0.00918
Epoch: 25218, Training Loss: 0.01129
Epoch: 25219, Training Loss: 0.00878
Epoch: 25219, Training Loss: 0.00917
Epoch: 25219, Training Loss: 0.00918
Epoch: 25219, Training Loss: 0.01129
Epoch: 25220, Training Loss: 0.00878
Epoch: 25220, Training Loss: 0.00917
Epoch: 25220, Training Loss: 0.00918
Epoch: 25220, Training Loss: 0.01129
Epoch: 25221, Training Loss: 0.00878
Epoch: 25221, Training Loss: 0.00917
Epoch: 25221, Training Loss: 0.00918
Epoch: 25221, Training Loss: 0.01129
Epoch: 25222, Training Loss: 0.00878
Epoch: 25222, Training Loss: 0.00917
Epoch: 25222, Training Loss: 0.00918
Epoch: 25222, Training Loss: 0.01129
Epoch: 25223, Training Loss: 0.00878
Epoch: 25223, Training Loss: 0.00917
Epoch: 25223, Training Loss: 0.00918
Epoch: 25223, Training Loss: 0.01129
Epoch: 25224, Training Loss: 0.00878
Epoch: 25224, Training Loss: 0.00917
Epoch: 25224, Training Loss: 0.00918
Epoch: 25224, Training Loss: 0.01129
Epoch: 25225, Training Loss: 0.00878
Epoch: 25225, Training Loss: 0.00917
Epoch: 25225, Training Loss: 0.00918
Epoch: 25225, Training Loss: 0.01129
Epoch: 25226, Training Loss: 0.00878
Epoch: 25226, Training Loss: 0.00917
Epoch: 25226, Training Loss: 0.00918
Epoch: 25226, Training Loss: 0.01129
Epoch: 25227, Training Loss: 0.00878
Epoch: 25227, Training Loss: 0.00917
Epoch: 25227, Training Loss: 0.00918
Epoch: 25227, Training Loss: 0.01129
Epoch: 25228, Training Loss: 0.00878
Epoch: 25228, Training Loss: 0.00917
Epoch: 25228, Training Loss: 0.00918
Epoch: 25228, Training Loss: 0.01129
Epoch: 25229, Training Loss: 0.00878
Epoch: 25229, Training Loss: 0.00917
Epoch: 25229, Training Loss: 0.00918
Epoch: 25229, Training Loss: 0.01129
Epoch: 25230, Training Loss: 0.00878
Epoch: 25230, Training Loss: 0.00917
Epoch: 25230, Training Loss: 0.00918
Epoch: 25230, Training Loss: 0.01129
Epoch: 25231, Training Loss: 0.00878
Epoch: 25231, Training Loss: 0.00917
Epoch: 25231, Training Loss: 0.00918
Epoch: 25231, Training Loss: 0.01129
Epoch: 25232, Training Loss: 0.00878
Epoch: 25232, Training Loss: 0.00917
Epoch: 25232, Training Loss: 0.00918
Epoch: 25232, Training Loss: 0.01129
Epoch: 25233, Training Loss: 0.00878
Epoch: 25233, Training Loss: 0.00917
Epoch: 25233, Training Loss: 0.00918
Epoch: 25233, Training Loss: 0.01129
Epoch: 25234, Training Loss: 0.00878
Epoch: 25234, Training Loss: 0.00917
Epoch: 25234, Training Loss: 0.00918
Epoch: 25234, Training Loss: 0.01129
Epoch: 25235, Training Loss: 0.00878
Epoch: 25235, Training Loss: 0.00917
Epoch: 25235, Training Loss: 0.00918
Epoch: 25235, Training Loss: 0.01129
Epoch: 25236, Training Loss: 0.00878
Epoch: 25236, Training Loss: 0.00917
Epoch: 25236, Training Loss: 0.00918
Epoch: 25236, Training Loss: 0.01129
Epoch: 25237, Training Loss: 0.00878
Epoch: 25237, Training Loss: 0.00917
Epoch: 25237, Training Loss: 0.00918
Epoch: 25237, Training Loss: 0.01129
Epoch: 25238, Training Loss: 0.00878
Epoch: 25238, Training Loss: 0.00917
Epoch: 25238, Training Loss: 0.00918
Epoch: 25238, Training Loss: 0.01129
Epoch: 25239, Training Loss: 0.00878
Epoch: 25239, Training Loss: 0.00917
Epoch: 25239, Training Loss: 0.00918
Epoch: 25239, Training Loss: 0.01129
Epoch: 25240, Training Loss: 0.00878
Epoch: 25240, Training Loss: 0.00917
Epoch: 25240, Training Loss: 0.00918
Epoch: 25240, Training Loss: 0.01129
Epoch: 25241, Training Loss: 0.00878
Epoch: 25241, Training Loss: 0.00917
Epoch: 25241, Training Loss: 0.00918
Epoch: 25241, Training Loss: 0.01129
Epoch: 25242, Training Loss: 0.00878
Epoch: 25242, Training Loss: 0.00917
Epoch: 25242, Training Loss: 0.00918
Epoch: 25242, Training Loss: 0.01129
Epoch: 25243, Training Loss: 0.00877
Epoch: 25243, Training Loss: 0.00917
Epoch: 25243, Training Loss: 0.00918
Epoch: 25243, Training Loss: 0.01129
Epoch: 25244, Training Loss: 0.00877
Epoch: 25244, Training Loss: 0.00917
Epoch: 25244, Training Loss: 0.00918
Epoch: 25244, Training Loss: 0.01129
Epoch: 25245, Training Loss: 0.00877
Epoch: 25245, Training Loss: 0.00917
Epoch: 25245, Training Loss: 0.00918
Epoch: 25245, Training Loss: 0.01129
Epoch: 25246, Training Loss: 0.00877
Epoch: 25246, Training Loss: 0.00916
Epoch: 25246, Training Loss: 0.00918
Epoch: 25246, Training Loss: 0.01129
Epoch: 25247, Training Loss: 0.00877
Epoch: 25247, Training Loss: 0.00916
Epoch: 25247, Training Loss: 0.00918
Epoch: 25247, Training Loss: 0.01129
Epoch: 25248, Training Loss: 0.00877
Epoch: 25248, Training Loss: 0.00916
Epoch: 25248, Training Loss: 0.00918
Epoch: 25248, Training Loss: 0.01129
Epoch: 25249, Training Loss: 0.00877
Epoch: 25249, Training Loss: 0.00916
Epoch: 25249, Training Loss: 0.00918
Epoch: 25249, Training Loss: 0.01129
Epoch: 25250, Training Loss: 0.00877
Epoch: 25250, Training Loss: 0.00916
Epoch: 25250, Training Loss: 0.00918
Epoch: 25250, Training Loss: 0.01128
Epoch: 25251, Training Loss: 0.00877
Epoch: 25251, Training Loss: 0.00916
Epoch: 25251, Training Loss: 0.00918
Epoch: 25251, Training Loss: 0.01128
Epoch: 25252, Training Loss: 0.00877
Epoch: 25252, Training Loss: 0.00916
Epoch: 25252, Training Loss: 0.00918
Epoch: 25252, Training Loss: 0.01128
Epoch: 25253, Training Loss: 0.00877
Epoch: 25253, Training Loss: 0.00916
Epoch: 25253, Training Loss: 0.00917
Epoch: 25253, Training Loss: 0.01128
Epoch: 25254, Training Loss: 0.00877
Epoch: 25254, Training Loss: 0.00916
Epoch: 25254, Training Loss: 0.00917
Epoch: 25254, Training Loss: 0.01128
Epoch: 25255, Training Loss: 0.00877
Epoch: 25255, Training Loss: 0.00916
Epoch: 25255, Training Loss: 0.00917
Epoch: 25255, Training Loss: 0.01128
Epoch: 25256, Training Loss: 0.00877
Epoch: 25256, Training Loss: 0.00916
Epoch: 25256, Training Loss: 0.00917
Epoch: 25256, Training Loss: 0.01128
Epoch: 25257, Training Loss: 0.00877
Epoch: 25257, Training Loss: 0.00916
Epoch: 25257, Training Loss: 0.00917
Epoch: 25257, Training Loss: 0.01128
Epoch: 25258, Training Loss: 0.00877
Epoch: 25258, Training Loss: 0.00916
Epoch: 25258, Training Loss: 0.00917
Epoch: 25258, Training Loss: 0.01128
Epoch: 25259, Training Loss: 0.00877
Epoch: 25259, Training Loss: 0.00916
Epoch: 25259, Training Loss: 0.00917
Epoch: 25259, Training Loss: 0.01128
Epoch: 25260, Training Loss: 0.00877
Epoch: 25260, Training Loss: 0.00916
Epoch: 25260, Training Loss: 0.00917
Epoch: 25260, Training Loss: 0.01128
Epoch: 25261, Training Loss: 0.00877
Epoch: 25261, Training Loss: 0.00916
Epoch: 25261, Training Loss: 0.00917
Epoch: 25261, Training Loss: 0.01128
Epoch: 25262, Training Loss: 0.00877
Epoch: 25262, Training Loss: 0.00916
Epoch: 25262, Training Loss: 0.00917
Epoch: 25262, Training Loss: 0.01128
Epoch: 25263, Training Loss: 0.00877
Epoch: 25263, Training Loss: 0.00916
Epoch: 25263, Training Loss: 0.00917
Epoch: 25263, Training Loss: 0.01128
Epoch: 25264, Training Loss: 0.00877
Epoch: 25264, Training Loss: 0.00916
Epoch: 25264, Training Loss: 0.00917
Epoch: 25264, Training Loss: 0.01128
Epoch: 25265, Training Loss: 0.00877
Epoch: 25265, Training Loss: 0.00916
Epoch: 25265, Training Loss: 0.00917
Epoch: 25265, Training Loss: 0.01128
Epoch: 25266, Training Loss: 0.00877
Epoch: 25266, Training Loss: 0.00916
Epoch: 25266, Training Loss: 0.00917
Epoch: 25266, Training Loss: 0.01128
Epoch: 25267, Training Loss: 0.00877
Epoch: 25267, Training Loss: 0.00916
Epoch: 25267, Training Loss: 0.00917
Epoch: 25267, Training Loss: 0.01128
Epoch: 25268, Training Loss: 0.00877
Epoch: 25268, Training Loss: 0.00916
Epoch: 25268, Training Loss: 0.00917
Epoch: 25268, Training Loss: 0.01128
Epoch: 25269, Training Loss: 0.00877
Epoch: 25269, Training Loss: 0.00916
Epoch: 25269, Training Loss: 0.00917
Epoch: 25269, Training Loss: 0.01128
Epoch: 25270, Training Loss: 0.00877
Epoch: 25270, Training Loss: 0.00916
Epoch: 25270, Training Loss: 0.00917
Epoch: 25270, Training Loss: 0.01128
Epoch: 25271, Training Loss: 0.00877
Epoch: 25271, Training Loss: 0.00916
Epoch: 25271, Training Loss: 0.00917
Epoch: 25271, Training Loss: 0.01128
Epoch: 25272, Training Loss: 0.00877
Epoch: 25272, Training Loss: 0.00916
Epoch: 25272, Training Loss: 0.00917
Epoch: 25272, Training Loss: 0.01128
Epoch: 25273, Training Loss: 0.00877
Epoch: 25273, Training Loss: 0.00916
Epoch: 25273, Training Loss: 0.00917
Epoch: 25273, Training Loss: 0.01128
Epoch: 25274, Training Loss: 0.00877
Epoch: 25274, Training Loss: 0.00916
Epoch: 25274, Training Loss: 0.00917
Epoch: 25274, Training Loss: 0.01128
Epoch: 25275, Training Loss: 0.00877
Epoch: 25275, Training Loss: 0.00916
Epoch: 25275, Training Loss: 0.00917
Epoch: 25275, Training Loss: 0.01128
Epoch: 25276, Training Loss: 0.00877
Epoch: 25276, Training Loss: 0.00916
Epoch: 25276, Training Loss: 0.00917
Epoch: 25276, Training Loss: 0.01128
Epoch: 25277, Training Loss: 0.00877
Epoch: 25277, Training Loss: 0.00916
Epoch: 25277, Training Loss: 0.00917
Epoch: 25277, Training Loss: 0.01128
Epoch: 25278, Training Loss: 0.00877
Epoch: 25278, Training Loss: 0.00916
Epoch: 25278, Training Loss: 0.00917
Epoch: 25278, Training Loss: 0.01128
Epoch: 25279, Training Loss: 0.00877
Epoch: 25279, Training Loss: 0.00916
Epoch: 25279, Training Loss: 0.00917
Epoch: 25279, Training Loss: 0.01128
Epoch: 25280, Training Loss: 0.00877
Epoch: 25280, Training Loss: 0.00916
Epoch: 25280, Training Loss: 0.00917
Epoch: 25280, Training Loss: 0.01128
Epoch: 25281, Training Loss: 0.00877
Epoch: 25281, Training Loss: 0.00916
Epoch: 25281, Training Loss: 0.00917
Epoch: 25281, Training Loss: 0.01128
Epoch: 25282, Training Loss: 0.00877
Epoch: 25282, Training Loss: 0.00916
Epoch: 25282, Training Loss: 0.00917
Epoch: 25282, Training Loss: 0.01128
Epoch: 25283, Training Loss: 0.00877
Epoch: 25283, Training Loss: 0.00916
Epoch: 25283, Training Loss: 0.00917
Epoch: 25283, Training Loss: 0.01128
Epoch: 25284, Training Loss: 0.00877
Epoch: 25284, Training Loss: 0.00916
Epoch: 25284, Training Loss: 0.00917
Epoch: 25284, Training Loss: 0.01128
Epoch: 25285, Training Loss: 0.00877
Epoch: 25285, Training Loss: 0.00916
Epoch: 25285, Training Loss: 0.00917
Epoch: 25285, Training Loss: 0.01128
Epoch: 25286, Training Loss: 0.00877
Epoch: 25286, Training Loss: 0.00916
Epoch: 25286, Training Loss: 0.00917
Epoch: 25286, Training Loss: 0.01128
Epoch: 25287, Training Loss: 0.00877
Epoch: 25287, Training Loss: 0.00916
Epoch: 25287, Training Loss: 0.00917
Epoch: 25287, Training Loss: 0.01128
Epoch: 25288, Training Loss: 0.00877
Epoch: 25288, Training Loss: 0.00916
Epoch: 25288, Training Loss: 0.00917
Epoch: 25288, Training Loss: 0.01128
Epoch: 25289, Training Loss: 0.00877
Epoch: 25289, Training Loss: 0.00916
Epoch: 25289, Training Loss: 0.00917
Epoch: 25289, Training Loss: 0.01127
Epoch: 25290, Training Loss: 0.00877
Epoch: 25290, Training Loss: 0.00916
Epoch: 25290, Training Loss: 0.00917
Epoch: 25290, Training Loss: 0.01127
Epoch: 25291, Training Loss: 0.00877
Epoch: 25291, Training Loss: 0.00916
Epoch: 25291, Training Loss: 0.00917
Epoch: 25291, Training Loss: 0.01127
Epoch: 25292, Training Loss: 0.00877
Epoch: 25292, Training Loss: 0.00916
Epoch: 25292, Training Loss: 0.00917
Epoch: 25292, Training Loss: 0.01127
Epoch: 25293, Training Loss: 0.00877
Epoch: 25293, Training Loss: 0.00916
Epoch: 25293, Training Loss: 0.00917
Epoch: 25293, Training Loss: 0.01127
Epoch: 25294, Training Loss: 0.00877
Epoch: 25294, Training Loss: 0.00916
Epoch: 25294, Training Loss: 0.00917
Epoch: 25294, Training Loss: 0.01127
Epoch: 25295, Training Loss: 0.00876
Epoch: 25295, Training Loss: 0.00915
Epoch: 25295, Training Loss: 0.00917
Epoch: 25295, Training Loss: 0.01127
Epoch: 25296, Training Loss: 0.00876
Epoch: 25296, Training Loss: 0.00915
Epoch: 25296, Training Loss: 0.00917
Epoch: 25296, Training Loss: 0.01127
Epoch: 25297, Training Loss: 0.00876
Epoch: 25297, Training Loss: 0.00915
Epoch: 25297, Training Loss: 0.00917
Epoch: 25297, Training Loss: 0.01127
Epoch: 25298, Training Loss: 0.00876
Epoch: 25298, Training Loss: 0.00915
Epoch: 25298, Training Loss: 0.00917
Epoch: 25298, Training Loss: 0.01127
Epoch: 25299, Training Loss: 0.00876
Epoch: 25299, Training Loss: 0.00915
Epoch: 25299, Training Loss: 0.00917
Epoch: 25299, Training Loss: 0.01127
Epoch: 25300, Training Loss: 0.00876
Epoch: 25300, Training Loss: 0.00915
Epoch: 25300, Training Loss: 0.00917
Epoch: 25300, Training Loss: 0.01127
Epoch: 25301, Training Loss: 0.00876
Epoch: 25301, Training Loss: 0.00915
Epoch: 25301, Training Loss: 0.00916
Epoch: 25301, Training Loss: 0.01127
Epoch: 25302, Training Loss: 0.00876
Epoch: 25302, Training Loss: 0.00915
Epoch: 25302, Training Loss: 0.00916
Epoch: 25302, Training Loss: 0.01127
Epoch: 25303, Training Loss: 0.00876
Epoch: 25303, Training Loss: 0.00915
Epoch: 25303, Training Loss: 0.00916
Epoch: 25303, Training Loss: 0.01127
Epoch: 25304, Training Loss: 0.00876
Epoch: 25304, Training Loss: 0.00915
Epoch: 25304, Training Loss: 0.00916
Epoch: 25304, Training Loss: 0.01127
Epoch: 25305, Training Loss: 0.00876
Epoch: 25305, Training Loss: 0.00915
Epoch: 25305, Training Loss: 0.00916
Epoch: 25305, Training Loss: 0.01127
Epoch: 25306, Training Loss: 0.00876
Epoch: 25306, Training Loss: 0.00915
Epoch: 25306, Training Loss: 0.00916
Epoch: 25306, Training Loss: 0.01127
Epoch: 25307, Training Loss: 0.00876
Epoch: 25307, Training Loss: 0.00915
Epoch: 25307, Training Loss: 0.00916
Epoch: 25307, Training Loss: 0.01127
Epoch: 25308, Training Loss: 0.00876
Epoch: 25308, Training Loss: 0.00915
Epoch: 25308, Training Loss: 0.00916
Epoch: 25308, Training Loss: 0.01127
Epoch: 25309, Training Loss: 0.00876
Epoch: 25309, Training Loss: 0.00915
Epoch: 25309, Training Loss: 0.00916
Epoch: 25309, Training Loss: 0.01127
Epoch: 25310, Training Loss: 0.00876
Epoch: 25310, Training Loss: 0.00915
Epoch: 25310, Training Loss: 0.00916
Epoch: 25310, Training Loss: 0.01127
Epoch: 25311, Training Loss: 0.00876
Epoch: 25311, Training Loss: 0.00915
Epoch: 25311, Training Loss: 0.00916
Epoch: 25311, Training Loss: 0.01127
Epoch: 25312, Training Loss: 0.00876
Epoch: 25312, Training Loss: 0.00915
Epoch: 25312, Training Loss: 0.00916
Epoch: 25312, Training Loss: 0.01127
Epoch: 25313, Training Loss: 0.00876
Epoch: 25313, Training Loss: 0.00915
Epoch: 25313, Training Loss: 0.00916
Epoch: 25313, Training Loss: 0.01127
Epoch: 25314, Training Loss: 0.00876
Epoch: 25314, Training Loss: 0.00915
Epoch: 25314, Training Loss: 0.00916
Epoch: 25314, Training Loss: 0.01127
Epoch: 25315, Training Loss: 0.00876
Epoch: 25315, Training Loss: 0.00915
Epoch: 25315, Training Loss: 0.00916
Epoch: 25315, Training Loss: 0.01127
Epoch: 25316, Training Loss: 0.00876
Epoch: 25316, Training Loss: 0.00915
Epoch: 25316, Training Loss: 0.00916
Epoch: 25316, Training Loss: 0.01127
Epoch: 25317, Training Loss: 0.00876
Epoch: 25317, Training Loss: 0.00915
Epoch: 25317, Training Loss: 0.00916
Epoch: 25317, Training Loss: 0.01127
Epoch: 25318, Training Loss: 0.00876
Epoch: 25318, Training Loss: 0.00915
Epoch: 25318, Training Loss: 0.00916
Epoch: 25318, Training Loss: 0.01127
Epoch: 25319, Training Loss: 0.00876
Epoch: 25319, Training Loss: 0.00915
Epoch: 25319, Training Loss: 0.00916
Epoch: 25319, Training Loss: 0.01127
Epoch: 25320, Training Loss: 0.00876
Epoch: 25320, Training Loss: 0.00915
Epoch: 25320, Training Loss: 0.00916
Epoch: 25320, Training Loss: 0.01127
Epoch: 25321, Training Loss: 0.00876
Epoch: 25321, Training Loss: 0.00915
Epoch: 25321, Training Loss: 0.00916
Epoch: 25321, Training Loss: 0.01127
Epoch: 25322, Training Loss: 0.00876
Epoch: 25322, Training Loss: 0.00915
Epoch: 25322, Training Loss: 0.00916
Epoch: 25322, Training Loss: 0.01127
Epoch: 25323, Training Loss: 0.00876
Epoch: 25323, Training Loss: 0.00915
Epoch: 25323, Training Loss: 0.00916
Epoch: 25323, Training Loss: 0.01127
Epoch: 25324, Training Loss: 0.00876
Epoch: 25324, Training Loss: 0.00915
Epoch: 25324, Training Loss: 0.00916
Epoch: 25324, Training Loss: 0.01127
Epoch: 25325, Training Loss: 0.00876
Epoch: 25325, Training Loss: 0.00915
Epoch: 25325, Training Loss: 0.00916
Epoch: 25325, Training Loss: 0.01127
Epoch: 25326, Training Loss: 0.00876
Epoch: 25326, Training Loss: 0.00915
Epoch: 25326, Training Loss: 0.00916
Epoch: 25326, Training Loss: 0.01127
Epoch: 25327, Training Loss: 0.00876
Epoch: 25327, Training Loss: 0.00915
Epoch: 25327, Training Loss: 0.00916
Epoch: 25327, Training Loss: 0.01126
Epoch: 25328, Training Loss: 0.00876
Epoch: 25328, Training Loss: 0.00915
Epoch: 25328, Training Loss: 0.00916
Epoch: 25328, Training Loss: 0.01126
Epoch: 25329, Training Loss: 0.00876
Epoch: 25329, Training Loss: 0.00915
Epoch: 25329, Training Loss: 0.00916
Epoch: 25329, Training Loss: 0.01126
Epoch: 25330, Training Loss: 0.00876
Epoch: 25330, Training Loss: 0.00915
Epoch: 25330, Training Loss: 0.00916
Epoch: 25330, Training Loss: 0.01126
Epoch: 25331, Training Loss: 0.00876
Epoch: 25331, Training Loss: 0.00915
Epoch: 25331, Training Loss: 0.00916
Epoch: 25331, Training Loss: 0.01126
Epoch: 25332, Training Loss: 0.00876
Epoch: 25332, Training Loss: 0.00915
Epoch: 25332, Training Loss: 0.00916
Epoch: 25332, Training Loss: 0.01126
Epoch: 25333, Training Loss: 0.00876
Epoch: 25333, Training Loss: 0.00915
Epoch: 25333, Training Loss: 0.00916
Epoch: 25333, Training Loss: 0.01126
Epoch: 25334, Training Loss: 0.00876
Epoch: 25334, Training Loss: 0.00915
Epoch: 25334, Training Loss: 0.00916
Epoch: 25334, Training Loss: 0.01126
Epoch: 25335, Training Loss: 0.00876
Epoch: 25335, Training Loss: 0.00915
Epoch: 25335, Training Loss: 0.00916
Epoch: 25335, Training Loss: 0.01126
Epoch: 25336, Training Loss: 0.00876
Epoch: 25336, Training Loss: 0.00915
Epoch: 25336, Training Loss: 0.00916
Epoch: 25336, Training Loss: 0.01126
Epoch: 25337, Training Loss: 0.00876
Epoch: 25337, Training Loss: 0.00915
Epoch: 25337, Training Loss: 0.00916
Epoch: 25337, Training Loss: 0.01126
Epoch: 25338, Training Loss: 0.00876
Epoch: 25338, Training Loss: 0.00915
Epoch: 25338, Training Loss: 0.00916
Epoch: 25338, Training Loss: 0.01126
Epoch: 25339, Training Loss: 0.00876
Epoch: 25339, Training Loss: 0.00915
Epoch: 25339, Training Loss: 0.00916
Epoch: 25339, Training Loss: 0.01126
Epoch: 25340, Training Loss: 0.00876
Epoch: 25340, Training Loss: 0.00915
Epoch: 25340, Training Loss: 0.00916
Epoch: 25340, Training Loss: 0.01126
Epoch: 25341, Training Loss: 0.00876
Epoch: 25341, Training Loss: 0.00915
Epoch: 25341, Training Loss: 0.00916
Epoch: 25341, Training Loss: 0.01126
Epoch: 25342, Training Loss: 0.00876
Epoch: 25342, Training Loss: 0.00915
Epoch: 25342, Training Loss: 0.00916
Epoch: 25342, Training Loss: 0.01126
Epoch: 25343, Training Loss: 0.00876
Epoch: 25343, Training Loss: 0.00914
Epoch: 25343, Training Loss: 0.00916
Epoch: 25343, Training Loss: 0.01126
Epoch: 25344, Training Loss: 0.00876
Epoch: 25344, Training Loss: 0.00914
Epoch: 25344, Training Loss: 0.00916
Epoch: 25344, Training Loss: 0.01126
Epoch: 25345, Training Loss: 0.00876
Epoch: 25345, Training Loss: 0.00914
Epoch: 25345, Training Loss: 0.00916
Epoch: 25345, Training Loss: 0.01126
Epoch: 25346, Training Loss: 0.00876
Epoch: 25346, Training Loss: 0.00914
Epoch: 25346, Training Loss: 0.00916
Epoch: 25346, Training Loss: 0.01126
Epoch: 25347, Training Loss: 0.00875
Epoch: 25347, Training Loss: 0.00914
Epoch: 25347, Training Loss: 0.00916
Epoch: 25347, Training Loss: 0.01126
Epoch: 25348, Training Loss: 0.00875
Epoch: 25348, Training Loss: 0.00914
Epoch: 25348, Training Loss: 0.00916
Epoch: 25348, Training Loss: 0.01126
Epoch: 25349, Training Loss: 0.00875
Epoch: 25349, Training Loss: 0.00875
Epoch: 25349, Training Loss: 0.00914
Epoch: 25349, Training Loss: 0.00915
Epoch: 25349, Training Loss: 0.01126
... [output truncated: the four per-sample losses are printed every epoch; from epoch 25349 to 25592 they decrease only slightly, to roughly 0.00871, 0.00909, 0.00911, and 0.01120] ...
Epoch: 25592, Training Loss: 0.00871
Epoch: 25592, Training Loss: 0.00909
Epoch: 25592, Training Loss: 0.00911
Epoch: 25592, Training Loss: 0.01120
Epoch: 25593, Training Loss: 0.00871
Epoch: 25593, Training Loss: 0.00909
Epoch: 25593, Training Loss: 0.00910
Epoch: 25593, Training Loss: 0.01120
Epoch: 25594, Training Loss: 0.00871
Epoch: 25594, Training Loss: 0.00909
Epoch: 25594, Training Loss: 0.00910
Epoch: 25594, Training Loss: 0.01120
Epoch: 25595, Training Loss: 0.00871
Epoch: 25595, Training Loss: 0.00909
Epoch: 25595, Training Loss: 0.00910
Epoch: 25595, Training Loss: 0.01120
Epoch: 25596, Training Loss: 0.00871
Epoch: 25596, Training Loss: 0.00909
Epoch: 25596, Training Loss: 0.00910
Epoch: 25596, Training Loss: 0.01120
Epoch: 25597, Training Loss: 0.00871
Epoch: 25597, Training Loss: 0.00909
Epoch: 25597, Training Loss: 0.00910
Epoch: 25597, Training Loss: 0.01120
Epoch: 25598, Training Loss: 0.00871
Epoch: 25598, Training Loss: 0.00909
Epoch: 25598, Training Loss: 0.00910
Epoch: 25598, Training Loss: 0.01120
Epoch: 25599, Training Loss: 0.00871
Epoch: 25599, Training Loss: 0.00909
Epoch: 25599, Training Loss: 0.00910
Epoch: 25599, Training Loss: 0.01120
Epoch: 25600, Training Loss: 0.00871
Epoch: 25600, Training Loss: 0.00909
Epoch: 25600, Training Loss: 0.00910
Epoch: 25600, Training Loss: 0.01120
Epoch: 25601, Training Loss: 0.00871
Epoch: 25601, Training Loss: 0.00909
Epoch: 25601, Training Loss: 0.00910
Epoch: 25601, Training Loss: 0.01119
Epoch: 25602, Training Loss: 0.00871
Epoch: 25602, Training Loss: 0.00909
Epoch: 25602, Training Loss: 0.00910
Epoch: 25602, Training Loss: 0.01119
Epoch: 25603, Training Loss: 0.00871
Epoch: 25603, Training Loss: 0.00909
Epoch: 25603, Training Loss: 0.00910
Epoch: 25603, Training Loss: 0.01119
Epoch: 25604, Training Loss: 0.00871
Epoch: 25604, Training Loss: 0.00909
Epoch: 25604, Training Loss: 0.00910
Epoch: 25604, Training Loss: 0.01119
Epoch: 25605, Training Loss: 0.00871
Epoch: 25605, Training Loss: 0.00909
Epoch: 25605, Training Loss: 0.00910
Epoch: 25605, Training Loss: 0.01119
Epoch: 25606, Training Loss: 0.00871
Epoch: 25606, Training Loss: 0.00909
Epoch: 25606, Training Loss: 0.00910
Epoch: 25606, Training Loss: 0.01119
Epoch: 25607, Training Loss: 0.00871
Epoch: 25607, Training Loss: 0.00909
Epoch: 25607, Training Loss: 0.00910
Epoch: 25607, Training Loss: 0.01119
Epoch: 25608, Training Loss: 0.00870
Epoch: 25608, Training Loss: 0.00909
Epoch: 25608, Training Loss: 0.00910
Epoch: 25608, Training Loss: 0.01119
Epoch: 25609, Training Loss: 0.00870
Epoch: 25609, Training Loss: 0.00909
Epoch: 25609, Training Loss: 0.00910
Epoch: 25609, Training Loss: 0.01119
Epoch: 25610, Training Loss: 0.00870
Epoch: 25610, Training Loss: 0.00909
Epoch: 25610, Training Loss: 0.00910
Epoch: 25610, Training Loss: 0.01119
Epoch: 25611, Training Loss: 0.00870
Epoch: 25611, Training Loss: 0.00909
Epoch: 25611, Training Loss: 0.00910
Epoch: 25611, Training Loss: 0.01119
Epoch: 25612, Training Loss: 0.00870
Epoch: 25612, Training Loss: 0.00909
Epoch: 25612, Training Loss: 0.00910
Epoch: 25612, Training Loss: 0.01119
Epoch: 25613, Training Loss: 0.00870
Epoch: 25613, Training Loss: 0.00909
Epoch: 25613, Training Loss: 0.00910
Epoch: 25613, Training Loss: 0.01119
Epoch: 25614, Training Loss: 0.00870
Epoch: 25614, Training Loss: 0.00909
Epoch: 25614, Training Loss: 0.00910
Epoch: 25614, Training Loss: 0.01119
Epoch: 25615, Training Loss: 0.00870
Epoch: 25615, Training Loss: 0.00909
Epoch: 25615, Training Loss: 0.00910
Epoch: 25615, Training Loss: 0.01119
Epoch: 25616, Training Loss: 0.00870
Epoch: 25616, Training Loss: 0.00909
Epoch: 25616, Training Loss: 0.00910
Epoch: 25616, Training Loss: 0.01119
Epoch: 25617, Training Loss: 0.00870
Epoch: 25617, Training Loss: 0.00909
Epoch: 25617, Training Loss: 0.00910
Epoch: 25617, Training Loss: 0.01119
Epoch: 25618, Training Loss: 0.00870
Epoch: 25618, Training Loss: 0.00909
Epoch: 25618, Training Loss: 0.00910
Epoch: 25618, Training Loss: 0.01119
Epoch: 25619, Training Loss: 0.00870
Epoch: 25619, Training Loss: 0.00909
Epoch: 25619, Training Loss: 0.00910
Epoch: 25619, Training Loss: 0.01119
Epoch: 25620, Training Loss: 0.00870
Epoch: 25620, Training Loss: 0.00909
Epoch: 25620, Training Loss: 0.00910
Epoch: 25620, Training Loss: 0.01119
Epoch: 25621, Training Loss: 0.00870
Epoch: 25621, Training Loss: 0.00909
Epoch: 25621, Training Loss: 0.00910
Epoch: 25621, Training Loss: 0.01119
Epoch: 25622, Training Loss: 0.00870
Epoch: 25622, Training Loss: 0.00909
Epoch: 25622, Training Loss: 0.00910
Epoch: 25622, Training Loss: 0.01119
Epoch: 25623, Training Loss: 0.00870
Epoch: 25623, Training Loss: 0.00909
Epoch: 25623, Training Loss: 0.00910
Epoch: 25623, Training Loss: 0.01119
Epoch: 25624, Training Loss: 0.00870
Epoch: 25624, Training Loss: 0.00909
Epoch: 25624, Training Loss: 0.00910
Epoch: 25624, Training Loss: 0.01119
Epoch: 25625, Training Loss: 0.00870
Epoch: 25625, Training Loss: 0.00909
Epoch: 25625, Training Loss: 0.00910
Epoch: 25625, Training Loss: 0.01119
Epoch: 25626, Training Loss: 0.00870
Epoch: 25626, Training Loss: 0.00909
Epoch: 25626, Training Loss: 0.00910
Epoch: 25626, Training Loss: 0.01119
Epoch: 25627, Training Loss: 0.00870
Epoch: 25627, Training Loss: 0.00909
Epoch: 25627, Training Loss: 0.00910
Epoch: 25627, Training Loss: 0.01119
Epoch: 25628, Training Loss: 0.00870
Epoch: 25628, Training Loss: 0.00909
Epoch: 25628, Training Loss: 0.00910
Epoch: 25628, Training Loss: 0.01119
Epoch: 25629, Training Loss: 0.00870
Epoch: 25629, Training Loss: 0.00909
Epoch: 25629, Training Loss: 0.00910
Epoch: 25629, Training Loss: 0.01119
Epoch: 25630, Training Loss: 0.00870
Epoch: 25630, Training Loss: 0.00909
Epoch: 25630, Training Loss: 0.00910
Epoch: 25630, Training Loss: 0.01119
Epoch: 25631, Training Loss: 0.00870
Epoch: 25631, Training Loss: 0.00909
Epoch: 25631, Training Loss: 0.00910
Epoch: 25631, Training Loss: 0.01119
Epoch: 25632, Training Loss: 0.00870
Epoch: 25632, Training Loss: 0.00909
Epoch: 25632, Training Loss: 0.00910
Epoch: 25632, Training Loss: 0.01119
Epoch: 25633, Training Loss: 0.00870
Epoch: 25633, Training Loss: 0.00909
Epoch: 25633, Training Loss: 0.00910
Epoch: 25633, Training Loss: 0.01119
Epoch: 25634, Training Loss: 0.00870
Epoch: 25634, Training Loss: 0.00909
Epoch: 25634, Training Loss: 0.00910
Epoch: 25634, Training Loss: 0.01119
Epoch: 25635, Training Loss: 0.00870
Epoch: 25635, Training Loss: 0.00909
Epoch: 25635, Training Loss: 0.00910
Epoch: 25635, Training Loss: 0.01119
Epoch: 25636, Training Loss: 0.00870
Epoch: 25636, Training Loss: 0.00908
Epoch: 25636, Training Loss: 0.00910
Epoch: 25636, Training Loss: 0.01119
Epoch: 25637, Training Loss: 0.00870
Epoch: 25637, Training Loss: 0.00908
Epoch: 25637, Training Loss: 0.00910
Epoch: 25637, Training Loss: 0.01119
Epoch: 25638, Training Loss: 0.00870
Epoch: 25638, Training Loss: 0.00908
Epoch: 25638, Training Loss: 0.00910
Epoch: 25638, Training Loss: 0.01119
Epoch: 25639, Training Loss: 0.00870
Epoch: 25639, Training Loss: 0.00908
Epoch: 25639, Training Loss: 0.00910
Epoch: 25639, Training Loss: 0.01119
Epoch: 25640, Training Loss: 0.00870
Epoch: 25640, Training Loss: 0.00908
Epoch: 25640, Training Loss: 0.00910
Epoch: 25640, Training Loss: 0.01118
Epoch: 25641, Training Loss: 0.00870
Epoch: 25641, Training Loss: 0.00908
Epoch: 25641, Training Loss: 0.00910
Epoch: 25641, Training Loss: 0.01118
Epoch: 25642, Training Loss: 0.00870
Epoch: 25642, Training Loss: 0.00908
Epoch: 25642, Training Loss: 0.00909
Epoch: 25642, Training Loss: 0.01118
Epoch: 25643, Training Loss: 0.00870
Epoch: 25643, Training Loss: 0.00908
Epoch: 25643, Training Loss: 0.00909
Epoch: 25643, Training Loss: 0.01118
Epoch: 25644, Training Loss: 0.00870
Epoch: 25644, Training Loss: 0.00908
Epoch: 25644, Training Loss: 0.00909
Epoch: 25644, Training Loss: 0.01118
Epoch: 25645, Training Loss: 0.00870
Epoch: 25645, Training Loss: 0.00908
Epoch: 25645, Training Loss: 0.00909
Epoch: 25645, Training Loss: 0.01118
Epoch: 25646, Training Loss: 0.00870
Epoch: 25646, Training Loss: 0.00908
Epoch: 25646, Training Loss: 0.00909
Epoch: 25646, Training Loss: 0.01118
Epoch: 25647, Training Loss: 0.00870
Epoch: 25647, Training Loss: 0.00908
Epoch: 25647, Training Loss: 0.00909
Epoch: 25647, Training Loss: 0.01118
Epoch: 25648, Training Loss: 0.00870
Epoch: 25648, Training Loss: 0.00908
Epoch: 25648, Training Loss: 0.00909
Epoch: 25648, Training Loss: 0.01118
Epoch: 25649, Training Loss: 0.00870
Epoch: 25649, Training Loss: 0.00908
Epoch: 25649, Training Loss: 0.00909
Epoch: 25649, Training Loss: 0.01118
Epoch: 25650, Training Loss: 0.00870
Epoch: 25650, Training Loss: 0.00908
Epoch: 25650, Training Loss: 0.00909
Epoch: 25650, Training Loss: 0.01118
Epoch: 25651, Training Loss: 0.00870
Epoch: 25651, Training Loss: 0.00908
Epoch: 25651, Training Loss: 0.00909
Epoch: 25651, Training Loss: 0.01118
Epoch: 25652, Training Loss: 0.00870
Epoch: 25652, Training Loss: 0.00908
Epoch: 25652, Training Loss: 0.00909
Epoch: 25652, Training Loss: 0.01118
Epoch: 25653, Training Loss: 0.00870
Epoch: 25653, Training Loss: 0.00908
Epoch: 25653, Training Loss: 0.00909
Epoch: 25653, Training Loss: 0.01118
Epoch: 25654, Training Loss: 0.00870
Epoch: 25654, Training Loss: 0.00908
Epoch: 25654, Training Loss: 0.00909
Epoch: 25654, Training Loss: 0.01118
Epoch: 25655, Training Loss: 0.00870
Epoch: 25655, Training Loss: 0.00908
Epoch: 25655, Training Loss: 0.00909
Epoch: 25655, Training Loss: 0.01118
Epoch: 25656, Training Loss: 0.00870
Epoch: 25656, Training Loss: 0.00908
Epoch: 25656, Training Loss: 0.00909
Epoch: 25656, Training Loss: 0.01118
Epoch: 25657, Training Loss: 0.00870
Epoch: 25657, Training Loss: 0.00908
Epoch: 25657, Training Loss: 0.00909
Epoch: 25657, Training Loss: 0.01118
Epoch: 25658, Training Loss: 0.00870
Epoch: 25658, Training Loss: 0.00908
Epoch: 25658, Training Loss: 0.00909
Epoch: 25658, Training Loss: 0.01118
Epoch: 25659, Training Loss: 0.00870
Epoch: 25659, Training Loss: 0.00908
Epoch: 25659, Training Loss: 0.00909
Epoch: 25659, Training Loss: 0.01118
Epoch: 25660, Training Loss: 0.00869
Epoch: 25660, Training Loss: 0.00908
Epoch: 25660, Training Loss: 0.00909
Epoch: 25660, Training Loss: 0.01118
Epoch: 25661, Training Loss: 0.00869
Epoch: 25661, Training Loss: 0.00908
Epoch: 25661, Training Loss: 0.00909
Epoch: 25661, Training Loss: 0.01118
Epoch: 25662, Training Loss: 0.00869
Epoch: 25662, Training Loss: 0.00908
Epoch: 25662, Training Loss: 0.00909
Epoch: 25662, Training Loss: 0.01118
Epoch: 25663, Training Loss: 0.00869
Epoch: 25663, Training Loss: 0.00908
Epoch: 25663, Training Loss: 0.00909
Epoch: 25663, Training Loss: 0.01118
Epoch: 25664, Training Loss: 0.00869
Epoch: 25664, Training Loss: 0.00908
Epoch: 25664, Training Loss: 0.00909
Epoch: 25664, Training Loss: 0.01118
Epoch: 25665, Training Loss: 0.00869
Epoch: 25665, Training Loss: 0.00908
Epoch: 25665, Training Loss: 0.00909
Epoch: 25665, Training Loss: 0.01118
Epoch: 25666, Training Loss: 0.00869
Epoch: 25666, Training Loss: 0.00908
Epoch: 25666, Training Loss: 0.00909
Epoch: 25666, Training Loss: 0.01118
Epoch: 25667, Training Loss: 0.00869
Epoch: 25667, Training Loss: 0.00908
Epoch: 25667, Training Loss: 0.00909
Epoch: 25667, Training Loss: 0.01118
Epoch: 25668, Training Loss: 0.00869
Epoch: 25668, Training Loss: 0.00908
Epoch: 25668, Training Loss: 0.00909
Epoch: 25668, Training Loss: 0.01118
Epoch: 25669, Training Loss: 0.00869
Epoch: 25669, Training Loss: 0.00908
Epoch: 25669, Training Loss: 0.00909
Epoch: 25669, Training Loss: 0.01118
Epoch: 25670, Training Loss: 0.00869
Epoch: 25670, Training Loss: 0.00908
Epoch: 25670, Training Loss: 0.00909
Epoch: 25670, Training Loss: 0.01118
Epoch: 25671, Training Loss: 0.00869
Epoch: 25671, Training Loss: 0.00908
Epoch: 25671, Training Loss: 0.00909
Epoch: 25671, Training Loss: 0.01118
Epoch: 25672, Training Loss: 0.00869
Epoch: 25672, Training Loss: 0.00908
Epoch: 25672, Training Loss: 0.00909
Epoch: 25672, Training Loss: 0.01118
Epoch: 25673, Training Loss: 0.00869
Epoch: 25673, Training Loss: 0.00908
Epoch: 25673, Training Loss: 0.00909
Epoch: 25673, Training Loss: 0.01118
Epoch: 25674, Training Loss: 0.00869
Epoch: 25674, Training Loss: 0.00908
Epoch: 25674, Training Loss: 0.00909
Epoch: 25674, Training Loss: 0.01118
Epoch: 25675, Training Loss: 0.00869
Epoch: 25675, Training Loss: 0.00908
Epoch: 25675, Training Loss: 0.00909
Epoch: 25675, Training Loss: 0.01118
Epoch: 25676, Training Loss: 0.00869
Epoch: 25676, Training Loss: 0.00908
Epoch: 25676, Training Loss: 0.00909
Epoch: 25676, Training Loss: 0.01118
Epoch: 25677, Training Loss: 0.00869
Epoch: 25677, Training Loss: 0.00908
Epoch: 25677, Training Loss: 0.00909
Epoch: 25677, Training Loss: 0.01118
Epoch: 25678, Training Loss: 0.00869
Epoch: 25678, Training Loss: 0.00908
Epoch: 25678, Training Loss: 0.00909
Epoch: 25678, Training Loss: 0.01118
Epoch: 25679, Training Loss: 0.00869
Epoch: 25679, Training Loss: 0.00908
Epoch: 25679, Training Loss: 0.00909
Epoch: 25679, Training Loss: 0.01118
Epoch: 25680, Training Loss: 0.00869
Epoch: 25680, Training Loss: 0.00908
Epoch: 25680, Training Loss: 0.00909
Epoch: 25680, Training Loss: 0.01117
Epoch: 25681, Training Loss: 0.00869
Epoch: 25681, Training Loss: 0.00908
Epoch: 25681, Training Loss: 0.00909
Epoch: 25681, Training Loss: 0.01117
Epoch: 25682, Training Loss: 0.00869
Epoch: 25682, Training Loss: 0.00908
Epoch: 25682, Training Loss: 0.00909
Epoch: 25682, Training Loss: 0.01117
Epoch: 25683, Training Loss: 0.00869
Epoch: 25683, Training Loss: 0.00908
Epoch: 25683, Training Loss: 0.00909
Epoch: 25683, Training Loss: 0.01117
Epoch: 25684, Training Loss: 0.00869
Epoch: 25684, Training Loss: 0.00908
Epoch: 25684, Training Loss: 0.00909
Epoch: 25684, Training Loss: 0.01117
Epoch: 25685, Training Loss: 0.00869
Epoch: 25685, Training Loss: 0.00907
Epoch: 25685, Training Loss: 0.00909
Epoch: 25685, Training Loss: 0.01117
Epoch: 25686, Training Loss: 0.00869
Epoch: 25686, Training Loss: 0.00907
Epoch: 25686, Training Loss: 0.00909
Epoch: 25686, Training Loss: 0.01117
Epoch: 25687, Training Loss: 0.00869
Epoch: 25687, Training Loss: 0.00907
Epoch: 25687, Training Loss: 0.00909
Epoch: 25687, Training Loss: 0.01117
Epoch: 25688, Training Loss: 0.00869
Epoch: 25688, Training Loss: 0.00907
Epoch: 25688, Training Loss: 0.00909
Epoch: 25688, Training Loss: 0.01117
Epoch: 25689, Training Loss: 0.00869
Epoch: 25689, Training Loss: 0.00907
Epoch: 25689, Training Loss: 0.00909
Epoch: 25689, Training Loss: 0.01117
Epoch: 25690, Training Loss: 0.00869
Epoch: 25690, Training Loss: 0.00907
Epoch: 25690, Training Loss: 0.00909
Epoch: 25690, Training Loss: 0.01117
Epoch: 25691, Training Loss: 0.00869
Epoch: 25691, Training Loss: 0.00907
Epoch: 25691, Training Loss: 0.00908
Epoch: 25691, Training Loss: 0.01117
Epoch: 25692, Training Loss: 0.00869
Epoch: 25692, Training Loss: 0.00907
Epoch: 25692, Training Loss: 0.00908
Epoch: 25692, Training Loss: 0.01117
Epoch: 25693, Training Loss: 0.00869
Epoch: 25693, Training Loss: 0.00907
Epoch: 25693, Training Loss: 0.00908
Epoch: 25693, Training Loss: 0.01117
Epoch: 25694, Training Loss: 0.00869
Epoch: 25694, Training Loss: 0.00907
Epoch: 25694, Training Loss: 0.00908
Epoch: 25694, Training Loss: 0.01117
Epoch: 25695, Training Loss: 0.00869
Epoch: 25695, Training Loss: 0.00907
Epoch: 25695, Training Loss: 0.00908
Epoch: 25695, Training Loss: 0.01117
Epoch: 25696, Training Loss: 0.00869
Epoch: 25696, Training Loss: 0.00907
Epoch: 25696, Training Loss: 0.00908
Epoch: 25696, Training Loss: 0.01117
Epoch: 25697, Training Loss: 0.00869
Epoch: 25697, Training Loss: 0.00907
Epoch: 25697, Training Loss: 0.00908
Epoch: 25697, Training Loss: 0.01117
Epoch: 25698, Training Loss: 0.00869
Epoch: 25698, Training Loss: 0.00907
Epoch: 25698, Training Loss: 0.00908
Epoch: 25698, Training Loss: 0.01117
Epoch: 25699, Training Loss: 0.00869
Epoch: 25699, Training Loss: 0.00907
Epoch: 25699, Training Loss: 0.00908
Epoch: 25699, Training Loss: 0.01117
Epoch: 25700, Training Loss: 0.00869
Epoch: 25700, Training Loss: 0.00907
Epoch: 25700, Training Loss: 0.00908
Epoch: 25700, Training Loss: 0.01117
Epoch: 25701, Training Loss: 0.00869
Epoch: 25701, Training Loss: 0.00907
Epoch: 25701, Training Loss: 0.00908
Epoch: 25701, Training Loss: 0.01117
Epoch: 25702, Training Loss: 0.00869
Epoch: 25702, Training Loss: 0.00907
Epoch: 25702, Training Loss: 0.00908
Epoch: 25702, Training Loss: 0.01117
Epoch: 25703, Training Loss: 0.00869
Epoch: 25703, Training Loss: 0.00907
Epoch: 25703, Training Loss: 0.00908
Epoch: 25703, Training Loss: 0.01117
Epoch: 25704, Training Loss: 0.00869
Epoch: 25704, Training Loss: 0.00907
Epoch: 25704, Training Loss: 0.00908
Epoch: 25704, Training Loss: 0.01117
Epoch: 25705, Training Loss: 0.00869
Epoch: 25705, Training Loss: 0.00907
Epoch: 25705, Training Loss: 0.00908
Epoch: 25705, Training Loss: 0.01117
Epoch: 25706, Training Loss: 0.00869
Epoch: 25706, Training Loss: 0.00907
Epoch: 25706, Training Loss: 0.00908
Epoch: 25706, Training Loss: 0.01117
Epoch: 25707, Training Loss: 0.00869
Epoch: 25707, Training Loss: 0.00907
Epoch: 25707, Training Loss: 0.00908
Epoch: 25707, Training Loss: 0.01117
Epoch: 25708, Training Loss: 0.00869
Epoch: 25708, Training Loss: 0.00907
Epoch: 25708, Training Loss: 0.00908
Epoch: 25708, Training Loss: 0.01117
Epoch: 25709, Training Loss: 0.00869
Epoch: 25709, Training Loss: 0.00907
Epoch: 25709, Training Loss: 0.00908
Epoch: 25709, Training Loss: 0.01117
Epoch: 25710, Training Loss: 0.00869
Epoch: 25710, Training Loss: 0.00907
Epoch: 25710, Training Loss: 0.00908
Epoch: 25710, Training Loss: 0.01117
Epoch: 25711, Training Loss: 0.00869
Epoch: 25711, Training Loss: 0.00907
Epoch: 25711, Training Loss: 0.00908
Epoch: 25711, Training Loss: 0.01117
Epoch: 25712, Training Loss: 0.00869
Epoch: 25712, Training Loss: 0.00907
Epoch: 25712, Training Loss: 0.00908
Epoch: 25712, Training Loss: 0.01117
Epoch: 25713, Training Loss: 0.00868
Epoch: 25713, Training Loss: 0.00907
Epoch: 25713, Training Loss: 0.00908
Epoch: 25713, Training Loss: 0.01117
Epoch: 25714, Training Loss: 0.00868
Epoch: 25714, Training Loss: 0.00907
Epoch: 25714, Training Loss: 0.00908
Epoch: 25714, Training Loss: 0.01117
Epoch: 25715, Training Loss: 0.00868
Epoch: 25715, Training Loss: 0.00907
Epoch: 25715, Training Loss: 0.00908
Epoch: 25715, Training Loss: 0.01117
Epoch: 25716, Training Loss: 0.00868
Epoch: 25716, Training Loss: 0.00907
Epoch: 25716, Training Loss: 0.00908
Epoch: 25716, Training Loss: 0.01117
Epoch: 25717, Training Loss: 0.00868
Epoch: 25717, Training Loss: 0.00907
Epoch: 25717, Training Loss: 0.00908
Epoch: 25717, Training Loss: 0.01117
Epoch: 25718, Training Loss: 0.00868
Epoch: 25718, Training Loss: 0.00907
Epoch: 25718, Training Loss: 0.00908
Epoch: 25718, Training Loss: 0.01117
Epoch: 25719, Training Loss: 0.00868
Epoch: 25719, Training Loss: 0.00907
Epoch: 25719, Training Loss: 0.00908
Epoch: 25719, Training Loss: 0.01117
Epoch: 25720, Training Loss: 0.00868
Epoch: 25720, Training Loss: 0.00907
Epoch: 25720, Training Loss: 0.00908
Epoch: 25720, Training Loss: 0.01116
Epoch: 25721, Training Loss: 0.00868
Epoch: 25721, Training Loss: 0.00907
Epoch: 25721, Training Loss: 0.00908
Epoch: 25721, Training Loss: 0.01116
Epoch: 25722, Training Loss: 0.00868
Epoch: 25722, Training Loss: 0.00907
Epoch: 25722, Training Loss: 0.00908
Epoch: 25722, Training Loss: 0.01116
Epoch: 25723, Training Loss: 0.00868
Epoch: 25723, Training Loss: 0.00907
Epoch: 25723, Training Loss: 0.00908
Epoch: 25723, Training Loss: 0.01116
Epoch: 25724, Training Loss: 0.00868
Epoch: 25724, Training Loss: 0.00907
Epoch: 25724, Training Loss: 0.00908
Epoch: 25724, Training Loss: 0.01116
Epoch: 25725, Training Loss: 0.00868
Epoch: 25725, Training Loss: 0.00907
Epoch: 25725, Training Loss: 0.00908
Epoch: 25725, Training Loss: 0.01116
Epoch: 25726, Training Loss: 0.00868
Epoch: 25726, Training Loss: 0.00907
Epoch: 25726, Training Loss: 0.00908
Epoch: 25726, Training Loss: 0.01116
Epoch: 25727, Training Loss: 0.00868
Epoch: 25727, Training Loss: 0.00907
Epoch: 25727, Training Loss: 0.00908
Epoch: 25727, Training Loss: 0.01116
Epoch: 25728, Training Loss: 0.00868
Epoch: 25728, Training Loss: 0.00907
Epoch: 25728, Training Loss: 0.00908
Epoch: 25728, Training Loss: 0.01116
Epoch: 25729, Training Loss: 0.00868
Epoch: 25729, Training Loss: 0.00907
Epoch: 25729, Training Loss: 0.00908
Epoch: 25729, Training Loss: 0.01116
Epoch: 25730, Training Loss: 0.00868
Epoch: 25730, Training Loss: 0.00907
Epoch: 25730, Training Loss: 0.00908
Epoch: 25730, Training Loss: 0.01116
Epoch: 25731, Training Loss: 0.00868
Epoch: 25731, Training Loss: 0.00907
Epoch: 25731, Training Loss: 0.00908
Epoch: 25731, Training Loss: 0.01116
Epoch: 25732, Training Loss: 0.00868
Epoch: 25732, Training Loss: 0.00907
Epoch: 25732, Training Loss: 0.00908
Epoch: 25732, Training Loss: 0.01116
Epoch: 25733, Training Loss: 0.00868
Epoch: 25733, Training Loss: 0.00907
Epoch: 25733, Training Loss: 0.00908
Epoch: 25733, Training Loss: 0.01116
Epoch: 25734, Training Loss: 0.00868
Epoch: 25734, Training Loss: 0.00907
Epoch: 25734, Training Loss: 0.00908
Epoch: 25734, Training Loss: 0.01116
Epoch: 25735, Training Loss: 0.00868
Epoch: 25735, Training Loss: 0.00906
Epoch: 25735, Training Loss: 0.00908
Epoch: 25735, Training Loss: 0.01116
Epoch: 25736, Training Loss: 0.00868
Epoch: 25736, Training Loss: 0.00906
Epoch: 25736, Training Loss: 0.00908
Epoch: 25736, Training Loss: 0.01116
Epoch: 25737, Training Loss: 0.00868
Epoch: 25737, Training Loss: 0.00906
Epoch: 25737, Training Loss: 0.00908
Epoch: 25737, Training Loss: 0.01116
Epoch: 25738, Training Loss: 0.00868
Epoch: 25738, Training Loss: 0.00906
Epoch: 25738, Training Loss: 0.00908
Epoch: 25738, Training Loss: 0.01116
Epoch: 25739, Training Loss: 0.00868
Epoch: 25739, Training Loss: 0.00906
Epoch: 25739, Training Loss: 0.00908
Epoch: 25739, Training Loss: 0.01116
Epoch: 25740, Training Loss: 0.00868
Epoch: 25740, Training Loss: 0.00906
Epoch: 25740, Training Loss: 0.00908
Epoch: 25740, Training Loss: 0.01116
Epoch: 25741, Training Loss: 0.00868
Epoch: 25741, Training Loss: 0.00906
Epoch: 25741, Training Loss: 0.00907
Epoch: 25741, Training Loss: 0.01116
Epoch: 25742, Training Loss: 0.00868
Epoch: 25742, Training Loss: 0.00906
Epoch: 25742, Training Loss: 0.00907
Epoch: 25742, Training Loss: 0.01116
Epoch: 25743, Training Loss: 0.00868
Epoch: 25743, Training Loss: 0.00906
Epoch: 25743, Training Loss: 0.00907
Epoch: 25743, Training Loss: 0.01116
Epoch: 25744, Training Loss: 0.00868
Epoch: 25744, Training Loss: 0.00906
Epoch: 25744, Training Loss: 0.00907
Epoch: 25744, Training Loss: 0.01116
Epoch: 25745, Training Loss: 0.00868
Epoch: 25745, Training Loss: 0.00906
Epoch: 25745, Training Loss: 0.00907
Epoch: 25745, Training Loss: 0.01116
Epoch: 25746, Training Loss: 0.00868
Epoch: 25746, Training Loss: 0.00906
Epoch: 25746, Training Loss: 0.00907
Epoch: 25746, Training Loss: 0.01116
Epoch: 25747, Training Loss: 0.00868
Epoch: 25747, Training Loss: 0.00906
Epoch: 25747, Training Loss: 0.00907
Epoch: 25747, Training Loss: 0.01116
Epoch: 25748, Training Loss: 0.00868
Epoch: 25748, Training Loss: 0.00906
Epoch: 25748, Training Loss: 0.00907
Epoch: 25748, Training Loss: 0.01116
Epoch: 25749, Training Loss: 0.00868
Epoch: 25749, Training Loss: 0.00906
Epoch: 25749, Training Loss: 0.00907
Epoch: 25749, Training Loss: 0.01116
Epoch: 25750, Training Loss: 0.00868
Epoch: 25750, Training Loss: 0.00906
Epoch: 25750, Training Loss: 0.00907
Epoch: 25750, Training Loss: 0.01116
Epoch: 25751, Training Loss: 0.00868
Epoch: 25751, Training Loss: 0.00906
Epoch: 25751, Training Loss: 0.00907
Epoch: 25751, Training Loss: 0.01116
Epoch: 25752, Training Loss: 0.00868
Epoch: 25752, Training Loss: 0.00906
Epoch: 25752, Training Loss: 0.00907
Epoch: 25752, Training Loss: 0.01116
Epoch: 25753, Training Loss: 0.00868
Epoch: 25753, Training Loss: 0.00906
Epoch: 25753, Training Loss: 0.00907
Epoch: 25753, Training Loss: 0.01116
Epoch: 25754, Training Loss: 0.00868
Epoch: 25754, Training Loss: 0.00906
Epoch: 25754, Training Loss: 0.00907
Epoch: 25754, Training Loss: 0.01116
Epoch: 25755, Training Loss: 0.00868
Epoch: 25755, Training Loss: 0.00906
Epoch: 25755, Training Loss: 0.00907
Epoch: 25755, Training Loss: 0.01116
Epoch: 25756, Training Loss: 0.00868
Epoch: 25756, Training Loss: 0.00906
Epoch: 25756, Training Loss: 0.00907
Epoch: 25756, Training Loss: 0.01116
Epoch: 25757, Training Loss: 0.00868
Epoch: 25757, Training Loss: 0.00906
Epoch: 25757, Training Loss: 0.00907
Epoch: 25757, Training Loss: 0.01116
Epoch: 25758, Training Loss: 0.00868
Epoch: 25758, Training Loss: 0.00906
Epoch: 25758, Training Loss: 0.00907
Epoch: 25758, Training Loss: 0.01116
Epoch: 25759, Training Loss: 0.00868
Epoch: 25759, Training Loss: 0.00906
Epoch: 25759, Training Loss: 0.00907
Epoch: 25759, Training Loss: 0.01115
Epoch: 25760, Training Loss: 0.00868
Epoch: 25760, Training Loss: 0.00906
Epoch: 25760, Training Loss: 0.00907
Epoch: 25760, Training Loss: 0.01115
Epoch: 25761, Training Loss: 0.00868
Epoch: 25761, Training Loss: 0.00906
Epoch: 25761, Training Loss: 0.00907
Epoch: 25761, Training Loss: 0.01115
Epoch: 25762, Training Loss: 0.00868
Epoch: 25762, Training Loss: 0.00906
Epoch: 25762, Training Loss: 0.00907
Epoch: 25762, Training Loss: 0.01115
Epoch: 25763, Training Loss: 0.00868
Epoch: 25763, Training Loss: 0.00906
Epoch: 25763, Training Loss: 0.00907
Epoch: 25763, Training Loss: 0.01115
Epoch: 25764, Training Loss: 0.00868
Epoch: 25764, Training Loss: 0.00906
Epoch: 25764, Training Loss: 0.00907
Epoch: 25764, Training Loss: 0.01115
Epoch: 25765, Training Loss: 0.00868
Epoch: 25765, Training Loss: 0.00906
Epoch: 25765, Training Loss: 0.00907
Epoch: 25765, Training Loss: 0.01115
Epoch: 25766, Training Loss: 0.00867
Epoch: 25766, Training Loss: 0.00906
Epoch: 25766, Training Loss: 0.00907
Epoch: 25766, Training Loss: 0.01115
Epoch: 25767, Training Loss: 0.00867
Epoch: 25767, Training Loss: 0.00906
Epoch: 25767, Training Loss: 0.00907
Epoch: 25767, Training Loss: 0.01115
Epoch: 25768, Training Loss: 0.00867
Epoch: 25768, Training Loss: 0.00906
Epoch: 25768, Training Loss: 0.00907
Epoch: 25768, Training Loss: 0.01115
Epoch: 25769, Training Loss: 0.00867
Epoch: 25769, Training Loss: 0.00906
Epoch: 25769, Training Loss: 0.00907
Epoch: 25769, Training Loss: 0.01115
Epoch: 25770, Training Loss: 0.00867
Epoch: 25770, Training Loss: 0.00906
Epoch: 25770, Training Loss: 0.00907
Epoch: 25770, Training Loss: 0.01115
Epoch: 25771, Training Loss: 0.00867
Epoch: 25771, Training Loss: 0.00906
Epoch: 25771, Training Loss: 0.00907
Epoch: 25771, Training Loss: 0.01115
Epoch: 25772, Training Loss: 0.00867
Epoch: 25772, Training Loss: 0.00906
Epoch: 25772, Training Loss: 0.00907
Epoch: 25772, Training Loss: 0.01115
Epoch: 25773, Training Loss: 0.00867
Epoch: 25773, Training Loss: 0.00906
Epoch: 25773, Training Loss: 0.00907
Epoch: 25773, Training Loss: 0.01115
Epoch: 25774, Training Loss: 0.00867
Epoch: 25774, Training Loss: 0.00906
Epoch: 25774, Training Loss: 0.00907
Epoch: 25774, Training Loss: 0.01115
Epoch: 25775, Training Loss: 0.00867
Epoch: 25775, Training Loss: 0.00906
Epoch: 25775, Training Loss: 0.00907
Epoch: 25775, Training Loss: 0.01115
Epoch: 25776, Training Loss: 0.00867
Epoch: 25776, Training Loss: 0.00906
Epoch: 25776, Training Loss: 0.00907
Epoch: 25776, Training Loss: 0.01115
Epoch: 25777, Training Loss: 0.00867
Epoch: 25777, Training Loss: 0.00906
Epoch: 25777, Training Loss: 0.00907
Epoch: 25777, Training Loss: 0.01115
Epoch: 25778, Training Loss: 0.00867
Epoch: 25778, Training Loss: 0.00906
Epoch: 25778, Training Loss: 0.00907
Epoch: 25778, Training Loss: 0.01115
Epoch: 25779, Training Loss: 0.00867
Epoch: 25779, Training Loss: 0.00906
Epoch: 25779, Training Loss: 0.00907
Epoch: 25779, Training Loss: 0.01115
Epoch: 25780, Training Loss: 0.00867
Epoch: 25780, Training Loss: 0.00906
Epoch: 25780, Training Loss: 0.00907
Epoch: 25780, Training Loss: 0.01115
Epoch: 25781, Training Loss: 0.00867
Epoch: 25781, Training Loss: 0.00906
Epoch: 25781, Training Loss: 0.00907
[... verbose per-sample training log truncated: from epoch 25781 to epoch 26025 the four per-pattern losses decrease very slowly, from roughly (0.00867, 0.00906, 0.00907, 0.01115) to (0.00863, 0.00901, 0.00902, 0.01109) ...]
Epoch: 26025, Training Loss: 0.00902
Epoch: 26025, Training Loss: 0.01109
Epoch: 26026, Training Loss: 0.00863
Epoch: 26026, Training Loss: 0.00901
Epoch: 26026, Training Loss: 0.00902
Epoch: 26026, Training Loss: 0.01109
Epoch: 26027, Training Loss: 0.00863
Epoch: 26027, Training Loss: 0.00901
Epoch: 26027, Training Loss: 0.00902
Epoch: 26027, Training Loss: 0.01109
Epoch: 26028, Training Loss: 0.00863
Epoch: 26028, Training Loss: 0.00901
Epoch: 26028, Training Loss: 0.00902
Epoch: 26028, Training Loss: 0.01109
Epoch: 26029, Training Loss: 0.00863
Epoch: 26029, Training Loss: 0.00901
Epoch: 26029, Training Loss: 0.00902
Epoch: 26029, Training Loss: 0.01109
Epoch: 26030, Training Loss: 0.00863
Epoch: 26030, Training Loss: 0.00901
Epoch: 26030, Training Loss: 0.00902
Epoch: 26030, Training Loss: 0.01109
Epoch: 26031, Training Loss: 0.00863
Epoch: 26031, Training Loss: 0.00901
Epoch: 26031, Training Loss: 0.00902
Epoch: 26031, Training Loss: 0.01109
Epoch: 26032, Training Loss: 0.00863
Epoch: 26032, Training Loss: 0.00901
Epoch: 26032, Training Loss: 0.00902
Epoch: 26032, Training Loss: 0.01109
Epoch: 26033, Training Loss: 0.00863
Epoch: 26033, Training Loss: 0.00901
Epoch: 26033, Training Loss: 0.00902
Epoch: 26033, Training Loss: 0.01109
Epoch: 26034, Training Loss: 0.00863
Epoch: 26034, Training Loss: 0.00901
Epoch: 26034, Training Loss: 0.00902
Epoch: 26034, Training Loss: 0.01109
Epoch: 26035, Training Loss: 0.00862
Epoch: 26035, Training Loss: 0.00900
Epoch: 26035, Training Loss: 0.00902
Epoch: 26035, Training Loss: 0.01109
Epoch: 26036, Training Loss: 0.00862
Epoch: 26036, Training Loss: 0.00900
Epoch: 26036, Training Loss: 0.00902
Epoch: 26036, Training Loss: 0.01109
Epoch: 26037, Training Loss: 0.00862
Epoch: 26037, Training Loss: 0.00900
Epoch: 26037, Training Loss: 0.00902
Epoch: 26037, Training Loss: 0.01109
Epoch: 26038, Training Loss: 0.00862
Epoch: 26038, Training Loss: 0.00900
Epoch: 26038, Training Loss: 0.00902
Epoch: 26038, Training Loss: 0.01109
Epoch: 26039, Training Loss: 0.00862
Epoch: 26039, Training Loss: 0.00900
Epoch: 26039, Training Loss: 0.00902
Epoch: 26039, Training Loss: 0.01109
Epoch: 26040, Training Loss: 0.00862
Epoch: 26040, Training Loss: 0.00900
Epoch: 26040, Training Loss: 0.00902
Epoch: 26040, Training Loss: 0.01109
Epoch: 26041, Training Loss: 0.00862
Epoch: 26041, Training Loss: 0.00900
Epoch: 26041, Training Loss: 0.00901
Epoch: 26041, Training Loss: 0.01108
Epoch: 26042, Training Loss: 0.00862
Epoch: 26042, Training Loss: 0.00900
Epoch: 26042, Training Loss: 0.00901
Epoch: 26042, Training Loss: 0.01108
Epoch: 26043, Training Loss: 0.00862
Epoch: 26043, Training Loss: 0.00900
Epoch: 26043, Training Loss: 0.00901
Epoch: 26043, Training Loss: 0.01108
Epoch: 26044, Training Loss: 0.00862
Epoch: 26044, Training Loss: 0.00900
Epoch: 26044, Training Loss: 0.00901
Epoch: 26044, Training Loss: 0.01108
Epoch: 26045, Training Loss: 0.00862
Epoch: 26045, Training Loss: 0.00900
Epoch: 26045, Training Loss: 0.00901
Epoch: 26045, Training Loss: 0.01108
Epoch: 26046, Training Loss: 0.00862
Epoch: 26046, Training Loss: 0.00900
Epoch: 26046, Training Loss: 0.00901
Epoch: 26046, Training Loss: 0.01108
Epoch: 26047, Training Loss: 0.00862
Epoch: 26047, Training Loss: 0.00900
Epoch: 26047, Training Loss: 0.00901
Epoch: 26047, Training Loss: 0.01108
Epoch: 26048, Training Loss: 0.00862
Epoch: 26048, Training Loss: 0.00900
Epoch: 26048, Training Loss: 0.00901
Epoch: 26048, Training Loss: 0.01108
Epoch: 26049, Training Loss: 0.00862
Epoch: 26049, Training Loss: 0.00900
Epoch: 26049, Training Loss: 0.00901
Epoch: 26049, Training Loss: 0.01108
Epoch: 26050, Training Loss: 0.00862
Epoch: 26050, Training Loss: 0.00900
Epoch: 26050, Training Loss: 0.00901
Epoch: 26050, Training Loss: 0.01108
Epoch: 26051, Training Loss: 0.00862
Epoch: 26051, Training Loss: 0.00900
Epoch: 26051, Training Loss: 0.00901
Epoch: 26051, Training Loss: 0.01108
Epoch: 26052, Training Loss: 0.00862
Epoch: 26052, Training Loss: 0.00900
Epoch: 26052, Training Loss: 0.00901
Epoch: 26052, Training Loss: 0.01108
Epoch: 26053, Training Loss: 0.00862
Epoch: 26053, Training Loss: 0.00900
Epoch: 26053, Training Loss: 0.00901
Epoch: 26053, Training Loss: 0.01108
Epoch: 26054, Training Loss: 0.00862
Epoch: 26054, Training Loss: 0.00900
Epoch: 26054, Training Loss: 0.00901
Epoch: 26054, Training Loss: 0.01108
Epoch: 26055, Training Loss: 0.00862
Epoch: 26055, Training Loss: 0.00900
Epoch: 26055, Training Loss: 0.00901
Epoch: 26055, Training Loss: 0.01108
Epoch: 26056, Training Loss: 0.00862
Epoch: 26056, Training Loss: 0.00900
Epoch: 26056, Training Loss: 0.00901
Epoch: 26056, Training Loss: 0.01108
Epoch: 26057, Training Loss: 0.00862
Epoch: 26057, Training Loss: 0.00900
Epoch: 26057, Training Loss: 0.00901
Epoch: 26057, Training Loss: 0.01108
Epoch: 26058, Training Loss: 0.00862
Epoch: 26058, Training Loss: 0.00900
Epoch: 26058, Training Loss: 0.00901
Epoch: 26058, Training Loss: 0.01108
Epoch: 26059, Training Loss: 0.00862
Epoch: 26059, Training Loss: 0.00900
Epoch: 26059, Training Loss: 0.00901
Epoch: 26059, Training Loss: 0.01108
Epoch: 26060, Training Loss: 0.00862
Epoch: 26060, Training Loss: 0.00900
Epoch: 26060, Training Loss: 0.00901
Epoch: 26060, Training Loss: 0.01108
Epoch: 26061, Training Loss: 0.00862
Epoch: 26061, Training Loss: 0.00900
Epoch: 26061, Training Loss: 0.00901
Epoch: 26061, Training Loss: 0.01108
Epoch: 26062, Training Loss: 0.00862
Epoch: 26062, Training Loss: 0.00900
Epoch: 26062, Training Loss: 0.00901
Epoch: 26062, Training Loss: 0.01108
Epoch: 26063, Training Loss: 0.00862
Epoch: 26063, Training Loss: 0.00900
Epoch: 26063, Training Loss: 0.00901
Epoch: 26063, Training Loss: 0.01108
Epoch: 26064, Training Loss: 0.00862
Epoch: 26064, Training Loss: 0.00900
Epoch: 26064, Training Loss: 0.00901
Epoch: 26064, Training Loss: 0.01108
Epoch: 26065, Training Loss: 0.00862
Epoch: 26065, Training Loss: 0.00900
Epoch: 26065, Training Loss: 0.00901
Epoch: 26065, Training Loss: 0.01108
Epoch: 26066, Training Loss: 0.00862
Epoch: 26066, Training Loss: 0.00900
Epoch: 26066, Training Loss: 0.00901
Epoch: 26066, Training Loss: 0.01108
Epoch: 26067, Training Loss: 0.00862
Epoch: 26067, Training Loss: 0.00900
Epoch: 26067, Training Loss: 0.00901
Epoch: 26067, Training Loss: 0.01108
Epoch: 26068, Training Loss: 0.00862
Epoch: 26068, Training Loss: 0.00900
Epoch: 26068, Training Loss: 0.00901
Epoch: 26068, Training Loss: 0.01108
Epoch: 26069, Training Loss: 0.00862
Epoch: 26069, Training Loss: 0.00900
Epoch: 26069, Training Loss: 0.00901
Epoch: 26069, Training Loss: 0.01108
Epoch: 26070, Training Loss: 0.00862
Epoch: 26070, Training Loss: 0.00900
Epoch: 26070, Training Loss: 0.00901
Epoch: 26070, Training Loss: 0.01108
Epoch: 26071, Training Loss: 0.00862
Epoch: 26071, Training Loss: 0.00900
Epoch: 26071, Training Loss: 0.00901
Epoch: 26071, Training Loss: 0.01108
Epoch: 26072, Training Loss: 0.00862
Epoch: 26072, Training Loss: 0.00900
Epoch: 26072, Training Loss: 0.00901
Epoch: 26072, Training Loss: 0.01108
Epoch: 26073, Training Loss: 0.00862
Epoch: 26073, Training Loss: 0.00900
Epoch: 26073, Training Loss: 0.00901
Epoch: 26073, Training Loss: 0.01108
Epoch: 26074, Training Loss: 0.00862
Epoch: 26074, Training Loss: 0.00900
Epoch: 26074, Training Loss: 0.00901
Epoch: 26074, Training Loss: 0.01108
Epoch: 26075, Training Loss: 0.00862
Epoch: 26075, Training Loss: 0.00900
Epoch: 26075, Training Loss: 0.00901
Epoch: 26075, Training Loss: 0.01108
Epoch: 26076, Training Loss: 0.00862
Epoch: 26076, Training Loss: 0.00900
Epoch: 26076, Training Loss: 0.00901
Epoch: 26076, Training Loss: 0.01108
Epoch: 26077, Training Loss: 0.00862
Epoch: 26077, Training Loss: 0.00900
Epoch: 26077, Training Loss: 0.00901
Epoch: 26077, Training Loss: 0.01108
Epoch: 26078, Training Loss: 0.00862
Epoch: 26078, Training Loss: 0.00900
Epoch: 26078, Training Loss: 0.00901
Epoch: 26078, Training Loss: 0.01108
Epoch: 26079, Training Loss: 0.00862
Epoch: 26079, Training Loss: 0.00900
Epoch: 26079, Training Loss: 0.00901
Epoch: 26079, Training Loss: 0.01108
Epoch: 26080, Training Loss: 0.00862
Epoch: 26080, Training Loss: 0.00900
Epoch: 26080, Training Loss: 0.00901
Epoch: 26080, Training Loss: 0.01108
Epoch: 26081, Training Loss: 0.00862
Epoch: 26081, Training Loss: 0.00900
Epoch: 26081, Training Loss: 0.00901
Epoch: 26081, Training Loss: 0.01107
Epoch: 26082, Training Loss: 0.00862
Epoch: 26082, Training Loss: 0.00900
Epoch: 26082, Training Loss: 0.00901
Epoch: 26082, Training Loss: 0.01107
Epoch: 26083, Training Loss: 0.00862
Epoch: 26083, Training Loss: 0.00900
Epoch: 26083, Training Loss: 0.00901
Epoch: 26083, Training Loss: 0.01107
Epoch: 26084, Training Loss: 0.00862
Epoch: 26084, Training Loss: 0.00900
Epoch: 26084, Training Loss: 0.00901
Epoch: 26084, Training Loss: 0.01107
Epoch: 26085, Training Loss: 0.00862
Epoch: 26085, Training Loss: 0.00900
Epoch: 26085, Training Loss: 0.00901
Epoch: 26085, Training Loss: 0.01107
Epoch: 26086, Training Loss: 0.00862
Epoch: 26086, Training Loss: 0.00899
Epoch: 26086, Training Loss: 0.00901
Epoch: 26086, Training Loss: 0.01107
Epoch: 26087, Training Loss: 0.00862
Epoch: 26087, Training Loss: 0.00899
Epoch: 26087, Training Loss: 0.00901
Epoch: 26087, Training Loss: 0.01107
Epoch: 26088, Training Loss: 0.00862
Epoch: 26088, Training Loss: 0.00899
Epoch: 26088, Training Loss: 0.00901
Epoch: 26088, Training Loss: 0.01107
Epoch: 26089, Training Loss: 0.00861
Epoch: 26089, Training Loss: 0.00899
Epoch: 26089, Training Loss: 0.00901
Epoch: 26089, Training Loss: 0.01107
Epoch: 26090, Training Loss: 0.00861
Epoch: 26090, Training Loss: 0.00899
Epoch: 26090, Training Loss: 0.00901
Epoch: 26090, Training Loss: 0.01107
Epoch: 26091, Training Loss: 0.00861
Epoch: 26091, Training Loss: 0.00899
Epoch: 26091, Training Loss: 0.00901
Epoch: 26091, Training Loss: 0.01107
Epoch: 26092, Training Loss: 0.00861
Epoch: 26092, Training Loss: 0.00899
Epoch: 26092, Training Loss: 0.00900
Epoch: 26092, Training Loss: 0.01107
Epoch: 26093, Training Loss: 0.00861
Epoch: 26093, Training Loss: 0.00899
Epoch: 26093, Training Loss: 0.00900
Epoch: 26093, Training Loss: 0.01107
Epoch: 26094, Training Loss: 0.00861
Epoch: 26094, Training Loss: 0.00899
Epoch: 26094, Training Loss: 0.00900
Epoch: 26094, Training Loss: 0.01107
Epoch: 26095, Training Loss: 0.00861
Epoch: 26095, Training Loss: 0.00899
Epoch: 26095, Training Loss: 0.00900
Epoch: 26095, Training Loss: 0.01107
Epoch: 26096, Training Loss: 0.00861
Epoch: 26096, Training Loss: 0.00899
Epoch: 26096, Training Loss: 0.00900
Epoch: 26096, Training Loss: 0.01107
Epoch: 26097, Training Loss: 0.00861
Epoch: 26097, Training Loss: 0.00899
Epoch: 26097, Training Loss: 0.00900
Epoch: 26097, Training Loss: 0.01107
Epoch: 26098, Training Loss: 0.00861
Epoch: 26098, Training Loss: 0.00899
Epoch: 26098, Training Loss: 0.00900
Epoch: 26098, Training Loss: 0.01107
Epoch: 26099, Training Loss: 0.00861
Epoch: 26099, Training Loss: 0.00899
Epoch: 26099, Training Loss: 0.00900
Epoch: 26099, Training Loss: 0.01107
Epoch: 26100, Training Loss: 0.00861
Epoch: 26100, Training Loss: 0.00899
Epoch: 26100, Training Loss: 0.00900
Epoch: 26100, Training Loss: 0.01107
Epoch: 26101, Training Loss: 0.00861
Epoch: 26101, Training Loss: 0.00899
Epoch: 26101, Training Loss: 0.00900
Epoch: 26101, Training Loss: 0.01107
Epoch: 26102, Training Loss: 0.00861
Epoch: 26102, Training Loss: 0.00899
Epoch: 26102, Training Loss: 0.00900
Epoch: 26102, Training Loss: 0.01107
Epoch: 26103, Training Loss: 0.00861
Epoch: 26103, Training Loss: 0.00899
Epoch: 26103, Training Loss: 0.00900
Epoch: 26103, Training Loss: 0.01107
Epoch: 26104, Training Loss: 0.00861
Epoch: 26104, Training Loss: 0.00899
Epoch: 26104, Training Loss: 0.00900
Epoch: 26104, Training Loss: 0.01107
Epoch: 26105, Training Loss: 0.00861
Epoch: 26105, Training Loss: 0.00899
Epoch: 26105, Training Loss: 0.00900
Epoch: 26105, Training Loss: 0.01107
Epoch: 26106, Training Loss: 0.00861
Epoch: 26106, Training Loss: 0.00899
Epoch: 26106, Training Loss: 0.00900
Epoch: 26106, Training Loss: 0.01107
Epoch: 26107, Training Loss: 0.00861
Epoch: 26107, Training Loss: 0.00899
Epoch: 26107, Training Loss: 0.00900
Epoch: 26107, Training Loss: 0.01107
Epoch: 26108, Training Loss: 0.00861
Epoch: 26108, Training Loss: 0.00899
Epoch: 26108, Training Loss: 0.00900
Epoch: 26108, Training Loss: 0.01107
Epoch: 26109, Training Loss: 0.00861
Epoch: 26109, Training Loss: 0.00899
Epoch: 26109, Training Loss: 0.00900
Epoch: 26109, Training Loss: 0.01107
Epoch: 26110, Training Loss: 0.00861
Epoch: 26110, Training Loss: 0.00899
Epoch: 26110, Training Loss: 0.00900
Epoch: 26110, Training Loss: 0.01107
Epoch: 26111, Training Loss: 0.00861
Epoch: 26111, Training Loss: 0.00899
Epoch: 26111, Training Loss: 0.00900
Epoch: 26111, Training Loss: 0.01107
Epoch: 26112, Training Loss: 0.00861
Epoch: 26112, Training Loss: 0.00899
Epoch: 26112, Training Loss: 0.00900
Epoch: 26112, Training Loss: 0.01107
Epoch: 26113, Training Loss: 0.00861
Epoch: 26113, Training Loss: 0.00899
Epoch: 26113, Training Loss: 0.00900
Epoch: 26113, Training Loss: 0.01107
Epoch: 26114, Training Loss: 0.00861
Epoch: 26114, Training Loss: 0.00899
Epoch: 26114, Training Loss: 0.00900
Epoch: 26114, Training Loss: 0.01107
Epoch: 26115, Training Loss: 0.00861
Epoch: 26115, Training Loss: 0.00899
Epoch: 26115, Training Loss: 0.00900
Epoch: 26115, Training Loss: 0.01107
Epoch: 26116, Training Loss: 0.00861
Epoch: 26116, Training Loss: 0.00899
Epoch: 26116, Training Loss: 0.00900
Epoch: 26116, Training Loss: 0.01107
Epoch: 26117, Training Loss: 0.00861
Epoch: 26117, Training Loss: 0.00899
Epoch: 26117, Training Loss: 0.00900
Epoch: 26117, Training Loss: 0.01107
Epoch: 26118, Training Loss: 0.00861
Epoch: 26118, Training Loss: 0.00899
Epoch: 26118, Training Loss: 0.00900
Epoch: 26118, Training Loss: 0.01107
Epoch: 26119, Training Loss: 0.00861
Epoch: 26119, Training Loss: 0.00899
Epoch: 26119, Training Loss: 0.00900
Epoch: 26119, Training Loss: 0.01107
Epoch: 26120, Training Loss: 0.00861
Epoch: 26120, Training Loss: 0.00899
Epoch: 26120, Training Loss: 0.00900
Epoch: 26120, Training Loss: 0.01107
Epoch: 26121, Training Loss: 0.00861
Epoch: 26121, Training Loss: 0.00899
Epoch: 26121, Training Loss: 0.00900
Epoch: 26121, Training Loss: 0.01107
Epoch: 26122, Training Loss: 0.00861
Epoch: 26122, Training Loss: 0.00899
Epoch: 26122, Training Loss: 0.00900
Epoch: 26122, Training Loss: 0.01106
Epoch: 26123, Training Loss: 0.00861
Epoch: 26123, Training Loss: 0.00899
Epoch: 26123, Training Loss: 0.00900
Epoch: 26123, Training Loss: 0.01106
Epoch: 26124, Training Loss: 0.00861
Epoch: 26124, Training Loss: 0.00899
Epoch: 26124, Training Loss: 0.00900
Epoch: 26124, Training Loss: 0.01106
Epoch: 26125, Training Loss: 0.00861
Epoch: 26125, Training Loss: 0.00899
Epoch: 26125, Training Loss: 0.00900
Epoch: 26125, Training Loss: 0.01106
Epoch: 26126, Training Loss: 0.00861
Epoch: 26126, Training Loss: 0.00899
Epoch: 26126, Training Loss: 0.00900
Epoch: 26126, Training Loss: 0.01106
Epoch: 26127, Training Loss: 0.00861
Epoch: 26127, Training Loss: 0.00899
Epoch: 26127, Training Loss: 0.00900
Epoch: 26127, Training Loss: 0.01106
Epoch: 26128, Training Loss: 0.00861
Epoch: 26128, Training Loss: 0.00899
Epoch: 26128, Training Loss: 0.00900
Epoch: 26128, Training Loss: 0.01106
Epoch: 26129, Training Loss: 0.00861
Epoch: 26129, Training Loss: 0.00899
Epoch: 26129, Training Loss: 0.00900
Epoch: 26129, Training Loss: 0.01106
Epoch: 26130, Training Loss: 0.00861
Epoch: 26130, Training Loss: 0.00899
Epoch: 26130, Training Loss: 0.00900
Epoch: 26130, Training Loss: 0.01106
Epoch: 26131, Training Loss: 0.00861
Epoch: 26131, Training Loss: 0.00899
Epoch: 26131, Training Loss: 0.00900
Epoch: 26131, Training Loss: 0.01106
Epoch: 26132, Training Loss: 0.00861
Epoch: 26132, Training Loss: 0.00899
Epoch: 26132, Training Loss: 0.00900
Epoch: 26132, Training Loss: 0.01106
Epoch: 26133, Training Loss: 0.00861
Epoch: 26133, Training Loss: 0.00899
Epoch: 26133, Training Loss: 0.00900
Epoch: 26133, Training Loss: 0.01106
Epoch: 26134, Training Loss: 0.00861
Epoch: 26134, Training Loss: 0.00899
Epoch: 26134, Training Loss: 0.00900
Epoch: 26134, Training Loss: 0.01106
Epoch: 26135, Training Loss: 0.00861
Epoch: 26135, Training Loss: 0.00899
Epoch: 26135, Training Loss: 0.00900
Epoch: 26135, Training Loss: 0.01106
Epoch: 26136, Training Loss: 0.00861
Epoch: 26136, Training Loss: 0.00899
Epoch: 26136, Training Loss: 0.00900
Epoch: 26136, Training Loss: 0.01106
Epoch: 26137, Training Loss: 0.00861
Epoch: 26137, Training Loss: 0.00898
Epoch: 26137, Training Loss: 0.00900
Epoch: 26137, Training Loss: 0.01106
Epoch: 26138, Training Loss: 0.00861
Epoch: 26138, Training Loss: 0.00898
Epoch: 26138, Training Loss: 0.00900
Epoch: 26138, Training Loss: 0.01106
Epoch: 26139, Training Loss: 0.00861
Epoch: 26139, Training Loss: 0.00898
Epoch: 26139, Training Loss: 0.00900
Epoch: 26139, Training Loss: 0.01106
Epoch: 26140, Training Loss: 0.00861
Epoch: 26140, Training Loss: 0.00898
Epoch: 26140, Training Loss: 0.00900
Epoch: 26140, Training Loss: 0.01106
Epoch: 26141, Training Loss: 0.00861
Epoch: 26141, Training Loss: 0.00898
Epoch: 26141, Training Loss: 0.00900
Epoch: 26141, Training Loss: 0.01106
Epoch: 26142, Training Loss: 0.00861
Epoch: 26142, Training Loss: 0.00898
Epoch: 26142, Training Loss: 0.00899
Epoch: 26142, Training Loss: 0.01106
Epoch: 26143, Training Loss: 0.00860
Epoch: 26143, Training Loss: 0.00898
Epoch: 26143, Training Loss: 0.00899
Epoch: 26143, Training Loss: 0.01106
Epoch: 26144, Training Loss: 0.00860
Epoch: 26144, Training Loss: 0.00898
Epoch: 26144, Training Loss: 0.00899
Epoch: 26144, Training Loss: 0.01106
Epoch: 26145, Training Loss: 0.00860
Epoch: 26145, Training Loss: 0.00898
Epoch: 26145, Training Loss: 0.00899
Epoch: 26145, Training Loss: 0.01106
Epoch: 26146, Training Loss: 0.00860
Epoch: 26146, Training Loss: 0.00898
Epoch: 26146, Training Loss: 0.00899
Epoch: 26146, Training Loss: 0.01106
Epoch: 26147, Training Loss: 0.00860
Epoch: 26147, Training Loss: 0.00898
Epoch: 26147, Training Loss: 0.00899
Epoch: 26147, Training Loss: 0.01106
Epoch: 26148, Training Loss: 0.00860
Epoch: 26148, Training Loss: 0.00898
Epoch: 26148, Training Loss: 0.00899
Epoch: 26148, Training Loss: 0.01106
Epoch: 26149, Training Loss: 0.00860
Epoch: 26149, Training Loss: 0.00898
Epoch: 26149, Training Loss: 0.00899
Epoch: 26149, Training Loss: 0.01106
Epoch: 26150, Training Loss: 0.00860
Epoch: 26150, Training Loss: 0.00898
Epoch: 26150, Training Loss: 0.00899
Epoch: 26150, Training Loss: 0.01106
Epoch: 26151, Training Loss: 0.00860
Epoch: 26151, Training Loss: 0.00898
Epoch: 26151, Training Loss: 0.00899
Epoch: 26151, Training Loss: 0.01106
Epoch: 26152, Training Loss: 0.00860
Epoch: 26152, Training Loss: 0.00898
Epoch: 26152, Training Loss: 0.00899
Epoch: 26152, Training Loss: 0.01106
Epoch: 26153, Training Loss: 0.00860
Epoch: 26153, Training Loss: 0.00898
Epoch: 26153, Training Loss: 0.00899
Epoch: 26153, Training Loss: 0.01106
Epoch: 26154, Training Loss: 0.00860
Epoch: 26154, Training Loss: 0.00898
Epoch: 26154, Training Loss: 0.00899
Epoch: 26154, Training Loss: 0.01106
Epoch: 26155, Training Loss: 0.00860
Epoch: 26155, Training Loss: 0.00898
Epoch: 26155, Training Loss: 0.00899
Epoch: 26155, Training Loss: 0.01106
Epoch: 26156, Training Loss: 0.00860
Epoch: 26156, Training Loss: 0.00898
Epoch: 26156, Training Loss: 0.00899
Epoch: 26156, Training Loss: 0.01106
Epoch: 26157, Training Loss: 0.00860
Epoch: 26157, Training Loss: 0.00898
Epoch: 26157, Training Loss: 0.00899
Epoch: 26157, Training Loss: 0.01106
Epoch: 26158, Training Loss: 0.00860
Epoch: 26158, Training Loss: 0.00898
Epoch: 26158, Training Loss: 0.00899
Epoch: 26158, Training Loss: 0.01106
Epoch: 26159, Training Loss: 0.00860
Epoch: 26159, Training Loss: 0.00898
Epoch: 26159, Training Loss: 0.00899
Epoch: 26159, Training Loss: 0.01106
Epoch: 26160, Training Loss: 0.00860
Epoch: 26160, Training Loss: 0.00898
Epoch: 26160, Training Loss: 0.00899
Epoch: 26160, Training Loss: 0.01106
Epoch: 26161, Training Loss: 0.00860
Epoch: 26161, Training Loss: 0.00898
Epoch: 26161, Training Loss: 0.00899
Epoch: 26161, Training Loss: 0.01106
Epoch: 26162, Training Loss: 0.00860
Epoch: 26162, Training Loss: 0.00898
Epoch: 26162, Training Loss: 0.00899
Epoch: 26162, Training Loss: 0.01106
Epoch: 26163, Training Loss: 0.00860
Epoch: 26163, Training Loss: 0.00898
Epoch: 26163, Training Loss: 0.00899
Epoch: 26163, Training Loss: 0.01105
Epoch: 26164, Training Loss: 0.00860
Epoch: 26164, Training Loss: 0.00898
Epoch: 26164, Training Loss: 0.00899
Epoch: 26164, Training Loss: 0.01105
Epoch: 26165, Training Loss: 0.00860
Epoch: 26165, Training Loss: 0.00898
Epoch: 26165, Training Loss: 0.00899
Epoch: 26165, Training Loss: 0.01105
Epoch: 26166, Training Loss: 0.00860
Epoch: 26166, Training Loss: 0.00898
Epoch: 26166, Training Loss: 0.00899
Epoch: 26166, Training Loss: 0.01105
Epoch: 26167, Training Loss: 0.00860
Epoch: 26167, Training Loss: 0.00898
Epoch: 26167, Training Loss: 0.00899
Epoch: 26167, Training Loss: 0.01105
Epoch: 26168, Training Loss: 0.00860
Epoch: 26168, Training Loss: 0.00898
Epoch: 26168, Training Loss: 0.00899
Epoch: 26168, Training Loss: 0.01105
Epoch: 26169, Training Loss: 0.00860
Epoch: 26169, Training Loss: 0.00898
Epoch: 26169, Training Loss: 0.00899
Epoch: 26169, Training Loss: 0.01105
Epoch: 26170, Training Loss: 0.00860
Epoch: 26170, Training Loss: 0.00898
Epoch: 26170, Training Loss: 0.00899
Epoch: 26170, Training Loss: 0.01105
Epoch: 26171, Training Loss: 0.00860
Epoch: 26171, Training Loss: 0.00898
Epoch: 26171, Training Loss: 0.00899
Epoch: 26171, Training Loss: 0.01105
Epoch: 26172, Training Loss: 0.00860
Epoch: 26172, Training Loss: 0.00898
Epoch: 26172, Training Loss: 0.00899
Epoch: 26172, Training Loss: 0.01105
Epoch: 26173, Training Loss: 0.00860
Epoch: 26173, Training Loss: 0.00898
Epoch: 26173, Training Loss: 0.00899
Epoch: 26173, Training Loss: 0.01105
Epoch: 26174, Training Loss: 0.00860
Epoch: 26174, Training Loss: 0.00898
Epoch: 26174, Training Loss: 0.00899
Epoch: 26174, Training Loss: 0.01105
Epoch: 26175, Training Loss: 0.00860
Epoch: 26175, Training Loss: 0.00898
Epoch: 26175, Training Loss: 0.00899
Epoch: 26175, Training Loss: 0.01105
Epoch: 26176, Training Loss: 0.00860
Epoch: 26176, Training Loss: 0.00898
Epoch: 26176, Training Loss: 0.00899
Epoch: 26176, Training Loss: 0.01105
Epoch: 26177, Training Loss: 0.00860
Epoch: 26177, Training Loss: 0.00898
Epoch: 26177, Training Loss: 0.00899
Epoch: 26177, Training Loss: 0.01105
Epoch: 26178, Training Loss: 0.00860
Epoch: 26178, Training Loss: 0.00898
Epoch: 26178, Training Loss: 0.00899
Epoch: 26178, Training Loss: 0.01105
Epoch: 26179, Training Loss: 0.00860
Epoch: 26179, Training Loss: 0.00898
Epoch: 26179, Training Loss: 0.00899
Epoch: 26179, Training Loss: 0.01105
Epoch: 26180, Training Loss: 0.00860
Epoch: 26180, Training Loss: 0.00898
Epoch: 26180, Training Loss: 0.00899
Epoch: 26180, Training Loss: 0.01105
Epoch: 26181, Training Loss: 0.00860
Epoch: 26181, Training Loss: 0.00898
Epoch: 26181, Training Loss: 0.00899
Epoch: 26181, Training Loss: 0.01105
Epoch: 26182, Training Loss: 0.00860
Epoch: 26182, Training Loss: 0.00898
Epoch: 26182, Training Loss: 0.00899
Epoch: 26182, Training Loss: 0.01105
Epoch: 26183, Training Loss: 0.00860
Epoch: 26183, Training Loss: 0.00898
Epoch: 26183, Training Loss: 0.00899
Epoch: 26183, Training Loss: 0.01105
Epoch: 26184, Training Loss: 0.00860
Epoch: 26184, Training Loss: 0.00898
Epoch: 26184, Training Loss: 0.00899
Epoch: 26184, Training Loss: 0.01105
Epoch: 26185, Training Loss: 0.00860
Epoch: 26185, Training Loss: 0.00898
Epoch: 26185, Training Loss: 0.00899
Epoch: 26185, Training Loss: 0.01105
Epoch: 26186, Training Loss: 0.00860
Epoch: 26186, Training Loss: 0.00898
Epoch: 26186, Training Loss: 0.00899
Epoch: 26186, Training Loss: 0.01105
Epoch: 26187, Training Loss: 0.00860
Epoch: 26187, Training Loss: 0.00898
Epoch: 26187, Training Loss: 0.00899
Epoch: 26187, Training Loss: 0.01105
Epoch: 26188, Training Loss: 0.00860
Epoch: 26188, Training Loss: 0.00897
Epoch: 26188, Training Loss: 0.00899
Epoch: 26188, Training Loss: 0.01105
Epoch: 26189, Training Loss: 0.00860
Epoch: 26189, Training Loss: 0.00897
Epoch: 26189, Training Loss: 0.00899
Epoch: 26189, Training Loss: 0.01105
Epoch: 26190, Training Loss: 0.00860
Epoch: 26190, Training Loss: 0.00897
Epoch: 26190, Training Loss: 0.00899
Epoch: 26190, Training Loss: 0.01105
Epoch: 26191, Training Loss: 0.00860
Epoch: 26191, Training Loss: 0.00897
Epoch: 26191, Training Loss: 0.00899
Epoch: 26191, Training Loss: 0.01105
Epoch: 26192, Training Loss: 0.00860
Epoch: 26192, Training Loss: 0.00897
Epoch: 26192, Training Loss: 0.00899
Epoch: 26192, Training Loss: 0.01105
Epoch: 26193, Training Loss: 0.00860
Epoch: 26193, Training Loss: 0.00897
Epoch: 26193, Training Loss: 0.00898
Epoch: 26193, Training Loss: 0.01105
Epoch: 26194, Training Loss: 0.00860
Epoch: 26194, Training Loss: 0.00897
Epoch: 26194, Training Loss: 0.00898
Epoch: 26194, Training Loss: 0.01105
Epoch: 26195, Training Loss: 0.00860
Epoch: 26195, Training Loss: 0.00897
Epoch: 26195, Training Loss: 0.00898
Epoch: 26195, Training Loss: 0.01105
Epoch: 26196, Training Loss: 0.00860
Epoch: 26196, Training Loss: 0.00897
Epoch: 26196, Training Loss: 0.00898
Epoch: 26196, Training Loss: 0.01105
Epoch: 26197, Training Loss: 0.00860
Epoch: 26197, Training Loss: 0.00897
Epoch: 26197, Training Loss: 0.00898
Epoch: 26197, Training Loss: 0.01105
Epoch: 26198, Training Loss: 0.00859
Epoch: 26198, Training Loss: 0.00897
Epoch: 26198, Training Loss: 0.00898
Epoch: 26198, Training Loss: 0.01105
Epoch: 26199, Training Loss: 0.00859
Epoch: 26199, Training Loss: 0.00897
Epoch: 26199, Training Loss: 0.00898
Epoch: 26199, Training Loss: 0.01105
Epoch: 26200, Training Loss: 0.00859
Epoch: 26200, Training Loss: 0.00897
Epoch: 26200, Training Loss: 0.00898
Epoch: 26200, Training Loss: 0.01105
Epoch: 26201, Training Loss: 0.00859
Epoch: 26201, Training Loss: 0.00897
Epoch: 26201, Training Loss: 0.00898
Epoch: 26201, Training Loss: 0.01105
Epoch: 26202, Training Loss: 0.00859
Epoch: 26202, Training Loss: 0.00897
Epoch: 26202, Training Loss: 0.00898
Epoch: 26202, Training Loss: 0.01105
Epoch: 26203, Training Loss: 0.00859
Epoch: 26203, Training Loss: 0.00897
Epoch: 26203, Training Loss: 0.00898
Epoch: 26203, Training Loss: 0.01105
Epoch: 26204, Training Loss: 0.00859
Epoch: 26204, Training Loss: 0.00897
Epoch: 26204, Training Loss: 0.00898
Epoch: 26204, Training Loss: 0.01104
Epoch: 26205, Training Loss: 0.00859
Epoch: 26205, Training Loss: 0.00897
Epoch: 26205, Training Loss: 0.00898
Epoch: 26205, Training Loss: 0.01104
Epoch: 26206, Training Loss: 0.00859
Epoch: 26206, Training Loss: 0.00897
Epoch: 26206, Training Loss: 0.00898
Epoch: 26206, Training Loss: 0.01104
Epoch: 26207, Training Loss: 0.00859
Epoch: 26207, Training Loss: 0.00897
Epoch: 26207, Training Loss: 0.00898
Epoch: 26207, Training Loss: 0.01104
Epoch: 26208, Training Loss: 0.00859
Epoch: 26208, Training Loss: 0.00897
Epoch: 26208, Training Loss: 0.00898
Epoch: 26208, Training Loss: 0.01104
Epoch: 26209, Training Loss: 0.00859
Epoch: 26209, Training Loss: 0.00897
Epoch: 26209, Training Loss: 0.00898
Epoch: 26209, Training Loss: 0.01104
Epoch: 26210, Training Loss: 0.00859
Epoch: 26210, Training Loss: 0.00897
Epoch: 26210, Training Loss: 0.00898
Epoch: 26210, Training Loss: 0.01104
Epoch: 26211, Training Loss: 0.00859
Epoch: 26211, Training Loss: 0.00897
Epoch: 26211, Training Loss: 0.00898
Epoch: 26211, Training Loss: 0.01104
Epoch: 26212, Training Loss: 0.00859
Epoch: 26212, Training Loss: 0.00897
Epoch: 26212, Training Loss: 0.00898
Epoch: 26212, Training Loss: 0.01104
Epoch: 26213, Training Loss: 0.00859
Epoch: 26213, Training Loss: 0.00897
Epoch: 26213, Training Loss: 0.00898
Epoch: 26213, Training Loss: 0.01104
Epoch: 26214, Training Loss: 0.00859
Epoch: 26214, Training Loss: 0.00897
Epoch: 26214, Training Loss: 0.00898
Epoch: 26214, Training Loss: 0.01104
[... per-sample losses for epochs 26215-26456 omitted; the four losses decrease slowly toward the 0.008 error target ...]
Epoch: 26457, Training Loss: 0.00855
Epoch: 26457, Training Loss: 0.00892
Epoch: 26457, Training Loss: 0.00893
Epoch: 26457, Training Loss: 0.01098
Epoch: 26458, Training Loss: 0.00855
Epoch: 26458, Training Loss: 0.00892
Epoch: 26458, Training Loss: 0.00893
Epoch: 26458, Training Loss: 0.01098
Epoch: 26459, Training Loss: 0.00855
Epoch: 26459, Training Loss: 0.00892
Epoch: 26459, Training Loss: 0.00893
Epoch: 26459, Training Loss: 0.01098
Epoch: 26460, Training Loss: 0.00855
Epoch: 26460, Training Loss: 0.00892
Epoch: 26460, Training Loss: 0.00893
Epoch: 26460, Training Loss: 0.01098
Epoch: 26461, Training Loss: 0.00855
Epoch: 26461, Training Loss: 0.00892
Epoch: 26461, Training Loss: 0.00893
Epoch: 26461, Training Loss: 0.01098
Epoch: 26462, Training Loss: 0.00855
Epoch: 26462, Training Loss: 0.00892
Epoch: 26462, Training Loss: 0.00893
Epoch: 26462, Training Loss: 0.01098
Epoch: 26463, Training Loss: 0.00855
Epoch: 26463, Training Loss: 0.00892
Epoch: 26463, Training Loss: 0.00893
Epoch: 26463, Training Loss: 0.01098
Epoch: 26464, Training Loss: 0.00855
Epoch: 26464, Training Loss: 0.00892
Epoch: 26464, Training Loss: 0.00893
Epoch: 26464, Training Loss: 0.01098
Epoch: 26465, Training Loss: 0.00855
Epoch: 26465, Training Loss: 0.00892
Epoch: 26465, Training Loss: 0.00893
Epoch: 26465, Training Loss: 0.01098
Epoch: 26466, Training Loss: 0.00855
Epoch: 26466, Training Loss: 0.00892
Epoch: 26466, Training Loss: 0.00893
Epoch: 26466, Training Loss: 0.01098
Epoch: 26467, Training Loss: 0.00855
Epoch: 26467, Training Loss: 0.00892
Epoch: 26467, Training Loss: 0.00893
Epoch: 26467, Training Loss: 0.01098
Epoch: 26468, Training Loss: 0.00855
Epoch: 26468, Training Loss: 0.00892
Epoch: 26468, Training Loss: 0.00893
Epoch: 26468, Training Loss: 0.01098
Epoch: 26469, Training Loss: 0.00855
Epoch: 26469, Training Loss: 0.00892
Epoch: 26469, Training Loss: 0.00893
Epoch: 26469, Training Loss: 0.01098
Epoch: 26470, Training Loss: 0.00855
Epoch: 26470, Training Loss: 0.00892
Epoch: 26470, Training Loss: 0.00893
Epoch: 26470, Training Loss: 0.01098
Epoch: 26471, Training Loss: 0.00855
Epoch: 26471, Training Loss: 0.00892
Epoch: 26471, Training Loss: 0.00893
Epoch: 26471, Training Loss: 0.01098
Epoch: 26472, Training Loss: 0.00855
Epoch: 26472, Training Loss: 0.00892
Epoch: 26472, Training Loss: 0.00893
Epoch: 26472, Training Loss: 0.01098
Epoch: 26473, Training Loss: 0.00855
Epoch: 26473, Training Loss: 0.00892
Epoch: 26473, Training Loss: 0.00893
Epoch: 26473, Training Loss: 0.01098
Epoch: 26474, Training Loss: 0.00854
Epoch: 26474, Training Loss: 0.00892
Epoch: 26474, Training Loss: 0.00893
Epoch: 26474, Training Loss: 0.01098
Epoch: 26475, Training Loss: 0.00854
Epoch: 26475, Training Loss: 0.00892
Epoch: 26475, Training Loss: 0.00893
Epoch: 26475, Training Loss: 0.01098
Epoch: 26476, Training Loss: 0.00854
Epoch: 26476, Training Loss: 0.00892
Epoch: 26476, Training Loss: 0.00893
Epoch: 26476, Training Loss: 0.01098
Epoch: 26477, Training Loss: 0.00854
Epoch: 26477, Training Loss: 0.00892
Epoch: 26477, Training Loss: 0.00893
Epoch: 26477, Training Loss: 0.01098
Epoch: 26478, Training Loss: 0.00854
Epoch: 26478, Training Loss: 0.00892
Epoch: 26478, Training Loss: 0.00893
Epoch: 26478, Training Loss: 0.01098
Epoch: 26479, Training Loss: 0.00854
Epoch: 26479, Training Loss: 0.00892
Epoch: 26479, Training Loss: 0.00893
Epoch: 26479, Training Loss: 0.01098
Epoch: 26480, Training Loss: 0.00854
Epoch: 26480, Training Loss: 0.00892
Epoch: 26480, Training Loss: 0.00893
Epoch: 26480, Training Loss: 0.01098
Epoch: 26481, Training Loss: 0.00854
Epoch: 26481, Training Loss: 0.00892
Epoch: 26481, Training Loss: 0.00893
Epoch: 26481, Training Loss: 0.01098
Epoch: 26482, Training Loss: 0.00854
Epoch: 26482, Training Loss: 0.00892
Epoch: 26482, Training Loss: 0.00893
Epoch: 26482, Training Loss: 0.01098
Epoch: 26483, Training Loss: 0.00854
Epoch: 26483, Training Loss: 0.00892
Epoch: 26483, Training Loss: 0.00893
Epoch: 26483, Training Loss: 0.01098
Epoch: 26484, Training Loss: 0.00854
Epoch: 26484, Training Loss: 0.00892
Epoch: 26484, Training Loss: 0.00893
Epoch: 26484, Training Loss: 0.01098
Epoch: 26485, Training Loss: 0.00854
Epoch: 26485, Training Loss: 0.00892
Epoch: 26485, Training Loss: 0.00893
Epoch: 26485, Training Loss: 0.01098
Epoch: 26486, Training Loss: 0.00854
Epoch: 26486, Training Loss: 0.00892
Epoch: 26486, Training Loss: 0.00893
Epoch: 26486, Training Loss: 0.01098
Epoch: 26487, Training Loss: 0.00854
Epoch: 26487, Training Loss: 0.00892
Epoch: 26487, Training Loss: 0.00893
Epoch: 26487, Training Loss: 0.01098
Epoch: 26488, Training Loss: 0.00854
Epoch: 26488, Training Loss: 0.00892
Epoch: 26488, Training Loss: 0.00893
Epoch: 26488, Training Loss: 0.01098
Epoch: 26489, Training Loss: 0.00854
Epoch: 26489, Training Loss: 0.00892
Epoch: 26489, Training Loss: 0.00893
Epoch: 26489, Training Loss: 0.01098
Epoch: 26490, Training Loss: 0.00854
Epoch: 26490, Training Loss: 0.00892
Epoch: 26490, Training Loss: 0.00893
Epoch: 26490, Training Loss: 0.01098
Epoch: 26491, Training Loss: 0.00854
Epoch: 26491, Training Loss: 0.00892
Epoch: 26491, Training Loss: 0.00893
Epoch: 26491, Training Loss: 0.01098
Epoch: 26492, Training Loss: 0.00854
Epoch: 26492, Training Loss: 0.00892
Epoch: 26492, Training Loss: 0.00893
Epoch: 26492, Training Loss: 0.01098
Epoch: 26493, Training Loss: 0.00854
Epoch: 26493, Training Loss: 0.00892
Epoch: 26493, Training Loss: 0.00893
Epoch: 26493, Training Loss: 0.01098
Epoch: 26494, Training Loss: 0.00854
Epoch: 26494, Training Loss: 0.00892
Epoch: 26494, Training Loss: 0.00893
Epoch: 26494, Training Loss: 0.01097
Epoch: 26495, Training Loss: 0.00854
Epoch: 26495, Training Loss: 0.00892
Epoch: 26495, Training Loss: 0.00893
Epoch: 26495, Training Loss: 0.01097
Epoch: 26496, Training Loss: 0.00854
Epoch: 26496, Training Loss: 0.00892
Epoch: 26496, Training Loss: 0.00893
Epoch: 26496, Training Loss: 0.01097
Epoch: 26497, Training Loss: 0.00854
Epoch: 26497, Training Loss: 0.00892
Epoch: 26497, Training Loss: 0.00893
Epoch: 26497, Training Loss: 0.01097
Epoch: 26498, Training Loss: 0.00854
Epoch: 26498, Training Loss: 0.00891
Epoch: 26498, Training Loss: 0.00893
Epoch: 26498, Training Loss: 0.01097
Epoch: 26499, Training Loss: 0.00854
Epoch: 26499, Training Loss: 0.00891
Epoch: 26499, Training Loss: 0.00893
Epoch: 26499, Training Loss: 0.01097
Epoch: 26500, Training Loss: 0.00854
Epoch: 26500, Training Loss: 0.00891
Epoch: 26500, Training Loss: 0.00893
Epoch: 26500, Training Loss: 0.01097
Epoch: 26501, Training Loss: 0.00854
Epoch: 26501, Training Loss: 0.00891
Epoch: 26501, Training Loss: 0.00893
Epoch: 26501, Training Loss: 0.01097
Epoch: 26502, Training Loss: 0.00854
Epoch: 26502, Training Loss: 0.00891
Epoch: 26502, Training Loss: 0.00893
Epoch: 26502, Training Loss: 0.01097
Epoch: 26503, Training Loss: 0.00854
Epoch: 26503, Training Loss: 0.00891
Epoch: 26503, Training Loss: 0.00892
Epoch: 26503, Training Loss: 0.01097
Epoch: 26504, Training Loss: 0.00854
Epoch: 26504, Training Loss: 0.00891
Epoch: 26504, Training Loss: 0.00892
Epoch: 26504, Training Loss: 0.01097
Epoch: 26505, Training Loss: 0.00854
Epoch: 26505, Training Loss: 0.00891
Epoch: 26505, Training Loss: 0.00892
Epoch: 26505, Training Loss: 0.01097
Epoch: 26506, Training Loss: 0.00854
Epoch: 26506, Training Loss: 0.00891
Epoch: 26506, Training Loss: 0.00892
Epoch: 26506, Training Loss: 0.01097
Epoch: 26507, Training Loss: 0.00854
Epoch: 26507, Training Loss: 0.00891
Epoch: 26507, Training Loss: 0.00892
Epoch: 26507, Training Loss: 0.01097
Epoch: 26508, Training Loss: 0.00854
Epoch: 26508, Training Loss: 0.00891
Epoch: 26508, Training Loss: 0.00892
Epoch: 26508, Training Loss: 0.01097
Epoch: 26509, Training Loss: 0.00854
Epoch: 26509, Training Loss: 0.00891
Epoch: 26509, Training Loss: 0.00892
Epoch: 26509, Training Loss: 0.01097
Epoch: 26510, Training Loss: 0.00854
Epoch: 26510, Training Loss: 0.00891
Epoch: 26510, Training Loss: 0.00892
Epoch: 26510, Training Loss: 0.01097
Epoch: 26511, Training Loss: 0.00854
Epoch: 26511, Training Loss: 0.00891
Epoch: 26511, Training Loss: 0.00892
Epoch: 26511, Training Loss: 0.01097
Epoch: 26512, Training Loss: 0.00854
Epoch: 26512, Training Loss: 0.00891
Epoch: 26512, Training Loss: 0.00892
Epoch: 26512, Training Loss: 0.01097
Epoch: 26513, Training Loss: 0.00854
Epoch: 26513, Training Loss: 0.00891
Epoch: 26513, Training Loss: 0.00892
Epoch: 26513, Training Loss: 0.01097
Epoch: 26514, Training Loss: 0.00854
Epoch: 26514, Training Loss: 0.00891
Epoch: 26514, Training Loss: 0.00892
Epoch: 26514, Training Loss: 0.01097
Epoch: 26515, Training Loss: 0.00854
Epoch: 26515, Training Loss: 0.00891
Epoch: 26515, Training Loss: 0.00892
Epoch: 26515, Training Loss: 0.01097
Epoch: 26516, Training Loss: 0.00854
Epoch: 26516, Training Loss: 0.00891
Epoch: 26516, Training Loss: 0.00892
Epoch: 26516, Training Loss: 0.01097
Epoch: 26517, Training Loss: 0.00854
Epoch: 26517, Training Loss: 0.00891
Epoch: 26517, Training Loss: 0.00892
Epoch: 26517, Training Loss: 0.01097
Epoch: 26518, Training Loss: 0.00854
Epoch: 26518, Training Loss: 0.00891
Epoch: 26518, Training Loss: 0.00892
Epoch: 26518, Training Loss: 0.01097
Epoch: 26519, Training Loss: 0.00854
Epoch: 26519, Training Loss: 0.00891
Epoch: 26519, Training Loss: 0.00892
Epoch: 26519, Training Loss: 0.01097
Epoch: 26520, Training Loss: 0.00854
Epoch: 26520, Training Loss: 0.00891
Epoch: 26520, Training Loss: 0.00892
Epoch: 26520, Training Loss: 0.01097
Epoch: 26521, Training Loss: 0.00854
Epoch: 26521, Training Loss: 0.00891
Epoch: 26521, Training Loss: 0.00892
Epoch: 26521, Training Loss: 0.01097
Epoch: 26522, Training Loss: 0.00854
Epoch: 26522, Training Loss: 0.00891
Epoch: 26522, Training Loss: 0.00892
Epoch: 26522, Training Loss: 0.01097
Epoch: 26523, Training Loss: 0.00854
Epoch: 26523, Training Loss: 0.00891
Epoch: 26523, Training Loss: 0.00892
Epoch: 26523, Training Loss: 0.01097
Epoch: 26524, Training Loss: 0.00854
Epoch: 26524, Training Loss: 0.00891
Epoch: 26524, Training Loss: 0.00892
Epoch: 26524, Training Loss: 0.01097
Epoch: 26525, Training Loss: 0.00854
Epoch: 26525, Training Loss: 0.00891
Epoch: 26525, Training Loss: 0.00892
Epoch: 26525, Training Loss: 0.01097
Epoch: 26526, Training Loss: 0.00854
Epoch: 26526, Training Loss: 0.00891
Epoch: 26526, Training Loss: 0.00892
Epoch: 26526, Training Loss: 0.01097
Epoch: 26527, Training Loss: 0.00854
Epoch: 26527, Training Loss: 0.00891
Epoch: 26527, Training Loss: 0.00892
Epoch: 26527, Training Loss: 0.01097
Epoch: 26528, Training Loss: 0.00854
Epoch: 26528, Training Loss: 0.00891
Epoch: 26528, Training Loss: 0.00892
Epoch: 26528, Training Loss: 0.01097
Epoch: 26529, Training Loss: 0.00853
Epoch: 26529, Training Loss: 0.00891
Epoch: 26529, Training Loss: 0.00892
Epoch: 26529, Training Loss: 0.01097
Epoch: 26530, Training Loss: 0.00853
Epoch: 26530, Training Loss: 0.00891
Epoch: 26530, Training Loss: 0.00892
Epoch: 26530, Training Loss: 0.01097
Epoch: 26531, Training Loss: 0.00853
Epoch: 26531, Training Loss: 0.00891
Epoch: 26531, Training Loss: 0.00892
Epoch: 26531, Training Loss: 0.01097
Epoch: 26532, Training Loss: 0.00853
Epoch: 26532, Training Loss: 0.00891
Epoch: 26532, Training Loss: 0.00892
Epoch: 26532, Training Loss: 0.01097
Epoch: 26533, Training Loss: 0.00853
Epoch: 26533, Training Loss: 0.00891
Epoch: 26533, Training Loss: 0.00892
Epoch: 26533, Training Loss: 0.01097
Epoch: 26534, Training Loss: 0.00853
Epoch: 26534, Training Loss: 0.00891
Epoch: 26534, Training Loss: 0.00892
Epoch: 26534, Training Loss: 0.01097
Epoch: 26535, Training Loss: 0.00853
Epoch: 26535, Training Loss: 0.00891
Epoch: 26535, Training Loss: 0.00892
Epoch: 26535, Training Loss: 0.01097
Epoch: 26536, Training Loss: 0.00853
Epoch: 26536, Training Loss: 0.00891
Epoch: 26536, Training Loss: 0.00892
Epoch: 26536, Training Loss: 0.01096
Epoch: 26537, Training Loss: 0.00853
Epoch: 26537, Training Loss: 0.00891
Epoch: 26537, Training Loss: 0.00892
Epoch: 26537, Training Loss: 0.01096
Epoch: 26538, Training Loss: 0.00853
Epoch: 26538, Training Loss: 0.00891
Epoch: 26538, Training Loss: 0.00892
Epoch: 26538, Training Loss: 0.01096
Epoch: 26539, Training Loss: 0.00853
Epoch: 26539, Training Loss: 0.00891
Epoch: 26539, Training Loss: 0.00892
Epoch: 26539, Training Loss: 0.01096
Epoch: 26540, Training Loss: 0.00853
Epoch: 26540, Training Loss: 0.00891
Epoch: 26540, Training Loss: 0.00892
Epoch: 26540, Training Loss: 0.01096
Epoch: 26541, Training Loss: 0.00853
Epoch: 26541, Training Loss: 0.00891
Epoch: 26541, Training Loss: 0.00892
Epoch: 26541, Training Loss: 0.01096
Epoch: 26542, Training Loss: 0.00853
Epoch: 26542, Training Loss: 0.00891
Epoch: 26542, Training Loss: 0.00892
Epoch: 26542, Training Loss: 0.01096
Epoch: 26543, Training Loss: 0.00853
Epoch: 26543, Training Loss: 0.00891
Epoch: 26543, Training Loss: 0.00892
Epoch: 26543, Training Loss: 0.01096
Epoch: 26544, Training Loss: 0.00853
Epoch: 26544, Training Loss: 0.00891
Epoch: 26544, Training Loss: 0.00892
Epoch: 26544, Training Loss: 0.01096
Epoch: 26545, Training Loss: 0.00853
Epoch: 26545, Training Loss: 0.00891
Epoch: 26545, Training Loss: 0.00892
Epoch: 26545, Training Loss: 0.01096
Epoch: 26546, Training Loss: 0.00853
Epoch: 26546, Training Loss: 0.00891
Epoch: 26546, Training Loss: 0.00892
Epoch: 26546, Training Loss: 0.01096
Epoch: 26547, Training Loss: 0.00853
Epoch: 26547, Training Loss: 0.00891
Epoch: 26547, Training Loss: 0.00892
Epoch: 26547, Training Loss: 0.01096
Epoch: 26548, Training Loss: 0.00853
Epoch: 26548, Training Loss: 0.00891
Epoch: 26548, Training Loss: 0.00892
Epoch: 26548, Training Loss: 0.01096
Epoch: 26549, Training Loss: 0.00853
Epoch: 26549, Training Loss: 0.00891
Epoch: 26549, Training Loss: 0.00892
Epoch: 26549, Training Loss: 0.01096
Epoch: 26550, Training Loss: 0.00853
Epoch: 26550, Training Loss: 0.00890
Epoch: 26550, Training Loss: 0.00892
Epoch: 26550, Training Loss: 0.01096
Epoch: 26551, Training Loss: 0.00853
Epoch: 26551, Training Loss: 0.00890
Epoch: 26551, Training Loss: 0.00892
Epoch: 26551, Training Loss: 0.01096
Epoch: 26552, Training Loss: 0.00853
Epoch: 26552, Training Loss: 0.00890
Epoch: 26552, Training Loss: 0.00892
Epoch: 26552, Training Loss: 0.01096
Epoch: 26553, Training Loss: 0.00853
Epoch: 26553, Training Loss: 0.00890
Epoch: 26553, Training Loss: 0.00892
Epoch: 26553, Training Loss: 0.01096
Epoch: 26554, Training Loss: 0.00853
Epoch: 26554, Training Loss: 0.00890
Epoch: 26554, Training Loss: 0.00892
Epoch: 26554, Training Loss: 0.01096
Epoch: 26555, Training Loss: 0.00853
Epoch: 26555, Training Loss: 0.00890
Epoch: 26555, Training Loss: 0.00891
Epoch: 26555, Training Loss: 0.01096
Epoch: 26556, Training Loss: 0.00853
Epoch: 26556, Training Loss: 0.00890
Epoch: 26556, Training Loss: 0.00891
Epoch: 26556, Training Loss: 0.01096
Epoch: 26557, Training Loss: 0.00853
Epoch: 26557, Training Loss: 0.00890
Epoch: 26557, Training Loss: 0.00891
Epoch: 26557, Training Loss: 0.01096
Epoch: 26558, Training Loss: 0.00853
Epoch: 26558, Training Loss: 0.00890
Epoch: 26558, Training Loss: 0.00891
Epoch: 26558, Training Loss: 0.01096
Epoch: 26559, Training Loss: 0.00853
Epoch: 26559, Training Loss: 0.00890
Epoch: 26559, Training Loss: 0.00891
Epoch: 26559, Training Loss: 0.01096
Epoch: 26560, Training Loss: 0.00853
Epoch: 26560, Training Loss: 0.00890
Epoch: 26560, Training Loss: 0.00891
Epoch: 26560, Training Loss: 0.01096
Epoch: 26561, Training Loss: 0.00853
Epoch: 26561, Training Loss: 0.00890
Epoch: 26561, Training Loss: 0.00891
Epoch: 26561, Training Loss: 0.01096
Epoch: 26562, Training Loss: 0.00853
Epoch: 26562, Training Loss: 0.00890
Epoch: 26562, Training Loss: 0.00891
Epoch: 26562, Training Loss: 0.01096
Epoch: 26563, Training Loss: 0.00853
Epoch: 26563, Training Loss: 0.00890
Epoch: 26563, Training Loss: 0.00891
Epoch: 26563, Training Loss: 0.01096
Epoch: 26564, Training Loss: 0.00853
Epoch: 26564, Training Loss: 0.00890
Epoch: 26564, Training Loss: 0.00891
Epoch: 26564, Training Loss: 0.01096
Epoch: 26565, Training Loss: 0.00853
Epoch: 26565, Training Loss: 0.00890
Epoch: 26565, Training Loss: 0.00891
Epoch: 26565, Training Loss: 0.01096
Epoch: 26566, Training Loss: 0.00853
Epoch: 26566, Training Loss: 0.00890
Epoch: 26566, Training Loss: 0.00891
Epoch: 26566, Training Loss: 0.01096
Epoch: 26567, Training Loss: 0.00853
Epoch: 26567, Training Loss: 0.00890
Epoch: 26567, Training Loss: 0.00891
Epoch: 26567, Training Loss: 0.01096
Epoch: 26568, Training Loss: 0.00853
Epoch: 26568, Training Loss: 0.00890
Epoch: 26568, Training Loss: 0.00891
Epoch: 26568, Training Loss: 0.01096
Epoch: 26569, Training Loss: 0.00853
Epoch: 26569, Training Loss: 0.00890
Epoch: 26569, Training Loss: 0.00891
Epoch: 26569, Training Loss: 0.01096
Epoch: 26570, Training Loss: 0.00853
Epoch: 26570, Training Loss: 0.00890
Epoch: 26570, Training Loss: 0.00891
Epoch: 26570, Training Loss: 0.01096
Epoch: 26571, Training Loss: 0.00853
Epoch: 26571, Training Loss: 0.00890
Epoch: 26571, Training Loss: 0.00891
Epoch: 26571, Training Loss: 0.01096
Epoch: 26572, Training Loss: 0.00853
Epoch: 26572, Training Loss: 0.00890
Epoch: 26572, Training Loss: 0.00891
Epoch: 26572, Training Loss: 0.01096
Epoch: 26573, Training Loss: 0.00853
Epoch: 26573, Training Loss: 0.00890
Epoch: 26573, Training Loss: 0.00891
Epoch: 26573, Training Loss: 0.01096
Epoch: 26574, Training Loss: 0.00853
Epoch: 26574, Training Loss: 0.00890
Epoch: 26574, Training Loss: 0.00891
Epoch: 26574, Training Loss: 0.01096
Epoch: 26575, Training Loss: 0.00853
Epoch: 26575, Training Loss: 0.00890
Epoch: 26575, Training Loss: 0.00891
Epoch: 26575, Training Loss: 0.01096
Epoch: 26576, Training Loss: 0.00853
Epoch: 26576, Training Loss: 0.00890
Epoch: 26576, Training Loss: 0.00891
Epoch: 26576, Training Loss: 0.01096
Epoch: 26577, Training Loss: 0.00853
Epoch: 26577, Training Loss: 0.00890
Epoch: 26577, Training Loss: 0.00891
Epoch: 26577, Training Loss: 0.01096
Epoch: 26578, Training Loss: 0.00853
Epoch: 26578, Training Loss: 0.00890
Epoch: 26578, Training Loss: 0.00891
Epoch: 26578, Training Loss: 0.01095
Epoch: 26579, Training Loss: 0.00853
Epoch: 26579, Training Loss: 0.00890
Epoch: 26579, Training Loss: 0.00891
Epoch: 26579, Training Loss: 0.01095
Epoch: 26580, Training Loss: 0.00853
Epoch: 26580, Training Loss: 0.00890
Epoch: 26580, Training Loss: 0.00891
Epoch: 26580, Training Loss: 0.01095
Epoch: 26581, Training Loss: 0.00853
Epoch: 26581, Training Loss: 0.00890
Epoch: 26581, Training Loss: 0.00891
Epoch: 26581, Training Loss: 0.01095
Epoch: 26582, Training Loss: 0.00853
Epoch: 26582, Training Loss: 0.00890
Epoch: 26582, Training Loss: 0.00891
Epoch: 26582, Training Loss: 0.01095
Epoch: 26583, Training Loss: 0.00853
Epoch: 26583, Training Loss: 0.00890
Epoch: 26583, Training Loss: 0.00891
Epoch: 26583, Training Loss: 0.01095
Epoch: 26584, Training Loss: 0.00853
Epoch: 26584, Training Loss: 0.00890
Epoch: 26584, Training Loss: 0.00891
Epoch: 26584, Training Loss: 0.01095
Epoch: 26585, Training Loss: 0.00852
Epoch: 26585, Training Loss: 0.00890
Epoch: 26585, Training Loss: 0.00891
Epoch: 26585, Training Loss: 0.01095
Epoch: 26586, Training Loss: 0.00852
Epoch: 26586, Training Loss: 0.00890
Epoch: 26586, Training Loss: 0.00891
Epoch: 26586, Training Loss: 0.01095
Epoch: 26587, Training Loss: 0.00852
Epoch: 26587, Training Loss: 0.00890
Epoch: 26587, Training Loss: 0.00891
Epoch: 26587, Training Loss: 0.01095
Epoch: 26588, Training Loss: 0.00852
Epoch: 26588, Training Loss: 0.00890
Epoch: 26588, Training Loss: 0.00891
Epoch: 26588, Training Loss: 0.01095
Epoch: 26589, Training Loss: 0.00852
Epoch: 26589, Training Loss: 0.00890
Epoch: 26589, Training Loss: 0.00891
Epoch: 26589, Training Loss: 0.01095
Epoch: 26590, Training Loss: 0.00852
Epoch: 26590, Training Loss: 0.00890
Epoch: 26590, Training Loss: 0.00891
Epoch: 26590, Training Loss: 0.01095
Epoch: 26591, Training Loss: 0.00852
Epoch: 26591, Training Loss: 0.00890
Epoch: 26591, Training Loss: 0.00891
Epoch: 26591, Training Loss: 0.01095
Epoch: 26592, Training Loss: 0.00852
Epoch: 26592, Training Loss: 0.00890
Epoch: 26592, Training Loss: 0.00891
Epoch: 26592, Training Loss: 0.01095
Epoch: 26593, Training Loss: 0.00852
Epoch: 26593, Training Loss: 0.00890
Epoch: 26593, Training Loss: 0.00891
Epoch: 26593, Training Loss: 0.01095
Epoch: 26594, Training Loss: 0.00852
Epoch: 26594, Training Loss: 0.00890
Epoch: 26594, Training Loss: 0.00891
Epoch: 26594, Training Loss: 0.01095
Epoch: 26595, Training Loss: 0.00852
Epoch: 26595, Training Loss: 0.00890
Epoch: 26595, Training Loss: 0.00891
Epoch: 26595, Training Loss: 0.01095
Epoch: 26596, Training Loss: 0.00852
Epoch: 26596, Training Loss: 0.00890
Epoch: 26596, Training Loss: 0.00891
Epoch: 26596, Training Loss: 0.01095
Epoch: 26597, Training Loss: 0.00852
Epoch: 26597, Training Loss: 0.00890
Epoch: 26597, Training Loss: 0.00891
Epoch: 26597, Training Loss: 0.01095
Epoch: 26598, Training Loss: 0.00852
Epoch: 26598, Training Loss: 0.00890
Epoch: 26598, Training Loss: 0.00891
Epoch: 26598, Training Loss: 0.01095
Epoch: 26599, Training Loss: 0.00852
Epoch: 26599, Training Loss: 0.00890
Epoch: 26599, Training Loss: 0.00891
Epoch: 26599, Training Loss: 0.01095
Epoch: 26600, Training Loss: 0.00852
Epoch: 26600, Training Loss: 0.00890
Epoch: 26600, Training Loss: 0.00891
Epoch: 26600, Training Loss: 0.01095
Epoch: 26601, Training Loss: 0.00852
Epoch: 26601, Training Loss: 0.00890
Epoch: 26601, Training Loss: 0.00891
Epoch: 26601, Training Loss: 0.01095
Epoch: 26602, Training Loss: 0.00852
Epoch: 26602, Training Loss: 0.00889
Epoch: 26602, Training Loss: 0.00891
Epoch: 26602, Training Loss: 0.01095
Epoch: 26603, Training Loss: 0.00852
Epoch: 26603, Training Loss: 0.00889
Epoch: 26603, Training Loss: 0.00891
Epoch: 26603, Training Loss: 0.01095
Epoch: 26604, Training Loss: 0.00852
Epoch: 26604, Training Loss: 0.00889
Epoch: 26604, Training Loss: 0.00891
Epoch: 26604, Training Loss: 0.01095
Epoch: 26605, Training Loss: 0.00852
Epoch: 26605, Training Loss: 0.00889
Epoch: 26605, Training Loss: 0.00891
Epoch: 26605, Training Loss: 0.01095
Epoch: 26606, Training Loss: 0.00852
Epoch: 26606, Training Loss: 0.00889
Epoch: 26606, Training Loss: 0.00891
Epoch: 26606, Training Loss: 0.01095
Epoch: 26607, Training Loss: 0.00852
Epoch: 26607, Training Loss: 0.00889
Epoch: 26607, Training Loss: 0.00890
Epoch: 26607, Training Loss: 0.01095
Epoch: 26608, Training Loss: 0.00852
Epoch: 26608, Training Loss: 0.00889
Epoch: 26608, Training Loss: 0.00890
Epoch: 26608, Training Loss: 0.01095
Epoch: 26609, Training Loss: 0.00852
Epoch: 26609, Training Loss: 0.00889
Epoch: 26609, Training Loss: 0.00890
Epoch: 26609, Training Loss: 0.01095
Epoch: 26610, Training Loss: 0.00852
Epoch: 26610, Training Loss: 0.00889
Epoch: 26610, Training Loss: 0.00890
Epoch: 26610, Training Loss: 0.01095
Epoch: 26611, Training Loss: 0.00852
Epoch: 26611, Training Loss: 0.00889
Epoch: 26611, Training Loss: 0.00890
Epoch: 26611, Training Loss: 0.01095
Epoch: 26612, Training Loss: 0.00852
Epoch: 26612, Training Loss: 0.00889
Epoch: 26612, Training Loss: 0.00890
Epoch: 26612, Training Loss: 0.01095
Epoch: 26613, Training Loss: 0.00852
Epoch: 26613, Training Loss: 0.00889
Epoch: 26613, Training Loss: 0.00890
Epoch: 26613, Training Loss: 0.01095
Epoch: 26614, Training Loss: 0.00852
Epoch: 26614, Training Loss: 0.00889
Epoch: 26614, Training Loss: 0.00890
Epoch: 26614, Training Loss: 0.01095
Epoch: 26615, Training Loss: 0.00852
Epoch: 26615, Training Loss: 0.00889
Epoch: 26615, Training Loss: 0.00890
Epoch: 26615, Training Loss: 0.01095
Epoch: 26616, Training Loss: 0.00852
Epoch: 26616, Training Loss: 0.00889
Epoch: 26616, Training Loss: 0.00890
Epoch: 26616, Training Loss: 0.01095
Epoch: 26617, Training Loss: 0.00852
Epoch: 26617, Training Loss: 0.00889
Epoch: 26617, Training Loss: 0.00890
Epoch: 26617, Training Loss: 0.01095
Epoch: 26618, Training Loss: 0.00852
Epoch: 26618, Training Loss: 0.00889
Epoch: 26618, Training Loss: 0.00890
Epoch: 26618, Training Loss: 0.01095
Epoch: 26619, Training Loss: 0.00852
Epoch: 26619, Training Loss: 0.00889
Epoch: 26619, Training Loss: 0.00890
Epoch: 26619, Training Loss: 0.01095
Epoch: 26620, Training Loss: 0.00852
Epoch: 26620, Training Loss: 0.00889
Epoch: 26620, Training Loss: 0.00890
Epoch: 26620, Training Loss: 0.01094
Epoch: 26621, Training Loss: 0.00852
Epoch: 26621, Training Loss: 0.00889
Epoch: 26621, Training Loss: 0.00890
Epoch: 26621, Training Loss: 0.01094
Epoch: 26622, Training Loss: 0.00852
Epoch: 26622, Training Loss: 0.00889
Epoch: 26622, Training Loss: 0.00890
Epoch: 26622, Training Loss: 0.01094
Epoch: 26623, Training Loss: 0.00852
Epoch: 26623, Training Loss: 0.00889
Epoch: 26623, Training Loss: 0.00890
Epoch: 26623, Training Loss: 0.01094
Epoch: 26624, Training Loss: 0.00852
Epoch: 26624, Training Loss: 0.00889
Epoch: 26624, Training Loss: 0.00890
Epoch: 26624, Training Loss: 0.01094
Epoch: 26625, Training Loss: 0.00852
Epoch: 26625, Training Loss: 0.00889
Epoch: 26625, Training Loss: 0.00890
Epoch: 26625, Training Loss: 0.01094
Epoch: 26626, Training Loss: 0.00852
Epoch: 26626, Training Loss: 0.00889
Epoch: 26626, Training Loss: 0.00890
Epoch: 26626, Training Loss: 0.01094
Epoch: 26627, Training Loss: 0.00852
Epoch: 26627, Training Loss: 0.00889
Epoch: 26627, Training Loss: 0.00890
Epoch: 26627, Training Loss: 0.01094
Epoch: 26628, Training Loss: 0.00852
Epoch: 26628, Training Loss: 0.00889
Epoch: 26628, Training Loss: 0.00890
Epoch: 26628, Training Loss: 0.01094
Epoch: 26629, Training Loss: 0.00852
Epoch: 26629, Training Loss: 0.00889
Epoch: 26629, Training Loss: 0.00890
Epoch: 26629, Training Loss: 0.01094
Epoch: 26630, Training Loss: 0.00852
Epoch: 26630, Training Loss: 0.00889
Epoch: 26630, Training Loss: 0.00890
Epoch: 26630, Training Loss: 0.01094
Epoch: 26631, Training Loss: 0.00852
Epoch: 26631, Training Loss: 0.00889
Epoch: 26631, Training Loss: 0.00890
Epoch: 26631, Training Loss: 0.01094
Epoch: 26632, Training Loss: 0.00852
Epoch: 26632, Training Loss: 0.00889
Epoch: 26632, Training Loss: 0.00890
Epoch: 26632, Training Loss: 0.01094
Epoch: 26633, Training Loss: 0.00852
Epoch: 26633, Training Loss: 0.00889
Epoch: 26633, Training Loss: 0.00890
Epoch: 26633, Training Loss: 0.01094
Epoch: 26634, Training Loss: 0.00852
Epoch: 26634, Training Loss: 0.00889
Epoch: 26634, Training Loss: 0.00890
Epoch: 26634, Training Loss: 0.01094
Epoch: 26635, Training Loss: 0.00852
Epoch: 26635, Training Loss: 0.00889
Epoch: 26635, Training Loss: 0.00890
Epoch: 26635, Training Loss: 0.01094
Epoch: 26636, Training Loss: 0.00852
Epoch: 26636, Training Loss: 0.00889
Epoch: 26636, Training Loss: 0.00890
Epoch: 26636, Training Loss: 0.01094
Epoch: 26637, Training Loss: 0.00852
Epoch: 26637, Training Loss: 0.00889
Epoch: 26637, Training Loss: 0.00890
Epoch: 26637, Training Loss: 0.01094
Epoch: 26638, Training Loss: 0.00852
Epoch: 26638, Training Loss: 0.00889
Epoch: 26638, Training Loss: 0.00890
Epoch: 26638, Training Loss: 0.01094
Epoch: 26639, Training Loss: 0.00852
Epoch: 26639, Training Loss: 0.00889
Epoch: 26639, Training Loss: 0.00890
Epoch: 26639, Training Loss: 0.01094
Epoch: 26640, Training Loss: 0.00852
Epoch: 26640, Training Loss: 0.00889
Epoch: 26640, Training Loss: 0.00890
Epoch: 26640, Training Loss: 0.01094
Epoch: 26641, Training Loss: 0.00852
Epoch: 26641, Training Loss: 0.00889
Epoch: 26641, Training Loss: 0.00890
Epoch: 26641, Training Loss: 0.01094
Epoch: 26642, Training Loss: 0.00851
Epoch: 26642, Training Loss: 0.00889
Epoch: 26642, Training Loss: 0.00890
Epoch: 26642, Training Loss: 0.01094
Epoch: 26643, Training Loss: 0.00851
Epoch: 26643, Training Loss: 0.00889
Epoch: 26643, Training Loss: 0.00890
Epoch: 26643, Training Loss: 0.01094
Epoch: 26644, Training Loss: 0.00851
Epoch: 26644, Training Loss: 0.00889
Epoch: 26644, Training Loss: 0.00890
Epoch: 26644, Training Loss: 0.01094
Epoch: 26645, Training Loss: 0.00851
Epoch: 26645, Training Loss: 0.00889
Epoch: 26645, Training Loss: 0.00890
Epoch: 26645, Training Loss: 0.01094
Epoch: 26646, Training Loss: 0.00851
Epoch: 26646, Training Loss: 0.00889
Epoch: 26646, Training Loss: 0.00890
[... output truncated: epochs 26646–26890, four loss lines per epoch (one per XOR training sample). The per-sample losses decrease very slowly over this span, from approximately 0.01094 / 0.00890 / 0.00889 / 0.00851 down to approximately 0.01088 / 0.00886 / 0.00885 / 0.00847 ...]
Epoch: 26890, Training Loss: 0.00885
Epoch: 26890, Training Loss: 0.01088
Epoch: 26891, Training Loss: 0.00847
Epoch: 26891, Training Loss: 0.00884
Epoch: 26891, Training Loss: 0.00885
Epoch: 26891, Training Loss: 0.01088
Epoch: 26892, Training Loss: 0.00847
Epoch: 26892, Training Loss: 0.00884
Epoch: 26892, Training Loss: 0.00885
Epoch: 26892, Training Loss: 0.01088
Epoch: 26893, Training Loss: 0.00847
Epoch: 26893, Training Loss: 0.00884
Epoch: 26893, Training Loss: 0.00885
Epoch: 26893, Training Loss: 0.01088
Epoch: 26894, Training Loss: 0.00847
Epoch: 26894, Training Loss: 0.00884
Epoch: 26894, Training Loss: 0.00885
Epoch: 26894, Training Loss: 0.01088
Epoch: 26895, Training Loss: 0.00847
Epoch: 26895, Training Loss: 0.00884
Epoch: 26895, Training Loss: 0.00885
Epoch: 26895, Training Loss: 0.01088
Epoch: 26896, Training Loss: 0.00847
Epoch: 26896, Training Loss: 0.00884
Epoch: 26896, Training Loss: 0.00885
Epoch: 26896, Training Loss: 0.01088
Epoch: 26897, Training Loss: 0.00847
Epoch: 26897, Training Loss: 0.00884
Epoch: 26897, Training Loss: 0.00885
Epoch: 26897, Training Loss: 0.01088
Epoch: 26898, Training Loss: 0.00847
Epoch: 26898, Training Loss: 0.00884
Epoch: 26898, Training Loss: 0.00885
Epoch: 26898, Training Loss: 0.01088
Epoch: 26899, Training Loss: 0.00847
Epoch: 26899, Training Loss: 0.00884
Epoch: 26899, Training Loss: 0.00885
Epoch: 26899, Training Loss: 0.01088
Epoch: 26900, Training Loss: 0.00847
Epoch: 26900, Training Loss: 0.00884
Epoch: 26900, Training Loss: 0.00885
Epoch: 26900, Training Loss: 0.01088
Epoch: 26901, Training Loss: 0.00847
Epoch: 26901, Training Loss: 0.00884
Epoch: 26901, Training Loss: 0.00885
Epoch: 26901, Training Loss: 0.01088
Epoch: 26902, Training Loss: 0.00847
Epoch: 26902, Training Loss: 0.00884
Epoch: 26902, Training Loss: 0.00885
Epoch: 26902, Training Loss: 0.01088
Epoch: 26903, Training Loss: 0.00847
Epoch: 26903, Training Loss: 0.00884
Epoch: 26903, Training Loss: 0.00885
Epoch: 26903, Training Loss: 0.01088
Epoch: 26904, Training Loss: 0.00847
Epoch: 26904, Training Loss: 0.00884
Epoch: 26904, Training Loss: 0.00885
Epoch: 26904, Training Loss: 0.01088
Epoch: 26905, Training Loss: 0.00847
Epoch: 26905, Training Loss: 0.00884
Epoch: 26905, Training Loss: 0.00885
Epoch: 26905, Training Loss: 0.01088
Epoch: 26906, Training Loss: 0.00847
Epoch: 26906, Training Loss: 0.00884
Epoch: 26906, Training Loss: 0.00885
Epoch: 26906, Training Loss: 0.01088
Epoch: 26907, Training Loss: 0.00847
Epoch: 26907, Training Loss: 0.00884
Epoch: 26907, Training Loss: 0.00885
Epoch: 26907, Training Loss: 0.01088
Epoch: 26908, Training Loss: 0.00847
Epoch: 26908, Training Loss: 0.00884
Epoch: 26908, Training Loss: 0.00885
Epoch: 26908, Training Loss: 0.01088
Epoch: 26909, Training Loss: 0.00847
Epoch: 26909, Training Loss: 0.00884
Epoch: 26909, Training Loss: 0.00885
Epoch: 26909, Training Loss: 0.01088
Epoch: 26910, Training Loss: 0.00847
Epoch: 26910, Training Loss: 0.00884
Epoch: 26910, Training Loss: 0.00885
Epoch: 26910, Training Loss: 0.01088
Epoch: 26911, Training Loss: 0.00847
Epoch: 26911, Training Loss: 0.00884
Epoch: 26911, Training Loss: 0.00885
Epoch: 26911, Training Loss: 0.01088
Epoch: 26912, Training Loss: 0.00847
Epoch: 26912, Training Loss: 0.00884
Epoch: 26912, Training Loss: 0.00885
Epoch: 26912, Training Loss: 0.01088
Epoch: 26913, Training Loss: 0.00847
Epoch: 26913, Training Loss: 0.00884
Epoch: 26913, Training Loss: 0.00885
Epoch: 26913, Training Loss: 0.01088
Epoch: 26914, Training Loss: 0.00847
Epoch: 26914, Training Loss: 0.00884
Epoch: 26914, Training Loss: 0.00885
Epoch: 26914, Training Loss: 0.01088
Epoch: 26915, Training Loss: 0.00847
Epoch: 26915, Training Loss: 0.00884
Epoch: 26915, Training Loss: 0.00885
Epoch: 26915, Training Loss: 0.01088
Epoch: 26916, Training Loss: 0.00847
Epoch: 26916, Training Loss: 0.00884
Epoch: 26916, Training Loss: 0.00885
Epoch: 26916, Training Loss: 0.01088
Epoch: 26917, Training Loss: 0.00847
Epoch: 26917, Training Loss: 0.00884
Epoch: 26917, Training Loss: 0.00885
Epoch: 26917, Training Loss: 0.01087
Epoch: 26918, Training Loss: 0.00847
Epoch: 26918, Training Loss: 0.00884
Epoch: 26918, Training Loss: 0.00885
Epoch: 26918, Training Loss: 0.01087
Epoch: 26919, Training Loss: 0.00847
Epoch: 26919, Training Loss: 0.00884
Epoch: 26919, Training Loss: 0.00885
Epoch: 26919, Training Loss: 0.01087
Epoch: 26920, Training Loss: 0.00847
Epoch: 26920, Training Loss: 0.00883
Epoch: 26920, Training Loss: 0.00885
Epoch: 26920, Training Loss: 0.01087
Epoch: 26921, Training Loss: 0.00847
Epoch: 26921, Training Loss: 0.00883
Epoch: 26921, Training Loss: 0.00885
Epoch: 26921, Training Loss: 0.01087
Epoch: 26922, Training Loss: 0.00847
Epoch: 26922, Training Loss: 0.00883
Epoch: 26922, Training Loss: 0.00885
Epoch: 26922, Training Loss: 0.01087
Epoch: 26923, Training Loss: 0.00847
Epoch: 26923, Training Loss: 0.00883
Epoch: 26923, Training Loss: 0.00885
Epoch: 26923, Training Loss: 0.01087
Epoch: 26924, Training Loss: 0.00847
Epoch: 26924, Training Loss: 0.00883
Epoch: 26924, Training Loss: 0.00885
Epoch: 26924, Training Loss: 0.01087
Epoch: 26925, Training Loss: 0.00846
Epoch: 26925, Training Loss: 0.00883
Epoch: 26925, Training Loss: 0.00884
Epoch: 26925, Training Loss: 0.01087
Epoch: 26926, Training Loss: 0.00846
Epoch: 26926, Training Loss: 0.00883
Epoch: 26926, Training Loss: 0.00884
Epoch: 26926, Training Loss: 0.01087
Epoch: 26927, Training Loss: 0.00846
Epoch: 26927, Training Loss: 0.00883
Epoch: 26927, Training Loss: 0.00884
Epoch: 26927, Training Loss: 0.01087
Epoch: 26928, Training Loss: 0.00846
Epoch: 26928, Training Loss: 0.00883
Epoch: 26928, Training Loss: 0.00884
Epoch: 26928, Training Loss: 0.01087
Epoch: 26929, Training Loss: 0.00846
Epoch: 26929, Training Loss: 0.00883
Epoch: 26929, Training Loss: 0.00884
Epoch: 26929, Training Loss: 0.01087
Epoch: 26930, Training Loss: 0.00846
Epoch: 26930, Training Loss: 0.00883
Epoch: 26930, Training Loss: 0.00884
Epoch: 26930, Training Loss: 0.01087
Epoch: 26931, Training Loss: 0.00846
Epoch: 26931, Training Loss: 0.00883
Epoch: 26931, Training Loss: 0.00884
Epoch: 26931, Training Loss: 0.01087
Epoch: 26932, Training Loss: 0.00846
Epoch: 26932, Training Loss: 0.00883
Epoch: 26932, Training Loss: 0.00884
Epoch: 26932, Training Loss: 0.01087
Epoch: 26933, Training Loss: 0.00846
Epoch: 26933, Training Loss: 0.00883
Epoch: 26933, Training Loss: 0.00884
Epoch: 26933, Training Loss: 0.01087
Epoch: 26934, Training Loss: 0.00846
Epoch: 26934, Training Loss: 0.00883
Epoch: 26934, Training Loss: 0.00884
Epoch: 26934, Training Loss: 0.01087
Epoch: 26935, Training Loss: 0.00846
Epoch: 26935, Training Loss: 0.00883
Epoch: 26935, Training Loss: 0.00884
Epoch: 26935, Training Loss: 0.01087
Epoch: 26936, Training Loss: 0.00846
Epoch: 26936, Training Loss: 0.00883
Epoch: 26936, Training Loss: 0.00884
Epoch: 26936, Training Loss: 0.01087
Epoch: 26937, Training Loss: 0.00846
Epoch: 26937, Training Loss: 0.00883
Epoch: 26937, Training Loss: 0.00884
Epoch: 26937, Training Loss: 0.01087
Epoch: 26938, Training Loss: 0.00846
Epoch: 26938, Training Loss: 0.00883
Epoch: 26938, Training Loss: 0.00884
Epoch: 26938, Training Loss: 0.01087
Epoch: 26939, Training Loss: 0.00846
Epoch: 26939, Training Loss: 0.00883
Epoch: 26939, Training Loss: 0.00884
Epoch: 26939, Training Loss: 0.01087
Epoch: 26940, Training Loss: 0.00846
Epoch: 26940, Training Loss: 0.00883
Epoch: 26940, Training Loss: 0.00884
Epoch: 26940, Training Loss: 0.01087
Epoch: 26941, Training Loss: 0.00846
Epoch: 26941, Training Loss: 0.00883
Epoch: 26941, Training Loss: 0.00884
Epoch: 26941, Training Loss: 0.01087
Epoch: 26942, Training Loss: 0.00846
Epoch: 26942, Training Loss: 0.00883
Epoch: 26942, Training Loss: 0.00884
Epoch: 26942, Training Loss: 0.01087
Epoch: 26943, Training Loss: 0.00846
Epoch: 26943, Training Loss: 0.00883
Epoch: 26943, Training Loss: 0.00884
Epoch: 26943, Training Loss: 0.01087
Epoch: 26944, Training Loss: 0.00846
Epoch: 26944, Training Loss: 0.00883
Epoch: 26944, Training Loss: 0.00884
Epoch: 26944, Training Loss: 0.01087
Epoch: 26945, Training Loss: 0.00846
Epoch: 26945, Training Loss: 0.00883
Epoch: 26945, Training Loss: 0.00884
Epoch: 26945, Training Loss: 0.01087
Epoch: 26946, Training Loss: 0.00846
Epoch: 26946, Training Loss: 0.00883
Epoch: 26946, Training Loss: 0.00884
Epoch: 26946, Training Loss: 0.01087
Epoch: 26947, Training Loss: 0.00846
Epoch: 26947, Training Loss: 0.00883
Epoch: 26947, Training Loss: 0.00884
Epoch: 26947, Training Loss: 0.01087
Epoch: 26948, Training Loss: 0.00846
Epoch: 26948, Training Loss: 0.00883
Epoch: 26948, Training Loss: 0.00884
Epoch: 26948, Training Loss: 0.01087
Epoch: 26949, Training Loss: 0.00846
Epoch: 26949, Training Loss: 0.00883
Epoch: 26949, Training Loss: 0.00884
Epoch: 26949, Training Loss: 0.01087
Epoch: 26950, Training Loss: 0.00846
Epoch: 26950, Training Loss: 0.00883
Epoch: 26950, Training Loss: 0.00884
Epoch: 26950, Training Loss: 0.01087
Epoch: 26951, Training Loss: 0.00846
Epoch: 26951, Training Loss: 0.00883
Epoch: 26951, Training Loss: 0.00884
Epoch: 26951, Training Loss: 0.01087
Epoch: 26952, Training Loss: 0.00846
Epoch: 26952, Training Loss: 0.00883
Epoch: 26952, Training Loss: 0.00884
Epoch: 26952, Training Loss: 0.01087
Epoch: 26953, Training Loss: 0.00846
Epoch: 26953, Training Loss: 0.00883
Epoch: 26953, Training Loss: 0.00884
Epoch: 26953, Training Loss: 0.01087
Epoch: 26954, Training Loss: 0.00846
Epoch: 26954, Training Loss: 0.00883
Epoch: 26954, Training Loss: 0.00884
Epoch: 26954, Training Loss: 0.01087
Epoch: 26955, Training Loss: 0.00846
Epoch: 26955, Training Loss: 0.00883
Epoch: 26955, Training Loss: 0.00884
Epoch: 26955, Training Loss: 0.01087
Epoch: 26956, Training Loss: 0.00846
Epoch: 26956, Training Loss: 0.00883
Epoch: 26956, Training Loss: 0.00884
Epoch: 26956, Training Loss: 0.01087
Epoch: 26957, Training Loss: 0.00846
Epoch: 26957, Training Loss: 0.00883
Epoch: 26957, Training Loss: 0.00884
Epoch: 26957, Training Loss: 0.01087
Epoch: 26958, Training Loss: 0.00846
Epoch: 26958, Training Loss: 0.00883
Epoch: 26958, Training Loss: 0.00884
Epoch: 26958, Training Loss: 0.01087
Epoch: 26959, Training Loss: 0.00846
Epoch: 26959, Training Loss: 0.00883
Epoch: 26959, Training Loss: 0.00884
Epoch: 26959, Training Loss: 0.01087
Epoch: 26960, Training Loss: 0.00846
Epoch: 26960, Training Loss: 0.00883
Epoch: 26960, Training Loss: 0.00884
Epoch: 26960, Training Loss: 0.01086
Epoch: 26961, Training Loss: 0.00846
Epoch: 26961, Training Loss: 0.00883
Epoch: 26961, Training Loss: 0.00884
Epoch: 26961, Training Loss: 0.01086
Epoch: 26962, Training Loss: 0.00846
Epoch: 26962, Training Loss: 0.00883
Epoch: 26962, Training Loss: 0.00884
Epoch: 26962, Training Loss: 0.01086
Epoch: 26963, Training Loss: 0.00846
Epoch: 26963, Training Loss: 0.00883
Epoch: 26963, Training Loss: 0.00884
Epoch: 26963, Training Loss: 0.01086
Epoch: 26964, Training Loss: 0.00846
Epoch: 26964, Training Loss: 0.00883
Epoch: 26964, Training Loss: 0.00884
Epoch: 26964, Training Loss: 0.01086
Epoch: 26965, Training Loss: 0.00846
Epoch: 26965, Training Loss: 0.00883
Epoch: 26965, Training Loss: 0.00884
Epoch: 26965, Training Loss: 0.01086
Epoch: 26966, Training Loss: 0.00846
Epoch: 26966, Training Loss: 0.00883
Epoch: 26966, Training Loss: 0.00884
Epoch: 26966, Training Loss: 0.01086
Epoch: 26967, Training Loss: 0.00846
Epoch: 26967, Training Loss: 0.00883
Epoch: 26967, Training Loss: 0.00884
Epoch: 26967, Training Loss: 0.01086
Epoch: 26968, Training Loss: 0.00846
Epoch: 26968, Training Loss: 0.00883
Epoch: 26968, Training Loss: 0.00884
Epoch: 26968, Training Loss: 0.01086
Epoch: 26969, Training Loss: 0.00846
Epoch: 26969, Training Loss: 0.00883
Epoch: 26969, Training Loss: 0.00884
Epoch: 26969, Training Loss: 0.01086
Epoch: 26970, Training Loss: 0.00846
Epoch: 26970, Training Loss: 0.00883
Epoch: 26970, Training Loss: 0.00884
Epoch: 26970, Training Loss: 0.01086
Epoch: 26971, Training Loss: 0.00846
Epoch: 26971, Training Loss: 0.00883
Epoch: 26971, Training Loss: 0.00884
Epoch: 26971, Training Loss: 0.01086
Epoch: 26972, Training Loss: 0.00846
Epoch: 26972, Training Loss: 0.00883
Epoch: 26972, Training Loss: 0.00884
Epoch: 26972, Training Loss: 0.01086
Epoch: 26973, Training Loss: 0.00846
Epoch: 26973, Training Loss: 0.00883
Epoch: 26973, Training Loss: 0.00884
Epoch: 26973, Training Loss: 0.01086
Epoch: 26974, Training Loss: 0.00846
Epoch: 26974, Training Loss: 0.00882
Epoch: 26974, Training Loss: 0.00884
Epoch: 26974, Training Loss: 0.01086
Epoch: 26975, Training Loss: 0.00846
Epoch: 26975, Training Loss: 0.00882
Epoch: 26975, Training Loss: 0.00884
Epoch: 26975, Training Loss: 0.01086
Epoch: 26976, Training Loss: 0.00846
Epoch: 26976, Training Loss: 0.00882
Epoch: 26976, Training Loss: 0.00884
Epoch: 26976, Training Loss: 0.01086
Epoch: 26977, Training Loss: 0.00846
Epoch: 26977, Training Loss: 0.00882
Epoch: 26977, Training Loss: 0.00884
Epoch: 26977, Training Loss: 0.01086
Epoch: 26978, Training Loss: 0.00846
Epoch: 26978, Training Loss: 0.00882
Epoch: 26978, Training Loss: 0.00883
Epoch: 26978, Training Loss: 0.01086
Epoch: 26979, Training Loss: 0.00846
Epoch: 26979, Training Loss: 0.00882
Epoch: 26979, Training Loss: 0.00883
Epoch: 26979, Training Loss: 0.01086
Epoch: 26980, Training Loss: 0.00846
Epoch: 26980, Training Loss: 0.00882
Epoch: 26980, Training Loss: 0.00883
Epoch: 26980, Training Loss: 0.01086
Epoch: 26981, Training Loss: 0.00846
Epoch: 26981, Training Loss: 0.00882
Epoch: 26981, Training Loss: 0.00883
Epoch: 26981, Training Loss: 0.01086
Epoch: 26982, Training Loss: 0.00845
Epoch: 26982, Training Loss: 0.00882
Epoch: 26982, Training Loss: 0.00883
Epoch: 26982, Training Loss: 0.01086
Epoch: 26983, Training Loss: 0.00845
Epoch: 26983, Training Loss: 0.00882
Epoch: 26983, Training Loss: 0.00883
Epoch: 26983, Training Loss: 0.01086
Epoch: 26984, Training Loss: 0.00845
Epoch: 26984, Training Loss: 0.00882
Epoch: 26984, Training Loss: 0.00883
Epoch: 26984, Training Loss: 0.01086
Epoch: 26985, Training Loss: 0.00845
Epoch: 26985, Training Loss: 0.00882
Epoch: 26985, Training Loss: 0.00883
Epoch: 26985, Training Loss: 0.01086
Epoch: 26986, Training Loss: 0.00845
Epoch: 26986, Training Loss: 0.00882
Epoch: 26986, Training Loss: 0.00883
Epoch: 26986, Training Loss: 0.01086
Epoch: 26987, Training Loss: 0.00845
Epoch: 26987, Training Loss: 0.00882
Epoch: 26987, Training Loss: 0.00883
Epoch: 26987, Training Loss: 0.01086
Epoch: 26988, Training Loss: 0.00845
Epoch: 26988, Training Loss: 0.00882
Epoch: 26988, Training Loss: 0.00883
Epoch: 26988, Training Loss: 0.01086
Epoch: 26989, Training Loss: 0.00845
Epoch: 26989, Training Loss: 0.00882
Epoch: 26989, Training Loss: 0.00883
Epoch: 26989, Training Loss: 0.01086
Epoch: 26990, Training Loss: 0.00845
Epoch: 26990, Training Loss: 0.00882
Epoch: 26990, Training Loss: 0.00883
Epoch: 26990, Training Loss: 0.01086
Epoch: 26991, Training Loss: 0.00845
Epoch: 26991, Training Loss: 0.00882
Epoch: 26991, Training Loss: 0.00883
Epoch: 26991, Training Loss: 0.01086
Epoch: 26992, Training Loss: 0.00845
Epoch: 26992, Training Loss: 0.00882
Epoch: 26992, Training Loss: 0.00883
Epoch: 26992, Training Loss: 0.01086
Epoch: 26993, Training Loss: 0.00845
Epoch: 26993, Training Loss: 0.00882
Epoch: 26993, Training Loss: 0.00883
Epoch: 26993, Training Loss: 0.01086
Epoch: 26994, Training Loss: 0.00845
Epoch: 26994, Training Loss: 0.00882
Epoch: 26994, Training Loss: 0.00883
Epoch: 26994, Training Loss: 0.01086
Epoch: 26995, Training Loss: 0.00845
Epoch: 26995, Training Loss: 0.00882
Epoch: 26995, Training Loss: 0.00883
Epoch: 26995, Training Loss: 0.01086
Epoch: 26996, Training Loss: 0.00845
Epoch: 26996, Training Loss: 0.00882
Epoch: 26996, Training Loss: 0.00883
Epoch: 26996, Training Loss: 0.01086
Epoch: 26997, Training Loss: 0.00845
Epoch: 26997, Training Loss: 0.00882
Epoch: 26997, Training Loss: 0.00883
Epoch: 26997, Training Loss: 0.01086
Epoch: 26998, Training Loss: 0.00845
Epoch: 26998, Training Loss: 0.00882
Epoch: 26998, Training Loss: 0.00883
Epoch: 26998, Training Loss: 0.01086
Epoch: 26999, Training Loss: 0.00845
Epoch: 26999, Training Loss: 0.00882
Epoch: 26999, Training Loss: 0.00883
Epoch: 26999, Training Loss: 0.01086
Epoch: 27000, Training Loss: 0.00845
Epoch: 27000, Training Loss: 0.00882
Epoch: 27000, Training Loss: 0.00883
Epoch: 27000, Training Loss: 0.01086
Epoch: 27001, Training Loss: 0.00845
Epoch: 27001, Training Loss: 0.00882
Epoch: 27001, Training Loss: 0.00883
Epoch: 27001, Training Loss: 0.01086
Epoch: 27002, Training Loss: 0.00845
Epoch: 27002, Training Loss: 0.00882
Epoch: 27002, Training Loss: 0.00883
Epoch: 27002, Training Loss: 0.01086
Epoch: 27003, Training Loss: 0.00845
Epoch: 27003, Training Loss: 0.00882
Epoch: 27003, Training Loss: 0.00883
Epoch: 27003, Training Loss: 0.01085
Epoch: 27004, Training Loss: 0.00845
Epoch: 27004, Training Loss: 0.00882
Epoch: 27004, Training Loss: 0.00883
Epoch: 27004, Training Loss: 0.01085
Epoch: 27005, Training Loss: 0.00845
Epoch: 27005, Training Loss: 0.00882
Epoch: 27005, Training Loss: 0.00883
Epoch: 27005, Training Loss: 0.01085
Epoch: 27006, Training Loss: 0.00845
Epoch: 27006, Training Loss: 0.00882
Epoch: 27006, Training Loss: 0.00883
Epoch: 27006, Training Loss: 0.01085
Epoch: 27007, Training Loss: 0.00845
Epoch: 27007, Training Loss: 0.00882
Epoch: 27007, Training Loss: 0.00883
Epoch: 27007, Training Loss: 0.01085
Epoch: 27008, Training Loss: 0.00845
Epoch: 27008, Training Loss: 0.00882
Epoch: 27008, Training Loss: 0.00883
Epoch: 27008, Training Loss: 0.01085
Epoch: 27009, Training Loss: 0.00845
Epoch: 27009, Training Loss: 0.00882
Epoch: 27009, Training Loss: 0.00883
Epoch: 27009, Training Loss: 0.01085
Epoch: 27010, Training Loss: 0.00845
Epoch: 27010, Training Loss: 0.00882
Epoch: 27010, Training Loss: 0.00883
Epoch: 27010, Training Loss: 0.01085
Epoch: 27011, Training Loss: 0.00845
Epoch: 27011, Training Loss: 0.00882
Epoch: 27011, Training Loss: 0.00883
Epoch: 27011, Training Loss: 0.01085
Epoch: 27012, Training Loss: 0.00845
Epoch: 27012, Training Loss: 0.00882
Epoch: 27012, Training Loss: 0.00883
Epoch: 27012, Training Loss: 0.01085
Epoch: 27013, Training Loss: 0.00845
Epoch: 27013, Training Loss: 0.00882
Epoch: 27013, Training Loss: 0.00883
Epoch: 27013, Training Loss: 0.01085
Epoch: 27014, Training Loss: 0.00845
Epoch: 27014, Training Loss: 0.00882
Epoch: 27014, Training Loss: 0.00883
Epoch: 27014, Training Loss: 0.01085
Epoch: 27015, Training Loss: 0.00845
Epoch: 27015, Training Loss: 0.00882
Epoch: 27015, Training Loss: 0.00883
Epoch: 27015, Training Loss: 0.01085
Epoch: 27016, Training Loss: 0.00845
Epoch: 27016, Training Loss: 0.00882
Epoch: 27016, Training Loss: 0.00883
Epoch: 27016, Training Loss: 0.01085
Epoch: 27017, Training Loss: 0.00845
Epoch: 27017, Training Loss: 0.00882
Epoch: 27017, Training Loss: 0.00883
Epoch: 27017, Training Loss: 0.01085
Epoch: 27018, Training Loss: 0.00845
Epoch: 27018, Training Loss: 0.00882
Epoch: 27018, Training Loss: 0.00883
Epoch: 27018, Training Loss: 0.01085
Epoch: 27019, Training Loss: 0.00845
Epoch: 27019, Training Loss: 0.00882
Epoch: 27019, Training Loss: 0.00883
Epoch: 27019, Training Loss: 0.01085
Epoch: 27020, Training Loss: 0.00845
Epoch: 27020, Training Loss: 0.00882
Epoch: 27020, Training Loss: 0.00883
Epoch: 27020, Training Loss: 0.01085
Epoch: 27021, Training Loss: 0.00845
Epoch: 27021, Training Loss: 0.00882
Epoch: 27021, Training Loss: 0.00883
Epoch: 27021, Training Loss: 0.01085
Epoch: 27022, Training Loss: 0.00845
Epoch: 27022, Training Loss: 0.00882
Epoch: 27022, Training Loss: 0.00883
Epoch: 27022, Training Loss: 0.01085
Epoch: 27023, Training Loss: 0.00845
Epoch: 27023, Training Loss: 0.00882
Epoch: 27023, Training Loss: 0.00883
Epoch: 27023, Training Loss: 0.01085
Epoch: 27024, Training Loss: 0.00845
Epoch: 27024, Training Loss: 0.00882
Epoch: 27024, Training Loss: 0.00883
Epoch: 27024, Training Loss: 0.01085
Epoch: 27025, Training Loss: 0.00845
Epoch: 27025, Training Loss: 0.00882
Epoch: 27025, Training Loss: 0.00883
Epoch: 27025, Training Loss: 0.01085
Epoch: 27026, Training Loss: 0.00845
Epoch: 27026, Training Loss: 0.00882
Epoch: 27026, Training Loss: 0.00883
Epoch: 27026, Training Loss: 0.01085
Epoch: 27027, Training Loss: 0.00845
Epoch: 27027, Training Loss: 0.00882
Epoch: 27027, Training Loss: 0.00883
Epoch: 27027, Training Loss: 0.01085
Epoch: 27028, Training Loss: 0.00845
Epoch: 27028, Training Loss: 0.00881
Epoch: 27028, Training Loss: 0.00883
Epoch: 27028, Training Loss: 0.01085
Epoch: 27029, Training Loss: 0.00845
Epoch: 27029, Training Loss: 0.00881
Epoch: 27029, Training Loss: 0.00883
Epoch: 27029, Training Loss: 0.01085
Epoch: 27030, Training Loss: 0.00845
Epoch: 27030, Training Loss: 0.00881
Epoch: 27030, Training Loss: 0.00883
Epoch: 27030, Training Loss: 0.01085
Epoch: 27031, Training Loss: 0.00845
Epoch: 27031, Training Loss: 0.00881
Epoch: 27031, Training Loss: 0.00883
Epoch: 27031, Training Loss: 0.01085
Epoch: 27032, Training Loss: 0.00845
Epoch: 27032, Training Loss: 0.00881
Epoch: 27032, Training Loss: 0.00882
Epoch: 27032, Training Loss: 0.01085
Epoch: 27033, Training Loss: 0.00845
Epoch: 27033, Training Loss: 0.00881
Epoch: 27033, Training Loss: 0.00882
Epoch: 27033, Training Loss: 0.01085
Epoch: 27034, Training Loss: 0.00845
Epoch: 27034, Training Loss: 0.00881
Epoch: 27034, Training Loss: 0.00882
Epoch: 27034, Training Loss: 0.01085
Epoch: 27035, Training Loss: 0.00845
Epoch: 27035, Training Loss: 0.00881
Epoch: 27035, Training Loss: 0.00882
Epoch: 27035, Training Loss: 0.01085
Epoch: 27036, Training Loss: 0.00845
Epoch: 27036, Training Loss: 0.00881
Epoch: 27036, Training Loss: 0.00882
Epoch: 27036, Training Loss: 0.01085
Epoch: 27037, Training Loss: 0.00845
Epoch: 27037, Training Loss: 0.00881
Epoch: 27037, Training Loss: 0.00882
Epoch: 27037, Training Loss: 0.01085
Epoch: 27038, Training Loss: 0.00845
Epoch: 27038, Training Loss: 0.00881
Epoch: 27038, Training Loss: 0.00882
Epoch: 27038, Training Loss: 0.01085
Epoch: 27039, Training Loss: 0.00845
Epoch: 27039, Training Loss: 0.00881
Epoch: 27039, Training Loss: 0.00882
Epoch: 27039, Training Loss: 0.01085
Epoch: 27040, Training Loss: 0.00844
Epoch: 27040, Training Loss: 0.00881
Epoch: 27040, Training Loss: 0.00882
Epoch: 27040, Training Loss: 0.01085
Epoch: 27041, Training Loss: 0.00844
Epoch: 27041, Training Loss: 0.00881
Epoch: 27041, Training Loss: 0.00882
Epoch: 27041, Training Loss: 0.01085
Epoch: 27042, Training Loss: 0.00844
Epoch: 27042, Training Loss: 0.00881
Epoch: 27042, Training Loss: 0.00882
Epoch: 27042, Training Loss: 0.01085
Epoch: 27043, Training Loss: 0.00844
Epoch: 27043, Training Loss: 0.00881
Epoch: 27043, Training Loss: 0.00882
Epoch: 27043, Training Loss: 0.01085
Epoch: 27044, Training Loss: 0.00844
Epoch: 27044, Training Loss: 0.00881
Epoch: 27044, Training Loss: 0.00882
Epoch: 27044, Training Loss: 0.01085
Epoch: 27045, Training Loss: 0.00844
Epoch: 27045, Training Loss: 0.00881
Epoch: 27045, Training Loss: 0.00882
Epoch: 27045, Training Loss: 0.01085
Epoch: 27046, Training Loss: 0.00844
Epoch: 27046, Training Loss: 0.00881
Epoch: 27046, Training Loss: 0.00882
Epoch: 27046, Training Loss: 0.01084
Epoch: 27047, Training Loss: 0.00844
Epoch: 27047, Training Loss: 0.00881
Epoch: 27047, Training Loss: 0.00882
Epoch: 27047, Training Loss: 0.01084
Epoch: 27048, Training Loss: 0.00844
Epoch: 27048, Training Loss: 0.00881
Epoch: 27048, Training Loss: 0.00882
Epoch: 27048, Training Loss: 0.01084
Epoch: 27049, Training Loss: 0.00844
Epoch: 27049, Training Loss: 0.00881
Epoch: 27049, Training Loss: 0.00882
Epoch: 27049, Training Loss: 0.01084
Epoch: 27050, Training Loss: 0.00844
Epoch: 27050, Training Loss: 0.00881
Epoch: 27050, Training Loss: 0.00882
Epoch: 27050, Training Loss: 0.01084
Epoch: 27051, Training Loss: 0.00844
Epoch: 27051, Training Loss: 0.00881
Epoch: 27051, Training Loss: 0.00882
Epoch: 27051, Training Loss: 0.01084
Epoch: 27052, Training Loss: 0.00844
Epoch: 27052, Training Loss: 0.00881
Epoch: 27052, Training Loss: 0.00882
Epoch: 27052, Training Loss: 0.01084
Epoch: 27053, Training Loss: 0.00844
Epoch: 27053, Training Loss: 0.00881
Epoch: 27053, Training Loss: 0.00882
Epoch: 27053, Training Loss: 0.01084
Epoch: 27054, Training Loss: 0.00844
Epoch: 27054, Training Loss: 0.00881
Epoch: 27054, Training Loss: 0.00882
Epoch: 27054, Training Loss: 0.01084
Epoch: 27055, Training Loss: 0.00844
Epoch: 27055, Training Loss: 0.00881
Epoch: 27055, Training Loss: 0.00882
Epoch: 27055, Training Loss: 0.01084
Epoch: 27056, Training Loss: 0.00844
Epoch: 27056, Training Loss: 0.00881
Epoch: 27056, Training Loss: 0.00882
Epoch: 27056, Training Loss: 0.01084
Epoch: 27057, Training Loss: 0.00844
Epoch: 27057, Training Loss: 0.00881
Epoch: 27057, Training Loss: 0.00882
Epoch: 27057, Training Loss: 0.01084
Epoch: 27058, Training Loss: 0.00844
Epoch: 27058, Training Loss: 0.00881
Epoch: 27058, Training Loss: 0.00882
Epoch: 27058, Training Loss: 0.01084
Epoch: 27059, Training Loss: 0.00844
Epoch: 27059, Training Loss: 0.00881
Epoch: 27059, Training Loss: 0.00882
Epoch: 27059, Training Loss: 0.01084
Epoch: 27060, Training Loss: 0.00844
Epoch: 27060, Training Loss: 0.00881
Epoch: 27060, Training Loss: 0.00882
Epoch: 27060, Training Loss: 0.01084
Epoch: 27061, Training Loss: 0.00844
Epoch: 27061, Training Loss: 0.00881
Epoch: 27061, Training Loss: 0.00882
Epoch: 27061, Training Loss: 0.01084
Epoch: 27062, Training Loss: 0.00844
Epoch: 27062, Training Loss: 0.00881
Epoch: 27062, Training Loss: 0.00882
Epoch: 27062, Training Loss: 0.01084
Epoch: 27063, Training Loss: 0.00844
Epoch: 27063, Training Loss: 0.00881
Epoch: 27063, Training Loss: 0.00882
Epoch: 27063, Training Loss: 0.01084
Epoch: 27064, Training Loss: 0.00844
Epoch: 27064, Training Loss: 0.00881
Epoch: 27064, Training Loss: 0.00882
Epoch: 27064, Training Loss: 0.01084
Epoch: 27065, Training Loss: 0.00844
Epoch: 27065, Training Loss: 0.00881
Epoch: 27065, Training Loss: 0.00882
Epoch: 27065, Training Loss: 0.01084
Epoch: 27066, Training Loss: 0.00844
Epoch: 27066, Training Loss: 0.00881
Epoch: 27066, Training Loss: 0.00882
Epoch: 27066, Training Loss: 0.01084
Epoch: 27067, Training Loss: 0.00844
Epoch: 27067, Training Loss: 0.00881
Epoch: 27067, Training Loss: 0.00882
Epoch: 27067, Training Loss: 0.01084
Epoch: 27068, Training Loss: 0.00844
Epoch: 27068, Training Loss: 0.00881
Epoch: 27068, Training Loss: 0.00882
Epoch: 27068, Training Loss: 0.01084
Epoch: 27069, Training Loss: 0.00844
Epoch: 27069, Training Loss: 0.00881
Epoch: 27069, Training Loss: 0.00882
Epoch: 27069, Training Loss: 0.01084
Epoch: 27070, Training Loss: 0.00844
Epoch: 27070, Training Loss: 0.00881
Epoch: 27070, Training Loss: 0.00882
Epoch: 27070, Training Loss: 0.01084
Epoch: 27071, Training Loss: 0.00844
Epoch: 27071, Training Loss: 0.00881
Epoch: 27071, Training Loss: 0.00882
Epoch: 27071, Training Loss: 0.01084
Epoch: 27072, Training Loss: 0.00844
Epoch: 27072, Training Loss: 0.00881
Epoch: 27072, Training Loss: 0.00882
Epoch: 27072, Training Loss: 0.01084
Epoch: 27073, Training Loss: 0.00844
Epoch: 27073, Training Loss: 0.00881
Epoch: 27073, Training Loss: 0.00882
Epoch: 27073, Training Loss: 0.01084
Epoch: 27074, Training Loss: 0.00844
Epoch: 27074, Training Loss: 0.00881
Epoch: 27074, Training Loss: 0.00882
Epoch: 27074, Training Loss: 0.01084
Epoch: 27075, Training Loss: 0.00844
Epoch: 27075, Training Loss: 0.00881
Epoch: 27075, Training Loss: 0.00882
Epoch: 27075, Training Loss: 0.01084
Epoch: 27076, Training Loss: 0.00844
Epoch: 27076, Training Loss: 0.00881
Epoch: 27076, Training Loss: 0.00882
Epoch: 27076, Training Loss: 0.01084
Epoch: 27077, Training Loss: 0.00844
Epoch: 27077, Training Loss: 0.00881
Epoch: 27077, Training Loss: 0.00882
Epoch: 27077, Training Loss: 0.01084
Epoch: 27078, Training Loss: 0.00844
Epoch: 27078, Training Loss: 0.00881
Epoch: 27078, Training Loss: 0.00882
Epoch: 27078, Training Loss: 0.01084
Epoch: 27079, Training Loss: 0.00844
Epoch: 27079, Training Loss: 0.00881
Epoch: 27079, Training Loss: 0.00882
Epoch: 27079, Training Loss: 0.01084
[... output truncated: the four per-sample losses are printed every epoch and decrease only marginally over epochs 27079–27322 ...]
Epoch: 27322, Training Loss: 0.00840
Epoch: 27322, Training Loss: 0.00876
Epoch: 27322, Training Loss: 0.00877
Epoch: 27322, Training Loss: 0.01078
Epoch: 27323, Training Loss: 0.00840
Epoch: 27323, Training Loss: 0.00876
Epoch: 27323, Training Loss: 0.00877
Epoch: 27323, Training Loss: 0.01078
Epoch: 27324, Training Loss: 0.00840
Epoch: 27324, Training Loss: 0.00876
Epoch: 27324, Training Loss: 0.00877
Epoch: 27324, Training Loss: 0.01078
Epoch: 27325, Training Loss: 0.00840
Epoch: 27325, Training Loss: 0.00876
Epoch: 27325, Training Loss: 0.00877
Epoch: 27325, Training Loss: 0.01078
Epoch: 27326, Training Loss: 0.00840
Epoch: 27326, Training Loss: 0.00876
Epoch: 27326, Training Loss: 0.00877
Epoch: 27326, Training Loss: 0.01078
Epoch: 27327, Training Loss: 0.00840
Epoch: 27327, Training Loss: 0.00876
Epoch: 27327, Training Loss: 0.00877
Epoch: 27327, Training Loss: 0.01078
Epoch: 27328, Training Loss: 0.00840
Epoch: 27328, Training Loss: 0.00876
Epoch: 27328, Training Loss: 0.00877
Epoch: 27328, Training Loss: 0.01078
Epoch: 27329, Training Loss: 0.00840
Epoch: 27329, Training Loss: 0.00876
Epoch: 27329, Training Loss: 0.00877
Epoch: 27329, Training Loss: 0.01078
Epoch: 27330, Training Loss: 0.00840
Epoch: 27330, Training Loss: 0.00876
Epoch: 27330, Training Loss: 0.00877
Epoch: 27330, Training Loss: 0.01078
Epoch: 27331, Training Loss: 0.00839
Epoch: 27331, Training Loss: 0.00876
Epoch: 27331, Training Loss: 0.00877
Epoch: 27331, Training Loss: 0.01078
Epoch: 27332, Training Loss: 0.00839
Epoch: 27332, Training Loss: 0.00876
Epoch: 27332, Training Loss: 0.00877
Epoch: 27332, Training Loss: 0.01078
Epoch: 27333, Training Loss: 0.00839
Epoch: 27333, Training Loss: 0.00876
Epoch: 27333, Training Loss: 0.00877
Epoch: 27333, Training Loss: 0.01078
Epoch: 27334, Training Loss: 0.00839
Epoch: 27334, Training Loss: 0.00876
Epoch: 27334, Training Loss: 0.00877
Epoch: 27334, Training Loss: 0.01078
Epoch: 27335, Training Loss: 0.00839
Epoch: 27335, Training Loss: 0.00876
Epoch: 27335, Training Loss: 0.00877
Epoch: 27335, Training Loss: 0.01078
Epoch: 27336, Training Loss: 0.00839
Epoch: 27336, Training Loss: 0.00876
Epoch: 27336, Training Loss: 0.00877
Epoch: 27336, Training Loss: 0.01078
Epoch: 27337, Training Loss: 0.00839
Epoch: 27337, Training Loss: 0.00876
Epoch: 27337, Training Loss: 0.00877
Epoch: 27337, Training Loss: 0.01078
Epoch: 27338, Training Loss: 0.00839
Epoch: 27338, Training Loss: 0.00876
Epoch: 27338, Training Loss: 0.00877
Epoch: 27338, Training Loss: 0.01078
Epoch: 27339, Training Loss: 0.00839
Epoch: 27339, Training Loss: 0.00876
Epoch: 27339, Training Loss: 0.00877
Epoch: 27339, Training Loss: 0.01078
Epoch: 27340, Training Loss: 0.00839
Epoch: 27340, Training Loss: 0.00876
Epoch: 27340, Training Loss: 0.00877
Epoch: 27340, Training Loss: 0.01078
Epoch: 27341, Training Loss: 0.00839
Epoch: 27341, Training Loss: 0.00876
Epoch: 27341, Training Loss: 0.00877
Epoch: 27341, Training Loss: 0.01078
Epoch: 27342, Training Loss: 0.00839
Epoch: 27342, Training Loss: 0.00876
Epoch: 27342, Training Loss: 0.00877
Epoch: 27342, Training Loss: 0.01078
Epoch: 27343, Training Loss: 0.00839
Epoch: 27343, Training Loss: 0.00876
Epoch: 27343, Training Loss: 0.00877
Epoch: 27343, Training Loss: 0.01078
Epoch: 27344, Training Loss: 0.00839
Epoch: 27344, Training Loss: 0.00876
Epoch: 27344, Training Loss: 0.00877
Epoch: 27344, Training Loss: 0.01078
Epoch: 27345, Training Loss: 0.00839
Epoch: 27345, Training Loss: 0.00876
Epoch: 27345, Training Loss: 0.00877
Epoch: 27345, Training Loss: 0.01078
Epoch: 27346, Training Loss: 0.00839
Epoch: 27346, Training Loss: 0.00876
Epoch: 27346, Training Loss: 0.00877
Epoch: 27346, Training Loss: 0.01078
Epoch: 27347, Training Loss: 0.00839
Epoch: 27347, Training Loss: 0.00876
Epoch: 27347, Training Loss: 0.00877
Epoch: 27347, Training Loss: 0.01078
Epoch: 27348, Training Loss: 0.00839
Epoch: 27348, Training Loss: 0.00876
Epoch: 27348, Training Loss: 0.00877
Epoch: 27348, Training Loss: 0.01078
Epoch: 27349, Training Loss: 0.00839
Epoch: 27349, Training Loss: 0.00876
Epoch: 27349, Training Loss: 0.00877
Epoch: 27349, Training Loss: 0.01078
Epoch: 27350, Training Loss: 0.00839
Epoch: 27350, Training Loss: 0.00876
Epoch: 27350, Training Loss: 0.00877
Epoch: 27350, Training Loss: 0.01078
Epoch: 27351, Training Loss: 0.00839
Epoch: 27351, Training Loss: 0.00876
Epoch: 27351, Training Loss: 0.00877
Epoch: 27351, Training Loss: 0.01078
Epoch: 27352, Training Loss: 0.00839
Epoch: 27352, Training Loss: 0.00876
Epoch: 27352, Training Loss: 0.00877
Epoch: 27352, Training Loss: 0.01077
Epoch: 27353, Training Loss: 0.00839
Epoch: 27353, Training Loss: 0.00876
Epoch: 27353, Training Loss: 0.00877
Epoch: 27353, Training Loss: 0.01077
Epoch: 27354, Training Loss: 0.00839
Epoch: 27354, Training Loss: 0.00875
Epoch: 27354, Training Loss: 0.00877
Epoch: 27354, Training Loss: 0.01077
Epoch: 27355, Training Loss: 0.00839
Epoch: 27355, Training Loss: 0.00875
Epoch: 27355, Training Loss: 0.00877
Epoch: 27355, Training Loss: 0.01077
Epoch: 27356, Training Loss: 0.00839
Epoch: 27356, Training Loss: 0.00875
Epoch: 27356, Training Loss: 0.00877
Epoch: 27356, Training Loss: 0.01077
Epoch: 27357, Training Loss: 0.00839
Epoch: 27357, Training Loss: 0.00875
Epoch: 27357, Training Loss: 0.00877
Epoch: 27357, Training Loss: 0.01077
Epoch: 27358, Training Loss: 0.00839
Epoch: 27358, Training Loss: 0.00875
Epoch: 27358, Training Loss: 0.00876
Epoch: 27358, Training Loss: 0.01077
Epoch: 27359, Training Loss: 0.00839
Epoch: 27359, Training Loss: 0.00875
Epoch: 27359, Training Loss: 0.00876
Epoch: 27359, Training Loss: 0.01077
Epoch: 27360, Training Loss: 0.00839
Epoch: 27360, Training Loss: 0.00875
Epoch: 27360, Training Loss: 0.00876
Epoch: 27360, Training Loss: 0.01077
Epoch: 27361, Training Loss: 0.00839
Epoch: 27361, Training Loss: 0.00875
Epoch: 27361, Training Loss: 0.00876
Epoch: 27361, Training Loss: 0.01077
Epoch: 27362, Training Loss: 0.00839
Epoch: 27362, Training Loss: 0.00875
Epoch: 27362, Training Loss: 0.00876
Epoch: 27362, Training Loss: 0.01077
Epoch: 27363, Training Loss: 0.00839
Epoch: 27363, Training Loss: 0.00875
Epoch: 27363, Training Loss: 0.00876
Epoch: 27363, Training Loss: 0.01077
Epoch: 27364, Training Loss: 0.00839
Epoch: 27364, Training Loss: 0.00875
Epoch: 27364, Training Loss: 0.00876
Epoch: 27364, Training Loss: 0.01077
Epoch: 27365, Training Loss: 0.00839
Epoch: 27365, Training Loss: 0.00875
Epoch: 27365, Training Loss: 0.00876
Epoch: 27365, Training Loss: 0.01077
Epoch: 27366, Training Loss: 0.00839
Epoch: 27366, Training Loss: 0.00875
Epoch: 27366, Training Loss: 0.00876
Epoch: 27366, Training Loss: 0.01077
Epoch: 27367, Training Loss: 0.00839
Epoch: 27367, Training Loss: 0.00875
Epoch: 27367, Training Loss: 0.00876
Epoch: 27367, Training Loss: 0.01077
Epoch: 27368, Training Loss: 0.00839
Epoch: 27368, Training Loss: 0.00875
Epoch: 27368, Training Loss: 0.00876
Epoch: 27368, Training Loss: 0.01077
Epoch: 27369, Training Loss: 0.00839
Epoch: 27369, Training Loss: 0.00875
Epoch: 27369, Training Loss: 0.00876
Epoch: 27369, Training Loss: 0.01077
Epoch: 27370, Training Loss: 0.00839
Epoch: 27370, Training Loss: 0.00875
Epoch: 27370, Training Loss: 0.00876
Epoch: 27370, Training Loss: 0.01077
Epoch: 27371, Training Loss: 0.00839
Epoch: 27371, Training Loss: 0.00875
Epoch: 27371, Training Loss: 0.00876
Epoch: 27371, Training Loss: 0.01077
Epoch: 27372, Training Loss: 0.00839
Epoch: 27372, Training Loss: 0.00875
Epoch: 27372, Training Loss: 0.00876
Epoch: 27372, Training Loss: 0.01077
Epoch: 27373, Training Loss: 0.00839
Epoch: 27373, Training Loss: 0.00875
Epoch: 27373, Training Loss: 0.00876
Epoch: 27373, Training Loss: 0.01077
Epoch: 27374, Training Loss: 0.00839
Epoch: 27374, Training Loss: 0.00875
Epoch: 27374, Training Loss: 0.00876
Epoch: 27374, Training Loss: 0.01077
Epoch: 27375, Training Loss: 0.00839
Epoch: 27375, Training Loss: 0.00875
Epoch: 27375, Training Loss: 0.00876
Epoch: 27375, Training Loss: 0.01077
Epoch: 27376, Training Loss: 0.00839
Epoch: 27376, Training Loss: 0.00875
Epoch: 27376, Training Loss: 0.00876
Epoch: 27376, Training Loss: 0.01077
Epoch: 27377, Training Loss: 0.00839
Epoch: 27377, Training Loss: 0.00875
Epoch: 27377, Training Loss: 0.00876
Epoch: 27377, Training Loss: 0.01077
Epoch: 27378, Training Loss: 0.00839
Epoch: 27378, Training Loss: 0.00875
Epoch: 27378, Training Loss: 0.00876
Epoch: 27378, Training Loss: 0.01077
Epoch: 27379, Training Loss: 0.00839
Epoch: 27379, Training Loss: 0.00875
Epoch: 27379, Training Loss: 0.00876
Epoch: 27379, Training Loss: 0.01077
Epoch: 27380, Training Loss: 0.00839
Epoch: 27380, Training Loss: 0.00875
Epoch: 27380, Training Loss: 0.00876
Epoch: 27380, Training Loss: 0.01077
Epoch: 27381, Training Loss: 0.00839
Epoch: 27381, Training Loss: 0.00875
Epoch: 27381, Training Loss: 0.00876
Epoch: 27381, Training Loss: 0.01077
Epoch: 27382, Training Loss: 0.00839
Epoch: 27382, Training Loss: 0.00875
Epoch: 27382, Training Loss: 0.00876
Epoch: 27382, Training Loss: 0.01077
Epoch: 27383, Training Loss: 0.00839
Epoch: 27383, Training Loss: 0.00875
Epoch: 27383, Training Loss: 0.00876
Epoch: 27383, Training Loss: 0.01077
Epoch: 27384, Training Loss: 0.00839
Epoch: 27384, Training Loss: 0.00875
Epoch: 27384, Training Loss: 0.00876
Epoch: 27384, Training Loss: 0.01077
Epoch: 27385, Training Loss: 0.00839
Epoch: 27385, Training Loss: 0.00875
Epoch: 27385, Training Loss: 0.00876
Epoch: 27385, Training Loss: 0.01077
Epoch: 27386, Training Loss: 0.00839
Epoch: 27386, Training Loss: 0.00875
Epoch: 27386, Training Loss: 0.00876
Epoch: 27386, Training Loss: 0.01077
Epoch: 27387, Training Loss: 0.00839
Epoch: 27387, Training Loss: 0.00875
Epoch: 27387, Training Loss: 0.00876
Epoch: 27387, Training Loss: 0.01077
Epoch: 27388, Training Loss: 0.00839
Epoch: 27388, Training Loss: 0.00875
Epoch: 27388, Training Loss: 0.00876
Epoch: 27388, Training Loss: 0.01077
Epoch: 27389, Training Loss: 0.00838
Epoch: 27389, Training Loss: 0.00875
Epoch: 27389, Training Loss: 0.00876
Epoch: 27389, Training Loss: 0.01077
Epoch: 27390, Training Loss: 0.00838
Epoch: 27390, Training Loss: 0.00875
Epoch: 27390, Training Loss: 0.00876
Epoch: 27390, Training Loss: 0.01077
Epoch: 27391, Training Loss: 0.00838
Epoch: 27391, Training Loss: 0.00875
Epoch: 27391, Training Loss: 0.00876
Epoch: 27391, Training Loss: 0.01077
Epoch: 27392, Training Loss: 0.00838
Epoch: 27392, Training Loss: 0.00875
Epoch: 27392, Training Loss: 0.00876
Epoch: 27392, Training Loss: 0.01077
Epoch: 27393, Training Loss: 0.00838
Epoch: 27393, Training Loss: 0.00875
Epoch: 27393, Training Loss: 0.00876
Epoch: 27393, Training Loss: 0.01077
Epoch: 27394, Training Loss: 0.00838
Epoch: 27394, Training Loss: 0.00875
Epoch: 27394, Training Loss: 0.00876
Epoch: 27394, Training Loss: 0.01077
Epoch: 27395, Training Loss: 0.00838
Epoch: 27395, Training Loss: 0.00875
Epoch: 27395, Training Loss: 0.00876
Epoch: 27395, Training Loss: 0.01077
Epoch: 27396, Training Loss: 0.00838
Epoch: 27396, Training Loss: 0.00875
Epoch: 27396, Training Loss: 0.00876
Epoch: 27396, Training Loss: 0.01076
Epoch: 27397, Training Loss: 0.00838
Epoch: 27397, Training Loss: 0.00875
Epoch: 27397, Training Loss: 0.00876
Epoch: 27397, Training Loss: 0.01076
Epoch: 27398, Training Loss: 0.00838
Epoch: 27398, Training Loss: 0.00875
Epoch: 27398, Training Loss: 0.00876
Epoch: 27398, Training Loss: 0.01076
Epoch: 27399, Training Loss: 0.00838
Epoch: 27399, Training Loss: 0.00875
Epoch: 27399, Training Loss: 0.00876
Epoch: 27399, Training Loss: 0.01076
Epoch: 27400, Training Loss: 0.00838
Epoch: 27400, Training Loss: 0.00875
Epoch: 27400, Training Loss: 0.00876
Epoch: 27400, Training Loss: 0.01076
Epoch: 27401, Training Loss: 0.00838
Epoch: 27401, Training Loss: 0.00875
Epoch: 27401, Training Loss: 0.00876
Epoch: 27401, Training Loss: 0.01076
Epoch: 27402, Training Loss: 0.00838
Epoch: 27402, Training Loss: 0.00875
Epoch: 27402, Training Loss: 0.00876
Epoch: 27402, Training Loss: 0.01076
Epoch: 27403, Training Loss: 0.00838
Epoch: 27403, Training Loss: 0.00875
Epoch: 27403, Training Loss: 0.00876
Epoch: 27403, Training Loss: 0.01076
Epoch: 27404, Training Loss: 0.00838
Epoch: 27404, Training Loss: 0.00875
Epoch: 27404, Training Loss: 0.00876
Epoch: 27404, Training Loss: 0.01076
Epoch: 27405, Training Loss: 0.00838
Epoch: 27405, Training Loss: 0.00875
Epoch: 27405, Training Loss: 0.00876
Epoch: 27405, Training Loss: 0.01076
Epoch: 27406, Training Loss: 0.00838
Epoch: 27406, Training Loss: 0.00875
Epoch: 27406, Training Loss: 0.00876
Epoch: 27406, Training Loss: 0.01076
Epoch: 27407, Training Loss: 0.00838
Epoch: 27407, Training Loss: 0.00875
Epoch: 27407, Training Loss: 0.00876
Epoch: 27407, Training Loss: 0.01076
Epoch: 27408, Training Loss: 0.00838
Epoch: 27408, Training Loss: 0.00875
Epoch: 27408, Training Loss: 0.00876
Epoch: 27408, Training Loss: 0.01076
Epoch: 27409, Training Loss: 0.00838
Epoch: 27409, Training Loss: 0.00874
Epoch: 27409, Training Loss: 0.00876
Epoch: 27409, Training Loss: 0.01076
Epoch: 27410, Training Loss: 0.00838
Epoch: 27410, Training Loss: 0.00874
Epoch: 27410, Training Loss: 0.00876
Epoch: 27410, Training Loss: 0.01076
Epoch: 27411, Training Loss: 0.00838
Epoch: 27411, Training Loss: 0.00874
Epoch: 27411, Training Loss: 0.00876
Epoch: 27411, Training Loss: 0.01076
Epoch: 27412, Training Loss: 0.00838
Epoch: 27412, Training Loss: 0.00874
Epoch: 27412, Training Loss: 0.00876
Epoch: 27412, Training Loss: 0.01076
Epoch: 27413, Training Loss: 0.00838
Epoch: 27413, Training Loss: 0.00874
Epoch: 27413, Training Loss: 0.00875
Epoch: 27413, Training Loss: 0.01076
Epoch: 27414, Training Loss: 0.00838
Epoch: 27414, Training Loss: 0.00874
Epoch: 27414, Training Loss: 0.00875
Epoch: 27414, Training Loss: 0.01076
Epoch: 27415, Training Loss: 0.00838
Epoch: 27415, Training Loss: 0.00874
Epoch: 27415, Training Loss: 0.00875
Epoch: 27415, Training Loss: 0.01076
Epoch: 27416, Training Loss: 0.00838
Epoch: 27416, Training Loss: 0.00874
Epoch: 27416, Training Loss: 0.00875
Epoch: 27416, Training Loss: 0.01076
Epoch: 27417, Training Loss: 0.00838
Epoch: 27417, Training Loss: 0.00874
Epoch: 27417, Training Loss: 0.00875
Epoch: 27417, Training Loss: 0.01076
Epoch: 27418, Training Loss: 0.00838
Epoch: 27418, Training Loss: 0.00874
Epoch: 27418, Training Loss: 0.00875
Epoch: 27418, Training Loss: 0.01076
Epoch: 27419, Training Loss: 0.00838
Epoch: 27419, Training Loss: 0.00874
Epoch: 27419, Training Loss: 0.00875
Epoch: 27419, Training Loss: 0.01076
Epoch: 27420, Training Loss: 0.00838
Epoch: 27420, Training Loss: 0.00874
Epoch: 27420, Training Loss: 0.00875
Epoch: 27420, Training Loss: 0.01076
Epoch: 27421, Training Loss: 0.00838
Epoch: 27421, Training Loss: 0.00874
Epoch: 27421, Training Loss: 0.00875
Epoch: 27421, Training Loss: 0.01076
Epoch: 27422, Training Loss: 0.00838
Epoch: 27422, Training Loss: 0.00874
Epoch: 27422, Training Loss: 0.00875
Epoch: 27422, Training Loss: 0.01076
Epoch: 27423, Training Loss: 0.00838
Epoch: 27423, Training Loss: 0.00874
Epoch: 27423, Training Loss: 0.00875
Epoch: 27423, Training Loss: 0.01076
Epoch: 27424, Training Loss: 0.00838
Epoch: 27424, Training Loss: 0.00874
Epoch: 27424, Training Loss: 0.00875
Epoch: 27424, Training Loss: 0.01076
Epoch: 27425, Training Loss: 0.00838
Epoch: 27425, Training Loss: 0.00874
Epoch: 27425, Training Loss: 0.00875
Epoch: 27425, Training Loss: 0.01076
Epoch: 27426, Training Loss: 0.00838
Epoch: 27426, Training Loss: 0.00874
Epoch: 27426, Training Loss: 0.00875
Epoch: 27426, Training Loss: 0.01076
Epoch: 27427, Training Loss: 0.00838
Epoch: 27427, Training Loss: 0.00874
Epoch: 27427, Training Loss: 0.00875
Epoch: 27427, Training Loss: 0.01076
Epoch: 27428, Training Loss: 0.00838
Epoch: 27428, Training Loss: 0.00874
Epoch: 27428, Training Loss: 0.00875
Epoch: 27428, Training Loss: 0.01076
Epoch: 27429, Training Loss: 0.00838
Epoch: 27429, Training Loss: 0.00874
Epoch: 27429, Training Loss: 0.00875
Epoch: 27429, Training Loss: 0.01076
Epoch: 27430, Training Loss: 0.00838
Epoch: 27430, Training Loss: 0.00874
Epoch: 27430, Training Loss: 0.00875
Epoch: 27430, Training Loss: 0.01076
Epoch: 27431, Training Loss: 0.00838
Epoch: 27431, Training Loss: 0.00874
Epoch: 27431, Training Loss: 0.00875
Epoch: 27431, Training Loss: 0.01076
Epoch: 27432, Training Loss: 0.00838
Epoch: 27432, Training Loss: 0.00874
Epoch: 27432, Training Loss: 0.00875
Epoch: 27432, Training Loss: 0.01076
Epoch: 27433, Training Loss: 0.00838
Epoch: 27433, Training Loss: 0.00874
Epoch: 27433, Training Loss: 0.00875
Epoch: 27433, Training Loss: 0.01076
Epoch: 27434, Training Loss: 0.00838
Epoch: 27434, Training Loss: 0.00874
Epoch: 27434, Training Loss: 0.00875
Epoch: 27434, Training Loss: 0.01076
Epoch: 27435, Training Loss: 0.00838
Epoch: 27435, Training Loss: 0.00874
Epoch: 27435, Training Loss: 0.00875
Epoch: 27435, Training Loss: 0.01076
Epoch: 27436, Training Loss: 0.00838
Epoch: 27436, Training Loss: 0.00874
Epoch: 27436, Training Loss: 0.00875
Epoch: 27436, Training Loss: 0.01076
Epoch: 27437, Training Loss: 0.00838
Epoch: 27437, Training Loss: 0.00874
Epoch: 27437, Training Loss: 0.00875
Epoch: 27437, Training Loss: 0.01076
Epoch: 27438, Training Loss: 0.00838
Epoch: 27438, Training Loss: 0.00874
Epoch: 27438, Training Loss: 0.00875
Epoch: 27438, Training Loss: 0.01076
Epoch: 27439, Training Loss: 0.00838
Epoch: 27439, Training Loss: 0.00874
Epoch: 27439, Training Loss: 0.00875
Epoch: 27439, Training Loss: 0.01076
Epoch: 27440, Training Loss: 0.00838
Epoch: 27440, Training Loss: 0.00874
Epoch: 27440, Training Loss: 0.00875
Epoch: 27440, Training Loss: 0.01076
Epoch: 27441, Training Loss: 0.00838
Epoch: 27441, Training Loss: 0.00874
Epoch: 27441, Training Loss: 0.00875
Epoch: 27441, Training Loss: 0.01075
Epoch: 27442, Training Loss: 0.00838
Epoch: 27442, Training Loss: 0.00874
Epoch: 27442, Training Loss: 0.00875
Epoch: 27442, Training Loss: 0.01075
Epoch: 27443, Training Loss: 0.00838
Epoch: 27443, Training Loss: 0.00874
Epoch: 27443, Training Loss: 0.00875
Epoch: 27443, Training Loss: 0.01075
Epoch: 27444, Training Loss: 0.00838
Epoch: 27444, Training Loss: 0.00874
Epoch: 27444, Training Loss: 0.00875
Epoch: 27444, Training Loss: 0.01075
Epoch: 27445, Training Loss: 0.00838
Epoch: 27445, Training Loss: 0.00874
Epoch: 27445, Training Loss: 0.00875
Epoch: 27445, Training Loss: 0.01075
Epoch: 27446, Training Loss: 0.00838
Epoch: 27446, Training Loss: 0.00874
Epoch: 27446, Training Loss: 0.00875
Epoch: 27446, Training Loss: 0.01075
Epoch: 27447, Training Loss: 0.00838
Epoch: 27447, Training Loss: 0.00874
Epoch: 27447, Training Loss: 0.00875
Epoch: 27447, Training Loss: 0.01075
Epoch: 27448, Training Loss: 0.00837
Epoch: 27448, Training Loss: 0.00874
Epoch: 27448, Training Loss: 0.00875
Epoch: 27448, Training Loss: 0.01075
Epoch: 27449, Training Loss: 0.00837
Epoch: 27449, Training Loss: 0.00874
Epoch: 27449, Training Loss: 0.00875
Epoch: 27449, Training Loss: 0.01075
Epoch: 27450, Training Loss: 0.00837
Epoch: 27450, Training Loss: 0.00874
Epoch: 27450, Training Loss: 0.00875
Epoch: 27450, Training Loss: 0.01075
Epoch: 27451, Training Loss: 0.00837
Epoch: 27451, Training Loss: 0.00874
Epoch: 27451, Training Loss: 0.00875
Epoch: 27451, Training Loss: 0.01075
Epoch: 27452, Training Loss: 0.00837
Epoch: 27452, Training Loss: 0.00874
Epoch: 27452, Training Loss: 0.00875
Epoch: 27452, Training Loss: 0.01075
Epoch: 27453, Training Loss: 0.00837
Epoch: 27453, Training Loss: 0.00874
Epoch: 27453, Training Loss: 0.00875
Epoch: 27453, Training Loss: 0.01075
Epoch: 27454, Training Loss: 0.00837
Epoch: 27454, Training Loss: 0.00874
Epoch: 27454, Training Loss: 0.00875
Epoch: 27454, Training Loss: 0.01075
Epoch: 27455, Training Loss: 0.00837
Epoch: 27455, Training Loss: 0.00874
Epoch: 27455, Training Loss: 0.00875
Epoch: 27455, Training Loss: 0.01075
Epoch: 27456, Training Loss: 0.00837
Epoch: 27456, Training Loss: 0.00874
Epoch: 27456, Training Loss: 0.00875
Epoch: 27456, Training Loss: 0.01075
Epoch: 27457, Training Loss: 0.00837
Epoch: 27457, Training Loss: 0.00874
Epoch: 27457, Training Loss: 0.00875
Epoch: 27457, Training Loss: 0.01075
Epoch: 27458, Training Loss: 0.00837
Epoch: 27458, Training Loss: 0.00874
Epoch: 27458, Training Loss: 0.00875
Epoch: 27458, Training Loss: 0.01075
Epoch: 27459, Training Loss: 0.00837
Epoch: 27459, Training Loss: 0.00874
Epoch: 27459, Training Loss: 0.00875
Epoch: 27459, Training Loss: 0.01075
Epoch: 27460, Training Loss: 0.00837
Epoch: 27460, Training Loss: 0.00874
Epoch: 27460, Training Loss: 0.00875
Epoch: 27460, Training Loss: 0.01075
Epoch: 27461, Training Loss: 0.00837
Epoch: 27461, Training Loss: 0.00874
Epoch: 27461, Training Loss: 0.00875
Epoch: 27461, Training Loss: 0.01075
Epoch: 27462, Training Loss: 0.00837
Epoch: 27462, Training Loss: 0.00874
Epoch: 27462, Training Loss: 0.00875
Epoch: 27462, Training Loss: 0.01075
Epoch: 27463, Training Loss: 0.00837
Epoch: 27463, Training Loss: 0.00874
Epoch: 27463, Training Loss: 0.00875
Epoch: 27463, Training Loss: 0.01075
Epoch: 27464, Training Loss: 0.00837
Epoch: 27464, Training Loss: 0.00873
Epoch: 27464, Training Loss: 0.00875
Epoch: 27464, Training Loss: 0.01075
Epoch: 27465, Training Loss: 0.00837
Epoch: 27465, Training Loss: 0.00873
Epoch: 27465, Training Loss: 0.00875
Epoch: 27465, Training Loss: 0.01075
Epoch: 27466, Training Loss: 0.00837
Epoch: 27466, Training Loss: 0.00873
Epoch: 27466, Training Loss: 0.00875
Epoch: 27466, Training Loss: 0.01075
Epoch: 27467, Training Loss: 0.00837
Epoch: 27467, Training Loss: 0.00873
Epoch: 27467, Training Loss: 0.00875
Epoch: 27467, Training Loss: 0.01075
Epoch: 27468, Training Loss: 0.00837
Epoch: 27468, Training Loss: 0.00873
Epoch: 27468, Training Loss: 0.00874
Epoch: 27468, Training Loss: 0.01075
Epoch: 27469, Training Loss: 0.00837
Epoch: 27469, Training Loss: 0.00873
Epoch: 27469, Training Loss: 0.00874
Epoch: 27469, Training Loss: 0.01075
Epoch: 27470, Training Loss: 0.00837
Epoch: 27470, Training Loss: 0.00873
Epoch: 27470, Training Loss: 0.00874
Epoch: 27470, Training Loss: 0.01075
Epoch: 27471, Training Loss: 0.00837
Epoch: 27471, Training Loss: 0.00873
Epoch: 27471, Training Loss: 0.00874
Epoch: 27471, Training Loss: 0.01075
Epoch: 27472, Training Loss: 0.00837
Epoch: 27472, Training Loss: 0.00873
Epoch: 27472, Training Loss: 0.00874
Epoch: 27472, Training Loss: 0.01075
Epoch: 27473, Training Loss: 0.00837
Epoch: 27473, Training Loss: 0.00873
Epoch: 27473, Training Loss: 0.00874
Epoch: 27473, Training Loss: 0.01075
Epoch: 27474, Training Loss: 0.00837
Epoch: 27474, Training Loss: 0.00873
Epoch: 27474, Training Loss: 0.00874
Epoch: 27474, Training Loss: 0.01075
Epoch: 27475, Training Loss: 0.00837
Epoch: 27475, Training Loss: 0.00873
Epoch: 27475, Training Loss: 0.00874
Epoch: 27475, Training Loss: 0.01075
Epoch: 27476, Training Loss: 0.00837
Epoch: 27476, Training Loss: 0.00873
Epoch: 27476, Training Loss: 0.00874
Epoch: 27476, Training Loss: 0.01075
Epoch: 27477, Training Loss: 0.00837
Epoch: 27477, Training Loss: 0.00873
Epoch: 27477, Training Loss: 0.00874
Epoch: 27477, Training Loss: 0.01075
Epoch: 27478, Training Loss: 0.00837
Epoch: 27478, Training Loss: 0.00873
Epoch: 27478, Training Loss: 0.00874
Epoch: 27478, Training Loss: 0.01075
Epoch: 27479, Training Loss: 0.00837
Epoch: 27479, Training Loss: 0.00873
Epoch: 27479, Training Loss: 0.00874
Epoch: 27479, Training Loss: 0.01075
Epoch: 27480, Training Loss: 0.00837
Epoch: 27480, Training Loss: 0.00873
Epoch: 27480, Training Loss: 0.00874
Epoch: 27480, Training Loss: 0.01075
Epoch: 27481, Training Loss: 0.00837
Epoch: 27481, Training Loss: 0.00873
Epoch: 27481, Training Loss: 0.00874
Epoch: 27481, Training Loss: 0.01075
Epoch: 27482, Training Loss: 0.00837
Epoch: 27482, Training Loss: 0.00873
Epoch: 27482, Training Loss: 0.00874
Epoch: 27482, Training Loss: 0.01075
Epoch: 27483, Training Loss: 0.00837
Epoch: 27483, Training Loss: 0.00873
Epoch: 27483, Training Loss: 0.00874
Epoch: 27483, Training Loss: 0.01075
Epoch: 27484, Training Loss: 0.00837
Epoch: 27484, Training Loss: 0.00873
Epoch: 27484, Training Loss: 0.00874
Epoch: 27484, Training Loss: 0.01075
Epoch: 27485, Training Loss: 0.00837
Epoch: 27485, Training Loss: 0.00873
Epoch: 27485, Training Loss: 0.00874
Epoch: 27485, Training Loss: 0.01074
Epoch: 27486, Training Loss: 0.00837
Epoch: 27486, Training Loss: 0.00873
Epoch: 27486, Training Loss: 0.00874
Epoch: 27486, Training Loss: 0.01074
Epoch: 27487, Training Loss: 0.00837
Epoch: 27487, Training Loss: 0.00873
Epoch: 27487, Training Loss: 0.00874
Epoch: 27487, Training Loss: 0.01074
Epoch: 27488, Training Loss: 0.00837
Epoch: 27488, Training Loss: 0.00873
Epoch: 27488, Training Loss: 0.00874
Epoch: 27488, Training Loss: 0.01074
Epoch: 27489, Training Loss: 0.00837
Epoch: 27489, Training Loss: 0.00873
Epoch: 27489, Training Loss: 0.00874
Epoch: 27489, Training Loss: 0.01074
Epoch: 27490, Training Loss: 0.00837
Epoch: 27490, Training Loss: 0.00873
Epoch: 27490, Training Loss: 0.00874
Epoch: 27490, Training Loss: 0.01074
Epoch: 27491, Training Loss: 0.00837
Epoch: 27491, Training Loss: 0.00873
Epoch: 27491, Training Loss: 0.00874
Epoch: 27491, Training Loss: 0.01074
Epoch: 27492, Training Loss: 0.00837
Epoch: 27492, Training Loss: 0.00873
Epoch: 27492, Training Loss: 0.00874
Epoch: 27492, Training Loss: 0.01074
Epoch: 27493, Training Loss: 0.00837
Epoch: 27493, Training Loss: 0.00873
Epoch: 27493, Training Loss: 0.00874
Epoch: 27493, Training Loss: 0.01074
Epoch: 27494, Training Loss: 0.00837
Epoch: 27494, Training Loss: 0.00873
Epoch: 27494, Training Loss: 0.00874
Epoch: 27494, Training Loss: 0.01074
Epoch: 27495, Training Loss: 0.00837
Epoch: 27495, Training Loss: 0.00873
Epoch: 27495, Training Loss: 0.00874
Epoch: 27495, Training Loss: 0.01074
Epoch: 27496, Training Loss: 0.00837
Epoch: 27496, Training Loss: 0.00873
Epoch: 27496, Training Loss: 0.00874
Epoch: 27496, Training Loss: 0.01074
Epoch: 27497, Training Loss: 0.00837
Epoch: 27497, Training Loss: 0.00873
Epoch: 27497, Training Loss: 0.00874
Epoch: 27497, Training Loss: 0.01074
Epoch: 27498, Training Loss: 0.00837
Epoch: 27498, Training Loss: 0.00873
Epoch: 27498, Training Loss: 0.00874
Epoch: 27498, Training Loss: 0.01074
Epoch: 27499, Training Loss: 0.00837
Epoch: 27499, Training Loss: 0.00873
Epoch: 27499, Training Loss: 0.00874
Epoch: 27499, Training Loss: 0.01074
Epoch: 27500, Training Loss: 0.00837
Epoch: 27500, Training Loss: 0.00873
Epoch: 27500, Training Loss: 0.00874
Epoch: 27500, Training Loss: 0.01074
Epoch: 27501, Training Loss: 0.00837
Epoch: 27501, Training Loss: 0.00873
Epoch: 27501, Training Loss: 0.00874
Epoch: 27501, Training Loss: 0.01074
Epoch: 27502, Training Loss: 0.00837
Epoch: 27502, Training Loss: 0.00873
Epoch: 27502, Training Loss: 0.00874
Epoch: 27502, Training Loss: 0.01074
Epoch: 27503, Training Loss: 0.00837
Epoch: 27503, Training Loss: 0.00873
Epoch: 27503, Training Loss: 0.00874
Epoch: 27503, Training Loss: 0.01074
Epoch: 27504, Training Loss: 0.00837
Epoch: 27504, Training Loss: 0.00873
Epoch: 27504, Training Loss: 0.00874
Epoch: 27504, Training Loss: 0.01074
Epoch: 27505, Training Loss: 0.00837
Epoch: 27505, Training Loss: 0.00873
Epoch: 27505, Training Loss: 0.00874
Epoch: 27505, Training Loss: 0.01074
Epoch: 27506, Training Loss: 0.00837
Epoch: 27506, Training Loss: 0.00873
Epoch: 27506, Training Loss: 0.00874
Epoch: 27506, Training Loss: 0.01074
Epoch: 27507, Training Loss: 0.00837
Epoch: 27507, Training Loss: 0.00873
Epoch: 27507, Training Loss: 0.00874
Epoch: 27507, Training Loss: 0.01074
Epoch: 27508, Training Loss: 0.00836
Epoch: 27508, Training Loss: 0.00873
Epoch: 27508, Training Loss: 0.00874
Epoch: 27508, Training Loss: 0.01074
Epoch: 27509, Training Loss: 0.00836
Epoch: 27509, Training Loss: 0.00873
Epoch: 27509, Training Loss: 0.00874
Epoch: 27509, Training Loss: 0.01074
Epoch: 27510, Training Loss: 0.00836
Epoch: 27510, Training Loss: 0.00873
Epoch: 27510, Training Loss: 0.00874
Epoch: 27510, Training Loss: 0.01074
Epoch: 27511, Training Loss: 0.00836
Epoch: 27511, Training Loss: 0.00873
Epoch: 27511, Training Loss: 0.00874
Epoch: 27511, Training Loss: 0.01074
Epoch: 27512, Training Loss: 0.00836
Epoch: 27512, Training Loss: 0.00873
Epoch: 27512, Training Loss: 0.00874
Epoch: 27512, Training Loss: 0.01074
... [output truncated: four per-sample losses are printed each epoch; from epoch 27512 to epoch 27755 they decrease very slowly, from about 0.00836/0.00873/0.00874/0.01074 to about 0.00832/0.00868/0.00869/0.01069] ...
Epoch: 27755, Training Loss: 0.00832
Epoch: 27755, Training Loss: 0.00868
Epoch: 27755, Training Loss: 0.00869
Epoch: 27755, Training Loss: 0.01068
Epoch: 27756, Training Loss: 0.00832
Epoch: 27756, Training Loss: 0.00868
Epoch: 27756, Training Loss: 0.00869
Epoch: 27756, Training Loss: 0.01068
Epoch: 27757, Training Loss: 0.00832
Epoch: 27757, Training Loss: 0.00868
Epoch: 27757, Training Loss: 0.00869
Epoch: 27757, Training Loss: 0.01068
Epoch: 27758, Training Loss: 0.00832
Epoch: 27758, Training Loss: 0.00868
Epoch: 27758, Training Loss: 0.00869
Epoch: 27758, Training Loss: 0.01068
Epoch: 27759, Training Loss: 0.00832
Epoch: 27759, Training Loss: 0.00868
Epoch: 27759, Training Loss: 0.00869
Epoch: 27759, Training Loss: 0.01068
Epoch: 27760, Training Loss: 0.00832
Epoch: 27760, Training Loss: 0.00868
Epoch: 27760, Training Loss: 0.00869
Epoch: 27760, Training Loss: 0.01068
Epoch: 27761, Training Loss: 0.00832
Epoch: 27761, Training Loss: 0.00868
Epoch: 27761, Training Loss: 0.00869
Epoch: 27761, Training Loss: 0.01068
Epoch: 27762, Training Loss: 0.00832
Epoch: 27762, Training Loss: 0.00868
Epoch: 27762, Training Loss: 0.00869
Epoch: 27762, Training Loss: 0.01068
Epoch: 27763, Training Loss: 0.00832
Epoch: 27763, Training Loss: 0.00868
Epoch: 27763, Training Loss: 0.00869
Epoch: 27763, Training Loss: 0.01068
Epoch: 27764, Training Loss: 0.00832
Epoch: 27764, Training Loss: 0.00868
Epoch: 27764, Training Loss: 0.00869
Epoch: 27764, Training Loss: 0.01068
Epoch: 27765, Training Loss: 0.00832
Epoch: 27765, Training Loss: 0.00868
Epoch: 27765, Training Loss: 0.00869
Epoch: 27765, Training Loss: 0.01068
Epoch: 27766, Training Loss: 0.00832
Epoch: 27766, Training Loss: 0.00868
Epoch: 27766, Training Loss: 0.00869
Epoch: 27766, Training Loss: 0.01068
Epoch: 27767, Training Loss: 0.00832
Epoch: 27767, Training Loss: 0.00868
Epoch: 27767, Training Loss: 0.00869
Epoch: 27767, Training Loss: 0.01068
Epoch: 27768, Training Loss: 0.00832
Epoch: 27768, Training Loss: 0.00868
Epoch: 27768, Training Loss: 0.00869
Epoch: 27768, Training Loss: 0.01068
Epoch: 27769, Training Loss: 0.00832
Epoch: 27769, Training Loss: 0.00868
Epoch: 27769, Training Loss: 0.00869
Epoch: 27769, Training Loss: 0.01068
Epoch: 27770, Training Loss: 0.00832
Epoch: 27770, Training Loss: 0.00868
Epoch: 27770, Training Loss: 0.00869
Epoch: 27770, Training Loss: 0.01068
Epoch: 27771, Training Loss: 0.00832
Epoch: 27771, Training Loss: 0.00868
Epoch: 27771, Training Loss: 0.00869
Epoch: 27771, Training Loss: 0.01068
Epoch: 27772, Training Loss: 0.00832
Epoch: 27772, Training Loss: 0.00868
Epoch: 27772, Training Loss: 0.00869
Epoch: 27772, Training Loss: 0.01068
Epoch: 27773, Training Loss: 0.00832
Epoch: 27773, Training Loss: 0.00868
Epoch: 27773, Training Loss: 0.00869
Epoch: 27773, Training Loss: 0.01068
Epoch: 27774, Training Loss: 0.00832
Epoch: 27774, Training Loss: 0.00868
Epoch: 27774, Training Loss: 0.00869
Epoch: 27774, Training Loss: 0.01068
Epoch: 27775, Training Loss: 0.00832
Epoch: 27775, Training Loss: 0.00868
Epoch: 27775, Training Loss: 0.00869
Epoch: 27775, Training Loss: 0.01068
Epoch: 27776, Training Loss: 0.00832
Epoch: 27776, Training Loss: 0.00868
Epoch: 27776, Training Loss: 0.00869
Epoch: 27776, Training Loss: 0.01068
Epoch: 27777, Training Loss: 0.00832
Epoch: 27777, Training Loss: 0.00868
Epoch: 27777, Training Loss: 0.00869
Epoch: 27777, Training Loss: 0.01068
Epoch: 27778, Training Loss: 0.00832
Epoch: 27778, Training Loss: 0.00868
Epoch: 27778, Training Loss: 0.00869
Epoch: 27778, Training Loss: 0.01068
Epoch: 27779, Training Loss: 0.00832
Epoch: 27779, Training Loss: 0.00868
Epoch: 27779, Training Loss: 0.00869
Epoch: 27779, Training Loss: 0.01068
Epoch: 27780, Training Loss: 0.00832
Epoch: 27780, Training Loss: 0.00868
Epoch: 27780, Training Loss: 0.00869
Epoch: 27780, Training Loss: 0.01068
Epoch: 27781, Training Loss: 0.00832
Epoch: 27781, Training Loss: 0.00868
Epoch: 27781, Training Loss: 0.00869
Epoch: 27781, Training Loss: 0.01068
Epoch: 27782, Training Loss: 0.00832
Epoch: 27782, Training Loss: 0.00868
Epoch: 27782, Training Loss: 0.00869
Epoch: 27782, Training Loss: 0.01068
Epoch: 27783, Training Loss: 0.00832
Epoch: 27783, Training Loss: 0.00868
Epoch: 27783, Training Loss: 0.00869
Epoch: 27783, Training Loss: 0.01068
Epoch: 27784, Training Loss: 0.00832
Epoch: 27784, Training Loss: 0.00868
Epoch: 27784, Training Loss: 0.00869
Epoch: 27784, Training Loss: 0.01068
Epoch: 27785, Training Loss: 0.00832
Epoch: 27785, Training Loss: 0.00868
Epoch: 27785, Training Loss: 0.00869
Epoch: 27785, Training Loss: 0.01068
Epoch: 27786, Training Loss: 0.00832
Epoch: 27786, Training Loss: 0.00868
Epoch: 27786, Training Loss: 0.00869
Epoch: 27786, Training Loss: 0.01068
Epoch: 27787, Training Loss: 0.00832
Epoch: 27787, Training Loss: 0.00868
Epoch: 27787, Training Loss: 0.00869
Epoch: 27787, Training Loss: 0.01068
Epoch: 27788, Training Loss: 0.00832
Epoch: 27788, Training Loss: 0.00868
Epoch: 27788, Training Loss: 0.00869
Epoch: 27788, Training Loss: 0.01068
Epoch: 27789, Training Loss: 0.00832
Epoch: 27789, Training Loss: 0.00868
Epoch: 27789, Training Loss: 0.00869
Epoch: 27789, Training Loss: 0.01068
Epoch: 27790, Training Loss: 0.00832
Epoch: 27790, Training Loss: 0.00868
Epoch: 27790, Training Loss: 0.00869
Epoch: 27790, Training Loss: 0.01068
Epoch: 27791, Training Loss: 0.00832
Epoch: 27791, Training Loss: 0.00868
Epoch: 27791, Training Loss: 0.00869
Epoch: 27791, Training Loss: 0.01068
Epoch: 27792, Training Loss: 0.00832
Epoch: 27792, Training Loss: 0.00868
Epoch: 27792, Training Loss: 0.00869
Epoch: 27792, Training Loss: 0.01068
Epoch: 27793, Training Loss: 0.00832
Epoch: 27793, Training Loss: 0.00868
Epoch: 27793, Training Loss: 0.00869
Epoch: 27793, Training Loss: 0.01068
Epoch: 27794, Training Loss: 0.00832
Epoch: 27794, Training Loss: 0.00868
Epoch: 27794, Training Loss: 0.00869
Epoch: 27794, Training Loss: 0.01068
Epoch: 27795, Training Loss: 0.00832
Epoch: 27795, Training Loss: 0.00868
Epoch: 27795, Training Loss: 0.00869
Epoch: 27795, Training Loss: 0.01068
Epoch: 27796, Training Loss: 0.00832
Epoch: 27796, Training Loss: 0.00868
Epoch: 27796, Training Loss: 0.00869
Epoch: 27796, Training Loss: 0.01068
Epoch: 27797, Training Loss: 0.00832
Epoch: 27797, Training Loss: 0.00868
Epoch: 27797, Training Loss: 0.00869
Epoch: 27797, Training Loss: 0.01068
Epoch: 27798, Training Loss: 0.00832
Epoch: 27798, Training Loss: 0.00868
Epoch: 27798, Training Loss: 0.00869
Epoch: 27798, Training Loss: 0.01068
Epoch: 27799, Training Loss: 0.00832
Epoch: 27799, Training Loss: 0.00868
Epoch: 27799, Training Loss: 0.00869
Epoch: 27799, Training Loss: 0.01067
Epoch: 27800, Training Loss: 0.00832
Epoch: 27800, Training Loss: 0.00867
Epoch: 27800, Training Loss: 0.00869
Epoch: 27800, Training Loss: 0.01067
Epoch: 27801, Training Loss: 0.00832
Epoch: 27801, Training Loss: 0.00867
Epoch: 27801, Training Loss: 0.00869
Epoch: 27801, Training Loss: 0.01067
Epoch: 27802, Training Loss: 0.00832
Epoch: 27802, Training Loss: 0.00867
Epoch: 27802, Training Loss: 0.00869
Epoch: 27802, Training Loss: 0.01067
Epoch: 27803, Training Loss: 0.00832
Epoch: 27803, Training Loss: 0.00867
Epoch: 27803, Training Loss: 0.00868
Epoch: 27803, Training Loss: 0.01067
Epoch: 27804, Training Loss: 0.00832
Epoch: 27804, Training Loss: 0.00867
Epoch: 27804, Training Loss: 0.00868
Epoch: 27804, Training Loss: 0.01067
Epoch: 27805, Training Loss: 0.00832
Epoch: 27805, Training Loss: 0.00867
Epoch: 27805, Training Loss: 0.00868
Epoch: 27805, Training Loss: 0.01067
Epoch: 27806, Training Loss: 0.00832
Epoch: 27806, Training Loss: 0.00867
Epoch: 27806, Training Loss: 0.00868
Epoch: 27806, Training Loss: 0.01067
Epoch: 27807, Training Loss: 0.00831
Epoch: 27807, Training Loss: 0.00867
Epoch: 27807, Training Loss: 0.00868
Epoch: 27807, Training Loss: 0.01067
Epoch: 27808, Training Loss: 0.00831
Epoch: 27808, Training Loss: 0.00867
Epoch: 27808, Training Loss: 0.00868
Epoch: 27808, Training Loss: 0.01067
Epoch: 27809, Training Loss: 0.00831
Epoch: 27809, Training Loss: 0.00867
Epoch: 27809, Training Loss: 0.00868
Epoch: 27809, Training Loss: 0.01067
Epoch: 27810, Training Loss: 0.00831
Epoch: 27810, Training Loss: 0.00867
Epoch: 27810, Training Loss: 0.00868
Epoch: 27810, Training Loss: 0.01067
Epoch: 27811, Training Loss: 0.00831
Epoch: 27811, Training Loss: 0.00867
Epoch: 27811, Training Loss: 0.00868
Epoch: 27811, Training Loss: 0.01067
Epoch: 27812, Training Loss: 0.00831
Epoch: 27812, Training Loss: 0.00867
Epoch: 27812, Training Loss: 0.00868
Epoch: 27812, Training Loss: 0.01067
Epoch: 27813, Training Loss: 0.00831
Epoch: 27813, Training Loss: 0.00867
Epoch: 27813, Training Loss: 0.00868
Epoch: 27813, Training Loss: 0.01067
Epoch: 27814, Training Loss: 0.00831
Epoch: 27814, Training Loss: 0.00867
Epoch: 27814, Training Loss: 0.00868
Epoch: 27814, Training Loss: 0.01067
Epoch: 27815, Training Loss: 0.00831
Epoch: 27815, Training Loss: 0.00867
Epoch: 27815, Training Loss: 0.00868
Epoch: 27815, Training Loss: 0.01067
Epoch: 27816, Training Loss: 0.00831
Epoch: 27816, Training Loss: 0.00867
Epoch: 27816, Training Loss: 0.00868
Epoch: 27816, Training Loss: 0.01067
Epoch: 27817, Training Loss: 0.00831
Epoch: 27817, Training Loss: 0.00867
Epoch: 27817, Training Loss: 0.00868
Epoch: 27817, Training Loss: 0.01067
Epoch: 27818, Training Loss: 0.00831
Epoch: 27818, Training Loss: 0.00867
Epoch: 27818, Training Loss: 0.00868
Epoch: 27818, Training Loss: 0.01067
Epoch: 27819, Training Loss: 0.00831
Epoch: 27819, Training Loss: 0.00867
Epoch: 27819, Training Loss: 0.00868
Epoch: 27819, Training Loss: 0.01067
Epoch: 27820, Training Loss: 0.00831
Epoch: 27820, Training Loss: 0.00867
Epoch: 27820, Training Loss: 0.00868
Epoch: 27820, Training Loss: 0.01067
Epoch: 27821, Training Loss: 0.00831
Epoch: 27821, Training Loss: 0.00867
Epoch: 27821, Training Loss: 0.00868
Epoch: 27821, Training Loss: 0.01067
Epoch: 27822, Training Loss: 0.00831
Epoch: 27822, Training Loss: 0.00867
Epoch: 27822, Training Loss: 0.00868
Epoch: 27822, Training Loss: 0.01067
Epoch: 27823, Training Loss: 0.00831
Epoch: 27823, Training Loss: 0.00867
Epoch: 27823, Training Loss: 0.00868
Epoch: 27823, Training Loss: 0.01067
Epoch: 27824, Training Loss: 0.00831
Epoch: 27824, Training Loss: 0.00867
Epoch: 27824, Training Loss: 0.00868
Epoch: 27824, Training Loss: 0.01067
Epoch: 27825, Training Loss: 0.00831
Epoch: 27825, Training Loss: 0.00867
Epoch: 27825, Training Loss: 0.00868
Epoch: 27825, Training Loss: 0.01067
Epoch: 27826, Training Loss: 0.00831
Epoch: 27826, Training Loss: 0.00867
Epoch: 27826, Training Loss: 0.00868
Epoch: 27826, Training Loss: 0.01067
Epoch: 27827, Training Loss: 0.00831
Epoch: 27827, Training Loss: 0.00867
Epoch: 27827, Training Loss: 0.00868
Epoch: 27827, Training Loss: 0.01067
Epoch: 27828, Training Loss: 0.00831
Epoch: 27828, Training Loss: 0.00867
Epoch: 27828, Training Loss: 0.00868
Epoch: 27828, Training Loss: 0.01067
Epoch: 27829, Training Loss: 0.00831
Epoch: 27829, Training Loss: 0.00867
Epoch: 27829, Training Loss: 0.00868
Epoch: 27829, Training Loss: 0.01067
Epoch: 27830, Training Loss: 0.00831
Epoch: 27830, Training Loss: 0.00867
Epoch: 27830, Training Loss: 0.00868
Epoch: 27830, Training Loss: 0.01067
Epoch: 27831, Training Loss: 0.00831
Epoch: 27831, Training Loss: 0.00867
Epoch: 27831, Training Loss: 0.00868
Epoch: 27831, Training Loss: 0.01067
Epoch: 27832, Training Loss: 0.00831
Epoch: 27832, Training Loss: 0.00867
Epoch: 27832, Training Loss: 0.00868
Epoch: 27832, Training Loss: 0.01067
Epoch: 27833, Training Loss: 0.00831
Epoch: 27833, Training Loss: 0.00867
Epoch: 27833, Training Loss: 0.00868
Epoch: 27833, Training Loss: 0.01067
Epoch: 27834, Training Loss: 0.00831
Epoch: 27834, Training Loss: 0.00867
Epoch: 27834, Training Loss: 0.00868
Epoch: 27834, Training Loss: 0.01067
Epoch: 27835, Training Loss: 0.00831
Epoch: 27835, Training Loss: 0.00867
Epoch: 27835, Training Loss: 0.00868
Epoch: 27835, Training Loss: 0.01067
Epoch: 27836, Training Loss: 0.00831
Epoch: 27836, Training Loss: 0.00867
Epoch: 27836, Training Loss: 0.00868
Epoch: 27836, Training Loss: 0.01067
Epoch: 27837, Training Loss: 0.00831
Epoch: 27837, Training Loss: 0.00867
Epoch: 27837, Training Loss: 0.00868
Epoch: 27837, Training Loss: 0.01067
Epoch: 27838, Training Loss: 0.00831
Epoch: 27838, Training Loss: 0.00867
Epoch: 27838, Training Loss: 0.00868
Epoch: 27838, Training Loss: 0.01067
Epoch: 27839, Training Loss: 0.00831
Epoch: 27839, Training Loss: 0.00867
Epoch: 27839, Training Loss: 0.00868
Epoch: 27839, Training Loss: 0.01067
Epoch: 27840, Training Loss: 0.00831
Epoch: 27840, Training Loss: 0.00867
Epoch: 27840, Training Loss: 0.00868
Epoch: 27840, Training Loss: 0.01067
Epoch: 27841, Training Loss: 0.00831
Epoch: 27841, Training Loss: 0.00867
Epoch: 27841, Training Loss: 0.00868
Epoch: 27841, Training Loss: 0.01067
Epoch: 27842, Training Loss: 0.00831
Epoch: 27842, Training Loss: 0.00867
Epoch: 27842, Training Loss: 0.00868
Epoch: 27842, Training Loss: 0.01067
Epoch: 27843, Training Loss: 0.00831
Epoch: 27843, Training Loss: 0.00867
Epoch: 27843, Training Loss: 0.00868
Epoch: 27843, Training Loss: 0.01067
Epoch: 27844, Training Loss: 0.00831
Epoch: 27844, Training Loss: 0.00867
Epoch: 27844, Training Loss: 0.00868
Epoch: 27844, Training Loss: 0.01067
Epoch: 27845, Training Loss: 0.00831
Epoch: 27845, Training Loss: 0.00867
Epoch: 27845, Training Loss: 0.00868
Epoch: 27845, Training Loss: 0.01066
Epoch: 27846, Training Loss: 0.00831
Epoch: 27846, Training Loss: 0.00867
Epoch: 27846, Training Loss: 0.00868
Epoch: 27846, Training Loss: 0.01066
Epoch: 27847, Training Loss: 0.00831
Epoch: 27847, Training Loss: 0.00867
Epoch: 27847, Training Loss: 0.00868
Epoch: 27847, Training Loss: 0.01066
Epoch: 27848, Training Loss: 0.00831
Epoch: 27848, Training Loss: 0.00867
Epoch: 27848, Training Loss: 0.00868
Epoch: 27848, Training Loss: 0.01066
Epoch: 27849, Training Loss: 0.00831
Epoch: 27849, Training Loss: 0.00867
Epoch: 27849, Training Loss: 0.00868
Epoch: 27849, Training Loss: 0.01066
Epoch: 27850, Training Loss: 0.00831
Epoch: 27850, Training Loss: 0.00867
Epoch: 27850, Training Loss: 0.00868
Epoch: 27850, Training Loss: 0.01066
Epoch: 27851, Training Loss: 0.00831
Epoch: 27851, Training Loss: 0.00867
Epoch: 27851, Training Loss: 0.00868
Epoch: 27851, Training Loss: 0.01066
Epoch: 27852, Training Loss: 0.00831
Epoch: 27852, Training Loss: 0.00867
Epoch: 27852, Training Loss: 0.00868
Epoch: 27852, Training Loss: 0.01066
Epoch: 27853, Training Loss: 0.00831
Epoch: 27853, Training Loss: 0.00867
Epoch: 27853, Training Loss: 0.00868
Epoch: 27853, Training Loss: 0.01066
Epoch: 27854, Training Loss: 0.00831
Epoch: 27854, Training Loss: 0.00867
Epoch: 27854, Training Loss: 0.00868
Epoch: 27854, Training Loss: 0.01066
Epoch: 27855, Training Loss: 0.00831
Epoch: 27855, Training Loss: 0.00867
Epoch: 27855, Training Loss: 0.00868
Epoch: 27855, Training Loss: 0.01066
Epoch: 27856, Training Loss: 0.00831
Epoch: 27856, Training Loss: 0.00867
Epoch: 27856, Training Loss: 0.00868
Epoch: 27856, Training Loss: 0.01066
Epoch: 27857, Training Loss: 0.00831
Epoch: 27857, Training Loss: 0.00866
Epoch: 27857, Training Loss: 0.00868
Epoch: 27857, Training Loss: 0.01066
Epoch: 27858, Training Loss: 0.00831
Epoch: 27858, Training Loss: 0.00866
Epoch: 27858, Training Loss: 0.00868
Epoch: 27858, Training Loss: 0.01066
Epoch: 27859, Training Loss: 0.00831
Epoch: 27859, Training Loss: 0.00866
Epoch: 27859, Training Loss: 0.00868
Epoch: 27859, Training Loss: 0.01066
Epoch: 27860, Training Loss: 0.00831
Epoch: 27860, Training Loss: 0.00866
Epoch: 27860, Training Loss: 0.00867
Epoch: 27860, Training Loss: 0.01066
Epoch: 27861, Training Loss: 0.00831
Epoch: 27861, Training Loss: 0.00866
Epoch: 27861, Training Loss: 0.00867
Epoch: 27861, Training Loss: 0.01066
Epoch: 27862, Training Loss: 0.00831
Epoch: 27862, Training Loss: 0.00866
Epoch: 27862, Training Loss: 0.00867
Epoch: 27862, Training Loss: 0.01066
Epoch: 27863, Training Loss: 0.00831
Epoch: 27863, Training Loss: 0.00866
Epoch: 27863, Training Loss: 0.00867
Epoch: 27863, Training Loss: 0.01066
Epoch: 27864, Training Loss: 0.00831
Epoch: 27864, Training Loss: 0.00866
Epoch: 27864, Training Loss: 0.00867
Epoch: 27864, Training Loss: 0.01066
Epoch: 27865, Training Loss: 0.00831
Epoch: 27865, Training Loss: 0.00866
Epoch: 27865, Training Loss: 0.00867
Epoch: 27865, Training Loss: 0.01066
Epoch: 27866, Training Loss: 0.00831
Epoch: 27866, Training Loss: 0.00866
Epoch: 27866, Training Loss: 0.00867
Epoch: 27866, Training Loss: 0.01066
Epoch: 27867, Training Loss: 0.00830
Epoch: 27867, Training Loss: 0.00866
Epoch: 27867, Training Loss: 0.00867
Epoch: 27867, Training Loss: 0.01066
Epoch: 27868, Training Loss: 0.00830
Epoch: 27868, Training Loss: 0.00866
Epoch: 27868, Training Loss: 0.00867
Epoch: 27868, Training Loss: 0.01066
Epoch: 27869, Training Loss: 0.00830
Epoch: 27869, Training Loss: 0.00866
Epoch: 27869, Training Loss: 0.00867
Epoch: 27869, Training Loss: 0.01066
Epoch: 27870, Training Loss: 0.00830
Epoch: 27870, Training Loss: 0.00866
Epoch: 27870, Training Loss: 0.00867
Epoch: 27870, Training Loss: 0.01066
Epoch: 27871, Training Loss: 0.00830
Epoch: 27871, Training Loss: 0.00866
Epoch: 27871, Training Loss: 0.00867
Epoch: 27871, Training Loss: 0.01066
Epoch: 27872, Training Loss: 0.00830
Epoch: 27872, Training Loss: 0.00866
Epoch: 27872, Training Loss: 0.00867
Epoch: 27872, Training Loss: 0.01066
Epoch: 27873, Training Loss: 0.00830
Epoch: 27873, Training Loss: 0.00866
Epoch: 27873, Training Loss: 0.00867
Epoch: 27873, Training Loss: 0.01066
Epoch: 27874, Training Loss: 0.00830
Epoch: 27874, Training Loss: 0.00866
Epoch: 27874, Training Loss: 0.00867
Epoch: 27874, Training Loss: 0.01066
Epoch: 27875, Training Loss: 0.00830
Epoch: 27875, Training Loss: 0.00866
Epoch: 27875, Training Loss: 0.00867
Epoch: 27875, Training Loss: 0.01066
Epoch: 27876, Training Loss: 0.00830
Epoch: 27876, Training Loss: 0.00866
Epoch: 27876, Training Loss: 0.00867
Epoch: 27876, Training Loss: 0.01066
Epoch: 27877, Training Loss: 0.00830
Epoch: 27877, Training Loss: 0.00866
Epoch: 27877, Training Loss: 0.00867
Epoch: 27877, Training Loss: 0.01066
Epoch: 27878, Training Loss: 0.00830
Epoch: 27878, Training Loss: 0.00866
Epoch: 27878, Training Loss: 0.00867
Epoch: 27878, Training Loss: 0.01066
Epoch: 27879, Training Loss: 0.00830
Epoch: 27879, Training Loss: 0.00866
Epoch: 27879, Training Loss: 0.00867
Epoch: 27879, Training Loss: 0.01066
Epoch: 27880, Training Loss: 0.00830
Epoch: 27880, Training Loss: 0.00866
Epoch: 27880, Training Loss: 0.00867
Epoch: 27880, Training Loss: 0.01066
Epoch: 27881, Training Loss: 0.00830
Epoch: 27881, Training Loss: 0.00866
Epoch: 27881, Training Loss: 0.00867
Epoch: 27881, Training Loss: 0.01066
Epoch: 27882, Training Loss: 0.00830
Epoch: 27882, Training Loss: 0.00866
Epoch: 27882, Training Loss: 0.00867
Epoch: 27882, Training Loss: 0.01066
Epoch: 27883, Training Loss: 0.00830
Epoch: 27883, Training Loss: 0.00866
Epoch: 27883, Training Loss: 0.00867
Epoch: 27883, Training Loss: 0.01066
Epoch: 27884, Training Loss: 0.00830
Epoch: 27884, Training Loss: 0.00866
Epoch: 27884, Training Loss: 0.00867
Epoch: 27884, Training Loss: 0.01066
Epoch: 27885, Training Loss: 0.00830
Epoch: 27885, Training Loss: 0.00866
Epoch: 27885, Training Loss: 0.00867
Epoch: 27885, Training Loss: 0.01066
Epoch: 27886, Training Loss: 0.00830
Epoch: 27886, Training Loss: 0.00866
Epoch: 27886, Training Loss: 0.00867
Epoch: 27886, Training Loss: 0.01066
Epoch: 27887, Training Loss: 0.00830
Epoch: 27887, Training Loss: 0.00866
Epoch: 27887, Training Loss: 0.00867
Epoch: 27887, Training Loss: 0.01066
Epoch: 27888, Training Loss: 0.00830
Epoch: 27888, Training Loss: 0.00866
Epoch: 27888, Training Loss: 0.00867
Epoch: 27888, Training Loss: 0.01066
Epoch: 27889, Training Loss: 0.00830
Epoch: 27889, Training Loss: 0.00866
Epoch: 27889, Training Loss: 0.00867
Epoch: 27889, Training Loss: 0.01066
Epoch: 27890, Training Loss: 0.00830
Epoch: 27890, Training Loss: 0.00866
Epoch: 27890, Training Loss: 0.00867
Epoch: 27890, Training Loss: 0.01065
Epoch: 27891, Training Loss: 0.00830
Epoch: 27891, Training Loss: 0.00866
Epoch: 27891, Training Loss: 0.00867
Epoch: 27891, Training Loss: 0.01065
Epoch: 27892, Training Loss: 0.00830
Epoch: 27892, Training Loss: 0.00866
Epoch: 27892, Training Loss: 0.00867
Epoch: 27892, Training Loss: 0.01065
Epoch: 27893, Training Loss: 0.00830
Epoch: 27893, Training Loss: 0.00866
Epoch: 27893, Training Loss: 0.00867
Epoch: 27893, Training Loss: 0.01065
Epoch: 27894, Training Loss: 0.00830
Epoch: 27894, Training Loss: 0.00866
Epoch: 27894, Training Loss: 0.00867
Epoch: 27894, Training Loss: 0.01065
Epoch: 27895, Training Loss: 0.00830
Epoch: 27895, Training Loss: 0.00866
Epoch: 27895, Training Loss: 0.00867
Epoch: 27895, Training Loss: 0.01065
Epoch: 27896, Training Loss: 0.00830
Epoch: 27896, Training Loss: 0.00866
Epoch: 27896, Training Loss: 0.00867
Epoch: 27896, Training Loss: 0.01065
Epoch: 27897, Training Loss: 0.00830
Epoch: 27897, Training Loss: 0.00866
Epoch: 27897, Training Loss: 0.00867
Epoch: 27897, Training Loss: 0.01065
Epoch: 27898, Training Loss: 0.00830
Epoch: 27898, Training Loss: 0.00866
Epoch: 27898, Training Loss: 0.00867
Epoch: 27898, Training Loss: 0.01065
Epoch: 27899, Training Loss: 0.00830
Epoch: 27899, Training Loss: 0.00866
Epoch: 27899, Training Loss: 0.00867
Epoch: 27899, Training Loss: 0.01065
Epoch: 27900, Training Loss: 0.00830
Epoch: 27900, Training Loss: 0.00866
Epoch: 27900, Training Loss: 0.00867
Epoch: 27900, Training Loss: 0.01065
Epoch: 27901, Training Loss: 0.00830
Epoch: 27901, Training Loss: 0.00866
Epoch: 27901, Training Loss: 0.00867
Epoch: 27901, Training Loss: 0.01065
Epoch: 27902, Training Loss: 0.00830
Epoch: 27902, Training Loss: 0.00866
Epoch: 27902, Training Loss: 0.00867
Epoch: 27902, Training Loss: 0.01065
Epoch: 27903, Training Loss: 0.00830
Epoch: 27903, Training Loss: 0.00866
Epoch: 27903, Training Loss: 0.00867
Epoch: 27903, Training Loss: 0.01065
Epoch: 27904, Training Loss: 0.00830
Epoch: 27904, Training Loss: 0.00866
Epoch: 27904, Training Loss: 0.00867
Epoch: 27904, Training Loss: 0.01065
Epoch: 27905, Training Loss: 0.00830
Epoch: 27905, Training Loss: 0.00866
Epoch: 27905, Training Loss: 0.00867
Epoch: 27905, Training Loss: 0.01065
Epoch: 27906, Training Loss: 0.00830
Epoch: 27906, Training Loss: 0.00866
Epoch: 27906, Training Loss: 0.00867
Epoch: 27906, Training Loss: 0.01065
Epoch: 27907, Training Loss: 0.00830
Epoch: 27907, Training Loss: 0.00866
Epoch: 27907, Training Loss: 0.00867
Epoch: 27907, Training Loss: 0.01065
Epoch: 27908, Training Loss: 0.00830
Epoch: 27908, Training Loss: 0.00866
Epoch: 27908, Training Loss: 0.00867
Epoch: 27908, Training Loss: 0.01065
Epoch: 27909, Training Loss: 0.00830
Epoch: 27909, Training Loss: 0.00866
Epoch: 27909, Training Loss: 0.00867
Epoch: 27909, Training Loss: 0.01065
Epoch: 27910, Training Loss: 0.00830
Epoch: 27910, Training Loss: 0.00866
Epoch: 27910, Training Loss: 0.00867
Epoch: 27910, Training Loss: 0.01065
Epoch: 27911, Training Loss: 0.00830
Epoch: 27911, Training Loss: 0.00866
Epoch: 27911, Training Loss: 0.00867
Epoch: 27911, Training Loss: 0.01065
Epoch: 27912, Training Loss: 0.00830
Epoch: 27912, Training Loss: 0.00866
Epoch: 27912, Training Loss: 0.00867
Epoch: 27912, Training Loss: 0.01065
Epoch: 27913, Training Loss: 0.00830
Epoch: 27913, Training Loss: 0.00865
Epoch: 27913, Training Loss: 0.00867
Epoch: 27913, Training Loss: 0.01065
Epoch: 27914, Training Loss: 0.00830
Epoch: 27914, Training Loss: 0.00865
Epoch: 27914, Training Loss: 0.00867
Epoch: 27914, Training Loss: 0.01065
Epoch: 27915, Training Loss: 0.00830
Epoch: 27915, Training Loss: 0.00865
Epoch: 27915, Training Loss: 0.00867
Epoch: 27915, Training Loss: 0.01065
Epoch: 27916, Training Loss: 0.00830
Epoch: 27916, Training Loss: 0.00865
Epoch: 27916, Training Loss: 0.00866
Epoch: 27916, Training Loss: 0.01065
Epoch: 27917, Training Loss: 0.00830
Epoch: 27917, Training Loss: 0.00865
Epoch: 27917, Training Loss: 0.00866
Epoch: 27917, Training Loss: 0.01065
Epoch: 27918, Training Loss: 0.00830
Epoch: 27918, Training Loss: 0.00865
Epoch: 27918, Training Loss: 0.00866
Epoch: 27918, Training Loss: 0.01065
Epoch: 27919, Training Loss: 0.00830
Epoch: 27919, Training Loss: 0.00865
Epoch: 27919, Training Loss: 0.00866
Epoch: 27919, Training Loss: 0.01065
Epoch: 27920, Training Loss: 0.00830
Epoch: 27920, Training Loss: 0.00865
Epoch: 27920, Training Loss: 0.00866
Epoch: 27920, Training Loss: 0.01065
Epoch: 27921, Training Loss: 0.00830
Epoch: 27921, Training Loss: 0.00865
Epoch: 27921, Training Loss: 0.00866
Epoch: 27921, Training Loss: 0.01065
Epoch: 27922, Training Loss: 0.00830
Epoch: 27922, Training Loss: 0.00865
Epoch: 27922, Training Loss: 0.00866
Epoch: 27922, Training Loss: 0.01065
Epoch: 27923, Training Loss: 0.00830
Epoch: 27923, Training Loss: 0.00865
Epoch: 27923, Training Loss: 0.00866
Epoch: 27923, Training Loss: 0.01065
Epoch: 27924, Training Loss: 0.00830
Epoch: 27924, Training Loss: 0.00865
Epoch: 27924, Training Loss: 0.00866
Epoch: 27924, Training Loss: 0.01065
Epoch: 27925, Training Loss: 0.00830
Epoch: 27925, Training Loss: 0.00865
Epoch: 27925, Training Loss: 0.00866
Epoch: 27925, Training Loss: 0.01065
Epoch: 27926, Training Loss: 0.00830
Epoch: 27926, Training Loss: 0.00865
Epoch: 27926, Training Loss: 0.00866
Epoch: 27926, Training Loss: 0.01065
Epoch: 27927, Training Loss: 0.00830
Epoch: 27927, Training Loss: 0.00865
Epoch: 27927, Training Loss: 0.00866
Epoch: 27927, Training Loss: 0.01065
Epoch: 27928, Training Loss: 0.00829
Epoch: 27928, Training Loss: 0.00865
Epoch: 27928, Training Loss: 0.00866
Epoch: 27928, Training Loss: 0.01065
Epoch: 27929, Training Loss: 0.00829
Epoch: 27929, Training Loss: 0.00865
Epoch: 27929, Training Loss: 0.00866
Epoch: 27929, Training Loss: 0.01065
Epoch: 27930, Training Loss: 0.00829
Epoch: 27930, Training Loss: 0.00865
Epoch: 27930, Training Loss: 0.00866
Epoch: 27930, Training Loss: 0.01065
Epoch: 27931, Training Loss: 0.00829
Epoch: 27931, Training Loss: 0.00865
Epoch: 27931, Training Loss: 0.00866
Epoch: 27931, Training Loss: 0.01065
Epoch: 27932, Training Loss: 0.00829
Epoch: 27932, Training Loss: 0.00865
Epoch: 27932, Training Loss: 0.00866
Epoch: 27932, Training Loss: 0.01065
Epoch: 27933, Training Loss: 0.00829
Epoch: 27933, Training Loss: 0.00865
Epoch: 27933, Training Loss: 0.00866
Epoch: 27933, Training Loss: 0.01065
Epoch: 27934, Training Loss: 0.00829
Epoch: 27934, Training Loss: 0.00865
Epoch: 27934, Training Loss: 0.00866
Epoch: 27934, Training Loss: 0.01065
Epoch: 27935, Training Loss: 0.00829
Epoch: 27935, Training Loss: 0.00865
Epoch: 27935, Training Loss: 0.00866
Epoch: 27935, Training Loss: 0.01065
Epoch: 27936, Training Loss: 0.00829
Epoch: 27936, Training Loss: 0.00865
Epoch: 27936, Training Loss: 0.00866
Epoch: 27936, Training Loss: 0.01064
Epoch: 27937, Training Loss: 0.00829
Epoch: 27937, Training Loss: 0.00865
Epoch: 27937, Training Loss: 0.00866
Epoch: 27937, Training Loss: 0.01064
Epoch: 27938, Training Loss: 0.00829
Epoch: 27938, Training Loss: 0.00865
Epoch: 27938, Training Loss: 0.00866
Epoch: 27938, Training Loss: 0.01064
Epoch: 27939, Training Loss: 0.00829
Epoch: 27939, Training Loss: 0.00865
Epoch: 27939, Training Loss: 0.00866
Epoch: 27939, Training Loss: 0.01064
Epoch: 27940, Training Loss: 0.00829
Epoch: 27940, Training Loss: 0.00865
Epoch: 27940, Training Loss: 0.00866
Epoch: 27940, Training Loss: 0.01064
Epoch: 27941, Training Loss: 0.00829
Epoch: 27941, Training Loss: 0.00865
Epoch: 27941, Training Loss: 0.00866
Epoch: 27941, Training Loss: 0.01064
Epoch: 27942, Training Loss: 0.00829
Epoch: 27942, Training Loss: 0.00865
Epoch: 27942, Training Loss: 0.00866
Epoch: 27942, Training Loss: 0.01064
Epoch: 27943, Training Loss: 0.00829
Epoch: 27943, Training Loss: 0.00865
Epoch: 27943, Training Loss: 0.00866
Epoch: 27943, Training Loss: 0.01064
Epoch: 27944, Training Loss: 0.00829
Epoch: 27944, Training Loss: 0.00865
Epoch: 27944, Training Loss: 0.00866
Epoch: 27944, Training Loss: 0.01064
[... per-epoch log elided: over epochs 27945–28187 the four per-pattern losses decrease very slowly, from about 0.00829 / 0.00865 / 0.00866 / 0.01064 to 0.00825 / 0.00861 / 0.00862 / 0.01059 ...]
Epoch: 28188, Training Loss: 0.00825
Epoch: 28188, Training Loss: 0.00861
Epoch: 28188, Training Loss: 0.00862
Epoch: 28188, Training Loss: 0.01059
Epoch: 28189, Training Loss: 0.00825
Epoch: 28189, Training Loss: 0.00861
Epoch: 28189, Training Loss: 0.00862
Epoch: 28189, Training Loss: 0.01059
Epoch: 28190, Training Loss: 0.00825
Epoch: 28190, Training Loss: 0.00861
Epoch: 28190, Training Loss: 0.00862
Epoch: 28190, Training Loss: 0.01059
Epoch: 28191, Training Loss: 0.00825
Epoch: 28191, Training Loss: 0.00861
Epoch: 28191, Training Loss: 0.00862
Epoch: 28191, Training Loss: 0.01059
Epoch: 28192, Training Loss: 0.00825
Epoch: 28192, Training Loss: 0.00861
Epoch: 28192, Training Loss: 0.00862
Epoch: 28192, Training Loss: 0.01059
Epoch: 28193, Training Loss: 0.00825
Epoch: 28193, Training Loss: 0.00861
Epoch: 28193, Training Loss: 0.00862
Epoch: 28193, Training Loss: 0.01059
Epoch: 28194, Training Loss: 0.00825
Epoch: 28194, Training Loss: 0.00861
Epoch: 28194, Training Loss: 0.00862
Epoch: 28194, Training Loss: 0.01059
Epoch: 28195, Training Loss: 0.00825
Epoch: 28195, Training Loss: 0.00861
Epoch: 28195, Training Loss: 0.00862
Epoch: 28195, Training Loss: 0.01059
Epoch: 28196, Training Loss: 0.00825
Epoch: 28196, Training Loss: 0.00861
Epoch: 28196, Training Loss: 0.00862
Epoch: 28196, Training Loss: 0.01059
Epoch: 28197, Training Loss: 0.00825
Epoch: 28197, Training Loss: 0.00861
Epoch: 28197, Training Loss: 0.00862
Epoch: 28197, Training Loss: 0.01059
Epoch: 28198, Training Loss: 0.00825
Epoch: 28198, Training Loss: 0.00861
Epoch: 28198, Training Loss: 0.00862
Epoch: 28198, Training Loss: 0.01059
Epoch: 28199, Training Loss: 0.00825
Epoch: 28199, Training Loss: 0.00861
Epoch: 28199, Training Loss: 0.00862
Epoch: 28199, Training Loss: 0.01059
Epoch: 28200, Training Loss: 0.00825
Epoch: 28200, Training Loss: 0.00860
Epoch: 28200, Training Loss: 0.00862
Epoch: 28200, Training Loss: 0.01059
Epoch: 28201, Training Loss: 0.00825
Epoch: 28201, Training Loss: 0.00860
Epoch: 28201, Training Loss: 0.00862
Epoch: 28201, Training Loss: 0.01059
Epoch: 28202, Training Loss: 0.00825
Epoch: 28202, Training Loss: 0.00860
Epoch: 28202, Training Loss: 0.00862
Epoch: 28202, Training Loss: 0.01059
Epoch: 28203, Training Loss: 0.00825
Epoch: 28203, Training Loss: 0.00860
Epoch: 28203, Training Loss: 0.00861
Epoch: 28203, Training Loss: 0.01059
Epoch: 28204, Training Loss: 0.00825
Epoch: 28204, Training Loss: 0.00860
Epoch: 28204, Training Loss: 0.00861
Epoch: 28204, Training Loss: 0.01059
Epoch: 28205, Training Loss: 0.00825
Epoch: 28205, Training Loss: 0.00860
Epoch: 28205, Training Loss: 0.00861
Epoch: 28205, Training Loss: 0.01059
Epoch: 28206, Training Loss: 0.00825
Epoch: 28206, Training Loss: 0.00860
Epoch: 28206, Training Loss: 0.00861
Epoch: 28206, Training Loss: 0.01059
Epoch: 28207, Training Loss: 0.00825
Epoch: 28207, Training Loss: 0.00860
Epoch: 28207, Training Loss: 0.00861
Epoch: 28207, Training Loss: 0.01059
Epoch: 28208, Training Loss: 0.00825
Epoch: 28208, Training Loss: 0.00860
Epoch: 28208, Training Loss: 0.00861
Epoch: 28208, Training Loss: 0.01059
Epoch: 28209, Training Loss: 0.00825
Epoch: 28209, Training Loss: 0.00860
Epoch: 28209, Training Loss: 0.00861
Epoch: 28209, Training Loss: 0.01059
Epoch: 28210, Training Loss: 0.00825
Epoch: 28210, Training Loss: 0.00860
Epoch: 28210, Training Loss: 0.00861
Epoch: 28210, Training Loss: 0.01059
Epoch: 28211, Training Loss: 0.00825
Epoch: 28211, Training Loss: 0.00860
Epoch: 28211, Training Loss: 0.00861
Epoch: 28211, Training Loss: 0.01059
Epoch: 28212, Training Loss: 0.00825
Epoch: 28212, Training Loss: 0.00860
Epoch: 28212, Training Loss: 0.00861
Epoch: 28212, Training Loss: 0.01058
Epoch: 28213, Training Loss: 0.00825
Epoch: 28213, Training Loss: 0.00860
Epoch: 28213, Training Loss: 0.00861
Epoch: 28213, Training Loss: 0.01058
Epoch: 28214, Training Loss: 0.00825
Epoch: 28214, Training Loss: 0.00860
Epoch: 28214, Training Loss: 0.00861
Epoch: 28214, Training Loss: 0.01058
Epoch: 28215, Training Loss: 0.00825
Epoch: 28215, Training Loss: 0.00860
Epoch: 28215, Training Loss: 0.00861
Epoch: 28215, Training Loss: 0.01058
Epoch: 28216, Training Loss: 0.00825
Epoch: 28216, Training Loss: 0.00860
Epoch: 28216, Training Loss: 0.00861
Epoch: 28216, Training Loss: 0.01058
Epoch: 28217, Training Loss: 0.00825
Epoch: 28217, Training Loss: 0.00860
Epoch: 28217, Training Loss: 0.00861
Epoch: 28217, Training Loss: 0.01058
Epoch: 28218, Training Loss: 0.00825
Epoch: 28218, Training Loss: 0.00860
Epoch: 28218, Training Loss: 0.00861
Epoch: 28218, Training Loss: 0.01058
Epoch: 28219, Training Loss: 0.00825
Epoch: 28219, Training Loss: 0.00860
Epoch: 28219, Training Loss: 0.00861
Epoch: 28219, Training Loss: 0.01058
Epoch: 28220, Training Loss: 0.00825
Epoch: 28220, Training Loss: 0.00860
Epoch: 28220, Training Loss: 0.00861
Epoch: 28220, Training Loss: 0.01058
Epoch: 28221, Training Loss: 0.00825
Epoch: 28221, Training Loss: 0.00860
Epoch: 28221, Training Loss: 0.00861
Epoch: 28221, Training Loss: 0.01058
Epoch: 28222, Training Loss: 0.00825
Epoch: 28222, Training Loss: 0.00860
Epoch: 28222, Training Loss: 0.00861
Epoch: 28222, Training Loss: 0.01058
Epoch: 28223, Training Loss: 0.00825
Epoch: 28223, Training Loss: 0.00860
Epoch: 28223, Training Loss: 0.00861
Epoch: 28223, Training Loss: 0.01058
Epoch: 28224, Training Loss: 0.00825
Epoch: 28224, Training Loss: 0.00860
Epoch: 28224, Training Loss: 0.00861
Epoch: 28224, Training Loss: 0.01058
Epoch: 28225, Training Loss: 0.00825
Epoch: 28225, Training Loss: 0.00860
Epoch: 28225, Training Loss: 0.00861
Epoch: 28225, Training Loss: 0.01058
Epoch: 28226, Training Loss: 0.00825
Epoch: 28226, Training Loss: 0.00860
Epoch: 28226, Training Loss: 0.00861
Epoch: 28226, Training Loss: 0.01058
Epoch: 28227, Training Loss: 0.00825
Epoch: 28227, Training Loss: 0.00860
Epoch: 28227, Training Loss: 0.00861
Epoch: 28227, Training Loss: 0.01058
Epoch: 28228, Training Loss: 0.00825
Epoch: 28228, Training Loss: 0.00860
Epoch: 28228, Training Loss: 0.00861
Epoch: 28228, Training Loss: 0.01058
Epoch: 28229, Training Loss: 0.00825
Epoch: 28229, Training Loss: 0.00860
Epoch: 28229, Training Loss: 0.00861
Epoch: 28229, Training Loss: 0.01058
Epoch: 28230, Training Loss: 0.00825
Epoch: 28230, Training Loss: 0.00860
Epoch: 28230, Training Loss: 0.00861
Epoch: 28230, Training Loss: 0.01058
Epoch: 28231, Training Loss: 0.00825
Epoch: 28231, Training Loss: 0.00860
Epoch: 28231, Training Loss: 0.00861
Epoch: 28231, Training Loss: 0.01058
Epoch: 28232, Training Loss: 0.00825
Epoch: 28232, Training Loss: 0.00860
Epoch: 28232, Training Loss: 0.00861
Epoch: 28232, Training Loss: 0.01058
Epoch: 28233, Training Loss: 0.00825
Epoch: 28233, Training Loss: 0.00860
Epoch: 28233, Training Loss: 0.00861
Epoch: 28233, Training Loss: 0.01058
Epoch: 28234, Training Loss: 0.00824
Epoch: 28234, Training Loss: 0.00860
Epoch: 28234, Training Loss: 0.00861
Epoch: 28234, Training Loss: 0.01058
Epoch: 28235, Training Loss: 0.00824
Epoch: 28235, Training Loss: 0.00860
Epoch: 28235, Training Loss: 0.00861
Epoch: 28235, Training Loss: 0.01058
Epoch: 28236, Training Loss: 0.00824
Epoch: 28236, Training Loss: 0.00860
Epoch: 28236, Training Loss: 0.00861
Epoch: 28236, Training Loss: 0.01058
Epoch: 28237, Training Loss: 0.00824
Epoch: 28237, Training Loss: 0.00860
Epoch: 28237, Training Loss: 0.00861
Epoch: 28237, Training Loss: 0.01058
Epoch: 28238, Training Loss: 0.00824
Epoch: 28238, Training Loss: 0.00860
Epoch: 28238, Training Loss: 0.00861
Epoch: 28238, Training Loss: 0.01058
Epoch: 28239, Training Loss: 0.00824
Epoch: 28239, Training Loss: 0.00860
Epoch: 28239, Training Loss: 0.00861
Epoch: 28239, Training Loss: 0.01058
Epoch: 28240, Training Loss: 0.00824
Epoch: 28240, Training Loss: 0.00860
Epoch: 28240, Training Loss: 0.00861
Epoch: 28240, Training Loss: 0.01058
Epoch: 28241, Training Loss: 0.00824
Epoch: 28241, Training Loss: 0.00860
Epoch: 28241, Training Loss: 0.00861
Epoch: 28241, Training Loss: 0.01058
Epoch: 28242, Training Loss: 0.00824
Epoch: 28242, Training Loss: 0.00860
Epoch: 28242, Training Loss: 0.00861
Epoch: 28242, Training Loss: 0.01058
Epoch: 28243, Training Loss: 0.00824
Epoch: 28243, Training Loss: 0.00860
Epoch: 28243, Training Loss: 0.00861
Epoch: 28243, Training Loss: 0.01058
Epoch: 28244, Training Loss: 0.00824
Epoch: 28244, Training Loss: 0.00860
Epoch: 28244, Training Loss: 0.00861
Epoch: 28244, Training Loss: 0.01058
Epoch: 28245, Training Loss: 0.00824
Epoch: 28245, Training Loss: 0.00860
Epoch: 28245, Training Loss: 0.00861
Epoch: 28245, Training Loss: 0.01058
Epoch: 28246, Training Loss: 0.00824
Epoch: 28246, Training Loss: 0.00860
Epoch: 28246, Training Loss: 0.00861
Epoch: 28246, Training Loss: 0.01058
Epoch: 28247, Training Loss: 0.00824
Epoch: 28247, Training Loss: 0.00860
Epoch: 28247, Training Loss: 0.00861
Epoch: 28247, Training Loss: 0.01058
Epoch: 28248, Training Loss: 0.00824
Epoch: 28248, Training Loss: 0.00860
Epoch: 28248, Training Loss: 0.00861
Epoch: 28248, Training Loss: 0.01058
Epoch: 28249, Training Loss: 0.00824
Epoch: 28249, Training Loss: 0.00860
Epoch: 28249, Training Loss: 0.00861
Epoch: 28249, Training Loss: 0.01058
Epoch: 28250, Training Loss: 0.00824
Epoch: 28250, Training Loss: 0.00860
Epoch: 28250, Training Loss: 0.00861
Epoch: 28250, Training Loss: 0.01058
Epoch: 28251, Training Loss: 0.00824
Epoch: 28251, Training Loss: 0.00860
Epoch: 28251, Training Loss: 0.00861
Epoch: 28251, Training Loss: 0.01058
Epoch: 28252, Training Loss: 0.00824
Epoch: 28252, Training Loss: 0.00860
Epoch: 28252, Training Loss: 0.00861
Epoch: 28252, Training Loss: 0.01058
Epoch: 28253, Training Loss: 0.00824
Epoch: 28253, Training Loss: 0.00860
Epoch: 28253, Training Loss: 0.00861
Epoch: 28253, Training Loss: 0.01058
Epoch: 28254, Training Loss: 0.00824
Epoch: 28254, Training Loss: 0.00860
Epoch: 28254, Training Loss: 0.00861
Epoch: 28254, Training Loss: 0.01058
Epoch: 28255, Training Loss: 0.00824
Epoch: 28255, Training Loss: 0.00860
Epoch: 28255, Training Loss: 0.00861
Epoch: 28255, Training Loss: 0.01058
Epoch: 28256, Training Loss: 0.00824
Epoch: 28256, Training Loss: 0.00860
Epoch: 28256, Training Loss: 0.00861
Epoch: 28256, Training Loss: 0.01058
Epoch: 28257, Training Loss: 0.00824
Epoch: 28257, Training Loss: 0.00860
Epoch: 28257, Training Loss: 0.00861
Epoch: 28257, Training Loss: 0.01058
Epoch: 28258, Training Loss: 0.00824
Epoch: 28258, Training Loss: 0.00859
Epoch: 28258, Training Loss: 0.00861
Epoch: 28258, Training Loss: 0.01058
Epoch: 28259, Training Loss: 0.00824
Epoch: 28259, Training Loss: 0.00859
Epoch: 28259, Training Loss: 0.00861
Epoch: 28259, Training Loss: 0.01057
Epoch: 28260, Training Loss: 0.00824
Epoch: 28260, Training Loss: 0.00859
Epoch: 28260, Training Loss: 0.00861
Epoch: 28260, Training Loss: 0.01057
Epoch: 28261, Training Loss: 0.00824
Epoch: 28261, Training Loss: 0.00859
Epoch: 28261, Training Loss: 0.00860
Epoch: 28261, Training Loss: 0.01057
Epoch: 28262, Training Loss: 0.00824
Epoch: 28262, Training Loss: 0.00859
Epoch: 28262, Training Loss: 0.00860
Epoch: 28262, Training Loss: 0.01057
Epoch: 28263, Training Loss: 0.00824
Epoch: 28263, Training Loss: 0.00859
Epoch: 28263, Training Loss: 0.00860
Epoch: 28263, Training Loss: 0.01057
Epoch: 28264, Training Loss: 0.00824
Epoch: 28264, Training Loss: 0.00859
Epoch: 28264, Training Loss: 0.00860
Epoch: 28264, Training Loss: 0.01057
Epoch: 28265, Training Loss: 0.00824
Epoch: 28265, Training Loss: 0.00859
Epoch: 28265, Training Loss: 0.00860
Epoch: 28265, Training Loss: 0.01057
Epoch: 28266, Training Loss: 0.00824
Epoch: 28266, Training Loss: 0.00859
Epoch: 28266, Training Loss: 0.00860
Epoch: 28266, Training Loss: 0.01057
Epoch: 28267, Training Loss: 0.00824
Epoch: 28267, Training Loss: 0.00859
Epoch: 28267, Training Loss: 0.00860
Epoch: 28267, Training Loss: 0.01057
Epoch: 28268, Training Loss: 0.00824
Epoch: 28268, Training Loss: 0.00859
Epoch: 28268, Training Loss: 0.00860
Epoch: 28268, Training Loss: 0.01057
Epoch: 28269, Training Loss: 0.00824
Epoch: 28269, Training Loss: 0.00859
Epoch: 28269, Training Loss: 0.00860
Epoch: 28269, Training Loss: 0.01057
Epoch: 28270, Training Loss: 0.00824
Epoch: 28270, Training Loss: 0.00859
Epoch: 28270, Training Loss: 0.00860
Epoch: 28270, Training Loss: 0.01057
Epoch: 28271, Training Loss: 0.00824
Epoch: 28271, Training Loss: 0.00859
Epoch: 28271, Training Loss: 0.00860
Epoch: 28271, Training Loss: 0.01057
Epoch: 28272, Training Loss: 0.00824
Epoch: 28272, Training Loss: 0.00859
Epoch: 28272, Training Loss: 0.00860
Epoch: 28272, Training Loss: 0.01057
Epoch: 28273, Training Loss: 0.00824
Epoch: 28273, Training Loss: 0.00859
Epoch: 28273, Training Loss: 0.00860
Epoch: 28273, Training Loss: 0.01057
Epoch: 28274, Training Loss: 0.00824
Epoch: 28274, Training Loss: 0.00859
Epoch: 28274, Training Loss: 0.00860
Epoch: 28274, Training Loss: 0.01057
Epoch: 28275, Training Loss: 0.00824
Epoch: 28275, Training Loss: 0.00859
Epoch: 28275, Training Loss: 0.00860
Epoch: 28275, Training Loss: 0.01057
Epoch: 28276, Training Loss: 0.00824
Epoch: 28276, Training Loss: 0.00859
Epoch: 28276, Training Loss: 0.00860
Epoch: 28276, Training Loss: 0.01057
Epoch: 28277, Training Loss: 0.00824
Epoch: 28277, Training Loss: 0.00859
Epoch: 28277, Training Loss: 0.00860
Epoch: 28277, Training Loss: 0.01057
Epoch: 28278, Training Loss: 0.00824
Epoch: 28278, Training Loss: 0.00859
Epoch: 28278, Training Loss: 0.00860
Epoch: 28278, Training Loss: 0.01057
Epoch: 28279, Training Loss: 0.00824
Epoch: 28279, Training Loss: 0.00859
Epoch: 28279, Training Loss: 0.00860
Epoch: 28279, Training Loss: 0.01057
Epoch: 28280, Training Loss: 0.00824
Epoch: 28280, Training Loss: 0.00859
Epoch: 28280, Training Loss: 0.00860
Epoch: 28280, Training Loss: 0.01057
Epoch: 28281, Training Loss: 0.00824
Epoch: 28281, Training Loss: 0.00859
Epoch: 28281, Training Loss: 0.00860
Epoch: 28281, Training Loss: 0.01057
Epoch: 28282, Training Loss: 0.00824
Epoch: 28282, Training Loss: 0.00859
Epoch: 28282, Training Loss: 0.00860
Epoch: 28282, Training Loss: 0.01057
Epoch: 28283, Training Loss: 0.00824
Epoch: 28283, Training Loss: 0.00859
Epoch: 28283, Training Loss: 0.00860
Epoch: 28283, Training Loss: 0.01057
Epoch: 28284, Training Loss: 0.00824
Epoch: 28284, Training Loss: 0.00859
Epoch: 28284, Training Loss: 0.00860
Epoch: 28284, Training Loss: 0.01057
Epoch: 28285, Training Loss: 0.00824
Epoch: 28285, Training Loss: 0.00859
Epoch: 28285, Training Loss: 0.00860
Epoch: 28285, Training Loss: 0.01057
Epoch: 28286, Training Loss: 0.00824
Epoch: 28286, Training Loss: 0.00859
Epoch: 28286, Training Loss: 0.00860
Epoch: 28286, Training Loss: 0.01057
Epoch: 28287, Training Loss: 0.00824
Epoch: 28287, Training Loss: 0.00859
Epoch: 28287, Training Loss: 0.00860
Epoch: 28287, Training Loss: 0.01057
Epoch: 28288, Training Loss: 0.00824
Epoch: 28288, Training Loss: 0.00859
Epoch: 28288, Training Loss: 0.00860
Epoch: 28288, Training Loss: 0.01057
Epoch: 28289, Training Loss: 0.00824
Epoch: 28289, Training Loss: 0.00859
Epoch: 28289, Training Loss: 0.00860
Epoch: 28289, Training Loss: 0.01057
Epoch: 28290, Training Loss: 0.00824
Epoch: 28290, Training Loss: 0.00859
Epoch: 28290, Training Loss: 0.00860
Epoch: 28290, Training Loss: 0.01057
Epoch: 28291, Training Loss: 0.00824
Epoch: 28291, Training Loss: 0.00859
Epoch: 28291, Training Loss: 0.00860
Epoch: 28291, Training Loss: 0.01057
Epoch: 28292, Training Loss: 0.00824
Epoch: 28292, Training Loss: 0.00859
Epoch: 28292, Training Loss: 0.00860
Epoch: 28292, Training Loss: 0.01057
Epoch: 28293, Training Loss: 0.00824
Epoch: 28293, Training Loss: 0.00859
Epoch: 28293, Training Loss: 0.00860
Epoch: 28293, Training Loss: 0.01057
Epoch: 28294, Training Loss: 0.00824
Epoch: 28294, Training Loss: 0.00859
Epoch: 28294, Training Loss: 0.00860
Epoch: 28294, Training Loss: 0.01057
Epoch: 28295, Training Loss: 0.00824
Epoch: 28295, Training Loss: 0.00859
Epoch: 28295, Training Loss: 0.00860
Epoch: 28295, Training Loss: 0.01057
Epoch: 28296, Training Loss: 0.00823
Epoch: 28296, Training Loss: 0.00859
Epoch: 28296, Training Loss: 0.00860
Epoch: 28296, Training Loss: 0.01057
Epoch: 28297, Training Loss: 0.00823
Epoch: 28297, Training Loss: 0.00859
Epoch: 28297, Training Loss: 0.00860
Epoch: 28297, Training Loss: 0.01057
Epoch: 28298, Training Loss: 0.00823
Epoch: 28298, Training Loss: 0.00859
Epoch: 28298, Training Loss: 0.00860
Epoch: 28298, Training Loss: 0.01057
Epoch: 28299, Training Loss: 0.00823
Epoch: 28299, Training Loss: 0.00859
Epoch: 28299, Training Loss: 0.00860
Epoch: 28299, Training Loss: 0.01057
Epoch: 28300, Training Loss: 0.00823
Epoch: 28300, Training Loss: 0.00859
Epoch: 28300, Training Loss: 0.00860
Epoch: 28300, Training Loss: 0.01057
Epoch: 28301, Training Loss: 0.00823
Epoch: 28301, Training Loss: 0.00859
Epoch: 28301, Training Loss: 0.00860
Epoch: 28301, Training Loss: 0.01057
Epoch: 28302, Training Loss: 0.00823
Epoch: 28302, Training Loss: 0.00859
Epoch: 28302, Training Loss: 0.00860
Epoch: 28302, Training Loss: 0.01057
Epoch: 28303, Training Loss: 0.00823
Epoch: 28303, Training Loss: 0.00859
Epoch: 28303, Training Loss: 0.00860
Epoch: 28303, Training Loss: 0.01057
Epoch: 28304, Training Loss: 0.00823
Epoch: 28304, Training Loss: 0.00859
Epoch: 28304, Training Loss: 0.00860
Epoch: 28304, Training Loss: 0.01057
Epoch: 28305, Training Loss: 0.00823
Epoch: 28305, Training Loss: 0.00859
Epoch: 28305, Training Loss: 0.00860
Epoch: 28305, Training Loss: 0.01057
Epoch: 28306, Training Loss: 0.00823
Epoch: 28306, Training Loss: 0.00859
Epoch: 28306, Training Loss: 0.00860
Epoch: 28306, Training Loss: 0.01056
Epoch: 28307, Training Loss: 0.00823
Epoch: 28307, Training Loss: 0.00859
Epoch: 28307, Training Loss: 0.00860
Epoch: 28307, Training Loss: 0.01056
Epoch: 28308, Training Loss: 0.00823
Epoch: 28308, Training Loss: 0.00859
Epoch: 28308, Training Loss: 0.00860
Epoch: 28308, Training Loss: 0.01056
Epoch: 28309, Training Loss: 0.00823
Epoch: 28309, Training Loss: 0.00859
Epoch: 28309, Training Loss: 0.00860
Epoch: 28309, Training Loss: 0.01056
Epoch: 28310, Training Loss: 0.00823
Epoch: 28310, Training Loss: 0.00859
Epoch: 28310, Training Loss: 0.00860
Epoch: 28310, Training Loss: 0.01056
Epoch: 28311, Training Loss: 0.00823
Epoch: 28311, Training Loss: 0.00859
Epoch: 28311, Training Loss: 0.00860
Epoch: 28311, Training Loss: 0.01056
Epoch: 28312, Training Loss: 0.00823
Epoch: 28312, Training Loss: 0.00859
Epoch: 28312, Training Loss: 0.00860
Epoch: 28312, Training Loss: 0.01056
Epoch: 28313, Training Loss: 0.00823
Epoch: 28313, Training Loss: 0.00859
Epoch: 28313, Training Loss: 0.00860
Epoch: 28313, Training Loss: 0.01056
Epoch: 28314, Training Loss: 0.00823
Epoch: 28314, Training Loss: 0.00859
Epoch: 28314, Training Loss: 0.00860
Epoch: 28314, Training Loss: 0.01056
Epoch: 28315, Training Loss: 0.00823
Epoch: 28315, Training Loss: 0.00859
Epoch: 28315, Training Loss: 0.00860
Epoch: 28315, Training Loss: 0.01056
Epoch: 28316, Training Loss: 0.00823
Epoch: 28316, Training Loss: 0.00858
Epoch: 28316, Training Loss: 0.00860
Epoch: 28316, Training Loss: 0.01056
Epoch: 28317, Training Loss: 0.00823
Epoch: 28317, Training Loss: 0.00858
Epoch: 28317, Training Loss: 0.00860
Epoch: 28317, Training Loss: 0.01056
Epoch: 28318, Training Loss: 0.00823
Epoch: 28318, Training Loss: 0.00858
Epoch: 28318, Training Loss: 0.00860
Epoch: 28318, Training Loss: 0.01056
Epoch: 28319, Training Loss: 0.00823
Epoch: 28319, Training Loss: 0.00858
Epoch: 28319, Training Loss: 0.00859
Epoch: 28319, Training Loss: 0.01056
Epoch: 28320, Training Loss: 0.00823
Epoch: 28320, Training Loss: 0.00858
Epoch: 28320, Training Loss: 0.00859
Epoch: 28320, Training Loss: 0.01056
Epoch: 28321, Training Loss: 0.00823
Epoch: 28321, Training Loss: 0.00858
Epoch: 28321, Training Loss: 0.00859
Epoch: 28321, Training Loss: 0.01056
Epoch: 28322, Training Loss: 0.00823
Epoch: 28322, Training Loss: 0.00858
Epoch: 28322, Training Loss: 0.00859
Epoch: 28322, Training Loss: 0.01056
Epoch: 28323, Training Loss: 0.00823
Epoch: 28323, Training Loss: 0.00858
Epoch: 28323, Training Loss: 0.00859
Epoch: 28323, Training Loss: 0.01056
Epoch: 28324, Training Loss: 0.00823
Epoch: 28324, Training Loss: 0.00858
Epoch: 28324, Training Loss: 0.00859
Epoch: 28324, Training Loss: 0.01056
Epoch: 28325, Training Loss: 0.00823
Epoch: 28325, Training Loss: 0.00858
Epoch: 28325, Training Loss: 0.00859
Epoch: 28325, Training Loss: 0.01056
Epoch: 28326, Training Loss: 0.00823
Epoch: 28326, Training Loss: 0.00858
Epoch: 28326, Training Loss: 0.00859
Epoch: 28326, Training Loss: 0.01056
Epoch: 28327, Training Loss: 0.00823
Epoch: 28327, Training Loss: 0.00858
Epoch: 28327, Training Loss: 0.00859
Epoch: 28327, Training Loss: 0.01056
Epoch: 28328, Training Loss: 0.00823
Epoch: 28328, Training Loss: 0.00858
Epoch: 28328, Training Loss: 0.00859
Epoch: 28328, Training Loss: 0.01056
Epoch: 28329, Training Loss: 0.00823
Epoch: 28329, Training Loss: 0.00858
Epoch: 28329, Training Loss: 0.00859
Epoch: 28329, Training Loss: 0.01056
Epoch: 28330, Training Loss: 0.00823
Epoch: 28330, Training Loss: 0.00858
Epoch: 28330, Training Loss: 0.00859
Epoch: 28330, Training Loss: 0.01056
Epoch: 28331, Training Loss: 0.00823
Epoch: 28331, Training Loss: 0.00858
Epoch: 28331, Training Loss: 0.00859
Epoch: 28331, Training Loss: 0.01056
Epoch: 28332, Training Loss: 0.00823
Epoch: 28332, Training Loss: 0.00858
Epoch: 28332, Training Loss: 0.00859
Epoch: 28332, Training Loss: 0.01056
Epoch: 28333, Training Loss: 0.00823
Epoch: 28333, Training Loss: 0.00858
Epoch: 28333, Training Loss: 0.00859
Epoch: 28333, Training Loss: 0.01056
Epoch: 28334, Training Loss: 0.00823
Epoch: 28334, Training Loss: 0.00858
Epoch: 28334, Training Loss: 0.00859
Epoch: 28334, Training Loss: 0.01056
Epoch: 28335, Training Loss: 0.00823
Epoch: 28335, Training Loss: 0.00858
Epoch: 28335, Training Loss: 0.00859
Epoch: 28335, Training Loss: 0.01056
Epoch: 28336, Training Loss: 0.00823
Epoch: 28336, Training Loss: 0.00858
Epoch: 28336, Training Loss: 0.00859
Epoch: 28336, Training Loss: 0.01056
Epoch: 28337, Training Loss: 0.00823
Epoch: 28337, Training Loss: 0.00858
Epoch: 28337, Training Loss: 0.00859
Epoch: 28337, Training Loss: 0.01056
Epoch: 28338, Training Loss: 0.00823
Epoch: 28338, Training Loss: 0.00858
Epoch: 28338, Training Loss: 0.00859
Epoch: 28338, Training Loss: 0.01056
Epoch: 28339, Training Loss: 0.00823
Epoch: 28339, Training Loss: 0.00858
Epoch: 28339, Training Loss: 0.00859
Epoch: 28339, Training Loss: 0.01056
Epoch: 28340, Training Loss: 0.00823
Epoch: 28340, Training Loss: 0.00858
Epoch: 28340, Training Loss: 0.00859
Epoch: 28340, Training Loss: 0.01056
Epoch: 28341, Training Loss: 0.00823
Epoch: 28341, Training Loss: 0.00858
Epoch: 28341, Training Loss: 0.00859
Epoch: 28341, Training Loss: 0.01056
Epoch: 28342, Training Loss: 0.00823
Epoch: 28342, Training Loss: 0.00858
Epoch: 28342, Training Loss: 0.00859
Epoch: 28342, Training Loss: 0.01056
Epoch: 28343, Training Loss: 0.00823
Epoch: 28343, Training Loss: 0.00858
Epoch: 28343, Training Loss: 0.00859
Epoch: 28343, Training Loss: 0.01056
Epoch: 28344, Training Loss: 0.00823
Epoch: 28344, Training Loss: 0.00858
Epoch: 28344, Training Loss: 0.00859
Epoch: 28344, Training Loss: 0.01056
Epoch: 28345, Training Loss: 0.00823
Epoch: 28345, Training Loss: 0.00858
Epoch: 28345, Training Loss: 0.00859
Epoch: 28345, Training Loss: 0.01056
Epoch: 28346, Training Loss: 0.00823
Epoch: 28346, Training Loss: 0.00858
Epoch: 28346, Training Loss: 0.00859
Epoch: 28346, Training Loss: 0.01056
Epoch: 28347, Training Loss: 0.00823
Epoch: 28347, Training Loss: 0.00858
Epoch: 28347, Training Loss: 0.00859
Epoch: 28347, Training Loss: 0.01056
Epoch: 28348, Training Loss: 0.00823
Epoch: 28348, Training Loss: 0.00858
Epoch: 28348, Training Loss: 0.00859
Epoch: 28348, Training Loss: 0.01056
Epoch: 28349, Training Loss: 0.00823
Epoch: 28349, Training Loss: 0.00858
Epoch: 28349, Training Loss: 0.00859
Epoch: 28349, Training Loss: 0.01056
Epoch: 28350, Training Loss: 0.00823
Epoch: 28350, Training Loss: 0.00858
Epoch: 28350, Training Loss: 0.00859
Epoch: 28350, Training Loss: 0.01056
Epoch: 28351, Training Loss: 0.00823
Epoch: 28351, Training Loss: 0.00858
Epoch: 28351, Training Loss: 0.00859
Epoch: 28351, Training Loss: 0.01056
Epoch: 28352, Training Loss: 0.00823
Epoch: 28352, Training Loss: 0.00858
Epoch: 28352, Training Loss: 0.00859
Epoch: 28352, Training Loss: 0.01055
Epoch: 28353, Training Loss: 0.00823
Epoch: 28353, Training Loss: 0.00858
Epoch: 28353, Training Loss: 0.00859
Epoch: 28353, Training Loss: 0.01055
Epoch: 28354, Training Loss: 0.00823
Epoch: 28354, Training Loss: 0.00858
Epoch: 28354, Training Loss: 0.00859
Epoch: 28354, Training Loss: 0.01055
Epoch: 28355, Training Loss: 0.00823
Epoch: 28355, Training Loss: 0.00858
Epoch: 28355, Training Loss: 0.00859
Epoch: 28355, Training Loss: 0.01055
Epoch: 28356, Training Loss: 0.00823
Epoch: 28356, Training Loss: 0.00858
Epoch: 28356, Training Loss: 0.00859
Epoch: 28356, Training Loss: 0.01055
Epoch: 28357, Training Loss: 0.00823
Epoch: 28357, Training Loss: 0.00858
Epoch: 28357, Training Loss: 0.00859
Epoch: 28357, Training Loss: 0.01055
Epoch: 28358, Training Loss: 0.00823
Epoch: 28358, Training Loss: 0.00858
Epoch: 28358, Training Loss: 0.00859
Epoch: 28358, Training Loss: 0.01055
Epoch: 28359, Training Loss: 0.00822
Epoch: 28359, Training Loss: 0.00858
Epoch: 28359, Training Loss: 0.00859
Epoch: 28359, Training Loss: 0.01055
Epoch: 28360, Training Loss: 0.00822
Epoch: 28360, Training Loss: 0.00858
Epoch: 28360, Training Loss: 0.00859
Epoch: 28360, Training Loss: 0.01055
Epoch: 28361, Training Loss: 0.00822
Epoch: 28361, Training Loss: 0.00858
Epoch: 28361, Training Loss: 0.00859
Epoch: 28361, Training Loss: 0.01055
Epoch: 28362, Training Loss: 0.00822
Epoch: 28362, Training Loss: 0.00858
Epoch: 28362, Training Loss: 0.00859
Epoch: 28362, Training Loss: 0.01055
Epoch: 28363, Training Loss: 0.00822
Epoch: 28363, Training Loss: 0.00858
Epoch: 28363, Training Loss: 0.00859
Epoch: 28363, Training Loss: 0.01055
Epoch: 28364, Training Loss: 0.00822
Epoch: 28364, Training Loss: 0.00858
Epoch: 28364, Training Loss: 0.00859
Epoch: 28364, Training Loss: 0.01055
Epoch: 28365, Training Loss: 0.00822
Epoch: 28365, Training Loss: 0.00858
Epoch: 28365, Training Loss: 0.00859
Epoch: 28365, Training Loss: 0.01055
Epoch: 28366, Training Loss: 0.00822
Epoch: 28366, Training Loss: 0.00858
Epoch: 28366, Training Loss: 0.00859
Epoch: 28366, Training Loss: 0.01055
Epoch: 28367, Training Loss: 0.00822
Epoch: 28367, Training Loss: 0.00858
Epoch: 28367, Training Loss: 0.00859
Epoch: 28367, Training Loss: 0.01055
Epoch: 28368, Training Loss: 0.00822
Epoch: 28368, Training Loss: 0.00858
Epoch: 28368, Training Loss: 0.00859
Epoch: 28368, Training Loss: 0.01055
Epoch: 28369, Training Loss: 0.00822
Epoch: 28369, Training Loss: 0.00858
Epoch: 28369, Training Loss: 0.00859
Epoch: 28369, Training Loss: 0.01055
Epoch: 28370, Training Loss: 0.00822
Epoch: 28370, Training Loss: 0.00858
Epoch: 28370, Training Loss: 0.00859
Epoch: 28370, Training Loss: 0.01055
Epoch: 28371, Training Loss: 0.00822
Epoch: 28371, Training Loss: 0.00858
Epoch: 28371, Training Loss: 0.00859
Epoch: 28371, Training Loss: 0.01055
Epoch: 28372, Training Loss: 0.00822
Epoch: 28372, Training Loss: 0.00858
Epoch: 28372, Training Loss: 0.00859
Epoch: 28372, Training Loss: 0.01055
Epoch: 28373, Training Loss: 0.00822
Epoch: 28373, Training Loss: 0.00858
Epoch: 28373, Training Loss: 0.00859
Epoch: 28373, Training Loss: 0.01055
Epoch: 28374, Training Loss: 0.00822
Epoch: 28374, Training Loss: 0.00858
Epoch: 28374, Training Loss: 0.00859
Epoch: 28374, Training Loss: 0.01055
Epoch: 28375, Training Loss: 0.00822
Epoch: 28375, Training Loss: 0.00857
Epoch: 28375, Training Loss: 0.00859
Epoch: 28375, Training Loss: 0.01055
Epoch: 28376, Training Loss: 0.00822
Epoch: 28376, Training Loss: 0.00857
Epoch: 28376, Training Loss: 0.00859
Epoch: 28376, Training Loss: 0.01055
Epoch: 28377, Training Loss: 0.00822
Epoch: 28377, Training Loss: 0.00857
Epoch: 28377, Training Loss: 0.00858
Epoch: 28377, Training Loss: 0.01055
[... per-pattern loss logs for epochs 28378–28619 elided: four losses are printed per epoch (one per XOR pattern), decreasing very slowly — e.g. 0.00822/0.00857/0.00858/0.01055 at epoch 28378 down to 0.00819/0.00854/0.00855/0.01050 by epoch 28600 ...]
Epoch: 28620, Training Loss: 0.00818
Epoch: 28620, Training Loss: 0.00853
Epoch: 28620, Training Loss: 0.00854
Epoch: 28620, Training Loss: 0.01050
Epoch: 28621, Training Loss: 0.00818
Epoch: 28621, Training Loss: 0.00853
Epoch: 28621, Training Loss: 0.00854
Epoch: 28621, Training Loss: 0.01050
Epoch: 28622, Training Loss: 0.00818
Epoch: 28622, Training Loss: 0.00853
Epoch: 28622, Training Loss: 0.00854
Epoch: 28622, Training Loss: 0.01050
Epoch: 28623, Training Loss: 0.00818
Epoch: 28623, Training Loss: 0.00853
Epoch: 28623, Training Loss: 0.00854
Epoch: 28623, Training Loss: 0.01050
Epoch: 28624, Training Loss: 0.00818
Epoch: 28624, Training Loss: 0.00853
Epoch: 28624, Training Loss: 0.00854
Epoch: 28624, Training Loss: 0.01050
Epoch: 28625, Training Loss: 0.00818
Epoch: 28625, Training Loss: 0.00853
Epoch: 28625, Training Loss: 0.00854
Epoch: 28625, Training Loss: 0.01050
Epoch: 28626, Training Loss: 0.00818
Epoch: 28626, Training Loss: 0.00853
Epoch: 28626, Training Loss: 0.00854
Epoch: 28626, Training Loss: 0.01050
Epoch: 28627, Training Loss: 0.00818
Epoch: 28627, Training Loss: 0.00853
Epoch: 28627, Training Loss: 0.00854
Epoch: 28627, Training Loss: 0.01050
Epoch: 28628, Training Loss: 0.00818
Epoch: 28628, Training Loss: 0.00853
Epoch: 28628, Training Loss: 0.00854
Epoch: 28628, Training Loss: 0.01050
Epoch: 28629, Training Loss: 0.00818
Epoch: 28629, Training Loss: 0.00853
Epoch: 28629, Training Loss: 0.00854
Epoch: 28629, Training Loss: 0.01050
Epoch: 28630, Training Loss: 0.00818
Epoch: 28630, Training Loss: 0.00853
Epoch: 28630, Training Loss: 0.00854
Epoch: 28630, Training Loss: 0.01050
Epoch: 28631, Training Loss: 0.00818
Epoch: 28631, Training Loss: 0.00853
Epoch: 28631, Training Loss: 0.00854
Epoch: 28631, Training Loss: 0.01050
Epoch: 28632, Training Loss: 0.00818
Epoch: 28632, Training Loss: 0.00853
Epoch: 28632, Training Loss: 0.00854
Epoch: 28632, Training Loss: 0.01050
Epoch: 28633, Training Loss: 0.00818
Epoch: 28633, Training Loss: 0.00853
Epoch: 28633, Training Loss: 0.00854
Epoch: 28633, Training Loss: 0.01050
Epoch: 28634, Training Loss: 0.00818
Epoch: 28634, Training Loss: 0.00853
Epoch: 28634, Training Loss: 0.00854
Epoch: 28634, Training Loss: 0.01050
Epoch: 28635, Training Loss: 0.00818
Epoch: 28635, Training Loss: 0.00853
Epoch: 28635, Training Loss: 0.00854
Epoch: 28635, Training Loss: 0.01050
Epoch: 28636, Training Loss: 0.00818
Epoch: 28636, Training Loss: 0.00853
Epoch: 28636, Training Loss: 0.00854
Epoch: 28636, Training Loss: 0.01049
Epoch: 28637, Training Loss: 0.00818
Epoch: 28637, Training Loss: 0.00853
Epoch: 28637, Training Loss: 0.00854
Epoch: 28637, Training Loss: 0.01049
Epoch: 28638, Training Loss: 0.00818
Epoch: 28638, Training Loss: 0.00853
Epoch: 28638, Training Loss: 0.00854
Epoch: 28638, Training Loss: 0.01049
Epoch: 28639, Training Loss: 0.00818
Epoch: 28639, Training Loss: 0.00853
Epoch: 28639, Training Loss: 0.00854
Epoch: 28639, Training Loss: 0.01049
Epoch: 28640, Training Loss: 0.00818
Epoch: 28640, Training Loss: 0.00853
Epoch: 28640, Training Loss: 0.00854
Epoch: 28640, Training Loss: 0.01049
Epoch: 28641, Training Loss: 0.00818
Epoch: 28641, Training Loss: 0.00853
Epoch: 28641, Training Loss: 0.00854
Epoch: 28641, Training Loss: 0.01049
Epoch: 28642, Training Loss: 0.00818
Epoch: 28642, Training Loss: 0.00853
Epoch: 28642, Training Loss: 0.00854
Epoch: 28642, Training Loss: 0.01049
Epoch: 28643, Training Loss: 0.00818
Epoch: 28643, Training Loss: 0.00853
Epoch: 28643, Training Loss: 0.00854
Epoch: 28643, Training Loss: 0.01049
Epoch: 28644, Training Loss: 0.00818
Epoch: 28644, Training Loss: 0.00853
Epoch: 28644, Training Loss: 0.00854
Epoch: 28644, Training Loss: 0.01049
Epoch: 28645, Training Loss: 0.00818
Epoch: 28645, Training Loss: 0.00853
Epoch: 28645, Training Loss: 0.00854
Epoch: 28645, Training Loss: 0.01049
Epoch: 28646, Training Loss: 0.00818
Epoch: 28646, Training Loss: 0.00853
Epoch: 28646, Training Loss: 0.00854
Epoch: 28646, Training Loss: 0.01049
Epoch: 28647, Training Loss: 0.00818
Epoch: 28647, Training Loss: 0.00853
Epoch: 28647, Training Loss: 0.00854
Epoch: 28647, Training Loss: 0.01049
Epoch: 28648, Training Loss: 0.00818
Epoch: 28648, Training Loss: 0.00853
Epoch: 28648, Training Loss: 0.00854
Epoch: 28648, Training Loss: 0.01049
Epoch: 28649, Training Loss: 0.00818
Epoch: 28649, Training Loss: 0.00853
Epoch: 28649, Training Loss: 0.00854
Epoch: 28649, Training Loss: 0.01049
Epoch: 28650, Training Loss: 0.00818
Epoch: 28650, Training Loss: 0.00853
Epoch: 28650, Training Loss: 0.00854
Epoch: 28650, Training Loss: 0.01049
Epoch: 28651, Training Loss: 0.00818
Epoch: 28651, Training Loss: 0.00853
Epoch: 28651, Training Loss: 0.00854
Epoch: 28651, Training Loss: 0.01049
Epoch: 28652, Training Loss: 0.00818
Epoch: 28652, Training Loss: 0.00853
Epoch: 28652, Training Loss: 0.00854
Epoch: 28652, Training Loss: 0.01049
Epoch: 28653, Training Loss: 0.00818
Epoch: 28653, Training Loss: 0.00853
Epoch: 28653, Training Loss: 0.00854
Epoch: 28653, Training Loss: 0.01049
Epoch: 28654, Training Loss: 0.00818
Epoch: 28654, Training Loss: 0.00853
Epoch: 28654, Training Loss: 0.00854
Epoch: 28654, Training Loss: 0.01049
Epoch: 28655, Training Loss: 0.00818
Epoch: 28655, Training Loss: 0.00853
Epoch: 28655, Training Loss: 0.00854
Epoch: 28655, Training Loss: 0.01049
Epoch: 28656, Training Loss: 0.00818
Epoch: 28656, Training Loss: 0.00853
Epoch: 28656, Training Loss: 0.00854
Epoch: 28656, Training Loss: 0.01049
Epoch: 28657, Training Loss: 0.00818
Epoch: 28657, Training Loss: 0.00853
Epoch: 28657, Training Loss: 0.00854
Epoch: 28657, Training Loss: 0.01049
Epoch: 28658, Training Loss: 0.00818
Epoch: 28658, Training Loss: 0.00853
Epoch: 28658, Training Loss: 0.00854
Epoch: 28658, Training Loss: 0.01049
Epoch: 28659, Training Loss: 0.00818
Epoch: 28659, Training Loss: 0.00853
Epoch: 28659, Training Loss: 0.00854
Epoch: 28659, Training Loss: 0.01049
Epoch: 28660, Training Loss: 0.00818
Epoch: 28660, Training Loss: 0.00853
Epoch: 28660, Training Loss: 0.00854
Epoch: 28660, Training Loss: 0.01049
Epoch: 28661, Training Loss: 0.00818
Epoch: 28661, Training Loss: 0.00853
Epoch: 28661, Training Loss: 0.00854
Epoch: 28661, Training Loss: 0.01049
Epoch: 28662, Training Loss: 0.00818
Epoch: 28662, Training Loss: 0.00853
Epoch: 28662, Training Loss: 0.00854
Epoch: 28662, Training Loss: 0.01049
Epoch: 28663, Training Loss: 0.00818
Epoch: 28663, Training Loss: 0.00853
Epoch: 28663, Training Loss: 0.00854
Epoch: 28663, Training Loss: 0.01049
Epoch: 28664, Training Loss: 0.00818
Epoch: 28664, Training Loss: 0.00853
Epoch: 28664, Training Loss: 0.00854
Epoch: 28664, Training Loss: 0.01049
Epoch: 28665, Training Loss: 0.00818
Epoch: 28665, Training Loss: 0.00853
Epoch: 28665, Training Loss: 0.00854
Epoch: 28665, Training Loss: 0.01049
Epoch: 28666, Training Loss: 0.00818
Epoch: 28666, Training Loss: 0.00853
Epoch: 28666, Training Loss: 0.00854
Epoch: 28666, Training Loss: 0.01049
Epoch: 28667, Training Loss: 0.00818
Epoch: 28667, Training Loss: 0.00853
Epoch: 28667, Training Loss: 0.00854
Epoch: 28667, Training Loss: 0.01049
Epoch: 28668, Training Loss: 0.00818
Epoch: 28668, Training Loss: 0.00853
Epoch: 28668, Training Loss: 0.00854
Epoch: 28668, Training Loss: 0.01049
Epoch: 28669, Training Loss: 0.00818
Epoch: 28669, Training Loss: 0.00852
Epoch: 28669, Training Loss: 0.00854
Epoch: 28669, Training Loss: 0.01049
Epoch: 28670, Training Loss: 0.00818
Epoch: 28670, Training Loss: 0.00852
Epoch: 28670, Training Loss: 0.00854
Epoch: 28670, Training Loss: 0.01049
Epoch: 28671, Training Loss: 0.00818
Epoch: 28671, Training Loss: 0.00852
Epoch: 28671, Training Loss: 0.00853
Epoch: 28671, Training Loss: 0.01049
Epoch: 28672, Training Loss: 0.00818
Epoch: 28672, Training Loss: 0.00852
Epoch: 28672, Training Loss: 0.00853
Epoch: 28672, Training Loss: 0.01049
Epoch: 28673, Training Loss: 0.00817
Epoch: 28673, Training Loss: 0.00852
Epoch: 28673, Training Loss: 0.00853
Epoch: 28673, Training Loss: 0.01049
Epoch: 28674, Training Loss: 0.00817
Epoch: 28674, Training Loss: 0.00852
Epoch: 28674, Training Loss: 0.00853
Epoch: 28674, Training Loss: 0.01049
Epoch: 28675, Training Loss: 0.00817
Epoch: 28675, Training Loss: 0.00852
Epoch: 28675, Training Loss: 0.00853
Epoch: 28675, Training Loss: 0.01049
Epoch: 28676, Training Loss: 0.00817
Epoch: 28676, Training Loss: 0.00852
Epoch: 28676, Training Loss: 0.00853
Epoch: 28676, Training Loss: 0.01049
Epoch: 28677, Training Loss: 0.00817
Epoch: 28677, Training Loss: 0.00852
Epoch: 28677, Training Loss: 0.00853
Epoch: 28677, Training Loss: 0.01049
Epoch: 28678, Training Loss: 0.00817
Epoch: 28678, Training Loss: 0.00852
Epoch: 28678, Training Loss: 0.00853
Epoch: 28678, Training Loss: 0.01049
Epoch: 28679, Training Loss: 0.00817
Epoch: 28679, Training Loss: 0.00852
Epoch: 28679, Training Loss: 0.00853
Epoch: 28679, Training Loss: 0.01049
Epoch: 28680, Training Loss: 0.00817
Epoch: 28680, Training Loss: 0.00852
Epoch: 28680, Training Loss: 0.00853
Epoch: 28680, Training Loss: 0.01049
Epoch: 28681, Training Loss: 0.00817
Epoch: 28681, Training Loss: 0.00852
Epoch: 28681, Training Loss: 0.00853
Epoch: 28681, Training Loss: 0.01049
Epoch: 28682, Training Loss: 0.00817
Epoch: 28682, Training Loss: 0.00852
Epoch: 28682, Training Loss: 0.00853
Epoch: 28682, Training Loss: 0.01049
Epoch: 28683, Training Loss: 0.00817
Epoch: 28683, Training Loss: 0.00852
Epoch: 28683, Training Loss: 0.00853
Epoch: 28683, Training Loss: 0.01049
Epoch: 28684, Training Loss: 0.00817
Epoch: 28684, Training Loss: 0.00852
Epoch: 28684, Training Loss: 0.00853
Epoch: 28684, Training Loss: 0.01048
Epoch: 28685, Training Loss: 0.00817
Epoch: 28685, Training Loss: 0.00852
Epoch: 28685, Training Loss: 0.00853
Epoch: 28685, Training Loss: 0.01048
Epoch: 28686, Training Loss: 0.00817
Epoch: 28686, Training Loss: 0.00852
Epoch: 28686, Training Loss: 0.00853
Epoch: 28686, Training Loss: 0.01048
Epoch: 28687, Training Loss: 0.00817
Epoch: 28687, Training Loss: 0.00852
Epoch: 28687, Training Loss: 0.00853
Epoch: 28687, Training Loss: 0.01048
Epoch: 28688, Training Loss: 0.00817
Epoch: 28688, Training Loss: 0.00852
Epoch: 28688, Training Loss: 0.00853
Epoch: 28688, Training Loss: 0.01048
Epoch: 28689, Training Loss: 0.00817
Epoch: 28689, Training Loss: 0.00852
Epoch: 28689, Training Loss: 0.00853
Epoch: 28689, Training Loss: 0.01048
Epoch: 28690, Training Loss: 0.00817
Epoch: 28690, Training Loss: 0.00852
Epoch: 28690, Training Loss: 0.00853
Epoch: 28690, Training Loss: 0.01048
Epoch: 28691, Training Loss: 0.00817
Epoch: 28691, Training Loss: 0.00852
Epoch: 28691, Training Loss: 0.00853
Epoch: 28691, Training Loss: 0.01048
Epoch: 28692, Training Loss: 0.00817
Epoch: 28692, Training Loss: 0.00852
Epoch: 28692, Training Loss: 0.00853
Epoch: 28692, Training Loss: 0.01048
Epoch: 28693, Training Loss: 0.00817
Epoch: 28693, Training Loss: 0.00852
Epoch: 28693, Training Loss: 0.00853
Epoch: 28693, Training Loss: 0.01048
Epoch: 28694, Training Loss: 0.00817
Epoch: 28694, Training Loss: 0.00852
Epoch: 28694, Training Loss: 0.00853
Epoch: 28694, Training Loss: 0.01048
Epoch: 28695, Training Loss: 0.00817
Epoch: 28695, Training Loss: 0.00852
Epoch: 28695, Training Loss: 0.00853
Epoch: 28695, Training Loss: 0.01048
Epoch: 28696, Training Loss: 0.00817
Epoch: 28696, Training Loss: 0.00852
Epoch: 28696, Training Loss: 0.00853
Epoch: 28696, Training Loss: 0.01048
Epoch: 28697, Training Loss: 0.00817
Epoch: 28697, Training Loss: 0.00852
Epoch: 28697, Training Loss: 0.00853
Epoch: 28697, Training Loss: 0.01048
Epoch: 28698, Training Loss: 0.00817
Epoch: 28698, Training Loss: 0.00852
Epoch: 28698, Training Loss: 0.00853
Epoch: 28698, Training Loss: 0.01048
Epoch: 28699, Training Loss: 0.00817
Epoch: 28699, Training Loss: 0.00852
Epoch: 28699, Training Loss: 0.00853
Epoch: 28699, Training Loss: 0.01048
Epoch: 28700, Training Loss: 0.00817
Epoch: 28700, Training Loss: 0.00852
Epoch: 28700, Training Loss: 0.00853
Epoch: 28700, Training Loss: 0.01048
Epoch: 28701, Training Loss: 0.00817
Epoch: 28701, Training Loss: 0.00852
Epoch: 28701, Training Loss: 0.00853
Epoch: 28701, Training Loss: 0.01048
Epoch: 28702, Training Loss: 0.00817
Epoch: 28702, Training Loss: 0.00852
Epoch: 28702, Training Loss: 0.00853
Epoch: 28702, Training Loss: 0.01048
Epoch: 28703, Training Loss: 0.00817
Epoch: 28703, Training Loss: 0.00852
Epoch: 28703, Training Loss: 0.00853
Epoch: 28703, Training Loss: 0.01048
Epoch: 28704, Training Loss: 0.00817
Epoch: 28704, Training Loss: 0.00852
Epoch: 28704, Training Loss: 0.00853
Epoch: 28704, Training Loss: 0.01048
Epoch: 28705, Training Loss: 0.00817
Epoch: 28705, Training Loss: 0.00852
Epoch: 28705, Training Loss: 0.00853
Epoch: 28705, Training Loss: 0.01048
Epoch: 28706, Training Loss: 0.00817
Epoch: 28706, Training Loss: 0.00852
Epoch: 28706, Training Loss: 0.00853
Epoch: 28706, Training Loss: 0.01048
Epoch: 28707, Training Loss: 0.00817
Epoch: 28707, Training Loss: 0.00852
Epoch: 28707, Training Loss: 0.00853
Epoch: 28707, Training Loss: 0.01048
Epoch: 28708, Training Loss: 0.00817
Epoch: 28708, Training Loss: 0.00852
Epoch: 28708, Training Loss: 0.00853
Epoch: 28708, Training Loss: 0.01048
Epoch: 28709, Training Loss: 0.00817
Epoch: 28709, Training Loss: 0.00852
Epoch: 28709, Training Loss: 0.00853
Epoch: 28709, Training Loss: 0.01048
Epoch: 28710, Training Loss: 0.00817
Epoch: 28710, Training Loss: 0.00852
Epoch: 28710, Training Loss: 0.00853
Epoch: 28710, Training Loss: 0.01048
Epoch: 28711, Training Loss: 0.00817
Epoch: 28711, Training Loss: 0.00852
Epoch: 28711, Training Loss: 0.00853
Epoch: 28711, Training Loss: 0.01048
Epoch: 28712, Training Loss: 0.00817
Epoch: 28712, Training Loss: 0.00852
Epoch: 28712, Training Loss: 0.00853
Epoch: 28712, Training Loss: 0.01048
Epoch: 28713, Training Loss: 0.00817
Epoch: 28713, Training Loss: 0.00852
Epoch: 28713, Training Loss: 0.00853
Epoch: 28713, Training Loss: 0.01048
Epoch: 28714, Training Loss: 0.00817
Epoch: 28714, Training Loss: 0.00852
Epoch: 28714, Training Loss: 0.00853
Epoch: 28714, Training Loss: 0.01048
Epoch: 28715, Training Loss: 0.00817
Epoch: 28715, Training Loss: 0.00852
Epoch: 28715, Training Loss: 0.00853
Epoch: 28715, Training Loss: 0.01048
Epoch: 28716, Training Loss: 0.00817
Epoch: 28716, Training Loss: 0.00852
Epoch: 28716, Training Loss: 0.00853
Epoch: 28716, Training Loss: 0.01048
Epoch: 28717, Training Loss: 0.00817
Epoch: 28717, Training Loss: 0.00852
Epoch: 28717, Training Loss: 0.00853
Epoch: 28717, Training Loss: 0.01048
Epoch: 28718, Training Loss: 0.00817
Epoch: 28718, Training Loss: 0.00852
Epoch: 28718, Training Loss: 0.00853
Epoch: 28718, Training Loss: 0.01048
Epoch: 28719, Training Loss: 0.00817
Epoch: 28719, Training Loss: 0.00852
Epoch: 28719, Training Loss: 0.00853
Epoch: 28719, Training Loss: 0.01048
Epoch: 28720, Training Loss: 0.00817
Epoch: 28720, Training Loss: 0.00852
Epoch: 28720, Training Loss: 0.00853
Epoch: 28720, Training Loss: 0.01048
Epoch: 28721, Training Loss: 0.00817
Epoch: 28721, Training Loss: 0.00852
Epoch: 28721, Training Loss: 0.00853
Epoch: 28721, Training Loss: 0.01048
Epoch: 28722, Training Loss: 0.00817
Epoch: 28722, Training Loss: 0.00852
Epoch: 28722, Training Loss: 0.00853
Epoch: 28722, Training Loss: 0.01048
Epoch: 28723, Training Loss: 0.00817
Epoch: 28723, Training Loss: 0.00852
Epoch: 28723, Training Loss: 0.00853
Epoch: 28723, Training Loss: 0.01048
Epoch: 28724, Training Loss: 0.00817
Epoch: 28724, Training Loss: 0.00852
Epoch: 28724, Training Loss: 0.00853
Epoch: 28724, Training Loss: 0.01048
Epoch: 28725, Training Loss: 0.00817
Epoch: 28725, Training Loss: 0.00852
Epoch: 28725, Training Loss: 0.00853
Epoch: 28725, Training Loss: 0.01048
Epoch: 28726, Training Loss: 0.00817
Epoch: 28726, Training Loss: 0.00852
Epoch: 28726, Training Loss: 0.00853
Epoch: 28726, Training Loss: 0.01048
Epoch: 28727, Training Loss: 0.00817
Epoch: 28727, Training Loss: 0.00852
Epoch: 28727, Training Loss: 0.00853
Epoch: 28727, Training Loss: 0.01048
Epoch: 28728, Training Loss: 0.00817
Epoch: 28728, Training Loss: 0.00852
Epoch: 28728, Training Loss: 0.00853
Epoch: 28728, Training Loss: 0.01048
Epoch: 28729, Training Loss: 0.00817
Epoch: 28729, Training Loss: 0.00851
Epoch: 28729, Training Loss: 0.00853
Epoch: 28729, Training Loss: 0.01048
Epoch: 28730, Training Loss: 0.00817
Epoch: 28730, Training Loss: 0.00851
Epoch: 28730, Training Loss: 0.00853
Epoch: 28730, Training Loss: 0.01048
Epoch: 28731, Training Loss: 0.00817
Epoch: 28731, Training Loss: 0.00851
Epoch: 28731, Training Loss: 0.00852
Epoch: 28731, Training Loss: 0.01048
Epoch: 28732, Training Loss: 0.00817
Epoch: 28732, Training Loss: 0.00851
Epoch: 28732, Training Loss: 0.00852
Epoch: 28732, Training Loss: 0.01047
Epoch: 28733, Training Loss: 0.00817
Epoch: 28733, Training Loss: 0.00851
Epoch: 28733, Training Loss: 0.00852
Epoch: 28733, Training Loss: 0.01047
Epoch: 28734, Training Loss: 0.00817
Epoch: 28734, Training Loss: 0.00851
Epoch: 28734, Training Loss: 0.00852
Epoch: 28734, Training Loss: 0.01047
Epoch: 28735, Training Loss: 0.00817
Epoch: 28735, Training Loss: 0.00851
Epoch: 28735, Training Loss: 0.00852
Epoch: 28735, Training Loss: 0.01047
Epoch: 28736, Training Loss: 0.00817
Epoch: 28736, Training Loss: 0.00851
Epoch: 28736, Training Loss: 0.00852
Epoch: 28736, Training Loss: 0.01047
Epoch: 28737, Training Loss: 0.00816
Epoch: 28737, Training Loss: 0.00851
Epoch: 28737, Training Loss: 0.00852
Epoch: 28737, Training Loss: 0.01047
Epoch: 28738, Training Loss: 0.00816
Epoch: 28738, Training Loss: 0.00851
Epoch: 28738, Training Loss: 0.00852
Epoch: 28738, Training Loss: 0.01047
Epoch: 28739, Training Loss: 0.00816
Epoch: 28739, Training Loss: 0.00851
Epoch: 28739, Training Loss: 0.00852
Epoch: 28739, Training Loss: 0.01047
Epoch: 28740, Training Loss: 0.00816
Epoch: 28740, Training Loss: 0.00851
Epoch: 28740, Training Loss: 0.00852
Epoch: 28740, Training Loss: 0.01047
Epoch: 28741, Training Loss: 0.00816
Epoch: 28741, Training Loss: 0.00851
Epoch: 28741, Training Loss: 0.00852
Epoch: 28741, Training Loss: 0.01047
Epoch: 28742, Training Loss: 0.00816
Epoch: 28742, Training Loss: 0.00851
Epoch: 28742, Training Loss: 0.00852
Epoch: 28742, Training Loss: 0.01047
Epoch: 28743, Training Loss: 0.00816
Epoch: 28743, Training Loss: 0.00851
Epoch: 28743, Training Loss: 0.00852
Epoch: 28743, Training Loss: 0.01047
Epoch: 28744, Training Loss: 0.00816
Epoch: 28744, Training Loss: 0.00851
Epoch: 28744, Training Loss: 0.00852
Epoch: 28744, Training Loss: 0.01047
Epoch: 28745, Training Loss: 0.00816
Epoch: 28745, Training Loss: 0.00851
Epoch: 28745, Training Loss: 0.00852
Epoch: 28745, Training Loss: 0.01047
Epoch: 28746, Training Loss: 0.00816
Epoch: 28746, Training Loss: 0.00851
Epoch: 28746, Training Loss: 0.00852
Epoch: 28746, Training Loss: 0.01047
Epoch: 28747, Training Loss: 0.00816
Epoch: 28747, Training Loss: 0.00851
Epoch: 28747, Training Loss: 0.00852
Epoch: 28747, Training Loss: 0.01047
Epoch: 28748, Training Loss: 0.00816
Epoch: 28748, Training Loss: 0.00851
Epoch: 28748, Training Loss: 0.00852
Epoch: 28748, Training Loss: 0.01047
Epoch: 28749, Training Loss: 0.00816
Epoch: 28749, Training Loss: 0.00851
Epoch: 28749, Training Loss: 0.00852
Epoch: 28749, Training Loss: 0.01047
Epoch: 28750, Training Loss: 0.00816
Epoch: 28750, Training Loss: 0.00851
Epoch: 28750, Training Loss: 0.00852
Epoch: 28750, Training Loss: 0.01047
Epoch: 28751, Training Loss: 0.00816
Epoch: 28751, Training Loss: 0.00851
Epoch: 28751, Training Loss: 0.00852
Epoch: 28751, Training Loss: 0.01047
Epoch: 28752, Training Loss: 0.00816
Epoch: 28752, Training Loss: 0.00851
Epoch: 28752, Training Loss: 0.00852
Epoch: 28752, Training Loss: 0.01047
Epoch: 28753, Training Loss: 0.00816
Epoch: 28753, Training Loss: 0.00851
Epoch: 28753, Training Loss: 0.00852
Epoch: 28753, Training Loss: 0.01047
Epoch: 28754, Training Loss: 0.00816
Epoch: 28754, Training Loss: 0.00851
Epoch: 28754, Training Loss: 0.00852
Epoch: 28754, Training Loss: 0.01047
Epoch: 28755, Training Loss: 0.00816
Epoch: 28755, Training Loss: 0.00851
Epoch: 28755, Training Loss: 0.00852
Epoch: 28755, Training Loss: 0.01047
Epoch: 28756, Training Loss: 0.00816
Epoch: 28756, Training Loss: 0.00851
Epoch: 28756, Training Loss: 0.00852
Epoch: 28756, Training Loss: 0.01047
Epoch: 28757, Training Loss: 0.00816
Epoch: 28757, Training Loss: 0.00851
Epoch: 28757, Training Loss: 0.00852
Epoch: 28757, Training Loss: 0.01047
Epoch: 28758, Training Loss: 0.00816
Epoch: 28758, Training Loss: 0.00851
Epoch: 28758, Training Loss: 0.00852
Epoch: 28758, Training Loss: 0.01047
Epoch: 28759, Training Loss: 0.00816
Epoch: 28759, Training Loss: 0.00851
Epoch: 28759, Training Loss: 0.00852
Epoch: 28759, Training Loss: 0.01047
Epoch: 28760, Training Loss: 0.00816
Epoch: 28760, Training Loss: 0.00851
Epoch: 28760, Training Loss: 0.00852
Epoch: 28760, Training Loss: 0.01047
Epoch: 28761, Training Loss: 0.00816
Epoch: 28761, Training Loss: 0.00851
Epoch: 28761, Training Loss: 0.00852
Epoch: 28761, Training Loss: 0.01047
Epoch: 28762, Training Loss: 0.00816
Epoch: 28762, Training Loss: 0.00851
Epoch: 28762, Training Loss: 0.00852
Epoch: 28762, Training Loss: 0.01047
Epoch: 28763, Training Loss: 0.00816
Epoch: 28763, Training Loss: 0.00851
Epoch: 28763, Training Loss: 0.00852
Epoch: 28763, Training Loss: 0.01047
Epoch: 28764, Training Loss: 0.00816
Epoch: 28764, Training Loss: 0.00851
Epoch: 28764, Training Loss: 0.00852
Epoch: 28764, Training Loss: 0.01047
Epoch: 28765, Training Loss: 0.00816
Epoch: 28765, Training Loss: 0.00851
Epoch: 28765, Training Loss: 0.00852
Epoch: 28765, Training Loss: 0.01047
Epoch: 28766, Training Loss: 0.00816
Epoch: 28766, Training Loss: 0.00851
Epoch: 28766, Training Loss: 0.00852
Epoch: 28766, Training Loss: 0.01047
Epoch: 28767, Training Loss: 0.00816
Epoch: 28767, Training Loss: 0.00851
Epoch: 28767, Training Loss: 0.00852
Epoch: 28767, Training Loss: 0.01047
Epoch: 28768, Training Loss: 0.00816
Epoch: 28768, Training Loss: 0.00851
Epoch: 28768, Training Loss: 0.00852
Epoch: 28768, Training Loss: 0.01047
Epoch: 28769, Training Loss: 0.00816
Epoch: 28769, Training Loss: 0.00851
Epoch: 28769, Training Loss: 0.00852
Epoch: 28769, Training Loss: 0.01047
Epoch: 28770, Training Loss: 0.00816
Epoch: 28770, Training Loss: 0.00851
Epoch: 28770, Training Loss: 0.00852
Epoch: 28770, Training Loss: 0.01047
Epoch: 28771, Training Loss: 0.00816
Epoch: 28771, Training Loss: 0.00851
Epoch: 28771, Training Loss: 0.00852
Epoch: 28771, Training Loss: 0.01047
Epoch: 28772, Training Loss: 0.00816
Epoch: 28772, Training Loss: 0.00851
Epoch: 28772, Training Loss: 0.00852
Epoch: 28772, Training Loss: 0.01047
Epoch: 28773, Training Loss: 0.00816
Epoch: 28773, Training Loss: 0.00851
Epoch: 28773, Training Loss: 0.00852
Epoch: 28773, Training Loss: 0.01047
Epoch: 28774, Training Loss: 0.00816
Epoch: 28774, Training Loss: 0.00851
Epoch: 28774, Training Loss: 0.00852
Epoch: 28774, Training Loss: 0.01047
Epoch: 28775, Training Loss: 0.00816
Epoch: 28775, Training Loss: 0.00851
Epoch: 28775, Training Loss: 0.00852
Epoch: 28775, Training Loss: 0.01047
Epoch: 28776, Training Loss: 0.00816
Epoch: 28776, Training Loss: 0.00851
Epoch: 28776, Training Loss: 0.00852
Epoch: 28776, Training Loss: 0.01047
Epoch: 28777, Training Loss: 0.00816
Epoch: 28777, Training Loss: 0.00851
Epoch: 28777, Training Loss: 0.00852
Epoch: 28777, Training Loss: 0.01047
Epoch: 28778, Training Loss: 0.00816
Epoch: 28778, Training Loss: 0.00851
Epoch: 28778, Training Loss: 0.00852
Epoch: 28778, Training Loss: 0.01047
Epoch: 28779, Training Loss: 0.00816
Epoch: 28779, Training Loss: 0.00851
Epoch: 28779, Training Loss: 0.00852
Epoch: 28779, Training Loss: 0.01047
Epoch: 28780, Training Loss: 0.00816
Epoch: 28780, Training Loss: 0.00851
Epoch: 28780, Training Loss: 0.00852
Epoch: 28780, Training Loss: 0.01046
Epoch: 28781, Training Loss: 0.00816
Epoch: 28781, Training Loss: 0.00851
Epoch: 28781, Training Loss: 0.00852
Epoch: 28781, Training Loss: 0.01046
Epoch: 28782, Training Loss: 0.00816
Epoch: 28782, Training Loss: 0.00851
Epoch: 28782, Training Loss: 0.00852
Epoch: 28782, Training Loss: 0.01046
Epoch: 28783, Training Loss: 0.00816
Epoch: 28783, Training Loss: 0.00851
Epoch: 28783, Training Loss: 0.00852
Epoch: 28783, Training Loss: 0.01046
Epoch: 28784, Training Loss: 0.00816
Epoch: 28784, Training Loss: 0.00851
Epoch: 28784, Training Loss: 0.00852
Epoch: 28784, Training Loss: 0.01046
Epoch: 28785, Training Loss: 0.00816
Epoch: 28785, Training Loss: 0.00851
Epoch: 28785, Training Loss: 0.00852
Epoch: 28785, Training Loss: 0.01046
Epoch: 28786, Training Loss: 0.00816
Epoch: 28786, Training Loss: 0.00851
Epoch: 28786, Training Loss: 0.00852
Epoch: 28786, Training Loss: 0.01046
Epoch: 28787, Training Loss: 0.00816
Epoch: 28787, Training Loss: 0.00851
Epoch: 28787, Training Loss: 0.00852
Epoch: 28787, Training Loss: 0.01046
Epoch: 28788, Training Loss: 0.00816
Epoch: 28788, Training Loss: 0.00851
Epoch: 28788, Training Loss: 0.00852
Epoch: 28788, Training Loss: 0.01046
Epoch: 28789, Training Loss: 0.00816
Epoch: 28789, Training Loss: 0.00850
Epoch: 28789, Training Loss: 0.00852
Epoch: 28789, Training Loss: 0.01046
Epoch: 28790, Training Loss: 0.00816
Epoch: 28790, Training Loss: 0.00850
Epoch: 28790, Training Loss: 0.00851
Epoch: 28790, Training Loss: 0.01046
Epoch: 28791, Training Loss: 0.00816
Epoch: 28791, Training Loss: 0.00850
Epoch: 28791, Training Loss: 0.00851
Epoch: 28791, Training Loss: 0.01046
Epoch: 28792, Training Loss: 0.00816
Epoch: 28792, Training Loss: 0.00850
Epoch: 28792, Training Loss: 0.00851
Epoch: 28792, Training Loss: 0.01046
Epoch: 28793, Training Loss: 0.00816
Epoch: 28793, Training Loss: 0.00850
Epoch: 28793, Training Loss: 0.00851
Epoch: 28793, Training Loss: 0.01046
Epoch: 28794, Training Loss: 0.00816
Epoch: 28794, Training Loss: 0.00850
Epoch: 28794, Training Loss: 0.00851
Epoch: 28794, Training Loss: 0.01046
Epoch: 28795, Training Loss: 0.00816
Epoch: 28795, Training Loss: 0.00850
Epoch: 28795, Training Loss: 0.00851
Epoch: 28795, Training Loss: 0.01046
Epoch: 28796, Training Loss: 0.00816
Epoch: 28796, Training Loss: 0.00850
Epoch: 28796, Training Loss: 0.00851
Epoch: 28796, Training Loss: 0.01046
Epoch: 28797, Training Loss: 0.00816
Epoch: 28797, Training Loss: 0.00850
Epoch: 28797, Training Loss: 0.00851
Epoch: 28797, Training Loss: 0.01046
Epoch: 28798, Training Loss: 0.00816
Epoch: 28798, Training Loss: 0.00850
Epoch: 28798, Training Loss: 0.00851
Epoch: 28798, Training Loss: 0.01046
Epoch: 28799, Training Loss: 0.00816
Epoch: 28799, Training Loss: 0.00850
Epoch: 28799, Training Loss: 0.00851
Epoch: 28799, Training Loss: 0.01046
Epoch: 28800, Training Loss: 0.00815
Epoch: 28800, Training Loss: 0.00850
Epoch: 28800, Training Loss: 0.00851
Epoch: 28800, Training Loss: 0.01046
Epoch: 28801, Training Loss: 0.00815
Epoch: 28801, Training Loss: 0.00850
Epoch: 28801, Training Loss: 0.00851
Epoch: 28801, Training Loss: 0.01046
Epoch: 28802, Training Loss: 0.00815
Epoch: 28802, Training Loss: 0.00850
Epoch: 28802, Training Loss: 0.00851
Epoch: 28802, Training Loss: 0.01046
Epoch: 28803, Training Loss: 0.00815
Epoch: 28803, Training Loss: 0.00850
Epoch: 28803, Training Loss: 0.00851
Epoch: 28803, Training Loss: 0.01046
Epoch: 28804, Training Loss: 0.00815
Epoch: 28804, Training Loss: 0.00850
Epoch: 28804, Training Loss: 0.00851
Epoch: 28804, Training Loss: 0.01046
Epoch: 28805, Training Loss: 0.00815
Epoch: 28805, Training Loss: 0.00850
Epoch: 28805, Training Loss: 0.00851
Epoch: 28805, Training Loss: 0.01046
Epoch: 28806, Training Loss: 0.00815
Epoch: 28806, Training Loss: 0.00850
Epoch: 28806, Training Loss: 0.00851
Epoch: 28806, Training Loss: 0.01046
Epoch: 28807, Training Loss: 0.00815
Epoch: 28807, Training Loss: 0.00850
Epoch: 28807, Training Loss: 0.00851
Epoch: 28807, Training Loss: 0.01046
Epoch: 28808, Training Loss: 0.00815
Epoch: 28808, Training Loss: 0.00850
Epoch: 28808, Training Loss: 0.00851
Epoch: 28808, Training Loss: 0.01046
Epoch: 28809, Training Loss: 0.00815
Epoch: 28809, Training Loss: 0.00850
Epoch: 28809, Training Loss: 0.00851
Epoch: 28809, Training Loss: 0.01046
Epoch: 28810, Training Loss: 0.00815
Epoch: 28810, Training Loss: 0.00850
Epoch: 28810, Training Loss: 0.00851
Epoch: 28810, Training Loss: 0.01046
...
Epoch: 29052, Training Loss: 0.00812
Epoch: 29052, Training Loss: 0.00846
Epoch: 29052, Training Loss: 0.00847
Epoch: 29052, Training Loss: 0.01041
Epoch: 29053, Training Loss: 0.00812
Epoch: 29053, Training Loss: 0.00846
Epoch: 29053, Training Loss: 0.00847
Epoch: 29053, Training Loss: 0.01041
Epoch: 29054, Training Loss: 0.00812
Epoch: 29054, Training Loss: 0.00846
Epoch: 29054, Training Loss: 0.00847
Epoch: 29054, Training Loss: 0.01041
Epoch: 29055, Training Loss: 0.00812
Epoch: 29055, Training Loss: 0.00846
Epoch: 29055, Training Loss: 0.00847
Epoch: 29055, Training Loss: 0.01041
Epoch: 29056, Training Loss: 0.00812
Epoch: 29056, Training Loss: 0.00846
Epoch: 29056, Training Loss: 0.00847
Epoch: 29056, Training Loss: 0.01041
Epoch: 29057, Training Loss: 0.00812
Epoch: 29057, Training Loss: 0.00846
Epoch: 29057, Training Loss: 0.00847
Epoch: 29057, Training Loss: 0.01041
Epoch: 29058, Training Loss: 0.00811
Epoch: 29058, Training Loss: 0.00846
Epoch: 29058, Training Loss: 0.00847
Epoch: 29058, Training Loss: 0.01041
Epoch: 29059, Training Loss: 0.00811
Epoch: 29059, Training Loss: 0.00846
Epoch: 29059, Training Loss: 0.00847
Epoch: 29059, Training Loss: 0.01041
Epoch: 29060, Training Loss: 0.00811
Epoch: 29060, Training Loss: 0.00846
Epoch: 29060, Training Loss: 0.00847
Epoch: 29060, Training Loss: 0.01041
Epoch: 29061, Training Loss: 0.00811
Epoch: 29061, Training Loss: 0.00846
Epoch: 29061, Training Loss: 0.00847
Epoch: 29061, Training Loss: 0.01041
Epoch: 29062, Training Loss: 0.00811
Epoch: 29062, Training Loss: 0.00846
Epoch: 29062, Training Loss: 0.00847
Epoch: 29062, Training Loss: 0.01041
Epoch: 29063, Training Loss: 0.00811
Epoch: 29063, Training Loss: 0.00846
Epoch: 29063, Training Loss: 0.00847
Epoch: 29063, Training Loss: 0.01041
Epoch: 29064, Training Loss: 0.00811
Epoch: 29064, Training Loss: 0.00846
Epoch: 29064, Training Loss: 0.00847
Epoch: 29064, Training Loss: 0.01041
Epoch: 29065, Training Loss: 0.00811
Epoch: 29065, Training Loss: 0.00846
Epoch: 29065, Training Loss: 0.00847
Epoch: 29065, Training Loss: 0.01041
Epoch: 29066, Training Loss: 0.00811
Epoch: 29066, Training Loss: 0.00846
Epoch: 29066, Training Loss: 0.00847
Epoch: 29066, Training Loss: 0.01041
Epoch: 29067, Training Loss: 0.00811
Epoch: 29067, Training Loss: 0.00846
Epoch: 29067, Training Loss: 0.00847
Epoch: 29067, Training Loss: 0.01041
Epoch: 29068, Training Loss: 0.00811
Epoch: 29068, Training Loss: 0.00846
Epoch: 29068, Training Loss: 0.00847
Epoch: 29068, Training Loss: 0.01041
Epoch: 29069, Training Loss: 0.00811
Epoch: 29069, Training Loss: 0.00846
Epoch: 29069, Training Loss: 0.00847
Epoch: 29069, Training Loss: 0.01041
Epoch: 29070, Training Loss: 0.00811
Epoch: 29070, Training Loss: 0.00846
Epoch: 29070, Training Loss: 0.00847
Epoch: 29070, Training Loss: 0.01040
Epoch: 29071, Training Loss: 0.00811
Epoch: 29071, Training Loss: 0.00846
Epoch: 29071, Training Loss: 0.00847
Epoch: 29071, Training Loss: 0.01040
Epoch: 29072, Training Loss: 0.00811
Epoch: 29072, Training Loss: 0.00846
Epoch: 29072, Training Loss: 0.00847
Epoch: 29072, Training Loss: 0.01040
Epoch: 29073, Training Loss: 0.00811
Epoch: 29073, Training Loss: 0.00846
Epoch: 29073, Training Loss: 0.00847
Epoch: 29073, Training Loss: 0.01040
Epoch: 29074, Training Loss: 0.00811
Epoch: 29074, Training Loss: 0.00846
Epoch: 29074, Training Loss: 0.00847
Epoch: 29074, Training Loss: 0.01040
Epoch: 29075, Training Loss: 0.00811
Epoch: 29075, Training Loss: 0.00846
Epoch: 29075, Training Loss: 0.00847
Epoch: 29075, Training Loss: 0.01040
Epoch: 29076, Training Loss: 0.00811
Epoch: 29076, Training Loss: 0.00846
Epoch: 29076, Training Loss: 0.00847
Epoch: 29076, Training Loss: 0.01040
Epoch: 29077, Training Loss: 0.00811
Epoch: 29077, Training Loss: 0.00846
Epoch: 29077, Training Loss: 0.00847
Epoch: 29077, Training Loss: 0.01040
Epoch: 29078, Training Loss: 0.00811
Epoch: 29078, Training Loss: 0.00846
Epoch: 29078, Training Loss: 0.00847
Epoch: 29078, Training Loss: 0.01040
Epoch: 29079, Training Loss: 0.00811
Epoch: 29079, Training Loss: 0.00846
Epoch: 29079, Training Loss: 0.00847
Epoch: 29079, Training Loss: 0.01040
Epoch: 29080, Training Loss: 0.00811
Epoch: 29080, Training Loss: 0.00846
Epoch: 29080, Training Loss: 0.00847
Epoch: 29080, Training Loss: 0.01040
Epoch: 29081, Training Loss: 0.00811
Epoch: 29081, Training Loss: 0.00846
Epoch: 29081, Training Loss: 0.00847
Epoch: 29081, Training Loss: 0.01040
Epoch: 29082, Training Loss: 0.00811
Epoch: 29082, Training Loss: 0.00846
Epoch: 29082, Training Loss: 0.00847
Epoch: 29082, Training Loss: 0.01040
Epoch: 29083, Training Loss: 0.00811
Epoch: 29083, Training Loss: 0.00846
Epoch: 29083, Training Loss: 0.00847
Epoch: 29083, Training Loss: 0.01040
Epoch: 29084, Training Loss: 0.00811
Epoch: 29084, Training Loss: 0.00846
Epoch: 29084, Training Loss: 0.00847
Epoch: 29084, Training Loss: 0.01040
Epoch: 29085, Training Loss: 0.00811
Epoch: 29085, Training Loss: 0.00846
Epoch: 29085, Training Loss: 0.00847
Epoch: 29085, Training Loss: 0.01040
Epoch: 29086, Training Loss: 0.00811
Epoch: 29086, Training Loss: 0.00846
Epoch: 29086, Training Loss: 0.00847
Epoch: 29086, Training Loss: 0.01040
Epoch: 29087, Training Loss: 0.00811
Epoch: 29087, Training Loss: 0.00846
Epoch: 29087, Training Loss: 0.00847
Epoch: 29087, Training Loss: 0.01040
Epoch: 29088, Training Loss: 0.00811
Epoch: 29088, Training Loss: 0.00846
Epoch: 29088, Training Loss: 0.00847
Epoch: 29088, Training Loss: 0.01040
Epoch: 29089, Training Loss: 0.00811
Epoch: 29089, Training Loss: 0.00846
Epoch: 29089, Training Loss: 0.00847
Epoch: 29089, Training Loss: 0.01040
Epoch: 29090, Training Loss: 0.00811
Epoch: 29090, Training Loss: 0.00846
Epoch: 29090, Training Loss: 0.00847
Epoch: 29090, Training Loss: 0.01040
Epoch: 29091, Training Loss: 0.00811
Epoch: 29091, Training Loss: 0.00845
Epoch: 29091, Training Loss: 0.00847
Epoch: 29091, Training Loss: 0.01040
Epoch: 29092, Training Loss: 0.00811
Epoch: 29092, Training Loss: 0.00845
Epoch: 29092, Training Loss: 0.00846
Epoch: 29092, Training Loss: 0.01040
Epoch: 29093, Training Loss: 0.00811
Epoch: 29093, Training Loss: 0.00845
Epoch: 29093, Training Loss: 0.00846
Epoch: 29093, Training Loss: 0.01040
Epoch: 29094, Training Loss: 0.00811
Epoch: 29094, Training Loss: 0.00845
Epoch: 29094, Training Loss: 0.00846
Epoch: 29094, Training Loss: 0.01040
Epoch: 29095, Training Loss: 0.00811
Epoch: 29095, Training Loss: 0.00845
Epoch: 29095, Training Loss: 0.00846
Epoch: 29095, Training Loss: 0.01040
Epoch: 29096, Training Loss: 0.00811
Epoch: 29096, Training Loss: 0.00845
Epoch: 29096, Training Loss: 0.00846
Epoch: 29096, Training Loss: 0.01040
Epoch: 29097, Training Loss: 0.00811
Epoch: 29097, Training Loss: 0.00845
Epoch: 29097, Training Loss: 0.00846
Epoch: 29097, Training Loss: 0.01040
Epoch: 29098, Training Loss: 0.00811
Epoch: 29098, Training Loss: 0.00845
Epoch: 29098, Training Loss: 0.00846
Epoch: 29098, Training Loss: 0.01040
Epoch: 29099, Training Loss: 0.00811
Epoch: 29099, Training Loss: 0.00845
Epoch: 29099, Training Loss: 0.00846
Epoch: 29099, Training Loss: 0.01040
Epoch: 29100, Training Loss: 0.00811
Epoch: 29100, Training Loss: 0.00845
Epoch: 29100, Training Loss: 0.00846
Epoch: 29100, Training Loss: 0.01040
Epoch: 29101, Training Loss: 0.00811
Epoch: 29101, Training Loss: 0.00845
Epoch: 29101, Training Loss: 0.00846
Epoch: 29101, Training Loss: 0.01040
Epoch: 29102, Training Loss: 0.00811
Epoch: 29102, Training Loss: 0.00845
Epoch: 29102, Training Loss: 0.00846
Epoch: 29102, Training Loss: 0.01040
Epoch: 29103, Training Loss: 0.00811
Epoch: 29103, Training Loss: 0.00845
Epoch: 29103, Training Loss: 0.00846
Epoch: 29103, Training Loss: 0.01040
Epoch: 29104, Training Loss: 0.00811
Epoch: 29104, Training Loss: 0.00845
Epoch: 29104, Training Loss: 0.00846
Epoch: 29104, Training Loss: 0.01040
Epoch: 29105, Training Loss: 0.00811
Epoch: 29105, Training Loss: 0.00845
Epoch: 29105, Training Loss: 0.00846
Epoch: 29105, Training Loss: 0.01040
Epoch: 29106, Training Loss: 0.00811
Epoch: 29106, Training Loss: 0.00845
Epoch: 29106, Training Loss: 0.00846
Epoch: 29106, Training Loss: 0.01040
Epoch: 29107, Training Loss: 0.00811
Epoch: 29107, Training Loss: 0.00845
Epoch: 29107, Training Loss: 0.00846
Epoch: 29107, Training Loss: 0.01040
Epoch: 29108, Training Loss: 0.00811
Epoch: 29108, Training Loss: 0.00845
Epoch: 29108, Training Loss: 0.00846
Epoch: 29108, Training Loss: 0.01040
Epoch: 29109, Training Loss: 0.00811
Epoch: 29109, Training Loss: 0.00845
Epoch: 29109, Training Loss: 0.00846
Epoch: 29109, Training Loss: 0.01040
Epoch: 29110, Training Loss: 0.00811
Epoch: 29110, Training Loss: 0.00845
Epoch: 29110, Training Loss: 0.00846
Epoch: 29110, Training Loss: 0.01040
Epoch: 29111, Training Loss: 0.00811
Epoch: 29111, Training Loss: 0.00845
Epoch: 29111, Training Loss: 0.00846
Epoch: 29111, Training Loss: 0.01040
Epoch: 29112, Training Loss: 0.00811
Epoch: 29112, Training Loss: 0.00845
Epoch: 29112, Training Loss: 0.00846
Epoch: 29112, Training Loss: 0.01040
Epoch: 29113, Training Loss: 0.00811
Epoch: 29113, Training Loss: 0.00845
Epoch: 29113, Training Loss: 0.00846
Epoch: 29113, Training Loss: 0.01040
Epoch: 29114, Training Loss: 0.00811
Epoch: 29114, Training Loss: 0.00845
Epoch: 29114, Training Loss: 0.00846
Epoch: 29114, Training Loss: 0.01040
Epoch: 29115, Training Loss: 0.00811
Epoch: 29115, Training Loss: 0.00845
Epoch: 29115, Training Loss: 0.00846
Epoch: 29115, Training Loss: 0.01040
Epoch: 29116, Training Loss: 0.00811
Epoch: 29116, Training Loss: 0.00845
Epoch: 29116, Training Loss: 0.00846
Epoch: 29116, Training Loss: 0.01040
Epoch: 29117, Training Loss: 0.00811
Epoch: 29117, Training Loss: 0.00845
Epoch: 29117, Training Loss: 0.00846
Epoch: 29117, Training Loss: 0.01040
Epoch: 29118, Training Loss: 0.00811
Epoch: 29118, Training Loss: 0.00845
Epoch: 29118, Training Loss: 0.00846
Epoch: 29118, Training Loss: 0.01040
Epoch: 29119, Training Loss: 0.00811
Epoch: 29119, Training Loss: 0.00845
Epoch: 29119, Training Loss: 0.00846
Epoch: 29119, Training Loss: 0.01039
Epoch: 29120, Training Loss: 0.00811
Epoch: 29120, Training Loss: 0.00845
Epoch: 29120, Training Loss: 0.00846
Epoch: 29120, Training Loss: 0.01039
Epoch: 29121, Training Loss: 0.00811
Epoch: 29121, Training Loss: 0.00845
Epoch: 29121, Training Loss: 0.00846
Epoch: 29121, Training Loss: 0.01039
Epoch: 29122, Training Loss: 0.00811
Epoch: 29122, Training Loss: 0.00845
Epoch: 29122, Training Loss: 0.00846
Epoch: 29122, Training Loss: 0.01039
Epoch: 29123, Training Loss: 0.00810
Epoch: 29123, Training Loss: 0.00845
Epoch: 29123, Training Loss: 0.00846
Epoch: 29123, Training Loss: 0.01039
Epoch: 29124, Training Loss: 0.00810
Epoch: 29124, Training Loss: 0.00845
Epoch: 29124, Training Loss: 0.00846
Epoch: 29124, Training Loss: 0.01039
Epoch: 29125, Training Loss: 0.00810
Epoch: 29125, Training Loss: 0.00845
Epoch: 29125, Training Loss: 0.00846
Epoch: 29125, Training Loss: 0.01039
Epoch: 29126, Training Loss: 0.00810
Epoch: 29126, Training Loss: 0.00845
Epoch: 29126, Training Loss: 0.00846
Epoch: 29126, Training Loss: 0.01039
Epoch: 29127, Training Loss: 0.00810
Epoch: 29127, Training Loss: 0.00845
Epoch: 29127, Training Loss: 0.00846
Epoch: 29127, Training Loss: 0.01039
Epoch: 29128, Training Loss: 0.00810
Epoch: 29128, Training Loss: 0.00845
Epoch: 29128, Training Loss: 0.00846
Epoch: 29128, Training Loss: 0.01039
Epoch: 29129, Training Loss: 0.00810
Epoch: 29129, Training Loss: 0.00845
Epoch: 29129, Training Loss: 0.00846
Epoch: 29129, Training Loss: 0.01039
Epoch: 29130, Training Loss: 0.00810
Epoch: 29130, Training Loss: 0.00845
Epoch: 29130, Training Loss: 0.00846
Epoch: 29130, Training Loss: 0.01039
Epoch: 29131, Training Loss: 0.00810
Epoch: 29131, Training Loss: 0.00845
Epoch: 29131, Training Loss: 0.00846
Epoch: 29131, Training Loss: 0.01039
Epoch: 29132, Training Loss: 0.00810
Epoch: 29132, Training Loss: 0.00845
Epoch: 29132, Training Loss: 0.00846
Epoch: 29132, Training Loss: 0.01039
Epoch: 29133, Training Loss: 0.00810
Epoch: 29133, Training Loss: 0.00845
Epoch: 29133, Training Loss: 0.00846
Epoch: 29133, Training Loss: 0.01039
Epoch: 29134, Training Loss: 0.00810
Epoch: 29134, Training Loss: 0.00845
Epoch: 29134, Training Loss: 0.00846
Epoch: 29134, Training Loss: 0.01039
Epoch: 29135, Training Loss: 0.00810
Epoch: 29135, Training Loss: 0.00845
Epoch: 29135, Training Loss: 0.00846
Epoch: 29135, Training Loss: 0.01039
Epoch: 29136, Training Loss: 0.00810
Epoch: 29136, Training Loss: 0.00845
Epoch: 29136, Training Loss: 0.00846
Epoch: 29136, Training Loss: 0.01039
Epoch: 29137, Training Loss: 0.00810
Epoch: 29137, Training Loss: 0.00845
Epoch: 29137, Training Loss: 0.00846
Epoch: 29137, Training Loss: 0.01039
Epoch: 29138, Training Loss: 0.00810
Epoch: 29138, Training Loss: 0.00845
Epoch: 29138, Training Loss: 0.00846
Epoch: 29138, Training Loss: 0.01039
Epoch: 29139, Training Loss: 0.00810
Epoch: 29139, Training Loss: 0.00845
Epoch: 29139, Training Loss: 0.00846
Epoch: 29139, Training Loss: 0.01039
Epoch: 29140, Training Loss: 0.00810
Epoch: 29140, Training Loss: 0.00845
Epoch: 29140, Training Loss: 0.00846
Epoch: 29140, Training Loss: 0.01039
Epoch: 29141, Training Loss: 0.00810
Epoch: 29141, Training Loss: 0.00845
Epoch: 29141, Training Loss: 0.00846
Epoch: 29141, Training Loss: 0.01039
Epoch: 29142, Training Loss: 0.00810
Epoch: 29142, Training Loss: 0.00845
Epoch: 29142, Training Loss: 0.00846
Epoch: 29142, Training Loss: 0.01039
Epoch: 29143, Training Loss: 0.00810
Epoch: 29143, Training Loss: 0.00845
Epoch: 29143, Training Loss: 0.00846
Epoch: 29143, Training Loss: 0.01039
Epoch: 29144, Training Loss: 0.00810
Epoch: 29144, Training Loss: 0.00845
Epoch: 29144, Training Loss: 0.00846
Epoch: 29144, Training Loss: 0.01039
Epoch: 29145, Training Loss: 0.00810
Epoch: 29145, Training Loss: 0.00845
Epoch: 29145, Training Loss: 0.00846
Epoch: 29145, Training Loss: 0.01039
Epoch: 29146, Training Loss: 0.00810
Epoch: 29146, Training Loss: 0.00845
Epoch: 29146, Training Loss: 0.00846
Epoch: 29146, Training Loss: 0.01039
Epoch: 29147, Training Loss: 0.00810
Epoch: 29147, Training Loss: 0.00845
Epoch: 29147, Training Loss: 0.00846
Epoch: 29147, Training Loss: 0.01039
Epoch: 29148, Training Loss: 0.00810
Epoch: 29148, Training Loss: 0.00845
Epoch: 29148, Training Loss: 0.00846
Epoch: 29148, Training Loss: 0.01039
Epoch: 29149, Training Loss: 0.00810
Epoch: 29149, Training Loss: 0.00845
Epoch: 29149, Training Loss: 0.00846
Epoch: 29149, Training Loss: 0.01039
Epoch: 29150, Training Loss: 0.00810
Epoch: 29150, Training Loss: 0.00845
Epoch: 29150, Training Loss: 0.00846
Epoch: 29150, Training Loss: 0.01039
Epoch: 29151, Training Loss: 0.00810
Epoch: 29151, Training Loss: 0.00845
Epoch: 29151, Training Loss: 0.00846
Epoch: 29151, Training Loss: 0.01039
Epoch: 29152, Training Loss: 0.00810
Epoch: 29152, Training Loss: 0.00844
Epoch: 29152, Training Loss: 0.00846
Epoch: 29152, Training Loss: 0.01039
Epoch: 29153, Training Loss: 0.00810
Epoch: 29153, Training Loss: 0.00844
Epoch: 29153, Training Loss: 0.00845
Epoch: 29153, Training Loss: 0.01039
Epoch: 29154, Training Loss: 0.00810
Epoch: 29154, Training Loss: 0.00844
Epoch: 29154, Training Loss: 0.00845
Epoch: 29154, Training Loss: 0.01039
Epoch: 29155, Training Loss: 0.00810
Epoch: 29155, Training Loss: 0.00844
Epoch: 29155, Training Loss: 0.00845
Epoch: 29155, Training Loss: 0.01039
Epoch: 29156, Training Loss: 0.00810
Epoch: 29156, Training Loss: 0.00844
Epoch: 29156, Training Loss: 0.00845
Epoch: 29156, Training Loss: 0.01039
Epoch: 29157, Training Loss: 0.00810
Epoch: 29157, Training Loss: 0.00844
Epoch: 29157, Training Loss: 0.00845
Epoch: 29157, Training Loss: 0.01039
Epoch: 29158, Training Loss: 0.00810
Epoch: 29158, Training Loss: 0.00844
Epoch: 29158, Training Loss: 0.00845
Epoch: 29158, Training Loss: 0.01039
Epoch: 29159, Training Loss: 0.00810
Epoch: 29159, Training Loss: 0.00844
Epoch: 29159, Training Loss: 0.00845
Epoch: 29159, Training Loss: 0.01039
Epoch: 29160, Training Loss: 0.00810
Epoch: 29160, Training Loss: 0.00844
Epoch: 29160, Training Loss: 0.00845
Epoch: 29160, Training Loss: 0.01039
Epoch: 29161, Training Loss: 0.00810
Epoch: 29161, Training Loss: 0.00844
Epoch: 29161, Training Loss: 0.00845
Epoch: 29161, Training Loss: 0.01039
Epoch: 29162, Training Loss: 0.00810
Epoch: 29162, Training Loss: 0.00844
Epoch: 29162, Training Loss: 0.00845
Epoch: 29162, Training Loss: 0.01039
Epoch: 29163, Training Loss: 0.00810
Epoch: 29163, Training Loss: 0.00844
Epoch: 29163, Training Loss: 0.00845
Epoch: 29163, Training Loss: 0.01039
Epoch: 29164, Training Loss: 0.00810
Epoch: 29164, Training Loss: 0.00844
Epoch: 29164, Training Loss: 0.00845
Epoch: 29164, Training Loss: 0.01039
Epoch: 29165, Training Loss: 0.00810
Epoch: 29165, Training Loss: 0.00844
Epoch: 29165, Training Loss: 0.00845
Epoch: 29165, Training Loss: 0.01039
Epoch: 29166, Training Loss: 0.00810
Epoch: 29166, Training Loss: 0.00844
Epoch: 29166, Training Loss: 0.00845
Epoch: 29166, Training Loss: 0.01039
Epoch: 29167, Training Loss: 0.00810
Epoch: 29167, Training Loss: 0.00844
Epoch: 29167, Training Loss: 0.00845
Epoch: 29167, Training Loss: 0.01039
Epoch: 29168, Training Loss: 0.00810
Epoch: 29168, Training Loss: 0.00844
Epoch: 29168, Training Loss: 0.00845
Epoch: 29168, Training Loss: 0.01038
Epoch: 29169, Training Loss: 0.00810
Epoch: 29169, Training Loss: 0.00844
Epoch: 29169, Training Loss: 0.00845
Epoch: 29169, Training Loss: 0.01038
Epoch: 29170, Training Loss: 0.00810
Epoch: 29170, Training Loss: 0.00844
Epoch: 29170, Training Loss: 0.00845
Epoch: 29170, Training Loss: 0.01038
Epoch: 29171, Training Loss: 0.00810
Epoch: 29171, Training Loss: 0.00844
Epoch: 29171, Training Loss: 0.00845
Epoch: 29171, Training Loss: 0.01038
Epoch: 29172, Training Loss: 0.00810
Epoch: 29172, Training Loss: 0.00844
Epoch: 29172, Training Loss: 0.00845
Epoch: 29172, Training Loss: 0.01038
Epoch: 29173, Training Loss: 0.00810
Epoch: 29173, Training Loss: 0.00844
Epoch: 29173, Training Loss: 0.00845
Epoch: 29173, Training Loss: 0.01038
Epoch: 29174, Training Loss: 0.00810
Epoch: 29174, Training Loss: 0.00844
Epoch: 29174, Training Loss: 0.00845
Epoch: 29174, Training Loss: 0.01038
Epoch: 29175, Training Loss: 0.00810
Epoch: 29175, Training Loss: 0.00844
Epoch: 29175, Training Loss: 0.00845
Epoch: 29175, Training Loss: 0.01038
Epoch: 29176, Training Loss: 0.00810
Epoch: 29176, Training Loss: 0.00844
Epoch: 29176, Training Loss: 0.00845
Epoch: 29176, Training Loss: 0.01038
Epoch: 29177, Training Loss: 0.00810
Epoch: 29177, Training Loss: 0.00844
Epoch: 29177, Training Loss: 0.00845
Epoch: 29177, Training Loss: 0.01038
Epoch: 29178, Training Loss: 0.00810
Epoch: 29178, Training Loss: 0.00844
Epoch: 29178, Training Loss: 0.00845
Epoch: 29178, Training Loss: 0.01038
Epoch: 29179, Training Loss: 0.00810
Epoch: 29179, Training Loss: 0.00844
Epoch: 29179, Training Loss: 0.00845
Epoch: 29179, Training Loss: 0.01038
Epoch: 29180, Training Loss: 0.00810
Epoch: 29180, Training Loss: 0.00844
Epoch: 29180, Training Loss: 0.00845
Epoch: 29180, Training Loss: 0.01038
Epoch: 29181, Training Loss: 0.00810
Epoch: 29181, Training Loss: 0.00844
Epoch: 29181, Training Loss: 0.00845
Epoch: 29181, Training Loss: 0.01038
Epoch: 29182, Training Loss: 0.00810
Epoch: 29182, Training Loss: 0.00844
Epoch: 29182, Training Loss: 0.00845
Epoch: 29182, Training Loss: 0.01038
Epoch: 29183, Training Loss: 0.00810
Epoch: 29183, Training Loss: 0.00844
Epoch: 29183, Training Loss: 0.00845
Epoch: 29183, Training Loss: 0.01038
Epoch: 29184, Training Loss: 0.00810
Epoch: 29184, Training Loss: 0.00844
Epoch: 29184, Training Loss: 0.00845
Epoch: 29184, Training Loss: 0.01038
Epoch: 29185, Training Loss: 0.00810
Epoch: 29185, Training Loss: 0.00844
Epoch: 29185, Training Loss: 0.00845
Epoch: 29185, Training Loss: 0.01038
Epoch: 29186, Training Loss: 0.00810
Epoch: 29186, Training Loss: 0.00844
Epoch: 29186, Training Loss: 0.00845
Epoch: 29186, Training Loss: 0.01038
Epoch: 29187, Training Loss: 0.00810
Epoch: 29187, Training Loss: 0.00844
Epoch: 29187, Training Loss: 0.00845
Epoch: 29187, Training Loss: 0.01038
Epoch: 29188, Training Loss: 0.00809
Epoch: 29188, Training Loss: 0.00844
Epoch: 29188, Training Loss: 0.00845
Epoch: 29188, Training Loss: 0.01038
Epoch: 29189, Training Loss: 0.00809
Epoch: 29189, Training Loss: 0.00844
Epoch: 29189, Training Loss: 0.00845
Epoch: 29189, Training Loss: 0.01038
Epoch: 29190, Training Loss: 0.00809
Epoch: 29190, Training Loss: 0.00844
Epoch: 29190, Training Loss: 0.00845
Epoch: 29190, Training Loss: 0.01038
Epoch: 29191, Training Loss: 0.00809
Epoch: 29191, Training Loss: 0.00844
Epoch: 29191, Training Loss: 0.00845
Epoch: 29191, Training Loss: 0.01038
Epoch: 29192, Training Loss: 0.00809
Epoch: 29192, Training Loss: 0.00844
Epoch: 29192, Training Loss: 0.00845
Epoch: 29192, Training Loss: 0.01038
Epoch: 29193, Training Loss: 0.00809
Epoch: 29193, Training Loss: 0.00844
Epoch: 29193, Training Loss: 0.00845
Epoch: 29193, Training Loss: 0.01038
Epoch: 29194, Training Loss: 0.00809
Epoch: 29194, Training Loss: 0.00844
Epoch: 29194, Training Loss: 0.00845
Epoch: 29194, Training Loss: 0.01038
Epoch: 29195, Training Loss: 0.00809
Epoch: 29195, Training Loss: 0.00844
Epoch: 29195, Training Loss: 0.00845
Epoch: 29195, Training Loss: 0.01038
Epoch: 29196, Training Loss: 0.00809
Epoch: 29196, Training Loss: 0.00844
Epoch: 29196, Training Loss: 0.00845
Epoch: 29196, Training Loss: 0.01038
Epoch: 29197, Training Loss: 0.00809
Epoch: 29197, Training Loss: 0.00844
Epoch: 29197, Training Loss: 0.00845
Epoch: 29197, Training Loss: 0.01038
Epoch: 29198, Training Loss: 0.00809
Epoch: 29198, Training Loss: 0.00844
Epoch: 29198, Training Loss: 0.00845
Epoch: 29198, Training Loss: 0.01038
Epoch: 29199, Training Loss: 0.00809
Epoch: 29199, Training Loss: 0.00844
Epoch: 29199, Training Loss: 0.00845
Epoch: 29199, Training Loss: 0.01038
Epoch: 29200, Training Loss: 0.00809
Epoch: 29200, Training Loss: 0.00844
Epoch: 29200, Training Loss: 0.00845
Epoch: 29200, Training Loss: 0.01038
Epoch: 29201, Training Loss: 0.00809
Epoch: 29201, Training Loss: 0.00844
Epoch: 29201, Training Loss: 0.00845
Epoch: 29201, Training Loss: 0.01038
Epoch: 29202, Training Loss: 0.00809
Epoch: 29202, Training Loss: 0.00844
Epoch: 29202, Training Loss: 0.00845
Epoch: 29202, Training Loss: 0.01038
Epoch: 29203, Training Loss: 0.00809
Epoch: 29203, Training Loss: 0.00844
Epoch: 29203, Training Loss: 0.00845
Epoch: 29203, Training Loss: 0.01038
Epoch: 29204, Training Loss: 0.00809
Epoch: 29204, Training Loss: 0.00844
Epoch: 29204, Training Loss: 0.00845
Epoch: 29204, Training Loss: 0.01038
Epoch: 29205, Training Loss: 0.00809
Epoch: 29205, Training Loss: 0.00844
Epoch: 29205, Training Loss: 0.00845
Epoch: 29205, Training Loss: 0.01038
Epoch: 29206, Training Loss: 0.00809
Epoch: 29206, Training Loss: 0.00844
Epoch: 29206, Training Loss: 0.00845
Epoch: 29206, Training Loss: 0.01038
Epoch: 29207, Training Loss: 0.00809
Epoch: 29207, Training Loss: 0.00844
Epoch: 29207, Training Loss: 0.00845
Epoch: 29207, Training Loss: 0.01038
Epoch: 29208, Training Loss: 0.00809
Epoch: 29208, Training Loss: 0.00844
Epoch: 29208, Training Loss: 0.00845
Epoch: 29208, Training Loss: 0.01038
Epoch: 29209, Training Loss: 0.00809
Epoch: 29209, Training Loss: 0.00844
Epoch: 29209, Training Loss: 0.00845
Epoch: 29209, Training Loss: 0.01038
Epoch: 29210, Training Loss: 0.00809
Epoch: 29210, Training Loss: 0.00844
Epoch: 29210, Training Loss: 0.00845
Epoch: 29210, Training Loss: 0.01038
Epoch: 29211, Training Loss: 0.00809
Epoch: 29211, Training Loss: 0.00844
Epoch: 29211, Training Loss: 0.00845
Epoch: 29211, Training Loss: 0.01038
Epoch: 29212, Training Loss: 0.00809
Epoch: 29212, Training Loss: 0.00844
Epoch: 29212, Training Loss: 0.00845
Epoch: 29212, Training Loss: 0.01038
Epoch: 29213, Training Loss: 0.00809
Epoch: 29213, Training Loss: 0.00843
Epoch: 29213, Training Loss: 0.00845
Epoch: 29213, Training Loss: 0.01038
Epoch: 29214, Training Loss: 0.00809
Epoch: 29214, Training Loss: 0.00843
Epoch: 29214, Training Loss: 0.00844
Epoch: 29214, Training Loss: 0.01038
Epoch: 29215, Training Loss: 0.00809
Epoch: 29215, Training Loss: 0.00843
Epoch: 29215, Training Loss: 0.00844
Epoch: 29215, Training Loss: 0.01038
Epoch: 29216, Training Loss: 0.00809
Epoch: 29216, Training Loss: 0.00843
Epoch: 29216, Training Loss: 0.00844
Epoch: 29216, Training Loss: 0.01038
Epoch: 29217, Training Loss: 0.00809
Epoch: 29217, Training Loss: 0.00843
Epoch: 29217, Training Loss: 0.00844
Epoch: 29217, Training Loss: 0.01038
Epoch: 29218, Training Loss: 0.00809
Epoch: 29218, Training Loss: 0.00843
Epoch: 29218, Training Loss: 0.00844
Epoch: 29218, Training Loss: 0.01037
Epoch: 29219, Training Loss: 0.00809
Epoch: 29219, Training Loss: 0.00843
Epoch: 29219, Training Loss: 0.00844
Epoch: 29219, Training Loss: 0.01037
Epoch: 29220, Training Loss: 0.00809
Epoch: 29220, Training Loss: 0.00843
Epoch: 29220, Training Loss: 0.00844
Epoch: 29220, Training Loss: 0.01037
Epoch: 29221, Training Loss: 0.00809
Epoch: 29221, Training Loss: 0.00843
Epoch: 29221, Training Loss: 0.00844
Epoch: 29221, Training Loss: 0.01037
Epoch: 29222, Training Loss: 0.00809
Epoch: 29222, Training Loss: 0.00843
Epoch: 29222, Training Loss: 0.00844
Epoch: 29222, Training Loss: 0.01037
Epoch: 29223, Training Loss: 0.00809
Epoch: 29223, Training Loss: 0.00843
Epoch: 29223, Training Loss: 0.00844
Epoch: 29223, Training Loss: 0.01037
Epoch: 29224, Training Loss: 0.00809
Epoch: 29224, Training Loss: 0.00843
Epoch: 29224, Training Loss: 0.00844
Epoch: 29224, Training Loss: 0.01037
Epoch: 29225, Training Loss: 0.00809
Epoch: 29225, Training Loss: 0.00843
Epoch: 29225, Training Loss: 0.00844
Epoch: 29225, Training Loss: 0.01037
Epoch: 29226, Training Loss: 0.00809
Epoch: 29226, Training Loss: 0.00843
Epoch: 29226, Training Loss: 0.00844
Epoch: 29226, Training Loss: 0.01037
Epoch: 29227, Training Loss: 0.00809
Epoch: 29227, Training Loss: 0.00843
Epoch: 29227, Training Loss: 0.00844
Epoch: 29227, Training Loss: 0.01037
Epoch: 29228, Training Loss: 0.00809
Epoch: 29228, Training Loss: 0.00843
Epoch: 29228, Training Loss: 0.00844
Epoch: 29228, Training Loss: 0.01037
Epoch: 29229, Training Loss: 0.00809
Epoch: 29229, Training Loss: 0.00843
Epoch: 29229, Training Loss: 0.00844
Epoch: 29229, Training Loss: 0.01037
Epoch: 29230, Training Loss: 0.00809
Epoch: 29230, Training Loss: 0.00843
Epoch: 29230, Training Loss: 0.00844
Epoch: 29230, Training Loss: 0.01037
Epoch: 29231, Training Loss: 0.00809
Epoch: 29231, Training Loss: 0.00843
Epoch: 29231, Training Loss: 0.00844
Epoch: 29231, Training Loss: 0.01037
Epoch: 29232, Training Loss: 0.00809
Epoch: 29232, Training Loss: 0.00843
Epoch: 29232, Training Loss: 0.00844
Epoch: 29232, Training Loss: 0.01037
Epoch: 29233, Training Loss: 0.00809
Epoch: 29233, Training Loss: 0.00843
Epoch: 29233, Training Loss: 0.00844
Epoch: 29233, Training Loss: 0.01037
Epoch: 29234, Training Loss: 0.00809
Epoch: 29234, Training Loss: 0.00843
Epoch: 29234, Training Loss: 0.00844
Epoch: 29234, Training Loss: 0.01037
Epoch: 29235, Training Loss: 0.00809
Epoch: 29235, Training Loss: 0.00843
Epoch: 29235, Training Loss: 0.00844
Epoch: 29235, Training Loss: 0.01037
Epoch: 29236, Training Loss: 0.00809
Epoch: 29236, Training Loss: 0.00843
Epoch: 29236, Training Loss: 0.00844
Epoch: 29236, Training Loss: 0.01037
Epoch: 29237, Training Loss: 0.00809
Epoch: 29237, Training Loss: 0.00843
Epoch: 29237, Training Loss: 0.00844
Epoch: 29237, Training Loss: 0.01037
Epoch: 29238, Training Loss: 0.00809
Epoch: 29238, Training Loss: 0.00843
Epoch: 29238, Training Loss: 0.00844
Epoch: 29238, Training Loss: 0.01037
Epoch: 29239, Training Loss: 0.00809
Epoch: 29239, Training Loss: 0.00843
Epoch: 29239, Training Loss: 0.00844
Epoch: 29239, Training Loss: 0.01037
Epoch: 29240, Training Loss: 0.00809
Epoch: 29240, Training Loss: 0.00843
Epoch: 29240, Training Loss: 0.00844
Epoch: 29240, Training Loss: 0.01037
Epoch: 29241, Training Loss: 0.00809
Epoch: 29241, Training Loss: 0.00843
Epoch: 29241, Training Loss: 0.00844
Epoch: 29241, Training Loss: 0.01037
Epoch: 29242, Training Loss: 0.00809
...
Epoch: 29485, Training Loss: 0.00839
[Output condensed: the log prints four loss values per epoch, one per XOR pattern. Over epochs 29241-29485 the four per-pattern losses decrease very slowly, from about 0.00809 / 0.00843 / 0.00844 / 0.01037 to about 0.00805 / 0.00839 / 0.00840 / 0.01032, illustrating the slow late-stage convergence of the basic quadratic-cost sigmoid network.]
Epoch: 29485, Training Loss: 0.00840
Epoch: 29485, Training Loss: 0.01032
Epoch: 29486, Training Loss: 0.00805
Epoch: 29486, Training Loss: 0.00839
Epoch: 29486, Training Loss: 0.00840
Epoch: 29486, Training Loss: 0.01032
Epoch: 29487, Training Loss: 0.00805
Epoch: 29487, Training Loss: 0.00839
Epoch: 29487, Training Loss: 0.00840
Epoch: 29487, Training Loss: 0.01032
Epoch: 29488, Training Loss: 0.00805
Epoch: 29488, Training Loss: 0.00839
Epoch: 29488, Training Loss: 0.00840
Epoch: 29488, Training Loss: 0.01032
Epoch: 29489, Training Loss: 0.00805
Epoch: 29489, Training Loss: 0.00839
Epoch: 29489, Training Loss: 0.00840
Epoch: 29489, Training Loss: 0.01032
Epoch: 29490, Training Loss: 0.00805
Epoch: 29490, Training Loss: 0.00839
Epoch: 29490, Training Loss: 0.00840
Epoch: 29490, Training Loss: 0.01032
Epoch: 29491, Training Loss: 0.00805
Epoch: 29491, Training Loss: 0.00839
Epoch: 29491, Training Loss: 0.00840
Epoch: 29491, Training Loss: 0.01032
Epoch: 29492, Training Loss: 0.00805
Epoch: 29492, Training Loss: 0.00839
Epoch: 29492, Training Loss: 0.00840
Epoch: 29492, Training Loss: 0.01032
Epoch: 29493, Training Loss: 0.00805
Epoch: 29493, Training Loss: 0.00839
Epoch: 29493, Training Loss: 0.00840
Epoch: 29493, Training Loss: 0.01032
Epoch: 29494, Training Loss: 0.00805
Epoch: 29494, Training Loss: 0.00839
Epoch: 29494, Training Loss: 0.00840
Epoch: 29494, Training Loss: 0.01032
Epoch: 29495, Training Loss: 0.00805
Epoch: 29495, Training Loss: 0.00839
Epoch: 29495, Training Loss: 0.00840
Epoch: 29495, Training Loss: 0.01032
Epoch: 29496, Training Loss: 0.00805
Epoch: 29496, Training Loss: 0.00839
Epoch: 29496, Training Loss: 0.00840
Epoch: 29496, Training Loss: 0.01032
Epoch: 29497, Training Loss: 0.00805
Epoch: 29497, Training Loss: 0.00839
Epoch: 29497, Training Loss: 0.00840
Epoch: 29497, Training Loss: 0.01032
Epoch: 29498, Training Loss: 0.00805
Epoch: 29498, Training Loss: 0.00839
Epoch: 29498, Training Loss: 0.00840
Epoch: 29498, Training Loss: 0.01032
Epoch: 29499, Training Loss: 0.00805
Epoch: 29499, Training Loss: 0.00839
Epoch: 29499, Training Loss: 0.00840
Epoch: 29499, Training Loss: 0.01032
Epoch: 29500, Training Loss: 0.00805
Epoch: 29500, Training Loss: 0.00839
Epoch: 29500, Training Loss: 0.00840
Epoch: 29500, Training Loss: 0.01032
Epoch: 29501, Training Loss: 0.00805
Epoch: 29501, Training Loss: 0.00839
Epoch: 29501, Training Loss: 0.00840
Epoch: 29501, Training Loss: 0.01032
Epoch: 29502, Training Loss: 0.00805
Epoch: 29502, Training Loss: 0.00839
Epoch: 29502, Training Loss: 0.00840
Epoch: 29502, Training Loss: 0.01032
Epoch: 29503, Training Loss: 0.00805
Epoch: 29503, Training Loss: 0.00839
Epoch: 29503, Training Loss: 0.00840
Epoch: 29503, Training Loss: 0.01032
Epoch: 29504, Training Loss: 0.00805
Epoch: 29504, Training Loss: 0.00839
Epoch: 29504, Training Loss: 0.00840
Epoch: 29504, Training Loss: 0.01032
Epoch: 29505, Training Loss: 0.00805
Epoch: 29505, Training Loss: 0.00839
Epoch: 29505, Training Loss: 0.00840
Epoch: 29505, Training Loss: 0.01032
Epoch: 29506, Training Loss: 0.00805
Epoch: 29506, Training Loss: 0.00839
Epoch: 29506, Training Loss: 0.00840
Epoch: 29506, Training Loss: 0.01032
Epoch: 29507, Training Loss: 0.00805
Epoch: 29507, Training Loss: 0.00839
Epoch: 29507, Training Loss: 0.00840
Epoch: 29507, Training Loss: 0.01032
Epoch: 29508, Training Loss: 0.00805
Epoch: 29508, Training Loss: 0.00839
Epoch: 29508, Training Loss: 0.00840
Epoch: 29508, Training Loss: 0.01032
Epoch: 29509, Training Loss: 0.00805
Epoch: 29509, Training Loss: 0.00839
Epoch: 29509, Training Loss: 0.00840
Epoch: 29509, Training Loss: 0.01032
Epoch: 29510, Training Loss: 0.00805
Epoch: 29510, Training Loss: 0.00839
Epoch: 29510, Training Loss: 0.00840
Epoch: 29510, Training Loss: 0.01032
Epoch: 29511, Training Loss: 0.00805
Epoch: 29511, Training Loss: 0.00839
Epoch: 29511, Training Loss: 0.00840
Epoch: 29511, Training Loss: 0.01032
Epoch: 29512, Training Loss: 0.00805
Epoch: 29512, Training Loss: 0.00839
Epoch: 29512, Training Loss: 0.00840
Epoch: 29512, Training Loss: 0.01032
Epoch: 29513, Training Loss: 0.00805
Epoch: 29513, Training Loss: 0.00839
Epoch: 29513, Training Loss: 0.00840
Epoch: 29513, Training Loss: 0.01032
Epoch: 29514, Training Loss: 0.00805
Epoch: 29514, Training Loss: 0.00839
Epoch: 29514, Training Loss: 0.00840
Epoch: 29514, Training Loss: 0.01032
Epoch: 29515, Training Loss: 0.00805
Epoch: 29515, Training Loss: 0.00839
Epoch: 29515, Training Loss: 0.00840
Epoch: 29515, Training Loss: 0.01032
Epoch: 29516, Training Loss: 0.00805
Epoch: 29516, Training Loss: 0.00839
Epoch: 29516, Training Loss: 0.00840
Epoch: 29516, Training Loss: 0.01031
Epoch: 29517, Training Loss: 0.00805
Epoch: 29517, Training Loss: 0.00839
Epoch: 29517, Training Loss: 0.00840
Epoch: 29517, Training Loss: 0.01031
Epoch: 29518, Training Loss: 0.00804
Epoch: 29518, Training Loss: 0.00839
Epoch: 29518, Training Loss: 0.00840
Epoch: 29518, Training Loss: 0.01031
Epoch: 29519, Training Loss: 0.00804
Epoch: 29519, Training Loss: 0.00839
Epoch: 29519, Training Loss: 0.00840
Epoch: 29519, Training Loss: 0.01031
Epoch: 29520, Training Loss: 0.00804
Epoch: 29520, Training Loss: 0.00839
Epoch: 29520, Training Loss: 0.00840
Epoch: 29520, Training Loss: 0.01031
Epoch: 29521, Training Loss: 0.00804
Epoch: 29521, Training Loss: 0.00839
Epoch: 29521, Training Loss: 0.00840
Epoch: 29521, Training Loss: 0.01031
Epoch: 29522, Training Loss: 0.00804
Epoch: 29522, Training Loss: 0.00839
Epoch: 29522, Training Loss: 0.00840
Epoch: 29522, Training Loss: 0.01031
Epoch: 29523, Training Loss: 0.00804
Epoch: 29523, Training Loss: 0.00838
Epoch: 29523, Training Loss: 0.00839
Epoch: 29523, Training Loss: 0.01031
Epoch: 29524, Training Loss: 0.00804
Epoch: 29524, Training Loss: 0.00838
Epoch: 29524, Training Loss: 0.00839
Epoch: 29524, Training Loss: 0.01031
Epoch: 29525, Training Loss: 0.00804
Epoch: 29525, Training Loss: 0.00838
Epoch: 29525, Training Loss: 0.00839
Epoch: 29525, Training Loss: 0.01031
Epoch: 29526, Training Loss: 0.00804
Epoch: 29526, Training Loss: 0.00838
Epoch: 29526, Training Loss: 0.00839
Epoch: 29526, Training Loss: 0.01031
Epoch: 29527, Training Loss: 0.00804
Epoch: 29527, Training Loss: 0.00838
Epoch: 29527, Training Loss: 0.00839
Epoch: 29527, Training Loss: 0.01031
Epoch: 29528, Training Loss: 0.00804
Epoch: 29528, Training Loss: 0.00838
Epoch: 29528, Training Loss: 0.00839
Epoch: 29528, Training Loss: 0.01031
Epoch: 29529, Training Loss: 0.00804
Epoch: 29529, Training Loss: 0.00838
Epoch: 29529, Training Loss: 0.00839
Epoch: 29529, Training Loss: 0.01031
Epoch: 29530, Training Loss: 0.00804
Epoch: 29530, Training Loss: 0.00838
Epoch: 29530, Training Loss: 0.00839
Epoch: 29530, Training Loss: 0.01031
Epoch: 29531, Training Loss: 0.00804
Epoch: 29531, Training Loss: 0.00838
Epoch: 29531, Training Loss: 0.00839
Epoch: 29531, Training Loss: 0.01031
Epoch: 29532, Training Loss: 0.00804
Epoch: 29532, Training Loss: 0.00838
Epoch: 29532, Training Loss: 0.00839
Epoch: 29532, Training Loss: 0.01031
Epoch: 29533, Training Loss: 0.00804
Epoch: 29533, Training Loss: 0.00838
Epoch: 29533, Training Loss: 0.00839
Epoch: 29533, Training Loss: 0.01031
Epoch: 29534, Training Loss: 0.00804
Epoch: 29534, Training Loss: 0.00838
Epoch: 29534, Training Loss: 0.00839
Epoch: 29534, Training Loss: 0.01031
Epoch: 29535, Training Loss: 0.00804
Epoch: 29535, Training Loss: 0.00838
Epoch: 29535, Training Loss: 0.00839
Epoch: 29535, Training Loss: 0.01031
Epoch: 29536, Training Loss: 0.00804
Epoch: 29536, Training Loss: 0.00838
Epoch: 29536, Training Loss: 0.00839
Epoch: 29536, Training Loss: 0.01031
Epoch: 29537, Training Loss: 0.00804
Epoch: 29537, Training Loss: 0.00838
Epoch: 29537, Training Loss: 0.00839
Epoch: 29537, Training Loss: 0.01031
Epoch: 29538, Training Loss: 0.00804
Epoch: 29538, Training Loss: 0.00838
Epoch: 29538, Training Loss: 0.00839
Epoch: 29538, Training Loss: 0.01031
Epoch: 29539, Training Loss: 0.00804
Epoch: 29539, Training Loss: 0.00838
Epoch: 29539, Training Loss: 0.00839
Epoch: 29539, Training Loss: 0.01031
Epoch: 29540, Training Loss: 0.00804
Epoch: 29540, Training Loss: 0.00838
Epoch: 29540, Training Loss: 0.00839
Epoch: 29540, Training Loss: 0.01031
Epoch: 29541, Training Loss: 0.00804
Epoch: 29541, Training Loss: 0.00838
Epoch: 29541, Training Loss: 0.00839
Epoch: 29541, Training Loss: 0.01031
Epoch: 29542, Training Loss: 0.00804
Epoch: 29542, Training Loss: 0.00838
Epoch: 29542, Training Loss: 0.00839
Epoch: 29542, Training Loss: 0.01031
Epoch: 29543, Training Loss: 0.00804
Epoch: 29543, Training Loss: 0.00838
Epoch: 29543, Training Loss: 0.00839
Epoch: 29543, Training Loss: 0.01031
Epoch: 29544, Training Loss: 0.00804
Epoch: 29544, Training Loss: 0.00838
Epoch: 29544, Training Loss: 0.00839
Epoch: 29544, Training Loss: 0.01031
Epoch: 29545, Training Loss: 0.00804
Epoch: 29545, Training Loss: 0.00838
Epoch: 29545, Training Loss: 0.00839
Epoch: 29545, Training Loss: 0.01031
Epoch: 29546, Training Loss: 0.00804
Epoch: 29546, Training Loss: 0.00838
Epoch: 29546, Training Loss: 0.00839
Epoch: 29546, Training Loss: 0.01031
Epoch: 29547, Training Loss: 0.00804
Epoch: 29547, Training Loss: 0.00838
Epoch: 29547, Training Loss: 0.00839
Epoch: 29547, Training Loss: 0.01031
Epoch: 29548, Training Loss: 0.00804
Epoch: 29548, Training Loss: 0.00838
Epoch: 29548, Training Loss: 0.00839
Epoch: 29548, Training Loss: 0.01031
Epoch: 29549, Training Loss: 0.00804
Epoch: 29549, Training Loss: 0.00838
Epoch: 29549, Training Loss: 0.00839
Epoch: 29549, Training Loss: 0.01031
Epoch: 29550, Training Loss: 0.00804
Epoch: 29550, Training Loss: 0.00838
Epoch: 29550, Training Loss: 0.00839
Epoch: 29550, Training Loss: 0.01031
Epoch: 29551, Training Loss: 0.00804
Epoch: 29551, Training Loss: 0.00838
Epoch: 29551, Training Loss: 0.00839
Epoch: 29551, Training Loss: 0.01031
Epoch: 29552, Training Loss: 0.00804
Epoch: 29552, Training Loss: 0.00838
Epoch: 29552, Training Loss: 0.00839
Epoch: 29552, Training Loss: 0.01031
Epoch: 29553, Training Loss: 0.00804
Epoch: 29553, Training Loss: 0.00838
Epoch: 29553, Training Loss: 0.00839
Epoch: 29553, Training Loss: 0.01031
Epoch: 29554, Training Loss: 0.00804
Epoch: 29554, Training Loss: 0.00838
Epoch: 29554, Training Loss: 0.00839
Epoch: 29554, Training Loss: 0.01031
Epoch: 29555, Training Loss: 0.00804
Epoch: 29555, Training Loss: 0.00838
Epoch: 29555, Training Loss: 0.00839
Epoch: 29555, Training Loss: 0.01031
Epoch: 29556, Training Loss: 0.00804
Epoch: 29556, Training Loss: 0.00838
Epoch: 29556, Training Loss: 0.00839
Epoch: 29556, Training Loss: 0.01031
Epoch: 29557, Training Loss: 0.00804
Epoch: 29557, Training Loss: 0.00838
Epoch: 29557, Training Loss: 0.00839
Epoch: 29557, Training Loss: 0.01031
Epoch: 29558, Training Loss: 0.00804
Epoch: 29558, Training Loss: 0.00838
Epoch: 29558, Training Loss: 0.00839
Epoch: 29558, Training Loss: 0.01031
Epoch: 29559, Training Loss: 0.00804
Epoch: 29559, Training Loss: 0.00838
Epoch: 29559, Training Loss: 0.00839
Epoch: 29559, Training Loss: 0.01031
Epoch: 29560, Training Loss: 0.00804
Epoch: 29560, Training Loss: 0.00838
Epoch: 29560, Training Loss: 0.00839
Epoch: 29560, Training Loss: 0.01031
Epoch: 29561, Training Loss: 0.00804
Epoch: 29561, Training Loss: 0.00838
Epoch: 29561, Training Loss: 0.00839
Epoch: 29561, Training Loss: 0.01031
Epoch: 29562, Training Loss: 0.00804
Epoch: 29562, Training Loss: 0.00838
Epoch: 29562, Training Loss: 0.00839
Epoch: 29562, Training Loss: 0.01031
Epoch: 29563, Training Loss: 0.00804
Epoch: 29563, Training Loss: 0.00838
Epoch: 29563, Training Loss: 0.00839
Epoch: 29563, Training Loss: 0.01031
Epoch: 29564, Training Loss: 0.00804
Epoch: 29564, Training Loss: 0.00838
Epoch: 29564, Training Loss: 0.00839
Epoch: 29564, Training Loss: 0.01031
Epoch: 29565, Training Loss: 0.00804
Epoch: 29565, Training Loss: 0.00838
Epoch: 29565, Training Loss: 0.00839
Epoch: 29565, Training Loss: 0.01031
Epoch: 29566, Training Loss: 0.00804
Epoch: 29566, Training Loss: 0.00838
Epoch: 29566, Training Loss: 0.00839
Epoch: 29566, Training Loss: 0.01030
Epoch: 29567, Training Loss: 0.00804
Epoch: 29567, Training Loss: 0.00838
Epoch: 29567, Training Loss: 0.00839
Epoch: 29567, Training Loss: 0.01030
Epoch: 29568, Training Loss: 0.00804
Epoch: 29568, Training Loss: 0.00838
Epoch: 29568, Training Loss: 0.00839
Epoch: 29568, Training Loss: 0.01030
Epoch: 29569, Training Loss: 0.00804
Epoch: 29569, Training Loss: 0.00838
Epoch: 29569, Training Loss: 0.00839
Epoch: 29569, Training Loss: 0.01030
Epoch: 29570, Training Loss: 0.00804
Epoch: 29570, Training Loss: 0.00838
Epoch: 29570, Training Loss: 0.00839
Epoch: 29570, Training Loss: 0.01030
Epoch: 29571, Training Loss: 0.00804
Epoch: 29571, Training Loss: 0.00838
Epoch: 29571, Training Loss: 0.00839
Epoch: 29571, Training Loss: 0.01030
Epoch: 29572, Training Loss: 0.00804
Epoch: 29572, Training Loss: 0.00838
Epoch: 29572, Training Loss: 0.00839
Epoch: 29572, Training Loss: 0.01030
Epoch: 29573, Training Loss: 0.00804
Epoch: 29573, Training Loss: 0.00838
Epoch: 29573, Training Loss: 0.00839
Epoch: 29573, Training Loss: 0.01030
Epoch: 29574, Training Loss: 0.00804
Epoch: 29574, Training Loss: 0.00838
Epoch: 29574, Training Loss: 0.00839
Epoch: 29574, Training Loss: 0.01030
Epoch: 29575, Training Loss: 0.00804
Epoch: 29575, Training Loss: 0.00838
Epoch: 29575, Training Loss: 0.00839
Epoch: 29575, Training Loss: 0.01030
Epoch: 29576, Training Loss: 0.00804
Epoch: 29576, Training Loss: 0.00838
Epoch: 29576, Training Loss: 0.00839
Epoch: 29576, Training Loss: 0.01030
Epoch: 29577, Training Loss: 0.00804
Epoch: 29577, Training Loss: 0.00838
Epoch: 29577, Training Loss: 0.00839
Epoch: 29577, Training Loss: 0.01030
Epoch: 29578, Training Loss: 0.00804
Epoch: 29578, Training Loss: 0.00838
Epoch: 29578, Training Loss: 0.00839
Epoch: 29578, Training Loss: 0.01030
Epoch: 29579, Training Loss: 0.00804
Epoch: 29579, Training Loss: 0.00838
Epoch: 29579, Training Loss: 0.00839
Epoch: 29579, Training Loss: 0.01030
Epoch: 29580, Training Loss: 0.00804
Epoch: 29580, Training Loss: 0.00838
Epoch: 29580, Training Loss: 0.00839
Epoch: 29580, Training Loss: 0.01030
Epoch: 29581, Training Loss: 0.00804
Epoch: 29581, Training Loss: 0.00838
Epoch: 29581, Training Loss: 0.00839
Epoch: 29581, Training Loss: 0.01030
Epoch: 29582, Training Loss: 0.00804
Epoch: 29582, Training Loss: 0.00838
Epoch: 29582, Training Loss: 0.00839
Epoch: 29582, Training Loss: 0.01030
Epoch: 29583, Training Loss: 0.00804
Epoch: 29583, Training Loss: 0.00838
Epoch: 29583, Training Loss: 0.00839
Epoch: 29583, Training Loss: 0.01030
Epoch: 29584, Training Loss: 0.00804
Epoch: 29584, Training Loss: 0.00838
Epoch: 29584, Training Loss: 0.00839
Epoch: 29584, Training Loss: 0.01030
Epoch: 29585, Training Loss: 0.00803
Epoch: 29585, Training Loss: 0.00837
Epoch: 29585, Training Loss: 0.00838
Epoch: 29585, Training Loss: 0.01030
Epoch: 29586, Training Loss: 0.00803
Epoch: 29586, Training Loss: 0.00837
Epoch: 29586, Training Loss: 0.00838
Epoch: 29586, Training Loss: 0.01030
Epoch: 29587, Training Loss: 0.00803
Epoch: 29587, Training Loss: 0.00837
Epoch: 29587, Training Loss: 0.00838
Epoch: 29587, Training Loss: 0.01030
Epoch: 29588, Training Loss: 0.00803
Epoch: 29588, Training Loss: 0.00837
Epoch: 29588, Training Loss: 0.00838
Epoch: 29588, Training Loss: 0.01030
Epoch: 29589, Training Loss: 0.00803
Epoch: 29589, Training Loss: 0.00837
Epoch: 29589, Training Loss: 0.00838
Epoch: 29589, Training Loss: 0.01030
Epoch: 29590, Training Loss: 0.00803
Epoch: 29590, Training Loss: 0.00837
Epoch: 29590, Training Loss: 0.00838
Epoch: 29590, Training Loss: 0.01030
Epoch: 29591, Training Loss: 0.00803
Epoch: 29591, Training Loss: 0.00837
Epoch: 29591, Training Loss: 0.00838
Epoch: 29591, Training Loss: 0.01030
Epoch: 29592, Training Loss: 0.00803
Epoch: 29592, Training Loss: 0.00837
Epoch: 29592, Training Loss: 0.00838
Epoch: 29592, Training Loss: 0.01030
Epoch: 29593, Training Loss: 0.00803
Epoch: 29593, Training Loss: 0.00837
Epoch: 29593, Training Loss: 0.00838
Epoch: 29593, Training Loss: 0.01030
Epoch: 29594, Training Loss: 0.00803
Epoch: 29594, Training Loss: 0.00837
Epoch: 29594, Training Loss: 0.00838
Epoch: 29594, Training Loss: 0.01030
Epoch: 29595, Training Loss: 0.00803
Epoch: 29595, Training Loss: 0.00837
Epoch: 29595, Training Loss: 0.00838
Epoch: 29595, Training Loss: 0.01030
Epoch: 29596, Training Loss: 0.00803
Epoch: 29596, Training Loss: 0.00837
Epoch: 29596, Training Loss: 0.00838
Epoch: 29596, Training Loss: 0.01030
Epoch: 29597, Training Loss: 0.00803
Epoch: 29597, Training Loss: 0.00837
Epoch: 29597, Training Loss: 0.00838
Epoch: 29597, Training Loss: 0.01030
Epoch: 29598, Training Loss: 0.00803
Epoch: 29598, Training Loss: 0.00837
Epoch: 29598, Training Loss: 0.00838
Epoch: 29598, Training Loss: 0.01030
Epoch: 29599, Training Loss: 0.00803
Epoch: 29599, Training Loss: 0.00837
Epoch: 29599, Training Loss: 0.00838
Epoch: 29599, Training Loss: 0.01030
Epoch: 29600, Training Loss: 0.00803
Epoch: 29600, Training Loss: 0.00837
Epoch: 29600, Training Loss: 0.00838
Epoch: 29600, Training Loss: 0.01030
Epoch: 29601, Training Loss: 0.00803
Epoch: 29601, Training Loss: 0.00837
Epoch: 29601, Training Loss: 0.00838
Epoch: 29601, Training Loss: 0.01030
Epoch: 29602, Training Loss: 0.00803
Epoch: 29602, Training Loss: 0.00837
Epoch: 29602, Training Loss: 0.00838
Epoch: 29602, Training Loss: 0.01030
Epoch: 29603, Training Loss: 0.00803
Epoch: 29603, Training Loss: 0.00837
Epoch: 29603, Training Loss: 0.00838
Epoch: 29603, Training Loss: 0.01030
Epoch: 29604, Training Loss: 0.00803
Epoch: 29604, Training Loss: 0.00837
Epoch: 29604, Training Loss: 0.00838
Epoch: 29604, Training Loss: 0.01030
Epoch: 29605, Training Loss: 0.00803
Epoch: 29605, Training Loss: 0.00837
Epoch: 29605, Training Loss: 0.00838
Epoch: 29605, Training Loss: 0.01030
Epoch: 29606, Training Loss: 0.00803
Epoch: 29606, Training Loss: 0.00837
Epoch: 29606, Training Loss: 0.00838
Epoch: 29606, Training Loss: 0.01030
Epoch: 29607, Training Loss: 0.00803
Epoch: 29607, Training Loss: 0.00837
Epoch: 29607, Training Loss: 0.00838
Epoch: 29607, Training Loss: 0.01030
Epoch: 29608, Training Loss: 0.00803
Epoch: 29608, Training Loss: 0.00837
Epoch: 29608, Training Loss: 0.00838
Epoch: 29608, Training Loss: 0.01030
Epoch: 29609, Training Loss: 0.00803
Epoch: 29609, Training Loss: 0.00837
Epoch: 29609, Training Loss: 0.00838
Epoch: 29609, Training Loss: 0.01030
Epoch: 29610, Training Loss: 0.00803
Epoch: 29610, Training Loss: 0.00837
Epoch: 29610, Training Loss: 0.00838
Epoch: 29610, Training Loss: 0.01030
Epoch: 29611, Training Loss: 0.00803
Epoch: 29611, Training Loss: 0.00837
Epoch: 29611, Training Loss: 0.00838
Epoch: 29611, Training Loss: 0.01030
Epoch: 29612, Training Loss: 0.00803
Epoch: 29612, Training Loss: 0.00837
Epoch: 29612, Training Loss: 0.00838
Epoch: 29612, Training Loss: 0.01030
Epoch: 29613, Training Loss: 0.00803
Epoch: 29613, Training Loss: 0.00837
Epoch: 29613, Training Loss: 0.00838
Epoch: 29613, Training Loss: 0.01030
Epoch: 29614, Training Loss: 0.00803
Epoch: 29614, Training Loss: 0.00837
Epoch: 29614, Training Loss: 0.00838
Epoch: 29614, Training Loss: 0.01030
Epoch: 29615, Training Loss: 0.00803
Epoch: 29615, Training Loss: 0.00837
Epoch: 29615, Training Loss: 0.00838
Epoch: 29615, Training Loss: 0.01030
Epoch: 29616, Training Loss: 0.00803
Epoch: 29616, Training Loss: 0.00837
Epoch: 29616, Training Loss: 0.00838
Epoch: 29616, Training Loss: 0.01029
Epoch: 29617, Training Loss: 0.00803
Epoch: 29617, Training Loss: 0.00837
Epoch: 29617, Training Loss: 0.00838
Epoch: 29617, Training Loss: 0.01029
Epoch: 29618, Training Loss: 0.00803
Epoch: 29618, Training Loss: 0.00837
Epoch: 29618, Training Loss: 0.00838
Epoch: 29618, Training Loss: 0.01029
Epoch: 29619, Training Loss: 0.00803
Epoch: 29619, Training Loss: 0.00837
Epoch: 29619, Training Loss: 0.00838
Epoch: 29619, Training Loss: 0.01029
Epoch: 29620, Training Loss: 0.00803
Epoch: 29620, Training Loss: 0.00837
Epoch: 29620, Training Loss: 0.00838
Epoch: 29620, Training Loss: 0.01029
Epoch: 29621, Training Loss: 0.00803
Epoch: 29621, Training Loss: 0.00837
Epoch: 29621, Training Loss: 0.00838
Epoch: 29621, Training Loss: 0.01029
Epoch: 29622, Training Loss: 0.00803
Epoch: 29622, Training Loss: 0.00837
Epoch: 29622, Training Loss: 0.00838
Epoch: 29622, Training Loss: 0.01029
Epoch: 29623, Training Loss: 0.00803
Epoch: 29623, Training Loss: 0.00837
Epoch: 29623, Training Loss: 0.00838
Epoch: 29623, Training Loss: 0.01029
Epoch: 29624, Training Loss: 0.00803
Epoch: 29624, Training Loss: 0.00837
Epoch: 29624, Training Loss: 0.00838
Epoch: 29624, Training Loss: 0.01029
Epoch: 29625, Training Loss: 0.00803
Epoch: 29625, Training Loss: 0.00837
Epoch: 29625, Training Loss: 0.00838
Epoch: 29625, Training Loss: 0.01029
Epoch: 29626, Training Loss: 0.00803
Epoch: 29626, Training Loss: 0.00837
Epoch: 29626, Training Loss: 0.00838
Epoch: 29626, Training Loss: 0.01029
Epoch: 29627, Training Loss: 0.00803
Epoch: 29627, Training Loss: 0.00837
Epoch: 29627, Training Loss: 0.00838
Epoch: 29627, Training Loss: 0.01029
Epoch: 29628, Training Loss: 0.00803
Epoch: 29628, Training Loss: 0.00837
Epoch: 29628, Training Loss: 0.00838
Epoch: 29628, Training Loss: 0.01029
Epoch: 29629, Training Loss: 0.00803
Epoch: 29629, Training Loss: 0.00837
Epoch: 29629, Training Loss: 0.00838
Epoch: 29629, Training Loss: 0.01029
Epoch: 29630, Training Loss: 0.00803
Epoch: 29630, Training Loss: 0.00837
Epoch: 29630, Training Loss: 0.00838
Epoch: 29630, Training Loss: 0.01029
Epoch: 29631, Training Loss: 0.00803
Epoch: 29631, Training Loss: 0.00837
Epoch: 29631, Training Loss: 0.00838
Epoch: 29631, Training Loss: 0.01029
Epoch: 29632, Training Loss: 0.00803
Epoch: 29632, Training Loss: 0.00837
Epoch: 29632, Training Loss: 0.00838
Epoch: 29632, Training Loss: 0.01029
Epoch: 29633, Training Loss: 0.00803
Epoch: 29633, Training Loss: 0.00837
Epoch: 29633, Training Loss: 0.00838
Epoch: 29633, Training Loss: 0.01029
Epoch: 29634, Training Loss: 0.00803
Epoch: 29634, Training Loss: 0.00837
Epoch: 29634, Training Loss: 0.00838
Epoch: 29634, Training Loss: 0.01029
Epoch: 29635, Training Loss: 0.00803
Epoch: 29635, Training Loss: 0.00837
Epoch: 29635, Training Loss: 0.00838
Epoch: 29635, Training Loss: 0.01029
Epoch: 29636, Training Loss: 0.00803
Epoch: 29636, Training Loss: 0.00837
Epoch: 29636, Training Loss: 0.00838
Epoch: 29636, Training Loss: 0.01029
Epoch: 29637, Training Loss: 0.00803
Epoch: 29637, Training Loss: 0.00837
Epoch: 29637, Training Loss: 0.00838
Epoch: 29637, Training Loss: 0.01029
Epoch: 29638, Training Loss: 0.00803
Epoch: 29638, Training Loss: 0.00837
Epoch: 29638, Training Loss: 0.00838
Epoch: 29638, Training Loss: 0.01029
Epoch: 29639, Training Loss: 0.00803
Epoch: 29639, Training Loss: 0.00837
Epoch: 29639, Training Loss: 0.00838
Epoch: 29639, Training Loss: 0.01029
Epoch: 29640, Training Loss: 0.00803
Epoch: 29640, Training Loss: 0.00837
Epoch: 29640, Training Loss: 0.00838
Epoch: 29640, Training Loss: 0.01029
Epoch: 29641, Training Loss: 0.00803
Epoch: 29641, Training Loss: 0.00837
Epoch: 29641, Training Loss: 0.00838
Epoch: 29641, Training Loss: 0.01029
Epoch: 29642, Training Loss: 0.00803
Epoch: 29642, Training Loss: 0.00837
Epoch: 29642, Training Loss: 0.00838
Epoch: 29642, Training Loss: 0.01029
Epoch: 29643, Training Loss: 0.00803
Epoch: 29643, Training Loss: 0.00837
Epoch: 29643, Training Loss: 0.00838
Epoch: 29643, Training Loss: 0.01029
Epoch: 29644, Training Loss: 0.00803
Epoch: 29644, Training Loss: 0.00837
Epoch: 29644, Training Loss: 0.00838
Epoch: 29644, Training Loss: 0.01029
Epoch: 29645, Training Loss: 0.00803
Epoch: 29645, Training Loss: 0.00837
Epoch: 29645, Training Loss: 0.00838
Epoch: 29645, Training Loss: 0.01029
Epoch: 29646, Training Loss: 0.00803
Epoch: 29646, Training Loss: 0.00837
Epoch: 29646, Training Loss: 0.00838
Epoch: 29646, Training Loss: 0.01029
Epoch: 29647, Training Loss: 0.00803
Epoch: 29647, Training Loss: 0.00837
Epoch: 29647, Training Loss: 0.00838
Epoch: 29647, Training Loss: 0.01029
Epoch: 29648, Training Loss: 0.00803
Epoch: 29648, Training Loss: 0.00836
Epoch: 29648, Training Loss: 0.00837
Epoch: 29648, Training Loss: 0.01029
Epoch: 29649, Training Loss: 0.00803
Epoch: 29649, Training Loss: 0.00836
Epoch: 29649, Training Loss: 0.00837
Epoch: 29649, Training Loss: 0.01029
Epoch: 29650, Training Loss: 0.00803
Epoch: 29650, Training Loss: 0.00836
Epoch: 29650, Training Loss: 0.00837
Epoch: 29650, Training Loss: 0.01029
Epoch: 29651, Training Loss: 0.00803
Epoch: 29651, Training Loss: 0.00836
Epoch: 29651, Training Loss: 0.00837
Epoch: 29651, Training Loss: 0.01029
Epoch: 29652, Training Loss: 0.00802
Epoch: 29652, Training Loss: 0.00836
Epoch: 29652, Training Loss: 0.00837
Epoch: 29652, Training Loss: 0.01029
Epoch: 29653, Training Loss: 0.00802
Epoch: 29653, Training Loss: 0.00836
Epoch: 29653, Training Loss: 0.00837
Epoch: 29653, Training Loss: 0.01029
Epoch: 29654, Training Loss: 0.00802
Epoch: 29654, Training Loss: 0.00836
Epoch: 29654, Training Loss: 0.00837
Epoch: 29654, Training Loss: 0.01029
Epoch: 29655, Training Loss: 0.00802
Epoch: 29655, Training Loss: 0.00836
Epoch: 29655, Training Loss: 0.00837
Epoch: 29655, Training Loss: 0.01029
Epoch: 29656, Training Loss: 0.00802
Epoch: 29656, Training Loss: 0.00836
Epoch: 29656, Training Loss: 0.00837
Epoch: 29656, Training Loss: 0.01029
Epoch: 29657, Training Loss: 0.00802
Epoch: 29657, Training Loss: 0.00836
Epoch: 29657, Training Loss: 0.00837
Epoch: 29657, Training Loss: 0.01029
Epoch: 29658, Training Loss: 0.00802
Epoch: 29658, Training Loss: 0.00836
Epoch: 29658, Training Loss: 0.00837
Epoch: 29658, Training Loss: 0.01029
Epoch: 29659, Training Loss: 0.00802
Epoch: 29659, Training Loss: 0.00836
Epoch: 29659, Training Loss: 0.00837
Epoch: 29659, Training Loss: 0.01029
Epoch: 29660, Training Loss: 0.00802
Epoch: 29660, Training Loss: 0.00836
Epoch: 29660, Training Loss: 0.00837
Epoch: 29660, Training Loss: 0.01029
Epoch: 29661, Training Loss: 0.00802
Epoch: 29661, Training Loss: 0.00836
Epoch: 29661, Training Loss: 0.00837
Epoch: 29661, Training Loss: 0.01029
Epoch: 29662, Training Loss: 0.00802
Epoch: 29662, Training Loss: 0.00836
Epoch: 29662, Training Loss: 0.00837
Epoch: 29662, Training Loss: 0.01029
Epoch: 29663, Training Loss: 0.00802
Epoch: 29663, Training Loss: 0.00836
Epoch: 29663, Training Loss: 0.00837
Epoch: 29663, Training Loss: 0.01029
Epoch: 29664, Training Loss: 0.00802
Epoch: 29664, Training Loss: 0.00836
Epoch: 29664, Training Loss: 0.00837
Epoch: 29664, Training Loss: 0.01029
Epoch: 29665, Training Loss: 0.00802
Epoch: 29665, Training Loss: 0.00836
Epoch: 29665, Training Loss: 0.00837
Epoch: 29665, Training Loss: 0.01029
Epoch: 29666, Training Loss: 0.00802
Epoch: 29666, Training Loss: 0.00836
Epoch: 29666, Training Loss: 0.00837
Epoch: 29666, Training Loss: 0.01029
Epoch: 29667, Training Loss: 0.00802
Epoch: 29667, Training Loss: 0.00836
Epoch: 29667, Training Loss: 0.00837
Epoch: 29667, Training Loss: 0.01028
Epoch: 29668, Training Loss: 0.00802
Epoch: 29668, Training Loss: 0.00836
Epoch: 29668, Training Loss: 0.00837
Epoch: 29668, Training Loss: 0.01028
Epoch: 29669, Training Loss: 0.00802
Epoch: 29669, Training Loss: 0.00836
Epoch: 29669, Training Loss: 0.00837
Epoch: 29669, Training Loss: 0.01028
Epoch: 29670, Training Loss: 0.00802
Epoch: 29670, Training Loss: 0.00836
Epoch: 29670, Training Loss: 0.00837
Epoch: 29670, Training Loss: 0.01028
Epoch: 29671, Training Loss: 0.00802
Epoch: 29671, Training Loss: 0.00836
Epoch: 29671, Training Loss: 0.00837
Epoch: 29671, Training Loss: 0.01028
Epoch: 29672, Training Loss: 0.00802
Epoch: 29672, Training Loss: 0.00836
Epoch: 29672, Training Loss: 0.00837
Epoch: 29672, Training Loss: 0.01028
Epoch: 29673, Training Loss: 0.00802
Epoch: 29673, Training Loss: 0.00836
Epoch: 29673, Training Loss: 0.00837
Epoch: 29673, Training Loss: 0.01028
Epoch: 29674, Training Loss: 0.00802
Epoch: 29674, Training Loss: 0.00836
Epoch: 29674, Training Loss: 0.00837
Epoch: 29674, Training Loss: 0.01028
Epoch: 29675, Training Loss: 0.00802
Epoch: 29675, Training Loss: 0.00836
Epoch: 29675, Training Loss: 0.00837
Epoch: 29675, Training Loss: 0.01028

[... output for epochs 29676-29916 elided: four losses are printed per epoch, one per XOR training pattern, and they decrease very slowly -- from roughly 0.00802 / 0.00836 / 0.00837 / 0.01028 down to the values below ...]

Epoch: 29917, Training Loss: 0.00799
Epoch: 29917, Training Loss: 0.00832
Epoch: 29917, Training Loss: 0.00833
Epoch: 29917, Training Loss: 0.01024
Epoch: 29918, Training Loss: 0.00799
Epoch: 29918, Training Loss: 0.00832
Epoch: 29918, Training Loss: 0.00833
Epoch: 29918, Training Loss: 0.01024
Epoch: 29919, Training Loss: 0.00799
Epoch: 29919, Training Loss: 0.00832
Epoch: 29919, Training Loss: 0.00833
Epoch: 29919, Training Loss: 0.01024
Epoch: 29920, Training Loss: 0.00799
Epoch: 29920, Training Loss: 0.00832
Epoch: 29920, Training Loss: 0.00833
Epoch: 29920, Training Loss: 0.01024
Epoch: 29921, Training Loss: 0.00799
Epoch: 29921, Training Loss: 0.00832
Epoch: 29921, Training Loss: 0.00833
Epoch: 29921, Training Loss: 0.01024
Epoch: 29922, Training Loss: 0.00798
Epoch: 29922, Training Loss: 0.00832
Epoch: 29922, Training Loss: 0.00833
Epoch: 29922, Training Loss: 0.01023
Epoch: 29923, Training Loss: 0.00798
Epoch: 29923, Training Loss: 0.00832
Epoch: 29923, Training Loss: 0.00833
Epoch: 29923, Training Loss: 0.01023
Epoch: 29924, Training Loss: 0.00798
Epoch: 29924, Training Loss: 0.00832
Epoch: 29924, Training Loss: 0.00833
Epoch: 29924, Training Loss: 0.01023
Epoch: 29925, Training Loss: 0.00798
Epoch: 29925, Training Loss: 0.00832
Epoch: 29925, Training Loss: 0.00833
Epoch: 29925, Training Loss: 0.01023
Epoch: 29926, Training Loss: 0.00798
Epoch: 29926, Training Loss: 0.00832
Epoch: 29926, Training Loss: 0.00833
Epoch: 29926, Training Loss: 0.01023
Epoch: 29927, Training Loss: 0.00798
Epoch: 29927, Training Loss: 0.00832
Epoch: 29927, Training Loss: 0.00833
Epoch: 29927, Training Loss: 0.01023
Epoch: 29928, Training Loss: 0.00798
Epoch: 29928, Training Loss: 0.00832
Epoch: 29928, Training Loss: 0.00833
Epoch: 29928, Training Loss: 0.01023
Epoch: 29929, Training Loss: 0.00798
Epoch: 29929, Training Loss: 0.00832
Epoch: 29929, Training Loss: 0.00833
Epoch: 29929, Training Loss: 0.01023
Epoch: 29930, Training Loss: 0.00798
Epoch: 29930, Training Loss: 0.00832
Epoch: 29930, Training Loss: 0.00833
Epoch: 29930, Training Loss: 0.01023
Epoch: 29931, Training Loss: 0.00798
Epoch: 29931, Training Loss: 0.00832
Epoch: 29931, Training Loss: 0.00833
Epoch: 29931, Training Loss: 0.01023
Epoch: 29932, Training Loss: 0.00798
Epoch: 29932, Training Loss: 0.00832
Epoch: 29932, Training Loss: 0.00833
Epoch: 29932, Training Loss: 0.01023
Epoch: 29933, Training Loss: 0.00798
Epoch: 29933, Training Loss: 0.00832
Epoch: 29933, Training Loss: 0.00833
Epoch: 29933, Training Loss: 0.01023
Epoch: 29934, Training Loss: 0.00798
Epoch: 29934, Training Loss: 0.00832
Epoch: 29934, Training Loss: 0.00833
Epoch: 29934, Training Loss: 0.01023
Epoch: 29935, Training Loss: 0.00798
Epoch: 29935, Training Loss: 0.00832
Epoch: 29935, Training Loss: 0.00833
Epoch: 29935, Training Loss: 0.01023
Epoch: 29936, Training Loss: 0.00798
Epoch: 29936, Training Loss: 0.00832
Epoch: 29936, Training Loss: 0.00833
Epoch: 29936, Training Loss: 0.01023
Epoch: 29937, Training Loss: 0.00798
Epoch: 29937, Training Loss: 0.00832
Epoch: 29937, Training Loss: 0.00833
Epoch: 29937, Training Loss: 0.01023
Epoch: 29938, Training Loss: 0.00798
Epoch: 29938, Training Loss: 0.00832
Epoch: 29938, Training Loss: 0.00833
Epoch: 29938, Training Loss: 0.01023
Epoch: 29939, Training Loss: 0.00798
Epoch: 29939, Training Loss: 0.00832
Epoch: 29939, Training Loss: 0.00833
Epoch: 29939, Training Loss: 0.01023
Epoch: 29940, Training Loss: 0.00798
Epoch: 29940, Training Loss: 0.00832
Epoch: 29940, Training Loss: 0.00833
Epoch: 29940, Training Loss: 0.01023
Epoch: 29941, Training Loss: 0.00798
Epoch: 29941, Training Loss: 0.00832
Epoch: 29941, Training Loss: 0.00833
Epoch: 29941, Training Loss: 0.01023
Epoch: 29942, Training Loss: 0.00798
Epoch: 29942, Training Loss: 0.00832
Epoch: 29942, Training Loss: 0.00833
Epoch: 29942, Training Loss: 0.01023
Epoch: 29943, Training Loss: 0.00798
Epoch: 29943, Training Loss: 0.00832
Epoch: 29943, Training Loss: 0.00833
Epoch: 29943, Training Loss: 0.01023
Epoch: 29944, Training Loss: 0.00798
Epoch: 29944, Training Loss: 0.00832
Epoch: 29944, Training Loss: 0.00833
Epoch: 29944, Training Loss: 0.01023
Epoch: 29945, Training Loss: 0.00798
Epoch: 29945, Training Loss: 0.00832
Epoch: 29945, Training Loss: 0.00833
Epoch: 29945, Training Loss: 0.01023
Epoch: 29946, Training Loss: 0.00798
Epoch: 29946, Training Loss: 0.00832
Epoch: 29946, Training Loss: 0.00833
Epoch: 29946, Training Loss: 0.01023
Epoch: 29947, Training Loss: 0.00798
Epoch: 29947, Training Loss: 0.00832
Epoch: 29947, Training Loss: 0.00833
Epoch: 29947, Training Loss: 0.01023
Epoch: 29948, Training Loss: 0.00798
Epoch: 29948, Training Loss: 0.00832
Epoch: 29948, Training Loss: 0.00833
Epoch: 29948, Training Loss: 0.01023
Epoch: 29949, Training Loss: 0.00798
Epoch: 29949, Training Loss: 0.00832
Epoch: 29949, Training Loss: 0.00833
Epoch: 29949, Training Loss: 0.01023
Epoch: 29950, Training Loss: 0.00798
Epoch: 29950, Training Loss: 0.00832
Epoch: 29950, Training Loss: 0.00833
Epoch: 29950, Training Loss: 0.01023
Epoch: 29951, Training Loss: 0.00798
Epoch: 29951, Training Loss: 0.00832
Epoch: 29951, Training Loss: 0.00833
Epoch: 29951, Training Loss: 0.01023
Epoch: 29952, Training Loss: 0.00798
Epoch: 29952, Training Loss: 0.00832
Epoch: 29952, Training Loss: 0.00833
Epoch: 29952, Training Loss: 0.01023
Epoch: 29953, Training Loss: 0.00798
Epoch: 29953, Training Loss: 0.00832
Epoch: 29953, Training Loss: 0.00833
Epoch: 29953, Training Loss: 0.01023
Epoch: 29954, Training Loss: 0.00798
Epoch: 29954, Training Loss: 0.00832
Epoch: 29954, Training Loss: 0.00833
Epoch: 29954, Training Loss: 0.01023
Epoch: 29955, Training Loss: 0.00798
Epoch: 29955, Training Loss: 0.00832
Epoch: 29955, Training Loss: 0.00833
Epoch: 29955, Training Loss: 0.01023
Epoch: 29956, Training Loss: 0.00798
Epoch: 29956, Training Loss: 0.00832
Epoch: 29956, Training Loss: 0.00833
Epoch: 29956, Training Loss: 0.01023
Epoch: 29957, Training Loss: 0.00798
Epoch: 29957, Training Loss: 0.00832
Epoch: 29957, Training Loss: 0.00833
Epoch: 29957, Training Loss: 0.01023
Epoch: 29958, Training Loss: 0.00798
Epoch: 29958, Training Loss: 0.00832
Epoch: 29958, Training Loss: 0.00833
Epoch: 29958, Training Loss: 0.01023
Epoch: 29959, Training Loss: 0.00798
Epoch: 29959, Training Loss: 0.00832
Epoch: 29959, Training Loss: 0.00833
Epoch: 29959, Training Loss: 0.01023
Epoch: 29960, Training Loss: 0.00798
Epoch: 29960, Training Loss: 0.00832
Epoch: 29960, Training Loss: 0.00833
Epoch: 29960, Training Loss: 0.01023
Epoch: 29961, Training Loss: 0.00798
Epoch: 29961, Training Loss: 0.00832
Epoch: 29961, Training Loss: 0.00833
Epoch: 29961, Training Loss: 0.01023
Epoch: 29962, Training Loss: 0.00798
Epoch: 29962, Training Loss: 0.00832
Epoch: 29962, Training Loss: 0.00833
Epoch: 29962, Training Loss: 0.01023
Epoch: 29963, Training Loss: 0.00798
Epoch: 29963, Training Loss: 0.00832
Epoch: 29963, Training Loss: 0.00833
Epoch: 29963, Training Loss: 0.01023
Epoch: 29964, Training Loss: 0.00798
Epoch: 29964, Training Loss: 0.00832
Epoch: 29964, Training Loss: 0.00833
Epoch: 29964, Training Loss: 0.01023
Epoch: 29965, Training Loss: 0.00798
Epoch: 29965, Training Loss: 0.00831
Epoch: 29965, Training Loss: 0.00832
Epoch: 29965, Training Loss: 0.01023
Epoch: 29966, Training Loss: 0.00798
Epoch: 29966, Training Loss: 0.00831
Epoch: 29966, Training Loss: 0.00832
Epoch: 29966, Training Loss: 0.01023
Epoch: 29967, Training Loss: 0.00798
Epoch: 29967, Training Loss: 0.00831
Epoch: 29967, Training Loss: 0.00832
Epoch: 29967, Training Loss: 0.01023
Epoch: 29968, Training Loss: 0.00798
Epoch: 29968, Training Loss: 0.00831
Epoch: 29968, Training Loss: 0.00832
Epoch: 29968, Training Loss: 0.01023
Epoch: 29969, Training Loss: 0.00798
Epoch: 29969, Training Loss: 0.00831
Epoch: 29969, Training Loss: 0.00832
Epoch: 29969, Training Loss: 0.01023
Epoch: 29970, Training Loss: 0.00798
Epoch: 29970, Training Loss: 0.00831
Epoch: 29970, Training Loss: 0.00832
Epoch: 29970, Training Loss: 0.01023
Epoch: 29971, Training Loss: 0.00798
Epoch: 29971, Training Loss: 0.00831
Epoch: 29971, Training Loss: 0.00832
Epoch: 29971, Training Loss: 0.01023
Epoch: 29972, Training Loss: 0.00798
Epoch: 29972, Training Loss: 0.00831
Epoch: 29972, Training Loss: 0.00832
Epoch: 29972, Training Loss: 0.01023
Epoch: 29973, Training Loss: 0.00798
Epoch: 29973, Training Loss: 0.00831
Epoch: 29973, Training Loss: 0.00832
Epoch: 29973, Training Loss: 0.01022
Epoch: 29974, Training Loss: 0.00798
Epoch: 29974, Training Loss: 0.00831
Epoch: 29974, Training Loss: 0.00832
Epoch: 29974, Training Loss: 0.01022
Epoch: 29975, Training Loss: 0.00798
Epoch: 29975, Training Loss: 0.00831
Epoch: 29975, Training Loss: 0.00832
Epoch: 29975, Training Loss: 0.01022
Epoch: 29976, Training Loss: 0.00798
Epoch: 29976, Training Loss: 0.00831
Epoch: 29976, Training Loss: 0.00832
Epoch: 29976, Training Loss: 0.01022
Epoch: 29977, Training Loss: 0.00798
Epoch: 29977, Training Loss: 0.00831
Epoch: 29977, Training Loss: 0.00832
Epoch: 29977, Training Loss: 0.01022
Epoch: 29978, Training Loss: 0.00798
Epoch: 29978, Training Loss: 0.00831
Epoch: 29978, Training Loss: 0.00832
Epoch: 29978, Training Loss: 0.01022
Epoch: 29979, Training Loss: 0.00798
Epoch: 29979, Training Loss: 0.00831
Epoch: 29979, Training Loss: 0.00832
Epoch: 29979, Training Loss: 0.01022
Epoch: 29980, Training Loss: 0.00798
Epoch: 29980, Training Loss: 0.00831
Epoch: 29980, Training Loss: 0.00832
Epoch: 29980, Training Loss: 0.01022
Epoch: 29981, Training Loss: 0.00798
Epoch: 29981, Training Loss: 0.00831
Epoch: 29981, Training Loss: 0.00832
Epoch: 29981, Training Loss: 0.01022
Epoch: 29982, Training Loss: 0.00798
Epoch: 29982, Training Loss: 0.00831
Epoch: 29982, Training Loss: 0.00832
Epoch: 29982, Training Loss: 0.01022
Epoch: 29983, Training Loss: 0.00798
Epoch: 29983, Training Loss: 0.00831
Epoch: 29983, Training Loss: 0.00832
Epoch: 29983, Training Loss: 0.01022
Epoch: 29984, Training Loss: 0.00798
Epoch: 29984, Training Loss: 0.00831
Epoch: 29984, Training Loss: 0.00832
Epoch: 29984, Training Loss: 0.01022
Epoch: 29985, Training Loss: 0.00798
Epoch: 29985, Training Loss: 0.00831
Epoch: 29985, Training Loss: 0.00832
Epoch: 29985, Training Loss: 0.01022
Epoch: 29986, Training Loss: 0.00798
Epoch: 29986, Training Loss: 0.00831
Epoch: 29986, Training Loss: 0.00832
Epoch: 29986, Training Loss: 0.01022
Epoch: 29987, Training Loss: 0.00798
Epoch: 29987, Training Loss: 0.00831
Epoch: 29987, Training Loss: 0.00832
Epoch: 29987, Training Loss: 0.01022
Epoch: 29988, Training Loss: 0.00798
Epoch: 29988, Training Loss: 0.00831
Epoch: 29988, Training Loss: 0.00832
Epoch: 29988, Training Loss: 0.01022
Epoch: 29989, Training Loss: 0.00798
Epoch: 29989, Training Loss: 0.00831
Epoch: 29989, Training Loss: 0.00832
Epoch: 29989, Training Loss: 0.01022
Epoch: 29990, Training Loss: 0.00797
Epoch: 29990, Training Loss: 0.00831
Epoch: 29990, Training Loss: 0.00832
Epoch: 29990, Training Loss: 0.01022
Epoch: 29991, Training Loss: 0.00797
Epoch: 29991, Training Loss: 0.00831
Epoch: 29991, Training Loss: 0.00832
Epoch: 29991, Training Loss: 0.01022
Epoch: 29992, Training Loss: 0.00797
Epoch: 29992, Training Loss: 0.00831
Epoch: 29992, Training Loss: 0.00832
Epoch: 29992, Training Loss: 0.01022
Epoch: 29993, Training Loss: 0.00797
Epoch: 29993, Training Loss: 0.00831
Epoch: 29993, Training Loss: 0.00832
Epoch: 29993, Training Loss: 0.01022
Epoch: 29994, Training Loss: 0.00797
Epoch: 29994, Training Loss: 0.00831
Epoch: 29994, Training Loss: 0.00832
Epoch: 29994, Training Loss: 0.01022
Epoch: 29995, Training Loss: 0.00797
Epoch: 29995, Training Loss: 0.00831
Epoch: 29995, Training Loss: 0.00832
Epoch: 29995, Training Loss: 0.01022
Epoch: 29996, Training Loss: 0.00797
Epoch: 29996, Training Loss: 0.00831
Epoch: 29996, Training Loss: 0.00832
Epoch: 29996, Training Loss: 0.01022
Epoch: 29997, Training Loss: 0.00797
Epoch: 29997, Training Loss: 0.00831
Epoch: 29997, Training Loss: 0.00832
Epoch: 29997, Training Loss: 0.01022
Epoch: 29998, Training Loss: 0.00797
Epoch: 29998, Training Loss: 0.00831
Epoch: 29998, Training Loss: 0.00832
Epoch: 29998, Training Loss: 0.01022
Epoch: 29999, Training Loss: 0.00797
Epoch: 29999, Training Loss: 0.00831
Epoch: 29999, Training Loss: 0.00832
Epoch: 29999, Training Loss: 0.01022
Epoch: 30000, Training Loss: 0.00797
Epoch: 30000, Training Loss: 0.00831
Epoch: 30000, Training Loss: 0.00832
Epoch: 30000, Training Loss: 0.01022
Epoch: 30001, Training Loss: 0.00797
Epoch: 30001, Training Loss: 0.00831
Epoch: 30001, Training Loss: 0.00832
Epoch: 30001, Training Loss: 0.01022
Epoch: 30002, Training Loss: 0.00797
Epoch: 30002, Training Loss: 0.00831
Epoch: 30002, Training Loss: 0.00832
Epoch: 30002, Training Loss: 0.01022
Epoch: 30003, Training Loss: 0.00797
Epoch: 30003, Training Loss: 0.00831
Epoch: 30003, Training Loss: 0.00832
Epoch: 30003, Training Loss: 0.01022
Epoch: 30004, Training Loss: 0.00797
Epoch: 30004, Training Loss: 0.00831
Epoch: 30004, Training Loss: 0.00832
Epoch: 30004, Training Loss: 0.01022
Epoch: 30005, Training Loss: 0.00797
Epoch: 30005, Training Loss: 0.00831
Epoch: 30005, Training Loss: 0.00832
Epoch: 30005, Training Loss: 0.01022
Epoch: 30006, Training Loss: 0.00797
Epoch: 30006, Training Loss: 0.00831
Epoch: 30006, Training Loss: 0.00832
Epoch: 30006, Training Loss: 0.01022
Epoch: 30007, Training Loss: 0.00797
Epoch: 30007, Training Loss: 0.00831
Epoch: 30007, Training Loss: 0.00832
Epoch: 30007, Training Loss: 0.01022
Epoch: 30008, Training Loss: 0.00797
Epoch: 30008, Training Loss: 0.00831
Epoch: 30008, Training Loss: 0.00832
Epoch: 30008, Training Loss: 0.01022
Epoch: 30009, Training Loss: 0.00797
Epoch: 30009, Training Loss: 0.00831
Epoch: 30009, Training Loss: 0.00832
Epoch: 30009, Training Loss: 0.01022
Epoch: 30010, Training Loss: 0.00797
Epoch: 30010, Training Loss: 0.00831
Epoch: 30010, Training Loss: 0.00832
Epoch: 30010, Training Loss: 0.01022
Epoch: 30011, Training Loss: 0.00797
Epoch: 30011, Training Loss: 0.00831
Epoch: 30011, Training Loss: 0.00832
Epoch: 30011, Training Loss: 0.01022
Epoch: 30012, Training Loss: 0.00797
Epoch: 30012, Training Loss: 0.00831
Epoch: 30012, Training Loss: 0.00832
Epoch: 30012, Training Loss: 0.01022
Epoch: 30013, Training Loss: 0.00797
Epoch: 30013, Training Loss: 0.00831
Epoch: 30013, Training Loss: 0.00832
Epoch: 30013, Training Loss: 0.01022
Epoch: 30014, Training Loss: 0.00797
Epoch: 30014, Training Loss: 0.00831
Epoch: 30014, Training Loss: 0.00832
Epoch: 30014, Training Loss: 0.01022
Epoch: 30015, Training Loss: 0.00797
Epoch: 30015, Training Loss: 0.00831
Epoch: 30015, Training Loss: 0.00832
Epoch: 30015, Training Loss: 0.01022
Epoch: 30016, Training Loss: 0.00797
Epoch: 30016, Training Loss: 0.00831
Epoch: 30016, Training Loss: 0.00832
Epoch: 30016, Training Loss: 0.01022
Epoch: 30017, Training Loss: 0.00797
Epoch: 30017, Training Loss: 0.00831
Epoch: 30017, Training Loss: 0.00832
Epoch: 30017, Training Loss: 0.01022
Epoch: 30018, Training Loss: 0.00797
Epoch: 30018, Training Loss: 0.00831
Epoch: 30018, Training Loss: 0.00832
Epoch: 30018, Training Loss: 0.01022
Epoch: 30019, Training Loss: 0.00797
Epoch: 30019, Training Loss: 0.00831
Epoch: 30019, Training Loss: 0.00832
Epoch: 30019, Training Loss: 0.01022
Epoch: 30020, Training Loss: 0.00797
Epoch: 30020, Training Loss: 0.00831
Epoch: 30020, Training Loss: 0.00832
Epoch: 30020, Training Loss: 0.01022
Epoch: 30021, Training Loss: 0.00797
Epoch: 30021, Training Loss: 0.00831
Epoch: 30021, Training Loss: 0.00832
Epoch: 30021, Training Loss: 0.01022
Epoch: 30022, Training Loss: 0.00797
Epoch: 30022, Training Loss: 0.00831
Epoch: 30022, Training Loss: 0.00832
Epoch: 30022, Training Loss: 0.01022
Epoch: 30023, Training Loss: 0.00797
Epoch: 30023, Training Loss: 0.00831
Epoch: 30023, Training Loss: 0.00832
Epoch: 30023, Training Loss: 0.01022
Epoch: 30024, Training Loss: 0.00797
Epoch: 30024, Training Loss: 0.00831
Epoch: 30024, Training Loss: 0.00832
Epoch: 30024, Training Loss: 0.01021
Epoch: 30025, Training Loss: 0.00797
Epoch: 30025, Training Loss: 0.00831
Epoch: 30025, Training Loss: 0.00832
Epoch: 30025, Training Loss: 0.01021
Epoch: 30026, Training Loss: 0.00797
Epoch: 30026, Training Loss: 0.00831
Epoch: 30026, Training Loss: 0.00832
Epoch: 30026, Training Loss: 0.01021
Epoch: 30027, Training Loss: 0.00797
Epoch: 30027, Training Loss: 0.00831
Epoch: 30027, Training Loss: 0.00832
Epoch: 30027, Training Loss: 0.01021
Epoch: 30028, Training Loss: 0.00797
Epoch: 30028, Training Loss: 0.00831
Epoch: 30028, Training Loss: 0.00832
Epoch: 30028, Training Loss: 0.01021
Epoch: 30029, Training Loss: 0.00797
Epoch: 30029, Training Loss: 0.00830
Epoch: 30029, Training Loss: 0.00831
Epoch: 30029, Training Loss: 0.01021
Epoch: 30030, Training Loss: 0.00797
Epoch: 30030, Training Loss: 0.00830
Epoch: 30030, Training Loss: 0.00831
Epoch: 30030, Training Loss: 0.01021
Epoch: 30031, Training Loss: 0.00797
Epoch: 30031, Training Loss: 0.00830
Epoch: 30031, Training Loss: 0.00831
Epoch: 30031, Training Loss: 0.01021
Epoch: 30032, Training Loss: 0.00797
Epoch: 30032, Training Loss: 0.00830
Epoch: 30032, Training Loss: 0.00831
Epoch: 30032, Training Loss: 0.01021
Epoch: 30033, Training Loss: 0.00797
Epoch: 30033, Training Loss: 0.00830
Epoch: 30033, Training Loss: 0.00831
Epoch: 30033, Training Loss: 0.01021
Epoch: 30034, Training Loss: 0.00797
Epoch: 30034, Training Loss: 0.00830
Epoch: 30034, Training Loss: 0.00831
Epoch: 30034, Training Loss: 0.01021
Epoch: 30035, Training Loss: 0.00797
Epoch: 30035, Training Loss: 0.00830
Epoch: 30035, Training Loss: 0.00831
Epoch: 30035, Training Loss: 0.01021
Epoch: 30036, Training Loss: 0.00797
Epoch: 30036, Training Loss: 0.00830
Epoch: 30036, Training Loss: 0.00831
Epoch: 30036, Training Loss: 0.01021
Epoch: 30037, Training Loss: 0.00797
Epoch: 30037, Training Loss: 0.00830
Epoch: 30037, Training Loss: 0.00831
Epoch: 30037, Training Loss: 0.01021
Epoch: 30038, Training Loss: 0.00797
Epoch: 30038, Training Loss: 0.00830
Epoch: 30038, Training Loss: 0.00831
Epoch: 30038, Training Loss: 0.01021
Epoch: 30039, Training Loss: 0.00797
Epoch: 30039, Training Loss: 0.00830
Epoch: 30039, Training Loss: 0.00831
Epoch: 30039, Training Loss: 0.01021
Epoch: 30040, Training Loss: 0.00797
Epoch: 30040, Training Loss: 0.00830
Epoch: 30040, Training Loss: 0.00831
Epoch: 30040, Training Loss: 0.01021
Epoch: 30041, Training Loss: 0.00797
Epoch: 30041, Training Loss: 0.00830
Epoch: 30041, Training Loss: 0.00831
Epoch: 30041, Training Loss: 0.01021
Epoch: 30042, Training Loss: 0.00797
Epoch: 30042, Training Loss: 0.00830
Epoch: 30042, Training Loss: 0.00831
Epoch: 30042, Training Loss: 0.01021
Epoch: 30043, Training Loss: 0.00797
Epoch: 30043, Training Loss: 0.00830
Epoch: 30043, Training Loss: 0.00831
Epoch: 30043, Training Loss: 0.01021
Epoch: 30044, Training Loss: 0.00797
Epoch: 30044, Training Loss: 0.00830
Epoch: 30044, Training Loss: 0.00831
Epoch: 30044, Training Loss: 0.01021
Epoch: 30045, Training Loss: 0.00797
Epoch: 30045, Training Loss: 0.00830
Epoch: 30045, Training Loss: 0.00831
Epoch: 30045, Training Loss: 0.01021
Epoch: 30046, Training Loss: 0.00797
Epoch: 30046, Training Loss: 0.00830
Epoch: 30046, Training Loss: 0.00831
Epoch: 30046, Training Loss: 0.01021
Epoch: 30047, Training Loss: 0.00797
Epoch: 30047, Training Loss: 0.00830
Epoch: 30047, Training Loss: 0.00831
Epoch: 30047, Training Loss: 0.01021
Epoch: 30048, Training Loss: 0.00797
Epoch: 30048, Training Loss: 0.00830
Epoch: 30048, Training Loss: 0.00831
Epoch: 30048, Training Loss: 0.01021
Epoch: 30049, Training Loss: 0.00797
Epoch: 30049, Training Loss: 0.00830
Epoch: 30049, Training Loss: 0.00831
Epoch: 30049, Training Loss: 0.01021
Epoch: 30050, Training Loss: 0.00797
Epoch: 30050, Training Loss: 0.00830
Epoch: 30050, Training Loss: 0.00831
Epoch: 30050, Training Loss: 0.01021
Epoch: 30051, Training Loss: 0.00797
Epoch: 30051, Training Loss: 0.00830
Epoch: 30051, Training Loss: 0.00831
Epoch: 30051, Training Loss: 0.01021
Epoch: 30052, Training Loss: 0.00797
Epoch: 30052, Training Loss: 0.00830
Epoch: 30052, Training Loss: 0.00831
Epoch: 30052, Training Loss: 0.01021
Epoch: 30053, Training Loss: 0.00797
Epoch: 30053, Training Loss: 0.00830
Epoch: 30053, Training Loss: 0.00831
Epoch: 30053, Training Loss: 0.01021
Epoch: 30054, Training Loss: 0.00797
Epoch: 30054, Training Loss: 0.00830
Epoch: 30054, Training Loss: 0.00831
Epoch: 30054, Training Loss: 0.01021
Epoch: 30055, Training Loss: 0.00797
Epoch: 30055, Training Loss: 0.00830
Epoch: 30055, Training Loss: 0.00831
Epoch: 30055, Training Loss: 0.01021
Epoch: 30056, Training Loss: 0.00797
Epoch: 30056, Training Loss: 0.00830
Epoch: 30056, Training Loss: 0.00831
Epoch: 30056, Training Loss: 0.01021
Epoch: 30057, Training Loss: 0.00797
Epoch: 30057, Training Loss: 0.00830
Epoch: 30057, Training Loss: 0.00831
Epoch: 30057, Training Loss: 0.01021
Epoch: 30058, Training Loss: 0.00797
Epoch: 30058, Training Loss: 0.00830
Epoch: 30058, Training Loss: 0.00831
Epoch: 30058, Training Loss: 0.01021
Epoch: 30059, Training Loss: 0.00796
Epoch: 30059, Training Loss: 0.00830
Epoch: 30059, Training Loss: 0.00831
Epoch: 30059, Training Loss: 0.01021
Epoch: 30060, Training Loss: 0.00796
Epoch: 30060, Training Loss: 0.00830
Epoch: 30060, Training Loss: 0.00831
Epoch: 30060, Training Loss: 0.01021
Epoch: 30061, Training Loss: 0.00796
Epoch: 30061, Training Loss: 0.00830
Epoch: 30061, Training Loss: 0.00831
Epoch: 30061, Training Loss: 0.01021
Epoch: 30062, Training Loss: 0.00796
Epoch: 30062, Training Loss: 0.00830
Epoch: 30062, Training Loss: 0.00831
Epoch: 30062, Training Loss: 0.01021
Epoch: 30063, Training Loss: 0.00796
Epoch: 30063, Training Loss: 0.00830
Epoch: 30063, Training Loss: 0.00831
Epoch: 30063, Training Loss: 0.01021
Epoch: 30064, Training Loss: 0.00796
Epoch: 30064, Training Loss: 0.00830
Epoch: 30064, Training Loss: 0.00831
Epoch: 30064, Training Loss: 0.01021
Epoch: 30065, Training Loss: 0.00796
Epoch: 30065, Training Loss: 0.00830
Epoch: 30065, Training Loss: 0.00831
Epoch: 30065, Training Loss: 0.01021
Epoch: 30066, Training Loss: 0.00796
Epoch: 30066, Training Loss: 0.00830
Epoch: 30066, Training Loss: 0.00831
Epoch: 30066, Training Loss: 0.01021
Epoch: 30067, Training Loss: 0.00796
Epoch: 30067, Training Loss: 0.00830
Epoch: 30067, Training Loss: 0.00831
Epoch: 30067, Training Loss: 0.01021
Epoch: 30068, Training Loss: 0.00796
Epoch: 30068, Training Loss: 0.00830
Epoch: 30068, Training Loss: 0.00831
Epoch: 30068, Training Loss: 0.01021
Epoch: 30069, Training Loss: 0.00796
Epoch: 30069, Training Loss: 0.00830
Epoch: 30069, Training Loss: 0.00831
Epoch: 30069, Training Loss: 0.01021
Epoch: 30070, Training Loss: 0.00796
Epoch: 30070, Training Loss: 0.00830
Epoch: 30070, Training Loss: 0.00831
Epoch: 30070, Training Loss: 0.01021
Epoch: 30071, Training Loss: 0.00796
Epoch: 30071, Training Loss: 0.00830
Epoch: 30071, Training Loss: 0.00831
Epoch: 30071, Training Loss: 0.01021
Epoch: 30072, Training Loss: 0.00796
Epoch: 30072, Training Loss: 0.00830
Epoch: 30072, Training Loss: 0.00831
Epoch: 30072, Training Loss: 0.01021
Epoch: 30073, Training Loss: 0.00796
Epoch: 30073, Training Loss: 0.00830
Epoch: 30073, Training Loss: 0.00831
Epoch: 30073, Training Loss: 0.01021
Epoch: 30074, Training Loss: 0.00796
Epoch: 30074, Training Loss: 0.00830
Epoch: 30074, Training Loss: 0.00831
Epoch: 30074, Training Loss: 0.01021
Epoch: 30075, Training Loss: 0.00796
Epoch: 30075, Training Loss: 0.00830
Epoch: 30075, Training Loss: 0.00831
Epoch: 30075, Training Loss: 0.01021
Epoch: 30076, Training Loss: 0.00796
Epoch: 30076, Training Loss: 0.00830
Epoch: 30076, Training Loss: 0.00831
Epoch: 30076, Training Loss: 0.01020
Epoch: 30077, Training Loss: 0.00796
Epoch: 30077, Training Loss: 0.00830
Epoch: 30077, Training Loss: 0.00831
Epoch: 30077, Training Loss: 0.01020
Epoch: 30078, Training Loss: 0.00796
Epoch: 30078, Training Loss: 0.00830
Epoch: 30078, Training Loss: 0.00831
Epoch: 30078, Training Loss: 0.01020
Epoch: 30079, Training Loss: 0.00796
Epoch: 30079, Training Loss: 0.00830
Epoch: 30079, Training Loss: 0.00831
Epoch: 30079, Training Loss: 0.01020
Epoch: 30080, Training Loss: 0.00796
Epoch: 30080, Training Loss: 0.00830
Epoch: 30080, Training Loss: 0.00831
Epoch: 30080, Training Loss: 0.01020
Epoch: 30081, Training Loss: 0.00796
Epoch: 30081, Training Loss: 0.00830
Epoch: 30081, Training Loss: 0.00831
Epoch: 30081, Training Loss: 0.01020
Epoch: 30082, Training Loss: 0.00796
Epoch: 30082, Training Loss: 0.00830
Epoch: 30082, Training Loss: 0.00831
Epoch: 30082, Training Loss: 0.01020
Epoch: 30083, Training Loss: 0.00796
Epoch: 30083, Training Loss: 0.00830
Epoch: 30083, Training Loss: 0.00831
Epoch: 30083, Training Loss: 0.01020
Epoch: 30084, Training Loss: 0.00796
Epoch: 30084, Training Loss: 0.00830
Epoch: 30084, Training Loss: 0.00831
Epoch: 30084, Training Loss: 0.01020
Epoch: 30085, Training Loss: 0.00796
Epoch: 30085, Training Loss: 0.00830
Epoch: 30085, Training Loss: 0.00831
Epoch: 30085, Training Loss: 0.01020
Epoch: 30086, Training Loss: 0.00796
Epoch: 30086, Training Loss: 0.00830
Epoch: 30086, Training Loss: 0.00831
Epoch: 30086, Training Loss: 0.01020
Epoch: 30087, Training Loss: 0.00796
Epoch: 30087, Training Loss: 0.00830
Epoch: 30087, Training Loss: 0.00831
Epoch: 30087, Training Loss: 0.01020
Epoch: 30088, Training Loss: 0.00796
Epoch: 30088, Training Loss: 0.00830
Epoch: 30088, Training Loss: 0.00831
Epoch: 30088, Training Loss: 0.01020
Epoch: 30089, Training Loss: 0.00796
Epoch: 30089, Training Loss: 0.00830
Epoch: 30089, Training Loss: 0.00831
Epoch: 30089, Training Loss: 0.01020
Epoch: 30090, Training Loss: 0.00796
Epoch: 30090, Training Loss: 0.00830
Epoch: 30090, Training Loss: 0.00831
Epoch: 30090, Training Loss: 0.01020
Epoch: 30091, Training Loss: 0.00796
Epoch: 30091, Training Loss: 0.00830
Epoch: 30091, Training Loss: 0.00831
Epoch: 30091, Training Loss: 0.01020
Epoch: 30092, Training Loss: 0.00796
Epoch: 30092, Training Loss: 0.00830
Epoch: 30092, Training Loss: 0.00831
Epoch: 30092, Training Loss: 0.01020
Epoch: 30093, Training Loss: 0.00796
Epoch: 30093, Training Loss: 0.00830
Epoch: 30093, Training Loss: 0.00830
Epoch: 30093, Training Loss: 0.01020
Epoch: 30094, Training Loss: 0.00796
Epoch: 30094, Training Loss: 0.00829
Epoch: 30094, Training Loss: 0.00830
Epoch: 30094, Training Loss: 0.01020
Epoch: 30095, Training Loss: 0.00796
Epoch: 30095, Training Loss: 0.00829
Epoch: 30095, Training Loss: 0.00830
Epoch: 30095, Training Loss: 0.01020
Epoch: 30096, Training Loss: 0.00796
Epoch: 30096, Training Loss: 0.00829
Epoch: 30096, Training Loss: 0.00830
Epoch: 30096, Training Loss: 0.01020
Epoch: 30097, Training Loss: 0.00796
Epoch: 30097, Training Loss: 0.00829
Epoch: 30097, Training Loss: 0.00830
Epoch: 30097, Training Loss: 0.01020
Epoch: 30098, Training Loss: 0.00796
Epoch: 30098, Training Loss: 0.00829
Epoch: 30098, Training Loss: 0.00830
Epoch: 30098, Training Loss: 0.01020
Epoch: 30099, Training Loss: 0.00796
Epoch: 30099, Training Loss: 0.00829
Epoch: 30099, Training Loss: 0.00830
Epoch: 30099, Training Loss: 0.01020
Epoch: 30100, Training Loss: 0.00796
Epoch: 30100, Training Loss: 0.00829
Epoch: 30100, Training Loss: 0.00830
Epoch: 30100, Training Loss: 0.01020
Epoch: 30101, Training Loss: 0.00796
Epoch: 30101, Training Loss: 0.00829
Epoch: 30101, Training Loss: 0.00830
Epoch: 30101, Training Loss: 0.01020
Epoch: 30102, Training Loss: 0.00796
Epoch: 30102, Training Loss: 0.00829
Epoch: 30102, Training Loss: 0.00830
Epoch: 30102, Training Loss: 0.01020
Epoch: 30103, Training Loss: 0.00796
Epoch: 30103, Training Loss: 0.00829
Epoch: 30103, Training Loss: 0.00830
Epoch: 30103, Training Loss: 0.01020
Epoch: 30104, Training Loss: 0.00796
Epoch: 30104, Training Loss: 0.00829
Epoch: 30104, Training Loss: 0.00830
Epoch: 30104, Training Loss: 0.01020
Epoch: 30105, Training Loss: 0.00796
Epoch: 30105, Training Loss: 0.00829
Epoch: 30105, Training Loss: 0.00830
Epoch: 30105, Training Loss: 0.01020
Epoch: 30106, Training Loss: 0.00796
Epoch: 30106, Training Loss: 0.00829
Epoch: 30106, Training Loss: 0.00830
Epoch: 30106, Training Loss: 0.01020
[... output truncated: the per-pattern losses decline very slowly over this range, from 0.00796 / 0.00829 / 0.00830 / 0.01020 at epoch 30106 to 0.00792 / 0.00826 / 0.00827 / 0.01015 at epoch 30350 ...]
Epoch: 30350, Training Loss: 0.00827
Epoch: 30350, Training Loss: 0.01015
Epoch: 30351, Training Loss: 0.00792
Epoch: 30351, Training Loss: 0.00826
Epoch: 30351, Training Loss: 0.00827
Epoch: 30351, Training Loss: 0.01015
Epoch: 30352, Training Loss: 0.00792
Epoch: 30352, Training Loss: 0.00826
Epoch: 30352, Training Loss: 0.00826
Epoch: 30352, Training Loss: 0.01015
Epoch: 30353, Training Loss: 0.00792
Epoch: 30353, Training Loss: 0.00825
Epoch: 30353, Training Loss: 0.00826
Epoch: 30353, Training Loss: 0.01015
Epoch: 30354, Training Loss: 0.00792
Epoch: 30354, Training Loss: 0.00825
Epoch: 30354, Training Loss: 0.00826
Epoch: 30354, Training Loss: 0.01015
Epoch: 30355, Training Loss: 0.00792
Epoch: 30355, Training Loss: 0.00825
Epoch: 30355, Training Loss: 0.00826
Epoch: 30355, Training Loss: 0.01015
Epoch: 30356, Training Loss: 0.00792
Epoch: 30356, Training Loss: 0.00825
Epoch: 30356, Training Loss: 0.00826
Epoch: 30356, Training Loss: 0.01015
Epoch: 30357, Training Loss: 0.00792
Epoch: 30357, Training Loss: 0.00825
Epoch: 30357, Training Loss: 0.00826
Epoch: 30357, Training Loss: 0.01015
Epoch: 30358, Training Loss: 0.00792
Epoch: 30358, Training Loss: 0.00825
Epoch: 30358, Training Loss: 0.00826
Epoch: 30358, Training Loss: 0.01015
Epoch: 30359, Training Loss: 0.00792
Epoch: 30359, Training Loss: 0.00825
Epoch: 30359, Training Loss: 0.00826
Epoch: 30359, Training Loss: 0.01015
Epoch: 30360, Training Loss: 0.00792
Epoch: 30360, Training Loss: 0.00825
Epoch: 30360, Training Loss: 0.00826
Epoch: 30360, Training Loss: 0.01015
Epoch: 30361, Training Loss: 0.00792
Epoch: 30361, Training Loss: 0.00825
Epoch: 30361, Training Loss: 0.00826
Epoch: 30361, Training Loss: 0.01015
Epoch: 30362, Training Loss: 0.00792
Epoch: 30362, Training Loss: 0.00825
Epoch: 30362, Training Loss: 0.00826
Epoch: 30362, Training Loss: 0.01015
Epoch: 30363, Training Loss: 0.00792
Epoch: 30363, Training Loss: 0.00825
Epoch: 30363, Training Loss: 0.00826
Epoch: 30363, Training Loss: 0.01015
Epoch: 30364, Training Loss: 0.00792
Epoch: 30364, Training Loss: 0.00825
Epoch: 30364, Training Loss: 0.00826
Epoch: 30364, Training Loss: 0.01015
Epoch: 30365, Training Loss: 0.00792
Epoch: 30365, Training Loss: 0.00825
Epoch: 30365, Training Loss: 0.00826
Epoch: 30365, Training Loss: 0.01015
Epoch: 30366, Training Loss: 0.00792
Epoch: 30366, Training Loss: 0.00825
Epoch: 30366, Training Loss: 0.00826
Epoch: 30366, Training Loss: 0.01015
Epoch: 30367, Training Loss: 0.00792
Epoch: 30367, Training Loss: 0.00825
Epoch: 30367, Training Loss: 0.00826
Epoch: 30367, Training Loss: 0.01015
Epoch: 30368, Training Loss: 0.00792
Epoch: 30368, Training Loss: 0.00825
Epoch: 30368, Training Loss: 0.00826
Epoch: 30368, Training Loss: 0.01015
Epoch: 30369, Training Loss: 0.00792
Epoch: 30369, Training Loss: 0.00825
Epoch: 30369, Training Loss: 0.00826
Epoch: 30369, Training Loss: 0.01015
Epoch: 30370, Training Loss: 0.00792
Epoch: 30370, Training Loss: 0.00825
Epoch: 30370, Training Loss: 0.00826
Epoch: 30370, Training Loss: 0.01015
Epoch: 30371, Training Loss: 0.00792
Epoch: 30371, Training Loss: 0.00825
Epoch: 30371, Training Loss: 0.00826
Epoch: 30371, Training Loss: 0.01015
Epoch: 30372, Training Loss: 0.00792
Epoch: 30372, Training Loss: 0.00825
Epoch: 30372, Training Loss: 0.00826
Epoch: 30372, Training Loss: 0.01015
Epoch: 30373, Training Loss: 0.00792
Epoch: 30373, Training Loss: 0.00825
Epoch: 30373, Training Loss: 0.00826
Epoch: 30373, Training Loss: 0.01015
Epoch: 30374, Training Loss: 0.00792
Epoch: 30374, Training Loss: 0.00825
Epoch: 30374, Training Loss: 0.00826
Epoch: 30374, Training Loss: 0.01015
Epoch: 30375, Training Loss: 0.00792
Epoch: 30375, Training Loss: 0.00825
Epoch: 30375, Training Loss: 0.00826
Epoch: 30375, Training Loss: 0.01015
Epoch: 30376, Training Loss: 0.00792
Epoch: 30376, Training Loss: 0.00825
Epoch: 30376, Training Loss: 0.00826
Epoch: 30376, Training Loss: 0.01015
Epoch: 30377, Training Loss: 0.00792
Epoch: 30377, Training Loss: 0.00825
Epoch: 30377, Training Loss: 0.00826
Epoch: 30377, Training Loss: 0.01015
Epoch: 30378, Training Loss: 0.00792
Epoch: 30378, Training Loss: 0.00825
Epoch: 30378, Training Loss: 0.00826
Epoch: 30378, Training Loss: 0.01015
Epoch: 30379, Training Loss: 0.00792
Epoch: 30379, Training Loss: 0.00825
Epoch: 30379, Training Loss: 0.00826
Epoch: 30379, Training Loss: 0.01015
Epoch: 30380, Training Loss: 0.00792
Epoch: 30380, Training Loss: 0.00825
Epoch: 30380, Training Loss: 0.00826
Epoch: 30380, Training Loss: 0.01015
Epoch: 30381, Training Loss: 0.00792
Epoch: 30381, Training Loss: 0.00825
Epoch: 30381, Training Loss: 0.00826
Epoch: 30381, Training Loss: 0.01015
Epoch: 30382, Training Loss: 0.00792
Epoch: 30382, Training Loss: 0.00825
Epoch: 30382, Training Loss: 0.00826
Epoch: 30382, Training Loss: 0.01015
Epoch: 30383, Training Loss: 0.00792
Epoch: 30383, Training Loss: 0.00825
Epoch: 30383, Training Loss: 0.00826
Epoch: 30383, Training Loss: 0.01015
Epoch: 30384, Training Loss: 0.00792
Epoch: 30384, Training Loss: 0.00825
Epoch: 30384, Training Loss: 0.00826
Epoch: 30384, Training Loss: 0.01015
Epoch: 30385, Training Loss: 0.00792
Epoch: 30385, Training Loss: 0.00825
Epoch: 30385, Training Loss: 0.00826
Epoch: 30385, Training Loss: 0.01015
Epoch: 30386, Training Loss: 0.00792
Epoch: 30386, Training Loss: 0.00825
Epoch: 30386, Training Loss: 0.00826
Epoch: 30386, Training Loss: 0.01015
Epoch: 30387, Training Loss: 0.00792
Epoch: 30387, Training Loss: 0.00825
Epoch: 30387, Training Loss: 0.00826
Epoch: 30387, Training Loss: 0.01015
Epoch: 30388, Training Loss: 0.00792
Epoch: 30388, Training Loss: 0.00825
Epoch: 30388, Training Loss: 0.00826
Epoch: 30388, Training Loss: 0.01015
Epoch: 30389, Training Loss: 0.00792
Epoch: 30389, Training Loss: 0.00825
Epoch: 30389, Training Loss: 0.00826
Epoch: 30389, Training Loss: 0.01014
Epoch: 30390, Training Loss: 0.00792
Epoch: 30390, Training Loss: 0.00825
Epoch: 30390, Training Loss: 0.00826
Epoch: 30390, Training Loss: 0.01014
Epoch: 30391, Training Loss: 0.00792
Epoch: 30391, Training Loss: 0.00825
Epoch: 30391, Training Loss: 0.00826
Epoch: 30391, Training Loss: 0.01014
Epoch: 30392, Training Loss: 0.00792
Epoch: 30392, Training Loss: 0.00825
Epoch: 30392, Training Loss: 0.00826
Epoch: 30392, Training Loss: 0.01014
Epoch: 30393, Training Loss: 0.00792
Epoch: 30393, Training Loss: 0.00825
Epoch: 30393, Training Loss: 0.00826
Epoch: 30393, Training Loss: 0.01014
Epoch: 30394, Training Loss: 0.00792
Epoch: 30394, Training Loss: 0.00825
Epoch: 30394, Training Loss: 0.00826
Epoch: 30394, Training Loss: 0.01014
Epoch: 30395, Training Loss: 0.00792
Epoch: 30395, Training Loss: 0.00825
Epoch: 30395, Training Loss: 0.00826
Epoch: 30395, Training Loss: 0.01014
Epoch: 30396, Training Loss: 0.00792
Epoch: 30396, Training Loss: 0.00825
Epoch: 30396, Training Loss: 0.00826
Epoch: 30396, Training Loss: 0.01014
Epoch: 30397, Training Loss: 0.00792
Epoch: 30397, Training Loss: 0.00825
Epoch: 30397, Training Loss: 0.00826
Epoch: 30397, Training Loss: 0.01014
Epoch: 30398, Training Loss: 0.00792
Epoch: 30398, Training Loss: 0.00825
Epoch: 30398, Training Loss: 0.00826
Epoch: 30398, Training Loss: 0.01014
Epoch: 30399, Training Loss: 0.00792
Epoch: 30399, Training Loss: 0.00825
Epoch: 30399, Training Loss: 0.00826
Epoch: 30399, Training Loss: 0.01014
Epoch: 30400, Training Loss: 0.00792
Epoch: 30400, Training Loss: 0.00825
Epoch: 30400, Training Loss: 0.00826
Epoch: 30400, Training Loss: 0.01014
Epoch: 30401, Training Loss: 0.00792
Epoch: 30401, Training Loss: 0.00825
Epoch: 30401, Training Loss: 0.00826
Epoch: 30401, Training Loss: 0.01014
Epoch: 30402, Training Loss: 0.00792
Epoch: 30402, Training Loss: 0.00825
Epoch: 30402, Training Loss: 0.00826
Epoch: 30402, Training Loss: 0.01014
Epoch: 30403, Training Loss: 0.00792
Epoch: 30403, Training Loss: 0.00825
Epoch: 30403, Training Loss: 0.00826
Epoch: 30403, Training Loss: 0.01014
Epoch: 30404, Training Loss: 0.00792
Epoch: 30404, Training Loss: 0.00825
Epoch: 30404, Training Loss: 0.00826
Epoch: 30404, Training Loss: 0.01014
Epoch: 30405, Training Loss: 0.00791
Epoch: 30405, Training Loss: 0.00825
Epoch: 30405, Training Loss: 0.00826
Epoch: 30405, Training Loss: 0.01014
Epoch: 30406, Training Loss: 0.00791
Epoch: 30406, Training Loss: 0.00825
Epoch: 30406, Training Loss: 0.00826
Epoch: 30406, Training Loss: 0.01014
Epoch: 30407, Training Loss: 0.00791
Epoch: 30407, Training Loss: 0.00825
Epoch: 30407, Training Loss: 0.00826
Epoch: 30407, Training Loss: 0.01014
Epoch: 30408, Training Loss: 0.00791
Epoch: 30408, Training Loss: 0.00825
Epoch: 30408, Training Loss: 0.00826
Epoch: 30408, Training Loss: 0.01014
Epoch: 30409, Training Loss: 0.00791
Epoch: 30409, Training Loss: 0.00825
Epoch: 30409, Training Loss: 0.00826
Epoch: 30409, Training Loss: 0.01014
Epoch: 30410, Training Loss: 0.00791
Epoch: 30410, Training Loss: 0.00825
Epoch: 30410, Training Loss: 0.00826
Epoch: 30410, Training Loss: 0.01014
Epoch: 30411, Training Loss: 0.00791
Epoch: 30411, Training Loss: 0.00825
Epoch: 30411, Training Loss: 0.00826
Epoch: 30411, Training Loss: 0.01014
Epoch: 30412, Training Loss: 0.00791
Epoch: 30412, Training Loss: 0.00825
Epoch: 30412, Training Loss: 0.00826
Epoch: 30412, Training Loss: 0.01014
Epoch: 30413, Training Loss: 0.00791
Epoch: 30413, Training Loss: 0.00825
Epoch: 30413, Training Loss: 0.00826
Epoch: 30413, Training Loss: 0.01014
Epoch: 30414, Training Loss: 0.00791
Epoch: 30414, Training Loss: 0.00825
Epoch: 30414, Training Loss: 0.00826
Epoch: 30414, Training Loss: 0.01014
Epoch: 30415, Training Loss: 0.00791
Epoch: 30415, Training Loss: 0.00825
Epoch: 30415, Training Loss: 0.00826
Epoch: 30415, Training Loss: 0.01014
Epoch: 30416, Training Loss: 0.00791
Epoch: 30416, Training Loss: 0.00825
Epoch: 30416, Training Loss: 0.00826
Epoch: 30416, Training Loss: 0.01014
Epoch: 30417, Training Loss: 0.00791
Epoch: 30417, Training Loss: 0.00825
Epoch: 30417, Training Loss: 0.00826
Epoch: 30417, Training Loss: 0.01014
Epoch: 30418, Training Loss: 0.00791
Epoch: 30418, Training Loss: 0.00825
Epoch: 30418, Training Loss: 0.00825
Epoch: 30418, Training Loss: 0.01014
Epoch: 30419, Training Loss: 0.00791
Epoch: 30419, Training Loss: 0.00824
Epoch: 30419, Training Loss: 0.00825
Epoch: 30419, Training Loss: 0.01014
Epoch: 30420, Training Loss: 0.00791
Epoch: 30420, Training Loss: 0.00824
Epoch: 30420, Training Loss: 0.00825
Epoch: 30420, Training Loss: 0.01014
Epoch: 30421, Training Loss: 0.00791
Epoch: 30421, Training Loss: 0.00824
Epoch: 30421, Training Loss: 0.00825
Epoch: 30421, Training Loss: 0.01014
Epoch: 30422, Training Loss: 0.00791
Epoch: 30422, Training Loss: 0.00824
Epoch: 30422, Training Loss: 0.00825
Epoch: 30422, Training Loss: 0.01014
Epoch: 30423, Training Loss: 0.00791
Epoch: 30423, Training Loss: 0.00824
Epoch: 30423, Training Loss: 0.00825
Epoch: 30423, Training Loss: 0.01014
Epoch: 30424, Training Loss: 0.00791
Epoch: 30424, Training Loss: 0.00824
Epoch: 30424, Training Loss: 0.00825
Epoch: 30424, Training Loss: 0.01014
Epoch: 30425, Training Loss: 0.00791
Epoch: 30425, Training Loss: 0.00824
Epoch: 30425, Training Loss: 0.00825
Epoch: 30425, Training Loss: 0.01014
Epoch: 30426, Training Loss: 0.00791
Epoch: 30426, Training Loss: 0.00824
Epoch: 30426, Training Loss: 0.00825
Epoch: 30426, Training Loss: 0.01014
Epoch: 30427, Training Loss: 0.00791
Epoch: 30427, Training Loss: 0.00824
Epoch: 30427, Training Loss: 0.00825
Epoch: 30427, Training Loss: 0.01014
Epoch: 30428, Training Loss: 0.00791
Epoch: 30428, Training Loss: 0.00824
Epoch: 30428, Training Loss: 0.00825
Epoch: 30428, Training Loss: 0.01014
Epoch: 30429, Training Loss: 0.00791
Epoch: 30429, Training Loss: 0.00824
Epoch: 30429, Training Loss: 0.00825
Epoch: 30429, Training Loss: 0.01014
Epoch: 30430, Training Loss: 0.00791
Epoch: 30430, Training Loss: 0.00824
Epoch: 30430, Training Loss: 0.00825
Epoch: 30430, Training Loss: 0.01014
Epoch: 30431, Training Loss: 0.00791
Epoch: 30431, Training Loss: 0.00824
Epoch: 30431, Training Loss: 0.00825
Epoch: 30431, Training Loss: 0.01014
Epoch: 30432, Training Loss: 0.00791
Epoch: 30432, Training Loss: 0.00824
Epoch: 30432, Training Loss: 0.00825
Epoch: 30432, Training Loss: 0.01014
Epoch: 30433, Training Loss: 0.00791
Epoch: 30433, Training Loss: 0.00824
Epoch: 30433, Training Loss: 0.00825
Epoch: 30433, Training Loss: 0.01014
Epoch: 30434, Training Loss: 0.00791
Epoch: 30434, Training Loss: 0.00824
Epoch: 30434, Training Loss: 0.00825
Epoch: 30434, Training Loss: 0.01014
Epoch: 30435, Training Loss: 0.00791
Epoch: 30435, Training Loss: 0.00824
Epoch: 30435, Training Loss: 0.00825
Epoch: 30435, Training Loss: 0.01014
Epoch: 30436, Training Loss: 0.00791
Epoch: 30436, Training Loss: 0.00824
Epoch: 30436, Training Loss: 0.00825
Epoch: 30436, Training Loss: 0.01014
Epoch: 30437, Training Loss: 0.00791
Epoch: 30437, Training Loss: 0.00824
Epoch: 30437, Training Loss: 0.00825
Epoch: 30437, Training Loss: 0.01014
Epoch: 30438, Training Loss: 0.00791
Epoch: 30438, Training Loss: 0.00824
Epoch: 30438, Training Loss: 0.00825
Epoch: 30438, Training Loss: 0.01014
Epoch: 30439, Training Loss: 0.00791
Epoch: 30439, Training Loss: 0.00824
Epoch: 30439, Training Loss: 0.00825
Epoch: 30439, Training Loss: 0.01014
Epoch: 30440, Training Loss: 0.00791
Epoch: 30440, Training Loss: 0.00824
Epoch: 30440, Training Loss: 0.00825
Epoch: 30440, Training Loss: 0.01014
Epoch: 30441, Training Loss: 0.00791
Epoch: 30441, Training Loss: 0.00824
Epoch: 30441, Training Loss: 0.00825
Epoch: 30441, Training Loss: 0.01014
Epoch: 30442, Training Loss: 0.00791
Epoch: 30442, Training Loss: 0.00824
Epoch: 30442, Training Loss: 0.00825
Epoch: 30442, Training Loss: 0.01013
Epoch: 30443, Training Loss: 0.00791
Epoch: 30443, Training Loss: 0.00824
Epoch: 30443, Training Loss: 0.00825
Epoch: 30443, Training Loss: 0.01013
Epoch: 30444, Training Loss: 0.00791
Epoch: 30444, Training Loss: 0.00824
Epoch: 30444, Training Loss: 0.00825
Epoch: 30444, Training Loss: 0.01013
Epoch: 30445, Training Loss: 0.00791
Epoch: 30445, Training Loss: 0.00824
Epoch: 30445, Training Loss: 0.00825
Epoch: 30445, Training Loss: 0.01013
Epoch: 30446, Training Loss: 0.00791
Epoch: 30446, Training Loss: 0.00824
Epoch: 30446, Training Loss: 0.00825
Epoch: 30446, Training Loss: 0.01013
Epoch: 30447, Training Loss: 0.00791
Epoch: 30447, Training Loss: 0.00824
Epoch: 30447, Training Loss: 0.00825
Epoch: 30447, Training Loss: 0.01013
Epoch: 30448, Training Loss: 0.00791
Epoch: 30448, Training Loss: 0.00824
Epoch: 30448, Training Loss: 0.00825
Epoch: 30448, Training Loss: 0.01013
Epoch: 30449, Training Loss: 0.00791
Epoch: 30449, Training Loss: 0.00824
Epoch: 30449, Training Loss: 0.00825
Epoch: 30449, Training Loss: 0.01013
Epoch: 30450, Training Loss: 0.00791
Epoch: 30450, Training Loss: 0.00824
Epoch: 30450, Training Loss: 0.00825
Epoch: 30450, Training Loss: 0.01013
Epoch: 30451, Training Loss: 0.00791
Epoch: 30451, Training Loss: 0.00824
Epoch: 30451, Training Loss: 0.00825
Epoch: 30451, Training Loss: 0.01013
Epoch: 30452, Training Loss: 0.00791
Epoch: 30452, Training Loss: 0.00824
Epoch: 30452, Training Loss: 0.00825
Epoch: 30452, Training Loss: 0.01013
Epoch: 30453, Training Loss: 0.00791
Epoch: 30453, Training Loss: 0.00824
Epoch: 30453, Training Loss: 0.00825
Epoch: 30453, Training Loss: 0.01013
Epoch: 30454, Training Loss: 0.00791
Epoch: 30454, Training Loss: 0.00824
Epoch: 30454, Training Loss: 0.00825
Epoch: 30454, Training Loss: 0.01013
Epoch: 30455, Training Loss: 0.00791
Epoch: 30455, Training Loss: 0.00824
Epoch: 30455, Training Loss: 0.00825
Epoch: 30455, Training Loss: 0.01013
Epoch: 30456, Training Loss: 0.00791
Epoch: 30456, Training Loss: 0.00824
Epoch: 30456, Training Loss: 0.00825
Epoch: 30456, Training Loss: 0.01013
Epoch: 30457, Training Loss: 0.00791
Epoch: 30457, Training Loss: 0.00824
Epoch: 30457, Training Loss: 0.00825
Epoch: 30457, Training Loss: 0.01013
Epoch: 30458, Training Loss: 0.00791
Epoch: 30458, Training Loss: 0.00824
Epoch: 30458, Training Loss: 0.00825
Epoch: 30458, Training Loss: 0.01013
Epoch: 30459, Training Loss: 0.00791
Epoch: 30459, Training Loss: 0.00824
Epoch: 30459, Training Loss: 0.00825
Epoch: 30459, Training Loss: 0.01013
Epoch: 30460, Training Loss: 0.00791
Epoch: 30460, Training Loss: 0.00824
Epoch: 30460, Training Loss: 0.00825
Epoch: 30460, Training Loss: 0.01013
Epoch: 30461, Training Loss: 0.00791
Epoch: 30461, Training Loss: 0.00824
Epoch: 30461, Training Loss: 0.00825
Epoch: 30461, Training Loss: 0.01013
Epoch: 30462, Training Loss: 0.00791
Epoch: 30462, Training Loss: 0.00824
Epoch: 30462, Training Loss: 0.00825
Epoch: 30462, Training Loss: 0.01013
Epoch: 30463, Training Loss: 0.00791
Epoch: 30463, Training Loss: 0.00824
Epoch: 30463, Training Loss: 0.00825
Epoch: 30463, Training Loss: 0.01013
Epoch: 30464, Training Loss: 0.00791
Epoch: 30464, Training Loss: 0.00824
Epoch: 30464, Training Loss: 0.00825
Epoch: 30464, Training Loss: 0.01013
Epoch: 30465, Training Loss: 0.00791
Epoch: 30465, Training Loss: 0.00824
Epoch: 30465, Training Loss: 0.00825
Epoch: 30465, Training Loss: 0.01013
Epoch: 30466, Training Loss: 0.00791
Epoch: 30466, Training Loss: 0.00824
Epoch: 30466, Training Loss: 0.00825
Epoch: 30466, Training Loss: 0.01013
Epoch: 30467, Training Loss: 0.00791
Epoch: 30467, Training Loss: 0.00824
Epoch: 30467, Training Loss: 0.00825
Epoch: 30467, Training Loss: 0.01013
Epoch: 30468, Training Loss: 0.00791
Epoch: 30468, Training Loss: 0.00824
Epoch: 30468, Training Loss: 0.00825
Epoch: 30468, Training Loss: 0.01013
Epoch: 30469, Training Loss: 0.00791
Epoch: 30469, Training Loss: 0.00824
Epoch: 30469, Training Loss: 0.00825
Epoch: 30469, Training Loss: 0.01013
Epoch: 30470, Training Loss: 0.00791
Epoch: 30470, Training Loss: 0.00824
Epoch: 30470, Training Loss: 0.00825
Epoch: 30470, Training Loss: 0.01013
Epoch: 30471, Training Loss: 0.00791
Epoch: 30471, Training Loss: 0.00824
Epoch: 30471, Training Loss: 0.00825
Epoch: 30471, Training Loss: 0.01013
Epoch: 30472, Training Loss: 0.00791
Epoch: 30472, Training Loss: 0.00824
Epoch: 30472, Training Loss: 0.00825
Epoch: 30472, Training Loss: 0.01013
Epoch: 30473, Training Loss: 0.00791
Epoch: 30473, Training Loss: 0.00824
Epoch: 30473, Training Loss: 0.00825
Epoch: 30473, Training Loss: 0.01013
Epoch: 30474, Training Loss: 0.00791
Epoch: 30474, Training Loss: 0.00824
Epoch: 30474, Training Loss: 0.00825
Epoch: 30474, Training Loss: 0.01013
Epoch: 30475, Training Loss: 0.00790
Epoch: 30475, Training Loss: 0.00824
Epoch: 30475, Training Loss: 0.00825
Epoch: 30475, Training Loss: 0.01013
Epoch: 30476, Training Loss: 0.00790
Epoch: 30476, Training Loss: 0.00824
Epoch: 30476, Training Loss: 0.00825
Epoch: 30476, Training Loss: 0.01013
Epoch: 30477, Training Loss: 0.00790
Epoch: 30477, Training Loss: 0.00824
Epoch: 30477, Training Loss: 0.00825
Epoch: 30477, Training Loss: 0.01013
Epoch: 30478, Training Loss: 0.00790
Epoch: 30478, Training Loss: 0.00824
Epoch: 30478, Training Loss: 0.00825
Epoch: 30478, Training Loss: 0.01013
Epoch: 30479, Training Loss: 0.00790
Epoch: 30479, Training Loss: 0.00824
Epoch: 30479, Training Loss: 0.00825
Epoch: 30479, Training Loss: 0.01013
Epoch: 30480, Training Loss: 0.00790
Epoch: 30480, Training Loss: 0.00824
Epoch: 30480, Training Loss: 0.00825
Epoch: 30480, Training Loss: 0.01013
Epoch: 30481, Training Loss: 0.00790
Epoch: 30481, Training Loss: 0.00824
Epoch: 30481, Training Loss: 0.00825
Epoch: 30481, Training Loss: 0.01013
Epoch: 30482, Training Loss: 0.00790
Epoch: 30482, Training Loss: 0.00824
Epoch: 30482, Training Loss: 0.00825
Epoch: 30482, Training Loss: 0.01013
Epoch: 30483, Training Loss: 0.00790
Epoch: 30483, Training Loss: 0.00824
Epoch: 30483, Training Loss: 0.00824
Epoch: 30483, Training Loss: 0.01013
Epoch: 30484, Training Loss: 0.00790
Epoch: 30484, Training Loss: 0.00824
Epoch: 30484, Training Loss: 0.00824
Epoch: 30484, Training Loss: 0.01013
Epoch: 30485, Training Loss: 0.00790
Epoch: 30485, Training Loss: 0.00823
Epoch: 30485, Training Loss: 0.00824
Epoch: 30485, Training Loss: 0.01013
Epoch: 30486, Training Loss: 0.00790
Epoch: 30486, Training Loss: 0.00823
Epoch: 30486, Training Loss: 0.00824
Epoch: 30486, Training Loss: 0.01013
Epoch: 30487, Training Loss: 0.00790
Epoch: 30487, Training Loss: 0.00823
Epoch: 30487, Training Loss: 0.00824
Epoch: 30487, Training Loss: 0.01013
Epoch: 30488, Training Loss: 0.00790
Epoch: 30488, Training Loss: 0.00823
Epoch: 30488, Training Loss: 0.00824
Epoch: 30488, Training Loss: 0.01013
Epoch: 30489, Training Loss: 0.00790
Epoch: 30489, Training Loss: 0.00823
Epoch: 30489, Training Loss: 0.00824
Epoch: 30489, Training Loss: 0.01013
Epoch: 30490, Training Loss: 0.00790
Epoch: 30490, Training Loss: 0.00823
Epoch: 30490, Training Loss: 0.00824
Epoch: 30490, Training Loss: 0.01013
Epoch: 30491, Training Loss: 0.00790
Epoch: 30491, Training Loss: 0.00823
Epoch: 30491, Training Loss: 0.00824
Epoch: 30491, Training Loss: 0.01013
Epoch: 30492, Training Loss: 0.00790
Epoch: 30492, Training Loss: 0.00823
Epoch: 30492, Training Loss: 0.00824
Epoch: 30492, Training Loss: 0.01013
Epoch: 30493, Training Loss: 0.00790
Epoch: 30493, Training Loss: 0.00823
Epoch: 30493, Training Loss: 0.00824
Epoch: 30493, Training Loss: 0.01013
Epoch: 30494, Training Loss: 0.00790
Epoch: 30494, Training Loss: 0.00823
Epoch: 30494, Training Loss: 0.00824
Epoch: 30494, Training Loss: 0.01013
Epoch: 30495, Training Loss: 0.00790
Epoch: 30495, Training Loss: 0.00823
Epoch: 30495, Training Loss: 0.00824
Epoch: 30495, Training Loss: 0.01012
Epoch: 30496, Training Loss: 0.00790
Epoch: 30496, Training Loss: 0.00823
Epoch: 30496, Training Loss: 0.00824
Epoch: 30496, Training Loss: 0.01012
Epoch: 30497, Training Loss: 0.00790
Epoch: 30497, Training Loss: 0.00823
Epoch: 30497, Training Loss: 0.00824
Epoch: 30497, Training Loss: 0.01012
Epoch: 30498, Training Loss: 0.00790
Epoch: 30498, Training Loss: 0.00823
Epoch: 30498, Training Loss: 0.00824
Epoch: 30498, Training Loss: 0.01012
Epoch: 30499, Training Loss: 0.00790
Epoch: 30499, Training Loss: 0.00823
Epoch: 30499, Training Loss: 0.00824
Epoch: 30499, Training Loss: 0.01012
Epoch: 30500, Training Loss: 0.00790
Epoch: 30500, Training Loss: 0.00823
Epoch: 30500, Training Loss: 0.00824
Epoch: 30500, Training Loss: 0.01012
Epoch: 30501, Training Loss: 0.00790
Epoch: 30501, Training Loss: 0.00823
Epoch: 30501, Training Loss: 0.00824
Epoch: 30501, Training Loss: 0.01012
Epoch: 30502, Training Loss: 0.00790
Epoch: 30502, Training Loss: 0.00823
Epoch: 30502, Training Loss: 0.00824
Epoch: 30502, Training Loss: 0.01012
Epoch: 30503, Training Loss: 0.00790
Epoch: 30503, Training Loss: 0.00823
Epoch: 30503, Training Loss: 0.00824
Epoch: 30503, Training Loss: 0.01012
Epoch: 30504, Training Loss: 0.00790
Epoch: 30504, Training Loss: 0.00823
Epoch: 30504, Training Loss: 0.00824
Epoch: 30504, Training Loss: 0.01012
Epoch: 30505, Training Loss: 0.00790
Epoch: 30505, Training Loss: 0.00823
Epoch: 30505, Training Loss: 0.00824
Epoch: 30505, Training Loss: 0.01012
Epoch: 30506, Training Loss: 0.00790
Epoch: 30506, Training Loss: 0.00823
Epoch: 30506, Training Loss: 0.00824
Epoch: 30506, Training Loss: 0.01012
Epoch: 30507, Training Loss: 0.00790
Epoch: 30507, Training Loss: 0.00823
Epoch: 30507, Training Loss: 0.00824
Epoch: 30507, Training Loss: 0.01012
Epoch: 30508, Training Loss: 0.00790
Epoch: 30508, Training Loss: 0.00823
Epoch: 30508, Training Loss: 0.00824
Epoch: 30508, Training Loss: 0.01012
Epoch: 30509, Training Loss: 0.00790
Epoch: 30509, Training Loss: 0.00823
Epoch: 30509, Training Loss: 0.00824
Epoch: 30509, Training Loss: 0.01012
Epoch: 30510, Training Loss: 0.00790
Epoch: 30510, Training Loss: 0.00823
Epoch: 30510, Training Loss: 0.00824
Epoch: 30510, Training Loss: 0.01012
Epoch: 30511, Training Loss: 0.00790
Epoch: 30511, Training Loss: 0.00823
Epoch: 30511, Training Loss: 0.00824
Epoch: 30511, Training Loss: 0.01012
Epoch: 30512, Training Loss: 0.00790
Epoch: 30512, Training Loss: 0.00823
Epoch: 30512, Training Loss: 0.00824
Epoch: 30512, Training Loss: 0.01012
Epoch: 30513, Training Loss: 0.00790
Epoch: 30513, Training Loss: 0.00823
Epoch: 30513, Training Loss: 0.00824
Epoch: 30513, Training Loss: 0.01012
Epoch: 30514, Training Loss: 0.00790
Epoch: 30514, Training Loss: 0.00823
Epoch: 30514, Training Loss: 0.00824
Epoch: 30514, Training Loss: 0.01012
Epoch: 30515, Training Loss: 0.00790
Epoch: 30515, Training Loss: 0.00823
Epoch: 30515, Training Loss: 0.00824
Epoch: 30515, Training Loss: 0.01012
Epoch: 30516, Training Loss: 0.00790
Epoch: 30516, Training Loss: 0.00823
Epoch: 30516, Training Loss: 0.00824
Epoch: 30516, Training Loss: 0.01012
Epoch: 30517, Training Loss: 0.00790
Epoch: 30517, Training Loss: 0.00823
Epoch: 30517, Training Loss: 0.00824
Epoch: 30517, Training Loss: 0.01012
Epoch: 30518, Training Loss: 0.00790
Epoch: 30518, Training Loss: 0.00823
Epoch: 30518, Training Loss: 0.00824
Epoch: 30518, Training Loss: 0.01012
Epoch: 30519, Training Loss: 0.00790
Epoch: 30519, Training Loss: 0.00823
Epoch: 30519, Training Loss: 0.00824
Epoch: 30519, Training Loss: 0.01012
Epoch: 30520, Training Loss: 0.00790
Epoch: 30520, Training Loss: 0.00823
Epoch: 30520, Training Loss: 0.00824
Epoch: 30520, Training Loss: 0.01012
Epoch: 30521, Training Loss: 0.00790
Epoch: 30521, Training Loss: 0.00823
Epoch: 30521, Training Loss: 0.00824
Epoch: 30521, Training Loss: 0.01012
Epoch: 30522, Training Loss: 0.00790
Epoch: 30522, Training Loss: 0.00823
Epoch: 30522, Training Loss: 0.00824
Epoch: 30522, Training Loss: 0.01012
Epoch: 30523, Training Loss: 0.00790
Epoch: 30523, Training Loss: 0.00823
Epoch: 30523, Training Loss: 0.00824
Epoch: 30523, Training Loss: 0.01012
Epoch: 30524, Training Loss: 0.00790
Epoch: 30524, Training Loss: 0.00823
Epoch: 30524, Training Loss: 0.00824
Epoch: 30524, Training Loss: 0.01012
Epoch: 30525, Training Loss: 0.00790
Epoch: 30525, Training Loss: 0.00823
Epoch: 30525, Training Loss: 0.00824
Epoch: 30525, Training Loss: 0.01012
Epoch: 30526, Training Loss: 0.00790
Epoch: 30526, Training Loss: 0.00823
Epoch: 30526, Training Loss: 0.00824
Epoch: 30526, Training Loss: 0.01012
Epoch: 30527, Training Loss: 0.00790
Epoch: 30527, Training Loss: 0.00823
Epoch: 30527, Training Loss: 0.00824
Epoch: 30527, Training Loss: 0.01012
Epoch: 30528, Training Loss: 0.00790
Epoch: 30528, Training Loss: 0.00823
Epoch: 30528, Training Loss: 0.00824
Epoch: 30528, Training Loss: 0.01012
Epoch: 30529, Training Loss: 0.00790
Epoch: 30529, Training Loss: 0.00823
Epoch: 30529, Training Loss: 0.00824
Epoch: 30529, Training Loss: 0.01012
Epoch: 30530, Training Loss: 0.00790
Epoch: 30530, Training Loss: 0.00823
Epoch: 30530, Training Loss: 0.00824
Epoch: 30530, Training Loss: 0.01012
Epoch: 30531, Training Loss: 0.00790
Epoch: 30531, Training Loss: 0.00823
Epoch: 30531, Training Loss: 0.00824
Epoch: 30531, Training Loss: 0.01012
Epoch: 30532, Training Loss: 0.00790
Epoch: 30532, Training Loss: 0.00823
Epoch: 30532, Training Loss: 0.00824
Epoch: 30532, Training Loss: 0.01012
Epoch: 30533, Training Loss: 0.00790
Epoch: 30533, Training Loss: 0.00823
Epoch: 30533, Training Loss: 0.00824
Epoch: 30533, Training Loss: 0.01012
Epoch: 30534, Training Loss: 0.00790
Epoch: 30534, Training Loss: 0.00823
Epoch: 30534, Training Loss: 0.00824
Epoch: 30534, Training Loss: 0.01012
Epoch: 30535, Training Loss: 0.00790
Epoch: 30535, Training Loss: 0.00823
Epoch: 30535, Training Loss: 0.00824
Epoch: 30535, Training Loss: 0.01012
Epoch: 30536, Training Loss: 0.00790
Epoch: 30536, Training Loss: 0.00823
Epoch: 30536, Training Loss: 0.00824
Epoch: 30536, Training Loss: 0.01012
Epoch: 30537, Training Loss: 0.00790
Epoch: 30537, Training Loss: 0.00823
Epoch: 30537, Training Loss: 0.00824
Epoch: 30537, Training Loss: 0.01012
Epoch: 30538, Training Loss: 0.00790
Epoch: 30538, Training Loss: 0.00823
Epoch: 30538, Training Loss: 0.00824
Epoch: 30538, Training Loss: 0.01012
Epoch: 30539, Training Loss: 0.00790
Epoch: 30783, Training Loss: 0.00786
Epoch: 30783, Training Loss: 0.00819
Epoch: 30783, Training Loss: 0.00820
Epoch: 30783, Training Loss: 0.01007
Epoch: 30784, Training Loss: 0.00786
Epoch: 30784, Training Loss: 0.00819
Epoch: 30784, Training Loss: 0.00820
Epoch: 30784, Training Loss: 0.01007
Epoch: 30785, Training Loss: 0.00786
Epoch: 30785, Training Loss: 0.00819
Epoch: 30785, Training Loss: 0.00820
Epoch: 30785, Training Loss: 0.01007
Epoch: 30786, Training Loss: 0.00786
Epoch: 30786, Training Loss: 0.00819
Epoch: 30786, Training Loss: 0.00820
Epoch: 30786, Training Loss: 0.01007
Epoch: 30787, Training Loss: 0.00786
Epoch: 30787, Training Loss: 0.00819
Epoch: 30787, Training Loss: 0.00820
Epoch: 30787, Training Loss: 0.01007
Epoch: 30788, Training Loss: 0.00786
Epoch: 30788, Training Loss: 0.00819
Epoch: 30788, Training Loss: 0.00820
Epoch: 30788, Training Loss: 0.01007
Epoch: 30789, Training Loss: 0.00786
Epoch: 30789, Training Loss: 0.00819
Epoch: 30789, Training Loss: 0.00820
Epoch: 30789, Training Loss: 0.01007
Epoch: 30790, Training Loss: 0.00786
Epoch: 30790, Training Loss: 0.00819
Epoch: 30790, Training Loss: 0.00820
Epoch: 30790, Training Loss: 0.01007
Epoch: 30791, Training Loss: 0.00786
Epoch: 30791, Training Loss: 0.00819
Epoch: 30791, Training Loss: 0.00820
Epoch: 30791, Training Loss: 0.01007
Epoch: 30792, Training Loss: 0.00786
Epoch: 30792, Training Loss: 0.00819
Epoch: 30792, Training Loss: 0.00820
Epoch: 30792, Training Loss: 0.01007
Epoch: 30793, Training Loss: 0.00786
Epoch: 30793, Training Loss: 0.00819
Epoch: 30793, Training Loss: 0.00820
Epoch: 30793, Training Loss: 0.01007
Epoch: 30794, Training Loss: 0.00786
Epoch: 30794, Training Loss: 0.00819
Epoch: 30794, Training Loss: 0.00820
Epoch: 30794, Training Loss: 0.01007
Epoch: 30795, Training Loss: 0.00786
Epoch: 30795, Training Loss: 0.00819
Epoch: 30795, Training Loss: 0.00820
Epoch: 30795, Training Loss: 0.01007
Epoch: 30796, Training Loss: 0.00786
Epoch: 30796, Training Loss: 0.00819
Epoch: 30796, Training Loss: 0.00820
Epoch: 30796, Training Loss: 0.01007
Epoch: 30797, Training Loss: 0.00786
Epoch: 30797, Training Loss: 0.00819
Epoch: 30797, Training Loss: 0.00820
Epoch: 30797, Training Loss: 0.01007
Epoch: 30798, Training Loss: 0.00786
Epoch: 30798, Training Loss: 0.00819
Epoch: 30798, Training Loss: 0.00820
Epoch: 30798, Training Loss: 0.01007
Epoch: 30799, Training Loss: 0.00786
Epoch: 30799, Training Loss: 0.00819
Epoch: 30799, Training Loss: 0.00820
Epoch: 30799, Training Loss: 0.01007
Epoch: 30800, Training Loss: 0.00786
Epoch: 30800, Training Loss: 0.00819
Epoch: 30800, Training Loss: 0.00820
Epoch: 30800, Training Loss: 0.01007
Epoch: 30801, Training Loss: 0.00786
Epoch: 30801, Training Loss: 0.00819
Epoch: 30801, Training Loss: 0.00820
Epoch: 30801, Training Loss: 0.01007
Epoch: 30802, Training Loss: 0.00786
Epoch: 30802, Training Loss: 0.00819
Epoch: 30802, Training Loss: 0.00820
Epoch: 30802, Training Loss: 0.01007
Epoch: 30803, Training Loss: 0.00786
Epoch: 30803, Training Loss: 0.00819
Epoch: 30803, Training Loss: 0.00820
Epoch: 30803, Training Loss: 0.01007
Epoch: 30804, Training Loss: 0.00786
Epoch: 30804, Training Loss: 0.00819
Epoch: 30804, Training Loss: 0.00820
Epoch: 30804, Training Loss: 0.01007
Epoch: 30805, Training Loss: 0.00786
Epoch: 30805, Training Loss: 0.00819
Epoch: 30805, Training Loss: 0.00820
Epoch: 30805, Training Loss: 0.01007
Epoch: 30806, Training Loss: 0.00786
Epoch: 30806, Training Loss: 0.00819
Epoch: 30806, Training Loss: 0.00820
Epoch: 30806, Training Loss: 0.01007
Epoch: 30807, Training Loss: 0.00786
Epoch: 30807, Training Loss: 0.00819
Epoch: 30807, Training Loss: 0.00820
Epoch: 30807, Training Loss: 0.01007
Epoch: 30808, Training Loss: 0.00786
Epoch: 30808, Training Loss: 0.00819
Epoch: 30808, Training Loss: 0.00820
Epoch: 30808, Training Loss: 0.01007
Epoch: 30809, Training Loss: 0.00786
Epoch: 30809, Training Loss: 0.00819
Epoch: 30809, Training Loss: 0.00820
Epoch: 30809, Training Loss: 0.01007
Epoch: 30810, Training Loss: 0.00786
Epoch: 30810, Training Loss: 0.00819
Epoch: 30810, Training Loss: 0.00820
Epoch: 30810, Training Loss: 0.01007
Epoch: 30811, Training Loss: 0.00786
Epoch: 30811, Training Loss: 0.00819
Epoch: 30811, Training Loss: 0.00820
Epoch: 30811, Training Loss: 0.01007
Epoch: 30812, Training Loss: 0.00786
Epoch: 30812, Training Loss: 0.00819
Epoch: 30812, Training Loss: 0.00820
Epoch: 30812, Training Loss: 0.01007
Epoch: 30813, Training Loss: 0.00786
Epoch: 30813, Training Loss: 0.00819
Epoch: 30813, Training Loss: 0.00820
Epoch: 30813, Training Loss: 0.01007
Epoch: 30814, Training Loss: 0.00786
Epoch: 30814, Training Loss: 0.00819
Epoch: 30814, Training Loss: 0.00820
Epoch: 30814, Training Loss: 0.01007
Epoch: 30815, Training Loss: 0.00786
Epoch: 30815, Training Loss: 0.00819
Epoch: 30815, Training Loss: 0.00819
Epoch: 30815, Training Loss: 0.01006
Epoch: 30816, Training Loss: 0.00786
Epoch: 30816, Training Loss: 0.00819
Epoch: 30816, Training Loss: 0.00819
Epoch: 30816, Training Loss: 0.01006
Epoch: 30817, Training Loss: 0.00786
Epoch: 30817, Training Loss: 0.00818
Epoch: 30817, Training Loss: 0.00819
Epoch: 30817, Training Loss: 0.01006
Epoch: 30818, Training Loss: 0.00786
Epoch: 30818, Training Loss: 0.00818
Epoch: 30818, Training Loss: 0.00819
Epoch: 30818, Training Loss: 0.01006
Epoch: 30819, Training Loss: 0.00786
Epoch: 30819, Training Loss: 0.00818
Epoch: 30819, Training Loss: 0.00819
Epoch: 30819, Training Loss: 0.01006
Epoch: 30820, Training Loss: 0.00786
Epoch: 30820, Training Loss: 0.00818
Epoch: 30820, Training Loss: 0.00819
Epoch: 30820, Training Loss: 0.01006
Epoch: 30821, Training Loss: 0.00786
Epoch: 30821, Training Loss: 0.00818
Epoch: 30821, Training Loss: 0.00819
Epoch: 30821, Training Loss: 0.01006
Epoch: 30822, Training Loss: 0.00786
Epoch: 30822, Training Loss: 0.00818
Epoch: 30822, Training Loss: 0.00819
Epoch: 30822, Training Loss: 0.01006
Epoch: 30823, Training Loss: 0.00786
Epoch: 30823, Training Loss: 0.00818
Epoch: 30823, Training Loss: 0.00819
Epoch: 30823, Training Loss: 0.01006
Epoch: 30824, Training Loss: 0.00786
Epoch: 30824, Training Loss: 0.00818
Epoch: 30824, Training Loss: 0.00819
Epoch: 30824, Training Loss: 0.01006
Epoch: 30825, Training Loss: 0.00786
Epoch: 30825, Training Loss: 0.00818
Epoch: 30825, Training Loss: 0.00819
Epoch: 30825, Training Loss: 0.01006
Epoch: 30826, Training Loss: 0.00786
Epoch: 30826, Training Loss: 0.00818
Epoch: 30826, Training Loss: 0.00819
Epoch: 30826, Training Loss: 0.01006
Epoch: 30827, Training Loss: 0.00786
Epoch: 30827, Training Loss: 0.00818
Epoch: 30827, Training Loss: 0.00819
Epoch: 30827, Training Loss: 0.01006
Epoch: 30828, Training Loss: 0.00786
Epoch: 30828, Training Loss: 0.00818
Epoch: 30828, Training Loss: 0.00819
Epoch: 30828, Training Loss: 0.01006
Epoch: 30829, Training Loss: 0.00785
Epoch: 30829, Training Loss: 0.00818
Epoch: 30829, Training Loss: 0.00819
Epoch: 30829, Training Loss: 0.01006
Epoch: 30830, Training Loss: 0.00785
Epoch: 30830, Training Loss: 0.00818
Epoch: 30830, Training Loss: 0.00819
Epoch: 30830, Training Loss: 0.01006
Epoch: 30831, Training Loss: 0.00785
Epoch: 30831, Training Loss: 0.00818
Epoch: 30831, Training Loss: 0.00819
Epoch: 30831, Training Loss: 0.01006
Epoch: 30832, Training Loss: 0.00785
Epoch: 30832, Training Loss: 0.00818
Epoch: 30832, Training Loss: 0.00819
Epoch: 30832, Training Loss: 0.01006
Epoch: 30833, Training Loss: 0.00785
Epoch: 30833, Training Loss: 0.00818
Epoch: 30833, Training Loss: 0.00819
Epoch: 30833, Training Loss: 0.01006
Epoch: 30834, Training Loss: 0.00785
Epoch: 30834, Training Loss: 0.00818
Epoch: 30834, Training Loss: 0.00819
Epoch: 30834, Training Loss: 0.01006
Epoch: 30835, Training Loss: 0.00785
Epoch: 30835, Training Loss: 0.00818
Epoch: 30835, Training Loss: 0.00819
Epoch: 30835, Training Loss: 0.01006
Epoch: 30836, Training Loss: 0.00785
Epoch: 30836, Training Loss: 0.00818
Epoch: 30836, Training Loss: 0.00819
Epoch: 30836, Training Loss: 0.01006
Epoch: 30837, Training Loss: 0.00785
Epoch: 30837, Training Loss: 0.00818
Epoch: 30837, Training Loss: 0.00819
Epoch: 30837, Training Loss: 0.01006
Epoch: 30838, Training Loss: 0.00785
Epoch: 30838, Training Loss: 0.00818
Epoch: 30838, Training Loss: 0.00819
Epoch: 30838, Training Loss: 0.01006
Epoch: 30839, Training Loss: 0.00785
Epoch: 30839, Training Loss: 0.00818
Epoch: 30839, Training Loss: 0.00819
Epoch: 30839, Training Loss: 0.01006
Epoch: 30840, Training Loss: 0.00785
Epoch: 30840, Training Loss: 0.00818
Epoch: 30840, Training Loss: 0.00819
Epoch: 30840, Training Loss: 0.01006
Epoch: 30841, Training Loss: 0.00785
Epoch: 30841, Training Loss: 0.00818
Epoch: 30841, Training Loss: 0.00819
Epoch: 30841, Training Loss: 0.01006
Epoch: 30842, Training Loss: 0.00785
Epoch: 30842, Training Loss: 0.00818
Epoch: 30842, Training Loss: 0.00819
Epoch: 30842, Training Loss: 0.01006
Epoch: 30843, Training Loss: 0.00785
Epoch: 30843, Training Loss: 0.00818
Epoch: 30843, Training Loss: 0.00819
Epoch: 30843, Training Loss: 0.01006
Epoch: 30844, Training Loss: 0.00785
Epoch: 30844, Training Loss: 0.00818
Epoch: 30844, Training Loss: 0.00819
Epoch: 30844, Training Loss: 0.01006
Epoch: 30845, Training Loss: 0.00785
Epoch: 30845, Training Loss: 0.00818
Epoch: 30845, Training Loss: 0.00819
Epoch: 30845, Training Loss: 0.01006
Epoch: 30846, Training Loss: 0.00785
Epoch: 30846, Training Loss: 0.00818
Epoch: 30846, Training Loss: 0.00819
Epoch: 30846, Training Loss: 0.01006
Epoch: 30847, Training Loss: 0.00785
Epoch: 30847, Training Loss: 0.00818
Epoch: 30847, Training Loss: 0.00819
Epoch: 30847, Training Loss: 0.01006
Epoch: 30848, Training Loss: 0.00785
Epoch: 30848, Training Loss: 0.00818
Epoch: 30848, Training Loss: 0.00819
Epoch: 30848, Training Loss: 0.01006
Epoch: 30849, Training Loss: 0.00785
Epoch: 30849, Training Loss: 0.00818
Epoch: 30849, Training Loss: 0.00819
Epoch: 30849, Training Loss: 0.01006
Epoch: 30850, Training Loss: 0.00785
Epoch: 30850, Training Loss: 0.00818
Epoch: 30850, Training Loss: 0.00819
Epoch: 30850, Training Loss: 0.01006
Epoch: 30851, Training Loss: 0.00785
Epoch: 30851, Training Loss: 0.00818
Epoch: 30851, Training Loss: 0.00819
Epoch: 30851, Training Loss: 0.01006
Epoch: 30852, Training Loss: 0.00785
Epoch: 30852, Training Loss: 0.00818
Epoch: 30852, Training Loss: 0.00819
Epoch: 30852, Training Loss: 0.01006
Epoch: 30853, Training Loss: 0.00785
Epoch: 30853, Training Loss: 0.00818
Epoch: 30853, Training Loss: 0.00819
Epoch: 30853, Training Loss: 0.01006
Epoch: 30854, Training Loss: 0.00785
Epoch: 30854, Training Loss: 0.00818
Epoch: 30854, Training Loss: 0.00819
Epoch: 30854, Training Loss: 0.01006
Epoch: 30855, Training Loss: 0.00785
Epoch: 30855, Training Loss: 0.00818
Epoch: 30855, Training Loss: 0.00819
Epoch: 30855, Training Loss: 0.01006
Epoch: 30856, Training Loss: 0.00785
Epoch: 30856, Training Loss: 0.00818
Epoch: 30856, Training Loss: 0.00819
Epoch: 30856, Training Loss: 0.01006
Epoch: 30857, Training Loss: 0.00785
Epoch: 30857, Training Loss: 0.00818
Epoch: 30857, Training Loss: 0.00819
Epoch: 30857, Training Loss: 0.01006
Epoch: 30858, Training Loss: 0.00785
Epoch: 30858, Training Loss: 0.00818
Epoch: 30858, Training Loss: 0.00819
Epoch: 30858, Training Loss: 0.01006
Epoch: 30859, Training Loss: 0.00785
Epoch: 30859, Training Loss: 0.00818
Epoch: 30859, Training Loss: 0.00819
Epoch: 30859, Training Loss: 0.01006
Epoch: 30860, Training Loss: 0.00785
Epoch: 30860, Training Loss: 0.00818
Epoch: 30860, Training Loss: 0.00819
Epoch: 30860, Training Loss: 0.01006
Epoch: 30861, Training Loss: 0.00785
Epoch: 30861, Training Loss: 0.00818
Epoch: 30861, Training Loss: 0.00819
Epoch: 30861, Training Loss: 0.01006
Epoch: 30862, Training Loss: 0.00785
Epoch: 30862, Training Loss: 0.00818
Epoch: 30862, Training Loss: 0.00819
Epoch: 30862, Training Loss: 0.01006
Epoch: 30863, Training Loss: 0.00785
Epoch: 30863, Training Loss: 0.00818
Epoch: 30863, Training Loss: 0.00819
Epoch: 30863, Training Loss: 0.01006
Epoch: 30864, Training Loss: 0.00785
Epoch: 30864, Training Loss: 0.00818
Epoch: 30864, Training Loss: 0.00819
Epoch: 30864, Training Loss: 0.01006
Epoch: 30865, Training Loss: 0.00785
Epoch: 30865, Training Loss: 0.00818
Epoch: 30865, Training Loss: 0.00819
Epoch: 30865, Training Loss: 0.01006
Epoch: 30866, Training Loss: 0.00785
Epoch: 30866, Training Loss: 0.00818
Epoch: 30866, Training Loss: 0.00819
Epoch: 30866, Training Loss: 0.01006
Epoch: 30867, Training Loss: 0.00785
Epoch: 30867, Training Loss: 0.00818
Epoch: 30867, Training Loss: 0.00819
Epoch: 30867, Training Loss: 0.01006
Epoch: 30868, Training Loss: 0.00785
Epoch: 30868, Training Loss: 0.00818
Epoch: 30868, Training Loss: 0.00819
Epoch: 30868, Training Loss: 0.01006
Epoch: 30869, Training Loss: 0.00785
Epoch: 30869, Training Loss: 0.00818
Epoch: 30869, Training Loss: 0.00819
Epoch: 30869, Training Loss: 0.01005
Epoch: 30870, Training Loss: 0.00785
Epoch: 30870, Training Loss: 0.00818
Epoch: 30870, Training Loss: 0.00819
Epoch: 30870, Training Loss: 0.01005
Epoch: 30871, Training Loss: 0.00785
Epoch: 30871, Training Loss: 0.00818
Epoch: 30871, Training Loss: 0.00819
Epoch: 30871, Training Loss: 0.01005
Epoch: 30872, Training Loss: 0.00785
Epoch: 30872, Training Loss: 0.00818
Epoch: 30872, Training Loss: 0.00819
Epoch: 30872, Training Loss: 0.01005
Epoch: 30873, Training Loss: 0.00785
Epoch: 30873, Training Loss: 0.00818
Epoch: 30873, Training Loss: 0.00819
Epoch: 30873, Training Loss: 0.01005
Epoch: 30874, Training Loss: 0.00785
Epoch: 30874, Training Loss: 0.00818
Epoch: 30874, Training Loss: 0.00819
Epoch: 30874, Training Loss: 0.01005
Epoch: 30875, Training Loss: 0.00785
Epoch: 30875, Training Loss: 0.00818
Epoch: 30875, Training Loss: 0.00819
Epoch: 30875, Training Loss: 0.01005
Epoch: 30876, Training Loss: 0.00785
Epoch: 30876, Training Loss: 0.00818
Epoch: 30876, Training Loss: 0.00819
Epoch: 30876, Training Loss: 0.01005
Epoch: 30877, Training Loss: 0.00785
Epoch: 30877, Training Loss: 0.00818
Epoch: 30877, Training Loss: 0.00819
Epoch: 30877, Training Loss: 0.01005
Epoch: 30878, Training Loss: 0.00785
Epoch: 30878, Training Loss: 0.00818
Epoch: 30878, Training Loss: 0.00819
Epoch: 30878, Training Loss: 0.01005
Epoch: 30879, Training Loss: 0.00785
Epoch: 30879, Training Loss: 0.00818
Epoch: 30879, Training Loss: 0.00819
Epoch: 30879, Training Loss: 0.01005
Epoch: 30880, Training Loss: 0.00785
Epoch: 30880, Training Loss: 0.00818
Epoch: 30880, Training Loss: 0.00819
Epoch: 30880, Training Loss: 0.01005
Epoch: 30881, Training Loss: 0.00785
Epoch: 30881, Training Loss: 0.00818
Epoch: 30881, Training Loss: 0.00819
Epoch: 30881, Training Loss: 0.01005
Epoch: 30882, Training Loss: 0.00785
Epoch: 30882, Training Loss: 0.00818
Epoch: 30882, Training Loss: 0.00818
Epoch: 30882, Training Loss: 0.01005
Epoch: 30883, Training Loss: 0.00785
Epoch: 30883, Training Loss: 0.00818
Epoch: 30883, Training Loss: 0.00818
Epoch: 30883, Training Loss: 0.01005
Epoch: 30884, Training Loss: 0.00785
Epoch: 30884, Training Loss: 0.00817
Epoch: 30884, Training Loss: 0.00818
Epoch: 30884, Training Loss: 0.01005
Epoch: 30885, Training Loss: 0.00785
Epoch: 30885, Training Loss: 0.00817
Epoch: 30885, Training Loss: 0.00818
Epoch: 30885, Training Loss: 0.01005
Epoch: 30886, Training Loss: 0.00785
Epoch: 30886, Training Loss: 0.00817
Epoch: 30886, Training Loss: 0.00818
Epoch: 30886, Training Loss: 0.01005
Epoch: 30887, Training Loss: 0.00785
Epoch: 30887, Training Loss: 0.00817
Epoch: 30887, Training Loss: 0.00818
Epoch: 30887, Training Loss: 0.01005
Epoch: 30888, Training Loss: 0.00785
Epoch: 30888, Training Loss: 0.00817
Epoch: 30888, Training Loss: 0.00818
Epoch: 30888, Training Loss: 0.01005
Epoch: 30889, Training Loss: 0.00785
Epoch: 30889, Training Loss: 0.00817
Epoch: 30889, Training Loss: 0.00818
Epoch: 30889, Training Loss: 0.01005
Epoch: 30890, Training Loss: 0.00785
Epoch: 30890, Training Loss: 0.00817
Epoch: 30890, Training Loss: 0.00818
Epoch: 30890, Training Loss: 0.01005
Epoch: 30891, Training Loss: 0.00785
Epoch: 30891, Training Loss: 0.00817
Epoch: 30891, Training Loss: 0.00818
Epoch: 30891, Training Loss: 0.01005
Epoch: 30892, Training Loss: 0.00785
Epoch: 30892, Training Loss: 0.00817
Epoch: 30892, Training Loss: 0.00818
Epoch: 30892, Training Loss: 0.01005
Epoch: 30893, Training Loss: 0.00785
Epoch: 30893, Training Loss: 0.00817
Epoch: 30893, Training Loss: 0.00818
Epoch: 30893, Training Loss: 0.01005
Epoch: 30894, Training Loss: 0.00785
Epoch: 30894, Training Loss: 0.00817
Epoch: 30894, Training Loss: 0.00818
Epoch: 30894, Training Loss: 0.01005
Epoch: 30895, Training Loss: 0.00785
Epoch: 30895, Training Loss: 0.00817
Epoch: 30895, Training Loss: 0.00818
Epoch: 30895, Training Loss: 0.01005
Epoch: 30896, Training Loss: 0.00785
Epoch: 30896, Training Loss: 0.00817
Epoch: 30896, Training Loss: 0.00818
Epoch: 30896, Training Loss: 0.01005
Epoch: 30897, Training Loss: 0.00785
Epoch: 30897, Training Loss: 0.00817
Epoch: 30897, Training Loss: 0.00818
Epoch: 30897, Training Loss: 0.01005
Epoch: 30898, Training Loss: 0.00785
Epoch: 30898, Training Loss: 0.00817
Epoch: 30898, Training Loss: 0.00818
Epoch: 30898, Training Loss: 0.01005
Epoch: 30899, Training Loss: 0.00785
Epoch: 30899, Training Loss: 0.00817
Epoch: 30899, Training Loss: 0.00818
Epoch: 30899, Training Loss: 0.01005
Epoch: 30900, Training Loss: 0.00784
Epoch: 30900, Training Loss: 0.00817
Epoch: 30900, Training Loss: 0.00818
Epoch: 30900, Training Loss: 0.01005
Epoch: 30901, Training Loss: 0.00784
Epoch: 30901, Training Loss: 0.00817
Epoch: 30901, Training Loss: 0.00818
Epoch: 30901, Training Loss: 0.01005
Epoch: 30902, Training Loss: 0.00784
Epoch: 30902, Training Loss: 0.00817
Epoch: 30902, Training Loss: 0.00818
Epoch: 30902, Training Loss: 0.01005
Epoch: 30903, Training Loss: 0.00784
Epoch: 30903, Training Loss: 0.00817
Epoch: 30903, Training Loss: 0.00818
Epoch: 30903, Training Loss: 0.01005
Epoch: 30904, Training Loss: 0.00784
Epoch: 30904, Training Loss: 0.00817
Epoch: 30904, Training Loss: 0.00818
Epoch: 30904, Training Loss: 0.01005
Epoch: 30905, Training Loss: 0.00784
Epoch: 30905, Training Loss: 0.00817
Epoch: 30905, Training Loss: 0.00818
Epoch: 30905, Training Loss: 0.01005
Epoch: 30906, Training Loss: 0.00784
Epoch: 30906, Training Loss: 0.00817
Epoch: 30906, Training Loss: 0.00818
Epoch: 30906, Training Loss: 0.01005
Epoch: 30907, Training Loss: 0.00784
Epoch: 30907, Training Loss: 0.00817
Epoch: 30907, Training Loss: 0.00818
Epoch: 30907, Training Loss: 0.01005
Epoch: 30908, Training Loss: 0.00784
Epoch: 30908, Training Loss: 0.00817
Epoch: 30908, Training Loss: 0.00818
Epoch: 30908, Training Loss: 0.01005
Epoch: 30909, Training Loss: 0.00784
Epoch: 30909, Training Loss: 0.00817
Epoch: 30909, Training Loss: 0.00818
Epoch: 30909, Training Loss: 0.01005
Epoch: 30910, Training Loss: 0.00784
Epoch: 30910, Training Loss: 0.00817
Epoch: 30910, Training Loss: 0.00818
Epoch: 30910, Training Loss: 0.01005
Epoch: 30911, Training Loss: 0.00784
Epoch: 30911, Training Loss: 0.00817
Epoch: 30911, Training Loss: 0.00818
Epoch: 30911, Training Loss: 0.01005
Epoch: 30912, Training Loss: 0.00784
Epoch: 30912, Training Loss: 0.00817
Epoch: 30912, Training Loss: 0.00818
Epoch: 30912, Training Loss: 0.01005
Epoch: 30913, Training Loss: 0.00784
Epoch: 30913, Training Loss: 0.00817
Epoch: 30913, Training Loss: 0.00818
Epoch: 30913, Training Loss: 0.01005
Epoch: 30914, Training Loss: 0.00784
Epoch: 30914, Training Loss: 0.00817
Epoch: 30914, Training Loss: 0.00818
Epoch: 30914, Training Loss: 0.01005
Epoch: 30915, Training Loss: 0.00784
Epoch: 30915, Training Loss: 0.00817
Epoch: 30915, Training Loss: 0.00818
Epoch: 30915, Training Loss: 0.01005
Epoch: 30916, Training Loss: 0.00784
Epoch: 30916, Training Loss: 0.00817
Epoch: 30916, Training Loss: 0.00818
Epoch: 30916, Training Loss: 0.01005
Epoch: 30917, Training Loss: 0.00784
Epoch: 30917, Training Loss: 0.00817
Epoch: 30917, Training Loss: 0.00818
Epoch: 30917, Training Loss: 0.01005
Epoch: 30918, Training Loss: 0.00784
Epoch: 30918, Training Loss: 0.00817
Epoch: 30918, Training Loss: 0.00818
Epoch: 30918, Training Loss: 0.01005
Epoch: 30919, Training Loss: 0.00784
Epoch: 30919, Training Loss: 0.00817
Epoch: 30919, Training Loss: 0.00818
Epoch: 30919, Training Loss: 0.01005
Epoch: 30920, Training Loss: 0.00784
Epoch: 30920, Training Loss: 0.00817
Epoch: 30920, Training Loss: 0.00818
Epoch: 30920, Training Loss: 0.01005
Epoch: 30921, Training Loss: 0.00784
Epoch: 30921, Training Loss: 0.00817
Epoch: 30921, Training Loss: 0.00818
Epoch: 30921, Training Loss: 0.01005
Epoch: 30922, Training Loss: 0.00784
Epoch: 30922, Training Loss: 0.00817
Epoch: 30922, Training Loss: 0.00818
Epoch: 30922, Training Loss: 0.01005
Epoch: 30923, Training Loss: 0.00784
Epoch: 30923, Training Loss: 0.00817
Epoch: 30923, Training Loss: 0.00818
Epoch: 30923, Training Loss: 0.01004
Epoch: 30924, Training Loss: 0.00784
Epoch: 30924, Training Loss: 0.00817
Epoch: 30924, Training Loss: 0.00818
Epoch: 30924, Training Loss: 0.01004
Epoch: 30925, Training Loss: 0.00784
Epoch: 30925, Training Loss: 0.00817
Epoch: 30925, Training Loss: 0.00818
Epoch: 30925, Training Loss: 0.01004
Epoch: 30926, Training Loss: 0.00784
Epoch: 30926, Training Loss: 0.00817
Epoch: 30926, Training Loss: 0.00818
Epoch: 30926, Training Loss: 0.01004
Epoch: 30927, Training Loss: 0.00784
Epoch: 30927, Training Loss: 0.00817
Epoch: 30927, Training Loss: 0.00818
Epoch: 30927, Training Loss: 0.01004
Epoch: 30928, Training Loss: 0.00784
Epoch: 30928, Training Loss: 0.00817
Epoch: 30928, Training Loss: 0.00818
Epoch: 30928, Training Loss: 0.01004
Epoch: 30929, Training Loss: 0.00784
Epoch: 30929, Training Loss: 0.00817
Epoch: 30929, Training Loss: 0.00818
Epoch: 30929, Training Loss: 0.01004
Epoch: 30930, Training Loss: 0.00784
Epoch: 30930, Training Loss: 0.00817
Epoch: 30930, Training Loss: 0.00818
Epoch: 30930, Training Loss: 0.01004
Epoch: 30931, Training Loss: 0.00784
Epoch: 30931, Training Loss: 0.00817
Epoch: 30931, Training Loss: 0.00818
Epoch: 30931, Training Loss: 0.01004
Epoch: 30932, Training Loss: 0.00784
Epoch: 30932, Training Loss: 0.00817
Epoch: 30932, Training Loss: 0.00818
Epoch: 30932, Training Loss: 0.01004
Epoch: 30933, Training Loss: 0.00784
Epoch: 30933, Training Loss: 0.00817
Epoch: 30933, Training Loss: 0.00818
Epoch: 30933, Training Loss: 0.01004
Epoch: 30934, Training Loss: 0.00784
Epoch: 30934, Training Loss: 0.00817
Epoch: 30934, Training Loss: 0.00818
Epoch: 30934, Training Loss: 0.01004
Epoch: 30935, Training Loss: 0.00784
Epoch: 30935, Training Loss: 0.00817
Epoch: 30935, Training Loss: 0.00818
Epoch: 30935, Training Loss: 0.01004
Epoch: 30936, Training Loss: 0.00784
Epoch: 30936, Training Loss: 0.00817
Epoch: 30936, Training Loss: 0.00818
Epoch: 30936, Training Loss: 0.01004
Epoch: 30937, Training Loss: 0.00784
Epoch: 30937, Training Loss: 0.00817
Epoch: 30937, Training Loss: 0.00818
Epoch: 30937, Training Loss: 0.01004
Epoch: 30938, Training Loss: 0.00784
Epoch: 30938, Training Loss: 0.00817
Epoch: 30938, Training Loss: 0.00818
Epoch: 30938, Training Loss: 0.01004
Epoch: 30939, Training Loss: 0.00784
Epoch: 30939, Training Loss: 0.00817
Epoch: 30939, Training Loss: 0.00818
Epoch: 30939, Training Loss: 0.01004
Epoch: 30940, Training Loss: 0.00784
Epoch: 30940, Training Loss: 0.00817
Epoch: 30940, Training Loss: 0.00818
Epoch: 30940, Training Loss: 0.01004
Epoch: 30941, Training Loss: 0.00784
Epoch: 30941, Training Loss: 0.00817
Epoch: 30941, Training Loss: 0.00818
Epoch: 30941, Training Loss: 0.01004
Epoch: 30942, Training Loss: 0.00784
Epoch: 30942, Training Loss: 0.00817
Epoch: 30942, Training Loss: 0.00818
Epoch: 30942, Training Loss: 0.01004
Epoch: 30943, Training Loss: 0.00784
Epoch: 30943, Training Loss: 0.00817
Epoch: 30943, Training Loss: 0.00818
Epoch: 30943, Training Loss: 0.01004
Epoch: 30944, Training Loss: 0.00784
Epoch: 30944, Training Loss: 0.00817
Epoch: 30944, Training Loss: 0.00818
Epoch: 30944, Training Loss: 0.01004
Epoch: 30945, Training Loss: 0.00784
Epoch: 30945, Training Loss: 0.00817
Epoch: 30945, Training Loss: 0.00818
Epoch: 30945, Training Loss: 0.01004
Epoch: 30946, Training Loss: 0.00784
Epoch: 30946, Training Loss: 0.00817
Epoch: 30946, Training Loss: 0.00818
Epoch: 30946, Training Loss: 0.01004
Epoch: 30947, Training Loss: 0.00784
Epoch: 30947, Training Loss: 0.00817
Epoch: 30947, Training Loss: 0.00818
Epoch: 30947, Training Loss: 0.01004
Epoch: 30948, Training Loss: 0.00784
Epoch: 30948, Training Loss: 0.00817
Epoch: 30948, Training Loss: 0.00818
Epoch: 30948, Training Loss: 0.01004
Epoch: 30949, Training Loss: 0.00784
Epoch: 30949, Training Loss: 0.00817
Epoch: 30949, Training Loss: 0.00817
Epoch: 30949, Training Loss: 0.01004
Epoch: 30950, Training Loss: 0.00784
Epoch: 30950, Training Loss: 0.00817
Epoch: 30950, Training Loss: 0.00817
Epoch: 30950, Training Loss: 0.01004
Epoch: 30951, Training Loss: 0.00784
Epoch: 30951, Training Loss: 0.00816
Epoch: 30951, Training Loss: 0.00817
Epoch: 30951, Training Loss: 0.01004
Epoch: 30952, Training Loss: 0.00784
Epoch: 30952, Training Loss: 0.00816
Epoch: 30952, Training Loss: 0.00817
Epoch: 30952, Training Loss: 0.01004
Epoch: 30953, Training Loss: 0.00784
Epoch: 30953, Training Loss: 0.00816
Epoch: 30953, Training Loss: 0.00817
Epoch: 30953, Training Loss: 0.01004
Epoch: 30954, Training Loss: 0.00784
Epoch: 30954, Training Loss: 0.00816
Epoch: 30954, Training Loss: 0.00817
Epoch: 30954, Training Loss: 0.01004
Epoch: 30955, Training Loss: 0.00784
Epoch: 30955, Training Loss: 0.00816
Epoch: 30955, Training Loss: 0.00817
Epoch: 30955, Training Loss: 0.01004
Epoch: 30956, Training Loss: 0.00784
Epoch: 30956, Training Loss: 0.00816
Epoch: 30956, Training Loss: 0.00817
Epoch: 30956, Training Loss: 0.01004
Epoch: 30957, Training Loss: 0.00784
Epoch: 30957, Training Loss: 0.00816
Epoch: 30957, Training Loss: 0.00817
Epoch: 30957, Training Loss: 0.01004
Epoch: 30958, Training Loss: 0.00784
Epoch: 30958, Training Loss: 0.00816
Epoch: 30958, Training Loss: 0.00817
Epoch: 30958, Training Loss: 0.01004
Epoch: 30959, Training Loss: 0.00784
Epoch: 30959, Training Loss: 0.00816
Epoch: 30959, Training Loss: 0.00817
Epoch: 30959, Training Loss: 0.01004
Epoch: 30960, Training Loss: 0.00784
Epoch: 30960, Training Loss: 0.00816
Epoch: 30960, Training Loss: 0.00817
Epoch: 30960, Training Loss: 0.01004
Epoch: 30961, Training Loss: 0.00784
Epoch: 30961, Training Loss: 0.00816
Epoch: 30961, Training Loss: 0.00817
Epoch: 30961, Training Loss: 0.01004
Epoch: 30962, Training Loss: 0.00784
Epoch: 30962, Training Loss: 0.00816
Epoch: 30962, Training Loss: 0.00817
Epoch: 30962, Training Loss: 0.01004
Epoch: 30963, Training Loss: 0.00784
Epoch: 30963, Training Loss: 0.00816
Epoch: 30963, Training Loss: 0.00817
Epoch: 30963, Training Loss: 0.01004
Epoch: 30964, Training Loss: 0.00784
Epoch: 30964, Training Loss: 0.00816
Epoch: 30964, Training Loss: 0.00817
Epoch: 30964, Training Loss: 0.01004
Epoch: 30965, Training Loss: 0.00784
Epoch: 30965, Training Loss: 0.00816
Epoch: 30965, Training Loss: 0.00817
Epoch: 30965, Training Loss: 0.01004
Epoch: 30966, Training Loss: 0.00784
Epoch: 30966, Training Loss: 0.00816
Epoch: 30966, Training Loss: 0.00817
Epoch: 30966, Training Loss: 0.01004
Epoch: 30967, Training Loss: 0.00784
Epoch: 30967, Training Loss: 0.00816
Epoch: 30967, Training Loss: 0.00817
Epoch: 30967, Training Loss: 0.01004
Epoch: 30968, Training Loss: 0.00784
Epoch: 30968, Training Loss: 0.00816
Epoch: 30968, Training Loss: 0.00817
Epoch: 30968, Training Loss: 0.01004
Epoch: 30969, Training Loss: 0.00784
Epoch: 30969, Training Loss: 0.00816
Epoch: 30969, Training Loss: 0.00817
Epoch: 30969, Training Loss: 0.01004
Epoch: 30970, Training Loss: 0.00784
Epoch: 30970, Training Loss: 0.00816
Epoch: 30970, Training Loss: 0.00817
Epoch: 30970, Training Loss: 0.01004
Epoch: 30971, Training Loss: 0.00784
Epoch: 30971, Training Loss: 0.00816
Epoch: 30971, Training Loss: 0.00817
Epoch: 30971, Training Loss: 0.01004
[... per-epoch output for epochs 30972–31214 omitted: the four per-pattern losses decrease very slowly, from 0.00783 / 0.00816 / 0.00817 / 0.01004 at epoch 30972 to 0.00780 / 0.00813 / 0.00814 / 0.00999 at epoch 31214 ...]
Epoch: 31215, Training Loss: 0.00780
Epoch: 31215, Training Loss: 0.00813
Epoch: 31215, Training Loss: 0.00814
Epoch: 31215, Training Loss: 0.00999
Epoch: 31216, Training Loss: 0.00780
Epoch: 31216, Training Loss: 0.00813
Epoch: 31216, Training Loss: 0.00814
Epoch: 31216, Training Loss: 0.00999
Epoch: 31217, Training Loss: 0.00780
Epoch: 31217, Training Loss: 0.00813
Epoch: 31217, Training Loss: 0.00814
Epoch: 31217, Training Loss: 0.00999
Epoch: 31218, Training Loss: 0.00780
Epoch: 31218, Training Loss: 0.00813
Epoch: 31218, Training Loss: 0.00814
Epoch: 31218, Training Loss: 0.00999
Epoch: 31219, Training Loss: 0.00780
Epoch: 31219, Training Loss: 0.00813
Epoch: 31219, Training Loss: 0.00814
Epoch: 31219, Training Loss: 0.00999
Epoch: 31220, Training Loss: 0.00780
Epoch: 31220, Training Loss: 0.00813
Epoch: 31220, Training Loss: 0.00814
Epoch: 31220, Training Loss: 0.00999
Epoch: 31221, Training Loss: 0.00780
Epoch: 31221, Training Loss: 0.00813
Epoch: 31221, Training Loss: 0.00813
Epoch: 31221, Training Loss: 0.00999
Epoch: 31222, Training Loss: 0.00780
Epoch: 31222, Training Loss: 0.00813
Epoch: 31222, Training Loss: 0.00813
Epoch: 31222, Training Loss: 0.00999
Epoch: 31223, Training Loss: 0.00780
Epoch: 31223, Training Loss: 0.00813
Epoch: 31223, Training Loss: 0.00813
Epoch: 31223, Training Loss: 0.00999
Epoch: 31224, Training Loss: 0.00780
Epoch: 31224, Training Loss: 0.00812
Epoch: 31224, Training Loss: 0.00813
Epoch: 31224, Training Loss: 0.00999
Epoch: 31225, Training Loss: 0.00780
Epoch: 31225, Training Loss: 0.00812
Epoch: 31225, Training Loss: 0.00813
Epoch: 31225, Training Loss: 0.00999
Epoch: 31226, Training Loss: 0.00780
Epoch: 31226, Training Loss: 0.00812
Epoch: 31226, Training Loss: 0.00813
Epoch: 31226, Training Loss: 0.00999
Epoch: 31227, Training Loss: 0.00780
Epoch: 31227, Training Loss: 0.00812
Epoch: 31227, Training Loss: 0.00813
Epoch: 31227, Training Loss: 0.00999
Epoch: 31228, Training Loss: 0.00780
Epoch: 31228, Training Loss: 0.00812
Epoch: 31228, Training Loss: 0.00813
Epoch: 31228, Training Loss: 0.00999
Epoch: 31229, Training Loss: 0.00780
Epoch: 31229, Training Loss: 0.00812
Epoch: 31229, Training Loss: 0.00813
Epoch: 31229, Training Loss: 0.00999
Epoch: 31230, Training Loss: 0.00780
Epoch: 31230, Training Loss: 0.00812
Epoch: 31230, Training Loss: 0.00813
Epoch: 31230, Training Loss: 0.00999
Epoch: 31231, Training Loss: 0.00780
Epoch: 31231, Training Loss: 0.00812
Epoch: 31231, Training Loss: 0.00813
Epoch: 31231, Training Loss: 0.00999
Epoch: 31232, Training Loss: 0.00780
Epoch: 31232, Training Loss: 0.00812
Epoch: 31232, Training Loss: 0.00813
Epoch: 31232, Training Loss: 0.00999
Epoch: 31233, Training Loss: 0.00780
Epoch: 31233, Training Loss: 0.00812
Epoch: 31233, Training Loss: 0.00813
Epoch: 31233, Training Loss: 0.00999
Epoch: 31234, Training Loss: 0.00780
Epoch: 31234, Training Loss: 0.00812
Epoch: 31234, Training Loss: 0.00813
Epoch: 31234, Training Loss: 0.00999
Epoch: 31235, Training Loss: 0.00780
Epoch: 31235, Training Loss: 0.00812
Epoch: 31235, Training Loss: 0.00813
Epoch: 31235, Training Loss: 0.00999
Epoch: 31236, Training Loss: 0.00780
Epoch: 31236, Training Loss: 0.00812
Epoch: 31236, Training Loss: 0.00813
Epoch: 31236, Training Loss: 0.00999
Epoch: 31237, Training Loss: 0.00780
Epoch: 31237, Training Loss: 0.00812
Epoch: 31237, Training Loss: 0.00813
Epoch: 31237, Training Loss: 0.00999
Epoch: 31238, Training Loss: 0.00780
Epoch: 31238, Training Loss: 0.00812
Epoch: 31238, Training Loss: 0.00813
Epoch: 31238, Training Loss: 0.00999
Epoch: 31239, Training Loss: 0.00780
Epoch: 31239, Training Loss: 0.00812
Epoch: 31239, Training Loss: 0.00813
Epoch: 31239, Training Loss: 0.00999
Epoch: 31240, Training Loss: 0.00780
Epoch: 31240, Training Loss: 0.00812
Epoch: 31240, Training Loss: 0.00813
Epoch: 31240, Training Loss: 0.00999
Epoch: 31241, Training Loss: 0.00780
Epoch: 31241, Training Loss: 0.00812
Epoch: 31241, Training Loss: 0.00813
Epoch: 31241, Training Loss: 0.00999
Epoch: 31242, Training Loss: 0.00780
Epoch: 31242, Training Loss: 0.00812
Epoch: 31242, Training Loss: 0.00813
Epoch: 31242, Training Loss: 0.00999
Epoch: 31243, Training Loss: 0.00780
Epoch: 31243, Training Loss: 0.00812
Epoch: 31243, Training Loss: 0.00813
Epoch: 31243, Training Loss: 0.00999
Epoch: 31244, Training Loss: 0.00780
Epoch: 31244, Training Loss: 0.00812
Epoch: 31244, Training Loss: 0.00813
Epoch: 31244, Training Loss: 0.00999
Epoch: 31245, Training Loss: 0.00780
Epoch: 31245, Training Loss: 0.00812
Epoch: 31245, Training Loss: 0.00813
Epoch: 31245, Training Loss: 0.00999
Epoch: 31246, Training Loss: 0.00780
Epoch: 31246, Training Loss: 0.00812
Epoch: 31246, Training Loss: 0.00813
Epoch: 31246, Training Loss: 0.00999
Epoch: 31247, Training Loss: 0.00780
Epoch: 31247, Training Loss: 0.00812
Epoch: 31247, Training Loss: 0.00813
Epoch: 31247, Training Loss: 0.00999
Epoch: 31248, Training Loss: 0.00780
Epoch: 31248, Training Loss: 0.00812
Epoch: 31248, Training Loss: 0.00813
Epoch: 31248, Training Loss: 0.00999
Epoch: 31249, Training Loss: 0.00780
Epoch: 31249, Training Loss: 0.00812
Epoch: 31249, Training Loss: 0.00813
Epoch: 31249, Training Loss: 0.00999
Epoch: 31250, Training Loss: 0.00780
Epoch: 31250, Training Loss: 0.00812
Epoch: 31250, Training Loss: 0.00813
Epoch: 31250, Training Loss: 0.00999
Epoch: 31251, Training Loss: 0.00780
Epoch: 31251, Training Loss: 0.00812
Epoch: 31251, Training Loss: 0.00813
Epoch: 31251, Training Loss: 0.00999
Epoch: 31252, Training Loss: 0.00780
Epoch: 31252, Training Loss: 0.00812
Epoch: 31252, Training Loss: 0.00813
Epoch: 31252, Training Loss: 0.00998
Epoch: 31253, Training Loss: 0.00780
Epoch: 31253, Training Loss: 0.00812
Epoch: 31253, Training Loss: 0.00813
Epoch: 31253, Training Loss: 0.00998
Epoch: 31254, Training Loss: 0.00780
Epoch: 31254, Training Loss: 0.00812
Epoch: 31254, Training Loss: 0.00813
Epoch: 31254, Training Loss: 0.00998
Epoch: 31255, Training Loss: 0.00780
Epoch: 31255, Training Loss: 0.00812
Epoch: 31255, Training Loss: 0.00813
Epoch: 31255, Training Loss: 0.00998
Epoch: 31256, Training Loss: 0.00780
Epoch: 31256, Training Loss: 0.00812
Epoch: 31256, Training Loss: 0.00813
Epoch: 31256, Training Loss: 0.00998
Epoch: 31257, Training Loss: 0.00780
Epoch: 31257, Training Loss: 0.00812
Epoch: 31257, Training Loss: 0.00813
Epoch: 31257, Training Loss: 0.00998
Epoch: 31258, Training Loss: 0.00780
Epoch: 31258, Training Loss: 0.00812
Epoch: 31258, Training Loss: 0.00813
Epoch: 31258, Training Loss: 0.00998
Epoch: 31259, Training Loss: 0.00780
Epoch: 31259, Training Loss: 0.00812
Epoch: 31259, Training Loss: 0.00813
Epoch: 31259, Training Loss: 0.00998
Epoch: 31260, Training Loss: 0.00780
Epoch: 31260, Training Loss: 0.00812
Epoch: 31260, Training Loss: 0.00813
Epoch: 31260, Training Loss: 0.00998
Epoch: 31261, Training Loss: 0.00780
Epoch: 31261, Training Loss: 0.00812
Epoch: 31261, Training Loss: 0.00813
Epoch: 31261, Training Loss: 0.00998
Epoch: 31262, Training Loss: 0.00779
Epoch: 31262, Training Loss: 0.00812
Epoch: 31262, Training Loss: 0.00813
Epoch: 31262, Training Loss: 0.00998
Epoch: 31263, Training Loss: 0.00779
Epoch: 31263, Training Loss: 0.00812
Epoch: 31263, Training Loss: 0.00813
Epoch: 31263, Training Loss: 0.00998
Epoch: 31264, Training Loss: 0.00779
Epoch: 31264, Training Loss: 0.00812
Epoch: 31264, Training Loss: 0.00813
Epoch: 31264, Training Loss: 0.00998
Epoch: 31265, Training Loss: 0.00779
Epoch: 31265, Training Loss: 0.00812
Epoch: 31265, Training Loss: 0.00813
Epoch: 31265, Training Loss: 0.00998
Epoch: 31266, Training Loss: 0.00779
Epoch: 31266, Training Loss: 0.00812
Epoch: 31266, Training Loss: 0.00813
Epoch: 31266, Training Loss: 0.00998
Epoch: 31267, Training Loss: 0.00779
Epoch: 31267, Training Loss: 0.00812
Epoch: 31267, Training Loss: 0.00813
Epoch: 31267, Training Loss: 0.00998
Epoch: 31268, Training Loss: 0.00779
Epoch: 31268, Training Loss: 0.00812
Epoch: 31268, Training Loss: 0.00813
Epoch: 31268, Training Loss: 0.00998
Epoch: 31269, Training Loss: 0.00779
Epoch: 31269, Training Loss: 0.00812
Epoch: 31269, Training Loss: 0.00813
Epoch: 31269, Training Loss: 0.00998
Epoch: 31270, Training Loss: 0.00779
Epoch: 31270, Training Loss: 0.00812
Epoch: 31270, Training Loss: 0.00813
Epoch: 31270, Training Loss: 0.00998
Epoch: 31271, Training Loss: 0.00779
Epoch: 31271, Training Loss: 0.00812
Epoch: 31271, Training Loss: 0.00813
Epoch: 31271, Training Loss: 0.00998
Epoch: 31272, Training Loss: 0.00779
Epoch: 31272, Training Loss: 0.00812
Epoch: 31272, Training Loss: 0.00813
Epoch: 31272, Training Loss: 0.00998
Epoch: 31273, Training Loss: 0.00779
Epoch: 31273, Training Loss: 0.00812
Epoch: 31273, Training Loss: 0.00813
Epoch: 31273, Training Loss: 0.00998
Epoch: 31274, Training Loss: 0.00779
Epoch: 31274, Training Loss: 0.00812
Epoch: 31274, Training Loss: 0.00813
Epoch: 31274, Training Loss: 0.00998
Epoch: 31275, Training Loss: 0.00779
Epoch: 31275, Training Loss: 0.00812
Epoch: 31275, Training Loss: 0.00813
Epoch: 31275, Training Loss: 0.00998
Epoch: 31276, Training Loss: 0.00779
Epoch: 31276, Training Loss: 0.00812
Epoch: 31276, Training Loss: 0.00813
Epoch: 31276, Training Loss: 0.00998
Epoch: 31277, Training Loss: 0.00779
Epoch: 31277, Training Loss: 0.00812
Epoch: 31277, Training Loss: 0.00813
Epoch: 31277, Training Loss: 0.00998
Epoch: 31278, Training Loss: 0.00779
Epoch: 31278, Training Loss: 0.00812
Epoch: 31278, Training Loss: 0.00813
Epoch: 31278, Training Loss: 0.00998
Epoch: 31279, Training Loss: 0.00779
Epoch: 31279, Training Loss: 0.00812
Epoch: 31279, Training Loss: 0.00813
Epoch: 31279, Training Loss: 0.00998
Epoch: 31280, Training Loss: 0.00779
Epoch: 31280, Training Loss: 0.00812
Epoch: 31280, Training Loss: 0.00813
Epoch: 31280, Training Loss: 0.00998
Epoch: 31281, Training Loss: 0.00779
Epoch: 31281, Training Loss: 0.00812
Epoch: 31281, Training Loss: 0.00813
Epoch: 31281, Training Loss: 0.00998
Epoch: 31282, Training Loss: 0.00779
Epoch: 31282, Training Loss: 0.00812
Epoch: 31282, Training Loss: 0.00813
Epoch: 31282, Training Loss: 0.00998
Epoch: 31283, Training Loss: 0.00779
Epoch: 31283, Training Loss: 0.00812
Epoch: 31283, Training Loss: 0.00813
Epoch: 31283, Training Loss: 0.00998
Epoch: 31284, Training Loss: 0.00779
Epoch: 31284, Training Loss: 0.00812
Epoch: 31284, Training Loss: 0.00813
Epoch: 31284, Training Loss: 0.00998
Epoch: 31285, Training Loss: 0.00779
Epoch: 31285, Training Loss: 0.00812
Epoch: 31285, Training Loss: 0.00813
Epoch: 31285, Training Loss: 0.00998
Epoch: 31286, Training Loss: 0.00779
Epoch: 31286, Training Loss: 0.00812
Epoch: 31286, Training Loss: 0.00813
Epoch: 31286, Training Loss: 0.00998
Epoch: 31287, Training Loss: 0.00779
Epoch: 31287, Training Loss: 0.00812
Epoch: 31287, Training Loss: 0.00813
Epoch: 31287, Training Loss: 0.00998
Epoch: 31288, Training Loss: 0.00779
Epoch: 31288, Training Loss: 0.00812
Epoch: 31288, Training Loss: 0.00813
Epoch: 31288, Training Loss: 0.00998
Epoch: 31289, Training Loss: 0.00779
Epoch: 31289, Training Loss: 0.00812
Epoch: 31289, Training Loss: 0.00813
Epoch: 31289, Training Loss: 0.00998
Epoch: 31290, Training Loss: 0.00779
Epoch: 31290, Training Loss: 0.00812
Epoch: 31290, Training Loss: 0.00812
Epoch: 31290, Training Loss: 0.00998
Epoch: 31291, Training Loss: 0.00779
Epoch: 31291, Training Loss: 0.00812
Epoch: 31291, Training Loss: 0.00812
Epoch: 31291, Training Loss: 0.00998
Epoch: 31292, Training Loss: 0.00779
Epoch: 31292, Training Loss: 0.00811
Epoch: 31292, Training Loss: 0.00812
Epoch: 31292, Training Loss: 0.00998
Epoch: 31293, Training Loss: 0.00779
Epoch: 31293, Training Loss: 0.00811
Epoch: 31293, Training Loss: 0.00812
Epoch: 31293, Training Loss: 0.00998
Epoch: 31294, Training Loss: 0.00779
Epoch: 31294, Training Loss: 0.00811
Epoch: 31294, Training Loss: 0.00812
Epoch: 31294, Training Loss: 0.00998
Epoch: 31295, Training Loss: 0.00779
Epoch: 31295, Training Loss: 0.00811
Epoch: 31295, Training Loss: 0.00812
Epoch: 31295, Training Loss: 0.00998
Epoch: 31296, Training Loss: 0.00779
Epoch: 31296, Training Loss: 0.00811
Epoch: 31296, Training Loss: 0.00812
Epoch: 31296, Training Loss: 0.00998
Epoch: 31297, Training Loss: 0.00779
Epoch: 31297, Training Loss: 0.00811
Epoch: 31297, Training Loss: 0.00812
Epoch: 31297, Training Loss: 0.00998
Epoch: 31298, Training Loss: 0.00779
Epoch: 31298, Training Loss: 0.00811
Epoch: 31298, Training Loss: 0.00812
Epoch: 31298, Training Loss: 0.00998
Epoch: 31299, Training Loss: 0.00779
Epoch: 31299, Training Loss: 0.00811
Epoch: 31299, Training Loss: 0.00812
Epoch: 31299, Training Loss: 0.00998
Epoch: 31300, Training Loss: 0.00779
Epoch: 31300, Training Loss: 0.00811
Epoch: 31300, Training Loss: 0.00812
Epoch: 31300, Training Loss: 0.00998
Epoch: 31301, Training Loss: 0.00779
Epoch: 31301, Training Loss: 0.00811
Epoch: 31301, Training Loss: 0.00812
Epoch: 31301, Training Loss: 0.00998
Epoch: 31302, Training Loss: 0.00779
Epoch: 31302, Training Loss: 0.00811
Epoch: 31302, Training Loss: 0.00812
Epoch: 31302, Training Loss: 0.00998
Epoch: 31303, Training Loss: 0.00779
Epoch: 31303, Training Loss: 0.00811
Epoch: 31303, Training Loss: 0.00812
Epoch: 31303, Training Loss: 0.00998
Epoch: 31304, Training Loss: 0.00779
Epoch: 31304, Training Loss: 0.00811
Epoch: 31304, Training Loss: 0.00812
Epoch: 31304, Training Loss: 0.00998
Epoch: 31305, Training Loss: 0.00779
Epoch: 31305, Training Loss: 0.00811
Epoch: 31305, Training Loss: 0.00812
Epoch: 31305, Training Loss: 0.00998
Epoch: 31306, Training Loss: 0.00779
Epoch: 31306, Training Loss: 0.00811
Epoch: 31306, Training Loss: 0.00812
Epoch: 31306, Training Loss: 0.00998
Epoch: 31307, Training Loss: 0.00779
Epoch: 31307, Training Loss: 0.00811
Epoch: 31307, Training Loss: 0.00812
Epoch: 31307, Training Loss: 0.00997
Epoch: 31308, Training Loss: 0.00779
Epoch: 31308, Training Loss: 0.00811
Epoch: 31308, Training Loss: 0.00812
Epoch: 31308, Training Loss: 0.00997
Epoch: 31309, Training Loss: 0.00779
Epoch: 31309, Training Loss: 0.00811
Epoch: 31309, Training Loss: 0.00812
Epoch: 31309, Training Loss: 0.00997
Epoch: 31310, Training Loss: 0.00779
Epoch: 31310, Training Loss: 0.00811
Epoch: 31310, Training Loss: 0.00812
Epoch: 31310, Training Loss: 0.00997
Epoch: 31311, Training Loss: 0.00779
Epoch: 31311, Training Loss: 0.00811
Epoch: 31311, Training Loss: 0.00812
Epoch: 31311, Training Loss: 0.00997
Epoch: 31312, Training Loss: 0.00779
Epoch: 31312, Training Loss: 0.00811
Epoch: 31312, Training Loss: 0.00812
Epoch: 31312, Training Loss: 0.00997
Epoch: 31313, Training Loss: 0.00779
Epoch: 31313, Training Loss: 0.00811
Epoch: 31313, Training Loss: 0.00812
Epoch: 31313, Training Loss: 0.00997
Epoch: 31314, Training Loss: 0.00779
Epoch: 31314, Training Loss: 0.00811
Epoch: 31314, Training Loss: 0.00812
Epoch: 31314, Training Loss: 0.00997
Epoch: 31315, Training Loss: 0.00779
Epoch: 31315, Training Loss: 0.00811
Epoch: 31315, Training Loss: 0.00812
Epoch: 31315, Training Loss: 0.00997
Epoch: 31316, Training Loss: 0.00779
Epoch: 31316, Training Loss: 0.00811
Epoch: 31316, Training Loss: 0.00812
Epoch: 31316, Training Loss: 0.00997
Epoch: 31317, Training Loss: 0.00779
Epoch: 31317, Training Loss: 0.00811
Epoch: 31317, Training Loss: 0.00812
Epoch: 31317, Training Loss: 0.00997
Epoch: 31318, Training Loss: 0.00779
Epoch: 31318, Training Loss: 0.00811
Epoch: 31318, Training Loss: 0.00812
Epoch: 31318, Training Loss: 0.00997
Epoch: 31319, Training Loss: 0.00779
Epoch: 31319, Training Loss: 0.00811
Epoch: 31319, Training Loss: 0.00812
Epoch: 31319, Training Loss: 0.00997
Epoch: 31320, Training Loss: 0.00779
Epoch: 31320, Training Loss: 0.00811
Epoch: 31320, Training Loss: 0.00812
Epoch: 31320, Training Loss: 0.00997
Epoch: 31321, Training Loss: 0.00779
Epoch: 31321, Training Loss: 0.00811
Epoch: 31321, Training Loss: 0.00812
Epoch: 31321, Training Loss: 0.00997
Epoch: 31322, Training Loss: 0.00779
Epoch: 31322, Training Loss: 0.00811
Epoch: 31322, Training Loss: 0.00812
Epoch: 31322, Training Loss: 0.00997
Epoch: 31323, Training Loss: 0.00779
Epoch: 31323, Training Loss: 0.00811
Epoch: 31323, Training Loss: 0.00812
Epoch: 31323, Training Loss: 0.00997
Epoch: 31324, Training Loss: 0.00779
Epoch: 31324, Training Loss: 0.00811
Epoch: 31324, Training Loss: 0.00812
Epoch: 31324, Training Loss: 0.00997
Epoch: 31325, Training Loss: 0.00779
Epoch: 31325, Training Loss: 0.00811
Epoch: 31325, Training Loss: 0.00812
Epoch: 31325, Training Loss: 0.00997
Epoch: 31326, Training Loss: 0.00779
Epoch: 31326, Training Loss: 0.00811
Epoch: 31326, Training Loss: 0.00812
Epoch: 31326, Training Loss: 0.00997
Epoch: 31327, Training Loss: 0.00779
Epoch: 31327, Training Loss: 0.00811
Epoch: 31327, Training Loss: 0.00812
Epoch: 31327, Training Loss: 0.00997
Epoch: 31328, Training Loss: 0.00779
Epoch: 31328, Training Loss: 0.00811
Epoch: 31328, Training Loss: 0.00812
Epoch: 31328, Training Loss: 0.00997
Epoch: 31329, Training Loss: 0.00779
Epoch: 31329, Training Loss: 0.00811
Epoch: 31329, Training Loss: 0.00812
Epoch: 31329, Training Loss: 0.00997
Epoch: 31330, Training Loss: 0.00779
Epoch: 31330, Training Loss: 0.00811
Epoch: 31330, Training Loss: 0.00812
Epoch: 31330, Training Loss: 0.00997
Epoch: 31331, Training Loss: 0.00779
Epoch: 31331, Training Loss: 0.00811
Epoch: 31331, Training Loss: 0.00812
Epoch: 31331, Training Loss: 0.00997
Epoch: 31332, Training Loss: 0.00779
Epoch: 31332, Training Loss: 0.00811
Epoch: 31332, Training Loss: 0.00812
Epoch: 31332, Training Loss: 0.00997
Epoch: 31333, Training Loss: 0.00779
Epoch: 31333, Training Loss: 0.00811
Epoch: 31333, Training Loss: 0.00812
Epoch: 31333, Training Loss: 0.00997
Epoch: 31334, Training Loss: 0.00779
Epoch: 31334, Training Loss: 0.00811
Epoch: 31334, Training Loss: 0.00812
Epoch: 31334, Training Loss: 0.00997
Epoch: 31335, Training Loss: 0.00779
Epoch: 31335, Training Loss: 0.00811
Epoch: 31335, Training Loss: 0.00812
Epoch: 31335, Training Loss: 0.00997
Epoch: 31336, Training Loss: 0.00778
Epoch: 31336, Training Loss: 0.00811
Epoch: 31336, Training Loss: 0.00812
Epoch: 31336, Training Loss: 0.00997
Epoch: 31337, Training Loss: 0.00778
Epoch: 31337, Training Loss: 0.00811
Epoch: 31337, Training Loss: 0.00812
Epoch: 31337, Training Loss: 0.00997
Epoch: 31338, Training Loss: 0.00778
Epoch: 31338, Training Loss: 0.00811
Epoch: 31338, Training Loss: 0.00812
Epoch: 31338, Training Loss: 0.00997
Epoch: 31339, Training Loss: 0.00778
Epoch: 31339, Training Loss: 0.00811
Epoch: 31339, Training Loss: 0.00812
Epoch: 31339, Training Loss: 0.00997
Epoch: 31340, Training Loss: 0.00778
Epoch: 31340, Training Loss: 0.00811
Epoch: 31340, Training Loss: 0.00812
Epoch: 31340, Training Loss: 0.00997
Epoch: 31341, Training Loss: 0.00778
Epoch: 31341, Training Loss: 0.00811
Epoch: 31341, Training Loss: 0.00812
Epoch: 31341, Training Loss: 0.00997
Epoch: 31342, Training Loss: 0.00778
Epoch: 31342, Training Loss: 0.00811
Epoch: 31342, Training Loss: 0.00812
Epoch: 31342, Training Loss: 0.00997
Epoch: 31343, Training Loss: 0.00778
Epoch: 31343, Training Loss: 0.00811
Epoch: 31343, Training Loss: 0.00812
Epoch: 31343, Training Loss: 0.00997
Epoch: 31344, Training Loss: 0.00778
Epoch: 31344, Training Loss: 0.00811
Epoch: 31344, Training Loss: 0.00812
Epoch: 31344, Training Loss: 0.00997
Epoch: 31345, Training Loss: 0.00778
Epoch: 31345, Training Loss: 0.00811
Epoch: 31345, Training Loss: 0.00812
Epoch: 31345, Training Loss: 0.00997
Epoch: 31346, Training Loss: 0.00778
Epoch: 31346, Training Loss: 0.00811
Epoch: 31346, Training Loss: 0.00812
Epoch: 31346, Training Loss: 0.00997
Epoch: 31347, Training Loss: 0.00778
Epoch: 31347, Training Loss: 0.00811
Epoch: 31347, Training Loss: 0.00812
Epoch: 31347, Training Loss: 0.00997
Epoch: 31348, Training Loss: 0.00778
Epoch: 31348, Training Loss: 0.00811
Epoch: 31348, Training Loss: 0.00812
Epoch: 31348, Training Loss: 0.00997
Epoch: 31349, Training Loss: 0.00778
Epoch: 31349, Training Loss: 0.00811
Epoch: 31349, Training Loss: 0.00812
Epoch: 31349, Training Loss: 0.00997
Epoch: 31350, Training Loss: 0.00778
Epoch: 31350, Training Loss: 0.00811
Epoch: 31350, Training Loss: 0.00812
Epoch: 31350, Training Loss: 0.00997
Epoch: 31351, Training Loss: 0.00778
Epoch: 31351, Training Loss: 0.00811
Epoch: 31351, Training Loss: 0.00812
Epoch: 31351, Training Loss: 0.00997
Epoch: 31352, Training Loss: 0.00778
Epoch: 31352, Training Loss: 0.00811
Epoch: 31352, Training Loss: 0.00812
Epoch: 31352, Training Loss: 0.00997
Epoch: 31353, Training Loss: 0.00778
Epoch: 31353, Training Loss: 0.00811
Epoch: 31353, Training Loss: 0.00812
Epoch: 31353, Training Loss: 0.00997
Epoch: 31354, Training Loss: 0.00778
Epoch: 31354, Training Loss: 0.00811
Epoch: 31354, Training Loss: 0.00812
Epoch: 31354, Training Loss: 0.00997
Epoch: 31355, Training Loss: 0.00778
Epoch: 31355, Training Loss: 0.00811
Epoch: 31355, Training Loss: 0.00812
Epoch: 31355, Training Loss: 0.00997
Epoch: 31356, Training Loss: 0.00778
Epoch: 31356, Training Loss: 0.00811
Epoch: 31356, Training Loss: 0.00812
Epoch: 31356, Training Loss: 0.00997
Epoch: 31357, Training Loss: 0.00778
Epoch: 31357, Training Loss: 0.00811
Epoch: 31357, Training Loss: 0.00812
Epoch: 31357, Training Loss: 0.00997
Epoch: 31358, Training Loss: 0.00778
Epoch: 31358, Training Loss: 0.00811
Epoch: 31358, Training Loss: 0.00811
Epoch: 31358, Training Loss: 0.00997
Epoch: 31359, Training Loss: 0.00778
Epoch: 31359, Training Loss: 0.00811
Epoch: 31359, Training Loss: 0.00811
Epoch: 31359, Training Loss: 0.00997
Epoch: 31360, Training Loss: 0.00778
Epoch: 31360, Training Loss: 0.00811
Epoch: 31360, Training Loss: 0.00811
Epoch: 31360, Training Loss: 0.00997
Epoch: 31361, Training Loss: 0.00778
Epoch: 31361, Training Loss: 0.00810
Epoch: 31361, Training Loss: 0.00811
Epoch: 31361, Training Loss: 0.00997
Epoch: 31362, Training Loss: 0.00778
Epoch: 31362, Training Loss: 0.00810
Epoch: 31362, Training Loss: 0.00811
Epoch: 31362, Training Loss: 0.00996
Epoch: 31363, Training Loss: 0.00778
Epoch: 31363, Training Loss: 0.00810
Epoch: 31363, Training Loss: 0.00811
Epoch: 31363, Training Loss: 0.00996
Epoch: 31364, Training Loss: 0.00778
Epoch: 31364, Training Loss: 0.00810
Epoch: 31364, Training Loss: 0.00811
Epoch: 31364, Training Loss: 0.00996
Epoch: 31365, Training Loss: 0.00778
Epoch: 31365, Training Loss: 0.00810
Epoch: 31365, Training Loss: 0.00811
Epoch: 31365, Training Loss: 0.00996
Epoch: 31366, Training Loss: 0.00778
Epoch: 31366, Training Loss: 0.00810
Epoch: 31366, Training Loss: 0.00811
Epoch: 31366, Training Loss: 0.00996
Epoch: 31367, Training Loss: 0.00778
Epoch: 31367, Training Loss: 0.00810
Epoch: 31367, Training Loss: 0.00811
Epoch: 31367, Training Loss: 0.00996
Epoch: 31368, Training Loss: 0.00778
Epoch: 31368, Training Loss: 0.00810
Epoch: 31368, Training Loss: 0.00811
Epoch: 31368, Training Loss: 0.00996
Epoch: 31369, Training Loss: 0.00778
Epoch: 31369, Training Loss: 0.00810
Epoch: 31369, Training Loss: 0.00811
Epoch: 31369, Training Loss: 0.00996
Epoch: 31370, Training Loss: 0.00778
Epoch: 31370, Training Loss: 0.00810
Epoch: 31370, Training Loss: 0.00811
Epoch: 31370, Training Loss: 0.00996
Epoch: 31371, Training Loss: 0.00778
Epoch: 31371, Training Loss: 0.00810
Epoch: 31371, Training Loss: 0.00811
Epoch: 31371, Training Loss: 0.00996
Epoch: 31372, Training Loss: 0.00778
Epoch: 31372, Training Loss: 0.00810
Epoch: 31372, Training Loss: 0.00811
Epoch: 31372, Training Loss: 0.00996
Epoch: 31373, Training Loss: 0.00778
Epoch: 31373, Training Loss: 0.00810
Epoch: 31373, Training Loss: 0.00811
Epoch: 31373, Training Loss: 0.00996
Epoch: 31374, Training Loss: 0.00778
Epoch: 31374, Training Loss: 0.00810
Epoch: 31374, Training Loss: 0.00811
Epoch: 31374, Training Loss: 0.00996
Epoch: 31375, Training Loss: 0.00778
Epoch: 31375, Training Loss: 0.00810
Epoch: 31375, Training Loss: 0.00811
Epoch: 31375, Training Loss: 0.00996
Epoch: 31376, Training Loss: 0.00778
Epoch: 31376, Training Loss: 0.00810
Epoch: 31376, Training Loss: 0.00811
Epoch: 31376, Training Loss: 0.00996
Epoch: 31377, Training Loss: 0.00778
Epoch: 31377, Training Loss: 0.00810
Epoch: 31377, Training Loss: 0.00811
Epoch: 31377, Training Loss: 0.00996
Epoch: 31378, Training Loss: 0.00778
Epoch: 31378, Training Loss: 0.00810
Epoch: 31378, Training Loss: 0.00811
Epoch: 31378, Training Loss: 0.00996
Epoch: 31379, Training Loss: 0.00778
Epoch: 31379, Training Loss: 0.00810
Epoch: 31379, Training Loss: 0.00811
Epoch: 31379, Training Loss: 0.00996
Epoch: 31380, Training Loss: 0.00778
Epoch: 31380, Training Loss: 0.00810
Epoch: 31380, Training Loss: 0.00811
Epoch: 31380, Training Loss: 0.00996
Epoch: 31381, Training Loss: 0.00778
Epoch: 31381, Training Loss: 0.00810
Epoch: 31381, Training Loss: 0.00811
Epoch: 31381, Training Loss: 0.00996
Epoch: 31382, Training Loss: 0.00778
Epoch: 31382, Training Loss: 0.00810
Epoch: 31382, Training Loss: 0.00811
Epoch: 31382, Training Loss: 0.00996
Epoch: 31383, Training Loss: 0.00778
Epoch: 31383, Training Loss: 0.00810
Epoch: 31383, Training Loss: 0.00811
Epoch: 31383, Training Loss: 0.00996
Epoch: 31384, Training Loss: 0.00778
Epoch: 31384, Training Loss: 0.00810
Epoch: 31384, Training Loss: 0.00811
Epoch: 31384, Training Loss: 0.00996
Epoch: 31385, Training Loss: 0.00778
Epoch: 31385, Training Loss: 0.00810
Epoch: 31385, Training Loss: 0.00811
Epoch: 31385, Training Loss: 0.00996
Epoch: 31386, Training Loss: 0.00778
Epoch: 31386, Training Loss: 0.00810
Epoch: 31386, Training Loss: 0.00811
Epoch: 31386, Training Loss: 0.00996
Epoch: 31387, Training Loss: 0.00778
Epoch: 31387, Training Loss: 0.00810
Epoch: 31387, Training Loss: 0.00811
Epoch: 31387, Training Loss: 0.00996
Epoch: 31388, Training Loss: 0.00778
Epoch: 31388, Training Loss: 0.00810
Epoch: 31388, Training Loss: 0.00811
Epoch: 31388, Training Loss: 0.00996
Epoch: 31389, Training Loss: 0.00778
Epoch: 31389, Training Loss: 0.00810
Epoch: 31389, Training Loss: 0.00811
Epoch: 31389, Training Loss: 0.00996
Epoch: 31390, Training Loss: 0.00778
Epoch: 31390, Training Loss: 0.00810
Epoch: 31390, Training Loss: 0.00811
Epoch: 31390, Training Loss: 0.00996
Epoch: 31391, Training Loss: 0.00778
Epoch: 31391, Training Loss: 0.00810
Epoch: 31391, Training Loss: 0.00811
Epoch: 31391, Training Loss: 0.00996
Epoch: 31392, Training Loss: 0.00778
Epoch: 31392, Training Loss: 0.00810
Epoch: 31392, Training Loss: 0.00811
Epoch: 31392, Training Loss: 0.00996
Epoch: 31393, Training Loss: 0.00778
Epoch: 31393, Training Loss: 0.00810
Epoch: 31393, Training Loss: 0.00811
Epoch: 31393, Training Loss: 0.00996
Epoch: 31394, Training Loss: 0.00778
Epoch: 31394, Training Loss: 0.00810
Epoch: 31394, Training Loss: 0.00811
Epoch: 31394, Training Loss: 0.00996
Epoch: 31395, Training Loss: 0.00778
Epoch: 31395, Training Loss: 0.00810
Epoch: 31395, Training Loss: 0.00811
Epoch: 31395, Training Loss: 0.00996
Epoch: 31396, Training Loss: 0.00778
Epoch: 31396, Training Loss: 0.00810
Epoch: 31396, Training Loss: 0.00811
Epoch: 31396, Training Loss: 0.00996
Epoch: 31397, Training Loss: 0.00778
Epoch: 31397, Training Loss: 0.00810
Epoch: 31397, Training Loss: 0.00811
Epoch: 31397, Training Loss: 0.00996
Epoch: 31398, Training Loss: 0.00778
Epoch: 31398, Training Loss: 0.00810
Epoch: 31398, Training Loss: 0.00811
Epoch: 31398, Training Loss: 0.00996
Epoch: 31399, Training Loss: 0.00778
Epoch: 31399, Training Loss: 0.00810
Epoch: 31399, Training Loss: 0.00811
Epoch: 31399, Training Loss: 0.00996
Epoch: 31400, Training Loss: 0.00778
Epoch: 31400, Training Loss: 0.00810
Epoch: 31400, Training Loss: 0.00811
Epoch: 31400, Training Loss: 0.00996
Epoch: 31401, Training Loss: 0.00778
Epoch: 31401, Training Loss: 0.00810
Epoch: 31401, Training Loss: 0.00811
Epoch: 31401, Training Loss: 0.00996
Epoch: 31402, Training Loss: 0.00778
Epoch: 31402, Training Loss: 0.00810
Epoch: 31402, Training Loss: 0.00811
Epoch: 31402, Training Loss: 0.00996
Epoch: 31403, Training Loss: 0.00778
Epoch: 31403, Training Loss: 0.00810
Epoch: 31403, Training Loss: 0.00811
Epoch: 31403, Training Loss: 0.00996
Epoch: 31404, Training Loss: 0.00778
Epoch: 31404, Training Loss: 0.00810
Epoch: 31404, Training Loss: 0.00811
Epoch: 31404, Training Loss: 0.00996
... (the four per-sample losses are printed at every epoch; the intervening
... output is omitted for brevity — across epochs 31404 to 31647 the losses
... decrease slowly from about (0.00778, 0.00810, 0.00811, 0.00996)
... to about (0.00774, 0.00806, 0.00807, 0.00991)) ...
Epoch: 31647, Training Loss: 0.00774
Epoch: 31647, Training Loss: 0.00806
Epoch: 31647, Training Loss: 0.00807
Epoch: 31647, Training Loss: 0.00991
Epoch: 31648, Training Loss: 0.00774
Epoch: 31648, Training Loss: 0.00806
Epoch: 31648, Training Loss: 0.00807
Epoch: 31648, Training Loss: 0.00991
Epoch: 31649, Training Loss: 0.00774
Epoch: 31649, Training Loss: 0.00806
Epoch: 31649, Training Loss: 0.00807
Epoch: 31649, Training Loss: 0.00991
Epoch: 31650, Training Loss: 0.00774
Epoch: 31650, Training Loss: 0.00806
Epoch: 31650, Training Loss: 0.00807
Epoch: 31650, Training Loss: 0.00991
Epoch: 31651, Training Loss: 0.00774
Epoch: 31651, Training Loss: 0.00806
Epoch: 31651, Training Loss: 0.00807
Epoch: 31651, Training Loss: 0.00991
Epoch: 31652, Training Loss: 0.00774
Epoch: 31652, Training Loss: 0.00806
Epoch: 31652, Training Loss: 0.00807
Epoch: 31652, Training Loss: 0.00991
Epoch: 31653, Training Loss: 0.00774
Epoch: 31653, Training Loss: 0.00806
Epoch: 31653, Training Loss: 0.00807
Epoch: 31653, Training Loss: 0.00991
Epoch: 31654, Training Loss: 0.00774
Epoch: 31654, Training Loss: 0.00806
Epoch: 31654, Training Loss: 0.00807
Epoch: 31654, Training Loss: 0.00991
Epoch: 31655, Training Loss: 0.00774
Epoch: 31655, Training Loss: 0.00806
Epoch: 31655, Training Loss: 0.00807
Epoch: 31655, Training Loss: 0.00991
Epoch: 31656, Training Loss: 0.00774
Epoch: 31656, Training Loss: 0.00806
Epoch: 31656, Training Loss: 0.00807
Epoch: 31656, Training Loss: 0.00991
Epoch: 31657, Training Loss: 0.00774
Epoch: 31657, Training Loss: 0.00806
Epoch: 31657, Training Loss: 0.00807
Epoch: 31657, Training Loss: 0.00991
Epoch: 31658, Training Loss: 0.00774
Epoch: 31658, Training Loss: 0.00806
Epoch: 31658, Training Loss: 0.00807
Epoch: 31658, Training Loss: 0.00991
Epoch: 31659, Training Loss: 0.00774
Epoch: 31659, Training Loss: 0.00806
Epoch: 31659, Training Loss: 0.00807
Epoch: 31659, Training Loss: 0.00991
Epoch: 31660, Training Loss: 0.00774
Epoch: 31660, Training Loss: 0.00806
Epoch: 31660, Training Loss: 0.00807
Epoch: 31660, Training Loss: 0.00991
Epoch: 31661, Training Loss: 0.00774
Epoch: 31661, Training Loss: 0.00806
Epoch: 31661, Training Loss: 0.00807
Epoch: 31661, Training Loss: 0.00991
Epoch: 31662, Training Loss: 0.00774
Epoch: 31662, Training Loss: 0.00806
Epoch: 31662, Training Loss: 0.00807
Epoch: 31662, Training Loss: 0.00991
Epoch: 31663, Training Loss: 0.00774
Epoch: 31663, Training Loss: 0.00806
Epoch: 31663, Training Loss: 0.00807
Epoch: 31663, Training Loss: 0.00991
Epoch: 31664, Training Loss: 0.00774
Epoch: 31664, Training Loss: 0.00806
Epoch: 31664, Training Loss: 0.00807
Epoch: 31664, Training Loss: 0.00991
Epoch: 31665, Training Loss: 0.00774
Epoch: 31665, Training Loss: 0.00806
Epoch: 31665, Training Loss: 0.00807
Epoch: 31665, Training Loss: 0.00991
Epoch: 31666, Training Loss: 0.00774
Epoch: 31666, Training Loss: 0.00806
Epoch: 31666, Training Loss: 0.00807
Epoch: 31666, Training Loss: 0.00991
Epoch: 31667, Training Loss: 0.00774
Epoch: 31667, Training Loss: 0.00806
Epoch: 31667, Training Loss: 0.00807
Epoch: 31667, Training Loss: 0.00991
Epoch: 31668, Training Loss: 0.00774
Epoch: 31668, Training Loss: 0.00806
Epoch: 31668, Training Loss: 0.00807
Epoch: 31668, Training Loss: 0.00991
Epoch: 31669, Training Loss: 0.00774
Epoch: 31669, Training Loss: 0.00806
Epoch: 31669, Training Loss: 0.00807
Epoch: 31669, Training Loss: 0.00991
Epoch: 31670, Training Loss: 0.00774
Epoch: 31670, Training Loss: 0.00806
Epoch: 31670, Training Loss: 0.00807
Epoch: 31670, Training Loss: 0.00991
Epoch: 31671, Training Loss: 0.00774
Epoch: 31671, Training Loss: 0.00806
Epoch: 31671, Training Loss: 0.00807
Epoch: 31671, Training Loss: 0.00991
Epoch: 31672, Training Loss: 0.00774
Epoch: 31672, Training Loss: 0.00806
Epoch: 31672, Training Loss: 0.00807
Epoch: 31672, Training Loss: 0.00991
Epoch: 31673, Training Loss: 0.00774
Epoch: 31673, Training Loss: 0.00806
Epoch: 31673, Training Loss: 0.00807
Epoch: 31673, Training Loss: 0.00991
Epoch: 31674, Training Loss: 0.00774
Epoch: 31674, Training Loss: 0.00806
Epoch: 31674, Training Loss: 0.00807
Epoch: 31674, Training Loss: 0.00991
Epoch: 31675, Training Loss: 0.00774
Epoch: 31675, Training Loss: 0.00806
Epoch: 31675, Training Loss: 0.00807
Epoch: 31675, Training Loss: 0.00991
Epoch: 31676, Training Loss: 0.00774
Epoch: 31676, Training Loss: 0.00806
Epoch: 31676, Training Loss: 0.00807
Epoch: 31676, Training Loss: 0.00991
Epoch: 31677, Training Loss: 0.00774
Epoch: 31677, Training Loss: 0.00806
Epoch: 31677, Training Loss: 0.00807
Epoch: 31677, Training Loss: 0.00991
Epoch: 31678, Training Loss: 0.00774
Epoch: 31678, Training Loss: 0.00806
Epoch: 31678, Training Loss: 0.00807
Epoch: 31678, Training Loss: 0.00991
Epoch: 31679, Training Loss: 0.00774
Epoch: 31679, Training Loss: 0.00806
Epoch: 31679, Training Loss: 0.00807
Epoch: 31679, Training Loss: 0.00991
Epoch: 31680, Training Loss: 0.00774
Epoch: 31680, Training Loss: 0.00806
Epoch: 31680, Training Loss: 0.00807
Epoch: 31680, Training Loss: 0.00991
Epoch: 31681, Training Loss: 0.00774
Epoch: 31681, Training Loss: 0.00806
Epoch: 31681, Training Loss: 0.00807
Epoch: 31681, Training Loss: 0.00991
Epoch: 31682, Training Loss: 0.00774
Epoch: 31682, Training Loss: 0.00806
Epoch: 31682, Training Loss: 0.00807
Epoch: 31682, Training Loss: 0.00991
Epoch: 31683, Training Loss: 0.00774
Epoch: 31683, Training Loss: 0.00806
Epoch: 31683, Training Loss: 0.00807
Epoch: 31683, Training Loss: 0.00991
Epoch: 31684, Training Loss: 0.00774
Epoch: 31684, Training Loss: 0.00806
Epoch: 31684, Training Loss: 0.00807
Epoch: 31684, Training Loss: 0.00991
Epoch: 31685, Training Loss: 0.00774
Epoch: 31685, Training Loss: 0.00806
Epoch: 31685, Training Loss: 0.00807
Epoch: 31685, Training Loss: 0.00991
Epoch: 31686, Training Loss: 0.00774
Epoch: 31686, Training Loss: 0.00806
Epoch: 31686, Training Loss: 0.00807
Epoch: 31686, Training Loss: 0.00991
Epoch: 31687, Training Loss: 0.00774
Epoch: 31687, Training Loss: 0.00806
Epoch: 31687, Training Loss: 0.00807
Epoch: 31687, Training Loss: 0.00991
Epoch: 31688, Training Loss: 0.00774
Epoch: 31688, Training Loss: 0.00806
Epoch: 31688, Training Loss: 0.00807
Epoch: 31688, Training Loss: 0.00991
Epoch: 31689, Training Loss: 0.00774
Epoch: 31689, Training Loss: 0.00806
Epoch: 31689, Training Loss: 0.00807
Epoch: 31689, Training Loss: 0.00991
Epoch: 31690, Training Loss: 0.00774
Epoch: 31690, Training Loss: 0.00806
Epoch: 31690, Training Loss: 0.00807
Epoch: 31690, Training Loss: 0.00991
Epoch: 31691, Training Loss: 0.00774
Epoch: 31691, Training Loss: 0.00806
Epoch: 31691, Training Loss: 0.00807
Epoch: 31691, Training Loss: 0.00991
Epoch: 31692, Training Loss: 0.00774
Epoch: 31692, Training Loss: 0.00806
Epoch: 31692, Training Loss: 0.00807
Epoch: 31692, Training Loss: 0.00991
Epoch: 31693, Training Loss: 0.00774
Epoch: 31693, Training Loss: 0.00806
Epoch: 31693, Training Loss: 0.00807
Epoch: 31693, Training Loss: 0.00991
Epoch: 31694, Training Loss: 0.00774
Epoch: 31694, Training Loss: 0.00806
Epoch: 31694, Training Loss: 0.00807
Epoch: 31694, Training Loss: 0.00991
Epoch: 31695, Training Loss: 0.00774
Epoch: 31695, Training Loss: 0.00806
Epoch: 31695, Training Loss: 0.00807
Epoch: 31695, Training Loss: 0.00991
Epoch: 31696, Training Loss: 0.00774
Epoch: 31696, Training Loss: 0.00806
Epoch: 31696, Training Loss: 0.00807
Epoch: 31696, Training Loss: 0.00991
Epoch: 31697, Training Loss: 0.00774
Epoch: 31697, Training Loss: 0.00806
Epoch: 31697, Training Loss: 0.00807
Epoch: 31697, Training Loss: 0.00991
Epoch: 31698, Training Loss: 0.00774
Epoch: 31698, Training Loss: 0.00806
Epoch: 31698, Training Loss: 0.00807
Epoch: 31698, Training Loss: 0.00990
Epoch: 31699, Training Loss: 0.00774
Epoch: 31699, Training Loss: 0.00806
Epoch: 31699, Training Loss: 0.00807
Epoch: 31699, Training Loss: 0.00990
Epoch: 31700, Training Loss: 0.00774
Epoch: 31700, Training Loss: 0.00806
Epoch: 31700, Training Loss: 0.00807
Epoch: 31700, Training Loss: 0.00990
Epoch: 31701, Training Loss: 0.00774
Epoch: 31701, Training Loss: 0.00806
Epoch: 31701, Training Loss: 0.00807
Epoch: 31701, Training Loss: 0.00990
Epoch: 31702, Training Loss: 0.00774
Epoch: 31702, Training Loss: 0.00806
Epoch: 31702, Training Loss: 0.00807
Epoch: 31702, Training Loss: 0.00990
Epoch: 31703, Training Loss: 0.00774
Epoch: 31703, Training Loss: 0.00806
Epoch: 31703, Training Loss: 0.00807
Epoch: 31703, Training Loss: 0.00990
Epoch: 31704, Training Loss: 0.00774
Epoch: 31704, Training Loss: 0.00806
Epoch: 31704, Training Loss: 0.00807
Epoch: 31704, Training Loss: 0.00990
Epoch: 31705, Training Loss: 0.00774
Epoch: 31705, Training Loss: 0.00806
Epoch: 31705, Training Loss: 0.00807
Epoch: 31705, Training Loss: 0.00990
Epoch: 31706, Training Loss: 0.00773
Epoch: 31706, Training Loss: 0.00806
Epoch: 31706, Training Loss: 0.00806
Epoch: 31706, Training Loss: 0.00990
Epoch: 31707, Training Loss: 0.00773
Epoch: 31707, Training Loss: 0.00806
Epoch: 31707, Training Loss: 0.00806
Epoch: 31707, Training Loss: 0.00990
Epoch: 31708, Training Loss: 0.00773
Epoch: 31708, Training Loss: 0.00806
Epoch: 31708, Training Loss: 0.00806
Epoch: 31708, Training Loss: 0.00990
Epoch: 31709, Training Loss: 0.00773
Epoch: 31709, Training Loss: 0.00805
Epoch: 31709, Training Loss: 0.00806
Epoch: 31709, Training Loss: 0.00990
Epoch: 31710, Training Loss: 0.00773
Epoch: 31710, Training Loss: 0.00805
Epoch: 31710, Training Loss: 0.00806
Epoch: 31710, Training Loss: 0.00990
Epoch: 31711, Training Loss: 0.00773
Epoch: 31711, Training Loss: 0.00805
Epoch: 31711, Training Loss: 0.00806
Epoch: 31711, Training Loss: 0.00990
Epoch: 31712, Training Loss: 0.00773
Epoch: 31712, Training Loss: 0.00805
Epoch: 31712, Training Loss: 0.00806
Epoch: 31712, Training Loss: 0.00990
Epoch: 31713, Training Loss: 0.00773
Epoch: 31713, Training Loss: 0.00805
Epoch: 31713, Training Loss: 0.00806
Epoch: 31713, Training Loss: 0.00990
Epoch: 31714, Training Loss: 0.00773
Epoch: 31714, Training Loss: 0.00805
Epoch: 31714, Training Loss: 0.00806
Epoch: 31714, Training Loss: 0.00990
Epoch: 31715, Training Loss: 0.00773
Epoch: 31715, Training Loss: 0.00805
Epoch: 31715, Training Loss: 0.00806
Epoch: 31715, Training Loss: 0.00990
Epoch: 31716, Training Loss: 0.00773
Epoch: 31716, Training Loss: 0.00805
Epoch: 31716, Training Loss: 0.00806
Epoch: 31716, Training Loss: 0.00990
Epoch: 31717, Training Loss: 0.00773
Epoch: 31717, Training Loss: 0.00805
Epoch: 31717, Training Loss: 0.00806
Epoch: 31717, Training Loss: 0.00990
Epoch: 31718, Training Loss: 0.00773
Epoch: 31718, Training Loss: 0.00805
Epoch: 31718, Training Loss: 0.00806
Epoch: 31718, Training Loss: 0.00990
Epoch: 31719, Training Loss: 0.00773
Epoch: 31719, Training Loss: 0.00805
Epoch: 31719, Training Loss: 0.00806
Epoch: 31719, Training Loss: 0.00990
Epoch: 31720, Training Loss: 0.00773
Epoch: 31720, Training Loss: 0.00805
Epoch: 31720, Training Loss: 0.00806
Epoch: 31720, Training Loss: 0.00990
Epoch: 31721, Training Loss: 0.00773
Epoch: 31721, Training Loss: 0.00805
Epoch: 31721, Training Loss: 0.00806
Epoch: 31721, Training Loss: 0.00990
Epoch: 31722, Training Loss: 0.00773
Epoch: 31722, Training Loss: 0.00805
Epoch: 31722, Training Loss: 0.00806
Epoch: 31722, Training Loss: 0.00990
Epoch: 31723, Training Loss: 0.00773
Epoch: 31723, Training Loss: 0.00805
Epoch: 31723, Training Loss: 0.00806
Epoch: 31723, Training Loss: 0.00990
Epoch: 31724, Training Loss: 0.00773
Epoch: 31724, Training Loss: 0.00805
Epoch: 31724, Training Loss: 0.00806
Epoch: 31724, Training Loss: 0.00990
Epoch: 31725, Training Loss: 0.00773
Epoch: 31725, Training Loss: 0.00805
Epoch: 31725, Training Loss: 0.00806
Epoch: 31725, Training Loss: 0.00990
Epoch: 31726, Training Loss: 0.00773
Epoch: 31726, Training Loss: 0.00805
Epoch: 31726, Training Loss: 0.00806
Epoch: 31726, Training Loss: 0.00990
Epoch: 31727, Training Loss: 0.00773
Epoch: 31727, Training Loss: 0.00805
Epoch: 31727, Training Loss: 0.00806
Epoch: 31727, Training Loss: 0.00990
Epoch: 31728, Training Loss: 0.00773
Epoch: 31728, Training Loss: 0.00805
Epoch: 31728, Training Loss: 0.00806
Epoch: 31728, Training Loss: 0.00990
Epoch: 31729, Training Loss: 0.00773
Epoch: 31729, Training Loss: 0.00805
Epoch: 31729, Training Loss: 0.00806
Epoch: 31729, Training Loss: 0.00990
Epoch: 31730, Training Loss: 0.00773
Epoch: 31730, Training Loss: 0.00805
Epoch: 31730, Training Loss: 0.00806
Epoch: 31730, Training Loss: 0.00990
Epoch: 31731, Training Loss: 0.00773
Epoch: 31731, Training Loss: 0.00805
Epoch: 31731, Training Loss: 0.00806
Epoch: 31731, Training Loss: 0.00990
Epoch: 31732, Training Loss: 0.00773
Epoch: 31732, Training Loss: 0.00805
Epoch: 31732, Training Loss: 0.00806
Epoch: 31732, Training Loss: 0.00990
Epoch: 31733, Training Loss: 0.00773
Epoch: 31733, Training Loss: 0.00805
Epoch: 31733, Training Loss: 0.00806
Epoch: 31733, Training Loss: 0.00990
Epoch: 31734, Training Loss: 0.00773
Epoch: 31734, Training Loss: 0.00805
Epoch: 31734, Training Loss: 0.00806
Epoch: 31734, Training Loss: 0.00990
Epoch: 31735, Training Loss: 0.00773
Epoch: 31735, Training Loss: 0.00805
Epoch: 31735, Training Loss: 0.00806
Epoch: 31735, Training Loss: 0.00990
Epoch: 31736, Training Loss: 0.00773
Epoch: 31736, Training Loss: 0.00805
Epoch: 31736, Training Loss: 0.00806
Epoch: 31736, Training Loss: 0.00990
Epoch: 31737, Training Loss: 0.00773
Epoch: 31737, Training Loss: 0.00805
Epoch: 31737, Training Loss: 0.00806
Epoch: 31737, Training Loss: 0.00990
Epoch: 31738, Training Loss: 0.00773
Epoch: 31738, Training Loss: 0.00805
Epoch: 31738, Training Loss: 0.00806
Epoch: 31738, Training Loss: 0.00990
Epoch: 31739, Training Loss: 0.00773
Epoch: 31739, Training Loss: 0.00805
Epoch: 31739, Training Loss: 0.00806
Epoch: 31739, Training Loss: 0.00990
Epoch: 31740, Training Loss: 0.00773
Epoch: 31740, Training Loss: 0.00805
Epoch: 31740, Training Loss: 0.00806
Epoch: 31740, Training Loss: 0.00990
Epoch: 31741, Training Loss: 0.00773
Epoch: 31741, Training Loss: 0.00805
Epoch: 31741, Training Loss: 0.00806
Epoch: 31741, Training Loss: 0.00990
Epoch: 31742, Training Loss: 0.00773
Epoch: 31742, Training Loss: 0.00805
Epoch: 31742, Training Loss: 0.00806
Epoch: 31742, Training Loss: 0.00990
Epoch: 31743, Training Loss: 0.00773
Epoch: 31743, Training Loss: 0.00805
Epoch: 31743, Training Loss: 0.00806
Epoch: 31743, Training Loss: 0.00990
Epoch: 31744, Training Loss: 0.00773
Epoch: 31744, Training Loss: 0.00805
Epoch: 31744, Training Loss: 0.00806
Epoch: 31744, Training Loss: 0.00990
Epoch: 31745, Training Loss: 0.00773
Epoch: 31745, Training Loss: 0.00805
Epoch: 31745, Training Loss: 0.00806
Epoch: 31745, Training Loss: 0.00990
Epoch: 31746, Training Loss: 0.00773
Epoch: 31746, Training Loss: 0.00805
Epoch: 31746, Training Loss: 0.00806
Epoch: 31746, Training Loss: 0.00990
Epoch: 31747, Training Loss: 0.00773
Epoch: 31747, Training Loss: 0.00805
Epoch: 31747, Training Loss: 0.00806
Epoch: 31747, Training Loss: 0.00990
Epoch: 31748, Training Loss: 0.00773
Epoch: 31748, Training Loss: 0.00805
Epoch: 31748, Training Loss: 0.00806
Epoch: 31748, Training Loss: 0.00990
Epoch: 31749, Training Loss: 0.00773
Epoch: 31749, Training Loss: 0.00805
Epoch: 31749, Training Loss: 0.00806
Epoch: 31749, Training Loss: 0.00990
Epoch: 31750, Training Loss: 0.00773
Epoch: 31750, Training Loss: 0.00805
Epoch: 31750, Training Loss: 0.00806
Epoch: 31750, Training Loss: 0.00990
Epoch: 31751, Training Loss: 0.00773
Epoch: 31751, Training Loss: 0.00805
Epoch: 31751, Training Loss: 0.00806
Epoch: 31751, Training Loss: 0.00990
Epoch: 31752, Training Loss: 0.00773
Epoch: 31752, Training Loss: 0.00805
Epoch: 31752, Training Loss: 0.00806
Epoch: 31752, Training Loss: 0.00990
Epoch: 31753, Training Loss: 0.00773
Epoch: 31753, Training Loss: 0.00805
Epoch: 31753, Training Loss: 0.00806
Epoch: 31753, Training Loss: 0.00990
Epoch: 31754, Training Loss: 0.00773
Epoch: 31754, Training Loss: 0.00805
Epoch: 31754, Training Loss: 0.00806
Epoch: 31754, Training Loss: 0.00990
Epoch: 31755, Training Loss: 0.00773
Epoch: 31755, Training Loss: 0.00805
Epoch: 31755, Training Loss: 0.00806
Epoch: 31755, Training Loss: 0.00989
Epoch: 31756, Training Loss: 0.00773
Epoch: 31756, Training Loss: 0.00805
Epoch: 31756, Training Loss: 0.00806
Epoch: 31756, Training Loss: 0.00989
Epoch: 31757, Training Loss: 0.00773
Epoch: 31757, Training Loss: 0.00805
Epoch: 31757, Training Loss: 0.00806
Epoch: 31757, Training Loss: 0.00989
Epoch: 31758, Training Loss: 0.00773
Epoch: 31758, Training Loss: 0.00805
Epoch: 31758, Training Loss: 0.00806
Epoch: 31758, Training Loss: 0.00989
Epoch: 31759, Training Loss: 0.00773
Epoch: 31759, Training Loss: 0.00805
Epoch: 31759, Training Loss: 0.00806
Epoch: 31759, Training Loss: 0.00989
Epoch: 31760, Training Loss: 0.00773
Epoch: 31760, Training Loss: 0.00805
Epoch: 31760, Training Loss: 0.00806
Epoch: 31760, Training Loss: 0.00989
Epoch: 31761, Training Loss: 0.00773
Epoch: 31761, Training Loss: 0.00805
Epoch: 31761, Training Loss: 0.00806
Epoch: 31761, Training Loss: 0.00989
Epoch: 31762, Training Loss: 0.00773
Epoch: 31762, Training Loss: 0.00805
Epoch: 31762, Training Loss: 0.00806
Epoch: 31762, Training Loss: 0.00989
Epoch: 31763, Training Loss: 0.00773
Epoch: 31763, Training Loss: 0.00805
Epoch: 31763, Training Loss: 0.00806
Epoch: 31763, Training Loss: 0.00989
Epoch: 31764, Training Loss: 0.00773
Epoch: 31764, Training Loss: 0.00805
Epoch: 31764, Training Loss: 0.00806
Epoch: 31764, Training Loss: 0.00989
Epoch: 31765, Training Loss: 0.00773
Epoch: 31765, Training Loss: 0.00805
Epoch: 31765, Training Loss: 0.00806
Epoch: 31765, Training Loss: 0.00989
Epoch: 31766, Training Loss: 0.00773
Epoch: 31766, Training Loss: 0.00805
Epoch: 31766, Training Loss: 0.00806
Epoch: 31766, Training Loss: 0.00989
Epoch: 31767, Training Loss: 0.00773
Epoch: 31767, Training Loss: 0.00805
Epoch: 31767, Training Loss: 0.00806
Epoch: 31767, Training Loss: 0.00989
Epoch: 31768, Training Loss: 0.00773
Epoch: 31768, Training Loss: 0.00805
Epoch: 31768, Training Loss: 0.00806
Epoch: 31768, Training Loss: 0.00989
Epoch: 31769, Training Loss: 0.00773
Epoch: 31769, Training Loss: 0.00805
Epoch: 31769, Training Loss: 0.00806
Epoch: 31769, Training Loss: 0.00989
Epoch: 31770, Training Loss: 0.00773
Epoch: 31770, Training Loss: 0.00805
Epoch: 31770, Training Loss: 0.00806
Epoch: 31770, Training Loss: 0.00989
Epoch: 31771, Training Loss: 0.00773
Epoch: 31771, Training Loss: 0.00805
Epoch: 31771, Training Loss: 0.00806
Epoch: 31771, Training Loss: 0.00989
Epoch: 31772, Training Loss: 0.00773
Epoch: 31772, Training Loss: 0.00805
Epoch: 31772, Training Loss: 0.00806
Epoch: 31772, Training Loss: 0.00989
Epoch: 31773, Training Loss: 0.00773
Epoch: 31773, Training Loss: 0.00805
Epoch: 31773, Training Loss: 0.00806
Epoch: 31773, Training Loss: 0.00989
Epoch: 31774, Training Loss: 0.00773
Epoch: 31774, Training Loss: 0.00805
Epoch: 31774, Training Loss: 0.00806
Epoch: 31774, Training Loss: 0.00989
Epoch: 31775, Training Loss: 0.00773
Epoch: 31775, Training Loss: 0.00805
Epoch: 31775, Training Loss: 0.00806
Epoch: 31775, Training Loss: 0.00989
Epoch: 31776, Training Loss: 0.00773
Epoch: 31776, Training Loss: 0.00805
Epoch: 31776, Training Loss: 0.00805
Epoch: 31776, Training Loss: 0.00989
Epoch: 31777, Training Loss: 0.00773
Epoch: 31777, Training Loss: 0.00805
Epoch: 31777, Training Loss: 0.00805
Epoch: 31777, Training Loss: 0.00989
Epoch: 31778, Training Loss: 0.00773
Epoch: 31778, Training Loss: 0.00805
Epoch: 31778, Training Loss: 0.00805
Epoch: 31778, Training Loss: 0.00989
Epoch: 31779, Training Loss: 0.00773
Epoch: 31779, Training Loss: 0.00805
Epoch: 31779, Training Loss: 0.00805
Epoch: 31779, Training Loss: 0.00989
Epoch: 31780, Training Loss: 0.00773
Epoch: 31780, Training Loss: 0.00804
Epoch: 31780, Training Loss: 0.00805
Epoch: 31780, Training Loss: 0.00989
Epoch: 31781, Training Loss: 0.00772
Epoch: 31781, Training Loss: 0.00804
Epoch: 31781, Training Loss: 0.00805
Epoch: 31781, Training Loss: 0.00989
Epoch: 31782, Training Loss: 0.00772
Epoch: 31782, Training Loss: 0.00804
Epoch: 31782, Training Loss: 0.00805
Epoch: 31782, Training Loss: 0.00989
Epoch: 31783, Training Loss: 0.00772
Epoch: 31783, Training Loss: 0.00804
Epoch: 31783, Training Loss: 0.00805
Epoch: 31783, Training Loss: 0.00989
Epoch: 31784, Training Loss: 0.00772
Epoch: 31784, Training Loss: 0.00804
Epoch: 31784, Training Loss: 0.00805
Epoch: 31784, Training Loss: 0.00989
Epoch: 31785, Training Loss: 0.00772
Epoch: 31785, Training Loss: 0.00804
Epoch: 31785, Training Loss: 0.00805
Epoch: 31785, Training Loss: 0.00989
Epoch: 31786, Training Loss: 0.00772
Epoch: 31786, Training Loss: 0.00804
Epoch: 31786, Training Loss: 0.00805
Epoch: 31786, Training Loss: 0.00989
Epoch: 31787, Training Loss: 0.00772
Epoch: 31787, Training Loss: 0.00804
Epoch: 31787, Training Loss: 0.00805
Epoch: 31787, Training Loss: 0.00989
Epoch: 31788, Training Loss: 0.00772
Epoch: 31788, Training Loss: 0.00804
Epoch: 31788, Training Loss: 0.00805
Epoch: 31788, Training Loss: 0.00989
Epoch: 31789, Training Loss: 0.00772
Epoch: 31789, Training Loss: 0.00804
Epoch: 31789, Training Loss: 0.00805
Epoch: 31789, Training Loss: 0.00989
Epoch: 31790, Training Loss: 0.00772
Epoch: 31790, Training Loss: 0.00804
Epoch: 31790, Training Loss: 0.00805
Epoch: 31790, Training Loss: 0.00989
Epoch: 31791, Training Loss: 0.00772
Epoch: 31791, Training Loss: 0.00804
Epoch: 31791, Training Loss: 0.00805
Epoch: 31791, Training Loss: 0.00989
Epoch: 31792, Training Loss: 0.00772
Epoch: 31792, Training Loss: 0.00804
Epoch: 31792, Training Loss: 0.00805
Epoch: 31792, Training Loss: 0.00989
Epoch: 31793, Training Loss: 0.00772
Epoch: 31793, Training Loss: 0.00804
Epoch: 31793, Training Loss: 0.00805
Epoch: 31793, Training Loss: 0.00989
Epoch: 31794, Training Loss: 0.00772
Epoch: 31794, Training Loss: 0.00804
Epoch: 31794, Training Loss: 0.00805
Epoch: 31794, Training Loss: 0.00989
Epoch: 31795, Training Loss: 0.00772
Epoch: 31795, Training Loss: 0.00804
Epoch: 31795, Training Loss: 0.00805
Epoch: 31795, Training Loss: 0.00989
Epoch: 31796, Training Loss: 0.00772
Epoch: 31796, Training Loss: 0.00804
Epoch: 31796, Training Loss: 0.00805
Epoch: 31796, Training Loss: 0.00989
Epoch: 31797, Training Loss: 0.00772
Epoch: 31797, Training Loss: 0.00804
Epoch: 31797, Training Loss: 0.00805
Epoch: 31797, Training Loss: 0.00989
Epoch: 31798, Training Loss: 0.00772
Epoch: 31798, Training Loss: 0.00804
Epoch: 31798, Training Loss: 0.00805
Epoch: 31798, Training Loss: 0.00989
Epoch: 31799, Training Loss: 0.00772
Epoch: 31799, Training Loss: 0.00804
Epoch: 31799, Training Loss: 0.00805
Epoch: 31799, Training Loss: 0.00989
Epoch: 31800, Training Loss: 0.00772
Epoch: 31800, Training Loss: 0.00804
Epoch: 31800, Training Loss: 0.00805
Epoch: 31800, Training Loss: 0.00989
Epoch: 31801, Training Loss: 0.00772
Epoch: 31801, Training Loss: 0.00804
Epoch: 31801, Training Loss: 0.00805
Epoch: 31801, Training Loss: 0.00989
Epoch: 31802, Training Loss: 0.00772
Epoch: 31802, Training Loss: 0.00804
Epoch: 31802, Training Loss: 0.00805
Epoch: 31802, Training Loss: 0.00989
Epoch: 31803, Training Loss: 0.00772
Epoch: 31803, Training Loss: 0.00804
Epoch: 31803, Training Loss: 0.00805
Epoch: 31803, Training Loss: 0.00989
Epoch: 31804, Training Loss: 0.00772
Epoch: 31804, Training Loss: 0.00804
Epoch: 31804, Training Loss: 0.00805
Epoch: 31804, Training Loss: 0.00989
Epoch: 31805, Training Loss: 0.00772
Epoch: 31805, Training Loss: 0.00804
Epoch: 31805, Training Loss: 0.00805
Epoch: 31805, Training Loss: 0.00989
Epoch: 31806, Training Loss: 0.00772
Epoch: 31806, Training Loss: 0.00804
Epoch: 31806, Training Loss: 0.00805
Epoch: 31806, Training Loss: 0.00989
Epoch: 31807, Training Loss: 0.00772
Epoch: 31807, Training Loss: 0.00804
Epoch: 31807, Training Loss: 0.00805
Epoch: 31807, Training Loss: 0.00989
Epoch: 31808, Training Loss: 0.00772
Epoch: 31808, Training Loss: 0.00804
Epoch: 31808, Training Loss: 0.00805
Epoch: 31808, Training Loss: 0.00989
Epoch: 31809, Training Loss: 0.00772
Epoch: 31809, Training Loss: 0.00804
Epoch: 31809, Training Loss: 0.00805
Epoch: 31809, Training Loss: 0.00989
Epoch: 31810, Training Loss: 0.00772
Epoch: 31810, Training Loss: 0.00804
Epoch: 31810, Training Loss: 0.00805
Epoch: 31810, Training Loss: 0.00989
Epoch: 31811, Training Loss: 0.00772
Epoch: 31811, Training Loss: 0.00804
Epoch: 31811, Training Loss: 0.00805
Epoch: 31811, Training Loss: 0.00989
Epoch: 31812, Training Loss: 0.00772
Epoch: 31812, Training Loss: 0.00804
Epoch: 31812, Training Loss: 0.00805
Epoch: 31812, Training Loss: 0.00988
Epoch: 31813, Training Loss: 0.00772
Epoch: 31813, Training Loss: 0.00804
Epoch: 31813, Training Loss: 0.00805
Epoch: 31813, Training Loss: 0.00988
Epoch: 31814, Training Loss: 0.00772
Epoch: 31814, Training Loss: 0.00804
Epoch: 31814, Training Loss: 0.00805
Epoch: 31814, Training Loss: 0.00988
Epoch: 31815, Training Loss: 0.00772
Epoch: 31815, Training Loss: 0.00804
Epoch: 31815, Training Loss: 0.00805
Epoch: 31815, Training Loss: 0.00988
Epoch: 31816, Training Loss: 0.00772
Epoch: 31816, Training Loss: 0.00804
Epoch: 31816, Training Loss: 0.00805
Epoch: 31816, Training Loss: 0.00988
Epoch: 31817, Training Loss: 0.00772
Epoch: 31817, Training Loss: 0.00804
Epoch: 31817, Training Loss: 0.00805
Epoch: 31817, Training Loss: 0.00988
Epoch: 31818, Training Loss: 0.00772
Epoch: 31818, Training Loss: 0.00804
Epoch: 31818, Training Loss: 0.00805
Epoch: 31818, Training Loss: 0.00988
Epoch: 31819, Training Loss: 0.00772
Epoch: 31819, Training Loss: 0.00804
Epoch: 31819, Training Loss: 0.00805
Epoch: 31819, Training Loss: 0.00988
Epoch: 31820, Training Loss: 0.00772
Epoch: 31820, Training Loss: 0.00804
Epoch: 31820, Training Loss: 0.00805
Epoch: 31820, Training Loss: 0.00988
Epoch: 31821, Training Loss: 0.00772
Epoch: 31821, Training Loss: 0.00804
Epoch: 31821, Training Loss: 0.00805
Epoch: 31821, Training Loss: 0.00988
Epoch: 31822, Training Loss: 0.00772
Epoch: 31822, Training Loss: 0.00804
Epoch: 31822, Training Loss: 0.00805
Epoch: 31822, Training Loss: 0.00988
Epoch: 31823, Training Loss: 0.00772
Epoch: 31823, Training Loss: 0.00804
Epoch: 31823, Training Loss: 0.00805
Epoch: 31823, Training Loss: 0.00988
Epoch: 31824, Training Loss: 0.00772
Epoch: 31824, Training Loss: 0.00804
Epoch: 31824, Training Loss: 0.00805
Epoch: 31824, Training Loss: 0.00988
Epoch: 31825, Training Loss: 0.00772
Epoch: 31825, Training Loss: 0.00804
Epoch: 31825, Training Loss: 0.00805
Epoch: 31825, Training Loss: 0.00988
Epoch: 31826, Training Loss: 0.00772
Epoch: 31826, Training Loss: 0.00804
Epoch: 31826, Training Loss: 0.00805
Epoch: 31826, Training Loss: 0.00988
Epoch: 31827, Training Loss: 0.00772
Epoch: 31827, Training Loss: 0.00804
Epoch: 31827, Training Loss: 0.00805
Epoch: 31827, Training Loss: 0.00988
Epoch: 31828, Training Loss: 0.00772
Epoch: 31828, Training Loss: 0.00804
Epoch: 31828, Training Loss: 0.00805
Epoch: 31828, Training Loss: 0.00988
Epoch: 31829, Training Loss: 0.00772
Epoch: 31829, Training Loss: 0.00804
Epoch: 31829, Training Loss: 0.00805
Epoch: 31829, Training Loss: 0.00988
Epoch: 31830, Training Loss: 0.00772
Epoch: 31830, Training Loss: 0.00804
Epoch: 31830, Training Loss: 0.00805
Epoch: 31830, Training Loss: 0.00988
Epoch: 31831, Training Loss: 0.00772
Epoch: 31831, Training Loss: 0.00804
Epoch: 31831, Training Loss: 0.00805
Epoch: 31831, Training Loss: 0.00988
Epoch: 31832, Training Loss: 0.00772
Epoch: 31832, Training Loss: 0.00804
Epoch: 31832, Training Loss: 0.00805
Epoch: 31832, Training Loss: 0.00988
Epoch: 31833, Training Loss: 0.00772
Epoch: 31833, Training Loss: 0.00804
Epoch: 31833, Training Loss: 0.00805
Epoch: 31833, Training Loss: 0.00988
Epoch: 31834, Training Loss: 0.00772
Epoch: 31834, Training Loss: 0.00804
Epoch: 31834, Training Loss: 0.00805
Epoch: 31834, Training Loss: 0.00988
Epoch: 31835, Training Loss: 0.00772
Epoch: 31835, Training Loss: 0.00804
Epoch: 31835, Training Loss: 0.00805
Epoch: 31835, Training Loss: 0.00988
Epoch: 31836, Training Loss: 0.00772
Epoch: 31836, Training Loss: 0.00804
Epoch: 31836, Training Loss: 0.00805
Epoch: 31836, Training Loss: 0.00988
Epoch: 31837, Training Loss: 0.00772
Epoch: 31837, Training Loss: 0.00804
Epoch: 31837, Training Loss: 0.00805
Epoch: 31837, Training Loss: 0.00988
...
Epoch: 32080, Training Loss: 0.00769
Epoch: 32080, Training Loss: 0.00800
[output truncated: epochs 31838-32080 omitted. Each epoch prints one loss per XOR pattern; the four per-pattern losses decrease very slowly over this range, from (0.00772, 0.00804, 0.00805, 0.00988) to (0.00769, 0.00800, 0.00801, 0.00984).]
Epoch: 32080, Training Loss: 0.00801
Epoch: 32080, Training Loss: 0.00984
Epoch: 32081, Training Loss: 0.00769
Epoch: 32081, Training Loss: 0.00800
Epoch: 32081, Training Loss: 0.00801
Epoch: 32081, Training Loss: 0.00984
Epoch: 32082, Training Loss: 0.00769
Epoch: 32082, Training Loss: 0.00800
Epoch: 32082, Training Loss: 0.00801
Epoch: 32082, Training Loss: 0.00984
Epoch: 32083, Training Loss: 0.00769
Epoch: 32083, Training Loss: 0.00800
Epoch: 32083, Training Loss: 0.00801
Epoch: 32083, Training Loss: 0.00984
Epoch: 32084, Training Loss: 0.00768
Epoch: 32084, Training Loss: 0.00800
Epoch: 32084, Training Loss: 0.00801
Epoch: 32084, Training Loss: 0.00984
Epoch: 32085, Training Loss: 0.00768
Epoch: 32085, Training Loss: 0.00800
Epoch: 32085, Training Loss: 0.00801
Epoch: 32085, Training Loss: 0.00984
Epoch: 32086, Training Loss: 0.00768
Epoch: 32086, Training Loss: 0.00800
Epoch: 32086, Training Loss: 0.00801
Epoch: 32086, Training Loss: 0.00984
Epoch: 32087, Training Loss: 0.00768
Epoch: 32087, Training Loss: 0.00800
Epoch: 32087, Training Loss: 0.00801
Epoch: 32087, Training Loss: 0.00984
Epoch: 32088, Training Loss: 0.00768
Epoch: 32088, Training Loss: 0.00800
Epoch: 32088, Training Loss: 0.00801
Epoch: 32088, Training Loss: 0.00984
Epoch: 32089, Training Loss: 0.00768
Epoch: 32089, Training Loss: 0.00800
Epoch: 32089, Training Loss: 0.00801
Epoch: 32089, Training Loss: 0.00984
Epoch: 32090, Training Loss: 0.00768
Epoch: 32090, Training Loss: 0.00800
Epoch: 32090, Training Loss: 0.00801
Epoch: 32090, Training Loss: 0.00984
Epoch: 32091, Training Loss: 0.00768
Epoch: 32091, Training Loss: 0.00800
Epoch: 32091, Training Loss: 0.00801
Epoch: 32091, Training Loss: 0.00984
Epoch: 32092, Training Loss: 0.00768
Epoch: 32092, Training Loss: 0.00800
Epoch: 32092, Training Loss: 0.00801
Epoch: 32092, Training Loss: 0.00984
Epoch: 32093, Training Loss: 0.00768
Epoch: 32093, Training Loss: 0.00800
Epoch: 32093, Training Loss: 0.00801
Epoch: 32093, Training Loss: 0.00984
Epoch: 32094, Training Loss: 0.00768
Epoch: 32094, Training Loss: 0.00800
Epoch: 32094, Training Loss: 0.00801
Epoch: 32094, Training Loss: 0.00984
Epoch: 32095, Training Loss: 0.00768
Epoch: 32095, Training Loss: 0.00800
Epoch: 32095, Training Loss: 0.00801
Epoch: 32095, Training Loss: 0.00984
Epoch: 32096, Training Loss: 0.00768
Epoch: 32096, Training Loss: 0.00800
Epoch: 32096, Training Loss: 0.00801
Epoch: 32096, Training Loss: 0.00984
Epoch: 32097, Training Loss: 0.00768
Epoch: 32097, Training Loss: 0.00800
Epoch: 32097, Training Loss: 0.00801
Epoch: 32097, Training Loss: 0.00984
Epoch: 32098, Training Loss: 0.00768
Epoch: 32098, Training Loss: 0.00800
Epoch: 32098, Training Loss: 0.00801
Epoch: 32098, Training Loss: 0.00983
Epoch: 32099, Training Loss: 0.00768
Epoch: 32099, Training Loss: 0.00800
Epoch: 32099, Training Loss: 0.00801
Epoch: 32099, Training Loss: 0.00983
Epoch: 32100, Training Loss: 0.00768
Epoch: 32100, Training Loss: 0.00800
Epoch: 32100, Training Loss: 0.00801
Epoch: 32100, Training Loss: 0.00983
Epoch: 32101, Training Loss: 0.00768
Epoch: 32101, Training Loss: 0.00800
Epoch: 32101, Training Loss: 0.00801
Epoch: 32101, Training Loss: 0.00983
Epoch: 32102, Training Loss: 0.00768
Epoch: 32102, Training Loss: 0.00800
Epoch: 32102, Training Loss: 0.00801
Epoch: 32102, Training Loss: 0.00983
Epoch: 32103, Training Loss: 0.00768
Epoch: 32103, Training Loss: 0.00800
Epoch: 32103, Training Loss: 0.00801
Epoch: 32103, Training Loss: 0.00983
Epoch: 32104, Training Loss: 0.00768
Epoch: 32104, Training Loss: 0.00800
Epoch: 32104, Training Loss: 0.00801
Epoch: 32104, Training Loss: 0.00983
Epoch: 32105, Training Loss: 0.00768
Epoch: 32105, Training Loss: 0.00800
Epoch: 32105, Training Loss: 0.00801
Epoch: 32105, Training Loss: 0.00983
Epoch: 32106, Training Loss: 0.00768
Epoch: 32106, Training Loss: 0.00800
Epoch: 32106, Training Loss: 0.00801
Epoch: 32106, Training Loss: 0.00983
Epoch: 32107, Training Loss: 0.00768
Epoch: 32107, Training Loss: 0.00800
Epoch: 32107, Training Loss: 0.00801
Epoch: 32107, Training Loss: 0.00983
Epoch: 32108, Training Loss: 0.00768
Epoch: 32108, Training Loss: 0.00800
Epoch: 32108, Training Loss: 0.00801
Epoch: 32108, Training Loss: 0.00983
Epoch: 32109, Training Loss: 0.00768
Epoch: 32109, Training Loss: 0.00800
Epoch: 32109, Training Loss: 0.00801
Epoch: 32109, Training Loss: 0.00983
Epoch: 32110, Training Loss: 0.00768
Epoch: 32110, Training Loss: 0.00800
Epoch: 32110, Training Loss: 0.00801
Epoch: 32110, Training Loss: 0.00983
Epoch: 32111, Training Loss: 0.00768
Epoch: 32111, Training Loss: 0.00800
Epoch: 32111, Training Loss: 0.00801
Epoch: 32111, Training Loss: 0.00983
Epoch: 32112, Training Loss: 0.00768
Epoch: 32112, Training Loss: 0.00800
Epoch: 32112, Training Loss: 0.00801
Epoch: 32112, Training Loss: 0.00983
Epoch: 32113, Training Loss: 0.00768
Epoch: 32113, Training Loss: 0.00800
Epoch: 32113, Training Loss: 0.00801
Epoch: 32113, Training Loss: 0.00983
Epoch: 32114, Training Loss: 0.00768
Epoch: 32114, Training Loss: 0.00800
Epoch: 32114, Training Loss: 0.00801
Epoch: 32114, Training Loss: 0.00983
Epoch: 32115, Training Loss: 0.00768
Epoch: 32115, Training Loss: 0.00800
Epoch: 32115, Training Loss: 0.00801
Epoch: 32115, Training Loss: 0.00983
Epoch: 32116, Training Loss: 0.00768
Epoch: 32116, Training Loss: 0.00800
Epoch: 32116, Training Loss: 0.00801
Epoch: 32116, Training Loss: 0.00983
Epoch: 32117, Training Loss: 0.00768
Epoch: 32117, Training Loss: 0.00800
Epoch: 32117, Training Loss: 0.00801
Epoch: 32117, Training Loss: 0.00983
Epoch: 32118, Training Loss: 0.00768
Epoch: 32118, Training Loss: 0.00800
Epoch: 32118, Training Loss: 0.00801
Epoch: 32118, Training Loss: 0.00983
Epoch: 32119, Training Loss: 0.00768
Epoch: 32119, Training Loss: 0.00800
Epoch: 32119, Training Loss: 0.00801
Epoch: 32119, Training Loss: 0.00983
Epoch: 32120, Training Loss: 0.00768
Epoch: 32120, Training Loss: 0.00800
Epoch: 32120, Training Loss: 0.00801
Epoch: 32120, Training Loss: 0.00983
Epoch: 32121, Training Loss: 0.00768
Epoch: 32121, Training Loss: 0.00800
Epoch: 32121, Training Loss: 0.00801
Epoch: 32121, Training Loss: 0.00983
Epoch: 32122, Training Loss: 0.00768
Epoch: 32122, Training Loss: 0.00800
Epoch: 32122, Training Loss: 0.00801
Epoch: 32122, Training Loss: 0.00983
Epoch: 32123, Training Loss: 0.00768
Epoch: 32123, Training Loss: 0.00800
Epoch: 32123, Training Loss: 0.00801
Epoch: 32123, Training Loss: 0.00983
Epoch: 32124, Training Loss: 0.00768
Epoch: 32124, Training Loss: 0.00800
Epoch: 32124, Training Loss: 0.00801
Epoch: 32124, Training Loss: 0.00983
Epoch: 32125, Training Loss: 0.00768
Epoch: 32125, Training Loss: 0.00800
Epoch: 32125, Training Loss: 0.00801
Epoch: 32125, Training Loss: 0.00983
Epoch: 32126, Training Loss: 0.00768
Epoch: 32126, Training Loss: 0.00800
Epoch: 32126, Training Loss: 0.00801
Epoch: 32126, Training Loss: 0.00983
Epoch: 32127, Training Loss: 0.00768
Epoch: 32127, Training Loss: 0.00800
Epoch: 32127, Training Loss: 0.00801
Epoch: 32127, Training Loss: 0.00983
Epoch: 32128, Training Loss: 0.00768
Epoch: 32128, Training Loss: 0.00800
Epoch: 32128, Training Loss: 0.00801
Epoch: 32128, Training Loss: 0.00983
Epoch: 32129, Training Loss: 0.00768
Epoch: 32129, Training Loss: 0.00800
Epoch: 32129, Training Loss: 0.00801
Epoch: 32129, Training Loss: 0.00983
Epoch: 32130, Training Loss: 0.00768
Epoch: 32130, Training Loss: 0.00800
Epoch: 32130, Training Loss: 0.00801
Epoch: 32130, Training Loss: 0.00983
Epoch: 32131, Training Loss: 0.00768
Epoch: 32131, Training Loss: 0.00800
Epoch: 32131, Training Loss: 0.00801
Epoch: 32131, Training Loss: 0.00983
Epoch: 32132, Training Loss: 0.00768
Epoch: 32132, Training Loss: 0.00800
Epoch: 32132, Training Loss: 0.00800
Epoch: 32132, Training Loss: 0.00983
Epoch: 32133, Training Loss: 0.00768
Epoch: 32133, Training Loss: 0.00800
Epoch: 32133, Training Loss: 0.00800
Epoch: 32133, Training Loss: 0.00983
Epoch: 32134, Training Loss: 0.00768
Epoch: 32134, Training Loss: 0.00800
Epoch: 32134, Training Loss: 0.00800
Epoch: 32134, Training Loss: 0.00983
Epoch: 32135, Training Loss: 0.00768
Epoch: 32135, Training Loss: 0.00800
Epoch: 32135, Training Loss: 0.00800
Epoch: 32135, Training Loss: 0.00983
Epoch: 32136, Training Loss: 0.00768
Epoch: 32136, Training Loss: 0.00799
Epoch: 32136, Training Loss: 0.00800
Epoch: 32136, Training Loss: 0.00983
Epoch: 32137, Training Loss: 0.00768
Epoch: 32137, Training Loss: 0.00799
Epoch: 32137, Training Loss: 0.00800
Epoch: 32137, Training Loss: 0.00983
Epoch: 32138, Training Loss: 0.00768
Epoch: 32138, Training Loss: 0.00799
Epoch: 32138, Training Loss: 0.00800
Epoch: 32138, Training Loss: 0.00983
Epoch: 32139, Training Loss: 0.00768
Epoch: 32139, Training Loss: 0.00799
Epoch: 32139, Training Loss: 0.00800
Epoch: 32139, Training Loss: 0.00983
Epoch: 32140, Training Loss: 0.00768
Epoch: 32140, Training Loss: 0.00799
Epoch: 32140, Training Loss: 0.00800
Epoch: 32140, Training Loss: 0.00983
Epoch: 32141, Training Loss: 0.00768
Epoch: 32141, Training Loss: 0.00799
Epoch: 32141, Training Loss: 0.00800
Epoch: 32141, Training Loss: 0.00983
Epoch: 32142, Training Loss: 0.00768
Epoch: 32142, Training Loss: 0.00799
Epoch: 32142, Training Loss: 0.00800
Epoch: 32142, Training Loss: 0.00983
Epoch: 32143, Training Loss: 0.00768
Epoch: 32143, Training Loss: 0.00799
Epoch: 32143, Training Loss: 0.00800
Epoch: 32143, Training Loss: 0.00983
Epoch: 32144, Training Loss: 0.00768
Epoch: 32144, Training Loss: 0.00799
Epoch: 32144, Training Loss: 0.00800
Epoch: 32144, Training Loss: 0.00983
Epoch: 32145, Training Loss: 0.00768
Epoch: 32145, Training Loss: 0.00799
Epoch: 32145, Training Loss: 0.00800
Epoch: 32145, Training Loss: 0.00983
Epoch: 32146, Training Loss: 0.00768
Epoch: 32146, Training Loss: 0.00799
Epoch: 32146, Training Loss: 0.00800
Epoch: 32146, Training Loss: 0.00983
Epoch: 32147, Training Loss: 0.00768
Epoch: 32147, Training Loss: 0.00799
Epoch: 32147, Training Loss: 0.00800
Epoch: 32147, Training Loss: 0.00983
Epoch: 32148, Training Loss: 0.00768
Epoch: 32148, Training Loss: 0.00799
Epoch: 32148, Training Loss: 0.00800
Epoch: 32148, Training Loss: 0.00983
Epoch: 32149, Training Loss: 0.00768
Epoch: 32149, Training Loss: 0.00799
Epoch: 32149, Training Loss: 0.00800
Epoch: 32149, Training Loss: 0.00983
Epoch: 32150, Training Loss: 0.00768
Epoch: 32150, Training Loss: 0.00799
Epoch: 32150, Training Loss: 0.00800
Epoch: 32150, Training Loss: 0.00983
Epoch: 32151, Training Loss: 0.00768
Epoch: 32151, Training Loss: 0.00799
Epoch: 32151, Training Loss: 0.00800
Epoch: 32151, Training Loss: 0.00983
Epoch: 32152, Training Loss: 0.00768
Epoch: 32152, Training Loss: 0.00799
Epoch: 32152, Training Loss: 0.00800
Epoch: 32152, Training Loss: 0.00983
Epoch: 32153, Training Loss: 0.00768
Epoch: 32153, Training Loss: 0.00799
Epoch: 32153, Training Loss: 0.00800
Epoch: 32153, Training Loss: 0.00983
Epoch: 32154, Training Loss: 0.00768
Epoch: 32154, Training Loss: 0.00799
Epoch: 32154, Training Loss: 0.00800
Epoch: 32154, Training Loss: 0.00983
Epoch: 32155, Training Loss: 0.00768
Epoch: 32155, Training Loss: 0.00799
Epoch: 32155, Training Loss: 0.00800
Epoch: 32155, Training Loss: 0.00983
Epoch: 32156, Training Loss: 0.00768
Epoch: 32156, Training Loss: 0.00799
Epoch: 32156, Training Loss: 0.00800
Epoch: 32156, Training Loss: 0.00982
Epoch: 32157, Training Loss: 0.00768
Epoch: 32157, Training Loss: 0.00799
Epoch: 32157, Training Loss: 0.00800
Epoch: 32157, Training Loss: 0.00982
Epoch: 32158, Training Loss: 0.00768
Epoch: 32158, Training Loss: 0.00799
Epoch: 32158, Training Loss: 0.00800
Epoch: 32158, Training Loss: 0.00982
Epoch: 32159, Training Loss: 0.00768
Epoch: 32159, Training Loss: 0.00799
Epoch: 32159, Training Loss: 0.00800
Epoch: 32159, Training Loss: 0.00982
Epoch: 32160, Training Loss: 0.00767
Epoch: 32160, Training Loss: 0.00799
Epoch: 32160, Training Loss: 0.00800
Epoch: 32160, Training Loss: 0.00982
Epoch: 32161, Training Loss: 0.00767
Epoch: 32161, Training Loss: 0.00799
Epoch: 32161, Training Loss: 0.00800
Epoch: 32161, Training Loss: 0.00982
Epoch: 32162, Training Loss: 0.00767
Epoch: 32162, Training Loss: 0.00799
Epoch: 32162, Training Loss: 0.00800
Epoch: 32162, Training Loss: 0.00982
Epoch: 32163, Training Loss: 0.00767
Epoch: 32163, Training Loss: 0.00799
Epoch: 32163, Training Loss: 0.00800
Epoch: 32163, Training Loss: 0.00982
Epoch: 32164, Training Loss: 0.00767
Epoch: 32164, Training Loss: 0.00799
Epoch: 32164, Training Loss: 0.00800
Epoch: 32164, Training Loss: 0.00982
Epoch: 32165, Training Loss: 0.00767
Epoch: 32165, Training Loss: 0.00799
Epoch: 32165, Training Loss: 0.00800
Epoch: 32165, Training Loss: 0.00982
Epoch: 32166, Training Loss: 0.00767
Epoch: 32166, Training Loss: 0.00799
Epoch: 32166, Training Loss: 0.00800
Epoch: 32166, Training Loss: 0.00982
Epoch: 32167, Training Loss: 0.00767
Epoch: 32167, Training Loss: 0.00799
Epoch: 32167, Training Loss: 0.00800
Epoch: 32167, Training Loss: 0.00982
Epoch: 32168, Training Loss: 0.00767
Epoch: 32168, Training Loss: 0.00799
Epoch: 32168, Training Loss: 0.00800
Epoch: 32168, Training Loss: 0.00982
Epoch: 32169, Training Loss: 0.00767
Epoch: 32169, Training Loss: 0.00799
Epoch: 32169, Training Loss: 0.00800
Epoch: 32169, Training Loss: 0.00982
Epoch: 32170, Training Loss: 0.00767
Epoch: 32170, Training Loss: 0.00799
Epoch: 32170, Training Loss: 0.00800
Epoch: 32170, Training Loss: 0.00982
Epoch: 32171, Training Loss: 0.00767
Epoch: 32171, Training Loss: 0.00799
Epoch: 32171, Training Loss: 0.00800
Epoch: 32171, Training Loss: 0.00982
Epoch: 32172, Training Loss: 0.00767
Epoch: 32172, Training Loss: 0.00799
Epoch: 32172, Training Loss: 0.00800
Epoch: 32172, Training Loss: 0.00982
Epoch: 32173, Training Loss: 0.00767
Epoch: 32173, Training Loss: 0.00799
Epoch: 32173, Training Loss: 0.00800
Epoch: 32173, Training Loss: 0.00982
Epoch: 32174, Training Loss: 0.00767
Epoch: 32174, Training Loss: 0.00799
Epoch: 32174, Training Loss: 0.00800
Epoch: 32174, Training Loss: 0.00982
Epoch: 32175, Training Loss: 0.00767
Epoch: 32175, Training Loss: 0.00799
Epoch: 32175, Training Loss: 0.00800
Epoch: 32175, Training Loss: 0.00982
Epoch: 32176, Training Loss: 0.00767
Epoch: 32176, Training Loss: 0.00799
Epoch: 32176, Training Loss: 0.00800
Epoch: 32176, Training Loss: 0.00982
Epoch: 32177, Training Loss: 0.00767
Epoch: 32177, Training Loss: 0.00799
Epoch: 32177, Training Loss: 0.00800
Epoch: 32177, Training Loss: 0.00982
Epoch: 32178, Training Loss: 0.00767
Epoch: 32178, Training Loss: 0.00799
Epoch: 32178, Training Loss: 0.00800
Epoch: 32178, Training Loss: 0.00982
Epoch: 32179, Training Loss: 0.00767
Epoch: 32179, Training Loss: 0.00799
Epoch: 32179, Training Loss: 0.00800
Epoch: 32179, Training Loss: 0.00982
Epoch: 32180, Training Loss: 0.00767
Epoch: 32180, Training Loss: 0.00799
Epoch: 32180, Training Loss: 0.00800
Epoch: 32180, Training Loss: 0.00982
Epoch: 32181, Training Loss: 0.00767
Epoch: 32181, Training Loss: 0.00799
Epoch: 32181, Training Loss: 0.00800
Epoch: 32181, Training Loss: 0.00982
Epoch: 32182, Training Loss: 0.00767
Epoch: 32182, Training Loss: 0.00799
Epoch: 32182, Training Loss: 0.00800
Epoch: 32182, Training Loss: 0.00982
Epoch: 32183, Training Loss: 0.00767
Epoch: 32183, Training Loss: 0.00799
Epoch: 32183, Training Loss: 0.00800
Epoch: 32183, Training Loss: 0.00982
Epoch: 32184, Training Loss: 0.00767
Epoch: 32184, Training Loss: 0.00799
Epoch: 32184, Training Loss: 0.00800
Epoch: 32184, Training Loss: 0.00982
Epoch: 32185, Training Loss: 0.00767
Epoch: 32185, Training Loss: 0.00799
Epoch: 32185, Training Loss: 0.00800
Epoch: 32185, Training Loss: 0.00982
Epoch: 32186, Training Loss: 0.00767
Epoch: 32186, Training Loss: 0.00799
Epoch: 32186, Training Loss: 0.00800
Epoch: 32186, Training Loss: 0.00982
Epoch: 32187, Training Loss: 0.00767
Epoch: 32187, Training Loss: 0.00799
Epoch: 32187, Training Loss: 0.00800
Epoch: 32187, Training Loss: 0.00982
Epoch: 32188, Training Loss: 0.00767
Epoch: 32188, Training Loss: 0.00799
Epoch: 32188, Training Loss: 0.00800
Epoch: 32188, Training Loss: 0.00982
Epoch: 32189, Training Loss: 0.00767
Epoch: 32189, Training Loss: 0.00799
Epoch: 32189, Training Loss: 0.00800
Epoch: 32189, Training Loss: 0.00982
Epoch: 32190, Training Loss: 0.00767
Epoch: 32190, Training Loss: 0.00799
Epoch: 32190, Training Loss: 0.00800
Epoch: 32190, Training Loss: 0.00982
Epoch: 32191, Training Loss: 0.00767
Epoch: 32191, Training Loss: 0.00799
Epoch: 32191, Training Loss: 0.00800
Epoch: 32191, Training Loss: 0.00982
Epoch: 32192, Training Loss: 0.00767
Epoch: 32192, Training Loss: 0.00799
Epoch: 32192, Training Loss: 0.00800
Epoch: 32192, Training Loss: 0.00982
Epoch: 32193, Training Loss: 0.00767
Epoch: 32193, Training Loss: 0.00799
Epoch: 32193, Training Loss: 0.00800
Epoch: 32193, Training Loss: 0.00982
Epoch: 32194, Training Loss: 0.00767
Epoch: 32194, Training Loss: 0.00799
Epoch: 32194, Training Loss: 0.00800
Epoch: 32194, Training Loss: 0.00982
Epoch: 32195, Training Loss: 0.00767
Epoch: 32195, Training Loss: 0.00799
Epoch: 32195, Training Loss: 0.00800
Epoch: 32195, Training Loss: 0.00982
Epoch: 32196, Training Loss: 0.00767
Epoch: 32196, Training Loss: 0.00799
Epoch: 32196, Training Loss: 0.00800
Epoch: 32196, Training Loss: 0.00982
Epoch: 32197, Training Loss: 0.00767
Epoch: 32197, Training Loss: 0.00799
Epoch: 32197, Training Loss: 0.00800
Epoch: 32197, Training Loss: 0.00982
Epoch: 32198, Training Loss: 0.00767
Epoch: 32198, Training Loss: 0.00799
Epoch: 32198, Training Loss: 0.00800
Epoch: 32198, Training Loss: 0.00982
Epoch: 32199, Training Loss: 0.00767
Epoch: 32199, Training Loss: 0.00799
Epoch: 32199, Training Loss: 0.00800
Epoch: 32199, Training Loss: 0.00982
Epoch: 32200, Training Loss: 0.00767
Epoch: 32200, Training Loss: 0.00799
Epoch: 32200, Training Loss: 0.00800
Epoch: 32200, Training Loss: 0.00982
Epoch: 32201, Training Loss: 0.00767
Epoch: 32201, Training Loss: 0.00799
Epoch: 32201, Training Loss: 0.00800
Epoch: 32201, Training Loss: 0.00982
Epoch: 32202, Training Loss: 0.00767
Epoch: 32202, Training Loss: 0.00799
Epoch: 32202, Training Loss: 0.00800
Epoch: 32202, Training Loss: 0.00982
Epoch: 32203, Training Loss: 0.00767
Epoch: 32203, Training Loss: 0.00799
Epoch: 32203, Training Loss: 0.00800
Epoch: 32203, Training Loss: 0.00982
Epoch: 32204, Training Loss: 0.00767
Epoch: 32204, Training Loss: 0.00799
Epoch: 32204, Training Loss: 0.00799
Epoch: 32204, Training Loss: 0.00982
Epoch: 32205, Training Loss: 0.00767
Epoch: 32205, Training Loss: 0.00799
Epoch: 32205, Training Loss: 0.00799
Epoch: 32205, Training Loss: 0.00982
Epoch: 32206, Training Loss: 0.00767
Epoch: 32206, Training Loss: 0.00799
Epoch: 32206, Training Loss: 0.00799
Epoch: 32206, Training Loss: 0.00982
Epoch: 32207, Training Loss: 0.00767
Epoch: 32207, Training Loss: 0.00799
Epoch: 32207, Training Loss: 0.00799
Epoch: 32207, Training Loss: 0.00982
Epoch: 32208, Training Loss: 0.00767
Epoch: 32208, Training Loss: 0.00798
Epoch: 32208, Training Loss: 0.00799
Epoch: 32208, Training Loss: 0.00982
Epoch: 32209, Training Loss: 0.00767
Epoch: 32209, Training Loss: 0.00798
Epoch: 32209, Training Loss: 0.00799
Epoch: 32209, Training Loss: 0.00982
Epoch: 32210, Training Loss: 0.00767
Epoch: 32210, Training Loss: 0.00798
Epoch: 32210, Training Loss: 0.00799
Epoch: 32210, Training Loss: 0.00982
Epoch: 32211, Training Loss: 0.00767
Epoch: 32211, Training Loss: 0.00798
Epoch: 32211, Training Loss: 0.00799
Epoch: 32211, Training Loss: 0.00982
Epoch: 32212, Training Loss: 0.00767
Epoch: 32212, Training Loss: 0.00798
Epoch: 32212, Training Loss: 0.00799
Epoch: 32212, Training Loss: 0.00982
Epoch: 32213, Training Loss: 0.00767
Epoch: 32213, Training Loss: 0.00798
Epoch: 32213, Training Loss: 0.00799
Epoch: 32213, Training Loss: 0.00982
Epoch: 32214, Training Loss: 0.00767
Epoch: 32214, Training Loss: 0.00798
Epoch: 32214, Training Loss: 0.00799
Epoch: 32214, Training Loss: 0.00981
Epoch: 32215, Training Loss: 0.00767
Epoch: 32215, Training Loss: 0.00798
Epoch: 32215, Training Loss: 0.00799
Epoch: 32215, Training Loss: 0.00981
Epoch: 32216, Training Loss: 0.00767
Epoch: 32216, Training Loss: 0.00798
Epoch: 32216, Training Loss: 0.00799
Epoch: 32216, Training Loss: 0.00981
Epoch: 32217, Training Loss: 0.00767
Epoch: 32217, Training Loss: 0.00798
Epoch: 32217, Training Loss: 0.00799
Epoch: 32217, Training Loss: 0.00981
Epoch: 32218, Training Loss: 0.00767
Epoch: 32218, Training Loss: 0.00798
Epoch: 32218, Training Loss: 0.00799
Epoch: 32218, Training Loss: 0.00981
Epoch: 32219, Training Loss: 0.00767
Epoch: 32219, Training Loss: 0.00798
Epoch: 32219, Training Loss: 0.00799
Epoch: 32219, Training Loss: 0.00981
Epoch: 32220, Training Loss: 0.00767
Epoch: 32220, Training Loss: 0.00798
Epoch: 32220, Training Loss: 0.00799
Epoch: 32220, Training Loss: 0.00981
Epoch: 32221, Training Loss: 0.00767
Epoch: 32221, Training Loss: 0.00798
Epoch: 32221, Training Loss: 0.00799
Epoch: 32221, Training Loss: 0.00981
Epoch: 32222, Training Loss: 0.00767
Epoch: 32222, Training Loss: 0.00798
Epoch: 32222, Training Loss: 0.00799
Epoch: 32222, Training Loss: 0.00981
Epoch: 32223, Training Loss: 0.00767
Epoch: 32223, Training Loss: 0.00798
Epoch: 32223, Training Loss: 0.00799
Epoch: 32223, Training Loss: 0.00981
Epoch: 32224, Training Loss: 0.00767
Epoch: 32224, Training Loss: 0.00798
Epoch: 32224, Training Loss: 0.00799
Epoch: 32224, Training Loss: 0.00981
Epoch: 32225, Training Loss: 0.00767
Epoch: 32225, Training Loss: 0.00798
Epoch: 32225, Training Loss: 0.00799
Epoch: 32225, Training Loss: 0.00981
Epoch: 32226, Training Loss: 0.00767
Epoch: 32226, Training Loss: 0.00798
Epoch: 32226, Training Loss: 0.00799
Epoch: 32226, Training Loss: 0.00981
Epoch: 32227, Training Loss: 0.00767
Epoch: 32227, Training Loss: 0.00798
Epoch: 32227, Training Loss: 0.00799
Epoch: 32227, Training Loss: 0.00981
Epoch: 32228, Training Loss: 0.00767
Epoch: 32228, Training Loss: 0.00798
Epoch: 32228, Training Loss: 0.00799
Epoch: 32228, Training Loss: 0.00981
Epoch: 32229, Training Loss: 0.00767
Epoch: 32229, Training Loss: 0.00798
Epoch: 32229, Training Loss: 0.00799
Epoch: 32229, Training Loss: 0.00981
Epoch: 32230, Training Loss: 0.00767
Epoch: 32230, Training Loss: 0.00798
Epoch: 32230, Training Loss: 0.00799
Epoch: 32230, Training Loss: 0.00981
Epoch: 32231, Training Loss: 0.00767
Epoch: 32231, Training Loss: 0.00798
Epoch: 32231, Training Loss: 0.00799
Epoch: 32231, Training Loss: 0.00981
Epoch: 32232, Training Loss: 0.00767
Epoch: 32232, Training Loss: 0.00798
Epoch: 32232, Training Loss: 0.00799
Epoch: 32232, Training Loss: 0.00981
Epoch: 32233, Training Loss: 0.00767
Epoch: 32233, Training Loss: 0.00798
Epoch: 32233, Training Loss: 0.00799
Epoch: 32233, Training Loss: 0.00981
Epoch: 32234, Training Loss: 0.00767
Epoch: 32234, Training Loss: 0.00798
Epoch: 32234, Training Loss: 0.00799
Epoch: 32234, Training Loss: 0.00981
Epoch: 32235, Training Loss: 0.00767
Epoch: 32235, Training Loss: 0.00798
Epoch: 32235, Training Loss: 0.00799
Epoch: 32235, Training Loss: 0.00981
Epoch: 32236, Training Loss: 0.00767
Epoch: 32236, Training Loss: 0.00798
Epoch: 32236, Training Loss: 0.00799
Epoch: 32236, Training Loss: 0.00981
Epoch: 32237, Training Loss: 0.00766
Epoch: 32237, Training Loss: 0.00798
Epoch: 32237, Training Loss: 0.00799
Epoch: 32237, Training Loss: 0.00981
Epoch: 32238, Training Loss: 0.00766
Epoch: 32238, Training Loss: 0.00798
Epoch: 32238, Training Loss: 0.00799
Epoch: 32238, Training Loss: 0.00981
Epoch: 32239, Training Loss: 0.00766
Epoch: 32239, Training Loss: 0.00798
Epoch: 32239, Training Loss: 0.00799
Epoch: 32239, Training Loss: 0.00981
Epoch: 32240, Training Loss: 0.00766
Epoch: 32240, Training Loss: 0.00798
Epoch: 32240, Training Loss: 0.00799
Epoch: 32240, Training Loss: 0.00981
Epoch: 32241, Training Loss: 0.00766
Epoch: 32241, Training Loss: 0.00798
Epoch: 32241, Training Loss: 0.00799
Epoch: 32241, Training Loss: 0.00981
Epoch: 32242, Training Loss: 0.00766
Epoch: 32242, Training Loss: 0.00798
Epoch: 32242, Training Loss: 0.00799
Epoch: 32242, Training Loss: 0.00981
Epoch: 32243, Training Loss: 0.00766
Epoch: 32243, Training Loss: 0.00798
Epoch: 32243, Training Loss: 0.00799
Epoch: 32243, Training Loss: 0.00981
Epoch: 32244, Training Loss: 0.00766
Epoch: 32244, Training Loss: 0.00798
Epoch: 32244, Training Loss: 0.00799
Epoch: 32244, Training Loss: 0.00981
Epoch: 32245, Training Loss: 0.00766
Epoch: 32245, Training Loss: 0.00798
Epoch: 32245, Training Loss: 0.00799
Epoch: 32245, Training Loss: 0.00981
Epoch: 32246, Training Loss: 0.00766
Epoch: 32246, Training Loss: 0.00798
Epoch: 32246, Training Loss: 0.00799
Epoch: 32246, Training Loss: 0.00981
Epoch: 32247, Training Loss: 0.00766
Epoch: 32247, Training Loss: 0.00798
Epoch: 32247, Training Loss: 0.00799
Epoch: 32247, Training Loss: 0.00981
Epoch: 32248, Training Loss: 0.00766
Epoch: 32248, Training Loss: 0.00798
Epoch: 32248, Training Loss: 0.00799
Epoch: 32248, Training Loss: 0.00981
Epoch: 32249, Training Loss: 0.00766
Epoch: 32249, Training Loss: 0.00798
Epoch: 32249, Training Loss: 0.00799
Epoch: 32249, Training Loss: 0.00981
Epoch: 32250, Training Loss: 0.00766
Epoch: 32250, Training Loss: 0.00798
Epoch: 32250, Training Loss: 0.00799
Epoch: 32250, Training Loss: 0.00981
Epoch: 32251, Training Loss: 0.00766
Epoch: 32251, Training Loss: 0.00798
Epoch: 32251, Training Loss: 0.00799
Epoch: 32251, Training Loss: 0.00981
Epoch: 32252, Training Loss: 0.00766
Epoch: 32252, Training Loss: 0.00798
Epoch: 32252, Training Loss: 0.00799
Epoch: 32252, Training Loss: 0.00981
Epoch: 32253, Training Loss: 0.00766
Epoch: 32253, Training Loss: 0.00798
Epoch: 32253, Training Loss: 0.00799
Epoch: 32253, Training Loss: 0.00981
Epoch: 32254, Training Loss: 0.00766
Epoch: 32254, Training Loss: 0.00798
Epoch: 32254, Training Loss: 0.00799
Epoch: 32254, Training Loss: 0.00981
Epoch: 32255, Training Loss: 0.00766
Epoch: 32255, Training Loss: 0.00798
Epoch: 32255, Training Loss: 0.00799
Epoch: 32255, Training Loss: 0.00981
Epoch: 32256, Training Loss: 0.00766
Epoch: 32256, Training Loss: 0.00798
Epoch: 32256, Training Loss: 0.00799
Epoch: 32256, Training Loss: 0.00981
Epoch: 32257, Training Loss: 0.00766
Epoch: 32257, Training Loss: 0.00798
Epoch: 32257, Training Loss: 0.00799
Epoch: 32257, Training Loss: 0.00981
Epoch: 32258, Training Loss: 0.00766
Epoch: 32258, Training Loss: 0.00798
Epoch: 32258, Training Loss: 0.00799
Epoch: 32258, Training Loss: 0.00981
Epoch: 32259, Training Loss: 0.00766
Epoch: 32259, Training Loss: 0.00798
Epoch: 32259, Training Loss: 0.00799
Epoch: 32259, Training Loss: 0.00981
Epoch: 32260, Training Loss: 0.00766
Epoch: 32260, Training Loss: 0.00798
Epoch: 32260, Training Loss: 0.00799
Epoch: 32260, Training Loss: 0.00981
Epoch: 32261, Training Loss: 0.00766
Epoch: 32261, Training Loss: 0.00798
Epoch: 32261, Training Loss: 0.00799
Epoch: 32261, Training Loss: 0.00981
Epoch: 32262, Training Loss: 0.00766
Epoch: 32262, Training Loss: 0.00798
Epoch: 32262, Training Loss: 0.00799
Epoch: 32262, Training Loss: 0.00981
Epoch: 32263, Training Loss: 0.00766
Epoch: 32263, Training Loss: 0.00798
Epoch: 32263, Training Loss: 0.00799
Epoch: 32263, Training Loss: 0.00981
Epoch: 32264, Training Loss: 0.00766
Epoch: 32264, Training Loss: 0.00798
Epoch: 32264, Training Loss: 0.00799
Epoch: 32264, Training Loss: 0.00981
Epoch: 32265, Training Loss: 0.00766
Epoch: 32265, Training Loss: 0.00798
Epoch: 32265, Training Loss: 0.00799
Epoch: 32265, Training Loss: 0.00981
Epoch: 32266, Training Loss: 0.00766
Epoch: 32266, Training Loss: 0.00798
Epoch: 32266, Training Loss: 0.00799
Epoch: 32266, Training Loss: 0.00981
Epoch: 32267, Training Loss: 0.00766
Epoch: 32267, Training Loss: 0.00798
Epoch: 32267, Training Loss: 0.00799
Epoch: 32267, Training Loss: 0.00981
Epoch: 32268, Training Loss: 0.00766
Epoch: 32268, Training Loss: 0.00798
Epoch: 32268, Training Loss: 0.00799
Epoch: 32268, Training Loss: 0.00981
Epoch: 32269, Training Loss: 0.00766
Epoch: 32269, Training Loss: 0.00798
Epoch: 32269, Training Loss: 0.00799
Epoch: 32269, Training Loss: 0.00981
... (output truncated: per-sample losses decrease slowly, in steps of ~0.00001, over epochs 32269-32512) ...
Epoch: 32512, Training Loss: 0.00763
Epoch: 32512, Training Loss: 0.00794
Epoch: 32512, Training Loss: 0.00795
Epoch: 32512, Training Loss: 0.00976
Epoch: 32513, Training Loss: 0.00763
Epoch: 32513, Training Loss: 0.00794
Epoch: 32513, Training Loss: 0.00795
Epoch: 32513, Training Loss: 0.00976
Epoch: 32514, Training Loss: 0.00763
Epoch: 32514, Training Loss: 0.00794
Epoch: 32514, Training Loss: 0.00795
Epoch: 32514, Training Loss: 0.00976
Epoch: 32515, Training Loss: 0.00763
Epoch: 32515, Training Loss: 0.00794
Epoch: 32515, Training Loss: 0.00795
Epoch: 32515, Training Loss: 0.00976
Epoch: 32516, Training Loss: 0.00763
Epoch: 32516, Training Loss: 0.00794
Epoch: 32516, Training Loss: 0.00795
Epoch: 32516, Training Loss: 0.00976
Epoch: 32517, Training Loss: 0.00763
Epoch: 32517, Training Loss: 0.00794
Epoch: 32517, Training Loss: 0.00795
Epoch: 32517, Training Loss: 0.00976
Epoch: 32518, Training Loss: 0.00763
Epoch: 32518, Training Loss: 0.00794
Epoch: 32518, Training Loss: 0.00795
Epoch: 32518, Training Loss: 0.00976
Epoch: 32519, Training Loss: 0.00763
Epoch: 32519, Training Loss: 0.00794
Epoch: 32519, Training Loss: 0.00795
Epoch: 32519, Training Loss: 0.00976
Epoch: 32520, Training Loss: 0.00763
Epoch: 32520, Training Loss: 0.00794
Epoch: 32520, Training Loss: 0.00795
Epoch: 32520, Training Loss: 0.00976
Epoch: 32521, Training Loss: 0.00763
Epoch: 32521, Training Loss: 0.00794
Epoch: 32521, Training Loss: 0.00795
Epoch: 32521, Training Loss: 0.00976
Epoch: 32522, Training Loss: 0.00763
Epoch: 32522, Training Loss: 0.00794
Epoch: 32522, Training Loss: 0.00795
Epoch: 32522, Training Loss: 0.00976
Epoch: 32523, Training Loss: 0.00763
Epoch: 32523, Training Loss: 0.00794
Epoch: 32523, Training Loss: 0.00795
Epoch: 32523, Training Loss: 0.00976
Epoch: 32524, Training Loss: 0.00763
Epoch: 32524, Training Loss: 0.00794
Epoch: 32524, Training Loss: 0.00795
Epoch: 32524, Training Loss: 0.00976
Epoch: 32525, Training Loss: 0.00763
Epoch: 32525, Training Loss: 0.00794
Epoch: 32525, Training Loss: 0.00795
Epoch: 32525, Training Loss: 0.00976
Epoch: 32526, Training Loss: 0.00763
Epoch: 32526, Training Loss: 0.00794
Epoch: 32526, Training Loss: 0.00795
Epoch: 32526, Training Loss: 0.00976
Epoch: 32527, Training Loss: 0.00763
Epoch: 32527, Training Loss: 0.00794
Epoch: 32527, Training Loss: 0.00795
Epoch: 32527, Training Loss: 0.00976
Epoch: 32528, Training Loss: 0.00763
Epoch: 32528, Training Loss: 0.00794
Epoch: 32528, Training Loss: 0.00795
Epoch: 32528, Training Loss: 0.00976
Epoch: 32529, Training Loss: 0.00763
Epoch: 32529, Training Loss: 0.00794
Epoch: 32529, Training Loss: 0.00795
Epoch: 32529, Training Loss: 0.00976
Epoch: 32530, Training Loss: 0.00763
Epoch: 32530, Training Loss: 0.00794
Epoch: 32530, Training Loss: 0.00795
Epoch: 32530, Training Loss: 0.00976
Epoch: 32531, Training Loss: 0.00763
Epoch: 32531, Training Loss: 0.00794
Epoch: 32531, Training Loss: 0.00795
Epoch: 32531, Training Loss: 0.00976
Epoch: 32532, Training Loss: 0.00763
Epoch: 32532, Training Loss: 0.00794
Epoch: 32532, Training Loss: 0.00795
Epoch: 32532, Training Loss: 0.00976
Epoch: 32533, Training Loss: 0.00763
Epoch: 32533, Training Loss: 0.00794
Epoch: 32533, Training Loss: 0.00795
Epoch: 32533, Training Loss: 0.00976
Epoch: 32534, Training Loss: 0.00763
Epoch: 32534, Training Loss: 0.00794
Epoch: 32534, Training Loss: 0.00795
Epoch: 32534, Training Loss: 0.00976
Epoch: 32535, Training Loss: 0.00763
Epoch: 32535, Training Loss: 0.00794
Epoch: 32535, Training Loss: 0.00795
Epoch: 32535, Training Loss: 0.00976
Epoch: 32536, Training Loss: 0.00763
Epoch: 32536, Training Loss: 0.00794
Epoch: 32536, Training Loss: 0.00795
Epoch: 32536, Training Loss: 0.00976
Epoch: 32537, Training Loss: 0.00763
Epoch: 32537, Training Loss: 0.00794
Epoch: 32537, Training Loss: 0.00795
Epoch: 32537, Training Loss: 0.00976
Epoch: 32538, Training Loss: 0.00763
Epoch: 32538, Training Loss: 0.00794
Epoch: 32538, Training Loss: 0.00795
Epoch: 32538, Training Loss: 0.00976
Epoch: 32539, Training Loss: 0.00763
Epoch: 32539, Training Loss: 0.00794
Epoch: 32539, Training Loss: 0.00795
Epoch: 32539, Training Loss: 0.00976
Epoch: 32540, Training Loss: 0.00763
Epoch: 32540, Training Loss: 0.00794
Epoch: 32540, Training Loss: 0.00795
Epoch: 32540, Training Loss: 0.00976
Epoch: 32541, Training Loss: 0.00763
Epoch: 32541, Training Loss: 0.00794
Epoch: 32541, Training Loss: 0.00795
Epoch: 32541, Training Loss: 0.00976
Epoch: 32542, Training Loss: 0.00763
Epoch: 32542, Training Loss: 0.00794
Epoch: 32542, Training Loss: 0.00795
Epoch: 32542, Training Loss: 0.00976
Epoch: 32543, Training Loss: 0.00763
Epoch: 32543, Training Loss: 0.00794
Epoch: 32543, Training Loss: 0.00795
Epoch: 32543, Training Loss: 0.00976
Epoch: 32544, Training Loss: 0.00763
Epoch: 32544, Training Loss: 0.00794
Epoch: 32544, Training Loss: 0.00795
Epoch: 32544, Training Loss: 0.00976
Epoch: 32545, Training Loss: 0.00763
Epoch: 32545, Training Loss: 0.00794
Epoch: 32545, Training Loss: 0.00795
Epoch: 32545, Training Loss: 0.00976
Epoch: 32546, Training Loss: 0.00763
Epoch: 32546, Training Loss: 0.00794
Epoch: 32546, Training Loss: 0.00795
Epoch: 32546, Training Loss: 0.00976
Epoch: 32547, Training Loss: 0.00762
Epoch: 32547, Training Loss: 0.00794
Epoch: 32547, Training Loss: 0.00795
Epoch: 32547, Training Loss: 0.00976
Epoch: 32548, Training Loss: 0.00762
Epoch: 32548, Training Loss: 0.00794
Epoch: 32548, Training Loss: 0.00795
Epoch: 32548, Training Loss: 0.00976
Epoch: 32549, Training Loss: 0.00762
Epoch: 32549, Training Loss: 0.00794
Epoch: 32549, Training Loss: 0.00795
Epoch: 32549, Training Loss: 0.00976
Epoch: 32550, Training Loss: 0.00762
Epoch: 32550, Training Loss: 0.00794
Epoch: 32550, Training Loss: 0.00795
Epoch: 32550, Training Loss: 0.00976
Epoch: 32551, Training Loss: 0.00762
Epoch: 32551, Training Loss: 0.00794
Epoch: 32551, Training Loss: 0.00795
Epoch: 32551, Training Loss: 0.00976
Epoch: 32552, Training Loss: 0.00762
Epoch: 32552, Training Loss: 0.00794
Epoch: 32552, Training Loss: 0.00795
Epoch: 32552, Training Loss: 0.00976
Epoch: 32553, Training Loss: 0.00762
Epoch: 32553, Training Loss: 0.00794
Epoch: 32553, Training Loss: 0.00795
Epoch: 32553, Training Loss: 0.00976
Epoch: 32554, Training Loss: 0.00762
Epoch: 32554, Training Loss: 0.00794
Epoch: 32554, Training Loss: 0.00795
Epoch: 32554, Training Loss: 0.00976
Epoch: 32555, Training Loss: 0.00762
Epoch: 32555, Training Loss: 0.00794
Epoch: 32555, Training Loss: 0.00795
Epoch: 32555, Training Loss: 0.00976
Epoch: 32556, Training Loss: 0.00762
Epoch: 32556, Training Loss: 0.00794
Epoch: 32556, Training Loss: 0.00795
Epoch: 32556, Training Loss: 0.00976
Epoch: 32557, Training Loss: 0.00762
Epoch: 32557, Training Loss: 0.00794
Epoch: 32557, Training Loss: 0.00795
Epoch: 32557, Training Loss: 0.00976
Epoch: 32558, Training Loss: 0.00762
Epoch: 32558, Training Loss: 0.00794
Epoch: 32558, Training Loss: 0.00795
Epoch: 32558, Training Loss: 0.00976
Epoch: 32559, Training Loss: 0.00762
Epoch: 32559, Training Loss: 0.00794
Epoch: 32559, Training Loss: 0.00795
Epoch: 32559, Training Loss: 0.00976
Epoch: 32560, Training Loss: 0.00762
Epoch: 32560, Training Loss: 0.00794
Epoch: 32560, Training Loss: 0.00795
Epoch: 32560, Training Loss: 0.00976
Epoch: 32561, Training Loss: 0.00762
Epoch: 32561, Training Loss: 0.00794
Epoch: 32561, Training Loss: 0.00795
Epoch: 32561, Training Loss: 0.00976
Epoch: 32562, Training Loss: 0.00762
Epoch: 32562, Training Loss: 0.00794
Epoch: 32562, Training Loss: 0.00795
Epoch: 32562, Training Loss: 0.00976
Epoch: 32563, Training Loss: 0.00762
Epoch: 32563, Training Loss: 0.00794
Epoch: 32563, Training Loss: 0.00795
Epoch: 32563, Training Loss: 0.00976
Epoch: 32564, Training Loss: 0.00762
Epoch: 32564, Training Loss: 0.00794
Epoch: 32564, Training Loss: 0.00795
Epoch: 32564, Training Loss: 0.00976
Epoch: 32565, Training Loss: 0.00762
Epoch: 32565, Training Loss: 0.00794
Epoch: 32565, Training Loss: 0.00795
Epoch: 32565, Training Loss: 0.00975
Epoch: 32566, Training Loss: 0.00762
Epoch: 32566, Training Loss: 0.00794
Epoch: 32566, Training Loss: 0.00795
Epoch: 32566, Training Loss: 0.00975
Epoch: 32567, Training Loss: 0.00762
Epoch: 32567, Training Loss: 0.00794
Epoch: 32567, Training Loss: 0.00794
Epoch: 32567, Training Loss: 0.00975
Epoch: 32568, Training Loss: 0.00762
Epoch: 32568, Training Loss: 0.00794
Epoch: 32568, Training Loss: 0.00794
Epoch: 32568, Training Loss: 0.00975
Epoch: 32569, Training Loss: 0.00762
Epoch: 32569, Training Loss: 0.00794
Epoch: 32569, Training Loss: 0.00794
Epoch: 32569, Training Loss: 0.00975
Epoch: 32570, Training Loss: 0.00762
Epoch: 32570, Training Loss: 0.00794
Epoch: 32570, Training Loss: 0.00794
Epoch: 32570, Training Loss: 0.00975
Epoch: 32571, Training Loss: 0.00762
Epoch: 32571, Training Loss: 0.00794
Epoch: 32571, Training Loss: 0.00794
Epoch: 32571, Training Loss: 0.00975
Epoch: 32572, Training Loss: 0.00762
Epoch: 32572, Training Loss: 0.00793
Epoch: 32572, Training Loss: 0.00794
Epoch: 32572, Training Loss: 0.00975
Epoch: 32573, Training Loss: 0.00762
Epoch: 32573, Training Loss: 0.00793
Epoch: 32573, Training Loss: 0.00794
Epoch: 32573, Training Loss: 0.00975
Epoch: 32574, Training Loss: 0.00762
Epoch: 32574, Training Loss: 0.00793
Epoch: 32574, Training Loss: 0.00794
Epoch: 32574, Training Loss: 0.00975
Epoch: 32575, Training Loss: 0.00762
Epoch: 32575, Training Loss: 0.00793
Epoch: 32575, Training Loss: 0.00794
Epoch: 32575, Training Loss: 0.00975
Epoch: 32576, Training Loss: 0.00762
Epoch: 32576, Training Loss: 0.00793
Epoch: 32576, Training Loss: 0.00794
Epoch: 32576, Training Loss: 0.00975
Epoch: 32577, Training Loss: 0.00762
Epoch: 32577, Training Loss: 0.00793
Epoch: 32577, Training Loss: 0.00794
Epoch: 32577, Training Loss: 0.00975
Epoch: 32578, Training Loss: 0.00762
Epoch: 32578, Training Loss: 0.00793
Epoch: 32578, Training Loss: 0.00794
Epoch: 32578, Training Loss: 0.00975
Epoch: 32579, Training Loss: 0.00762
Epoch: 32579, Training Loss: 0.00793
Epoch: 32579, Training Loss: 0.00794
Epoch: 32579, Training Loss: 0.00975
Epoch: 32580, Training Loss: 0.00762
Epoch: 32580, Training Loss: 0.00793
Epoch: 32580, Training Loss: 0.00794
Epoch: 32580, Training Loss: 0.00975
Epoch: 32581, Training Loss: 0.00762
Epoch: 32581, Training Loss: 0.00793
Epoch: 32581, Training Loss: 0.00794
Epoch: 32581, Training Loss: 0.00975
Epoch: 32582, Training Loss: 0.00762
Epoch: 32582, Training Loss: 0.00793
Epoch: 32582, Training Loss: 0.00794
Epoch: 32582, Training Loss: 0.00975
Epoch: 32583, Training Loss: 0.00762
Epoch: 32583, Training Loss: 0.00793
Epoch: 32583, Training Loss: 0.00794
Epoch: 32583, Training Loss: 0.00975
Epoch: 32584, Training Loss: 0.00762
Epoch: 32584, Training Loss: 0.00793
Epoch: 32584, Training Loss: 0.00794
Epoch: 32584, Training Loss: 0.00975
Epoch: 32585, Training Loss: 0.00762
Epoch: 32585, Training Loss: 0.00793
Epoch: 32585, Training Loss: 0.00794
Epoch: 32585, Training Loss: 0.00975
Epoch: 32586, Training Loss: 0.00762
Epoch: 32586, Training Loss: 0.00793
Epoch: 32586, Training Loss: 0.00794
Epoch: 32586, Training Loss: 0.00975
Epoch: 32587, Training Loss: 0.00762
Epoch: 32587, Training Loss: 0.00793
Epoch: 32587, Training Loss: 0.00794
Epoch: 32587, Training Loss: 0.00975
Epoch: 32588, Training Loss: 0.00762
Epoch: 32588, Training Loss: 0.00793
Epoch: 32588, Training Loss: 0.00794
Epoch: 32588, Training Loss: 0.00975
Epoch: 32589, Training Loss: 0.00762
Epoch: 32589, Training Loss: 0.00793
Epoch: 32589, Training Loss: 0.00794
Epoch: 32589, Training Loss: 0.00975
Epoch: 32590, Training Loss: 0.00762
Epoch: 32590, Training Loss: 0.00793
Epoch: 32590, Training Loss: 0.00794
Epoch: 32590, Training Loss: 0.00975
Epoch: 32591, Training Loss: 0.00762
Epoch: 32591, Training Loss: 0.00793
Epoch: 32591, Training Loss: 0.00794
Epoch: 32591, Training Loss: 0.00975
Epoch: 32592, Training Loss: 0.00762
Epoch: 32592, Training Loss: 0.00793
Epoch: 32592, Training Loss: 0.00794
Epoch: 32592, Training Loss: 0.00975
Epoch: 32593, Training Loss: 0.00762
Epoch: 32593, Training Loss: 0.00793
Epoch: 32593, Training Loss: 0.00794
Epoch: 32593, Training Loss: 0.00975
Epoch: 32594, Training Loss: 0.00762
Epoch: 32594, Training Loss: 0.00793
Epoch: 32594, Training Loss: 0.00794
Epoch: 32594, Training Loss: 0.00975
Epoch: 32595, Training Loss: 0.00762
Epoch: 32595, Training Loss: 0.00793
Epoch: 32595, Training Loss: 0.00794
Epoch: 32595, Training Loss: 0.00975
Epoch: 32596, Training Loss: 0.00762
Epoch: 32596, Training Loss: 0.00793
Epoch: 32596, Training Loss: 0.00794
Epoch: 32596, Training Loss: 0.00975
Epoch: 32597, Training Loss: 0.00762
Epoch: 32597, Training Loss: 0.00793
Epoch: 32597, Training Loss: 0.00794
Epoch: 32597, Training Loss: 0.00975
Epoch: 32598, Training Loss: 0.00762
Epoch: 32598, Training Loss: 0.00793
Epoch: 32598, Training Loss: 0.00794
Epoch: 32598, Training Loss: 0.00975
Epoch: 32599, Training Loss: 0.00762
Epoch: 32599, Training Loss: 0.00793
Epoch: 32599, Training Loss: 0.00794
Epoch: 32599, Training Loss: 0.00975
Epoch: 32600, Training Loss: 0.00762
Epoch: 32600, Training Loss: 0.00793
Epoch: 32600, Training Loss: 0.00794
Epoch: 32600, Training Loss: 0.00975
Epoch: 32601, Training Loss: 0.00762
Epoch: 32601, Training Loss: 0.00793
Epoch: 32601, Training Loss: 0.00794
Epoch: 32601, Training Loss: 0.00975
Epoch: 32602, Training Loss: 0.00762
Epoch: 32602, Training Loss: 0.00793
Epoch: 32602, Training Loss: 0.00794
Epoch: 32602, Training Loss: 0.00975
Epoch: 32603, Training Loss: 0.00762
Epoch: 32603, Training Loss: 0.00793
Epoch: 32603, Training Loss: 0.00794
Epoch: 32603, Training Loss: 0.00975
Epoch: 32604, Training Loss: 0.00762
Epoch: 32604, Training Loss: 0.00793
Epoch: 32604, Training Loss: 0.00794
Epoch: 32604, Training Loss: 0.00975
Epoch: 32605, Training Loss: 0.00762
Epoch: 32605, Training Loss: 0.00793
Epoch: 32605, Training Loss: 0.00794
Epoch: 32605, Training Loss: 0.00975
Epoch: 32606, Training Loss: 0.00762
Epoch: 32606, Training Loss: 0.00793
Epoch: 32606, Training Loss: 0.00794
Epoch: 32606, Training Loss: 0.00975
Epoch: 32607, Training Loss: 0.00762
Epoch: 32607, Training Loss: 0.00793
Epoch: 32607, Training Loss: 0.00794
Epoch: 32607, Training Loss: 0.00975
Epoch: 32608, Training Loss: 0.00762
Epoch: 32608, Training Loss: 0.00793
Epoch: 32608, Training Loss: 0.00794
Epoch: 32608, Training Loss: 0.00975
Epoch: 32609, Training Loss: 0.00762
Epoch: 32609, Training Loss: 0.00793
Epoch: 32609, Training Loss: 0.00794
Epoch: 32609, Training Loss: 0.00975
Epoch: 32610, Training Loss: 0.00762
Epoch: 32610, Training Loss: 0.00793
Epoch: 32610, Training Loss: 0.00794
Epoch: 32610, Training Loss: 0.00975
Epoch: 32611, Training Loss: 0.00762
Epoch: 32611, Training Loss: 0.00793
Epoch: 32611, Training Loss: 0.00794
Epoch: 32611, Training Loss: 0.00975
Epoch: 32612, Training Loss: 0.00762
Epoch: 32612, Training Loss: 0.00793
Epoch: 32612, Training Loss: 0.00794
Epoch: 32612, Training Loss: 0.00975
Epoch: 32613, Training Loss: 0.00762
Epoch: 32613, Training Loss: 0.00793
Epoch: 32613, Training Loss: 0.00794
Epoch: 32613, Training Loss: 0.00975
Epoch: 32614, Training Loss: 0.00762
Epoch: 32614, Training Loss: 0.00793
Epoch: 32614, Training Loss: 0.00794
Epoch: 32614, Training Loss: 0.00975
Epoch: 32615, Training Loss: 0.00762
Epoch: 32615, Training Loss: 0.00793
Epoch: 32615, Training Loss: 0.00794
Epoch: 32615, Training Loss: 0.00975
Epoch: 32616, Training Loss: 0.00762
Epoch: 32616, Training Loss: 0.00793
Epoch: 32616, Training Loss: 0.00794
Epoch: 32616, Training Loss: 0.00975
Epoch: 32617, Training Loss: 0.00762
Epoch: 32617, Training Loss: 0.00793
Epoch: 32617, Training Loss: 0.00794
Epoch: 32617, Training Loss: 0.00975
Epoch: 32618, Training Loss: 0.00762
Epoch: 32618, Training Loss: 0.00793
Epoch: 32618, Training Loss: 0.00794
Epoch: 32618, Training Loss: 0.00975
Epoch: 32619, Training Loss: 0.00762
Epoch: 32619, Training Loss: 0.00793
Epoch: 32619, Training Loss: 0.00794
Epoch: 32619, Training Loss: 0.00975
Epoch: 32620, Training Loss: 0.00762
Epoch: 32620, Training Loss: 0.00793
Epoch: 32620, Training Loss: 0.00794
Epoch: 32620, Training Loss: 0.00975
Epoch: 32621, Training Loss: 0.00762
Epoch: 32621, Training Loss: 0.00793
Epoch: 32621, Training Loss: 0.00794
Epoch: 32621, Training Loss: 0.00975
Epoch: 32622, Training Loss: 0.00762
Epoch: 32622, Training Loss: 0.00793
Epoch: 32622, Training Loss: 0.00794
Epoch: 32622, Training Loss: 0.00975
Epoch: 32623, Training Loss: 0.00762
Epoch: 32623, Training Loss: 0.00793
Epoch: 32623, Training Loss: 0.00794
Epoch: 32623, Training Loss: 0.00975
Epoch: 32624, Training Loss: 0.00762
Epoch: 32624, Training Loss: 0.00793
Epoch: 32624, Training Loss: 0.00794
Epoch: 32624, Training Loss: 0.00974
Epoch: 32625, Training Loss: 0.00761
Epoch: 32625, Training Loss: 0.00793
Epoch: 32625, Training Loss: 0.00794
Epoch: 32625, Training Loss: 0.00974
Epoch: 32626, Training Loss: 0.00761
Epoch: 32626, Training Loss: 0.00793
Epoch: 32626, Training Loss: 0.00794
Epoch: 32626, Training Loss: 0.00974
Epoch: 32627, Training Loss: 0.00761
Epoch: 32627, Training Loss: 0.00793
Epoch: 32627, Training Loss: 0.00794
Epoch: 32627, Training Loss: 0.00974
Epoch: 32628, Training Loss: 0.00761
Epoch: 32628, Training Loss: 0.00793
Epoch: 32628, Training Loss: 0.00794
Epoch: 32628, Training Loss: 0.00974
Epoch: 32629, Training Loss: 0.00761
Epoch: 32629, Training Loss: 0.00793
Epoch: 32629, Training Loss: 0.00794
Epoch: 32629, Training Loss: 0.00974
Epoch: 32630, Training Loss: 0.00761
Epoch: 32630, Training Loss: 0.00793
Epoch: 32630, Training Loss: 0.00794
Epoch: 32630, Training Loss: 0.00974
Epoch: 32631, Training Loss: 0.00761
Epoch: 32631, Training Loss: 0.00793
Epoch: 32631, Training Loss: 0.00794
Epoch: 32631, Training Loss: 0.00974
Epoch: 32632, Training Loss: 0.00761
Epoch: 32632, Training Loss: 0.00793
Epoch: 32632, Training Loss: 0.00794
Epoch: 32632, Training Loss: 0.00974
Epoch: 32633, Training Loss: 0.00761
Epoch: 32633, Training Loss: 0.00793
Epoch: 32633, Training Loss: 0.00794
Epoch: 32633, Training Loss: 0.00974
Epoch: 32634, Training Loss: 0.00761
Epoch: 32634, Training Loss: 0.00793
Epoch: 32634, Training Loss: 0.00794
Epoch: 32634, Training Loss: 0.00974
Epoch: 32635, Training Loss: 0.00761
Epoch: 32635, Training Loss: 0.00793
Epoch: 32635, Training Loss: 0.00794
Epoch: 32635, Training Loss: 0.00974
Epoch: 32636, Training Loss: 0.00761
Epoch: 32636, Training Loss: 0.00793
Epoch: 32636, Training Loss: 0.00794
Epoch: 32636, Training Loss: 0.00974
Epoch: 32637, Training Loss: 0.00761
Epoch: 32637, Training Loss: 0.00793
Epoch: 32637, Training Loss: 0.00794
Epoch: 32637, Training Loss: 0.00974
Epoch: 32638, Training Loss: 0.00761
Epoch: 32638, Training Loss: 0.00793
Epoch: 32638, Training Loss: 0.00794
Epoch: 32638, Training Loss: 0.00974
Epoch: 32639, Training Loss: 0.00761
Epoch: 32639, Training Loss: 0.00793
Epoch: 32639, Training Loss: 0.00794
Epoch: 32639, Training Loss: 0.00974
Epoch: 32640, Training Loss: 0.00761
Epoch: 32640, Training Loss: 0.00793
Epoch: 32640, Training Loss: 0.00793
Epoch: 32640, Training Loss: 0.00974
Epoch: 32641, Training Loss: 0.00761
Epoch: 32641, Training Loss: 0.00793
Epoch: 32641, Training Loss: 0.00793
Epoch: 32641, Training Loss: 0.00974
Epoch: 32642, Training Loss: 0.00761
Epoch: 32642, Training Loss: 0.00793
Epoch: 32642, Training Loss: 0.00793
Epoch: 32642, Training Loss: 0.00974
Epoch: 32643, Training Loss: 0.00761
Epoch: 32643, Training Loss: 0.00793
Epoch: 32643, Training Loss: 0.00793
Epoch: 32643, Training Loss: 0.00974
Epoch: 32644, Training Loss: 0.00761
Epoch: 32644, Training Loss: 0.00793
Epoch: 32644, Training Loss: 0.00793
Epoch: 32644, Training Loss: 0.00974
Epoch: 32645, Training Loss: 0.00761
Epoch: 32645, Training Loss: 0.00793
Epoch: 32645, Training Loss: 0.00793
Epoch: 32645, Training Loss: 0.00974
Epoch: 32646, Training Loss: 0.00761
Epoch: 32646, Training Loss: 0.00792
Epoch: 32646, Training Loss: 0.00793
Epoch: 32646, Training Loss: 0.00974
Epoch: 32647, Training Loss: 0.00761
Epoch: 32647, Training Loss: 0.00792
Epoch: 32647, Training Loss: 0.00793
Epoch: 32647, Training Loss: 0.00974
Epoch: 32648, Training Loss: 0.00761
Epoch: 32648, Training Loss: 0.00792
Epoch: 32648, Training Loss: 0.00793
Epoch: 32648, Training Loss: 0.00974
Epoch: 32649, Training Loss: 0.00761
Epoch: 32649, Training Loss: 0.00792
Epoch: 32649, Training Loss: 0.00793
Epoch: 32649, Training Loss: 0.00974
Epoch: 32650, Training Loss: 0.00761
Epoch: 32650, Training Loss: 0.00792
Epoch: 32650, Training Loss: 0.00793
Epoch: 32650, Training Loss: 0.00974
Epoch: 32651, Training Loss: 0.00761
Epoch: 32651, Training Loss: 0.00792
Epoch: 32651, Training Loss: 0.00793
Epoch: 32651, Training Loss: 0.00974
Epoch: 32652, Training Loss: 0.00761
Epoch: 32652, Training Loss: 0.00792
Epoch: 32652, Training Loss: 0.00793
Epoch: 32652, Training Loss: 0.00974
Epoch: 32653, Training Loss: 0.00761
Epoch: 32653, Training Loss: 0.00792
Epoch: 32653, Training Loss: 0.00793
Epoch: 32653, Training Loss: 0.00974
Epoch: 32654, Training Loss: 0.00761
Epoch: 32654, Training Loss: 0.00792
Epoch: 32654, Training Loss: 0.00793
Epoch: 32654, Training Loss: 0.00974
Epoch: 32655, Training Loss: 0.00761
Epoch: 32655, Training Loss: 0.00792
Epoch: 32655, Training Loss: 0.00793
Epoch: 32655, Training Loss: 0.00974
Epoch: 32656, Training Loss: 0.00761
Epoch: 32656, Training Loss: 0.00792
Epoch: 32656, Training Loss: 0.00793
Epoch: 32656, Training Loss: 0.00974
Epoch: 32657, Training Loss: 0.00761
Epoch: 32657, Training Loss: 0.00792
Epoch: 32657, Training Loss: 0.00793
Epoch: 32657, Training Loss: 0.00974
Epoch: 32658, Training Loss: 0.00761
Epoch: 32658, Training Loss: 0.00792
Epoch: 32658, Training Loss: 0.00793
Epoch: 32658, Training Loss: 0.00974
Epoch: 32659, Training Loss: 0.00761
Epoch: 32659, Training Loss: 0.00792
Epoch: 32659, Training Loss: 0.00793
Epoch: 32659, Training Loss: 0.00974
Epoch: 32660, Training Loss: 0.00761
Epoch: 32660, Training Loss: 0.00792
Epoch: 32660, Training Loss: 0.00793
Epoch: 32660, Training Loss: 0.00974
Epoch: 32661, Training Loss: 0.00761
Epoch: 32661, Training Loss: 0.00792
Epoch: 32661, Training Loss: 0.00793
Epoch: 32661, Training Loss: 0.00974
Epoch: 32662, Training Loss: 0.00761
Epoch: 32662, Training Loss: 0.00792
Epoch: 32662, Training Loss: 0.00793
Epoch: 32662, Training Loss: 0.00974
Epoch: 32663, Training Loss: 0.00761
Epoch: 32663, Training Loss: 0.00792
Epoch: 32663, Training Loss: 0.00793
Epoch: 32663, Training Loss: 0.00974
Epoch: 32664, Training Loss: 0.00761
Epoch: 32664, Training Loss: 0.00792
Epoch: 32664, Training Loss: 0.00793
Epoch: 32664, Training Loss: 0.00974
Epoch: 32665, Training Loss: 0.00761
Epoch: 32665, Training Loss: 0.00792
Epoch: 32665, Training Loss: 0.00793
Epoch: 32665, Training Loss: 0.00974
Epoch: 32666, Training Loss: 0.00761
Epoch: 32666, Training Loss: 0.00792
Epoch: 32666, Training Loss: 0.00793
Epoch: 32666, Training Loss: 0.00974
Epoch: 32667, Training Loss: 0.00761
Epoch: 32667, Training Loss: 0.00792
Epoch: 32667, Training Loss: 0.00793
Epoch: 32667, Training Loss: 0.00974
Epoch: 32668, Training Loss: 0.00761
Epoch: 32668, Training Loss: 0.00792
Epoch: 32668, Training Loss: 0.00793
Epoch: 32668, Training Loss: 0.00974
Epoch: 32669, Training Loss: 0.00761
Epoch: 32669, Training Loss: 0.00792
Epoch: 32669, Training Loss: 0.00793
Epoch: 32669, Training Loss: 0.00974
Epoch: 32670, Training Loss: 0.00761
Epoch: 32670, Training Loss: 0.00792
Epoch: 32670, Training Loss: 0.00793
Epoch: 32670, Training Loss: 0.00974
Epoch: 32671, Training Loss: 0.00761
Epoch: 32671, Training Loss: 0.00792
Epoch: 32671, Training Loss: 0.00793
Epoch: 32671, Training Loss: 0.00974
Epoch: 32672, Training Loss: 0.00761
Epoch: 32672, Training Loss: 0.00792
Epoch: 32672, Training Loss: 0.00793
Epoch: 32672, Training Loss: 0.00974
Epoch: 32673, Training Loss: 0.00761
Epoch: 32673, Training Loss: 0.00792
Epoch: 32673, Training Loss: 0.00793
Epoch: 32673, Training Loss: 0.00974
Epoch: 32674, Training Loss: 0.00761
Epoch: 32674, Training Loss: 0.00792
Epoch: 32674, Training Loss: 0.00793
Epoch: 32674, Training Loss: 0.00974
Epoch: 32675, Training Loss: 0.00761
Epoch: 32675, Training Loss: 0.00792
Epoch: 32675, Training Loss: 0.00793
Epoch: 32675, Training Loss: 0.00974
Epoch: 32676, Training Loss: 0.00761
Epoch: 32676, Training Loss: 0.00792
Epoch: 32676, Training Loss: 0.00793
Epoch: 32676, Training Loss: 0.00974
Epoch: 32677, Training Loss: 0.00761
Epoch: 32677, Training Loss: 0.00792
Epoch: 32677, Training Loss: 0.00793
Epoch: 32677, Training Loss: 0.00974
Epoch: 32678, Training Loss: 0.00761
Epoch: 32678, Training Loss: 0.00792
Epoch: 32678, Training Loss: 0.00793
Epoch: 32678, Training Loss: 0.00974
Epoch: 32679, Training Loss: 0.00761
Epoch: 32679, Training Loss: 0.00792
Epoch: 32679, Training Loss: 0.00793
Epoch: 32679, Training Loss: 0.00974
Epoch: 32680, Training Loss: 0.00761
Epoch: 32680, Training Loss: 0.00792
Epoch: 32680, Training Loss: 0.00793
Epoch: 32680, Training Loss: 0.00974
Epoch: 32681, Training Loss: 0.00761
Epoch: 32681, Training Loss: 0.00792
Epoch: 32681, Training Loss: 0.00793
Epoch: 32681, Training Loss: 0.00974
Epoch: 32682, Training Loss: 0.00761
Epoch: 32682, Training Loss: 0.00792
Epoch: 32682, Training Loss: 0.00793
Epoch: 32682, Training Loss: 0.00974
Epoch: 32683, Training Loss: 0.00761
Epoch: 32683, Training Loss: 0.00792
Epoch: 32683, Training Loss: 0.00793
Epoch: 32683, Training Loss: 0.00973
Epoch: 32684, Training Loss: 0.00761
Epoch: 32684, Training Loss: 0.00792
Epoch: 32684, Training Loss: 0.00793
Epoch: 32684, Training Loss: 0.00973
Epoch: 32685, Training Loss: 0.00761
Epoch: 32685, Training Loss: 0.00792
Epoch: 32685, Training Loss: 0.00793
Epoch: 32685, Training Loss: 0.00973
Epoch: 32686, Training Loss: 0.00761
Epoch: 32686, Training Loss: 0.00792
Epoch: 32686, Training Loss: 0.00793
Epoch: 32686, Training Loss: 0.00973
Epoch: 32687, Training Loss: 0.00761
Epoch: 32687, Training Loss: 0.00792
Epoch: 32687, Training Loss: 0.00793
Epoch: 32687, Training Loss: 0.00973
Epoch: 32688, Training Loss: 0.00761
Epoch: 32688, Training Loss: 0.00792
Epoch: 32688, Training Loss: 0.00793
Epoch: 32688, Training Loss: 0.00973
Epoch: 32689, Training Loss: 0.00761
Epoch: 32689, Training Loss: 0.00792
Epoch: 32689, Training Loss: 0.00793
Epoch: 32689, Training Loss: 0.00973
Epoch: 32690, Training Loss: 0.00761
Epoch: 32690, Training Loss: 0.00792
Epoch: 32690, Training Loss: 0.00793
Epoch: 32690, Training Loss: 0.00973
Epoch: 32691, Training Loss: 0.00761
Epoch: 32691, Training Loss: 0.00792
Epoch: 32691, Training Loss: 0.00793
Epoch: 32691, Training Loss: 0.00973
Epoch: 32692, Training Loss: 0.00761
Epoch: 32692, Training Loss: 0.00792
Epoch: 32692, Training Loss: 0.00793
Epoch: 32692, Training Loss: 0.00973
Epoch: 32693, Training Loss: 0.00761
Epoch: 32693, Training Loss: 0.00792
Epoch: 32693, Training Loss: 0.00793
Epoch: 32693, Training Loss: 0.00973
Epoch: 32694, Training Loss: 0.00761
Epoch: 32694, Training Loss: 0.00792
Epoch: 32694, Training Loss: 0.00793
Epoch: 32694, Training Loss: 0.00973
Epoch: 32695, Training Loss: 0.00761
Epoch: 32695, Training Loss: 0.00792
Epoch: 32695, Training Loss: 0.00793
Epoch: 32695, Training Loss: 0.00973
Epoch: 32696, Training Loss: 0.00761
Epoch: 32696, Training Loss: 0.00792
Epoch: 32696, Training Loss: 0.00793
Epoch: 32696, Training Loss: 0.00973
Epoch: 32697, Training Loss: 0.00761
Epoch: 32697, Training Loss: 0.00792
Epoch: 32697, Training Loss: 0.00793
Epoch: 32697, Training Loss: 0.00973
Epoch: 32698, Training Loss: 0.00761
Epoch: 32698, Training Loss: 0.00792
Epoch: 32698, Training Loss: 0.00793
Epoch: 32698, Training Loss: 0.00973
Epoch: 32699, Training Loss: 0.00761
Epoch: 32699, Training Loss: 0.00792
Epoch: 32699, Training Loss: 0.00793
Epoch: 32699, Training Loss: 0.00973
Epoch: 32700, Training Loss: 0.00761
Epoch: 32700, Training Loss: 0.00792
Epoch: 32700, Training Loss: 0.00793
Epoch: 32700, Training Loss: 0.00973
Epoch: 32701, Training Loss: 0.00761
Epoch: 32701, Training Loss: 0.00792
Epoch: 32701, Training Loss: 0.00793
Epoch: 32701, Training Loss: 0.00973
Epoch: 32702, Training Loss: 0.00761
Epoch: 32702, Training Loss: 0.00792
Epoch: 32702, Training Loss: 0.00793
Epoch: 32702, Training Loss: 0.00973
...
[Output truncated: epochs 32703-32945 repeat the same four per-sample losses, decreasing very slowly from (0.00761, 0.00792, 0.00793, 0.00973) to about (0.00757, 0.00788, 0.00789, 0.00969); the fourth sample's loss is still above the 0.008 target at this point.]
Epoch: 32945, Training Loss: 0.00789
Epoch: 32945, Training Loss: 0.00969
Epoch: 32946, Training Loss: 0.00757
Epoch: 32946, Training Loss: 0.00788
Epoch: 32946, Training Loss: 0.00789
Epoch: 32946, Training Loss: 0.00969
Epoch: 32947, Training Loss: 0.00757
Epoch: 32947, Training Loss: 0.00788
Epoch: 32947, Training Loss: 0.00789
Epoch: 32947, Training Loss: 0.00969
Epoch: 32948, Training Loss: 0.00757
Epoch: 32948, Training Loss: 0.00788
Epoch: 32948, Training Loss: 0.00789
Epoch: 32948, Training Loss: 0.00969
Epoch: 32949, Training Loss: 0.00757
Epoch: 32949, Training Loss: 0.00788
Epoch: 32949, Training Loss: 0.00789
Epoch: 32949, Training Loss: 0.00969
Epoch: 32950, Training Loss: 0.00757
Epoch: 32950, Training Loss: 0.00788
Epoch: 32950, Training Loss: 0.00789
Epoch: 32950, Training Loss: 0.00969
Epoch: 32951, Training Loss: 0.00757
Epoch: 32951, Training Loss: 0.00788
Epoch: 32951, Training Loss: 0.00789
Epoch: 32951, Training Loss: 0.00969
Epoch: 32952, Training Loss: 0.00757
Epoch: 32952, Training Loss: 0.00788
Epoch: 32952, Training Loss: 0.00789
Epoch: 32952, Training Loss: 0.00969
Epoch: 32953, Training Loss: 0.00757
Epoch: 32953, Training Loss: 0.00788
Epoch: 32953, Training Loss: 0.00789
Epoch: 32953, Training Loss: 0.00969
Epoch: 32954, Training Loss: 0.00757
Epoch: 32954, Training Loss: 0.00788
Epoch: 32954, Training Loss: 0.00789
Epoch: 32954, Training Loss: 0.00969
Epoch: 32955, Training Loss: 0.00757
Epoch: 32955, Training Loss: 0.00788
Epoch: 32955, Training Loss: 0.00789
Epoch: 32955, Training Loss: 0.00969
Epoch: 32956, Training Loss: 0.00757
Epoch: 32956, Training Loss: 0.00788
Epoch: 32956, Training Loss: 0.00789
Epoch: 32956, Training Loss: 0.00969
Epoch: 32957, Training Loss: 0.00757
Epoch: 32957, Training Loss: 0.00788
Epoch: 32957, Training Loss: 0.00789
Epoch: 32957, Training Loss: 0.00969
Epoch: 32958, Training Loss: 0.00757
Epoch: 32958, Training Loss: 0.00788
Epoch: 32958, Training Loss: 0.00789
Epoch: 32958, Training Loss: 0.00969
Epoch: 32959, Training Loss: 0.00757
Epoch: 32959, Training Loss: 0.00788
Epoch: 32959, Training Loss: 0.00789
Epoch: 32959, Training Loss: 0.00969
Epoch: 32960, Training Loss: 0.00757
Epoch: 32960, Training Loss: 0.00788
Epoch: 32960, Training Loss: 0.00789
Epoch: 32960, Training Loss: 0.00969
Epoch: 32961, Training Loss: 0.00757
Epoch: 32961, Training Loss: 0.00788
Epoch: 32961, Training Loss: 0.00789
Epoch: 32961, Training Loss: 0.00969
Epoch: 32962, Training Loss: 0.00757
Epoch: 32962, Training Loss: 0.00788
Epoch: 32962, Training Loss: 0.00789
Epoch: 32962, Training Loss: 0.00969
Epoch: 32963, Training Loss: 0.00757
Epoch: 32963, Training Loss: 0.00788
Epoch: 32963, Training Loss: 0.00789
Epoch: 32963, Training Loss: 0.00969
Epoch: 32964, Training Loss: 0.00757
Epoch: 32964, Training Loss: 0.00788
Epoch: 32964, Training Loss: 0.00789
Epoch: 32964, Training Loss: 0.00969
Epoch: 32965, Training Loss: 0.00757
Epoch: 32965, Training Loss: 0.00788
Epoch: 32965, Training Loss: 0.00789
Epoch: 32965, Training Loss: 0.00969
Epoch: 32966, Training Loss: 0.00757
Epoch: 32966, Training Loss: 0.00788
Epoch: 32966, Training Loss: 0.00789
Epoch: 32966, Training Loss: 0.00969
Epoch: 32967, Training Loss: 0.00757
Epoch: 32967, Training Loss: 0.00788
Epoch: 32967, Training Loss: 0.00789
Epoch: 32967, Training Loss: 0.00969
Epoch: 32968, Training Loss: 0.00757
Epoch: 32968, Training Loss: 0.00788
Epoch: 32968, Training Loss: 0.00789
Epoch: 32968, Training Loss: 0.00969
Epoch: 32969, Training Loss: 0.00757
Epoch: 32969, Training Loss: 0.00788
Epoch: 32969, Training Loss: 0.00789
Epoch: 32969, Training Loss: 0.00969
Epoch: 32970, Training Loss: 0.00757
Epoch: 32970, Training Loss: 0.00788
Epoch: 32970, Training Loss: 0.00789
Epoch: 32970, Training Loss: 0.00969
Epoch: 32971, Training Loss: 0.00757
Epoch: 32971, Training Loss: 0.00788
Epoch: 32971, Training Loss: 0.00789
Epoch: 32971, Training Loss: 0.00969
Epoch: 32972, Training Loss: 0.00757
Epoch: 32972, Training Loss: 0.00788
Epoch: 32972, Training Loss: 0.00789
Epoch: 32972, Training Loss: 0.00969
Epoch: 32973, Training Loss: 0.00757
Epoch: 32973, Training Loss: 0.00788
Epoch: 32973, Training Loss: 0.00789
Epoch: 32973, Training Loss: 0.00969
Epoch: 32974, Training Loss: 0.00757
Epoch: 32974, Training Loss: 0.00788
Epoch: 32974, Training Loss: 0.00789
Epoch: 32974, Training Loss: 0.00969
Epoch: 32975, Training Loss: 0.00757
Epoch: 32975, Training Loss: 0.00788
Epoch: 32975, Training Loss: 0.00789
Epoch: 32975, Training Loss: 0.00969
Epoch: 32976, Training Loss: 0.00757
Epoch: 32976, Training Loss: 0.00788
Epoch: 32976, Training Loss: 0.00789
Epoch: 32976, Training Loss: 0.00969
Epoch: 32977, Training Loss: 0.00757
Epoch: 32977, Training Loss: 0.00788
Epoch: 32977, Training Loss: 0.00789
Epoch: 32977, Training Loss: 0.00969
Epoch: 32978, Training Loss: 0.00757
Epoch: 32978, Training Loss: 0.00788
Epoch: 32978, Training Loss: 0.00789
Epoch: 32978, Training Loss: 0.00969
Epoch: 32979, Training Loss: 0.00757
Epoch: 32979, Training Loss: 0.00788
Epoch: 32979, Training Loss: 0.00789
Epoch: 32979, Training Loss: 0.00969
Epoch: 32980, Training Loss: 0.00757
Epoch: 32980, Training Loss: 0.00788
Epoch: 32980, Training Loss: 0.00789
Epoch: 32980, Training Loss: 0.00969
Epoch: 32981, Training Loss: 0.00757
Epoch: 32981, Training Loss: 0.00788
Epoch: 32981, Training Loss: 0.00789
Epoch: 32981, Training Loss: 0.00969
Epoch: 32982, Training Loss: 0.00757
Epoch: 32982, Training Loss: 0.00788
Epoch: 32982, Training Loss: 0.00789
Epoch: 32982, Training Loss: 0.00969
Epoch: 32983, Training Loss: 0.00757
Epoch: 32983, Training Loss: 0.00788
Epoch: 32983, Training Loss: 0.00789
Epoch: 32983, Training Loss: 0.00968
Epoch: 32984, Training Loss: 0.00757
Epoch: 32984, Training Loss: 0.00788
Epoch: 32984, Training Loss: 0.00789
Epoch: 32984, Training Loss: 0.00968
Epoch: 32985, Training Loss: 0.00757
Epoch: 32985, Training Loss: 0.00788
Epoch: 32985, Training Loss: 0.00789
Epoch: 32985, Training Loss: 0.00968
Epoch: 32986, Training Loss: 0.00757
Epoch: 32986, Training Loss: 0.00788
Epoch: 32986, Training Loss: 0.00789
Epoch: 32986, Training Loss: 0.00968
Epoch: 32987, Training Loss: 0.00757
Epoch: 32987, Training Loss: 0.00788
Epoch: 32987, Training Loss: 0.00789
Epoch: 32987, Training Loss: 0.00968
Epoch: 32988, Training Loss: 0.00757
Epoch: 32988, Training Loss: 0.00788
Epoch: 32988, Training Loss: 0.00789
Epoch: 32988, Training Loss: 0.00968
Epoch: 32989, Training Loss: 0.00757
Epoch: 32989, Training Loss: 0.00788
Epoch: 32989, Training Loss: 0.00789
Epoch: 32989, Training Loss: 0.00968
Epoch: 32990, Training Loss: 0.00757
Epoch: 32990, Training Loss: 0.00788
Epoch: 32990, Training Loss: 0.00789
Epoch: 32990, Training Loss: 0.00968
Epoch: 32991, Training Loss: 0.00757
Epoch: 32991, Training Loss: 0.00788
Epoch: 32991, Training Loss: 0.00789
Epoch: 32991, Training Loss: 0.00968
Epoch: 32992, Training Loss: 0.00757
Epoch: 32992, Training Loss: 0.00788
Epoch: 32992, Training Loss: 0.00789
Epoch: 32992, Training Loss: 0.00968
Epoch: 32993, Training Loss: 0.00757
Epoch: 32993, Training Loss: 0.00788
Epoch: 32993, Training Loss: 0.00789
Epoch: 32993, Training Loss: 0.00968
Epoch: 32994, Training Loss: 0.00757
Epoch: 32994, Training Loss: 0.00788
Epoch: 32994, Training Loss: 0.00789
Epoch: 32994, Training Loss: 0.00968
Epoch: 32995, Training Loss: 0.00757
Epoch: 32995, Training Loss: 0.00788
Epoch: 32995, Training Loss: 0.00789
Epoch: 32995, Training Loss: 0.00968
Epoch: 32996, Training Loss: 0.00757
Epoch: 32996, Training Loss: 0.00788
Epoch: 32996, Training Loss: 0.00789
Epoch: 32996, Training Loss: 0.00968
Epoch: 32997, Training Loss: 0.00757
Epoch: 32997, Training Loss: 0.00788
Epoch: 32997, Training Loss: 0.00789
Epoch: 32997, Training Loss: 0.00968
Epoch: 32998, Training Loss: 0.00757
Epoch: 32998, Training Loss: 0.00788
Epoch: 32998, Training Loss: 0.00789
Epoch: 32998, Training Loss: 0.00968
Epoch: 32999, Training Loss: 0.00757
Epoch: 32999, Training Loss: 0.00788
Epoch: 32999, Training Loss: 0.00789
Epoch: 32999, Training Loss: 0.00968
Epoch: 33000, Training Loss: 0.00757
Epoch: 33000, Training Loss: 0.00788
Epoch: 33000, Training Loss: 0.00789
Epoch: 33000, Training Loss: 0.00968
Epoch: 33001, Training Loss: 0.00757
Epoch: 33001, Training Loss: 0.00788
Epoch: 33001, Training Loss: 0.00789
Epoch: 33001, Training Loss: 0.00968
Epoch: 33002, Training Loss: 0.00757
Epoch: 33002, Training Loss: 0.00788
Epoch: 33002, Training Loss: 0.00789
Epoch: 33002, Training Loss: 0.00968
Epoch: 33003, Training Loss: 0.00757
Epoch: 33003, Training Loss: 0.00788
Epoch: 33003, Training Loss: 0.00789
Epoch: 33003, Training Loss: 0.00968
Epoch: 33004, Training Loss: 0.00757
Epoch: 33004, Training Loss: 0.00788
Epoch: 33004, Training Loss: 0.00789
Epoch: 33004, Training Loss: 0.00968
Epoch: 33005, Training Loss: 0.00757
Epoch: 33005, Training Loss: 0.00788
Epoch: 33005, Training Loss: 0.00789
Epoch: 33005, Training Loss: 0.00968
Epoch: 33006, Training Loss: 0.00757
Epoch: 33006, Training Loss: 0.00788
Epoch: 33006, Training Loss: 0.00789
Epoch: 33006, Training Loss: 0.00968
Epoch: 33007, Training Loss: 0.00757
Epoch: 33007, Training Loss: 0.00788
Epoch: 33007, Training Loss: 0.00789
Epoch: 33007, Training Loss: 0.00968
Epoch: 33008, Training Loss: 0.00757
Epoch: 33008, Training Loss: 0.00788
Epoch: 33008, Training Loss: 0.00789
Epoch: 33008, Training Loss: 0.00968
Epoch: 33009, Training Loss: 0.00757
Epoch: 33009, Training Loss: 0.00788
Epoch: 33009, Training Loss: 0.00789
Epoch: 33009, Training Loss: 0.00968
Epoch: 33010, Training Loss: 0.00757
Epoch: 33010, Training Loss: 0.00788
Epoch: 33010, Training Loss: 0.00789
Epoch: 33010, Training Loss: 0.00968
Epoch: 33011, Training Loss: 0.00757
Epoch: 33011, Training Loss: 0.00788
Epoch: 33011, Training Loss: 0.00789
Epoch: 33011, Training Loss: 0.00968
Epoch: 33012, Training Loss: 0.00757
Epoch: 33012, Training Loss: 0.00788
Epoch: 33012, Training Loss: 0.00788
Epoch: 33012, Training Loss: 0.00968
Epoch: 33013, Training Loss: 0.00757
Epoch: 33013, Training Loss: 0.00788
Epoch: 33013, Training Loss: 0.00788
Epoch: 33013, Training Loss: 0.00968
Epoch: 33014, Training Loss: 0.00757
Epoch: 33014, Training Loss: 0.00788
Epoch: 33014, Training Loss: 0.00788
Epoch: 33014, Training Loss: 0.00968
Epoch: 33015, Training Loss: 0.00757
Epoch: 33015, Training Loss: 0.00788
Epoch: 33015, Training Loss: 0.00788
Epoch: 33015, Training Loss: 0.00968
Epoch: 33016, Training Loss: 0.00757
Epoch: 33016, Training Loss: 0.00788
Epoch: 33016, Training Loss: 0.00788
Epoch: 33016, Training Loss: 0.00968
Epoch: 33017, Training Loss: 0.00757
Epoch: 33017, Training Loss: 0.00788
Epoch: 33017, Training Loss: 0.00788
Epoch: 33017, Training Loss: 0.00968
Epoch: 33018, Training Loss: 0.00757
Epoch: 33018, Training Loss: 0.00787
Epoch: 33018, Training Loss: 0.00788
Epoch: 33018, Training Loss: 0.00968
Epoch: 33019, Training Loss: 0.00757
Epoch: 33019, Training Loss: 0.00787
Epoch: 33019, Training Loss: 0.00788
Epoch: 33019, Training Loss: 0.00968
Epoch: 33020, Training Loss: 0.00757
Epoch: 33020, Training Loss: 0.00787
Epoch: 33020, Training Loss: 0.00788
Epoch: 33020, Training Loss: 0.00968
Epoch: 33021, Training Loss: 0.00756
Epoch: 33021, Training Loss: 0.00787
Epoch: 33021, Training Loss: 0.00788
Epoch: 33021, Training Loss: 0.00968
Epoch: 33022, Training Loss: 0.00756
Epoch: 33022, Training Loss: 0.00787
Epoch: 33022, Training Loss: 0.00788
Epoch: 33022, Training Loss: 0.00968
Epoch: 33023, Training Loss: 0.00756
Epoch: 33023, Training Loss: 0.00787
Epoch: 33023, Training Loss: 0.00788
Epoch: 33023, Training Loss: 0.00968
Epoch: 33024, Training Loss: 0.00756
Epoch: 33024, Training Loss: 0.00787
Epoch: 33024, Training Loss: 0.00788
Epoch: 33024, Training Loss: 0.00968
Epoch: 33025, Training Loss: 0.00756
Epoch: 33025, Training Loss: 0.00787
Epoch: 33025, Training Loss: 0.00788
Epoch: 33025, Training Loss: 0.00968
Epoch: 33026, Training Loss: 0.00756
Epoch: 33026, Training Loss: 0.00787
Epoch: 33026, Training Loss: 0.00788
Epoch: 33026, Training Loss: 0.00968
Epoch: 33027, Training Loss: 0.00756
Epoch: 33027, Training Loss: 0.00787
Epoch: 33027, Training Loss: 0.00788
Epoch: 33027, Training Loss: 0.00968
Epoch: 33028, Training Loss: 0.00756
Epoch: 33028, Training Loss: 0.00787
Epoch: 33028, Training Loss: 0.00788
Epoch: 33028, Training Loss: 0.00968
Epoch: 33029, Training Loss: 0.00756
Epoch: 33029, Training Loss: 0.00787
Epoch: 33029, Training Loss: 0.00788
Epoch: 33029, Training Loss: 0.00968
Epoch: 33030, Training Loss: 0.00756
Epoch: 33030, Training Loss: 0.00787
Epoch: 33030, Training Loss: 0.00788
Epoch: 33030, Training Loss: 0.00968
Epoch: 33031, Training Loss: 0.00756
Epoch: 33031, Training Loss: 0.00787
Epoch: 33031, Training Loss: 0.00788
Epoch: 33031, Training Loss: 0.00968
Epoch: 33032, Training Loss: 0.00756
Epoch: 33032, Training Loss: 0.00787
Epoch: 33032, Training Loss: 0.00788
Epoch: 33032, Training Loss: 0.00968
Epoch: 33033, Training Loss: 0.00756
Epoch: 33033, Training Loss: 0.00787
Epoch: 33033, Training Loss: 0.00788
Epoch: 33033, Training Loss: 0.00968
Epoch: 33034, Training Loss: 0.00756
Epoch: 33034, Training Loss: 0.00787
Epoch: 33034, Training Loss: 0.00788
Epoch: 33034, Training Loss: 0.00968
Epoch: 33035, Training Loss: 0.00756
Epoch: 33035, Training Loss: 0.00787
Epoch: 33035, Training Loss: 0.00788
Epoch: 33035, Training Loss: 0.00968
Epoch: 33036, Training Loss: 0.00756
Epoch: 33036, Training Loss: 0.00787
Epoch: 33036, Training Loss: 0.00788
Epoch: 33036, Training Loss: 0.00968
Epoch: 33037, Training Loss: 0.00756
Epoch: 33037, Training Loss: 0.00787
Epoch: 33037, Training Loss: 0.00788
Epoch: 33037, Training Loss: 0.00968
Epoch: 33038, Training Loss: 0.00756
Epoch: 33038, Training Loss: 0.00787
Epoch: 33038, Training Loss: 0.00788
Epoch: 33038, Training Loss: 0.00968
Epoch: 33039, Training Loss: 0.00756
Epoch: 33039, Training Loss: 0.00787
Epoch: 33039, Training Loss: 0.00788
Epoch: 33039, Training Loss: 0.00968
Epoch: 33040, Training Loss: 0.00756
Epoch: 33040, Training Loss: 0.00787
Epoch: 33040, Training Loss: 0.00788
Epoch: 33040, Training Loss: 0.00968
Epoch: 33041, Training Loss: 0.00756
Epoch: 33041, Training Loss: 0.00787
Epoch: 33041, Training Loss: 0.00788
Epoch: 33041, Training Loss: 0.00968
Epoch: 33042, Training Loss: 0.00756
Epoch: 33042, Training Loss: 0.00787
Epoch: 33042, Training Loss: 0.00788
Epoch: 33042, Training Loss: 0.00968
Epoch: 33043, Training Loss: 0.00756
Epoch: 33043, Training Loss: 0.00787
Epoch: 33043, Training Loss: 0.00788
Epoch: 33043, Training Loss: 0.00967
Epoch: 33044, Training Loss: 0.00756
Epoch: 33044, Training Loss: 0.00787
Epoch: 33044, Training Loss: 0.00788
Epoch: 33044, Training Loss: 0.00967
Epoch: 33045, Training Loss: 0.00756
Epoch: 33045, Training Loss: 0.00787
Epoch: 33045, Training Loss: 0.00788
Epoch: 33045, Training Loss: 0.00967
Epoch: 33046, Training Loss: 0.00756
Epoch: 33046, Training Loss: 0.00787
Epoch: 33046, Training Loss: 0.00788
Epoch: 33046, Training Loss: 0.00967
Epoch: 33047, Training Loss: 0.00756
Epoch: 33047, Training Loss: 0.00787
Epoch: 33047, Training Loss: 0.00788
Epoch: 33047, Training Loss: 0.00967
Epoch: 33048, Training Loss: 0.00756
Epoch: 33048, Training Loss: 0.00787
Epoch: 33048, Training Loss: 0.00788
Epoch: 33048, Training Loss: 0.00967
Epoch: 33049, Training Loss: 0.00756
Epoch: 33049, Training Loss: 0.00787
Epoch: 33049, Training Loss: 0.00788
Epoch: 33049, Training Loss: 0.00967
Epoch: 33050, Training Loss: 0.00756
Epoch: 33050, Training Loss: 0.00787
Epoch: 33050, Training Loss: 0.00788
Epoch: 33050, Training Loss: 0.00967
Epoch: 33051, Training Loss: 0.00756
Epoch: 33051, Training Loss: 0.00787
Epoch: 33051, Training Loss: 0.00788
Epoch: 33051, Training Loss: 0.00967
Epoch: 33052, Training Loss: 0.00756
Epoch: 33052, Training Loss: 0.00787
Epoch: 33052, Training Loss: 0.00788
Epoch: 33052, Training Loss: 0.00967
Epoch: 33053, Training Loss: 0.00756
Epoch: 33053, Training Loss: 0.00787
Epoch: 33053, Training Loss: 0.00788
Epoch: 33053, Training Loss: 0.00967
Epoch: 33054, Training Loss: 0.00756
Epoch: 33054, Training Loss: 0.00787
Epoch: 33054, Training Loss: 0.00788
Epoch: 33054, Training Loss: 0.00967
Epoch: 33055, Training Loss: 0.00756
Epoch: 33055, Training Loss: 0.00787
Epoch: 33055, Training Loss: 0.00788
Epoch: 33055, Training Loss: 0.00967
Epoch: 33056, Training Loss: 0.00756
Epoch: 33056, Training Loss: 0.00787
Epoch: 33056, Training Loss: 0.00788
Epoch: 33056, Training Loss: 0.00967
Epoch: 33057, Training Loss: 0.00756
Epoch: 33057, Training Loss: 0.00787
Epoch: 33057, Training Loss: 0.00788
Epoch: 33057, Training Loss: 0.00967
Epoch: 33058, Training Loss: 0.00756
Epoch: 33058, Training Loss: 0.00787
Epoch: 33058, Training Loss: 0.00788
Epoch: 33058, Training Loss: 0.00967
Epoch: 33059, Training Loss: 0.00756
Epoch: 33059, Training Loss: 0.00787
Epoch: 33059, Training Loss: 0.00788
Epoch: 33059, Training Loss: 0.00967
Epoch: 33060, Training Loss: 0.00756
Epoch: 33060, Training Loss: 0.00787
Epoch: 33060, Training Loss: 0.00788
Epoch: 33060, Training Loss: 0.00967
Epoch: 33061, Training Loss: 0.00756
Epoch: 33061, Training Loss: 0.00787
Epoch: 33061, Training Loss: 0.00788
Epoch: 33061, Training Loss: 0.00967
Epoch: 33062, Training Loss: 0.00756
Epoch: 33062, Training Loss: 0.00787
Epoch: 33062, Training Loss: 0.00788
Epoch: 33062, Training Loss: 0.00967
Epoch: 33063, Training Loss: 0.00756
Epoch: 33063, Training Loss: 0.00787
Epoch: 33063, Training Loss: 0.00788
Epoch: 33063, Training Loss: 0.00967
Epoch: 33064, Training Loss: 0.00756
Epoch: 33064, Training Loss: 0.00787
Epoch: 33064, Training Loss: 0.00788
Epoch: 33064, Training Loss: 0.00967
Epoch: 33065, Training Loss: 0.00756
Epoch: 33065, Training Loss: 0.00787
Epoch: 33065, Training Loss: 0.00788
Epoch: 33065, Training Loss: 0.00967
Epoch: 33066, Training Loss: 0.00756
Epoch: 33066, Training Loss: 0.00787
Epoch: 33066, Training Loss: 0.00788
Epoch: 33066, Training Loss: 0.00967
Epoch: 33067, Training Loss: 0.00756
Epoch: 33067, Training Loss: 0.00787
Epoch: 33067, Training Loss: 0.00788
Epoch: 33067, Training Loss: 0.00967
Epoch: 33068, Training Loss: 0.00756
Epoch: 33068, Training Loss: 0.00787
Epoch: 33068, Training Loss: 0.00788
Epoch: 33068, Training Loss: 0.00967
Epoch: 33069, Training Loss: 0.00756
Epoch: 33069, Training Loss: 0.00787
Epoch: 33069, Training Loss: 0.00788
Epoch: 33069, Training Loss: 0.00967
Epoch: 33070, Training Loss: 0.00756
Epoch: 33070, Training Loss: 0.00787
Epoch: 33070, Training Loss: 0.00788
Epoch: 33070, Training Loss: 0.00967
Epoch: 33071, Training Loss: 0.00756
Epoch: 33071, Training Loss: 0.00787
Epoch: 33071, Training Loss: 0.00788
Epoch: 33071, Training Loss: 0.00967
Epoch: 33072, Training Loss: 0.00756
Epoch: 33072, Training Loss: 0.00787
Epoch: 33072, Training Loss: 0.00788
Epoch: 33072, Training Loss: 0.00967
Epoch: 33073, Training Loss: 0.00756
Epoch: 33073, Training Loss: 0.00787
Epoch: 33073, Training Loss: 0.00788
Epoch: 33073, Training Loss: 0.00967
Epoch: 33074, Training Loss: 0.00756
Epoch: 33074, Training Loss: 0.00787
Epoch: 33074, Training Loss: 0.00788
Epoch: 33074, Training Loss: 0.00967
Epoch: 33075, Training Loss: 0.00756
Epoch: 33075, Training Loss: 0.00787
Epoch: 33075, Training Loss: 0.00788
Epoch: 33075, Training Loss: 0.00967
Epoch: 33076, Training Loss: 0.00756
Epoch: 33076, Training Loss: 0.00787
Epoch: 33076, Training Loss: 0.00788
Epoch: 33076, Training Loss: 0.00967
Epoch: 33077, Training Loss: 0.00756
Epoch: 33077, Training Loss: 0.00787
Epoch: 33077, Training Loss: 0.00788
Epoch: 33077, Training Loss: 0.00967
Epoch: 33078, Training Loss: 0.00756
Epoch: 33078, Training Loss: 0.00787
Epoch: 33078, Training Loss: 0.00788
Epoch: 33078, Training Loss: 0.00967
Epoch: 33079, Training Loss: 0.00756
Epoch: 33079, Training Loss: 0.00787
Epoch: 33079, Training Loss: 0.00788
Epoch: 33079, Training Loss: 0.00967
Epoch: 33080, Training Loss: 0.00756
Epoch: 33080, Training Loss: 0.00787
Epoch: 33080, Training Loss: 0.00788
Epoch: 33080, Training Loss: 0.00967
Epoch: 33081, Training Loss: 0.00756
Epoch: 33081, Training Loss: 0.00787
Epoch: 33081, Training Loss: 0.00788
Epoch: 33081, Training Loss: 0.00967
Epoch: 33082, Training Loss: 0.00756
Epoch: 33082, Training Loss: 0.00787
Epoch: 33082, Training Loss: 0.00788
Epoch: 33082, Training Loss: 0.00967
Epoch: 33083, Training Loss: 0.00756
Epoch: 33083, Training Loss: 0.00787
Epoch: 33083, Training Loss: 0.00788
Epoch: 33083, Training Loss: 0.00967
Epoch: 33084, Training Loss: 0.00756
Epoch: 33084, Training Loss: 0.00787
Epoch: 33084, Training Loss: 0.00788
Epoch: 33084, Training Loss: 0.00967
Epoch: 33085, Training Loss: 0.00756
Epoch: 33085, Training Loss: 0.00787
Epoch: 33085, Training Loss: 0.00788
Epoch: 33085, Training Loss: 0.00967
Epoch: 33086, Training Loss: 0.00756
Epoch: 33086, Training Loss: 0.00787
Epoch: 33086, Training Loss: 0.00788
Epoch: 33086, Training Loss: 0.00967
Epoch: 33087, Training Loss: 0.00756
Epoch: 33087, Training Loss: 0.00787
Epoch: 33087, Training Loss: 0.00787
Epoch: 33087, Training Loss: 0.00967
Epoch: 33088, Training Loss: 0.00756
Epoch: 33088, Training Loss: 0.00787
Epoch: 33088, Training Loss: 0.00787
Epoch: 33088, Training Loss: 0.00967
Epoch: 33089, Training Loss: 0.00756
Epoch: 33089, Training Loss: 0.00787
Epoch: 33089, Training Loss: 0.00787
Epoch: 33089, Training Loss: 0.00967
Epoch: 33090, Training Loss: 0.00756
Epoch: 33090, Training Loss: 0.00787
Epoch: 33090, Training Loss: 0.00787
Epoch: 33090, Training Loss: 0.00967
Epoch: 33091, Training Loss: 0.00756
Epoch: 33091, Training Loss: 0.00787
Epoch: 33091, Training Loss: 0.00787
Epoch: 33091, Training Loss: 0.00967
Epoch: 33092, Training Loss: 0.00756
Epoch: 33092, Training Loss: 0.00787
Epoch: 33092, Training Loss: 0.00787
Epoch: 33092, Training Loss: 0.00967
Epoch: 33093, Training Loss: 0.00756
Epoch: 33093, Training Loss: 0.00786
Epoch: 33093, Training Loss: 0.00787
Epoch: 33093, Training Loss: 0.00967
Epoch: 33094, Training Loss: 0.00756
Epoch: 33094, Training Loss: 0.00786
Epoch: 33094, Training Loss: 0.00787
Epoch: 33094, Training Loss: 0.00967
Epoch: 33095, Training Loss: 0.00756
Epoch: 33095, Training Loss: 0.00786
Epoch: 33095, Training Loss: 0.00787
Epoch: 33095, Training Loss: 0.00967
Epoch: 33096, Training Loss: 0.00756
Epoch: 33096, Training Loss: 0.00786
Epoch: 33096, Training Loss: 0.00787
Epoch: 33096, Training Loss: 0.00967
Epoch: 33097, Training Loss: 0.00756
Epoch: 33097, Training Loss: 0.00786
Epoch: 33097, Training Loss: 0.00787
Epoch: 33097, Training Loss: 0.00967
Epoch: 33098, Training Loss: 0.00756
Epoch: 33098, Training Loss: 0.00786
Epoch: 33098, Training Loss: 0.00787
Epoch: 33098, Training Loss: 0.00967
Epoch: 33099, Training Loss: 0.00756
Epoch: 33099, Training Loss: 0.00786
Epoch: 33099, Training Loss: 0.00787
Epoch: 33099, Training Loss: 0.00967
Epoch: 33100, Training Loss: 0.00756
Epoch: 33100, Training Loss: 0.00786
Epoch: 33100, Training Loss: 0.00787
Epoch: 33100, Training Loss: 0.00967
Epoch: 33101, Training Loss: 0.00755
Epoch: 33101, Training Loss: 0.00786
Epoch: 33101, Training Loss: 0.00787
Epoch: 33101, Training Loss: 0.00967
Epoch: 33102, Training Loss: 0.00755
Epoch: 33102, Training Loss: 0.00786
Epoch: 33102, Training Loss: 0.00787
Epoch: 33102, Training Loss: 0.00967
Epoch: 33103, Training Loss: 0.00755
Epoch: 33103, Training Loss: 0.00786
Epoch: 33103, Training Loss: 0.00787
Epoch: 33103, Training Loss: 0.00967
Epoch: 33104, Training Loss: 0.00755
Epoch: 33104, Training Loss: 0.00786
Epoch: 33104, Training Loss: 0.00787
Epoch: 33104, Training Loss: 0.00966
Epoch: 33105, Training Loss: 0.00755
Epoch: 33105, Training Loss: 0.00786
Epoch: 33105, Training Loss: 0.00787
Epoch: 33105, Training Loss: 0.00966
Epoch: 33106, Training Loss: 0.00755
Epoch: 33106, Training Loss: 0.00786
Epoch: 33106, Training Loss: 0.00787
Epoch: 33106, Training Loss: 0.00966
Epoch: 33107, Training Loss: 0.00755
Epoch: 33107, Training Loss: 0.00786
Epoch: 33107, Training Loss: 0.00787
Epoch: 33107, Training Loss: 0.00966
Epoch: 33108, Training Loss: 0.00755
Epoch: 33108, Training Loss: 0.00786
Epoch: 33108, Training Loss: 0.00787
Epoch: 33108, Training Loss: 0.00966
Epoch: 33109, Training Loss: 0.00755
Epoch: 33109, Training Loss: 0.00786
Epoch: 33109, Training Loss: 0.00787
Epoch: 33109, Training Loss: 0.00966
Epoch: 33110, Training Loss: 0.00755
Epoch: 33110, Training Loss: 0.00786
Epoch: 33110, Training Loss: 0.00787
Epoch: 33110, Training Loss: 0.00966
Epoch: 33111, Training Loss: 0.00755
Epoch: 33111, Training Loss: 0.00786
Epoch: 33111, Training Loss: 0.00787
Epoch: 33111, Training Loss: 0.00966
Epoch: 33112, Training Loss: 0.00755
Epoch: 33112, Training Loss: 0.00786
Epoch: 33112, Training Loss: 0.00787
Epoch: 33112, Training Loss: 0.00966
Epoch: 33113, Training Loss: 0.00755
Epoch: 33113, Training Loss: 0.00786
Epoch: 33113, Training Loss: 0.00787
Epoch: 33113, Training Loss: 0.00966
Epoch: 33114, Training Loss: 0.00755
Epoch: 33114, Training Loss: 0.00786
Epoch: 33114, Training Loss: 0.00787
Epoch: 33114, Training Loss: 0.00966
Epoch: 33115, Training Loss: 0.00755
Epoch: 33115, Training Loss: 0.00786
Epoch: 33115, Training Loss: 0.00787
Epoch: 33115, Training Loss: 0.00966
Epoch: 33116, Training Loss: 0.00755
Epoch: 33116, Training Loss: 0.00786
Epoch: 33116, Training Loss: 0.00787
Epoch: 33116, Training Loss: 0.00966
Epoch: 33117, Training Loss: 0.00755
Epoch: 33117, Training Loss: 0.00786
Epoch: 33117, Training Loss: 0.00787
Epoch: 33117, Training Loss: 0.00966
Epoch: 33118, Training Loss: 0.00755
Epoch: 33118, Training Loss: 0.00786
Epoch: 33118, Training Loss: 0.00787
Epoch: 33118, Training Loss: 0.00966
Epoch: 33119, Training Loss: 0.00755
Epoch: 33119, Training Loss: 0.00786
Epoch: 33119, Training Loss: 0.00787
Epoch: 33119, Training Loss: 0.00966
Epoch: 33120, Training Loss: 0.00755
Epoch: 33120, Training Loss: 0.00786
Epoch: 33120, Training Loss: 0.00787
Epoch: 33120, Training Loss: 0.00966
Epoch: 33121, Training Loss: 0.00755
Epoch: 33121, Training Loss: 0.00786
Epoch: 33121, Training Loss: 0.00787
Epoch: 33121, Training Loss: 0.00966
Epoch: 33122, Training Loss: 0.00755
Epoch: 33122, Training Loss: 0.00786
Epoch: 33122, Training Loss: 0.00787
Epoch: 33122, Training Loss: 0.00966
Epoch: 33123, Training Loss: 0.00755
Epoch: 33123, Training Loss: 0.00786
Epoch: 33123, Training Loss: 0.00787
Epoch: 33123, Training Loss: 0.00966
Epoch: 33124, Training Loss: 0.00755
Epoch: 33124, Training Loss: 0.00786
Epoch: 33124, Training Loss: 0.00787
Epoch: 33124, Training Loss: 0.00966
Epoch: 33125, Training Loss: 0.00755
Epoch: 33125, Training Loss: 0.00786
Epoch: 33125, Training Loss: 0.00787
Epoch: 33125, Training Loss: 0.00966
Epoch: 33126, Training Loss: 0.00755
Epoch: 33126, Training Loss: 0.00786
Epoch: 33126, Training Loss: 0.00787
Epoch: 33126, Training Loss: 0.00966
Epoch: 33127, Training Loss: 0.00755
Epoch: 33127, Training Loss: 0.00786
Epoch: 33127, Training Loss: 0.00787
Epoch: 33127, Training Loss: 0.00966
Epoch: 33128, Training Loss: 0.00755
Epoch: 33128, Training Loss: 0.00786
Epoch: 33128, Training Loss: 0.00787
Epoch: 33128, Training Loss: 0.00966
Epoch: 33129, Training Loss: 0.00755
Epoch: 33129, Training Loss: 0.00786
Epoch: 33129, Training Loss: 0.00787
Epoch: 33129, Training Loss: 0.00966
Epoch: 33130, Training Loss: 0.00755
Epoch: 33130, Training Loss: 0.00786
Epoch: 33130, Training Loss: 0.00787
Epoch: 33130, Training Loss: 0.00966
Epoch: 33131, Training Loss: 0.00755
Epoch: 33131, Training Loss: 0.00786
Epoch: 33131, Training Loss: 0.00787
Epoch: 33131, Training Loss: 0.00966
Epoch: 33132, Training Loss: 0.00755
Epoch: 33132, Training Loss: 0.00786
Epoch: 33132, Training Loss: 0.00787
Epoch: 33132, Training Loss: 0.00966
Epoch: 33133, Training Loss: 0.00755
Epoch: 33133, Training Loss: 0.00786
Epoch: 33133, Training Loss: 0.00787
Epoch: 33133, Training Loss: 0.00966
Epoch: 33134, Training Loss: 0.00755
Epoch: 33134, Training Loss: 0.00786
Epoch: 33134, Training Loss: 0.00787
Epoch: 33134, Training Loss: 0.00966
...
[log truncated: four per-pattern losses are printed each epoch; over epochs 33134-33377 they shrink only in the fifth decimal place, and the fourth pattern is still above the 0.008 target at epoch 33377]
...
Epoch: 33377, Training Loss: 0.00752
Epoch: 33377, Training Loss: 0.00783
Epoch: 33377, Training Loss: 0.00784
Epoch: 33377, Training Loss: 0.00962
Epoch: 33378, Training Loss: 0.00752
Epoch: 33378, Training Loss: 0.00783
Epoch: 33378, Training Loss: 0.00784
Epoch: 33378, Training Loss: 0.00962
Epoch: 33379, Training Loss: 0.00752
Epoch: 33379, Training Loss: 0.00783
Epoch: 33379, Training Loss: 0.00784
Epoch: 33379, Training Loss: 0.00962
Epoch: 33380, Training Loss: 0.00752
Epoch: 33380, Training Loss: 0.00783
Epoch: 33380, Training Loss: 0.00784
Epoch: 33380, Training Loss: 0.00962
Epoch: 33381, Training Loss: 0.00752
Epoch: 33381, Training Loss: 0.00783
Epoch: 33381, Training Loss: 0.00784
Epoch: 33381, Training Loss: 0.00962
Epoch: 33382, Training Loss: 0.00752
Epoch: 33382, Training Loss: 0.00783
Epoch: 33382, Training Loss: 0.00784
Epoch: 33382, Training Loss: 0.00962
Epoch: 33383, Training Loss: 0.00752
Epoch: 33383, Training Loss: 0.00783
Epoch: 33383, Training Loss: 0.00784
Epoch: 33383, Training Loss: 0.00962
Epoch: 33384, Training Loss: 0.00752
Epoch: 33384, Training Loss: 0.00783
Epoch: 33384, Training Loss: 0.00784
Epoch: 33384, Training Loss: 0.00962
Epoch: 33385, Training Loss: 0.00752
Epoch: 33385, Training Loss: 0.00783
Epoch: 33385, Training Loss: 0.00784
Epoch: 33385, Training Loss: 0.00962
Epoch: 33386, Training Loss: 0.00752
Epoch: 33386, Training Loss: 0.00783
Epoch: 33386, Training Loss: 0.00784
Epoch: 33386, Training Loss: 0.00962
Epoch: 33387, Training Loss: 0.00752
Epoch: 33387, Training Loss: 0.00783
Epoch: 33387, Training Loss: 0.00784
Epoch: 33387, Training Loss: 0.00962
Epoch: 33388, Training Loss: 0.00752
Epoch: 33388, Training Loss: 0.00783
Epoch: 33388, Training Loss: 0.00784
Epoch: 33388, Training Loss: 0.00962
Epoch: 33389, Training Loss: 0.00752
Epoch: 33389, Training Loss: 0.00783
Epoch: 33389, Training Loss: 0.00784
Epoch: 33389, Training Loss: 0.00962
Epoch: 33390, Training Loss: 0.00752
Epoch: 33390, Training Loss: 0.00783
Epoch: 33390, Training Loss: 0.00784
Epoch: 33390, Training Loss: 0.00962
Epoch: 33391, Training Loss: 0.00752
Epoch: 33391, Training Loss: 0.00783
Epoch: 33391, Training Loss: 0.00783
Epoch: 33391, Training Loss: 0.00962
Epoch: 33392, Training Loss: 0.00752
Epoch: 33392, Training Loss: 0.00783
Epoch: 33392, Training Loss: 0.00783
Epoch: 33392, Training Loss: 0.00962
Epoch: 33393, Training Loss: 0.00752
Epoch: 33393, Training Loss: 0.00783
Epoch: 33393, Training Loss: 0.00783
Epoch: 33393, Training Loss: 0.00962
Epoch: 33394, Training Loss: 0.00752
Epoch: 33394, Training Loss: 0.00783
Epoch: 33394, Training Loss: 0.00783
Epoch: 33394, Training Loss: 0.00962
Epoch: 33395, Training Loss: 0.00752
Epoch: 33395, Training Loss: 0.00783
Epoch: 33395, Training Loss: 0.00783
Epoch: 33395, Training Loss: 0.00962
Epoch: 33396, Training Loss: 0.00752
Epoch: 33396, Training Loss: 0.00783
Epoch: 33396, Training Loss: 0.00783
Epoch: 33396, Training Loss: 0.00962
Epoch: 33397, Training Loss: 0.00752
Epoch: 33397, Training Loss: 0.00782
Epoch: 33397, Training Loss: 0.00783
Epoch: 33397, Training Loss: 0.00962
Epoch: 33398, Training Loss: 0.00752
Epoch: 33398, Training Loss: 0.00782
Epoch: 33398, Training Loss: 0.00783
Epoch: 33398, Training Loss: 0.00962
Epoch: 33399, Training Loss: 0.00752
Epoch: 33399, Training Loss: 0.00782
Epoch: 33399, Training Loss: 0.00783
Epoch: 33399, Training Loss: 0.00962
Epoch: 33400, Training Loss: 0.00752
Epoch: 33400, Training Loss: 0.00782
Epoch: 33400, Training Loss: 0.00783
Epoch: 33400, Training Loss: 0.00962
Epoch: 33401, Training Loss: 0.00752
Epoch: 33401, Training Loss: 0.00782
Epoch: 33401, Training Loss: 0.00783
Epoch: 33401, Training Loss: 0.00962
Epoch: 33402, Training Loss: 0.00752
Epoch: 33402, Training Loss: 0.00782
Epoch: 33402, Training Loss: 0.00783
Epoch: 33402, Training Loss: 0.00962
Epoch: 33403, Training Loss: 0.00752
Epoch: 33403, Training Loss: 0.00782
Epoch: 33403, Training Loss: 0.00783
Epoch: 33403, Training Loss: 0.00962
Epoch: 33404, Training Loss: 0.00752
Epoch: 33404, Training Loss: 0.00782
Epoch: 33404, Training Loss: 0.00783
Epoch: 33404, Training Loss: 0.00962
Epoch: 33405, Training Loss: 0.00752
Epoch: 33405, Training Loss: 0.00782
Epoch: 33405, Training Loss: 0.00783
Epoch: 33405, Training Loss: 0.00962
Epoch: 33406, Training Loss: 0.00752
Epoch: 33406, Training Loss: 0.00782
Epoch: 33406, Training Loss: 0.00783
Epoch: 33406, Training Loss: 0.00962
Epoch: 33407, Training Loss: 0.00752
Epoch: 33407, Training Loss: 0.00782
Epoch: 33407, Training Loss: 0.00783
Epoch: 33407, Training Loss: 0.00962
Epoch: 33408, Training Loss: 0.00752
Epoch: 33408, Training Loss: 0.00782
Epoch: 33408, Training Loss: 0.00783
Epoch: 33408, Training Loss: 0.00962
Epoch: 33409, Training Loss: 0.00752
Epoch: 33409, Training Loss: 0.00782
Epoch: 33409, Training Loss: 0.00783
Epoch: 33409, Training Loss: 0.00962
Epoch: 33410, Training Loss: 0.00752
Epoch: 33410, Training Loss: 0.00782
Epoch: 33410, Training Loss: 0.00783
Epoch: 33410, Training Loss: 0.00961
Epoch: 33411, Training Loss: 0.00752
Epoch: 33411, Training Loss: 0.00782
Epoch: 33411, Training Loss: 0.00783
Epoch: 33411, Training Loss: 0.00961
Epoch: 33412, Training Loss: 0.00752
Epoch: 33412, Training Loss: 0.00782
Epoch: 33412, Training Loss: 0.00783
Epoch: 33412, Training Loss: 0.00961
Epoch: 33413, Training Loss: 0.00752
Epoch: 33413, Training Loss: 0.00782
Epoch: 33413, Training Loss: 0.00783
Epoch: 33413, Training Loss: 0.00961
Epoch: 33414, Training Loss: 0.00752
Epoch: 33414, Training Loss: 0.00782
Epoch: 33414, Training Loss: 0.00783
Epoch: 33414, Training Loss: 0.00961
Epoch: 33415, Training Loss: 0.00752
Epoch: 33415, Training Loss: 0.00782
Epoch: 33415, Training Loss: 0.00783
Epoch: 33415, Training Loss: 0.00961
Epoch: 33416, Training Loss: 0.00752
Epoch: 33416, Training Loss: 0.00782
Epoch: 33416, Training Loss: 0.00783
Epoch: 33416, Training Loss: 0.00961
Epoch: 33417, Training Loss: 0.00752
Epoch: 33417, Training Loss: 0.00782
Epoch: 33417, Training Loss: 0.00783
Epoch: 33417, Training Loss: 0.00961
Epoch: 33418, Training Loss: 0.00752
Epoch: 33418, Training Loss: 0.00782
Epoch: 33418, Training Loss: 0.00783
Epoch: 33418, Training Loss: 0.00961
Epoch: 33419, Training Loss: 0.00752
Epoch: 33419, Training Loss: 0.00782
Epoch: 33419, Training Loss: 0.00783
Epoch: 33419, Training Loss: 0.00961
Epoch: 33420, Training Loss: 0.00752
Epoch: 33420, Training Loss: 0.00782
Epoch: 33420, Training Loss: 0.00783
Epoch: 33420, Training Loss: 0.00961
Epoch: 33421, Training Loss: 0.00752
Epoch: 33421, Training Loss: 0.00782
Epoch: 33421, Training Loss: 0.00783
Epoch: 33421, Training Loss: 0.00961
Epoch: 33422, Training Loss: 0.00752
Epoch: 33422, Training Loss: 0.00782
Epoch: 33422, Training Loss: 0.00783
Epoch: 33422, Training Loss: 0.00961
Epoch: 33423, Training Loss: 0.00752
Epoch: 33423, Training Loss: 0.00782
Epoch: 33423, Training Loss: 0.00783
Epoch: 33423, Training Loss: 0.00961
Epoch: 33424, Training Loss: 0.00752
Epoch: 33424, Training Loss: 0.00782
Epoch: 33424, Training Loss: 0.00783
Epoch: 33424, Training Loss: 0.00961
Epoch: 33425, Training Loss: 0.00751
Epoch: 33425, Training Loss: 0.00782
Epoch: 33425, Training Loss: 0.00783
Epoch: 33425, Training Loss: 0.00961
Epoch: 33426, Training Loss: 0.00751
Epoch: 33426, Training Loss: 0.00782
Epoch: 33426, Training Loss: 0.00783
Epoch: 33426, Training Loss: 0.00961
Epoch: 33427, Training Loss: 0.00751
Epoch: 33427, Training Loss: 0.00782
Epoch: 33427, Training Loss: 0.00783
Epoch: 33427, Training Loss: 0.00961
Epoch: 33428, Training Loss: 0.00751
Epoch: 33428, Training Loss: 0.00782
Epoch: 33428, Training Loss: 0.00783
Epoch: 33428, Training Loss: 0.00961
Epoch: 33429, Training Loss: 0.00751
Epoch: 33429, Training Loss: 0.00782
Epoch: 33429, Training Loss: 0.00783
Epoch: 33429, Training Loss: 0.00961
Epoch: 33430, Training Loss: 0.00751
Epoch: 33430, Training Loss: 0.00782
Epoch: 33430, Training Loss: 0.00783
Epoch: 33430, Training Loss: 0.00961
Epoch: 33431, Training Loss: 0.00751
Epoch: 33431, Training Loss: 0.00782
Epoch: 33431, Training Loss: 0.00783
Epoch: 33431, Training Loss: 0.00961
Epoch: 33432, Training Loss: 0.00751
Epoch: 33432, Training Loss: 0.00782
Epoch: 33432, Training Loss: 0.00783
Epoch: 33432, Training Loss: 0.00961
Epoch: 33433, Training Loss: 0.00751
Epoch: 33433, Training Loss: 0.00782
Epoch: 33433, Training Loss: 0.00783
Epoch: 33433, Training Loss: 0.00961
Epoch: 33434, Training Loss: 0.00751
Epoch: 33434, Training Loss: 0.00782
Epoch: 33434, Training Loss: 0.00783
Epoch: 33434, Training Loss: 0.00961
Epoch: 33435, Training Loss: 0.00751
Epoch: 33435, Training Loss: 0.00782
Epoch: 33435, Training Loss: 0.00783
Epoch: 33435, Training Loss: 0.00961
Epoch: 33436, Training Loss: 0.00751
Epoch: 33436, Training Loss: 0.00782
Epoch: 33436, Training Loss: 0.00783
Epoch: 33436, Training Loss: 0.00961
Epoch: 33437, Training Loss: 0.00751
Epoch: 33437, Training Loss: 0.00782
Epoch: 33437, Training Loss: 0.00783
Epoch: 33437, Training Loss: 0.00961
Epoch: 33438, Training Loss: 0.00751
Epoch: 33438, Training Loss: 0.00782
Epoch: 33438, Training Loss: 0.00783
Epoch: 33438, Training Loss: 0.00961
Epoch: 33439, Training Loss: 0.00751
Epoch: 33439, Training Loss: 0.00782
Epoch: 33439, Training Loss: 0.00783
Epoch: 33439, Training Loss: 0.00961
Epoch: 33440, Training Loss: 0.00751
Epoch: 33440, Training Loss: 0.00782
Epoch: 33440, Training Loss: 0.00783
Epoch: 33440, Training Loss: 0.00961
Epoch: 33441, Training Loss: 0.00751
Epoch: 33441, Training Loss: 0.00782
Epoch: 33441, Training Loss: 0.00783
Epoch: 33441, Training Loss: 0.00961
Epoch: 33442, Training Loss: 0.00751
Epoch: 33442, Training Loss: 0.00782
Epoch: 33442, Training Loss: 0.00783
Epoch: 33442, Training Loss: 0.00961
Epoch: 33443, Training Loss: 0.00751
Epoch: 33443, Training Loss: 0.00782
Epoch: 33443, Training Loss: 0.00783
Epoch: 33443, Training Loss: 0.00961
Epoch: 33444, Training Loss: 0.00751
Epoch: 33444, Training Loss: 0.00782
Epoch: 33444, Training Loss: 0.00783
Epoch: 33444, Training Loss: 0.00961
Epoch: 33445, Training Loss: 0.00751
Epoch: 33445, Training Loss: 0.00782
Epoch: 33445, Training Loss: 0.00783
Epoch: 33445, Training Loss: 0.00961
Epoch: 33446, Training Loss: 0.00751
Epoch: 33446, Training Loss: 0.00782
Epoch: 33446, Training Loss: 0.00783
Epoch: 33446, Training Loss: 0.00961
Epoch: 33447, Training Loss: 0.00751
Epoch: 33447, Training Loss: 0.00782
Epoch: 33447, Training Loss: 0.00783
Epoch: 33447, Training Loss: 0.00961
Epoch: 33448, Training Loss: 0.00751
Epoch: 33448, Training Loss: 0.00782
Epoch: 33448, Training Loss: 0.00783
Epoch: 33448, Training Loss: 0.00961
Epoch: 33449, Training Loss: 0.00751
Epoch: 33449, Training Loss: 0.00782
Epoch: 33449, Training Loss: 0.00783
Epoch: 33449, Training Loss: 0.00961
Epoch: 33450, Training Loss: 0.00751
Epoch: 33450, Training Loss: 0.00782
Epoch: 33450, Training Loss: 0.00783
Epoch: 33450, Training Loss: 0.00961
Epoch: 33451, Training Loss: 0.00751
Epoch: 33451, Training Loss: 0.00782
Epoch: 33451, Training Loss: 0.00783
Epoch: 33451, Training Loss: 0.00961
Epoch: 33452, Training Loss: 0.00751
Epoch: 33452, Training Loss: 0.00782
Epoch: 33452, Training Loss: 0.00783
Epoch: 33452, Training Loss: 0.00961
Epoch: 33453, Training Loss: 0.00751
Epoch: 33453, Training Loss: 0.00782
Epoch: 33453, Training Loss: 0.00783
Epoch: 33453, Training Loss: 0.00961
Epoch: 33454, Training Loss: 0.00751
Epoch: 33454, Training Loss: 0.00782
Epoch: 33454, Training Loss: 0.00783
Epoch: 33454, Training Loss: 0.00961
Epoch: 33455, Training Loss: 0.00751
Epoch: 33455, Training Loss: 0.00782
Epoch: 33455, Training Loss: 0.00783
Epoch: 33455, Training Loss: 0.00961
Epoch: 33456, Training Loss: 0.00751
Epoch: 33456, Training Loss: 0.00782
Epoch: 33456, Training Loss: 0.00783
Epoch: 33456, Training Loss: 0.00961
Epoch: 33457, Training Loss: 0.00751
Epoch: 33457, Training Loss: 0.00782
Epoch: 33457, Training Loss: 0.00783
Epoch: 33457, Training Loss: 0.00961
Epoch: 33458, Training Loss: 0.00751
Epoch: 33458, Training Loss: 0.00782
Epoch: 33458, Training Loss: 0.00783
Epoch: 33458, Training Loss: 0.00961
Epoch: 33459, Training Loss: 0.00751
Epoch: 33459, Training Loss: 0.00782
Epoch: 33459, Training Loss: 0.00783
Epoch: 33459, Training Loss: 0.00961
Epoch: 33460, Training Loss: 0.00751
Epoch: 33460, Training Loss: 0.00782
Epoch: 33460, Training Loss: 0.00783
Epoch: 33460, Training Loss: 0.00961
Epoch: 33461, Training Loss: 0.00751
Epoch: 33461, Training Loss: 0.00782
Epoch: 33461, Training Loss: 0.00783
Epoch: 33461, Training Loss: 0.00961
Epoch: 33462, Training Loss: 0.00751
Epoch: 33462, Training Loss: 0.00782
Epoch: 33462, Training Loss: 0.00783
Epoch: 33462, Training Loss: 0.00961
Epoch: 33463, Training Loss: 0.00751
Epoch: 33463, Training Loss: 0.00782
Epoch: 33463, Training Loss: 0.00783
Epoch: 33463, Training Loss: 0.00961
Epoch: 33464, Training Loss: 0.00751
Epoch: 33464, Training Loss: 0.00782
Epoch: 33464, Training Loss: 0.00783
Epoch: 33464, Training Loss: 0.00961
Epoch: 33465, Training Loss: 0.00751
Epoch: 33465, Training Loss: 0.00782
Epoch: 33465, Training Loss: 0.00783
Epoch: 33465, Training Loss: 0.00961
Epoch: 33466, Training Loss: 0.00751
Epoch: 33466, Training Loss: 0.00782
Epoch: 33466, Training Loss: 0.00783
Epoch: 33466, Training Loss: 0.00961
Epoch: 33467, Training Loss: 0.00751
Epoch: 33467, Training Loss: 0.00782
Epoch: 33467, Training Loss: 0.00782
Epoch: 33467, Training Loss: 0.00961
Epoch: 33468, Training Loss: 0.00751
Epoch: 33468, Training Loss: 0.00782
Epoch: 33468, Training Loss: 0.00782
Epoch: 33468, Training Loss: 0.00961
Epoch: 33469, Training Loss: 0.00751
Epoch: 33469, Training Loss: 0.00782
Epoch: 33469, Training Loss: 0.00782
Epoch: 33469, Training Loss: 0.00961
Epoch: 33470, Training Loss: 0.00751
Epoch: 33470, Training Loss: 0.00782
Epoch: 33470, Training Loss: 0.00782
Epoch: 33470, Training Loss: 0.00961
Epoch: 33471, Training Loss: 0.00751
Epoch: 33471, Training Loss: 0.00782
Epoch: 33471, Training Loss: 0.00782
Epoch: 33471, Training Loss: 0.00961
Epoch: 33472, Training Loss: 0.00751
Epoch: 33472, Training Loss: 0.00782
Epoch: 33472, Training Loss: 0.00782
Epoch: 33472, Training Loss: 0.00960
Epoch: 33473, Training Loss: 0.00751
Epoch: 33473, Training Loss: 0.00782
Epoch: 33473, Training Loss: 0.00782
Epoch: 33473, Training Loss: 0.00960
Epoch: 33474, Training Loss: 0.00751
Epoch: 33474, Training Loss: 0.00781
Epoch: 33474, Training Loss: 0.00782
Epoch: 33474, Training Loss: 0.00960
Epoch: 33475, Training Loss: 0.00751
Epoch: 33475, Training Loss: 0.00781
Epoch: 33475, Training Loss: 0.00782
Epoch: 33475, Training Loss: 0.00960
Epoch: 33476, Training Loss: 0.00751
Epoch: 33476, Training Loss: 0.00781
Epoch: 33476, Training Loss: 0.00782
Epoch: 33476, Training Loss: 0.00960
Epoch: 33477, Training Loss: 0.00751
Epoch: 33477, Training Loss: 0.00781
Epoch: 33477, Training Loss: 0.00782
Epoch: 33477, Training Loss: 0.00960
Epoch: 33478, Training Loss: 0.00751
Epoch: 33478, Training Loss: 0.00781
Epoch: 33478, Training Loss: 0.00782
Epoch: 33478, Training Loss: 0.00960
Epoch: 33479, Training Loss: 0.00751
Epoch: 33479, Training Loss: 0.00781
Epoch: 33479, Training Loss: 0.00782
Epoch: 33479, Training Loss: 0.00960
Epoch: 33480, Training Loss: 0.00751
Epoch: 33480, Training Loss: 0.00781
Epoch: 33480, Training Loss: 0.00782
Epoch: 33480, Training Loss: 0.00960
Epoch: 33481, Training Loss: 0.00751
Epoch: 33481, Training Loss: 0.00781
Epoch: 33481, Training Loss: 0.00782
Epoch: 33481, Training Loss: 0.00960
Epoch: 33482, Training Loss: 0.00751
Epoch: 33482, Training Loss: 0.00781
Epoch: 33482, Training Loss: 0.00782
Epoch: 33482, Training Loss: 0.00960
Epoch: 33483, Training Loss: 0.00751
Epoch: 33483, Training Loss: 0.00781
Epoch: 33483, Training Loss: 0.00782
Epoch: 33483, Training Loss: 0.00960
Epoch: 33484, Training Loss: 0.00751
Epoch: 33484, Training Loss: 0.00781
Epoch: 33484, Training Loss: 0.00782
Epoch: 33484, Training Loss: 0.00960
Epoch: 33485, Training Loss: 0.00751
Epoch: 33485, Training Loss: 0.00781
Epoch: 33485, Training Loss: 0.00782
Epoch: 33485, Training Loss: 0.00960
Epoch: 33486, Training Loss: 0.00751
Epoch: 33486, Training Loss: 0.00781
Epoch: 33486, Training Loss: 0.00782
Epoch: 33486, Training Loss: 0.00960
Epoch: 33487, Training Loss: 0.00751
Epoch: 33487, Training Loss: 0.00781
Epoch: 33487, Training Loss: 0.00782
Epoch: 33487, Training Loss: 0.00960
Epoch: 33488, Training Loss: 0.00751
Epoch: 33488, Training Loss: 0.00781
Epoch: 33488, Training Loss: 0.00782
Epoch: 33488, Training Loss: 0.00960
Epoch: 33489, Training Loss: 0.00751
Epoch: 33489, Training Loss: 0.00781
Epoch: 33489, Training Loss: 0.00782
Epoch: 33489, Training Loss: 0.00960
Epoch: 33490, Training Loss: 0.00751
Epoch: 33490, Training Loss: 0.00781
Epoch: 33490, Training Loss: 0.00782
Epoch: 33490, Training Loss: 0.00960
Epoch: 33491, Training Loss: 0.00751
Epoch: 33491, Training Loss: 0.00781
Epoch: 33491, Training Loss: 0.00782
Epoch: 33491, Training Loss: 0.00960
Epoch: 33492, Training Loss: 0.00751
Epoch: 33492, Training Loss: 0.00781
Epoch: 33492, Training Loss: 0.00782
Epoch: 33492, Training Loss: 0.00960
Epoch: 33493, Training Loss: 0.00751
Epoch: 33493, Training Loss: 0.00781
Epoch: 33493, Training Loss: 0.00782
Epoch: 33493, Training Loss: 0.00960
Epoch: 33494, Training Loss: 0.00751
Epoch: 33494, Training Loss: 0.00781
Epoch: 33494, Training Loss: 0.00782
Epoch: 33494, Training Loss: 0.00960
Epoch: 33495, Training Loss: 0.00751
Epoch: 33495, Training Loss: 0.00781
Epoch: 33495, Training Loss: 0.00782
Epoch: 33495, Training Loss: 0.00960
Epoch: 33496, Training Loss: 0.00751
Epoch: 33496, Training Loss: 0.00781
Epoch: 33496, Training Loss: 0.00782
Epoch: 33496, Training Loss: 0.00960
Epoch: 33497, Training Loss: 0.00751
Epoch: 33497, Training Loss: 0.00781
Epoch: 33497, Training Loss: 0.00782
Epoch: 33497, Training Loss: 0.00960
Epoch: 33498, Training Loss: 0.00751
Epoch: 33498, Training Loss: 0.00781
Epoch: 33498, Training Loss: 0.00782
Epoch: 33498, Training Loss: 0.00960
Epoch: 33499, Training Loss: 0.00751
Epoch: 33499, Training Loss: 0.00781
Epoch: 33499, Training Loss: 0.00782
Epoch: 33499, Training Loss: 0.00960
Epoch: 33500, Training Loss: 0.00751
Epoch: 33500, Training Loss: 0.00781
Epoch: 33500, Training Loss: 0.00782
Epoch: 33500, Training Loss: 0.00960
Epoch: 33501, Training Loss: 0.00751
Epoch: 33501, Training Loss: 0.00781
Epoch: 33501, Training Loss: 0.00782
Epoch: 33501, Training Loss: 0.00960
Epoch: 33502, Training Loss: 0.00751
Epoch: 33502, Training Loss: 0.00781
Epoch: 33502, Training Loss: 0.00782
Epoch: 33502, Training Loss: 0.00960
Epoch: 33503, Training Loss: 0.00751
Epoch: 33503, Training Loss: 0.00781
Epoch: 33503, Training Loss: 0.00782
Epoch: 33503, Training Loss: 0.00960
Epoch: 33504, Training Loss: 0.00751
Epoch: 33504, Training Loss: 0.00781
Epoch: 33504, Training Loss: 0.00782
Epoch: 33504, Training Loss: 0.00960
Epoch: 33505, Training Loss: 0.00751
Epoch: 33505, Training Loss: 0.00781
Epoch: 33505, Training Loss: 0.00782
Epoch: 33505, Training Loss: 0.00960
Epoch: 33506, Training Loss: 0.00750
Epoch: 33506, Training Loss: 0.00781
Epoch: 33506, Training Loss: 0.00782
Epoch: 33506, Training Loss: 0.00960
Epoch: 33507, Training Loss: 0.00750
Epoch: 33507, Training Loss: 0.00781
Epoch: 33507, Training Loss: 0.00782
Epoch: 33507, Training Loss: 0.00960
Epoch: 33508, Training Loss: 0.00750
Epoch: 33508, Training Loss: 0.00781
Epoch: 33508, Training Loss: 0.00782
Epoch: 33508, Training Loss: 0.00960
Epoch: 33509, Training Loss: 0.00750
Epoch: 33509, Training Loss: 0.00781
Epoch: 33509, Training Loss: 0.00782
Epoch: 33509, Training Loss: 0.00960
Epoch: 33510, Training Loss: 0.00750
Epoch: 33510, Training Loss: 0.00781
Epoch: 33510, Training Loss: 0.00782
Epoch: 33510, Training Loss: 0.00960
Epoch: 33511, Training Loss: 0.00750
Epoch: 33511, Training Loss: 0.00781
Epoch: 33511, Training Loss: 0.00782
Epoch: 33511, Training Loss: 0.00960
Epoch: 33512, Training Loss: 0.00750
Epoch: 33512, Training Loss: 0.00781
Epoch: 33512, Training Loss: 0.00782
Epoch: 33512, Training Loss: 0.00960
Epoch: 33513, Training Loss: 0.00750
Epoch: 33513, Training Loss: 0.00781
Epoch: 33513, Training Loss: 0.00782
Epoch: 33513, Training Loss: 0.00960
Epoch: 33514, Training Loss: 0.00750
Epoch: 33514, Training Loss: 0.00781
Epoch: 33514, Training Loss: 0.00782
Epoch: 33514, Training Loss: 0.00960
Epoch: 33515, Training Loss: 0.00750
Epoch: 33515, Training Loss: 0.00781
Epoch: 33515, Training Loss: 0.00782
Epoch: 33515, Training Loss: 0.00960
Epoch: 33516, Training Loss: 0.00750
Epoch: 33516, Training Loss: 0.00781
Epoch: 33516, Training Loss: 0.00782
Epoch: 33516, Training Loss: 0.00960
Epoch: 33517, Training Loss: 0.00750
Epoch: 33517, Training Loss: 0.00781
Epoch: 33517, Training Loss: 0.00782
Epoch: 33517, Training Loss: 0.00960
Epoch: 33518, Training Loss: 0.00750
Epoch: 33518, Training Loss: 0.00781
Epoch: 33518, Training Loss: 0.00782
Epoch: 33518, Training Loss: 0.00960
Epoch: 33519, Training Loss: 0.00750
Epoch: 33519, Training Loss: 0.00781
Epoch: 33519, Training Loss: 0.00782
Epoch: 33519, Training Loss: 0.00960
Epoch: 33520, Training Loss: 0.00750
Epoch: 33520, Training Loss: 0.00781
Epoch: 33520, Training Loss: 0.00782
Epoch: 33520, Training Loss: 0.00960
Epoch: 33521, Training Loss: 0.00750
Epoch: 33521, Training Loss: 0.00781
Epoch: 33521, Training Loss: 0.00782
Epoch: 33521, Training Loss: 0.00960
Epoch: 33522, Training Loss: 0.00750
Epoch: 33522, Training Loss: 0.00781
Epoch: 33522, Training Loss: 0.00782
Epoch: 33522, Training Loss: 0.00960
Epoch: 33523, Training Loss: 0.00750
Epoch: 33523, Training Loss: 0.00781
Epoch: 33523, Training Loss: 0.00782
Epoch: 33523, Training Loss: 0.00960
Epoch: 33524, Training Loss: 0.00750
Epoch: 33524, Training Loss: 0.00781
Epoch: 33524, Training Loss: 0.00782
Epoch: 33524, Training Loss: 0.00960
Epoch: 33525, Training Loss: 0.00750
Epoch: 33525, Training Loss: 0.00781
Epoch: 33525, Training Loss: 0.00782
Epoch: 33525, Training Loss: 0.00960
Epoch: 33526, Training Loss: 0.00750
Epoch: 33526, Training Loss: 0.00781
Epoch: 33526, Training Loss: 0.00782
Epoch: 33526, Training Loss: 0.00960
Epoch: 33527, Training Loss: 0.00750
Epoch: 33527, Training Loss: 0.00781
Epoch: 33527, Training Loss: 0.00782
Epoch: 33527, Training Loss: 0.00960
Epoch: 33528, Training Loss: 0.00750
Epoch: 33528, Training Loss: 0.00781
Epoch: 33528, Training Loss: 0.00782
Epoch: 33528, Training Loss: 0.00960
Epoch: 33529, Training Loss: 0.00750
Epoch: 33529, Training Loss: 0.00781
Epoch: 33529, Training Loss: 0.00782
Epoch: 33529, Training Loss: 0.00960
Epoch: 33530, Training Loss: 0.00750
Epoch: 33530, Training Loss: 0.00781
Epoch: 33530, Training Loss: 0.00782
Epoch: 33530, Training Loss: 0.00960
Epoch: 33531, Training Loss: 0.00750
Epoch: 33531, Training Loss: 0.00781
Epoch: 33531, Training Loss: 0.00782
Epoch: 33531, Training Loss: 0.00960
Epoch: 33532, Training Loss: 0.00750
Epoch: 33532, Training Loss: 0.00781
Epoch: 33532, Training Loss: 0.00782
Epoch: 33532, Training Loss: 0.00960
Epoch: 33533, Training Loss: 0.00750
Epoch: 33533, Training Loss: 0.00781
Epoch: 33533, Training Loss: 0.00782
Epoch: 33533, Training Loss: 0.00960
Epoch: 33534, Training Loss: 0.00750
Epoch: 33534, Training Loss: 0.00781
Epoch: 33534, Training Loss: 0.00782
Epoch: 33534, Training Loss: 0.00959
Epoch: 33535, Training Loss: 0.00750
Epoch: 33535, Training Loss: 0.00781
Epoch: 33535, Training Loss: 0.00782
Epoch: 33535, Training Loss: 0.00959
Epoch: 33536, Training Loss: 0.00750
Epoch: 33536, Training Loss: 0.00781
Epoch: 33536, Training Loss: 0.00782
Epoch: 33536, Training Loss: 0.00959
Epoch: 33537, Training Loss: 0.00750
Epoch: 33537, Training Loss: 0.00781
Epoch: 33537, Training Loss: 0.00782
Epoch: 33537, Training Loss: 0.00959
Epoch: 33538, Training Loss: 0.00750
Epoch: 33538, Training Loss: 0.00781
Epoch: 33538, Training Loss: 0.00782
Epoch: 33538, Training Loss: 0.00959
Epoch: 33539, Training Loss: 0.00750
Epoch: 33539, Training Loss: 0.00781
Epoch: 33539, Training Loss: 0.00782
Epoch: 33539, Training Loss: 0.00959
Epoch: 33540, Training Loss: 0.00750
Epoch: 33540, Training Loss: 0.00781
Epoch: 33540, Training Loss: 0.00782
Epoch: 33540, Training Loss: 0.00959
Epoch: 33541, Training Loss: 0.00750
Epoch: 33541, Training Loss: 0.00781
Epoch: 33541, Training Loss: 0.00782
Epoch: 33541, Training Loss: 0.00959
Epoch: 33542, Training Loss: 0.00750
Epoch: 33542, Training Loss: 0.00781
Epoch: 33542, Training Loss: 0.00782
Epoch: 33542, Training Loss: 0.00959
Epoch: 33543, Training Loss: 0.00750
Epoch: 33543, Training Loss: 0.00781
Epoch: 33543, Training Loss: 0.00782
Epoch: 33543, Training Loss: 0.00959
Epoch: 33544, Training Loss: 0.00750
Epoch: 33544, Training Loss: 0.00781
Epoch: 33544, Training Loss: 0.00781
Epoch: 33544, Training Loss: 0.00959
Epoch: 33545, Training Loss: 0.00750
Epoch: 33545, Training Loss: 0.00781
Epoch: 33545, Training Loss: 0.00781
Epoch: 33545, Training Loss: 0.00959
Epoch: 33546, Training Loss: 0.00750
Epoch: 33546, Training Loss: 0.00781
Epoch: 33546, Training Loss: 0.00781
Epoch: 33546, Training Loss: 0.00959
Epoch: 33547, Training Loss: 0.00750
Epoch: 33547, Training Loss: 0.00781
Epoch: 33547, Training Loss: 0.00781
Epoch: 33547, Training Loss: 0.00959
Epoch: 33548, Training Loss: 0.00750
Epoch: 33548, Training Loss: 0.00781
Epoch: 33548, Training Loss: 0.00781
Epoch: 33548, Training Loss: 0.00959
Epoch: 33549, Training Loss: 0.00750
Epoch: 33549, Training Loss: 0.00781
Epoch: 33549, Training Loss: 0.00781
Epoch: 33549, Training Loss: 0.00959
Epoch: 33550, Training Loss: 0.00750
Epoch: 33550, Training Loss: 0.00781
Epoch: 33550, Training Loss: 0.00781
Epoch: 33550, Training Loss: 0.00959
Epoch: 33551, Training Loss: 0.00750
Epoch: 33551, Training Loss: 0.00780
Epoch: 33551, Training Loss: 0.00781
Epoch: 33551, Training Loss: 0.00959
Epoch: 33552, Training Loss: 0.00750
Epoch: 33552, Training Loss: 0.00780
Epoch: 33552, Training Loss: 0.00781
Epoch: 33552, Training Loss: 0.00959
Epoch: 33553, Training Loss: 0.00750
Epoch: 33553, Training Loss: 0.00780
Epoch: 33553, Training Loss: 0.00781
Epoch: 33553, Training Loss: 0.00959
Epoch: 33554, Training Loss: 0.00750
Epoch: 33554, Training Loss: 0.00780
Epoch: 33554, Training Loss: 0.00781
Epoch: 33554, Training Loss: 0.00959
Epoch: 33555, Training Loss: 0.00750
Epoch: 33555, Training Loss: 0.00780
Epoch: 33555, Training Loss: 0.00781
Epoch: 33555, Training Loss: 0.00959
Epoch: 33556, Training Loss: 0.00750
Epoch: 33556, Training Loss: 0.00780
Epoch: 33556, Training Loss: 0.00781
Epoch: 33556, Training Loss: 0.00959
Epoch: 33557, Training Loss: 0.00750
Epoch: 33557, Training Loss: 0.00780
Epoch: 33557, Training Loss: 0.00781
Epoch: 33557, Training Loss: 0.00959
Epoch: 33558, Training Loss: 0.00750
Epoch: 33558, Training Loss: 0.00780
Epoch: 33558, Training Loss: 0.00781
Epoch: 33558, Training Loss: 0.00959
Epoch: 33559, Training Loss: 0.00750
Epoch: 33559, Training Loss: 0.00780
Epoch: 33559, Training Loss: 0.00781
Epoch: 33559, Training Loss: 0.00959
Epoch: 33560, Training Loss: 0.00750
Epoch: 33560, Training Loss: 0.00780
Epoch: 33560, Training Loss: 0.00781
Epoch: 33560, Training Loss: 0.00959
Epoch: 33561, Training Loss: 0.00750
Epoch: 33561, Training Loss: 0.00780
Epoch: 33561, Training Loss: 0.00781
Epoch: 33561, Training Loss: 0.00959
Epoch: 33562, Training Loss: 0.00750
Epoch: 33562, Training Loss: 0.00780
Epoch: 33562, Training Loss: 0.00781
Epoch: 33562, Training Loss: 0.00959
Epoch: 33563, Training Loss: 0.00750
Epoch: 33563, Training Loss: 0.00780
Epoch: 33563, Training Loss: 0.00781
Epoch: 33563, Training Loss: 0.00959
Epoch: 33564, Training Loss: 0.00750
Epoch: 33564, Training Loss: 0.00780
Epoch: 33564, Training Loss: 0.00781
Epoch: 33564, Training Loss: 0.00959
Epoch: 33565, Training Loss: 0.00750
Epoch: 33565, Training Loss: 0.00780
Epoch: 33565, Training Loss: 0.00781
Epoch: 33565, Training Loss: 0.00959
Epoch: 33566, Training Loss: 0.00750
Epoch: 33566, Training Loss: 0.00780
Epoch: 33566, Training Loss: 0.00781
Epoch: 33567, Training Loss: 0.00750
Epoch: 33567, Training Loss: 0.00780
Epoch: 33567, Training Loss: 0.00781
Epoch: 33567, Training Loss: 0.00959
[... repetitive log output for epochs 33568-33808 omitted: the four per-pattern losses decrease slowly and monotonically, by about 0.00002 each over this range ...]
Epoch: 33809, Training Loss: 0.00747
Epoch: 33809, Training Loss: 0.00777
Epoch: 33809, Training Loss: 0.00778
Epoch: 33809, Training Loss: 0.00955
Epoch: 33810, Training Loss: 0.00778
Epoch: 33810, Training Loss: 0.00955
Epoch: 33811, Training Loss: 0.00747
Epoch: 33811, Training Loss: 0.00777
Epoch: 33811, Training Loss: 0.00778
Epoch: 33811, Training Loss: 0.00955
Epoch: 33812, Training Loss: 0.00747
Epoch: 33812, Training Loss: 0.00777
Epoch: 33812, Training Loss: 0.00778
Epoch: 33812, Training Loss: 0.00955
Epoch: 33813, Training Loss: 0.00747
Epoch: 33813, Training Loss: 0.00777
Epoch: 33813, Training Loss: 0.00778
Epoch: 33813, Training Loss: 0.00955
Epoch: 33814, Training Loss: 0.00747
Epoch: 33814, Training Loss: 0.00777
Epoch: 33814, Training Loss: 0.00778
Epoch: 33814, Training Loss: 0.00955
Epoch: 33815, Training Loss: 0.00747
Epoch: 33815, Training Loss: 0.00777
Epoch: 33815, Training Loss: 0.00778
Epoch: 33815, Training Loss: 0.00955
Epoch: 33816, Training Loss: 0.00747
Epoch: 33816, Training Loss: 0.00777
Epoch: 33816, Training Loss: 0.00778
Epoch: 33816, Training Loss: 0.00955
Epoch: 33817, Training Loss: 0.00747
Epoch: 33817, Training Loss: 0.00777
Epoch: 33817, Training Loss: 0.00778
Epoch: 33817, Training Loss: 0.00955
Epoch: 33818, Training Loss: 0.00747
Epoch: 33818, Training Loss: 0.00777
Epoch: 33818, Training Loss: 0.00778
Epoch: 33818, Training Loss: 0.00955
Epoch: 33819, Training Loss: 0.00747
Epoch: 33819, Training Loss: 0.00777
Epoch: 33819, Training Loss: 0.00778
Epoch: 33819, Training Loss: 0.00955
Epoch: 33820, Training Loss: 0.00747
Epoch: 33820, Training Loss: 0.00777
Epoch: 33820, Training Loss: 0.00778
Epoch: 33820, Training Loss: 0.00955
Epoch: 33821, Training Loss: 0.00747
Epoch: 33821, Training Loss: 0.00777
Epoch: 33821, Training Loss: 0.00778
Epoch: 33821, Training Loss: 0.00955
Epoch: 33822, Training Loss: 0.00747
Epoch: 33822, Training Loss: 0.00777
Epoch: 33822, Training Loss: 0.00778
Epoch: 33822, Training Loss: 0.00955
Epoch: 33823, Training Loss: 0.00747
Epoch: 33823, Training Loss: 0.00777
Epoch: 33823, Training Loss: 0.00778
Epoch: 33823, Training Loss: 0.00955
Epoch: 33824, Training Loss: 0.00747
Epoch: 33824, Training Loss: 0.00777
Epoch: 33824, Training Loss: 0.00778
Epoch: 33824, Training Loss: 0.00955
Epoch: 33825, Training Loss: 0.00747
Epoch: 33825, Training Loss: 0.00777
Epoch: 33825, Training Loss: 0.00778
Epoch: 33825, Training Loss: 0.00955
Epoch: 33826, Training Loss: 0.00747
Epoch: 33826, Training Loss: 0.00777
Epoch: 33826, Training Loss: 0.00778
Epoch: 33826, Training Loss: 0.00955
Epoch: 33827, Training Loss: 0.00747
Epoch: 33827, Training Loss: 0.00777
Epoch: 33827, Training Loss: 0.00778
Epoch: 33827, Training Loss: 0.00955
Epoch: 33828, Training Loss: 0.00747
Epoch: 33828, Training Loss: 0.00777
Epoch: 33828, Training Loss: 0.00778
Epoch: 33828, Training Loss: 0.00955
Epoch: 33829, Training Loss: 0.00747
Epoch: 33829, Training Loss: 0.00777
Epoch: 33829, Training Loss: 0.00778
Epoch: 33829, Training Loss: 0.00955
Epoch: 33830, Training Loss: 0.00747
Epoch: 33830, Training Loss: 0.00777
Epoch: 33830, Training Loss: 0.00778
Epoch: 33830, Training Loss: 0.00955
Epoch: 33831, Training Loss: 0.00747
Epoch: 33831, Training Loss: 0.00777
Epoch: 33831, Training Loss: 0.00778
Epoch: 33831, Training Loss: 0.00955
Epoch: 33832, Training Loss: 0.00747
Epoch: 33832, Training Loss: 0.00777
Epoch: 33832, Training Loss: 0.00778
Epoch: 33832, Training Loss: 0.00955
Epoch: 33833, Training Loss: 0.00747
Epoch: 33833, Training Loss: 0.00777
Epoch: 33833, Training Loss: 0.00778
Epoch: 33833, Training Loss: 0.00955
Epoch: 33834, Training Loss: 0.00747
Epoch: 33834, Training Loss: 0.00777
Epoch: 33834, Training Loss: 0.00778
Epoch: 33834, Training Loss: 0.00955
Epoch: 33835, Training Loss: 0.00747
Epoch: 33835, Training Loss: 0.00777
Epoch: 33835, Training Loss: 0.00778
Epoch: 33835, Training Loss: 0.00955
Epoch: 33836, Training Loss: 0.00746
Epoch: 33836, Training Loss: 0.00777
Epoch: 33836, Training Loss: 0.00778
Epoch: 33836, Training Loss: 0.00955
Epoch: 33837, Training Loss: 0.00746
Epoch: 33837, Training Loss: 0.00777
Epoch: 33837, Training Loss: 0.00778
Epoch: 33837, Training Loss: 0.00955
Epoch: 33838, Training Loss: 0.00746
Epoch: 33838, Training Loss: 0.00777
Epoch: 33838, Training Loss: 0.00778
Epoch: 33838, Training Loss: 0.00955
Epoch: 33839, Training Loss: 0.00746
Epoch: 33839, Training Loss: 0.00777
Epoch: 33839, Training Loss: 0.00778
Epoch: 33839, Training Loss: 0.00955
Epoch: 33840, Training Loss: 0.00746
Epoch: 33840, Training Loss: 0.00777
Epoch: 33840, Training Loss: 0.00778
Epoch: 33840, Training Loss: 0.00955
Epoch: 33841, Training Loss: 0.00746
Epoch: 33841, Training Loss: 0.00777
Epoch: 33841, Training Loss: 0.00778
Epoch: 33841, Training Loss: 0.00955
Epoch: 33842, Training Loss: 0.00746
Epoch: 33842, Training Loss: 0.00777
Epoch: 33842, Training Loss: 0.00778
Epoch: 33842, Training Loss: 0.00955
Epoch: 33843, Training Loss: 0.00746
Epoch: 33843, Training Loss: 0.00777
Epoch: 33843, Training Loss: 0.00778
Epoch: 33843, Training Loss: 0.00955
Epoch: 33844, Training Loss: 0.00746
Epoch: 33844, Training Loss: 0.00777
Epoch: 33844, Training Loss: 0.00778
Epoch: 33844, Training Loss: 0.00955
Epoch: 33845, Training Loss: 0.00746
Epoch: 33845, Training Loss: 0.00777
Epoch: 33845, Training Loss: 0.00778
Epoch: 33845, Training Loss: 0.00955
Epoch: 33846, Training Loss: 0.00746
Epoch: 33846, Training Loss: 0.00777
Epoch: 33846, Training Loss: 0.00778
Epoch: 33846, Training Loss: 0.00954
Epoch: 33847, Training Loss: 0.00746
Epoch: 33847, Training Loss: 0.00777
Epoch: 33847, Training Loss: 0.00778
Epoch: 33847, Training Loss: 0.00954
Epoch: 33848, Training Loss: 0.00746
Epoch: 33848, Training Loss: 0.00777
Epoch: 33848, Training Loss: 0.00778
Epoch: 33848, Training Loss: 0.00954
Epoch: 33849, Training Loss: 0.00746
Epoch: 33849, Training Loss: 0.00777
Epoch: 33849, Training Loss: 0.00778
Epoch: 33849, Training Loss: 0.00954
Epoch: 33850, Training Loss: 0.00746
Epoch: 33850, Training Loss: 0.00777
Epoch: 33850, Training Loss: 0.00778
Epoch: 33850, Training Loss: 0.00954
Epoch: 33851, Training Loss: 0.00746
Epoch: 33851, Training Loss: 0.00777
Epoch: 33851, Training Loss: 0.00778
Epoch: 33851, Training Loss: 0.00954
Epoch: 33852, Training Loss: 0.00746
Epoch: 33852, Training Loss: 0.00777
Epoch: 33852, Training Loss: 0.00778
Epoch: 33852, Training Loss: 0.00954
Epoch: 33853, Training Loss: 0.00746
Epoch: 33853, Training Loss: 0.00777
Epoch: 33853, Training Loss: 0.00778
Epoch: 33853, Training Loss: 0.00954
Epoch: 33854, Training Loss: 0.00746
Epoch: 33854, Training Loss: 0.00777
Epoch: 33854, Training Loss: 0.00777
Epoch: 33854, Training Loss: 0.00954
Epoch: 33855, Training Loss: 0.00746
Epoch: 33855, Training Loss: 0.00777
Epoch: 33855, Training Loss: 0.00777
Epoch: 33855, Training Loss: 0.00954
Epoch: 33856, Training Loss: 0.00746
Epoch: 33856, Training Loss: 0.00777
Epoch: 33856, Training Loss: 0.00777
Epoch: 33856, Training Loss: 0.00954
Epoch: 33857, Training Loss: 0.00746
Epoch: 33857, Training Loss: 0.00777
Epoch: 33857, Training Loss: 0.00777
Epoch: 33857, Training Loss: 0.00954
Epoch: 33858, Training Loss: 0.00746
Epoch: 33858, Training Loss: 0.00777
Epoch: 33858, Training Loss: 0.00777
Epoch: 33858, Training Loss: 0.00954
Epoch: 33859, Training Loss: 0.00746
Epoch: 33859, Training Loss: 0.00777
Epoch: 33859, Training Loss: 0.00777
Epoch: 33859, Training Loss: 0.00954
Epoch: 33860, Training Loss: 0.00746
Epoch: 33860, Training Loss: 0.00777
Epoch: 33860, Training Loss: 0.00777
Epoch: 33860, Training Loss: 0.00954
Epoch: 33861, Training Loss: 0.00746
Epoch: 33861, Training Loss: 0.00777
Epoch: 33861, Training Loss: 0.00777
Epoch: 33861, Training Loss: 0.00954
Epoch: 33862, Training Loss: 0.00746
Epoch: 33862, Training Loss: 0.00776
Epoch: 33862, Training Loss: 0.00777
Epoch: 33862, Training Loss: 0.00954
Epoch: 33863, Training Loss: 0.00746
Epoch: 33863, Training Loss: 0.00776
Epoch: 33863, Training Loss: 0.00777
Epoch: 33863, Training Loss: 0.00954
Epoch: 33864, Training Loss: 0.00746
Epoch: 33864, Training Loss: 0.00776
Epoch: 33864, Training Loss: 0.00777
Epoch: 33864, Training Loss: 0.00954
Epoch: 33865, Training Loss: 0.00746
Epoch: 33865, Training Loss: 0.00776
Epoch: 33865, Training Loss: 0.00777
Epoch: 33865, Training Loss: 0.00954
Epoch: 33866, Training Loss: 0.00746
Epoch: 33866, Training Loss: 0.00776
Epoch: 33866, Training Loss: 0.00777
Epoch: 33866, Training Loss: 0.00954
Epoch: 33867, Training Loss: 0.00746
Epoch: 33867, Training Loss: 0.00776
Epoch: 33867, Training Loss: 0.00777
Epoch: 33867, Training Loss: 0.00954
Epoch: 33868, Training Loss: 0.00746
Epoch: 33868, Training Loss: 0.00776
Epoch: 33868, Training Loss: 0.00777
Epoch: 33868, Training Loss: 0.00954
Epoch: 33869, Training Loss: 0.00746
Epoch: 33869, Training Loss: 0.00776
Epoch: 33869, Training Loss: 0.00777
Epoch: 33869, Training Loss: 0.00954
Epoch: 33870, Training Loss: 0.00746
Epoch: 33870, Training Loss: 0.00776
Epoch: 33870, Training Loss: 0.00777
Epoch: 33870, Training Loss: 0.00954
Epoch: 33871, Training Loss: 0.00746
Epoch: 33871, Training Loss: 0.00776
Epoch: 33871, Training Loss: 0.00777
Epoch: 33871, Training Loss: 0.00954
Epoch: 33872, Training Loss: 0.00746
Epoch: 33872, Training Loss: 0.00776
Epoch: 33872, Training Loss: 0.00777
Epoch: 33872, Training Loss: 0.00954
Epoch: 33873, Training Loss: 0.00746
Epoch: 33873, Training Loss: 0.00776
Epoch: 33873, Training Loss: 0.00777
Epoch: 33873, Training Loss: 0.00954
Epoch: 33874, Training Loss: 0.00746
Epoch: 33874, Training Loss: 0.00776
Epoch: 33874, Training Loss: 0.00777
Epoch: 33874, Training Loss: 0.00954
Epoch: 33875, Training Loss: 0.00746
Epoch: 33875, Training Loss: 0.00776
Epoch: 33875, Training Loss: 0.00777
Epoch: 33875, Training Loss: 0.00954
Epoch: 33876, Training Loss: 0.00746
Epoch: 33876, Training Loss: 0.00776
Epoch: 33876, Training Loss: 0.00777
Epoch: 33876, Training Loss: 0.00954
Epoch: 33877, Training Loss: 0.00746
Epoch: 33877, Training Loss: 0.00776
Epoch: 33877, Training Loss: 0.00777
Epoch: 33877, Training Loss: 0.00954
Epoch: 33878, Training Loss: 0.00746
Epoch: 33878, Training Loss: 0.00776
Epoch: 33878, Training Loss: 0.00777
Epoch: 33878, Training Loss: 0.00954
Epoch: 33879, Training Loss: 0.00746
Epoch: 33879, Training Loss: 0.00776
Epoch: 33879, Training Loss: 0.00777
Epoch: 33879, Training Loss: 0.00954
Epoch: 33880, Training Loss: 0.00746
Epoch: 33880, Training Loss: 0.00776
Epoch: 33880, Training Loss: 0.00777
Epoch: 33880, Training Loss: 0.00954
Epoch: 33881, Training Loss: 0.00746
Epoch: 33881, Training Loss: 0.00776
Epoch: 33881, Training Loss: 0.00777
Epoch: 33881, Training Loss: 0.00954
Epoch: 33882, Training Loss: 0.00746
Epoch: 33882, Training Loss: 0.00776
Epoch: 33882, Training Loss: 0.00777
Epoch: 33882, Training Loss: 0.00954
Epoch: 33883, Training Loss: 0.00746
Epoch: 33883, Training Loss: 0.00776
Epoch: 33883, Training Loss: 0.00777
Epoch: 33883, Training Loss: 0.00954
Epoch: 33884, Training Loss: 0.00746
Epoch: 33884, Training Loss: 0.00776
Epoch: 33884, Training Loss: 0.00777
Epoch: 33884, Training Loss: 0.00954
Epoch: 33885, Training Loss: 0.00746
Epoch: 33885, Training Loss: 0.00776
Epoch: 33885, Training Loss: 0.00777
Epoch: 33885, Training Loss: 0.00954
Epoch: 33886, Training Loss: 0.00746
Epoch: 33886, Training Loss: 0.00776
Epoch: 33886, Training Loss: 0.00777
Epoch: 33886, Training Loss: 0.00954
Epoch: 33887, Training Loss: 0.00746
Epoch: 33887, Training Loss: 0.00776
Epoch: 33887, Training Loss: 0.00777
Epoch: 33887, Training Loss: 0.00954
Epoch: 33888, Training Loss: 0.00746
Epoch: 33888, Training Loss: 0.00776
Epoch: 33888, Training Loss: 0.00777
Epoch: 33888, Training Loss: 0.00954
Epoch: 33889, Training Loss: 0.00746
Epoch: 33889, Training Loss: 0.00776
Epoch: 33889, Training Loss: 0.00777
Epoch: 33889, Training Loss: 0.00954
Epoch: 33890, Training Loss: 0.00746
Epoch: 33890, Training Loss: 0.00776
Epoch: 33890, Training Loss: 0.00777
Epoch: 33890, Training Loss: 0.00954
Epoch: 33891, Training Loss: 0.00746
Epoch: 33891, Training Loss: 0.00776
Epoch: 33891, Training Loss: 0.00777
Epoch: 33891, Training Loss: 0.00954
Epoch: 33892, Training Loss: 0.00746
Epoch: 33892, Training Loss: 0.00776
Epoch: 33892, Training Loss: 0.00777
Epoch: 33892, Training Loss: 0.00954
Epoch: 33893, Training Loss: 0.00746
Epoch: 33893, Training Loss: 0.00776
Epoch: 33893, Training Loss: 0.00777
Epoch: 33893, Training Loss: 0.00954
Epoch: 33894, Training Loss: 0.00746
Epoch: 33894, Training Loss: 0.00776
Epoch: 33894, Training Loss: 0.00777
Epoch: 33894, Training Loss: 0.00954
Epoch: 33895, Training Loss: 0.00746
Epoch: 33895, Training Loss: 0.00776
Epoch: 33895, Training Loss: 0.00777
Epoch: 33895, Training Loss: 0.00954
Epoch: 33896, Training Loss: 0.00746
Epoch: 33896, Training Loss: 0.00776
Epoch: 33896, Training Loss: 0.00777
Epoch: 33896, Training Loss: 0.00954
Epoch: 33897, Training Loss: 0.00746
Epoch: 33897, Training Loss: 0.00776
Epoch: 33897, Training Loss: 0.00777
Epoch: 33897, Training Loss: 0.00954
Epoch: 33898, Training Loss: 0.00746
Epoch: 33898, Training Loss: 0.00776
Epoch: 33898, Training Loss: 0.00777
Epoch: 33898, Training Loss: 0.00954
Epoch: 33899, Training Loss: 0.00746
Epoch: 33899, Training Loss: 0.00776
Epoch: 33899, Training Loss: 0.00777
Epoch: 33899, Training Loss: 0.00954
Epoch: 33900, Training Loss: 0.00746
Epoch: 33900, Training Loss: 0.00776
Epoch: 33900, Training Loss: 0.00777
Epoch: 33900, Training Loss: 0.00954
Epoch: 33901, Training Loss: 0.00746
Epoch: 33901, Training Loss: 0.00776
Epoch: 33901, Training Loss: 0.00777
Epoch: 33901, Training Loss: 0.00954
Epoch: 33902, Training Loss: 0.00746
Epoch: 33902, Training Loss: 0.00776
Epoch: 33902, Training Loss: 0.00777
Epoch: 33902, Training Loss: 0.00954
Epoch: 33903, Training Loss: 0.00746
Epoch: 33903, Training Loss: 0.00776
Epoch: 33903, Training Loss: 0.00777
Epoch: 33903, Training Loss: 0.00954
Epoch: 33904, Training Loss: 0.00746
Epoch: 33904, Training Loss: 0.00776
Epoch: 33904, Training Loss: 0.00777
Epoch: 33904, Training Loss: 0.00954
Epoch: 33905, Training Loss: 0.00746
Epoch: 33905, Training Loss: 0.00776
Epoch: 33905, Training Loss: 0.00777
Epoch: 33905, Training Loss: 0.00954
Epoch: 33906, Training Loss: 0.00746
Epoch: 33906, Training Loss: 0.00776
Epoch: 33906, Training Loss: 0.00777
Epoch: 33906, Training Loss: 0.00954
Epoch: 33907, Training Loss: 0.00746
Epoch: 33907, Training Loss: 0.00776
Epoch: 33907, Training Loss: 0.00777
Epoch: 33907, Training Loss: 0.00954
Epoch: 33908, Training Loss: 0.00746
Epoch: 33908, Training Loss: 0.00776
Epoch: 33908, Training Loss: 0.00777
Epoch: 33908, Training Loss: 0.00954
Epoch: 33909, Training Loss: 0.00746
Epoch: 33909, Training Loss: 0.00776
Epoch: 33909, Training Loss: 0.00777
Epoch: 33909, Training Loss: 0.00953
Epoch: 33910, Training Loss: 0.00746
Epoch: 33910, Training Loss: 0.00776
Epoch: 33910, Training Loss: 0.00777
Epoch: 33910, Training Loss: 0.00953
Epoch: 33911, Training Loss: 0.00746
Epoch: 33911, Training Loss: 0.00776
Epoch: 33911, Training Loss: 0.00777
Epoch: 33911, Training Loss: 0.00953
Epoch: 33912, Training Loss: 0.00746
Epoch: 33912, Training Loss: 0.00776
Epoch: 33912, Training Loss: 0.00777
Epoch: 33912, Training Loss: 0.00953
Epoch: 33913, Training Loss: 0.00746
Epoch: 33913, Training Loss: 0.00776
Epoch: 33913, Training Loss: 0.00777
Epoch: 33913, Training Loss: 0.00953
Epoch: 33914, Training Loss: 0.00746
Epoch: 33914, Training Loss: 0.00776
Epoch: 33914, Training Loss: 0.00777
Epoch: 33914, Training Loss: 0.00953
Epoch: 33915, Training Loss: 0.00746
Epoch: 33915, Training Loss: 0.00776
Epoch: 33915, Training Loss: 0.00777
Epoch: 33915, Training Loss: 0.00953
Epoch: 33916, Training Loss: 0.00746
Epoch: 33916, Training Loss: 0.00776
Epoch: 33916, Training Loss: 0.00777
Epoch: 33916, Training Loss: 0.00953
Epoch: 33917, Training Loss: 0.00746
Epoch: 33917, Training Loss: 0.00776
Epoch: 33917, Training Loss: 0.00777
Epoch: 33917, Training Loss: 0.00953
Epoch: 33918, Training Loss: 0.00746
Epoch: 33918, Training Loss: 0.00776
Epoch: 33918, Training Loss: 0.00777
Epoch: 33918, Training Loss: 0.00953
Epoch: 33919, Training Loss: 0.00746
Epoch: 33919, Training Loss: 0.00776
Epoch: 33919, Training Loss: 0.00777
Epoch: 33919, Training Loss: 0.00953
Epoch: 33920, Training Loss: 0.00745
Epoch: 33920, Training Loss: 0.00776
Epoch: 33920, Training Loss: 0.00777
Epoch: 33920, Training Loss: 0.00953
Epoch: 33921, Training Loss: 0.00745
Epoch: 33921, Training Loss: 0.00776
Epoch: 33921, Training Loss: 0.00777
Epoch: 33921, Training Loss: 0.00953
Epoch: 33922, Training Loss: 0.00745
Epoch: 33922, Training Loss: 0.00776
Epoch: 33922, Training Loss: 0.00777
Epoch: 33922, Training Loss: 0.00953
Epoch: 33923, Training Loss: 0.00745
Epoch: 33923, Training Loss: 0.00776
Epoch: 33923, Training Loss: 0.00777
Epoch: 33923, Training Loss: 0.00953
Epoch: 33924, Training Loss: 0.00745
Epoch: 33924, Training Loss: 0.00776
Epoch: 33924, Training Loss: 0.00777
Epoch: 33924, Training Loss: 0.00953
Epoch: 33925, Training Loss: 0.00745
Epoch: 33925, Training Loss: 0.00776
Epoch: 33925, Training Loss: 0.00777
Epoch: 33925, Training Loss: 0.00953
Epoch: 33926, Training Loss: 0.00745
Epoch: 33926, Training Loss: 0.00776
Epoch: 33926, Training Loss: 0.00777
Epoch: 33926, Training Loss: 0.00953
Epoch: 33927, Training Loss: 0.00745
Epoch: 33927, Training Loss: 0.00776
Epoch: 33927, Training Loss: 0.00777
Epoch: 33927, Training Loss: 0.00953
Epoch: 33928, Training Loss: 0.00745
Epoch: 33928, Training Loss: 0.00776
Epoch: 33928, Training Loss: 0.00777
Epoch: 33928, Training Loss: 0.00953
Epoch: 33929, Training Loss: 0.00745
Epoch: 33929, Training Loss: 0.00776
Epoch: 33929, Training Loss: 0.00777
Epoch: 33929, Training Loss: 0.00953
Epoch: 33930, Training Loss: 0.00745
Epoch: 33930, Training Loss: 0.00776
Epoch: 33930, Training Loss: 0.00777
Epoch: 33930, Training Loss: 0.00953
Epoch: 33931, Training Loss: 0.00745
Epoch: 33931, Training Loss: 0.00776
Epoch: 33931, Training Loss: 0.00777
Epoch: 33931, Training Loss: 0.00953
Epoch: 33932, Training Loss: 0.00745
Epoch: 33932, Training Loss: 0.00776
Epoch: 33932, Training Loss: 0.00777
Epoch: 33932, Training Loss: 0.00953
Epoch: 33933, Training Loss: 0.00745
Epoch: 33933, Training Loss: 0.00776
Epoch: 33933, Training Loss: 0.00776
Epoch: 33933, Training Loss: 0.00953
Epoch: 33934, Training Loss: 0.00745
Epoch: 33934, Training Loss: 0.00776
Epoch: 33934, Training Loss: 0.00776
Epoch: 33934, Training Loss: 0.00953
Epoch: 33935, Training Loss: 0.00745
Epoch: 33935, Training Loss: 0.00776
Epoch: 33935, Training Loss: 0.00776
Epoch: 33935, Training Loss: 0.00953
Epoch: 33936, Training Loss: 0.00745
Epoch: 33936, Training Loss: 0.00776
Epoch: 33936, Training Loss: 0.00776
Epoch: 33936, Training Loss: 0.00953
Epoch: 33937, Training Loss: 0.00745
Epoch: 33937, Training Loss: 0.00776
Epoch: 33937, Training Loss: 0.00776
Epoch: 33937, Training Loss: 0.00953
Epoch: 33938, Training Loss: 0.00745
Epoch: 33938, Training Loss: 0.00776
Epoch: 33938, Training Loss: 0.00776
Epoch: 33938, Training Loss: 0.00953
Epoch: 33939, Training Loss: 0.00745
Epoch: 33939, Training Loss: 0.00776
Epoch: 33939, Training Loss: 0.00776
Epoch: 33939, Training Loss: 0.00953
Epoch: 33940, Training Loss: 0.00745
Epoch: 33940, Training Loss: 0.00775
Epoch: 33940, Training Loss: 0.00776
Epoch: 33940, Training Loss: 0.00953
Epoch: 33941, Training Loss: 0.00745
Epoch: 33941, Training Loss: 0.00775
Epoch: 33941, Training Loss: 0.00776
Epoch: 33941, Training Loss: 0.00953
Epoch: 33942, Training Loss: 0.00745
Epoch: 33942, Training Loss: 0.00775
Epoch: 33942, Training Loss: 0.00776
Epoch: 33942, Training Loss: 0.00953
Epoch: 33943, Training Loss: 0.00745
Epoch: 33943, Training Loss: 0.00775
Epoch: 33943, Training Loss: 0.00776
Epoch: 33943, Training Loss: 0.00953
Epoch: 33944, Training Loss: 0.00745
Epoch: 33944, Training Loss: 0.00775
Epoch: 33944, Training Loss: 0.00776
Epoch: 33944, Training Loss: 0.00953
Epoch: 33945, Training Loss: 0.00745
Epoch: 33945, Training Loss: 0.00775
Epoch: 33945, Training Loss: 0.00776
Epoch: 33945, Training Loss: 0.00953
Epoch: 33946, Training Loss: 0.00745
Epoch: 33946, Training Loss: 0.00775
Epoch: 33946, Training Loss: 0.00776
Epoch: 33946, Training Loss: 0.00953
Epoch: 33947, Training Loss: 0.00745
Epoch: 33947, Training Loss: 0.00775
Epoch: 33947, Training Loss: 0.00776
Epoch: 33947, Training Loss: 0.00953
Epoch: 33948, Training Loss: 0.00745
Epoch: 33948, Training Loss: 0.00775
Epoch: 33948, Training Loss: 0.00776
Epoch: 33948, Training Loss: 0.00953
Epoch: 33949, Training Loss: 0.00745
Epoch: 33949, Training Loss: 0.00775
Epoch: 33949, Training Loss: 0.00776
Epoch: 33949, Training Loss: 0.00953
Epoch: 33950, Training Loss: 0.00745
Epoch: 33950, Training Loss: 0.00775
Epoch: 33950, Training Loss: 0.00776
Epoch: 33950, Training Loss: 0.00953
Epoch: 33951, Training Loss: 0.00745
Epoch: 33951, Training Loss: 0.00775
Epoch: 33951, Training Loss: 0.00776
Epoch: 33951, Training Loss: 0.00953
Epoch: 33952, Training Loss: 0.00745
Epoch: 33952, Training Loss: 0.00775
Epoch: 33952, Training Loss: 0.00776
Epoch: 33952, Training Loss: 0.00953
Epoch: 33953, Training Loss: 0.00745
Epoch: 33953, Training Loss: 0.00775
Epoch: 33953, Training Loss: 0.00776
Epoch: 33953, Training Loss: 0.00953
Epoch: 33954, Training Loss: 0.00745
Epoch: 33954, Training Loss: 0.00775
Epoch: 33954, Training Loss: 0.00776
Epoch: 33954, Training Loss: 0.00953
Epoch: 33955, Training Loss: 0.00745
Epoch: 33955, Training Loss: 0.00775
Epoch: 33955, Training Loss: 0.00776
Epoch: 33955, Training Loss: 0.00953
Epoch: 33956, Training Loss: 0.00745
Epoch: 33956, Training Loss: 0.00775
Epoch: 33956, Training Loss: 0.00776
Epoch: 33956, Training Loss: 0.00953
Epoch: 33957, Training Loss: 0.00745
Epoch: 33957, Training Loss: 0.00775
Epoch: 33957, Training Loss: 0.00776
Epoch: 33957, Training Loss: 0.00953
Epoch: 33958, Training Loss: 0.00745
Epoch: 33958, Training Loss: 0.00775
Epoch: 33958, Training Loss: 0.00776
Epoch: 33958, Training Loss: 0.00953
Epoch: 33959, Training Loss: 0.00745
Epoch: 33959, Training Loss: 0.00775
Epoch: 33959, Training Loss: 0.00776
Epoch: 33959, Training Loss: 0.00953
Epoch: 33960, Training Loss: 0.00745
Epoch: 33960, Training Loss: 0.00775
Epoch: 33960, Training Loss: 0.00776
Epoch: 33960, Training Loss: 0.00953
Epoch: 33961, Training Loss: 0.00745
Epoch: 33961, Training Loss: 0.00775
Epoch: 33961, Training Loss: 0.00776
Epoch: 33961, Training Loss: 0.00953
Epoch: 33962, Training Loss: 0.00745
Epoch: 33962, Training Loss: 0.00775
Epoch: 33962, Training Loss: 0.00776
Epoch: 33962, Training Loss: 0.00953
Epoch: 33963, Training Loss: 0.00745
Epoch: 33963, Training Loss: 0.00775
Epoch: 33963, Training Loss: 0.00776
Epoch: 33963, Training Loss: 0.00953
Epoch: 33964, Training Loss: 0.00745
Epoch: 33964, Training Loss: 0.00775
Epoch: 33964, Training Loss: 0.00776
Epoch: 33964, Training Loss: 0.00953
Epoch: 33965, Training Loss: 0.00745
Epoch: 33965, Training Loss: 0.00775
Epoch: 33965, Training Loss: 0.00776
Epoch: 33965, Training Loss: 0.00953
Epoch: 33966, Training Loss: 0.00745
Epoch: 33966, Training Loss: 0.00775
Epoch: 33966, Training Loss: 0.00776
Epoch: 33966, Training Loss: 0.00953
Epoch: 33967, Training Loss: 0.00745
Epoch: 33967, Training Loss: 0.00775
Epoch: 33967, Training Loss: 0.00776
Epoch: 33967, Training Loss: 0.00953
Epoch: 33968, Training Loss: 0.00745
Epoch: 33968, Training Loss: 0.00775
Epoch: 33968, Training Loss: 0.00776
Epoch: 33968, Training Loss: 0.00953
Epoch: 33969, Training Loss: 0.00745
Epoch: 33969, Training Loss: 0.00775
Epoch: 33969, Training Loss: 0.00776
Epoch: 33969, Training Loss: 0.00953
Epoch: 33970, Training Loss: 0.00745
Epoch: 33970, Training Loss: 0.00775
Epoch: 33970, Training Loss: 0.00776
Epoch: 33970, Training Loss: 0.00953
Epoch: 33971, Training Loss: 0.00745
Epoch: 33971, Training Loss: 0.00775
Epoch: 33971, Training Loss: 0.00776
Epoch: 33971, Training Loss: 0.00953
Epoch: 33972, Training Loss: 0.00745
Epoch: 33972, Training Loss: 0.00775
Epoch: 33972, Training Loss: 0.00776
Epoch: 33972, Training Loss: 0.00952
Epoch: 33973, Training Loss: 0.00745
Epoch: 33973, Training Loss: 0.00775
Epoch: 33973, Training Loss: 0.00776
Epoch: 33973, Training Loss: 0.00952
Epoch: 33974, Training Loss: 0.00745
Epoch: 33974, Training Loss: 0.00775
Epoch: 33974, Training Loss: 0.00776
Epoch: 33974, Training Loss: 0.00952
Epoch: 33975, Training Loss: 0.00745
Epoch: 33975, Training Loss: 0.00775
Epoch: 33975, Training Loss: 0.00776
Epoch: 33975, Training Loss: 0.00952
Epoch: 33976, Training Loss: 0.00745
Epoch: 33976, Training Loss: 0.00775
Epoch: 33976, Training Loss: 0.00776
Epoch: 33976, Training Loss: 0.00952
Epoch: 33977, Training Loss: 0.00745
Epoch: 33977, Training Loss: 0.00775
Epoch: 33977, Training Loss: 0.00776
Epoch: 33977, Training Loss: 0.00952
Epoch: 33978, Training Loss: 0.00745
Epoch: 33978, Training Loss: 0.00775
Epoch: 33978, Training Loss: 0.00776
Epoch: 33978, Training Loss: 0.00952
Epoch: 33979, Training Loss: 0.00745
Epoch: 33979, Training Loss: 0.00775
Epoch: 33979, Training Loss: 0.00776
Epoch: 33979, Training Loss: 0.00952
Epoch: 33980, Training Loss: 0.00745
Epoch: 33980, Training Loss: 0.00775
Epoch: 33980, Training Loss: 0.00776
Epoch: 33980, Training Loss: 0.00952
Epoch: 33981, Training Loss: 0.00745
Epoch: 33981, Training Loss: 0.00775
Epoch: 33981, Training Loss: 0.00776
Epoch: 33981, Training Loss: 0.00952
Epoch: 33982, Training Loss: 0.00745
Epoch: 33982, Training Loss: 0.00775
Epoch: 33982, Training Loss: 0.00776
Epoch: 33982, Training Loss: 0.00952
Epoch: 33983, Training Loss: 0.00745
Epoch: 33983, Training Loss: 0.00775
Epoch: 33983, Training Loss: 0.00776
Epoch: 33983, Training Loss: 0.00952
Epoch: 33984, Training Loss: 0.00745
Epoch: 33984, Training Loss: 0.00775
Epoch: 33984, Training Loss: 0.00776
Epoch: 33984, Training Loss: 0.00952
Epoch: 33985, Training Loss: 0.00745
Epoch: 33985, Training Loss: 0.00775
Epoch: 33985, Training Loss: 0.00776
Epoch: 33985, Training Loss: 0.00952
Epoch: 33986, Training Loss: 0.00745
Epoch: 33986, Training Loss: 0.00775
Epoch: 33986, Training Loss: 0.00776
Epoch: 33986, Training Loss: 0.00952
Epoch: 33987, Training Loss: 0.00745
Epoch: 33987, Training Loss: 0.00775
Epoch: 33987, Training Loss: 0.00776
Epoch: 33987, Training Loss: 0.00952
Epoch: 33988, Training Loss: 0.00745
Epoch: 33988, Training Loss: 0.00775
Epoch: 33988, Training Loss: 0.00776
Epoch: 33988, Training Loss: 0.00952
Epoch: 33989, Training Loss: 0.00745
Epoch: 33989, Training Loss: 0.00775
Epoch: 33989, Training Loss: 0.00776
Epoch: 33989, Training Loss: 0.00952
Epoch: 33990, Training Loss: 0.00745
Epoch: 33990, Training Loss: 0.00775
Epoch: 33990, Training Loss: 0.00776
Epoch: 33990, Training Loss: 0.00952
Epoch: 33991, Training Loss: 0.00745
Epoch: 33991, Training Loss: 0.00775
Epoch: 33991, Training Loss: 0.00776
Epoch: 33991, Training Loss: 0.00952
Epoch: 33992, Training Loss: 0.00745
Epoch: 33992, Training Loss: 0.00775
Epoch: 33992, Training Loss: 0.00776
Epoch: 33992, Training Loss: 0.00952
Epoch: 33993, Training Loss: 0.00745
Epoch: 33993, Training Loss: 0.00775
Epoch: 33993, Training Loss: 0.00776
Epoch: 33993, Training Loss: 0.00952
Epoch: 33994, Training Loss: 0.00745
Epoch: 33994, Training Loss: 0.00775
Epoch: 33994, Training Loss: 0.00776
Epoch: 33994, Training Loss: 0.00952
Epoch: 33995, Training Loss: 0.00745
Epoch: 33995, Training Loss: 0.00775
Epoch: 33995, Training Loss: 0.00776
Epoch: 33995, Training Loss: 0.00952
Epoch: 33996, Training Loss: 0.00745
Epoch: 33996, Training Loss: 0.00775
Epoch: 33996, Training Loss: 0.00776
Epoch: 33996, Training Loss: 0.00952
Epoch: 33997, Training Loss: 0.00745
Epoch: 33997, Training Loss: 0.00775
Epoch: 33997, Training Loss: 0.00776
Epoch: 33997, Training Loss: 0.00952
Epoch: 33998, Training Loss: 0.00745
Epoch: 33998, Training Loss: 0.00775
Epoch: 33998, Training Loss: 0.00776
Epoch: 33998, Training Loss: 0.00952
Epoch: 33999, Training Loss: 0.00745
Epoch: 33999, Training Loss: 0.00775
Epoch: 33999, Training Loss: 0.00776
Epoch: 33999, Training Loss: 0.00952
[... verbose per-sample loss log truncated: four losses are printed every epoch, one per XOR training pattern; over epochs 33999-34242 they decrease very slowly, from roughly 0.00745 / 0.00775 / 0.00776 / 0.00952 to roughly 0.00742 / 0.00772 / 0.00773 / 0.00948 ...]
Epoch: 34242, Training Loss: 0.00742
Epoch: 34242, Training Loss: 0.00772
Epoch: 34242, Training Loss: 0.00773
Epoch: 34242, Training Loss: 0.00949
Epoch: 34243, Training Loss: 0.00742
Epoch: 34243, Training Loss: 0.00772
Epoch: 34243, Training Loss: 0.00773
Epoch: 34243, Training Loss: 0.00948
Epoch: 34244, Training Loss: 0.00742
Epoch: 34244, Training Loss: 0.00772
Epoch: 34244, Training Loss: 0.00773
Epoch: 34244, Training Loss: 0.00948
Epoch: 34245, Training Loss: 0.00742
Epoch: 34245, Training Loss: 0.00772
Epoch: 34245, Training Loss: 0.00773
Epoch: 34245, Training Loss: 0.00948
Epoch: 34246, Training Loss: 0.00742
Epoch: 34246, Training Loss: 0.00772
Epoch: 34246, Training Loss: 0.00773
Epoch: 34246, Training Loss: 0.00948
Epoch: 34247, Training Loss: 0.00742
Epoch: 34247, Training Loss: 0.00772
Epoch: 34247, Training Loss: 0.00773
Epoch: 34247, Training Loss: 0.00948
Epoch: 34248, Training Loss: 0.00742
Epoch: 34248, Training Loss: 0.00772
Epoch: 34248, Training Loss: 0.00773
Epoch: 34248, Training Loss: 0.00948
Epoch: 34249, Training Loss: 0.00742
Epoch: 34249, Training Loss: 0.00772
Epoch: 34249, Training Loss: 0.00772
Epoch: 34249, Training Loss: 0.00948
Epoch: 34250, Training Loss: 0.00742
Epoch: 34250, Training Loss: 0.00772
Epoch: 34250, Training Loss: 0.00772
Epoch: 34250, Training Loss: 0.00948
Epoch: 34251, Training Loss: 0.00742
Epoch: 34251, Training Loss: 0.00772
Epoch: 34251, Training Loss: 0.00772
Epoch: 34251, Training Loss: 0.00948
Epoch: 34252, Training Loss: 0.00742
Epoch: 34252, Training Loss: 0.00772
Epoch: 34252, Training Loss: 0.00772
Epoch: 34252, Training Loss: 0.00948
Epoch: 34253, Training Loss: 0.00742
Epoch: 34253, Training Loss: 0.00772
Epoch: 34253, Training Loss: 0.00772
Epoch: 34253, Training Loss: 0.00948
Epoch: 34254, Training Loss: 0.00742
Epoch: 34254, Training Loss: 0.00772
Epoch: 34254, Training Loss: 0.00772
Epoch: 34254, Training Loss: 0.00948
Epoch: 34255, Training Loss: 0.00742
Epoch: 34255, Training Loss: 0.00772
Epoch: 34255, Training Loss: 0.00772
Epoch: 34255, Training Loss: 0.00948
Epoch: 34256, Training Loss: 0.00741
Epoch: 34256, Training Loss: 0.00772
Epoch: 34256, Training Loss: 0.00772
Epoch: 34256, Training Loss: 0.00948
Epoch: 34257, Training Loss: 0.00741
Epoch: 34257, Training Loss: 0.00771
Epoch: 34257, Training Loss: 0.00772
Epoch: 34257, Training Loss: 0.00948
Epoch: 34258, Training Loss: 0.00741
Epoch: 34258, Training Loss: 0.00771
Epoch: 34258, Training Loss: 0.00772
Epoch: 34258, Training Loss: 0.00948
Epoch: 34259, Training Loss: 0.00741
Epoch: 34259, Training Loss: 0.00771
Epoch: 34259, Training Loss: 0.00772
Epoch: 34259, Training Loss: 0.00948
Epoch: 34260, Training Loss: 0.00741
Epoch: 34260, Training Loss: 0.00771
Epoch: 34260, Training Loss: 0.00772
Epoch: 34260, Training Loss: 0.00948
Epoch: 34261, Training Loss: 0.00741
Epoch: 34261, Training Loss: 0.00771
Epoch: 34261, Training Loss: 0.00772
Epoch: 34261, Training Loss: 0.00948
Epoch: 34262, Training Loss: 0.00741
Epoch: 34262, Training Loss: 0.00771
Epoch: 34262, Training Loss: 0.00772
Epoch: 34262, Training Loss: 0.00948
Epoch: 34263, Training Loss: 0.00741
Epoch: 34263, Training Loss: 0.00771
Epoch: 34263, Training Loss: 0.00772
Epoch: 34263, Training Loss: 0.00948
Epoch: 34264, Training Loss: 0.00741
Epoch: 34264, Training Loss: 0.00771
Epoch: 34264, Training Loss: 0.00772
Epoch: 34264, Training Loss: 0.00948
Epoch: 34265, Training Loss: 0.00741
Epoch: 34265, Training Loss: 0.00771
Epoch: 34265, Training Loss: 0.00772
Epoch: 34265, Training Loss: 0.00948
Epoch: 34266, Training Loss: 0.00741
Epoch: 34266, Training Loss: 0.00771
Epoch: 34266, Training Loss: 0.00772
Epoch: 34266, Training Loss: 0.00948
Epoch: 34267, Training Loss: 0.00741
Epoch: 34267, Training Loss: 0.00771
Epoch: 34267, Training Loss: 0.00772
Epoch: 34267, Training Loss: 0.00948
Epoch: 34268, Training Loss: 0.00741
Epoch: 34268, Training Loss: 0.00771
Epoch: 34268, Training Loss: 0.00772
Epoch: 34268, Training Loss: 0.00948
Epoch: 34269, Training Loss: 0.00741
Epoch: 34269, Training Loss: 0.00771
Epoch: 34269, Training Loss: 0.00772
Epoch: 34269, Training Loss: 0.00948
Epoch: 34270, Training Loss: 0.00741
Epoch: 34270, Training Loss: 0.00771
Epoch: 34270, Training Loss: 0.00772
Epoch: 34270, Training Loss: 0.00948
Epoch: 34271, Training Loss: 0.00741
Epoch: 34271, Training Loss: 0.00771
Epoch: 34271, Training Loss: 0.00772
Epoch: 34271, Training Loss: 0.00948
Epoch: 34272, Training Loss: 0.00741
Epoch: 34272, Training Loss: 0.00771
Epoch: 34272, Training Loss: 0.00772
Epoch: 34272, Training Loss: 0.00948
Epoch: 34273, Training Loss: 0.00741
Epoch: 34273, Training Loss: 0.00771
Epoch: 34273, Training Loss: 0.00772
Epoch: 34273, Training Loss: 0.00948
Epoch: 34274, Training Loss: 0.00741
Epoch: 34274, Training Loss: 0.00771
Epoch: 34274, Training Loss: 0.00772
Epoch: 34274, Training Loss: 0.00948
Epoch: 34275, Training Loss: 0.00741
Epoch: 34275, Training Loss: 0.00771
Epoch: 34275, Training Loss: 0.00772
Epoch: 34275, Training Loss: 0.00948
Epoch: 34276, Training Loss: 0.00741
Epoch: 34276, Training Loss: 0.00771
Epoch: 34276, Training Loss: 0.00772
Epoch: 34276, Training Loss: 0.00948
Epoch: 34277, Training Loss: 0.00741
Epoch: 34277, Training Loss: 0.00771
Epoch: 34277, Training Loss: 0.00772
Epoch: 34277, Training Loss: 0.00948
Epoch: 34278, Training Loss: 0.00741
Epoch: 34278, Training Loss: 0.00771
Epoch: 34278, Training Loss: 0.00772
Epoch: 34278, Training Loss: 0.00948
Epoch: 34279, Training Loss: 0.00741
Epoch: 34279, Training Loss: 0.00771
Epoch: 34279, Training Loss: 0.00772
Epoch: 34279, Training Loss: 0.00948
Epoch: 34280, Training Loss: 0.00741
Epoch: 34280, Training Loss: 0.00771
Epoch: 34280, Training Loss: 0.00772
Epoch: 34280, Training Loss: 0.00948
Epoch: 34281, Training Loss: 0.00741
Epoch: 34281, Training Loss: 0.00771
Epoch: 34281, Training Loss: 0.00772
Epoch: 34281, Training Loss: 0.00948
Epoch: 34282, Training Loss: 0.00741
Epoch: 34282, Training Loss: 0.00771
Epoch: 34282, Training Loss: 0.00772
Epoch: 34282, Training Loss: 0.00948
Epoch: 34283, Training Loss: 0.00741
Epoch: 34283, Training Loss: 0.00771
Epoch: 34283, Training Loss: 0.00772
Epoch: 34283, Training Loss: 0.00948
Epoch: 34284, Training Loss: 0.00741
Epoch: 34284, Training Loss: 0.00771
Epoch: 34284, Training Loss: 0.00772
Epoch: 34284, Training Loss: 0.00948
Epoch: 34285, Training Loss: 0.00741
Epoch: 34285, Training Loss: 0.00771
Epoch: 34285, Training Loss: 0.00772
Epoch: 34285, Training Loss: 0.00948
Epoch: 34286, Training Loss: 0.00741
Epoch: 34286, Training Loss: 0.00771
Epoch: 34286, Training Loss: 0.00772
Epoch: 34286, Training Loss: 0.00948
Epoch: 34287, Training Loss: 0.00741
Epoch: 34287, Training Loss: 0.00771
Epoch: 34287, Training Loss: 0.00772
Epoch: 34287, Training Loss: 0.00948
Epoch: 34288, Training Loss: 0.00741
Epoch: 34288, Training Loss: 0.00771
Epoch: 34288, Training Loss: 0.00772
Epoch: 34288, Training Loss: 0.00948
Epoch: 34289, Training Loss: 0.00741
Epoch: 34289, Training Loss: 0.00771
Epoch: 34289, Training Loss: 0.00772
Epoch: 34289, Training Loss: 0.00948
Epoch: 34290, Training Loss: 0.00741
Epoch: 34290, Training Loss: 0.00771
Epoch: 34290, Training Loss: 0.00772
Epoch: 34290, Training Loss: 0.00948
Epoch: 34291, Training Loss: 0.00741
Epoch: 34291, Training Loss: 0.00771
Epoch: 34291, Training Loss: 0.00772
Epoch: 34291, Training Loss: 0.00948
Epoch: 34292, Training Loss: 0.00741
Epoch: 34292, Training Loss: 0.00771
Epoch: 34292, Training Loss: 0.00772
Epoch: 34292, Training Loss: 0.00947
Epoch: 34293, Training Loss: 0.00741
Epoch: 34293, Training Loss: 0.00771
Epoch: 34293, Training Loss: 0.00772
Epoch: 34293, Training Loss: 0.00947
Epoch: 34294, Training Loss: 0.00741
Epoch: 34294, Training Loss: 0.00771
Epoch: 34294, Training Loss: 0.00772
Epoch: 34294, Training Loss: 0.00947
Epoch: 34295, Training Loss: 0.00741
Epoch: 34295, Training Loss: 0.00771
Epoch: 34295, Training Loss: 0.00772
Epoch: 34295, Training Loss: 0.00947
Epoch: 34296, Training Loss: 0.00741
Epoch: 34296, Training Loss: 0.00771
Epoch: 34296, Training Loss: 0.00772
Epoch: 34296, Training Loss: 0.00947
Epoch: 34297, Training Loss: 0.00741
Epoch: 34297, Training Loss: 0.00771
Epoch: 34297, Training Loss: 0.00772
Epoch: 34297, Training Loss: 0.00947
Epoch: 34298, Training Loss: 0.00741
Epoch: 34298, Training Loss: 0.00771
Epoch: 34298, Training Loss: 0.00772
Epoch: 34298, Training Loss: 0.00947
Epoch: 34299, Training Loss: 0.00741
Epoch: 34299, Training Loss: 0.00771
Epoch: 34299, Training Loss: 0.00772
Epoch: 34299, Training Loss: 0.00947
Epoch: 34300, Training Loss: 0.00741
Epoch: 34300, Training Loss: 0.00771
Epoch: 34300, Training Loss: 0.00772
Epoch: 34300, Training Loss: 0.00947
Epoch: 34301, Training Loss: 0.00741
Epoch: 34301, Training Loss: 0.00771
Epoch: 34301, Training Loss: 0.00772
Epoch: 34301, Training Loss: 0.00947
Epoch: 34302, Training Loss: 0.00741
Epoch: 34302, Training Loss: 0.00771
Epoch: 34302, Training Loss: 0.00772
Epoch: 34302, Training Loss: 0.00947
Epoch: 34303, Training Loss: 0.00741
Epoch: 34303, Training Loss: 0.00771
Epoch: 34303, Training Loss: 0.00772
Epoch: 34303, Training Loss: 0.00947
Epoch: 34304, Training Loss: 0.00741
Epoch: 34304, Training Loss: 0.00771
Epoch: 34304, Training Loss: 0.00772
Epoch: 34304, Training Loss: 0.00947
Epoch: 34305, Training Loss: 0.00741
Epoch: 34305, Training Loss: 0.00771
Epoch: 34305, Training Loss: 0.00772
Epoch: 34305, Training Loss: 0.00947
Epoch: 34306, Training Loss: 0.00741
Epoch: 34306, Training Loss: 0.00771
Epoch: 34306, Training Loss: 0.00772
Epoch: 34306, Training Loss: 0.00947
Epoch: 34307, Training Loss: 0.00741
Epoch: 34307, Training Loss: 0.00771
Epoch: 34307, Training Loss: 0.00772
Epoch: 34307, Training Loss: 0.00947
Epoch: 34308, Training Loss: 0.00741
Epoch: 34308, Training Loss: 0.00771
Epoch: 34308, Training Loss: 0.00772
Epoch: 34308, Training Loss: 0.00947
Epoch: 34309, Training Loss: 0.00741
Epoch: 34309, Training Loss: 0.00771
Epoch: 34309, Training Loss: 0.00772
Epoch: 34309, Training Loss: 0.00947
Epoch: 34310, Training Loss: 0.00741
Epoch: 34310, Training Loss: 0.00771
Epoch: 34310, Training Loss: 0.00772
Epoch: 34310, Training Loss: 0.00947
Epoch: 34311, Training Loss: 0.00741
Epoch: 34311, Training Loss: 0.00771
Epoch: 34311, Training Loss: 0.00772
Epoch: 34311, Training Loss: 0.00947
Epoch: 34312, Training Loss: 0.00741
Epoch: 34312, Training Loss: 0.00771
Epoch: 34312, Training Loss: 0.00772
Epoch: 34312, Training Loss: 0.00947
Epoch: 34313, Training Loss: 0.00741
Epoch: 34313, Training Loss: 0.00771
Epoch: 34313, Training Loss: 0.00772
Epoch: 34313, Training Loss: 0.00947
Epoch: 34314, Training Loss: 0.00741
Epoch: 34314, Training Loss: 0.00771
Epoch: 34314, Training Loss: 0.00772
Epoch: 34314, Training Loss: 0.00947
Epoch: 34315, Training Loss: 0.00741
Epoch: 34315, Training Loss: 0.00771
Epoch: 34315, Training Loss: 0.00772
Epoch: 34315, Training Loss: 0.00947
Epoch: 34316, Training Loss: 0.00741
Epoch: 34316, Training Loss: 0.00771
Epoch: 34316, Training Loss: 0.00772
Epoch: 34316, Training Loss: 0.00947
Epoch: 34317, Training Loss: 0.00741
Epoch: 34317, Training Loss: 0.00771
Epoch: 34317, Training Loss: 0.00772
Epoch: 34317, Training Loss: 0.00947
Epoch: 34318, Training Loss: 0.00741
Epoch: 34318, Training Loss: 0.00771
Epoch: 34318, Training Loss: 0.00772
Epoch: 34318, Training Loss: 0.00947
Epoch: 34319, Training Loss: 0.00741
Epoch: 34319, Training Loss: 0.00771
Epoch: 34319, Training Loss: 0.00772
Epoch: 34319, Training Loss: 0.00947
Epoch: 34320, Training Loss: 0.00741
Epoch: 34320, Training Loss: 0.00771
Epoch: 34320, Training Loss: 0.00772
Epoch: 34320, Training Loss: 0.00947
Epoch: 34321, Training Loss: 0.00741
Epoch: 34321, Training Loss: 0.00771
Epoch: 34321, Training Loss: 0.00772
Epoch: 34321, Training Loss: 0.00947
Epoch: 34322, Training Loss: 0.00741
Epoch: 34322, Training Loss: 0.00771
Epoch: 34322, Training Loss: 0.00772
Epoch: 34322, Training Loss: 0.00947
Epoch: 34323, Training Loss: 0.00741
Epoch: 34323, Training Loss: 0.00771
Epoch: 34323, Training Loss: 0.00772
Epoch: 34323, Training Loss: 0.00947
Epoch: 34324, Training Loss: 0.00741
Epoch: 34324, Training Loss: 0.00771
Epoch: 34324, Training Loss: 0.00772
Epoch: 34324, Training Loss: 0.00947
Epoch: 34325, Training Loss: 0.00741
Epoch: 34325, Training Loss: 0.00771
Epoch: 34325, Training Loss: 0.00772
Epoch: 34325, Training Loss: 0.00947
Epoch: 34326, Training Loss: 0.00741
Epoch: 34326, Training Loss: 0.00771
Epoch: 34326, Training Loss: 0.00772
Epoch: 34326, Training Loss: 0.00947
Epoch: 34327, Training Loss: 0.00741
Epoch: 34327, Training Loss: 0.00771
Epoch: 34327, Training Loss: 0.00772
Epoch: 34327, Training Loss: 0.00947
Epoch: 34328, Training Loss: 0.00741
Epoch: 34328, Training Loss: 0.00771
Epoch: 34328, Training Loss: 0.00772
Epoch: 34328, Training Loss: 0.00947
Epoch: 34329, Training Loss: 0.00741
Epoch: 34329, Training Loss: 0.00771
Epoch: 34329, Training Loss: 0.00771
Epoch: 34329, Training Loss: 0.00947
Epoch: 34330, Training Loss: 0.00741
Epoch: 34330, Training Loss: 0.00771
Epoch: 34330, Training Loss: 0.00771
Epoch: 34330, Training Loss: 0.00947
Epoch: 34331, Training Loss: 0.00741
Epoch: 34331, Training Loss: 0.00771
Epoch: 34331, Training Loss: 0.00771
Epoch: 34331, Training Loss: 0.00947
Epoch: 34332, Training Loss: 0.00741
Epoch: 34332, Training Loss: 0.00771
Epoch: 34332, Training Loss: 0.00771
Epoch: 34332, Training Loss: 0.00947
Epoch: 34333, Training Loss: 0.00741
Epoch: 34333, Training Loss: 0.00771
Epoch: 34333, Training Loss: 0.00771
Epoch: 34333, Training Loss: 0.00947
Epoch: 34334, Training Loss: 0.00741
Epoch: 34334, Training Loss: 0.00771
Epoch: 34334, Training Loss: 0.00771
Epoch: 34334, Training Loss: 0.00947
Epoch: 34335, Training Loss: 0.00741
Epoch: 34335, Training Loss: 0.00771
Epoch: 34335, Training Loss: 0.00771
Epoch: 34335, Training Loss: 0.00947
Epoch: 34336, Training Loss: 0.00741
Epoch: 34336, Training Loss: 0.00771
Epoch: 34336, Training Loss: 0.00771
Epoch: 34336, Training Loss: 0.00947
Epoch: 34337, Training Loss: 0.00741
Epoch: 34337, Training Loss: 0.00770
Epoch: 34337, Training Loss: 0.00771
Epoch: 34337, Training Loss: 0.00947
Epoch: 34338, Training Loss: 0.00741
Epoch: 34338, Training Loss: 0.00770
Epoch: 34338, Training Loss: 0.00771
Epoch: 34338, Training Loss: 0.00947
Epoch: 34339, Training Loss: 0.00741
Epoch: 34339, Training Loss: 0.00770
Epoch: 34339, Training Loss: 0.00771
Epoch: 34339, Training Loss: 0.00947
Epoch: 34340, Training Loss: 0.00741
Epoch: 34340, Training Loss: 0.00770
Epoch: 34340, Training Loss: 0.00771
Epoch: 34340, Training Loss: 0.00947
Epoch: 34341, Training Loss: 0.00740
Epoch: 34341, Training Loss: 0.00770
Epoch: 34341, Training Loss: 0.00771
Epoch: 34341, Training Loss: 0.00947
Epoch: 34342, Training Loss: 0.00740
Epoch: 34342, Training Loss: 0.00770
Epoch: 34342, Training Loss: 0.00771
Epoch: 34342, Training Loss: 0.00947
Epoch: 34343, Training Loss: 0.00740
Epoch: 34343, Training Loss: 0.00770
Epoch: 34343, Training Loss: 0.00771
Epoch: 34343, Training Loss: 0.00947
Epoch: 34344, Training Loss: 0.00740
Epoch: 34344, Training Loss: 0.00770
Epoch: 34344, Training Loss: 0.00771
Epoch: 34344, Training Loss: 0.00947
Epoch: 34345, Training Loss: 0.00740
Epoch: 34345, Training Loss: 0.00770
Epoch: 34345, Training Loss: 0.00771
Epoch: 34345, Training Loss: 0.00947
Epoch: 34346, Training Loss: 0.00740
Epoch: 34346, Training Loss: 0.00770
Epoch: 34346, Training Loss: 0.00771
Epoch: 34346, Training Loss: 0.00947
Epoch: 34347, Training Loss: 0.00740
Epoch: 34347, Training Loss: 0.00770
Epoch: 34347, Training Loss: 0.00771
Epoch: 34347, Training Loss: 0.00947
Epoch: 34348, Training Loss: 0.00740
Epoch: 34348, Training Loss: 0.00770
Epoch: 34348, Training Loss: 0.00771
Epoch: 34348, Training Loss: 0.00947
Epoch: 34349, Training Loss: 0.00740
Epoch: 34349, Training Loss: 0.00770
Epoch: 34349, Training Loss: 0.00771
Epoch: 34349, Training Loss: 0.00947
Epoch: 34350, Training Loss: 0.00740
Epoch: 34350, Training Loss: 0.00770
Epoch: 34350, Training Loss: 0.00771
Epoch: 34350, Training Loss: 0.00947
Epoch: 34351, Training Loss: 0.00740
Epoch: 34351, Training Loss: 0.00770
Epoch: 34351, Training Loss: 0.00771
Epoch: 34351, Training Loss: 0.00947
Epoch: 34352, Training Loss: 0.00740
Epoch: 34352, Training Loss: 0.00770
Epoch: 34352, Training Loss: 0.00771
Epoch: 34352, Training Loss: 0.00947
Epoch: 34353, Training Loss: 0.00740
Epoch: 34353, Training Loss: 0.00770
Epoch: 34353, Training Loss: 0.00771
Epoch: 34353, Training Loss: 0.00947
Epoch: 34354, Training Loss: 0.00740
Epoch: 34354, Training Loss: 0.00770
Epoch: 34354, Training Loss: 0.00771
Epoch: 34354, Training Loss: 0.00947
Epoch: 34355, Training Loss: 0.00740
Epoch: 34355, Training Loss: 0.00770
Epoch: 34355, Training Loss: 0.00771
Epoch: 34355, Training Loss: 0.00947
Epoch: 34356, Training Loss: 0.00740
Epoch: 34356, Training Loss: 0.00770
Epoch: 34356, Training Loss: 0.00771
Epoch: 34356, Training Loss: 0.00946
Epoch: 34357, Training Loss: 0.00740
Epoch: 34357, Training Loss: 0.00770
Epoch: 34357, Training Loss: 0.00771
Epoch: 34357, Training Loss: 0.00946
Epoch: 34358, Training Loss: 0.00740
Epoch: 34358, Training Loss: 0.00770
Epoch: 34358, Training Loss: 0.00771
Epoch: 34358, Training Loss: 0.00946
Epoch: 34359, Training Loss: 0.00740
Epoch: 34359, Training Loss: 0.00770
Epoch: 34359, Training Loss: 0.00771
Epoch: 34359, Training Loss: 0.00946
Epoch: 34360, Training Loss: 0.00740
Epoch: 34360, Training Loss: 0.00770
Epoch: 34360, Training Loss: 0.00771
Epoch: 34360, Training Loss: 0.00946
Epoch: 34361, Training Loss: 0.00740
Epoch: 34361, Training Loss: 0.00770
Epoch: 34361, Training Loss: 0.00771
Epoch: 34361, Training Loss: 0.00946
Epoch: 34362, Training Loss: 0.00740
Epoch: 34362, Training Loss: 0.00770
Epoch: 34362, Training Loss: 0.00771
Epoch: 34362, Training Loss: 0.00946
Epoch: 34363, Training Loss: 0.00740
Epoch: 34363, Training Loss: 0.00770
Epoch: 34363, Training Loss: 0.00771
Epoch: 34363, Training Loss: 0.00946
Epoch: 34364, Training Loss: 0.00740
Epoch: 34364, Training Loss: 0.00770
Epoch: 34364, Training Loss: 0.00771
Epoch: 34364, Training Loss: 0.00946
Epoch: 34365, Training Loss: 0.00740
Epoch: 34365, Training Loss: 0.00770
Epoch: 34365, Training Loss: 0.00771
Epoch: 34365, Training Loss: 0.00946
Epoch: 34366, Training Loss: 0.00740
Epoch: 34366, Training Loss: 0.00770
Epoch: 34366, Training Loss: 0.00771
Epoch: 34366, Training Loss: 0.00946
Epoch: 34367, Training Loss: 0.00740
Epoch: 34367, Training Loss: 0.00770
Epoch: 34367, Training Loss: 0.00771
Epoch: 34367, Training Loss: 0.00946
Epoch: 34368, Training Loss: 0.00740
Epoch: 34368, Training Loss: 0.00770
Epoch: 34368, Training Loss: 0.00771
Epoch: 34368, Training Loss: 0.00946
Epoch: 34369, Training Loss: 0.00740
Epoch: 34369, Training Loss: 0.00770
Epoch: 34369, Training Loss: 0.00771
Epoch: 34369, Training Loss: 0.00946
Epoch: 34370, Training Loss: 0.00740
Epoch: 34370, Training Loss: 0.00770
Epoch: 34370, Training Loss: 0.00771
Epoch: 34370, Training Loss: 0.00946
Epoch: 34371, Training Loss: 0.00740
Epoch: 34371, Training Loss: 0.00770
Epoch: 34371, Training Loss: 0.00771
Epoch: 34371, Training Loss: 0.00946
Epoch: 34372, Training Loss: 0.00740
Epoch: 34372, Training Loss: 0.00770
Epoch: 34372, Training Loss: 0.00771
Epoch: 34372, Training Loss: 0.00946
Epoch: 34373, Training Loss: 0.00740
Epoch: 34373, Training Loss: 0.00770
Epoch: 34373, Training Loss: 0.00771
Epoch: 34373, Training Loss: 0.00946
Epoch: 34374, Training Loss: 0.00740
Epoch: 34374, Training Loss: 0.00770
Epoch: 34374, Training Loss: 0.00771
Epoch: 34374, Training Loss: 0.00946
Epoch: 34375, Training Loss: 0.00740
Epoch: 34375, Training Loss: 0.00770
Epoch: 34375, Training Loss: 0.00771
Epoch: 34375, Training Loss: 0.00946
Epoch: 34376, Training Loss: 0.00740
Epoch: 34376, Training Loss: 0.00770
Epoch: 34376, Training Loss: 0.00771
Epoch: 34376, Training Loss: 0.00946
Epoch: 34377, Training Loss: 0.00740
Epoch: 34377, Training Loss: 0.00770
Epoch: 34377, Training Loss: 0.00771
Epoch: 34377, Training Loss: 0.00946
Epoch: 34378, Training Loss: 0.00740
Epoch: 34378, Training Loss: 0.00770
Epoch: 34378, Training Loss: 0.00771
Epoch: 34378, Training Loss: 0.00946
Epoch: 34379, Training Loss: 0.00740
Epoch: 34379, Training Loss: 0.00770
Epoch: 34379, Training Loss: 0.00771
Epoch: 34379, Training Loss: 0.00946
Epoch: 34380, Training Loss: 0.00740
Epoch: 34380, Training Loss: 0.00770
Epoch: 34380, Training Loss: 0.00771
Epoch: 34380, Training Loss: 0.00946
Epoch: 34381, Training Loss: 0.00740
Epoch: 34381, Training Loss: 0.00770
Epoch: 34381, Training Loss: 0.00771
Epoch: 34381, Training Loss: 0.00946
Epoch: 34382, Training Loss: 0.00740
Epoch: 34382, Training Loss: 0.00770
Epoch: 34382, Training Loss: 0.00771
Epoch: 34382, Training Loss: 0.00946
Epoch: 34383, Training Loss: 0.00740
Epoch: 34383, Training Loss: 0.00770
Epoch: 34383, Training Loss: 0.00771
Epoch: 34383, Training Loss: 0.00946
Epoch: 34384, Training Loss: 0.00740
Epoch: 34384, Training Loss: 0.00770
Epoch: 34384, Training Loss: 0.00771
Epoch: 34384, Training Loss: 0.00946
Epoch: 34385, Training Loss: 0.00740
Epoch: 34385, Training Loss: 0.00770
Epoch: 34385, Training Loss: 0.00771
Epoch: 34385, Training Loss: 0.00946
Epoch: 34386, Training Loss: 0.00740
Epoch: 34386, Training Loss: 0.00770
Epoch: 34386, Training Loss: 0.00771
Epoch: 34386, Training Loss: 0.00946
Epoch: 34387, Training Loss: 0.00740
Epoch: 34387, Training Loss: 0.00770
Epoch: 34387, Training Loss: 0.00771
Epoch: 34387, Training Loss: 0.00946
Epoch: 34388, Training Loss: 0.00740
Epoch: 34388, Training Loss: 0.00770
Epoch: 34388, Training Loss: 0.00771
Epoch: 34388, Training Loss: 0.00946
Epoch: 34389, Training Loss: 0.00740
Epoch: 34389, Training Loss: 0.00770
Epoch: 34389, Training Loss: 0.00771
Epoch: 34389, Training Loss: 0.00946
Epoch: 34390, Training Loss: 0.00740
Epoch: 34390, Training Loss: 0.00770
Epoch: 34390, Training Loss: 0.00771
Epoch: 34390, Training Loss: 0.00946
Epoch: 34391, Training Loss: 0.00740
Epoch: 34391, Training Loss: 0.00770
Epoch: 34391, Training Loss: 0.00771
Epoch: 34391, Training Loss: 0.00946
Epoch: 34392, Training Loss: 0.00740
Epoch: 34392, Training Loss: 0.00770
Epoch: 34392, Training Loss: 0.00771
Epoch: 34392, Training Loss: 0.00946
Epoch: 34393, Training Loss: 0.00740
Epoch: 34393, Training Loss: 0.00770
Epoch: 34393, Training Loss: 0.00771
Epoch: 34393, Training Loss: 0.00946
Epoch: 34394, Training Loss: 0.00740
Epoch: 34394, Training Loss: 0.00770
Epoch: 34394, Training Loss: 0.00771
Epoch: 34394, Training Loss: 0.00946
Epoch: 34395, Training Loss: 0.00740
Epoch: 34395, Training Loss: 0.00770
Epoch: 34395, Training Loss: 0.00771
Epoch: 34395, Training Loss: 0.00946
Epoch: 34396, Training Loss: 0.00740
Epoch: 34396, Training Loss: 0.00770
Epoch: 34396, Training Loss: 0.00771
Epoch: 34396, Training Loss: 0.00946
Epoch: 34397, Training Loss: 0.00740
Epoch: 34397, Training Loss: 0.00770
Epoch: 34397, Training Loss: 0.00771
Epoch: 34397, Training Loss: 0.00946
Epoch: 34398, Training Loss: 0.00740
Epoch: 34398, Training Loss: 0.00770
Epoch: 34398, Training Loss: 0.00771
Epoch: 34398, Training Loss: 0.00946
Epoch: 34399, Training Loss: 0.00740
Epoch: 34399, Training Loss: 0.00770
Epoch: 34399, Training Loss: 0.00771
Epoch: 34399, Training Loss: 0.00946
Epoch: 34400, Training Loss: 0.00740
Epoch: 34400, Training Loss: 0.00770
Epoch: 34400, Training Loss: 0.00771
Epoch: 34400, Training Loss: 0.00946
Epoch: 34401, Training Loss: 0.00740
Epoch: 34401, Training Loss: 0.00770
Epoch: 34401, Training Loss: 0.00771
Epoch: 34401, Training Loss: 0.00946
Epoch: 34402, Training Loss: 0.00740
Epoch: 34402, Training Loss: 0.00770
Epoch: 34402, Training Loss: 0.00771
Epoch: 34402, Training Loss: 0.00946
Epoch: 34403, Training Loss: 0.00740
Epoch: 34403, Training Loss: 0.00770
Epoch: 34403, Training Loss: 0.00771
Epoch: 34403, Training Loss: 0.00946
Epoch: 34404, Training Loss: 0.00740
Epoch: 34404, Training Loss: 0.00770
Epoch: 34404, Training Loss: 0.00771
Epoch: 34404, Training Loss: 0.00946
Epoch: 34405, Training Loss: 0.00740
Epoch: 34405, Training Loss: 0.00770
Epoch: 34405, Training Loss: 0.00771
Epoch: 34405, Training Loss: 0.00946
Epoch: 34406, Training Loss: 0.00740
Epoch: 34406, Training Loss: 0.00770
Epoch: 34406, Training Loss: 0.00771
Epoch: 34406, Training Loss: 0.00946
Epoch: 34407, Training Loss: 0.00740
Epoch: 34407, Training Loss: 0.00770
Epoch: 34407, Training Loss: 0.00771
Epoch: 34407, Training Loss: 0.00946
Epoch: 34408, Training Loss: 0.00740
Epoch: 34408, Training Loss: 0.00770
Epoch: 34408, Training Loss: 0.00771
Epoch: 34408, Training Loss: 0.00946
Epoch: 34409, Training Loss: 0.00740
Epoch: 34409, Training Loss: 0.00770
Epoch: 34409, Training Loss: 0.00770
Epoch: 34409, Training Loss: 0.00946
Epoch: 34410, Training Loss: 0.00740
Epoch: 34410, Training Loss: 0.00770
Epoch: 34410, Training Loss: 0.00770
Epoch: 34410, Training Loss: 0.00946
Epoch: 34411, Training Loss: 0.00740
Epoch: 34411, Training Loss: 0.00770
Epoch: 34411, Training Loss: 0.00770
Epoch: 34411, Training Loss: 0.00946
Epoch: 34412, Training Loss: 0.00740
Epoch: 34412, Training Loss: 0.00770
Epoch: 34412, Training Loss: 0.00770
Epoch: 34412, Training Loss: 0.00946
Epoch: 34413, Training Loss: 0.00740
Epoch: 34413, Training Loss: 0.00770
Epoch: 34413, Training Loss: 0.00770
Epoch: 34413, Training Loss: 0.00946
Epoch: 34414, Training Loss: 0.00740
Epoch: 34414, Training Loss: 0.00770
Epoch: 34414, Training Loss: 0.00770
Epoch: 34414, Training Loss: 0.00946
Epoch: 34415, Training Loss: 0.00740
Epoch: 34415, Training Loss: 0.00770
Epoch: 34415, Training Loss: 0.00770
Epoch: 34415, Training Loss: 0.00946
Epoch: 34416, Training Loss: 0.00740
Epoch: 34416, Training Loss: 0.00770
Epoch: 34416, Training Loss: 0.00770
Epoch: 34416, Training Loss: 0.00946
Epoch: 34417, Training Loss: 0.00740
Epoch: 34417, Training Loss: 0.00770
Epoch: 34417, Training Loss: 0.00770
Epoch: 34417, Training Loss: 0.00946
Epoch: 34418, Training Loss: 0.00740
Epoch: 34418, Training Loss: 0.00769
Epoch: 34418, Training Loss: 0.00770
Epoch: 34418, Training Loss: 0.00946
Epoch: 34419, Training Loss: 0.00740
Epoch: 34419, Training Loss: 0.00769
Epoch: 34419, Training Loss: 0.00770
Epoch: 34419, Training Loss: 0.00946
Epoch: 34420, Training Loss: 0.00740
Epoch: 34420, Training Loss: 0.00769
Epoch: 34420, Training Loss: 0.00770
Epoch: 34420, Training Loss: 0.00946
Epoch: 34421, Training Loss: 0.00740
Epoch: 34421, Training Loss: 0.00769
Epoch: 34421, Training Loss: 0.00770
Epoch: 34421, Training Loss: 0.00945
Epoch: 34422, Training Loss: 0.00740
Epoch: 34422, Training Loss: 0.00769
Epoch: 34422, Training Loss: 0.00770
Epoch: 34422, Training Loss: 0.00945
Epoch: 34423, Training Loss: 0.00740
Epoch: 34423, Training Loss: 0.00769
Epoch: 34423, Training Loss: 0.00770
Epoch: 34423, Training Loss: 0.00945
Epoch: 34424, Training Loss: 0.00740
Epoch: 34424, Training Loss: 0.00769
Epoch: 34424, Training Loss: 0.00770
Epoch: 34424, Training Loss: 0.00945
Epoch: 34425, Training Loss: 0.00740
Epoch: 34425, Training Loss: 0.00769
Epoch: 34425, Training Loss: 0.00770
Epoch: 34425, Training Loss: 0.00945
Epoch: 34426, Training Loss: 0.00740
Epoch: 34426, Training Loss: 0.00769
Epoch: 34426, Training Loss: 0.00770
Epoch: 34426, Training Loss: 0.00945
Epoch: 34427, Training Loss: 0.00739
Epoch: 34427, Training Loss: 0.00769
Epoch: 34427, Training Loss: 0.00770
Epoch: 34427, Training Loss: 0.00945
Epoch: 34428, Training Loss: 0.00739
Epoch: 34428, Training Loss: 0.00769
Epoch: 34428, Training Loss: 0.00770
Epoch: 34428, Training Loss: 0.00945
Epoch: 34429, Training Loss: 0.00739
Epoch: 34429, Training Loss: 0.00769
Epoch: 34429, Training Loss: 0.00770
Epoch: 34429, Training Loss: 0.00945
Epoch: 34430, Training Loss: 0.00739
Epoch: 34430, Training Loss: 0.00769
Epoch: 34430, Training Loss: 0.00770
Epoch: 34430, Training Loss: 0.00945
Epoch: 34431, Training Loss: 0.00739
Epoch: 34431, Training Loss: 0.00769
Epoch: 34431, Training Loss: 0.00770
Epoch: 34431, Training Loss: 0.00945
...
Epoch: 34675, Training Loss: 0.00766
[~975 repeated log lines elided: each epoch prints one loss per XOR training sample; over epochs 34431-34675 the four per-sample losses decrease very slowly (about 0.00945 -> 0.00942, 0.00770 -> 0.00767, 0.00769 -> 0.00766, 0.00739 -> 0.00737), with the worst sample still above the 0.008 target]
Epoch: 34675, Training Loss: 0.00767
Epoch: 34675, Training Loss: 0.00942
Epoch: 34676, Training Loss: 0.00737
Epoch: 34676, Training Loss: 0.00766
Epoch: 34676, Training Loss: 0.00767
Epoch: 34676, Training Loss: 0.00942
Epoch: 34677, Training Loss: 0.00737
Epoch: 34677, Training Loss: 0.00766
Epoch: 34677, Training Loss: 0.00767
Epoch: 34677, Training Loss: 0.00942
Epoch: 34678, Training Loss: 0.00737
Epoch: 34678, Training Loss: 0.00766
Epoch: 34678, Training Loss: 0.00767
Epoch: 34678, Training Loss: 0.00942
Epoch: 34679, Training Loss: 0.00737
Epoch: 34679, Training Loss: 0.00766
Epoch: 34679, Training Loss: 0.00767
Epoch: 34679, Training Loss: 0.00942
Epoch: 34680, Training Loss: 0.00737
Epoch: 34680, Training Loss: 0.00766
Epoch: 34680, Training Loss: 0.00767
Epoch: 34680, Training Loss: 0.00942
Epoch: 34681, Training Loss: 0.00737
Epoch: 34681, Training Loss: 0.00766
Epoch: 34681, Training Loss: 0.00767
Epoch: 34681, Training Loss: 0.00942
Epoch: 34682, Training Loss: 0.00737
Epoch: 34682, Training Loss: 0.00766
Epoch: 34682, Training Loss: 0.00767
Epoch: 34682, Training Loss: 0.00941
Epoch: 34683, Training Loss: 0.00737
Epoch: 34683, Training Loss: 0.00766
Epoch: 34683, Training Loss: 0.00767
Epoch: 34683, Training Loss: 0.00941
Epoch: 34684, Training Loss: 0.00737
Epoch: 34684, Training Loss: 0.00766
Epoch: 34684, Training Loss: 0.00767
Epoch: 34684, Training Loss: 0.00941
Epoch: 34685, Training Loss: 0.00736
Epoch: 34685, Training Loss: 0.00766
Epoch: 34685, Training Loss: 0.00767
Epoch: 34685, Training Loss: 0.00941
Epoch: 34686, Training Loss: 0.00736
Epoch: 34686, Training Loss: 0.00766
Epoch: 34686, Training Loss: 0.00767
Epoch: 34686, Training Loss: 0.00941
Epoch: 34687, Training Loss: 0.00736
Epoch: 34687, Training Loss: 0.00766
Epoch: 34687, Training Loss: 0.00767
Epoch: 34687, Training Loss: 0.00941
Epoch: 34688, Training Loss: 0.00736
Epoch: 34688, Training Loss: 0.00766
Epoch: 34688, Training Loss: 0.00767
Epoch: 34688, Training Loss: 0.00941
Epoch: 34689, Training Loss: 0.00736
Epoch: 34689, Training Loss: 0.00766
Epoch: 34689, Training Loss: 0.00767
Epoch: 34689, Training Loss: 0.00941
Epoch: 34690, Training Loss: 0.00736
Epoch: 34690, Training Loss: 0.00766
Epoch: 34690, Training Loss: 0.00767
Epoch: 34690, Training Loss: 0.00941
Epoch: 34691, Training Loss: 0.00736
Epoch: 34691, Training Loss: 0.00766
Epoch: 34691, Training Loss: 0.00767
Epoch: 34691, Training Loss: 0.00941
Epoch: 34692, Training Loss: 0.00736
Epoch: 34692, Training Loss: 0.00766
Epoch: 34692, Training Loss: 0.00767
Epoch: 34692, Training Loss: 0.00941
Epoch: 34693, Training Loss: 0.00736
Epoch: 34693, Training Loss: 0.00766
Epoch: 34693, Training Loss: 0.00767
Epoch: 34693, Training Loss: 0.00941
Epoch: 34694, Training Loss: 0.00736
Epoch: 34694, Training Loss: 0.00766
Epoch: 34694, Training Loss: 0.00767
Epoch: 34694, Training Loss: 0.00941
Epoch: 34695, Training Loss: 0.00736
Epoch: 34695, Training Loss: 0.00766
Epoch: 34695, Training Loss: 0.00767
Epoch: 34695, Training Loss: 0.00941
Epoch: 34696, Training Loss: 0.00736
Epoch: 34696, Training Loss: 0.00766
Epoch: 34696, Training Loss: 0.00767
Epoch: 34696, Training Loss: 0.00941
Epoch: 34697, Training Loss: 0.00736
Epoch: 34697, Training Loss: 0.00766
Epoch: 34697, Training Loss: 0.00767
Epoch: 34697, Training Loss: 0.00941
Epoch: 34698, Training Loss: 0.00736
Epoch: 34698, Training Loss: 0.00766
Epoch: 34698, Training Loss: 0.00767
Epoch: 34698, Training Loss: 0.00941
Epoch: 34699, Training Loss: 0.00736
Epoch: 34699, Training Loss: 0.00766
Epoch: 34699, Training Loss: 0.00767
Epoch: 34699, Training Loss: 0.00941
Epoch: 34700, Training Loss: 0.00736
Epoch: 34700, Training Loss: 0.00766
Epoch: 34700, Training Loss: 0.00767
Epoch: 34700, Training Loss: 0.00941
Epoch: 34701, Training Loss: 0.00736
Epoch: 34701, Training Loss: 0.00766
Epoch: 34701, Training Loss: 0.00767
Epoch: 34701, Training Loss: 0.00941
Epoch: 34702, Training Loss: 0.00736
Epoch: 34702, Training Loss: 0.00766
Epoch: 34702, Training Loss: 0.00767
Epoch: 34702, Training Loss: 0.00941
Epoch: 34703, Training Loss: 0.00736
Epoch: 34703, Training Loss: 0.00766
Epoch: 34703, Training Loss: 0.00767
Epoch: 34703, Training Loss: 0.00941
Epoch: 34704, Training Loss: 0.00736
Epoch: 34704, Training Loss: 0.00766
Epoch: 34704, Training Loss: 0.00767
Epoch: 34704, Training Loss: 0.00941
Epoch: 34705, Training Loss: 0.00736
Epoch: 34705, Training Loss: 0.00766
Epoch: 34705, Training Loss: 0.00767
Epoch: 34705, Training Loss: 0.00941
Epoch: 34706, Training Loss: 0.00736
Epoch: 34706, Training Loss: 0.00766
Epoch: 34706, Training Loss: 0.00767
Epoch: 34706, Training Loss: 0.00941
Epoch: 34707, Training Loss: 0.00736
Epoch: 34707, Training Loss: 0.00766
Epoch: 34707, Training Loss: 0.00767
Epoch: 34707, Training Loss: 0.00941
Epoch: 34708, Training Loss: 0.00736
Epoch: 34708, Training Loss: 0.00766
Epoch: 34708, Training Loss: 0.00767
Epoch: 34708, Training Loss: 0.00941
Epoch: 34709, Training Loss: 0.00736
Epoch: 34709, Training Loss: 0.00766
Epoch: 34709, Training Loss: 0.00767
Epoch: 34709, Training Loss: 0.00941
Epoch: 34710, Training Loss: 0.00736
Epoch: 34710, Training Loss: 0.00766
Epoch: 34710, Training Loss: 0.00767
Epoch: 34710, Training Loss: 0.00941
Epoch: 34711, Training Loss: 0.00736
Epoch: 34711, Training Loss: 0.00766
Epoch: 34711, Training Loss: 0.00767
Epoch: 34711, Training Loss: 0.00941
Epoch: 34712, Training Loss: 0.00736
Epoch: 34712, Training Loss: 0.00766
Epoch: 34712, Training Loss: 0.00767
Epoch: 34712, Training Loss: 0.00941
Epoch: 34713, Training Loss: 0.00736
Epoch: 34713, Training Loss: 0.00766
Epoch: 34713, Training Loss: 0.00767
Epoch: 34713, Training Loss: 0.00941
Epoch: 34714, Training Loss: 0.00736
Epoch: 34714, Training Loss: 0.00766
Epoch: 34714, Training Loss: 0.00767
Epoch: 34714, Training Loss: 0.00941
Epoch: 34715, Training Loss: 0.00736
Epoch: 34715, Training Loss: 0.00766
Epoch: 34715, Training Loss: 0.00767
Epoch: 34715, Training Loss: 0.00941
Epoch: 34716, Training Loss: 0.00736
Epoch: 34716, Training Loss: 0.00766
Epoch: 34716, Training Loss: 0.00767
Epoch: 34716, Training Loss: 0.00941
Epoch: 34717, Training Loss: 0.00736
Epoch: 34717, Training Loss: 0.00766
Epoch: 34717, Training Loss: 0.00767
Epoch: 34717, Training Loss: 0.00941
Epoch: 34718, Training Loss: 0.00736
Epoch: 34718, Training Loss: 0.00766
Epoch: 34718, Training Loss: 0.00767
Epoch: 34718, Training Loss: 0.00941
Epoch: 34719, Training Loss: 0.00736
Epoch: 34719, Training Loss: 0.00766
Epoch: 34719, Training Loss: 0.00767
Epoch: 34719, Training Loss: 0.00941
Epoch: 34720, Training Loss: 0.00736
Epoch: 34720, Training Loss: 0.00766
Epoch: 34720, Training Loss: 0.00767
Epoch: 34720, Training Loss: 0.00941
Epoch: 34721, Training Loss: 0.00736
Epoch: 34721, Training Loss: 0.00766
Epoch: 34721, Training Loss: 0.00767
Epoch: 34721, Training Loss: 0.00941
Epoch: 34722, Training Loss: 0.00736
Epoch: 34722, Training Loss: 0.00766
Epoch: 34722, Training Loss: 0.00767
Epoch: 34722, Training Loss: 0.00941
Epoch: 34723, Training Loss: 0.00736
Epoch: 34723, Training Loss: 0.00766
Epoch: 34723, Training Loss: 0.00767
Epoch: 34723, Training Loss: 0.00941
Epoch: 34724, Training Loss: 0.00736
Epoch: 34724, Training Loss: 0.00766
Epoch: 34724, Training Loss: 0.00767
Epoch: 34724, Training Loss: 0.00941
Epoch: 34725, Training Loss: 0.00736
Epoch: 34725, Training Loss: 0.00766
Epoch: 34725, Training Loss: 0.00767
Epoch: 34725, Training Loss: 0.00941
Epoch: 34726, Training Loss: 0.00736
Epoch: 34726, Training Loss: 0.00766
Epoch: 34726, Training Loss: 0.00767
Epoch: 34726, Training Loss: 0.00941
Epoch: 34727, Training Loss: 0.00736
Epoch: 34727, Training Loss: 0.00766
Epoch: 34727, Training Loss: 0.00767
Epoch: 34727, Training Loss: 0.00941
Epoch: 34728, Training Loss: 0.00736
Epoch: 34728, Training Loss: 0.00766
Epoch: 34728, Training Loss: 0.00767
Epoch: 34728, Training Loss: 0.00941
Epoch: 34729, Training Loss: 0.00736
Epoch: 34729, Training Loss: 0.00766
Epoch: 34729, Training Loss: 0.00767
Epoch: 34729, Training Loss: 0.00941
Epoch: 34730, Training Loss: 0.00736
Epoch: 34730, Training Loss: 0.00766
Epoch: 34730, Training Loss: 0.00767
Epoch: 34730, Training Loss: 0.00941
Epoch: 34731, Training Loss: 0.00736
Epoch: 34731, Training Loss: 0.00766
Epoch: 34731, Training Loss: 0.00767
Epoch: 34731, Training Loss: 0.00941
Epoch: 34732, Training Loss: 0.00736
Epoch: 34732, Training Loss: 0.00766
Epoch: 34732, Training Loss: 0.00767
Epoch: 34732, Training Loss: 0.00941
Epoch: 34733, Training Loss: 0.00736
Epoch: 34733, Training Loss: 0.00766
Epoch: 34733, Training Loss: 0.00766
Epoch: 34733, Training Loss: 0.00941
Epoch: 34734, Training Loss: 0.00736
Epoch: 34734, Training Loss: 0.00766
Epoch: 34734, Training Loss: 0.00766
Epoch: 34734, Training Loss: 0.00941
Epoch: 34735, Training Loss: 0.00736
Epoch: 34735, Training Loss: 0.00766
Epoch: 34735, Training Loss: 0.00766
Epoch: 34735, Training Loss: 0.00941
Epoch: 34736, Training Loss: 0.00736
Epoch: 34736, Training Loss: 0.00766
Epoch: 34736, Training Loss: 0.00766
Epoch: 34736, Training Loss: 0.00941
Epoch: 34737, Training Loss: 0.00736
Epoch: 34737, Training Loss: 0.00766
Epoch: 34737, Training Loss: 0.00766
Epoch: 34737, Training Loss: 0.00941
Epoch: 34738, Training Loss: 0.00736
Epoch: 34738, Training Loss: 0.00766
Epoch: 34738, Training Loss: 0.00766
Epoch: 34738, Training Loss: 0.00941
Epoch: 34739, Training Loss: 0.00736
Epoch: 34739, Training Loss: 0.00766
Epoch: 34739, Training Loss: 0.00766
Epoch: 34739, Training Loss: 0.00941
Epoch: 34740, Training Loss: 0.00736
Epoch: 34740, Training Loss: 0.00766
Epoch: 34740, Training Loss: 0.00766
Epoch: 34740, Training Loss: 0.00941
Epoch: 34741, Training Loss: 0.00736
Epoch: 34741, Training Loss: 0.00766
Epoch: 34741, Training Loss: 0.00766
Epoch: 34741, Training Loss: 0.00941
Epoch: 34742, Training Loss: 0.00736
Epoch: 34742, Training Loss: 0.00765
Epoch: 34742, Training Loss: 0.00766
Epoch: 34742, Training Loss: 0.00941
Epoch: 34743, Training Loss: 0.00736
Epoch: 34743, Training Loss: 0.00765
Epoch: 34743, Training Loss: 0.00766
Epoch: 34743, Training Loss: 0.00941
Epoch: 34744, Training Loss: 0.00736
Epoch: 34744, Training Loss: 0.00765
Epoch: 34744, Training Loss: 0.00766
Epoch: 34744, Training Loss: 0.00941
Epoch: 34745, Training Loss: 0.00736
Epoch: 34745, Training Loss: 0.00765
Epoch: 34745, Training Loss: 0.00766
Epoch: 34745, Training Loss: 0.00941
Epoch: 34746, Training Loss: 0.00736
Epoch: 34746, Training Loss: 0.00765
Epoch: 34746, Training Loss: 0.00766
Epoch: 34746, Training Loss: 0.00941
Epoch: 34747, Training Loss: 0.00736
Epoch: 34747, Training Loss: 0.00765
Epoch: 34747, Training Loss: 0.00766
Epoch: 34747, Training Loss: 0.00940
Epoch: 34748, Training Loss: 0.00736
Epoch: 34748, Training Loss: 0.00765
Epoch: 34748, Training Loss: 0.00766
Epoch: 34748, Training Loss: 0.00940
Epoch: 34749, Training Loss: 0.00736
Epoch: 34749, Training Loss: 0.00765
Epoch: 34749, Training Loss: 0.00766
Epoch: 34749, Training Loss: 0.00940
Epoch: 34750, Training Loss: 0.00736
Epoch: 34750, Training Loss: 0.00765
Epoch: 34750, Training Loss: 0.00766
Epoch: 34750, Training Loss: 0.00940
Epoch: 34751, Training Loss: 0.00736
Epoch: 34751, Training Loss: 0.00765
Epoch: 34751, Training Loss: 0.00766
Epoch: 34751, Training Loss: 0.00940
Epoch: 34752, Training Loss: 0.00736
Epoch: 34752, Training Loss: 0.00765
Epoch: 34752, Training Loss: 0.00766
Epoch: 34752, Training Loss: 0.00940
Epoch: 34753, Training Loss: 0.00736
Epoch: 34753, Training Loss: 0.00765
Epoch: 34753, Training Loss: 0.00766
Epoch: 34753, Training Loss: 0.00940
Epoch: 34754, Training Loss: 0.00736
Epoch: 34754, Training Loss: 0.00765
Epoch: 34754, Training Loss: 0.00766
Epoch: 34754, Training Loss: 0.00940
Epoch: 34755, Training Loss: 0.00736
Epoch: 34755, Training Loss: 0.00765
Epoch: 34755, Training Loss: 0.00766
Epoch: 34755, Training Loss: 0.00940
Epoch: 34756, Training Loss: 0.00736
Epoch: 34756, Training Loss: 0.00765
Epoch: 34756, Training Loss: 0.00766
Epoch: 34756, Training Loss: 0.00940
Epoch: 34757, Training Loss: 0.00736
Epoch: 34757, Training Loss: 0.00765
Epoch: 34757, Training Loss: 0.00766
Epoch: 34757, Training Loss: 0.00940
Epoch: 34758, Training Loss: 0.00736
Epoch: 34758, Training Loss: 0.00765
Epoch: 34758, Training Loss: 0.00766
Epoch: 34758, Training Loss: 0.00940
Epoch: 34759, Training Loss: 0.00736
Epoch: 34759, Training Loss: 0.00765
Epoch: 34759, Training Loss: 0.00766
Epoch: 34759, Training Loss: 0.00940
Epoch: 34760, Training Loss: 0.00736
Epoch: 34760, Training Loss: 0.00765
Epoch: 34760, Training Loss: 0.00766
Epoch: 34760, Training Loss: 0.00940
Epoch: 34761, Training Loss: 0.00736
Epoch: 34761, Training Loss: 0.00765
Epoch: 34761, Training Loss: 0.00766
Epoch: 34761, Training Loss: 0.00940
Epoch: 34762, Training Loss: 0.00736
Epoch: 34762, Training Loss: 0.00765
Epoch: 34762, Training Loss: 0.00766
Epoch: 34762, Training Loss: 0.00940
Epoch: 34763, Training Loss: 0.00736
Epoch: 34763, Training Loss: 0.00765
Epoch: 34763, Training Loss: 0.00766
Epoch: 34763, Training Loss: 0.00940
Epoch: 34764, Training Loss: 0.00736
Epoch: 34764, Training Loss: 0.00765
Epoch: 34764, Training Loss: 0.00766
Epoch: 34764, Training Loss: 0.00940
Epoch: 34765, Training Loss: 0.00736
Epoch: 34765, Training Loss: 0.00765
Epoch: 34765, Training Loss: 0.00766
Epoch: 34765, Training Loss: 0.00940
Epoch: 34766, Training Loss: 0.00736
Epoch: 34766, Training Loss: 0.00765
Epoch: 34766, Training Loss: 0.00766
Epoch: 34766, Training Loss: 0.00940
Epoch: 34767, Training Loss: 0.00736
Epoch: 34767, Training Loss: 0.00765
Epoch: 34767, Training Loss: 0.00766
Epoch: 34767, Training Loss: 0.00940
Epoch: 34768, Training Loss: 0.00736
Epoch: 34768, Training Loss: 0.00765
Epoch: 34768, Training Loss: 0.00766
Epoch: 34768, Training Loss: 0.00940
Epoch: 34769, Training Loss: 0.00736
Epoch: 34769, Training Loss: 0.00765
Epoch: 34769, Training Loss: 0.00766
Epoch: 34769, Training Loss: 0.00940
Epoch: 34770, Training Loss: 0.00736
Epoch: 34770, Training Loss: 0.00765
Epoch: 34770, Training Loss: 0.00766
Epoch: 34770, Training Loss: 0.00940
Epoch: 34771, Training Loss: 0.00736
Epoch: 34771, Training Loss: 0.00765
Epoch: 34771, Training Loss: 0.00766
Epoch: 34771, Training Loss: 0.00940
Epoch: 34772, Training Loss: 0.00735
Epoch: 34772, Training Loss: 0.00765
Epoch: 34772, Training Loss: 0.00766
Epoch: 34772, Training Loss: 0.00940
Epoch: 34773, Training Loss: 0.00735
Epoch: 34773, Training Loss: 0.00765
Epoch: 34773, Training Loss: 0.00766
Epoch: 34773, Training Loss: 0.00940
Epoch: 34774, Training Loss: 0.00735
Epoch: 34774, Training Loss: 0.00765
Epoch: 34774, Training Loss: 0.00766
Epoch: 34774, Training Loss: 0.00940
Epoch: 34775, Training Loss: 0.00735
Epoch: 34775, Training Loss: 0.00765
Epoch: 34775, Training Loss: 0.00766
Epoch: 34775, Training Loss: 0.00940
Epoch: 34776, Training Loss: 0.00735
Epoch: 34776, Training Loss: 0.00765
Epoch: 34776, Training Loss: 0.00766
Epoch: 34776, Training Loss: 0.00940
Epoch: 34777, Training Loss: 0.00735
Epoch: 34777, Training Loss: 0.00765
Epoch: 34777, Training Loss: 0.00766
Epoch: 34777, Training Loss: 0.00940
Epoch: 34778, Training Loss: 0.00735
Epoch: 34778, Training Loss: 0.00765
Epoch: 34778, Training Loss: 0.00766
Epoch: 34778, Training Loss: 0.00940
Epoch: 34779, Training Loss: 0.00735
Epoch: 34779, Training Loss: 0.00765
Epoch: 34779, Training Loss: 0.00766
Epoch: 34779, Training Loss: 0.00940
Epoch: 34780, Training Loss: 0.00735
Epoch: 34780, Training Loss: 0.00765
Epoch: 34780, Training Loss: 0.00766
Epoch: 34780, Training Loss: 0.00940
Epoch: 34781, Training Loss: 0.00735
Epoch: 34781, Training Loss: 0.00765
Epoch: 34781, Training Loss: 0.00766
Epoch: 34781, Training Loss: 0.00940
Epoch: 34782, Training Loss: 0.00735
Epoch: 34782, Training Loss: 0.00765
Epoch: 34782, Training Loss: 0.00766
Epoch: 34782, Training Loss: 0.00940
Epoch: 34783, Training Loss: 0.00735
Epoch: 34783, Training Loss: 0.00765
Epoch: 34783, Training Loss: 0.00766
Epoch: 34783, Training Loss: 0.00940
Epoch: 34784, Training Loss: 0.00735
Epoch: 34784, Training Loss: 0.00765
Epoch: 34784, Training Loss: 0.00766
Epoch: 34784, Training Loss: 0.00940
Epoch: 34785, Training Loss: 0.00735
Epoch: 34785, Training Loss: 0.00765
Epoch: 34785, Training Loss: 0.00766
Epoch: 34785, Training Loss: 0.00940
Epoch: 34786, Training Loss: 0.00735
Epoch: 34786, Training Loss: 0.00765
Epoch: 34786, Training Loss: 0.00766
Epoch: 34786, Training Loss: 0.00940
Epoch: 34787, Training Loss: 0.00735
Epoch: 34787, Training Loss: 0.00765
Epoch: 34787, Training Loss: 0.00766
Epoch: 34787, Training Loss: 0.00940
Epoch: 34788, Training Loss: 0.00735
Epoch: 34788, Training Loss: 0.00765
Epoch: 34788, Training Loss: 0.00766
Epoch: 34788, Training Loss: 0.00940
Epoch: 34789, Training Loss: 0.00735
Epoch: 34789, Training Loss: 0.00765
Epoch: 34789, Training Loss: 0.00766
Epoch: 34789, Training Loss: 0.00940
Epoch: 34790, Training Loss: 0.00735
Epoch: 34790, Training Loss: 0.00765
Epoch: 34790, Training Loss: 0.00766
Epoch: 34790, Training Loss: 0.00940
Epoch: 34791, Training Loss: 0.00735
Epoch: 34791, Training Loss: 0.00765
Epoch: 34791, Training Loss: 0.00766
Epoch: 34791, Training Loss: 0.00940
Epoch: 34792, Training Loss: 0.00735
Epoch: 34792, Training Loss: 0.00765
Epoch: 34792, Training Loss: 0.00766
Epoch: 34792, Training Loss: 0.00940
Epoch: 34793, Training Loss: 0.00735
Epoch: 34793, Training Loss: 0.00765
Epoch: 34793, Training Loss: 0.00766
Epoch: 34793, Training Loss: 0.00940
Epoch: 34794, Training Loss: 0.00735
Epoch: 34794, Training Loss: 0.00765
Epoch: 34794, Training Loss: 0.00766
Epoch: 34794, Training Loss: 0.00940
Epoch: 34795, Training Loss: 0.00735
Epoch: 34795, Training Loss: 0.00765
Epoch: 34795, Training Loss: 0.00766
Epoch: 34795, Training Loss: 0.00940
Epoch: 34796, Training Loss: 0.00735
Epoch: 34796, Training Loss: 0.00765
Epoch: 34796, Training Loss: 0.00766
Epoch: 34796, Training Loss: 0.00940
Epoch: 34797, Training Loss: 0.00735
Epoch: 34797, Training Loss: 0.00765
Epoch: 34797, Training Loss: 0.00766
Epoch: 34797, Training Loss: 0.00940
Epoch: 34798, Training Loss: 0.00735
Epoch: 34798, Training Loss: 0.00765
Epoch: 34798, Training Loss: 0.00766
Epoch: 34798, Training Loss: 0.00940
Epoch: 34799, Training Loss: 0.00735
Epoch: 34799, Training Loss: 0.00765
Epoch: 34799, Training Loss: 0.00766
Epoch: 34799, Training Loss: 0.00940
Epoch: 34800, Training Loss: 0.00735
Epoch: 34800, Training Loss: 0.00765
Epoch: 34800, Training Loss: 0.00766
Epoch: 34800, Training Loss: 0.00940
Epoch: 34801, Training Loss: 0.00735
Epoch: 34801, Training Loss: 0.00765
Epoch: 34801, Training Loss: 0.00766
Epoch: 34801, Training Loss: 0.00940
Epoch: 34802, Training Loss: 0.00735
Epoch: 34802, Training Loss: 0.00765
Epoch: 34802, Training Loss: 0.00766
Epoch: 34802, Training Loss: 0.00940
Epoch: 34803, Training Loss: 0.00735
Epoch: 34803, Training Loss: 0.00765
Epoch: 34803, Training Loss: 0.00766
Epoch: 34803, Training Loss: 0.00940
Epoch: 34804, Training Loss: 0.00735
Epoch: 34804, Training Loss: 0.00765
Epoch: 34804, Training Loss: 0.00766
Epoch: 34804, Training Loss: 0.00940
Epoch: 34805, Training Loss: 0.00735
Epoch: 34805, Training Loss: 0.00765
Epoch: 34805, Training Loss: 0.00766
Epoch: 34805, Training Loss: 0.00940
Epoch: 34806, Training Loss: 0.00735
Epoch: 34806, Training Loss: 0.00765
Epoch: 34806, Training Loss: 0.00766
Epoch: 34806, Training Loss: 0.00940
Epoch: 34807, Training Loss: 0.00735
Epoch: 34807, Training Loss: 0.00765
Epoch: 34807, Training Loss: 0.00766
Epoch: 34807, Training Loss: 0.00940
Epoch: 34808, Training Loss: 0.00735
Epoch: 34808, Training Loss: 0.00765
Epoch: 34808, Training Loss: 0.00766
Epoch: 34808, Training Loss: 0.00940
Epoch: 34809, Training Loss: 0.00735
Epoch: 34809, Training Loss: 0.00765
Epoch: 34809, Training Loss: 0.00766
Epoch: 34809, Training Loss: 0.00940
Epoch: 34810, Training Loss: 0.00735
Epoch: 34810, Training Loss: 0.00765
Epoch: 34810, Training Loss: 0.00766
Epoch: 34810, Training Loss: 0.00940
Epoch: 34811, Training Loss: 0.00735
Epoch: 34811, Training Loss: 0.00765
Epoch: 34811, Training Loss: 0.00766
Epoch: 34811, Training Loss: 0.00940
Epoch: 34812, Training Loss: 0.00735
Epoch: 34812, Training Loss: 0.00765
Epoch: 34812, Training Loss: 0.00766
Epoch: 34812, Training Loss: 0.00940
Epoch: 34813, Training Loss: 0.00735
Epoch: 34813, Training Loss: 0.00765
Epoch: 34813, Training Loss: 0.00766
Epoch: 34813, Training Loss: 0.00939
Epoch: 34814, Training Loss: 0.00735
Epoch: 34814, Training Loss: 0.00765
Epoch: 34814, Training Loss: 0.00765
Epoch: 34814, Training Loss: 0.00939
Epoch: 34815, Training Loss: 0.00735
Epoch: 34815, Training Loss: 0.00765
Epoch: 34815, Training Loss: 0.00765
Epoch: 34815, Training Loss: 0.00939
Epoch: 34816, Training Loss: 0.00735
Epoch: 34816, Training Loss: 0.00765
Epoch: 34816, Training Loss: 0.00765
Epoch: 34816, Training Loss: 0.00939
Epoch: 34817, Training Loss: 0.00735
Epoch: 34817, Training Loss: 0.00765
Epoch: 34817, Training Loss: 0.00765
Epoch: 34817, Training Loss: 0.00939
Epoch: 34818, Training Loss: 0.00735
Epoch: 34818, Training Loss: 0.00765
Epoch: 34818, Training Loss: 0.00765
Epoch: 34818, Training Loss: 0.00939
Epoch: 34819, Training Loss: 0.00735
Epoch: 34819, Training Loss: 0.00765
Epoch: 34819, Training Loss: 0.00765
Epoch: 34819, Training Loss: 0.00939
Epoch: 34820, Training Loss: 0.00735
Epoch: 34820, Training Loss: 0.00765
Epoch: 34820, Training Loss: 0.00765
Epoch: 34820, Training Loss: 0.00939
Epoch: 34821, Training Loss: 0.00735
Epoch: 34821, Training Loss: 0.00765
Epoch: 34821, Training Loss: 0.00765
Epoch: 34821, Training Loss: 0.00939
Epoch: 34822, Training Loss: 0.00735
Epoch: 34822, Training Loss: 0.00765
Epoch: 34822, Training Loss: 0.00765
Epoch: 34822, Training Loss: 0.00939
Epoch: 34823, Training Loss: 0.00735
Epoch: 34823, Training Loss: 0.00765
Epoch: 34823, Training Loss: 0.00765
Epoch: 34823, Training Loss: 0.00939
Epoch: 34824, Training Loss: 0.00735
Epoch: 34824, Training Loss: 0.00764
Epoch: 34824, Training Loss: 0.00765
Epoch: 34824, Training Loss: 0.00939
Epoch: 34825, Training Loss: 0.00735
Epoch: 34825, Training Loss: 0.00764
Epoch: 34825, Training Loss: 0.00765
Epoch: 34825, Training Loss: 0.00939
Epoch: 34826, Training Loss: 0.00735
Epoch: 34826, Training Loss: 0.00764
Epoch: 34826, Training Loss: 0.00765
Epoch: 34826, Training Loss: 0.00939
Epoch: 34827, Training Loss: 0.00735
Epoch: 34827, Training Loss: 0.00764
Epoch: 34827, Training Loss: 0.00765
Epoch: 34827, Training Loss: 0.00939
Epoch: 34828, Training Loss: 0.00735
Epoch: 34828, Training Loss: 0.00764
Epoch: 34828, Training Loss: 0.00765
Epoch: 34828, Training Loss: 0.00939
Epoch: 34829, Training Loss: 0.00735
Epoch: 34829, Training Loss: 0.00764
Epoch: 34829, Training Loss: 0.00765
Epoch: 34829, Training Loss: 0.00939
Epoch: 34830, Training Loss: 0.00735
Epoch: 34830, Training Loss: 0.00764
Epoch: 34830, Training Loss: 0.00765
Epoch: 34830, Training Loss: 0.00939
Epoch: 34831, Training Loss: 0.00735
Epoch: 34831, Training Loss: 0.00764
Epoch: 34831, Training Loss: 0.00765
Epoch: 34831, Training Loss: 0.00939
Epoch: 34832, Training Loss: 0.00735
Epoch: 34832, Training Loss: 0.00764
Epoch: 34832, Training Loss: 0.00765
Epoch: 34832, Training Loss: 0.00939
Epoch: 34833, Training Loss: 0.00735
Epoch: 34833, Training Loss: 0.00764
Epoch: 34833, Training Loss: 0.00765
Epoch: 34833, Training Loss: 0.00939
Epoch: 34834, Training Loss: 0.00735
Epoch: 34834, Training Loss: 0.00764
Epoch: 34834, Training Loss: 0.00765
Epoch: 34834, Training Loss: 0.00939
Epoch: 34835, Training Loss: 0.00735
Epoch: 34835, Training Loss: 0.00764
Epoch: 34835, Training Loss: 0.00765
Epoch: 34835, Training Loss: 0.00939
Epoch: 34836, Training Loss: 0.00735
Epoch: 34836, Training Loss: 0.00764
Epoch: 34836, Training Loss: 0.00765
Epoch: 34836, Training Loss: 0.00939
Epoch: 34837, Training Loss: 0.00735
Epoch: 34837, Training Loss: 0.00764
Epoch: 34837, Training Loss: 0.00765
Epoch: 34837, Training Loss: 0.00939
Epoch: 34838, Training Loss: 0.00735
Epoch: 34838, Training Loss: 0.00764
Epoch: 34838, Training Loss: 0.00765
Epoch: 34838, Training Loss: 0.00939
Epoch: 34839, Training Loss: 0.00735
Epoch: 34839, Training Loss: 0.00764
Epoch: 34839, Training Loss: 0.00765
Epoch: 34839, Training Loss: 0.00939
Epoch: 34840, Training Loss: 0.00735
Epoch: 34840, Training Loss: 0.00764
Epoch: 34840, Training Loss: 0.00765
Epoch: 34840, Training Loss: 0.00939
Epoch: 34841, Training Loss: 0.00735
Epoch: 34841, Training Loss: 0.00764
Epoch: 34841, Training Loss: 0.00765
Epoch: 34841, Training Loss: 0.00939
Epoch: 34842, Training Loss: 0.00735
Epoch: 34842, Training Loss: 0.00764
Epoch: 34842, Training Loss: 0.00765
Epoch: 34842, Training Loss: 0.00939
Epoch: 34843, Training Loss: 0.00735
Epoch: 34843, Training Loss: 0.00764
Epoch: 34843, Training Loss: 0.00765
Epoch: 34843, Training Loss: 0.00939
Epoch: 34844, Training Loss: 0.00735
Epoch: 34844, Training Loss: 0.00764
Epoch: 34844, Training Loss: 0.00765
Epoch: 34844, Training Loss: 0.00939
Epoch: 34845, Training Loss: 0.00735
Epoch: 34845, Training Loss: 0.00764
Epoch: 34845, Training Loss: 0.00765
Epoch: 34845, Training Loss: 0.00939
Epoch: 34846, Training Loss: 0.00735
Epoch: 34846, Training Loss: 0.00764
Epoch: 34846, Training Loss: 0.00765
Epoch: 34846, Training Loss: 0.00939
Epoch: 34847, Training Loss: 0.00735
Epoch: 34847, Training Loss: 0.00764
Epoch: 34847, Training Loss: 0.00765
Epoch: 34847, Training Loss: 0.00939
Epoch: 34848, Training Loss: 0.00735
Epoch: 34848, Training Loss: 0.00764
Epoch: 34848, Training Loss: 0.00765
Epoch: 34848, Training Loss: 0.00939
Epoch: 34849, Training Loss: 0.00735
Epoch: 34849, Training Loss: 0.00764
Epoch: 34849, Training Loss: 0.00765
Epoch: 34849, Training Loss: 0.00939
Epoch: 34850, Training Loss: 0.00735
Epoch: 34850, Training Loss: 0.00764
Epoch: 34850, Training Loss: 0.00765
Epoch: 34850, Training Loss: 0.00939
Epoch: 34851, Training Loss: 0.00735
Epoch: 34851, Training Loss: 0.00764
Epoch: 34851, Training Loss: 0.00765
Epoch: 34851, Training Loss: 0.00939
Epoch: 34852, Training Loss: 0.00735
Epoch: 34852, Training Loss: 0.00764
Epoch: 34852, Training Loss: 0.00765
Epoch: 34852, Training Loss: 0.00939
Epoch: 34853, Training Loss: 0.00735
Epoch: 34853, Training Loss: 0.00764
Epoch: 34853, Training Loss: 0.00765
Epoch: 34853, Training Loss: 0.00939
Epoch: 34854, Training Loss: 0.00735
Epoch: 34854, Training Loss: 0.00764
Epoch: 34854, Training Loss: 0.00765
Epoch: 34854, Training Loss: 0.00939
Epoch: 34855, Training Loss: 0.00735
Epoch: 34855, Training Loss: 0.00764
Epoch: 34855, Training Loss: 0.00765
Epoch: 34855, Training Loss: 0.00939
Epoch: 34856, Training Loss: 0.00735
Epoch: 34856, Training Loss: 0.00764
Epoch: 34856, Training Loss: 0.00765
Epoch: 34856, Training Loss: 0.00939
Epoch: 34857, Training Loss: 0.00735
Epoch: 34857, Training Loss: 0.00764
Epoch: 34857, Training Loss: 0.00765
Epoch: 34857, Training Loss: 0.00939
Epoch: 34858, Training Loss: 0.00735
Epoch: 34858, Training Loss: 0.00764
Epoch: 34858, Training Loss: 0.00765
Epoch: 34858, Training Loss: 0.00939
Epoch: 34859, Training Loss: 0.00734
Epoch: 34859, Training Loss: 0.00764
Epoch: 34859, Training Loss: 0.00765
Epoch: 34859, Training Loss: 0.00939
Epoch: 34860, Training Loss: 0.00734
Epoch: 34860, Training Loss: 0.00764
Epoch: 34860, Training Loss: 0.00765
Epoch: 34860, Training Loss: 0.00939
Epoch: 34861, Training Loss: 0.00734
Epoch: 34861, Training Loss: 0.00764
Epoch: 34861, Training Loss: 0.00765
Epoch: 34861, Training Loss: 0.00939
Epoch: 34862, Training Loss: 0.00734
Epoch: 34862, Training Loss: 0.00764
Epoch: 34862, Training Loss: 0.00765
Epoch: 34862, Training Loss: 0.00939
Epoch: 34863, Training Loss: 0.00734
Epoch: 34863, Training Loss: 0.00764
Epoch: 34863, Training Loss: 0.00765
Epoch: 34863, Training Loss: 0.00939
Epoch: 34864, Training Loss: 0.00734
[Verbose per-sample loss log truncated. Over epochs 34864–35107 the four XOR samples' quadratic losses decrease very slowly, from roughly (0.00734, 0.00765, 0.00765, 0.00939) to (0.00732, 0.00761, 0.00762, 0.00935); the worst-case sample has not yet reached the 0.008 target at epoch 35107.]
Epoch: 35108, Training Loss: 0.00732
Epoch: 35108, Training Loss: 0.00761
Epoch: 35108, Training Loss: 0.00762
Epoch: 35108, Training Loss: 0.00935
Epoch: 35109, Training Loss: 0.00732
Epoch: 35109, Training Loss: 0.00761
Epoch: 35109, Training Loss: 0.00762
Epoch: 35109, Training Loss: 0.00935
Epoch: 35110, Training Loss: 0.00732
Epoch: 35110, Training Loss: 0.00761
Epoch: 35110, Training Loss: 0.00762
Epoch: 35110, Training Loss: 0.00935
Epoch: 35111, Training Loss: 0.00732
Epoch: 35111, Training Loss: 0.00761
Epoch: 35111, Training Loss: 0.00762
Epoch: 35111, Training Loss: 0.00935
Epoch: 35112, Training Loss: 0.00732
Epoch: 35112, Training Loss: 0.00761
Epoch: 35112, Training Loss: 0.00762
Epoch: 35112, Training Loss: 0.00935
Epoch: 35113, Training Loss: 0.00732
Epoch: 35113, Training Loss: 0.00761
Epoch: 35113, Training Loss: 0.00762
Epoch: 35113, Training Loss: 0.00935
Epoch: 35114, Training Loss: 0.00732
Epoch: 35114, Training Loss: 0.00761
Epoch: 35114, Training Loss: 0.00762
Epoch: 35114, Training Loss: 0.00935
Epoch: 35115, Training Loss: 0.00732
Epoch: 35115, Training Loss: 0.00761
Epoch: 35115, Training Loss: 0.00762
Epoch: 35115, Training Loss: 0.00935
Epoch: 35116, Training Loss: 0.00732
Epoch: 35116, Training Loss: 0.00761
Epoch: 35116, Training Loss: 0.00762
Epoch: 35116, Training Loss: 0.00935
Epoch: 35117, Training Loss: 0.00732
Epoch: 35117, Training Loss: 0.00761
Epoch: 35117, Training Loss: 0.00762
Epoch: 35117, Training Loss: 0.00935
Epoch: 35118, Training Loss: 0.00732
Epoch: 35118, Training Loss: 0.00761
Epoch: 35118, Training Loss: 0.00762
Epoch: 35118, Training Loss: 0.00935
Epoch: 35119, Training Loss: 0.00732
Epoch: 35119, Training Loss: 0.00761
Epoch: 35119, Training Loss: 0.00762
Epoch: 35119, Training Loss: 0.00935
Epoch: 35120, Training Loss: 0.00732
Epoch: 35120, Training Loss: 0.00761
Epoch: 35120, Training Loss: 0.00762
Epoch: 35120, Training Loss: 0.00935
Epoch: 35121, Training Loss: 0.00732
Epoch: 35121, Training Loss: 0.00761
Epoch: 35121, Training Loss: 0.00762
Epoch: 35121, Training Loss: 0.00935
Epoch: 35122, Training Loss: 0.00731
Epoch: 35122, Training Loss: 0.00761
Epoch: 35122, Training Loss: 0.00762
Epoch: 35122, Training Loss: 0.00935
Epoch: 35123, Training Loss: 0.00731
Epoch: 35123, Training Loss: 0.00761
Epoch: 35123, Training Loss: 0.00762
Epoch: 35123, Training Loss: 0.00935
Epoch: 35124, Training Loss: 0.00731
Epoch: 35124, Training Loss: 0.00761
Epoch: 35124, Training Loss: 0.00762
Epoch: 35124, Training Loss: 0.00935
Epoch: 35125, Training Loss: 0.00731
Epoch: 35125, Training Loss: 0.00761
Epoch: 35125, Training Loss: 0.00762
Epoch: 35125, Training Loss: 0.00935
Epoch: 35126, Training Loss: 0.00731
Epoch: 35126, Training Loss: 0.00761
Epoch: 35126, Training Loss: 0.00762
Epoch: 35126, Training Loss: 0.00935
Epoch: 35127, Training Loss: 0.00731
Epoch: 35127, Training Loss: 0.00761
Epoch: 35127, Training Loss: 0.00762
Epoch: 35127, Training Loss: 0.00935
Epoch: 35128, Training Loss: 0.00731
Epoch: 35128, Training Loss: 0.00761
Epoch: 35128, Training Loss: 0.00762
Epoch: 35128, Training Loss: 0.00935
Epoch: 35129, Training Loss: 0.00731
Epoch: 35129, Training Loss: 0.00761
Epoch: 35129, Training Loss: 0.00762
Epoch: 35129, Training Loss: 0.00935
Epoch: 35130, Training Loss: 0.00731
Epoch: 35130, Training Loss: 0.00761
Epoch: 35130, Training Loss: 0.00762
Epoch: 35130, Training Loss: 0.00935
Epoch: 35131, Training Loss: 0.00731
Epoch: 35131, Training Loss: 0.00761
Epoch: 35131, Training Loss: 0.00762
Epoch: 35131, Training Loss: 0.00935
Epoch: 35132, Training Loss: 0.00731
Epoch: 35132, Training Loss: 0.00761
Epoch: 35132, Training Loss: 0.00762
Epoch: 35132, Training Loss: 0.00935
Epoch: 35133, Training Loss: 0.00731
Epoch: 35133, Training Loss: 0.00761
Epoch: 35133, Training Loss: 0.00762
Epoch: 35133, Training Loss: 0.00935
Epoch: 35134, Training Loss: 0.00731
Epoch: 35134, Training Loss: 0.00761
Epoch: 35134, Training Loss: 0.00762
Epoch: 35134, Training Loss: 0.00935
Epoch: 35135, Training Loss: 0.00731
Epoch: 35135, Training Loss: 0.00761
Epoch: 35135, Training Loss: 0.00762
Epoch: 35135, Training Loss: 0.00935
Epoch: 35136, Training Loss: 0.00731
Epoch: 35136, Training Loss: 0.00761
Epoch: 35136, Training Loss: 0.00762
Epoch: 35136, Training Loss: 0.00935
Epoch: 35137, Training Loss: 0.00731
Epoch: 35137, Training Loss: 0.00761
Epoch: 35137, Training Loss: 0.00762
Epoch: 35137, Training Loss: 0.00935
Epoch: 35138, Training Loss: 0.00731
Epoch: 35138, Training Loss: 0.00761
Epoch: 35138, Training Loss: 0.00762
Epoch: 35138, Training Loss: 0.00935
Epoch: 35139, Training Loss: 0.00731
Epoch: 35139, Training Loss: 0.00761
Epoch: 35139, Training Loss: 0.00762
Epoch: 35139, Training Loss: 0.00935
Epoch: 35140, Training Loss: 0.00731
Epoch: 35140, Training Loss: 0.00761
Epoch: 35140, Training Loss: 0.00762
Epoch: 35140, Training Loss: 0.00935
Epoch: 35141, Training Loss: 0.00731
Epoch: 35141, Training Loss: 0.00761
Epoch: 35141, Training Loss: 0.00762
Epoch: 35141, Training Loss: 0.00935
Epoch: 35142, Training Loss: 0.00731
Epoch: 35142, Training Loss: 0.00761
Epoch: 35142, Training Loss: 0.00762
Epoch: 35142, Training Loss: 0.00935
Epoch: 35143, Training Loss: 0.00731
Epoch: 35143, Training Loss: 0.00761
Epoch: 35143, Training Loss: 0.00762
Epoch: 35143, Training Loss: 0.00935
Epoch: 35144, Training Loss: 0.00731
Epoch: 35144, Training Loss: 0.00761
Epoch: 35144, Training Loss: 0.00761
Epoch: 35144, Training Loss: 0.00935
Epoch: 35145, Training Loss: 0.00731
Epoch: 35145, Training Loss: 0.00761
Epoch: 35145, Training Loss: 0.00761
Epoch: 35145, Training Loss: 0.00935
Epoch: 35146, Training Loss: 0.00731
Epoch: 35146, Training Loss: 0.00761
Epoch: 35146, Training Loss: 0.00761
Epoch: 35146, Training Loss: 0.00934
Epoch: 35147, Training Loss: 0.00731
Epoch: 35147, Training Loss: 0.00761
Epoch: 35147, Training Loss: 0.00761
Epoch: 35147, Training Loss: 0.00934
Epoch: 35148, Training Loss: 0.00731
Epoch: 35148, Training Loss: 0.00761
Epoch: 35148, Training Loss: 0.00761
Epoch: 35148, Training Loss: 0.00934
Epoch: 35149, Training Loss: 0.00731
Epoch: 35149, Training Loss: 0.00761
Epoch: 35149, Training Loss: 0.00761
Epoch: 35149, Training Loss: 0.00934
Epoch: 35150, Training Loss: 0.00731
Epoch: 35150, Training Loss: 0.00761
Epoch: 35150, Training Loss: 0.00761
Epoch: 35150, Training Loss: 0.00934
Epoch: 35151, Training Loss: 0.00731
Epoch: 35151, Training Loss: 0.00761
Epoch: 35151, Training Loss: 0.00761
Epoch: 35151, Training Loss: 0.00934
Epoch: 35152, Training Loss: 0.00731
Epoch: 35152, Training Loss: 0.00761
Epoch: 35152, Training Loss: 0.00761
Epoch: 35152, Training Loss: 0.00934
Epoch: 35153, Training Loss: 0.00731
Epoch: 35153, Training Loss: 0.00761
Epoch: 35153, Training Loss: 0.00761
Epoch: 35153, Training Loss: 0.00934
Epoch: 35154, Training Loss: 0.00731
Epoch: 35154, Training Loss: 0.00760
Epoch: 35154, Training Loss: 0.00761
Epoch: 35154, Training Loss: 0.00934
Epoch: 35155, Training Loss: 0.00731
Epoch: 35155, Training Loss: 0.00760
Epoch: 35155, Training Loss: 0.00761
Epoch: 35155, Training Loss: 0.00934
Epoch: 35156, Training Loss: 0.00731
Epoch: 35156, Training Loss: 0.00760
Epoch: 35156, Training Loss: 0.00761
Epoch: 35156, Training Loss: 0.00934
Epoch: 35157, Training Loss: 0.00731
Epoch: 35157, Training Loss: 0.00760
Epoch: 35157, Training Loss: 0.00761
Epoch: 35157, Training Loss: 0.00934
Epoch: 35158, Training Loss: 0.00731
Epoch: 35158, Training Loss: 0.00760
Epoch: 35158, Training Loss: 0.00761
Epoch: 35158, Training Loss: 0.00934
Epoch: 35159, Training Loss: 0.00731
Epoch: 35159, Training Loss: 0.00760
Epoch: 35159, Training Loss: 0.00761
Epoch: 35159, Training Loss: 0.00934
Epoch: 35160, Training Loss: 0.00731
Epoch: 35160, Training Loss: 0.00760
Epoch: 35160, Training Loss: 0.00761
Epoch: 35160, Training Loss: 0.00934
Epoch: 35161, Training Loss: 0.00731
Epoch: 35161, Training Loss: 0.00760
Epoch: 35161, Training Loss: 0.00761
Epoch: 35161, Training Loss: 0.00934
Epoch: 35162, Training Loss: 0.00731
Epoch: 35162, Training Loss: 0.00760
Epoch: 35162, Training Loss: 0.00761
Epoch: 35162, Training Loss: 0.00934
Epoch: 35163, Training Loss: 0.00731
Epoch: 35163, Training Loss: 0.00760
Epoch: 35163, Training Loss: 0.00761
Epoch: 35163, Training Loss: 0.00934
Epoch: 35164, Training Loss: 0.00731
Epoch: 35164, Training Loss: 0.00760
Epoch: 35164, Training Loss: 0.00761
Epoch: 35164, Training Loss: 0.00934
Epoch: 35165, Training Loss: 0.00731
Epoch: 35165, Training Loss: 0.00760
Epoch: 35165, Training Loss: 0.00761
Epoch: 35165, Training Loss: 0.00934
Epoch: 35166, Training Loss: 0.00731
Epoch: 35166, Training Loss: 0.00760
Epoch: 35166, Training Loss: 0.00761
Epoch: 35166, Training Loss: 0.00934
Epoch: 35167, Training Loss: 0.00731
Epoch: 35167, Training Loss: 0.00760
Epoch: 35167, Training Loss: 0.00761
Epoch: 35167, Training Loss: 0.00934
Epoch: 35168, Training Loss: 0.00731
Epoch: 35168, Training Loss: 0.00760
Epoch: 35168, Training Loss: 0.00761
Epoch: 35168, Training Loss: 0.00934
Epoch: 35169, Training Loss: 0.00731
Epoch: 35169, Training Loss: 0.00760
Epoch: 35169, Training Loss: 0.00761
Epoch: 35169, Training Loss: 0.00934
Epoch: 35170, Training Loss: 0.00731
Epoch: 35170, Training Loss: 0.00760
Epoch: 35170, Training Loss: 0.00761
Epoch: 35170, Training Loss: 0.00934
Epoch: 35171, Training Loss: 0.00731
Epoch: 35171, Training Loss: 0.00760
Epoch: 35171, Training Loss: 0.00761
Epoch: 35171, Training Loss: 0.00934
Epoch: 35172, Training Loss: 0.00731
Epoch: 35172, Training Loss: 0.00760
Epoch: 35172, Training Loss: 0.00761
Epoch: 35172, Training Loss: 0.00934
Epoch: 35173, Training Loss: 0.00731
Epoch: 35173, Training Loss: 0.00760
Epoch: 35173, Training Loss: 0.00761
Epoch: 35173, Training Loss: 0.00934
Epoch: 35174, Training Loss: 0.00731
Epoch: 35174, Training Loss: 0.00760
Epoch: 35174, Training Loss: 0.00761
Epoch: 35174, Training Loss: 0.00934
Epoch: 35175, Training Loss: 0.00731
Epoch: 35175, Training Loss: 0.00760
Epoch: 35175, Training Loss: 0.00761
Epoch: 35175, Training Loss: 0.00934
Epoch: 35176, Training Loss: 0.00731
Epoch: 35176, Training Loss: 0.00760
Epoch: 35176, Training Loss: 0.00761
Epoch: 35176, Training Loss: 0.00934
Epoch: 35177, Training Loss: 0.00731
Epoch: 35177, Training Loss: 0.00760
Epoch: 35177, Training Loss: 0.00761
Epoch: 35177, Training Loss: 0.00934
Epoch: 35178, Training Loss: 0.00731
Epoch: 35178, Training Loss: 0.00760
Epoch: 35178, Training Loss: 0.00761
Epoch: 35178, Training Loss: 0.00934
Epoch: 35179, Training Loss: 0.00731
Epoch: 35179, Training Loss: 0.00760
Epoch: 35179, Training Loss: 0.00761
Epoch: 35179, Training Loss: 0.00934
Epoch: 35180, Training Loss: 0.00731
Epoch: 35180, Training Loss: 0.00760
Epoch: 35180, Training Loss: 0.00761
Epoch: 35180, Training Loss: 0.00934
Epoch: 35181, Training Loss: 0.00731
Epoch: 35181, Training Loss: 0.00760
Epoch: 35181, Training Loss: 0.00761
Epoch: 35181, Training Loss: 0.00934
Epoch: 35182, Training Loss: 0.00731
Epoch: 35182, Training Loss: 0.00760
Epoch: 35182, Training Loss: 0.00761
Epoch: 35182, Training Loss: 0.00934
Epoch: 35183, Training Loss: 0.00731
Epoch: 35183, Training Loss: 0.00760
Epoch: 35183, Training Loss: 0.00761
Epoch: 35183, Training Loss: 0.00934
Epoch: 35184, Training Loss: 0.00731
Epoch: 35184, Training Loss: 0.00760
Epoch: 35184, Training Loss: 0.00761
Epoch: 35184, Training Loss: 0.00934
Epoch: 35185, Training Loss: 0.00731
Epoch: 35185, Training Loss: 0.00760
Epoch: 35185, Training Loss: 0.00761
Epoch: 35185, Training Loss: 0.00934
Epoch: 35186, Training Loss: 0.00731
Epoch: 35186, Training Loss: 0.00760
Epoch: 35186, Training Loss: 0.00761
Epoch: 35186, Training Loss: 0.00934
Epoch: 35187, Training Loss: 0.00731
Epoch: 35187, Training Loss: 0.00760
Epoch: 35187, Training Loss: 0.00761
Epoch: 35187, Training Loss: 0.00934
Epoch: 35188, Training Loss: 0.00731
Epoch: 35188, Training Loss: 0.00760
Epoch: 35188, Training Loss: 0.00761
Epoch: 35188, Training Loss: 0.00934
Epoch: 35189, Training Loss: 0.00731
Epoch: 35189, Training Loss: 0.00760
Epoch: 35189, Training Loss: 0.00761
Epoch: 35189, Training Loss: 0.00934
Epoch: 35190, Training Loss: 0.00731
Epoch: 35190, Training Loss: 0.00760
Epoch: 35190, Training Loss: 0.00761
Epoch: 35190, Training Loss: 0.00934
Epoch: 35191, Training Loss: 0.00731
Epoch: 35191, Training Loss: 0.00760
Epoch: 35191, Training Loss: 0.00761
Epoch: 35191, Training Loss: 0.00934
Epoch: 35192, Training Loss: 0.00731
Epoch: 35192, Training Loss: 0.00760
Epoch: 35192, Training Loss: 0.00761
Epoch: 35192, Training Loss: 0.00934
Epoch: 35193, Training Loss: 0.00731
Epoch: 35193, Training Loss: 0.00760
Epoch: 35193, Training Loss: 0.00761
Epoch: 35193, Training Loss: 0.00934
Epoch: 35194, Training Loss: 0.00731
Epoch: 35194, Training Loss: 0.00760
Epoch: 35194, Training Loss: 0.00761
Epoch: 35194, Training Loss: 0.00934
Epoch: 35195, Training Loss: 0.00731
Epoch: 35195, Training Loss: 0.00760
Epoch: 35195, Training Loss: 0.00761
Epoch: 35195, Training Loss: 0.00934
Epoch: 35196, Training Loss: 0.00731
Epoch: 35196, Training Loss: 0.00760
Epoch: 35196, Training Loss: 0.00761
Epoch: 35196, Training Loss: 0.00934
Epoch: 35197, Training Loss: 0.00731
Epoch: 35197, Training Loss: 0.00760
Epoch: 35197, Training Loss: 0.00761
Epoch: 35197, Training Loss: 0.00934
Epoch: 35198, Training Loss: 0.00731
Epoch: 35198, Training Loss: 0.00760
Epoch: 35198, Training Loss: 0.00761
Epoch: 35198, Training Loss: 0.00934
Epoch: 35199, Training Loss: 0.00731
Epoch: 35199, Training Loss: 0.00760
Epoch: 35199, Training Loss: 0.00761
Epoch: 35199, Training Loss: 0.00934
Epoch: 35200, Training Loss: 0.00731
Epoch: 35200, Training Loss: 0.00760
Epoch: 35200, Training Loss: 0.00761
Epoch: 35200, Training Loss: 0.00934
Epoch: 35201, Training Loss: 0.00731
Epoch: 35201, Training Loss: 0.00760
Epoch: 35201, Training Loss: 0.00761
Epoch: 35201, Training Loss: 0.00934
Epoch: 35202, Training Loss: 0.00731
Epoch: 35202, Training Loss: 0.00760
Epoch: 35202, Training Loss: 0.00761
Epoch: 35202, Training Loss: 0.00934
Epoch: 35203, Training Loss: 0.00731
Epoch: 35203, Training Loss: 0.00760
Epoch: 35203, Training Loss: 0.00761
Epoch: 35203, Training Loss: 0.00934
Epoch: 35204, Training Loss: 0.00731
Epoch: 35204, Training Loss: 0.00760
Epoch: 35204, Training Loss: 0.00761
Epoch: 35204, Training Loss: 0.00934
Epoch: 35205, Training Loss: 0.00731
Epoch: 35205, Training Loss: 0.00760
Epoch: 35205, Training Loss: 0.00761
Epoch: 35205, Training Loss: 0.00934
Epoch: 35206, Training Loss: 0.00731
Epoch: 35206, Training Loss: 0.00760
Epoch: 35206, Training Loss: 0.00761
Epoch: 35206, Training Loss: 0.00934
Epoch: 35207, Training Loss: 0.00731
Epoch: 35207, Training Loss: 0.00760
Epoch: 35207, Training Loss: 0.00761
Epoch: 35207, Training Loss: 0.00934
Epoch: 35208, Training Loss: 0.00731
Epoch: 35208, Training Loss: 0.00760
Epoch: 35208, Training Loss: 0.00761
Epoch: 35208, Training Loss: 0.00934
Epoch: 35209, Training Loss: 0.00731
Epoch: 35209, Training Loss: 0.00760
Epoch: 35209, Training Loss: 0.00761
Epoch: 35209, Training Loss: 0.00934
Epoch: 35210, Training Loss: 0.00731
Epoch: 35210, Training Loss: 0.00760
Epoch: 35210, Training Loss: 0.00761
Epoch: 35210, Training Loss: 0.00934
Epoch: 35211, Training Loss: 0.00730
Epoch: 35211, Training Loss: 0.00760
Epoch: 35211, Training Loss: 0.00761
Epoch: 35211, Training Loss: 0.00934
Epoch: 35212, Training Loss: 0.00730
Epoch: 35212, Training Loss: 0.00760
Epoch: 35212, Training Loss: 0.00761
Epoch: 35212, Training Loss: 0.00934
Epoch: 35213, Training Loss: 0.00730
Epoch: 35213, Training Loss: 0.00760
Epoch: 35213, Training Loss: 0.00761
Epoch: 35213, Training Loss: 0.00933
Epoch: 35214, Training Loss: 0.00730
Epoch: 35214, Training Loss: 0.00760
Epoch: 35214, Training Loss: 0.00761
Epoch: 35214, Training Loss: 0.00933
Epoch: 35215, Training Loss: 0.00730
Epoch: 35215, Training Loss: 0.00760
Epoch: 35215, Training Loss: 0.00761
Epoch: 35215, Training Loss: 0.00933
Epoch: 35216, Training Loss: 0.00730
Epoch: 35216, Training Loss: 0.00760
Epoch: 35216, Training Loss: 0.00761
Epoch: 35216, Training Loss: 0.00933
Epoch: 35217, Training Loss: 0.00730
Epoch: 35217, Training Loss: 0.00760
Epoch: 35217, Training Loss: 0.00761
Epoch: 35217, Training Loss: 0.00933
Epoch: 35218, Training Loss: 0.00730
Epoch: 35218, Training Loss: 0.00760
Epoch: 35218, Training Loss: 0.00761
Epoch: 35218, Training Loss: 0.00933
Epoch: 35219, Training Loss: 0.00730
Epoch: 35219, Training Loss: 0.00760
Epoch: 35219, Training Loss: 0.00761
Epoch: 35219, Training Loss: 0.00933
Epoch: 35220, Training Loss: 0.00730
Epoch: 35220, Training Loss: 0.00760
Epoch: 35220, Training Loss: 0.00761
Epoch: 35220, Training Loss: 0.00933
Epoch: 35221, Training Loss: 0.00730
Epoch: 35221, Training Loss: 0.00760
Epoch: 35221, Training Loss: 0.00761
Epoch: 35221, Training Loss: 0.00933
Epoch: 35222, Training Loss: 0.00730
Epoch: 35222, Training Loss: 0.00760
Epoch: 35222, Training Loss: 0.00761
Epoch: 35222, Training Loss: 0.00933
Epoch: 35223, Training Loss: 0.00730
Epoch: 35223, Training Loss: 0.00760
Epoch: 35223, Training Loss: 0.00761
Epoch: 35223, Training Loss: 0.00933
Epoch: 35224, Training Loss: 0.00730
Epoch: 35224, Training Loss: 0.00760
Epoch: 35224, Training Loss: 0.00761
Epoch: 35224, Training Loss: 0.00933
Epoch: 35225, Training Loss: 0.00730
Epoch: 35225, Training Loss: 0.00760
Epoch: 35225, Training Loss: 0.00761
Epoch: 35225, Training Loss: 0.00933
Epoch: 35226, Training Loss: 0.00730
Epoch: 35226, Training Loss: 0.00760
Epoch: 35226, Training Loss: 0.00761
Epoch: 35226, Training Loss: 0.00933
Epoch: 35227, Training Loss: 0.00730
Epoch: 35227, Training Loss: 0.00760
Epoch: 35227, Training Loss: 0.00760
Epoch: 35227, Training Loss: 0.00933
Epoch: 35228, Training Loss: 0.00730
Epoch: 35228, Training Loss: 0.00760
Epoch: 35228, Training Loss: 0.00760
Epoch: 35228, Training Loss: 0.00933
Epoch: 35229, Training Loss: 0.00730
Epoch: 35229, Training Loss: 0.00760
Epoch: 35229, Training Loss: 0.00760
Epoch: 35229, Training Loss: 0.00933
Epoch: 35230, Training Loss: 0.00730
Epoch: 35230, Training Loss: 0.00760
Epoch: 35230, Training Loss: 0.00760
Epoch: 35230, Training Loss: 0.00933
Epoch: 35231, Training Loss: 0.00730
Epoch: 35231, Training Loss: 0.00760
Epoch: 35231, Training Loss: 0.00760
Epoch: 35231, Training Loss: 0.00933
Epoch: 35232, Training Loss: 0.00730
Epoch: 35232, Training Loss: 0.00760
Epoch: 35232, Training Loss: 0.00760
Epoch: 35232, Training Loss: 0.00933
Epoch: 35233, Training Loss: 0.00730
Epoch: 35233, Training Loss: 0.00760
Epoch: 35233, Training Loss: 0.00760
Epoch: 35233, Training Loss: 0.00933
Epoch: 35234, Training Loss: 0.00730
Epoch: 35234, Training Loss: 0.00760
Epoch: 35234, Training Loss: 0.00760
Epoch: 35234, Training Loss: 0.00933
Epoch: 35235, Training Loss: 0.00730
Epoch: 35235, Training Loss: 0.00760
Epoch: 35235, Training Loss: 0.00760
Epoch: 35235, Training Loss: 0.00933
Epoch: 35236, Training Loss: 0.00730
Epoch: 35236, Training Loss: 0.00760
Epoch: 35236, Training Loss: 0.00760
Epoch: 35236, Training Loss: 0.00933
Epoch: 35237, Training Loss: 0.00730
Epoch: 35237, Training Loss: 0.00760
Epoch: 35237, Training Loss: 0.00760
Epoch: 35237, Training Loss: 0.00933
Epoch: 35238, Training Loss: 0.00730
Epoch: 35238, Training Loss: 0.00759
Epoch: 35238, Training Loss: 0.00760
Epoch: 35238, Training Loss: 0.00933
Epoch: 35239, Training Loss: 0.00730
Epoch: 35239, Training Loss: 0.00759
Epoch: 35239, Training Loss: 0.00760
Epoch: 35239, Training Loss: 0.00933
Epoch: 35240, Training Loss: 0.00730
Epoch: 35240, Training Loss: 0.00759
Epoch: 35240, Training Loss: 0.00760
Epoch: 35240, Training Loss: 0.00933
Epoch: 35241, Training Loss: 0.00730
Epoch: 35241, Training Loss: 0.00759
Epoch: 35241, Training Loss: 0.00760
Epoch: 35241, Training Loss: 0.00933
Epoch: 35242, Training Loss: 0.00730
Epoch: 35242, Training Loss: 0.00759
Epoch: 35242, Training Loss: 0.00760
Epoch: 35242, Training Loss: 0.00933
Epoch: 35243, Training Loss: 0.00730
Epoch: 35243, Training Loss: 0.00759
Epoch: 35243, Training Loss: 0.00760
Epoch: 35243, Training Loss: 0.00933
Epoch: 35244, Training Loss: 0.00730
Epoch: 35244, Training Loss: 0.00759
Epoch: 35244, Training Loss: 0.00760
Epoch: 35244, Training Loss: 0.00933
Epoch: 35245, Training Loss: 0.00730
Epoch: 35245, Training Loss: 0.00759
Epoch: 35245, Training Loss: 0.00760
Epoch: 35245, Training Loss: 0.00933
Epoch: 35246, Training Loss: 0.00730
Epoch: 35246, Training Loss: 0.00759
Epoch: 35246, Training Loss: 0.00760
Epoch: 35246, Training Loss: 0.00933
Epoch: 35247, Training Loss: 0.00730
Epoch: 35247, Training Loss: 0.00759
Epoch: 35247, Training Loss: 0.00760
Epoch: 35247, Training Loss: 0.00933
Epoch: 35248, Training Loss: 0.00730
Epoch: 35248, Training Loss: 0.00759
Epoch: 35248, Training Loss: 0.00760
Epoch: 35248, Training Loss: 0.00933
Epoch: 35249, Training Loss: 0.00730
Epoch: 35249, Training Loss: 0.00759
Epoch: 35249, Training Loss: 0.00760
Epoch: 35249, Training Loss: 0.00933
Epoch: 35250, Training Loss: 0.00730
Epoch: 35250, Training Loss: 0.00759
Epoch: 35250, Training Loss: 0.00760
Epoch: 35250, Training Loss: 0.00933
Epoch: 35251, Training Loss: 0.00730
Epoch: 35251, Training Loss: 0.00759
Epoch: 35251, Training Loss: 0.00760
Epoch: 35251, Training Loss: 0.00933
Epoch: 35252, Training Loss: 0.00730
Epoch: 35252, Training Loss: 0.00759
Epoch: 35252, Training Loss: 0.00760
Epoch: 35252, Training Loss: 0.00933
Epoch: 35253, Training Loss: 0.00730
Epoch: 35253, Training Loss: 0.00759
Epoch: 35253, Training Loss: 0.00760
Epoch: 35253, Training Loss: 0.00933
Epoch: 35254, Training Loss: 0.00730
Epoch: 35254, Training Loss: 0.00759
Epoch: 35254, Training Loss: 0.00760
Epoch: 35254, Training Loss: 0.00933
Epoch: 35255, Training Loss: 0.00730
Epoch: 35255, Training Loss: 0.00759
Epoch: 35255, Training Loss: 0.00760
Epoch: 35255, Training Loss: 0.00933
Epoch: 35256, Training Loss: 0.00730
Epoch: 35256, Training Loss: 0.00759
Epoch: 35256, Training Loss: 0.00760
Epoch: 35256, Training Loss: 0.00933
Epoch: 35257, Training Loss: 0.00730
Epoch: 35257, Training Loss: 0.00759
Epoch: 35257, Training Loss: 0.00760
Epoch: 35257, Training Loss: 0.00933
Epoch: 35258, Training Loss: 0.00730
Epoch: 35258, Training Loss: 0.00759
Epoch: 35258, Training Loss: 0.00760
Epoch: 35258, Training Loss: 0.00933
Epoch: 35259, Training Loss: 0.00730
Epoch: 35259, Training Loss: 0.00759
Epoch: 35259, Training Loss: 0.00760
Epoch: 35259, Training Loss: 0.00933
Epoch: 35260, Training Loss: 0.00730
Epoch: 35260, Training Loss: 0.00759
Epoch: 35260, Training Loss: 0.00760
Epoch: 35260, Training Loss: 0.00933
Epoch: 35261, Training Loss: 0.00730
Epoch: 35261, Training Loss: 0.00759
Epoch: 35261, Training Loss: 0.00760
Epoch: 35261, Training Loss: 0.00933
Epoch: 35262, Training Loss: 0.00730
Epoch: 35262, Training Loss: 0.00759
Epoch: 35262, Training Loss: 0.00760
Epoch: 35262, Training Loss: 0.00933
Epoch: 35263, Training Loss: 0.00730
Epoch: 35263, Training Loss: 0.00759
Epoch: 35263, Training Loss: 0.00760
Epoch: 35263, Training Loss: 0.00933
Epoch: 35264, Training Loss: 0.00730
Epoch: 35264, Training Loss: 0.00759
Epoch: 35264, Training Loss: 0.00760
Epoch: 35264, Training Loss: 0.00933
Epoch: 35265, Training Loss: 0.00730
Epoch: 35265, Training Loss: 0.00759
Epoch: 35265, Training Loss: 0.00760
Epoch: 35265, Training Loss: 0.00933
Epoch: 35266, Training Loss: 0.00730
Epoch: 35266, Training Loss: 0.00759
Epoch: 35266, Training Loss: 0.00760
Epoch: 35266, Training Loss: 0.00933
Epoch: 35267, Training Loss: 0.00730
Epoch: 35267, Training Loss: 0.00759
Epoch: 35267, Training Loss: 0.00760
Epoch: 35267, Training Loss: 0.00933
Epoch: 35268, Training Loss: 0.00730
Epoch: 35268, Training Loss: 0.00759
Epoch: 35268, Training Loss: 0.00760
Epoch: 35268, Training Loss: 0.00933
Epoch: 35269, Training Loss: 0.00730
Epoch: 35269, Training Loss: 0.00759
Epoch: 35269, Training Loss: 0.00760
Epoch: 35269, Training Loss: 0.00933
Epoch: 35270, Training Loss: 0.00730
Epoch: 35270, Training Loss: 0.00759
Epoch: 35270, Training Loss: 0.00760
Epoch: 35270, Training Loss: 0.00933
Epoch: 35271, Training Loss: 0.00730
Epoch: 35271, Training Loss: 0.00759
Epoch: 35271, Training Loss: 0.00760
Epoch: 35271, Training Loss: 0.00933
Epoch: 35272, Training Loss: 0.00730
Epoch: 35272, Training Loss: 0.00759
Epoch: 35272, Training Loss: 0.00760
Epoch: 35272, Training Loss: 0.00933
Epoch: 35273, Training Loss: 0.00730
Epoch: 35273, Training Loss: 0.00759
Epoch: 35273, Training Loss: 0.00760
Epoch: 35273, Training Loss: 0.00933
Epoch: 35274, Training Loss: 0.00730
Epoch: 35274, Training Loss: 0.00759
Epoch: 35274, Training Loss: 0.00760
Epoch: 35274, Training Loss: 0.00933
Epoch: 35275, Training Loss: 0.00730
Epoch: 35275, Training Loss: 0.00759
Epoch: 35275, Training Loss: 0.00760
Epoch: 35275, Training Loss: 0.00933
Epoch: 35276, Training Loss: 0.00730
Epoch: 35276, Training Loss: 0.00759
Epoch: 35276, Training Loss: 0.00760
Epoch: 35276, Training Loss: 0.00933
Epoch: 35277, Training Loss: 0.00730
Epoch: 35277, Training Loss: 0.00759
Epoch: 35277, Training Loss: 0.00760
Epoch: 35277, Training Loss: 0.00933
Epoch: 35278, Training Loss: 0.00730
Epoch: 35278, Training Loss: 0.00759
Epoch: 35278, Training Loss: 0.00760
Epoch: 35278, Training Loss: 0.00933
Epoch: 35279, Training Loss: 0.00730
Epoch: 35279, Training Loss: 0.00759
Epoch: 35279, Training Loss: 0.00760
Epoch: 35279, Training Loss: 0.00933
Epoch: 35280, Training Loss: 0.00730
Epoch: 35280, Training Loss: 0.00759
Epoch: 35280, Training Loss: 0.00760
Epoch: 35280, Training Loss: 0.00932
Epoch: 35281, Training Loss: 0.00730
Epoch: 35281, Training Loss: 0.00759
Epoch: 35281, Training Loss: 0.00760
Epoch: 35281, Training Loss: 0.00932
Epoch: 35282, Training Loss: 0.00730
Epoch: 35282, Training Loss: 0.00759
Epoch: 35282, Training Loss: 0.00760
Epoch: 35282, Training Loss: 0.00932
Epoch: 35283, Training Loss: 0.00730
Epoch: 35283, Training Loss: 0.00759
Epoch: 35283, Training Loss: 0.00760
Epoch: 35283, Training Loss: 0.00932
Epoch: 35284, Training Loss: 0.00730
Epoch: 35284, Training Loss: 0.00759
Epoch: 35284, Training Loss: 0.00760
Epoch: 35284, Training Loss: 0.00932
Epoch: 35285, Training Loss: 0.00730
Epoch: 35285, Training Loss: 0.00759
Epoch: 35285, Training Loss: 0.00760
Epoch: 35285, Training Loss: 0.00932
Epoch: 35286, Training Loss: 0.00730
Epoch: 35286, Training Loss: 0.00759
Epoch: 35286, Training Loss: 0.00760
Epoch: 35286, Training Loss: 0.00932
Epoch: 35287, Training Loss: 0.00730
Epoch: 35287, Training Loss: 0.00759
Epoch: 35287, Training Loss: 0.00760
Epoch: 35287, Training Loss: 0.00932
Epoch: 35288, Training Loss: 0.00730
Epoch: 35288, Training Loss: 0.00759
Epoch: 35288, Training Loss: 0.00760
Epoch: 35288, Training Loss: 0.00932
Epoch: 35289, Training Loss: 0.00730
Epoch: 35289, Training Loss: 0.00759
Epoch: 35289, Training Loss: 0.00760
Epoch: 35289, Training Loss: 0.00932
Epoch: 35290, Training Loss: 0.00730
Epoch: 35290, Training Loss: 0.00759
Epoch: 35290, Training Loss: 0.00760
Epoch: 35290, Training Loss: 0.00932
Epoch: 35291, Training Loss: 0.00730
Epoch: 35291, Training Loss: 0.00759
Epoch: 35291, Training Loss: 0.00760
Epoch: 35291, Training Loss: 0.00932
Epoch: 35292, Training Loss: 0.00730
Epoch: 35292, Training Loss: 0.00759
Epoch: 35292, Training Loss: 0.00760
Epoch: 35292, Training Loss: 0.00932
Epoch: 35293, Training Loss: 0.00730
Epoch: 35293, Training Loss: 0.00759
Epoch: 35293, Training Loss: 0.00760
Epoch: 35293, Training Loss: 0.00932
Epoch: 35294, Training Loss: 0.00730
Epoch: 35294, Training Loss: 0.00759
Epoch: 35294, Training Loss: 0.00760
Epoch: 35294, Training Loss: 0.00932
Epoch: 35295, Training Loss: 0.00730
Epoch: 35295, Training Loss: 0.00759
Epoch: 35295, Training Loss: 0.00760
Epoch: 35295, Training Loss: 0.00932
Epoch: 35296, Training Loss: 0.00730
Epoch: 35296, Training Loss: 0.00759
Epoch: 35296, Training Loss: 0.00760
Epoch: 35297, Training Loss: 0.00730
Epoch: 35297, Training Loss: 0.00759
Epoch: 35297, Training Loss: 0.00760
Epoch: 35297, Training Loss: 0.00932
...
Epoch: 35540, Training Loss: 0.00727
Epoch: 35540, Training Loss: 0.00756

[Output truncated. The loop prints the per-sample quadratic loss for each of the four XOR patterns once per epoch; around epochs 35,297 to 35,540 the four losses sit near 0.0073, 0.0076, 0.0076, and 0.0093, still above the 0.008 target for the last pattern and shrinking only in the fifth decimal place.]
Epoch: 35540, Training Loss: 0.00757
Epoch: 35540, Training Loss: 0.00929
Epoch: 35541, Training Loss: 0.00727
Epoch: 35541, Training Loss: 0.00756
Epoch: 35541, Training Loss: 0.00757
Epoch: 35541, Training Loss: 0.00929
Epoch: 35542, Training Loss: 0.00727
Epoch: 35542, Training Loss: 0.00756
Epoch: 35542, Training Loss: 0.00757
Epoch: 35542, Training Loss: 0.00929
Epoch: 35543, Training Loss: 0.00727
Epoch: 35543, Training Loss: 0.00756
Epoch: 35543, Training Loss: 0.00757
Epoch: 35543, Training Loss: 0.00929
Epoch: 35544, Training Loss: 0.00727
Epoch: 35544, Training Loss: 0.00756
Epoch: 35544, Training Loss: 0.00757
Epoch: 35544, Training Loss: 0.00929
Epoch: 35545, Training Loss: 0.00727
Epoch: 35545, Training Loss: 0.00756
Epoch: 35545, Training Loss: 0.00757
Epoch: 35545, Training Loss: 0.00929
Epoch: 35546, Training Loss: 0.00727
Epoch: 35546, Training Loss: 0.00756
Epoch: 35546, Training Loss: 0.00757
Epoch: 35546, Training Loss: 0.00929
Epoch: 35547, Training Loss: 0.00727
Epoch: 35547, Training Loss: 0.00756
Epoch: 35547, Training Loss: 0.00757
Epoch: 35547, Training Loss: 0.00929
Epoch: 35548, Training Loss: 0.00727
Epoch: 35548, Training Loss: 0.00756
Epoch: 35548, Training Loss: 0.00757
Epoch: 35548, Training Loss: 0.00929
Epoch: 35549, Training Loss: 0.00727
Epoch: 35549, Training Loss: 0.00756
Epoch: 35549, Training Loss: 0.00757
Epoch: 35549, Training Loss: 0.00929
Epoch: 35550, Training Loss: 0.00727
Epoch: 35550, Training Loss: 0.00756
Epoch: 35550, Training Loss: 0.00757
Epoch: 35550, Training Loss: 0.00929
Epoch: 35551, Training Loss: 0.00727
Epoch: 35551, Training Loss: 0.00756
Epoch: 35551, Training Loss: 0.00757
Epoch: 35551, Training Loss: 0.00929
Epoch: 35552, Training Loss: 0.00727
Epoch: 35552, Training Loss: 0.00756
Epoch: 35552, Training Loss: 0.00757
Epoch: 35552, Training Loss: 0.00928
Epoch: 35553, Training Loss: 0.00727
Epoch: 35553, Training Loss: 0.00756
Epoch: 35553, Training Loss: 0.00757
Epoch: 35553, Training Loss: 0.00928
Epoch: 35554, Training Loss: 0.00727
Epoch: 35554, Training Loss: 0.00756
Epoch: 35554, Training Loss: 0.00757
Epoch: 35554, Training Loss: 0.00928
Epoch: 35555, Training Loss: 0.00727
Epoch: 35555, Training Loss: 0.00756
Epoch: 35555, Training Loss: 0.00757
Epoch: 35555, Training Loss: 0.00928
Epoch: 35556, Training Loss: 0.00727
Epoch: 35556, Training Loss: 0.00756
Epoch: 35556, Training Loss: 0.00757
Epoch: 35556, Training Loss: 0.00928
Epoch: 35557, Training Loss: 0.00727
Epoch: 35557, Training Loss: 0.00756
Epoch: 35557, Training Loss: 0.00757
Epoch: 35557, Training Loss: 0.00928
Epoch: 35558, Training Loss: 0.00727
Epoch: 35558, Training Loss: 0.00756
Epoch: 35558, Training Loss: 0.00757
Epoch: 35558, Training Loss: 0.00928
Epoch: 35559, Training Loss: 0.00727
Epoch: 35559, Training Loss: 0.00756
Epoch: 35559, Training Loss: 0.00757
Epoch: 35559, Training Loss: 0.00928
Epoch: 35560, Training Loss: 0.00727
Epoch: 35560, Training Loss: 0.00756
Epoch: 35560, Training Loss: 0.00757
Epoch: 35560, Training Loss: 0.00928
Epoch: 35561, Training Loss: 0.00727
Epoch: 35561, Training Loss: 0.00756
Epoch: 35561, Training Loss: 0.00757
Epoch: 35561, Training Loss: 0.00928
Epoch: 35562, Training Loss: 0.00727
Epoch: 35562, Training Loss: 0.00756
Epoch: 35562, Training Loss: 0.00757
Epoch: 35562, Training Loss: 0.00928
Epoch: 35563, Training Loss: 0.00727
Epoch: 35563, Training Loss: 0.00756
Epoch: 35563, Training Loss: 0.00757
Epoch: 35563, Training Loss: 0.00928
Epoch: 35564, Training Loss: 0.00727
Epoch: 35564, Training Loss: 0.00756
Epoch: 35564, Training Loss: 0.00756
Epoch: 35564, Training Loss: 0.00928
Epoch: 35565, Training Loss: 0.00727
Epoch: 35565, Training Loss: 0.00756
Epoch: 35565, Training Loss: 0.00756
Epoch: 35565, Training Loss: 0.00928
Epoch: 35566, Training Loss: 0.00727
Epoch: 35566, Training Loss: 0.00756
Epoch: 35566, Training Loss: 0.00756
Epoch: 35566, Training Loss: 0.00928
Epoch: 35567, Training Loss: 0.00727
Epoch: 35567, Training Loss: 0.00756
Epoch: 35567, Training Loss: 0.00756
Epoch: 35567, Training Loss: 0.00928
Epoch: 35568, Training Loss: 0.00726
Epoch: 35568, Training Loss: 0.00756
Epoch: 35568, Training Loss: 0.00756
Epoch: 35568, Training Loss: 0.00928
Epoch: 35569, Training Loss: 0.00726
Epoch: 35569, Training Loss: 0.00756
Epoch: 35569, Training Loss: 0.00756
Epoch: 35569, Training Loss: 0.00928
Epoch: 35570, Training Loss: 0.00726
Epoch: 35570, Training Loss: 0.00756
Epoch: 35570, Training Loss: 0.00756
Epoch: 35570, Training Loss: 0.00928
Epoch: 35571, Training Loss: 0.00726
Epoch: 35571, Training Loss: 0.00756
Epoch: 35571, Training Loss: 0.00756
Epoch: 35571, Training Loss: 0.00928
Epoch: 35572, Training Loss: 0.00726
Epoch: 35572, Training Loss: 0.00756
Epoch: 35572, Training Loss: 0.00756
Epoch: 35572, Training Loss: 0.00928
Epoch: 35573, Training Loss: 0.00726
Epoch: 35573, Training Loss: 0.00756
Epoch: 35573, Training Loss: 0.00756
Epoch: 35573, Training Loss: 0.00928
Epoch: 35574, Training Loss: 0.00726
Epoch: 35574, Training Loss: 0.00756
Epoch: 35574, Training Loss: 0.00756
Epoch: 35574, Training Loss: 0.00928
Epoch: 35575, Training Loss: 0.00726
Epoch: 35575, Training Loss: 0.00755
Epoch: 35575, Training Loss: 0.00756
Epoch: 35575, Training Loss: 0.00928
Epoch: 35576, Training Loss: 0.00726
Epoch: 35576, Training Loss: 0.00755
Epoch: 35576, Training Loss: 0.00756
Epoch: 35576, Training Loss: 0.00928
Epoch: 35577, Training Loss: 0.00726
Epoch: 35577, Training Loss: 0.00755
Epoch: 35577, Training Loss: 0.00756
Epoch: 35577, Training Loss: 0.00928
Epoch: 35578, Training Loss: 0.00726
Epoch: 35578, Training Loss: 0.00755
Epoch: 35578, Training Loss: 0.00756
Epoch: 35578, Training Loss: 0.00928
Epoch: 35579, Training Loss: 0.00726
Epoch: 35579, Training Loss: 0.00755
Epoch: 35579, Training Loss: 0.00756
Epoch: 35579, Training Loss: 0.00928
Epoch: 35580, Training Loss: 0.00726
Epoch: 35580, Training Loss: 0.00755
Epoch: 35580, Training Loss: 0.00756
Epoch: 35580, Training Loss: 0.00928
Epoch: 35581, Training Loss: 0.00726
Epoch: 35581, Training Loss: 0.00755
Epoch: 35581, Training Loss: 0.00756
Epoch: 35581, Training Loss: 0.00928
Epoch: 35582, Training Loss: 0.00726
Epoch: 35582, Training Loss: 0.00755
Epoch: 35582, Training Loss: 0.00756
Epoch: 35582, Training Loss: 0.00928
Epoch: 35583, Training Loss: 0.00726
Epoch: 35583, Training Loss: 0.00755
Epoch: 35583, Training Loss: 0.00756
Epoch: 35583, Training Loss: 0.00928
Epoch: 35584, Training Loss: 0.00726
Epoch: 35584, Training Loss: 0.00755
Epoch: 35584, Training Loss: 0.00756
Epoch: 35584, Training Loss: 0.00928
Epoch: 35585, Training Loss: 0.00726
Epoch: 35585, Training Loss: 0.00755
Epoch: 35585, Training Loss: 0.00756
Epoch: 35585, Training Loss: 0.00928
Epoch: 35586, Training Loss: 0.00726
Epoch: 35586, Training Loss: 0.00755
Epoch: 35586, Training Loss: 0.00756
Epoch: 35586, Training Loss: 0.00928
Epoch: 35587, Training Loss: 0.00726
Epoch: 35587, Training Loss: 0.00755
Epoch: 35587, Training Loss: 0.00756
Epoch: 35587, Training Loss: 0.00928
Epoch: 35588, Training Loss: 0.00726
Epoch: 35588, Training Loss: 0.00755
Epoch: 35588, Training Loss: 0.00756
Epoch: 35588, Training Loss: 0.00928
Epoch: 35589, Training Loss: 0.00726
Epoch: 35589, Training Loss: 0.00755
Epoch: 35589, Training Loss: 0.00756
Epoch: 35589, Training Loss: 0.00928
Epoch: 35590, Training Loss: 0.00726
Epoch: 35590, Training Loss: 0.00755
Epoch: 35590, Training Loss: 0.00756
Epoch: 35590, Training Loss: 0.00928
Epoch: 35591, Training Loss: 0.00726
Epoch: 35591, Training Loss: 0.00755
Epoch: 35591, Training Loss: 0.00756
Epoch: 35591, Training Loss: 0.00928
Epoch: 35592, Training Loss: 0.00726
Epoch: 35592, Training Loss: 0.00755
Epoch: 35592, Training Loss: 0.00756
Epoch: 35592, Training Loss: 0.00928
Epoch: 35593, Training Loss: 0.00726
Epoch: 35593, Training Loss: 0.00755
Epoch: 35593, Training Loss: 0.00756
Epoch: 35593, Training Loss: 0.00928
Epoch: 35594, Training Loss: 0.00726
Epoch: 35594, Training Loss: 0.00755
Epoch: 35594, Training Loss: 0.00756
Epoch: 35594, Training Loss: 0.00928
Epoch: 35595, Training Loss: 0.00726
Epoch: 35595, Training Loss: 0.00755
Epoch: 35595, Training Loss: 0.00756
Epoch: 35595, Training Loss: 0.00928
Epoch: 35596, Training Loss: 0.00726
Epoch: 35596, Training Loss: 0.00755
Epoch: 35596, Training Loss: 0.00756
Epoch: 35596, Training Loss: 0.00928
Epoch: 35597, Training Loss: 0.00726
Epoch: 35597, Training Loss: 0.00755
Epoch: 35597, Training Loss: 0.00756
Epoch: 35597, Training Loss: 0.00928
Epoch: 35598, Training Loss: 0.00726
Epoch: 35598, Training Loss: 0.00755
Epoch: 35598, Training Loss: 0.00756
Epoch: 35598, Training Loss: 0.00928
Epoch: 35599, Training Loss: 0.00726
Epoch: 35599, Training Loss: 0.00755
Epoch: 35599, Training Loss: 0.00756
Epoch: 35599, Training Loss: 0.00928
Epoch: 35600, Training Loss: 0.00726
Epoch: 35600, Training Loss: 0.00755
Epoch: 35600, Training Loss: 0.00756
Epoch: 35600, Training Loss: 0.00928
Epoch: 35601, Training Loss: 0.00726
Epoch: 35601, Training Loss: 0.00755
Epoch: 35601, Training Loss: 0.00756
Epoch: 35601, Training Loss: 0.00928
Epoch: 35602, Training Loss: 0.00726
Epoch: 35602, Training Loss: 0.00755
Epoch: 35602, Training Loss: 0.00756
Epoch: 35602, Training Loss: 0.00928
Epoch: 35603, Training Loss: 0.00726
Epoch: 35603, Training Loss: 0.00755
Epoch: 35603, Training Loss: 0.00756
Epoch: 35603, Training Loss: 0.00928
Epoch: 35604, Training Loss: 0.00726
Epoch: 35604, Training Loss: 0.00755
Epoch: 35604, Training Loss: 0.00756
Epoch: 35604, Training Loss: 0.00928
Epoch: 35605, Training Loss: 0.00726
Epoch: 35605, Training Loss: 0.00755
Epoch: 35605, Training Loss: 0.00756
Epoch: 35605, Training Loss: 0.00928
Epoch: 35606, Training Loss: 0.00726
Epoch: 35606, Training Loss: 0.00755
Epoch: 35606, Training Loss: 0.00756
Epoch: 35606, Training Loss: 0.00928
Epoch: 35607, Training Loss: 0.00726
Epoch: 35607, Training Loss: 0.00755
Epoch: 35607, Training Loss: 0.00756
Epoch: 35607, Training Loss: 0.00928
Epoch: 35608, Training Loss: 0.00726
Epoch: 35608, Training Loss: 0.00755
Epoch: 35608, Training Loss: 0.00756
Epoch: 35608, Training Loss: 0.00928
Epoch: 35609, Training Loss: 0.00726
Epoch: 35609, Training Loss: 0.00755
Epoch: 35609, Training Loss: 0.00756
Epoch: 35609, Training Loss: 0.00928
Epoch: 35610, Training Loss: 0.00726
Epoch: 35610, Training Loss: 0.00755
Epoch: 35610, Training Loss: 0.00756
Epoch: 35610, Training Loss: 0.00928
Epoch: 35611, Training Loss: 0.00726
Epoch: 35611, Training Loss: 0.00755
Epoch: 35611, Training Loss: 0.00756
Epoch: 35611, Training Loss: 0.00928
Epoch: 35612, Training Loss: 0.00726
Epoch: 35612, Training Loss: 0.00755
Epoch: 35612, Training Loss: 0.00756
Epoch: 35612, Training Loss: 0.00928
Epoch: 35613, Training Loss: 0.00726
Epoch: 35613, Training Loss: 0.00755
Epoch: 35613, Training Loss: 0.00756
Epoch: 35613, Training Loss: 0.00928
Epoch: 35614, Training Loss: 0.00726
Epoch: 35614, Training Loss: 0.00755
Epoch: 35614, Training Loss: 0.00756
Epoch: 35614, Training Loss: 0.00928
Epoch: 35615, Training Loss: 0.00726
Epoch: 35615, Training Loss: 0.00755
Epoch: 35615, Training Loss: 0.00756
Epoch: 35615, Training Loss: 0.00928
Epoch: 35616, Training Loss: 0.00726
Epoch: 35616, Training Loss: 0.00755
Epoch: 35616, Training Loss: 0.00756
Epoch: 35616, Training Loss: 0.00928
Epoch: 35617, Training Loss: 0.00726
Epoch: 35617, Training Loss: 0.00755
Epoch: 35617, Training Loss: 0.00756
Epoch: 35617, Training Loss: 0.00928
Epoch: 35618, Training Loss: 0.00726
Epoch: 35618, Training Loss: 0.00755
Epoch: 35618, Training Loss: 0.00756
Epoch: 35618, Training Loss: 0.00928
Epoch: 35619, Training Loss: 0.00726
Epoch: 35619, Training Loss: 0.00755
Epoch: 35619, Training Loss: 0.00756
Epoch: 35619, Training Loss: 0.00928
Epoch: 35620, Training Loss: 0.00726
Epoch: 35620, Training Loss: 0.00755
Epoch: 35620, Training Loss: 0.00756
Epoch: 35620, Training Loss: 0.00927
Epoch: 35621, Training Loss: 0.00726
Epoch: 35621, Training Loss: 0.00755
Epoch: 35621, Training Loss: 0.00756
Epoch: 35621, Training Loss: 0.00927
Epoch: 35622, Training Loss: 0.00726
Epoch: 35622, Training Loss: 0.00755
Epoch: 35622, Training Loss: 0.00756
Epoch: 35622, Training Loss: 0.00927
Epoch: 35623, Training Loss: 0.00726
Epoch: 35623, Training Loss: 0.00755
Epoch: 35623, Training Loss: 0.00756
Epoch: 35623, Training Loss: 0.00927
Epoch: 35624, Training Loss: 0.00726
Epoch: 35624, Training Loss: 0.00755
Epoch: 35624, Training Loss: 0.00756
Epoch: 35624, Training Loss: 0.00927
Epoch: 35625, Training Loss: 0.00726
Epoch: 35625, Training Loss: 0.00755
Epoch: 35625, Training Loss: 0.00756
Epoch: 35625, Training Loss: 0.00927
Epoch: 35626, Training Loss: 0.00726
Epoch: 35626, Training Loss: 0.00755
Epoch: 35626, Training Loss: 0.00756
Epoch: 35626, Training Loss: 0.00927
Epoch: 35627, Training Loss: 0.00726
Epoch: 35627, Training Loss: 0.00755
Epoch: 35627, Training Loss: 0.00756
Epoch: 35627, Training Loss: 0.00927
Epoch: 35628, Training Loss: 0.00726
Epoch: 35628, Training Loss: 0.00755
Epoch: 35628, Training Loss: 0.00756
Epoch: 35628, Training Loss: 0.00927
Epoch: 35629, Training Loss: 0.00726
Epoch: 35629, Training Loss: 0.00755
Epoch: 35629, Training Loss: 0.00756
Epoch: 35629, Training Loss: 0.00927
Epoch: 35630, Training Loss: 0.00726
Epoch: 35630, Training Loss: 0.00755
Epoch: 35630, Training Loss: 0.00756
Epoch: 35630, Training Loss: 0.00927
Epoch: 35631, Training Loss: 0.00726
Epoch: 35631, Training Loss: 0.00755
Epoch: 35631, Training Loss: 0.00756
Epoch: 35631, Training Loss: 0.00927
Epoch: 35632, Training Loss: 0.00726
Epoch: 35632, Training Loss: 0.00755
Epoch: 35632, Training Loss: 0.00756
Epoch: 35632, Training Loss: 0.00927
Epoch: 35633, Training Loss: 0.00726
Epoch: 35633, Training Loss: 0.00755
Epoch: 35633, Training Loss: 0.00756
Epoch: 35633, Training Loss: 0.00927
Epoch: 35634, Training Loss: 0.00726
Epoch: 35634, Training Loss: 0.00755
Epoch: 35634, Training Loss: 0.00756
Epoch: 35634, Training Loss: 0.00927
Epoch: 35635, Training Loss: 0.00726
Epoch: 35635, Training Loss: 0.00755
Epoch: 35635, Training Loss: 0.00756
Epoch: 35635, Training Loss: 0.00927
Epoch: 35636, Training Loss: 0.00726
Epoch: 35636, Training Loss: 0.00755
Epoch: 35636, Training Loss: 0.00756
Epoch: 35636, Training Loss: 0.00927
Epoch: 35637, Training Loss: 0.00726
Epoch: 35637, Training Loss: 0.00755
Epoch: 35637, Training Loss: 0.00756
Epoch: 35637, Training Loss: 0.00927
Epoch: 35638, Training Loss: 0.00726
Epoch: 35638, Training Loss: 0.00755
Epoch: 35638, Training Loss: 0.00756
Epoch: 35638, Training Loss: 0.00927
Epoch: 35639, Training Loss: 0.00726
Epoch: 35639, Training Loss: 0.00755
Epoch: 35639, Training Loss: 0.00756
Epoch: 35639, Training Loss: 0.00927
Epoch: 35640, Training Loss: 0.00726
Epoch: 35640, Training Loss: 0.00755
Epoch: 35640, Training Loss: 0.00756
Epoch: 35640, Training Loss: 0.00927
Epoch: 35641, Training Loss: 0.00726
Epoch: 35641, Training Loss: 0.00755
Epoch: 35641, Training Loss: 0.00756
Epoch: 35641, Training Loss: 0.00927
Epoch: 35642, Training Loss: 0.00726
Epoch: 35642, Training Loss: 0.00755
Epoch: 35642, Training Loss: 0.00756
Epoch: 35642, Training Loss: 0.00927
Epoch: 35643, Training Loss: 0.00726
Epoch: 35643, Training Loss: 0.00755
Epoch: 35643, Training Loss: 0.00756
Epoch: 35643, Training Loss: 0.00927
Epoch: 35644, Training Loss: 0.00726
Epoch: 35644, Training Loss: 0.00755
Epoch: 35644, Training Loss: 0.00756
Epoch: 35644, Training Loss: 0.00927
Epoch: 35645, Training Loss: 0.00726
Epoch: 35645, Training Loss: 0.00755
Epoch: 35645, Training Loss: 0.00756
Epoch: 35645, Training Loss: 0.00927
Epoch: 35646, Training Loss: 0.00726
Epoch: 35646, Training Loss: 0.00755
Epoch: 35646, Training Loss: 0.00756
Epoch: 35646, Training Loss: 0.00927
Epoch: 35647, Training Loss: 0.00726
Epoch: 35647, Training Loss: 0.00755
Epoch: 35647, Training Loss: 0.00756
Epoch: 35647, Training Loss: 0.00927
Epoch: 35648, Training Loss: 0.00726
Epoch: 35648, Training Loss: 0.00755
Epoch: 35648, Training Loss: 0.00756
Epoch: 35648, Training Loss: 0.00927
Epoch: 35649, Training Loss: 0.00726
Epoch: 35649, Training Loss: 0.00755
Epoch: 35649, Training Loss: 0.00755
Epoch: 35649, Training Loss: 0.00927
Epoch: 35650, Training Loss: 0.00726
Epoch: 35650, Training Loss: 0.00755
Epoch: 35650, Training Loss: 0.00755
Epoch: 35650, Training Loss: 0.00927
Epoch: 35651, Training Loss: 0.00726
Epoch: 35651, Training Loss: 0.00755
Epoch: 35651, Training Loss: 0.00755
Epoch: 35651, Training Loss: 0.00927
Epoch: 35652, Training Loss: 0.00726
Epoch: 35652, Training Loss: 0.00755
Epoch: 35652, Training Loss: 0.00755
Epoch: 35652, Training Loss: 0.00927
Epoch: 35653, Training Loss: 0.00726
Epoch: 35653, Training Loss: 0.00755
Epoch: 35653, Training Loss: 0.00755
Epoch: 35653, Training Loss: 0.00927
Epoch: 35654, Training Loss: 0.00726
Epoch: 35654, Training Loss: 0.00755
Epoch: 35654, Training Loss: 0.00755
Epoch: 35654, Training Loss: 0.00927
Epoch: 35655, Training Loss: 0.00726
Epoch: 35655, Training Loss: 0.00755
Epoch: 35655, Training Loss: 0.00755
Epoch: 35655, Training Loss: 0.00927
Epoch: 35656, Training Loss: 0.00726
Epoch: 35656, Training Loss: 0.00755
Epoch: 35656, Training Loss: 0.00755
Epoch: 35656, Training Loss: 0.00927
Epoch: 35657, Training Loss: 0.00726
Epoch: 35657, Training Loss: 0.00755
Epoch: 35657, Training Loss: 0.00755
Epoch: 35657, Training Loss: 0.00927
Epoch: 35658, Training Loss: 0.00726
Epoch: 35658, Training Loss: 0.00755
Epoch: 35658, Training Loss: 0.00755
Epoch: 35658, Training Loss: 0.00927
Epoch: 35659, Training Loss: 0.00725
Epoch: 35659, Training Loss: 0.00755
Epoch: 35659, Training Loss: 0.00755
Epoch: 35659, Training Loss: 0.00927
Epoch: 35660, Training Loss: 0.00725
Epoch: 35660, Training Loss: 0.00754
Epoch: 35660, Training Loss: 0.00755
Epoch: 35660, Training Loss: 0.00927
Epoch: 35661, Training Loss: 0.00725
Epoch: 35661, Training Loss: 0.00754
Epoch: 35661, Training Loss: 0.00755
Epoch: 35661, Training Loss: 0.00927
Epoch: 35662, Training Loss: 0.00725
Epoch: 35662, Training Loss: 0.00754
Epoch: 35662, Training Loss: 0.00755
Epoch: 35662, Training Loss: 0.00927
Epoch: 35663, Training Loss: 0.00725
Epoch: 35663, Training Loss: 0.00754
Epoch: 35663, Training Loss: 0.00755
Epoch: 35663, Training Loss: 0.00927
Epoch: 35664, Training Loss: 0.00725
Epoch: 35664, Training Loss: 0.00754
Epoch: 35664, Training Loss: 0.00755
Epoch: 35664, Training Loss: 0.00927
Epoch: 35665, Training Loss: 0.00725
Epoch: 35665, Training Loss: 0.00754
Epoch: 35665, Training Loss: 0.00755
Epoch: 35665, Training Loss: 0.00927
Epoch: 35666, Training Loss: 0.00725
Epoch: 35666, Training Loss: 0.00754
Epoch: 35666, Training Loss: 0.00755
Epoch: 35666, Training Loss: 0.00927
Epoch: 35667, Training Loss: 0.00725
Epoch: 35667, Training Loss: 0.00754
Epoch: 35667, Training Loss: 0.00755
Epoch: 35667, Training Loss: 0.00927
Epoch: 35668, Training Loss: 0.00725
Epoch: 35668, Training Loss: 0.00754
Epoch: 35668, Training Loss: 0.00755
Epoch: 35668, Training Loss: 0.00927
Epoch: 35669, Training Loss: 0.00725
Epoch: 35669, Training Loss: 0.00754
Epoch: 35669, Training Loss: 0.00755
Epoch: 35669, Training Loss: 0.00927
Epoch: 35670, Training Loss: 0.00725
Epoch: 35670, Training Loss: 0.00754
Epoch: 35670, Training Loss: 0.00755
Epoch: 35670, Training Loss: 0.00927
Epoch: 35671, Training Loss: 0.00725
Epoch: 35671, Training Loss: 0.00754
Epoch: 35671, Training Loss: 0.00755
Epoch: 35671, Training Loss: 0.00927
Epoch: 35672, Training Loss: 0.00725
Epoch: 35672, Training Loss: 0.00754
Epoch: 35672, Training Loss: 0.00755
Epoch: 35672, Training Loss: 0.00927
Epoch: 35673, Training Loss: 0.00725
Epoch: 35673, Training Loss: 0.00754
Epoch: 35673, Training Loss: 0.00755
Epoch: 35673, Training Loss: 0.00927
Epoch: 35674, Training Loss: 0.00725
Epoch: 35674, Training Loss: 0.00754
Epoch: 35674, Training Loss: 0.00755
Epoch: 35674, Training Loss: 0.00927
Epoch: 35675, Training Loss: 0.00725
Epoch: 35675, Training Loss: 0.00754
Epoch: 35675, Training Loss: 0.00755
Epoch: 35675, Training Loss: 0.00927
Epoch: 35676, Training Loss: 0.00725
Epoch: 35676, Training Loss: 0.00754
Epoch: 35676, Training Loss: 0.00755
Epoch: 35676, Training Loss: 0.00927
Epoch: 35677, Training Loss: 0.00725
Epoch: 35677, Training Loss: 0.00754
Epoch: 35677, Training Loss: 0.00755
Epoch: 35677, Training Loss: 0.00927
Epoch: 35678, Training Loss: 0.00725
Epoch: 35678, Training Loss: 0.00754
Epoch: 35678, Training Loss: 0.00755
Epoch: 35678, Training Loss: 0.00927
Epoch: 35679, Training Loss: 0.00725
Epoch: 35679, Training Loss: 0.00754
Epoch: 35679, Training Loss: 0.00755
Epoch: 35679, Training Loss: 0.00927
Epoch: 35680, Training Loss: 0.00725
Epoch: 35680, Training Loss: 0.00754
Epoch: 35680, Training Loss: 0.00755
Epoch: 35680, Training Loss: 0.00927
Epoch: 35681, Training Loss: 0.00725
Epoch: 35681, Training Loss: 0.00754
Epoch: 35681, Training Loss: 0.00755
Epoch: 35681, Training Loss: 0.00927
Epoch: 35682, Training Loss: 0.00725
Epoch: 35682, Training Loss: 0.00754
Epoch: 35682, Training Loss: 0.00755
Epoch: 35682, Training Loss: 0.00927
Epoch: 35683, Training Loss: 0.00725
Epoch: 35683, Training Loss: 0.00754
Epoch: 35683, Training Loss: 0.00755
Epoch: 35683, Training Loss: 0.00927
Epoch: 35684, Training Loss: 0.00725
Epoch: 35684, Training Loss: 0.00754
Epoch: 35684, Training Loss: 0.00755
Epoch: 35684, Training Loss: 0.00927
Epoch: 35685, Training Loss: 0.00725
Epoch: 35685, Training Loss: 0.00754
Epoch: 35685, Training Loss: 0.00755
Epoch: 35685, Training Loss: 0.00927
Epoch: 35686, Training Loss: 0.00725
Epoch: 35686, Training Loss: 0.00754
Epoch: 35686, Training Loss: 0.00755
Epoch: 35686, Training Loss: 0.00927
Epoch: 35687, Training Loss: 0.00725
Epoch: 35687, Training Loss: 0.00754
Epoch: 35687, Training Loss: 0.00755
Epoch: 35687, Training Loss: 0.00927
Epoch: 35688, Training Loss: 0.00725
Epoch: 35688, Training Loss: 0.00754
Epoch: 35688, Training Loss: 0.00755
Epoch: 35688, Training Loss: 0.00927
Epoch: 35689, Training Loss: 0.00725
Epoch: 35689, Training Loss: 0.00754
Epoch: 35689, Training Loss: 0.00755
Epoch: 35689, Training Loss: 0.00926
Epoch: 35690, Training Loss: 0.00725
Epoch: 35690, Training Loss: 0.00754
Epoch: 35690, Training Loss: 0.00755
Epoch: 35690, Training Loss: 0.00926
Epoch: 35691, Training Loss: 0.00725
Epoch: 35691, Training Loss: 0.00754
Epoch: 35691, Training Loss: 0.00755
Epoch: 35691, Training Loss: 0.00926
Epoch: 35692, Training Loss: 0.00725
Epoch: 35692, Training Loss: 0.00754
Epoch: 35692, Training Loss: 0.00755
Epoch: 35692, Training Loss: 0.00926
Epoch: 35693, Training Loss: 0.00725
Epoch: 35693, Training Loss: 0.00754
Epoch: 35693, Training Loss: 0.00755
Epoch: 35693, Training Loss: 0.00926
Epoch: 35694, Training Loss: 0.00725
Epoch: 35694, Training Loss: 0.00754
Epoch: 35694, Training Loss: 0.00755
Epoch: 35694, Training Loss: 0.00926
Epoch: 35695, Training Loss: 0.00725
Epoch: 35695, Training Loss: 0.00754
Epoch: 35695, Training Loss: 0.00755
Epoch: 35695, Training Loss: 0.00926
Epoch: 35696, Training Loss: 0.00725
Epoch: 35696, Training Loss: 0.00754
Epoch: 35696, Training Loss: 0.00755
Epoch: 35696, Training Loss: 0.00926
Epoch: 35697, Training Loss: 0.00725
Epoch: 35697, Training Loss: 0.00754
Epoch: 35697, Training Loss: 0.00755
Epoch: 35697, Training Loss: 0.00926
Epoch: 35698, Training Loss: 0.00725
Epoch: 35698, Training Loss: 0.00754
Epoch: 35698, Training Loss: 0.00755
Epoch: 35698, Training Loss: 0.00926
Epoch: 35699, Training Loss: 0.00725
Epoch: 35699, Training Loss: 0.00754
Epoch: 35699, Training Loss: 0.00755
Epoch: 35699, Training Loss: 0.00926
Epoch: 35700, Training Loss: 0.00725
Epoch: 35700, Training Loss: 0.00754
Epoch: 35700, Training Loss: 0.00755
Epoch: 35700, Training Loss: 0.00926
Epoch: 35701, Training Loss: 0.00725
Epoch: 35701, Training Loss: 0.00754
Epoch: 35701, Training Loss: 0.00755
Epoch: 35701, Training Loss: 0.00926
Epoch: 35702, Training Loss: 0.00725
Epoch: 35702, Training Loss: 0.00754
Epoch: 35702, Training Loss: 0.00755
Epoch: 35702, Training Loss: 0.00926
Epoch: 35703, Training Loss: 0.00725
Epoch: 35703, Training Loss: 0.00754
Epoch: 35703, Training Loss: 0.00755
Epoch: 35703, Training Loss: 0.00926
Epoch: 35704, Training Loss: 0.00725
Epoch: 35704, Training Loss: 0.00754
Epoch: 35704, Training Loss: 0.00755
Epoch: 35704, Training Loss: 0.00926
Epoch: 35705, Training Loss: 0.00725
Epoch: 35705, Training Loss: 0.00754
Epoch: 35705, Training Loss: 0.00755
Epoch: 35705, Training Loss: 0.00926
Epoch: 35706, Training Loss: 0.00725
Epoch: 35706, Training Loss: 0.00754
Epoch: 35706, Training Loss: 0.00755
Epoch: 35706, Training Loss: 0.00926
Epoch: 35707, Training Loss: 0.00725
Epoch: 35707, Training Loss: 0.00754
Epoch: 35707, Training Loss: 0.00755
Epoch: 35707, Training Loss: 0.00926
Epoch: 35708, Training Loss: 0.00725
Epoch: 35708, Training Loss: 0.00754
Epoch: 35708, Training Loss: 0.00755
Epoch: 35708, Training Loss: 0.00926
Epoch: 35709, Training Loss: 0.00725
Epoch: 35709, Training Loss: 0.00754
Epoch: 35709, Training Loss: 0.00755
Epoch: 35709, Training Loss: 0.00926
Epoch: 35710, Training Loss: 0.00725
Epoch: 35710, Training Loss: 0.00754
Epoch: 35710, Training Loss: 0.00755
Epoch: 35710, Training Loss: 0.00926
Epoch: 35711, Training Loss: 0.00725
Epoch: 35711, Training Loss: 0.00754
Epoch: 35711, Training Loss: 0.00755
Epoch: 35711, Training Loss: 0.00926
Epoch: 35712, Training Loss: 0.00725
Epoch: 35712, Training Loss: 0.00754
Epoch: 35712, Training Loss: 0.00755
Epoch: 35712, Training Loss: 0.00926
Epoch: 35713, Training Loss: 0.00725
Epoch: 35713, Training Loss: 0.00754
Epoch: 35713, Training Loss: 0.00755
Epoch: 35713, Training Loss: 0.00926
Epoch: 35714, Training Loss: 0.00725
Epoch: 35714, Training Loss: 0.00754
Epoch: 35714, Training Loss: 0.00755
Epoch: 35714, Training Loss: 0.00926
Epoch: 35715, Training Loss: 0.00725
Epoch: 35715, Training Loss: 0.00754
Epoch: 35715, Training Loss: 0.00755
Epoch: 35715, Training Loss: 0.00926
Epoch: 35716, Training Loss: 0.00725
Epoch: 35716, Training Loss: 0.00754
Epoch: 35716, Training Loss: 0.00755
Epoch: 35716, Training Loss: 0.00926
Epoch: 35717, Training Loss: 0.00725
Epoch: 35717, Training Loss: 0.00754
Epoch: 35717, Training Loss: 0.00755
Epoch: 35717, Training Loss: 0.00926
Epoch: 35718, Training Loss: 0.00725
Epoch: 35718, Training Loss: 0.00754
Epoch: 35718, Training Loss: 0.00755
Epoch: 35718, Training Loss: 0.00926
Epoch: 35719, Training Loss: 0.00725
Epoch: 35719, Training Loss: 0.00754
Epoch: 35719, Training Loss: 0.00755
Epoch: 35719, Training Loss: 0.00926
Epoch: 35720, Training Loss: 0.00725
Epoch: 35720, Training Loss: 0.00754
Epoch: 35720, Training Loss: 0.00755
Epoch: 35720, Training Loss: 0.00926
Epoch: 35721, Training Loss: 0.00725
Epoch: 35721, Training Loss: 0.00754
Epoch: 35721, Training Loss: 0.00755
Epoch: 35721, Training Loss: 0.00926
Epoch: 35722, Training Loss: 0.00725
Epoch: 35722, Training Loss: 0.00754
Epoch: 35722, Training Loss: 0.00755
Epoch: 35722, Training Loss: 0.00926
Epoch: 35723, Training Loss: 0.00725
Epoch: 35723, Training Loss: 0.00754
Epoch: 35723, Training Loss: 0.00755
Epoch: 35723, Training Loss: 0.00926
Epoch: 35724, Training Loss: 0.00725
Epoch: 35724, Training Loss: 0.00754
Epoch: 35724, Training Loss: 0.00755
Epoch: 35724, Training Loss: 0.00926
Epoch: 35725, Training Loss: 0.00725
Epoch: 35725, Training Loss: 0.00754
Epoch: 35725, Training Loss: 0.00755
Epoch: 35725, Training Loss: 0.00926
Epoch: 35726, Training Loss: 0.00725
Epoch: 35726, Training Loss: 0.00754
Epoch: 35726, Training Loss: 0.00755
Epoch: 35726, Training Loss: 0.00926
Epoch: 35727, Training Loss: 0.00725
Epoch: 35727, Training Loss: 0.00754
Epoch: 35727, Training Loss: 0.00755
Epoch: 35727, Training Loss: 0.00926
Epoch: 35728, Training Loss: 0.00725
Epoch: 35728, Training Loss: 0.00754
Epoch: 35728, Training Loss: 0.00755
Epoch: 35728, Training Loss: 0.00926
Epoch: 35729, Training Loss: 0.00725
Epoch: 35729, Training Loss: 0.00754
Epoch: 35729, Training Loss: 0.00755
Epoch: 35729, Training Loss: 0.00926
[... output truncated: epochs 35730-35971 elided; the four per-pattern losses change only in the fifth decimal place over this range ...]
Epoch: 35972, Training Loss: 0.00722
Epoch: 35972, Training Loss: 0.00751
Epoch: 35972, Training Loss: 0.00752
Epoch: 35972, Training Loss: 0.00923
Epoch: 35973, Training Loss: 0.00722
Epoch: 35973, Training Loss: 0.00751
Epoch: 35973, Training Loss: 0.00752
Epoch: 35973, Training Loss: 0.00922
Epoch: 35974, Training Loss: 0.00722
Epoch: 35974, Training Loss: 0.00751
Epoch: 35974, Training Loss: 0.00752
Epoch: 35974, Training Loss: 0.00922
Epoch: 35975, Training Loss: 0.00722
Epoch: 35975, Training Loss: 0.00751
Epoch: 35975, Training Loss: 0.00752
Epoch: 35975, Training Loss: 0.00922
Epoch: 35976, Training Loss: 0.00722
Epoch: 35976, Training Loss: 0.00751
Epoch: 35976, Training Loss: 0.00752
Epoch: 35976, Training Loss: 0.00922
Epoch: 35977, Training Loss: 0.00722
Epoch: 35977, Training Loss: 0.00751
Epoch: 35977, Training Loss: 0.00752
Epoch: 35977, Training Loss: 0.00922
Epoch: 35978, Training Loss: 0.00722
Epoch: 35978, Training Loss: 0.00751
Epoch: 35978, Training Loss: 0.00752
Epoch: 35978, Training Loss: 0.00922
Epoch: 35979, Training Loss: 0.00722
Epoch: 35979, Training Loss: 0.00751
Epoch: 35979, Training Loss: 0.00752
Epoch: 35979, Training Loss: 0.00922
Epoch: 35980, Training Loss: 0.00722
Epoch: 35980, Training Loss: 0.00751
Epoch: 35980, Training Loss: 0.00752
Epoch: 35980, Training Loss: 0.00922
Epoch: 35981, Training Loss: 0.00722
Epoch: 35981, Training Loss: 0.00751
Epoch: 35981, Training Loss: 0.00752
Epoch: 35981, Training Loss: 0.00922
Epoch: 35982, Training Loss: 0.00722
Epoch: 35982, Training Loss: 0.00751
Epoch: 35982, Training Loss: 0.00752
Epoch: 35982, Training Loss: 0.00922
Epoch: 35983, Training Loss: 0.00722
Epoch: 35983, Training Loss: 0.00751
Epoch: 35983, Training Loss: 0.00752
Epoch: 35983, Training Loss: 0.00922
Epoch: 35984, Training Loss: 0.00722
Epoch: 35984, Training Loss: 0.00751
Epoch: 35984, Training Loss: 0.00752
Epoch: 35984, Training Loss: 0.00922
Epoch: 35985, Training Loss: 0.00722
Epoch: 35985, Training Loss: 0.00751
Epoch: 35985, Training Loss: 0.00752
Epoch: 35985, Training Loss: 0.00922
Epoch: 35986, Training Loss: 0.00722
Epoch: 35986, Training Loss: 0.00751
Epoch: 35986, Training Loss: 0.00752
Epoch: 35986, Training Loss: 0.00922
Epoch: 35987, Training Loss: 0.00722
Epoch: 35987, Training Loss: 0.00751
Epoch: 35987, Training Loss: 0.00752
Epoch: 35987, Training Loss: 0.00922
Epoch: 35988, Training Loss: 0.00722
Epoch: 35988, Training Loss: 0.00751
Epoch: 35988, Training Loss: 0.00752
Epoch: 35988, Training Loss: 0.00922
Epoch: 35989, Training Loss: 0.00722
Epoch: 35989, Training Loss: 0.00751
Epoch: 35989, Training Loss: 0.00752
Epoch: 35989, Training Loss: 0.00922
Epoch: 35990, Training Loss: 0.00722
Epoch: 35990, Training Loss: 0.00751
Epoch: 35990, Training Loss: 0.00752
Epoch: 35990, Training Loss: 0.00922
Epoch: 35991, Training Loss: 0.00722
Epoch: 35991, Training Loss: 0.00751
Epoch: 35991, Training Loss: 0.00752
Epoch: 35991, Training Loss: 0.00922
Epoch: 35992, Training Loss: 0.00722
Epoch: 35992, Training Loss: 0.00751
Epoch: 35992, Training Loss: 0.00751
Epoch: 35992, Training Loss: 0.00922
Epoch: 35993, Training Loss: 0.00722
Epoch: 35993, Training Loss: 0.00751
Epoch: 35993, Training Loss: 0.00751
Epoch: 35993, Training Loss: 0.00922
Epoch: 35994, Training Loss: 0.00722
Epoch: 35994, Training Loss: 0.00751
Epoch: 35994, Training Loss: 0.00751
Epoch: 35994, Training Loss: 0.00922
Epoch: 35995, Training Loss: 0.00722
Epoch: 35995, Training Loss: 0.00751
Epoch: 35995, Training Loss: 0.00751
Epoch: 35995, Training Loss: 0.00922
Epoch: 35996, Training Loss: 0.00722
Epoch: 35996, Training Loss: 0.00751
Epoch: 35996, Training Loss: 0.00751
Epoch: 35996, Training Loss: 0.00922
Epoch: 35997, Training Loss: 0.00722
Epoch: 35997, Training Loss: 0.00751
Epoch: 35997, Training Loss: 0.00751
Epoch: 35997, Training Loss: 0.00922
Epoch: 35998, Training Loss: 0.00722
Epoch: 35998, Training Loss: 0.00751
Epoch: 35998, Training Loss: 0.00751
Epoch: 35998, Training Loss: 0.00922
Epoch: 35999, Training Loss: 0.00722
Epoch: 35999, Training Loss: 0.00751
Epoch: 35999, Training Loss: 0.00751
Epoch: 35999, Training Loss: 0.00922
Epoch: 36000, Training Loss: 0.00722
Epoch: 36000, Training Loss: 0.00751
Epoch: 36000, Training Loss: 0.00751
Epoch: 36000, Training Loss: 0.00922
Epoch: 36001, Training Loss: 0.00722
Epoch: 36001, Training Loss: 0.00751
Epoch: 36001, Training Loss: 0.00751
Epoch: 36001, Training Loss: 0.00922
Epoch: 36002, Training Loss: 0.00722
Epoch: 36002, Training Loss: 0.00751
Epoch: 36002, Training Loss: 0.00751
Epoch: 36002, Training Loss: 0.00922
Epoch: 36003, Training Loss: 0.00722
Epoch: 36003, Training Loss: 0.00751
Epoch: 36003, Training Loss: 0.00751
Epoch: 36003, Training Loss: 0.00922
Epoch: 36004, Training Loss: 0.00722
Epoch: 36004, Training Loss: 0.00750
Epoch: 36004, Training Loss: 0.00751
Epoch: 36004, Training Loss: 0.00922
Epoch: 36005, Training Loss: 0.00722
Epoch: 36005, Training Loss: 0.00750
Epoch: 36005, Training Loss: 0.00751
Epoch: 36005, Training Loss: 0.00922
Epoch: 36006, Training Loss: 0.00722
Epoch: 36006, Training Loss: 0.00750
Epoch: 36006, Training Loss: 0.00751
Epoch: 36006, Training Loss: 0.00922
Epoch: 36007, Training Loss: 0.00722
Epoch: 36007, Training Loss: 0.00750
Epoch: 36007, Training Loss: 0.00751
Epoch: 36007, Training Loss: 0.00922
Epoch: 36008, Training Loss: 0.00722
Epoch: 36008, Training Loss: 0.00750
Epoch: 36008, Training Loss: 0.00751
Epoch: 36008, Training Loss: 0.00922
Epoch: 36009, Training Loss: 0.00722
Epoch: 36009, Training Loss: 0.00750
Epoch: 36009, Training Loss: 0.00751
Epoch: 36009, Training Loss: 0.00922
Epoch: 36010, Training Loss: 0.00722
Epoch: 36010, Training Loss: 0.00750
Epoch: 36010, Training Loss: 0.00751
Epoch: 36010, Training Loss: 0.00922
Epoch: 36011, Training Loss: 0.00722
Epoch: 36011, Training Loss: 0.00750
Epoch: 36011, Training Loss: 0.00751
Epoch: 36011, Training Loss: 0.00922
Epoch: 36012, Training Loss: 0.00722
Epoch: 36012, Training Loss: 0.00750
Epoch: 36012, Training Loss: 0.00751
Epoch: 36012, Training Loss: 0.00922
Epoch: 36013, Training Loss: 0.00722
Epoch: 36013, Training Loss: 0.00750
Epoch: 36013, Training Loss: 0.00751
Epoch: 36013, Training Loss: 0.00922
Epoch: 36014, Training Loss: 0.00722
Epoch: 36014, Training Loss: 0.00750
Epoch: 36014, Training Loss: 0.00751
Epoch: 36014, Training Loss: 0.00922
Epoch: 36015, Training Loss: 0.00722
Epoch: 36015, Training Loss: 0.00750
Epoch: 36015, Training Loss: 0.00751
Epoch: 36015, Training Loss: 0.00922
Epoch: 36016, Training Loss: 0.00722
Epoch: 36016, Training Loss: 0.00750
Epoch: 36016, Training Loss: 0.00751
Epoch: 36016, Training Loss: 0.00922
Epoch: 36017, Training Loss: 0.00722
Epoch: 36017, Training Loss: 0.00750
Epoch: 36017, Training Loss: 0.00751
Epoch: 36017, Training Loss: 0.00922
Epoch: 36018, Training Loss: 0.00722
Epoch: 36018, Training Loss: 0.00750
Epoch: 36018, Training Loss: 0.00751
Epoch: 36018, Training Loss: 0.00922
Epoch: 36019, Training Loss: 0.00722
Epoch: 36019, Training Loss: 0.00750
Epoch: 36019, Training Loss: 0.00751
Epoch: 36019, Training Loss: 0.00922
Epoch: 36020, Training Loss: 0.00722
Epoch: 36020, Training Loss: 0.00750
Epoch: 36020, Training Loss: 0.00751
Epoch: 36020, Training Loss: 0.00922
Epoch: 36021, Training Loss: 0.00722
Epoch: 36021, Training Loss: 0.00750
Epoch: 36021, Training Loss: 0.00751
Epoch: 36021, Training Loss: 0.00922
Epoch: 36022, Training Loss: 0.00722
Epoch: 36022, Training Loss: 0.00750
Epoch: 36022, Training Loss: 0.00751
Epoch: 36022, Training Loss: 0.00922
Epoch: 36023, Training Loss: 0.00722
Epoch: 36023, Training Loss: 0.00750
Epoch: 36023, Training Loss: 0.00751
Epoch: 36023, Training Loss: 0.00922
Epoch: 36024, Training Loss: 0.00721
Epoch: 36024, Training Loss: 0.00750
Epoch: 36024, Training Loss: 0.00751
Epoch: 36024, Training Loss: 0.00922
Epoch: 36025, Training Loss: 0.00721
Epoch: 36025, Training Loss: 0.00750
Epoch: 36025, Training Loss: 0.00751
Epoch: 36025, Training Loss: 0.00922
Epoch: 36026, Training Loss: 0.00721
Epoch: 36026, Training Loss: 0.00750
Epoch: 36026, Training Loss: 0.00751
Epoch: 36026, Training Loss: 0.00922
Epoch: 36027, Training Loss: 0.00721
Epoch: 36027, Training Loss: 0.00750
Epoch: 36027, Training Loss: 0.00751
Epoch: 36027, Training Loss: 0.00922
Epoch: 36028, Training Loss: 0.00721
Epoch: 36028, Training Loss: 0.00750
Epoch: 36028, Training Loss: 0.00751
Epoch: 36028, Training Loss: 0.00922
Epoch: 36029, Training Loss: 0.00721
Epoch: 36029, Training Loss: 0.00750
Epoch: 36029, Training Loss: 0.00751
Epoch: 36029, Training Loss: 0.00922
Epoch: 36030, Training Loss: 0.00721
Epoch: 36030, Training Loss: 0.00750
Epoch: 36030, Training Loss: 0.00751
Epoch: 36030, Training Loss: 0.00922
Epoch: 36031, Training Loss: 0.00721
Epoch: 36031, Training Loss: 0.00750
Epoch: 36031, Training Loss: 0.00751
Epoch: 36031, Training Loss: 0.00922
Epoch: 36032, Training Loss: 0.00721
Epoch: 36032, Training Loss: 0.00750
Epoch: 36032, Training Loss: 0.00751
Epoch: 36032, Training Loss: 0.00922
Epoch: 36033, Training Loss: 0.00721
Epoch: 36033, Training Loss: 0.00750
Epoch: 36033, Training Loss: 0.00751
Epoch: 36033, Training Loss: 0.00922
Epoch: 36034, Training Loss: 0.00721
Epoch: 36034, Training Loss: 0.00750
Epoch: 36034, Training Loss: 0.00751
Epoch: 36034, Training Loss: 0.00922
Epoch: 36035, Training Loss: 0.00721
Epoch: 36035, Training Loss: 0.00750
Epoch: 36035, Training Loss: 0.00751
Epoch: 36035, Training Loss: 0.00922
Epoch: 36036, Training Loss: 0.00721
Epoch: 36036, Training Loss: 0.00750
Epoch: 36036, Training Loss: 0.00751
Epoch: 36036, Training Loss: 0.00921
Epoch: 36037, Training Loss: 0.00721
Epoch: 36037, Training Loss: 0.00750
Epoch: 36037, Training Loss: 0.00751
Epoch: 36037, Training Loss: 0.00921
Epoch: 36038, Training Loss: 0.00721
Epoch: 36038, Training Loss: 0.00750
Epoch: 36038, Training Loss: 0.00751
Epoch: 36038, Training Loss: 0.00921
Epoch: 36039, Training Loss: 0.00721
Epoch: 36039, Training Loss: 0.00750
Epoch: 36039, Training Loss: 0.00751
Epoch: 36039, Training Loss: 0.00921
Epoch: 36040, Training Loss: 0.00721
Epoch: 36040, Training Loss: 0.00750
Epoch: 36040, Training Loss: 0.00751
Epoch: 36040, Training Loss: 0.00921
Epoch: 36041, Training Loss: 0.00721
Epoch: 36041, Training Loss: 0.00750
Epoch: 36041, Training Loss: 0.00751
Epoch: 36041, Training Loss: 0.00921
Epoch: 36042, Training Loss: 0.00721
Epoch: 36042, Training Loss: 0.00750
Epoch: 36042, Training Loss: 0.00751
Epoch: 36042, Training Loss: 0.00921
Epoch: 36043, Training Loss: 0.00721
Epoch: 36043, Training Loss: 0.00750
Epoch: 36043, Training Loss: 0.00751
Epoch: 36043, Training Loss: 0.00921
Epoch: 36044, Training Loss: 0.00721
Epoch: 36044, Training Loss: 0.00750
Epoch: 36044, Training Loss: 0.00751
Epoch: 36044, Training Loss: 0.00921
Epoch: 36045, Training Loss: 0.00721
Epoch: 36045, Training Loss: 0.00750
Epoch: 36045, Training Loss: 0.00751
Epoch: 36045, Training Loss: 0.00921
Epoch: 36046, Training Loss: 0.00721
Epoch: 36046, Training Loss: 0.00750
Epoch: 36046, Training Loss: 0.00751
Epoch: 36046, Training Loss: 0.00921
Epoch: 36047, Training Loss: 0.00721
Epoch: 36047, Training Loss: 0.00750
Epoch: 36047, Training Loss: 0.00751
Epoch: 36047, Training Loss: 0.00921
Epoch: 36048, Training Loss: 0.00721
Epoch: 36048, Training Loss: 0.00750
Epoch: 36048, Training Loss: 0.00751
Epoch: 36048, Training Loss: 0.00921
Epoch: 36049, Training Loss: 0.00721
Epoch: 36049, Training Loss: 0.00750
Epoch: 36049, Training Loss: 0.00751
Epoch: 36049, Training Loss: 0.00921
Epoch: 36050, Training Loss: 0.00721
Epoch: 36050, Training Loss: 0.00750
Epoch: 36050, Training Loss: 0.00751
Epoch: 36050, Training Loss: 0.00921
Epoch: 36051, Training Loss: 0.00721
Epoch: 36051, Training Loss: 0.00750
Epoch: 36051, Training Loss: 0.00751
Epoch: 36051, Training Loss: 0.00921
Epoch: 36052, Training Loss: 0.00721
Epoch: 36052, Training Loss: 0.00750
Epoch: 36052, Training Loss: 0.00751
Epoch: 36052, Training Loss: 0.00921
Epoch: 36053, Training Loss: 0.00721
Epoch: 36053, Training Loss: 0.00750
Epoch: 36053, Training Loss: 0.00751
Epoch: 36053, Training Loss: 0.00921
Epoch: 36054, Training Loss: 0.00721
Epoch: 36054, Training Loss: 0.00750
Epoch: 36054, Training Loss: 0.00751
Epoch: 36054, Training Loss: 0.00921
Epoch: 36055, Training Loss: 0.00721
Epoch: 36055, Training Loss: 0.00750
Epoch: 36055, Training Loss: 0.00751
Epoch: 36055, Training Loss: 0.00921
Epoch: 36056, Training Loss: 0.00721
Epoch: 36056, Training Loss: 0.00750
Epoch: 36056, Training Loss: 0.00751
Epoch: 36056, Training Loss: 0.00921
Epoch: 36057, Training Loss: 0.00721
Epoch: 36057, Training Loss: 0.00750
Epoch: 36057, Training Loss: 0.00751
Epoch: 36057, Training Loss: 0.00921
Epoch: 36058, Training Loss: 0.00721
Epoch: 36058, Training Loss: 0.00750
Epoch: 36058, Training Loss: 0.00751
Epoch: 36058, Training Loss: 0.00921
Epoch: 36059, Training Loss: 0.00721
Epoch: 36059, Training Loss: 0.00750
Epoch: 36059, Training Loss: 0.00751
Epoch: 36059, Training Loss: 0.00921
Epoch: 36060, Training Loss: 0.00721
Epoch: 36060, Training Loss: 0.00750
Epoch: 36060, Training Loss: 0.00751
Epoch: 36060, Training Loss: 0.00921
Epoch: 36061, Training Loss: 0.00721
Epoch: 36061, Training Loss: 0.00750
Epoch: 36061, Training Loss: 0.00751
Epoch: 36061, Training Loss: 0.00921
Epoch: 36062, Training Loss: 0.00721
Epoch: 36062, Training Loss: 0.00750
Epoch: 36062, Training Loss: 0.00751
Epoch: 36062, Training Loss: 0.00921
Epoch: 36063, Training Loss: 0.00721
Epoch: 36063, Training Loss: 0.00750
Epoch: 36063, Training Loss: 0.00751
Epoch: 36063, Training Loss: 0.00921
Epoch: 36064, Training Loss: 0.00721
Epoch: 36064, Training Loss: 0.00750
Epoch: 36064, Training Loss: 0.00751
Epoch: 36064, Training Loss: 0.00921
Epoch: 36065, Training Loss: 0.00721
Epoch: 36065, Training Loss: 0.00750
Epoch: 36065, Training Loss: 0.00751
Epoch: 36065, Training Loss: 0.00921
Epoch: 36066, Training Loss: 0.00721
Epoch: 36066, Training Loss: 0.00750
Epoch: 36066, Training Loss: 0.00751
Epoch: 36066, Training Loss: 0.00921
Epoch: 36067, Training Loss: 0.00721
Epoch: 36067, Training Loss: 0.00750
Epoch: 36067, Training Loss: 0.00751
Epoch: 36067, Training Loss: 0.00921
Epoch: 36068, Training Loss: 0.00721
Epoch: 36068, Training Loss: 0.00750
Epoch: 36068, Training Loss: 0.00751
Epoch: 36068, Training Loss: 0.00921
Epoch: 36069, Training Loss: 0.00721
Epoch: 36069, Training Loss: 0.00750
Epoch: 36069, Training Loss: 0.00751
Epoch: 36069, Training Loss: 0.00921
Epoch: 36070, Training Loss: 0.00721
Epoch: 36070, Training Loss: 0.00750
Epoch: 36070, Training Loss: 0.00751
Epoch: 36070, Training Loss: 0.00921
Epoch: 36071, Training Loss: 0.00721
Epoch: 36071, Training Loss: 0.00750
Epoch: 36071, Training Loss: 0.00751
Epoch: 36071, Training Loss: 0.00921
Epoch: 36072, Training Loss: 0.00721
Epoch: 36072, Training Loss: 0.00750
Epoch: 36072, Training Loss: 0.00751
Epoch: 36072, Training Loss: 0.00921
Epoch: 36073, Training Loss: 0.00721
Epoch: 36073, Training Loss: 0.00750
Epoch: 36073, Training Loss: 0.00751
Epoch: 36073, Training Loss: 0.00921
Epoch: 36074, Training Loss: 0.00721
Epoch: 36074, Training Loss: 0.00750
Epoch: 36074, Training Loss: 0.00751
Epoch: 36074, Training Loss: 0.00921
Epoch: 36075, Training Loss: 0.00721
Epoch: 36075, Training Loss: 0.00750
Epoch: 36075, Training Loss: 0.00751
Epoch: 36075, Training Loss: 0.00921
Epoch: 36076, Training Loss: 0.00721
Epoch: 36076, Training Loss: 0.00750
Epoch: 36076, Training Loss: 0.00751
Epoch: 36076, Training Loss: 0.00921
Epoch: 36077, Training Loss: 0.00721
Epoch: 36077, Training Loss: 0.00750
Epoch: 36077, Training Loss: 0.00751
Epoch: 36077, Training Loss: 0.00921
Epoch: 36078, Training Loss: 0.00721
Epoch: 36078, Training Loss: 0.00750
Epoch: 36078, Training Loss: 0.00750
Epoch: 36078, Training Loss: 0.00921
Epoch: 36079, Training Loss: 0.00721
Epoch: 36079, Training Loss: 0.00750
Epoch: 36079, Training Loss: 0.00750
Epoch: 36079, Training Loss: 0.00921
Epoch: 36080, Training Loss: 0.00721
Epoch: 36080, Training Loss: 0.00750
Epoch: 36080, Training Loss: 0.00750
Epoch: 36080, Training Loss: 0.00921
Epoch: 36081, Training Loss: 0.00721
Epoch: 36081, Training Loss: 0.00750
Epoch: 36081, Training Loss: 0.00750
Epoch: 36081, Training Loss: 0.00921
Epoch: 36082, Training Loss: 0.00721
Epoch: 36082, Training Loss: 0.00750
Epoch: 36082, Training Loss: 0.00750
Epoch: 36082, Training Loss: 0.00921
Epoch: 36083, Training Loss: 0.00721
Epoch: 36083, Training Loss: 0.00750
Epoch: 36083, Training Loss: 0.00750
Epoch: 36083, Training Loss: 0.00921
Epoch: 36084, Training Loss: 0.00721
Epoch: 36084, Training Loss: 0.00750
Epoch: 36084, Training Loss: 0.00750
Epoch: 36084, Training Loss: 0.00921
Epoch: 36085, Training Loss: 0.00721
Epoch: 36085, Training Loss: 0.00750
Epoch: 36085, Training Loss: 0.00750
Epoch: 36085, Training Loss: 0.00921
Epoch: 36086, Training Loss: 0.00721
Epoch: 36086, Training Loss: 0.00750
Epoch: 36086, Training Loss: 0.00750
Epoch: 36086, Training Loss: 0.00921
Epoch: 36087, Training Loss: 0.00721
Epoch: 36087, Training Loss: 0.00750
Epoch: 36087, Training Loss: 0.00750
Epoch: 36087, Training Loss: 0.00921
Epoch: 36088, Training Loss: 0.00721
Epoch: 36088, Training Loss: 0.00750
Epoch: 36088, Training Loss: 0.00750
Epoch: 36088, Training Loss: 0.00921
Epoch: 36089, Training Loss: 0.00721
Epoch: 36089, Training Loss: 0.00750
Epoch: 36089, Training Loss: 0.00750
Epoch: 36089, Training Loss: 0.00921
Epoch: 36090, Training Loss: 0.00721
Epoch: 36090, Training Loss: 0.00750
Epoch: 36090, Training Loss: 0.00750
Epoch: 36090, Training Loss: 0.00921
Epoch: 36091, Training Loss: 0.00721
Epoch: 36091, Training Loss: 0.00749
Epoch: 36091, Training Loss: 0.00750
Epoch: 36091, Training Loss: 0.00921
Epoch: 36092, Training Loss: 0.00721
Epoch: 36092, Training Loss: 0.00749
Epoch: 36092, Training Loss: 0.00750
Epoch: 36092, Training Loss: 0.00921
Epoch: 36093, Training Loss: 0.00721
Epoch: 36093, Training Loss: 0.00749
Epoch: 36093, Training Loss: 0.00750
Epoch: 36093, Training Loss: 0.00921
Epoch: 36094, Training Loss: 0.00721
Epoch: 36094, Training Loss: 0.00749
Epoch: 36094, Training Loss: 0.00750
Epoch: 36094, Training Loss: 0.00921
Epoch: 36095, Training Loss: 0.00721
Epoch: 36095, Training Loss: 0.00749
Epoch: 36095, Training Loss: 0.00750
Epoch: 36095, Training Loss: 0.00921
Epoch: 36096, Training Loss: 0.00721
Epoch: 36096, Training Loss: 0.00749
Epoch: 36096, Training Loss: 0.00750
Epoch: 36096, Training Loss: 0.00921
Epoch: 36097, Training Loss: 0.00721
Epoch: 36097, Training Loss: 0.00749
Epoch: 36097, Training Loss: 0.00750
Epoch: 36097, Training Loss: 0.00921
Epoch: 36098, Training Loss: 0.00721
Epoch: 36098, Training Loss: 0.00749
Epoch: 36098, Training Loss: 0.00750
Epoch: 36098, Training Loss: 0.00921
Epoch: 36099, Training Loss: 0.00721
Epoch: 36099, Training Loss: 0.00749
Epoch: 36099, Training Loss: 0.00750
Epoch: 36099, Training Loss: 0.00921
Epoch: 36100, Training Loss: 0.00721
Epoch: 36100, Training Loss: 0.00749
Epoch: 36100, Training Loss: 0.00750
Epoch: 36100, Training Loss: 0.00921
Epoch: 36101, Training Loss: 0.00721
Epoch: 36101, Training Loss: 0.00749
Epoch: 36101, Training Loss: 0.00750
Epoch: 36101, Training Loss: 0.00921
Epoch: 36102, Training Loss: 0.00721
Epoch: 36102, Training Loss: 0.00749
Epoch: 36102, Training Loss: 0.00750
Epoch: 36102, Training Loss: 0.00921
Epoch: 36103, Training Loss: 0.00721
Epoch: 36103, Training Loss: 0.00749
Epoch: 36103, Training Loss: 0.00750
Epoch: 36103, Training Loss: 0.00921
Epoch: 36104, Training Loss: 0.00721
Epoch: 36104, Training Loss: 0.00749
Epoch: 36104, Training Loss: 0.00750
Epoch: 36104, Training Loss: 0.00921
Epoch: 36105, Training Loss: 0.00721
Epoch: 36105, Training Loss: 0.00749
Epoch: 36105, Training Loss: 0.00750
Epoch: 36105, Training Loss: 0.00921
Epoch: 36106, Training Loss: 0.00721
Epoch: 36106, Training Loss: 0.00749
Epoch: 36106, Training Loss: 0.00750
Epoch: 36106, Training Loss: 0.00920
Epoch: 36107, Training Loss: 0.00721
Epoch: 36107, Training Loss: 0.00749
Epoch: 36107, Training Loss: 0.00750
Epoch: 36107, Training Loss: 0.00920
Epoch: 36108, Training Loss: 0.00721
Epoch: 36108, Training Loss: 0.00749
Epoch: 36108, Training Loss: 0.00750
Epoch: 36108, Training Loss: 0.00920
Epoch: 36109, Training Loss: 0.00721
Epoch: 36109, Training Loss: 0.00749
Epoch: 36109, Training Loss: 0.00750
Epoch: 36109, Training Loss: 0.00920
Epoch: 36110, Training Loss: 0.00721
Epoch: 36110, Training Loss: 0.00749
Epoch: 36110, Training Loss: 0.00750
Epoch: 36110, Training Loss: 0.00920
Epoch: 36111, Training Loss: 0.00721
Epoch: 36111, Training Loss: 0.00749
Epoch: 36111, Training Loss: 0.00750
Epoch: 36111, Training Loss: 0.00920
Epoch: 36112, Training Loss: 0.00721
Epoch: 36112, Training Loss: 0.00749
Epoch: 36112, Training Loss: 0.00750
Epoch: 36112, Training Loss: 0.00920
Epoch: 36113, Training Loss: 0.00721
Epoch: 36113, Training Loss: 0.00749
Epoch: 36113, Training Loss: 0.00750
Epoch: 36113, Training Loss: 0.00920
Epoch: 36114, Training Loss: 0.00721
Epoch: 36114, Training Loss: 0.00749
Epoch: 36114, Training Loss: 0.00750
Epoch: 36114, Training Loss: 0.00920
Epoch: 36115, Training Loss: 0.00721
Epoch: 36115, Training Loss: 0.00749
Epoch: 36115, Training Loss: 0.00750
Epoch: 36115, Training Loss: 0.00920
Epoch: 36116, Training Loss: 0.00720
Epoch: 36116, Training Loss: 0.00749
Epoch: 36116, Training Loss: 0.00750
Epoch: 36116, Training Loss: 0.00920
Epoch: 36117, Training Loss: 0.00720
Epoch: 36117, Training Loss: 0.00749
Epoch: 36117, Training Loss: 0.00750
Epoch: 36117, Training Loss: 0.00920
Epoch: 36118, Training Loss: 0.00720
Epoch: 36118, Training Loss: 0.00749
Epoch: 36118, Training Loss: 0.00750
Epoch: 36118, Training Loss: 0.00920
Epoch: 36119, Training Loss: 0.00720
Epoch: 36119, Training Loss: 0.00749
Epoch: 36119, Training Loss: 0.00750
Epoch: 36119, Training Loss: 0.00920
Epoch: 36120, Training Loss: 0.00720
Epoch: 36120, Training Loss: 0.00749
Epoch: 36120, Training Loss: 0.00750
Epoch: 36120, Training Loss: 0.00920
Epoch: 36121, Training Loss: 0.00720
Epoch: 36121, Training Loss: 0.00749
Epoch: 36121, Training Loss: 0.00750
Epoch: 36121, Training Loss: 0.00920
Epoch: 36122, Training Loss: 0.00720
Epoch: 36122, Training Loss: 0.00749
Epoch: 36122, Training Loss: 0.00750
Epoch: 36122, Training Loss: 0.00920
Epoch: 36123, Training Loss: 0.00720
Epoch: 36123, Training Loss: 0.00749
Epoch: 36123, Training Loss: 0.00750
Epoch: 36123, Training Loss: 0.00920
Epoch: 36124, Training Loss: 0.00720
Epoch: 36124, Training Loss: 0.00749
Epoch: 36124, Training Loss: 0.00750
Epoch: 36124, Training Loss: 0.00920
Epoch: 36125, Training Loss: 0.00720
Epoch: 36125, Training Loss: 0.00749
Epoch: 36125, Training Loss: 0.00750
Epoch: 36125, Training Loss: 0.00920
Epoch: 36126, Training Loss: 0.00720
Epoch: 36126, Training Loss: 0.00749
Epoch: 36126, Training Loss: 0.00750
Epoch: 36126, Training Loss: 0.00920
Epoch: 36127, Training Loss: 0.00720
Epoch: 36127, Training Loss: 0.00749
Epoch: 36127, Training Loss: 0.00750
Epoch: 36127, Training Loss: 0.00920
Epoch: 36128, Training Loss: 0.00720
Epoch: 36128, Training Loss: 0.00749
Epoch: 36128, Training Loss: 0.00750
Epoch: 36128, Training Loss: 0.00920
Epoch: 36129, Training Loss: 0.00720
Epoch: 36129, Training Loss: 0.00749
Epoch: 36129, Training Loss: 0.00750
Epoch: 36129, Training Loss: 0.00920
Epoch: 36130, Training Loss: 0.00720
Epoch: 36130, Training Loss: 0.00749
Epoch: 36130, Training Loss: 0.00750
Epoch: 36130, Training Loss: 0.00920
Epoch: 36131, Training Loss: 0.00720
Epoch: 36131, Training Loss: 0.00749
Epoch: 36131, Training Loss: 0.00750
Epoch: 36131, Training Loss: 0.00920
Epoch: 36132, Training Loss: 0.00720
Epoch: 36132, Training Loss: 0.00749
Epoch: 36132, Training Loss: 0.00750
Epoch: 36132, Training Loss: 0.00920
Epoch: 36133, Training Loss: 0.00720
Epoch: 36133, Training Loss: 0.00749
Epoch: 36133, Training Loss: 0.00750
Epoch: 36133, Training Loss: 0.00920
Epoch: 36134, Training Loss: 0.00720
Epoch: 36134, Training Loss: 0.00749
Epoch: 36134, Training Loss: 0.00750
Epoch: 36134, Training Loss: 0.00920
Epoch: 36135, Training Loss: 0.00720
Epoch: 36135, Training Loss: 0.00749
Epoch: 36135, Training Loss: 0.00750
Epoch: 36135, Training Loss: 0.00920
Epoch: 36136, Training Loss: 0.00720
Epoch: 36136, Training Loss: 0.00749
Epoch: 36136, Training Loss: 0.00750
Epoch: 36136, Training Loss: 0.00920
Epoch: 36137, Training Loss: 0.00720
Epoch: 36137, Training Loss: 0.00749
Epoch: 36137, Training Loss: 0.00750
Epoch: 36137, Training Loss: 0.00920
Epoch: 36138, Training Loss: 0.00720
Epoch: 36138, Training Loss: 0.00749
Epoch: 36138, Training Loss: 0.00750
Epoch: 36138, Training Loss: 0.00920
Epoch: 36139, Training Loss: 0.00720
Epoch: 36139, Training Loss: 0.00749
Epoch: 36139, Training Loss: 0.00750
Epoch: 36139, Training Loss: 0.00920
Epoch: 36140, Training Loss: 0.00720
Epoch: 36140, Training Loss: 0.00749
Epoch: 36140, Training Loss: 0.00750
Epoch: 36140, Training Loss: 0.00920
Epoch: 36141, Training Loss: 0.00720
Epoch: 36141, Training Loss: 0.00749
Epoch: 36141, Training Loss: 0.00750
Epoch: 36141, Training Loss: 0.00920
Epoch: 36142, Training Loss: 0.00720
Epoch: 36142, Training Loss: 0.00749
Epoch: 36142, Training Loss: 0.00750
Epoch: 36142, Training Loss: 0.00920
Epoch: 36143, Training Loss: 0.00720
Epoch: 36143, Training Loss: 0.00749
Epoch: 36143, Training Loss: 0.00750
Epoch: 36143, Training Loss: 0.00920
Epoch: 36144, Training Loss: 0.00720
Epoch: 36144, Training Loss: 0.00749
Epoch: 36144, Training Loss: 0.00750
Epoch: 36144, Training Loss: 0.00920
Epoch: 36145, Training Loss: 0.00720
Epoch: 36145, Training Loss: 0.00749
Epoch: 36145, Training Loss: 0.00750
Epoch: 36145, Training Loss: 0.00920
Epoch: 36146, Training Loss: 0.00720
Epoch: 36146, Training Loss: 0.00749
Epoch: 36146, Training Loss: 0.00750
Epoch: 36146, Training Loss: 0.00920
Epoch: 36147, Training Loss: 0.00720
Epoch: 36147, Training Loss: 0.00749
Epoch: 36147, Training Loss: 0.00750
Epoch: 36147, Training Loss: 0.00920
Epoch: 36148, Training Loss: 0.00720
Epoch: 36148, Training Loss: 0.00749
Epoch: 36148, Training Loss: 0.00750
Epoch: 36148, Training Loss: 0.00920
Epoch: 36149, Training Loss: 0.00720
Epoch: 36149, Training Loss: 0.00749
Epoch: 36149, Training Loss: 0.00750
Epoch: 36149, Training Loss: 0.00920
Epoch: 36150, Training Loss: 0.00720
Epoch: 36150, Training Loss: 0.00749
Epoch: 36150, Training Loss: 0.00750
Epoch: 36150, Training Loss: 0.00920
Epoch: 36151, Training Loss: 0.00720
Epoch: 36151, Training Loss: 0.00749
Epoch: 36151, Training Loss: 0.00750
Epoch: 36151, Training Loss: 0.00920
Epoch: 36152, Training Loss: 0.00720
Epoch: 36152, Training Loss: 0.00749
Epoch: 36152, Training Loss: 0.00750
Epoch: 36152, Training Loss: 0.00920
Epoch: 36153, Training Loss: 0.00720
Epoch: 36153, Training Loss: 0.00749
Epoch: 36153, Training Loss: 0.00750
Epoch: 36153, Training Loss: 0.00920
Epoch: 36154, Training Loss: 0.00720
Epoch: 36154, Training Loss: 0.00749
Epoch: 36154, Training Loss: 0.00750
Epoch: 36154, Training Loss: 0.00920
Epoch: 36155, Training Loss: 0.00720
Epoch: 36155, Training Loss: 0.00749
Epoch: 36155, Training Loss: 0.00750
Epoch: 36155, Training Loss: 0.00920
Epoch: 36156, Training Loss: 0.00720
Epoch: 36156, Training Loss: 0.00749
Epoch: 36156, Training Loss: 0.00750
Epoch: 36156, Training Loss: 0.00920
Epoch: 36157, Training Loss: 0.00720
Epoch: 36157, Training Loss: 0.00749
Epoch: 36157, Training Loss: 0.00750
Epoch: 36157, Training Loss: 0.00920
Epoch: 36158, Training Loss: 0.00720
Epoch: 36158, Training Loss: 0.00749
Epoch: 36158, Training Loss: 0.00750
Epoch: 36158, Training Loss: 0.00920
Epoch: 36159, Training Loss: 0.00720
Epoch: 36159, Training Loss: 0.00749
Epoch: 36159, Training Loss: 0.00750
Epoch: 36159, Training Loss: 0.00920
Epoch: 36160, Training Loss: 0.00720
Epoch: 36160, Training Loss: 0.00749
Epoch: 36160, Training Loss: 0.00750
Epoch: 36160, Training Loss: 0.00920
Epoch: 36161, Training Loss: 0.00720
Epoch: 36161, Training Loss: 0.00749
Epoch: 36161, Training Loss: 0.00750
Epoch: 36161, Training Loss: 0.00920
Epoch: 36162, Training Loss: 0.00720
Epoch: 36162, Training Loss: 0.00749
Epoch: 36162, Training Loss: 0.00750
Epoch: 36162, Training Loss: 0.00920

... (repetitive per-sample output truncated: the four per-sample losses decrease very slowly over epochs 36162-36405) ...

Epoch: 36405, Training Loss: 0.00717
Epoch: 36405, Training Loss: 0.00746
Epoch: 36405, Training Loss: 0.00747
Epoch: 36405, Training Loss: 0.00916
Epoch: 36406, Training Loss: 0.00717
Epoch: 36406, Training Loss: 0.00746
Epoch: 36406, Training Loss: 0.00747
Epoch: 36406, Training Loss: 0.00916
Epoch: 36407, Training Loss: 0.00717
Epoch: 36407, Training Loss: 0.00746
Epoch: 36407, Training Loss: 0.00747
Epoch: 36407, Training Loss: 0.00916
Epoch: 36408, Training Loss: 0.00717
Epoch: 36408, Training Loss: 0.00746
Epoch: 36408, Training Loss: 0.00747
Epoch: 36408, Training Loss: 0.00916
Epoch: 36409, Training Loss: 0.00717
Epoch: 36409, Training Loss: 0.00746
Epoch: 36409, Training Loss: 0.00747
Epoch: 36409, Training Loss: 0.00916
Epoch: 36410, Training Loss: 0.00717
Epoch: 36410, Training Loss: 0.00746
Epoch: 36410, Training Loss: 0.00747
Epoch: 36410, Training Loss: 0.00916
Epoch: 36411, Training Loss: 0.00717
Epoch: 36411, Training Loss: 0.00746
Epoch: 36411, Training Loss: 0.00747
Epoch: 36411, Training Loss: 0.00916
Epoch: 36412, Training Loss: 0.00717
Epoch: 36412, Training Loss: 0.00746
Epoch: 36412, Training Loss: 0.00747
Epoch: 36412, Training Loss: 0.00916
Epoch: 36413, Training Loss: 0.00717
Epoch: 36413, Training Loss: 0.00746
Epoch: 36413, Training Loss: 0.00747
Epoch: 36413, Training Loss: 0.00916
Epoch: 36414, Training Loss: 0.00717
Epoch: 36414, Training Loss: 0.00746
Epoch: 36414, Training Loss: 0.00747
Epoch: 36414, Training Loss: 0.00916
Epoch: 36415, Training Loss: 0.00717
Epoch: 36415, Training Loss: 0.00746
Epoch: 36415, Training Loss: 0.00747
Epoch: 36415, Training Loss: 0.00916
Epoch: 36416, Training Loss: 0.00717
Epoch: 36416, Training Loss: 0.00746
Epoch: 36416, Training Loss: 0.00747
Epoch: 36416, Training Loss: 0.00916
Epoch: 36417, Training Loss: 0.00717
Epoch: 36417, Training Loss: 0.00746
Epoch: 36417, Training Loss: 0.00747
Epoch: 36417, Training Loss: 0.00916
Epoch: 36418, Training Loss: 0.00717
Epoch: 36418, Training Loss: 0.00746
Epoch: 36418, Training Loss: 0.00747
Epoch: 36418, Training Loss: 0.00916
Epoch: 36419, Training Loss: 0.00717
Epoch: 36419, Training Loss: 0.00746
Epoch: 36419, Training Loss: 0.00747
Epoch: 36419, Training Loss: 0.00916
Epoch: 36420, Training Loss: 0.00717
Epoch: 36420, Training Loss: 0.00746
Epoch: 36420, Training Loss: 0.00747
Epoch: 36420, Training Loss: 0.00916
Epoch: 36421, Training Loss: 0.00717
Epoch: 36421, Training Loss: 0.00746
Epoch: 36421, Training Loss: 0.00747
Epoch: 36421, Training Loss: 0.00916
Epoch: 36422, Training Loss: 0.00717
Epoch: 36422, Training Loss: 0.00746
Epoch: 36422, Training Loss: 0.00747
Epoch: 36422, Training Loss: 0.00916
Epoch: 36423, Training Loss: 0.00717
Epoch: 36423, Training Loss: 0.00746
Epoch: 36423, Training Loss: 0.00747
Epoch: 36423, Training Loss: 0.00916
Epoch: 36424, Training Loss: 0.00717
Epoch: 36424, Training Loss: 0.00746
Epoch: 36424, Training Loss: 0.00747
Epoch: 36424, Training Loss: 0.00916
Epoch: 36425, Training Loss: 0.00717
Epoch: 36425, Training Loss: 0.00746
Epoch: 36425, Training Loss: 0.00747
Epoch: 36425, Training Loss: 0.00916
Epoch: 36426, Training Loss: 0.00717
Epoch: 36426, Training Loss: 0.00746
Epoch: 36426, Training Loss: 0.00747
Epoch: 36426, Training Loss: 0.00916
Epoch: 36427, Training Loss: 0.00717
Epoch: 36427, Training Loss: 0.00746
Epoch: 36427, Training Loss: 0.00747
Epoch: 36427, Training Loss: 0.00916
Epoch: 36428, Training Loss: 0.00717
Epoch: 36428, Training Loss: 0.00746
Epoch: 36428, Training Loss: 0.00746
Epoch: 36428, Training Loss: 0.00916
Epoch: 36429, Training Loss: 0.00717
Epoch: 36429, Training Loss: 0.00746
Epoch: 36429, Training Loss: 0.00746
Epoch: 36429, Training Loss: 0.00916
Epoch: 36430, Training Loss: 0.00717
Epoch: 36430, Training Loss: 0.00746
Epoch: 36430, Training Loss: 0.00746
Epoch: 36430, Training Loss: 0.00916
Epoch: 36431, Training Loss: 0.00717
Epoch: 36431, Training Loss: 0.00746
Epoch: 36431, Training Loss: 0.00746
Epoch: 36431, Training Loss: 0.00916
Epoch: 36432, Training Loss: 0.00717
Epoch: 36432, Training Loss: 0.00746
Epoch: 36432, Training Loss: 0.00746
Epoch: 36432, Training Loss: 0.00916
Epoch: 36433, Training Loss: 0.00717
Epoch: 36433, Training Loss: 0.00746
Epoch: 36433, Training Loss: 0.00746
Epoch: 36433, Training Loss: 0.00916
Epoch: 36434, Training Loss: 0.00717
Epoch: 36434, Training Loss: 0.00746
Epoch: 36434, Training Loss: 0.00746
Epoch: 36434, Training Loss: 0.00916
Epoch: 36435, Training Loss: 0.00717
Epoch: 36435, Training Loss: 0.00746
Epoch: 36435, Training Loss: 0.00746
Epoch: 36435, Training Loss: 0.00916
Epoch: 36436, Training Loss: 0.00717
Epoch: 36436, Training Loss: 0.00746
Epoch: 36436, Training Loss: 0.00746
Epoch: 36436, Training Loss: 0.00916
Epoch: 36437, Training Loss: 0.00717
Epoch: 36437, Training Loss: 0.00746
Epoch: 36437, Training Loss: 0.00746
Epoch: 36437, Training Loss: 0.00916
Epoch: 36438, Training Loss: 0.00717
Epoch: 36438, Training Loss: 0.00746
Epoch: 36438, Training Loss: 0.00746
Epoch: 36438, Training Loss: 0.00916
Epoch: 36439, Training Loss: 0.00717
Epoch: 36439, Training Loss: 0.00746
Epoch: 36439, Training Loss: 0.00746
Epoch: 36439, Training Loss: 0.00916
Epoch: 36440, Training Loss: 0.00717
Epoch: 36440, Training Loss: 0.00746
Epoch: 36440, Training Loss: 0.00746
Epoch: 36440, Training Loss: 0.00916
Epoch: 36441, Training Loss: 0.00717
Epoch: 36441, Training Loss: 0.00745
Epoch: 36441, Training Loss: 0.00746
Epoch: 36441, Training Loss: 0.00916
Epoch: 36442, Training Loss: 0.00717
Epoch: 36442, Training Loss: 0.00745
Epoch: 36442, Training Loss: 0.00746
Epoch: 36442, Training Loss: 0.00916
Epoch: 36443, Training Loss: 0.00717
Epoch: 36443, Training Loss: 0.00745
Epoch: 36443, Training Loss: 0.00746
Epoch: 36443, Training Loss: 0.00916
Epoch: 36444, Training Loss: 0.00717
Epoch: 36444, Training Loss: 0.00745
Epoch: 36444, Training Loss: 0.00746
Epoch: 36444, Training Loss: 0.00916
Epoch: 36445, Training Loss: 0.00717
Epoch: 36445, Training Loss: 0.00745
Epoch: 36445, Training Loss: 0.00746
Epoch: 36445, Training Loss: 0.00916
Epoch: 36446, Training Loss: 0.00717
Epoch: 36446, Training Loss: 0.00745
Epoch: 36446, Training Loss: 0.00746
Epoch: 36446, Training Loss: 0.00916
Epoch: 36447, Training Loss: 0.00717
Epoch: 36447, Training Loss: 0.00745
Epoch: 36447, Training Loss: 0.00746
Epoch: 36447, Training Loss: 0.00916
Epoch: 36448, Training Loss: 0.00717
Epoch: 36448, Training Loss: 0.00745
Epoch: 36448, Training Loss: 0.00746
Epoch: 36448, Training Loss: 0.00916
Epoch: 36449, Training Loss: 0.00717
Epoch: 36449, Training Loss: 0.00745
Epoch: 36449, Training Loss: 0.00746
Epoch: 36449, Training Loss: 0.00916
Epoch: 36450, Training Loss: 0.00717
Epoch: 36450, Training Loss: 0.00745
Epoch: 36450, Training Loss: 0.00746
Epoch: 36450, Training Loss: 0.00916
Epoch: 36451, Training Loss: 0.00717
Epoch: 36451, Training Loss: 0.00745
Epoch: 36451, Training Loss: 0.00746
Epoch: 36451, Training Loss: 0.00916
Epoch: 36452, Training Loss: 0.00717
Epoch: 36452, Training Loss: 0.00745
Epoch: 36452, Training Loss: 0.00746
Epoch: 36452, Training Loss: 0.00916
Epoch: 36453, Training Loss: 0.00717
Epoch: 36453, Training Loss: 0.00745
Epoch: 36453, Training Loss: 0.00746
Epoch: 36453, Training Loss: 0.00916
Epoch: 36454, Training Loss: 0.00717
Epoch: 36454, Training Loss: 0.00745
Epoch: 36454, Training Loss: 0.00746
Epoch: 36454, Training Loss: 0.00916
Epoch: 36455, Training Loss: 0.00717
Epoch: 36455, Training Loss: 0.00745
Epoch: 36455, Training Loss: 0.00746
Epoch: 36455, Training Loss: 0.00916
Epoch: 36456, Training Loss: 0.00717
Epoch: 36456, Training Loss: 0.00745
Epoch: 36456, Training Loss: 0.00746
Epoch: 36456, Training Loss: 0.00916
Epoch: 36457, Training Loss: 0.00717
Epoch: 36457, Training Loss: 0.00745
Epoch: 36457, Training Loss: 0.00746
Epoch: 36457, Training Loss: 0.00916
Epoch: 36458, Training Loss: 0.00717
Epoch: 36458, Training Loss: 0.00745
Epoch: 36458, Training Loss: 0.00746
Epoch: 36458, Training Loss: 0.00916
Epoch: 36459, Training Loss: 0.00717
Epoch: 36459, Training Loss: 0.00745
Epoch: 36459, Training Loss: 0.00746
Epoch: 36459, Training Loss: 0.00915
Epoch: 36460, Training Loss: 0.00717
Epoch: 36460, Training Loss: 0.00745
Epoch: 36460, Training Loss: 0.00746
Epoch: 36460, Training Loss: 0.00915
Epoch: 36461, Training Loss: 0.00717
Epoch: 36461, Training Loss: 0.00745
Epoch: 36461, Training Loss: 0.00746
Epoch: 36461, Training Loss: 0.00915
Epoch: 36462, Training Loss: 0.00717
Epoch: 36462, Training Loss: 0.00745
Epoch: 36462, Training Loss: 0.00746
Epoch: 36462, Training Loss: 0.00915
Epoch: 36463, Training Loss: 0.00717
Epoch: 36463, Training Loss: 0.00745
Epoch: 36463, Training Loss: 0.00746
Epoch: 36463, Training Loss: 0.00915
Epoch: 36464, Training Loss: 0.00717
Epoch: 36464, Training Loss: 0.00745
Epoch: 36464, Training Loss: 0.00746
Epoch: 36464, Training Loss: 0.00915
Epoch: 36465, Training Loss: 0.00717
Epoch: 36465, Training Loss: 0.00745
Epoch: 36465, Training Loss: 0.00746
Epoch: 36465, Training Loss: 0.00915
Epoch: 36466, Training Loss: 0.00717
Epoch: 36466, Training Loss: 0.00745
Epoch: 36466, Training Loss: 0.00746
Epoch: 36466, Training Loss: 0.00915
Epoch: 36467, Training Loss: 0.00717
Epoch: 36467, Training Loss: 0.00745
Epoch: 36467, Training Loss: 0.00746
Epoch: 36467, Training Loss: 0.00915
Epoch: 36468, Training Loss: 0.00717
Epoch: 36468, Training Loss: 0.00745
Epoch: 36468, Training Loss: 0.00746
Epoch: 36468, Training Loss: 0.00915
Epoch: 36469, Training Loss: 0.00717
Epoch: 36469, Training Loss: 0.00745
Epoch: 36469, Training Loss: 0.00746
Epoch: 36469, Training Loss: 0.00915
Epoch: 36470, Training Loss: 0.00717
Epoch: 36470, Training Loss: 0.00745
Epoch: 36470, Training Loss: 0.00746
Epoch: 36470, Training Loss: 0.00915
Epoch: 36471, Training Loss: 0.00717
Epoch: 36471, Training Loss: 0.00745
Epoch: 36471, Training Loss: 0.00746
Epoch: 36471, Training Loss: 0.00915
Epoch: 36472, Training Loss: 0.00717
Epoch: 36472, Training Loss: 0.00745
Epoch: 36472, Training Loss: 0.00746
Epoch: 36472, Training Loss: 0.00915
Epoch: 36473, Training Loss: 0.00717
Epoch: 36473, Training Loss: 0.00745
Epoch: 36473, Training Loss: 0.00746
Epoch: 36473, Training Loss: 0.00915
Epoch: 36474, Training Loss: 0.00717
Epoch: 36474, Training Loss: 0.00745
Epoch: 36474, Training Loss: 0.00746
Epoch: 36474, Training Loss: 0.00915
Epoch: 36475, Training Loss: 0.00717
Epoch: 36475, Training Loss: 0.00745
Epoch: 36475, Training Loss: 0.00746
Epoch: 36475, Training Loss: 0.00915
Epoch: 36476, Training Loss: 0.00717
Epoch: 36476, Training Loss: 0.00745
Epoch: 36476, Training Loss: 0.00746
Epoch: 36476, Training Loss: 0.00915
Epoch: 36477, Training Loss: 0.00717
Epoch: 36477, Training Loss: 0.00745
Epoch: 36477, Training Loss: 0.00746
Epoch: 36477, Training Loss: 0.00915
Epoch: 36478, Training Loss: 0.00717
Epoch: 36478, Training Loss: 0.00745
Epoch: 36478, Training Loss: 0.00746
Epoch: 36478, Training Loss: 0.00915
Epoch: 36479, Training Loss: 0.00717
Epoch: 36479, Training Loss: 0.00745
Epoch: 36479, Training Loss: 0.00746
Epoch: 36479, Training Loss: 0.00915
Epoch: 36480, Training Loss: 0.00717
Epoch: 36480, Training Loss: 0.00745
Epoch: 36480, Training Loss: 0.00746
Epoch: 36480, Training Loss: 0.00915
Epoch: 36481, Training Loss: 0.00717
Epoch: 36481, Training Loss: 0.00745
Epoch: 36481, Training Loss: 0.00746
Epoch: 36481, Training Loss: 0.00915
Epoch: 36482, Training Loss: 0.00717
Epoch: 36482, Training Loss: 0.00745
Epoch: 36482, Training Loss: 0.00746
Epoch: 36482, Training Loss: 0.00915
Epoch: 36483, Training Loss: 0.00717
Epoch: 36483, Training Loss: 0.00745
Epoch: 36483, Training Loss: 0.00746
Epoch: 36483, Training Loss: 0.00915
Epoch: 36484, Training Loss: 0.00717
Epoch: 36484, Training Loss: 0.00745
Epoch: 36484, Training Loss: 0.00746
Epoch: 36484, Training Loss: 0.00915
Epoch: 36485, Training Loss: 0.00717
Epoch: 36485, Training Loss: 0.00745
Epoch: 36485, Training Loss: 0.00746
Epoch: 36485, Training Loss: 0.00915
Epoch: 36486, Training Loss: 0.00717
Epoch: 36486, Training Loss: 0.00745
Epoch: 36486, Training Loss: 0.00746
Epoch: 36486, Training Loss: 0.00915
Epoch: 36487, Training Loss: 0.00717
Epoch: 36487, Training Loss: 0.00745
Epoch: 36487, Training Loss: 0.00746
Epoch: 36487, Training Loss: 0.00915
Epoch: 36488, Training Loss: 0.00717
Epoch: 36488, Training Loss: 0.00745
Epoch: 36488, Training Loss: 0.00746
Epoch: 36488, Training Loss: 0.00915
Epoch: 36489, Training Loss: 0.00716
Epoch: 36489, Training Loss: 0.00745
Epoch: 36489, Training Loss: 0.00746
Epoch: 36489, Training Loss: 0.00915
Epoch: 36490, Training Loss: 0.00716
Epoch: 36490, Training Loss: 0.00745
Epoch: 36490, Training Loss: 0.00746
Epoch: 36490, Training Loss: 0.00915
Epoch: 36491, Training Loss: 0.00716
Epoch: 36491, Training Loss: 0.00745
Epoch: 36491, Training Loss: 0.00746
Epoch: 36491, Training Loss: 0.00915
Epoch: 36492, Training Loss: 0.00716
Epoch: 36492, Training Loss: 0.00745
Epoch: 36492, Training Loss: 0.00746
Epoch: 36492, Training Loss: 0.00915
Epoch: 36493, Training Loss: 0.00716
Epoch: 36493, Training Loss: 0.00745
Epoch: 36493, Training Loss: 0.00746
Epoch: 36493, Training Loss: 0.00915
Epoch: 36494, Training Loss: 0.00716
Epoch: 36494, Training Loss: 0.00745
Epoch: 36494, Training Loss: 0.00746
Epoch: 36494, Training Loss: 0.00915
Epoch: 36495, Training Loss: 0.00716
Epoch: 36495, Training Loss: 0.00745
Epoch: 36495, Training Loss: 0.00746
Epoch: 36495, Training Loss: 0.00915
Epoch: 36496, Training Loss: 0.00716
Epoch: 36496, Training Loss: 0.00745
Epoch: 36496, Training Loss: 0.00746
Epoch: 36496, Training Loss: 0.00915
Epoch: 36497, Training Loss: 0.00716
Epoch: 36497, Training Loss: 0.00745
Epoch: 36497, Training Loss: 0.00746
Epoch: 36497, Training Loss: 0.00915
Epoch: 36498, Training Loss: 0.00716
Epoch: 36498, Training Loss: 0.00745
Epoch: 36498, Training Loss: 0.00746
Epoch: 36498, Training Loss: 0.00915
Epoch: 36499, Training Loss: 0.00716
Epoch: 36499, Training Loss: 0.00745
Epoch: 36499, Training Loss: 0.00746
Epoch: 36499, Training Loss: 0.00915
Epoch: 36500, Training Loss: 0.00716
Epoch: 36500, Training Loss: 0.00745
Epoch: 36500, Training Loss: 0.00746
Epoch: 36500, Training Loss: 0.00915
Epoch: 36501, Training Loss: 0.00716
Epoch: 36501, Training Loss: 0.00745
Epoch: 36501, Training Loss: 0.00746
Epoch: 36501, Training Loss: 0.00915
Epoch: 36502, Training Loss: 0.00716
Epoch: 36502, Training Loss: 0.00745
Epoch: 36502, Training Loss: 0.00746
Epoch: 36502, Training Loss: 0.00915
Epoch: 36503, Training Loss: 0.00716
Epoch: 36503, Training Loss: 0.00745
Epoch: 36503, Training Loss: 0.00746
Epoch: 36503, Training Loss: 0.00915
Epoch: 36504, Training Loss: 0.00716
Epoch: 36504, Training Loss: 0.00745
Epoch: 36504, Training Loss: 0.00746
Epoch: 36504, Training Loss: 0.00915
Epoch: 36505, Training Loss: 0.00716
Epoch: 36505, Training Loss: 0.00745
Epoch: 36505, Training Loss: 0.00746
Epoch: 36505, Training Loss: 0.00915
Epoch: 36506, Training Loss: 0.00716
Epoch: 36506, Training Loss: 0.00745
Epoch: 36506, Training Loss: 0.00746
Epoch: 36506, Training Loss: 0.00915
Epoch: 36507, Training Loss: 0.00716
Epoch: 36507, Training Loss: 0.00745
Epoch: 36507, Training Loss: 0.00746
Epoch: 36507, Training Loss: 0.00915
Epoch: 36508, Training Loss: 0.00716
Epoch: 36508, Training Loss: 0.00745
Epoch: 36508, Training Loss: 0.00746
Epoch: 36508, Training Loss: 0.00915
Epoch: 36509, Training Loss: 0.00716
Epoch: 36509, Training Loss: 0.00745
Epoch: 36509, Training Loss: 0.00746
Epoch: 36509, Training Loss: 0.00915
Epoch: 36510, Training Loss: 0.00716
Epoch: 36510, Training Loss: 0.00745
Epoch: 36510, Training Loss: 0.00746
Epoch: 36510, Training Loss: 0.00915
Epoch: 36511, Training Loss: 0.00716
Epoch: 36511, Training Loss: 0.00745
Epoch: 36511, Training Loss: 0.00746
Epoch: 36511, Training Loss: 0.00915
Epoch: 36512, Training Loss: 0.00716
Epoch: 36512, Training Loss: 0.00745
Epoch: 36512, Training Loss: 0.00746
Epoch: 36512, Training Loss: 0.00915
Epoch: 36513, Training Loss: 0.00716
Epoch: 36513, Training Loss: 0.00745
Epoch: 36513, Training Loss: 0.00746
Epoch: 36513, Training Loss: 0.00915
Epoch: 36514, Training Loss: 0.00716
Epoch: 36514, Training Loss: 0.00745
Epoch: 36514, Training Loss: 0.00746
Epoch: 36514, Training Loss: 0.00915
Epoch: 36515, Training Loss: 0.00716
Epoch: 36515, Training Loss: 0.00745
Epoch: 36515, Training Loss: 0.00746
Epoch: 36515, Training Loss: 0.00915
Epoch: 36516, Training Loss: 0.00716
Epoch: 36516, Training Loss: 0.00745
Epoch: 36516, Training Loss: 0.00746
Epoch: 36516, Training Loss: 0.00915
Epoch: 36517, Training Loss: 0.00716
Epoch: 36517, Training Loss: 0.00745
Epoch: 36517, Training Loss: 0.00745
Epoch: 36517, Training Loss: 0.00915
Epoch: 36518, Training Loss: 0.00716
Epoch: 36518, Training Loss: 0.00745
Epoch: 36518, Training Loss: 0.00745
Epoch: 36518, Training Loss: 0.00915
Epoch: 36519, Training Loss: 0.00716
Epoch: 36519, Training Loss: 0.00745
Epoch: 36519, Training Loss: 0.00745
Epoch: 36519, Training Loss: 0.00915
Epoch: 36520, Training Loss: 0.00716
Epoch: 36520, Training Loss: 0.00745
Epoch: 36520, Training Loss: 0.00745
Epoch: 36520, Training Loss: 0.00915
Epoch: 36521, Training Loss: 0.00716
Epoch: 36521, Training Loss: 0.00745
Epoch: 36521, Training Loss: 0.00745
Epoch: 36521, Training Loss: 0.00915
Epoch: 36522, Training Loss: 0.00716
Epoch: 36522, Training Loss: 0.00745
Epoch: 36522, Training Loss: 0.00745
Epoch: 36522, Training Loss: 0.00915
Epoch: 36523, Training Loss: 0.00716
Epoch: 36523, Training Loss: 0.00745
Epoch: 36523, Training Loss: 0.00745
Epoch: 36523, Training Loss: 0.00915
Epoch: 36524, Training Loss: 0.00716
Epoch: 36524, Training Loss: 0.00745
Epoch: 36524, Training Loss: 0.00745
Epoch: 36524, Training Loss: 0.00915
Epoch: 36525, Training Loss: 0.00716
Epoch: 36525, Training Loss: 0.00745
Epoch: 36525, Training Loss: 0.00745
Epoch: 36525, Training Loss: 0.00915
Epoch: 36526, Training Loss: 0.00716
Epoch: 36526, Training Loss: 0.00745
Epoch: 36526, Training Loss: 0.00745
Epoch: 36526, Training Loss: 0.00915
Epoch: 36527, Training Loss: 0.00716
Epoch: 36527, Training Loss: 0.00745
Epoch: 36527, Training Loss: 0.00745
Epoch: 36527, Training Loss: 0.00915
Epoch: 36528, Training Loss: 0.00716
Epoch: 36528, Training Loss: 0.00745
Epoch: 36528, Training Loss: 0.00745
Epoch: 36528, Training Loss: 0.00915
Epoch: 36529, Training Loss: 0.00716
Epoch: 36529, Training Loss: 0.00745
Epoch: 36529, Training Loss: 0.00745
Epoch: 36529, Training Loss: 0.00915
Epoch: 36530, Training Loss: 0.00716
Epoch: 36530, Training Loss: 0.00744
Epoch: 36530, Training Loss: 0.00745
Epoch: 36530, Training Loss: 0.00914
Epoch: 36531, Training Loss: 0.00716
Epoch: 36531, Training Loss: 0.00744
Epoch: 36531, Training Loss: 0.00745
Epoch: 36531, Training Loss: 0.00914
Epoch: 36532, Training Loss: 0.00716
Epoch: 36532, Training Loss: 0.00744
Epoch: 36532, Training Loss: 0.00745
Epoch: 36532, Training Loss: 0.00914
Epoch: 36533, Training Loss: 0.00716
Epoch: 36533, Training Loss: 0.00744
Epoch: 36533, Training Loss: 0.00745
Epoch: 36533, Training Loss: 0.00914
Epoch: 36534, Training Loss: 0.00716
Epoch: 36534, Training Loss: 0.00744
Epoch: 36534, Training Loss: 0.00745
Epoch: 36534, Training Loss: 0.00914
Epoch: 36535, Training Loss: 0.00716
Epoch: 36535, Training Loss: 0.00744
Epoch: 36535, Training Loss: 0.00745
Epoch: 36535, Training Loss: 0.00914
Epoch: 36536, Training Loss: 0.00716
Epoch: 36536, Training Loss: 0.00744
Epoch: 36536, Training Loss: 0.00745
Epoch: 36536, Training Loss: 0.00914
Epoch: 36537, Training Loss: 0.00716
Epoch: 36537, Training Loss: 0.00744
Epoch: 36537, Training Loss: 0.00745
Epoch: 36537, Training Loss: 0.00914
Epoch: 36538, Training Loss: 0.00716
Epoch: 36538, Training Loss: 0.00744
Epoch: 36538, Training Loss: 0.00745
Epoch: 36538, Training Loss: 0.00914
Epoch: 36539, Training Loss: 0.00716
Epoch: 36539, Training Loss: 0.00744
Epoch: 36539, Training Loss: 0.00745
Epoch: 36539, Training Loss: 0.00914
Epoch: 36540, Training Loss: 0.00716
Epoch: 36540, Training Loss: 0.00744
Epoch: 36540, Training Loss: 0.00745
Epoch: 36540, Training Loss: 0.00914
Epoch: 36541, Training Loss: 0.00716
Epoch: 36541, Training Loss: 0.00744
Epoch: 36541, Training Loss: 0.00745
Epoch: 36541, Training Loss: 0.00914
Epoch: 36542, Training Loss: 0.00716
Epoch: 36542, Training Loss: 0.00744
Epoch: 36542, Training Loss: 0.00745
Epoch: 36542, Training Loss: 0.00914
Epoch: 36543, Training Loss: 0.00716
Epoch: 36543, Training Loss: 0.00744
Epoch: 36543, Training Loss: 0.00745
Epoch: 36543, Training Loss: 0.00914
Epoch: 36544, Training Loss: 0.00716
Epoch: 36544, Training Loss: 0.00744
Epoch: 36544, Training Loss: 0.00745
Epoch: 36544, Training Loss: 0.00914
Epoch: 36545, Training Loss: 0.00716
Epoch: 36545, Training Loss: 0.00744
Epoch: 36545, Training Loss: 0.00745
Epoch: 36545, Training Loss: 0.00914
Epoch: 36546, Training Loss: 0.00716
Epoch: 36546, Training Loss: 0.00744
Epoch: 36546, Training Loss: 0.00745
Epoch: 36546, Training Loss: 0.00914
Epoch: 36547, Training Loss: 0.00716
Epoch: 36547, Training Loss: 0.00744
Epoch: 36547, Training Loss: 0.00745
Epoch: 36547, Training Loss: 0.00914
Epoch: 36548, Training Loss: 0.00716
Epoch: 36548, Training Loss: 0.00744
Epoch: 36548, Training Loss: 0.00745
Epoch: 36548, Training Loss: 0.00914
Epoch: 36549, Training Loss: 0.00716
Epoch: 36549, Training Loss: 0.00744
Epoch: 36549, Training Loss: 0.00745
Epoch: 36549, Training Loss: 0.00914
Epoch: 36550, Training Loss: 0.00716
Epoch: 36550, Training Loss: 0.00744
Epoch: 36550, Training Loss: 0.00745
Epoch: 36550, Training Loss: 0.00914
Epoch: 36551, Training Loss: 0.00716
Epoch: 36551, Training Loss: 0.00744
Epoch: 36551, Training Loss: 0.00745
Epoch: 36551, Training Loss: 0.00914
Epoch: 36552, Training Loss: 0.00716
Epoch: 36552, Training Loss: 0.00744
Epoch: 36552, Training Loss: 0.00745
Epoch: 36552, Training Loss: 0.00914
Epoch: 36553, Training Loss: 0.00716
Epoch: 36553, Training Loss: 0.00744
Epoch: 36553, Training Loss: 0.00745
Epoch: 36553, Training Loss: 0.00914
Epoch: 36554, Training Loss: 0.00716
Epoch: 36554, Training Loss: 0.00744
Epoch: 36554, Training Loss: 0.00745
Epoch: 36554, Training Loss: 0.00914
Epoch: 36555, Training Loss: 0.00716
Epoch: 36555, Training Loss: 0.00744
Epoch: 36555, Training Loss: 0.00745
Epoch: 36555, Training Loss: 0.00914
Epoch: 36556, Training Loss: 0.00716
Epoch: 36556, Training Loss: 0.00744
Epoch: 36556, Training Loss: 0.00745
Epoch: 36556, Training Loss: 0.00914
Epoch: 36557, Training Loss: 0.00716
Epoch: 36557, Training Loss: 0.00744
Epoch: 36557, Training Loss: 0.00745
Epoch: 36557, Training Loss: 0.00914
Epoch: 36558, Training Loss: 0.00716
Epoch: 36558, Training Loss: 0.00744
Epoch: 36558, Training Loss: 0.00745
Epoch: 36558, Training Loss: 0.00914
Epoch: 36559, Training Loss: 0.00716
Epoch: 36559, Training Loss: 0.00744
Epoch: 36559, Training Loss: 0.00745
Epoch: 36559, Training Loss: 0.00914
Epoch: 36560, Training Loss: 0.00716
Epoch: 36560, Training Loss: 0.00744
Epoch: 36560, Training Loss: 0.00745
Epoch: 36560, Training Loss: 0.00914
Epoch: 36561, Training Loss: 0.00716
Epoch: 36561, Training Loss: 0.00744
Epoch: 36561, Training Loss: 0.00745
Epoch: 36561, Training Loss: 0.00914
Epoch: 36562, Training Loss: 0.00716
Epoch: 36562, Training Loss: 0.00744
Epoch: 36562, Training Loss: 0.00745
Epoch: 36562, Training Loss: 0.00914
Epoch: 36563, Training Loss: 0.00716
Epoch: 36563, Training Loss: 0.00744
Epoch: 36563, Training Loss: 0.00745
Epoch: 36563, Training Loss: 0.00914
Epoch: 36564, Training Loss: 0.00716
Epoch: 36564, Training Loss: 0.00744
Epoch: 36564, Training Loss: 0.00745
Epoch: 36564, Training Loss: 0.00914
Epoch: 36565, Training Loss: 0.00716
Epoch: 36565, Training Loss: 0.00744
Epoch: 36565, Training Loss: 0.00745
Epoch: 36565, Training Loss: 0.00914
Epoch: 36566, Training Loss: 0.00716
Epoch: 36566, Training Loss: 0.00744
Epoch: 36566, Training Loss: 0.00745
Epoch: 36566, Training Loss: 0.00914
Epoch: 36567, Training Loss: 0.00716
Epoch: 36567, Training Loss: 0.00744
Epoch: 36567, Training Loss: 0.00745
Epoch: 36567, Training Loss: 0.00914
Epoch: 36568, Training Loss: 0.00716
Epoch: 36568, Training Loss: 0.00744
Epoch: 36568, Training Loss: 0.00745
Epoch: 36568, Training Loss: 0.00914
Epoch: 36569, Training Loss: 0.00716
Epoch: 36569, Training Loss: 0.00744
Epoch: 36569, Training Loss: 0.00745
Epoch: 36569, Training Loss: 0.00914
Epoch: 36570, Training Loss: 0.00716
Epoch: 36570, Training Loss: 0.00744
Epoch: 36570, Training Loss: 0.00745
Epoch: 36570, Training Loss: 0.00914
Epoch: 36571, Training Loss: 0.00716
Epoch: 36571, Training Loss: 0.00744
Epoch: 36571, Training Loss: 0.00745
Epoch: 36571, Training Loss: 0.00914
Epoch: 36572, Training Loss: 0.00716
Epoch: 36572, Training Loss: 0.00744
Epoch: 36572, Training Loss: 0.00745
Epoch: 36572, Training Loss: 0.00914
Epoch: 36573, Training Loss: 0.00716
Epoch: 36573, Training Loss: 0.00744
Epoch: 36573, Training Loss: 0.00745
Epoch: 36573, Training Loss: 0.00914
Epoch: 36574, Training Loss: 0.00716
Epoch: 36574, Training Loss: 0.00744
Epoch: 36574, Training Loss: 0.00745
Epoch: 36574, Training Loss: 0.00914
Epoch: 36575, Training Loss: 0.00716
Epoch: 36575, Training Loss: 0.00744
Epoch: 36575, Training Loss: 0.00745
Epoch: 36575, Training Loss: 0.00914
Epoch: 36576, Training Loss: 0.00716
Epoch: 36576, Training Loss: 0.00744
Epoch: 36576, Training Loss: 0.00745
Epoch: 36576, Training Loss: 0.00914
Epoch: 36577, Training Loss: 0.00716
Epoch: 36577, Training Loss: 0.00744
Epoch: 36577, Training Loss: 0.00745
Epoch: 36577, Training Loss: 0.00914
Epoch: 36578, Training Loss: 0.00716
Epoch: 36578, Training Loss: 0.00744
Epoch: 36578, Training Loss: 0.00745
Epoch: 36578, Training Loss: 0.00914
Epoch: 36579, Training Loss: 0.00716
Epoch: 36579, Training Loss: 0.00744
Epoch: 36579, Training Loss: 0.00745
Epoch: 36579, Training Loss: 0.00914
Epoch: 36580, Training Loss: 0.00716
Epoch: 36580, Training Loss: 0.00744
Epoch: 36580, Training Loss: 0.00745
Epoch: 36580, Training Loss: 0.00914
Epoch: 36581, Training Loss: 0.00716
Epoch: 36581, Training Loss: 0.00744
Epoch: 36581, Training Loss: 0.00745
Epoch: 36581, Training Loss: 0.00914
Epoch: 36582, Training Loss: 0.00716
Epoch: 36582, Training Loss: 0.00744
Epoch: 36582, Training Loss: 0.00745
Epoch: 36582, Training Loss: 0.00914
Epoch: 36583, Training Loss: 0.00715
Epoch: 36583, Training Loss: 0.00744
Epoch: 36583, Training Loss: 0.00745
Epoch: 36583, Training Loss: 0.00914
Epoch: 36584, Training Loss: 0.00715
Epoch: 36584, Training Loss: 0.00744
Epoch: 36584, Training Loss: 0.00745
Epoch: 36584, Training Loss: 0.00914
Epoch: 36585, Training Loss: 0.00715
Epoch: 36585, Training Loss: 0.00744
Epoch: 36585, Training Loss: 0.00745
Epoch: 36585, Training Loss: 0.00914
Epoch: 36586, Training Loss: 0.00715
Epoch: 36586, Training Loss: 0.00744
Epoch: 36586, Training Loss: 0.00745
Epoch: 36586, Training Loss: 0.00914
Epoch: 36587, Training Loss: 0.00715
Epoch: 36587, Training Loss: 0.00744
Epoch: 36587, Training Loss: 0.00745
Epoch: 36587, Training Loss: 0.00914
Epoch: 36588, Training Loss: 0.00715
Epoch: 36588, Training Loss: 0.00744
Epoch: 36588, Training Loss: 0.00745
Epoch: 36588, Training Loss: 0.00914
Epoch: 36589, Training Loss: 0.00715
Epoch: 36589, Training Loss: 0.00744
Epoch: 36589, Training Loss: 0.00745
Epoch: 36589, Training Loss: 0.00914
Epoch: 36590, Training Loss: 0.00715
Epoch: 36590, Training Loss: 0.00744
Epoch: 36590, Training Loss: 0.00745
Epoch: 36590, Training Loss: 0.00914
Epoch: 36591, Training Loss: 0.00715
Epoch: 36591, Training Loss: 0.00744
Epoch: 36591, Training Loss: 0.00745
Epoch: 36591, Training Loss: 0.00914
Epoch: 36592, Training Loss: 0.00715
Epoch: 36592, Training Loss: 0.00744
Epoch: 36592, Training Loss: 0.00745
Epoch: 36592, Training Loss: 0.00914
Epoch: 36593, Training Loss: 0.00715
Epoch: 36593, Training Loss: 0.00744
Epoch: 36593, Training Loss: 0.00745
Epoch: 36593, Training Loss: 0.00914
Epoch: 36594, Training Loss: 0.00715
Epoch: 36594, Training Loss: 0.00715
Epoch: 36594, Training Loss: 0.00744
Epoch: 36594, Training Loss: 0.00745
Epoch: 36594, Training Loss: 0.00914
...
Epoch: 36837, Training Loss: 0.00713
Epoch: 36837, Training Loss: 0.00741
Epoch: 36837, Training Loss: 0.00742
Epoch: 36837, Training Loss: 0.00910
[log truncated: one loss is printed per XOR pattern each epoch; over epochs 36594-36837 the four losses decrease only marginally, from about 0.00715/0.00744/0.00745/0.00914 to 0.00713/0.00741/0.00742/0.00910]
Epoch: 36838, Training Loss: 0.00713
Epoch: 36838, Training Loss: 0.00741
Epoch: 36838, Training Loss: 0.00742
Epoch: 36838, Training Loss: 0.00910
Epoch: 36839, Training Loss: 0.00713
Epoch: 36839, Training Loss: 0.00741
Epoch: 36839, Training Loss: 0.00742
Epoch: 36839, Training Loss: 0.00910
Epoch: 36840, Training Loss: 0.00713
Epoch: 36840, Training Loss: 0.00741
Epoch: 36840, Training Loss: 0.00742
Epoch: 36840, Training Loss: 0.00910
Epoch: 36841, Training Loss: 0.00713
Epoch: 36841, Training Loss: 0.00741
Epoch: 36841, Training Loss: 0.00742
Epoch: 36841, Training Loss: 0.00910
Epoch: 36842, Training Loss: 0.00713
Epoch: 36842, Training Loss: 0.00741
Epoch: 36842, Training Loss: 0.00742
Epoch: 36842, Training Loss: 0.00910
Epoch: 36843, Training Loss: 0.00713
Epoch: 36843, Training Loss: 0.00741
Epoch: 36843, Training Loss: 0.00742
Epoch: 36843, Training Loss: 0.00910
Epoch: 36844, Training Loss: 0.00713
Epoch: 36844, Training Loss: 0.00741
Epoch: 36844, Training Loss: 0.00742
Epoch: 36844, Training Loss: 0.00910
Epoch: 36845, Training Loss: 0.00713
Epoch: 36845, Training Loss: 0.00741
Epoch: 36845, Training Loss: 0.00742
Epoch: 36845, Training Loss: 0.00910
Epoch: 36846, Training Loss: 0.00713
Epoch: 36846, Training Loss: 0.00741
Epoch: 36846, Training Loss: 0.00742
Epoch: 36846, Training Loss: 0.00910
Epoch: 36847, Training Loss: 0.00713
Epoch: 36847, Training Loss: 0.00741
Epoch: 36847, Training Loss: 0.00742
Epoch: 36847, Training Loss: 0.00910
Epoch: 36848, Training Loss: 0.00713
Epoch: 36848, Training Loss: 0.00741
Epoch: 36848, Training Loss: 0.00742
Epoch: 36848, Training Loss: 0.00910
Epoch: 36849, Training Loss: 0.00713
Epoch: 36849, Training Loss: 0.00741
Epoch: 36849, Training Loss: 0.00742
Epoch: 36849, Training Loss: 0.00910
Epoch: 36850, Training Loss: 0.00713
Epoch: 36850, Training Loss: 0.00741
Epoch: 36850, Training Loss: 0.00742
Epoch: 36850, Training Loss: 0.00910
Epoch: 36851, Training Loss: 0.00713
Epoch: 36851, Training Loss: 0.00741
Epoch: 36851, Training Loss: 0.00742
Epoch: 36851, Training Loss: 0.00910
Epoch: 36852, Training Loss: 0.00713
Epoch: 36852, Training Loss: 0.00741
Epoch: 36852, Training Loss: 0.00742
Epoch: 36852, Training Loss: 0.00910
Epoch: 36853, Training Loss: 0.00713
Epoch: 36853, Training Loss: 0.00741
Epoch: 36853, Training Loss: 0.00742
Epoch: 36853, Training Loss: 0.00910
Epoch: 36854, Training Loss: 0.00713
Epoch: 36854, Training Loss: 0.00741
Epoch: 36854, Training Loss: 0.00742
Epoch: 36854, Training Loss: 0.00910
Epoch: 36855, Training Loss: 0.00713
Epoch: 36855, Training Loss: 0.00741
Epoch: 36855, Training Loss: 0.00742
Epoch: 36855, Training Loss: 0.00910
Epoch: 36856, Training Loss: 0.00713
Epoch: 36856, Training Loss: 0.00741
Epoch: 36856, Training Loss: 0.00742
Epoch: 36856, Training Loss: 0.00910
Epoch: 36857, Training Loss: 0.00713
Epoch: 36857, Training Loss: 0.00741
Epoch: 36857, Training Loss: 0.00742
Epoch: 36857, Training Loss: 0.00910
Epoch: 36858, Training Loss: 0.00713
Epoch: 36858, Training Loss: 0.00741
Epoch: 36858, Training Loss: 0.00742
Epoch: 36858, Training Loss: 0.00910
Epoch: 36859, Training Loss: 0.00713
Epoch: 36859, Training Loss: 0.00741
Epoch: 36859, Training Loss: 0.00742
Epoch: 36859, Training Loss: 0.00910
Epoch: 36860, Training Loss: 0.00713
Epoch: 36860, Training Loss: 0.00741
Epoch: 36860, Training Loss: 0.00742
Epoch: 36860, Training Loss: 0.00910
Epoch: 36861, Training Loss: 0.00713
Epoch: 36861, Training Loss: 0.00741
Epoch: 36861, Training Loss: 0.00742
Epoch: 36861, Training Loss: 0.00910
Epoch: 36862, Training Loss: 0.00713
Epoch: 36862, Training Loss: 0.00741
Epoch: 36862, Training Loss: 0.00742
Epoch: 36862, Training Loss: 0.00910
Epoch: 36863, Training Loss: 0.00713
Epoch: 36863, Training Loss: 0.00741
Epoch: 36863, Training Loss: 0.00742
Epoch: 36863, Training Loss: 0.00910
Epoch: 36864, Training Loss: 0.00713
Epoch: 36864, Training Loss: 0.00741
Epoch: 36864, Training Loss: 0.00742
Epoch: 36864, Training Loss: 0.00910
Epoch: 36865, Training Loss: 0.00713
Epoch: 36865, Training Loss: 0.00741
Epoch: 36865, Training Loss: 0.00742
Epoch: 36865, Training Loss: 0.00910
Epoch: 36866, Training Loss: 0.00713
Epoch: 36866, Training Loss: 0.00741
Epoch: 36866, Training Loss: 0.00742
Epoch: 36866, Training Loss: 0.00910
Epoch: 36867, Training Loss: 0.00713
Epoch: 36867, Training Loss: 0.00741
Epoch: 36867, Training Loss: 0.00742
Epoch: 36867, Training Loss: 0.00910
Epoch: 36868, Training Loss: 0.00712
Epoch: 36868, Training Loss: 0.00741
Epoch: 36868, Training Loss: 0.00742
Epoch: 36868, Training Loss: 0.00910
Epoch: 36869, Training Loss: 0.00712
Epoch: 36869, Training Loss: 0.00741
Epoch: 36869, Training Loss: 0.00742
Epoch: 36869, Training Loss: 0.00910
Epoch: 36870, Training Loss: 0.00712
Epoch: 36870, Training Loss: 0.00741
Epoch: 36870, Training Loss: 0.00742
Epoch: 36870, Training Loss: 0.00910
Epoch: 36871, Training Loss: 0.00712
Epoch: 36871, Training Loss: 0.00741
Epoch: 36871, Training Loss: 0.00742
Epoch: 36871, Training Loss: 0.00910
Epoch: 36872, Training Loss: 0.00712
Epoch: 36872, Training Loss: 0.00741
Epoch: 36872, Training Loss: 0.00742
Epoch: 36872, Training Loss: 0.00910
Epoch: 36873, Training Loss: 0.00712
Epoch: 36873, Training Loss: 0.00741
Epoch: 36873, Training Loss: 0.00741
Epoch: 36873, Training Loss: 0.00910
Epoch: 36874, Training Loss: 0.00712
Epoch: 36874, Training Loss: 0.00741
Epoch: 36874, Training Loss: 0.00741
Epoch: 36874, Training Loss: 0.00910
Epoch: 36875, Training Loss: 0.00712
Epoch: 36875, Training Loss: 0.00741
Epoch: 36875, Training Loss: 0.00741
Epoch: 36875, Training Loss: 0.00910
Epoch: 36876, Training Loss: 0.00712
Epoch: 36876, Training Loss: 0.00741
Epoch: 36876, Training Loss: 0.00741
Epoch: 36876, Training Loss: 0.00910
Epoch: 36877, Training Loss: 0.00712
Epoch: 36877, Training Loss: 0.00741
Epoch: 36877, Training Loss: 0.00741
Epoch: 36877, Training Loss: 0.00910
Epoch: 36878, Training Loss: 0.00712
Epoch: 36878, Training Loss: 0.00741
Epoch: 36878, Training Loss: 0.00741
Epoch: 36878, Training Loss: 0.00910
Epoch: 36879, Training Loss: 0.00712
Epoch: 36879, Training Loss: 0.00741
Epoch: 36879, Training Loss: 0.00741
Epoch: 36879, Training Loss: 0.00910
Epoch: 36880, Training Loss: 0.00712
Epoch: 36880, Training Loss: 0.00741
Epoch: 36880, Training Loss: 0.00741
Epoch: 36880, Training Loss: 0.00910
Epoch: 36881, Training Loss: 0.00712
Epoch: 36881, Training Loss: 0.00741
Epoch: 36881, Training Loss: 0.00741
Epoch: 36881, Training Loss: 0.00910
Epoch: 36882, Training Loss: 0.00712
Epoch: 36882, Training Loss: 0.00741
Epoch: 36882, Training Loss: 0.00741
Epoch: 36882, Training Loss: 0.00910
Epoch: 36883, Training Loss: 0.00712
Epoch: 36883, Training Loss: 0.00741
Epoch: 36883, Training Loss: 0.00741
Epoch: 36883, Training Loss: 0.00910
Epoch: 36884, Training Loss: 0.00712
Epoch: 36884, Training Loss: 0.00741
Epoch: 36884, Training Loss: 0.00741
Epoch: 36884, Training Loss: 0.00910
Epoch: 36885, Training Loss: 0.00712
Epoch: 36885, Training Loss: 0.00741
Epoch: 36885, Training Loss: 0.00741
Epoch: 36885, Training Loss: 0.00910
Epoch: 36886, Training Loss: 0.00712
Epoch: 36886, Training Loss: 0.00741
Epoch: 36886, Training Loss: 0.00741
Epoch: 36886, Training Loss: 0.00910
Epoch: 36887, Training Loss: 0.00712
Epoch: 36887, Training Loss: 0.00740
Epoch: 36887, Training Loss: 0.00741
Epoch: 36887, Training Loss: 0.00910
Epoch: 36888, Training Loss: 0.00712
Epoch: 36888, Training Loss: 0.00740
Epoch: 36888, Training Loss: 0.00741
Epoch: 36888, Training Loss: 0.00910
Epoch: 36889, Training Loss: 0.00712
Epoch: 36889, Training Loss: 0.00740
Epoch: 36889, Training Loss: 0.00741
Epoch: 36889, Training Loss: 0.00910
Epoch: 36890, Training Loss: 0.00712
Epoch: 36890, Training Loss: 0.00740
Epoch: 36890, Training Loss: 0.00741
Epoch: 36890, Training Loss: 0.00910
Epoch: 36891, Training Loss: 0.00712
Epoch: 36891, Training Loss: 0.00740
Epoch: 36891, Training Loss: 0.00741
Epoch: 36891, Training Loss: 0.00909
Epoch: 36892, Training Loss: 0.00712
Epoch: 36892, Training Loss: 0.00740
Epoch: 36892, Training Loss: 0.00741
Epoch: 36892, Training Loss: 0.00909
Epoch: 36893, Training Loss: 0.00712
Epoch: 36893, Training Loss: 0.00740
Epoch: 36893, Training Loss: 0.00741
Epoch: 36893, Training Loss: 0.00909
Epoch: 36894, Training Loss: 0.00712
Epoch: 36894, Training Loss: 0.00740
Epoch: 36894, Training Loss: 0.00741
Epoch: 36894, Training Loss: 0.00909
Epoch: 36895, Training Loss: 0.00712
Epoch: 36895, Training Loss: 0.00740
Epoch: 36895, Training Loss: 0.00741
Epoch: 36895, Training Loss: 0.00909
Epoch: 36896, Training Loss: 0.00712
Epoch: 36896, Training Loss: 0.00740
Epoch: 36896, Training Loss: 0.00741
Epoch: 36896, Training Loss: 0.00909
Epoch: 36897, Training Loss: 0.00712
Epoch: 36897, Training Loss: 0.00740
Epoch: 36897, Training Loss: 0.00741
Epoch: 36897, Training Loss: 0.00909
Epoch: 36898, Training Loss: 0.00712
Epoch: 36898, Training Loss: 0.00740
Epoch: 36898, Training Loss: 0.00741
Epoch: 36898, Training Loss: 0.00909
Epoch: 36899, Training Loss: 0.00712
Epoch: 36899, Training Loss: 0.00740
Epoch: 36899, Training Loss: 0.00741
Epoch: 36899, Training Loss: 0.00909
Epoch: 36900, Training Loss: 0.00712
Epoch: 36900, Training Loss: 0.00740
Epoch: 36900, Training Loss: 0.00741
Epoch: 36900, Training Loss: 0.00909
Epoch: 36901, Training Loss: 0.00712
Epoch: 36901, Training Loss: 0.00740
Epoch: 36901, Training Loss: 0.00741
Epoch: 36901, Training Loss: 0.00909
Epoch: 36902, Training Loss: 0.00712
Epoch: 36902, Training Loss: 0.00740
Epoch: 36902, Training Loss: 0.00741
Epoch: 36902, Training Loss: 0.00909
Epoch: 36903, Training Loss: 0.00712
Epoch: 36903, Training Loss: 0.00740
Epoch: 36903, Training Loss: 0.00741
Epoch: 36903, Training Loss: 0.00909
Epoch: 36904, Training Loss: 0.00712
Epoch: 36904, Training Loss: 0.00740
Epoch: 36904, Training Loss: 0.00741
Epoch: 36904, Training Loss: 0.00909
Epoch: 36905, Training Loss: 0.00712
Epoch: 36905, Training Loss: 0.00740
Epoch: 36905, Training Loss: 0.00741
Epoch: 36905, Training Loss: 0.00909
Epoch: 36906, Training Loss: 0.00712
Epoch: 36906, Training Loss: 0.00740
Epoch: 36906, Training Loss: 0.00741
Epoch: 36906, Training Loss: 0.00909
Epoch: 36907, Training Loss: 0.00712
Epoch: 36907, Training Loss: 0.00740
Epoch: 36907, Training Loss: 0.00741
Epoch: 36907, Training Loss: 0.00909
Epoch: 36908, Training Loss: 0.00712
Epoch: 36908, Training Loss: 0.00740
Epoch: 36908, Training Loss: 0.00741
Epoch: 36908, Training Loss: 0.00909
Epoch: 36909, Training Loss: 0.00712
Epoch: 36909, Training Loss: 0.00740
Epoch: 36909, Training Loss: 0.00741
Epoch: 36909, Training Loss: 0.00909
Epoch: 36910, Training Loss: 0.00712
Epoch: 36910, Training Loss: 0.00740
Epoch: 36910, Training Loss: 0.00741
Epoch: 36910, Training Loss: 0.00909
Epoch: 36911, Training Loss: 0.00712
Epoch: 36911, Training Loss: 0.00740
Epoch: 36911, Training Loss: 0.00741
Epoch: 36911, Training Loss: 0.00909
Epoch: 36912, Training Loss: 0.00712
Epoch: 36912, Training Loss: 0.00740
Epoch: 36912, Training Loss: 0.00741
Epoch: 36912, Training Loss: 0.00909
Epoch: 36913, Training Loss: 0.00712
Epoch: 36913, Training Loss: 0.00740
Epoch: 36913, Training Loss: 0.00741
Epoch: 36913, Training Loss: 0.00909
Epoch: 36914, Training Loss: 0.00712
Epoch: 36914, Training Loss: 0.00740
Epoch: 36914, Training Loss: 0.00741
Epoch: 36914, Training Loss: 0.00909
Epoch: 36915, Training Loss: 0.00712
Epoch: 36915, Training Loss: 0.00740
Epoch: 36915, Training Loss: 0.00741
Epoch: 36915, Training Loss: 0.00909
Epoch: 36916, Training Loss: 0.00712
Epoch: 36916, Training Loss: 0.00740
Epoch: 36916, Training Loss: 0.00741
Epoch: 36916, Training Loss: 0.00909
Epoch: 36917, Training Loss: 0.00712
Epoch: 36917, Training Loss: 0.00740
Epoch: 36917, Training Loss: 0.00741
Epoch: 36917, Training Loss: 0.00909
Epoch: 36918, Training Loss: 0.00712
Epoch: 36918, Training Loss: 0.00740
Epoch: 36918, Training Loss: 0.00741
Epoch: 36918, Training Loss: 0.00909
Epoch: 36919, Training Loss: 0.00712
Epoch: 36919, Training Loss: 0.00740
Epoch: 36919, Training Loss: 0.00741
Epoch: 36919, Training Loss: 0.00909
Epoch: 36920, Training Loss: 0.00712
Epoch: 36920, Training Loss: 0.00740
Epoch: 36920, Training Loss: 0.00741
Epoch: 36920, Training Loss: 0.00909
Epoch: 36921, Training Loss: 0.00712
Epoch: 36921, Training Loss: 0.00740
Epoch: 36921, Training Loss: 0.00741
Epoch: 36921, Training Loss: 0.00909
Epoch: 36922, Training Loss: 0.00712
Epoch: 36922, Training Loss: 0.00740
Epoch: 36922, Training Loss: 0.00741
Epoch: 36922, Training Loss: 0.00909
Epoch: 36923, Training Loss: 0.00712
Epoch: 36923, Training Loss: 0.00740
Epoch: 36923, Training Loss: 0.00741
Epoch: 36923, Training Loss: 0.00909
Epoch: 36924, Training Loss: 0.00712
Epoch: 36924, Training Loss: 0.00740
Epoch: 36924, Training Loss: 0.00741
Epoch: 36924, Training Loss: 0.00909
Epoch: 36925, Training Loss: 0.00712
Epoch: 36925, Training Loss: 0.00740
Epoch: 36925, Training Loss: 0.00741
Epoch: 36925, Training Loss: 0.00909
Epoch: 36926, Training Loss: 0.00712
Epoch: 36926, Training Loss: 0.00740
Epoch: 36926, Training Loss: 0.00741
Epoch: 36926, Training Loss: 0.00909
Epoch: 36927, Training Loss: 0.00712
Epoch: 36927, Training Loss: 0.00740
Epoch: 36927, Training Loss: 0.00741
Epoch: 36927, Training Loss: 0.00909
Epoch: 36928, Training Loss: 0.00712
Epoch: 36928, Training Loss: 0.00740
Epoch: 36928, Training Loss: 0.00741
Epoch: 36928, Training Loss: 0.00909
Epoch: 36929, Training Loss: 0.00712
Epoch: 36929, Training Loss: 0.00740
Epoch: 36929, Training Loss: 0.00741
Epoch: 36929, Training Loss: 0.00909
Epoch: 36930, Training Loss: 0.00712
Epoch: 36930, Training Loss: 0.00740
Epoch: 36930, Training Loss: 0.00741
Epoch: 36930, Training Loss: 0.00909
Epoch: 36931, Training Loss: 0.00712
Epoch: 36931, Training Loss: 0.00740
Epoch: 36931, Training Loss: 0.00741
Epoch: 36931, Training Loss: 0.00909
Epoch: 36932, Training Loss: 0.00712
Epoch: 36932, Training Loss: 0.00740
Epoch: 36932, Training Loss: 0.00741
Epoch: 36932, Training Loss: 0.00909
Epoch: 36933, Training Loss: 0.00712
Epoch: 36933, Training Loss: 0.00740
Epoch: 36933, Training Loss: 0.00741
Epoch: 36933, Training Loss: 0.00909
Epoch: 36934, Training Loss: 0.00712
Epoch: 36934, Training Loss: 0.00740
Epoch: 36934, Training Loss: 0.00741
Epoch: 36934, Training Loss: 0.00909
Epoch: 36935, Training Loss: 0.00712
Epoch: 36935, Training Loss: 0.00740
Epoch: 36935, Training Loss: 0.00741
Epoch: 36935, Training Loss: 0.00909
Epoch: 36936, Training Loss: 0.00712
Epoch: 36936, Training Loss: 0.00740
Epoch: 36936, Training Loss: 0.00741
Epoch: 36936, Training Loss: 0.00909
Epoch: 36937, Training Loss: 0.00712
Epoch: 36937, Training Loss: 0.00740
Epoch: 36937, Training Loss: 0.00741
Epoch: 36937, Training Loss: 0.00909
Epoch: 36938, Training Loss: 0.00712
Epoch: 36938, Training Loss: 0.00740
Epoch: 36938, Training Loss: 0.00741
Epoch: 36938, Training Loss: 0.00909
Epoch: 36939, Training Loss: 0.00712
Epoch: 36939, Training Loss: 0.00740
Epoch: 36939, Training Loss: 0.00741
Epoch: 36939, Training Loss: 0.00909
Epoch: 36940, Training Loss: 0.00712
Epoch: 36940, Training Loss: 0.00740
Epoch: 36940, Training Loss: 0.00741
Epoch: 36940, Training Loss: 0.00909
Epoch: 36941, Training Loss: 0.00712
Epoch: 36941, Training Loss: 0.00740
Epoch: 36941, Training Loss: 0.00741
Epoch: 36941, Training Loss: 0.00909
Epoch: 36942, Training Loss: 0.00712
Epoch: 36942, Training Loss: 0.00740
Epoch: 36942, Training Loss: 0.00741
Epoch: 36942, Training Loss: 0.00909
Epoch: 36943, Training Loss: 0.00712
Epoch: 36943, Training Loss: 0.00740
Epoch: 36943, Training Loss: 0.00741
Epoch: 36943, Training Loss: 0.00909
Epoch: 36944, Training Loss: 0.00712
Epoch: 36944, Training Loss: 0.00740
Epoch: 36944, Training Loss: 0.00741
Epoch: 36944, Training Loss: 0.00909
Epoch: 36945, Training Loss: 0.00712
Epoch: 36945, Training Loss: 0.00740
Epoch: 36945, Training Loss: 0.00741
Epoch: 36945, Training Loss: 0.00909
Epoch: 36946, Training Loss: 0.00712
Epoch: 36946, Training Loss: 0.00740
Epoch: 36946, Training Loss: 0.00741
Epoch: 36946, Training Loss: 0.00909
Epoch: 36947, Training Loss: 0.00712
Epoch: 36947, Training Loss: 0.00740
Epoch: 36947, Training Loss: 0.00741
Epoch: 36947, Training Loss: 0.00909
Epoch: 36948, Training Loss: 0.00712
Epoch: 36948, Training Loss: 0.00740
Epoch: 36948, Training Loss: 0.00741
Epoch: 36948, Training Loss: 0.00909
Epoch: 36949, Training Loss: 0.00712
Epoch: 36949, Training Loss: 0.00740
Epoch: 36949, Training Loss: 0.00741
Epoch: 36949, Training Loss: 0.00909
Epoch: 36950, Training Loss: 0.00712
Epoch: 36950, Training Loss: 0.00740
Epoch: 36950, Training Loss: 0.00741
Epoch: 36950, Training Loss: 0.00909
Epoch: 36951, Training Loss: 0.00712
Epoch: 36951, Training Loss: 0.00740
Epoch: 36951, Training Loss: 0.00741
Epoch: 36951, Training Loss: 0.00909
Epoch: 36952, Training Loss: 0.00712
Epoch: 36952, Training Loss: 0.00740
Epoch: 36952, Training Loss: 0.00741
Epoch: 36952, Training Loss: 0.00909
Epoch: 36953, Training Loss: 0.00712
Epoch: 36953, Training Loss: 0.00740
Epoch: 36953, Training Loss: 0.00741
Epoch: 36953, Training Loss: 0.00909
Epoch: 36954, Training Loss: 0.00712
Epoch: 36954, Training Loss: 0.00740
Epoch: 36954, Training Loss: 0.00741
Epoch: 36954, Training Loss: 0.00909
Epoch: 36955, Training Loss: 0.00712
Epoch: 36955, Training Loss: 0.00740
Epoch: 36955, Training Loss: 0.00741
Epoch: 36955, Training Loss: 0.00909
Epoch: 36956, Training Loss: 0.00712
Epoch: 36956, Training Loss: 0.00740
Epoch: 36956, Training Loss: 0.00741
Epoch: 36956, Training Loss: 0.00909
Epoch: 36957, Training Loss: 0.00712
Epoch: 36957, Training Loss: 0.00740
Epoch: 36957, Training Loss: 0.00741
Epoch: 36957, Training Loss: 0.00909
Epoch: 36958, Training Loss: 0.00712
Epoch: 36958, Training Loss: 0.00740
Epoch: 36958, Training Loss: 0.00741
Epoch: 36958, Training Loss: 0.00909
Epoch: 36959, Training Loss: 0.00712
Epoch: 36959, Training Loss: 0.00740
Epoch: 36959, Training Loss: 0.00741
Epoch: 36959, Training Loss: 0.00909
Epoch: 36960, Training Loss: 0.00712
Epoch: 36960, Training Loss: 0.00740
Epoch: 36960, Training Loss: 0.00741
Epoch: 36960, Training Loss: 0.00909
Epoch: 36961, Training Loss: 0.00712
Epoch: 36961, Training Loss: 0.00740
Epoch: 36961, Training Loss: 0.00741
Epoch: 36961, Training Loss: 0.00909
Epoch: 36962, Training Loss: 0.00712
Epoch: 36962, Training Loss: 0.00740
Epoch: 36962, Training Loss: 0.00741
Epoch: 36962, Training Loss: 0.00909
Epoch: 36963, Training Loss: 0.00711
Epoch: 36963, Training Loss: 0.00740
Epoch: 36963, Training Loss: 0.00741
Epoch: 36963, Training Loss: 0.00908
Epoch: 36964, Training Loss: 0.00711
Epoch: 36964, Training Loss: 0.00740
Epoch: 36964, Training Loss: 0.00740
Epoch: 36964, Training Loss: 0.00908
Epoch: 36965, Training Loss: 0.00711
Epoch: 36965, Training Loss: 0.00740
Epoch: 36965, Training Loss: 0.00740
Epoch: 36965, Training Loss: 0.00908
Epoch: 36966, Training Loss: 0.00711
Epoch: 36966, Training Loss: 0.00740
Epoch: 36966, Training Loss: 0.00740
Epoch: 36966, Training Loss: 0.00908
Epoch: 36967, Training Loss: 0.00711
Epoch: 36967, Training Loss: 0.00740
Epoch: 36967, Training Loss: 0.00740
Epoch: 36967, Training Loss: 0.00908
Epoch: 36968, Training Loss: 0.00711
Epoch: 36968, Training Loss: 0.00740
Epoch: 36968, Training Loss: 0.00740
Epoch: 36968, Training Loss: 0.00908
Epoch: 36969, Training Loss: 0.00711
Epoch: 36969, Training Loss: 0.00740
Epoch: 36969, Training Loss: 0.00740
Epoch: 36969, Training Loss: 0.00908
Epoch: 36970, Training Loss: 0.00711
Epoch: 36970, Training Loss: 0.00740
Epoch: 36970, Training Loss: 0.00740
Epoch: 36970, Training Loss: 0.00908
Epoch: 36971, Training Loss: 0.00711
Epoch: 36971, Training Loss: 0.00740
Epoch: 36971, Training Loss: 0.00740
Epoch: 36971, Training Loss: 0.00908
Epoch: 36972, Training Loss: 0.00711
Epoch: 36972, Training Loss: 0.00740
Epoch: 36972, Training Loss: 0.00740
Epoch: 36972, Training Loss: 0.00908
Epoch: 36973, Training Loss: 0.00711
Epoch: 36973, Training Loss: 0.00740
Epoch: 36973, Training Loss: 0.00740
Epoch: 36973, Training Loss: 0.00908
Epoch: 36974, Training Loss: 0.00711
Epoch: 36974, Training Loss: 0.00740
Epoch: 36974, Training Loss: 0.00740
Epoch: 36974, Training Loss: 0.00908
Epoch: 36975, Training Loss: 0.00711
Epoch: 36975, Training Loss: 0.00740
Epoch: 36975, Training Loss: 0.00740
Epoch: 36975, Training Loss: 0.00908
Epoch: 36976, Training Loss: 0.00711
Epoch: 36976, Training Loss: 0.00740
Epoch: 36976, Training Loss: 0.00740
Epoch: 36976, Training Loss: 0.00908
Epoch: 36977, Training Loss: 0.00711
Epoch: 36977, Training Loss: 0.00740
Epoch: 36977, Training Loss: 0.00740
Epoch: 36977, Training Loss: 0.00908
Epoch: 36978, Training Loss: 0.00711
Epoch: 36978, Training Loss: 0.00739
Epoch: 36978, Training Loss: 0.00740
Epoch: 36978, Training Loss: 0.00908
Epoch: 36979, Training Loss: 0.00711
Epoch: 36979, Training Loss: 0.00739
Epoch: 36979, Training Loss: 0.00740
Epoch: 36979, Training Loss: 0.00908
Epoch: 36980, Training Loss: 0.00711
Epoch: 36980, Training Loss: 0.00739
Epoch: 36980, Training Loss: 0.00740
Epoch: 36980, Training Loss: 0.00908
Epoch: 36981, Training Loss: 0.00711
Epoch: 36981, Training Loss: 0.00739
Epoch: 36981, Training Loss: 0.00740
Epoch: 36981, Training Loss: 0.00908
Epoch: 36982, Training Loss: 0.00711
Epoch: 36982, Training Loss: 0.00739
Epoch: 36982, Training Loss: 0.00740
Epoch: 36982, Training Loss: 0.00908
Epoch: 36983, Training Loss: 0.00711
Epoch: 36983, Training Loss: 0.00739
Epoch: 36983, Training Loss: 0.00740
Epoch: 36983, Training Loss: 0.00908
Epoch: 36984, Training Loss: 0.00711
Epoch: 36984, Training Loss: 0.00739
Epoch: 36984, Training Loss: 0.00740
Epoch: 36984, Training Loss: 0.00908
Epoch: 36985, Training Loss: 0.00711
Epoch: 36985, Training Loss: 0.00739
Epoch: 36985, Training Loss: 0.00740
Epoch: 36985, Training Loss: 0.00908
Epoch: 36986, Training Loss: 0.00711
Epoch: 36986, Training Loss: 0.00739
Epoch: 36986, Training Loss: 0.00740
Epoch: 36986, Training Loss: 0.00908
Epoch: 36987, Training Loss: 0.00711
Epoch: 36987, Training Loss: 0.00739
Epoch: 36987, Training Loss: 0.00740
Epoch: 36987, Training Loss: 0.00908
Epoch: 36988, Training Loss: 0.00711
Epoch: 36988, Training Loss: 0.00739
Epoch: 36988, Training Loss: 0.00740
Epoch: 36988, Training Loss: 0.00908
Epoch: 36989, Training Loss: 0.00711
Epoch: 36989, Training Loss: 0.00739
Epoch: 36989, Training Loss: 0.00740
Epoch: 36989, Training Loss: 0.00908
Epoch: 36990, Training Loss: 0.00711
Epoch: 36990, Training Loss: 0.00739
Epoch: 36990, Training Loss: 0.00740
Epoch: 36990, Training Loss: 0.00908
Epoch: 36991, Training Loss: 0.00711
Epoch: 36991, Training Loss: 0.00739
Epoch: 36991, Training Loss: 0.00740
Epoch: 36991, Training Loss: 0.00908
Epoch: 36992, Training Loss: 0.00711
Epoch: 36992, Training Loss: 0.00739
Epoch: 36992, Training Loss: 0.00740
Epoch: 36992, Training Loss: 0.00908
Epoch: 36993, Training Loss: 0.00711
Epoch: 36993, Training Loss: 0.00739
Epoch: 36993, Training Loss: 0.00740
Epoch: 36993, Training Loss: 0.00908
Epoch: 36994, Training Loss: 0.00711
Epoch: 36994, Training Loss: 0.00739
Epoch: 36994, Training Loss: 0.00740
Epoch: 36994, Training Loss: 0.00908
Epoch: 36995, Training Loss: 0.00711
Epoch: 36995, Training Loss: 0.00739
Epoch: 36995, Training Loss: 0.00740
Epoch: 36995, Training Loss: 0.00908
Epoch: 36996, Training Loss: 0.00711
Epoch: 36996, Training Loss: 0.00739
Epoch: 36996, Training Loss: 0.00740
Epoch: 36996, Training Loss: 0.00908
Epoch: 36997, Training Loss: 0.00711
Epoch: 36997, Training Loss: 0.00739
Epoch: 36997, Training Loss: 0.00740
Epoch: 36997, Training Loss: 0.00908
Epoch: 36998, Training Loss: 0.00711
Epoch: 36998, Training Loss: 0.00739
Epoch: 36998, Training Loss: 0.00740
Epoch: 36998, Training Loss: 0.00908
Epoch: 36999, Training Loss: 0.00711
Epoch: 36999, Training Loss: 0.00739
Epoch: 36999, Training Loss: 0.00740
Epoch: 36999, Training Loss: 0.00908
Epoch: 37000, Training Loss: 0.00711
Epoch: 37000, Training Loss: 0.00739
Epoch: 37000, Training Loss: 0.00740
Epoch: 37000, Training Loss: 0.00908
Epoch: 37001, Training Loss: 0.00711
Epoch: 37001, Training Loss: 0.00739
Epoch: 37001, Training Loss: 0.00740
Epoch: 37001, Training Loss: 0.00908
Epoch: 37002, Training Loss: 0.00711
Epoch: 37002, Training Loss: 0.00739
Epoch: 37002, Training Loss: 0.00740
Epoch: 37002, Training Loss: 0.00908
Epoch: 37003, Training Loss: 0.00711
Epoch: 37003, Training Loss: 0.00739
Epoch: 37003, Training Loss: 0.00740
Epoch: 37003, Training Loss: 0.00908
Epoch: 37004, Training Loss: 0.00711
Epoch: 37004, Training Loss: 0.00739
Epoch: 37004, Training Loss: 0.00740
Epoch: 37004, Training Loss: 0.00908
Epoch: 37005, Training Loss: 0.00711
Epoch: 37005, Training Loss: 0.00739
Epoch: 37005, Training Loss: 0.00740
Epoch: 37005, Training Loss: 0.00908
Epoch: 37006, Training Loss: 0.00711
Epoch: 37006, Training Loss: 0.00739
Epoch: 37006, Training Loss: 0.00740
Epoch: 37006, Training Loss: 0.00908
Epoch: 37007, Training Loss: 0.00711
Epoch: 37007, Training Loss: 0.00739
Epoch: 37007, Training Loss: 0.00740
Epoch: 37007, Training Loss: 0.00908
Epoch: 37008, Training Loss: 0.00711
Epoch: 37008, Training Loss: 0.00739
Epoch: 37008, Training Loss: 0.00740
Epoch: 37008, Training Loss: 0.00908
Epoch: 37009, Training Loss: 0.00711
Epoch: 37009, Training Loss: 0.00739
Epoch: 37009, Training Loss: 0.00740
Epoch: 37009, Training Loss: 0.00908
Epoch: 37010, Training Loss: 0.00711
Epoch: 37010, Training Loss: 0.00739
Epoch: 37010, Training Loss: 0.00740
Epoch: 37010, Training Loss: 0.00908
Epoch: 37011, Training Loss: 0.00711
Epoch: 37011, Training Loss: 0.00739
Epoch: 37011, Training Loss: 0.00740
Epoch: 37011, Training Loss: 0.00908
Epoch: 37012, Training Loss: 0.00711
Epoch: 37012, Training Loss: 0.00739
Epoch: 37012, Training Loss: 0.00740
Epoch: 37012, Training Loss: 0.00908
Epoch: 37013, Training Loss: 0.00711
Epoch: 37013, Training Loss: 0.00739
Epoch: 37013, Training Loss: 0.00740
Epoch: 37013, Training Loss: 0.00908
Epoch: 37014, Training Loss: 0.00711
Epoch: 37014, Training Loss: 0.00739
Epoch: 37014, Training Loss: 0.00740
Epoch: 37014, Training Loss: 0.00908
Epoch: 37015, Training Loss: 0.00711
Epoch: 37015, Training Loss: 0.00739
Epoch: 37015, Training Loss: 0.00740
Epoch: 37015, Training Loss: 0.00908
Epoch: 37016, Training Loss: 0.00711
Epoch: 37016, Training Loss: 0.00739
Epoch: 37016, Training Loss: 0.00740
Epoch: 37016, Training Loss: 0.00908
Epoch: 37017, Training Loss: 0.00711
Epoch: 37017, Training Loss: 0.00739
Epoch: 37017, Training Loss: 0.00740
Epoch: 37017, Training Loss: 0.00908
Epoch: 37018, Training Loss: 0.00711
Epoch: 37018, Training Loss: 0.00739
Epoch: 37018, Training Loss: 0.00740
Epoch: 37018, Training Loss: 0.00908
Epoch: 37019, Training Loss: 0.00711
Epoch: 37019, Training Loss: 0.00739
Epoch: 37019, Training Loss: 0.00740
Epoch: 37019, Training Loss: 0.00908
Epoch: 37020, Training Loss: 0.00711
Epoch: 37020, Training Loss: 0.00739
Epoch: 37020, Training Loss: 0.00740
Epoch: 37020, Training Loss: 0.00908
Epoch: 37021, Training Loss: 0.00711
Epoch: 37021, Training Loss: 0.00739
Epoch: 37021, Training Loss: 0.00740
Epoch: 37021, Training Loss: 0.00908
Epoch: 37022, Training Loss: 0.00711
Epoch: 37022, Training Loss: 0.00739
Epoch: 37022, Training Loss: 0.00740
Epoch: 37022, Training Loss: 0.00908
Epoch: 37023, Training Loss: 0.00711
Epoch: 37023, Training Loss: 0.00739
Epoch: 37023, Training Loss: 0.00740
Epoch: 37023, Training Loss: 0.00908
Epoch: 37024, Training Loss: 0.00711
Epoch: 37024, Training Loss: 0.00739
Epoch: 37024, Training Loss: 0.00740
Epoch: 37024, Training Loss: 0.00908
Epoch: 37025, Training Loss: 0.00711
Epoch: 37025, Training Loss: 0.00739
Epoch: 37025, Training Loss: 0.00740
Epoch: 37025, Training Loss: 0.00908
Epoch: 37026, Training Loss: 0.00711
Epoch: 37026, Training Loss: 0.00739
Epoch: 37026, Training Loss: 0.00740
Epoch: 37026, Training Loss: 0.00908
Epoch: 37027, Training Loss: 0.00711
Epoch: 37027, Training Loss: 0.00739
Epoch: 37027, Training Loss: 0.00740
Epoch: 37027, Training Loss: 0.00908
... [output truncated: one loss is printed per XOR pattern each epoch; over epochs 37027-37270 the four losses fall very slowly, from about 0.00711 / 0.00739 / 0.00740 / 0.00908 to about 0.00708 / 0.00736 / 0.00737 / 0.00904] ...
Epoch: 37270, Training Loss: 0.00708
Epoch: 37270, Training Loss: 0.00736
Epoch: 37270, Training Loss: 0.00737
Epoch: 37270, Training Loss: 0.00904
Epoch: 37271, Training Loss: 0.00708
Epoch: 37271, Training Loss: 0.00736
Epoch: 37271, Training Loss: 0.00737
Epoch: 37271, Training Loss: 0.00904
Epoch: 37272, Training Loss: 0.00708
Epoch: 37272, Training Loss: 0.00736
Epoch: 37272, Training Loss: 0.00737
Epoch: 37272, Training Loss: 0.00904
Epoch: 37273, Training Loss: 0.00708
Epoch: 37273, Training Loss: 0.00736
Epoch: 37273, Training Loss: 0.00737
Epoch: 37273, Training Loss: 0.00904
Epoch: 37274, Training Loss: 0.00708
Epoch: 37274, Training Loss: 0.00736
Epoch: 37274, Training Loss: 0.00737
Epoch: 37274, Training Loss: 0.00904
Epoch: 37275, Training Loss: 0.00708
Epoch: 37275, Training Loss: 0.00736
Epoch: 37275, Training Loss: 0.00737
Epoch: 37275, Training Loss: 0.00904
Epoch: 37276, Training Loss: 0.00708
Epoch: 37276, Training Loss: 0.00736
Epoch: 37276, Training Loss: 0.00737
Epoch: 37276, Training Loss: 0.00904
Epoch: 37277, Training Loss: 0.00708
Epoch: 37277, Training Loss: 0.00736
Epoch: 37277, Training Loss: 0.00737
Epoch: 37277, Training Loss: 0.00904
Epoch: 37278, Training Loss: 0.00708
Epoch: 37278, Training Loss: 0.00736
Epoch: 37278, Training Loss: 0.00737
Epoch: 37278, Training Loss: 0.00904
Epoch: 37279, Training Loss: 0.00708
Epoch: 37279, Training Loss: 0.00736
Epoch: 37279, Training Loss: 0.00737
Epoch: 37279, Training Loss: 0.00904
Epoch: 37280, Training Loss: 0.00708
Epoch: 37280, Training Loss: 0.00736
Epoch: 37280, Training Loss: 0.00737
Epoch: 37280, Training Loss: 0.00904
Epoch: 37281, Training Loss: 0.00708
Epoch: 37281, Training Loss: 0.00736
Epoch: 37281, Training Loss: 0.00737
Epoch: 37281, Training Loss: 0.00904
Epoch: 37282, Training Loss: 0.00708
Epoch: 37282, Training Loss: 0.00736
Epoch: 37282, Training Loss: 0.00737
Epoch: 37282, Training Loss: 0.00904
Epoch: 37283, Training Loss: 0.00708
Epoch: 37283, Training Loss: 0.00736
Epoch: 37283, Training Loss: 0.00737
Epoch: 37283, Training Loss: 0.00904
Epoch: 37284, Training Loss: 0.00708
Epoch: 37284, Training Loss: 0.00736
Epoch: 37284, Training Loss: 0.00737
Epoch: 37284, Training Loss: 0.00904
Epoch: 37285, Training Loss: 0.00708
Epoch: 37285, Training Loss: 0.00736
Epoch: 37285, Training Loss: 0.00737
Epoch: 37285, Training Loss: 0.00904
Epoch: 37286, Training Loss: 0.00708
Epoch: 37286, Training Loss: 0.00736
Epoch: 37286, Training Loss: 0.00737
Epoch: 37286, Training Loss: 0.00904
Epoch: 37287, Training Loss: 0.00708
Epoch: 37287, Training Loss: 0.00736
Epoch: 37287, Training Loss: 0.00737
Epoch: 37287, Training Loss: 0.00904
Epoch: 37288, Training Loss: 0.00708
Epoch: 37288, Training Loss: 0.00736
Epoch: 37288, Training Loss: 0.00737
Epoch: 37288, Training Loss: 0.00904
Epoch: 37289, Training Loss: 0.00708
Epoch: 37289, Training Loss: 0.00736
Epoch: 37289, Training Loss: 0.00737
Epoch: 37289, Training Loss: 0.00904
Epoch: 37290, Training Loss: 0.00708
Epoch: 37290, Training Loss: 0.00736
Epoch: 37290, Training Loss: 0.00737
Epoch: 37290, Training Loss: 0.00904
Epoch: 37291, Training Loss: 0.00708
Epoch: 37291, Training Loss: 0.00736
Epoch: 37291, Training Loss: 0.00737
Epoch: 37291, Training Loss: 0.00904
Epoch: 37292, Training Loss: 0.00708
Epoch: 37292, Training Loss: 0.00736
Epoch: 37292, Training Loss: 0.00737
Epoch: 37292, Training Loss: 0.00904
Epoch: 37293, Training Loss: 0.00708
Epoch: 37293, Training Loss: 0.00736
Epoch: 37293, Training Loss: 0.00737
Epoch: 37293, Training Loss: 0.00904
Epoch: 37294, Training Loss: 0.00708
Epoch: 37294, Training Loss: 0.00736
Epoch: 37294, Training Loss: 0.00737
Epoch: 37294, Training Loss: 0.00904
Epoch: 37295, Training Loss: 0.00708
Epoch: 37295, Training Loss: 0.00736
Epoch: 37295, Training Loss: 0.00737
Epoch: 37295, Training Loss: 0.00904
Epoch: 37296, Training Loss: 0.00708
Epoch: 37296, Training Loss: 0.00736
Epoch: 37296, Training Loss: 0.00737
Epoch: 37296, Training Loss: 0.00904
Epoch: 37297, Training Loss: 0.00708
Epoch: 37297, Training Loss: 0.00736
Epoch: 37297, Training Loss: 0.00737
Epoch: 37297, Training Loss: 0.00904
Epoch: 37298, Training Loss: 0.00708
Epoch: 37298, Training Loss: 0.00736
Epoch: 37298, Training Loss: 0.00737
Epoch: 37298, Training Loss: 0.00904
Epoch: 37299, Training Loss: 0.00708
Epoch: 37299, Training Loss: 0.00736
Epoch: 37299, Training Loss: 0.00737
Epoch: 37299, Training Loss: 0.00904
Epoch: 37300, Training Loss: 0.00708
Epoch: 37300, Training Loss: 0.00736
Epoch: 37300, Training Loss: 0.00737
Epoch: 37300, Training Loss: 0.00904
Epoch: 37301, Training Loss: 0.00708
Epoch: 37301, Training Loss: 0.00736
Epoch: 37301, Training Loss: 0.00737
Epoch: 37301, Training Loss: 0.00904
Epoch: 37302, Training Loss: 0.00708
Epoch: 37302, Training Loss: 0.00736
Epoch: 37302, Training Loss: 0.00737
Epoch: 37302, Training Loss: 0.00904
Epoch: 37303, Training Loss: 0.00708
Epoch: 37303, Training Loss: 0.00736
Epoch: 37303, Training Loss: 0.00737
Epoch: 37303, Training Loss: 0.00904
Epoch: 37304, Training Loss: 0.00708
Epoch: 37304, Training Loss: 0.00736
Epoch: 37304, Training Loss: 0.00737
Epoch: 37304, Training Loss: 0.00904
Epoch: 37305, Training Loss: 0.00708
Epoch: 37305, Training Loss: 0.00736
Epoch: 37305, Training Loss: 0.00737
Epoch: 37305, Training Loss: 0.00904
Epoch: 37306, Training Loss: 0.00708
Epoch: 37306, Training Loss: 0.00736
Epoch: 37306, Training Loss: 0.00737
Epoch: 37306, Training Loss: 0.00904
Epoch: 37307, Training Loss: 0.00708
Epoch: 37307, Training Loss: 0.00736
Epoch: 37307, Training Loss: 0.00737
Epoch: 37307, Training Loss: 0.00904
Epoch: 37308, Training Loss: 0.00708
Epoch: 37308, Training Loss: 0.00736
Epoch: 37308, Training Loss: 0.00737
Epoch: 37308, Training Loss: 0.00904
Epoch: 37309, Training Loss: 0.00708
Epoch: 37309, Training Loss: 0.00736
Epoch: 37309, Training Loss: 0.00737
Epoch: 37309, Training Loss: 0.00904
Epoch: 37310, Training Loss: 0.00708
Epoch: 37310, Training Loss: 0.00736
Epoch: 37310, Training Loss: 0.00737
Epoch: 37310, Training Loss: 0.00904
Epoch: 37311, Training Loss: 0.00708
Epoch: 37311, Training Loss: 0.00736
Epoch: 37311, Training Loss: 0.00737
Epoch: 37311, Training Loss: 0.00904
Epoch: 37312, Training Loss: 0.00708
Epoch: 37312, Training Loss: 0.00736
Epoch: 37312, Training Loss: 0.00737
Epoch: 37312, Training Loss: 0.00904
Epoch: 37313, Training Loss: 0.00708
Epoch: 37313, Training Loss: 0.00736
Epoch: 37313, Training Loss: 0.00737
Epoch: 37313, Training Loss: 0.00904
Epoch: 37314, Training Loss: 0.00708
Epoch: 37314, Training Loss: 0.00736
Epoch: 37314, Training Loss: 0.00737
Epoch: 37314, Training Loss: 0.00904
Epoch: 37315, Training Loss: 0.00708
Epoch: 37315, Training Loss: 0.00736
Epoch: 37315, Training Loss: 0.00737
Epoch: 37315, Training Loss: 0.00904
Epoch: 37316, Training Loss: 0.00708
Epoch: 37316, Training Loss: 0.00736
Epoch: 37316, Training Loss: 0.00737
Epoch: 37316, Training Loss: 0.00904
Epoch: 37317, Training Loss: 0.00708
Epoch: 37317, Training Loss: 0.00736
Epoch: 37317, Training Loss: 0.00737
Epoch: 37317, Training Loss: 0.00904
Epoch: 37318, Training Loss: 0.00708
Epoch: 37318, Training Loss: 0.00736
Epoch: 37318, Training Loss: 0.00737
Epoch: 37318, Training Loss: 0.00904
Epoch: 37319, Training Loss: 0.00708
Epoch: 37319, Training Loss: 0.00736
Epoch: 37319, Training Loss: 0.00737
Epoch: 37319, Training Loss: 0.00904
Epoch: 37320, Training Loss: 0.00708
Epoch: 37320, Training Loss: 0.00736
Epoch: 37320, Training Loss: 0.00737
Epoch: 37320, Training Loss: 0.00904
Epoch: 37321, Training Loss: 0.00708
Epoch: 37321, Training Loss: 0.00736
Epoch: 37321, Training Loss: 0.00737
Epoch: 37321, Training Loss: 0.00904
Epoch: 37322, Training Loss: 0.00708
Epoch: 37322, Training Loss: 0.00736
Epoch: 37322, Training Loss: 0.00737
Epoch: 37322, Training Loss: 0.00904
Epoch: 37323, Training Loss: 0.00708
Epoch: 37323, Training Loss: 0.00736
Epoch: 37323, Training Loss: 0.00737
Epoch: 37323, Training Loss: 0.00904
Epoch: 37324, Training Loss: 0.00708
Epoch: 37324, Training Loss: 0.00736
Epoch: 37324, Training Loss: 0.00737
Epoch: 37324, Training Loss: 0.00904
Epoch: 37325, Training Loss: 0.00708
Epoch: 37325, Training Loss: 0.00736
Epoch: 37325, Training Loss: 0.00737
Epoch: 37325, Training Loss: 0.00904
Epoch: 37326, Training Loss: 0.00708
Epoch: 37326, Training Loss: 0.00736
Epoch: 37326, Training Loss: 0.00737
Epoch: 37326, Training Loss: 0.00904
Epoch: 37327, Training Loss: 0.00708
Epoch: 37327, Training Loss: 0.00736
Epoch: 37327, Training Loss: 0.00737
Epoch: 37327, Training Loss: 0.00904
Epoch: 37328, Training Loss: 0.00708
Epoch: 37328, Training Loss: 0.00736
Epoch: 37328, Training Loss: 0.00736
Epoch: 37328, Training Loss: 0.00904
Epoch: 37329, Training Loss: 0.00708
Epoch: 37329, Training Loss: 0.00736
Epoch: 37329, Training Loss: 0.00736
Epoch: 37329, Training Loss: 0.00904
Epoch: 37330, Training Loss: 0.00708
Epoch: 37330, Training Loss: 0.00736
Epoch: 37330, Training Loss: 0.00736
Epoch: 37330, Training Loss: 0.00904
Epoch: 37331, Training Loss: 0.00708
Epoch: 37331, Training Loss: 0.00736
Epoch: 37331, Training Loss: 0.00736
Epoch: 37331, Training Loss: 0.00903
Epoch: 37332, Training Loss: 0.00708
Epoch: 37332, Training Loss: 0.00736
Epoch: 37332, Training Loss: 0.00736
Epoch: 37332, Training Loss: 0.00903
Epoch: 37333, Training Loss: 0.00708
Epoch: 37333, Training Loss: 0.00736
Epoch: 37333, Training Loss: 0.00736
Epoch: 37333, Training Loss: 0.00903
Epoch: 37334, Training Loss: 0.00708
Epoch: 37334, Training Loss: 0.00736
Epoch: 37334, Training Loss: 0.00736
Epoch: 37334, Training Loss: 0.00903
Epoch: 37335, Training Loss: 0.00708
Epoch: 37335, Training Loss: 0.00736
Epoch: 37335, Training Loss: 0.00736
Epoch: 37335, Training Loss: 0.00903
Epoch: 37336, Training Loss: 0.00708
Epoch: 37336, Training Loss: 0.00736
Epoch: 37336, Training Loss: 0.00736
Epoch: 37336, Training Loss: 0.00903
Epoch: 37337, Training Loss: 0.00708
Epoch: 37337, Training Loss: 0.00736
Epoch: 37337, Training Loss: 0.00736
Epoch: 37337, Training Loss: 0.00903
Epoch: 37338, Training Loss: 0.00708
Epoch: 37338, Training Loss: 0.00736
Epoch: 37338, Training Loss: 0.00736
Epoch: 37338, Training Loss: 0.00903
Epoch: 37339, Training Loss: 0.00708
Epoch: 37339, Training Loss: 0.00736
Epoch: 37339, Training Loss: 0.00736
Epoch: 37339, Training Loss: 0.00903
Epoch: 37340, Training Loss: 0.00708
Epoch: 37340, Training Loss: 0.00736
Epoch: 37340, Training Loss: 0.00736
Epoch: 37340, Training Loss: 0.00903
Epoch: 37341, Training Loss: 0.00708
Epoch: 37341, Training Loss: 0.00736
Epoch: 37341, Training Loss: 0.00736
Epoch: 37341, Training Loss: 0.00903
Epoch: 37342, Training Loss: 0.00708
Epoch: 37342, Training Loss: 0.00735
Epoch: 37342, Training Loss: 0.00736
Epoch: 37342, Training Loss: 0.00903
Epoch: 37343, Training Loss: 0.00708
Epoch: 37343, Training Loss: 0.00735
Epoch: 37343, Training Loss: 0.00736
Epoch: 37343, Training Loss: 0.00903
Epoch: 37344, Training Loss: 0.00708
Epoch: 37344, Training Loss: 0.00735
Epoch: 37344, Training Loss: 0.00736
Epoch: 37344, Training Loss: 0.00903
Epoch: 37345, Training Loss: 0.00708
Epoch: 37345, Training Loss: 0.00735
Epoch: 37345, Training Loss: 0.00736
Epoch: 37345, Training Loss: 0.00903
Epoch: 37346, Training Loss: 0.00708
Epoch: 37346, Training Loss: 0.00735
Epoch: 37346, Training Loss: 0.00736
Epoch: 37346, Training Loss: 0.00903
Epoch: 37347, Training Loss: 0.00708
Epoch: 37347, Training Loss: 0.00735
Epoch: 37347, Training Loss: 0.00736
Epoch: 37347, Training Loss: 0.00903
Epoch: 37348, Training Loss: 0.00708
Epoch: 37348, Training Loss: 0.00735
Epoch: 37348, Training Loss: 0.00736
Epoch: 37348, Training Loss: 0.00903
Epoch: 37349, Training Loss: 0.00708
Epoch: 37349, Training Loss: 0.00735
Epoch: 37349, Training Loss: 0.00736
Epoch: 37349, Training Loss: 0.00903
Epoch: 37350, Training Loss: 0.00707
Epoch: 37350, Training Loss: 0.00735
Epoch: 37350, Training Loss: 0.00736
Epoch: 37350, Training Loss: 0.00903
Epoch: 37351, Training Loss: 0.00707
Epoch: 37351, Training Loss: 0.00735
Epoch: 37351, Training Loss: 0.00736
Epoch: 37351, Training Loss: 0.00903
Epoch: 37352, Training Loss: 0.00707
Epoch: 37352, Training Loss: 0.00735
Epoch: 37352, Training Loss: 0.00736
Epoch: 37352, Training Loss: 0.00903
Epoch: 37353, Training Loss: 0.00707
Epoch: 37353, Training Loss: 0.00735
Epoch: 37353, Training Loss: 0.00736
Epoch: 37353, Training Loss: 0.00903
Epoch: 37354, Training Loss: 0.00707
Epoch: 37354, Training Loss: 0.00735
Epoch: 37354, Training Loss: 0.00736
Epoch: 37354, Training Loss: 0.00903
Epoch: 37355, Training Loss: 0.00707
Epoch: 37355, Training Loss: 0.00735
Epoch: 37355, Training Loss: 0.00736
Epoch: 37355, Training Loss: 0.00903
Epoch: 37356, Training Loss: 0.00707
Epoch: 37356, Training Loss: 0.00735
Epoch: 37356, Training Loss: 0.00736
Epoch: 37356, Training Loss: 0.00903
Epoch: 37357, Training Loss: 0.00707
Epoch: 37357, Training Loss: 0.00735
Epoch: 37357, Training Loss: 0.00736
Epoch: 37357, Training Loss: 0.00903
Epoch: 37358, Training Loss: 0.00707
Epoch: 37358, Training Loss: 0.00735
Epoch: 37358, Training Loss: 0.00736
Epoch: 37358, Training Loss: 0.00903
Epoch: 37359, Training Loss: 0.00707
Epoch: 37359, Training Loss: 0.00735
Epoch: 37359, Training Loss: 0.00736
Epoch: 37359, Training Loss: 0.00903
Epoch: 37360, Training Loss: 0.00707
Epoch: 37360, Training Loss: 0.00735
Epoch: 37360, Training Loss: 0.00736
Epoch: 37360, Training Loss: 0.00903
Epoch: 37361, Training Loss: 0.00707
Epoch: 37361, Training Loss: 0.00735
Epoch: 37361, Training Loss: 0.00736
Epoch: 37361, Training Loss: 0.00903
Epoch: 37362, Training Loss: 0.00707
Epoch: 37362, Training Loss: 0.00735
Epoch: 37362, Training Loss: 0.00736
Epoch: 37362, Training Loss: 0.00903
Epoch: 37363, Training Loss: 0.00707
Epoch: 37363, Training Loss: 0.00735
Epoch: 37363, Training Loss: 0.00736
Epoch: 37363, Training Loss: 0.00903
Epoch: 37364, Training Loss: 0.00707
Epoch: 37364, Training Loss: 0.00735
Epoch: 37364, Training Loss: 0.00736
Epoch: 37364, Training Loss: 0.00903
Epoch: 37365, Training Loss: 0.00707
Epoch: 37365, Training Loss: 0.00735
Epoch: 37365, Training Loss: 0.00736
Epoch: 37365, Training Loss: 0.00903
Epoch: 37366, Training Loss: 0.00707
Epoch: 37366, Training Loss: 0.00735
Epoch: 37366, Training Loss: 0.00736
Epoch: 37366, Training Loss: 0.00903
Epoch: 37367, Training Loss: 0.00707
Epoch: 37367, Training Loss: 0.00735
Epoch: 37367, Training Loss: 0.00736
Epoch: 37367, Training Loss: 0.00903
Epoch: 37368, Training Loss: 0.00707
Epoch: 37368, Training Loss: 0.00735
Epoch: 37368, Training Loss: 0.00736
Epoch: 37368, Training Loss: 0.00903
Epoch: 37369, Training Loss: 0.00707
Epoch: 37369, Training Loss: 0.00735
Epoch: 37369, Training Loss: 0.00736
Epoch: 37369, Training Loss: 0.00903
Epoch: 37370, Training Loss: 0.00707
Epoch: 37370, Training Loss: 0.00735
Epoch: 37370, Training Loss: 0.00736
Epoch: 37370, Training Loss: 0.00903
Epoch: 37371, Training Loss: 0.00707
Epoch: 37371, Training Loss: 0.00735
Epoch: 37371, Training Loss: 0.00736
Epoch: 37371, Training Loss: 0.00903
Epoch: 37372, Training Loss: 0.00707
Epoch: 37372, Training Loss: 0.00735
Epoch: 37372, Training Loss: 0.00736
Epoch: 37372, Training Loss: 0.00903
Epoch: 37373, Training Loss: 0.00707
Epoch: 37373, Training Loss: 0.00735
Epoch: 37373, Training Loss: 0.00736
Epoch: 37373, Training Loss: 0.00903
Epoch: 37374, Training Loss: 0.00707
Epoch: 37374, Training Loss: 0.00735
Epoch: 37374, Training Loss: 0.00736
Epoch: 37374, Training Loss: 0.00903
Epoch: 37375, Training Loss: 0.00707
Epoch: 37375, Training Loss: 0.00735
Epoch: 37375, Training Loss: 0.00736
Epoch: 37375, Training Loss: 0.00903
Epoch: 37376, Training Loss: 0.00707
Epoch: 37376, Training Loss: 0.00735
Epoch: 37376, Training Loss: 0.00736
Epoch: 37376, Training Loss: 0.00903
Epoch: 37377, Training Loss: 0.00707
Epoch: 37377, Training Loss: 0.00735
Epoch: 37377, Training Loss: 0.00736
Epoch: 37377, Training Loss: 0.00903
Epoch: 37378, Training Loss: 0.00707
Epoch: 37378, Training Loss: 0.00735
Epoch: 37378, Training Loss: 0.00736
Epoch: 37378, Training Loss: 0.00903
Epoch: 37379, Training Loss: 0.00707
Epoch: 37379, Training Loss: 0.00735
Epoch: 37379, Training Loss: 0.00736
Epoch: 37379, Training Loss: 0.00903
Epoch: 37380, Training Loss: 0.00707
Epoch: 37380, Training Loss: 0.00735
Epoch: 37380, Training Loss: 0.00736
Epoch: 37380, Training Loss: 0.00903
Epoch: 37381, Training Loss: 0.00707
Epoch: 37381, Training Loss: 0.00735
Epoch: 37381, Training Loss: 0.00736
Epoch: 37381, Training Loss: 0.00903
Epoch: 37382, Training Loss: 0.00707
Epoch: 37382, Training Loss: 0.00735
Epoch: 37382, Training Loss: 0.00736
Epoch: 37382, Training Loss: 0.00903
Epoch: 37383, Training Loss: 0.00707
Epoch: 37383, Training Loss: 0.00735
Epoch: 37383, Training Loss: 0.00736
Epoch: 37383, Training Loss: 0.00903
Epoch: 37384, Training Loss: 0.00707
Epoch: 37384, Training Loss: 0.00735
Epoch: 37384, Training Loss: 0.00736
Epoch: 37384, Training Loss: 0.00903
Epoch: 37385, Training Loss: 0.00707
Epoch: 37385, Training Loss: 0.00735
Epoch: 37385, Training Loss: 0.00736
Epoch: 37385, Training Loss: 0.00903
Epoch: 37386, Training Loss: 0.00707
Epoch: 37386, Training Loss: 0.00735
Epoch: 37386, Training Loss: 0.00736
Epoch: 37386, Training Loss: 0.00903
Epoch: 37387, Training Loss: 0.00707
Epoch: 37387, Training Loss: 0.00735
Epoch: 37387, Training Loss: 0.00736
Epoch: 37387, Training Loss: 0.00903
Epoch: 37388, Training Loss: 0.00707
Epoch: 37388, Training Loss: 0.00735
Epoch: 37388, Training Loss: 0.00736
Epoch: 37388, Training Loss: 0.00903
Epoch: 37389, Training Loss: 0.00707
Epoch: 37389, Training Loss: 0.00735
Epoch: 37389, Training Loss: 0.00736
Epoch: 37389, Training Loss: 0.00903
Epoch: 37390, Training Loss: 0.00707
Epoch: 37390, Training Loss: 0.00735
Epoch: 37390, Training Loss: 0.00736
Epoch: 37390, Training Loss: 0.00903
Epoch: 37391, Training Loss: 0.00707
Epoch: 37391, Training Loss: 0.00735
Epoch: 37391, Training Loss: 0.00736
Epoch: 37391, Training Loss: 0.00903
Epoch: 37392, Training Loss: 0.00707
Epoch: 37392, Training Loss: 0.00735
Epoch: 37392, Training Loss: 0.00736
Epoch: 37392, Training Loss: 0.00903
Epoch: 37393, Training Loss: 0.00707
Epoch: 37393, Training Loss: 0.00735
Epoch: 37393, Training Loss: 0.00736
Epoch: 37393, Training Loss: 0.00903
Epoch: 37394, Training Loss: 0.00707
Epoch: 37394, Training Loss: 0.00735
Epoch: 37394, Training Loss: 0.00736
Epoch: 37394, Training Loss: 0.00903
Epoch: 37395, Training Loss: 0.00707
Epoch: 37395, Training Loss: 0.00735
Epoch: 37395, Training Loss: 0.00736
Epoch: 37395, Training Loss: 0.00903
Epoch: 37396, Training Loss: 0.00707
Epoch: 37396, Training Loss: 0.00735
Epoch: 37396, Training Loss: 0.00736
Epoch: 37396, Training Loss: 0.00903
Epoch: 37397, Training Loss: 0.00707
Epoch: 37397, Training Loss: 0.00735
Epoch: 37397, Training Loss: 0.00736
Epoch: 37397, Training Loss: 0.00903
Epoch: 37398, Training Loss: 0.00707
Epoch: 37398, Training Loss: 0.00735
Epoch: 37398, Training Loss: 0.00736
Epoch: 37398, Training Loss: 0.00903
Epoch: 37399, Training Loss: 0.00707
Epoch: 37399, Training Loss: 0.00735
Epoch: 37399, Training Loss: 0.00736
Epoch: 37399, Training Loss: 0.00903
Epoch: 37400, Training Loss: 0.00707
Epoch: 37400, Training Loss: 0.00735
Epoch: 37400, Training Loss: 0.00736
Epoch: 37400, Training Loss: 0.00903
Epoch: 37401, Training Loss: 0.00707
Epoch: 37401, Training Loss: 0.00735
Epoch: 37401, Training Loss: 0.00736
Epoch: 37401, Training Loss: 0.00903
Epoch: 37402, Training Loss: 0.00707
Epoch: 37402, Training Loss: 0.00735
Epoch: 37402, Training Loss: 0.00736
Epoch: 37402, Training Loss: 0.00903
Epoch: 37403, Training Loss: 0.00707
Epoch: 37403, Training Loss: 0.00735
Epoch: 37403, Training Loss: 0.00736
Epoch: 37403, Training Loss: 0.00903
Epoch: 37404, Training Loss: 0.00707
Epoch: 37404, Training Loss: 0.00735
Epoch: 37404, Training Loss: 0.00736
Epoch: 37404, Training Loss: 0.00903
Epoch: 37405, Training Loss: 0.00707
Epoch: 37405, Training Loss: 0.00735
Epoch: 37405, Training Loss: 0.00736
Epoch: 37405, Training Loss: 0.00902
Epoch: 37406, Training Loss: 0.00707
Epoch: 37406, Training Loss: 0.00735
Epoch: 37406, Training Loss: 0.00736
Epoch: 37406, Training Loss: 0.00902
Epoch: 37407, Training Loss: 0.00707
Epoch: 37407, Training Loss: 0.00735
Epoch: 37407, Training Loss: 0.00736
Epoch: 37407, Training Loss: 0.00902
Epoch: 37408, Training Loss: 0.00707
Epoch: 37408, Training Loss: 0.00735
Epoch: 37408, Training Loss: 0.00736
Epoch: 37408, Training Loss: 0.00902
Epoch: 37409, Training Loss: 0.00707
Epoch: 37409, Training Loss: 0.00735
Epoch: 37409, Training Loss: 0.00736
Epoch: 37409, Training Loss: 0.00902
Epoch: 37410, Training Loss: 0.00707
Epoch: 37410, Training Loss: 0.00735
Epoch: 37410, Training Loss: 0.00736
Epoch: 37410, Training Loss: 0.00902
Epoch: 37411, Training Loss: 0.00707
Epoch: 37411, Training Loss: 0.00735
Epoch: 37411, Training Loss: 0.00736
Epoch: 37411, Training Loss: 0.00902
Epoch: 37412, Training Loss: 0.00707
Epoch: 37412, Training Loss: 0.00735
Epoch: 37412, Training Loss: 0.00736
Epoch: 37412, Training Loss: 0.00902
Epoch: 37413, Training Loss: 0.00707
Epoch: 37413, Training Loss: 0.00735
Epoch: 37413, Training Loss: 0.00736
Epoch: 37413, Training Loss: 0.00902
Epoch: 37414, Training Loss: 0.00707
Epoch: 37414, Training Loss: 0.00735
Epoch: 37414, Training Loss: 0.00736
Epoch: 37414, Training Loss: 0.00902
Epoch: 37415, Training Loss: 0.00707
Epoch: 37415, Training Loss: 0.00735
Epoch: 37415, Training Loss: 0.00736
Epoch: 37415, Training Loss: 0.00902
Epoch: 37416, Training Loss: 0.00707
Epoch: 37416, Training Loss: 0.00735
Epoch: 37416, Training Loss: 0.00736
Epoch: 37416, Training Loss: 0.00902
Epoch: 37417, Training Loss: 0.00707
Epoch: 37417, Training Loss: 0.00735
Epoch: 37417, Training Loss: 0.00736
Epoch: 37417, Training Loss: 0.00902
Epoch: 37418, Training Loss: 0.00707
Epoch: 37418, Training Loss: 0.00735
Epoch: 37418, Training Loss: 0.00736
Epoch: 37418, Training Loss: 0.00902
Epoch: 37419, Training Loss: 0.00707
Epoch: 37419, Training Loss: 0.00735
Epoch: 37419, Training Loss: 0.00735
Epoch: 37419, Training Loss: 0.00902
Epoch: 37420, Training Loss: 0.00707
Epoch: 37420, Training Loss: 0.00735
Epoch: 37420, Training Loss: 0.00735
Epoch: 37420, Training Loss: 0.00902
Epoch: 37421, Training Loss: 0.00707
Epoch: 37421, Training Loss: 0.00735
Epoch: 37421, Training Loss: 0.00735
Epoch: 37421, Training Loss: 0.00902
Epoch: 37422, Training Loss: 0.00707
Epoch: 37422, Training Loss: 0.00735
Epoch: 37422, Training Loss: 0.00735
Epoch: 37422, Training Loss: 0.00902
Epoch: 37423, Training Loss: 0.00707
Epoch: 37423, Training Loss: 0.00735
Epoch: 37423, Training Loss: 0.00735
Epoch: 37423, Training Loss: 0.00902
Epoch: 37424, Training Loss: 0.00707
Epoch: 37424, Training Loss: 0.00735
Epoch: 37424, Training Loss: 0.00735
Epoch: 37424, Training Loss: 0.00902
Epoch: 37425, Training Loss: 0.00707
Epoch: 37425, Training Loss: 0.00735
Epoch: 37425, Training Loss: 0.00735
Epoch: 37425, Training Loss: 0.00902
Epoch: 37426, Training Loss: 0.00707
Epoch: 37426, Training Loss: 0.00735
Epoch: 37426, Training Loss: 0.00735
Epoch: 37426, Training Loss: 0.00902
Epoch: 37427, Training Loss: 0.00707
Epoch: 37427, Training Loss: 0.00735
Epoch: 37427, Training Loss: 0.00735
Epoch: 37427, Training Loss: 0.00902
Epoch: 37428, Training Loss: 0.00707
Epoch: 37428, Training Loss: 0.00735
Epoch: 37428, Training Loss: 0.00735
Epoch: 37428, Training Loss: 0.00902
Epoch: 37429, Training Loss: 0.00707
Epoch: 37429, Training Loss: 0.00735
Epoch: 37429, Training Loss: 0.00735
Epoch: 37429, Training Loss: 0.00902
Epoch: 37430, Training Loss: 0.00707
Epoch: 37430, Training Loss: 0.00735
Epoch: 37430, Training Loss: 0.00735
Epoch: 37430, Training Loss: 0.00902
Epoch: 37431, Training Loss: 0.00707
Epoch: 37431, Training Loss: 0.00735
Epoch: 37431, Training Loss: 0.00735
Epoch: 37431, Training Loss: 0.00902
Epoch: 37432, Training Loss: 0.00707
Epoch: 37432, Training Loss: 0.00735
Epoch: 37432, Training Loss: 0.00735
Epoch: 37432, Training Loss: 0.00902
Epoch: 37433, Training Loss: 0.00707
Epoch: 37433, Training Loss: 0.00735
Epoch: 37433, Training Loss: 0.00735
Epoch: 37433, Training Loss: 0.00902
Epoch: 37434, Training Loss: 0.00707
Epoch: 37434, Training Loss: 0.00735
Epoch: 37434, Training Loss: 0.00735
Epoch: 37434, Training Loss: 0.00902
Epoch: 37435, Training Loss: 0.00707
Epoch: 37435, Training Loss: 0.00734
Epoch: 37435, Training Loss: 0.00735
Epoch: 37435, Training Loss: 0.00902
Epoch: 37436, Training Loss: 0.00707
Epoch: 37436, Training Loss: 0.00734
Epoch: 37436, Training Loss: 0.00735
Epoch: 37436, Training Loss: 0.00902
Epoch: 37437, Training Loss: 0.00707
Epoch: 37437, Training Loss: 0.00734
Epoch: 37437, Training Loss: 0.00735
Epoch: 37437, Training Loss: 0.00902
Epoch: 37438, Training Loss: 0.00707
Epoch: 37438, Training Loss: 0.00734
Epoch: 37438, Training Loss: 0.00735
Epoch: 37438, Training Loss: 0.00902
Epoch: 37439, Training Loss: 0.00707
Epoch: 37439, Training Loss: 0.00734
Epoch: 37439, Training Loss: 0.00735
Epoch: 37439, Training Loss: 0.00902
Epoch: 37440, Training Loss: 0.00707
Epoch: 37440, Training Loss: 0.00734
Epoch: 37440, Training Loss: 0.00735
Epoch: 37440, Training Loss: 0.00902
Epoch: 37441, Training Loss: 0.00707
Epoch: 37441, Training Loss: 0.00734
Epoch: 37441, Training Loss: 0.00735
Epoch: 37441, Training Loss: 0.00902
Epoch: 37442, Training Loss: 0.00707
Epoch: 37442, Training Loss: 0.00734
Epoch: 37442, Training Loss: 0.00735
Epoch: 37442, Training Loss: 0.00902
Epoch: 37443, Training Loss: 0.00707
Epoch: 37443, Training Loss: 0.00734
Epoch: 37443, Training Loss: 0.00735
Epoch: 37443, Training Loss: 0.00902
Epoch: 37444, Training Loss: 0.00707
Epoch: 37444, Training Loss: 0.00734
Epoch: 37444, Training Loss: 0.00735
Epoch: 37444, Training Loss: 0.00902
Epoch: 37445, Training Loss: 0.00707
Epoch: 37445, Training Loss: 0.00734
Epoch: 37445, Training Loss: 0.00735
Epoch: 37445, Training Loss: 0.00902
Epoch: 37446, Training Loss: 0.00707
Epoch: 37446, Training Loss: 0.00734
Epoch: 37446, Training Loss: 0.00735
Epoch: 37446, Training Loss: 0.00902
Epoch: 37447, Training Loss: 0.00707
Epoch: 37447, Training Loss: 0.00734
Epoch: 37447, Training Loss: 0.00735
Epoch: 37447, Training Loss: 0.00902
Epoch: 37448, Training Loss: 0.00706
Epoch: 37448, Training Loss: 0.00734
Epoch: 37448, Training Loss: 0.00735
Epoch: 37448, Training Loss: 0.00902
Epoch: 37449, Training Loss: 0.00706
Epoch: 37449, Training Loss: 0.00734
Epoch: 37449, Training Loss: 0.00735
Epoch: 37449, Training Loss: 0.00902
Epoch: 37450, Training Loss: 0.00706
Epoch: 37450, Training Loss: 0.00734
Epoch: 37450, Training Loss: 0.00735
Epoch: 37450, Training Loss: 0.00902
Epoch: 37451, Training Loss: 0.00706
Epoch: 37451, Training Loss: 0.00734
Epoch: 37451, Training Loss: 0.00735
Epoch: 37451, Training Loss: 0.00902
Epoch: 37452, Training Loss: 0.00706
Epoch: 37452, Training Loss: 0.00734
Epoch: 37452, Training Loss: 0.00735
Epoch: 37452, Training Loss: 0.00902
Epoch: 37453, Training Loss: 0.00706
Epoch: 37453, Training Loss: 0.00734
Epoch: 37453, Training Loss: 0.00735
Epoch: 37453, Training Loss: 0.00902
Epoch: 37454, Training Loss: 0.00706
Epoch: 37454, Training Loss: 0.00734
Epoch: 37454, Training Loss: 0.00735
Epoch: 37454, Training Loss: 0.00902
Epoch: 37455, Training Loss: 0.00706
Epoch: 37455, Training Loss: 0.00734
Epoch: 37455, Training Loss: 0.00735
Epoch: 37455, Training Loss: 0.00902
Epoch: 37456, Training Loss: 0.00706
Epoch: 37456, Training Loss: 0.00734
Epoch: 37456, Training Loss: 0.00735
Epoch: 37456, Training Loss: 0.00902
Epoch: 37457, Training Loss: 0.00706
Epoch: 37457, Training Loss: 0.00734
Epoch: 37457, Training Loss: 0.00735
Epoch: 37457, Training Loss: 0.00902
Epoch: 37458, Training Loss: 0.00706
Epoch: 37458, Training Loss: 0.00734
Epoch: 37458, Training Loss: 0.00735
Epoch: 37458, Training Loss: 0.00902
Epoch: 37459, Training Loss: 0.00706
[... training log truncated: epochs 37459–37702, four per-sample losses printed per epoch. Over this window the losses decrease very slowly — the smallest sample loss drops from 0.00706 to 0.00704, and the largest from 0.00902 to 0.00899, which is still above the 0.008 target at epoch 37702 ...]
Epoch: 37703, Training Loss: 0.00704
Epoch: 37703, Training Loss: 0.00732
Epoch: 37703, Training Loss: 0.00732
Epoch: 37703, Training Loss: 0.00899
Epoch: 37704, Training Loss: 0.00704
Epoch: 37704, Training Loss: 0.00732
Epoch: 37704, Training Loss: 0.00732
Epoch: 37704, Training Loss: 0.00898
Epoch: 37705, Training Loss: 0.00704
Epoch: 37705, Training Loss: 0.00732
Epoch: 37705, Training Loss: 0.00732
Epoch: 37705, Training Loss: 0.00898
Epoch: 37706, Training Loss: 0.00704
Epoch: 37706, Training Loss: 0.00732
Epoch: 37706, Training Loss: 0.00732
Epoch: 37706, Training Loss: 0.00898
Epoch: 37707, Training Loss: 0.00704
Epoch: 37707, Training Loss: 0.00732
Epoch: 37707, Training Loss: 0.00732
Epoch: 37707, Training Loss: 0.00898
Epoch: 37708, Training Loss: 0.00704
Epoch: 37708, Training Loss: 0.00732
Epoch: 37708, Training Loss: 0.00732
Epoch: 37708, Training Loss: 0.00898
Epoch: 37709, Training Loss: 0.00704
Epoch: 37709, Training Loss: 0.00732
Epoch: 37709, Training Loss: 0.00732
Epoch: 37709, Training Loss: 0.00898
Epoch: 37710, Training Loss: 0.00704
Epoch: 37710, Training Loss: 0.00732
Epoch: 37710, Training Loss: 0.00732
Epoch: 37710, Training Loss: 0.00898
Epoch: 37711, Training Loss: 0.00704
Epoch: 37711, Training Loss: 0.00732
Epoch: 37711, Training Loss: 0.00732
Epoch: 37711, Training Loss: 0.00898
Epoch: 37712, Training Loss: 0.00704
Epoch: 37712, Training Loss: 0.00732
Epoch: 37712, Training Loss: 0.00732
Epoch: 37712, Training Loss: 0.00898
Epoch: 37713, Training Loss: 0.00704
Epoch: 37713, Training Loss: 0.00731
Epoch: 37713, Training Loss: 0.00732
Epoch: 37713, Training Loss: 0.00898
Epoch: 37714, Training Loss: 0.00704
Epoch: 37714, Training Loss: 0.00731
Epoch: 37714, Training Loss: 0.00732
Epoch: 37714, Training Loss: 0.00898
Epoch: 37715, Training Loss: 0.00704
Epoch: 37715, Training Loss: 0.00731
Epoch: 37715, Training Loss: 0.00732
Epoch: 37715, Training Loss: 0.00898
Epoch: 37716, Training Loss: 0.00704
Epoch: 37716, Training Loss: 0.00731
Epoch: 37716, Training Loss: 0.00732
Epoch: 37716, Training Loss: 0.00898
Epoch: 37717, Training Loss: 0.00704
Epoch: 37717, Training Loss: 0.00731
Epoch: 37717, Training Loss: 0.00732
Epoch: 37717, Training Loss: 0.00898
Epoch: 37718, Training Loss: 0.00704
Epoch: 37718, Training Loss: 0.00731
Epoch: 37718, Training Loss: 0.00732
Epoch: 37718, Training Loss: 0.00898
Epoch: 37719, Training Loss: 0.00704
Epoch: 37719, Training Loss: 0.00731
Epoch: 37719, Training Loss: 0.00732
Epoch: 37719, Training Loss: 0.00898
Epoch: 37720, Training Loss: 0.00704
Epoch: 37720, Training Loss: 0.00731
Epoch: 37720, Training Loss: 0.00732
Epoch: 37720, Training Loss: 0.00898
Epoch: 37721, Training Loss: 0.00704
Epoch: 37721, Training Loss: 0.00731
Epoch: 37721, Training Loss: 0.00732
Epoch: 37721, Training Loss: 0.00898
Epoch: 37722, Training Loss: 0.00704
Epoch: 37722, Training Loss: 0.00731
Epoch: 37722, Training Loss: 0.00732
Epoch: 37722, Training Loss: 0.00898
Epoch: 37723, Training Loss: 0.00704
Epoch: 37723, Training Loss: 0.00731
Epoch: 37723, Training Loss: 0.00732
Epoch: 37723, Training Loss: 0.00898
Epoch: 37724, Training Loss: 0.00704
Epoch: 37724, Training Loss: 0.00731
Epoch: 37724, Training Loss: 0.00732
Epoch: 37724, Training Loss: 0.00898
Epoch: 37725, Training Loss: 0.00704
Epoch: 37725, Training Loss: 0.00731
Epoch: 37725, Training Loss: 0.00732
Epoch: 37725, Training Loss: 0.00898
Epoch: 37726, Training Loss: 0.00704
Epoch: 37726, Training Loss: 0.00731
Epoch: 37726, Training Loss: 0.00732
Epoch: 37726, Training Loss: 0.00898
Epoch: 37727, Training Loss: 0.00704
Epoch: 37727, Training Loss: 0.00731
Epoch: 37727, Training Loss: 0.00732
Epoch: 37727, Training Loss: 0.00898
Epoch: 37728, Training Loss: 0.00704
Epoch: 37728, Training Loss: 0.00731
Epoch: 37728, Training Loss: 0.00732
Epoch: 37728, Training Loss: 0.00898
Epoch: 37729, Training Loss: 0.00704
Epoch: 37729, Training Loss: 0.00731
Epoch: 37729, Training Loss: 0.00732
Epoch: 37729, Training Loss: 0.00898
Epoch: 37730, Training Loss: 0.00704
Epoch: 37730, Training Loss: 0.00731
Epoch: 37730, Training Loss: 0.00732
Epoch: 37730, Training Loss: 0.00898
Epoch: 37731, Training Loss: 0.00704
Epoch: 37731, Training Loss: 0.00731
Epoch: 37731, Training Loss: 0.00732
Epoch: 37731, Training Loss: 0.00898
Epoch: 37732, Training Loss: 0.00704
Epoch: 37732, Training Loss: 0.00731
Epoch: 37732, Training Loss: 0.00732
Epoch: 37732, Training Loss: 0.00898
Epoch: 37733, Training Loss: 0.00704
Epoch: 37733, Training Loss: 0.00731
Epoch: 37733, Training Loss: 0.00732
Epoch: 37733, Training Loss: 0.00898
Epoch: 37734, Training Loss: 0.00704
Epoch: 37734, Training Loss: 0.00731
Epoch: 37734, Training Loss: 0.00732
Epoch: 37734, Training Loss: 0.00898
Epoch: 37735, Training Loss: 0.00704
Epoch: 37735, Training Loss: 0.00731
Epoch: 37735, Training Loss: 0.00732
Epoch: 37735, Training Loss: 0.00898
Epoch: 37736, Training Loss: 0.00704
Epoch: 37736, Training Loss: 0.00731
Epoch: 37736, Training Loss: 0.00732
Epoch: 37736, Training Loss: 0.00898
Epoch: 37737, Training Loss: 0.00704
Epoch: 37737, Training Loss: 0.00731
Epoch: 37737, Training Loss: 0.00732
Epoch: 37737, Training Loss: 0.00898
Epoch: 37738, Training Loss: 0.00704
Epoch: 37738, Training Loss: 0.00731
Epoch: 37738, Training Loss: 0.00732
Epoch: 37738, Training Loss: 0.00898
Epoch: 37739, Training Loss: 0.00704
Epoch: 37739, Training Loss: 0.00731
Epoch: 37739, Training Loss: 0.00732
Epoch: 37739, Training Loss: 0.00898
Epoch: 37740, Training Loss: 0.00704
Epoch: 37740, Training Loss: 0.00731
Epoch: 37740, Training Loss: 0.00732
Epoch: 37740, Training Loss: 0.00898
Epoch: 37741, Training Loss: 0.00704
Epoch: 37741, Training Loss: 0.00731
Epoch: 37741, Training Loss: 0.00732
Epoch: 37741, Training Loss: 0.00898
Epoch: 37742, Training Loss: 0.00704
Epoch: 37742, Training Loss: 0.00731
Epoch: 37742, Training Loss: 0.00732
Epoch: 37742, Training Loss: 0.00898
Epoch: 37743, Training Loss: 0.00704
Epoch: 37743, Training Loss: 0.00731
Epoch: 37743, Training Loss: 0.00732
Epoch: 37743, Training Loss: 0.00898
Epoch: 37744, Training Loss: 0.00703
Epoch: 37744, Training Loss: 0.00731
Epoch: 37744, Training Loss: 0.00732
Epoch: 37744, Training Loss: 0.00898
Epoch: 37745, Training Loss: 0.00703
Epoch: 37745, Training Loss: 0.00731
Epoch: 37745, Training Loss: 0.00732
Epoch: 37745, Training Loss: 0.00898
Epoch: 37746, Training Loss: 0.00703
Epoch: 37746, Training Loss: 0.00731
Epoch: 37746, Training Loss: 0.00732
Epoch: 37746, Training Loss: 0.00898
Epoch: 37747, Training Loss: 0.00703
Epoch: 37747, Training Loss: 0.00731
Epoch: 37747, Training Loss: 0.00732
Epoch: 37747, Training Loss: 0.00898
Epoch: 37748, Training Loss: 0.00703
Epoch: 37748, Training Loss: 0.00731
Epoch: 37748, Training Loss: 0.00732
Epoch: 37748, Training Loss: 0.00898
Epoch: 37749, Training Loss: 0.00703
Epoch: 37749, Training Loss: 0.00731
Epoch: 37749, Training Loss: 0.00732
Epoch: 37749, Training Loss: 0.00898
Epoch: 37750, Training Loss: 0.00703
Epoch: 37750, Training Loss: 0.00731
Epoch: 37750, Training Loss: 0.00732
Epoch: 37750, Training Loss: 0.00898
Epoch: 37751, Training Loss: 0.00703
Epoch: 37751, Training Loss: 0.00731
Epoch: 37751, Training Loss: 0.00732
Epoch: 37751, Training Loss: 0.00898
Epoch: 37752, Training Loss: 0.00703
Epoch: 37752, Training Loss: 0.00731
Epoch: 37752, Training Loss: 0.00732
Epoch: 37752, Training Loss: 0.00898
Epoch: 37753, Training Loss: 0.00703
Epoch: 37753, Training Loss: 0.00731
Epoch: 37753, Training Loss: 0.00732
Epoch: 37753, Training Loss: 0.00898
Epoch: 37754, Training Loss: 0.00703
Epoch: 37754, Training Loss: 0.00731
Epoch: 37754, Training Loss: 0.00732
Epoch: 37754, Training Loss: 0.00898
Epoch: 37755, Training Loss: 0.00703
Epoch: 37755, Training Loss: 0.00731
Epoch: 37755, Training Loss: 0.00732
Epoch: 37755, Training Loss: 0.00898
Epoch: 37756, Training Loss: 0.00703
Epoch: 37756, Training Loss: 0.00731
Epoch: 37756, Training Loss: 0.00732
Epoch: 37756, Training Loss: 0.00898
Epoch: 37757, Training Loss: 0.00703
Epoch: 37757, Training Loss: 0.00731
Epoch: 37757, Training Loss: 0.00732
Epoch: 37757, Training Loss: 0.00898
Epoch: 37758, Training Loss: 0.00703
Epoch: 37758, Training Loss: 0.00731
Epoch: 37758, Training Loss: 0.00732
Epoch: 37758, Training Loss: 0.00898
Epoch: 37759, Training Loss: 0.00703
Epoch: 37759, Training Loss: 0.00731
Epoch: 37759, Training Loss: 0.00732
Epoch: 37759, Training Loss: 0.00898
Epoch: 37760, Training Loss: 0.00703
Epoch: 37760, Training Loss: 0.00731
Epoch: 37760, Training Loss: 0.00732
Epoch: 37760, Training Loss: 0.00898
Epoch: 37761, Training Loss: 0.00703
Epoch: 37761, Training Loss: 0.00731
Epoch: 37761, Training Loss: 0.00732
Epoch: 37761, Training Loss: 0.00898
Epoch: 37762, Training Loss: 0.00703
Epoch: 37762, Training Loss: 0.00731
Epoch: 37762, Training Loss: 0.00732
Epoch: 37762, Training Loss: 0.00898
Epoch: 37763, Training Loss: 0.00703
Epoch: 37763, Training Loss: 0.00731
Epoch: 37763, Training Loss: 0.00732
Epoch: 37763, Training Loss: 0.00898
Epoch: 37764, Training Loss: 0.00703
Epoch: 37764, Training Loss: 0.00731
Epoch: 37764, Training Loss: 0.00732
Epoch: 37764, Training Loss: 0.00898
Epoch: 37765, Training Loss: 0.00703
Epoch: 37765, Training Loss: 0.00731
Epoch: 37765, Training Loss: 0.00732
Epoch: 37765, Training Loss: 0.00898
Epoch: 37766, Training Loss: 0.00703
Epoch: 37766, Training Loss: 0.00731
Epoch: 37766, Training Loss: 0.00732
Epoch: 37766, Training Loss: 0.00898
Epoch: 37767, Training Loss: 0.00703
Epoch: 37767, Training Loss: 0.00731
Epoch: 37767, Training Loss: 0.00732
Epoch: 37767, Training Loss: 0.00898
Epoch: 37768, Training Loss: 0.00703
Epoch: 37768, Training Loss: 0.00731
Epoch: 37768, Training Loss: 0.00732
Epoch: 37768, Training Loss: 0.00898
Epoch: 37769, Training Loss: 0.00703
Epoch: 37769, Training Loss: 0.00731
Epoch: 37769, Training Loss: 0.00732
Epoch: 37769, Training Loss: 0.00898
Epoch: 37770, Training Loss: 0.00703
Epoch: 37770, Training Loss: 0.00731
Epoch: 37770, Training Loss: 0.00732
Epoch: 37770, Training Loss: 0.00898
Epoch: 37771, Training Loss: 0.00703
Epoch: 37771, Training Loss: 0.00731
Epoch: 37771, Training Loss: 0.00732
Epoch: 37771, Training Loss: 0.00898
Epoch: 37772, Training Loss: 0.00703
Epoch: 37772, Training Loss: 0.00731
Epoch: 37772, Training Loss: 0.00732
Epoch: 37772, Training Loss: 0.00898
Epoch: 37773, Training Loss: 0.00703
Epoch: 37773, Training Loss: 0.00731
Epoch: 37773, Training Loss: 0.00732
Epoch: 37773, Training Loss: 0.00898
Epoch: 37774, Training Loss: 0.00703
Epoch: 37774, Training Loss: 0.00731
Epoch: 37774, Training Loss: 0.00732
Epoch: 37774, Training Loss: 0.00898
Epoch: 37775, Training Loss: 0.00703
Epoch: 37775, Training Loss: 0.00731
Epoch: 37775, Training Loss: 0.00732
Epoch: 37775, Training Loss: 0.00898
Epoch: 37776, Training Loss: 0.00703
Epoch: 37776, Training Loss: 0.00731
Epoch: 37776, Training Loss: 0.00732
Epoch: 37776, Training Loss: 0.00898
Epoch: 37777, Training Loss: 0.00703
Epoch: 37777, Training Loss: 0.00731
Epoch: 37777, Training Loss: 0.00732
Epoch: 37777, Training Loss: 0.00898
Epoch: 37778, Training Loss: 0.00703
Epoch: 37778, Training Loss: 0.00731
Epoch: 37778, Training Loss: 0.00732
Epoch: 37778, Training Loss: 0.00898
Epoch: 37779, Training Loss: 0.00703
Epoch: 37779, Training Loss: 0.00731
Epoch: 37779, Training Loss: 0.00732
Epoch: 37779, Training Loss: 0.00897
Epoch: 37780, Training Loss: 0.00703
Epoch: 37780, Training Loss: 0.00731
Epoch: 37780, Training Loss: 0.00732
Epoch: 37780, Training Loss: 0.00897
Epoch: 37781, Training Loss: 0.00703
Epoch: 37781, Training Loss: 0.00731
Epoch: 37781, Training Loss: 0.00732
Epoch: 37781, Training Loss: 0.00897
Epoch: 37782, Training Loss: 0.00703
Epoch: 37782, Training Loss: 0.00731
Epoch: 37782, Training Loss: 0.00732
Epoch: 37782, Training Loss: 0.00897
Epoch: 37783, Training Loss: 0.00703
Epoch: 37783, Training Loss: 0.00731
Epoch: 37783, Training Loss: 0.00732
Epoch: 37783, Training Loss: 0.00897
Epoch: 37784, Training Loss: 0.00703
Epoch: 37784, Training Loss: 0.00731
Epoch: 37784, Training Loss: 0.00732
Epoch: 37784, Training Loss: 0.00897
Epoch: 37785, Training Loss: 0.00703
Epoch: 37785, Training Loss: 0.00731
Epoch: 37785, Training Loss: 0.00732
Epoch: 37785, Training Loss: 0.00897
Epoch: 37786, Training Loss: 0.00703
Epoch: 37786, Training Loss: 0.00731
Epoch: 37786, Training Loss: 0.00732
Epoch: 37786, Training Loss: 0.00897
Epoch: 37787, Training Loss: 0.00703
Epoch: 37787, Training Loss: 0.00731
Epoch: 37787, Training Loss: 0.00732
Epoch: 37787, Training Loss: 0.00897
Epoch: 37788, Training Loss: 0.00703
Epoch: 37788, Training Loss: 0.00731
Epoch: 37788, Training Loss: 0.00732
Epoch: 37788, Training Loss: 0.00897
Epoch: 37789, Training Loss: 0.00703
Epoch: 37789, Training Loss: 0.00731
Epoch: 37789, Training Loss: 0.00732
Epoch: 37789, Training Loss: 0.00897
Epoch: 37790, Training Loss: 0.00703
Epoch: 37790, Training Loss: 0.00731
Epoch: 37790, Training Loss: 0.00732
Epoch: 37790, Training Loss: 0.00897
Epoch: 37791, Training Loss: 0.00703
Epoch: 37791, Training Loss: 0.00731
Epoch: 37791, Training Loss: 0.00731
Epoch: 37791, Training Loss: 0.00897
Epoch: 37792, Training Loss: 0.00703
Epoch: 37792, Training Loss: 0.00731
Epoch: 37792, Training Loss: 0.00731
Epoch: 37792, Training Loss: 0.00897
Epoch: 37793, Training Loss: 0.00703
Epoch: 37793, Training Loss: 0.00731
Epoch: 37793, Training Loss: 0.00731
Epoch: 37793, Training Loss: 0.00897
Epoch: 37794, Training Loss: 0.00703
Epoch: 37794, Training Loss: 0.00731
Epoch: 37794, Training Loss: 0.00731
Epoch: 37794, Training Loss: 0.00897
Epoch: 37795, Training Loss: 0.00703
Epoch: 37795, Training Loss: 0.00731
Epoch: 37795, Training Loss: 0.00731
Epoch: 37795, Training Loss: 0.00897
Epoch: 37796, Training Loss: 0.00703
Epoch: 37796, Training Loss: 0.00731
Epoch: 37796, Training Loss: 0.00731
Epoch: 37796, Training Loss: 0.00897
Epoch: 37797, Training Loss: 0.00703
Epoch: 37797, Training Loss: 0.00731
Epoch: 37797, Training Loss: 0.00731
Epoch: 37797, Training Loss: 0.00897
Epoch: 37798, Training Loss: 0.00703
Epoch: 37798, Training Loss: 0.00731
Epoch: 37798, Training Loss: 0.00731
Epoch: 37798, Training Loss: 0.00897
Epoch: 37799, Training Loss: 0.00703
Epoch: 37799, Training Loss: 0.00731
Epoch: 37799, Training Loss: 0.00731
Epoch: 37799, Training Loss: 0.00897
Epoch: 37800, Training Loss: 0.00703
Epoch: 37800, Training Loss: 0.00731
Epoch: 37800, Training Loss: 0.00731
Epoch: 37800, Training Loss: 0.00897
Epoch: 37801, Training Loss: 0.00703
Epoch: 37801, Training Loss: 0.00731
Epoch: 37801, Training Loss: 0.00731
Epoch: 37801, Training Loss: 0.00897
Epoch: 37802, Training Loss: 0.00703
Epoch: 37802, Training Loss: 0.00731
Epoch: 37802, Training Loss: 0.00731
Epoch: 37802, Training Loss: 0.00897
Epoch: 37803, Training Loss: 0.00703
Epoch: 37803, Training Loss: 0.00731
Epoch: 37803, Training Loss: 0.00731
Epoch: 37803, Training Loss: 0.00897
Epoch: 37804, Training Loss: 0.00703
Epoch: 37804, Training Loss: 0.00731
Epoch: 37804, Training Loss: 0.00731
Epoch: 37804, Training Loss: 0.00897
Epoch: 37805, Training Loss: 0.00703
Epoch: 37805, Training Loss: 0.00731
Epoch: 37805, Training Loss: 0.00731
Epoch: 37805, Training Loss: 0.00897
Epoch: 37806, Training Loss: 0.00703
Epoch: 37806, Training Loss: 0.00731
Epoch: 37806, Training Loss: 0.00731
Epoch: 37806, Training Loss: 0.00897
Epoch: 37807, Training Loss: 0.00703
Epoch: 37807, Training Loss: 0.00730
Epoch: 37807, Training Loss: 0.00731
Epoch: 37807, Training Loss: 0.00897
Epoch: 37808, Training Loss: 0.00703
Epoch: 37808, Training Loss: 0.00730
Epoch: 37808, Training Loss: 0.00731
Epoch: 37808, Training Loss: 0.00897
Epoch: 37809, Training Loss: 0.00703
Epoch: 37809, Training Loss: 0.00730
Epoch: 37809, Training Loss: 0.00731
Epoch: 37809, Training Loss: 0.00897
Epoch: 37810, Training Loss: 0.00703
Epoch: 37810, Training Loss: 0.00730
Epoch: 37810, Training Loss: 0.00731
Epoch: 37810, Training Loss: 0.00897
Epoch: 37811, Training Loss: 0.00703
Epoch: 37811, Training Loss: 0.00730
Epoch: 37811, Training Loss: 0.00731
Epoch: 37811, Training Loss: 0.00897
Epoch: 37812, Training Loss: 0.00703
Epoch: 37812, Training Loss: 0.00730
Epoch: 37812, Training Loss: 0.00731
Epoch: 37812, Training Loss: 0.00897
Epoch: 37813, Training Loss: 0.00703
Epoch: 37813, Training Loss: 0.00730
Epoch: 37813, Training Loss: 0.00731
Epoch: 37813, Training Loss: 0.00897
Epoch: 37814, Training Loss: 0.00703
Epoch: 37814, Training Loss: 0.00730
Epoch: 37814, Training Loss: 0.00731
Epoch: 37814, Training Loss: 0.00897
Epoch: 37815, Training Loss: 0.00703
Epoch: 37815, Training Loss: 0.00730
Epoch: 37815, Training Loss: 0.00731
Epoch: 37815, Training Loss: 0.00897
Epoch: 37816, Training Loss: 0.00703
Epoch: 37816, Training Loss: 0.00730
Epoch: 37816, Training Loss: 0.00731
Epoch: 37816, Training Loss: 0.00897
Epoch: 37817, Training Loss: 0.00703
Epoch: 37817, Training Loss: 0.00730
Epoch: 37817, Training Loss: 0.00731
Epoch: 37817, Training Loss: 0.00897
Epoch: 37818, Training Loss: 0.00703
Epoch: 37818, Training Loss: 0.00730
Epoch: 37818, Training Loss: 0.00731
Epoch: 37818, Training Loss: 0.00897
Epoch: 37819, Training Loss: 0.00703
Epoch: 37819, Training Loss: 0.00730
Epoch: 37819, Training Loss: 0.00731
Epoch: 37819, Training Loss: 0.00897
Epoch: 37820, Training Loss: 0.00703
Epoch: 37820, Training Loss: 0.00730
Epoch: 37820, Training Loss: 0.00731
Epoch: 37820, Training Loss: 0.00897
Epoch: 37821, Training Loss: 0.00703
Epoch: 37821, Training Loss: 0.00730
Epoch: 37821, Training Loss: 0.00731
Epoch: 37821, Training Loss: 0.00897
Epoch: 37822, Training Loss: 0.00703
Epoch: 37822, Training Loss: 0.00730
Epoch: 37822, Training Loss: 0.00731
Epoch: 37822, Training Loss: 0.00897
Epoch: 37823, Training Loss: 0.00703
Epoch: 37823, Training Loss: 0.00730
Epoch: 37823, Training Loss: 0.00731
Epoch: 37823, Training Loss: 0.00897
Epoch: 37824, Training Loss: 0.00703
Epoch: 37824, Training Loss: 0.00730
Epoch: 37824, Training Loss: 0.00731
Epoch: 37824, Training Loss: 0.00897
Epoch: 37825, Training Loss: 0.00703
Epoch: 37825, Training Loss: 0.00730
Epoch: 37825, Training Loss: 0.00731
Epoch: 37825, Training Loss: 0.00897
Epoch: 37826, Training Loss: 0.00703
Epoch: 37826, Training Loss: 0.00730
Epoch: 37826, Training Loss: 0.00731
Epoch: 37826, Training Loss: 0.00897
Epoch: 37827, Training Loss: 0.00703
Epoch: 37827, Training Loss: 0.00730
Epoch: 37827, Training Loss: 0.00731
Epoch: 37827, Training Loss: 0.00897
Epoch: 37828, Training Loss: 0.00703
Epoch: 37828, Training Loss: 0.00730
Epoch: 37828, Training Loss: 0.00731
Epoch: 37828, Training Loss: 0.00897
Epoch: 37829, Training Loss: 0.00703
Epoch: 37829, Training Loss: 0.00730
Epoch: 37829, Training Loss: 0.00731
Epoch: 37829, Training Loss: 0.00897
Epoch: 37830, Training Loss: 0.00703
Epoch: 37830, Training Loss: 0.00730
Epoch: 37830, Training Loss: 0.00731
Epoch: 37830, Training Loss: 0.00897
Epoch: 37831, Training Loss: 0.00703
Epoch: 37831, Training Loss: 0.00730
Epoch: 37831, Training Loss: 0.00731
Epoch: 37831, Training Loss: 0.00897
Epoch: 37832, Training Loss: 0.00703
Epoch: 37832, Training Loss: 0.00730
Epoch: 37832, Training Loss: 0.00731
Epoch: 37832, Training Loss: 0.00897
Epoch: 37833, Training Loss: 0.00703
Epoch: 37833, Training Loss: 0.00730
Epoch: 37833, Training Loss: 0.00731
Epoch: 37833, Training Loss: 0.00897
Epoch: 37834, Training Loss: 0.00703
Epoch: 37834, Training Loss: 0.00730
Epoch: 37834, Training Loss: 0.00731
Epoch: 37834, Training Loss: 0.00897
Epoch: 37835, Training Loss: 0.00703
Epoch: 37835, Training Loss: 0.00730
Epoch: 37835, Training Loss: 0.00731
Epoch: 37835, Training Loss: 0.00897
Epoch: 37836, Training Loss: 0.00703
Epoch: 37836, Training Loss: 0.00730
Epoch: 37836, Training Loss: 0.00731
Epoch: 37836, Training Loss: 0.00897
Epoch: 37837, Training Loss: 0.00703
Epoch: 37837, Training Loss: 0.00730
Epoch: 37837, Training Loss: 0.00731
Epoch: 37837, Training Loss: 0.00897
Epoch: 37838, Training Loss: 0.00703
Epoch: 37838, Training Loss: 0.00730
Epoch: 37838, Training Loss: 0.00731
Epoch: 37838, Training Loss: 0.00897
Epoch: 37839, Training Loss: 0.00703
Epoch: 37839, Training Loss: 0.00730
Epoch: 37839, Training Loss: 0.00731
Epoch: 37839, Training Loss: 0.00897
Epoch: 37840, Training Loss: 0.00703
Epoch: 37840, Training Loss: 0.00730
Epoch: 37840, Training Loss: 0.00731
Epoch: 37840, Training Loss: 0.00897
Epoch: 37841, Training Loss: 0.00703
Epoch: 37841, Training Loss: 0.00730
Epoch: 37841, Training Loss: 0.00731
Epoch: 37841, Training Loss: 0.00897
Epoch: 37842, Training Loss: 0.00703
Epoch: 37842, Training Loss: 0.00730
Epoch: 37842, Training Loss: 0.00731
Epoch: 37842, Training Loss: 0.00897
Epoch: 37843, Training Loss: 0.00702
Epoch: 37843, Training Loss: 0.00730
Epoch: 37843, Training Loss: 0.00731
Epoch: 37843, Training Loss: 0.00897
Epoch: 37844, Training Loss: 0.00702
Epoch: 37844, Training Loss: 0.00730
Epoch: 37844, Training Loss: 0.00731
Epoch: 37844, Training Loss: 0.00897
Epoch: 37845, Training Loss: 0.00702
Epoch: 37845, Training Loss: 0.00730
Epoch: 37845, Training Loss: 0.00731
Epoch: 37845, Training Loss: 0.00897
Epoch: 37846, Training Loss: 0.00702
Epoch: 37846, Training Loss: 0.00730
Epoch: 37846, Training Loss: 0.00731
Epoch: 37846, Training Loss: 0.00897
Epoch: 37847, Training Loss: 0.00702
Epoch: 37847, Training Loss: 0.00730
Epoch: 37847, Training Loss: 0.00731
Epoch: 37847, Training Loss: 0.00897
Epoch: 37848, Training Loss: 0.00702
Epoch: 37848, Training Loss: 0.00730
Epoch: 37848, Training Loss: 0.00731
Epoch: 37848, Training Loss: 0.00897
Epoch: 37849, Training Loss: 0.00702
Epoch: 37849, Training Loss: 0.00730
Epoch: 37849, Training Loss: 0.00731
Epoch: 37849, Training Loss: 0.00897
Epoch: 37850, Training Loss: 0.00702
Epoch: 37850, Training Loss: 0.00730
Epoch: 37850, Training Loss: 0.00731
Epoch: 37850, Training Loss: 0.00897
Epoch: 37851, Training Loss: 0.00702
Epoch: 37851, Training Loss: 0.00730
Epoch: 37851, Training Loss: 0.00731
Epoch: 37851, Training Loss: 0.00897
Epoch: 37852, Training Loss: 0.00702
Epoch: 37852, Training Loss: 0.00730
Epoch: 37852, Training Loss: 0.00731
Epoch: 37852, Training Loss: 0.00897
Epoch: 37853, Training Loss: 0.00702
Epoch: 37853, Training Loss: 0.00730
Epoch: 37853, Training Loss: 0.00731
Epoch: 37853, Training Loss: 0.00897
Epoch: 37854, Training Loss: 0.00702
Epoch: 37854, Training Loss: 0.00730
Epoch: 37854, Training Loss: 0.00731
Epoch: 37854, Training Loss: 0.00897
Epoch: 37855, Training Loss: 0.00702
Epoch: 37855, Training Loss: 0.00730
Epoch: 37855, Training Loss: 0.00731
Epoch: 37855, Training Loss: 0.00896
Epoch: 37856, Training Loss: 0.00702
Epoch: 37856, Training Loss: 0.00730
Epoch: 37856, Training Loss: 0.00731
Epoch: 37856, Training Loss: 0.00896
Epoch: 37857, Training Loss: 0.00702
Epoch: 37857, Training Loss: 0.00730
Epoch: 37857, Training Loss: 0.00731
Epoch: 37857, Training Loss: 0.00896
Epoch: 37858, Training Loss: 0.00702
Epoch: 37858, Training Loss: 0.00730
Epoch: 37858, Training Loss: 0.00731
Epoch: 37858, Training Loss: 0.00896
Epoch: 37859, Training Loss: 0.00702
Epoch: 37859, Training Loss: 0.00730
Epoch: 37859, Training Loss: 0.00731
Epoch: 37859, Training Loss: 0.00896
Epoch: 37860, Training Loss: 0.00702
Epoch: 37860, Training Loss: 0.00730
Epoch: 37860, Training Loss: 0.00731
Epoch: 37860, Training Loss: 0.00896
Epoch: 37861, Training Loss: 0.00702
Epoch: 37861, Training Loss: 0.00730
Epoch: 37861, Training Loss: 0.00731
Epoch: 37861, Training Loss: 0.00896
Epoch: 37862, Training Loss: 0.00702
Epoch: 37862, Training Loss: 0.00730
Epoch: 37862, Training Loss: 0.00731
Epoch: 37862, Training Loss: 0.00896
Epoch: 37863, Training Loss: 0.00702
Epoch: 37863, Training Loss: 0.00730
Epoch: 37863, Training Loss: 0.00731
Epoch: 37863, Training Loss: 0.00896
Epoch: 37864, Training Loss: 0.00702
Epoch: 37864, Training Loss: 0.00730
Epoch: 37864, Training Loss: 0.00731
Epoch: 37864, Training Loss: 0.00896
Epoch: 37865, Training Loss: 0.00702
Epoch: 37865, Training Loss: 0.00730
Epoch: 37865, Training Loss: 0.00731
Epoch: 37865, Training Loss: 0.00896
Epoch: 37866, Training Loss: 0.00702
Epoch: 37866, Training Loss: 0.00730
Epoch: 37866, Training Loss: 0.00731
Epoch: 37866, Training Loss: 0.00896
Epoch: 37867, Training Loss: 0.00702
Epoch: 37867, Training Loss: 0.00730
Epoch: 37867, Training Loss: 0.00731
Epoch: 37867, Training Loss: 0.00896
Epoch: 37868, Training Loss: 0.00702
Epoch: 37868, Training Loss: 0.00730
Epoch: 37868, Training Loss: 0.00731
Epoch: 37868, Training Loss: 0.00896
Epoch: 37869, Training Loss: 0.00702
Epoch: 37869, Training Loss: 0.00730
Epoch: 37869, Training Loss: 0.00731
Epoch: 37869, Training Loss: 0.00896
Epoch: 37870, Training Loss: 0.00702
Epoch: 37870, Training Loss: 0.00730
Epoch: 37870, Training Loss: 0.00731
Epoch: 37870, Training Loss: 0.00896
Epoch: 37871, Training Loss: 0.00702
Epoch: 37871, Training Loss: 0.00730
Epoch: 37871, Training Loss: 0.00731
Epoch: 37871, Training Loss: 0.00896
Epoch: 37872, Training Loss: 0.00702
Epoch: 37872, Training Loss: 0.00730
Epoch: 37872, Training Loss: 0.00731
Epoch: 37872, Training Loss: 0.00896
Epoch: 37873, Training Loss: 0.00702
Epoch: 37873, Training Loss: 0.00730
Epoch: 37873, Training Loss: 0.00731
Epoch: 37873, Training Loss: 0.00896
Epoch: 37874, Training Loss: 0.00702
Epoch: 37874, Training Loss: 0.00730
Epoch: 37874, Training Loss: 0.00731
Epoch: 37874, Training Loss: 0.00896
Epoch: 37875, Training Loss: 0.00702
Epoch: 37875, Training Loss: 0.00730
Epoch: 37875, Training Loss: 0.00731
Epoch: 37875, Training Loss: 0.00896
Epoch: 37876, Training Loss: 0.00702
Epoch: 37876, Training Loss: 0.00730
Epoch: 37876, Training Loss: 0.00731
Epoch: 37876, Training Loss: 0.00896
Epoch: 37877, Training Loss: 0.00702
Epoch: 37877, Training Loss: 0.00730
Epoch: 37877, Training Loss: 0.00731
Epoch: 37877, Training Loss: 0.00896
Epoch: 37878, Training Loss: 0.00702
Epoch: 37878, Training Loss: 0.00730
Epoch: 37878, Training Loss: 0.00731
Epoch: 37878, Training Loss: 0.00896
Epoch: 37879, Training Loss: 0.00702
Epoch: 37879, Training Loss: 0.00730
Epoch: 37879, Training Loss: 0.00731
Epoch: 37879, Training Loss: 0.00896
Epoch: 37880, Training Loss: 0.00702
Epoch: 37880, Training Loss: 0.00730
Epoch: 37880, Training Loss: 0.00731
Epoch: 37880, Training Loss: 0.00896
Epoch: 37881, Training Loss: 0.00702
Epoch: 37881, Training Loss: 0.00730
Epoch: 37881, Training Loss: 0.00731
Epoch: 37881, Training Loss: 0.00896
Epoch: 37882, Training Loss: 0.00702
Epoch: 37882, Training Loss: 0.00730
Epoch: 37882, Training Loss: 0.00731
Epoch: 37882, Training Loss: 0.00896
Epoch: 37883, Training Loss: 0.00702
Epoch: 37883, Training Loss: 0.00730
Epoch: 37883, Training Loss: 0.00731
Epoch: 37883, Training Loss: 0.00896
Epoch: 37884, Training Loss: 0.00702
Epoch: 37884, Training Loss: 0.00730
Epoch: 37884, Training Loss: 0.00731
Epoch: 37884, Training Loss: 0.00896
Epoch: 37885, Training Loss: 0.00702
Epoch: 37885, Training Loss: 0.00730
Epoch: 37885, Training Loss: 0.00730
Epoch: 37885, Training Loss: 0.00896
Epoch: 37886, Training Loss: 0.00702
Epoch: 37886, Training Loss: 0.00730
Epoch: 37886, Training Loss: 0.00730
Epoch: 37886, Training Loss: 0.00896
Epoch: 37887, Training Loss: 0.00702
Epoch: 37887, Training Loss: 0.00730
Epoch: 37887, Training Loss: 0.00730
Epoch: 37887, Training Loss: 0.00896
Epoch: 37888, Training Loss: 0.00702
Epoch: 37888, Training Loss: 0.00730
Epoch: 37888, Training Loss: 0.00730
Epoch: 37888, Training Loss: 0.00896
Epoch: 37889, Training Loss: 0.00702
Epoch: 37889, Training Loss: 0.00730
Epoch: 37889, Training Loss: 0.00730
Epoch: 37889, Training Loss: 0.00896
Epoch: 37890, Training Loss: 0.00702
Epoch: 37890, Training Loss: 0.00730
Epoch: 37890, Training Loss: 0.00730
Epoch: 37890, Training Loss: 0.00896
Epoch: 37891, Training Loss: 0.00702
Epoch: 37891, Training Loss: 0.00730
Epoch: 37891, Training Loss: 0.00730
[... training log truncated: over epochs 37891–38135 the four per-sample losses decrease very slowly, from roughly 0.00702, 0.00730, 0.00730, 0.00896 to 0.00700, 0.00727, 0.00728, 0.00893 ...]
Epoch: 38135, Training Loss: 0.00728
Epoch: 38135, Training Loss: 0.00893
Epoch: 38136, Training Loss: 0.00700
Epoch: 38136, Training Loss: 0.00727
Epoch: 38136, Training Loss: 0.00728
Epoch: 38136, Training Loss: 0.00893
Epoch: 38137, Training Loss: 0.00700
Epoch: 38137, Training Loss: 0.00727
Epoch: 38137, Training Loss: 0.00728
Epoch: 38137, Training Loss: 0.00893
Epoch: 38138, Training Loss: 0.00700
Epoch: 38138, Training Loss: 0.00727
Epoch: 38138, Training Loss: 0.00728
Epoch: 38138, Training Loss: 0.00893
Epoch: 38139, Training Loss: 0.00700
Epoch: 38139, Training Loss: 0.00727
Epoch: 38139, Training Loss: 0.00728
Epoch: 38139, Training Loss: 0.00893
Epoch: 38140, Training Loss: 0.00700
Epoch: 38140, Training Loss: 0.00727
Epoch: 38140, Training Loss: 0.00728
Epoch: 38140, Training Loss: 0.00893
Epoch: 38141, Training Loss: 0.00700
Epoch: 38141, Training Loss: 0.00727
Epoch: 38141, Training Loss: 0.00728
Epoch: 38141, Training Loss: 0.00893
Epoch: 38142, Training Loss: 0.00700
Epoch: 38142, Training Loss: 0.00727
Epoch: 38142, Training Loss: 0.00728
Epoch: 38142, Training Loss: 0.00893
Epoch: 38143, Training Loss: 0.00700
Epoch: 38143, Training Loss: 0.00727
Epoch: 38143, Training Loss: 0.00728
Epoch: 38143, Training Loss: 0.00893
Epoch: 38144, Training Loss: 0.00699
Epoch: 38144, Training Loss: 0.00727
Epoch: 38144, Training Loss: 0.00728
Epoch: 38144, Training Loss: 0.00893
Epoch: 38145, Training Loss: 0.00699
Epoch: 38145, Training Loss: 0.00727
Epoch: 38145, Training Loss: 0.00728
Epoch: 38145, Training Loss: 0.00893
Epoch: 38146, Training Loss: 0.00699
Epoch: 38146, Training Loss: 0.00727
Epoch: 38146, Training Loss: 0.00728
Epoch: 38146, Training Loss: 0.00893
Epoch: 38147, Training Loss: 0.00699
Epoch: 38147, Training Loss: 0.00727
Epoch: 38147, Training Loss: 0.00728
Epoch: 38147, Training Loss: 0.00893
Epoch: 38148, Training Loss: 0.00699
Epoch: 38148, Training Loss: 0.00727
Epoch: 38148, Training Loss: 0.00728
Epoch: 38148, Training Loss: 0.00893
Epoch: 38149, Training Loss: 0.00699
Epoch: 38149, Training Loss: 0.00727
Epoch: 38149, Training Loss: 0.00728
Epoch: 38149, Training Loss: 0.00893
Epoch: 38150, Training Loss: 0.00699
Epoch: 38150, Training Loss: 0.00727
Epoch: 38150, Training Loss: 0.00728
Epoch: 38150, Training Loss: 0.00893
Epoch: 38151, Training Loss: 0.00699
Epoch: 38151, Training Loss: 0.00727
Epoch: 38151, Training Loss: 0.00728
Epoch: 38151, Training Loss: 0.00893
Epoch: 38152, Training Loss: 0.00699
Epoch: 38152, Training Loss: 0.00727
Epoch: 38152, Training Loss: 0.00728
Epoch: 38152, Training Loss: 0.00893
Epoch: 38153, Training Loss: 0.00699
Epoch: 38153, Training Loss: 0.00727
Epoch: 38153, Training Loss: 0.00728
Epoch: 38153, Training Loss: 0.00893
Epoch: 38154, Training Loss: 0.00699
Epoch: 38154, Training Loss: 0.00727
Epoch: 38154, Training Loss: 0.00728
Epoch: 38154, Training Loss: 0.00893
Epoch: 38155, Training Loss: 0.00699
Epoch: 38155, Training Loss: 0.00727
Epoch: 38155, Training Loss: 0.00728
Epoch: 38155, Training Loss: 0.00893
Epoch: 38156, Training Loss: 0.00699
Epoch: 38156, Training Loss: 0.00727
Epoch: 38156, Training Loss: 0.00728
Epoch: 38156, Training Loss: 0.00893
Epoch: 38157, Training Loss: 0.00699
Epoch: 38157, Training Loss: 0.00727
Epoch: 38157, Training Loss: 0.00728
Epoch: 38157, Training Loss: 0.00893
Epoch: 38158, Training Loss: 0.00699
Epoch: 38158, Training Loss: 0.00727
Epoch: 38158, Training Loss: 0.00728
Epoch: 38158, Training Loss: 0.00893
Epoch: 38159, Training Loss: 0.00699
Epoch: 38159, Training Loss: 0.00727
Epoch: 38159, Training Loss: 0.00728
Epoch: 38159, Training Loss: 0.00893
Epoch: 38160, Training Loss: 0.00699
Epoch: 38160, Training Loss: 0.00727
Epoch: 38160, Training Loss: 0.00728
Epoch: 38160, Training Loss: 0.00892
Epoch: 38161, Training Loss: 0.00699
Epoch: 38161, Training Loss: 0.00727
Epoch: 38161, Training Loss: 0.00728
Epoch: 38161, Training Loss: 0.00892
Epoch: 38162, Training Loss: 0.00699
Epoch: 38162, Training Loss: 0.00727
Epoch: 38162, Training Loss: 0.00728
Epoch: 38162, Training Loss: 0.00892
Epoch: 38163, Training Loss: 0.00699
Epoch: 38163, Training Loss: 0.00727
Epoch: 38163, Training Loss: 0.00728
Epoch: 38163, Training Loss: 0.00892
Epoch: 38164, Training Loss: 0.00699
Epoch: 38164, Training Loss: 0.00727
Epoch: 38164, Training Loss: 0.00728
Epoch: 38164, Training Loss: 0.00892
Epoch: 38165, Training Loss: 0.00699
Epoch: 38165, Training Loss: 0.00727
Epoch: 38165, Training Loss: 0.00728
Epoch: 38165, Training Loss: 0.00892
Epoch: 38166, Training Loss: 0.00699
Epoch: 38166, Training Loss: 0.00727
Epoch: 38166, Training Loss: 0.00728
Epoch: 38166, Training Loss: 0.00892
Epoch: 38167, Training Loss: 0.00699
Epoch: 38167, Training Loss: 0.00727
Epoch: 38167, Training Loss: 0.00728
Epoch: 38167, Training Loss: 0.00892
Epoch: 38168, Training Loss: 0.00699
Epoch: 38168, Training Loss: 0.00727
Epoch: 38168, Training Loss: 0.00727
Epoch: 38168, Training Loss: 0.00892
Epoch: 38169, Training Loss: 0.00699
Epoch: 38169, Training Loss: 0.00727
Epoch: 38169, Training Loss: 0.00727
Epoch: 38169, Training Loss: 0.00892
Epoch: 38170, Training Loss: 0.00699
Epoch: 38170, Training Loss: 0.00727
Epoch: 38170, Training Loss: 0.00727
Epoch: 38170, Training Loss: 0.00892
Epoch: 38171, Training Loss: 0.00699
Epoch: 38171, Training Loss: 0.00727
Epoch: 38171, Training Loss: 0.00727
Epoch: 38171, Training Loss: 0.00892
Epoch: 38172, Training Loss: 0.00699
Epoch: 38172, Training Loss: 0.00727
Epoch: 38172, Training Loss: 0.00727
Epoch: 38172, Training Loss: 0.00892
Epoch: 38173, Training Loss: 0.00699
Epoch: 38173, Training Loss: 0.00727
Epoch: 38173, Training Loss: 0.00727
Epoch: 38173, Training Loss: 0.00892
Epoch: 38174, Training Loss: 0.00699
Epoch: 38174, Training Loss: 0.00727
Epoch: 38174, Training Loss: 0.00727
Epoch: 38174, Training Loss: 0.00892
Epoch: 38175, Training Loss: 0.00699
Epoch: 38175, Training Loss: 0.00727
Epoch: 38175, Training Loss: 0.00727
Epoch: 38175, Training Loss: 0.00892
Epoch: 38176, Training Loss: 0.00699
Epoch: 38176, Training Loss: 0.00727
Epoch: 38176, Training Loss: 0.00727
Epoch: 38176, Training Loss: 0.00892
Epoch: 38177, Training Loss: 0.00699
Epoch: 38177, Training Loss: 0.00727
Epoch: 38177, Training Loss: 0.00727
Epoch: 38177, Training Loss: 0.00892
Epoch: 38178, Training Loss: 0.00699
Epoch: 38178, Training Loss: 0.00727
Epoch: 38178, Training Loss: 0.00727
Epoch: 38178, Training Loss: 0.00892
Epoch: 38179, Training Loss: 0.00699
Epoch: 38179, Training Loss: 0.00727
Epoch: 38179, Training Loss: 0.00727
Epoch: 38179, Training Loss: 0.00892
Epoch: 38180, Training Loss: 0.00699
Epoch: 38180, Training Loss: 0.00727
Epoch: 38180, Training Loss: 0.00727
Epoch: 38180, Training Loss: 0.00892
Epoch: 38181, Training Loss: 0.00699
Epoch: 38181, Training Loss: 0.00727
Epoch: 38181, Training Loss: 0.00727
Epoch: 38181, Training Loss: 0.00892
Epoch: 38182, Training Loss: 0.00699
Epoch: 38182, Training Loss: 0.00727
Epoch: 38182, Training Loss: 0.00727
Epoch: 38182, Training Loss: 0.00892
Epoch: 38183, Training Loss: 0.00699
Epoch: 38183, Training Loss: 0.00727
Epoch: 38183, Training Loss: 0.00727
Epoch: 38183, Training Loss: 0.00892
Epoch: 38184, Training Loss: 0.00699
Epoch: 38184, Training Loss: 0.00727
Epoch: 38184, Training Loss: 0.00727
Epoch: 38184, Training Loss: 0.00892
Epoch: 38185, Training Loss: 0.00699
Epoch: 38185, Training Loss: 0.00726
Epoch: 38185, Training Loss: 0.00727
Epoch: 38185, Training Loss: 0.00892
Epoch: 38186, Training Loss: 0.00699
Epoch: 38186, Training Loss: 0.00726
Epoch: 38186, Training Loss: 0.00727
Epoch: 38186, Training Loss: 0.00892
Epoch: 38187, Training Loss: 0.00699
Epoch: 38187, Training Loss: 0.00726
Epoch: 38187, Training Loss: 0.00727
Epoch: 38187, Training Loss: 0.00892
Epoch: 38188, Training Loss: 0.00699
Epoch: 38188, Training Loss: 0.00726
Epoch: 38188, Training Loss: 0.00727
Epoch: 38188, Training Loss: 0.00892
Epoch: 38189, Training Loss: 0.00699
Epoch: 38189, Training Loss: 0.00726
Epoch: 38189, Training Loss: 0.00727
Epoch: 38189, Training Loss: 0.00892
Epoch: 38190, Training Loss: 0.00699
Epoch: 38190, Training Loss: 0.00726
Epoch: 38190, Training Loss: 0.00727
Epoch: 38190, Training Loss: 0.00892
Epoch: 38191, Training Loss: 0.00699
Epoch: 38191, Training Loss: 0.00726
Epoch: 38191, Training Loss: 0.00727
Epoch: 38191, Training Loss: 0.00892
Epoch: 38192, Training Loss: 0.00699
Epoch: 38192, Training Loss: 0.00726
Epoch: 38192, Training Loss: 0.00727
Epoch: 38192, Training Loss: 0.00892
Epoch: 38193, Training Loss: 0.00699
Epoch: 38193, Training Loss: 0.00726
Epoch: 38193, Training Loss: 0.00727
Epoch: 38193, Training Loss: 0.00892
Epoch: 38194, Training Loss: 0.00699
Epoch: 38194, Training Loss: 0.00726
Epoch: 38194, Training Loss: 0.00727
Epoch: 38194, Training Loss: 0.00892
Epoch: 38195, Training Loss: 0.00699
Epoch: 38195, Training Loss: 0.00726
Epoch: 38195, Training Loss: 0.00727
Epoch: 38195, Training Loss: 0.00892
Epoch: 38196, Training Loss: 0.00699
Epoch: 38196, Training Loss: 0.00726
Epoch: 38196, Training Loss: 0.00727
Epoch: 38196, Training Loss: 0.00892
Epoch: 38197, Training Loss: 0.00699
Epoch: 38197, Training Loss: 0.00726
Epoch: 38197, Training Loss: 0.00727
Epoch: 38197, Training Loss: 0.00892
Epoch: 38198, Training Loss: 0.00699
Epoch: 38198, Training Loss: 0.00726
Epoch: 38198, Training Loss: 0.00727
Epoch: 38198, Training Loss: 0.00892
Epoch: 38199, Training Loss: 0.00699
Epoch: 38199, Training Loss: 0.00726
Epoch: 38199, Training Loss: 0.00727
Epoch: 38199, Training Loss: 0.00892
Epoch: 38200, Training Loss: 0.00699
Epoch: 38200, Training Loss: 0.00726
Epoch: 38200, Training Loss: 0.00727
Epoch: 38200, Training Loss: 0.00892
Epoch: 38201, Training Loss: 0.00699
Epoch: 38201, Training Loss: 0.00726
Epoch: 38201, Training Loss: 0.00727
Epoch: 38201, Training Loss: 0.00892
Epoch: 38202, Training Loss: 0.00699
Epoch: 38202, Training Loss: 0.00726
Epoch: 38202, Training Loss: 0.00727
Epoch: 38202, Training Loss: 0.00892
Epoch: 38203, Training Loss: 0.00699
Epoch: 38203, Training Loss: 0.00726
Epoch: 38203, Training Loss: 0.00727
Epoch: 38203, Training Loss: 0.00892
Epoch: 38204, Training Loss: 0.00699
Epoch: 38204, Training Loss: 0.00726
Epoch: 38204, Training Loss: 0.00727
Epoch: 38204, Training Loss: 0.00892
Epoch: 38205, Training Loss: 0.00699
Epoch: 38205, Training Loss: 0.00726
Epoch: 38205, Training Loss: 0.00727
Epoch: 38205, Training Loss: 0.00892
Epoch: 38206, Training Loss: 0.00699
Epoch: 38206, Training Loss: 0.00726
Epoch: 38206, Training Loss: 0.00727
Epoch: 38206, Training Loss: 0.00892
Epoch: 38207, Training Loss: 0.00699
Epoch: 38207, Training Loss: 0.00726
Epoch: 38207, Training Loss: 0.00727
Epoch: 38207, Training Loss: 0.00892
Epoch: 38208, Training Loss: 0.00699
Epoch: 38208, Training Loss: 0.00726
Epoch: 38208, Training Loss: 0.00727
Epoch: 38208, Training Loss: 0.00892
Epoch: 38209, Training Loss: 0.00699
Epoch: 38209, Training Loss: 0.00726
Epoch: 38209, Training Loss: 0.00727
Epoch: 38209, Training Loss: 0.00892
Epoch: 38210, Training Loss: 0.00699
Epoch: 38210, Training Loss: 0.00726
Epoch: 38210, Training Loss: 0.00727
Epoch: 38210, Training Loss: 0.00892
Epoch: 38211, Training Loss: 0.00699
Epoch: 38211, Training Loss: 0.00726
Epoch: 38211, Training Loss: 0.00727
Epoch: 38211, Training Loss: 0.00892
Epoch: 38212, Training Loss: 0.00699
Epoch: 38212, Training Loss: 0.00726
Epoch: 38212, Training Loss: 0.00727
Epoch: 38212, Training Loss: 0.00892
Epoch: 38213, Training Loss: 0.00699
Epoch: 38213, Training Loss: 0.00726
Epoch: 38213, Training Loss: 0.00727
Epoch: 38213, Training Loss: 0.00892
Epoch: 38214, Training Loss: 0.00699
Epoch: 38214, Training Loss: 0.00726
Epoch: 38214, Training Loss: 0.00727
Epoch: 38214, Training Loss: 0.00892
Epoch: 38215, Training Loss: 0.00699
Epoch: 38215, Training Loss: 0.00726
Epoch: 38215, Training Loss: 0.00727
Epoch: 38215, Training Loss: 0.00892
Epoch: 38216, Training Loss: 0.00699
Epoch: 38216, Training Loss: 0.00726
Epoch: 38216, Training Loss: 0.00727
Epoch: 38216, Training Loss: 0.00892
Epoch: 38217, Training Loss: 0.00699
Epoch: 38217, Training Loss: 0.00726
Epoch: 38217, Training Loss: 0.00727
Epoch: 38217, Training Loss: 0.00892
Epoch: 38218, Training Loss: 0.00699
Epoch: 38218, Training Loss: 0.00726
Epoch: 38218, Training Loss: 0.00727
Epoch: 38218, Training Loss: 0.00892
Epoch: 38219, Training Loss: 0.00699
Epoch: 38219, Training Loss: 0.00726
Epoch: 38219, Training Loss: 0.00727
Epoch: 38219, Training Loss: 0.00892
Epoch: 38220, Training Loss: 0.00699
Epoch: 38220, Training Loss: 0.00726
Epoch: 38220, Training Loss: 0.00727
Epoch: 38220, Training Loss: 0.00892
Epoch: 38221, Training Loss: 0.00699
Epoch: 38221, Training Loss: 0.00726
Epoch: 38221, Training Loss: 0.00727
Epoch: 38221, Training Loss: 0.00892
Epoch: 38222, Training Loss: 0.00699
Epoch: 38222, Training Loss: 0.00726
Epoch: 38222, Training Loss: 0.00727
Epoch: 38222, Training Loss: 0.00892
Epoch: 38223, Training Loss: 0.00699
Epoch: 38223, Training Loss: 0.00726
Epoch: 38223, Training Loss: 0.00727
Epoch: 38223, Training Loss: 0.00892
Epoch: 38224, Training Loss: 0.00699
Epoch: 38224, Training Loss: 0.00726
Epoch: 38224, Training Loss: 0.00727
Epoch: 38224, Training Loss: 0.00892
Epoch: 38225, Training Loss: 0.00699
Epoch: 38225, Training Loss: 0.00726
Epoch: 38225, Training Loss: 0.00727
Epoch: 38225, Training Loss: 0.00892
Epoch: 38226, Training Loss: 0.00699
Epoch: 38226, Training Loss: 0.00726
Epoch: 38226, Training Loss: 0.00727
Epoch: 38226, Training Loss: 0.00892
Epoch: 38227, Training Loss: 0.00699
Epoch: 38227, Training Loss: 0.00726
Epoch: 38227, Training Loss: 0.00727
Epoch: 38227, Training Loss: 0.00892
Epoch: 38228, Training Loss: 0.00699
Epoch: 38228, Training Loss: 0.00726
Epoch: 38228, Training Loss: 0.00727
Epoch: 38228, Training Loss: 0.00892
Epoch: 38229, Training Loss: 0.00699
Epoch: 38229, Training Loss: 0.00726
Epoch: 38229, Training Loss: 0.00727
Epoch: 38229, Training Loss: 0.00892
Epoch: 38230, Training Loss: 0.00699
Epoch: 38230, Training Loss: 0.00726
Epoch: 38230, Training Loss: 0.00727
Epoch: 38230, Training Loss: 0.00892
Epoch: 38231, Training Loss: 0.00699
Epoch: 38231, Training Loss: 0.00726
Epoch: 38231, Training Loss: 0.00727
Epoch: 38231, Training Loss: 0.00892
Epoch: 38232, Training Loss: 0.00699
Epoch: 38232, Training Loss: 0.00726
Epoch: 38232, Training Loss: 0.00727
Epoch: 38232, Training Loss: 0.00892
Epoch: 38233, Training Loss: 0.00699
Epoch: 38233, Training Loss: 0.00726
Epoch: 38233, Training Loss: 0.00727
Epoch: 38233, Training Loss: 0.00892
Epoch: 38234, Training Loss: 0.00699
Epoch: 38234, Training Loss: 0.00726
Epoch: 38234, Training Loss: 0.00727
Epoch: 38234, Training Loss: 0.00892
Epoch: 38235, Training Loss: 0.00699
Epoch: 38235, Training Loss: 0.00726
Epoch: 38235, Training Loss: 0.00727
Epoch: 38235, Training Loss: 0.00892
Epoch: 38236, Training Loss: 0.00699
Epoch: 38236, Training Loss: 0.00726
Epoch: 38236, Training Loss: 0.00727
Epoch: 38236, Training Loss: 0.00892
Epoch: 38237, Training Loss: 0.00699
Epoch: 38237, Training Loss: 0.00726
Epoch: 38237, Training Loss: 0.00727
Epoch: 38237, Training Loss: 0.00891
Epoch: 38238, Training Loss: 0.00699
Epoch: 38238, Training Loss: 0.00726
Epoch: 38238, Training Loss: 0.00727
Epoch: 38238, Training Loss: 0.00891
Epoch: 38239, Training Loss: 0.00699
Epoch: 38239, Training Loss: 0.00726
Epoch: 38239, Training Loss: 0.00727
Epoch: 38239, Training Loss: 0.00891
Epoch: 38240, Training Loss: 0.00699
Epoch: 38240, Training Loss: 0.00726
Epoch: 38240, Training Loss: 0.00727
Epoch: 38240, Training Loss: 0.00891
Epoch: 38241, Training Loss: 0.00699
Epoch: 38241, Training Loss: 0.00726
Epoch: 38241, Training Loss: 0.00727
Epoch: 38241, Training Loss: 0.00891
Epoch: 38242, Training Loss: 0.00699
Epoch: 38242, Training Loss: 0.00726
Epoch: 38242, Training Loss: 0.00727
Epoch: 38242, Training Loss: 0.00891
Epoch: 38243, Training Loss: 0.00699
Epoch: 38243, Training Loss: 0.00726
Epoch: 38243, Training Loss: 0.00727
Epoch: 38243, Training Loss: 0.00891
Epoch: 38244, Training Loss: 0.00699
Epoch: 38244, Training Loss: 0.00726
Epoch: 38244, Training Loss: 0.00727
Epoch: 38244, Training Loss: 0.00891
Epoch: 38245, Training Loss: 0.00698
Epoch: 38245, Training Loss: 0.00726
Epoch: 38245, Training Loss: 0.00727
Epoch: 38245, Training Loss: 0.00891
Epoch: 38246, Training Loss: 0.00698
Epoch: 38246, Training Loss: 0.00726
Epoch: 38246, Training Loss: 0.00727
Epoch: 38246, Training Loss: 0.00891
Epoch: 38247, Training Loss: 0.00698
Epoch: 38247, Training Loss: 0.00726
Epoch: 38247, Training Loss: 0.00727
Epoch: 38247, Training Loss: 0.00891
Epoch: 38248, Training Loss: 0.00698
Epoch: 38248, Training Loss: 0.00726
Epoch: 38248, Training Loss: 0.00727
Epoch: 38248, Training Loss: 0.00891
Epoch: 38249, Training Loss: 0.00698
Epoch: 38249, Training Loss: 0.00726
Epoch: 38249, Training Loss: 0.00727
Epoch: 38249, Training Loss: 0.00891
Epoch: 38250, Training Loss: 0.00698
Epoch: 38250, Training Loss: 0.00726
Epoch: 38250, Training Loss: 0.00727
Epoch: 38250, Training Loss: 0.00891
Epoch: 38251, Training Loss: 0.00698
Epoch: 38251, Training Loss: 0.00726
Epoch: 38251, Training Loss: 0.00727
Epoch: 38251, Training Loss: 0.00891
Epoch: 38252, Training Loss: 0.00698
Epoch: 38252, Training Loss: 0.00726
Epoch: 38252, Training Loss: 0.00727
Epoch: 38252, Training Loss: 0.00891
Epoch: 38253, Training Loss: 0.00698
Epoch: 38253, Training Loss: 0.00726
Epoch: 38253, Training Loss: 0.00727
Epoch: 38253, Training Loss: 0.00891
Epoch: 38254, Training Loss: 0.00698
Epoch: 38254, Training Loss: 0.00726
Epoch: 38254, Training Loss: 0.00727
Epoch: 38254, Training Loss: 0.00891
Epoch: 38255, Training Loss: 0.00698
Epoch: 38255, Training Loss: 0.00726
Epoch: 38255, Training Loss: 0.00727
Epoch: 38255, Training Loss: 0.00891
Epoch: 38256, Training Loss: 0.00698
Epoch: 38256, Training Loss: 0.00726
Epoch: 38256, Training Loss: 0.00727
Epoch: 38256, Training Loss: 0.00891
Epoch: 38257, Training Loss: 0.00698
Epoch: 38257, Training Loss: 0.00726
Epoch: 38257, Training Loss: 0.00727
Epoch: 38257, Training Loss: 0.00891
Epoch: 38258, Training Loss: 0.00698
Epoch: 38258, Training Loss: 0.00726
Epoch: 38258, Training Loss: 0.00727
Epoch: 38258, Training Loss: 0.00891
Epoch: 38259, Training Loss: 0.00698
Epoch: 38259, Training Loss: 0.00726
Epoch: 38259, Training Loss: 0.00727
Epoch: 38259, Training Loss: 0.00891
Epoch: 38260, Training Loss: 0.00698
Epoch: 38260, Training Loss: 0.00726
Epoch: 38260, Training Loss: 0.00727
Epoch: 38260, Training Loss: 0.00891
Epoch: 38261, Training Loss: 0.00698
Epoch: 38261, Training Loss: 0.00726
Epoch: 38261, Training Loss: 0.00727
Epoch: 38261, Training Loss: 0.00891
Epoch: 38262, Training Loss: 0.00698
Epoch: 38262, Training Loss: 0.00726
Epoch: 38262, Training Loss: 0.00727
Epoch: 38262, Training Loss: 0.00891
Epoch: 38263, Training Loss: 0.00698
Epoch: 38263, Training Loss: 0.00726
Epoch: 38263, Training Loss: 0.00727
Epoch: 38263, Training Loss: 0.00891
Epoch: 38264, Training Loss: 0.00698
Epoch: 38264, Training Loss: 0.00726
Epoch: 38264, Training Loss: 0.00726
Epoch: 38264, Training Loss: 0.00891
Epoch: 38265, Training Loss: 0.00698
Epoch: 38265, Training Loss: 0.00726
Epoch: 38265, Training Loss: 0.00726
Epoch: 38265, Training Loss: 0.00891
Epoch: 38266, Training Loss: 0.00698
Epoch: 38266, Training Loss: 0.00726
Epoch: 38266, Training Loss: 0.00726
Epoch: 38266, Training Loss: 0.00891
Epoch: 38267, Training Loss: 0.00698
Epoch: 38267, Training Loss: 0.00726
Epoch: 38267, Training Loss: 0.00726
Epoch: 38267, Training Loss: 0.00891
Epoch: 38268, Training Loss: 0.00698
Epoch: 38268, Training Loss: 0.00726
Epoch: 38268, Training Loss: 0.00726
Epoch: 38268, Training Loss: 0.00891
Epoch: 38269, Training Loss: 0.00698
Epoch: 38269, Training Loss: 0.00726
Epoch: 38269, Training Loss: 0.00726
Epoch: 38269, Training Loss: 0.00891
Epoch: 38270, Training Loss: 0.00698
Epoch: 38270, Training Loss: 0.00726
Epoch: 38270, Training Loss: 0.00726
Epoch: 38270, Training Loss: 0.00891
Epoch: 38271, Training Loss: 0.00698
Epoch: 38271, Training Loss: 0.00726
Epoch: 38271, Training Loss: 0.00726
Epoch: 38271, Training Loss: 0.00891
Epoch: 38272, Training Loss: 0.00698
Epoch: 38272, Training Loss: 0.00726
Epoch: 38272, Training Loss: 0.00726
Epoch: 38272, Training Loss: 0.00891
Epoch: 38273, Training Loss: 0.00698
Epoch: 38273, Training Loss: 0.00726
Epoch: 38273, Training Loss: 0.00726
Epoch: 38273, Training Loss: 0.00891
Epoch: 38274, Training Loss: 0.00698
Epoch: 38274, Training Loss: 0.00726
Epoch: 38274, Training Loss: 0.00726
Epoch: 38274, Training Loss: 0.00891
Epoch: 38275, Training Loss: 0.00698
Epoch: 38275, Training Loss: 0.00726
Epoch: 38275, Training Loss: 0.00726
Epoch: 38275, Training Loss: 0.00891
Epoch: 38276, Training Loss: 0.00698
Epoch: 38276, Training Loss: 0.00726
Epoch: 38276, Training Loss: 0.00726
Epoch: 38276, Training Loss: 0.00891
Epoch: 38277, Training Loss: 0.00698
Epoch: 38277, Training Loss: 0.00726
Epoch: 38277, Training Loss: 0.00726
Epoch: 38277, Training Loss: 0.00891
Epoch: 38278, Training Loss: 0.00698
Epoch: 38278, Training Loss: 0.00726
Epoch: 38278, Training Loss: 0.00726
Epoch: 38278, Training Loss: 0.00891
Epoch: 38279, Training Loss: 0.00698
Epoch: 38279, Training Loss: 0.00726
Epoch: 38279, Training Loss: 0.00726
Epoch: 38279, Training Loss: 0.00891
Epoch: 38280, Training Loss: 0.00698
Epoch: 38280, Training Loss: 0.00726
Epoch: 38280, Training Loss: 0.00726
Epoch: 38280, Training Loss: 0.00891
Epoch: 38281, Training Loss: 0.00698
Epoch: 38281, Training Loss: 0.00725
Epoch: 38281, Training Loss: 0.00726
Epoch: 38281, Training Loss: 0.00891
Epoch: 38282, Training Loss: 0.00698
Epoch: 38282, Training Loss: 0.00725
Epoch: 38282, Training Loss: 0.00726
Epoch: 38282, Training Loss: 0.00891
Epoch: 38283, Training Loss: 0.00698
Epoch: 38283, Training Loss: 0.00725
Epoch: 38283, Training Loss: 0.00726
Epoch: 38283, Training Loss: 0.00891
Epoch: 38284, Training Loss: 0.00698
Epoch: 38284, Training Loss: 0.00725
Epoch: 38284, Training Loss: 0.00726
Epoch: 38284, Training Loss: 0.00891
Epoch: 38285, Training Loss: 0.00698
Epoch: 38285, Training Loss: 0.00725
Epoch: 38285, Training Loss: 0.00726
Epoch: 38285, Training Loss: 0.00891
Epoch: 38286, Training Loss: 0.00698
Epoch: 38286, Training Loss: 0.00725
Epoch: 38286, Training Loss: 0.00726
Epoch: 38286, Training Loss: 0.00891
Epoch: 38287, Training Loss: 0.00698
Epoch: 38287, Training Loss: 0.00725
Epoch: 38287, Training Loss: 0.00726
Epoch: 38287, Training Loss: 0.00891
Epoch: 38288, Training Loss: 0.00698
Epoch: 38288, Training Loss: 0.00725
Epoch: 38288, Training Loss: 0.00726
Epoch: 38288, Training Loss: 0.00891
Epoch: 38289, Training Loss: 0.00698
Epoch: 38289, Training Loss: 0.00725
Epoch: 38289, Training Loss: 0.00726
Epoch: 38289, Training Loss: 0.00891
Epoch: 38290, Training Loss: 0.00698
Epoch: 38290, Training Loss: 0.00725
Epoch: 38290, Training Loss: 0.00726
Epoch: 38290, Training Loss: 0.00891
Epoch: 38291, Training Loss: 0.00698
Epoch: 38291, Training Loss: 0.00725
Epoch: 38291, Training Loss: 0.00726
Epoch: 38291, Training Loss: 0.00891
Epoch: 38292, Training Loss: 0.00698
Epoch: 38292, Training Loss: 0.00725
Epoch: 38292, Training Loss: 0.00726
Epoch: 38292, Training Loss: 0.00891
Epoch: 38293, Training Loss: 0.00698
Epoch: 38293, Training Loss: 0.00725
Epoch: 38293, Training Loss: 0.00726
Epoch: 38293, Training Loss: 0.00891
Epoch: 38294, Training Loss: 0.00698
Epoch: 38294, Training Loss: 0.00725
Epoch: 38294, Training Loss: 0.00726
Epoch: 38294, Training Loss: 0.00891
Epoch: 38295, Training Loss: 0.00698
Epoch: 38295, Training Loss: 0.00725
Epoch: 38295, Training Loss: 0.00726
Epoch: 38295, Training Loss: 0.00891
Epoch: 38296, Training Loss: 0.00698
Epoch: 38296, Training Loss: 0.00725
Epoch: 38296, Training Loss: 0.00726
Epoch: 38296, Training Loss: 0.00891
Epoch: 38297, Training Loss: 0.00698
Epoch: 38297, Training Loss: 0.00725
Epoch: 38297, Training Loss: 0.00726
Epoch: 38297, Training Loss: 0.00891
Epoch: 38298, Training Loss: 0.00698
Epoch: 38298, Training Loss: 0.00725
Epoch: 38298, Training Loss: 0.00726
Epoch: 38298, Training Loss: 0.00891
Epoch: 38299, Training Loss: 0.00698
Epoch: 38299, Training Loss: 0.00725
Epoch: 38299, Training Loss: 0.00726
Epoch: 38299, Training Loss: 0.00891
Epoch: 38300, Training Loss: 0.00698
Epoch: 38300, Training Loss: 0.00725
Epoch: 38300, Training Loss: 0.00726
Epoch: 38300, Training Loss: 0.00891
Epoch: 38301, Training Loss: 0.00698
Epoch: 38301, Training Loss: 0.00725
Epoch: 38301, Training Loss: 0.00726
Epoch: 38301, Training Loss: 0.00891
Epoch: 38302, Training Loss: 0.00698
Epoch: 38302, Training Loss: 0.00725
Epoch: 38302, Training Loss: 0.00726
Epoch: 38302, Training Loss: 0.00891
Epoch: 38303, Training Loss: 0.00698
Epoch: 38303, Training Loss: 0.00725
Epoch: 38303, Training Loss: 0.00726
Epoch: 38303, Training Loss: 0.00891
Epoch: 38304, Training Loss: 0.00698
Epoch: 38304, Training Loss: 0.00725
Epoch: 38304, Training Loss: 0.00726
Epoch: 38304, Training Loss: 0.00891
Epoch: 38305, Training Loss: 0.00698
Epoch: 38305, Training Loss: 0.00725
Epoch: 38305, Training Loss: 0.00726
Epoch: 38305, Training Loss: 0.00891
Epoch: 38306, Training Loss: 0.00698
Epoch: 38306, Training Loss: 0.00725
Epoch: 38306, Training Loss: 0.00726
Epoch: 38306, Training Loss: 0.00891
Epoch: 38307, Training Loss: 0.00698
Epoch: 38307, Training Loss: 0.00725
Epoch: 38307, Training Loss: 0.00726
Epoch: 38307, Training Loss: 0.00891
Epoch: 38308, Training Loss: 0.00698
Epoch: 38308, Training Loss: 0.00725
Epoch: 38308, Training Loss: 0.00726
Epoch: 38308, Training Loss: 0.00891
Epoch: 38309, Training Loss: 0.00698
Epoch: 38309, Training Loss: 0.00725
Epoch: 38309, Training Loss: 0.00726
Epoch: 38309, Training Loss: 0.00891
Epoch: 38310, Training Loss: 0.00698
Epoch: 38310, Training Loss: 0.00725
Epoch: 38310, Training Loss: 0.00726
Epoch: 38310, Training Loss: 0.00891
Epoch: 38311, Training Loss: 0.00698
Epoch: 38311, Training Loss: 0.00725
Epoch: 38311, Training Loss: 0.00726
Epoch: 38311, Training Loss: 0.00891
Epoch: 38312, Training Loss: 0.00698
Epoch: 38312, Training Loss: 0.00725
Epoch: 38312, Training Loss: 0.00726
Epoch: 38312, Training Loss: 0.00891
Epoch: 38313, Training Loss: 0.00698
Epoch: 38313, Training Loss: 0.00725
Epoch: 38313, Training Loss: 0.00726
Epoch: 38313, Training Loss: 0.00891
Epoch: 38314, Training Loss: 0.00698
Epoch: 38314, Training Loss: 0.00725
Epoch: 38314, Training Loss: 0.00726
Epoch: 38314, Training Loss: 0.00890
Epoch: 38315, Training Loss: 0.00698
Epoch: 38315, Training Loss: 0.00725
Epoch: 38315, Training Loss: 0.00726
Epoch: 38315, Training Loss: 0.00890
Epoch: 38316, Training Loss: 0.00698
Epoch: 38316, Training Loss: 0.00725
Epoch: 38316, Training Loss: 0.00726
Epoch: 38316, Training Loss: 0.00890
Epoch: 38317, Training Loss: 0.00698
Epoch: 38317, Training Loss: 0.00725
Epoch: 38317, Training Loss: 0.00726
Epoch: 38317, Training Loss: 0.00890
Epoch: 38318, Training Loss: 0.00698
Epoch: 38318, Training Loss: 0.00725
Epoch: 38318, Training Loss: 0.00726
Epoch: 38318, Training Loss: 0.00890
Epoch: 38319, Training Loss: 0.00698
Epoch: 38319, Training Loss: 0.00725
Epoch: 38319, Training Loss: 0.00726
Epoch: 38319, Training Loss: 0.00890
Epoch: 38320, Training Loss: 0.00698
Epoch: 38320, Training Loss: 0.00725
Epoch: 38320, Training Loss: 0.00726
Epoch: 38320, Training Loss: 0.00890
Epoch: 38321, Training Loss: 0.00698
Epoch: 38321, Training Loss: 0.00725
Epoch: 38321, Training Loss: 0.00726
Epoch: 38321, Training Loss: 0.00890
Epoch: 38322, Training Loss: 0.00698
Epoch: 38322, Training Loss: 0.00725
Epoch: 38322, Training Loss: 0.00726
Epoch: 38322, Training Loss: 0.00890
Epoch: 38323, Training Loss: 0.00698
Epoch: 38323, Training Loss: 0.00725
Epoch: 38323, Training Loss: 0.00726
Epoch: 38323, Training Loss: 0.00890
Epoch: 38324, Training Loss: 0.00698
Epoch: 38324, Training Loss: 0.00725
Epoch: 38324, Training Loss: 0.00726
Epoch: 38324, Training Loss: 0.00890
...
Epoch: 38567, Training Loss: 0.00695
Epoch: 38567, Training Loss: 0.00723
Epoch: 38567, Training Loss: 0.00723
Epoch: 38567, Training Loss: 0.00887
[output truncated: one loss line per training sample per epoch. Over epochs 38324-38567 the four per-sample losses decrease only gradually, from (0.00698, 0.00725, 0.00726, 0.00890) to (0.00695, 0.00723, 0.00723, 0.00887); the fourth sample's loss is still above the 0.008 target at this point.]
Epoch: 38568, Training Loss: 0.00695
Epoch: 38568, Training Loss: 0.00723
Epoch: 38568, Training Loss: 0.00723
Epoch: 38568, Training Loss: 0.00887
Epoch: 38569, Training Loss: 0.00695
Epoch: 38569, Training Loss: 0.00723
Epoch: 38569, Training Loss: 0.00723
Epoch: 38569, Training Loss: 0.00887
Epoch: 38570, Training Loss: 0.00695
Epoch: 38570, Training Loss: 0.00722
Epoch: 38570, Training Loss: 0.00723
Epoch: 38570, Training Loss: 0.00887
Epoch: 38571, Training Loss: 0.00695
Epoch: 38571, Training Loss: 0.00722
Epoch: 38571, Training Loss: 0.00723
Epoch: 38571, Training Loss: 0.00887
Epoch: 38572, Training Loss: 0.00695
Epoch: 38572, Training Loss: 0.00722
Epoch: 38572, Training Loss: 0.00723
Epoch: 38572, Training Loss: 0.00887
Epoch: 38573, Training Loss: 0.00695
Epoch: 38573, Training Loss: 0.00722
Epoch: 38573, Training Loss: 0.00723
Epoch: 38573, Training Loss: 0.00887
Epoch: 38574, Training Loss: 0.00695
Epoch: 38574, Training Loss: 0.00722
Epoch: 38574, Training Loss: 0.00723
Epoch: 38574, Training Loss: 0.00887
Epoch: 38575, Training Loss: 0.00695
Epoch: 38575, Training Loss: 0.00722
Epoch: 38575, Training Loss: 0.00723
Epoch: 38575, Training Loss: 0.00887
Epoch: 38576, Training Loss: 0.00695
Epoch: 38576, Training Loss: 0.00722
Epoch: 38576, Training Loss: 0.00723
Epoch: 38576, Training Loss: 0.00887
Epoch: 38577, Training Loss: 0.00695
Epoch: 38577, Training Loss: 0.00722
Epoch: 38577, Training Loss: 0.00723
Epoch: 38577, Training Loss: 0.00887
Epoch: 38578, Training Loss: 0.00695
Epoch: 38578, Training Loss: 0.00722
Epoch: 38578, Training Loss: 0.00723
Epoch: 38578, Training Loss: 0.00887
Epoch: 38579, Training Loss: 0.00695
Epoch: 38579, Training Loss: 0.00722
Epoch: 38579, Training Loss: 0.00723
Epoch: 38579, Training Loss: 0.00887
Epoch: 38580, Training Loss: 0.00695
Epoch: 38580, Training Loss: 0.00722
Epoch: 38580, Training Loss: 0.00723
Epoch: 38580, Training Loss: 0.00887
Epoch: 38581, Training Loss: 0.00695
Epoch: 38581, Training Loss: 0.00722
Epoch: 38581, Training Loss: 0.00723
Epoch: 38581, Training Loss: 0.00887
Epoch: 38582, Training Loss: 0.00695
Epoch: 38582, Training Loss: 0.00722
Epoch: 38582, Training Loss: 0.00723
Epoch: 38582, Training Loss: 0.00887
Epoch: 38583, Training Loss: 0.00695
Epoch: 38583, Training Loss: 0.00722
Epoch: 38583, Training Loss: 0.00723
Epoch: 38583, Training Loss: 0.00887
Epoch: 38584, Training Loss: 0.00695
Epoch: 38584, Training Loss: 0.00722
Epoch: 38584, Training Loss: 0.00723
Epoch: 38584, Training Loss: 0.00887
Epoch: 38585, Training Loss: 0.00695
Epoch: 38585, Training Loss: 0.00722
Epoch: 38585, Training Loss: 0.00723
Epoch: 38585, Training Loss: 0.00887
Epoch: 38586, Training Loss: 0.00695
Epoch: 38586, Training Loss: 0.00722
Epoch: 38586, Training Loss: 0.00723
Epoch: 38586, Training Loss: 0.00887
Epoch: 38587, Training Loss: 0.00695
Epoch: 38587, Training Loss: 0.00722
Epoch: 38587, Training Loss: 0.00723
Epoch: 38587, Training Loss: 0.00887
Epoch: 38588, Training Loss: 0.00695
Epoch: 38588, Training Loss: 0.00722
Epoch: 38588, Training Loss: 0.00723
Epoch: 38588, Training Loss: 0.00887
Epoch: 38589, Training Loss: 0.00695
Epoch: 38589, Training Loss: 0.00722
Epoch: 38589, Training Loss: 0.00723
Epoch: 38589, Training Loss: 0.00887
Epoch: 38590, Training Loss: 0.00695
Epoch: 38590, Training Loss: 0.00722
Epoch: 38590, Training Loss: 0.00723
Epoch: 38590, Training Loss: 0.00887
Epoch: 38591, Training Loss: 0.00695
Epoch: 38591, Training Loss: 0.00722
Epoch: 38591, Training Loss: 0.00723
Epoch: 38591, Training Loss: 0.00887
Epoch: 38592, Training Loss: 0.00695
Epoch: 38592, Training Loss: 0.00722
Epoch: 38592, Training Loss: 0.00723
Epoch: 38592, Training Loss: 0.00887
Epoch: 38593, Training Loss: 0.00695
Epoch: 38593, Training Loss: 0.00722
Epoch: 38593, Training Loss: 0.00723
Epoch: 38593, Training Loss: 0.00887
Epoch: 38594, Training Loss: 0.00695
Epoch: 38594, Training Loss: 0.00722
Epoch: 38594, Training Loss: 0.00723
Epoch: 38594, Training Loss: 0.00887
Epoch: 38595, Training Loss: 0.00695
Epoch: 38595, Training Loss: 0.00722
Epoch: 38595, Training Loss: 0.00723
Epoch: 38595, Training Loss: 0.00887
Epoch: 38596, Training Loss: 0.00695
Epoch: 38596, Training Loss: 0.00722
Epoch: 38596, Training Loss: 0.00723
Epoch: 38596, Training Loss: 0.00887
Epoch: 38597, Training Loss: 0.00695
Epoch: 38597, Training Loss: 0.00722
Epoch: 38597, Training Loss: 0.00723
Epoch: 38597, Training Loss: 0.00887
Epoch: 38598, Training Loss: 0.00695
Epoch: 38598, Training Loss: 0.00722
Epoch: 38598, Training Loss: 0.00723
Epoch: 38598, Training Loss: 0.00887
Epoch: 38599, Training Loss: 0.00695
Epoch: 38599, Training Loss: 0.00722
Epoch: 38599, Training Loss: 0.00723
Epoch: 38599, Training Loss: 0.00887
Epoch: 38600, Training Loss: 0.00695
Epoch: 38600, Training Loss: 0.00722
Epoch: 38600, Training Loss: 0.00723
Epoch: 38600, Training Loss: 0.00887
Epoch: 38601, Training Loss: 0.00695
Epoch: 38601, Training Loss: 0.00722
Epoch: 38601, Training Loss: 0.00723
Epoch: 38601, Training Loss: 0.00887
Epoch: 38602, Training Loss: 0.00695
Epoch: 38602, Training Loss: 0.00722
Epoch: 38602, Training Loss: 0.00723
Epoch: 38602, Training Loss: 0.00887
Epoch: 38603, Training Loss: 0.00695
Epoch: 38603, Training Loss: 0.00722
Epoch: 38603, Training Loss: 0.00723
Epoch: 38603, Training Loss: 0.00887
Epoch: 38604, Training Loss: 0.00695
Epoch: 38604, Training Loss: 0.00722
Epoch: 38604, Training Loss: 0.00723
Epoch: 38604, Training Loss: 0.00887
Epoch: 38605, Training Loss: 0.00695
Epoch: 38605, Training Loss: 0.00722
Epoch: 38605, Training Loss: 0.00723
Epoch: 38605, Training Loss: 0.00887
Epoch: 38606, Training Loss: 0.00695
Epoch: 38606, Training Loss: 0.00722
Epoch: 38606, Training Loss: 0.00723
Epoch: 38606, Training Loss: 0.00887
Epoch: 38607, Training Loss: 0.00695
Epoch: 38607, Training Loss: 0.00722
Epoch: 38607, Training Loss: 0.00723
Epoch: 38607, Training Loss: 0.00887
Epoch: 38608, Training Loss: 0.00695
Epoch: 38608, Training Loss: 0.00722
Epoch: 38608, Training Loss: 0.00723
Epoch: 38608, Training Loss: 0.00887
Epoch: 38609, Training Loss: 0.00695
Epoch: 38609, Training Loss: 0.00722
Epoch: 38609, Training Loss: 0.00723
Epoch: 38609, Training Loss: 0.00887
Epoch: 38610, Training Loss: 0.00695
Epoch: 38610, Training Loss: 0.00722
Epoch: 38610, Training Loss: 0.00723
Epoch: 38610, Training Loss: 0.00887
Epoch: 38611, Training Loss: 0.00695
Epoch: 38611, Training Loss: 0.00722
Epoch: 38611, Training Loss: 0.00723
Epoch: 38611, Training Loss: 0.00887
Epoch: 38612, Training Loss: 0.00695
Epoch: 38612, Training Loss: 0.00722
Epoch: 38612, Training Loss: 0.00723
Epoch: 38612, Training Loss: 0.00887
Epoch: 38613, Training Loss: 0.00695
Epoch: 38613, Training Loss: 0.00722
Epoch: 38613, Training Loss: 0.00723
Epoch: 38613, Training Loss: 0.00887
Epoch: 38614, Training Loss: 0.00695
Epoch: 38614, Training Loss: 0.00722
Epoch: 38614, Training Loss: 0.00723
Epoch: 38614, Training Loss: 0.00887
Epoch: 38615, Training Loss: 0.00695
Epoch: 38615, Training Loss: 0.00722
Epoch: 38615, Training Loss: 0.00723
Epoch: 38615, Training Loss: 0.00887
Epoch: 38616, Training Loss: 0.00695
Epoch: 38616, Training Loss: 0.00722
Epoch: 38616, Training Loss: 0.00723
Epoch: 38616, Training Loss: 0.00887
Epoch: 38617, Training Loss: 0.00695
Epoch: 38617, Training Loss: 0.00722
Epoch: 38617, Training Loss: 0.00723
Epoch: 38617, Training Loss: 0.00887
Epoch: 38618, Training Loss: 0.00695
Epoch: 38618, Training Loss: 0.00722
Epoch: 38618, Training Loss: 0.00723
Epoch: 38618, Training Loss: 0.00887
Epoch: 38619, Training Loss: 0.00695
Epoch: 38619, Training Loss: 0.00722
Epoch: 38619, Training Loss: 0.00723
Epoch: 38619, Training Loss: 0.00887
Epoch: 38620, Training Loss: 0.00695
Epoch: 38620, Training Loss: 0.00722
Epoch: 38620, Training Loss: 0.00723
Epoch: 38620, Training Loss: 0.00887
Epoch: 38621, Training Loss: 0.00695
Epoch: 38621, Training Loss: 0.00722
Epoch: 38621, Training Loss: 0.00723
Epoch: 38621, Training Loss: 0.00887
Epoch: 38622, Training Loss: 0.00695
Epoch: 38622, Training Loss: 0.00722
Epoch: 38622, Training Loss: 0.00723
Epoch: 38622, Training Loss: 0.00887
Epoch: 38623, Training Loss: 0.00695
Epoch: 38623, Training Loss: 0.00722
Epoch: 38623, Training Loss: 0.00723
Epoch: 38623, Training Loss: 0.00887
Epoch: 38624, Training Loss: 0.00695
Epoch: 38624, Training Loss: 0.00722
Epoch: 38624, Training Loss: 0.00723
Epoch: 38624, Training Loss: 0.00887
Epoch: 38625, Training Loss: 0.00695
Epoch: 38625, Training Loss: 0.00722
Epoch: 38625, Training Loss: 0.00723
Epoch: 38625, Training Loss: 0.00886
Epoch: 38626, Training Loss: 0.00695
Epoch: 38626, Training Loss: 0.00722
Epoch: 38626, Training Loss: 0.00723
Epoch: 38626, Training Loss: 0.00886
Epoch: 38627, Training Loss: 0.00695
Epoch: 38627, Training Loss: 0.00722
Epoch: 38627, Training Loss: 0.00723
Epoch: 38627, Training Loss: 0.00886
Epoch: 38628, Training Loss: 0.00695
Epoch: 38628, Training Loss: 0.00722
Epoch: 38628, Training Loss: 0.00723
Epoch: 38628, Training Loss: 0.00886
Epoch: 38629, Training Loss: 0.00695
Epoch: 38629, Training Loss: 0.00722
Epoch: 38629, Training Loss: 0.00723
Epoch: 38629, Training Loss: 0.00886
Epoch: 38630, Training Loss: 0.00695
Epoch: 38630, Training Loss: 0.00722
Epoch: 38630, Training Loss: 0.00723
Epoch: 38630, Training Loss: 0.00886
Epoch: 38631, Training Loss: 0.00695
Epoch: 38631, Training Loss: 0.00722
Epoch: 38631, Training Loss: 0.00723
Epoch: 38631, Training Loss: 0.00886
Epoch: 38632, Training Loss: 0.00695
Epoch: 38632, Training Loss: 0.00722
Epoch: 38632, Training Loss: 0.00723
Epoch: 38632, Training Loss: 0.00886
Epoch: 38633, Training Loss: 0.00695
Epoch: 38633, Training Loss: 0.00722
Epoch: 38633, Training Loss: 0.00723
Epoch: 38633, Training Loss: 0.00886
Epoch: 38634, Training Loss: 0.00695
Epoch: 38634, Training Loss: 0.00722
Epoch: 38634, Training Loss: 0.00723
Epoch: 38634, Training Loss: 0.00886
Epoch: 38635, Training Loss: 0.00695
Epoch: 38635, Training Loss: 0.00722
Epoch: 38635, Training Loss: 0.00723
Epoch: 38635, Training Loss: 0.00886
Epoch: 38636, Training Loss: 0.00695
Epoch: 38636, Training Loss: 0.00722
Epoch: 38636, Training Loss: 0.00723
Epoch: 38636, Training Loss: 0.00886
Epoch: 38637, Training Loss: 0.00695
Epoch: 38637, Training Loss: 0.00722
Epoch: 38637, Training Loss: 0.00723
Epoch: 38637, Training Loss: 0.00886
Epoch: 38638, Training Loss: 0.00695
Epoch: 38638, Training Loss: 0.00722
Epoch: 38638, Training Loss: 0.00723
Epoch: 38638, Training Loss: 0.00886
Epoch: 38639, Training Loss: 0.00695
Epoch: 38639, Training Loss: 0.00722
Epoch: 38639, Training Loss: 0.00723
Epoch: 38639, Training Loss: 0.00886
Epoch: 38640, Training Loss: 0.00695
Epoch: 38640, Training Loss: 0.00722
Epoch: 38640, Training Loss: 0.00723
Epoch: 38640, Training Loss: 0.00886
Epoch: 38641, Training Loss: 0.00695
Epoch: 38641, Training Loss: 0.00722
Epoch: 38641, Training Loss: 0.00723
Epoch: 38641, Training Loss: 0.00886
Epoch: 38642, Training Loss: 0.00695
Epoch: 38642, Training Loss: 0.00722
Epoch: 38642, Training Loss: 0.00723
Epoch: 38642, Training Loss: 0.00886
Epoch: 38643, Training Loss: 0.00695
Epoch: 38643, Training Loss: 0.00722
Epoch: 38643, Training Loss: 0.00723
Epoch: 38643, Training Loss: 0.00886
Epoch: 38644, Training Loss: 0.00695
Epoch: 38644, Training Loss: 0.00722
Epoch: 38644, Training Loss: 0.00723
Epoch: 38644, Training Loss: 0.00886
Epoch: 38645, Training Loss: 0.00695
Epoch: 38645, Training Loss: 0.00722
Epoch: 38645, Training Loss: 0.00723
Epoch: 38645, Training Loss: 0.00886
Epoch: 38646, Training Loss: 0.00695
Epoch: 38646, Training Loss: 0.00722
Epoch: 38646, Training Loss: 0.00723
Epoch: 38646, Training Loss: 0.00886
Epoch: 38647, Training Loss: 0.00695
Epoch: 38647, Training Loss: 0.00722
Epoch: 38647, Training Loss: 0.00723
Epoch: 38647, Training Loss: 0.00886
Epoch: 38648, Training Loss: 0.00695
Epoch: 38648, Training Loss: 0.00722
Epoch: 38648, Training Loss: 0.00723
Epoch: 38648, Training Loss: 0.00886
Epoch: 38649, Training Loss: 0.00695
Epoch: 38649, Training Loss: 0.00722
Epoch: 38649, Training Loss: 0.00722
Epoch: 38649, Training Loss: 0.00886
Epoch: 38650, Training Loss: 0.00695
Epoch: 38650, Training Loss: 0.00722
Epoch: 38650, Training Loss: 0.00722
Epoch: 38650, Training Loss: 0.00886
Epoch: 38651, Training Loss: 0.00695
Epoch: 38651, Training Loss: 0.00722
Epoch: 38651, Training Loss: 0.00722
Epoch: 38651, Training Loss: 0.00886
Epoch: 38652, Training Loss: 0.00695
Epoch: 38652, Training Loss: 0.00722
Epoch: 38652, Training Loss: 0.00722
Epoch: 38652, Training Loss: 0.00886
Epoch: 38653, Training Loss: 0.00695
Epoch: 38653, Training Loss: 0.00722
Epoch: 38653, Training Loss: 0.00722
Epoch: 38653, Training Loss: 0.00886
Epoch: 38654, Training Loss: 0.00694
Epoch: 38654, Training Loss: 0.00722
Epoch: 38654, Training Loss: 0.00722
Epoch: 38654, Training Loss: 0.00886
Epoch: 38655, Training Loss: 0.00694
Epoch: 38655, Training Loss: 0.00722
Epoch: 38655, Training Loss: 0.00722
Epoch: 38655, Training Loss: 0.00886
Epoch: 38656, Training Loss: 0.00694
Epoch: 38656, Training Loss: 0.00722
Epoch: 38656, Training Loss: 0.00722
Epoch: 38656, Training Loss: 0.00886
Epoch: 38657, Training Loss: 0.00694
Epoch: 38657, Training Loss: 0.00722
Epoch: 38657, Training Loss: 0.00722
Epoch: 38657, Training Loss: 0.00886
Epoch: 38658, Training Loss: 0.00694
Epoch: 38658, Training Loss: 0.00722
Epoch: 38658, Training Loss: 0.00722
Epoch: 38658, Training Loss: 0.00886
Epoch: 38659, Training Loss: 0.00694
Epoch: 38659, Training Loss: 0.00722
Epoch: 38659, Training Loss: 0.00722
Epoch: 38659, Training Loss: 0.00886
Epoch: 38660, Training Loss: 0.00694
Epoch: 38660, Training Loss: 0.00722
Epoch: 38660, Training Loss: 0.00722
Epoch: 38660, Training Loss: 0.00886
Epoch: 38661, Training Loss: 0.00694
Epoch: 38661, Training Loss: 0.00722
Epoch: 38661, Training Loss: 0.00722
Epoch: 38661, Training Loss: 0.00886
Epoch: 38662, Training Loss: 0.00694
Epoch: 38662, Training Loss: 0.00722
Epoch: 38662, Training Loss: 0.00722
Epoch: 38662, Training Loss: 0.00886
Epoch: 38663, Training Loss: 0.00694
Epoch: 38663, Training Loss: 0.00722
Epoch: 38663, Training Loss: 0.00722
Epoch: 38663, Training Loss: 0.00886
Epoch: 38664, Training Loss: 0.00694
Epoch: 38664, Training Loss: 0.00722
Epoch: 38664, Training Loss: 0.00722
Epoch: 38664, Training Loss: 0.00886
Epoch: 38665, Training Loss: 0.00694
Epoch: 38665, Training Loss: 0.00722
Epoch: 38665, Training Loss: 0.00722
Epoch: 38665, Training Loss: 0.00886
Epoch: 38666, Training Loss: 0.00694
Epoch: 38666, Training Loss: 0.00722
Epoch: 38666, Training Loss: 0.00722
Epoch: 38666, Training Loss: 0.00886
Epoch: 38667, Training Loss: 0.00694
Epoch: 38667, Training Loss: 0.00721
Epoch: 38667, Training Loss: 0.00722
Epoch: 38667, Training Loss: 0.00886
Epoch: 38668, Training Loss: 0.00694
Epoch: 38668, Training Loss: 0.00721
Epoch: 38668, Training Loss: 0.00722
Epoch: 38668, Training Loss: 0.00886
Epoch: 38669, Training Loss: 0.00694
Epoch: 38669, Training Loss: 0.00721
Epoch: 38669, Training Loss: 0.00722
Epoch: 38669, Training Loss: 0.00886
Epoch: 38670, Training Loss: 0.00694
Epoch: 38670, Training Loss: 0.00721
Epoch: 38670, Training Loss: 0.00722
Epoch: 38670, Training Loss: 0.00886
Epoch: 38671, Training Loss: 0.00694
Epoch: 38671, Training Loss: 0.00721
Epoch: 38671, Training Loss: 0.00722
Epoch: 38671, Training Loss: 0.00886
Epoch: 38672, Training Loss: 0.00694
Epoch: 38672, Training Loss: 0.00721
Epoch: 38672, Training Loss: 0.00722
Epoch: 38672, Training Loss: 0.00886
Epoch: 38673, Training Loss: 0.00694
Epoch: 38673, Training Loss: 0.00721
Epoch: 38673, Training Loss: 0.00722
Epoch: 38673, Training Loss: 0.00886
Epoch: 38674, Training Loss: 0.00694
Epoch: 38674, Training Loss: 0.00721
Epoch: 38674, Training Loss: 0.00722
Epoch: 38674, Training Loss: 0.00886
Epoch: 38675, Training Loss: 0.00694
Epoch: 38675, Training Loss: 0.00721
Epoch: 38675, Training Loss: 0.00722
Epoch: 38675, Training Loss: 0.00886
Epoch: 38676, Training Loss: 0.00694
Epoch: 38676, Training Loss: 0.00721
Epoch: 38676, Training Loss: 0.00722
Epoch: 38676, Training Loss: 0.00886
Epoch: 38677, Training Loss: 0.00694
Epoch: 38677, Training Loss: 0.00721
Epoch: 38677, Training Loss: 0.00722
Epoch: 38677, Training Loss: 0.00886
Epoch: 38678, Training Loss: 0.00694
Epoch: 38678, Training Loss: 0.00721
Epoch: 38678, Training Loss: 0.00722
Epoch: 38678, Training Loss: 0.00886
Epoch: 38679, Training Loss: 0.00694
Epoch: 38679, Training Loss: 0.00721
Epoch: 38679, Training Loss: 0.00722
Epoch: 38679, Training Loss: 0.00886
Epoch: 38680, Training Loss: 0.00694
Epoch: 38680, Training Loss: 0.00721
Epoch: 38680, Training Loss: 0.00722
Epoch: 38680, Training Loss: 0.00886
Epoch: 38681, Training Loss: 0.00694
Epoch: 38681, Training Loss: 0.00721
Epoch: 38681, Training Loss: 0.00722
Epoch: 38681, Training Loss: 0.00886
Epoch: 38682, Training Loss: 0.00694
Epoch: 38682, Training Loss: 0.00721
Epoch: 38682, Training Loss: 0.00722
Epoch: 38682, Training Loss: 0.00886
Epoch: 38683, Training Loss: 0.00694
Epoch: 38683, Training Loss: 0.00721
Epoch: 38683, Training Loss: 0.00722
Epoch: 38683, Training Loss: 0.00886
Epoch: 38684, Training Loss: 0.00694
Epoch: 38684, Training Loss: 0.00721
Epoch: 38684, Training Loss: 0.00722
Epoch: 38684, Training Loss: 0.00886
Epoch: 38685, Training Loss: 0.00694
Epoch: 38685, Training Loss: 0.00721
Epoch: 38685, Training Loss: 0.00722
Epoch: 38685, Training Loss: 0.00886
Epoch: 38686, Training Loss: 0.00694
Epoch: 38686, Training Loss: 0.00721
Epoch: 38686, Training Loss: 0.00722
Epoch: 38686, Training Loss: 0.00886
Epoch: 38687, Training Loss: 0.00694
Epoch: 38687, Training Loss: 0.00721
Epoch: 38687, Training Loss: 0.00722
Epoch: 38687, Training Loss: 0.00886
Epoch: 38688, Training Loss: 0.00694
Epoch: 38688, Training Loss: 0.00721
Epoch: 38688, Training Loss: 0.00722
Epoch: 38688, Training Loss: 0.00886
Epoch: 38689, Training Loss: 0.00694
Epoch: 38689, Training Loss: 0.00721
Epoch: 38689, Training Loss: 0.00722
Epoch: 38689, Training Loss: 0.00886
Epoch: 38690, Training Loss: 0.00694
Epoch: 38690, Training Loss: 0.00721
Epoch: 38690, Training Loss: 0.00722
Epoch: 38690, Training Loss: 0.00886
Epoch: 38691, Training Loss: 0.00694
Epoch: 38691, Training Loss: 0.00721
Epoch: 38691, Training Loss: 0.00722
Epoch: 38691, Training Loss: 0.00886
Epoch: 38692, Training Loss: 0.00694
Epoch: 38692, Training Loss: 0.00721
Epoch: 38692, Training Loss: 0.00722
Epoch: 38692, Training Loss: 0.00886
Epoch: 38693, Training Loss: 0.00694
Epoch: 38693, Training Loss: 0.00721
Epoch: 38693, Training Loss: 0.00722
Epoch: 38693, Training Loss: 0.00886
Epoch: 38694, Training Loss: 0.00694
Epoch: 38694, Training Loss: 0.00721
Epoch: 38694, Training Loss: 0.00722
Epoch: 38694, Training Loss: 0.00886
Epoch: 38695, Training Loss: 0.00694
Epoch: 38695, Training Loss: 0.00721
Epoch: 38695, Training Loss: 0.00722
Epoch: 38695, Training Loss: 0.00886
Epoch: 38696, Training Loss: 0.00694
Epoch: 38696, Training Loss: 0.00721
Epoch: 38696, Training Loss: 0.00722
Epoch: 38696, Training Loss: 0.00886
Epoch: 38697, Training Loss: 0.00694
Epoch: 38697, Training Loss: 0.00721
Epoch: 38697, Training Loss: 0.00722
Epoch: 38697, Training Loss: 0.00886
Epoch: 38698, Training Loss: 0.00694
Epoch: 38698, Training Loss: 0.00721
Epoch: 38698, Training Loss: 0.00722
Epoch: 38698, Training Loss: 0.00886
Epoch: 38699, Training Loss: 0.00694
Epoch: 38699, Training Loss: 0.00721
Epoch: 38699, Training Loss: 0.00722
Epoch: 38699, Training Loss: 0.00886
Epoch: 38700, Training Loss: 0.00694
Epoch: 38700, Training Loss: 0.00721
Epoch: 38700, Training Loss: 0.00722
Epoch: 38700, Training Loss: 0.00886
Epoch: 38701, Training Loss: 0.00694
Epoch: 38701, Training Loss: 0.00721
Epoch: 38701, Training Loss: 0.00722
Epoch: 38701, Training Loss: 0.00886
Epoch: 38702, Training Loss: 0.00694
Epoch: 38702, Training Loss: 0.00721
Epoch: 38702, Training Loss: 0.00722
Epoch: 38702, Training Loss: 0.00886
Epoch: 38703, Training Loss: 0.00694
Epoch: 38703, Training Loss: 0.00721
Epoch: 38703, Training Loss: 0.00722
Epoch: 38703, Training Loss: 0.00886
Epoch: 38704, Training Loss: 0.00694
Epoch: 38704, Training Loss: 0.00721
Epoch: 38704, Training Loss: 0.00722
Epoch: 38704, Training Loss: 0.00885
Epoch: 38705, Training Loss: 0.00694
Epoch: 38705, Training Loss: 0.00721
Epoch: 38705, Training Loss: 0.00722
Epoch: 38705, Training Loss: 0.00885
Epoch: 38706, Training Loss: 0.00694
Epoch: 38706, Training Loss: 0.00721
Epoch: 38706, Training Loss: 0.00722
Epoch: 38706, Training Loss: 0.00885
Epoch: 38707, Training Loss: 0.00694
Epoch: 38707, Training Loss: 0.00721
Epoch: 38707, Training Loss: 0.00722
Epoch: 38707, Training Loss: 0.00885
Epoch: 38708, Training Loss: 0.00694
Epoch: 38708, Training Loss: 0.00721
Epoch: 38708, Training Loss: 0.00722
Epoch: 38708, Training Loss: 0.00885
Epoch: 38709, Training Loss: 0.00694
Epoch: 38709, Training Loss: 0.00721
Epoch: 38709, Training Loss: 0.00722
Epoch: 38709, Training Loss: 0.00885
Epoch: 38710, Training Loss: 0.00694
Epoch: 38710, Training Loss: 0.00721
Epoch: 38710, Training Loss: 0.00722
Epoch: 38710, Training Loss: 0.00885
Epoch: 38711, Training Loss: 0.00694
Epoch: 38711, Training Loss: 0.00721
Epoch: 38711, Training Loss: 0.00722
Epoch: 38711, Training Loss: 0.00885
Epoch: 38712, Training Loss: 0.00694
Epoch: 38712, Training Loss: 0.00721
Epoch: 38712, Training Loss: 0.00722
Epoch: 38712, Training Loss: 0.00885
Epoch: 38713, Training Loss: 0.00694
Epoch: 38713, Training Loss: 0.00721
Epoch: 38713, Training Loss: 0.00722
Epoch: 38713, Training Loss: 0.00885
Epoch: 38714, Training Loss: 0.00694
Epoch: 38714, Training Loss: 0.00721
Epoch: 38714, Training Loss: 0.00722
Epoch: 38714, Training Loss: 0.00885
Epoch: 38715, Training Loss: 0.00694
Epoch: 38715, Training Loss: 0.00721
Epoch: 38715, Training Loss: 0.00722
Epoch: 38715, Training Loss: 0.00885
Epoch: 38716, Training Loss: 0.00694
Epoch: 38716, Training Loss: 0.00721
Epoch: 38716, Training Loss: 0.00722
Epoch: 38716, Training Loss: 0.00885
Epoch: 38717, Training Loss: 0.00694
Epoch: 38717, Training Loss: 0.00721
Epoch: 38717, Training Loss: 0.00722
Epoch: 38717, Training Loss: 0.00885
Epoch: 38718, Training Loss: 0.00694
Epoch: 38718, Training Loss: 0.00721
Epoch: 38718, Training Loss: 0.00722
Epoch: 38718, Training Loss: 0.00885
Epoch: 38719, Training Loss: 0.00694
Epoch: 38719, Training Loss: 0.00721
Epoch: 38719, Training Loss: 0.00722
Epoch: 38719, Training Loss: 0.00885
Epoch: 38720, Training Loss: 0.00694
Epoch: 38720, Training Loss: 0.00721
Epoch: 38720, Training Loss: 0.00722
Epoch: 38720, Training Loss: 0.00885
Epoch: 38721, Training Loss: 0.00694
Epoch: 38721, Training Loss: 0.00721
Epoch: 38721, Training Loss: 0.00722
Epoch: 38721, Training Loss: 0.00885
Epoch: 38722, Training Loss: 0.00694
Epoch: 38722, Training Loss: 0.00721
Epoch: 38722, Training Loss: 0.00722
Epoch: 38722, Training Loss: 0.00885
Epoch: 38723, Training Loss: 0.00694
Epoch: 38723, Training Loss: 0.00721
Epoch: 38723, Training Loss: 0.00722
Epoch: 38723, Training Loss: 0.00885
Epoch: 38724, Training Loss: 0.00694
Epoch: 38724, Training Loss: 0.00721
Epoch: 38724, Training Loss: 0.00722
Epoch: 38724, Training Loss: 0.00885
Epoch: 38725, Training Loss: 0.00694
Epoch: 38725, Training Loss: 0.00721
Epoch: 38725, Training Loss: 0.00722
Epoch: 38725, Training Loss: 0.00885
Epoch: 38726, Training Loss: 0.00694
Epoch: 38726, Training Loss: 0.00721
Epoch: 38726, Training Loss: 0.00722
Epoch: 38726, Training Loss: 0.00885
Epoch: 38727, Training Loss: 0.00694
Epoch: 38727, Training Loss: 0.00721
Epoch: 38727, Training Loss: 0.00722
Epoch: 38727, Training Loss: 0.00885
Epoch: 38728, Training Loss: 0.00694
Epoch: 38728, Training Loss: 0.00721
Epoch: 38728, Training Loss: 0.00722
Epoch: 38728, Training Loss: 0.00885
Epoch: 38729, Training Loss: 0.00694
Epoch: 38729, Training Loss: 0.00721
Epoch: 38729, Training Loss: 0.00722
Epoch: 38729, Training Loss: 0.00885
Epoch: 38730, Training Loss: 0.00694
Epoch: 38730, Training Loss: 0.00721
Epoch: 38730, Training Loss: 0.00722
Epoch: 38730, Training Loss: 0.00885
Epoch: 38731, Training Loss: 0.00694
Epoch: 38731, Training Loss: 0.00721
Epoch: 38731, Training Loss: 0.00722
Epoch: 38731, Training Loss: 0.00885
Epoch: 38732, Training Loss: 0.00694
Epoch: 38732, Training Loss: 0.00721
Epoch: 38732, Training Loss: 0.00722
Epoch: 38732, Training Loss: 0.00885
Epoch: 38733, Training Loss: 0.00694
Epoch: 38733, Training Loss: 0.00721
Epoch: 38733, Training Loss: 0.00722
Epoch: 38733, Training Loss: 0.00885
Epoch: 38734, Training Loss: 0.00694
Epoch: 38734, Training Loss: 0.00721
Epoch: 38734, Training Loss: 0.00722
Epoch: 38734, Training Loss: 0.00885
Epoch: 38735, Training Loss: 0.00694
Epoch: 38735, Training Loss: 0.00721
Epoch: 38735, Training Loss: 0.00722
Epoch: 38735, Training Loss: 0.00885
Epoch: 38736, Training Loss: 0.00694
Epoch: 38736, Training Loss: 0.00721
Epoch: 38736, Training Loss: 0.00722
Epoch: 38736, Training Loss: 0.00885
Epoch: 38737, Training Loss: 0.00694
Epoch: 38737, Training Loss: 0.00721
Epoch: 38737, Training Loss: 0.00722
Epoch: 38737, Training Loss: 0.00885
Epoch: 38738, Training Loss: 0.00694
Epoch: 38738, Training Loss: 0.00721
Epoch: 38738, Training Loss: 0.00722
Epoch: 38738, Training Loss: 0.00885
Epoch: 38739, Training Loss: 0.00694
Epoch: 38739, Training Loss: 0.00721
Epoch: 38739, Training Loss: 0.00722
Epoch: 38739, Training Loss: 0.00885
Epoch: 38740, Training Loss: 0.00694
Epoch: 38740, Training Loss: 0.00721
Epoch: 38740, Training Loss: 0.00722
Epoch: 38740, Training Loss: 0.00885
Epoch: 38741, Training Loss: 0.00694
Epoch: 38741, Training Loss: 0.00721
Epoch: 38741, Training Loss: 0.00722
Epoch: 38741, Training Loss: 0.00885
Epoch: 38742, Training Loss: 0.00694
Epoch: 38742, Training Loss: 0.00721
Epoch: 38742, Training Loss: 0.00722
Epoch: 38742, Training Loss: 0.00885
Epoch: 38743, Training Loss: 0.00694
Epoch: 38743, Training Loss: 0.00721
Epoch: 38743, Training Loss: 0.00722
Epoch: 38743, Training Loss: 0.00885
Epoch: 38744, Training Loss: 0.00694
Epoch: 38744, Training Loss: 0.00721
Epoch: 38744, Training Loss: 0.00722
Epoch: 38744, Training Loss: 0.00885
Epoch: 38745, Training Loss: 0.00694
Epoch: 38745, Training Loss: 0.00721
Epoch: 38745, Training Loss: 0.00722
Epoch: 38745, Training Loss: 0.00885
Epoch: 38746, Training Loss: 0.00694
Epoch: 38746, Training Loss: 0.00721
Epoch: 38746, Training Loss: 0.00721
Epoch: 38746, Training Loss: 0.00885
Epoch: 38747, Training Loss: 0.00694
Epoch: 38747, Training Loss: 0.00721
Epoch: 38747, Training Loss: 0.00721
Epoch: 38747, Training Loss: 0.00885
Epoch: 38748, Training Loss: 0.00694
Epoch: 38748, Training Loss: 0.00721
Epoch: 38748, Training Loss: 0.00721
Epoch: 38748, Training Loss: 0.00885
Epoch: 38749, Training Loss: 0.00694
Epoch: 38749, Training Loss: 0.00721
Epoch: 38749, Training Loss: 0.00721
Epoch: 38749, Training Loss: 0.00885
Epoch: 38750, Training Loss: 0.00694
Epoch: 38750, Training Loss: 0.00721
Epoch: 38750, Training Loss: 0.00721
Epoch: 38750, Training Loss: 0.00885
Epoch: 38751, Training Loss: 0.00694
Epoch: 38751, Training Loss: 0.00721
Epoch: 38751, Training Loss: 0.00721
Epoch: 38751, Training Loss: 0.00885
Epoch: 38752, Training Loss: 0.00694
Epoch: 38752, Training Loss: 0.00721
Epoch: 38752, Training Loss: 0.00721
Epoch: 38752, Training Loss: 0.00885
Epoch: 38753, Training Loss: 0.00694
Epoch: 38753, Training Loss: 0.00721
Epoch: 38753, Training Loss: 0.00721
Epoch: 38753, Training Loss: 0.00885
Epoch: 38754, Training Loss: 0.00694
Epoch: 38754, Training Loss: 0.00721
Epoch: 38754, Training Loss: 0.00721
Epoch: 38754, Training Loss: 0.00885
Epoch: 38755, Training Loss: 0.00694
Epoch: 38755, Training Loss: 0.00721
Epoch: 38755, Training Loss: 0.00721
Epoch: 38755, Training Loss: 0.00885
Epoch: 38756, Training Loss: 0.00694
Epoch: 38756, Training Loss: 0.00721
Epoch: 38756, Training Loss: 0.00721
Epoch: 38756, Training Loss: 0.00885
[... output truncated: epochs 38756 to 39000 repeat the same four per-pattern losses, decreasing only marginally (0.00885 to 0.00882, 0.00721 to 0.00719, 0.00721 to 0.00718, 0.00694 to 0.00691); the worst-case pattern loss is still above the 0.008 target, so training continues ...]
Epoch: 39000, Training Loss: 0.00691
Epoch: 39000, Training Loss: 0.00718
Epoch: 39000, Training Loss: 0.00719
Epoch: 39000, Training Loss: 0.00882
Epoch: 39001, Training Loss: 0.00691
Epoch: 39001, Training Loss: 0.00718
Epoch: 39001, Training Loss: 0.00719
Epoch: 39001, Training Loss: 0.00882
Epoch: 39002, Training Loss: 0.00691
Epoch: 39002, Training Loss: 0.00718
Epoch: 39002, Training Loss: 0.00719
Epoch: 39002, Training Loss: 0.00882
Epoch: 39003, Training Loss: 0.00691
Epoch: 39003, Training Loss: 0.00718
Epoch: 39003, Training Loss: 0.00719
Epoch: 39003, Training Loss: 0.00882
Epoch: 39004, Training Loss: 0.00691
Epoch: 39004, Training Loss: 0.00718
Epoch: 39004, Training Loss: 0.00719
Epoch: 39004, Training Loss: 0.00882
Epoch: 39005, Training Loss: 0.00691
Epoch: 39005, Training Loss: 0.00718
Epoch: 39005, Training Loss: 0.00719
Epoch: 39005, Training Loss: 0.00882
Epoch: 39006, Training Loss: 0.00691
Epoch: 39006, Training Loss: 0.00718
Epoch: 39006, Training Loss: 0.00719
Epoch: 39006, Training Loss: 0.00882
Epoch: 39007, Training Loss: 0.00691
Epoch: 39007, Training Loss: 0.00718
Epoch: 39007, Training Loss: 0.00719
Epoch: 39007, Training Loss: 0.00882
Epoch: 39008, Training Loss: 0.00691
Epoch: 39008, Training Loss: 0.00718
Epoch: 39008, Training Loss: 0.00719
Epoch: 39008, Training Loss: 0.00882
Epoch: 39009, Training Loss: 0.00691
Epoch: 39009, Training Loss: 0.00718
Epoch: 39009, Training Loss: 0.00719
Epoch: 39009, Training Loss: 0.00882
Epoch: 39010, Training Loss: 0.00691
Epoch: 39010, Training Loss: 0.00718
Epoch: 39010, Training Loss: 0.00719
Epoch: 39010, Training Loss: 0.00882
Epoch: 39011, Training Loss: 0.00691
Epoch: 39011, Training Loss: 0.00718
Epoch: 39011, Training Loss: 0.00719
Epoch: 39011, Training Loss: 0.00882
Epoch: 39012, Training Loss: 0.00691
Epoch: 39012, Training Loss: 0.00718
Epoch: 39012, Training Loss: 0.00719
Epoch: 39012, Training Loss: 0.00882
Epoch: 39013, Training Loss: 0.00691
Epoch: 39013, Training Loss: 0.00718
Epoch: 39013, Training Loss: 0.00719
Epoch: 39013, Training Loss: 0.00882
Epoch: 39014, Training Loss: 0.00691
Epoch: 39014, Training Loss: 0.00718
Epoch: 39014, Training Loss: 0.00719
Epoch: 39014, Training Loss: 0.00882
Epoch: 39015, Training Loss: 0.00691
Epoch: 39015, Training Loss: 0.00718
Epoch: 39015, Training Loss: 0.00719
Epoch: 39015, Training Loss: 0.00882
Epoch: 39016, Training Loss: 0.00691
Epoch: 39016, Training Loss: 0.00718
Epoch: 39016, Training Loss: 0.00719
Epoch: 39016, Training Loss: 0.00882
Epoch: 39017, Training Loss: 0.00691
Epoch: 39017, Training Loss: 0.00718
Epoch: 39017, Training Loss: 0.00719
Epoch: 39017, Training Loss: 0.00882
Epoch: 39018, Training Loss: 0.00691
Epoch: 39018, Training Loss: 0.00718
Epoch: 39018, Training Loss: 0.00719
Epoch: 39018, Training Loss: 0.00882
Epoch: 39019, Training Loss: 0.00691
Epoch: 39019, Training Loss: 0.00718
Epoch: 39019, Training Loss: 0.00719
Epoch: 39019, Training Loss: 0.00882
Epoch: 39020, Training Loss: 0.00691
Epoch: 39020, Training Loss: 0.00718
Epoch: 39020, Training Loss: 0.00719
Epoch: 39020, Training Loss: 0.00881
Epoch: 39021, Training Loss: 0.00691
Epoch: 39021, Training Loss: 0.00718
Epoch: 39021, Training Loss: 0.00719
Epoch: 39021, Training Loss: 0.00881
Epoch: 39022, Training Loss: 0.00691
Epoch: 39022, Training Loss: 0.00718
Epoch: 39022, Training Loss: 0.00719
Epoch: 39022, Training Loss: 0.00881
Epoch: 39023, Training Loss: 0.00691
Epoch: 39023, Training Loss: 0.00718
Epoch: 39023, Training Loss: 0.00719
Epoch: 39023, Training Loss: 0.00881
Epoch: 39024, Training Loss: 0.00691
Epoch: 39024, Training Loss: 0.00718
Epoch: 39024, Training Loss: 0.00719
Epoch: 39024, Training Loss: 0.00881
Epoch: 39025, Training Loss: 0.00691
Epoch: 39025, Training Loss: 0.00718
Epoch: 39025, Training Loss: 0.00719
Epoch: 39025, Training Loss: 0.00881
Epoch: 39026, Training Loss: 0.00691
Epoch: 39026, Training Loss: 0.00718
Epoch: 39026, Training Loss: 0.00719
Epoch: 39026, Training Loss: 0.00881
Epoch: 39027, Training Loss: 0.00691
Epoch: 39027, Training Loss: 0.00718
Epoch: 39027, Training Loss: 0.00719
Epoch: 39027, Training Loss: 0.00881
Epoch: 39028, Training Loss: 0.00691
Epoch: 39028, Training Loss: 0.00718
Epoch: 39028, Training Loss: 0.00719
Epoch: 39028, Training Loss: 0.00881
Epoch: 39029, Training Loss: 0.00691
Epoch: 39029, Training Loss: 0.00718
Epoch: 39029, Training Loss: 0.00719
Epoch: 39029, Training Loss: 0.00881
Epoch: 39030, Training Loss: 0.00691
Epoch: 39030, Training Loss: 0.00718
Epoch: 39030, Training Loss: 0.00719
Epoch: 39030, Training Loss: 0.00881
Epoch: 39031, Training Loss: 0.00691
Epoch: 39031, Training Loss: 0.00718
Epoch: 39031, Training Loss: 0.00719
Epoch: 39031, Training Loss: 0.00881
Epoch: 39032, Training Loss: 0.00691
Epoch: 39032, Training Loss: 0.00718
Epoch: 39032, Training Loss: 0.00719
Epoch: 39032, Training Loss: 0.00881
Epoch: 39033, Training Loss: 0.00691
Epoch: 39033, Training Loss: 0.00718
Epoch: 39033, Training Loss: 0.00719
Epoch: 39033, Training Loss: 0.00881
Epoch: 39034, Training Loss: 0.00691
Epoch: 39034, Training Loss: 0.00718
Epoch: 39034, Training Loss: 0.00719
Epoch: 39034, Training Loss: 0.00881
Epoch: 39035, Training Loss: 0.00691
Epoch: 39035, Training Loss: 0.00718
Epoch: 39035, Training Loss: 0.00719
Epoch: 39035, Training Loss: 0.00881
Epoch: 39036, Training Loss: 0.00691
Epoch: 39036, Training Loss: 0.00718
Epoch: 39036, Training Loss: 0.00719
Epoch: 39036, Training Loss: 0.00881
Epoch: 39037, Training Loss: 0.00691
Epoch: 39037, Training Loss: 0.00718
Epoch: 39037, Training Loss: 0.00719
Epoch: 39037, Training Loss: 0.00881
Epoch: 39038, Training Loss: 0.00691
Epoch: 39038, Training Loss: 0.00718
Epoch: 39038, Training Loss: 0.00719
Epoch: 39038, Training Loss: 0.00881
Epoch: 39039, Training Loss: 0.00691
Epoch: 39039, Training Loss: 0.00718
Epoch: 39039, Training Loss: 0.00719
Epoch: 39039, Training Loss: 0.00881
Epoch: 39040, Training Loss: 0.00691
Epoch: 39040, Training Loss: 0.00718
Epoch: 39040, Training Loss: 0.00718
Epoch: 39040, Training Loss: 0.00881
Epoch: 39041, Training Loss: 0.00691
Epoch: 39041, Training Loss: 0.00718
Epoch: 39041, Training Loss: 0.00718
Epoch: 39041, Training Loss: 0.00881
Epoch: 39042, Training Loss: 0.00691
Epoch: 39042, Training Loss: 0.00718
Epoch: 39042, Training Loss: 0.00718
Epoch: 39042, Training Loss: 0.00881
Epoch: 39043, Training Loss: 0.00691
Epoch: 39043, Training Loss: 0.00718
Epoch: 39043, Training Loss: 0.00718
Epoch: 39043, Training Loss: 0.00881
Epoch: 39044, Training Loss: 0.00691
Epoch: 39044, Training Loss: 0.00718
Epoch: 39044, Training Loss: 0.00718
Epoch: 39044, Training Loss: 0.00881
Epoch: 39045, Training Loss: 0.00691
Epoch: 39045, Training Loss: 0.00718
Epoch: 39045, Training Loss: 0.00718
Epoch: 39045, Training Loss: 0.00881
Epoch: 39046, Training Loss: 0.00691
Epoch: 39046, Training Loss: 0.00718
Epoch: 39046, Training Loss: 0.00718
Epoch: 39046, Training Loss: 0.00881
Epoch: 39047, Training Loss: 0.00691
Epoch: 39047, Training Loss: 0.00718
Epoch: 39047, Training Loss: 0.00718
Epoch: 39047, Training Loss: 0.00881
Epoch: 39048, Training Loss: 0.00691
Epoch: 39048, Training Loss: 0.00718
Epoch: 39048, Training Loss: 0.00718
Epoch: 39048, Training Loss: 0.00881
Epoch: 39049, Training Loss: 0.00691
Epoch: 39049, Training Loss: 0.00718
Epoch: 39049, Training Loss: 0.00718
Epoch: 39049, Training Loss: 0.00881
Epoch: 39050, Training Loss: 0.00691
Epoch: 39050, Training Loss: 0.00718
Epoch: 39050, Training Loss: 0.00718
Epoch: 39050, Training Loss: 0.00881
Epoch: 39051, Training Loss: 0.00691
Epoch: 39051, Training Loss: 0.00718
Epoch: 39051, Training Loss: 0.00718
Epoch: 39051, Training Loss: 0.00881
Epoch: 39052, Training Loss: 0.00691
Epoch: 39052, Training Loss: 0.00718
Epoch: 39052, Training Loss: 0.00718
Epoch: 39052, Training Loss: 0.00881
Epoch: 39053, Training Loss: 0.00691
Epoch: 39053, Training Loss: 0.00718
Epoch: 39053, Training Loss: 0.00718
Epoch: 39053, Training Loss: 0.00881
Epoch: 39054, Training Loss: 0.00691
Epoch: 39054, Training Loss: 0.00718
Epoch: 39054, Training Loss: 0.00718
Epoch: 39054, Training Loss: 0.00881
Epoch: 39055, Training Loss: 0.00691
Epoch: 39055, Training Loss: 0.00718
Epoch: 39055, Training Loss: 0.00718
Epoch: 39055, Training Loss: 0.00881
Epoch: 39056, Training Loss: 0.00691
Epoch: 39056, Training Loss: 0.00718
Epoch: 39056, Training Loss: 0.00718
Epoch: 39056, Training Loss: 0.00881
Epoch: 39057, Training Loss: 0.00691
Epoch: 39057, Training Loss: 0.00718
Epoch: 39057, Training Loss: 0.00718
Epoch: 39057, Training Loss: 0.00881
Epoch: 39058, Training Loss: 0.00691
Epoch: 39058, Training Loss: 0.00718
Epoch: 39058, Training Loss: 0.00718
Epoch: 39058, Training Loss: 0.00881
Epoch: 39059, Training Loss: 0.00691
Epoch: 39059, Training Loss: 0.00717
Epoch: 39059, Training Loss: 0.00718
Epoch: 39059, Training Loss: 0.00881
Epoch: 39060, Training Loss: 0.00691
Epoch: 39060, Training Loss: 0.00717
Epoch: 39060, Training Loss: 0.00718
Epoch: 39060, Training Loss: 0.00881
Epoch: 39061, Training Loss: 0.00691
Epoch: 39061, Training Loss: 0.00717
Epoch: 39061, Training Loss: 0.00718
Epoch: 39061, Training Loss: 0.00881
Epoch: 39062, Training Loss: 0.00691
Epoch: 39062, Training Loss: 0.00717
Epoch: 39062, Training Loss: 0.00718
Epoch: 39062, Training Loss: 0.00881
Epoch: 39063, Training Loss: 0.00691
Epoch: 39063, Training Loss: 0.00717
Epoch: 39063, Training Loss: 0.00718
Epoch: 39063, Training Loss: 0.00881
Epoch: 39064, Training Loss: 0.00691
Epoch: 39064, Training Loss: 0.00717
Epoch: 39064, Training Loss: 0.00718
Epoch: 39064, Training Loss: 0.00881
Epoch: 39065, Training Loss: 0.00691
Epoch: 39065, Training Loss: 0.00717
Epoch: 39065, Training Loss: 0.00718
Epoch: 39065, Training Loss: 0.00881
Epoch: 39066, Training Loss: 0.00691
Epoch: 39066, Training Loss: 0.00717
Epoch: 39066, Training Loss: 0.00718
Epoch: 39066, Training Loss: 0.00881
Epoch: 39067, Training Loss: 0.00691
Epoch: 39067, Training Loss: 0.00717
Epoch: 39067, Training Loss: 0.00718
Epoch: 39067, Training Loss: 0.00881
Epoch: 39068, Training Loss: 0.00691
Epoch: 39068, Training Loss: 0.00717
Epoch: 39068, Training Loss: 0.00718
Epoch: 39068, Training Loss: 0.00881
Epoch: 39069, Training Loss: 0.00691
Epoch: 39069, Training Loss: 0.00717
Epoch: 39069, Training Loss: 0.00718
Epoch: 39069, Training Loss: 0.00881
Epoch: 39070, Training Loss: 0.00690
Epoch: 39070, Training Loss: 0.00717
Epoch: 39070, Training Loss: 0.00718
Epoch: 39070, Training Loss: 0.00881
Epoch: 39071, Training Loss: 0.00690
Epoch: 39071, Training Loss: 0.00717
Epoch: 39071, Training Loss: 0.00718
Epoch: 39071, Training Loss: 0.00881
Epoch: 39072, Training Loss: 0.00690
Epoch: 39072, Training Loss: 0.00717
Epoch: 39072, Training Loss: 0.00718
Epoch: 39072, Training Loss: 0.00881
Epoch: 39073, Training Loss: 0.00690
Epoch: 39073, Training Loss: 0.00717
Epoch: 39073, Training Loss: 0.00718
Epoch: 39073, Training Loss: 0.00881
Epoch: 39074, Training Loss: 0.00690
Epoch: 39074, Training Loss: 0.00717
Epoch: 39074, Training Loss: 0.00718
Epoch: 39074, Training Loss: 0.00881
Epoch: 39075, Training Loss: 0.00690
Epoch: 39075, Training Loss: 0.00717
Epoch: 39075, Training Loss: 0.00718
Epoch: 39075, Training Loss: 0.00881
Epoch: 39076, Training Loss: 0.00690
Epoch: 39076, Training Loss: 0.00717
Epoch: 39076, Training Loss: 0.00718
Epoch: 39076, Training Loss: 0.00881
Epoch: 39077, Training Loss: 0.00690
Epoch: 39077, Training Loss: 0.00717
Epoch: 39077, Training Loss: 0.00718
Epoch: 39077, Training Loss: 0.00881
Epoch: 39078, Training Loss: 0.00690
Epoch: 39078, Training Loss: 0.00717
Epoch: 39078, Training Loss: 0.00718
Epoch: 39078, Training Loss: 0.00881
Epoch: 39079, Training Loss: 0.00690
Epoch: 39079, Training Loss: 0.00717
Epoch: 39079, Training Loss: 0.00718
Epoch: 39079, Training Loss: 0.00881
Epoch: 39080, Training Loss: 0.00690
Epoch: 39080, Training Loss: 0.00717
Epoch: 39080, Training Loss: 0.00718
Epoch: 39080, Training Loss: 0.00881
Epoch: 39081, Training Loss: 0.00690
Epoch: 39081, Training Loss: 0.00717
Epoch: 39081, Training Loss: 0.00718
Epoch: 39081, Training Loss: 0.00881
Epoch: 39082, Training Loss: 0.00690
Epoch: 39082, Training Loss: 0.00717
Epoch: 39082, Training Loss: 0.00718
Epoch: 39082, Training Loss: 0.00881
Epoch: 39083, Training Loss: 0.00690
Epoch: 39083, Training Loss: 0.00717
Epoch: 39083, Training Loss: 0.00718
Epoch: 39083, Training Loss: 0.00881
Epoch: 39084, Training Loss: 0.00690
Epoch: 39084, Training Loss: 0.00717
Epoch: 39084, Training Loss: 0.00718
Epoch: 39084, Training Loss: 0.00881
Epoch: 39085, Training Loss: 0.00690
Epoch: 39085, Training Loss: 0.00717
Epoch: 39085, Training Loss: 0.00718
Epoch: 39085, Training Loss: 0.00881
Epoch: 39086, Training Loss: 0.00690
Epoch: 39086, Training Loss: 0.00717
Epoch: 39086, Training Loss: 0.00718
Epoch: 39086, Training Loss: 0.00881
Epoch: 39087, Training Loss: 0.00690
Epoch: 39087, Training Loss: 0.00717
Epoch: 39087, Training Loss: 0.00718
Epoch: 39087, Training Loss: 0.00881
Epoch: 39088, Training Loss: 0.00690
Epoch: 39088, Training Loss: 0.00717
Epoch: 39088, Training Loss: 0.00718
Epoch: 39088, Training Loss: 0.00881
Epoch: 39089, Training Loss: 0.00690
Epoch: 39089, Training Loss: 0.00717
Epoch: 39089, Training Loss: 0.00718
Epoch: 39089, Training Loss: 0.00881
Epoch: 39090, Training Loss: 0.00690
Epoch: 39090, Training Loss: 0.00717
Epoch: 39090, Training Loss: 0.00718
Epoch: 39090, Training Loss: 0.00881
Epoch: 39091, Training Loss: 0.00690
Epoch: 39091, Training Loss: 0.00717
Epoch: 39091, Training Loss: 0.00718
Epoch: 39091, Training Loss: 0.00881
Epoch: 39092, Training Loss: 0.00690
Epoch: 39092, Training Loss: 0.00717
Epoch: 39092, Training Loss: 0.00718
Epoch: 39092, Training Loss: 0.00881
Epoch: 39093, Training Loss: 0.00690
Epoch: 39093, Training Loss: 0.00717
Epoch: 39093, Training Loss: 0.00718
Epoch: 39093, Training Loss: 0.00881
Epoch: 39094, Training Loss: 0.00690
Epoch: 39094, Training Loss: 0.00717
Epoch: 39094, Training Loss: 0.00718
Epoch: 39094, Training Loss: 0.00881
Epoch: 39095, Training Loss: 0.00690
Epoch: 39095, Training Loss: 0.00717
Epoch: 39095, Training Loss: 0.00718
Epoch: 39095, Training Loss: 0.00881
Epoch: 39096, Training Loss: 0.00690
Epoch: 39096, Training Loss: 0.00717
Epoch: 39096, Training Loss: 0.00718
Epoch: 39096, Training Loss: 0.00881
Epoch: 39097, Training Loss: 0.00690
Epoch: 39097, Training Loss: 0.00717
Epoch: 39097, Training Loss: 0.00718
Epoch: 39097, Training Loss: 0.00881
Epoch: 39098, Training Loss: 0.00690
Epoch: 39098, Training Loss: 0.00717
Epoch: 39098, Training Loss: 0.00718
Epoch: 39098, Training Loss: 0.00881
Epoch: 39099, Training Loss: 0.00690
Epoch: 39099, Training Loss: 0.00717
Epoch: 39099, Training Loss: 0.00718
Epoch: 39099, Training Loss: 0.00881
Epoch: 39100, Training Loss: 0.00690
Epoch: 39100, Training Loss: 0.00717
Epoch: 39100, Training Loss: 0.00718
Epoch: 39100, Training Loss: 0.00880
Epoch: 39101, Training Loss: 0.00690
Epoch: 39101, Training Loss: 0.00717
Epoch: 39101, Training Loss: 0.00718
Epoch: 39101, Training Loss: 0.00880
Epoch: 39102, Training Loss: 0.00690
Epoch: 39102, Training Loss: 0.00717
Epoch: 39102, Training Loss: 0.00718
Epoch: 39102, Training Loss: 0.00880
Epoch: 39103, Training Loss: 0.00690
Epoch: 39103, Training Loss: 0.00717
Epoch: 39103, Training Loss: 0.00718
Epoch: 39103, Training Loss: 0.00880
Epoch: 39104, Training Loss: 0.00690
Epoch: 39104, Training Loss: 0.00717
Epoch: 39104, Training Loss: 0.00718
Epoch: 39104, Training Loss: 0.00880
Epoch: 39105, Training Loss: 0.00690
Epoch: 39105, Training Loss: 0.00717
Epoch: 39105, Training Loss: 0.00718
Epoch: 39105, Training Loss: 0.00880
Epoch: 39106, Training Loss: 0.00690
Epoch: 39106, Training Loss: 0.00717
Epoch: 39106, Training Loss: 0.00718
Epoch: 39106, Training Loss: 0.00880
Epoch: 39107, Training Loss: 0.00690
Epoch: 39107, Training Loss: 0.00717
Epoch: 39107, Training Loss: 0.00718
Epoch: 39107, Training Loss: 0.00880
Epoch: 39108, Training Loss: 0.00690
Epoch: 39108, Training Loss: 0.00717
Epoch: 39108, Training Loss: 0.00718
Epoch: 39108, Training Loss: 0.00880
Epoch: 39109, Training Loss: 0.00690
Epoch: 39109, Training Loss: 0.00717
Epoch: 39109, Training Loss: 0.00718
Epoch: 39109, Training Loss: 0.00880
Epoch: 39110, Training Loss: 0.00690
Epoch: 39110, Training Loss: 0.00717
Epoch: 39110, Training Loss: 0.00718
Epoch: 39110, Training Loss: 0.00880
Epoch: 39111, Training Loss: 0.00690
Epoch: 39111, Training Loss: 0.00717
Epoch: 39111, Training Loss: 0.00718
Epoch: 39111, Training Loss: 0.00880
Epoch: 39112, Training Loss: 0.00690
Epoch: 39112, Training Loss: 0.00717
Epoch: 39112, Training Loss: 0.00718
Epoch: 39112, Training Loss: 0.00880
Epoch: 39113, Training Loss: 0.00690
Epoch: 39113, Training Loss: 0.00717
Epoch: 39113, Training Loss: 0.00718
Epoch: 39113, Training Loss: 0.00880
Epoch: 39114, Training Loss: 0.00690
Epoch: 39114, Training Loss: 0.00717
Epoch: 39114, Training Loss: 0.00718
Epoch: 39114, Training Loss: 0.00880
Epoch: 39115, Training Loss: 0.00690
Epoch: 39115, Training Loss: 0.00717
Epoch: 39115, Training Loss: 0.00718
Epoch: 39115, Training Loss: 0.00880
Epoch: 39116, Training Loss: 0.00690
Epoch: 39116, Training Loss: 0.00717
Epoch: 39116, Training Loss: 0.00718
Epoch: 39116, Training Loss: 0.00880
Epoch: 39117, Training Loss: 0.00690
Epoch: 39117, Training Loss: 0.00717
Epoch: 39117, Training Loss: 0.00718
Epoch: 39117, Training Loss: 0.00880
Epoch: 39118, Training Loss: 0.00690
Epoch: 39118, Training Loss: 0.00717
Epoch: 39118, Training Loss: 0.00718
Epoch: 39118, Training Loss: 0.00880
Epoch: 39119, Training Loss: 0.00690
Epoch: 39119, Training Loss: 0.00717
Epoch: 39119, Training Loss: 0.00718
Epoch: 39119, Training Loss: 0.00880
Epoch: 39120, Training Loss: 0.00690
Epoch: 39120, Training Loss: 0.00717
Epoch: 39120, Training Loss: 0.00718
Epoch: 39120, Training Loss: 0.00880
Epoch: 39121, Training Loss: 0.00690
Epoch: 39121, Training Loss: 0.00717
Epoch: 39121, Training Loss: 0.00718
Epoch: 39121, Training Loss: 0.00880
Epoch: 39122, Training Loss: 0.00690
Epoch: 39122, Training Loss: 0.00717
Epoch: 39122, Training Loss: 0.00718
Epoch: 39122, Training Loss: 0.00880
Epoch: 39123, Training Loss: 0.00690
Epoch: 39123, Training Loss: 0.00717
Epoch: 39123, Training Loss: 0.00718
Epoch: 39123, Training Loss: 0.00880
Epoch: 39124, Training Loss: 0.00690
Epoch: 39124, Training Loss: 0.00717
Epoch: 39124, Training Loss: 0.00718
Epoch: 39124, Training Loss: 0.00880
Epoch: 39125, Training Loss: 0.00690
Epoch: 39125, Training Loss: 0.00717
Epoch: 39125, Training Loss: 0.00718
Epoch: 39125, Training Loss: 0.00880
Epoch: 39126, Training Loss: 0.00690
Epoch: 39126, Training Loss: 0.00717
Epoch: 39126, Training Loss: 0.00718
Epoch: 39126, Training Loss: 0.00880
Epoch: 39127, Training Loss: 0.00690
Epoch: 39127, Training Loss: 0.00717
Epoch: 39127, Training Loss: 0.00718
Epoch: 39127, Training Loss: 0.00880
Epoch: 39128, Training Loss: 0.00690
Epoch: 39128, Training Loss: 0.00717
Epoch: 39128, Training Loss: 0.00718
Epoch: 39128, Training Loss: 0.00880
Epoch: 39129, Training Loss: 0.00690
Epoch: 39129, Training Loss: 0.00717
Epoch: 39129, Training Loss: 0.00718
Epoch: 39129, Training Loss: 0.00880
Epoch: 39130, Training Loss: 0.00690
Epoch: 39130, Training Loss: 0.00717
Epoch: 39130, Training Loss: 0.00718
Epoch: 39130, Training Loss: 0.00880
Epoch: 39131, Training Loss: 0.00690
Epoch: 39131, Training Loss: 0.00717
Epoch: 39131, Training Loss: 0.00718
Epoch: 39131, Training Loss: 0.00880
Epoch: 39132, Training Loss: 0.00690
Epoch: 39132, Training Loss: 0.00717
Epoch: 39132, Training Loss: 0.00718
Epoch: 39132, Training Loss: 0.00880
Epoch: 39133, Training Loss: 0.00690
Epoch: 39133, Training Loss: 0.00717
Epoch: 39133, Training Loss: 0.00718
Epoch: 39133, Training Loss: 0.00880
Epoch: 39134, Training Loss: 0.00690
Epoch: 39134, Training Loss: 0.00717
Epoch: 39134, Training Loss: 0.00718
Epoch: 39134, Training Loss: 0.00880
Epoch: 39135, Training Loss: 0.00690
Epoch: 39135, Training Loss: 0.00717
Epoch: 39135, Training Loss: 0.00718
Epoch: 39135, Training Loss: 0.00880
Epoch: 39136, Training Loss: 0.00690
Epoch: 39136, Training Loss: 0.00717
Epoch: 39136, Training Loss: 0.00718
Epoch: 39136, Training Loss: 0.00880
Epoch: 39137, Training Loss: 0.00690
Epoch: 39137, Training Loss: 0.00717
Epoch: 39137, Training Loss: 0.00718
Epoch: 39137, Training Loss: 0.00880
Epoch: 39138, Training Loss: 0.00690
Epoch: 39138, Training Loss: 0.00717
Epoch: 39138, Training Loss: 0.00718
Epoch: 39138, Training Loss: 0.00880
Epoch: 39139, Training Loss: 0.00690
Epoch: 39139, Training Loss: 0.00717
Epoch: 39139, Training Loss: 0.00717
Epoch: 39139, Training Loss: 0.00880
Epoch: 39140, Training Loss: 0.00690
Epoch: 39140, Training Loss: 0.00717
Epoch: 39140, Training Loss: 0.00717
Epoch: 39140, Training Loss: 0.00880
Epoch: 39141, Training Loss: 0.00690
Epoch: 39141, Training Loss: 0.00717
Epoch: 39141, Training Loss: 0.00717
Epoch: 39141, Training Loss: 0.00880
Epoch: 39142, Training Loss: 0.00690
Epoch: 39142, Training Loss: 0.00717
Epoch: 39142, Training Loss: 0.00717
Epoch: 39142, Training Loss: 0.00880
Epoch: 39143, Training Loss: 0.00690
Epoch: 39143, Training Loss: 0.00717
Epoch: 39143, Training Loss: 0.00717
Epoch: 39143, Training Loss: 0.00880
Epoch: 39144, Training Loss: 0.00690
Epoch: 39144, Training Loss: 0.00717
Epoch: 39144, Training Loss: 0.00717
Epoch: 39144, Training Loss: 0.00880
Epoch: 39145, Training Loss: 0.00690
Epoch: 39145, Training Loss: 0.00717
Epoch: 39145, Training Loss: 0.00717
Epoch: 39145, Training Loss: 0.00880
Epoch: 39146, Training Loss: 0.00690
Epoch: 39146, Training Loss: 0.00717
Epoch: 39146, Training Loss: 0.00717
Epoch: 39146, Training Loss: 0.00880
Epoch: 39147, Training Loss: 0.00690
Epoch: 39147, Training Loss: 0.00717
Epoch: 39147, Training Loss: 0.00717
Epoch: 39147, Training Loss: 0.00880
Epoch: 39148, Training Loss: 0.00690
Epoch: 39148, Training Loss: 0.00717
Epoch: 39148, Training Loss: 0.00717
Epoch: 39148, Training Loss: 0.00880
Epoch: 39149, Training Loss: 0.00690
Epoch: 39149, Training Loss: 0.00717
Epoch: 39149, Training Loss: 0.00717
Epoch: 39149, Training Loss: 0.00880
Epoch: 39150, Training Loss: 0.00690
Epoch: 39150, Training Loss: 0.00717
Epoch: 39150, Training Loss: 0.00717
Epoch: 39150, Training Loss: 0.00880
Epoch: 39151, Training Loss: 0.00690
Epoch: 39151, Training Loss: 0.00717
Epoch: 39151, Training Loss: 0.00717
Epoch: 39151, Training Loss: 0.00880
Epoch: 39152, Training Loss: 0.00690
Epoch: 39152, Training Loss: 0.00717
Epoch: 39152, Training Loss: 0.00717
Epoch: 39152, Training Loss: 0.00880
Epoch: 39153, Training Loss: 0.00690
Epoch: 39153, Training Loss: 0.00717
Epoch: 39153, Training Loss: 0.00717
Epoch: 39153, Training Loss: 0.00880
Epoch: 39154, Training Loss: 0.00690
Epoch: 39154, Training Loss: 0.00717
Epoch: 39154, Training Loss: 0.00717
Epoch: 39154, Training Loss: 0.00880
Epoch: 39155, Training Loss: 0.00690
Epoch: 39155, Training Loss: 0.00717
Epoch: 39155, Training Loss: 0.00717
Epoch: 39155, Training Loss: 0.00880
Epoch: 39156, Training Loss: 0.00690
Epoch: 39156, Training Loss: 0.00717
Epoch: 39156, Training Loss: 0.00717
Epoch: 39156, Training Loss: 0.00880
Epoch: 39157, Training Loss: 0.00690
Epoch: 39157, Training Loss: 0.00717
Epoch: 39157, Training Loss: 0.00717
Epoch: 39157, Training Loss: 0.00880
Epoch: 39158, Training Loss: 0.00690
Epoch: 39158, Training Loss: 0.00716
Epoch: 39158, Training Loss: 0.00717
Epoch: 39158, Training Loss: 0.00880
Epoch: 39159, Training Loss: 0.00690
Epoch: 39159, Training Loss: 0.00716
Epoch: 39159, Training Loss: 0.00717
Epoch: 39159, Training Loss: 0.00880
Epoch: 39160, Training Loss: 0.00690
Epoch: 39160, Training Loss: 0.00716
Epoch: 39160, Training Loss: 0.00717
Epoch: 39160, Training Loss: 0.00880
Epoch: 39161, Training Loss: 0.00690
Epoch: 39161, Training Loss: 0.00716
Epoch: 39161, Training Loss: 0.00717
Epoch: 39161, Training Loss: 0.00880
Epoch: 39162, Training Loss: 0.00690
Epoch: 39162, Training Loss: 0.00716
Epoch: 39162, Training Loss: 0.00717
Epoch: 39162, Training Loss: 0.00880
Epoch: 39163, Training Loss: 0.00690
Epoch: 39163, Training Loss: 0.00716
Epoch: 39163, Training Loss: 0.00717
Epoch: 39163, Training Loss: 0.00880
Epoch: 39164, Training Loss: 0.00690
Epoch: 39164, Training Loss: 0.00716
Epoch: 39164, Training Loss: 0.00717
Epoch: 39164, Training Loss: 0.00880
Epoch: 39165, Training Loss: 0.00690
Epoch: 39165, Training Loss: 0.00716
Epoch: 39165, Training Loss: 0.00717
Epoch: 39165, Training Loss: 0.00880
Epoch: 39166, Training Loss: 0.00690
Epoch: 39166, Training Loss: 0.00716
Epoch: 39166, Training Loss: 0.00717
Epoch: 39166, Training Loss: 0.00880
Epoch: 39167, Training Loss: 0.00690
Epoch: 39167, Training Loss: 0.00716
Epoch: 39167, Training Loss: 0.00717
Epoch: 39167, Training Loss: 0.00880
Epoch: 39168, Training Loss: 0.00690
Epoch: 39168, Training Loss: 0.00716
Epoch: 39168, Training Loss: 0.00717
Epoch: 39168, Training Loss: 0.00880
Epoch: 39169, Training Loss: 0.00690
Epoch: 39169, Training Loss: 0.00716
Epoch: 39169, Training Loss: 0.00717
Epoch: 39169, Training Loss: 0.00880
Epoch: 39170, Training Loss: 0.00690
Epoch: 39170, Training Loss: 0.00716
Epoch: 39170, Training Loss: 0.00717
Epoch: 39170, Training Loss: 0.00880
Epoch: 39171, Training Loss: 0.00690
Epoch: 39171, Training Loss: 0.00716
Epoch: 39171, Training Loss: 0.00717
Epoch: 39171, Training Loss: 0.00880
Epoch: 39172, Training Loss: 0.00690
Epoch: 39172, Training Loss: 0.00716
Epoch: 39172, Training Loss: 0.00717
Epoch: 39172, Training Loss: 0.00880
Epoch: 39173, Training Loss: 0.00690
Epoch: 39173, Training Loss: 0.00716
Epoch: 39173, Training Loss: 0.00717
Epoch: 39173, Training Loss: 0.00880
Epoch: 39174, Training Loss: 0.00690
Epoch: 39174, Training Loss: 0.00716
Epoch: 39174, Training Loss: 0.00717
Epoch: 39174, Training Loss: 0.00880
Epoch: 39175, Training Loss: 0.00689
Epoch: 39175, Training Loss: 0.00716
Epoch: 39175, Training Loss: 0.00717
Epoch: 39175, Training Loss: 0.00880
Epoch: 39176, Training Loss: 0.00689
Epoch: 39176, Training Loss: 0.00716
Epoch: 39176, Training Loss: 0.00717
Epoch: 39176, Training Loss: 0.00880
Epoch: 39177, Training Loss: 0.00689
Epoch: 39177, Training Loss: 0.00716
Epoch: 39177, Training Loss: 0.00717
Epoch: 39177, Training Loss: 0.00880
Epoch: 39178, Training Loss: 0.00689
Epoch: 39178, Training Loss: 0.00716
Epoch: 39178, Training Loss: 0.00717
Epoch: 39178, Training Loss: 0.00880
Epoch: 39179, Training Loss: 0.00689
Epoch: 39179, Training Loss: 0.00716
Epoch: 39179, Training Loss: 0.00717
Epoch: 39179, Training Loss: 0.00880
Epoch: 39180, Training Loss: 0.00689
Epoch: 39180, Training Loss: 0.00716
Epoch: 39180, Training Loss: 0.00717
Epoch: 39180, Training Loss: 0.00879
Epoch: 39181, Training Loss: 0.00689
Epoch: 39181, Training Loss: 0.00716
Epoch: 39181, Training Loss: 0.00717
Epoch: 39181, Training Loss: 0.00879
Epoch: 39182, Training Loss: 0.00689
Epoch: 39182, Training Loss: 0.00716
Epoch: 39182, Training Loss: 0.00717
Epoch: 39182, Training Loss: 0.00879
Epoch: 39183, Training Loss: 0.00689
Epoch: 39183, Training Loss: 0.00716
Epoch: 39183, Training Loss: 0.00717
Epoch: 39183, Training Loss: 0.00879
Epoch: 39184, Training Loss: 0.00689
Epoch: 39184, Training Loss: 0.00716
Epoch: 39184, Training Loss: 0.00717
Epoch: 39184, Training Loss: 0.00879
Epoch: 39185, Training Loss: 0.00689
Epoch: 39185, Training Loss: 0.00716
Epoch: 39185, Training Loss: 0.00717
Epoch: 39185, Training Loss: 0.00879
Epoch: 39186, Training Loss: 0.00689
Epoch: 39186, Training Loss: 0.00716
Epoch: 39186, Training Loss: 0.00717
Epoch: 39186, Training Loss: 0.00879
Epoch: 39187, Training Loss: 0.00689
Epoch: 39187, Training Loss: 0.00716
Epoch: 39187, Training Loss: 0.00717
Epoch: 39187, Training Loss: 0.00879
Epoch: 39188, Training Loss: 0.00689
Epoch: 39188, Training Loss: 0.00716
Epoch: 39188, Training Loss: 0.00717
Epoch: 39188, Training Loss: 0.00879
Epoch: 39189, Training Loss: 0.00689
    [Training log truncated: epochs 39189–39432, four per-sample losses printed per epoch.
     Losses decrease very slowly over this range (≈0.00689 → 0.00687, 0.00717 → 0.00714,
     0.00716 → 0.00715, 0.00879 → 0.00876); three of the four samples are already below
     the 0.008 error target, while the fourth is still converging toward it.]
Epoch: 39433, Training Loss: 0.00687
Epoch: 39433, Training Loss: 0.00714
Epoch: 39433, Training Loss: 0.00715
Epoch: 39433, Training Loss: 0.00876
Epoch: 39434, Training Loss: 0.00687
Epoch: 39434, Training Loss: 0.00714
Epoch: 39434, Training Loss: 0.00715
Epoch: 39434, Training Loss: 0.00876
Epoch: 39435, Training Loss: 0.00687
Epoch: 39435, Training Loss: 0.00714
Epoch: 39435, Training Loss: 0.00715
Epoch: 39435, Training Loss: 0.00876
Epoch: 39436, Training Loss: 0.00687
Epoch: 39436, Training Loss: 0.00714
Epoch: 39436, Training Loss: 0.00715
Epoch: 39436, Training Loss: 0.00876
Epoch: 39437, Training Loss: 0.00687
Epoch: 39437, Training Loss: 0.00714
Epoch: 39437, Training Loss: 0.00715
Epoch: 39437, Training Loss: 0.00876
Epoch: 39438, Training Loss: 0.00687
Epoch: 39438, Training Loss: 0.00714
Epoch: 39438, Training Loss: 0.00714
Epoch: 39438, Training Loss: 0.00876
Epoch: 39439, Training Loss: 0.00687
Epoch: 39439, Training Loss: 0.00714
Epoch: 39439, Training Loss: 0.00714
Epoch: 39439, Training Loss: 0.00876
Epoch: 39440, Training Loss: 0.00687
Epoch: 39440, Training Loss: 0.00714
Epoch: 39440, Training Loss: 0.00714
Epoch: 39440, Training Loss: 0.00876
Epoch: 39441, Training Loss: 0.00687
Epoch: 39441, Training Loss: 0.00714
Epoch: 39441, Training Loss: 0.00714
Epoch: 39441, Training Loss: 0.00876
Epoch: 39442, Training Loss: 0.00687
Epoch: 39442, Training Loss: 0.00714
Epoch: 39442, Training Loss: 0.00714
Epoch: 39442, Training Loss: 0.00876
Epoch: 39443, Training Loss: 0.00687
Epoch: 39443, Training Loss: 0.00714
Epoch: 39443, Training Loss: 0.00714
Epoch: 39443, Training Loss: 0.00876
Epoch: 39444, Training Loss: 0.00687
Epoch: 39444, Training Loss: 0.00714
Epoch: 39444, Training Loss: 0.00714
Epoch: 39444, Training Loss: 0.00876
Epoch: 39445, Training Loss: 0.00687
Epoch: 39445, Training Loss: 0.00714
Epoch: 39445, Training Loss: 0.00714
Epoch: 39445, Training Loss: 0.00876
Epoch: 39446, Training Loss: 0.00687
Epoch: 39446, Training Loss: 0.00714
Epoch: 39446, Training Loss: 0.00714
Epoch: 39446, Training Loss: 0.00876
Epoch: 39447, Training Loss: 0.00687
Epoch: 39447, Training Loss: 0.00714
Epoch: 39447, Training Loss: 0.00714
Epoch: 39447, Training Loss: 0.00876
Epoch: 39448, Training Loss: 0.00687
Epoch: 39448, Training Loss: 0.00714
Epoch: 39448, Training Loss: 0.00714
Epoch: 39448, Training Loss: 0.00876
Epoch: 39449, Training Loss: 0.00687
Epoch: 39449, Training Loss: 0.00714
Epoch: 39449, Training Loss: 0.00714
Epoch: 39449, Training Loss: 0.00876
Epoch: 39450, Training Loss: 0.00687
Epoch: 39450, Training Loss: 0.00714
Epoch: 39450, Training Loss: 0.00714
Epoch: 39450, Training Loss: 0.00876
Epoch: 39451, Training Loss: 0.00687
Epoch: 39451, Training Loss: 0.00714
Epoch: 39451, Training Loss: 0.00714
Epoch: 39451, Training Loss: 0.00876
Epoch: 39452, Training Loss: 0.00687
Epoch: 39452, Training Loss: 0.00714
Epoch: 39452, Training Loss: 0.00714
Epoch: 39452, Training Loss: 0.00876
Epoch: 39453, Training Loss: 0.00687
Epoch: 39453, Training Loss: 0.00714
Epoch: 39453, Training Loss: 0.00714
Epoch: 39453, Training Loss: 0.00876
Epoch: 39454, Training Loss: 0.00687
Epoch: 39454, Training Loss: 0.00714
Epoch: 39454, Training Loss: 0.00714
Epoch: 39454, Training Loss: 0.00876
Epoch: 39455, Training Loss: 0.00687
Epoch: 39455, Training Loss: 0.00714
Epoch: 39455, Training Loss: 0.00714
Epoch: 39455, Training Loss: 0.00876
Epoch: 39456, Training Loss: 0.00687
Epoch: 39456, Training Loss: 0.00714
Epoch: 39456, Training Loss: 0.00714
Epoch: 39456, Training Loss: 0.00876
Epoch: 39457, Training Loss: 0.00687
Epoch: 39457, Training Loss: 0.00714
Epoch: 39457, Training Loss: 0.00714
Epoch: 39457, Training Loss: 0.00876
Epoch: 39458, Training Loss: 0.00687
Epoch: 39458, Training Loss: 0.00713
Epoch: 39458, Training Loss: 0.00714
Epoch: 39458, Training Loss: 0.00876
Epoch: 39459, Training Loss: 0.00687
Epoch: 39459, Training Loss: 0.00713
Epoch: 39459, Training Loss: 0.00714
Epoch: 39459, Training Loss: 0.00876
Epoch: 39460, Training Loss: 0.00687
Epoch: 39460, Training Loss: 0.00713
Epoch: 39460, Training Loss: 0.00714
Epoch: 39460, Training Loss: 0.00876
Epoch: 39461, Training Loss: 0.00687
Epoch: 39461, Training Loss: 0.00713
Epoch: 39461, Training Loss: 0.00714
Epoch: 39461, Training Loss: 0.00876
Epoch: 39462, Training Loss: 0.00687
Epoch: 39462, Training Loss: 0.00713
Epoch: 39462, Training Loss: 0.00714
Epoch: 39462, Training Loss: 0.00876
Epoch: 39463, Training Loss: 0.00687
Epoch: 39463, Training Loss: 0.00713
Epoch: 39463, Training Loss: 0.00714
Epoch: 39463, Training Loss: 0.00876
Epoch: 39464, Training Loss: 0.00687
Epoch: 39464, Training Loss: 0.00713
Epoch: 39464, Training Loss: 0.00714
Epoch: 39464, Training Loss: 0.00876
Epoch: 39465, Training Loss: 0.00687
Epoch: 39465, Training Loss: 0.00713
Epoch: 39465, Training Loss: 0.00714
Epoch: 39465, Training Loss: 0.00876
Epoch: 39466, Training Loss: 0.00687
Epoch: 39466, Training Loss: 0.00713
Epoch: 39466, Training Loss: 0.00714
Epoch: 39466, Training Loss: 0.00876
Epoch: 39467, Training Loss: 0.00687
Epoch: 39467, Training Loss: 0.00713
Epoch: 39467, Training Loss: 0.00714
Epoch: 39467, Training Loss: 0.00876
Epoch: 39468, Training Loss: 0.00687
Epoch: 39468, Training Loss: 0.00713
Epoch: 39468, Training Loss: 0.00714
Epoch: 39468, Training Loss: 0.00876
Epoch: 39469, Training Loss: 0.00687
Epoch: 39469, Training Loss: 0.00713
Epoch: 39469, Training Loss: 0.00714
Epoch: 39469, Training Loss: 0.00876
Epoch: 39470, Training Loss: 0.00687
Epoch: 39470, Training Loss: 0.00713
Epoch: 39470, Training Loss: 0.00714
Epoch: 39470, Training Loss: 0.00876
Epoch: 39471, Training Loss: 0.00687
Epoch: 39471, Training Loss: 0.00713
Epoch: 39471, Training Loss: 0.00714
Epoch: 39471, Training Loss: 0.00876
Epoch: 39472, Training Loss: 0.00687
Epoch: 39472, Training Loss: 0.00713
Epoch: 39472, Training Loss: 0.00714
Epoch: 39472, Training Loss: 0.00876
Epoch: 39473, Training Loss: 0.00687
Epoch: 39473, Training Loss: 0.00713
Epoch: 39473, Training Loss: 0.00714
Epoch: 39473, Training Loss: 0.00876
Epoch: 39474, Training Loss: 0.00687
Epoch: 39474, Training Loss: 0.00713
Epoch: 39474, Training Loss: 0.00714
Epoch: 39474, Training Loss: 0.00876
Epoch: 39475, Training Loss: 0.00687
Epoch: 39475, Training Loss: 0.00713
Epoch: 39475, Training Loss: 0.00714
Epoch: 39475, Training Loss: 0.00876
Epoch: 39476, Training Loss: 0.00687
Epoch: 39476, Training Loss: 0.00713
Epoch: 39476, Training Loss: 0.00714
Epoch: 39476, Training Loss: 0.00876
Epoch: 39477, Training Loss: 0.00687
Epoch: 39477, Training Loss: 0.00713
Epoch: 39477, Training Loss: 0.00714
Epoch: 39477, Training Loss: 0.00876
Epoch: 39478, Training Loss: 0.00687
Epoch: 39478, Training Loss: 0.00713
Epoch: 39478, Training Loss: 0.00714
Epoch: 39478, Training Loss: 0.00876
Epoch: 39479, Training Loss: 0.00687
Epoch: 39479, Training Loss: 0.00713
Epoch: 39479, Training Loss: 0.00714
Epoch: 39479, Training Loss: 0.00876
Epoch: 39480, Training Loss: 0.00687
Epoch: 39480, Training Loss: 0.00713
Epoch: 39480, Training Loss: 0.00714
Epoch: 39480, Training Loss: 0.00876
Epoch: 39481, Training Loss: 0.00687
Epoch: 39481, Training Loss: 0.00713
Epoch: 39481, Training Loss: 0.00714
Epoch: 39481, Training Loss: 0.00876
Epoch: 39482, Training Loss: 0.00687
Epoch: 39482, Training Loss: 0.00713
Epoch: 39482, Training Loss: 0.00714
Epoch: 39482, Training Loss: 0.00876
Epoch: 39483, Training Loss: 0.00687
Epoch: 39483, Training Loss: 0.00713
Epoch: 39483, Training Loss: 0.00714
Epoch: 39483, Training Loss: 0.00876
Epoch: 39484, Training Loss: 0.00687
Epoch: 39484, Training Loss: 0.00713
Epoch: 39484, Training Loss: 0.00714
Epoch: 39484, Training Loss: 0.00876
Epoch: 39485, Training Loss: 0.00687
Epoch: 39485, Training Loss: 0.00713
Epoch: 39485, Training Loss: 0.00714
Epoch: 39485, Training Loss: 0.00876
Epoch: 39486, Training Loss: 0.00687
Epoch: 39486, Training Loss: 0.00713
Epoch: 39486, Training Loss: 0.00714
Epoch: 39486, Training Loss: 0.00876
Epoch: 39487, Training Loss: 0.00687
Epoch: 39487, Training Loss: 0.00713
Epoch: 39487, Training Loss: 0.00714
Epoch: 39487, Training Loss: 0.00876
Epoch: 39488, Training Loss: 0.00687
Epoch: 39488, Training Loss: 0.00713
Epoch: 39488, Training Loss: 0.00714
Epoch: 39488, Training Loss: 0.00876
Epoch: 39489, Training Loss: 0.00687
Epoch: 39489, Training Loss: 0.00713
Epoch: 39489, Training Loss: 0.00714
Epoch: 39489, Training Loss: 0.00876
Epoch: 39490, Training Loss: 0.00687
Epoch: 39490, Training Loss: 0.00713
Epoch: 39490, Training Loss: 0.00714
Epoch: 39490, Training Loss: 0.00876
Epoch: 39491, Training Loss: 0.00687
Epoch: 39491, Training Loss: 0.00713
Epoch: 39491, Training Loss: 0.00714
Epoch: 39491, Training Loss: 0.00876
Epoch: 39492, Training Loss: 0.00687
Epoch: 39492, Training Loss: 0.00713
Epoch: 39492, Training Loss: 0.00714
Epoch: 39492, Training Loss: 0.00876
Epoch: 39493, Training Loss: 0.00686
Epoch: 39493, Training Loss: 0.00713
Epoch: 39493, Training Loss: 0.00714
Epoch: 39493, Training Loss: 0.00876
Epoch: 39494, Training Loss: 0.00686
Epoch: 39494, Training Loss: 0.00713
Epoch: 39494, Training Loss: 0.00714
Epoch: 39494, Training Loss: 0.00876
Epoch: 39495, Training Loss: 0.00686
Epoch: 39495, Training Loss: 0.00713
Epoch: 39495, Training Loss: 0.00714
Epoch: 39495, Training Loss: 0.00876
Epoch: 39496, Training Loss: 0.00686
Epoch: 39496, Training Loss: 0.00713
Epoch: 39496, Training Loss: 0.00714
Epoch: 39496, Training Loss: 0.00876
Epoch: 39497, Training Loss: 0.00686
Epoch: 39497, Training Loss: 0.00713
Epoch: 39497, Training Loss: 0.00714
Epoch: 39497, Training Loss: 0.00876
Epoch: 39498, Training Loss: 0.00686
Epoch: 39498, Training Loss: 0.00713
Epoch: 39498, Training Loss: 0.00714
Epoch: 39498, Training Loss: 0.00876
Epoch: 39499, Training Loss: 0.00686
Epoch: 39499, Training Loss: 0.00713
Epoch: 39499, Training Loss: 0.00714
Epoch: 39499, Training Loss: 0.00876
Epoch: 39500, Training Loss: 0.00686
Epoch: 39500, Training Loss: 0.00713
Epoch: 39500, Training Loss: 0.00714
Epoch: 39500, Training Loss: 0.00876
Epoch: 39501, Training Loss: 0.00686
Epoch: 39501, Training Loss: 0.00713
Epoch: 39501, Training Loss: 0.00714
Epoch: 39501, Training Loss: 0.00876
Epoch: 39502, Training Loss: 0.00686
Epoch: 39502, Training Loss: 0.00713
Epoch: 39502, Training Loss: 0.00714
Epoch: 39502, Training Loss: 0.00876
Epoch: 39503, Training Loss: 0.00686
Epoch: 39503, Training Loss: 0.00713
Epoch: 39503, Training Loss: 0.00714
Epoch: 39503, Training Loss: 0.00875
Epoch: 39504, Training Loss: 0.00686
Epoch: 39504, Training Loss: 0.00713
Epoch: 39504, Training Loss: 0.00714
Epoch: 39504, Training Loss: 0.00875
Epoch: 39505, Training Loss: 0.00686
Epoch: 39505, Training Loss: 0.00713
Epoch: 39505, Training Loss: 0.00714
Epoch: 39505, Training Loss: 0.00875
Epoch: 39506, Training Loss: 0.00686
Epoch: 39506, Training Loss: 0.00713
Epoch: 39506, Training Loss: 0.00714
Epoch: 39506, Training Loss: 0.00875
Epoch: 39507, Training Loss: 0.00686
Epoch: 39507, Training Loss: 0.00713
Epoch: 39507, Training Loss: 0.00714
Epoch: 39507, Training Loss: 0.00875
Epoch: 39508, Training Loss: 0.00686
Epoch: 39508, Training Loss: 0.00713
Epoch: 39508, Training Loss: 0.00714
Epoch: 39508, Training Loss: 0.00875
Epoch: 39509, Training Loss: 0.00686
Epoch: 39509, Training Loss: 0.00713
Epoch: 39509, Training Loss: 0.00714
Epoch: 39509, Training Loss: 0.00875
Epoch: 39510, Training Loss: 0.00686
Epoch: 39510, Training Loss: 0.00713
Epoch: 39510, Training Loss: 0.00714
Epoch: 39510, Training Loss: 0.00875
Epoch: 39511, Training Loss: 0.00686
Epoch: 39511, Training Loss: 0.00713
Epoch: 39511, Training Loss: 0.00714
Epoch: 39511, Training Loss: 0.00875
Epoch: 39512, Training Loss: 0.00686
Epoch: 39512, Training Loss: 0.00713
Epoch: 39512, Training Loss: 0.00714
Epoch: 39512, Training Loss: 0.00875
Epoch: 39513, Training Loss: 0.00686
Epoch: 39513, Training Loss: 0.00713
Epoch: 39513, Training Loss: 0.00714
Epoch: 39513, Training Loss: 0.00875
Epoch: 39514, Training Loss: 0.00686
Epoch: 39514, Training Loss: 0.00713
Epoch: 39514, Training Loss: 0.00714
Epoch: 39514, Training Loss: 0.00875
Epoch: 39515, Training Loss: 0.00686
Epoch: 39515, Training Loss: 0.00713
Epoch: 39515, Training Loss: 0.00714
Epoch: 39515, Training Loss: 0.00875
Epoch: 39516, Training Loss: 0.00686
Epoch: 39516, Training Loss: 0.00713
Epoch: 39516, Training Loss: 0.00714
Epoch: 39516, Training Loss: 0.00875
Epoch: 39517, Training Loss: 0.00686
Epoch: 39517, Training Loss: 0.00713
Epoch: 39517, Training Loss: 0.00714
Epoch: 39517, Training Loss: 0.00875
Epoch: 39518, Training Loss: 0.00686
Epoch: 39518, Training Loss: 0.00713
Epoch: 39518, Training Loss: 0.00714
Epoch: 39518, Training Loss: 0.00875
Epoch: 39519, Training Loss: 0.00686
Epoch: 39519, Training Loss: 0.00713
Epoch: 39519, Training Loss: 0.00714
Epoch: 39519, Training Loss: 0.00875
Epoch: 39520, Training Loss: 0.00686
Epoch: 39520, Training Loss: 0.00713
Epoch: 39520, Training Loss: 0.00714
Epoch: 39520, Training Loss: 0.00875
Epoch: 39521, Training Loss: 0.00686
Epoch: 39521, Training Loss: 0.00713
Epoch: 39521, Training Loss: 0.00714
Epoch: 39521, Training Loss: 0.00875
Epoch: 39522, Training Loss: 0.00686
Epoch: 39522, Training Loss: 0.00713
Epoch: 39522, Training Loss: 0.00714
Epoch: 39522, Training Loss: 0.00875
Epoch: 39523, Training Loss: 0.00686
Epoch: 39523, Training Loss: 0.00713
Epoch: 39523, Training Loss: 0.00714
Epoch: 39523, Training Loss: 0.00875
Epoch: 39524, Training Loss: 0.00686
Epoch: 39524, Training Loss: 0.00713
Epoch: 39524, Training Loss: 0.00714
Epoch: 39524, Training Loss: 0.00875
Epoch: 39525, Training Loss: 0.00686
Epoch: 39525, Training Loss: 0.00713
Epoch: 39525, Training Loss: 0.00714
Epoch: 39525, Training Loss: 0.00875
Epoch: 39526, Training Loss: 0.00686
Epoch: 39526, Training Loss: 0.00713
Epoch: 39526, Training Loss: 0.00714
Epoch: 39526, Training Loss: 0.00875
Epoch: 39527, Training Loss: 0.00686
Epoch: 39527, Training Loss: 0.00713
Epoch: 39527, Training Loss: 0.00714
Epoch: 39527, Training Loss: 0.00875
Epoch: 39528, Training Loss: 0.00686
Epoch: 39528, Training Loss: 0.00713
Epoch: 39528, Training Loss: 0.00714
Epoch: 39528, Training Loss: 0.00875
Epoch: 39529, Training Loss: 0.00686
Epoch: 39529, Training Loss: 0.00713
Epoch: 39529, Training Loss: 0.00714
Epoch: 39529, Training Loss: 0.00875
Epoch: 39530, Training Loss: 0.00686
Epoch: 39530, Training Loss: 0.00713
Epoch: 39530, Training Loss: 0.00714
Epoch: 39530, Training Loss: 0.00875
Epoch: 39531, Training Loss: 0.00686
Epoch: 39531, Training Loss: 0.00713
Epoch: 39531, Training Loss: 0.00714
Epoch: 39531, Training Loss: 0.00875
Epoch: 39532, Training Loss: 0.00686
Epoch: 39532, Training Loss: 0.00713
Epoch: 39532, Training Loss: 0.00714
Epoch: 39532, Training Loss: 0.00875
Epoch: 39533, Training Loss: 0.00686
Epoch: 39533, Training Loss: 0.00713
Epoch: 39533, Training Loss: 0.00714
Epoch: 39533, Training Loss: 0.00875
Epoch: 39534, Training Loss: 0.00686
Epoch: 39534, Training Loss: 0.00713
Epoch: 39534, Training Loss: 0.00714
Epoch: 39534, Training Loss: 0.00875
Epoch: 39535, Training Loss: 0.00686
Epoch: 39535, Training Loss: 0.00713
Epoch: 39535, Training Loss: 0.00714
Epoch: 39535, Training Loss: 0.00875
Epoch: 39536, Training Loss: 0.00686
Epoch: 39536, Training Loss: 0.00713
Epoch: 39536, Training Loss: 0.00714
Epoch: 39536, Training Loss: 0.00875
Epoch: 39537, Training Loss: 0.00686
Epoch: 39537, Training Loss: 0.00713
Epoch: 39537, Training Loss: 0.00714
Epoch: 39537, Training Loss: 0.00875
Epoch: 39538, Training Loss: 0.00686
Epoch: 39538, Training Loss: 0.00713
Epoch: 39538, Training Loss: 0.00714
Epoch: 39538, Training Loss: 0.00875
Epoch: 39539, Training Loss: 0.00686
Epoch: 39539, Training Loss: 0.00713
Epoch: 39539, Training Loss: 0.00713
Epoch: 39539, Training Loss: 0.00875
Epoch: 39540, Training Loss: 0.00686
Epoch: 39540, Training Loss: 0.00713
Epoch: 39540, Training Loss: 0.00713
Epoch: 39540, Training Loss: 0.00875
Epoch: 39541, Training Loss: 0.00686
Epoch: 39541, Training Loss: 0.00713
Epoch: 39541, Training Loss: 0.00713
Epoch: 39541, Training Loss: 0.00875
Epoch: 39542, Training Loss: 0.00686
Epoch: 39542, Training Loss: 0.00713
Epoch: 39542, Training Loss: 0.00713
Epoch: 39542, Training Loss: 0.00875
Epoch: 39543, Training Loss: 0.00686
Epoch: 39543, Training Loss: 0.00713
Epoch: 39543, Training Loss: 0.00713
Epoch: 39543, Training Loss: 0.00875
Epoch: 39544, Training Loss: 0.00686
Epoch: 39544, Training Loss: 0.00713
Epoch: 39544, Training Loss: 0.00713
Epoch: 39544, Training Loss: 0.00875
Epoch: 39545, Training Loss: 0.00686
Epoch: 39545, Training Loss: 0.00713
Epoch: 39545, Training Loss: 0.00713
Epoch: 39545, Training Loss: 0.00875
Epoch: 39546, Training Loss: 0.00686
Epoch: 39546, Training Loss: 0.00713
Epoch: 39546, Training Loss: 0.00713
Epoch: 39546, Training Loss: 0.00875
Epoch: 39547, Training Loss: 0.00686
Epoch: 39547, Training Loss: 0.00713
Epoch: 39547, Training Loss: 0.00713
Epoch: 39547, Training Loss: 0.00875
Epoch: 39548, Training Loss: 0.00686
Epoch: 39548, Training Loss: 0.00713
Epoch: 39548, Training Loss: 0.00713
Epoch: 39548, Training Loss: 0.00875
Epoch: 39549, Training Loss: 0.00686
Epoch: 39549, Training Loss: 0.00713
Epoch: 39549, Training Loss: 0.00713
Epoch: 39549, Training Loss: 0.00875
Epoch: 39550, Training Loss: 0.00686
Epoch: 39550, Training Loss: 0.00713
Epoch: 39550, Training Loss: 0.00713
Epoch: 39550, Training Loss: 0.00875
Epoch: 39551, Training Loss: 0.00686
Epoch: 39551, Training Loss: 0.00713
Epoch: 39551, Training Loss: 0.00713
Epoch: 39551, Training Loss: 0.00875
Epoch: 39552, Training Loss: 0.00686
Epoch: 39552, Training Loss: 0.00713
Epoch: 39552, Training Loss: 0.00713
Epoch: 39552, Training Loss: 0.00875
Epoch: 39553, Training Loss: 0.00686
Epoch: 39553, Training Loss: 0.00713
Epoch: 39553, Training Loss: 0.00713
Epoch: 39553, Training Loss: 0.00875
Epoch: 39554, Training Loss: 0.00686
Epoch: 39554, Training Loss: 0.00713
Epoch: 39554, Training Loss: 0.00713
Epoch: 39554, Training Loss: 0.00875
Epoch: 39555, Training Loss: 0.00686
Epoch: 39555, Training Loss: 0.00713
Epoch: 39555, Training Loss: 0.00713
Epoch: 39555, Training Loss: 0.00875
Epoch: 39556, Training Loss: 0.00686
Epoch: 39556, Training Loss: 0.00713
Epoch: 39556, Training Loss: 0.00713
Epoch: 39556, Training Loss: 0.00875
Epoch: 39557, Training Loss: 0.00686
Epoch: 39557, Training Loss: 0.00713
Epoch: 39557, Training Loss: 0.00713
Epoch: 39557, Training Loss: 0.00875
Epoch: 39558, Training Loss: 0.00686
Epoch: 39558, Training Loss: 0.00713
Epoch: 39558, Training Loss: 0.00713
Epoch: 39558, Training Loss: 0.00875
Epoch: 39559, Training Loss: 0.00686
Epoch: 39559, Training Loss: 0.00712
Epoch: 39559, Training Loss: 0.00713
Epoch: 39559, Training Loss: 0.00875
Epoch: 39560, Training Loss: 0.00686
Epoch: 39560, Training Loss: 0.00712
Epoch: 39560, Training Loss: 0.00713
Epoch: 39560, Training Loss: 0.00875
Epoch: 39561, Training Loss: 0.00686
Epoch: 39561, Training Loss: 0.00712
Epoch: 39561, Training Loss: 0.00713
Epoch: 39561, Training Loss: 0.00875
Epoch: 39562, Training Loss: 0.00686
Epoch: 39562, Training Loss: 0.00712
Epoch: 39562, Training Loss: 0.00713
Epoch: 39562, Training Loss: 0.00875
Epoch: 39563, Training Loss: 0.00686
Epoch: 39563, Training Loss: 0.00712
Epoch: 39563, Training Loss: 0.00713
Epoch: 39563, Training Loss: 0.00875
Epoch: 39564, Training Loss: 0.00686
Epoch: 39564, Training Loss: 0.00712
Epoch: 39564, Training Loss: 0.00713
Epoch: 39564, Training Loss: 0.00875
Epoch: 39565, Training Loss: 0.00686
Epoch: 39565, Training Loss: 0.00712
Epoch: 39565, Training Loss: 0.00713
Epoch: 39565, Training Loss: 0.00875
Epoch: 39566, Training Loss: 0.00686
Epoch: 39566, Training Loss: 0.00712
Epoch: 39566, Training Loss: 0.00713
Epoch: 39566, Training Loss: 0.00875
Epoch: 39567, Training Loss: 0.00686
Epoch: 39567, Training Loss: 0.00712
Epoch: 39567, Training Loss: 0.00713
Epoch: 39567, Training Loss: 0.00875
Epoch: 39568, Training Loss: 0.00686
Epoch: 39568, Training Loss: 0.00712
Epoch: 39568, Training Loss: 0.00713
Epoch: 39568, Training Loss: 0.00875
Epoch: 39569, Training Loss: 0.00686
Epoch: 39569, Training Loss: 0.00712
Epoch: 39569, Training Loss: 0.00713
Epoch: 39569, Training Loss: 0.00875
Epoch: 39570, Training Loss: 0.00686
Epoch: 39570, Training Loss: 0.00712
Epoch: 39570, Training Loss: 0.00713
Epoch: 39570, Training Loss: 0.00875
Epoch: 39571, Training Loss: 0.00686
Epoch: 39571, Training Loss: 0.00712
Epoch: 39571, Training Loss: 0.00713
Epoch: 39571, Training Loss: 0.00875
Epoch: 39572, Training Loss: 0.00686
Epoch: 39572, Training Loss: 0.00712
Epoch: 39572, Training Loss: 0.00713
Epoch: 39572, Training Loss: 0.00875
Epoch: 39573, Training Loss: 0.00686
Epoch: 39573, Training Loss: 0.00712
Epoch: 39573, Training Loss: 0.00713
Epoch: 39573, Training Loss: 0.00875
Epoch: 39574, Training Loss: 0.00686
Epoch: 39574, Training Loss: 0.00712
Epoch: 39574, Training Loss: 0.00713
Epoch: 39574, Training Loss: 0.00875
Epoch: 39575, Training Loss: 0.00686
Epoch: 39575, Training Loss: 0.00712
Epoch: 39575, Training Loss: 0.00713
Epoch: 39575, Training Loss: 0.00875
Epoch: 39576, Training Loss: 0.00686
Epoch: 39576, Training Loss: 0.00712
Epoch: 39576, Training Loss: 0.00713
Epoch: 39576, Training Loss: 0.00875
Epoch: 39577, Training Loss: 0.00686
Epoch: 39577, Training Loss: 0.00712
Epoch: 39577, Training Loss: 0.00713
Epoch: 39577, Training Loss: 0.00875
Epoch: 39578, Training Loss: 0.00686
Epoch: 39578, Training Loss: 0.00712
Epoch: 39578, Training Loss: 0.00713
Epoch: 39578, Training Loss: 0.00875
Epoch: 39579, Training Loss: 0.00686
Epoch: 39579, Training Loss: 0.00712
Epoch: 39579, Training Loss: 0.00713
Epoch: 39579, Training Loss: 0.00875
Epoch: 39580, Training Loss: 0.00686
Epoch: 39580, Training Loss: 0.00712
Epoch: 39580, Training Loss: 0.00713
Epoch: 39580, Training Loss: 0.00875
Epoch: 39581, Training Loss: 0.00686
Epoch: 39581, Training Loss: 0.00712
Epoch: 39581, Training Loss: 0.00713
Epoch: 39581, Training Loss: 0.00875
Epoch: 39582, Training Loss: 0.00686
Epoch: 39582, Training Loss: 0.00712
Epoch: 39582, Training Loss: 0.00713
Epoch: 39582, Training Loss: 0.00875
Epoch: 39583, Training Loss: 0.00686
Epoch: 39583, Training Loss: 0.00712
Epoch: 39583, Training Loss: 0.00713
Epoch: 39583, Training Loss: 0.00875
Epoch: 39584, Training Loss: 0.00686
Epoch: 39584, Training Loss: 0.00712
Epoch: 39584, Training Loss: 0.00713
Epoch: 39584, Training Loss: 0.00874
Epoch: 39585, Training Loss: 0.00686
Epoch: 39585, Training Loss: 0.00712
Epoch: 39585, Training Loss: 0.00713
Epoch: 39585, Training Loss: 0.00874
Epoch: 39586, Training Loss: 0.00686
Epoch: 39586, Training Loss: 0.00712
Epoch: 39586, Training Loss: 0.00713
Epoch: 39586, Training Loss: 0.00874
Epoch: 39587, Training Loss: 0.00686
Epoch: 39587, Training Loss: 0.00712
Epoch: 39587, Training Loss: 0.00713
Epoch: 39587, Training Loss: 0.00874
Epoch: 39588, Training Loss: 0.00686
Epoch: 39588, Training Loss: 0.00712
Epoch: 39588, Training Loss: 0.00713
Epoch: 39588, Training Loss: 0.00874
Epoch: 39589, Training Loss: 0.00686
Epoch: 39589, Training Loss: 0.00712
Epoch: 39589, Training Loss: 0.00713
Epoch: 39589, Training Loss: 0.00874
Epoch: 39590, Training Loss: 0.00686
Epoch: 39590, Training Loss: 0.00712
Epoch: 39590, Training Loss: 0.00713
Epoch: 39590, Training Loss: 0.00874
Epoch: 39591, Training Loss: 0.00686
Epoch: 39591, Training Loss: 0.00712
Epoch: 39591, Training Loss: 0.00713
Epoch: 39591, Training Loss: 0.00874
Epoch: 39592, Training Loss: 0.00686
Epoch: 39592, Training Loss: 0.00712
Epoch: 39592, Training Loss: 0.00713
Epoch: 39592, Training Loss: 0.00874
Epoch: 39593, Training Loss: 0.00686
Epoch: 39593, Training Loss: 0.00712
Epoch: 39593, Training Loss: 0.00713
Epoch: 39593, Training Loss: 0.00874
Epoch: 39594, Training Loss: 0.00686
Epoch: 39594, Training Loss: 0.00712
Epoch: 39594, Training Loss: 0.00713
Epoch: 39594, Training Loss: 0.00874
Epoch: 39595, Training Loss: 0.00686
Epoch: 39595, Training Loss: 0.00712
Epoch: 39595, Training Loss: 0.00713
Epoch: 39595, Training Loss: 0.00874
Epoch: 39596, Training Loss: 0.00686
Epoch: 39596, Training Loss: 0.00712
Epoch: 39596, Training Loss: 0.00713
Epoch: 39596, Training Loss: 0.00874
Epoch: 39597, Training Loss: 0.00686
Epoch: 39597, Training Loss: 0.00712
Epoch: 39597, Training Loss: 0.00713
Epoch: 39597, Training Loss: 0.00874
Epoch: 39598, Training Loss: 0.00686
Epoch: 39598, Training Loss: 0.00712
Epoch: 39598, Training Loss: 0.00713
Epoch: 39598, Training Loss: 0.00874
Epoch: 39599, Training Loss: 0.00686
Epoch: 39599, Training Loss: 0.00712
Epoch: 39599, Training Loss: 0.00713
Epoch: 39599, Training Loss: 0.00874
Epoch: 39600, Training Loss: 0.00685
Epoch: 39600, Training Loss: 0.00712
Epoch: 39600, Training Loss: 0.00713
Epoch: 39600, Training Loss: 0.00874
Epoch: 39601, Training Loss: 0.00685
Epoch: 39601, Training Loss: 0.00712
Epoch: 39601, Training Loss: 0.00713
Epoch: 39601, Training Loss: 0.00874
Epoch: 39602, Training Loss: 0.00685
Epoch: 39602, Training Loss: 0.00712
Epoch: 39602, Training Loss: 0.00713
Epoch: 39602, Training Loss: 0.00874
Epoch: 39603, Training Loss: 0.00685
Epoch: 39603, Training Loss: 0.00712
Epoch: 39603, Training Loss: 0.00713
Epoch: 39603, Training Loss: 0.00874
Epoch: 39604, Training Loss: 0.00685
Epoch: 39604, Training Loss: 0.00712
Epoch: 39604, Training Loss: 0.00713
Epoch: 39604, Training Loss: 0.00874
Epoch: 39605, Training Loss: 0.00685
Epoch: 39605, Training Loss: 0.00712
Epoch: 39605, Training Loss: 0.00713
Epoch: 39605, Training Loss: 0.00874
Epoch: 39606, Training Loss: 0.00685
Epoch: 39606, Training Loss: 0.00712
Epoch: 39606, Training Loss: 0.00713
Epoch: 39606, Training Loss: 0.00874
Epoch: 39607, Training Loss: 0.00685
Epoch: 39607, Training Loss: 0.00712
Epoch: 39607, Training Loss: 0.00713
Epoch: 39607, Training Loss: 0.00874
Epoch: 39608, Training Loss: 0.00685
Epoch: 39608, Training Loss: 0.00712
Epoch: 39608, Training Loss: 0.00713
Epoch: 39608, Training Loss: 0.00874
Epoch: 39609, Training Loss: 0.00685
Epoch: 39609, Training Loss: 0.00712
Epoch: 39609, Training Loss: 0.00713
Epoch: 39609, Training Loss: 0.00874
Epoch: 39610, Training Loss: 0.00685
Epoch: 39610, Training Loss: 0.00712
Epoch: 39610, Training Loss: 0.00713
Epoch: 39610, Training Loss: 0.00874
Epoch: 39611, Training Loss: 0.00685
Epoch: 39611, Training Loss: 0.00712
Epoch: 39611, Training Loss: 0.00713
Epoch: 39611, Training Loss: 0.00874
Epoch: 39612, Training Loss: 0.00685
Epoch: 39612, Training Loss: 0.00712
Epoch: 39612, Training Loss: 0.00713
Epoch: 39612, Training Loss: 0.00874
Epoch: 39613, Training Loss: 0.00685
Epoch: 39613, Training Loss: 0.00712
Epoch: 39613, Training Loss: 0.00713
Epoch: 39613, Training Loss: 0.00874
Epoch: 39614, Training Loss: 0.00685
Epoch: 39614, Training Loss: 0.00712
Epoch: 39614, Training Loss: 0.00713
Epoch: 39614, Training Loss: 0.00874
Epoch: 39615, Training Loss: 0.00685
Epoch: 39615, Training Loss: 0.00712
Epoch: 39615, Training Loss: 0.00713
Epoch: 39615, Training Loss: 0.00874
Epoch: 39616, Training Loss: 0.00685
Epoch: 39616, Training Loss: 0.00712
Epoch: 39616, Training Loss: 0.00713
Epoch: 39616, Training Loss: 0.00874
Epoch: 39617, Training Loss: 0.00685
Epoch: 39617, Training Loss: 0.00712
Epoch: 39617, Training Loss: 0.00713
Epoch: 39617, Training Loss: 0.00874
Epoch: 39618, Training Loss: 0.00685
Epoch: 39618, Training Loss: 0.00712
Epoch: 39618, Training Loss: 0.00713
Epoch: 39618, Training Loss: 0.00874
Epoch: 39619, Training Loss: 0.00685
Epoch: 39619, Training Loss: 0.00712
Epoch: 39619, Training Loss: 0.00713
Epoch: 39619, Training Loss: 0.00874
Epoch: 39620, Training Loss: 0.00685
Epoch: 39620, Training Loss: 0.00712
Epoch: 39620, Training Loss: 0.00713
Epoch: 39620, Training Loss: 0.00874
Epoch: 39621, Training Loss: 0.00685
Epoch: 39621, Training Loss: 0.00712
Epoch: 39621, Training Loss: 0.00713
Epoch: 39621, Training Loss: 0.00874
Epoch: 39622, Training Loss: 0.00685
Epoch: 39622, Training Loss: 0.00712
Epoch: 39622, Training Loss: 0.00713
Epoch: 39622, Training Loss: 0.00874
[... repetitive output elided: each epoch prints one loss per XOR pattern; over epochs 39622–39865 the four losses drift down very slowly, from about (0.00685, 0.00712, 0.00713, 0.00874) to about (0.00683, 0.00709, 0.00710, 0.00871) — still above the 0.008 target on the worst pattern ...]
Epoch: 39865, Training Loss: 0.00683
Epoch: 39865, Training Loss: 0.00709
Epoch: 39865, Training Loss: 0.00710
Epoch: 39865, Training Loss: 0.00871
Epoch: 39866, Training Loss: 0.00683
Epoch: 39866, Training Loss: 0.00709
Epoch: 39866, Training Loss: 0.00710
Epoch: 39866, Training Loss: 0.00871
Epoch: 39867, Training Loss: 0.00683
Epoch: 39867, Training Loss: 0.00709
Epoch: 39867, Training Loss: 0.00710
Epoch: 39867, Training Loss: 0.00871
Epoch: 39868, Training Loss: 0.00683
Epoch: 39868, Training Loss: 0.00709
Epoch: 39868, Training Loss: 0.00710
Epoch: 39868, Training Loss: 0.00871
Epoch: 39869, Training Loss: 0.00683
Epoch: 39869, Training Loss: 0.00709
Epoch: 39869, Training Loss: 0.00710
Epoch: 39869, Training Loss: 0.00871
Epoch: 39870, Training Loss: 0.00683
Epoch: 39870, Training Loss: 0.00709
Epoch: 39870, Training Loss: 0.00710
Epoch: 39870, Training Loss: 0.00871
Epoch: 39871, Training Loss: 0.00683
Epoch: 39871, Training Loss: 0.00709
Epoch: 39871, Training Loss: 0.00710
Epoch: 39871, Training Loss: 0.00871
Epoch: 39872, Training Loss: 0.00683
Epoch: 39872, Training Loss: 0.00709
Epoch: 39872, Training Loss: 0.00710
Epoch: 39872, Training Loss: 0.00871
Epoch: 39873, Training Loss: 0.00683
Epoch: 39873, Training Loss: 0.00709
Epoch: 39873, Training Loss: 0.00710
Epoch: 39873, Training Loss: 0.00871
Epoch: 39874, Training Loss: 0.00683
Epoch: 39874, Training Loss: 0.00709
Epoch: 39874, Training Loss: 0.00710
Epoch: 39874, Training Loss: 0.00871
Epoch: 39875, Training Loss: 0.00683
Epoch: 39875, Training Loss: 0.00709
Epoch: 39875, Training Loss: 0.00710
Epoch: 39875, Training Loss: 0.00871
Epoch: 39876, Training Loss: 0.00683
Epoch: 39876, Training Loss: 0.00709
Epoch: 39876, Training Loss: 0.00710
Epoch: 39876, Training Loss: 0.00871
Epoch: 39877, Training Loss: 0.00683
Epoch: 39877, Training Loss: 0.00709
Epoch: 39877, Training Loss: 0.00710
Epoch: 39877, Training Loss: 0.00871
Epoch: 39878, Training Loss: 0.00683
Epoch: 39878, Training Loss: 0.00709
Epoch: 39878, Training Loss: 0.00710
Epoch: 39878, Training Loss: 0.00871
Epoch: 39879, Training Loss: 0.00683
Epoch: 39879, Training Loss: 0.00709
Epoch: 39879, Training Loss: 0.00710
Epoch: 39879, Training Loss: 0.00871
Epoch: 39880, Training Loss: 0.00683
Epoch: 39880, Training Loss: 0.00709
Epoch: 39880, Training Loss: 0.00710
Epoch: 39880, Training Loss: 0.00871
Epoch: 39881, Training Loss: 0.00683
Epoch: 39881, Training Loss: 0.00709
Epoch: 39881, Training Loss: 0.00710
Epoch: 39881, Training Loss: 0.00871
Epoch: 39882, Training Loss: 0.00683
Epoch: 39882, Training Loss: 0.00709
Epoch: 39882, Training Loss: 0.00710
Epoch: 39882, Training Loss: 0.00871
Epoch: 39883, Training Loss: 0.00683
Epoch: 39883, Training Loss: 0.00709
Epoch: 39883, Training Loss: 0.00710
Epoch: 39883, Training Loss: 0.00871
Epoch: 39884, Training Loss: 0.00683
Epoch: 39884, Training Loss: 0.00709
Epoch: 39884, Training Loss: 0.00710
Epoch: 39884, Training Loss: 0.00871
Epoch: 39885, Training Loss: 0.00683
Epoch: 39885, Training Loss: 0.00709
Epoch: 39885, Training Loss: 0.00710
Epoch: 39885, Training Loss: 0.00871
Epoch: 39886, Training Loss: 0.00683
Epoch: 39886, Training Loss: 0.00709
Epoch: 39886, Training Loss: 0.00710
Epoch: 39886, Training Loss: 0.00871
Epoch: 39887, Training Loss: 0.00683
Epoch: 39887, Training Loss: 0.00709
Epoch: 39887, Training Loss: 0.00710
Epoch: 39887, Training Loss: 0.00871
Epoch: 39888, Training Loss: 0.00683
Epoch: 39888, Training Loss: 0.00709
Epoch: 39888, Training Loss: 0.00710
Epoch: 39888, Training Loss: 0.00871
Epoch: 39889, Training Loss: 0.00683
Epoch: 39889, Training Loss: 0.00709
Epoch: 39889, Training Loss: 0.00710
Epoch: 39889, Training Loss: 0.00871
Epoch: 39890, Training Loss: 0.00683
Epoch: 39890, Training Loss: 0.00709
Epoch: 39890, Training Loss: 0.00710
Epoch: 39890, Training Loss: 0.00871
Epoch: 39891, Training Loss: 0.00683
Epoch: 39891, Training Loss: 0.00709
Epoch: 39891, Training Loss: 0.00710
Epoch: 39891, Training Loss: 0.00871
Epoch: 39892, Training Loss: 0.00683
Epoch: 39892, Training Loss: 0.00709
Epoch: 39892, Training Loss: 0.00710
Epoch: 39892, Training Loss: 0.00871
Epoch: 39893, Training Loss: 0.00683
Epoch: 39893, Training Loss: 0.00709
Epoch: 39893, Training Loss: 0.00710
Epoch: 39893, Training Loss: 0.00871
Epoch: 39894, Training Loss: 0.00683
Epoch: 39894, Training Loss: 0.00709
Epoch: 39894, Training Loss: 0.00710
Epoch: 39894, Training Loss: 0.00871
Epoch: 39895, Training Loss: 0.00683
Epoch: 39895, Training Loss: 0.00709
Epoch: 39895, Training Loss: 0.00710
Epoch: 39895, Training Loss: 0.00871
Epoch: 39896, Training Loss: 0.00683
Epoch: 39896, Training Loss: 0.00709
Epoch: 39896, Training Loss: 0.00710
Epoch: 39896, Training Loss: 0.00871
Epoch: 39897, Training Loss: 0.00683
Epoch: 39897, Training Loss: 0.00709
Epoch: 39897, Training Loss: 0.00710
Epoch: 39897, Training Loss: 0.00871
Epoch: 39898, Training Loss: 0.00683
Epoch: 39898, Training Loss: 0.00709
Epoch: 39898, Training Loss: 0.00710
Epoch: 39898, Training Loss: 0.00871
Epoch: 39899, Training Loss: 0.00683
Epoch: 39899, Training Loss: 0.00709
Epoch: 39899, Training Loss: 0.00710
Epoch: 39899, Training Loss: 0.00871
Epoch: 39900, Training Loss: 0.00683
Epoch: 39900, Training Loss: 0.00709
Epoch: 39900, Training Loss: 0.00710
Epoch: 39900, Training Loss: 0.00871
Epoch: 39901, Training Loss: 0.00683
Epoch: 39901, Training Loss: 0.00709
Epoch: 39901, Training Loss: 0.00710
Epoch: 39901, Training Loss: 0.00871
Epoch: 39902, Training Loss: 0.00683
Epoch: 39902, Training Loss: 0.00709
Epoch: 39902, Training Loss: 0.00710
Epoch: 39902, Training Loss: 0.00871
Epoch: 39903, Training Loss: 0.00683
Epoch: 39903, Training Loss: 0.00709
Epoch: 39903, Training Loss: 0.00710
Epoch: 39903, Training Loss: 0.00871
Epoch: 39904, Training Loss: 0.00683
Epoch: 39904, Training Loss: 0.00709
Epoch: 39904, Training Loss: 0.00710
Epoch: 39904, Training Loss: 0.00871
Epoch: 39905, Training Loss: 0.00683
Epoch: 39905, Training Loss: 0.00709
Epoch: 39905, Training Loss: 0.00710
Epoch: 39905, Training Loss: 0.00871
Epoch: 39906, Training Loss: 0.00683
Epoch: 39906, Training Loss: 0.00709
Epoch: 39906, Training Loss: 0.00710
Epoch: 39906, Training Loss: 0.00871
Epoch: 39907, Training Loss: 0.00683
Epoch: 39907, Training Loss: 0.00709
Epoch: 39907, Training Loss: 0.00710
Epoch: 39907, Training Loss: 0.00871
Epoch: 39908, Training Loss: 0.00683
Epoch: 39908, Training Loss: 0.00709
Epoch: 39908, Training Loss: 0.00710
Epoch: 39908, Training Loss: 0.00871
Epoch: 39909, Training Loss: 0.00683
Epoch: 39909, Training Loss: 0.00709
Epoch: 39909, Training Loss: 0.00710
Epoch: 39909, Training Loss: 0.00871
Epoch: 39910, Training Loss: 0.00683
Epoch: 39910, Training Loss: 0.00709
Epoch: 39910, Training Loss: 0.00710
Epoch: 39910, Training Loss: 0.00871
Epoch: 39911, Training Loss: 0.00683
Epoch: 39911, Training Loss: 0.00709
Epoch: 39911, Training Loss: 0.00710
Epoch: 39911, Training Loss: 0.00871
Epoch: 39912, Training Loss: 0.00683
Epoch: 39912, Training Loss: 0.00709
Epoch: 39912, Training Loss: 0.00710
Epoch: 39912, Training Loss: 0.00870
Epoch: 39913, Training Loss: 0.00683
Epoch: 39913, Training Loss: 0.00709
Epoch: 39913, Training Loss: 0.00710
Epoch: 39913, Training Loss: 0.00870
Epoch: 39914, Training Loss: 0.00683
Epoch: 39914, Training Loss: 0.00709
Epoch: 39914, Training Loss: 0.00710
Epoch: 39914, Training Loss: 0.00870
Epoch: 39915, Training Loss: 0.00683
Epoch: 39915, Training Loss: 0.00709
Epoch: 39915, Training Loss: 0.00710
Epoch: 39915, Training Loss: 0.00870
Epoch: 39916, Training Loss: 0.00683
Epoch: 39916, Training Loss: 0.00709
Epoch: 39916, Training Loss: 0.00710
Epoch: 39916, Training Loss: 0.00870
Epoch: 39917, Training Loss: 0.00683
Epoch: 39917, Training Loss: 0.00709
Epoch: 39917, Training Loss: 0.00710
Epoch: 39917, Training Loss: 0.00870
Epoch: 39918, Training Loss: 0.00683
Epoch: 39918, Training Loss: 0.00709
Epoch: 39918, Training Loss: 0.00710
Epoch: 39918, Training Loss: 0.00870
Epoch: 39919, Training Loss: 0.00683
Epoch: 39919, Training Loss: 0.00709
Epoch: 39919, Training Loss: 0.00710
Epoch: 39919, Training Loss: 0.00870
Epoch: 39920, Training Loss: 0.00683
Epoch: 39920, Training Loss: 0.00709
Epoch: 39920, Training Loss: 0.00710
Epoch: 39920, Training Loss: 0.00870
Epoch: 39921, Training Loss: 0.00683
Epoch: 39921, Training Loss: 0.00709
Epoch: 39921, Training Loss: 0.00710
Epoch: 39921, Training Loss: 0.00870
Epoch: 39922, Training Loss: 0.00683
Epoch: 39922, Training Loss: 0.00709
Epoch: 39922, Training Loss: 0.00710
Epoch: 39922, Training Loss: 0.00870
Epoch: 39923, Training Loss: 0.00683
Epoch: 39923, Training Loss: 0.00709
Epoch: 39923, Training Loss: 0.00710
Epoch: 39923, Training Loss: 0.00870
Epoch: 39924, Training Loss: 0.00682
Epoch: 39924, Training Loss: 0.00709
Epoch: 39924, Training Loss: 0.00710
Epoch: 39924, Training Loss: 0.00870
Epoch: 39925, Training Loss: 0.00682
Epoch: 39925, Training Loss: 0.00709
Epoch: 39925, Training Loss: 0.00710
Epoch: 39925, Training Loss: 0.00870
Epoch: 39926, Training Loss: 0.00682
Epoch: 39926, Training Loss: 0.00709
Epoch: 39926, Training Loss: 0.00710
Epoch: 39926, Training Loss: 0.00870
Epoch: 39927, Training Loss: 0.00682
Epoch: 39927, Training Loss: 0.00709
Epoch: 39927, Training Loss: 0.00710
Epoch: 39927, Training Loss: 0.00870
Epoch: 39928, Training Loss: 0.00682
Epoch: 39928, Training Loss: 0.00709
Epoch: 39928, Training Loss: 0.00710
Epoch: 39928, Training Loss: 0.00870
Epoch: 39929, Training Loss: 0.00682
Epoch: 39929, Training Loss: 0.00709
Epoch: 39929, Training Loss: 0.00710
Epoch: 39929, Training Loss: 0.00870
Epoch: 39930, Training Loss: 0.00682
Epoch: 39930, Training Loss: 0.00709
Epoch: 39930, Training Loss: 0.00710
Epoch: 39930, Training Loss: 0.00870
Epoch: 39931, Training Loss: 0.00682
Epoch: 39931, Training Loss: 0.00709
Epoch: 39931, Training Loss: 0.00710
Epoch: 39931, Training Loss: 0.00870
Epoch: 39932, Training Loss: 0.00682
Epoch: 39932, Training Loss: 0.00709
Epoch: 39932, Training Loss: 0.00710
Epoch: 39932, Training Loss: 0.00870
Epoch: 39933, Training Loss: 0.00682
Epoch: 39933, Training Loss: 0.00709
Epoch: 39933, Training Loss: 0.00710
Epoch: 39933, Training Loss: 0.00870
Epoch: 39934, Training Loss: 0.00682
Epoch: 39934, Training Loss: 0.00709
Epoch: 39934, Training Loss: 0.00710
Epoch: 39934, Training Loss: 0.00870
Epoch: 39935, Training Loss: 0.00682
Epoch: 39935, Training Loss: 0.00709
Epoch: 39935, Training Loss: 0.00710
Epoch: 39935, Training Loss: 0.00870
Epoch: 39936, Training Loss: 0.00682
Epoch: 39936, Training Loss: 0.00709
Epoch: 39936, Training Loss: 0.00710
Epoch: 39936, Training Loss: 0.00870
Epoch: 39937, Training Loss: 0.00682
Epoch: 39937, Training Loss: 0.00709
Epoch: 39937, Training Loss: 0.00710
Epoch: 39937, Training Loss: 0.00870
Epoch: 39938, Training Loss: 0.00682
Epoch: 39938, Training Loss: 0.00709
Epoch: 39938, Training Loss: 0.00710
Epoch: 39938, Training Loss: 0.00870
Epoch: 39939, Training Loss: 0.00682
Epoch: 39939, Training Loss: 0.00709
Epoch: 39939, Training Loss: 0.00710
Epoch: 39939, Training Loss: 0.00870
Epoch: 39940, Training Loss: 0.00682
Epoch: 39940, Training Loss: 0.00709
Epoch: 39940, Training Loss: 0.00710
Epoch: 39940, Training Loss: 0.00870
Epoch: 39941, Training Loss: 0.00682
Epoch: 39941, Training Loss: 0.00709
Epoch: 39941, Training Loss: 0.00710
Epoch: 39941, Training Loss: 0.00870
Epoch: 39942, Training Loss: 0.00682
Epoch: 39942, Training Loss: 0.00709
Epoch: 39942, Training Loss: 0.00710
Epoch: 39942, Training Loss: 0.00870
Epoch: 39943, Training Loss: 0.00682
Epoch: 39943, Training Loss: 0.00709
Epoch: 39943, Training Loss: 0.00710
Epoch: 39943, Training Loss: 0.00870
Epoch: 39944, Training Loss: 0.00682
Epoch: 39944, Training Loss: 0.00709
Epoch: 39944, Training Loss: 0.00710
Epoch: 39944, Training Loss: 0.00870
Epoch: 39945, Training Loss: 0.00682
Epoch: 39945, Training Loss: 0.00709
Epoch: 39945, Training Loss: 0.00709
Epoch: 39945, Training Loss: 0.00870
Epoch: 39946, Training Loss: 0.00682
Epoch: 39946, Training Loss: 0.00709
Epoch: 39946, Training Loss: 0.00709
Epoch: 39946, Training Loss: 0.00870
Epoch: 39947, Training Loss: 0.00682
Epoch: 39947, Training Loss: 0.00709
Epoch: 39947, Training Loss: 0.00709
Epoch: 39947, Training Loss: 0.00870
Epoch: 39948, Training Loss: 0.00682
Epoch: 39948, Training Loss: 0.00709
Epoch: 39948, Training Loss: 0.00709
Epoch: 39948, Training Loss: 0.00870
Epoch: 39949, Training Loss: 0.00682
Epoch: 39949, Training Loss: 0.00709
Epoch: 39949, Training Loss: 0.00709
Epoch: 39949, Training Loss: 0.00870
Epoch: 39950, Training Loss: 0.00682
Epoch: 39950, Training Loss: 0.00709
Epoch: 39950, Training Loss: 0.00709
Epoch: 39950, Training Loss: 0.00870
Epoch: 39951, Training Loss: 0.00682
Epoch: 39951, Training Loss: 0.00709
Epoch: 39951, Training Loss: 0.00709
Epoch: 39951, Training Loss: 0.00870
Epoch: 39952, Training Loss: 0.00682
Epoch: 39952, Training Loss: 0.00709
Epoch: 39952, Training Loss: 0.00709
Epoch: 39952, Training Loss: 0.00870
Epoch: 39953, Training Loss: 0.00682
Epoch: 39953, Training Loss: 0.00709
Epoch: 39953, Training Loss: 0.00709
Epoch: 39953, Training Loss: 0.00870
Epoch: 39954, Training Loss: 0.00682
Epoch: 39954, Training Loss: 0.00709
Epoch: 39954, Training Loss: 0.00709
Epoch: 39954, Training Loss: 0.00870
Epoch: 39955, Training Loss: 0.00682
Epoch: 39955, Training Loss: 0.00709
Epoch: 39955, Training Loss: 0.00709
Epoch: 39955, Training Loss: 0.00870
Epoch: 39956, Training Loss: 0.00682
Epoch: 39956, Training Loss: 0.00709
Epoch: 39956, Training Loss: 0.00709
Epoch: 39956, Training Loss: 0.00870
Epoch: 39957, Training Loss: 0.00682
Epoch: 39957, Training Loss: 0.00709
Epoch: 39957, Training Loss: 0.00709
Epoch: 39957, Training Loss: 0.00870
Epoch: 39958, Training Loss: 0.00682
Epoch: 39958, Training Loss: 0.00709
Epoch: 39958, Training Loss: 0.00709
Epoch: 39958, Training Loss: 0.00870
Epoch: 39959, Training Loss: 0.00682
Epoch: 39959, Training Loss: 0.00709
Epoch: 39959, Training Loss: 0.00709
Epoch: 39959, Training Loss: 0.00870
Epoch: 39960, Training Loss: 0.00682
Epoch: 39960, Training Loss: 0.00709
Epoch: 39960, Training Loss: 0.00709
Epoch: 39960, Training Loss: 0.00870
Epoch: 39961, Training Loss: 0.00682
Epoch: 39961, Training Loss: 0.00709
Epoch: 39961, Training Loss: 0.00709
Epoch: 39961, Training Loss: 0.00870
Epoch: 39962, Training Loss: 0.00682
Epoch: 39962, Training Loss: 0.00709
Epoch: 39962, Training Loss: 0.00709
Epoch: 39962, Training Loss: 0.00870
Epoch: 39963, Training Loss: 0.00682
Epoch: 39963, Training Loss: 0.00709
Epoch: 39963, Training Loss: 0.00709
Epoch: 39963, Training Loss: 0.00870
Epoch: 39964, Training Loss: 0.00682
Epoch: 39964, Training Loss: 0.00709
Epoch: 39964, Training Loss: 0.00709
Epoch: 39964, Training Loss: 0.00870
Epoch: 39965, Training Loss: 0.00682
Epoch: 39965, Training Loss: 0.00709
Epoch: 39965, Training Loss: 0.00709
Epoch: 39965, Training Loss: 0.00870
Epoch: 39966, Training Loss: 0.00682
Epoch: 39966, Training Loss: 0.00708
Epoch: 39966, Training Loss: 0.00709
Epoch: 39966, Training Loss: 0.00870
Epoch: 39967, Training Loss: 0.00682
Epoch: 39967, Training Loss: 0.00708
Epoch: 39967, Training Loss: 0.00709
Epoch: 39967, Training Loss: 0.00870
Epoch: 39968, Training Loss: 0.00682
Epoch: 39968, Training Loss: 0.00708
Epoch: 39968, Training Loss: 0.00709
Epoch: 39968, Training Loss: 0.00870
Epoch: 39969, Training Loss: 0.00682
Epoch: 39969, Training Loss: 0.00708
Epoch: 39969, Training Loss: 0.00709
Epoch: 39969, Training Loss: 0.00870
Epoch: 39970, Training Loss: 0.00682
Epoch: 39970, Training Loss: 0.00708
Epoch: 39970, Training Loss: 0.00709
Epoch: 39970, Training Loss: 0.00870
Epoch: 39971, Training Loss: 0.00682
Epoch: 39971, Training Loss: 0.00708
Epoch: 39971, Training Loss: 0.00709
Epoch: 39971, Training Loss: 0.00870
Epoch: 39972, Training Loss: 0.00682
Epoch: 39972, Training Loss: 0.00708
Epoch: 39972, Training Loss: 0.00709
Epoch: 39972, Training Loss: 0.00870
Epoch: 39973, Training Loss: 0.00682
Epoch: 39973, Training Loss: 0.00708
Epoch: 39973, Training Loss: 0.00709
Epoch: 39973, Training Loss: 0.00870
Epoch: 39974, Training Loss: 0.00682
Epoch: 39974, Training Loss: 0.00708
Epoch: 39974, Training Loss: 0.00709
Epoch: 39974, Training Loss: 0.00870
Epoch: 39975, Training Loss: 0.00682
Epoch: 39975, Training Loss: 0.00708
Epoch: 39975, Training Loss: 0.00709
Epoch: 39975, Training Loss: 0.00870
Epoch: 39976, Training Loss: 0.00682
Epoch: 39976, Training Loss: 0.00708
Epoch: 39976, Training Loss: 0.00709
Epoch: 39976, Training Loss: 0.00870
Epoch: 39977, Training Loss: 0.00682
Epoch: 39977, Training Loss: 0.00708
Epoch: 39977, Training Loss: 0.00709
Epoch: 39977, Training Loss: 0.00870
Epoch: 39978, Training Loss: 0.00682
Epoch: 39978, Training Loss: 0.00708
Epoch: 39978, Training Loss: 0.00709
Epoch: 39978, Training Loss: 0.00870
Epoch: 39979, Training Loss: 0.00682
Epoch: 39979, Training Loss: 0.00708
Epoch: 39979, Training Loss: 0.00709
Epoch: 39979, Training Loss: 0.00870
Epoch: 39980, Training Loss: 0.00682
Epoch: 39980, Training Loss: 0.00708
Epoch: 39980, Training Loss: 0.00709
Epoch: 39980, Training Loss: 0.00870
Epoch: 39981, Training Loss: 0.00682
Epoch: 39981, Training Loss: 0.00708
Epoch: 39981, Training Loss: 0.00709
Epoch: 39981, Training Loss: 0.00870
Epoch: 39982, Training Loss: 0.00682
Epoch: 39982, Training Loss: 0.00708
Epoch: 39982, Training Loss: 0.00709
Epoch: 39982, Training Loss: 0.00870
Epoch: 39983, Training Loss: 0.00682
Epoch: 39983, Training Loss: 0.00708
Epoch: 39983, Training Loss: 0.00709
Epoch: 39983, Training Loss: 0.00870
Epoch: 39984, Training Loss: 0.00682
Epoch: 39984, Training Loss: 0.00708
Epoch: 39984, Training Loss: 0.00709
Epoch: 39984, Training Loss: 0.00870
Epoch: 39985, Training Loss: 0.00682
Epoch: 39985, Training Loss: 0.00708
Epoch: 39985, Training Loss: 0.00709
Epoch: 39985, Training Loss: 0.00870
Epoch: 39986, Training Loss: 0.00682
Epoch: 39986, Training Loss: 0.00708
Epoch: 39986, Training Loss: 0.00709
Epoch: 39986, Training Loss: 0.00870
Epoch: 39987, Training Loss: 0.00682
Epoch: 39987, Training Loss: 0.00708
Epoch: 39987, Training Loss: 0.00709
Epoch: 39987, Training Loss: 0.00870
Epoch: 39988, Training Loss: 0.00682
Epoch: 39988, Training Loss: 0.00708
Epoch: 39988, Training Loss: 0.00709
Epoch: 39988, Training Loss: 0.00870
Epoch: 39989, Training Loss: 0.00682
Epoch: 39989, Training Loss: 0.00708
Epoch: 39989, Training Loss: 0.00709
Epoch: 39989, Training Loss: 0.00870
Epoch: 39990, Training Loss: 0.00682
Epoch: 39990, Training Loss: 0.00708
Epoch: 39990, Training Loss: 0.00709
Epoch: 39990, Training Loss: 0.00870
Epoch: 39991, Training Loss: 0.00682
Epoch: 39991, Training Loss: 0.00708
Epoch: 39991, Training Loss: 0.00709
Epoch: 39991, Training Loss: 0.00870
Epoch: 39992, Training Loss: 0.00682
Epoch: 39992, Training Loss: 0.00708
Epoch: 39992, Training Loss: 0.00709
Epoch: 39992, Training Loss: 0.00870
Epoch: 39993, Training Loss: 0.00682
Epoch: 39993, Training Loss: 0.00708
Epoch: 39993, Training Loss: 0.00709
Epoch: 39993, Training Loss: 0.00870
Epoch: 39994, Training Loss: 0.00682
Epoch: 39994, Training Loss: 0.00708
Epoch: 39994, Training Loss: 0.00709
Epoch: 39994, Training Loss: 0.00870
Epoch: 39995, Training Loss: 0.00682
Epoch: 39995, Training Loss: 0.00708
Epoch: 39995, Training Loss: 0.00709
Epoch: 39995, Training Loss: 0.00869
Epoch: 39996, Training Loss: 0.00682
Epoch: 39996, Training Loss: 0.00708
Epoch: 39996, Training Loss: 0.00709
Epoch: 39996, Training Loss: 0.00869
Epoch: 39997, Training Loss: 0.00682
Epoch: 39997, Training Loss: 0.00708
Epoch: 39997, Training Loss: 0.00709
Epoch: 39997, Training Loss: 0.00869
Epoch: 39998, Training Loss: 0.00682
Epoch: 39998, Training Loss: 0.00708
Epoch: 39998, Training Loss: 0.00709
Epoch: 39998, Training Loss: 0.00869
Epoch: 39999, Training Loss: 0.00682
Epoch: 39999, Training Loss: 0.00708
Epoch: 39999, Training Loss: 0.00709
Epoch: 39999, Training Loss: 0.00869
Epoch: 40000, Training Loss: 0.00682
Epoch: 40000, Training Loss: 0.00708
Epoch: 40000, Training Loss: 0.00709
Epoch: 40000, Training Loss: 0.00869
Epoch: 40001, Training Loss: 0.00682
Epoch: 40001, Training Loss: 0.00708
Epoch: 40001, Training Loss: 0.00709
Epoch: 40001, Training Loss: 0.00869
Epoch: 40002, Training Loss: 0.00682
Epoch: 40002, Training Loss: 0.00708
Epoch: 40002, Training Loss: 0.00709
Epoch: 40002, Training Loss: 0.00869
Epoch: 40003, Training Loss: 0.00682
Epoch: 40003, Training Loss: 0.00708
Epoch: 40003, Training Loss: 0.00709
Epoch: 40003, Training Loss: 0.00869
Epoch: 40004, Training Loss: 0.00682
Epoch: 40004, Training Loss: 0.00708
Epoch: 40004, Training Loss: 0.00709
Epoch: 40004, Training Loss: 0.00869
Epoch: 40005, Training Loss: 0.00682
Epoch: 40005, Training Loss: 0.00708
Epoch: 40005, Training Loss: 0.00709
Epoch: 40005, Training Loss: 0.00869
Epoch: 40006, Training Loss: 0.00682
Epoch: 40006, Training Loss: 0.00708
Epoch: 40006, Training Loss: 0.00709
Epoch: 40006, Training Loss: 0.00869
Epoch: 40007, Training Loss: 0.00682
Epoch: 40007, Training Loss: 0.00708
Epoch: 40007, Training Loss: 0.00709
Epoch: 40007, Training Loss: 0.00869
Epoch: 40008, Training Loss: 0.00682
Epoch: 40008, Training Loss: 0.00708
Epoch: 40008, Training Loss: 0.00709
Epoch: 40008, Training Loss: 0.00869
Epoch: 40009, Training Loss: 0.00682
Epoch: 40009, Training Loss: 0.00708
Epoch: 40009, Training Loss: 0.00709
Epoch: 40009, Training Loss: 0.00869
Epoch: 40010, Training Loss: 0.00682
Epoch: 40010, Training Loss: 0.00708
Epoch: 40010, Training Loss: 0.00709
Epoch: 40010, Training Loss: 0.00869
Epoch: 40011, Training Loss: 0.00682
Epoch: 40011, Training Loss: 0.00708
Epoch: 40011, Training Loss: 0.00709
Epoch: 40011, Training Loss: 0.00869
Epoch: 40012, Training Loss: 0.00682
Epoch: 40012, Training Loss: 0.00708
Epoch: 40012, Training Loss: 0.00709
Epoch: 40012, Training Loss: 0.00869
Epoch: 40013, Training Loss: 0.00682
Epoch: 40013, Training Loss: 0.00708
Epoch: 40013, Training Loss: 0.00709
Epoch: 40013, Training Loss: 0.00869
Epoch: 40014, Training Loss: 0.00682
Epoch: 40014, Training Loss: 0.00708
Epoch: 40014, Training Loss: 0.00709
Epoch: 40014, Training Loss: 0.00869
Epoch: 40015, Training Loss: 0.00682
Epoch: 40015, Training Loss: 0.00708
Epoch: 40015, Training Loss: 0.00709
Epoch: 40015, Training Loss: 0.00869
Epoch: 40016, Training Loss: 0.00682
Epoch: 40016, Training Loss: 0.00708
Epoch: 40016, Training Loss: 0.00709
Epoch: 40016, Training Loss: 0.00869
Epoch: 40017, Training Loss: 0.00682
Epoch: 40017, Training Loss: 0.00708
Epoch: 40017, Training Loss: 0.00709
Epoch: 40017, Training Loss: 0.00869
Epoch: 40018, Training Loss: 0.00682
Epoch: 40018, Training Loss: 0.00708
Epoch: 40018, Training Loss: 0.00709
Epoch: 40018, Training Loss: 0.00869
Epoch: 40019, Training Loss: 0.00682
Epoch: 40019, Training Loss: 0.00708
Epoch: 40019, Training Loss: 0.00709
Epoch: 40019, Training Loss: 0.00869
Epoch: 40020, Training Loss: 0.00682
Epoch: 40020, Training Loss: 0.00708
Epoch: 40020, Training Loss: 0.00709
Epoch: 40020, Training Loss: 0.00869
Epoch: 40021, Training Loss: 0.00682
Epoch: 40021, Training Loss: 0.00708
Epoch: 40021, Training Loss: 0.00709
Epoch: 40021, Training Loss: 0.00869
Epoch: 40022, Training Loss: 0.00682
Epoch: 40022, Training Loss: 0.00708
Epoch: 40022, Training Loss: 0.00709
Epoch: 40022, Training Loss: 0.00869
Epoch: 40023, Training Loss: 0.00682
Epoch: 40023, Training Loss: 0.00708
Epoch: 40023, Training Loss: 0.00709
Epoch: 40023, Training Loss: 0.00869
Epoch: 40024, Training Loss: 0.00682
Epoch: 40024, Training Loss: 0.00708
Epoch: 40024, Training Loss: 0.00709
Epoch: 40024, Training Loss: 0.00869
Epoch: 40025, Training Loss: 0.00682
Epoch: 40025, Training Loss: 0.00708
Epoch: 40025, Training Loss: 0.00709
Epoch: 40025, Training Loss: 0.00869
Epoch: 40026, Training Loss: 0.00682
Epoch: 40026, Training Loss: 0.00708
Epoch: 40026, Training Loss: 0.00709
Epoch: 40026, Training Loss: 0.00869
Epoch: 40027, Training Loss: 0.00682
Epoch: 40027, Training Loss: 0.00708
Epoch: 40027, Training Loss: 0.00709
Epoch: 40027, Training Loss: 0.00869
Epoch: 40028, Training Loss: 0.00682
Epoch: 40028, Training Loss: 0.00708
Epoch: 40028, Training Loss: 0.00709
Epoch: 40028, Training Loss: 0.00869
Epoch: 40029, Training Loss: 0.00682
Epoch: 40029, Training Loss: 0.00708
Epoch: 40029, Training Loss: 0.00709
Epoch: 40029, Training Loss: 0.00869
Epoch: 40030, Training Loss: 0.00682
Epoch: 40030, Training Loss: 0.00708
Epoch: 40030, Training Loss: 0.00709
Epoch: 40030, Training Loss: 0.00869
Epoch: 40031, Training Loss: 0.00682
Epoch: 40031, Training Loss: 0.00708
Epoch: 40031, Training Loss: 0.00709
Epoch: 40031, Training Loss: 0.00869
Epoch: 40032, Training Loss: 0.00682
Epoch: 40032, Training Loss: 0.00708
Epoch: 40032, Training Loss: 0.00709
Epoch: 40032, Training Loss: 0.00869
Epoch: 40033, Training Loss: 0.00681
Epoch: 40033, Training Loss: 0.00708
Epoch: 40033, Training Loss: 0.00709
Epoch: 40033, Training Loss: 0.00869
Epoch: 40034, Training Loss: 0.00681
Epoch: 40034, Training Loss: 0.00708
Epoch: 40034, Training Loss: 0.00709
Epoch: 40034, Training Loss: 0.00869
Epoch: 40035, Training Loss: 0.00681
Epoch: 40035, Training Loss: 0.00708
Epoch: 40035, Training Loss: 0.00709
Epoch: 40035, Training Loss: 0.00869
Epoch: 40036, Training Loss: 0.00681
Epoch: 40036, Training Loss: 0.00708
Epoch: 40036, Training Loss: 0.00709
Epoch: 40036, Training Loss: 0.00869
Epoch: 40037, Training Loss: 0.00681
Epoch: 40037, Training Loss: 0.00708
Epoch: 40037, Training Loss: 0.00709
Epoch: 40037, Training Loss: 0.00869
Epoch: 40038, Training Loss: 0.00681
Epoch: 40038, Training Loss: 0.00708
Epoch: 40038, Training Loss: 0.00709
Epoch: 40038, Training Loss: 0.00869
Epoch: 40039, Training Loss: 0.00681
Epoch: 40039, Training Loss: 0.00708
Epoch: 40039, Training Loss: 0.00709
Epoch: 40039, Training Loss: 0.00869
Epoch: 40040, Training Loss: 0.00681
Epoch: 40040, Training Loss: 0.00708
Epoch: 40040, Training Loss: 0.00709
Epoch: 40040, Training Loss: 0.00869
Epoch: 40041, Training Loss: 0.00681
Epoch: 40041, Training Loss: 0.00708
Epoch: 40041, Training Loss: 0.00709
Epoch: 40041, Training Loss: 0.00869
Epoch: 40042, Training Loss: 0.00681
Epoch: 40042, Training Loss: 0.00708
Epoch: 40042, Training Loss: 0.00709
Epoch: 40042, Training Loss: 0.00869
Epoch: 40043, Training Loss: 0.00681
Epoch: 40043, Training Loss: 0.00708
Epoch: 40043, Training Loss: 0.00709
Epoch: 40043, Training Loss: 0.00869
Epoch: 40044, Training Loss: 0.00681
Epoch: 40044, Training Loss: 0.00708
Epoch: 40044, Training Loss: 0.00709
Epoch: 40044, Training Loss: 0.00869
Epoch: 40045, Training Loss: 0.00681
Epoch: 40045, Training Loss: 0.00708
Epoch: 40045, Training Loss: 0.00709
Epoch: 40045, Training Loss: 0.00869
Epoch: 40046, Training Loss: 0.00681
Epoch: 40046, Training Loss: 0.00708
Epoch: 40046, Training Loss: 0.00709
Epoch: 40046, Training Loss: 0.00869
Epoch: 40047, Training Loss: 0.00681
Epoch: 40047, Training Loss: 0.00708
Epoch: 40047, Training Loss: 0.00709
Epoch: 40047, Training Loss: 0.00869
Epoch: 40048, Training Loss: 0.00681
Epoch: 40048, Training Loss: 0.00708
Epoch: 40048, Training Loss: 0.00708
Epoch: 40048, Training Loss: 0.00869
Epoch: 40049, Training Loss: 0.00681
Epoch: 40049, Training Loss: 0.00708
Epoch: 40049, Training Loss: 0.00708
Epoch: 40049, Training Loss: 0.00869
Epoch: 40050, Training Loss: 0.00681
Epoch: 40050, Training Loss: 0.00708
Epoch: 40050, Training Loss: 0.00708
Epoch: 40050, Training Loss: 0.00869
Epoch: 40051, Training Loss: 0.00681
Epoch: 40051, Training Loss: 0.00708
Epoch: 40051, Training Loss: 0.00708
Epoch: 40051, Training Loss: 0.00869
Epoch: 40052, Training Loss: 0.00681
Epoch: 40052, Training Loss: 0.00708
Epoch: 40052, Training Loss: 0.00708
Epoch: 40052, Training Loss: 0.00869
Epoch: 40053, Training Loss: 0.00681
Epoch: 40053, Training Loss: 0.00708
Epoch: 40053, Training Loss: 0.00708
Epoch: 40053, Training Loss: 0.00869
Epoch: 40054, Training Loss: 0.00681
Epoch: 40055, Training Loss: 0.00681
Epoch: 40055, Training Loss: 0.00708
Epoch: 40055, Training Loss: 0.00708
Epoch: 40055, Training Loss: 0.00869
... [output truncated: the loop prints four loss values per epoch; over epochs 40055-40297 they decrease only marginally, from (0.00681, 0.00708, 0.00708, 0.00869) to (0.00679, 0.00705, 0.00706, 0.00866), so the largest of the four losses is still above the 0.008 target at this point] ...
Epoch: 40297, Training Loss: 0.00679
Epoch: 40297, Training Loss: 0.00705
Epoch: 40297, Training Loss: 0.00706
Epoch: 40297, Training Loss: 0.00866
Epoch: 40298, Training Loss: 0.00679
Epoch: 40298, Training Loss: 0.00705
Epoch: 40298, Training Loss: 0.00706
Epoch: 40298, Training Loss: 0.00866
Epoch: 40299, Training Loss: 0.00679
Epoch: 40299, Training Loss: 0.00705
Epoch: 40299, Training Loss: 0.00706
Epoch: 40299, Training Loss: 0.00866
Epoch: 40300, Training Loss: 0.00679
Epoch: 40300, Training Loss: 0.00705
Epoch: 40300, Training Loss: 0.00706
Epoch: 40300, Training Loss: 0.00866
Epoch: 40301, Training Loss: 0.00679
Epoch: 40301, Training Loss: 0.00705
Epoch: 40301, Training Loss: 0.00706
Epoch: 40301, Training Loss: 0.00866
Epoch: 40302, Training Loss: 0.00679
Epoch: 40302, Training Loss: 0.00705
Epoch: 40302, Training Loss: 0.00706
Epoch: 40302, Training Loss: 0.00866
Epoch: 40303, Training Loss: 0.00679
Epoch: 40303, Training Loss: 0.00705
Epoch: 40303, Training Loss: 0.00706
Epoch: 40303, Training Loss: 0.00866
Epoch: 40304, Training Loss: 0.00679
Epoch: 40304, Training Loss: 0.00705
Epoch: 40304, Training Loss: 0.00706
Epoch: 40304, Training Loss: 0.00866
Epoch: 40305, Training Loss: 0.00679
Epoch: 40305, Training Loss: 0.00705
Epoch: 40305, Training Loss: 0.00706
Epoch: 40305, Training Loss: 0.00866
Epoch: 40306, Training Loss: 0.00679
Epoch: 40306, Training Loss: 0.00705
Epoch: 40306, Training Loss: 0.00706
Epoch: 40306, Training Loss: 0.00866
Epoch: 40307, Training Loss: 0.00679
Epoch: 40307, Training Loss: 0.00705
Epoch: 40307, Training Loss: 0.00706
Epoch: 40307, Training Loss: 0.00866
Epoch: 40308, Training Loss: 0.00679
Epoch: 40308, Training Loss: 0.00705
Epoch: 40308, Training Loss: 0.00706
Epoch: 40308, Training Loss: 0.00866
Epoch: 40309, Training Loss: 0.00679
Epoch: 40309, Training Loss: 0.00705
Epoch: 40309, Training Loss: 0.00706
Epoch: 40309, Training Loss: 0.00866
Epoch: 40310, Training Loss: 0.00679
Epoch: 40310, Training Loss: 0.00705
Epoch: 40310, Training Loss: 0.00706
Epoch: 40310, Training Loss: 0.00866
Epoch: 40311, Training Loss: 0.00679
Epoch: 40311, Training Loss: 0.00705
Epoch: 40311, Training Loss: 0.00706
Epoch: 40311, Training Loss: 0.00866
Epoch: 40312, Training Loss: 0.00679
Epoch: 40312, Training Loss: 0.00705
Epoch: 40312, Training Loss: 0.00706
Epoch: 40312, Training Loss: 0.00866
Epoch: 40313, Training Loss: 0.00679
Epoch: 40313, Training Loss: 0.00705
Epoch: 40313, Training Loss: 0.00706
Epoch: 40313, Training Loss: 0.00866
Epoch: 40314, Training Loss: 0.00679
Epoch: 40314, Training Loss: 0.00705
Epoch: 40314, Training Loss: 0.00706
Epoch: 40314, Training Loss: 0.00866
Epoch: 40315, Training Loss: 0.00679
Epoch: 40315, Training Loss: 0.00705
Epoch: 40315, Training Loss: 0.00706
Epoch: 40315, Training Loss: 0.00866
Epoch: 40316, Training Loss: 0.00679
Epoch: 40316, Training Loss: 0.00705
Epoch: 40316, Training Loss: 0.00706
Epoch: 40316, Training Loss: 0.00866
Epoch: 40317, Training Loss: 0.00679
Epoch: 40317, Training Loss: 0.00705
Epoch: 40317, Training Loss: 0.00706
Epoch: 40317, Training Loss: 0.00866
Epoch: 40318, Training Loss: 0.00679
Epoch: 40318, Training Loss: 0.00705
Epoch: 40318, Training Loss: 0.00706
Epoch: 40318, Training Loss: 0.00866
Epoch: 40319, Training Loss: 0.00679
Epoch: 40319, Training Loss: 0.00705
Epoch: 40319, Training Loss: 0.00706
Epoch: 40319, Training Loss: 0.00866
Epoch: 40320, Training Loss: 0.00679
Epoch: 40320, Training Loss: 0.00705
Epoch: 40320, Training Loss: 0.00706
Epoch: 40320, Training Loss: 0.00866
Epoch: 40321, Training Loss: 0.00679
Epoch: 40321, Training Loss: 0.00705
Epoch: 40321, Training Loss: 0.00706
Epoch: 40321, Training Loss: 0.00866
Epoch: 40322, Training Loss: 0.00679
Epoch: 40322, Training Loss: 0.00705
Epoch: 40322, Training Loss: 0.00706
Epoch: 40322, Training Loss: 0.00866
Epoch: 40323, Training Loss: 0.00679
Epoch: 40323, Training Loss: 0.00705
Epoch: 40323, Training Loss: 0.00706
Epoch: 40323, Training Loss: 0.00866
Epoch: 40324, Training Loss: 0.00679
Epoch: 40324, Training Loss: 0.00705
Epoch: 40324, Training Loss: 0.00706
Epoch: 40324, Training Loss: 0.00866
Epoch: 40325, Training Loss: 0.00679
Epoch: 40325, Training Loss: 0.00705
Epoch: 40325, Training Loss: 0.00706
Epoch: 40325, Training Loss: 0.00866
Epoch: 40326, Training Loss: 0.00679
Epoch: 40326, Training Loss: 0.00705
Epoch: 40326, Training Loss: 0.00706
Epoch: 40326, Training Loss: 0.00866
Epoch: 40327, Training Loss: 0.00679
Epoch: 40327, Training Loss: 0.00705
Epoch: 40327, Training Loss: 0.00706
Epoch: 40327, Training Loss: 0.00866
Epoch: 40328, Training Loss: 0.00679
Epoch: 40328, Training Loss: 0.00705
Epoch: 40328, Training Loss: 0.00706
Epoch: 40328, Training Loss: 0.00866
Epoch: 40329, Training Loss: 0.00679
Epoch: 40329, Training Loss: 0.00705
Epoch: 40329, Training Loss: 0.00706
Epoch: 40329, Training Loss: 0.00865
Epoch: 40330, Training Loss: 0.00679
Epoch: 40330, Training Loss: 0.00705
Epoch: 40330, Training Loss: 0.00706
Epoch: 40330, Training Loss: 0.00865
Epoch: 40331, Training Loss: 0.00679
Epoch: 40331, Training Loss: 0.00705
Epoch: 40331, Training Loss: 0.00706
Epoch: 40331, Training Loss: 0.00865
Epoch: 40332, Training Loss: 0.00679
Epoch: 40332, Training Loss: 0.00705
Epoch: 40332, Training Loss: 0.00706
Epoch: 40332, Training Loss: 0.00865
Epoch: 40333, Training Loss: 0.00679
Epoch: 40333, Training Loss: 0.00705
Epoch: 40333, Training Loss: 0.00706
Epoch: 40333, Training Loss: 0.00865
Epoch: 40334, Training Loss: 0.00679
Epoch: 40334, Training Loss: 0.00705
Epoch: 40334, Training Loss: 0.00706
Epoch: 40334, Training Loss: 0.00865
Epoch: 40335, Training Loss: 0.00679
Epoch: 40335, Training Loss: 0.00705
Epoch: 40335, Training Loss: 0.00706
Epoch: 40335, Training Loss: 0.00865
Epoch: 40336, Training Loss: 0.00679
Epoch: 40336, Training Loss: 0.00705
Epoch: 40336, Training Loss: 0.00706
Epoch: 40336, Training Loss: 0.00865
Epoch: 40337, Training Loss: 0.00679
Epoch: 40337, Training Loss: 0.00705
Epoch: 40337, Training Loss: 0.00706
Epoch: 40337, Training Loss: 0.00865
Epoch: 40338, Training Loss: 0.00679
Epoch: 40338, Training Loss: 0.00705
Epoch: 40338, Training Loss: 0.00706
Epoch: 40338, Training Loss: 0.00865
Epoch: 40339, Training Loss: 0.00679
Epoch: 40339, Training Loss: 0.00705
Epoch: 40339, Training Loss: 0.00706
Epoch: 40339, Training Loss: 0.00865
Epoch: 40340, Training Loss: 0.00679
Epoch: 40340, Training Loss: 0.00705
Epoch: 40340, Training Loss: 0.00706
Epoch: 40340, Training Loss: 0.00865
Epoch: 40341, Training Loss: 0.00679
Epoch: 40341, Training Loss: 0.00705
Epoch: 40341, Training Loss: 0.00706
Epoch: 40341, Training Loss: 0.00865
Epoch: 40342, Training Loss: 0.00679
Epoch: 40342, Training Loss: 0.00705
Epoch: 40342, Training Loss: 0.00706
Epoch: 40342, Training Loss: 0.00865
Epoch: 40343, Training Loss: 0.00679
Epoch: 40343, Training Loss: 0.00705
Epoch: 40343, Training Loss: 0.00706
Epoch: 40343, Training Loss: 0.00865
Epoch: 40344, Training Loss: 0.00679
Epoch: 40344, Training Loss: 0.00705
Epoch: 40344, Training Loss: 0.00706
Epoch: 40344, Training Loss: 0.00865
Epoch: 40345, Training Loss: 0.00679
Epoch: 40345, Training Loss: 0.00705
Epoch: 40345, Training Loss: 0.00706
Epoch: 40345, Training Loss: 0.00865
Epoch: 40346, Training Loss: 0.00679
Epoch: 40346, Training Loss: 0.00705
Epoch: 40346, Training Loss: 0.00706
Epoch: 40346, Training Loss: 0.00865
Epoch: 40347, Training Loss: 0.00679
Epoch: 40347, Training Loss: 0.00705
Epoch: 40347, Training Loss: 0.00706
Epoch: 40347, Training Loss: 0.00865
Epoch: 40348, Training Loss: 0.00679
Epoch: 40348, Training Loss: 0.00705
Epoch: 40348, Training Loss: 0.00706
Epoch: 40348, Training Loss: 0.00865
Epoch: 40349, Training Loss: 0.00679
Epoch: 40349, Training Loss: 0.00705
Epoch: 40349, Training Loss: 0.00706
Epoch: 40349, Training Loss: 0.00865
Epoch: 40350, Training Loss: 0.00679
Epoch: 40350, Training Loss: 0.00705
Epoch: 40350, Training Loss: 0.00706
Epoch: 40350, Training Loss: 0.00865
Epoch: 40351, Training Loss: 0.00679
Epoch: 40351, Training Loss: 0.00705
Epoch: 40351, Training Loss: 0.00706
Epoch: 40351, Training Loss: 0.00865
Epoch: 40352, Training Loss: 0.00679
Epoch: 40352, Training Loss: 0.00705
Epoch: 40352, Training Loss: 0.00706
Epoch: 40352, Training Loss: 0.00865
Epoch: 40353, Training Loss: 0.00679
Epoch: 40353, Training Loss: 0.00705
Epoch: 40353, Training Loss: 0.00706
Epoch: 40353, Training Loss: 0.00865
Epoch: 40354, Training Loss: 0.00679
Epoch: 40354, Training Loss: 0.00705
Epoch: 40354, Training Loss: 0.00706
Epoch: 40354, Training Loss: 0.00865
Epoch: 40355, Training Loss: 0.00679
Epoch: 40355, Training Loss: 0.00705
Epoch: 40355, Training Loss: 0.00706
Epoch: 40355, Training Loss: 0.00865
Epoch: 40356, Training Loss: 0.00679
Epoch: 40356, Training Loss: 0.00705
Epoch: 40356, Training Loss: 0.00706
Epoch: 40356, Training Loss: 0.00865
Epoch: 40357, Training Loss: 0.00679
Epoch: 40357, Training Loss: 0.00705
Epoch: 40357, Training Loss: 0.00706
Epoch: 40357, Training Loss: 0.00865
Epoch: 40358, Training Loss: 0.00679
Epoch: 40358, Training Loss: 0.00705
Epoch: 40358, Training Loss: 0.00705
Epoch: 40358, Training Loss: 0.00865
Epoch: 40359, Training Loss: 0.00679
Epoch: 40359, Training Loss: 0.00705
Epoch: 40359, Training Loss: 0.00705
Epoch: 40359, Training Loss: 0.00865
Epoch: 40360, Training Loss: 0.00679
Epoch: 40360, Training Loss: 0.00705
Epoch: 40360, Training Loss: 0.00705
Epoch: 40360, Training Loss: 0.00865
Epoch: 40361, Training Loss: 0.00679
Epoch: 40361, Training Loss: 0.00705
Epoch: 40361, Training Loss: 0.00705
Epoch: 40361, Training Loss: 0.00865
Epoch: 40362, Training Loss: 0.00678
Epoch: 40362, Training Loss: 0.00705
Epoch: 40362, Training Loss: 0.00705
Epoch: 40362, Training Loss: 0.00865
Epoch: 40363, Training Loss: 0.00678
Epoch: 40363, Training Loss: 0.00705
Epoch: 40363, Training Loss: 0.00705
Epoch: 40363, Training Loss: 0.00865
Epoch: 40364, Training Loss: 0.00678
Epoch: 40364, Training Loss: 0.00705
Epoch: 40364, Training Loss: 0.00705
Epoch: 40364, Training Loss: 0.00865
Epoch: 40365, Training Loss: 0.00678
Epoch: 40365, Training Loss: 0.00705
Epoch: 40365, Training Loss: 0.00705
Epoch: 40365, Training Loss: 0.00865
Epoch: 40366, Training Loss: 0.00678
Epoch: 40366, Training Loss: 0.00705
Epoch: 40366, Training Loss: 0.00705
Epoch: 40366, Training Loss: 0.00865
Epoch: 40367, Training Loss: 0.00678
Epoch: 40367, Training Loss: 0.00705
Epoch: 40367, Training Loss: 0.00705
Epoch: 40367, Training Loss: 0.00865
Epoch: 40368, Training Loss: 0.00678
Epoch: 40368, Training Loss: 0.00705
Epoch: 40368, Training Loss: 0.00705
Epoch: 40368, Training Loss: 0.00865
Epoch: 40369, Training Loss: 0.00678
Epoch: 40369, Training Loss: 0.00705
Epoch: 40369, Training Loss: 0.00705
Epoch: 40369, Training Loss: 0.00865
Epoch: 40370, Training Loss: 0.00678
Epoch: 40370, Training Loss: 0.00705
Epoch: 40370, Training Loss: 0.00705
Epoch: 40370, Training Loss: 0.00865
Epoch: 40371, Training Loss: 0.00678
Epoch: 40371, Training Loss: 0.00705
Epoch: 40371, Training Loss: 0.00705
Epoch: 40371, Training Loss: 0.00865
Epoch: 40372, Training Loss: 0.00678
Epoch: 40372, Training Loss: 0.00705
Epoch: 40372, Training Loss: 0.00705
Epoch: 40372, Training Loss: 0.00865
Epoch: 40373, Training Loss: 0.00678
Epoch: 40373, Training Loss: 0.00705
Epoch: 40373, Training Loss: 0.00705
Epoch: 40373, Training Loss: 0.00865
Epoch: 40374, Training Loss: 0.00678
Epoch: 40374, Training Loss: 0.00705
Epoch: 40374, Training Loss: 0.00705
Epoch: 40374, Training Loss: 0.00865
Epoch: 40375, Training Loss: 0.00678
Epoch: 40375, Training Loss: 0.00705
Epoch: 40375, Training Loss: 0.00705
Epoch: 40375, Training Loss: 0.00865
Epoch: 40376, Training Loss: 0.00678
Epoch: 40376, Training Loss: 0.00705
Epoch: 40376, Training Loss: 0.00705
Epoch: 40376, Training Loss: 0.00865
Epoch: 40377, Training Loss: 0.00678
Epoch: 40377, Training Loss: 0.00705
Epoch: 40377, Training Loss: 0.00705
Epoch: 40377, Training Loss: 0.00865
Epoch: 40378, Training Loss: 0.00678
Epoch: 40378, Training Loss: 0.00705
Epoch: 40378, Training Loss: 0.00705
Epoch: 40378, Training Loss: 0.00865
Epoch: 40379, Training Loss: 0.00678
Epoch: 40379, Training Loss: 0.00705
Epoch: 40379, Training Loss: 0.00705
Epoch: 40379, Training Loss: 0.00865
Epoch: 40380, Training Loss: 0.00678
Epoch: 40380, Training Loss: 0.00704
Epoch: 40380, Training Loss: 0.00705
Epoch: 40380, Training Loss: 0.00865
Epoch: 40381, Training Loss: 0.00678
Epoch: 40381, Training Loss: 0.00704
Epoch: 40381, Training Loss: 0.00705
Epoch: 40381, Training Loss: 0.00865
Epoch: 40382, Training Loss: 0.00678
Epoch: 40382, Training Loss: 0.00704
Epoch: 40382, Training Loss: 0.00705
Epoch: 40382, Training Loss: 0.00865
Epoch: 40383, Training Loss: 0.00678
Epoch: 40383, Training Loss: 0.00704
Epoch: 40383, Training Loss: 0.00705
Epoch: 40383, Training Loss: 0.00865
Epoch: 40384, Training Loss: 0.00678
Epoch: 40384, Training Loss: 0.00704
Epoch: 40384, Training Loss: 0.00705
Epoch: 40384, Training Loss: 0.00865
Epoch: 40385, Training Loss: 0.00678
Epoch: 40385, Training Loss: 0.00704
Epoch: 40385, Training Loss: 0.00705
Epoch: 40385, Training Loss: 0.00865
Epoch: 40386, Training Loss: 0.00678
Epoch: 40386, Training Loss: 0.00704
Epoch: 40386, Training Loss: 0.00705
Epoch: 40386, Training Loss: 0.00865
Epoch: 40387, Training Loss: 0.00678
Epoch: 40387, Training Loss: 0.00704
Epoch: 40387, Training Loss: 0.00705
Epoch: 40387, Training Loss: 0.00865
Epoch: 40388, Training Loss: 0.00678
Epoch: 40388, Training Loss: 0.00704
Epoch: 40388, Training Loss: 0.00705
Epoch: 40388, Training Loss: 0.00865
Epoch: 40389, Training Loss: 0.00678
Epoch: 40389, Training Loss: 0.00704
Epoch: 40389, Training Loss: 0.00705
Epoch: 40389, Training Loss: 0.00865
Epoch: 40390, Training Loss: 0.00678
Epoch: 40390, Training Loss: 0.00704
Epoch: 40390, Training Loss: 0.00705
Epoch: 40390, Training Loss: 0.00865
Epoch: 40391, Training Loss: 0.00678
Epoch: 40391, Training Loss: 0.00704
Epoch: 40391, Training Loss: 0.00705
Epoch: 40391, Training Loss: 0.00865
Epoch: 40392, Training Loss: 0.00678
Epoch: 40392, Training Loss: 0.00704
Epoch: 40392, Training Loss: 0.00705
Epoch: 40392, Training Loss: 0.00865
Epoch: 40393, Training Loss: 0.00678
Epoch: 40393, Training Loss: 0.00704
Epoch: 40393, Training Loss: 0.00705
Epoch: 40393, Training Loss: 0.00865
Epoch: 40394, Training Loss: 0.00678
Epoch: 40394, Training Loss: 0.00704
Epoch: 40394, Training Loss: 0.00705
Epoch: 40394, Training Loss: 0.00865
Epoch: 40395, Training Loss: 0.00678
Epoch: 40395, Training Loss: 0.00704
Epoch: 40395, Training Loss: 0.00705
Epoch: 40395, Training Loss: 0.00865
Epoch: 40396, Training Loss: 0.00678
Epoch: 40396, Training Loss: 0.00704
Epoch: 40396, Training Loss: 0.00705
Epoch: 40396, Training Loss: 0.00865
Epoch: 40397, Training Loss: 0.00678
Epoch: 40397, Training Loss: 0.00704
Epoch: 40397, Training Loss: 0.00705
Epoch: 40397, Training Loss: 0.00865
Epoch: 40398, Training Loss: 0.00678
Epoch: 40398, Training Loss: 0.00704
Epoch: 40398, Training Loss: 0.00705
Epoch: 40398, Training Loss: 0.00865
Epoch: 40399, Training Loss: 0.00678
Epoch: 40399, Training Loss: 0.00704
Epoch: 40399, Training Loss: 0.00705
Epoch: 40399, Training Loss: 0.00865
Epoch: 40400, Training Loss: 0.00678
Epoch: 40400, Training Loss: 0.00704
Epoch: 40400, Training Loss: 0.00705
Epoch: 40400, Training Loss: 0.00865
Epoch: 40401, Training Loss: 0.00678
Epoch: 40401, Training Loss: 0.00704
Epoch: 40401, Training Loss: 0.00705
Epoch: 40401, Training Loss: 0.00865
Epoch: 40402, Training Loss: 0.00678
Epoch: 40402, Training Loss: 0.00704
Epoch: 40402, Training Loss: 0.00705
Epoch: 40402, Training Loss: 0.00865
Epoch: 40403, Training Loss: 0.00678
Epoch: 40403, Training Loss: 0.00704
Epoch: 40403, Training Loss: 0.00705
Epoch: 40403, Training Loss: 0.00865
Epoch: 40404, Training Loss: 0.00678
Epoch: 40404, Training Loss: 0.00704
Epoch: 40404, Training Loss: 0.00705
Epoch: 40404, Training Loss: 0.00865
Epoch: 40405, Training Loss: 0.00678
Epoch: 40405, Training Loss: 0.00704
Epoch: 40405, Training Loss: 0.00705
Epoch: 40405, Training Loss: 0.00865
Epoch: 40406, Training Loss: 0.00678
Epoch: 40406, Training Loss: 0.00704
Epoch: 40406, Training Loss: 0.00705
Epoch: 40406, Training Loss: 0.00865
Epoch: 40407, Training Loss: 0.00678
Epoch: 40407, Training Loss: 0.00704
Epoch: 40407, Training Loss: 0.00705
Epoch: 40407, Training Loss: 0.00865
Epoch: 40408, Training Loss: 0.00678
Epoch: 40408, Training Loss: 0.00704
Epoch: 40408, Training Loss: 0.00705
Epoch: 40408, Training Loss: 0.00865
Epoch: 40409, Training Loss: 0.00678
Epoch: 40409, Training Loss: 0.00704
Epoch: 40409, Training Loss: 0.00705
Epoch: 40409, Training Loss: 0.00865
Epoch: 40410, Training Loss: 0.00678
Epoch: 40410, Training Loss: 0.00704
Epoch: 40410, Training Loss: 0.00705
Epoch: 40410, Training Loss: 0.00865
Epoch: 40411, Training Loss: 0.00678
Epoch: 40411, Training Loss: 0.00704
Epoch: 40411, Training Loss: 0.00705
Epoch: 40411, Training Loss: 0.00865
Epoch: 40412, Training Loss: 0.00678
Epoch: 40412, Training Loss: 0.00704
Epoch: 40412, Training Loss: 0.00705
Epoch: 40412, Training Loss: 0.00865
Epoch: 40413, Training Loss: 0.00678
Epoch: 40413, Training Loss: 0.00704
Epoch: 40413, Training Loss: 0.00705
Epoch: 40413, Training Loss: 0.00864
Epoch: 40414, Training Loss: 0.00678
Epoch: 40414, Training Loss: 0.00704
Epoch: 40414, Training Loss: 0.00705
Epoch: 40414, Training Loss: 0.00864
Epoch: 40415, Training Loss: 0.00678
Epoch: 40415, Training Loss: 0.00704
Epoch: 40415, Training Loss: 0.00705
Epoch: 40415, Training Loss: 0.00864
Epoch: 40416, Training Loss: 0.00678
Epoch: 40416, Training Loss: 0.00704
Epoch: 40416, Training Loss: 0.00705
Epoch: 40416, Training Loss: 0.00864
Epoch: 40417, Training Loss: 0.00678
Epoch: 40417, Training Loss: 0.00704
Epoch: 40417, Training Loss: 0.00705
Epoch: 40417, Training Loss: 0.00864
Epoch: 40418, Training Loss: 0.00678
Epoch: 40418, Training Loss: 0.00704
Epoch: 40418, Training Loss: 0.00705
Epoch: 40418, Training Loss: 0.00864
Epoch: 40419, Training Loss: 0.00678
Epoch: 40419, Training Loss: 0.00704
Epoch: 40419, Training Loss: 0.00705
Epoch: 40419, Training Loss: 0.00864
Epoch: 40420, Training Loss: 0.00678
Epoch: 40420, Training Loss: 0.00704
Epoch: 40420, Training Loss: 0.00705
Epoch: 40420, Training Loss: 0.00864
Epoch: 40421, Training Loss: 0.00678
Epoch: 40421, Training Loss: 0.00704
Epoch: 40421, Training Loss: 0.00705
Epoch: 40421, Training Loss: 0.00864
Epoch: 40422, Training Loss: 0.00678
Epoch: 40422, Training Loss: 0.00704
Epoch: 40422, Training Loss: 0.00705
Epoch: 40422, Training Loss: 0.00864
Epoch: 40423, Training Loss: 0.00678
Epoch: 40423, Training Loss: 0.00704
Epoch: 40423, Training Loss: 0.00705
Epoch: 40423, Training Loss: 0.00864
Epoch: 40424, Training Loss: 0.00678
Epoch: 40424, Training Loss: 0.00704
Epoch: 40424, Training Loss: 0.00705
Epoch: 40424, Training Loss: 0.00864
Epoch: 40425, Training Loss: 0.00678
Epoch: 40425, Training Loss: 0.00704
Epoch: 40425, Training Loss: 0.00705
Epoch: 40425, Training Loss: 0.00864
Epoch: 40426, Training Loss: 0.00678
Epoch: 40426, Training Loss: 0.00704
Epoch: 40426, Training Loss: 0.00705
Epoch: 40426, Training Loss: 0.00864
Epoch: 40427, Training Loss: 0.00678
Epoch: 40427, Training Loss: 0.00704
Epoch: 40427, Training Loss: 0.00705
Epoch: 40427, Training Loss: 0.00864
Epoch: 40428, Training Loss: 0.00678
Epoch: 40428, Training Loss: 0.00704
Epoch: 40428, Training Loss: 0.00705
Epoch: 40428, Training Loss: 0.00864
Epoch: 40429, Training Loss: 0.00678
Epoch: 40429, Training Loss: 0.00704
Epoch: 40429, Training Loss: 0.00705
Epoch: 40429, Training Loss: 0.00864
Epoch: 40430, Training Loss: 0.00678
Epoch: 40430, Training Loss: 0.00704
Epoch: 40430, Training Loss: 0.00705
Epoch: 40430, Training Loss: 0.00864
Epoch: 40431, Training Loss: 0.00678
Epoch: 40431, Training Loss: 0.00704
Epoch: 40431, Training Loss: 0.00705
Epoch: 40431, Training Loss: 0.00864
Epoch: 40432, Training Loss: 0.00678
Epoch: 40432, Training Loss: 0.00704
Epoch: 40432, Training Loss: 0.00705
Epoch: 40432, Training Loss: 0.00864
Epoch: 40433, Training Loss: 0.00678
Epoch: 40433, Training Loss: 0.00704
Epoch: 40433, Training Loss: 0.00705
Epoch: 40433, Training Loss: 0.00864
Epoch: 40434, Training Loss: 0.00678
Epoch: 40434, Training Loss: 0.00704
Epoch: 40434, Training Loss: 0.00705
Epoch: 40434, Training Loss: 0.00864
Epoch: 40435, Training Loss: 0.00678
Epoch: 40435, Training Loss: 0.00704
Epoch: 40435, Training Loss: 0.00705
Epoch: 40435, Training Loss: 0.00864
Epoch: 40436, Training Loss: 0.00678
Epoch: 40436, Training Loss: 0.00704
Epoch: 40436, Training Loss: 0.00705
Epoch: 40436, Training Loss: 0.00864
Epoch: 40437, Training Loss: 0.00678
Epoch: 40437, Training Loss: 0.00704
Epoch: 40437, Training Loss: 0.00705
Epoch: 40437, Training Loss: 0.00864
Epoch: 40438, Training Loss: 0.00678
Epoch: 40438, Training Loss: 0.00704
Epoch: 40438, Training Loss: 0.00705
Epoch: 40438, Training Loss: 0.00864
Epoch: 40439, Training Loss: 0.00678
Epoch: 40439, Training Loss: 0.00704
Epoch: 40439, Training Loss: 0.00705
Epoch: 40439, Training Loss: 0.00864
Epoch: 40440, Training Loss: 0.00678
Epoch: 40440, Training Loss: 0.00704
Epoch: 40440, Training Loss: 0.00705
Epoch: 40440, Training Loss: 0.00864
Epoch: 40441, Training Loss: 0.00678
Epoch: 40441, Training Loss: 0.00704
Epoch: 40441, Training Loss: 0.00705
Epoch: 40441, Training Loss: 0.00864
Epoch: 40442, Training Loss: 0.00678
Epoch: 40442, Training Loss: 0.00704
Epoch: 40442, Training Loss: 0.00705
Epoch: 40442, Training Loss: 0.00864
Epoch: 40443, Training Loss: 0.00678
Epoch: 40443, Training Loss: 0.00704
Epoch: 40443, Training Loss: 0.00705
Epoch: 40443, Training Loss: 0.00864
Epoch: 40444, Training Loss: 0.00678
Epoch: 40444, Training Loss: 0.00704
Epoch: 40444, Training Loss: 0.00705
Epoch: 40444, Training Loss: 0.00864
Epoch: 40445, Training Loss: 0.00678
Epoch: 40445, Training Loss: 0.00704
Epoch: 40445, Training Loss: 0.00705
Epoch: 40445, Training Loss: 0.00864
Epoch: 40446, Training Loss: 0.00678
Epoch: 40446, Training Loss: 0.00704
Epoch: 40446, Training Loss: 0.00705
Epoch: 40446, Training Loss: 0.00864
Epoch: 40447, Training Loss: 0.00678
Epoch: 40447, Training Loss: 0.00704
Epoch: 40447, Training Loss: 0.00705
Epoch: 40447, Training Loss: 0.00864
Epoch: 40448, Training Loss: 0.00678
Epoch: 40448, Training Loss: 0.00704
Epoch: 40448, Training Loss: 0.00705
Epoch: 40448, Training Loss: 0.00864
Epoch: 40449, Training Loss: 0.00678
Epoch: 40449, Training Loss: 0.00704
Epoch: 40449, Training Loss: 0.00705
Epoch: 40449, Training Loss: 0.00864
Epoch: 40450, Training Loss: 0.00678
Epoch: 40450, Training Loss: 0.00704
Epoch: 40450, Training Loss: 0.00705
Epoch: 40450, Training Loss: 0.00864
Epoch: 40451, Training Loss: 0.00678
Epoch: 40451, Training Loss: 0.00704
Epoch: 40451, Training Loss: 0.00705
Epoch: 40451, Training Loss: 0.00864
Epoch: 40452, Training Loss: 0.00678
Epoch: 40452, Training Loss: 0.00704
Epoch: 40452, Training Loss: 0.00705
Epoch: 40452, Training Loss: 0.00864
Epoch: 40453, Training Loss: 0.00678
Epoch: 40453, Training Loss: 0.00704
Epoch: 40453, Training Loss: 0.00705
Epoch: 40453, Training Loss: 0.00864
Epoch: 40454, Training Loss: 0.00678
Epoch: 40454, Training Loss: 0.00704
Epoch: 40454, Training Loss: 0.00705
Epoch: 40454, Training Loss: 0.00864
Epoch: 40455, Training Loss: 0.00678
Epoch: 40455, Training Loss: 0.00704
Epoch: 40455, Training Loss: 0.00705
Epoch: 40455, Training Loss: 0.00864
Epoch: 40456, Training Loss: 0.00678
Epoch: 40456, Training Loss: 0.00704
Epoch: 40456, Training Loss: 0.00705
Epoch: 40456, Training Loss: 0.00864
Epoch: 40457, Training Loss: 0.00678
Epoch: 40457, Training Loss: 0.00704
Epoch: 40457, Training Loss: 0.00705
Epoch: 40457, Training Loss: 0.00864
Epoch: 40458, Training Loss: 0.00678
Epoch: 40458, Training Loss: 0.00704
Epoch: 40458, Training Loss: 0.00705
Epoch: 40458, Training Loss: 0.00864
Epoch: 40459, Training Loss: 0.00678
Epoch: 40459, Training Loss: 0.00704
Epoch: 40459, Training Loss: 0.00705
Epoch: 40459, Training Loss: 0.00864
Epoch: 40460, Training Loss: 0.00678
Epoch: 40460, Training Loss: 0.00704
Epoch: 40460, Training Loss: 0.00705
Epoch: 40460, Training Loss: 0.00864
Epoch: 40461, Training Loss: 0.00678
Epoch: 40461, Training Loss: 0.00704
Epoch: 40461, Training Loss: 0.00705
Epoch: 40461, Training Loss: 0.00864
Epoch: 40462, Training Loss: 0.00678
Epoch: 40462, Training Loss: 0.00704
Epoch: 40462, Training Loss: 0.00705
Epoch: 40462, Training Loss: 0.00864
Epoch: 40463, Training Loss: 0.00678
Epoch: 40463, Training Loss: 0.00704
Epoch: 40463, Training Loss: 0.00704
Epoch: 40463, Training Loss: 0.00864
Epoch: 40464, Training Loss: 0.00678
Epoch: 40464, Training Loss: 0.00704
Epoch: 40464, Training Loss: 0.00704
Epoch: 40464, Training Loss: 0.00864
Epoch: 40465, Training Loss: 0.00678
Epoch: 40465, Training Loss: 0.00704
Epoch: 40465, Training Loss: 0.00704
Epoch: 40465, Training Loss: 0.00864
Epoch: 40466, Training Loss: 0.00678
Epoch: 40466, Training Loss: 0.00704
Epoch: 40466, Training Loss: 0.00704
Epoch: 40466, Training Loss: 0.00864
Epoch: 40467, Training Loss: 0.00678
Epoch: 40467, Training Loss: 0.00704
Epoch: 40467, Training Loss: 0.00704
Epoch: 40467, Training Loss: 0.00864
Epoch: 40468, Training Loss: 0.00678
Epoch: 40468, Training Loss: 0.00704
Epoch: 40468, Training Loss: 0.00704
Epoch: 40468, Training Loss: 0.00864
Epoch: 40469, Training Loss: 0.00678
Epoch: 40469, Training Loss: 0.00704
Epoch: 40469, Training Loss: 0.00704
Epoch: 40469, Training Loss: 0.00864
Epoch: 40470, Training Loss: 0.00678
Epoch: 40470, Training Loss: 0.00704
Epoch: 40470, Training Loss: 0.00704
Epoch: 40470, Training Loss: 0.00864
Epoch: 40471, Training Loss: 0.00678
Epoch: 40471, Training Loss: 0.00704
Epoch: 40471, Training Loss: 0.00704
Epoch: 40471, Training Loss: 0.00864
Epoch: 40472, Training Loss: 0.00678
Epoch: 40472, Training Loss: 0.00704
Epoch: 40472, Training Loss: 0.00704
Epoch: 40472, Training Loss: 0.00864
Epoch: 40473, Training Loss: 0.00677
Epoch: 40473, Training Loss: 0.00704
Epoch: 40473, Training Loss: 0.00704
Epoch: 40473, Training Loss: 0.00864
Epoch: 40474, Training Loss: 0.00677
Epoch: 40474, Training Loss: 0.00704
Epoch: 40474, Training Loss: 0.00704
Epoch: 40474, Training Loss: 0.00864
Epoch: 40475, Training Loss: 0.00677
Epoch: 40475, Training Loss: 0.00704
Epoch: 40475, Training Loss: 0.00704
Epoch: 40475, Training Loss: 0.00864
Epoch: 40476, Training Loss: 0.00677
Epoch: 40476, Training Loss: 0.00704
Epoch: 40476, Training Loss: 0.00704
Epoch: 40476, Training Loss: 0.00864
Epoch: 40477, Training Loss: 0.00677
Epoch: 40477, Training Loss: 0.00704
Epoch: 40477, Training Loss: 0.00704
Epoch: 40477, Training Loss: 0.00864
Epoch: 40478, Training Loss: 0.00677
Epoch: 40478, Training Loss: 0.00704
Epoch: 40478, Training Loss: 0.00704
Epoch: 40478, Training Loss: 0.00864
Epoch: 40479, Training Loss: 0.00677
Epoch: 40479, Training Loss: 0.00704
Epoch: 40479, Training Loss: 0.00704
Epoch: 40479, Training Loss: 0.00864
Epoch: 40480, Training Loss: 0.00677
Epoch: 40480, Training Loss: 0.00704
Epoch: 40480, Training Loss: 0.00704
Epoch: 40480, Training Loss: 0.00864
Epoch: 40481, Training Loss: 0.00677
Epoch: 40481, Training Loss: 0.00704
Epoch: 40481, Training Loss: 0.00704
Epoch: 40481, Training Loss: 0.00864
Epoch: 40482, Training Loss: 0.00677
Epoch: 40482, Training Loss: 0.00704
Epoch: 40482, Training Loss: 0.00704
Epoch: 40482, Training Loss: 0.00864
Epoch: 40483, Training Loss: 0.00677
Epoch: 40483, Training Loss: 0.00704
Epoch: 40483, Training Loss: 0.00704
Epoch: 40483, Training Loss: 0.00864
Epoch: 40484, Training Loss: 0.00677
Epoch: 40484, Training Loss: 0.00704
Epoch: 40484, Training Loss: 0.00704
Epoch: 40484, Training Loss: 0.00864
Epoch: 40485, Training Loss: 0.00677
Epoch: 40485, Training Loss: 0.00703
Epoch: 40485, Training Loss: 0.00704
Epoch: 40485, Training Loss: 0.00864
Epoch: 40486, Training Loss: 0.00677
Epoch: 40486, Training Loss: 0.00703
Epoch: 40486, Training Loss: 0.00704
Epoch: 40486, Training Loss: 0.00864
Epoch: 40487, Training Loss: 0.00677
Epoch: 40487, Training Loss: 0.00703
Epoch: 40487, Training Loss: 0.00704
Epoch: 40487, Training Loss: 0.00864
... [output truncated: the four per-sample losses repeat almost unchanged each epoch, drifting from roughly 0.00677 / 0.00703 / 0.00704 / 0.00864 at epoch 40487 down to 0.00675 / 0.00701 / 0.00702 / 0.00861 by epoch 40729] ...
Epoch: 40730, Training Loss: 0.00676
Epoch: 40730, Training Loss: 0.00701
Epoch: 40730, Training Loss: 0.00702
Epoch: 40730, Training Loss: 0.00861
Epoch: 40731, Training Loss: 0.00675
Epoch: 40731, Training Loss: 0.00701
Epoch: 40731, Training Loss: 0.00702
Epoch: 40731, Training Loss: 0.00861
Epoch: 40732, Training Loss: 0.00675
Epoch: 40732, Training Loss: 0.00701
Epoch: 40732, Training Loss: 0.00702
Epoch: 40732, Training Loss: 0.00861
Epoch: 40733, Training Loss: 0.00675
Epoch: 40733, Training Loss: 0.00701
Epoch: 40733, Training Loss: 0.00702
Epoch: 40733, Training Loss: 0.00861
Epoch: 40734, Training Loss: 0.00675
Epoch: 40734, Training Loss: 0.00701
Epoch: 40734, Training Loss: 0.00702
Epoch: 40734, Training Loss: 0.00861
Epoch: 40735, Training Loss: 0.00675
Epoch: 40735, Training Loss: 0.00701
Epoch: 40735, Training Loss: 0.00702
Epoch: 40735, Training Loss: 0.00861
Epoch: 40736, Training Loss: 0.00675
Epoch: 40736, Training Loss: 0.00701
Epoch: 40736, Training Loss: 0.00702
Epoch: 40736, Training Loss: 0.00861
Epoch: 40737, Training Loss: 0.00675
Epoch: 40737, Training Loss: 0.00701
Epoch: 40737, Training Loss: 0.00702
Epoch: 40737, Training Loss: 0.00861
Epoch: 40738, Training Loss: 0.00675
Epoch: 40738, Training Loss: 0.00701
Epoch: 40738, Training Loss: 0.00702
Epoch: 40738, Training Loss: 0.00861
Epoch: 40739, Training Loss: 0.00675
Epoch: 40739, Training Loss: 0.00701
Epoch: 40739, Training Loss: 0.00702
Epoch: 40739, Training Loss: 0.00861
Epoch: 40740, Training Loss: 0.00675
Epoch: 40740, Training Loss: 0.00701
Epoch: 40740, Training Loss: 0.00702
Epoch: 40740, Training Loss: 0.00861
Epoch: 40741, Training Loss: 0.00675
Epoch: 40741, Training Loss: 0.00701
Epoch: 40741, Training Loss: 0.00702
Epoch: 40741, Training Loss: 0.00861
Epoch: 40742, Training Loss: 0.00675
Epoch: 40742, Training Loss: 0.00701
Epoch: 40742, Training Loss: 0.00702
Epoch: 40742, Training Loss: 0.00861
Epoch: 40743, Training Loss: 0.00675
Epoch: 40743, Training Loss: 0.00701
Epoch: 40743, Training Loss: 0.00702
Epoch: 40743, Training Loss: 0.00861
Epoch: 40744, Training Loss: 0.00675
Epoch: 40744, Training Loss: 0.00701
Epoch: 40744, Training Loss: 0.00702
Epoch: 40744, Training Loss: 0.00861
Epoch: 40745, Training Loss: 0.00675
Epoch: 40745, Training Loss: 0.00701
Epoch: 40745, Training Loss: 0.00702
Epoch: 40745, Training Loss: 0.00861
Epoch: 40746, Training Loss: 0.00675
Epoch: 40746, Training Loss: 0.00701
Epoch: 40746, Training Loss: 0.00702
Epoch: 40746, Training Loss: 0.00861
Epoch: 40747, Training Loss: 0.00675
Epoch: 40747, Training Loss: 0.00701
Epoch: 40747, Training Loss: 0.00702
Epoch: 40747, Training Loss: 0.00861
Epoch: 40748, Training Loss: 0.00675
Epoch: 40748, Training Loss: 0.00701
Epoch: 40748, Training Loss: 0.00702
Epoch: 40748, Training Loss: 0.00861
Epoch: 40749, Training Loss: 0.00675
Epoch: 40749, Training Loss: 0.00701
Epoch: 40749, Training Loss: 0.00702
Epoch: 40749, Training Loss: 0.00861
Epoch: 40750, Training Loss: 0.00675
Epoch: 40750, Training Loss: 0.00701
Epoch: 40750, Training Loss: 0.00702
Epoch: 40750, Training Loss: 0.00861
Epoch: 40751, Training Loss: 0.00675
Epoch: 40751, Training Loss: 0.00701
Epoch: 40751, Training Loss: 0.00702
Epoch: 40751, Training Loss: 0.00861
Epoch: 40752, Training Loss: 0.00675
Epoch: 40752, Training Loss: 0.00701
Epoch: 40752, Training Loss: 0.00702
Epoch: 40752, Training Loss: 0.00861
Epoch: 40753, Training Loss: 0.00675
Epoch: 40753, Training Loss: 0.00701
Epoch: 40753, Training Loss: 0.00702
Epoch: 40753, Training Loss: 0.00860
Epoch: 40754, Training Loss: 0.00675
Epoch: 40754, Training Loss: 0.00701
Epoch: 40754, Training Loss: 0.00702
Epoch: 40754, Training Loss: 0.00860
Epoch: 40755, Training Loss: 0.00675
Epoch: 40755, Training Loss: 0.00701
Epoch: 40755, Training Loss: 0.00702
Epoch: 40755, Training Loss: 0.00860
Epoch: 40756, Training Loss: 0.00675
Epoch: 40756, Training Loss: 0.00701
Epoch: 40756, Training Loss: 0.00702
Epoch: 40756, Training Loss: 0.00860
Epoch: 40757, Training Loss: 0.00675
Epoch: 40757, Training Loss: 0.00701
Epoch: 40757, Training Loss: 0.00702
Epoch: 40757, Training Loss: 0.00860
Epoch: 40758, Training Loss: 0.00675
Epoch: 40758, Training Loss: 0.00701
Epoch: 40758, Training Loss: 0.00702
Epoch: 40758, Training Loss: 0.00860
Epoch: 40759, Training Loss: 0.00675
Epoch: 40759, Training Loss: 0.00701
Epoch: 40759, Training Loss: 0.00702
Epoch: 40759, Training Loss: 0.00860
Epoch: 40760, Training Loss: 0.00675
Epoch: 40760, Training Loss: 0.00701
Epoch: 40760, Training Loss: 0.00702
Epoch: 40760, Training Loss: 0.00860
Epoch: 40761, Training Loss: 0.00675
Epoch: 40761, Training Loss: 0.00701
Epoch: 40761, Training Loss: 0.00702
Epoch: 40761, Training Loss: 0.00860
Epoch: 40762, Training Loss: 0.00675
Epoch: 40762, Training Loss: 0.00701
Epoch: 40762, Training Loss: 0.00702
Epoch: 40762, Training Loss: 0.00860
Epoch: 40763, Training Loss: 0.00675
Epoch: 40763, Training Loss: 0.00701
Epoch: 40763, Training Loss: 0.00702
Epoch: 40763, Training Loss: 0.00860
Epoch: 40764, Training Loss: 0.00675
Epoch: 40764, Training Loss: 0.00701
Epoch: 40764, Training Loss: 0.00702
Epoch: 40764, Training Loss: 0.00860
Epoch: 40765, Training Loss: 0.00675
Epoch: 40765, Training Loss: 0.00701
Epoch: 40765, Training Loss: 0.00702
Epoch: 40765, Training Loss: 0.00860
Epoch: 40766, Training Loss: 0.00675
Epoch: 40766, Training Loss: 0.00701
Epoch: 40766, Training Loss: 0.00702
Epoch: 40766, Training Loss: 0.00860
Epoch: 40767, Training Loss: 0.00675
Epoch: 40767, Training Loss: 0.00701
Epoch: 40767, Training Loss: 0.00702
Epoch: 40767, Training Loss: 0.00860
Epoch: 40768, Training Loss: 0.00675
Epoch: 40768, Training Loss: 0.00701
Epoch: 40768, Training Loss: 0.00702
Epoch: 40768, Training Loss: 0.00860
Epoch: 40769, Training Loss: 0.00675
Epoch: 40769, Training Loss: 0.00701
Epoch: 40769, Training Loss: 0.00702
Epoch: 40769, Training Loss: 0.00860
Epoch: 40770, Training Loss: 0.00675
Epoch: 40770, Training Loss: 0.00701
Epoch: 40770, Training Loss: 0.00702
Epoch: 40770, Training Loss: 0.00860
Epoch: 40771, Training Loss: 0.00675
Epoch: 40771, Training Loss: 0.00701
Epoch: 40771, Training Loss: 0.00702
Epoch: 40771, Training Loss: 0.00860
Epoch: 40772, Training Loss: 0.00675
Epoch: 40772, Training Loss: 0.00701
Epoch: 40772, Training Loss: 0.00702
Epoch: 40772, Training Loss: 0.00860
Epoch: 40773, Training Loss: 0.00675
Epoch: 40773, Training Loss: 0.00701
Epoch: 40773, Training Loss: 0.00702
Epoch: 40773, Training Loss: 0.00860
Epoch: 40774, Training Loss: 0.00675
Epoch: 40774, Training Loss: 0.00701
Epoch: 40774, Training Loss: 0.00702
Epoch: 40774, Training Loss: 0.00860
Epoch: 40775, Training Loss: 0.00675
Epoch: 40775, Training Loss: 0.00701
Epoch: 40775, Training Loss: 0.00702
Epoch: 40775, Training Loss: 0.00860
Epoch: 40776, Training Loss: 0.00675
Epoch: 40776, Training Loss: 0.00701
Epoch: 40776, Training Loss: 0.00702
Epoch: 40776, Training Loss: 0.00860
Epoch: 40777, Training Loss: 0.00675
Epoch: 40777, Training Loss: 0.00701
Epoch: 40777, Training Loss: 0.00702
Epoch: 40777, Training Loss: 0.00860
Epoch: 40778, Training Loss: 0.00675
Epoch: 40778, Training Loss: 0.00701
Epoch: 40778, Training Loss: 0.00702
Epoch: 40778, Training Loss: 0.00860
Epoch: 40779, Training Loss: 0.00675
Epoch: 40779, Training Loss: 0.00701
Epoch: 40779, Training Loss: 0.00701
Epoch: 40779, Training Loss: 0.00860
Epoch: 40780, Training Loss: 0.00675
Epoch: 40780, Training Loss: 0.00701
Epoch: 40780, Training Loss: 0.00701
Epoch: 40780, Training Loss: 0.00860
Epoch: 40781, Training Loss: 0.00675
Epoch: 40781, Training Loss: 0.00701
Epoch: 40781, Training Loss: 0.00701
Epoch: 40781, Training Loss: 0.00860
Epoch: 40782, Training Loss: 0.00675
Epoch: 40782, Training Loss: 0.00701
Epoch: 40782, Training Loss: 0.00701
Epoch: 40782, Training Loss: 0.00860
Epoch: 40783, Training Loss: 0.00675
Epoch: 40783, Training Loss: 0.00701
Epoch: 40783, Training Loss: 0.00701
Epoch: 40783, Training Loss: 0.00860
Epoch: 40784, Training Loss: 0.00675
Epoch: 40784, Training Loss: 0.00701
Epoch: 40784, Training Loss: 0.00701
Epoch: 40784, Training Loss: 0.00860
Epoch: 40785, Training Loss: 0.00675
Epoch: 40785, Training Loss: 0.00701
Epoch: 40785, Training Loss: 0.00701
Epoch: 40785, Training Loss: 0.00860
Epoch: 40786, Training Loss: 0.00675
Epoch: 40786, Training Loss: 0.00701
Epoch: 40786, Training Loss: 0.00701
Epoch: 40786, Training Loss: 0.00860
Epoch: 40787, Training Loss: 0.00675
Epoch: 40787, Training Loss: 0.00701
Epoch: 40787, Training Loss: 0.00701
Epoch: 40787, Training Loss: 0.00860
Epoch: 40788, Training Loss: 0.00675
Epoch: 40788, Training Loss: 0.00701
Epoch: 40788, Training Loss: 0.00701
Epoch: 40788, Training Loss: 0.00860
Epoch: 40789, Training Loss: 0.00675
Epoch: 40789, Training Loss: 0.00701
Epoch: 40789, Training Loss: 0.00701
Epoch: 40789, Training Loss: 0.00860
Epoch: 40790, Training Loss: 0.00675
Epoch: 40790, Training Loss: 0.00701
Epoch: 40790, Training Loss: 0.00701
Epoch: 40790, Training Loss: 0.00860
Epoch: 40791, Training Loss: 0.00675
Epoch: 40791, Training Loss: 0.00701
Epoch: 40791, Training Loss: 0.00701
Epoch: 40791, Training Loss: 0.00860
Epoch: 40792, Training Loss: 0.00675
Epoch: 40792, Training Loss: 0.00701
Epoch: 40792, Training Loss: 0.00701
Epoch: 40792, Training Loss: 0.00860
Epoch: 40793, Training Loss: 0.00675
Epoch: 40793, Training Loss: 0.00701
Epoch: 40793, Training Loss: 0.00701
Epoch: 40793, Training Loss: 0.00860
Epoch: 40794, Training Loss: 0.00675
Epoch: 40794, Training Loss: 0.00701
Epoch: 40794, Training Loss: 0.00701
Epoch: 40794, Training Loss: 0.00860
Epoch: 40795, Training Loss: 0.00675
Epoch: 40795, Training Loss: 0.00701
Epoch: 40795, Training Loss: 0.00701
Epoch: 40795, Training Loss: 0.00860
Epoch: 40796, Training Loss: 0.00675
Epoch: 40796, Training Loss: 0.00701
Epoch: 40796, Training Loss: 0.00701
Epoch: 40796, Training Loss: 0.00860
Epoch: 40797, Training Loss: 0.00675
Epoch: 40797, Training Loss: 0.00701
Epoch: 40797, Training Loss: 0.00701
Epoch: 40797, Training Loss: 0.00860
Epoch: 40798, Training Loss: 0.00675
Epoch: 40798, Training Loss: 0.00701
Epoch: 40798, Training Loss: 0.00701
Epoch: 40798, Training Loss: 0.00860
Epoch: 40799, Training Loss: 0.00675
Epoch: 40799, Training Loss: 0.00701
Epoch: 40799, Training Loss: 0.00701
Epoch: 40799, Training Loss: 0.00860
Epoch: 40800, Training Loss: 0.00675
Epoch: 40800, Training Loss: 0.00701
Epoch: 40800, Training Loss: 0.00701
Epoch: 40800, Training Loss: 0.00860
Epoch: 40801, Training Loss: 0.00675
Epoch: 40801, Training Loss: 0.00701
Epoch: 40801, Training Loss: 0.00701
Epoch: 40801, Training Loss: 0.00860
Epoch: 40802, Training Loss: 0.00675
Epoch: 40802, Training Loss: 0.00700
Epoch: 40802, Training Loss: 0.00701
Epoch: 40802, Training Loss: 0.00860
Epoch: 40803, Training Loss: 0.00675
Epoch: 40803, Training Loss: 0.00700
Epoch: 40803, Training Loss: 0.00701
Epoch: 40803, Training Loss: 0.00860
Epoch: 40804, Training Loss: 0.00675
Epoch: 40804, Training Loss: 0.00700
Epoch: 40804, Training Loss: 0.00701
Epoch: 40804, Training Loss: 0.00860
Epoch: 40805, Training Loss: 0.00675
Epoch: 40805, Training Loss: 0.00700
Epoch: 40805, Training Loss: 0.00701
Epoch: 40805, Training Loss: 0.00860
Epoch: 40806, Training Loss: 0.00675
Epoch: 40806, Training Loss: 0.00700
Epoch: 40806, Training Loss: 0.00701
Epoch: 40806, Training Loss: 0.00860
Epoch: 40807, Training Loss: 0.00675
Epoch: 40807, Training Loss: 0.00700
Epoch: 40807, Training Loss: 0.00701
Epoch: 40807, Training Loss: 0.00860
Epoch: 40808, Training Loss: 0.00674
Epoch: 40808, Training Loss: 0.00700
Epoch: 40808, Training Loss: 0.00701
Epoch: 40808, Training Loss: 0.00860
Epoch: 40809, Training Loss: 0.00674
Epoch: 40809, Training Loss: 0.00700
Epoch: 40809, Training Loss: 0.00701
Epoch: 40809, Training Loss: 0.00860
Epoch: 40810, Training Loss: 0.00674
Epoch: 40810, Training Loss: 0.00700
Epoch: 40810, Training Loss: 0.00701
Epoch: 40810, Training Loss: 0.00860
Epoch: 40811, Training Loss: 0.00674
Epoch: 40811, Training Loss: 0.00700
Epoch: 40811, Training Loss: 0.00701
Epoch: 40811, Training Loss: 0.00860
Epoch: 40812, Training Loss: 0.00674
Epoch: 40812, Training Loss: 0.00700
Epoch: 40812, Training Loss: 0.00701
Epoch: 40812, Training Loss: 0.00860
Epoch: 40813, Training Loss: 0.00674
Epoch: 40813, Training Loss: 0.00700
Epoch: 40813, Training Loss: 0.00701
Epoch: 40813, Training Loss: 0.00860
Epoch: 40814, Training Loss: 0.00674
Epoch: 40814, Training Loss: 0.00700
Epoch: 40814, Training Loss: 0.00701
Epoch: 40814, Training Loss: 0.00860
Epoch: 40815, Training Loss: 0.00674
Epoch: 40815, Training Loss: 0.00700
Epoch: 40815, Training Loss: 0.00701
Epoch: 40815, Training Loss: 0.00860
Epoch: 40816, Training Loss: 0.00674
Epoch: 40816, Training Loss: 0.00700
Epoch: 40816, Training Loss: 0.00701
Epoch: 40816, Training Loss: 0.00860
Epoch: 40817, Training Loss: 0.00674
Epoch: 40817, Training Loss: 0.00700
Epoch: 40817, Training Loss: 0.00701
Epoch: 40817, Training Loss: 0.00860
Epoch: 40818, Training Loss: 0.00674
Epoch: 40818, Training Loss: 0.00700
Epoch: 40818, Training Loss: 0.00701
Epoch: 40818, Training Loss: 0.00860
Epoch: 40819, Training Loss: 0.00674
Epoch: 40819, Training Loss: 0.00700
Epoch: 40819, Training Loss: 0.00701
Epoch: 40819, Training Loss: 0.00860
Epoch: 40820, Training Loss: 0.00674
Epoch: 40820, Training Loss: 0.00700
Epoch: 40820, Training Loss: 0.00701
Epoch: 40820, Training Loss: 0.00860
Epoch: 40821, Training Loss: 0.00674
Epoch: 40821, Training Loss: 0.00700
Epoch: 40821, Training Loss: 0.00701
Epoch: 40821, Training Loss: 0.00860
Epoch: 40822, Training Loss: 0.00674
Epoch: 40822, Training Loss: 0.00700
Epoch: 40822, Training Loss: 0.00701
Epoch: 40822, Training Loss: 0.00860
Epoch: 40823, Training Loss: 0.00674
Epoch: 40823, Training Loss: 0.00700
Epoch: 40823, Training Loss: 0.00701
Epoch: 40823, Training Loss: 0.00860
Epoch: 40824, Training Loss: 0.00674
Epoch: 40824, Training Loss: 0.00700
Epoch: 40824, Training Loss: 0.00701
Epoch: 40824, Training Loss: 0.00860
Epoch: 40825, Training Loss: 0.00674
Epoch: 40825, Training Loss: 0.00700
Epoch: 40825, Training Loss: 0.00701
Epoch: 40825, Training Loss: 0.00860
Epoch: 40826, Training Loss: 0.00674
Epoch: 40826, Training Loss: 0.00700
Epoch: 40826, Training Loss: 0.00701
Epoch: 40826, Training Loss: 0.00860
Epoch: 40827, Training Loss: 0.00674
Epoch: 40827, Training Loss: 0.00700
Epoch: 40827, Training Loss: 0.00701
Epoch: 40827, Training Loss: 0.00860
Epoch: 40828, Training Loss: 0.00674
Epoch: 40828, Training Loss: 0.00700
Epoch: 40828, Training Loss: 0.00701
Epoch: 40828, Training Loss: 0.00860
Epoch: 40829, Training Loss: 0.00674
Epoch: 40829, Training Loss: 0.00700
Epoch: 40829, Training Loss: 0.00701
Epoch: 40829, Training Loss: 0.00860
Epoch: 40830, Training Loss: 0.00674
Epoch: 40830, Training Loss: 0.00700
Epoch: 40830, Training Loss: 0.00701
Epoch: 40830, Training Loss: 0.00860
Epoch: 40831, Training Loss: 0.00674
Epoch: 40831, Training Loss: 0.00700
Epoch: 40831, Training Loss: 0.00701
Epoch: 40831, Training Loss: 0.00860
Epoch: 40832, Training Loss: 0.00674
Epoch: 40832, Training Loss: 0.00700
Epoch: 40832, Training Loss: 0.00701
Epoch: 40832, Training Loss: 0.00860
Epoch: 40833, Training Loss: 0.00674
Epoch: 40833, Training Loss: 0.00700
Epoch: 40833, Training Loss: 0.00701
Epoch: 40833, Training Loss: 0.00860
Epoch: 40834, Training Loss: 0.00674
Epoch: 40834, Training Loss: 0.00700
Epoch: 40834, Training Loss: 0.00701
Epoch: 40834, Training Loss: 0.00860
Epoch: 40835, Training Loss: 0.00674
Epoch: 40835, Training Loss: 0.00700
Epoch: 40835, Training Loss: 0.00701
Epoch: 40835, Training Loss: 0.00860
Epoch: 40836, Training Loss: 0.00674
Epoch: 40836, Training Loss: 0.00700
Epoch: 40836, Training Loss: 0.00701
Epoch: 40836, Training Loss: 0.00860
Epoch: 40837, Training Loss: 0.00674
Epoch: 40837, Training Loss: 0.00700
Epoch: 40837, Training Loss: 0.00701
Epoch: 40837, Training Loss: 0.00860
Epoch: 40838, Training Loss: 0.00674
Epoch: 40838, Training Loss: 0.00700
Epoch: 40838, Training Loss: 0.00701
Epoch: 40838, Training Loss: 0.00860
Epoch: 40839, Training Loss: 0.00674
Epoch: 40839, Training Loss: 0.00700
Epoch: 40839, Training Loss: 0.00701
Epoch: 40839, Training Loss: 0.00859
Epoch: 40840, Training Loss: 0.00674
Epoch: 40840, Training Loss: 0.00700
Epoch: 40840, Training Loss: 0.00701
Epoch: 40840, Training Loss: 0.00859
Epoch: 40841, Training Loss: 0.00674
Epoch: 40841, Training Loss: 0.00700
Epoch: 40841, Training Loss: 0.00701
Epoch: 40841, Training Loss: 0.00859
Epoch: 40842, Training Loss: 0.00674
Epoch: 40842, Training Loss: 0.00700
Epoch: 40842, Training Loss: 0.00701
Epoch: 40842, Training Loss: 0.00859
Epoch: 40843, Training Loss: 0.00674
Epoch: 40843, Training Loss: 0.00700
Epoch: 40843, Training Loss: 0.00701
Epoch: 40843, Training Loss: 0.00859
Epoch: 40844, Training Loss: 0.00674
Epoch: 40844, Training Loss: 0.00700
Epoch: 40844, Training Loss: 0.00701
Epoch: 40844, Training Loss: 0.00859
Epoch: 40845, Training Loss: 0.00674
Epoch: 40845, Training Loss: 0.00700
Epoch: 40845, Training Loss: 0.00701
Epoch: 40845, Training Loss: 0.00859
Epoch: 40846, Training Loss: 0.00674
Epoch: 40846, Training Loss: 0.00700
Epoch: 40846, Training Loss: 0.00701
Epoch: 40846, Training Loss: 0.00859
Epoch: 40847, Training Loss: 0.00674
Epoch: 40847, Training Loss: 0.00700
Epoch: 40847, Training Loss: 0.00701
Epoch: 40847, Training Loss: 0.00859
Epoch: 40848, Training Loss: 0.00674
Epoch: 40848, Training Loss: 0.00700
Epoch: 40848, Training Loss: 0.00701
Epoch: 40848, Training Loss: 0.00859
Epoch: 40849, Training Loss: 0.00674
Epoch: 40849, Training Loss: 0.00700
Epoch: 40849, Training Loss: 0.00701
Epoch: 40849, Training Loss: 0.00859
Epoch: 40850, Training Loss: 0.00674
Epoch: 40850, Training Loss: 0.00700
Epoch: 40850, Training Loss: 0.00701
Epoch: 40850, Training Loss: 0.00859
Epoch: 40851, Training Loss: 0.00674
Epoch: 40851, Training Loss: 0.00700
Epoch: 40851, Training Loss: 0.00701
Epoch: 40851, Training Loss: 0.00859
Epoch: 40852, Training Loss: 0.00674
Epoch: 40852, Training Loss: 0.00700
Epoch: 40852, Training Loss: 0.00701
Epoch: 40852, Training Loss: 0.00859
Epoch: 40853, Training Loss: 0.00674
Epoch: 40853, Training Loss: 0.00700
Epoch: 40853, Training Loss: 0.00701
Epoch: 40853, Training Loss: 0.00859
Epoch: 40854, Training Loss: 0.00674
Epoch: 40854, Training Loss: 0.00700
Epoch: 40854, Training Loss: 0.00701
Epoch: 40854, Training Loss: 0.00859
Epoch: 40855, Training Loss: 0.00674
Epoch: 40855, Training Loss: 0.00700
Epoch: 40855, Training Loss: 0.00701
Epoch: 40855, Training Loss: 0.00859
Epoch: 40856, Training Loss: 0.00674
Epoch: 40856, Training Loss: 0.00700
Epoch: 40856, Training Loss: 0.00701
Epoch: 40856, Training Loss: 0.00859
Epoch: 40857, Training Loss: 0.00674
Epoch: 40857, Training Loss: 0.00700
Epoch: 40857, Training Loss: 0.00701
Epoch: 40857, Training Loss: 0.00859
Epoch: 40858, Training Loss: 0.00674
Epoch: 40858, Training Loss: 0.00700
Epoch: 40858, Training Loss: 0.00701
Epoch: 40858, Training Loss: 0.00859
Epoch: 40859, Training Loss: 0.00674
Epoch: 40859, Training Loss: 0.00700
Epoch: 40859, Training Loss: 0.00701
Epoch: 40859, Training Loss: 0.00859
Epoch: 40860, Training Loss: 0.00674
Epoch: 40860, Training Loss: 0.00700
Epoch: 40860, Training Loss: 0.00701
Epoch: 40860, Training Loss: 0.00859
Epoch: 40861, Training Loss: 0.00674
Epoch: 40861, Training Loss: 0.00700
Epoch: 40861, Training Loss: 0.00701
Epoch: 40861, Training Loss: 0.00859
Epoch: 40862, Training Loss: 0.00674
Epoch: 40862, Training Loss: 0.00700
Epoch: 40862, Training Loss: 0.00701
Epoch: 40862, Training Loss: 0.00859
Epoch: 40863, Training Loss: 0.00674
Epoch: 40863, Training Loss: 0.00700
Epoch: 40863, Training Loss: 0.00701
Epoch: 40863, Training Loss: 0.00859
Epoch: 40864, Training Loss: 0.00674
Epoch: 40864, Training Loss: 0.00700
Epoch: 40864, Training Loss: 0.00701
Epoch: 40864, Training Loss: 0.00859
Epoch: 40865, Training Loss: 0.00674
Epoch: 40865, Training Loss: 0.00700
Epoch: 40865, Training Loss: 0.00701
Epoch: 40865, Training Loss: 0.00859
Epoch: 40866, Training Loss: 0.00674
Epoch: 40866, Training Loss: 0.00700
Epoch: 40866, Training Loss: 0.00701
Epoch: 40866, Training Loss: 0.00859
Epoch: 40867, Training Loss: 0.00674
Epoch: 40867, Training Loss: 0.00700
Epoch: 40867, Training Loss: 0.00701
Epoch: 40867, Training Loss: 0.00859
Epoch: 40868, Training Loss: 0.00674
Epoch: 40868, Training Loss: 0.00700
Epoch: 40868, Training Loss: 0.00701
Epoch: 40868, Training Loss: 0.00859
Epoch: 40869, Training Loss: 0.00674
Epoch: 40869, Training Loss: 0.00700
Epoch: 40869, Training Loss: 0.00701
Epoch: 40869, Training Loss: 0.00859
Epoch: 40870, Training Loss: 0.00674
Epoch: 40870, Training Loss: 0.00700
Epoch: 40870, Training Loss: 0.00701
Epoch: 40870, Training Loss: 0.00859
Epoch: 40871, Training Loss: 0.00674
Epoch: 40871, Training Loss: 0.00700
Epoch: 40871, Training Loss: 0.00701
Epoch: 40871, Training Loss: 0.00859
Epoch: 40872, Training Loss: 0.00674
Epoch: 40872, Training Loss: 0.00700
Epoch: 40872, Training Loss: 0.00701
Epoch: 40872, Training Loss: 0.00859
Epoch: 40873, Training Loss: 0.00674
Epoch: 40873, Training Loss: 0.00700
Epoch: 40873, Training Loss: 0.00701
Epoch: 40873, Training Loss: 0.00859
Epoch: 40874, Training Loss: 0.00674
Epoch: 40874, Training Loss: 0.00700
Epoch: 40874, Training Loss: 0.00701
Epoch: 40874, Training Loss: 0.00859
Epoch: 40875, Training Loss: 0.00674
Epoch: 40875, Training Loss: 0.00700
Epoch: 40875, Training Loss: 0.00701
Epoch: 40875, Training Loss: 0.00859
Epoch: 40876, Training Loss: 0.00674
Epoch: 40876, Training Loss: 0.00700
Epoch: 40876, Training Loss: 0.00701
Epoch: 40876, Training Loss: 0.00859
Epoch: 40877, Training Loss: 0.00674
Epoch: 40877, Training Loss: 0.00700
Epoch: 40877, Training Loss: 0.00701
Epoch: 40877, Training Loss: 0.00859
Epoch: 40878, Training Loss: 0.00674
Epoch: 40878, Training Loss: 0.00700
Epoch: 40878, Training Loss: 0.00701
Epoch: 40878, Training Loss: 0.00859
Epoch: 40879, Training Loss: 0.00674
Epoch: 40879, Training Loss: 0.00700
Epoch: 40879, Training Loss: 0.00701
Epoch: 40879, Training Loss: 0.00859
Epoch: 40880, Training Loss: 0.00674
Epoch: 40880, Training Loss: 0.00700
Epoch: 40880, Training Loss: 0.00701
Epoch: 40880, Training Loss: 0.00859
Epoch: 40881, Training Loss: 0.00674
Epoch: 40881, Training Loss: 0.00700
Epoch: 40881, Training Loss: 0.00701
Epoch: 40881, Training Loss: 0.00859
Epoch: 40882, Training Loss: 0.00674
Epoch: 40882, Training Loss: 0.00700
Epoch: 40882, Training Loss: 0.00701
Epoch: 40882, Training Loss: 0.00859
Epoch: 40883, Training Loss: 0.00674
Epoch: 40883, Training Loss: 0.00700
Epoch: 40883, Training Loss: 0.00701
Epoch: 40883, Training Loss: 0.00859
Epoch: 40884, Training Loss: 0.00674
Epoch: 40884, Training Loss: 0.00700
Epoch: 40884, Training Loss: 0.00701
Epoch: 40884, Training Loss: 0.00859
Epoch: 40885, Training Loss: 0.00674
Epoch: 40885, Training Loss: 0.00700
Epoch: 40885, Training Loss: 0.00700
Epoch: 40885, Training Loss: 0.00859
Epoch: 40886, Training Loss: 0.00674
Epoch: 40886, Training Loss: 0.00700
Epoch: 40886, Training Loss: 0.00700
Epoch: 40886, Training Loss: 0.00859
Epoch: 40887, Training Loss: 0.00674
Epoch: 40887, Training Loss: 0.00700
Epoch: 40887, Training Loss: 0.00700
Epoch: 40887, Training Loss: 0.00859
Epoch: 40888, Training Loss: 0.00674
Epoch: 40888, Training Loss: 0.00700
Epoch: 40888, Training Loss: 0.00700
Epoch: 40888, Training Loss: 0.00859
Epoch: 40889, Training Loss: 0.00674
Epoch: 40889, Training Loss: 0.00700
Epoch: 40889, Training Loss: 0.00700
Epoch: 40889, Training Loss: 0.00859
Epoch: 40890, Training Loss: 0.00674
Epoch: 40890, Training Loss: 0.00700
Epoch: 40890, Training Loss: 0.00700
Epoch: 40890, Training Loss: 0.00859
Epoch: 40891, Training Loss: 0.00674
Epoch: 40891, Training Loss: 0.00700
Epoch: 40891, Training Loss: 0.00700
Epoch: 40891, Training Loss: 0.00859
Epoch: 40892, Training Loss: 0.00674
Epoch: 40892, Training Loss: 0.00700
Epoch: 40892, Training Loss: 0.00700
Epoch: 40892, Training Loss: 0.00859
Epoch: 40893, Training Loss: 0.00674
Epoch: 40893, Training Loss: 0.00700
Epoch: 40893, Training Loss: 0.00700
Epoch: 40893, Training Loss: 0.00859
Epoch: 40894, Training Loss: 0.00674
Epoch: 40894, Training Loss: 0.00700
Epoch: 40894, Training Loss: 0.00700
Epoch: 40894, Training Loss: 0.00859
Epoch: 40895, Training Loss: 0.00674
Epoch: 40895, Training Loss: 0.00700
Epoch: 40895, Training Loss: 0.00700
Epoch: 40895, Training Loss: 0.00859
Epoch: 40896, Training Loss: 0.00674
Epoch: 40896, Training Loss: 0.00700
Epoch: 40896, Training Loss: 0.00700
Epoch: 40896, Training Loss: 0.00859
Epoch: 40897, Training Loss: 0.00674
Epoch: 40897, Training Loss: 0.00700
Epoch: 40897, Training Loss: 0.00700
Epoch: 40897, Training Loss: 0.00859
Epoch: 40898, Training Loss: 0.00674
Epoch: 40898, Training Loss: 0.00700
Epoch: 40898, Training Loss: 0.00700
Epoch: 40898, Training Loss: 0.00859
Epoch: 40899, Training Loss: 0.00674
Epoch: 40899, Training Loss: 0.00700
Epoch: 40899, Training Loss: 0.00700
Epoch: 40899, Training Loss: 0.00859
Epoch: 40900, Training Loss: 0.00674
Epoch: 40900, Training Loss: 0.00700
Epoch: 40900, Training Loss: 0.00700
Epoch: 40900, Training Loss: 0.00859
Epoch: 40901, Training Loss: 0.00674
Epoch: 40901, Training Loss: 0.00700
Epoch: 40901, Training Loss: 0.00700
Epoch: 40901, Training Loss: 0.00859
Epoch: 40902, Training Loss: 0.00674
Epoch: 40902, Training Loss: 0.00700
Epoch: 40902, Training Loss: 0.00700
Epoch: 40902, Training Loss: 0.00859
Epoch: 40903, Training Loss: 0.00674
Epoch: 40903, Training Loss: 0.00700
Epoch: 40903, Training Loss: 0.00700
Epoch: 40903, Training Loss: 0.00859
Epoch: 40904, Training Loss: 0.00674
Epoch: 40904, Training Loss: 0.00700
Epoch: 40904, Training Loss: 0.00700
Epoch: 40904, Training Loss: 0.00859
Epoch: 40905, Training Loss: 0.00674
Epoch: 40905, Training Loss: 0.00700
Epoch: 40905, Training Loss: 0.00700
Epoch: 40905, Training Loss: 0.00859
Epoch: 40906, Training Loss: 0.00674
Epoch: 40906, Training Loss: 0.00700
Epoch: 40906, Training Loss: 0.00700
Epoch: 40906, Training Loss: 0.00859
Epoch: 40907, Training Loss: 0.00674
Epoch: 40907, Training Loss: 0.00700
Epoch: 40907, Training Loss: 0.00700
Epoch: 40907, Training Loss: 0.00859
Epoch: 40908, Training Loss: 0.00674
Epoch: 40908, Training Loss: 0.00699
Epoch: 40908, Training Loss: 0.00700
Epoch: 40908, Training Loss: 0.00859
Epoch: 40909, Training Loss: 0.00674
Epoch: 40909, Training Loss: 0.00699
Epoch: 40909, Training Loss: 0.00700
Epoch: 40909, Training Loss: 0.00859
Epoch: 40910, Training Loss: 0.00674
Epoch: 40910, Training Loss: 0.00699
Epoch: 40910, Training Loss: 0.00700
Epoch: 40910, Training Loss: 0.00859
Epoch: 40911, Training Loss: 0.00674
Epoch: 40911, Training Loss: 0.00699
Epoch: 40911, Training Loss: 0.00700
Epoch: 40911, Training Loss: 0.00859
Epoch: 40912, Training Loss: 0.00674
Epoch: 40912, Training Loss: 0.00699
Epoch: 40912, Training Loss: 0.00700
Epoch: 40912, Training Loss: 0.00859
Epoch: 40913, Training Loss: 0.00674
Epoch: 40913, Training Loss: 0.00699
Epoch: 40913, Training Loss: 0.00700
Epoch: 40913, Training Loss: 0.00859
Epoch: 40914, Training Loss: 0.00674
Epoch: 40914, Training Loss: 0.00699
Epoch: 40914, Training Loss: 0.00700
Epoch: 40914, Training Loss: 0.00859
Epoch: 40915, Training Loss: 0.00674
Epoch: 40915, Training Loss: 0.00699
Epoch: 40915, Training Loss: 0.00700
Epoch: 40915, Training Loss: 0.00859
Epoch: 40916, Training Loss: 0.00674
Epoch: 40916, Training Loss: 0.00699
Epoch: 40916, Training Loss: 0.00700
Epoch: 40916, Training Loss: 0.00859
Epoch: 40917, Training Loss: 0.00674
Epoch: 40917, Training Loss: 0.00699
Epoch: 40917, Training Loss: 0.00700
Epoch: 40917, Training Loss: 0.00859
Epoch: 40918, Training Loss: 0.00674
Epoch: 40918, Training Loss: 0.00699
Epoch: 40918, Training Loss: 0.00700
Epoch: 40918, Training Loss: 0.00859
Epoch: 40919, Training Loss: 0.00674
Epoch: 40920, Training Loss: 0.00674
Epoch: 40920, Training Loss: 0.00699
Epoch: 40920, Training Loss: 0.00700
Epoch: 40920, Training Loss: 0.00859

[... near-identical output for epochs 40921-41161 elided. Over this span the four per-pattern quadratic losses decrease only from (0.00674, 0.00699, 0.00700, 0.00859) to (0.00671, 0.00697, 0.00698, 0.00856); the loss on the fourth training pattern is still above the 0.008 target, so training continues.]

Epoch: 41162, Training Loss: 0.00671
Epoch: 41162, Training Loss: 0.00697
Epoch: 41162, Training Loss: 0.00698
Epoch: 41162, Training Loss: 0.00856
Epoch: 41163, Training Loss: 0.00671
Epoch: 41163, Training Loss: 0.00697
Epoch: 41163, Training Loss: 0.00698
Epoch: 41163, Training Loss: 0.00856
Epoch: 41164, Training Loss: 0.00671
Epoch: 41164, Training Loss: 0.00697
Epoch: 41164, Training Loss: 0.00698
Epoch: 41164, Training Loss: 0.00856
Epoch: 41165, Training Loss: 0.00671
Epoch: 41165, Training Loss: 0.00697
Epoch: 41165, Training Loss: 0.00698
Epoch: 41165, Training Loss: 0.00856
Epoch: 41166, Training Loss: 0.00671
Epoch: 41166, Training Loss: 0.00697
Epoch: 41166, Training Loss: 0.00698
Epoch: 41166, Training Loss: 0.00856
Epoch: 41167, Training Loss: 0.00671
Epoch: 41167, Training Loss: 0.00697
Epoch: 41167, Training Loss: 0.00698
Epoch: 41167, Training Loss: 0.00856
Epoch: 41168, Training Loss: 0.00671
Epoch: 41168, Training Loss: 0.00697
Epoch: 41168, Training Loss: 0.00698
Epoch: 41168, Training Loss: 0.00856
Epoch: 41169, Training Loss: 0.00671
Epoch: 41169, Training Loss: 0.00697
Epoch: 41169, Training Loss: 0.00698
Epoch: 41169, Training Loss: 0.00856
Epoch: 41170, Training Loss: 0.00671
Epoch: 41170, Training Loss: 0.00697
Epoch: 41170, Training Loss: 0.00698
Epoch: 41170, Training Loss: 0.00856
Epoch: 41171, Training Loss: 0.00671
Epoch: 41171, Training Loss: 0.00697
Epoch: 41171, Training Loss: 0.00698
Epoch: 41171, Training Loss: 0.00856
Epoch: 41172, Training Loss: 0.00671
Epoch: 41172, Training Loss: 0.00697
Epoch: 41172, Training Loss: 0.00698
Epoch: 41172, Training Loss: 0.00856
Epoch: 41173, Training Loss: 0.00671
Epoch: 41173, Training Loss: 0.00697
Epoch: 41173, Training Loss: 0.00698
Epoch: 41173, Training Loss: 0.00856
Epoch: 41174, Training Loss: 0.00671
Epoch: 41174, Training Loss: 0.00697
Epoch: 41174, Training Loss: 0.00698
Epoch: 41174, Training Loss: 0.00856
Epoch: 41175, Training Loss: 0.00671
Epoch: 41175, Training Loss: 0.00697
Epoch: 41175, Training Loss: 0.00698
Epoch: 41175, Training Loss: 0.00856
Epoch: 41176, Training Loss: 0.00671
Epoch: 41176, Training Loss: 0.00697
Epoch: 41176, Training Loss: 0.00698
Epoch: 41176, Training Loss: 0.00856
Epoch: 41177, Training Loss: 0.00671
Epoch: 41177, Training Loss: 0.00697
Epoch: 41177, Training Loss: 0.00698
Epoch: 41177, Training Loss: 0.00856
Epoch: 41178, Training Loss: 0.00671
Epoch: 41178, Training Loss: 0.00697
Epoch: 41178, Training Loss: 0.00698
Epoch: 41178, Training Loss: 0.00856
Epoch: 41179, Training Loss: 0.00671
Epoch: 41179, Training Loss: 0.00697
Epoch: 41179, Training Loss: 0.00698
Epoch: 41179, Training Loss: 0.00856
Epoch: 41180, Training Loss: 0.00671
Epoch: 41180, Training Loss: 0.00697
Epoch: 41180, Training Loss: 0.00698
Epoch: 41180, Training Loss: 0.00856
Epoch: 41181, Training Loss: 0.00671
Epoch: 41181, Training Loss: 0.00697
Epoch: 41181, Training Loss: 0.00698
Epoch: 41181, Training Loss: 0.00856
Epoch: 41182, Training Loss: 0.00671
Epoch: 41182, Training Loss: 0.00697
Epoch: 41182, Training Loss: 0.00698
Epoch: 41182, Training Loss: 0.00856
Epoch: 41183, Training Loss: 0.00671
Epoch: 41183, Training Loss: 0.00697
Epoch: 41183, Training Loss: 0.00698
Epoch: 41183, Training Loss: 0.00856
Epoch: 41184, Training Loss: 0.00671
Epoch: 41184, Training Loss: 0.00697
Epoch: 41184, Training Loss: 0.00698
Epoch: 41184, Training Loss: 0.00855
Epoch: 41185, Training Loss: 0.00671
Epoch: 41185, Training Loss: 0.00697
Epoch: 41185, Training Loss: 0.00698
Epoch: 41185, Training Loss: 0.00855
Epoch: 41186, Training Loss: 0.00671
Epoch: 41186, Training Loss: 0.00697
Epoch: 41186, Training Loss: 0.00698
Epoch: 41186, Training Loss: 0.00855
Epoch: 41187, Training Loss: 0.00671
Epoch: 41187, Training Loss: 0.00697
Epoch: 41187, Training Loss: 0.00698
Epoch: 41187, Training Loss: 0.00855
Epoch: 41188, Training Loss: 0.00671
Epoch: 41188, Training Loss: 0.00697
Epoch: 41188, Training Loss: 0.00698
Epoch: 41188, Training Loss: 0.00855
Epoch: 41189, Training Loss: 0.00671
Epoch: 41189, Training Loss: 0.00697
Epoch: 41189, Training Loss: 0.00698
Epoch: 41189, Training Loss: 0.00855
Epoch: 41190, Training Loss: 0.00671
Epoch: 41190, Training Loss: 0.00697
Epoch: 41190, Training Loss: 0.00698
Epoch: 41190, Training Loss: 0.00855
Epoch: 41191, Training Loss: 0.00671
Epoch: 41191, Training Loss: 0.00697
Epoch: 41191, Training Loss: 0.00698
Epoch: 41191, Training Loss: 0.00855
Epoch: 41192, Training Loss: 0.00671
Epoch: 41192, Training Loss: 0.00697
Epoch: 41192, Training Loss: 0.00698
Epoch: 41192, Training Loss: 0.00855
Epoch: 41193, Training Loss: 0.00671
Epoch: 41193, Training Loss: 0.00697
Epoch: 41193, Training Loss: 0.00698
Epoch: 41193, Training Loss: 0.00855
Epoch: 41194, Training Loss: 0.00671
Epoch: 41194, Training Loss: 0.00697
Epoch: 41194, Training Loss: 0.00698
Epoch: 41194, Training Loss: 0.00855
Epoch: 41195, Training Loss: 0.00671
Epoch: 41195, Training Loss: 0.00697
Epoch: 41195, Training Loss: 0.00698
Epoch: 41195, Training Loss: 0.00855
Epoch: 41196, Training Loss: 0.00671
Epoch: 41196, Training Loss: 0.00697
Epoch: 41196, Training Loss: 0.00698
Epoch: 41196, Training Loss: 0.00855
Epoch: 41197, Training Loss: 0.00671
Epoch: 41197, Training Loss: 0.00697
Epoch: 41197, Training Loss: 0.00698
Epoch: 41197, Training Loss: 0.00855
Epoch: 41198, Training Loss: 0.00671
Epoch: 41198, Training Loss: 0.00697
Epoch: 41198, Training Loss: 0.00698
Epoch: 41198, Training Loss: 0.00855
Epoch: 41199, Training Loss: 0.00671
Epoch: 41199, Training Loss: 0.00697
Epoch: 41199, Training Loss: 0.00698
Epoch: 41199, Training Loss: 0.00855
Epoch: 41200, Training Loss: 0.00671
Epoch: 41200, Training Loss: 0.00697
Epoch: 41200, Training Loss: 0.00698
Epoch: 41200, Training Loss: 0.00855
Epoch: 41201, Training Loss: 0.00671
Epoch: 41201, Training Loss: 0.00697
Epoch: 41201, Training Loss: 0.00698
Epoch: 41201, Training Loss: 0.00855
Epoch: 41202, Training Loss: 0.00671
Epoch: 41202, Training Loss: 0.00697
Epoch: 41202, Training Loss: 0.00698
Epoch: 41202, Training Loss: 0.00855
Epoch: 41203, Training Loss: 0.00671
Epoch: 41203, Training Loss: 0.00697
Epoch: 41203, Training Loss: 0.00698
Epoch: 41203, Training Loss: 0.00855
Epoch: 41204, Training Loss: 0.00671
Epoch: 41204, Training Loss: 0.00697
Epoch: 41204, Training Loss: 0.00698
Epoch: 41204, Training Loss: 0.00855
Epoch: 41205, Training Loss: 0.00671
Epoch: 41205, Training Loss: 0.00697
Epoch: 41205, Training Loss: 0.00698
Epoch: 41205, Training Loss: 0.00855
Epoch: 41206, Training Loss: 0.00671
Epoch: 41206, Training Loss: 0.00697
Epoch: 41206, Training Loss: 0.00697
Epoch: 41206, Training Loss: 0.00855
Epoch: 41207, Training Loss: 0.00671
Epoch: 41207, Training Loss: 0.00697
Epoch: 41207, Training Loss: 0.00697
Epoch: 41207, Training Loss: 0.00855
Epoch: 41208, Training Loss: 0.00671
Epoch: 41208, Training Loss: 0.00697
Epoch: 41208, Training Loss: 0.00697
Epoch: 41208, Training Loss: 0.00855
Epoch: 41209, Training Loss: 0.00671
Epoch: 41209, Training Loss: 0.00697
Epoch: 41209, Training Loss: 0.00697
Epoch: 41209, Training Loss: 0.00855
Epoch: 41210, Training Loss: 0.00671
Epoch: 41210, Training Loss: 0.00697
Epoch: 41210, Training Loss: 0.00697
Epoch: 41210, Training Loss: 0.00855
Epoch: 41211, Training Loss: 0.00671
Epoch: 41211, Training Loss: 0.00697
Epoch: 41211, Training Loss: 0.00697
Epoch: 41211, Training Loss: 0.00855
Epoch: 41212, Training Loss: 0.00671
Epoch: 41212, Training Loss: 0.00697
Epoch: 41212, Training Loss: 0.00697
Epoch: 41212, Training Loss: 0.00855
Epoch: 41213, Training Loss: 0.00671
Epoch: 41213, Training Loss: 0.00697
Epoch: 41213, Training Loss: 0.00697
Epoch: 41213, Training Loss: 0.00855
Epoch: 41214, Training Loss: 0.00671
Epoch: 41214, Training Loss: 0.00697
Epoch: 41214, Training Loss: 0.00697
Epoch: 41214, Training Loss: 0.00855
Epoch: 41215, Training Loss: 0.00671
Epoch: 41215, Training Loss: 0.00697
Epoch: 41215, Training Loss: 0.00697
Epoch: 41215, Training Loss: 0.00855
Epoch: 41216, Training Loss: 0.00671
Epoch: 41216, Training Loss: 0.00697
Epoch: 41216, Training Loss: 0.00697
Epoch: 41216, Training Loss: 0.00855
Epoch: 41217, Training Loss: 0.00671
Epoch: 41217, Training Loss: 0.00697
Epoch: 41217, Training Loss: 0.00697
Epoch: 41217, Training Loss: 0.00855
Epoch: 41218, Training Loss: 0.00671
Epoch: 41218, Training Loss: 0.00697
Epoch: 41218, Training Loss: 0.00697
Epoch: 41218, Training Loss: 0.00855
Epoch: 41219, Training Loss: 0.00671
Epoch: 41219, Training Loss: 0.00697
Epoch: 41219, Training Loss: 0.00697
Epoch: 41219, Training Loss: 0.00855
Epoch: 41220, Training Loss: 0.00671
Epoch: 41220, Training Loss: 0.00697
Epoch: 41220, Training Loss: 0.00697
Epoch: 41220, Training Loss: 0.00855
Epoch: 41221, Training Loss: 0.00671
Epoch: 41221, Training Loss: 0.00697
Epoch: 41221, Training Loss: 0.00697
Epoch: 41221, Training Loss: 0.00855
Epoch: 41222, Training Loss: 0.00671
Epoch: 41222, Training Loss: 0.00697
Epoch: 41222, Training Loss: 0.00697
Epoch: 41222, Training Loss: 0.00855
Epoch: 41223, Training Loss: 0.00671
Epoch: 41223, Training Loss: 0.00697
Epoch: 41223, Training Loss: 0.00697
Epoch: 41223, Training Loss: 0.00855
Epoch: 41224, Training Loss: 0.00671
Epoch: 41224, Training Loss: 0.00697
Epoch: 41224, Training Loss: 0.00697
Epoch: 41224, Training Loss: 0.00855
Epoch: 41225, Training Loss: 0.00671
Epoch: 41225, Training Loss: 0.00697
Epoch: 41225, Training Loss: 0.00697
Epoch: 41225, Training Loss: 0.00855
Epoch: 41226, Training Loss: 0.00671
Epoch: 41226, Training Loss: 0.00697
Epoch: 41226, Training Loss: 0.00697
Epoch: 41226, Training Loss: 0.00855
Epoch: 41227, Training Loss: 0.00671
Epoch: 41227, Training Loss: 0.00697
Epoch: 41227, Training Loss: 0.00697
Epoch: 41227, Training Loss: 0.00855
Epoch: 41228, Training Loss: 0.00671
Epoch: 41228, Training Loss: 0.00697
Epoch: 41228, Training Loss: 0.00697
Epoch: 41228, Training Loss: 0.00855
Epoch: 41229, Training Loss: 0.00671
Epoch: 41229, Training Loss: 0.00697
Epoch: 41229, Training Loss: 0.00697
Epoch: 41229, Training Loss: 0.00855
Epoch: 41230, Training Loss: 0.00671
Epoch: 41230, Training Loss: 0.00696
Epoch: 41230, Training Loss: 0.00697
Epoch: 41230, Training Loss: 0.00855
Epoch: 41231, Training Loss: 0.00671
Epoch: 41231, Training Loss: 0.00696
Epoch: 41231, Training Loss: 0.00697
Epoch: 41231, Training Loss: 0.00855
Epoch: 41232, Training Loss: 0.00671
Epoch: 41232, Training Loss: 0.00696
Epoch: 41232, Training Loss: 0.00697
Epoch: 41232, Training Loss: 0.00855
Epoch: 41233, Training Loss: 0.00671
Epoch: 41233, Training Loss: 0.00696
Epoch: 41233, Training Loss: 0.00697
Epoch: 41233, Training Loss: 0.00855
Epoch: 41234, Training Loss: 0.00671
Epoch: 41234, Training Loss: 0.00696
Epoch: 41234, Training Loss: 0.00697
Epoch: 41234, Training Loss: 0.00855
Epoch: 41235, Training Loss: 0.00671
Epoch: 41235, Training Loss: 0.00696
Epoch: 41235, Training Loss: 0.00697
Epoch: 41235, Training Loss: 0.00855
Epoch: 41236, Training Loss: 0.00671
Epoch: 41236, Training Loss: 0.00696
Epoch: 41236, Training Loss: 0.00697
Epoch: 41236, Training Loss: 0.00855
Epoch: 41237, Training Loss: 0.00671
Epoch: 41237, Training Loss: 0.00696
Epoch: 41237, Training Loss: 0.00697
Epoch: 41237, Training Loss: 0.00855
Epoch: 41238, Training Loss: 0.00671
Epoch: 41238, Training Loss: 0.00696
Epoch: 41238, Training Loss: 0.00697
Epoch: 41238, Training Loss: 0.00855
Epoch: 41239, Training Loss: 0.00671
Epoch: 41239, Training Loss: 0.00696
Epoch: 41239, Training Loss: 0.00697
Epoch: 41239, Training Loss: 0.00855
Epoch: 41240, Training Loss: 0.00671
Epoch: 41240, Training Loss: 0.00696
Epoch: 41240, Training Loss: 0.00697
Epoch: 41240, Training Loss: 0.00855
Epoch: 41241, Training Loss: 0.00671
Epoch: 41241, Training Loss: 0.00696
Epoch: 41241, Training Loss: 0.00697
Epoch: 41241, Training Loss: 0.00855
Epoch: 41242, Training Loss: 0.00671
Epoch: 41242, Training Loss: 0.00696
Epoch: 41242, Training Loss: 0.00697
Epoch: 41242, Training Loss: 0.00855
Epoch: 41243, Training Loss: 0.00671
Epoch: 41243, Training Loss: 0.00696
Epoch: 41243, Training Loss: 0.00697
Epoch: 41243, Training Loss: 0.00855
Epoch: 41244, Training Loss: 0.00671
Epoch: 41244, Training Loss: 0.00696
Epoch: 41244, Training Loss: 0.00697
Epoch: 41244, Training Loss: 0.00855
Epoch: 41245, Training Loss: 0.00671
Epoch: 41245, Training Loss: 0.00696
Epoch: 41245, Training Loss: 0.00697
Epoch: 41245, Training Loss: 0.00855
Epoch: 41246, Training Loss: 0.00671
Epoch: 41246, Training Loss: 0.00696
Epoch: 41246, Training Loss: 0.00697
Epoch: 41246, Training Loss: 0.00855
Epoch: 41247, Training Loss: 0.00671
Epoch: 41247, Training Loss: 0.00696
Epoch: 41247, Training Loss: 0.00697
Epoch: 41247, Training Loss: 0.00855
Epoch: 41248, Training Loss: 0.00671
Epoch: 41248, Training Loss: 0.00696
Epoch: 41248, Training Loss: 0.00697
Epoch: 41248, Training Loss: 0.00855
Epoch: 41249, Training Loss: 0.00671
Epoch: 41249, Training Loss: 0.00696
Epoch: 41249, Training Loss: 0.00697
Epoch: 41249, Training Loss: 0.00855
Epoch: 41250, Training Loss: 0.00671
Epoch: 41250, Training Loss: 0.00696
Epoch: 41250, Training Loss: 0.00697
Epoch: 41250, Training Loss: 0.00855
Epoch: 41251, Training Loss: 0.00671
Epoch: 41251, Training Loss: 0.00696
Epoch: 41251, Training Loss: 0.00697
Epoch: 41251, Training Loss: 0.00855
Epoch: 41252, Training Loss: 0.00671
Epoch: 41252, Training Loss: 0.00696
Epoch: 41252, Training Loss: 0.00697
Epoch: 41252, Training Loss: 0.00855
Epoch: 41253, Training Loss: 0.00671
Epoch: 41253, Training Loss: 0.00696
Epoch: 41253, Training Loss: 0.00697
Epoch: 41253, Training Loss: 0.00855
Epoch: 41254, Training Loss: 0.00671
Epoch: 41254, Training Loss: 0.00696
Epoch: 41254, Training Loss: 0.00697
Epoch: 41254, Training Loss: 0.00855
Epoch: 41255, Training Loss: 0.00671
Epoch: 41255, Training Loss: 0.00696
Epoch: 41255, Training Loss: 0.00697
Epoch: 41255, Training Loss: 0.00855
Epoch: 41256, Training Loss: 0.00671
Epoch: 41256, Training Loss: 0.00696
Epoch: 41256, Training Loss: 0.00697
Epoch: 41256, Training Loss: 0.00855
Epoch: 41257, Training Loss: 0.00671
Epoch: 41257, Training Loss: 0.00696
Epoch: 41257, Training Loss: 0.00697
Epoch: 41257, Training Loss: 0.00855
Epoch: 41258, Training Loss: 0.00671
Epoch: 41258, Training Loss: 0.00696
Epoch: 41258, Training Loss: 0.00697
Epoch: 41258, Training Loss: 0.00855
Epoch: 41259, Training Loss: 0.00671
Epoch: 41259, Training Loss: 0.00696
Epoch: 41259, Training Loss: 0.00697
Epoch: 41259, Training Loss: 0.00855
Epoch: 41260, Training Loss: 0.00671
Epoch: 41260, Training Loss: 0.00696
Epoch: 41260, Training Loss: 0.00697
Epoch: 41260, Training Loss: 0.00855
Epoch: 41261, Training Loss: 0.00671
Epoch: 41261, Training Loss: 0.00696
Epoch: 41261, Training Loss: 0.00697
Epoch: 41261, Training Loss: 0.00855
Epoch: 41262, Training Loss: 0.00670
Epoch: 41262, Training Loss: 0.00696
Epoch: 41262, Training Loss: 0.00697
Epoch: 41262, Training Loss: 0.00855
Epoch: 41263, Training Loss: 0.00670
Epoch: 41263, Training Loss: 0.00696
Epoch: 41263, Training Loss: 0.00697
Epoch: 41263, Training Loss: 0.00855
Epoch: 41264, Training Loss: 0.00670
Epoch: 41264, Training Loss: 0.00696
Epoch: 41264, Training Loss: 0.00697
Epoch: 41264, Training Loss: 0.00855
Epoch: 41265, Training Loss: 0.00670
Epoch: 41265, Training Loss: 0.00696
Epoch: 41265, Training Loss: 0.00697
Epoch: 41265, Training Loss: 0.00855
Epoch: 41266, Training Loss: 0.00670
Epoch: 41266, Training Loss: 0.00696
Epoch: 41266, Training Loss: 0.00697
Epoch: 41266, Training Loss: 0.00855
Epoch: 41267, Training Loss: 0.00670
Epoch: 41267, Training Loss: 0.00696
Epoch: 41267, Training Loss: 0.00697
Epoch: 41267, Training Loss: 0.00855
Epoch: 41268, Training Loss: 0.00670
Epoch: 41268, Training Loss: 0.00696
Epoch: 41268, Training Loss: 0.00697
Epoch: 41268, Training Loss: 0.00855
Epoch: 41269, Training Loss: 0.00670
Epoch: 41269, Training Loss: 0.00696
Epoch: 41269, Training Loss: 0.00697
Epoch: 41269, Training Loss: 0.00855
Epoch: 41270, Training Loss: 0.00670
Epoch: 41270, Training Loss: 0.00696
Epoch: 41270, Training Loss: 0.00697
Epoch: 41270, Training Loss: 0.00855
Epoch: 41271, Training Loss: 0.00670
Epoch: 41271, Training Loss: 0.00696
Epoch: 41271, Training Loss: 0.00697
Epoch: 41271, Training Loss: 0.00854
Epoch: 41272, Training Loss: 0.00670
Epoch: 41272, Training Loss: 0.00696
Epoch: 41272, Training Loss: 0.00697
Epoch: 41272, Training Loss: 0.00854
Epoch: 41273, Training Loss: 0.00670
Epoch: 41273, Training Loss: 0.00696
Epoch: 41273, Training Loss: 0.00697
Epoch: 41273, Training Loss: 0.00854
Epoch: 41274, Training Loss: 0.00670
Epoch: 41274, Training Loss: 0.00696
Epoch: 41274, Training Loss: 0.00697
Epoch: 41274, Training Loss: 0.00854
Epoch: 41275, Training Loss: 0.00670
Epoch: 41275, Training Loss: 0.00696
Epoch: 41275, Training Loss: 0.00697
Epoch: 41275, Training Loss: 0.00854
Epoch: 41276, Training Loss: 0.00670
Epoch: 41276, Training Loss: 0.00696
Epoch: 41276, Training Loss: 0.00697
Epoch: 41276, Training Loss: 0.00854
Epoch: 41277, Training Loss: 0.00670
Epoch: 41277, Training Loss: 0.00696
Epoch: 41277, Training Loss: 0.00697
Epoch: 41277, Training Loss: 0.00854
Epoch: 41278, Training Loss: 0.00670
Epoch: 41278, Training Loss: 0.00696
Epoch: 41278, Training Loss: 0.00697
Epoch: 41278, Training Loss: 0.00854
Epoch: 41279, Training Loss: 0.00670
Epoch: 41279, Training Loss: 0.00696
Epoch: 41279, Training Loss: 0.00697
Epoch: 41279, Training Loss: 0.00854
Epoch: 41280, Training Loss: 0.00670
Epoch: 41280, Training Loss: 0.00696
Epoch: 41280, Training Loss: 0.00697
Epoch: 41280, Training Loss: 0.00854
Epoch: 41281, Training Loss: 0.00670
Epoch: 41281, Training Loss: 0.00696
Epoch: 41281, Training Loss: 0.00697
Epoch: 41281, Training Loss: 0.00854
Epoch: 41282, Training Loss: 0.00670
Epoch: 41282, Training Loss: 0.00696
Epoch: 41282, Training Loss: 0.00697
Epoch: 41282, Training Loss: 0.00854
Epoch: 41283, Training Loss: 0.00670
Epoch: 41283, Training Loss: 0.00696
Epoch: 41283, Training Loss: 0.00697
Epoch: 41283, Training Loss: 0.00854
Epoch: 41284, Training Loss: 0.00670
Epoch: 41284, Training Loss: 0.00696
Epoch: 41284, Training Loss: 0.00697
Epoch: 41284, Training Loss: 0.00854
Epoch: 41285, Training Loss: 0.00670
Epoch: 41285, Training Loss: 0.00696
Epoch: 41285, Training Loss: 0.00697
Epoch: 41285, Training Loss: 0.00854
Epoch: 41286, Training Loss: 0.00670
Epoch: 41286, Training Loss: 0.00696
Epoch: 41286, Training Loss: 0.00697
Epoch: 41286, Training Loss: 0.00854
Epoch: 41287, Training Loss: 0.00670
Epoch: 41287, Training Loss: 0.00696
Epoch: 41287, Training Loss: 0.00697
Epoch: 41287, Training Loss: 0.00854
Epoch: 41288, Training Loss: 0.00670
Epoch: 41288, Training Loss: 0.00696
Epoch: 41288, Training Loss: 0.00697
Epoch: 41288, Training Loss: 0.00854
Epoch: 41289, Training Loss: 0.00670
Epoch: 41289, Training Loss: 0.00696
Epoch: 41289, Training Loss: 0.00697
Epoch: 41289, Training Loss: 0.00854
Epoch: 41290, Training Loss: 0.00670
Epoch: 41290, Training Loss: 0.00696
Epoch: 41290, Training Loss: 0.00697
Epoch: 41290, Training Loss: 0.00854
Epoch: 41291, Training Loss: 0.00670
Epoch: 41291, Training Loss: 0.00696
Epoch: 41291, Training Loss: 0.00697
Epoch: 41291, Training Loss: 0.00854
Epoch: 41292, Training Loss: 0.00670
Epoch: 41292, Training Loss: 0.00696
Epoch: 41292, Training Loss: 0.00697
Epoch: 41292, Training Loss: 0.00854
Epoch: 41293, Training Loss: 0.00670
Epoch: 41293, Training Loss: 0.00696
Epoch: 41293, Training Loss: 0.00697
Epoch: 41293, Training Loss: 0.00854
Epoch: 41294, Training Loss: 0.00670
Epoch: 41294, Training Loss: 0.00696
Epoch: 41294, Training Loss: 0.00697
Epoch: 41294, Training Loss: 0.00854
Epoch: 41295, Training Loss: 0.00670
Epoch: 41295, Training Loss: 0.00696
Epoch: 41295, Training Loss: 0.00697
Epoch: 41295, Training Loss: 0.00854
Epoch: 41296, Training Loss: 0.00670
Epoch: 41296, Training Loss: 0.00696
Epoch: 41296, Training Loss: 0.00697
Epoch: 41296, Training Loss: 0.00854
Epoch: 41297, Training Loss: 0.00670
Epoch: 41297, Training Loss: 0.00696
Epoch: 41297, Training Loss: 0.00697
Epoch: 41297, Training Loss: 0.00854
Epoch: 41298, Training Loss: 0.00670
Epoch: 41298, Training Loss: 0.00696
Epoch: 41298, Training Loss: 0.00697
Epoch: 41298, Training Loss: 0.00854
Epoch: 41299, Training Loss: 0.00670
Epoch: 41299, Training Loss: 0.00696
Epoch: 41299, Training Loss: 0.00697
Epoch: 41299, Training Loss: 0.00854
Epoch: 41300, Training Loss: 0.00670
Epoch: 41300, Training Loss: 0.00696
Epoch: 41300, Training Loss: 0.00697
Epoch: 41300, Training Loss: 0.00854
Epoch: 41301, Training Loss: 0.00670
Epoch: 41301, Training Loss: 0.00696
Epoch: 41301, Training Loss: 0.00697
Epoch: 41301, Training Loss: 0.00854
Epoch: 41302, Training Loss: 0.00670
Epoch: 41302, Training Loss: 0.00696
Epoch: 41302, Training Loss: 0.00697
Epoch: 41302, Training Loss: 0.00854
Epoch: 41303, Training Loss: 0.00670
Epoch: 41303, Training Loss: 0.00696
Epoch: 41303, Training Loss: 0.00697
Epoch: 41303, Training Loss: 0.00854
Epoch: 41304, Training Loss: 0.00670
Epoch: 41304, Training Loss: 0.00696
Epoch: 41304, Training Loss: 0.00697
Epoch: 41304, Training Loss: 0.00854
Epoch: 41305, Training Loss: 0.00670
Epoch: 41305, Training Loss: 0.00696
Epoch: 41305, Training Loss: 0.00697
Epoch: 41305, Training Loss: 0.00854
Epoch: 41306, Training Loss: 0.00670
Epoch: 41306, Training Loss: 0.00696
Epoch: 41306, Training Loss: 0.00697
Epoch: 41306, Training Loss: 0.00854
Epoch: 41307, Training Loss: 0.00670
Epoch: 41307, Training Loss: 0.00696
Epoch: 41307, Training Loss: 0.00697
Epoch: 41307, Training Loss: 0.00854
Epoch: 41308, Training Loss: 0.00670
Epoch: 41308, Training Loss: 0.00696
Epoch: 41308, Training Loss: 0.00697
Epoch: 41308, Training Loss: 0.00854
Epoch: 41309, Training Loss: 0.00670
Epoch: 41309, Training Loss: 0.00696
Epoch: 41309, Training Loss: 0.00697
Epoch: 41309, Training Loss: 0.00854
Epoch: 41310, Training Loss: 0.00670
Epoch: 41310, Training Loss: 0.00696
Epoch: 41310, Training Loss: 0.00697
Epoch: 41310, Training Loss: 0.00854
Epoch: 41311, Training Loss: 0.00670
Epoch: 41311, Training Loss: 0.00696
Epoch: 41311, Training Loss: 0.00697
Epoch: 41311, Training Loss: 0.00854
Epoch: 41312, Training Loss: 0.00670
Epoch: 41312, Training Loss: 0.00696
Epoch: 41312, Training Loss: 0.00697
Epoch: 41312, Training Loss: 0.00854
Epoch: 41313, Training Loss: 0.00670
Epoch: 41313, Training Loss: 0.00696
Epoch: 41313, Training Loss: 0.00697
Epoch: 41313, Training Loss: 0.00854
Epoch: 41314, Training Loss: 0.00670
Epoch: 41314, Training Loss: 0.00696
Epoch: 41314, Training Loss: 0.00696
Epoch: 41314, Training Loss: 0.00854
Epoch: 41315, Training Loss: 0.00670
Epoch: 41315, Training Loss: 0.00696
Epoch: 41315, Training Loss: 0.00696
Epoch: 41315, Training Loss: 0.00854
Epoch: 41316, Training Loss: 0.00670
Epoch: 41316, Training Loss: 0.00696
Epoch: 41316, Training Loss: 0.00696
Epoch: 41316, Training Loss: 0.00854
Epoch: 41317, Training Loss: 0.00670
Epoch: 41317, Training Loss: 0.00696
Epoch: 41317, Training Loss: 0.00696
Epoch: 41317, Training Loss: 0.00854
Epoch: 41318, Training Loss: 0.00670
Epoch: 41318, Training Loss: 0.00696
Epoch: 41318, Training Loss: 0.00696
Epoch: 41318, Training Loss: 0.00854
Epoch: 41319, Training Loss: 0.00670
Epoch: 41319, Training Loss: 0.00696
Epoch: 41319, Training Loss: 0.00696
Epoch: 41319, Training Loss: 0.00854
Epoch: 41320, Training Loss: 0.00670
Epoch: 41320, Training Loss: 0.00696
Epoch: 41320, Training Loss: 0.00696
Epoch: 41320, Training Loss: 0.00854
Epoch: 41321, Training Loss: 0.00670
Epoch: 41321, Training Loss: 0.00696
Epoch: 41321, Training Loss: 0.00696
Epoch: 41321, Training Loss: 0.00854
Epoch: 41322, Training Loss: 0.00670
Epoch: 41322, Training Loss: 0.00696
Epoch: 41322, Training Loss: 0.00696
Epoch: 41322, Training Loss: 0.00854
Epoch: 41323, Training Loss: 0.00670
Epoch: 41323, Training Loss: 0.00696
Epoch: 41323, Training Loss: 0.00696
Epoch: 41323, Training Loss: 0.00854
Epoch: 41324, Training Loss: 0.00670
Epoch: 41324, Training Loss: 0.00696
Epoch: 41324, Training Loss: 0.00696
Epoch: 41324, Training Loss: 0.00854
Epoch: 41325, Training Loss: 0.00670
Epoch: 41325, Training Loss: 0.00696
Epoch: 41325, Training Loss: 0.00696
Epoch: 41325, Training Loss: 0.00854
Epoch: 41326, Training Loss: 0.00670
Epoch: 41326, Training Loss: 0.00696
Epoch: 41326, Training Loss: 0.00696
Epoch: 41326, Training Loss: 0.00854
Epoch: 41327, Training Loss: 0.00670
Epoch: 41327, Training Loss: 0.00696
Epoch: 41327, Training Loss: 0.00696
Epoch: 41327, Training Loss: 0.00854
Epoch: 41328, Training Loss: 0.00670
Epoch: 41328, Training Loss: 0.00696
Epoch: 41328, Training Loss: 0.00696
Epoch: 41328, Training Loss: 0.00854
Epoch: 41329, Training Loss: 0.00670
Epoch: 41329, Training Loss: 0.00696
Epoch: 41329, Training Loss: 0.00696
Epoch: 41329, Training Loss: 0.00854
Epoch: 41330, Training Loss: 0.00670
Epoch: 41330, Training Loss: 0.00696
Epoch: 41330, Training Loss: 0.00696
Epoch: 41330, Training Loss: 0.00854
Epoch: 41331, Training Loss: 0.00670
Epoch: 41331, Training Loss: 0.00696
Epoch: 41331, Training Loss: 0.00696
Epoch: 41331, Training Loss: 0.00854
Epoch: 41332, Training Loss: 0.00670
Epoch: 41332, Training Loss: 0.00696
Epoch: 41332, Training Loss: 0.00696
Epoch: 41332, Training Loss: 0.00854
Epoch: 41333, Training Loss: 0.00670
Epoch: 41333, Training Loss: 0.00696
Epoch: 41333, Training Loss: 0.00696
Epoch: 41333, Training Loss: 0.00854
Epoch: 41334, Training Loss: 0.00670
Epoch: 41334, Training Loss: 0.00696
Epoch: 41334, Training Loss: 0.00696
Epoch: 41334, Training Loss: 0.00854
Epoch: 41335, Training Loss: 0.00670
Epoch: 41335, Training Loss: 0.00696
Epoch: 41335, Training Loss: 0.00696
Epoch: 41335, Training Loss: 0.00854
Epoch: 41336, Training Loss: 0.00670
Epoch: 41336, Training Loss: 0.00696
Epoch: 41336, Training Loss: 0.00696
Epoch: 41336, Training Loss: 0.00854
Epoch: 41337, Training Loss: 0.00670
Epoch: 41337, Training Loss: 0.00696
Epoch: 41337, Training Loss: 0.00696
Epoch: 41337, Training Loss: 0.00854
Epoch: 41338, Training Loss: 0.00670
Epoch: 41338, Training Loss: 0.00695
Epoch: 41338, Training Loss: 0.00696
Epoch: 41338, Training Loss: 0.00854
Epoch: 41339, Training Loss: 0.00670
Epoch: 41339, Training Loss: 0.00695
Epoch: 41339, Training Loss: 0.00696
Epoch: 41339, Training Loss: 0.00854
Epoch: 41340, Training Loss: 0.00670
Epoch: 41340, Training Loss: 0.00695
Epoch: 41340, Training Loss: 0.00696
Epoch: 41340, Training Loss: 0.00854
Epoch: 41341, Training Loss: 0.00670
Epoch: 41341, Training Loss: 0.00695
Epoch: 41341, Training Loss: 0.00696
Epoch: 41341, Training Loss: 0.00854
Epoch: 41342, Training Loss: 0.00670
Epoch: 41342, Training Loss: 0.00695
Epoch: 41342, Training Loss: 0.00696
Epoch: 41342, Training Loss: 0.00854
Epoch: 41343, Training Loss: 0.00670
Epoch: 41343, Training Loss: 0.00695
Epoch: 41343, Training Loss: 0.00696
Epoch: 41343, Training Loss: 0.00854
Epoch: 41344, Training Loss: 0.00670
Epoch: 41344, Training Loss: 0.00695
Epoch: 41344, Training Loss: 0.00696
Epoch: 41344, Training Loss: 0.00854
Epoch: 41345, Training Loss: 0.00670
Epoch: 41345, Training Loss: 0.00695
Epoch: 41345, Training Loss: 0.00696
Epoch: 41345, Training Loss: 0.00854
Epoch: 41346, Training Loss: 0.00670
Epoch: 41346, Training Loss: 0.00695
Epoch: 41346, Training Loss: 0.00696
Epoch: 41346, Training Loss: 0.00854
Epoch: 41347, Training Loss: 0.00670
Epoch: 41347, Training Loss: 0.00695
Epoch: 41347, Training Loss: 0.00696
Epoch: 41347, Training Loss: 0.00854
Epoch: 41348, Training Loss: 0.00670
Epoch: 41348, Training Loss: 0.00695
Epoch: 41348, Training Loss: 0.00696
Epoch: 41348, Training Loss: 0.00854
Epoch: 41349, Training Loss: 0.00670
Epoch: 41349, Training Loss: 0.00695
Epoch: 41349, Training Loss: 0.00696
Epoch: 41349, Training Loss: 0.00854
Epoch: 41350, Training Loss: 0.00670
Epoch: 41350, Training Loss: 0.00695
Epoch: 41350, Training Loss: 0.00696
Epoch: 41350, Training Loss: 0.00854
Epoch: 41351, Training Loss: 0.00670
Epoch: 41351, Training Loss: 0.00695
Epoch: 41351, Training Loss: 0.00696
Epoch: 41351, Training Loss: 0.00854
Epoch: 41352, Training Loss: 0.00670
Epoch: 41352, Training Loss: 0.00695
Epoch: 41352, Training Loss: 0.00696
Epoch: 41352, Training Loss: 0.00854
...
Epoch: 41594, Training Loss: 0.00851
Epoch: 41595, Training Loss: 0.00668
Epoch: 41595, Training Loss: 0.00693
Epoch: 41595, Training Loss: 0.00694
Epoch: 41595, Training Loss: 0.00851
Epoch: 41596, Training Loss: 0.00668
Epoch: 41596, Training Loss: 0.00693
Epoch: 41596, Training Loss: 0.00694
Epoch: 41596, Training Loss: 0.00851
Epoch: 41597, Training Loss: 0.00668
Epoch: 41597, Training Loss: 0.00693
Epoch: 41597, Training Loss: 0.00694
Epoch: 41597, Training Loss: 0.00851
Epoch: 41598, Training Loss: 0.00668
Epoch: 41598, Training Loss: 0.00693
Epoch: 41598, Training Loss: 0.00694
Epoch: 41598, Training Loss: 0.00851
Epoch: 41599, Training Loss: 0.00668
Epoch: 41599, Training Loss: 0.00693
Epoch: 41599, Training Loss: 0.00694
Epoch: 41599, Training Loss: 0.00851
Epoch: 41600, Training Loss: 0.00668
Epoch: 41600, Training Loss: 0.00693
Epoch: 41600, Training Loss: 0.00694
Epoch: 41600, Training Loss: 0.00851
Epoch: 41601, Training Loss: 0.00668
Epoch: 41601, Training Loss: 0.00693
Epoch: 41601, Training Loss: 0.00694
Epoch: 41601, Training Loss: 0.00851
Epoch: 41602, Training Loss: 0.00668
Epoch: 41602, Training Loss: 0.00693
Epoch: 41602, Training Loss: 0.00694
Epoch: 41602, Training Loss: 0.00851
Epoch: 41603, Training Loss: 0.00668
Epoch: 41603, Training Loss: 0.00693
Epoch: 41603, Training Loss: 0.00694
Epoch: 41603, Training Loss: 0.00851
Epoch: 41604, Training Loss: 0.00668
Epoch: 41604, Training Loss: 0.00693
Epoch: 41604, Training Loss: 0.00694
Epoch: 41604, Training Loss: 0.00851
Epoch: 41605, Training Loss: 0.00668
Epoch: 41605, Training Loss: 0.00693
Epoch: 41605, Training Loss: 0.00694
Epoch: 41605, Training Loss: 0.00851
Epoch: 41606, Training Loss: 0.00668
Epoch: 41606, Training Loss: 0.00693
Epoch: 41606, Training Loss: 0.00694
Epoch: 41606, Training Loss: 0.00851
Epoch: 41607, Training Loss: 0.00668
Epoch: 41607, Training Loss: 0.00693
Epoch: 41607, Training Loss: 0.00694
Epoch: 41607, Training Loss: 0.00851
Epoch: 41608, Training Loss: 0.00667
Epoch: 41608, Training Loss: 0.00693
Epoch: 41608, Training Loss: 0.00694
Epoch: 41608, Training Loss: 0.00851
Epoch: 41609, Training Loss: 0.00667
Epoch: 41609, Training Loss: 0.00693
Epoch: 41609, Training Loss: 0.00694
Epoch: 41609, Training Loss: 0.00851
Epoch: 41610, Training Loss: 0.00667
Epoch: 41610, Training Loss: 0.00693
Epoch: 41610, Training Loss: 0.00694
Epoch: 41610, Training Loss: 0.00851
Epoch: 41611, Training Loss: 0.00667
Epoch: 41611, Training Loss: 0.00693
Epoch: 41611, Training Loss: 0.00694
Epoch: 41611, Training Loss: 0.00851
Epoch: 41612, Training Loss: 0.00667
Epoch: 41612, Training Loss: 0.00693
Epoch: 41612, Training Loss: 0.00694
Epoch: 41612, Training Loss: 0.00851
Epoch: 41613, Training Loss: 0.00667
Epoch: 41613, Training Loss: 0.00693
Epoch: 41613, Training Loss: 0.00694
Epoch: 41613, Training Loss: 0.00851
Epoch: 41614, Training Loss: 0.00667
Epoch: 41614, Training Loss: 0.00693
Epoch: 41614, Training Loss: 0.00694
Epoch: 41614, Training Loss: 0.00851
Epoch: 41615, Training Loss: 0.00667
Epoch: 41615, Training Loss: 0.00693
Epoch: 41615, Training Loss: 0.00694
Epoch: 41615, Training Loss: 0.00851
Epoch: 41616, Training Loss: 0.00667
Epoch: 41616, Training Loss: 0.00693
Epoch: 41616, Training Loss: 0.00694
Epoch: 41616, Training Loss: 0.00851
Epoch: 41617, Training Loss: 0.00667
Epoch: 41617, Training Loss: 0.00693
Epoch: 41617, Training Loss: 0.00694
Epoch: 41617, Training Loss: 0.00851
Epoch: 41618, Training Loss: 0.00667
Epoch: 41618, Training Loss: 0.00693
Epoch: 41618, Training Loss: 0.00694
Epoch: 41618, Training Loss: 0.00851
Epoch: 41619, Training Loss: 0.00667
Epoch: 41619, Training Loss: 0.00693
Epoch: 41619, Training Loss: 0.00694
Epoch: 41619, Training Loss: 0.00851
Epoch: 41620, Training Loss: 0.00667
Epoch: 41620, Training Loss: 0.00693
Epoch: 41620, Training Loss: 0.00694
Epoch: 41620, Training Loss: 0.00851
Epoch: 41621, Training Loss: 0.00667
Epoch: 41621, Training Loss: 0.00693
Epoch: 41621, Training Loss: 0.00694
Epoch: 41621, Training Loss: 0.00851
Epoch: 41622, Training Loss: 0.00667
Epoch: 41622, Training Loss: 0.00693
Epoch: 41622, Training Loss: 0.00694
Epoch: 41622, Training Loss: 0.00851
Epoch: 41623, Training Loss: 0.00667
Epoch: 41623, Training Loss: 0.00693
Epoch: 41623, Training Loss: 0.00694
Epoch: 41623, Training Loss: 0.00850
Epoch: 41624, Training Loss: 0.00667
Epoch: 41624, Training Loss: 0.00693
Epoch: 41624, Training Loss: 0.00694
Epoch: 41624, Training Loss: 0.00850
Epoch: 41625, Training Loss: 0.00667
Epoch: 41625, Training Loss: 0.00693
Epoch: 41625, Training Loss: 0.00694
Epoch: 41625, Training Loss: 0.00850
Epoch: 41626, Training Loss: 0.00667
Epoch: 41626, Training Loss: 0.00693
Epoch: 41626, Training Loss: 0.00694
Epoch: 41626, Training Loss: 0.00850
Epoch: 41627, Training Loss: 0.00667
Epoch: 41627, Training Loss: 0.00693
Epoch: 41627, Training Loss: 0.00694
Epoch: 41627, Training Loss: 0.00850
Epoch: 41628, Training Loss: 0.00667
Epoch: 41628, Training Loss: 0.00693
Epoch: 41628, Training Loss: 0.00694
Epoch: 41628, Training Loss: 0.00850
Epoch: 41629, Training Loss: 0.00667
Epoch: 41629, Training Loss: 0.00693
Epoch: 41629, Training Loss: 0.00694
Epoch: 41629, Training Loss: 0.00850
Epoch: 41630, Training Loss: 0.00667
Epoch: 41630, Training Loss: 0.00693
Epoch: 41630, Training Loss: 0.00694
Epoch: 41630, Training Loss: 0.00850
Epoch: 41631, Training Loss: 0.00667
Epoch: 41631, Training Loss: 0.00693
Epoch: 41631, Training Loss: 0.00694
Epoch: 41631, Training Loss: 0.00850
Epoch: 41632, Training Loss: 0.00667
Epoch: 41632, Training Loss: 0.00693
Epoch: 41632, Training Loss: 0.00694
Epoch: 41632, Training Loss: 0.00850
Epoch: 41633, Training Loss: 0.00667
Epoch: 41633, Training Loss: 0.00693
Epoch: 41633, Training Loss: 0.00694
Epoch: 41633, Training Loss: 0.00850
Epoch: 41634, Training Loss: 0.00667
Epoch: 41634, Training Loss: 0.00693
Epoch: 41634, Training Loss: 0.00694
Epoch: 41634, Training Loss: 0.00850
Epoch: 41635, Training Loss: 0.00667
Epoch: 41635, Training Loss: 0.00693
Epoch: 41635, Training Loss: 0.00694
Epoch: 41635, Training Loss: 0.00850
Epoch: 41636, Training Loss: 0.00667
Epoch: 41636, Training Loss: 0.00693
Epoch: 41636, Training Loss: 0.00694
Epoch: 41636, Training Loss: 0.00850
Epoch: 41637, Training Loss: 0.00667
Epoch: 41637, Training Loss: 0.00693
Epoch: 41637, Training Loss: 0.00694
Epoch: 41637, Training Loss: 0.00850
Epoch: 41638, Training Loss: 0.00667
Epoch: 41638, Training Loss: 0.00693
Epoch: 41638, Training Loss: 0.00694
Epoch: 41638, Training Loss: 0.00850
Epoch: 41639, Training Loss: 0.00667
Epoch: 41639, Training Loss: 0.00693
Epoch: 41639, Training Loss: 0.00694
Epoch: 41639, Training Loss: 0.00850
Epoch: 41640, Training Loss: 0.00667
Epoch: 41640, Training Loss: 0.00693
Epoch: 41640, Training Loss: 0.00693
Epoch: 41640, Training Loss: 0.00850
Epoch: 41641, Training Loss: 0.00667
Epoch: 41641, Training Loss: 0.00693
Epoch: 41641, Training Loss: 0.00693
Epoch: 41641, Training Loss: 0.00850
Epoch: 41642, Training Loss: 0.00667
Epoch: 41642, Training Loss: 0.00693
Epoch: 41642, Training Loss: 0.00693
Epoch: 41642, Training Loss: 0.00850
Epoch: 41643, Training Loss: 0.00667
Epoch: 41643, Training Loss: 0.00693
Epoch: 41643, Training Loss: 0.00693
Epoch: 41643, Training Loss: 0.00850
Epoch: 41644, Training Loss: 0.00667
Epoch: 41644, Training Loss: 0.00693
Epoch: 41644, Training Loss: 0.00693
Epoch: 41644, Training Loss: 0.00850
Epoch: 41645, Training Loss: 0.00667
Epoch: 41645, Training Loss: 0.00693
Epoch: 41645, Training Loss: 0.00693
Epoch: 41645, Training Loss: 0.00850
Epoch: 41646, Training Loss: 0.00667
Epoch: 41646, Training Loss: 0.00693
Epoch: 41646, Training Loss: 0.00693
Epoch: 41646, Training Loss: 0.00850
Epoch: 41647, Training Loss: 0.00667
Epoch: 41647, Training Loss: 0.00693
Epoch: 41647, Training Loss: 0.00693
Epoch: 41647, Training Loss: 0.00850
Epoch: 41648, Training Loss: 0.00667
Epoch: 41648, Training Loss: 0.00693
Epoch: 41648, Training Loss: 0.00693
Epoch: 41648, Training Loss: 0.00850
Epoch: 41649, Training Loss: 0.00667
Epoch: 41649, Training Loss: 0.00693
Epoch: 41649, Training Loss: 0.00693
Epoch: 41649, Training Loss: 0.00850
Epoch: 41650, Training Loss: 0.00667
Epoch: 41650, Training Loss: 0.00693
Epoch: 41650, Training Loss: 0.00693
Epoch: 41650, Training Loss: 0.00850
Epoch: 41651, Training Loss: 0.00667
Epoch: 41651, Training Loss: 0.00693
Epoch: 41651, Training Loss: 0.00693
Epoch: 41651, Training Loss: 0.00850
Epoch: 41652, Training Loss: 0.00667
Epoch: 41652, Training Loss: 0.00693
Epoch: 41652, Training Loss: 0.00693
Epoch: 41652, Training Loss: 0.00850
Epoch: 41653, Training Loss: 0.00667
Epoch: 41653, Training Loss: 0.00693
Epoch: 41653, Training Loss: 0.00693
Epoch: 41653, Training Loss: 0.00850
Epoch: 41654, Training Loss: 0.00667
Epoch: 41654, Training Loss: 0.00693
Epoch: 41654, Training Loss: 0.00693
Epoch: 41654, Training Loss: 0.00850
Epoch: 41655, Training Loss: 0.00667
Epoch: 41655, Training Loss: 0.00693
Epoch: 41655, Training Loss: 0.00693
Epoch: 41655, Training Loss: 0.00850
Epoch: 41656, Training Loss: 0.00667
Epoch: 41656, Training Loss: 0.00693
Epoch: 41656, Training Loss: 0.00693
Epoch: 41656, Training Loss: 0.00850
Epoch: 41657, Training Loss: 0.00667
Epoch: 41657, Training Loss: 0.00693
Epoch: 41657, Training Loss: 0.00693
Epoch: 41657, Training Loss: 0.00850
Epoch: 41658, Training Loss: 0.00667
Epoch: 41658, Training Loss: 0.00693
Epoch: 41658, Training Loss: 0.00693
Epoch: 41658, Training Loss: 0.00850
Epoch: 41659, Training Loss: 0.00667
Epoch: 41659, Training Loss: 0.00693
Epoch: 41659, Training Loss: 0.00693
Epoch: 41659, Training Loss: 0.00850
Epoch: 41660, Training Loss: 0.00667
Epoch: 41660, Training Loss: 0.00693
Epoch: 41660, Training Loss: 0.00693
Epoch: 41660, Training Loss: 0.00850
Epoch: 41661, Training Loss: 0.00667
Epoch: 41661, Training Loss: 0.00693
Epoch: 41661, Training Loss: 0.00693
Epoch: 41661, Training Loss: 0.00850
Epoch: 41662, Training Loss: 0.00667
Epoch: 41662, Training Loss: 0.00693
Epoch: 41662, Training Loss: 0.00693
Epoch: 41662, Training Loss: 0.00850
Epoch: 41663, Training Loss: 0.00667
Epoch: 41663, Training Loss: 0.00693
Epoch: 41663, Training Loss: 0.00693
Epoch: 41663, Training Loss: 0.00850
Epoch: 41664, Training Loss: 0.00667
Epoch: 41664, Training Loss: 0.00693
Epoch: 41664, Training Loss: 0.00693
Epoch: 41664, Training Loss: 0.00850
Epoch: 41665, Training Loss: 0.00667
Epoch: 41665, Training Loss: 0.00693
Epoch: 41665, Training Loss: 0.00693
Epoch: 41665, Training Loss: 0.00850
Epoch: 41666, Training Loss: 0.00667
Epoch: 41666, Training Loss: 0.00692
Epoch: 41666, Training Loss: 0.00693
Epoch: 41666, Training Loss: 0.00850
Epoch: 41667, Training Loss: 0.00667
Epoch: 41667, Training Loss: 0.00692
Epoch: 41667, Training Loss: 0.00693
Epoch: 41667, Training Loss: 0.00850
Epoch: 41668, Training Loss: 0.00667
Epoch: 41668, Training Loss: 0.00692
Epoch: 41668, Training Loss: 0.00693
Epoch: 41668, Training Loss: 0.00850
Epoch: 41669, Training Loss: 0.00667
Epoch: 41669, Training Loss: 0.00692
Epoch: 41669, Training Loss: 0.00693
Epoch: 41669, Training Loss: 0.00850
Epoch: 41670, Training Loss: 0.00667
Epoch: 41670, Training Loss: 0.00692
Epoch: 41670, Training Loss: 0.00693
Epoch: 41670, Training Loss: 0.00850
Epoch: 41671, Training Loss: 0.00667
Epoch: 41671, Training Loss: 0.00692
Epoch: 41671, Training Loss: 0.00693
Epoch: 41671, Training Loss: 0.00850
Epoch: 41672, Training Loss: 0.00667
Epoch: 41672, Training Loss: 0.00692
Epoch: 41672, Training Loss: 0.00693
Epoch: 41672, Training Loss: 0.00850
Epoch: 41673, Training Loss: 0.00667
Epoch: 41673, Training Loss: 0.00692
Epoch: 41673, Training Loss: 0.00693
Epoch: 41673, Training Loss: 0.00850
Epoch: 41674, Training Loss: 0.00667
Epoch: 41674, Training Loss: 0.00692
Epoch: 41674, Training Loss: 0.00693
Epoch: 41674, Training Loss: 0.00850
Epoch: 41675, Training Loss: 0.00667
Epoch: 41675, Training Loss: 0.00692
Epoch: 41675, Training Loss: 0.00693
Epoch: 41675, Training Loss: 0.00850
Epoch: 41676, Training Loss: 0.00667
Epoch: 41676, Training Loss: 0.00692
Epoch: 41676, Training Loss: 0.00693
Epoch: 41676, Training Loss: 0.00850
Epoch: 41677, Training Loss: 0.00667
Epoch: 41677, Training Loss: 0.00692
Epoch: 41677, Training Loss: 0.00693
Epoch: 41677, Training Loss: 0.00850
Epoch: 41678, Training Loss: 0.00667
Epoch: 41678, Training Loss: 0.00692
Epoch: 41678, Training Loss: 0.00693
Epoch: 41678, Training Loss: 0.00850
Epoch: 41679, Training Loss: 0.00667
Epoch: 41679, Training Loss: 0.00692
Epoch: 41679, Training Loss: 0.00693
Epoch: 41679, Training Loss: 0.00850
Epoch: 41680, Training Loss: 0.00667
Epoch: 41680, Training Loss: 0.00692
Epoch: 41680, Training Loss: 0.00693
Epoch: 41680, Training Loss: 0.00850
Epoch: 41681, Training Loss: 0.00667
Epoch: 41681, Training Loss: 0.00692
Epoch: 41681, Training Loss: 0.00693
Epoch: 41681, Training Loss: 0.00850
Epoch: 41682, Training Loss: 0.00667
Epoch: 41682, Training Loss: 0.00692
Epoch: 41682, Training Loss: 0.00693
Epoch: 41682, Training Loss: 0.00850
Epoch: 41683, Training Loss: 0.00667
Epoch: 41683, Training Loss: 0.00692
Epoch: 41683, Training Loss: 0.00693
Epoch: 41683, Training Loss: 0.00850
Epoch: 41684, Training Loss: 0.00667
Epoch: 41684, Training Loss: 0.00692
Epoch: 41684, Training Loss: 0.00693
Epoch: 41684, Training Loss: 0.00850
Epoch: 41685, Training Loss: 0.00667
Epoch: 41685, Training Loss: 0.00692
Epoch: 41685, Training Loss: 0.00693
Epoch: 41685, Training Loss: 0.00850
Epoch: 41686, Training Loss: 0.00667
Epoch: 41686, Training Loss: 0.00692
Epoch: 41686, Training Loss: 0.00693
Epoch: 41686, Training Loss: 0.00850
Epoch: 41687, Training Loss: 0.00667
Epoch: 41687, Training Loss: 0.00692
Epoch: 41687, Training Loss: 0.00693
Epoch: 41687, Training Loss: 0.00850
Epoch: 41688, Training Loss: 0.00667
Epoch: 41688, Training Loss: 0.00692
Epoch: 41688, Training Loss: 0.00693
Epoch: 41688, Training Loss: 0.00850
Epoch: 41689, Training Loss: 0.00667
Epoch: 41689, Training Loss: 0.00692
Epoch: 41689, Training Loss: 0.00693
Epoch: 41689, Training Loss: 0.00850
Epoch: 41690, Training Loss: 0.00667
Epoch: 41690, Training Loss: 0.00692
Epoch: 41690, Training Loss: 0.00693
Epoch: 41690, Training Loss: 0.00850
Epoch: 41691, Training Loss: 0.00667
Epoch: 41691, Training Loss: 0.00692
Epoch: 41691, Training Loss: 0.00693
Epoch: 41691, Training Loss: 0.00850
Epoch: 41692, Training Loss: 0.00667
Epoch: 41692, Training Loss: 0.00692
Epoch: 41692, Training Loss: 0.00693
Epoch: 41692, Training Loss: 0.00850
Epoch: 41693, Training Loss: 0.00667
Epoch: 41693, Training Loss: 0.00692
Epoch: 41693, Training Loss: 0.00693
Epoch: 41693, Training Loss: 0.00850
Epoch: 41694, Training Loss: 0.00667
Epoch: 41694, Training Loss: 0.00692
Epoch: 41694, Training Loss: 0.00693
Epoch: 41694, Training Loss: 0.00850
Epoch: 41695, Training Loss: 0.00667
Epoch: 41695, Training Loss: 0.00692
Epoch: 41695, Training Loss: 0.00693
Epoch: 41695, Training Loss: 0.00850
Epoch: 41696, Training Loss: 0.00667
Epoch: 41696, Training Loss: 0.00692
Epoch: 41696, Training Loss: 0.00693
Epoch: 41696, Training Loss: 0.00850
Epoch: 41697, Training Loss: 0.00667
Epoch: 41697, Training Loss: 0.00692
Epoch: 41697, Training Loss: 0.00693
Epoch: 41697, Training Loss: 0.00850
Epoch: 41698, Training Loss: 0.00667
Epoch: 41698, Training Loss: 0.00692
Epoch: 41698, Training Loss: 0.00693
Epoch: 41698, Training Loss: 0.00850
Epoch: 41699, Training Loss: 0.00667
Epoch: 41699, Training Loss: 0.00692
Epoch: 41699, Training Loss: 0.00693
Epoch: 41699, Training Loss: 0.00850
Epoch: 41700, Training Loss: 0.00667
Epoch: 41700, Training Loss: 0.00692
Epoch: 41700, Training Loss: 0.00693
Epoch: 41700, Training Loss: 0.00850
Epoch: 41701, Training Loss: 0.00667
Epoch: 41701, Training Loss: 0.00692
Epoch: 41701, Training Loss: 0.00693
Epoch: 41701, Training Loss: 0.00850
Epoch: 41702, Training Loss: 0.00667
Epoch: 41702, Training Loss: 0.00692
Epoch: 41702, Training Loss: 0.00693
Epoch: 41702, Training Loss: 0.00850
Epoch: 41703, Training Loss: 0.00667
Epoch: 41703, Training Loss: 0.00692
Epoch: 41703, Training Loss: 0.00693
Epoch: 41703, Training Loss: 0.00850
Epoch: 41704, Training Loss: 0.00667
Epoch: 41704, Training Loss: 0.00692
Epoch: 41704, Training Loss: 0.00693
Epoch: 41704, Training Loss: 0.00850
Epoch: 41705, Training Loss: 0.00667
Epoch: 41705, Training Loss: 0.00692
Epoch: 41705, Training Loss: 0.00693
Epoch: 41705, Training Loss: 0.00850
Epoch: 41706, Training Loss: 0.00667
Epoch: 41706, Training Loss: 0.00692
Epoch: 41706, Training Loss: 0.00693
Epoch: 41706, Training Loss: 0.00850
Epoch: 41707, Training Loss: 0.00667
Epoch: 41707, Training Loss: 0.00692
Epoch: 41707, Training Loss: 0.00693
Epoch: 41707, Training Loss: 0.00850
Epoch: 41708, Training Loss: 0.00667
Epoch: 41708, Training Loss: 0.00692
Epoch: 41708, Training Loss: 0.00693
Epoch: 41708, Training Loss: 0.00850
Epoch: 41709, Training Loss: 0.00667
Epoch: 41709, Training Loss: 0.00692
Epoch: 41709, Training Loss: 0.00693
Epoch: 41709, Training Loss: 0.00850
Epoch: 41710, Training Loss: 0.00667
Epoch: 41710, Training Loss: 0.00692
Epoch: 41710, Training Loss: 0.00693
Epoch: 41710, Training Loss: 0.00850
Epoch: 41711, Training Loss: 0.00667
Epoch: 41711, Training Loss: 0.00692
Epoch: 41711, Training Loss: 0.00693
Epoch: 41711, Training Loss: 0.00849
Epoch: 41712, Training Loss: 0.00667
Epoch: 41712, Training Loss: 0.00692
Epoch: 41712, Training Loss: 0.00693
Epoch: 41712, Training Loss: 0.00849
Epoch: 41713, Training Loss: 0.00667
Epoch: 41713, Training Loss: 0.00692
Epoch: 41713, Training Loss: 0.00693
Epoch: 41713, Training Loss: 0.00849
Epoch: 41714, Training Loss: 0.00667
Epoch: 41714, Training Loss: 0.00692
Epoch: 41714, Training Loss: 0.00693
Epoch: 41714, Training Loss: 0.00849
Epoch: 41715, Training Loss: 0.00667
Epoch: 41715, Training Loss: 0.00692
Epoch: 41715, Training Loss: 0.00693
Epoch: 41715, Training Loss: 0.00849
Epoch: 41716, Training Loss: 0.00667
Epoch: 41716, Training Loss: 0.00692
Epoch: 41716, Training Loss: 0.00693
Epoch: 41716, Training Loss: 0.00849
Epoch: 41717, Training Loss: 0.00667
Epoch: 41717, Training Loss: 0.00692
Epoch: 41717, Training Loss: 0.00693
Epoch: 41717, Training Loss: 0.00849
Epoch: 41718, Training Loss: 0.00667
Epoch: 41718, Training Loss: 0.00692
Epoch: 41718, Training Loss: 0.00693
Epoch: 41718, Training Loss: 0.00849
Epoch: 41719, Training Loss: 0.00667
Epoch: 41719, Training Loss: 0.00692
Epoch: 41719, Training Loss: 0.00693
Epoch: 41719, Training Loss: 0.00849
Epoch: 41720, Training Loss: 0.00667
Epoch: 41720, Training Loss: 0.00692
Epoch: 41720, Training Loss: 0.00693
Epoch: 41720, Training Loss: 0.00849
Epoch: 41721, Training Loss: 0.00667
Epoch: 41721, Training Loss: 0.00692
Epoch: 41721, Training Loss: 0.00693
Epoch: 41721, Training Loss: 0.00849
Epoch: 41722, Training Loss: 0.00667
Epoch: 41722, Training Loss: 0.00692
Epoch: 41722, Training Loss: 0.00693
Epoch: 41722, Training Loss: 0.00849
Epoch: 41723, Training Loss: 0.00667
Epoch: 41723, Training Loss: 0.00692
Epoch: 41723, Training Loss: 0.00693
Epoch: 41723, Training Loss: 0.00849
Epoch: 41724, Training Loss: 0.00666
Epoch: 41724, Training Loss: 0.00692
Epoch: 41724, Training Loss: 0.00693
Epoch: 41724, Training Loss: 0.00849
Epoch: 41725, Training Loss: 0.00666
Epoch: 41725, Training Loss: 0.00692
Epoch: 41725, Training Loss: 0.00693
Epoch: 41725, Training Loss: 0.00849
Epoch: 41726, Training Loss: 0.00666
Epoch: 41726, Training Loss: 0.00692
Epoch: 41726, Training Loss: 0.00693
Epoch: 41726, Training Loss: 0.00849
Epoch: 41727, Training Loss: 0.00666
Epoch: 41727, Training Loss: 0.00692
Epoch: 41727, Training Loss: 0.00693
Epoch: 41727, Training Loss: 0.00849
Epoch: 41728, Training Loss: 0.00666
Epoch: 41728, Training Loss: 0.00692
Epoch: 41728, Training Loss: 0.00693
Epoch: 41728, Training Loss: 0.00849
Epoch: 41729, Training Loss: 0.00666
Epoch: 41729, Training Loss: 0.00692
Epoch: 41729, Training Loss: 0.00693
Epoch: 41729, Training Loss: 0.00849
Epoch: 41730, Training Loss: 0.00666
Epoch: 41730, Training Loss: 0.00692
Epoch: 41730, Training Loss: 0.00693
Epoch: 41730, Training Loss: 0.00849
Epoch: 41731, Training Loss: 0.00666
Epoch: 41731, Training Loss: 0.00692
Epoch: 41731, Training Loss: 0.00693
Epoch: 41731, Training Loss: 0.00849
Epoch: 41732, Training Loss: 0.00666
Epoch: 41732, Training Loss: 0.00692
Epoch: 41732, Training Loss: 0.00693
Epoch: 41732, Training Loss: 0.00849
Epoch: 41733, Training Loss: 0.00666
Epoch: 41733, Training Loss: 0.00692
Epoch: 41733, Training Loss: 0.00693
Epoch: 41733, Training Loss: 0.00849
Epoch: 41734, Training Loss: 0.00666
Epoch: 41734, Training Loss: 0.00692
Epoch: 41734, Training Loss: 0.00693
Epoch: 41734, Training Loss: 0.00849
Epoch: 41735, Training Loss: 0.00666
Epoch: 41735, Training Loss: 0.00692
Epoch: 41735, Training Loss: 0.00693
Epoch: 41735, Training Loss: 0.00849
Epoch: 41736, Training Loss: 0.00666
Epoch: 41736, Training Loss: 0.00692
Epoch: 41736, Training Loss: 0.00693
Epoch: 41736, Training Loss: 0.00849
Epoch: 41737, Training Loss: 0.00666
Epoch: 41737, Training Loss: 0.00692
Epoch: 41737, Training Loss: 0.00693
Epoch: 41737, Training Loss: 0.00849
Epoch: 41738, Training Loss: 0.00666
Epoch: 41738, Training Loss: 0.00692
Epoch: 41738, Training Loss: 0.00693
Epoch: 41738, Training Loss: 0.00849
Epoch: 41739, Training Loss: 0.00666
Epoch: 41739, Training Loss: 0.00692
Epoch: 41739, Training Loss: 0.00693
Epoch: 41739, Training Loss: 0.00849
Epoch: 41740, Training Loss: 0.00666
Epoch: 41740, Training Loss: 0.00692
Epoch: 41740, Training Loss: 0.00693
Epoch: 41740, Training Loss: 0.00849
Epoch: 41741, Training Loss: 0.00666
Epoch: 41741, Training Loss: 0.00692
Epoch: 41741, Training Loss: 0.00693
Epoch: 41741, Training Loss: 0.00849
Epoch: 41742, Training Loss: 0.00666
Epoch: 41742, Training Loss: 0.00692
Epoch: 41742, Training Loss: 0.00693
Epoch: 41742, Training Loss: 0.00849
Epoch: 41743, Training Loss: 0.00666
Epoch: 41743, Training Loss: 0.00692
Epoch: 41743, Training Loss: 0.00693
Epoch: 41743, Training Loss: 0.00849
Epoch: 41744, Training Loss: 0.00666
Epoch: 41744, Training Loss: 0.00692
Epoch: 41744, Training Loss: 0.00693
Epoch: 41744, Training Loss: 0.00849
Epoch: 41745, Training Loss: 0.00666
Epoch: 41745, Training Loss: 0.00692
Epoch: 41745, Training Loss: 0.00693
Epoch: 41745, Training Loss: 0.00849
Epoch: 41746, Training Loss: 0.00666
Epoch: 41746, Training Loss: 0.00692
Epoch: 41746, Training Loss: 0.00693
Epoch: 41746, Training Loss: 0.00849
Epoch: 41747, Training Loss: 0.00666
Epoch: 41747, Training Loss: 0.00692
Epoch: 41747, Training Loss: 0.00693
Epoch: 41747, Training Loss: 0.00849
Epoch: 41748, Training Loss: 0.00666
Epoch: 41748, Training Loss: 0.00692
Epoch: 41748, Training Loss: 0.00693
Epoch: 41748, Training Loss: 0.00849
Epoch: 41749, Training Loss: 0.00666
Epoch: 41749, Training Loss: 0.00692
Epoch: 41749, Training Loss: 0.00693
Epoch: 41749, Training Loss: 0.00849
Epoch: 41750, Training Loss: 0.00666
Epoch: 41750, Training Loss: 0.00692
Epoch: 41750, Training Loss: 0.00692
Epoch: 41750, Training Loss: 0.00849
Epoch: 41751, Training Loss: 0.00666
Epoch: 41751, Training Loss: 0.00692
Epoch: 41751, Training Loss: 0.00692
Epoch: 41751, Training Loss: 0.00849
Epoch: 41752, Training Loss: 0.00666
Epoch: 41752, Training Loss: 0.00692
Epoch: 41752, Training Loss: 0.00692
Epoch: 41752, Training Loss: 0.00849
Epoch: 41753, Training Loss: 0.00666
Epoch: 41753, Training Loss: 0.00692
Epoch: 41753, Training Loss: 0.00692
Epoch: 41753, Training Loss: 0.00849
Epoch: 41754, Training Loss: 0.00666
Epoch: 41754, Training Loss: 0.00692
Epoch: 41754, Training Loss: 0.00692
Epoch: 41754, Training Loss: 0.00849
Epoch: 41755, Training Loss: 0.00666
Epoch: 41755, Training Loss: 0.00692
Epoch: 41755, Training Loss: 0.00692
Epoch: 41755, Training Loss: 0.00849
Epoch: 41756, Training Loss: 0.00666
Epoch: 41756, Training Loss: 0.00692
Epoch: 41756, Training Loss: 0.00692
Epoch: 41756, Training Loss: 0.00849
Epoch: 41757, Training Loss: 0.00666
Epoch: 41757, Training Loss: 0.00692
Epoch: 41757, Training Loss: 0.00692
Epoch: 41757, Training Loss: 0.00849
Epoch: 41758, Training Loss: 0.00666
Epoch: 41758, Training Loss: 0.00692
Epoch: 41758, Training Loss: 0.00692
Epoch: 41758, Training Loss: 0.00849
Epoch: 41759, Training Loss: 0.00666
Epoch: 41759, Training Loss: 0.00692
Epoch: 41759, Training Loss: 0.00692
Epoch: 41759, Training Loss: 0.00849
Epoch: 41760, Training Loss: 0.00666
Epoch: 41760, Training Loss: 0.00692
Epoch: 41760, Training Loss: 0.00692
Epoch: 41760, Training Loss: 0.00849
Epoch: 41761, Training Loss: 0.00666
Epoch: 41761, Training Loss: 0.00692
Epoch: 41761, Training Loss: 0.00692
Epoch: 41761, Training Loss: 0.00849
Epoch: 41762, Training Loss: 0.00666
Epoch: 41762, Training Loss: 0.00692
Epoch: 41762, Training Loss: 0.00692
Epoch: 41762, Training Loss: 0.00849
Epoch: 41763, Training Loss: 0.00666
Epoch: 41763, Training Loss: 0.00692
Epoch: 41763, Training Loss: 0.00692
Epoch: 41763, Training Loss: 0.00849
Epoch: 41764, Training Loss: 0.00666
Epoch: 41764, Training Loss: 0.00692
Epoch: 41764, Training Loss: 0.00692
Epoch: 41764, Training Loss: 0.00849
Epoch: 41765, Training Loss: 0.00666
Epoch: 41765, Training Loss: 0.00692
Epoch: 41765, Training Loss: 0.00692
Epoch: 41765, Training Loss: 0.00849
Epoch: 41766, Training Loss: 0.00666
Epoch: 41766, Training Loss: 0.00692
Epoch: 41766, Training Loss: 0.00692
Epoch: 41766, Training Loss: 0.00849
Epoch: 41767, Training Loss: 0.00666
Epoch: 41767, Training Loss: 0.00692
Epoch: 41767, Training Loss: 0.00692
Epoch: 41767, Training Loss: 0.00849
Epoch: 41768, Training Loss: 0.00666
Epoch: 41768, Training Loss: 0.00692
Epoch: 41768, Training Loss: 0.00692
Epoch: 41768, Training Loss: 0.00849
Epoch: 41769, Training Loss: 0.00666
Epoch: 41769, Training Loss: 0.00692
Epoch: 41769, Training Loss: 0.00692
Epoch: 41769, Training Loss: 0.00849
Epoch: 41770, Training Loss: 0.00666
Epoch: 41770, Training Loss: 0.00692
Epoch: 41770, Training Loss: 0.00692
Epoch: 41770, Training Loss: 0.00849
Epoch: 41771, Training Loss: 0.00666
Epoch: 41771, Training Loss: 0.00692
Epoch: 41771, Training Loss: 0.00692
Epoch: 41771, Training Loss: 0.00849
Epoch: 41772, Training Loss: 0.00666
Epoch: 41772, Training Loss: 0.00692
Epoch: 41772, Training Loss: 0.00692
Epoch: 41772, Training Loss: 0.00849
Epoch: 41773, Training Loss: 0.00666
Epoch: 41773, Training Loss: 0.00692
Epoch: 41773, Training Loss: 0.00692
Epoch: 41773, Training Loss: 0.00849
Epoch: 41774, Training Loss: 0.00666
Epoch: 41774, Training Loss: 0.00692
Epoch: 41774, Training Loss: 0.00692
Epoch: 41774, Training Loss: 0.00849
Epoch: 41775, Training Loss: 0.00666
Epoch: 41775, Training Loss: 0.00692
Epoch: 41775, Training Loss: 0.00692
Epoch: 41775, Training Loss: 0.00849
Epoch: 41776, Training Loss: 0.00666
Epoch: 41776, Training Loss: 0.00691
Epoch: 41776, Training Loss: 0.00692
Epoch: 41776, Training Loss: 0.00849
Epoch: 41777, Training Loss: 0.00666
Epoch: 41777, Training Loss: 0.00691
Epoch: 41777, Training Loss: 0.00692
Epoch: 41777, Training Loss: 0.00849
Epoch: 41778, Training Loss: 0.00666
Epoch: 41778, Training Loss: 0.00691
Epoch: 41778, Training Loss: 0.00692
Epoch: 41778, Training Loss: 0.00849
Epoch: 41779, Training Loss: 0.00666
Epoch: 41779, Training Loss: 0.00691
Epoch: 41779, Training Loss: 0.00692
Epoch: 41779, Training Loss: 0.00849
Epoch: 41780, Training Loss: 0.00666
Epoch: 41780, Training Loss: 0.00691
Epoch: 41780, Training Loss: 0.00692
Epoch: 41780, Training Loss: 0.00849
Epoch: 41781, Training Loss: 0.00666
Epoch: 41781, Training Loss: 0.00691
Epoch: 41781, Training Loss: 0.00692
Epoch: 41781, Training Loss: 0.00849
Epoch: 41782, Training Loss: 0.00666
Epoch: 41782, Training Loss: 0.00691
Epoch: 41782, Training Loss: 0.00692
Epoch: 41782, Training Loss: 0.00849
Epoch: 41783, Training Loss: 0.00666
Epoch: 41783, Training Loss: 0.00691
Epoch: 41783, Training Loss: 0.00692
Epoch: 41783, Training Loss: 0.00849
Epoch: 41784, Training Loss: 0.00666
Epoch: 41784, Training Losses (per XOR pattern): 0.00666, 0.00691, 0.00692, 0.00849
[... output truncated: one loss line per training pattern per epoch; losses decrease very slowly ...]
Epoch: 42027, Training Losses (per XOR pattern): 0.00664, 0.00689, 0.00690, 0.00846
Epoch: 42028, Training Loss: 0.00664
Epoch: 42028, Training Loss: 0.00689
Epoch: 42028, Training Loss: 0.00690
Epoch: 42028, Training Loss: 0.00846
Epoch: 42029, Training Loss: 0.00664
Epoch: 42029, Training Loss: 0.00689
Epoch: 42029, Training Loss: 0.00690
Epoch: 42029, Training Loss: 0.00846
Epoch: 42030, Training Loss: 0.00664
Epoch: 42030, Training Loss: 0.00689
Epoch: 42030, Training Loss: 0.00690
Epoch: 42030, Training Loss: 0.00846
Epoch: 42031, Training Loss: 0.00664
Epoch: 42031, Training Loss: 0.00689
Epoch: 42031, Training Loss: 0.00690
Epoch: 42031, Training Loss: 0.00846
Epoch: 42032, Training Loss: 0.00664
Epoch: 42032, Training Loss: 0.00689
Epoch: 42032, Training Loss: 0.00690
Epoch: 42032, Training Loss: 0.00846
Epoch: 42033, Training Loss: 0.00664
Epoch: 42033, Training Loss: 0.00689
Epoch: 42033, Training Loss: 0.00690
Epoch: 42033, Training Loss: 0.00846
Epoch: 42034, Training Loss: 0.00664
Epoch: 42034, Training Loss: 0.00689
Epoch: 42034, Training Loss: 0.00690
Epoch: 42034, Training Loss: 0.00846
Epoch: 42035, Training Loss: 0.00664
Epoch: 42035, Training Loss: 0.00689
Epoch: 42035, Training Loss: 0.00690
Epoch: 42035, Training Loss: 0.00846
Epoch: 42036, Training Loss: 0.00664
Epoch: 42036, Training Loss: 0.00689
Epoch: 42036, Training Loss: 0.00690
Epoch: 42036, Training Loss: 0.00846
Epoch: 42037, Training Loss: 0.00664
Epoch: 42037, Training Loss: 0.00689
Epoch: 42037, Training Loss: 0.00690
Epoch: 42037, Training Loss: 0.00846
Epoch: 42038, Training Loss: 0.00664
Epoch: 42038, Training Loss: 0.00689
Epoch: 42038, Training Loss: 0.00690
Epoch: 42038, Training Loss: 0.00846
Epoch: 42039, Training Loss: 0.00664
Epoch: 42039, Training Loss: 0.00689
Epoch: 42039, Training Loss: 0.00690
Epoch: 42039, Training Loss: 0.00846
Epoch: 42040, Training Loss: 0.00664
Epoch: 42040, Training Loss: 0.00689
Epoch: 42040, Training Loss: 0.00690
Epoch: 42040, Training Loss: 0.00846
Epoch: 42041, Training Loss: 0.00664
Epoch: 42041, Training Loss: 0.00689
Epoch: 42041, Training Loss: 0.00690
Epoch: 42041, Training Loss: 0.00846
Epoch: 42042, Training Loss: 0.00664
Epoch: 42042, Training Loss: 0.00689
Epoch: 42042, Training Loss: 0.00690
Epoch: 42042, Training Loss: 0.00846
Epoch: 42043, Training Loss: 0.00664
Epoch: 42043, Training Loss: 0.00689
Epoch: 42043, Training Loss: 0.00690
Epoch: 42043, Training Loss: 0.00846
Epoch: 42044, Training Loss: 0.00664
Epoch: 42044, Training Loss: 0.00689
Epoch: 42044, Training Loss: 0.00690
Epoch: 42044, Training Loss: 0.00846
Epoch: 42045, Training Loss: 0.00664
Epoch: 42045, Training Loss: 0.00689
Epoch: 42045, Training Loss: 0.00690
Epoch: 42045, Training Loss: 0.00846
Epoch: 42046, Training Loss: 0.00664
Epoch: 42046, Training Loss: 0.00689
Epoch: 42046, Training Loss: 0.00690
Epoch: 42046, Training Loss: 0.00846
Epoch: 42047, Training Loss: 0.00664
Epoch: 42047, Training Loss: 0.00689
Epoch: 42047, Training Loss: 0.00690
Epoch: 42047, Training Loss: 0.00846
Epoch: 42048, Training Loss: 0.00664
Epoch: 42048, Training Loss: 0.00689
Epoch: 42048, Training Loss: 0.00690
Epoch: 42048, Training Loss: 0.00846
Epoch: 42049, Training Loss: 0.00664
Epoch: 42049, Training Loss: 0.00689
Epoch: 42049, Training Loss: 0.00690
Epoch: 42049, Training Loss: 0.00846
Epoch: 42050, Training Loss: 0.00664
Epoch: 42050, Training Loss: 0.00689
Epoch: 42050, Training Loss: 0.00690
Epoch: 42050, Training Loss: 0.00846
Epoch: 42051, Training Loss: 0.00664
Epoch: 42051, Training Loss: 0.00689
Epoch: 42051, Training Loss: 0.00690
Epoch: 42051, Training Loss: 0.00846
Epoch: 42052, Training Loss: 0.00664
Epoch: 42052, Training Loss: 0.00689
Epoch: 42052, Training Loss: 0.00690
Epoch: 42052, Training Loss: 0.00846
Epoch: 42053, Training Loss: 0.00664
Epoch: 42053, Training Loss: 0.00689
Epoch: 42053, Training Loss: 0.00690
Epoch: 42053, Training Loss: 0.00846
Epoch: 42054, Training Loss: 0.00664
Epoch: 42054, Training Loss: 0.00689
Epoch: 42054, Training Loss: 0.00690
Epoch: 42054, Training Loss: 0.00846
Epoch: 42055, Training Loss: 0.00664
Epoch: 42055, Training Loss: 0.00689
Epoch: 42055, Training Loss: 0.00690
Epoch: 42055, Training Loss: 0.00846
Epoch: 42056, Training Loss: 0.00664
Epoch: 42056, Training Loss: 0.00689
Epoch: 42056, Training Loss: 0.00690
Epoch: 42056, Training Loss: 0.00846
Epoch: 42057, Training Loss: 0.00664
Epoch: 42057, Training Loss: 0.00689
Epoch: 42057, Training Loss: 0.00690
Epoch: 42057, Training Loss: 0.00846
Epoch: 42058, Training Loss: 0.00664
Epoch: 42058, Training Loss: 0.00689
Epoch: 42058, Training Loss: 0.00690
Epoch: 42058, Training Loss: 0.00846
Epoch: 42059, Training Loss: 0.00664
Epoch: 42059, Training Loss: 0.00689
Epoch: 42059, Training Loss: 0.00690
Epoch: 42059, Training Loss: 0.00846
Epoch: 42060, Training Loss: 0.00664
Epoch: 42060, Training Loss: 0.00689
Epoch: 42060, Training Loss: 0.00690
Epoch: 42060, Training Loss: 0.00846
Epoch: 42061, Training Loss: 0.00664
Epoch: 42061, Training Loss: 0.00689
Epoch: 42061, Training Loss: 0.00690
Epoch: 42061, Training Loss: 0.00846
Epoch: 42062, Training Loss: 0.00664
Epoch: 42062, Training Loss: 0.00689
Epoch: 42062, Training Loss: 0.00690
Epoch: 42062, Training Loss: 0.00846
Epoch: 42063, Training Loss: 0.00664
Epoch: 42063, Training Loss: 0.00689
Epoch: 42063, Training Loss: 0.00690
Epoch: 42063, Training Loss: 0.00846
Epoch: 42064, Training Loss: 0.00664
Epoch: 42064, Training Loss: 0.00689
Epoch: 42064, Training Loss: 0.00690
Epoch: 42064, Training Loss: 0.00846
Epoch: 42065, Training Loss: 0.00664
Epoch: 42065, Training Loss: 0.00689
Epoch: 42065, Training Loss: 0.00690
Epoch: 42065, Training Loss: 0.00846
Epoch: 42066, Training Loss: 0.00664
Epoch: 42066, Training Loss: 0.00689
Epoch: 42066, Training Loss: 0.00690
Epoch: 42066, Training Loss: 0.00846
Epoch: 42067, Training Loss: 0.00664
Epoch: 42067, Training Loss: 0.00689
Epoch: 42067, Training Loss: 0.00690
Epoch: 42067, Training Loss: 0.00846
Epoch: 42068, Training Loss: 0.00664
Epoch: 42068, Training Loss: 0.00689
Epoch: 42068, Training Loss: 0.00690
Epoch: 42068, Training Loss: 0.00846
Epoch: 42069, Training Loss: 0.00664
Epoch: 42069, Training Loss: 0.00689
Epoch: 42069, Training Loss: 0.00690
Epoch: 42069, Training Loss: 0.00845
Epoch: 42070, Training Loss: 0.00664
Epoch: 42070, Training Loss: 0.00689
Epoch: 42070, Training Loss: 0.00690
Epoch: 42070, Training Loss: 0.00845
Epoch: 42071, Training Loss: 0.00664
Epoch: 42071, Training Loss: 0.00689
Epoch: 42071, Training Loss: 0.00690
Epoch: 42071, Training Loss: 0.00845
Epoch: 42072, Training Loss: 0.00664
Epoch: 42072, Training Loss: 0.00689
Epoch: 42072, Training Loss: 0.00690
Epoch: 42072, Training Loss: 0.00845
Epoch: 42073, Training Loss: 0.00664
Epoch: 42073, Training Loss: 0.00689
Epoch: 42073, Training Loss: 0.00690
Epoch: 42073, Training Loss: 0.00845
Epoch: 42074, Training Loss: 0.00664
Epoch: 42074, Training Loss: 0.00689
Epoch: 42074, Training Loss: 0.00690
Epoch: 42074, Training Loss: 0.00845
Epoch: 42075, Training Loss: 0.00664
Epoch: 42075, Training Loss: 0.00689
Epoch: 42075, Training Loss: 0.00690
Epoch: 42075, Training Loss: 0.00845
Epoch: 42076, Training Loss: 0.00663
Epoch: 42076, Training Loss: 0.00689
Epoch: 42076, Training Loss: 0.00690
Epoch: 42076, Training Loss: 0.00845
Epoch: 42077, Training Loss: 0.00663
Epoch: 42077, Training Loss: 0.00689
Epoch: 42077, Training Loss: 0.00690
Epoch: 42077, Training Loss: 0.00845
Epoch: 42078, Training Loss: 0.00663
Epoch: 42078, Training Loss: 0.00689
Epoch: 42078, Training Loss: 0.00690
Epoch: 42078, Training Loss: 0.00845
Epoch: 42079, Training Loss: 0.00663
Epoch: 42079, Training Loss: 0.00689
Epoch: 42079, Training Loss: 0.00690
Epoch: 42079, Training Loss: 0.00845
Epoch: 42080, Training Loss: 0.00663
Epoch: 42080, Training Loss: 0.00689
Epoch: 42080, Training Loss: 0.00690
Epoch: 42080, Training Loss: 0.00845
Epoch: 42081, Training Loss: 0.00663
Epoch: 42081, Training Loss: 0.00689
Epoch: 42081, Training Loss: 0.00690
Epoch: 42081, Training Loss: 0.00845
Epoch: 42082, Training Loss: 0.00663
Epoch: 42082, Training Loss: 0.00689
Epoch: 42082, Training Loss: 0.00690
Epoch: 42082, Training Loss: 0.00845
Epoch: 42083, Training Loss: 0.00663
Epoch: 42083, Training Loss: 0.00689
Epoch: 42083, Training Loss: 0.00689
Epoch: 42083, Training Loss: 0.00845
Epoch: 42084, Training Loss: 0.00663
Epoch: 42084, Training Loss: 0.00689
Epoch: 42084, Training Loss: 0.00689
Epoch: 42084, Training Loss: 0.00845
Epoch: 42085, Training Loss: 0.00663
Epoch: 42085, Training Loss: 0.00689
Epoch: 42085, Training Loss: 0.00689
Epoch: 42085, Training Loss: 0.00845
Epoch: 42086, Training Loss: 0.00663
Epoch: 42086, Training Loss: 0.00689
Epoch: 42086, Training Loss: 0.00689
Epoch: 42086, Training Loss: 0.00845
Epoch: 42087, Training Loss: 0.00663
Epoch: 42087, Training Loss: 0.00689
Epoch: 42087, Training Loss: 0.00689
Epoch: 42087, Training Loss: 0.00845
Epoch: 42088, Training Loss: 0.00663
Epoch: 42088, Training Loss: 0.00689
Epoch: 42088, Training Loss: 0.00689
Epoch: 42088, Training Loss: 0.00845
Epoch: 42089, Training Loss: 0.00663
Epoch: 42089, Training Loss: 0.00689
Epoch: 42089, Training Loss: 0.00689
Epoch: 42089, Training Loss: 0.00845
Epoch: 42090, Training Loss: 0.00663
Epoch: 42090, Training Loss: 0.00689
Epoch: 42090, Training Loss: 0.00689
Epoch: 42090, Training Loss: 0.00845
Epoch: 42091, Training Loss: 0.00663
Epoch: 42091, Training Loss: 0.00689
Epoch: 42091, Training Loss: 0.00689
Epoch: 42091, Training Loss: 0.00845
Epoch: 42092, Training Loss: 0.00663
Epoch: 42092, Training Loss: 0.00689
Epoch: 42092, Training Loss: 0.00689
Epoch: 42092, Training Loss: 0.00845
Epoch: 42093, Training Loss: 0.00663
Epoch: 42093, Training Loss: 0.00689
Epoch: 42093, Training Loss: 0.00689
Epoch: 42093, Training Loss: 0.00845
Epoch: 42094, Training Loss: 0.00663
Epoch: 42094, Training Loss: 0.00689
Epoch: 42094, Training Loss: 0.00689
Epoch: 42094, Training Loss: 0.00845
Epoch: 42095, Training Loss: 0.00663
Epoch: 42095, Training Loss: 0.00689
Epoch: 42095, Training Loss: 0.00689
Epoch: 42095, Training Loss: 0.00845
Epoch: 42096, Training Loss: 0.00663
Epoch: 42096, Training Loss: 0.00689
Epoch: 42096, Training Loss: 0.00689
Epoch: 42096, Training Loss: 0.00845
Epoch: 42097, Training Loss: 0.00663
Epoch: 42097, Training Loss: 0.00689
Epoch: 42097, Training Loss: 0.00689
Epoch: 42097, Training Loss: 0.00845
Epoch: 42098, Training Loss: 0.00663
Epoch: 42098, Training Loss: 0.00689
Epoch: 42098, Training Loss: 0.00689
Epoch: 42098, Training Loss: 0.00845
Epoch: 42099, Training Loss: 0.00663
Epoch: 42099, Training Loss: 0.00689
Epoch: 42099, Training Loss: 0.00689
Epoch: 42099, Training Loss: 0.00845
Epoch: 42100, Training Loss: 0.00663
Epoch: 42100, Training Loss: 0.00689
Epoch: 42100, Training Loss: 0.00689
Epoch: 42100, Training Loss: 0.00845
Epoch: 42101, Training Loss: 0.00663
Epoch: 42101, Training Loss: 0.00689
Epoch: 42101, Training Loss: 0.00689
Epoch: 42101, Training Loss: 0.00845
Epoch: 42102, Training Loss: 0.00663
Epoch: 42102, Training Loss: 0.00689
Epoch: 42102, Training Loss: 0.00689
Epoch: 42102, Training Loss: 0.00845
Epoch: 42103, Training Loss: 0.00663
Epoch: 42103, Training Loss: 0.00689
Epoch: 42103, Training Loss: 0.00689
Epoch: 42103, Training Loss: 0.00845
Epoch: 42104, Training Loss: 0.00663
Epoch: 42104, Training Loss: 0.00689
Epoch: 42104, Training Loss: 0.00689
Epoch: 42104, Training Loss: 0.00845
Epoch: 42105, Training Loss: 0.00663
Epoch: 42105, Training Loss: 0.00689
Epoch: 42105, Training Loss: 0.00689
Epoch: 42105, Training Loss: 0.00845
Epoch: 42106, Training Loss: 0.00663
Epoch: 42106, Training Loss: 0.00689
Epoch: 42106, Training Loss: 0.00689
Epoch: 42106, Training Loss: 0.00845
Epoch: 42107, Training Loss: 0.00663
Epoch: 42107, Training Loss: 0.00689
Epoch: 42107, Training Loss: 0.00689
Epoch: 42107, Training Loss: 0.00845
Epoch: 42108, Training Loss: 0.00663
Epoch: 42108, Training Loss: 0.00689
Epoch: 42108, Training Loss: 0.00689
Epoch: 42108, Training Loss: 0.00845
Epoch: 42109, Training Loss: 0.00663
Epoch: 42109, Training Loss: 0.00688
Epoch: 42109, Training Loss: 0.00689
Epoch: 42109, Training Loss: 0.00845
Epoch: 42110, Training Loss: 0.00663
Epoch: 42110, Training Loss: 0.00688
Epoch: 42110, Training Loss: 0.00689
Epoch: 42110, Training Loss: 0.00845
Epoch: 42111, Training Loss: 0.00663
Epoch: 42111, Training Loss: 0.00688
Epoch: 42111, Training Loss: 0.00689
Epoch: 42111, Training Loss: 0.00845
Epoch: 42112, Training Loss: 0.00663
Epoch: 42112, Training Loss: 0.00688
Epoch: 42112, Training Loss: 0.00689
Epoch: 42112, Training Loss: 0.00845
Epoch: 42113, Training Loss: 0.00663
Epoch: 42113, Training Loss: 0.00688
Epoch: 42113, Training Loss: 0.00689
Epoch: 42113, Training Loss: 0.00845
Epoch: 42114, Training Loss: 0.00663
Epoch: 42114, Training Loss: 0.00688
Epoch: 42114, Training Loss: 0.00689
Epoch: 42114, Training Loss: 0.00845
Epoch: 42115, Training Loss: 0.00663
Epoch: 42115, Training Loss: 0.00688
Epoch: 42115, Training Loss: 0.00689
Epoch: 42115, Training Loss: 0.00845
Epoch: 42116, Training Loss: 0.00663
Epoch: 42116, Training Loss: 0.00688
Epoch: 42116, Training Loss: 0.00689
Epoch: 42116, Training Loss: 0.00845
Epoch: 42117, Training Loss: 0.00663
Epoch: 42117, Training Loss: 0.00688
Epoch: 42117, Training Loss: 0.00689
Epoch: 42117, Training Loss: 0.00845
Epoch: 42118, Training Loss: 0.00663
Epoch: 42118, Training Loss: 0.00688
Epoch: 42118, Training Loss: 0.00689
Epoch: 42118, Training Loss: 0.00845
Epoch: 42119, Training Loss: 0.00663
Epoch: 42119, Training Loss: 0.00688
Epoch: 42119, Training Loss: 0.00689
Epoch: 42119, Training Loss: 0.00845
Epoch: 42120, Training Loss: 0.00663
Epoch: 42120, Training Loss: 0.00688
Epoch: 42120, Training Loss: 0.00689
Epoch: 42120, Training Loss: 0.00845
Epoch: 42121, Training Loss: 0.00663
Epoch: 42121, Training Loss: 0.00688
Epoch: 42121, Training Loss: 0.00689
Epoch: 42121, Training Loss: 0.00845
Epoch: 42122, Training Loss: 0.00663
Epoch: 42122, Training Loss: 0.00688
Epoch: 42122, Training Loss: 0.00689
Epoch: 42122, Training Loss: 0.00845
Epoch: 42123, Training Loss: 0.00663
Epoch: 42123, Training Loss: 0.00688
Epoch: 42123, Training Loss: 0.00689
Epoch: 42123, Training Loss: 0.00845
Epoch: 42124, Training Loss: 0.00663
Epoch: 42124, Training Loss: 0.00688
Epoch: 42124, Training Loss: 0.00689
Epoch: 42124, Training Loss: 0.00845
Epoch: 42125, Training Loss: 0.00663
Epoch: 42125, Training Loss: 0.00688
Epoch: 42125, Training Loss: 0.00689
Epoch: 42125, Training Loss: 0.00845
Epoch: 42126, Training Loss: 0.00663
Epoch: 42126, Training Loss: 0.00688
Epoch: 42126, Training Loss: 0.00689
Epoch: 42126, Training Loss: 0.00845
Epoch: 42127, Training Loss: 0.00663
Epoch: 42127, Training Loss: 0.00688
Epoch: 42127, Training Loss: 0.00689
Epoch: 42127, Training Loss: 0.00845
Epoch: 42128, Training Loss: 0.00663
Epoch: 42128, Training Loss: 0.00688
Epoch: 42128, Training Loss: 0.00689
Epoch: 42128, Training Loss: 0.00845
Epoch: 42129, Training Loss: 0.00663
Epoch: 42129, Training Loss: 0.00688
Epoch: 42129, Training Loss: 0.00689
Epoch: 42129, Training Loss: 0.00845
Epoch: 42130, Training Loss: 0.00663
Epoch: 42130, Training Loss: 0.00688
Epoch: 42130, Training Loss: 0.00689
Epoch: 42130, Training Loss: 0.00845
Epoch: 42131, Training Loss: 0.00663
Epoch: 42131, Training Loss: 0.00688
Epoch: 42131, Training Loss: 0.00689
Epoch: 42131, Training Loss: 0.00845
Epoch: 42132, Training Loss: 0.00663
Epoch: 42132, Training Loss: 0.00688
Epoch: 42132, Training Loss: 0.00689
Epoch: 42132, Training Loss: 0.00845
Epoch: 42133, Training Loss: 0.00663
Epoch: 42133, Training Loss: 0.00688
Epoch: 42133, Training Loss: 0.00689
Epoch: 42133, Training Loss: 0.00845
Epoch: 42134, Training Loss: 0.00663
Epoch: 42134, Training Loss: 0.00688
Epoch: 42134, Training Loss: 0.00689
Epoch: 42134, Training Loss: 0.00845
Epoch: 42135, Training Loss: 0.00663
Epoch: 42135, Training Loss: 0.00688
Epoch: 42135, Training Loss: 0.00689
Epoch: 42135, Training Loss: 0.00845
Epoch: 42136, Training Loss: 0.00663
Epoch: 42136, Training Loss: 0.00688
Epoch: 42136, Training Loss: 0.00689
Epoch: 42136, Training Loss: 0.00845
Epoch: 42137, Training Loss: 0.00663
Epoch: 42137, Training Loss: 0.00688
Epoch: 42137, Training Loss: 0.00689
Epoch: 42137, Training Loss: 0.00845
Epoch: 42138, Training Loss: 0.00663
Epoch: 42138, Training Loss: 0.00688
Epoch: 42138, Training Loss: 0.00689
Epoch: 42138, Training Loss: 0.00845
Epoch: 42139, Training Loss: 0.00663
Epoch: 42139, Training Loss: 0.00688
Epoch: 42139, Training Loss: 0.00689
Epoch: 42139, Training Loss: 0.00845
Epoch: 42140, Training Loss: 0.00663
Epoch: 42140, Training Loss: 0.00688
Epoch: 42140, Training Loss: 0.00689
Epoch: 42140, Training Loss: 0.00845
Epoch: 42141, Training Loss: 0.00663
Epoch: 42141, Training Loss: 0.00688
Epoch: 42141, Training Loss: 0.00689
Epoch: 42141, Training Loss: 0.00845
Epoch: 42142, Training Loss: 0.00663
Epoch: 42142, Training Loss: 0.00688
Epoch: 42142, Training Loss: 0.00689
Epoch: 42142, Training Loss: 0.00845
Epoch: 42143, Training Loss: 0.00663
Epoch: 42143, Training Loss: 0.00688
Epoch: 42143, Training Loss: 0.00689
Epoch: 42143, Training Loss: 0.00845
Epoch: 42144, Training Loss: 0.00663
Epoch: 42144, Training Loss: 0.00688
Epoch: 42144, Training Loss: 0.00689
Epoch: 42144, Training Loss: 0.00845
Epoch: 42145, Training Loss: 0.00663
Epoch: 42145, Training Loss: 0.00688
Epoch: 42145, Training Loss: 0.00689
Epoch: 42145, Training Loss: 0.00845
Epoch: 42146, Training Loss: 0.00663
Epoch: 42146, Training Loss: 0.00688
Epoch: 42146, Training Loss: 0.00689
Epoch: 42146, Training Loss: 0.00845
Epoch: 42147, Training Loss: 0.00663
Epoch: 42147, Training Loss: 0.00688
Epoch: 42147, Training Loss: 0.00689
Epoch: 42147, Training Loss: 0.00845
Epoch: 42148, Training Loss: 0.00663
Epoch: 42148, Training Loss: 0.00688
Epoch: 42148, Training Loss: 0.00689
Epoch: 42148, Training Loss: 0.00845
Epoch: 42149, Training Loss: 0.00663
Epoch: 42149, Training Loss: 0.00688
Epoch: 42149, Training Loss: 0.00689
Epoch: 42149, Training Loss: 0.00845
Epoch: 42150, Training Loss: 0.00663
Epoch: 42150, Training Loss: 0.00688
Epoch: 42150, Training Loss: 0.00689
Epoch: 42150, Training Loss: 0.00845
Epoch: 42151, Training Loss: 0.00663
Epoch: 42151, Training Loss: 0.00688
Epoch: 42151, Training Loss: 0.00689
Epoch: 42151, Training Loss: 0.00845
Epoch: 42152, Training Loss: 0.00663
Epoch: 42152, Training Loss: 0.00688
Epoch: 42152, Training Loss: 0.00689
Epoch: 42152, Training Loss: 0.00845
Epoch: 42153, Training Loss: 0.00663
Epoch: 42153, Training Loss: 0.00688
Epoch: 42153, Training Loss: 0.00689
Epoch: 42153, Training Loss: 0.00845
Epoch: 42154, Training Loss: 0.00663
Epoch: 42154, Training Loss: 0.00688
Epoch: 42154, Training Loss: 0.00689
Epoch: 42154, Training Loss: 0.00845
Epoch: 42155, Training Loss: 0.00663
Epoch: 42155, Training Loss: 0.00688
Epoch: 42155, Training Loss: 0.00689
Epoch: 42155, Training Loss: 0.00845
Epoch: 42156, Training Loss: 0.00663
Epoch: 42156, Training Loss: 0.00688
Epoch: 42156, Training Loss: 0.00689
Epoch: 42156, Training Loss: 0.00845
Epoch: 42157, Training Loss: 0.00663
Epoch: 42157, Training Loss: 0.00688
Epoch: 42157, Training Loss: 0.00689
Epoch: 42157, Training Loss: 0.00845
Epoch: 42158, Training Loss: 0.00663
Epoch: 42158, Training Loss: 0.00688
Epoch: 42158, Training Loss: 0.00689
Epoch: 42158, Training Loss: 0.00845
Epoch: 42159, Training Loss: 0.00663
Epoch: 42159, Training Loss: 0.00688
Epoch: 42159, Training Loss: 0.00689
Epoch: 42159, Training Loss: 0.00844
Epoch: 42160, Training Loss: 0.00663
Epoch: 42160, Training Loss: 0.00688
Epoch: 42160, Training Loss: 0.00689
Epoch: 42160, Training Loss: 0.00844
Epoch: 42161, Training Loss: 0.00663
Epoch: 42161, Training Loss: 0.00688
Epoch: 42161, Training Loss: 0.00689
Epoch: 42161, Training Loss: 0.00844
Epoch: 42162, Training Loss: 0.00663
Epoch: 42162, Training Loss: 0.00688
Epoch: 42162, Training Loss: 0.00689
Epoch: 42162, Training Loss: 0.00844
Epoch: 42163, Training Loss: 0.00663
Epoch: 42163, Training Loss: 0.00688
Epoch: 42163, Training Loss: 0.00689
Epoch: 42163, Training Loss: 0.00844
Epoch: 42164, Training Loss: 0.00663
Epoch: 42164, Training Loss: 0.00688
Epoch: 42164, Training Loss: 0.00689
Epoch: 42164, Training Loss: 0.00844
Epoch: 42165, Training Loss: 0.00663
Epoch: 42165, Training Loss: 0.00688
Epoch: 42165, Training Loss: 0.00689
Epoch: 42165, Training Loss: 0.00844
Epoch: 42166, Training Loss: 0.00663
Epoch: 42166, Training Loss: 0.00688
Epoch: 42166, Training Loss: 0.00689
Epoch: 42166, Training Loss: 0.00844
Epoch: 42167, Training Loss: 0.00663
Epoch: 42167, Training Loss: 0.00688
Epoch: 42167, Training Loss: 0.00689
Epoch: 42167, Training Loss: 0.00844
Epoch: 42168, Training Loss: 0.00663
Epoch: 42168, Training Loss: 0.00688
Epoch: 42168, Training Loss: 0.00689
Epoch: 42168, Training Loss: 0.00844
Epoch: 42169, Training Loss: 0.00663
Epoch: 42169, Training Loss: 0.00688
Epoch: 42169, Training Loss: 0.00689
Epoch: 42169, Training Loss: 0.00844
Epoch: 42170, Training Loss: 0.00663
Epoch: 42170, Training Loss: 0.00688
Epoch: 42170, Training Loss: 0.00689
Epoch: 42170, Training Loss: 0.00844
Epoch: 42171, Training Loss: 0.00663
Epoch: 42171, Training Loss: 0.00688
Epoch: 42171, Training Loss: 0.00689
Epoch: 42171, Training Loss: 0.00844
Epoch: 42172, Training Loss: 0.00663
Epoch: 42172, Training Loss: 0.00688
Epoch: 42172, Training Loss: 0.00689
Epoch: 42172, Training Loss: 0.00844
Epoch: 42173, Training Loss: 0.00663
Epoch: 42173, Training Loss: 0.00688
Epoch: 42173, Training Loss: 0.00689
Epoch: 42173, Training Loss: 0.00844
Epoch: 42174, Training Loss: 0.00663
Epoch: 42174, Training Loss: 0.00688
Epoch: 42174, Training Loss: 0.00689
Epoch: 42174, Training Loss: 0.00844
Epoch: 42175, Training Loss: 0.00663
Epoch: 42175, Training Loss: 0.00688
Epoch: 42175, Training Loss: 0.00689
Epoch: 42175, Training Loss: 0.00844
Epoch: 42176, Training Loss: 0.00663
Epoch: 42176, Training Loss: 0.00688
Epoch: 42176, Training Loss: 0.00689
Epoch: 42176, Training Loss: 0.00844
Epoch: 42177, Training Loss: 0.00663
Epoch: 42177, Training Loss: 0.00688
Epoch: 42177, Training Loss: 0.00689
Epoch: 42177, Training Loss: 0.00844
Epoch: 42178, Training Loss: 0.00663
Epoch: 42178, Training Loss: 0.00688
Epoch: 42178, Training Loss: 0.00689
Epoch: 42178, Training Loss: 0.00844
Epoch: 42179, Training Loss: 0.00663
Epoch: 42179, Training Loss: 0.00688
Epoch: 42179, Training Loss: 0.00689
Epoch: 42179, Training Loss: 0.00844
Epoch: 42180, Training Loss: 0.00663
Epoch: 42180, Training Loss: 0.00688
Epoch: 42180, Training Loss: 0.00689
Epoch: 42180, Training Loss: 0.00844
Epoch: 42181, Training Loss: 0.00663
Epoch: 42181, Training Loss: 0.00688
Epoch: 42181, Training Loss: 0.00689
Epoch: 42181, Training Loss: 0.00844
Epoch: 42182, Training Loss: 0.00663
Epoch: 42182, Training Loss: 0.00688
Epoch: 42182, Training Loss: 0.00689
Epoch: 42182, Training Loss: 0.00844
Epoch: 42183, Training Loss: 0.00663
Epoch: 42183, Training Loss: 0.00688
Epoch: 42183, Training Loss: 0.00689
Epoch: 42183, Training Loss: 0.00844
Epoch: 42184, Training Loss: 0.00663
Epoch: 42184, Training Loss: 0.00688
Epoch: 42184, Training Loss: 0.00689
Epoch: 42184, Training Loss: 0.00844
Epoch: 42185, Training Loss: 0.00663
Epoch: 42185, Training Loss: 0.00688
Epoch: 42185, Training Loss: 0.00689
Epoch: 42185, Training Loss: 0.00844
Epoch: 42186, Training Loss: 0.00663
Epoch: 42186, Training Loss: 0.00688
Epoch: 42186, Training Loss: 0.00689
Epoch: 42186, Training Loss: 0.00844
Epoch: 42187, Training Loss: 0.00663
Epoch: 42187, Training Loss: 0.00688
Epoch: 42187, Training Loss: 0.00689
Epoch: 42187, Training Loss: 0.00844
Epoch: 42188, Training Loss: 0.00663
Epoch: 42188, Training Loss: 0.00688
Epoch: 42188, Training Loss: 0.00689
Epoch: 42188, Training Loss: 0.00844
Epoch: 42189, Training Loss: 0.00663
Epoch: 42189, Training Loss: 0.00688
Epoch: 42189, Training Loss: 0.00689
Epoch: 42189, Training Loss: 0.00844
Epoch: 42190, Training Loss: 0.00663
Epoch: 42190, Training Loss: 0.00688
Epoch: 42190, Training Loss: 0.00689
Epoch: 42190, Training Loss: 0.00844
Epoch: 42191, Training Loss: 0.00663
Epoch: 42191, Training Loss: 0.00688
Epoch: 42191, Training Loss: 0.00689
Epoch: 42191, Training Loss: 0.00844
Epoch: 42192, Training Loss: 0.00663
Epoch: 42192, Training Loss: 0.00688
Epoch: 42192, Training Loss: 0.00689
Epoch: 42192, Training Loss: 0.00844
Epoch: 42193, Training Loss: 0.00663
Epoch: 42193, Training Loss: 0.00688
Epoch: 42193, Training Loss: 0.00689
Epoch: 42193, Training Loss: 0.00844
Epoch: 42194, Training Loss: 0.00662
Epoch: 42194, Training Loss: 0.00688
Epoch: 42194, Training Loss: 0.00688
Epoch: 42194, Training Loss: 0.00844
Epoch: 42195, Training Loss: 0.00662
Epoch: 42195, Training Loss: 0.00688
Epoch: 42195, Training Loss: 0.00688
Epoch: 42195, Training Loss: 0.00844
Epoch: 42196, Training Loss: 0.00662
Epoch: 42196, Training Loss: 0.00688
Epoch: 42196, Training Loss: 0.00688
Epoch: 42196, Training Loss: 0.00844
Epoch: 42197, Training Loss: 0.00662
Epoch: 42197, Training Loss: 0.00688
Epoch: 42197, Training Loss: 0.00688
Epoch: 42197, Training Loss: 0.00844
Epoch: 42198, Training Loss: 0.00662
Epoch: 42198, Training Loss: 0.00688
Epoch: 42198, Training Loss: 0.00688
Epoch: 42198, Training Loss: 0.00844
Epoch: 42199, Training Loss: 0.00662
Epoch: 42199, Training Loss: 0.00688
Epoch: 42199, Training Loss: 0.00688
Epoch: 42199, Training Loss: 0.00844
Epoch: 42200, Training Loss: 0.00662
Epoch: 42200, Training Loss: 0.00688
Epoch: 42200, Training Loss: 0.00688
Epoch: 42200, Training Loss: 0.00844
Epoch: 42201, Training Loss: 0.00662
Epoch: 42201, Training Loss: 0.00688
Epoch: 42201, Training Loss: 0.00688
Epoch: 42201, Training Loss: 0.00844
Epoch: 42202, Training Loss: 0.00662
Epoch: 42202, Training Loss: 0.00688
Epoch: 42202, Training Loss: 0.00688
Epoch: 42202, Training Loss: 0.00844
Epoch: 42203, Training Loss: 0.00662
Epoch: 42203, Training Loss: 0.00688
Epoch: 42203, Training Loss: 0.00688
Epoch: 42203, Training Loss: 0.00844
Epoch: 42204, Training Loss: 0.00662
Epoch: 42204, Training Loss: 0.00688
Epoch: 42204, Training Loss: 0.00688
Epoch: 42204, Training Loss: 0.00844
Epoch: 42205, Training Loss: 0.00662
Epoch: 42205, Training Loss: 0.00688
Epoch: 42205, Training Loss: 0.00688
Epoch: 42205, Training Loss: 0.00844
Epoch: 42206, Training Loss: 0.00662
Epoch: 42206, Training Loss: 0.00688
Epoch: 42206, Training Loss: 0.00688
Epoch: 42206, Training Loss: 0.00844
Epoch: 42207, Training Loss: 0.00662
Epoch: 42207, Training Loss: 0.00688
Epoch: 42207, Training Loss: 0.00688
Epoch: 42207, Training Loss: 0.00844
Epoch: 42208, Training Loss: 0.00662
Epoch: 42208, Training Loss: 0.00688
Epoch: 42208, Training Loss: 0.00688
Epoch: 42208, Training Loss: 0.00844
Epoch: 42209, Training Loss: 0.00662
Epoch: 42209, Training Loss: 0.00688
Epoch: 42209, Training Loss: 0.00688
Epoch: 42209, Training Loss: 0.00844
Epoch: 42210, Training Loss: 0.00662
Epoch: 42210, Training Loss: 0.00688
Epoch: 42210, Training Loss: 0.00688
Epoch: 42210, Training Loss: 0.00844
Epoch: 42211, Training Loss: 0.00662
Epoch: 42211, Training Loss: 0.00688
Epoch: 42211, Training Loss: 0.00688
Epoch: 42211, Training Loss: 0.00844
Epoch: 42212, Training Loss: 0.00662
Epoch: 42212, Training Loss: 0.00688
Epoch: 42212, Training Loss: 0.00688
Epoch: 42212, Training Loss: 0.00844
Epoch: 42213, Training Loss: 0.00662
Epoch: 42213, Training Loss: 0.00688
Epoch: 42213, Training Loss: 0.00688
Epoch: 42213, Training Loss: 0.00844
Epoch: 42214, Training Loss: 0.00662
Epoch: 42214, Training Loss: 0.00688
Epoch: 42214, Training Loss: 0.00688
Epoch: 42214, Training Loss: 0.00844
Epoch: 42215, Training Loss: 0.00662
Epoch: 42215, Training Loss: 0.00688
Epoch: 42215, Training Loss: 0.00688
Epoch: 42215, Training Loss: 0.00844
Epoch: 42216, Training Loss: 0.00662
Epoch: 42216, Training Loss: 0.00688
Epoch: 42216, Training Loss: 0.00688
Epoch: 42216, Training Loss: 0.00844
Epoch: 42217, Training Loss: 0.00662
Epoch: 42217, Training Loss: 0.00688
Epoch: 42217, Training Loss: 0.00688
Epoch: 42217, Training Loss: 0.00844
Epoch: 42218, Training Loss: 0.00662
Epoch: 42218, Training Loss: 0.00688
Epoch: 42218, Training Loss: 0.00688
Epoch: 42218, Training Loss: 0.00844
Epoch: 42219, Training Loss: 0.00662
Epoch: 42219, Training Loss: 0.00688
Epoch: 42219, Training Loss: 0.00688
Epoch: 42219, Training Loss: 0.00844
Epoch: 42220, Training Loss: 0.00662
Epoch: 42220, Training Loss: 0.00688
Epoch: 42220, Training Loss: 0.00688
Epoch: 42220, Training Loss: 0.00844
Epoch: 42221, Training Loss: 0.00662
Epoch: 42221, Training Loss: 0.00687
Epoch: 42221, Training Loss: 0.00688
Epoch: 42221, Training Loss: 0.00844
Epoch: 42222, Training Loss: 0.00662
Epoch: 42222, Training Loss: 0.00687
Epoch: 42222, Training Loss: 0.00688
Epoch: 42222, Training Loss: 0.00844
Epoch: 42223, Training Loss: 0.00662
Epoch: 42223, Training Loss: 0.00687
Epoch: 42223, Training Loss: 0.00688
Epoch: 42223, Training Loss: 0.00844
Epoch: 42224, Training Loss: 0.00662
Epoch: 42224, Training Loss: 0.00687
Epoch: 42224, Training Loss: 0.00688
Epoch: 42224, Training Loss: 0.00844
Epoch: 42225, Training Loss: 0.00662
Epoch: 42225, Training Loss: 0.00687
Epoch: 42225, Training Loss: 0.00688
Epoch: 42225, Training Loss: 0.00844
Epoch: 42226, Training Loss: 0.00662
Epoch: 42226, Training Loss: 0.00687
Epoch: 42226, Training Loss: 0.00688
Epoch: 42226, Training Loss: 0.00844
Epoch: 42227, Training Loss: 0.00662
Epoch: 42227, Training Loss: 0.00687
Epoch: 42227, Training Loss: 0.00688
Epoch: 42227, Training Loss: 0.00844
Epoch: 42228, Training Loss: 0.00662
Epoch: 42228, Training Loss: 0.00687
Epoch: 42228, Training Loss: 0.00688
Epoch: 42228, Training Loss: 0.00844
Epoch: 42229, Training Loss: 0.00662
Epoch: 42229, Training Loss: 0.00687
Epoch: 42229, Training Loss: 0.00688
Epoch: 42229, Training Loss: 0.00844
Epoch: 42230, Training Loss: 0.00662
Epoch: 42230, Training Loss: 0.00687
Epoch: 42230, Training Loss: 0.00688
Epoch: 42230, Training Loss: 0.00844
Epoch: 42231, Training Loss: 0.00662
Epoch: 42231, Training Loss: 0.00687
Epoch: 42231, Training Loss: 0.00688
Epoch: 42231, Training Loss: 0.00844
Epoch: 42232, Training Loss: 0.00662
Epoch: 42232, Training Loss: 0.00687
Epoch: 42232, Training Loss: 0.00688
Epoch: 42232, Training Loss: 0.00844
Epoch: 42233, Training Loss: 0.00662
Epoch: 42233, Training Loss: 0.00687
Epoch: 42233, Training Loss: 0.00688
Epoch: 42233, Training Loss: 0.00844
Epoch: 42234, Training Loss: 0.00662
Epoch: 42234, Training Loss: 0.00687
Epoch: 42234, Training Loss: 0.00688
Epoch: 42234, Training Loss: 0.00844
Epoch: 42235, Training Loss: 0.00662
Epoch: 42235, Training Loss: 0.00687
Epoch: 42235, Training Loss: 0.00688
Epoch: 42235, Training Loss: 0.00844
Epoch: 42236, Training Loss: 0.00662
Epoch: 42236, Training Loss: 0.00687
Epoch: 42236, Training Loss: 0.00688
Epoch: 42236, Training Loss: 0.00844
Epoch: 42237, Training Loss: 0.00662
Epoch: 42237, Training Loss: 0.00687
Epoch: 42237, Training Loss: 0.00688
Epoch: 42237, Training Loss: 0.00844
Epoch: 42238, Training Loss: 0.00662
Epoch: 42238, Training Loss: 0.00687
Epoch: 42238, Training Loss: 0.00688
Epoch: 42238, Training Loss: 0.00844
Epoch: 42239, Training Loss: 0.00662
Epoch: 42239, Training Loss: 0.00687
Epoch: 42239, Training Loss: 0.00688
Epoch: 42239, Training Loss: 0.00844
Epoch: 42240, Training Loss: 0.00662
Epoch: 42240, Training Loss: 0.00687
Epoch: 42240, Training Loss: 0.00688
Epoch: 42240, Training Loss: 0.00844
Epoch: 42241, Training Loss: 0.00662
Epoch: 42241, Training Loss: 0.00687
Epoch: 42241, Training Loss: 0.00688
Epoch: 42241, Training Loss: 0.00844
Epoch: 42242, Training Loss: 0.00662
Epoch: 42242, Training Loss: 0.00687
Epoch: 42242, Training Loss: 0.00688
Epoch: 42242, Training Loss: 0.00844
Epoch: 42243, Training Loss: 0.00662
Epoch: 42243, Training Loss: 0.00687
Epoch: 42243, Training Loss: 0.00688
Epoch: 42243, Training Loss: 0.00844
Epoch: 42244, Training Loss: 0.00662
Epoch: 42244, Training Loss: 0.00687
Epoch: 42244, Training Loss: 0.00688
Epoch: 42244, Training Loss: 0.00844
Epoch: 42245, Training Loss: 0.00662
Epoch: 42245, Training Loss: 0.00687
Epoch: 42245, Training Loss: 0.00688
Epoch: 42245, Training Loss: 0.00844
Epoch: 42246, Training Loss: 0.00662
Epoch: 42246, Training Loss: 0.00687
Epoch: 42246, Training Loss: 0.00688
Epoch: 42246, Training Loss: 0.00844
Epoch: 42247, Training Loss: 0.00662
Epoch: 42247, Training Loss: 0.00687
Epoch: 42247, Training Loss: 0.00688
Epoch: 42247, Training Loss: 0.00844
Epoch: 42248, Training Loss: 0.00662
Epoch: 42248, Training Loss: 0.00687
Epoch: 42248, Training Loss: 0.00688
Epoch: 42248, Training Loss: 0.00844
Epoch: 42249, Training Loss: 0.00662
Epoch: 42249, Training Loss: 0.00687
Epoch: 42249, Training Loss: 0.00688
Epoch: 42249, Training Loss: 0.00844
Epoch: 42250, Training Loss: 0.00662
Epoch: 42250, Training Loss: 0.00687
Epoch: 42250, Training Loss: 0.00688
Epoch: 42250, Training Loss: 0.00843
Epoch: 42251, Training Loss: 0.00662
Epoch: 42251, Training Loss: 0.00687
Epoch: 42251, Training Loss: 0.00688
Epoch: 42251, Training Loss: 0.00843
Epoch: 42252, Training Loss: 0.00662
Epoch: 42252, Training Loss: 0.00687
Epoch: 42252, Training Loss: 0.00688
Epoch: 42252, Training Loss: 0.00843
Epoch: 42253, Training Loss: 0.00662
Epoch: 42253, Training Loss: 0.00687
Epoch: 42253, Training Loss: 0.00688
Epoch: 42253, Training Loss: 0.00843
Epoch: 42254, Training Loss: 0.00662
Epoch: 42254, Training Loss: 0.00687
Epoch: 42254, Training Loss: 0.00688
Epoch: 42254, Training Loss: 0.00843
Epoch: 42255, Training Loss: 0.00662
Epoch: 42255, Training Loss: 0.00687
Epoch: 42255, Training Loss: 0.00688
Epoch: 42255, Training Loss: 0.00843
Epoch: 42256, Training Loss: 0.00662
Epoch: 42256, Training Loss: 0.00687
Epoch: 42256, Training Loss: 0.00688
Epoch: 42256, Training Loss: 0.00843
Epoch: 42257, Training Loss: 0.00662
Epoch: 42257, Training Loss: 0.00687
Epoch: 42257, Training Loss: 0.00688
Epoch: 42257, Training Loss: 0.00843
Epoch: 42258, Training Loss: 0.00662
Epoch: 42258, Training Loss: 0.00687
Epoch: 42258, Training Loss: 0.00688
Epoch: 42258, Training Loss: 0.00843
Epoch: 42259, Training Loss: 0.00662
Epoch: 42259, Training Loss: 0.00687
Epoch: 42259, Training Loss: 0.00688
Epoch: 42259, Training Loss: 0.00843
Epoch: 42260, Training Loss: 0.00662
Epoch: 42260, Training Loss: 0.00687
Epoch: 42260, Training Loss: 0.00688
Epoch: 42260, Training Loss: 0.00843
Epoch: 42261, Training Loss: 0.00662
Epoch: 42261, Training Loss: 0.00687
Epoch: 42261, Training Loss: 0.00688
Epoch: 42261, Training Loss: 0.00843
Epoch: 42262, Training Loss: 0.00662
Epoch: 42262, Training Loss: 0.00687
Epoch: 42262, Training Loss: 0.00688
Epoch: 42262, Training Loss: 0.00843
Epoch: 42263, Training Loss: 0.00662
Epoch: 42263, Training Loss: 0.00687
Epoch: 42263, Training Loss: 0.00688
Epoch: 42263, Training Loss: 0.00843
Epoch: 42264, Training Loss: 0.00662
Epoch: 42264, Training Loss: 0.00687
Epoch: 42264, Training Loss: 0.00688
Epoch: 42264, Training Loss: 0.00843
Epoch: 42265, Training Loss: 0.00662
Epoch: 42265, Training Loss: 0.00687
Epoch: 42265, Training Loss: 0.00688
Epoch: 42265, Training Loss: 0.00843
Epoch: 42266, Training Loss: 0.00662
Epoch: 42266, Training Loss: 0.00687
Epoch: 42266, Training Loss: 0.00688
Epoch: 42266, Training Loss: 0.00843
Epoch: 42267, Training Loss: 0.00662
Epoch: 42267, Training Loss: 0.00687
Epoch: 42267, Training Loss: 0.00688
Epoch: 42267, Training Loss: 0.00843
Epoch: 42268, Training Loss: 0.00662
Epoch: 42268, Training Loss: 0.00687
Epoch: 42268, Training Loss: 0.00688
Epoch: 42268, Training Loss: 0.00843
Epoch: 42269, Training Loss: 0.00662
Epoch: 42269, Training Loss: 0.00687
Epoch: 42269, Training Loss: 0.00688
Epoch: 42269, Training Loss: 0.00843
Epoch: 42270, Training Loss: 0.00662
Epoch: 42270, Training Loss: 0.00687
Epoch: 42270, Training Loss: 0.00688
Epoch: 42270, Training Loss: 0.00843
Epoch: 42271, Training Loss: 0.00662
Epoch: 42271, Training Loss: 0.00687
Epoch: 42271, Training Loss: 0.00688
Epoch: 42271, Training Loss: 0.00843
Epoch: 42272, Training Loss: 0.00662
Epoch: 42272, Training Loss: 0.00687
Epoch: 42272, Training Loss: 0.00688
Epoch: 42272, Training Loss: 0.00843
Epoch: 42273, Training Loss: 0.00662
Epoch: 42273, Training Loss: 0.00687
Epoch: 42273, Training Loss: 0.00688
Epoch: 42273, Training Loss: 0.00843
Epoch: 42274, Training Loss: 0.00662
Epoch: 42274, Training Loss: 0.00687
Epoch: 42274, Training Loss: 0.00688
Epoch: 42274, Training Loss: 0.00843
Epoch: 42275, Training Loss: 0.00662
Epoch: 42275, Training Loss: 0.00687
Epoch: 42275, Training Loss: 0.00688
Epoch: 42275, Training Loss: 0.00843
Epoch: 42276, Training Loss: 0.00662
Epoch: 42276, Training Loss: 0.00687
Epoch: 42276, Training Loss: 0.00688
Epoch: 42276, Training Loss: 0.00843
Epoch: 42277, Training Loss: 0.00662
Epoch: 42277, Training Loss: 0.00687
Epoch: 42277, Training Loss: 0.00688
Epoch: 42277, Training Loss: 0.00843
Epoch: 42278, Training Loss: 0.00662
Epoch: 42278, Training Loss: 0.00687
Epoch: 42278, Training Loss: 0.00688
Epoch: 42278, Training Loss: 0.00843
Epoch: 42279, Training Loss: 0.00662
Epoch: 42279, Training Loss: 0.00687
Epoch: 42279, Training Loss: 0.00688
Epoch: 42279, Training Loss: 0.00843
Epoch: 42280, Training Loss: 0.00662
Epoch: 42280, Training Loss: 0.00687
Epoch: 42280, Training Loss: 0.00688
Epoch: 42280, Training Loss: 0.00843
Epoch: 42281, Training Loss: 0.00662
Epoch: 42281, Training Loss: 0.00687
Epoch: 42281, Training Loss: 0.00688
Epoch: 42281, Training Loss: 0.00843
Epoch: 42282, Training Loss: 0.00662
Epoch: 42282, Training Loss: 0.00687
Epoch: 42282, Training Loss: 0.00688
Epoch: 42282, Training Loss: 0.00843
Epoch: 42283, Training Loss: 0.00662
Epoch: 42283, Training Loss: 0.00687
Epoch: 42283, Training Loss: 0.00688
Epoch: 42283, Training Loss: 0.00843
Epoch: 42284, Training Loss: 0.00662
Epoch: 42284, Training Loss: 0.00687
Epoch: 42284, Training Loss: 0.00688
Epoch: 42284, Training Loss: 0.00843
Epoch: 42285, Training Loss: 0.00662
Epoch: 42285, Training Loss: 0.00687
Epoch: 42285, Training Loss: 0.00688
Epoch: 42285, Training Loss: 0.00843
Epoch: 42286, Training Loss: 0.00662
Epoch: 42286, Training Loss: 0.00687
Epoch: 42286, Training Loss: 0.00688
Epoch: 42286, Training Loss: 0.00843
Epoch: 42287, Training Loss: 0.00662
Epoch: 42287, Training Loss: 0.00687
Epoch: 42287, Training Loss: 0.00688
Epoch: 42287, Training Loss: 0.00843
Epoch: 42288, Training Loss: 0.00662
Epoch: 42288, Training Loss: 0.00687
Epoch: 42288, Training Loss: 0.00688
Epoch: 42288, Training Loss: 0.00843
Epoch: 42289, Training Loss: 0.00662
Epoch: 42289, Training Loss: 0.00687
Epoch: 42289, Training Loss: 0.00688
Epoch: 42289, Training Loss: 0.00843
Epoch: 42290, Training Loss: 0.00662
Epoch: 42290, Training Loss: 0.00687
Epoch: 42290, Training Loss: 0.00688
Epoch: 42290, Training Loss: 0.00843
Epoch: 42291, Training Loss: 0.00662
Epoch: 42291, Training Loss: 0.00687
Epoch: 42291, Training Loss: 0.00688
Epoch: 42291, Training Loss: 0.00843
Epoch: 42292, Training Loss: 0.00662
Epoch: 42292, Training Loss: 0.00687
Epoch: 42292, Training Loss: 0.00688
Epoch: 42292, Training Loss: 0.00843
Epoch: 42293, Training Loss: 0.00662
Epoch: 42293, Training Loss: 0.00687
Epoch: 42293, Training Loss: 0.00688
Epoch: 42293, Training Loss: 0.00843
Epoch: 42294, Training Loss: 0.00662
Epoch: 42294, Training Loss: 0.00687
Epoch: 42294, Training Loss: 0.00688
Epoch: 42294, Training Loss: 0.00843
Epoch: 42295, Training Loss: 0.00662
Epoch: 42295, Training Loss: 0.00687
Epoch: 42295, Training Loss: 0.00688
Epoch: 42295, Training Loss: 0.00843
Epoch: 42296, Training Loss: 0.00662
Epoch: 42296, Training Loss: 0.00687
Epoch: 42296, Training Loss: 0.00688
Epoch: 42296, Training Loss: 0.00843
Epoch: 42297, Training Loss: 0.00662
Epoch: 42297, Training Loss: 0.00687
Epoch: 42297, Training Loss: 0.00688
Epoch: 42297, Training Loss: 0.00843
Epoch: 42298, Training Loss: 0.00662
Epoch: 42298, Training Loss: 0.00687
Epoch: 42298, Training Loss: 0.00688
Epoch: 42298, Training Loss: 0.00843
Epoch: 42299, Training Loss: 0.00662
Epoch: 42299, Training Loss: 0.00687
Epoch: 42299, Training Loss: 0.00688
Epoch: 42299, Training Loss: 0.00843
Epoch: 42300, Training Loss: 0.00662
Epoch: 42300, Training Loss: 0.00687
Epoch: 42300, Training Loss: 0.00688
Epoch: 42300, Training Loss: 0.00843
Epoch: 42301, Training Loss: 0.00662
Epoch: 42301, Training Loss: 0.00687
Epoch: 42301, Training Loss: 0.00688
Epoch: 42301, Training Loss: 0.00843
Epoch: 42302, Training Loss: 0.00662
Epoch: 42302, Training Loss: 0.00687
Epoch: 42302, Training Loss: 0.00688
Epoch: 42302, Training Loss: 0.00843
Epoch: 42303, Training Loss: 0.00662
Epoch: 42303, Training Loss: 0.00687
Epoch: 42303, Training Loss: 0.00688
Epoch: 42303, Training Loss: 0.00843
Epoch: 42304, Training Loss: 0.00662
Epoch: 42304, Training Loss: 0.00687
Epoch: 42304, Training Loss: 0.00688
Epoch: 42304, Training Loss: 0.00843
Epoch: 42305, Training Loss: 0.00662
Epoch: 42305, Training Loss: 0.00687
Epoch: 42305, Training Loss: 0.00688
Epoch: 42305, Training Loss: 0.00843
Epoch: 42306, Training Loss: 0.00662
Epoch: 42306, Training Loss: 0.00687
Epoch: 42306, Training Loss: 0.00687
Epoch: 42306, Training Loss: 0.00843
Epoch: 42307, Training Loss: 0.00662
Epoch: 42307, Training Loss: 0.00687
Epoch: 42307, Training Loss: 0.00687
Epoch: 42307, Training Loss: 0.00843
Epoch: 42308, Training Loss: 0.00662
Epoch: 42308, Training Loss: 0.00687
Epoch: 42308, Training Loss: 0.00687
Epoch: 42308, Training Loss: 0.00843
Epoch: 42309, Training Loss: 0.00662
Epoch: 42309, Training Loss: 0.00687
Epoch: 42309, Training Loss: 0.00687
Epoch: 42309, Training Loss: 0.00843
Epoch: 42310, Training Loss: 0.00662
Epoch: 42310, Training Loss: 0.00687
Epoch: 42310, Training Loss: 0.00687
Epoch: 42310, Training Loss: 0.00843
Epoch: 42311, Training Loss: 0.00662
Epoch: 42311, Training Loss: 0.00687
Epoch: 42311, Training Loss: 0.00687
Epoch: 42311, Training Loss: 0.00843
Epoch: 42312, Training Loss: 0.00662
Epoch: 42312, Training Loss: 0.00687
Epoch: 42312, Training Loss: 0.00687
Epoch: 42312, Training Loss: 0.00843
Epoch: 42313, Training Loss: 0.00661
Epoch: 42313, Training Loss: 0.00687
Epoch: 42313, Training Loss: 0.00687
Epoch: 42313, Training Loss: 0.00843
Epoch: 42314, Training Loss: 0.00661
Epoch: 42314, Training Loss: 0.00687
Epoch: 42314, Training Loss: 0.00687
Epoch: 42314, Training Loss: 0.00843
Epoch: 42315, Training Loss: 0.00661
Epoch: 42315, Training Loss: 0.00687
Epoch: 42315, Training Loss: 0.00687
Epoch: 42315, Training Loss: 0.00843
Epoch: 42316, Training Loss: 0.00661
Epoch: 42316, Training Loss: 0.00687
Epoch: 42316, Training Loss: 0.00687
Epoch: 42316, Training Loss: 0.00843
Epoch: 42317, Training Loss: 0.00661
Epoch: 42317, Training Loss: 0.00687
Epoch: 42317, Training Loss: 0.00687
Epoch: 42317, Training Loss: 0.00843
Epoch: 42318, Training Loss: 0.00661
Epoch: 42318, Training Loss: 0.00687
Epoch: 42318, Training Loss: 0.00687
Epoch: 42318, Training Loss: 0.00843
Epoch: 42319, Training Loss: 0.00661
Epoch: 42319, Training Loss: 0.00687
Epoch: 42319, Training Loss: 0.00687
Epoch: 42319, Training Loss: 0.00843
Epoch: 42320, Training Loss: 0.00661
Epoch: 42320, Training Loss: 0.00687
Epoch: 42320, Training Loss: 0.00687
Epoch: 42320, Training Loss: 0.00843
Epoch: 42321, Training Loss: 0.00661
Epoch: 42321, Training Loss: 0.00687
Epoch: 42321, Training Loss: 0.00687
Epoch: 42321, Training Loss: 0.00843
Epoch: 42322, Training Loss: 0.00661
Epoch: 42322, Training Loss: 0.00687
Epoch: 42322, Training Loss: 0.00687
Epoch: 42322, Training Loss: 0.00843
Epoch: 42323, Training Loss: 0.00661
Epoch: 42323, Training Loss: 0.00687
Epoch: 42323, Training Loss: 0.00687
Epoch: 42323, Training Loss: 0.00843
Epoch: 42324, Training Loss: 0.00661
Epoch: 42324, Training Loss: 0.00687
Epoch: 42324, Training Loss: 0.00687
Epoch: 42324, Training Loss: 0.00843
Epoch: 42325, Training Loss: 0.00661
Epoch: 42325, Training Loss: 0.00687
Epoch: 42325, Training Loss: 0.00687
Epoch: 42325, Training Loss: 0.00843
Epoch: 42326, Training Loss: 0.00661
Epoch: 42326, Training Loss: 0.00687
Epoch: 42326, Training Loss: 0.00687
Epoch: 42326, Training Loss: 0.00843
Epoch: 42327, Training Loss: 0.00661
Epoch: 42327, Training Loss: 0.00687
Epoch: 42327, Training Loss: 0.00687
Epoch: 42327, Training Loss: 0.00843
Epoch: 42328, Training Loss: 0.00661
Epoch: 42328, Training Loss: 0.00687
Epoch: 42328, Training Loss: 0.00687
Epoch: 42328, Training Loss: 0.00843
Epoch: 42329, Training Loss: 0.00661
Epoch: 42329, Training Loss: 0.00687
Epoch: 42329, Training Loss: 0.00687
Epoch: 42329, Training Loss: 0.00843
Epoch: 42330, Training Loss: 0.00661
Epoch: 42330, Training Loss: 0.00687
Epoch: 42330, Training Loss: 0.00687
Epoch: 42330, Training Loss: 0.00843
Epoch: 42331, Training Loss: 0.00661
Epoch: 42331, Training Loss: 0.00687
Epoch: 42331, Training Loss: 0.00687
Epoch: 42331, Training Loss: 0.00843
Epoch: 42332, Training Loss: 0.00661
Epoch: 42332, Training Loss: 0.00687
Epoch: 42332, Training Loss: 0.00687
Epoch: 42332, Training Loss: 0.00843
Epoch: 42333, Training Loss: 0.00661
Epoch: 42333, Training Loss: 0.00686
Epoch: 42333, Training Loss: 0.00687
Epoch: 42333, Training Loss: 0.00843
Epoch: 42334, Training Loss: 0.00661
Epoch: 42334, Training Loss: 0.00686
Epoch: 42334, Training Loss: 0.00687
Epoch: 42334, Training Loss: 0.00843
Epoch: 42335, Training Loss: 0.00661
Epoch: 42335, Training Loss: 0.00686
Epoch: 42335, Training Loss: 0.00687
Epoch: 42335, Training Loss: 0.00843
Epoch: 42336, Training Loss: 0.00661
Epoch: 42336, Training Loss: 0.00686
Epoch: 42336, Training Loss: 0.00687
Epoch: 42336, Training Loss: 0.00843
Epoch: 42337, Training Loss: 0.00661
Epoch: 42337, Training Loss: 0.00686
Epoch: 42337, Training Loss: 0.00687
Epoch: 42337, Training Loss: 0.00843
Epoch: 42338, Training Loss: 0.00661
Epoch: 42338, Training Loss: 0.00686
Epoch: 42338, Training Loss: 0.00687
Epoch: 42338, Training Loss: 0.00843
Epoch: 42339, Training Loss: 0.00661
Epoch: 42339, Training Loss: 0.00686
Epoch: 42339, Training Loss: 0.00687
Epoch: 42339, Training Loss: 0.00843
Epoch: 42340, Training Loss: 0.00661
Epoch: 42340, Training Loss: 0.00686
Epoch: 42340, Training Loss: 0.00687
Epoch: 42340, Training Loss: 0.00843
Epoch: 42341, Training Loss: 0.00661
Epoch: 42341, Training Loss: 0.00686
Epoch: 42341, Training Loss: 0.00687
Epoch: 42341, Training Loss: 0.00842
Epoch: 42342, Training Loss: 0.00661
Epoch: 42342, Training Loss: 0.00686
Epoch: 42342, Training Loss: 0.00687
Epoch: 42342, Training Loss: 0.00842
Epoch: 42343, Training Loss: 0.00661
Epoch: 42343, Training Loss: 0.00686
Epoch: 42343, Training Loss: 0.00687
Epoch: 42343, Training Loss: 0.00842
Epoch: 42344, Training Loss: 0.00661
Epoch: 42344, Training Loss: 0.00686
Epoch: 42344, Training Loss: 0.00687
Epoch: 42344, Training Loss: 0.00842
Epoch: 42345, Training Loss: 0.00661
Epoch: 42345, Training Loss: 0.00686
Epoch: 42345, Training Loss: 0.00687
Epoch: 42345, Training Loss: 0.00842
Epoch: 42346, Training Loss: 0.00661
Epoch: 42346, Training Loss: 0.00686
Epoch: 42346, Training Loss: 0.00687
Epoch: 42346, Training Loss: 0.00842
Epoch: 42347, Training Loss: 0.00661
Epoch: 42347, Training Loss: 0.00686
Epoch: 42347, Training Loss: 0.00687
Epoch: 42347, Training Loss: 0.00842
Epoch: 42348, Training Loss: 0.00661
Epoch: 42348, Training Loss: 0.00686
Epoch: 42348, Training Loss: 0.00687
Epoch: 42348, Training Loss: 0.00842
Epoch: 42349, Training Loss: 0.00661
Epoch: 42349, Training Loss: 0.00686
Epoch: 42349, Training Loss: 0.00687
Epoch: 42349, Training Loss: 0.00842
Epoch: 42350, Training Loss: 0.00661
Epoch: 42350, Training Loss: 0.00686
Epoch: 42350, Training Loss: 0.00687
Epoch: 42350, Training Loss: 0.00842
Epoch: 42351, Training Loss: 0.00661
Epoch: 42351, Training Loss: 0.00686
Epoch: 42351, Training Loss: 0.00687
Epoch: 42351, Training Loss: 0.00842
Epoch: 42352, Training Loss: 0.00661
Epoch: 42352, Training Loss: 0.00686
Epoch: 42352, Training Loss: 0.00687
Epoch: 42352, Training Loss: 0.00842
Epoch: 42353, Training Loss: 0.00661
Epoch: 42353, Training Loss: 0.00686
Epoch: 42353, Training Loss: 0.00687
Epoch: 42353, Training Loss: 0.00842
Epoch: 42354, Training Loss: 0.00661
Epoch: 42354, Training Loss: 0.00686
Epoch: 42354, Training Loss: 0.00687
Epoch: 42354, Training Loss: 0.00842
Epoch: 42355, Training Loss: 0.00661
Epoch: 42355, Training Loss: 0.00686
Epoch: 42355, Training Loss: 0.00687
Epoch: 42355, Training Loss: 0.00842
Epoch: 42356, Training Loss: 0.00661
Epoch: 42356, Training Loss: 0.00686
Epoch: 42356, Training Loss: 0.00687
Epoch: 42356, Training Loss: 0.00842
Epoch: 42357, Training Loss: 0.00661
Epoch: 42357, Training Loss: 0.00686
Epoch: 42357, Training Loss: 0.00687
Epoch: 42357, Training Loss: 0.00842
Epoch: 42358, Training Loss: 0.00661
Epoch: 42358, Training Loss: 0.00686
Epoch: 42358, Training Loss: 0.00687
Epoch: 42358, Training Loss: 0.00842
Epoch: 42359, Training Loss: 0.00661
Epoch: 42359, Training Loss: 0.00686
Epoch: 42359, Training Loss: 0.00687
Epoch: 42359, Training Loss: 0.00842
Epoch: 42360, Training Loss: 0.00661
Epoch: 42360, Training Loss: 0.00686
Epoch: 42360, Training Loss: 0.00687
Epoch: 42360, Training Loss: 0.00842
Epoch: 42361, Training Loss: 0.00661
Epoch: 42361, Training Loss: 0.00686
Epoch: 42361, Training Loss: 0.00687
Epoch: 42361, Training Loss: 0.00842
Epoch: 42362, Training Loss: 0.00661
Epoch: 42362, Training Loss: 0.00686
Epoch: 42362, Training Loss: 0.00687
Epoch: 42362, Training Loss: 0.00842
Epoch: 42363, Training Loss: 0.00661
Epoch: 42363, Training Loss: 0.00686
Epoch: 42363, Training Loss: 0.00687
Epoch: 42363, Training Loss: 0.00842
Epoch: 42364, Training Loss: 0.00661
Epoch: 42364, Training Loss: 0.00686
Epoch: 42364, Training Loss: 0.00687
Epoch: 42364, Training Loss: 0.00842
Epoch: 42365, Training Loss: 0.00661
Epoch: 42365, Training Loss: 0.00686
Epoch: 42365, Training Loss: 0.00687
Epoch: 42365, Training Loss: 0.00842
Epoch: 42366, Training Loss: 0.00661
Epoch: 42366, Training Loss: 0.00686
Epoch: 42366, Training Loss: 0.00687
Epoch: 42366, Training Loss: 0.00842
Epoch: 42367, Training Loss: 0.00661
Epoch: 42367, Training Loss: 0.00686
Epoch: 42367, Training Loss: 0.00687
Epoch: 42367, Training Loss: 0.00842
Epoch: 42368, Training Loss: 0.00661
Epoch: 42368, Training Loss: 0.00686
Epoch: 42368, Training Loss: 0.00687
Epoch: 42368, Training Loss: 0.00842
Epoch: 42369, Training Loss: 0.00661
Epoch: 42369, Training Loss: 0.00686
Epoch: 42369, Training Loss: 0.00687
Epoch: 42369, Training Loss: 0.00842
Epoch: 42370, Training Loss: 0.00661
Epoch: 42370, Training Loss: 0.00686
Epoch: 42370, Training Loss: 0.00687
Epoch: 42370, Training Loss: 0.00842
Epoch: 42371, Training Loss: 0.00661
Epoch: 42371, Training Loss: 0.00686
Epoch: 42371, Training Loss: 0.00687
Epoch: 42371, Training Loss: 0.00842
Epoch: 42372, Training Loss: 0.00661
Epoch: 42372, Training Loss: 0.00686
Epoch: 42372, Training Loss: 0.00687
Epoch: 42372, Training Loss: 0.00842
Epoch: 42373, Training Loss: 0.00661
Epoch: 42373, Training Loss: 0.00686
Epoch: 42373, Training Loss: 0.00687
Epoch: 42373, Training Loss: 0.00842
Epoch: 42374, Training Loss: 0.00661
Epoch: 42374, Training Loss: 0.00686
Epoch: 42374, Training Loss: 0.00687
Epoch: 42374, Training Loss: 0.00842
Epoch: 42375, Training Loss: 0.00661
Epoch: 42375, Training Loss: 0.00686
Epoch: 42375, Training Loss: 0.00687
Epoch: 42375, Training Loss: 0.00842
Epoch: 42376, Training Loss: 0.00661
Epoch: 42376, Training Loss: 0.00686
Epoch: 42376, Training Loss: 0.00687
Epoch: 42376, Training Loss: 0.00842
Epoch: 42377, Training Loss: 0.00661
Epoch: 42377, Training Loss: 0.00686
Epoch: 42377, Training Loss: 0.00687
Epoch: 42377, Training Loss: 0.00842
Epoch: 42378, Training Loss: 0.00661
Epoch: 42378, Training Loss: 0.00686
Epoch: 42378, Training Loss: 0.00687
Epoch: 42378, Training Loss: 0.00842
Epoch: 42379, Training Loss: 0.00661
Epoch: 42379, Training Loss: 0.00686
Epoch: 42379, Training Loss: 0.00687
Epoch: 42379, Training Loss: 0.00842
Epoch: 42380, Training Loss: 0.00661
Epoch: 42380, Training Loss: 0.00686
Epoch: 42380, Training Loss: 0.00687
Epoch: 42380, Training Loss: 0.00842
Epoch: 42381, Training Loss: 0.00661
Epoch: 42381, Training Loss: 0.00686
Epoch: 42381, Training Loss: 0.00687
Epoch: 42381, Training Loss: 0.00842
Epoch: 42382, Training Loss: 0.00661
Epoch: 42382, Training Loss: 0.00686
Epoch: 42382, Training Loss: 0.00687
Epoch: 42382, Training Loss: 0.00842
Epoch: 42383, Training Loss: 0.00661
Epoch: 42383, Training Loss: 0.00686
Epoch: 42383, Training Loss: 0.00687
Epoch: 42383, Training Loss: 0.00842
Epoch: 42384, Training Loss: 0.00661
Epoch: 42384, Training Loss: 0.00686
Epoch: 42384, Training Loss: 0.00687
Epoch: 42384, Training Loss: 0.00842
Epoch: 42385, Training Loss: 0.00661
Epoch: 42385, Training Loss: 0.00686
Epoch: 42385, Training Loss: 0.00687
Epoch: 42385, Training Loss: 0.00842
Epoch: 42386, Training Loss: 0.00661
Epoch: 42386, Training Loss: 0.00686
Epoch: 42386, Training Loss: 0.00687
Epoch: 42386, Training Loss: 0.00842
Epoch: 42387, Training Loss: 0.00661
Epoch: 42387, Training Loss: 0.00686
Epoch: 42387, Training Loss: 0.00687
Epoch: 42387, Training Loss: 0.00842
Epoch: 42388, Training Loss: 0.00661
Epoch: 42388, Training Loss: 0.00686
Epoch: 42388, Training Loss: 0.00687
Epoch: 42388, Training Loss: 0.00842
Epoch: 42389, Training Loss: 0.00661
Epoch: 42389, Training Loss: 0.00686
Epoch: 42389, Training Loss: 0.00687
Epoch: 42389, Training Loss: 0.00842
Epoch: 42390, Training Loss: 0.00661
Epoch: 42390, Training Loss: 0.00686
Epoch: 42390, Training Loss: 0.00687
Epoch: 42390, Training Loss: 0.00842
Epoch: 42391, Training Loss: 0.00661
Epoch: 42391, Training Loss: 0.00686
Epoch: 42391, Training Loss: 0.00687
Epoch: 42391, Training Loss: 0.00842
Epoch: 42392, Training Loss: 0.00661
Epoch: 42392, Training Loss: 0.00686
Epoch: 42392, Training Loss: 0.00687
Epoch: 42392, Training Loss: 0.00842
Epoch: 42393, Training Loss: 0.00661
Epoch: 42393, Training Loss: 0.00686
Epoch: 42393, Training Loss: 0.00687
Epoch: 42393, Training Loss: 0.00842
Epoch: 42394, Training Loss: 0.00661
Epoch: 42394, Training Loss: 0.00686
Epoch: 42394, Training Loss: 0.00687
Epoch: 42394, Training Loss: 0.00842
Epoch: 42395, Training Loss: 0.00661
Epoch: 42395, Training Loss: 0.00686
Epoch: 42395, Training Loss: 0.00687
Epoch: 42395, Training Loss: 0.00842
Epoch: 42396, Training Loss: 0.00661
Epoch: 42396, Training Loss: 0.00686
Epoch: 42396, Training Loss: 0.00687
Epoch: 42396, Training Loss: 0.00842
Epoch: 42397, Training Loss: 0.00661
Epoch: 42397, Training Loss: 0.00686
Epoch: 42397, Training Loss: 0.00687
Epoch: 42397, Training Loss: 0.00842
Epoch: 42398, Training Loss: 0.00661
Epoch: 42398, Training Loss: 0.00686
Epoch: 42398, Training Loss: 0.00687
Epoch: 42398, Training Loss: 0.00842
Epoch: 42399, Training Loss: 0.00661
Epoch: 42399, Training Loss: 0.00686
Epoch: 42399, Training Loss: 0.00687
Epoch: 42399, Training Loss: 0.00842
Epoch: 42400, Training Loss: 0.00661
Epoch: 42400, Training Loss: 0.00686
Epoch: 42400, Training Loss: 0.00687
Epoch: 42400, Training Loss: 0.00842
Epoch: 42401, Training Loss: 0.00661
Epoch: 42401, Training Loss: 0.00686
Epoch: 42401, Training Loss: 0.00687
Epoch: 42401, Training Loss: 0.00842
Epoch: 42402, Training Loss: 0.00661
Epoch: 42402, Training Loss: 0.00686
Epoch: 42402, Training Loss: 0.00687
Epoch: 42402, Training Loss: 0.00842
Epoch: 42403, Training Loss: 0.00661
Epoch: 42403, Training Loss: 0.00686
Epoch: 42403, Training Loss: 0.00687
Epoch: 42403, Training Loss: 0.00842
Epoch: 42404, Training Loss: 0.00661
Epoch: 42404, Training Loss: 0.00686
Epoch: 42404, Training Loss: 0.00687
Epoch: 42404, Training Loss: 0.00842
Epoch: 42405, Training Loss: 0.00661
Epoch: 42405, Training Loss: 0.00686
Epoch: 42405, Training Loss: 0.00687
Epoch: 42405, Training Loss: 0.00842
Epoch: 42406, Training Loss: 0.00661
Epoch: 42406, Training Loss: 0.00686
Epoch: 42406, Training Loss: 0.00687
Epoch: 42406, Training Loss: 0.00842
Epoch: 42407, Training Loss: 0.00661
Epoch: 42407, Training Loss: 0.00686
Epoch: 42407, Training Loss: 0.00687
Epoch: 42407, Training Loss: 0.00842
Epoch: 42408, Training Loss: 0.00661
Epoch: 42408, Training Loss: 0.00686
Epoch: 42408, Training Loss: 0.00687
Epoch: 42408, Training Loss: 0.00842
Epoch: 42409, Training Loss: 0.00661
Epoch: 42409, Training Loss: 0.00686
Epoch: 42409, Training Loss: 0.00687
Epoch: 42409, Training Loss: 0.00842
Epoch: 42410, Training Loss: 0.00661
Epoch: 42410, Training Loss: 0.00686
Epoch: 42410, Training Loss: 0.00687
Epoch: 42410, Training Loss: 0.00842
Epoch: 42411, Training Loss: 0.00661
Epoch: 42411, Training Loss: 0.00686
Epoch: 42411, Training Loss: 0.00687
Epoch: 42411, Training Loss: 0.00842
Epoch: 42412, Training Loss: 0.00661
Epoch: 42412, Training Loss: 0.00686
Epoch: 42412, Training Loss: 0.00687
Epoch: 42412, Training Loss: 0.00842
Epoch: 42413, Training Loss: 0.00661
Epoch: 42413, Training Loss: 0.00686
Epoch: 42413, Training Loss: 0.00687
Epoch: 42413, Training Loss: 0.00842
Epoch: 42414, Training Loss: 0.00661
Epoch: 42414, Training Loss: 0.00686
Epoch: 42414, Training Loss: 0.00687
Epoch: 42414, Training Loss: 0.00842
Epoch: 42415, Training Loss: 0.00661
Epoch: 42415, Training Loss: 0.00686
Epoch: 42415, Training Loss: 0.00687
Epoch: 42415, Training Loss: 0.00842
Epoch: 42416, Training Loss: 0.00661
Epoch: 42416, Training Loss: 0.00686
Epoch: 42416, Training Loss: 0.00687
Epoch: 42416, Training Loss: 0.00842
Epoch: 42417, Training Loss: 0.00661
Epoch: 42417, Training Loss: 0.00686
Epoch: 42417, Training Loss: 0.00687
Epoch: 42417, Training Loss: 0.00842
Epoch: 42418, Training Loss: 0.00661
Epoch: 42418, Training Loss: 0.00686
Epoch: 42418, Training Loss: 0.00687
Epoch: 42418, Training Loss: 0.00842
Epoch: 42419, Training Loss: 0.00661
Epoch: 42419, Training Loss: 0.00686
Epoch: 42419, Training Loss: 0.00686
Epoch: 42419, Training Loss: 0.00842
Epoch: 42420, Training Loss: 0.00661
Epoch: 42420, Training Loss: 0.00686
Epoch: 42420, Training Loss: 0.00686
Epoch: 42420, Training Loss: 0.00842
Epoch: 42421, Training Loss: 0.00661
Epoch: 42421, Training Loss: 0.00686
Epoch: 42421, Training Loss: 0.00686
Epoch: 42421, Training Loss: 0.00842
Epoch: 42422, Training Loss: 0.00661
Epoch: 42422, Training Loss: 0.00686
Epoch: 42422, Training Loss: 0.00686
Epoch: 42422, Training Loss: 0.00842
Epoch: 42423, Training Loss: 0.00661
Epoch: 42423, Training Loss: 0.00686
Epoch: 42423, Training Loss: 0.00686
Epoch: 42423, Training Loss: 0.00842
Epoch: 42424, Training Loss: 0.00661
Epoch: 42424, Training Loss: 0.00686
Epoch: 42424, Training Loss: 0.00686
Epoch: 42424, Training Loss: 0.00842
Epoch: 42425, Training Loss: 0.00661
Epoch: 42425, Training Loss: 0.00686
Epoch: 42425, Training Loss: 0.00686
Epoch: 42425, Training Loss: 0.00842
Epoch: 42426, Training Loss: 0.00661
Epoch: 42426, Training Loss: 0.00686
Epoch: 42426, Training Loss: 0.00686
Epoch: 42426, Training Loss: 0.00842
Epoch: 42427, Training Loss: 0.00661
Epoch: 42427, Training Loss: 0.00686
Epoch: 42427, Training Loss: 0.00686
Epoch: 42427, Training Loss: 0.00842
Epoch: 42428, Training Loss: 0.00661
Epoch: 42428, Training Loss: 0.00686
Epoch: 42428, Training Loss: 0.00686
Epoch: 42428, Training Loss: 0.00842
Epoch: 42429, Training Loss: 0.00661
Epoch: 42429, Training Loss: 0.00686
Epoch: 42429, Training Loss: 0.00686
Epoch: 42429, Training Loss: 0.00842
Epoch: 42430, Training Loss: 0.00661
Epoch: 42430, Training Loss: 0.00686
Epoch: 42430, Training Loss: 0.00686
Epoch: 42430, Training Loss: 0.00842
Epoch: 42431, Training Loss: 0.00661
Epoch: 42431, Training Loss: 0.00686
Epoch: 42431, Training Loss: 0.00686
Epoch: 42431, Training Loss: 0.00842
Epoch: 42432, Training Loss: 0.00660
Epoch: 42432, Training Loss: 0.00686
Epoch: 42432, Training Loss: 0.00686
Epoch: 42432, Training Loss: 0.00841
Epoch: 42433, Training Loss: 0.00660
Epoch: 42433, Training Loss: 0.00686
Epoch: 42433, Training Loss: 0.00686
Epoch: 42433, Training Loss: 0.00841
[... repetitive log output elided: epochs 42434–42675 print the same four per-pattern losses each epoch, decreasing very slowly from 0.00660/0.00686/0.00686/0.00841 to 0.00658/0.00683/0.00684/0.00839 ...]
Epoch: 42676, Training Loss: 0.00658
Epoch: 42676, Training Loss: 0.00683
Epoch: 42676, Training Loss: 0.00684
Epoch: 42676, Training Loss: 0.00839
Epoch: 42677, Training Loss: 0.00658
Epoch: 42677, Training Loss: 0.00683
Epoch: 42677, Training Loss: 0.00684
Epoch: 42677, Training Loss: 0.00839
Epoch: 42678, Training Loss: 0.00658
Epoch: 42678, Training Loss: 0.00683
Epoch: 42678, Training Loss: 0.00684
Epoch: 42678, Training Loss: 0.00839
Epoch: 42679, Training Loss: 0.00658
Epoch: 42679, Training Loss: 0.00683
Epoch: 42679, Training Loss: 0.00684
Epoch: 42679, Training Loss: 0.00839
Epoch: 42680, Training Loss: 0.00658
Epoch: 42680, Training Loss: 0.00683
Epoch: 42680, Training Loss: 0.00684
Epoch: 42680, Training Loss: 0.00839
Epoch: 42681, Training Loss: 0.00658
Epoch: 42681, Training Loss: 0.00683
Epoch: 42681, Training Loss: 0.00684
Epoch: 42681, Training Loss: 0.00839
Epoch: 42682, Training Loss: 0.00658
Epoch: 42682, Training Loss: 0.00683
Epoch: 42682, Training Loss: 0.00684
Epoch: 42682, Training Loss: 0.00839
Epoch: 42683, Training Loss: 0.00658
Epoch: 42683, Training Loss: 0.00683
Epoch: 42683, Training Loss: 0.00684
Epoch: 42683, Training Loss: 0.00839
Epoch: 42684, Training Loss: 0.00658
Epoch: 42684, Training Loss: 0.00683
Epoch: 42684, Training Loss: 0.00684
Epoch: 42684, Training Loss: 0.00839
Epoch: 42685, Training Loss: 0.00658
Epoch: 42685, Training Loss: 0.00683
Epoch: 42685, Training Loss: 0.00684
Epoch: 42685, Training Loss: 0.00839
Epoch: 42686, Training Loss: 0.00658
Epoch: 42686, Training Loss: 0.00683
Epoch: 42686, Training Loss: 0.00684
Epoch: 42686, Training Loss: 0.00839
Epoch: 42687, Training Loss: 0.00658
Epoch: 42687, Training Loss: 0.00683
Epoch: 42687, Training Loss: 0.00684
Epoch: 42687, Training Loss: 0.00839
Epoch: 42688, Training Loss: 0.00658
Epoch: 42688, Training Loss: 0.00683
Epoch: 42688, Training Loss: 0.00684
Epoch: 42688, Training Loss: 0.00839
Epoch: 42689, Training Loss: 0.00658
Epoch: 42689, Training Loss: 0.00683
Epoch: 42689, Training Loss: 0.00684
Epoch: 42689, Training Loss: 0.00839
Epoch: 42690, Training Loss: 0.00658
Epoch: 42690, Training Loss: 0.00683
Epoch: 42690, Training Loss: 0.00684
Epoch: 42690, Training Loss: 0.00839
Epoch: 42691, Training Loss: 0.00658
Epoch: 42691, Training Loss: 0.00683
Epoch: 42691, Training Loss: 0.00684
Epoch: 42691, Training Loss: 0.00839
Epoch: 42692, Training Loss: 0.00658
Epoch: 42692, Training Loss: 0.00683
Epoch: 42692, Training Loss: 0.00684
Epoch: 42692, Training Loss: 0.00839
Epoch: 42693, Training Loss: 0.00658
Epoch: 42693, Training Loss: 0.00683
Epoch: 42693, Training Loss: 0.00684
Epoch: 42693, Training Loss: 0.00839
Epoch: 42694, Training Loss: 0.00658
Epoch: 42694, Training Loss: 0.00683
Epoch: 42694, Training Loss: 0.00684
Epoch: 42694, Training Loss: 0.00839
Epoch: 42695, Training Loss: 0.00658
Epoch: 42695, Training Loss: 0.00683
Epoch: 42695, Training Loss: 0.00684
Epoch: 42695, Training Loss: 0.00839
Epoch: 42696, Training Loss: 0.00658
Epoch: 42696, Training Loss: 0.00683
Epoch: 42696, Training Loss: 0.00684
Epoch: 42696, Training Loss: 0.00839
Epoch: 42697, Training Loss: 0.00658
Epoch: 42697, Training Loss: 0.00683
Epoch: 42697, Training Loss: 0.00684
Epoch: 42697, Training Loss: 0.00839
Epoch: 42698, Training Loss: 0.00658
Epoch: 42698, Training Loss: 0.00683
Epoch: 42698, Training Loss: 0.00684
Epoch: 42698, Training Loss: 0.00839
Epoch: 42699, Training Loss: 0.00658
Epoch: 42699, Training Loss: 0.00683
Epoch: 42699, Training Loss: 0.00684
Epoch: 42699, Training Loss: 0.00839
Epoch: 42700, Training Loss: 0.00658
Epoch: 42700, Training Loss: 0.00683
Epoch: 42700, Training Loss: 0.00684
Epoch: 42700, Training Loss: 0.00839
Epoch: 42701, Training Loss: 0.00658
Epoch: 42701, Training Loss: 0.00683
Epoch: 42701, Training Loss: 0.00684
Epoch: 42701, Training Loss: 0.00839
Epoch: 42702, Training Loss: 0.00658
Epoch: 42702, Training Loss: 0.00683
Epoch: 42702, Training Loss: 0.00684
Epoch: 42702, Training Loss: 0.00839
Epoch: 42703, Training Loss: 0.00658
Epoch: 42703, Training Loss: 0.00683
Epoch: 42703, Training Loss: 0.00684
Epoch: 42703, Training Loss: 0.00839
Epoch: 42704, Training Loss: 0.00658
Epoch: 42704, Training Loss: 0.00683
Epoch: 42704, Training Loss: 0.00684
Epoch: 42704, Training Loss: 0.00839
Epoch: 42705, Training Loss: 0.00658
Epoch: 42705, Training Loss: 0.00683
Epoch: 42705, Training Loss: 0.00684
Epoch: 42705, Training Loss: 0.00839
Epoch: 42706, Training Loss: 0.00658
Epoch: 42706, Training Loss: 0.00683
Epoch: 42706, Training Loss: 0.00684
Epoch: 42706, Training Loss: 0.00839
Epoch: 42707, Training Loss: 0.00658
Epoch: 42707, Training Loss: 0.00683
Epoch: 42707, Training Loss: 0.00684
Epoch: 42707, Training Loss: 0.00838
Epoch: 42708, Training Loss: 0.00658
Epoch: 42708, Training Loss: 0.00683
Epoch: 42708, Training Loss: 0.00684
Epoch: 42708, Training Loss: 0.00838
Epoch: 42709, Training Loss: 0.00658
Epoch: 42709, Training Loss: 0.00683
Epoch: 42709, Training Loss: 0.00684
Epoch: 42709, Training Loss: 0.00838
Epoch: 42710, Training Loss: 0.00658
Epoch: 42710, Training Loss: 0.00683
Epoch: 42710, Training Loss: 0.00684
Epoch: 42710, Training Loss: 0.00838
Epoch: 42711, Training Loss: 0.00658
Epoch: 42711, Training Loss: 0.00683
Epoch: 42711, Training Loss: 0.00684
Epoch: 42711, Training Loss: 0.00838
Epoch: 42712, Training Loss: 0.00658
Epoch: 42712, Training Loss: 0.00683
Epoch: 42712, Training Loss: 0.00684
Epoch: 42712, Training Loss: 0.00838
Epoch: 42713, Training Loss: 0.00658
Epoch: 42713, Training Loss: 0.00683
Epoch: 42713, Training Loss: 0.00684
Epoch: 42713, Training Loss: 0.00838
Epoch: 42714, Training Loss: 0.00658
Epoch: 42714, Training Loss: 0.00683
Epoch: 42714, Training Loss: 0.00684
Epoch: 42714, Training Loss: 0.00838
Epoch: 42715, Training Loss: 0.00658
Epoch: 42715, Training Loss: 0.00683
Epoch: 42715, Training Loss: 0.00684
Epoch: 42715, Training Loss: 0.00838
Epoch: 42716, Training Loss: 0.00658
Epoch: 42716, Training Loss: 0.00683
Epoch: 42716, Training Loss: 0.00684
Epoch: 42716, Training Loss: 0.00838
Epoch: 42717, Training Loss: 0.00658
Epoch: 42717, Training Loss: 0.00683
Epoch: 42717, Training Loss: 0.00684
Epoch: 42717, Training Loss: 0.00838
Epoch: 42718, Training Loss: 0.00658
Epoch: 42718, Training Loss: 0.00683
Epoch: 42718, Training Loss: 0.00684
Epoch: 42718, Training Loss: 0.00838
Epoch: 42719, Training Loss: 0.00658
Epoch: 42719, Training Loss: 0.00683
Epoch: 42719, Training Loss: 0.00684
Epoch: 42719, Training Loss: 0.00838
Epoch: 42720, Training Loss: 0.00658
Epoch: 42720, Training Loss: 0.00683
Epoch: 42720, Training Loss: 0.00684
Epoch: 42720, Training Loss: 0.00838
Epoch: 42721, Training Loss: 0.00658
Epoch: 42721, Training Loss: 0.00683
Epoch: 42721, Training Loss: 0.00684
Epoch: 42721, Training Loss: 0.00838
Epoch: 42722, Training Loss: 0.00658
Epoch: 42722, Training Loss: 0.00683
Epoch: 42722, Training Loss: 0.00684
Epoch: 42722, Training Loss: 0.00838
Epoch: 42723, Training Loss: 0.00658
Epoch: 42723, Training Loss: 0.00683
Epoch: 42723, Training Loss: 0.00684
Epoch: 42723, Training Loss: 0.00838
Epoch: 42724, Training Loss: 0.00658
Epoch: 42724, Training Loss: 0.00683
Epoch: 42724, Training Loss: 0.00684
Epoch: 42724, Training Loss: 0.00838
Epoch: 42725, Training Loss: 0.00658
Epoch: 42725, Training Loss: 0.00683
Epoch: 42725, Training Loss: 0.00684
Epoch: 42725, Training Loss: 0.00838
Epoch: 42726, Training Loss: 0.00658
Epoch: 42726, Training Loss: 0.00683
Epoch: 42726, Training Loss: 0.00684
Epoch: 42726, Training Loss: 0.00838
Epoch: 42727, Training Loss: 0.00658
Epoch: 42727, Training Loss: 0.00683
Epoch: 42727, Training Loss: 0.00684
Epoch: 42727, Training Loss: 0.00838
Epoch: 42728, Training Loss: 0.00658
Epoch: 42728, Training Loss: 0.00683
Epoch: 42728, Training Loss: 0.00684
Epoch: 42728, Training Loss: 0.00838
Epoch: 42729, Training Loss: 0.00658
Epoch: 42729, Training Loss: 0.00683
Epoch: 42729, Training Loss: 0.00684
Epoch: 42729, Training Loss: 0.00838
Epoch: 42730, Training Loss: 0.00658
Epoch: 42730, Training Loss: 0.00683
Epoch: 42730, Training Loss: 0.00684
Epoch: 42730, Training Loss: 0.00838
Epoch: 42731, Training Loss: 0.00658
Epoch: 42731, Training Loss: 0.00683
Epoch: 42731, Training Loss: 0.00684
Epoch: 42731, Training Loss: 0.00838
Epoch: 42732, Training Loss: 0.00658
Epoch: 42732, Training Loss: 0.00683
Epoch: 42732, Training Loss: 0.00684
Epoch: 42732, Training Loss: 0.00838
Epoch: 42733, Training Loss: 0.00658
Epoch: 42733, Training Loss: 0.00683
Epoch: 42733, Training Loss: 0.00684
Epoch: 42733, Training Loss: 0.00838
Epoch: 42734, Training Loss: 0.00658
Epoch: 42734, Training Loss: 0.00683
Epoch: 42734, Training Loss: 0.00684
Epoch: 42734, Training Loss: 0.00838
Epoch: 42735, Training Loss: 0.00658
Epoch: 42735, Training Loss: 0.00683
Epoch: 42735, Training Loss: 0.00684
Epoch: 42735, Training Loss: 0.00838
Epoch: 42736, Training Loss: 0.00658
Epoch: 42736, Training Loss: 0.00683
Epoch: 42736, Training Loss: 0.00684
Epoch: 42736, Training Loss: 0.00838
Epoch: 42737, Training Loss: 0.00658
Epoch: 42737, Training Loss: 0.00683
Epoch: 42737, Training Loss: 0.00684
Epoch: 42737, Training Loss: 0.00838
Epoch: 42738, Training Loss: 0.00658
Epoch: 42738, Training Loss: 0.00683
Epoch: 42738, Training Loss: 0.00684
Epoch: 42738, Training Loss: 0.00838
Epoch: 42739, Training Loss: 0.00658
Epoch: 42739, Training Loss: 0.00683
Epoch: 42739, Training Loss: 0.00684
Epoch: 42739, Training Loss: 0.00838
Epoch: 42740, Training Loss: 0.00658
Epoch: 42740, Training Loss: 0.00683
Epoch: 42740, Training Loss: 0.00684
Epoch: 42740, Training Loss: 0.00838
Epoch: 42741, Training Loss: 0.00658
Epoch: 42741, Training Loss: 0.00683
Epoch: 42741, Training Loss: 0.00684
Epoch: 42741, Training Loss: 0.00838
Epoch: 42742, Training Loss: 0.00658
Epoch: 42742, Training Loss: 0.00683
Epoch: 42742, Training Loss: 0.00684
Epoch: 42742, Training Loss: 0.00838
Epoch: 42743, Training Loss: 0.00658
Epoch: 42743, Training Loss: 0.00683
Epoch: 42743, Training Loss: 0.00684
Epoch: 42743, Training Loss: 0.00838
Epoch: 42744, Training Loss: 0.00658
Epoch: 42744, Training Loss: 0.00683
Epoch: 42744, Training Loss: 0.00684
Epoch: 42744, Training Loss: 0.00838
Epoch: 42745, Training Loss: 0.00658
Epoch: 42745, Training Loss: 0.00683
Epoch: 42745, Training Loss: 0.00684
Epoch: 42745, Training Loss: 0.00838
Epoch: 42746, Training Loss: 0.00658
Epoch: 42746, Training Loss: 0.00683
Epoch: 42746, Training Loss: 0.00684
Epoch: 42746, Training Loss: 0.00838
Epoch: 42747, Training Loss: 0.00658
Epoch: 42747, Training Loss: 0.00683
Epoch: 42747, Training Loss: 0.00684
Epoch: 42747, Training Loss: 0.00838
Epoch: 42748, Training Loss: 0.00658
Epoch: 42748, Training Loss: 0.00683
Epoch: 42748, Training Loss: 0.00684
Epoch: 42748, Training Loss: 0.00838
Epoch: 42749, Training Loss: 0.00658
Epoch: 42749, Training Loss: 0.00683
Epoch: 42749, Training Loss: 0.00684
Epoch: 42749, Training Loss: 0.00838
Epoch: 42750, Training Loss: 0.00658
Epoch: 42750, Training Loss: 0.00683
Epoch: 42750, Training Loss: 0.00684
Epoch: 42750, Training Loss: 0.00838
Epoch: 42751, Training Loss: 0.00658
Epoch: 42751, Training Loss: 0.00683
Epoch: 42751, Training Loss: 0.00684
Epoch: 42751, Training Loss: 0.00838
Epoch: 42752, Training Loss: 0.00658
Epoch: 42752, Training Loss: 0.00683
Epoch: 42752, Training Loss: 0.00684
Epoch: 42752, Training Loss: 0.00838
Epoch: 42753, Training Loss: 0.00658
Epoch: 42753, Training Loss: 0.00683
Epoch: 42753, Training Loss: 0.00684
Epoch: 42753, Training Loss: 0.00838
Epoch: 42754, Training Loss: 0.00658
Epoch: 42754, Training Loss: 0.00683
Epoch: 42754, Training Loss: 0.00684
Epoch: 42754, Training Loss: 0.00838
Epoch: 42755, Training Loss: 0.00658
Epoch: 42755, Training Loss: 0.00683
Epoch: 42755, Training Loss: 0.00684
Epoch: 42755, Training Loss: 0.00838
Epoch: 42756, Training Loss: 0.00658
Epoch: 42756, Training Loss: 0.00683
Epoch: 42756, Training Loss: 0.00684
Epoch: 42756, Training Loss: 0.00838
Epoch: 42757, Training Loss: 0.00658
Epoch: 42757, Training Loss: 0.00683
Epoch: 42757, Training Loss: 0.00684
Epoch: 42757, Training Loss: 0.00838
Epoch: 42758, Training Loss: 0.00658
Epoch: 42758, Training Loss: 0.00683
Epoch: 42758, Training Loss: 0.00684
Epoch: 42758, Training Loss: 0.00838
Epoch: 42759, Training Loss: 0.00658
Epoch: 42759, Training Loss: 0.00683
Epoch: 42759, Training Loss: 0.00684
Epoch: 42759, Training Loss: 0.00838
Epoch: 42760, Training Loss: 0.00658
Epoch: 42760, Training Loss: 0.00683
Epoch: 42760, Training Loss: 0.00683
Epoch: 42760, Training Loss: 0.00838
Epoch: 42761, Training Loss: 0.00658
Epoch: 42761, Training Loss: 0.00683
Epoch: 42761, Training Loss: 0.00683
Epoch: 42761, Training Loss: 0.00838
Epoch: 42762, Training Loss: 0.00658
Epoch: 42762, Training Loss: 0.00683
Epoch: 42762, Training Loss: 0.00683
Epoch: 42762, Training Loss: 0.00838
Epoch: 42763, Training Loss: 0.00658
Epoch: 42763, Training Loss: 0.00683
Epoch: 42763, Training Loss: 0.00683
Epoch: 42763, Training Loss: 0.00838
Epoch: 42764, Training Loss: 0.00658
Epoch: 42764, Training Loss: 0.00683
Epoch: 42764, Training Loss: 0.00683
Epoch: 42764, Training Loss: 0.00838
Epoch: 42765, Training Loss: 0.00658
Epoch: 42765, Training Loss: 0.00683
Epoch: 42765, Training Loss: 0.00683
Epoch: 42765, Training Loss: 0.00838
Epoch: 42766, Training Loss: 0.00658
Epoch: 42766, Training Loss: 0.00683
Epoch: 42766, Training Loss: 0.00683
Epoch: 42766, Training Loss: 0.00838
Epoch: 42767, Training Loss: 0.00658
Epoch: 42767, Training Loss: 0.00683
Epoch: 42767, Training Loss: 0.00683
Epoch: 42767, Training Loss: 0.00838
Epoch: 42768, Training Loss: 0.00658
Epoch: 42768, Training Loss: 0.00683
Epoch: 42768, Training Loss: 0.00683
Epoch: 42768, Training Loss: 0.00838
Epoch: 42769, Training Loss: 0.00658
Epoch: 42769, Training Loss: 0.00683
Epoch: 42769, Training Loss: 0.00683
Epoch: 42769, Training Loss: 0.00838
Epoch: 42770, Training Loss: 0.00658
Epoch: 42770, Training Loss: 0.00683
Epoch: 42770, Training Loss: 0.00683
Epoch: 42770, Training Loss: 0.00838
Epoch: 42771, Training Loss: 0.00658
Epoch: 42771, Training Loss: 0.00683
Epoch: 42771, Training Loss: 0.00683
Epoch: 42771, Training Loss: 0.00838
Epoch: 42772, Training Loss: 0.00658
Epoch: 42772, Training Loss: 0.00683
Epoch: 42772, Training Loss: 0.00683
Epoch: 42772, Training Loss: 0.00838
Epoch: 42773, Training Loss: 0.00658
Epoch: 42773, Training Loss: 0.00683
Epoch: 42773, Training Loss: 0.00683
Epoch: 42773, Training Loss: 0.00838
Epoch: 42774, Training Loss: 0.00658
Epoch: 42774, Training Loss: 0.00683
Epoch: 42774, Training Loss: 0.00683
Epoch: 42774, Training Loss: 0.00838
Epoch: 42775, Training Loss: 0.00658
Epoch: 42775, Training Loss: 0.00683
Epoch: 42775, Training Loss: 0.00683
Epoch: 42775, Training Loss: 0.00838
Epoch: 42776, Training Loss: 0.00658
Epoch: 42776, Training Loss: 0.00683
Epoch: 42776, Training Loss: 0.00683
Epoch: 42776, Training Loss: 0.00838
Epoch: 42777, Training Loss: 0.00658
Epoch: 42777, Training Loss: 0.00683
Epoch: 42777, Training Loss: 0.00683
Epoch: 42777, Training Loss: 0.00838
Epoch: 42778, Training Loss: 0.00658
Epoch: 42778, Training Loss: 0.00683
Epoch: 42778, Training Loss: 0.00683
Epoch: 42778, Training Loss: 0.00838
Epoch: 42779, Training Loss: 0.00658
Epoch: 42779, Training Loss: 0.00683
Epoch: 42779, Training Loss: 0.00683
Epoch: 42779, Training Loss: 0.00838
Epoch: 42780, Training Loss: 0.00658
Epoch: 42780, Training Loss: 0.00683
Epoch: 42780, Training Loss: 0.00683
Epoch: 42780, Training Loss: 0.00838
Epoch: 42781, Training Loss: 0.00658
Epoch: 42781, Training Loss: 0.00683
Epoch: 42781, Training Loss: 0.00683
Epoch: 42781, Training Loss: 0.00838
Epoch: 42782, Training Loss: 0.00658
Epoch: 42782, Training Loss: 0.00683
Epoch: 42782, Training Loss: 0.00683
Epoch: 42782, Training Loss: 0.00838
Epoch: 42783, Training Loss: 0.00658
Epoch: 42783, Training Loss: 0.00683
Epoch: 42783, Training Loss: 0.00683
Epoch: 42783, Training Loss: 0.00838
Epoch: 42784, Training Loss: 0.00658
Epoch: 42784, Training Loss: 0.00683
Epoch: 42784, Training Loss: 0.00683
Epoch: 42784, Training Loss: 0.00838
Epoch: 42785, Training Loss: 0.00658
Epoch: 42785, Training Loss: 0.00683
Epoch: 42785, Training Loss: 0.00683
Epoch: 42785, Training Loss: 0.00838
Epoch: 42786, Training Loss: 0.00658
Epoch: 42786, Training Loss: 0.00683
Epoch: 42786, Training Loss: 0.00683
Epoch: 42786, Training Loss: 0.00838
Epoch: 42787, Training Loss: 0.00658
Epoch: 42787, Training Loss: 0.00683
Epoch: 42787, Training Loss: 0.00683
Epoch: 42787, Training Loss: 0.00838
Epoch: 42788, Training Loss: 0.00658
Epoch: 42788, Training Loss: 0.00682
Epoch: 42788, Training Loss: 0.00683
Epoch: 42788, Training Loss: 0.00838
Epoch: 42789, Training Loss: 0.00658
Epoch: 42789, Training Loss: 0.00682
Epoch: 42789, Training Loss: 0.00683
Epoch: 42789, Training Loss: 0.00838
Epoch: 42790, Training Loss: 0.00658
Epoch: 42790, Training Loss: 0.00682
Epoch: 42790, Training Loss: 0.00683
Epoch: 42790, Training Loss: 0.00838
Epoch: 42791, Training Loss: 0.00658
Epoch: 42791, Training Loss: 0.00682
Epoch: 42791, Training Loss: 0.00683
Epoch: 42791, Training Loss: 0.00838
Epoch: 42792, Training Loss: 0.00658
Epoch: 42792, Training Loss: 0.00682
Epoch: 42792, Training Loss: 0.00683
Epoch: 42792, Training Loss: 0.00838
Epoch: 42793, Training Loss: 0.00658
Epoch: 42793, Training Loss: 0.00682
Epoch: 42793, Training Loss: 0.00683
Epoch: 42793, Training Loss: 0.00838
Epoch: 42794, Training Loss: 0.00657
Epoch: 42794, Training Loss: 0.00682
Epoch: 42794, Training Loss: 0.00683
Epoch: 42794, Training Loss: 0.00838
Epoch: 42795, Training Loss: 0.00657
Epoch: 42795, Training Loss: 0.00682
Epoch: 42795, Training Loss: 0.00683
Epoch: 42795, Training Loss: 0.00838
Epoch: 42796, Training Loss: 0.00657
Epoch: 42796, Training Loss: 0.00682
Epoch: 42796, Training Loss: 0.00683
Epoch: 42796, Training Loss: 0.00838
Epoch: 42797, Training Loss: 0.00657
Epoch: 42797, Training Loss: 0.00682
Epoch: 42797, Training Loss: 0.00683
Epoch: 42797, Training Loss: 0.00838
Epoch: 42798, Training Loss: 0.00657
Epoch: 42798, Training Loss: 0.00682
Epoch: 42798, Training Loss: 0.00683
Epoch: 42798, Training Loss: 0.00838
Epoch: 42799, Training Loss: 0.00657
Epoch: 42799, Training Loss: 0.00682
Epoch: 42799, Training Loss: 0.00683
Epoch: 42799, Training Loss: 0.00838
Epoch: 42800, Training Loss: 0.00657
Epoch: 42800, Training Loss: 0.00682
Epoch: 42800, Training Loss: 0.00683
Epoch: 42800, Training Loss: 0.00837
Epoch: 42801, Training Loss: 0.00657
Epoch: 42801, Training Loss: 0.00682
Epoch: 42801, Training Loss: 0.00683
Epoch: 42801, Training Loss: 0.00837
Epoch: 42802, Training Loss: 0.00657
Epoch: 42802, Training Loss: 0.00682
Epoch: 42802, Training Loss: 0.00683
Epoch: 42802, Training Loss: 0.00837
Epoch: 42803, Training Loss: 0.00657
Epoch: 42803, Training Loss: 0.00682
Epoch: 42803, Training Loss: 0.00683
Epoch: 42803, Training Loss: 0.00837
Epoch: 42804, Training Loss: 0.00657
Epoch: 42804, Training Loss: 0.00682
Epoch: 42804, Training Loss: 0.00683
Epoch: 42804, Training Loss: 0.00837
Epoch: 42805, Training Loss: 0.00657
Epoch: 42805, Training Loss: 0.00682
Epoch: 42805, Training Loss: 0.00683
Epoch: 42805, Training Loss: 0.00837
Epoch: 42806, Training Loss: 0.00657
Epoch: 42806, Training Loss: 0.00682
Epoch: 42806, Training Loss: 0.00683
Epoch: 42806, Training Loss: 0.00837
Epoch: 42807, Training Loss: 0.00657
Epoch: 42807, Training Loss: 0.00682
Epoch: 42807, Training Loss: 0.00683
Epoch: 42807, Training Loss: 0.00837
Epoch: 42808, Training Loss: 0.00657
Epoch: 42808, Training Loss: 0.00682
Epoch: 42808, Training Loss: 0.00683
Epoch: 42808, Training Loss: 0.00837
Epoch: 42809, Training Loss: 0.00657
Epoch: 42809, Training Loss: 0.00682
Epoch: 42809, Training Loss: 0.00683
Epoch: 42809, Training Loss: 0.00837
Epoch: 42810, Training Loss: 0.00657
Epoch: 42810, Training Loss: 0.00682
Epoch: 42810, Training Loss: 0.00683
Epoch: 42810, Training Loss: 0.00837
Epoch: 42811, Training Loss: 0.00657
Epoch: 42811, Training Loss: 0.00682
Epoch: 42811, Training Loss: 0.00683
Epoch: 42811, Training Loss: 0.00837
Epoch: 42812, Training Loss: 0.00657
Epoch: 42812, Training Loss: 0.00682
Epoch: 42812, Training Loss: 0.00683
Epoch: 42812, Training Loss: 0.00837
Epoch: 42813, Training Loss: 0.00657
Epoch: 42813, Training Loss: 0.00682
Epoch: 42813, Training Loss: 0.00683
Epoch: 42813, Training Loss: 0.00837
Epoch: 42814, Training Loss: 0.00657
Epoch: 42814, Training Loss: 0.00682
Epoch: 42814, Training Loss: 0.00683
Epoch: 42814, Training Loss: 0.00837
Epoch: 42815, Training Loss: 0.00657
Epoch: 42815, Training Loss: 0.00682
Epoch: 42815, Training Loss: 0.00683
Epoch: 42815, Training Loss: 0.00837
Epoch: 42816, Training Loss: 0.00657
Epoch: 42816, Training Loss: 0.00682
Epoch: 42816, Training Loss: 0.00683
Epoch: 42816, Training Loss: 0.00837
Epoch: 42817, Training Loss: 0.00657
Epoch: 42817, Training Loss: 0.00682
Epoch: 42817, Training Loss: 0.00683
Epoch: 42817, Training Loss: 0.00837
Epoch: 42818, Training Loss: 0.00657
Epoch: 42818, Training Loss: 0.00682
Epoch: 42818, Training Loss: 0.00683
Epoch: 42818, Training Loss: 0.00837
Epoch: 42819, Training Loss: 0.00657
Epoch: 42819, Training Loss: 0.00682
Epoch: 42819, Training Loss: 0.00683
Epoch: 42819, Training Loss: 0.00837
Epoch: 42820, Training Loss: 0.00657
Epoch: 42820, Training Loss: 0.00682
Epoch: 42820, Training Loss: 0.00683
Epoch: 42820, Training Loss: 0.00837
Epoch: 42821, Training Loss: 0.00657
Epoch: 42821, Training Loss: 0.00682
Epoch: 42821, Training Loss: 0.00683
Epoch: 42821, Training Loss: 0.00837
Epoch: 42822, Training Loss: 0.00657
Epoch: 42822, Training Loss: 0.00682
Epoch: 42822, Training Loss: 0.00683
Epoch: 42822, Training Loss: 0.00837
Epoch: 42823, Training Loss: 0.00657
Epoch: 42823, Training Loss: 0.00682
Epoch: 42823, Training Loss: 0.00683
Epoch: 42823, Training Loss: 0.00837
Epoch: 42824, Training Loss: 0.00657
Epoch: 42824, Training Loss: 0.00682
Epoch: 42824, Training Loss: 0.00683
Epoch: 42824, Training Loss: 0.00837
Epoch: 42825, Training Loss: 0.00657
Epoch: 42825, Training Loss: 0.00682
Epoch: 42825, Training Loss: 0.00683
Epoch: 42825, Training Loss: 0.00837
Epoch: 42826, Training Loss: 0.00657
Epoch: 42826, Training Loss: 0.00682
Epoch: 42826, Training Loss: 0.00683
Epoch: 42826, Training Loss: 0.00837
Epoch: 42827, Training Loss: 0.00657
Epoch: 42827, Training Loss: 0.00682
Epoch: 42827, Training Loss: 0.00683
Epoch: 42827, Training Loss: 0.00837
Epoch: 42828, Training Loss: 0.00657
Epoch: 42828, Training Loss: 0.00682
Epoch: 42828, Training Loss: 0.00683
Epoch: 42828, Training Loss: 0.00837
Epoch: 42829, Training Loss: 0.00657
Epoch: 42829, Training Loss: 0.00682
Epoch: 42829, Training Loss: 0.00683
Epoch: 42829, Training Loss: 0.00837
Epoch: 42830, Training Loss: 0.00657
Epoch: 42830, Training Loss: 0.00682
Epoch: 42830, Training Loss: 0.00683
Epoch: 42830, Training Loss: 0.00837
Epoch: 42831, Training Loss: 0.00657
Epoch: 42831, Training Loss: 0.00682
Epoch: 42831, Training Loss: 0.00683
Epoch: 42831, Training Loss: 0.00837
Epoch: 42832, Training Loss: 0.00657
Epoch: 42832, Training Loss: 0.00682
Epoch: 42832, Training Loss: 0.00683
Epoch: 42832, Training Loss: 0.00837
Epoch: 42833, Training Loss: 0.00657
Epoch: 42833, Training Loss: 0.00682
Epoch: 42833, Training Loss: 0.00683
Epoch: 42833, Training Loss: 0.00837
Epoch: 42834, Training Loss: 0.00657
Epoch: 42834, Training Loss: 0.00682
Epoch: 42834, Training Loss: 0.00683
Epoch: 42834, Training Loss: 0.00837
Epoch: 42835, Training Loss: 0.00657
Epoch: 42835, Training Loss: 0.00682
Epoch: 42835, Training Loss: 0.00683
Epoch: 42835, Training Loss: 0.00837
Epoch: 42836, Training Loss: 0.00657
Epoch: 42836, Training Loss: 0.00682
Epoch: 42836, Training Loss: 0.00683
Epoch: 42836, Training Loss: 0.00837
Epoch: 42837, Training Loss: 0.00657
Epoch: 42837, Training Loss: 0.00682
Epoch: 42837, Training Loss: 0.00683
Epoch: 42837, Training Loss: 0.00837
Epoch: 42838, Training Loss: 0.00657
Epoch: 42838, Training Loss: 0.00682
Epoch: 42838, Training Loss: 0.00683
Epoch: 42838, Training Loss: 0.00837
Epoch: 42839, Training Loss: 0.00657
Epoch: 42839, Training Loss: 0.00682
Epoch: 42839, Training Loss: 0.00683
Epoch: 42839, Training Loss: 0.00837
Epoch: 42840, Training Loss: 0.00657
Epoch: 42840, Training Loss: 0.00682
Epoch: 42840, Training Loss: 0.00683
Epoch: 42840, Training Loss: 0.00837
Epoch: 42841, Training Loss: 0.00657
Epoch: 42841, Training Loss: 0.00682
Epoch: 42841, Training Loss: 0.00683
Epoch: 42841, Training Loss: 0.00837
Epoch: 42842, Training Loss: 0.00657
Epoch: 42842, Training Loss: 0.00682
Epoch: 42842, Training Loss: 0.00683
Epoch: 42842, Training Loss: 0.00837
Epoch: 42843, Training Loss: 0.00657
Epoch: 42843, Training Loss: 0.00682
Epoch: 42843, Training Loss: 0.00683
Epoch: 42843, Training Loss: 0.00837
Epoch: 42844, Training Loss: 0.00657
Epoch: 42844, Training Loss: 0.00682
Epoch: 42844, Training Loss: 0.00683
Epoch: 42844, Training Loss: 0.00837
Epoch: 42845, Training Loss: 0.00657
Epoch: 42845, Training Loss: 0.00682
Epoch: 42845, Training Loss: 0.00683
Epoch: 42845, Training Loss: 0.00837
Epoch: 42846, Training Loss: 0.00657
Epoch: 42846, Training Loss: 0.00682
Epoch: 42846, Training Loss: 0.00683
Epoch: 42846, Training Loss: 0.00837
Epoch: 42847, Training Loss: 0.00657
Epoch: 42847, Training Loss: 0.00682
Epoch: 42847, Training Loss: 0.00683
Epoch: 42847, Training Loss: 0.00837
Epoch: 42848, Training Loss: 0.00657
Epoch: 42848, Training Loss: 0.00682
Epoch: 42848, Training Loss: 0.00683
Epoch: 42848, Training Loss: 0.00837
Epoch: 42849, Training Loss: 0.00657
Epoch: 42849, Training Loss: 0.00682
Epoch: 42849, Training Loss: 0.00683
Epoch: 42849, Training Loss: 0.00837
Epoch: 42850, Training Loss: 0.00657
Epoch: 42850, Training Loss: 0.00682
Epoch: 42850, Training Loss: 0.00683
Epoch: 42850, Training Loss: 0.00837
Epoch: 42851, Training Loss: 0.00657
Epoch: 42851, Training Loss: 0.00682
Epoch: 42851, Training Loss: 0.00683
Epoch: 42851, Training Loss: 0.00837
Epoch: 42852, Training Loss: 0.00657
Epoch: 42852, Training Loss: 0.00682
Epoch: 42852, Training Loss: 0.00683
Epoch: 42852, Training Loss: 0.00837
Epoch: 42853, Training Loss: 0.00657
Epoch: 42853, Training Loss: 0.00682
Epoch: 42853, Training Loss: 0.00683
Epoch: 42853, Training Loss: 0.00837
Epoch: 42854, Training Loss: 0.00657
Epoch: 42854, Training Loss: 0.00682
Epoch: 42854, Training Loss: 0.00683
Epoch: 42854, Training Loss: 0.00837
Epoch: 42855, Training Loss: 0.00657
Epoch: 42855, Training Loss: 0.00682
Epoch: 42855, Training Loss: 0.00683
Epoch: 42855, Training Loss: 0.00837
Epoch: 42856, Training Loss: 0.00657
Epoch: 42856, Training Loss: 0.00682
Epoch: 42856, Training Loss: 0.00683
Epoch: 42856, Training Loss: 0.00837
Epoch: 42857, Training Loss: 0.00657
Epoch: 42857, Training Loss: 0.00682
Epoch: 42857, Training Loss: 0.00683
Epoch: 42857, Training Loss: 0.00837
Epoch: 42858, Training Loss: 0.00657
Epoch: 42858, Training Loss: 0.00682
Epoch: 42858, Training Loss: 0.00683
Epoch: 42858, Training Loss: 0.00837
Epoch: 42859, Training Loss: 0.00657
Epoch: 42859, Training Loss: 0.00682
Epoch: 42859, Training Loss: 0.00683
Epoch: 42859, Training Loss: 0.00837
Epoch: 42860, Training Loss: 0.00657
Epoch: 42860, Training Loss: 0.00682
Epoch: 42860, Training Loss: 0.00683
Epoch: 42860, Training Loss: 0.00837
Epoch: 42861, Training Loss: 0.00657
Epoch: 42861, Training Loss: 0.00682
Epoch: 42861, Training Loss: 0.00683
Epoch: 42861, Training Loss: 0.00837
Epoch: 42862, Training Loss: 0.00657
Epoch: 42862, Training Loss: 0.00682
Epoch: 42862, Training Loss: 0.00683
Epoch: 42862, Training Loss: 0.00837
Epoch: 42863, Training Loss: 0.00657
Epoch: 42863, Training Loss: 0.00682
Epoch: 42863, Training Loss: 0.00683
Epoch: 42863, Training Loss: 0.00837
Epoch: 42864, Training Loss: 0.00657
Epoch: 42864, Training Loss: 0.00682
Epoch: 42864, Training Loss: 0.00683
Epoch: 42864, Training Loss: 0.00837
Epoch: 42865, Training Loss: 0.00657
Epoch: 42865, Training Loss: 0.00682
Epoch: 42865, Training Loss: 0.00683
Epoch: 42865, Training Loss: 0.00837
Epoch: 42866, Training Loss: 0.00657
Epoch: 42866, Training Loss: 0.00682
Epoch: 42866, Training Loss: 0.00683
Epoch: 42866, Training Loss: 0.00837
...
[output truncated: four per-pattern losses are printed every epoch; over epochs 42866-43109 they decrease slowly from roughly 0.00657 / 0.00682 / 0.00683 / 0.00837 to roughly 0.00655 / 0.00680 / 0.00680 / 0.00834]
...
Epoch: 43108, Training Loss: 0.00655
Epoch: 43108, Training Loss: 0.00680
Epoch: 43108, Training Loss: 0.00680
Epoch: 43108, Training Loss: 0.00834
Epoch: 43109, Training Loss: 0.00655
Epoch: 43109, Training Loss: 0.00680
Epoch: 43109, Training Loss: 0.00680
Epoch: 43109, Training Loss: 0.00834
Epoch: 43110, Training Loss: 0.00655
Epoch: 43110, Training Loss: 0.00680
Epoch: 43110, Training Loss: 0.00680
Epoch: 43110, Training Loss: 0.00834
Epoch: 43111, Training Loss: 0.00655
Epoch: 43111, Training Loss: 0.00680
Epoch: 43111, Training Loss: 0.00680
Epoch: 43111, Training Loss: 0.00834
Epoch: 43112, Training Loss: 0.00655
Epoch: 43112, Training Loss: 0.00680
Epoch: 43112, Training Loss: 0.00680
Epoch: 43112, Training Loss: 0.00834
Epoch: 43113, Training Loss: 0.00655
Epoch: 43113, Training Loss: 0.00680
Epoch: 43113, Training Loss: 0.00680
Epoch: 43113, Training Loss: 0.00834
Epoch: 43114, Training Loss: 0.00655
Epoch: 43114, Training Loss: 0.00680
Epoch: 43114, Training Loss: 0.00680
Epoch: 43114, Training Loss: 0.00834
Epoch: 43115, Training Loss: 0.00655
Epoch: 43115, Training Loss: 0.00680
Epoch: 43115, Training Loss: 0.00680
Epoch: 43115, Training Loss: 0.00834
Epoch: 43116, Training Loss: 0.00655
Epoch: 43116, Training Loss: 0.00680
Epoch: 43116, Training Loss: 0.00680
Epoch: 43116, Training Loss: 0.00834
Epoch: 43117, Training Loss: 0.00655
Epoch: 43117, Training Loss: 0.00680
Epoch: 43117, Training Loss: 0.00680
Epoch: 43117, Training Loss: 0.00834
Epoch: 43118, Training Loss: 0.00655
Epoch: 43118, Training Loss: 0.00680
Epoch: 43118, Training Loss: 0.00680
Epoch: 43118, Training Loss: 0.00834
Epoch: 43119, Training Loss: 0.00655
Epoch: 43119, Training Loss: 0.00680
Epoch: 43119, Training Loss: 0.00680
Epoch: 43119, Training Loss: 0.00834
Epoch: 43120, Training Loss: 0.00655
Epoch: 43120, Training Loss: 0.00680
Epoch: 43120, Training Loss: 0.00680
Epoch: 43120, Training Loss: 0.00834
Epoch: 43121, Training Loss: 0.00655
Epoch: 43121, Training Loss: 0.00680
Epoch: 43121, Training Loss: 0.00680
Epoch: 43121, Training Loss: 0.00834
Epoch: 43122, Training Loss: 0.00655
Epoch: 43122, Training Loss: 0.00680
Epoch: 43122, Training Loss: 0.00680
Epoch: 43122, Training Loss: 0.00834
Epoch: 43123, Training Loss: 0.00655
Epoch: 43123, Training Loss: 0.00680
Epoch: 43123, Training Loss: 0.00680
Epoch: 43123, Training Loss: 0.00834
Epoch: 43124, Training Loss: 0.00655
Epoch: 43124, Training Loss: 0.00680
Epoch: 43124, Training Loss: 0.00680
Epoch: 43124, Training Loss: 0.00834
Epoch: 43125, Training Loss: 0.00655
Epoch: 43125, Training Loss: 0.00680
Epoch: 43125, Training Loss: 0.00680
Epoch: 43125, Training Loss: 0.00834
Epoch: 43126, Training Loss: 0.00655
Epoch: 43126, Training Loss: 0.00680
Epoch: 43126, Training Loss: 0.00680
Epoch: 43126, Training Loss: 0.00834
Epoch: 43127, Training Loss: 0.00655
Epoch: 43127, Training Loss: 0.00680
Epoch: 43127, Training Loss: 0.00680
Epoch: 43127, Training Loss: 0.00834
Epoch: 43128, Training Loss: 0.00655
Epoch: 43128, Training Loss: 0.00680
Epoch: 43128, Training Loss: 0.00680
Epoch: 43128, Training Loss: 0.00834
Epoch: 43129, Training Loss: 0.00655
Epoch: 43129, Training Loss: 0.00680
Epoch: 43129, Training Loss: 0.00680
Epoch: 43129, Training Loss: 0.00834
Epoch: 43130, Training Loss: 0.00655
Epoch: 43130, Training Loss: 0.00680
Epoch: 43130, Training Loss: 0.00680
Epoch: 43130, Training Loss: 0.00834
Epoch: 43131, Training Loss: 0.00655
Epoch: 43131, Training Loss: 0.00680
Epoch: 43131, Training Loss: 0.00680
Epoch: 43131, Training Loss: 0.00834
Epoch: 43132, Training Loss: 0.00655
Epoch: 43132, Training Loss: 0.00680
Epoch: 43132, Training Loss: 0.00680
Epoch: 43132, Training Loss: 0.00834
Epoch: 43133, Training Loss: 0.00655
Epoch: 43133, Training Loss: 0.00680
Epoch: 43133, Training Loss: 0.00680
Epoch: 43133, Training Loss: 0.00834
Epoch: 43134, Training Loss: 0.00655
Epoch: 43134, Training Loss: 0.00679
Epoch: 43134, Training Loss: 0.00680
Epoch: 43134, Training Loss: 0.00834
Epoch: 43135, Training Loss: 0.00655
Epoch: 43135, Training Loss: 0.00679
Epoch: 43135, Training Loss: 0.00680
Epoch: 43135, Training Loss: 0.00834
Epoch: 43136, Training Loss: 0.00655
Epoch: 43136, Training Loss: 0.00679
Epoch: 43136, Training Loss: 0.00680
Epoch: 43136, Training Loss: 0.00834
Epoch: 43137, Training Loss: 0.00655
Epoch: 43137, Training Loss: 0.00679
Epoch: 43137, Training Loss: 0.00680
Epoch: 43137, Training Loss: 0.00834
Epoch: 43138, Training Loss: 0.00655
Epoch: 43138, Training Loss: 0.00679
Epoch: 43138, Training Loss: 0.00680
Epoch: 43138, Training Loss: 0.00834
Epoch: 43139, Training Loss: 0.00655
Epoch: 43139, Training Loss: 0.00679
Epoch: 43139, Training Loss: 0.00680
Epoch: 43139, Training Loss: 0.00834
Epoch: 43140, Training Loss: 0.00655
Epoch: 43140, Training Loss: 0.00679
Epoch: 43140, Training Loss: 0.00680
Epoch: 43140, Training Loss: 0.00834
Epoch: 43141, Training Loss: 0.00655
Epoch: 43141, Training Loss: 0.00679
Epoch: 43141, Training Loss: 0.00680
Epoch: 43141, Training Loss: 0.00834
Epoch: 43142, Training Loss: 0.00655
Epoch: 43142, Training Loss: 0.00679
Epoch: 43142, Training Loss: 0.00680
Epoch: 43142, Training Loss: 0.00834
Epoch: 43143, Training Loss: 0.00655
Epoch: 43143, Training Loss: 0.00679
Epoch: 43143, Training Loss: 0.00680
Epoch: 43143, Training Loss: 0.00834
Epoch: 43144, Training Loss: 0.00655
Epoch: 43144, Training Loss: 0.00679
Epoch: 43144, Training Loss: 0.00680
Epoch: 43144, Training Loss: 0.00834
Epoch: 43145, Training Loss: 0.00655
Epoch: 43145, Training Loss: 0.00679
Epoch: 43145, Training Loss: 0.00680
Epoch: 43145, Training Loss: 0.00834
Epoch: 43146, Training Loss: 0.00655
Epoch: 43146, Training Loss: 0.00679
Epoch: 43146, Training Loss: 0.00680
Epoch: 43146, Training Loss: 0.00834
Epoch: 43147, Training Loss: 0.00655
Epoch: 43147, Training Loss: 0.00679
Epoch: 43147, Training Loss: 0.00680
Epoch: 43147, Training Loss: 0.00834
Epoch: 43148, Training Loss: 0.00655
Epoch: 43148, Training Loss: 0.00679
Epoch: 43148, Training Loss: 0.00680
Epoch: 43148, Training Loss: 0.00834
Epoch: 43149, Training Loss: 0.00655
Epoch: 43149, Training Loss: 0.00679
Epoch: 43149, Training Loss: 0.00680
Epoch: 43149, Training Loss: 0.00834
Epoch: 43150, Training Loss: 0.00655
Epoch: 43150, Training Loss: 0.00679
Epoch: 43150, Training Loss: 0.00680
Epoch: 43150, Training Loss: 0.00834
Epoch: 43151, Training Loss: 0.00655
Epoch: 43151, Training Loss: 0.00679
Epoch: 43151, Training Loss: 0.00680
Epoch: 43151, Training Loss: 0.00834
Epoch: 43152, Training Loss: 0.00655
Epoch: 43152, Training Loss: 0.00679
Epoch: 43152, Training Loss: 0.00680
Epoch: 43152, Training Loss: 0.00834
Epoch: 43153, Training Loss: 0.00655
Epoch: 43153, Training Loss: 0.00679
Epoch: 43153, Training Loss: 0.00680
Epoch: 43153, Training Loss: 0.00834
Epoch: 43154, Training Loss: 0.00655
Epoch: 43154, Training Loss: 0.00679
Epoch: 43154, Training Loss: 0.00680
Epoch: 43154, Training Loss: 0.00834
Epoch: 43155, Training Loss: 0.00655
Epoch: 43155, Training Loss: 0.00679
Epoch: 43155, Training Loss: 0.00680
Epoch: 43155, Training Loss: 0.00834
Epoch: 43156, Training Loss: 0.00655
Epoch: 43156, Training Loss: 0.00679
Epoch: 43156, Training Loss: 0.00680
Epoch: 43156, Training Loss: 0.00834
Epoch: 43157, Training Loss: 0.00655
Epoch: 43157, Training Loss: 0.00679
Epoch: 43157, Training Loss: 0.00680
Epoch: 43157, Training Loss: 0.00834
Epoch: 43158, Training Loss: 0.00655
Epoch: 43158, Training Loss: 0.00679
Epoch: 43158, Training Loss: 0.00680
Epoch: 43158, Training Loss: 0.00834
Epoch: 43159, Training Loss: 0.00655
Epoch: 43159, Training Loss: 0.00679
Epoch: 43159, Training Loss: 0.00680
Epoch: 43159, Training Loss: 0.00834
Epoch: 43160, Training Loss: 0.00655
Epoch: 43160, Training Loss: 0.00679
Epoch: 43160, Training Loss: 0.00680
Epoch: 43160, Training Loss: 0.00834
Epoch: 43161, Training Loss: 0.00654
Epoch: 43161, Training Loss: 0.00679
Epoch: 43161, Training Loss: 0.00680
Epoch: 43161, Training Loss: 0.00834
Epoch: 43162, Training Loss: 0.00654
Epoch: 43162, Training Loss: 0.00679
Epoch: 43162, Training Loss: 0.00680
Epoch: 43162, Training Loss: 0.00834
Epoch: 43163, Training Loss: 0.00654
Epoch: 43163, Training Loss: 0.00679
Epoch: 43163, Training Loss: 0.00680
Epoch: 43163, Training Loss: 0.00834
Epoch: 43164, Training Loss: 0.00654
Epoch: 43164, Training Loss: 0.00679
Epoch: 43164, Training Loss: 0.00680
Epoch: 43164, Training Loss: 0.00834
Epoch: 43165, Training Loss: 0.00654
Epoch: 43165, Training Loss: 0.00679
Epoch: 43165, Training Loss: 0.00680
Epoch: 43165, Training Loss: 0.00834
Epoch: 43166, Training Loss: 0.00654
Epoch: 43166, Training Loss: 0.00679
Epoch: 43166, Training Loss: 0.00680
Epoch: 43166, Training Loss: 0.00834
Epoch: 43167, Training Loss: 0.00654
Epoch: 43167, Training Loss: 0.00679
Epoch: 43167, Training Loss: 0.00680
Epoch: 43167, Training Loss: 0.00834
Epoch: 43168, Training Loss: 0.00654
Epoch: 43168, Training Loss: 0.00679
Epoch: 43168, Training Loss: 0.00680
Epoch: 43168, Training Loss: 0.00834
Epoch: 43169, Training Loss: 0.00654
Epoch: 43169, Training Loss: 0.00679
Epoch: 43169, Training Loss: 0.00680
Epoch: 43169, Training Loss: 0.00834
Epoch: 43170, Training Loss: 0.00654
Epoch: 43170, Training Loss: 0.00679
Epoch: 43170, Training Loss: 0.00680
Epoch: 43170, Training Loss: 0.00834
Epoch: 43171, Training Loss: 0.00654
Epoch: 43171, Training Loss: 0.00679
Epoch: 43171, Training Loss: 0.00680
Epoch: 43171, Training Loss: 0.00834
Epoch: 43172, Training Loss: 0.00654
Epoch: 43172, Training Loss: 0.00679
Epoch: 43172, Training Loss: 0.00680
Epoch: 43172, Training Loss: 0.00834
Epoch: 43173, Training Loss: 0.00654
Epoch: 43173, Training Loss: 0.00679
Epoch: 43173, Training Loss: 0.00680
Epoch: 43173, Training Loss: 0.00833
Epoch: 43174, Training Loss: 0.00654
Epoch: 43174, Training Loss: 0.00679
Epoch: 43174, Training Loss: 0.00680
Epoch: 43174, Training Loss: 0.00833
Epoch: 43175, Training Loss: 0.00654
Epoch: 43175, Training Loss: 0.00679
Epoch: 43175, Training Loss: 0.00680
Epoch: 43175, Training Loss: 0.00833
Epoch: 43176, Training Loss: 0.00654
Epoch: 43176, Training Loss: 0.00679
Epoch: 43176, Training Loss: 0.00680
Epoch: 43176, Training Loss: 0.00833
Epoch: 43177, Training Loss: 0.00654
Epoch: 43177, Training Loss: 0.00679
Epoch: 43177, Training Loss: 0.00680
Epoch: 43177, Training Loss: 0.00833
Epoch: 43178, Training Loss: 0.00654
Epoch: 43178, Training Loss: 0.00679
Epoch: 43178, Training Loss: 0.00680
Epoch: 43178, Training Loss: 0.00833
Epoch: 43179, Training Loss: 0.00654
Epoch: 43179, Training Loss: 0.00679
Epoch: 43179, Training Loss: 0.00680
Epoch: 43179, Training Loss: 0.00833
Epoch: 43180, Training Loss: 0.00654
Epoch: 43180, Training Loss: 0.00679
Epoch: 43180, Training Loss: 0.00680
Epoch: 43180, Training Loss: 0.00833
Epoch: 43181, Training Loss: 0.00654
Epoch: 43181, Training Loss: 0.00679
Epoch: 43181, Training Loss: 0.00680
Epoch: 43181, Training Loss: 0.00833
Epoch: 43182, Training Loss: 0.00654
Epoch: 43182, Training Loss: 0.00679
Epoch: 43182, Training Loss: 0.00680
Epoch: 43182, Training Loss: 0.00833
Epoch: 43183, Training Loss: 0.00654
Epoch: 43183, Training Loss: 0.00679
Epoch: 43183, Training Loss: 0.00680
Epoch: 43183, Training Loss: 0.00833
Epoch: 43184, Training Loss: 0.00654
Epoch: 43184, Training Loss: 0.00679
Epoch: 43184, Training Loss: 0.00680
Epoch: 43184, Training Loss: 0.00833
Epoch: 43185, Training Loss: 0.00654
Epoch: 43185, Training Loss: 0.00679
Epoch: 43185, Training Loss: 0.00680
Epoch: 43185, Training Loss: 0.00833
Epoch: 43186, Training Loss: 0.00654
Epoch: 43186, Training Loss: 0.00679
Epoch: 43186, Training Loss: 0.00680
Epoch: 43186, Training Loss: 0.00833
Epoch: 43187, Training Loss: 0.00654
Epoch: 43187, Training Loss: 0.00679
Epoch: 43187, Training Loss: 0.00680
Epoch: 43187, Training Loss: 0.00833
Epoch: 43188, Training Loss: 0.00654
Epoch: 43188, Training Loss: 0.00679
Epoch: 43188, Training Loss: 0.00680
Epoch: 43188, Training Loss: 0.00833
Epoch: 43189, Training Loss: 0.00654
Epoch: 43189, Training Loss: 0.00679
Epoch: 43189, Training Loss: 0.00680
Epoch: 43189, Training Loss: 0.00833
Epoch: 43190, Training Loss: 0.00654
Epoch: 43190, Training Loss: 0.00679
Epoch: 43190, Training Loss: 0.00680
Epoch: 43190, Training Loss: 0.00833
Epoch: 43191, Training Loss: 0.00654
Epoch: 43191, Training Loss: 0.00679
Epoch: 43191, Training Loss: 0.00680
Epoch: 43191, Training Loss: 0.00833
Epoch: 43192, Training Loss: 0.00654
Epoch: 43192, Training Loss: 0.00679
Epoch: 43192, Training Loss: 0.00680
Epoch: 43192, Training Loss: 0.00833
Epoch: 43193, Training Loss: 0.00654
Epoch: 43193, Training Loss: 0.00679
Epoch: 43193, Training Loss: 0.00680
Epoch: 43193, Training Loss: 0.00833
Epoch: 43194, Training Loss: 0.00654
Epoch: 43194, Training Loss: 0.00679
Epoch: 43194, Training Loss: 0.00680
Epoch: 43194, Training Loss: 0.00833
Epoch: 43195, Training Loss: 0.00654
Epoch: 43195, Training Loss: 0.00679
Epoch: 43195, Training Loss: 0.00680
Epoch: 43195, Training Loss: 0.00833
Epoch: 43196, Training Loss: 0.00654
Epoch: 43196, Training Loss: 0.00679
Epoch: 43196, Training Loss: 0.00680
Epoch: 43196, Training Loss: 0.00833
Epoch: 43197, Training Loss: 0.00654
Epoch: 43197, Training Loss: 0.00679
Epoch: 43197, Training Loss: 0.00680
Epoch: 43197, Training Loss: 0.00833
Epoch: 43198, Training Loss: 0.00654
Epoch: 43198, Training Loss: 0.00679
Epoch: 43198, Training Loss: 0.00680
Epoch: 43198, Training Loss: 0.00833
Epoch: 43199, Training Loss: 0.00654
Epoch: 43199, Training Loss: 0.00679
Epoch: 43199, Training Loss: 0.00680
Epoch: 43199, Training Loss: 0.00833
Epoch: 43200, Training Loss: 0.00654
Epoch: 43200, Training Loss: 0.00679
Epoch: 43200, Training Loss: 0.00680
Epoch: 43200, Training Loss: 0.00833
Epoch: 43201, Training Loss: 0.00654
Epoch: 43201, Training Loss: 0.00679
Epoch: 43201, Training Loss: 0.00680
Epoch: 43201, Training Loss: 0.00833
Epoch: 43202, Training Loss: 0.00654
Epoch: 43202, Training Loss: 0.00679
Epoch: 43202, Training Loss: 0.00680
Epoch: 43202, Training Loss: 0.00833
Epoch: 43203, Training Loss: 0.00654
Epoch: 43203, Training Loss: 0.00679
Epoch: 43203, Training Loss: 0.00680
Epoch: 43203, Training Loss: 0.00833
Epoch: 43204, Training Loss: 0.00654
Epoch: 43204, Training Loss: 0.00679
Epoch: 43204, Training Loss: 0.00680
Epoch: 43204, Training Loss: 0.00833
Epoch: 43205, Training Loss: 0.00654
Epoch: 43205, Training Loss: 0.00679
Epoch: 43205, Training Loss: 0.00680
Epoch: 43205, Training Loss: 0.00833
Epoch: 43206, Training Loss: 0.00654
Epoch: 43206, Training Loss: 0.00679
Epoch: 43206, Training Loss: 0.00680
Epoch: 43206, Training Loss: 0.00833
Epoch: 43207, Training Loss: 0.00654
Epoch: 43207, Training Loss: 0.00679
Epoch: 43207, Training Loss: 0.00680
Epoch: 43207, Training Loss: 0.00833
Epoch: 43208, Training Loss: 0.00654
Epoch: 43208, Training Loss: 0.00679
Epoch: 43208, Training Loss: 0.00680
Epoch: 43208, Training Loss: 0.00833
Epoch: 43209, Training Loss: 0.00654
Epoch: 43209, Training Loss: 0.00679
Epoch: 43209, Training Loss: 0.00680
Epoch: 43209, Training Loss: 0.00833
Epoch: 43210, Training Loss: 0.00654
Epoch: 43210, Training Loss: 0.00679
Epoch: 43210, Training Loss: 0.00680
Epoch: 43210, Training Loss: 0.00833
Epoch: 43211, Training Loss: 0.00654
Epoch: 43211, Training Loss: 0.00679
Epoch: 43211, Training Loss: 0.00680
Epoch: 43211, Training Loss: 0.00833
Epoch: 43212, Training Loss: 0.00654
Epoch: 43212, Training Loss: 0.00679
Epoch: 43212, Training Loss: 0.00680
Epoch: 43212, Training Loss: 0.00833
Epoch: 43213, Training Loss: 0.00654
Epoch: 43213, Training Loss: 0.00679
Epoch: 43213, Training Loss: 0.00680
Epoch: 43213, Training Loss: 0.00833
Epoch: 43214, Training Loss: 0.00654
Epoch: 43214, Training Loss: 0.00679
Epoch: 43214, Training Loss: 0.00680
Epoch: 43214, Training Loss: 0.00833
Epoch: 43215, Training Loss: 0.00654
Epoch: 43215, Training Loss: 0.00679
Epoch: 43215, Training Loss: 0.00680
Epoch: 43215, Training Loss: 0.00833
Epoch: 43216, Training Loss: 0.00654
Epoch: 43216, Training Loss: 0.00679
Epoch: 43216, Training Loss: 0.00680
Epoch: 43216, Training Loss: 0.00833
Epoch: 43217, Training Loss: 0.00654
Epoch: 43217, Training Loss: 0.00679
Epoch: 43217, Training Loss: 0.00680
Epoch: 43217, Training Loss: 0.00833
Epoch: 43218, Training Loss: 0.00654
Epoch: 43218, Training Loss: 0.00679
Epoch: 43218, Training Loss: 0.00680
Epoch: 43218, Training Loss: 0.00833
Epoch: 43219, Training Loss: 0.00654
Epoch: 43219, Training Loss: 0.00679
Epoch: 43219, Training Loss: 0.00680
Epoch: 43219, Training Loss: 0.00833
Epoch: 43220, Training Loss: 0.00654
Epoch: 43220, Training Loss: 0.00679
Epoch: 43220, Training Loss: 0.00680
Epoch: 43220, Training Loss: 0.00833
Epoch: 43221, Training Loss: 0.00654
Epoch: 43221, Training Loss: 0.00679
Epoch: 43221, Training Loss: 0.00679
Epoch: 43221, Training Loss: 0.00833
Epoch: 43222, Training Loss: 0.00654
Epoch: 43222, Training Loss: 0.00679
Epoch: 43222, Training Loss: 0.00679
Epoch: 43222, Training Loss: 0.00833
Epoch: 43223, Training Loss: 0.00654
Epoch: 43223, Training Loss: 0.00679
Epoch: 43223, Training Loss: 0.00679
Epoch: 43223, Training Loss: 0.00833
Epoch: 43224, Training Loss: 0.00654
Epoch: 43224, Training Loss: 0.00679
Epoch: 43224, Training Loss: 0.00679
Epoch: 43224, Training Loss: 0.00833
Epoch: 43225, Training Loss: 0.00654
Epoch: 43225, Training Loss: 0.00679
Epoch: 43225, Training Loss: 0.00679
Epoch: 43225, Training Loss: 0.00833
Epoch: 43226, Training Loss: 0.00654
Epoch: 43226, Training Loss: 0.00679
Epoch: 43226, Training Loss: 0.00679
Epoch: 43226, Training Loss: 0.00833
Epoch: 43227, Training Loss: 0.00654
Epoch: 43227, Training Loss: 0.00679
Epoch: 43227, Training Loss: 0.00679
Epoch: 43227, Training Loss: 0.00833
Epoch: 43228, Training Loss: 0.00654
Epoch: 43228, Training Loss: 0.00679
Epoch: 43228, Training Loss: 0.00679
Epoch: 43228, Training Loss: 0.00833
Epoch: 43229, Training Loss: 0.00654
Epoch: 43229, Training Loss: 0.00679
Epoch: 43229, Training Loss: 0.00679
Epoch: 43229, Training Loss: 0.00833
Epoch: 43230, Training Loss: 0.00654
Epoch: 43230, Training Loss: 0.00679
Epoch: 43230, Training Loss: 0.00679
Epoch: 43230, Training Loss: 0.00833
Epoch: 43231, Training Loss: 0.00654
Epoch: 43231, Training Loss: 0.00679
Epoch: 43231, Training Loss: 0.00679
Epoch: 43231, Training Loss: 0.00833
Epoch: 43232, Training Loss: 0.00654
Epoch: 43232, Training Loss: 0.00679
Epoch: 43232, Training Loss: 0.00679
Epoch: 43232, Training Loss: 0.00833
Epoch: 43233, Training Loss: 0.00654
Epoch: 43233, Training Loss: 0.00679
Epoch: 43233, Training Loss: 0.00679
Epoch: 43233, Training Loss: 0.00833
Epoch: 43234, Training Loss: 0.00654
Epoch: 43234, Training Loss: 0.00679
Epoch: 43234, Training Loss: 0.00679
Epoch: 43234, Training Loss: 0.00833
Epoch: 43235, Training Loss: 0.00654
Epoch: 43235, Training Loss: 0.00679
Epoch: 43235, Training Loss: 0.00679
Epoch: 43235, Training Loss: 0.00833
Epoch: 43236, Training Loss: 0.00654
Epoch: 43236, Training Loss: 0.00679
Epoch: 43236, Training Loss: 0.00679
Epoch: 43236, Training Loss: 0.00833
Epoch: 43237, Training Loss: 0.00654
Epoch: 43237, Training Loss: 0.00679
Epoch: 43237, Training Loss: 0.00679
Epoch: 43237, Training Loss: 0.00833
Epoch: 43238, Training Loss: 0.00654
Epoch: 43238, Training Loss: 0.00679
Epoch: 43238, Training Loss: 0.00679
Epoch: 43238, Training Loss: 0.00833
Epoch: 43239, Training Loss: 0.00654
Epoch: 43239, Training Loss: 0.00679
Epoch: 43239, Training Loss: 0.00679
Epoch: 43239, Training Loss: 0.00833
Epoch: 43240, Training Loss: 0.00654
Epoch: 43240, Training Loss: 0.00679
Epoch: 43240, Training Loss: 0.00679
Epoch: 43240, Training Loss: 0.00833
Epoch: 43241, Training Loss: 0.00654
Epoch: 43241, Training Loss: 0.00679
Epoch: 43241, Training Loss: 0.00679
Epoch: 43241, Training Loss: 0.00833
Epoch: 43242, Training Loss: 0.00654
Epoch: 43242, Training Loss: 0.00679
Epoch: 43242, Training Loss: 0.00679
Epoch: 43242, Training Loss: 0.00833
Epoch: 43243, Training Loss: 0.00654
Epoch: 43243, Training Loss: 0.00679
Epoch: 43243, Training Loss: 0.00679
Epoch: 43243, Training Loss: 0.00833
Epoch: 43244, Training Loss: 0.00654
Epoch: 43244, Training Loss: 0.00679
Epoch: 43244, Training Loss: 0.00679
Epoch: 43244, Training Loss: 0.00833
Epoch: 43245, Training Loss: 0.00654
Epoch: 43245, Training Loss: 0.00679
Epoch: 43245, Training Loss: 0.00679
Epoch: 43245, Training Loss: 0.00833
Epoch: 43246, Training Loss: 0.00654
Epoch: 43246, Training Loss: 0.00679
Epoch: 43246, Training Loss: 0.00679
Epoch: 43246, Training Loss: 0.00833
Epoch: 43247, Training Loss: 0.00654
Epoch: 43247, Training Loss: 0.00679
Epoch: 43247, Training Loss: 0.00679
Epoch: 43247, Training Loss: 0.00833
Epoch: 43248, Training Loss: 0.00654
Epoch: 43248, Training Loss: 0.00679
Epoch: 43248, Training Loss: 0.00679
Epoch: 43248, Training Loss: 0.00833
Epoch: 43249, Training Loss: 0.00654
Epoch: 43249, Training Loss: 0.00679
Epoch: 43249, Training Loss: 0.00679
Epoch: 43249, Training Loss: 0.00833
Epoch: 43250, Training Loss: 0.00654
Epoch: 43250, Training Loss: 0.00679
Epoch: 43250, Training Loss: 0.00679
Epoch: 43250, Training Loss: 0.00833
Epoch: 43251, Training Loss: 0.00654
Epoch: 43251, Training Loss: 0.00678
Epoch: 43251, Training Loss: 0.00679
Epoch: 43251, Training Loss: 0.00833
Epoch: 43252, Training Loss: 0.00654
Epoch: 43252, Training Loss: 0.00678
Epoch: 43252, Training Loss: 0.00679
Epoch: 43252, Training Loss: 0.00833
Epoch: 43253, Training Loss: 0.00654
Epoch: 43253, Training Loss: 0.00678
Epoch: 43253, Training Loss: 0.00679
Epoch: 43253, Training Loss: 0.00833
Epoch: 43254, Training Loss: 0.00654
Epoch: 43254, Training Loss: 0.00678
Epoch: 43254, Training Loss: 0.00679
Epoch: 43254, Training Loss: 0.00833
Epoch: 43255, Training Loss: 0.00654
Epoch: 43255, Training Loss: 0.00678
Epoch: 43255, Training Loss: 0.00679
Epoch: 43255, Training Loss: 0.00833
Epoch: 43256, Training Loss: 0.00654
Epoch: 43256, Training Loss: 0.00678
Epoch: 43256, Training Loss: 0.00679
Epoch: 43256, Training Loss: 0.00833
Epoch: 43257, Training Loss: 0.00654
Epoch: 43257, Training Loss: 0.00678
Epoch: 43257, Training Loss: 0.00679
Epoch: 43257, Training Loss: 0.00833
Epoch: 43258, Training Loss: 0.00654
Epoch: 43258, Training Loss: 0.00678
Epoch: 43258, Training Loss: 0.00679
Epoch: 43258, Training Loss: 0.00833
Epoch: 43259, Training Loss: 0.00654
Epoch: 43259, Training Loss: 0.00678
Epoch: 43259, Training Loss: 0.00679
Epoch: 43259, Training Loss: 0.00833
Epoch: 43260, Training Loss: 0.00654
Epoch: 43260, Training Loss: 0.00678
Epoch: 43260, Training Loss: 0.00679
Epoch: 43260, Training Loss: 0.00833
Epoch: 43261, Training Loss: 0.00654
Epoch: 43261, Training Loss: 0.00678
Epoch: 43261, Training Loss: 0.00679
Epoch: 43261, Training Loss: 0.00833
Epoch: 43262, Training Loss: 0.00654
Epoch: 43262, Training Loss: 0.00678
Epoch: 43262, Training Loss: 0.00679
Epoch: 43262, Training Loss: 0.00833
Epoch: 43263, Training Loss: 0.00654
Epoch: 43263, Training Loss: 0.00678
Epoch: 43263, Training Loss: 0.00679
Epoch: 43263, Training Loss: 0.00833
Epoch: 43264, Training Loss: 0.00654
Epoch: 43264, Training Loss: 0.00678
Epoch: 43264, Training Loss: 0.00679
Epoch: 43264, Training Loss: 0.00833
Epoch: 43265, Training Loss: 0.00654
Epoch: 43265, Training Loss: 0.00678
Epoch: 43265, Training Loss: 0.00679
Epoch: 43265, Training Loss: 0.00833
Epoch: 43266, Training Loss: 0.00654
Epoch: 43266, Training Loss: 0.00678
Epoch: 43266, Training Loss: 0.00679
Epoch: 43266, Training Loss: 0.00833
Epoch: 43267, Training Loss: 0.00654
Epoch: 43267, Training Loss: 0.00678
Epoch: 43267, Training Loss: 0.00679
Epoch: 43267, Training Loss: 0.00832
Epoch: 43268, Training Loss: 0.00654
Epoch: 43268, Training Loss: 0.00678
Epoch: 43268, Training Loss: 0.00679
Epoch: 43268, Training Loss: 0.00832
Epoch: 43269, Training Loss: 0.00654
Epoch: 43269, Training Loss: 0.00678
Epoch: 43269, Training Loss: 0.00679
Epoch: 43269, Training Loss: 0.00832
Epoch: 43270, Training Loss: 0.00654
Epoch: 43270, Training Loss: 0.00678
Epoch: 43270, Training Loss: 0.00679
Epoch: 43270, Training Loss: 0.00832
Epoch: 43271, Training Loss: 0.00654
Epoch: 43271, Training Loss: 0.00678
Epoch: 43271, Training Loss: 0.00679
Epoch: 43271, Training Loss: 0.00832
Epoch: 43272, Training Loss: 0.00654
Epoch: 43272, Training Loss: 0.00678
Epoch: 43272, Training Loss: 0.00679
Epoch: 43272, Training Loss: 0.00832
Epoch: 43273, Training Loss: 0.00654
Epoch: 43273, Training Loss: 0.00678
Epoch: 43273, Training Loss: 0.00679
Epoch: 43273, Training Loss: 0.00832
Epoch: 43274, Training Loss: 0.00654
Epoch: 43274, Training Loss: 0.00678
Epoch: 43274, Training Loss: 0.00679
Epoch: 43274, Training Loss: 0.00832
Epoch: 43275, Training Loss: 0.00654
Epoch: 43275, Training Loss: 0.00678
Epoch: 43275, Training Loss: 0.00679
Epoch: 43275, Training Loss: 0.00832
Epoch: 43276, Training Loss: 0.00654
Epoch: 43276, Training Loss: 0.00678
Epoch: 43276, Training Loss: 0.00679
Epoch: 43276, Training Loss: 0.00832
Epoch: 43277, Training Loss: 0.00654
Epoch: 43277, Training Loss: 0.00678
Epoch: 43277, Training Loss: 0.00679
Epoch: 43277, Training Loss: 0.00832
Epoch: 43278, Training Loss: 0.00654
Epoch: 43278, Training Loss: 0.00678
Epoch: 43278, Training Loss: 0.00679
Epoch: 43278, Training Loss: 0.00832
Epoch: 43279, Training Loss: 0.00654
Epoch: 43279, Training Loss: 0.00678
Epoch: 43279, Training Loss: 0.00679
Epoch: 43279, Training Loss: 0.00832
Epoch: 43280, Training Loss: 0.00654
Epoch: 43280, Training Loss: 0.00678
Epoch: 43280, Training Loss: 0.00679
Epoch: 43280, Training Loss: 0.00832
Epoch: 43281, Training Loss: 0.00654
Epoch: 43281, Training Loss: 0.00678
Epoch: 43281, Training Loss: 0.00679
Epoch: 43281, Training Loss: 0.00832
Epoch: 43282, Training Loss: 0.00654
Epoch: 43282, Training Loss: 0.00678
Epoch: 43282, Training Loss: 0.00679
Epoch: 43282, Training Loss: 0.00832
Epoch: 43283, Training Loss: 0.00654
Epoch: 43283, Training Loss: 0.00678
Epoch: 43283, Training Loss: 0.00679
Epoch: 43283, Training Loss: 0.00832
Epoch: 43284, Training Loss: 0.00653
Epoch: 43284, Training Loss: 0.00678
Epoch: 43284, Training Loss: 0.00679
Epoch: 43284, Training Loss: 0.00832
Epoch: 43285, Training Loss: 0.00653
Epoch: 43285, Training Loss: 0.00678
Epoch: 43285, Training Loss: 0.00679
Epoch: 43285, Training Loss: 0.00832
Epoch: 43286, Training Loss: 0.00653
Epoch: 43286, Training Loss: 0.00678
Epoch: 43286, Training Loss: 0.00679
Epoch: 43286, Training Loss: 0.00832
Epoch: 43287, Training Loss: 0.00653
Epoch: 43287, Training Loss: 0.00678
Epoch: 43287, Training Loss: 0.00679
Epoch: 43287, Training Loss: 0.00832
Epoch: 43288, Training Loss: 0.00653
Epoch: 43288, Training Loss: 0.00678
Epoch: 43288, Training Loss: 0.00679
Epoch: 43288, Training Loss: 0.00832
Epoch: 43289, Training Loss: 0.00653
Epoch: 43289, Training Loss: 0.00678
Epoch: 43289, Training Loss: 0.00679
Epoch: 43289, Training Loss: 0.00832
Epoch: 43290, Training Loss: 0.00653
Epoch: 43290, Training Loss: 0.00678
Epoch: 43290, Training Loss: 0.00679
Epoch: 43290, Training Loss: 0.00832
Epoch: 43291, Training Loss: 0.00653
Epoch: 43291, Training Loss: 0.00678
Epoch: 43291, Training Loss: 0.00679
Epoch: 43291, Training Loss: 0.00832
Epoch: 43292, Training Loss: 0.00653
Epoch: 43292, Training Loss: 0.00678
Epoch: 43292, Training Loss: 0.00679
Epoch: 43292, Training Loss: 0.00832
Epoch: 43293, Training Loss: 0.00653
Epoch: 43293, Training Loss: 0.00678
Epoch: 43293, Training Loss: 0.00679
Epoch: 43293, Training Loss: 0.00832
Epoch: 43294, Training Loss: 0.00653
Epoch: 43294, Training Loss: 0.00678
Epoch: 43294, Training Loss: 0.00679
Epoch: 43294, Training Loss: 0.00832
Epoch: 43295, Training Loss: 0.00653
Epoch: 43295, Training Loss: 0.00678
Epoch: 43295, Training Loss: 0.00679
Epoch: 43295, Training Loss: 0.00832
Epoch: 43296, Training Loss: 0.00653
Epoch: 43296, Training Loss: 0.00678
Epoch: 43296, Training Loss: 0.00679
Epoch: 43296, Training Loss: 0.00832
Epoch: 43297, Training Loss: 0.00653
Epoch: 43297, Training Loss: 0.00678
Epoch: 43297, Training Loss: 0.00679
Epoch: 43297, Training Loss: 0.00832
Epoch: 43298, Training Loss: 0.00653
Epoch: 43298, Training Loss: 0.00678
Epoch: 43298, Training Loss: 0.00679
Epoch: 43298, Training Loss: 0.00832
... (output truncated: one loss line per XOR pattern per epoch; over epochs 43298-43541 the four per-pattern losses decrease only in the fifth decimal place, from roughly 0.00653/0.00678/0.00679/0.00832 to 0.00651/0.00676/0.00677/0.00830) ...
Epoch: 43541, Training Loss: 0.00651
Epoch: 43541, Training Loss: 0.00676
Epoch: 43541, Training Loss: 0.00677
Epoch: 43541, Training Loss: 0.00830
Epoch: 43542, Training Loss: 0.00651
Epoch: 43542, Training Loss: 0.00676
Epoch: 43542, Training Loss: 0.00677
Epoch: 43542, Training Loss: 0.00830
Epoch: 43543, Training Loss: 0.00651
Epoch: 43543, Training Loss: 0.00676
Epoch: 43543, Training Loss: 0.00677
Epoch: 43543, Training Loss: 0.00830
Epoch: 43544, Training Loss: 0.00651
Epoch: 43544, Training Loss: 0.00676
Epoch: 43544, Training Loss: 0.00677
Epoch: 43544, Training Loss: 0.00830
Epoch: 43545, Training Loss: 0.00651
Epoch: 43545, Training Loss: 0.00676
Epoch: 43545, Training Loss: 0.00677
Epoch: 43545, Training Loss: 0.00830
Epoch: 43546, Training Loss: 0.00651
Epoch: 43546, Training Loss: 0.00676
Epoch: 43546, Training Loss: 0.00677
Epoch: 43546, Training Loss: 0.00830
Epoch: 43547, Training Loss: 0.00651
Epoch: 43547, Training Loss: 0.00676
Epoch: 43547, Training Loss: 0.00677
Epoch: 43547, Training Loss: 0.00830
Epoch: 43548, Training Loss: 0.00651
Epoch: 43548, Training Loss: 0.00676
Epoch: 43548, Training Loss: 0.00677
Epoch: 43548, Training Loss: 0.00830
Epoch: 43549, Training Loss: 0.00651
Epoch: 43549, Training Loss: 0.00676
Epoch: 43549, Training Loss: 0.00677
Epoch: 43549, Training Loss: 0.00830
Epoch: 43550, Training Loss: 0.00651
Epoch: 43550, Training Loss: 0.00676
Epoch: 43550, Training Loss: 0.00677
Epoch: 43550, Training Loss: 0.00830
Epoch: 43551, Training Loss: 0.00651
Epoch: 43551, Training Loss: 0.00676
Epoch: 43551, Training Loss: 0.00677
Epoch: 43551, Training Loss: 0.00829
Epoch: 43552, Training Loss: 0.00651
Epoch: 43552, Training Loss: 0.00676
Epoch: 43552, Training Loss: 0.00677
Epoch: 43552, Training Loss: 0.00829
Epoch: 43553, Training Loss: 0.00651
Epoch: 43553, Training Loss: 0.00676
Epoch: 43553, Training Loss: 0.00677
Epoch: 43553, Training Loss: 0.00829
Epoch: 43554, Training Loss: 0.00651
Epoch: 43554, Training Loss: 0.00676
Epoch: 43554, Training Loss: 0.00677
Epoch: 43554, Training Loss: 0.00829
Epoch: 43555, Training Loss: 0.00651
Epoch: 43555, Training Loss: 0.00676
Epoch: 43555, Training Loss: 0.00677
Epoch: 43555, Training Loss: 0.00829
Epoch: 43556, Training Loss: 0.00651
Epoch: 43556, Training Loss: 0.00676
Epoch: 43556, Training Loss: 0.00677
Epoch: 43556, Training Loss: 0.00829
Epoch: 43557, Training Loss: 0.00651
Epoch: 43557, Training Loss: 0.00676
Epoch: 43557, Training Loss: 0.00677
Epoch: 43557, Training Loss: 0.00829
Epoch: 43558, Training Loss: 0.00651
Epoch: 43558, Training Loss: 0.00676
Epoch: 43558, Training Loss: 0.00677
Epoch: 43558, Training Loss: 0.00829
Epoch: 43559, Training Loss: 0.00651
Epoch: 43559, Training Loss: 0.00676
Epoch: 43559, Training Loss: 0.00677
Epoch: 43559, Training Loss: 0.00829
Epoch: 43560, Training Loss: 0.00651
Epoch: 43560, Training Loss: 0.00676
Epoch: 43560, Training Loss: 0.00677
Epoch: 43560, Training Loss: 0.00829
Epoch: 43561, Training Loss: 0.00651
Epoch: 43561, Training Loss: 0.00676
Epoch: 43561, Training Loss: 0.00677
Epoch: 43561, Training Loss: 0.00829
Epoch: 43562, Training Loss: 0.00651
Epoch: 43562, Training Loss: 0.00676
Epoch: 43562, Training Loss: 0.00677
Epoch: 43562, Training Loss: 0.00829
Epoch: 43563, Training Loss: 0.00651
Epoch: 43563, Training Loss: 0.00676
Epoch: 43563, Training Loss: 0.00677
Epoch: 43563, Training Loss: 0.00829
Epoch: 43564, Training Loss: 0.00651
Epoch: 43564, Training Loss: 0.00676
Epoch: 43564, Training Loss: 0.00677
Epoch: 43564, Training Loss: 0.00829
Epoch: 43565, Training Loss: 0.00651
Epoch: 43565, Training Loss: 0.00676
Epoch: 43565, Training Loss: 0.00677
Epoch: 43565, Training Loss: 0.00829
Epoch: 43566, Training Loss: 0.00651
Epoch: 43566, Training Loss: 0.00676
Epoch: 43566, Training Loss: 0.00677
Epoch: 43566, Training Loss: 0.00829
Epoch: 43567, Training Loss: 0.00651
Epoch: 43567, Training Loss: 0.00676
Epoch: 43567, Training Loss: 0.00677
Epoch: 43567, Training Loss: 0.00829
Epoch: 43568, Training Loss: 0.00651
Epoch: 43568, Training Loss: 0.00676
Epoch: 43568, Training Loss: 0.00677
Epoch: 43568, Training Loss: 0.00829
Epoch: 43569, Training Loss: 0.00651
Epoch: 43569, Training Loss: 0.00676
Epoch: 43569, Training Loss: 0.00677
Epoch: 43569, Training Loss: 0.00829
Epoch: 43570, Training Loss: 0.00651
Epoch: 43570, Training Loss: 0.00676
Epoch: 43570, Training Loss: 0.00677
Epoch: 43570, Training Loss: 0.00829
Epoch: 43571, Training Loss: 0.00651
Epoch: 43571, Training Loss: 0.00676
Epoch: 43571, Training Loss: 0.00677
Epoch: 43571, Training Loss: 0.00829
Epoch: 43572, Training Loss: 0.00651
Epoch: 43572, Training Loss: 0.00676
Epoch: 43572, Training Loss: 0.00677
Epoch: 43572, Training Loss: 0.00829
Epoch: 43573, Training Loss: 0.00651
Epoch: 43573, Training Loss: 0.00676
Epoch: 43573, Training Loss: 0.00676
Epoch: 43573, Training Loss: 0.00829
Epoch: 43574, Training Loss: 0.00651
Epoch: 43574, Training Loss: 0.00676
Epoch: 43574, Training Loss: 0.00676
Epoch: 43574, Training Loss: 0.00829
Epoch: 43575, Training Loss: 0.00651
Epoch: 43575, Training Loss: 0.00676
Epoch: 43575, Training Loss: 0.00676
Epoch: 43575, Training Loss: 0.00829
Epoch: 43576, Training Loss: 0.00651
Epoch: 43576, Training Loss: 0.00676
Epoch: 43576, Training Loss: 0.00676
Epoch: 43576, Training Loss: 0.00829
Epoch: 43577, Training Loss: 0.00651
Epoch: 43577, Training Loss: 0.00676
Epoch: 43577, Training Loss: 0.00676
Epoch: 43577, Training Loss: 0.00829
Epoch: 43578, Training Loss: 0.00651
Epoch: 43578, Training Loss: 0.00676
Epoch: 43578, Training Loss: 0.00676
Epoch: 43578, Training Loss: 0.00829
Epoch: 43579, Training Loss: 0.00651
Epoch: 43579, Training Loss: 0.00676
Epoch: 43579, Training Loss: 0.00676
Epoch: 43579, Training Loss: 0.00829
Epoch: 43580, Training Loss: 0.00651
Epoch: 43580, Training Loss: 0.00676
Epoch: 43580, Training Loss: 0.00676
Epoch: 43580, Training Loss: 0.00829
Epoch: 43581, Training Loss: 0.00651
Epoch: 43581, Training Loss: 0.00676
Epoch: 43581, Training Loss: 0.00676
Epoch: 43581, Training Loss: 0.00829
Epoch: 43582, Training Loss: 0.00651
Epoch: 43582, Training Loss: 0.00676
Epoch: 43582, Training Loss: 0.00676
Epoch: 43582, Training Loss: 0.00829
Epoch: 43583, Training Loss: 0.00651
Epoch: 43583, Training Loss: 0.00676
Epoch: 43583, Training Loss: 0.00676
Epoch: 43583, Training Loss: 0.00829
Epoch: 43584, Training Loss: 0.00651
Epoch: 43584, Training Loss: 0.00676
Epoch: 43584, Training Loss: 0.00676
Epoch: 43584, Training Loss: 0.00829
Epoch: 43585, Training Loss: 0.00651
Epoch: 43585, Training Loss: 0.00676
Epoch: 43585, Training Loss: 0.00676
Epoch: 43585, Training Loss: 0.00829
Epoch: 43586, Training Loss: 0.00651
Epoch: 43586, Training Loss: 0.00676
Epoch: 43586, Training Loss: 0.00676
Epoch: 43586, Training Loss: 0.00829
Epoch: 43587, Training Loss: 0.00651
Epoch: 43587, Training Loss: 0.00676
Epoch: 43587, Training Loss: 0.00676
Epoch: 43587, Training Loss: 0.00829
Epoch: 43588, Training Loss: 0.00651
Epoch: 43588, Training Loss: 0.00676
Epoch: 43588, Training Loss: 0.00676
Epoch: 43588, Training Loss: 0.00829
Epoch: 43589, Training Loss: 0.00651
Epoch: 43589, Training Loss: 0.00676
Epoch: 43589, Training Loss: 0.00676
Epoch: 43589, Training Loss: 0.00829
Epoch: 43590, Training Loss: 0.00651
Epoch: 43590, Training Loss: 0.00676
Epoch: 43590, Training Loss: 0.00676
Epoch: 43590, Training Loss: 0.00829
Epoch: 43591, Training Loss: 0.00651
Epoch: 43591, Training Loss: 0.00676
Epoch: 43591, Training Loss: 0.00676
Epoch: 43591, Training Loss: 0.00829
Epoch: 43592, Training Loss: 0.00651
Epoch: 43592, Training Loss: 0.00676
Epoch: 43592, Training Loss: 0.00676
Epoch: 43592, Training Loss: 0.00829
Epoch: 43593, Training Loss: 0.00651
Epoch: 43593, Training Loss: 0.00676
Epoch: 43593, Training Loss: 0.00676
Epoch: 43593, Training Loss: 0.00829
Epoch: 43594, Training Loss: 0.00651
Epoch: 43594, Training Loss: 0.00676
Epoch: 43594, Training Loss: 0.00676
Epoch: 43594, Training Loss: 0.00829
Epoch: 43595, Training Loss: 0.00651
Epoch: 43595, Training Loss: 0.00676
Epoch: 43595, Training Loss: 0.00676
Epoch: 43595, Training Loss: 0.00829
Epoch: 43596, Training Loss: 0.00651
Epoch: 43596, Training Loss: 0.00676
Epoch: 43596, Training Loss: 0.00676
Epoch: 43596, Training Loss: 0.00829
Epoch: 43597, Training Loss: 0.00651
Epoch: 43597, Training Loss: 0.00676
Epoch: 43597, Training Loss: 0.00676
Epoch: 43597, Training Loss: 0.00829
Epoch: 43598, Training Loss: 0.00651
Epoch: 43598, Training Loss: 0.00676
Epoch: 43598, Training Loss: 0.00676
Epoch: 43598, Training Loss: 0.00829
Epoch: 43599, Training Loss: 0.00651
Epoch: 43599, Training Loss: 0.00676
Epoch: 43599, Training Loss: 0.00676
Epoch: 43599, Training Loss: 0.00829
Epoch: 43600, Training Loss: 0.00651
Epoch: 43600, Training Loss: 0.00676
Epoch: 43600, Training Loss: 0.00676
Epoch: 43600, Training Loss: 0.00829
Epoch: 43601, Training Loss: 0.00651
Epoch: 43601, Training Loss: 0.00676
Epoch: 43601, Training Loss: 0.00676
Epoch: 43601, Training Loss: 0.00829
Epoch: 43602, Training Loss: 0.00651
Epoch: 43602, Training Loss: 0.00676
Epoch: 43602, Training Loss: 0.00676
Epoch: 43602, Training Loss: 0.00829
Epoch: 43603, Training Loss: 0.00651
Epoch: 43603, Training Loss: 0.00675
Epoch: 43603, Training Loss: 0.00676
Epoch: 43603, Training Loss: 0.00829
Epoch: 43604, Training Loss: 0.00651
Epoch: 43604, Training Loss: 0.00675
Epoch: 43604, Training Loss: 0.00676
Epoch: 43604, Training Loss: 0.00829
Epoch: 43605, Training Loss: 0.00651
Epoch: 43605, Training Loss: 0.00675
Epoch: 43605, Training Loss: 0.00676
Epoch: 43605, Training Loss: 0.00829
Epoch: 43606, Training Loss: 0.00651
Epoch: 43606, Training Loss: 0.00675
Epoch: 43606, Training Loss: 0.00676
Epoch: 43606, Training Loss: 0.00829
Epoch: 43607, Training Loss: 0.00651
Epoch: 43607, Training Loss: 0.00675
Epoch: 43607, Training Loss: 0.00676
Epoch: 43607, Training Loss: 0.00829
Epoch: 43608, Training Loss: 0.00651
Epoch: 43608, Training Loss: 0.00675
Epoch: 43608, Training Loss: 0.00676
Epoch: 43608, Training Loss: 0.00829
Epoch: 43609, Training Loss: 0.00651
Epoch: 43609, Training Loss: 0.00675
Epoch: 43609, Training Loss: 0.00676
Epoch: 43609, Training Loss: 0.00829
Epoch: 43610, Training Loss: 0.00651
Epoch: 43610, Training Loss: 0.00675
Epoch: 43610, Training Loss: 0.00676
Epoch: 43610, Training Loss: 0.00829
Epoch: 43611, Training Loss: 0.00651
Epoch: 43611, Training Loss: 0.00675
Epoch: 43611, Training Loss: 0.00676
Epoch: 43611, Training Loss: 0.00829
Epoch: 43612, Training Loss: 0.00651
Epoch: 43612, Training Loss: 0.00675
Epoch: 43612, Training Loss: 0.00676
Epoch: 43612, Training Loss: 0.00829
Epoch: 43613, Training Loss: 0.00651
Epoch: 43613, Training Loss: 0.00675
Epoch: 43613, Training Loss: 0.00676
Epoch: 43613, Training Loss: 0.00829
Epoch: 43614, Training Loss: 0.00651
Epoch: 43614, Training Loss: 0.00675
Epoch: 43614, Training Loss: 0.00676
Epoch: 43614, Training Loss: 0.00829
Epoch: 43615, Training Loss: 0.00651
Epoch: 43615, Training Loss: 0.00675
Epoch: 43615, Training Loss: 0.00676
Epoch: 43615, Training Loss: 0.00829
Epoch: 43616, Training Loss: 0.00651
Epoch: 43616, Training Loss: 0.00675
Epoch: 43616, Training Loss: 0.00676
Epoch: 43616, Training Loss: 0.00829
Epoch: 43617, Training Loss: 0.00651
Epoch: 43617, Training Loss: 0.00675
Epoch: 43617, Training Loss: 0.00676
Epoch: 43617, Training Loss: 0.00829
Epoch: 43618, Training Loss: 0.00651
Epoch: 43618, Training Loss: 0.00675
Epoch: 43618, Training Loss: 0.00676
Epoch: 43618, Training Loss: 0.00829
Epoch: 43619, Training Loss: 0.00651
Epoch: 43619, Training Loss: 0.00675
Epoch: 43619, Training Loss: 0.00676
Epoch: 43619, Training Loss: 0.00829
Epoch: 43620, Training Loss: 0.00651
Epoch: 43620, Training Loss: 0.00675
Epoch: 43620, Training Loss: 0.00676
Epoch: 43620, Training Loss: 0.00829
Epoch: 43621, Training Loss: 0.00651
Epoch: 43621, Training Loss: 0.00675
Epoch: 43621, Training Loss: 0.00676
Epoch: 43621, Training Loss: 0.00829
Epoch: 43622, Training Loss: 0.00651
Epoch: 43622, Training Loss: 0.00675
Epoch: 43622, Training Loss: 0.00676
Epoch: 43622, Training Loss: 0.00829
Epoch: 43623, Training Loss: 0.00651
Epoch: 43623, Training Loss: 0.00675
Epoch: 43623, Training Loss: 0.00676
Epoch: 43623, Training Loss: 0.00829
Epoch: 43624, Training Loss: 0.00651
Epoch: 43624, Training Loss: 0.00675
Epoch: 43624, Training Loss: 0.00676
Epoch: 43624, Training Loss: 0.00829
Epoch: 43625, Training Loss: 0.00651
Epoch: 43625, Training Loss: 0.00675
Epoch: 43625, Training Loss: 0.00676
Epoch: 43625, Training Loss: 0.00829
Epoch: 43626, Training Loss: 0.00651
Epoch: 43626, Training Loss: 0.00675
Epoch: 43626, Training Loss: 0.00676
Epoch: 43626, Training Loss: 0.00829
Epoch: 43627, Training Loss: 0.00651
Epoch: 43627, Training Loss: 0.00675
Epoch: 43627, Training Loss: 0.00676
Epoch: 43627, Training Loss: 0.00829
Epoch: 43628, Training Loss: 0.00651
Epoch: 43628, Training Loss: 0.00675
Epoch: 43628, Training Loss: 0.00676
Epoch: 43628, Training Loss: 0.00829
Epoch: 43629, Training Loss: 0.00651
Epoch: 43629, Training Loss: 0.00675
Epoch: 43629, Training Loss: 0.00676
Epoch: 43629, Training Loss: 0.00829
Epoch: 43630, Training Loss: 0.00651
Epoch: 43630, Training Loss: 0.00675
Epoch: 43630, Training Loss: 0.00676
Epoch: 43630, Training Loss: 0.00829
Epoch: 43631, Training Loss: 0.00651
Epoch: 43631, Training Loss: 0.00675
Epoch: 43631, Training Loss: 0.00676
Epoch: 43631, Training Loss: 0.00829
Epoch: 43632, Training Loss: 0.00651
Epoch: 43632, Training Loss: 0.00675
Epoch: 43632, Training Loss: 0.00676
Epoch: 43632, Training Loss: 0.00829
Epoch: 43633, Training Loss: 0.00651
Epoch: 43633, Training Loss: 0.00675
Epoch: 43633, Training Loss: 0.00676
Epoch: 43633, Training Loss: 0.00829
Epoch: 43634, Training Loss: 0.00651
Epoch: 43634, Training Loss: 0.00675
Epoch: 43634, Training Loss: 0.00676
Epoch: 43634, Training Loss: 0.00829
Epoch: 43635, Training Loss: 0.00651
Epoch: 43635, Training Loss: 0.00675
Epoch: 43635, Training Loss: 0.00676
Epoch: 43635, Training Loss: 0.00829
Epoch: 43636, Training Loss: 0.00651
Epoch: 43636, Training Loss: 0.00675
Epoch: 43636, Training Loss: 0.00676
Epoch: 43636, Training Loss: 0.00829
Epoch: 43637, Training Loss: 0.00651
Epoch: 43637, Training Loss: 0.00675
Epoch: 43637, Training Loss: 0.00676
Epoch: 43637, Training Loss: 0.00829
Epoch: 43638, Training Loss: 0.00651
Epoch: 43638, Training Loss: 0.00675
Epoch: 43638, Training Loss: 0.00676
Epoch: 43638, Training Loss: 0.00829
Epoch: 43639, Training Loss: 0.00651
Epoch: 43639, Training Loss: 0.00675
Epoch: 43639, Training Loss: 0.00676
Epoch: 43639, Training Loss: 0.00829
Epoch: 43640, Training Loss: 0.00651
Epoch: 43640, Training Loss: 0.00675
Epoch: 43640, Training Loss: 0.00676
Epoch: 43640, Training Loss: 0.00829
Epoch: 43641, Training Loss: 0.00651
Epoch: 43641, Training Loss: 0.00675
Epoch: 43641, Training Loss: 0.00676
Epoch: 43641, Training Loss: 0.00829
Epoch: 43642, Training Loss: 0.00651
Epoch: 43642, Training Loss: 0.00675
Epoch: 43642, Training Loss: 0.00676
Epoch: 43642, Training Loss: 0.00829
Epoch: 43643, Training Loss: 0.00651
Epoch: 43643, Training Loss: 0.00675
Epoch: 43643, Training Loss: 0.00676
Epoch: 43643, Training Loss: 0.00829
Epoch: 43644, Training Loss: 0.00651
Epoch: 43644, Training Loss: 0.00675
Epoch: 43644, Training Loss: 0.00676
Epoch: 43644, Training Loss: 0.00829
Epoch: 43645, Training Loss: 0.00651
Epoch: 43645, Training Loss: 0.00675
Epoch: 43645, Training Loss: 0.00676
Epoch: 43645, Training Loss: 0.00829
Epoch: 43646, Training Loss: 0.00651
Epoch: 43646, Training Loss: 0.00675
Epoch: 43646, Training Loss: 0.00676
Epoch: 43646, Training Loss: 0.00829
Epoch: 43647, Training Loss: 0.00651
Epoch: 43647, Training Loss: 0.00675
Epoch: 43647, Training Loss: 0.00676
Epoch: 43647, Training Loss: 0.00828
Epoch: 43648, Training Loss: 0.00651
Epoch: 43648, Training Loss: 0.00675
Epoch: 43648, Training Loss: 0.00676
Epoch: 43648, Training Loss: 0.00828
Epoch: 43649, Training Loss: 0.00651
Epoch: 43649, Training Loss: 0.00675
Epoch: 43649, Training Loss: 0.00676
Epoch: 43649, Training Loss: 0.00828
Epoch: 43650, Training Loss: 0.00651
Epoch: 43650, Training Loss: 0.00675
Epoch: 43650, Training Loss: 0.00676
Epoch: 43650, Training Loss: 0.00828
Epoch: 43651, Training Loss: 0.00651
Epoch: 43651, Training Loss: 0.00675
Epoch: 43651, Training Loss: 0.00676
Epoch: 43651, Training Loss: 0.00828
Epoch: 43652, Training Loss: 0.00651
Epoch: 43652, Training Loss: 0.00675
Epoch: 43652, Training Loss: 0.00676
Epoch: 43652, Training Loss: 0.00828
Epoch: 43653, Training Loss: 0.00651
Epoch: 43653, Training Loss: 0.00675
Epoch: 43653, Training Loss: 0.00676
Epoch: 43653, Training Loss: 0.00828
Epoch: 43654, Training Loss: 0.00651
Epoch: 43654, Training Loss: 0.00675
Epoch: 43654, Training Loss: 0.00676
Epoch: 43654, Training Loss: 0.00828
Epoch: 43655, Training Loss: 0.00651
Epoch: 43655, Training Loss: 0.00675
Epoch: 43655, Training Loss: 0.00676
Epoch: 43655, Training Loss: 0.00828
Epoch: 43656, Training Loss: 0.00651
Epoch: 43656, Training Loss: 0.00675
Epoch: 43656, Training Loss: 0.00676
Epoch: 43656, Training Loss: 0.00828
Epoch: 43657, Training Loss: 0.00650
Epoch: 43657, Training Loss: 0.00675
Epoch: 43657, Training Loss: 0.00676
Epoch: 43657, Training Loss: 0.00828
Epoch: 43658, Training Loss: 0.00650
Epoch: 43658, Training Loss: 0.00675
Epoch: 43658, Training Loss: 0.00676
Epoch: 43658, Training Loss: 0.00828
Epoch: 43659, Training Loss: 0.00650
Epoch: 43659, Training Loss: 0.00675
Epoch: 43659, Training Loss: 0.00676
Epoch: 43659, Training Loss: 0.00828
Epoch: 43660, Training Loss: 0.00650
Epoch: 43660, Training Loss: 0.00675
Epoch: 43660, Training Loss: 0.00676
Epoch: 43660, Training Loss: 0.00828
Epoch: 43661, Training Loss: 0.00650
Epoch: 43661, Training Loss: 0.00675
Epoch: 43661, Training Loss: 0.00676
Epoch: 43661, Training Loss: 0.00828
Epoch: 43662, Training Loss: 0.00650
Epoch: 43662, Training Loss: 0.00675
Epoch: 43662, Training Loss: 0.00676
Epoch: 43662, Training Loss: 0.00828
Epoch: 43663, Training Loss: 0.00650
Epoch: 43663, Training Loss: 0.00675
Epoch: 43663, Training Loss: 0.00676
Epoch: 43663, Training Loss: 0.00828
Epoch: 43664, Training Loss: 0.00650
Epoch: 43664, Training Loss: 0.00675
Epoch: 43664, Training Loss: 0.00676
Epoch: 43664, Training Loss: 0.00828
Epoch: 43665, Training Loss: 0.00650
Epoch: 43665, Training Loss: 0.00675
Epoch: 43665, Training Loss: 0.00676
Epoch: 43665, Training Loss: 0.00828
Epoch: 43666, Training Loss: 0.00650
Epoch: 43666, Training Loss: 0.00675
Epoch: 43666, Training Loss: 0.00676
Epoch: 43666, Training Loss: 0.00828
Epoch: 43667, Training Loss: 0.00650
Epoch: 43667, Training Loss: 0.00675
Epoch: 43667, Training Loss: 0.00676
Epoch: 43667, Training Loss: 0.00828
Epoch: 43668, Training Loss: 0.00650
Epoch: 43668, Training Loss: 0.00675
Epoch: 43668, Training Loss: 0.00676
Epoch: 43668, Training Loss: 0.00828
Epoch: 43669, Training Loss: 0.00650
Epoch: 43669, Training Loss: 0.00675
Epoch: 43669, Training Loss: 0.00676
Epoch: 43669, Training Loss: 0.00828
Epoch: 43670, Training Loss: 0.00650
Epoch: 43670, Training Loss: 0.00675
Epoch: 43670, Training Loss: 0.00676
Epoch: 43670, Training Loss: 0.00828
Epoch: 43671, Training Loss: 0.00650
Epoch: 43671, Training Loss: 0.00675
Epoch: 43671, Training Loss: 0.00676
Epoch: 43671, Training Loss: 0.00828
Epoch: 43672, Training Loss: 0.00650
Epoch: 43672, Training Loss: 0.00675
Epoch: 43672, Training Loss: 0.00676
Epoch: 43672, Training Loss: 0.00828
Epoch: 43673, Training Loss: 0.00650
Epoch: 43673, Training Loss: 0.00675
Epoch: 43673, Training Loss: 0.00676
Epoch: 43673, Training Loss: 0.00828
Epoch: 43674, Training Loss: 0.00650
Epoch: 43674, Training Loss: 0.00675
Epoch: 43674, Training Loss: 0.00676
Epoch: 43674, Training Loss: 0.00828
Epoch: 43675, Training Loss: 0.00650
Epoch: 43675, Training Loss: 0.00675
Epoch: 43675, Training Loss: 0.00676
Epoch: 43675, Training Loss: 0.00828
Epoch: 43676, Training Loss: 0.00650
Epoch: 43676, Training Loss: 0.00675
Epoch: 43676, Training Loss: 0.00676
Epoch: 43676, Training Loss: 0.00828
Epoch: 43677, Training Loss: 0.00650
Epoch: 43677, Training Loss: 0.00675
Epoch: 43677, Training Loss: 0.00676
Epoch: 43677, Training Loss: 0.00828
Epoch: 43678, Training Loss: 0.00650
Epoch: 43678, Training Loss: 0.00675
Epoch: 43678, Training Loss: 0.00676
Epoch: 43678, Training Loss: 0.00828
Epoch: 43679, Training Loss: 0.00650
Epoch: 43679, Training Loss: 0.00675
Epoch: 43679, Training Loss: 0.00676
Epoch: 43679, Training Loss: 0.00828
Epoch: 43680, Training Loss: 0.00650
Epoch: 43680, Training Loss: 0.00675
Epoch: 43680, Training Loss: 0.00676
Epoch: 43680, Training Loss: 0.00828
Epoch: 43681, Training Loss: 0.00650
Epoch: 43681, Training Loss: 0.00675
Epoch: 43681, Training Loss: 0.00676
Epoch: 43681, Training Loss: 0.00828
Epoch: 43682, Training Loss: 0.00650
Epoch: 43682, Training Loss: 0.00675
Epoch: 43682, Training Loss: 0.00676
Epoch: 43682, Training Loss: 0.00828
Epoch: 43683, Training Loss: 0.00650
Epoch: 43683, Training Loss: 0.00675
Epoch: 43683, Training Loss: 0.00676
Epoch: 43683, Training Loss: 0.00828
Epoch: 43684, Training Loss: 0.00650
Epoch: 43684, Training Loss: 0.00675
Epoch: 43684, Training Loss: 0.00676
Epoch: 43684, Training Loss: 0.00828
Epoch: 43685, Training Loss: 0.00650
Epoch: 43685, Training Loss: 0.00675
Epoch: 43685, Training Loss: 0.00676
Epoch: 43685, Training Loss: 0.00828
Epoch: 43686, Training Loss: 0.00650
Epoch: 43686, Training Loss: 0.00675
Epoch: 43686, Training Loss: 0.00676
Epoch: 43686, Training Loss: 0.00828
Epoch: 43687, Training Loss: 0.00650
Epoch: 43687, Training Loss: 0.00675
Epoch: 43687, Training Loss: 0.00676
Epoch: 43687, Training Loss: 0.00828
Epoch: 43688, Training Loss: 0.00650
Epoch: 43688, Training Loss: 0.00675
Epoch: 43688, Training Loss: 0.00676
Epoch: 43688, Training Loss: 0.00828
Epoch: 43689, Training Loss: 0.00650
Epoch: 43689, Training Loss: 0.00675
Epoch: 43689, Training Loss: 0.00676
Epoch: 43689, Training Loss: 0.00828
Epoch: 43690, Training Loss: 0.00650
Epoch: 43690, Training Loss: 0.00675
Epoch: 43690, Training Loss: 0.00676
Epoch: 43690, Training Loss: 0.00828
Epoch: 43691, Training Loss: 0.00650
Epoch: 43691, Training Loss: 0.00675
Epoch: 43691, Training Loss: 0.00675
Epoch: 43691, Training Loss: 0.00828
Epoch: 43692, Training Loss: 0.00650
Epoch: 43692, Training Loss: 0.00675
Epoch: 43692, Training Loss: 0.00675
Epoch: 43692, Training Loss: 0.00828
Epoch: 43693, Training Loss: 0.00650
Epoch: 43693, Training Loss: 0.00675
Epoch: 43693, Training Loss: 0.00675
Epoch: 43693, Training Loss: 0.00828
Epoch: 43694, Training Loss: 0.00650
Epoch: 43694, Training Loss: 0.00675
Epoch: 43694, Training Loss: 0.00675
Epoch: 43694, Training Loss: 0.00828
Epoch: 43695, Training Loss: 0.00650
Epoch: 43695, Training Loss: 0.00675
Epoch: 43695, Training Loss: 0.00675
Epoch: 43695, Training Loss: 0.00828
Epoch: 43696, Training Loss: 0.00650
Epoch: 43696, Training Loss: 0.00675
Epoch: 43696, Training Loss: 0.00675
Epoch: 43696, Training Loss: 0.00828
Epoch: 43697, Training Loss: 0.00650
Epoch: 43697, Training Loss: 0.00675
Epoch: 43697, Training Loss: 0.00675
Epoch: 43697, Training Loss: 0.00828
Epoch: 43698, Training Loss: 0.00650
Epoch: 43698, Training Loss: 0.00675
Epoch: 43698, Training Loss: 0.00675
Epoch: 43698, Training Loss: 0.00828
Epoch: 43699, Training Loss: 0.00650
Epoch: 43699, Training Loss: 0.00675
Epoch: 43699, Training Loss: 0.00675
Epoch: 43699, Training Loss: 0.00828
Epoch: 43700, Training Loss: 0.00650
Epoch: 43700, Training Loss: 0.00675
Epoch: 43700, Training Loss: 0.00675
Epoch: 43700, Training Loss: 0.00828
Epoch: 43701, Training Loss: 0.00650
Epoch: 43701, Training Loss: 0.00675
Epoch: 43701, Training Loss: 0.00675
Epoch: 43701, Training Loss: 0.00828
Epoch: 43702, Training Loss: 0.00650
Epoch: 43702, Training Loss: 0.00675
Epoch: 43702, Training Loss: 0.00675
Epoch: 43702, Training Loss: 0.00828
Epoch: 43703, Training Loss: 0.00650
Epoch: 43703, Training Loss: 0.00675
Epoch: 43703, Training Loss: 0.00675
Epoch: 43703, Training Loss: 0.00828
Epoch: 43704, Training Loss: 0.00650
Epoch: 43704, Training Loss: 0.00675
Epoch: 43704, Training Loss: 0.00675
Epoch: 43704, Training Loss: 0.00828
Epoch: 43705, Training Loss: 0.00650
Epoch: 43705, Training Loss: 0.00675
Epoch: 43705, Training Loss: 0.00675
Epoch: 43705, Training Loss: 0.00828
Epoch: 43706, Training Loss: 0.00650
Epoch: 43706, Training Loss: 0.00675
Epoch: 43706, Training Loss: 0.00675
Epoch: 43706, Training Loss: 0.00828
Epoch: 43707, Training Loss: 0.00650
Epoch: 43707, Training Loss: 0.00675
Epoch: 43707, Training Loss: 0.00675
Epoch: 43707, Training Loss: 0.00828
Epoch: 43708, Training Loss: 0.00650
Epoch: 43708, Training Loss: 0.00675
Epoch: 43708, Training Loss: 0.00675
Epoch: 43708, Training Loss: 0.00828
Epoch: 43709, Training Loss: 0.00650
Epoch: 43709, Training Loss: 0.00675
Epoch: 43709, Training Loss: 0.00675
Epoch: 43709, Training Loss: 0.00828
Epoch: 43710, Training Loss: 0.00650
Epoch: 43710, Training Loss: 0.00675
Epoch: 43710, Training Loss: 0.00675
Epoch: 43710, Training Loss: 0.00828
Epoch: 43711, Training Loss: 0.00650
Epoch: 43711, Training Loss: 0.00675
Epoch: 43711, Training Loss: 0.00675
Epoch: 43711, Training Loss: 0.00828
Epoch: 43712, Training Loss: 0.00650
Epoch: 43712, Training Loss: 0.00675
Epoch: 43712, Training Loss: 0.00675
Epoch: 43712, Training Loss: 0.00828
Epoch: 43713, Training Loss: 0.00650
Epoch: 43713, Training Loss: 0.00675
Epoch: 43713, Training Loss: 0.00675
Epoch: 43713, Training Loss: 0.00828
Epoch: 43714, Training Loss: 0.00650
Epoch: 43714, Training Loss: 0.00675
Epoch: 43714, Training Loss: 0.00675
Epoch: 43714, Training Loss: 0.00828
Epoch: 43715, Training Loss: 0.00650
Epoch: 43715, Training Loss: 0.00675
Epoch: 43715, Training Loss: 0.00675
Epoch: 43715, Training Loss: 0.00828
Epoch: 43716, Training Loss: 0.00650
Epoch: 43716, Training Loss: 0.00675
Epoch: 43716, Training Loss: 0.00675
Epoch: 43716, Training Loss: 0.00828
Epoch: 43717, Training Loss: 0.00650
Epoch: 43717, Training Loss: 0.00675
Epoch: 43717, Training Loss: 0.00675
Epoch: 43717, Training Loss: 0.00828
Epoch: 43718, Training Loss: 0.00650
Epoch: 43718, Training Loss: 0.00675
Epoch: 43718, Training Loss: 0.00675
Epoch: 43718, Training Loss: 0.00828
Epoch: 43719, Training Loss: 0.00650
Epoch: 43719, Training Loss: 0.00675
Epoch: 43719, Training Loss: 0.00675
Epoch: 43719, Training Loss: 0.00828
Epoch: 43720, Training Loss: 0.00650
Epoch: 43720, Training Loss: 0.00675
Epoch: 43720, Training Loss: 0.00675
Epoch: 43720, Training Loss: 0.00828
Epoch: 43721, Training Loss: 0.00650
Epoch: 43721, Training Loss: 0.00675
Epoch: 43721, Training Loss: 0.00675
Epoch: 43721, Training Loss: 0.00828
Epoch: 43722, Training Loss: 0.00650
Epoch: 43722, Training Loss: 0.00674
Epoch: 43722, Training Loss: 0.00675
Epoch: 43722, Training Loss: 0.00828
Epoch: 43723, Training Loss: 0.00650
Epoch: 43723, Training Loss: 0.00674
Epoch: 43723, Training Loss: 0.00675
Epoch: 43723, Training Loss: 0.00828
Epoch: 43724, Training Loss: 0.00650
Epoch: 43724, Training Loss: 0.00674
Epoch: 43724, Training Loss: 0.00675
Epoch: 43724, Training Loss: 0.00828
Epoch: 43725, Training Loss: 0.00650
Epoch: 43725, Training Loss: 0.00674
Epoch: 43725, Training Loss: 0.00675
Epoch: 43725, Training Loss: 0.00828
Epoch: 43726, Training Loss: 0.00650
Epoch: 43726, Training Loss: 0.00674
Epoch: 43726, Training Loss: 0.00675
Epoch: 43726, Training Loss: 0.00828
Epoch: 43727, Training Loss: 0.00650
Epoch: 43727, Training Loss: 0.00674
Epoch: 43727, Training Loss: 0.00675
Epoch: 43727, Training Loss: 0.00828
Epoch: 43728, Training Loss: 0.00650
Epoch: 43728, Training Loss: 0.00674
Epoch: 43728, Training Loss: 0.00675
Epoch: 43728, Training Loss: 0.00828
Epoch: 43729, Training Loss: 0.00650
Epoch: 43729, Training Loss: 0.00674
Epoch: 43729, Training Loss: 0.00675
Epoch: 43729, Training Loss: 0.00828
Epoch: 43730, Training Loss: 0.00650
Epoch: 43730, Training Loss: 0.00674
Epoch: 43730, Training Loss: 0.00675
Epoch: 43730, Training Loss: 0.00828
...
Epoch: 43974, Training Loss: 0.00648
[Output truncated: the training loop prints one loss line per sample, i.e. four lines per epoch. Over epochs 43730-43974 the four per-sample losses decrease only gradually, from (0.00650, 0.00674, 0.00675, 0.00828) to (0.00648, 0.00673, 0.00673, 0.00825), still converging toward the 0.008 error target.]
Epoch: 43974, Training Loss: 0.00672
Epoch: 43974, Training Loss: 0.00673
Epoch: 43974, Training Loss: 0.00825
Epoch: 43975, Training Loss: 0.00648
Epoch: 43975, Training Loss: 0.00672
Epoch: 43975, Training Loss: 0.00673
Epoch: 43975, Training Loss: 0.00825
Epoch: 43976, Training Loss: 0.00648
Epoch: 43976, Training Loss: 0.00672
Epoch: 43976, Training Loss: 0.00673
Epoch: 43976, Training Loss: 0.00825
Epoch: 43977, Training Loss: 0.00648
Epoch: 43977, Training Loss: 0.00672
Epoch: 43977, Training Loss: 0.00673
Epoch: 43977, Training Loss: 0.00825
Epoch: 43978, Training Loss: 0.00648
Epoch: 43978, Training Loss: 0.00672
Epoch: 43978, Training Loss: 0.00673
Epoch: 43978, Training Loss: 0.00825
Epoch: 43979, Training Loss: 0.00648
Epoch: 43979, Training Loss: 0.00672
Epoch: 43979, Training Loss: 0.00673
Epoch: 43979, Training Loss: 0.00825
Epoch: 43980, Training Loss: 0.00648
Epoch: 43980, Training Loss: 0.00672
Epoch: 43980, Training Loss: 0.00673
Epoch: 43980, Training Loss: 0.00825
Epoch: 43981, Training Loss: 0.00648
Epoch: 43981, Training Loss: 0.00672
Epoch: 43981, Training Loss: 0.00673
Epoch: 43981, Training Loss: 0.00825
Epoch: 43982, Training Loss: 0.00648
Epoch: 43982, Training Loss: 0.00672
Epoch: 43982, Training Loss: 0.00673
Epoch: 43982, Training Loss: 0.00825
Epoch: 43983, Training Loss: 0.00648
Epoch: 43983, Training Loss: 0.00672
Epoch: 43983, Training Loss: 0.00673
Epoch: 43983, Training Loss: 0.00825
Epoch: 43984, Training Loss: 0.00648
Epoch: 43984, Training Loss: 0.00672
Epoch: 43984, Training Loss: 0.00673
Epoch: 43984, Training Loss: 0.00825
Epoch: 43985, Training Loss: 0.00648
Epoch: 43985, Training Loss: 0.00672
Epoch: 43985, Training Loss: 0.00673
Epoch: 43985, Training Loss: 0.00825
Epoch: 43986, Training Loss: 0.00648
Epoch: 43986, Training Loss: 0.00672
Epoch: 43986, Training Loss: 0.00673
Epoch: 43986, Training Loss: 0.00825
Epoch: 43987, Training Loss: 0.00648
Epoch: 43987, Training Loss: 0.00672
Epoch: 43987, Training Loss: 0.00673
Epoch: 43987, Training Loss: 0.00825
Epoch: 43988, Training Loss: 0.00648
Epoch: 43988, Training Loss: 0.00672
Epoch: 43988, Training Loss: 0.00673
Epoch: 43988, Training Loss: 0.00825
Epoch: 43989, Training Loss: 0.00648
Epoch: 43989, Training Loss: 0.00672
Epoch: 43989, Training Loss: 0.00673
Epoch: 43989, Training Loss: 0.00825
Epoch: 43990, Training Loss: 0.00648
Epoch: 43990, Training Loss: 0.00672
Epoch: 43990, Training Loss: 0.00673
Epoch: 43990, Training Loss: 0.00825
Epoch: 43991, Training Loss: 0.00648
Epoch: 43991, Training Loss: 0.00672
Epoch: 43991, Training Loss: 0.00673
Epoch: 43991, Training Loss: 0.00825
Epoch: 43992, Training Loss: 0.00648
Epoch: 43992, Training Loss: 0.00672
Epoch: 43992, Training Loss: 0.00673
Epoch: 43992, Training Loss: 0.00825
Epoch: 43993, Training Loss: 0.00648
Epoch: 43993, Training Loss: 0.00672
Epoch: 43993, Training Loss: 0.00673
Epoch: 43993, Training Loss: 0.00825
Epoch: 43994, Training Loss: 0.00648
Epoch: 43994, Training Loss: 0.00672
Epoch: 43994, Training Loss: 0.00673
Epoch: 43994, Training Loss: 0.00825
Epoch: 43995, Training Loss: 0.00648
Epoch: 43995, Training Loss: 0.00672
Epoch: 43995, Training Loss: 0.00673
Epoch: 43995, Training Loss: 0.00825
Epoch: 43996, Training Loss: 0.00648
Epoch: 43996, Training Loss: 0.00672
Epoch: 43996, Training Loss: 0.00673
Epoch: 43996, Training Loss: 0.00825
Epoch: 43997, Training Loss: 0.00648
Epoch: 43997, Training Loss: 0.00672
Epoch: 43997, Training Loss: 0.00673
Epoch: 43997, Training Loss: 0.00825
Epoch: 43998, Training Loss: 0.00648
Epoch: 43998, Training Loss: 0.00672
Epoch: 43998, Training Loss: 0.00673
Epoch: 43998, Training Loss: 0.00825
Epoch: 43999, Training Loss: 0.00648
Epoch: 43999, Training Loss: 0.00672
Epoch: 43999, Training Loss: 0.00673
Epoch: 43999, Training Loss: 0.00825
Epoch: 44000, Training Loss: 0.00648
Epoch: 44000, Training Loss: 0.00672
Epoch: 44000, Training Loss: 0.00673
Epoch: 44000, Training Loss: 0.00825
Epoch: 44001, Training Loss: 0.00648
Epoch: 44001, Training Loss: 0.00672
Epoch: 44001, Training Loss: 0.00673
Epoch: 44001, Training Loss: 0.00825
Epoch: 44002, Training Loss: 0.00648
Epoch: 44002, Training Loss: 0.00672
Epoch: 44002, Training Loss: 0.00673
Epoch: 44002, Training Loss: 0.00825
Epoch: 44003, Training Loss: 0.00648
Epoch: 44003, Training Loss: 0.00672
Epoch: 44003, Training Loss: 0.00673
Epoch: 44003, Training Loss: 0.00825
Epoch: 44004, Training Loss: 0.00648
Epoch: 44004, Training Loss: 0.00672
Epoch: 44004, Training Loss: 0.00673
Epoch: 44004, Training Loss: 0.00825
Epoch: 44005, Training Loss: 0.00648
Epoch: 44005, Training Loss: 0.00672
Epoch: 44005, Training Loss: 0.00673
Epoch: 44005, Training Loss: 0.00825
Epoch: 44006, Training Loss: 0.00648
Epoch: 44006, Training Loss: 0.00672
Epoch: 44006, Training Loss: 0.00673
Epoch: 44006, Training Loss: 0.00825
Epoch: 44007, Training Loss: 0.00648
Epoch: 44007, Training Loss: 0.00672
Epoch: 44007, Training Loss: 0.00673
Epoch: 44007, Training Loss: 0.00825
Epoch: 44008, Training Loss: 0.00648
Epoch: 44008, Training Loss: 0.00672
Epoch: 44008, Training Loss: 0.00673
Epoch: 44008, Training Loss: 0.00825
Epoch: 44009, Training Loss: 0.00648
Epoch: 44009, Training Loss: 0.00672
Epoch: 44009, Training Loss: 0.00673
Epoch: 44009, Training Loss: 0.00825
Epoch: 44010, Training Loss: 0.00648
Epoch: 44010, Training Loss: 0.00672
Epoch: 44010, Training Loss: 0.00673
Epoch: 44010, Training Loss: 0.00825
Epoch: 44011, Training Loss: 0.00648
Epoch: 44011, Training Loss: 0.00672
Epoch: 44011, Training Loss: 0.00673
Epoch: 44011, Training Loss: 0.00825
Epoch: 44012, Training Loss: 0.00648
Epoch: 44012, Training Loss: 0.00672
Epoch: 44012, Training Loss: 0.00673
Epoch: 44012, Training Loss: 0.00825
Epoch: 44013, Training Loss: 0.00648
Epoch: 44013, Training Loss: 0.00672
Epoch: 44013, Training Loss: 0.00673
Epoch: 44013, Training Loss: 0.00825
Epoch: 44014, Training Loss: 0.00648
Epoch: 44014, Training Loss: 0.00672
Epoch: 44014, Training Loss: 0.00673
Epoch: 44014, Training Loss: 0.00825
Epoch: 44015, Training Loss: 0.00648
Epoch: 44015, Training Loss: 0.00672
Epoch: 44015, Training Loss: 0.00673
Epoch: 44015, Training Loss: 0.00825
Epoch: 44016, Training Loss: 0.00648
Epoch: 44016, Training Loss: 0.00672
Epoch: 44016, Training Loss: 0.00673
Epoch: 44016, Training Loss: 0.00825
Epoch: 44017, Training Loss: 0.00648
Epoch: 44017, Training Loss: 0.00672
Epoch: 44017, Training Loss: 0.00673
Epoch: 44017, Training Loss: 0.00825
Epoch: 44018, Training Loss: 0.00648
Epoch: 44018, Training Loss: 0.00672
Epoch: 44018, Training Loss: 0.00673
Epoch: 44018, Training Loss: 0.00825
Epoch: 44019, Training Loss: 0.00648
Epoch: 44019, Training Loss: 0.00672
Epoch: 44019, Training Loss: 0.00673
Epoch: 44019, Training Loss: 0.00825
Epoch: 44020, Training Loss: 0.00648
Epoch: 44020, Training Loss: 0.00672
Epoch: 44020, Training Loss: 0.00673
Epoch: 44020, Training Loss: 0.00825
Epoch: 44021, Training Loss: 0.00648
Epoch: 44021, Training Loss: 0.00672
Epoch: 44021, Training Loss: 0.00673
Epoch: 44021, Training Loss: 0.00825
Epoch: 44022, Training Loss: 0.00648
Epoch: 44022, Training Loss: 0.00672
Epoch: 44022, Training Loss: 0.00673
Epoch: 44022, Training Loss: 0.00825
Epoch: 44023, Training Loss: 0.00648
Epoch: 44023, Training Loss: 0.00672
Epoch: 44023, Training Loss: 0.00673
Epoch: 44023, Training Loss: 0.00825
Epoch: 44024, Training Loss: 0.00648
Epoch: 44024, Training Loss: 0.00672
Epoch: 44024, Training Loss: 0.00673
Epoch: 44024, Training Loss: 0.00825
Epoch: 44025, Training Loss: 0.00648
Epoch: 44025, Training Loss: 0.00672
Epoch: 44025, Training Loss: 0.00673
Epoch: 44025, Training Loss: 0.00825
Epoch: 44026, Training Loss: 0.00648
Epoch: 44026, Training Loss: 0.00672
Epoch: 44026, Training Loss: 0.00673
Epoch: 44026, Training Loss: 0.00825
Epoch: 44027, Training Loss: 0.00648
Epoch: 44027, Training Loss: 0.00672
Epoch: 44027, Training Loss: 0.00673
Epoch: 44027, Training Loss: 0.00825
Epoch: 44028, Training Loss: 0.00648
Epoch: 44028, Training Loss: 0.00672
Epoch: 44028, Training Loss: 0.00673
Epoch: 44028, Training Loss: 0.00825
Epoch: 44029, Training Loss: 0.00648
Epoch: 44029, Training Loss: 0.00672
Epoch: 44029, Training Loss: 0.00673
Epoch: 44029, Training Loss: 0.00825
Epoch: 44030, Training Loss: 0.00648
Epoch: 44030, Training Loss: 0.00672
Epoch: 44030, Training Loss: 0.00673
Epoch: 44030, Training Loss: 0.00825
Epoch: 44031, Training Loss: 0.00648
Epoch: 44031, Training Loss: 0.00672
Epoch: 44031, Training Loss: 0.00673
Epoch: 44031, Training Loss: 0.00825
Epoch: 44032, Training Loss: 0.00648
Epoch: 44032, Training Loss: 0.00672
Epoch: 44032, Training Loss: 0.00673
Epoch: 44032, Training Loss: 0.00824
Epoch: 44033, Training Loss: 0.00648
Epoch: 44033, Training Loss: 0.00672
Epoch: 44033, Training Loss: 0.00673
Epoch: 44033, Training Loss: 0.00824
Epoch: 44034, Training Loss: 0.00648
Epoch: 44034, Training Loss: 0.00672
Epoch: 44034, Training Loss: 0.00673
Epoch: 44034, Training Loss: 0.00824
Epoch: 44035, Training Loss: 0.00647
Epoch: 44035, Training Loss: 0.00672
Epoch: 44035, Training Loss: 0.00673
Epoch: 44035, Training Loss: 0.00824
Epoch: 44036, Training Loss: 0.00647
Epoch: 44036, Training Loss: 0.00672
Epoch: 44036, Training Loss: 0.00673
Epoch: 44036, Training Loss: 0.00824
Epoch: 44037, Training Loss: 0.00647
Epoch: 44037, Training Loss: 0.00672
Epoch: 44037, Training Loss: 0.00673
Epoch: 44037, Training Loss: 0.00824
Epoch: 44038, Training Loss: 0.00647
Epoch: 44038, Training Loss: 0.00672
Epoch: 44038, Training Loss: 0.00673
Epoch: 44038, Training Loss: 0.00824
Epoch: 44039, Training Loss: 0.00647
Epoch: 44039, Training Loss: 0.00672
Epoch: 44039, Training Loss: 0.00673
Epoch: 44039, Training Loss: 0.00824
Epoch: 44040, Training Loss: 0.00647
Epoch: 44040, Training Loss: 0.00672
Epoch: 44040, Training Loss: 0.00673
Epoch: 44040, Training Loss: 0.00824
Epoch: 44041, Training Loss: 0.00647
Epoch: 44041, Training Loss: 0.00672
Epoch: 44041, Training Loss: 0.00673
Epoch: 44041, Training Loss: 0.00824
Epoch: 44042, Training Loss: 0.00647
Epoch: 44042, Training Loss: 0.00672
Epoch: 44042, Training Loss: 0.00673
Epoch: 44042, Training Loss: 0.00824
Epoch: 44043, Training Loss: 0.00647
Epoch: 44043, Training Loss: 0.00672
Epoch: 44043, Training Loss: 0.00673
Epoch: 44043, Training Loss: 0.00824
Epoch: 44044, Training Loss: 0.00647
Epoch: 44044, Training Loss: 0.00672
Epoch: 44044, Training Loss: 0.00673
Epoch: 44044, Training Loss: 0.00824
Epoch: 44045, Training Loss: 0.00647
Epoch: 44045, Training Loss: 0.00672
Epoch: 44045, Training Loss: 0.00673
Epoch: 44045, Training Loss: 0.00824
Epoch: 44046, Training Loss: 0.00647
Epoch: 44046, Training Loss: 0.00672
Epoch: 44046, Training Loss: 0.00673
Epoch: 44046, Training Loss: 0.00824
Epoch: 44047, Training Loss: 0.00647
Epoch: 44047, Training Loss: 0.00672
Epoch: 44047, Training Loss: 0.00673
Epoch: 44047, Training Loss: 0.00824
Epoch: 44048, Training Loss: 0.00647
Epoch: 44048, Training Loss: 0.00672
Epoch: 44048, Training Loss: 0.00673
Epoch: 44048, Training Loss: 0.00824
Epoch: 44049, Training Loss: 0.00647
Epoch: 44049, Training Loss: 0.00672
Epoch: 44049, Training Loss: 0.00672
Epoch: 44049, Training Loss: 0.00824
Epoch: 44050, Training Loss: 0.00647
Epoch: 44050, Training Loss: 0.00672
Epoch: 44050, Training Loss: 0.00672
Epoch: 44050, Training Loss: 0.00824
Epoch: 44051, Training Loss: 0.00647
Epoch: 44051, Training Loss: 0.00672
Epoch: 44051, Training Loss: 0.00672
Epoch: 44051, Training Loss: 0.00824
Epoch: 44052, Training Loss: 0.00647
Epoch: 44052, Training Loss: 0.00672
Epoch: 44052, Training Loss: 0.00672
Epoch: 44052, Training Loss: 0.00824
Epoch: 44053, Training Loss: 0.00647
Epoch: 44053, Training Loss: 0.00672
Epoch: 44053, Training Loss: 0.00672
Epoch: 44053, Training Loss: 0.00824
Epoch: 44054, Training Loss: 0.00647
Epoch: 44054, Training Loss: 0.00672
Epoch: 44054, Training Loss: 0.00672
Epoch: 44054, Training Loss: 0.00824
Epoch: 44055, Training Loss: 0.00647
Epoch: 44055, Training Loss: 0.00672
Epoch: 44055, Training Loss: 0.00672
Epoch: 44055, Training Loss: 0.00824
Epoch: 44056, Training Loss: 0.00647
Epoch: 44056, Training Loss: 0.00672
Epoch: 44056, Training Loss: 0.00672
Epoch: 44056, Training Loss: 0.00824
Epoch: 44057, Training Loss: 0.00647
Epoch: 44057, Training Loss: 0.00672
Epoch: 44057, Training Loss: 0.00672
Epoch: 44057, Training Loss: 0.00824
Epoch: 44058, Training Loss: 0.00647
Epoch: 44058, Training Loss: 0.00672
Epoch: 44058, Training Loss: 0.00672
Epoch: 44058, Training Loss: 0.00824
Epoch: 44059, Training Loss: 0.00647
Epoch: 44059, Training Loss: 0.00672
Epoch: 44059, Training Loss: 0.00672
Epoch: 44059, Training Loss: 0.00824
Epoch: 44060, Training Loss: 0.00647
Epoch: 44060, Training Loss: 0.00672
Epoch: 44060, Training Loss: 0.00672
Epoch: 44060, Training Loss: 0.00824
Epoch: 44061, Training Loss: 0.00647
Epoch: 44061, Training Loss: 0.00672
Epoch: 44061, Training Loss: 0.00672
Epoch: 44061, Training Loss: 0.00824
Epoch: 44062, Training Loss: 0.00647
Epoch: 44062, Training Loss: 0.00672
Epoch: 44062, Training Loss: 0.00672
Epoch: 44062, Training Loss: 0.00824
Epoch: 44063, Training Loss: 0.00647
Epoch: 44063, Training Loss: 0.00672
Epoch: 44063, Training Loss: 0.00672
Epoch: 44063, Training Loss: 0.00824
Epoch: 44064, Training Loss: 0.00647
Epoch: 44064, Training Loss: 0.00672
Epoch: 44064, Training Loss: 0.00672
Epoch: 44064, Training Loss: 0.00824
Epoch: 44065, Training Loss: 0.00647
Epoch: 44065, Training Loss: 0.00672
Epoch: 44065, Training Loss: 0.00672
Epoch: 44065, Training Loss: 0.00824
Epoch: 44066, Training Loss: 0.00647
Epoch: 44066, Training Loss: 0.00672
Epoch: 44066, Training Loss: 0.00672
Epoch: 44066, Training Loss: 0.00824
Epoch: 44067, Training Loss: 0.00647
Epoch: 44067, Training Loss: 0.00672
Epoch: 44067, Training Loss: 0.00672
Epoch: 44067, Training Loss: 0.00824
Epoch: 44068, Training Loss: 0.00647
Epoch: 44068, Training Loss: 0.00672
Epoch: 44068, Training Loss: 0.00672
Epoch: 44068, Training Loss: 0.00824
Epoch: 44069, Training Loss: 0.00647
Epoch: 44069, Training Loss: 0.00672
Epoch: 44069, Training Loss: 0.00672
Epoch: 44069, Training Loss: 0.00824
Epoch: 44070, Training Loss: 0.00647
Epoch: 44070, Training Loss: 0.00672
Epoch: 44070, Training Loss: 0.00672
Epoch: 44070, Training Loss: 0.00824
Epoch: 44071, Training Loss: 0.00647
Epoch: 44071, Training Loss: 0.00672
Epoch: 44071, Training Loss: 0.00672
Epoch: 44071, Training Loss: 0.00824
Epoch: 44072, Training Loss: 0.00647
Epoch: 44072, Training Loss: 0.00672
Epoch: 44072, Training Loss: 0.00672
Epoch: 44072, Training Loss: 0.00824
Epoch: 44073, Training Loss: 0.00647
Epoch: 44073, Training Loss: 0.00672
Epoch: 44073, Training Loss: 0.00672
Epoch: 44073, Training Loss: 0.00824
Epoch: 44074, Training Loss: 0.00647
Epoch: 44074, Training Loss: 0.00672
Epoch: 44074, Training Loss: 0.00672
Epoch: 44074, Training Loss: 0.00824
Epoch: 44075, Training Loss: 0.00647
Epoch: 44075, Training Loss: 0.00672
Epoch: 44075, Training Loss: 0.00672
Epoch: 44075, Training Loss: 0.00824
Epoch: 44076, Training Loss: 0.00647
Epoch: 44076, Training Loss: 0.00672
Epoch: 44076, Training Loss: 0.00672
Epoch: 44076, Training Loss: 0.00824
Epoch: 44077, Training Loss: 0.00647
Epoch: 44077, Training Loss: 0.00672
Epoch: 44077, Training Loss: 0.00672
Epoch: 44077, Training Loss: 0.00824
Epoch: 44078, Training Loss: 0.00647
Epoch: 44078, Training Loss: 0.00672
Epoch: 44078, Training Loss: 0.00672
Epoch: 44078, Training Loss: 0.00824
Epoch: 44079, Training Loss: 0.00647
Epoch: 44079, Training Loss: 0.00672
Epoch: 44079, Training Loss: 0.00672
Epoch: 44079, Training Loss: 0.00824
Epoch: 44080, Training Loss: 0.00647
Epoch: 44080, Training Loss: 0.00671
Epoch: 44080, Training Loss: 0.00672
Epoch: 44080, Training Loss: 0.00824
Epoch: 44081, Training Loss: 0.00647
Epoch: 44081, Training Loss: 0.00671
Epoch: 44081, Training Loss: 0.00672
Epoch: 44081, Training Loss: 0.00824
Epoch: 44082, Training Loss: 0.00647
Epoch: 44082, Training Loss: 0.00671
Epoch: 44082, Training Loss: 0.00672
Epoch: 44082, Training Loss: 0.00824
Epoch: 44083, Training Loss: 0.00647
Epoch: 44083, Training Loss: 0.00671
Epoch: 44083, Training Loss: 0.00672
Epoch: 44083, Training Loss: 0.00824
Epoch: 44084, Training Loss: 0.00647
Epoch: 44084, Training Loss: 0.00671
Epoch: 44084, Training Loss: 0.00672
Epoch: 44084, Training Loss: 0.00824
Epoch: 44085, Training Loss: 0.00647
Epoch: 44085, Training Loss: 0.00671
Epoch: 44085, Training Loss: 0.00672
Epoch: 44085, Training Loss: 0.00824
Epoch: 44086, Training Loss: 0.00647
Epoch: 44086, Training Loss: 0.00671
Epoch: 44086, Training Loss: 0.00672
Epoch: 44086, Training Loss: 0.00824
Epoch: 44087, Training Loss: 0.00647
Epoch: 44087, Training Loss: 0.00671
Epoch: 44087, Training Loss: 0.00672
Epoch: 44087, Training Loss: 0.00824
Epoch: 44088, Training Loss: 0.00647
Epoch: 44088, Training Loss: 0.00671
Epoch: 44088, Training Loss: 0.00672
Epoch: 44088, Training Loss: 0.00824
Epoch: 44089, Training Loss: 0.00647
Epoch: 44089, Training Loss: 0.00671
Epoch: 44089, Training Loss: 0.00672
Epoch: 44089, Training Loss: 0.00824
Epoch: 44090, Training Loss: 0.00647
Epoch: 44090, Training Loss: 0.00671
Epoch: 44090, Training Loss: 0.00672
Epoch: 44090, Training Loss: 0.00824
Epoch: 44091, Training Loss: 0.00647
Epoch: 44091, Training Loss: 0.00671
Epoch: 44091, Training Loss: 0.00672
Epoch: 44091, Training Loss: 0.00824
Epoch: 44092, Training Loss: 0.00647
Epoch: 44092, Training Loss: 0.00671
Epoch: 44092, Training Loss: 0.00672
Epoch: 44092, Training Loss: 0.00824
Epoch: 44093, Training Loss: 0.00647
Epoch: 44093, Training Loss: 0.00671
Epoch: 44093, Training Loss: 0.00672
Epoch: 44093, Training Loss: 0.00824
Epoch: 44094, Training Loss: 0.00647
Epoch: 44094, Training Loss: 0.00671
Epoch: 44094, Training Loss: 0.00672
Epoch: 44094, Training Loss: 0.00824
Epoch: 44095, Training Loss: 0.00647
Epoch: 44095, Training Loss: 0.00671
Epoch: 44095, Training Loss: 0.00672
Epoch: 44095, Training Loss: 0.00824
Epoch: 44096, Training Loss: 0.00647
Epoch: 44096, Training Loss: 0.00671
Epoch: 44096, Training Loss: 0.00672
Epoch: 44096, Training Loss: 0.00824
Epoch: 44097, Training Loss: 0.00647
Epoch: 44097, Training Loss: 0.00671
Epoch: 44097, Training Loss: 0.00672
Epoch: 44097, Training Loss: 0.00824
Epoch: 44098, Training Loss: 0.00647
Epoch: 44098, Training Loss: 0.00671
Epoch: 44098, Training Loss: 0.00672
Epoch: 44098, Training Loss: 0.00824
Epoch: 44099, Training Loss: 0.00647
Epoch: 44099, Training Loss: 0.00671
Epoch: 44099, Training Loss: 0.00672
Epoch: 44099, Training Loss: 0.00824
Epoch: 44100, Training Loss: 0.00647
Epoch: 44100, Training Loss: 0.00671
Epoch: 44100, Training Loss: 0.00672
Epoch: 44100, Training Loss: 0.00824
Epoch: 44101, Training Loss: 0.00647
Epoch: 44101, Training Loss: 0.00671
Epoch: 44101, Training Loss: 0.00672
Epoch: 44101, Training Loss: 0.00824
Epoch: 44102, Training Loss: 0.00647
Epoch: 44102, Training Loss: 0.00671
Epoch: 44102, Training Loss: 0.00672
Epoch: 44102, Training Loss: 0.00824
Epoch: 44103, Training Loss: 0.00647
Epoch: 44103, Training Loss: 0.00671
Epoch: 44103, Training Loss: 0.00672
Epoch: 44103, Training Loss: 0.00824
Epoch: 44104, Training Loss: 0.00647
Epoch: 44104, Training Loss: 0.00671
Epoch: 44104, Training Loss: 0.00672
Epoch: 44104, Training Loss: 0.00824
Epoch: 44105, Training Loss: 0.00647
Epoch: 44105, Training Loss: 0.00671
Epoch: 44105, Training Loss: 0.00672
Epoch: 44105, Training Loss: 0.00824
Epoch: 44106, Training Loss: 0.00647
Epoch: 44106, Training Loss: 0.00671
Epoch: 44106, Training Loss: 0.00672
Epoch: 44106, Training Loss: 0.00824
Epoch: 44107, Training Loss: 0.00647
Epoch: 44107, Training Loss: 0.00671
Epoch: 44107, Training Loss: 0.00672
Epoch: 44107, Training Loss: 0.00824
Epoch: 44108, Training Loss: 0.00647
Epoch: 44108, Training Loss: 0.00671
Epoch: 44108, Training Loss: 0.00672
Epoch: 44108, Training Loss: 0.00824
Epoch: 44109, Training Loss: 0.00647
Epoch: 44109, Training Loss: 0.00671
Epoch: 44109, Training Loss: 0.00672
Epoch: 44109, Training Loss: 0.00824
Epoch: 44110, Training Loss: 0.00647
Epoch: 44110, Training Loss: 0.00671
Epoch: 44110, Training Loss: 0.00672
Epoch: 44110, Training Loss: 0.00824
Epoch: 44111, Training Loss: 0.00647
Epoch: 44111, Training Loss: 0.00671
Epoch: 44111, Training Loss: 0.00672
Epoch: 44111, Training Loss: 0.00824
Epoch: 44112, Training Loss: 0.00647
Epoch: 44112, Training Loss: 0.00671
Epoch: 44112, Training Loss: 0.00672
Epoch: 44112, Training Loss: 0.00824
Epoch: 44113, Training Loss: 0.00647
Epoch: 44113, Training Loss: 0.00671
Epoch: 44113, Training Loss: 0.00672
Epoch: 44113, Training Loss: 0.00824
Epoch: 44114, Training Loss: 0.00647
Epoch: 44114, Training Loss: 0.00671
Epoch: 44114, Training Loss: 0.00672
Epoch: 44114, Training Loss: 0.00824
Epoch: 44115, Training Loss: 0.00647
Epoch: 44115, Training Loss: 0.00671
Epoch: 44115, Training Loss: 0.00672
Epoch: 44115, Training Loss: 0.00824
Epoch: 44116, Training Loss: 0.00647
Epoch: 44116, Training Loss: 0.00671
Epoch: 44116, Training Loss: 0.00672
Epoch: 44116, Training Loss: 0.00824
Epoch: 44117, Training Loss: 0.00647
Epoch: 44117, Training Loss: 0.00671
Epoch: 44117, Training Loss: 0.00672
Epoch: 44117, Training Loss: 0.00824
Epoch: 44118, Training Loss: 0.00647
Epoch: 44118, Training Loss: 0.00671
Epoch: 44118, Training Loss: 0.00672
Epoch: 44118, Training Loss: 0.00824
Epoch: 44119, Training Loss: 0.00647
Epoch: 44119, Training Loss: 0.00671
Epoch: 44119, Training Loss: 0.00672
Epoch: 44119, Training Loss: 0.00824
Epoch: 44120, Training Loss: 0.00647
Epoch: 44120, Training Loss: 0.00671
Epoch: 44120, Training Loss: 0.00672
Epoch: 44120, Training Loss: 0.00824
Epoch: 44121, Training Loss: 0.00647
Epoch: 44121, Training Loss: 0.00671
Epoch: 44121, Training Loss: 0.00672
Epoch: 44121, Training Loss: 0.00824
Epoch: 44122, Training Loss: 0.00647
Epoch: 44122, Training Loss: 0.00671
Epoch: 44122, Training Loss: 0.00672
Epoch: 44122, Training Loss: 0.00824
Epoch: 44123, Training Loss: 0.00647
Epoch: 44123, Training Loss: 0.00671
Epoch: 44123, Training Loss: 0.00672
Epoch: 44123, Training Loss: 0.00824
Epoch: 44124, Training Loss: 0.00647
Epoch: 44124, Training Loss: 0.00671
Epoch: 44124, Training Loss: 0.00672
Epoch: 44124, Training Loss: 0.00824
Epoch: 44125, Training Loss: 0.00647
Epoch: 44125, Training Loss: 0.00671
Epoch: 44125, Training Loss: 0.00672
Epoch: 44125, Training Loss: 0.00824
Epoch: 44126, Training Loss: 0.00647
Epoch: 44126, Training Loss: 0.00671
Epoch: 44126, Training Loss: 0.00672
Epoch: 44126, Training Loss: 0.00824
Epoch: 44127, Training Loss: 0.00647
Epoch: 44127, Training Loss: 0.00671
Epoch: 44127, Training Loss: 0.00672
Epoch: 44127, Training Loss: 0.00824
Epoch: 44128, Training Loss: 0.00647
Epoch: 44128, Training Loss: 0.00671
Epoch: 44128, Training Loss: 0.00672
Epoch: 44128, Training Loss: 0.00824
Epoch: 44129, Training Loss: 0.00647
Epoch: 44129, Training Loss: 0.00671
Epoch: 44129, Training Loss: 0.00672
Epoch: 44129, Training Loss: 0.00823
Epoch: 44130, Training Loss: 0.00647
Epoch: 44130, Training Loss: 0.00671
Epoch: 44130, Training Loss: 0.00672
Epoch: 44130, Training Loss: 0.00823
Epoch: 44131, Training Loss: 0.00647
Epoch: 44131, Training Loss: 0.00671
Epoch: 44131, Training Loss: 0.00672
Epoch: 44131, Training Loss: 0.00823
Epoch: 44132, Training Loss: 0.00647
Epoch: 44132, Training Loss: 0.00671
Epoch: 44132, Training Loss: 0.00672
Epoch: 44132, Training Loss: 0.00823
Epoch: 44133, Training Loss: 0.00647
Epoch: 44133, Training Loss: 0.00671
Epoch: 44133, Training Loss: 0.00672
Epoch: 44133, Training Loss: 0.00823
Epoch: 44134, Training Loss: 0.00647
Epoch: 44134, Training Loss: 0.00671
Epoch: 44134, Training Loss: 0.00672
Epoch: 44134, Training Loss: 0.00823
Epoch: 44135, Training Loss: 0.00647
Epoch: 44135, Training Loss: 0.00671
Epoch: 44135, Training Loss: 0.00672
Epoch: 44135, Training Loss: 0.00823
Epoch: 44136, Training Loss: 0.00647
Epoch: 44136, Training Loss: 0.00671
Epoch: 44136, Training Loss: 0.00672
Epoch: 44136, Training Loss: 0.00823
Epoch: 44137, Training Loss: 0.00647
Epoch: 44137, Training Loss: 0.00671
Epoch: 44137, Training Loss: 0.00672
Epoch: 44137, Training Loss: 0.00823
Epoch: 44138, Training Loss: 0.00647
Epoch: 44138, Training Loss: 0.00671
Epoch: 44138, Training Loss: 0.00672
Epoch: 44138, Training Loss: 0.00823
Epoch: 44139, Training Loss: 0.00647
Epoch: 44139, Training Loss: 0.00671
Epoch: 44139, Training Loss: 0.00672
Epoch: 44139, Training Loss: 0.00823
Epoch: 44140, Training Loss: 0.00647
Epoch: 44140, Training Loss: 0.00671
Epoch: 44140, Training Loss: 0.00672
Epoch: 44140, Training Loss: 0.00823
Epoch: 44141, Training Loss: 0.00647
Epoch: 44141, Training Loss: 0.00671
Epoch: 44141, Training Loss: 0.00672
Epoch: 44141, Training Loss: 0.00823
Epoch: 44142, Training Loss: 0.00647
Epoch: 44142, Training Loss: 0.00671
Epoch: 44142, Training Loss: 0.00672
Epoch: 44142, Training Loss: 0.00823
Epoch: 44143, Training Loss: 0.00647
Epoch: 44143, Training Loss: 0.00671
Epoch: 44143, Training Loss: 0.00672
Epoch: 44143, Training Loss: 0.00823
Epoch: 44144, Training Loss: 0.00647
Epoch: 44144, Training Loss: 0.00671
Epoch: 44144, Training Loss: 0.00672
Epoch: 44144, Training Loss: 0.00823
Epoch: 44145, Training Loss: 0.00647
Epoch: 44145, Training Loss: 0.00671
Epoch: 44145, Training Loss: 0.00672
Epoch: 44145, Training Loss: 0.00823
Epoch: 44146, Training Loss: 0.00647
Epoch: 44146, Training Loss: 0.00671
Epoch: 44146, Training Loss: 0.00672
Epoch: 44146, Training Loss: 0.00823
Epoch: 44147, Training Loss: 0.00647
Epoch: 44147, Training Loss: 0.00671
Epoch: 44147, Training Loss: 0.00672
Epoch: 44147, Training Loss: 0.00823
Epoch: 44148, Training Loss: 0.00647
Epoch: 44148, Training Loss: 0.00671
Epoch: 44148, Training Loss: 0.00672
Epoch: 44148, Training Loss: 0.00823
Epoch: 44149, Training Loss: 0.00647
Epoch: 44149, Training Loss: 0.00671
Epoch: 44149, Training Loss: 0.00672
Epoch: 44149, Training Loss: 0.00823
Epoch: 44150, Training Loss: 0.00647
Epoch: 44150, Training Loss: 0.00671
Epoch: 44150, Training Loss: 0.00672
Epoch: 44150, Training Loss: 0.00823
Epoch: 44151, Training Loss: 0.00647
Epoch: 44151, Training Loss: 0.00671
Epoch: 44151, Training Loss: 0.00672
Epoch: 44151, Training Loss: 0.00823
Epoch: 44152, Training Loss: 0.00647
Epoch: 44152, Training Loss: 0.00671
Epoch: 44152, Training Loss: 0.00672
Epoch: 44152, Training Loss: 0.00823
Epoch: 44153, Training Loss: 0.00647
Epoch: 44153, Training Loss: 0.00671
Epoch: 44153, Training Loss: 0.00672
Epoch: 44153, Training Loss: 0.00823
Epoch: 44154, Training Loss: 0.00647
Epoch: 44154, Training Loss: 0.00671
Epoch: 44154, Training Loss: 0.00672
Epoch: 44154, Training Loss: 0.00823
Epoch: 44155, Training Loss: 0.00647
Epoch: 44155, Training Loss: 0.00671
Epoch: 44155, Training Loss: 0.00672
Epoch: 44155, Training Loss: 0.00823
Epoch: 44156, Training Loss: 0.00647
Epoch: 44156, Training Loss: 0.00671
Epoch: 44156, Training Loss: 0.00672
Epoch: 44156, Training Loss: 0.00823
Epoch: 44157, Training Loss: 0.00647
Epoch: 44157, Training Loss: 0.00671
Epoch: 44157, Training Loss: 0.00672
Epoch: 44157, Training Loss: 0.00823
Epoch: 44158, Training Loss: 0.00647
Epoch: 44158, Training Loss: 0.00671
Epoch: 44158, Training Loss: 0.00672
Epoch: 44158, Training Loss: 0.00823
Epoch: 44159, Training Loss: 0.00647
Epoch: 44159, Training Loss: 0.00671
Epoch: 44159, Training Loss: 0.00672
Epoch: 44159, Training Loss: 0.00823
Epoch: 44160, Training Loss: 0.00647
Epoch: 44160, Training Loss: 0.00671
Epoch: 44160, Training Loss: 0.00672
Epoch: 44160, Training Loss: 0.00823
Epoch: 44161, Training Loss: 0.00647
Epoch: 44161, Training Loss: 0.00671
Epoch: 44161, Training Loss: 0.00672
Epoch: 44161, Training Loss: 0.00823
Epoch: 44162, Training Loss: 0.00647
Epoch: 44162, Training Loss: 0.00671
Epoch: 44162, Training Loss: 0.00672
Epoch: 44162, Training Loss: 0.00823
Epoch: 44163, Training Loss: 0.00646
Epoch: 44163, Training Loss: 0.00671
Epoch: 44163, Training Loss: 0.00672
Epoch: 44163, Training Loss: 0.00823
...
Epoch: 44406, Training Loss: 0.00645
Epoch: 44406, Training Loss: 0.00669
Epoch: 44406, Training Loss: 0.00670
[output truncated: the cell prints one loss per training sample (4 lines) every epoch; across epochs 44163-44406 the four per-sample losses decrease very slowly, from roughly (0.00646, 0.00671, 0.00672, 0.00823) toward the 0.008 error target]
Epoch: 44406, Training Loss: 0.00821
Epoch: 44407, Training Loss: 0.00645
Epoch: 44407, Training Loss: 0.00669
Epoch: 44407, Training Loss: 0.00670
Epoch: 44407, Training Loss: 0.00821
Epoch: 44408, Training Loss: 0.00645
Epoch: 44408, Training Loss: 0.00669
Epoch: 44408, Training Loss: 0.00670
Epoch: 44408, Training Loss: 0.00821
Epoch: 44409, Training Loss: 0.00645
Epoch: 44409, Training Loss: 0.00669
Epoch: 44409, Training Loss: 0.00670
Epoch: 44409, Training Loss: 0.00821
Epoch: 44410, Training Loss: 0.00645
Epoch: 44410, Training Loss: 0.00669
Epoch: 44410, Training Loss: 0.00670
Epoch: 44410, Training Loss: 0.00821
Epoch: 44411, Training Loss: 0.00645
Epoch: 44411, Training Loss: 0.00669
Epoch: 44411, Training Loss: 0.00669
Epoch: 44411, Training Loss: 0.00821
Epoch: 44412, Training Loss: 0.00645
Epoch: 44412, Training Loss: 0.00669
Epoch: 44412, Training Loss: 0.00669
Epoch: 44412, Training Loss: 0.00821
Epoch: 44413, Training Loss: 0.00645
Epoch: 44413, Training Loss: 0.00669
Epoch: 44413, Training Loss: 0.00669
Epoch: 44413, Training Loss: 0.00821
Epoch: 44414, Training Loss: 0.00645
Epoch: 44414, Training Loss: 0.00669
Epoch: 44414, Training Loss: 0.00669
Epoch: 44414, Training Loss: 0.00821
Epoch: 44415, Training Loss: 0.00645
Epoch: 44415, Training Loss: 0.00669
Epoch: 44415, Training Loss: 0.00669
Epoch: 44415, Training Loss: 0.00821
Epoch: 44416, Training Loss: 0.00645
Epoch: 44416, Training Loss: 0.00669
Epoch: 44416, Training Loss: 0.00669
Epoch: 44416, Training Loss: 0.00821
Epoch: 44417, Training Loss: 0.00645
Epoch: 44417, Training Loss: 0.00669
Epoch: 44417, Training Loss: 0.00669
Epoch: 44417, Training Loss: 0.00821
Epoch: 44418, Training Loss: 0.00645
Epoch: 44418, Training Loss: 0.00669
Epoch: 44418, Training Loss: 0.00669
Epoch: 44418, Training Loss: 0.00821
Epoch: 44419, Training Loss: 0.00644
Epoch: 44419, Training Loss: 0.00669
Epoch: 44419, Training Loss: 0.00669
Epoch: 44419, Training Loss: 0.00821
Epoch: 44420, Training Loss: 0.00644
Epoch: 44420, Training Loss: 0.00669
Epoch: 44420, Training Loss: 0.00669
Epoch: 44420, Training Loss: 0.00821
Epoch: 44421, Training Loss: 0.00644
Epoch: 44421, Training Loss: 0.00669
Epoch: 44421, Training Loss: 0.00669
Epoch: 44421, Training Loss: 0.00821
Epoch: 44422, Training Loss: 0.00644
Epoch: 44422, Training Loss: 0.00669
Epoch: 44422, Training Loss: 0.00669
Epoch: 44422, Training Loss: 0.00820
Epoch: 44423, Training Loss: 0.00644
Epoch: 44423, Training Loss: 0.00669
Epoch: 44423, Training Loss: 0.00669
Epoch: 44423, Training Loss: 0.00820
Epoch: 44424, Training Loss: 0.00644
Epoch: 44424, Training Loss: 0.00669
Epoch: 44424, Training Loss: 0.00669
Epoch: 44424, Training Loss: 0.00820
Epoch: 44425, Training Loss: 0.00644
Epoch: 44425, Training Loss: 0.00669
Epoch: 44425, Training Loss: 0.00669
Epoch: 44425, Training Loss: 0.00820
Epoch: 44426, Training Loss: 0.00644
Epoch: 44426, Training Loss: 0.00669
Epoch: 44426, Training Loss: 0.00669
Epoch: 44426, Training Loss: 0.00820
Epoch: 44427, Training Loss: 0.00644
Epoch: 44427, Training Loss: 0.00669
Epoch: 44427, Training Loss: 0.00669
Epoch: 44427, Training Loss: 0.00820
Epoch: 44428, Training Loss: 0.00644
Epoch: 44428, Training Loss: 0.00669
Epoch: 44428, Training Loss: 0.00669
Epoch: 44428, Training Loss: 0.00820
Epoch: 44429, Training Loss: 0.00644
Epoch: 44429, Training Loss: 0.00669
Epoch: 44429, Training Loss: 0.00669
Epoch: 44429, Training Loss: 0.00820
Epoch: 44430, Training Loss: 0.00644
Epoch: 44430, Training Loss: 0.00669
Epoch: 44430, Training Loss: 0.00669
Epoch: 44430, Training Loss: 0.00820
Epoch: 44431, Training Loss: 0.00644
Epoch: 44431, Training Loss: 0.00669
Epoch: 44431, Training Loss: 0.00669
Epoch: 44431, Training Loss: 0.00820
Epoch: 44432, Training Loss: 0.00644
Epoch: 44432, Training Loss: 0.00669
Epoch: 44432, Training Loss: 0.00669
Epoch: 44432, Training Loss: 0.00820
Epoch: 44433, Training Loss: 0.00644
Epoch: 44433, Training Loss: 0.00669
Epoch: 44433, Training Loss: 0.00669
Epoch: 44433, Training Loss: 0.00820
Epoch: 44434, Training Loss: 0.00644
Epoch: 44434, Training Loss: 0.00669
Epoch: 44434, Training Loss: 0.00669
Epoch: 44434, Training Loss: 0.00820
Epoch: 44435, Training Loss: 0.00644
Epoch: 44435, Training Loss: 0.00669
Epoch: 44435, Training Loss: 0.00669
Epoch: 44435, Training Loss: 0.00820
Epoch: 44436, Training Loss: 0.00644
Epoch: 44436, Training Loss: 0.00669
Epoch: 44436, Training Loss: 0.00669
Epoch: 44436, Training Loss: 0.00820
Epoch: 44437, Training Loss: 0.00644
Epoch: 44437, Training Loss: 0.00669
Epoch: 44437, Training Loss: 0.00669
Epoch: 44437, Training Loss: 0.00820
Epoch: 44438, Training Loss: 0.00644
Epoch: 44438, Training Loss: 0.00669
Epoch: 44438, Training Loss: 0.00669
Epoch: 44438, Training Loss: 0.00820
Epoch: 44439, Training Loss: 0.00644
Epoch: 44439, Training Loss: 0.00669
Epoch: 44439, Training Loss: 0.00669
Epoch: 44439, Training Loss: 0.00820
Epoch: 44440, Training Loss: 0.00644
Epoch: 44440, Training Loss: 0.00669
Epoch: 44440, Training Loss: 0.00669
Epoch: 44440, Training Loss: 0.00820
Epoch: 44441, Training Loss: 0.00644
Epoch: 44441, Training Loss: 0.00669
Epoch: 44441, Training Loss: 0.00669
Epoch: 44441, Training Loss: 0.00820
Epoch: 44442, Training Loss: 0.00644
Epoch: 44442, Training Loss: 0.00669
Epoch: 44442, Training Loss: 0.00669
Epoch: 44442, Training Loss: 0.00820
Epoch: 44443, Training Loss: 0.00644
Epoch: 44443, Training Loss: 0.00669
Epoch: 44443, Training Loss: 0.00669
Epoch: 44443, Training Loss: 0.00820
Epoch: 44444, Training Loss: 0.00644
Epoch: 44444, Training Loss: 0.00668
Epoch: 44444, Training Loss: 0.00669
Epoch: 44444, Training Loss: 0.00820
Epoch: 44445, Training Loss: 0.00644
Epoch: 44445, Training Loss: 0.00668
Epoch: 44445, Training Loss: 0.00669
Epoch: 44445, Training Loss: 0.00820
Epoch: 44446, Training Loss: 0.00644
Epoch: 44446, Training Loss: 0.00668
Epoch: 44446, Training Loss: 0.00669
Epoch: 44446, Training Loss: 0.00820
Epoch: 44447, Training Loss: 0.00644
Epoch: 44447, Training Loss: 0.00668
Epoch: 44447, Training Loss: 0.00669
Epoch: 44447, Training Loss: 0.00820
Epoch: 44448, Training Loss: 0.00644
Epoch: 44448, Training Loss: 0.00668
Epoch: 44448, Training Loss: 0.00669
Epoch: 44448, Training Loss: 0.00820
Epoch: 44449, Training Loss: 0.00644
Epoch: 44449, Training Loss: 0.00668
Epoch: 44449, Training Loss: 0.00669
Epoch: 44449, Training Loss: 0.00820
Epoch: 44450, Training Loss: 0.00644
Epoch: 44450, Training Loss: 0.00668
Epoch: 44450, Training Loss: 0.00669
Epoch: 44450, Training Loss: 0.00820
Epoch: 44451, Training Loss: 0.00644
Epoch: 44451, Training Loss: 0.00668
Epoch: 44451, Training Loss: 0.00669
Epoch: 44451, Training Loss: 0.00820
Epoch: 44452, Training Loss: 0.00644
Epoch: 44452, Training Loss: 0.00668
Epoch: 44452, Training Loss: 0.00669
Epoch: 44452, Training Loss: 0.00820
Epoch: 44453, Training Loss: 0.00644
Epoch: 44453, Training Loss: 0.00668
Epoch: 44453, Training Loss: 0.00669
Epoch: 44453, Training Loss: 0.00820
Epoch: 44454, Training Loss: 0.00644
Epoch: 44454, Training Loss: 0.00668
Epoch: 44454, Training Loss: 0.00669
Epoch: 44454, Training Loss: 0.00820
Epoch: 44455, Training Loss: 0.00644
Epoch: 44455, Training Loss: 0.00668
Epoch: 44455, Training Loss: 0.00669
Epoch: 44455, Training Loss: 0.00820
Epoch: 44456, Training Loss: 0.00644
Epoch: 44456, Training Loss: 0.00668
Epoch: 44456, Training Loss: 0.00669
Epoch: 44456, Training Loss: 0.00820
Epoch: 44457, Training Loss: 0.00644
Epoch: 44457, Training Loss: 0.00668
Epoch: 44457, Training Loss: 0.00669
Epoch: 44457, Training Loss: 0.00820
Epoch: 44458, Training Loss: 0.00644
Epoch: 44458, Training Loss: 0.00668
Epoch: 44458, Training Loss: 0.00669
Epoch: 44458, Training Loss: 0.00820
Epoch: 44459, Training Loss: 0.00644
Epoch: 44459, Training Loss: 0.00668
Epoch: 44459, Training Loss: 0.00669
Epoch: 44459, Training Loss: 0.00820
Epoch: 44460, Training Loss: 0.00644
Epoch: 44460, Training Loss: 0.00668
Epoch: 44460, Training Loss: 0.00669
Epoch: 44460, Training Loss: 0.00820
Epoch: 44461, Training Loss: 0.00644
Epoch: 44461, Training Loss: 0.00668
Epoch: 44461, Training Loss: 0.00669
Epoch: 44461, Training Loss: 0.00820
Epoch: 44462, Training Loss: 0.00644
Epoch: 44462, Training Loss: 0.00668
Epoch: 44462, Training Loss: 0.00669
Epoch: 44462, Training Loss: 0.00820
Epoch: 44463, Training Loss: 0.00644
Epoch: 44463, Training Loss: 0.00668
Epoch: 44463, Training Loss: 0.00669
Epoch: 44463, Training Loss: 0.00820
Epoch: 44464, Training Loss: 0.00644
Epoch: 44464, Training Loss: 0.00668
Epoch: 44464, Training Loss: 0.00669
Epoch: 44464, Training Loss: 0.00820
Epoch: 44465, Training Loss: 0.00644
Epoch: 44465, Training Loss: 0.00668
Epoch: 44465, Training Loss: 0.00669
Epoch: 44465, Training Loss: 0.00820
Epoch: 44466, Training Loss: 0.00644
Epoch: 44466, Training Loss: 0.00668
Epoch: 44466, Training Loss: 0.00669
Epoch: 44466, Training Loss: 0.00820
Epoch: 44467, Training Loss: 0.00644
Epoch: 44467, Training Loss: 0.00668
Epoch: 44467, Training Loss: 0.00669
Epoch: 44467, Training Loss: 0.00820
Epoch: 44468, Training Loss: 0.00644
Epoch: 44468, Training Loss: 0.00668
Epoch: 44468, Training Loss: 0.00669
Epoch: 44468, Training Loss: 0.00820
Epoch: 44469, Training Loss: 0.00644
Epoch: 44469, Training Loss: 0.00668
Epoch: 44469, Training Loss: 0.00669
Epoch: 44469, Training Loss: 0.00820
Epoch: 44470, Training Loss: 0.00644
Epoch: 44470, Training Loss: 0.00668
Epoch: 44470, Training Loss: 0.00669
Epoch: 44470, Training Loss: 0.00820
Epoch: 44471, Training Loss: 0.00644
Epoch: 44471, Training Loss: 0.00668
Epoch: 44471, Training Loss: 0.00669
Epoch: 44471, Training Loss: 0.00820
Epoch: 44472, Training Loss: 0.00644
Epoch: 44472, Training Loss: 0.00668
Epoch: 44472, Training Loss: 0.00669
Epoch: 44472, Training Loss: 0.00820
Epoch: 44473, Training Loss: 0.00644
Epoch: 44473, Training Loss: 0.00668
Epoch: 44473, Training Loss: 0.00669
Epoch: 44473, Training Loss: 0.00820
Epoch: 44474, Training Loss: 0.00644
Epoch: 44474, Training Loss: 0.00668
Epoch: 44474, Training Loss: 0.00669
Epoch: 44474, Training Loss: 0.00820
Epoch: 44475, Training Loss: 0.00644
Epoch: 44475, Training Loss: 0.00668
Epoch: 44475, Training Loss: 0.00669
Epoch: 44475, Training Loss: 0.00820
Epoch: 44476, Training Loss: 0.00644
Epoch: 44476, Training Loss: 0.00668
Epoch: 44476, Training Loss: 0.00669
Epoch: 44476, Training Loss: 0.00820
Epoch: 44477, Training Loss: 0.00644
Epoch: 44477, Training Loss: 0.00668
Epoch: 44477, Training Loss: 0.00669
Epoch: 44477, Training Loss: 0.00820
Epoch: 44478, Training Loss: 0.00644
Epoch: 44478, Training Loss: 0.00668
Epoch: 44478, Training Loss: 0.00669
Epoch: 44478, Training Loss: 0.00820
Epoch: 44479, Training Loss: 0.00644
Epoch: 44479, Training Loss: 0.00668
Epoch: 44479, Training Loss: 0.00669
Epoch: 44479, Training Loss: 0.00820
Epoch: 44480, Training Loss: 0.00644
Epoch: 44480, Training Loss: 0.00668
Epoch: 44480, Training Loss: 0.00669
Epoch: 44480, Training Loss: 0.00820
Epoch: 44481, Training Loss: 0.00644
Epoch: 44481, Training Loss: 0.00668
Epoch: 44481, Training Loss: 0.00669
Epoch: 44481, Training Loss: 0.00820
Epoch: 44482, Training Loss: 0.00644
Epoch: 44482, Training Loss: 0.00668
Epoch: 44482, Training Loss: 0.00669
Epoch: 44482, Training Loss: 0.00820
Epoch: 44483, Training Loss: 0.00644
Epoch: 44483, Training Loss: 0.00668
Epoch: 44483, Training Loss: 0.00669
Epoch: 44483, Training Loss: 0.00820
Epoch: 44484, Training Loss: 0.00644
Epoch: 44484, Training Loss: 0.00668
Epoch: 44484, Training Loss: 0.00669
Epoch: 44484, Training Loss: 0.00820
Epoch: 44485, Training Loss: 0.00644
Epoch: 44485, Training Loss: 0.00668
Epoch: 44485, Training Loss: 0.00669
Epoch: 44485, Training Loss: 0.00820
Epoch: 44486, Training Loss: 0.00644
Epoch: 44486, Training Loss: 0.00668
Epoch: 44486, Training Loss: 0.00669
Epoch: 44486, Training Loss: 0.00820
Epoch: 44487, Training Loss: 0.00644
Epoch: 44487, Training Loss: 0.00668
Epoch: 44487, Training Loss: 0.00669
Epoch: 44487, Training Loss: 0.00820
Epoch: 44488, Training Loss: 0.00644
Epoch: 44488, Training Loss: 0.00668
Epoch: 44488, Training Loss: 0.00669
Epoch: 44488, Training Loss: 0.00820
Epoch: 44489, Training Loss: 0.00644
Epoch: 44489, Training Loss: 0.00668
Epoch: 44489, Training Loss: 0.00669
Epoch: 44489, Training Loss: 0.00820
Epoch: 44490, Training Loss: 0.00644
Epoch: 44490, Training Loss: 0.00668
Epoch: 44490, Training Loss: 0.00669
Epoch: 44490, Training Loss: 0.00820
Epoch: 44491, Training Loss: 0.00644
Epoch: 44491, Training Loss: 0.00668
Epoch: 44491, Training Loss: 0.00669
Epoch: 44491, Training Loss: 0.00820
Epoch: 44492, Training Loss: 0.00644
Epoch: 44492, Training Loss: 0.00668
Epoch: 44492, Training Loss: 0.00669
Epoch: 44492, Training Loss: 0.00820
Epoch: 44493, Training Loss: 0.00644
Epoch: 44493, Training Loss: 0.00668
Epoch: 44493, Training Loss: 0.00669
Epoch: 44493, Training Loss: 0.00820
Epoch: 44494, Training Loss: 0.00644
Epoch: 44494, Training Loss: 0.00668
Epoch: 44494, Training Loss: 0.00669
Epoch: 44494, Training Loss: 0.00820
Epoch: 44495, Training Loss: 0.00644
Epoch: 44495, Training Loss: 0.00668
Epoch: 44495, Training Loss: 0.00669
Epoch: 44495, Training Loss: 0.00820
Epoch: 44496, Training Loss: 0.00644
Epoch: 44496, Training Loss: 0.00668
Epoch: 44496, Training Loss: 0.00669
Epoch: 44496, Training Loss: 0.00820
Epoch: 44497, Training Loss: 0.00644
Epoch: 44497, Training Loss: 0.00668
Epoch: 44497, Training Loss: 0.00669
Epoch: 44497, Training Loss: 0.00820
Epoch: 44498, Training Loss: 0.00644
Epoch: 44498, Training Loss: 0.00668
Epoch: 44498, Training Loss: 0.00669
Epoch: 44498, Training Loss: 0.00820
Epoch: 44499, Training Loss: 0.00644
Epoch: 44499, Training Loss: 0.00668
Epoch: 44499, Training Loss: 0.00669
Epoch: 44499, Training Loss: 0.00820
Epoch: 44500, Training Loss: 0.00644
Epoch: 44500, Training Loss: 0.00668
Epoch: 44500, Training Loss: 0.00669
Epoch: 44500, Training Loss: 0.00820
Epoch: 44501, Training Loss: 0.00644
Epoch: 44501, Training Loss: 0.00668
Epoch: 44501, Training Loss: 0.00669
Epoch: 44501, Training Loss: 0.00820
Epoch: 44502, Training Loss: 0.00644
Epoch: 44502, Training Loss: 0.00668
Epoch: 44502, Training Loss: 0.00669
Epoch: 44502, Training Loss: 0.00820
Epoch: 44503, Training Loss: 0.00644
Epoch: 44503, Training Loss: 0.00668
Epoch: 44503, Training Loss: 0.00669
Epoch: 44503, Training Loss: 0.00820
Epoch: 44504, Training Loss: 0.00644
Epoch: 44504, Training Loss: 0.00668
Epoch: 44504, Training Loss: 0.00669
Epoch: 44504, Training Loss: 0.00820
Epoch: 44505, Training Loss: 0.00644
Epoch: 44505, Training Loss: 0.00668
Epoch: 44505, Training Loss: 0.00669
Epoch: 44505, Training Loss: 0.00820
Epoch: 44506, Training Loss: 0.00644
Epoch: 44506, Training Loss: 0.00668
Epoch: 44506, Training Loss: 0.00669
Epoch: 44506, Training Loss: 0.00820
Epoch: 44507, Training Loss: 0.00644
Epoch: 44507, Training Loss: 0.00668
Epoch: 44507, Training Loss: 0.00669
Epoch: 44507, Training Loss: 0.00820
Epoch: 44508, Training Loss: 0.00644
Epoch: 44508, Training Loss: 0.00668
Epoch: 44508, Training Loss: 0.00669
Epoch: 44508, Training Loss: 0.00820
Epoch: 44509, Training Loss: 0.00644
Epoch: 44509, Training Loss: 0.00668
Epoch: 44509, Training Loss: 0.00669
Epoch: 44509, Training Loss: 0.00820
Epoch: 44510, Training Loss: 0.00644
Epoch: 44510, Training Loss: 0.00668
Epoch: 44510, Training Loss: 0.00669
Epoch: 44510, Training Loss: 0.00820
Epoch: 44511, Training Loss: 0.00644
Epoch: 44511, Training Loss: 0.00668
Epoch: 44511, Training Loss: 0.00669
Epoch: 44511, Training Loss: 0.00820
Epoch: 44512, Training Loss: 0.00644
Epoch: 44512, Training Loss: 0.00668
Epoch: 44512, Training Loss: 0.00669
Epoch: 44512, Training Loss: 0.00820
Epoch: 44513, Training Loss: 0.00644
Epoch: 44513, Training Loss: 0.00668
Epoch: 44513, Training Loss: 0.00669
Epoch: 44513, Training Loss: 0.00820
Epoch: 44514, Training Loss: 0.00644
Epoch: 44514, Training Loss: 0.00668
Epoch: 44514, Training Loss: 0.00669
Epoch: 44514, Training Loss: 0.00820
Epoch: 44515, Training Loss: 0.00644
Epoch: 44515, Training Loss: 0.00668
Epoch: 44515, Training Loss: 0.00669
Epoch: 44515, Training Loss: 0.00820
Epoch: 44516, Training Loss: 0.00644
Epoch: 44516, Training Loss: 0.00668
Epoch: 44516, Training Loss: 0.00669
Epoch: 44516, Training Loss: 0.00820
Epoch: 44517, Training Loss: 0.00644
Epoch: 44517, Training Loss: 0.00668
Epoch: 44517, Training Loss: 0.00669
Epoch: 44517, Training Loss: 0.00820
Epoch: 44518, Training Loss: 0.00644
Epoch: 44518, Training Loss: 0.00668
Epoch: 44518, Training Loss: 0.00669
Epoch: 44518, Training Loss: 0.00820
Epoch: 44519, Training Loss: 0.00644
Epoch: 44519, Training Loss: 0.00668
Epoch: 44519, Training Loss: 0.00669
Epoch: 44519, Training Loss: 0.00820
Epoch: 44520, Training Loss: 0.00644
Epoch: 44520, Training Loss: 0.00668
Epoch: 44520, Training Loss: 0.00669
Epoch: 44520, Training Loss: 0.00820
Epoch: 44521, Training Loss: 0.00644
Epoch: 44521, Training Loss: 0.00668
Epoch: 44521, Training Loss: 0.00669
Epoch: 44521, Training Loss: 0.00819
Epoch: 44522, Training Loss: 0.00644
Epoch: 44522, Training Loss: 0.00668
Epoch: 44522, Training Loss: 0.00669
Epoch: 44522, Training Loss: 0.00819
Epoch: 44523, Training Loss: 0.00644
Epoch: 44523, Training Loss: 0.00668
Epoch: 44523, Training Loss: 0.00669
Epoch: 44523, Training Loss: 0.00819
Epoch: 44524, Training Loss: 0.00644
Epoch: 44524, Training Loss: 0.00668
Epoch: 44524, Training Loss: 0.00669
Epoch: 44524, Training Loss: 0.00819
Epoch: 44525, Training Loss: 0.00644
Epoch: 44525, Training Loss: 0.00668
Epoch: 44525, Training Loss: 0.00669
Epoch: 44525, Training Loss: 0.00819
Epoch: 44526, Training Loss: 0.00644
Epoch: 44526, Training Loss: 0.00668
Epoch: 44526, Training Loss: 0.00669
Epoch: 44526, Training Loss: 0.00819
Epoch: 44527, Training Loss: 0.00644
Epoch: 44527, Training Loss: 0.00668
Epoch: 44527, Training Loss: 0.00669
Epoch: 44527, Training Loss: 0.00819
Epoch: 44528, Training Loss: 0.00644
Epoch: 44528, Training Loss: 0.00668
Epoch: 44528, Training Loss: 0.00669
Epoch: 44528, Training Loss: 0.00819
Epoch: 44529, Training Loss: 0.00644
Epoch: 44529, Training Loss: 0.00668
Epoch: 44529, Training Loss: 0.00669
Epoch: 44529, Training Loss: 0.00819
Epoch: 44530, Training Loss: 0.00644
Epoch: 44530, Training Loss: 0.00668
Epoch: 44530, Training Loss: 0.00669
Epoch: 44530, Training Loss: 0.00819
Epoch: 44531, Training Loss: 0.00644
Epoch: 44531, Training Loss: 0.00668
Epoch: 44531, Training Loss: 0.00669
Epoch: 44531, Training Loss: 0.00819
Epoch: 44532, Training Loss: 0.00644
Epoch: 44532, Training Loss: 0.00668
Epoch: 44532, Training Loss: 0.00669
Epoch: 44532, Training Loss: 0.00819
Epoch: 44533, Training Loss: 0.00644
Epoch: 44533, Training Loss: 0.00668
Epoch: 44533, Training Loss: 0.00668
Epoch: 44533, Training Loss: 0.00819
Epoch: 44534, Training Loss: 0.00644
Epoch: 44534, Training Loss: 0.00668
Epoch: 44534, Training Loss: 0.00668
Epoch: 44534, Training Loss: 0.00819
Epoch: 44535, Training Loss: 0.00644
Epoch: 44535, Training Loss: 0.00668
Epoch: 44535, Training Loss: 0.00668
Epoch: 44535, Training Loss: 0.00819
Epoch: 44536, Training Loss: 0.00644
Epoch: 44536, Training Loss: 0.00668
Epoch: 44536, Training Loss: 0.00668
Epoch: 44536, Training Loss: 0.00819
Epoch: 44537, Training Loss: 0.00644
Epoch: 44537, Training Loss: 0.00668
Epoch: 44537, Training Loss: 0.00668
Epoch: 44537, Training Loss: 0.00819
Epoch: 44538, Training Loss: 0.00644
Epoch: 44538, Training Loss: 0.00668
Epoch: 44538, Training Loss: 0.00668
Epoch: 44538, Training Loss: 0.00819
Epoch: 44539, Training Loss: 0.00644
Epoch: 44539, Training Loss: 0.00668
Epoch: 44539, Training Loss: 0.00668
Epoch: 44539, Training Loss: 0.00819
Epoch: 44540, Training Loss: 0.00644
Epoch: 44540, Training Loss: 0.00668
Epoch: 44540, Training Loss: 0.00668
Epoch: 44540, Training Loss: 0.00819
Epoch: 44541, Training Loss: 0.00644
Epoch: 44541, Training Loss: 0.00668
Epoch: 44541, Training Loss: 0.00668
Epoch: 44541, Training Loss: 0.00819
Epoch: 44542, Training Loss: 0.00644
Epoch: 44542, Training Loss: 0.00668
Epoch: 44542, Training Loss: 0.00668
Epoch: 44542, Training Loss: 0.00819
Epoch: 44543, Training Loss: 0.00644
Epoch: 44543, Training Loss: 0.00668
Epoch: 44543, Training Loss: 0.00668
Epoch: 44543, Training Loss: 0.00819
Epoch: 44544, Training Loss: 0.00644
Epoch: 44544, Training Loss: 0.00668
Epoch: 44544, Training Loss: 0.00668
Epoch: 44544, Training Loss: 0.00819
Epoch: 44545, Training Loss: 0.00644
Epoch: 44545, Training Loss: 0.00668
Epoch: 44545, Training Loss: 0.00668
Epoch: 44545, Training Loss: 0.00819
Epoch: 44546, Training Loss: 0.00644
Epoch: 44546, Training Loss: 0.00668
Epoch: 44546, Training Loss: 0.00668
Epoch: 44546, Training Loss: 0.00819
Epoch: 44547, Training Loss: 0.00644
Epoch: 44547, Training Loss: 0.00668
Epoch: 44547, Training Loss: 0.00668
Epoch: 44547, Training Loss: 0.00819
Epoch: 44548, Training Loss: 0.00643
Epoch: 44548, Training Loss: 0.00668
Epoch: 44548, Training Loss: 0.00668
Epoch: 44548, Training Loss: 0.00819
Epoch: 44549, Training Loss: 0.00643
Epoch: 44549, Training Loss: 0.00668
Epoch: 44549, Training Loss: 0.00668
Epoch: 44549, Training Loss: 0.00819
Epoch: 44550, Training Loss: 0.00643
Epoch: 44550, Training Loss: 0.00668
Epoch: 44550, Training Loss: 0.00668
Epoch: 44550, Training Loss: 0.00819
Epoch: 44551, Training Loss: 0.00643
Epoch: 44551, Training Loss: 0.00668
Epoch: 44551, Training Loss: 0.00668
Epoch: 44551, Training Loss: 0.00819
Epoch: 44552, Training Loss: 0.00643
Epoch: 44552, Training Loss: 0.00668
Epoch: 44552, Training Loss: 0.00668
Epoch: 44552, Training Loss: 0.00819
Epoch: 44553, Training Loss: 0.00643
Epoch: 44553, Training Loss: 0.00668
Epoch: 44553, Training Loss: 0.00668
Epoch: 44553, Training Loss: 0.00819
Epoch: 44554, Training Loss: 0.00643
Epoch: 44554, Training Loss: 0.00668
Epoch: 44554, Training Loss: 0.00668
Epoch: 44554, Training Loss: 0.00819
Epoch: 44555, Training Loss: 0.00643
Epoch: 44555, Training Loss: 0.00668
Epoch: 44555, Training Loss: 0.00668
Epoch: 44555, Training Loss: 0.00819
Epoch: 44556, Training Loss: 0.00643
Epoch: 44556, Training Loss: 0.00668
Epoch: 44556, Training Loss: 0.00668
Epoch: 44556, Training Loss: 0.00819
Epoch: 44557, Training Loss: 0.00643
Epoch: 44557, Training Loss: 0.00668
Epoch: 44557, Training Loss: 0.00668
Epoch: 44557, Training Loss: 0.00819
Epoch: 44558, Training Loss: 0.00643
Epoch: 44558, Training Loss: 0.00668
Epoch: 44558, Training Loss: 0.00668
Epoch: 44558, Training Loss: 0.00819
Epoch: 44559, Training Loss: 0.00643
Epoch: 44559, Training Loss: 0.00668
Epoch: 44559, Training Loss: 0.00668
Epoch: 44559, Training Loss: 0.00819
Epoch: 44560, Training Loss: 0.00643
Epoch: 44560, Training Loss: 0.00668
Epoch: 44560, Training Loss: 0.00668
Epoch: 44560, Training Loss: 0.00819
Epoch: 44561, Training Loss: 0.00643
Epoch: 44561, Training Loss: 0.00668
Epoch: 44561, Training Loss: 0.00668
Epoch: 44561, Training Loss: 0.00819
Epoch: 44562, Training Loss: 0.00643
Epoch: 44562, Training Loss: 0.00668
Epoch: 44562, Training Loss: 0.00668
Epoch: 44562, Training Loss: 0.00819
Epoch: 44563, Training Loss: 0.00643
Epoch: 44563, Training Loss: 0.00668
Epoch: 44563, Training Loss: 0.00668
Epoch: 44563, Training Loss: 0.00819
Epoch: 44564, Training Loss: 0.00643
Epoch: 44564, Training Loss: 0.00668
Epoch: 44564, Training Loss: 0.00668
Epoch: 44564, Training Loss: 0.00819
Epoch: 44565, Training Loss: 0.00643
Epoch: 44565, Training Loss: 0.00668
Epoch: 44565, Training Loss: 0.00668
Epoch: 44565, Training Loss: 0.00819
Epoch: 44566, Training Loss: 0.00643
Epoch: 44566, Training Loss: 0.00667
Epoch: 44566, Training Loss: 0.00668
Epoch: 44566, Training Loss: 0.00819
Epoch: 44567, Training Loss: 0.00643
Epoch: 44567, Training Loss: 0.00667
Epoch: 44567, Training Loss: 0.00668
Epoch: 44567, Training Loss: 0.00819
Epoch: 44568, Training Loss: 0.00643
Epoch: 44568, Training Loss: 0.00667
Epoch: 44568, Training Loss: 0.00668
Epoch: 44568, Training Loss: 0.00819
Epoch: 44569, Training Loss: 0.00643
Epoch: 44569, Training Loss: 0.00667
Epoch: 44569, Training Loss: 0.00668
Epoch: 44569, Training Loss: 0.00819
Epoch: 44570, Training Loss: 0.00643
Epoch: 44570, Training Loss: 0.00667
Epoch: 44570, Training Loss: 0.00668
Epoch: 44570, Training Loss: 0.00819
Epoch: 44571, Training Loss: 0.00643
Epoch: 44571, Training Loss: 0.00667
Epoch: 44571, Training Loss: 0.00668
Epoch: 44571, Training Loss: 0.00819
Epoch: 44572, Training Loss: 0.00643
Epoch: 44572, Training Loss: 0.00667
Epoch: 44572, Training Loss: 0.00668
Epoch: 44572, Training Loss: 0.00819
Epoch: 44573, Training Loss: 0.00643
Epoch: 44573, Training Loss: 0.00667
Epoch: 44573, Training Loss: 0.00668
Epoch: 44573, Training Loss: 0.00819
Epoch: 44574, Training Loss: 0.00643
Epoch: 44574, Training Loss: 0.00667
Epoch: 44574, Training Loss: 0.00668
Epoch: 44574, Training Loss: 0.00819
Epoch: 44575, Training Loss: 0.00643
Epoch: 44575, Training Loss: 0.00667
Epoch: 44575, Training Loss: 0.00668
Epoch: 44575, Training Loss: 0.00819
Epoch: 44576, Training Loss: 0.00643
Epoch: 44576, Training Loss: 0.00667
Epoch: 44576, Training Loss: 0.00668
Epoch: 44576, Training Loss: 0.00819
Epoch: 44577, Training Loss: 0.00643
Epoch: 44577, Training Loss: 0.00667
Epoch: 44577, Training Loss: 0.00668
Epoch: 44577, Training Loss: 0.00819
Epoch: 44578, Training Loss: 0.00643
Epoch: 44578, Training Loss: 0.00667
Epoch: 44578, Training Loss: 0.00668
Epoch: 44578, Training Loss: 0.00819
Epoch: 44579, Training Loss: 0.00643
Epoch: 44579, Training Loss: 0.00667
Epoch: 44579, Training Loss: 0.00668
Epoch: 44579, Training Loss: 0.00819
Epoch: 44580, Training Loss: 0.00643
Epoch: 44580, Training Loss: 0.00667
Epoch: 44580, Training Loss: 0.00668
Epoch: 44580, Training Loss: 0.00819
Epoch: 44581, Training Loss: 0.00643
Epoch: 44581, Training Loss: 0.00667
Epoch: 44581, Training Loss: 0.00668
Epoch: 44581, Training Loss: 0.00819
Epoch: 44582, Training Loss: 0.00643
Epoch: 44582, Training Loss: 0.00667
Epoch: 44582, Training Loss: 0.00668
Epoch: 44582, Training Loss: 0.00819
Epoch: 44583, Training Loss: 0.00643
Epoch: 44583, Training Loss: 0.00667
Epoch: 44583, Training Loss: 0.00668
Epoch: 44583, Training Loss: 0.00819
Epoch: 44584, Training Loss: 0.00643
Epoch: 44584, Training Loss: 0.00667
Epoch: 44584, Training Loss: 0.00668
Epoch: 44584, Training Loss: 0.00819
Epoch: 44585, Training Loss: 0.00643
Epoch: 44585, Training Loss: 0.00667
Epoch: 44585, Training Loss: 0.00668
Epoch: 44585, Training Loss: 0.00819
Epoch: 44586, Training Loss: 0.00643
Epoch: 44586, Training Loss: 0.00667
Epoch: 44586, Training Loss: 0.00668
Epoch: 44586, Training Loss: 0.00819
Epoch: 44587, Training Loss: 0.00643
Epoch: 44587, Training Loss: 0.00667
Epoch: 44587, Training Loss: 0.00668
Epoch: 44587, Training Loss: 0.00819
Epoch: 44588, Training Loss: 0.00643
Epoch: 44588, Training Loss: 0.00667
Epoch: 44588, Training Loss: 0.00668
Epoch: 44588, Training Loss: 0.00819
Epoch: 44589, Training Loss: 0.00643
Epoch: 44589, Training Loss: 0.00667
Epoch: 44589, Training Loss: 0.00668
Epoch: 44589, Training Loss: 0.00819
Epoch: 44590, Training Loss: 0.00643
Epoch: 44590, Training Loss: 0.00667
Epoch: 44590, Training Loss: 0.00668
Epoch: 44590, Training Loss: 0.00819
Epoch: 44591, Training Loss: 0.00643
Epoch: 44591, Training Loss: 0.00667
Epoch: 44591, Training Loss: 0.00668
Epoch: 44591, Training Loss: 0.00819
Epoch: 44592, Training Loss: 0.00643
Epoch: 44592, Training Loss: 0.00667
Epoch: 44592, Training Loss: 0.00668
Epoch: 44592, Training Loss: 0.00819
Epoch: 44593, Training Loss: 0.00643
Epoch: 44593, Training Loss: 0.00667
Epoch: 44593, Training Loss: 0.00668
Epoch: 44593, Training Loss: 0.00819
Epoch: 44594, Training Loss: 0.00643
Epoch: 44594, Training Loss: 0.00667
Epoch: 44594, Training Loss: 0.00668
Epoch: 44594, Training Loss: 0.00819
Epoch: 44595, Training Loss: 0.00643
Epoch: 44595, Training Loss: 0.00667
Epoch: 44595, Training Loss: 0.00668
Epoch: 44595, Training Loss: 0.00819
[... repetitive per-pattern loss output for epochs 44595-44839 elided: the four training losses decrease only marginally, from 0.00643/0.00667/0.00668/0.00819 to 0.00641/0.00665/0.00666/0.00816 ...]
Epoch: 44839, Training Loss: 0.00641
Epoch: 44839, Training Loss: 0.00665
Epoch: 44839, Training Loss: 0.00666
Epoch: 44839, Training Loss: 0.00816
Epoch: 44840, Training Loss: 0.00641
Epoch: 44840, Training Loss: 0.00665
Epoch: 44840, Training Loss: 0.00666
Epoch: 44840, Training Loss: 0.00816
Epoch: 44841, Training Loss: 0.00641
Epoch: 44841, Training Loss: 0.00665
Epoch: 44841, Training Loss: 0.00666
Epoch: 44841, Training Loss: 0.00816
Epoch: 44842, Training Loss: 0.00641
Epoch: 44842, Training Loss: 0.00665
Epoch: 44842, Training Loss: 0.00666
Epoch: 44842, Training Loss: 0.00816
Epoch: 44843, Training Loss: 0.00641
Epoch: 44843, Training Loss: 0.00665
Epoch: 44843, Training Loss: 0.00666
Epoch: 44843, Training Loss: 0.00816
Epoch: 44844, Training Loss: 0.00641
Epoch: 44844, Training Loss: 0.00665
Epoch: 44844, Training Loss: 0.00666
Epoch: 44844, Training Loss: 0.00816
Epoch: 44845, Training Loss: 0.00641
Epoch: 44845, Training Loss: 0.00665
Epoch: 44845, Training Loss: 0.00666
Epoch: 44845, Training Loss: 0.00816
Epoch: 44846, Training Loss: 0.00641
Epoch: 44846, Training Loss: 0.00665
Epoch: 44846, Training Loss: 0.00666
Epoch: 44846, Training Loss: 0.00816
Epoch: 44847, Training Loss: 0.00641
Epoch: 44847, Training Loss: 0.00665
Epoch: 44847, Training Loss: 0.00666
Epoch: 44847, Training Loss: 0.00816
Epoch: 44848, Training Loss: 0.00641
Epoch: 44848, Training Loss: 0.00665
Epoch: 44848, Training Loss: 0.00666
Epoch: 44848, Training Loss: 0.00816
Epoch: 44849, Training Loss: 0.00641
Epoch: 44849, Training Loss: 0.00665
Epoch: 44849, Training Loss: 0.00666
Epoch: 44849, Training Loss: 0.00816
Epoch: 44850, Training Loss: 0.00641
Epoch: 44850, Training Loss: 0.00665
Epoch: 44850, Training Loss: 0.00666
Epoch: 44850, Training Loss: 0.00816
Epoch: 44851, Training Loss: 0.00641
Epoch: 44851, Training Loss: 0.00665
Epoch: 44851, Training Loss: 0.00666
Epoch: 44851, Training Loss: 0.00816
Epoch: 44852, Training Loss: 0.00641
Epoch: 44852, Training Loss: 0.00665
Epoch: 44852, Training Loss: 0.00666
Epoch: 44852, Training Loss: 0.00816
Epoch: 44853, Training Loss: 0.00641
Epoch: 44853, Training Loss: 0.00665
Epoch: 44853, Training Loss: 0.00666
Epoch: 44853, Training Loss: 0.00816
Epoch: 44854, Training Loss: 0.00641
Epoch: 44854, Training Loss: 0.00665
Epoch: 44854, Training Loss: 0.00666
Epoch: 44854, Training Loss: 0.00816
Epoch: 44855, Training Loss: 0.00641
Epoch: 44855, Training Loss: 0.00665
Epoch: 44855, Training Loss: 0.00666
Epoch: 44855, Training Loss: 0.00816
Epoch: 44856, Training Loss: 0.00641
Epoch: 44856, Training Loss: 0.00665
Epoch: 44856, Training Loss: 0.00666
Epoch: 44856, Training Loss: 0.00816
Epoch: 44857, Training Loss: 0.00641
Epoch: 44857, Training Loss: 0.00665
Epoch: 44857, Training Loss: 0.00666
Epoch: 44857, Training Loss: 0.00816
Epoch: 44858, Training Loss: 0.00641
Epoch: 44858, Training Loss: 0.00665
Epoch: 44858, Training Loss: 0.00666
Epoch: 44858, Training Loss: 0.00816
Epoch: 44859, Training Loss: 0.00641
Epoch: 44859, Training Loss: 0.00665
Epoch: 44859, Training Loss: 0.00666
Epoch: 44859, Training Loss: 0.00816
Epoch: 44860, Training Loss: 0.00641
Epoch: 44860, Training Loss: 0.00665
Epoch: 44860, Training Loss: 0.00666
Epoch: 44860, Training Loss: 0.00816
Epoch: 44861, Training Loss: 0.00641
Epoch: 44861, Training Loss: 0.00665
Epoch: 44861, Training Loss: 0.00666
Epoch: 44861, Training Loss: 0.00816
Epoch: 44862, Training Loss: 0.00641
Epoch: 44862, Training Loss: 0.00665
Epoch: 44862, Training Loss: 0.00666
Epoch: 44862, Training Loss: 0.00816
Epoch: 44863, Training Loss: 0.00641
Epoch: 44863, Training Loss: 0.00665
Epoch: 44863, Training Loss: 0.00666
Epoch: 44863, Training Loss: 0.00816
Epoch: 44864, Training Loss: 0.00641
Epoch: 44864, Training Loss: 0.00665
Epoch: 44864, Training Loss: 0.00666
Epoch: 44864, Training Loss: 0.00816
Epoch: 44865, Training Loss: 0.00641
Epoch: 44865, Training Loss: 0.00665
Epoch: 44865, Training Loss: 0.00666
Epoch: 44865, Training Loss: 0.00816
Epoch: 44866, Training Loss: 0.00641
Epoch: 44866, Training Loss: 0.00665
Epoch: 44866, Training Loss: 0.00666
Epoch: 44866, Training Loss: 0.00816
Epoch: 44867, Training Loss: 0.00641
Epoch: 44867, Training Loss: 0.00665
Epoch: 44867, Training Loss: 0.00666
Epoch: 44867, Training Loss: 0.00816
Epoch: 44868, Training Loss: 0.00641
Epoch: 44868, Training Loss: 0.00665
Epoch: 44868, Training Loss: 0.00666
Epoch: 44868, Training Loss: 0.00816
Epoch: 44869, Training Loss: 0.00641
Epoch: 44869, Training Loss: 0.00665
Epoch: 44869, Training Loss: 0.00666
Epoch: 44869, Training Loss: 0.00816
Epoch: 44870, Training Loss: 0.00641
Epoch: 44870, Training Loss: 0.00665
Epoch: 44870, Training Loss: 0.00666
Epoch: 44870, Training Loss: 0.00816
Epoch: 44871, Training Loss: 0.00641
Epoch: 44871, Training Loss: 0.00665
Epoch: 44871, Training Loss: 0.00666
Epoch: 44871, Training Loss: 0.00816
Epoch: 44872, Training Loss: 0.00641
Epoch: 44872, Training Loss: 0.00665
Epoch: 44872, Training Loss: 0.00666
Epoch: 44872, Training Loss: 0.00816
Epoch: 44873, Training Loss: 0.00641
Epoch: 44873, Training Loss: 0.00665
Epoch: 44873, Training Loss: 0.00666
Epoch: 44873, Training Loss: 0.00816
Epoch: 44874, Training Loss: 0.00641
Epoch: 44874, Training Loss: 0.00665
Epoch: 44874, Training Loss: 0.00666
Epoch: 44874, Training Loss: 0.00816
Epoch: 44875, Training Loss: 0.00641
Epoch: 44875, Training Loss: 0.00665
Epoch: 44875, Training Loss: 0.00666
Epoch: 44875, Training Loss: 0.00816
Epoch: 44876, Training Loss: 0.00641
Epoch: 44876, Training Loss: 0.00665
Epoch: 44876, Training Loss: 0.00666
Epoch: 44876, Training Loss: 0.00816
Epoch: 44877, Training Loss: 0.00641
Epoch: 44877, Training Loss: 0.00665
Epoch: 44877, Training Loss: 0.00666
Epoch: 44877, Training Loss: 0.00816
Epoch: 44878, Training Loss: 0.00641
Epoch: 44878, Training Loss: 0.00665
Epoch: 44878, Training Loss: 0.00666
Epoch: 44878, Training Loss: 0.00816
Epoch: 44879, Training Loss: 0.00641
Epoch: 44879, Training Loss: 0.00665
Epoch: 44879, Training Loss: 0.00666
Epoch: 44879, Training Loss: 0.00816
Epoch: 44880, Training Loss: 0.00641
Epoch: 44880, Training Loss: 0.00665
Epoch: 44880, Training Loss: 0.00666
Epoch: 44880, Training Loss: 0.00816
Epoch: 44881, Training Loss: 0.00641
Epoch: 44881, Training Loss: 0.00665
Epoch: 44881, Training Loss: 0.00666
Epoch: 44881, Training Loss: 0.00816
Epoch: 44882, Training Loss: 0.00641
Epoch: 44882, Training Loss: 0.00665
Epoch: 44882, Training Loss: 0.00666
Epoch: 44882, Training Loss: 0.00816
Epoch: 44883, Training Loss: 0.00641
Epoch: 44883, Training Loss: 0.00665
Epoch: 44883, Training Loss: 0.00666
Epoch: 44883, Training Loss: 0.00816
Epoch: 44884, Training Loss: 0.00641
Epoch: 44884, Training Loss: 0.00665
Epoch: 44884, Training Loss: 0.00666
Epoch: 44884, Training Loss: 0.00816
Epoch: 44885, Training Loss: 0.00641
Epoch: 44885, Training Loss: 0.00665
Epoch: 44885, Training Loss: 0.00666
Epoch: 44885, Training Loss: 0.00816
Epoch: 44886, Training Loss: 0.00641
Epoch: 44886, Training Loss: 0.00665
Epoch: 44886, Training Loss: 0.00666
Epoch: 44886, Training Loss: 0.00816
Epoch: 44887, Training Loss: 0.00641
Epoch: 44887, Training Loss: 0.00665
Epoch: 44887, Training Loss: 0.00666
Epoch: 44887, Training Loss: 0.00816
Epoch: 44888, Training Loss: 0.00641
Epoch: 44888, Training Loss: 0.00665
Epoch: 44888, Training Loss: 0.00666
Epoch: 44888, Training Loss: 0.00816
Epoch: 44889, Training Loss: 0.00641
Epoch: 44889, Training Loss: 0.00665
Epoch: 44889, Training Loss: 0.00666
Epoch: 44889, Training Loss: 0.00816
Epoch: 44890, Training Loss: 0.00641
Epoch: 44890, Training Loss: 0.00665
Epoch: 44890, Training Loss: 0.00666
Epoch: 44890, Training Loss: 0.00816
Epoch: 44891, Training Loss: 0.00641
Epoch: 44891, Training Loss: 0.00665
Epoch: 44891, Training Loss: 0.00666
Epoch: 44891, Training Loss: 0.00816
Epoch: 44892, Training Loss: 0.00641
Epoch: 44892, Training Loss: 0.00665
Epoch: 44892, Training Loss: 0.00666
Epoch: 44892, Training Loss: 0.00816
Epoch: 44893, Training Loss: 0.00641
Epoch: 44893, Training Loss: 0.00665
Epoch: 44893, Training Loss: 0.00666
Epoch: 44893, Training Loss: 0.00816
Epoch: 44894, Training Loss: 0.00641
Epoch: 44894, Training Loss: 0.00665
Epoch: 44894, Training Loss: 0.00666
Epoch: 44894, Training Loss: 0.00816
Epoch: 44895, Training Loss: 0.00641
Epoch: 44895, Training Loss: 0.00665
Epoch: 44895, Training Loss: 0.00666
Epoch: 44895, Training Loss: 0.00816
Epoch: 44896, Training Loss: 0.00641
Epoch: 44896, Training Loss: 0.00665
Epoch: 44896, Training Loss: 0.00666
Epoch: 44896, Training Loss: 0.00816
Epoch: 44897, Training Loss: 0.00641
Epoch: 44897, Training Loss: 0.00665
Epoch: 44897, Training Loss: 0.00666
Epoch: 44897, Training Loss: 0.00816
Epoch: 44898, Training Loss: 0.00641
Epoch: 44898, Training Loss: 0.00665
Epoch: 44898, Training Loss: 0.00666
Epoch: 44898, Training Loss: 0.00816
Epoch: 44899, Training Loss: 0.00641
Epoch: 44899, Training Loss: 0.00665
Epoch: 44899, Training Loss: 0.00666
Epoch: 44899, Training Loss: 0.00816
Epoch: 44900, Training Loss: 0.00641
Epoch: 44900, Training Loss: 0.00665
Epoch: 44900, Training Loss: 0.00666
Epoch: 44900, Training Loss: 0.00816
Epoch: 44901, Training Loss: 0.00641
Epoch: 44901, Training Loss: 0.00665
Epoch: 44901, Training Loss: 0.00666
Epoch: 44901, Training Loss: 0.00816
Epoch: 44902, Training Loss: 0.00641
Epoch: 44902, Training Loss: 0.00665
Epoch: 44902, Training Loss: 0.00665
Epoch: 44902, Training Loss: 0.00816
Epoch: 44903, Training Loss: 0.00641
Epoch: 44903, Training Loss: 0.00665
Epoch: 44903, Training Loss: 0.00665
Epoch: 44903, Training Loss: 0.00816
Epoch: 44904, Training Loss: 0.00641
Epoch: 44904, Training Loss: 0.00665
Epoch: 44904, Training Loss: 0.00665
Epoch: 44904, Training Loss: 0.00816
Epoch: 44905, Training Loss: 0.00641
Epoch: 44905, Training Loss: 0.00665
Epoch: 44905, Training Loss: 0.00665
Epoch: 44905, Training Loss: 0.00816
Epoch: 44906, Training Loss: 0.00641
Epoch: 44906, Training Loss: 0.00665
Epoch: 44906, Training Loss: 0.00665
Epoch: 44906, Training Loss: 0.00816
Epoch: 44907, Training Loss: 0.00641
Epoch: 44907, Training Loss: 0.00665
Epoch: 44907, Training Loss: 0.00665
Epoch: 44907, Training Loss: 0.00816
Epoch: 44908, Training Loss: 0.00641
Epoch: 44908, Training Loss: 0.00665
Epoch: 44908, Training Loss: 0.00665
Epoch: 44908, Training Loss: 0.00816
Epoch: 44909, Training Loss: 0.00641
Epoch: 44909, Training Loss: 0.00665
Epoch: 44909, Training Loss: 0.00665
Epoch: 44909, Training Loss: 0.00816
Epoch: 44910, Training Loss: 0.00641
Epoch: 44910, Training Loss: 0.00665
Epoch: 44910, Training Loss: 0.00665
Epoch: 44910, Training Loss: 0.00816
Epoch: 44911, Training Loss: 0.00641
Epoch: 44911, Training Loss: 0.00665
Epoch: 44911, Training Loss: 0.00665
Epoch: 44911, Training Loss: 0.00816
Epoch: 44912, Training Loss: 0.00641
Epoch: 44912, Training Loss: 0.00665
Epoch: 44912, Training Loss: 0.00665
Epoch: 44912, Training Loss: 0.00816
Epoch: 44913, Training Loss: 0.00641
Epoch: 44913, Training Loss: 0.00665
Epoch: 44913, Training Loss: 0.00665
Epoch: 44913, Training Loss: 0.00816
Epoch: 44914, Training Loss: 0.00641
Epoch: 44914, Training Loss: 0.00665
Epoch: 44914, Training Loss: 0.00665
Epoch: 44914, Training Loss: 0.00816
Epoch: 44915, Training Loss: 0.00641
Epoch: 44915, Training Loss: 0.00665
Epoch: 44915, Training Loss: 0.00665
Epoch: 44915, Training Loss: 0.00816
Epoch: 44916, Training Loss: 0.00641
Epoch: 44916, Training Loss: 0.00665
Epoch: 44916, Training Loss: 0.00665
Epoch: 44916, Training Loss: 0.00816
Epoch: 44917, Training Loss: 0.00641
Epoch: 44917, Training Loss: 0.00665
Epoch: 44917, Training Loss: 0.00665
Epoch: 44917, Training Loss: 0.00816
Epoch: 44918, Training Loss: 0.00641
Epoch: 44918, Training Loss: 0.00665
Epoch: 44918, Training Loss: 0.00665
Epoch: 44918, Training Loss: 0.00816
Epoch: 44919, Training Loss: 0.00641
Epoch: 44919, Training Loss: 0.00665
Epoch: 44919, Training Loss: 0.00665
Epoch: 44919, Training Loss: 0.00815
Epoch: 44920, Training Loss: 0.00641
Epoch: 44920, Training Loss: 0.00665
Epoch: 44920, Training Loss: 0.00665
Epoch: 44920, Training Loss: 0.00815
Epoch: 44921, Training Loss: 0.00641
Epoch: 44921, Training Loss: 0.00665
Epoch: 44921, Training Loss: 0.00665
Epoch: 44921, Training Loss: 0.00815
Epoch: 44922, Training Loss: 0.00641
Epoch: 44922, Training Loss: 0.00665
Epoch: 44922, Training Loss: 0.00665
Epoch: 44922, Training Loss: 0.00815
Epoch: 44923, Training Loss: 0.00641
Epoch: 44923, Training Loss: 0.00665
Epoch: 44923, Training Loss: 0.00665
Epoch: 44923, Training Loss: 0.00815
Epoch: 44924, Training Loss: 0.00641
Epoch: 44924, Training Loss: 0.00665
Epoch: 44924, Training Loss: 0.00665
Epoch: 44924, Training Loss: 0.00815
Epoch: 44925, Training Loss: 0.00641
Epoch: 44925, Training Loss: 0.00665
Epoch: 44925, Training Loss: 0.00665
Epoch: 44925, Training Loss: 0.00815
Epoch: 44926, Training Loss: 0.00641
Epoch: 44926, Training Loss: 0.00665
Epoch: 44926, Training Loss: 0.00665
Epoch: 44926, Training Loss: 0.00815
Epoch: 44927, Training Loss: 0.00641
Epoch: 44927, Training Loss: 0.00665
Epoch: 44927, Training Loss: 0.00665
Epoch: 44927, Training Loss: 0.00815
Epoch: 44928, Training Loss: 0.00641
Epoch: 44928, Training Loss: 0.00665
Epoch: 44928, Training Loss: 0.00665
Epoch: 44928, Training Loss: 0.00815
Epoch: 44929, Training Loss: 0.00641
Epoch: 44929, Training Loss: 0.00665
Epoch: 44929, Training Loss: 0.00665
Epoch: 44929, Training Loss: 0.00815
Epoch: 44930, Training Loss: 0.00641
Epoch: 44930, Training Loss: 0.00665
Epoch: 44930, Training Loss: 0.00665
Epoch: 44930, Training Loss: 0.00815
Epoch: 44931, Training Loss: 0.00641
Epoch: 44931, Training Loss: 0.00665
Epoch: 44931, Training Loss: 0.00665
Epoch: 44931, Training Loss: 0.00815
Epoch: 44932, Training Loss: 0.00641
Epoch: 44932, Training Loss: 0.00665
Epoch: 44932, Training Loss: 0.00665
Epoch: 44932, Training Loss: 0.00815
Epoch: 44933, Training Loss: 0.00641
Epoch: 44933, Training Loss: 0.00665
Epoch: 44933, Training Loss: 0.00665
Epoch: 44933, Training Loss: 0.00815
Epoch: 44934, Training Loss: 0.00641
Epoch: 44934, Training Loss: 0.00665
Epoch: 44934, Training Loss: 0.00665
Epoch: 44934, Training Loss: 0.00815
Epoch: 44935, Training Loss: 0.00641
Epoch: 44935, Training Loss: 0.00665
Epoch: 44935, Training Loss: 0.00665
Epoch: 44935, Training Loss: 0.00815
Epoch: 44936, Training Loss: 0.00641
Epoch: 44936, Training Loss: 0.00664
Epoch: 44936, Training Loss: 0.00665
Epoch: 44936, Training Loss: 0.00815
Epoch: 44937, Training Loss: 0.00641
Epoch: 44937, Training Loss: 0.00664
Epoch: 44937, Training Loss: 0.00665
Epoch: 44937, Training Loss: 0.00815
Epoch: 44938, Training Loss: 0.00641
Epoch: 44938, Training Loss: 0.00664
Epoch: 44938, Training Loss: 0.00665
Epoch: 44938, Training Loss: 0.00815
Epoch: 44939, Training Loss: 0.00640
Epoch: 44939, Training Loss: 0.00664
Epoch: 44939, Training Loss: 0.00665
Epoch: 44939, Training Loss: 0.00815
Epoch: 44940, Training Loss: 0.00640
Epoch: 44940, Training Loss: 0.00664
Epoch: 44940, Training Loss: 0.00665
Epoch: 44940, Training Loss: 0.00815
Epoch: 44941, Training Loss: 0.00640
Epoch: 44941, Training Loss: 0.00664
Epoch: 44941, Training Loss: 0.00665
Epoch: 44941, Training Loss: 0.00815
Epoch: 44942, Training Loss: 0.00640
Epoch: 44942, Training Loss: 0.00664
Epoch: 44942, Training Loss: 0.00665
Epoch: 44942, Training Loss: 0.00815
Epoch: 44943, Training Loss: 0.00640
Epoch: 44943, Training Loss: 0.00664
Epoch: 44943, Training Loss: 0.00665
Epoch: 44943, Training Loss: 0.00815
Epoch: 44944, Training Loss: 0.00640
Epoch: 44944, Training Loss: 0.00664
Epoch: 44944, Training Loss: 0.00665
Epoch: 44944, Training Loss: 0.00815
Epoch: 44945, Training Loss: 0.00640
Epoch: 44945, Training Loss: 0.00664
Epoch: 44945, Training Loss: 0.00665
Epoch: 44945, Training Loss: 0.00815
Epoch: 44946, Training Loss: 0.00640
Epoch: 44946, Training Loss: 0.00664
Epoch: 44946, Training Loss: 0.00665
Epoch: 44946, Training Loss: 0.00815
Epoch: 44947, Training Loss: 0.00640
Epoch: 44947, Training Loss: 0.00664
Epoch: 44947, Training Loss: 0.00665
Epoch: 44947, Training Loss: 0.00815
Epoch: 44948, Training Loss: 0.00640
Epoch: 44948, Training Loss: 0.00664
Epoch: 44948, Training Loss: 0.00665
Epoch: 44948, Training Loss: 0.00815
Epoch: 44949, Training Loss: 0.00640
Epoch: 44949, Training Loss: 0.00664
Epoch: 44949, Training Loss: 0.00665
Epoch: 44949, Training Loss: 0.00815
Epoch: 44950, Training Loss: 0.00640
Epoch: 44950, Training Loss: 0.00664
Epoch: 44950, Training Loss: 0.00665
Epoch: 44950, Training Loss: 0.00815
Epoch: 44951, Training Loss: 0.00640
Epoch: 44951, Training Loss: 0.00664
Epoch: 44951, Training Loss: 0.00665
Epoch: 44951, Training Loss: 0.00815
Epoch: 44952, Training Loss: 0.00640
Epoch: 44952, Training Loss: 0.00664
Epoch: 44952, Training Loss: 0.00665
Epoch: 44952, Training Loss: 0.00815
Epoch: 44953, Training Loss: 0.00640
Epoch: 44953, Training Loss: 0.00664
Epoch: 44953, Training Loss: 0.00665
Epoch: 44953, Training Loss: 0.00815
Epoch: 44954, Training Loss: 0.00640
Epoch: 44954, Training Loss: 0.00664
Epoch: 44954, Training Loss: 0.00665
Epoch: 44954, Training Loss: 0.00815
Epoch: 44955, Training Loss: 0.00640
Epoch: 44955, Training Loss: 0.00664
Epoch: 44955, Training Loss: 0.00665
Epoch: 44955, Training Loss: 0.00815
Epoch: 44956, Training Loss: 0.00640
Epoch: 44956, Training Loss: 0.00664
Epoch: 44956, Training Loss: 0.00665
Epoch: 44956, Training Loss: 0.00815
Epoch: 44957, Training Loss: 0.00640
Epoch: 44957, Training Loss: 0.00664
Epoch: 44957, Training Loss: 0.00665
Epoch: 44957, Training Loss: 0.00815
Epoch: 44958, Training Loss: 0.00640
Epoch: 44958, Training Loss: 0.00664
Epoch: 44958, Training Loss: 0.00665
Epoch: 44958, Training Loss: 0.00815
Epoch: 44959, Training Loss: 0.00640
Epoch: 44959, Training Loss: 0.00664
Epoch: 44959, Training Loss: 0.00665
Epoch: 44959, Training Loss: 0.00815
Epoch: 44960, Training Loss: 0.00640
Epoch: 44960, Training Loss: 0.00664
Epoch: 44960, Training Loss: 0.00665
Epoch: 44960, Training Loss: 0.00815
Epoch: 44961, Training Loss: 0.00640
Epoch: 44961, Training Loss: 0.00664
Epoch: 44961, Training Loss: 0.00665
Epoch: 44961, Training Loss: 0.00815
Epoch: 44962, Training Loss: 0.00640
Epoch: 44962, Training Loss: 0.00664
Epoch: 44962, Training Loss: 0.00665
Epoch: 44962, Training Loss: 0.00815
Epoch: 44963, Training Loss: 0.00640
Epoch: 44963, Training Loss: 0.00664
Epoch: 44963, Training Loss: 0.00665
Epoch: 44963, Training Loss: 0.00815
Epoch: 44964, Training Loss: 0.00640
Epoch: 44964, Training Loss: 0.00664
Epoch: 44964, Training Loss: 0.00665
Epoch: 44964, Training Loss: 0.00815
Epoch: 44965, Training Loss: 0.00640
Epoch: 44965, Training Loss: 0.00664
Epoch: 44965, Training Loss: 0.00665
Epoch: 44965, Training Loss: 0.00815
Epoch: 44966, Training Loss: 0.00640
Epoch: 44966, Training Loss: 0.00664
Epoch: 44966, Training Loss: 0.00665
Epoch: 44966, Training Loss: 0.00815
Epoch: 44967, Training Loss: 0.00640
Epoch: 44967, Training Loss: 0.00664
Epoch: 44967, Training Loss: 0.00665
Epoch: 44967, Training Loss: 0.00815
Epoch: 44968, Training Loss: 0.00640
Epoch: 44968, Training Loss: 0.00664
Epoch: 44968, Training Loss: 0.00665
Epoch: 44968, Training Loss: 0.00815
Epoch: 44969, Training Loss: 0.00640
Epoch: 44969, Training Loss: 0.00664
Epoch: 44969, Training Loss: 0.00665
Epoch: 44969, Training Loss: 0.00815
Epoch: 44970, Training Loss: 0.00640
Epoch: 44970, Training Loss: 0.00664
Epoch: 44970, Training Loss: 0.00665
Epoch: 44970, Training Loss: 0.00815
Epoch: 44971, Training Loss: 0.00640
Epoch: 44971, Training Loss: 0.00664
Epoch: 44971, Training Loss: 0.00665
Epoch: 44971, Training Loss: 0.00815
Epoch: 44972, Training Loss: 0.00640
Epoch: 44972, Training Loss: 0.00664
Epoch: 44972, Training Loss: 0.00665
Epoch: 44972, Training Loss: 0.00815
Epoch: 44973, Training Loss: 0.00640
Epoch: 44973, Training Loss: 0.00664
Epoch: 44973, Training Loss: 0.00665
Epoch: 44973, Training Loss: 0.00815
Epoch: 44974, Training Loss: 0.00640
Epoch: 44974, Training Loss: 0.00664
Epoch: 44974, Training Loss: 0.00665
Epoch: 44974, Training Loss: 0.00815
Epoch: 44975, Training Loss: 0.00640
Epoch: 44975, Training Loss: 0.00664
Epoch: 44975, Training Loss: 0.00665
Epoch: 44975, Training Loss: 0.00815
Epoch: 44976, Training Loss: 0.00640
Epoch: 44976, Training Loss: 0.00664
Epoch: 44976, Training Loss: 0.00665
Epoch: 44976, Training Loss: 0.00815
Epoch: 44977, Training Loss: 0.00640
Epoch: 44977, Training Loss: 0.00664
Epoch: 44977, Training Loss: 0.00665
Epoch: 44977, Training Loss: 0.00815
Epoch: 44978, Training Loss: 0.00640
Epoch: 44978, Training Loss: 0.00664
Epoch: 44978, Training Loss: 0.00665
Epoch: 44978, Training Loss: 0.00815
Epoch: 44979, Training Loss: 0.00640
Epoch: 44979, Training Loss: 0.00664
Epoch: 44979, Training Loss: 0.00665
Epoch: 44979, Training Loss: 0.00815
Epoch: 44980, Training Loss: 0.00640
Epoch: 44980, Training Loss: 0.00664
Epoch: 44980, Training Loss: 0.00665
Epoch: 44980, Training Loss: 0.00815
Epoch: 44981, Training Loss: 0.00640
Epoch: 44981, Training Loss: 0.00664
Epoch: 44981, Training Loss: 0.00665
Epoch: 44981, Training Loss: 0.00815
Epoch: 44982, Training Loss: 0.00640
Epoch: 44982, Training Loss: 0.00664
Epoch: 44982, Training Loss: 0.00665
Epoch: 44982, Training Loss: 0.00815
Epoch: 44983, Training Loss: 0.00640
Epoch: 44983, Training Loss: 0.00664
Epoch: 44983, Training Loss: 0.00665
Epoch: 44983, Training Loss: 0.00815
Epoch: 44984, Training Loss: 0.00640
Epoch: 44984, Training Loss: 0.00664
Epoch: 44984, Training Loss: 0.00665
Epoch: 44984, Training Loss: 0.00815
Epoch: 44985, Training Loss: 0.00640
Epoch: 44985, Training Loss: 0.00664
Epoch: 44985, Training Loss: 0.00665
Epoch: 44985, Training Loss: 0.00815
Epoch: 44986, Training Loss: 0.00640
Epoch: 44986, Training Loss: 0.00664
Epoch: 44986, Training Loss: 0.00665
Epoch: 44986, Training Loss: 0.00815
Epoch: 44987, Training Loss: 0.00640
Epoch: 44987, Training Loss: 0.00664
Epoch: 44987, Training Loss: 0.00665
Epoch: 44987, Training Loss: 0.00815
Epoch: 44988, Training Loss: 0.00640
Epoch: 44988, Training Loss: 0.00664
Epoch: 44988, Training Loss: 0.00665
Epoch: 44988, Training Loss: 0.00815
Epoch: 44989, Training Loss: 0.00640
Epoch: 44989, Training Loss: 0.00664
Epoch: 44989, Training Loss: 0.00665
Epoch: 44989, Training Loss: 0.00815
Epoch: 44990, Training Loss: 0.00640
Epoch: 44990, Training Loss: 0.00664
Epoch: 44990, Training Loss: 0.00665
Epoch: 44990, Training Loss: 0.00815
Epoch: 44991, Training Loss: 0.00640
Epoch: 44991, Training Loss: 0.00664
Epoch: 44991, Training Loss: 0.00665
Epoch: 44991, Training Loss: 0.00815
Epoch: 44992, Training Loss: 0.00640
Epoch: 44992, Training Loss: 0.00664
Epoch: 44992, Training Loss: 0.00665
Epoch: 44992, Training Loss: 0.00815
Epoch: 44993, Training Loss: 0.00640
Epoch: 44993, Training Loss: 0.00664
Epoch: 44993, Training Loss: 0.00665
Epoch: 44993, Training Loss: 0.00815
Epoch: 44994, Training Loss: 0.00640
Epoch: 44994, Training Loss: 0.00664
Epoch: 44994, Training Loss: 0.00665
Epoch: 44994, Training Loss: 0.00815
Epoch: 44995, Training Loss: 0.00640
Epoch: 44995, Training Loss: 0.00664
Epoch: 44995, Training Loss: 0.00665
Epoch: 44995, Training Loss: 0.00815
Epoch: 44996, Training Loss: 0.00640
Epoch: 44996, Training Loss: 0.00664
Epoch: 44996, Training Loss: 0.00665
Epoch: 44996, Training Loss: 0.00815
Epoch: 44997, Training Loss: 0.00640
Epoch: 44997, Training Loss: 0.00664
Epoch: 44997, Training Loss: 0.00665
Epoch: 44997, Training Loss: 0.00815
Epoch: 44998, Training Loss: 0.00640
Epoch: 44998, Training Loss: 0.00664
Epoch: 44998, Training Loss: 0.00665
Epoch: 44998, Training Loss: 0.00815
Epoch: 44999, Training Loss: 0.00640
Epoch: 44999, Training Loss: 0.00664
Epoch: 44999, Training Loss: 0.00665
Epoch: 44999, Training Loss: 0.00815
Epoch: 45000, Training Loss: 0.00640
Epoch: 45000, Training Loss: 0.00664
Epoch: 45000, Training Loss: 0.00665
Epoch: 45000, Training Loss: 0.00815
Epoch: 45001, Training Loss: 0.00640
Epoch: 45001, Training Loss: 0.00664
Epoch: 45001, Training Loss: 0.00665
Epoch: 45001, Training Loss: 0.00815
Epoch: 45002, Training Loss: 0.00640
Epoch: 45002, Training Loss: 0.00664
Epoch: 45002, Training Loss: 0.00665
Epoch: 45002, Training Loss: 0.00815
Epoch: 45003, Training Loss: 0.00640
Epoch: 45003, Training Loss: 0.00664
Epoch: 45003, Training Loss: 0.00665
Epoch: 45003, Training Loss: 0.00815
Epoch: 45004, Training Loss: 0.00640
Epoch: 45004, Training Loss: 0.00664
Epoch: 45004, Training Loss: 0.00665
Epoch: 45004, Training Loss: 0.00815
Epoch: 45005, Training Loss: 0.00640
Epoch: 45005, Training Loss: 0.00664
Epoch: 45005, Training Loss: 0.00665
Epoch: 45005, Training Loss: 0.00815
Epoch: 45006, Training Loss: 0.00640
Epoch: 45006, Training Loss: 0.00664
Epoch: 45006, Training Loss: 0.00665
Epoch: 45006, Training Loss: 0.00815
Epoch: 45007, Training Loss: 0.00640
Epoch: 45007, Training Loss: 0.00664
Epoch: 45007, Training Loss: 0.00665
Epoch: 45007, Training Loss: 0.00815
Epoch: 45008, Training Loss: 0.00640
Epoch: 45008, Training Loss: 0.00664
Epoch: 45008, Training Loss: 0.00665
Epoch: 45008, Training Loss: 0.00815
Epoch: 45009, Training Loss: 0.00640
Epoch: 45009, Training Loss: 0.00664
Epoch: 45009, Training Loss: 0.00665
Epoch: 45009, Training Loss: 0.00815
Epoch: 45010, Training Loss: 0.00640
Epoch: 45010, Training Loss: 0.00664
Epoch: 45010, Training Loss: 0.00665
Epoch: 45010, Training Loss: 0.00815
Epoch: 45011, Training Loss: 0.00640
Epoch: 45011, Training Loss: 0.00664
Epoch: 45011, Training Loss: 0.00665
Epoch: 45011, Training Loss: 0.00815
Epoch: 45012, Training Loss: 0.00640
Epoch: 45012, Training Loss: 0.00664
Epoch: 45012, Training Loss: 0.00665
Epoch: 45012, Training Loss: 0.00815
Epoch: 45013, Training Loss: 0.00640
Epoch: 45013, Training Loss: 0.00664
Epoch: 45013, Training Loss: 0.00665
Epoch: 45013, Training Loss: 0.00815
Epoch: 45014, Training Loss: 0.00640
Epoch: 45014, Training Loss: 0.00664
Epoch: 45014, Training Loss: 0.00665
Epoch: 45014, Training Loss: 0.00815
Epoch: 45015, Training Loss: 0.00640
Epoch: 45015, Training Loss: 0.00664
Epoch: 45015, Training Loss: 0.00665
Epoch: 45015, Training Loss: 0.00815
Epoch: 45016, Training Loss: 0.00640
Epoch: 45016, Training Loss: 0.00664
Epoch: 45016, Training Loss: 0.00665
Epoch: 45016, Training Loss: 0.00815
Epoch: 45017, Training Loss: 0.00640
Epoch: 45017, Training Loss: 0.00664
Epoch: 45017, Training Loss: 0.00665
Epoch: 45017, Training Loss: 0.00815
Epoch: 45018, Training Loss: 0.00640
Epoch: 45018, Training Loss: 0.00664
Epoch: 45018, Training Loss: 0.00665
Epoch: 45018, Training Loss: 0.00815
Epoch: 45019, Training Loss: 0.00640
Epoch: 45019, Training Loss: 0.00664
Epoch: 45019, Training Loss: 0.00665
Epoch: 45019, Training Loss: 0.00814
Epoch: 45020, Training Loss: 0.00640
Epoch: 45020, Training Loss: 0.00664
Epoch: 45020, Training Loss: 0.00665
Epoch: 45020, Training Loss: 0.00814
Epoch: 45021, Training Loss: 0.00640
Epoch: 45021, Training Loss: 0.00664
Epoch: 45021, Training Loss: 0.00665
Epoch: 45021, Training Loss: 0.00814
Epoch: 45022, Training Loss: 0.00640
Epoch: 45022, Training Loss: 0.00664
Epoch: 45022, Training Loss: 0.00665
Epoch: 45022, Training Loss: 0.00814
Epoch: 45023, Training Loss: 0.00640
Epoch: 45023, Training Loss: 0.00664
Epoch: 45023, Training Loss: 0.00665
Epoch: 45023, Training Loss: 0.00814
Epoch: 45024, Training Loss: 0.00640
Epoch: 45024, Training Loss: 0.00664
Epoch: 45024, Training Loss: 0.00665
Epoch: 45024, Training Loss: 0.00814
Epoch: 45025, Training Loss: 0.00640
Epoch: 45025, Training Loss: 0.00664
Epoch: 45025, Training Loss: 0.00665
Epoch: 45025, Training Loss: 0.00814
Epoch: 45026, Training Loss: 0.00640
Epoch: 45026, Training Loss: 0.00664
Epoch: 45026, Training Loss: 0.00664
Epoch: 45026, Training Loss: 0.00814
Epoch: 45027, Training Loss: 0.00640
Epoch: 45027, Training Loss: 0.00664
Epoch: 45027, Training Loss: 0.00664
Epoch: 45027, Training Loss: 0.00814
Epoch: 45028, Training Loss: 0.00640
Epoch: 45028, Training Loss: 0.00664
Epoch: 45028, Training Loss: 0.00664
Epoch: 45028, Training Loss: 0.00814
... [output truncated: the four per-pattern XOR losses are printed every epoch and decrease very slowly, from 0.00640 / 0.00664 / 0.00664 / 0.00814 at epoch 45028 to 0.00638 / 0.00662 / 0.00663 / 0.00812 by epoch 45271] ...
Epoch: 45271, Training Loss: 0.00638
Epoch: 45271, Training Loss: 0.00662
Epoch: 45271, Training Loss: 0.00663
Epoch: 45271, Training Loss: 0.00812
Epoch: 45272, Training Loss: 0.00638
Epoch: 45272, Training Loss: 0.00662
Epoch: 45272, Training Loss: 0.00663
Epoch: 45272, Training Loss: 0.00812
Epoch: 45273, Training Loss: 0.00638
Epoch: 45273, Training Loss: 0.00662
Epoch: 45273, Training Loss: 0.00663
Epoch: 45273, Training Loss: 0.00812
Epoch: 45274, Training Loss: 0.00638
Epoch: 45274, Training Loss: 0.00662
Epoch: 45274, Training Loss: 0.00663
Epoch: 45274, Training Loss: 0.00812
Epoch: 45275, Training Loss: 0.00638
Epoch: 45275, Training Loss: 0.00662
Epoch: 45275, Training Loss: 0.00663
Epoch: 45275, Training Loss: 0.00812
Epoch: 45276, Training Loss: 0.00638
Epoch: 45276, Training Loss: 0.00662
Epoch: 45276, Training Loss: 0.00662
Epoch: 45276, Training Loss: 0.00812
Epoch: 45277, Training Loss: 0.00638
Epoch: 45277, Training Loss: 0.00662
Epoch: 45277, Training Loss: 0.00662
Epoch: 45277, Training Loss: 0.00812
Epoch: 45278, Training Loss: 0.00638
Epoch: 45278, Training Loss: 0.00662
Epoch: 45278, Training Loss: 0.00662
Epoch: 45278, Training Loss: 0.00812
Epoch: 45279, Training Loss: 0.00638
Epoch: 45279, Training Loss: 0.00662
Epoch: 45279, Training Loss: 0.00662
Epoch: 45279, Training Loss: 0.00812
Epoch: 45280, Training Loss: 0.00638
Epoch: 45280, Training Loss: 0.00662
Epoch: 45280, Training Loss: 0.00662
Epoch: 45280, Training Loss: 0.00812
Epoch: 45281, Training Loss: 0.00638
Epoch: 45281, Training Loss: 0.00662
Epoch: 45281, Training Loss: 0.00662
Epoch: 45281, Training Loss: 0.00812
Epoch: 45282, Training Loss: 0.00638
Epoch: 45282, Training Loss: 0.00662
Epoch: 45282, Training Loss: 0.00662
Epoch: 45282, Training Loss: 0.00812
Epoch: 45283, Training Loss: 0.00638
Epoch: 45283, Training Loss: 0.00662
Epoch: 45283, Training Loss: 0.00662
Epoch: 45283, Training Loss: 0.00812
Epoch: 45284, Training Loss: 0.00638
Epoch: 45284, Training Loss: 0.00662
Epoch: 45284, Training Loss: 0.00662
Epoch: 45284, Training Loss: 0.00812
Epoch: 45285, Training Loss: 0.00638
Epoch: 45285, Training Loss: 0.00662
Epoch: 45285, Training Loss: 0.00662
Epoch: 45285, Training Loss: 0.00812
Epoch: 45286, Training Loss: 0.00638
Epoch: 45286, Training Loss: 0.00662
Epoch: 45286, Training Loss: 0.00662
Epoch: 45286, Training Loss: 0.00812
Epoch: 45287, Training Loss: 0.00638
Epoch: 45287, Training Loss: 0.00662
Epoch: 45287, Training Loss: 0.00662
Epoch: 45287, Training Loss: 0.00812
Epoch: 45288, Training Loss: 0.00638
Epoch: 45288, Training Loss: 0.00662
Epoch: 45288, Training Loss: 0.00662
Epoch: 45288, Training Loss: 0.00812
Epoch: 45289, Training Loss: 0.00638
Epoch: 45289, Training Loss: 0.00662
Epoch: 45289, Training Loss: 0.00662
Epoch: 45289, Training Loss: 0.00812
Epoch: 45290, Training Loss: 0.00638
Epoch: 45290, Training Loss: 0.00662
Epoch: 45290, Training Loss: 0.00662
Epoch: 45290, Training Loss: 0.00812
Epoch: 45291, Training Loss: 0.00638
Epoch: 45291, Training Loss: 0.00662
Epoch: 45291, Training Loss: 0.00662
Epoch: 45291, Training Loss: 0.00812
Epoch: 45292, Training Loss: 0.00638
Epoch: 45292, Training Loss: 0.00662
Epoch: 45292, Training Loss: 0.00662
Epoch: 45292, Training Loss: 0.00812
Epoch: 45293, Training Loss: 0.00638
Epoch: 45293, Training Loss: 0.00662
Epoch: 45293, Training Loss: 0.00662
Epoch: 45293, Training Loss: 0.00812
Epoch: 45294, Training Loss: 0.00638
Epoch: 45294, Training Loss: 0.00662
Epoch: 45294, Training Loss: 0.00662
Epoch: 45294, Training Loss: 0.00812
Epoch: 45295, Training Loss: 0.00638
Epoch: 45295, Training Loss: 0.00662
Epoch: 45295, Training Loss: 0.00662
Epoch: 45295, Training Loss: 0.00812
Epoch: 45296, Training Loss: 0.00638
Epoch: 45296, Training Loss: 0.00662
Epoch: 45296, Training Loss: 0.00662
Epoch: 45296, Training Loss: 0.00812
Epoch: 45297, Training Loss: 0.00638
Epoch: 45297, Training Loss: 0.00662
Epoch: 45297, Training Loss: 0.00662
Epoch: 45297, Training Loss: 0.00812
Epoch: 45298, Training Loss: 0.00638
Epoch: 45298, Training Loss: 0.00662
Epoch: 45298, Training Loss: 0.00662
Epoch: 45298, Training Loss: 0.00812
Epoch: 45299, Training Loss: 0.00638
Epoch: 45299, Training Loss: 0.00662
Epoch: 45299, Training Loss: 0.00662
Epoch: 45299, Training Loss: 0.00812
Epoch: 45300, Training Loss: 0.00638
Epoch: 45300, Training Loss: 0.00662
Epoch: 45300, Training Loss: 0.00662
Epoch: 45300, Training Loss: 0.00812
Epoch: 45301, Training Loss: 0.00638
Epoch: 45301, Training Loss: 0.00662
Epoch: 45301, Training Loss: 0.00662
Epoch: 45301, Training Loss: 0.00812
Epoch: 45302, Training Loss: 0.00638
Epoch: 45302, Training Loss: 0.00662
Epoch: 45302, Training Loss: 0.00662
Epoch: 45302, Training Loss: 0.00812
Epoch: 45303, Training Loss: 0.00638
Epoch: 45303, Training Loss: 0.00662
Epoch: 45303, Training Loss: 0.00662
Epoch: 45303, Training Loss: 0.00812
Epoch: 45304, Training Loss: 0.00638
Epoch: 45304, Training Loss: 0.00662
Epoch: 45304, Training Loss: 0.00662
Epoch: 45304, Training Loss: 0.00812
Epoch: 45305, Training Loss: 0.00638
Epoch: 45305, Training Loss: 0.00662
Epoch: 45305, Training Loss: 0.00662
Epoch: 45305, Training Loss: 0.00812
Epoch: 45306, Training Loss: 0.00638
Epoch: 45306, Training Loss: 0.00662
Epoch: 45306, Training Loss: 0.00662
Epoch: 45306, Training Loss: 0.00812
Epoch: 45307, Training Loss: 0.00638
Epoch: 45307, Training Loss: 0.00662
Epoch: 45307, Training Loss: 0.00662
Epoch: 45307, Training Loss: 0.00812
Epoch: 45308, Training Loss: 0.00638
Epoch: 45308, Training Loss: 0.00662
Epoch: 45308, Training Loss: 0.00662
Epoch: 45308, Training Loss: 0.00812
Epoch: 45309, Training Loss: 0.00638
Epoch: 45309, Training Loss: 0.00662
Epoch: 45309, Training Loss: 0.00662
Epoch: 45309, Training Loss: 0.00812
Epoch: 45310, Training Loss: 0.00638
Epoch: 45310, Training Loss: 0.00661
Epoch: 45310, Training Loss: 0.00662
Epoch: 45310, Training Loss: 0.00812
Epoch: 45311, Training Loss: 0.00638
Epoch: 45311, Training Loss: 0.00661
Epoch: 45311, Training Loss: 0.00662
Epoch: 45311, Training Loss: 0.00812
Epoch: 45312, Training Loss: 0.00638
Epoch: 45312, Training Loss: 0.00661
Epoch: 45312, Training Loss: 0.00662
Epoch: 45312, Training Loss: 0.00812
Epoch: 45313, Training Loss: 0.00638
Epoch: 45313, Training Loss: 0.00661
Epoch: 45313, Training Loss: 0.00662
Epoch: 45313, Training Loss: 0.00812
Epoch: 45314, Training Loss: 0.00638
Epoch: 45314, Training Loss: 0.00661
Epoch: 45314, Training Loss: 0.00662
Epoch: 45314, Training Loss: 0.00812
Epoch: 45315, Training Loss: 0.00638
Epoch: 45315, Training Loss: 0.00661
Epoch: 45315, Training Loss: 0.00662
Epoch: 45315, Training Loss: 0.00812
Epoch: 45316, Training Loss: 0.00638
Epoch: 45316, Training Loss: 0.00661
Epoch: 45316, Training Loss: 0.00662
Epoch: 45316, Training Loss: 0.00812
Epoch: 45317, Training Loss: 0.00638
Epoch: 45317, Training Loss: 0.00661
Epoch: 45317, Training Loss: 0.00662
Epoch: 45317, Training Loss: 0.00812
Epoch: 45318, Training Loss: 0.00638
Epoch: 45318, Training Loss: 0.00661
Epoch: 45318, Training Loss: 0.00662
Epoch: 45318, Training Loss: 0.00812
Epoch: 45319, Training Loss: 0.00638
Epoch: 45319, Training Loss: 0.00661
Epoch: 45319, Training Loss: 0.00662
Epoch: 45319, Training Loss: 0.00812
Epoch: 45320, Training Loss: 0.00638
Epoch: 45320, Training Loss: 0.00661
Epoch: 45320, Training Loss: 0.00662
Epoch: 45320, Training Loss: 0.00812
Epoch: 45321, Training Loss: 0.00638
Epoch: 45321, Training Loss: 0.00661
Epoch: 45321, Training Loss: 0.00662
Epoch: 45321, Training Loss: 0.00812
Epoch: 45322, Training Loss: 0.00638
Epoch: 45322, Training Loss: 0.00661
Epoch: 45322, Training Loss: 0.00662
Epoch: 45322, Training Loss: 0.00811
Epoch: 45323, Training Loss: 0.00638
Epoch: 45323, Training Loss: 0.00661
Epoch: 45323, Training Loss: 0.00662
Epoch: 45323, Training Loss: 0.00811
Epoch: 45324, Training Loss: 0.00638
Epoch: 45324, Training Loss: 0.00661
Epoch: 45324, Training Loss: 0.00662
Epoch: 45324, Training Loss: 0.00811
Epoch: 45325, Training Loss: 0.00638
Epoch: 45325, Training Loss: 0.00661
Epoch: 45325, Training Loss: 0.00662
Epoch: 45325, Training Loss: 0.00811
Epoch: 45326, Training Loss: 0.00638
Epoch: 45326, Training Loss: 0.00661
Epoch: 45326, Training Loss: 0.00662
Epoch: 45326, Training Loss: 0.00811
Epoch: 45327, Training Loss: 0.00638
Epoch: 45327, Training Loss: 0.00661
Epoch: 45327, Training Loss: 0.00662
Epoch: 45327, Training Loss: 0.00811
Epoch: 45328, Training Loss: 0.00638
Epoch: 45328, Training Loss: 0.00661
Epoch: 45328, Training Loss: 0.00662
Epoch: 45328, Training Loss: 0.00811
Epoch: 45329, Training Loss: 0.00638
Epoch: 45329, Training Loss: 0.00661
Epoch: 45329, Training Loss: 0.00662
Epoch: 45329, Training Loss: 0.00811
Epoch: 45330, Training Loss: 0.00638
Epoch: 45330, Training Loss: 0.00661
Epoch: 45330, Training Loss: 0.00662
Epoch: 45330, Training Loss: 0.00811
Epoch: 45331, Training Loss: 0.00638
Epoch: 45331, Training Loss: 0.00661
Epoch: 45331, Training Loss: 0.00662
Epoch: 45331, Training Loss: 0.00811
Epoch: 45332, Training Loss: 0.00638
Epoch: 45332, Training Loss: 0.00661
Epoch: 45332, Training Loss: 0.00662
Epoch: 45332, Training Loss: 0.00811
Epoch: 45333, Training Loss: 0.00638
Epoch: 45333, Training Loss: 0.00661
Epoch: 45333, Training Loss: 0.00662
Epoch: 45333, Training Loss: 0.00811
Epoch: 45334, Training Loss: 0.00638
Epoch: 45334, Training Loss: 0.00661
Epoch: 45334, Training Loss: 0.00662
Epoch: 45334, Training Loss: 0.00811
Epoch: 45335, Training Loss: 0.00637
Epoch: 45335, Training Loss: 0.00661
Epoch: 45335, Training Loss: 0.00662
Epoch: 45335, Training Loss: 0.00811
Epoch: 45336, Training Loss: 0.00637
Epoch: 45336, Training Loss: 0.00661
Epoch: 45336, Training Loss: 0.00662
Epoch: 45336, Training Loss: 0.00811
Epoch: 45337, Training Loss: 0.00637
Epoch: 45337, Training Loss: 0.00661
Epoch: 45337, Training Loss: 0.00662
Epoch: 45337, Training Loss: 0.00811
Epoch: 45338, Training Loss: 0.00637
Epoch: 45338, Training Loss: 0.00661
Epoch: 45338, Training Loss: 0.00662
Epoch: 45338, Training Loss: 0.00811
Epoch: 45339, Training Loss: 0.00637
Epoch: 45339, Training Loss: 0.00661
Epoch: 45339, Training Loss: 0.00662
Epoch: 45339, Training Loss: 0.00811
Epoch: 45340, Training Loss: 0.00637
Epoch: 45340, Training Loss: 0.00661
Epoch: 45340, Training Loss: 0.00662
Epoch: 45340, Training Loss: 0.00811
Epoch: 45341, Training Loss: 0.00637
Epoch: 45341, Training Loss: 0.00661
Epoch: 45341, Training Loss: 0.00662
Epoch: 45341, Training Loss: 0.00811
Epoch: 45342, Training Loss: 0.00637
Epoch: 45342, Training Loss: 0.00661
Epoch: 45342, Training Loss: 0.00662
Epoch: 45342, Training Loss: 0.00811
Epoch: 45343, Training Loss: 0.00637
Epoch: 45343, Training Loss: 0.00661
Epoch: 45343, Training Loss: 0.00662
Epoch: 45343, Training Loss: 0.00811
Epoch: 45344, Training Loss: 0.00637
Epoch: 45344, Training Loss: 0.00661
Epoch: 45344, Training Loss: 0.00662
Epoch: 45344, Training Loss: 0.00811
Epoch: 45345, Training Loss: 0.00637
Epoch: 45345, Training Loss: 0.00661
Epoch: 45345, Training Loss: 0.00662
Epoch: 45345, Training Loss: 0.00811
Epoch: 45346, Training Loss: 0.00637
Epoch: 45346, Training Loss: 0.00661
Epoch: 45346, Training Loss: 0.00662
Epoch: 45346, Training Loss: 0.00811
Epoch: 45347, Training Loss: 0.00637
Epoch: 45347, Training Loss: 0.00661
Epoch: 45347, Training Loss: 0.00662
Epoch: 45347, Training Loss: 0.00811
Epoch: 45348, Training Loss: 0.00637
Epoch: 45348, Training Loss: 0.00661
Epoch: 45348, Training Loss: 0.00662
Epoch: 45348, Training Loss: 0.00811
Epoch: 45349, Training Loss: 0.00637
Epoch: 45349, Training Loss: 0.00661
Epoch: 45349, Training Loss: 0.00662
Epoch: 45349, Training Loss: 0.00811
Epoch: 45350, Training Loss: 0.00637
Epoch: 45350, Training Loss: 0.00661
Epoch: 45350, Training Loss: 0.00662
Epoch: 45350, Training Loss: 0.00811
Epoch: 45351, Training Loss: 0.00637
Epoch: 45351, Training Loss: 0.00661
Epoch: 45351, Training Loss: 0.00662
Epoch: 45351, Training Loss: 0.00811
Epoch: 45352, Training Loss: 0.00637
Epoch: 45352, Training Loss: 0.00661
Epoch: 45352, Training Loss: 0.00662
Epoch: 45352, Training Loss: 0.00811
Epoch: 45353, Training Loss: 0.00637
Epoch: 45353, Training Loss: 0.00661
Epoch: 45353, Training Loss: 0.00662
Epoch: 45353, Training Loss: 0.00811
Epoch: 45354, Training Loss: 0.00637
Epoch: 45354, Training Loss: 0.00661
Epoch: 45354, Training Loss: 0.00662
Epoch: 45354, Training Loss: 0.00811
Epoch: 45355, Training Loss: 0.00637
Epoch: 45355, Training Loss: 0.00661
Epoch: 45355, Training Loss: 0.00662
Epoch: 45355, Training Loss: 0.00811
Epoch: 45356, Training Loss: 0.00637
Epoch: 45356, Training Loss: 0.00661
Epoch: 45356, Training Loss: 0.00662
Epoch: 45356, Training Loss: 0.00811
Epoch: 45357, Training Loss: 0.00637
Epoch: 45357, Training Loss: 0.00661
Epoch: 45357, Training Loss: 0.00662
Epoch: 45357, Training Loss: 0.00811
Epoch: 45358, Training Loss: 0.00637
Epoch: 45358, Training Loss: 0.00661
Epoch: 45358, Training Loss: 0.00662
Epoch: 45358, Training Loss: 0.00811
Epoch: 45359, Training Loss: 0.00637
Epoch: 45359, Training Loss: 0.00661
Epoch: 45359, Training Loss: 0.00662
Epoch: 45359, Training Loss: 0.00811
Epoch: 45360, Training Loss: 0.00637
Epoch: 45360, Training Loss: 0.00661
Epoch: 45360, Training Loss: 0.00662
Epoch: 45360, Training Loss: 0.00811
Epoch: 45361, Training Loss: 0.00637
Epoch: 45361, Training Loss: 0.00661
Epoch: 45361, Training Loss: 0.00662
Epoch: 45361, Training Loss: 0.00811
Epoch: 45362, Training Loss: 0.00637
Epoch: 45362, Training Loss: 0.00661
Epoch: 45362, Training Loss: 0.00662
Epoch: 45362, Training Loss: 0.00811
Epoch: 45363, Training Loss: 0.00637
Epoch: 45363, Training Loss: 0.00661
Epoch: 45363, Training Loss: 0.00662
Epoch: 45363, Training Loss: 0.00811
Epoch: 45364, Training Loss: 0.00637
Epoch: 45364, Training Loss: 0.00661
Epoch: 45364, Training Loss: 0.00662
Epoch: 45364, Training Loss: 0.00811
Epoch: 45365, Training Loss: 0.00637
Epoch: 45365, Training Loss: 0.00661
Epoch: 45365, Training Loss: 0.00662
Epoch: 45365, Training Loss: 0.00811
Epoch: 45366, Training Loss: 0.00637
Epoch: 45366, Training Loss: 0.00661
Epoch: 45366, Training Loss: 0.00662
Epoch: 45366, Training Loss: 0.00811
Epoch: 45367, Training Loss: 0.00637
Epoch: 45367, Training Loss: 0.00661
Epoch: 45367, Training Loss: 0.00662
Epoch: 45367, Training Loss: 0.00811
Epoch: 45368, Training Loss: 0.00637
Epoch: 45368, Training Loss: 0.00661
Epoch: 45368, Training Loss: 0.00662
Epoch: 45368, Training Loss: 0.00811
Epoch: 45369, Training Loss: 0.00637
Epoch: 45369, Training Loss: 0.00661
Epoch: 45369, Training Loss: 0.00662
Epoch: 45369, Training Loss: 0.00811
Epoch: 45370, Training Loss: 0.00637
Epoch: 45370, Training Loss: 0.00661
Epoch: 45370, Training Loss: 0.00662
Epoch: 45370, Training Loss: 0.00811
Epoch: 45371, Training Loss: 0.00637
Epoch: 45371, Training Loss: 0.00661
Epoch: 45371, Training Loss: 0.00662
Epoch: 45371, Training Loss: 0.00811
Epoch: 45372, Training Loss: 0.00637
Epoch: 45372, Training Loss: 0.00661
Epoch: 45372, Training Loss: 0.00662
Epoch: 45372, Training Loss: 0.00811
Epoch: 45373, Training Loss: 0.00637
Epoch: 45373, Training Loss: 0.00661
Epoch: 45373, Training Loss: 0.00662
Epoch: 45373, Training Loss: 0.00811
Epoch: 45374, Training Loss: 0.00637
Epoch: 45374, Training Loss: 0.00661
Epoch: 45374, Training Loss: 0.00662
Epoch: 45374, Training Loss: 0.00811
Epoch: 45375, Training Loss: 0.00637
Epoch: 45375, Training Loss: 0.00661
Epoch: 45375, Training Loss: 0.00662
Epoch: 45375, Training Loss: 0.00811
Epoch: 45376, Training Loss: 0.00637
Epoch: 45376, Training Loss: 0.00661
Epoch: 45376, Training Loss: 0.00662
Epoch: 45376, Training Loss: 0.00811
Epoch: 45377, Training Loss: 0.00637
Epoch: 45377, Training Loss: 0.00661
Epoch: 45377, Training Loss: 0.00662
Epoch: 45377, Training Loss: 0.00811
Epoch: 45378, Training Loss: 0.00637
Epoch: 45378, Training Loss: 0.00661
Epoch: 45378, Training Loss: 0.00662
Epoch: 45378, Training Loss: 0.00811
Epoch: 45379, Training Loss: 0.00637
Epoch: 45379, Training Loss: 0.00661
Epoch: 45379, Training Loss: 0.00662
Epoch: 45379, Training Loss: 0.00811
Epoch: 45380, Training Loss: 0.00637
Epoch: 45380, Training Loss: 0.00661
Epoch: 45380, Training Loss: 0.00662
Epoch: 45380, Training Loss: 0.00811
Epoch: 45381, Training Loss: 0.00637
Epoch: 45381, Training Loss: 0.00661
Epoch: 45381, Training Loss: 0.00662
Epoch: 45381, Training Loss: 0.00811
Epoch: 45382, Training Loss: 0.00637
Epoch: 45382, Training Loss: 0.00661
Epoch: 45382, Training Loss: 0.00662
Epoch: 45382, Training Loss: 0.00811
Epoch: 45383, Training Loss: 0.00637
Epoch: 45383, Training Loss: 0.00661
Epoch: 45383, Training Loss: 0.00662
Epoch: 45383, Training Loss: 0.00811
Epoch: 45384, Training Loss: 0.00637
Epoch: 45384, Training Loss: 0.00661
Epoch: 45384, Training Loss: 0.00662
Epoch: 45384, Training Loss: 0.00811
Epoch: 45385, Training Loss: 0.00637
Epoch: 45385, Training Loss: 0.00661
Epoch: 45385, Training Loss: 0.00662
Epoch: 45385, Training Loss: 0.00811
Epoch: 45386, Training Loss: 0.00637
Epoch: 45386, Training Loss: 0.00661
Epoch: 45386, Training Loss: 0.00662
Epoch: 45386, Training Loss: 0.00811
Epoch: 45387, Training Loss: 0.00637
Epoch: 45387, Training Loss: 0.00661
Epoch: 45387, Training Loss: 0.00662
Epoch: 45387, Training Loss: 0.00811
Epoch: 45388, Training Loss: 0.00637
Epoch: 45388, Training Loss: 0.00661
Epoch: 45388, Training Loss: 0.00662
Epoch: 45388, Training Loss: 0.00811
Epoch: 45389, Training Loss: 0.00637
Epoch: 45389, Training Loss: 0.00661
Epoch: 45389, Training Loss: 0.00662
Epoch: 45389, Training Loss: 0.00811
Epoch: 45390, Training Loss: 0.00637
Epoch: 45390, Training Loss: 0.00661
Epoch: 45390, Training Loss: 0.00662
Epoch: 45390, Training Loss: 0.00811
Epoch: 45391, Training Loss: 0.00637
Epoch: 45391, Training Loss: 0.00661
Epoch: 45391, Training Loss: 0.00662
Epoch: 45391, Training Loss: 0.00811
Epoch: 45392, Training Loss: 0.00637
Epoch: 45392, Training Loss: 0.00661
Epoch: 45392, Training Loss: 0.00662
Epoch: 45392, Training Loss: 0.00811
Epoch: 45393, Training Loss: 0.00637
Epoch: 45393, Training Loss: 0.00661
Epoch: 45393, Training Loss: 0.00662
Epoch: 45393, Training Loss: 0.00811
Epoch: 45394, Training Loss: 0.00637
Epoch: 45394, Training Loss: 0.00661
Epoch: 45394, Training Loss: 0.00662
Epoch: 45394, Training Loss: 0.00811
Epoch: 45395, Training Loss: 0.00637
Epoch: 45395, Training Loss: 0.00661
Epoch: 45395, Training Loss: 0.00662
Epoch: 45395, Training Loss: 0.00811
Epoch: 45396, Training Loss: 0.00637
Epoch: 45396, Training Loss: 0.00661
Epoch: 45396, Training Loss: 0.00662
Epoch: 45396, Training Loss: 0.00811
Epoch: 45397, Training Loss: 0.00637
Epoch: 45397, Training Loss: 0.00661
Epoch: 45397, Training Loss: 0.00662
Epoch: 45397, Training Loss: 0.00811
Epoch: 45398, Training Loss: 0.00637
Epoch: 45398, Training Loss: 0.00661
Epoch: 45398, Training Loss: 0.00662
Epoch: 45398, Training Loss: 0.00811
Epoch: 45399, Training Loss: 0.00637
Epoch: 45399, Training Loss: 0.00661
Epoch: 45399, Training Loss: 0.00662
Epoch: 45399, Training Loss: 0.00811
Epoch: 45400, Training Loss: 0.00637
Epoch: 45400, Training Loss: 0.00661
Epoch: 45400, Training Loss: 0.00662
Epoch: 45400, Training Loss: 0.00811
Epoch: 45401, Training Loss: 0.00637
Epoch: 45401, Training Loss: 0.00661
Epoch: 45401, Training Loss: 0.00661
Epoch: 45401, Training Loss: 0.00811
Epoch: 45402, Training Loss: 0.00637
Epoch: 45402, Training Loss: 0.00661
Epoch: 45402, Training Loss: 0.00661
Epoch: 45402, Training Loss: 0.00811
Epoch: 45403, Training Loss: 0.00637
Epoch: 45403, Training Loss: 0.00661
Epoch: 45403, Training Loss: 0.00661
Epoch: 45403, Training Loss: 0.00811
Epoch: 45404, Training Loss: 0.00637
Epoch: 45404, Training Loss: 0.00661
Epoch: 45404, Training Loss: 0.00661
Epoch: 45404, Training Loss: 0.00811
Epoch: 45405, Training Loss: 0.00637
Epoch: 45405, Training Loss: 0.00661
Epoch: 45405, Training Loss: 0.00661
Epoch: 45405, Training Loss: 0.00811
Epoch: 45406, Training Loss: 0.00637
Epoch: 45406, Training Loss: 0.00661
Epoch: 45406, Training Loss: 0.00661
Epoch: 45406, Training Loss: 0.00811
Epoch: 45407, Training Loss: 0.00637
Epoch: 45407, Training Loss: 0.00661
Epoch: 45407, Training Loss: 0.00661
Epoch: 45407, Training Loss: 0.00811
Epoch: 45408, Training Loss: 0.00637
Epoch: 45408, Training Loss: 0.00661
Epoch: 45408, Training Loss: 0.00661
Epoch: 45408, Training Loss: 0.00811
Epoch: 45409, Training Loss: 0.00637
Epoch: 45409, Training Loss: 0.00661
Epoch: 45409, Training Loss: 0.00661
Epoch: 45409, Training Loss: 0.00811
Epoch: 45410, Training Loss: 0.00637
Epoch: 45410, Training Loss: 0.00661
Epoch: 45410, Training Loss: 0.00661
Epoch: 45410, Training Loss: 0.00811
Epoch: 45411, Training Loss: 0.00637
Epoch: 45411, Training Loss: 0.00661
Epoch: 45411, Training Loss: 0.00661
Epoch: 45411, Training Loss: 0.00811
Epoch: 45412, Training Loss: 0.00637
Epoch: 45412, Training Loss: 0.00661
Epoch: 45412, Training Loss: 0.00661
Epoch: 45412, Training Loss: 0.00811
Epoch: 45413, Training Loss: 0.00637
Epoch: 45413, Training Loss: 0.00661
Epoch: 45413, Training Loss: 0.00661
Epoch: 45413, Training Loss: 0.00811
Epoch: 45414, Training Loss: 0.00637
Epoch: 45414, Training Loss: 0.00661
Epoch: 45414, Training Loss: 0.00661
Epoch: 45414, Training Loss: 0.00811
Epoch: 45415, Training Loss: 0.00637
Epoch: 45415, Training Loss: 0.00661
Epoch: 45415, Training Loss: 0.00661
Epoch: 45415, Training Loss: 0.00811
Epoch: 45416, Training Loss: 0.00637
Epoch: 45416, Training Loss: 0.00661
Epoch: 45416, Training Loss: 0.00661
Epoch: 45416, Training Loss: 0.00811
Epoch: 45417, Training Loss: 0.00637
Epoch: 45417, Training Loss: 0.00661
Epoch: 45417, Training Loss: 0.00661
Epoch: 45417, Training Loss: 0.00811
Epoch: 45418, Training Loss: 0.00637
Epoch: 45418, Training Loss: 0.00661
Epoch: 45418, Training Loss: 0.00661
Epoch: 45418, Training Loss: 0.00811
Epoch: 45419, Training Loss: 0.00637
Epoch: 45419, Training Loss: 0.00661
Epoch: 45419, Training Loss: 0.00661
Epoch: 45419, Training Loss: 0.00811
Epoch: 45420, Training Loss: 0.00637
Epoch: 45420, Training Loss: 0.00661
Epoch: 45420, Training Loss: 0.00661
Epoch: 45420, Training Loss: 0.00811
Epoch: 45421, Training Loss: 0.00637
Epoch: 45421, Training Loss: 0.00661
Epoch: 45421, Training Loss: 0.00661
Epoch: 45421, Training Loss: 0.00811
Epoch: 45422, Training Loss: 0.00637
Epoch: 45422, Training Loss: 0.00661
Epoch: 45422, Training Loss: 0.00661
Epoch: 45422, Training Loss: 0.00811
Epoch: 45423, Training Loss: 0.00637
Epoch: 45423, Training Loss: 0.00661
Epoch: 45423, Training Loss: 0.00661
Epoch: 45423, Training Loss: 0.00811
Epoch: 45424, Training Loss: 0.00637
Epoch: 45424, Training Loss: 0.00661
Epoch: 45424, Training Loss: 0.00661
Epoch: 45424, Training Loss: 0.00810
Epoch: 45425, Training Loss: 0.00637
Epoch: 45425, Training Loss: 0.00661
Epoch: 45425, Training Loss: 0.00661
Epoch: 45425, Training Loss: 0.00810
Epoch: 45426, Training Loss: 0.00637
Epoch: 45426, Training Loss: 0.00661
Epoch: 45426, Training Loss: 0.00661
Epoch: 45426, Training Loss: 0.00810
Epoch: 45427, Training Loss: 0.00637
Epoch: 45427, Training Loss: 0.00661
Epoch: 45427, Training Loss: 0.00661
Epoch: 45427, Training Loss: 0.00810
Epoch: 45428, Training Loss: 0.00637
Epoch: 45428, Training Loss: 0.00661
Epoch: 45428, Training Loss: 0.00661
Epoch: 45428, Training Loss: 0.00810
Epoch: 45429, Training Loss: 0.00637
Epoch: 45429, Training Loss: 0.00661
Epoch: 45429, Training Loss: 0.00661
Epoch: 45429, Training Loss: 0.00810
Epoch: 45430, Training Loss: 0.00637
Epoch: 45430, Training Loss: 0.00661
Epoch: 45430, Training Loss: 0.00661
Epoch: 45430, Training Loss: 0.00810
Epoch: 45431, Training Loss: 0.00637
Epoch: 45431, Training Loss: 0.00661
Epoch: 45431, Training Loss: 0.00661
Epoch: 45431, Training Loss: 0.00810
Epoch: 45432, Training Loss: 0.00637
Epoch: 45432, Training Loss: 0.00661
Epoch: 45432, Training Loss: 0.00661
Epoch: 45432, Training Loss: 0.00810
Epoch: 45433, Training Loss: 0.00637
Epoch: 45433, Training Loss: 0.00661
Epoch: 45433, Training Loss: 0.00661
Epoch: 45433, Training Loss: 0.00810
Epoch: 45434, Training Loss: 0.00637
Epoch: 45434, Training Loss: 0.00661
Epoch: 45434, Training Loss: 0.00661
Epoch: 45434, Training Loss: 0.00810
Epoch: 45435, Training Loss: 0.00637
Epoch: 45435, Training Loss: 0.00661
Epoch: 45435, Training Loss: 0.00661
Epoch: 45435, Training Loss: 0.00810
Epoch: 45436, Training Loss: 0.00637
Epoch: 45436, Training Loss: 0.00661
Epoch: 45436, Training Loss: 0.00661
Epoch: 45436, Training Loss: 0.00810
Epoch: 45437, Training Loss: 0.00637
Epoch: 45437, Training Loss: 0.00660
Epoch: 45437, Training Loss: 0.00661
Epoch: 45437, Training Loss: 0.00810
Epoch: 45438, Training Loss: 0.00637
Epoch: 45438, Training Loss: 0.00660
Epoch: 45438, Training Loss: 0.00661
Epoch: 45438, Training Loss: 0.00810
Epoch: 45439, Training Loss: 0.00637
Epoch: 45439, Training Loss: 0.00660
Epoch: 45439, Training Loss: 0.00661
Epoch: 45439, Training Loss: 0.00810
Epoch: 45440, Training Loss: 0.00637
Epoch: 45440, Training Loss: 0.00660
Epoch: 45440, Training Loss: 0.00661
Epoch: 45440, Training Loss: 0.00810
Epoch: 45441, Training Loss: 0.00637
Epoch: 45441, Training Loss: 0.00660
Epoch: 45441, Training Loss: 0.00661
Epoch: 45441, Training Loss: 0.00810
Epoch: 45442, Training Loss: 0.00637
Epoch: 45442, Training Loss: 0.00660
Epoch: 45442, Training Loss: 0.00661
Epoch: 45442, Training Loss: 0.00810
Epoch: 45443, Training Loss: 0.00637
Epoch: 45443, Training Loss: 0.00660
Epoch: 45443, Training Loss: 0.00661
Epoch: 45443, Training Loss: 0.00810
Epoch: 45444, Training Loss: 0.00637
Epoch: 45444, Training Loss: 0.00660
Epoch: 45444, Training Loss: 0.00661
Epoch: 45444, Training Loss: 0.00810
Epoch: 45445, Training Loss: 0.00637
Epoch: 45445, Training Loss: 0.00660
Epoch: 45445, Training Loss: 0.00661
Epoch: 45445, Training Loss: 0.00810
Epoch: 45446, Training Loss: 0.00637
Epoch: 45446, Training Loss: 0.00660
Epoch: 45446, Training Loss: 0.00661
Epoch: 45446, Training Loss: 0.00810
Epoch: 45447, Training Loss: 0.00637
Epoch: 45447, Training Loss: 0.00660
Epoch: 45447, Training Loss: 0.00661
Epoch: 45447, Training Loss: 0.00810
Epoch: 45448, Training Loss: 0.00637
Epoch: 45448, Training Loss: 0.00660
Epoch: 45448, Training Loss: 0.00661
Epoch: 45448, Training Loss: 0.00810
Epoch: 45449, Training Loss: 0.00637
Epoch: 45449, Training Loss: 0.00660
Epoch: 45449, Training Loss: 0.00661
Epoch: 45449, Training Loss: 0.00810
Epoch: 45450, Training Loss: 0.00637
Epoch: 45450, Training Loss: 0.00660
Epoch: 45450, Training Loss: 0.00661
Epoch: 45450, Training Loss: 0.00810
Epoch: 45451, Training Loss: 0.00637
Epoch: 45451, Training Loss: 0.00660
Epoch: 45451, Training Loss: 0.00661
Epoch: 45451, Training Loss: 0.00810
Epoch: 45452, Training Loss: 0.00637
Epoch: 45452, Training Loss: 0.00660
Epoch: 45452, Training Loss: 0.00661
Epoch: 45452, Training Loss: 0.00810
Epoch: 45453, Training Loss: 0.00637
Epoch: 45453, Training Loss: 0.00660
Epoch: 45453, Training Loss: 0.00661
Epoch: 45453, Training Loss: 0.00810
Epoch: 45454, Training Loss: 0.00637
Epoch: 45454, Training Loss: 0.00660
Epoch: 45454, Training Loss: 0.00661
Epoch: 45454, Training Loss: 0.00810
Epoch: 45455, Training Loss: 0.00637
Epoch: 45455, Training Loss: 0.00660
Epoch: 45455, Training Loss: 0.00661
Epoch: 45455, Training Loss: 0.00810
Epoch: 45456, Training Loss: 0.00637
Epoch: 45456, Training Loss: 0.00660
Epoch: 45456, Training Loss: 0.00661
Epoch: 45456, Training Loss: 0.00810
Epoch: 45457, Training Loss: 0.00637
Epoch: 45457, Training Loss: 0.00660
Epoch: 45457, Training Loss: 0.00661
Epoch: 45457, Training Loss: 0.00810
Epoch: 45458, Training Loss: 0.00637
Epoch: 45458, Training Loss: 0.00660
Epoch: 45458, Training Loss: 0.00661
Epoch: 45458, Training Loss: 0.00810
Epoch: 45459, Training Loss: 0.00637
Epoch: 45459, Training Loss: 0.00660
Epoch: 45459, Training Loss: 0.00661
Epoch: 45459, Training Loss: 0.00810
Epoch: 45460, Training Loss: 0.00637
Epoch: 45460, Training Loss: 0.00660
Epoch: 45460, Training Loss: 0.00661
Epoch: 45460, Training Loss: 0.00810
Epoch: 45461, Training Loss: 0.00637
Epoch: 45461, Training Loss: 0.00660
Epoch: 45461, Training Loss: 0.00661
Epoch: 45461, Training Loss: 0.00810
[... output truncated: four loss values are printed per epoch, one for each XOR input pattern; between epochs 45461 and 45704 they decrease very slowly (0.00637→0.00635, 0.00660→0.00658, 0.00661→0.00659, 0.00810→0.00808), with the worst pattern still above the 0.008 target ...]
Epoch: 45703, Training Loss: 0.00635
Epoch: 45703, Training Loss: 0.00658
Epoch: 45703, Training Loss: 0.00659
Epoch: 45703, Training Loss: 0.00808
Epoch: 45704, Training Loss: 0.00635
Epoch: 45704, Training Loss: 0.00658
Epoch: 45704, Training Loss: 0.00659
Epoch: 45704, Training Loss: 0.00808
Epoch: 45705, Training Loss: 0.00635
Epoch: 45705, Training Loss: 0.00658
Epoch: 45705, Training Loss: 0.00659
Epoch: 45705, Training Loss: 0.00808
Epoch: 45706, Training Loss: 0.00635
Epoch: 45706, Training Loss: 0.00658
Epoch: 45706, Training Loss: 0.00659
Epoch: 45706, Training Loss: 0.00808
Epoch: 45707, Training Loss: 0.00635
Epoch: 45707, Training Loss: 0.00658
Epoch: 45707, Training Loss: 0.00659
Epoch: 45707, Training Loss: 0.00808
Epoch: 45708, Training Loss: 0.00635
Epoch: 45708, Training Loss: 0.00658
Epoch: 45708, Training Loss: 0.00659
Epoch: 45708, Training Loss: 0.00808
Epoch: 45709, Training Loss: 0.00635
Epoch: 45709, Training Loss: 0.00658
Epoch: 45709, Training Loss: 0.00659
Epoch: 45709, Training Loss: 0.00808
Epoch: 45710, Training Loss: 0.00635
Epoch: 45710, Training Loss: 0.00658
Epoch: 45710, Training Loss: 0.00659
Epoch: 45710, Training Loss: 0.00808
Epoch: 45711, Training Loss: 0.00635
Epoch: 45711, Training Loss: 0.00658
Epoch: 45711, Training Loss: 0.00659
Epoch: 45711, Training Loss: 0.00808
Epoch: 45712, Training Loss: 0.00635
Epoch: 45712, Training Loss: 0.00658
Epoch: 45712, Training Loss: 0.00659
Epoch: 45712, Training Loss: 0.00808
Epoch: 45713, Training Loss: 0.00635
Epoch: 45713, Training Loss: 0.00658
Epoch: 45713, Training Loss: 0.00659
Epoch: 45713, Training Loss: 0.00808
Epoch: 45714, Training Loss: 0.00635
Epoch: 45714, Training Loss: 0.00658
Epoch: 45714, Training Loss: 0.00659
Epoch: 45714, Training Loss: 0.00808
Epoch: 45715, Training Loss: 0.00635
Epoch: 45715, Training Loss: 0.00658
Epoch: 45715, Training Loss: 0.00659
Epoch: 45715, Training Loss: 0.00808
Epoch: 45716, Training Loss: 0.00635
Epoch: 45716, Training Loss: 0.00658
Epoch: 45716, Training Loss: 0.00659
Epoch: 45716, Training Loss: 0.00808
Epoch: 45717, Training Loss: 0.00635
Epoch: 45717, Training Loss: 0.00658
Epoch: 45717, Training Loss: 0.00659
Epoch: 45717, Training Loss: 0.00808
Epoch: 45718, Training Loss: 0.00635
Epoch: 45718, Training Loss: 0.00658
Epoch: 45718, Training Loss: 0.00659
Epoch: 45718, Training Loss: 0.00808
Epoch: 45719, Training Loss: 0.00635
Epoch: 45719, Training Loss: 0.00658
Epoch: 45719, Training Loss: 0.00659
Epoch: 45719, Training Loss: 0.00808
Epoch: 45720, Training Loss: 0.00635
Epoch: 45720, Training Loss: 0.00658
Epoch: 45720, Training Loss: 0.00659
Epoch: 45720, Training Loss: 0.00808
Epoch: 45721, Training Loss: 0.00635
Epoch: 45721, Training Loss: 0.00658
Epoch: 45721, Training Loss: 0.00659
Epoch: 45721, Training Loss: 0.00808
Epoch: 45722, Training Loss: 0.00635
Epoch: 45722, Training Loss: 0.00658
Epoch: 45722, Training Loss: 0.00659
Epoch: 45722, Training Loss: 0.00808
Epoch: 45723, Training Loss: 0.00635
Epoch: 45723, Training Loss: 0.00658
Epoch: 45723, Training Loss: 0.00659
Epoch: 45723, Training Loss: 0.00808
Epoch: 45724, Training Loss: 0.00635
Epoch: 45724, Training Loss: 0.00658
Epoch: 45724, Training Loss: 0.00659
Epoch: 45724, Training Loss: 0.00808
Epoch: 45725, Training Loss: 0.00635
Epoch: 45725, Training Loss: 0.00658
Epoch: 45725, Training Loss: 0.00659
Epoch: 45725, Training Loss: 0.00808
Epoch: 45726, Training Loss: 0.00635
Epoch: 45726, Training Loss: 0.00658
Epoch: 45726, Training Loss: 0.00659
Epoch: 45726, Training Loss: 0.00808
Epoch: 45727, Training Loss: 0.00635
Epoch: 45727, Training Loss: 0.00658
Epoch: 45727, Training Loss: 0.00659
Epoch: 45727, Training Loss: 0.00808
Epoch: 45728, Training Loss: 0.00635
Epoch: 45728, Training Loss: 0.00658
Epoch: 45728, Training Loss: 0.00659
Epoch: 45728, Training Loss: 0.00808
Epoch: 45729, Training Loss: 0.00635
Epoch: 45729, Training Loss: 0.00658
Epoch: 45729, Training Loss: 0.00659
Epoch: 45729, Training Loss: 0.00808
Epoch: 45730, Training Loss: 0.00635
Epoch: 45730, Training Loss: 0.00658
Epoch: 45730, Training Loss: 0.00659
Epoch: 45730, Training Loss: 0.00808
Epoch: 45731, Training Loss: 0.00635
Epoch: 45731, Training Loss: 0.00658
Epoch: 45731, Training Loss: 0.00659
Epoch: 45731, Training Loss: 0.00808
Epoch: 45732, Training Loss: 0.00635
Epoch: 45732, Training Loss: 0.00658
Epoch: 45732, Training Loss: 0.00659
Epoch: 45732, Training Loss: 0.00807
Epoch: 45733, Training Loss: 0.00635
Epoch: 45733, Training Loss: 0.00658
Epoch: 45733, Training Loss: 0.00659
Epoch: 45733, Training Loss: 0.00807
Epoch: 45734, Training Loss: 0.00635
Epoch: 45734, Training Loss: 0.00658
Epoch: 45734, Training Loss: 0.00659
Epoch: 45734, Training Loss: 0.00807
Epoch: 45735, Training Loss: 0.00635
Epoch: 45735, Training Loss: 0.00658
Epoch: 45735, Training Loss: 0.00659
Epoch: 45735, Training Loss: 0.00807
Epoch: 45736, Training Loss: 0.00635
Epoch: 45736, Training Loss: 0.00658
Epoch: 45736, Training Loss: 0.00659
Epoch: 45736, Training Loss: 0.00807
Epoch: 45737, Training Loss: 0.00634
Epoch: 45737, Training Loss: 0.00658
Epoch: 45737, Training Loss: 0.00659
Epoch: 45737, Training Loss: 0.00807
Epoch: 45738, Training Loss: 0.00634
Epoch: 45738, Training Loss: 0.00658
Epoch: 45738, Training Loss: 0.00659
Epoch: 45738, Training Loss: 0.00807
Epoch: 45739, Training Loss: 0.00634
Epoch: 45739, Training Loss: 0.00658
Epoch: 45739, Training Loss: 0.00659
Epoch: 45739, Training Loss: 0.00807
Epoch: 45740, Training Loss: 0.00634
Epoch: 45740, Training Loss: 0.00658
Epoch: 45740, Training Loss: 0.00659
Epoch: 45740, Training Loss: 0.00807
Epoch: 45741, Training Loss: 0.00634
Epoch: 45741, Training Loss: 0.00658
Epoch: 45741, Training Loss: 0.00659
Epoch: 45741, Training Loss: 0.00807
Epoch: 45742, Training Loss: 0.00634
Epoch: 45742, Training Loss: 0.00658
Epoch: 45742, Training Loss: 0.00659
Epoch: 45742, Training Loss: 0.00807
Epoch: 45743, Training Loss: 0.00634
Epoch: 45743, Training Loss: 0.00658
Epoch: 45743, Training Loss: 0.00659
Epoch: 45743, Training Loss: 0.00807
Epoch: 45744, Training Loss: 0.00634
Epoch: 45744, Training Loss: 0.00658
Epoch: 45744, Training Loss: 0.00659
Epoch: 45744, Training Loss: 0.00807
Epoch: 45745, Training Loss: 0.00634
Epoch: 45745, Training Loss: 0.00658
Epoch: 45745, Training Loss: 0.00659
Epoch: 45745, Training Loss: 0.00807
Epoch: 45746, Training Loss: 0.00634
Epoch: 45746, Training Loss: 0.00658
Epoch: 45746, Training Loss: 0.00659
Epoch: 45746, Training Loss: 0.00807
Epoch: 45747, Training Loss: 0.00634
Epoch: 45747, Training Loss: 0.00658
Epoch: 45747, Training Loss: 0.00659
Epoch: 45747, Training Loss: 0.00807
Epoch: 45748, Training Loss: 0.00634
Epoch: 45748, Training Loss: 0.00658
Epoch: 45748, Training Loss: 0.00659
Epoch: 45748, Training Loss: 0.00807
Epoch: 45749, Training Loss: 0.00634
Epoch: 45749, Training Loss: 0.00658
Epoch: 45749, Training Loss: 0.00659
Epoch: 45749, Training Loss: 0.00807
Epoch: 45750, Training Loss: 0.00634
Epoch: 45750, Training Loss: 0.00658
Epoch: 45750, Training Loss: 0.00659
Epoch: 45750, Training Loss: 0.00807
Epoch: 45751, Training Loss: 0.00634
Epoch: 45751, Training Loss: 0.00658
Epoch: 45751, Training Loss: 0.00659
Epoch: 45751, Training Loss: 0.00807
Epoch: 45752, Training Loss: 0.00634
Epoch: 45752, Training Loss: 0.00658
Epoch: 45752, Training Loss: 0.00659
Epoch: 45752, Training Loss: 0.00807
Epoch: 45753, Training Loss: 0.00634
Epoch: 45753, Training Loss: 0.00658
Epoch: 45753, Training Loss: 0.00659
Epoch: 45753, Training Loss: 0.00807
Epoch: 45754, Training Loss: 0.00634
Epoch: 45754, Training Loss: 0.00658
Epoch: 45754, Training Loss: 0.00659
Epoch: 45754, Training Loss: 0.00807
Epoch: 45755, Training Loss: 0.00634
Epoch: 45755, Training Loss: 0.00658
Epoch: 45755, Training Loss: 0.00659
Epoch: 45755, Training Loss: 0.00807
Epoch: 45756, Training Loss: 0.00634
Epoch: 45756, Training Loss: 0.00658
Epoch: 45756, Training Loss: 0.00659
Epoch: 45756, Training Loss: 0.00807
Epoch: 45757, Training Loss: 0.00634
Epoch: 45757, Training Loss: 0.00658
Epoch: 45757, Training Loss: 0.00659
Epoch: 45757, Training Loss: 0.00807
Epoch: 45758, Training Loss: 0.00634
Epoch: 45758, Training Loss: 0.00658
Epoch: 45758, Training Loss: 0.00659
Epoch: 45758, Training Loss: 0.00807
Epoch: 45759, Training Loss: 0.00634
Epoch: 45759, Training Loss: 0.00658
Epoch: 45759, Training Loss: 0.00659
Epoch: 45759, Training Loss: 0.00807
Epoch: 45760, Training Loss: 0.00634
Epoch: 45760, Training Loss: 0.00658
Epoch: 45760, Training Loss: 0.00659
Epoch: 45760, Training Loss: 0.00807
Epoch: 45761, Training Loss: 0.00634
Epoch: 45761, Training Loss: 0.00658
Epoch: 45761, Training Loss: 0.00659
Epoch: 45761, Training Loss: 0.00807
Epoch: 45762, Training Loss: 0.00634
Epoch: 45762, Training Loss: 0.00658
Epoch: 45762, Training Loss: 0.00659
Epoch: 45762, Training Loss: 0.00807
Epoch: 45763, Training Loss: 0.00634
Epoch: 45763, Training Loss: 0.00658
Epoch: 45763, Training Loss: 0.00659
Epoch: 45763, Training Loss: 0.00807
Epoch: 45764, Training Loss: 0.00634
Epoch: 45764, Training Loss: 0.00658
Epoch: 45764, Training Loss: 0.00659
Epoch: 45764, Training Loss: 0.00807
Epoch: 45765, Training Loss: 0.00634
Epoch: 45765, Training Loss: 0.00658
Epoch: 45765, Training Loss: 0.00659
Epoch: 45765, Training Loss: 0.00807
Epoch: 45766, Training Loss: 0.00634
Epoch: 45766, Training Loss: 0.00658
Epoch: 45766, Training Loss: 0.00659
Epoch: 45766, Training Loss: 0.00807
Epoch: 45767, Training Loss: 0.00634
Epoch: 45767, Training Loss: 0.00658
Epoch: 45767, Training Loss: 0.00659
Epoch: 45767, Training Loss: 0.00807
Epoch: 45768, Training Loss: 0.00634
Epoch: 45768, Training Loss: 0.00658
Epoch: 45768, Training Loss: 0.00659
Epoch: 45768, Training Loss: 0.00807
Epoch: 45769, Training Loss: 0.00634
Epoch: 45769, Training Loss: 0.00658
Epoch: 45769, Training Loss: 0.00659
Epoch: 45769, Training Loss: 0.00807
Epoch: 45770, Training Loss: 0.00634
Epoch: 45770, Training Loss: 0.00658
Epoch: 45770, Training Loss: 0.00659
Epoch: 45770, Training Loss: 0.00807
Epoch: 45771, Training Loss: 0.00634
Epoch: 45771, Training Loss: 0.00658
Epoch: 45771, Training Loss: 0.00659
Epoch: 45771, Training Loss: 0.00807
Epoch: 45772, Training Loss: 0.00634
Epoch: 45772, Training Loss: 0.00658
Epoch: 45772, Training Loss: 0.00659
Epoch: 45772, Training Loss: 0.00807
Epoch: 45773, Training Loss: 0.00634
Epoch: 45773, Training Loss: 0.00658
Epoch: 45773, Training Loss: 0.00659
Epoch: 45773, Training Loss: 0.00807
Epoch: 45774, Training Loss: 0.00634
Epoch: 45774, Training Loss: 0.00658
Epoch: 45774, Training Loss: 0.00659
Epoch: 45774, Training Loss: 0.00807
Epoch: 45775, Training Loss: 0.00634
Epoch: 45775, Training Loss: 0.00658
Epoch: 45775, Training Loss: 0.00659
Epoch: 45775, Training Loss: 0.00807
Epoch: 45776, Training Loss: 0.00634
Epoch: 45776, Training Loss: 0.00658
Epoch: 45776, Training Loss: 0.00659
Epoch: 45776, Training Loss: 0.00807
Epoch: 45777, Training Loss: 0.00634
Epoch: 45777, Training Loss: 0.00658
Epoch: 45777, Training Loss: 0.00659
Epoch: 45777, Training Loss: 0.00807
Epoch: 45778, Training Loss: 0.00634
Epoch: 45778, Training Loss: 0.00658
Epoch: 45778, Training Loss: 0.00659
Epoch: 45778, Training Loss: 0.00807
Epoch: 45779, Training Loss: 0.00634
Epoch: 45779, Training Loss: 0.00658
Epoch: 45779, Training Loss: 0.00659
Epoch: 45779, Training Loss: 0.00807
Epoch: 45780, Training Loss: 0.00634
Epoch: 45780, Training Loss: 0.00658
Epoch: 45780, Training Loss: 0.00659
Epoch: 45780, Training Loss: 0.00807
Epoch: 45781, Training Loss: 0.00634
Epoch: 45781, Training Loss: 0.00658
Epoch: 45781, Training Loss: 0.00659
Epoch: 45781, Training Loss: 0.00807
Epoch: 45782, Training Loss: 0.00634
Epoch: 45782, Training Loss: 0.00658
Epoch: 45782, Training Loss: 0.00658
Epoch: 45782, Training Loss: 0.00807
Epoch: 45783, Training Loss: 0.00634
Epoch: 45783, Training Loss: 0.00658
Epoch: 45783, Training Loss: 0.00658
Epoch: 45783, Training Loss: 0.00807
Epoch: 45784, Training Loss: 0.00634
Epoch: 45784, Training Loss: 0.00658
Epoch: 45784, Training Loss: 0.00658
Epoch: 45784, Training Loss: 0.00807
Epoch: 45785, Training Loss: 0.00634
Epoch: 45785, Training Loss: 0.00658
Epoch: 45785, Training Loss: 0.00658
Epoch: 45785, Training Loss: 0.00807
Epoch: 45786, Training Loss: 0.00634
Epoch: 45786, Training Loss: 0.00658
Epoch: 45786, Training Loss: 0.00658
Epoch: 45786, Training Loss: 0.00807
Epoch: 45787, Training Loss: 0.00634
Epoch: 45787, Training Loss: 0.00658
Epoch: 45787, Training Loss: 0.00658
Epoch: 45787, Training Loss: 0.00807
Epoch: 45788, Training Loss: 0.00634
Epoch: 45788, Training Loss: 0.00658
Epoch: 45788, Training Loss: 0.00658
Epoch: 45788, Training Loss: 0.00807
Epoch: 45789, Training Loss: 0.00634
Epoch: 45789, Training Loss: 0.00658
Epoch: 45789, Training Loss: 0.00658
Epoch: 45789, Training Loss: 0.00807
Epoch: 45790, Training Loss: 0.00634
Epoch: 45790, Training Loss: 0.00658
Epoch: 45790, Training Loss: 0.00658
Epoch: 45790, Training Loss: 0.00807
Epoch: 45791, Training Loss: 0.00634
Epoch: 45791, Training Loss: 0.00658
Epoch: 45791, Training Loss: 0.00658
Epoch: 45791, Training Loss: 0.00807
Epoch: 45792, Training Loss: 0.00634
Epoch: 45792, Training Loss: 0.00658
Epoch: 45792, Training Loss: 0.00658
Epoch: 45792, Training Loss: 0.00807
Epoch: 45793, Training Loss: 0.00634
Epoch: 45793, Training Loss: 0.00658
Epoch: 45793, Training Loss: 0.00658
Epoch: 45793, Training Loss: 0.00807
Epoch: 45794, Training Loss: 0.00634
Epoch: 45794, Training Loss: 0.00658
Epoch: 45794, Training Loss: 0.00658
Epoch: 45794, Training Loss: 0.00807
Epoch: 45795, Training Loss: 0.00634
Epoch: 45795, Training Loss: 0.00658
Epoch: 45795, Training Loss: 0.00658
Epoch: 45795, Training Loss: 0.00807
Epoch: 45796, Training Loss: 0.00634
Epoch: 45796, Training Loss: 0.00658
Epoch: 45796, Training Loss: 0.00658
Epoch: 45796, Training Loss: 0.00807
Epoch: 45797, Training Loss: 0.00634
Epoch: 45797, Training Loss: 0.00658
Epoch: 45797, Training Loss: 0.00658
Epoch: 45797, Training Loss: 0.00807
Epoch: 45798, Training Loss: 0.00634
Epoch: 45798, Training Loss: 0.00658
Epoch: 45798, Training Loss: 0.00658
Epoch: 45798, Training Loss: 0.00807
Epoch: 45799, Training Loss: 0.00634
Epoch: 45799, Training Loss: 0.00658
Epoch: 45799, Training Loss: 0.00658
Epoch: 45799, Training Loss: 0.00807
Epoch: 45800, Training Loss: 0.00634
Epoch: 45800, Training Loss: 0.00658
Epoch: 45800, Training Loss: 0.00658
Epoch: 45800, Training Loss: 0.00807
Epoch: 45801, Training Loss: 0.00634
Epoch: 45801, Training Loss: 0.00658
Epoch: 45801, Training Loss: 0.00658
Epoch: 45801, Training Loss: 0.00807
Epoch: 45802, Training Loss: 0.00634
Epoch: 45802, Training Loss: 0.00658
Epoch: 45802, Training Loss: 0.00658
Epoch: 45802, Training Loss: 0.00807
Epoch: 45803, Training Loss: 0.00634
Epoch: 45803, Training Loss: 0.00658
Epoch: 45803, Training Loss: 0.00658
Epoch: 45803, Training Loss: 0.00807
Epoch: 45804, Training Loss: 0.00634
Epoch: 45804, Training Loss: 0.00658
Epoch: 45804, Training Loss: 0.00658
Epoch: 45804, Training Loss: 0.00807
Epoch: 45805, Training Loss: 0.00634
Epoch: 45805, Training Loss: 0.00658
Epoch: 45805, Training Loss: 0.00658
Epoch: 45805, Training Loss: 0.00807
Epoch: 45806, Training Loss: 0.00634
Epoch: 45806, Training Loss: 0.00658
Epoch: 45806, Training Loss: 0.00658
Epoch: 45806, Training Loss: 0.00807
Epoch: 45807, Training Loss: 0.00634
Epoch: 45807, Training Loss: 0.00658
Epoch: 45807, Training Loss: 0.00658
Epoch: 45807, Training Loss: 0.00807
Epoch: 45808, Training Loss: 0.00634
Epoch: 45808, Training Loss: 0.00658
Epoch: 45808, Training Loss: 0.00658
Epoch: 45808, Training Loss: 0.00807
Epoch: 45809, Training Loss: 0.00634
Epoch: 45809, Training Loss: 0.00658
Epoch: 45809, Training Loss: 0.00658
Epoch: 45809, Training Loss: 0.00807
Epoch: 45810, Training Loss: 0.00634
Epoch: 45810, Training Loss: 0.00658
Epoch: 45810, Training Loss: 0.00658
Epoch: 45810, Training Loss: 0.00807
Epoch: 45811, Training Loss: 0.00634
Epoch: 45811, Training Loss: 0.00658
Epoch: 45811, Training Loss: 0.00658
Epoch: 45811, Training Loss: 0.00807
Epoch: 45812, Training Loss: 0.00634
Epoch: 45812, Training Loss: 0.00658
Epoch: 45812, Training Loss: 0.00658
Epoch: 45812, Training Loss: 0.00807
Epoch: 45813, Training Loss: 0.00634
Epoch: 45813, Training Loss: 0.00658
Epoch: 45813, Training Loss: 0.00658
Epoch: 45813, Training Loss: 0.00807
Epoch: 45814, Training Loss: 0.00634
Epoch: 45814, Training Loss: 0.00658
Epoch: 45814, Training Loss: 0.00658
Epoch: 45814, Training Loss: 0.00807
Epoch: 45815, Training Loss: 0.00634
Epoch: 45815, Training Loss: 0.00658
Epoch: 45815, Training Loss: 0.00658
Epoch: 45815, Training Loss: 0.00807
Epoch: 45816, Training Loss: 0.00634
Epoch: 45816, Training Loss: 0.00658
Epoch: 45816, Training Loss: 0.00658
Epoch: 45816, Training Loss: 0.00807
Epoch: 45817, Training Loss: 0.00634
Epoch: 45817, Training Loss: 0.00658
Epoch: 45817, Training Loss: 0.00658
Epoch: 45817, Training Loss: 0.00807
Epoch: 45818, Training Loss: 0.00634
Epoch: 45818, Training Loss: 0.00657
Epoch: 45818, Training Loss: 0.00658
Epoch: 45818, Training Loss: 0.00807
Epoch: 45819, Training Loss: 0.00634
Epoch: 45819, Training Loss: 0.00657
Epoch: 45819, Training Loss: 0.00658
Epoch: 45819, Training Loss: 0.00807
Epoch: 45820, Training Loss: 0.00634
Epoch: 45820, Training Loss: 0.00657
Epoch: 45820, Training Loss: 0.00658
Epoch: 45820, Training Loss: 0.00807
Epoch: 45821, Training Loss: 0.00634
Epoch: 45821, Training Loss: 0.00657
Epoch: 45821, Training Loss: 0.00658
Epoch: 45821, Training Loss: 0.00807
Epoch: 45822, Training Loss: 0.00634
Epoch: 45822, Training Loss: 0.00657
Epoch: 45822, Training Loss: 0.00658
Epoch: 45822, Training Loss: 0.00807
Epoch: 45823, Training Loss: 0.00634
Epoch: 45823, Training Loss: 0.00657
Epoch: 45823, Training Loss: 0.00658
Epoch: 45823, Training Loss: 0.00807
Epoch: 45824, Training Loss: 0.00634
Epoch: 45824, Training Loss: 0.00657
Epoch: 45824, Training Loss: 0.00658
Epoch: 45824, Training Loss: 0.00807
Epoch: 45825, Training Loss: 0.00634
Epoch: 45825, Training Loss: 0.00657
Epoch: 45825, Training Loss: 0.00658
Epoch: 45825, Training Loss: 0.00807
Epoch: 45826, Training Loss: 0.00634
Epoch: 45826, Training Loss: 0.00657
Epoch: 45826, Training Loss: 0.00658
Epoch: 45826, Training Loss: 0.00807
Epoch: 45827, Training Loss: 0.00634
Epoch: 45827, Training Loss: 0.00657
Epoch: 45827, Training Loss: 0.00658
Epoch: 45827, Training Loss: 0.00807
Epoch: 45828, Training Loss: 0.00634
Epoch: 45828, Training Loss: 0.00657
Epoch: 45828, Training Loss: 0.00658
Epoch: 45828, Training Loss: 0.00807
Epoch: 45829, Training Loss: 0.00634
Epoch: 45829, Training Loss: 0.00657
Epoch: 45829, Training Loss: 0.00658
Epoch: 45829, Training Loss: 0.00807
Epoch: 45830, Training Loss: 0.00634
Epoch: 45830, Training Loss: 0.00657
Epoch: 45830, Training Loss: 0.00658
Epoch: 45830, Training Loss: 0.00807
Epoch: 45831, Training Loss: 0.00634
Epoch: 45831, Training Loss: 0.00657
Epoch: 45831, Training Loss: 0.00658
Epoch: 45831, Training Loss: 0.00807
Epoch: 45832, Training Loss: 0.00634
Epoch: 45832, Training Loss: 0.00657
Epoch: 45832, Training Loss: 0.00658
Epoch: 45832, Training Loss: 0.00807
Epoch: 45833, Training Loss: 0.00634
Epoch: 45833, Training Loss: 0.00657
Epoch: 45833, Training Loss: 0.00658
Epoch: 45833, Training Loss: 0.00807
Epoch: 45834, Training Loss: 0.00634
Epoch: 45834, Training Loss: 0.00657
Epoch: 45834, Training Loss: 0.00658
Epoch: 45834, Training Loss: 0.00807
Epoch: 45835, Training Loss: 0.00634
Epoch: 45835, Training Loss: 0.00657
Epoch: 45835, Training Loss: 0.00658
Epoch: 45835, Training Loss: 0.00806
Epoch: 45836, Training Loss: 0.00634
Epoch: 45836, Training Loss: 0.00657
Epoch: 45836, Training Loss: 0.00658
Epoch: 45836, Training Loss: 0.00806
Epoch: 45837, Training Loss: 0.00634
Epoch: 45837, Training Loss: 0.00657
Epoch: 45837, Training Loss: 0.00658
Epoch: 45837, Training Loss: 0.00806
Epoch: 45838, Training Loss: 0.00634
Epoch: 45838, Training Loss: 0.00657
Epoch: 45838, Training Loss: 0.00658
Epoch: 45838, Training Loss: 0.00806
Epoch: 45839, Training Loss: 0.00634
Epoch: 45839, Training Loss: 0.00657
Epoch: 45839, Training Loss: 0.00658
Epoch: 45839, Training Loss: 0.00806
Epoch: 45840, Training Loss: 0.00634
Epoch: 45840, Training Loss: 0.00657
Epoch: 45840, Training Loss: 0.00658
Epoch: 45840, Training Loss: 0.00806
Epoch: 45841, Training Loss: 0.00634
Epoch: 45841, Training Loss: 0.00657
Epoch: 45841, Training Loss: 0.00658
Epoch: 45841, Training Loss: 0.00806
Epoch: 45842, Training Loss: 0.00634
Epoch: 45842, Training Loss: 0.00657
Epoch: 45842, Training Loss: 0.00658
Epoch: 45842, Training Loss: 0.00806
Epoch: 45843, Training Loss: 0.00634
Epoch: 45843, Training Loss: 0.00657
Epoch: 45843, Training Loss: 0.00658
Epoch: 45843, Training Loss: 0.00806
Epoch: 45844, Training Loss: 0.00634
Epoch: 45844, Training Loss: 0.00657
Epoch: 45844, Training Loss: 0.00658
Epoch: 45844, Training Loss: 0.00806
Epoch: 45845, Training Loss: 0.00634
Epoch: 45845, Training Loss: 0.00657
Epoch: 45845, Training Loss: 0.00658
Epoch: 45845, Training Loss: 0.00806
Epoch: 45846, Training Loss: 0.00634
Epoch: 45846, Training Loss: 0.00657
Epoch: 45846, Training Loss: 0.00658
Epoch: 45846, Training Loss: 0.00806
Epoch: 45847, Training Loss: 0.00634
Epoch: 45847, Training Loss: 0.00657
Epoch: 45847, Training Loss: 0.00658
Epoch: 45847, Training Loss: 0.00806
Epoch: 45848, Training Loss: 0.00634
Epoch: 45848, Training Loss: 0.00657
Epoch: 45848, Training Loss: 0.00658
Epoch: 45848, Training Loss: 0.00806
Epoch: 45849, Training Loss: 0.00634
Epoch: 45849, Training Loss: 0.00657
Epoch: 45849, Training Loss: 0.00658
Epoch: 45849, Training Loss: 0.00806
Epoch: 45850, Training Loss: 0.00634
Epoch: 45850, Training Loss: 0.00657
Epoch: 45850, Training Loss: 0.00658
Epoch: 45850, Training Loss: 0.00806
Epoch: 45851, Training Loss: 0.00634
Epoch: 45851, Training Loss: 0.00657
Epoch: 45851, Training Loss: 0.00658
Epoch: 45851, Training Loss: 0.00806
Epoch: 45852, Training Loss: 0.00634
Epoch: 45852, Training Loss: 0.00657
Epoch: 45852, Training Loss: 0.00658
Epoch: 45852, Training Loss: 0.00806
Epoch: 45853, Training Loss: 0.00634
Epoch: 45853, Training Loss: 0.00657
Epoch: 45853, Training Loss: 0.00658
Epoch: 45853, Training Loss: 0.00806
Epoch: 45854, Training Loss: 0.00634
Epoch: 45854, Training Loss: 0.00657
Epoch: 45854, Training Loss: 0.00658
Epoch: 45854, Training Loss: 0.00806
Epoch: 45855, Training Loss: 0.00634
Epoch: 45855, Training Loss: 0.00657
Epoch: 45855, Training Loss: 0.00658
Epoch: 45855, Training Loss: 0.00806
Epoch: 45856, Training Loss: 0.00634
Epoch: 45856, Training Loss: 0.00657
Epoch: 45856, Training Loss: 0.00658
Epoch: 45856, Training Loss: 0.00806
Epoch: 45857, Training Loss: 0.00634
Epoch: 45857, Training Loss: 0.00657
Epoch: 45857, Training Loss: 0.00658
Epoch: 45857, Training Loss: 0.00806
Epoch: 45858, Training Loss: 0.00634
Epoch: 45858, Training Loss: 0.00657
Epoch: 45858, Training Loss: 0.00658
Epoch: 45858, Training Loss: 0.00806
Epoch: 45859, Training Loss: 0.00634
Epoch: 45859, Training Loss: 0.00657
Epoch: 45859, Training Loss: 0.00658
Epoch: 45859, Training Loss: 0.00806
Epoch: 45860, Training Loss: 0.00634
Epoch: 45860, Training Loss: 0.00657
Epoch: 45860, Training Loss: 0.00658
Epoch: 45860, Training Loss: 0.00806
Epoch: 45861, Training Loss: 0.00634
Epoch: 45861, Training Loss: 0.00657
Epoch: 45861, Training Loss: 0.00658
Epoch: 45861, Training Loss: 0.00806
Epoch: 45862, Training Loss: 0.00634
Epoch: 45862, Training Loss: 0.00657
Epoch: 45862, Training Loss: 0.00658
Epoch: 45862, Training Loss: 0.00806
Epoch: 45863, Training Loss: 0.00634
Epoch: 45863, Training Loss: 0.00657
Epoch: 45863, Training Loss: 0.00658
Epoch: 45863, Training Loss: 0.00806
Epoch: 45864, Training Loss: 0.00634
Epoch: 45864, Training Loss: 0.00657
Epoch: 45864, Training Loss: 0.00658
Epoch: 45864, Training Loss: 0.00806
Epoch: 45865, Training Loss: 0.00634
Epoch: 45865, Training Loss: 0.00657
Epoch: 45865, Training Loss: 0.00658
Epoch: 45865, Training Loss: 0.00806
Epoch: 45866, Training Loss: 0.00634
Epoch: 45866, Training Loss: 0.00657
Epoch: 45866, Training Loss: 0.00658
Epoch: 45866, Training Loss: 0.00806
Epoch: 45867, Training Loss: 0.00634
Epoch: 45867, Training Loss: 0.00657
Epoch: 45867, Training Loss: 0.00658
Epoch: 45867, Training Loss: 0.00806
Epoch: 45868, Training Loss: 0.00634
Epoch: 45868, Training Loss: 0.00657
Epoch: 45868, Training Loss: 0.00658
Epoch: 45868, Training Loss: 0.00806
Epoch: 45869, Training Loss: 0.00634
Epoch: 45869, Training Loss: 0.00657
Epoch: 45869, Training Loss: 0.00658
Epoch: 45869, Training Loss: 0.00806
Epoch: 45870, Training Loss: 0.00634
Epoch: 45870, Training Loss: 0.00657
Epoch: 45870, Training Loss: 0.00658
Epoch: 45870, Training Loss: 0.00806
Epoch: 45871, Training Loss: 0.00634
Epoch: 45871, Training Loss: 0.00657
Epoch: 45871, Training Loss: 0.00658
Epoch: 45871, Training Loss: 0.00806
Epoch: 45872, Training Loss: 0.00634
Epoch: 45872, Training Loss: 0.00657
Epoch: 45872, Training Loss: 0.00658
Epoch: 45872, Training Loss: 0.00806
Epoch: 45873, Training Loss: 0.00633
Epoch: 45873, Training Loss: 0.00657
Epoch: 45873, Training Loss: 0.00658
Epoch: 45873, Training Loss: 0.00806
Epoch: 45874, Training Loss: 0.00633
Epoch: 45874, Training Loss: 0.00657
Epoch: 45874, Training Loss: 0.00658
Epoch: 45874, Training Loss: 0.00806
Epoch: 45875, Training Loss: 0.00633
Epoch: 45875, Training Loss: 0.00657
Epoch: 45875, Training Loss: 0.00658
Epoch: 45875, Training Loss: 0.00806
Epoch: 45876, Training Loss: 0.00633
Epoch: 45876, Training Loss: 0.00657
Epoch: 45876, Training Loss: 0.00658
Epoch: 45876, Training Loss: 0.00806
Epoch: 45877, Training Loss: 0.00633
Epoch: 45877, Training Loss: 0.00657
Epoch: 45877, Training Loss: 0.00658
Epoch: 45877, Training Loss: 0.00806
Epoch: 45878, Training Loss: 0.00633
Epoch: 45878, Training Loss: 0.00657
Epoch: 45878, Training Loss: 0.00658
Epoch: 45878, Training Loss: 0.00806
Epoch: 45879, Training Loss: 0.00633
Epoch: 45879, Training Loss: 0.00657
Epoch: 45879, Training Loss: 0.00658
Epoch: 45879, Training Loss: 0.00806
Epoch: 45880, Training Loss: 0.00633
Epoch: 45880, Training Loss: 0.00657
Epoch: 45880, Training Loss: 0.00658
Epoch: 45880, Training Loss: 0.00806
Epoch: 45881, Training Loss: 0.00633
Epoch: 45881, Training Loss: 0.00657
Epoch: 45881, Training Loss: 0.00658
Epoch: 45881, Training Loss: 0.00806
Epoch: 45882, Training Loss: 0.00633
Epoch: 45882, Training Loss: 0.00657
Epoch: 45882, Training Loss: 0.00658
Epoch: 45882, Training Loss: 0.00806
Epoch: 45883, Training Loss: 0.00633
Epoch: 45883, Training Loss: 0.00657
Epoch: 45883, Training Loss: 0.00658
Epoch: 45883, Training Loss: 0.00806
Epoch: 45884, Training Loss: 0.00633
Epoch: 45884, Training Loss: 0.00657
Epoch: 45884, Training Loss: 0.00658
Epoch: 45884, Training Loss: 0.00806
Epoch: 45885, Training Loss: 0.00633
Epoch: 45885, Training Loss: 0.00657
Epoch: 45885, Training Loss: 0.00658
Epoch: 45885, Training Loss: 0.00806
Epoch: 45886, Training Loss: 0.00633
Epoch: 45886, Training Loss: 0.00657
Epoch: 45886, Training Loss: 0.00658
Epoch: 45886, Training Loss: 0.00806
Epoch: 45887, Training Loss: 0.00633
Epoch: 45887, Training Loss: 0.00657
Epoch: 45887, Training Loss: 0.00658
Epoch: 45887, Training Loss: 0.00806
Epoch: 45888, Training Loss: 0.00633
Epoch: 45888, Training Loss: 0.00657
Epoch: 45888, Training Loss: 0.00658
Epoch: 45888, Training Loss: 0.00806
Epoch: 45889, Training Loss: 0.00633
Epoch: 45889, Training Loss: 0.00657
Epoch: 45889, Training Loss: 0.00658
Epoch: 45889, Training Loss: 0.00806
Epoch: 45890, Training Loss: 0.00633
Epoch: 45890, Training Loss: 0.00657
Epoch: 45890, Training Loss: 0.00658
Epoch: 45890, Training Loss: 0.00806
Epoch: 45891, Training Loss: 0.00633
Epoch: 45891, Training Loss: 0.00657
Epoch: 45891, Training Loss: 0.00658
Epoch: 45891, Training Loss: 0.00806
Epoch: 45892, Training Loss: 0.00633
Epoch: 45892, Training Loss: 0.00657
Epoch: 45892, Training Loss: 0.00658
Epoch: 45892, Training Loss: 0.00806
Epoch: 45893, Training Loss: 0.00633
Epoch: 45893, Training Loss: 0.00657
Epoch: 45893, Training Loss: 0.00658
Epoch: 45893, Training Loss: 0.00806
...
Epoch: 45999, Training Loss: 0.00633
Epoch: 45999, Training Loss: 0.00656
Epoch: 45999, Training Loss: 0.00657
Epoch: 45999, Training Loss: 0.00805

In [43]:
def eval_network2(input_value):
    """Test the network and return the output value."""   
    network = Network2([2,2,1])
    network.set_values(weights2, biases2)
    output_value = network.feedforward(input_value)
    return output_value[0][0]
    
def test2():
    """A few tests to make sure the network performs properly."""
    print("0 XOR 0 = 0?: ", eval_network2(np.array([0,0])[:,None]))
    print("0 XOR 1 = 1?: ", eval_network2(np.array([0,1])[:,None]))
    print("1 XOR 0 = 1?: ", eval_network2(np.array([1,0])[:,None]))
    print("1 XOR 1 = 0?: ", eval_network2(np.array([1,1])[:,None]))   
    
test2()


0 XOR 0 = 0?:  0.00629314519422
0 XOR 1 = 1?:  0.993498035933
1 XOR 0 = 1?:  0.993471858555
1 XOR 1 = 0?:  0.00797628178004
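The raw outputs above are continuous sigmoid activations; thresholding them at 0.5 recovers the binary truth table. As a self-contained illustration of the same 2-2-1 architecture, here is a sketch with hand-picked weights known to solve XOR (illustrative values only, not the trained `weights2`/`biases2`):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Hand-picked 2-2-1 weights that solve XOR (illustrative, not the trained values)
W1 = np.array([[20.0, 20.0], [-20.0, -20.0]])
b1 = np.array([[-10.0], [30.0]])
W2 = np.array([[20.0, 20.0]])
b2 = np.array([[-30.0]])

def predict(x):
    a1 = sigmoid(W1 @ x + b1)   # hidden layer activations
    a2 = sigmoid(W2 @ a1 + b2)  # output activation
    # Threshold the continuous sigmoid output at 0.5 to get a binary answer
    return int(a2[0, 0] > 0.5)

predictions = [predict(np.array(x)[:, None]) for x in [(0, 0), (0, 1), (1, 0), (1, 1)]]
print(predictions)  # [0, 1, 1, 0], matching the XOR truth table
```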

In [86]:
plt.figure(2)
train_losses2 = np.array(train_losses2)
plt.plot(epoch_n2, train_losses2.T)
plt.xlabel("Epoch")
plt.ylabel("Loss")
plt.title("Training Losses")
plt.show()


Cost Function

This part uses a different cost function: the cross-entropy cost.
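For a single sigmoid output neuron, the cross-entropy cost is $C = -[y\ln a + (1-y)\ln(1-a)]$. Its key property is that the $\sigma'(z)$ factor cancels in the output-layer error, leaving $\delta = a - y$. A small standalone sketch (with its own `sigmoid` and `cross_entropy` definitions) verifying this numerically:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def cross_entropy(a, y):
    # C = -[y*ln(a) + (1-y)*ln(1-a)] for a single output neuron
    return -(y * np.log(a) + (1 - y) * np.log(1 - a))

# Numerically check that dC/dz = a - y when a = sigmoid(z)
z, y, eps = 0.7, 1.0, 1e-6
numeric = (cross_entropy(sigmoid(z + eps), y)
           - cross_entropy(sigmoid(z - eps), y)) / (2 * eps)
analytic = sigmoid(z) - y  # the sigmoid'(z) factor has cancelled
print(abs(numeric - analytic) < 1e-8)
```

This cancellation is why the cross-entropy network avoids the learning slowdown of the quadratic cost when the output neuron saturates.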


In [49]:
# Set training epochs and learning rate
epochs3 = 1000
lr_rate = 0.6

In [45]:
class Network3(object):
    
    def __init__(self, sizes):
        """The list "sizes" contains the number of neurons in respective layers.""" 
        self.sizes = sizes
        self.number_layers = len(sizes)
        self.biases = [np.random.randn(i, 1) for i in sizes[1:]]
        self.weights = [np.random.randn(i, j) for i,j in zip(sizes[1:],sizes[:-1])]   
        
    def feedforward(self, input_):
        """Return the output of the neural network."""
        activation = input_
        for bias, weight in zip(self.biases, self.weights):
            output = np.dot(weight, activation) + bias
            activation = sigmoid(output)
        return activation
    
    def set_values(self, weights, biases):
        self.weights = weights
        self.biases = biases
    
    def backprop(self, training_data, desired_output, lr_rate):
        """Perform one backpropagation step with the cross-entropy cost."""
        # Store gradients of weights and biases for update.
        # (Per-layer arrays have different shapes, so build plain lists
        # rather than calling np.zeros_like on the list itself.)
        grad_weights = [np.zeros(w.shape) for w in self.weights]
        grad_biases = [np.zeros(b.shape) for b in self.biases]
        
        # Store outputs and activations for backprop
        outputs = []
        activation = training_data
        activations = [activation] 
        for b, w in zip(self.biases, self.weights):
            output = np.dot(w, activation) + b
            outputs.append(output)
            activation = sigmoid(output)
            activations.append(activation)
        
        # Compute the gradients of the last layer.
        # With the cross-entropy cost the sigmoid'(z) factor cancels,
        # leaving delta = a - y at the output layer.
        delta = activations[-1] - desired_output
        grad_biases[-1] = delta
        grad_weights[-1] = np.dot(delta, activations[-2].transpose())
        
        # Compute gradients of remaining layers
        for layer in range(2, self.number_layers):
            delta = np.dot(self.weights[-layer+1].transpose(), delta) * sigmoid_deriv(outputs[-layer])
            grad_biases[-layer] = delta
            grad_weights[-layer] = np.dot(delta, activations[-layer-1].transpose())
        
        # Update weights and biases
        self.weights = [w-lr_rate*grad_w for w, grad_w in zip(self.weights, grad_weights)]
        self.biases = [b-lr_rate*grad_b for b, grad_b in zip(self.biases, grad_biases)]
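As a sanity check on the backprop equations above, the analytic gradient can be compared against finite differences of the cost. A minimal self-contained sketch (it re-derives the same 2-2-1 forward pass with its own randomly initialized weights rather than reusing `Network3`, so it runs on its own):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
W1, b1 = rng.standard_normal((2, 2)), rng.standard_normal((2, 1))
W2, b2 = rng.standard_normal((1, 2)), rng.standard_normal((1, 1))
x, y = np.array([[1.0], [0.0]]), 1.0

def loss(W1, b1, W2, b2):
    # Cross-entropy cost of the 2-2-1 network on one sample
    a1 = sigmoid(W1 @ x + b1)
    a2 = sigmoid(W2 @ a1 + b2)
    return (-(y * np.log(a2) + (1 - y) * np.log(1 - a2))).item()

# Analytic backprop gradient for W2 (cross-entropy: delta = a2 - y)
a1 = sigmoid(W1 @ x + b1)
a2 = sigmoid(W2 @ a1 + b2)
delta2 = a2 - y
grad_W2 = delta2 @ a1.T

# Finite-difference check on each entry of W2
eps = 1e-6
for j in range(2):
    Wp, Wm = W2.copy(), W2.copy()
    Wp[0, j] += eps
    Wm[0, j] -= eps
    numeric = (loss(W1, b1, Wp, b2) - loss(W1, b1, Wm, b2)) / (2 * eps)
    assert abs(numeric - grad_W2[0, j]) < 1e-6
print("gradient check passed")
```

The same check can be repeated for `W1`, `b1`, and `b2`; agreement to roughly `eps**2` precision indicates the backprop implementation is consistent with the cost.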

In [88]:
# Train the network 
def train3(training_data, desired_output, epochs3, lr_rate):
    losses = [] # Store losses for plotting
    epoch_n = [] # Store epochs for plotting
    network = Network3([2,2,1])
    for epoch in range(epochs3):
        for x, y in zip(training_data, desired_output):
            x = x[:, None] # Convert row vector to column vector
            network.backprop(x, y, lr_rate)
            # Printing out the training progress
            train_loss = abs(y - network.feedforward(x))
            if epoch % 500 == 0:
                losses.append(train_loss[0][0]) 
                epoch_n.append(epoch)
            print("Epoch: {}, Training Loss: {:.5f}".format(epoch, train_loss[0][0]))
    return network.weights, network.biases, losses, epoch_n
    
weights3, biases3, train_losses3, epoch_n3 = train3(training_data, desired_output, epochs3, lr_rate)


Epoch: 0, Training Loss: 0.28528
Epoch: 0, Training Loss: 0.52031
Epoch: 0, Training Loss: 0.42876
Epoch: 0, Training Loss: 0.40858
Epoch: 1, Training Loss: 0.34081
Epoch: 1, Training Loss: 0.46594
Epoch: 1, Training Loss: 0.39644
Epoch: 1, Training Loss: 0.43559
...
Epoch: 117, Training Loss: 0.06235
Epoch: 117, Training Loss: 0.04897
Epoch: 117, Training Loss: 0.08740
Epoch: 117, Training Loss: 0.05033
Epoch: 118, Training Loss: 0.06150
Epoch: 118, Training Loss: 0.04837
Epoch: 118, Training Loss: 0.08616
Epoch: 118, Training Loss: 0.04964
Epoch: 119, Training Loss: 0.06067
Epoch: 119, Training Loss: 0.04779
Epoch: 119, Training Loss: 0.08495
Epoch: 119, Training Loss: 0.04897
Epoch: 120, Training Loss: 0.05986
Epoch: 120, Training Loss: 0.04722
Epoch: 120, Training Loss: 0.08378
Epoch: 120, Training Loss: 0.04831
Epoch: 121, Training Loss: 0.05907
Epoch: 121, Training Loss: 0.04666
Epoch: 121, Training Loss: 0.08263
Epoch: 121, Training Loss: 0.04768
Epoch: 122, Training Loss: 0.05830
Epoch: 122, Training Loss: 0.04612
Epoch: 122, Training Loss: 0.08151
Epoch: 122, Training Loss: 0.04705
Epoch: 123, Training Loss: 0.05755
Epoch: 123, Training Loss: 0.04559
Epoch: 123, Training Loss: 0.08042
Epoch: 123, Training Loss: 0.04645
Epoch: 124, Training Loss: 0.05682
Epoch: 124, Training Loss: 0.04507
Epoch: 124, Training Loss: 0.07935
Epoch: 124, Training Loss: 0.04586
Epoch: 125, Training Loss: 0.05610
Epoch: 125, Training Loss: 0.04455
Epoch: 125, Training Loss: 0.07831
Epoch: 125, Training Loss: 0.04528
Epoch: 126, Training Loss: 0.05540
Epoch: 126, Training Loss: 0.04406
Epoch: 126, Training Loss: 0.07730
Epoch: 126, Training Loss: 0.04472
Epoch: 127, Training Loss: 0.05471
Epoch: 127, Training Loss: 0.04357
Epoch: 127, Training Loss: 0.07631
Epoch: 127, Training Loss: 0.04416
Epoch: 128, Training Loss: 0.05404
Epoch: 128, Training Loss: 0.04309
Epoch: 128, Training Loss: 0.07534
Epoch: 128, Training Loss: 0.04363
Epoch: 129, Training Loss: 0.05339
Epoch: 129, Training Loss: 0.04262
Epoch: 129, Training Loss: 0.07439
Epoch: 129, Training Loss: 0.04310
Epoch: 130, Training Loss: 0.05275
Epoch: 130, Training Loss: 0.04216
Epoch: 130, Training Loss: 0.07347
Epoch: 130, Training Loss: 0.04259
Epoch: 131, Training Loss: 0.05212
Epoch: 131, Training Loss: 0.04171
Epoch: 131, Training Loss: 0.07257
Epoch: 131, Training Loss: 0.04209
Epoch: 132, Training Loss: 0.05151
Epoch: 132, Training Loss: 0.04127
Epoch: 132, Training Loss: 0.07169
Epoch: 132, Training Loss: 0.04159
Epoch: 133, Training Loss: 0.05091
Epoch: 133, Training Loss: 0.04083
Epoch: 133, Training Loss: 0.07082
Epoch: 133, Training Loss: 0.04111
Epoch: 134, Training Loss: 0.05032
Epoch: 134, Training Loss: 0.04041
Epoch: 134, Training Loss: 0.06998
Epoch: 134, Training Loss: 0.04064
Epoch: 135, Training Loss: 0.04975
Epoch: 135, Training Loss: 0.03999
Epoch: 135, Training Loss: 0.06916
Epoch: 135, Training Loss: 0.04019
Epoch: 136, Training Loss: 0.04918
Epoch: 136, Training Loss: 0.03958
Epoch: 136, Training Loss: 0.06835
Epoch: 136, Training Loss: 0.03974
Epoch: 137, Training Loss: 0.04863
Epoch: 137, Training Loss: 0.03918
Epoch: 137, Training Loss: 0.06756
Epoch: 137, Training Loss: 0.03930
Epoch: 138, Training Loss: 0.04810
Epoch: 138, Training Loss: 0.03879
Epoch: 138, Training Loss: 0.06679
Epoch: 138, Training Loss: 0.03886
Epoch: 139, Training Loss: 0.04757
Epoch: 139, Training Loss: 0.03841
Epoch: 139, Training Loss: 0.06603
Epoch: 139, Training Loss: 0.03844
Epoch: 140, Training Loss: 0.04705
Epoch: 140, Training Loss: 0.03803
Epoch: 140, Training Loss: 0.06529
Epoch: 140, Training Loss: 0.03803
Epoch: 141, Training Loss: 0.04654
Epoch: 141, Training Loss: 0.03766
Epoch: 141, Training Loss: 0.06457
Epoch: 141, Training Loss: 0.03762
Epoch: 142, Training Loss: 0.04605
Epoch: 142, Training Loss: 0.03729
Epoch: 142, Training Loss: 0.06386
Epoch: 142, Training Loss: 0.03723
Epoch: 143, Training Loss: 0.04556
Epoch: 143, Training Loss: 0.03694
Epoch: 143, Training Loss: 0.06316
Epoch: 143, Training Loss: 0.03684
Epoch: 144, Training Loss: 0.04508
Epoch: 144, Training Loss: 0.03658
Epoch: 144, Training Loss: 0.06248
Epoch: 144, Training Loss: 0.03646
Epoch: 145, Training Loss: 0.04462
Epoch: 145, Training Loss: 0.03624
Epoch: 145, Training Loss: 0.06181
Epoch: 145, Training Loss: 0.03608
Epoch: 146, Training Loss: 0.04416
Epoch: 146, Training Loss: 0.03590
Epoch: 146, Training Loss: 0.06116
Epoch: 146, Training Loss: 0.03572
Epoch: 147, Training Loss: 0.04371
Epoch: 147, Training Loss: 0.03557
Epoch: 147, Training Loss: 0.06052
Epoch: 147, Training Loss: 0.03536
Epoch: 148, Training Loss: 0.04326
Epoch: 148, Training Loss: 0.03524
Epoch: 148, Training Loss: 0.05989
Epoch: 148, Training Loss: 0.03501
Epoch: 149, Training Loss: 0.04283
Epoch: 149, Training Loss: 0.03492
Epoch: 149, Training Loss: 0.05927
Epoch: 149, Training Loss: 0.03466
Epoch: 150, Training Loss: 0.04241
Epoch: 150, Training Loss: 0.03460
Epoch: 150, Training Loss: 0.05867
Epoch: 150, Training Loss: 0.03432
Epoch: 151, Training Loss: 0.04199
Epoch: 151, Training Loss: 0.03429
Epoch: 151, Training Loss: 0.05807
Epoch: 151, Training Loss: 0.03399
Epoch: 152, Training Loss: 0.04158
Epoch: 152, Training Loss: 0.03399
Epoch: 152, Training Loss: 0.05749
Epoch: 152, Training Loss: 0.03366
Epoch: 153, Training Loss: 0.04118
Epoch: 153, Training Loss: 0.03369
Epoch: 153, Training Loss: 0.05692
Epoch: 153, Training Loss: 0.03334
Epoch: 154, Training Loss: 0.04078
Epoch: 154, Training Loss: 0.03339
Epoch: 154, Training Loss: 0.05636
Epoch: 154, Training Loss: 0.03303
Epoch: 155, Training Loss: 0.04039
Epoch: 155, Training Loss: 0.03310
Epoch: 155, Training Loss: 0.05581
Epoch: 155, Training Loss: 0.03272
Epoch: 156, Training Loss: 0.04001
Epoch: 156, Training Loss: 0.03282
Epoch: 156, Training Loss: 0.05527
Epoch: 156, Training Loss: 0.03241
Epoch: 157, Training Loss: 0.03964
Epoch: 157, Training Loss: 0.03254
Epoch: 157, Training Loss: 0.05474
Epoch: 157, Training Loss: 0.03211
Epoch: 158, Training Loss: 0.03927
Epoch: 158, Training Loss: 0.03226
Epoch: 158, Training Loss: 0.05422
Epoch: 158, Training Loss: 0.03182
Epoch: 159, Training Loss: 0.03891
Epoch: 159, Training Loss: 0.03199
Epoch: 159, Training Loss: 0.05370
Epoch: 159, Training Loss: 0.03153
Epoch: 160, Training Loss: 0.03856
Epoch: 160, Training Loss: 0.03172
Epoch: 160, Training Loss: 0.05320
Epoch: 160, Training Loss: 0.03125
Epoch: 161, Training Loss: 0.03821
Epoch: 161, Training Loss: 0.03146
Epoch: 161, Training Loss: 0.05271
Epoch: 161, Training Loss: 0.03097
Epoch: 162, Training Loss: 0.03786
Epoch: 162, Training Loss: 0.03120
Epoch: 162, Training Loss: 0.05222
Epoch: 162, Training Loss: 0.03070
Epoch: 163, Training Loss: 0.03753
Epoch: 163, Training Loss: 0.03094
Epoch: 163, Training Loss: 0.05175
Epoch: 163, Training Loss: 0.03043
Epoch: 164, Training Loss: 0.03720
Epoch: 164, Training Loss: 0.03069
Epoch: 164, Training Loss: 0.05128
Epoch: 164, Training Loss: 0.03017
Epoch: 165, Training Loss: 0.03687
Epoch: 165, Training Loss: 0.03045
Epoch: 165, Training Loss: 0.05082
Epoch: 165, Training Loss: 0.02991
Epoch: 166, Training Loss: 0.03655
Epoch: 166, Training Loss: 0.03020
Epoch: 166, Training Loss: 0.05036
Epoch: 166, Training Loss: 0.02965
Epoch: 167, Training Loss: 0.03623
Epoch: 167, Training Loss: 0.02996
Epoch: 167, Training Loss: 0.04992
Epoch: 167, Training Loss: 0.02940
Epoch: 168, Training Loss: 0.03592
Epoch: 168, Training Loss: 0.02973
Epoch: 168, Training Loss: 0.04948
Epoch: 168, Training Loss: 0.02915
Epoch: 169, Training Loss: 0.03562
Epoch: 169, Training Loss: 0.02950
Epoch: 169, Training Loss: 0.04905
Epoch: 169, Training Loss: 0.02891
Epoch: 170, Training Loss: 0.03532
Epoch: 170, Training Loss: 0.02927
Epoch: 170, Training Loss: 0.04863
Epoch: 170, Training Loss: 0.02867
Epoch: 171, Training Loss: 0.03502
Epoch: 171, Training Loss: 0.02904
Epoch: 171, Training Loss: 0.04821
Epoch: 171, Training Loss: 0.02843
Epoch: 172, Training Loss: 0.03473
Epoch: 172, Training Loss: 0.02882
Epoch: 172, Training Loss: 0.04780
Epoch: 172, Training Loss: 0.02820
Epoch: 173, Training Loss: 0.03444
Epoch: 173, Training Loss: 0.02860
Epoch: 173, Training Loss: 0.04740
Epoch: 173, Training Loss: 0.02797
Epoch: 174, Training Loss: 0.03416
Epoch: 174, Training Loss: 0.02838
Epoch: 174, Training Loss: 0.04700
Epoch: 174, Training Loss: 0.02775
Epoch: 175, Training Loss: 0.03389
Epoch: 175, Training Loss: 0.02817
Epoch: 175, Training Loss: 0.04661
Epoch: 175, Training Loss: 0.02753
Epoch: 176, Training Loss: 0.03361
Epoch: 176, Training Loss: 0.02796
Epoch: 176, Training Loss: 0.04623
Epoch: 176, Training Loss: 0.02731
Epoch: 177, Training Loss: 0.03334
Epoch: 177, Training Loss: 0.02775
Epoch: 177, Training Loss: 0.04585
Epoch: 177, Training Loss: 0.02709
Epoch: 178, Training Loss: 0.03308
Epoch: 178, Training Loss: 0.02755
Epoch: 178, Training Loss: 0.04548
Epoch: 178, Training Loss: 0.02688
Epoch: 179, Training Loss: 0.03282
Epoch: 179, Training Loss: 0.02735
Epoch: 179, Training Loss: 0.04511
Epoch: 179, Training Loss: 0.02667
Epoch: 180, Training Loss: 0.03256
Epoch: 180, Training Loss: 0.02715
Epoch: 180, Training Loss: 0.04475
Epoch: 180, Training Loss: 0.02647
Epoch: 181, Training Loss: 0.03231
Epoch: 181, Training Loss: 0.02696
Epoch: 181, Training Loss: 0.04439
Epoch: 181, Training Loss: 0.02627
Epoch: 182, Training Loss: 0.03206
Epoch: 182, Training Loss: 0.02676
Epoch: 182, Training Loss: 0.04404
Epoch: 182, Training Loss: 0.02607
Epoch: 183, Training Loss: 0.03181
Epoch: 183, Training Loss: 0.02657
Epoch: 183, Training Loss: 0.04370
Epoch: 183, Training Loss: 0.02587
Epoch: 184, Training Loss: 0.03157
Epoch: 184, Training Loss: 0.02639
Epoch: 184, Training Loss: 0.04336
Epoch: 184, Training Loss: 0.02568
Epoch: 185, Training Loss: 0.03133
Epoch: 185, Training Loss: 0.02620
Epoch: 185, Training Loss: 0.04302
Epoch: 185, Training Loss: 0.02549
Epoch: 186, Training Loss: 0.03109
Epoch: 186, Training Loss: 0.02602
Epoch: 186, Training Loss: 0.04269
Epoch: 186, Training Loss: 0.02530
Epoch: 187, Training Loss: 0.03086
Epoch: 187, Training Loss: 0.02584
Epoch: 187, Training Loss: 0.04237
Epoch: 187, Training Loss: 0.02512
Epoch: 188, Training Loss: 0.03063
Epoch: 188, Training Loss: 0.02566
Epoch: 188, Training Loss: 0.04205
Epoch: 188, Training Loss: 0.02493
Epoch: 189, Training Loss: 0.03041
Epoch: 189, Training Loss: 0.02549
Epoch: 189, Training Loss: 0.04173
Epoch: 189, Training Loss: 0.02475
Epoch: 190, Training Loss: 0.03018
Epoch: 190, Training Loss: 0.02531
Epoch: 190, Training Loss: 0.04142
Epoch: 190, Training Loss: 0.02458
Epoch: 191, Training Loss: 0.02997
Epoch: 191, Training Loss: 0.02514
Epoch: 191, Training Loss: 0.04112
Epoch: 191, Training Loss: 0.02440
Epoch: 192, Training Loss: 0.02975
Epoch: 192, Training Loss: 0.02497
Epoch: 192, Training Loss: 0.04081
Epoch: 192, Training Loss: 0.02423
Epoch: 193, Training Loss: 0.02954
Epoch: 193, Training Loss: 0.02481
Epoch: 193, Training Loss: 0.04051
Epoch: 193, Training Loss: 0.02406
Epoch: 194, Training Loss: 0.02933
Epoch: 194, Training Loss: 0.02464
Epoch: 194, Training Loss: 0.04022
Epoch: 194, Training Loss: 0.02389
Epoch: 195, Training Loss: 0.02912
Epoch: 195, Training Loss: 0.02448
Epoch: 195, Training Loss: 0.03993
Epoch: 195, Training Loss: 0.02372
Epoch: 196, Training Loss: 0.02891
Epoch: 196, Training Loss: 0.02432
Epoch: 196, Training Loss: 0.03964
Epoch: 196, Training Loss: 0.02356
Epoch: 197, Training Loss: 0.02871
Epoch: 197, Training Loss: 0.02416
Epoch: 197, Training Loss: 0.03936
Epoch: 197, Training Loss: 0.02340
Epoch: 198, Training Loss: 0.02851
Epoch: 198, Training Loss: 0.02401
Epoch: 198, Training Loss: 0.03908
Epoch: 198, Training Loss: 0.02324
Epoch: 199, Training Loss: 0.02831
Epoch: 199, Training Loss: 0.02385
Epoch: 199, Training Loss: 0.03881
Epoch: 199, Training Loss: 0.02308
Epoch: 200, Training Loss: 0.02812
Epoch: 200, Training Loss: 0.02370
Epoch: 200, Training Loss: 0.03854
Epoch: 200, Training Loss: 0.02293
Epoch: 201, Training Loss: 0.02793
Epoch: 201, Training Loss: 0.02355
Epoch: 201, Training Loss: 0.03827
Epoch: 201, Training Loss: 0.02277
Epoch: 202, Training Loss: 0.02774
Epoch: 202, Training Loss: 0.02340
Epoch: 202, Training Loss: 0.03801
Epoch: 202, Training Loss: 0.02262
Epoch: 203, Training Loss: 0.02755
Epoch: 203, Training Loss: 0.02325
Epoch: 203, Training Loss: 0.03775
Epoch: 203, Training Loss: 0.02247
Epoch: 204, Training Loss: 0.02737
Epoch: 204, Training Loss: 0.02311
Epoch: 204, Training Loss: 0.03749
Epoch: 204, Training Loss: 0.02233
Epoch: 205, Training Loss: 0.02719
Epoch: 205, Training Loss: 0.02297
Epoch: 205, Training Loss: 0.03724
Epoch: 205, Training Loss: 0.02218
Epoch: 206, Training Loss: 0.02701
Epoch: 206, Training Loss: 0.02282
Epoch: 206, Training Loss: 0.03699
Epoch: 206, Training Loss: 0.02204
Epoch: 207, Training Loss: 0.02683
Epoch: 207, Training Loss: 0.02268
Epoch: 207, Training Loss: 0.03674
Epoch: 207, Training Loss: 0.02190
Epoch: 208, Training Loss: 0.02665
Epoch: 208, Training Loss: 0.02255
Epoch: 208, Training Loss: 0.03650
Epoch: 208, Training Loss: 0.02176
Epoch: 209, Training Loss: 0.02648
Epoch: 209, Training Loss: 0.02241
Epoch: 209, Training Loss: 0.03626
Epoch: 209, Training Loss: 0.02162
Epoch: 210, Training Loss: 0.02631
Epoch: 210, Training Loss: 0.02227
Epoch: 210, Training Loss: 0.03602
Epoch: 210, Training Loss: 0.02148
Epoch: 211, Training Loss: 0.02614
Epoch: 211, Training Loss: 0.02214
Epoch: 211, Training Loss: 0.03578
Epoch: 211, Training Loss: 0.02135
Epoch: 212, Training Loss: 0.02598
Epoch: 212, Training Loss: 0.02201
Epoch: 212, Training Loss: 0.03555
Epoch: 212, Training Loss: 0.02121
Epoch: 213, Training Loss: 0.02581
Epoch: 213, Training Loss: 0.02188
Epoch: 213, Training Loss: 0.03532
Epoch: 213, Training Loss: 0.02108
Epoch: 214, Training Loss: 0.02565
Epoch: 214, Training Loss: 0.02175
Epoch: 214, Training Loss: 0.03510
Epoch: 214, Training Loss: 0.02095
Epoch: 215, Training Loss: 0.02549
Epoch: 215, Training Loss: 0.02162
Epoch: 215, Training Loss: 0.03487
Epoch: 215, Training Loss: 0.02082
Epoch: 216, Training Loss: 0.02533
Epoch: 216, Training Loss: 0.02150
Epoch: 216, Training Loss: 0.03465
Epoch: 216, Training Loss: 0.02070
Epoch: 217, Training Loss: 0.02517
Epoch: 217, Training Loss: 0.02137
Epoch: 217, Training Loss: 0.03443
Epoch: 217, Training Loss: 0.02057
Epoch: 218, Training Loss: 0.02502
Epoch: 218, Training Loss: 0.02125
Epoch: 218, Training Loss: 0.03422
Epoch: 218, Training Loss: 0.02045
Epoch: 219, Training Loss: 0.02486
Epoch: 219, Training Loss: 0.02113
Epoch: 219, Training Loss: 0.03401
Epoch: 219, Training Loss: 0.02032
Epoch: 220, Training Loss: 0.02471
Epoch: 220, Training Loss: 0.02101
Epoch: 220, Training Loss: 0.03380
Epoch: 220, Training Loss: 0.02020
Epoch: 221, Training Loss: 0.02456
Epoch: 221, Training Loss: 0.02089
Epoch: 221, Training Loss: 0.03359
Epoch: 221, Training Loss: 0.02008
Epoch: 222, Training Loss: 0.02441
Epoch: 222, Training Loss: 0.02077
Epoch: 222, Training Loss: 0.03338
Epoch: 222, Training Loss: 0.01997
Epoch: 223, Training Loss: 0.02427
Epoch: 223, Training Loss: 0.02065
Epoch: 223, Training Loss: 0.03318
Epoch: 223, Training Loss: 0.01985
Epoch: 224, Training Loss: 0.02412
Epoch: 224, Training Loss: 0.02054
Epoch: 224, Training Loss: 0.03298
Epoch: 224, Training Loss: 0.01973
Epoch: 225, Training Loss: 0.02398
Epoch: 225, Training Loss: 0.02042
Epoch: 225, Training Loss: 0.03278
Epoch: 225, Training Loss: 0.01962
Epoch: 226, Training Loss: 0.02384
Epoch: 226, Training Loss: 0.02031
Epoch: 226, Training Loss: 0.03259
Epoch: 226, Training Loss: 0.01951
Epoch: 227, Training Loss: 0.02370
Epoch: 227, Training Loss: 0.02020
Epoch: 227, Training Loss: 0.03239
Epoch: 227, Training Loss: 0.01939
Epoch: 228, Training Loss: 0.02356
Epoch: 228, Training Loss: 0.02009
Epoch: 228, Training Loss: 0.03220
Epoch: 228, Training Loss: 0.01928
Epoch: 229, Training Loss: 0.02343
Epoch: 229, Training Loss: 0.01998
Epoch: 229, Training Loss: 0.03201
Epoch: 229, Training Loss: 0.01917
Epoch: 230, Training Loss: 0.02329
Epoch: 230, Training Loss: 0.01987
Epoch: 230, Training Loss: 0.03183
Epoch: 230, Training Loss: 0.01907
Epoch: 231, Training Loss: 0.02316
Epoch: 231, Training Loss: 0.01976
Epoch: 231, Training Loss: 0.03164
Epoch: 231, Training Loss: 0.01896
Epoch: 232, Training Loss: 0.02303
Epoch: 232, Training Loss: 0.01966
Epoch: 232, Training Loss: 0.03146
Epoch: 232, Training Loss: 0.01885
Epoch: 233, Training Loss: 0.02289
Epoch: 233, Training Loss: 0.01955
Epoch: 233, Training Loss: 0.03128
Epoch: 233, Training Loss: 0.01875
Epoch: 234, Training Loss: 0.02277
Epoch: 234, Training Loss: 0.01945
Epoch: 234, Training Loss: 0.03110
Epoch: 234, Training Loss: 0.01864
Epoch: 235, Training Loss: 0.02264
Epoch: 235, Training Loss: 0.01935
Epoch: 235, Training Loss: 0.03092
Epoch: 235, Training Loss: 0.01854
Epoch: 236, Training Loss: 0.02251
Epoch: 236, Training Loss: 0.01925
Epoch: 236, Training Loss: 0.03075
Epoch: 236, Training Loss: 0.01844
Epoch: 237, Training Loss: 0.02239
Epoch: 237, Training Loss: 0.01915
Epoch: 237, Training Loss: 0.03057
Epoch: 237, Training Loss: 0.01834
Epoch: 238, Training Loss: 0.02226
Epoch: 238, Training Loss: 0.01905
Epoch: 238, Training Loss: 0.03040
Epoch: 238, Training Loss: 0.01824
Epoch: 239, Training Loss: 0.02214
Epoch: 239, Training Loss: 0.01895
Epoch: 239, Training Loss: 0.03023
Epoch: 239, Training Loss: 0.01814
Epoch: 240, Training Loss: 0.02202
Epoch: 240, Training Loss: 0.01885
Epoch: 240, Training Loss: 0.03007
Epoch: 240, Training Loss: 0.01805
Epoch: 241, Training Loss: 0.02190
Epoch: 241, Training Loss: 0.01875
Epoch: 241, Training Loss: 0.02990
Epoch: 241, Training Loss: 0.01795
Epoch: 242, Training Loss: 0.02178
Epoch: 242, Training Loss: 0.01866
Epoch: 242, Training Loss: 0.02974
Epoch: 242, Training Loss: 0.01786
Epoch: 243, Training Loss: 0.02166
Epoch: 243, Training Loss: 0.01856
Epoch: 243, Training Loss: 0.02957
Epoch: 243, Training Loss: 0.01776
Epoch: 244, Training Loss: 0.02155
Epoch: 244, Training Loss: 0.01847
Epoch: 244, Training Loss: 0.02941
Epoch: 244, Training Loss: 0.01767
Epoch: 245, Training Loss: 0.02143
Epoch: 245, Training Loss: 0.01838
Epoch: 245, Training Loss: 0.02926
Epoch: 245, Training Loss: 0.01758
Epoch: 246, Training Loss: 0.02132
Epoch: 246, Training Loss: 0.01828
Epoch: 246, Training Loss: 0.02910
Epoch: 246, Training Loss: 0.01748
Epoch: 247, Training Loss: 0.02121
Epoch: 247, Training Loss: 0.01819
Epoch: 247, Training Loss: 0.02894
Epoch: 247, Training Loss: 0.01739
Epoch: 248, Training Loss: 0.02109
Epoch: 248, Training Loss: 0.01810
Epoch: 248, Training Loss: 0.02879
Epoch: 248, Training Loss: 0.01731
Epoch: 249, Training Loss: 0.02098
Epoch: 249, Training Loss: 0.01801
Epoch: 249, Training Loss: 0.02864
Epoch: 249, Training Loss: 0.01722
Epoch: 250, Training Loss: 0.02087
Epoch: 250, Training Loss: 0.01793
Epoch: 250, Training Loss: 0.02849
Epoch: 250, Training Loss: 0.01713
Epoch: 251, Training Loss: 0.02077
Epoch: 251, Training Loss: 0.01784
Epoch: 251, Training Loss: 0.02834
Epoch: 251, Training Loss: 0.01704
Epoch: 252, Training Loss: 0.02066
Epoch: 252, Training Loss: 0.01775
Epoch: 252, Training Loss: 0.02819
Epoch: 252, Training Loss: 0.01696
Epoch: 253, Training Loss: 0.02055
Epoch: 253, Training Loss: 0.01766
Epoch: 253, Training Loss: 0.02804
Epoch: 253, Training Loss: 0.01687
Epoch: 254, Training Loss: 0.02045
Epoch: 254, Training Loss: 0.01758
Epoch: 254, Training Loss: 0.02790
Epoch: 254, Training Loss: 0.01679
Epoch: 255, Training Loss: 0.02034
Epoch: 255, Training Loss: 0.01750
Epoch: 255, Training Loss: 0.02775
Epoch: 255, Training Loss: 0.01670
Epoch: 256, Training Loss: 0.02024
Epoch: 256, Training Loss: 0.01741
Epoch: 256, Training Loss: 0.02761
Epoch: 256, Training Loss: 0.01662
Epoch: 257, Training Loss: 0.02014
Epoch: 257, Training Loss: 0.01733
Epoch: 257, Training Loss: 0.02747
Epoch: 257, Training Loss: 0.01654
Epoch: 258, Training Loss: 0.02004
Epoch: 258, Training Loss: 0.01725
Epoch: 258, Training Loss: 0.02733
Epoch: 258, Training Loss: 0.01646
Epoch: 259, Training Loss: 0.01994
Epoch: 259, Training Loss: 0.01717
Epoch: 259, Training Loss: 0.02720
Epoch: 259, Training Loss: 0.01638
Epoch: 260, Training Loss: 0.01984
Epoch: 260, Training Loss: 0.01709
Epoch: 260, Training Loss: 0.02706
Epoch: 260, Training Loss: 0.01630
Epoch: 261, Training Loss: 0.01974
Epoch: 261, Training Loss: 0.01701
Epoch: 261, Training Loss: 0.02692
Epoch: 261, Training Loss: 0.01622
Epoch: 262, Training Loss: 0.01965
Epoch: 262, Training Loss: 0.01693
Epoch: 262, Training Loss: 0.02679
Epoch: 262, Training Loss: 0.01614
Epoch: 263, Training Loss: 0.01955
Epoch: 263, Training Loss: 0.01685
Epoch: 263, Training Loss: 0.02666
Epoch: 263, Training Loss: 0.01606
Epoch: 264, Training Loss: 0.01945
Epoch: 264, Training Loss: 0.01677
Epoch: 264, Training Loss: 0.02653
Epoch: 264, Training Loss: 0.01599
Epoch: 265, Training Loss: 0.01936
Epoch: 265, Training Loss: 0.01669
Epoch: 265, Training Loss: 0.02640
Epoch: 265, Training Loss: 0.01591
Epoch: 266, Training Loss: 0.01927
Epoch: 266, Training Loss: 0.01662
Epoch: 266, Training Loss: 0.02627
Epoch: 266, Training Loss: 0.01584
Epoch: 267, Training Loss: 0.01917
Epoch: 267, Training Loss: 0.01654
Epoch: 267, Training Loss: 0.02614
Epoch: 267, Training Loss: 0.01576
Epoch: 268, Training Loss: 0.01908
Epoch: 268, Training Loss: 0.01647
Epoch: 268, Training Loss: 0.02601
Epoch: 268, Training Loss: 0.01569
Epoch: 269, Training Loss: 0.01899
Epoch: 269, Training Loss: 0.01639
Epoch: 269, Training Loss: 0.02589
Epoch: 269, Training Loss: 0.01562
Epoch: 270, Training Loss: 0.01890
Epoch: 270, Training Loss: 0.01632
Epoch: 270, Training Loss: 0.02576
Epoch: 270, Training Loss: 0.01554
Epoch: 271, Training Loss: 0.01881
Epoch: 271, Training Loss: 0.01625
Epoch: 271, Training Loss: 0.02564
Epoch: 271, Training Loss: 0.01547
Epoch: 272, Training Loss: 0.01872
Epoch: 272, Training Loss: 0.01617
Epoch: 272, Training Loss: 0.02552
Epoch: 272, Training Loss: 0.01540
Epoch: 273, Training Loss: 0.01864
Epoch: 273, Training Loss: 0.01610
Epoch: 273, Training Loss: 0.02540
Epoch: 273, Training Loss: 0.01533
Epoch: 274, Training Loss: 0.01855
Epoch: 274, Training Loss: 0.01603
Epoch: 274, Training Loss: 0.02528
Epoch: 274, Training Loss: 0.01526
Epoch: 275, Training Loss: 0.01846
Epoch: 275, Training Loss: 0.01596
Epoch: 275, Training Loss: 0.02516
Epoch: 275, Training Loss: 0.01519
Epoch: 276, Training Loss: 0.01838
Epoch: 276, Training Loss: 0.01589
Epoch: 276, Training Loss: 0.02504
Epoch: 276, Training Loss: 0.01512
Epoch: 277, Training Loss: 0.01829
Epoch: 277, Training Loss: 0.01582
Epoch: 277, Training Loss: 0.02493
Epoch: 277, Training Loss: 0.01505
Epoch: 278, Training Loss: 0.01821
Epoch: 278, Training Loss: 0.01575
Epoch: 278, Training Loss: 0.02481
Epoch: 278, Training Loss: 0.01499
Epoch: 279, Training Loss: 0.01813
Epoch: 279, Training Loss: 0.01568
Epoch: 279, Training Loss: 0.02470
Epoch: 279, Training Loss: 0.01492
Epoch: 280, Training Loss: 0.01804
Epoch: 280, Training Loss: 0.01562
Epoch: 280, Training Loss: 0.02459
Epoch: 280, Training Loss: 0.01485
Epoch: 281, Training Loss: 0.01796
Epoch: 281, Training Loss: 0.01555
Epoch: 281, Training Loss: 0.02447
Epoch: 281, Training Loss: 0.01479
Epoch: 282, Training Loss: 0.01788
Epoch: 282, Training Loss: 0.01548
Epoch: 282, Training Loss: 0.02436
Epoch: 282, Training Loss: 0.01472
Epoch: 283, Training Loss: 0.01780
Epoch: 283, Training Loss: 0.01542
Epoch: 283, Training Loss: 0.02425
Epoch: 283, Training Loss: 0.01466
Epoch: 284, Training Loss: 0.01772
Epoch: 284, Training Loss: 0.01535
Epoch: 284, Training Loss: 0.02414
Epoch: 284, Training Loss: 0.01459
Epoch: 285, Training Loss: 0.01764
Epoch: 285, Training Loss: 0.01529
Epoch: 285, Training Loss: 0.02404
Epoch: 285, Training Loss: 0.01453
Epoch: 286, Training Loss: 0.01756
Epoch: 286, Training Loss: 0.01522
Epoch: 286, Training Loss: 0.02393
Epoch: 286, Training Loss: 0.01447
Epoch: 287, Training Loss: 0.01749
Epoch: 287, Training Loss: 0.01516
Epoch: 287, Training Loss: 0.02382
Epoch: 287, Training Loss: 0.01440
Epoch: 288, Training Loss: 0.01741
Epoch: 288, Training Loss: 0.01510
Epoch: 288, Training Loss: 0.02372
Epoch: 288, Training Loss: 0.01434
Epoch: 289, Training Loss: 0.01733
Epoch: 289, Training Loss: 0.01503
Epoch: 289, Training Loss: 0.02361
Epoch: 289, Training Loss: 0.01428
Epoch: 290, Training Loss: 0.01726
Epoch: 290, Training Loss: 0.01497
Epoch: 290, Training Loss: 0.02351
Epoch: 290, Training Loss: 0.01422
Epoch: 291, Training Loss: 0.01718
Epoch: 291, Training Loss: 0.01491
Epoch: 291, Training Loss: 0.02340
Epoch: 291, Training Loss: 0.01416
Epoch: 292, Training Loss: 0.01711
Epoch: 292, Training Loss: 0.01485
Epoch: 292, Training Loss: 0.02330
Epoch: 292, Training Loss: 0.01410
Epoch: 293, Training Loss: 0.01704
Epoch: 293, Training Loss: 0.01479
Epoch: 293, Training Loss: 0.02320
Epoch: 293, Training Loss: 0.01404
Epoch: 294, Training Loss: 0.01696
Epoch: 294, Training Loss: 0.01473
Epoch: 294, Training Loss: 0.02310
Epoch: 294, Training Loss: 0.01398
Epoch: 295, Training Loss: 0.01689
Epoch: 295, Training Loss: 0.01467
Epoch: 295, Training Loss: 0.02300
Epoch: 295, Training Loss: 0.01392
Epoch: 296, Training Loss: 0.01682
Epoch: 296, Training Loss: 0.01461
Epoch: 296, Training Loss: 0.02290
Epoch: 296, Training Loss: 0.01387
Epoch: 297, Training Loss: 0.01675
Epoch: 297, Training Loss: 0.01455
Epoch: 297, Training Loss: 0.02281
Epoch: 297, Training Loss: 0.01381
Epoch: 298, Training Loss: 0.01668
Epoch: 298, Training Loss: 0.01449
Epoch: 298, Training Loss: 0.02271
Epoch: 298, Training Loss: 0.01375
Epoch: 299, Training Loss: 0.01661
Epoch: 299, Training Loss: 0.01443
Epoch: 299, Training Loss: 0.02261
Epoch: 299, Training Loss: 0.01370
Epoch: 300, Training Loss: 0.01654
Epoch: 300, Training Loss: 0.01438
Epoch: 300, Training Loss: 0.02252
Epoch: 300, Training Loss: 0.01364
Epoch: 301, Training Loss: 0.01647
Epoch: 301, Training Loss: 0.01432
Epoch: 301, Training Loss: 0.02242
Epoch: 301, Training Loss: 0.01358
Epoch: 302, Training Loss: 0.01640
Epoch: 302, Training Loss: 0.01426
Epoch: 302, Training Loss: 0.02233
Epoch: 302, Training Loss: 0.01353
Epoch: 303, Training Loss: 0.01633
Epoch: 303, Training Loss: 0.01421
Epoch: 303, Training Loss: 0.02224
Epoch: 303, Training Loss: 0.01347
Epoch: 304, Training Loss: 0.01627
Epoch: 304, Training Loss: 0.01415
Epoch: 304, Training Loss: 0.02214
Epoch: 304, Training Loss: 0.01342
Epoch: 305, Training Loss: 0.01620
Epoch: 305, Training Loss: 0.01409
Epoch: 305, Training Loss: 0.02205
Epoch: 305, Training Loss: 0.01337
Epoch: 306, Training Loss: 0.01613
Epoch: 306, Training Loss: 0.01404
Epoch: 306, Training Loss: 0.02196
Epoch: 306, Training Loss: 0.01331
Epoch: 307, Training Loss: 0.01607
Epoch: 307, Training Loss: 0.01399
Epoch: 307, Training Loss: 0.02187
Epoch: 307, Training Loss: 0.01326
Epoch: 308, Training Loss: 0.01600
Epoch: 308, Training Loss: 0.01393
Epoch: 308, Training Loss: 0.02178
Epoch: 308, Training Loss: 0.01321
Epoch: 309, Training Loss: 0.01594
Epoch: 309, Training Loss: 0.01388
Epoch: 309, Training Loss: 0.02169
Epoch: 309, Training Loss: 0.01315
Epoch: 310, Training Loss: 0.01587
Epoch: 310, Training Loss: 0.01382
Epoch: 310, Training Loss: 0.02161
Epoch: 310, Training Loss: 0.01310
Epoch: 311, Training Loss: 0.01581
Epoch: 311, Training Loss: 0.01377
Epoch: 311, Training Loss: 0.02152
Epoch: 311, Training Loss: 0.01305
Epoch: 312, Training Loss: 0.01575
Epoch: 312, Training Loss: 0.01372
Epoch: 312, Training Loss: 0.02143
Epoch: 312, Training Loss: 0.01300
Epoch: 313, Training Loss: 0.01568
Epoch: 313, Training Loss: 0.01367
Epoch: 313, Training Loss: 0.02135
Epoch: 313, Training Loss: 0.01295
Epoch: 314, Training Loss: 0.01562
Epoch: 314, Training Loss: 0.01362
Epoch: 314, Training Loss: 0.02126
Epoch: 314, Training Loss: 0.01290
Epoch: 315, Training Loss: 0.01556
Epoch: 315, Training Loss: 0.01356
Epoch: 315, Training Loss: 0.02118
Epoch: 315, Training Loss: 0.01285
Epoch: 316, Training Loss: 0.01550
Epoch: 316, Training Loss: 0.01351
Epoch: 316, Training Loss: 0.02109
Epoch: 316, Training Loss: 0.01280
Epoch: 317, Training Loss: 0.01544
Epoch: 317, Training Loss: 0.01346
Epoch: 317, Training Loss: 0.02101
Epoch: 317, Training Loss: 0.01275
Epoch: 318, Training Loss: 0.01538
[... per-pattern training losses for epochs 318-574 omitted: four loss values per epoch (one per XOR input pattern), decreasing steadily from about 0.021 to about 0.006 ...]
Epoch: 574, Training Loss: 0.00639
Epoch: 575, Training Loss: 0.00760
Epoch: 575, Training Loss: 0.00681
Epoch: 575, Training Loss: 0.01034
Epoch: 575, Training Loss: 0.00637
Epoch: 576, Training Loss: 0.00759
Epoch: 576, Training Loss: 0.00679
Epoch: 576, Training Loss: 0.01032
Epoch: 576, Training Loss: 0.00636
Epoch: 577, Training Loss: 0.00757
Epoch: 577, Training Loss: 0.00678
Epoch: 577, Training Loss: 0.01030
Epoch: 577, Training Loss: 0.00635
Epoch: 578, Training Loss: 0.00756
Epoch: 578, Training Loss: 0.00677
Epoch: 578, Training Loss: 0.01027
Epoch: 578, Training Loss: 0.00634
Epoch: 579, Training Loss: 0.00754
Epoch: 579, Training Loss: 0.00675
Epoch: 579, Training Loss: 0.01025
Epoch: 579, Training Loss: 0.00632
Epoch: 580, Training Loss: 0.00753
Epoch: 580, Training Loss: 0.00674
Epoch: 580, Training Loss: 0.01023
Epoch: 580, Training Loss: 0.00631
Epoch: 581, Training Loss: 0.00751
Epoch: 581, Training Loss: 0.00673
Epoch: 581, Training Loss: 0.01021
Epoch: 581, Training Loss: 0.00630
Epoch: 582, Training Loss: 0.00750
Epoch: 582, Training Loss: 0.00671
Epoch: 582, Training Loss: 0.01019
Epoch: 582, Training Loss: 0.00629
Epoch: 583, Training Loss: 0.00748
Epoch: 583, Training Loss: 0.00670
Epoch: 583, Training Loss: 0.01017
Epoch: 583, Training Loss: 0.00627
Epoch: 584, Training Loss: 0.00747
Epoch: 584, Training Loss: 0.00669
Epoch: 584, Training Loss: 0.01015
Epoch: 584, Training Loss: 0.00626
Epoch: 585, Training Loss: 0.00745
Epoch: 585, Training Loss: 0.00668
Epoch: 585, Training Loss: 0.01013
Epoch: 585, Training Loss: 0.00625
Epoch: 586, Training Loss: 0.00744
Epoch: 586, Training Loss: 0.00666
Epoch: 586, Training Loss: 0.01011
Epoch: 586, Training Loss: 0.00624
Epoch: 587, Training Loss: 0.00743
Epoch: 587, Training Loss: 0.00665
Epoch: 587, Training Loss: 0.01010
Epoch: 587, Training Loss: 0.00623
Epoch: 588, Training Loss: 0.00741
Epoch: 588, Training Loss: 0.00664
Epoch: 588, Training Loss: 0.01008
Epoch: 588, Training Loss: 0.00621
Epoch: 589, Training Loss: 0.00740
Epoch: 589, Training Loss: 0.00663
Epoch: 589, Training Loss: 0.01006
Epoch: 589, Training Loss: 0.00620
Epoch: 590, Training Loss: 0.00738
Epoch: 590, Training Loss: 0.00661
Epoch: 590, Training Loss: 0.01004
Epoch: 590, Training Loss: 0.00619
Epoch: 591, Training Loss: 0.00737
Epoch: 591, Training Loss: 0.00660
Epoch: 591, Training Loss: 0.01002
Epoch: 591, Training Loss: 0.00618
Epoch: 592, Training Loss: 0.00735
Epoch: 592, Training Loss: 0.00659
Epoch: 592, Training Loss: 0.01000
Epoch: 592, Training Loss: 0.00617
Epoch: 593, Training Loss: 0.00734
Epoch: 593, Training Loss: 0.00658
Epoch: 593, Training Loss: 0.00998
Epoch: 593, Training Loss: 0.00616
Epoch: 594, Training Loss: 0.00733
Epoch: 594, Training Loss: 0.00656
Epoch: 594, Training Loss: 0.00996
Epoch: 594, Training Loss: 0.00614
Epoch: 595, Training Loss: 0.00731
Epoch: 595, Training Loss: 0.00655
Epoch: 595, Training Loss: 0.00994
Epoch: 595, Training Loss: 0.00613
Epoch: 596, Training Loss: 0.00730
Epoch: 596, Training Loss: 0.00654
Epoch: 596, Training Loss: 0.00992
Epoch: 596, Training Loss: 0.00612
Epoch: 597, Training Loss: 0.00728
Epoch: 597, Training Loss: 0.00653
Epoch: 597, Training Loss: 0.00990
Epoch: 597, Training Loss: 0.00611
Epoch: 598, Training Loss: 0.00727
Epoch: 598, Training Loss: 0.00652
Epoch: 598, Training Loss: 0.00988
Epoch: 598, Training Loss: 0.00610
Epoch: 599, Training Loss: 0.00726
Epoch: 599, Training Loss: 0.00650
Epoch: 599, Training Loss: 0.00987
Epoch: 599, Training Loss: 0.00609
Epoch: 600, Training Loss: 0.00724
Epoch: 600, Training Loss: 0.00649
Epoch: 600, Training Loss: 0.00985
Epoch: 600, Training Loss: 0.00608
Epoch: 601, Training Loss: 0.00723
Epoch: 601, Training Loss: 0.00648
Epoch: 601, Training Loss: 0.00983
Epoch: 601, Training Loss: 0.00606
Epoch: 602, Training Loss: 0.00721
Epoch: 602, Training Loss: 0.00647
Epoch: 602, Training Loss: 0.00981
Epoch: 602, Training Loss: 0.00605
Epoch: 603, Training Loss: 0.00720
Epoch: 603, Training Loss: 0.00646
Epoch: 603, Training Loss: 0.00979
Epoch: 603, Training Loss: 0.00604
Epoch: 604, Training Loss: 0.00719
Epoch: 604, Training Loss: 0.00644
Epoch: 604, Training Loss: 0.00977
Epoch: 604, Training Loss: 0.00603
Epoch: 605, Training Loss: 0.00717
Epoch: 605, Training Loss: 0.00643
Epoch: 605, Training Loss: 0.00975
Epoch: 605, Training Loss: 0.00602
Epoch: 606, Training Loss: 0.00716
Epoch: 606, Training Loss: 0.00642
Epoch: 606, Training Loss: 0.00974
Epoch: 606, Training Loss: 0.00601
Epoch: 607, Training Loss: 0.00715
Epoch: 607, Training Loss: 0.00641
Epoch: 607, Training Loss: 0.00972
Epoch: 607, Training Loss: 0.00600
Epoch: 608, Training Loss: 0.00713
Epoch: 608, Training Loss: 0.00640
Epoch: 608, Training Loss: 0.00970
Epoch: 608, Training Loss: 0.00599
Epoch: 609, Training Loss: 0.00712
Epoch: 609, Training Loss: 0.00639
Epoch: 609, Training Loss: 0.00968
Epoch: 609, Training Loss: 0.00598
Epoch: 610, Training Loss: 0.00711
Epoch: 610, Training Loss: 0.00637
Epoch: 610, Training Loss: 0.00966
Epoch: 610, Training Loss: 0.00596
Epoch: 611, Training Loss: 0.00709
Epoch: 611, Training Loss: 0.00636
Epoch: 611, Training Loss: 0.00965
Epoch: 611, Training Loss: 0.00595
Epoch: 612, Training Loss: 0.00708
Epoch: 612, Training Loss: 0.00635
Epoch: 612, Training Loss: 0.00963
Epoch: 612, Training Loss: 0.00594
Epoch: 613, Training Loss: 0.00707
Epoch: 613, Training Loss: 0.00634
Epoch: 613, Training Loss: 0.00961
Epoch: 613, Training Loss: 0.00593
Epoch: 614, Training Loss: 0.00705
Epoch: 614, Training Loss: 0.00633
Epoch: 614, Training Loss: 0.00959
Epoch: 614, Training Loss: 0.00592
Epoch: 615, Training Loss: 0.00704
Epoch: 615, Training Loss: 0.00632
Epoch: 615, Training Loss: 0.00957
Epoch: 615, Training Loss: 0.00591
Epoch: 616, Training Loss: 0.00703
Epoch: 616, Training Loss: 0.00630
Epoch: 616, Training Loss: 0.00956
Epoch: 616, Training Loss: 0.00590
Epoch: 617, Training Loss: 0.00702
Epoch: 617, Training Loss: 0.00629
Epoch: 617, Training Loss: 0.00954
Epoch: 617, Training Loss: 0.00589
Epoch: 618, Training Loss: 0.00700
Epoch: 618, Training Loss: 0.00628
Epoch: 618, Training Loss: 0.00952
Epoch: 618, Training Loss: 0.00588
Epoch: 619, Training Loss: 0.00699
Epoch: 619, Training Loss: 0.00627
Epoch: 619, Training Loss: 0.00950
Epoch: 619, Training Loss: 0.00587
Epoch: 620, Training Loss: 0.00698
Epoch: 620, Training Loss: 0.00626
Epoch: 620, Training Loss: 0.00949
Epoch: 620, Training Loss: 0.00586
Epoch: 621, Training Loss: 0.00696
Epoch: 621, Training Loss: 0.00625
Epoch: 621, Training Loss: 0.00947
Epoch: 621, Training Loss: 0.00585
Epoch: 622, Training Loss: 0.00695
Epoch: 622, Training Loss: 0.00624
Epoch: 622, Training Loss: 0.00945
Epoch: 622, Training Loss: 0.00584
Epoch: 623, Training Loss: 0.00694
Epoch: 623, Training Loss: 0.00623
Epoch: 623, Training Loss: 0.00943
Epoch: 623, Training Loss: 0.00583
Epoch: 624, Training Loss: 0.00693
Epoch: 624, Training Loss: 0.00622
Epoch: 624, Training Loss: 0.00942
Epoch: 624, Training Loss: 0.00582
Epoch: 625, Training Loss: 0.00691
Epoch: 625, Training Loss: 0.00620
Epoch: 625, Training Loss: 0.00940
Epoch: 625, Training Loss: 0.00580
Epoch: 626, Training Loss: 0.00690
Epoch: 626, Training Loss: 0.00619
Epoch: 626, Training Loss: 0.00938
Epoch: 626, Training Loss: 0.00579
Epoch: 627, Training Loss: 0.00689
Epoch: 627, Training Loss: 0.00618
Epoch: 627, Training Loss: 0.00937
Epoch: 627, Training Loss: 0.00578
Epoch: 628, Training Loss: 0.00688
Epoch: 628, Training Loss: 0.00617
Epoch: 628, Training Loss: 0.00935
Epoch: 628, Training Loss: 0.00577
Epoch: 629, Training Loss: 0.00686
Epoch: 629, Training Loss: 0.00616
Epoch: 629, Training Loss: 0.00933
Epoch: 629, Training Loss: 0.00576
Epoch: 630, Training Loss: 0.00685
Epoch: 630, Training Loss: 0.00615
Epoch: 630, Training Loss: 0.00932
Epoch: 630, Training Loss: 0.00575
Epoch: 631, Training Loss: 0.00684
Epoch: 631, Training Loss: 0.00614
Epoch: 631, Training Loss: 0.00930
Epoch: 631, Training Loss: 0.00574
Epoch: 632, Training Loss: 0.00683
Epoch: 632, Training Loss: 0.00613
Epoch: 632, Training Loss: 0.00928
Epoch: 632, Training Loss: 0.00573
Epoch: 633, Training Loss: 0.00681
Epoch: 633, Training Loss: 0.00612
Epoch: 633, Training Loss: 0.00927
Epoch: 633, Training Loss: 0.00572
Epoch: 634, Training Loss: 0.00680
Epoch: 634, Training Loss: 0.00611
Epoch: 634, Training Loss: 0.00925
Epoch: 634, Training Loss: 0.00571
Epoch: 635, Training Loss: 0.00679
Epoch: 635, Training Loss: 0.00610
Epoch: 635, Training Loss: 0.00923
Epoch: 635, Training Loss: 0.00570
Epoch: 636, Training Loss: 0.00678
Epoch: 636, Training Loss: 0.00609
Epoch: 636, Training Loss: 0.00922
Epoch: 636, Training Loss: 0.00569
Epoch: 637, Training Loss: 0.00677
Epoch: 637, Training Loss: 0.00608
Epoch: 637, Training Loss: 0.00920
Epoch: 637, Training Loss: 0.00568
Epoch: 638, Training Loss: 0.00675
Epoch: 638, Training Loss: 0.00607
Epoch: 638, Training Loss: 0.00918
Epoch: 638, Training Loss: 0.00567
Epoch: 639, Training Loss: 0.00674
Epoch: 639, Training Loss: 0.00605
Epoch: 639, Training Loss: 0.00917
Epoch: 639, Training Loss: 0.00566
Epoch: 640, Training Loss: 0.00673
Epoch: 640, Training Loss: 0.00604
Epoch: 640, Training Loss: 0.00915
Epoch: 640, Training Loss: 0.00565
Epoch: 641, Training Loss: 0.00672
Epoch: 641, Training Loss: 0.00603
Epoch: 641, Training Loss: 0.00914
Epoch: 641, Training Loss: 0.00564
Epoch: 642, Training Loss: 0.00671
Epoch: 642, Training Loss: 0.00602
Epoch: 642, Training Loss: 0.00912
Epoch: 642, Training Loss: 0.00563
Epoch: 643, Training Loss: 0.00669
Epoch: 643, Training Loss: 0.00601
Epoch: 643, Training Loss: 0.00910
Epoch: 643, Training Loss: 0.00562
Epoch: 644, Training Loss: 0.00668
Epoch: 644, Training Loss: 0.00600
Epoch: 644, Training Loss: 0.00909
Epoch: 644, Training Loss: 0.00561
Epoch: 645, Training Loss: 0.00667
Epoch: 645, Training Loss: 0.00599
Epoch: 645, Training Loss: 0.00907
Epoch: 645, Training Loss: 0.00560
Epoch: 646, Training Loss: 0.00666
Epoch: 646, Training Loss: 0.00598
Epoch: 646, Training Loss: 0.00906
Epoch: 646, Training Loss: 0.00560
Epoch: 647, Training Loss: 0.00665
Epoch: 647, Training Loss: 0.00597
Epoch: 647, Training Loss: 0.00904
Epoch: 647, Training Loss: 0.00559
Epoch: 648, Training Loss: 0.00664
Epoch: 648, Training Loss: 0.00596
Epoch: 648, Training Loss: 0.00902
Epoch: 648, Training Loss: 0.00558
Epoch: 649, Training Loss: 0.00662
Epoch: 649, Training Loss: 0.00595
Epoch: 649, Training Loss: 0.00901
Epoch: 649, Training Loss: 0.00557
Epoch: 650, Training Loss: 0.00661
Epoch: 650, Training Loss: 0.00594
Epoch: 650, Training Loss: 0.00899
Epoch: 650, Training Loss: 0.00556
Epoch: 651, Training Loss: 0.00660
Epoch: 651, Training Loss: 0.00593
Epoch: 651, Training Loss: 0.00898
Epoch: 651, Training Loss: 0.00555
Epoch: 652, Training Loss: 0.00659
Epoch: 652, Training Loss: 0.00592
Epoch: 652, Training Loss: 0.00896
Epoch: 652, Training Loss: 0.00554
Epoch: 653, Training Loss: 0.00658
Epoch: 653, Training Loss: 0.00591
Epoch: 653, Training Loss: 0.00895
Epoch: 653, Training Loss: 0.00553
Epoch: 654, Training Loss: 0.00657
Epoch: 654, Training Loss: 0.00590
Epoch: 654, Training Loss: 0.00893
Epoch: 654, Training Loss: 0.00552
Epoch: 655, Training Loss: 0.00656
Epoch: 655, Training Loss: 0.00589
Epoch: 655, Training Loss: 0.00892
Epoch: 655, Training Loss: 0.00551
Epoch: 656, Training Loss: 0.00654
Epoch: 656, Training Loss: 0.00588
Epoch: 656, Training Loss: 0.00890
Epoch: 656, Training Loss: 0.00550
Epoch: 657, Training Loss: 0.00653
Epoch: 657, Training Loss: 0.00587
Epoch: 657, Training Loss: 0.00888
Epoch: 657, Training Loss: 0.00549
Epoch: 658, Training Loss: 0.00652
Epoch: 658, Training Loss: 0.00586
Epoch: 658, Training Loss: 0.00887
Epoch: 658, Training Loss: 0.00548
Epoch: 659, Training Loss: 0.00651
Epoch: 659, Training Loss: 0.00585
Epoch: 659, Training Loss: 0.00885
Epoch: 659, Training Loss: 0.00547
Epoch: 660, Training Loss: 0.00650
Epoch: 660, Training Loss: 0.00584
Epoch: 660, Training Loss: 0.00884
Epoch: 660, Training Loss: 0.00546
Epoch: 661, Training Loss: 0.00649
Epoch: 661, Training Loss: 0.00583
Epoch: 661, Training Loss: 0.00882
Epoch: 661, Training Loss: 0.00545
Epoch: 662, Training Loss: 0.00648
Epoch: 662, Training Loss: 0.00582
Epoch: 662, Training Loss: 0.00881
Epoch: 662, Training Loss: 0.00545
Epoch: 663, Training Loss: 0.00647
Epoch: 663, Training Loss: 0.00581
Epoch: 663, Training Loss: 0.00879
Epoch: 663, Training Loss: 0.00544
Epoch: 664, Training Loss: 0.00646
Epoch: 664, Training Loss: 0.00580
Epoch: 664, Training Loss: 0.00878
Epoch: 664, Training Loss: 0.00543
Epoch: 665, Training Loss: 0.00644
Epoch: 665, Training Loss: 0.00579
Epoch: 665, Training Loss: 0.00876
Epoch: 665, Training Loss: 0.00542
Epoch: 666, Training Loss: 0.00643
Epoch: 666, Training Loss: 0.00578
Epoch: 666, Training Loss: 0.00875
Epoch: 666, Training Loss: 0.00541
Epoch: 667, Training Loss: 0.00642
Epoch: 667, Training Loss: 0.00578
Epoch: 667, Training Loss: 0.00873
Epoch: 667, Training Loss: 0.00540
Epoch: 668, Training Loss: 0.00641
Epoch: 668, Training Loss: 0.00577
Epoch: 668, Training Loss: 0.00872
Epoch: 668, Training Loss: 0.00539
Epoch: 669, Training Loss: 0.00640
Epoch: 669, Training Loss: 0.00576
Epoch: 669, Training Loss: 0.00871
Epoch: 669, Training Loss: 0.00538
Epoch: 670, Training Loss: 0.00639
Epoch: 670, Training Loss: 0.00575
Epoch: 670, Training Loss: 0.00869
Epoch: 670, Training Loss: 0.00537
Epoch: 671, Training Loss: 0.00638
Epoch: 671, Training Loss: 0.00574
Epoch: 671, Training Loss: 0.00868
Epoch: 671, Training Loss: 0.00536
Epoch: 672, Training Loss: 0.00637
Epoch: 672, Training Loss: 0.00573
Epoch: 672, Training Loss: 0.00866
Epoch: 672, Training Loss: 0.00536
Epoch: 673, Training Loss: 0.00636
Epoch: 673, Training Loss: 0.00572
Epoch: 673, Training Loss: 0.00865
Epoch: 673, Training Loss: 0.00535
Epoch: 674, Training Loss: 0.00635
Epoch: 674, Training Loss: 0.00571
Epoch: 674, Training Loss: 0.00863
Epoch: 674, Training Loss: 0.00534
Epoch: 675, Training Loss: 0.00634
Epoch: 675, Training Loss: 0.00570
Epoch: 675, Training Loss: 0.00862
Epoch: 675, Training Loss: 0.00533
Epoch: 676, Training Loss: 0.00633
Epoch: 676, Training Loss: 0.00569
Epoch: 676, Training Loss: 0.00860
Epoch: 676, Training Loss: 0.00532
Epoch: 677, Training Loss: 0.00632
Epoch: 677, Training Loss: 0.00568
Epoch: 677, Training Loss: 0.00859
Epoch: 677, Training Loss: 0.00531
Epoch: 678, Training Loss: 0.00630
Epoch: 678, Training Loss: 0.00567
Epoch: 678, Training Loss: 0.00858
Epoch: 678, Training Loss: 0.00530
Epoch: 679, Training Loss: 0.00629
Epoch: 679, Training Loss: 0.00566
Epoch: 679, Training Loss: 0.00856
Epoch: 679, Training Loss: 0.00529
Epoch: 680, Training Loss: 0.00628
Epoch: 680, Training Loss: 0.00565
Epoch: 680, Training Loss: 0.00855
Epoch: 680, Training Loss: 0.00529
Epoch: 681, Training Loss: 0.00627
Epoch: 681, Training Loss: 0.00564
Epoch: 681, Training Loss: 0.00853
Epoch: 681, Training Loss: 0.00528
Epoch: 682, Training Loss: 0.00626
Epoch: 682, Training Loss: 0.00564
Epoch: 682, Training Loss: 0.00852
Epoch: 682, Training Loss: 0.00527
Epoch: 683, Training Loss: 0.00625
Epoch: 683, Training Loss: 0.00563
Epoch: 683, Training Loss: 0.00851
Epoch: 683, Training Loss: 0.00526
Epoch: 684, Training Loss: 0.00624
Epoch: 684, Training Loss: 0.00562
Epoch: 684, Training Loss: 0.00849
Epoch: 684, Training Loss: 0.00525
Epoch: 685, Training Loss: 0.00623
Epoch: 685, Training Loss: 0.00561
Epoch: 685, Training Loss: 0.00848
Epoch: 685, Training Loss: 0.00524
Epoch: 686, Training Loss: 0.00622
Epoch: 686, Training Loss: 0.00560
Epoch: 686, Training Loss: 0.00846
Epoch: 686, Training Loss: 0.00523
Epoch: 687, Training Loss: 0.00621
Epoch: 687, Training Loss: 0.00559
Epoch: 687, Training Loss: 0.00845
Epoch: 687, Training Loss: 0.00523
Epoch: 688, Training Loss: 0.00620
Epoch: 688, Training Loss: 0.00558
Epoch: 688, Training Loss: 0.00844
Epoch: 688, Training Loss: 0.00522
Epoch: 689, Training Loss: 0.00619
Epoch: 689, Training Loss: 0.00557
Epoch: 689, Training Loss: 0.00842
Epoch: 689, Training Loss: 0.00521
Epoch: 690, Training Loss: 0.00618
Epoch: 690, Training Loss: 0.00556
Epoch: 690, Training Loss: 0.00841
Epoch: 690, Training Loss: 0.00520
Epoch: 691, Training Loss: 0.00617
Epoch: 691, Training Loss: 0.00556
Epoch: 691, Training Loss: 0.00839
Epoch: 691, Training Loss: 0.00519
Epoch: 692, Training Loss: 0.00616
Epoch: 692, Training Loss: 0.00555
Epoch: 692, Training Loss: 0.00838
Epoch: 692, Training Loss: 0.00518
Epoch: 693, Training Loss: 0.00615
Epoch: 693, Training Loss: 0.00554
Epoch: 693, Training Loss: 0.00837
Epoch: 693, Training Loss: 0.00518
Epoch: 694, Training Loss: 0.00614
Epoch: 694, Training Loss: 0.00553
Epoch: 694, Training Loss: 0.00835
Epoch: 694, Training Loss: 0.00517
Epoch: 695, Training Loss: 0.00613
Epoch: 695, Training Loss: 0.00552
Epoch: 695, Training Loss: 0.00834
Epoch: 695, Training Loss: 0.00516
Epoch: 696, Training Loss: 0.00612
Epoch: 696, Training Loss: 0.00551
Epoch: 696, Training Loss: 0.00833
Epoch: 696, Training Loss: 0.00515
Epoch: 697, Training Loss: 0.00611
Epoch: 697, Training Loss: 0.00550
Epoch: 697, Training Loss: 0.00831
Epoch: 697, Training Loss: 0.00514
Epoch: 698, Training Loss: 0.00610
Epoch: 698, Training Loss: 0.00549
Epoch: 698, Training Loss: 0.00830
Epoch: 698, Training Loss: 0.00514
Epoch: 699, Training Loss: 0.00609
Epoch: 699, Training Loss: 0.00549
Epoch: 699, Training Loss: 0.00829
Epoch: 699, Training Loss: 0.00513
Epoch: 700, Training Loss: 0.00608
Epoch: 700, Training Loss: 0.00548
Epoch: 700, Training Loss: 0.00827
Epoch: 700, Training Loss: 0.00512
Epoch: 701, Training Loss: 0.00607
Epoch: 701, Training Loss: 0.00547
Epoch: 701, Training Loss: 0.00826
Epoch: 701, Training Loss: 0.00511
Epoch: 702, Training Loss: 0.00606
Epoch: 702, Training Loss: 0.00546
Epoch: 702, Training Loss: 0.00825
Epoch: 702, Training Loss: 0.00510
Epoch: 703, Training Loss: 0.00605
Epoch: 703, Training Loss: 0.00545
Epoch: 703, Training Loss: 0.00823
Epoch: 703, Training Loss: 0.00510
Epoch: 704, Training Loss: 0.00604
Epoch: 704, Training Loss: 0.00544
Epoch: 704, Training Loss: 0.00822
Epoch: 704, Training Loss: 0.00509
Epoch: 705, Training Loss: 0.00603
Epoch: 705, Training Loss: 0.00543
Epoch: 705, Training Loss: 0.00821
Epoch: 705, Training Loss: 0.00508
Epoch: 706, Training Loss: 0.00602
Epoch: 706, Training Loss: 0.00543
Epoch: 706, Training Loss: 0.00820
Epoch: 706, Training Loss: 0.00507
Epoch: 707, Training Loss: 0.00601
Epoch: 707, Training Loss: 0.00542
Epoch: 707, Training Loss: 0.00818
Epoch: 707, Training Loss: 0.00506
Epoch: 708, Training Loss: 0.00601
Epoch: 708, Training Loss: 0.00541
Epoch: 708, Training Loss: 0.00817
Epoch: 708, Training Loss: 0.00506
Epoch: 709, Training Loss: 0.00600
Epoch: 709, Training Loss: 0.00540
Epoch: 709, Training Loss: 0.00816
Epoch: 709, Training Loss: 0.00505
Epoch: 710, Training Loss: 0.00599
Epoch: 710, Training Loss: 0.00539
Epoch: 710, Training Loss: 0.00814
Epoch: 710, Training Loss: 0.00504
Epoch: 711, Training Loss: 0.00598
Epoch: 711, Training Loss: 0.00538
Epoch: 711, Training Loss: 0.00813
Epoch: 711, Training Loss: 0.00503
Epoch: 712, Training Loss: 0.00597
Epoch: 712, Training Loss: 0.00538
Epoch: 712, Training Loss: 0.00812
Epoch: 712, Training Loss: 0.00502
Epoch: 713, Training Loss: 0.00596
Epoch: 713, Training Loss: 0.00537
Epoch: 713, Training Loss: 0.00811
Epoch: 713, Training Loss: 0.00502
Epoch: 714, Training Loss: 0.00595
Epoch: 714, Training Loss: 0.00536
Epoch: 714, Training Loss: 0.00809
Epoch: 714, Training Loss: 0.00501
Epoch: 715, Training Loss: 0.00594
Epoch: 715, Training Loss: 0.00535
Epoch: 715, Training Loss: 0.00808
Epoch: 715, Training Loss: 0.00500
Epoch: 716, Training Loss: 0.00593
Epoch: 716, Training Loss: 0.00534
Epoch: 716, Training Loss: 0.00807
Epoch: 716, Training Loss: 0.00499
Epoch: 717, Training Loss: 0.00592
Epoch: 717, Training Loss: 0.00533
Epoch: 717, Training Loss: 0.00805
Epoch: 717, Training Loss: 0.00499
Epoch: 718, Training Loss: 0.00591
Epoch: 718, Training Loss: 0.00533
Epoch: 718, Training Loss: 0.00804
Epoch: 718, Training Loss: 0.00498
Epoch: 719, Training Loss: 0.00590
Epoch: 719, Training Loss: 0.00532
Epoch: 719, Training Loss: 0.00803
Epoch: 719, Training Loss: 0.00497
Epoch: 720, Training Loss: 0.00589
Epoch: 720, Training Loss: 0.00531
Epoch: 720, Training Loss: 0.00802
Epoch: 720, Training Loss: 0.00496
Epoch: 721, Training Loss: 0.00588
Epoch: 721, Training Loss: 0.00530
Epoch: 721, Training Loss: 0.00800
Epoch: 721, Training Loss: 0.00496
Epoch: 722, Training Loss: 0.00587
Epoch: 722, Training Loss: 0.00529
Epoch: 722, Training Loss: 0.00799
Epoch: 722, Training Loss: 0.00495
Epoch: 723, Training Loss: 0.00587
Epoch: 723, Training Loss: 0.00529
Epoch: 723, Training Loss: 0.00798
Epoch: 723, Training Loss: 0.00494
Epoch: 724, Training Loss: 0.00586
Epoch: 724, Training Loss: 0.00528
Epoch: 724, Training Loss: 0.00797
Epoch: 724, Training Loss: 0.00493
Epoch: 725, Training Loss: 0.00585
Epoch: 725, Training Loss: 0.00527
Epoch: 725, Training Loss: 0.00796
Epoch: 725, Training Loss: 0.00493
Epoch: 726, Training Loss: 0.00584
Epoch: 726, Training Loss: 0.00526
Epoch: 726, Training Loss: 0.00794
Epoch: 726, Training Loss: 0.00492
Epoch: 727, Training Loss: 0.00583
Epoch: 727, Training Loss: 0.00525
Epoch: 727, Training Loss: 0.00793
Epoch: 727, Training Loss: 0.00491
Epoch: 728, Training Loss: 0.00582
Epoch: 728, Training Loss: 0.00525
Epoch: 728, Training Loss: 0.00792
Epoch: 728, Training Loss: 0.00490
Epoch: 729, Training Loss: 0.00581
Epoch: 729, Training Loss: 0.00524
Epoch: 729, Training Loss: 0.00791
Epoch: 729, Training Loss: 0.00490
Epoch: 730, Training Loss: 0.00580
Epoch: 730, Training Loss: 0.00523
Epoch: 730, Training Loss: 0.00789
Epoch: 730, Training Loss: 0.00489
Epoch: 731, Training Loss: 0.00579
Epoch: 731, Training Loss: 0.00522
Epoch: 731, Training Loss: 0.00788
Epoch: 731, Training Loss: 0.00488
Epoch: 732, Training Loss: 0.00578
Epoch: 732, Training Loss: 0.00522
Epoch: 732, Training Loss: 0.00787
Epoch: 732, Training Loss: 0.00487
Epoch: 733, Training Loss: 0.00578
Epoch: 733, Training Loss: 0.00521
Epoch: 733, Training Loss: 0.00786
Epoch: 733, Training Loss: 0.00487
Epoch: 734, Training Loss: 0.00577
Epoch: 734, Training Loss: 0.00520
Epoch: 734, Training Loss: 0.00785
Epoch: 734, Training Loss: 0.00486
Epoch: 735, Training Loss: 0.00576
Epoch: 735, Training Loss: 0.00519
Epoch: 735, Training Loss: 0.00784
Epoch: 735, Training Loss: 0.00485
Epoch: 736, Training Loss: 0.00575
Epoch: 736, Training Loss: 0.00518
Epoch: 736, Training Loss: 0.00782
Epoch: 736, Training Loss: 0.00484
Epoch: 737, Training Loss: 0.00574
Epoch: 737, Training Loss: 0.00518
Epoch: 737, Training Loss: 0.00781
Epoch: 737, Training Loss: 0.00484
Epoch: 738, Training Loss: 0.00573
Epoch: 738, Training Loss: 0.00517
Epoch: 738, Training Loss: 0.00780
Epoch: 738, Training Loss: 0.00483
Epoch: 739, Training Loss: 0.00572
Epoch: 739, Training Loss: 0.00516
Epoch: 739, Training Loss: 0.00779
Epoch: 739, Training Loss: 0.00482
Epoch: 740, Training Loss: 0.00571
Epoch: 740, Training Loss: 0.00515
Epoch: 740, Training Loss: 0.00778
Epoch: 740, Training Loss: 0.00482
Epoch: 741, Training Loss: 0.00571
Epoch: 741, Training Loss: 0.00515
Epoch: 741, Training Loss: 0.00776
Epoch: 741, Training Loss: 0.00481
Epoch: 742, Training Loss: 0.00570
Epoch: 742, Training Loss: 0.00514
Epoch: 742, Training Loss: 0.00775
Epoch: 742, Training Loss: 0.00480
Epoch: 743, Training Loss: 0.00569
Epoch: 743, Training Loss: 0.00513
Epoch: 743, Training Loss: 0.00774
Epoch: 743, Training Loss: 0.00479
Epoch: 744, Training Loss: 0.00568
Epoch: 744, Training Loss: 0.00512
Epoch: 744, Training Loss: 0.00773
Epoch: 744, Training Loss: 0.00479
Epoch: 745, Training Loss: 0.00567
Epoch: 745, Training Loss: 0.00512
Epoch: 745, Training Loss: 0.00772
Epoch: 745, Training Loss: 0.00478
Epoch: 746, Training Loss: 0.00566
Epoch: 746, Training Loss: 0.00511
Epoch: 746, Training Loss: 0.00771
Epoch: 746, Training Loss: 0.00477
Epoch: 747, Training Loss: 0.00566
Epoch: 747, Training Loss: 0.00510
Epoch: 747, Training Loss: 0.00769
Epoch: 747, Training Loss: 0.00477
Epoch: 748, Training Loss: 0.00565
Epoch: 748, Training Loss: 0.00509
Epoch: 748, Training Loss: 0.00768
Epoch: 748, Training Loss: 0.00476
Epoch: 749, Training Loss: 0.00564
Epoch: 749, Training Loss: 0.00509
Epoch: 749, Training Loss: 0.00767
Epoch: 749, Training Loss: 0.00475
Epoch: 750, Training Loss: 0.00563
Epoch: 750, Training Loss: 0.00508
Epoch: 750, Training Loss: 0.00766
Epoch: 750, Training Loss: 0.00475
Epoch: 751, Training Loss: 0.00562
Epoch: 751, Training Loss: 0.00507
Epoch: 751, Training Loss: 0.00765
Epoch: 751, Training Loss: 0.00474
Epoch: 752, Training Loss: 0.00561
Epoch: 752, Training Loss: 0.00506
Epoch: 752, Training Loss: 0.00764
Epoch: 752, Training Loss: 0.00473
Epoch: 753, Training Loss: 0.00560
Epoch: 753, Training Loss: 0.00506
Epoch: 753, Training Loss: 0.00763
Epoch: 753, Training Loss: 0.00472
Epoch: 754, Training Loss: 0.00560
Epoch: 754, Training Loss: 0.00505
Epoch: 754, Training Loss: 0.00762
Epoch: 754, Training Loss: 0.00472
Epoch: 755, Training Loss: 0.00559
Epoch: 755, Training Loss: 0.00504
Epoch: 755, Training Loss: 0.00760
Epoch: 755, Training Loss: 0.00471
Epoch: 756, Training Loss: 0.00558
Epoch: 756, Training Loss: 0.00504
Epoch: 756, Training Loss: 0.00759
Epoch: 756, Training Loss: 0.00470
Epoch: 757, Training Loss: 0.00557
Epoch: 757, Training Loss: 0.00503
Epoch: 757, Training Loss: 0.00758
Epoch: 757, Training Loss: 0.00470
Epoch: 758, Training Loss: 0.00556
Epoch: 758, Training Loss: 0.00502
Epoch: 758, Training Loss: 0.00757
Epoch: 758, Training Loss: 0.00469
Epoch: 759, Training Loss: 0.00556
Epoch: 759, Training Loss: 0.00501
Epoch: 759, Training Loss: 0.00756
Epoch: 759, Training Loss: 0.00468
Epoch: 760, Training Loss: 0.00555
Epoch: 760, Training Loss: 0.00501
Epoch: 760, Training Loss: 0.00755
Epoch: 760, Training Loss: 0.00468
Epoch: 761, Training Loss: 0.00554
Epoch: 761, Training Loss: 0.00500
Epoch: 761, Training Loss: 0.00754
Epoch: 761, Training Loss: 0.00467
Epoch: 762, Training Loss: 0.00553
Epoch: 762, Training Loss: 0.00499
Epoch: 762, Training Loss: 0.00753
Epoch: 762, Training Loss: 0.00466
Epoch: 763, Training Loss: 0.00552
Epoch: 763, Training Loss: 0.00498
Epoch: 763, Training Loss: 0.00752
Epoch: 763, Training Loss: 0.00466
Epoch: 764, Training Loss: 0.00551
Epoch: 764, Training Loss: 0.00498
Epoch: 764, Training Loss: 0.00750
Epoch: 764, Training Loss: 0.00465
Epoch: 765, Training Loss: 0.00551
Epoch: 765, Training Loss: 0.00497
Epoch: 765, Training Loss: 0.00749
Epoch: 765, Training Loss: 0.00464
Epoch: 766, Training Loss: 0.00550
Epoch: 766, Training Loss: 0.00496
Epoch: 766, Training Loss: 0.00748
Epoch: 766, Training Loss: 0.00464
Epoch: 767, Training Loss: 0.00549
Epoch: 767, Training Loss: 0.00496
Epoch: 767, Training Loss: 0.00747
Epoch: 767, Training Loss: 0.00463
Epoch: 768, Training Loss: 0.00548
Epoch: 768, Training Loss: 0.00495
Epoch: 768, Training Loss: 0.00746
Epoch: 768, Training Loss: 0.00462
Epoch: 769, Training Loss: 0.00547
Epoch: 769, Training Loss: 0.00494
Epoch: 769, Training Loss: 0.00745
Epoch: 769, Training Loss: 0.00462
Epoch: 770, Training Loss: 0.00547
Epoch: 770, Training Loss: 0.00494
Epoch: 770, Training Loss: 0.00744
Epoch: 770, Training Loss: 0.00461
Epoch: 771, Training Loss: 0.00546
Epoch: 771, Training Loss: 0.00493
Epoch: 771, Training Loss: 0.00743
Epoch: 771, Training Loss: 0.00460
Epoch: 772, Training Loss: 0.00545
Epoch: 772, Training Loss: 0.00492
Epoch: 772, Training Loss: 0.00742
Epoch: 772, Training Loss: 0.00460
Epoch: 773, Training Loss: 0.00544
Epoch: 773, Training Loss: 0.00491
Epoch: 773, Training Loss: 0.00741
Epoch: 773, Training Loss: 0.00459
Epoch: 774, Training Loss: 0.00544
Epoch: 774, Training Loss: 0.00491
Epoch: 774, Training Loss: 0.00740
Epoch: 774, Training Loss: 0.00458
Epoch: 775, Training Loss: 0.00543
Epoch: 775, Training Loss: 0.00490
Epoch: 775, Training Loss: 0.00739
Epoch: 775, Training Loss: 0.00458
... (per-pattern losses for epochs 776-998 omitted; all four curves decrease monotonically) ...
Epoch: 999, Training Loss: 0.00410
Epoch: 999, Training Loss: 0.00372
Epoch: 999, Training Loss: 0.00559
Epoch: 999, Training Loss: 0.00348

In [68]:
epoch_n == epoch_n2 == epoch_n3


Out[68]:
True

In [89]:
plt.figure(3)
train_losses2 = np.array(train_losses2)
plt.plot(epoch_n2, train_losses2.T)
plt.xlabel("Epoch")
plt.ylabel("Loss")
plt.title("Training Losses")
plt.show()


Combine all the loss results

Now let's analyze the results of the different methods. For more details, please see the 'Report - Analysis the neural network'.
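One way to compare the methods quantitatively is to record how many epochs each one needs before every XOR pattern's loss drops below the 0.008 target. The helper below is a sketch of that comparison, not code from this notebook; it assumes a hypothetical loss array of shape (4, n_epochs), one curve per input pattern, which may differ from the layout actually used here.

```python
import numpy as np

def epochs_to_threshold(train_losses, threshold=0.008):
    """Return the first epoch at which every pattern's loss is
    below `threshold`, or None if that never happens.
    Assumes shape (n_patterns, n_epochs)."""
    losses = np.asarray(train_losses)
    below = np.all(losses < threshold, axis=0)  # all patterns below at once
    idx = int(np.argmax(below))                 # first True, or 0 if none
    return idx if below[idx] else None

# Toy example: two exponentially decaying curves that cross the
# threshold at different epochs; the later crossing decides.
demo = np.array([0.02 * 0.9 ** np.arange(100),
                 0.03 * 0.9 ** np.arange(100)])
print(epochs_to_threshold(demo))  # -> 13
```

Applied to the three training runs, this would give a single "iteration steps to reach 0.008" number per method for the report.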


In [91]:
train_losses3 = np.array(train_losses3)
plt.figure(4)
plt.plot(epoch_n3, train_losses3, 'g', label='Cross-entropy')
plt.plot(epoch_n2, train_losses2, 'b', label='Regularized-Gaussian')
plt.plot(epoch_n, train_losses, 'r', label='Normal')
plt.xlabel("Epoch")
plt.ylabel("Loss")
plt.title("Training Losses")
plt.legend(loc='best')
plt.show()


From the figure we can see that with the help of cross-entropy cost function we can get the better result.
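The advantage of cross-entropy can be seen directly in the output-layer error term. With the quadratic cost the delta keeps a factor of $\sigma'(z)$, which is nearly zero when the neuron saturates, while with cross-entropy that factor cancels and the delta is simply $a - y$. The snippet below is a minimal sketch of the standard derivations (the function names are illustrative, not from this notebook):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def delta_quadratic(a, y, z):
    # Quadratic cost: the sigmoid'(z) factor shrinks the gradient
    # whenever the output neuron saturates (a near 0 or 1).
    return (a - y) * sigmoid(z) * (1 - sigmoid(z))

def delta_cross_entropy(a, y):
    # Cross-entropy: sigmoid'(z) cancels, so a badly wrong,
    # saturated output still produces a large gradient.
    return a - y

z = 4.0            # saturated pre-activation
a = sigmoid(z)     # output close to 1
y = 0.0            # desired output
print(delta_quadratic(a, y, z))   # small gradient despite large error
print(delta_cross_entropy(a, y))  # large gradient, faster learning
```

This is why the cross-entropy curve in the figure drops below the error target in fewer iteration steps than the quadratic-cost runs.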